
Develop a video chat between the browser and mobile app

Evil empires are often a target of grudges and hatred from end users. This is understandable and sometimes even justified, but nevertheless Uber still partially pays for our trips (even though it's temporary), and Google accelerated the WebRTC technology, which would otherwise have remained a proprietary and expensive software tool for fairly narrow B2B purposes if not for the Evil Empire of Google.

Since WebRTC was made public, video chats have become much easier to develop. A number of APIs, services, servers and frameworks have emerged for that. In this article we describe in detail one more way to develop a video chat between a web browser and a native Android application.

Video chat in a browser

A classic WebRTC video chat between browsers starts with an SDP (Session Description Protocol) exchange. Alice sends her SDP to Boris, and Boris responds with his own. SDP is a config like the one below:

o=- 1468792194439920000 2 IN IP4 127.0.0.1
a=group:BUNDLE audio video
a=msid-semantic: WMS 9nKsWmxMvOQBYaz9xhRffWeWSUbCbnox6aQ4
m=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 110 112 113 126
a=rtcp:9 IN IP4 0.0.0.0
a=ssrc:3525514540 msid:9nKsWmxMvOQBYaz9xhRffWeWSUbCbnox6aQ4 09bdb6e7-b4b3-437b-945e-771f535052e3
m=video 9 UDP/TLS/RTP/SAVPF 96 98 100 102 127 97 99 101 125
a=rtcp:9 IN IP4 0.0.0.0
a=rtcp-fb:96 ccm fir
a=rtcp-fb:96 nack pli
a=rtcp-fb:98 ccm fir
a=rtcp-fb:98 nack pli
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack pli
a=ssrc-group:FID 2470936840 2969787502
a=ssrc:2470936840 msid:9nKsWmxMvOQBYaz9xhRffWeWSUbCbnox6aQ4 ce9235c5-f300-466a-aadd-b969dc2f3664
a=ssrc:2969787502 msid:9nKsWmxMvOQBYaz9xhRffWeWSUbCbnox6aQ4 ce9235c5-f300-466a-aadd-b969dc2f3664

From this SDP config we can tell, for example, that it suggests using the H.264 and VP8 codecs for video and Opus for audio. Besides, it provides a lot of other information useful for communication: codec priorities, usage of fir, nack and pli feedback, the profile-level of the H.264 codec (42e01f, i.e. Baseline 3.1), and so on.
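As an illustration of how such a config can be read programmatically, here is a minimal sketch in plain JavaScript that pulls the declared codecs out of an SDP string. The `a=rtpmap` lines, omitted from the listing above, carry the payload-type-to-codec mapping, so the sample fragment below adds a few for illustration:

```javascript
// Collect the codecs declared in each m= section of an SDP string.
// a=rtpmap lines map payload types to codec names.
function parseCodecs(sdp) {
  const codecs = {};
  let section = null;
  for (const line of sdp.split(/\r?\n/)) {
    if (line.startsWith('m=')) {
      section = line.split(' ')[0].slice(2); // "audio" or "video"
      codecs[section] = [];
    } else if (section && line.startsWith('a=rtpmap:')) {
      // "a=rtpmap:111 opus/48000/2" -> "opus"
      codecs[section].push(line.split(' ')[1].split('/')[0]);
    }
  }
  return codecs;
}

// A fragment in the spirit of the config above (rtpmap lines added
// here for illustration):
const sdp = [
  'm=audio 9 UDP/TLS/RTP/SAVPF 111 103',
  'a=rtpmap:111 opus/48000/2',
  'm=video 9 UDP/TLS/RTP/SAVPF 96 98',
  'a=rtpmap:96 VP8/90000',
  'a=rtpmap:98 H264/90000',
].join('\r\n');

console.log(parseCodecs(sdp)); // { audio: ['opus'], video: ['VP8', 'H264'] }
```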

When you implement a video chat based on the native WebRTC API, you should understand what SDP, candidates, codecs, ICE, STUN, TURN and many other scary words are.

WebRTC, Websockets and SIP

The terms WebRTC and Websockets are often confused. Sometimes SIP takes part in this mess too.

Well, we can certainly state that WebRTC is not directly related to either Websockets or SIP.

Websockets is simply a convenient way to transmit SDP from Boris to Alice. We could use plain HTTP for that, or send the SDP by e-mail. The exchange of SDP messages is signalling information, and we can use any protocol to send it. For browsers, the default protocols to send data are Websockets and HTTP, and Websockets is mostly used because it is closer to real time than HTTP. You can't transfer video or audio via Websockets, only signalling information: text and commands.
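To make this concrete, here is a hypothetical signalling framing in JavaScript. The message format is our own invention for illustration (WebRTC does not prescribe one), and any transport that can deliver the resulting string will do, e.g. `ws.send(wire)` on an open WebSocket:

```javascript
// A hypothetical signalling message format: JSON with a type tag.
function makeSignal(type, payload) {
  if (!['offer', 'answer', 'candidate'].includes(type)) {
    throw new Error('unknown signal type: ' + type);
  }
  return JSON.stringify({ type, payload });
}

function readSignal(text) {
  return JSON.parse(text);
}

// Alice wraps her SDP and ships the string over any text channel:
const wire = makeSignal('offer', 'v=0\r\no=- 0 2 IN IP4 127.0.0.1\r\n...');
const msg = readSignal(wire); // Boris unpacks it on his side
console.log(msg.type); // "offer"
```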

SIP is a text protocol for exchanging messages. WebRTC is often wrongly called "SIP in a browser", most likely because SIP messages also use SDP to configure codecs and establish connections.

On the other hand, when we say something like "SIP telephone" we mean a device that, along with the SIP (RFC 3261) protocol, also supports a dozen other network specifications and protocols: RTP, SDP, AVPF, etc.

Indeed, at its core WebRTC uses building blocks similar to those used by a SIP telephone (SRTP, STUN, and so on). So one could say that both WebRTC and SIP devices and software share the same technological basis. But calling WebRTC "SIP in a browser" is incorrect, not least because browsers do not support SIP out of the box.

WebRTC is a technology that has three main audio/video transmission functions:

  • Capturing, encoding and sending
  • Receiving, decoding and playback
  • Overcoming NAT and Firewall

Plus a lot of auxiliary functions such as jitter compensation, adaptive bitrate, network congestion control and so on.

As described above, in order to successfully transmit media via WebRTC, Alice and Boris should exchange SDP containing detailed information on video stream formats, packetization and other parameters that specify how the SDP sender will receive the video.

In addition to exchanging SDP, a TURN server may be required. This server passes the video traffic through itself if a peer-to-peer connection cannot be established, for example when Alice or Boris is behind some unfriendly (for example, symmetric) NAT.
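In the browser this fallback is declared in the `RTCPeerConnection` configuration. A sketch, with placeholder server addresses and credentials:

```javascript
// Hypothetical ICE server list; hostnames and credentials are placeholders.
const config = {
  iceServers: [
    // STUN: lets the peer discover its public address.
    { urls: 'stun:stun.example.com:3478' },
    // TURN: relays the media when a direct peer-to-peer path fails,
    // e.g. behind a symmetric NAT.
    { urls: 'turn:turn.example.com:3478', username: 'alice', credential: 'secret' }
  ]
};

// In a browser the config is handed to the peer connection:
// const pc = new RTCPeerConnection(config);
console.log(config.iceServers.map(s => s.urls.split(':')[0]));
// [ 'stun', 'turn' ]
```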

Now suppose we want to add a third active participant to the chat, or simply another viewer. A good example is a debate: two participants talk while everyone else just watches. Another example is a chat for three or more participants.

When the third participant arrives, things get more complicated. Now every participant needs to capture and compress two video streams instead of just one, and establish mutual connections to overcome NAT. In this case the time needed to establish a connection increases while the stability of the connection decreases. Two or more video streams compressed and sent at the same time create a serious load on the CPU and network and affect the quality, especially on mobile devices:

Tasks like these:

  • connection of three or more participants
  • connection of extra subscribers to the video chat
  • recording of the video chat

are beyond the scope of peer-to-peer and require a centralized WebRTC server that will manage all the connections.
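A back-of-envelope calculation shows why: in a full mesh every participant sends a separately encoded stream to each peer, while with a central server each participant uploads exactly once and the server fans the streams out. The helper names below are ours, for illustration:

```javascript
// Full mesh: each of n participants encodes and sends n-1 streams,
// so the room carries n*(n-1) outgoing streams in total.
function meshStreams(n)   { return n * (n - 1); }
// Central server: each participant sends a single stream upstream.
function serverUploads(n) { return n; }

for (const n of [2, 3, 5]) {
  console.log(`${n} participants: mesh sends ${meshStreams(n)}, server sends ${serverUploads(n)}`);
}
// With 5 participants a mesh carries 20 outgoing streams in total,
// against 5 uploads through a server.
```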

As we said above, there are services and servers, as well as more or less convenient APIs on top of the WebRTC API, that can speed up the development of video chats and allow working with handier abstractions such as Stream, Room, Publisher, Subscriber and so on.

For example, to create the simplest video chat, exchanging the names of the streams would be enough. Boris knows Alice's stream. Alice knows Boris' stream. The video chat is ready:

Example of a video chat in a browser

In this article we will demonstrate how the Streaming API works with Web Call Server 5 – a WebRTC server for video chats and online broadcasts.

The video chat in action is illustrated in the following two screenshots. The first subscriber, Alice, sees the video chat like this:

The second subscriber, Edward, sees the video chat like this:

In this example a few things happen:

  1. Alice sends the video stream named Alice from the browser to the server.
  2. Edward sends the video stream named Edward from the browser to the server.
  3. Alice fetches and plays the video stream named Edward.
  4. Edward fetches and plays the video stream named Alice.

As seen from the example, we built a video chat based on the assumption that both Alice and Edward know each other's stream names. We didn't directly use SDP, PeerConnection, NAT, TURN, etc.

Therefore, a video chat can be implemented by simply passing the names of the streams to those who should play them.
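The idea can be modelled in a few lines of JavaScript. This is a toy registry, not the server's actual implementation: publishing registers a stream under a name, and playing is just a lookup by that name:

```javascript
// Toy model of name-based exchange. No SDP or ICE appears at this
// level of abstraction.
const published = new Map();

function publish(name, stream) { published.set(name, stream); }

function play(name) {
  if (!published.has(name)) throw new Error('no such stream: ' + name);
  return published.get(name);
}

publish('Alice', { source: 'alice-webcam' });
publish('Edward', { source: 'edward-webcam' });

// Edward plays Alice's stream simply by knowing its name:
console.log(play('Alice').source); // "alice-webcam"
```

Access control then reduces to deciding which names a given subscriber is allowed to look up.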

This simple concept allows using any front-end and back-end technologies, such as jQuery, Bootstrap, React, Angular, PHP, Java, .NET and so on. The good news is that embedding support for video streams and video chat does not affect the existing web application. You control your video chat simply by allowing (or denying) given subscribers to play specific video streams.

Source code of the video chat in a browser

Now let's see how the corresponding code looks. An HTML page with the video chat has two main div elements:

  • localVideo – the video captured from the web camera
  • remoteVideo – the video that is played from the server

You can assign arbitrary identifiers to these divs, for example id="captureVideo" or id="playbackVideo", but both div elements must be present on the page.

The HTML page with the localVideo and remoteVideo blocks looks as follows:
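A minimal version of that markup, assuming only the two divs described above (the original listing is not reproduced here):

```html
<div id="localVideo"></div>
<div id="remoteVideo"></div>
```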

Now, here is the code that is responsible for sending and playing the video.

Sending the stream from a webcam

To send, we use the session.createStream().publish() API method. For this stream we specify the HTML div element that should display the video captured from the webcam (localVideo), as well as the name of the video stream (Alice), so that any connected client that knows this name will be able to play the stream.

Playing the stream from the server

To play, we specify the name of the stream we want to play and the HTML div element (remoteVideo) that should display the stream received from the server. We use the session.createStream().play() API method.

While working with the server, the HTML page receives various statuses from it, i.e. PLAYING and STOPPED for playback, and PUBLISHING and UNPUBLISHED for publishing. Therefore, the basic thing we need to do for a video chat to work is to place two div blocks on the web page and include the corresponding scripts that execute stream.play() and stream.publish() for the given stream name. The full source code of the Two Way Streaming example can be downloaded here.
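One way a page might react to these statuses is a simple dispatch table. The status names are the ones listed above; the handler wiring is an assumption about application code, not part of the Web SDK:

```javascript
// Dispatch table for stream statuses reported by the server.
const handlers = {
  PUBLISHING:  () => 'local stream is live',
  UNPUBLISHED: () => 'local stream stopped',
  PLAYING:     () => 'remote stream is playing',
  STOPPED:     () => 'remote stream stopped',
};

function onStatus(status) {
  const h = handlers[status];
  return h ? h() : 'unhandled status: ' + status;
}

console.log(onStatus('PLAYING')); // "remote stream is playing"
```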

Example of a WebRTC video chat in an Android application

The video chat for Android works exactly the same way as the video chat in a browser. The app establishes a connection to the server and sends a video stream from the camera of the Android device, as well as receives and plays the other video stream from the server. Below is the Android app Streaming Min (a mobile version of the Two Way Streaming example for a video chat in a browser), which allows exchanging video streams.

As you can see from the screenshot, nothing has changed. We have two video windows. The left one displays the video captured from the webcam, and the right one displays the video received from the server. The exchange of video streams is also based on stream names. We publish one stream and play the other one.

Source code of the video chat for an Android application

While to create a video chat in a browser we used the Web SDK, which includes the flashphoner.js API script, for a full-featured Android application we need to import the aar file of the Android SDK into the project. To understand how this works, we recommend building and running the Streaming Min example based on the Android SDK. All examples are available in the GitHub repository.

1. Download all the examples.

2. Download the Android SDK.

3. Link the SDK as an aar file to the examples.

Note that we pass the export.sh script the path to the downloaded file: wcs-android-sdk-1.0.1.25.aar – the Android SDK.

As a result, in the export/output folder you will find a fully configured project that you can open in Android Studio.

Now you only need to build the examples using Gradle.

1 – Create a new run configuration

2 – Select the Gradle script

As a result, we should get apk files that can be installed on an Android device. In this example we exchanged video streams with a browser. The video stream test33 was sent from the Android device to the server and played in a browser. The video stream 8880 was sent by the browser and played on the Android device. Therefore we ended up with two-way audio and video communication between the browser and the Android app.

In the Web version of the video chat we used HTML div elements for video. On Android, we use renderers.

The localRenderer displays the video captured from the camera of the Android device. The remoteRenderer shows the video received from the server.

1. Establish a connection to the server and set the renderers.

2. Create a stream with an arbitrary name and publish the stream to the server.

3. Specify the name of the stream to play and play the stream from the server.

StreamOptions streamOptions = new StreamOptions(mPlayStreamView.getText().toString());
playStream = session.createStream(streamOptions);
…
playStream.play();

The full source code of the StreamingMinActivity.java class is available here. And the entire Streaming Min example for Android is available in the repository here.

Web Call Server

In conclusion, we demonstrated how to create a simple exchange of video streams between an HTML page in a browser and an Android application. The video streams go through Web Call Server, which is both the signalling server and an audio and video proxy.

Web Call Server is server software that can be installed on Linux, either on a virtual or a dedicated server. WCS is a streaming video WebRTC server, and it can handle video streams from browsers, iOS and Android devices.

References

Technologies and protocols

SDP – Session Description Protocol, RFC

Server and API for video chat development

Web Call Server – a WebRTC streaming video server for video chats

Web SDK – a Web SDK to develop video chats with WebRTC support

Android SDK – an Android SDK to develop video chats with WebRTC support

Working examples

Web Two Way Streaming – an example of video stream exchange for the Web

Android Two Way Streaming – an example Android application for video stream exchange

Source codes of examples

Web Two Way Streaming – source code of the example video stream exchange for the Web

Android Two Way Streaming – source code of the example video stream exchange for Android
