Low-Latency Video Streaming from RPi To Mobile Application

by Bushev, Y.

Bushev Y (2017). Low-Latency Video Streaming from RPi To Mobile Application. In Young Scientist USA, Vol. 10 (p. 106). Auburn, WA: Lulu Press.



 

The development of mobile devices and portable computers, along with their network interfaces, has made it possible to transfer data between network participants at such high speeds that today low-latency video streaming (delays under 200 ms) is widely regarded as achievable.

Let's consider an example of building a video and audio streaming system from a camera connected to a single-board computer (Raspberry Pi) to a mobile phone powered by iOS. A server based on the Node.js platform (an open-source, cross-platform JavaScript runtime environment for executing JavaScript code server-side [1]) will run on the Raspberry Pi and perform all operations with the camera and microphone. The mobile application running on the iOS device will allow us to control the camera and microphone settings remotely, as well as view the video stream with sound.

Here are the requirements for the system:

        Maximum video resolution: 1920 × 1080 (Full HD) at 25 FPS.

        Maximum delay: 200 milliseconds.

        Wi-Fi transfer at a distance of up to 100 feet.

        Video must include sound.

        The mobile app must run on both iOS and Android.

        Simultaneous operation with multiple mobile devices.

The hybrid framework Ionic was chosen for the development of the mobile application. It allows us to write the application's source code in JavaScript and minimizes the time needed to target multiple mobile platforms (iOS and Android).

After a series of experiments with an H.264 video stream, we decided to use video in MJPEG format. This is due to the fact that at the time the iOS platform had a number of problems with decoding streamed H.264 video. However, the MJPEG video format did not allow us to transmit sound. GStreamer (a pipeline-based multimedia framework that links together a wide variety of media processing systems to complete complex workflows [2]) and Janus Gateway (an open-source, general-purpose WebRTC gateway [3]) were used to solve this problem. The diagram is shown in Fig. 1.

Figure 1. Diagram of sound transmission components interaction.

 

Let's use the open-source library MJPG Streamer (it takes JPEGs from Linux-UVC-compatible webcams, the filesystem, or other input plugins and streams them as M-JPEG over HTTP to web browsers, VLC, and other software [4]) for video transmission. An example start command is shown in Fig. 2.

Figure 2. Command to start MJPG Streamer process

 

We will use the Opus codec, a lossy audio coding format developed by the Xiph.Org Foundation and standardized by the Internet Engineering Task Force. It is designed to efficiently code speech and general audio in a single format, while remaining low-latency enough for real-time interactive communication and low-complexity enough for low-end embedded processors. The main advantages of this codec are its low coding delay (from 2.5 to 60 ms, configurable), strong compression of audio data, and support for multi-channel audio.

Figure 3. Configuration file for Janus WebRTC Gateway

An example of configuring Janus to use the Opus codec for the audio stream is shown in Fig. 3. Janus Gateway is configured to relay a data stream available on local port 5002.
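For reference, a mountpoint of this kind in the streaming plugin's configuration file (the INI-style `janus.plugin.streaming.cfg` used by Janus at the time) might look as follows. The mountpoint name, id, description, and RTP payload type are our own assumptions; only port 5002 comes from the setup described above.

```
[opus-audio]
type = rtp
id = 1
description = Opus audio from the RPi microphone
audio = yes
audioport = 5002
audiopt = 111
audiortpmap = opus/48000/2
```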

To capture and prepare the audio stream from the microphone, let's create a GStreamer pipeline, a command that describes the transformation of raw data from the physical device (Fig. 4).

 

Figure 4. GStreamer pipeline example

The Node.js server starts the simultaneous transfer of video and audio in response to a special command passed over HTTP. The data can then be obtained by an unlimited number of mobile devices with access to the network (in practice, the limit is set by the capacity of the transmission channel).

The following two elements were used to embed video with audio in the Ionic application's HTML page:

        <img [src]="streamUrl" alt="image"> – for video.

        <audio id="audio-element"></audio> – for sound,

where streamUrl is a link to the MJPEG stream available via HTTP.

It is possible to connect to Janus Gateway using its own JavaScript library. In addition, full WebRTC support is required, which is unavailable by default in the Safari browser. So, let's connect the "cordova-plugin-iosrtc" library (a Cordova iOS plugin exposing the full WebRTC W3C JavaScript APIs), as shown in Fig. 5:
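Once WebRTC is available, subscribing to the audio stream with the janus.js client library (shipped with Janus Gateway) might look like the sketch below. It only defines a helper function and must run in a WebRTC-capable environment with janus.js loaded; the mountpoint id 1 is an assumption, and the SDP answer/start exchange in `onmessage` is elided for brevity.

```javascript
// Sketch: subscribing to the Opus mountpoint via the janus.js library.
// Assumes janus.js is loaded (global `Janus`) and mountpoint id 1 exists.
function playRemoteAudio(serverUrl, audioElement) {
  const janus = new Janus({
    server: serverUrl,
    success: function () {
      janus.attach({
        plugin: 'janus.plugin.streaming',
        success: function (handle) {
          // Ask the streaming plugin to start sending mountpoint 1.
          handle.send({ message: { request: 'watch', id: 1 } });
        },
        onmessage: function (msg, jsep) {
          // Here the SDP offer in `jsep` would be answered and a
          // { request: 'start' } message sent back (omitted for brevity).
        },
        onremotestream: function (stream) {
          // Route the incoming audio into the <audio> element.
          Janus.attachMediaStream(audioElement, stream);
        }
      });
    }
  });
}
```

In the Ionic application, `audioElement` would be the `<audio id="audio-element">` element from the page markup above.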

 

Figure 5. Library connection for full support of WebRTC.

 

Figure 6. Watching the video stream on a mobile device.

 

Fig. 6 presents an example of the prototype application. The application has additional functions that allow adjusting the quality and resolution of the video (Fig. 7), as well as viewing and managing the archive of recorded videos.

Figure 7. Adjustment of camcorder settings through the application interface.

 

In our opinion, the optimal solution is to use the H.264 codec in conjunction with Janus Gateway, which would reduce the load on the data channel and, consequently, improve the quality and range of transmission. However, incomplete support for decoding streamed H.264 video on iOS made it impossible to implement this in 2016. On the other hand, we were able to test this transmission model on the Android platform, which confirmed our theory.

 

References

 

  1. Node.js // Wikipedia. URL: https://wikipedia.org/wiki/Node.js (access date: Oct 21, 2017).
  2. GStreamer // Open Source media framework. URL: https://gstreamer.freedesktop.org (access date: Oct 21, 2017).
  3. Janus // Janus WebRTC Gateway. URL: https://github.com/meetecho/janus-gateway (access date: Oct 21, 2017).
  4. MJPG-streamer // URL: https://github.com/jacksonliam/mjpg-streamer (access date: Oct 21, 2017).