Monday, August 18, 2014

Live Video from Raspberry Pi to .NET

A simple example showing how to implement video streaming from the Raspberry Pi camera to a .NET application.


The source code for this example can be downloaded here.

Introduction

The example below demonstrates how to implement live video streaming from the Raspberry Pi camera to a .NET application and display it.
It shows the implementation of a service running on the Raspberry Pi which captures the video from the camera and streams it to a .NET client, which processes it and displays it on the screen.
To implement this scenario, the following topics need to be addressed:
  • Capturing video from the Raspberry Pi camera by the service application.
  • Streaming video across the network.
  • Processing and displaying video by the .NET client application.
The example streams the video via TCP, but UDP or WebSockets can be used too.



Capturing Video from Raspberry Pi Camera

The Raspberry Pi camera is a high-definition camera that produces video data in raw H.264 format.

To control the camera and capture the video, Raspberry Pi provides the console application 'raspivid', which can be executed with various parameters specifying how the video shall be captured. E.g. you can specify parameters like width, height or frames per second, as well as whether the video shall be written to a file or to stdout (standard output).

The service implemented in this example internally uses raspivid: to capture the video, the service starts raspivid and then reads the incoming video data from raspivid's stdout.

The service starts raspivid with parameters suitable for live streaming:

raspivid -n -vf -hf -ih -w 320 -h 240 -fps 24 -t 0 -o -

-n No preview.
-vf -hf Flip the video vertically and horizontally.
-ih Insert SPS and PPS inline headers into the video stream (so that e.g. a client connecting to an ongoing capture can synchronize to the image frames).
-w 320 -h 240 Produce 320 x 240 pixel video.
-fps 24 Produce video with 24 frames per second.
-t 0 Capture video indefinitely (no timeout).
-o - Write the video to standard output (so that it can be captured by the service).

There are other parameters you can play with. To list all of them, run:
raspivid -h

Streaming Video Across Network

Live video data continuously coming from raspivid's stdout needs to be transferred across the network to the connected client.
To transfer the data, the implementation uses the Eneter Messaging Framework, a lightweight cross-platform library for interprocess communication.

To avoid serialization/deserialization overhead, the communication is based directly on duplex channels: the service application running on the Raspberry Pi uses a duplex input channel and the .NET client running on the PC uses a duplex output channel.
Whenever a chunk of video data is read from raspivid's stdout, the service uses the duplex input channel to send it to the connected clients. The .NET client uses the duplex output channel to receive the video data and pass it on for further processing (e.g. displaying).
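
To make this concrete, below is a minimal sketch of just the channel mechanics on the client side. It is a standalone snippet based on the same Eneter calls the full client (shown later in the article) uses; the IP address is a placeholder and the console handler only illustrates where the received chunks arrive.

using System;
using Eneter.Messaging.MessagingSystems.MessagingSystemBase;
using Eneter.Messaging.MessagingSystems.TcpMessagingSystem;

class VideoChannelSketch
{
    static void Main()
    {
        // Duplex output channel connecting to the Raspberry Pi service.
        // (The address is just a placeholder.)
        IDuplexOutputChannel aVideoChannel = new TcpMessagingSystemFactory()
            .CreateDuplexOutputChannel("tcp://192.168.1.17:8093/");

        // Each received response message is one chunk of raw H.264 data.
        aVideoChannel.ResponseMessageReceived += (sender, e) =>
        {
            byte[] aVideoData = (byte[])e.Message;
            Console.WriteLine("Received {0} bytes of video data.", aVideoData.Length);
        };

        aVideoChannel.OpenConnection();
        Console.WriteLine("Receiving video. Press ENTER to stop.");
        Console.ReadLine();

        aVideoChannel.CloseConnection();
    }
}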

Processing and Displaying Video by .NET Client

Although H.264 is a very common encoding, playing live video encoded with this codec is not a trivial task.

The major problem is that MediaElement (the WPF UI control) does not support playing video from a memory stream or from an array of bytes; it expects a path to a local file or a URL.
I found a hack proposing to register your own protocol so that when the protocol name appears in the URI the registered library gets the call, but I did not want to go for such a solution.
Another problem is that on Windows Vista (or Windows XP) you have to install codecs for H.264 (and even when I did so, I still did not manage to play raw H.264 bytes stored in a file).

An alternative is to use the VLC library from VideoLAN. Although VLC does not support playing from a memory stream or from an array of bytes either, it does support playing video from a named pipe. It means that if the video source is specified like

stream://\\\.\pipe\MyPipeName

then VLC will try to open MyPipeName and play the video.

But this is not an out-of-the-box solution either.
The major problem is that the VLC library exports just pure C functions, so it does not provide a WPF-based UI control you can simply drag and drop into your UI.
There are several wrappers implementing UI controls on top of VLC (e.g. a very promising solution I tested is nVLC implemented by Roman Ginzburg), but these implementations look quite complex.

I like the approach described by Richard Starkey (part 1 and part 2): provide just a thin wrapper and use the VLC functionality directly. The advantage of this approach is that the solution is lightweight and gives you full flexibility, and as you can see from the implementation of the .NET client, it is really not difficult to use.
So I slightly reworked Richard's original code and used it for the implementation of the .NET client. The whole wrapper can be found in the VLC.cs file.
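
To give a feel for how little code this wrapper approach needs, here is a condensed sketch of the essential playback steps. It mirrors the StartCapturing() method of the client shown further below (VlcInstance, VlcMedia and VlcMediaPlayer come from the reworked VLC.cs wrapper); the pipe name is generated and the Drawable handle is a placeholder, so treat it as an illustration rather than a drop-in snippet.

using System;
using System.IO.Pipes;
using VLC;

class VlcPipeSketch
{
    static void Main()
    {
        // Load the VLC libraries (adjust the path to your VLC installation).
        VlcInstance aVlc = new VlcInstance(@"c:\Program Files\VideoLAN\VLC\");

        // Create the server end of the named pipe; VLC will open the client end.
        string aPipeName = Guid.NewGuid().ToString();
        NamedPipeServerStream aVideoPipe = new NamedPipeServerStream(@"\" + aPipeName,
            PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous, 0, 32764);

        // Point VLC at the pipe and tell it to expect raw H.264 (no container).
        using (VlcMedia aMedia = new VlcMedia(aVlc, @"stream://\\\.\pipe\" + aPipeName))
        {
            aMedia.AddOption(":demux=H264");

            VlcMediaPlayer aPlayer = new VlcMediaPlayer(aMedia);
            aPlayer.Drawable = IntPtr.Zero; // placeholder; the real client uses a WinForms panel handle
            aPlayer.Play();                 // VLC connects the pipe and starts reading from it
        }

        // VLC opens the pipe asynchronously; wait for it before streaming data.
        aVideoPipe.WaitForConnection();

        // From here on, every received video chunk is simply written to aVideoPipe.
    }
}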


To Run Example

Downloads
  1. Download and unzip this example.
  2. Download 'Eneter for .NET' and 'Eneter for Java' from http://www.eneter.net/ProductDownload.htm.
  3. Download and install the VLC media player from https://www.videolan.org/. (The VLC libraries will be used by the .NET application to play the video stream.)
Raspberry Pi service application
  1. Open the Java project raspberry-camera-service in Eclipse and add a reference to the eneter-messaging.jar which you downloaded.
    (Right click on the project -> Properties -> Java Build Path -> Libraries -> Add External Jars -> eneter-messaging-6.0.1.jar)
  2. Build the project and then export it to an executable JAR.
    (Right click on the project -> Export... -> Java -> Runnable JAR file -> Launch configuration -> Export Destination -> Package required libraries into generated JAR -> Finish.)
  3. Copy the generated jar to the Raspberry device.
  4. Start the application
    java -jar raspberry-camera-service.jar
.NET Client Application
  1. Open the RaspberryCameraClient solution and add a reference to the Eneter.Messaging.Framework.dll which you downloaded.
  2. Check that the path to VLC is correct in MainWindow.xaml.cs.
  3. Provide the correct IP address of your Raspberry Pi service in MainWindow.xaml.cs.
  4. Compile and run.
  5. Press 'Start Capturing'.


Raspberry Pi Service Application

The Raspberry Pi service is a simple console application implemented in Java. It listens for clients. When a client connects, it starts the 'raspivid' application with specific parameters to begin video capturing. Then it consumes raspivid's stdout and forwards the video data to the connected clients.

The code is very simple:

package eneter.camera.service;

import java.io.InputStream;
import java.util.Arrays;
import java.util.HashSet;

import eneter.messaging.diagnostic.EneterTrace;
import eneter.messaging.messagingsystems.messagingsystembase.*;
import eneter.messaging.messagingsystems.tcpmessagingsystem.TcpMessagingSystemFactory;
import eneter.messaging.messagingsystems.udpmessagingsystem.UdpMessagingSystemFactory;
import eneter.net.system.EventHandler;

class CameraService
{
    // Channel used to send video data to connected clients.
    private IDuplexInputChannel myVideoChannel;
    private Process myRaspiVidProcess;
    private InputStream myVideoStream;
    private HashSet<String> myConnectedClients = new HashSet<String>();
    private volatile boolean myClientsUpdatedFlag;
    private Object myConnectionLock = new Object();
    
    
    public void startService(String ipAddress, int port) throws Exception
    {
        try
        {
            // Use TCP messaging.
            // Note: you can try UDP or WebSockets too.
            IMessagingSystemFactory aMessaging = new TcpMessagingSystemFactory();
            //IMessagingSystemFactory aMessaging = new UdpMessagingSystemFactory();
            
            myVideoChannel = aMessaging.createDuplexInputChannel("tcp://" + ipAddress + ":" + port + "/");
            myVideoChannel.responseReceiverConnected().subscribe(myClientConnected);
            myVideoChannel.responseReceiverDisconnected().subscribe(myClientDisconnected);
            myVideoChannel.startListening();
        }
        catch (Exception err)
        {
            stopService();
            throw err;
        }
    }
    
    public void stopService()
    {
        if (myVideoChannel != null)
        {
            myVideoChannel.stopListening();
        }
    }
    
    private void onClientConnected(Object sender, ResponseReceiverEventArgs e)
    {
        EneterTrace.info("Client connected.");
        
        try
        {
            synchronized (myConnectionLock)
            {
                myConnectedClients.add(e.getResponseReceiverId());
                myClientsUpdatedFlag = true;
                
                // If camera is not running start it.
                if (myRaspiVidProcess == null)
                {
                    // Captured video: 320x240 pixels, 24 frames/s
                    // And it also inserts SPS and PPS inline headers (-ih) so that
                    // later connected clients can synchronize to ongoing video frames.
                    String aToExecute = "raspivid -n -vf -hf -ih -w 320 -h 240 -fps 24 -t 0 -o -";
                    myRaspiVidProcess = Runtime.getRuntime().exec(aToExecute);
                    myVideoStream = myRaspiVidProcess.getInputStream();
                    
                    Thread aRecordingThread = new Thread(myCaptureWorker);
                    aRecordingThread.start();
                }
            }
        }
        catch (Exception err)
        {
            String anErrorMessage = "Failed to start video capturing.";
            EneterTrace.error(anErrorMessage, err);
            
            return;
        }
    }
    
    private void onClientDisconnected(Object sender, ResponseReceiverEventArgs e)
    {
        EneterTrace.info("Client disconnected.");
        
        synchronized (myConnectionLock)
        {
            myConnectedClients.remove(e.getResponseReceiverId());
            myClientsUpdatedFlag = true;
            
            // If no client is connected then turn off the camera.
            if (myConnectedClients.isEmpty() && myRaspiVidProcess != null)
            {
                myRaspiVidProcess.destroy();
                myRaspiVidProcess = null;
            }
        }
    }
    
    private void doCaptureVideo()
    {
        try
        {
            String[] aClients = {};
            byte[] aVideoData = new byte[4096];
            int aSize;
            while ((aSize = myVideoStream.read(aVideoData)) != -1)
            {
                // Update the local list only if the set of connected
                // clients has changed.
                if (myClientsUpdatedFlag)
                {
                    synchronized (myConnectionLock)
                    {
                        aClients = myConnectedClients.toArray(new String[myConnectedClients.size()]);
                        myClientsUpdatedFlag = false;
                    }
                }
                
                // Copy only the bytes that were actually read from raspivid.
                byte[] aChunk = Arrays.copyOf(aVideoData, aSize);
                
                for (String aClient : aClients)
                {
                    try
                    {
                        // Send the captured data to all connected clients.
                        myVideoChannel.sendResponseMessage(aClient, aChunk);
                    }
                    catch (Exception err)
                    {
                        // Ignore if sending to one of the clients failed,
                        // e.g. in case it got disconnected.
                    }
                }
            }
        }
        catch (Exception err)
        {
            // Stream from raspivid got closed.
        }
        
        EneterTrace.info("Capturing thread ended.");
    }
    
    private EventHandler<ResponseReceiverEventArgs> myClientConnected
        = new EventHandler<ResponseReceiverEventArgs>()
    {
        @Override
        public void onEvent(Object sender, ResponseReceiverEventArgs e)
        {
            onClientConnected(sender, e);
        }
    };
    
    private EventHandler<ResponseReceiverEventArgs> myClientDisconnected
        = new EventHandler<ResponseReceiverEventArgs>()
    {
        @Override
        public void onEvent(Object sender, ResponseReceiverEventArgs e)
        {
            onClientDisconnected(sender, e);
        }
    };
    

    private Runnable myCaptureWorker = new Runnable()
    {
        @Override
        public void run()
        {
            doCaptureVideo();
        }
    };
}


.NET Client Application

The .NET client is a simple WPF-based application. When the user clicks the 'Start Capturing' button, it creates a named pipe and sets VLC to use this named pipe as the video source. It also sets VLC to expect raw H.264 encoded video data. Then, using Eneter, it opens the connection to the Raspberry Pi service.
The streamed video is then received in the OnResponseMessageReceived(...) method, which simply writes it to the named pipe.

Please do not forget to provide the correct IP address of your Raspberry Pi and to check whether you need to update the path to VLC.

The code is very simple:

using System;
using System.IO.Pipes;
using System.Threading;
using System.Windows;
using Eneter.Messaging.MessagingSystems.MessagingSystemBase;
using Eneter.Messaging.MessagingSystems.TcpMessagingSystem;
using Eneter.Messaging.MessagingSystems.UdpMessagingSystem;
using VLC;


namespace RaspberryCameraClient
{
    /// <summary>
    /// Interaction logic for MainWindow.xaml
    /// </summary>
    public partial class MainWindow : Window
    {
        private IDuplexOutputChannel myVideoChannel;

        // VLC will read video from the named pipe.
        private NamedPipeServerStream myVideoPipe;

        private VlcInstance myVlcInstance;
        private VlcMediaPlayer myPlayer;

        public MainWindow()
        {
            InitializeComponent();

            System.Windows.Forms.Panel aVideoPanel = new System.Windows.Forms.Panel();
            aVideoPanel.BackColor = System.Drawing.Color.Black;
            VideoWindow.Child = aVideoPanel;

            // Provide the path to your VLC installation if it is not in the default location.
            myVlcInstance = new VlcInstance(@"c:\Program Files\VideoLAN\VLC\");

            // Use TCP messaging.
            // You can try to use UDP or WebSockets too.
            myVideoChannel = new TcpMessagingSystemFactory()
            //myVideoChannel = new UdpMessagingSystemFactory()
                // Note: Provide address of your service here.
                .CreateDuplexOutputChannel("tcp://192.168.1.17:8093/");
            myVideoChannel.ResponseMessageReceived += OnResponseMessageReceived;
        }

        private void Window_Closed(object sender, EventArgs e)
        {
            StopCapturing();
        }

        private void OnStartCapturingButtonClick(object sender, RoutedEventArgs e)
        {
            StartCapturing();
        }

        private void OnStopCapturingButtonClick(object sender, RoutedEventArgs e)
        {
            StopCapturing();
        }

        private void StartCapturing()
        {
            // Use unique name for the pipe.
            string aVideoPipeName = Guid.NewGuid().ToString();

            // Open pipe that will be read by VLC.
            myVideoPipe = new NamedPipeServerStream(@"\" + aVideoPipeName,
                                                    PipeDirection.Out, 1,
                                                    PipeTransmissionMode.Byte,
                                                    PipeOptions.Asynchronous, 0, 32764);
            ManualResetEvent aVlcConnectedPipe = new ManualResetEvent(false);
            ThreadPool.QueueUserWorkItem(x =>
            {
                myVideoPipe.WaitForConnection();

                // Indicate VLC has connected the pipe.
                aVlcConnectedPipe.Set();
            });

            // VLC connects the pipe and starts playing.
            using (VlcMedia aMedia = new VlcMedia(myVlcInstance, @"stream://\\\.\pipe\" + aVideoPipeName))
            {
                // Set up VLC so that it can process raw H.264 data (i.e. not in an mp4 container).
                aMedia.AddOption(":demux=H264");

                myPlayer = new VlcMediaPlayer(aMedia);
                myPlayer.Drawable = VideoWindow.Child.Handle;

                // Note: This will connect the pipe and read the video.
                myPlayer.Play();
            }

            // Wait until VLC connects the pipe so that it is ready to receive the stream.
            if (!aVlcConnectedPipe.WaitOne(5000))
            {
                throw new TimeoutException("VLC did not open connection with the pipe.");
            }

            // Open connection with service running on Raspberry.
            myVideoChannel.OpenConnection();
        }

        private void StopCapturing()
        {
            // Close connection with the service on Raspberry.
            myVideoChannel.CloseConnection();

            // Close the video pipe.
            if (myVideoPipe != null)
            {
                myVideoPipe.Close();
                myVideoPipe = null;
            }

            // Stop VLC.
            if (myPlayer != null)
            {
                myPlayer.Dispose();
                myPlayer = null;
            }
        }

        private void OnResponseMessageReceived(object sender, DuplexChannelMessageEventArgs e)
        {
            byte[] aVideoData = (byte[])e.Message;

            // Forward received data to the named pipe so that VLC can process it.
            myVideoPipe.Write(aVideoData, 0, aVideoData.Length);
        }
    }
}

2 comments:

  1. What adjustments need to be made to this to work on Android?

    Replies
    1. Hi Rob, you need to figure out how to get video frames from Android. If they are in H.264 format you can use the .NET client without changes.
      If they are in another format you may need to make some small changes in setting up the VLC parameters.