I am using MJPEG here. You may use H.264 instead, but MJPEG will be easier to interface with OpenCV later; see this post.
First, update the firmware:
sudo rpi-update
This will fetch the latest RPi firmware, including the latest raspivid binary for streaming.
Then, install GStreamer:
sudo apt-get install gstreamer1.0 gstreamer1.0-plugins-bad
The "gstreamer1.0-plugins-bad" package provides the "jpegparse" plugin needed for streaming MJPEG over the network.
Once everything is set up, you can start streaming by executing:
raspivid -t 0 -cd MJPEG -w 1280 -h 720 -fps 40 -b 8000000 -o - | gst-launch-1.0 fdsrc ! "image/jpeg,framerate=40/1" ! jpegparse ! rtpjpegpay ! udpsink host=<client_ip> port=<client_port>
Here is an explanation of the supplied flags/plugins (a short Python sketch that launches the same pipeline from a script is given after the list):
- raspivid
- -t 0: Run raspivid forever; the program will not stop after a certain time
- -cd MJPEG: The default output is H.264; this flag forces the output to MJPEG
- -w 1280: Set the output video width to 1280 px
- -h 720: Set the output video height to 720 px
- -fps 40: Set the frame rate to 40 fps
- -b 8000000: Set the target bit rate to 8000000 bps (8 Mbps)
- -o -: Pipe the output data to stdout
- gst-launch-1.0
- fdsrc: Read data from stdin (the stdout of raspivid)
- "image/jpeg,framerate=40/1": Caps for jpegparse; we tell jpegparse that the frame data is JPEG and the frame rate is 40 fps (matching the value given to raspivid's -fps flag)
- jpegparse: Parse JPEG frames. Since the data from raspivid does not necessarily arrive one frame at a time, jpegparse is needed to combine incoming data fragments into complete frames
- rtpjpegpay: Wrap the JPEG frames into RTP payloads
- udpsink: Transmit the RTP payloads to the specified host and port via UDP
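If you prefer to start the sender from a script instead of typing the one-liner, here is a minimal Python sketch (my own addition, not part of the original setup) that launches the same raspivid | gst-launch-1.0 pipeline with subprocess. The client address and port below are placeholders you must replace with your own values.

import shlex
import subprocess

CLIENT_IP = "192.168.1.100"  # placeholder: replace with your client's IP
CLIENT_PORT = 5000           # placeholder: replace with your client's port

# Same raspivid command as above: MJPEG, 1280x720, 40 fps, 8 Mbps, output to stdout
raspivid_cmd = "raspivid -t 0 -cd MJPEG -w 1280 -h 720 -fps 40 -b 8000000 -o -"

# Same GStreamer pipeline as above; the quotes around the caps in the shell
# version only protect them from the shell, so they are not needed here
gst_cmd = (
    "gst-launch-1.0 fdsrc ! image/jpeg,framerate=40/1 ! jpegparse ! rtpjpegpay "
    f"! udpsink host={CLIENT_IP} port={CLIENT_PORT}"
)

# Pipe raspivid's stdout into gst-launch-1.0, just like the shell "|" does
raspivid = subprocess.Popen(shlex.split(raspivid_cmd), stdout=subprocess.PIPE)
gst = subprocess.Popen(shlex.split(gst_cmd), stdin=raspivid.stdout)
raspivid.stdout.close()  # so raspivid gets SIGPIPE if gst-launch-1.0 exits
gst.wait()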
On the client machine (Windows in this example, hence the .exe binary), change to the GStreamer binaries directory and start the receiver:
cd <gstreamer_binaries_directory>
gst-launch-1.0.exe udpsrc port=<client_port> ! "application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)JPEG,a-framerate=(string)40.000000,a-framesize=(string)1280-720,payload=(int)26" ! rtpjpegdepay ! decodebin ! autovideosink
Note that you may need to modify "clock-rate", "a-framerate", "a-framesize" and "payload" according to the server (the RPi). You can find these parameters by running gst-launch-1.0 in verbose mode on the Raspberry Pi:
raspivid -t 0 -cd MJPEG -w 1280 -h 720 -fps 40 -b 8000000 -o - | gst-launch-1.0 -v fdsrc ! "image/jpeg,framerate=40/1" ! jpegparse ! rtpjpegpay ! udpsink host=<client_ip> port=<client_port>
The supplied "-v" flag turns on verbose mode; the negotiated caps ("clock-rate", "a-framerate", "a-framesize" and "payload") will appear in the output.
Make sure the client runs with the same caps ("clock-rate", "a-framerate", "a-framesize" and "payload") as the server, or else you may not see the video properly.
When everything is running, a window will pop up showing the video from the camera.
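As mentioned at the top of the post, the reason for choosing MJPEG is easier interfacing with OpenCV. Here is a minimal Python sketch of that client side (my own addition, assuming your OpenCV build has GStreamer support): it depayloads and decodes the stream into ordinary BGR frames instead of displaying it with autovideosink. The port is a placeholder, and the caps must match the ones reported by the Pi in verbose mode.

import cv2

PORT = 5000  # placeholder: must match the "port" given to udpsink on the Pi

# Same receiving pipeline as above, but ending in appsink so OpenCV gets the frames
pipeline = (
    f'udpsrc port={PORT} caps="application/x-rtp,media=(string)video,'
    'clock-rate=(int)90000,encoding-name=(string)JPEG,payload=(int)26" '
    "! rtpjpegdepay ! jpegdec ! videoconvert ! video/x-raw,format=BGR "
    "! appsink drop=true"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Could not open pipeline; is OpenCV built with GStreamer support?")

while True:
    ok, frame = cap.read()  # frame is an ordinary BGR numpy array
    if not ok:
        break
    cv2.imshow("RPi MJPEG stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()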