hogmartin
Posts: 3
Joined: Thu Nov 21, 2013 10:05 pm
Location: East Lansing MI USA

Near-realtime video streaming?

Fri Apr 04, 2014 2:30 pm

I've used the RasPi security cam guide to put together a neat little Pi camera to watch the place when I'm gone. I can monitor the camera at any time by pointing a browser to port 8081 and watching the MJPEG feed, and if motion is detected, it will start sending captured frames to my remote server for safekeeping and send me an SMS message as well. I was impressed with how easy and configurable the whole project is.
However, there appears to be a limitation that could sink another project that I had in mind. I want to build a small mobile robot that I can control over WiFi or XBee, with a near-realtime video stream coming back to me over WiFi. My original intention was to use the Pi and its camera board to send the video back to me, and if the processing and network overhead from the live stream was inhibiting the controls' responsiveness, use a separate Arduino solution to operate the robot and dedicate the Pi solely to the video. The problem is that I haven't found a solution for video streaming from the Pi that provides anything close to a sufficient frame rate for live control. The motion preview MJPEG, for example, appears to max out at about one frame per second, which is fine for a security camera but totally insufficient for, say, chasing a cat around the house.
I've done a bit of searching to see if anyone had addressed this application, but most searches turn up either low frame rate streams for security cameras, using the Pi as a streaming home media server, or watching Internet video streams on the Pi itself. Does anyone know if a high frame rate/low latency IP stream from the Pi and its camera board is even possible? If so, is there a good guide or tutorial somewhere that I've missed? If not, are there any alternate workaround solutions that I should investigate?

Thanks!

redhawk
Posts: 3465
Joined: Sun Mar 04, 2012 2:13 pm
Location: ::1

Re: Near-realtime video streaming?

Fri Apr 04, 2014 3:13 pm

Compressed video is okay if you can live with the latency, but for low latency you're going to struggle doing this digitally and then sending the data remotely.
Personally, if I wanted real-time video streaming I would use a TV sender (one that broadcasts analogue video to a receiver box) or, alternatively (and more costly), an HDMI video sender dongle.

Richard S.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7411
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Near-realtime video streaming?

Fri Apr 04, 2014 3:40 pm

MJPEG streaming off the GPU will certainly go up to 720p30; I can't remember whether 1080p30 works.
Motion has a load of code that can throttle the framerate as defined by the config file, among other things. It's probably the wrong tool to use for low latency streaming.
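For reference, that throttle lives in Motion's config file. A minimal sketch (option names vary between Motion versions; 3.2.x uses webcam_*, newer builds use stream_*, and webcam_maxrate defaults to 1, which would explain the ~1fps preview mentioned above):

Code: Select all

# /etc/motion/motion.conf (illustrative values)
framerate 30          # cap on the capture frame rate
webcam_maxrate 30     # cap on the MJPEG preview stream rate (default 1)
webcam_quality 50     # JPEG quality of the preview stream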

Have a search on the forums - lots of people have set up low latency links with gstreamer or modifications to raspivid. Raspimjpeg springs to mind - http://www.raspberrypi.org/forums/viewt ... 43&t=61771

If you've got a decent network link, then the overall latency should be less than 3 frames (100ms).
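As a rough illustration of that kind of low latency link (a sketch only; the IP address, port and player flags are placeholders, and netcat syntax differs between the BSD and traditional variants):

Code: Select all

# on the viewing PC, start the listener first (assumes mplayer is installed)
nc -l -p 5001 | mplayer -fps 30 -cache 1024 -demuxer h264es -

Code: Select all

# on the Pi, dump the raw H.264 elementary stream straight into the socket
raspivid -t 0 -w 1280 -h 720 -fps 30 -o - | nc 192.168.1.100 5001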
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

mikerr
Posts: 2781
Joined: Thu Jan 12, 2012 12:46 pm
Location: UK
Contact: Website

Re: Near-realtime video streaming?

Fri Apr 04, 2014 4:45 pm

I use 30fps H.264 via my Android app (see sig), but that does have a latency of 0.5s or more; WiFi and buffering don't help.
Android app - Raspi Card Imager - download and image SD cards - No PC required !

hogmartin
Posts: 3
Joined: Thu Nov 21, 2013 10:05 pm
Location: East Lansing MI USA

Re: Near-realtime video streaming?

Fri Apr 04, 2014 5:01 pm

I'll take a look at gstreamer and Raspimjpeg, thanks. I had previously tried mjpg-streamer but the frame rate was no better than motion's monitor. I agree that motion probably has far too much processing overhead to be the right choice in this context; I think I just meant to indicate that I had gotten at least some streaming solution working. The frame rate doesn't have to be TV quality at all, but it needs to be better than 1fps (otherwise the thing would be impossible to drive).

Regarding hardware, I guess my question was born of the knowledge that I can get a decent frame rate, resolution, and latency with Skype with people a few hundred miles away (granted, with a proprietary codec), even on middling hardware. Given that the Pi wouldn't have to do any simultaneous decompression, I was hoping I could get similar results transmitting video only. I would prefer not to use separate video transmission hardware (e.g. a TV or HDMI sender) for various reasons, if there's a chance the Pi can satisfy these requirements itself.

mikerr - what compression and transmission software are you using on the Pi side for H.264? 0.5s latency might be workable.
Thanks!

jamesh
Raspberry Pi Engineer & Forum Moderator
Raspberry Pi Engineer & Forum Moderator
Posts: 23874
Joined: Sat Jul 30, 2011 7:41 pm

Re: Near-realtime video streaming?

Fri Apr 04, 2014 7:14 pm

On a wired network from Pi to desktop, I was getting 2 or 3 frames of delay (a tenth of a second) on H.264 1080p30. It was almost unnoticeable. So that's pretty much a best case.
Principal Software Engineer at Raspberry Pi (Trading) Ltd.
Contrary to popular belief, humorous signatures are allowed. Here's an example...
“I think it’s wrong that only one company makes the game Monopoly.” – Steven Wright

Swap_File
Posts: 18
Joined: Sun Apr 06, 2014 5:31 pm

Re: Near-realtime video streaming?

Sun Apr 06, 2014 5:46 pm

This is the best way I have found to stream video from Pi to Pi. There is currently a bug (http://www.raspberrypi.org/forums/viewt ... 31#p522531) in how gstreamer queries V4L2 devices, so you have to pipe raspivid into gstreamer for it to work.

Code: Select all

raspivid -w 800 -h 480 -fps 20 -n -pf baseline -ex auto -o - -t 0 | \
gst-launch-0.10 -v fdsrc !  h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.1.255 port=9000 

Code: Select all

gst-launch-1.0 -v udpsrc port=9000 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! h264parse ! omxh264dec ! eglglessink sync=false 
To install, add this to /etc/apt/sources.list

Code: Select all

deb http://vontaene.de/raspbian-updates/ . main

Code: Select all

sudo apt-get update
sudo apt-get install libgstreamer1.0-0-dbg gstreamer1.0-tools libgstreamer-plugins-base1.0-0 gstreamer1.0-plugins-good gstreamer1.0-plugins-bad-dbg gstreamer1.0-omx gstreamer1.0-alsa
It's about 10% CPU usage on the sending side and 30-50% on receive. Receive CPU usage should be lower; I'm not quite sure what part of gstreamer is using it. Latency is 150ms or so, and it's WiFi friendly.

I'm still working on getting low latency sound working; I can't seem to get it below 270ms. Something somewhere (maybe in ALSA or in hardware?) is buffering it. Also note that gstreamer-1.0 won't let me hook the alsasrc directly to the udpsink, and it then uses way more CPU, so I stick to 0.10 here. It uses under 10% on both ends.

Code: Select all

gst-launch-0.10 -v alsasrc  device=hw:0  ! audio/x-raw-int, rate=48000, channels=1, endianness=1234, width=16, depth=16, signed=true ! udpsink host=192.168.1.255 port=5000

Code: Select all

gst-launch-0.10 udpsrc port=5000 ! audio/x-raw-int, rate=48000, channels=1, endianness=1234, width=16, depth=16, signed=true ! audioconvert ! alsasink 

towolf
Posts: 421
Joined: Fri Jan 18, 2013 2:11 pm

Re: Near-realtime video streaming?

Mon Apr 07, 2014 11:32 am

For your audio, here's your buffer:

Code: Select all

gst-inspect-1.0 alsasrc
[...]
  buffer-time : Size of audio buffer in microseconds, this is the maximum amount of data that is buffered in the device and the maximum latency that the source reports
                flags: readable, writable
                Integer64. Range: 1 - 9223372036854775807 Default: 200000
That is, 200ms of default buffer.
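So, as a sketch (the 20ms/10ms values are only illustrative), you could try overriding it on the sending side:

Code: Select all

gst-launch-0.10 -v alsasrc device=hw:0 buffer-time=20000 latency-time=10000 ! audio/x-raw-int, rate=48000, channels=1, endianness=1234, width=16, depth=16, signed=true ! udpsink host=192.168.1.255 port=5000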

Also, you can’t just pipe PCM audio to udpsrc because you should use a RTP payloader. Try inserting a pair of "rtpL16pay" and "rtpL16depay".

But then you are sending raw audio over WiFi. That is roughly 770 kbit/s for 48 kHz 16-bit mono, in large UDP packets. Maybe you want to use a light audio codec, perhaps ulaw or alaw. Those are telephone quality but very low CPU complexity. Maybe try "speexenc" or "opusenc" on low complexity; Opus was conceived with low encoding latency.
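A sketch of the alaw route, using the standard PCMA payloader from gst-plugins-good (8 kHz telephone quality; the address and port are placeholders):

Code: Select all

gst-launch-0.10 -v alsasrc device=hw:0 ! audioconvert ! audioresample ! audio/x-raw-int, rate=8000, channels=1 ! alawenc ! rtppcmapay ! udpsink host=192.168.1.255 port=5000

Code: Select all

gst-launch-0.10 udpsrc port=5000 caps="application/x-rtp, media=(string)audio, clock-rate=(int)8000, encoding-name=(string)PCMA" ! rtppcmadepay ! alawdec ! audioconvert ! alsasink sync=false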

And you’d want to use a single gst-launch to maintain A/V sync.

Swap_File
Posts: 18
Joined: Sun Apr 06, 2014 5:31 pm

Re: Near-realtime video streaming?

Mon Apr 07, 2014 1:29 pm

I've tried manually setting the buffer-time (and latency-time) as low as 10ms on both the send and receive ends, but it doesn't seem to affect the observed latency, even though gstreamer reports actual-buffer-time as being set properly. It does start to make the stream crackle below 10ms of buffer, so it must be doing something.

When I get time I'll need to investigate this further; it's possible that something is still using a 200ms buffer and lying about it (or maybe there is another layer of buffering somewhere?).
270ms observed - 200ms of buffer = 70ms, which sounds reasonable.

I did try using "rtpL16pay" and "rtpL16depay" during testing (along with a few other codecs and payloaders), but they didn't provide any noticeable improvement; they just used more CPU time, which is at a premium to start with. I don't mind wasting some bandwidth, and for mono it's only about 700kbit/s.

I don't particularly care about maintaining perfect sync either, as long as it's as fast and low latency as possible, so I didn't bother using a single gst-launch and combining the two streams in a container (more CPU again).

Swap_File
Posts: 18
Joined: Sun Apr 06, 2014 5:31 pm

Re: Near-realtime video streaming?

Tue Apr 08, 2014 12:40 am

I found a solution. It turns out that the udpsrc buffer-size on the receiving end was causing my problems. Now I'm using:

Send:

Code: Select all

gst-launch-0.10 alsasrc  device=hw:0  ! audio/x-raw-int, rate=48000, channels=1, endianness=1234, width=16, depth=16, signed=true ! udpsink host=192.168.1.255 port=5000
Receive:

Code: Select all

gst-launch-0.10 udpsrc buffer-size=1 port=5000 ! audio/x-raw-int, rate=48000, channels=1, endianness=1234, width=16, depth=16, signed=true ! alsasink sync=false
I'm not sure if a buffer-size of 1 is going to cause problems somewhere else, but it seems to work for now. I'm seeing 8% CPU usage on the sending side and 15% on receive, on a non-overclocked Pi.

The latency is low enough that I can't measure it via the "clap" method (under 80ms?), and I'm currently too lazy to hook up my scope, so I'm going to call it good.

It works perfectly for me; it can handle dropout of either the receiver or sender and will continue playback when the connection comes back, without reloading the sender or receiver.

Edit: It looks like the default value of buffer-size in gstreamer is 0, but this is being ignored by something elsewhere and defaulting to a much higher value. Also, for video the default buffer-size doesn't seem to affect observed latency, but I need it set to at least ~20000 for my video stream to play.
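For the video receive side, that works out to something like this (a sketch; only the buffer-size value differs from the receive pipeline earlier in the thread):

Code: Select all

gst-launch-1.0 -v udpsrc buffer-size=20000 port=9000 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! h264parse ! omxh264dec ! eglglessink sync=false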

Edit2: If anyone is curious, I believe the extra CPU usage on the receive side (for both video and audio) is due to udpsrc, but there really isn't a way to get around it without doing some major rework, and in my case at least, it's not that bad:

http://gstreamer-devel.966125.n4.nabble ... 55463.html
Last edited by Swap_File on Tue Apr 08, 2014 6:58 pm, edited 1 time in total.

unphased
Posts: 23
Joined: Tue Apr 08, 2014 2:44 pm

Re: Near-realtime video streaming?

Tue Apr 08, 2014 2:51 pm

Hi Swap_File, it's really encouraging, after hours upon hours of searching, to find that there really might be an H.264 streaming solution that works for both latency (I couldn't ask for better than sub-100ms!) and a potentially flaky WiFi link (UDP and RTP will be a must!).

The reason both of these are so crucial is, of course, the application: connecting a Raspberry Pi flying in the air with a ground control computer linked to it via WiFi.

I wonder if anyone's been able to get this working with the video client being not another Pi but a real computer instead? What I need is to be able to perform image processing on the video stream rapidly... conceptually, being able to play it back should be the same as having the uncompressed video frame in memory, which is the same as being able to perform image processing on it.

Swap_File
Posts: 18
Joined: Sun Apr 06, 2014 5:31 pm

Re: Near-realtime video streaming?

Tue Apr 08, 2014 6:43 pm

You might be better off using mjpeg-streamer (with the new V4L2 drivers, and not raspistill) for Pi to PC streaming.

mjpeg-streamer has lower CPU usage and appears to have slightly less latency (with V4L2) than gstreamer, but it has a few hard-coded tricks in its HTTP server to support streaming to web browsers that didn't work well with gstreamer on the reception end for me. I ended up recompiling mjpeg-streamer to get it to work, but then gave up on it for Pi to Pi streaming because it used too much CPU on the decode end. I couldn't get any of the hardware MJPEG decoding working in gstreamer on the Pi, and omxplayer added way too much latency.

This is what I used for setting up mjpeg-streamer:
http://blog.miguelgrinberg.com/post/how ... spberry-pi
And this to use the V4L2 drivers:
http://www.linux-projects.org/modules/s ... e&artid=16

I was using it at 30fps 800x480, but I'm not sure what launch settings I was using.
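The usual mjpg-streamer invocation against the V4L2 device looks roughly like this (a sketch; the paths, resolution and port are illustrative, and the plugin names assume the stock input_uvc/output_http build):

Code: Select all

./mjpg_streamer -i "./input_uvc.so -d /dev/video0 -r 800x480 -f 30" -o "./output_http.so -w ./www -p 8080"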

With enough CPU power, you could stream via mjpeg-streamer on the Pi side and capture via gstreamer on the PC side. That might be the absolute lowest latency. Or you could just view it in Firefox on a laptop.
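A sketch of the gstreamer capture on the PC side, assuming mjpg-streamer's standard stream URL and gst-plugins-good installed (this is where I hit the multipart quirks mentioned above, so treat it as a starting point):

Code: Select all

gst-launch-1.0 souphttpsrc location="http://<pi-ip>:8080/?action=stream" is-live=true ! multipartdemux ! jpegdec ! videoconvert ! autovideosink sync=false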

unphased
Posts: 23
Joined: Tue Apr 08, 2014 2:44 pm

Re: Near-realtime video streaming?

Wed Apr 09, 2014 3:23 pm

Interesting... I was looking at http://picamera.readthedocs.org/en/latest/index.html and it says he could only get VGA res at 15fps using MJPEG, but 800x480 at 30fps is a pretty large improvement there.

I really would prefer to perform processing onboard, however, in order to have the best results for control feedback. Streaming would then be a secondary concern, although not initially, as I still have to see what the hell it's doing while I'm developing it.

I'm not having much luck at all on OS X with gstreamer; it looks like I'll need to compile it manually (which I don't mind doing). The package I downloaded didn't seem to set anything up properly, and none of the gstreamer commands that people post work on the OS X end.

Swap_File
Posts: 18
Joined: Sun Apr 06, 2014 5:31 pm

Re: Near-realtime video streaming?

Mon May 26, 2014 1:24 pm

Something changed in the last few months, and now, after updating to the latest version and firmware, my system CPU usage is much higher (sometimes an additional 20% sy on top of what was already being used). I can't pinpoint it to any one program; it seems to be related to heavy USB bus use, but I'm not sure. It seems worse the more additional network traffic I have going out of the Pi, and worse when I am streaming video and audio (USB microphone) at the same time.

The newer software does fix some of the sound clicks and quirks I was having, so I'm OK with the tradeoff of more CPU usage if it fixes the bugs. I'll have to dig into it later and see exactly what the cause is when I have more time.
