
H264 stream from MotionEye? #1259

Closed
mr-leo opened this issue Dec 2, 2017 · 34 comments

Comments


mr-leo commented Dec 2, 2017

I have motionEyeOS installed on my Raspberry Pi 3. Can motionEye generate an h264 stream, or only an mjpeg stream? I'm using motionEyeOS as a fast network camera and record using my Synology NAS, but the files are huge because of mjpeg.


ccrisan commented Dec 3, 2017

motionEye can only stream mjpeg (it's in fact motion that only streams mjpeg). However, movies (those saved to storage) are actually created using efficient codecs such as h264. How does your Synology store the mjpeg stream? I doubt it stores it as a series of jpegs - that would be wrong for many reasons.

@MrGlasspoole

I really like motionEye, but the problem is that some devices don't support mjpeg (TV, satellite receiver...).
I have it running on a server, and on the Pi I just use Fast Network Camera.
It would be nice if the server could stream h264 over RTSP.


fawick commented Jan 22, 2018

Would it be possible to add a "Fast Network Camera" mode to motioneyeos for all cameras, like this:

  • The motioneyeos instance does not run the motion daemon itself.
  • Instead, the source (no matter whether it's a Pi CSI cam or a "normal" webcam) is read and h264-encoded by FFMPEG and then sent via RTSP. (I believe there is a tool, ffserver, for this in the FFMPEG project.)
  • The Raspberry Pi would probably benefit most from this due to its OMX h264 hardware encoder.

The motion detection could then run in motionEye (or, if a user prefers, zoneminder/shinobi/kerberos.io etc.) on a more powerful computer in the network.
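
Roughly, the pipeline I have in mind would look like this (just a sketch of the proposal above, not an existing feature):

Pi CSI cam / USB webcam ---> FFMPEG (h264_omx) ---> RTSP ---> motionEye / zoneminder / shinobi (on a more powerful machine)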


mr-leo commented Jan 23, 2018

I would pay for that


fawick commented Jan 23, 2018

I am going to take a shot at it in this branch https://github.com/fawick/motioneyeos/tree/fast_network_webcam

When I have something worth showing, I'll send a PR.

@ccrisan I might need some advice on how to best put controls for this in the settings GUI at some point later.


ccrisan commented Jan 23, 2018

@fawick great initiative. I'm not sure how well such a setup would perform, but it's probably better than plain motion and having h264 streams coming out of motionEyeOS would indeed be a win.

As you probably already know, the FNC mode shuts down motion completely and runs streamEye instead. streamEye simply streams MJPEG generated by raspimjpeg.py, which could in theory be replaced by the recent official raspivid tool that now supports MJPEG.

The UI part is implemented in streameyectl.py and the streamEye daemon is controlled using streameye.sh shell script.

Now, in theory, one could write a drop-in replacement for the streamEye + raspimjpeg.py combo that accepts the same configuration options (we're talking about the same CSI RPi camera, after all) and starts the streaming via whatever protocol/process/pipeline it desires, thus requiring virtually no change from the UI point of view. Perhaps an extra Fast Network Camera option to select the streaming protocol/method would suffice, one that would select and start the appropriate streaming combo.

With regards to ffserver, I remember experimenting a bit trying to simulate an RTSP server, and I recall being quite disappointed with the resulting performance (in terms of delays, quality, CPU usage, etc.). Now I agree that the recent RPi OMX ffmpeg acceleration could drastically improve things, but it might be worth considering writing a simple standalone RTSP server, similar to streamEye (or even better, part of streamEye).

Either way, I'm more than happy to support this feature in any way that I can.


d-faure commented Jan 23, 2018

According to https://trac.ffmpeg.org/wiki/ffserver, ffserver is about to be deprecated!

Further reading of the documentation at https://www.ffmpeg.org/ffmpeg-protocols.html#rtsp makes me think that this could be implemented using regular ffmpeg.


fawick commented Jan 23, 2018

@ccrisan Thanks for the great feedback. Is my understanding correct that streamEye was written with the CSI camera in mind and is not capable of reading from a v4l2 source? My main motivation for working on this issue is that for my Pi B+/USB webcam, the motion daemon can only obtain 3-4 FPS (even when no motion detection was enabled on the Pi to which the camera was connected), while I have no problem encoding up to 15 FPS with FFMPEG and h264_omx on the same board in real time at a CPU load of around 30%. I understand that your experiments showed that using FFMPEG instead of motion made things even worse, but I assume that was CPU-based MJPEG encoding instead of hardware-accelerated h264, right?

I had the chance to play a bit with the ffmpeg/ffserver combo today after I was able to rebuild motioneyeos (*), but most of my experiments with it were unsuccessful. So far I was unable to generate a hardware-accelerated h264_omx stream with ffmpeg and feed it to ffserver. On the other hand, I had no problems writing an MP4 file with hardware acceleration on the very same Raspberry Pi B+, so I know that the board/software combo is able to do it, even in real time.

@d-faure: Thanks for the heads-up regarding the deprecation. I am quite relieved, actually, as ffserver is not as great and straightforward as I thought. Can you elaborate a bit on how ffmpeg can be used as an RTSP server? All I was able to understand from https://www.ffmpeg.org/ffmpeg-protocols.html#rtsp was that it is capable of sending data to a server, not of taking the server role on its own.

Maybe RTSP is not the optimal protocol for this purpose, but I am not aware of another alternative. I'm happy to take suggestions, though. :-)

(*) Kudos to you, @ccrisan, the motioneyeos/thingos buildchain is absolutely fantastic, and working with it was the most pleasant experience I have had in software engineering for a long time. I feared that simply building ffserver for ARM would take a lot of research and fiddling with compiler toolchains, when in fact it turned out to be fully automated already. I had no prior experience with buildroot, but found build.sh raspberrypi menuconfig / build.sh raspberrypi / build.sh raspberrypi mkimage to be mind-bogglingly easy. It took mere minutes to figure out how to make building ffserver the default. I am in awe of what the buildroot contributors and you have achieved here! That certainly lowers the entry barrier into IoT tinkering a lot!


fawick commented Jan 23, 2018

Has anyone of you tinkered with MPEG-DASH before? Or HLS?


jasaw commented Jan 23, 2018

@d-faure H264 RTSP can be implemented using raspivid and ffmpeg. I've done it before. Something like this:

FIFO=/tmp/live.h264
mkfifo $FIFO
raspivid -w 1280 -h 960 -fps 24 -g 24 -t 0 -b 600000 -o $FIFO &
# video stream only
ffmpeg -fflags +nobuffer -re -i $FIFO -c:v copy -f rtsp -metadata title=StreamEye rtsp://0.0.0.0:554
# video and audio stream
#ffmpeg -fflags +nobuffer -re -i $FIFO -fflags +nobuffer -re -f alsa -ar 16000 -ac 2 -i hw:1,0 -map 0:0 -map 1:0 -c:v copy -c:a aac -b:a 16k -ac 1 -f rtsp -metadata title=StreamEye rtsp://0.0.0.0:554

You can replace raspivid with a v4l2 source, but then replace -c:v copy with -vcodec h264_omx to enable hardware-accelerated encoding; a sketch of that variant follows below.

Disclaimer: I whipped this code up quickly without any testing, just to give you an idea of how it can be done.
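
To illustrate the v4l2 variant mentioned above, a hedged, untested sketch (it assumes a webcam on /dev/video0; same caveat as the snippet itself):

# capture raw frames from a v4l2 webcam and hardware-encode them with OMX
ffmpeg -f video4linux2 -framerate 24 -video_size 1280x960 -i /dev/video0 -c:v h264_omx -b:v 600k -f rtsp -metadata title=StreamEye rtsp://0.0.0.0:554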


d-faure commented Jan 23, 2018

@fawick ffmpeg on its own is only able to handle a single broadcast destination... using the RTSP protocol.
In order to handle several simultaneous clients, you'll need a relay server such as the one cited in the ffmpeg documentation: https://github.com/revmischa/rtsp-server

@jasaw I'd love to have this kind of stream available in Streameye


jasaw commented Jan 23, 2018

Yes, you'll need the RTSP server running on the FNC device to relay the stream.


d-faure commented Jan 24, 2018

Perhaps a more reliable RTSP server here: https://github.com/kzkysdjpn/rtsp-server


fawick commented Jan 24, 2018

@jasaw, @d-faure Thanks for your tips.

I indeed successfully streamed h264 at 30fps from the USB webcam connected to the Pi (hardware-accelerated) to a linux/amd64 machine with these commands:

# on the linux/amd64 machine (with IP address 172.16.42.1)
ffmpeg -rtsp_flags listen -i rtsp://172.16.42.1:5454 -f null /dev/null

# on the Pi (usually running motioneye)
/etc/init.d/S85motioneye stop # if one wants to reproduce this with an unmodified motioneye instance
ffmpeg -s 640x480 -i /dev/video0 -c:v h264_omx -f rtsp rtsp://172.16.42.1:5454

On the Pi, top reported the following load:

Mem: 92296K used, 287244K free, 132K shrd, 4688K buff, 37464K cached
CPU:  31% usr   5% sys   0% nic  59% idle   0% io   0% irq   4% sirq
Load average: 0.39 0.32 0.35 3/93 7288
  PID  PPID USER     STAT   VSZ %VSZ %CPU COMMAND
 7195  6715 root     R     105m  28%  38% ffmpeg -s 640x480 -i /dev/video0 -c:v h264_omx -f rtsp rtsp://172.16.42.1:5454

For 15fps, the CPU load was around 35%, and for 5fps around 32%.

So the next step would be to try out the suggested RTSP servers on the board and see whether I can fit them into the motioneyeos ecosystem.


ccrisan commented Jan 24, 2018

@fawick

Is my understanding correct that streamEye was written with the CSI camera in mind and is not capable of reading from a v4l2 source?

Not necessarily. streamEye was designed with no particular source in mind and reads its frames from standard input. I've successfully streamed from a V4L2 camera with ffmpeg + streamEye.

Guys, I would appreciate it if we didn't add Perl as a dependency to motionEyeOS. Also, is it only me, or does having an RTSP server written in Perl these days sound a bit strange?


fawick commented Jan 24, 2018

@ccrisan Thanks for clarifying the streamEye design.

Perl sure does sound old-fashioned. :-) I'd rather find an RTSP server in Go, actually. Although Python would probably suit the rest of motioneye best.


d-faure commented Jan 24, 2018

I was wondering, as we are looking for a way to broadcast an already hardware-encoded video stream, whether ffmpeg is the right tool for that.
Digging a bit further, I found GStreamer (https://gstreamer.freedesktop.org/features/), and some related tryouts.

But at least, I think I found a better RTSP relay server candidate.


fawick commented Jan 24, 2018

I was wondering, as we are looking for a way to broadcast an already hardware-encoded video stream, whether ffmpeg is the right tool for that.

The h264 stream is "already hardware-encoded" only if the Raspberry Pi CSI camera and raspivid are used. My goal for this issue is to find a general solution that also works with a v4l2 USB webcam. FFMPEG is the tool that grabs frames from the v4l2 device and feeds them to the hardware h264 encoder on the Broadcom SoC of the Raspberry Pi. But you are right in a way: gstreamer is another tool that should be able to do exactly that and then stream the output. Personally I feel more comfortable with ffmpeg at the moment (which took me long enough to understand in all of its glory), but I may look into gstreamer eventually.

Thanks for finding the RTSP server alternatives, I am going to check them out.


d-faure commented Jan 24, 2018

If the GStreamer based RTSP server is too heavy for the host, a lighter solution could be to use live555 as described here: https://emtunc.org/blog/02/2016/setting-rtsp-relay-live555-proxy/
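
For reference, the proxy invocation would be something like this (a hypothetical sketch based on the live555 defaults; the proxied stream name and port come from live555 itself, not from my own testing):

# relay an existing RTSP stream through the live555 proxy
live555ProxyServer rtsp://127.0.0.1:8554/source
# clients then connect to rtsp://<host>:554/proxyStream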


jasaw commented Jan 24, 2018

I used crtmpserver previously on the original Pi B, in conjunction with the ffmpeg commands that I posted yesterday. crtmpserver is a high performance streaming server written in C++, so no need to introduce a new language. It supports RTSP as well as other protocols. Worth considering.


fawick commented Jan 25, 2018

crtmpserver looks promising at first glance; I'll check it out in detail when I get the chance. Thanks for putting it on my radar.

@MarkusEakar

Some time ago I also played around successfully with the H264 v4l2 RTSP Server from MPromonet.
https://github.com/mpromonet/v4l2rtspserver

It's very versatile, as it can grab H264 streams as well as JPEG streams from v4l2 USB devices, and it can even handle two video sources or audio streams simultaneously.

For sources that only support uncompressed pixel formats, the author also provides some tools for encoding them to H264 (hardware-accelerated on the Raspberry) and mirroring them to a virtual v4l2 device.
https://github.com/mpromonet/v4l2tools

So perhaps it's worth considering them for implementation in MotionEye(OS)?
They are based on Live555.

Implementing RTSP/H264 streaming would be a real performance amplifier for MEye, so I would much appreciate it, although I'm unfortunately not a coder/programmer myself.

@mikedolx

Hi All,

I too use motioneyeos on my Raspberry Pi together with my Synology NAS (Surveillance Station) to record videos. First of all, I have to admit that the installation of motioneyeos itself and the integration into my Synology were flawless (thumbs up).

BUT, as the OP has stated, the recorded files are really large. To give the discussion some numbers, I have re-encoded my recorded files via HandBrake to both an H264 and an H265 encoded file, threw everything into a spreadsheet, and made some calculations. Here are the results:

  • Codec -> avg. file size
  • H264 -> 137.32 MB (16% compared to the original)
  • H265 -> 70.51 MB (8% compared to the original)
  • MJPEG -> 858.69 MB (100%)

The original stream was set up with a resolution of 1280x720 and ~12 FPS. Each file was a recording of ~30 min. A whole 'night' of recording is ~140 GB in total.

As you can see, I have also taken h265 into account - I was just curious how big the difference would be between h264 and h265. Of course it would be interesting to know whether the effort to implement h265 is the same as for h264, or whether an h265 implementation could benefit from the h264 one - but that's just curiosity.

Coming back to the OP's topic: when using motioneyeos with a Synology, it seems that the only way to save a lot of space is to export the recorded videos to a public share (from Surveillance Station), re-encode them, and import them back.


ccrisan commented Mar 13, 2018

@mikedolx so what would you propose?


atonn commented Apr 17, 2018

Do we need RTSP output for this? For modern HTML5 browsers, wouldn't webm (vp8), mp4 (h264) or ogg (theora) files be best? I have been fighting with ffmpeg for a while, live-converting an RTSP input stream to a file on disk and accessing that file via the browser, embedded in an HTML5 video tag. However:

ogg/theora: choppy playback in some browsers for me, generally weird playback behaviour, acceptable transcoding CPU load

webm/vp9: worked perfectly in the browser, but with no encoding hardware acceleration available, it's just not feasible (massive CPU load)

mp4/h264: almost zero CPU load since you can just use -c:v copy for standard RTSP camera streams. HOWEVER, apparently mp4 is a tricky container format to get live-streaming right; I could never get it to behave like you would expect a live stream of your camera to behave.

It probably involves building/shuffling an index around in real-time, and constantly chopping "old" live stream data off the front of the mp4 file. But all my googling and trying around so far has been unsuccessful.
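
For what it's worth, ffmpeg's HLS muxer does exactly that index/segment juggling; a hedged, untested sketch (the camera URL and output path are placeholders):

# segment the RTSP input into a rolling HLS playlist, copying the h264 stream
ffmpeg -i rtsp://<camera>/stream -c:v copy -f hls -hls_time 2 -hls_list_size 5 -hls_flags delete_segments /var/www/html/live.m3u8

The browser then plays live.m3u8 through an HLS-capable player instead of a raw mp4 file.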


jasaw commented Dec 5, 2018

@fawick Did you get anywhere with enabling RTSP stream?

I managed to cross compile crtmpserver and v4l2rtspserver, and gave them a test run.

  • I couldn't get v4l2rtspserver to work at all. It kept complaining about some zero value field in RTSP stream.
  • I had more success with crtmpserver. I enabled the RTSP acceptor in the example flvplayback.lua, ran crtmpserver flvplayback.lua, then raspivid -w 1280 -h 960 -fps 24 -g 24 -t 0 -b 600000 -o - | ffmpeg -fflags +nobuffer -re -i - -c:v copy -strict -2 -f rtsp -metadata title=StreamEye rtsp://0.0.0.0:554 to feed the h264 video stream. To view the RTSP stream, point VLC to rtsp://cam-ip-address/StreamEye. This works quite well (1% CPU usage on an RPi2), but I'm not sure what the best way is to view the camera stream through HTTP. We could:
    1. Use ffmpeg to generate a 2nd output in mjpeg format and pipe it to streameye. Not sure if ffmpeg can do mjpeg encoding.
    2. Use ffserver to transcode to mjpeg, but ffserver has been deprecated.
    3. Update streameye to support other video formats, and pipe ffmpeg's 2nd output to it.

@ccrisan Any thoughts?

As a bonus with crtmpserver method, it's really easy to finally add audio support. Just need to get ffmpeg to read audio device and encode to mp3 or AAC.
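
For convenience, here is the crtmpserver test from the list above as a copy-pasteable sequence (same commands, just reproduced for readability):

# terminal 1: crtmpserver with the RTSP acceptor enabled in flvplayback.lua
crtmpserver flvplayback.lua
# terminal 2: feed the h264 camera stream into the server
raspivid -w 1280 -h 960 -fps 24 -g 24 -t 0 -b 600000 -o - | ffmpeg -fflags +nobuffer -re -i - -c:v copy -strict -2 -f rtsp -metadata title=StreamEye rtsp://0.0.0.0:554
# then view with VLC at rtsp://<cam-ip-address>/StreamEye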


jasaw commented Dec 8, 2018

I found a better way, which might be workable.

raspivid ---> Live555 ---> any RTSP client
                  +------> ffmpeg (to mjpeg) ---> streameye ---> motioneye

raspivid does the H264 encoding, pipes it to Live555 RTSP server which can serve multiple clients simultaneously.

To test this yourself, follow these steps:

  1. Apply this patch to live555ProxyServer by placing it in motioneyeos source under package/live555 directory.
  2. Compile motioneyeos source.
  3. Get the binary over to your motioneyeos device.
  4. Run raspivid -w 1280 -h 960 -fps 10 -g 30 -t 0 -b 600000 -o - | live555ProxyServer -i. You can view the RTSP stream on VLC on rtsp://<RPi_IP_address>/h264.
  5. Run ffmpeg -i "rtsp://127.0.0.1/h264" -c:v mjpeg -vf fps=1 -q:v 3 -bufsize 1000000 -maxrate 400000 -f mjpeg pipe:1 | /usr/bin/streameye -p 8081.
  6. Run meyectl startserver -c /data/etc/motioneye.conf.
  7. Now you can log in to motionEye just like before and you should get a video feed too. You can also use any RTSP client to play the same video feed via rtsp://<RPi_IP_address>/h264.


jasaw commented Dec 10, 2018

This is a much better solution:

raspivid ---> Live555 +---> any RTSP client
                      +---> motion ---> motioneye

Simultaneous H264/RTSP streams to multiple clients.
motion here is just another RTSP client, one that does the RTSP-to-mjpeg conversion.
motioneye displays the mjpeg on the web front end.

  1. Apply this patch and this patch to live555ProxyServer.
  2. Compile motioneyeos source.
  3. Flash compiled motioneyeos image onto SD card.
  4. Copy this motion thread-1.conf file to the device. fnc-thread-1.conf.txt
  5. Run raspivid -w 1280 -h 720 -fps 20 -g 200 -t 0 -b 600000 -o - | live555ProxyServer -i
  6. Run motion -c fnc-thread-1.conf -n
  7. You should be able to see the live mjpeg stream from your web browser at address http://<RPI_IP_Address>:8081. MotionEye should also be able to get the same stream for web front end.
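
For context, a guess at the relevant lines of that thread config (the attached fnc-thread-1.conf.txt is authoritative; this is only a hypothetical sketch of the idea):

# motion acts as an RTSP client against the local live555 server
netcam_url rtsp://127.0.0.1/h264
# serve the resulting mjpeg stream where motionEye expects it
stream_port 8081
stream_localhost off
# keep the mjpeg frame rate low, it only feeds the web front end
framerate 5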

This works well as long as CPU is not under heavy load, e.g. keep mjpeg frame rate low to minimize CPU load as this is used for web front end only.

@ccrisan @fawick @d-faure I would be happy to run with this set up. What do you guys think?


jasaw commented Dec 16, 2018

This is the best option by far that supports live audio and video streaming over RTSP. It reads from a V4L2 video device and uses the OMX encoder, so technically we can use the RPi-Cam or any USB webcam. It also reads audio from an ALSA device, so it supports any USB mic or audio card. The GStreamer RTSP server is able to serve simultaneous RTSP streams to multiple clients. motion reads the RTSP stream and converts it to mjpeg, and motionEye displays the mjpeg on the web front end.

RPi-Cam (Video) ---+                 +--- Any RTSP Client
                   +--- GStreamer ---+
USB Mic (Audio) ---+                 +--- motion --- motionEye

We could remove motion from the pipeline if we add RTSP streaming support to motionEye. WebRTC looks quite suitable, so probably transcode RTSP to WebRTC, but I'm no expert in web stuff.

Here's what I did:

  1. Upgrade rpi-firmware. The version that we are using has a bug in the OMX IL, so won't work with some software. GStreamer is one of them.

  2. Enable GStreamer and various plugins in motionEyeOS source. I'll try to list the plugins needed when I have some time to go through my config.

  3. Increase root partition size by 50MB. GStreamer is huge, and I couldn't fit the rootfs in 200MB, so I increased it. I didn't check exactly how much space is needed by GStreamer.

  4. Compile motionEyeOS from source and flash it to SD card.

  5. Copy this motion thread-1.conf file to the device.
    fnc-thread-1.conf.txt

  6. Copy GStreamer's RTSP server example application test-launch to the device.

  7. Start GStreamer: test-launch "( v4l2src device=/dev/video0 ! "video/x-raw,format=YV12,width=1280,height=720,framerate=20/1" ! videoconvert ! omxh264enc target-bitrate=600000 control-rate=variable ! video/x-h264,profile=high ! rtph264pay name=pay0 pt=96 alsasrc "device=hw:1,0" ! audioresample ! audio/x-raw,rate=16000,channels=2 ! queue ! voaacenc ! rtpmp4gpay pt=97 name=pay1 )"

  8. Start a dummy RTSP client: gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! fakesink. This is required to work around motion being the first RTSP client, which causes low bit rate and missing audio for subsequent RTSP players.

  9. Start motion with the special thread-1.conf file: motion -c fnc-thread-1.conf -n

  10. Finally start motionEye: /etc/init.d/S85motioneye start

This is the CPU and memory usage running on my Raspberry Pi 2, 1280 x 720 resolution, 20 fps.

  VSZ %VSZ %CPU COMMAND
 223m  30%  26% test-launch ( v4l2src device=/dev/vide
92808  12%  14% motion -c /home/ftp/sdcard/fnc-thread-
 149m  20%   0% gst-launch-1.0 rtspsrc location=rtsp:/


jasaw commented Dec 16, 2018

I've evaluated several approaches and here's a brief summary:

  1. v4l2rtspserver : No active development or bug fixes. Difficult to cross compile. OMX encoding didn't work, probably caused by OMX encoding bug.
  2. crtmpserver : Not much development or bug fixes recently (last commit was 3 years ago). Some effort to cross compile. Requires Lua. Latest version wasn't reliable when I tested it.
  3. ffmpeg and ffserver : Works well, except ffserver has been deprecated and removed from ffmpeg package.
  4. raspivid and live555 : Works well, very low latency. Supporting audio is very difficult. Software is very lean.
  5. gstreamer : Works well, but slightly higher latency (~1.5 seconds). Supports audio. Flexible (reads from v4l2 video and alsa audio). Software takes up a lot of disk space. Large user base and active community behind it. Requires rpi-firmware to be upgraded to avoid OMX encoding bug.

@ccrisan Which option do you prefer? Personally, I would go for the gstreamer option.

Implementing this feature is looking like a big task, so we probably need to split it into several stages, and I'm going to need your help too, especially on motionEye side of things like user interface and configuration.

I suggest we split the work into 3 stages.

  1. Add audio+video RTSP stream support in Fast Network Camera mode. Minimize changes to motionEye, so only support RPi-Cam & ALSA audio devices. We need to evaluate how we apply camera options (exposure, ISO, ...) from motionEye to the V4L2 device. We use motion as an RTSP-to-mjpeg transcoder so motionEye can continue to show mjpeg on the web front end. I believe mjpeg still has the best support across all devices/browsers.
  2. Add support for more USB camera types, and add some configuration for the audio part.
  3. Add support for other video stream protocols to motionEye. Probably something like webRTC.

@ryanjjennings

Just wanted to say I am looking forward to your guys' progress on this. I have been looking for a way to stream h264 from the pi to zoneminder.


jasaw commented Dec 25, 2018

Yet another option. This option assumes all V4L2 video devices support h264 output. I think modern webcams and bcm2835-v4l2 driver support h264 output. Audio can be any ALSA device. One instance of GStreamer acts as an RTSP server which can serve multiple RTSP clients. To minimize changes to motionEye, we use another instance of GStreamer as RTSP client, converting h264 to jpeg, pipe to StreamEye and motionEye gets the jpeg stream from port 8081. This 2nd instance of GStreamer can be removed in the future if motionEye supports RTSP streams, probably using webRTC.

RPi-Cam / Webcam (Video) ---+    GStreamer    +--- Any RTSP Client
                            +---   RTSP    ---+
USB / Webcam Mic (Audio) ---+     Server      +--- GStreamer RTSP Client --- StreamEye --- motionEye

Steps to prepare the required software:

  1. Upgrade rpi-firmware. The version that we are using has a bug in the OMX IL, so won't work with some software. GStreamer is one of them.
  2. Enable GStreamer and various plugins in motionEyeOS source. I'll try to list the plugins needed when I have some time to go through my config.
  3. Increase root partition size by 50MB. GStreamer is huge, and I couldn't fit the rootfs in 200MB, so I increased it. I didn't check exactly how much space is needed by GStreamer.
  4. Compile motionEyeOS from source and flash it to SD card.
  5. Copy GStreamer's RTSP server example application test-launch to the device.

Steps to start the H264 live stream:

  1. Start GStreamer RTSP server: test-launch "( v4l2src device=/dev/video0 ! video/x-h264,width=1280,height=720,framerate=20/1 ! h264parse ! rtph264pay name=pay0 pt=96 alsasrc "device=hw:1,0" ! audioresample ! audio/x-raw,rate=16000,channels=2 ! queue ! voaacenc ! rtpmp4gpay pt=97 name=pay1 )". The /dev/video0 input can be a webcam or RPi-Cam with bcm2835-v4l2.
  2. Start GStreamer RTSP client to convert h264 to jpeg: gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! h264parse ! omxh264dec ! videorate ! video/x-raw,framerate=5/1 ! jpegenc ! filesink location=/dev/stdout | /usr/bin/streameye -p 8081. The jpeg frame rate is limited to 5 fps to minimize network bandwidth usage. It is important that this GStreamer RTSP client is the first client to connect to our RTSP server, otherwise you may experience loss of audio, connection problems, or low bitrate. Maybe use iptables to block other RTSP clients until this 2nd instance is running (a hypothetical guard is sketched after this list).
  3. Finally start motionEye if it's not already running: /etc/init.d/S85motioneye start
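
A hypothetical iptables guard for that ordering problem (an untested assumption on my part, using port 8554 from the steps above):

# block external RTSP connections until the local transcoder client has connected
iptables -A INPUT -p tcp --dport 8554 ! -s 127.0.0.1 -j REJECT
# ... start the gst-launch-1.0 RTSP client from step 2 ...
# then remove the rule to let other players connect
iptables -D INPUT -p tcp --dport 8554 ! -s 127.0.0.1 -j REJECT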

This is the CPU and memory usage running on a Raspberry Pi 2 with 1280 x 720 resolution at 20 fps:

  VSZ %VSZ %CPU COMMAND
 213m  28%  11% gst-launch-1.0 rtspsrc location=rtsp:/
 115m  15%   4% test-launch ( v4l2src device=/dev/vide
30156   4%   0% /usr/bin/streameye -p 8081


jasaw commented May 13, 2019

Basic RTSP support has been implemented in dev branch.

jasaw closed this as completed May 13, 2019
@TheGithubJoshua

I have motionEyeOS installed on my Raspberry Pi 3. Can motionEye generate an h264 stream, or only an mjpeg stream? I'm using motionEyeOS as a fast network camera and record using my Synology NAS, but the files are huge because of mjpeg.

How do you use a motioneye camera with a Synology NAS?
