Fast Network Camera mode supports H264 RTSP stream #1764
Conversation
I have configured my Raspberry Pi 3 with motionEye and am ready to begin setting up RTSP. Use me as a 3rd-party tester... Tony |
@pacman8521 If you are interested in testing this, try out my pre-built images here: https://github.com/jasaw/motioneyeos/releases/tag/rtsp-fnc-dev20190321 |
Just got this hooked up on an RPi Zero W. I have it set at 800x600 25fps and it works great!! The server is an RPi 3 Model B running motionEye 0.39.3 & motion 4.2.2. |
@ccrisan Have you had a chance to look into this PR yet? Any thoughts? I've been running this for weeks, and it seems to work fine from my limited testing. |
@jasaw I am really sorry but I haven't been able to find enough time to try this PR of yours. Let's get it done with the new (pre)release, making sure that all users have the new partition in place, happily running Motion 4.x with motionEye 0.40. Then this is one of the pending features that will make its way into motionEyeOS. |
Read audio_bitrate and audio_channels from raspimjpeg.conf file.
@jasaw this work of yours is amazing. I took the time to test it and it just works and it works very well. Let's have it merged for the next release. |
Am I reading this right that this has now been merged and audio should now be supported for such RTSP streams? I have both an H264 and an AAC RTSP stream from my cameras. |
@MACscr Yes, we now have experimental support for a video & audio RTSP stream in Fast Network Camera mode. The web interface does not expose audio control at the moment because no one has tested audio except me. The work in this PR has been superseded by the PR below, but the audio instructions in this PR still apply. |
Oh, so it wouldn't work with some third-party camera (Yi Home 1080p with custom firmware) that provides the two (H264 and AAC) RTSP streams? Why would the source device matter? |
@MACscr This PR is about making the Raspberry Pi capable of generating an RTSP stream that has video and audio. Since you already have cameras that generate a video & audio stream, I believe what you are looking for is audio support on the receiving end, which requires audio support to be added to the motion software. I haven't had any luck convincing the motion developers to support audio. :-( Lots of people are asking for audio support too. This is one of them. This is in motion's FAQ regarding audio: |
So RTSP streams are in the May 2019 dev branch. Has this since propagated to the latest release (from a year ago, September 2019)? |
dev20200907 has an RTSP stream available on port 8081 (viewed with VLC on Win10) from a PiZeroW running dev20200907 & using the PiCam (mmal).
--
Thanks
Kevin Shumaker
Personal Tech Support <https://kevinshumaker.wixsite.com/thethirdlevel>
N38° 19' 56.52"
W85° 45' 8.56"
Semper Gumby
“Don't tell people how to do things. Tell them what to do and let them
surprise you with their results.” - G.S. Patton, Gen. USA
Ethics are what we do when no one else is looking.
Quis custodiet ipsos custodes?
“There is no end to the good you can do if you don’t care who
gets the credit.” - C Powell
You know we're sitting on four million pounds of fuel, one nuclear weapon
and a thing that has 270,000 moving parts built by the lowest bidder. Makes
you feel good, doesn't it?
|
Thanks. So just to clarify: I can't use the release version, I have to use dev20200907? What network source string are you opening in VLC? rtsp://IPADDRESS:8081/h264 ? |
I cannot say how much earlier it works. If there is much interest, I can start loading all versions prior to dev20200907 to see.
There is a newer release version, 20200606, but I don't currently have an OS set up with that.
I know a lot of stuff was updated/upgraded in dev20200907 from upstream motion and RPi sources.
In VLC, I used Media > Network Stream and entered rtsp://<ipaddress>:8081 (I do not have a password set).
|
Thanks again. Right now I can't get a connection with VLC with the URI schema you posted, even though I think I have all the configuration where it should be according to #1764 (comment). Testing without audio for now.
VIDEO_DEV="/dev/video0" is the USB device that I have used so far for the old MJPEG stream. I verified that with v4l2-ctl -d /dev/video0 --all. I want to move to H264, because the JPEG stream takes too much storage space at approximately 10 Mbit/s.
I am not getting any web interface output, and lots of errors like:
[ 3.742995] smsc95xx 1-1.1:1.0 eth0: hardware isn't capable of remote wakeup
[ 4.711802] usb 1-1.4: new high-speed USB device number 4 using dwc_otg
[ 4.843692] usb 1-1.4: New USB device found, idVendor=534d, idProduct=0021, bcdDevice= 1.21
[ 4.843708] usb 1-1.4: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[ 4.843717] usb 1-1.4: Product: AV TO USB2.0
[ 4.843726] usb 1-1.4: Manufacturer: MACROSIL
[ 4.867356] uvcvideo: Found UVC 1.00 device AV TO USB2.0 (534d:0021)
[ 4.868220] uvcvideo: UVC non compliance - GET_DEF(PROBE) not supported. Enabling workaround.
[ 4.869151] uvcvideo 1-1.4:1.0: Entity type for entity Processing 2 was not initialized!
[ 4.869168] uvcvideo 1-1.4:1.0: Entity type for entity Camera 1 was not initialized!
[ 4.869790] usbcore: registered new interface driver uvcvideo
[ 4.869797] USB Video Class driver (1.1.1)
[ 4.903302] usbcore: registered new interface driver snd-usb-audio
[1:ml1:FrontDoor] [ERR] [VID] v4l2_capture: VIDIOC_QBUF: No such device
[1:ml1:FrontDoor] [ERR] [ALL] mlp_capture: Video device fatal error - Closing video device
[1:ml1:FrontDoor] [WRN] [ALL] mlp_retry: Retrying until successful connection with camera
curl: (56) Recv failure: Connection reset by peer
2020-09-26 17:42:11: [motioneye] WARNING: 400 GET /picture/1/current (192.168.129.145) 5.31ms
2020-09-26 17:42:11: [motioneye] ERROR: HTTP 400: Bad Request (unknown operation)
But I guess that's normal since RTSP is supposed to break the MJPEG stream, web interface and motion capture to the SMB share, if I understand correctly. I will try with the newest dev release. Right now I am using the release from https://github.com/ccrisan/motioneyeos/wiki/Supported-Devices which links to the 2019 version. |
All releases are here:
https://github.com/ccrisan/motioneyeos/releases
|
@kwinz The RTSP FNC implementation here has been replaced by a more efficient method that does all the processing and encoding in the GPU, which means even an RPi Zero W can do 1080p 30fps, but only the RPi CSI camera is supported: #2126 This should be in the latest release. To enable RTSP mode on your FNC, go to the web interface of your FNC and, under the "Video Streaming" section, change "Streaming Protocol" to RTSP. With RTSP mode, you get 2 video streams from your FNC: one is RTSP (high res, can be used for pass-through recording), the other is MJPEG (meant to be low res, low frame rate, for motion detection and the live stream on the web interface). All these parameters are configurable. On your motionEye server that receives the RTSP stream, you need to add an RTSP camera with rtsp://IPADDRESS:554/h264 or simply rtsp://IPADDRESS/h264. |
Tested and confirmed working on dev20200907, RPiZeroW, PiCam V1.3, set to Fast Network Cam & Stream RTSP, 1280x1024, 15 fps, to an Intel i5, mE 0.42.1, Motion 4.3.1 on Ubuntu 20.04.
Curious why rtsp://<ipaddress>:8081 works; is there no logic test for properly formatted output?
|
BTW, good work! Adding docs to my support documentation.
|
@ccrisan can we please give permission to @starbasessd to update the wiki here? He has done a fantastic job at supporting motionEyeOS. |
@jasaw Just as long as you realize I am NOT a programmer. I am an excellent beta tester, I have written documentation for my support teams in various industries, and I love (well, sorta ;) ) helping end users (as long as they are willing to learn). I've been in Help Desk 20+ years, and my skill set is flexing to the user's level and acting as translator. If these meet y'all's needs, I'd be happy to help.
|
Done! @starbasessd you've been invited to be a collaborator on this project. Thank you for your help! |
@jasaw thanks for your great work on enabling H264 for the RasPi CSI camera in #1764! I was quite pleased to find it. I do wonder if, in some configurations, it would be possible to have this method enabled while still feeding frames to motion. In my case, I have a Raspberry Pi 2, and a bunch of them. They use motion to detect locally and record motion to the SD card. These are my "just in case" cameras for when something bad happens at my shop while the main system (ZoneMinder) is down. That's the part where I could get a much higher frame rate on the noisy WiFi network if I could ship it as H264. My Pi 2 setups only run at about 30% CPU with motion and streaming; when I turn on FNC and H264 RTSP, they run at 3% CPU, so it seems like they'd be able to take the "chain" that you made and still run motion after the fact with CPU to spare. I'm not a very good developer, but I'll try to hack around and maybe submit a pull request if I get it to work. Pointers would be handy if you have the time, or if you want to partner on it. Thanks! |
@shorted-neuron You can certainly split the H264 stream and feed one of the streams into motion, but be careful if you use OMX to re-encode the video stream on the motion side, because the RPi GPU may not be fast enough to handle one additional encoding. Let me give you some background on this RTSP FNC implementation. The video pipeline looks like this:
The lower resolution MJPEG stream is used for motion detection and the web interface live stream; the V4L2 loopback device is read by the downstream consumer. v4l2multi_stream_mmal can be found here: https://github.com/jasaw/v4l2tools/tree/mmal-multi-stream To add a second H264 stream, you'll need to add a splitter and sink the second H264 stream to another V4L2 loopback device, like so:
You'll need to use another V4L2 loopback device for it. Look at the existing pipeline setup scripts for reference. |
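Conceptually, the splitter approach described above can be sketched as below. This is not the actual v4l2multi_stream_mmal code (which is MMAL-based C); it is a GStreamer-style illustration, and the device paths are hypothetical examples.

```shell
# Hypothetical sketch: tee one H264 source into two V4L2 loopback devices so a
# second consumer (e.g. motion) can read the same stream. Paths are examples.
SRC=/dev/video0      # camera
SINK_A=/dev/video1   # first loopback device (existing consumer)
SINK_B=/dev/video2   # second loopback device (new consumer, e.g. motion)
PIPELINE="v4l2src device=$SRC ! video/x-h264 ! h264parse ! tee name=t \
  t. ! queue ! v4l2sink device=$SINK_A \
  t. ! queue ! v4l2sink device=$SINK_B"
# Only print the command here; running it requires GStreamer and the v4l2loopback module.
echo "gst-launch-1.0 $PIPELINE"
```

The `queue` elements after the `tee` keep one slow consumer from stalling the other branch.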
This is my first attempt at implementing the #1259 feature; please review and give feedback.
This PR adds a live video and audio RTSP stream to the fast network camera backend for Raspberry Pi 1, 2, and 3, which serves the RTSP stream at the URL rtsp://<rpi-IP-address>/h264. The video is in H264 format, with optional audio in AAC format. It relies on GStreamer to read the H264 video feed from a v4l2 video interface, which works with the RPi Cam via the bcm2835-v4l2 driver. USB webcams with H264 output are supported as well. As for audio, GStreamer reads the audio from a specified ALSA interface. Any microphone that appears as an ALSA device should work.
Note: RTSP configuration support on the web interface is not covered by this PR, which means RTSP mode can only be enabled by editing configuration files by hand.
If you are interested in testing this out, head to my pre-built images: https://github.com/jasaw/motioneyeos/releases/tag/rtsp-fnc-dev20190321
I suggest trying this on a sacrificial microSD card so you don't need to rebuild your system if it doesn't work.
Be warned that I have only tested the Raspberry Pi 2 version in a controlled environment. Use at your own risk.
Architecture
GStreamer reads the video and audio, parsing the H264 video stream while encoding the audio to AAC format. The video and audio feeds (timestamped to help playback synchronization) are handed over to the GStreamer RTSP server (called test-launch), which is able to serve multiple clients simultaneously.
To minimize changes to the web front end and still be able to show the video feed like before, another GStreamer instance is set up to convert RTSP to MJPEG. The MJPEG is fed into StreamEye, which serves the MJPEG stream to motionEye.
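As a hedged illustration of the architecture above: GStreamer's RTSP server example binary is called test-launch and takes a pipeline description whose payloaders are named pay0, pay1, and so on. The element and property choices below are standard GStreamer 1.x names, not necessarily the exact pipeline this PR ships.

```shell
# Illustrative only: a test-launch style pipeline in the spirit of the
# architecture above. Element/property choices are assumptions, not the
# exact pipeline used by this PR.
VIDEO_DEV=/dev/video0
AUDIO_DEV=hw:1,0
PIPELINE="( v4l2src device=$VIDEO_DEV ! video/x-h264 ! h264parse ! rtph264pay name=pay0 pt=96 \
  alsasrc device=$AUDIO_DEV ! audioconvert ! voaacenc ! aacparse ! rtpmp4gpay name=pay1 pt=97 )"
# Only print the command; running it requires gst-rtsp-server's test-launch.
echo "test-launch \"$PIPELINE\""
```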
Instructions
As the web front end has not been updated to support RTSP configuration, it is recommended to follow the steps below.
- Add PROTO="rtsp" to /data/etc/streameye.conf.
- Add VIDEO_DEV="/dev/video0" to /data/etc/streameye.conf. If VIDEO_DEV is not specified, it defaults to /dev/video0.
- Add AUDIO_DEV="hw:1,0" to /data/etc/streameye.conf. See the audio section below for more information.
- Add bitrate 4000000 to /data/etc/raspimjpeg.conf.
- Add audio_bitrate 16000 and audio_channels 2 to /data/etc/raspimjpeg.conf.
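The configuration edits above can be sketched as a short script. The example below appends to demo copies in the current directory; on a real device you would edit /data/etc/streameye.conf and /data/etc/raspimjpeg.conf directly.

```shell
# Append the RTSP settings from the instructions above to demo config copies
# (the real files live under /data/etc on the device).
cat >> streameye.conf.demo <<'EOF'
PROTO="rtsp"
VIDEO_DEV="/dev/video0"
AUDIO_DEV="hw:1,0"
EOF
cat >> raspimjpeg.conf.demo <<'EOF'
bitrate 4000000
audio_bitrate 16000
audio_channels 2
EOF
grep '^PROTO' streameye.conf.demo   # prints: PROTO="rtsp"
```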
Video
The V4L2 interface allows cameras to be configured for different light conditions, but the parameters can be quite different between cameras. Currently, the configuration specified in /data/etc/raspimjpeg.conf is applied to the RPi Cam only. Other cameras will have to be configured manually, probably via the /data/etc/userinit.sh script. The commands below should list all the supported configuration items for a specified camera.
The v4l2 camera controls can be found here: https://hverkuil.home.xs4all.nl/spec/uapi/v4l/extended-controls.html
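For example, a hypothetical /data/etc/userinit.sh fragment for a non-RPi camera might look like the sketch below. The control names and values are illustrative only; list your camera's actual controls first with v4l2-ctl.

```shell
# Write an example userinit.sh fragment. Control names/values are hypothetical;
# run the --list-ctrls command on your own camera to see what it supports.
cat > userinit.sh.example <<'EOF'
#!/bin/sh
# Discover supported controls first:
#   v4l2-ctl -d /dev/video0 --list-ctrls
v4l2-ctl -d /dev/video0 --set-ctrl=brightness=128
v4l2-ctl -d /dev/video0 --set-ctrl=contrast=32
EOF
grep -c 'set-ctrl' userinit.sh.example   # prints: 2
```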
Audio
An ALSA audio device is mapped into this format: hw:<card>,<subdevice>. To find your device mapping, run arecord -l.
Example output for a hw:1,0 mapping:
The ALSA device may need to be configured, which can be done with alsamixer -c 1 for card number 1. To configure ALSA from a script like /data/etc/userinit.sh, the commands below may be useful:
amixer -c 1 sget 'Mic'
amixer -c 1 sset 'Mic' cap
amixer -c 1 sset 'Mic' 50%
amixer -c 1 sset 'Mic' 50dB
If you have microphone problems, you may want to do a test recording:
arecord -D hw:1,0 -f DAT -r 16 /tmp/my_record.wav
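To go from arecord -l output to the hw:<card>,<subdevice> string, the sketch below parses a mocked-up listing line. The sample text is an illustration of the typical format, not output from a real device.

```shell
# Derive AUDIO_DEV from an `arecord -l` style line. The sample line below is a
# mock-up for illustration only.
sample='card 1: Device [USB PnP Sound Device], device 0: USB Audio [USB Audio]'
card=$(printf '%s\n' "$sample" | sed -n 's/^card \([0-9]*\):.*/\1/p')
dev=$(printf '%s\n' "$sample" | sed -n 's/.*, device \([0-9]*\):.*/\1/p')
echo "AUDIO_DEV=\"hw:${card},${dev}\""   # prints: AUDIO_DEV="hw:1,0"
```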
Monitoring
When running with an RPi Cam, H264 video encoding and decoding are done in the GPU, which requires a lot of memory. Make sure you have enough memory allocated to the GPU. The amount of memory required depends on the video resolution.
To get GPU memory status, run:
vcgencmd get_mem malloc ; vcgencmd get_mem reloc
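The GPU memory split itself is set with the gpu_mem option in /boot/config.txt (e.g. gpu_mem=128). The sketch below parses a mocked-up vcgencmd-style key=value reply; the value is a made-up example, not taken from a real device.

```shell
# Parse a mock `vcgencmd get_mem reloc` style reply ("key=NNNM"). The value is
# a made-up example for illustration.
out="reloc=84M"
free_mb=${out#*=}      # strip "reloc=" -> "84M"
free_mb=${free_mb%M}   # strip trailing "M" -> "84"
echo "GPU relocatable heap: ${free_mb}M free"
```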
CPU load and system memory usage are worth monitoring as H264 video and AAC audio encoding can be quite taxing on both CPU and memory. Make sure CPU load is always under 70% during intended use. Make sure there is enough system memory during intended use to avoid Out-of-Memory (OOM) killer. Lower the video resolution if you need to reduce CPU / memory usage.
Known Issues
I do not know the cause of the issues below yet.
- Some RTSP clients (if the first client to connect to the GStreamer RTSP server) cause various issues for subsequent clients, e.g. inability to connect, loss of audio, or laggy video. To work around the issue, the streameye.sh script uses iptables to temporarily block the RTSP port, making sure our RTSP-to-MJPEG client is always the first client to connect.
- The services must be started in the right sequence for the RTSP stream to work: test-launch first, then RTSP to MJPEG to StreamEye, then finally motionEye.
- Stream latency accumulates over time.
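The iptables workaround could look roughly like the sketch below. The commands are only echoed rather than executed, and the port number and rule details are assumptions about what streameye.sh actually does, not an excerpt from it.

```shell
# Sketch of the start-up ordering workaround: block the RTSP port for external
# clients, start the server and the local RTSP-to-MJPEG client, then unblock.
# Port and rule details are assumptions; commands are only echoed here.
RTSP_PORT=554
echo "iptables -I INPUT -p tcp --dport $RTSP_PORT ! -s 127.0.0.1 -j REJECT"
echo "# ...start test-launch, then the local RTSP-to-MJPEG client..."
echo "iptables -D INPUT -p tcp --dport $RTSP_PORT ! -s 127.0.0.1 -j REJECT"
```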