
Add timestamp overlay to hw accelerated transcoded stream #402

Closed · dbuezas opened this issue May 1, 2023 · 8 comments
Labels: question (Further information is requested)

dbuezas (Contributor) commented May 1, 2023

After a lot of trial and error, I was able to stream my webcam in h264 with hw acceleration and a text overlay:

webcam: >-
      exec:ffmpeg
      -hide_banner
      -hwaccel vaapi
      -hwaccel_output_format vaapi
      -f v4l2
      -i /dev/video2
      -c:v h264_vaapi
      -g 50
      -bf 0
      -profile:v high
      -level:v 4.1
      -sei:v 0
      -an
      -vf "drawtext=fontfile=arial.ttf:text='%{{localtime\: %Y-%m-%d %H\\\:%M\\\:%S}}':x=20:y=20:fontsize=24:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=5:expansion=normal,format=vaapi|nv12,hwupload"
      -user_agent ffmpeg/go2rtc
      -rtsp_transport tcp
      -f rtsp {{output}}

Note: the double curly braces in {{output}} and {{localtime...}} are there because I'm configuring this inside Frigate, and its formatter tries to fill in "output" and crashes. That took me a while to figure out too.
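For comparison, a minimal sketch of the two variants, assuming Frigate's config formatter is what consumes the single braces (the filter and encoder are shortened here; the full command is above):

# plain go2rtc.yaml: single braces
webcam: exec:ffmpeg -f v4l2 -i /dev/video2 -vf "drawtext=text='%{localtime}'" -c:v libx264 -f rtsp {output}

# inside Frigate's config: braces doubled so Frigate's formatter passes them through unchanged
webcam: exec:ffmpeg -f v4l2 -i /dev/video2 -vf "drawtext=text='%{{localtime}}'" -c:v libx264 -f rtsp {{output}}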

Is there a standard way of doing this?

I was hoping for something like this:

    timestamp_hw: >-
      -vf "drawtext=fontfile=arial.ttf:text='%{{localtime\: %Y-%m-%d %H\\\:%M\\\:%S}}':x=20:y=20:fontsize=24:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=5:expansion=normal"
  streams:
    webcam: >-
      ffmpeg:device?video=1#video=timestamp_hw#video=h264#hardware

or

ffmpeg:device?video=1#video=h264#raw=timestamp_hw#hardware

But I get no timestamp overlay.

AlexxIT added the "question" label on May 2, 2023
AlexxIT (Owner) commented May 2, 2023

This is a very complicated question about mixing filters and hardware acceleration.

  1. You can't mix filters with hardware acceleration. You have to write the whole ffmpeg command manually.
  2. The raw command doesn't support templates (#raw=timestamp_hw).
  3. A video template should output some video (#video=timestamp_hw), not just a filter (see the sketch below).
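To illustrate point 3, a hedged sketch of the difference (the template names here are made up, libx264 is used to keep hardware out of the picture, and the ffmpeg build is assumed to have a usable default font for drawtext):

ffmpeg:
  # not a valid #video template: it is only a filter and outputs no video
  timestamp_hw: >-
    -vf "drawtext=text='%{localtime}':x=20:y=20:fontsize=24:fontcolor=white"

  # a valid #video template: it ends in an encoder, so it actually outputs video
  h264_ts: >-
    -c:v libx264 -g 50 -bf 0
    -vf "drawtext=text='%{localtime}':x=20:y=20:fontsize=24:fontcolor=white"

streams:
  webcam: ffmpeg:device?video=1#video=h264_ts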

dbuezas (Contributor, Author) commented May 2, 2023

I see.
Templates in raw would be very nice, although it would also need to merge the -vf with the one created from #video=h264#hardware for this to work.

Btw, since adding a timestamp seems like something one would do often with cameras, it might also be a cool (and somewhat complicated) built-in feature.

AlexxIT (Owner) commented May 2, 2023

Many changes to the picture can't be done in hardware. They are always software. So you need to mix hardware and software frames, which costs a lot of CPU time.

In your case you have raw frames on the input, so you can add the timestamp in software and then encode with hardware. But you need to learn the FFmpeg docs; that's outside the scope of this project.

Other users would need to decode frames with hardware, make the changes in software, and encode them again with hardware.
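For readers whose input is a compressed camera stream rather than raw webcam frames, a hedged sketch of that decode-with-hardware / draw-in-software / encode-with-hardware pipeline with VAAPI (the camera URL and output path are placeholders, and a default drawtext font is assumed; it is essentially the same pattern chatziko posts later in this thread):

ffmpeg -hide_banner \
  -hwaccel vaapi -hwaccel_output_format nv12 \
  -i rtsp://camera.local/stream \
  -vf "drawtext=text='%{localtime}':x=20:y=20:fontsize=24:fontcolor=white,format=vaapi|nv12,hwupload" \
  -c:v h264_vaapi -g 50 -bf 0 -an \
  -rtsp_transport tcp -f rtsp rtsp://127.0.0.1:8554/with_timestamp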

dbuezas (Contributor, Author) commented May 2, 2023

Makes sense. It looks like my approach with a raw exec:ffmpeg command was the right one, then.

dbuezas closed this as completed on May 2, 2023
chatziko commented

I had the same issue; it can be done a bit more elegantly by using #input and #video and removing #hardware. I'm posting it below in case it's helpful to anyone.

But again, most of the ffmpeg params need to be passed by hand. Maybe a solution would be to add a #filter=value option that supports templates and gets combined with any automatically set filters into a single -vf option.

streams:
  camera1:
    # We remove #hardware to avoid messing with -vf and set all params in #input and #video
    - ffmpeg:<url>#input=vaapi_input#video=vaapi_h264_ts

ffmpeg:
  # This is the 'http' template, with the addition of hwaccel (since we removed #hardware)
  vaapi_input: >-
    -hwaccel vaapi -hwaccel_output_format nv12 -fflags nobuffer -flags low_delay -i {input}

  # The first line is just the 'h264/vaapi' template.
  # The second is the drawtext filter, with the vaapi additions (format/hwupload)
  vaapi_h264_ts: >-
    -c:v h264_vaapi -g 50 -bf 0 -profile:v high -level:v 4.1 -sei:v 0
    -vf "drawtext=text='%{localtime\: %Y-%m-%d %H\\\:%M\\\:%S}':x=2:y=2:fontsize=12:fontcolor=white:box=1:boxcolor=black:boxborderw=2,format=vaapi|nv12,hwupload"

AlexxIT (Owner) commented Jul 11, 2023

Timestamp template is supported as of https://github.com/AlexxIT/go2rtc/releases/tag/v1.6.0
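A hedged sketch of what using the built-in option might look like; the exact option name and whether it combines with #hardware are assumptions here, so the release notes linked above and the go2rtc README are the authority:

streams:
  webcam: ffmpeg:device?video=1#video=h264#drawtext=x=2:y=2:fontsize=12:text='%{localtime}'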

AlexxIT added this to the v1.6.0 milestone on Jul 11, 2023
janusn commented Feb 13, 2024

@chatziko

I am having trouble including the audio in the rtsp output. Could you help me out? Here is the corresponding part in my go2rtc.yaml:

  ffmpeg:
      vaapi_input: "-hwaccel vaapi -hwaccel_output_format nv12 -fflags nobuffer -flags low_delay -i {input}"
      vaapi_h264: "-c:a copy -c:v h264_vaapi -g 50 -bf 0 -profile:v high -level:v 4.1 -sei:v 0"
      timestamp: >-
        -vf "drawtext=text='%{localtime}':x=w/25:y=(h-th)-h/25:fontsize=h/25:fontcolor=white:shadowx=2:shadowy=2,format=vaapi|nv12,hwupload"
  streams:
    doorbell_raw:
      # the stream extracted from HASS
      - "echo:/config/get_ha_stream.sh camera.hello_video_doorbell"
    video_doorbell:
      # restream from doorbell_raw above with time stamp added.
      - "ffmpeg:rtsp://user:password@127.0.0.1:8554/doorbell_raw#input=vaapi_input#video=vaapi_h264#raw=timestamp"

The stream doorbell_raw above is the video feed extracted via hass-expose-camera-stream-source. It has AAC (mp4a) audio, verified by ingesting the stream in VLC.

The stream video_doorbell restreams doorbell_raw above with the timestamp added, but it does not carry audio. How can I fix it? Many thanks!

janusn commented Feb 15, 2024

I have found the issue. Because the arguments are passed from Frigate to go2rtc, I have to add a separate #audio option to the stream argument instead of putting it into the video template. 😳
Like the following:

  ffmpeg:
      vaapi_input: "-hwaccel vaapi -hwaccel_output_format nv12 -fflags nobuffer -flags low_delay -i {input}"
      vaapi_h264: "-c:v h264_vaapi -g 50 -bf 0 -profile:v high -level:v 4.1 -sei:v 0"
      timestamp: >-
        -vf "drawtext=text='%{localtime}':x=w/25:y=(h-th)-h/25:fontsize=h/25:fontcolor=white:shadowx=2:shadowy=2,format=vaapi|nv12,hwupload"
  streams:
    doorbell_raw:
      # the stream extracted from HASS
      - "echo:/config/get_ha_stream.sh camera.hello_video_doorbell"
    video_doorbell:
      # restream from doorbell_raw above with time stamp added.
      - "ffmpeg:rtsp://user:password@127.0.0.1:8554/doorbell_raw#input=vaapi_input#video=vaapi_h264#audio=copy#raw=timestamp"
