Some simple GStreamer examples (assuming that the v4l2loopback device is /dev/video1).
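
If the loopback device does not exist at /dev/video1 on your system, one way to create it there is to load the module with an explicit device number (a minimal sketch using the module's video_nr parameter; adjust the number to taste):

sudo modprobe v4l2loopback video_nr=1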

Consumer

$ gst-launch-1.0 v4l2src device=/dev/video1 ! xvimagesink
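
Any other video sink should work as well; for example, letting GStreamer pick a suitable sink automatically (a sketch, with videoconvert added in case the sink does not accept the raw format directly):

gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert ! autovideosink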

Producer

videotestsrc

gst-launch-1.0 videotestsrc ! v4l2sink device=/dev/video1

Different videotestsrc patterns with different resolutions:

gst-launch-1.0 -v videotestsrc pattern=snow ! "video/x-raw,width=640,height=480,framerate=15/1,format=YUY2" ! v4l2sink device=/dev/video1
gst-launch-1.0 -v videotestsrc pattern=ball ! "video/x-raw,width=800,height=600,framerate=30/1,format=YUY2" ! v4l2sink device=/dev/video1
gst-launch-1.0 -v videotestsrc pattern=smpte horizontal-speed=1 ! "video/x-raw,width=1280,height=720,framerate=30/1,format=YUY2" ! v4l2sink device=/dev/video1
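
The full list of available test patterns (and the other properties of videotestsrc) can be shown with:

gst-inspect-1.0 videotestsrc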

Desktop capture as producer

gst-launch-1.0 -v ximagesrc startx=1 starty=1 endx=320 endy=240 ! videoconvert ! "video/x-raw,format=YUY2" ! v4l2sink device=/dev/video1
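
To capture the whole screen instead of a fixed region, drop the start/end coordinates; a sketch (untested) that also scales the output to 1280x720 in case the consumer expects a fixed resolution:

gst-launch-1.0 -v ximagesrc ! videoconvert ! videoscale ! "video/x-raw,format=YUY2,width=1280,height=720" ! v4l2sink device=/dev/video1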

Video file as producer

gst-launch-1.0 -v filesrc location=test.avi ! avidemux ! decodebin ! videoconvert ! "video/x-raw,format=YUY2" ! v4l2sink device=/dev/video1

This producer is not infinite: when the file ends, the producer will stop.
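
The same idea works for other containers, since decodebin takes care of demuxing and decoding; a sketch assuming a hypothetical file test.mp4 in the current directory:

gst-launch-1.0 -v filesrc location=test.mp4 ! decodebin ! videoconvert ! "video/x-raw,format=YUY2" ! v4l2sink device=/dev/video1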

Separate PNG frames as infinite producer

Create the frames by splitting the input AVI into multiple PNG files.

mkdir test
gst-launch-1.0 -v filesrc location=test.avi ! avidemux ! decodebin ! videoconvert ! pngenc snapshot=false ! multifilesink location=test/%05d.png

Use frames as infinite producer:

gst-launch-1.0 -v multifilesrc location=test/%05d.png loop=1 caps="image/png,framerate=30/1" ! pngdec ! videoconvert ! "video/x-raw,format=YUY2" ! v4l2sink device=/dev/video1

When sending frames to v4l2sink, the framerate can differ from that of the original video file. It can be set freely (60, 30, 15, 10, 5 or any other). This allows one to produce different framerates with the same frame set.
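
For example, the same PNG frames can be pushed at 15 fps simply by changing the framerate in the caps of the pipeline above:

gst-launch-1.0 -v multifilesrc location=test/%05d.png loop=1 caps="image/png,framerate=15/1" ! pngdec ! videoconvert ! "video/x-raw,format=YUY2" ! v4l2sink device=/dev/video1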

Decoding PNG can consume a lot of CPU time. To reduce the load on the processor, use raw YUV frames instead (see the next example).

Separate YUV frames as infinite producer

Create the frames by splitting the video. When consuming the frames later, we must know their dimensions. To make this example work, the video file "test.avi" is therefore scaled to 640x480 at 30 fps. In real life, you probably want to keep the original dimensions rather than scale.

mkdir test
gst-launch-1.0 -v filesrc location=test.avi ! avidemux ! decodebin ! videoconvert ! videoscale ! videorate ! "video/x-raw,width=640,height=480,framerate=30/1,format=YUY2" ! multifilesink location=test/%05d.yuv

Use frames as infinite producer:

gst-launch-1.0 -v multifilesrc location=test/%05d.yuv loop=1 caps="video/x-raw,width=640,height=480,framerate=30/1,format=YUY2" ! videoconvert ! v4l2sink device=/dev/video1

When producing frames to v4l2sink, the framerate can differ from that of the original video file. It can be set freely (60, 30, 15, 10, 5 or any other). This allows one to produce different framerates with the same frame set.

Reading raw YUV frames consumes very little CPU time, but the frames take up a lot of disk space and memory.
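
For example, at 640x480 in YUY2 (2 bytes per pixel) a single frame takes 640 × 480 × 2 = 614,400 bytes, so one second of footage at 30 fps already occupies roughly 18 MB on disk.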

NOTE

This last example (raw YUV frames) doesn't really work yet with GStreamer 1.0 ...

"Internal data flow error. / reason not-negotiated (-4)"

If you get the famous Internal data flow error when using GStreamer as a producer, you might want to add a tee element to work around a bug in v4l2loopback (0.10.0 as of this writing):

gst-launch-1.0 -v videotestsrc ! tee ! v4l2sink device=/dev/video1

With newer GStreamer (>= 1.14), this workaround supposedly no longer works; you could try this instead:

gst-launch-1.0 -v videotestsrc ! identity drop-allocation=1 ! v4l2sink device=/dev/video1
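
To find out which of the two workarounds applies to your setup, check the installed GStreamer version:

gst-launch-1.0 --version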

See also issue #83