Source video format is overconstrained #13
My webcam doesn't support I420 either. Here's an official piece of documentation from FOURCC (the group that creates and helps standardize a lot of these video formats) about all the different YUV formats. I already plan to create an issue on switching to MJPEG for streaming video from the webcam. There are many reasons for this (you have to understand a bit about how webcams work), and based on a bit of experimenting I've done, I think it can be done in a reasonably secure way with GStreamer. More on this later.
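For a sense of how much the choice of raw format alone matters for bandwidth, here's a quick back-of-the-envelope comparison (the 1280x720 at 30 FPS mode is just an example; MJPEG frames would typically be far smaller than either raw format):

```python
# Rough per-second bandwidth of two common raw webcam formats at an example mode.
# YUYV (YUY2) is packed 4:2:2 at 2 bytes per pixel; I420 is planar 4:2:0 at 1.5.
width, height, fps = 1280, 720, 30

yuyv_frame = width * height * 2          # 1,843,200 bytes per frame
i420_frame = width * height * 3 // 2     # 1,382,400 bytes per frame

print('YUYV: {:.1f} MB/s'.format(yuyv_frame * fps / 1e6))   # ~55.3 MB/s
print('I420: {:.1f} MB/s'.format(i420_frame * fps / 1e6))   # ~41.5 MB/s
```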
Your webcam almost certainly supports a YUV raw video format of some type though (as does mine). By the way, could I please have a listing of the formats your webcam supports?
I checked the output; the webcam seems to be a UVC device. The following sender script works for me:

#!/usr/bin/python3 --
"""exec" /usr/bin/python3 -- "$0" "$@"
exit 127
"""
import struct
import os
import sys
import subprocess
def main(argv):
width, height, frame_rate = 1280, 720, 30
sys.stdout.buffer.write(struct.pack('=HHH', width, height, frame_rate))
sys.stdout.buffer.flush()
print('Starting webcam stream at {}x{} {} FPS...'.format(
width, height, frame_rate), file=sys.stderr)
sys.stderr.flush()
subprocess.run(('sudo', 'modprobe', 'uvcvideo'), check=True)
res = subprocess.run(executable='/usr/bin/gst-launch-1.0', args=(
'gst-launch-1.0',
'--quiet',
'v4l2src',
'!',
'queue',
'!',
'image/jpeg,'
'width={},'
'height={},'
'framerate={}/1'.format(width, height, frame_rate),
'!',
'jpegdec',
'!',
'fdsink',
), stdin=subprocess.DEVNULL)
sys.exit(res.returncode)
if __name__ == '__main__':
main(sys.argv) |
And the receiver script:

#!/usr/bin/python3 --
# Copyright (C) 2021 Elliot Killick <elliotkillick@zohomail.eu>
# Copyright (C) 2021 Demi Marie Obenour <demi@invisiblethingslab.com>
# Licensed under the MIT License. See LICENSE file for details.

import struct
import os
import sys


def main(argv):
    if len(argv) != 1:
        raise RuntimeError('should not have any arguments')
    # The sender prefixes the stream with a 6-byte header: width, height and
    # frame rate as native-endian unsigned 16-bit integers.
    s = struct.Struct('=HHH')
    if s.size != 6:
        raise AssertionError('bug')
    untrusted_input = os.read(0, 6)
    if len(untrusted_input) != 6:
        raise RuntimeError('wrong number of bytes read')
    untrusted_width, untrusted_height, untrusted_fps = s.unpack(untrusted_input)
    del untrusted_input
    # Sanity-limit the untrusted values before using them to build caps.
    if untrusted_width > 4096 or untrusted_height > 4096 or untrusted_fps > 4096:
        raise RuntimeError('excessive width, height, and/or fps')
    width, height, fps = untrusted_width, untrusted_height, untrusted_fps
    del untrusted_width, untrusted_height, untrusted_fps
    print('Receiving video stream at {}x{} {} FPS…'.format(width, height, fps),
          file=sys.stderr)
    os.execv('/usr/bin/gst-launch-1.0', (
        'gst-launch-1.0',
        'fdsrc',
        '!',
        'capsfilter',
        'caps=video/x-raw,'
        'width={},'
        'height={},'
        'framerate={}/1,'
        'format=I420,'
        'colorimetry=2:4:7:1,'
        'chroma-site=none,'
        'interlace-mode=progressive,'
        'pixel-aspect-ratio=1/1,'
        'max-framerate={}/1,'
        'views=1'.format(width, height, fps, fps),
        '!',
        'rawvideoparse',
        'use-sink-caps=true',
        '!',
        'v4l2sink',
        'device=/dev/video0',
        'sync=false',
    ))


if __name__ == '__main__':
    main(sys.argv)
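For local testing without involving two VMs, the sender's stdout can simply be piped into the receiver's stdin (the receiver still needs a v4l2loopback /dev/video0 to write to). Below is a minimal sketch of such a harness; the two paths are placeholders for wherever the scripts above are saved.

```python
#!/usr/bin/python3 --
# Minimal local test harness (a sketch, not part of the repo): connect the
# sender's stdout to the receiver's stdin with a pipe, much as qrexec would
# between VMs. The two paths below are placeholders.
import subprocess
import sys

SENDER = './webcam-sender'      # placeholder path to the sender script above
RECEIVER = './webcam-receiver'  # placeholder path to the receiver script above

def main():
    sender = subprocess.Popen((SENDER,), stdout=subprocess.PIPE)
    receiver = subprocess.Popen((RECEIVER,), stdin=sender.stdout)
    sender.stdout.close()       # the receiver now owns the read end
    receiver.wait()
    sys.exit(sender.wait())

if __name__ == '__main__':
    main()
```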
For me (with the GStreamer in dom0), it works.
It does not work for me, perhaps because my webcam does not support I420 video, only YUYV and MJPEG. Trying to convert from YUYV to I420 caused errors and assertion failures in GStreamer. Furthermore, video quality is lower in YUYV mode than in MJPEG mode, so I would prefer to use MJPEG mode anyway. Below is the sender I use currently; it is exposed as a qrexec service.

#!/usr/bin/python3 --
"""exec" /usr/bin/python3 -- "$0" "$@"
exit 127
"""
import struct
import os
os.environ['G_DEBUG'] = 'fatal-criticals'
import sys
import subprocess
import gi
gi.require_version('Gtk', '3.0')
gi.require_version('Gst', '1.0')
from gi.repository import Gtk, Gst
def main(argv):
width, height, frame_rate = 1280, 720, 30
sys.stdout.buffer.write(struct.pack('=HHH', width, height, frame_rate))
sys.stdout.buffer.flush()
print('Starting webcam stream at {}x{} {} FPS...'.format(
width, height, frame_rate), file=sys.stderr)
sys.stderr.flush()
subprocess.run(('sudo', 'modprobe', 'uvcvideo'), check=True)
Gst.init()
element = Gst.parse_launchv((
'v4l2src',
'!',
'queue',
'!',
'capsfilter',
'caps=image/jpeg,'
'width={},'
'height={},'
'framerate={}/1,'
'format=I420,'
'colorimetry=2:4:7:1,'
'chroma-site=none,'
'interlace-mode=progressive,'
'pixel-aspect-ratio=1/1,'
'max-framerate={}/1,'
'views=1'.format(width, height, frame_rate, frame_rate),
'!',
'jpegdec',
'!',
'capsfilter',
'caps=video/x-raw,'
'width={},'
'height={},'
'framerate={}/1,'
'format=I420,'
'interlace-mode=progressive,'
'pixel-aspect-ratio=1/1,'
'max-framerate={}/1,'
'views=1'.format(width, height, frame_rate, frame_rate),
'!',
'fdsink',
))
def msg_handler(bus, msg):
if msg.type == Gst.MessageType.EOS:
print('<6>End of stream, exiting', file=sys.stderr)
elif msg.type == Gst.MessageType.ERROR:
print('<3>Fatal error:', msg.parse_error(), file=sys.stderr)
elif msg.type == Gst.MessageType.CLOCK_LOST:
print('<6>Clock lost, resetting', file=sys.stderr)
element.set_state(Gst.State.PAUSED)
element.set_state(Gst.State.PLAYING)
return
else:
return # FIXME!
element.set_state(Gst.State.NULL)
Gtk.main_quit()
bus = element.get_bus()
bus.add_signal_watch()
bus.connect('message', msg_handler)
element.set_state(Gst.State.PLAYING)
Gtk.main()
if __name__ == '__main__':
main(sys.argv) The second |
It is also worth noting that since the frontend qube cannot attack the webcam, the webcam’s own indicator light also serves as an unforgeable and unpreventable indication that recording is taking place, even without the UI.
Hmm, alright well your way (with
Yes, as noted in the
As mentioned prior, my webcam also doesn't support I420.
For that reason, I've been seeing if it would be possible to just stream the JPEGs directly, and I have had success with that. Originally, I wanted to do only raw video because I figured that would allow for the smallest attack surface. However, compressed MJPG/MJPEG is much more widely used by IP cameras, so it may be more "battle tested" and therefore still potentially reasonably secure. Because of its widespread use, it also has better support from applications (e.g. #1) and better performance. I have a very simple working pipeline for passing JPEGs here.

Sending pipeline:

#!/bin/bash
gst-launch-1.0 -q v4l2src ! \
    queue ! \
    "image/jpeg" ! \
    fdsink

Receiving pipeline:

#!/bin/bash
gst-launch-1.0 -v fdsrc ! \
    "image/jpeg,width=1920,height=1080,framerate=30/1,colorimetry=sRGB" ! \
    jpegdec ! \
    glimagesink

Make sure your webcam format is set to MJPG. For testing in one VM (without passing between VMs), pipe the first pipeline straight into the second. This will open a preview of the video feed in an OpenGL viewer. Note that the OpenGL viewer doesn't have the best performance, but that has nothing to do with the MJPEG format. I tested this in dom0 (where my webcam device is) and it works. My only problem now is getting that into a v4l2loopback device.
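By the way, here is roughly what that one-VM preview looks like when ported to the Python GI bindings instead of gst-launch-1.0. This is only a sketch: the 1280x720 at 30 FPS caps are an assumption and should be replaced with whatever the webcam actually advertises.

```python
#!/usr/bin/python3 --
# Sketch of the one-VM MJPEG preview through the GStreamer GI bindings.
# The caps values are assumptions; use a mode your webcam really supports.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

def main():
    Gst.init(None)
    pipeline = Gst.parse_launch(
        'v4l2src ! queue ! image/jpeg,width=1280,height=720,framerate=30/1 ! '
        'jpegdec ! glimagesink')
    loop = GLib.MainLoop()
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect('message::eos', lambda *_: loop.quit())
    bus.connect('message::error', lambda *_: loop.quit())
    pipeline.set_state(Gst.State.PLAYING)
    try:
        loop.run()
    finally:
        pipeline.set_state(Gst.State.NULL)

if __name__ == '__main__':
    main()
```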
Thank you so much for all the help so far! Porting the pipeline into a Python program with GI bindings is a big help. I don't have ample time on my hands right this very moment, but it would seem that you do. I'm going to add you as a contributor to the repo so you can just make any additions you want without having to go through me. Thanks again! For the time being, I'm going to work on the C program for getting webcam formats.
That implicit conversion may be working for me because I'm using an older version (still a 1.x version, though) of GStreamer that comes packaged with dom0 (Fedora 25). Remember, my webcam USB device is located in dom0.
I noticed you are doing sudo modprobe uvcvideo in the sender. Lastly, I noticed you're hard-coding the dimensions and FPS to your webcam's values, whereas the current webcam sender script doesn't do that. This will of course have to be changed in an actual release. Thanks again for the help @DemiMarie!
To reduce attack surface, I disallow autoloading of many drivers, including most USB drivers. Therefore, the UVC driver needs to be loaded manually. This is specific to my system and should not be part of a released version, at least unless Qubes adds its own blocklist.
Indeed it will. I didn't want to try to parse the output of a command-line tool just to discover the supported formats.
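Enumerating the formats directly through the V4L2 ioctls avoids parsing text output entirely. Here's a rough Python sketch of the idea; the ioctl constant and the 64-byte struct v4l2_fmtdesc layout are assumptions to double-check against videodev2.h, and the real implementation would presumably be the planned C program.

```python
#!/usr/bin/python3 --
# Rough sketch: enumerate a capture device's pixel formats via VIDIOC_ENUM_FMT.
# The ioctl constant and struct layout are assumptions for 64-bit Linux.
import fcntl
import os
import struct
import sys

VIDIOC_ENUM_FMT = 0xC0405602           # _IOWR('V', 2, struct v4l2_fmtdesc), 64 bytes
V4L2_BUF_TYPE_VIDEO_CAPTURE = 1
# struct v4l2_fmtdesc: index, type, flags, description[32], pixelformat,
# mbus_code, reserved[3]
FMTDESC = struct.Struct('=3I32s5I')

def list_formats(device):
    formats = []
    fd = os.open(device, os.O_RDWR)
    try:
        index = 0
        while True:
            buf = bytearray(FMTDESC.pack(index, V4L2_BUF_TYPE_VIDEO_CAPTURE, 0,
                                         b'', 0, 0, 0, 0, 0))
            try:
                fcntl.ioctl(fd, VIDIOC_ENUM_FMT, buf)
            except OSError:
                break                   # EINVAL: no more formats
            _, _, _, description, pixelformat, *_ = FMTDESC.unpack(buf)
            fourcc = struct.pack('<I', pixelformat).decode('ascii', 'replace')
            formats.append((fourcc, description.rstrip(b'\0').decode()))
            index += 1
    finally:
        os.close(fd)
    return formats

if __name__ == '__main__':
    for fourcc, desc in list_formats(sys.argv[1] if len(sys.argv) > 1 else '/dev/video0'):
        print(fourcc, desc)
```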
You’re welcome!
I wonder if a custom GStreamer element could help. The errors I am getting seem to indicate that GStreamer does not know what to do with the loopback device, perhaps because of a bug in the driver.
Looks like the problem is v4l2loopback/v4l2loopback#137, which is really a GStreamer bug. GStreamer tries to query the supported formats of the loopback device, which does not work until an output format has been set on it.
Hm, yes, I came to a similar conclusion, but based on this issue where umlaeute talked about setting the "output-format" of the pipeline. Maybe we could even just take code from there.
You got that right! Here's a list of all the formats I found.
Just coming back to this: I think the reason for this issue lies in the older version of GStreamer (in R4.0 dom0) that I'm using.
Fantastic news! I just found an issue where someone seems to be having the same problem we're having and solved it (with a "janky fix") just 20 days ago, for the mjpg-streamer project: jacksonliam/mjpg-streamer#298 (comment). This confirms that the issue is what we both thought it to be. As proven by my PoC, we are already fully capable of streaming MJPEG across VMs through file descriptors using just GStreamer. So, my understanding is that if we just implement a similar fix, with a GStreamer element of our own forcefully setting the output format on the loopback device, the rest should fall into place.
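One way to experiment with that idea before writing a proper GStreamer element would be to pre-set the loopback device's output format from a small script before GStreamer opens it. The sketch below is only an illustration of the idea: the VIDIOC_S_FMT constant and struct v4l2_format layout assume 64-bit Linux, and whether v4l2loopback accepts this before a writer attaches depends on its module options, so treat it as something to test rather than a known fix.

```python
#!/usr/bin/python3 --
# Rough experiment (an assumption, not a known fix): pre-set an I420 output
# format on a v4l2loopback device with VIDIOC_S_FMT so that later caps
# negotiation has something to work with. Constants assume 64-bit Linux.
import fcntl
import os
import struct
import sys

VIDIOC_S_FMT = 0xC0D05605              # _IOWR('V', 5, struct v4l2_format), 208 bytes
V4L2_BUF_TYPE_VIDEO_OUTPUT = 2
V4L2_FIELD_NONE = 1
V4L2_PIX_FMT_YUV420 = ord('Y') | ord('U') << 8 | ord('1') << 16 | ord('2') << 24  # 'YU12' (I420)

def set_i420_format(device, width, height):
    fd = os.open(device, os.O_RDWR)
    try:
        # struct v4l2_format: u32 type, padding, then struct v4l2_pix_format
        # (12 u32 fields), padded out to the 208-byte total.
        buf = bytearray(struct.pack(
            '=I4x12I152x',
            V4L2_BUF_TYPE_VIDEO_OUTPUT,
            width, height, V4L2_PIX_FMT_YUV420, V4L2_FIELD_NONE,
            width,                      # bytesperline of the Y plane
            width * height * 3 // 2,    # sizeimage for 4:2:0
            0, 0, 0, 0, 0, 0))          # colorspace, priv, flags, ycbcr_enc, quantization, xfer_func
        fcntl.ioctl(fd, VIDIOC_S_FMT, buf)
    finally:
        os.close(fd)

if __name__ == '__main__':
    set_i420_format(sys.argv[1] if len(sys.argv) > 1 else '/dev/video0', 1280, 720)
```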
My webcam does not support I420 output, so GStreamer in sys-usb fails. Removing this constraint fixes the problem.