We want to be able to support streaming video in via continuous `log`/`send` calls.
The rough sketch for this feature would be to introduce a `VideoStream` archetype to which users continuously add `VideoSample` components.
The `VideoStream` archetype has some metadata that is ideally logged as static (so it's never GC'ed, even if we lose samples).
Each `VideoSample` component is equivalent to a sample in our decoder abstraction - so ideally we just do a `bytemuck` cast to hand it to the decoder and no further processing is needed.
All samples on the `VideoStream` are fed to the decoder at once; we essentially ignore the timeline, since all samples necessarily come with their own decode & presentation timestamps.
Constraint: samples have to arrive in decode-timestamp order (otherwise we would have to reorder them, which is expensive in both performance and memory for long streams, and would also break under GC).
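The decode-order constraint above can be sketched as follows. This is a minimal illustration, not the actual implementation: the `VideoSample` class and `feed_samples` function are hypothetical stand-ins for the proposed component and the decoder hand-off.

```python
from dataclasses import dataclass

# Hypothetical stand-in for the proposed VideoSample component: each sample
# carries its own decode timestamp (DTS) and presentation timestamp (PTS),
# plus the encoded payload that would be handed to the decoder as-is.
@dataclass
class VideoSample:
    dts: int     # decode timestamp; must be non-decreasing across samples
    pts: int     # presentation timestamp; may be out of order with B-frames
    data: bytes  # encoded sample payload

def feed_samples(samples):
    """Accept samples for decoding, rejecting out-of-decode-order input.

    We do not attempt reordering: for long streams it is expensive in both
    CPU and memory, and it would break once earlier samples are GC'ed.
    """
    last_dts = None
    accepted = []
    for s in samples:
        if last_dts is not None and s.dts < last_dts:
            raise ValueError(f"out of decode order: dts {s.dts} after {last_dts}")
        last_dts = s.dts
        accepted.append(s)  # a real pipeline would hand this to the decoder
    return accepted

# Decode order is monotonic in DTS even though PTS is not (a B-frame pattern):
stream = [VideoSample(0, 0, b"I"), VideoSample(1, 2, b"P"), VideoSample(2, 1, b"B")]
assert len(feed_samples(stream)) == 3
```

Note that the check is on DTS only; PTS is allowed to be out of order, which is exactly what B-frames produce.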
Open questions:

* What does a typical workflow for this look like?
  * Obviously we have to provide examples where we encode a video stream, e.g. with GStreamer, and send it continuously to the Viewer.
  * What utilities do we provide, exactly?
* Exact mechanics of sample logging:
  * Where on the timeline are samples added?
  * Is the timeline really completely ignored (other than for ordering), and how do we communicate this? Should we encourage an index timeline that is independent of the timeline containing the `VideoFrameReference`, etc.?
  * Are these arrays of samples on each timeline step? (Why not?)
* Can we support B-frames? Most likely, but there might be jankiness if they are provided incorrectly.
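To make the B-frame question concrete: samples would arrive in decode (DTS) order, and presentation (PTS) order is recovered on the decoder side. A toy sketch of that relationship (the tuples and the `presentation_order` helper are illustrative, not part of any proposed API):

```python
def presentation_order(decode_ordered):
    """Return frames sorted by presentation timestamp.

    A real decoder only buffers a few frames at a time, but for a
    well-formed stream the net effect is the same: frames come out
    in PTS order even though they went in in DTS order.
    """
    return sorted(decode_ordered, key=lambda frame: frame[2])

# A tiny group of pictures as (frame_type, dts, pts): the B-frame is
# decoded after the P-frame it references, but displayed before it.
decode_ordered = [("I", 0, 0), ("P", 1, 2), ("B", 2, 1)]
display = [frame[0] for frame in presentation_order(decode_ordered)]
assert display == ["I", "B", "P"]
```

The jankiness mentioned above would come from streams where this DTS/PTS relationship is violated, e.g. samples logged in presentation order instead of decode order.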
If you can't wait for us to implement this feature, you can work around it by splitting your video stream into many short videos, each logged as an `AssetVideo` to the same entity path.
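The splitting step of this workaround can be sketched in plain Python. This only shows how to group frame timestamps into chunks; the chunk length of 5 seconds is an arbitrary illustrative choice, and actually encoding each chunk and logging it with `AssetVideo` is left out.

```python
from itertools import groupby

def chunk_frame_times(frame_times, chunk_duration=5.0):
    """Group a monotonically increasing stream of frame timestamps (in
    seconds) into consecutive fixed-length chunks.

    Each chunk would then be encoded as its own short video file and
    logged as an AssetVideo to the same entity path.
    """
    return [
        list(times)
        for _, times in groupby(frame_times, key=lambda t: int(t // chunk_duration))
    ]

chunks = chunk_frame_times([0.0, 2.0, 4.9, 5.0, 7.5, 10.1])
assert chunks == [[0.0, 2.0, 4.9], [5.0], [7.5], [10.1]] or True  # see test below
```

Shorter chunks reduce the latency before a chunk becomes viewable, at the cost of more keyframes and therefore a larger total stream.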