Stop using playback rate manipulation for buffer management #7602

Closed
tykus160 opened this issue Nov 15, 2024 · 7 comments · Fixed by #7617

@tykus160
Member

Have you read the FAQ and checked for duplicate open issues?
Yes

Is your feature request related to a problem? Please describe.

Currently Shaka manipulates playbackRate to control buffering, i.e. if we think the player is starving, we set playbackRate to 0 to halt playback until the buffer is refilled. Unfortunately, this behavior causes some issues:

  1. We've noticed that on live streams, when we're too close to the live edge, Chrome sometimes freezes the video entirely (while audio continues to play). Our internal tests confirm that disabling playbackRate manipulation prevents this issue.
  2. We've seen an issue on PlayStation 5 where video plays at startup, but the buffering spinner stays visible for a couple of seconds. It turned out the PS5 does not support playbackRate=0 at all.
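
For context, here is a simplified sketch of the kind of playbackRate gating described above. This is illustrative only, not Shaka's actual implementation; the helper name and thresholds are made up:

```js
// Illustrative only: gate playback by zeroing playbackRate while starving.
function bufferedAheadOf(video) {
  const buffered = video.buffered;
  for (let i = 0; i < buffered.length; i++) {
    if (video.currentTime >= buffered.start(i) &&
        video.currentTime <= buffered.end(i)) {
      return buffered.end(i) - video.currentTime;
    }
  }
  return 0;
}

function pollBufferingState(video, rebufferingGoal) {
  const ahead = bufferedAheadOf(video);
  if (ahead < 0.5) {
    video.playbackRate = 0;  // starving: freeze video until the buffer refills
  } else if (ahead >= rebufferingGoal && video.playbackRate === 0) {
    video.playbackRate = 1;  // enough buffer again: resume the normal rate
  }
}
```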

I know there was already an attempt to stop manipulating the playback rate and rely entirely on media element events, but it was reverted due to issues it caused around rebufferingGoal handling.

Describe the solution you'd like

We're thinking about potential solutions, and two come to mind for now:

  1. Nuclear solution - remove the buffer poller, rely entirely on video element events, and deprecate rebufferingGoal and minBufferTime (see the sketch after this list). It's harsh but relatively easy to implement. I think other JS players work that way.
  2. Use SegmentPrefetch for buffer management - push data only there until rebufferingGoal is reached, then move this data to the SourceBuffers. We could keep our configs that way, but the implementation could be tricky.
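
A minimal sketch of what option 1 could look like, relying only on standard media element events; the event choice and callback shape are assumptions, not a finished design:

```js
// Sketch of option 1: derive the buffering state from media element events
// instead of a buffer poller plus playbackRate changes.
function observeBufferingState(video, onBufferingChange) {
  // 'waiting' fires when playback stops because the next frame isn't buffered.
  video.addEventListener('waiting', () => onBufferingChange(true));
  // 'playing' fires when playback starts or resumes after a stall.
  video.addEventListener('playing', () => onBufferingChange(false));
  // readyState covers the case where we attach after playback has started.
  onBufferingChange(video.readyState < HTMLMediaElement.HAVE_FUTURE_DATA);
}

// Usage: drive the buffering spinner purely from the element's own state.
// observeBufferingState(video, (buffering) => ui.setSpinnerVisible(buffering));
```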

Describe alternatives you've considered

Additional context

In dash.js, minBufferTime is used only to calculate the presentation delay if no suggestedPresentationDelay is set.
For a rebufferingGoal equivalent, looking at the config there's only initialBufferLevel (defaults to NaN), which is used to wait for enough data in the buffer before starting playback. I think playback is prevented by not setting the autoplay attribute on the video element and by explicitly calling play() once the buffer is filled.
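
A hedged sketch of that presumed pattern (not dash.js source): leave autoplay off and call play() explicitly once enough data is buffered. The initialBufferLevel parameter and the 'progress' wiring here are illustrative:

```js
// Illustrative: start playback only after an initial buffer level is reached.
function playWhenBuffered(video, initialBufferLevel) {
  const check = () => {
    const buffered = video.buffered;
    const bufferedAhead = buffered.length ?
        buffered.end(buffered.length - 1) - video.currentTime : 0;
    if (bufferedAhead >= initialBufferLevel) {
      video.removeEventListener('progress', check);
      video.play();  // explicit play() replaces the autoplay attribute
    }
  };
  video.addEventListener('progress', check);
}
```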

Previous work in this area:

Are you planning to send a PR to add it?
Yes, either I or one of my teammates will be happy to work on it once we agree on which solution is most suitable.

@tykus160 tykus160 added type: enhancement New feature or request priority: P1 Big impact or workaround impractical; resolve before feature release labels Nov 15, 2024
@tykus160 tykus160 self-assigned this Nov 15, 2024
@avelad
Member

avelad commented Nov 15, 2024

We will leave the issue open for a week so that the community can give their opinion. If there are no opinions in a week, we will implement solution 1!

@shaka-bot shaka-bot added this to the Backlog milestone Nov 15, 2024
@tykus160 tykus160 pinned this issue Nov 15, 2024
@avelad avelad modified the milestones: Backlog, v4.13 Nov 15, 2024
@dsilhavy

Hi all,
@gkatsev highlighted this issue in the SVTA Players & Playback WG Slack channel, so I am taking the opportunity to comment from a dash.js perspective:

  • Just for clarification: the @minBufferTime attribute in the MPD is not an instruction to the player on how much data to buffer before starting playback. The name of the attribute is quite confusing, though. From 23009-1:

The value of the minimum buffer time does not provide any instructions to the client on how long to buffer the media. The value however describes how much buffer a client should have under ideal network conditions. As such, MBT is not describing the burstiness or jitter in the network, it is describing the burstiness or jitter in the content encoding. Together with the BW value, it is a property of the content. Using the "leaky bucket" model, it is the size of the bucket that makes BW true, given the way the content is encoded.

  • In dash.js the app can define an initialBufferLevel. If defined, we set the initial buffer target at playback start to that value. If autoplay is enabled, we wait for the buffer target to be reached and then start playback. However, we do not use this value again after seeking or when the buffer runs dry during playback; it is only used at playback start. We might change this behavior in the future (a config sketch follows after this list). Example: https://reference.dashif.org/dash.js/nightly/samples/buffer/initial-buffer.html
  • We adjust video.autoplay depending on whether playback should start immediately or not.
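
For reference, a hedged dash.js configuration example matching the description above; videoElement and manifestUrl are placeholders, and the exact settings path (streaming.buffer.initialBufferLevel) may differ between dash.js versions:

```js
const player = dashjs.MediaPlayer().create();
player.updateSettings({
  streaming: {
    buffer: {
      initialBufferLevel: 8,  // seconds to buffer before playback starts
    },
  },
});
// With autoplay enabled, dash.js waits for the initial buffer target before
// starting playback; the value is not reused after seeks or mid-playback stalls.
player.initialize(videoElement, manifestUrl, /* autoPlay= */ true);
```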

Happy to discuss this in the SVTA Players & Playback WG call if required.

@gkatsev
Contributor

gkatsev commented Nov 15, 2024

Sounds like at the very least we should decouple the usage of minBufferTime here.

As for removing the feature, any concerns about feature flagging it, at least in the interim? Or will that be the approach for removing it long term? I guess we'd probably want it to be configurable at runtime as well, if we do.

@avelad
Member

avelad commented Nov 15, 2024

Keeping both options adds complexity and makes maintenance more complicated. I would prefer not to add both options.

@martinstark
Contributor

Would it be possible to implement option 1 and also supply an API for external buffer management, allowing the consumer to intercept segments before they are fed to the source buffer?

@joeyparrish
Member

> an API for external buffer management? Allowing the consumer to intercept segments before they are fed to the source buffer

Does that differ from the network response filters we have, where you can intercept network responses in an async callback and inspect and/or modify them before we consume them?
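
For reference, a small example of the existing response-filter hook mentioned above; player is assumed to be an existing shaka.Player instance, and the logging is illustrative:

```js
const netEngine = player.getNetworkingEngine();
netEngine.registerResponseFilter(async (type, response) => {
  if (type !== shaka.net.NetworkingEngine.RequestType.SEGMENT) {
    return;
  }
  // The filter can inspect or replace response.data (the segment payload)
  // and can await async work before Shaka consumes the response.
  console.log('segment response', response.uri, response.data.byteLength);
});
```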

@martinstark
Contributor

martinstark commented Nov 18, 2024

I thought about it, but I'm not sure it's a good idea to mix a potential buffer management API, which might delay responses from resolving for a long time, with the network request/retry/timeout settings; then again, maybe those settings no longer apply once the response filter has fired.

To keep things decoupled from the network intercept API, the buffer API could live closer to the SourceBuffer, allowing Shaka to process the resolved response before handing it over to a potential BufferManager. That would probably be more future-proof, in case Shaka wants to change how it handles buffering again.
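
A purely hypothetical shape for that idea; none of these names exist in Shaka today, and the integration point is only a sketch:

```js
/**
 * Hypothetical app-provided hook, called after networking (including retries
 * and timeouts) has finished and before the data is appended.
 * @typedef {{
 *   onSegment: function(!BufferSource): !Promise<!BufferSource>
 * }}
 */
let BufferManager;

// Hypothetical integration point: the app may hold or transform the segment,
// and only when its promise resolves does Shaka append to the SourceBuffer.
async function appendWithManager(sourceBuffer, segmentData, bufferManager) {
  const data = bufferManager ?
      await bufferManager.onSegment(segmentData) : segmentData;
  sourceBuffer.appendBuffer(data);
}
```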

avelad added a commit to avelad/shaka-player that referenced this issue Nov 18, 2024
avelad added a commit to avelad/shaka-player that referenced this issue Nov 18, 2024
avelad added a commit that referenced this issue Nov 19, 2024
@avelad avelad closed this as completed in 84b64af Nov 20, 2024