
HLS.js memory increase issue when playing live streaming #5402

Closed
qqfei1858 opened this issue Apr 19, 2023 · 12 comments · Fixed by #6550

Comments

@qqfei1858

What do you want to do with Hls.js?

The page needs to use hls.js to play live streaming and display multiple screens at the same time. However, due to the long playback time, the memory of the page keeps increasing, which eventually leads to page crashes.

What have you tried so far?

OS: Windows 11 browser: Microsoft Edge 112.0.1722.48

The memory still increases despite setting `backBufferLength`, `maxBufferSize`, and `maxBufferLength`:

(screenshots: player configuration and DevTools memory profile)

  1. Memory usage at the beginning of playback (5 screens). (screenshot)

  2. Memory usage 10 minutes later. (screenshot)

This is only a testing scenario, and the screenshot is meant to illustrate the issue. In actual live streaming, the broadcast may last for more than a day or even longer, causing the page to crash due to memory usage issues.
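For context, a setup along the lines described in this report might look like the sketch below. These are real hls.js config options, but the specific values and the stream URL are illustrative assumptions, not the reporter's exact configuration:

```javascript
import Hls from 'hls.js';

// Buffer-related settings mentioned in this report. The values here are
// examples only; tune them to the live stream's segment duration.
const hls = new Hls({
  backBufferLength: 90,            // seconds of buffer kept behind currentTime
  maxBufferLength: 30,             // target forward buffer, in seconds
  maxBufferSize: 60 * 1000 * 1000, // forward buffer cap, in bytes
});

hls.loadSource('https://example.com/live/stream.m3u8'); // placeholder URL
hls.attachMedia(document.querySelector('video'));
```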

@qqfei1858 added the Needs Triage and Question labels Apr 19, 2023
@qqfei1858
Author

hls version: 1.4.0

@robwalch
Collaborator

Try comparing memory heaps and allocation profiling to find where the test is leaking: https://developer.chrome.com/docs/devtools/memory-problems/

If you find specific evidence of leaks in HLS.js, please share and we can treat this a bug.

@robwalch added the Need info (Incomplete) label and removed the Needs Triage label Apr 25, 2023
@zsilbi

zsilbi commented May 1, 2023

I found and fixed a similar issue here: videojs/m3u8-parser#164 for my stream.
I assume the root cause of this issue is the same, it's a Chromium problem when handling large strings like raw m3u8.
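The V8 behavior zsilbi refers to: slicing a small piece out of a large string (e.g. via `match` or `exec` on the full m3u8 text) can produce a "sliced string" that keeps the entire parent string alive. A commonly cited workaround is to force a flat copy via concatenation; whether it helps in a given case depends on V8 internals, so treat this as a sketch of the idea rather than hls.js's actual fix:

```javascript
// When a small slice of a large string is stored long-term (e.g. a parsed
// playlist attribute), V8 may retain the whole multi-megabyte parent string
// through the slice. Concatenating forces a new flat string, so the parent
// can be garbage collected.
function flatten(slice) {
  return (' ' + slice).slice(1);
}

const playlist = '#EXTM3U\n#EXT-X-TARGETDURATION:6\n'.repeat(100000);
const match = playlist.match(/#EXT-X-TARGETDURATION:(\d+)/);
const retained = match[1];          // may pin the whole playlist text
const detached = flatten(match[1]); // flat copy; only retention differs

console.log(retained === detached); // the character content is identical
```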

@OrenMe
Collaborator

OrenMe commented May 2, 2023

@zsilbi great find! and thanks for posting back here. So basically a solution will need to be adding this in the parser all around where match or exec are being executed?

@robwalch
Collaborator

robwalch commented May 2, 2023

We already handle sliced strings. Let me know if memory profiling reveals any references to full playlist text in any parsed properties. I am not aware of any.

@zsilbi

zsilbi commented May 2, 2023

> @zsilbi great find! and thanks for posting back here. So basically a solution will need to be adding this in the parser all around where match or exec are being executed?

Yes, for the calls which are executed on the full playlist string.

> We already handle sliced strings. Let me know if memory profiling reveals any references to full playlist text in any parsed properties. I am not aware of any.

muxinc/elements#679 (comment)

@robwalch
Collaborator

robwalch commented May 2, 2023

Thanks @zsilbi,

Exposing the media playlist text is intentional. Removing this from the library could break HLS.js integrations that use it:

```ts
level.m3u8 = string;
```

```ts
public m3u8: string = '';
```

If parsed media playlist instances are not being replaced and garbage collected then that would be a leak. What I am asking for as part of this bug report are pointers to any such leaks, preferably isolated from any app integration or demo page.

The player never reads `LevelDetails.m3u8`, so you could set it to `undefined` in level and track loaded events if you are concerned about runtime memory usage.
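That suggestion could be sketched as follows. `LEVEL_LOADED` and `AUDIO_TRACK_LOADED` are real hls.js events and `m3u8` is the property shown above, but clearing it is an app-level workaround, not an official API:

```javascript
// Drop the raw playlist text after each level/track load so the large
// string is not retained on the parsed details object.
hls.on(Hls.Events.LEVEL_LOADED, (event, data) => {
  data.details.m3u8 = '';
});
hls.on(Hls.Events.AUDIO_TRACK_LOADED, (event, data) => {
  data.details.m3u8 = '';
});
```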

@zsilbi

zsilbi commented May 2, 2023

Then it looks like I wasn't patient enough to wait for GC.
The playlist file I tested with was 12 hours long, so its text immediately popped up at the top of the heap snapshot, and I thought it was the same issue I had.

@robwalch
Collaborator

robwalch commented May 2, 2023

@zsilbi was it from a different variant? The player does not remove level details from variants it switches away from. This is something the player could do for live after the parsed playlist details are no longer usable.

@robwalch
Collaborator

robwalch commented May 2, 2023

I'm looking at Part and Fragment instances retained by the FragmentTracker in #5423, and am making sure that parts are removed once the parts for a fragment are appended, and that tracked fragments are removed after their media is ejected from the buffer.

A quick test you can run in the console to see if these objects are piling up is:

```js
Object.keys(hls.streamController.fragmentTracker.fragments);
```

The expectation is that the objects tracked match what is in the buffer whether it is the player or the browser that ejects media from either source buffer. Using a backBufferLength that matches your playlist window or is set to something reasonable like 90 will help keep the objects tracked to a minimum.

It would also help if you notice any other objects piling up on the player instance to mention it as part of a bug report.
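The console check above can be run periodically to watch the trend over a long live session. Note that `streamController` and `fragmentTracker` are internal, undocumented properties and may change between hls.js versions:

```javascript
// Log how many fragments the FragmentTracker is retaining. If this count
// grows without bound instead of tracking the buffered range, it points
// to a leak like the one investigated in #5423.
setInterval(() => {
  const tracked = Object.keys(hls.streamController.fragmentTracker.fragments);
  console.log(`tracked fragments: ${tracked.length}`);
}, 10000);
```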

@mzvast

mzvast commented Jul 3, 2023

Any update?

@robwalch
Collaborator

robwalch commented Jul 12, 2024

Try disabling the Web Worker by setting enableWorker: false. Let me know how that impacts memory usage in your profiling. #6550 will land in v1.6.0 and reduces the number of worker instances HLS.js creates under the default behavior, which uses workers (enableWorker: true).

Outside of this we haven't identified any other leaks. Please provide solid leads by comparing heap snapshots before and after playing a significant amount of content and forcing GC. Keep in mind that all buffered content and parsed media used by the API will still occupy space.
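One rough way to compare runs with enableWorker true vs. false is to sample the JS heap over time. `performance.memory` is a non-standard, Chromium-only API, so this is only a quick trend indicator, not a substitute for the heap-snapshot comparison requested above:

```javascript
// Chromium-only: performance.memory reports JS heap sizes. Sampling it
// periodically gives a rough trend line for long-running live playback.
const samples = [];
setInterval(() => {
  if (performance.memory) {
    samples.push({
      t: Date.now(),
      usedJSHeapSize: performance.memory.usedJSHeapSize,
    });
  }
}, 30000);
```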
