Fix already processed fragments being recognised as new due to change of base fragments url #51
The same fragments may get different URLs from different worker servers on two consecutive updates of the HLS playlist. `HLSDownloader._fill_queue` relies on comparing full fragment URLs to pick new ones, so when this happens it treats every URL from the current playlist chunk as new, causing about ten fragments to be downloaded a second time and leaving a roughly ten-second segment of the video repeated in the recording. Here is an example of two different URLs pointing to a fragment with the same number and seemingly the same content:
This PR makes it so that only the filename part of the fragment URL (`73436.ts` for the links above) is used for comparison, which fixes the issue. Currently no warning is issued when this happens, but it can still be detected by the `[hls] Found 12 new fragments` line in the output. Perhaps a dedicated debug message should be added?
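The fix can be sketched roughly as follows. This is an illustrative standalone example, not the actual `HLSDownloader` code; the function names, example hostnames, and the `seen_names` set are assumptions for demonstration:

```python
from posixpath import basename
from urllib.parse import urlparse

def fragment_name(url: str) -> str:
    """Extract only the filename part of a fragment URL,
    e.g. 'https://worker1.example/hls/73436.ts?token=abc' -> '73436.ts'."""
    return basename(urlparse(url).path)

def pick_new_fragments(playlist_urls: list[str], seen_names: set[str]) -> list[str]:
    """Pick fragments not downloaded yet, comparing by filename only,
    so a change of the base URL between playlist updates does not make
    already-processed fragments look new."""
    fresh = [u for u in playlist_urls if fragment_name(u) not in seen_names]
    seen_names.update(fragment_name(u) for u in fresh)
    return fresh
```

With this comparison, two URLs that differ only in host or path but end in the same fragment filename are treated as the same fragment, so a worker-server change between playlist updates no longer triggers re-downloads.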