HDR signal peak luminance metadata not sent to the display #10129
Comments
Yes, I did not initially implement this in vo:gpu as I was focused on getting the general PQ stuff done (also, with the test content I was utilizing during development, I did not see major issues from this information being missing on my specific screen). So yes, vo=gpu with the d3d11 back-end will only set the swap chain to PQ+BT.2020 mode in 10 bit; it will not pass through the HDR10 metadata. As I have essentially rewritten nev's changes to libplacebo's d3d11 module regarding swap chain color space etc., I have a pretty good grasp of how the API works, but I think only the max brightness bit is available through the mpv interfaces right now at the point where the metadata is currently set (at swap chain creation). With that merge request's changes, vo=gpu-next with …
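For context, a minimal sketch of what setting the swap chain color space and attaching HDR10 metadata can look like on the DXGI side, assuming the calls being alluded to are IDXGISwapChain3::SetColorSpace1 and IDXGISwapChain4::SetHDRMetaData; the luminance and primary values below are illustrative placeholders, not what mpv or libplacebo actually send:

```cpp
// Sketch only: switch a D3D11 swap chain to PQ/BT.2020 and attach HDR10 metadata.
// Assumes `swapchain` is a valid IDXGISwapChain1*; error handling is omitted.
#include <dxgi1_6.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void SetHdr10Output(IDXGISwapChain1 *swapchain)
{
    ComPtr<IDXGISwapChain4> sc4;
    if (FAILED(swapchain->QueryInterface(IID_PPV_ARGS(&sc4))))
        return;

    // 1) PQ (SMPTE ST 2084) transfer + BT.2020 primaries, full-range RGB.
    //    This is the part the d3d11 back-end of vo=gpu already does.
    sc4->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020);

    // 2) Static HDR10 metadata. Chromaticities are in units of 0.00002,
    //    mastering luminance in units of 0.0001 nits, MaxCLL/MaxFALL in whole nits.
    DXGI_HDR_METADATA_HDR10 md = {};
    md.RedPrimary[0]   = 35400; md.RedPrimary[1]   = 14600;  // BT.2020 red
    md.GreenPrimary[0] =  8500; md.GreenPrimary[1] = 39850;  // BT.2020 green
    md.BluePrimary[0]  =  6550; md.BluePrimary[1]  =  2300;  // BT.2020 blue
    md.WhitePoint[0]   = 15635; md.WhitePoint[1]   = 16450;  // D65 white point
    md.MaxMasteringLuminance = 4000 * 10000;  // e.g. a 4000-nit mastering display
    md.MinMasteringLuminance = 50;            // e.g. 0.005 nits
    md.MaxContentLightLevel      = 4000;      // e.g. MaxCLL of the test clip
    md.MaxFrameAverageLightLevel = 400;       // illustrative MaxFALL
    sc4->SetHDRMetaData(DXGI_HDR_METADATA_TYPE_HDR10, sizeof(md), &md);
}
```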
Thanks. A quick search reveals that the relevant API is …
I eagerly await these updates!
Out of curiosity, what should I see in the test pattern (White_900-4000nits-MaxCLL-4000-) on a 1000-nit TV, with HDR on in Win11?
If the full chain is working properly, i.e. assuming that the metadata is preserved throughout AND your display's HDR tone mapping is any good, you should in theory see the full range or most of it. For example, on my setup (Windows 11 21H2 22000.613 HDR ON, 3080 Ti driver 512.15, LG G1 firmware 03.20.16), I can see the difference up until 3500 nits or so using VLC and madVR, though they need to be full screen and I need to wait for about 10 seconds as the TV "fades" into the new metadata - it doesn't switch instantly. If I force the TV to assume 1000 nits using a hidden LG menu, the whole ramp disappears, which is exactly what one would expect and confirms that the TV is otherwise tone mapping at a 4000-nit peak. I just tested using the Windows 11 "Films & TV" app and the result seems incorrect - it's blending in at about 2500 nits or so.

Unfortunately I don't know of an easier way to diagnose HDR metadata issues - LG TVs will not tell you which peak luminance they are currently receiving over the HDMI link (not even in hidden service menus), and Windows/nvidia will not tell you what metadata is being sent, either. So basically one has no choice but to do this blind and take a guess at what's happening based on how test signals look. AFAIK the only way to truly know what's actually happening over the HDMI connection is to use an analyser like Dr HDMI, but that's a lot of money just to confirm a single number :/
Ah ok, I see, yet VLC looks similar to mpv/JRMC for me; it merges at around 1500 nits. If I set … I assume for HDR10 content that's not a big issue, since most have static MaxCLL = 1000, yet Dolby Vision files usually target 4000 nits from what I have seen? So should I set …?

PS: I also noticed that all Win11 players will only display the …
Then your setup is not tone mapping properly. Either because the component responsible for tone mapping (i.e. your display, if you're not tone mapping in the player) is bad, or because it's not getting correct metadata for some reason. Or there is something else going on which is unknowable without having direct access to your particular setup. Unfortunately this can be pretty tricky to troubleshoot given how opaque the various components tend to be in terms of diagnostics. Sadly, one often finds themselves fumbling in the dark when it comes to this stuff.
I'm honestly not very familiar with mpv (I only started using it recently), so I'm not 100% sure what the …

To clarify: the context around my original bug report is a setup in which I want the display (i.e. my LG TV) to handle the entire tone mapping process - I don't want mpv to do any tone mapping, and I want both pixel values and metadata to be passed through to the HDMI port unchanged.
According to this (admittedly old) post, 4000-nit HDR10 movies are actually quite common:
If your display's tone mapping is bad/broken, then it might make sense to tone map in mpv instead. But the problem is, if you do that, then you have to make sure that your display's tone mapping is completely turned off (i.e. hard clipping) - you don't want tone mapping to be done twice; the results will be incorrect/bad. The way to do that depends on your particular setup and display. Honestly I wouldn't attempt something like this without access to a color meter to double-check the resulting response curve. Especially since this might not just affect luminance, but color gamut as well.
Just noticed you use the 0.34.0 version, so what …

Atm we have two major options on Windows. I use the daily builds from here with …

On the other hand …
I used 0.34.0 with all defaults, so I guess I'm using … I could try with …
I am pretty sure it's the TV that does this, presumably to avoid sudden/jarring luminance changes. The reason I think it's the TV is that the same "progressive fade" effect occurs if I override maxCLL/mastering peak in the TV hidden menu.
Oh? Picture me interested! Where can I find this patch?
Ah, okay, but that's an mpv patch. I misunderstood; I thought you had a way of making the GPU driver (e.g. nvidia) indicate precisely which maxCLL it's currently sending down the HDMI wire at the hardware level. Knowing what maxCLL is at the output of mpv is nice I guess, but there's so much that can go wrong at the OS/driver level…
No it is not. The patch you have linked to is just verbose-logging what is being passed into the "please set this hint into the swap chain" API of libplacebo as part of the FFmpeg AVFrame (as haasn didn't feel like integrating the full CLL or mastering screen metadata into the mp_image structure itself). There is, to my knowledge, no API that returns exactly what the screen is currently configured to, unless it is vendor-specific. And even in that case I do not know of such an API. Please stop, you are not being useful.
With a quick search, there is an nvidia-specific API which might return the current configuration directly from the driver as the following struct: https://docs.nvidia.com/gameworks/content/gameworkslibrary/coresdk/nvapi/struct__NV__HDR__CAPABILITIES__V2.html (some pages mention …)
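For what it's worth, a rough sketch of querying that struct through NvAPI, assuming the entry point is NvAPI_Disp_GetHdrCapabilities and that the field names match the public nvapi.h headers (both are assumptions on my part; SDK revisions may differ):

```cpp
// Sketch only: ask the NVIDIA driver for the HDR capability data of the
// GDI primary display. Function/struct/field names follow the public nvapi.h
// and should be double-checked against the SDK version actually in use.
#include <cstdio>
#include <nvapi.h>

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return 1;

    NvU32 displayId = 0;
    if (NvAPI_DISP_GetGDIPrimaryDisplayId(&displayId) != NVAPI_OK)
        return 1;

    NV_HDR_CAPABILITIES caps = {};
    caps.version = NV_HDR_CAPABILITIES_VER;  // selects the struct revision (e.g. V2)
    if (NvAPI_Disp_GetHdrCapabilities(displayId, &caps) == NVAPI_OK) {
        // The "desired content" values come from the display's EDID/DisplayID
        // block, i.e. they describe capabilities, not the current signal's metadata.
        std::printf("ST.2084 EOTF supported: %u\n",
                    (unsigned)caps.isST2084EotfSupported);
        std::printf("desired content max luminance: %u nits\n",
                    (unsigned)caps.display_data.desired_content_max_luminance);
    }

    NvAPI_Unload();
    return 0;
}
```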
I suspect this will only return what the display is capable of (i.e. from its EDID), not the details of the signal it's currently being sent.
Right, quite possible. I was just doing a quick search. In that sense it would be similar to the Windows standard API that returns the EDID information, and thus nvapi shenanigans wouldn't be required. I did think of plugging that information into the mpv tone mapping output target brightness, but at least without the peak brightness metadata it just caused the image to lose brightness :) (although possibly if I had set mpv's tone mapping mode to clip it might have fared better)
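For reference, a minimal sketch of that standard Windows/DXGI route for reading the display's advertised (EDID-derived) luminance range; it simply takes the first output of the first adapter and omits error handling:

```cpp
// Sketch only: read the display's advertised luminance range via DXGI.
// Takes the first output of the first adapter for brevity.
#include <cstdio>
#include <dxgi1_6.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);

    ComPtr<IDXGIOutput> output;
    adapter->EnumOutputs(0, &output);

    ComPtr<IDXGIOutput6> output6;
    if (SUCCEEDED(output.As(&output6))) {
        DXGI_OUTPUT_DESC1 desc = {};
        output6->GetDesc1(&desc);
        // These values describe what the display claims it can do (from its
        // EDID/DisplayID data), not the HDR metadata of the signal being sent.
        std::printf("MinLuminance:          %.4f nits\n", desc.MinLuminance);
        std::printf("MaxLuminance:          %.1f nits\n", desc.MaxLuminance);
        std::printf("MaxFullFrameLuminance: %.1f nits\n", desc.MaxFullFrameLuminance);
    }
    return 0;
}
```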
Yeah, another problem is that some displays don't populate this information in their EDID. For example my LG G1 doesn't have max luminance information in its EDID (which is a bit surprising since it's well-known to be about 750-800 nits - must have been an oversight from LG).
@dechamps Quick update after fiddling with my setup last night. I reset the TV and started with a fresh calibration, and now I can see all boxes (it merges with the last two) on the 4000-nit sample. My HDR config with the latest daily build bb5b4b1: …
PS: I still see the bug that, without explicitly setting …
I would not trust HDR metadata APIs to behave correctly without double-checking on the final setup. OSes have bugs. GPU drivers have bugs. Incorrect HDR metadata transmission can easily go unnoticed because its effect won't be obvious unless you know where to look. Heck, last time I checked, on Windows 10 HDR metadata APIs didn't work at all, regardless of player, despite Windows 10 having an HDR output mode. I had to upgrade to Windows 11 for it to work.

There is also the more general problem that OSes can't blindly pass these calls through, because multiple applications could be running at the same time with different requirements (e.g. standard SDR apps vs a HDR video player) and the OS has to do its best to keep everyone happy. For example, I noticed that with VLC and madVR the metadata is generally only transmitted if the player is full screen, which can lead to other potential problems, e.g. the OS/driver not treating some app as full screen even though it is, and dropping its HDR metadata.
@Andy2244 Using your exact config, verbatim, with the latest shinchiro build …
@dechamps Yeah, I also noticed that VLC has more pronounced boxes for the colored test sections.
So it is doing a pre-tonemap pass to the target nits while trying to preserve the dynamic range, which is far from your untouched pass-through goal.
@dechamps FYI my MR into libplacebo got merged, so git master mpv with libplacebo version >= 4.204.0, when used with …

edit: The usual notes from #10158 (comment) still apply of course, due to how MS handles things with regards to d3d11.
@jeeb Sorry it took me a while to get around to testing your solution. I can confirm that, with the shinchiro 20220605 build, maxCLL metadata appears to be correctly transmitted using …
(I'll just mention in passing that it would be great if these "make it work the way it's supposed to" options could be enabled by default, instead of mpv users getting suboptimal HDR rendering out-of-the-box. Presumably it's only a question of time though.)
Reproduction steps
Play a video suitable for testing HDR metadata/tone mapping.
I use the Mehanik HDR10 test patterns, specifically 02. White_Color clipping\04. White_900-4000nits-MaxCLL-4000-MDL-4000.mp4, which usually makes it obvious if the HDR metadata is being sent correctly or not.
Expected behavior
The HDR metadata is sent correctly. In the case of the test video mentioned above, a peak luminance of 4000 nits should be sent to the display; as a result the display should apply the corresponding tone mapping curve resulting in proper gradation across the band.
Actual behavior
The metadata appears to be sent incorrectly (i.e. the gradations disappear), even in full screen mode.
Additional information
Note that, strangely, the issue is not always perfectly reproducible. Most of the time mpv won't be able to get the metadata across, but once in a blue moon it actually works. It's not clear to me what triggers it.
At first I thought it could be triggered by using mpv after using another HDR-capable player first (e.g. VLC or madVR), but that doesn't seem to always be the case.
This smells like an OS or GPU driver bug, but I will point out that neither VLC nor madVR ever seem to get this wrong (as long as they are in full screen mode) - as far as I can tell, only mpv is affected.
Log file
log.txt