Support output to HDR monitors #94496
base: master
Conversation
I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42"), and it works as expected. This is encouraging to see; I've been wanting this for a while 🙂 I'll need to look into building more extensive scenes and getting tonemapped screenshots/videos out of this. 2D HDR also needs to be tested thoroughly.

Remember that JPEG XL or AVIF for images and AV1 for videos are a must for HDR, as other formats can only store SDR data. You may need to embed those in ZIP archives and ask users to preview them in a local media player, as GitHub doesn't allow uploading those formats and browsers often struggle to display HDR correctly.

I noticed some issues for now:
See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video): control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.
Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled.
This is used to scale the HDR effect to avoid clipping.
Suggested change:
- Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled.
+ Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled. If set to [code]0.0[/code], luminance is not limited, which may look worse than setting a max luminance value suited to the display currently in use.
  This is used to scale the HDR effect to avoid clipping.
I'm not so sure about this change, the max value allowed by the spec for ST2084 is 10,000 nits, which always looks blown out on any consumer display (and most of the professional ones too). Perhaps a more reasonable default value would make more sense?
Indeed, in real-world scenarios you'll always want luminance to be limited to a reasonable value. That said, as I understand the code, no limit is applied if the luminance cap is set to 0 nits (the project setting defaults to that value).
That reminds me, should the default value for the HDR luminance cap be changed? The demo project uses 600 nits. We should probably see what modern games typically use as their default luminance cap value and use a value similar to that.
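Based on my reading of the discussion above (a 0-nit cap meaning "unlimited"), the cap's effect on the encoding range can be sketched in a few lines of Python. The function name and the special-casing of `0.0` are my own illustration, not engine code:

```python
# Illustrative sketch (not engine code): how a max-luminance cap scales
# values before PQ (ST2084) encoding. A cap of 0.0 is treated here as
# "no limit", i.e. the full native 10,000-nit ST2084 range.
PQ_MAX_NITS = 10000.0

def luminance_adjustment(max_luminance: float) -> float:
    if max_luminance <= 0.0:  # 0.0 = unlimited, use the full PQ range
        max_luminance = PQ_MAX_NITS
    return max_luminance / PQ_MAX_NITS

# A 600-nit cap compresses the scene's [0, 1] range into 6% of the
# linear PQ range, so nothing is encoded above what the display can show.
print(luminance_adjustment(600.0))  # 0.06
print(luminance_adjustment(0.0))    # 1.0
```

This is one reason a default like 600 nits may be friendlier than 0: with no cap, content is encoded against the full 10,000-nit range, which consumer displays then clip or tone map unpredictably.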
Only available on platforms that support HDR output, have HDR enabled in the system settings, and have a compatible display connected.
</member>
<member name="display/window/hdr/max_luminance" type="float" setter="" getter="" default="0.0">
Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled.
Suggested change:
- Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled.
+ Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled. If set to [code]0.0[/code], luminance is not limited, which may look worse than setting a max luminance value suited to the display currently in use.
Also not so sure about this, for the reasons in the other related comment.
Thanks for taking a look!
Odd that NVIDIA's RTX HDR doesn't detect the HDR color space and avoid messing with the final swap chain buffer. Auto-HDR in Windows 11 appears to leave Godot alone when HDR is enabled. Updating the NVIDIA profile may be outside the scope of this PR and would be best done in a more focused PR.
For the initial draft, yes, everything is mapped using the same tonemapper. However, we should map UI elements to a different brightness to avoid them being too bright. For now, that can be worked around with dimming the brightness of any UI elements via the theme, but I would like to fix that in this PR.
I haven't looked into configuring the editor to use HDR yet. I will do so after I figure out how to properly tone map UI elements; if you enable HDR in the editor now, the UI is a little unpleasant.
Agreed, UI elements and other 2D elements should probably be mapped to a different brightness curve. I'll probably have to figure out where in the engine 3D and 2D elements are composited together and perform the tone mapping there.
That might be outside of the scope of this PR. I'm not sure how I would indicate that certain 3D elements need to be mapped using a different brightness curve once they are all combined into the same buffer. It would be similar to trying to avoid sRGB mapping certain rendered elements. For now, this can be worked around by decreasing the brightness of the color of these elements.
Baldur's Gate 3 and Cyberpunk 2077 also have really nice HDR settings menus. I've been basing some of this work on their approach, though I'm leaving modifying contrast and brightness up to Environment, since those effects are already there. Thanks again for your comments! I'll add some TODO items to the description for tracking.
Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, with luminosity values as close as possible to reference values for their light sources (i.e. the sun at noon is 100,000 lux, the moon at midnight is 0.3 lux). I'd love to help test this PR, but unfortunately I don't have HDR hardware.
I recently got a monitor that supports Anyway, adding HDR output to D3D12 should be trivial and I might give it a try. (No promises!)

Shall we also consider implementing HDR display for the compatibility renderer? I am not sure if native OpenGL can do HDR, but it is very possible to implement on Windows with the help of ANGLE and some manual setting up.
This needs a rebase on master, but I have an Alienware AW3423DW HDR display (https://www.dell.com/en-ca/shop/alienware-34-curved-qd-oled-gaming-monitor-aw3423dw/apd/210-bcye/monitors-monitor-accessories). I can help test.
You should be able to test with any scene, though keep in mind that realistic light units will not map directly to the brightness of the display. Consumer desktop displays typically don't go much above 1000 nits on the high end, which is far too dim to simulate sunlight. Values from the scene will be mapped to a range fitting within the max luminance set for the window.
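As a rough illustration of that mapping (this is not the PR's actual tonemapper; the curve choice and all names here are assumptions for illustration), a simple Reinhard-style compression shows how arbitrarily large physical light values can be made to fit under a display's peak luminance:

```python
# Illustrative only: physical light values (e.g. 100,000 lux sunlight)
# cannot be shown literally on a ~1000-nit display, so they are
# exposure-adjusted, compressed to [0, 1), then scaled to the peak.
def map_to_display_nits(scene_luminance: float, exposure: float, max_nits: float) -> float:
    x = scene_luminance * exposure  # exposure-adjusted linear value
    t = x / (1.0 + x)               # Reinhard curve: [0, inf) -> [0, 1)
    return t * max_nits             # scale into the display's range

# Sunlight with a small exposure still lands under a 1000-nit peak
# (roughly 909 nits here), rather than clipping.
print(map_to_display_nits(100_000.0, 0.0001, 1000.0))
```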
Here are the changes to get Rec. 2020 HDR output on D3D12: master...alvinhochun:godot:hdr-output-d3d12
The over-exposure in your screenshot is expected, but the colours are oversaturated because it is missing a colour space conversion. The colours need to be converted from BT.709 primaries to BT.2020 primaries. This is how it should look with the correct colours: The conversion may be done with something like this:

diff --git a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
index 3583ee8365..76305a8a3c 100644
--- a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
+++ b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
@@ -19,6 +19,15 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
// max_luminance is the display's peak luminance in nits
// we map it here to the native 10000 nits range of ST2084
float adjustment = max_luminance * (1.0f / 10000.0f);
+ color = color * adjustment;
+
+ // Color transformation matrix values taken from DirectXTK, may need verification.
+ const mat3 from709to2020 = mat3(
+ 0.6274040f, 0.0690970f, 0.0163916f,
+ 0.3292820f, 0.9195400f, 0.0880132f,
+ 0.0433136f, 0.0113612f, 0.8955950f
+ );
+ color = from709to2020 * color;
// Apply ST2084 curve
const float c1 = 0.8359375;
@@ -26,7 +35,7 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
const float c3 = 18.6875;
const float m1 = 0.1593017578125;
const float m2 = 78.84375;
- vec3 cp = pow(abs(color.rgb * adjustment), vec3(m1));
+ vec3 cp = pow(abs(color.rgb), vec3(m1));
return pow((c1 + c2 * cp) / (1 + c3 * cp), vec3(m2));
}
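For anyone who wants to sanity-check the constants outside the engine, here is a quick Python port of the shader math above. Note that GLSL `mat3` constructors are column-major, so the rows below are the rows of the effective matrix; the `c2` constant is the standard ST 2084 value, which the diff hunk above elides:

```python
# Python port of the BT.709 -> BT.2020 + ST2084 (PQ) shader math, for
# checking the constants. Matrix values transcribed from the diff.
FROM_709_TO_2020 = [
    (0.6274040, 0.3292820, 0.0433136),
    (0.0690970, 0.9195400, 0.0113612),
    (0.0163916, 0.0880132, 0.8955950),
]

def bt709_to_bt2020(rgb):
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in FROM_709_TO_2020)

def st2084_encode(x):
    # SMPTE ST 2084 (PQ) inverse EOTF; x is linear, normalized to 10,000 nits.
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    m1, m2 = 0.1593017578125, 78.84375
    cp = abs(x) ** m1
    return ((c1 + c2 * cp) / (1.0 + c3 * cp)) ** m2

# BT.709 white stays white (each matrix row sums to ~1.0) ...
print(bt709_to_bt2020((1.0, 1.0, 1.0)))
# ... and 100 nits encodes to the well-known PQ code value of ~0.508.
print(st2084_encode(100.0 / 10000.0))
```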
Thanks for that, I've incorporated the color rotation and it looks fine on my end.
I've noticed some oddness with my Surface Laptop Studio, where the NVIDIA GPU doesn't expose any HDR color spaces in Vulkan, but the integrated Intel Xe graphics does. DirectX sees HDR-capable color spaces for both.
I'm not sure how to set the color space for OpenGL; there doesn't seem to be a standard extension for it. I think for now, support for the compatibility renderer should be left to a future PR.
I've added support for DirectX, based on the code provided by @alvinhochun. I've also given them co-authorship for their contributions. Thanks!
Update: I'm currently working on tone-mapping 2D elements differently from 3D, but I'm running into some issues with how Godot renders scenes into its render targets.

Godot will render the 3D scene, tonemap that scene, then proceed to render any 2D elements directly into the same render target. Then, any render targets (from different viewports) are blitted together into the final framebuffer. I'm currently performing the colorspace conversion from sRGB/Linear to HDR10 at this blitter, which cannot distinguish between the 2D and 3D elements.

I figured I would then update the 3D tonemap shader and canvas shader to perform the colorspace conversion themselves, but the engine makes assumptions (which are invalidated by this PR) in various parts of the renderer that only the sRGB and Linear colorspaces exist, which is making it difficult to ensure that I don't accidentally perform a conversion that has already occurred. I'm also trying to keep any changes as local and limited as possible to avoid making this PR harder to merge. I'm working my way through all of the sites where sRGB conversion takes place and trying to see if there is a clean way to track which conversions have occurred, or at least determine if there is a limited subset I can touch and assume the correct space later on.

I'm assuming it would not be acceptable to have the canvas renderer render into its own render target and have the blitter combine them later. Not only would that cost more VRAM, but there is a computational cost as well. There would have to be more of a benefit than just making my life easier. :)
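As a minimal sketch of the "track which conversions have occurred" idea (purely illustrative; these names do not correspond to any engine code), each render target could carry a color-space tag, with conversions treated as checked state transitions so a double conversion fails loudly instead of silently corrupting colors:

```python
# Illustrative design sketch: tag each buffer with its current color
# space and validate every conversion against that tag.
from enum import Enum, auto

class ColorSpace(Enum):
    LINEAR = auto()
    SRGB = auto()
    HDR10 = auto()  # Rec. 2020 primaries + ST2084 transfer

class TaggedBuffer:
    def __init__(self):
        self.space = ColorSpace.LINEAR

    def convert_to(self, target: ColorSpace):
        if self.space == target:
            # Catches the "conversion already occurred" bug at the call site.
            raise RuntimeError(f"buffer is already in {target.name}")
        # ... the actual shader dispatch would happen here ...
        self.space = target

buf = TaggedBuffer()
buf.convert_to(ColorSpace.HDR10)  # ok: LINEAR -> HDR10
print(buf.space.name)             # HDR10
```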
This is really neat; I've been waiting for a proper HDR10 implementation in Godot for a while. A couple of comments:

I think this PR uses a

Secondly, I've pushed an old HDR10 prototype for Godot I did a while back, just in case it might help somehow. If you see anything useful, feel free to use it here.

Edit: NVIDIA just released
HDR with raw Vulkan on Windows (or anywhere, TBH, even on Linux for example) is notoriously finicky; it's yet another reason why the best Vulkan games on Windows tend to present through DXGI. There's also the fact that Windows and most modern compositors are moving away from "exclusive fullscreen" and expect games to present to the compositor now (leaving the responsibility of making present low-latency and high-performance to the compositor). IIRC the latest versions of Windows actually don't have true exclusive fullscreen anymore; the compositor just "fakes" it when an app requests it.
Co-authored-by: Alvin Wong <alvinhochun@gmail.com>
This PR enables Godot to output to HDR-capable displays. This allows Godot to output at higher luminance than allowed in SDR mode, and with more vibrant colors.
Testing project: https://github.com/DarkKilauea/godot-hdr-output
HDR (blown out a bit, looks better on an HDR display):
SDR:
Supported Platforms:
Supported Graphics APIs:
Supported HDR Formats:
Work to do:
Help Needed:
Adding support for DirectX: I'm not currently sure how to update the swap chain to output an HDR format.

Technical Details:
I updated the blit shader to convert the viewport buffer into the ST2084 color space when an HDR format is detected. The maximum luminance supported by the ST2084 color space is 10,000 nits, which consumer monitors are not capable of achieving, so some adjustment is required. I looked at how AMD did it with FidelityFX and took a similar approach, adjusting the curve based on the max luminance of the display.
Here you can see the curves at several peak luminance values:
Plotting the derivative, you can see the error among the adjusted values is small, which should result in images looking similar on different displays with different max luminance capabilities:
This is an approximation, though; AMD's FidelityFX HDR mapper does a lot of fancy logic with color spaces and likely does a better job of mapping colors from the source format to the display. However, this approximation looks good to me and may be good enough for now.
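The cap-then-encode behaviour described above can be reproduced in a few lines of Python (constants are the standard ST 2084 values; the function names are mine). The point of the adjustment is that the brightest scene value encodes exactly to the display's peak rather than to PQ's native 10,000-nit maximum:

```python
# Sketch of the curve adjustment: scale by max_luminance before the
# ST2084 (PQ) encode so scene white maps to the display's peak.
def pq_encode(x):
    # SMPTE ST 2084 inverse EOTF; x is linear, normalized to 10,000 nits.
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    m1, m2 = 0.1593017578125, 78.84375
    cp = abs(x) ** m1
    return ((c1 + c2 * cp) / (1.0 + c3 * cp)) ** m2

def encode_with_cap(scene_value, max_luminance):
    # scene_value in [0, 1]; max_luminance is the display peak in nits.
    return pq_encode(scene_value * max_luminance / 10000.0)

# With a 600-nit cap, full scene white encodes to the PQ code value for
# 600 nits (~0.70), so the signal never exceeds the display's peak.
print(encode_with_cap(1.0, 600.0))
```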