Support output to HDR monitors #94496

Draft: wants to merge 1 commit into master
Conversation


@DarkKilauea DarkKilauea commented Jul 18, 2024

This PR enables Godot to output to HDR-capable displays, allowing it to produce brighter luminance than SDR allows, with more vibrant colors.

Testing project: https://github.com/DarkKilauea/godot-hdr-output

HDR (blown out a bit, looks better on an HDR display):
image

SDR:
image

Supported Platforms:

  • Windows

Supported Graphics APIs:

  • Vulkan
  • DirectX

Supported HDR Formats:

  • HDR10

Work to do:

  • Look into DirectX 12 support
  • Tonemap 2D elements at a different brightness value from 3D elements.
  • Look into Dolby Vision support.
  • Investigate NVidia RTX HDR flickering bug.
  • Look into supporting HDR in the editor.
  • Test more devices.

Help Needed:

  • Testing more displays. I'm working with my LG OLED monitor, but different displays, graphics drivers, and OS setups may have different color spaces or required formats that need support added to this PR.
  • Adding support for DirectX; I'm not currently sure how to update the swap chain to output an HDR format.
  • Adding support for more operating systems. Linux seems like it might not be ready yet, with HDR support only now being added to Wayland. However, macOS in theory should have support. I only have an 8 GB mac mini that isn't capable of outputting HDR to my display.
  • Design: I tried to make the API as "Godot" as possible, but would really like some help on whether this is a good way to implement this feature.

Technical Details:

I updated the Blit shader to convert the viewport buffer into the ST2084 color space when an HDR format is detected. The ST2084 color space supports up to 10,000 nits, which consumer monitors are not capable of achieving, so some adjustment is required. I looked at how AMD did it with FidelityFX and took a similar approach to adjusting the curve based on the max luminance of the display.
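The conversion described above can be sketched in a few lines. This is a standalone illustration, not the engine code: the constants are the standard SMPTE ST 2084 values (which the PR's shader also uses), and `max_luminance` plays the same role as the shader's display-peak parameter.

```python
# Sketch of the ST2084 (PQ) encode with a max-luminance adjustment.
# `max_luminance` is the display's peak brightness in nits; linear 1.0 is
# scaled to that peak within PQ's native 10,000-nit code range.

def linear_to_st2084(value: float, max_luminance: float) -> float:
    # Standard SMPTE ST 2084 constants.
    c1 = 0.8359375
    c2 = 18.8515625
    c3 = 18.6875
    m1 = 0.1593017578125
    m2 = 78.84375

    # Fraction of the 10,000-nit PQ range this value should occupy.
    y = value * (max_luminance / 10000.0)
    yp = abs(y) ** m1
    return ((c1 + c2 * yp) / (1.0 + c3 * yp)) ** m2
```

On a hypothetical 10,000-nit display, linear 1.0 encodes to the top PQ code value; on a 1,000-nit display it lands noticeably lower, which is what keeps the image from clipping on dimmer panels.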

Here you can see the curves at several peak luminance values:
image

Plotting the derivative, you can see the error amongst the adjusted values is small, which should result in images looking similar on different displays with different max luminance capabilities:
image

This is an approximation, though: AMD's FidelityFX HDR Mapper does a lot of fancy logic with color spaces and likely does a better job of mapping colors from the source format to the display. However, this approximation looks good to me and may be good enough for now.

Calinou (Member) commented Jul 19, 2024

I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42"), it works as expected. This is encouraging to see, I've been wanting this for a while 🙂

I'll need to look into building more extensive scenes and getting tonemapped screenshots/videos out of this. 2D HDR also needs to be tested thoroughly.

Remember that JPEG XL or AVIF for images and AV1 for videos are a must for HDR, as other formats can only store SDR data. You may need to embed those in ZIP archives and ask users to preview them in a local media player, as GitHub doesn't allow uploading those formats and browsers often struggle displaying HDR correctly.

I noticed some issues for now:

  • Having RTX HDR enabled will mess with the HDR that is enabled in the editor. It will continuously enable and disable itself whenever you make any input in the editor (and disable itself after being idle for a second). This is also an issue on master with HDR disabled.
  • HDR Max Luminance affects both 2D (UI) and 3D rendering. Is that intended?
  • The HDR editor setting is not applied instantly when you change it, even though the demo project shows a working example of it being toggled at runtime. You can update the viewport's status based on editor settings here:
    void EditorNode::_update_from_settings() {
  • There doesn't appear to be a paperwhite setting you can use to adjust UI brightness. This is typically offered in games to prevent the UI from being too bright. Using a paperwhite value around 200 nits is common, since a lot of OLED displays cap out at that brightness level in SDR. Either way, this should be exposed in the project settings, and the documentation should recommend exposing this setting to players (just like HDR peak luminance).
    • There should also be a way for unshaded materials to base themselves on paperwhite, so that Sprite3D and Label3D used for UI purposes are not overly bright in HDR. I suppose this would be a BaseMaterial3D property or a shader render mode.
      • In the interest of compatibility, we may not be able to enable this by default in Sprite3D due to VFX usage (where HDR display can be intended), but for Label3D, we may be able to safely default to this.
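The paperwhite idea above is usually implemented by scaling SDR-relative UI values to the chosen paperwhite level before the PQ encode, so the UI's brightness is decoupled from the scene's peak. A minimal, hypothetical sketch (the function name and the 10,000-nit PQ normalization are illustrative, not code from this PR):

```python
# Hypothetical sketch: map an SDR-relative UI value (0..1) so that UI white
# lands at `paperwhite_nits` on an HDR display, independent of scene peak.

def ui_to_pq_fraction(ui_value: float, paperwhite_nits: float) -> float:
    # Fraction of PQ's 10,000-nit code range occupied by this UI value,
    # prior to applying the ST2084 curve.
    return ui_value * (paperwhite_nits / 10000.0)
```

With a 200-nit paperwhite, UI white occupies only 2% of the linear PQ range, while the 3D scene remains free to use the full range up to the display's peak luminance.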

See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video):

control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.

Comment on lines +1624 to +1654
Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled.
This is used to scale the HDR effect to avoid clipping.
Calinou (Member) commented Jul 19, 2024
Suggested change
Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled.
This is used to scale the HDR effect to avoid clipping.
Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled. If set to [code]0.0[/code], luminance is not limited, which may look worse than setting a max luminance value suited to the display currently in use.
This is used to scale the HDR effect to avoid clipping.

DarkKilauea (Contributor, Author) replied:

I'm not so sure about this change: the max value allowed by the spec for ST2084 is 10,000 nits, which always looks blown out on any consumer display (and most professional ones too). Perhaps a more reasonable default value would make more sense?

Calinou (Member) replied:

Indeed, in real world scenarios, you'll always want luminance to be limited to a reasonable value. That said, as I understand the code, no limitation is applied if the luminance cap is set to 0 nits (the project setting defaults to that value).

That reminds me, should the default value for the HDR luminance cap be changed? The demo project uses 600 nits. We should probably see what modern games typically use as their default luminance cap value and use a value similar to that.

Only available on platforms that support HDR output, have HDR enabled in the system settings, and have a compatible display connected.
</member>
<member name="display/window/hdr/max_luminance" type="float" setter="" getter="" default="0.0">
Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled.
Calinou (Member) commented Jul 19, 2024
Suggested change
Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled.
Sets the maximum luminance of the display in nits (cd/m²) when HDR is enabled. If set to [code]0.0[/code], luminance is not limited, which may look worse than setting a max luminance value suited to the display currently in use.

DarkKilauea (Contributor, Author) replied:

Also not so sure about this, for the reasons in the other related comment.

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch 2 times, most recently from 88beb60 to 8df131d on July 19, 2024 at 06:30
DarkKilauea (Contributor, Author) replied:

I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42"), it works as expected. This is encouraging to see, I've been wanting this for a while 🙂

Thanks for taking a look!

I noticed some issues for now:

* Having RTX HDR enabled will mess with the HDR that is enabled in the editor. It will continuously enable and disable itself whenever you make any input in the editor (and disable itself after being idle for a second). This is also an issue on `master` with HDR disabled.

* See [[4.3 Beta 3] Strange editor brightness and colors caused by RTX Dynamic Vibrance affecting the editor #94231](https://github.com/godotengine/godot/issues/94231). We should see if we can forcibly disable RTX HDR and RTX Dynamic Vibrance for the editor using a NVIDIA profile. I haven't seen options for those in NVIDIA Profile Inspector so far.

Odd that NVIDIA's RTX HDR doesn't detect the HDR color space and avoid messing with the final swap chain buffer. Auto-HDR in Windows 11 appears to avoid messing with Godot when HDR is enabled. Updating the NVIDIA profile may be outside the scope of this PR and is best handled in a more focused PR.

* HDR Max Luminance affects both 2D (UI) and 3D rendering. Is that intended?

For the initial draft, yes, everything is mapped using the same tonemapper. However, we should map UI elements to a different brightness to avoid them being too bright. For now, that can be worked around with dimming the brightness of any UI elements via the theme, but I would like to fix that in this PR.

* The HDR editor setting is not applied instantly when you change it, even though the demo project shows a working example of it being toggled at runtime. You can update the viewport's status based on editor settings here: https://github.com/godotengine/godot/blob/ff8a2780ee777c2456ce42368e1065774c7c4c3f/editor/editor_node.cpp#L356

I haven't looked into configuring the editor to use HDR yet. I will do so after I figure out how to properly tone map UI elements; if you enable HDR in the editor now, the UI is a little unpleasant.

* There doesn't appear to be a paperwhite setting you can use to adjust UI brightness. This is typically offered in games to prevent the UI from being too bright. Using a paperwhite value around 200 nits is common, since a lot of OLED displays cap out at that brightness level in SDR. Either way, this should be exposed in the project settings and the documentation should recommend exposing this setting to player (just like HDR peak luminance).

Agreed, UI elements and other 2D elements should probably be mapped to a different brightness curve. I'll probably have to figure out where in the engine 3D and 2D elements are composited together and perform the tone mapping there.

  * There should also be a way for unshaded materials to base themselves on paperwhite, so that Sprite3D and Label3D used for UI purposes are not overly bright in HDR. I suppose this would be a BaseMaterial3D property or a shader render mode.

    * In the interest of compatibility, we may not be able to enable this by default in Sprite3D due to VFX usage (where HDR display can be intended), but for Label3D, we may be able to safely default to this.

That might be outside of the scope of this PR. I'm not sure how I would indicate that certain 3D elements need to be mapped using a different brightness curve once they are all combined into the same buffer. It would be similar to trying to avoid sRGB mapping certain rendered elements.

For now, this can be worked around by decreasing the brightness of the color of these elements.

See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video):
control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.

Baldur's Gate 3 and Cyberpunk 2077 also have really nice HDR settings menus. I've been basing some of this work off their approach, though modifying contrast and brightness I'm leaving up to Environment since those effects are already there.

Thanks again for your comments! I'll add some TODO items to the description for tracking.


Jamsers commented Aug 28, 2024

Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, with close-to-reference luminosity values on light sources (i.e. the sun at noon is 100,000 lux, the moon at midnight is 0.3 lux).

I'd love to help test this PR but unfortunately I don't have HDR hardware ☹️

alvinhochun (Contributor) commented:

I recently got a monitor that supports "fake HDR" (DisplayHDR 400), so I thought I could give this a try, but on Intel UHD 620 it prints "WARNING: HDR output requested but no HDR compatible format was found, falling back to SDR." and doesn't display in HDR. I kind of expected this since it is using Vulkan, but I'm a bit surprised it works for you, even in windowed mode no less. I guess there is some special handling in the NVIDIA driver?

Anyway, adding HDR output to D3D12 should be trivial and I might give it a try. (No promises!)


Shall we also consider implementing HDR display for the compatibility renderer? I am not sure if native OpenGL can do HDR, but it is very possible to implement on Windows with the help of ANGLE and some manual setting up.


fire commented Aug 28, 2024

This needs a rebase on master, but I have an HDR display (Alienware 34 Curved QD-OLED, AW3423DW): https://www.dell.com/en-ca/shop/alienware-34-curved-qd-oled-gaming-monitor-aw3423dw/apd/210-bcye/monitors-monitor-accessories

I can help test.

DarkKilauea (Contributor, Author) replied:

Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, and use as close to reference values for luminosity on light sources. (i.e. the sun at noon is 100000 lux, the moon at midnight is 0.3 lux)

I'd love to help test this PR but unfortunately I don't have HDR hardware ☹️

You should be able to test with any scene, though keep in mind that the realistic light units will not map directly to the brightness of the display. Consumer desktop displays typically don't go much above 1000 nits on the high end, which is far too dim to simulate sunlight. Values from the scene will be mapped to a range fitting within the max luminosity set for the window.

alvinhochun (Contributor) commented:

Here are the changes to get Rec. 2020 HDR output on D3D12: master...alvinhochun:godot:hdr-output-d3d12

alvinhochun (Contributor) commented:


HDR (blown out a bit, looks better on an HDR display): image

SDR: image

The over-exposure in your screenshot is expected, but the colours are oversaturated because it is missing a colour space conversion. The colours need to be converted from BT.709 primaries to BT.2020 primaries. This is how it should look with the correct colours:

image

The conversion may be done with something like this:

diff --git a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
index 3583ee8365..76305a8a3c 100644
--- a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
+++ b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
@@ -19,6 +19,15 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
        // max_luminance is the display's peak luminance in nits
        // we map it here to the native 10000 nits range of ST2084
        float adjustment = max_luminance * (1.0f / 10000.0f);
+       color = color * adjustment;
+
+       // Color transformation matrix values taken from DirectXTK, may need verification.
+    const mat3 from709to2020 = mat3(
+          0.6274040f, 0.0690970f, 0.0163916f,
+          0.3292820f, 0.9195400f, 0.0880132f,
+          0.0433136f, 0.0113612f, 0.8955950f
+       );
+       color = from709to2020 * color;

        // Apply ST2084 curve
        const float c1 = 0.8359375;
@@ -26,7 +35,7 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
        const float c3 = 18.6875;
        const float m1 = 0.1593017578125;
        const float m2 = 78.84375;
-       vec3 cp = pow(abs(color.rgb * adjustment), vec3(m1));
+       vec3 cp = pow(abs(color.rgb), vec3(m1));

        return pow((c1 + c2 * cp) / (1 + c3 * cp), vec3(m2));
 }
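As a quick sanity check on those DirectXTK coefficients (a standalone sketch, not engine code): GLSL `mat3` is column-major, so the row-major equivalent is the transpose of the listing in the diff. Both BT.709 and BT.2020 share the D65 white point, so converting white must give back approximately white, i.e. every row of the matrix sums to roughly 1.

```python
# Row-major BT.709 -> BT.2020 primaries matrix (transpose of the column-major
# GLSL mat3 in the diff above). D65 white is shared by both spaces, so white
# (1, 1, 1) must map to ~white: each row sums to ~1.

FROM_709_TO_2020 = [
    [0.6274040, 0.3292820, 0.0433136],
    [0.0690970, 0.9195400, 0.0113612],
    [0.0163916, 0.0880132, 0.8955950],
]

def mat_vec(m, v):
    # Plain row-major 3x3 matrix times column vector.
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

white_2020 = mat_vec(FROM_709_TO_2020, [1.0, 1.0, 1.0])
# Each component comes out within ~2e-6 of 1.0, so white is preserved.
```

A pure BT.709 red also lands where expected: it picks up small green and blue contributions in BT.2020 coordinates, which is why skipping this conversion makes colors look oversaturated.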

DarkKilauea (Contributor, Author) replied:


The over-exposure in your screenshot is expected, but the colours are oversaturated because it is missing a colour space conversion. The colours need to be converted from BT.709 primaries to BT.2020 primaries. This is how it should look with the correct colours:

image

Thanks for that, I've incorporated the color rotation and it looks fine on my end.

DarkKilauea (Contributor, Author) replied:

I recently got a monitor that supports "fake HDR" (DisplayHDR 400), so I thought I could give this a try, but on Intel UHD 620 it prints "WARNING: HDR output requested but no HDR compatible format was found, falling back to SDR." and doesn't display in HDR. I kind of expected this since it is using Vulkan, but I'm a bit surprised it works for you, even in windowed mode no less. I guess there is some special handling in the NVIDIA driver?

I've noticed some oddness with my Surface Laptop Studio, where the NVIDIA GPU doesn't expose any HDR color spaces in Vulkan, but the integrated Intel Xe graphics does. DirectX sees HDR-capable color spaces for both.

Shall we also consider implementing HDR display for the compatibility renderer? I am not sure if native OpenGL can do HDR, but it is very possible to implement on Windows with the help of ANGLE and some manual setting up.

I'm not sure how to set the color space for OpenGL, there doesn't seem to be a standard extension for it. I think for now, support for the compatibility renderer should be left to a future PR.

DarkKilauea (Contributor, Author) replied:

I've added support for DirectX, based on the code provided by @alvinhochun. I've also given them co-authorship for their contributions.

Thanks!

DarkKilauea (Contributor, Author) replied:

Update:

I'm currently working on tone-mapping 2D elements differently from 3D, but I'm running into some issues with how Godot renders scenes in its render targets.

Godot will render the 3D scene, tonemap that scene, then proceed to render any 2D elements directly into the same render target. Then, any render targets (from different viewports) are blitted together into the final framebuffer. I'm currently performing the colorspace conversion from sRGB/Linear to HDR 10 at this blitter, which cannot distinguish between the 2D and 3D elements.

I figured I would then update the 3D tonemap shader and canvas shader to perform the colorspace conversion themselves, but various parts of the renderer assume that only the sRGB and linear colorspaces exist (assumptions this PR invalidates), which makes it difficult to ensure I don't accidentally perform a conversion that has already occurred. I'm also trying to keep any changes as local and limited as possible to avoid making this PR harder to merge.

I'm working my way through all of the sites where sRGB conversion takes place and trying to see if there is a clean way to track what conversions have occurred, or at least determine if there is a limited subset I can touch and assume the correct space later on.

I'm assuming it would not be acceptable to have the canvas renderer render into its own render target and have the blitter combine them later. Not only would that cost more VRAM, but there is a computational cost as well. There would have to be more of a benefit than just making my life easier. :)

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch 2 times, most recently from 5f5f917 to 52059de on October 5, 2024 at 01:28
@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch 2 times, most recently from 5abaebf to 4e94080 on October 23, 2024 at 02:02

Quadtree commented Oct 23, 2024

This is really neat; I've been waiting for a proper HDR10 implementation in Godot for a while. A couple of comments:

I think this PR uses a VK_FORMAT_A2B10G10R10_UNORM_PACK32 / VK_COLOR_SPACE_HDR10_ST2084_EXT swapchain. I think this is the fastest option, but on recent NVIDIA drivers on Windows 11 (testing on 565.90) I've noticed a strange bug where, when running Godot in exclusive fullscreen with that swapchain format and Vulkan, the HDR output looks very dim and distorted. I think this is a bug in the NVIDIA driver, but I'm not 100% sure. Switching to VK_FORMAT_R16G16B16A16_SFLOAT / VK_COLOR_SPACE_EXTENDED_SRGB_LINEAR_EXT and mapping to scRGB does seem to fix it, but this isn't really an ideal fix. Also, I haven't heard anyone else report this issue, so it might be something wrong with my setup.

Secondly, I've pushed an old HDR10 prototype for Godot I did a while back, just in case it might help somehow. If you see anything useful, feel free to use it here.

Edit: NVIDIA just released 566.03 which appears to fix this issue. I can no longer repro it.
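For context on the scRGB swapchain mentioned above: VK_COLOR_SPACE_EXTENDED_SRGB_LINEAR_EXT stores linear values where 1.0 corresponds to sRGB reference white (80 nits), so HDR luminance is expressed as a linear multiple of 80 nits rather than through PQ's nonlinear curve. A sketch of that mapping (a general scRGB convention, not code from this PR):

```python
# scRGB (extended linear sRGB): 1.0 == 80-nit reference white. Values above
# 1.0 encode HDR luminance linearly, unlike PQ's nonlinear 10,000-nit curve.

SCRGB_REFERENCE_WHITE_NITS = 80.0

def nits_to_scrgb(nits: float) -> float:
    # Linear scale: a 1,000-nit highlight becomes 12.5 in an FP16 swapchain.
    return nits / SCRGB_REFERENCE_WHITE_NITS
```

This is why scRGB needs a float16 swapchain format: the useful range extends well past 1.0, which a 10-bit UNORM format can't represent.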


Jamsers commented Oct 23, 2024

HDR with raw Vulkan on Windows (or anywhere, TBH, even on Linux) is notoriously finicky; it's yet another reason why the best Vulkan games on Windows tend to present through DXGI. Windows and most modern compositors are also moving away from "exclusive fullscreen" and expect games to present to the compositor now (leaving the responsibility for low-latency, high-performance presentation to the compositor).

IIRC the latest versions of Windows actually don't have true exclusive fullscreen anymore, the compositor just "fakes" it when an app requests it now.

Co-authored-by: Alvin Wong <alvinhochun@gmail.com>