Adopt (or simply promote) a new true HDR file format leveraging JPEG/WEBP compression. Potentially alleviates HDR and EXR file size issues #27171
Comments
Excellent! I was just looking up info on the UltraHDR format, thinking how nice it would be if we had a polyfill to support it for three.js environment maps across platforms. It sounds like you've done just that, so thank you! You refer to 10-bit HDR, but I would like to remind everyone here that there's TV HDR (where a few extra fixed-point bits are adequate) and then there's physical lighting HDR, which has dynamic range requirements many orders of magnitude higher. In three.js we store environment lighting as half-float textures on the GPU for this reason, and I've even seen clipping occasionally on these! I would recommend updating your example with true HDR lighting, e.g. Spruit Sunrise. Chrome's API for … I would love to support this effort in any way I can; certainly I would be happy to add this support to model-viewer and promote it with our users.
This is an exciting development. I have been leveraging KTX2 compressed textures lately, mostly UASTC but also BasisLZ, to save VRAM. Do you think gain maps could work with compressed textures as well?
I've actually been having a discussion with the folks who invented the KTX2 compression about this. It's a very different technology from JPEG, so the short answer, I believe, is no. But the good news is they are looking at their own way to compress HDR data, so stay tuned!
Thanks for the information.
@elalish maybe this is related? https://github.com/richgel999/png16/tree/main |
I must admit I simply copy/pasted the format description from libultrahdr (which they have now, of course, corrected themselves), which mentioned 10-bit, but the format itself supports un-clipped HDR ranges and is hence suitable for IBL workflows. I've edited the title accordingly.
Exactly! The format (and our library) already encodes an unlimited HDR range (it uses half-float render targets and returns …).
Thanks! Our library and our online converter will stay free to use. If you find it useful you can integrate it as you please in model-viewer, we'd love that! Let us know if you have any problem with it, we are available for collaboration.
Well, technically, an HDR file could theoretically be reconstructed starting from two KTX textures (an SDR KTX and a gain map KTX); you can already do it this way:
When you need to load the HDR image you can follow our https://github.com/MONOGRID/gainmap-js/blob/main/examples/decode-from-separate-data.ts example and replace the … The technology is very interesting because it allows you to use any 8-bit image format, as long as it is loaded by three.js.
Someone with deeper knowledge can correct me, but I believe the problem with textures like KTX2, which remain compressed on the GPU, is that the decode process would also need to happen on the GPU, e.g. as a compute shader.
Preface: I must admit that reconstructing an HDR image using KTX textures has not been tested at all. Our implementation, though, reconstructs the full HDR range precisely on the GPU: when given the two SDR images and the reconstruction metadata, a simple fragment shader (not a compute shader) is sufficient. Our decoder, in fact, returns a … so this is theoretically feasible. See both our examples, with and without a Loader, and you'll notice we populate a material … The only issue we found is when you need to use the …:

```javascript
renderTarget.texture.mapping = EquirectangularReflectionMapping
renderTarget.texture.needsUpdate = true
material.map = renderTarget.texture
```

This does not work, so you must request a conversion to … Maybe someone on the three.js team can shed light on why this happens. Otherwise feel free to experiment and let us know your findings!
Thanks for updating your example, it looks great! I've just been testing your compression tool on some of my favorite HDR environments and I'm seeing a 10x-20x improvement in file size in JPEG mode. I just checked bundlephobia and it says your npm module comes in at 6kb; not bad, but smaller would be great. I assume that's for both encode and decode? How small can we get it for just decode? I would love to see a PR for this into three.js, as I have several users who need this desperately. It's so frustrating to put effort into compressing a GLB really well only to serve it with an environment image that's equally large.
Personally, I prefer the JPEG solution, as a single file is much easier logistically for editing and serving. I see this requires the UltraHDR wasm module. But all this does is parse a bit of metadata from the JPEG header, right? Seems easy enough to rewrite that as a tiny bit of JS. Or am I missing something?
EDIT: Done in version 2.0.0
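In the spirit of the pure-JS rewrite discussed here, the core of pulling the embedded gain map out of an Ultra HDR JPEG is finding where the primary image ends and the secondary image begins. A minimal sketch follows; the function name and the naive marker scan are illustrative, not the gainmap-js implementation (a robust parser would follow the MPF offsets in the metadata instead of scanning bytes):

```javascript
// Hedged sketch: split an Ultra HDR JPEG byte buffer into its base SDR
// image and its embedded gain map by looking for a second SOI marker
// (0xFFD8) immediately after the primary image's EOI marker (0xFFD9).
function splitUltraHDR(bytes) {
  for (let i = 2; i < bytes.length - 1; i++) {
    if (bytes[i] === 0xff && bytes[i + 1] === 0xd9 &&
        bytes[i + 2] === 0xff && bytes[i + 3] === 0xd8) {
      return {
        sdr: bytes.slice(0, i + 2),   // primary (base SDR) JPEG
        gainMap: bytes.slice(i + 2),  // secondary (gain map) JPEG
      };
    }
  }
  return { sdr: bytes, gainMap: null }; // no embedded gain map found
}
```

Both halves are themselves complete JPEG streams, so the browser's built-in decoder can handle each one.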
Keep in mind that editing a JPEG with an embedded gain map is not easily done at the moment. All current photo editing software will open the base SDR representation and discard the gain map. Some notable exceptions are: …
Plus, the simple act of sharing a gain map JPEG with an image sharing service (Twitter, Facebook, Slack, etc.) often leads to the loss of metadata, which in turn means the HDR info is gone.
EDIT: scratch that, I managed to get rid of the … The whole extraction process in pure JS lasts … For comparison, the … I've updated the example with the new pure JS implementation, published a new 2.0.0 version on npm, and bundlephobia now reports a minified + gzipped size of …
I don't know if KTX2 pairs of SDR + gain map files are general-purpose enough that we'd want to make the required changes throughout three.js to support them, though it could be implemented in userland and would be interesting to compare. Note: it is critical that compressed textures in KTX2 containers remain compressed on the GPU. For a while now, I've wished we had a practical way to produce KTX2 files using BC6H compression, which remains compressed on the GPU and has runtime advantages not available from any other method in this thread. Not all platforms support BC6H, but Khronos has published lightweight decoders that can be used to produce f16 data when runtime support isn't available. I imagine BC6H complements the approach here well: you might, for example, use libultrahdr when network bandwidth is the top concern, and KTX2 when VRAM budgets or texture upload without dropping frames are required. I think 6kb is an excellent tradeoff for the savings libultrahdr provides, and honestly, I don't think I've ever seen a useful WASM module packaged that small before. Great work!
The KTX discussion was a bit off topic; the main purpose of our library is to serve JPEG (or separate webp, if you really want to save some more file size) files with full-range HDR capabilities. It is theoretically possible to reconstruct a full-range HDR …
Hold on the compliments :) Speaking of …, I think we'll keep the … So stay tuned, I'll publish a new npm package + example soon and, once it's done, I was thinking of opening a PR with our example.
Still great compared to the cost of HDR files, and comparable to the binaries we're using to decode Basis and Draco compression, so I'm not withdrawing the compliment. :)
Will be interested to check that out!
Excellent work! This looks like everything we need for an efficient decoding solution in three.js. Thanks for reducing dependencies!
You have sparked my interest on the KTX topic: https://monogrid.github.io/gainmap-js/ktx.html … if the decoded renderTarget could be used directly with … Another quirk is that the KTX texture seems flipped on the Y axis; I'm flipping the … I've read somewhere that KTX textures need to be flipped manually; I'll see what I can do on my side, maybe in the decoding shader.
Keep in mind that KTX doesn't really help here: it's not any smaller over the wire, it's only smaller in GPU memory. However, since we have to process these images on the GPU and the GPU can't write to compressed textures, there are no GPU memory savings. In this case JPEG will actually give better performance.
Description
`.hdr` and `.exr` files are commonly used for generating HDR envMaps and scene backgrounds. They work nicely as long as you keep them within a reasonable file size, which in turn constrains their maximum resolution. The webgl_materials_envmaps_hdr example is a perfect representation of this problem: the background is low resolution because otherwise it would need to download huge `.exr` files.

Some people (like @elalish) have lamented the disadvantages of using traditional HDR files because of their huge file size.
Solution
We ported to JavaScript a new technology called Gain maps (also published on NPM).

Gain maps work by reconstructing an HDR image starting from 3 pieces of information:

- an SDR representation of the image
- a gain map (itself a standard 8-bit image)
- the reconstruction metadata (the values needed to recover the full HDR range)
These 3 pieces of information can be kept separate or embedded into a single image file. Keeping the files separate allows you to use additional compressed image formats, including `webp`. Embedding the gain map into a single file is theoretically possible with the `jpeg`, `heic`, `avif`, `jpeg xl`, `tiff` and `dng` file formats (see the spec at "How to Store Gain Maps in Various File Formats"), but we currently implemented a wasm encoder/decoder only for `jpeg`; this is a fork of Google's own Ultra HDR file format lib compiled to `wasm` (see additional context below).
(see additional context below).Additionally, we created a free online converter that allows to input an
.exr
or.hdr
and convert it to a gain map. It also allows you to view already created gain maps as if they were conventional HDR files.The online converter/viewer tool works entirely in your browser, encoding and decoding happen on your CPU and GPU.
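The per-pixel reconstruction that gain maps perform can be sketched as follows. This is a simplified form of the publicly documented gain map math (the real decoder runs the equivalent in a fragment shader on the GPU), and the parameter names here are illustrative, not the library's actual API:

```javascript
// Hedged sketch of per-channel gain map reconstruction:
//   hdr = (sdr + offsetSdr) * 2^(gain * (log2(maxBoost) - log2(minBoost)) + log2(minBoost)) - offsetHdr
// sdr:  base image channel value in [0, 1]
// gain: gain map sample in [0, 1] (0 = no boost, 1 = maximum boost)
// meta: reconstruction metadata shipped alongside the two images
function applyGainMap(sdr, gain, meta) {
  const logBoost =
    gain * (Math.log2(meta.maxContentBoost) - Math.log2(meta.minContentBoost)) +
    Math.log2(meta.minContentBoost);
  return (sdr + meta.offsetSdr) * Math.pow(2, logBoost) - meta.offsetHdr;
}
```

With `maxContentBoost` of 4, a fully boosted pixel is pushed two stops above its SDR value, which is how the un-clipped HDR range is recovered from two 8-bit images.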
For starters, we propose the integration of an external example (like previously done with this one), which leverages our three.js Loaders.
I can create a pull request that adds such an example if you like the technology and the implementation.
I'm also available to answer any questions you may have about our implementation and/or the technology in general.
Alternatives
RGBM is a comparable alternative but it has the following disadvantages:

- HDR range is limited to 3-4 stops; beyond that the technique starts to fall apart. Gain maps have no such limitation and are able to represent the full un-clipped HDR range (as far as half-float precision goes).
- It requires a JavaScript PNG parser, which is not very fast; big `rgbm` images can take a long time to parse. `jpeg` and `webp` gain maps leverage the browser's built-in decoding, and the reconstruction of the HDR image is accomplished with a custom shader, which is near instant.
- PNG compression is not as good as `jpeg` and `webp`, so file size is still an issue sometimes.

LogLuv is also a comparable alternative but: …
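For comparison with the reconstruction gain maps perform, RGBM decoding amounts to a single multiply per channel, and the fixed range constant is exactly what caps the representable stops. A minimal sketch, assuming the commonly used range of 16 (the constant and function name are illustrative):

```javascript
// Hedged sketch of RGBM decoding: RGB carries color, A carries a shared
// multiplier, and a fixed range constant scales the result. Values above
// RGBM_RANGE simply cannot be represented, hence the limited HDR range.
const RGBM_RANGE = 16; // a typical choice, not mandated by any spec
function decodeRGBM([r, g, b, m]) {
  return [r * m * RGBM_RANGE, g * m * RGBM_RANGE, b * m * RGBM_RANGE];
}
```

Note that the maximum representable value is `RGBM_RANGE` itself (all channels at 1.0), i.e. 4 stops above white, which matches the 3-4 stop limitation described above.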
Additional context
Google is adopting the gain map technology in Android 14, but refers to it as the Ultra HDR Image Format; a JPEG file with an embedded gain map is called JPEGR in their terminology.
Hence the terms `Ultra HDR` and `Gain Map` are effectively synonyms. This can be a little confusing, but the technology is still evolving and standard names are not established yet.

Chrome has support for native JPEG gain map decoding (with initial `avif` gain map support behind the flag `chrome://flags/#avif-gainmap-hdr-images`). This allows users with HDR displays to visualize HDR content compressed in JPEG or AVIF.

It is unclear if, in the future, Chrome's JS APIs will allow natively obtaining `Uint16Array` buffers from gain map images; for the moment we do it ourselves with our library.
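As a sketch of what producing such half-float `Uint16Array` buffers involves, here is a minimal float-to-half packer. This is illustrative only (no rounding or denormal handling) and is not the library's actual code:

```javascript
// Hedged sketch: pack a JS number into an IEEE 754 binary16 value, the
// element type of the Uint16Array buffers discussed above. Uses a shared
// buffer to read the float32 bit pattern, then rebases the exponent.
function floatToHalf(value) {
  const f32 = new Float32Array(1);
  const u32 = new Uint32Array(f32.buffer);
  f32[0] = value;
  const x = u32[0];
  const sign = (x >> 16) & 0x8000;
  const exp = ((x >> 23) & 0xff) - 112; // rebias exponent: 127 -> 15
  const mant = (x >> 13) & 0x3ff;       // truncate mantissa 23 -> 10 bits
  if (exp <= 0) return sign;            // underflow: flush to signed zero
  if (exp >= 31) return sign | 0x7c00;  // overflow: signed infinity
  return sign | (exp << 10) | mant;
}
```

Mapping a decoded HDR image is then just `Uint16Array.from(floatPixels, floatToHalf)`, ready for upload as a half-float texture.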