png file is garbled, Mac OS X #1531

Open
justjoheinz opened this issue Feb 12, 2024 · 25 comments

@justjoheinz

I use Allegro 5.2.9.1 (built by brew) on macOS on an M1 (Apple Silicon) machine. libpng is 1.6.42.

I have a file (attached as a10000) which, according to file, is:

> file a10000.png                     
a10000.png: PNG image data, 64 x 64, 16-bit/color RGBA, non-interlaced

Using https://github.com/liballeg/allegro5/blob/master/examples/ex_bitmap_file.c to render this image, the output is garbled.

If I do

> convert a10000.png -depth 8 a10000.png
> file a10000.png
a10000.png: PNG image data, 64 x 64, 8-bit gray+alpha, non-interlaced

the output is ok

@pedro-w
Contributor

pedro-w commented Feb 12, 2024

It would be useful to know whether this is macOS-only or affects Allegro in general (guessing 16-bit PNGs are not supported?).

@lockie
Contributor

lockie commented Feb 12, 2024

The very same image is displayed properly on both Linux and Windows.

@pedro-w
Contributor

pedro-w commented Feb 13, 2024

Alright,

int bits = [bitmap_rep bitsPerPixel];
int samples = bits / 8;
ALLEGRO_DEBUG("Read image of size %dx%dx%d\n", w, h, bits);

appears to be wrong; it's assuming 8 bits per channel, so that number of samples per pixel = number of bits per pixel / 8. In the example PNG, bitsPerPixel is 64, i.e. 4 samples of 16 bits each, which is incorrectly calculated as 8 samples of 8 bits each.
There may be other errors subsequent to this.
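
For illustration, a minimal sketch of how the rep could be interrogated instead - bitmap_rep is the same NSBitmapImageRep as above, and rejecting non-8-bit data is just one possible policy here, not the actual loader code:

NSInteger bits_per_sample = [bitmap_rep bitsPerSample];     /* 16 for the PNG in this issue */
NSInteger samples_per_pixel = [bitmap_rep samplesPerPixel]; /* 4, i.e. RGBA */
NSInteger bits_per_pixel = [bitmap_rep bitsPerPixel];       /* 64, not 32 */
ALLEGRO_DEBUG("Read image %dx%d: %ld bits/pixel = %ld samples x %ld bits\n",
   w, h, (long)bits_per_pixel, (long)samples_per_pixel, (long)bits_per_sample);
if (bits_per_sample != 8) {
   /* The existing conversion code only understands 8-bit samples. */
   ALLEGRO_ERROR("Unsupported format: %ld bits per sample\n", (long)bits_per_sample);
   return NULL; /* assuming the surrounding function returns a bitmap pointer */
}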

@pedro-w
Contributor

pedro-w commented Feb 14, 2024

At the moment we're loading an NSImage and then extracting its first NSImageRep, which we hope is really an NSBitmapImageRep.
That class represents many different bitmap variants - planar or interleaved, premultiplied or not, with up to 5 channels of up to 32 bits each. Allegro converts this to an ALLEGRO_BITMAP of type ALLEGRO_PIXEL_FORMAT_ABGR_8888_LE, but it doesn't handle all possible variants. Doing that would be possible but probably quite tedious, especially as most variants would be rare in practice.
The options as I see them are:

  • Write that code for all variants
  • Error if it's not a 'common' variant
  • Investigate CoreGraphics
  • Draw to an intermediate context
  • Use libpng et al (like Linux does, I assume)

I don't know anything about CoreGraphics, but I saw Apple's docs suggesting it should be used when doing image manipulation.
The 'intermediate context' idea is to create a Cocoa graphics context backed by a fixed bitmap format (ie. the equivalent of ALLEGRO_PIXEL_FORMAT_ABGR_8888_LE) and just draw the NSImage into it, letting Cocoa do all the format changing. I think this should work but can't say if it would be too slow.
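
To make the intermediate-context idea concrete, a rough sketch (untested; the helper name, the device RGB color space, and the premultiplication behaviour are all assumptions to be checked):

/* Sketch: let Cocoa convert an arbitrary NSImage into a fixed 32-bit RGBA rep. */
static NSBitmapImageRep *render_to_rgba8(NSImage *image, int w, int h)
{
   NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
      initWithBitmapDataPlanes:NULL
                    pixelsWide:w
                    pixelsHigh:h
                 bitsPerSample:8
               samplesPerPixel:4
                      hasAlpha:YES
                      isPlanar:NO
                colorSpaceName:NSDeviceRGBColorSpace
                   bytesPerRow:w * 4
                  bitsPerPixel:32];
   [NSGraphicsContext saveGraphicsState];
   [NSGraphicsContext setCurrentContext:
      [NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
   [image drawInRect:NSMakeRect(0, 0, w, h)];
   [NSGraphicsContext restoreGraphicsState];
   return rep;   /* [rep bitmapData] now holds 8-bit RGBA samples */
}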

@SiegeLord has there been previous discussion on this, I can't believe this issue has only surfaced in 2024!

@SiegeLord
Member

I didn't know 16 bit PNGs even existed, I'm not surprised this hasn't come up before.

Regardless of what's done, we should at least gracefully reject formats we can't handle.

@pedro-w
Contributor

pedro-w commented Feb 14, 2024

...I think maybe TIFF also supports higher bits per channel.

How about retaining the current code (it should be fast) but falling back to the drawing-context option if the current code can't handle the format? At the moment, AFAICS, there's no else clause to catch unhandled formats, so we don't write anything into the target bitmap!

I can investigate if no-one else feels motivated to.

@allefant
Contributor

As a workaround, you can disable the native image addon and fall back to libpng. (I don't remember if it was because of this specific bug, but the native addon never worked for my PNGs - I just never investigated further.)
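
For reference, that build-time switch (mentioned again later in this thread) is a CMake option; the source path below is a placeholder:

> cmake -DWANT_NATIVE_IMAGE_LOADER=off /path/to/allegro5

after which the image addon falls back to libpng and friends.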

@pedro-w
Contributor

pedro-w commented Feb 15, 2024

I've looked into the render-to-context option; it works on the PNG file here and on an 8-bit PNG. All bitmap loads (bmp, jpg, ...) on macOS pass through that function, so it's a bit scary to be changing it. I will post some code soon so others can weigh in.

I've also started trying CIImage; currently it is messing up the colours (swapping channels, I guess) and I'm not sure why. (CoreImage looks pretty heavy, but the only parts we need are "image from data" and "render to bitmap".)
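
Roughly the shape of the CoreImage path, as a sketch only (the output format, the sRGB color space, and the function name are assumptions, not the code in the branch):

#import <CoreImage/CoreImage.h>

/* Sketch: decode an image file's bytes with CoreImage and render the
 * result into a caller-supplied w*h RGBA8 buffer. */
static BOOL decode_with_coreimage(NSData *file_data, unsigned char *out,
   int w, int h)
{
   CIImage *image = [CIImage imageWithData:file_data];
   if (!image)
      return NO;
   CIContext *ctx = [CIContext contextWithOptions:nil];
   CGColorSpaceRef srgb = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
   [ctx render:image
      toBitmap:out
      rowBytes:(ptrdiff_t)w * 4
        bounds:CGRectMake(0, 0, w, h)
        format:kCIFormatRGBA8
    colorSpace:srgb];
   CGColorSpaceRelease(srgb);
   return YES;
}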

More later!

@pedro-w
Contributor

pedro-w commented Feb 17, 2024

Have pushed some code, please have a look
https://github.com/pedro-w/allegro5/tree/issue-1531
(definitely not ready for a PR yet - in particular I'm not really sure I understand premultiplying alpha)
I think it's better than what was there before - we're still at the mercy of macOS's built-in image handling, but via CoreImage rather than AppKit (both of these end up in CoreGraphics, I suppose).
It would be great if someone could have a play with some usual and unusual bitmap formats.
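
In case it helps with checking the premultiplication question above: with premultiplied alpha each stored color channel has already been scaled by alpha, so the round trip looks roughly like the helpers below. Whether the CoreImage output is premultiplied, and what the target ALLEGRO_BITMAP expects, are exactly the things to verify.

/* straight -> premultiplied: scale each color channel by alpha */
static unsigned char premul(unsigned char c, unsigned char a)
{
   return (unsigned char)((c * a + 127) / 255);
}

/* premultiplied -> straight (alpha 0 carries no meaningful color) */
static unsigned char unpremul(unsigned char c, unsigned char a)
{
   return a ? (unsigned char)((c * 255 + a / 2) / a) : 0;
}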

@pedro-w
Contributor

pedro-w commented Feb 26, 2024

Ironically, I was hit by this myself yesterday - I spent ages trying to debug something when the issue was that ImageMagick's montage command had created a 16-bit-per-channel PNG for me 😠

@lockie
Contributor

lockie commented Mar 11, 2024

@pedro-w Hey, I've tried your fork in a macOS virtual machine and I'm afraid it still renders the 16-bit PNG from the OP incorrectly:

(screenshot: 2024-03-11-095849_604x502_scrot)

(and just a small observation, the visual garbage itself is different every time I launch ex_bitmap_file)

@pedro-w
Contributor

pedro-w commented Mar 11, 2024

Disappointing. It worked for me. What version of macOS are you using?
The visual garbage would be different every time under the old code because it allocates the memory, then sees a format it can't handle and skips the writing stage. The result is a bitmap made up of uninitialized memory.
Pete

@lockie
Contributor

lockie commented Mar 11, 2024

That'd be Monterey.
Let me know if I can add some debugging code to clear this thing up.

@pedro-w
Contributor

pedro-w commented Mar 11, 2024

One useful thing would be to just fwrite the contents of the surf array, round about line 52.
Then you can use ImageMagick to convert it to a normal image file. Then we'll see if the problem is (stage 1) loading the image data with CoreImage or (stage 2) the conversion to an Allegro bitmap.
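
Something along these lines, assuming surf is the w*h RGBA8 buffer in that function (the dump path is arbitrary):

FILE *fp = fopen("/tmp/surf.rgba", "wb");
fwrite(surf, 4, (size_t)w * h, fp);
fclose(fp);

and then, for the 64x64 image from this issue:

> convert -size 64x64 -depth 8 rgba:/tmp/surf.rgba surf-dump.png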

@lockie
Contributor

lockie commented Mar 11, 2024

Oh, so sorry, I forgot to checkout the correct branch 😅
With your changes actually applied the image in question is indeed displayed correctly.

@pedro-w
Contributor

pedro-w commented Mar 11, 2024

Let's not ever mention this again 😉

Thanks very much for testing. I've also got an old macOS (Catalina), so it would be handy if someone with a newer version, especially an Apple Silicon machine, could give this code a try. @justjoheinz would you be able to try it on your M1?

@justjoheinz
Author

I can give it a shot.
I have M1/Sonoma 14.3.1

@justjoheinz
Author

LGTM, I tried your issue-1531 branch in your repo.

@pedro-w
Contributor

pedro-w commented Mar 12, 2024

OK I'll tidy it up for a PR. I'll look to add a test for it too.
The only remaining issue is pre-multiplied alpha - I don't really know how to check that it's working correctly.

@SimonN
Contributor

SimonN commented Sep 18, 2024

I have a related-looking bug in my Allegro 5 game: Lix #431: Magic pink won't become transparent on macOS.

(My obvious symptom in Lix, the failing conversion from magic pink to transparency, is not the bug itself. Other colors are also slightly wrong. I recolor in-game UI according to pixels from PNGs, and this UI recoloring fails, too.)

@Rhys-T has reduced my bug to the interaction between Allegro 5 and Cocoa (Mac's native image loading API). Our findings so far agree with @pedro-w's findings:

  1. The bug only affects Mac, and it affects both Intel and ARM Macs. It never happens on Windows or Linux.
  2. The bug stops happening when we disable the native loader by passing -DWANT_NATIVE_IMAGE_LOADER=off to CMake during Allegro 5's configuration before building.
  3. The bug affects palette-indexed PNGs and 24-bit PNGs (see the next post). The images load largely as they would on Linux and Windows (i.e., they're not completely garbled like the 16-bit PNG here), but with slightly off colors; e.g., magic pink #FF00FF from the file loads as #FB00FF.
  4. We haven't tested 16-bit PNGs.
  5. The bug stops happening when the PNGs are 32-bit. This matches the finding posted here on 2024-02-13: "[Allegro 5 is] assuming 8 bits per channel, so that number of samples per pixel = number of bits per pixel / 8." My workaround in Lix is to tell Mac users to mass-convert all PNGs to 32-bit (see the example command after this list).
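
One way to do that mass conversion with ImageMagick (the png32: output prefix forces 8-bit-per-channel RGBA; filenames are placeholders):

> convert input.png png32:output.png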

How is the PR going? Do you need more help with some of the formats?

Can Allegro 5 tell Cocoa to always offer the loaded file as 32-bit, so that Allegro 5's assumptions (always 8 bit per channel) become correct?

Should Allegro 5 disable the native Mac image loaders by default? This would fix the bug in future commonly packaged Allegro 5 binaries and make Allegro 5's codebase simpler at the same time.

I don't have a Mac to test myself.

@Rhys-T

Rhys-T commented Sep 18, 2024

  1. The bug affects palette-indexed PNGs and 24-bit PNGs

Slight correction: as far as I can tell, it only affects palette-indexed PNGs. I was just saying "don't convert everything to 24-bit, because that loses information from some images that are currently working as-is".

I'm not 100% sure what the correct behavior is here. Converting the colors might arguably be more correct in some ways - as far as I can tell, Allegro works exclusively with colors in the display's native RGB color space, not the sRGB that the images were in, so leaving the colors numerically the same actually ends up changing them visually. But on the other hand, pre-converting the colors means that things like 'magic pink' (#FF00FF) no longer get recognized by al_convert_mask_to_alpha and the like…
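
For context, al_convert_mask_to_alpha matches exact pixel values, which is why even a small shift in the loaded color breaks it ("sprite.png" here is just a stand-in):

ALLEGRO_BITMAP *bmp = al_load_bitmap("sprite.png");
/* Every pixel that is exactly #FF00FF becomes fully transparent;
 * a pixel loaded as #FB00FF would be left opaque. */
al_convert_mask_to_alpha(bmp, al_map_rgb(255, 0, 255));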

In any case, it definitely needs to pay attention to what kind of data the NSBitmapImageRep is actually giving it.

@Rhys-T

Rhys-T commented Sep 20, 2024

(i.e., they're not completely garbled as the 16-bit PNG here)

There seem to be two semi-related problems with how the macOS native bitmap loader handles the NSBitmapImageRep it gets back:

  1. If it doesn't support the color format, it gives up and returns uninitialized memory. This is what this issue was originally opened for.
  2. It pays no attention to the color space it's getting back. This is what Lix is running into.
    Screen-native RGB is more correct for display purposes, but you need what's actually in the file (sRGB in Lix's case) to be able to recognize it for color-remapping purposes (magic pink, as well as other 'special' colors that the game might look for, like the UI colors, or the lix sprite colors that need to be remapped in multiplayer).
    I'm not sure how to reconcile this without changes to Allegro's API. And NSImage (at least the way it's being used now) seems inconsistent about which color space it decides to return.

Summary of the results of different loader/color format/color space combinations as far as I understand at this point:

Loader: libpng
PNG color space and format: any format(?)
Result:
  ✅ sRGB colors read correctly, and are then gamma-adjusted.
  ✅ Magic pink survives the gamma process because it's made entirely of FFs and 00s, and is recognized. Same with black.
  ❌ Other special colors aren't so lucky, if Allegro is configured to use a different gamma value than the image.
    ✅ But they're fine with Allegro's default gamma value of 2.2.
  ❌ Color space/profile is ignored, and colors are reinterpreted as native RGB, and render slightly wrong. (Not sure yet whether the gamma part is right or not.)

Loader: NSImage
PNG color space and format:
  sRGB; 24-bit, 32-bit, other 8-bit-per-channel formats
  sRGB; indexed with 32-bit palette
Result:
  ✅ sRGB colors read correctly and are returned as-is.
  ✅ Magic pink and other special colors are recognized.
  ❌ Colors are then reinterpreted as native RGB without taking color profiles/spaces into account, and render slightly wrong.

Loader: NSImage
PNG color space and format: sRGB; 16-bit (4 bits per channel)
Result:
  ❌ Can't read at all. Returns uninitialized memory.

Loader: NSImage
PNG color space and format: sRGB; indexed with 24-bit palette
Result:
  ✅ sRGB colors read correctly and are converted to native RGB.
  ❌ Magic pink and other special colors are no longer at their sRGB values and are not recognized.
  ✅ Color profiles/spaces are still ignored, but the colors are already in the native color space and render correctly.

Loader: CIImage (from #1531 (comment))
PNG color space and format:
  sRGB; 24-bit, 32-bit, other 8-bit-per-channel formats
  sRGB; indexed with 32-bit palette
Result:
  ✅ sRGB colors read correctly, and are then gamma-adjusted.
  ✅ Magic pink survives the gamma process because it's made entirely of FFs and 00s, and is recognized. Same with black.
  ❌ Other special colors aren't so lucky, and don't get recognized.

Loader: CIImage (from #1531 (comment))
PNG color space and format: sRGB; 16-bit (4 bits per channel)
Result:
  ❓ Haven't tried this combo yet.

@Rhys-T

Rhys-T commented Oct 19, 2024

I've figured out why I couldn't get @pedro-w's CIImage-based loader to build under Nix: Nix defaults to targeting a minimum macOS version of 10.12, and CIContextOption first appeared in 10.14. I worked around that and started testing it with Lix. (I'm applying that loader as a patch to the same version of Allegro I've been building Lix against, rather than using that branch as-is, to minimize the number of changes that might affect what I'm seeing. However, it seemed to look the same when I used the actual branch.)

Magic pink seems to work correctly with the CIImage loader, but normal colors all look significantly darker than in either of the current loaders, and UI icons don't get remapped from grayscale to the proper light-blue colors. [1] I suspect this is a gamma-related issue: magic pink consists of only 00s and FFs, which would both map to themselves under any gamma value, so it just happens to survive the process. The libpng loader looks identical if I export SCREEN_GAMMA=1.0 before running the game (default is 2.2).
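
A quick sanity check of that reasoning, assuming a plain power-law curve of the form out = 255 * (in/255)^e (the exact curve the libpng loader applies may differ, but the endpoint behaviour is the same):

#include <math.h>
#include <stdio.h>

/* 0x00 and 0xFF map to themselves under any exponent; mid values shift. */
static unsigned char adjust(unsigned char v, double e)
{
   return (unsigned char)(255.0 * pow(v / 255.0, e) + 0.5);
}

int main(void)
{
   double e = 1.0 / 2.2;  /* e = 1.0 would leave every value untouched */
   printf("%02X %02X %02X\n", adjust(0x00, e), adjust(0xFF, e), adjust(0x50, e));
   return 0;
}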

I'll update the table above with this info.

Screenshots

Main menu

With libpng, or NSImage after converting to PNG32:
Screen Shot 2024-10-19 at 2 57 42 PM

With CIImage, or libpng with SCREEN_GAMMA=1.0:
Screen Shot 2024-10-19 at 2 57 33 PM

In-game

With libpng, or NSImage after converting to PNG32:
Screen Shot 2024-09-12 at 12 00 11 AM

With CIImage, or libpng with SCREEN_GAMMA=1.0:
Screen Shot 2024-10-19 at 2 58 09 PM

Dock icon

Left is CIImage, or libpng with SCREEN_GAMMA=1.0. Right is the standard loaders.
Screen Shot 2024-10-19 at 3 09 04 PM

Footnotes

  [1] The lix sprite colors do get remapped properly in multiplayer, which confused me for a bit, but then I remembered that the 'from' and 'to' colors for that mapping are being loaded from lixrecol.png through the same loader as everything else, rather than being hardcoded in the program.

@pedro-w
Contributor

pedro-w commented Oct 20, 2024

This is great work. Sorry I've not kept up with this thread.

It seems to me that it would be better to go for the libpng loader, because then any improvements we make would be cross-platform (ditto any idiosyncrasies that still occur).

Not sure about the color space thing though - is there a library to convert color spaces, or is it acceptable for us to say in the docs that only sRGB files will be reproduced faithfully?

Also if gamma can be set via the config file, does this mean that programs that rely on image pixels being a certain RGB value can be broken? (e.g. I think I remember an RPG-type game where the character sprites were recolored to give them different clothes)

@Rhys-T

Rhys-T commented Oct 20, 2024

One question I have is, what color space is Allegro supposed to be working in - or in other words, what space is an ALLEGRO_COLOR treated as? I can't seem to find this documented anywhere. Right now it seems like it's passed through as the screen's native RGB space (at least on macOS - I haven't had a chance to test it on any other platforms yet). That lets you access the full gamut of whatever screen you're on, instead of being limited to what sRGB covers, but it would also require an extra conversion step before trying to display anything, if you want it to look consistent.

It seems like the ideal setup would be to separate loading from converting to native: First load the pixel values as-is, without any conversion or gamma adjustment or anything, so the game can recognize any special colors it needs to. Then either have a function to convert the final bitmap to native explicitly before drawing, or make it so that an ALLEGRO_DISPLAY has a concept of what color space the program is using and can convert that to native whenever it draws something. But either of those options seems like it would involve (potentially breaking) API changes. (Possibly related: #463)

I've never actually developed anything using Allegro myself, nor done much of anything that needed to care about color spaces, so I can't really offer any insight as to what approach would be best from a developer perspective.

Also if gamma can be set via the config file, does this mean that programs that rely on image pixels being a certain RGB value can be broken? (e.g. I think I remember an RPG-type game where the character sprites were recolored to give them different clothes)

Yes - if I set SCREEN_GAMMA (or its .allegro5rc equivalent) to something other than 2.2, Lix fails to identify certain colors. For example, look at the in-game screenshots above: the buttons in the lower-right corner are supposed to be in shades of blue, but with the darker gamma, they're coming out in grays like in the actual image (except where it happens to land on another recognized shade of gray and gets mapped to a blue).

(Lix does have a few other special colors it looks for, but for various reasons, the gamma stuff in particular doesn't affect most of them. Black and magic pink are both used as 'transparent' in various places, but they're at the very ends of the gamma curve and don't get adjusted, so they work. In multiplayer, the various colors in the lix sprites get remapped to shades of each player's color, closer to your RPG example, and those colors do get adjusted - but the colors that the game is looking for are taken from the first row of lixrecol.png, so they've also been adjusted, and still match what's in the sprites. The only other special color I know of that might be getting affected is #505050, which the game uses to find a lix's eye to correctly position the fuse for an exploder in multiplayer. But if that one's failing, it's subtle enough that I'm not seeing it.)

@SiegeLord SiegeLord added this to the 5.2.11 milestone Nov 26, 2024