
How can I do with error that - Input file has corrupt header: pngload: reached chunk/cache limits #2984

Closed
lix059 opened this issue Nov 21, 2021 · 8 comments

Comments

lix059 commented Nov 21, 2021

What are you trying to achieve?
I was trying to generate a thumbnail from a PNG that is 254 MB in size, which triggered this error.

Have you searched for similar questions?
Yes, but I could not find out how to fix it.

Are you able to provide a minimal, standalone code sample that demonstrates this question?
This is my processing code:
[screenshot of processing code]

Are you able to provide a sample image that helps explain the question?
This is the error message:
[screenshot of error output]

@lix059 lix059 changed the title how can i do with error that - Input file has corrupt header: pngload: reached chunk/cache limits How can I do with error that - Input file has corrupt header: pngload: reached chunk/cache limits Nov 21, 2021
lovell (Owner) commented Nov 21, 2021

I suspect this image trips the SPNG_CHUNK_COUNT_LIMIT of 1000, which is a safety feature of libspng.

https://github.com/randy408/libspng/blob/master/docs/decode.md#memory-usage

To allow an optional increase in this value, we'd need to expose control over this in libvips first.

Are you able to provide a sample image that fails in this manner?

lix059 (Author) commented Nov 22, 2021

The sample image is large. Can you leave an email address? I will send it to you.

randy408 commented

> I suspect this image trips the SPNG_CHUNK_COUNT_LIMIT of 1000, which is a safety feature of libspng.

Hopefully it's not that, I'll raise the default value if that's the case. Either way I'll be adding a new error code to differentiate from the other two limits.

lovell (Owner) commented Nov 22, 2021

@lix059 Thank you. If it's a really big file, it might be worth using a file-hosting service rather than email, e.g. something like Dropbox or Google Drive.

"author": "Lovell Fuller <npm@lovell.info>",

lix059 (Author) commented Nov 23, 2021

@lovell I sent the sample image download URL to the npm@lovell.info email. Thank you.

lovell (Owner) commented Nov 23, 2021

Thanks for the image, it has 143 chunks, two of which are text chunks that consume ~260MB when decompressed.

$ pngchunks preview221.png | grep "Chunk:" | wc -l
143
$ pngchunks preview221.png | grep zTXt
Chunk: Data Length 131093001 (max 2147483647), Type 1951945850 [zTXt]
Chunk: Data Length 131085202 (max 2147483647), Type 1951945850 [zTXt]

As Randy has pointed out, there's a new unlimited flag in libvips that we need to expose in sharp, which should then allow this image to be processed. (This will be opt-in, as the safety features are there for a reason.)

lix059 (Author) commented Nov 24, 2021

@lovell Thanks.

@lovell lovell added this to the v0.30.0 milestone Nov 24, 2021
lovell (Owner) commented Feb 1, 2022

v0.30.0 is now available with this feature.
