
AVIF? #18

Closed
berkmh opened this issue Aug 8, 2023 · 3 comments · Fixed by #21
Comments


berkmh commented Aug 8, 2023

Can you add an AVIF handler to serve whichever image format (AVIF, WebP, etc.) is smaller in size?

@HoraceShmorace (Owner)

Sorry, the past year has been busy. I can add support for the AVIF format. However, in order to choose between AVIF and WebP based on file size, the service would have to generate both, which would increase latency. I could add the ability to favor AVIF over WebP via an environment variable or a command-line flag.
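The environment-variable idea could look something like the sketch below. Note this is purely illustrative: the flag name `PREFER_AVIF` and the function are hypothetical, not part of Image-Flex.

```python
import os

def preferred_format(env=os.environ) -> str:
    """Return "avif" when the opt-in flag is set, otherwise "webp".

    PREFER_AVIF is a hypothetical variable name for illustration only;
    this thread does not specify what Image-Flex would actually call it.
    """
    return "avif" if env.get("PREFER_AVIF", "").lower() in ("1", "true") else "webp"
```

Passing the environment mapping as a parameter keeps the selection logic testable without mutating the real process environment.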

If you absolutely need to always use whichever format results in the smaller file, I'd suggest creating a simple service that requests both file types from your Image Flex instance, compares the sizes of the responses, and returns the smaller one.
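A minimal sketch of that comparison service's core logic, assuming you can address each format variant by URL (the `?format=` query parameter here is an assumption, not Image-Flex's actual API; adapt it to however your instance selects output formats, e.g. Accept-header negotiation):

```python
def smaller_variant(fetch, base_url, formats=("avif", "webp")):
    """Request each format variant and return (format, body) for the
    smallest payload.

    `fetch(url) -> bytes` is injected so the comparison logic can be
    exercised without a live Image-Flex instance (swap in an HTTP
    client in production).
    """
    bodies = {fmt: fetch(f"{base_url}?format={fmt}") for fmt in formats}
    smallest = min(bodies, key=lambda fmt: len(bodies[fmt]))
    return smallest, bodies[smallest]
```

As the comment above notes, this doubles the number of origin requests per image, which is exactly the latency trade-off being discussed.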

I'll also consider adding the feature and making it configurable, but the increase in computation time needed to generate a second image would likely negate the savings you'd gain from serving the smaller file on calls to origin.


jcam commented Apr 25, 2024

I added AVIF in my fork here (back in 2022), but I found that the time to generate an AVIF was far too long for our user experience, so I turned it back off in a later commit.
(There are a few other changes for our use case.)
master...jcam:Image-Flex:master

@HoraceShmorace (Owner)

OK, I've done a lot of testing. The default image format is now AVIF. I haven't seen any additional latency when generating images. If you do encounter added latency, you can always warm the cache.


3 participants