@pmarreck
Created September 21, 2023 19:15
JPEG-XL tests
A few tests I performed using `cjxl` (the reference JPEG-XL encoder) on a 120.7 MB PNG (the Carina Nebula, the first photo from the James Webb Space Telescope).

There are two main parameters: "distance" (where 0 is lossless and 1 is the least detectable lossy setting; higher numbers are lossier) and "effort" (compression effort, where 1 is fast and 9 is exhaustive/very slow).
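For reference, a minimal invocation looks like this (the filenames are placeholders; `-d` and `-e` are `cjxl`'s short flags for distance and effort):

```bash
# Lossless, maximum effort (the archival case below):
cjxl -d 0 -e 9 carina_nebula.png carina_lossless.jxl

# Visually lossless-ish, moderate effort:
cjxl -d 1 -e 6 carina_nebula.png carina_lossy.jxl
```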
The results:

- distance 0, effort 9: took 3 hours (obviously this is for a "write once, read many" sort of archival use case) and got it down to 67.2 MB, about half the size of the PNG; not bad for staying lossless.
- distance 0, effort 6: took 1m43s and got it down to 75 MB, about 63% of the PNG.
- distance 1, effort 8: took 3 minutes and got it down to 7.7 MB, about 1/15th of the PNG.
- distance 1, effort 6: took 4s and got it to 7.1 MB... wait, _whut?_ That's... odd. Smaller than with effort 8. Anyway.
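To reproduce the whole matrix in one go, a minimal sketch (the input filename is a placeholder; `time` and `ls -lh` give the kind of figures quoted above):

```bash
#!/usr/bin/env bash
# Run cjxl across the four distance/effort combinations tested above.
input="carina_nebula.png"
for de in "0 9" "0 6" "1 8" "1 6"; do
  read -r d e <<< "$de"
  out="carina_d${d}_e${e}.jxl"
  echo "--- distance=$d effort=$e ---"
  time cjxl -d "$d" -e "$e" "$input" "$out"
  ls -lh "$out"
done
```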
All encodings used only 1 of my 128 cores. Whole-image compression of this nature may not be "trivially parallelizable" ::shrug::
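(Note: `cjxl` does expose a `--num_threads` flag, so an explicit thread count might behave differently; whether single-image encoding at these settings can actually use the extra workers is untested here.)

```bash
# Untested here: ask cjxl for an explicit number of worker threads.
cjxl --num_threads=16 -d 0 -e 9 carina_nebula.png carina_lossless.jxl
```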