As mentioned in the previous post, I spent the past few weeks focusing on compression performance. I introduced a number of improvements to the compression endpoint:
- integrating with MozJPEG, a fork of libjpeg-turbo by Mozilla. MozJPEG is known to provide some of the best JPEG compression results. See: https://libjpeg-turbo.org/About/Mozjpeg
- adding the possibility to generate progressive JPEGs when compressing. Progressive JPEGs are encoded in multiple scans, so the decoder can render a rough version of the image as soon as the first scan arrives and then refine it as more data comes in. This shows a blurry, low-resolution image as a first step, which then improves little by little depending on the configured scans (there is a small sketch of a scan script after this list). I wrote something about it on imager200's blog: https://www.imager200.io/blog/walkthough-progressive-jpeg/
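To make the "configured scans" part more concrete, here is a small sketch of what a scan script looks like at the libjpeg/MozJPEG API level. This is purely illustrative and not imager200's actual configuration: it assumes a 3-component YCbCr image, sends all DC coefficients first (which produces the coarse preview), and fills in the AC detail component by component afterwards.

```c
#include <stdio.h>
#include <jpeglib.h>

/* Illustrative progressive scan script (assumes a 3-component YCbCr image).
 * Fields per scan: comps_in_scan, component_index[], Ss, Se, Ah, Al. */
static const jpeg_scan_info custom_scans[] = {
    { 3, { 0, 1, 2 }, 0,  0, 0, 0 },  /* DC of Y, Cb, Cr -> coarse preview */
    { 1, { 0 },       1, 63, 0, 0 },  /* AC of Y -> luma detail            */
    { 1, { 1 },       1, 63, 0, 0 },  /* AC of Cb                          */
    { 1, { 2 },       1, 63, 0, 0 },  /* AC of Cr                          */
};

/* To be called between jpeg_set_defaults() and jpeg_start_compress(). */
static void set_custom_progression(j_compress_ptr cinfo)
{
    cinfo->scan_info = custom_scans;
    cinfo->num_scans = sizeof(custom_scans) / sizeof(custom_scans[0]);
}
```

If you don't need that level of control, calling jpeg_simple_progression() installs the library's default multi-scan script instead.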
The toughest part was the integration with MozJPEG. Even though the library is widely adopted, I did not find any full example out there that does what I wanted, which is simply writing the image back to a buffer in memory after finishing the compression. Their example is good, but it does not cover how to read and decode the input image: https://github.com/mozilla/mozjpeg/blob/master/example.txt
It took me a few days of trial and error to get it right.
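For reference, here is a minimal sketch of the in-memory path I mean, using the standard libjpeg API that MozJPEG exposes (this is not imager200's exact code). It assumes the input has already been decoded to a packed RGB buffer and omits the setjmp-based error handling you would want in production; the key call is jpeg_mem_dest(), which makes the library write the compressed JPEG into a malloc'd buffer instead of a FILE*.

```c
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>

/* Compress an already-decoded RGB buffer to JPEG entirely in memory.
 * Returns a malloc'd buffer with the JPEG bytes (free() it when done)
 * and writes its length to *out_size. Error handling is omitted. */
unsigned char *compress_rgb_to_jpeg(const unsigned char *rgb, int width, int height,
                                    int quality, int progressive,
                                    unsigned long *out_size)
{
    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;
    unsigned char *out = NULL;
    *out_size = 0;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_compress(&cinfo);

    /* Let the library allocate and grow the output buffer in memory. */
    jpeg_mem_dest(&cinfo, &out, out_size);

    cinfo.image_width = width;
    cinfo.image_height = height;
    cinfo.input_components = 3;        /* packed RGB, 3 bytes per pixel */
    cinfo.in_color_space = JCS_RGB;
    jpeg_set_defaults(&cinfo);
    jpeg_set_quality(&cinfo, quality, TRUE);
    if (progressive)
        jpeg_simple_progression(&cinfo);   /* default progressive scan script */

    jpeg_start_compress(&cinfo, TRUE);
    while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = (JSAMPROW)(rgb + (size_t)cinfo.next_scanline * width * 3);
        jpeg_write_scanlines(&cinfo, &row, 1);
    }
    jpeg_finish_compress(&cinfo);
    jpeg_destroy_compress(&cinfo);

    return out;   /* JPEG bytes; length is in *out_size */
}
```

Returning a plain buffer/length pair like this also turns out to be convenient later when crossing the cgo boundary.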
Once I got the right result in C, I was able to port it to Golang using cgo.
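Just to illustrate the shape of that boundary (the names here are mine, not imager200's actual code): keeping the C signature down to plain types in, and a malloc'd buffer plus its length out, makes the Go side trivial, since it only has to copy the bytes and free the C allocation.

```c
/* Illustrative cgo-facing declaration (hypothetical, not imager200's code).
 * The Go wrapper declares this in its cgo preamble, calls
 * C.compress_rgb_to_jpeg(...), copies the result into a []byte with
 * C.GoBytes(unsafe.Pointer(buf), C.int(size)), and then releases the
 * C allocation with C.free(unsafe.Pointer(buf)). */
unsigned char *compress_rgb_to_jpeg(const unsigned char *rgb, int width, int height,
                                    int quality, int progressive,
                                    unsigned long *out_size);
```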
For now, I am only exposing the quality and progressive parameters to end users. More details in the API docs: https://api-docs.imager200.io/#compresscompress_body_sync
As next steps, I am planning to integrate with optipng to improve PNG compression. I would also like to give more attention to the progressive aspect of JPEGs, maybe make it more configurable, but I don't know how yet.