Mozilla Wants To Take A Byte Out Of JPEGs
Written by Mike James   
Friday, 07 March 2014

JPEG is well known, well used and well understood. Surely there cannot be anything left to squeeze out of this old compression algorithm? Mozilla seems to think that we can get more if we are careful.

A new Mozilla project called "mozjpeg" is trying to create a better JPEG.

Photos generally take a lot of bytes to store, and in most cases the amount of data needed can be reduced by lossy compression. JPEG is the best known lossy compression method for photos and it is used not only on websites but also in digital cameras and other devices.

Better compression would cut load times for most websites, because images account for the bulk of a typical page's download size. 

 


 

Google has invented a completely new compression format, WebP, which it claims to be up to 40% better than JPEG without any additional loss of quality. The big problem is that not all web browsers support WebP, and hence we have a chicken-and-egg problem: websites won't serve it until browsers support it, and browsers have little incentive to support it until websites use it. 

The Mozilla team wondered if it was possible to build a better encoder for the existing JPEG format. After more than 20 years of refinement this doesn't seem very likely. However, after doing some research, the team concluded that there was still room for improvement: the encoder could be tweaked to give the same quality in fewer bytes, without any changes to the decoder.  

The first version of the new mozjpeg is based on the existing libjpeg-turbo library, which uses SIMD instructions to speed up encoding and decoding.

The first improvement to be included depends on that encoding speed, because it optimizes the compression by trying lots of different settings. It is based on jpgcrush, a Perl script that tunes the progressive coding parameters by trial and error to produce the smallest file. Typically this makes files 3-6% smaller for PNGs re-encoded as JPEG, and around 10% smaller on a sample of 1,500 JPEGs stored on Wikimedia. 
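
jpgcrush itself searches over progressive scan scripts, which is more than most image libraries expose, but the brute-force principle is easy to demonstrate. Here is a rough Python sketch, assuming the Pillow library, that tries the lossless encoder options Pillow does expose and keeps the smallest output. At a fixed quality setting the candidates differ only in entropy coding so, as with jpgcrush, they all decode to identical pixels:

# A minimal sketch of the trial-and-error principle, not jpgcrush's
# actual scan-script search. Requires Pillow (pip install Pillow).
import io
from PIL import Image

def smallest_jpeg(img, quality=85):
    """Try several lossless encoder options, return the smallest bytes."""
    candidates = []
    for extra in ({}, {"optimize": True}, {"progressive": True},
                  {"optimize": True, "progressive": True}):
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=quality, **extra)
        candidates.append(buf.getvalue())
    return min(candidates, key=len)

# Hypothetical usage - "photo.png" stands in for any input image:
# jpeg_bytes = smallest_jpeg(Image.open("photo.png").convert("RGB"))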

Interestingly, Google has already been down this road of optimizing compression in the form of the Zopfli project, which searches for the parameter settings that give the smallest DEFLATE (ZIP-compatible) output. The big problem is that trying out all of the possible values takes time. However, compression is a once-only task, as opposed to decompression, which happens over and over again, and the extra encoding effort doesn't change the time it takes to decompress the file. 
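
The asymmetry is easy to see with Python's standard zlib module: raising the compression level costs encoding time once, while decompression time barely moves. This is what makes a Zopfli-style exhaustive search worthwhile for a file that will be downloaded and decoded many times:

import time
import zlib

data = b"the quick brown fox jumps over the lazy dog " * 20000

for level in (1, 6, 9):
    t0 = time.perf_counter()
    packed = zlib.compress(data, level)
    t_enc = time.perf_counter() - t0

    t0 = time.perf_counter()
    zlib.decompress(packed)
    t_dec = time.perf_counter() - t0

    print(f"level {level}: {len(packed):7d} bytes, "
          f"encode {t_enc * 1000:.1f} ms, decode {t_dec * 1000:.1f} ms")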

A 3-10% size reduction may not seem like much, but the next step is to investigate further improvements such as trellis quantization. This optimizes the way the coefficients of the DCT used in JPEG are quantized, so as to give the lowest distortion for a given number of bits. 
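
The idea behind trellis quantization is to pick, for each coefficient, among a few candidate quantized levels so as to minimise distortion plus lambda times an estimate of the bits needed. The toy sketch below is a deliberate simplification, not mozjpeg's algorithm: it scores each coefficient of a block independently with a crude bit-cost proxy, whereas a real trellis also models the run-length coding interactions between coefficients:

import numpy as np

def rd_quantize(coeffs, qtable, lam=1.0):
    """Pick each quantized level to minimise distortion + lam * rate."""
    nearest = np.round(coeffs / qtable).astype(int)
    out = np.zeros_like(nearest)
    for idx in np.ndindex(coeffs.shape):
        n = int(nearest[idx])
        # candidates: the nearest level, one step toward zero, and zero
        best_cost = None
        for q in {n, n - int(np.sign(n)), 0}:
            dist = float(coeffs[idx] - q * qtable[idx]) ** 2
            rate = 0 if q == 0 else abs(q).bit_length() + 1  # crude bits proxy
            cost = dist + lam * rate
            if best_cost is None or cost < best_cost:
                best_cost, out[idx] = cost, q
    return out

# Hypothetical usage on one 8x8 block with a flat quantization table:
# levels = rd_quantize(np.random.randn(8, 8) * 50, np.full((8, 8), 16.0), lam=4.0)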

It should also be noted that all of these optimizations apply to the open source encoder. It might well be that other encoders, Adobe's for example, already use techniques such as trellis quantization to produce smaller files.

A more important point is that a 10% gain in file size may be a good thing, but users generally don't apply the maximum compression they could get away with. Most simply save a JPEG at whatever quality setting happens to be selected - often left there by the previous user - so the result is frequently very little compression at all. Even choices such as PNG versus JPEG, or lossy versus lossless, are routinely ignored. 

A more effective way to reduce file sizes in practice would be a more intelligent way of setting the compression parameters to trade off size against quality. Some graphics packages, GIMP for example, do make a point of showing the user a preview with an option to change the compression, but an approach that did the job for them automatically would be so much better. 
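
To give a flavour of what "doing the job for them" might look like, here is a hedged sketch, again assuming Pillow, that binary-searches the quality setting so that the file just fits a byte budget, rather than asking the user to pick a percentage:

import io
from PIL import Image

# Hypothetical helper, not part of mozjpeg: find the highest JPEG
# quality that keeps the encoded image under a byte budget.
def fit_to_budget(img, max_bytes):
    lo, hi, best = 10, 95, None     # a sensible quality search range
    while lo <= hi:
        q = (lo + hi) // 2
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=q, optimize=True)
        data = buf.getvalue()
        if len(data) <= max_bytes:
            best, lo = data, q + 1  # fits - try for higher quality
        else:
            hi = q - 1              # too big - lower the quality
    return best                     # None if even quality 10 is too big

File size isn't perfectly monotonic in the quality setting, but it is close enough for a budget search like this to work well in practice.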

 


More Information

Introducing the ‘mozjpeg’ Project

mozjpeg on GitHub

Related Articles

Google Zopfli - Nice Compression Shame About The Speed

Data compression the dictionary way

Coding Theory

Network Coding Speeds Up Wireless by 1000%

Information Theory

Zip for the Genome - G-SQueeZ

Fractal Image Compression 

 


 
