This is a guest post courtesy of Jonathan Klein. The post originally appeared on Jonathan’s Blog on Feb 20th, 2013.
WebP (pronounced “weppy”) is an ambitious and promising new image format that was created by a team at Google back in September of 2010, but browser adoption has been slow. At the time of this writing the only major browsers that support WebP are Chrome and Opera, even though support in native applications is strong (albeit with plugins in many cases). In this article I want to present an unbiased, holistic view of the WebP landscape as it stands today. My end goal is to further the conversation about alternate image formats and image compression on the web, because I believe it is one of the biggest opportunities for making web browsing faster across all devices and networks.
The description that Google provides for WebP is concise and explicit:
WebP is a new image format that provides lossless and lossy compression for images on the web. WebP lossless images are 26% smaller in size compared to PNGs. WebP lossy images are 25-34% smaller in size compared to JPEG images at equivalent SSIM index. WebP supports lossless transparency (also known as alpha channel) with just 22% additional bytes. Transparency is also supported with lossy compression and typically provides 3x smaller file sizes compared to PNG when lossy compression is acceptable for the red/green/blue color channels.
The link above also has more information about how WebP achieves those smaller image sizes, as well as the container and lossless bitstream specifications. There have been a number of posts about real world situations where WebP has succeeded, a few of which are linked below.
- Torbit saw a ~54% drop in file size from using WebP
- Google reduced image size in the Chrome Web Store by ~30% with WebP
- Another test showing a 40% improvement on average
- Opera Turbo saw a 22% reduction in data transfer and a 260% speed improvement by using WebP
The benefits of WebP are compelling:

- Smaller images – this one is obvious, and was explained in detail above.
- It can replace both PNGs and JPEGs, so we could use one image format for most web content.
- It is royalty-free, open source, and anyone can contribute to or implement the spec.
- The project is backed by Google, and thus has a lot of resources behind it.
There are also some real drawbacks and criticisms:

- Google’s original study used JPEGs as the source images and looked at peak signal-to-noise ratio (PSNR) as a comparison metric, which is “accepted as a poor measure of visual quality”. Google did a second study using Structural Similarity (SSIM) vs. bits per pixel (bpp) that showed results similar to the first, which makes this concern moot.
- WebP lacks support for key image features, specifically EXIF data, ICC color profiles, alpha channel, and 4:4:4:4 YCrCb color representation. As detailed above, ICC color profiles and alpha transparency were added in a later version of WebP. I can’t speak to EXIF data or 4:4:4:4 YCrCb color, but if you know more about this please let me know and I’ll update this point. UPDATE: Vikas Arora from the WebP team let me know that WebP now supports EXIF and XMP metadata as well.
- There is a cost associated with new image formats – developers have to decide which one to use and clients need to support it forever. There are third party vendors (like Torbit) and open source modules (mod_pagespeed/ngx_pagespeed) that will automate this for you, so developers no longer have to do the work themselves.
- The progressive decoding of WebP requires using a separate library, instead of reusing the existing WebM decoder.
- During the transition period when support is partial, you have to have duplicate images on your server, creating both JPEGs and WebP images. This eats up storage space, and can make image retrieval slower (for example, if your working set used to fit in memory, and after doubling it you need to go to disk for some images).
- CDNs and intermediate proxies have to cache both versions of the image, reducing cache hit ratio.
- Currently the lack of context in Accept headers means that all WebP responses are marked Cache-Control: private so the origin can do user agent detection (more on this below; see the sketch just after this list).
- Rolling out new image formats is hard.
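To make the last few drawbacks concrete, here is a minimal sketch of the server-side negotiation involved, written as a Node/TypeScript handler with hypothetical file paths and port. Because the response body depends on a request header, the server must either emit Vary: Accept (which splits cache entries and lowers intermediary hit ratios) or fall back to user agent sniffing with Cache-Control: private:

```typescript
import * as http from "http";
import * as fs from "fs";

// Minimal content-negotiation sketch (hypothetical paths and port).
// In practice this logic usually lives in a CDN, mod_pagespeed, or a
// third-party service rather than hand-rolled application code.
http.createServer((req, res) => {
  const accept = req.headers["accept"] ?? "";
  const wantsWebP = accept.includes("image/webp");
  const file = wantsWebP ? "./img/product.webp" : "./img/product.jpg";

  res.writeHead(200, {
    "Content-Type": wantsWebP ? "image/webp" : "image/jpeg",
    // Tell caches the body varies by Accept; with clients that don't
    // advertise image/webp you are stuck with user agent sniffing and
    // Cache-Control: private instead.
    "Vary": "Accept",
  });
  fs.createReadStream(file).pipe(res);
}).listen(8080);
```

Note how this forces the duplicate-image problem described above: both product.webp and product.jpg must exist on disk (or in cache) for every image on the site.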
Jason from the IE team explains:

There are many dimensions to consider when thinking about the performance implications of image formats. Most people gravitate toward quality (subjective) and compressed size (performance). Both are interesting, but the elapsed time required to download the image is only part of the problem. After the image arrives on the local system there are several more steps before it's seen on the screen. The exact steps depend on the browser, operating system, and hardware, but the basic pipeline looks something like this:
- System memory is allocated and the encoded image is stored in this memory.
- Additional system memory is allocated, and the image is decoded at native resolution into this intermediary.
- The native image may need to be converted into another intermediary format.
- The appropriate intermediary is copied to the GPU (on hardware accelerated scenarios).
- Finally, the image is drawn to the screen.
On mobile-caliber devices it's not uncommon to see these operations take several hundred milliseconds. Some browsers attempt to do this work in parallel, so it may not directly block execution on the primary thread, but the elapsed time of this process impacts what are often referred to as TTG (time to glass) metrics.
To improve TTG on the client you need to streamline this process. GPUs are massively parallel, highly specialized silicon that can perform graphics operations much more efficiently than general-purpose CPU silicon; some operations run 1000x faster on the GPU. Traditional image formats like JPEG have broad hardware support today, which speeds up this process, and those opportunities don't exist for WebP.
For WebP to make the web faster, the time saved downloading the images (from fewer bytes) needs to be greater than the additional cost of getting the image to the screen (TTG). That's not the case in the real world today. And over time, other image formats, including JPEG, have more headroom to improve. Modern hardware is amazing!
This is a point that few people talk about, and it's great that Jason was willing to explain it. If you spend more incremental time getting an image actually displayed on the screen than you save by downloading fewer bytes, that image has actually made the web slower.
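As an aside, if you want a rough feel for the decode step on your own hardware, modern browsers (via an API that postdates this post) expose img.decode(), which resolves once the image bytes have been decoded. A minimal probe is below, with a hypothetical image URL; note that it still excludes the GPU upload and paint steps, so true time to glass is larger than what it reports:

```typescript
// Rough probe of the decode step (step 2 of the pipeline above).
// Browsers differ on when they decode (some do it eagerly, some on
// first paint), so treat the result as an approximation.
async function measureDecode(url: string): Promise<number> {
  const img = new Image();
  img.src = url;
  await new Promise<void>((resolve, reject) => {
    img.onload = () => resolve(); // bytes fetched; decode may still be pending
    img.onerror = reject;
  });
  const start = performance.now();
  await img.decode();             // forces/awaits decoding of the image data
  return performance.now() - start;
}

measureDecode("/images/product.webp") // hypothetical URL
  .then((ms) => console.log(`decode took ~${ms.toFixed(1)} ms`));
```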
I corresponded with Ilya Grigorik about this issue, and he said that while WebP decoding is about 1.4x slower than JPEG decoding, this difference is largely overwhelmed by the speed improvement that comes from sending fewer bytes across the wire. Some additional comments from him are below.
First off, to give the IE team credit, I do think they are ahead of most other browser vendors when it comes to their GPU + mobile browser story — in fact, probably significantly so. I think Chrome and FF will close this gap in the not so distant future, but credit where credit’s due. Now, having said that, I don’t know the specific details of Jason’s test, but my guess is, he’s testing image intensive apps, with good network connectivity (3.9/4G), on latest Nokia hardware, and with minimum/no concern for bandwidth caps… That’s fair, but that’s not representative.
Most users are stuck on non HSPA+ connections, with middle of the road hardware, and are using an older (year or more) WebKit build. This profile, compared to what he’s [likely] analyzing, is night and day. In fact, in markets where mobile browsing is already dominant (India), bandwidth is the #1 concern – bits are expensive, literally. Hence, I take any argument about GPU performance with a grain of salt. Yes, it is definitely an important metric, and one that all of us will likely pay much closer attention to in the future, but it is not the limiting factor today for the vast majority of the population.
This is where WebP helps and why we’re investing all the effort into it… Not to say that WebP is the answer to all things. It just happens to be the best option at this very moment. I hope we have more image formats, with even better compression and better “time to glass” in the future! These requirements are not entirely exclusive.
There is an important distinction to mention here: when we are talking about hardware-accelerated browsers where image decoding happens on the GPU, lack of hardware support for WebP can cause a significant increase in TTG. In browsers that decode images on the CPU, the time difference is much smaller (the ~1.4x that Ilya mentioned). In the former case, it is likely that the increased TTG overwhelms the decrease in download time from WebP in at least some, and possibly most, real-world cases. In the latter case, the decrease in download time likely dominates the difference in decoding time in at least some, and possibly most, cases. We clearly need more hard data to be published in this area to further inform this debate, and I will update this post as studies are published.
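To see why both outcomes are plausible, a back-of-the-envelope break-even calculation helps. The sketch below uses illustrative numbers that are not from any study: a 100 KB JPEG, a WebP ~30% smaller (in line with Google's figures above), an assumed 50 ms JPEG decode on mid-range mobile hardware, and the ~1.4x decode slowdown Ilya cited:

```typescript
// Back-of-the-envelope break-even: download time saved vs. extra decode
// time. All constants are illustrative assumptions, not measurements.
const jpegKB = 100;
const webpKB = jpegKB * 0.7;             // ~30% smaller, per Google's figures
const jpegDecodeMs = 50;                 // assumed mid-range mobile baseline
const webpDecodeMs = jpegDecodeMs * 1.4; // ~1.4x slower, per Ilya

function netSavingMs(bandwidthKBps: number): number {
  const downloadSavingMs = ((jpegKB - webpKB) / bandwidthKBps) * 1000;
  const extraDecodeMs = webpDecodeMs - jpegDecodeMs;
  return downloadSavingMs - extraDecodeMs; // positive => WebP is a net win
}

console.log(netSavingMs(125));  // ~1 Mbps link:  +220 ms, WebP wins easily
console.log(netSavingMs(5000)); // ~40 Mbps link:  -14 ms, decode cost dominates
```

On slow links the byte savings dominate; on fast links paired with slow decoding the picture can invert, which is exactly the split between Ilya's and Jason's positions.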
I’ll also reiterate that if you take speed out of the equation and just talk about the cost of the bytes on the wire (due to bandwidth caps), WebP provides significant and unambiguous benefits.
WebP also provides a tremendous amount of value on mobile devices with bandwidth caps. As I said above, 60% of bytes served on the Web come from images, and for many mobile sites/apps that number is going to be much higher (if images are ~60% of your bytes and WebP trims them by ~30%, total transfer drops by roughly 18%). Take any mobile shopping app for example: once you have the app on your phone, you only have to download a few bytes of JSON data, plus images for all of the products you are looking at. WebP is a huge win here. Even if the increased decoding time offsets some of the improvement in download time, people are still saving money on bandwidth usage.
In addition to talking to Jason and Ilya, I’ve reached out to Mozilla to get more opinions about why adoption has been slow, and I will update this post as I get more feedback. If you have more examples of WebP studies, additional benefits/drawbacks, or information about the political and technical reasons that are blocking adoption, please post in the comments. If you prefer to contact me anonymously, you can email me directly at jonathan.n.klein@gmail.com.
Improving the web is difficult, but in the words of Ilya Grigorik, “some uphill battles are worth fighting – I think this is a good one.”