Friday, July 18, 2014

Google WebP

According to Google, images and photos now account for about 65% of the content transferred over the web. Most photos on the Internet are encoded in the JPEG format, which is quite old (it dates back to 1992). Google has therefore developed its own image-compression tool: Google WebP. According to Google, WebP files are on average 39.8% smaller than comparable JPEGs. The format limits maximum image dimensions to 16,383 x 16,383 pixels.
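That 16,383-pixel cap comes from VP8 storing each dimension in a 14-bit field. A minimal sketch of checking an image against the limit (the function name `fits_webp` is my own, not part of any WebP tool):

```python
# WebP (VP8) stores width and height in 14-bit fields, so each
# dimension is capped at 2**14 - 1 = 16383 pixels.
WEBP_MAX_DIM = (1 << 14) - 1  # 16383

def fits_webp(width: int, height: int) -> bool:
    """Return True if an image of the given size can be stored as WebP."""
    return 0 < width <= WEBP_MAX_DIM and 0 < height <= WEBP_MAX_DIM

print(fits_webp(1920, 1080))   # True
print(fits_webp(20000, 500))   # False: width exceeds 16383
```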
Google has posted side-by-side comparisons of JPEG and WebP images; however, because browsers (including Google Chrome) cannot yet display WebP images, the samples are served re-encoded as PNG. A WebP file consists of VP8 image data wrapped in a RIFF-based container.
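The RIFF wrapper mentioned above is just a 12-byte header ("RIFF", a little-endian file size, and the "WEBP" tag) followed by a "VP8 " chunk holding the compressed data. A rough sketch of building and reading that container (the helper names are mine, and the payload here is dummy bytes, not real VP8 data):

```python
import struct

def build_webp_container(vp8_payload: bytes) -> bytes:
    """Wrap raw chunk data in a minimal RIFF/WEBP container (illustrative only)."""
    chunk = b"VP8 " + struct.pack("<I", len(vp8_payload)) + vp8_payload
    if len(vp8_payload) % 2:       # RIFF chunks are padded to an even length
        chunk += b"\x00"
    riff_size = 4 + len(chunk)     # "WEBP" tag plus the VP8 chunk
    return b"RIFF" + struct.pack("<I", riff_size) + b"WEBP" + chunk

def parse_webp_header(data: bytes):
    """Extract the three container fields from the first 12 bytes of a WebP file."""
    riff, size, fourcc = struct.unpack("<4sI4s", data[:12])
    return riff, size, fourcc

blob = build_webp_container(b"\x00" * 10)
print(parse_webp_header(blob))  # (b'RIFF', 22, b'WEBP')
```

The RIFF size field counts everything after the first 8 bytes, which is why a 10-byte payload yields 22 (4 for "WEBP" + 8 for the chunk header + 10 of data).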

WebP is a method of lossy compression that can be used on photographic images. WebP offers compression that has shown 39.8% more byte-size efficiency than JPEG for the same quality in a large scale study of 900,000 images on the Web. The degree of compression is adjustable so a user can choose the trade-off between file size and image quality.
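To make the 39.8% figure concrete, here is the arithmetic for estimating what the study's average savings would mean for a single file (a back-of-the-envelope sketch; real savings vary per image):

```python
# Headline number from Google's 900,000-image study: WebP output is
# on average 39.8% smaller than JPEG at comparable quality.
AVG_SAVINGS = 0.398

def estimated_webp_size(jpeg_bytes: int) -> int:
    """Estimated WebP size for a JPEG of the given size, per the 39.8% average."""
    return round(jpeg_bytes * (1 - AVG_SAVINGS))

print(estimated_webp_size(500_000))  # 301000 -- a 500 KB JPEG shrinks to ~301 KB
```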
[Images: Google WebP vs. JPEG comparison screenshots 1-3]
You can find more information on the Google WebP page and download the conversion tool from Google Code. At the moment the tool is available only for Linux x86 (64-bit); a Windows build is expected soon.
Not to forget, there is a similar tool from Yahoo! called Smush.it, along with a WordPress plugin for Smush.it that compresses your images and helps your WordPress blog load faster.
If you use lots of images on your blog, this tool will come in very handy, since smaller images mean faster page loads.
Do let us know: do you optimize the images on your blog by compressing them, or do you post them without any optimization?

1 comment:

  1. Image compression works by reducing the amount of visible information. The original picture might be able to display 65 million colors, but since the eye cannot differentiate anywhere near that many, the software goes through and removes colors that are not needed. The actual algorithms are far more complex than that, and beyond the scope of this response. But the threshold (assuming I understand your context correctly) has to do with how the algorithm decides what information to throw away and what to keep, and thus affects the resulting quality and size of the image after compression.
