How reliable is HTTP compression with gzip?

YSlow suggested using HTTP compression to improve the performance of my site. However, as Yahoo notes, there are some problems:

There are known issues with browsers and proxies that can cause a mismatch in what the browser expects and what it receives regarding compressed content. Fortunately, these edge cases are shrinking as the use of older browsers goes away. Apache modules help by adding suitable Vary response headers automatically.
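The mismatch described above comes down to whether the server respects the client's `Accept-Encoding` request header and advertises that dependency to caches via `Vary`. A minimal sketch of that negotiation in Python (a hypothetical helper, not taken from any framework):

```python
import gzip

def negotiate_encoding(body: bytes, request_headers: dict) -> tuple[bytes, dict]:
    """Return (payload, response headers), compressing only when the
    client advertised gzip support in Accept-Encoding."""
    # Vary tells intermediate caches that the response depends on
    # Accept-Encoding, so a compressed copy is never served to a
    # client that did not ask for one.
    response_headers = {"Vary": "Accept-Encoding"}
    accepts = request_headers.get("Accept-Encoding", "").lower()
    if "gzip" in accepts:
        response_headers["Content-Encoding"] = "gzip"
        return gzip.compress(body), response_headers
    return body, response_headers
```

A proxy that strips or ignores `Vary` is exactly the failure mode the quote warns about: the compressed variant gets cached and replayed to a client that never sent `Accept-Encoding: gzip`.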

I understand that the most common problem occurs with IE6 behind a proxy. But how common are these problems today? To quantify it: what percentage of web users experience HTTP compression errors?

2 answers

According to the recommended configuration example in the Apache mod_deflate documentation, the only user agents that lack proper support are:

[...] Netscape Navigator version 4.x. These versions cannot handle compression of types other than text/html. Versions 4.06, 4.07 and 4.08 also have problems with decompressing HTML files. Thus, we completely turn off the deflate filter for them.
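The configuration those docs describe looks roughly like this (a sketch based on the classic mod_deflate example; exact directives may differ between Apache versions):

```apache
# Compress everything by default
SetOutputFilter DEFLATE

# Netscape 4.x can only handle compressed text/html
BrowserMatch ^Mozilla/4 gzip-only-text/html

# Netscape 4.06-4.08 cannot handle compression at all
BrowserMatch ^Mozilla/4\.0[678] no-gzip

# IE also identifies as "Mozilla/4", but really does support compression
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

# Make proxies cache separate variants per User-Agent
Header append Vary User-Agent
```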

Every other browser, and in particular every modern browser, should support compression properly.


While I do not have statistics about which clients can or cannot handle compression, I think it is worth noting that as of IIS7, HTTP compression for static content is enabled by default, which at least shows how safe Microsoft considers it. Compression of dynamic content is still off by default, but that is a question of CPU cost rather than client compatibility.

Some IIS7-specific figures can be found here, although I suspect you will find similar performance characteristics for other web servers.

I opt to enable compression on every site, and I have not had any problems reported yet.

