RE: Support for compression in XHR?

Dom,
One more point on this thread. In most cases we do see an advantage in compressing HTML and XHTML web pages using GZIP/deflate in our network proxies, and since the compression is applied on a per-HTTP-packet basis, the browser does not have to wait for the whole page before decompressing (it has to decompress each packet individually anyway, since they are compressed as discrete transfer units).
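
As a rough illustration of what the proxy does, here is a minimal Python sketch using the standard zlib module (the chunking and page content are made up for illustration, not taken from our deployment):

  import zlib

  def compress_chunks(chunks):
      # Compress each transfer unit as it passes through, flushing after
      # every chunk so the receiver can decompress that piece at once
      # instead of waiting for the rest of the document.
      comp = zlib.compressobj()
      for chunk in chunks:
          # Z_SYNC_FLUSH forces all pending output onto a byte boundary,
          # so each compressed piece is immediately usable downstream.
          yield comp.compress(chunk) + comp.flush(zlib.Z_SYNC_FLUSH)
      yield comp.flush()  # terminate the deflate stream

  # Hypothetical "packets" of a page, forwarded one by one
  packets = [b"<html><head>...</head>", b"<body><p>part 1</p>",
             b"<p>part 2</p></body></html>"]
  for piece in compress_chunks(packets):
      print(len(piece), "compressed bytes ready to forward")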

Only if the web server compressed the content itself, as a whole document, and then sent it over multiple HTTP CONTINUATION packets, would the browser need to receive the whole page before decompressing. But that is not the normal behavior of the web servers we see in our network. Most large pages are received in plain-text form as a series of HTTP CONTINUATION packets from the web server, and are compressed by the network proxy before being forwarded to the browser in HTTP CONTINUATION packets.

So browsers that do progressive rendering are not inhibited from doing so, if the HTTP compression is applied on a per-packet basis.
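
On the receiving side, the decompression can be just as incremental; a hedged sketch, pairing with the compression sketch above (the parser hook is hypothetical):

  import zlib

  def decompress_incrementally(compressed_pieces):
      # Feed each compressed piece to a streaming decompressor as it
      # arrives; whatever markup is recoverable so far comes out right
      # away, which is what lets progressive rendering start early.
      decomp = zlib.decompressobj()
      for piece in compressed_pieces:
          text = decomp.decompress(piece)
          if text:
              yield text  # hand partial markup to the parser immediately
      yield decomp.flush()

  # Used together with compress_chunks above:
  #   for markup in decompress_incrementally(compress_chunks(packets)):
  #       parse_and_render(markup)   # hypothetical parser hook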

Best regards,
Bryan Sullivan | AT&T
-----Original Message-----
From: Dominique Hazael-Massieux [mailto:dom@w3.org] 
Sent: Thursday, September 11, 2008 12:24 AM
To: Sullivan, Bryan
Cc: David Storey; public-bpwg
Subject: RE: Support for compression in XHR?

On Wednesday, 10 September 2008 at 15:22 -0700, Sullivan, Bryan wrote:
> OK, given that there are some varying opinions (on technical grounds)
> about the relative value of HTML and XHTML markup, and that this is
> complicated by the reality of mobile browser implementations, there is
> still the practical limitation that a page so large as to require
> progressive rendering probably would not result in a good user
> experience anyway.

I think that's a bit too strong a statement: on a GPRS network, where you
typically get ~3 kB/s of bandwidth, a 10 kB page (which I don't think can
be argued to be very large, especially in the context of a mobile web
application) will take about 3 s to download - having the first part of
the page shown after 1 s instead of 3 certainly improves the user
experience.
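
For concreteness, the same back-of-the-envelope arithmetic in code form (the figures are the assumptions above, not measurements):

  BANDWIDTH = 3.0   # kB/s, typical GPRS throughput assumed above
  PAGE_SIZE = 10.0  # kB, uncompressed page

  full_page = PAGE_SIZE / BANDWIDTH   # ~3.3 s before the whole page arrives
  first_part = 3.0 / BANDWIDTH        # ~1 s for the first ~3 kB of markup
  print("whole page after %.1f s, first content after roughly %.1f s"
        % (full_page, first_part))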

Taking the example of a larger page (30 kB), which I don't think is
extraordinary either for a mobile web application, we can see that the
decision whether or not to compress (assuming a compressed size of
~10 kB) is not trivial - a rough calculation follows the list:
 * on a GPRS network, 30 kB would take about 10 s to download, while
downloading 10 kB would take only about 3 s; that said, the uncompressed
page would start showing almost immediately, while the compressed page (I
believe) wouldn't show anything before those 3 s are complete
 * on a Wifi network, 30 kB would take hardly more time than the
compressed version, and decompressing 10 kB would add ~1/3 s to the
display of the page on a mobile device with limited CPU - so compression
is not likely to be beneficial
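
The calculation behind those two bullets, as a small sketch (the bandwidth figures, the compression ratio and the ~1/3 s decompression cost are all assumptions, not measurements):

  def download_time(size_kb, bandwidth_kbps):
      return size_kb / bandwidth_kbps

  UNCOMPRESSED = 30.0    # kB
  COMPRESSED = 10.0      # kB, assumed compression ratio
  DECOMPRESS_CPU = 0.33  # s, assumed cost on a CPU-limited handset

  for network, bandwidth in (("GPRS", 3.0), ("Wifi", 500.0)):
      plain = download_time(UNCOMPRESSED, bandwidth)
      gzipped = download_time(COMPRESSED, bandwidth) + DECOMPRESS_CPU
      print("%s: uncompressed %.1f s (rendering can start early), "
            "compressed %.1f s before anything shows"
            % (network, plain, gzipped))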

My current thinking is that compressing CSS and JavaScript files above
2 kB is almost always beneficial (since they are not affected by
progressive rendering), but I'm not sure that recommending compression
for HTML/XHTML is necessarily a good idea.
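
That rule of thumb could be implemented in a proxy or origin server with a check along these lines (the 2 kB threshold and MIME types are just the ones above; the function and its name are purely illustrative):

  ALWAYS_COMPRESS = {"text/css", "application/javascript", "text/javascript"}
  MIN_SIZE = 2 * 1024  # the 2 kB threshold suggested above

  def should_compress(content_type, size_bytes):
      # CSS/JS above 2 kB: compress; HTML/XHTML: no blanket rule, since
      # whole-document compression can delay progressive rendering.
      if content_type in ALWAYS_COMPRESS:
          return size_bytes >= MIN_SIZE
      return False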

Dom

Received on Tuesday, 23 September 2008 05:35:54 UTC