There are 20 comments (sorted by type and by the section they are about).
Overall, I'd say that the document quality is still marginal at best; it uses a lot of imprecise terminology and muddies the waters more than it clarifies things. Many (if not most) of its requirements aren't testable.
My concern is that Recommending this document will cause more harm than good to the Web overall (even if it does represent a consensus of sort among a more limited community).
> "Is the current version of the document good enough to ship?"
No, it is not.
- it is not good enough for mobile developers who desperately need a
solid platform to build their apps (and not a spec that can be leveraged
by someone somewhere to justify messing with HTTP in unforeseeable ways)
- it is not good enough for content owners (who don't want to have their
content messed with)
- it is not good enough for telcos which want to deploy transcoders
(there are no clear rules about what they should and should not do to
avoid liability when they transcode other companies' content)
The only people CTG is good for at the moment are transcoder
vendors, who can refer to a document which is ambiguous enough that they
can read whatever they want into it and wrap their products in a
shroud of W3C legitimacy.
* 4.1.1 "Proxies should not intervene in methods other than GET, POST, HEAD." "Intervene" is vague here; does it mean that they are not allowed to change the requests, but are allowed to change the responses to them? Or that they are allowed to transform neither the request nor the response?
* 4.1.4 "so should notify the user that this is the case" -- how? Using a Warning header? Are they required to populate the Age header, so that the user can calculate whether it's stale themselves?
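One concrete way to answer the question above would be the standard HTTP/1.1 caching machinery: populate Age, and attach a Warning 110 when the response is stale. A minimal sketch of that calculation (simplified from RFC 2616; the header values are illustrative, and real freshness computation also involves Date and Expires):

```python
def is_stale(age, max_age):
    """Return True if a cached response's current age exceeds its
    freshness lifetime (simplified RFC 2616 calculation)."""
    return age > max_age

def staleness_warning(response_headers):
    """Sketch: decide whether a proxy serving this cached response
    should attach a 'Warning: 110' header. Assumes Age and a
    Cache-Control: max-age directive are present; other freshness
    sources (Expires, heuristics) are omitted for brevity."""
    age = int(response_headers.get("Age", "0"))
    cache_control = response_headers.get("Cache-Control", "")
    max_age = 0
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
    if is_stale(age, max_age):
        return 'Warning: 110 - "Response is stale"'
    return None
```

With both Age and the warning present, the user agent (or user) can judge staleness itself, which is what the comment asks the guidelines to pin down.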
* 4.1.5 "the request is part of a sequence of requests to the same Web site and either it is technically infeasible not to adjust the request because of earlier interaction, or because doing so preserves consistency of user experience." This seems like a hole that a proxy vendor can drive a truck through... are you serious?
* 188.8.131.52 "The theoretical idempotency of GET requests is not always respected by servers. In order, as far as possible, to avoid misoperation of such content, proxies should avoid issuing duplicate requests and specifically should not issue duplicate requests for comparison purposes."
Existing proxies can and do already retry GETs; I'm not sure who you're trying to protect here.
Francois Daoust wrote:
> For clarification, the guidelines do not build on the assumption that
> GET is not safe.
> The mechanism described by Luca is actually recommended by the
> guidelines: send a GET with original headers, then send a request with
> modified headers if the first response is a "request unacceptable"
Francois, this is not what I meant. What I meant is "content tasting".
Proxies should send a GET with original headers and if they get a
response (which they probably will), they should smell the response and
figure out whether that content may be good enough for mobile (and err
on the side of assuming it is). If the content is likely to be OK for a
mobile device, no transcoding should take place at all.
This is explicitly ruled out by 184.108.40.206:
"The theoretical idempotency of GET requests is not always respected by
servers. In order, as far as possible, to avoid misoperation of such
content, proxies *should* avoid issuing duplicate requests and
specifically *should not* issue duplicate requests for comparison purposes."
There was no reason to add this part, except, as I mentioned in my first
message, to help Novarra, whose transcoder does not behave this way.
- It allows retrying a POST request upon a 406, for example, even though this isn't allowed in HTTP.
- It effectively allows proxies to ignore no-transform, if they really want to.
- It blurs the semantics of a 200 response based upon its content.
* 4.2.7 Link to "handheld" representation -- you're requiring proxies to "process" (whatever that means) handheld links, even if the client isn't handheld?
A proposal to amend the CTG with the objective of avoiding deleterious interference
by transformation proxies with certain non-browsing applications.
Developers are deploying applications that go beyond traditional browsing, by taking
advantage of powerful devices and advanced user agents.
The cluster of technologies identified as AJAX (asynchronous JavaScript, JSON, XMLHttpRequest) has
already established itself in the mobile world. Web Services (SOAP, WSDL) is another
one that, while still in its infancy regarding mobile phones, is already available
on laptops with wireless connections.
The W3C acknowledges the importance of emerging applications based on such
technologies for the mobile world, notably with respect to AJAX in its "Mobile Web
Applications Best Practices" (currently under review).
Section 4.1.3 of the CTG warns about potentially serious problems when content
transformation proxies alter HTTP transactions making up the communication flow
between non-traditional browsing clients and servers. However, the CTG does not
provide any guidance on how to avoid such misoperations.
In the field, application developers have been facing aggressively configured CT
proxies that interfere with AJAX communications -- on the basis that the content
transmitted over HTTP does not fit into pre-defined categories of "mobile browsing",
is therefore viewed as "desktop content", and is then thoroughly garbled by the proxy.
The following text is included in the normative part of the document:
"A content transformation proxy MUST handle HTTP requests from a terminal, and
corresponding responses to them, transparently whenever the HTTP transaction
conveys a payload advertised as one of the following MIME types: application/json,
application/xml, text/xml, application/soap+xml, application/fastinfoset,
application/soap+fastinfoset, or application/fastsoap.
These MIME types distinguish traditional browsing transactions from AJAX
communications and messages in Web Services."
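A proxy implementing this requirement might simply gate transformation on the advertised Content-Type. A minimal sketch, using the MIME types enumerated in the rationale below (parameter handling is simplified):

```python
# MIME types marking a transaction as non-browsing (AJAX / Web Services),
# taken from the proposal's list; the proxy must pass these through.
PASS_THROUGH_TYPES = frozenset({
    "application/json",
    "application/xml",
    "text/xml",
    "application/soap+xml",
    "application/fastinfoset",
    "application/soap+fastinfoset",
    "application/fastsoap",
})

def must_pass_through(content_type_header):
    """True if the advertised media type means the proxy must handle the
    transaction transparently. Parameters such as charset are ignored
    when matching; comparison is case-insensitive."""
    media_type = content_type_header.split(";")[0].strip().lower()
    return media_type in PASS_THROUGH_TYPES
```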
a) Compliance with standards
The listed MIME types are specified by the IETF or the ITU-T:
application/json in RFC4627;
application/xml and text/xml in RFC3023;
application/soap+xml in RFC3902;
application/fastinfoset in ITU-T Rec. X.891 | ISO/IEC 24824-1;
application/soap+fastinfoset and application/fastsoap in ITU-T Rec. X.892 | ISO/IEC 24824-2.
All are registered at IANA (see http://www.iana.org/assignments/media-types).
b) Application scope
The listed MIME types are used exclusively for non-traditional browsing applications:
application/json is exclusively associated with AJAX, while application/soap+xml and
application/soap+fastinfoset are exclusively associated with Web Services applications.
The type application/soap+xml is recommended by the W3C for marshalling messages
between Web Service entities:
SOAP Version 1.2 Part 1: Messaging Framework (Second Edition)
W3C Recommendation 27 April 2007
The W3C further mandates support for this MIME type in:
SOAP Version 1.2 Part 2: Adjuncts (Second Edition)
W3C Recommendation 27 April 2007
MIME types application/xml and text/xml are preferred by the W3C for information
exchange during an AJAX session in its on-going standardization of XMLHttpRequest:
XMLHttpRequest
W3C Working Draft 20 August 2009
XMLHttpRequest Level 2
W3C Working Draft 20 August 2009
These two MIME types are also those that application developers should or even must
use, according to the documentation of several manufacturers of client software.
c) Overlap with browsing
The listed MIME types are neither used, nor recommended for traditional browsing;
hence, there is no ambiguity as to the non-applicability of transformations on HTTP
transactions that deal with content of those types.
An alternative is to insert a "no-transform" directive in the HTTP transactions of
non-traditional browsing applications. This is however not always possible because
the AJAX or SOAP modules may be compiled packages that cannot be configured or
modified by the developer (whether in the terminal user agent or on the server Web
platform), or that are not under the control of the developer (terminal: configuration
only possible manually by users themselves, or only by the operator; server: platform
under the control of the ISP in a shared hosting environment).
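Where the developer does control the server side, attaching the directive is straightforward. A hedged sketch of a helper that adds the directive to a response's header list without clobbering existing Cache-Control directives (the header-list representation is an illustrative assumption):

```python
def add_no_transform(headers):
    """Append Cache-Control: no-transform to a list of (name, value)
    response header pairs, preserving any existing Cache-Control
    directives rather than overwriting them."""
    for i, (name, value) in enumerate(headers):
        if name.lower() == "cache-control":
            if "no-transform" not in value:
                headers[i] = (name, value + ", no-transform")
            return headers
    headers.append(("Cache-Control", "no-transform"))
    return headers
```

As the paragraph above notes, this only helps when the developer can actually touch the response headers; compiled AJAX/SOAP stacks or shared hosting leave no such hook, which is why a MIME-type-based rule is proposed instead.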
220.127.116.11 #2: The altered content should validate to an appropriate
published formal grammar and be well-formed. Validation might be a
problem for an accessibility transcoding solution. Validation is not
part of WCAG 2.0 because sometimes adding in stuff that's not in the DTD
can make something more accessible (like ARIA for example). Note that
this is a "should", not a "must".
A proposal to endow servers with the possibility to protect their HTTPS services from
URL rewriting by transformation proxies.
I. CONTEXT
The CTG allows proxies to rewrite URLs, including those indicating that the
communication between terminal and server is to be established as an end-to-end
connection over TLS/SSL.
The W3C acknowledges the serious security concerns that arise from such rewriting
operations, but does not provide any mechanism for servers to protect their services
from the corresponding security risks.
HTTPS URL rewriting has perhaps been the most contentious issue of the CTG so far,
and has serious consequences for the credibility of the mobile Web for advanced
applications requiring privacy and payment security; it opens up a can of worms
regarding liability and certification of mobile e-commerce solutions.
II. PROPOSAL
The following text is to be added to a new section 18.104.22.168 of the CTG:
"Proxies must provide a means for servers to express preferences for inhibiting
HTTPS URL rewriting regardless of the preferences expressed by the user.
Those preferences must be maintained on a Web site by Web site basis."
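One way a proxy could honour such per-site preferences is a simple precedence rule: the server's opt-out wins over whatever the user has chosen. The lookup source below is an illustrative assumption; the proposal deliberately leaves the actual mechanism open:

```python
# Illustrative stand-in for however the proxy learns server preferences
# (the proposal does not prescribe a mechanism). Hostnames are made up.
HTTPS_REWRITE_OPT_OUT = {"bank.example", "payments.example"}

def may_rewrite_https(host, user_allows_rewriting):
    """Per the proposed text: a server's preference to inhibit HTTPS URL
    rewriting applies regardless of the user's preference, and is
    maintained on a site-by-site basis."""
    if host.lower() in HTTPS_REWRITE_OPT_OUT:
        return False
    return user_allows_rewriting
```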
III. RATIONALE
The proposal addresses the issues raised by HTTPS URL rewriting without imposing
a specific mode of implementation. As such, it neither prescribes (non-standard)
mechanisms nor introduces new technology into the CTG, but only sets formal
requirements as to the end effect.
It re-establishes the consistency between, on the one hand, the facts that
a) the original decision to establish HTTPS connections lies with the server;
b) the knowledge about what level of security is appropriate for a Web application
lies on the server side;
c) problems of liability, customer support, and commercial reputation fall back onto
the server side;
and, on the other hand, the facts that
d) the server is not given any decision power as to whether end-to-end security is to
be respected or not;
e) only the end-user, who does not possess all required knowledge to assess the
situation, is given mechanisms in the current CTG to prevent transformations on a
site-by-site basis.
It takes into account the fact that none of the available mechanisms utilized by
well-behaved proxies is reliable when it comes to a server detecting whether
HTTPS URL rewriting has taken place or not. In particular, "via" fields may or may
not be retransmitted integrally by some HTTP standards-compliant proxies, as indicated
in section 22.214.171.124 of the CTG.
It is in line with the practice in some countries, where operators have set up
mechanisms such as "white lists" to exclude financial institutes from HTTPS URL
rewriting. This demonstrates that, notwithstanding whatever is stated about the
relevance of rewriting HTTPS links, the consequences of such operations are taken
seriously in practice.
From the introductory text and clauses such as 4.2.7 through 4.2.9, it
is clear that this is targeted at proxies that are transcoding content
for mobile devices yet the title sounds like it's targeted at any kind
of transformation proxy. Suggest changing the title to more accurately
reflect the narrower scope.
* 4.1.5 It needs to be explicitly pointed out here that the modifications listed are not allowed when CC: no-transform is present in the request. Otherwise, the relative precedence of the requirements in the document is too imprecise.
* 4.1.5 "Aside from the usual procedures defined in [RFC 2616 HTTP] proxies should not modify the values" -- I have a hard time parsing this. Do you mean "In addition to the requirements of [RFC2616]..." ?
4.2.3: If a website contains a "Cache-Control: no-transform" directive,
proxies must NOT alter the content. Would this be a problem for a
proxy-based accessibility transcoding solution?
As a final comment, in the check list at the top of section 4.9.2, the
"other factors" laudably include:
> "the user agent has features (such as linearization or zoom) that
> allow it to present the content unaltered;"
Given that content transformation proxies at times depend on the
network that is used to access a resource, it might also be worthwhile
calling out the use of a "desktop" browser over a mobile network as a
case that proxies should take into account.
From a quick review, section 126.96.36.199 looks vastly improved. I'll
solicit the WSC WG's opinions on the changed version; speaking
personally, I'm happy with the current text.
I would like to call out a specific point in 188.8.131.52:
> Proxies must preserve security between requests for domains that are
> not same-origin in respect of cookies and scripts.
It is probably worthwhile to call out in non-normative security
considerations what that actually means -- namely, fairly heavy
rewriting of scripts along the lines of what Caja does, and rewriting
of cookies to emulate the behavior that a browser would otherwise show.
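What "rewriting of cookies to emulate the behavior that a browser would otherwise show" could look like in practice: since all sites are funneled through one proxy hostname, the client's browser would otherwise replay every cookie to every site, so the proxy must scope cookies to the origin that set them. A simplified sketch (real Set-Cookie parsing, paths, expiry, and the Secure attribute are omitted):

```python
class ProxyCookieJar:
    """Simplified per-origin cookie store for a rewriting proxy.

    Cookies are keyed by the host that set them and are only replayed
    to requests bound for that same host, emulating a browser's
    same-origin cookie behavior behind a single proxy hostname.
    """

    def __init__(self):
        self._jar = {}  # origin host -> {cookie name: value}

    def store(self, origin_host, name, value):
        """Record a cookie set by origin_host (via Set-Cookie)."""
        self._jar.setdefault(origin_host, {})[name] = value

    def cookies_for(self, origin_host):
        """Only cookies set by this origin are replayed to it."""
        return dict(self._jar.get(origin_host, {}))
```

Script rewriting in the Caja style is considerably heavier than this, which is exactly why the comment asks for a non-normative security-considerations section spelling it out.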