From W3C Wiki
After a certain point in the life of a W3C specification, W3C working groups seek implementations of its features, to prove that the technology has support from the Web community and can be implemented in a practical manner. What does this phase tell us? And what do we need to make it a quality tool?
The Process Document defines the life cycle of a specification. One of the steps is the Call for Implementations:
The working group is NOT REQUIRED to show that a technical report has two independent and interoperable implementations as part of a request to the Director to announce a Call for Implementations. However, the working group SHOULD include a report of present and expected implementations as part of the request.
The call for implementations is done during the CR phase:
Candidate Recommendation is a document that W3C believes has been widely reviewed and satisfies the Working Group's technical requirements. W3C publishes a Candidate Recommendation to gather implementation experience.
The Process Document doesn't impose many requirements on an implementation report, but there is a (developing? —EtanWexler) shared understanding of what such a report should be.
Double Implementation Report
Let's say we have to test the language FooML for "two independent and interoperable implementations". The relevant working group records implementation experience, among participants of the working group and among other people. Assume that FooML has only seven features. The working group produces the following implementation report.
|FooML||Product A||Product B||Product C|
|Feature 1||yes||yes||yes|
|Feature 2||yes||yes||yes|
|Feature 3||yes||yes||no|
|Feature 4||no||no||no|
|Feature 5||no||yes||yes|
|Feature 6||no||yes||yes|
|Feature 7||no||no||no|
The Process Document urges the working group to drop features which have fewer than two implementations. Clearly, features with no implementation at all should be dropped. For FooML, the dropped features would be Feature 4 and Feature 7. So the working group drops those and publishes the following table.
|FooML||Product A||Product B||Product C|
|Feature 1||yes||yes||yes|
|Feature 2||yes||yes||yes|
|Feature 3||yes||yes||no|
|Feature 5||no||yes||yes|
|Feature 6||no||yes||yes|
From this table we can see that every remaining feature has at least two implementations.
- Product A implements 60% of FooML
- Product B implements 100% of FooML
- Product C implements 80% of FooML
These uneven figures raise a question: does this table prove interoperability between the products? Only Feature 1 and Feature 2 are implemented across all three products; only 40% of the FooML specification has reached widespread interoperability.
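The arithmetic above can be reproduced with a short script. The feature-support data below is the hypothetical FooML example, not measurements from real products:

```python
# Hypothetical FooML implementation report: which products implement which features.
report = {
    "Feature 1": {"A", "B", "C"},
    "Feature 2": {"A", "B", "C"},
    "Feature 3": {"A", "B"},
    "Feature 4": set(),
    "Feature 5": {"B", "C"},
    "Feature 6": {"B", "C"},
    "Feature 7": set(),
}
products = {"A", "B", "C"}

# Drop features with fewer than two implementations, as the Process Document urges.
kept = {f: impls for f, impls in report.items() if len(impls) >= 2}

# Per-product coverage of the surviving features.
for p in sorted(products):
    share = sum(1 for impls in kept.values() if p in impls) / len(kept)
    print(f"Product {p} implements {share:.0%} of FooML")

# Widespread interoperability: features implemented by every product.
universal = [f for f, impls in kept.items() if impls == products]
print(f"{len(universal) / len(kept):.0%} of FooML is interoperable across all products")
```

Running this yields the 60%, 100%, 80%, and 40% figures quoted above.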
Is FooML at a good point or at a bad point? Can we improve things? Can we use better metrics for improving implementations?
We are tempted to think that we really need to improve things. (Who is "we"? The entire W3C? The Quality Assurance people in the W3C? —EtanWexler) Is an interoperability report more than a double implementation report? What form would an interoperability report take?
Features versus profiles/modules
When it comes to implementations, we should be very careful that functional units of the specification are implemented in an interoperable way. Such interoperability encourages either modularity in the organization of the technology or the creation of profiles defining which sets of features must be implemented. In these cases, the criterion for interoperability would be that one profile or module is implemented identically in two or more products. The criterion doesn't completely remove interoperability problems, but at least it improves the prospects for one particular module, so that a coherent set of functionality reaches compatibility between products.
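That module-level criterion can be stated as a simple check: a module meets the bar when at least two products implement every feature in it. A sketch, with hypothetical module groupings and support data for the FooML example:

```python
# Hypothetical grouping of FooML features into modules.
modules = {
    "Core": ["Feature 1", "Feature 2"],
    "Extended": ["Feature 3", "Feature 5", "Feature 6"],
}

# Which features each product implements (hypothetical data).
support = {
    "A": {"Feature 1", "Feature 2", "Feature 3"},
    "B": {"Feature 1", "Feature 2", "Feature 3", "Feature 5", "Feature 6"},
    "C": {"Feature 1", "Feature 2", "Feature 5", "Feature 6"},
}

def module_interoperable(module_features, support, minimum=2):
    """True if at least `minimum` products implement every feature of the module."""
    full_implementations = [
        product for product, features in support.items()
        if set(module_features) <= features
    ]
    return len(full_implementations) >= minimum

for name, features in modules.items():
    print(name, module_interoperable(features, support))
```

Here "Core" passes (all three products implement it in full), while "Extended" fails: only Product B implements every one of its features, even though each individual feature has two implementations. This is exactly the gap between a double implementation report and an interoperability report.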
Can we consider a specification double-implemented if supported only on one platform? If we are seeking implementations on different platforms, how do we test that? How do we ensure that implementations on the different platforms are identically implemented?
When all products implementing FooML run on a single platform, cross-platform interoperability remains entirely unproven.
The conformance section of a specification might help to define the minimum requirements needed for interoperability. These requirements might be usable in creating an interoperability report.
How many products in the report
Are two or three implementations of a specification enough to prove that the specification is implementable and that there is an interest from the Web community? How do we evaluate whether the number of products available in the interoperability report is sufficient?
One possibility (which has been proposed before) is that every W3C member organization participating in a working group producing a specification should commit to implementing the specification. This approach might limit the working group's participants to those from software-oriented organizations. In that case, W3C would have to find a mechanism that makes the participation of user-focused companies possible without requiring them to implement anything. (By testing products? By editing specifications?)
Examples of Implementation Reports
- VoiceXML Implementation Report: This document shows the implementation of each feature, but also describes the method used for defining the tests and how each of them was run. It is an impressive piece of work.
- CC/PP: Structure and Vocabularies Implementation Report
- IESG: Protocol Implementation Reports
- GRDDL Test Results: It was not clear why some of the tests were passed by only one implementation. When tests are introduced, there must be a clear way of presenting the results; sometimes success means passing a group of tests rather than a single test for one specific implementation.
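The last point, that success may mean passing a whole group of tests rather than one test, is easy to make concrete. A sketch with hypothetical test results for a single implementation:

```python
# Hypothetical per-test results for one implementation, grouped by the feature
# each group of tests exercises.
results = {
    "feature-x": {"test-1": True, "test-2": True, "test-3": True},
    "feature-y": {"test-4": True, "test-5": False},
}

# Report success per group: a feature counts as passed only if every test
# in its group passes, not if some individual tests happen to pass.
group_pass = {group: all(tests.values()) for group, tests in results.items()}
print(group_pass)
```

With this presentation, "feature-y" fails as a whole because test-5 fails, even though test-4 passes, which is a clearer signal for an interoperability report than a flat list of individual test outcomes.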
Beer: the ultimate answer?
Could the production and consumption of enormous quantities of ale and lager solve our problems? This question itself raises questions: How much hops? How much malt? Will W3C be capable of fairly administering the distribution of beer to the Web community? What about the underage population?
Further study is needed, preferably funded by DARPA. Cheers.