W3C

2013 Open Web Platform Testing Plan.

W3C Document 03 April 2013

Editor:
Tobie Langel

Abstract

This document describes the infrastructure and test development plan and budget for the 2013 Open Web Platform Testing Effort.

Status of This Document

This document is merely a W3C-internal document. It has no official standing of any kind and does not represent consensus of the W3C Membership.

Table of Contents

1. Introduction

1.1 Project Goals

1.2 Means

1.3 Measures of success

More precise metrics TBD.

2. Infrastructure Development

This section describes the different components of the testing infrastructure and estimates their costs.

2.1 Website Entry point & Documentation Center

This includes basic infrastructure for hosting markdown and HTML pages, simple navigation, and overall website design and branding.

The documentation center contains documentation on authoring, submitting and reviewing tests, along with documentation on the various APIs, the test runner, etc.

2.2 Spec database

This is a database containing references to all W3C and non-W3C specs referenced by W3C specifications. It allows basic CRUD editing, and exposes an API used by various components of the testing infrastructure and spec authoring tools such as ReSpec.
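The CRUD surface described above can be sketched as a minimal in-memory store. The record shape and field names here are hypothetical illustrations, not the actual database schema:

```javascript
// Minimal in-memory sketch of the spec database's CRUD operations.
// The record fields (title, status, url, w3c) are hypothetical.
var specs = {};

function createSpec(shortname, record) { specs[shortname] = record; }
function readSpec(shortname) { return specs[shortname] || null; }
function updateSpec(shortname, fields) {
  var rec = specs[shortname];
  for (var k in fields) { rec[k] = fields[k]; }
}
function deleteSpec(shortname) { delete specs[shortname]; }

// Example: register a spec, then record its transition to Recommendation.
createSpec("html5", {
  title: "HTML5",
  status: "CR",
  url: "http://www.w3.org/TR/html5/",
  w3c: true
});
updateSpec("html5", { status: "REC" });
```

In practice the same operations would be exposed over an HTTP API so that components such as the coverage tool and ReSpec can consume them.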

2.3 Test Coverage

Spec coverage is determined by comparing different heuristics with the number of existing tests for each section and subsections of a spec. This process is fully automated, but can be overridden in specific areas through manual input (usually by test coordinators).

Test coverage data is available through a JSON API, as a widget (which can be embedded directly in specs or on webplatform.org), and in a dashboard which allows getting coverage info at the spec level or drilling down to find the precise coverage of nested subsections.
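The drill-down described above amounts to rolling coverage figures up through nested subsections. A sketch, assuming a hypothetical JSON shape for the section data (the field names are illustrative, not the actual API):

```javascript
// Roll up test coverage over nested spec sections. Field names
// (testsNeeded, testsExisting, subsections) are hypothetical.
function coverage(section) {
  var needed = section.testsNeeded || 0;
  var existing = section.testsExisting || 0;
  (section.subsections || []).forEach(function (sub) {
    var rolled = coverage(sub);
    needed += rolled.needed;
    existing += rolled.existing;
  });
  var ratio = needed === 0 ? 1 : Math.min(existing / needed, 1);
  return { needed: needed, existing: existing, ratio: ratio };
}

// Example: a spec section with two subsections.
// 25 of the 40 required tests exist, i.e. 62.5% coverage.
var spec = {
  testsNeeded: 10, testsExisting: 5,
  subsections: [
    { testsNeeded: 20, testsExisting: 20 },
    { testsNeeded: 10, testsExisting: 0 }
  ]
};
```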

2.4 Test Runner

The Web test runner can run testharness.js tests, manual tests, and other JavaScript tests (provided a shim is available), and collect their results. Unfortunately, it cannot automate ref test runs within the browser, as there are no APIs to do so.
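To illustrate the pattern the runner collects results from, here is a minimal stand-in for the testharness.js style: each `test()` call runs a function containing assertions and records a pass/fail result. This is a sketch, not the real testharness.js library:

```javascript
// Illustrative stand-in for the testharness.js pattern (NOT the real
// library): run each test function and record a pass/fail result.
var results = [];

function assert_equals(actual, expected, description) {
  if (actual !== expected) {
    throw new Error((description || "assert_equals") +
      ": expected " + expected + ", got " + actual);
  }
}

function test(fn, name) {
  try {
    fn();
    results.push({ name: name, status: "PASS" });
  } catch (e) {
    results.push({ name: name, status: "FAIL", message: e.message });
  }
}

// Example tests in the usual testharness.js style:
test(function () {
  assert_equals(typeof JSON.parse, "function", "JSON.parse exists");
}, "JSON.parse is a function");

test(function () {
  assert_equals([1, 2, 3].indexOf(2), 1, "indexOf finds element");
}, "Array.prototype.indexOf basic behavior");
```

The runner harvests the accumulated results (the real harness notifies a completion callback) and uploads them to the results database.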

Ref tests may be fully automated in WebDriver-capable browsers. However, this cannot be done by simply visiting a URL: the test runner has to be installed locally to run ref tests using WebDriver. Note that testharness.js and manual tests can also be run in that scenario. Although much more complex to install and use than regular web-based testing, WebDriver is an interesting solution for running ref tests. It also seems it could enable automated testing of certain accessibility specs such as [WAI-ARIA] by providing access to the accessibility tree.
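Under WebDriver, a ref test is typically automated by capturing screenshots of the test page and the reference page and comparing them. A sketch of the comparison step, with the WebDriver plumbing omitted (buffers are flat RGBA arrays, as a screenshot decode would produce; the optional fuzz tolerance is an illustrative assumption):

```javascript
// Ref-test comparison step: two renderings match when their pixel
// buffers are identical, optionally within a per-channel tolerance.
// Screenshot capture via WebDriver is omitted from this sketch.
function refTestMatches(testPixels, refPixels, tolerance) {
  tolerance = tolerance || 0;
  if (testPixels.length !== refPixels.length) return false;
  for (var i = 0; i < testPixels.length; i++) {
    if (Math.abs(testPixels[i] - refPixels[i]) > tolerance) return false;
  }
  return true;
}

// Identical 2x1-pixel RGBA buffers match; a one-channel difference
// fails at tolerance 0 but passes at tolerance 1.
var a = [255, 0, 0, 255, 0, 255, 0, 255];
var b = [255, 0, 0, 255, 0, 255, 0, 255];
var c = [255, 0, 0, 255, 0, 254, 0, 255];
```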

The test runner relies on a server module that is specially designed to allow testing Open Web Platform specs such as [XMLHTTPREQUEST] or [POSTMSG].

An instance of the Web test runner is hosted on the testing website. Test run results are stored in a database.

Semi-automated testing and manual testing are both needed for testing implementations of certain types of accessibility-supporting features. This requirement is met through a dedicated test runner which is integrated with the rest of the testing infrastructure and which will provide integrated output on test results.

2.5 Test Results

Test results are stored in a database along with information about the user agent that was tested (such as the user agent string).

Test result data is available through a JSON API, as a widget (which can be embedded directly in specs or on webplatform.org), and in a dashboard which displays support status of features for each browser.
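The per-browser dashboard needs to map the stored user agent string to a coarse browser label. A sketch of that step; real UA sniffing is much messier, and this only handles a few well-known 2013-era patterns:

```javascript
// Extract a coarse "browser + major version" label from a user agent
// string for the results dashboard. Chrome is checked before Safari
// because Chrome UA strings also contain the "Safari" token.
function browserFromUA(ua) {
  var m;
  if ((m = /Firefox\/(\d+)/.exec(ua))) return "Firefox " + m[1];
  if ((m = /Chrome\/(\d+)/.exec(ua))) return "Chrome " + m[1];
  if ((m = /Version\/(\d+).*Safari/.exec(ua))) return "Safari " + m[1];
  return "unknown";
}

var ff = "Mozilla/5.0 (X11; Linux x86_64; rv:20.0) Gecko/20100101 Firefox/20.0";
var cr = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.31 (KHTML, like Gecko) " +
         "Chrome/26.0.1410.43 Safari/537.31";
var sf = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_3) AppleWebKit/536.26.14 " +
         "(KHTML, like Gecko) Version/6.0.3 Safari/536.26.14";
```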

2.6 Test Management

Test management is done through Git and GitHub. In order to mitigate the risk of relying on tools provided by a third party, the test repository is synced to W3C infrastructure. Comments on pull requests and issues are archived using GitHub's API.

A budget is reserved to help working groups which rely heavily on other systems to migrate.

The website also makes it possible to link GitHub and W3C accounts together and to sign up for writing tests, and showcases contributors and test coordinators in a dashboard.

2.7 Continuous Integration & Review tools

Experience shows that test review is the main bottleneck to increasing coverage. A number of tools and process changes are designed to simplify the reviewer's work. This system is designed to be easily extensible.

It includes a continuous integration solution where submissions are automatically run against a subset of user agents and stress-tested to reject unstable tests upfront. This solution can also be used to seed the test results database, rather than relying solely on crowd-sourced test runs to do so.
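The stress-testing step can be sketched as running each submitted test repeatedly and rejecting it if the runs ever disagree. The run count here is an arbitrary illustrative choice:

```javascript
// Stress-testing sketch: run a submitted test several times and
// reject it as unstable if any run disagrees with the first.
function isStable(runTest, runs) {
  var first = runTest();
  for (var i = 1; i < (runs || 10); i++) {
    if (runTest() !== first) return false;
  }
  return true;
}

// A deterministic test is stable; one whose outcome varies between
// runs (simulated here with a counter that alternates) is rejected.
var deterministic = function () { return 1 + 1 === 2; };
var flaky = (function () {
  var i = 0;
  return function () { return (i++ % 2) === 0; }; // alternates pass/fail
})();
```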

Other tools can be used to verify metadata, validate markup, run linters, check CLA, etc.

Results are presented in a publicly accessible dashboard and are linked from the GitHub pull request.

2.8 WebIDL parser and harness

Improving the [WEBIDL] parser and idlharness to generate better and more numerous test cases is high leverage as it impacts all specs which have a dependency on [WEBIDL].

Currently, idlharness is a client-side tool. Adding a server-side mode could allow pre-building the tests which would speed up test runs.
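To illustrate what server-side pre-building could look like, here is a toy sketch that derives idlharness-style test names from a WebIDL fragment. The regex "parser" only handles the simplest interface/attribute shapes; the real [WEBIDL] grammar is far richer, and the generated test names are hypothetical:

```javascript
// Toy pre-builder: derive idlharness-style test case names from a
// WebIDL fragment. Handles only trivial interface/attribute shapes;
// a real implementation would use a full [WEBIDL] parser.
function prebuildTests(idl) {
  var tests = [];
  var ifaceRe = /interface\s+(\w+)[^{]*\{([^}]*)\}/g;
  var m;
  while ((m = ifaceRe.exec(idl)) !== null) {
    var name = m[1];
    tests.push(name + " interface: existence and properties");
    var attrRe = /attribute\s+[^;]*?(\w+)\s*;/g;
    var a;
    while ((a = attrRe.exec(m[2])) !== null) {
      tests.push(name + " interface: attribute " + a[1]);
    }
  }
  return tests;
}

var idl = "interface Widget { readonly attribute unsigned long length; };";
var tests = prebuildTests(idl);
```

Pre-building these cases once on the server, instead of re-parsing the IDL in every browser session, is what would speed up test runs.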

3. Infrastructure maintenance

This includes:

4. Test Development

4.1 Scope Depth

We are focusing our effort on automatable unit tests (in-depth conformance testing), testable through either user agent scripting (testharness.js tests) or WebDriver (ref tests). This does not rule out manual tests per se, but makes them the exception rather than the norm.

4.2 Scope Breadth

The total scope of this effort is the union of the Coremob and TV profiles. Whether this scope can be met is essentially a matter of how much funding is obtained. Obviously, the larger the funding, the more comprehensive the effort. We describe below alternative milestones which we might aim for depending on the level of funding and on other aspects, such as preserving the community.

4.3 Cost estimate

In order to provide the best cost estimate possible, we have devised a tool that parses the specifications and, through different heuristics, estimates the number of tests necessary for each section and subsection of the document.
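One plausible heuristic of the kind such a tool might use is counting RFC 2119 conformance keywords (MUST, SHOULD, MAY, ...) in each section as a proxy for testable assertions. The keyword list and the one-test-per-keyword weighting below are illustrative assumptions, not the actual tool's model:

```javascript
// Illustrative heuristic: estimate the number of tests a spec section
// needs by counting RFC 2119 conformance keywords. Longer phrases
// ("MUST NOT") are listed before their prefixes ("MUST") so the
// alternation matches them first.
function estimateTests(sectionText) {
  var keywords = /\b(MUST NOT|MUST|SHALL NOT|SHALL|SHOULD NOT|SHOULD|MAY)\b/g;
  var matches = sectionText.match(keywords);
  return matches ? matches.length : 0;
}

// Example: three conformance keywords, so roughly three tests.
var section = "The user agent MUST fire the event. It SHOULD NOT " +
              "block, and MAY queue a task.";
```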

Through various means, we obtained data on the number of existing tests and submissions for each of these specs, and derived the number of tests that still needed to be developed per specification.

We estimated the cost of writing and reviewing each of these tests, accounting for the various benefits our planned infrastructure would provide.

We then made projections for best- and worst-case scenarios, which we used in the estimates below.

In order to account for different funding amounts, we divided the project into three levels. Note that none of these estimates account for some of the unknown costs described in section 4.4.4 Accounting for Unknowns, nor do they consider external contributions (notably from vendors) or test repurposing as described in section 4.4.3 Repurposing existing tests.

4.3.1 Mostly HTML5

This first level focuses on [HTML5] and a small number of other specs, picked to help validate our estimation model and develop a good understanding of the different types of testing.

4.3.2 Intersection of the Coremob and TV profiles

A second level which increases the scope of the effort to the intersection of the Coremob and TV profiles and which comprises all the specs listed in section A. Intersection of the Coremob and TV Profiles (this amount includes the estimated cost of the [HTML5] spec listed in section 4.3.1 Mostly HTML5 above).

4.3.3 Symmetric Difference of the Coremob and TV profiles

A third level which includes the specs described in section B. Symmetric Difference of the Coremob and TV Profiles to cover the union of the two profiles.

4.4 Other factors to consider

Choosing which level to aim for is essentially a matter of how much funding is available. Yet there are a number of other factors to keep in mind. Some are listed below.

4.4.1 Balancing crowd-sourcing and out-sourcing

One of the goals of this effort is to foster a community around testing that can continue its work and increase its effort in both breadth (as new specs mature) and depth (looking into performance, quality of implementation and interaction testing, notably). Aggressively out-sourcing all the specs might hinder the development of this community. We need to find the right balance between out-sourcing to meet the aggressive deadlines some of our members have in mind and crowd-sourcing in order to foster this nascent community.

4.4.2 Understanding the Dynamics of External Contributions

We are at the very beginning of this effort and have yet to understand and account for the contributions made by implementors, Working Group participants, and Web developers. The infrastructure and process changes we're planning to make will certainly drive these contributions up, but we do not yet know by how much.

4.4.3 Repurposing existing tests

There are significant opportunities to increase test coverage at lower cost by repurposing existing tests and tapping into not-yet-reviewed submissions. There are costs associated with these efforts that need to be estimated, but they can substantially lower the cost of test acquisition.

  • Most CSS 2.1 tests can be reused in CSS 3 specs. This implies reviewing over 10,000 reference tests and adding metadata accordingly.
  • Existing browser vendor tests. These would need to be reviewed, transformed into testharness.js tests and assigned to the appropriate spec and spec section.
  • Existing test submissions which still need to be reviewed. This includes roughly 3,500 test pages in HTML5 and a number of WebApps test suites. It is unclear how much duplication there is in that area.

4.4.4 Accounting for Unknowns

We are still looking for data on existing tests for a number of specs and/or are tuning our test coverage tool to gather better test requirements. These specs are marked with a dagger symbol (‡) in section 4.3.2 Intersection of the Coremob and TV profiles.

4.5 Picking the Right Strategy

In view of the above, we recommend raising funds for test development, picking the low-hanging fruit described above (repurposing tests, submissions from implementors, etc.), and focusing on developing tests for [HTML5] and a small number of other, sufficiently diverse specs. This will help us validate our model, refine our estimates, and understand the progress we can make through crowd-sourcing and other contributions. We can provide an update in late Q3 2013 and re-adjust funding then.

5. Community management

A community manager or dedicated task force is needed to foster a community around testing. This should represent a 50% FTE but could be spread among multiple people.

6. Facilitate vendors upstreaming tests and sharing a single test repository

As discussed during our January meeting, contributions from implementors can be an important source of tests.

After further discussion with implementors, there's clear agreement that the real value lies in:

This requires guarantees in terms of the quality of the tests contained in this repository. These are covered in section 2.7 Continuous Integration & Review tools.

In order to enable this key effort to succeed, implementors need to dedicate resources to:

Implementors also need to author JavaScript tests using testharness.js and ECMAScript 5.1 (or lower).

A. Intersection of the Coremob and TV Profiles

This lists all of the specs which are present in both profiles and which we intend to test.

Specs marked with the symbol (‡) are specs for which cost estimates are still pending.

B. Symmetric Difference of the Coremob and TV Profiles

This lists all of the specs which are present in either one profile or the other, but not both.

C. Non W3C Specs Present in Either of the Coremob or TV Profiles

This lists all of the specs which are present in either one of the profiles but which we won't be developing tests for, as they are not W3C specs. Where possible and when licensing permits, we will enable running existing test suites for these specs on our infrastructure.

D. References

D.1 Informative references

[ANIMATION-TIMING]
James Robinson; Cameron McCormack. Timing control for script-based animations. 21 February 2012. W3C Last Call Working Draft. URL: http://www.w3.org/TR/2012/WD-animation-timing-20120221
[CANVAS-2D]
Rik Cabanier; Eliot Graff; Jay Munro; Tom Wiltzius; Ian Hickson. HTML Canvas 2D Context. 17 December 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-2dcontext-20121217
[CORS]
Anne van Kesteren. Cross-Origin Resource Sharing. 29 January 2013. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2013/CR-cors-20130129
[CSS-ADAPTATION]
Rune Lillesveen. CSS Device Adaptation. 15 September 2011. W3C First Public Working Draft. URL: http://www.w3.org/TR/2011/WD-css-device-adapt-20110915
[CSS21]
Bert Bos et al. Cascading Style Sheets, level 2 (CSS2) Specification. 07 June 2011. W3C Recommendation. URL: http://www.w3.org/TR/CSS21/
[CSS3-ANIMATIONS]
Dean Jackson; David Hyatt; Chris Marrin; Sylvain Galineau; L. David Baron. CSS Animations. 03 April 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-css3-animations-20120403
[CSS3-BG]
Bert Bos; Elika J. Etemad; Brad Kemper. CSS Backgrounds and Borders Module Level 3. 24 July 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-css3-background-20120724/
[CSS3-FONTS]
John Daggett. CSS Fonts Module Level 3. 11 December 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-css3-fonts-20121211
[CSS3-IMAGES]
Elika J. Etemad; Tab Atkins Jr.. CSS Image Values and Replaced Content. 17 April 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-css3-images-20120417
[CSS3-MEDIAQUERIES]
Håkon Wium Lie; Tantek Çelik; Daniel Glazman; Anne van Kesteren. Media Queries. 19 June 2012. W3C Recommendation. URL: http://www.w3.org/TR/css3-mediaqueries
[CSS3-TRANSFORMS]
Simon Fraser; Dean Jackson; David Hyatt; Chris Marrin; Edward O'Connor; Dirk Schulze; Aryeh Gregor. CSS Transforms. 11 September 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-css3-transforms-20120911
[CSS3-TRANSITIONS]
Dean Jackson; David Hyatt; Chris Marrin; L. David Baron. CSS Transitions. 03 April 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-css3-transitions-20120403/
[CSS3COL]
Håkon Wium Lie. CSS3 module: Multi-column layout. 12 April 2011. W3C Candidate Recommendation. URL: http://www.w3.org/TR/css3-multicol/
[CSS3COLOR]
Tantek Çelik; Chris Lilley; L. David Baron. CSS Color Module Level 3. 07 June 2011. W3C Recommendation. URL: http://www.w3.org/TR/css3-color
[CSS3NAMESPACE]
Anne van Kesteren; Elika J. Etemad. CSS Namespaces Module. 23 May 2008. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2008/CR-css3-namespace-20080523
[CSS3TEXT]
Elika J. Etemad; Koji Ishii. CSS Text Module Level 3. 13 November 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-css3-text-20121113
[CSS3UI]
Tantek Çelik. CSS3 Basic User Interface Module. 17 January 2012. W3C Working Draft. URL: http://www.w3.org/TR/css3-ui/
[CSS3VAL]
Håkon Wium Lie; Tab Atkins Jr.; Elika J. Etemad. CSS3 Values and Units. 28 August 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-css3-values-20120828/
[CSS3WRITINGMODES]
Elika J. Etemad; Koji Ishii; Shinyu Murakami. CSS Writing Modes Module Level 3. 17 October 2010. W3C Editor's Draft. URL: http://dev.w3.org/csswg/css3-writing-modes
[CSSOM]
Anne van Kesteren. CSSOM. 12 July 2011. W3C First Public Working Draft. URL: http://www.w3.org/TR/2011/WD-cssom-20110712
[CSSOM-VIEW]
Anne van Kesteren. CSSOM View Module. 4 August 2011. W3C Working Draft. URL: http://www.w3.org/TR/2011/WD-cssom-view-20110804
[DEVICE-ORIENTATION]
Steve Block; Andrei Popescu. DeviceOrientation Event Specification. 1 December 2011. W3C Last Call Working Draft. URL: http://www.w3.org/TR/2011/WD-orientation-event-20111201
[DOM-LEVEL-3-EVENTS]
Travis Leithead; Jacob Rossi; Doug Schepers; Björn Höhrmann; Philippe Le Hégaret; Tom Pixley. Document Object Model (DOM) Level 3 Events Specification. 06 September 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-DOM-Level-3-Events-20120906
[DOM4]
Anne van Kesteren; Aryeh Gregor; Lachlan Hunt; Ms2ger. DOM4. 6 December 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-dom-20121206
[ECMA-262-51]
ECMAScript Language Specification, Edition 5.1. June 2011. URL: http://www.ecma-international.org/publications/standards/Ecma-262.htm
[FILE-API]
Arun Ranganathan; Jonas Sicking. File API. 25 October 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-FileAPI-20121025
[FLEXBOX]
Tab Atkins Jr; Elika J. Etemad; Alex Mogilevsky. CSS Flexible Box Layout Module. 18 September 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-css3-flexbox-20120918
[GEOLOCATION-API]
Andrei Popescu. Geolocation API Specification. 10 May 2012. W3C Proposed Recommendation. URL: http://www.w3.org/TR/2012/PR-geolocation-API-20120510
[HTML5]
Robin Berjon et al. HTML5. 17 December 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/html5/
[HTMLMEDIACAPTURE]
Anssi Kostiainen; Ilkka Oksanen; Dominique Hazaël-Massieux. HTML Media Capture. 29 May 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-html-media-capture-20120529/
[HTTP11]
R. Fielding et al. Hypertext Transfer Protocol - HTTP/1.1. June 1999. RFC 2616. URL: http://www.ietf.org/rfc/rfc2616.txt
[INDEXEDDB]
Nikunj Mehta; Jonas Sicking; Eliot Graff; Andrei Popescu; Jeremy Orlow. Indexed Database API. 24 May 2012. W3C Last Call Working Draft. URL: http://www.w3.org/TR/2012/WD-IndexedDB-20120524/
[OMA-URI-SCHEMES]
URI Schemes for the Mobile Applications Environment. Approved Version 1.0. 26 Jun 2008. URL: http://www.openmobilealliance.org/Technical/release_program/docs/URI_Schemes/V1_0-20080626-A/OMA-TS-URI_Schemes-V1_0-20080626-A.pdf
[ORIGIN]
A. Barth. The Web Origin Concept. December 2011. RFC 6454. URL: http://tools.ietf.org/html/rfc6454
[POINTER-EVENTS]
Jacob Rossi; Matt Brubeck. Pointer Events. 15 January 2013. W3C Working Draft. URL: http://www.w3.org/TR/2013/WD-pointerevents-20130115
[POSTMSG]
Ian Hickson. HTML5 Web Messaging. 01 May 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-webmessaging-20120501
[PROGRESS-EVENTS]
Anne van Kesteren. Progress Events. 22 September 2011. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2011/CR-progress-events-20110922
[QUOTA-API]
Kinuko Yasuda. Quota Management API. 03 July 2012. W3C First Public Working Draft. URL: http://www.w3.org/TR/2012/WD-quota-api-20120703
[RFC2397]
L. Masinter. The "data" URL scheme (RFC 2397). August 1998. RFC. URL: http://www.ietf.org/rfc/rfc2397.txt
[RFC3966]
H. Schulzrinne. The tel URI for Telephone Numbers (RFC 3966). December 2004. RFC. URL: http://www.ietf.org/rfc/rfc3966.txt
[RFC5724]
E. Wilde; A. Vaha-Sipila. URI Scheme for Global System for Mobile Communications (GSM) Short Message Service (SMS) (RFC 5724). January 2010. RFC. URL: http://www.ietf.org/rfc/rfc5724.txt
[RFC6068]
M. Duerst; L. Masinter; J. Zawinski. The 'mailto' URI Scheme (RFC 6068). October 2010. RFC. URL: http://www.ietf.org/rfc/rfc6068.txt
[SELECT]
Tantek Çelik; Elika J. Etemad; Daniel Glazman; Ian Hickson et al. Selectors Level 3. 29 September 2011. W3C Recommendation. URL: http://www.w3.org/TR/css3-selectors/
[SELECTORS-API]
Lachlan Hunt; Anne van Kesteren. Selectors API Level 1. 13 December 2012. W3C Proposed Recommendation. URL: http://www.w3.org/TR/2012/PR-selectors-api-20121213
[SSE]
Ian Hickson. Server-Sent Events. 11 December 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-eventsource-20121211/
[SVG11]
Erik Dahlström et al. Scalable Vector Graphics (SVG) 1.1 (Second Edition). 16 August 2011. W3C Recommendation. URL: http://www.w3.org/TR/2011/REC-SVG11-20110816/
[TOUCH-EVENTS]
Doug Schepers; Sangwhan Moon; Matt Brubeck. Touch Events version 1. 24 January 2013. W3C Working Draft. URL: http://www.w3.org/TR/2013/WD-touch-events-20130124
[WAI-ARIA]
James Craig; Michael Cooper et al. Accessible Rich Internet Applications (WAI-ARIA) 1.0. 24 February 2009. W3C Working Draft. URL: http://www.w3.org/TR/2009/WD-wai-aria-20090224
[WEBGL]
Chris Marrin (Apple Inc.). WebGL Specification, Version 1.0. 10 February 2011. URL: https://www.khronos.org/registry/webgl/specs/1.0/
[WEBIDL]
Cameron McCormack. Web IDL. 19 April 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-WebIDL-20120419/
[WEBSTORAGE]
Ian Hickson. Web Storage. 08 December 2011. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2011/CR-webstorage-20111208
[WEBWORKERS]
Ian Hickson. Web Workers. 01 May 2012. W3C Candidate Recommendation. URL: http://www.w3.org/TR/2012/CR-workers-20120501
[WOFF]
Jonathan Kew; Tal Leming; Erik van Blokland. WOFF File Format 1.0. 13 December 2012. W3C Recommendation. URL: http://www.w3.org/TR/WOFF/
[XMLHTTPREQUEST]
Julian Aubourg; Jungkee Song; Hallvord R. M. Steen; Anne van Kesteren. XMLHttpRequest. 6 December 2012. W3C Working Draft. URL: http://www.w3.org/TR/2012/WD-XMLHttpRequest-20121206/

$Revision: 1.4 $ of $Date: 2013/04/23 18:01:47 $