W3C

QA Framework: Testing Guidelines

W3C Working Draft 12 September 2002

This version:
Latest version:
Previous version:
Editors:
Kirill Gavrylyuk (kirillg@microsoft.com)
Dimitris Dimitriadis (dimitris@ontologicon.com)
Lofton Henderson (lofton@rockynet.com)
Mark Skall (mark.skall@nist.gov)
Peter Fawcett (pfawcett@real.com)
Contributors:
See Acknowledgments.

Abstract

This document defines a set of common guidelines for building conformance test materials for W3C specifications. This document is one in a family of Framework documents of the Quality Assurance (QA) Activity, which includes the other existing or in-progress specifications: Introduction; Process Guidelines; and Specification Guidelines.

Status of this document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. The latest status of this document series is maintained at the W3C.

This document is a W3C QA Working Group-only Working Draft (WD), made available for discussion within the QA Working Group. For more information about the QA Activity, please see the QA Activity statement.

This version is the first Working Draft, and incorporates the discussions at the last QA face-to-face, plus several subsequent QA Working Group [QAWG] teleconferences. It is expected that updated WD versions of this document will be produced regularly, along with other members of the Framework documents family. Future progression of this document beyond Working Draft is possible, but has not yet been determined.

This part of the Framework document family will eventually have an accompanying "Testing Examples and Techniques" (in progress). Some of the lengthier examples and "how to" text of this current guidelines document version will be moved to the "Examples and Techniques" document.

Please send comments to www-qa@w3.org, the publicly archived list of the QA Interest Group [QAIG]. Please note that any mail sent to this list will be publicly archived and available; do not send information you wouldn't want to see distributed, such as private data.

Publication of this document does not imply endorsement by the W3C, its membership or its staff. This is a draft document and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use W3C Working Drafts as reference material or to cite them as other than "work in progress".

A list of current W3C Recommendations and other technical documents can be found at http://www.w3.org/TR/.

Table of contents

1. Introduction
    1.1 Motivation for this guidelines document
    1.2 Navigating through this document.
    1.3 Priorities
    1.4 Terminology
    1.5 Glossary
2. Guidelines
        G 1. Provide analysis of the specification(s).
        G 2. Declare the structure of the test suite.
        G 3. Document the testing methodology
        G 4. Provide the test automation and framework
        G 5. Provide the results reporting framework
        G 6. Organize tests development
        G 7. Conduct testing
3. Relationship between QA and other WGs
4. Conformance
    4.1 Conformance definition
    4.2 Conformance disclaimer
5. Acknowledgments
6. References
7. Change History


1. Introduction

This document is part of a family of QA Framework documents designed to improve the quality of W3C specifications as well as their implementations by solidifying and extending current quality practices within the W3C. The QA Framework documents are:

The guidelines are intended for all Working Groups as well as developers of conformance materials for W3C specifications. Not only are the Working Groups the consumers of these guidelines, they are also key contributors. The guidelines capture the experiences, good practices, activities, and lessons learned of the Working Groups and present them in a comprehensive, cohesive set of documents for all to use and benefit from. The objective is to reuse what works rather than reinvent it, and to foster consistency across the various Working Group quality activities and deliverables.

This document aims to give guidelines for conducting conformance testing of W3C specifications. In each of the subsections below, you will find information and pointers necessary to either choose among the existing test suites and test frameworks that may suit your needs, or construct new ones for testing implementations' conformance to W3C specifications.

The process for developing conformance test materials is affected by QA activities beyond those that are explicitly provided in this document. Specifically, the QA Framework documents are interrelated and complement each other. Links between applicable guidelines in this document and the other Framework documents will be given.

Specifically, the document illustrates the benefits of following the guidelines for writing specifications, stating conformance criteria, and keeping in mind the testability of specifications. It especially stresses the interdependencies between specification markup languages (where it is currently being investigated whether there can be an easy mechanism for stating test assertions, among other things) and the testing frameworks that are discussed. In particular, the document aims to show the added value introduced by using structured test representations and semantic requirements, and how these can be used to provide detailed information on implementation conformance and streamline the testing process.

1.1 Motivation for this guidelines document

One of the ultimate goals of a standard is interoperability between its implementations. Several complementary efforts help to ensure this goal:

The first effort is discussed in detail in the specification guidelines. The third is carried out by implementers following release criteria. While those two efforts are essential for interoperability, they both leave room for non-interoperable implementations. The first has no effective criteria for "specification clarity", i.e., clarity is judged by the specification editors and reviewers. The second is focused on a particular implementation and depends on the Quality Assurance release criteria for that implementation. By introducing a free-to-use conformance test suite that covers most if not all of the specification requirements, is developed by a number of interested parties across the industry, and is applicable to any of the specification's implementations, we:

Once we establish a conformance test suite as a criterion for implementations as well as for the specification, the quality of the test suite becomes an important factor.

Whether we are taking on a new test suite development task or we are looking for a test suite to reuse, we always ask ourselves about:

These questions form the necessary basis for determining the quality of a test suite: the test suite quality criteria. The main goal of the checkpoints in this document is to verify that the test suite provides effective means to answer these questions.

1.2 Navigating through this document.

The Guidelines of this document follow the structure of the test suite quality criteria outlined above.

The first two guidelines target the test strategy, providing a simple checklist to verify the scope of the test suite, the "target set" of specifications/areas.

Guidelines 3 and 4 address test tactics, looking in detail at the test methodology and test automation qualities. Guideline 4 (test automation) also focuses on the instruments the automation must provide to measure the specification coverage achieved by the test suite. The specification coverage measurement answers the question about test suite completion.

Guideline 5 focuses on the reporting means that the test suite must provide, in order to be able to define the test criteria for implementations.

This document employs the WAI (Web Accessibility Initiative) model for representing guidelines or general principles for the development of conformance materials. See, for example, Web Content Accessibility Guidelines. Each guideline includes:

The checkpoint definitions in each guideline define the processes and operations that need to be implemented in order to accomplish the guideline. Each checkpoint definition includes:

Each checkpoint is intended to be specific enough so that someone can implement the checkpoint as well as verify that the checkpoint has been satisfied.

1.3 Priorities

High-quality and timely production of test materials is a key requirement for producing a high-quality, interoperable standard. Therefore each checkpoint has a priority level assigned by the QA Working Group, based on the checkpoint's impact on the quality and timing of the test materials produced by a Working Group.

[Priority 1]
Satisfying this checkpoint is a basic requirement to ensure quality and interoperability of the standard. If the checkpoint is not satisfied, the test materials will not be produced by the time they are required to ensure the quality of the standard, or they may not be usable.
[Priority 2]
Satisfying this checkpoint will significantly improve the interoperability of the standard. If the checkpoint is not satisfied, it may be difficult to produce high quality test materials by the time they are required to ensure the quality of the standard.
[Priority 3]
Satisfying this checkpoint will further improve the interoperability of the standard. If the checkpoint is not satisfied, it may be somewhat difficult to maintain the high quality of the test materials and to ensure the quality of the standard.

1.4 Terminology

The keywords "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" will be used as defined in RFC 2119 [RFC2119].

Unusual terms in these framework documents are defined when first used, and most generally useful QA-specific terms will eventually be in the QA Glossary [QA-GLOSSARY].

1.5 Glossary

This section contains the definitions for all the critical terms used in the guidelines below. This does not substitute for the QA Glossary [QA-GLOSSARY], but rather focuses on the most important terms for the Testing Guidelines.

2. Guidelines

Guideline 0. OLD TEXT: Analyze the specification(s).

Guideline 1. Provide analysis of the specification(s).

In order to define the strategy of the test suite, a detailed analysis of the specification (the subject of the test suite) is required. The better the initial analysis, the clearer the testing strategy will be.

Checkpoint 1.0. OLD TEXT: Create a list of all the specifications used or referenced.

Checkpoint 1.0. OLD TEXT: Define the target set of specifications that are being tested.

Checkpoint 1.1. Identify the target set of specifications that are being tested. [Priority 1]

Most, if not all, specifications use notions and behaviors defined in other technical documents. For example, even a base specification like XML uses definitions from specifications like URN and URI Syntax, Media Types, Unicode, etc. Some specifications are more self-contained and make only limited use of syntax defined in other specifications. Others, like XSLT [@@LINK], rely heavily on the syntax and semantics defined in the XPath [@@LINK] specification. In order to understand the scope of the test development work, it helps to build the tree of referenced specifications.

[EX-TECH] The XQuery specification, which has draft status as of the time of writing, explicitly defines the set of W3C specifications it depends on, together with their versions: XPath 2.0, XML Schema 1.0, XML 1.0 SE. This allows conformance test developers to determine the scope of their work and to reuse tests from the test suites built for the referenced standards.

The target set may include more than one specification, depending on how strongly the primary specification under test relies on the referenced specifications.

[EX-TECH] For example, an XML test suite [@@Link] may not include tests that specifically test the URN format, but the XSLT [@@Link] and XQuery [@@Link] test suites will include many tests for XPath functions.
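
The tree of referenced specifications can be sketched in code. The following is an illustrative sketch only; the dependency data is a hand-written approximation of the XQuery example above, not normative information from any specification.

```python
# Illustrative sketch: building the tree of referenced specifications.
# The dependency data is a hand-written approximation, not normative.
DEPENDS_ON = {
    "XQuery": ["XPath 2.0", "XML Schema 1.0", "XML 1.0 SE"],
    "XPath 2.0": ["XML 1.0 SE"],
    "XML Schema 1.0": ["XML 1.0 SE"],
    "XML 1.0 SE": [],
}

def referenced_specs(spec, graph):
    """Return every specification transitively referenced by `spec`."""
    seen = set()
    stack = [spec]
    while stack:
        for dep in graph.get(stack.pop(), []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen
```

Walking the closure of references in this way gives the full scope of the test development work; tests for the leaves of the tree (here, XML 1.0 SE) are the natural candidates for reuse.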

Checkpoint 1.2. Identify test assertions from the target set of specifications. [Priority 1]

Once the target set of specifications being tested has been defined, a more formal analysis is required for each of them.

The QA Specification Guidelines recommend producing a set of test assertions for each specification, so you may have them already. A list of test assertions is necessary to focus testing. [KG] Need a definition of "test assertion" to be referenced.

[EX-TECH]

The XML Protocol Working Group produced a list of assertions for the SOAP 1.2 Part 1 and Part 2 specifications. The assertions were extracted manually from the specification text, and links to the original text were added.

Microsoft, Openwave and America Online contributed an HTML test suite [Ed: TEMP LOCATION, change before FPWD] that contains a list of assertions from the HTML 4.01 specification.

In both cases, the extracted assertions helped to reach maximum specification coverage when developing the test suite. Links from the tests to the assertions provide quantified information about the specification coverage.

NIST produced a test matrix for the XML 1.0 test suite that includes explicit references to the assertions in the spec.[@@Link]
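
To give a flavor of what mechanical support for this checkpoint might look like, here is a hedged sketch that scans specification prose for RFC 2119 keywords and drafts a candidate assertion list. The real SOAP and HTML assertion lists above were extracted manually; a naive scanner like this can only produce a starting point for human review, and the record format is hypothetical.

```python
import re

# Naive sketch: draft candidate test assertions by scanning specification
# prose for RFC 2119 keywords. Manual review is still required; the
# assertion record format ("id", "keyword", "text") is hypothetical.
KEYWORDS = ("MUST NOT", "MUST", "SHALL NOT", "SHALL",
            "SHOULD NOT", "SHOULD", "MAY")

def extract_assertions(spec_text):
    assertions = []
    # Split prose into sentences at whitespace following a period.
    for i, sentence in enumerate(re.split(r"(?<=\.)\s+", spec_text)):
        for kw in KEYWORDS:
            if kw in sentence:
                assertions.append({"id": f"a{i}", "keyword": kw,
                                   "text": sentence})
                break
    return assertions
```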

Checkpoint 1.3. Group test assertions by levels, profiles, modules, etc. [Priority 1]

The conformance criteria, with respect to subsetting the specification, may include various degrees of variability (e.g., levels, profiles, modules). The test assertions should be grouped according to these subsets to allow the subsets to be tested as a whole.

[EX-TECH] In the SOAP example for the checkpoint above, the assertions were grouped by modules.

Checkpoint 1.3. OLD TEXT: Identify those test assertions that are part of the conformance criteria

Checkpoint 1.4. Identify those test assertions that are part of the conformance criteria [Priority 1]

Depending on the conformance criteria defined in the specification, not all test assertions need to be satisfied in order to conform to the specification. For example, if the conformance criteria require an implementation to comply only with those assertions that contain "MUST" or "SHALL", all other test assertions (those with "SHOULD", "MAY", etc.) do not belong to the conformance criteria.

[EX-TECH] In all of the examples for the checkpoint above, all the assertions belong to the conformance criteria for the corresponding specifications.
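
The separation described above can be sketched mechanically, assuming each assertion record carries the RFC 2119 keyword it was extracted with (the record format is hypothetical):

```python
# Sketch: separate the assertions that belong to the conformance criteria
# (here assumed to be those using MUST/SHALL) from the rest. Whether this
# rule applies depends on the specification's own conformance criteria.
def conformance_assertions(assertions):
    required = ("MUST", "MUST NOT", "SHALL", "SHALL NOT")
    return [a for a in assertions if a["keyword"] in required]
```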

Checkpoint 1.4. OLD TEXT: Extract all the discretionary behaviors defined in the specification

Checkpoint 1.5. Identify all the discretionary behaviors defined in the specification [Priority 1]

The test suite should accommodate discretionary behaviors so that products can be tested according to the vendor's choice among the allowed behaviors. Having a standalone list of discretionary behaviors extracted from the specification helps to automate tuning the test suite to the choices made by a particular implementation.

[EX-TECH] OASIS XSLT/XPath Conformance Technical Committee produced a list of the discretionary items for the XSLT/XPath specifications. This was produced to be able to automate tuning of the test suite according to a particular implementation's choices as described in the checkpoint.
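
The tuning described in this checkpoint can be sketched as a simple filter, assuming a hypothetical test metadata format in which a test may declare the discretionary item and the choice it depends on:

```python
# Sketch: tune a test suite to one implementation's discretionary choices,
# in the spirit of the OASIS XSLT/XPath discretionary-item list. The item
# names and metadata fields are hypothetical.
def applicable_tests(tests, choices):
    """Keep tests whose required discretionary choice matches this product."""
    selected = []
    for t in tests:
        item = t.get("discretionary_item")
        if item is None or choices.get(item) == t["required_choice"]:
            selected.append(t)
    return selected
```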

Checkpoint 1.6. Identify optional behaviors in the specification [Priority 2]

Some parts of a specification might be declared optional to implement as a whole. But if an implementation claims to implement them, it must conform to the whole part. A good example of such optional "adjuncts" is protocol bindings.

Checkpoint 1.7. Identify behaviors that are undefined or defined ambiguously in the specification [Priority 2]

Although the specification ideally should not have such flaws, one can never guarantee it. Maintaining a list of such issues helps both in fixing the specification in subsequent revisions and in cataloging incoming tests that fall into undefined or ambiguously defined areas.

Checkpoint 1.8. Identify explicitly undefined behaviors in the specification [Priority 2]

Although it is not a recommended practice, a specification's authors may explicitly abstain from defining product behavior in certain circumstances. A list of such areas helps to analyze incoming tests appropriately.

Checkpoint 1.8. OLD TEXT: (obsoleted) Contact specification developers to ensure that vague or ambiguous requirements are rewritten.

After these requirements are re-written, tests can be written to check for conformance.

Checkpoint 1.9. Identify contradictory behaviors in the specification [Priority 2]

Such contradictory combinations should not occur in a specification in the first place, but if they exist, a record of them is needed for both test analysis and future errata tracking.

[KG] add to the Glossary.

Checkpoint 1.9. OLD TEXT: (obsoleted) Contact specification developers to ensure that contradictory behaviors are rewritten

After the contradictory behaviors are rewritten, tests can be developed to check for conformance.

Checkpoint 1.10. List the user scenarios for the specification [Priority 1]

User scenarios help keep the tests focused. They also help in understanding the requirements for the test framework at a very early stage of testing.

[EX-Tech] The W3C XML Query Working Group produced a comprehensive list of Use Cases that has been very helpful in the development of the conformance test suite.

Guideline 1. OLD TEXT: Define testing areas

Guideline 2. Declare the structure of the test suite.

There are many ways to structure a test suite. This guideline lists common requirements for the structure and provides common examples.

Checkpoint 2.1. Document the structure for the test suite. [Priority 1]

Usually the structure of the test suite matches the specification areas/content, but sometimes it is easier to define it based on other criteria, such as the applicable testing methodology, user scenarios, profiles, etc.

Test areas. In what follows, we will call the atomic units of the test suite structure test areas.

[EX-TECH] Here is an informative list of possible test suite structures:

The test suite may use a combination of these organizational principles.

Checkpoint 2.2. Provide mapping between the test suite structure and the specification structure. [Priority 1]

Regardless of the principle chosen for the test suite structure, the mapping between the test areas and the specification text is essential. The relationship between tests and the specification should be traceable via the test-to-test-assertion mapping described in the checkpoint below.

[EX-TECH] Each of the contributions to the W3C XML Schema Test Collection is categorized according to the specification structure. This is the recommended practice, unless categorization by user scenarios or some other criterion better suits the test suite framework and development prioritization.

Checkpoint 2.3. For each testing area, provide a sample testing scenario [Priority 3]

[Ed.] Does this still belong here?

Before creating test cases for a certain area of the specification, it may be useful to design a set of sample testing scenarios based on the user scenarios. These are not actual tests, but rather examples. This helps to properly select the testing framework, create templates for test cases, and define future sub-areas.

[EX-TECH] The XML Query Use Cases document is a good example of sample testing scenarios that could be used as a basis for any XQuery test suite.

Checkpoint 2.4. Map sample testing scenarios to test assertions, discretionary behaviors and vague areas [Priority 3]

[Ed.] Does this still belong here?

This helps to formalize testing scenarios and provides a basis for future analysis of specification coverage.

Guideline 2. OLD TEXT: Choose the testing methodology

Guideline 3. Document the testing methodology

By testing methodology we understand a high-level answer to the question "How does the test suite verify compliance with the specification?"

[EX-TECH]

A partial list of methodologies includes

For SMIL and SMIL 2.0 the test suite used a specification-driven framework. The test suite attempts to verify the use of each element and attribute as defined in the normative language of the specification, as well as any interactions between elements, also as defined by normative language. For other specifications a different approach may be more appropriate; for example, a specification that defines a specific protocol or method of communication (like SOAP) would be better tested using a use-case method, where a series of use cases is defined but it is up to the tester to create a specific test that verifies a general use case.

Checkpoint 3.0. OLD TEXT: For each test area identify applicable testing approach

Checkpoint 3.1. For each test area identify the testing approach [Priority 1]

By testing approach we understand a set of high-level methods/ideas/techniques. It is convenient to define test areas so that testing in a single area can be done using a single methodology.

The rationale for using a specific testing methodology within a specific test area should be defined. This is to ensure a consistent approach to testing by all users of a test suite.

We state that it is convenient to have a single methodology within a single test area. Do we feel it is important for a consistent methodology to be used across all test areas if possible? There is a potential trade-off here between consistency and using the best approach to cover a specific area.

[Ed.] Examples.

Checkpoint 3.1. OLD TEXT: Identify publicly available testing techniques. Reuse publicly available testing techniques if applicable. List the publicly available testing techniques that have been reused

Checkpoint 3.2. Identify publicly available testing techniques. List the publicly available testing techniques that have been reused [Priority 1]

It is critical to avoid "reinventing the wheel", both for resource considerations and from a future integration perspective.

Guideline 4. Provide the test automation and framework

The right choice of test framework is a critical part of the test tactics.

Ideally, the QA WG will produce a small number of test frameworks that implement most, if not all, of the options mentioned. For clarification, this does not mean that the QA WG will produce test frameworks for all WGs; there are, however, a number of things that should be kept in synchronization, most notably reporting, result publication, and test extraction (if it is done using the specification granularity we speak of in the Specification Guidelines).

Checkpoint 4.0. OLD TEXT: Review available test frameworks, automation and adopt existing if applicable Identify available test frameworks used. If none, justify why new frameworks are needed, and existing ones could not be used.

Checkpoint 4.1. List available test frameworks and automation, as applicable. Identify the available test frameworks used. If none, justify why new frameworks are needed and existing ones could not be used. [Priority 1]

A WG that wants to produce a framework for testing implementation conformance to W3C specifications should initially invest some time in reviewing existing testing frameworks (some of which are given in the TestGL ExTech document [@@ link needed]), evaluating them for its purposes and, if feasible (in order not to reinvent the wheel), adopting them. The following roadmap is suggested:

If a particular framework is judged appropriate, inform the original author of the test framework (or the WG chair, if the framework was produced by a WG) and communicate additions, changes, and errors as applicable.

If you do not find a testing framework that suits your needs, proceed along the following checkpoints to produce a testing framework that fits your needs.

The argumentation is the same as for reusing a testing methodology.

Checkpoint 4.2. Ensure the framework and automation are platform independent. Demonstrate on 3 platforms. Ensure that the framework and automation are built using open standards. [Priority 1]

Ideally, any testing framework should be platform independent, insofar as running and reporting are concerned. Of course, we cannot at this stage envision what implementations will come along, nor can we anticipate which subsets of existing specifications future implementations will conform to. In any case, a guiding measure is that if testing frameworks are web-based, they should run in the majority of mainstream browsers (with a satisfactory degree of conformance to the specifications they implement).

[dd] The two previous checkpoints could be given in one, stating that the testing framework chosen should, if possible, be platform independent. Also, we need to keep in mind that providing platform-specific test frameworks raises issues with adding work that needs to be done to the testing framework itself; if we were to provide platform-specific test framework, the QA WG or any party producing those would need to allocate time to produce and ascertain their quality.

Checkpoint 4.3. Ensure the framework and automation are applicable to any product/content that implements the specification. Demonstrate with three products/contents. Ensure that the framework and automation are built using open standards. [Priority 2]

Similar to the previous checkpoint: the test suite should be able to cover all products that the specification allows.


Checkpoint 4.4. Ensure the framework makes it easy to add tests for any of the specification areas. Demonstrate, through an example, how tests are easily added to a specification area. [Priority 2]

The test suite will expand over time, eventually covering all areas of the specification and testing added functionality. The framework should therefore be open to easily adding new test material. Thus it is vital that new tests are easy to add and properly marked (e.g. per specification module).

[pf] Do we need to add any language here for coordinating with requirements of Guidelines 5 and 8 of the Operational Guidelines document? Especially Guideline 8 that concerns defining a process of adding tests.

Checkpoint 4.5. Ensure the ease of use for the test automation. Document how the test automation is easily used. [Priority 1]

Usability is a critical requirement for the test suite. Equally critical is making it easy to contribute tests.

Checkpoint 4.6. Ensure the framework allows for specification versioning and errata levels. Explain how specification versioning and errata levels are accommodated by the test framework [Priority 2]

This should be one of the criteria for adopting or producing a testing framework. It is easy to solve when a particular set of markup is used, which would then include the specification level along with added metadata information. This is a requirement from the Process Guidelines.

Checkpoint 4.7. Ensure the framework accounts for choices allowed for discretionary behaviors in the specification. Explain how discretionary behaviors are accommodated by the framework. [Priority 3]

This is an integral part of making the test suite applicable to any product allowed by the specification. It also applies to the checkpoint immediately after this one. Given that the tests will be represented in a structured manner, it should be possible to run only those tests that a particular implementation is known to support (which is the case with optional sets of specifications).

[pf] This is especially important for specifications that use profiles or other degrees of variability. A framework should clearly define what is required for a specific target profile. It should also allow for the testing of discretionary (non-required) behaviors or features. This applies to both 4.7 and 4.8.

Checkpoint 4.8. Ensure the framework allows for tests for optional behaviors defined in the specification. Explain how optional behaviors are accommodated by the framework. [Priority 3]

While optional behaviors are not necessary to implement, some of them may be self-contained additions (like protocol bindings) that need test suites of their own. These tests will of course be applicable only to those products that claim to implement the optional behaviors/profiles.

[dd] Experience from the DOM TS shows that allowing for optional/multiple behaviors is a high priority on the wish list for the TS. Implementers want to be able to test particular behaviors as defined in the specification, especially as they may have chosen to support only parts of the specifications (e.g. DOM builds on XML, which allows for entity-expanding or entity-preserving applications).

Checkpoint 4.9. Ensure the framework accommodates levels of conformance defined in the specification. Demonstrate how the framework allows tests to be filtered by levels. [Priority 1]

If the conformance criteria introduce levels, the test framework should allow tests to be filtered by level.

Checkpoint 4.10. Ensure the results verification is product independent. Demonstrate results verification on 3 different products. [Priority 1]

Results verification is a critical part of the test framework. Since the tests should run on any platform against any product implementing the specification, results verification (for example, comparison against expected output) should be product independent.

[dd] If the testing framework is platform independent, result reporting can also be made independent. If this does not apply, provide extra functionality that can be run using e.g. a mainstream browser.

Checkpoint 4.11. Ensure the framework allows the tests to be documented. Explain how to document the tests, within the framework [Priority 2]

This aids maintenance. It includes annotating tests with pointers to the original specification(s). Using a proper source and test documentation mechanism is vital for the quality of the test framework.

Checkpoint 4.12. Ensure the framework has proper test case management. Demonstrate how at least one of the following test case management functions is accomplished within the framework: managing additions; managing removals; filtering by various criteria [Priority 3]

Test case management includes an accounting system for tests: managing additions and removals, and filtering by various criteria.
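
A minimal sketch of such an accounting system, with hypothetical metadata fields, might look like this:

```python
# Sketch: minimal test case management covering the three functions named
# in the checkpoint: additions, removals, and filtering by criteria.
# The metadata fields (area, level, ...) are hypothetical.
class TestCatalog:
    def __init__(self):
        self._tests = {}

    def add(self, test_id, **metadata):
        self._tests[test_id] = metadata

    def remove(self, test_id):
        self._tests.pop(test_id, None)

    def filter(self, **criteria):
        # Return the ids of tests whose metadata matches every criterion.
        return [tid for tid, meta in self._tests.items()
                if all(meta.get(k) == v for k, v in criteria.items())]
```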

Checkpoint 4.13. Ensure the framework allows specification coverage to be measured. Demonstrate the above by mapping a list of tests to the list of test assertions, grouped by areas [Priority 2]

One effective way to measure specification coverage is to map the list of tests to the list of test assertions, grouped by area. Using structured markup to represent tests makes it possible to point to particular parts of the specification (especially if the specifications are also written using structured markup). In this way, one can group tests according to specification parts and view the results in the same manner.
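
The mapping can be sketched as follows, assuming hypothetical assertion identifiers grouped by test area and a set of assertion identifiers that are exercised by at least one test:

```python
# Sketch: measure specification coverage per test area by mapping tests
# to the assertions they exercise. All identifiers are hypothetical.
def coverage_by_area(assertions_by_area, covered_assertion_ids):
    report = {}
    for area, assertion_ids in assertions_by_area.items():
        hit = sum(1 for a in assertion_ids if a in covered_assertion_ids)
        report[area] = hit / len(assertion_ids)
    return report
```

A report like this answers the test suite completion question from Guideline 1: areas with low ratios are the ones still needing tests.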

Guideline 5. Provide the results reporting framework

A WG should encourage vendors to report testing results for their products. In order to do that, the WG needs to provide vendors with a results format, the necessary style sheets, etc.

[DD] Wording for these checkpoints is similar to wording for checkpoints in previous guideline, should be easy to copy.

Checkpoint 5.1. Review available results reporting frameworks and adopt existing if applicable. If existing frameworks are not adopted, explain why. [Priority 1]

Checkpoint 5.2. Ensure the results reporting is platform independent. Demonstrate on 3 platforms. [Priority 1]

As with the tests, results reporting should be usable by any vendor.

[dd] As above: given that the testing framework is uniform in functionality and platform independence, this will have been dealt with elsewhere.

Checkpoint 5.3. Ensure the results reporting is compatible with the test framework [Priority 1]

Checkpoint 5.4. Ensure ease of use for results reporting. Demonstrate that the results reporting has sorting and filtering capabilities. [Priority 1]

This is necessary to facilitate results reporting by vendors; sorting and filtering capabilities make large result sets navigable.
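A minimal sketch of such sorting and filtering, assuming results are kept as flat records (the field names `test`, `area`, and `outcome` are hypothetical):

```python
# Hypothetical sketch: sort and filter a flat list of test results.
results = [
    {"test": "test-003", "area": "Syntax", "outcome": "fail"},
    {"test": "test-001", "area": "Syntax", "outcome": "pass"},
    {"test": "test-002", "area": "Processing", "outcome": "fail"},
]

def filter_results(results, **criteria):
    """Keep results matching every given field, e.g. outcome='fail'."""
    return [r for r in results
            if all(r.get(k) == v for k, v in criteria.items())]

# Show only failures, sorted by test name.
failures = sorted(filter_results(results, outcome="fail"),
                  key=lambda r: r["test"])
print([r["test"] for r in failures])   # ['test-002', 'test-003']
```

The same record structure also serves the filtering function of test case management (Checkpoint 4.12): filtering by area, by outcome, or by any other recorded field is one mechanism.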

Checkpoint 5.5. Ensure the results reporting allows for specification versioning and errata levels. Explain how specification versioning and errata levels are accommodated by the test results reporting [Priority 2]

Same considerations as for the tests themselves.

Checkpoint 5.6. Document how the results reporting allows results to be exported in a self-contained format suitable to publication on the web. [Priority 2]

[dd] Keep in mind that results should be publishable on the web; therefore aim at producing a self-contained version of the test reporting (including pointers to tests and relevant parts of the specification) in HTML form.
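One way such a self-contained HTML version could be generated is sketched below; the record fields and URIs are hypothetical, and a real framework would likely drive this from a style sheet instead:

```python
# Hypothetical sketch: export results as a self-contained HTML page
# with links to the tests and to the relevant parts of the specification.
from html import escape

def results_to_html(results):
    rows = "\n".join(
        "<tr><td><a href='{t}'>{n}</a></td>"
        "<td><a href='{s}'>spec</a></td><td>{o}</td></tr>".format(
            t=escape(r["test_uri"]), n=escape(r["name"]),
            s=escape(r["spec_uri"]), o=escape(r["outcome"]))
        for r in results)
    return ("<html><body><table>\n"
            "<tr><th>Test</th><th>Specification</th><th>Outcome</th></tr>\n"
            + rows + "\n</table></body></html>")

page = results_to_html([{
    "name": "test-001",
    "test_uri": "tests/test-001.xml",
    "spec_uri": "spec.html#sec1",
    "outcome": "pass",
}])
```

Because the output is a single HTML document with ordinary hyperlinks, it can be published on the web as-is, with no dependence on the test framework that produced it.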

Checkpoint 5.7. Demonstrate that the results reporting provides details on failures (logs) sufficient to investigate [Priority 3]

Logging, for example.

[dd] On rereading this checkpoint, I don't know how doable it is. It requires that intended behavior be explicitly stated in order to compare it with the actual output. Thus, we have an implicit requirement to use granular grammars for specifications.
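The comparison described above can at least be sketched: a result record that stores the stated expected behavior next to the observed output (the field names below are hypothetical) gives enough detail to investigate a failure without re-running the test:

```python
# Hypothetical sketch: a failure log record capturing the intended
# (expected) behavior alongside the observed output.
def run_case(name, func, expected):
    try:
        actual = func()
    except Exception as exc:            # implementation raised an error
        return {"test": name, "outcome": "error", "detail": repr(exc)}
    if actual == expected:
        return {"test": name, "outcome": "pass"}
    return {"test": name, "outcome": "fail",
            "expected": expected, "actual": actual}

record = run_case("normalize-ws", lambda: " a  b ".split(), ["a", "b"])
print(record)   # {'test': 'normalize-ws', 'outcome': 'pass'}
```

Note that this presupposes exactly what the editor's note observes: the expected value must come from somewhere, i.e. from behavior the specification states explicitly.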

Checkpoint 5.8. Document how the results reporting allows for history/storing analysis comments [Priority 3]

This allows investigation and comparison of different versions of the product.

Guideline 6. Organize test development

Checkpoint 6.1. Prioritize testing areas [Priority 2]

This helps to prioritize test development. It also allows identification of certain areas to be covered by a test prototype.

Checkpoint 6.2. Start with a test suite prototype and publish it. [Priority 2]

Checkpoint 6.3. Start with atomic tests, according to the priorities defined in Ck 2.5. Provide documentation describing the atomic tests [Priority 1]

Checkpoint 6.4. Conduct regular public reviews of the test suite as specification and test suite development continues. Provide the schedule for the public reviews. [Priority 2]

[dd] Ideally yes, but this will not necessarily be the case if the TS is produced within the WG.

Checkpoint 6.5. Conduct regular specification coverage analysis. Provide the schedule for specification coverage analysis. [Priority 2]

Guideline 7. Conduct testing

Checkpoint 7.1. A Working Group must publicly encourage conformance testing among vendors. Organize at least one face-to-face meeting with vendors to review the test suite and encourage testing. [Priority 1]

Common practices are to support a public discussion group dedicated to the test suite and to organize face-to-face meetings for vendors.

[dd] And other interested parties.

Checkpoint 7.2. Encourage vendors to publish test results for their products by reserving a special space where information pertaining to test results can be maintained. [Priority 3]

[dd] It may be that the W3C can have a special space where information pertaining to test results can be given, if not explicitly, then using links to those pages where the information can be found (in order not to have to provide disclaimers).

3. Relationship between QA and other WGs

4. Conformance

This section defines conformance of Working Group processes and operations to the requirements of this specification. The requirements of this specification are detailed in the checkpoints of the preceding "Guidelines" chapter of this specification, and apply to the Working Group QA-related documents and deliverables required by this specification.

4.1 Conformance definition

This section defines three levels of conformance to this specification:

A Working Group conforms to the "QA Framework: Testing Guidelines" at Level X (A, AA, or AAA) if the Working Group meets at least all Conformance Level X requirements.

To make an assertion about conformance to this document, specify:

Example:

"The QA processes and operations of this Working Group conform to W3C's 'QA Framework: Testing Guidelines', available at http://www.w3.org/TR/qaframe-test/, Level AA."

4.2 Conformance disclaimer

The checkpoints of this specification present verifiable conformance requirements about the operational aspects of Working Group quality processes. As with any verifiable test requirements, users should be aware that:

  1. Passing all of the requirements to achieve a given conformance level -- A, AA, or AAA -- does not guarantee that the subject operations and processes are well-suited to or will achieve their intended purposes, nor does it guarantee the quality or suitability of test materials produced under the processes.
  2. Failing to achieve level A conformance does not mean that the subject quality processes and/or associated test materials are necessarily deficient for their intended purposes. It means that the processes have failed one or more checkpoints that best-practice experience has shown to facilitate and enable successful quality processes, which in turn have a high correlation with timely and successful development and maintenance of the test materials.

5. Acknowledgments

The following QA Working Group and Interest Group participants have contributed significantly to the content of this document:

6. References

EXTERN-TA
QA activity email thread about third-party participation in test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Oct/0060.html.
MATRIX
W3C-wide conformance activity survey covering all the Working Groups, "The Matrix", available at http://www.w3.org/QA/TheMatrix.
PROCESS
W3C Process Document, 19 July 2001, available at http://www.w3.org/Consortium/Process-20010719/.
TAXONOMY
QA Activity test taxonomy, a classification scheme for conformance test materials, available at http://www.w3.org/QA/Taxonomy.
QA-GLOSSARY
A comprehensive glossary of QA terms, maintained by the QA Working Group. (Initial version under construction.)
QAIG
Quality Assurance Interest Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/IG/.
QAWG
Quality Assurance Working Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/WG/.
DOM Working Group TS
Process document for DOM Working Group Test suite, available at http://www.w3.org/2002/01/DOMConformanceTS-Process-20020115.
REC-TRACK
Stages and milestones in the W3C Recommendation Track, per the Process Document (Process Document is available at http://www.w3.org/Consortium/Process-20010719/, see section 5.2).
RFC2119
Key words for use in RFCs to Indicate Requirement Levels, March 1997, available at http://www.ietf.org/rfc/rfc2119.txt.
SVGTEST
SVG Working Group's test suite resource page, which may be found at http://www.w3.org/Graphics/SVG/Test/.
WCAG10
Web Content Accessibility Guidelines, version 1.0, W3C Recommendation, 5 May 1999, available at http://www.w3.org/TR/WCAG10/.
WG-QA-RANGE
Email proposal by David Marston, on the QA public mail list, for range of Working Group commitment levels to conformance test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Apr/0004.html.
XMLTEST
OASIS XML Conformance TC's XML test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
XSLT-TEST
OASIS XML Conformance TC's XSLT/XPath test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
QAF-TEST
QA Framework: Test Materials Guidelines (not yet published).
QAF-SPEC
"QA Framework: Specification Guidelines", Working Draft companion version to this document, available at [...].
XQuery-use-case
"XML Query Use Cases".
SOAP-test
SOAP Version 1.2 Specification Assertions and Test Collection
SOAP12-1
SOAP Version 1.2 Part 1
SOAP12-2
SOAP Version 1.2 Part 2
XSD-TEST
W3C XML Schema Test Collection
XSLT-DISC
List of Discretionary items produced by OASIS XSLT/XPath Technical Committee

7. Change History

09-12-2002

Expanded introduction, added motivation, etc...

Added examples to the checkpoints in the Gd1,2,3

[MS] Changed the text of many checkpoints to make them verifiable

[DD] First pass on Introduction, added more text to the checkpoints in the Gd 3-5

07-01-2002

Fixed definitions of priorities

Fixed the glitch with the "Test Areas" guideline

Added clarification to Ck 1.1, 1.2, 1.5 (removed "vague"), 1.6

06-17-2002

Added short prose to each checkpoint

06-12-2002

First draft outline