W3C

QA Framework: Test Guidelines

W3C Working Draft 20 December 2002

This version:
http://www.w3.org/TR/2002/WD-qaframe-test-20021220/
Latest version:
http://www.w3.org/TR/qaframe-test/
Previous version:
(None, this is the first published WD)
Editors:
Kirill Gavrylyuk (kirillg@microsoft.com)
Dimitris Dimitriadis (dimitris@ontologicon.com)
Lofton Henderson (lofton@rockynet.com)
Mark Skall (mark.skall@nist.gov)
Peter Fawcett (pfawcett@real.com)
Contributors:
See Acknowledgments.

Abstract

This document defines a set of common guidelines for conformance test materials for W3C specifications. This document is one in a family of Framework documents of the Quality Assurance (QA) Activity, which includes the other existing or in-progress specifications QA Framework: Introduction, QA Framework: Operational Guidelines, and QA Framework: Specification Guidelines.

Status of this document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. The latest status of this document series is maintained at the W3C.

This document is a W3C Working Draft (WD), made available by the W3C Quality Assurance (QA) Activity for discussion by W3C members and other interested parties. For more information about the QA Activity, please see the QA Activity statement.

This version is the first published Working Draft. It is expected that updated WD versions of this document will be produced regularly, along with other members of the Framework documents family. By intention, this Test Guidelines document lags behind the other guidelines documents of the Framework family by one or two publication cycles. Future progression of this document beyond Working Draft is possible, but has not yet been determined.

This part of the Framework document family will eventually have an informative accompanying QA Framework: Test Examples and Techniques document. It will illustrate ways in which the guidelines and checkpoints of this document might be satisfied.

The QA Working Group Patent Disclosure page contains details on known patents related to this specification, in conformance with W3C policy requirements.

At this time, the QAWG (QA Working Group) has addressed the contents of the document at a high level, agreeing to the concepts and principles as well as the coverage of the guidelines. The QAWG has not, at this time, addressed and achieved consensus on the priority levels. A future version of this document will be accompanied by a "Test Examples & Techniques" document, which will illustrate the guidelines and checkpoints with case studies, and explain how to satisfy the checkpoints. In this version, some prospective content is included after the label "[EX-TECH]". These sections have some unfinished details -- mostly references -- marked by "@@".

Please send comments to www-qa@w3.org, the publicly archived list of the QA Interest Group [QAIG]. Please note that any mail sent to this list will be publicly archived and available; do not send information you wouldn't want to see distributed, such as private data.

Publication of this document does not imply endorsement by the W3C, its membership or its staff. This is a draft document and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use W3C Working Drafts as reference material or to cite them as other than "work in progress".

A list of current W3C Recommendations and other technical documents can be found at http://www.w3.org/TR/.

Table of contents

1. Introduction
    1.1 Motivation for this guidelines document
    1.2 Navigating through this document
    1.3 Priorities
    1.4 Terminology
    1.5 Definitions
2. Guidelines
         G 1. Provide analysis of the specification(s).
         G 2. Declare the structure of the test suite.
         G 3. Document the testing methodology.
         G 4. Provide the test automation and framework
         G 5. Provide the results reporting framework
         G 6. Plan for tests development
         G 7. Plan for conformance testing
3. Conformance
    3.1 Conformance definition
    3.2 Conformance disclaimer
4. Acknowledgments
5. References
6. Change History

An appendix to this document [TEST-CHECKLIST] presents all checkpoints in a tabular form, for convenient reference. This checklist is an Implementation Conformance Statement (ICS) pro-forma for this specification.


1. Introduction

This document is part of a family of QA Framework documents designed to improve the quality of W3C specifications as well as their implementations by solidifying and extending current quality practices within the W3C. The QA Framework documents are listed in the Abstract above.

The guidelines are intended for all Working Groups as well as developers of conformance materials for W3C specifications. Not only are the Working Groups the consumers of these guidelines, they are also key contributors. The guidelines capture the experiences, good practices, activities, and lessons learned of the Working Groups and present them in a comprehensive, cohesive set of documents for all to use and benefit from. The objective is to reuse what works rather than reinvent it, and to foster consistency across the various Working Group quality activities and deliverables.

This document provides guidelines for testing implementations' conformance to W3C specifications. In each of the subsections below, you will find the information and pointers necessary either to choose among existing test suites and test frameworks which may suit your needs or to construct a new test suite for testing implementations' conformance.

Whether developing a new test suite or looking for one to reuse, a number of questions need to be answered.

These questions are the necessary basis to determine the quality of a test suite and its quality criteria. The main goal of the checkpoints in this document is to verify that a test suite provides sufficient information to answer these questions.

The process for developing and using conformance test materials is affected by QA activities beyond those that are explicitly provided in this document. The QA Framework documents are interrelated and complement each other. Links between applicable guidelines in this document and the other Framework documents will be given.

This document illustrates the benefits achieved by following the guidelines for writing specifications, that is, stating conformance criteria and keeping in mind the testability of specifications, such as the interdependencies between specification markup languages and the testing frameworks. In particular, this document shows the added value introduced by using structured test representations and semantic requirements and how these can be used to provide detailed information on implementation conformance as well as streamline the testing process.

1.1 Motivation for this guidelines document

One of the ultimate goals of a standard is interoperability between its implementations. Several complementary efforts help to ensure this goal:

The first effort is discussed in detail in the specification guidelines. The third is carried out by implementers following their release criteria. While those two efforts are essential for interoperability, they both leave room for non-interoperable implementations. The first has no effective criteria for "specification clarity"; that is, clarity is judged by the specification editors and reviewers. The third is focused on a particular implementation and depends on the quality-assurance release criteria for that implementation. A free-to-use conformance test suite that covers most if not all of the specification requirements, is developed by interested parties across industry, and is applicable to any of the specification's implementations, provides:

Such a test suite ensures that implementations comply with all the specification requirements covered by the tests they pass. Once a conformance test suite is established as a criterion for implementations as well as the specification, the quality of the test suite becomes an important factor.

1.2 Navigating through this document

The Guidelines of this document follow the structure of the test suite quality criteria outlined above.

The first two guidelines target the test strategy, providing a simple checklist to verify the scope of the test suite, the "target set" of specifications/areas.

Guidelines 3 - 5 address details of the test methodology and test automation qualities. Guideline 4 (test automation) also focuses on the mechanisms that must be provided in order to measure the specification coverage achieved by the test suite. The specification coverage measurement answers the question of test suite completion.

Guideline 5 focuses on the reporting mechanisms that the test suite must provide, in order to be able to define the test criteria for implementations.

This document employs the WAI (Web Accessibility Initiative) model for representing guidelines or general principles for the development of conformance materials. See, for example, Web Content Accessibility Guidelines. Each guideline includes:

The checkpoint definitions in each guideline define the processes and operations that need to be implemented in order to accomplish the guideline. Each checkpoint definition includes:

Each checkpoint is intended to be specific enough so that someone can implement the checkpoint as well as verify that the checkpoint has been satisfied.

1.3 Priorities

High quality and timely production of test materials are key requirements for producing a high-quality, interoperable standard. Therefore, each checkpoint has a priority level assigned by the QA Working Group based on the checkpoint's impact on the quality and timing of the test materials produced by a Working Group.

[Priority 1]
Critical/essential. These checkpoints are considered to be basic requirements for ensuring the quality of the test materials and interoperability of the implementations. Satisfying these checkpoints is a basic requirement to ensure that test materials are usable and verify the minimum conformance requirements of the standard.
[Priority 2]
Important/desirable. Satisfying these checkpoints, in addition to the priority 1 checkpoints, should significantly improve the usability of the test materials, as well as the interoperability of the implementations.
[Priority 3]
Useful/beneficial. Satisfying these checkpoints, on top of all the others, will further improve the quality and usability of the test materials and facilitate the conformance testing of the implementations.

1.4 Terminology

The keywords "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" are used as defined in RFC 2119 [RFC2119].

Unusual terms in these framework documents are defined when first used, and most generally useful QA-specific terms will eventually be in the QA Glossary [QA-GLOSSARY].

1.5 Definitions

This section contains the definitions for all the critical terms used in the guidelines below. This does not substitute for the QA Glossary [QA-GLOSSARY], but rather focuses on the most important terms for the Test Guidelines. Some terms in this section have been borrowed or adapted from other specifications.

Contradictory behaviors
Two or more different behaviors unconditionally prescribed by a specification for the same class of implementations under the same circumstances.
Discretionary choices
A value or behavior may be chosen from a well-defined enumerated set of two or more possibilities.
Optional behaviors
A well-defined feature may be supported or not (if supported, then the requirements are clear and unambiguous).
Explicitly undefined behaviors
The specification explicitly states that the set of values an element or attribute may take, or the behavior of a product that implements a feature, is open-ended and undefined.
Test Assertion
A set of premises that are known to be true by definition in the specification.
Test Area
A minimal compound unit in the test suite structure.
Test Framework
A set of utilities, stylesheets and documentation that describe and facilitate development, documentation and use of the tests.
Results Verification
A common testing practice used to determine if a test passes or fails by verification of the test result or output against the expected one.

2. Guidelines

Guideline 1. Provide analysis of the specification(s).

In order to define the strategy of the test suite, a detailed analysis of the specification (the subject of the test suite) is required. The better the initial analysis, the clearer the testing strategy will be.

Checkpoint 1.1. Identify the target set of specifications that are being tested. [Priority 1]

Most if not all specifications use notions and behaviors defined in other technical documents. For example, even a base specification like XML uses definitions from specifications such as URN and URI Syntax, Media Types, and Unicode. Some specifications are more self-contained and make only limited use of the syntax defined in other specifications. Others, like XSLT [@@LINK], rely heavily on the syntax and semantics defined in the XPath [@@LINK] specification. In order to understand the scope of the test development work, building a tree of the referenced specifications helps to:

[EX-TECH] The XQuery specification [@@link], which has draft status as of this writing, explicitly defines the set of W3C specifications it depends on, together with their versions: XPath 2.0, XML Schema 1.0, and XML 1.0 SE. This allows conformance test developers to determine the scope of their work and to reuse tests from the test suites built for the referenced standards.

The target set may include more than one specification, depending on how strongly the primary specification under test relies on the referenced specifications.

[EX-TECH] For example, the XML test suite [@@Link] may not include tests that specifically test the URN format, but the XSLT [@@Link] and XQuery [@@Link] test suites will include many tests for XPath functions.

Checkpoint 1.2. Identify test assertions from the target set of specifications. [Priority 1]

Once the target set of specifications is defined, a more formal analysis is required for each of them.

The QA Specification Guidelines require that a set of test assertions for a specification be produced, so this may already exist. A listing of the test assertions is necessary to focus the testing.

[EX-TECH]

The XML Protocol Working Group produced a list of assertions for the SOAP 1.2, Part 1 and Part 2 specifications. The assertions were extracted manually from the specification text, and links to the original text were added.

Microsoft, Open Wave, and America Online contributed an HTML test suite [Ed: TEMP LOCATION, change before FPWD] that contains a list of assertions from the HTML 4.01 specification.

In both cases, extracted assertions helped to reach maximum specification coverage when developing the test suite. Links from the tests to the assertions provide quantified information about the specification coverage.

NIST produced a test matrix for the XML 1.0 test suite that includes explicit references to the assertions in the specification. [@@Link]
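A structured assertion list like those cited above lends itself to machine processing. The following sketch shows one possible shape for such a list and how it can be loaded for later coverage mapping; the markup vocabulary and the sample assertions are hypothetical illustrations, not the format used by any of the cited suites.

```python
# Sketch: parse a hypothetical test-assertion list into a lookup table.
# The element names (<assertions>, <assert>) and sample content are
# assumptions for illustration only.
import xml.etree.ElementTree as ET

ASSERTIONS = """\
<assertions spec="http://example.org/TR/hypothetical-spec">
  <assert id="a1" section="2.1" level="MUST">
    A document MUST begin with a declaration.
  </assert>
  <assert id="a2" section="2.3" level="SHOULD">
    A processor SHOULD report the line number of each error.
  </assert>
</assertions>
"""

def load_assertions(xml_text):
    """Return {assertion id: (spec section, RFC 2119 level)}."""
    root = ET.fromstring(xml_text)
    return {a.get("id"): (a.get("section"), a.get("level"))
            for a in root.findall("assert")}

if __name__ == "__main__":
    table = load_assertions(ASSERTIONS)
    print(table["a1"])  # ('2.1', 'MUST')
```

Keeping the section number and RFC 2119 level with each assertion is what later makes it possible to group assertions by area (checkpoint 1.3) and separate conformance-critical assertions from the rest (checkpoint 1.4).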

Checkpoint 1.3. Group test assertions by levels, profiles, modules, etc. [Priority 1]

The conformance criteria, with respect to subsetting the specification, may include various degrees of variability (e.g., levels, profiles, modules). The test assertions should be grouped according to these subsets, thus facilitating the testing of entire subsets.

[EX-TECH] In the SOAP example for the checkpoint above, the assertions were grouped by modules.

Checkpoint 1.4. Identify those test assertions that are part of the conformance criteria [Priority 1]

Depending on the Conformance Criteria defined in the specification, not all of the test assertions need to be satisfied in order to be conformant to the specification. For example, if the conformance criteria of the specification require implementers to comply only with those assertions that use the "MUST" or "SHALL" keywords as defined in [RFC2119], then all other test assertions (those with "SHOULD", "MAY", etc.) do not belong to the conformance criteria.

[EX-TECH] In all of the examples for the checkpoint above, all the assertions belong to the conformance criteria for the corresponding specifications.

Checkpoint 1.5. Identify all the discretionary choices defined in the specification [Priority 1]

A test suite should be designed to take into account discretionary choices. This allows for testing all possible choices regardless of which of the allowed behaviors an implementation implements. Having a standalone list of the discretionary choices extracted from the specification helps to automate the tuning of the test suite according to the choices taken by a particular implementation.

[EX-TECH] The OASIS XSLT/XPath Conformance Technical Committee produced a list of the discretionary items for the XSLT/XPath specifications. This provided a way to automatically tune the test suite to a particular implementation's choices.
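The tuning described above can be sketched as a simple filter over per-test preconditions: each test declares which discretionary choices it assumes, and only tests matching the implementation's declared choices are run. The test records and the discretionary-item identifier below are hypothetical illustrations, not the OASIS vocabulary.

```python
# Sketch: tune a test suite to one implementation's discretionary choices.
# Tests with no precondition always apply; others apply only when the
# implementation's declared choice matches the test's assumption.
def applicable_tests(tests, implementation_choices):
    return [t for t in tests
            if all(implementation_choices.get(item) == value
                   for item, value in t.get("requires", {}).items())]

tests = [
    {"id": "t1"},                                          # unconditional
    {"id": "t2", "requires": {"sort-lang-fallback": "ignore"}},
    {"id": "t3", "requires": {"sort-lang-fallback": "error"}},
]
impl = {"sort-lang-fallback": "ignore"}  # one implementation's choice

print([t["id"] for t in applicable_tests(tests, impl)])  # ['t1', 't2']
```

The same mechanism extends naturally to optional behaviors (checkpoint 1.6): an option an implementation does not claim simply becomes a choice it never declares.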

Checkpoint 1.6. Identify optional behaviors in the specification [Priority 2]

Some parts of a specification might be declared optional to implement. If an implementation claims to implement them, then it must implement them correctly and conform to the whole part. Good examples of such optional "adjuncts" are protocol bindings.

Checkpoint 1.7. Identify behaviors that are undefined or defined ambiguously in the specification [Priority 2]

Although a specification should not have such flaws, there is no guarantee that it is flawless. Maintaining a list of such issues provides a feedback loop to the specification developers so that the specification can be fixed in its next revision. Additionally, the list can be used in cataloging incoming tests that fall into undefined or ambiguously defined areas.

Checkpoint 1.8. Identify explicitly undefined behaviors in the specification [Priority 2]

Although it is not a recommended practice, a specification's authors may explicitly abstain from defining product behavior in certain circumstances. A list of such statements in the specification helps to analyze incoming tests appropriately.

Checkpoint 1.9. Identify contradictory behaviors in the specification [Priority 2]

Such contradictory combinations should not occur in a specification in the first place. However, if they exist, a list of them is useful both for test analysis and for future errata tracking.

Checkpoint 1.10. List the user scenarios for the specification [Priority 1]

User scenarios help keep the tests focused. They also help in understanding the requirements for the test framework at a very early stage of testing.

[EX-TECH] The W3C XML Query Working Group produced a comprehensive list of Use Cases that has been very helpful in the development of the conformance test suite.

Guideline 2. Declare the structure of the test suite.

There are many ways to structure a test suite. This guideline lists common requirements for the structure and provides common examples.

Checkpoint 2.1. Document the structure for the test suite. [Priority 1]

Usually the structure of the test suite matches the structure of the specification or its content. However, sometimes it is easier to define the test suite structure in other ways, such as applicable testing methodology, user scenarios, profiles, etc.

[EX-TECH] The following is an informative, non-exhaustive list of possible test suite structures:

The test suite may use a combination of these organizational principles.

Checkpoint 2.2. Provide mapping between the test suite structure and the specification structure. [Priority 1]

Regardless of the principle chosen for the test suite structure, the mapping between the test areas and the specification text is essential. The relationship between the tests and the specification should be traceable via a test-to-test-assertion mapping as described in the checkpoint below.

The document model organizational principle is the recommended practice for structuring a test suite and the one most commonly used.

[EX-TECH] Each of the contributions to the W3C XML Schema Test collection is categorized according to the document model (i.e., specification structure).

[EX-TECH] When mapping between the test suite structure and the test specification is provided, it may be useful to design a set of sample testing scenarios, based on the user scenarios. These are not actual tests, but rather test examples. This helps to properly select the testing framework, create templates for test cases, and define future subareas.

For example, the XML Query Use Cases document has sample test scenarios that could be used as a basis for any XQuery test suite and gives a good sense of the requirements for the test framework.

To assess the completeness of the specification coverage by such testing scenarios, additional mapping could be built between the sample testing scenarios and the test assertions. This helps to formalize testing scenarios and provides a basis for the future analysis of the specification coverage.
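One lightweight way to check such a mapping is to compare the specification's sections against the sections claimed by each test area and report the gaps. The section numbers and area names below are hypothetical illustrations, not taken from any cited suite.

```python
# Sketch: map test areas to specification sections and report the
# specification sections with no corresponding test area.
spec_sections = ["2.1", "2.2", "3.1", "3.2", "4.1"]

area_to_sections = {
    "expressions": ["2.1", "2.2"],
    "functions":   ["3.1"],
}

def uncovered_sections(sections, mapping):
    covered = {s for secs in mapping.values() for s in secs}
    return [s for s in sections if s not in covered]

print(uncovered_sections(spec_sections, area_to_sections))  # ['3.2', '4.1']
```

Reporting uncovered sections early keeps the structure honest: a section that no test area claims is a gap in the mapping, not merely a gap in the tests.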

Guideline 3. Document the testing methodology.

The testing methodology provides a high-level answer to the question "How does the test suite verify conformance to the specification?"

[EX-TECH]

A non-exhaustive list of methodologies includes:

For SMIL and SMIL 2.0, the test suite used a specification-driven framework. The test suite attempts to verify the use of each element and attribute as defined in the normative language of the specification. Additionally, it attempts to verify any interactions between elements that are also defined by the normative language of the specification. For other specifications, other approaches may be more appropriate. For example, a specification that defines a specific protocol or method of communication (e.g., SOAP) would be better tested using a use-case method, where a series of use cases is defined and it is up to the tester to create a specific test that verifies a general use case.

Checkpoint 3.1. For each test area, identify the testing approach [Priority 1]

The testing approach provides a set of high-level methods, ideas, or techniques to test the implementation's conformance to the standard. It is convenient to define test areas so that testing in a single area can be done using a single methodology.

The rationale for using a specific testing methodology within a specific test area should be documented. This ensures a consistent approach to testing by all users of a test suite.

Although not required, it is convenient to have a single methodology within a single test area. When possible, a consistent methodology should be used across all test areas. There is a potential trade-off here between consistency and using the best approach to cover a specific area.

[EX-TECH] A simple example of a test suite using different testing approaches for different specification areas is the W3C XML Schema test suite [@@LINK]. Testing conformance to the schema-validation assertions uses a schema-parsing technique with the expected result being true or false, whereas testing XML instance-validation assertions requires verifying the schema's validity, the instance's well-formedness, and the instance's validity against the schema(s).

Another example from the same test suite is testing XML instance validity against a schema. It may be done using an inline schema definition inside the XML instance, by explicitly referencing the schema from the instance using the @schemaLocation attribute, or by supplying the schema document for the namespace used in the XML instance separately. Different parts of the test suite use different techniques.

Checkpoint 3.2. Identify publicly available testing techniques. List the publicly available testing techniques that have been reused [Priority 1]

It is critical to avoid "reinventing the wheel", both for resource reasons and from a future-integration perspective.

Guideline 4. Provide the test automation and framework

The right choice of the test framework is a critical part of the test development.

That said, choosing the right framework for the test suite is the responsibility of the Working Group. There are, however, several things that should be kept in synchronization, most notably reporting, result publication, and test extraction (if it is done using the specification granularity described in the Specification Guidelines). The QA Working Group can provide assistance in accomplishing this.

Checkpoint 4.1. List available test frameworks and applicable automation. Identify available test frameworks used. If none, justify why new frameworks are needed and existing ones could not be used. [Priority 1]

A Working Group that wants to produce a framework for testing implementation conformance with W3C specifications should initially invest some time in reviewing existing testing frameworks (some of which will be given in the future TestGL ExTech document), evaluating them with respect to the WG's testing objectives, and if applicable (in order to not reinvent the wheel) adopt them. The following roadmap is suggested:

If a particular framework is judged appropriate, inform the original author of the test framework (or the WG chair, if the framework was produced by a WG) and communicate additions, changes, and errors, if applicable.

If a suitable testing framework is not found, then proceed along the following checkpoints to produce a testing framework that is applicable.

Augmenting an existing framework amounts to reusing its testing methodology.

Checkpoint 4.2. Ensure the framework and automation are platform independent. Demonstrate on 3 platforms. Ensure that the framework and automation are built using open standards. [Priority 1]

Ideally, any test framework should be platform independent, insofar as running and reporting are concerned. In cases where that is not feasible, parts of the framework may be made platform dependent, as long as they are provided for each of the mainstream platforms. However, keep in mind that providing platform-specific test frameworks has several drawbacks with respect to time and resources: it requires additional work to create these platform-specific frameworks as well as to ascertain their quality.

Checkpoint 4.3. Ensure the framework and automation are applicable to any product or content that implements the specification. Demonstrate with three products or contents. Ensure that the framework and automation are built using open standards. [Priority 2]

Similar to the previous checkpoint, a test suite should be able to cover all classes of products that a specification allows. The challenge is to create a test framework that can be adapted and extended to accommodate future implementations as well as future versions of the specification. While this is a worthy goal, it is not always achievable, and it may be necessary to build additional test frameworks in the future. However, the fact that a test framework is applicable to at least three different implementations is a good indicator that it will be feasible to adapt it to any other product.

Sometimes parts of a test framework have to be adjusted for a particular implementation. The OASIS XSLT [@@LINK] test framework is a good example of this: adjusting the framework required expending additional resources to accomplish the extra work.

[EX-TECH] An example of a product-independent testing framework is a web-based testing framework that runs in the majority of mainstream browsers (with a satisfactory degree of conformance to the specifications they implement).

Checkpoint 4.4. Ensure the framework makes it easy to add tests for any of the specification areas. Demonstrate, through an example, how tests are added to a specification area. [Priority 2]

A test suite will expand over time and eventually cover all areas of the specification. Test frameworks should be designed for extensibility, making it possible to add new test material. This is especially important since test suites will grow over time and test added functionality. It is also critical to comply with this checkpoint in order to fulfill the requirements of checkpoints 5.2 and 8.1 in the QA Operational Guidelines.

Checkpoint 4.5. Ensure the ease of use for the test automation. Document how the test automation is used. [Priority 1]

Usability is a critical requirement for the test suite and the framework. A good test framework must provide sufficient documentation on how to add new tests, how to run tests, and how to monitor and investigate results.

[EX-TECH] A good example of a documented test framework for a conformance test suite is the XSLT test framework [@@LINK].

Checkpoint 4.6. Ensure the framework allows for specification versioning and errata levels. Explain how specification versioning and errata levels are accommodated by the test framework [Priority 2]

This is one of the criteria for adopting or producing a testing framework. It can easily be accomplished by using markup that records the specification level along with other metadata. This is also a requirement of the QA Operational Guidelines.
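One way such metadata can work is sketched below: each test records the specification version and the errata level it was written against, and the framework selects only the tests valid for the combination under test. The field names and values are hypothetical illustrations.

```python
# Sketch: per-test metadata recording the spec version and errata level
# the test depends on, used to select the tests applicable to a given
# version/errata combination.
tests = [
    {"id": "t1", "spec": "1.0", "errata": 0},
    {"id": "t2", "spec": "1.0", "errata": 2},  # depends on erratum E2
    {"id": "t3", "spec": "1.1", "errata": 0},
]

def select(tests, spec_version, errata_level):
    """Tests applicable to an implementation of `spec_version`
    with errata up to `errata_level` applied."""
    return [t["id"] for t in tests
            if t["spec"] == spec_version and t["errata"] <= errata_level]

print(select(tests, "1.0", 1))  # ['t1']
print(select(tests, "1.0", 2))  # ['t1', 't2']
```

Recording the errata level per test means a test written against a later erratum is automatically withheld from implementations that have not yet applied it, rather than reported as a failure.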

Checkpoint 4.7. Ensure the framework accounts for choices allowed by discretionary choices in the specification. Explain how discretionary behaviors are accommodated by the framework. [Priority 2]

Being applicable to any product that implements the specification is an integral requirement for a test suite; the same applies to the checkpoint immediately after this one. Given that the tests are represented in a structured manner, it should be possible to run only those tests that a particular implementation is known to support (as is the case with optional parts of specifications).

Checkpoint 4.8. Ensure the framework includes tests for the optional behaviors defined in the specification. Explain how optional behaviors are accommodated by the framework. [Priority 3]

While it is not a requirement to implement optional behaviors, some of them may be self-contained additions (like protocol bindings or the DOM HTML module) that need their own test suite. These tests will of course be applicable only to those products that claim to implement the optional behaviors or profiles.

Experience from the DOM test suite shows that allowing for optional or multiple behaviors is a high priority on the wish list for test suites. Implementers want to be able to test particular behaviors as defined in the specification, especially when they have chosen to support only parts of the specifications (e.g., DOM builds on XML, which allows for entity-expanding and entity-preserving applications).

Checkpoint 4.9. Ensure the test framework accommodates profiles, modules, product classes, and levels if they are used in the specification. For each dimension of variability used in the specification, demonstrate how the framework allows tests to be filtered by the dimension. [Priority 2]

A framework should provide for tailoring the test suite to the implementation, i.e., selecting only those tests that reflect the combination of the dimensions of variability that is supported by the implementation.

Checkpoint 4.10. Ensure the framework accommodates the conformance policy defined in the specification. Demonstrate how the framework allows tests to be filtered by levels. [Priority 1]

If the conformance criteria introduce levels, the test framework should allow tests to be filtered by level.

Checkpoint 4.11. Ensure the framework supports test results verification. Demonstrate results verification by testing three products. [Priority 1]

Results verification is a common testing practice used to determine whether a test passes or fails by verifying the test result or output against the expected one. It is a critical part of the test framework. The tests should run on any platform against any product implementing the specification, and the same applies to the results-verification support.
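A minimal sketch of results verification, assuming the expected output is stored as a reference text and that insignificant whitespace (trailing spaces, line-ending conventions) should not affect the verdict:

```python
# Sketch: compare actual output against a stored expected ("reference")
# output, normalizing insignificant whitespace so the verdict is
# portable across platforms and line-ending conventions.
def normalize(text):
    return "\n".join(line.rstrip() for line in text.strip().splitlines())

def verdict(actual, expected):
    return "pass" if normalize(actual) == normalize(expected) else "fail"

print(verdict("<out>ok</out>\r\n", "<out>ok</out>\n"))  # pass
print(verdict("<out>ok</out>", "<out>no</out>"))        # fail
```

What counts as "insignificant" is itself a methodology decision (Guideline 3): for XML output, for instance, a canonical-form comparison would be more appropriate than line-based normalization.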

Checkpoint 4.12. Ensure the framework allows the tests to be documented. Explain how to document the tests within the framework. [Priority 2]

Documentation aids understanding of the tests and facilitates maintenance of the test suite. This includes annotating tests with pointers to the original specification(s). A proper source and test documentation mechanism is vital to the quality of the test framework.

[EX-TECH] The W3C test suites for XML, DOM, and XML Schema, and the OASIS XSLT test suite, all contain good examples of documented tests.
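
Per-test documentation of the kind described above can be modeled as structured metadata carrying a pointer back to the specification text it covers. The sketch below is illustrative only; the test identifier, purpose, assertion wording, and specification fragment are hypothetical.

```python
# Illustrative sketch: per-test documentation as structured metadata,
# including a pointer to the specification section the test covers.
# All field values here are hypothetical.

test_doc = {
    "id": "dom-core-nodename-01",
    "purpose": "nodeName returns the element's tag name",
    "spec": "http://www.w3.org/TR/DOM-Level-2-Core/core.html",
    "assertion": "Element.nodeName equals its tagName",
}

def summary(doc):
    """One-line summary suitable for a generated test index."""
    return f"{doc['id']}: {doc['purpose']} ({doc['spec']})"

print(summary(test_doc))
```

When both the tests and the specification are written in structured markup, such pointers can be resolved mechanically, which is what makes the coverage reporting discussed under checkpoint 4.14 feasible.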

Checkpoint 4.13. Ensure the framework has proper test case management. Demonstrate how at least one of the following test case management functions is accomplished within the framework: managing additions, managing removals, filtering by various criteria. [Priority 3]

Test case management includes an accounting system for the tests: managing additions and removals, and filtering by various criteria.

Checkpoint 4.14. Ensure the framework allows specification coverage to be measured. Demonstrate the above by mapping a list of tests to the list of test assertions, grouped by areas. [Priority 2]

One effective way to measure the specification coverage is to map the list of tests to the list of test assertions, which are grouped by areas. Using a structured markup to represent tests makes it possible to point to particular parts of the specification (especially if the specifications are also written using structured markup). In this way, it is possible to group tests according to specification parts and view the results in the same manner.
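
The mapping described above can be sketched as follows. The area names, assertion identifiers, and test-to-assertion mapping are all illustrative assumptions, not drawn from any real test suite.

```python
# Hedged sketch: measuring specification coverage by mapping tests to the
# test assertions they exercise, grouped by area. All data is illustrative.

assertions = {
    "parsing": ["A1", "A2", "A3"],
    "serialization": ["B1", "B2"],
}

# Which assertions each test covers (hypothetical mapping).
test_map = {"t1": ["A1"], "t2": ["A2", "B1"]}

def coverage_by_area(assertions, test_map):
    """Fraction of assertions in each area covered by at least one test."""
    covered = {a for ids in test_map.values() for a in ids}
    return {area: len([a for a in ids if a in covered]) / len(ids)
            for area, ids in assertions.items()}

print(coverage_by_area(assertions, test_map))
```

Areas with low coverage ratios are natural candidates for prioritized test development (see Guideline 6).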

Guideline 5. Provide the results reporting framework

The WG should encourage vendors to report testing results for their products. In order to do that, a WG needs to provide vendors with the results format, necessary style sheets, and any other tools to facilitate reporting.

Checkpoint 5.1. Ensure that the test framework supports reporting the results. [Priority 1]

All the requirements for the test framework such as applicability to any platform and implementation, ease of use, support for the specification versioning, and errata levels directly apply to the results reporting support.

Checkpoint 5.2. Ensure the ease of use for results reporting. Demonstrate that the results reporting has sorting and filtering capabilities. [Priority 1]

Sorting and filtering are a necessary part of the ease-of-use requirement; they facilitate results reporting by vendors.

Checkpoint 5.3. Document how the results reporting allows results to be exported in a self-contained format suitable to publication on the web. [Priority 2]

Results reporting needs to be implemented in a way that allows for the results to be published on the web. Therefore, producing a self-contained version of the test reporting (including pointers to tests and relevant parts of the specification) in HTML form is recommended.
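
A self-contained HTML export of the kind recommended above can be sketched as below. The report structure, field names, and file paths are illustrative assumptions.

```python
# Sketch of exporting results in a self-contained HTML form, with links to
# the tests. Field names and paths are illustrative assumptions.

from html import escape

def report_html(results):
    """Render a list of result records as a single standalone HTML table."""
    rows = "\n".join(
        f"<tr><td><a href='{escape(r['test'])}'>{escape(r['id'])}</a></td>"
        f"<td>{escape(r['status'])}</td></tr>"
        for r in results)
    return ("<html><body><table>\n"
            "<tr><th>Test</th><th>Result</th></tr>\n"
            f"{rows}\n</table></body></html>")

results = [{"id": "core-001", "test": "tests/core-001.xml", "status": "pass"}]
page = report_html(results)
print("core-001" in page and "pass" in page)  # → True
```

A real report would additionally link each row to the relevant specification section, reusing the per-test documentation pointers from checkpoint 4.12.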

Checkpoint 5.4. Demonstrate that the results reporting provides details on failures (logs) sufficient to investigate. [Priority 3]

Logging facilitates testing results investigation and therefore, is part of the ease of use requirement. Results reporting should provide test logs.

[EX-TECH] An example of the logging incorporated into the test framework and results reporting is the SOAP Builder's interoperability testing participants results page [@@LINK].

Checkpoint 5.5. Document how the results reporting allows result history to be stored and comments to be analyzed. [Priority 3]

Documenting the results of the tests from various versions of a product allows the implementor to investigate and compare the results.

Guideline 6. Plan for tests development

The aim of the checkpoints in this guideline is to ensure that the Working Group has a plan in place for the test development. Fulfillment of the checkpoints below will facilitate conformance with checkpoints 5.2 and 5.4 of the QA Operational Guidelines.

Checkpoint 6.1. Define priorities for the test areas [Priority 2]

This helps prioritize test suite development and test reviews. It also allows Working Groups to identify specific areas in the specification for inclusion in a test prototype (see next checkpoint) at the early stages of the test suite development.

Checkpoint 6.2. Introduce a mechanism for the early feedback on the test suite architecture and test framework. Document and publish the mechanism to the intended audience. [Priority 3]

An effective way to verify that the test framework and the test suite architecture suit the testing needs is to develop a test suite prototype that contains a limited number of tests, together with a test framework prototype for it.

Checkpoint 6.3. Ensure regular specification coverage analysis. Provide the schedule for specification coverage analysis. [Priority 2]

Fulfillment of this checkpoint facilitates satisfying checkpoint 3.1 of the QA Operational Guidelines.

[EX-TECH] Several tips for organizing effective test suite development are listed below.

Guideline 7. Plan for conformance testing

Checkpoints of this guideline aim to ensure that the Working Group has a plan for getting interested parties involved in the development and use of conformance materials.

Checkpoint 7.1. Document the plan for engaging vendors of implementations in conformance testing activities. [Priority 1]

A common practice is to support a public discussion group dedicated to the test suite and to organize face-to-face meetings for vendors and other interested parties.

Checkpoint 7.2. Encourage vendors to publish test results for their products by reserving a special space where information pertaining to test results can be maintained. [Priority 3]

It may be possible for the W3C to host a special space where information pertaining to test results can be given, if not explicitly, then via links to the pages where the information can be found (so as not to have to provide disclaimers).

3. Conformance

This section defines conformance of Working Group processes and operations to the requirements of this specification. The requirements of this specification are detailed in the checkpoints of the preceding "Guidelines" chapter and apply to the Working Group QA-related documents and deliverables required by this specification.

3.1 Conformance definition

This section defines three levels of conformance to this specification:

A Working Group conforms to the "QA Framework: Test Guidelines" at Level X (A, AA, or AAA) if the Working Group meets at least all Conformance Level X requirements.

To make an assertion about conformance to this document, specify:

Example:

The Test Suite for X module, version 2.1 of this Working Group, conforms to the W3C's 'QA Framework: Test Guidelines' version 1.0, available at http://www.w3.org/TR/2002/WD-qaframe-test-20021220/, Level AA as determined on January 1, 2003.

3.2 Conformance disclaimer

The checkpoints of this specification present verifiable conformance requirements about the quality of the test materials developed or adopted by the Working Group. As with any verifiable test requirements, users should be aware that:

  1. Passing all of the requirements to achieve a given conformance level -- A, AA, or AAA -- does not guarantee that the subject test materials are well suited to, or will achieve, their intended purposes.
  2. Failing to achieve Level A conformance does not mean that the subject test materials are necessarily deficient for their intended purposes. It means that the test materials fail one or more checkpoints that best-practice experience has shown to facilitate and enable successful development, maintenance, and use of test materials.

4. Acknowledgments

The following QA Working Group and Interest Group participants have contributed significantly to the content of this document:

5. References

EXTERN-TA
QA activity email thread about third-party participation in test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Oct/0060.html.
MATRIX
W3C-wide conformance activity survey covering all the Working Groups, "The Matrix", available at http://www.w3.org/QA/TheMatrix.
PROCESS
W3C Process Document, 19 July 2001, available at http://www.w3.org/Consortium/Process-20010719/.
TAXONOMY
QA Activity test taxonomy, a classification scheme for conformance test materials, available at http://www.w3.org/QA/Taxonomy.
QA-GLOSSARY
A comprehensive glossary of QA terms, maintained by the QA Working Group. (Initial version under construction.)
QAIG
Quality Assurance Interest Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/IG/.
QAWG
Quality Assurance Working Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/WG/.
DOM Working Group TS
Process document for DOM Working Group Test suite, available at http://www.w3.org/2002/01/DOMConformanceTS-Process-20020115.
REC-TRACK
Stages and milestones in the W3C Recommendation Track, per the Process Document (Process Document is available at http://www.w3.org/Consortium/Process-20010719/, see section 5.2).
RFC2119
Key words for use in RFCs to Indicate Requirement Levels, March 1997, available at http://www.ietf.org/rfc/rfc2119.txt.
SVGTEST
SVG Working Group's test suite resource page, which may be found at http://www.w3.org/Graphics/SVG/Test/.
TEST-CHECKLIST
An appendix to this test guidelines document presents all checkpoints in tabular form. Available at http://www.w3.org/TR/2002/WD-qaframe-test-20021220/qaframe-test-checklist
WCAG10
Web Content Accessibility Guidelines, version 1.0, W3C Recommendation, 5 May 1999, available at http://www.w3.org/TR/WCAG10/.
WG-QA-RANGE
Email proposal by David Marston, on the QA public mail list, for range of Working Group commitment levels to conformance test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Apr/0004.html.
XMLTEST
OASIS XML Conformance TC's XML test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
XSLT-TEST
OASIS XML Conformance TC's XSLT/Xpath test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
QAF-INTRO
"QA Framework: Introduction", Working Draft companion version to this document, available at http://www.w3.org/TR/2002/WD-qaframe-intro-20021108/.
QAF-OPS
"QA Framework: Operational Guidelines", Working Draft companion version to this document, available at http://www.w3.org/TR/2002/WD-qaframe-ops-20021108/.
QAF-SPEC
"QA Framework: Specification Guidelines", Working Draft companion version to this document, available at http://www.w3.org/TR/2002/WD-qaframe-spec-20021108/.
XQuery-use-case
"XML Query Use Cases".
SOAP-TEST
SOAP Version 1.2 Specification Assertions and Test Collection
SOAP12-1
SOAP Version 1.2 Part 1
SOAP12-2
SOAP Version 1.2 Part 2
XSD-TEST
W3C XML Schema Test Collection
XSLT-DISC
List of Discretionary items produced by OASIS XSLT/XPath Technical Committee

6. Change History

12-05-2002

Edited and improved the Introduction (goals, motivation, document's structure)

Updated the definition of the checkpoint's Priorities

Corrected abstract, SOT

Changed the goal of the document and wording of the checkpoints/guidelines to focus it on testing strategy, moving all the tips on tactics to be EX-TECH

Added examples to most of the checkpoints

Incorporated all the editors' comments

Rewrote several checkpoints in the Gd 5 - defined results verification and reporting as part of the test framework to remove redundant checkpoints

Rewrote Guidelines 6 and 7 to focus on the strategy of test development and testing, rather than on tactics.

Updated the conformance section

09-12-2002

Expanded introduction, added motivation, etc...

Added examples to the checkpoints in the Gd1,2,3

[MS] Changed the text of many checkpoints to make them verifiable

[DD] First pass on Introduction, added more text to the checkpoints in the Gd 3-5

07-01-2002

Fixed definitions of priorities

Fixed the glitch with the "Test Areas" guideline

Added clarification to Ck 1.1, 1.2, 1.5 (removed "vague"), 1.6

06-17-2002

Added short prose to each checkpoint

06-12-2002

First draft outline