W3C

QA Framework: Testing Guidelines

W3C Working Draft 20 June 2002

This version:
Latest version:
Previous version:
Editors:
Contributors:
See Acknowledgments.

Abstract

This document defines a set of common guidelines for building conformance test materials for W3C specifications. This document is one of a family of Framework documents of the Quality Assurance (QA) Activity, which includes the other existing or in-progress specifications: Introduction, Specification Guidelines, and Test Materials Guidelines.

Status of this document

Note. This is a preliminary document plan - content only.

This section describes the status of this document at the time of its publication. Other documents may supersede this document. The latest status of this document series is maintained at the W3C.

This document is a W3C Working Draft (WD), made available by the W3C Quality Assurance (QA) Activity for discussion by W3C members and other interested parties. For more information about the QA Activity, please see the QA Activity statement.

This version is the first public Working Draft, and incorporates the discussions at the first QA face-to-face, plus several subsequent QA Working Group [QAWG] teleconferences, and supersedes all previous drafts. It is expected that updated WD versions of this document will be produced regularly, along with other members of the Framework documents family. Future progression of this document beyond Working Draft is possible, but has not yet been determined.

This part of the Framework document family will eventually have an accompanying "Testing Examples and Techniques" document (in progress). Some of the lengthier examples and "how to" text of this current guidelines document version will be moved to the "Examples and Techniques" document.

Please send comments to www-qa@w3.org, the publicly archived list of the QA Interest Group [QAIG]. Please note that any mail sent to this list will be publicly archived and available; do not send information you would not want to see distributed, such as private data.

Publication of this document does not imply endorsement by the W3C, its membership or its staff. This is a draft document and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use W3C Working Drafts as reference material or to cite them as other than "work in progress".

A list of current W3C Recommendations and other technical documents can be found at http://www.w3.org/TR/.

Table of contents

1. Introduction
    1.1 Navigating through this document.
    1.2 Priorities
    1.3 Terminology
2. Guidelines
         G 1. Analyze the specification(s).
         G 2. Choose the testing methodology
         G 3. Provide the test automation and framework
         G 4. Provide the results reporting framework
         G 5. Organize tests development
         G 6. Conduct testing
3. Relationship between QA and other WGs
4. Conformance
    4.1 Conformance definition
    4.2 Conformance disclaimer
5. Acknowledgments
6. References
7. Change History


1. Introduction

1.1 Navigating through this document.

The Guidelines in this document are organized chronologically. The document starts with the guidelines that apply at the formation of a Working Group (e.g., charter considerations) and then leads the reader through the various process and operational activities necessary in planning, developing, deploying, and maintaining conformance materials. This document is applicable to all Working Groups, including those that already exist or are being rechartered. Working Groups may already be doing some of these activities; they should review the document and, insofar as possible, incorporate its principles and guidelines into their work.

This document employs the WAI (Web Accessibility Initiative) model for representing guidelines or general principles for the development of conformance materials. See, for example, the Web Content Accessibility Guidelines [WCAG10]. Each guideline includes:

The checkpoint definitions in each guideline define the processes and operations that need to be implemented in order to accomplish the guideline. Each checkpoint definition includes:

Each checkpoint is intended to be specific enough so that someone can implement the checkpoint as well as verify that the checkpoint has been satisfied.

1.2 Priorities

High-quality and timely production of test materials is a key requirement for producing a high-quality, interoperable standard. Therefore each checkpoint has a priority level, assigned by the QA Working Group based on the checkpoint's impact on the quality and timeliness of the test materials produced by a Working Group.

[Priority 1]
A Working Group must satisfy this checkpoint. Otherwise the test materials will not be produced by the time they are required to ensure the quality of the standard, or they may not be usable. Satisfying this checkpoint is a basic requirement to ensure quality and interoperability of the standard.
[Priority 2]
A Working Group should satisfy this checkpoint. Otherwise it may be difficult to produce high quality test materials by the time they are required to ensure the quality of the standard. Satisfying this checkpoint will significantly improve the interoperability of the standard.
[Priority 3]
A Working Group may address this checkpoint. Otherwise it may be somewhat difficult to maintain the high quality of the test materials and to ensure the quality of the standard. Satisfying this checkpoint will improve the interoperability of the standard.

1.3 Terminology

The keywords "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" are used as defined in RFC 2119 [RFC2119].

Unusual terms in these framework documents are defined when first used, and most generally useful QA-specific terms will eventually be in the QA Glossary [QA-GLOSSARY].

2. Guidelines

Guideline 1. Analyze the specification(s).

As with any product testing, the first step should be to analyze the subject. The better the initial analysis, the easier it will be to design the test suite.

Checkpoint 1.1. Create a list of all the specifications used or referenced. [Priority 2]

Most, if not all, specifications use notions and behaviors defined in other technical documents. Building the tree of the specifications that are used or referenced helps to understand the relationships between them and to assess the testing work already done for the referenced specifications.
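
For example, the list of used or referenced specifications can be kept in a simple machine-readable form and traversed to find indirect dependencies as well. The sketch below is a minimal illustration; the specification names and the Python representation are assumptions, not part of these guidelines.

    # Illustrative sketch: a table of which specifications a target
    # specification uses or references, and a traversal that also
    # collects the indirect references. Spec names are examples only.
    SPEC_REFERENCES = {
        "XSLT 1.0": ["XPath 1.0", "XML 1.0"],
        "XPath 1.0": ["XML 1.0", "Namespaces in XML"],
        "Namespaces in XML": ["XML 1.0"],
        "XML 1.0": [],
    }

    def all_referenced(spec, seen=None):
        """Return every specification reachable from `spec`."""
        seen = set() if seen is None else seen
        for ref in SPEC_REFERENCES.get(spec, []):
            if ref not in seen:
                seen.add(ref)
                all_referenced(ref, seen)
        return seen

    print(sorted(all_referenced("XSLT 1.0")))
    # ['Namespaces in XML', 'XML 1.0', 'XPath 1.0']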

[dd] I'd actually propose that this be priority 1, as interdependencies between specifications are vital to test, especially for API specifications and base specifications, such as XML, which is referenced by many others.

Checkpoint 1.2. Extract testable assertions from the target set of specifications. [Priority 1]

Once you have defined the target set of specifications that you are testing, a more formal analysis should be done for each of them. The QA Specification Guidelines [QAF-SPEC] recommend producing a set of testable assertions for each specification, so you may have them already. A list of testable assertions is necessary to focus testing.

[KG] Need a definition of "testable assertion" to be referenced.
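
One possible way to record the extracted assertions is sketched below. The representation, field names, and sample assertions are illustrative assumptions only; they simply show the kind of information worth capturing (a stable identifier, a pointer into the specification, the RFC 2119 level, and the assertion text).

    from dataclasses import dataclass

    @dataclass
    class TestableAssertion:
        id: str        # stable identifier, referenced later by test cases
        section: str   # pointer back into the specification
        level: str     # RFC 2119 keyword: MUST, SHOULD, MAY, ...
        text: str      # the assertion itself, quoted or paraphrased

    # Hypothetical assertions, not taken from any real specification.
    ASSERTIONS = [
        TestableAssertion("spec-001", "2.1", "MUST",
                          "A document MUST contain exactly one root element."),
        TestableAssertion("spec-002", "3.4", "SHOULD",
                          "A processor SHOULD report the position of each error."),
        TestableAssertion("spec-003", "5.2", "MAY",
                          "A processor MAY support additional encodings."),
    ]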

Checkpoint 1.3. Identify the testable assertions that are part of the conformance criteria [Priority 1]

Depending on the conformance criteria defined in the specification, not all of the testable assertions need to be satisfied in order to conform to the specification. For example, if the conformance criteria require implementors to comply only with those assertions that use "MUST" or "SHALL", then all other testable assertions (those using "SHOULD", "MAY", etc.) do not belong to the conformance criteria.

Moreover, the conformance criteria may define levels of conformance, in which case the testable assertions should be grouped by those levels.
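
As a small illustration (the assertion identifiers and levels below are invented), selecting the assertions that fall under a "MUST/SHALL only" conformance criterion, and grouping assertions by level, might look like this:

    # Hypothetical assertion identifiers and their RFC 2119 levels.
    ASSERTION_LEVELS = {
        "spec-001": "MUST",
        "spec-002": "SHOULD",
        "spec-003": "MAY",
        "spec-004": "SHALL",
    }

    # Keywords that place an assertion inside a "MUST/SHALL only"
    # conformance criterion.
    MANDATORY = {"MUST", "MUST NOT", "SHALL", "SHALL NOT", "REQUIRED"}

    conformance_assertions = sorted(
        aid for aid, level in ASSERTION_LEVELS.items() if level in MANDATORY)
    print(conformance_assertions)   # ['spec-001', 'spec-004']

    # If the conformance criteria define levels instead, the same table
    # can be grouped by level.
    by_level = {}
    for aid, level in ASSERTION_LEVELS.items():
        by_level.setdefault(level, []).append(aid)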

Checkpoint 1.4. Extract all the discretionary behaviors defined in the specification [Priority 1]

The test suite should accommodate discretionary behaviors, so that products can be tested according to the vendor's choice among the allowed behaviors. Therefore, if the discretionary behaviors are not already identified in the specification, the tester should identify them.

Checkpoint 1.5. Identify vague areas in the specification [Priority 2]

[DM]Ideally, the specs have no vagueness. Test developers can identify the "known" areas of vagueness, but must synchronize with errata. Scanning the documents to detect vagueness and cataloging feedback sent to the spec editors are two different things; which would this checkpoint espouse?

[dd] Purely editorial: I think that "find vague areas in the specification" is a better wording than "define vague areas". Also, finding vague areas opens up the possibility to extract vague areas using a stylesheet, for example.

By vague areas we mean behaviors that are unintentionally undefined or ambiguously defined. Although ideally the specification should have none, this can never be guaranteed. Maintaining a list of such issues helps both in fixing the specification in later revisions and in cataloging incoming tests that fall into vague areas.

Checkpoint 1.6. Identify explicitly undefined behaviors in the specification [Priority 2]

Although it is not a recommended practice, a specification's authors may explicitly abstain from defining product behavior in certain circumstances. A list of such areas helps to analyze incoming tests appropriately.

Checkpoint 1.7. Identify optional behaviors in the specification [Priority 2]

Optional behaviors are features that a conforming product may, but need not, implement; protocol bindings are one example. A list of optional behaviors helps to decide which tests apply only to products that claim to implement them (see also Checkpoint 3.8).

Checkpoint 1.8. Identify contradictory behaviors in the specification [Priority 2]

Contradictory behaviors should not be present in the specification, but if they exist, a list of them is needed for test analysis.

Checkpoint 1.9. List user scenarios for the specification [Priority 2]

User scenarios help keep the tests focused.

[dd] Again, I'd go for moving this up to priority one. It is especially important for non-technical oriented specifications, such as WAI.

Checkpoint 1.10. Define target areas for testing [Priority 1]

Target areas are needed mainly for categorizing the tests in the test suite. Usually the testing areas match the specification's areas and content, but sometimes it is easier to define them based on other criteria, such as the applicable testing methodology, user scenarios, etc.

Checkpoint 1.11. Prioritize testing areas [Priority 2]

This helps to prioritize test development.

Checkpoint 1.12. For each testing area, produce a set of sample testing scenarios [Priority 3]

Before creating test cases for a certain area of the specification, it may be useful to design a set of sample testing scenarios based on the user scenarios. These are not actual tests, but rather examples. This helps to select an appropriate testing framework, create templates for test cases, and define future sub-areas.

Checkpoint 1.13. Map sample testing scenarios to testable assertions, discretionary behaviors and vague areas [Priority 3]

This helps to formalize the testing scenarios and provides a basis for future analysis of specification coverage.

Guideline 2. Choose the testing methodology

[dd] This guideline covers the area of test framework, something I anticipate will be covered elsewhere. Here, only substantial issues relevant to incorporating existing frameworks, and not altering them, should be raised.

[dd] Available and applicable methodologies need to be given here.

Checkpoint 2.1. For each test area, choose an applicable testing approach [Priority 1]

[dd] As above, list available ones.

By testing approach we mean a set of high-level methods, ideas, and strategies. It is convenient to define test areas so that testing in a single area can be done using a single methodology.

Checkpoint 2.2. Reuse publicly available testing techniques if applicable [Priority 1]

It is critical to avoid "reinventing the wheel", both for resource reasons and from the perspective of future integration.

Guideline 3. Provide the test automation and framework

Once the test strategy is defined, the right choice of framework is critical for smooth test development and use.

Checkpoint 3.1. Review available test frameworks and automation, and adopt existing ones if applicable [Priority 1]

[dd] This checkpoint is partly inconsistent with the immediately previous one; reordering them might help.

The rationale is the same as for reusing testing methodologies (Checkpoint 2.2).

Checkpoint 3.2. Ensure the framework and automation are platform independent. [Priority 1]

The alternative is to provide an implementation of the framework for every platform.

Checkpoint 3.3. Ensure the framework and automation are applicable to any product/content that implements the specification [Priority 2]

Similar to the previous checkpoint: the test suite should be able to cover all products that the specification allows.

[dd] The two previous checkpoints could be given in one, stating that the testing framework chosen should, if possible, be platform independent. Also, we need to keep in mind that providing platform-specific test frameworks raises issues with adding work that needs to be done to the testing framework itself; if we were to provide platform-specific test framework, the QA WG or any party producing those would need to allocate time to produce and ascertain their quality.

Checkpoint 3.4. Ensure the framework makes it easy to add tests for any of the specification areas [Priority 2]

The test suite will expand over time, and should eventually cover all areas of the specification.

Checkpoint 3.5. Ensure the ease of use for the test automation [Priority 1]

Usability is a critical requirement for the test suite. Just as critical is making it easy to contribute tests.

Checkpoint 3.6. Ensure the framework allows for specification versioning and errata levels [Priority 2]

This is a requirement from the Process guidelines.

Checkpoint 3.7. Ensure the framework accounts for choices allowed for discretionary behaviors in the specification [Priority 3]

This is an integral part of making the test suite applicable to any product allowed by the specification.
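
As an illustration only (the discretionary item, its values, and the file names are invented), a framework might let a product declare its choice for each discretionary item and then select the matching expected result:

    # Hypothetical discretionary item: the specification allows either
    # choice, and the expected output depends on which one the product
    # under test has declared.
    EXPECTED_BY_CHOICE = {
        "whitespace-handling": {
            "preserve": "out/whitespace-preserved.xml",
            "strip":    "out/whitespace-stripped.xml",
        },
    }

    def expected_output(item, product_profile):
        """Pick the expected output for the product's declared choice."""
        choice = product_profile[item]
        return EXPECTED_BY_CHOICE[item][choice]

    profile = {"whitespace-handling": "strip"}   # declared by the vendor
    print(expected_output("whitespace-handling", profile))
    # out/whitespace-stripped.xml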

Checkpoint 3.8. Ensure the framework allows for tests for optional behaviors defined in the specification [Priority 3]

While optional behaviors are not required to be implemented, some of them may be self-contained additions (like protocol bindings) that need a test suite of their own. These tests will of course be applicable only to those products that claim to implement the optional behaviors or profiles.

[dd] Experience from the DOM TS shows that allowing for optional/multiple behaviours is a high priority on the wish list for the TS. Implementors want to be able to test particular behaviours as defined in the specification, especially as they may have chosen to support only parts of the specifications (eg. DOM builds on XML, which allows for entity expanding/entity preserving applications).

Checkpoint 3.9. Ensure the framework accommodates levels of conformance defined in the specification [Priority 1]

If the conformance criteria introduce levels, the test framework should allow tests to be filtered by level.
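
A minimal sketch, with invented test identifiers and level names, of filtering the test list by the conformance level a product claims (assuming higher levels include all lower ones):

    # Hypothetical tests tagged with the conformance level they belong to.
    TESTS = [
        {"id": "t-001", "level": "Level 1"},
        {"id": "t-002", "level": "Level 1"},
        {"id": "t-003", "level": "Level 2"},
    ]

    def tests_for_level(claimed_level, ordering=("Level 1", "Level 2")):
        """Return the tests required for a claimed conformance level,
        assuming each level includes all lower ones."""
        included = ordering[:ordering.index(claimed_level) + 1]
        return [t["id"] for t in TESTS if t["level"] in included]

    print(tests_for_level("Level 1"))   # ['t-001', 't-002']
    print(tests_for_level("Level 2"))   # ['t-001', 't-002', 't-003']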

Checkpoint 3.10. Ensure the results verification is product independent [Priority 1]

Results verification is a critical part of the test framework. Since the tests should run on any platform against any product implementing the specification, results verification (for example, comparison against expected output) should be product independent.
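
For instance, when tests produce XML output, one product-independent verification strategy is to canonicalize both the actual and the expected output before comparing them, so that incidental serialization differences (attribute order, quoting, self-closing tags) introduced by a particular product do not cause false failures. The sketch below is only one possible approach and relies on the XML canonicalization support in Python 3.8 or later.

    # Product-independent verification of XML output: compare the
    # canonical (C14N) forms rather than the raw serializations.
    from xml.etree.ElementTree import canonicalize   # Python 3.8+

    def outputs_match(actual_xml, expected_xml):
        return canonicalize(actual_xml) == canonicalize(expected_xml)

    actual   = '<result b="2" a="1"></result>'
    expected = "<result a='1' b='2'/>"
    print(outputs_match(actual, expected))   # True -- same canonical form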

Checkpoint 3.11. Ensure the framework allows the tests to be documented [Priority 2]

This is needed for better maintenance. It includes annotating tests with pointers to the original specification(s).

Checkpoint 3.12. Ensure the framework has proper test case management [Priority 3]

Test case management includes an accounting system for the tests: managing additions and removals, and filtering by various criteria.

Checkpoint 3.13. Ensure the framework allows specification coverage to be measured [Priority 2]

One effective way to measure specification coverage is to map the list of tests to the list of testable assertions, grouped by area.

[dd] Absolutely; this way of grouping works fine with the way we have discussed modules of specifications.
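
As an illustration of this mapping approach (all identifiers below are hypothetical), coverage per area can be computed as the fraction of the area's assertions exercised by at least one test:

    # Hypothetical data: assertions grouped by testing area, and the
    # assertions exercised by each test in the suite.
    ASSERTIONS_BY_AREA = {
        "parsing":    {"spec-001", "spec-002", "spec-003"},
        "validation": {"spec-010", "spec-011"},
    }
    TEST_TO_ASSERTIONS = {
        "t-001": {"spec-001"},
        "t-002": {"spec-002", "spec-010"},
    }

    covered = set().union(*TEST_TO_ASSERTIONS.values())
    for area, assertions in sorted(ASSERTIONS_BY_AREA.items()):
        pct = 100.0 * len(assertions & covered) / len(assertions)
        print(f"{area}: {pct:.0f}% of assertions covered")
    # parsing: 67% of assertions covered
    # validation: 50% of assertions covered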

Guideline 4. Provide the results reporting framework

A Working Group should encourage vendors to report testing results for their products. In order to do that, the Working Group needs to provide vendors with a results format, the necessary stylesheets, etc.
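
The exact results format is for the Working Group to define. Purely as an illustrative assumption, a single reported result might carry at least the following fields:

    # Illustrative sketch of one reported test result; the field names
    # are assumptions, not a defined W3C format.
    result = {
        "test_id":         "t-001",
        "outcome":         "pass",            # pass / fail / not applicable
        "product":         "ExampleProcessor",
        "product_version": "1.3",
        "spec_version":    "1.0",
        "errata_level":    "2002-05-15",
        "platform":        "any",
        "notes":           "",
    }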

Checkpoint 4.1. Review available results reporting frameworks and adopt an existing one if applicable. [Priority 1]

Checkpoint 4.2. Ensure the results reporting is platform independent [Priority 1]

As with the tests themselves, the results reporting should be usable by any vendor.

[dd] As above, given that the testing framework is uniform in functionality and independence, this will have been dealt with elsewhere.

Checkpoint 4.3. Ensure the results reporting is compatible with the test framework [Priority 1]

Checkpoint 4.4. Ensure the ease of use for results reporting [Priority 1]

This is necessary to facilitate results reporting by vendors. Ensure that the results reporting has sorting and filtering capabilities, etc.

Checkpoint 4.5. Ensure the results reporting allows for specification versioning and errata levels [Priority 2]

The same considerations apply as for the tests (see Checkpoint 3.6).

Checkpoint 4.6. Ensure the results reporting allows results to be exported in a self-contained format suitable for publishing on the Web [Priority 2]
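
No particular export format is mandated here. Purely as a sketch (the function and field names are assumptions), results could be rendered into a single self-contained HTML page that can be published as-is:

    # Sketch: render result records into one self-contained HTML page.
    from html import escape

    def export_html(results, title="Test results"):
        rows = "\n".join(
            "<tr><td>{}</td><td>{}</td></tr>".format(
                escape(r["test_id"]), escape(r["outcome"]))
            for r in results)
        return ("<html><head><title>{0}</title></head><body>"
                "<h1>{0}</h1><table border='1'>"
                "<tr><th>Test</th><th>Outcome</th></tr>{1}</table>"
                "</body></html>").format(escape(title), rows)

    page = export_html([{"test_id": "t-001", "outcome": "pass"},
                        {"test_id": "t-002", "outcome": "fail"}])
    open("results.html", "w").write(page)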

Checkpoint 4.7. Ensure the results reporting provides details on failures sufficient to investigate [Priority 3]

Logging, for example.
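
As one hypothetical illustration (every field name here is an assumption), a failure record might capture enough context to investigate the failure without rerunning the whole suite:

    # Illustrative failure record with enough detail to investigate.
    failure = {
        "test_id":   "t-002",
        "assertion": "spec-002",              # the testable assertion involved
        "expected":  "out/t-002-expected.xml",
        "actual":    "out/t-002-actual.xml",
        "log":       "logs/t-002.log",        # captured output of the test run
        "environment": {"product": "ExampleProcessor", "version": "1.3"},
    }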

Checkpoint 4.8. Ensure the results reporting allows for history/storing analysis comments [Priority 3]

This allows different versions of a product to be investigated and compared.

Guideline 5. Organize tests development

Checkpoint 5.1. Start with the test suite prototype and publish it. [Priority 2]

Checkpoint 5.2. Start with atomic tests first, according to priorities defined in Ck2.5 [Priority 1]

Checkpoint 5.3. Conduct regular public reviews of the test suite as specification and test suite development continues [Priority 2]

[dd] Ideally yes, but this will not necessarily be the case if the TS is produced within the WG.

Checkpoint 5.4. Conduct regular specification coverage analysis. [Priority 2]

Guideline 6. Conduct testing

Checkpoint 6.1. A Working Group must publicly encourage conformance testing among vendors. [Priority 1]

Common practices are to support a public discussion group dedicated to the test suite and to organize face-to-face meetings for vendors.

[dd] And other interested parties.

Checkpoint 6.2. Encourage vendors to publish test results for their products. [Priority 3]

[dd] It may be that the W3C can have a special space where information pertaining to test results can be given, if not explicitly, then using links to those pages where the information can be found (in order not to have to provide disclaimers).

3. Relationship between QA and other WGs

4. Conformance

This section defines conformance of Working Group processes and operations to the requirements of this specification. The requirements of this specification are detailed in the checkpoints of the preceding "Guidelines" chapter of this specification, and apply to the Working Group QA-related documents and deliverables required by this specification.

4.1 Conformance definition

This section defines three levels of conformance to this specification:

  Conformance Level "A": all Priority 1 checkpoints are satisfied;
  Conformance Level "AA": all Priority 1 and Priority 2 checkpoints are satisfied;
  Conformance Level "AAA": all Priority 1, Priority 2, and Priority 3 checkpoints are satisfied.

A Working Group conforms to the "QA Framework: Testing Guidelines" at Level X (A, AA, or AAA) if the Working Group meets at least all Conformance Level X requirements.

To make an assertion about conformance to this document, specify:

  the title of this document;
  the URI of this document;
  the conformance level satisfied (A, AA, or AAA).

Example:

"This QA processes and operations of this Working Group conform to W3C's 'QA Framework: Operational Guidelines', available at http://www.w3.org/TR/2002/qaframe-ops/, Level AA."

4.2 Conformance disclaimer

The checkpoints of this specification present verifiable conformance requirements about the test development processes and operations of Working Groups. As with any verifiable test requirements, users should be aware that:

  1. Passing all of the requirements to achieve a given conformance level -- A, AA, or AAA -- does not guarantee that the subject operations and processes are well-suited to or will achieve their intended purposes, nor does it guarantee the quality or suitability of test materials produced under the processes.
  2. Failing to achieve level A conformance does not mean that the subject quality processes and/or associated test materials are necessarily deficient to their intended purposes. It means that the processes have failed one or more checkpoints that best-practice experience has shown to facilitate and enable successful quality processes, that in turn have a high correlation with timely and successful development and maintenance of the test materials.

5. Acknowledgments

The following QA Working Group and Interest Group participants have contributed significantly to the content of this document:

6. References

EXTERN-TA
QA activity email thread about third-party participation in test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Oct/0060.html.
MATRIX
W3C-wide conformance activity survey covering all the Working Groups, "The Matrix", available at http://www.w3.org/QA/TheMatrix.
PROCESS
W3C Process Document, 19 July 2001, available at http://www.w3.org/Consortium/Process-20010719/.
TAXONOMY
QA Activity test taxonomy, a classification scheme for conformance test materials, available at http://www.w3.org/QA/Taxonomy.
QA-GLOSSARY
A comprehensive glossary of QA terms, maintained by the QA Working Group. (Initial version under construction.)
QAIG
Quality Assurance Interest Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/IG/.
QAWG
Quality Assurance Working Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/WG/.
DOM Working Group TS
Process document for DOM Working Group Test suite, available at http://www.w3.org/2002/01/DOMConformanceTS-Process-20020115.
REC-TRACK
Stages and milestones in the W3C Recommendation Track, per the Process Document (Process Document is available at http://www.w3.org/Consortium/Process-20010719/, see section 5.2).
RFC2119
Key words for use in RFCs to Indicate Requirement Levels, March 1997, available at http://www.ietf.org/rfc/rfc2119.txt.
SVGTEST
SVG Working Group's test suite resource page, which may be found at http://www.w3.org/Graphics/SVG/Test/.
WCAG10
Web Content Accessibility Guidelines, version 1.0, W3C Recommendation, 5 May 1999, available at http://www.w3.org/TR/WCAG10/.
WG-QA-RANGE
Email proposal by David Marston, on the QA public mail list, for range of Working Group commitment levels to conformance test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Apr/0004.html.
XMLTEST
OASIS XML Conformance TC's XML test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
XSLT-TEST
OASIS XML Conformance TC's XSLT/XPath test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
QAF-TEST
QA Framework: Test Materials Guidelines (not yet published).
QAF-SPEC
"QA Framework: Specification Guidelines", Working Draft companion version to this document, available at [...].

7. Change History

06-12-2002

First draft outline