W3C

QA Framework: Test Guidelines

W3C Working Draft 15 July 2002

This version:
Latest version:
Previous version:
Editors:
Contributors:
See Acknowledgments.

Abstract

This document defines a set of common guidelines for building conformance test materials for W3C specifications. This document is one in a family of Framework documents of the Quality Assurance (QA) Activity, which includes the other existing or in-progress specifications: Introduction; Operational Guidelines; and, Specification Guidelines.

Status of this document

Note. This is a preliminary document plan. It is an Editors' draft, meant for QA Editors' and QAWG members' internal discussion only. Its contents have not been endorsed or approved by the QAWG, or by any other entity of W3C.

This section describes the status of this document at the time of its publication. Other documents may supersede this document. The latest status of this document series is maintained at the W3C.

This document is a [...QA WG Editors draft...]. For more information about the QA Activity, please see the QA Activity statement.

This version is [...QA WG Editors draft...]. It is expected that updated [..public..] WD versions of this document will be produced regularly, along with other members of the Framework documents family. Future progression of this document beyond Working Draft is possible, but has not yet been determined.

This part of the Framework document family will eventually have an accompanying "Testing Examples & Techniques". Some of the lengthier examples and "how to" text of this current guidelines document version will be moved to the "Examples and Techniques" document.

Table of contents

1. Introduction
1.1 Navigating through this document.
1.2 Priorities
1.3 Terminology
2. Guidelines
G 1. Analyze the specification(s).
G 2. Define testing areas
G 3. Choose the testing methodology
G 4. Provide the test automation and framework
G 5. Provide the results reporting framework
G 6. Organize tests development
G 7. Conduct testing
3. Relationship between QA and other WGs
4. Conformance
4.1 Conformance definition
4.2 Conformance disclaimer
5. Acknowledgments
6. References
7. Change History


1. Introduction

1.1 Navigating through this document.

[@@Ed. Note. Ignore this section. It has not yet been rewritten from OpsGL@@] The Guidelines in the document are organized chronologically. The document starts with the guidelines that apply at the formation of a Working Group (e.g., charter considerations) and continues leading the reader through the various process and operational activities necessary in planning, developing, deploying and maintaining conformance materials. This document is applicable to all Working Groups, including those that are being rechartered or already exist. Working Groups may already be doing some of these activities and should review the document and in so far as possible incorporate principles and guidelines into their work.

This document employs the WAI (Web Accessibility Initiative) model for representing guidelines or general principles for the development of conformance materials. See, for example, Web Content Accessibility Guidelines. Each guideline includes:

The checkpoint definitions in each guideline define the processes and operations that need to be implemented in order to accomplish the guideline. Each checkpoint definition includes:

Each checkpoint is intended to be specific enough so that someone can implement the checkpoint as well as verify that the checkpoint has been satisfied.

1.2 Priorities

High-quality and timely production of test materials is a key requirement for producing a high-quality, interoperable standard. Therefore each checkpoint has a priority level assigned by the QA Working Group based on the checkpoint's impact on the quality and timing of the test materials produced by a Working Group.

[Priority 1]
Satisfying this checkpoint is a basic requirement to ensure quality and interoperability of the standard. If the checkpoint is not satisfied, the test materials will not be produced by the time they are required to ensure the quality of the standard, or they may not be usable.
[Priority 2]
Satisfying this checkpoint will significantly improve the interoperability of the standard. If the checkpoint is not satisfied, it may be difficult to produce high quality test materials by the time they are required to ensure the quality of the standard.
[Priority 3]
Satisfying this checkpoint will further improve the interoperability of the standard. If the checkpoint is not satisfied, it may be somewhat difficult to maintain the high quality of the test materials and to ensure the quality of the standard.

1.3 Terminology

The keywords "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" will be used as defined in RFC 2119 [RFC2119].

Unusual terms in these framework documents are defined when first used, and most generally useful QA-specific terms will eventually be in the QA Glossary [QA-GLOSSARY].

2. Guidelines

Guideline 1. Analyze the specification(s).

As with any product testing, the first step should be to analyze the subject. The better the initial analysis, the easier it will be to design the test suite.

Checkpoint 1.1. Create a list of all the specifications used or referenced. [Priority 2]

Most, if not all, specifications use notions and behaviors defined in other technical documents. For example, even a base specification like XML uses definitions from specifications like URN and URI Syntax, Media Types, Unicode, etc. Some specifications are more self-contained and make only limited use of syntax defined in other specifications. Other specifications, like XSLT [@@LINK], rely heavily on the syntax and semantics defined in the XPath [@@LINK] specification. Building the tree of referenced specifications helps to define the scope of testing.
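The dependency tree mentioned above can be sketched as a small adjacency map and walked to list every specification a test suite implicitly depends on. The specification names and dependencies below are illustrative examples only, not a normative list.

```python
# Sketch: build and walk a tree of referenced specifications.
# The dependencies listed here are illustrative examples only.
SPEC_REFS = {
    "XSLT": ["XPath", "XML"],
    "XPath": ["XML"],
    "XML": ["URI Syntax", "Unicode"],
}

def referenced_specs(spec, refs=SPEC_REFS):
    """Return every specification reachable from `spec`, depth-first."""
    seen = []
    def walk(name):
        for dep in refs.get(name, []):
            if dep not in seen:
                seen.append(dep)
                walk(dep)
    walk(spec)
    return seen

print(referenced_specs("XSLT"))
# → ['XPath', 'XML', 'URI Syntax', 'Unicode']
```

Even this naive walk makes the point of the checkpoint concrete: testing XSLT implicitly assumes that XPath, XML, and their own references behave correctly.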

[dd] I'd actually propose that this be priority 1, as interdependencies between specifications are vital to test, especially for API specifications and base specifications, such as XML, which is referenced by many others.

[wg] Discussed what specifications this checkpoint was meant to include. Agreed that it meant those being tested AND those that were referenced by the testing, and that when you tested you assumed the dependencies worked correctly. Also discussed that Issue13 (testing multiple specifications) needed to be resolved, but that either way it should not affect this checkpoint. Some clarification to be added, including why the list was important.

Checkpoint 1.2. Extract test assertions from the target set of specifications. [Priority 1]

Once you have defined the target set of specifications you are testing, a more formal analysis should be done for each of them. The target set may include more than one specification, depending on how strongly the primary specification under test relies on the referenced specifications. For example, an XML test suite may not include tests that specifically test the URN format, but XSLT and XQuery test suites will include many tests for XPath functions.

The QA Specification Guidelines recommend producing a set of test assertions for a specification, so you may have them already. A list of test assertions is necessary to focus testing. [KG] Need a definition of "test assertion" to be referenced.
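One way to bootstrap such a list is to scan the specification prose for RFC 2119 keywords. The sketch below is deliberately naive (sentence splitting on periods, substring keyword matching); it flags candidate assertions that would still need human review, and the identifier scheme is hypothetical.

```python
import re

# Naive sketch: flag sentences containing RFC 2119 keywords as
# candidate test assertions. Real extraction needs human review.
KEYWORDS = ("MUST NOT", "MUST", "SHALL NOT", "SHALL",
            "SHOULD NOT", "SHOULD", "MAY", "OPTIONAL", "REQUIRED")

def extract_assertions(spec_text):
    assertions = []
    for i, sentence in enumerate(re.split(r'(?<=\.)\s+', spec_text)):
        for kw in KEYWORDS:
            if kw in sentence:
                assertions.append({"id": f"TA-{i+1}",
                                   "level": kw,
                                   "text": sentence.strip()})
                break  # record only the first matching keyword
    return assertions

sample = ("A processor MUST report a fatal error. "
          "It SHOULD continue processing to find further errors. "
          "This section is informative.")
for a in extract_assertions(sample):
    print(a["id"], a["level"])
# → TA-1 MUST
#   TA-2 SHOULD
```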

[WG] Needs to be clarified to indicate it is related to only the target specification (or specs based on issue 13)

Checkpoint 1.3. Define those test assertions that are part of conformance criteria [Priority 1]

Depending on the Conformance Criteria defined in the specification, not all test assertions need to be satisfied in order to conform to the specification. For example, if the conformance criteria require implementers to comply only with those assertions that use "MUST" or "SHALL", all other test assertions (with "SHOULD", "MAY", etc.) do not belong to the conformance criteria.

Moreover, conformance criteria may define levels of conformance, in which case test assertions should be grouped by those levels.
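That grouping can be sketched directly: partition the assertion list by its RFC 2119 keyword, with only the mandatory group feeding the conformance criteria. The group names below are illustrative; which groups actually count toward conformance is the specification's decision.

```python
# Sketch: partition test assertions by requirement strength.
# Group names are illustrative; the spec's conformance criteria
# decide which groups are actually required.
MANDATORY = {"MUST", "MUST NOT", "SHALL", "SHALL NOT", "REQUIRED"}
RECOMMENDED = {"SHOULD", "SHOULD NOT", "RECOMMENDED"}

def group_by_conformance(assertions):
    groups = {"mandatory": [], "recommended": [], "optional": []}
    for a in assertions:
        if a["level"] in MANDATORY:
            groups["mandatory"].append(a["id"])
        elif a["level"] in RECOMMENDED:
            groups["recommended"].append(a["id"])
        else:
            groups["optional"].append(a["id"])
    return groups

assertions = [{"id": "TA-1", "level": "MUST"},
              {"id": "TA-2", "level": "SHOULD"},
              {"id": "TA-3", "level": "MAY"}]
print(group_by_conformance(assertions))
# → {'mandatory': ['TA-1'], 'recommended': ['TA-2'], 'optional': ['TA-3']}
```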

[WG] Discussed if, based on the wording, we were creating assertions that might not be used to test conformance. Agreed that conformance levels were possible but that this section needed to be clarified to indicate how to correctly group the test assertions.

Checkpoint 1.4. Extract all the discretionary behaviors defined in the specification [Priority 1]

The test suite should accommodate discretionary behaviors so that it can test products regardless of the vendor's choice among the allowed behaviors. Therefore, if the discretionary behaviors are not already identified in the specification, the tester should identify them.

Checkpoint 1.5. Identify optional behaviors in the specification [Priority 2]

For example, protocol bindings.

Checkpoint 1.6. Identify behaviors that are unintentionally undefined or defined ambiguously in the specification [Priority 2]

[DM]Ideally, the specs have no vagueness. Test developers can identify the "known" areas of vagueness, but must synchronize with errata. Scanning the documents to detect vagueness and cataloging feedback sent to the spec editors are two different things; which would this checkpoint espouse?

Although the spec ideally should not have such flaws, one can never guarantee it. Maintaining a list of such issues helps in both fixing the specification in future revisions and cataloging incoming tests that fall into undefined or ambiguously defined areas.

Checkpoint 1.7. Identify explicitly undefined behaviors in the specification [Priority 2]

Although it is not a recommended practice, a specification's authors may explicitly abstain from defining product behavior in certain circumstances. A list of such areas helps to analyze incoming tests appropriately.

Checkpoint 1.8. Identify contradictory behaviors in the specification [Priority 2]

Contradictory behaviors should not be in the specification, but if they exist, a list of them is needed for test analysis.

[WG] Okay, but also discussed that there are a number of terms that need to go into the Glossary for this to all be clear.

Checkpoint 1.9. List user scenarios for the specification [Priority 2]

User scenarios help keep the tests focused.

[dd] Again, I'd go for moving this up to priority one. It is especially important for non-technical oriented specifications, such as WAI.

Guideline 2. Define testing areas

A testing area is a set of rules described in the specification that the tester groups together based on some commonality.

Checkpoint 2.1. Define target areas for testing [Priority 1]

Needed mainly for categorization of the tests in the test suite. Usually the testing areas match the specification's areas/content, but sometimes it is easier to define them based on other criteria, such as applicable testing methodology, user scenarios, etc. If there is no 1:1 mapping between test areas and specification areas/content, the relationship between tests and the specification should still be traceable via the test-to-test-assertion mapping described in the checkpoint below.

[WG] Needs examples and to be discussed as it relates to levels, modules and profiles.

Checkpoint 2.2. Prioritize testing areas [Priority 2]

Helps to prioritize test development.

[WG] Needs to be moved to someplace under the guideline on test development, and should provide examples of different criteria you could use to prioritize the tests.

Checkpoint 2.3. For each testing area, produce a set of sample testing scenarios [Priority 3]

Before creating test cases for a certain area of the specification, it may be useful to design a set of sample testing scenarios, based on the user scenarios. These are not actual tests, but rather examples. This helps to properly select the testing framework, create templates for test cases, and define future sub-areas.

[WG] Needs examples and some clarification.

Checkpoint 2.4. Map sample testing scenarios to test assertions, discretionary behaviors and vague areas [Priority 3]

Helps to formalize testing scenarios and provides basis for the future analysis of the specification coverage.

Guideline 3. Choose the testing methodology

[dd] This guideline covers the area of test framework, something I anticipate will be covered elsewhere. Here, only substantial issues relevant to incorporating existing frameworks, and not altering them, should be raised.

[dd] Available and applicable methodologies need to be given here.

[WG] This guideline is meant to be a high level approach on how to build tests, and needs to be updated to make this clear.

Checkpoint 3.1. For each test area identify applicable testing approach [Priority 1]

[dd] As above, list available ones.

By testing approach we mean a set of high-level methods/ideas/strategies. It is convenient to define test areas so that testing in a single area can be done using a single methodology.

[WG] Discussed if this meant choose from a list or define how. Agreed it should be define how. To clarify.

Checkpoint 3.2. Reuse publicly available testing techniques if applicable [Priority 1]

It is critical to avoid "reinventing the wheel" from both resource considerations and future integration perspectives.

[WG] Both 2.1 and 2.2 need to be reworded, as neither is verifiable or actionable.

Guideline 4. Provide the test automation and framework

Once the test strategy is defined, the right choice of framework is critical for smooth future test development and use.

[dd] A general point: in the checkpoints connected to this guideline there seems to be a choice between frameworks. Ideally, the QA WG will produce a small number of test frameworks that implement most, if not all, of the options mentioned. For clarification, this does not mean that the QA WG will produce test frameworks for all WGs; there are, however, a number of things that should be kept in synchronization, most notably reporting, result publication and test extraction (if it is done using the specification granularity we speak of in the Specification Guidelines). To allow for this, I would propose that this guideline be reorganized as follows:

Checkpoint 4.1. Review available test frameworks, automation and adopt existing if applicable [Priority 1]

[dd] This checkpoint is partly inconsistent with the immediately previous one; reordering them might help.

The argument is the same as for reusing the testing methodology.

Checkpoint 4.2. Ensure the framework and automation are platform independent. [Priority 1]

The alternative is to provide an implementation of the framework for every platform.

Checkpoint 4.3. Ensure the framework and automation are applicable to any product/content that implements the specification [Priority 2]

Similar to the previous checkpoint: the test suite should be able to cover all products that the specification allows.

[dd] The two previous checkpoints could be combined into one, stating that the testing framework chosen should, if possible, be platform independent. Also, we need to keep in mind that providing platform-specific test frameworks adds work to the testing framework itself; if we were to provide platform-specific test frameworks, the QA WG or any party producing them would need to allocate time to produce them and ascertain their quality.

Checkpoint 4.4. Ensure the framework makes it easy to add tests for any of the specification areas [Priority 2]

The test suite will expand over time, and eventually cover all areas of the specification.

Checkpoint 4.5. Ensure the ease of use for the test automation [Priority 1]

Usability is a critical requirement of the test suite. But just as critical is making it easy to contribute tests.

Checkpoint 4.6. Ensure the framework allows for specification versioning and errata levels [Priority 2]

A requirement from the Process guidelines.

Checkpoint 4.7. Ensure the framework accounts for choices allowed for discretionary behaviors in the specification [Priority 3]

This is integral to making the test suite applicable to any product allowed by the specification.

Checkpoint 4.8. Ensure the framework allows for tests for optional behaviors defined in the specification [Priority 3]

While optional behaviors are not necessary to implement, some of them might be self-contained additions (like protocol bindings) that need a test suite themselves. These tests will of course be applicable only to those products that claim to implement the optional behaviors/profiles.

[dd] Experience from the DOM TS shows that allowing for optional/multiple behaviors is a high priority on the wish list for the TS. Implementers want to be able to test particular behaviors as defined in the specification, especially as they may have chosen to support only parts of the specifications (eg. DOM builds on XML, which allows for entity expanding/entity preserving applications).

Checkpoint 4.9. Ensure the framework accommodates levels of conformance defined in the specification [Priority 1]

If the conformance criteria introduce levels, the test framework should allow filtering tests by level.

Checkpoint 4.10. Ensure the results verification is product independent [Priority 1]

Results verification is a critical part of the test framework. Since the tests should run on any platform against any product implementing the spec, results verification (for example, comparison against expected output) should be product independent.
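A minimal sketch of product-independent verification: compare actual and expected output only after normalizing details that legitimately vary between products. Whitespace normalization is used here purely as an illustration; the right canonical form depends on the output format being compared.

```python
# Sketch: verify a test result by comparing canonicalized output, so
# that product-specific formatting differences do not cause spurious
# failures. Whitespace normalization is just one example of a
# canonical form; pick one appropriate to the output format.
def canonicalize(output):
    return " ".join(output.split())

def verify(actual, expected):
    return canonicalize(actual) == canonicalize(expected)

# Two serializations of the same result compare equal:
print(verify("<result>\n  ok\n</result>", "<result> ok </result>"))
# → True
```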

Checkpoint 4.11. Ensure the framework allows tests to be documented [Priority 2]

For better maintenance. This includes annotating tests with pointers to the original specification(s).

Checkpoint 4.12. Ensure the framework has proper test case management [Priority 3]

Test case management includes an accounting system for tests, managing additions and removals, and filtering by various criteria.

Checkpoint 4.13. Ensure the framework allows specification coverage to be measured [Priority 2]

One effective way to measure specification coverage is to map the list of tests to the list of test assertions grouped by area.
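That mapping then yields a simple coverage figure, as in this sketch (the assertion and test identifiers are hypothetical):

```python
# Sketch: measure specification coverage as the fraction of test
# assertions that at least one test is mapped to. Identifiers are
# hypothetical examples.
assertions = ["TA-1", "TA-2", "TA-3", "TA-4"]
test_map = {"test-001": ["TA-1", "TA-2"],
            "test-002": ["TA-2"]}

covered = {ta for tas in test_map.values() for ta in tas}
coverage = len(covered & set(assertions)) / len(assertions)
uncovered = [ta for ta in assertions if ta not in covered]

print(f"coverage: {coverage:.0%}, uncovered: {uncovered}")
# → coverage: 50%, uncovered: ['TA-3', 'TA-4']
```

The uncovered list is as useful as the percentage: it tells the WG exactly which assertions still need tests.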

[dd] Absolutely; this way of grouping works fine with the way we have discussed modules of specifications.

Guideline 5. Provide the results reporting framework

A WG should encourage vendors to report test results for their products. In order to do that, the WG needs to provide vendors with a results format, the necessary stylesheets, etc.
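The results format handed to vendors could be as small as the record below. The field names, product name, and errata label are illustrative assumptions, not a proposed standard.

```python
# Sketch: a minimal, self-describing test-results record that a
# vendor could publish. Field names and values are illustrative only.
report = {
    "product": "ExampleParser 1.0",   # hypothetical product
    "spec": "XML 1.0",
    "errata_level": "2e",             # hypothetical errata label
    "results": [
        {"test": "test-001", "outcome": "pass"},
        {"test": "test-002", "outcome": "fail",
         "detail": "unexpected output on line 3"},
    ],
}

# A summary any stylesheet or tool could derive from the record:
passed = sum(r["outcome"] == "pass" for r in report["results"])
print(f"{passed}/{len(report['results'])} tests passed")
# → 1/2 tests passed
```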

Checkpoint 5.1. Review available results reporting frameworks and adopt existing if applicable. [Priority 1]

Checkpoint 5.2. Ensure the results reporting is platform independent [Priority 1]

As with the tests themselves, results reporting should be usable by any vendor.

[dd] As above, given that the testing framework is uniform in functionality and independence, this will have been dealt with elsewhere.

Checkpoint 5.3. Ensure the results reporting is compatible with the test framework [Priority 1]

Checkpoint 5.4. Ensure the ease of use for results reporting [Priority 1]

Necessary to facilitate results reporting by vendors. Ensure that the results reporting has sorting and filtering capabilities, etc.

Checkpoint 5.5. Ensure the results reporting allows for specification versioning and errata levels [Priority 2]

Same as in tests.

Checkpoint 5.6. Ensure the results reporting allows results to be exported in a self-contained format suitable for publishing on the web [Priority 2]

Checkpoint 5.7. Ensure the results reporting provides details on failures sufficient to investigate [Priority 3]

Logging, for example.

Checkpoint 5.8. Ensure the results reporting allows for history/storing analysis comments [Priority 3]

To investigate and compare different versions of the product.

Guideline 6. Organize tests development

Checkpoint 6.1. Start with the test suite prototype and publish it. [Priority 2]

Checkpoint 6.2. Start with atomic tests first, according to priorities defined in Ck2.5 [Priority 1]

Checkpoint 6.3. Conduct regular public reviews of the test suite as specification and test suite development continues [Priority 2]

[dd] Ideally yes, but this will not necessarily be the case if the TS is produced within the WG.

Checkpoint 6.4. Conduct regular specification coverage analysis. [Priority 2]

Guideline 7. Conduct testing

Checkpoint 7.1. A Working Group must publicly encourage conformance testing among vendors. [Priority 1]

A common practice is to support a public discussion group dedicated to the test suite and to organize f2f meetings for vendors.

[dd] And other interested parties.

Checkpoint 7.2. Vendors to publish test results for their products. [Priority 3]

[dd] It may be that the W3C can have a special space where information pertaining to test results can be given, if not explicitly, then using links to those pages where the information can be found (in order not to have to provide disclaimers).

3. Relationship between QA and other WGs

[@@to be written@@]

4. Conformance

[@@Ed. Note. Ignore this chapter. It has not yet been rewritten from OpsGL@@]

This section defines conformance of Working Group processes and operations to the requirements of this specification. The requirements of this specification are detailed in the checkpoints of the preceding "Guidelines" chapter of this specification, and apply to the Working Group QA-related documents and deliverables required by this specification.

4.1 Conformance definition

This section defines three levels of conformance to this specification:

A Working Group conforms to the "QA Framework: Operational Guidelines" at Level X (A, AA, or AAA) if the Working Group meets at least all Conformance Level X requirements.

To make an assertion about conformance to this document, specify:

Example:

"The QA processes and operations of this Working Group conform to W3C's 'QA Framework: Operational Guidelines', available at http://www.w3.org/TR/2002/qaframe-ops/, Level AA."

4.2 Conformance disclaimer

The checkpoints of this specification present verifiable conformance requirements about the operational aspects of Working Group quality processes. As with any verifiable test requirements, users should be aware that:

  1. Passing all of the requirements to achieve a given conformance level -- A, AA, or AAA -- does not guarantee that the subject operations and processes are well-suited to or will achieve their intended purposes, nor does it guarantee the quality or suitability of test materials produced under the processes.
  2. Failing to achieve level A conformance does not mean that the subject quality processes and/or associated test materials are necessarily deficient to their intended purposes. It means that the processes have failed one or more checkpoints that best-practice experience has shown to facilitate and enable successful quality processes, that in turn have a high correlation with timely and successful development and maintenance of the test materials.

5. Acknowledgments

The following QA Working Group and Interest Group participants have contributed significantly to the content of this document:

6. References

EXTERN-TA
QA activity email thread about third-party participation in test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Oct/0060.html.
MATRIX
W3C-wide conformance activity survey covering all the Working Groups, "The Matrix", available at http://www.w3.org/QA/TheMatrix.
PROCESS
W3C Process Document, 19 July 2001, available at http://www.w3.org/Consortium/Process-20010719/.
TAXONOMY
QA Activity test taxonomy, a classification scheme for conformance test materials, available at http://www.w3.org/QA/Taxonomy.
QA-GLOSSARY
A comprehensive glossary of QA terms, maintained by the QA Working Group. (Initial version under construction.)
QAIG
Quality Assurance Interest Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/IG/.
QAWG
Quality Assurance Working Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/WG/.
DOM Working Group TS
Process document for DOM Working Group Test suite, available at http://www.w3.org/2002/01/DOMConformanceTS-Process-20020115.
REC-TRACK
Stages and milestones in the W3C Recommendation Track, per the Process Document (Process Document is available at http://www.w3.org/Consortium/Process-20010719/, see section 5.2).
RFC2119
Key words for use in RFCs to Indicate Requirement Levels, March 1997, available at http://www.ietf.org/rfc/rfc2119.txt.
SVGTEST
SVG Working Group's test suite resource page, which may be found at http://www.w3.org/Graphics/SVG/Test/.
WCAG10
Web Content Accessibility Guidelines, version 1.0, W3C Recommendation, 5 May 1999, available at http://www.w3.org/TR/WCAG10/.
WG-QA-RANGE
Email proposal by David Marston, on the QA public mail list, for range of Working Group commitment levels to conformance test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Apr/0004.html.
XMLTEST
OASIS XML Conformance TC's XML test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
XSLT-TEST
OASIS XML Conformance TC's XSLT/XPath test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
QAF-TEST
QA Framework: Test Materials Guidelines (not yet published).
QAF-SPEC
"QA Framework: Specification Guidelines", Working Draft companion version to this document, available at [...].

7. Change History

07-15-2002

Fixed title (chg "Testing" to "Test")

Clarified SoTD -- **Editors Draft**

Added "@@Ignore this.." to unchanged OpsGL verbiage.

07-01-2002

Fixed definitions of priorities

Fixed the glitch with the "Test Areas" guideline

Added clarification to Ck 1.1, 1.2, 1.5 (removed "vague"), 1.6

06-17-2002

Added short prose to each checkpoint

06-12-2002

First draft outline