Copyright © 2003 W3C ® (MIT, ERCIM, Keio), All Rights Reserved. W3C liability, trademark, document use and software licensing rules apply.
This document defines a set of common guidelines for conformance test materials for W3C specifications. This document is one in a family of Framework documents of the Quality Assurance (QA) Activity, which includes the other existing or in-progress QA Framework specifications: Introduction, QA Framework: Operational Guidelines, and QA Framework: Specification Guidelines.
This section describes the status of this document at the time of its publication. Other documents may supersede this document. The latest status of this document series is maintained at the W3C.
This document is a W3C Working Draft (WD), made available by the W3C Quality Assurance (QA) Activity, for discussion by W3C members and other interested parties. For more information about the QA Activity, please see the QA Activity statement.
This version is the second published Working Draft. It is expected that updated WD versions of this document will be produced regularly, along with other members of the Framework documents family. Future progression of this document beyond Working Draft is possible, but has not yet been determined.
This second published Working Draft represents a major reorganization and restructuring of the previous version. It lacks much detail, but the set of Guidelines and Checkpoints is complete, and there is at least some "Conformance requirements" and "Discussion" text on most.
The QA Working Group wants comments, and asks that reviewers focus specifically on the new organization and structure, and on the collection of Guidelines and Checkpoints. Comments on wording, prose, and completeness of description will be of relatively little use at this stage. It is the structure and normative content that interest us now.
This part of the Framework document family will eventually have an informative accompanying QA Framework: Test Examples and Techniques document. It will illustrate ways in which the guidelines and checkpoints of this document might be satisfied.
The QA Working Group Patent Disclosure page contains details on known patents related to this specification, in conformance with W3C policy requirements.
Please send comments to www-qa@w3.org, the publicly archived list of the QA Interest Group [QAIG]. Please note that any mail sent to this list will be publicly archived and available. Do not send information you wouldn't want to see distributed, such as private data.
Publication of this document does not imply endorsement by the W3C, its membership or its staff. This is a draft document and may be updated, replaced, or made obsolete by other documents at any time. It is inappropriate to use W3C Working Drafts as reference material or to cite them as other than "work in progress".
A list of current W3C Recommendations and other technical documents can be found at http://www.w3.org/TR/.
Two separate appendices to this document [TEST-CHECKLIST] and [TEST-ICS] present all checkpoints in a tabular form sorted in their original order and sorted by their priorities, for convenient reference. The latter is an Implementation Conformance Statement (ICS) pro-forma for this specification.
The scope of this specification is a set of requirements for Test Materials (TM) that, if satisfied, will enhance the usability and clarity of the test materials. It covers the analysis and coverage of specifications, the prioritization and management of test cases, test frameworks, and result reporting.
The goal is to help W3C Working Groups (WGs) and test material developers in developing test materials that provide consistent, reproducible results within a well-defined and clear scope.
The class of product or target of this specification is conformance test materials, including conformance test suites, validation tools, conformance checklists, and any other materials that are used to check or indicate conformance.
The intended audience of these guidelines is developers of conformance test materials for W3C specifications. However, they are applicable to all W3C Working Group members, as well as to testers of implementations.
While it is optimal that the development of test materials begin early in the process, these guidelines are intended for newly chartered and existing Working Groups alike. Working Groups who may already be doing some of these activities should review the document and incorporate the principles and guidelines into their test materials as much as possible.
The development of quality test materials helps enhance the quality of specifications by identifying aspects of the specification that are ambiguous, contradictory or not implementable. By helping to improve the clarity of the specification, the quality of implementations is also improved.
Quality test materials also help improve the conformance of implementations by providing methods of checking conformance to well defined criteria in a consistent way. By providing a consistent way to check the conformance of implementations, test materials also lead to more interoperable implementations.
This document is part of a family of QA Framework documents designed to help the WGs improve all aspects of their quality practices by solidifying and extending current quality practices found within the W3C. The QA Framework documents are:
The QA Framework documents are interrelated and complement each other. For example, there is a close relationship between producing and maintaining test materials and the tracking of errata. Hence there is an interrelationship between this document and the Operational Guidelines. The reader is strongly encouraged to be familiar with the other documents in the family.
The guidelines are intended for all Working Groups as well as developers of conformance materials for W3C specifications. Not only are the Working Groups the consumer of these guidelines, they are also key contributors. The guidelines capture the experiences, good practices, activities, and lessons learned of the Working Groups and present them in a comprehensive, cohesive set of documents for all to use and benefit from. The objective is to reuse what works rather than reinvent, and to foster consistency across the various Working Groups' activities and deliverables relating to quality assurance.
This specification applies the WAI (Web Accessibility Initiative) guidelines model to Working Group quality practices and the development of conformance test materials. See, for example, Web Content Accessibility Guidelines. Each guideline includes:
The guidelines in this document begin with the analysis of the specification and the identification of priorities and goals of the test materials (GL1-2). The guidelines then focus on test case management and development (GL3-4). They then proceed to cover usability and results reporting (GL5-7).
The checkpoint definitions in each guideline define test material aspects and systems that need to be implemented in order to accomplish the guideline. Each checkpoint definition includes:
Each checkpoint is intended to be specific enough that someone can implement it as well as verify that it has been satisfied. A checkpoint will contain at least one, and may contain multiple, individual requirements that use RFC2119 normative keywords. See the Conformance section for further discussion of requirements and test assertions.
Some checkpoints are more critical than others for the timely production of high-quality, highly usable test materials. Therefore each checkpoint has a priority level based on the checkpoint's impact on the quality and timing of the test materials produced by a Working Group.
The keywords "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" are used as defined in RFC 2119 [RFC2119]. When used with the normative RFC2119 meanings, they are all upper case. Occurrences of these words in lower case comprise normal prose usage, with no normative implications.
[@@Ed note. The normative usage of must, should, etc, is not yet in upper case.]
Unusual terms in this framework document are defined when first used, and most generally useful QA-specific terms will eventually be in the QA Glossary [QA-GLOSSARY].
In order to determine the testing strategy or strategies to be used, a high-level analysis of the structure of the specification (the subject of the test suite) must be performed. The better the initial analysis, the clearer the testing strategy will be.
Conformance requirements: The test suite MUST define its scope, goals, and intended purpose.
Rationale: When writing test suites it is critical to understand their primary purpose and scope. The scope describes the areas covered by the test suite, thereby indicating the applicability of the test suite, the motivation and objectives, and coverage. For example, the goals, intended purpose, and coverage of tests for a CR document may differ from those for a Recommendation. Note that a WG may have multiple test suites for different parts of a specification (e.g., profiles or modules) or for different applications of the specification (e.g., bindings).
Conformance requirements: All specifications referenced by the specification under test must be identified. The extent to which these specifications can be assumed to be tested, or must be tested by the conformance test suite for this specification, must be stated.
Rationale: Many specifications make reference to and/or depend on other specifications. It is important to determine the extent to which the specification under test can assume that referenced specifications have already been conformance-tested.
Conformance requirements: The scope, goal, and purpose of the test suite as a whole (and, where appropriate, of each logical 'partition' of the test suite), together with the mapping between such partitions and sections of the specification, must be identified and documented.
Rationale: Different areas of the specification under test may require different testing approaches (for example, low-level API testing as opposed to higher-level user-scenario testing). The test-suite documentation should explain any partitioning of the specification (and other referenced specifications) and the testing approach adopted for each partition.
As recommended in the QA Specification Guidelines, an important prerequisite for test development is the identification of test assertions within the specification.
Once assertions have been identified they should be tagged with attributes that will enable test developers, the test-management system, the test-execution framework, and the results-reporting process to make useful distinctions between groups of tests.
Conformance requirements: Test assertions within the specification must be identified and documented.
Conformance requirements: TBD
Rationale: Assertions should be tagged with metadata that identifies:
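One way to realize such tagging is to attach a small, structured metadata record to each extracted assertion. The sketch below is illustrative only; the attribute names (section, RFC 2119 keywords, module) are assumptions, not attributes prescribed by this document.

```python
from dataclasses import dataclass, field

@dataclass
class TestAssertion:
    """A test assertion extracted from a specification (illustrative schema)."""
    assertion_id: str                 # unique identifier, e.g. "sec3.2-a1"
    spec_section: str                 # section of the specification it comes from
    text: str                         # the normative statement itself
    keywords: list = field(default_factory=list)  # RFC 2119 keywords used
    module: str = ""                  # module or profile it belongs to, if any

# Example: tag an assertion so tools can later group and filter by it.
a = TestAssertion(
    assertion_id="sec3.2-a1",
    spec_section="3.2",
    text="The test suite MUST define its scope, goals, and intended purpose.",
    keywords=["MUST"],
    module="core",
)
```

Records like this give test developers, the test-management system, and the reporting process a shared vocabulary for distinguishing groups of tests.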
The goal when testing an implementation is to achieve consistent reproducible results that accurately indicate the conformance of the implementation to the specification. In order to achieve this goal it is necessary that any tester, given a set of criteria for an implementation (the profile, conformance level, etc), will use the same set of tests and arrive at the same set of results from those tests. If the results of testing an implementation vary, depending on the tester, then it cannot be determined which set of results is the one that was used to verify conformance.
In order to help ensure that testing is done in a consistent way, test management systems and test frameworks should be used. A test management system is a system that organizes and manages test cases and allows information about test cases to be associated with the tests. A test framework is a system that assists in the running of the tests. The two systems may be integrated. This guideline covers test management systems. The next guideline covers test frameworks.
Conformance requirements: There must be a system designed to manage and organize test cases.
Rationale: In order to accurately test an implementation, one needs to be able to systematically cover each area of functionality. One must know what has been covered, what still needs to be covered and what is not applicable. By providing some system of management and organization, testers are better able to provide accurate and more complete coverage.
Conformance requirements: The test management system must support the association of test cases with the criteria given in Guideline 2 Checkpoint 2.
Rationale: A tester needs to know which test cases apply to a given implementation in order to accurately test its conformance. By providing the associated criteria to the tester along with the test case, the tester can verify that the case is applicable to the implementation.
Conformance requirements: The test management system must allow for tests to be filtered and sorted based on the list of criteria given in Guideline 2 Checkpoint 2.
Rationale: By allowing testers to sort and filter test cases based on these criteria, testers can quickly identify and run only the tests associated with the specific target that the tester is covering. By associating test cases back to the test assertion, testers can identify what is being tested and what the correct result should be.
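Filtering and sorting of this kind can be sketched minimally as metadata matching, assuming each test case carries a metadata dictionary. The field names below (module, level, assertion) are illustrative, not drawn from this document.

```python
# Illustrative test-case records with metadata (field names are assumptions).
test_cases = [
    {"id": "t1", "module": "core", "level": "A",  "assertion": "sec2-a1"},
    {"id": "t2", "module": "core", "level": "AA", "assertion": "sec2-a2"},
    {"id": "t3", "module": "ext",  "level": "A",  "assertion": "sec4-a1"},
]

def select(cases, **criteria):
    """Return the test cases whose metadata matches all given criteria."""
    return [c for c in cases if all(c.get(k) == v for k, v in criteria.items())]

# A tester targeting the "core" module at level "A" runs only the matching tests.
core_level_a = select(test_cases, module="core", level="A")
```

Because each record also names its assertion, a tester can trace any selected test back to the normative statement it verifies.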
Conformance requirements: The test management system must allow for the results of a test case to be associated with the test.
Rationale: By providing a way for testers to associate the results of a test, the results become one more criterion by which tests may be grouped. By identifying tests that pass, testers do not need to cover the same ground again.
Discussion. Providing support for results helps to fulfill part of the requirements of Guideline 6.
Conformance requirements: The Working Group must provide a test framework.
Rationale: By providing a test framework, a consistent interface to testing is used by all testers. This leads to more consistent and uniform results which aid in more uniform conformance testing.
Conformance requirements: The test framework must be prototyped by the Working Group and tested by implementors of the specifications.
Rationale: To be truly useful, the test framework must work with the systems that the implementors are supporting. It follows that such test frameworks should be designed to run on the widest number of platforms applicable to the specification's domain.
Conformance requirements: The test framework must support the automation of running tests.
Rationale: Reasonably complete test suites may contain a very large number of test cases. Running each test case by hand takes a significant amount of time, and because of this time investment, interoperability testing may delay entrance to Proposed Recommendation. In order to shorten the amount of time required to run a test suite, some test cases may be automated.
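Such automation can be as simple as a runner that executes every test case and records an outcome for each. The sketch below is a minimal, hypothetical illustration; the example test functions stand in for real test cases.

```python
# Minimal sketch of automated test execution: each test is a callable,
# and the runner records a pass/fail outcome per test.
def run_all(tests):
    """Run each (name, callable) pair and record its outcome."""
    results = {}
    for name, fn in tests:
        try:
            fn()
            results[name] = "pass"
        except AssertionError:
            results[name] = "fail"
    return results

# Stand-in test cases for illustration.
def addition_works():
    assert 1 + 1 == 2

def addition_broken():
    assert 1 + 1 == 3

outcomes = run_all([("addition_works", addition_works),
                    ("addition_broken", addition_broken)])
# outcomes == {"addition_works": "pass", "addition_broken": "fail"}
```

A real framework would add per-test setup and teardown, timeouts, and logging, but the essential gain is the same: a whole suite runs unattended instead of test by hand.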
One of the entrance criteria for a Proposed Recommendation (PR) is that each feature of the technical report has been implemented, preferably by demonstrating two interoperable implementations of each feature. Providing a standardized result reporting mechanism helps facilitate these interoperability tests. A second goal of the WG should be to encourage vendors to report testing results for their products. In order to do that, a WG needs to provide vendors with the results format, necessary style sheets, and any other tools needed to facilitate reporting.
Conformance requirements: Test Materials must support a method of recording the pass/fail state for each test.
Rationale: In the case of interoperability test suites required for verifying two implementations, providing a system for result reporting allows implementors to submit uniform sets of results. This eases the process of comparison and the determination of which features are in danger of being eliminated due to lack of support.
Conformance requirements: The result reporting framework should allow users to export or otherwise generate a self-contained report that may be published on the web.
Rationale: In order to encourage vendors to publish their results publicly on the web, it is desirable to make the process of publishing as easy as possible. By producing a unified web page or package of pages, vendors do not need to convert the results from one format to another in order to publish.
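Generating such a self-contained page can be sketched as a simple transformation from recorded outcomes to a single HTML document. The function and field names below are illustrative assumptions, not a format prescribed by this document.

```python
# Sketch: export a results dictionary as one self-contained HTML page
# suitable for publishing on the web (structure is illustrative).
def to_html(suite_name, results):
    """Render {test_id: outcome} as a single HTML page with a results table."""
    rows = "\n".join(
        f"<tr><td>{test_id}</td><td>{outcome}</td></tr>"
        for test_id, outcome in sorted(results.items())
    )
    return (f"<html><head><title>{suite_name} results</title></head>"
            f"<body><table><tr><th>Test</th><th>Result</th></tr>\n{rows}\n"
            f"</table></body></html>")

page = to_html("Example TS", {"t1": "pass", "t2": "fail"})
```

Because the output is one plain HTML file, a vendor can publish it as-is, with no conversion between formats.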
Conformance requirements: The results reporting framework must indicate what tests have passed and what tests have failed. For failed tests, some indication of the reason for failure should also be included in or referenced by the report.
Rationale: The whole purpose of a results reporting framework is to be able to determine the result of each test. If the framework does not indicate whether tests are passing or failing, then it is not doing its primary job. In order to improve an implementation, the implementors must know what is failing, and how, in order to fix it. The more detail that is provided about a failure, the easier it is for the implementors to locate and fix the problem.
Conformance requirements: The results reporting framework should support filtering on the same metadata criteria as the Test Management System.
Rationale: Results should be filterable based on criteria, just as test cases are. This allows implementors to focus only on the tests, and the results of tests, that relate to the particular target criteria of the implementation, while filtering out tests and results that are not applicable to a given implementation.
Checkpoints for this guideline aim to ensure that the Working Group has a plan for getting interested parties involved in the development and use of conformance materials.
Conformance requirements: Document a plan to engage vendors of implementations to participate in conformance testing activities.
Rationale: The conformance testing of various implementations helps improve the interoperability across implementations.
A common practice is to support a public discussion group dedicated to the test suite and to organize face-to-face meetings for vendors and other interested parties.
Conformance requirements: Provide a special space where information pertaining to test results of vendors' products can be maintained.
Rationale: It may be possible for the W3C to have a special space where information pertaining to test results can be given, if not explicitly, then via links to those pages where the information can be found (in order not to have to provide disclaimers).
This section defines conformance of Working Group processes and operations to the requirements of this specification. The requirements of this specification are detailed in the checkpoints of the preceding "Guidelines" chapter and apply to the Working Group QA-related documents and deliverables required by this specification.
This section defines three levels of conformance to this specification:
A Working Group conforms to the "QA Framework: Test Guidelines" at Level X (A, AA, or AAA) if the Working Group meets at least all Conformance Level X requirements.
To make an assertion about conformance to this document, specify:
Example:
The Test Suite for X module, version 2.1, at http://www.example.org/ts-xmod21/, conforms to the W3C's 'QA Framework: Test Guidelines', available at http://www.w3.org/TR/2003/WD-qaframe-test-20030516/, Level AA, as determined on August 1, 2003.
The checkpoints of this specification present verifiable conformance requirements about the quality of the test materials developed or adopted by the Working Group. As with any verifiable test requirements, users should be aware that:
This section contains the definitions for all the critical terms used in the guidelines below. This does not replace the QA Glossary [QA-GLOSSARY], but rather focuses on the most important terms for the Testing guidelines. Some terms in this section have been borrowed or adapted from other specifications.
The following QA Working Group and Interest Group participants have contributed significantly to the content of this document:
Added Patrick's text for GL 1 and 2. Wrote text for GL 3 and 4. Rewrote GL 5 and merged it into GL 3 and 4 and tried to address some of the comments made by Mark and Sandra in e-mail.
Moved the definitions section to its new location and updated it.
Started work on new introduction using the same format as the other framework documents.
Added new outline to document; commented out a number of sections that need editing.
Edited, improved the Introduction (goals, motivation, document's structure).
Updated the definition of the checkpoint's Priorities.
Corrected abstract, SOT.
Changed the goal of the document and wording of the checkpoints/guidelines to focus it on testing strategy, moving all the tips on tactics to be EX-TECH.
Added examples to most of the checkpoints.
Incorporated all the editors' comments.
Rewrote several checkpoints in the GL5; defined results verification and reporting as part of the test framework to remove redundant checkpoints.
Rewrote guidelines 6 and 7 to focus on the strategy of test development and testing, rather than on tactics.
Updated the conformance section.
Expanded introduction, added motivation, etc.
Added examples to the checkpoints in the GL1, 2, 3.
[MS] Changed the text of many checkpoints to make them verifiable.
[DD] First pass on Introduction, added more text to the checkpoints in the GL3-5.
Fixed definitions of priorities.
Fixed the glitch with the "Test Areas" guideline.
Added clarification to CP 1.1, 1.2, 1.5 (removed "vague"), 1.6.
Added short prose to each checkpoint.
First draft outline.