W3C

QA Framework: Test Guidelines

W3C Working Draft 13 December 2002

This version:
http://www.w3.org/QA/WG/2002/12/qaframe-test-20021205
Latest version:
http://www.w3.org/QA/WG/qaframe-test
Previous version:
(None, this is the first published WD)

Editors:
Kirill Gavrylyuk (kirillg@microsoft.com)
Dimitris Dimitriadis (dimitris@ontologicon.com)
Lofton Henderson (lofton@rockynet.com)
Mark Skall (mark.skall@nist.gov)
Peter Fawcett (pfawcett@real.com)
Contributors:
See Acknowledgments.

Abstract

This document defines a set of common guidelines for conformance test materials for W3C specifications. This document is one in a family of Framework documents of the Quality Assurance (QA) Activity, which includes the other existing or in-progress specifications QA Framework: Introduction, QA Framework: Operational Guidelines, and QA Framework: Specification Guidelines.

Status of this document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. The latest status of this document series is maintained at the W3C.

This document is a W3C Working Draft (WD), made available by the W3C Quality Assurance (QA) Activity for discussion by W3C members and other interested parties. For more information about the QA Activity, please see the QA Activity statement.

This version is the first published Working Draft. It is expected that updated WD versions of this document will be produced regularly, along with other members of the Framework documents family. Future progression of this document beyond Working Draft is possible, but has not yet been determined.

This part of the Framework document family will eventually have an informative accompanying QA Framework: Test Examples and Techniques document. It will illustrate ways in which the guidelines and checkpoints of this document might be satisfied.

The QA Working Group Patent Disclosure page contains details on known patents related to this specification, in conformance with W3C policy requirements.

At this time, the QAWG (QA Working Group) has addressed the contents of the document at a high level, agreeing to the concepts and principles as well as the coverage of the guidelines. The QAWG has not, at this time, addressed and achieved consensus on the priority levels. A future version of this document will be accompanied by a "Specification Examples & Techniques" document, which will illustrate the guidelines and checkpoints with case studies, and explain how to satisfy the checkpoints.

Please send comments to www-qa@w3.org, the publicly archived list of the QA Interest Group [QAIG]. Please note that any mail sent to this list will be publicly archived and available; do not send information you would not want to see distributed, such as private data.

Publication of this document does not imply endorsement by the W3C, its membership or its staff. This is a draft document and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use W3C Working Drafts as reference material or to cite them as other than "work in progress".

A list of current W3C Recommendations and other technical documents can be found at http://www.w3.org/TR/.

Table of contents


Introduction

Scope and goals

Class of product and audience

Motivation and expected benefits

Relationship to other specifications

Understanding and using this document

Checkpoint priorities

Terminology

Definitions

This section contains the definitions for all the critical terms used in the guidelines below. It is not a substitute for the QA Glossary [QA-GLOSSARY], but rather focuses on the most important terms for the Test Guidelines. Some terms in this section have been borrowed or adapted from other specifications.

Contradictory behaviors
Two or more different behaviors unconditionally prescribed by a specification for the same class of implementations under the same circumstances.
Discretionary choices
A value or behavior may be chosen from a well-defined enumerated set of two or more possibilities.
Optional behaviors
A well-defined feature may be supported or not (if supported, then the requirements are clear and unambiguous).
Explicitly undefined behaviors
The specification explicitly states that the set of values an element or attribute may have, or the behavior of a product that implements a feature, is open-ended and undefined.
Test Assertion
A set of premises that are known to be true by definition in the specification.
Test Area
A minimal compound unit in the test suite structure.
Test Framework
A set of utilities, stylesheets and documentation that describe and facilitate development, documentation and use of the tests.
Results Verification
A common testing practice used to determine if a test passes or fails by verification of the test result or output against the expected one.

Guidelines

Provide analysis of the specification(s).

In order to define the strategy of the test suite, a detailed analysis of the specification (the subject of the test suite) is required. The better the initial analysis, the clearer the testing strategy will be.

Checkpoints:

Identify the target set of specifications that are being tested.

Most if not all specifications use notions and behaviors defined in other technical documents. For example, even a base specification like XML uses definitions from specifications like URN and URI Syntax, Media Types, and Unicode. Some specifications are more self-contained and make only limited use of the syntax defined in other specifications. Others, like XSLT [@@LINK], rely heavily on the syntax and semantics defined in the XPath [@@LINK] specification. In order to understand the scope of the test development work, it helps to build a tree of the referenced specifications.

[EX-TECH] The XQuery specification, which has draft status as of the time of writing, explicitly defines the set of W3C specifications it depends on together with their versions: XPath 2.0, XML Schema 1.0, and XML 1.0 SE. This allows conformance test developers to determine the scope of their work and to reuse the tests from the test suites built for the referenced standards.

The target set may include more than one specification, depending on how strongly the primary specification under test relies on the referenced specifications.

[EX-TECH] For example, the XML test suite [@@Link] may not include tests that specifically test the URN format, but the XSLT [@@Link] and XQuery [@@Link] test suites will include many tests for XPath functions.

ExTec Link

Identify test assertions from the target set of specifications.

Once the target set of specifications is defined, a more formal analysis is required for each of them.

The QA Specification Guidelines require that a set of test assertions be produced for a specification, so such a list may already exist. A listing of the test assertions is necessary to focus the testing.

[EX-TECH]

The XML Protocol Working Group produced a list of assertions for the SOAP 1.2 Part 1 and Part 2 specifications. The assertions were extracted manually from the specification text, and links to the original text were added.

Microsoft, Open Wave, and America Online contributed an HTML test suite [Ed: TEMP LOCATION, change before FPWD] that contains a list of assertions from the HTML 4.01 specification.

In both cases, extracted assertions helped to reach maximum specification coverage when developing the test suite. Links from the tests to the assertions provide quantified information about the specification coverage.

NIST produced a test matrix for the XML 1.0 test suite that includes explicit references to the assertions in the specification. [@@Link]
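
By way of illustration only (none of the test suites above is implemented this way), an extracted assertion list can be kept as structured records that carry the assertion text, its requirement-level keyword, and a link back to the specification section it was taken from. The sketch below uses Python; the identifiers, URL, and field names are hypothetical.

  # Hypothetical sketch of a manually extracted test assertion list.
  from dataclasses import dataclass

  @dataclass
  class TestAssertion:
      id: str            # stable identifier used by tests to reference the assertion
      spec_section: str  # link back to the originating specification text
      keyword: str       # RFC 2119 requirement level ("MUST", "SHOULD", "MAY", ...)
      text: str          # the assertion itself, quoted or paraphrased from the spec

  assertions = [
      TestAssertion(
          id="soap12-part1-0001",                             # hypothetical identifier
          spec_section="http://example.org/spec#section5.1",  # hypothetical link
          keyword="MUST",
          text="A message MUST contain exactly one Envelope element.",
      ),
  ]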

ExTec Link

Group test assertions by levels, profiles, modules, etc.

The conformance criteria, with respect to subsetting the specification, may include various degrees of variability (e.g., levels, profiles, modules). The test assertions should be grouped according to these subsets, thus facilitating the testing of entire subsets.

[EX-TECH] In the SOAP example for the checkpoint above, the assertions were grouped by modules.

ExTec Link

Identify those test assertions that are part of the conformance criteria

Depending on the conformance criteria defined in the specification, not all of the test assertions need to be satisfied in order to be conformant to the specification. For example, if the conformance criteria of the specification require implementers to comply with only those assertions that use "MUST" or "SHALL" keywords as defined in [RFC2119], then all other test assertions (with "SHOULD", "MAY", etc.) do not belong in the conformance criteria.

[EX-TECH] In all of the examples for the checkpoint above, all the assertions belong to the conformance criteria for the corresponding specifications.
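
As a hedged illustration of the two checkpoints above (grouping assertions and selecting those in the conformance criteria), the sketch below groups a flat assertion list by module and keeps only the assertions whose RFC 2119 keyword is normative. The record layout and identifiers are hypothetical.

  # Hypothetical sketch: group assertions by module and keep only those that
  # belong to the conformance criteria (MUST/SHALL keywords per RFC 2119).
  from collections import defaultdict

  assertions = [
      {"id": "a1", "module": "envelope", "keyword": "MUST"},
      {"id": "a2", "module": "envelope", "keyword": "SHOULD"},
      {"id": "a3", "module": "binding",  "keyword": "SHALL"},
  ]

  NORMATIVE = {"MUST", "MUST NOT", "SHALL", "SHALL NOT", "REQUIRED"}

  conformance_critical = defaultdict(list)
  for a in assertions:
      if a["keyword"] in NORMATIVE:
          conformance_critical[a["module"]].append(a["id"])

  print(dict(conformance_critical))   # {'envelope': ['a1'], 'binding': ['a3']}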

ExTec Link

Identify all the discretionary choices defined in the specification

A test suite should be designed to take into account discretionary choices. This allows for testing all possible choices regardless of which of the allowed behaviors an implementation implements. Having a standalone list of the discretionary choices extracted from the specification helps to automate the tuning of the test suite according to the choices taken by a particular implementation.

[EX-TECH] OASIS XSLT/XPath Conformance Technical Committee produced a list of the discretionary items for the XSLT/XPath specifications. This provided a way to automatically tune the test suite to a particular implementation's choices.
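
A minimal sketch of such tuning, not taken from the OASIS materials: each test is tagged with the discretionary item it depends on and the choice it expects, and the runner selects only the tests that match the choices declared for the implementation under test. The item names and values below are hypothetical.

  # Hypothetical sketch: select only tests matching an implementation's
  # declared discretionary choices.
  tests = [
      {"id": "t1", "discretionary_item": None,        "expected_choice": None},
      {"id": "t2", "discretionary_item": "sort-lang", "expected_choice": "codepoint"},
      {"id": "t3", "discretionary_item": "sort-lang", "expected_choice": "locale"},
  ]

  # Choices declared by the implementation under test (hypothetical).
  implementation_choices = {"sort-lang": "codepoint"}

  def applicable(test):
      item = test["discretionary_item"]
      if item is None:                      # test does not depend on any choice
          return True
      return implementation_choices.get(item) == test["expected_choice"]

  print([t["id"] for t in tests if applicable(t)])   # ['t1', 't2']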

ExTec Link

Identify optional behaviors in the specification

Some parts of the specifications might be declared as optional to implement. If an implementation claims to implement them, then it must implement them correctly and conform to the whole part. A good example of such optional "adjuncts" is protocol bindings.

ExTec Link

Identify behaviors that are undefined or defined ambiguously in the specification

Although the specification should not have such flaws, there is no guarantee that it is flawless. Maintaining a list of such issues helps provide a feedback loop to the specification developers so that the specification can be fixed in its next revision. Additionally, the list can be used in cataloging incoming tests that fall into undefined or ambiguously defined areas.

ExTec Link

Identify explicitly undefined behaviors in the specification

Although it is not a recommended practice, a specification's authors may explicitly abstain from defining product behavior in certain circumstances. A list of such statements in the specification helps to analyze incoming tests appropriately.

ExTec Link

Identify contradictory behaviors in the specification

Such contradictory combinations should not occur in a specification in the first place. However, if they exist, a list of them is useful both for test analysis and for future errata tracking.

ExTec Link

List the user scenarios for the specification

User scenarios help keep the tests focused. They also help to understand requirements for the test framework at a very early stage of testing.

[EX-TECH] The W3C XML Query Working Group produced a comprehensive list of Use Cases that has been very helpful in the development of the conformance test suite.

ExTec Link

Declare the structure of the test suite.

There are many ways to structure a test suite. This guideline lists common requirements for the structure and provides common examples.

Checkpoints:

Document the structure for the test suite.

Usually the structure of the test suite matches the structure of the specification or its content. However, sometimes it is easier to define the test suite structure in other ways, such as applicable testing methodology, user scenarios, profiles, etc.

[EX-TECH] The following is an informative, non-exhaustive list of possible test suite structures:

The test suite may use a combination of these organizational principles.

ExTec Link

Provide mapping between the test suite structure and the specification structure.

Regardless of the principle chosen for the test suite structure, the mapping between the test areas and the specification text is essential. The relationship between tests and specification should be traceable via a test-to-test-assertion mapping, as described in the checkpoint below.

The document model organizational principle is the recommended practice for structuring a test suite and the one most commonly used.

[EX-TECH] Each of the contributions to the W3C XML Schema Test Collection is categorized according to the document model (i.e., specification structure).

[EX-TECH] When mapping between the test suite structure and the test specification is provided, it may be useful to design a set of sample testing scenarios, based on the user scenarios. These are not actual tests, but rather test examples. This helps to properly select the testing framework, create templates for test cases, and define future subareas.

For example, the XML Query Use Cases document has sample test scenarios that could be used as a basis for any XQuery test suite and gives a good sense of the requirements for the test framework.

To assess the completeness of the specification coverage by such testing scenarios, additional mapping could be built between the sample testing scenarios and the test assertions. This helps to formalize testing scenarios and provides a basis for the future analysis of the specification coverage.
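
As an illustrative sketch (the area names and section numbers are hypothetical), the mapping can be kept as a simple traceability table from test areas to specification sections; inverting it immediately shows which sections have no test area at all.

  # Hypothetical sketch: traceability from test areas to specification sections.
  area_to_sections = {
      "templates":   ["5.1", "5.2", "5.3"],
      "expressions": ["4.1", "4.2"],
  }

  all_spec_sections = ["4.1", "4.2", "5.1", "5.2", "5.3", "6.1"]

  covered = {s for sections in area_to_sections.values() for s in sections}
  uncovered = [s for s in all_spec_sections if s not in covered]
  print(uncovered)   # ['6.1'] -- sections with no test area yet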

ExTec Link

Document the testing methodology.

The testing methodology provides a high-level answer to the question "How does the test suite verify conformance to the specification?"

[EX-TECH]

A non-exhaustive list of methodologies includes:

For SMIL and SMIL 2.0, the test suite used a specification-driven framework. The test suite attempts to verify the use of each element and attribute as defined in the normative language of the specification. Additionally, it attempts to verify any interactions between elements that are also defined by the normative language of the specification. In other specifications, other approaches may be more appropriate. For example, a specification that defines a specific protocol or method of communication (e.g., SOAP) would be better tested using a use-case method, where a series of use cases are defined and it is up to the tester to create a specific test that verifies a general use case.

Checkpoints:

For each test area, identify the testing approach

The testing approach provides a set of high-level methods, ideas, or techniques to test the implementation's conformance to the standard. It is convenient to define test areas so that testing in a single area can be done using a single methodology.

The rationale for using a specific testing methodology within a specific test area should be defined. This is to ensure a consistent approach to testing by all users of a test suite.

Although not required, it is convenient to have a single methodology within a single test area. When possible, a consistent methodology should be used across all test areas. There is a potential trade-off here between consistency and using the best approach to cover a specific area.

[EX-TECH] A simple example of a test suite using different testing approaches for different specification areas is the W3C XML Schema test suite [@@LINK]. Testing conformance to the schema validation assertions uses a schema-parsing technique with the expected result being true or false, whereas testing XML instance validation assertions requires verifying the validity of the schema(s), the well-formedness of the instance, and the validity of the instance against the schema(s).

Another example from the same test suite is testing XML instance validity against the schema. It may be done using an inline schema definition inside the XML instance, by explicitly referencing the schema from the instance using the @schemaLocation attribute, or by supplying the schema document for the namespace used in the XML instance separately. In different parts of the test suite, different techniques are used.
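
By way of illustration only (this is not how the W3C XML Schema test suite is actually implemented), a schema-parsing test with a true/false expected result could be driven as in the sketch below. The sketch assumes the Python lxml library is available; the schema content is a trivial placeholder.

  # Hypothetical sketch of a schema-parsing test: the expected result is simply
  # whether the schema itself is valid (true or false).
  from lxml import etree

  SCHEMA = b"""<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:element name="doc" type="xs:string"/>
  </xs:schema>"""

  def run_schema_test(schema_bytes, expected_valid):
      try:
          etree.XMLSchema(etree.fromstring(schema_bytes))  # raises if the schema is invalid
          actual_valid = True
      except etree.XMLSchemaParseError:
          actual_valid = False
      return actual_valid == expected_valid                # True means the test passes

  print(run_schema_test(SCHEMA, expected_valid=True))      # True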

ExTec Link

Identify publicly available testing techniques. List the publicly available testing techniques that have been reused

It is critical to avoid "reinventing the wheel" from both resource considerations and future integration perspectives.

ExTec Link

Provide the test automation and framework

The right choice of test framework is a critical part of test development.

That said, selecting the right framework for the test suite is the responsibility of the Working Group. There are, however, several things that should be kept in synchronization, most notably reporting, result publication, and test extraction (if it is done using the specification granularity described in the Specification Guidelines). The QA Working Group can provide assistance in accomplishing this.

Checkpoints:

List available test frameworks and applicable automation. Identify available test frameworks used. If none, justify why new frameworks are needed and existing ones could not be used.

A Working Group that wants to produce a framework for testing implementation conformance with W3C specifications should initially invest some time in reviewing existing testing frameworks (some of which are given in the TestGL ExTech document [@@ link needed]), evaluating them with respect to the WG's testing objectives, and, if applicable (in order not to reinvent the wheel), adopting them. The following roadmap is suggested:

If a particular framework is judged appropriate, inform the original author of the test framework (or the WG chair, if the framework was produced by a WG) and communicate additions, changes, and errors, if applicable.

If a suitable testing framework is not found, then proceed along the following checkpoints to produce a testing framework that is applicable.

Augmenting an existing framework is effectively the same as reusing its testing methodology.

ExTec Link

Ensure the framework and automation are platform independent. Demonstrate on three platforms. Ensure that the framework and automation are built using open standards.

Ideally, any test framework should be platform independent insofar as running and reporting are concerned. In cases where this is not feasible, parts of the framework may be platform dependent as long as they are provided for each of the mainstream platforms. However, keep in mind that providing platform-specific test frameworks has several drawbacks with respect to time and resources: it requires additional work to create these platform-specific frameworks as well as to ascertain their quality.

ExTec Link

Ensure the framework and automation are applicable to any product or content that implements the specification. Demonstrate with three products or contents. Ensure that the framework and automation are built using open standards.

Similar to the previous checkpoint, a test suite should be able to cover all classes of products that a specification allows. The challenge is to create a test framework that can be adapted and extended to accommodate future implementations as well as future versions of the specification. While this is a noble goal, it is not always achievable, and it may be necessary to build additional test frameworks in the future. However, the fact that a test framework is applicable to at least three different implementations is a good indicator that it will be feasible to adapt it to any other product.

Sometimes parts of a test framework have to be adjusted for a particular implementation. The OASIS XSLT [@@LINK] test framework is a good example of this: the framework was adjusted for particular implementations, which resulted in expending additional resources to accomplish the extra work.

[EX-TECH] An example of a product-independent testing framework would be a web-based testing framework that runs in the majority of mainstream browsers (with a satisfactory degree of conformance to the specifications they implement).

ExTec Link

Ensure the framework makes it easy to add tests for any of the specification areas. Demonstrate, through an example, how tests are added to a specification area.

A test suite will expand over time and eventually cover all areas of the specification. Test frameworks should be designed for extensibility, making it possible to add new test material. This is especially important since test suites will grow over time and test added functionality. It is also critical to comply with this checkpoint in order to fulfill the requirements of checkpoints 5.2 and 8.1 in the QA Operational Guidelines.

ExTec Link

Ensure the ease of use for the test automation. Document how the test automation is used.

Usability is a critical requirement for the test suite and the framework. A good test framework must provide sufficient documentation on how to add new tests, how to run tests, and how to monitor and investigate results.

[EX-TECH] A good example of a documented test framework for a conformance test suite is the XSLT test framework [@@LINK].

ExTec Link

Ensure the framework allows for specification versioning and errata levels. Explain how specification versioning and errata levels are accommodated by the test framework

This is one of the criteria for adopting or producing a testing framework. It can easily be accomplished by using a particular set of markup that includes the specification version and errata level along with other metadata. This is also a requirement of the QA Operational Guidelines.
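
A hedged sketch of such per-test metadata, carrying the specification version and the errata level at which the expected result was last revised; the field names and values are hypothetical. The runner can then skip tests that postdate the version and errata level being tested against.

  # Hypothetical sketch: per-test metadata with specification version and
  # errata level, used to select the tests applicable to a given target.
  tests = [
      {"id": "t1", "spec_version": "1.0", "errata_level": 0},
      {"id": "t2", "spec_version": "1.0", "errata_level": 2},   # added by erratum E2
      {"id": "t3", "spec_version": "1.1", "errata_level": 0},
  ]

  def applicable(test, target_version, target_errata):
      return (test["spec_version"] == target_version
              and test["errata_level"] <= target_errata)

  print([t["id"] for t in tests if applicable(t, "1.0", 1)])   # ['t1']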

ExTec Link

Ensure the framework accounts for choices allowed by discretionary choices in the specification. Explain how discretionary behaviors are accommodated by the framework.

It is integral to the test suite that it be applicable to any product that implements the specification; this also applies to the checkpoint immediately after this one. Given that the tests will be represented in a structured manner, it should be possible to run only those tests that a particular implementation is known to support (which is the case with optional parts of specifications).

ExTec Link

Ensure the framework includes tests for the optional behaviors defined in the specification. Explain how optional behaviors are accommodated by the framework.

While it is not a requirement to implement optional behaviors, some of them might be self-contained additions (like protocol bindings or the DOM HTML module) that need their own test suite. These tests will of course be applicable only to those products that claim to implement the optional behaviors or profiles.

Experience from the DOM test suite shows that allowing for optional or multiple behaviors is a high priority on the wish list for test suites. Implementers want to be able to test particular behaviors as defined in the specification, especially when they have chosen to support only parts of the specifications (e.g., DOM builds on XML, which allows for entity-expanding or entity-preserving applications).

ExTec Link

Ensure the test framework accommodates profiles, modules, product classes, and levels if they are used in the specification. For each dimension of variability used in the specification, demonstrate how the framework allows tests to be filtered by the dimension.

A framework should provide for tailoring the test suite to the implementation, i.e., selecting only those tests that reflect the combination of the dimensions of variability that is supported by the implementation.

ExTec Link

Ensure the framework accommodates the conformance policy defined in the specification. Demonstrate how the framework allows tests to be filtered by levels.

If the conformance criteria introduce levels, the test framework should allow tests to be filtered by level.
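
A minimal sketch of such level filtering, assuming each test carries the conformance level it belongs to and that claiming a higher level implies running all lower-level tests as well; the level names and test identifiers are hypothetical.

  # Hypothetical sketch: filter tests by conformance level, where claiming a
  # higher level implies running all lower-level tests too.
  LEVEL_ORDER = {"Level 1": 1, "Level 2": 2, "Level 3": 3}

  tests = [
      {"id": "t1", "level": "Level 1"},
      {"id": "t2", "level": "Level 2"},
      {"id": "t3", "level": "Level 3"},
  ]

  def tests_for(claimed_level):
      limit = LEVEL_ORDER[claimed_level]
      return [t["id"] for t in tests if LEVEL_ORDER[t["level"]] <= limit]

  print(tests_for("Level 2"))   # ['t1', 't2']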

ExTec Link

Ensure the framework supports test results verification. Demonstrate results verification by testing three products.

Results verification is a common testing practice used to determine if a test passes or fails by verification of the test result or output against the expected one. Results verification is a critical part of the test framework. The tests should run on any platform against any product implementing the specification. The same applies to the result verification support.
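
A hedged sketch of result verification: the actual output is normalized and compared with the stored expected output. The normalization step here (line endings and trailing whitespace only) is an assumption for illustration; real suites often need full canonicalization, for example XML canonical form.

  # Hypothetical sketch of results verification: compare actual output with
  # the stored expected output after a simple normalization step.
  def normalize(text):
      # Assumption: only line endings and trailing whitespace are insignificant.
      return "\n".join(line.rstrip() for line in text.splitlines())

  def verify(actual_output, expected_output):
      return normalize(actual_output) == normalize(expected_output)

  print(verify("<result>42</result>  \n", "<result>42</result>\n"))   # True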

ExTec Link

Ensure the framework allows the tests to be documented. Explain how to document the tests within the framework.

Documentation aids in the understanding of the tests as well as facilitating maintenance of the test suite. This includes annotating tests with pointers to the original specification(s). Using a proper source and test documentation mechanism is vital for the quality of the test framework.

[EX-TECH] The W3C test suites for XML, DOM, and XML Schema, and the OASIS XSLT test suite, all contain good examples of documented tests.

ExTec Link

Ensure the framework has proper test case management. Demonstrate how at least one of the following test case management functions is accomplished within the framework: managing additions, managing removals, filtering by various criteria

Test case management includes an accounting system for the tests: managing additions, managing removals, and filtering by various criteria.

ExTec Link

Ensure the framework allows specification coverage to be measured. Demonstrate the above by mapping a list of tests to the list of test assertions, grouped by areas

One effective way to measure specification coverage is to map the list of tests to the list of test assertions, grouped by areas. Using structured markup to represent tests makes it possible to point to particular parts of the specification (especially if the specifications are also written using structured markup). In this way, it is possible to group tests according to specification parts and view the results in the same manner.
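
A minimal sketch of such a coverage measure, assuming a test-to-assertion mapping and assertions grouped by area (all identifiers are hypothetical): coverage per area is the fraction of its assertions referenced by at least one test.

  # Hypothetical sketch: per-area specification coverage computed from a
  # test-to-assertion mapping.
  assertions_by_area = {
      "envelope": {"a1", "a2", "a3"},
      "binding":  {"a4", "a5"},
  }
  test_to_assertions = {
      "t1": {"a1"},
      "t2": {"a1", "a3"},
      "t3": {"a4"},
  }

  covered = set().union(*test_to_assertions.values())
  for area, ids in assertions_by_area.items():
      pct = 100.0 * len(ids & covered) / len(ids)
      print(f"{area}: {pct:.0f}% covered")   # envelope: 67%, binding: 50%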

ExTec Link

Provide the results reporting framework

The WG should encourage vendors to report testing results for their products. In order to do that, a WG needs to provide vendors with the results format, necessary style sheets, and any other tools to facilitate reporting.

Checkpoints:

Ensure that the test framework supports reporting the results.

All the requirements for the test framework, such as applicability to any platform and implementation, ease of use, and support for specification versioning and errata levels, directly apply to the results reporting support.

ExTec Link

Ensure the ease of use for results reporting. Demonstrate that the results reporting has sorting and filtering capabilities.

This is a necessary part of the ease of use requirement, that is, to facilitate the results reporting by vendors.

ExTec Link

Document how the results reporting allows results to be exported in a self-contained format suitable to publication on the web.

Results reporting needs to be implemented in a way that allows for the results to be published on the web. Therefore, producing a self-contained version of the test reporting (including pointers to tests and relevant parts of the specification) in HTML form is recommended.
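
As an illustrative sketch (the layout, field names, and URLs are hypothetical), a self-contained HTML report can be generated directly from the result records, with each row linking back to the relevant part of the specification.

  # Hypothetical sketch: export test results as a self-contained HTML page.
  import html

  results = [
      {"test": "t1", "spec": "http://example.org/spec#sec5.1", "outcome": "pass"},
      {"test": "t2", "spec": "http://example.org/spec#sec5.2", "outcome": "fail"},
  ]

  rows = "\n".join(
      f'<tr><td>{html.escape(r["test"])}</td>'
      f'<td><a href="{html.escape(r["spec"])}">spec</a></td>'
      f'<td>{html.escape(r["outcome"])}</td></tr>'
      for r in results
  )
  report = f"<html><body><table>\n{rows}\n</table></body></html>"

  with open("results.html", "w", encoding="utf-8") as f:
      f.write(report)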

ExTec Link

Demonstrate that the results reporting provides details on failures (logs) sufficient to investigate.

Logging facilitates the investigation of test results and is therefore part of the ease-of-use requirement. Results reporting should provide test logs.

[EX-TECH] An example of logging incorporated into the test framework and results reporting is the SOAP Builders' interoperability testing participants' results page [@@LINK].

ExTec Link

Document how the results reporting allows for storing the history of results and comments for later analysis.

Documenting the results of tests run against various versions of a product allows the implementer to investigate and compare the results.
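
A hedged sketch of such history comparison, assuming stored per-version result maps (the version labels and test identifiers are hypothetical): diffing two runs shows regressions and fixes between product versions.

  # Hypothetical sketch: compare stored results from two product versions.
  results_v1 = {"t1": "pass", "t2": "fail", "t3": "pass"}
  results_v2 = {"t1": "pass", "t2": "pass", "t3": "fail"}

  regressions = [t for t in results_v1
                 if results_v1[t] == "pass" and results_v2.get(t) == "fail"]
  fixes = [t for t in results_v1
           if results_v1[t] == "fail" and results_v2.get(t) == "pass"]

  print("regressions:", regressions)   # ['t3']
  print("fixes:", fixes)               # ['t2']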

ExTec Link

Plan for tests development

The aim of the checkpoints in this guideline is to ensure that the Working Group has a plan in place for test development. Fulfillment of the checkpoints below will facilitate conformance with checkpoints 5.2 and 5.4 of the QA Operational Guidelines.

Checkpoints:

Define priorities for the test areas

This helps prioritize test suite development and test reviews. It also allows Working Groups to identify specific areas in the specification for inclusion in a test prototype (see next checkpoint) at the early stages of the test suite development.

ExTec Link

Introduce a mechanism for the early feedback on the test suite architecture and test framework. Document and publish the mechanism to the intended audience.

An effective way to verify that the test framework and the test suite architecture suit the testing needs is to develop a test suite prototype that contains a limited number of tests, together with a test framework prototype for it.

ExTec Link

Ensure regular specification coverage analysis. Provide the schedule for specification coverage analysis.

Fulfillment of this checkpoint facilitates satisfying checkpoint 3.1 of the QA Operational Guidelines.

[EX-TECH] Several tips for organizing effective test suite development are listed below.

ExTec Link

Plan for conformance testing

Checkpoints of this guideline aim to ensure that the Working Group has a plan for getting interested parties involved in the development and use of conformance materials.

Checkpoints:

Document the plan for engaging vendors of implementations in conformance testing activities.

A common practice is to support a public discussion group dedicated to the test suite and to organize face-to-face meetings for vendors and other interested parties.

ExTec Link

Encourage vendors to publish test results for their products by reserving a special space where information pertaining to test results can be maintained.

It may be possible for the W3C to have a special space where information pertaining to test results can be given, if not explicitly, then using links to those pages where the information can be found (in order not to have to provide disclaimers).

ExTec Link

Conformance

This section defines conformance of Working Group processes and operations to the requirements of this specification. The requirements of this specification are detailed in the checkpoints of the preceding "Guidelines" chapter and apply to the Working Group QA-related documents and deliverables required by this specification.

Conformance definition

This section defines three levels of conformance to this specification:

A Working Group conforms to the "QA Framework: Test Guidelines" at Level X (A, AA, or AAA) if the Working Group meets at least all Conformance Level X requirements.

To make an assertion about conformance to this document, specify:

Example:

The Test Suite for X module, version 2.1 of this Working Group, conforms to the W3C's 'QA Framework: Test Guidelines' version 1.0, available at http://www.w3.org/TR/2002/qaframe-test/, Level AA as determined on January 1, 2003.

Conformance disclaimer

The checkpoints of this specification present verifiable conformance requirements about the quality of the test materials developed or adopted by the Working Group. As with any verifiable test requirements, users should be aware that:

  1. Passing all of the requirements to achieve a given conformance level -- A, AA, or AAA -- does not guarantee that the subject test materials are well-suited to or will achieve their intended purposes.
  2. Failing to achieve Level A conformance does not mean that the subject test materials are necessarily deficient for their intended purposes. It means that the test materials fail one or more checkpoints that best-practice experience has shown to facilitate and enable successful development, maintenance, and use of the test materials.

Acknowledgments

The following QA Working Group and Interest Group participants have contributed significantly to the content of this document:

References

EXTERN-TA
QA activity email thread about third-party participation in test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Oct/0060.html.
MATRIX
W3C-wide conformance activity survey covering all the Working Groups, "The Matrix", available at http://www.w3.org/QA/TheMatrix.
PROCESS
W3C Process Document, 19 July 2001, available at http://www.w3.org/Consortium/Process-20010719/.
TAXONOMY
QA Activity test taxonomy, a classification scheme for conformance test materials, available at http://www.w3.org/QA/Taxonomy.
QA-GLOSSARY
A comprehensive glossary of QA terms, maintained by the QA Working Group. (Initial version under construction.)
QAIG
Quality Assurance Interest Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/IG/.
QAWG
Quality Assurance Working Group of the W3C QA Activity, which may be found at http://www.w3.org/QA/WG/.
DOM Working Group TS
Process document for DOM Working Group Test suite, available at http://www.w3.org/2002/01/DOMConformanceTS-Process-20020115.
REC-TRACK
Stages and milestones in the W3C Recommendation Track, per the Process Document (Process Document is available at http://www.w3.org/Consortium/Process-20010719/, see section 5.2).
RFC2119
Key words for use in RFCs to Indicate Requirement Levels, March 1997, available at http://www.ietf.org/rfc/rfc2119.txt.
SVGTEST
SVG Working Group's test suite resource page, which may be found at http://www.w3.org/Graphics/SVG/Test/.
WCAG10
Web Content Accessibility Guidelines, version 1.0, W3C Recommendation, 5 May 1999, available at http://www.w3.org/TR/WCAG10/.
WG-QA-RANGE
Email proposal by David Marston, on the QA public mail list, for range of Working Group commitment levels to conformance test materials production, available at http://lists.w3.org/Archives/Public/www-qa/2001Apr/0004.html.
XMLTEST
OASIS XML Conformance TC's XML test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
XSLT-TEST
OASIS XML Conformance TC's XSLT/XPath test suite resource page, which may be found at http://www.oasis-open.org/committees/xml-conformance/.
QAF-TEST
QA Framework: Test Materials Guidelines (not yet published).
QAF-SPEC
"QA Framework: Specification Guidelines", Working Draft companion version to this document, available at [...].
XQuery-use-case
"XML Query Use Cases".
SOAP-test
SOAP Version 1.2 Specification Assertions and Test Collection.
SOAP12-1
SOAP Version 1.2 Part 1.
SOAP12-2
SOAP Version 1.2 Part 2.
XSD-TEST
W3C XML Schema Test Collection.
XSLT-DISC
List of discretionary items produced by the OASIS XSLT/XPath Conformance Technical Committee.

Change History

12-05-2002

Edited, improved the Introduction (goals, motivation, document's structure)

Updated the definition of the checkpoint's Priorities

Corrected abstract, SOT

Changed the goal of the document and wording of the checkpoints/guidelines to focus it on testing strategy, moving all the tips on tactics to be EX-TECH

Added examples to most of the checkpoints

Incorporated all the editors' comments

Rewrote several checkpoints in the Gd 5 - defined results verification and reporting as part of the test framework to remove redundant checkpoints

Rewrote guidelines 6 and 7 to focus on the strategy of test development and testing, rather than on tactics.

Updated the conformance section

09-12-2002

Expanded introduction, added motivation, etc...

Added examples to the checkpoints in Gd 1, 2, and 3

[MS] Changed the text of many checkpoints to make them verifiable

[DD] First pass on Introduction, added more text to the checkpoints in the Gd 3-5

07-01-2002

Fixed definitions of priorities

Fixed the glitch with the "Test Areas" guideline

Added clarification to Ck 1.1, 1.2, 1.5 (removed "vague"), 1.6

06-17-2002

Added short prose to each checkpoint

06-12-2002

First draft outline