

Abstract

The purpose of this document is to support developers of web accessibility evaluation tools by identifying typical features of such tools and describing how tools can be classified according to different combinations of those features.

Status of this document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at http://www.w3.org/TR/.

Table of contents

  1. Introduction
    1. Audience of this document
    2. Document conventions
    3. Complementary resources
  2. Typical features of an accessibility evaluation tool
  3. Example profiles of evaluation tools
  4. References
  5. Acknowledgements
  6. Appendix A: Customising results to different audiences
  7. Appendix B: Integrating the evaluation procedure into the development testing workflows

1. Introduction

There is a wide variety of accessibility evaluation tools available on the web. This document is intended to support tool developers in identifying the key characteristics of their tools. To do that, this document:

1.1. Audience of this document

This document is targeted mainly at development managers and developers of web accessibility evaluation tools. Within this scope, we do not distinguish between commercial and open source developers, although some use cases may be more relevant to one group than to the other.

A secondary audience of this document is users of accessibility evaluation tools, such as accessibility experts or web developers.

Examples of tools that are within the scope of the document include:

1.2. Document conventions

The keywords must, required, recommended, should, may, and optional in this document are used in accordance with RFC 2119 [RFC2119].

1.3. Complementary resources

2. Typical features of an accessibility evaluation tool

In this section, we describe typical features and functionalities of accessibility evaluation tools. We have tried to be as complete as possible, but some features of existing or future evaluation tools may have been omitted.

It is very important that you analyse which of these features your tool supports, describe them for your own development process and for your customers, and declare any limitations of your tool.

2.1. Types of document formats analysed

Although the vast majority of web documents are HTML documents, there are many other types of resources that need to be considered when analysing web accessibility. Of particular importance are resources such as CSS style sheets and JavaScript scripts, which can modify the markup document in the user agent when it is loaded or through user interaction; many accessibility tests depend on the interpretation of these resources.

In general, we can distinguish these types of formats:

Most accessibility evaluation tools concentrate on markup analysis, but the most advanced ones are able to process many of the format types described above.
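
For example, a tool that only fetches the raw HTML over HTTP will miss content that scripts insert or modify after the page has loaded. The following is a minimal sketch, assuming the Playwright library is installed and using a placeholder URL, of how a tool might capture the rendered DOM after scripts have run:

    # Sketch: capture the DOM after scripts have run, using Playwright
    # (an assumption; any headless browser with a DOM snapshot API would do).
    from playwright.sync_api import sync_playwright

    def rendered_html(url: str) -> str:
        """Return the serialized DOM of the page after load, scripts included."""
        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page()
            page.goto(url)
            html = page.content()  # current DOM, not the original server response
            browser.close()
        return html

    if __name__ == "__main__":
        print(rendered_html("https://example.org/"))

Evaluating this rendered markup, rather than the server response alone, is what allows tests that depend on CSS and script interpretation to produce meaningful results.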

2.2. Crawling of sites

A typical difference between commercial and open source tools is the capability to crawl web resources. That means that the tool incorporates a web crawler [WEBCRAWLER] that is able to extract hyperlinks from web resources. Keep in mind that, as seen in the previous section, many types of web resources contain hyperlinks; the misconception that only HTML documents contain links may lead to wrong assumptions in the evaluation process.
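
As an illustration, the sketch below extracts candidate hyperlinks not only from HTML (href and src attributes) but also from CSS (@import rules and url() references). It uses only the Python standard library, and the regular expressions are deliberately simplified:

    # Sketch: link extraction from HTML and CSS resources
    # (simplified; a production crawler would need full parsing and URL resolution).
    import re
    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in ("href", "src") and value:
                    self.links.append(value)

    def links_in_html(html_text):
        parser = LinkExtractor()
        parser.feed(html_text)
        return parser.links

    def links_in_css(css_text):
        # url(...) references and @import "..." rules also point to resources
        return (re.findall(r'url\(\s*["\']?([^"\')\s]+)', css_text)
                + re.findall(r'@import\s+["\']([^"\']+)', css_text))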

A web crawler is configured with a starting point and a set of options. The critical features of a web crawler relate to its configuration capabilities. Among them, we can highlight:
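
Independently of which options a particular tool exposes, the sketch below shows how a starting point and a few common configuration parameters might drive a simple breadth-first crawl. The option names (start_url, max_pages, same_origin_only) are hypothetical, and the link extraction helper is the one sketched in the previous section:

    # Sketch: a breadth-first crawl bounded by hypothetical configuration options.
    from collections import deque
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    def crawl(start_url, max_pages=50, same_origin_only=True):
        origin = urlparse(start_url).netloc
        queue, seen, pages = deque([start_url]), {start_url}, {}
        while queue and len(pages) < max_pages:
            url = queue.popleft()
            if same_origin_only and urlparse(url).netloc != origin:
                continue
            try:
                with urlopen(url) as response:
                    body = response.read().decode("utf-8", errors="replace")
            except OSError:
                continue
            pages[url] = body
            for link in links_in_html(body):  # helper from the previous sketch
                absolute = urljoin(url, link)
                if absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)
        return pages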

2.3. ...

3. Example profiles of evaluation tools

4. References

The following are references cited in the document.

RFC2119
Key words for use in RFCs to Indicate Requirement Levels. IETF RFC, March 1997. Available at: http://www.ietf.org/rfc/rfc2119.txt
WCAG20
Web Content Accessibility Guidelines (WCAG) 2.0. W3C Recommendation 11 December 2008. Ben Caldwell, Michael Cooper, Loretta Guarino Reid, Gregg Vanderheiden (editors). Available at: http://www.w3.org/TR/WCAG20/
WEBCRAWLER
Web crawler. Wikipedia. Available at: http://en.wikipedia.org/wiki/Web_crawler

Acknowledgements

Appendix A: Customising results to different audiences

Appendix B: Integrating the evaluation procedure into the development testing workflows