W3C Web Accessibility Initiative (WAI)

WAI: Strategies, guidelines, resources to make the Web accessible to people with disabilities

Authoring Tool Accessibility Guidelines (ATAG) 2.0
Candidate Recommendation Information

Quick links

ATAG at a Glance
http://www.w3.org/WAI/intro/atag-glance
ATAG Test Harness Scorecard (requires W3C account)
https://www.w3.org/WAI/AU/CR20/eval/Overview
ATAG Tester Training (HTML slide deck)
http://www.w3.org/WAI/AU/CR20/TestingTraining/index.html
Test Reference doc
http://www.w3.org/WAI/AU/2013/ATAG2-10April2012PublicWD-Tests
Implementation Report (under development)
http://www.w3.org/WAI/AU/CR20/ImplementationReport

Introduction

Thank you for helping us demonstrate that the ATAG 2.0 guidelines work "in the real world". ATAG 2.0 serves two distinct purposes:

  1. Ensuring that authoring tools can be used by people with disabilities (Part A)
  2. Ensuring that the content produced by authoring tools meets WCAG 2.0 (Part B)

For a quick overview of the ATAG requirements see ATAG at a Glance.

ATAG 2.0 is being developed by the Authoring Tool Accessibility Guidelines Working Group (AUWG). ATAG 2.0 is currently a W3C "Candidate Recommendation". (These stages are explained in How WAI Develops Accessibility Guidelines through the W3C Process.) To move ATAG 2.0 to the next step toward a final W3C Recommendation, AUWG is asking for assistance in testing authoring tools for ATAG 2.0.

Decisions to Make Prior to Testing:

  1. Decide on a target ATAG 2.0 level to test for (Level A, AA, or AAA). This decision determines (a) the number of ATAG 2.0 success criteria that need to be evaluated and (b) the number of WCAG 2.0 success criteria that need to be evaluated as part of the Web Content Accessibility Test Procedure. If the tool was designed to conform only at Level A, it is much simpler to test it at that level.
  2. Decide on the web content technologies produced by the authoring tool that are to be "included". Some authoring tools, especially general-purpose editors, provide support for authoring with a variety of web content technologies (e.g. HTML4, HTML5, SVG, MathML, etc.). ATAG 2.0 allows authoring tools to conform with respect to just a defined subset of the technologies produced.
    Note: If the authoring tool produces any web content technologies by default, then these must be included.

ATAG Tests

Tools and Resources Needed for Testing ATAG 2.0 Success Criteria:

1. Web Content Accessibility Test Procedure (Level A, AA, AAA):

Many ATAG 2.0 success criteria refer to meeting WCAG 2.0 success criteria. In order to test these success criteria, you will need a Web Content Accessibility Testing Procedure that is: (a) specific to the "included" web content technology (e.g. HTML, CSS, SVG, etc.) produced by the authoring tool and (b) designed to test WCAG 2.0 conformance to at least the target level (e.g., Level AA). Such a test procedure may include:

Note: The WCAG 2.0 requirement that "only accessibility-supported ways of using technologies are relied upon to satisfy the WCAG 2.0 success criteria" does not need to be applied as described in ATAG 2.0 section Relationship to the Web Content Accessibility Guidelines (WCAG) 2.0.
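Part of such a procedure can be automated. The sketch below, which assumes the tool's "included" technology is HTML, implements a single check corresponding loosely to WCAG 2.0 Success Criterion 1.1.1 (non-text content): it flags img elements that lack an alt attribute. A real procedure combines many such checks with manual review.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flags <img> elements that lack an alt attribute.

    A minimal sketch of one automated WCAG 2.0 check (SC 1.1.1); a full
    Web Content Accessibility Test Procedure combines many such checks
    with manual review.
    """
    def __init__(self):
        super().__init__()
        self.violations = []  # (line, column) of each offending <img>

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())

def check_missing_alt(html_source):
    checker = MissingAltChecker()
    checker.feed(html_source)
    return checker.violations

sample = '<p><img src="logo.png"><img src="photo.jpg" alt="A photo"></p>'
print(check_missing_alt(sample))  # one violation: the first <img>
```

Note that an empty alt="" is deliberately accepted, since it is the standard way to mark decorative images.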

2. Platform Accessibility Service Test Procedure:

This is the procedure to be used whenever it is necessary to determine whether information has been properly communicated to a Platform Accessibility Service (e.g. MSAA, IAccessible2, and UI Automation for Windows applications; AXAPI for Mac OS X applications; the GNOME Accessibility Toolkit API for GNOME applications; Java Access for Java applications).

For some platforms, semi-automated testing solutions may also exist (e.g., inspect32 and AccProbe for Windows, Accerciser for GNOME, Accessibility Inspector for Mac OS X).
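A scripted comparison can complement the manual inspection tools above. The sketch below is purely illustrative: query_accessibility_api is a hypothetical stand-in for a real platform binding (e.g. UI Automation, IAccessible2, AT-SPI, or AXAPI), and the point is the test logic of comparing the role and name the service actually exposes against what the user interface presents.

```python
# Hypothetical sketch: query_accessibility_api stands in for a real
# platform binding (UI Automation, IAccessible2, AT-SPI, AXAPI, ...).
def query_accessibility_api(element_id):
    # Stubbed here so the sketch runs; a real implementation would
    # call into the platform accessibility service.
    fake_tree = {
        "save-button": {"role": "push button", "name": "Save"},
        "logo-image":  {"role": "image", "name": ""},
    }
    return fake_tree[element_id]

def verify_exposed(element_id, expected_role, expected_name):
    """Return a list of discrepancies between what the platform
    accessibility service exposes and what the UI presents."""
    exposed = query_accessibility_api(element_id)
    problems = []
    if exposed["role"] != expected_role:
        problems.append(f"{element_id}: role {exposed['role']!r}, "
                        f"expected {expected_role!r}")
    if exposed["name"] != expected_name:
        problems.append(f"{element_id}: name {exposed['name']!r}, "
                        f"expected {expected_name!r}")
    return problems

print(verify_exposed("save-button", "push button", "Save"))   # []
print(verify_exposed("logo-image", "image", "Company logo"))  # name mismatch
```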

3. User Agent Accessibility Test Procedure (Level A only):

This procedure is used if it is necessary to determine whether a preview meets UAAG. For UAAG 1.0, the most complete test is the UAAG 1.0 Test Suite for HTML 4.01.

Important Note: This tool is not required if previews can be performed in an in-market user agent (see ATAG 2.0 Success Criterion A.3.7.1 for more information).

4. Accessible test content file (Level A, AA, AAA):

This pristine, accessible content is needed to test criteria such as whether content transformations lose accessibility information and whether checkers report false positives. The method for loading this content will depend on the nature of the authoring tool (e.g. opening a test file, pasting in content, authoring the content manually). The test content should:

Note: This test content may not be needed if the tool does not import content or allow markup to be pasted in.
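One way to use the pristine content is a round-trip check: record its accessibility information, run it through an authoring-tool operation (save, paste, format conversion), and confirm the information survives. A minimal sketch, assuming HTML content and tracking only img alt text:

```python
from html.parser import HTMLParser

class AltCollector(HTMLParser):
    """Collects the alt text of every <img>, in document order."""
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.alts.append(dict(attrs).get("alt"))

def collect_alts(html_source):
    collector = AltCollector()
    collector.feed(html_source)
    return collector.alts

# Pristine accessible test content (illustrative fragment).
before = '<h1>Report</h1><img src="chart.png" alt="Sales rose 10% in Q2">'
# The same content as re-emitted by the authoring tool after some
# operation; here only the attribute order has changed.
after = '<h1>Report</h1><img alt="Sales rose 10% in Q2" src="chart.png">'

# The transformation preserves accessibility information if the
# collected alt texts are unchanged.
assert collect_alts(before) == collect_alts(after)
```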

5. Non-accessible test content file (Level A, AA, AAA):

This non-accessible content is used to test whether checkers are effective at detecting issues. The method for loading this content will depend on the nature of the authoring tool (e.g. opening a test file, pasting in content, authoring the content manually).

Note: This test content may not be needed if the tool does not import content or allow markup to be pasted in.
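It is worth recording exactly which defects were seeded into the non-accessible file, so that a checker's output can be scored for both detection (true positives) and noise (false positives). A sketch, assuming an HTML test file and illustrative defect identifiers:

```python
# Illustrative non-accessible test fragment with deliberately seeded defects.
seeded_content = """
<img src="banner.png">             <!-- defect 1: missing alt -->
<a href="report.pdf">click here</a><!-- defect 2: non-descriptive link text -->
<input type="text" name="email">   <!-- defect 3: no associated label -->
"""

# Defects the checker is expected to find, keyed by ids of our choosing.
seeded_defects = {"missing-alt", "vague-link-text", "unlabeled-control"}

def score_checker(reported):
    """Compare checker output (a set of defect ids) against the seed list."""
    detected = seeded_defects & reported
    missed = seeded_defects - reported
    false_positives = reported - seeded_defects
    return detected, missed, false_positives

# Example: a checker that found two seeded defects and one spurious issue.
detected, missed, false_positives = score_checker(
    {"missing-alt", "unlabeled-control", "low-contrast"})
print(len(detected), len(missed), len(false_positives))  # 2 1 1
```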

6. A selection of separate pieces of content:

These pieces of content will be used, as needed, to test various success criteria. The method for loading this content will depend on the nature of the authoring tool (e.g. opening a test file, pasting in content, authoring the content manually). They should include:

7. List of "accessible content support features" (may be created during testing):

While testing the authoring tool against all of the following relevant success criteria, compile a list of the authoring tool features that are relevant to each test (they do not necessarily have to pass), noting whether each feature can be turned off, either directly from where it appears in the user interface (e.g., via a "Do not show this again" option) or from the authoring tool settings.
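The list can be kept in any form; a minimal structured record, sketched below with illustrative field names and feature examples (the success criterion references are placeholders), makes it easy to confirm afterwards that every relevant feature was captured along with how it can be turned off:

```python
from dataclasses import dataclass

@dataclass
class SupportFeature:
    """One entry in the list of accessible content support features."""
    name: str               # feature as labelled in the tool's UI
    success_criterion: str  # ATAG 2.0 success criterion it relates to
    can_be_turned_off: bool
    turn_off_location: str  # "in place" (e.g. a "Do not show this
                            # again" option) or "settings"; "" if N/A

# Illustrative entries only; real entries come from the tool under test.
features = [
    SupportFeature("Prompt for image description", "B.2.1.1", True, "settings"),
    SupportFeature("Accessibility checker", "B.3.1.1", False, ""),
]
print(sum(1 for f in features if f.can_be_turned_off))  # 1
```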

Relevant Success Criteria:

[outdated] ATAG 2.0 Testing Overview - links to the 2012 testing files and manifest documents

[outdated] W3C Testing Framework - ATAG 2.0 - W3C Testing Framework for running tests in 2012. This is not working properly; do not use it.