
[DRAFT] Combining Expertise to Evaluate Web Accessibility


Note: This document is a draft and should not be referenced or quoted under any circumstances.
Last updated: 16 February 2006 [changelog]

Note that many of the links below do not go to current versions, because they are relative. Use the main site navigation to get to the latest versions.

Introduction

Evaluation of Web accessibility can benefit from combining the expertise and perspectives of multiple evaluators. An individual with training and experience across a broad range of disciplines can evaluate Web accessibility effectively, but it is unlikely that one individual will have all the expertise that a collaborative approach can bring.

This document describes recommended expertise for Web accessibility evaluation, opportunities for creating collaborative evaluation processes, and considerations for effective collaboration in different settings.

Related evaluation resources are referenced throughout this document. Most of them can be found in this resource suite, Evaluating Web Sites for Accessibility.

Recommended Expertise

Effective evaluation of Web accessibility requires more than just running an evaluation tool over a Web site. A comprehensive and effective evaluation requires evaluators with an understanding of: the Web technologies used on the site; a variety of evaluation tools and approaches; the barriers that people with disabilities experience; the tools and strategies that people with disabilities use in accessing Web content; and the Web Content Accessibility Guidelines (WCAG) together with the Techniques for WCAG.
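
For example, an automated check can detect that an image lacks an alt attribute, but it cannot judge whether existing alternative text actually conveys the image's purpose; that judgment requires human expertise. The following minimal sketch (a hypothetical illustration in Python, not part of any W3C tool) shows the kind of check a tool can automate, with comments noting where human review is still needed:

    # Minimal sketch of an automated accessibility check: it can find <img>
    # elements with missing or empty alt attributes, but it cannot judge
    # whether existing alt text is meaningful -- that needs a human reviewer.
    from html.parser import HTMLParser

    class ImgAltChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.findings = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attrs = dict(attrs)
                src = attrs.get("src", "(unknown)")
                if "alt" not in attrs:
                    self.findings.append(("missing alt attribute", src))
                elif not (attrs["alt"] or "").strip():
                    # Empty alt is valid for decorative images, but only a
                    # person can confirm the image really is decorative.
                    self.findings.append(("empty alt (decorative?)", src))

    checker = ImgAltChecker()
    checker.feed('<p><img src="logo.png"><img src="chart.png" alt=""></p>')
    for issue, src in checker.findings:
        print(issue + ": " + src)

Even this simple example shows why tool results need expert interpretation: the second image may be perfectly accessible if it is decorative, and only a reviewer who understands the page can tell.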

The following list covers recommended expertise across a variety of areas, with links to initial resources that may be useful in learning more about each area:

Opportunities for Collaborative Evaluation

When first conducting a Web accessibility evaluation, many organizations either assign the task to a single individual within the organization or outsource it to an individual expert. Over time, however, many organizations move to a collaborative evaluation process that draws on the skills and perspectives of multiple evaluators. This approach allows an organization to draw on different sources of expertise within the organization, while relying on outside experts where needed to fill gaps in expertise.

Evaluation processes that enable collaboration between evaluators with diverse expertise include:

Considerations in Combining Expertise

Developing a distributed evaluation capability across an organization

Some organizations choose to develop a distributed evaluation capability and to integrate it into Web development processes throughout the organization. This approach makes it easier to identify diverse expertise, since evaluators can be drawn from beyond the boundaries of a particular work unit. It can also foster a shared organizational mission of continual improvement in accessibility, rather than leaving oversight of accessibility to a single office.

Identification of external expertise

Once the gaps in internal expertise are clear, an organization can prioritize its needs for locating external expertise. Often the internal gaps are in areas of knowledge specific to disability and/or accessibility; for instance: Web accessibility guidelines; approaches for evaluating Web accessibility; use of a variety of accessibility evaluation tools; understanding of the barriers that people with disabilities experience; and familiarity with assistive technologies. In addition, even organizations with established user testing processes may need guidance on getting feedback from users with disabilities. Because these areas require different types of expertise, it may be valuable in some cases to bring in more than one outside expert to cover them effectively.

Involving users in evaluation

Including people with disabilities in a collaborative group can contribute to a better understanding of accessibility issues within the organization and help maintain focus on the urgency of addressing accessibility barriers on a site, in addition to the individual technical contributions those evaluators make to the evaluation itself.

In addition, regardless of the collective expertise of a collaborative group of evaluators, the group may want to supplement its conformance evaluations with periodic review by users with a variety of disabilities. Effectively involving users in Web accessibility evaluations raises many considerations, including ensuring diversity in the disabilities represented, the types of assistive technology used, and levels of experience with the Web.

Facilitating collaboration through shared tools and templates

A group of evaluators may want to arrange shared access to certain evaluation tools, or to ensure that the group as a whole has access to a broad range of tools.

Using an agreed-upon template for reporting the results of evaluations can greatly facilitate coordination between different evaluators. Such a template might be an adaptation of the Template for Accessibility Evaluation Reports.
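
As one hypothetical illustration of how a shared structure can help, the following Python sketch defines a simple report record and a merge step for combining per-evaluator reports. The field names are illustrative assumptions, loosely echoing the kinds of information an evaluation report typically covers (scope, evaluators, conformance target, individual findings); they are not taken from the W3C template itself:

    # Hypothetical shared report structure; field names are illustrative
    # assumptions, not the W3C Template for Accessibility Evaluation Reports.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Finding:
        page: str         # URL or identifier of the page evaluated
        checkpoint: str   # e.g. a WCAG checkpoint identifier
        severity: str     # e.g. "barrier" or "warning"
        description: str  # what was observed and why it is a problem
        evaluator: str    # who reported it; useful when merging reports

    @dataclass
    class EvaluationReport:
        site: str
        scope: str                      # which pages or sections were covered
        evaluators: List[str]
        conformance_target: str         # e.g. "WCAG 1.0 Double-A"
        findings: List[Finding] = field(default_factory=list)

    def merge(reports: List[EvaluationReport]) -> EvaluationReport:
        # When every evaluator uses the same structure, combining their
        # individual reports into one shared report is straightforward.
        combined = EvaluationReport(
            site=reports[0].site,
            scope="; ".join(r.scope for r in reports),
            evaluators=[e for r in reports for e in r.evaluators],
            conformance_target=reports[0].conformance_target,
        )
        for r in reports:
            combined.findings.extend(r.findings)
        return combined

A shared format like this also makes it easier to spot where two evaluators disagree about the same page or checkpoint.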

Communicating results

Collaborative teams may want to give particular attention to communicating the results of their evaluations clearly to their customers, since their reports represent the combined perspectives of different evaluators. Meeting with the customer to review evaluation results together can be a useful complement to the written report.

Getting and giving feedback

Providing a mechanism for feedback within an organization on the usefulness of the evaluation process and its reports can help the group identify remaining gaps in its expertise and contribute to long-term improvement in the quality of its evaluations.

Feedback from experienced groups of evaluators on evaluation resources such as W3C/WAI's Evaluating Web Sites for Accessibility can, over time, also help improve the quality of evaluation support resources available to the broader Web community. Feedback links are available in the footers of pages in this evaluation resource suite.