[EARLY CONCEPT DRAFT] Evaluating Web Accessibility with Users
Note: This document is a draft and should not be referenced or quoted under any circumstances. [requirements & changelog]
Introduction
- encouraging - you can do this at any level (you don't have to do expensive, formal usability testing); lots of bang for the buck (even with informal evaluation with a couple of users)
- benefits - awareness for developers, managers, etc. (the power of stakeholders observing); maximizes investment in accessibility (limits stupid mistakes, e.g., alt text like "This image is a line art drawing of a dark green magnifying glass. If you click on it, it will take you to the Search page." - see the markup sketch at the end of this list) {Accessibility evaluation is often limited to assessing conformance to accessibility standards. When the focus is on the technical aspects of accessibility, the human interaction aspect can be lost.} {Effective accessibility evaluation includes both evaluation expertise and the experience of people with disabilities.}
- incorporate users with disabilities throughout design, not just in evaluation at the end
- usability testing is not a requirement for ensuring compliance with WCAG. and: {while usability testing helps assess how usable accessibility solutions are by people with disabilities, it does not evaluate conformance to [WCAG].}
- this resource highlights some main points. other (non-WAI) resources are available with more details on recruiting people with disabilities, conducting usability testing, ...
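- markup sketch for the alt text example above - a hypothetical search icon link (the file name and link target are made-up placeholders, not from any particular site):

    <!-- overly verbose alt text: a screen reader reads all of this aloud -->
    <a href="search.html">
      <img src="magnifier.gif"
           alt="This image is a line art drawing of a dark green magnifying glass. If you click on it, it will take you to the Search page.">
    </a>

    <!-- concise alt text: conveys the function of the image -->
    <a href="search.html">
      <img src="magnifier.gif" alt="Search">
    </a>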
Types of User Involvement
ranging from: {including people with disabilities in limited evaluation focusing specifically on accessibility issues as well as in formal usability testing}
- evaluating specific accessibility features...
- informal, asking person down the hall to try something...
- formal usability testing, task-based, overall Web site
- {Usability testing provides quantitative and qualitative data from real users performing real tasks with a product.} {Usability testing is a method by which representative users of a product are asked to perform certain tasks in an effort to measure the product's ease-of-use, task time, and the user's perception of the experience. [2]}
- and lots in between
- ("Mention tasks and expected outcomes. Don't just sit the user down
and say "what do you think of that?"" - from Alan's e-mail) - bottom line: {Large-scale formal usability testing is not necessary to evaluate usable accessibility. Short informal evaluation can gather valuable feedback from people with disabilities without the rigor of formal usability testing.}
{Screening techniques for accessibility are simple activities to help... evaluators identify potential accessibility problems.}:
- some screening techniques are listed in the preliminary review
- {Using screening techniques... help avoid the pitfall of bringing in people with disabilities only to discover the [Web site] is so inaccessible that [PWDs can't use it at all].} {Screening techniques can make [evaluating] with real participants with disabilities more effective by identifying obvious problems before... testing [with PWDs].}
- caution: {False negatives can be generated from screening techniques. For example, most screen readers (software, primarily used by people who are blind, that speaks the text displayed on a computer screen) have a fairly high learning curve. New screen reader users [non-disabled evaluators] are likely to attribute problems to the [Web site] being evaluated and not realize that the problem is in their lack of knowledge in using the screen reader.}
The User Aspect
- more than screen reader users! [how scope this point? just say "more than screen reader users"? - or go into more about selecting participant characteristics? (which is complex & no easy answer, e.g., see http://www.ittatc.org/technical/access-ucd/ut_plan.php#characteristics)]
- user experience level with Web sites & with assistive technology {Some AT has a fairly steep learning curve, and so the participant's experience with particular AT impacts the [evaluation].}
- depends on target audience for Web site (e.g., Web application for in-house accountants - high experience; Web site for applying for disability benefits - low experience)
- too advanced might know uncommon work-arounds {Expert AT users might be able to overcome problems that the average user would find difficult or impossible to handle.}
- not advanced enough may not know things like links lists, headings nav, etc. (see the markup sketch at the end of this section) {Novices can spend lots of time figuring out the AT rather than the [Web site] being tested and even give "false negatives," problems which are not the fault of the product being tested, rather the participant doesn't know how to use AT effectively.}
- people often misstate their own experience level. use tests to determine it. [minor point? out of scope?]
- PWDs will find usability issues, {Usability testing with participants with disabilities will identify usability problems that impact all users, regardless of disability}... so instead of doing the usual number of non-disabled participants + additional participants with disabilities, you can do a variety of participants with disabilities!
- preparing for PWDs [how scope these? important, but hard to cover briefly, e.g., see http://www.ittatc.org/technical/access-ucd/ut_prep.php & http://www.ittatc.org/technical/access-ucd/ut_conduct.php#roomsetup]
- interacting with PWDs [how scope this? important, but hard to cover briefly, e.g., see http://www.ittatc.org/technical/access-ucd/ut_conduct.php#interacting]
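- markup sketch for the "links lists, headings nav" point above - a hypothetical page outline showing the kind of heading structure those screen reader features rely on (the content is invented for illustration):

    <h1>Disability Benefits</h1>
    <h2>Who can apply</h2>
    <p>...</p>
    <h2>How to apply</h2>
    <p>...</p>
    <!-- experienced screen reader users can jump between the h1/h2 headings or
         pull up a list of links; novices may not know these features and instead
         read the page from top to bottom -->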
Understanding [Findings/Results]
- one user is not representative of all; don't do everything one user says! {Don't assume that feedback from one person with a disability applies to all people with disabilities. A person with a disability does not necessarily know how other people with the same disability interact with products, nor know enough about other disabilities to provide valid guidance. Just as you would not make design decisions based on feedback from just one usability testing participant, don't make accessibility design decisions based only on the recommendations of one person with a disability. What works for one person might not work for everyone with that disability or other disabilities.}
- *vital* to distinguish between issues of:
- user agent, assistive technology {Different AT can interact differently... for example, different screen readers interact differently with the same Web page.}
- If so: do you need/want to do a work-around? please inform the vendor!
- user knowledge
- If so: is this common in your target user group? do you want to provide additional instructions?
- Web site
- If so: then you can fix it!
- distinguish between usability & accessibility issues {In addition to finding accessibility problems, usability testing with participants with disabilities will identify usability problems that impact all users, regardless of disability... Because usability and accessibility overlap, it is sometimes difficult to distinguish an issue as usability or accessibility. When usability test reports are used internally to improve the usability of the product for all users, it is usually not necessary to distinguish between usability and accessibility issues. However, when usability test reports make statements about accessibility, it can be vital to distinguish between usability and accessibility issues.}
- {Usability problems impact all users equally, regardless of ability. That is, a person with a disability is not disadvantaged to a greater extent by usability issues than a person without a disability.}
- {Accessibility problems impact people with disabilities, and not people without disabilities. When a person with a disability is at a disadvantage relative to a person without a disability, that is an accessibility issue.}
- {Clearly indicate what the report asserts and what it does not assert, especially when people who are not familiar with accessibility evaluation will likely read the report. Successful usability test results do not guarantee that the product is accessible to all people with disabilities, nor that it conforms to [WCAG].}
[Would like some kind of concluding section. Perhaps something from the introduction could move down here?]
Notes
{...} unless otherwise indicated, text in curly brackets is quoted from pages in this resource: [1] http://www.ittatc.org/technical/access-ucd/overview.php (which Shawn is editor of & we don't expect to use verbatim)
[2] www.remedy.com/customers/dev_community/UserExperience/glossary.htm