User Agent Implementation Report for Second Candidate Recommendation
Individual Evaluations
Evaluations Included in Report
- Internet Explorer 6.0 for Windows 9x/Me/2000/XP (HTML 4.01, CSS1, CSS2 and SMIL 2.0)
- GW-Micro Window-Eyes (with Internet Explorer 5.5) 4.11 for Windows (HTML 4.01, CSS1 and CSS2)
- JAWS (with Internet Explorer 6.0) 4.02 for Windows 9x/Me/2000/XP (HTML 4.01, CSS1, CSS2 and SMIL 2.0)
- Opera Browser 6.0 for Windows (HTML 4.01, CSS1 and CSS2)
- Mozilla Browser 0.9.9 for Windows (HTML 4.01, CSS1 and CSS2)
- IBM Home Page Reader 3.02 for Microsoft Windows (HTML 4.01, CSS1 and CSS2)
- Internet Explorer 5.0 for Macintosh OS 9.x (HTML 4.01, CSS1 and CSS2)
- Accessible Browser Project at UIUC (uses Internet Explorer) beta for Windows (HTML 4.01, CSS1 and CSS2)
- Real Media Player (Audio and Video content only) 6.0.9.584 for Microsoft Windows (Audio, Video and SMIL)
- Windows Media Player 8.0 for Windows XP (Audio and Video)
- Grins SMIL2 Player for Windows 9x (SMIL 1.0, 2.0)
- Adobe PDF Reader 5.0 for Microsoft Windows (Portable Document Format (PDF))
Pending Evaluations
- Konqueror 3.0 for UNIX
  Formats: HTML 4.01, CSS1 and CSS2
  Reviewers: Ian Jacobs and Dirk Mueller
- QuickTime Player
  Formats: Audio, Video and SMIL
Previous Evaluations
- JAWS 3.7 and Internet Explorer 5.5
- Internet Explorer 5.5 for Windows
- Opera 5.12 for Windows
- Internet Explorer 5.0 for Windows
- Amaya 2.1
- IBM Home Page Reader 2.5 for Windows
- Evaluation of HAL, MSIE, NS, and Opera (on Windows 95; 6 September draft)
- Netscape Navigator 4.6 on Linux
- PW WebSpeak 3.0 for Windows
- RealPlayer Basic 7
- IBM Home Page Reader 2.5 for Windows (Candidate Recommendation 1 of UAAG 1.0)
- Lynx 2.8.3 with the second last call draft of UAAG 1.0
- Opera 5.1 for the 9 April 2001 draft of UAAG 1.0
Rating Information
- Rating Scale
  - C: Complete Implementation
  - VG: Very Good Implementation, almost all requirements satisfied
  - G: Good Implementation, most important requirements satisfied
  - P: Poor Implementation, some requirements satisfied and/or difficult for user to access feature
  - NI: Not Implemented
  - NR: Not Rated
  - NA: Not Applicable
Summary of Checkpoint Implementation
Maximum Checkpoint Rating by Checkpoint Priority
Rating                                        | Priority 1 | Priority 2 | Priority 3 | Total
Complete                                      | 47         | 28         | 7          | 82
Very Good                                     | 0          | 1          | 1          | 2
Good                                          | 1          | 0          | 0          | 1
Poor                                          | 0          | 1          | 0          | 1
Not Implemented                               | 0          | 3          | 1          | 4
Two Complete                                  | 43         | 21         | 5          | 69
Totals                                        | 48         | 33         | 9          | 90
Percent One Complete Implementation           | 97%        | 84%        | 77%        | 91%
Percent More than One Complete Implementation | 89%        | 63%        | 55%        | 76%
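The two percentage rows appear to follow from the counts above: the Complete and Two Complete rows divided by the Totals row, truncated to whole percentages (the Two Complete row is read here as checkpoints with at least two complete implementations, and it is not part of the Totals sum). A minimal sketch of that arithmetic, with the division-and-truncation step as the only inference:

    # Counts taken from the table above; the floor division is an inferred detail.
    complete     = {"P1": 47, "P2": 28, "P3": 7, "Total": 82}
    two_complete = {"P1": 43, "P2": 21, "P3": 5, "Total": 69}
    totals       = {"P1": 48, "P2": 33, "P3": 9, "Total": 90}

    def percent(count, total):
        """Whole-number percentage, truncated toward zero."""
        return count * 100 // total

    print({k: percent(complete[k], totals[k]) for k in totals})      # {'P1': 97, 'P2': 84, 'P3': 77, 'Total': 91}
    print({k: percent(two_complete[k], totals[k]) for k in totals})  # {'P1': 89, 'P2': 63, 'P3': 55, 'Total': 76}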
Checkpoints with Low Implementation Experience
Note: The Max rating is an indication of how close the checkpoint is to having one complete implementation. The Average rating is an indication of how widely the checkpoint is implemented across all user agents in this report, and is given for checkpoints with only one complete implementation. User agents that mark the checkpoint as not applicable are not included in the average calculation for that checkpoint.
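As a rough illustration of that averaging rule, here is a minimal sketch in Python; the numeric mapping from letter ratings to scores is a hypothetical assumption for illustration only, and the +/- modifiers used in the listings below (for example P+ and P-) are treated as their base grade:

    # Hypothetical score mapping; the UAWG report does not define numeric values for ratings.
    SCORES = {"C": 4, "VG": 3, "G": 2, "P": 1, "NI": 0}

    def average_rating(ratings):
        """Average one checkpoint's ratings across user agents, skipping NA (and NR, which carries no rating)."""
        counted = [SCORES[r.rstrip("+-")] for r in ratings if r not in ("NA", "NR")]
        return sum(counted) / len(counted) if counted else None

    # Example: three agents rate a checkpoint C, NA and P; the NA rating is not counted.
    print(average_rating(["C", "NA", "P"]))  # (4 + 1) / 2 = 2.5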
Priority 1 Checkpoints
- No complete implementation experience
  - 4.6 Position captions. (Max rating: G)
- One complete implementation
  - 1.2 Activate event handlers. (Ave rating: P+)
  - 3.5 Toggle content refresh. (Ave rating: P-)
  - 4.4 Slow multimedia. (Ave rating: P-)
  - 10.1 Table orientation. (Ave rating: G)
Priority 2 Checkpoints
- No complete implementation experience
  - 3.6 Toggle redirects. (Max rating: NI)
  - 4.8 Control other multimedia. (Max rating: NI)
  - 5.5 Confirm form submission. (Max rating: P)
  - 5.6 Confirm fee links. (Max rating: NI)
  - 10.5 Outline view. (Max rating: VG)
- One complete implementation
  - 4.7 Slow other multimedia. (Ave rating: P)
  - 4.11 Control other volume. (Ave rating: C)
  - 5.3 Manual viewport open only. (Ave rating: P+)
  - 6.8 DOM CSS access. (Ave rating: G)
  - 9.5 No events on focus change. (Ave rating: P)
  - 9.6 Show event handlers. (Ave rating: P)
  - 11.2 Current author bindings. (Ave rating: P-)
Priority 3 Checkpoints
- No complete implementation experience
  - 2.10 Toggle placeholders. (Max rating: NI)
  - 2.11 Alert unsupported language. (Max rating: VG)
- One complete implementation
  - 5.7 Manual viewport close only. (Ave rating: P-)
  - 9.10 Configure important elements. (Ave rating: P-)
Disclaimer
In order to verify the utility and applicability of the Guidelines, the User Agent Accessibility Guidelines Working Group (UAWG) is testing the Guidelines by reviewing a variety of user agents (for the purposes of this report, a user agent may consist of a combination of several technologies) on a variety of platforms. This review will help us determine which requirements of the guidelines have been implemented and which have not.
- The report is not meant as a definitive review of products, although we anticipate sending our findings and observations to developers. When possible, developers will be provided an opportunity to comment on product reviews before they are included in the report. Reviews can change at any time as new information is provided to the working group.
- The individual product reviews in this report should not be considered a statement of conformance with the User Agent Accessibility Guidelines (UAAG). Please review the Guidelines for developing conformance statements. Some reviews may be partial, showing implementation of only a subset of UAAG requirements.
- These reviews are informative only and should not be used to rate or compare product accessibility. The reviews do not necessarily reflect a consensus of the user agent working group, and comments on any of the reviews can be sent to the UAWG mailing list.
- Please note that the Guidelines document is a W3C Working Draft,
which means that the document may change at any time.
The UAWG welcomes additional reviews. Each review should include the above disclaimer. Reviews should also clearly state the product version, operating system version, and any other information necessary to allow someone else to repeat the evaluation. If possible, submit the review in the XML evaluation report format. A how-to document is being prepared to guide reviewers through the evaluation of a product.
The report generation tool is based on XML and uses XML-formatted evaluations of individual user agents to compile a full report on checkpoint implementation experience. Reviewers who follow the evaluation format can have their reviews easily added to the implementation report.
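For reviewers curious about how such XML evaluations might be compiled, the following is a minimal, hypothetical sketch; the evaluations/*.xml path and the <checkpoint id="..." rating="..."> structure are illustrative assumptions rather than the actual UAWG evaluation format:

    # Hypothetical aggregation of per-agent XML evaluations into per-checkpoint summaries.
    import glob
    import xml.etree.ElementTree as ET
    from collections import defaultdict

    ratings = defaultdict(list)  # checkpoint id -> ratings across all reviewed user agents

    for path in glob.glob("evaluations/*.xml"):  # assumed location of the XML evaluation files
        for cp in ET.parse(path).getroot().iter("checkpoint"):
            ratings[cp.get("id")].append(cp.get("rating"))

    for cp_id, values in sorted(ratings.items()):
        complete = values.count("C")
        applicable = [v for v in values if v not in ("NA", "NR")]
        print(f"{cp_id}: {complete} complete implementation(s) out of {len(applicable)} applicable ratings")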
Jon Gunderson (jongund@uiuc.edu)
Ian Jacobs (ij@w3.org)
Last revised: $Date: 2001/12/14 19:24:31 $
Copyright 2000, 2001, 2002 W3C (MIT, INRIA, Keio), All Rights Reserved. W3C liability, trademark, document use and software licensing rules apply.