Presenting accessibility defects as Annotea annotations

Introduction

This is a demo that illustrates how Annotea could be used as part of an accessibility evaluation process.

The main scenario is that we have a group of designers creating an accessible Web site. The designers test the Web pages with accessibility evaluation tools while developing them, and share the discovered accessibility problems on the Web by using Annotea technology. Because the evaluation results are shared, it is easy to evaluate and correct the accessibility defects co-operatively.

Our goal is that the tools produce the evaluation results in a metadata format, such as EARL. The EARL data, or a similar format, is stored on an annotation server and transformed into the Annotea annotation format by using logic rules. When the designers subscribe to the annotation server, they see the evaluations as annotations on each page as they access it. These annotations can have replies that mark the reported defects as taken care of, or that discuss the solution.

With the current evaluation tools some transformations in our demo need to be done manually. However, the future goal is that the tools develop so that this becomes unnecessary.

We would also like to have mechanisms that allow the tools to use the reply information. For instance, if an evaluation says a point needs to be checked by a human, and it is then marked as checked with a reply annotation, it would be nice if the tool could take that information into account and not report the same point again unless the page has changed.

Demonstration steps

These are the demonstration steps that we used in the Oxygen demo.

Step 1a: accessibility evaluation data

The Oxygen Web page is evaluated with an accessibility evaluation tool. We would like this tool to produce RDF metadata, such as EARL, but the tool reports its findings in a proprietary XML format. An example of the format is shown below. It consists of a report with several reportitems, each describing a possible accessibility defect, together with other information such as the report date.

<?xml version="1.0" standalone="yes" ?>
<mm:report>
  <mm:reportname>OS 9:Desktop Folder:OxygenResultsReport.xml</mm:reportname> 
  <mm:reportdate>Monday June 13 11:08:03 2005</mm:reportdate> 
  <mm:reportlocation>Current Document: OS 9:Desktop Folder:oxygen.html</mm:reportlocation> 
    <mm:reportitem></mm:reportitem> 
    ...
</mm:report>

The reportitem has information about the possible defect, its position in the Web page, and some other details. The following is an example of a reportitem reporting a missing ALT text.

<mm:reportitem>
  <mm:statusicon></mm:statusicon> 
  <mm:displaystr>oxygen.html</mm:displaystr> 
  <mm:file>file:///OS 9/Desktop Folder/oxygen.html</mm:file> 
  <mm:linenumber>14</mm:linenumber> 
  <mm:description>non spacer IMG with valid ALT -- FAILED -- No ALT defined for image.</mm:description> 
  <mm:reportedby>Evaluate and Fix</mm:reportedby> 
</mm:reportitem> 

Here is another reportitem, reporting a missing LONGDESC attribute for an image.

<mm:reportitem> 
  <mm:statusicon></mm:statusicon> 
  <mm:displaystr>oxygen.html</mm:displaystr> 
  <mm:file>file:///OS 9/Desktop Folder/oxygen.html</mm:file> 
  <mm:linenumber>14</mm:linenumber> 
  <mm:description>non spacer IMG needs LONGDESC -- MANUAL -- Non spacer image may need a LONGDESC attribute.</mm:description> 
  <mm:reportedby>Evaluate and Fix</mm:reportedby> 
</mm:reportitem>

The tool itself provides some correction mechanisms for an individual designer, but the accessibility information also needs to be shared with the other designers and evaluators. For that purpose we transform the data into Annotea annotations. However, we still want to keep the original information as well.

Step 1b: transforming positions of accessibility defects

Everything else in the accessibility evaluation report can be transformed quite easily, but the position is given as a tool-internal line number, so we need to make sure that it is understandable to other tools too. This transformation is done manually right now, but the goal is that the tool makers provide such positions with the reportitems. In this particular case the tool maker has actually agreed to also provide the positions in XPointer format.
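For example, the missing ALT text reported above at line number 14 points to an IMG element. In XPointer form the same position could be carried along with the line number, roughly like this (the mm:xpointer element is our own invention for this sketch; the tool does not emit it yet):

<mm:reportitem>
  ...
  <mm:linenumber>14</mm:linenumber>
  <mm:xpointer>xpointer(/html[1]/body[1]/img[1])</mm:xpointer>
  ...
</mm:reportitem>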

Step 1c: transforming XML to RDF/XML

As the reportitem format is just XML, it is transformed to RDF with some simple XSLT code. That helps us keep track of the original data when we add the annotation properties. The reportitem element becomes an RDF node and all its child elements become properties.
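The stylesheet itself is not part of the demo materials, but a minimal sketch of such a transformation could look like the following, assuming the mm: elements are in the namespace that the logic rules below use:

<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:mm="http://www.w3.org/2001/05/earl/MacroMedia#">

  <!-- The report as a whole becomes an rdf:RDF document. -->
  <xsl:template match="mm:report">
    <rdf:RDF>
      <xsl:apply-templates select="mm:reportitem"/>
    </rdf:RDF>
  </xsl:template>

  <!-- Each reportitem becomes an (anonymous) RDF node typed as a
       reportitem; its child elements are copied through unchanged
       and so become properties of the node. -->
  <xsl:template match="mm:reportitem">
    <rdf:Description>
      <rdf:type rdf:resource="http://www.w3.org/2001/05/earl/MacroMedia#reportitem"/>
      <xsl:copy-of select="*"/>
    </rdf:Description>
  </xsl:template>

</xsl:stylesheet>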

Step 2: adding annotation information by using logic rules

To use the annotation mechanisms, the reportitems need to be transformed into Annotea annotations. We keep the original data and add some RDF logic rules stating that an object of type reportitem also has the type Annotation, and similarly that some reportitem properties also are, or can be converted to, annotation-specific properties.

The following are some examples of the logic rules, written in N3. The first rule says that every node that is of type reportitem, as defined in the document marked mm:, is also of type Annotation, as defined in the document marked annot:. Similarly, if there is a description relation between two nodes, there is also a body relation between those same nodes, and the file relation likewise implies the annotates relation. All the other elements of a reportitem simply remain as properties of the node.

@prefix log: <http://www.w3.org/2000/10/swap/log#>.
@prefix mm: <http://www.w3.org/2001/05/earl/MacroMedia#>. 
@prefix annot: <http://www.w3.org/2000/10/annotation-ns#>. 
@prefix : <http://www.w3.org/2001/05/earl#>.
this log:forAll :n1, :n2.  
{ :n1 a mm:reportitem.}   log:implies { :n1 a annot:Annotation }.  
{ :n1 mm:description :n2.}   log:implies { :n1 annot:body :n2 }.  
{ :n1 mm:file :n2.}   log:implies { :n1 annot:annotates :n2 }.
#ends

If we store the accessibility evaluation reports in a general metadata store, we can attach these logic rules to the same database. The rules can be used in two ways: 1) the additional properties can be attached to the existing reportitems in the database, and again every time a new reportitem is added, or 2) they can be calculated every time a query for reportitems is made against the database.
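Outside a database, the forward-chaining variant of approach 1 can also be tried out with a rule engine such as cwm. A command-line sketch, assuming the transformed report is stored in report.rdf and the rules above in rules.n3:

cwm --rdf report.rdf --n3 rules.n3 --think --rdf > annotations.rdf

Here --think applies the rules until no new statements follow, and the final --rdf asks for the result as RDF/XML; the exact flags may vary between cwm versions.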

Here are some sample annotations in the Annotea format, after the RDF transformation and the logic rules have been applied to the evaluation reportitems. The position in XPointer format is still missing, as that is easiest for the evaluation tool itself to create.

<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:a="http://www.w3.org/2000/10/annotation-ns#"
         xmlns:mm="http://www.w3.org/2001/05/earl/MacroMedia#"
         xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
  <rdf:Description>
    <a:created>Monday May 16 08:39:40 2005</a:created>
    <mm:statusicon />
    <mm:displaystr>oxygen.html</mm:displaystr>
    <a:annotates>file:///OS 9/Desktop Folder/oxygen.html</a:annotates>
    <mm:linenumber>14</mm:linenumber>
    <a:body>non spacer IMG with equivalent ALT -- MANUAL --</a:body>
    <mm:reportedby>Evaluate and Fix</mm:reportedby>
  </rdf:Description>
  <rdf:Description>
    <a:created>Monday May 16 08:39:40 2005</a:created>
    <mm:statusicon />
    <mm:displaystr>oxygen.html</mm:displaystr>
    <a:annotates>file:///OS 9/Desktop Folder/oxygen.html</a:annotates>
    <mm:linenumber>14</mm:linenumber>
    <a:body>non spacer IMG needs LONGDESC -- MANUAL -- Non spacer image may need a LONGDESC attribute.</a:body>
    <mm:reportedby>Evaluate and Fix</mm:reportedby>
  </rdf:Description>
</rdf:RDF>

Step 3: evaluations are presented as annotations on the Web page

To be able to see the evaluation defects on a Web page, a user needs to subscribe to the accessibility evaluation database, which has the evaluation data as well as the logic rules needed for the conversion to Annotea annotations.

When the annotations are turned on, the user simply goes to the Oxygen page and immediately sees all the accessibility defects as annotations. A future goal is to define a specific annotation type for accessibility-related annotations, and also to have a dedicated icon for them; a sketch follows the screenshot below.

[Screenshot: the Oxygen Web page with the accessibility defects displayed as annotations]
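Such a type could be defined in the same spirit as the existing Annotea annotation types, as a subclass of the Annotation class. A hypothetical sketch (the acc: namespace and the AccessibilityDefect name are inventions for this demo):

@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#>.
@prefix annot: <http://www.w3.org/2000/10/annotation-ns#>.
@prefix acc: <http://example.org/accessibilityType#>.
# A dedicated annotation type for accessibility defects; clients
# could select a dedicated icon based on this type.
acc:AccessibilityDefect rdfs:subClassOf annot:Annotation.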

Updating the accessibility evaluations

We are not done after sharing the evaluation information as annotations. The next step is to define how the evaluation database should be updated. For instance, if an evaluation is run several times, it would be nice if the system kept the checks already marked done by the designers. However, we don't usually need five annotations all stating exactly the same information, unless we are comparing tools. So we probably need some more logic rules stating when to replace the old evaluations and when to keep them. For instance, we might not add another reportitem at the same position or the same mark-up when it states exactly the same information as one that already exists.
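As a sketch of what such a rule might look like, a reply that marks a defect as taken care of could be propagated back to the annotation it replies to. The th: namespace is the one Annotea uses for replies; the acc:status and acc:resolved properties are hypothetical:

@prefix log: <http://www.w3.org/2000/10/swap/log#>.
@prefix th: <http://www.w3.org/2001/03/thread#>.
@prefix acc: <http://example.org/accessibility#>.
@prefix : <http://www.w3.org/2001/05/earl#>.
this log:forAll :n1, :n2.
# A reply stating that the defect is fixed marks the original
# annotation as resolved, so tools can avoid re-reporting it.
{ :n2 a th:Reply. :n2 th:inReplyTo :n1. :n2 acc:status "fixed". }
  log:implies { :n1 acc:resolved "true" }.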

Conclusions

With some minor changes it is possible to present automatically generated data, such as accessibility evaluation data, as Annotea annotations. This helps collaboration when checking for and correcting the defects. The next step is to use the annotations for giving some information back to the tools, and to use that to decide when to add new annotations and when to replace older ones.

$Id: AnnoteaOxygenDemo.html,v 1.4 2001/08/24 14:40:34 swick Exp $