30 July 2001 - WAI ER Working Group Teleconference

Minutes taken by Wendy.


Agenda for this week's meeting and Rest of AERT open issues



  1. Josh and Chris gave an overview of the suite of test files they've been working on.
  2. Chris gave an overview of the ATR tool he developed to help evaluators save results of taking an evaluation/repair tool through the test files. Results are in EARL.
  3. Resolved that the direction WART is headed is similar to ATR, and that we move ATR to a webified interface and make it the 3rd aspect of the WART tool: an interface to produce conformance evaluation results in EARL for ATAG, AERT, or WCAG. Perhaps we could also add modules for other aspects (test suites) in the future.
  4. Resolved that we had properly divvied up the AERT open issues and will formally hand them off to ATAG, WCAG, and WCAG/xtech.

Action Items:

Action WC and KHS: Divide up the 268 test files to check on InFocus/InSight. WC - WCAG 1.0 guidelines #'s 1 through 7, KHS - #'s 8 through 13

Action SBP: Add interactive/non-interactive/repair property to tools in EARL schema.

Action everyone: review test files and ATR tool ASAP to get feedback to Chris before he moves on to another project.

Action WC, SBP, and CR: Once ATR stabilizes and we get some EARL results to play with, discuss moving ATR to a forms based front-end with python or Perl backend.

Action WC: Send out AERT issues to WCAG, ATAG, and X-Tech

Chris and Josh's test files

Pertinent Document Links:

JK HTML content that either should or should not generate an AERT issue. Wanted them as simple and small as possible. Each test covers a single issue; each file addresses a single point. They are test cases for particular points - to help ER tools, but also for WCAG issues.

CR Some files will show an error, some won't. This helps determine whether a tool is generating false positives.

JK Wrote a program to turn the XML into HTML. It's most helpful for tools that want to use the suite: they can read in the XML to know whether they pass or fail each test. It is AERT-centric and functions as a reference. Others could use the same framework for their own situation, but it is not part of the public distribution.
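A tool consuming the suite would read the XML to learn each test file's expected outcome. The minutes do not show the actual test-suite schema, so the element and attribute names below (`testsuite`, `testcase`, `file`, `expect`) are assumptions; a minimal sketch of reading expectations might look like:

```python
# Hypothetical sketch: the real test-suite XML schema is not given in
# these minutes, so every element/attribute name here is an assumption.
import xml.etree.ElementTree as ET

SAMPLE = """
<testsuite>
  <testcase file="img-alt-missing.html" technique="AERT-1.1.A" expect="fail"/>
  <testcase file="img-alt-present.html" technique="AERT-1.1.A" expect="pass"/>
</testsuite>
"""

def load_expectations(xml_text):
    """Map each test file to the result a conforming tool should report."""
    root = ET.fromstring(xml_text)
    return {tc.get("file"): tc.get("expect") for tc in root.iter("testcase")}

expectations = load_expectations(SAMPLE)
print(expectations["img-alt-missing.html"])  # fail
```

A tool runner could compare its own verdict for each file against this map to count false positives and false negatives.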

CR We've each run through them, but we need someone else to go through them to see if anything is missing. There are 268 files.

WC KHS and I split up tests to run SSB through them.

JK Want to make sure they are kosher first.

WC Do both - check files and tool at same time.

JK Part of design - test files should show only.

CR We're assuming there is no such thing as a manual check - that a tool could be created for each one, e.g., checking an image for low contrast.

JK Image doesn't exist yet.

CR Impossible for any tool to pass all of the tests.

KHS See what it can actually evaluate then try to repair?

CR It's unlikely that a tool can go through every file. Perhaps wait until the end of the week, after we've gone through some of the files.

JK Might be good to put in repair possibilities.

Chris's ATR Tool

Pertinent Document Links:

CR Run a test file through the tool, then mark it in ATR as pass or fail. ATR keeps track of passes and fails. At the end, you get a report: how many passed, how many failed, and someone can take all of the EARL results.
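The bookkeeping described here - record a pass/fail verdict per test file, then report totals - can be sketched in Python (the language later suggested for the web front-end). The field names and structure are assumptions, not the real ATR internals or the EARL vocabulary:

```python
# Hypothetical sketch of ATR-style bookkeeping: record one verdict per
# test file, then summarize. Names are assumptions, not the real ATR.
from collections import Counter

results = []  # list of (test_file, verdict) pairs

def record(test_file, verdict):
    """Store an evaluator's verdict for one test file."""
    assert verdict in ("pass", "fail")
    results.append((test_file, verdict))

def summary():
    """Report how many tests passed and failed overall."""
    counts = Counter(v for _, v in results)
    return {"passed": counts["pass"], "failed": counts["fail"],
            "total": len(results)}

record("test-001.html", "pass")
record("test-002.html", "fail")
record("test-003.html", "pass")
print(summary())  # {'passed': 2, 'failed': 1, 'total': 3}
```

Each recorded verdict would then be serialized as an EARL assertion (with reviewer, tool, and comment fields, per the discussion below) rather than kept as a bare tuple.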

WC Per test file basis?

CR Yes.

HB Is it always clear that a test passes or fails? Perhaps we need "unknown."

CR Also open to interpretation. I may say you pass, you may say it fails.

WC Track the reviewer? What about a comment?

CR Yes.

SBP Tool URI and tool name?

CR Put the Bobby URI instead of the name?

SBP Have an extra box for URI so people know where to find it.

HB Point of comment is to clarify test or to provide more info.

WC Need to store in a public spot (results).

CR Also tests.


CR W3C more authoritative.

JK Under version control. At some point, stop changing and add new, so as not to mess up old tests.

WC If we have comments, we need to get them to you in the next day or so.

JK Interactive vs. non-interactive are two categories, repair is another. Several ways can be used for evaluation.

Action SBP: Add interactive/non-interactive/repair property to tools.

JK There were a number that were difficult; we found holes in AERT, e.g., validating to formal grammars.

CR That's where AERT specifies doctype, but it hints at valid HTML.

JK Most documents don't.

SBP Valid alt-text.

JK In order to make docs work, often have to do broken stuff.

CR Browser-specific elements.

SBP Use the validator.

JK Yes, but hard to generate test files. Many things are invalid.

WC Perhaps Gerald has some good test files.

JK Could easily have generated hundreds of files for a single technique, e.g., relative size. Therefore we took a sample.


WC Seems to me that the ATR is the AERT conformance evaluation aspect of the WART, similar to the ATAG and WCAG aspects of it. Therefore, why don't we get the Windows version of ATR working the way we like, then work on porting it to a webified version.

SBP Could do it in Python.

WC Was thinking Perl since that's what WART is written in now.

Action WC, SBP, and CR: Once ATR stabilizes and we get some EARL results to play with, discuss moving ATR to a forms based front-end with python or Perl backend.
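The forms-based front-end discussed above was never specified in these minutes; as a hypothetical sketch (using Python, one of the two languages proposed, and a WSGI-style handler as a stand-in for the CGI backends of the era), it might accept a test id and verdict from a form submission and store them server-side. All names and parameters here are assumptions:

```python
# Hypothetical sketch of a forms-based front-end for recording verdicts.
# A tiny WSGI-style app: ?test=<file>&verdict=pass|fail records a result.
# This is illustrative only - not the actual WART or ATR design.
from urllib.parse import parse_qs

verdicts = {}  # test file -> "pass" or "fail"

def app(environ, start_response):
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    test = qs.get("test", [""])[0]
    verdict = qs.get("verdict", [""])[0]
    if test and verdict in ("pass", "fail"):
        verdicts[test] = verdict
        status, body = "200 OK", f"recorded {test}: {verdict}"
    else:
        status, body = "400 Bad Request", "missing or invalid parameters"
    start_response(status, [("Content-Type", "text/plain")])
    return [body.encode("utf-8")]
```

A real version would persist the verdicts (e.g., as EARL assertions) instead of holding them in memory, and would add the reviewer and comment fields discussed earlier.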

AERT Open Issues

Pertinent Document Links:

Rest of AERT open issues

Action WC: Send out to WCAG, ATAG, and X-Tech

Next Meeting

Monday, August 6th, 2001 @ the usual time, on the Longfellow bridge +1 617-258-7910.

Telecon Details: Regularly scheduled ER WG calls are Mondays, 10:00 am to 11:30 am, Eastern USA Time
(GMT -05:00) on the MIT bridge (+1 617-258-7910), except when there's a joint ER WG/AU WG meeting that day.

Joint Meetings: Joint meetings with the AUWG are held the first Monday of each month,
except where noted, 12:00 noon to 1:00 pm, Eastern USA Time (GMT -05:00) on the MIT bridge (+1-617-258-7910)

Last Updated: $Date: 2001/07/30 17:28:10 $
by: Wendy Chisholm or Katie Haritos-Shea