W3C

WCAG 2.0 Evaluation Methodology Task Force (Eval TF) Teleconference Meeting

01 Sep 2011

Agenda

See also: IRC log

Attendees

Present
Shadi, Denis, Richard, Samuel, Liz, Kathy, Detlev, Leonie, Vincent, Eric, Vivienne, Alistair, Kerstin
Regrets
Tim
Chair
Eric
Scribe
Denis, Detlev

Contents


General advice: join IRC first, then see which line you're linked to so it's easier to connect with Zakim

Discussion of requirements

<shadi> http://lists.w3.org/Archives/Public/public-wai-evaltf/2011Aug/0050.html

Requirements were sent (about 15); there are some examples of requirements on the W3C website, and a very short extract was included in the mail

Discussions have already started on the mailing list, but let's focus on the requirements first, then start filling in the methodology

Any comments on the draft that was sent?

<vivienne> yes, I have looked through it

Detlev: I think there is a requirement missing on the traceability of requirements

Eric: About the draft that was sent: has anyone seen something that they do not like?

<vivienne> the objectives all look good to me

Eric: Sent a mail directly after the agenda with the draft in it

Detlev: Does this include the reference and computer-assisted reference - would this be enough?

Eric: The critical path analysis part covers key scenarios - not just samples of the site but critical parts: scenarios with steps to go through, not just individual pages

<vivienne> critical path = process?

Detlev: Suggests "critical interaction sequence" instead of "critical path"

<vivienne> could we re-term critical path to script or process?

<kerstin> agree with vivienne

<LeonieWatson> +1 to Vivienne

Critical process or critical script?

<vivienne> we give a script to a user team to follow to have them see the task required and follow it through to completion

<shadi> [[complete task]] [[complete transaction]] [[complete process]]

<samuel> +1 to process, but i've heard "scenario" over the line and it sounds good

<vivienne> vivienne will

EV: we could discuss critical path terms

<dboudreau> sorry guys, it's just too much of an effort to translate, process, then translate back the other way in time to keep up with the discussion :)

EV: better discuss those terms on mailing list

Vivienne: Expand the objectives to cover internal and external tests
... qualify where the evaluation was done and with what methodology

EV: Tests should be usable across the board

<vivienne> yes, it's the statement that's important so we can see how they are making the claim

EV: It may be valid to state this in the evaluation results
... Will send update of objectives
... Proposes discussion of target audiences
... Is any target audience missing?

<dboudreau> * Target Audience:

<dboudreau> A01: All organizations evaluating one or more websites

<dboudreau> A02: Web accessibility benchmarking organizations

<dboudreau> A03: Web content producers wishing to evaluate their content

<dboudreau> A04: Developers of Evaluation and Repair Tools

<dboudreau> A05: Policy makers and Web site owners wishing to evaluate websites

Vivienne: A01, all organizations: is that aimed at companies that evaluate websites, rather than the website-owning organisations?

EV: Yes. A05 is for owners
... No people with disabilities on the list

<vivienne> would people with disabilities be evaluating websites?

<vivienne> I think the disability organisations might fall in with #1

Shadi: There may be others - one primary audience: an organization that wants to conduct an evaluation of a single website
... other users would be secondary: policy makers, tool makers

<vivienne> for example, universities teaching accessibility evaluation?

Shadi: Others are third in line, with long-term benefits

EV: agrees focus is on orgs evaluating websites

Shadi: Universities may also be an audience, but not primary

<vivienne> teaching would fall in with #1 and #2 anyway

EV: Standards documents mostly give such a list under "Target Audience"

<kerstin> I'm missing testers who are not involved with special organisations: freelancers

EV: will modify by adding primary audience

AG: Level of evaluation related to kind of audience

Denis: The five target audiences could be rearranged, with more of them combined into the first group

<dboudreau> Why not simply?

<dboudreau> A01: Evaluation Organizations, Web Content Producers and Benchmarking Organizations

<dboudreau> A02: Web site owners wishing to evaluate websites

<dboudreau> A03: Developers of Evaluation and Repair Tools

<dboudreau> A04: Policy makers

EV: List was not prescriptive

KP: Freelancers are missing from the list of target audiences

<vivienne> aren't freelancers in with #1?

EV: A freelance evaluator is not always a web content producer - could be in the first or second group

KP: evaluators can have different roles and allocations

EV: Evaluators may be organizations or freelancers; this could be described more openly

Kathy: Target audiences: nothing about designers, but they are key people

<kerstin> do I have to mute me and how?

EV: Where should designers be added?

<shadi> [[Somebody who wants to evaluate a website. Examples include: ..., ..., ...]]

<agarrison> agree with shadi

<kerstin> agree also with shadi

<dboudreau> +1 to shadi

Kathy: Designers and usability engineers should go together

<vivienne> I like the way Shadi puts it

LW: These are just variations of the methodology - the document describes variants of the methodology

Page selection applies to clients, not so much to designers

<vivienne> we also evaluate a page before it goes live

EV: Methodology should focus on evaluation
... people making websites are not the primary audience

<samuel> agree with shadi on the proposed format on IRC format.

DB: Aim is to break down requirements for different parts of the team
... Most important to focus on people concerned with evaluation proper, not other roles

<LeonieWatson> +1 to Denis

Shadi: List of examples of target audiences can be given
... Methodology will also have educational aspects, for teaching, self-learning: secondary audience

Vivienne: Designers want to use a test for checking even before the site goes live

The methodology should also be useful at the level of an isolated page, as design input

Vivienne: those users can be added as secondary

Shadi: The methodology can list scenarios of use across audiences and levels

<dboudreau> imho, all users should be added as secondary, whether they're designers, analysts, usability experts, content writers, programmers and whatnot

Samuel: Methodology should apply to all stages of the life cycle

EV: will adapt the requirements text
... Discussing the requirements list
... Are there any requirements that should not be in the list?

<shadi> Detlev: some of the requirements seem problematic

<shadi> ...for example, what is "unique interpretation" and would it conflict with "independent of tools"

DF: Issues: unique interpretation and replicability are partly in conflict; documentation should be added as a requirement

DB: Unique interpretation looks difficult to achieve

EV: Must be discussed in more detail

Samuel: R03 is difficult - it could help to use closed questions (yes/no)

Vivienne: Questions whether evaluation should involve recommendations and assistance to improve matters

EV: will be added to list of requirements

Shadi: Agrees with Vivienne - should also include things on reporting

<vivienne> I agree, reporting is very important - how, to what extent?

Shadi: Second point: no direct reference to WCAG 2.0
... The link to WCAG is important

<vivienne> should also state what level of WCAG 2.0 the evaluation covers

Shadi: The question is whether the reference to techniques can or should be the only source
... So techniques should be mentioned but not be the only point of reference
... The methodology should be usable with other techniques as well

EV: Time is up
... Will send out a new version of the document; invites discussion of the requirements on the mailing list

Summary of Action Items

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.136 (CVS log)