W3C

WCAG 2.0 Evaluation Methodology Task Force

Face to Face Meeting on 30 Oct 2012 in Lyon, France

See also: IRC log

Attendees

Present
Eric_Velleman, Katie_Haritos-Shea, Vivienne_Conway, Detlev_Fischer, Aurelien_Levy, Ramon_Corominas, Shadi_Abou-Zahra, Sylvie_Duchateau_(EOWG), Helle_Bjarno_(EOWG), Shawn_Henry_(EOWG), Dominique_Burger_(Observer), John_S_Lee_(Observer)
Regrets
Chair
Eric
Scribe
Vivienne, Shadi, Detlev, Katie

Contents


<Ryladog> http://www.w3.org/TR/WCAG-EM/

<shadi> http://www.w3.org/WAI/ER/2011/eval/f2f_TPAC.html

Eric: agenda
... SEO - WCAG-EM comes up
... outlined the agenda for today http://www.w3.org/WAI/ER/2011/eval/f2f_TPAC
... introductions of those present

Shadi: outlined the technical issues we worked on yesterday. Today with EOWG we will be looking at the educational value of the document. We should try to stay on the higher level - overall issues. Want to look at definitions, but keep minor comments for later discussion.

Katie: do we want to look at the numbering schema - yes.

Eric: asked EOWG for their first impression of the document

Shawn: first impression - cool. Several people found that wading through the links to the definitions was prohibitive, not just for screen reader users. In the introduction you see constant words underlined to show definitions.

Sylvie: the links are difficult to remember. The section numbering is difficult to work through and follow; it sounds like reading a math text.

Shawn: in the main part of the document - where it says 2.1 Scope of applicability. Having 2.1 in there may be unnecessary, and maybe even having it linked may be unnecessary. Looking at 3.1.1 Step 1.a - that needs work and referring to it is cumbersome.

Katie: would it be better to have 2.1 - scope of applicability?

Ramon: in other W3C documents we don't have numbering like this - like in WCAG 2

Katie: but it's not a methodology

Detlev: you're not sure if it is part of the hierarchy
... there is too much structure and it gets in the way

<shawn> +1 to Detlev

<shawn> says: 3.1.1 Step 1.a: Define the Scope of the Website

<shawn> Methodology Requirement 1.a: Define the scope of the website.

Eric: it gets worse as you go down
... go back to the top of the document where you can see the table of contents
... reviewed how it is set up as steps because we want people to see that these are sequential steps

Helle: you are mixing 2 different numbering systems

Shawn: can you take off the numbering at the first level altogether?

<shawn> take off all numbering - then all that is left is the Step numbering

Shawn: take off the first level of numbers e.g. for Introduction, and the next level for Scope of this document

Aurelien: we already know that it is steps - Conformance Evaluation Procedure, Step 1..., 1.1, 1.2

<shawn> Step 1. then 1.a., 1.b, 1.c OR then 1.1, 1.2, 1.3

Eric: in 3, we would drop the 3, and also the 3.1

Thanks Sylvie!

Aurelien: we say we must have 1.2 or 1.b; as there are no other numbers, we may be able to simplify and just have numbers, or just keep a, b, c

Detlev: you would need 1a or 2a

Detlev, it has to be 1a or 2c or 3d

Katie: has a word document that we will share to show the proposed numbering

all agreed that it looks much simpler now

Shawn: it makes it stronger because the only numbering is for the steps

Sylvie: it will have a better flow for reading

Katie: will stick it in IRC as "Suggested Numbering"

Eric: in agenda - alignment with other WAI documents - next topic

Shawn: http://www.w3.org/WAI/eval/

<Ryladog> Suggested Numbering Streamlining Change:
  Introduction
    o Scope of this Document
    o Target Audience
    o Background Reading
    o Terms and Definitions
  Using this Methodology
    o Scope of Applicability
      - Particular Types of Websites
    o Required Expertise
    o Evaluation Tools (Optional)
    o Review Teams (Optional)
    o Involving Users (Optional)
  Conformance Evaluation Procedure
    o Step 1: Define the Evaluation Scope
      - 1.a or 1.1: Define the Scope of the Website
      - 1.[CUT]

Shawn: how does WCAG-EM relate to this older document. We are re-doing the preliminary review - maybe 'quick checks'. The page that is currently called "Performance Evaluation" might even be WCAG-EM (Light). Quick check is like a 5 minute check or a 15 minute check. If you really want to do conformance, you go to WCAG-EM.
... that is just 1 approach. For the specifications we have a very brief overview page - who the document is for, what is in it, and points to the technical specification - no content, just an introduction
... of all the other sub-sections, how does it relate to what we have elsewhere?

Shadi: ongoing discussion about what is in the scope of WCAG-EM and what is not. There are ideas of what a consultant does to help an organisation towards certification etc. The purpose of WCAG-EM is to harmonize the approaches. A WCAG-EM light should not be guidance of its own.

Shawn: the intro page is a high level description
... the WCAG overview - who is it for, what is in it; it also has WCAG 2 at a Glance

Detlev: would like the 5 minute check / 15 minute check to say clearly that it is a spot check. We do that already when we are contacted to give a first impression as evaluators. Likes the label which indicates that it is not a full conformance evaluation.

Ramon: when I read the methodology for the first time, I didn't know that it was always to assess the full conformance. E.g. for a basic report, I didn't understand that it would give a report of conformance. I don't like the idea of fragmenting between preliminary and full report. We do many types of evaluations and don't want to have to go to different types of methodologies.

Shadi: we are re-opening the scope of WCAG-EM here and should save that. The preliminary eval is not a conformance check.
... introduction document should be more of a summary - eg there are 5 steps - here are the steps

<shawn> Shawn: goal of this page is easy-to-digest introduction, e.g., for managers

Shawn: goal would be to have an overview of the WCAG-EM - like an executive summary

Shadi: like the WCAG at a glance - it is not the WCAG, it is just an overview

<shawn> shawn: *not* WCAG-EM light

Aurelien: the 15 minute light is not the WCAG-EM

Ramon: most of the steps can be shared - you more or less do the sample, you do the same steps but in a smaller way. It is not any level of conformance, but it is just a check.

<shawn> http://www.w3.org/WAI/EO/wiki/Eval_Analysis

Shawn: if I'm having problems with the website, I would want to know what the problems are before I contact the organisation and I could use this. e.g. check the alt text. Provided an example of how someone could use the quick checks, use the overview so they could see whether a bidding evaluator has an accessible website themselves.
... demonstrated the link above to show the different use cases.

Katie: on the overview - agrees with the need for the page - who it is for etc., like the introduction to WCAG

Shadi: section: What's in WCAG-EM

Shawn: we've got good guidance to go back to EO. We're also looking at simplifying the quick checks - just based on a single web page.

Ramon: does it work in reality? A quick check of only 1 page - would only show if it is a complete disaster.

Shawn: we're only looking at whether it is a complete disaster or not

Ramon: we usually do more like 5 pages

Shawn: still not comfortable that we don't have the middle thing. Keep working on the preliminary.

Helle: there are some good ideas in just looking at 1 page - e.g. headings etc. It is a step to say should I go further into the website, or is it so terrible that I just stop at the front. You can do it yourself with your own website and don't need a consultant to see where the worst problems are.

Shadi: motivation for the preliminary review is the target audience - people who are not technical - managers who need help. WCAG-EM is non-trivial and we want to be able to say that when someone used the procedure the result is comparable with someone else's. We would want the person who does the pre-sales check to know what they are doing. It causes potential confusion.

<shawn> Vivienne: Can see the quick checks for single webpage would be useful for training

Shawn: we're thinking of using the BAD for that

<Zakim> shawn, you wanted to ask what about other sections and other existing documents?

dominique: checking to see if my website is compliant with a standard (ISO). You need a process, not simple; WCAG-EM provides guidance to go through a process. The target needs to be made really clear that this is the purpose in the first sections - target audience etc.
... 5 minute is more for advocacy, E&O - needs to be inspired by the WCAG-EM

You can't have a WCAG-EM light to do this

Katie: like WCAG, there are principles

Shadi: the overview/summary is 5 steps

Shawn: the document currently says: "evaluate conformance to WCAG". do we now want to say "evaluate conformance to WCAG (which is also ISO/IEC 40500:2012)"?

Helle: is this document going to be part of Mandate 376 going into procurement work? Is the WCAG-EM going to be a standard like ISO 9000?

Shadi: it is a non-normative document, not even a W3C recommendation. It needs to be kept clear that WCAG is the standard. WCAG-EM is a way to check with reasonable confidence if you have met that. People should follow WCAG in their development, not WCAG-EM. It is not trying to be an ISO standard.

Ramon: it must be clear that this is 1 method, not the only method.

Shadi: it is not the intent to push all W3C standards through ISO. WCAG was particularly for countries that were starting to develop the UN Convention on the Rights of Persons with Disabilities. ISO language needs to be carefully used.

John: Mandate 376 - due to current status it is for us to include it in the technical report. It was supposed to go in the Evaluation Criteria. The benefit of having this is that it can be re-submitted to the technical report? (scribe: not sure of the acronyms)

Shadi: in the main specification of Mandate 376 there is a reference to WCAG-EM. WCAG is in there in full - a reference that for evaluation you can use WCAG-EM as one of the possibilities.

John: certain portions will be included in the technical report - can be added at a later time.

Shawn: maybe the answer on the ISO is that there is a lot of support for it.

Shadi: needs to be worded in a way that it is not confusing

<Zakim> shawn, you wanted to say WCAG + ISO!?!? (and also say comment before)

<shawn> you wanted to ask what about other sections and other existing documents

Shawn: we looked briefly at the evaluation suite and the sub-documents. How does that overlap with what you have in here? For example, you have "involving users" - how does that relate to the WCAG-EM. You have review teams, evaluation tools.

Alignment with other WAI documents

Eric: we got comments from WCAG WG: be more careful with outgoing links
... Some stuff in background reading
... Overview page is agreed - do we need more?

Shadi: There were comments on background reading (required knowledge) - too much linking out
... In section 2 there were some 'teasers' linking out - some thought these should be included or left out
... in 3 Conformance Eval Procedure there is a link to prelim evaluation but that may have been taken out already

Eric: this link is in Section 2 and 4.1 - comment was: If this is important, then maybe include in document?

Shawn: in 1.3 background reading there were comments on accessible web design
... problem of including it in WCAG-EM is that it gets overwhelming

<shadi> http://www.w3.org/WAI/ER/conformance/comments-20120730-WCAG#c4

Vivienne: Regarding links out, if there are many you tend to not go there

<shadi> https://www.w3.org/2002/09/wbs/35422/20120816misc/results

<shadi> https://www.w3.org/2002/09/wbs/35422/20120830evaltf/results

Vivienne: links out may not be used, therefore a synopsis could describe what it's about instead of just having the link

Shawn: Involving users...

Shadi: Michael Cooper had an issue with that

Ramon: Thought the methodology could be used by everyone - now he gets the impression it is only suited for expert evaluators
... this seems in conflict with the stated group of users of WCAG EM

Detlev: doesn't cover details - scope and sampling are more general than having to know all the accessibility technical stuff

Michael Cooper enters room

Shadi: assumption that anyone doing actual evaluations will have to be an a11y expert

<shawn> [[ current wording: "It is assumed that the reader of this document is familiar with the following related resources from W3C/WAI:"

<shawn> Ideas for edited wording: "To effectively use this methodology, you should be familiar with the following related information."

<shawn> or "The information below related to WCAG 2.0 and evaluation is important background for using this methodology. "http://www.w3.org/WAI/EO/wiki/WCAG-EM_review]

Shadi: WCAG-EM is a particular type of evaluation, mostly geared towards people who are not the developers themselves (in-house scenario)

Vivienne: Maybe we need to explain how the different stakeholders (developer, owner, external evaluator) can use WCAG-EM
... Otherwise those users will not know why it is aimed at them
... explaining other use cases for WCAG-EM

Shadi (addressing Michael): regarding comments about linking out

Michael: my concern was 1) that there was repetition by pointing readers to external resources, having to go back and forth, and 2) the two kinds of docs don't have the same status (informal/formal)

<shawn> Michael: I have no problem with linking out in Background Reading. I have problem if *requirements* are in another document.

Michael: try to put everything you need right into WCAG EM and then point to further info as well where needed

Shawn, Shadi looking for areas in the text that demonstrated the problem

Shadi: Problem more pronounced with links to the Understanding doc and techniques

Michael: either point out straight away or include it all - just one paragraph and then linking out throws you off course

Shawn: Distinguishing between useful but not required info, and info essential for following WCAG-EM

<shadi> .oO WCAG'em

Michael: right under the heading, one may link out to more detailed documents, then summarise.

Shawn: links out in the actual conformance steps?

Shadi: no, was taken out
... Example of evaluation tools - no specific tools required but it is an important part of evaluation

Shawn: ... problem with very long documents that may be out of date, not useful to reference
... Are you saying the doc itself should provide guidance? or just link to guidance?

Michael: if this was a spec, I would want to read it top to bottom, as self-contained as possible

<shawn> cooper: I feel like I'm reading along and need to go read this other document and come back...

Michael: partial summary with link to more detailed doc so it feels self-contained

Vivienne: Reg. evaluation tools: would like to see more guidance regarding the selection of tools

<Zakim> shawn, you wanted to say helpful to hear *what* the issue is so we can better address it

Vivienne: aim would be to have enough info to understand importance of tools and selecting appropriate ones

Shawn: we should not focus on specifics but think about what users of the doc really need to make it a usable document

Shadi: Eric, are there more comments that Michael should respond to?

Eric: another issue was the consistent use of 'must', 'shall' etc

Michael: Some people would still like it to be normative, but if this is not going to happen, then the language must be softened

Shadi: issue that it is often not possible to cover every functionality especially in a web application - the issue is the wording of 'common functionality'

Michael: Start with 'What is the web site for?'
... Address this from the user perspective - register, place order, chat with other users, get information etc. - different from the raison d'etre of the web site to put the company forward as a great entity

Shadi: We have 'including user tasks' etc - the argument was how can we distinguish important / critical / key processes

Ramon: core?

<shawn> Detlev: all critical for reaching users' objectives

Shadi: we tried different adjectives - another approach

Detlev: all functionality critical for reaching user's objective?

Vivienne: key tasks, key functionality

Michael: some things are truly common, such as registering; others more individual

Dominique: Problem is two worlds: we need a definition of user tasks, and what tasks are taken into consideration in evaluation

<shawn> Shawn: how it is used in the document: 3.2.2 Step 2.b: Identify Common Functionality of the Website

Ramon: propose 'essential': 'if removed, this would fundamentally change the information or functionality' (glossary of WCAG)

<shawn> WCAG 2 glossary: "essential- if removed, would fundamentally change the information or functionality of the content, and information and functionality cannot be achieved in another way that would conform"

<shawn> detlev: not sure why two views. looking at users trying to achieve something

<shawn> "Methodology Requirement 2.b: Identify the common functionality of the website. ...The outcome of this step is a list of user tasks..."

Dominique: in the evaluation process, the evaluators need to state what tasks they consider important for checking - at this point they need to declare what they consider essential - the definition of the task itself might be different - so the process of evaluation singles out what evaluation looks at

Shawn: For practical reasons a selection needs to be made: the owner may propose key tasks and the evaluator may confirm that, or disagree and correct it

Shadi: Step 1 includes commissioner where possible, step 2 possibly the developer - the idea is exploration to inform the selection of a representative sample
... allows to create tasks that are fairly high-level, then select pages to reflect that
... this may later include subtasks, such as the subtask 'change home address' in the overall purchasing task at Amazon or similar
... third level may be 'entering a date'
... For sampling, this may mean that some subtasks are covered by the same state of a page

Vivienne: The tasks should reappear in the section on reporting
... sees problem with the word 'essential' for tasks - a user may not do anything on a site - instead it could be important, key, etc.; essential sounds like mandatory

Ramon: explains 'essential' as not meaning mandatory but as invalidating the user experience

'essential functionality' rather than 'essential tasks'

Katie: if you distinguish primary and secondary as sequential rather than a matter of priority it is perhaps clearer

Shawn: but some things *are* more important than others

<Zakim> shawn, you wanted to ask shadi if it's clear do this with evaluation commissioner? and to clarify comment for the minutes: Shadi said (something like) "Step two ideally is done

Shawn: If, say, step 1 is to be done with the commissioner, is that clear from the document?

Shadi: This is optional

Eric: problem of commissioners trying to suggest a particular scope (to avoid things being evaluated)

They want to constrain evaluation, for example, to get a logo for their site no matter what

Eric: Problem of delineating sites and parts of sites that follow the same design or are functionally important - often difficult to distinguish
... difficulty of defining key functionality

Katie: simply because it is hard, it should not be left out

Shadi: Assuming that there are multiple levels of functionality, we have only described the top level

High level task implies that lower levels need to be defined / targeted for later sampling

Shadi: some tasks may be outside the initial exploration but may later enter the evaluation

<shawn> Detlev: have similar issue - commissioner would like to exclude some things from the evaluation

<shawn> ... want to prevent it

<shawn> John: the issue is to make it clear in the scope what was included in the evaluation (and what was not)

<JohnS_> In the scope of the reporting of the evaluation

<shawn> Detlev: WCAG-EM defines XYZ - then whoever uses it, defines xyz - evaluator decides if they just take the tasks that the commissioner wants - OR the evaluator decides that they will include all important functionality/tasks, e.g., if it is an evaluation for a seal

John (before Detlev): does not matter if commissioner defines the scope because the report makes limitation clear

Vivienne: Does not always communicate with commissioner what 'key processes included' actually are - evaluator has to form an independent opinion as to what are the main user goals / processes

Ramon: matter of reliability and confidence of the methodology - if a label is handed out to site owners - WCAG itself would prohibit a label on any site that has not been evaluated - so how can we get round this problem?
... WCAG EM should provide a measure of reliability on its own
... the a11y logo of the certifying org implies a conformance claim according to WCAG

Shadi: are there any ways of ensuring important parts are not left out?

Eric: wrap-up

Afternoon work: high-level look at report templates, then maybe some detail work

Alignments of Evaluation Documents

Eric: have overview page for EOWG to do

Shawn: the documents referenced in "using this methodology" is important
... does section "particular types of websites" overlap with the EO resource "considerations for specific contexts"

Shadi: yes, some overlap but in the context of conformance evaluation
... not sure what the future of "consideration for specific contexts" will be

Ramon: we use the word "context" differently in WCAG-EM and in the EO resource

Shawn: not sure what the future is of this document
... does "required expertise" section overlap with "review teams"

Shadi: no, really about the expertise of the evaluator

Eric: will the "selecting web accessibility evaluation tools" become something completely different in the future?

Shadi: conceptually will always provide the same information
... but the content may change significantly

Eric: can we do more to address Michael's comment?

Detlev: was it to remove content or to actually add more description?

Shawn: user may not know how to fix an issue but what they actually want
... want to have all information in one place

Detlev: impression that the main concern is relying on non-normative resources

<shawn> shadi: scenario: experienced web developer doing first formal conformance eval

Vivienne: why is it optional?

Shawn: "required expertise" is non-optional and "review teams" is optional

Vivienne: have to use some sort of tools to determine conformance
... at least color contrast

Detlev: could just drop optional because doesn't mandate anything in particular

Shadi: maybe that is what also confused Michael to think that these sections are part of the steps in the procedure

Eric: had discussions about that in the past
... would seem that we are requiring "review teams"?

Detlev: people need to read the text more carefully

Ramon: is notepad a tool?

Shawn: not important what a tool is in this context

<shawn> Shawn: suggest remove "(optional)" and leave text to clarify

[unanimous consent to removing the word "optional"]

Eric: [reads aloud section "evaluation tools"]

Vivienne: maybe need to emphasize that we are talking about automated tools in the note

Shadi: word smithing can be done later

<shawn> shadi: e.g., read 2.3 Evaluation Tools - how does this work if you are an experienced evaluator? do you feel like you have to go read this? (addressing Michael's issue)

Detlev: people want to know if the link sends you outside the document or stay within the document

Katie: maybe add a couple of examples of different types of tools

Shadi: does that help someone who is knowledgeable of accessibility

Katie: maybe describe what tools are
... not only expensive tools but also different ones

Vivienne: suggestion has merit
... but listing individual tools would quickly get outdated

Sylvie: think that section is really clear
... maybe example without quoting any tool

<Ryladog> * free browser-based accessibility tools, free contrast ratio analyzers, free DOM inspection tools for AAPIs, etc.

<Ryladog> * Free AT

Detlev: is anyone going to provide such a list and maintain it?

<Zakim> shadi, you wanted to make proposal

Shadi: add brief description of tools but rest of the section seems ok?

Katie: add types of tools, like DOM inspection tools

Shadi: need to look later at what specifically comes into this brief description

Types of Evaluation Goals

Eric: [reads aloud section "define the goal of evaluation"]
... all three types are full conformance evaluations
... just different degrees of detail

Dominique: are we only speaking of reports, or also different evaluation styles?

Ramon: think they are different types of evaluations
... sounds to me like "basic evaluation" rather than "basic report"

Vivienne: questioning the use of the word "report" there
... is more of an evaluation than the report
... also think should be more descriptive at the beginning of the section that they are all conformance evaluations
... need to explain that evaluation is the same
... just different type of report

Ramon: not sure if it is the same type of evaluation
... may need less detailed checking if only to say conform or not
... if the report is more detailed then may need to evaluate in more details
... or using more pages
... concerned about requirement for repair suggestions to be provided under in-depth analysis
... not all evaluators can provide this

<shawn> ... maybe the knoweldge to provide fixes is beyond conformance eval

Sylvie: can't really understand the difference between "detailed report" and "in-depth analysis"
... also with conformance evaluation, always need to check things in detail
... also the definition for "detailed report" talks about use for developers whereas this methodology is for websites after they are done

Detlev: some methodologies that are targeted to only judge conformance may stop when errors are identified
... other methodologies that want to provide more information will go through the entire sample in any case
... quite significant differences in doing things

Shawn: don't think "basic evaluation" is good as it may get confused with "preliminary evaluation"

Shadi: had that initially, and had exactly that issue

Shawn: agree with Detlev that doing a detailed evaluation may be different from a basic evaluation

Katie: think same approach for all three types of evaluation

aurelien: agree with previous comment
... goal is the report
... only different types of report
... but the process is the same

dominique: difference is the results that the evaluation commissioner can expect
... answer is the same, yes or no
... but commissioner may want more information
... maybe a small check to see if conform or not
... might be all the commissioner wants
... sometimes commissioner may want more information
... suggest using the term "analysis" throughout

Katie: analysis and recommendations

ramon: agree that the processes are different
... don't need to analyze all images if the first three failed
... structure of the document really important

aurelien: in some cases can be between detailed and in-depth report

detlev: unclear what a basic report really is
... if just an indication of conformance or not then should be clear
... most cases the commissioner will want to know some more about the nature of issues

<shawn> shadi: rather than 3 types, just 2: 1. yes-no, 2. yes-no plus

<shawn> ... or else there are 15 different types


<shawn> ... what process do you do for 3rd-party eval for 'logo'/badge ?

Katie: do a lot of conformity checking as independent evaluator

<shawn> ... assume people coming to you think they already have made it accessible

<shawn> shadi: how does that differ from someone who comes and says we want you to help us do accessibility better

<Zakim> shawn, you wanted to say what you need in this section depends on how specific the following sections are (especially the reporting section)

shawn: what is the purpose of this and how does it influence the rest of the document
... currently reporting section doesn't address this in detail

vivienne: when asked for more than yes/no evaluation then often a combination of detailed and in-depth analysis
... same evaluation but feedback depends on what commissioner wants
... think two different types of reports/evaluations

detlev: need to think if "basic evaluation" is "look until you find something" or follow the same structured approach

ramon: section says "some of the evaluation goals include"
... so other types of evaluations could be covered by WCAG-EM
... so reporting should be very open

<shawn> [ shawn reminds people of Template for Accessibility Evaluation Reports http://www.w3.org/WAI/eval/template.html ]

helle: find it difficult to see the methodology you will develop in regard to different countries and their requirements for reports

Reporting Template

<SH> What are the pros and cons

<SAZ> The document has already been critiqued for length

<SH> The report must specify the details for what you did in Step 1a, for example, and then other guidance that has more options

<VC> Use the Example Reports to give new users an idea of what is expected

EV: If we add info like add step 2 we could do that

VC: Taking into account MC comment, it is best to have example information inside the doc

SAZ: Maybe we can collapse it into one template

<Zakim> shawn, you wanted to ask also BAD report

SAZ: Let's look at what we have in the example reports right now

EV: Here are three times almost the same thing. Perhaps we could combine those three into one, based on the ISO requirements we use in the Netherlands

VC: I used this reporting format in a practical sense
... I used Example 3 and it actually worked, I actually found that it actually met everything that I had done in the Methodology
... Eric, I think you did a great job on it

EV: I took it out and put it back in as we had not discussed it as a group
... Anyone who sees anything missing in here?

SAZ: Give two types of evaluation, pass/fail and detailed (which comes in many flavors)

<Zakim> shawn, you wanted to say yes "Step 5.a: Provide Documentation for Each Step" rather than example

SAZ: In the WCAG-EM Methodology we provide the requirements of what should be in a report, and here is an example that you CAN use if you choose

SH: This methodology requires that you include this information - now if want - here are two templates you could use

SAZ: Currently we do not require a Report, on purpose

RC: We send emails

SH: But that could be a report

RC: Say some of the optional components would not need a report

SAZ: Maybe we should go back to the group and ask if we want to require a Report

EV: The requirement for a Report (of some kind) could be added to Step 5

SAZ: The methodology might be used in so very many contexts

SH: the issue is what is the minimum that is needed

KHS: To me the minimum you need from a report is: does each SC pass or fail

SH: If you say, we followed WCAG-EM on this point, what if I want to challenge you on your statement? That is the whole point, I need to be able to compare to something you stated

RC: My example: I am the owner, developer and client - I do not need a report in that instance

SH: You made my point, so if you state, with all hats on, I followed WCAG-EM and have a logo, you actually need a report

SAZ: No, you do not need a report, you need a valid accessibility statement

SH: That says that you conform to WCAG, not WCAG-EM

SAZ: When you want to make a public statement, then there are certain things you need to do

SH: But it does not say that anywhere

SAZ: The report can be confidential, you do not have to identify where you failed in detail - just the Statement

RC: We have some clients that do not want the conformance emblem, but I as the developer want to use this methodology

SAZ: We should collect the most known use cases, and maybe in 90% of the cases we will need a report

KHS: What about two levels of WCAG-EM conformance, one with a report and one without

SAZ and EV: We will take this question to the group

VC: 5.a: Provide Documentation for Each Step - this is to prove that you followed the methodology, so you must have the documentation. This implies reporting. I agree with you Shawn

SAZ: The idea was, say you do screenshots, but you end there. You do not actually formally report

KHS: Maybe we could split up the two types of test - basic and detailed

RC: I am thinking about the cost, not to waste my clients' time

SH: What is the minimum that you have to identify

RC: But then we have to do all the work, which means time and money

SH: What if the report literally takes 15 minutes?

RC: You should at least provide a list of success criteria

SH: That is not necessary
... A simple table identifying date, evaluator, scope, result (for example)

VC: I agree, even an email would contain these points

SH: Maybe the scope is a line in the database or something

RC: I don't want to require a report for clients that don't want it

SH: But you provide feedback that covers date, evaluator, scope, result

Aurelien: I agree with Ramon, there are many occasions that do not need a report - when you are helping with development

Dominique: I think a report should be required

EV: that is the practice in the Netherlands too

SAZ: That is one use case. Peter Korn suggested that one development team shares with another team in the same company

<shawn> +1 to removing the examples and clarifying in 5.a.

SAZ: We do have requirements for what needs to be in it. Perhaps it is best to remove the Examples and instead describe what to provide concerning documentation in the process

KHS: Documentation is a responsibility of any Accessibility Report in a government or industry context

<Zakim> shawn, you wanted to reiterate - minimal report may be a simple e-mail with who did eval (e-mail sender), date (e-mail date), scope: URI of page(s) referred to, results: "all SC

Aurelien: Reports would need to be different for various audiences
... You can have a report for the designer, for the JavaScript developer, and for the back-end developer

RC: Reporting is for AFTER the evaluation, that is why I cannot see why we need this

SAZ: Step 5a requires quite a bit of documentation rather than specifically reporting
... EO may need to review this
... It is like Review Teams, documentation you actually need to do during

VC: I am not real happy with that approach, many people who will use WCAG-EM will expect some kind of report because they have been paid. They want feedback. I think that we need to have a minimum reporting standard
... If the federal government wants a report from me, I need to provide a proper report that will stand up in a court of law.
... I think a report is a necessity

SAZ: But that is not technically part of the WCAG-EM procedure. I agree that we need strong guidance about how to prepare a report. You cannot claim conformance without documentation in WCAG-EM

RC: Provide a Step that is to determine if a Report is warranted

VC: Can't we have reporting examples?

SAZ: Yes, but can't we agree that the terms "documentation" and "reporting" are two different things, and there is a suggestion for another step in 5 which is talking about reporting, which is different from documentation

RC: The Step itself would be documented in Step 5
... I always document my work through email and other docs that are not reports

SH: Yes I think that the word 'report' is so loaded

SAZ: We had many discussions around this problem with using that word and why it was better to use documentation

KHS: Look to change Step 5 to 'Document the Findings'

Editorial refinements; overall structure and presentation of the document; alignment with other WAI documents; section numbering; linking of terms to definitions; use of terms "evaluation", "testing", "check", and "audit"

SAZ: Why are we using these terms

SH: We are thinking about our terms too in EO

SAZ: There were no deliberate choices, and maybe this is where we need to pick EO's brains
... When do you check? When do you assess? Is this supported by FireFox etc.
... Does this help or does it confuse things more?

SH: It would be good for us to review the terms used in WCAG-EM

SAZ: In the evaluation process - do you want to now audit these pages. In that context you are going to find there is some kind of a systematic use of our terms

SH: So this is technical writing, not creative writing - so you should not worry about repeating words and needing to come up with many terms that mean the same thing

VC: Some clients like the term audit, it helps them to satisfy formal requirements. Others like a less formal evaluation to test certain criteria
... They need to be used correctly

SAZ: I think they are used mostly systematically

Dominique: We should use Evaluate throughout the document

<Zakim> shawn, you wanted to say for example: "They can help assess if WCAG 2.0 Success Criteria are met..." -> "They can help determine if WCAG 2.0 Success Criteria are met" and to say

SAZ: Using too many terms for the same thing is too confusing in a technical document - for internationalization using one term is better

RC: WCAG says that the SC are testable - because of that we should keep the word "tested", rather than the word "evaluated"

SAZ: You evaluate the conformance
... For Search engine optimisation, is it best to use multiple terms?

SH: Well, for the OVERVIEW document we do want to make sure that we do that - because we want the Overview page to be the landing page in search queries
... That is our goal for all WAI specs; we are making progress for WCAG


EV: Thanks to all for participating.

SH: We will follow up in the EO meeting with Eric and Vivienne who will be attending EO

Summary of Action Items

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.137 (CVS log)
$Date: 2012/11/13 17:40:44 $