WCAG 2.0 Evaluation Methodology Task Force Teleconference

19 Apr 2012

See also: IRC log


Present: Don, Eric, Liz, Martijn, Peter, Detlev, Sarah, Maureen, Elle, Alistair, Mike, Tim
Regrets: Kathy, Shadi, Vivienne


Eric: important to describe scope, chosen target, context, use, etc. and sequence of steps, so that another evaluator might try to do the same thing following the same sequence of steps and variables (time of day, user profile, etc.)
... about step 5a, provide documentation for each step, opinions? sufficient description?

<Detlev> will redial

Peter: a couple of comments at the housekeeping level ("evaluation" is misspelled); on documentation for each step, might want to say "as part of documentation"
... concerned about documenting, the size of this documentation may be large for large websites, important before we finalize this to have some idea of how large this might be and decide if we're comfortable with that
... other concern is that we may be overloading the word "step"

<Detlev> Had the same comment about step

Peter: concerned about what is a step of walking through the application versus a step of the evaluation, we need to be careful about when we mean which

Eric: agree, should be clarified
... also, the size is worth noting in the minutes as something to look at; we do say publishing is optional

Moe: question regarding bullet #3, web pages included in the sample
... are we thinking they'd just record the URL of the page? as Peter mentioned, documentation could get quite lengthy... what level of recording the pages are we looking at?

Eric: thinking we do describe (Step 3 if I'm not mistaken) the sample, also the sample used for the audit, so what you'd point to in the Appendix
... also dependent on goal of evaluation, if just fail/pass, not sure if it's necessary to publish the sample

Moe: if there is a dynamic application, one base URL but content consistently changes, do they record that base page of the application or parts of the application tested?

Eric: how do we record it, if we sample the website and it's one page but it's an extensively dynamic page, not enough to say "this is the page address" because it could just be checking part of what happens in that page
... WCAG 2 is on the page level, so if it's all on one page, we have to test all that's there, but how to record it, not sure

<korn> I have a direct response to that question

Detlev: part of the confusion in the section on documentation was that "step" would be a nice term for a complete process (fill in form, error message, confirmation page...), but we have used "Step" for the overall methodology
... maybe the sections can be renamed so we can reserve "step" for that
... checking dynamic things, easiest thing might be to document the start/landing page for the process and then describe in the documentation the number of steps you have to follow to arrive at the end of the process
... say, put something in the search form, what you input, then follow hit links on the page
... describing it succinctly means you describe the process that you have to follow, making it independent of the base URL
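Detlev's suggestion above (document the start page plus the steps to follow, so the record is independent of the base URL) could be sketched as a simple record. This is only an illustration; the process name, URL, and field names below are hypothetical, not part of the methodology text.

```python
# A minimal sketch of documenting a multi-step process independently of
# the base URL, as Detlev describes. All names and values are hypothetical.
process_record = {
    "name": "Site search",                       # hypothetical process name
    "start_page": "https://example.org/search",  # start/landing page
    "steps": [                                   # testing steps to the end of the process
        "Enter the term 'accessibility' in the search form",
        "Submit the form",
        "Follow the first hit link on the results page",
    ],
}

def describe(record):
    """Render the record as the kind of succinct bullet list evaluators use."""
    lines = [f"Process: {record['name']} (start: {record['start_page']})"]
    lines += [f"  {i}. {step}" for i, step in enumerate(record["steps"], 1)]
    return "\n".join(lines)

print(describe(process_record))
```

Kept this deliberately small: the point is that a few bullet points usually suffice, and the record still identifies the process even if the base URL changes.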

Eric: clear that there is confusion about the words Step and Sequence, maybe renaming the sections would be better and using Step as described
... the question still arises about the size of the documentation if you have to describe in such detail, could make this in a large document

Detlev: when we do this, it's usually just a few bullet points, but complex applications may get very complicated, though, yes

Sarah: thinking about the amount of documentation as well; many evaluators would like to capture just the URL and call it good, but this approach here is more like what we do in our labs, describing the process so that the person coming after you can identify dynamic components on the page
... sometimes evaluators don't know everything that could be described with just that base URL, but a general description of what worked and what didn't helps; our overall goal is "does the dynamic functionality work?", though we may not catch every instance

Peter: on the notion of Steps, we can also address this with an adjective "evaluation steps" versus "testing steps" as an option
... echo comments that others have made about web applications, we're seeing more and more of those and it's really one of the core use cases for the evaluation methodology, more than just a straight WCAG evaluation of a single page


Peter: the whole concept of "page" is breaking down, and we really need to keep this in the back of our minds

Eric: something we describe also in the requirements, yes, I think then the solution given by people previously (documenting with a general description and process) might be a good solution

Alistair: just spent 2 weeks documenting a very small application; it does take a substantial amount of time to do that
... a lot of people who have that documentation already have that in play, can we not get a copy of their use cases document and work through specific things that they intended the web application to be used for, and we can then say if those paths are accessible or not?

Eric: this can be something we cover in an earlier step, reporting step as well?

Alistair: pick up this documentation from those who have built the application; the problem is spending longer writing about the website than testing it
... if we spend that much time, fewer assessments and possible changes by that time, so needs to be very rapid, have mechanisms to take in as much information as is already there

Eric: would be great to gather them, and an evaluator could always decide if another use case is needed; then you could point to it in the reporting step and use it in the auditing step
... marked in Step 2, if people agree that it would be good to ask the owner of the website for the use cases that they have for their website, and include that in deciding what use cases to use for the evaluation

Sarah: the use cases may not actually point to individual URLs, but developers know where the dynamic functionality is on the page, maybe we can leverage that information

Sarah: when you're describing, if you use this level of detail in reporting, the reader can tell where in the page they were focusing on for the pass/fail of the particular component or part of the page

Detlev: regarding where to report findings: if you have a process that spans several pages, the question is where to put that; our tool is to select pages as the page sample and attach all your criteria to that page
... if that page is the starting page of a longer process, there is some confusion about whether all those pages are part of the sample, and where to document them
... not sure if we can go into such specifics, or leave it up to the application used for reporting the web evaluation and how it will be organized

Eric: tried to cover this in specific sequences, maybe not clear enough, add examples perhaps, bullet point 1 says "include complete process"

<korn> 10 Euro cents?

Allistair: with regards to reporting, collecting all the different pages that we have in our sample, we used to do all of this, and no one ever re-checks the audit reports after they were given to the client
... the idea of keeping all those pages, who will audit the auditor?

Eric: do we want to keep all these pages?
... relates to Step 4D: archiving web pages for reference
... direct result of that in 5, add web pages into sample
... does anyone ever use this, or is there a real reason to keep this?

<MartijnHoutepen> -agarrison

Detlev: just wanted to mention that there can be cases where it's useful to have at least the URL of the page tested; with complaints and discussions about how you rated certain points, it's good to be able to return to that page; some things cannot be covered in a screenshot, but URLs are certainly a minimum

Mike: understand the need to keep things as lightweight as possible; it's a matter of course to have a set of URLs and screenshots, really helpful for reference in a report (being able to take a look at the page in particular); may also be helpful to identify what was reviewed last time to set up a plan for revisiting at a later date; memory costs are low, so URLs and screenshots seem to be a good way to track
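The lightweight per-page record Mike describes (URL, screenshot, review date, so a later evaluation can identify what was reviewed last time) could look like the following sketch. The URLs, file names, and helper function are hypothetical illustrations, not part of the methodology.

```python
from datetime import date

# Hypothetical sketch of a lightweight per-page record: just the URL,
# a screenshot reference, and the review date.
sample = [
    {"url": "https://example.org/", "screenshot": "home.png",
     "reviewed": date(2012, 4, 19)},
    {"url": "https://example.org/contact", "screenshot": "contact.png",
     "reviewed": date(2012, 4, 19)},
]

def pages_reviewed_before(records, cutoff):
    """Pages last reviewed before `cutoff`, e.g. to plan a revisit."""
    return [r["url"] for r in records if r["reviewed"] < cutoff]

print(pages_reviewed_before(sample, date(2013, 1, 1)))
```

Storing only URLs and screenshot references keeps memory costs low, in line with the comment above, while still supporting Moe's point that the live content behind a URL may have changed since the snapshot was taken.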

Moe: other scenarios: retail application, pages are based on templates, if the URLs change, templates don't change.... another is a search application, where the content changes but the template is essentially the same
... we are talking about a sample, but if we record these pages, the content may change but the template will not
... taking a snapshot of a page in time that may change, gets trickier when the content regularly changes

Eric: Step 2A identifies naming templates, something we're looking at, but we should add it to reporting

Moe: talking about something slightly different, an actual web page template

Eric: 2A should mean the web page templates (not evaluator templates) where content is inside, but forgot to move it to reporting section, so will add it there

Alistair: we used to download the pages and they didn't work as they did previously, so we had to correct it all to work locally, which was painful, and also changed the actual pages
... a lot of things were code violations, so screenshots didn't really help
... recording the URLs is a no-brainer, and a lot of the report points to URLs
... could we ask people who are in charge of the content management system (CMS) to give us templates used for the website so that we can separately look at all the content in the empty template, and then look at them in context somewhere else?
... how do we as evaluators discover all the templates on a 2 million page website?

Eric: we need to clarify this in the text; it's more or less what we mean in 2A about identifying all the templates
... this would be the best moment to ask the website owner: where are your templates, use cases, etc.?
... identified in Step 1 and Step 2, and then it should also be in the reporting section
... will try to clarify that in Step 1 and 2
... evaluator still has the responsibility to look beyond just the templates, but agreed, needs clarification

Peter: time check, request about other items

Eric: closing down 5A for now

Detlev: regarding the word template, pages that consist of different elements created by various teams
... impossible to basically track all of them, so we need definition about what the word "template" means, perhaps not always available

Eric: proposal in the next version?

Detlev: yes please

Eric: we worked on more than 5A, but the discussion also raised interesting points for earlier sections; valuable discussion
... Step 5B, provide an accessibility statement (optional), looked at a lot of statements, tried to see what would be completely necessary to put in an accessibility statement
... what we use for ISO documents: scope, evaluation date, conformance level, use of non-WCAG supported techniques, version of WCAG, version of this Conformance Methodology used
... these are points that have to be in official documents in the Netherlands, welcome any additions or discussion
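The fields Eric lists for an accessibility statement (scope, evaluation date, conformance level, use of non-WCAG-documented techniques, WCAG version, and the version of this Conformance Methodology) can be sketched as a simple record with a completeness check. The field names and the example values below are illustrative only.

```python
# Hedged sketch: the accessibility-statement fields Eric lists, as a
# record plus a completeness check. Field names are illustrative.
REQUIRED_FIELDS = [
    "scope",                 # what was evaluated
    "evaluation_date",
    "conformance_level",     # A, AA, or AAA (per Eric's later clarification)
    "non_wcag_techniques",   # use of techniques not documented by WCAG
    "wcag_version",
    "methodology_version",   # version of this Conformance Methodology
]

def missing_fields(statement):
    """Return any required fields the draft statement does not yet contain."""
    return [f for f in REQUIRED_FIELDS if f not in statement]

statement = {
    "scope": "example.org public website",
    "evaluation_date": "2012-04-19",
    "conformance_level": "AA",
    "wcag_version": "2.0",
}
print(missing_fields(statement))  # fields still to be filled in
```

A check like this mirrors the Dutch requirement mentioned above that all of these points be present in official documents.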

<agarrison> When I use templates I usually am talking about templates in the context of content management systems - the mold into which an editor's content sits.

Peter: what does conformance level mean? my understanding of WCAG is that either you do or you don't conform; is that what's meant?
... because this is a statistical sample of a site, we don't actually know it's perfect, only that what we sampled had no issues or had issues... would be more interesting if the accessibility statement was based on what was found rather than a simple conformance statement
... a range of statements: "everything found conformed", "everything found largely conformed", etc.

Eric: we could add how near you get to this conformance level

Peter: what does conformance level mean?

Eric: A, AA, AAA

Peter: would be helpful to have this in the text

Eric: next one is 5C, performance score
... did not add any text there, will do so in the coming week, information on findings is an optional addition

Alistair: should read "the sample conformed to a certain level of WCAG 2.0"

Peter: you mention that if something is a failure, provide at least 3 examples of what that error is
... we may not have 3 of the same error or even 3 pages; the text needs to be reviewed

Eric: yes, will look into that
... will work on new version and make additions to conformance methodology in Section 4
... next version will be ready by Monday or early Tuesday
... thank you all for being here and discussing, please go on discussing the other sections we didn't have time for (5E, etc.) on the list
... have a good weekend!

bye, all!

Summary of Action Items

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.136 (CVS log)
$Date: 2012/04/23 15:58:23 $