
EOWG Minutes 5 September 2003 Meeting

on this page: attendees - agenda - debrief eval exchange - eval resource suite - mentored evals - deliverables - next meeting

Agenda

Attendees

Regrets

Agenda Review

JB: Reviewed the agenda. Asked for priorities.

LC: Perhaps we should move deliverables earlier.

JB: Do we have critical mass for coached/mentored evaluations?

HBj: I would like to keep this on the agenda for the morning.

Debrief Yesterday's Best Practices Evaluation Exchange

JB: We should also debrief from yesterday; the questions can come in the afternoon. We could move deliverables up and walk through the evaluation suite. This time we did a best practices training exchange. What are the reactions? Outcomes? Thoughts on the format?

BB: I had been aware of the resources. Yesterday helped to synthesize. This was useful.

AG: For the future, don’t concentrate on documentation that people have read [on W3C site]. Try to extend ideas. Discussion is the most important thing. What do people do under each of the checkpoints?

Next Best Practices Exchange

JB: We should have had a question on registration form on levels of expertise.

DK: As a new person, it would have been helpful to see the process. Maybe see a Web site and look at the process. This could be done in different languages.

AG: Reduce the amount of items. Allow more questions, answers, and debate.

LC: Need to have a wireless connection.

BS: There was a lot of information on what to report, and not [as much?] on the method.

EE: I am interested in how to evaluate a site in another language when you are not familiar with the language. I know HTML; I can read tags. I am going to require that they put in the LANG attribute. I liked the screen shots. One presenter said that this is helpful. Another presenter said that it's the same problem over and over, so a generic approach could be used. I feel that it should be specific.
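
[A minimal sketch of the kind of language markup being discussed; the page content and language values are hypothetical. An evaluator who cannot read the text can still verify that the markup declares its language:]

    <!-- Primary language declared on the html element -->
    <html lang="fr">
      <head>
        <title>Exemple de page</title>
      </head>
      <body>
        <p>Bonjour tout le monde.</p>
        <!-- A change of language within the page can also be marked up -->
        <p lang="en">This paragraph is in English.</p>
      </body>
    </html>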

JB: It would be interesting to address specific topics, such as evaluating specific languages.

DK: Is there going to be a harmonized process rather than a vertical approach? We may get different results if the method isn't harmonized.

JB: I think that this is an open question, because it may not be W3C's role to standardize, but we can provide some guidance. It was interesting, when we talked about coached/mentored evaluations, that people said many approaches could be used.

SLH: The focus could be on the answer, not how you got there.

AG: You have so many audiences. It's stuff that I know. I would like to talk to people about how they do things.

JB: We have been playing with the best practices concept, which doesn't assume that there is a correct answer. It allows people to share best approaches. Try to think about the next one that we do. What assumptions can we make about people coming in?

HBj: In planning the event, ask people about specific checkpoints. How to evaluate drop-down menus, plug-ins, applets. We tend to go back to alt tag for images. We know this. We should address complicated things.

LC: Could have concurrent sessions. Some people still need entry level material.

SLH: The audience is diverse.

AG: We can tell people to read things in advance.

BM: I don't think that we should join different topics. I don't think that there is anything to be gained by having introductory materials. The expert meeting is an invited meeting.

SLH: What about evaluation vs. training? What is the need for the next one?

AG: I don’t do any training.

BB: Combining evaluation and training could be a good introduction. Can invite experts to share specific information.

BM: Ultimately, we can't possibly train all the trainers. There is work on the Design 4 All curriculum. Maybe do some collaborative activity.

LC: Maybe the next event could be training so that we could focus on the online curriculum.

JB: We had thought that, in order to attend this meeting, participants would have to submit an evaluation. In the WG we said that we didn’t want to be exclusive.

EE: If we have an expert group, we need to bring in experts from commercial vendors such as Oracle, Adobe, and Macromedia. What are the solutions?

JB: This is another topic. How to evaluate dynamically generated Web sites.

AG: Evaluation sits with the EOWG, but it is very technical. These elements are not sitting together.

JB: We could do a meeting in partnership with another WG. We should be co-hosting with another group.

HBj: Is this within the EO group? We know more about who we are. When we met last year, we talked about tutorials and templates.

JB: This is in the wish list section of deliverables.

BM: I endorse doing something jointly with the WCAG WG, because we are looking for authoritative judgment. Some difficult cases will involve third-party vendors. It's hard to do this in parallel with the 2.0 discussion.

JB: We have some obligation to do outreach, training, and expert workshops. We have heard that evaluation is one such area. This would go beyond members of the WG. We can try to figure out how to do this in an effective educational way. The resource suite is different: we regularly produce educational materials. In 2.0, there will be some educational materials on evaluation. The WCAG WG will be coming up with testing criteria. We don't have enough technical background to do this ourselves; we will have to translate it into educational language.

??: In 2.0, will there be guidelines, checkpoints, and testing criteria?

JB: Yes. It will also have to go through an implementation period.

AG: It would be helpful for WCAG WG to discuss testing criteria with EOWG. We have practical experience under the checkpoints. It would be helpful to provide feedback to WCAG WG.

AC: Our role should be to present to more general audiences, e.g., Web designers. We have been preaching to the converted.

JB: Instead of trying to ensure we get experts, we can reach new people.

AG: There’s a specific document of WCAG WG: Checklist of HTML Techniques for WCAG 2.0. It’s in the current works section: HTML techniques.

JB: I invite people to comment on Alan's comment. Does it make sense to have a two-day round? The first day would be wide open; the second day would be on challenge topics, e.g., dynamically generated sites.

EE: I agree with splitting off into specific focus groups on the second day, but we should not limit participation.

AC: In Palo Alto, we did attract a wider audience of known specialists. We could advertise meeting through other channels.

JB: We used local industry networks.

DB: This could be organized as a preconference. ICCHP is holding a conference in July. We could ask companies to present their work and their expectations.

DK: We could consider a panel discussion with representatives from industry. We could have an open debate on how to do a better job and foster relationships.

EE: No sales persons allowed.

JB: What are the advantages? Concerns?

DK: One of the concerns is that one company will dominate the discussion. We need to foster discussion.

JB: We need to interact with developers. We could have a clear agenda and send out questions in advance.

FJ: It would be interesting to have teams who have experience to show the process.

JB: Is this instead of developer panel or another part of the day?

FJ: I am talking about site developers.

JB: Let’s hang onto this idea.

AG: RNIB put together a similar idea. The conference was very large.

LC: Do we have to wait for face-to-face meeting for this?

AC: This would change the orientation of the working group.

LC: This could be once or twice a year.

JB: We could do this with dynamically generated content and WCAG WG. I am hearing that one day should be an application developer panel. Could also have Web site developers. We have some information for another event in 2004. Another day could be complex topics.

HBj: Would we have a 3rd day working on the documents? It would be interesting to have American members attend European activities.

SLH: Most people really want to come but they don’t have support from their organizations.

JB: There is still a chilling political effect. Foreign travel has been shut down. We have participation from Canada and Australia. We are not good about setting dates far in advance.

DK: Can we bring people in through technology?

JB: It’s difficult to do a working group meeting that is not face-to-face.

AC: It could be interesting to set up a Web cast of an open meeting.

DK: I can look into this.

JB: If we do bring together panels, it might be nice to have it virtual.

LC: Could we have simultaneous translation at face-to-face meetings?

JB: We have considered captioning in English. Lots of issues, e.g., which language should we start with?

FJ: Do we have Asian participants?

JB: No. We're having trouble in some other WGs. We had an alternative time slot. We should consider having a meeting in Japan, China, and Korea. I want to summarize where we are: people are interested in a 3-day event, not a Web cast. We could do this at a preconference.

HBj: We announced too late for the AAATE.

AC: How many people attended both? This may be an advantage for some people.

DK: Was the Web cast for the internal meetings?

JB: Alan was proposing it for the external meetings. It would be interesting to do internal meetings using Web casts, but there are lots of complications.

Evaluation Resource Suite

JB: We have a large number of edit requests. These are editorial changes. Are there thoughts about evaluation resource suite as result of our meeting yesterday? I was surprised at how many people are using it. Will it look different in the future? Will it just be a pointer to an EARL wrapper?

AG: Is the information about usability to be incorporated? Are we swinging toward usability?

SLH: No. I think that we should point to other resources. I am concerned about what’s in there now.

AG: It was built-in as a subject yesterday. I am a little bit worried. We should be concentrating on HTML specs.

BM: I was interested in the comments yesterday that you don't have to think about users at all, just about the guidelines. We need to control creep. It's dangerous to do a little bit of this. Your responsibility stops if you have satisfied the guidelines.

JB: This doesn't match how people organize their work. I want people with disabilities to use the Web, and the guidelines are the best expression of guidance on how to do that. I think that we need to advise people to try this with real people. Some sites that conform may be implemented in a way that does not make sense. Maybe it should go under the additional advice heading.

BM: Needs to be carefully split up. We should look at this as we examine the material.

HBj: We have to think about user testing. Can you document whether, if you do usability testing, it matters that the participants are people with disabilities? What are the benefits of involving persons with disabilities in general usability testing? We need better evidence. Accessibility is conformance to guidelines; whether it's a good or bad site is a different question.

DK: WAI may want to have a statement on this delineation. If WAI says something like “We support usability studies…”

JB: We've had this debate before. We are trying to ensure that there is usability testing of accessibility features. We would not replicate a general usability approach in our documents. An example is someone using "skip nav" in several places on the home page.
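
[A minimal sketch of the kind of "skip nav" link being discussed; the link text and target name are hypothetical:]

    <body>
      <!-- Link that lets keyboard and screen reader users jump past repeated navigation -->
      <p><a href="#content">Skip to main content</a></p>
      <!-- ... site navigation links ... -->
      <a name="content"></a>
      <h1>Page heading</h1>
      <!-- main content starts here -->
    </body>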

BM: Does the preliminary review hit this?

SLH: Not really.

BM: I support the idea that if you are going to do usability testing, it should incorporate persons with disabilities.

??: For accessibility reasons, you might want to do usability testing.

SLH: Regardless of usability in general, if you are focusing on high accessibility, then some level of usability testing is important.

SLH: I've been working with a company for a while; they met the accessibility checkpoints, but the company is getting complaints from persons who are blind.

AG: Do usability testing because sites may not be usable by sighted persons, as well.

??: Usability is a good thing, but it's not what this group should do. It should not be written into conformance. It won't be transmutable to 2.0. It's a usability matrix. Our component is just Web content. We also look at user agents and authoring tools; will we do usability testing on those?

JB: We do a piece of it. The part that makes me uncomfortable is the idea that the guidelines are insufficient. The WCAG WG says that it's impossible to write guidelines that will ensure complete accessibility. It seems like we should have a conversation with the WCAG WG. For example, if we think we should say something about usability, where should we say it? In the preliminary review? Should the conformance evaluation be the place? Or in a more comprehensive evaluation: here are other things to look at. Maybe it falls under ongoing monitoring.

AG: It’s technical accessibility to the specification. It’s usability advice.

SLH: It's implementing the specifications in a way that meets the intention.

JD: Once you make a value judgment about quality, you are making a usability judgment.

PG: Some criteria are about accessibility and usability. Can checkpoints be identified as accessibility, usability, or both?

JB: Jon, I think that you were oversimplifying it. I think that I am more along the lines of what BM said. You have to ask whether it is being done in a functionally appropriate way. As to whether it falls within a conformance evaluation, I don't think that it does. What conversation could we have with the WCAG WG? There are precious few checkpoints in 1.0 that don't have to do with functionality.

AG: Functional use is a separate instance.

FJ: Language declaration is purely technical.

SLH: Jon was saying: is the way to judge it to be confident about what a tool can do?

FJ: On the language question, tools can't do this.
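
[A hypothetical fragment illustrating the point: an automated tool can confirm that a lang attribute is present, but only a human reviewer can notice that the declared value does not match the content:]

    <!-- Declared as English, but the text is actually Danish;
         a checker sees a valid lang attribute and passes it -->
    <html lang="en">
      <body>
        <p>Velkommen til vores hjemmeside.</p>
      </body>
    </html>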

JB: Bringing the discussion back: we develop guidance based on what other WGs develop. What do we want to do in the future [in the eval resource suite], particularly with regard to usability and accessibility? We might have three main sections: preliminary, conformance, and comprehensive additional suggestions. The last could include usability features and where to go for further information.

JD: This sounds good as long as we pitch it right. Make it more motivating.

BM: On pitching it, I would say that what conformance evaluation gets you is that certain known and recognized barriers are absent. Something that could be made explicit is that these are resource tiers.

JB: One of the things that came out of the conversation is that we may want to have a mini business case for doing evaluations.

AG: We shouldn't say that testing with users is a gold-plus effort. We could also suggest that it might be helpful to test the site with persons with disabilities.

SLH: I think that it's how you present it. There is WCAG conformance, and then there are additional ways to confirm accessibility, including usability testing.

HBj: I don’t know if it’s relevant in this structure. Are there laws about usability? We have to be careful about this section.

SLH: In US, Section 255 Telecommunications Guidelines specifically include usability.

JB: I don’t think that this would conflict.

EE: Section 508 sections address usability.

SLH: I don't think so.

[Looked up Section 508; the sections EE identified did not directly address usability.]

JB: I think it's in the preamble only. Are we going to say don't try it with different persons?

BM: This is a resource issue.

JB: I think that resources are a part of the matrix; I don't think that they are a determinant. If we had all the testing criteria, would we advise platform-by-platform testing?

JD: Within a conformance evaluation, we can ask them to do conformance. No, you don't need to look across platforms and browsers.

DK: Are we just talking about setting the minimum bar?

AG: You could get away with not testing on different platforms. It’s up to the developer to see if it works on different platforms.

JB: You have to look for it.

AG: Checking whether things work properly is the developer's concern. It should render the same across platforms.

JB: In what world?

PG: I prefer when things are easy, even when not complete. The three-part proposal does not work for me; a third paragraph opens the door to another issue. We need to give more information. As part of conformance, put in two paragraphs that mention usability. Some checkpoints cover both accessibility and usability. It's hard to check all the browsers. I prefer to say "have a dream…Web sites work on all browsers that …" We will lose people if we add more.

JB: If we take usability testing out of conformance, would it make sense to move cross-platform testing out as well? It could move into the broader section.

LC: How will W3C be viewed if we say that we have criteria for accessibility, but then people develop Web sites that follow the guidelines and are not accessible [usable by people with disabilities]? I favor the three-part proposal.

JB: What percent of corporate and government Web sites do usability testing?

SLH: Depends. Large companies, maybe 75%.

DK: Sure, about that.

EE: US government, 25%

several: There seems to be attention to this in many countries.

PG: In France, the thinking is that accessibility is everything.

JB: I think that this will be a long conversation. We will have to negotiate this with WCAG WG. This would have to be clear that this is not conformance.

HBj: How much usability is built-into WCAG 2.0?

JB: Next Friday, we’ll be doing final comments on the draft.

HBj: I went to the WWWAAC session at AAATE; one of the projects is a Web browser. They used WCAG 2.0 checkpoints. We heard from EdeAN that Hermes is being developed; it would have a profile suited to your needs.

JB: More than for any other resource, we will have to partner with WCAG 2.0 and bring in people from other areas. We may be able to give back a certain section to WCAG 2.0.

SLH: The general idea is good. We should have different headings. There are different methods and preferences for finding technical non-conformance; usability testing might be one. We need to position it differently. There are different goals for a review and different people doing the review. Some people may be able to look at markup, others would do better with a tool, and others with different browsers.

FJ: I like the three levels. I don’t know which order. It could be slippery to incorporate usability.

BS: It works.

DK: I think that it's a good model. People want to understand the minimum bar. It needs to be part of the minimum bar.

BB: Minimum bar is important. Don’t confuse accessibility and usability.

SAZ: I don't know how easy it is to distinguish between accessibility and usability. Some usability should be part of the minimum. The proposal has too many levels between accessibility and usability.

Clarifications & Comments

JB: What are the additional comments on the evaluation resource suite based on yesterday’s presentation?

LC: How do we incorporate Shawn's presentation into the evaluation template?

SLH: Yes. I did have additional information.

FJ: Will the presentation be online?

SLH: yes, hopefully by next Friday.

BM: How important is it that the reporting itself meets accessibility standards?

SLH: Very high.

BM: Do we say this in the template?

SLH: I guess it's probably just assumed; we probably should.

JB: Let's review the template for evaluation reports.

FJ: There is a section missing about sites that are available in different versions, e.g., a CSS version, text only, CCCP, etc.

HBj: Where would you incorporate the user interface? How would the Web site be presented to you? If you use different equipment, you would get different presentations of the site, and different styles based on different browsers.

JB: I don't know how much we address the adequacy of final results. Do you mean that there is another site for accessibility that is not fully featured?

FJ: Do you need to check all of them?

JB: How do we check a site with multiple versions? This is a question of testing assumptions.

??: Should there be an advice section on this?

FJ: I think that it should be a topic.

LC: Consider addressing this further in the WG.

HBj: When Eric was presenting, he had a table of checkpoints, how to do them, and why. Maybe we should enhance the document in the preliminary review.

JB: We should enhance the document with a checklist.

HBj: We need to explain how to do something and why. Can we put in links to the bookmarklets?

SLH: If we have an inclusive list, we can point to it.

HBj: Tell people to use a search engine.

JB: Should we link to testing tools, e.g., Lynx? When we have the EARL wrapper, is the evaluation resource suite less useful?

SLH: It won’t take care of all of it.

SAZ: The EARL wrapper and the bad site are useful, but I don't think that they are a replacement.

JB: We should have a lead-in on the EARL wrapper.

FJ: There is the impression of a no-man's land around PDF and Flash. No one is saying anything about it.

JB: Adobe and Macromedia are working with us, in parallel with WCAG 2.0. W3C will not publish these. But, they will be published on companies’ Web sites. Is someone providing a known alternative? Maybe need a section that explains what to do with this.

FJ: Impression is that W3C does not want to hear about this.

JB: Comments on dealing with proprietary formats?

SLH: We should have a comment, but not a separate section.

JB: With 1.0 we made a decision that these don’t apply. People didn’t get it. Maybe there needs to be a comment or a section.

EE: We don't want it to be a product endorsement.

JB: Consider adding a comment or section highlighting how to evaluate other formats.

BM: People in this meeting understand W3C processes. Many people accessing the materials don’t understand this. Make reference to the fact that there are some things that W3C can do and can’t do. Silence may not be helpful.

JB: I disagree with your conclusion. Guidelines and checkpoints can apply to anything. But, the techniques can’t be applied.

BM: How do I go about evaluating a site with pdfs?

JB: The testing techniques are different but the normative guidelines are not.

Coached or Mentored Evaluations

JB: We talked a little about this yesterday. This furthers the goal of consistency in output, not necessarily in methodologies. We were trying to do this as part of the WAI gallery, which is intended to be a showcase of AA sites. It has been difficult to get underway: evaluations were done with different methodologies, documented differently, and the sites were different. Can this be streamlined? One approach was to construct an inaccessible site and invite a finite number of reviews of the site. Maybe have certain criteria. It may be more useful to have common documentation, not necessarily a common methodology. Submissions could be confidential or not. People could look at a final report as a model. We would publish an aggregation of reports.

JD: This is a good idea because people learn better by example. Gallery idea is a good idea.

HBj: It's something quite new to put on our to-do list; it relates more to the gallery discussion. From our previous meetings, we have come up with many ideas. A new idea is to get the gallery rolling. Maybe we could take a group of people to come up with a site. It should be a real site, not necessarily one that we develop.

JB: Matt May is eager to do this. The thing about doing it on a real site is that it's public and it's dynamic.

HBj: We can use Henk’s template and compare evaluations.

JB: The majority of people who signed up for this said they don't have time to work on it now. Eric is no longer comfortable with what is in the template. If we get into the issue of having different reactions to checkpoints, we are debating 1.0. WAI needs to do this, but not within the EOWG.

EE: I have seen a very bad example with the question: how many problems can you find? Then, we tell you what is wrong with it. When people are good at it, they can go into the coached/mentored evaluation.

JB: Should the first step be self-taught?

DK: One of the things that we did to help developers: after they did work on their own, the developer/expert would explain why they wrote the code in a certain way.

LC: Should also have a good site example.

JB: We got feedback that this is powerful. This could be working on a before/after with an intermediary explanatory piece.

JD: The site has to look like it is professionally designed.

EE: A few years ago, Amazon.com had an explanation of how they changed their site.

JB: Does this help us make progress on self-sustaining gallery?

BM: It is at right angles with the gallery.

JB: It doesn’t help us look at quality of reviews.

EE: This is a step in the process.

HBj: Are you saying that we have to start coached/mentored evaluation process in order to build up gallery?

JB: The intermediate step is to do an exercise with experienced evaluators.

HBj: Is it doable? How far apart are we? We have done little experimentation with template.

JB: I’m not hearing enthusiasm on coached/mentored evaluation now. As presented in self-study, there seems to be some interest.

JD: I thought that the before/after is a deliverable at the end. To get there, we would need a poor site.

JB: We have a before/after example that needs to be updated.

JD: I thought that the exercise is how people report things.

JB: The deliverables overlap and intersect. We have before/after with self-study. Am I correct that there is interest in before/after with self-study?

most: yes.

JB: Going back to templates, should we go to gallery nominations? This is an internal examination.

BM: Go wiki. Why not put it out there: people nominate sites that they would like evaluations on, and anyone can do the evaluation.

JB: This is old ground. There is a gallery introduction page with disclaimers. There seems to be a fuzzy conclusion except that we tie before/after with self-study evaluations.

Deliverables and Assignments

JB: We are doing a walk through of the deliverables.

[Judy provided an overview of the WAI resources page.]

There are changes that need to be made.

PG: The document “How People with Disabilities Use the Web” is very important. Can this be updated? Can we develop another document “How Accessibility on your Web Sites Helps Everyone”? Can a video be developed? This is very helpful.

JB: "How People with Disabilities…" is our most pressing deliverable. "How the Web Benefits Everyone": we have it in several documents. RNIB did a video in conjunction with WAI; some of the content may be outdated. The WG felt it should excerpt sections.

EE: Is "How People with Disabilities Use the Web" about more than just blindness?

JB: The document is good.

PG: Can we write scenarios on how Web accessibility helps someone with no disabilities?

JB: This is part of auxiliary benefits document. This is one part of a broader business case.

JD: The information is excellent; the presentation is terrible. You have to be motivated to read it. It's about the audience. It would be helpful to indicate the audience.

JB: Is there anyone who can help with the presentation?

JD: I am interested in the presentation.

SLH: We have started the process of forming a task force.

JB: I am talking about presentation of individual documents.

SLH: I think that usability activities will address this.

JD: It should not be technical; talk about needs and benefits.

JB: But this is for different types of audiences.

DK: There could be tracks for different audiences.

JB: What you're doing may help with finding information. We don't have documents that speak to the disability community. There are so many deliverables that we have done or drafted, with different audiences, that we may want to use a sortable database of documents.

DK: How are tasks assigned and followed up?

JB: It’s ranged from formal to semi-formal. There are not a lot of people who are willing or able to do intensive editing. Real work gets done in teleconferences.

DK: What are skill sets?

JB: I think that we'll be tapping into you for workflow. It's been informal. We are missing editing and copyediting. We have had different people do editing, and this has led to inconsistencies. We only have a general style guide.

LC: Need to have approach for curriculum.

JB: Chuck, who developed the current curriculum, will do a clean-up of technical problems. It needs brainstorming. He wants to be paired with someone with online learning expertise.

EE: I am still trying to assess the choices. I like training and outreach. I am interested in something on which there’s a lot of work to do.

DK: I would be happy to work on editing of “Building the Case for Web Accessibility.”

SLH: I am building a task force for the Web site redesign. We have people who have expressed interest.

JB: We could move ahead with "How People with Disabilities Use the Web." Also, for the training resource suite, I can finish doing edits; Libby, you could take it from there. Helle wants to get the gallery going. Pierre, would you help with the gallery?

PG: Yes.

JB: There are three things we can work on: the business case, "How People with Disabilities Use the Web," and the gallery.

SLH: A task force can meet at an alternative time to Fridays.

Next Meeting

12 September 2003


Last updated 15 September 2003 by shawn