14 March 2002 telecon minutes

Present

Regrets

Next F2F meeting after 23/24 March

JW ASSETS is around 8-10 July.

GSW Not likely to attend.

LS Possibly - June is too far ahead. What about WWW2002? I'll probably be there.

WAC Me too.

JA Might be able to make July

LGR Not able to make either.

BC Not WWW2002, not sure about ASSETS

MM 0% Hawaii, 20% Scotland

AP Not sure on either.

MS Doubt Hawaii, for sure Scotland.

WAC Money or time?

LGR Time.

LS Scotland probably cheaper for me, question if I can get away.

JW Scotland possibility for me.

PB Would like to go, planning on it. Chances are high.

DC/NIST October/November

JW Possibility

LS Same restrictions.

GSW Not sure, but hope to make it to one of them. I'm more likely to make it to the cheaper one.

WAC will go ahead and work on logistics for these two meetings. (i.e. suggest dates, find meeting space, host for June, etc.)

Checkpoints under Guideline 3

LS Waiting for next draft.

AP I worked a bit on 3.5. I sent a mail last night. (Annuska's email re: 3.5)

JW Main issues you would like comment on?

AP Success criterion 2 (3.5 is annotating complex information). How can we be clear about the success criteria for a table summary? What types of tables require a summary? Should it be the first occurrence on a page or in the site? Everyone said page. Everyone agree?

LS I don't agree. Not just the first time but every time it appears. How do we differentiate between what needs a definition and what doesn't?

AP "a particular community" seemed too limiting to me. Trying to make it more open, but think perhaps more confusing.

LS Perhaps jargon.

WAC What about tools like Atomica, where the user can click on any word and get a definition?

LS Can't do that for acronyms or community-specific jargon.

AP How does Atomica deal with that? Show all possible matches?

MM If it has several different definitions it will come up with all of them.

JW One justification for only annotating the 1st occurrence is that software can apply the definition or expansion to subsequent occurrences.

PB A problem with that is that we assume linear access to information. If a person comes into a site on the 3rd page... even the definition of a page is problematic if content is taken from databases to populate pages in different sections...

MS We were doing something similar - a site created from a database. We use a combination of HTML tags (acronym and abbr) with an external style sheet. Screen readers can pronounce them correctly. It's a bit buggy since user tools aren't completely up to date.

GSW The problem with that: what if someone turns off the style sheets?

JW They could lose it if the UA doesn't interpret the CSS elements. Not a problem with the solution, but with the UA.

MS We're using an external style sheet where we apply each acronym or abbr just once. When it's activated the screen reader will read the full word.

WAC How well supported?

MS Testing. Still buggy.

WAC class on each instance?

MS Yes.

WAC So you could put a title on instead, although the style sheet allows you to keep a dictionary with expansions in one place and propagate it throughout the site. That's cool. Good technique.
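A minimal sketch of what MS's setup might look like - the class names, sample terms, and use of CSS 2 aural generated content are assumptions, since the exact implementation wasn't shown on the call:

    <!-- page markup: each acronym or abbr carries a class naming the term -->
    <p>The <acronym class="w3c">W3C</acronym> publishes
    <abbr class="wcag">WCAG</abbr>.</p>

    /* external "dictionary" style sheet (CSS 2 aural media type):
       expansions live in one place and propagate site-wide */
    @media aural {
      acronym.w3c:after { content: " World Wide Web Consortium"; }
      abbr.wcag:after { content: " Web Content Accessibility Guidelines"; }
    }

    <!-- WAC's alternative: a title attribute on each instance -->
    <p><abbr title="Web Content Accessibility Guidelines">WCAG</abbr></p>

As noted on the call, user agent support for aural CSS was incomplete at the time, which matches the "still buggy" testing results.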

JW Each one sent out should be annotated on first occurrence; if people want to annotate every instance, that is an extra item of functionality that can be provided. Software should be able to substitute.

LGR If we put a link on every term, we make pages hard to navigate.

WAC Metadata - perhaps more like a tooltip where if you hover over a word you get the expansion.

LGR Wording sounds like a link.

AP For those tools that don't support it, provide a link.

WAC I think the UA should figure out how to prevent it.

JW Close to 1.3 - provide the info for the machine to interpret. Not that it falls under 1.3, but it is very similar. Referring to just the first success criterion.

WAC Are tools that can do this available today?

LS They are available.

MS Are available, but don't always work. Odd combinations of what does and doesn't work.

WAC There is a subset of things that are not handled today; what do we do today?

JW Personal suspicion - I expect the tools will happen before large numbers of authors adjust sites to compensate for them. Therefore, the best way to solve the problem is to get the tools developed.

LS A large assumption. Having tried to make tools for people with disabilities, it's not being funded for the moment. It's a small market. There are other factors. On the other hand, the guidelines may not be taken on by wide groups of people, but they are being adopted by people who matter - governments, education, etc. If the content is available but not usable, people are shut out (can't study, can't register for services, etc.). People are interested in following these guidelines and they are keen on making the quality of people's lives better.

WAC 100% adoption or adoption in the most critical places? (possible consensus item?)

MS How many universities are making tools? Several universities are making tools. The guidelines are for the future; there will be more methods to do this in the future. It's nice to provide guidelines for that time. Then, is this something you have to do or something that is nice to do?

JW One way of doing it would be: the requirement is to provide semantics so that the tool can provide what is necessary; sites that go beyond that because the tools aren't there yet can claim an additional level of conformance. A site can decide to go beyond providing the semantics by working with currently deployed technologies.

WAC Interesting idea. Seems to wrap user agents into conformance. I wonder how that would work?

JW "Not only have I conformed to the minimal requirements, but make sure that all of these techniques work on widely deployed tools." Not sure I suppor the idea, but floating it.

LS In Hebrew, it's similar to acronyms: there are letters, and combinations of dots (vowel points) you can put around the letters to change the pronunciation. I've never seen a Hebrew site use the dots. Screen readers in Hebrew are as expensive as braille readers. They have to guess pronunciation based on frequency. It's easier for people than machines, since they have to guess from context. Machines get only 1 of 4 words correct (on average). A person has to detect that a word is wrong and then make a guess. It's no longer a tool of convenience. What happens there? The web authors are leaving out this information.

JW I imagine the argument is that the information is not being provided since it is only probabilistic.

LS Where would you put that? You'd have to check that they work? That's tough. It's a Pandora's box - you'd have to put vowels in.

JW If we were to go down this path (not that I support it yet) but hypothesizing, one of the requirements would be that you must have a way to get the info reliably out of the web content. A guaranteed way to get the info - a well known deterministic way to extract that info from the markup or content. Perhaps taking it too far afield from today's discussion.

+Cynthia

WAC Modular?

JW Split into at least 2 levels of conformance.

  1. provide semantics so that they can be ascertained by software
  2. go beyond that and make sure the content is compatible with currently deployed technologies. Not only are the semantics there, but there is a good chance of their being delivered to current users. An associated document could provide guidance.

WAC Lisa, reactions?

LS The need for vowels has not been adequately dealt with. Pronunciation and acronyms are in machine-readable form... but it's not usable.

JW Agreed. If we say that the pronunciation information should be there for all... hmm... we'd have to specify where it is necessary - certain circumstances...

LS Each word can be uniquely identified.

WAC Seems very similar to 1.4 - talks about pronunciation.

AP I like the tie-in to 1.4.

WAC Differentiate 2 issues:

  1. pronouncing languages that might be visually ambiguous (Hebrew, Chinese?)
  2. English (for example) has less of this problem, i.e. today's screen readers are doing OK (at least not the hardship Lisa described with Hebrew).

WAC Highlight W3C Recommendation - Ruby. Can be used to help with pronunciation.
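For reference, a minimal sketch of Ruby annotation per the Ruby Annotation Recommendation; the base text and gloss here are illustrative:

    <!-- rb is the base text, rt the pronunciation gloss; rp supplies
         fallback parentheses for user agents without ruby support -->
    <ruby>
      <rb>W3C</rb>
      <rp>(</rp><rt>double-u three see</rt><rp>)</rp>
    </ruby>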

LS Experienced reader vs new reader.

GO 1.4 is in WCAG 1.0 as a P1. In New Zealand we have 2 languages (English and Māori). However, if you put the markup in place there is no tool to take advantage of it. I would like someone to create a screen reader to pronounce Māori correctly.
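The markup GO refers to is presumably HTML 4.01's lang attribute for marking changes of natural language; a minimal sketch with an illustrative phrase ("mi" is the ISO 639 code for Māori):

    <!-- lang marks a change of language so a speech or braille user
         agent could, in principle, switch pronunciation rules -->
    <p>Visitors are greeted with <span lang="mi">Kia ora</span>.</p>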

GSW Essentially the only reason to mark changes in language is for braille readers.

WAC HPR does change pronunciation for the languages it supports.

GO Only half a dozen languages.

LS Does it have other requirements? Phonetic info?

WAC What do you tell authors today?

GO Today we tell people: don't use the markup. If you're trying to retrofit a site, it's a huge amount of effort for no benefit.

LS What about using... people who speak Māori also use English. Then the best thing to do is to make a Ruby translation.

JW Issues of tool support are on next week's agenda. One way of dealing with this is for me to write up my ideas about conformance in a way that avoids the objections made. If someone else would like to make a proposal for how to deal with issues of support and how that changes over time... we don't have many concrete proposals. We've had lots of discussion and it keeps coming up, but we don't have any proposals.

WAC Also, are there more assumptions and consensus items we can tease out of this discussion to help us reach consensus? e.g. what reach are we aiming for: 100% adoption or only focusing on particular services?

LS If you aim for only one segment (e.g. government) that's a big difference. It's a step forward.

JW An argument for different levels of conformance is to distinguish qualitatively the amounts of accessibility. Perhaps it's time to go back to the conformance discussion; that's likely to occur at the F2F next week.

Next meeting

Our next meeting is the F2F (23/24 March). We will not have a teleconference 21 March.

Reminder: if you want to attend the F2F by phone, please register by tomorrow.


$Date: 2002/03/14 22:32:29 $ Wendy Chisholm