W3C

- DRAFT -

Cognitive Accessibility Task Force Teleconference

31 Mar 2014

See also: IRC log

Attendees

Present
Gregg_Vanderheiden, Kinshuk, janina, Debbie_Dahl, John_Rochford, Mary_Jo_Mueller, Neil, Michael_Cooper, Suzanne_Taylor
Regrets
debra_Rue, Barry_Johnson
Chair
Lisa_Seeman
Scribe
MaryJo

Contents


<Lisa_Seeman> agenda: this

<scribe> scribe:MaryJo

preview agenda

LS: Gregg Vanderheiden will speak today on GPII.
... Each of the disability group teams will be expected to report on their progress over the next few weeks.

<Lisa_Seeman> https://www.w3.org/WAI/PF/cognitive-a11y-tf/wiki/Gap_Analysis/Introduction#Methodology_in_User_Research

LS: There should be two user scenarios for each disability type and the group will review and give feedback.
... There will probably be two reviews of each disability type before we call that research work done.

Possible date for FTF

<neilmilliken> Looking forward to hearing the update from gregg

<MichaelC> http://join.me/403094088

<Kinshuk> https://join.me/403094088

<Lisa_Seeman> http://registration.icchp.org/Programme/Overview

LS: There is an upcoming ICCHP conference in Paris from July 9-11.

<neilmilliken> I may be able to get to Paris

<neilmilliken> it's easier than San Diego

LS: The problem is that it is in Paris, so not sure whether others plan to be there, especially those in the U.S.
... Will post this to the list to see how many could possibly attend.

Review of GPII by Gregg Vanderheiden

GV: Presenting the GPII work on global infrastructure.
... Why GPII? There are not good solutions for everyone, and some people can't afford solutions because the technologies are expensive.
... Many people with disabilities have to apply for jobs on the Web, and solutions are often too complicated, like web applications to change a thermostat.
... In addition, the solutions don't work on all different devices/platforms that people use.
... GPII is an infrastructure that makes it easy to find out what helps a person use a digital UI, invokes those features on any device - anywhere, any time, and makes it easier/less expensive to create new solutions.
... http://tinyurl.com/NSF-NIST-GPII-DEMO
... This is a demo of how a user interface can be adjusted based on a specific user's needs.
... The user has a card with an NFC chip that stores their settings/needs. When the user approaches a device that reads the NFC card, the UI changes based on their needs.
... The UI can be magnified, a screen reader could be started automatically, there could be highlighting, and any number of other adjustments to the UI.
... It could even cause the user interface to be very simplified with only essential options presented to the user.
... So, for example, for an elderly person you could present a very simple mail function indicated by something familiar like a mailbox. Then they can see each email as a simple postcard with simple actions.
... When the user replies to an email, they can get understandable feedback like a mail truck taking the sent mail.
... There can also be a simple interface to access pictures they have received.
... It is not a separate program internally; it is really the same program (like Gmail) with a simplified and familiar user interface instead of a complicated one.
... It makes computers friendlier for those unfamiliar with technology, so users feel empowered and capable of accomplishing tasks like sending email.
... The framework for this will be open source and made available through GPII, and can be changed further based on what the user needs. You could even create a UI with no words in environments where literacy is an issue.
... GPII is built by the Raising the Floor consortium.
... What makes it work? 1. Helping users find what will help them, through user awareness, needs-and-preferences discovery and storage, the GPII unified listing and marketplace, and a shopping aid.
... 2. Providing access instantly, anywhere, on any device, through a preferences server, real-time matching of preferences to find a good fit, delivery management of AT and UI adjustment, ...

GV: ... media and materials that automatically help with transformation of the UI, and assistance on demand, which the user can invoke to ask for help.

GV: 3. Easier, lower-cost ways to create, market, and support new solutions, through a developer space with free and commercial parts/frameworks, connections to consumers and experts who can help new developers, ...
... service creation tools to help make accessible media and documents, a unified listing and marketplace, and a micro-finance infrastructure to help finance research and development.
... Currently testing with various devices and operating systems - including kiosks, mobile, desktop, etc.
... Working with standards groups to get the support for all of this in the standards.
... The main driver for the GPII work was the cognitive area. It is difficult for people with cognitive disabilities to find, install, and use the right assistive technology.
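
The flow Gregg describes — a card carrying only a user token, with preferences stored on a server and applied to whatever device reads the card — can be sketched roughly as follows. This is a minimal illustration, not actual GPII code; all names, preference keys, and data structures here are invented for the example.

```python
# Hypothetical sketch of the card-tap flow described above.
# A card carries only a user token; preferences live on a server
# and are applied on whatever device reads the card.

# Stand-in for the GPII preferences server (invented data).
PREFERENCES_SERVER = {
    "user-123": {
        "magnification": 2.0,
        "screen-reader": True,
        "simplified-ui": True,
    }
}

class Device:
    """A device that adjusts only the features it actually supports."""
    supported = {"magnification", "simplified-ui"}  # no screen reader here

    def __init__(self):
        self.active = {}

    def on_card_tap(self, token):
        # Look up the user's stated needs/preferences by token...
        prefs = PREFERENCES_SERVER.get(token, {})
        # ...and apply each one this device is capable of.
        for feature, setting in prefs.items():
            if feature in self.supported:
                self.active[feature] = setting

kiosk = Device()
kiosk.on_card_tap("user-123")
print(kiosk.active)  # only the adjustments this kiosk supports are applied
```

Note that the kiosk silently skips the screen-reader preference it cannot satisfy, which mirrors the "works on any device, as far as that device allows" behavior of the demo.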

LS: There is a lot of synergy between what our task force is doing and what GPII is developing as solutions in this space.
... Are there metadata tags for different cognitive functions already defined?

<neilmilliken> My understanding is it's based on user preferences

GV: Yes and no. We don't describe users at all. We only describe what users need in the metadata. We don't capture the users' disability, but instead capture their user needs/preferences.

<MichaelC> http://gpii.net/

<MichaelC> http://wiki.gpii.net/

GV: The metadata captures the capabilities of a device to meet certain user needs/preferences, so more capabilities can be added later and reflected in the metadata.
... On the above website, there is a common terms registry ( http://wiki.gpii.net/index.php/Common_Terms_Registry ) which can be used in the user needs/preference statements.
... There is no metadata describing the user. The user would have to set their preferences to what would help them, like they prefer breadcrumbs. Then if the product supports breadcrumbs it could turn that feature on.

LS: The setup would not necessarily be easy as the user doesn't know what preferences to set up.

GV: There would be a discovery and exploration aid to help users discover what is helpful to them and then set their own personal preferences.
... As an example, not all individuals with low working memory will find the same features helpful.
... So that person would discover and explore which features are the most helpful and save that. Then, whenever technology supports those features, it can adjust to that particular user's needs.
... The Access for All standard was used to create the preferences included in the IMS data registry.
... The common terms registry is used to create the common terms used to write a metadata statement.
... We don't profile users, but instead profile in more general terms what features help users.
... The GPII doesn't create techniques, but fosters technique development by others. It is all about facilitating everyone on how to find, use and create accessibility features.
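The matching Gregg describes — stating needs/preferences against a shared vocabulary and turning on any feature a product supports, without ever recording a disability — can be sketched as below. The term names are illustrative only, not actual Common Terms Registry entries.

```python
# Minimal sketch of preference-to-capability matching using shared
# vocabulary keys, in the spirit of the Common Terms Registry.
# All term names below are invented for illustration.

user_preferences = {            # describes needs, never the user's disability
    "breadcrumbs": True,
    "reduced-working-memory-load": True,
    "text-to-speech": False,
}

# Features this particular product declares in its metadata.
product_capabilities = {"breadcrumbs", "text-to-speech", "high-contrast"}

def features_to_enable(prefs, capabilities):
    """Turn on every feature the user wants that the product supports."""
    return {term for term, wanted in prefs.items()
            if wanted and term in capabilities}

print(features_to_enable(user_preferences, product_capabilities))
# -> {'breadcrumbs'}
```

Because both sides use the same vocabulary keys, the product never needs to know why the user wants breadcrumbs, only that they do.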

Liddy: Will talk next week about the ISO context for describing user needs, which has a vocabulary for doing that.
... There are parts of the ISO work where the cognitive group can help determine what simplified language could be helpful to users.

<Lisa_Seeman> https://www.w3.org/WAI/PF/cognitive-a11y-tf/wiki/Section_3

LS: Made a page for ideas for inclusion, link above.
... Use the wiki as a scratch pad so we don't lose any ideas.
... Liddy will be presenting next week, and hopefully the aging and dementia or possibly the non-vocal group can present next week as well.

GV: Will try to attend our meetings when he can.

LS: Will check to see if Gregg can send us a copy of the slides to share with the team.

Summary of Action Items

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.138 (CVS log)
$Date: 2014/03/31 17:59:34 $
