W3C

Accessible Platform Architectures Working Group Teleconference

07 Jun 2017

See also: IRC log

Attendees

Present
MichielBijl, janina, jeffs, Anssi_Kostiainen, Joanmarie_Diggs, Dominique_Hazael-Massieux, JF, Jeff_Sonstein, IanPouncey, Diego_Marcos_(Mozilla), MichaelC, Léonie, jasonjgw, Charles_LaPierre, tdrake, Gottfried, davidb, leweaver, ddorwin, judax, bajones
Regrets
Chair
Janina
Scribe
JF

Contents

    Welcome & Brief Participant Intros
    VR Deliverables Overview
    Accessibility at the VR Workshop
    Anticipated VR Accessibility Challenges Discussion
    Is our Checklist helpful?
    Develop Prioritized Coordination Plan
    Summary of Action Items
    Summary of Resolutions

Welcome & Brief Participant Intros

<jeffs> [waves hello]

<jeffs> oh goodie! Zakim our old friend is here!

<janina> Michiel, can you scribe today?

<scribe> scribe: JF

JS: Welcoming comments to the Virtual Reality participants - Welcome!

<dom> [Dominique Hazael-Massieux, W3C, here from the WebVR perspective as organizer of the Web & VR workshop last year]

<janina> Hi, I'm Janina Sajka, Chair of APA, and a beneficiary of good accessibility technology personally.

<jeffs> Prof. Jeff Sonstein from RIT, coming from the VR side

<jasonjgw> Jason White - Educational Testing Service - participant in APA and Facilitator of the Research Questions Task Force.

<MichielBijl> Michiel Bijl, TPG / The Paciello Group

Hi, JF (John Foliot) member of APA, WCAG, and other accessibility efforts at W3C (and today's scribe)

<anssik> Anssi Kostiainen, Intel Corporation, co-chair of the W3C's Web & VR workshop

<joanie> Joanmarie Diggs, Igalia S.L., co-chair of ARIA working group, member of APA.

<Judy> Judy Brewer, WAI Director (in IRC)

<jeffs> Jeff Sonstein: also worked on W3C Mobile Web Initiative for a number of years

<MichaelC_> Michael Cooper, APA staff contact

<leweaver> Lewis Weaver and Nell Waliczek - Microsoft WebVR & Windows Mixed Reality

<davidb> David Bolter, Mozilla, last minute addition from accessibility, (once connected haptic devices to VRML 1000 years ago)

VR Deliverables Overview

<dom> [For VR Workshop Accessibility session, see https://www.w3.org/2016/06/vr-workshop/report.html#accessibility]

<DigiTec> Justin Rogers, Oculus, Engineering Manager for Oculus Browser and general standards guy for our web efforts.

JS: asking VR Group for an overview of Deliverables

DOM: will give a brief explanation of web VR

an API that allows the browser to detect VR headset devices

<jeffs> ?

once connected, it allows for the creation of 3D views in real time on the headset

These headsets gather data which, combined with other technologies, allows for the creation of immersive VR experiences
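
[Illustrative sketch (not from the meeting): the basic WebVR 1.1 flow Dom describes - detect a headset, then render to it each frame. API names follow the WebVR spec linked below; the canvas argument and drawScene function are placeholders.]

    // Minimal WebVR 1.1 render loop (sketch)
    async function startVR(canvas) {
      // Detect connected VR headset devices
      const displays = await navigator.getVRDisplays();
      if (displays.length === 0) return;
      const vrDisplay = displays[0];
      // Begin presenting to the headset from a WebGL canvas (must be triggered by a user gesture)
      await vrDisplay.requestPresent([{ source: canvas }]);
      const frameData = new VRFrameData();
      vrDisplay.requestAnimationFrame(function onFrame() {
        // Pose and projection data for this frame drive the real-time 3D view
        vrDisplay.getFrameData(frameData);
        drawScene(frameData);     // placeholder: render the scene with WebGL using the pose
        vrDisplay.submitFrame();  // push the rendered frame to the headset
        vrDisplay.requestAnimationFrame(onFrame);
      });
    }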

JS: have some questions about those headsets

<jeffs> [note: all VR is not immersive/headset-based]

<dom> [the WebVR API https://w3c.github.io/webvr/spec/latest/]

questions related to visual versus sonic experience

also: there are other external influences (wind direction, proximity sensing, etc.) - how advanced are these capabilities?

Dom: we are currently limited by what is on the market (i.e. mostly haptic feedback)

so for today, it detects the movement of the headset

in more advanced setups, there are controls that can allow further sensory experiences (i.e. movement)

some controllers also have vibration capability

but these are all limited in scope and capability today

currently the Web VR API is not focussing on these additional possibilities

Justin: we are thinking about the problem. It comes down to providing the appropriate sensors in the device

working on the Web VR 2 spec to allow for those types of inputs

audio exposure is currently tied to Web Audio API - no specific work happening there yet
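
[Illustrative sketch (not from the meeting): spatialised sound is one non-visual channel already available through the Web Audio API that Justin mentions. The oscillator source and its position are placeholders.]

    // Position a sound in 3D space with a PannerNode (Web Audio API)
    const ctx = new AudioContext();
    const panner = ctx.createPanner();
    panner.panningModel = 'HRTF';   // head-related transfer function, suited to headphone listening
    panner.setPosition(2, 0, -1);   // place the source ahead and to the listener's right
    const source = ctx.createOscillator();  // stand-in for a real audio source
    source.connect(panner);
    panner.connect(ctx.destination);
    source.start();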

also exploring 3rd party camera integration

Looking to the APA group for input on what the needs would be

<jeffs> IMHO it is A Good Thing to use other APIs rather than adding to a monolithic VR API


Accessibility at the VR Workshop -- Charles?

<anssik> https://www.w3.org/2016/06/vr-workshop/slides/VR-Accessibility.pdf

<dom> Summary of the accessibility session at the VR workshop

CL: will share slides and note when done

gave the accessibility overview at the workshop

<jeffs> we got the slides URI, no need to read them to us

[Charles reviews the presentation with the group]

outlining various types of supplemental data that would be required.

example: online/VR purchase of clothing: a description of the garment, but also the ability to describe tactile information (the feel of the material), etc.

<judax> Hello everyone, Iker Jamardo from Google's Daydream WebXR team (Technical Leader). I have been in the call from the beginning over a phone call.

<jeffs> welcome, Iker

CL: recognition that accessibility must be integrated, and not bolted on

JS: we can take questions during the next agenda item

<tdrake> Ted Drake, Intuit, Accessibility . Part of APA group

JW: interest in educational efforts and VR

<Gottfried> I am an invited expert of APA - computer scientist

JW: it was suggested that the APIs are being impacted by what is available in the market today - and this is logical

however there are some technologies that people with disabilities may use; these are not intended for a general audience but rather for specialty audiences

given that, there will be requirements that should be contemplated when developing new APIs

Anticipated VR Accessibility Challenges Discussion

<tink> http://www.gamasutra.com/blogs/IanHamilton/20161031/284491/VR__accessibility.php

LW: should have a chat with Ian Hamilton - very active in VR gaming

provides some interesting high-level possibilities and thinking

<bajones> Hi all! Brandon Jones doing my intro line. :)

Wonder about the possibility of bringing together neural networking and Web VR to create real-time audio description

<jeffs> tnx for the link tink ;^}

Anssik: question for APA

haven't followed a11y issues around canvas lately - any updates?

LW: Canvas, in and of itself is not accessible

thus there is no semantic content in canvas

there are strategies today, but they are limited

there is potentially more accessibility coming forward with SVG rather than canvas

anssik: it seems we are inheriting many of the same issues

JW: there is some ability to use ARIA (role) in some instances
... WebGL - making accessible is quite difficult


have seen "hacky" solutions such as a dupicate HTML version postioned off-screen, and that copy ahs the ARIA and HTML semantics included

this creates a maintenance issue to maintain 2 copies

in sync
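
[Illustrative sketch (not from the meeting): the off-screen duplicate pattern JW describes, mirroring hypothetical scene objects into visually hidden HTML that AT can read. The sceneObjects list and its fields are assumptions made for the example.]

    // Mirror scene objects into a visually hidden list so assistive technology can read them
    function syncAccessibleCopy(sceneObjects) {
      let list = document.getElementById('scene-a11y');
      if (!list) {
        list = document.createElement('ul');
        list.id = 'scene-a11y';
        // Move off-screen but keep it in the accessibility tree (do not use display:none)
        list.style.cssText = 'position:absolute; left:-10000px; width:1px; height:1px; overflow:hidden;';
        document.body.appendChild(list);
      }
      list.innerHTML = '';  // rebuild on every scene change - the maintenance burden noted above
      for (const obj of sceneObjects) {
        const item = document.createElement('li');
        item.textContent = obj.description;  // e.g. "red cube, two metres ahead"
        list.appendChild(item);
      }
    }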

LW: from Web Plat perspective, unaware of any specific work here

<dom> [for those in VR not familiar with what ARIA is: https://www.w3.org/WAI/intro/aria]

but to use ARIA here would require a new taxonomy for the world (describing things in VR)

Joanie: unaware of any other canvas/accessibility work outside of what Jason and Léonie have suggested

<MichielBijl> <canvas aria-describedby="here-is-what-you-could-have-had"></canvas>

Justin: one general question is whether there is a way to impact the accessibility tree other than by placing content in the DOM

there are emerging scene description languages that would allow for describing things directly, rather than using metadata on the side (etc.)

<DigiTec> Current scene description languages that could expose some accessibility metadata: React VR, Three.js, Babylon.js, A-Frame.
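
[Illustrative sketch (not from the meeting): one way a scene description language such as Three.js could carry descriptive metadata today, using its free-form userData property. The a11y vocabulary shown is an assumption, not an existing standard, and nothing currently exposes it to assistive technology.]

    import * as THREE from 'three';

    // Attach a human-readable description to a scene object via Three.js' userData bag
    const chair = new THREE.Mesh(
      new THREE.BoxGeometry(0.5, 1, 0.5),
      new THREE.MeshStandardMaterial({ color: 0x8b4513 })
    );
    chair.name = 'office-chair';
    chair.userData.a11y = {
      role: 'furniture',  // illustrative vocabulary, not ARIA
      description: 'Wooden office chair, one metre in front of the user'
    };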

<Zakim> tink, you wanted to mention the AOM

Justin: in the short-term we'll likely need to continue creating duplicate (side) content, but can we move forward from there?

<dom> Accessibility Object Model (explainer)

<tink> https://github.com/WICG/aom

LW: want to point out the Accessibility Object Model (AOM) - a JS accessibility API

<jeffs> the idea of extending a "scene description language" like ThreeJS with std accessibility-oriented data is an interesting one

this will make the accessibility tree a lot more flexible - the ability to query, add to and modify will be easier
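
[Illustrative sketch (not from the meeting): roughly what the phase-1 AOM explainer proposed at the time - a writable accessibleNode on elements, settable from script without adding extra DOM nodes. The proposal is still evolving, so the property names are illustrative rather than a settled API.]

    // Hypothetical use of the early AOM proposal on a WebGL/WebVR canvas
    const canvas = document.querySelector('canvas');
    if (canvas.accessibleNode) {  // feature check: AOM is an unshipped proposal
      canvas.accessibleNode.role = 'application';
      canvas.accessibleNode.label = 'Virtual reality scene';
    }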

BJ: in exploring a parallel accessibility tree, we've discussed the possibility of bringing portions of the DOM into the canvas space

would be interested in seeing that the accessibility of the DOM tree that is brought over is not lost - not sure how to ensure it is tied to the visuals, but ensuring that it is there would be a useful first step

<jeffs> my $0.02 worth: a parallel anything requiring maintaining two thingies in sync is Not A Good Idea

JW: there are 2 approaches: first, more direct support for devices and the people using those devices (a hardware solution), and second, an accessibility tree or DOM-derived tree that AT on the client side could process to enhance the VR application
... is there a need for both? which would be better in what circumstances? would they / how would they interact with each other?

+1 to JeffS

BJ: wanted to explore issues related to accessibility in the real world - heavy skew towards sight impairment, but what of the height of a countertop, or situations where you need to be agile but are mobility impaired

seems like a unique opportunity here: accessibility that mirrors a more real-world need

BJ: so perhaps we might consider introducing a virtual control that raises the floor (as opposed to lowering a counter height), or the ability to turn cameras even if/when you yourself cannot turn

JB: have done some analysis when Second Life was both popular and being investigated by government entities

it was an extraordinary experience - things that were possible to do (e.g. personal flying) were quite interesting

JS: the sensory experience is so rich that even if you remove one sense, you will still get a lot from the experience

CL: what Janina said resonated. the "look around" mode was quite interesting and useful

also Judy's input around physical limitations versus a customized virtual experience with the environment adaptable to your needs

JB: so much of VR design is based on visuals, but some experiences (imagine a roller-coaster ride) will require more than just visual data (haptic feedback, etc.)

the ability to deliberately amplify or extend some of those additional opportunities is exciting

JW: will need to give careful consideration to HOW these things are implemented - little desire to write all of this from scratch - look at libraries and ensure that a11y features are in the architecture

also wonder aloud about input handling and generic support for more abstractly defined modes of input

JW: not sure where those architecture opportunities exist

JS: may be something for further research

<jeffs> +100 to "next steps", more imp than closure on anything in this initial mtg

<Zakim> Judy, you wanted to make a last quick comment on api design goals?

JF: question around movement versus inertia issues

<clapierre> I was going to say that in a recent Code-Sprint / Hackathon at CSUN we worked with PhET and realtime simulations, and we prototyped a priority utterance queue to describe things in realtime.
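
[Illustrative sketch (not from the meeting, and not the CSUN prototype): one way a priority utterance queue could work in a browser, ordering descriptions by priority and speaking them with the Web Speech API.]

    // Queue real-time descriptions and speak the highest-priority one first
    const utteranceQueue = [];
    function announce(text, priority = 0) {
      utteranceQueue.push({ text, priority });
      utteranceQueue.sort((a, b) => b.priority - a.priority);
      if (!speechSynthesis.speaking) speakNext();
    }
    function speakNext() {
      const next = utteranceQueue.shift();
      if (!next) return;
      const utterance = new SpeechSynthesisUtterance(next.text);
      utterance.onend = speakNext;  // move on to the next queued description
      speechSynthesis.speak(utterance);
    }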

JB: often start by looking at sweet spots at the W3C, but here ensuring that APIs can extend to meet accessibility needs will be important

BJ: comment regarding different input modalities

even without thinking about a11y, we've had to think and work on input modes limited by different types of tech - from simple cardboard viewers to high-end viewers with multiple buttons

BJ: the fact that we are already dealing with a wide range of devices is almost forcing us to think about a11y

Is our Checklist helpful? http://w3c.github.io/apa/fast/checklist

<dom> Framework for Accessibility in the Specification of Technologies (FAST)

MC: the checklis is in the header. it is divided into sections around features that possible tech might provide

under each section there are related questions to apply

first to see if it is useful, and second we'd love any feedback regarding enhancements

especially related to VR

<clapierre> Checklist looks great!

Develop Prioritized Coordination Plan

<jeffs> +1

<davidb> DB(unspoken): (multiple modalities and the flexible (re)mapping of input and output seems like the long range future here, as well as machine learning based assistance as tink brought up)

<davidb> (gotta run sorry)

JS: open question - does ongoing dialog with our research TF make sense?

<jeffs> +1

JS: perhaps we may want to think about this and respond online

<judax> +1

<jasonjgw> +1 to RQTF working in the area.

<Judy> +1 to following up on this topic

<jeffs> bye

Will post minutes to both WGs - thanks for attending today

<tink> Thanks JF for scribing.

<judax> Thank you and bye!

trackbot, end meeting

Summary of Action Items

Summary of Resolutions

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.152 (CVS log)
$Date: 2017/06/07 17:19:05 $
