W3C

- DRAFT -

Accessible Platform Architectures Working Group Teleconference

21 Aug 2019

Attendees

Present
jasonjgw, janina, scott_h, SteveNoble
Regrets
Chair
jasonjgw
Scribe
janina

Contents


<scribe> scribe: janina

XR accessibility: understanding XR technologies and their accessibility implications.

jgw: Recalls his on-list XR tech summary

https://lists.w3.org/Archives/Public/public-rqtf/2019Aug/0016.html

jgw: Asking whether there's a fair summary of inputs in the above

jo: Some more than others currently

<Joshue108> https://www.w3.org/WAI/APA/wiki/Xaur_draft#XR_controllers_and_accessibility

jo: Talking about controller support in a broad way, a challenge for developers
... So, I used the Xbox Adaptive Controller as an example
... Noting some users have certain restrictions on what they can do, and therefore different abilities with devices

<Joshue108> JS: I'm impressed by the range of cutting-edge AT that is now going mainstream in XR.

<Joshue108> Jason, you mentioned many of these things in your mail.

<Joshue108> What about what Dragon used to be?

<Joshue108> They all started as niche AT but are now mainstream and are effective.

<Joshue108> My prediction is that the Xbox and whatever input device we use is temporary, as we will be able to use our body and gestures as input.

<Joshue108> So the modifications will come from asking how we can support users who can't do those things.

jgw: Asks whether these input techniques may become more important?

jo: Definitely in broad agreement
... Need to capture the broad range input/output

<Joshue108> JS: So the reason to point out the mapping between these things, AT and mainstream, is that we have been living this.

<Joshue108> You need to say it indirectly but we urge people to note that this stuff has been around for a while.

<Joshue108> Jason - do have a look at the Various Input Modalities section to see if it covers what you are talking about https://www.w3.org/WAI/APA/wiki/Xaur_draft#XR_controllers_and_accessibility

jo: Can we work out the unique a11y features/challenges?

sh: Original AT was quite linear; XR isn't. That's a difference
... No longer restricted that way -- can go in any direction

jo: We need to support multiple alternative outputs that all need to be in sync
... multiple outputs from multiple devices going to multiple devices

<Joshue108> Great point Scott!

js: Suggests the APIs need to expressly support choosing which media to represent, and which to simply not stream or waste processing cycles on, e.g. video for blind users
... or audio for users who are deaf/hh

jgw: How can one be selective in what non visual info to present?

<Zakim> Joshue, you wanted to say what about modality muting

jo: Still, sync'ing will be important in multi user scenarios

sh: Also a haptic modality

jo: Bandwidth will be a problem for some time, so modality muting could be beneficial
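The modality muting idea raised here can be sketched against the standard WebRTC `MediaStreamTrack.enabled` flag. A minimal illustration, not from the minutes; `muteModalities` is a hypothetical helper name:

```javascript
// Hedged sketch of "modality muting": disable the tracks a given user
// does not consume, so the application neither renders them nor spends
// bandwidth or processing cycles on them. MediaStreamTrack.enabled is
// the real WebRTC flag; everything else here is illustrative.
function muteModalities(stream, { video = true, audio = true } = {}) {
  // Setting enabled = false silences a track without stopping it,
  // so it can be re-enabled later without renegotiation.
  for (const track of stream.getVideoTracks()) track.enabled = video;
  for (const track of stream.getAudioTracks()) track.enabled = audio;
}

// e.g. a blind user keeps audio but drops the video modality:
// muteModalities(localStream, { video: false });
```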

<Zakim> janina, you wanted to say that's a screen reader problem, not an XR problem

<Joshue108> JS: Sounds like a screen reader; I'm criticising..

<Joshue108> JS: The real world is full of sounds..

<Joshue108> We don't have a problem choosing cues that are non-verbal..

<Joshue108> We can pick and choose from some distance.

<Joshue108> It's not such a focus thing, like this is for a screen reader - if it is a separate reality then we will need different techniques.

<Joshue108> JW: +1

jgw: Description won't be able to keep up, there are too many things with too many detailed parts to possibly describe for any meaningful interaction model

<Joshue108> s\abstaction\abstraction

<Zakim> Joshue, you wanted to say that is the responsibility or property of author abstraction

jo: The shorter term won't be that capable, we'll have to do well with more limited content, therefore more quality authoring support

<Zakim> janina, you wanted to discuss the yin/yang of it all

<Joshue108> JS: In the longer range we will need to rethink..

<Joshue108> We do need to support today, so whatever that entails, we need an approach.

<Joshue108> What Silver is doing now around the user's ability to do tasks is great.

<Joshue108> We can now write docs, annotate things, print etc - we can do lots.

<Joshue108> Collaborative doc editing is now the cutting edge..

<Joshue108> As we get close to AR and VR we have other functional tasks in the world.

<Joshue108> We need to think what that model is, and how to support PwDs and what is required.

<Joshue108> We spoke about this with Judy; she was talking about biology experiments..

<Joshue108> Frog descriptions etc but what is the model to help a blind user do that?

<Joshue108> That's a useful thought experiment.

<Joshue108> JW: I wanted to clarify on Joshs point..

jgw: Notes XR gives us a different and more elaborate set of design possibilities
... Will need to capture in a doc for XR app designers

User Needs in XR

<Joshue108> https://www.w3.org/WAI/APA/wiki/Xaur_draft#XR_User_Needs

jo: Have updated under Ed, Gaming, Health, etc
... Too many examples would be counterproductive, but we need to communicate these ideas; that's my challenge
... Think some of the reqs are generic
... Whether gaming, ed, transport, whatever ...
... Please review what's there now, today's conversation won't change anything!
... Even though a useful conversation

Real-time communication accessibility: draft and updates.

jgw: Notes the on-list conversation and current draft of a11y reqs

<Joshue108> JS: Meeting is scheduled.

<Joshue108> JS: We got a good response from WebRTC

<Joshue108> Agreement to meet with the group is the key, will give them pointers in advance.

<Joshue108> We will bring up the need to support RTT especially in the US.

<Joshue108> This will help them be a key player in the market - so we will make this point.

<Joshue108> Then the other requirements around text are mostly user agent requirements.

<Joshue108> Then we will show them the use cases that they will want to meet.

<Joshue108> So the APIs they build will need to support them.
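The RTT requirement discussed above means transmitting text as it is typed, character by character, rather than only on message submit. A hedged sketch of what such a primitive might look like; all names are hypothetical, and `transport` stands in for something like an RTCDataChannel's send():

```javascript
// Hedged sketch of real-time text (RTT): each keystroke is transmitted
// immediately rather than buffered until the user presses "send".
// `transport` is any function that ships a message to the far end.
function makeRttSender(transport) {
  let composed = "";
  return {
    type(ch) {
      composed += ch;
      transport(JSON.stringify({ op: "insert", ch })); // sent per character
    },
    backspace() {
      composed = composed.slice(0, -1);
      transport(JSON.stringify({ op: "erase" }));      // erasures are sent too
    },
    text() { return composed; },                       // local composed view
  };
}
```

The point for API design is that the channel must carry many tiny, ordered messages with low latency, not one message per completed utterance.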

jo: Reading AOM, it has the same API problem: purpose not specified in English prose
... We need to finish with the use cases

Getting ready for FPWD

jo: Still work to be done, but closer

<Joshue108> JS: Whenever we think they are ready to publish we can.

<Joshue108> We have discussed with Dom.

<Joshue108> We can discuss with APA and if so lets move fast.

<Joshue108> I think there is a relationship between FAST and XAUR, with FAST as the parent and XAUR as the child.

<Joshue108> +1 to conversation around FAST at TPAC.

Summary of Action Items

Summary of Resolutions

[End of minutes]

Minutes manually created (not a transcript), formatted by David Booth's scribe.perl version 1.154 (CVS log)
$Date: 2019/08/21 14:05:25 $
