W3C

– DRAFT –
Immersive Web WG/CG bi-weekly call

18 April 2023

Attendees

Present
cabanier, CharlesL, yonet
Regrets
-
Chair
Ada
Scribe
Leonard

Meeting minutes

<yonet> https://issues-by-label.glitch.me/?label=face-to-face&command=/facetoface

upcoming face-to-face agenda

Aysegul: Topics for the F2F next week (Mon/Tue) at Apple in Silicon Valley
… looking for topics, especially those not already listed. Add to GitHub

Lots of discussion about logistics for F2F

Chairs will get out a timed agenda this week.

<yonet> https://github.com/immersive-web/administrivia/blob/main/F2F-April-2023/schedule.md

Parking lot locations will be announced

See the above link for all schedules and locations for the F2F

Dinner is being planned. Ada is having difficulties with the restaurant. Stay tuned for location & other details

Dinner is Monday!

Ada: Food available for breaks & lunch

Current issues by label: https://issues-by-label.glitch.me/?label=face-to-face&command=/facetoface

XR Access and our NSF Grant on making non-verbal cues and documents/whiteboards in VR accessible to people with low vision and blindness.

https://github.com/immersive-web/administrivia/issues/195

Charles: Introductions of team

Sean: (Manager of Corporate Relations at LightHouse).

Dylan (Head of Community Outreach, XR Access): Discusses NSF project to convert non-verbal communication into sound and haptic communication
… Includes eye contact, group formation, shared (3D) objects, etc., turned into sound and haptics to allow non-visual participants to be first-class citizens of virtual spaces
… Goal: Create a set of guidelines that can be implemented by W3C and others to make 3D worlds accessible

Charles: In addition to social cues, make (3D) objects accessible to visually impaired individuals
… Includes audio cues of actions (whiteboard markings, etc.), then reading the result (when appropriate)
… Grant Phase 1 ends Sep 2023. Phase 2 (2-year) is to build on the Phase 1 results.
… intent is to launch a company to do this at the end of Phase 2

Dylan: Make sure guidelines can be used and embraced

Sean: Including visually impaired individuals in all phases of the project to ensure the guidelines work in practice.

<Zakim> Ada_, you wanted to talk about possible entry points

Ada: Can IW use HTML as an accessibility fallback?

Dylan: Ties into AOM (Accessible Object Model) project. Project is defining metadata for objects + structure for spaces.
… Work is just beginning.

Piotr: Goal is to expose additional data/structure for software to understand the semantics of the scene?
… How are app developers forced/encouraged to use this (new) system?

Dylan: IW is part of the solution. There are other groups too (e.g., phone apps).
… What does it take for WebXR to understand semantics? If not there, where in W3C are these guidelines managed?

<yonet> https://github.com/immersive-web/webxr/blob/main/accessibility-considerations-explainer.md

From Dylan in chat: "Example of AOM work: How Do You Add Alternative Text and Metadata to glTF Objects? https://equalentry.com/accessibility-gltf-objects/"
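
A minimal sketch of that idea: glTF 2.0 allows free-form application data in an "extras" field on any object, so alt text can ride along with a node. The "altText" key and the labelForNode helper below are illustrative assumptions, not a standard or necessarily the article's exact approach.

    // Illustrative only: "extras" is free-form application data in glTF 2.0;
    // the "altText" key is an assumed convention, not part of any spec.
    interface GltfNode {
      name?: string;
      mesh?: number;
      extras?: { altText?: string };
    }

    // Hypothetical helper: pick a screen-reader-friendly label for a node.
    function labelForNode(node: GltfNode): string {
      return node.extras?.altText ?? node.name ?? "unnamed object";
    }

    const whiteboard: GltfNode = {
      name: "whiteboard_01",
      mesh: 3,
      extras: { altText: "Whiteboard with the sprint schedule written on it" },
    };

    console.log(labelForNode(whiteboard));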

Koolala: The best developer tools are in the browser, but there are no VR developer tools there

Charles: Looking for a contact in W3C accessibility

Aysegul: Will provide it.

<yonet> https://www.w3.org/TR/xaur/

Shiri: (Cornell U) Goal is to have design recommendations adopted
… What is being looked for, how do you (IW) work with other organizations, etc.?

<Zakim> Ada_, you wanted to mention model

Ada: Looking at <model> tag and the ability to include accessibility info.

Dylan: Alt text on 3D objects is probably not going to fly. Need to understand where people are located in space.
… Need to understand which topics/fields go to which group.

Ada: Standards would probably happen in this group

Dylan: Avatar location/orientation is important. How would that be represented in XR?
… Would each object need to have a set of coordinates where a change would cause an audio signal?
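
As a rough illustration of that question (not any WebXR API): track each labeled object's or avatar's position, and emit a non-visual cue when the listener crosses a distance threshold. All names below (playEarcon, NEAR_THRESHOLD_METERS) are assumptions made for the sketch.

    type Vec3 = { x: number; y: number; z: number };

    interface TrackedEntity {
      label: string;     // e.g. "shared whiteboard"
      position: Vec3;
      wasNear: boolean;  // remembered so the cue fires only on change
    }

    const NEAR_THRESHOLD_METERS = 1.5; // illustrative value

    function distance(a: Vec3, b: Vec3): number {
      return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
    }

    // Placeholder for whatever sound or haptic channel an app actually uses.
    function playEarcon(message: string): void {
      console.log(`cue: ${message}`);
    }

    // Call periodically with the listener's current position.
    function updateProximityCues(listener: Vec3, entities: TrackedEntity[]): void {
      for (const e of entities) {
        const near = distance(listener, e.position) < NEAR_THRESHOLD_METERS;
        if (near !== e.wasNear) {
          playEarcon(`${e.label} is now ${near ? "nearby" : "farther away"}`);
          e.wasNear = near;
        }
      }
    }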

Ada: A metadata system would be useful. If social VR stuff is left to developers, then it probably will not be developed.
… The OS (browser) has a much better opportunity to manage the information.

<Koolala> From the developer tools perspective, if you close your eyes and imagine every object around your room surrounded by a cube of XML metadata - that metadata is what feeds into accessibility features. Model view helps here. So does 3D detached CSS.
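
A sketch of that picture under assumed types: each scene object carries a bounding-box center plus a label and role, and an accessibility layer flattens them into a distance-ordered, navigable list (much like a screen-reader rotor). None of these names come from a spec.

    interface AccessibleSceneObject {
      label: string;                               // human-readable name
      role: "person" | "surface" | "object";       // assumed vocabulary
      center: { x: number; y: number; z: number }; // bounding-box center
    }

    // Order objects by distance from the user so "next/previous" navigation
    // starts with whatever is closest.
    function navigableList(
      user: { x: number; y: number; z: number },
      objects: AccessibleSceneObject[]
    ): string[] {
      const dist = (p: { x: number; y: number; z: number }) =>
        Math.hypot(p.x - user.x, p.y - user.y, p.z - user.z);
      return [...objects]
        .sort((a, b) => dist(a.center) - dist(b.center))
        .map((o) => `${o.label} (${o.role}), ${dist(o.center).toFixed(1)} m away`);
    }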

Brandel: IW is not just Immersive XR. This group is the "least-worst" place to start the discussion, but distribution
… is likely to happen depending on the specific issue

Dylan: Some are developer standards (e.g., alt text), others are design items. Team focus is design.
… WebXR appears to be more focused on developer items.

Brandel: IW has the WebXR API and <model>.

Piotr: Might have a problem with the communication channel and how to force apps to use that channel
… If there are multiple ways, people will choose the easiest for them
… Need to have a developer education program
… Is it possible to get a binding to share semantics?

Yonet: Having or giving the information is not the same as making it accessible. Judicious choices must be made

<Koolala> Can the developer tools be that channel?

Dylan: An "information overload" mode will be overwhelming. Need a variety of screen readers that use different interpretations
… May need to have metadata from early in the creation pipeline

Charles: Are there people from the IWWG who could create an accessibility task force
… to steer work for accessibility?

Leonard: Is there a short-list of very important accessibility items?

Charles: We do have a short list. Need to come up with stories and wireframes. Will share here for reactions

Dylan: long list: https://bit.ly/xraccess-github

<yonet> https://github.com/XRAccessibility/xraccessibility.github.io

Dylan: email: Dylan [AT] xraccess [DOT] org

Minutes manually created (not a transcript), formatted by scribe.perl version 197 (Tue Nov 8 15:42:48 2022 UTC).