Silver XR Subgroup

10 Aug 2020


Present: jeanne, Crispy, CharlesHall, Joshue108
Scribes: jeanne, Joshue108


<jeanne> scribe: jeanne

<michaelcrabb> https://www.w3.org/2017/08/telecon-info_silver-xr

<Joshue108> scribe: Joshue108

SM: A central position here would be best

I've also spoken with Garrett Ford Williams.

He is about to publish a study.

Looking at various impairments.

140 people in data set.

They have observed friction points etc

A11y is not a plug-in.

SM: It would be great for you to get access to this document.
... There are concerns about a11y in VR.

Must minimise nausea etc

Colour coding, radar, connecting lines etc

Can Garrett email you Jeanne?

JS: Yup and Mike also.

MC: I also worked with the Beeb.
... Sounds great!

SM: Good for you to know there is connected thinking.

MC: We haven't considered nausea and the role Captions etc play in that.

Probably something to look at.

They do induce motion, so we need to figure that out.

JS: How much of this is in the player? How much in the content?

SM: There is also the hardware - refresh rate may not be determined by software or player.

Look at the Sony PS VR headset etc; it uses motion smoothing to interpolate frames and artificially smooth the output.

There is also the idea of colour-coded text and fonts for different users.

Check out Nightwatch.

MC: Where do Guidelines stop etc?

There are levels beyond the guidelines, so how can we adapt them when traditional approaches don't apply?

SM: Some users don't know what works - experiential learning is key to this.

MC: Regarding data - is this related to the object-based media stuff?

SM: Gives examples - Subtitled score etc. Text stopping and parking
... Depends on making a universal toolset.

MC: Can't access Dropbox etc

SM: Our UX teams are using Snapchat and Blender and more.

MC: Questions?

JS: I do!

Regarding fixing subtitles to the person..

I've seen demos, but never the ones with the problems that you did!

Please talk more..

SM: Demos the problems..

<shows pros and cons>

JS: What about a bubble showing time?

SM: You could have hybrid. Am inspired by comics.

<CharlesHall> someone did a demo of the radar at the w3c immersive workshop last november.

JS: Regarding user preferences..

<CharlesHall> a couple demo links are in the report (not sure which) https://www.w3.org/2019/08/inclusive-xr-workshop/report.html

Should that be a part of the player?

SM: Yes

And the content makers would use a format like XML or JSON

rather than burning it in..

For WebXR it is a browser thing.
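As a minimal sketch of the idea above - subtitle cues shipped as data the player can reposition and restyle, rather than burned into the video - the snippet below parses a hypothetical JSON cue. The field names (start, end, text, speaker, position) are illustrative assumptions, not from any subtitle standard.

```python
import json

# Hypothetical spatial subtitle cue delivered alongside the media.
# A player or browser could decide font, colour, and placement per
# user preference; the azimuth/elevation fields are illustrative.
cue_json = """
{
  "start": 12.0,
  "end": 15.5,
  "text": "Look behind you!",
  "speaker": "Guide",
  "position": {"azimuth": 90, "elevation": 0}
}
"""

cue = json.loads(cue_json)
print(cue["speaker"], "-", cue["text"])   # rendering left to the player
print(cue["position"]["azimuth"])          # degrees relative to the viewer
```

Because the cue stays structured data until render time, the same content can serve colour-coded text, radar indicators, or fixed-position subtitles without re-authoring the video.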

MC: Have subtitle standards got support for spatial info?

SM: No

MC: Have talked with Chris Hughes
... Gives overview of other groups work

SM: Have you worked with second screen tech?
MC: No, not officially.

I've worked with Haptic and visual stuff

<CharlesHall> radar is part of Melina Möhlne (IRT) presentation from the ImAc project: http://media.w3.org/2019/11/Subtitling360-ImmersiveWeb_W3C_05-06.11.2019.pptx

SM: <gives info>


Look at "feels like rain"..

MC: 5G means the device will not be a limiting factor.

SM: Downside is lag and refresh rate..

There is no Harding test for XR.


(For photosensitivity and epilepsy)

MC: Are you seeing more a11y work going on in media platforms?

<CharlesHall> the harding test is used in television and video games. it’s time we bring it to web technology.

SM: There is always a concern - it goes hand in hand with good design
... We need data to validate a11y claims in XR.

I'm happy to make more sketches etc

MC: Gives an overview of 360 subtitle placement in XR.

SM: Gives more examples

Shows Lens Studio..

MC: This is helpful for demoing.

SM: Facebook have SparkAR

and Unity and Unreal

Lens Studio is easier

<CharlesHall> https://lensstudio.snapchat.com/

MC: Questions?

SM: You have my email - I'm available.

JS: Regarding the functional outcomes.. we would like your feedback

<michaelcrabb> https://w3c.github.io/silver/subgroups/xr/captioning/functional-outcomes.html

MC: Gives overview

SM: The user needs to feel they are in control

And for time-based stuff, have a time scrubber..

SM: VR is heading for a metaverse thing..

MC: User testing in VR is difficult due to COVID etc.

How are the Beeb doing this?

SM: We shut a lot of that down when COVID started etc

Disposable equipment is needed, or using individual mobile phones.

SM: New AR glasses etc Snapdragon, Antic, Apple glasses and more in Safari etc..

Diff prototyping tools..

MC: That's been great Spencer, and good to see what the BBC are doing.

Summary of Action Items

Summary of Resolutions

[End of minutes]

Minutes manually created (not a transcript), formatted by David Booth's scribe.perl version (CVS log)
$Date: 2020/08/10 14:01:28 $
