W3C

- DRAFT -

Silver XR Subgroup

03 Aug 2020

Attendees

Present
jeanne, bruce_bailey, CharlesHall, Crispy
Regrets
Chair
MikeCrabb
Scribe
jeanne

Contents


<scribe> scribe: jeanne

Timelines

MC: What do we have to have done for 11 August?

JS: Functional Outcomes and some tests, if possible

MC: Deadline for FPWD?

JS: We should finish by 31 August in order to get the FPWD published before TPAC on 12 October

Functional Outcomes

<michaelcrabb> Outcome 1: We Need Captions

<michaelcrabb> Auditory information, including speech and key sound effects, is translated into alternative formats (e.g. captions) so media can be consumed when sound is unavailable or limited

<michaelcrabb> Functional Need -

<michaelcrabb> Usage without hearing

<michaelcrabb> Usage with limited hearing

<michaelcrabb> Usage without vision

<michaelcrabb> Usage with limited vision

<bruce_bailey> that looks good, and right in line with the functional needs statement definition

<Zakim> bruce_bailey, you wanted to discuss consumed

JS: Translates speech and key sound effects into alternative formats (e.g. captions)

CH: Can we add other needs from the current Functional Needs work? For example, Limited auditory sensory memory.

JS: That is my understanding of the resolution from the Silver meeting on Friday.
... I think they will be included as tags, and reference the Functional Needs document.

CH: That meets one of the Requirements goals of Silver -- to include more disability groups.

<CharlesHall> Use without vision and hearing

JS: Translates speech and key sound effects into alternative formats (e.g. captions) so media can be understood when sound is unavailable or limited

<michaelcrabb> Outcome 2: We need meta-data of sound effects

<michaelcrabb> Auditory meta-information, including sound directionality, is conveyed to the viewer so that contextual information is available when sound is unavailable or limited

JS: Conveys Auditory meta-information to the viewer, including sound source and direction so that contextual information is available when sound is unavailable or limited.

CH: What about "information about the sound in addition to the text of the sound"

JS: Conveys information about the sound in addition to the text of the sound (for example, sound source and direction).
... users understand the person speaking and where the sound is coming from in addition to the text description of the sound.
... Conveys information about the sound in addition to the text of the sound (for example, sound source, duration, and direction) so users understand the necessary information about the sound to understand the meaning of the sound.
... Conveys information about the sound in addition to the text of the sound (for example, sound source, duration, and direction) so users know the necessary information about the sound to understand the meaning of the sound.
... Conveys information about the sound in addition to the text of the sound (for example, sound source, duration, and direction) so users know the necessary information about the sound to understand the context of the sound in the environment of the sound.

<michaelcrabb> Conveys information about the sound in addition to the text of the sound (for example, sound source, duration, and direction) so users know the necessary information about the context of the sound in relation to the environment it is situated in

+1 to Mike's version
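As a minimal sketch (not part of the minutes) of how the agreed Outcome 2 wording might be modelled, the auditory meta-information could travel alongside the caption text in a simple structure. The interface, field names, and `renderCue` function below are illustrative assumptions, not from any standard.

```typescript
// Hypothetical shape for a caption cue carrying auditory meta-information,
// as discussed for Outcome 2. All field names are illustrative assumptions.
interface CaptionCue {
  text: string;            // the text of the sound, e.g. "[door slams]"
  source: string;          // what produced the sound
  direction: string;       // where the sound comes from, relative to the viewer
  durationSeconds: number; // how long the sound lasts
}

// Presents the meta-information together with the caption text so users
// know the context of the sound in the environment it is situated in.
function renderCue(cue: CaptionCue): string {
  return `${cue.text} (${cue.source}, ${cue.direction}, ${cue.durationSeconds}s)`;
}

const cue: CaptionCue = {
  text: "[door slams]",
  source: "door",
  direction: "behind the viewer",
  durationSeconds: 1,
};
```

With the sample cue above, `renderCue(cue)` yields `[door slams] (door, behind the viewer, 1s)`.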

<michaelcrabb> Outcome 3: Second screen adaptations should be possible

<michaelcrabb> Captions and caption meta-data are capable of being presented in alternative methods (e.g. second screen) to make the information more accessible when visual access is unavailable

JS: Provides captions and caption meta-data in alternative formats (e.g. second screen)
... Provides captions and caption meta-data in alternative formats (for example, second screen or braille display)

<michaelcrabb> Functional Need -

<michaelcrabb> Usage without vision

<michaelcrabb> Usage with limited vision

<michaelcrabb> Usage with Limited manipulation or Strength

<michaelcrabb> Usage with Limited Reach

CH: I can think of XR scenarios that would benefit a sighted person when the screen is dense with information and I would need to move the captions to another device.

+1

<michaelcrabb> 3) Provides captions and caption meta-data in alternative formats (for example, second screen or braille display) to allow users the opportunity to move caption and meta-data to alternative displays

JS: I find it hard to imagine routing captions to a second screen when wearing goggles, but not all XR is used with goggles.

<michaelcrabb> This benefits users with limited vision and users without vision; and users with Limited manipulation, strength, or reach.

3) Provides captions and caption meta-data in alternative formats (for example, second screen or braille display) to allow users the opportunity to move caption and meta-data to alternative displays. This benefits users without sound and vision, users who need assistive technology to magnify portions of the view, and users who have limited reach.

3) Provides captions and caption meta-data in alternative formats (for example, second screen or braille display) to allow users the opportunity to move caption and meta-data to alternative displays. For example, this benefits users without sound and vision, users who need assistive technology to magnify portions of the view, and users who have limited reach.

<bruce_bailey> +1

3) Provides captions and caption meta-data in alternative formats (for example, second screen or braille display) to allow users the opportunity to move caption and meta-data to alternative displays. For example, this benefits users without sound and vision, users who need assistive technology to magnify portions of the view, or users who have limited reach.

<michaelcrabb> +1

<michaelcrabb> Outcome 4: Customization of captions

<michaelcrabb> Customisation of caption style and position is available to support users who would benefit from tailored presentation options

<michaelcrabb> Provides customisation of caption style and position

<michaelcrabb> to support people with limited vision or color vision deficiency.

<michaelcrabb> Provides customisation of caption style and position to support people with limited vision or color vision deficiency.

<michaelcrabb> Provides customisation of caption style and position to support people with limited vision or color perception. Customisation options also have the potential to benefit all users.

<michaelcrabb> Outcome 5: Time alterations for caption viewing

<michaelcrabb> The amount of time that a given caption (and associated meta-data) spends on screen can be personalised in order to give additional time to locate the sound that is being presented

JS: The last sentence needs some plain language work, but we can ask for help with that.

<michaelcrabb> Provides customisation of caption timing to support people with limited manipulation, strength, or cognition.

4) Provides customisation of caption style and position to support people with limited vision or color perception. Customisation options can benefit all users.

<michaelcrabb> Functional Need -

<michaelcrabb> Usage without vision

<michaelcrabb> Usage with Limited manipulation or Strength

<michaelcrabb> Usage with Limited Reach

<michaelcrabb> Usage with Limited Cognition

5) Provides customisation of caption timing to support people with limited manipulation, strength, or cognition.
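A minimal sketch of the Outcome 5 idea, assuming a user-set multiplier applied to each cue's default on-screen time to give additional time to locate the sound being presented. The function name and the choice of a multiplier (rather than, say, an added offset) are hypothetical.

```typescript
// Hypothetical: extend a caption's on-screen time by a user preference
// (Outcome 5), giving additional time to locate the presented sound.
function adjustedDisplaySeconds(defaultSeconds: number, multiplier: number): number {
  if (multiplier < 1) {
    // Customisation here only lengthens display time; it never shortens it.
    throw new Error("multiplier must not shorten the default display time");
  }
  return defaultSeconds * multiplier;
}
```

For example, a cue shown for 2 seconds by default with a 1.5x preference would stay on screen for 3 seconds.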

Summary of Action Items

Summary of Resolutions

[End of minutes]

Minutes manually created (not a transcript), formatted by David Booth's scribe.perl version (CVS log)
$Date: 2020/08/03 14:00:19 $
