12:43:50 RRSAgent has joined #rqtf
12:43:50 logging to https://www.w3.org/2021/05/19-rqtf-irc
12:43:52 RRSAgent, make logs public
12:43:55 Meeting: Accessible Platform Architectures Working Group Teleconference
12:43:55 Date: 19 May 2021
12:44:10 agenda+ Introductions.
12:44:17 zakim, clear agenda
12:44:17 agenda cleared
12:44:23 agenda+ Introductions.
12:44:58 agenda+ Overview of current Research Questions Task Force activities.
12:45:21 agenda+ Overview of Immersive Captions Community Group activities.
12:45:38 agenda+ Issues of joint interest.
12:46:11 agenda+ Actions to take and coordination of future work.
12:46:17 zakim, list agenda
12:46:17 I see 5 items remaining on the agenda:
12:46:18 1. Introductions. [from jasonjgw]
12:46:18 2. Overview of current Research Questions Task Force activities. [from jasonjgw]
12:46:18 3. Overview of Immersive Captions Community Group activities. [from jasonjgw]
12:46:18 4. Issues of joint interest. [from jasonjgw]
12:46:18 5. Actions to take and coordination of future work. [from jasonjgw]
12:46:37 Judy has joined #rqtf
12:46:42 chair: jasonjgw
12:46:45 present+
12:57:36 scott_h has joined #rqtf
12:57:47 present+
12:58:29 Judy has joined #rqtf
13:01:54 JPaton has joined #rqtf
13:02:18 present+ John_Paton
13:04:11 scribe: joconnor
13:05:09
13:06:02 zakim, next item
13:06:02 agendum 1 -- Introductions. -- taken up [from jasonjgw]
13:06:08 SteveNoble has joined #rqtf
13:06:09 JB: Over to Jason.
13:06:19 present+
13:06:57 present+
13:07:10
13:07:33 JS: Works on phone, I'm scribing also.
13:08:11 JW: Judy's efforts on planning and logistics are appreciated.
13:08:27
13:08:58 becky has joined #rqtf
13:08:58
13:09:35 JS:
13:10:10 Glad that there is an active community of deaf folks at W3C.
13:10:23 Great to have more direct interaction.
13:10:51 APA is the W3C group that oversees accessibility work at W3C.
13:11:33 JB: Mentions the current group is ~20 people.
13:12:42 Wendy D: Hello, this is Wendy. I work at RIT - one of the deaf community research members.
13:12:51 JOC:
13:12:58 SH:
13:13:15
13:15:50 present+
13:16:52 zakim, next item
13:16:52 agendum 2 -- Overview of current Research Questions Task Force activities. -- taken up [from jasonjgw]
13:18:10 JW: We want to provide an overview here of the RQTF work that would be of interest to the Immersive Captions Community Group.
13:18:29 JW: I'd like to mention the RTC Accessibility User Requirements doc.
13:18:43 It has received wide review and is soon to be published.
13:19:14 It addresses a range of issues of interest to the CG.
13:19:30 https://raw.githack.com/w3c/apa/74f3a865bc80548eb45add85f2c2561db23c0c60/raur/index.html
13:19:41 There is also the XR A11y User Requirements doc.
13:19:48 https://raw.githack.com/w3c/apa/6d5bf713d9d7c65ecda104c213ad47b0e98cfbe1/xaur/index.html
13:20:26 Another document is the Natural Language A11y User Requirements.
13:20:52 It covers UIs that offer natural language interaction - speech and text input.
13:21:15 It is currently at an earlier stage of development - feedback from the Immersive Captions CG would be of interest.
13:21:36 SuzanneTaylor has joined #rqtf
13:21:47 The RQTF is also looking at media synchronisation issues around audio and video - captions and textual descriptions.
13:22:20 The impetus is a common interest across groups in achieving synchronisation of various forms of media.
13:22:32 Steve Noble and Scott H are working a lot on this.
13:22:42 It may become a more formal publication soon.
13:22:50 Those are the main activities.
13:23:16 JS: You did good!
13:23:22 JS: Questions?
13:23:24 q?
13:23:44 agenda?
13:24:15 WD: Can we post something in the chat?
13:24:30 q?
13:24:47 JB: Which link?
13:25:12 WD: XAUR?
13:26:19 JOC: Those links are all on that master page.
13:26:35 JS: Note that the RAUR URI will change as it is being published tomorrow.
13:26:49 JW: I will update the wiki page tomorrow.
13:27:14 zakim, next item
13:27:14 agendum 3 -- Overview of Immersive Captions Community Group activities. -- taken up [from jasonjgw]
13:27:36 JS: Can anyone provide a summary?
13:27:46 JB: Question from Wendy..
13:28:13 WD: When you use the word "user", it is a person that uses something. What about other kinds of authority?
13:28:17 A host, etc.?
13:28:18 q+
13:28:33 WD: What if others are managing technology for their own use?
13:28:40 ack jan
13:28:51 JS: That is a good question, Wendy.
13:29:06 A host is just a user playing a different role, or performing a function.
13:29:35 We may describe requirements for participation, but we don't want to overlook any roles that may need to be fulfilled.
13:29:49 We intend to cover all the roles a user may have.
13:29:54 q?
13:30:23 JW: The RQTF is also developing a doc on remote meetings and hybrid meetings.
13:30:43 RAUR addresses particular technology parts of the stack - this new one is a higher-level overview.
13:30:46 ack janina
13:31:13 JW: Update on IC group activity.
13:31:24 JB: We've not prearranged this.
13:31:45
13:32:10 WD: I think Chris - could you walk through the platforms?
13:32:18 CH: You mean our prototypes?
13:32:54 CH: In our group we are interested in how users use captions in 360 video - as well as XR at some stage.
13:33:09 We are looking at video using head-mounted displays.
13:33:26 Once we have worked out 360 video we will look at immersive environments, etc.
13:33:41 Our group is passionate about the topic.
13:34:08 We found existing advice to be lacking; we are not interested in default solutions, and there are more interesting things we can do.
13:34:39 I've been building JavaScript implementations of our ideas.
13:34:45 This allows us to experiment.
13:35:01
13:35:06 https://www.chxr.org/immersive_subs2_5/?pn=Barbershop
13:35:49 We are building tools with interesting parameters.
13:36:26 It is an HTML and JS implementation, no fancy UI, but useful as a proof of concept.
13:36:51 As a group we want to be proactive about building stuff.
13:37:16 WD: In that 360 view the sound varies..
13:38:12
13:38:39 WD: Some people with hearing can identify things by sound..
13:39:16 For deaf people, we have annotations that are built and added at the beginning, so the deaf user can identify characters as they move through the game.
13:39:25 We also have directional cues.
13:39:45 WD: Frances?
13:39:58 FB: Chris's prototype is amazing..
13:40:22 Also, Wendy worked on things that show speed variability for users.
13:40:32 So these options are really important.
13:40:59 We have started to draft high-level requirements, etc., that we can make available.
13:41:11 Like understanding who is speaking..
13:41:54 There are different perceptions - we need to make sure that when we fix one problem, we don't cause others.
13:42:14 Lots of energy in the group.
13:42:36 JW: That was very helpful.
13:43:02 WD: I've another comment - yes, it is case by case per video..
13:43:21 For example, if people are talking fast, they may want to use a screen, etc.
13:43:36 Options are good - transcript or captioning, etc.
13:43:44 And they need to be able to make a choice.
13:43:53 Also, curved captions are very helpful.
13:44:23 Like Frances was saying, speed is critical - slowing audio down can cause issues..
13:44:42 Or looking up - if people switch positions, captions need to follow, etc.
13:44:47 q?
13:46:01 JB: We should talk about XAUR, and Bill Curtis would like to mention something.
13:46:16 agenda?
13:47:01 zakim, next item
13:47:01 agendum 4 -- Issues of joint interest. -- taken up [from jasonjgw]
13:47:37 scribe: janina
13:47:40 JW: Josh, to discuss XAUR things.
13:48:31 jo: Appreciates feedback from Wendy and others in the CG on our XAUR work. Some questions ...
13:48:42 jo: Should we be discussing signing avatars?
13:48:55 jo: Might there be some contexts where they're acceptable?
13:49:31 jo: Wondering about associated situations ...
13:49:51 wendy: Depends on the developer ...
13:50:17 wendy: The language itself is important, but avatars don't have facial expression and may have artificial language semantics.
13:50:41 wendy: I've seen unacceptable and more acceptable avatars. Sometimes developers think they've got things right, but really don't.
13:51:22 WHO?
13:51:44 Howard: Right now not that great: 2D, too much missing data.
13:51:58 howard: Recording a human signing would be better.
13:52:15 howard: ASL, where one has words in a specific grammatical order.
13:52:31 howard: ASL might change that based on expression (facial), or vocal emphasis, etc.
13:52:42 howard: Turning that into an algorithm is not likely soon.
13:53:17 howard: NTID and GU have put effort into this -- but we're not yet there.
13:53:45 howard: Notes we appreciate having human interpretation on this call.
13:54:02 jo: Asks about the potential -- and it's an opportunity to discuss.
13:55:15 WD: Maybe, yeah, but right now - as far as signing avatars go, there is a group of people reviewing feedback, etc.
13:55:30 Howard: The problem is objective measurements for VR/XR.
13:56:02 Based on current science, a live interpreter is what is required.
13:56:22 Currently WCAG mentions that automated captioning is not sufficient.
13:56:45 We could mention that automated avatars are not sufficient, in a similar vein.
13:57:03 Some PIP (Picture in Picture) is tiny.
13:57:16 The WFD has guidelines for minimum size, etc.
13:57:28 ~25%
13:57:38 We suggest 33%.
13:57:48 q?
13:58:20 zakim, next item
13:58:20 agendum 5 -- Actions to take and coordination of future work. -- taken up [from jasonjgw]
13:58:57 JS: Can we coordinate further?
13:59:08 There can be a follow-up meeting.
13:59:26 Future work on Natural Language Interfaces, etc., would be of use also.
13:59:32 Suggestions on coordination?
13:59:51 +1 to more meetings like this; this was so helpful
14:01:03 JB: People interested in follow-up?
14:01:10 XR Access Symposium: bit.ly/xraccess21
14:02:05 (Notice also that Chris's prototype allows you to save a set of settings to share as a URL for further discussion.)
14:02:09 zakim, list attendees
14:02:09 As of this point the attendees have been jasonjgw, Joshue, John_Paton, janina, scott_h, SteveNoble
14:02:16 present+
14:02:24 rrsagent, draft minutes
14:02:24 I have made the request to generate https://www.w3.org/2021/05/19-rqtf-minutes.html Joshue108
14:07:06 janina has left #rqtf
15:58:32 becky has joined #rqtf