19:00:39 RRSAgent has joined #aria-at
19:00:39 logging to https://www.w3.org/2022/04/11-aria-at-irc
19:02:07 Meeting: ARIA AT Automation Group Meeting
19:03:22 https://github.com/w3c/aria-at-automation/issues/18
19:03:26 s3ththompson has joined #aria-at
19:03:29 mzgoddard has joined #aria-at
19:03:38 MEETING: ARIA and Assistive Technologies Community Group (AT Automation)
19:04:10 rrsagent, make minutes
19:04:10 I have made the request to generate https://www.w3.org/2022/04/11-aria-at-minutes.html s3ththompson
19:04:18 scribe: s3ththompson
19:04:27 RRSAgent, make logs public
19:04:58 TOPIC: Summary from previous working session
19:11:56 zcorpan: We talked about a "headless" mode for screen readers, which could in theory allow running one instance of a screen reader for testing while keeping another installed to vocalize output, and also the idea of "sessions". Someone pointed out the need to automatically download a specific version of a screen reader from a script.
19:12:24 https://github.com/w3c/aria-at-automation/issues/15
19:13:14 s3ththompson: As for meeting structure, we will have a monthly "Group Meeting" where we try to get all vendors to attend and discuss topics requiring broad consensus, followed by a "Working Session" two weeks later. The working session is open to all, but some vendors have said they only have capacity for a monthly meeting.
19:13:40 s3ththompson: Bocoup will set up a calendar event and write up the meetings on the wiki page.
19:13:55 Matt_King: I'd like to send explicit invites to certain people as well.
19:13:59 s3ththompson: Great, let's talk about that.
19:14:05 TOPIC: API Standard Roadmap
19:14:26 zcorpan: Milestone 0 would be architecture, API shape, and protocol.
19:15:35 zcorpan: Milestone 1 would be settings.
19:20:01 James Scholes: How do we handle abstraction, or cases where settings differ between screen readers?
19:20:23 Matt_King: I assume we'd have the ability to change arbitrary settings for each screen reader, along with additional mechanisms for changing shared settings.
19:22:54 s3ththompson: We can also decide whether an abstraction should live at the level of the standard API or in a higher-level library, like Playwright or something.
19:23:26 James Scholes: So there would be an API surface for shared settings, an API surface for changing arbitrary settings, and an API for enumerating arbitrary settings.
19:28:19 Michael Fairchild: Would vendors be implementing at the same time as we are developing the API?
19:28:44 Matt_King: We will be working on an implementation in NVDA alongside developing the API, and hoping other vendors work on implementations alongside it too.
19:29:11 James Scholes: Our implementation roadmap isn't exactly captured here, but it will run in parallel with this as much as possible.
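To make the three settings surfaces James Scholes summarizes above concrete, here is a rough TypeScript sketch. Every name in it is an illustrative assumption, not part of any agreed API shape:

```ts
// Illustrative sketch only: these interfaces are assumptions about how the
// three settings surfaces discussed above could be shaped, not an agreed API.

// Settings shared across screen readers.
interface SharedSettings {
  speechRate?: number;      // e.g. words per minute
  verbosity?: "low" | "medium" | "high";
}

// A single vendor-specific setting, as returned by enumeration.
interface SettingDescriptor {
  name: string;             // vendor-specific key, e.g. "speechViewer.enabled"
  type: "boolean" | "number" | "string" | "enum";
  allowedValues?: string[]; // present when type is "enum"
}

interface ScreenReaderSettings {
  // Surface 1: change settings shared across screen readers.
  setShared(settings: SharedSettings): Promise<void>;

  // Surface 2: change arbitrary, vendor-specific settings by name.
  set(name: string, value: boolean | number | string): Promise<void>;

  // Surface 3: enumerate the arbitrary settings this screen reader exposes.
  enumerate(): Promise<SettingDescriptor[]>;
}
```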
perhaps "process-bound" helps us understand our goals here better 19:39:26 Matt_King: i like the idea of process-bound. it helps communicate the idea very clearly of what we want 19:40:56 James Scholes: it's possible that screen readers say: this is a great idea, but tell Microsoft to go implement it... 19:41:43 mzgoddard: perhaps we should just focus on articulating these use cases with the vendors as early as possible. they might have just not had pressing use cases to motivate this kind of thing before 19:47:36 zcorpan: feel free to read and comment on the longer list of Milestones on the issue 19:47:46 zcorpan: in the remaining time, let's jump to Milestone 4 19:48:30 zcorpan: Milestone 4 is a list of commands for specific behaviors that we might want to implement 19:49:08 James Scholes: this was a confusing list to read because some of them are missing their directional counterpart... and some we didn't recognize. maybe we base this list on the published docs from each screenreader rather than parsing output from other tools 19:53:01 Matt_King: how much would ARIA-AT use the API commands vs. the keyboard presses? 19:54:04 James Scholes: we could deduplicate a fair amount of AT-specific test writing 19:55:10 https://a11ysupport.io/learn/commands 19:57:29 zcorpan: in general, milestone 4 is more in the realm of the web developer audience than ARIA-AT... we care more about the actual user inputs matching keypresses and we don't necessarily trust abstract APIs to match reality... 19:57:48 James Scholes: or someone could wright a utility library: map "next heading" to the right keypress 20:08:35 s3ththompson has joined #aria-at