W3C

– DRAFT –
ARIA and Assistive Technologies Community Group

05 October 2023

Attendees

Present
Hadi, howard-e, James_Scholes, jugglinmike, Matt_King, Michael_Fairchild
Regrets
-
Chair
Matt King
Scribe
jugglinmike

Meeting minutes

Matt_King: Currently, I'm still planning on having a meeting on Wednesday of next week

Matt_King: Though the Bocoup team will be out

Matt_King: It depends on whether or not I come up with more blocking issues related to finishing out the specification of things that the Bocoup team is waiting on

Matt_King: If nothing comes up by Tuesday, I may cancel next Wednesday's meeting

Matt_King: But I want to keep it on the calendar for now

Matt_King: There is no automation meeting on October 9th--that is also cancelled. The next automation meeting will be October 23

Update on ARIA-AT site changes

Matt_King: On Tuesday, howard-e and the team at Bocoup pushed out a huge update to the website

Matt_King: I wanted to take some time to walk through the changes with this group

Matt_King: They're really important updates, and I think it's important for people to be aware of them

Matt_King: The agenda includes a link which leads directly to the new "data management" page

Matt_King: This page is available to anybody who comes to the website--there's a different form for "signed out" users and for "signed in" users

Matt_King: Specifically, the view changes for "signed in" users who are admins

Matt_King: At a high level, this page is to help with version control of test plans

Matt_King: As we move a Test Plan through the working mode, we want to be able to see which version of which test plan is in which phase of the working mode

Matt_King: And Admins want to see what needs to be done to move each to the next step in the process

Matt_King: Right now, you can see there are 35 test plans that exist in the repository

Matt_King: A test plan can have more than one version active at the same time. That was previously very hard to see; this interface makes it much more clear

Matt_King: The page allows you to filter Test Plan Versions and narrow down

Matt_King: Right now, all Test Plan Versions are in either "R&D Complete" or "In Candidate Review". In the future, when we begin deprecating Test Plan Versions, that will be apparent, too

Matt_King: I'm really excited about this table. To me, it makes it really clear what's going on in the project.

Matt_King: The very first column, the "test plan" column, is a link. If you open one of those links, it opens a page that shows you information about all versions of that test plan that were ever created

Matt_King: ...Here, you can see information about the plan at a high level

Matt_King: ...Next, you have a version summary table.

Matt_King: ...Next, you have a list of all related GitHub issues.

Matt_King: ...Then there's a timeline for all versions. This shows the chronological order of how a Test Plan was developed

Matt_King: ...Then there's a series of sections which detail the history of each version

Matt_King: This is the "Test Plan Version History" page. It's an awesome piece of work. I love this page

Matt_King: Back on the "Test Plans Status Summary" table, the next column describes the "Covered AT"

Matt_King: Further along, in the "Candidate Review" column, there is a button labeled "Required Reports:". This opens a dialog which surfaces missing reports, if any

Matt_King: In that same cell, you can see the number of relevant open issues on GitHub (in the case of "Action Button Menu Example", there are three)

Matt_King: And it shows the target date for completing Candidate Review (for admins, this is rendered as a button which allows them to modify the target date)

Matt_King: The final column is for each Test Plan's "Recommended" version. We don't have any Test Plans with a version in the "recommended" phase right now, so there's nothing to see here at the moment

Matt_King: So that's the "Data Management" page!

Matt_King: The other changes in the app are largely related to version control

Matt_King: There are still more on the way

Matt_King: The goal is that the provenance of any information in the system is obvious no matter where you are

Matt_King: It's about transparency for the big picture, and giving admins the ability to manage the working mode

Matt_King: Big thanks to the folks at Bocoup for making this happen. This is a big turn of the crank

Michael_Fairchild: This is huge--thanks!

Matt_King: There are ten known bugs right now. I've put a link to those in the agenda. Some will be fixed in the next deployment

Matt_King: The next deployment is tentatively planned for Thursday of next week

Matt_King: One of the big changes coming in that deployment is a change to how we present reports

Matt_King: We'll spend some time walking through that in our meeting which follows that deployment, on October 19

Issue 809 - Define required AT versions for reporting on recommended test plans

github: w3c/aria-at-app#809

Matt_King: Last week, we talked about how when a test plan becomes recommended, that's when we want to start making sure that the data for the test plan remains current

Matt_King: That way, visitors to APG can get information about how the latest version of their screen reader is working with that test plan

Matt_King: We don't have the ability to show reporting history and trends on the site (that's something we'll talk about next year)

Matt_King: Last week, we decided that we can't keep up with doing that for every new browser version and every new AT version

Matt_King: We decided we would only keep up with new releases of AT

Matt_King: So I'm thinking about how we determine when/if AT versions are missing

Matt_King: In this issue, I'm proposing three possibilities as to decisions that we could make

Matt_King: There may be more options, I just wanted to have a starting point for a decision framework

Matt_King: The three options that I came up with are...

Matt_King: Suppose we marked a test plan as recommended today; let's limit our discussion to NVDA.

Matt_King: There are three options for "which versions of NVDA should have required reports?"

Matt_King: One option: since we first started with NVDA in 2022 (let's say), we should generate data for every version since then. (The rationale being that we started collecting for that version, and we don't want gaps in the data)

Matt_King: Another option: We use whatever AT version generated the latest approved report that we have

Matt_King: A third option: We only consider new versions released after the moment the report became recommended

Matt_King: In summary: (1) the version used to generate the earliest approved report, (2) the version used to generate the latest approved report, or (3) the first version released after the test plan becomes recommended
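The three options summarized above can be sketched as a small selection function. This is a hypothetical illustration only; the data shapes, names, and the assumption that options 1 and 2 include every release from their starting version onward are mine, not decisions recorded in gh-809.

```python
from datetime import date

def required_versions(approved_reports, all_releases, recommended_on, option):
    """Illustrative sketch of the three proposals in gh-809.

    approved_reports: list of (version, report_date), oldest first.
    all_releases: list of (version, release_date) for the AT, oldest first.
    recommended_on: the date the test plan became recommended.
    option: 1, 2, or 3, matching the three proposals.
    """
    versions = [v for v, _ in all_releases]
    if option == 1:
        # Every AT version since the one used for the earliest approved report.
        return versions[versions.index(approved_reports[0][0]):]
    if option == 2:
        # Every AT version since the one used for the latest approved report.
        return versions[versions.index(approved_reports[-1][0]):]
    if option == 3:
        # Only versions released after the plan became recommended.
        return [v for v, d in all_releases if d > recommended_on]
    raise ValueError("option must be 1, 2, or 3")
```

For example, with reports generated against NVDA "2022.1" and "2023.1" and a plan recommended on 2023-10-05, option 3 would require only releases dated after that day.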

Hadi: I'm having trouble following; I think seeing it in writing would help

Matt_King: I wrote this up in issue gh-809

Hadi: Great, thank you

Matt_King: I don't necessarily expect to make a decision today, but I definitely didn't want to make the call on my own. This affects the folks here and various external stakeholders

James_Scholes: I think we need to explicitly define what we mean by "recent" when it comes to browser releases and AT releases. We shouldn't ask Testers to make a judgement call about that

Matt_King: Sounds good

James_Scholes: I think the word should be interpreted in terms of the time that the Tester is done

Matt_King: Knowing the current major version of stable browsers will take some work

Matt_King: If we have this requirement, can the app figure it out, or will it be a human job?

howard-e: I think it might have to be a human job because otherwise, we may make false assumptions

Matt_King: Can the app use an API (e.g. provided by a browser vendor) to discover new versions?

James_Scholes: It has to be possible to some extent because Playwright knows how to download the latest browser

Matt_King: Do we know if we can do that for Firefox, Chrome, Safari, and Edge?

howard-e: I don't know for all of those. Most likely yes for Firefox and Chrome

James_Scholes: Edge may be difficult because of the way Edge is pushed out

Matt_King: I'll open an issue to perform an investigation into the feasibility

James_Scholes: We could do this together on a weekly cadence

James_Scholes: But we shouldn't ask someone to do this more often, so we'll still be in a position where we don't know for sure that we've identified the latest version available
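As one data point for the feasibility investigation: Mozilla publishes current Firefox version numbers as a machine-readable JSON feed (product-details.mozilla.org), and Chrome has a comparable VersionHistory API. A minimal sketch of consuming such a feed, using an inline sample payload shaped like Mozilla's `firefox_versions.json` rather than a live network call:

```python
import json

# Hypothetical sample payload, shaped like the feed at
# https://product-details.mozilla.org/1.0/firefox_versions.json
sample_payload = '{"LATEST_FIREFOX_VERSION": "118.0.1", "FIREFOX_ESR": "115.3.1esr"}'

def latest_stable_firefox(payload: str) -> str:
    """Extract the latest stable Firefox version string from the feed."""
    return json.loads(payload)["LATEST_FIREFOX_VERSION"]
```

Whether equivalent feeds exist for Safari and Edge is exactly the open question the proposed investigation would answer.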

jugglinmike: Recall that last week, we talked about setting policies in terms of release date, e.g. "any version released in the past 6 months"

jugglinmike: So the release data that we collect (whether via an automated process or a manual one) should probably include both the version numbers and the release dates

jugglinmike: We don't have to consider only software released within the time window

jugglinmike: Instead of saying "the version released in the past 90 days, or, missing that, the next version released"...

jugglinmike: ...we could say "the newest version available in the past 90 days" (meaning that even if there was no release, we would accept the version that Testers would plausibly have installed during that time frame)
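The sliding-window policy described above can be made concrete with a short sketch. The function name and data shapes are hypothetical; the point is that a version released before the window still counts if it was the newest one available when the window opened:

```python
from datetime import date, timedelta

def plausibly_installed(releases, today, window_days=90):
    """Versions a tester could plausibly have had installed in the window.

    releases: list of (version, release_date), oldest first.
    Returns the newest version available at the start of the window
    (even if it was released long before), plus anything released
    during the window itself.
    """
    window_start = today - timedelta(days=window_days)
    current = None   # newest version as of the start of the window
    in_window = []
    for version, released in releases:
        if released <= window_start:
            current = version            # still the newest at window start
        elif released <= today:
            in_window.append(version)    # released during the window
    return ([current] if current else []) + in_window
```

So with releases on 2023-01-01 ("1.0"), 2023-05-01 ("2.0"), and 2023-09-01 ("3.0"), a 90-day window ending 2023-10-05 would accept both "2.0" and "3.0", even though "2.0" predates the window.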

Minutes manually created (not a transcript), formatted by scribe.perl version 221 (Fri Jul 21 14:01:30 2023 UTC).

Diagnostics

All speakers: Hadi, howard-e, James_Scholes, jugglinmike, Matt_King, Michael_Fairchild

Active on IRC: howard-e, jugglinmike, Matt_King