W3C

Results of Questionnaire Evaluation Tools List Starfish Review

The results of this questionnaire are available to anybody. In addition, answers are sent to the following email address: dmontalvo@w3.org

This questionnaire was open from 2022-05-25 to 2022-06-06.

10 answers have been received.

Jump to results for question:

  1. Introduction
  2. Review level
  3. List of Evaluation Tools - List View
  4. Filter Assistant
  5. Tool Submission Process
  6. Proposed Features

1. Introduction

This is a Starfish Review survey for the Evaluation Tools List.

What you're reviewing is a mature draft. Estimated review time is between 40 and 60 minutes.

For this review please consider:

  • Are all functions and instructions clearly provided? Is anything missing?
  • Is there anything that should not be included?
  • Try to catch all significant issues in this review. (If you raise big issues later, they could be disruptive.)
  • If you have the time, please provide high-level word-smithing suggestions via GitHub Issues or Pull Requests.
Please note that the information provided in the tools themselves is not subject to review – this content will be added and updated by tool vendors in the publishing process.

Details

Responder Comments
Laura Keen
Brian Elton
Mary Jo Mueller I was mainly checking the IBM entries for correctness and that the content makes sense as presented. I have a small editorial update: "IBM Equal Access NPM Accessibility CHecker" should have "Checker" with the "h" lowercase. I tried to use the "Update information" link on that entry, which did nothing. Not sure if it doesn't work because it hasn't yet been promoted to the live site, or something else. A couple of odd things - not sure if they're bugs or intentional:

1) I found it odd that for our Karma tool there was a link labeled "Accessibility Statement", but it goes to the GitHub issue list for the tool. That's a mismatch; I would expect such a link to be called "GitHub issues" and to appear in a sentence like "Support: GitHub issues".

2) Last updated date - it isn't clear whether this is the last time the tool's entry on this page was updated, the last time it was verified as accurate, or the last time the tool had a new version available. Our tool is updated far more regularly than the 2020 date shown on the page suggests.

3) The "Show more details" link text doesn't change to "Show less details" when you've expanded it, so there's a mismatch between the "+"/"-" icon indication and the text shown.

4) What is considered a "simulation"? The only simulation tools shown are ones that have visual simulations for color blindness. Is showing the tab order, or showing the nesting of headings and/or landmark regions with their programmatic labels (what a screen reader user might see when listing those in their screen reader tool), considered a simulation? Would a tool that shows what a screen reader might say (like some name/role/value examination tool), or that gives text output of what a screen reader would announce for some content, be considered a simulation tool (e.g., JAWS Inspect)?

5) "Compliance" is used instead of "conformance" in several of the tools' descriptions.
Kevin White
Sylvie Duchateau
Michele Williams
Carlos Duarte
Howard Kramer
Sharron Rush
Brent Bakken

2. Review level


Summary

Choice  All responders
I reviewed it thoroughly. 5
I skimmed it. 3
I pass on this review and will not raise big issues later. 1
I need more time, and will complete this by the date in the comment field below. 1


View by responder

Details

Responder  Review level  Comments
Laura Keen
  • I reviewed it thoroughly.
Brian Elton
  • I reviewed it thoroughly.
Mary Jo Mueller
  • I skimmed it.
While I skimmed for IBM's entries and spot-checked a few features, I paid attention to the structure and presentation of the information, aspects of filtering, etc., and commented with my thoughts.

Kevin White
  • I reviewed it thoroughly.
Sylvie Duchateau
  • I skimmed it.
Michele Williams
  • I reviewed it thoroughly.
Carlos Duarte
  • I reviewed it thoroughly.
Comments provided on GitHub
Howard Kramer
  • I need more time, and will complete this by the date in the comment field below.
6/10/22
Sharron Rush
  • I skimmed it.
Brent Bakken
  • I pass on this review and will not raise big issues later.

View by choice

Choice  Responders
I reviewed it thoroughly.
  • Laura Keen
  • Brian Elton
  • Kevin White
  • Michele Williams
  • Carlos Duarte
I skimmed it.
  • Mary Jo Mueller
  • Sylvie Duchateau
  • Sharron Rush
I pass on this review and will not raise big issues later.
  • Brent Bakken
I need more time, and will complete this by the date in the comment field below.
  • Howard Kramer

3. List of Evaluation Tools - List View

Do you have any feedback on the current draft of the list view and filters of the List of Evaluation Tools? This can include feedback and proposed changes to the design and text within the interface.

Please review the list view and filters

  • Is the UI clear?
  • Are the filters well described and usable?
  • Is there anything in there that should not be?

Please provide your comments in the box below or in the GitHub issue for the Eval Tools List.

Details

Responder Comments
Laura Keen The filter assistant is not in the tab order and does not appear to be keyboard accessible. I'm also not able to open/close the more/less navigation or the info "i" icon. See GitHub issue #83.
Brian Elton I am fine with the content of the filters, but have technical concerns.
- Keyboard-only users are not able to expand/collapse filter groups, view tooltips, or activate the "more/less" controls. (This could also affect screen reader users, but I did not test that.)
- The language filters include the name of each language in its own language, but a lang attribute has not been set for each change in language.

In the list view:
- The "Show more details" control breaks once activated via keyboard. The section expands, but the text does not update to "Show less details", and there are styling issues. There is also a reading order issue: when the section is expanded, focus stays on the "Show more details" control, which is now below the newly exposed content, so screen reader users would need to know to navigate backwards to reach it.
- The icon <img> elements are missing an alt attribute.

There are other issues, so a thorough accessibility audit should be performed on this tool. I am happy to help with this and advise on any technical aspects.
Mary Jo Mueller My thoughts:

1) The filter for automated testing doesn't show any of the IBM tools, which makes me wonder what constitutes an automated tool in the context of this page. On our website we call using our tools to test content the automated part of testing, so it seems odd that this page doesn't consider our tool an automated tool (for the filter).
2) The use of "Guidelines" rather than "Standards" as a category: isn't WCAG really a standard? Aren't all W3C Recommendations standards? Everything else listed there is a standard.
3) Curious why the Stanca Act (which is a law in Italy, not a standard or anything providing technical requirements, to my knowledge) is listed here at all. To my knowledge - correct me if I'm wrong - the Stanca Act is Italy's accessibility law, and it requires WCAG for conformance. I don't think it adds any additional technical requirements, unless it involves some sort of accessibility reporting format.
4) Not sure why WAI-ARIA isn't listed in the Guidelines list. Our tool takes great care in validating, as much as possible, well-formed WAI-ARIA.
5) I don't really understand what "Limited" means under "Paid or free" - limited free functionality? When I look at the entries that appear with that filter, there's no explanation of what this means in terms of functionality.
6) There are an awful lot of filters. Are all of them necessary? Some items return zero entries even with all filters turned off. Could there be logic that removes those from the displayed list?
Kevin White All looking good. Couple of points raised in GitHub
Sylvie Duchateau 1. To the editor's discretion:

Maybe it could be helpful to remind the reader that tools are an aid to evaluation, but that they should not be used as the only tool to evaluate a page or an application.

2. In the page structure, I could find a way to browse from one tool feature to the others, but I cannot find a way to browse from one tool to another. Maybe add an H3 heading to reach each tool?
Michele Williams Added and commented on issues in GitHub related to the UI and functionality of the page (Issues 91, 92, 85, 83)
Carlos Duarte
Howard Kramer
Sharron Rush * The UI was unclear. I found the process a bit confusing, but that is not uncommon for me with a new UI. I am easily overwhelmed by an array of choices like this one. Upon landing there are 5 or 6 boxes or areas of seemingly unrelated content, and my eyes jump around among them before I know where to start. Why is there not a clear heading? The box with "Information on this website is provided by vendors. W3C does not endorse specific products. See Disclaimer" interrupts the flow of the introductory paragraph. I am just not sure where to start.
* The filters are well described and fairly usable (though it's quite a long list, and I'm not sure why some are expanded and some are not).
* Not sure if there are things in there that should not be, but nothing jumped out.
Brent Bakken

4. Filter Assistant

Please use the button to open the filter assistant and consider its purpose and function.

  • Is the purpose clear?
  • Is the guidance provided useful and clear?
  • Does it work as expected?
  • Is anything missing or confusing?

Please provide your comments in the box below or in the GitHub issue for the Filter Assistant.

Details

Responder Comments
Laura Keen The content of the filter assistant is clear and usable. I'm concerned about the accessibility of the widget. I tested with my keyboard but not with a screen reader. See issue #83, as mentioned above.
Brian Elton I think the filter assistant is great! Technical issues exist, though, so a thorough accessibility audit should be performed.
Mary Jo Mueller The filter assistant seems to be missing a lot of the tools in what it is filtering, which is quite confusing. If you look at the filtering in the "Paid or free" category, it looks like the database has 171 tools (assuming the options are mutually exclusive). The filter assistant at the bottom says "Show 156 results", but the checkbox item counts only add up to 25. IBM's tools don't show up when checking any of the checkboxes, though all of them are used to test websites; axe doesn't show up either. So there's some disconnect in how tools are identified by their functionality.

The terminology "Product to evaluate" and "What product should the tool be able to test" seems very specific. I don't consider documents "products", nor are websites "products". I would think "Technology" might be better than "Product".

The purpose screen uses the term "compliant" where it should say "conformant" or "conforms". Again, on this screen a total of 19 tools are in the filter checkbox list, where there are 156 tools in total.

It seems the assistant is basically taking you through the different categories of filters in a different way, so I'm not sure it's super helpful, unless there are users who don't want to scroll through a huge list of filters.
Kevin White No problem with the purpose of the assistant. I would be keen to understand usage of this going forward, as I don't know that it does much more than the existing filters.

One issue raised in GitHub
Sylvie Duchateau Should the link "filter assistant" be a button? It is announced as a link by JAWS here.

It is confusing to me, as it seems to be the same page as the one indicated in question 3.

Otherwise, no comments on the filter.
Michele Williams Added a commentary in GitHub (Issue #93)
Carlos Duarte
Howard Kramer
Sharron Rush * The purpose seemed to be to simplify choosing the search filters - is that right?
* Yes quite clear
* Yes
* I am confused by the results and the repetition of icons; they bother me (but all icons do, so that is just me).
Brent Bakken

5. Tool Submission Process

We have added and updated the tool submission process. Submitting a tool through the form will send the information to GitHub, where the list maintainer and editors can check the submission for completeness and validity. They may contact the tool vendor to provide extra information during this process. Once the submission is approved, it is published to the list of tools.

Please review the Eval Tool submission form and feel free to use it to submit a test entry. Please consider:

  • Is the process clear?
  • Is it appropriately designed?
  • Is there anything that should not be included?
  • Is anything missing?

Please provide your comments in the box below or in the GitHub issue for tool submission.

Details

Responder Comments
Laura Keen The add/remove feature and language buttons do not work when I press Enter; I had to use my mouse. See GitHub issue #84.
Brian Elton This form is set up so that all fields are required except where noted, but that instruction is not present before the form fields. This is also the opposite of the Course Submission form (https://wai-course-list.netlify.app/courses/submission/), where all required fields have (required) in their label. I think this form should follow the pattern used in the course submission form, but if that is not possible, users should be told in text that all fields are required except where noted.
The required fields also make submitting the form more difficult for someone who is merely suggesting a tool. Someone suggesting a tool may not know the release date or the date of the most recent update, but those fields must be filled out. There also shouldn't be an onus on the suggester to list all of the various aspects of the tool, as that could be a barrier. Perhaps there should be a toggle to indicate that the tool is being suggested rather than added by the tool vendor, and the information requested could then be reduced. Or, which fields are required could change, giving users the opportunity to provide more info if they want without requiring it for simply suggesting a tool.

There are also some technical accessibility concerns, so this form should go through a thorough audit.
Mary Jo Mueller Use U.S. English spellings, e.g., "organisation" should be "organization".

Need to provide the actual URI of the EOWG resource on accessibility statements.

WCAG 1.0 should not be listed, as it is a withdrawn standard.

The information icons on the "Guidelines" category don't work.

The "Type of license" doesn't fully align with the "Cost" filters.

"Implements ACT Rules" information is not gathered, which makes that filter unnecessary.


Kevin White Easy process, good design, can't think of anything missing.

Couple of technical things dropped into GitHub.
Sylvie Duchateau No comments on that part
Michele Williams Opened Issues 94, 95, and 96, and commented on Issue 89
Carlos Duarte
Howard Kramer
Sharron Rush Seems fine, but I urge us to look at the accessibility errors before publishing.
Brent Bakken

6. Proposed Features

The following is not yet implemented in the draft. We would appreciate your consideration of the proposals and any feedback you may have.

The team is currently working on a streamlined process for updating a tool. Tool vendors may receive a link that automatically pulls information from Git and fills in the form, so the vendor can update relevant information and re-submit the tool for review. The purpose is to make it easier for vendors to maintain current information on the list.

We are also working on a much simpler form for tool users (rather than vendors) to suggest a tool to add to the list. This way tool users won’t have to recall the technical details about the tool, and the list maintainer/editors will be able to contact the tool vendor to suggest submission.

Please provide your thoughts on the usefulness of these plans. Thank you!

Details

Responder Comments
Laura Keen I think these features could be useful for vendors and tool users.
Brian Elton
Mary Jo Mueller I think it would be a nice feature to have our tool's information pulled from Git. It is open source, so easy to access. Additionally, the page would reflect the most current information without our having to remember to go review it and request an update.
Kevin White Fantastic bit of work!!
Sylvie Duchateau No opinion for the moment.
Michele Williams Sounds great overall. I'm wondering about the placement of the "Suggest a tool" link - currently it's on the submission path but will it also appear somewhere on the list of tools? Otherwise, all makes sense to me!
Carlos Duarte 1 - Nice idea. I fully support exploring this.

2 - Could be interesting. Not sure whether it places too much burden on the tools list maintainers.
Howard Kramer
Sharron Rush I am a bit doubtful about people having a form to submit suggestions in this way. For one thing, does WAI really have the resources to follow up? If the answer is an uncertain "Yes" and the team wants to do it, I am not opposed.
Brent Bakken

More details on responses

  • Laura Keen: last responded on 27, May 2022 at 13:12 (UTC)
  • Brian Elton: last responded on 27, May 2022 at 14:17 (UTC)
  • Mary Jo Mueller: last responded on 27, May 2022 at 21:54 (UTC)
  • Kevin White: last responded on 31, May 2022 at 14:36 (UTC)
  • Sylvie Duchateau: last responded on 1, June 2022 at 14:43 (UTC)
  • Michele Williams: last responded on 2, June 2022 at 15:34 (UTC)
  • Carlos Duarte: last responded on 3, June 2022 at 14:23 (UTC)
  • Howard Kramer: last responded on 6, June 2022 at 04:31 (UTC)
  • Sharron Rush: last responded on 6, June 2022 at 19:43 (UTC)
  • Brent Bakken: last responded on 6, June 2022 at 22:03 (UTC)

Non-responders

The following persons have not answered the questionnaire:

  1. Eric Velleman
  2. Andrew Arch
  3. Shawn Lawton Henry
  4. Shadi Abou-Zahra
  5. Kazuhito Kidachi
  6. Jedi Lin
  7. David Sloan
  8. Vicki Menezes Miller
  9. Reinaldo Ferraz
  10. Bill Kasdorf
  11. Cristina Mussinelli
  12. Kevin White
  13. Kevin Rydberg
  14. Adina Halter
  15. Denis Boudreau
  16. Sarah Pulis
  17. Bill Tyler
  18. Gregorio Pellegrino
  19. Ruoxi Ran
  20. Jennifer Chadwick
  21. Sean Kelly
  22. Muhammad Saleem
  23. Sarah Lewthwaite
  24. Daniel Montalvo
  25. Mark Palmer
  26. Jade Matos Carew
  27. Sonsoles López Pernas
  28. Greta Krafsig
  29. Jason McKee
  30. Jayne Schurick
  31. Billie Johnston
  32. Shikha Nikhil Dwivedi
  33. Julianna Rowsell
  34. Tabitha Mahoney
  35. Fred Edora
  36. Rabab Gomaa
  37. Marcelo Paiva
  38. Eloisa Guerrero
  39. Leonard Beasley
  40. Frankie Wolf
  41. Supriya Makude
  42. Aleksandar Cindrikj
  43. Angela Young

