UT May 2015/Session Feedback

From Education & Outreach

Feedback on Specific Tools

Whiteboard Images

Here is a collection of whiteboard images taken while we discussed usability testing at the F2F last week. Transcription is needed, especially for the QuickRef. Volunteers, sign up below:

Note that some images show the same content from different angles! (Comment: I don’t think we need those transcribed individually. {Eric, 2015-05-22})

I can transcribe (at least some of) the whiteboard images:

  • [Name, list of images to transcribe by Date]

Quick Ref Whiteboard

QR Image 1

Columns: Save | Filters | Expand/Collapse | Layout | Look and Feel
What does? Want deselect all – category

- whole ? Want recheck all

Speed Keyboard close Only …? ☺

Filter status looks like input - truncated - “useless” Devices Laptops?

Back and forth to page

…? that filters worked

Multiple types and filters

Filter out SC (e.g. no video) and parts of

Clear all made her nervous

Never saw it

Side panel – mixed

…? … too busy + ?

Wants all collapse first

Hide/reveal techniques ☺

All vs sufficient …?



Terminology:

Non-user friendly jargon – toggle ?::


Presentation of failure links

Good and bad examples

Advisory?

Simple language

Menus not sticky

Page too wide

Show techniques 1.1 vs 1.1.1

1.1 Text alternative – hide description

How techniques shown ☺

Only buttons ➢ didn’t see filter

Level A too far on right

Clear filters location ☹

Technique button placement ☹

Lefthand menu ☺ …?

Seems more ….?/ phone friendly

Link icon > on right

Drop down

Differentiate + V Not noticed right away

Save icon is for download

Blue too stark for 1.1 level areas; else liked it

Not clean this is nothing new

Definition links ☹

Contrast levels ☹

Heading not bold enough

Seems like made only for blind users

Pale and dark blue obsession ☹

Reinforced a11y design can’t look good

Liked focus on ….? Dark

Inconsistent link treatment

User typed on filters box

QR Image 2

Columns: WCAG | Ideas
Abstract terms

Long technical

Some guidelines have own techniques

What is a technique?

G92 means nothing to me

Links in text of WCAG

Failure techniques too wordy

Contrast minimum vs ???????

CAPTCHA don’t understand in 1.1.1

How to use WCAG docs

Video/tutorials in 3 mins

QR Image 3

IDEAS

How to use WCAG docs

Video/tutorials in 3 mins


QR Image 4

This image reflects a side view of the whiteboard.

QR Image 5

This image shows the last three columns: "Look and Feel", "WCAG", and "Ideas". The Image 1 transcript contains these columns.

QR Image 6

Columns: Personas | Search | Save
New to a11y

New to WCAG < SOS

Experienced SME:

Web developer

UX designer

UX analyst


Workflow: Analyst ☺

WCAG

Job

Open links in tabs

Links to relevant tasks e.g. color contrast

Print – include SC + techniques

Permalinks to share

What if you don't know what to search?

Few used it by default

Could see it

What hidden mean?

Ctrl-F

Searched for ARIA

Thought global for WAI

Go when I say go

Fast scroll scary

Liked idea for all in one

Make next button

Skip past visible

What does?

QR Image 7

Same as above

QR Image 8

QR Image 9

QR Image 10

Columns: Functionality | Terminology
Intro overlay

Reminds them of CIF format

Not all evaluators would ..???

Web tech doesn't look like a ?????

How hide/show pages works

Generate a conformance claim

Categorize results into sub-reports

Have an option between ??? + pers

Indicate severity/impact

Want to filter out/not present items

Good match to work flow of evaluator

Extra time invested worth it based on output

Want an easier way to export, distribute via e-mail

Want to merge multiple files

Hide/show indiv. page tests confusing

Fail 1 page should fail whole SC

Better summary ? for a SC

Screen shots + videos

Likes options

Essential function page

Show me examples

Export to excel or ??

Expand all pages under with SC

Save button each page …?....?

Exploration notes

Specification URL

Too much text - step…? - Start page A11y support baseline

Scope of web site vs additional reqs

Additional eval reqs

i. 3rd bullet – put …?... scope

Cumbersome, many terms ….? - variety - randomly selected

Optional exploration notes?

Evaluation Commissioner?

Load prior report

Experienced person understood

Not present > unclear ???

Simplify instructions

“In this step, you will…”

Web tech relied upon misunderstood


WCAG-EM Report Tool Whiteboard

WCAG-EM Image 1

  • Personas
    • Eval commissioner
    • Experienced evaluator (4 checkmarks, one with asterisk)
    • Minimalist evaluation
    • UX person w/minimal a11y knowledge
  • General
    • Good tool for proper persona (not me)

WCAG-EM Image 2

  • Functionality
    • Intro overlay 💡
    • Generate conformance claim?
    • Web tech used: doesn't look like/sometimes doesn't work like a dropdown
    • Reminds them of CIF format
    • Not all evaluators will know what technology was used
    • How hide/show people works
    • Categorize results into SB(illegible) reports
    • Have option between fail/pass
    • Want to filter out not present items
    • Indicate severity/impact
    • Good match to work flow of one evaluator who said the extra time invested in set up is worth it based on output.
    • Want easier way to export and distribute to others via email.
    • Want to merge multiple files
    • Fail of one page should trigger fail of entire SC (4 check marks)
    • Better summation for SC
    • Want to save screen shots (3 check marks) + video
    • Like options
    • Export to excel or bug marker
    • Essential function - (unreadable) ?
    • Show me examples (2 check marks)
    • Expand all pages under all SCs (2 check marks)
    • Save button on each page (Looked at the bottom) 2 check marks
  • Terminology
    • Exploration notes?
    • Specification URL? (2 check marks)
    • Too much text (3 check marks)
      • step 4 intro
      • start page
    • A11y support baseline - what does it mean (7 check marks)
    • Scope of website vs additional requirements (2 check marks)
    • Additional eval requirements. info circle w/3rd bullet, put under Scope
    • Cumbersome, many unclear terms (2 check marks)
      • variety (1 check mark)
      • randomly selected
    • Overly technical page structure and terms
    • Optional exploration notes? Tasks?
    • Evaluation commissioner ?
    • Use "Load" rather than "Open" for prior report
    • Experienced person understood
    • Not present: unclear; better than N/A
    • Simplify instructions. "In this step you will accomplish this..."
    • Web tech relied upon misunderstood

WCAG-EM Image 3

  • Expectations
    • Explain what it does (2 check marks)
    • Users expected it to perform test for them (5 check marks)
    • Why random pages?
    • Not clear what the tool does for them
    • [Need suggestions, illustration, examples, annotation. You will do THIS -> get THIS]
    • Expected tool to walk them through testing process
    • Does this replace my current process? What is the value proposition?
    • Highlight does not save
    • Support (illegible)

WCAG-EM Image 4

  • Look and Feel
    • Focus on ???/save buttons not clear
    • chain link icon noted but not understood in relation to text
    • Like L & F
    • Shine it up!
  • Layout
    • audit page surprisingly long (not prepared!)

WCAG-EM Image 5

  • Behavior
    • Skipped intro (3 check marks)
    • Went straight to "How Tool Works"
  • WCAG-EM
    • (illegible)

Roadmap Whiteboard

This tool feedback has been transcribed; find details on the Session Feedback/Roadmap page of this wiki.

Feedback template

Facilitator: your name

Resource reviewed: resource name

Participant details:

  • Provide a few
  • Bullet points
  • Describing the participant

Key themes:

  • Record some
  • Key themes
  • From the
  • Session
  • Five or so is not unreasonable

Specific findings:

  • Record any specific findings that are worth noting
  • For example, if wording on a button caused some specific problems
  • Or, for example, if the participant made a suggestion worth noting

Session 1

Facilitator: James Green

Observer: Shawn Henry

Subject self-described as fairly familiar with WCAG 2.0, though his job is to help projects become WCAG 2.0 conformant and he appeared quite familiar.

He goes to the standards, understanding, and how-to-meet docs regularly

Works with staff trying to get a11y approval on their code; he audits, researches, and shares links to the WCAG docs to illustrate his points of view.

Observations and quotes:

  • left hand nav is good
  • likes how whole 1.1 level is dark blue, separating it as a "heading"
  • likes that focus on elements is darker not lighter
  • doesn't like the old way of having 3 separate doc sites; likes this all-in-one model: "I don't care that one is an advisory, give me everything I need"
  • sees inconsistency in links - some are underlined, some are just blue
  • says make sure links go places and buttons do things, like open hidden content, etc.
  • observation about WCAG text - thinks CAPTCHA should not be a link under 1.1.1. He says that there is so much more than CAPTCHA now, so it should be a more general term (not a link, to improve visibility, parallel structure, and reduce confusion) that covers CAPTCHA and anything that proves you are human
  • the sufficient/all techniques buttons are acting odd - he thinks that when you click all techniques, but all of them are sufficient, the heading doesn't change to all, so it looks like the button didn't work.
  • he doesn't search much, likes to browse; that's what he did
  • I asked him to search, and when the page scrolled and highlighted words yellow he was surprised and didn't like it. He did not like that it moved without him hitting enter ("don't do something I haven't asked for") and found the scrolling/flying by disorienting.
  • he noticed the note of how many results are visible and hidden, but didn't understand what hidden meant. He thought it meant below the fold, not hidden in a collapsed section. He wasn't too upset, saying "once I know... but I wasn't expecting it"
  • suggested that instead of next going to the next result, which might be in the same section, a next button could take you to the next result not visible on screen, like skipping ahead by a certain number
  • did not know the chain icon means external link
  • completely overlooked filters (though he normally wouldn't want to use them anyway)
  • when in filters, he understood the "only" buttons right away
  • once you close the filters, he said the bar showing what's been filtered out is useless info to him and not very helpful; it's truncated...
  • didn't like idea of a side panel for filter: "might be a bit busy"
  • did like filtering functionality
  • would like a reset button inside the modal or a select/deselect all button
  • overall he said this site would be good for sharing info with others, for new people, it would be better/faster in the long run
  • said it seems more tablet/phone friendly
  • if he printed, it would be to share info, and he'd want general criteria plus techniques.

Session 2

Facilitator: Kevin

Resource reviewed: Quick Ref

Participant details:

  • Web administrator
  • Developer
  • Uses WCAG
  • Interested in finding examples, samples, and best practice materials

Key themes:

  • Useful to filter by role
  • Filters hugely useful but not clear what is being filtered
  • Checklist functionality might be helpful

Session 3

Facilitator: Kevin

Resource reviewed: Quick ref

Participant details:

  • UX designer
  • Visual and interaction design
  • Background in front-end programming
  • Looking for best practices
  • Need WCAG but hate standards
  • Used Quick Ref - bit cumbersome with too much everywhere

Key themes:

  • Liked the resource title
  • Unclear how the tasks and components filters might affect the document
  • No idea what 'Techniques' are
  • Want to be able to limit to issues relevant to particular development task e.g. forms
  • Filtering out is hard because too many items {SC's} are left

Specific findings:

  • Want to be able to filter out anything to do with audio/video

Session 4

Facilitator: Kevin

Resource reviewed: Accessibility Roadmap

Participant details:

  • Accessibility lead
  • Involved in training, advocacy, advisory, strategy development
  • Lacking code library and testing tools
  • Huge number of developers and content managers to train
  • Involved in developing an accessibility culture
  • Enterprise level organization

Key themes:

  • Business plan may be too early in the process - more important to explore and understand the motivation for the organization to connect meaningfully to stakeholders
  • Accessibility champion is critical; may not be knowledgeable but empathic and will keep the budget coming in
  • Communicating early when no resources are in place does not help engage people
  • Level of information is almost too basic
    • Missing things like risk analysis as part of obtaining support
    • Missing connecting accessibility goals to broader business goals e.g. adding captions to help foreign language students
    • Need to alert people to the resistance they will face and what tools might help: case studies, examples, specific process improvements
  • Would be helpful to have a diary or personal checklist of activities to make it more active

Specific findings:

  • 'Prepare business case' is too close to 'Develop business case'
  • Agreement to prepare a policy is needed before policy is written; without agreement it is an unfunded mandate
  • Suggested titles: Planning Book, Getting Started Checklist, Accessibility Planning Checklist

Session 5

Facilitator: Kevin

Resource reviewed: Report tool

Participant details:

  • Part of programme management
  • Writing policy
  • Involved in reviewing forms used throughout organization
  • Various form formats: HTML, PDF, Word
  • Has an internal checklist for preparing forms: design details, little accessibility information

Key themes:

  • Purpose of the tool is unclear
  • Might look for more information on WCAG-EM but only after using for a while

Specific findings:

  • 'Website name' not clear; thought it was for URI
  • If evaluator is not familiar with code, then 'Web Technology' makes little sense
  • In step 3 when adding sample, title is 'Add web page', name is 'Short name', URI is 'Address (URL) or description', then button is 'Add web page'. Caused some confusion having all different labels

Session 6

Facilitator: Kevin

Resource reviewed: Quick ref

Participant details:

  • UX practitioner
  • Very experienced

Key themes:

  • Looking for solutions to specific problems
  • Connect techniques to aspects of the success criteria
  • If there is a top-down understanding of WCAG then the tool is good. If 'in the weeds' solving problems, then it is of no use
  • No relationship between the Techniques filter and what happens with content
  • Making 'components' more specific e.g. to 'checkbox' would be useful
  • Wanted this to be problem-led, not guideline-led

Specific findings:

  • Technique number is a database key - not useful for actual users
  • 'Only' was really nice
  • No check all or clear all filters in the modal window
  • Hidden items in search are confusing - why would you show me what I can't see? [highlight techniques containing hidden items]
  • Missing 'media' from filters
  • "What did I filter in?!" - filtered out is useful when few items removed; filter in is useful when few items included.
  • 1.1.1 is rubbish for filtering

Session 7

Facilitator: Kevin

Resource reviewed: Quick ref

Participant details:

  • Developer
  • JSP, HTML, Java, XML
  • Needs tools to help identify accessibility issues
  • Aim to have set of rules to follow when developing and then evaluate

Key themes:

  • 'Techniques' respond to the need for examples
  • Looking for easy definitions for some of the terminology
  • Not seeing the impact of particular filters
  • Filters affect techniques and SCs differently

Specific findings:

  • No 'audio/video' filter
  • Looking to generate a PDF from an XML file, so the examples in the PDF techniques were of little value

Session 8

Facilitator: Howard

Resource reviewed: Quick ref

Participant details:

  • Familiar with WCAG
  • Usability/UX expert

Key themes:

  • Felt this resource reinforces the idea that she argues against – that accessible content can’t look good
  • Felt it was designed for the blind user – much would be missed by visual user
  • Wished all this stuff could be written in more understandable language - written in a way that a high school student could understand, with clear examples
  • Would like to see examples of good and bad - & some code snippets
  • Links to tools for every guideline would be helpful

Specifics:

  • Didn’t understand link icon.
  • Light blue color for techniques didn’t catch her eye – also because it was on the bottom
  • Needed more bolded text for headings
  • Listing “all techniques” was confusing because 1st one listed was “sufficient techniques”
  • Level tag indicator – too far to the right
  • Suggested “more detail” or “more information” instead of “understanding techniques”
  • “too many links in the paragraph”
  • “Clear All” – didn’t know what it would do – made her nervous to select.

Specifics for Task: searching for “color contrast”

  • Not sure what “advisory techniques” are
  • Strange that contrast (minimum) wasn’t next to contrast (enhanced)
  • “Failures” under techniques not what she expected
  • Didn’t think location of “clear filters” was a good place for it.
  • Didn’t like appearance of the “understanding techniques” button on top of the “techniques” button