W3C

- DRAFT -

User Agent Accessibility Guidelines Working Group Teleconference

19 Feb 2015

See also: IRC log

Attendees

Present
Jeanne, Jan, Jim_Allan, Kim_Patch
Regrets
Greg
Chair
Jim Allan
Scribe
allanj

Contents

Topics
1. Publishing
2. Exit Criteria
Summary of Action Items

<trackbot> Date: 19 February 2015

<scribe> scribe: allanj

open item 1

Publishing

js: after housekeeping, publish the week of Mar 16, with 3 weeks for comments

close item 1

open item 2

Exit Criteria

https://lists.w3.org/Archives/Public/w3c-wai-ua/2015JanMar/0036.html

current wording:

1. Define test cases: Complete a set of tests needed to cover all UAAG normative success criteria.

perhaps "update" instead of "complete"

2. Test Implementations: Perform these tests and verify independent results on user agents, and to the extent needed, combinations of user agents and extensions.

missing coverage requirement: need at least 2 implementations

<jeanne> 2. Test Implementations: Perform these tests and verify at least two independent results on user agents, and to the extent needed, combinations of user agents and extensions.

discussion of extensions. need to add an informative note about extensions being free, kept updated, having an accessible installer, etc.

Note on Independent Implementations:

a) Success criteria that rely solely on the rendering engine of the user agent (e.g. x.x.x) will need implementations using two different rendering engines.

added examples

b) Success criteria that rely solely on the user agent user interface (e.g. x.x.x) may have implementations using the same rendering engine.

<jeanne> b) Success criteria that rely on the user agent user interface (e.g. x.x.x) may have implementations using the same rendering engine.

jr: overarching principle: any feature needs 2 implementations (UI) and 2 resultant rendering implementations

<jeanne> Implementations of features that satisfy a specific success criterion (plugins, extensions or user agents) must be from different code bases in order to be considered independent.

if you have Chrome and Opera (2 different UIs), they both use the Blink rendering engine

the test is to see if they allow the user to set something (a UI)

we assume that the setting will be rendered appropriately

we will still need to verify the rendering

<Jan> The goal is to have independent implementations for each success criterion, while taking into account that other software components (those not connected to the success criterion being tested) may be shared.

example: 1.8.3 Provide Viewport Scrollbars:

When the rendered content extends beyond the viewport dimensions, users can have graphical viewports include scrollbars, overriding any values specified by the author. (Level A)

1.4.1 Text Scale, Color, Font (Globally):

1.4.2 Text Size, Color and Font (by Element):

<jeanne> b) Success criteria that rely on the user agent user interface (e.g. 1.4.1, 1.4.2) may have implementations using the same rendering engine. In 1.4.1, where the user can set a preference for a default font size, the independent implementations are the user interfaces where the preference can be set. The rendering can share a code base, since that is not what is being tested in the success criterion.

<Jan> The goal is to have independent implementations for each success criterion, while taking into account that other software components (those not connected to the success criterion being tested) may be shared.

<Jan> For example:

<Jan> - Success criteria that are implemented at the rendering engine level must be demonstrated by browsers that use different rendering engines.

<Jan> - Success criteria that are not implemented at the rendering engine level can be demonstrated by browsers that share a common rendering engine.

<Jan> - Success criteria that are implemented by extensions can be demonstrated by two independent extensions to the same user agent.

suggested replacement for #2: Success criteria that are implemented at the user interface level (set by the user) can be demonstrated by browsers that share a common rendering engine.

<jeanne> Success criteria that are implemented at the user interface level (set by the user) can demonstrate the results with browsers that share a common rendering engine.

<Jan> For success criteria that are implemented at the user interface level (set by the user), independent results can be demonstrated by browsers that share a common rendering engine.

<Jan> For success criteria that are only implemented at the user interface level (e.g. x.y.z), independent results can be demonstrated by browsers that share a common rendering engine.

1.8.3 Provide Viewport Scrollbars:

1.8.13 Allow Same User Interface:

- Success criteria that are implemented at the rendering engine level must be demonstrated by browsers that use different rendering engines.

- For success criteria that are only implemented at the user interface level (e.g. something set by a user), independent results can be demonstrated by browsers that share a common rendering engine.

- Success criteria that are implemented by extensions can be demonstrated by two independent extensions to the same user agent.

Chrome and Opera cannot both be used for what?

js: maintain point of regard, indicate unrendered content

jr: how big is this issue?

js: HTML had to have different rendering engines

<jeanne> http://www.w3.org/TR/2011/CR-wai-aria-20110118/

<jeanne> http://www.w3.org/WAI/ARIA/1.0/CR/implementation-report

<Jan> http://www.w3.org/WAI/ARIA/1.0/CR/implementation-report#test_results

Blink is used in Chrome starting at version 28, Opera (15+), Amazon Silk and other Chromium-based browsers, as well as Android's (4.4+) WebView and Qt's WebEngine.

Web browsers using Gecko include Firefox, Waterfox, K-Meleon, Lunascape, Pale Moon, Portable Firefox, Conkeror, Classilla, TenFourFox, HP Secure Web Browser, Oxygen and Sylera (for mobile).

Browsers using Trident: IE, AOL, Avant, Nintendo, QQ, etc.

WebKit browsers: Dolphin, Safari, iCab, webOS.

if we choose 4 rendering engines and 4 UIs, are we covered?

how do we tell which SCs are in the UI and which are in the rendering engine?

<jeanne> http://w3c.github.io/UAAG/UAAG20-Reference/

<jeanne> JR suggests listing the few SC that must have two separate rendering engines

d) Implementations (plugins, extensions or user agents) must be from different code bases in order to be considered independent.

kp: 2 separate plugins for the same UA doesn't work?

continue with d) "Implementations (plugins, extensions or user agents) must be from different code bases in order to be considered independent" and finalize the exit criteria

Summary of Action Items

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.140 (CVS log)
$Date: 2015/02/19 19:37:33 $
