See also: IRC log
<Kathy> https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/Failure_of_2.5.3
Kathy: I won't be on the week of the 18th but I'll try to have understanding written by then and take a look at feedback when I get back
Kathy: on wiki
<Detlev> Shouldn't "due to all content and functionality not being available" read "due to not all content and functionality being available"?
<Jan> What about "Failure of 2.5.3 Content or functionality not available by touch gesture when built-in assistive technology is active"
Kathy: sections under each failure where it says "a better solution would be to" – we don't do that in any other failure that I can see. A failure usually indicates what the error condition was, not a better way of actually doing it
Jon: there are a number that do that
Kathy: below examples, above resources
David: we were nervous that people would look at the failures as ways to do things properly and get all mixed up
Kathy: the first one is the only one I saw it in. I think it's good to have it in there, but I'm wondering if we should have it in there
David: let's keep that and we can link to it
<David> <p>A better solution would be to provide previous and next controls that allow the user to scroll the carousel by tapping (or double tapping when platform assistive technology such as a screen reader is active) in addition to supporting the single finger swipe left and right functionality.</p>
<Kathy> Add technique for "A better solution would be to provide previous and next controls that allow the user to scroll the carousel by tapping (or double tapping when platform assistive technology such as a screen reader is active) in addition to supporting the single finger swipe left and right functionality."
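The carousel idea above can be sketched in a few lines. This is a minimal illustration, not text from any draft technique: the names (Carousel, attachControls, makeButton) are hypothetical. The key point is that the previous/next behavior hangs off a click handler, because a screen reader user's double-tap reaches the page as an ordinary click, so the same controls work whether or not platform assistive technology has remapped touch gestures.

```javascript
// Hypothetical sketch of the "previous/next controls" suggestion.
// A click handler serves tap, mouse, and a screen reader's double-tap.

class Carousel {
  constructor(slideCount) {
    this.slideCount = slideCount;
    this.index = 0;
  }
  next() { this.index = (this.index + 1) % this.slideCount; }
  previous() { this.index = (this.index - 1 + this.slideCount) % this.slideCount; }
}

// Wire prev/next controls; `button` is anything with addEventListener
// (a DOM element in a real page, a stub object elsewhere).
function attachControls(carousel, prevButton, nextButton) {
  prevButton.addEventListener('click', () => carousel.previous());
  nextButton.addEventListener('click', () => carousel.next());
  // Swipe left/right support would be wired separately (touch or pointer
  // events); the click controls are an addition, not a replacement.
}
```

Swipe handling is deliberately left out: the technique being discussed is about providing the controls in addition to the existing single-finger swipe behavior.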
<Kathy> Add technique for "A better solution would be to use a device independent method such as a click handler (with platform touch support) that allows the user to interact with the control by touch or through a keyboard interface on platforms that alter touch access when built-in assistive technology is used."
David: add as technique instead so people don't mix up failures with what you're supposed to do
Kathy: and the failure should be at another step in the procedure that says turn on the built-in assistive technology
... step two, turn it on
David: screenreader or assistive technology?
Kathy: built-in platform assistive technology such as a screen reader
Alistair: sufficient technique M027, which is pretty much the flipside of this – supporting touch with gestures. We need to blend it in with that
Kathy: could put examples in that
<agarrison> Under tests "Test that for each interactive element that responds to touch or gesture there is equivalent functionality provided and all content is available when built-in assistive technology is enabled that changes the touch interaction mode". Changed to "Test that for each interactive element that responds to touch or gesture all content is available when built-in assistive technology is enabled that changes the touch interaction mode; or the[CUT]
David: remaps touch gestures and alters the touch interface
Kathy: I understand remapping, but what does altering mean?
Jon: when an assistive technology like a screen reader is running you can touch the screen without activating an element – you have to double tap, for example, to send the equivalent tap gesture. Wording doesn't matter
Kathy: remap better
Alistair: remap isn't exactly perfect, there must be an easier way of putting it but it's not for today
<David> Ex 2 <p>A better solution would be to use a device independent method such as a click handler (with platform touch support) that allows the user to interact with the control by touch or through a keyboard interface on platforms that alter touch access when built-in assistive technology is used.</p>
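A minimal sketch of the device-independent pattern in Ex 2, assuming a custom (non-native) control: attachActivation and activate are illustrative names, not from the draft. The behavior is bound once to 'click' (fired for mouse, tap, and assistive-technology activation such as a double-tap) and once to Enter/Space on 'keydown', so the control remains usable both when the platform alters touch access and when only a keyboard interface is available.

```javascript
// Hypothetical sketch of a device independent activation handler.
function attachActivation(element, activate) {
  // 'click' covers mouse, touch tap, and AT-generated activation.
  element.addEventListener('click', activate);
  // Keyboard interface support for a non-native control:
  // Enter and Space trigger the same behavior.
  element.addEventListener('keydown', (event) => {
    if (event.key === 'Enter' || event.key === ' ') {
      activate(event);
    }
  });
}
```

A native button would get the keyboard behavior for free; the keydown branch is only needed for custom elements.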
Kathy: #2 and the procedures – doesn't specify one or more, so somebody could read this that they have to turn on everything all at the same time, which may not be what we intended
... want to make sure it's valid moving forward, not just screen readers
Detlev: are there others
Jeanne: could do this as a for-next loop – repeat for each individual platform assistive technology
Kathy: might be instances where you want to turn on more than one, screenreader and screen magnifier
Detlev: rephrase "for each interactive element that responds to touch or gesture" – is there a case for also covering generic gestures which may not be clearly related to any interactive element? Scroll gestures where you swipe across the screen without hitting an interactive element in particular – do we need to include that?
Alistair: it was the introduction text
... for the top sentence
David: we're going to have to define platform assistive technology – do we want to say built-in platform assistive technology, or just platform
Jon: I thought we discussed built-in, but it could apply to things like Android where you could have third-party screen readers
David: we don't want to do that
Jon: it just complicates it
Kathy: in the understanding I put system assistive technology – we need to agree on a term and change it throughout. What do people prefer: built-in, system, platform?
Jeanne: I like platform; it's consistent with UAAG and ATAG
David: system, platform, built-in platform, what else – operating system
... is system a shortcut for operating system?
Jon: in another document we said platform-level assistive technology, indicating that it's operating system level and not some other level like the browser. Browser can be operating system as well
David: people can get very technical with this language. What we really want to say, which is too long, is the assistive technology that was included with the operating system
... that's too long, but we can put that in a definition and have a short term and link
... OS assistive technology? Platform assistive technology? Want the easiest to say, most elegant looking, most comprehensive
Alistair: edge case Samsung TalkBack – if we bind it too heavily to the platform – Android – we may have problems there, because there may not be another one that is provided with a platform
David: it's the Samsung version of Android
Jon: we say platform level in other places – that's a version of that
... 4.1.4 touch proposal discussion
Kathy: consistency is a good argument to say platform. I like platform – more clear
<agarrison> Suggestion - platform's default assistive technology
<jeanne> From ATAG: platform accessibility service
<jeanne> A programmatic interface that is specifically engineered to provide communication between applications and assistive technologies (e.g. MSAA, IAccessible2 and UI Automation for Windows applications, AXAPI for Mac OS X applications, GNOME Accessibility Toolkit API for GNOME applications, Java Access for Java applications). On some platforms, it may be conventional to enhance communication
<jeanne> further by implementing a document object.
Detlev: difficult for edge cases that come in on a skin level or vendor level
<jeanne> From UAAG: platform accessibility service
<jeanne> A programmatic interface that is engineered to enhance communication between mainstream software applications and assistive technologies (e.g. MSAA, UI Automation, and IAccessible2 for Windows applications, AXAPI for Mac OSX applications, Gnome Accessibility Toolkit API for GNOME applications, Java Access for Java applications). On some platforms it can be conventional to enhance
<jeanne> communication further by implementing a DOM.
David: interesting you're using the word level – we've been bouncing around platform or platform level
<jeanne> UAAG Platform def: platform
<jeanne> The software and hardware environment(s) within which the user agent operates. Platforms provide a consistent operational environment. There can be layers of software in an hardware architecture and each layer can be considered a platform. Native platforms include desktop operating system (e.g. Linux, Mac OS, Windows, etc.), mobile operating systems (e.g. Android, Blackberry, iOS,
<jeanne> Windows Phone, etc.), and cross-OS environments (e.g. Java). Web-based platforms are other user agents. User agents can employ server-based processing, such as web content transformations, text-to-speech production, etc.
<jeanne> Note 1: A user agent can include functionality hosted on multiple platforms (e.g. a browser running on the desktop can include server-based pre-processing and web-based documentation).
<jeanne> Note 2: Accessibility guidelines for developers exist for many platforms.
Alistair: is level adding anything?
Jon: might not be provided with a platform but interfaces at a low level with the platform
... I still think that TalkBack – injecting JavaScript, but as far as the touch interface goes it's at the platform level, not at the browser level
David: I think the edge case of Samsung I think we're okay because I would really say they're just modifying the platform with their own thing attached to it
Kathy: going to be a lot of edge cases with Android – Samsung in the latest operating system, but that could happen all over the place
David: better off saying platform rather than platform level. If somebody's modifying the platform, then the assistive technology needs to work, and needs to work on at least one platform. I think we need to stay with that – Android is such a disaster right now for the screen reader TalkBack – working on one platform is sufficient
Detlev: Samsung has a platform that they're shipping and Google has a platform
Jon: platform and platform level used interchangeably
David: are people good with using platform without level?
<jeanne> +1 to platform
general agreement
Kathy: changing to platform throughout, including understanding – platform assistive technology
<agarrison> Changed to "Test that for each interactive element that responds to touch or gesture all content is available when built-in assistive technology is enabled that changes the touch interaction mode; or there is equivalent functionality provided".
<Detlev> Shall we address general gestures here?
<Detlev> Swipe to scroll, to bring in menu from display edge, etc
Kathy: under user agent it might be good to say the current technologies that remap gestures are…
David: also give a sense of what it is in the understanding so people don't have to go look somewhere else
Kathy: in the understanding language we had a link to gestures from the manufacturers – that's good to put under resources
Jeanne: I can find them
Kathy: in this failure it would be good to have the gestures that get remapped for screen readers
Jeanne: link to list
Detlev: are assistive technologies taking specific gestures, things like swipe, or the entire interface
Kathy: some of them don't get remapped, some of them do, you can also do passthrough gestures, you can also write custom gestures in some scenarios, so I don't think we can get more specific
<Detlev> This is a bit dated but I'll try to update that soon: http://www.incobs.de/gesten.html
<Detlev> and it's German right now
Jeanne: I'll take that definition and link throughout the document and techniques
David: want to make sure that it's clear that we're not requiring people to make something work by touch and a screen reader that doesn't work by touch otherwise
Jon: we say that in the notes
... could be met through screen keyboard, screen controller with arrow keys, use the on-screen thing to meet this requirement – there are other ways to meet it
<David> The technique is applicable even when access via a physical keyboard interaction is present and makes sure that interactions can be performed through a keyboard interface using touch gestures rather than relying on a physical keyboard accompanying the touch screen device.
Jon: the whole point of this technique is, say you have a slider – keyup and keydown events, available through keyboard but not accessible through touch gestures, that's why we have this. Physical keyboard access needs to be tested separately. Two requirements here: a keyboard interface requirement that's supported with touch and a keyboard interface requirement that's supported with a physical keyboard
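The slider scenario can be sketched briefly. This is an illustrative reading of the discussion, not draft text, and the names (Slider, attachSliderControls) are hypothetical: the control handles ArrowUp/ArrowDown keydown events, so it is keyboard-operable, but a touch screen user running a screen reader may have no physical keyboard, so the same adjustments are also exposed through tappable increment/decrement controls wired to click handlers.

```javascript
// Hypothetical slider: keyboard-operable via arrow keys, with tappable
// controls so the same adjustments work without a physical keyboard.
class Slider {
  constructor(value = 0, min = 0, max = 100, step = 10) {
    Object.assign(this, { value, min, max, step });
  }
  increment() { this.value = Math.min(this.max, this.value + this.step); }
  decrement() { this.value = Math.max(this.min, this.value - this.step); }
  // Keyboard interface: what the existing keydown-based slider covers.
  onKeydown(event) {
    if (event.key === 'ArrowUp') this.increment();
    if (event.key === 'ArrowDown') this.decrement();
  }
}

function attachSliderControls(slider, upButton, downButton) {
  // 'click' fires for a tap and for a screen reader's double-tap, so
  // these controls work when touch gestures are remapped by platform AT.
  upButton.addEventListener('click', () => slider.increment());
  downButton.addEventListener('click', () => slider.decrement());
}
```

This matches the two-requirement framing above: the keydown path covers the physical keyboard interface, and the click controls cover access through touch when gestures are remapped.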
<David> This technique applies to interactive elements on platforms where touch screen access is provided.
<Detlev> got thrown out of the WebEx meeting somehow...
David: so we're really talking at the element level
Jon: maybe at the feature level – say swipe – screen level
<agarrison> under tests 3 - suggest change "active or another touch method is" to "active or another method is"
David: I'll continue with this paragraph and send it off.
Kathy: will continue this discussion on the list
Present: Kim, Alistair, Kathy, David, Detlev, Jan, Marc, Jon, Jeanne
Scribe: Kim
Date: 04 Feb 2016
Minutes: http://www.w3.org/2016/02/04-mobile-a11y-minutes.html