IRC log of mobile-a11y on 2016-10-20

Timestamps are in UTC.

14:59:57 [RRSAgent]
RRSAgent has joined #mobile-a11y
14:59:57 [RRSAgent]
logging to http://www.w3.org/2016/10/20-mobile-a11y-irc
14:59:59 [trackbot]
RRSAgent, make logs public
14:59:59 [Zakim]
Zakim has joined #mobile-a11y
15:00:01 [trackbot]
Zakim, this will be WAI_MATF
15:00:01 [Zakim]
ok, trackbot
15:00:02 [trackbot]
Meeting: Mobile Accessibility Task Force Teleconference
15:00:02 [trackbot]
Date: 20 October 2016
15:00:50 [shadi]
present+
15:00:51 [Kathy]
Kathy has joined #mobile-a11y
15:01:31 [patrick_h_lauke]
patrick_h_lauke has joined #mobile-a11y
15:01:38 [patrick_h_lauke]
present+ patrick_h_lauke
15:01:39 [Alan_Smith]
Alan_Smith has joined #mobile-a11y
15:02:08 [David]
David has joined #mobile-a11y
15:02:27 [Alan_Smith]
present+
15:02:33 [Kim]
present+ Kim
15:02:42 [David]
Present+ David
15:02:53 [marcjohlic]
marcjohlic has joined #mobile-a11y
15:03:12 [Kathy]
present+ Kathy
15:03:22 [patrick_h_lauke]
btw i joined the call but just sorting out something noisy so on mute ;)
15:03:23 [Jatin]
Jatin has joined #mobile-a11y
15:03:27 [Alan_Smith]
present+ Alan
15:03:44 [Jatin]
present+ Jatin
15:03:51 [marcjohlic]
present+ marcjohlic
15:05:15 [Kathy]
https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/WCAG_2.1_Success_Criteria_Submission_Requirements
15:05:25 [Kim]
Agenda+ Continuation of SC review - https://github.com/w3c/Mobile-A11y-Extension/tree/gh-pages/SCs , https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/WCAG_2.1_Success_Criteria_Submission_Requirements M13 Orientation
15:05:26 [Kim]
Agenda+ M6 Input-Agnostic - https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md
15:05:28 [Kim]
Agenda+ Next steps and schedule
15:05:41 [Kim]
TOPIC: success criteria submission requirements
15:05:56 [DAVID2]
DAVID2 has joined #mobile-a11y
15:06:17 [Kim]
Kathy: discussing edits
15:07:04 [Kim]
Kathy: the ones in the column marked 'reviewed by task force' have been reviewed and I think are finalized. M1-M10 will be submitted as a full group
15:07:21 [Kim]
Kathy: that puts us mid-November when we get all of them submitted. Our deadline is December 1 – that's a hard deadline
15:07:28 [chriscm]
chriscm has joined #mobile-a11y
15:07:41 [Kim]
Kathy: Andrew said that if we're still working on tweaking some of the wording we can go ahead and update after we've submitted
15:07:50 [Kim]
Kathy: we are in pretty good shape
15:08:17 [Kim]
Kathy: since Patrick is here M6 first
15:08:22 [patrick_h_lauke]
https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md
15:08:31 [Kim]
Patrick: updated 10 minutes ago
15:08:38 [Kim]
TOPIC: M6
15:08:52 [chriscm]
present+ chriscm
15:09:09 [Kim]
Patrick: short name input agnostic; all functionality can be operated without requiring any particular type of input mechanism; suggested AA
15:09:28 [Kim]
Patrick: superset SC that tries to cover more than just the keyboard and mouse scenario.
15:10:01 [Kim]
Patrick: potentially the working group may combine it with something else, but this is the culmination of the various input-related SCs - the gold standard
15:10:18 [Kim]
Patrick: will check glossary definitions
15:10:31 [Kim]
Patrick: will fall under inputs with assistive technology although it does cover other types
15:11:43 [Kim]
Kathy: also in the description this is really a superset for the keyboard. There's a note at the end of the description: the task force feels that this is a superset of the keyboard one, and asks if the working group would like to consider adjusting the keyboard input to have this one instead. Thinking of this as 2.1 or Silver
15:11:45 [David]
q+
15:12:04 [jeanne]
jeanne has joined #mobile-a11y
15:12:21 [Kim]
Patrick: reading description
15:12:22 [jeanne]
regrets+ jeanne
15:14:20 [Kim]
Patrick: thinking about range – all input devices, all standardized input devices
15:14:37 [Kim]
Kathy: may be supported
15:14:52 [Kim]
Kathy: the other one we talked about last week we tied to the accessibility support – can we tie in something like that
15:15:01 [Kathy]
q?
15:15:09 [Kim]
Alan: can we go back to 'without requiring any particular type of input' – some type of negative wording instead of 'all'
15:15:39 [Kim]
Patrick: Yes that makes sense – last sentence… without requiring any particular type of input mechanism – that makes sense
15:16:01 [Kim]
q+
15:16:59 [Kim]
David: seems like we're trying to address IndieUI but don't have the tools today. We have the standard for touch and pointer but we don't actually have anything implemented – trying to build a framework for something that doesn't exist yet
15:17:32 [Kim]
Patrick: suggested techniques further down – we can already do this by relying on high-level, input-agnostic events such as focus, blur, click…
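Patrick's suggestion can be sketched as follows (the element id and handler logic are illustrative, not from the meeting): browsers synthesize a high-level `click` event for mouse clicks, touch taps, and keyboard activation of a button alike, so a single listener can serve every input mechanism:

```javascript
// Input-agnostic activation: keep the behaviour in one pure function and
// reach it only through the high-level "click" event, which fires for
// mouse, touch-tap, and keyboard (Enter/Space on a button) alike.
function toggleExpanded(state) {
  return { ...state, expanded: !state.expanded };
}

// Browser wiring (illustrative) – bound once, with no per-device branches:
// document.getElementById('menu-button')
//   .addEventListener('click', () => { state = toggleExpanded(state); });

let state = { expanded: false };
state = toggleExpanded(state); // same code path for every input mechanism
console.log(state.expanded);   // true
```

Listening only for `mousedown` or `touchstart` instead would tie the same behaviour to one specific mechanism.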
15:17:53 [Kim]
David: last time I checked, with TalkBack you're swiping through, and if you're listening for an onblur, the swiping itself – the virtual cursor – is not triggering that blur activity – am I right on that?
15:18:16 [Kim]
Chris: no, it's not even going to cause that, because TalkBack focus – you can only listen for that as part of a native API; it's not in the web browser
15:18:34 [Kim]
Patrick: focus and blur are fired by, say, a tap on a touchscreen with TalkBack – it depends what exactly you are implementing
15:19:07 [Kim]
Patrick: if your focus is currently on an element like a button and you move to another element and activate it, blur is fired on the element where the focus previously was
15:19:38 [Kathy]
q?
15:19:46 [Kim]
Chris: developers typically take advantage of that if input focus is cycling through elements – the only way for a TalkBack user to put focus on something would be actually to select
15:20:25 [Kim]
Patrick: further clarification: focus and blur are not fired when you would expect them with certain input mechanisms. Don't assume that a user can first focus and then separately click
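A sketch of the pitfall (the field name and validation rule are hypothetical): if validation runs only in a `blur` handler, a user whose input mechanism never fires `blur` in the expected order can submit unchecked data, so the same check should also run on submit:

```javascript
// Keep the check separate from event wiring so it can run from a blur
// handler (a nice-to-have hint) and from the submit handler (required).
function isValidEmail(value) {
  return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(value);
}

// Browser wiring (illustrative):
// field.addEventListener('blur', () => showHint(isValidEmail(field.value)));
// form.addEventListener('submit', (e) => {
//   // Works even if blur never fired for this input mechanism.
//   if (!isValidEmail(field.value)) e.preventDefault();
// });

console.log(isValidEmail('user@example.com')); // true
console.log(isValidEmail('not-an-email'));     // false
```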
15:21:11 [Kim]
David: language right now all functionality can be operated without particular – like say I want you to be able to write a letter without using any particular type of writing mechanism
15:21:18 [Kim]
Patrick: pen, dictating it…
15:22:23 [Kim]
David: 10,000 foot view of the keyboard thing
15:22:41 [Kim]
Chris: we're asking developers to be responsible for this level of agnosticism
15:22:48 [Kathy]
q?
15:23:31 [Kim]
David: we want to find universal accessibility – no matter what you are touching it with, it's going to be activated – but I don't see how we can fix this; it's going to take incredible bandwidth. This is a fundamental shift towards something that's not yet available
15:23:58 [Kim]
Patrick: I really don't believe that it's not available. Maybe I'm missing something, but the same argument that we just had could be made – if it's not mouse-workable it can't be made keyboard-workable
15:24:22 [Kim]
Patrick: key handling, mouse handling, touch handling, pointer events – not science fiction, building applications and not assuming one type of input. I'm not seeing how it's not workable right now. I'm doing exactly this already
15:25:03 [Kim]
David: I don't think you can say 'without requiring any particular type of input mechanism' in our current success criteria. It's a very strong normative statement that's super-wide. I'm going to sound like a Luddite here, but I don't know how you can go that wide.
15:25:22 [Kim]
David: we are in a three-year window now, that seems to be the definition of the new charter. Maybe in three years we will be able to say just use the high level
15:26:51 [Kathy]
q?
15:27:01 [Kathy]
ack David
15:27:04 [Kathy]
Ack Kim
15:27:26 [marcjohlic]
q+
15:27:37 [Alan_Smith]
q+
15:27:40 [Kim]
Kim: I'd add speech input to the description – mic as the device. I think we've been talking about input agnostic for a long time now and should address it
15:27:58 [Kim]
David: large width
15:28:09 [Kim]
Kathy: what are the scenarios that would not be possible today – maybe we can address how we can look at that
15:29:32 [Kim]
David: pulling up a list of all of our success criteria – I think the other ones address what this is trying to do, except this one is going wider than all the other ones. It's going beyond what our current technology is, I would say, and it's getting at something that we don't yet have. It's one thing to aspire to something but you can't require it – the role of WCAG in my understanding is...
15:29:34 [Kim]
...we're not inventors. We vet things. We standardize things that are emerging, that are working well. We don't generally create momentum for something that doesn't exist
15:29:34 [marcjohlic]
q-
15:29:42 [Kim]
Patrick: I contend that that's not the case – I'm not proposing something that doesn't exist
15:29:52 [Kim]
Patrick: I think we need to submit it and see what the working group says
15:30:39 [Kim]
Chris: let's take a look at the technique. Focus and blur, I would argue, are not input agnostic in Android. They would not be usable the way people would expect them to be usable using TalkBack on Android
15:30:49 [Kim]
Chris: that part of this is very broken
15:31:33 [Kim]
David: is there something here, Patrick, that's not in your other proposals that exists today – we have five new success criteria over the past four or five weeks, they are all filling in – is this in place of all of those?
15:31:49 [Kim]
Patrick: this covers things that are directly mentioned in the other ones potentially making it future proof
15:31:57 [Kim]
David: we don't have much of a future for 2.1
15:32:16 [Kim]
David: it's basically filling a few gaps for the last couple of years of WCAG
15:32:26 [Kim]
Patrick: let's remove it, let's park it for future versions and concentrate on pushing the others through
15:33:08 [Kim]
Kathy: previously we had this additional input relying on assistive technology. I just want to make sure we're not going to miss something if we take this out. We've got keyboards with AT and then this one was additional inputs – are we going to lose something?
15:33:32 [Kim]
Patrick: potentially one aspect we are going to lose is users switching between technologies in a session
15:33:58 [Kim]
Patrick: maybe a failure under the proposed pointer one
15:34:16 [Kathy]
https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/WCAG_2.1_Success_Criteria_Submission_Requirements#Timeline
15:34:45 [Kim]
Comparing existing techniques
15:38:59 [Kim]
David: We have the proposed pointer, we have keyboard already. In the next three years are we going to see something new come out that's going to blow this out of the water
15:39:32 [Kim]
Kim: speech plus mouse device is common and has been for a long time
15:40:40 [Kim]
Kathy: not limit the user, but draw the correlation that the types of input that are available for use within the website or application can be used together and are interchangeable. That covers what Kim is doing, and that's the point that I heard from Patrick. So if we change this to 'can interchange the different input methods', that might address Patrick's concern and your concern, David,...
15:40:41 [Kim]
...and Kim's example
15:41:15 [Kim]
Kathy: you can start with speech then move over to pointer then keyboard – not locked into one technology to do a task
15:43:18 [Alan_Smith]
How about this: All functionality can be operated without limiting interaction to a particular type of input mechanism
15:43:35 [patrick_h_lauke]
q+
15:43:47 [patrick_h_lauke]
(noting that i just updated https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md)
15:45:47 [Kim]
David: trying to think what problem we are trying to solve
15:46:21 [Kathy]
q?
15:46:26 [Kim]
Kim: more examples of why it's important to be able to use two input methods at once
15:46:41 [Kim]
Alan: input – is human computer interaction limited by input or should we be considering some other term
15:46:54 [Kim]
Alan: I don't see a definition for input – just want to make sure that's clarified
15:47:53 [Kim]
Patrick: I think we can address that as well – I've updated M6 as we have been talking. Stripped it down to the core nugget that isn't covered by any of the other SCs – this concept of concurrency. 'Concurrent input mechanisms' might be a better short name for the new stripped-down version
15:48:44 [Kim]
Patrick: all functionality can be operated with all input mechanisms available to the user, and a description about the user being able to switch between inputs – not assuming
15:49:09 [Kim]
Patrick: that would cover the core, negating my concern about a website that says 'there is no touchscreen, so I won't listen for touchscreen input', which would block a user who then tries to use one
15:49:48 [Kim]
Patrick: for the techniques, added a clarifier saying: however, note the peculiarities of specific input mechanisms – blur behaves differently with keyboard and mouse – that probably needs more work
15:50:29 [Kim]
Patrick: concurrent input mechanisms?
15:50:52 [Kim]
David: when we say available to the user we just don't know what's available to the user
15:51:15 [Kim]
Kathy: do you have better wording
15:52:29 [Kim]
David: trying to be clear on what problem we are trying to solve – sounds like we are trying to solve a whole bunch of problems – person getting into their car and switching to a different device. Also trying to solve the problem of the universal input device that we all want but we don't have yet. So we think that the way to meet this, until we get that universal device, is a higher abstract...
15:52:31 [Kim]
...level than before. I have huge concerns because it's so wide. If we were to get this through we really don't need 2.1 anymore, so this is competing against existing criteria
15:53:42 [Kim]
Patrick: not necessarily – refocused, this only says you need to be able to do them at the same time – the other success criteria cover the other things. This is just addressing that a user might be mixing and matching input methods in one session. The way in which you can satisfy this SC can be relying on high-level agnostic input – doesn't mean they have to. It may be they register different...
15:53:43 [Kim]
...handlers but make sure they register them all at the same time.
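Patrick's 'register them all at the same time' approach might look like this sketch (the helper and the stand-in element are assumptions, not from the minutes); the key point is that the set of listeners never depends on detected capabilities:

```javascript
// Bind handlers for every input mechanism up front, so a user who
// switches mid-session (e.g. attaching a Bluetooth keyboard) is covered.
// In production, Pointer Events alone can collapse several of these, and
// naively mixing touch and mouse listeners can double-fire on a tap.
const INPUT_EVENTS = ['mousedown', 'touchstart', 'keydown', 'pointerdown'];

function bindAll(target, onActivate) {
  // No capability sniffing: every mechanism gets a listener unconditionally.
  for (const type of INPUT_EVENTS) {
    target.addEventListener(type, onActivate);
  }
  return INPUT_EVENTS.length; // number of listeners attached
}

// Minimal stand-in for a DOM element so the sketch runs anywhere.
const element = {
  listeners: [],
  addEventListener(type, handler) { this.listeners.push(type); },
};
console.log(bindAll(element, () => {})); // 4
```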
15:54:33 [Kim]
Patrick: scenario: a user on a phone who then attaches a Bluetooth keyboard to their phone – if the site doesn't follow the advice that at any point the user may use a different type of input, then that user couldn't use that keyboard even though they've just attached it. I think now that we've refocused it, I don't think it makes the others redundant. All these things, but also make sure they work at...
15:54:34 [Kim]
...the same time as well
15:55:10 [Kim]
David: so we're really saying: all functionality that's required in other success criteria, you need to be able to use interchangeably. That's okay if that's what we are trying to say
15:55:20 [Kim]
Kathy: yes, that's what we are saying
15:55:41 [Kim]
David: I've got a touchpad from 1996 with a stylus on it – that's available to me, I could plug it into my PS/2 port, and suddenly they're required to support this device
15:56:02 [Kim]
Kathy: maybe tie it to accessibility support or supported input methods that have been defined
15:56:21 [Kim]
Patrick: if nothing actually happens when you plug the arbitrary input device into the computer, that's not an input mechanism – supported by the operating system, or something along those lines?
15:57:01 [shadi]
q+
15:57:08 [Kim]
David: but then we are making a requirement to support every operating system. It's about making things tight enough – 'all functionality with all input mechanisms' is wide.
15:57:33 [Kim]
Kathy: maybe we can put a note in the description stating to the working group what we are trying to accomplish, and then go from there. I don't think this is something the task force is going to solve.
15:57:57 [Kim]
David: I just would like to see a real user who's having a problem on the Internet right now – a widespread dumb practice by developers that could be fixed by our requirement
15:58:40 [Kim]
Patrick: Yahoo did a very naïve check – if touchscreens are present, just react to touchscreens – and all of a sudden on mixed input devices such as the Surface 3, things didn't work for mouse users. That's exactly that scenario, and a recent implementation – they did fix it after I alerted them to it
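The naïve detection Patrick describes, and the fix, can be sketched as pure functions (the names are hypothetical; the broken version mirrors the reported behaviour, not Yahoo's actual code):

```javascript
// Broken: treats "has touch" as "touch only", so mouse listeners are
// never attached – locking out mouse users on hybrid devices like the
// Surface 3, which offer touch AND mouse at the same time.
function eventsToBindNaive(hasTouch) {
  return hasTouch ? ['touchstart'] : ['mousedown'];
}

// Fixed: capability detection tells you what MIGHT be used, never what
// WON'T be – so listen for both, regardless of what was detected.
function eventsToBindFixed(hasTouch) {
  return ['touchstart', 'mousedown'];
}

console.log(eventsToBindNaive(true)); // ['touchstart'] – mouse users locked out
console.log(eventsToBindFixed(true)); // ['touchstart', 'mousedown']
```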
15:59:12 [Kim]
Kathy: another example: educational interactive elements – a lot of times they are locking you in: if you start using the keyboard you have to finish with the keyboard. If you started with a mouse, same – they've changed the interface based on what you start out using
15:59:23 [patrick_h_lauke]
video of the flickr "touch OR mouse" problem that yahoo fixed after i alerted them to it https://www.youtube.com/watch?v=f_2GKsI9TQU
15:59:25 [Kim]
David: I can see a huge benefit in being able to swap devices in the middle of a task
15:59:30 [Kim]
Kathy: that's what this is addressing
15:59:54 [shadi]
q-
16:00:24 [Kim]
David: we can pass it on to the working group with a note that says something like that. To me the language right now doesn't speak to this – big huge red flags all over the place, even in the current language. I'd like to see wording that basically says you can swap out what you're using. And I don't know how to say that at this point
16:01:06 [Kim]
Kim: swap out is not wide enough for my scenarios (using speech and mouse device in concert)
16:01:40 [Kim]
Kathy: limiting to certain user groups – if you aren't able to use those methods you're in trouble because you can't do it?
16:01:59 [Kim]
David: I've never in 15 years of doing accommodations with people seen a function that should have been done with the mouse but couldn't
16:02:09 [patrick_h_lauke]
proposed new SC text: "All functionality can be operated even when a user switches between input mechanisms"
16:02:11 [Kim]
Kathy: I've seen that specifically in educational materials. We've done usability studies with it as well and it ends up being problematic
16:02:31 [Kim]
Kathy: low vision users who use the keyboard also want to use the mouse in one area, and then they're blocked from using the keyboard, and that's a frustration
16:02:59 [Kim]
David: it sounds like several people in the group want to see this go through to the next level with a working group. I won't stand in the way of that. It just doesn't seem like the same level of maturity of some of the others.
16:03:31 [Kim]
Patrick: being mindful of the time, possibly as a resolution – 'all functionality supported when a user switches between input mechanisms'
16:03:46 [patrick_h_lauke]
changes committed https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md
16:03:47 [Kim]
Patrick: making that change
16:04:09 [Kim]
Kathy: Chris – draft of M14?
16:04:17 [Kim]
Chris: yes – it's essentially done
16:04:39 [Kim]
RESOLUTION: Patrick to make final edits and ready for working group
16:05:27 [Kim]
Kathy: next week we will jump into the orientation SC that Jon has drafted. The other two input SCs from Patrick?
16:05:36 [Kim]
Patrick: will work on them for next week
16:06:24 [patrick_h_lauke]
did final edits i think https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md
16:07:00 [Alan_Smith]
Still wondering if this wording would give the developers control: All functionality can be operated without limiting interaction to a particular type of input mechanism
16:07:03 [patrick_h_lauke]
changed the Principle/Guideline. don't think it now needs any glossary additions/changes
16:07:17 [Alan_Smith]
ok, on the glossary additions.
16:08:08 [Kim]
rrsagent, make minutes
16:08:08 [RRSAgent]
I have made the request to generate http://www.w3.org/2016/10/20-mobile-a11y-minutes.html Kim
16:09:10 [Kim]
chair: Kathleen_Wahlbin
16:10:52 [Kim]
Regrets+ Jonathan
16:10:54 [Kim]
rrsagent, make minutes
16:10:54 [RRSAgent]
I have made the request to generate http://www.w3.org/2016/10/20-mobile-a11y-minutes.html Kim
16:12:38 [patrick_h_lauke]
patrick_h_lauke has left #mobile-a11y
17:38:27 [Kim]
rrsagent, bye
17:38:27 [RRSAgent]
I see no action items