14:59:57 RRSAgent has joined #mobile-a11y
14:59:57 logging to http://www.w3.org/2016/10/20-mobile-a11y-irc
14:59:59 RRSAgent, make logs public
14:59:59 Zakim has joined #mobile-a11y
15:00:01 Zakim, this will be WAI_MATF
15:00:01 ok, trackbot
15:00:02 Meeting: Mobile Accessibility Task Force Teleconference
15:00:02 Date: 20 October 2016
15:00:50 present+
15:00:51 Kathy has joined #mobile-a11y
15:01:31 patrick_h_lauke has joined #mobile-a11y
15:01:38 present+ patrick_h_lauke
15:01:39 Alan_Smith has joined #mobile-a11y
15:02:08 David has joined #mobile-a11y
15:02:27 present+
15:02:33 present+ Kim
15:02:42 present+ David
15:02:53 marcjohlic has joined #mobile-a11y
15:03:12 present+ Kathy
15:03:22 btw i joined the call but just sorting out something noisy so on mute ;)
15:03:23 Jatin has joined #mobile-a11y
15:03:27 present+ Alan
15:03:44 present+ Jatin
15:03:51 present+ marcjohlic
15:05:15 https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/WCAG_2.1_Success_Criteria_Submission_Requirements
15:05:25 Agenda+ Continuation of SC review - https://github.com/w3c/Mobile-A11y-Extension/tree/gh-pages/SCs , https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/WCAG_2.1_Success_Criteria_Submission_Requirements M13 Orientation
15:05:26 Agenda+ M6 Input-Agnostic - https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md
15:05:28 Agenda+ Next steps and schedule
15:05:41 TOPIC: Success criteria submission requirements
15:05:56 DAVID2 has joined #mobile-a11y
15:06:17 Kathy: discussing edits
15:07:04 Kathy: the ones in the column that say "reviewed by task force" have been reviewed and I think are finalized. M1-M10 will be submitted as a full group
15:07:21 Kathy: that puts us at mid-November when we get all of them submitted. Our deadline is December 1 – that's a hard deadline
15:07:28 chriscm has joined #mobile-a11y
15:07:41 Kathy: Andrew said that if we're still working on tweaking some of the wording, we can go ahead and update after we've submitted
15:07:50 Kathy: we are in pretty good shape
15:08:17 Kathy: since Patrick is here, M6 first
15:08:22 https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md
15:08:31 Patrick: updated 10 minutes ago
15:08:38 TOPIC: M6
15:08:52 present+ chriscm
15:09:09 Patrick: short name Input Agnostic – all functionality can be operated without requiring any particular type of input mechanism; suggested AA
15:09:28 Patrick: a superset SC that tries to cover more than just the keyboard and mouse scenario
15:10:01 Patrick: potentially the working group may combine it with something else, but this is the culmination of the various input-related SCs – the gold standard
15:10:18 Patrick: will check glossary definitions
15:10:31 Patrick: will fall under inputs with assistive technology, although it does cover other types
15:11:43 Kathy: also in the description, this is really a superset of the keyboard one. Note at end of description: the task force feels that this is a superset of the keyboard one, if the working group would like to consider adjusting the keyboard SC to have this one instead. Thinking of this as 2.1 or Silver
15:11:45 q+
15:12:04 jeanne has joined #mobile-a11y
15:12:21 Patrick: reading description
15:12:22 regrets+ jeanne
15:14:20 Patrick: thinking about range – all input devices, all standardized input devices
15:14:37 Kathy: may be supported
15:14:52 Kathy: the other one we talked about last week we tied to accessibility support – can we tie in something like that
15:15:01 q?
15:15:09 Alan: can we go back to "without requiring any particular type of input" – negative "some type" wording instead of "all"
15:15:39 Patrick: yes, that makes sense – last sentence... "without requiring any particular type of input mechanism" – that makes sense
15:16:01 q+
15:16:59 David: seems like we're trying to address IndieUI but we don't have the tools today. We have the standard for touch and pointer but we don't actually have anything implemented – trying to build a framework for something that doesn't exist yet
15:17:32 Patrick: suggested techniques further down – we can already do this by relying on high-level, input-agnostic events such as focus, blur, click...
15:17:53 David: last time I checked with TalkBack, you're swiping through and you're listening for an onblur – the swiping itself, the virtual cursor, is not triggering that blur activity – am I right on that
15:18:16 Chris: no, it's not even going to cause that, because the TalkBack focus – you can only listen for that as part of a native API; it's not in the web browser
15:18:34 Patrick: focus and blur are fired on, say, a tap with a touchscreen and TalkBack – it depends what exactly you are implementing
15:19:07 Patrick: if your focus is currently on an element like a button and you move to another element and activate it, blur is fired on the element where the focus previously was
15:19:38 q?
15:19:46 Chris: developers typically take advantage of that if input focus is cycling through elements – the only way for a TalkBack user to put focus on something would be actually to select it
15:20:25 Patrick: further clarification: focus and blur are not fired when you would expect them in certain input mechanisms. Don't assume that a user can first focus and then separately click
15:21:11 David: the language right now – "all functionality can be operated without any particular..." – is like saying I want you to be able to write a letter without using any particular type of writing mechanism
15:21:18 Patrick: pen, dictating it...
15:22:23 David: 10,000-foot view of the keyboard thing
15:22:41 Chris: asking developers to be responsible for this level of agnosticism
15:22:48 q?
15:23:31 David: we want to find universal accessibility – no matter what you are touching it with, it's going to be activated – but I don't see how we can fix this; it's going to take incredible bandwidth. This is a fundamental shift towards something that's not yet available
15:23:58 Patrick: I really don't believe that it's not available. Maybe I'm missing something, but the same argument that we just had could be made – that if it's not mouse-workable it can't be made keyboard-workable
15:24:22 Patrick: key handling, mouse handling, touch handling, pointer events – not science fiction; building applications and not assuming one type of input. I'm not seeing how it's not workable right now. I'm doing exactly this already
15:25:03 David: I don't think you can say "without requiring any particular type of input mechanism" in our current success criteria. It's a very strong normative statement that's super wide. I'm going to sound like a Luddite here, but I don't know how you can go that wide
15:25:22 David: we are in a three-year window now; that seems to be the definition of the new charter. Maybe in three years we will be able to say just use the high level
15:26:51 q?
15:27:01 ack David
15:27:04 ack Kim
15:27:26 q+
15:27:37 q+
15:27:40 Kim: I'd add speech input to the description – mic as the device. I think we've been talking about input agnostic for a long time now and should address it
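[Editor's note: a minimal sketch of the high-level, input-agnostic event handling Patrick describes above (15:17:32). The element and handler names are hypothetical, not from the meeting.]

    // A native <button> receives 'click' from a mouse click, Enter/Space on
    // a keyboard, a touchscreen tap, or a double-tap under TalkBack/VoiceOver,
    // with no device-specific handlers needed.
    const saveButton = document.querySelector('#save') as HTMLButtonElement;

    saveButton.addEventListener('click', () => {
      console.log('document saved'); // same path for every input mechanism
    });

    // focus/blur used for styling only -- and, per Patrick's caveat at
    // 15:20:25, never as a precondition for activation: some input
    // mechanisms never focus before they "click".
    saveButton.addEventListener('focus', () => saveButton.classList.add('ring'));
    saveButton.addEventListener('blur', () => saveButton.classList.remove('ring'));

As Chris notes, TalkBack's accessibility focus does not map one-to-one onto DOM focus/blur, so the styling hooks above are an enhancement rather than something to rely on.]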
15:27:58 David: large width
15:28:09 Kathy: what are the scenarios that would not be possible today – maybe we can address how we can look at that
15:29:32 David: pulling up a list of all of our success criteria – I think the other ones address what this is trying to do, except this one is going wider than all the other ones. It's going beyond what our current technology is, I would say, and it's getting at something that we don't yet have. It's one thing to aspire to something, but you can't require it – the role of WCAG in my understanding is that we're not inventors. We vet things. We standardize things that are emerging, that are working well. We don't generally create momentum for something that doesn't exist
15:29:34 q-
15:29:42 Patrick: I contend that that's not the case – I'm not proposing something that doesn't exist
15:29:52 Patrick: I think we need to submit it and see what the working group says
15:30:39 Chris: let's take a look at the technique. Focus and blur, I would argue, are not input-agnostic in Android. They would not be usable the way people would expect them to be usable using TalkBack on an Android device
15:30:49 Chris: that part of this is very broken
15:31:33 David: is there something here, Patrick, that's not in your other proposals that exists today – we have five new success criteria over the past four or five weeks, they are all filling in – is this in place of all of those?
15:31:49 Patrick: this covers things that are directly mentioned in the other ones, potentially making it future-proof
15:31:57 David: we don't have much of a future for 2.1
15:32:16 David: it's basically filling a few gaps for the last couple of years of WCAG
15:32:26 Patrick: let's remove it, let's park it for future versions and concentrate on pushing the others through
15:33:08 Kathy: previously we had this additional input relying on assistive technology. I just want to make sure we're not going to miss something if we take this out. We've got keyboard with AT, and then this one was additional inputs – are we going to lose something
15:33:32 Patrick: potentially one aspect we are going to lose is users switching between technologies in a session
15:33:58 Patrick: maybe a failure under the proposed pointer one
15:34:16 https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/WCAG_2.1_Success_Criteria_Submission_Requirements#Timeline
15:34:45 Comparing existing techniques
15:38:59 David: we have the proposed pointer one, we have keyboard already. In the next three years are we going to see something new come out that's going to blow this out of the water
15:39:32 Kim: speech plus a mouse device is common and has been for a long time
15:40:40 Kathy: not limit the user, but draw the correlation that the types of input that are available for use within the website or application can be used together and are interchangeable. That covers what Kim is doing, and that's the point that I heard from Patrick. So if we change this – can interchange the different input methods – that might address Patrick's concern and your concern, David, and Kim's example
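[Editor's note: a small sketch of the pointer events Patrick cites as already implementable (15:24:22) and that Kathy's "proposed pointer one" refers to. Pointer Events collapse mouse, touch, and pen into one handler, which is what keeps input methods interchangeable mid-task. The widget name is hypothetical.]

    // One Pointer Events listener covers mouse, touch, and pen alike,
    // instead of separate mousedown/touchstart code paths.
    const sketchArea = document.querySelector('#sketch-area') as HTMLElement;

    sketchArea.addEventListener('pointerdown', (e: PointerEvent) => {
      // e.pointerType reports 'mouse', 'touch', or 'pen', but the handling
      // logic does not branch on it, so the user can switch devices freely.
      console.log(`stroke via ${e.pointerType} at ${e.clientX},${e.clientY}`);
    });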
15:41:15 Kathy: you can start with speech, then move over to pointer, then keyboard – not locked into one technology to do a task
15:43:18 How about this: All functionality can be operated without limiting interaction to a particular type of input mechanism
15:43:35 q+
15:43:47 (noting that i just updated https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md)
15:45:47 David: trying to think what problem we are trying to solve
15:46:21 q?
15:46:26 Kim: more examples of why it's important to be able to use two input methods at once
15:46:41 Alan: input – is human-computer interaction limited by input, or should we be considering some other term
15:46:54 Alan: I don't see a definition for input – just want to make sure that's clarified
15:47:53 Patrick: I think we can address that as well – I've updated M6 as we have been talking. Stripped it down to the core nugget that isn't covered by any of the other SCs – this concept of concurrency. "Concurrent input mechanisms" might be a better short name for the new stripped-down version
15:48:44 Patrick: all functionality can be operated with all input mechanisms available to the user, and a description about the user being able to switch between inputs – not assume
15:49:09 Patrick: that would cover the core of my concern – a website that says "there is no touchscreen, so I won't listen for touchscreen input", which would block a user who then tries to use one
15:49:48 Patrick: for the techniques, added a clarifier: however, note the peculiarities of specific input mechanisms – blur behaves differently for keyboard and mouse – that probably needs more work
15:50:29 Patrick: concurrent input mechanisms?
15:50:52 David: when we say "available to the user" we just don't know what's available to the user
15:51:15 Kathy: do you have better wording
15:52:29 David: trying to be clear on what problem we are trying to solve – it sounds like we are trying to solve a whole bunch of problems – a person getting into their car and switching to a different device, and also the problem of the universal input device that we all want but don't have yet. So we think the way to meet this, until we get that universal device, is a higher abstraction level than before. I have huge concerns because it's so wide. If we were to get this through we really don't need 2.1 anymore, so this is competing against existing criteria
15:53:42 Patrick: not necessarily – refocused, this only says you need to be able to use them at the same time – other success criteria cover the other things. This is just addressing that a user might be mixing and matching input methods in one session. The way you satisfy this SC can be by relying on high-level agnostic input, but it doesn't have to be. It may be that they register different handlers, but make sure they register them all at the same time
15:54:33 Patrick: scenario: a phone, and then the user attaches a Bluetooth keyboard to that phone – if the site doesn't follow the advice that at any point the user may use a different type of input, then that user couldn't use the keyboard even though they've just attached it. Now that we've refocused it, I don't think it makes the others redundant – all those things, but also make sure they work at the same time as well
15:55:10 David: so we're really saying: all functionality that's required in other success criteria, you need to be able to use it interchangeably. That's okay if that's what we are trying to say
15:55:20 Kathy: yes, that's what we are saying
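[Editor's note: a hedged sketch of the "concurrent input mechanisms" idea as Patrick just restated it – keyboard and pointer handlers registered together and unconditionally, so a Bluetooth keyboard attached mid-session works immediately. The slider widget and key bindings are hypothetical.]

    // Both input paths are live at all times; nothing is gated on detecting
    // which mechanisms are "present" at page load.
    const slider = document.querySelector('#volume') as HTMLElement;
    let volume = 50;

    slider.addEventListener('pointerdown', (e: PointerEvent) => {
      slider.setPointerCapture(e.pointerId); // mouse, touch, or pen drag
    });

    slider.addEventListener('keydown', (e: KeyboardEvent) => {
      // keyboard stays usable even if the session started with touch
      if (e.key === 'ArrowRight') volume = Math.min(100, volume + 1);
      if (e.key === 'ArrowLeft') volume = Math.max(0, volume - 1);
    });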
15:55:41 David: I've got a touchpad from 1996 with a stylus on it – that's available to me; I could plug it into my PS/2 port and suddenly they're required to support this device
15:56:02 Kathy: maybe tie it to accessibility support, or to supported input methods that have been defined
15:56:21 Patrick: if nothing actually happens when you plug the arbitrary input device into the computer, that's not an input mechanism – supported by the operating system, or something along those lines?
15:57:01 q+
15:57:08 David: but then we are making a requirement to support every operating system. Making things tight enough is the problem. "All functionality with all input mechanisms" is wide
15:57:33 Kathy: maybe we can put a note in the description stating to the working group what we are trying to accomplish, and then go from there. I don't think this is something the task force is going to solve
15:57:57 David: I just would like to see a real user who's having a problem on the Internet right now – a widespread dumb practice by developers that could be fixed by our requirement
15:58:40 Patrick: Yahoo did a very naïve "if a touchscreen is present, just react to touchscreen", and all of a sudden on mixed-input devices such as the Surface 3 things didn't work for mouse users – that's exactly that scenario, and a recent implementation – they did fix it after I alerted them to it
15:59:12 Kathy: another example: educational interactive elements – a lot of times they are locking you in – if you start using the keyboard you have to finish with the keyboard. If you started with a mouse, same thing – they've changed the interface based on what you start out using
15:59:23 video of the flickr "touch OR mouse" problem that yahoo fixed after i alerted them to it https://www.youtube.com/watch?v=f_2GKsI9TQU
15:59:25 David: I can see a huge benefit in being able to swap devices in the middle of a task
15:59:30 Kathy: that's what this is addressing
15:59:54 q-
16:00:24 David: we can pass it on to the working group with a note that says something like that. To me the language right now doesn't speak to this – big huge red flags all over the place, even in the current language. I'd like to see something that basically says you can swap out what you're using. And I don't know how to say that at this point
16:01:06 Kim: "swap out" is not wide enough for my scenarios (using speech and a mouse device in concert)
16:01:40 Kathy: limiting to certain user groups – if you aren't able to use those methods you're in trouble because you can't do it?
16:01:59 David: I've never in 15 years of doing accommodations with people seen a function that should have been done with the mouse but couldn't be
16:02:09 proposed new SC text: "All functionality can be operated even when a user switches between input mechanisms"
16:02:11 Kathy: I've seen that specifically in educational materials. We've done usability studies with it as well and it ends up being problematic
16:02:31 Kathy: low vision users who use the keyboard also want to use the mouse in one area, and then they're blocked from using the keyboard and that's frustrating
16:02:59 David: it sounds like several people in the group want to see this go through to the next level with the working group. I won't stand in the way of that. It just doesn't seem like the same level of maturity as some of the others.
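[Editor's note: a reconstruction, not Yahoo's actual code, of the "touch OR mouse" anti-pattern Patrick and the linked video describe – feature-detecting touch and then treating the device as touch-only, which locks out mouse users on mixed-input hardware such as a Surface 3. Names are illustrative.]

    const gallery = document.querySelector('#gallery') as HTMLElement;
    const openPhoto = () => console.log('photo opened');

    // BAD: 'ontouchstart' existing does NOT mean touch is the only input;
    // on a touch-capable laptop the mouse branch is never registered.
    if ('ontouchstart' in window) {
      gallery.addEventListener('touchstart', openPhoto);
    } else {
      gallery.addEventListener('mousedown', openPhoto);
    }

    // BETTER: one unconditional, input-agnostic registration.
    gallery.addEventListener('click', openPhoto);

(In real code only one of the two approaches would be kept, to avoid double-firing; they are shown together purely for contrast.)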
16:03:31 Patrick: being mindful of the time, possibly as a resolution – all functionality supported when a user switches between input mechanisms
16:03:46 changes committed https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md
16:03:47 Patrick: making that change
16:04:09 Kathy: Chris – draft of M14?
16:04:17 Chris: yes – it's essentially done
16:04:39 RESOLUTION: Patrick to make final edits and ready for working group
16:05:27 Kathy: next week we will jump into the orientation SC that Jon has drafted. Other two input SCs from Patrick?
16:05:36 Patrick: will work on them for next week
16:06:24 did final edits i think https://github.com/w3c/Mobile-A11y-Extension/blob/gh-pages/SCs/m6.md
16:07:00 still wondering if this wording would give the developers control: All functionality can be operated without limiting interaction to a particular type of input mechanism
16:07:03 changed the Principle/Guideline. don't think it now needs any glossary additions/changes
16:07:17 ok, on the glossary additions.
16:08:08 rrsagent, make minutes
16:08:08 I have made the request to generate http://www.w3.org/2016/10/20-mobile-a11y-minutes.html Kim
16:09:10 chair: Kathleen_Wahlbin
16:10:52 regrets+ Jonathan
16:10:54 rrsagent, make minutes
16:10:54 I have made the request to generate http://www.w3.org/2016/10/20-mobile-a11y-minutes.html Kim
16:12:38 patrick_h_lauke has left #mobile-a11y
17:38:27 rrsagent, bye
17:38:27 I see no action items