15:55:39 RRSAgent has joined #indie-ui 15:55:39 logging to http://www.w3.org/2014/10/28-indie-ui-irc 15:55:41 RRSAgent, make logs public 15:55:41 Zakim has joined #indie-ui 15:55:43 Zakim, this will be INDIE 15:55:43 ok, trackbot, I see WAI_Indie()11:00AM already started 15:55:44 Meeting: Independent User Interface Task Force Teleconference 15:55:44 Date: 28 October 2014 15:55:46 zakim, call Monterey 15:55:46 ok, MichaelC_; the call is being made 15:55:47 +Monterey 15:56:23 rrsagent, do not start a new log 15:59:11 -> http://www.w3.org/2014/10/27-indie-ui-minutes.html Minutes from yesterday 16:00:17 Ryladog has joined #indie-ui 16:07:00 richardschwerdtfeger has joined #indie-ui 16:08:10 JF has joined #indie-ui 16:10:42 scribe: Rich 16:11:39 Katie: Do people think Shapes are enough? 16:12:03 Katie: I think we need the user context. The user was missing from the discussion 16:12:30 Rich: We need to allow the author to provide information to enable the shape to be controlled in a device independent way 16:13:30 Rich: I thought that was the thing missing - we have this object in the DOM we want to send it device independent info - they had the insert text thing - but to make the decision on how to do it 16:13:54 you have to know more information. Event driven is too slow - it has to propagate 16:14:20 is DOM 3 events enough? You fire an event and then it bubbles. Does that give us enough? 16:14:44 Katie: What is the DOM 4 paradigm? 16:15:18 janina_ has joined #indie-ui 16:15:31 Rich: I am not quite sure if the DOM paradigm is the right way now.
And this was a question they asked us 16:16:19 Rich: if we unlock that - we have to be a little careful - I think he was looking to prototype things 16:16:39 RS: I think we need to expose some things in the API to the DOM 16:17:17 RS: James was trying to use polyfills - we need to know - what is the increment - we have to provide that information 16:18:29 RS: I think we need to push on getting information out - if we can leverage native host language semantics and ARIA, developers make use of that with device independence 16:19:09 RS: This would be good for Tab Index. One thing we missed talking about: it is easy to reflect the role attribute but not the mountain of others 16:20:28 Katie: CMN is working on AccessKey 16:21:19 John: The concept behind access keys that Charles is working on is he wants to make them discoverable. He wants to also deal with conflicts. 16:21:52 John: today access keys are definitive. What he wants to do is: here is what you want, and the browser can map it the way it wants to. 16:22:28 zakim, Monterey has Susann_Keohane, Marc_Johlic, John_Foliot, Katie_Haritos-Shea, Mary_Jo_Mueller, Rich_Schwerdtfeger, Kurosawa_Takeshi, Cynthia_Shelly, Janina_Sajka, Michael_Cooper, Joanie_Diggs 16:22:28 +Susann_Keohane, Marc_Johlic, John_Foliot, Katie_Haritos-Shea, Mary_Jo_Mueller, Rich_Schwerdtfeger, Kurosawa_Takeshi, Cynthia_Shelly, Janina_Sajka, Michael_Cooper, Joanie_Diggs; 16:22:29 marcjohlic has joined #indie-ui 16:22:32 ... got it 16:23:10 John: He wanted to do what Opera did, which was to list the access key assignments and allow the user to remap them 16:23:40 http://www.w3.org/TR/xhtml-access/ 16:24:12 cyns_ has joined #indie-ui 16:24:53 john: the key is a tight binding 16:30:17 Rich: In order to send the events, what do we need to send to the browser? You basically need a range of Nodes and an offset. That is basically what we do in AAPI...
16:31:11 ...in a selection module you have to have a node and an offset at the start - and at the end. Whatever they do in the event model they are going to have that running.... 16:31:22 ...he wants to pass all of that selection state in the event? 16:31:26 CS: I think so 16:32:02 Ryladog has joined #indie-ui 16:32:16 Rich: Why don't we make a list of questions 16:45:32 janina__ has joined #indie-ui 16:51:27 https://dvcs.w3.org/hg/dom3events/raw-file/tip/html/DOM3-Events.html#event-type-select 16:52:14 MaryJo has joined #indie-ui 16:52:15 http://w3c.github.io/selection-api/ 16:52:21 http://www.w3.org/TR/selection-api/ 16:53:28 jcraig has joined #indie-ui 16:53:39 http://w3c.github.io/selection-api/ 16:54:27 richardschwerdtfeger has joined #indie-ui 16:57:52 http://w3c.github.io/selection-api/ 17:00:27 http://www.w3.org/TR/selection-api/ 17:02:24 q+ 17:02:51 q+ to announce HTML5 is now a W3C Recommendation !!! 17:03:52 q- 17:10:00 1. Selection involves a starting node and an offset within that node for, say, text. Is your intent to only store that information in the event, or can we get that from the Document object selection API? 17:10:00 If it is only in the event data, other technologies (like assistive technologies) don't have access to what is selected today. Selection, today, is retrieved from the Document object vs. the event instance. We would need to remap from the event to the accessibility API. 17:10:02 What is the relationship between the event data for selection and what is stored in the DOM? … document.getSelection. 17:10:03 2. We believe that to do this correctly (beyond just selection) we need to have state data that can be exposed by the author to the browser. For example, a slider adjustment from a device independent source would need to know the range and increment to be able to know what commands were needed to advance the slider position. In WAI-ARIA we expose much of the information that is needed based on the role of the object.
This was derived from accessibility 17:10:05 APIs on multiple operating system platforms that in turn were derived from common state and property data found in GUIs over the years. We believe it is essential that you include this information in your shape (API Design Pattern) planning. 17:10:06 We ran into issues with implementing polyfills in IndieUI around sliders, as we needed this additional information. 17:10:07 We believe this is a way to mainstream ARIA work, as authors gain mainstream benefits. 17:10:08 What is your plan to address this? 17:12:32 +JasonJGW 17:19:04 http://msdn.microsoft.com/en-us/library/windows/desktop/ee671665(v=vs.85).aspx 17:19:13 http://msdn.microsoft.com/en-us/library/windows/desktop/ff384841(v=vs.85).aspx#SelectingText 17:22:43 IE does support document.getSelection http://msdn.microsoft.com/en-us/library/ie/ms535869(v=vs.85).aspx 17:23:17 sorry, wrong link http://msdn.microsoft.com/en-us/library/ie/ff975169(v=vs.85).aspx 17:24:48 janina_ has joined #indie-ui 17:28:27 jcraig has joined #indie-ui 17:30:45 Questions and Discussion Points: 17:30:46 1. Selection involves a starting node and an offset within that node for, say, text. Is your intent to only store that information in the event, or can we get that from the Document object selection API? 17:30:47 If it is only in the event data, other technologies (like assistive technologies) don't have access to what is selected today. Selection, today, is retrieved from the Document object vs. the event instance. We would need to remap from the event to the accessibility API. 17:30:49 What is the relationship between the event data for selection and what is stored in the DOM? … document.getSelection. 17:30:50 Note: we believe there are advantages to having both. The event would allow us to speak the text automatically without having to round trip back to the document. The document access is important for browsing the content and dealing with things like embedded objects (e.g.
attachments) in a web mail application. 17:30:52 17:30:53 2. We need specific information in the events. If you are including data in the event for selection, we would like to see the following, to reduce round tripping to the DOM or accessibility API: 17:30:55 - Start node and offset 17:30:56 - End node and offset 17:30:58 - The text string of the selected area 17:31:00 Mainstream example: I am creating an audio UI that includes speech output. This would be fast. 17:31:01 Aging or dyslexic users: Having basic feedback that augments the selection enables better comprehension in users. 17:31:02 3. We believe that to do this correctly (beyond just selection) we need to have state data that can be exposed by the author to the browser. For example, a slider adjustment from a device independent source would need to know the range and increment to be able to know what commands were needed to advance the slider position. In WAI-ARIA we expose much of the information that is needed based on the role of the object. This was derived from accessibility 17:31:04 APIs on multiple operating system platforms that in turn were derived from common state and property data found in GUIs over the years. We believe it is essential that you include this information in your shape (API Design Pattern) planning. 17:31:05 We ran into issues with implementing polyfills in IndieUI around sliders, as we needed this additional information. 17:31:06 We believe this is a way to mainstream ARIA work, as authors gain mainstream benefits. 17:31:07 What is your plan to address this? 17:31:48 -JasonJGW 17:44:19 looks great rich 18:01:24 jcraig has joined #indie-ui 18:01:55 +JasonJGW 18:17:25 topic: Intention events discussion continued 18:17:31 janina_ has joined #indie-ui 18:17:46 cyns has joined #indie-ui 18:17:52 scribe: cyns 18:18:18 RS: summary...
The author can initiate an intent based event based on the device dependent event 18:18:48 RS: also the UA can generate a user intent event, which can come from aapi or from UA specific code 18:18:57 q+ to ask about the comparison (similarities and differences) between this approach and IndieUI 18:19:25 RS: new selection api would not change. This is based off the current selection model but gives the author more control 18:19:51 BP: This allows the author to modify the selection before it shows up in the document selection or the viewport 18:20:03 JW: what about the actions done on the selection? 18:20:26 BP: it would be covered, but not by selection. For example, drop would be an intention. paste, copy, etc are intentions 18:20:43 BP: so you create a selection, and then you can perform intentions on it 18:21:39 BP: it clarifies the difference between the action the user takes (keydown) and the intention (scroll) 18:22:37 joanie: I was thinking about notifications... Will all the info about the selection, like what's in the selection object, be available? 18:22:54 BP: yes, it's there in the selection object. what's the use case? 18:23:07 joanie: screen reader reading the selection. 18:23:53 BP: good idea to have a way for AT to know that an intention happened and completed 18:24:29 MC: there are events that return an object that is the result of the event. 18:24:39 BP: do you need it in the event? 18:24:49 joanie: yes, we have to do a lot of round tripping. 18:25:34 joanie: when text selection changes, I am notified that it changed on an accessible object assoc with node in dom. round trip for start/end offset. round trip to text at offsets. etc. 18:26:11 joanie: now that you have started explaining more, my use case is more about selection api than intention events 18:26:17 BP: I work on that too. 18:26:40 Not related to "user intent" events, but Joanie was talking about the inverse: web application notifying the platform APIs what just happened. e.g.
explaining some of the business logic of the web app. 18:26:47 BP: this seems like it's on the other side of the browser. it's not script, it's how does the browser give the AT/API the info. 18:26:51 apologies, I have to go 18:27:09 -??P0 18:27:47 e.g. Twitter responds to j and k as "previous" and "next" intents. Screenreader gets a series of focus, node deleted, node added, and selection changed events. 18:27:56 zakim, Ben_Peters has joined Monterey 18:27:56 sorry, MichaelC, I do not recognize a party named 'Ben_Peters' 18:28:06 zakim, Ben_Peters is also in Monterey 18:28:06 I don't understand 'Ben_Peters is also in Monterey', MichaelC 18:28:15 zakim, Monterey also has Ben_Peters 18:28:15 +Ben_Peters; got it 18:28:22 joanie: twitter, when you press j, I get a ton of accessibility events, but none of them say "move to next item". If there was an after event that could be communicated via the accessibility api, then I would know that orca should speak the next tweet. 18:28:28 richardschwerdtfeger has joined #indie-ui 18:28:35 zakim, Monterey also has James_Craig 18:28:36 +James_Craig; got it 18:28:48 Would be nice if the web app could declare, "MOVE TO NEXT is about to happen"; event spew; "MOVE TO NEXT just happened" 18:29:02 BP: need a way to announce what just happened in a concise way 18:29:31 q+ 18:29:44 q- later 18:30:36 CS: we could work that use case into WAPA 18:31:19 BP: please give me a bug 18:31:28 benjamp has joined #indie-ui 18:32:28 jamesn has joined #indie-ui 18:32:34 JW: getting into selection api, might generalize... when a script modifies a document, the way we normally enable AT to find out when that modification has ended is to use aria live regions.
In this case, script is modifying the selection, and I wonder if there is a locking mechanism to prevent selection from being queried while it's being modified, and to notify AT when it's done 18:32:37 https://github.com/w3c/editing-explainer/issues 18:33:01 please file a bug tracking the twitter issue and the need for custom Intention events on github above 18:33:23 JS: lots of aapi events when hitting j in twitter 18:33:25 q? 18:33:32 CS: aapis are noisy and low level 18:33:41 ack jo 18:33:42 ack j 18:33:45 JS: something similar to aria live region for the selection object 18:34:34 joanie: the answer to the 'j situation' is that right now you're talking about browser defined events, but you're thinking about custom events, which 'next item' would be? 18:34:51 BP: yes I'm thinking about it, but it's long term 18:35:09 joanie: is author initiating or consuming intention event? 18:35:12 BP: both 18:35:56 BP: author can catch any event and then call the declareIntention API and say what the intention of that event is 18:36:06 KHS: isn't that custom? 18:36:22 BP: you can only send the defined set of events. custom would be making your own events 18:36:36 q+ 18:37:43 JC: so the server can declare the user's intention? 18:37:51 ack m 18:37:51 MichaelC, you wanted to ask about the comparison (similarities and differences) between this approach and IndieUI 18:38:01 BP: yes, the app can. If it's wrong that's an app bug 18:38:11 ack me 18:38:17 q? 18:38:22 MC: how are these similar to and different from indie ui events? we need to figure that out, but not right now 18:38:31 q+ 18:39:10 RS: talking about use cases already in IRC 18:39:41 RS: speech api in browsers. would be good to connect to intent events 18:39:54 BP: how does that differ from the browser itself? 18:41:33 JC: I think what Rich is describing is that the user has a greasemonkey script or browser extension that acts as AT, can listen to these selection events and then call the speech api to have it spoken. I think those events exist 18:41:42 RS: yes, but you have to roundtrip. 18:42:03 MC: is that a feature request on the selection api? 18:42:33 RS: android puts the selection in the event and it's nice 18:43:05 RS: what about annotation? 18:43:11 q+ to mention that providing the outcome of the event requires flexibility to handle the various underlying things that can happen 18:43:13 file a bug in Selection API to include getSelection in the event: https://github.com/w3c/selection-api 18:43:57 CS: both selection and annotation have ranges that may span elements 18:43:59 ack me 18:44:06 q? 18:44:11 q? 18:44:36 q+ 18:44:47 ack b 18:44:53 q+ to mention concurrent editing related to selection API 18:44:55 richardschwerdtfeger has joined #indie-ui 18:45:07 BP: I want to talk about the events that are in indieui now, and how they fit into the specs being built in webapps 18:45:34 BP: indieui has undo, scrolling, etc. which we have too 18:46:11 q+ to talk about process with transmogrifying IndieUI to other events 18:46:40 BP: are you interested in putting the things that webapps/sites need for mainstream use into webapps with good accessibility, and only keeping the things we're not covering in indieui? 18:46:44 JC: yes, presuming that those mainstream APIs provide enough introspection for accessibility and AT needs 18:46:46 JS: yes 18:47:33 MC: if there are going to be general purpose events that cover our requirements, we prefer these. from the indieui wg, we need to think about how that impacts our deliverables. 18:48:18 MC: if you think some will be taken up in webapps, maybe we should look at whether we can do that with all of them. indieui might continue to exist as a non-deliverable group to push use cases.
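[Editor's sketch] The round-tripping concern above (selection data living only on the Document object, forcing AT to query back for offsets and text) is what the proposal to include getSelection data in the event is meant to address. A minimal sketch of such an event payload, using plain objects in place of DOM Text nodes; the function name and payload shape are assumptions for illustration, not any shipped or proposed API surface:

```javascript
// Hypothetical payload for a selection intention event, carrying the
// fields the task force asked for (start/end node, offsets, selected
// text) so AT need not round-trip back to the document.
// Plain objects with a `data` string and `next` sibling link stand in
// for DOM Text nodes; real code would walk the DOM instead.
function makeSelectionEventDetail(startNode, startOffset, endNode, endOffset) {
  let text;
  if (startNode === endNode) {
    // Selection within a single text node.
    text = startNode.data.slice(startOffset, endOffset);
  } else {
    // Selection spanning nodes: tail of start, middles, head of end.
    text = startNode.data.slice(startOffset);
    let n = startNode.next;
    while (n && n !== endNode) {
      text += n.data;
      n = n.next;
    }
    text += endNode.data.slice(0, endOffset);
  }
  return { startNode, startOffset, endNode, endOffset, text };
}

// Example: two adjacent text nodes, selection spanning both.
const a = { data: "Hello, ", next: null };
const b = { data: "world!", next: null };
a.next = b;
const detail = makeSelectionEventDetail(a, 3, b, 5);
console.log(detail.text); // "lo, world"
```

With the text string present in the event, a screen reader (or the speech-API extension Rich describes) could speak the selection immediately, falling back to document access only for richer browsing.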
18:49:03 BP: events are scroll, undo/redo, selection 18:49:23 JC: mark request is for selection in rich text but also selectable items like table rows and lists. 18:49:48 q? 18:49:49 q- 18:49:52 q? 18:50:01 BP: need to disambiguate text selection and item selection across the web. selection api might own this 18:50:35 CS: what about media events? 18:51:24 JS: these are general purpose events on OSes but not on the web 18:52:10 MC: one idea would be to adopt the model of intention events, develop the ones that no one else is doing on that model, and then roll it into another spec 18:52:14 ack james 18:52:17 ack j 18:52:17 jcraig, you wanted to mention concurrent editing related to selection API 18:52:35 JC: media events you can trigger via methods other than keyboard 18:53:06 q+ jcraig 18:53:08 BP: we have traction on some events in web apps. indieui has traction on other events. you should keep those for now because otherwise they won't have any traction 18:53:14 q+ jason 18:53:43 joanie: media events feel similar to move next in twitter 18:53:59 jcraig has joined #indie-ui 18:54:00 BP: there is no native control in move next, but there is in media 18:54:10 JS: there's html api for them 18:54:14 q? 18:54:36 joanie: move next/previous isn't really custom. it exists in native OS 18:54:50 joanie: might be good to expand the set of intention events 18:55:24 q+ to say that wapa talks about virtualized list management and we could add twitter case 18:55:54 ack jc 18:55:57 JC: concurrent editing poses challenges for selection api. need multiple selections at the same time, assoc with other users 18:56:07 BP: selection is the user at this machine 18:56:55 BP: other user's selection should be a different object. 18:57:04 JC: I think we want the same events/api 18:57:56 JC: example... there are a lot of concurrent editing coding sites. the purpose is to do a javascript challenge while I watch you edit it. need an api to make that concept accessible in general.
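[Editor's sketch] The Twitter j/k discussion above describes an author catching a device-dependent event (a keydown) and re-announcing it as a device-independent intent such as "move to next item", which AT could then observe instead of decoding a spew of focus and node-changed events. A minimal sketch of that pattern; the intent names, `KEY_TO_INTENT` table, and dispatcher are hypothetical illustrations, not the shape of the declareIntention proposal or any shipped API:

```javascript
// Hypothetical mapping of raw keys to device-independent intents,
// per the Twitter example (j = next tweet, k = previous tweet).
const KEY_TO_INTENT = {
  j: "moveToNext",
  k: "moveToPrevious",
};

// A tiny dispatcher: the author registers intent listeners, and raw
// key presses are translated to intents before listeners fire.
function createIntentDispatcher() {
  const listeners = {};
  return {
    on(intent, fn) {
      (listeners[intent] = listeners[intent] || []).push(fn);
    },
    // Translate a raw key press into an intent, if one is mapped;
    // returns the intent name so an AT-facing hook could announce it.
    handleKey(key) {
      const intent = KEY_TO_INTENT[key];
      if (!intent) return null;
      for (const fn of listeners[intent] || []) fn(intent);
      return intent;
    },
  };
}

// Usage: an AT-facing hook learns "moveToNext" in one concise event
// instead of a series of focus/node-deleted/node-added events.
const d = createIntentDispatcher();
const seen = [];
d.on("moveToNext", (i) => seen.push(i));
d.handleKey("j"); // seen is now ["moveToNext"]
```

The point of the indirection is the one BP makes: the user-taken action (keydown) and the intention (move to next) become separately observable, and the key-to-intent table is the part a user or UA could remap.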
18:58:11 JC: whether or not you call it selection, it would be similar 18:58:25 BP: maybe it inherits from selection, but it's a different thing 18:58:51 BP: might need its own spec. might be related to annotations... or not... 18:59:16 JC: annotations are permanent and selection is temporary 18:59:46 MC: feels like a version 2 problem 19:00:13 JC: note that we are aware of the use case but not dealing with it just now 19:00:27 scribe: jcraig 19:00:28 q? 19:00:38 scribe: jcraig 19:01:02 ack jas 19:02:03 JW: a possibility for remote selection is to consider it a transient readonly annotation 19:03:12 JW: re: scope of WG events activities: One approach would be to move some events into other groups. Another approach is to move that work here. 19:03:56 JW: point being that it may be appropriate to modify the TFs to include other groups that are working on similar problems 19:04:15 JW: We want to avoid overlap and inconsistencies 19:04:22 q+ 19:04:26 q? 19:04:48 BP: that is the plan for the explainer, to cover all the use cases 19:04:49 janina_ has joined #indie-ui 19:05:03 q? 19:05:31 ack cy 19:05:31 cyns, you wanted to say that wapa talks about virtualized list management and we could add twitter case 19:05:32 ack me 19:05:49 ack cy 19:07:54 jcraig_ has joined #indie-ui 19:08:08 CS: big problem for accessibility; for mainstream users, you can hack it 19:08:39 BP: goal is to make this easy enough that it's desirable for authors to do it the right way, not the hacky way 19:09:47 RS: ??? 19:10:20 CS: groups of intentions that are related to ??? 19:10:54 BP: explainer is the umbrella work; does not include all individual pieces 19:12:16 MC: considering including IndieUI in the Editing TF 19:12:43 MC: Would like to sort out where all the discussions are happening 19:13:11 BP: We need to rename the TF to something like "editing and intents" 19:13:41 MC: Combining TFs is an agenda item for the near future. 19:16:52 CS: User Intention Events is a way of doing things.
There are several groups of things we want to do this way, such as editing, scrolling, ui widget state changes (aria), selection, media, etc. 19:22:01 janina_ has joined #indie-ui 19:24:49 -JasonJGW 19:25:27 CSS providing sufficient contrast for placeholder text 20:24:54 jcraig has joined #indie-ui 20:31:09 +JasonJGW 20:31:27 kurosawa has joined #indie-ui 20:35:26 MichaelC has joined #indie-ui 20:37:16 jcraig has joined #indie-ui 20:38:36 marcjohlic has joined #indie-ui 20:40:08 MaryJo has joined #indie-ui 20:43:25 scribe: MichaelC 20:43:45 topic: Kim Patch requirements input 20:44:01 kp: Have worked on UAAG and the Mobile TF 20:44:36 and have a speech recognition hands-free app 20:44:48 for people who need speech for everything 20:44:53 so command and control is a priority 20:45:28 have been thinking about how touch and speech work together 20:46:02 orthodoxy has been that input methods need to work together 20:46:08 but my view is they can be separated 20:46:22 e.g., using speech and mouse 20:46:40 use the pointer, but then saying "click paste" 20:47:08 pastes where the pointer is 20:47:31 jc: also could combine with eye tracker 20:47:32 kp: yes 20:47:41 using conventional and unconventional methods together 20:47:53 another example is gestures, like sign language 20:48:01 whaddaya think? too soon? 20:48:40 jc: initial implementation is usually to mimic another device, like a keyboard 20:48:44 the interpreter does that 20:48:53 triggers a script / macro etc. 20:49:25 richardschwerdtfeger has joined #indie-ui 20:50:02 kp: what about when using two input methods together?
20:50:56 jc: sounds like the ideal is to have an entire command / control interface controllable by a specific modality 20:51:22 which might be less critical for mainstream users though important to PWD 20:51:25 kp: we don't know what people will adopt 20:51:48 jc: a11y features often adopted as mainstream, but not by majority 20:52:03 janina_ has joined #indie-ui 20:52:10 kp: take mobile phones 20:52:19 have separate touch interface, but have keyboard 20:52:24 but can't say "ctrl-p" to print 20:53:13 an ideal browser would have that function so you can plug in an external keyboard 20:53:27 js: supporting keyboard on phone is out of scope for IndieUI 20:53:43 but actuating intentional events via keyboard is in our scope 20:54:03 jc: take undo or copy/paste 20:54:10 have different key combinations on different platforms 20:55:41 or consider the click event that is effectively used for keyboard and mouse activation 20:56:23 or say on a map site where the wheel zooms instead of pans 20:56:26 that's confusing 20:56:43 with IndieUI could map desired physical events to the ultimate action 20:56:50 and the user could re-map 20:57:14 janina__ has joined #indie-ui 20:57:32 so taking a speech command, the interpreter could figure out and fire the desired intended event 20:57:38 that's the goal - though we're not there yet 20:58:01 kp: what about supporting multiple modalities at once? 20:58:12 jc: web app would care about the order events came in 20:58:28 in mouse plus speech example, doesn't have to know about it 20:58:39 kp: add gesture to turn on / off mic 20:58:47 jc: that's outside this context 20:58:51 kp: gesture for click? 20:58:56 jc: app still doesn't care 20:58:59 just gets the events 20:59:11 kp: anything special the mobile tf should examine? 20:59:26 often we look at keyboard issues and assume IndieUI is addressing them 20:59:56 khs: are mobile interfaces out of scope of IndieUI?
21:00:04 jc: yes 21:00:37 kp: keyboard accessibility has been the base fallback input event 21:00:45 khs: hope that will migrate to intentions 21:01:13 jc: native platforms will handle it 21:01:27 there may be many ways to actuate a given intention 21:01:52 the problem is when the author uses a non-standard control 21:01:59 ARIA allows us to get values 21:02:09 but hard for the user to manipulate 21:02:51 and input values 21:03:01 kp: so I would be able to use speech, touch, gesture 21:03:09 and access both native and non-native controls 21:03:10 jc: yes 21:03:27 kp: is the chosen word "intention"? 21:03:33 js: we're sorting that out 21:03:54 it's our working term 21:04:03 khs: so be careful of putting it in techniques 21:04:26 mc: use it in drafts, just recognize we will need to update when the terms stabilize 21:04:48 jc: had sent comments to UAAG with a suggestion 21:05:11 kp: UAAG has a clarification on "keyboard accessibility" that it applies to independent controls 21:05:20 but the IndieUI terms might work 21:05:32 khs: mobile techniques not there yet? 21:05:37 kp: starting to look at that stuff 21:05:57 jc: you're looking at "intent"? 21:06:10 maybe "input independent" is better 21:06:23 or "device or modality independent" 21:06:34 "intent" has a slightly different meaning 21:08:08 kp: what else should we be thinking about? 21:08:30 waypoints we should be looking for? 21:08:34 richardschwerdtfeger has joined #indie-ui 21:09:04 mc: take a look at heartbeats 21:09:26 jc: suggest you observe our interaction with WebApps today 21:09:28 to get some non-WAI thinking 21:09:39 kp: what are your key work points coming up? 21:10:06 jc: we're sorting out overlap with other groups 21:10:44 looking to meet all the use cases - without worrying about which spec addresses which part of the use cases 21:11:29 kp: speech users struggle with single-key shortcuts 21:11:42 do you know the issue?
21:11:45 21:11:50 it's not obvious to people 21:12:05 but it's easy to accidentally trigger such an action 21:12:17 by happening to say a word that is a command 21:13:29 jc: so AT sends the letters, not the words? 21:13:34 kp: yes, it doesn't know what's what 21:13:45 jc: Then that's a bug in the AT. AT has a way to differentiate, so it shouldn't be triggering accidental or raw input. 21:14:04 kp: it can be a pain in the arse 21:16:50 q+ 21:16:51 21:17:17 js: we've been discussing that mappings could be packaged as a product 21:17:24 re-mappability is part of the design 21:17:25 q- rich 21:18:20 jc, kp: 21:18:21 ack jo 21:18:49 jd: does this group have a speech recognition representative? 21:19:00 js: no, though there are other modalities not represented 21:19:08 jc: and some that are; would be nice to have that POV 21:19:49 jd: in the example of "save as", it's faster to issue the command than to navigate the menu 21:20:14 kp: I can control a new app via the menu 21:20:22 if there are only commands, I have to learn them 21:20:43 so it's harder at the start, still need the menu 21:20:46 but easier once learned 21:20:59 keep in mind there are billions of possible speech commands 21:21:06 early attempts tried to include them all 21:21:10 run out of space 21:21:24 q? 21:21:32 in the hardware brain and the bio-brain 21:21:52 jd: as an AT developer I am interested in core actions 21:22:19 have observed the emergence of a broader class of user intentions 21:22:25 e.g., "next tweet" 21:22:46 even just "play" on a video without having to navigate to it etc. 21:23:05 there exists a broad class of actions shared among applications 21:23:54 jc: much of this is in the menu, so controllable 21:24:28 jd: 21:24:58 if we could identify those, we could meet the needs of more users 21:25:48 kp: am hoping for a standard to help app folks have consistent approaches 21:26:05 make it easier to remember all the commands 21:26:23 ack j 21:27:58 jgw: mixing event types means there could be conflict over what sort of events might be sent 21:28:13 and the app developer will have to sort it out 21:28:35 but might not know specifics of the user environment 21:29:41 jc: yes, that is a concern 21:29:49 -JasonJGW 21:29:55 kp: yes that's important 21:30:08 I've been assuming I can use a mix 21:30:37 jc: could have one intent trigger another, trigger another 21:30:45 kp: want to get to the final intent 21:30:50 but don't want to lose the daisy-chain 21:31:29 zakim, Monterey also has Kim_Patch 21:31:29 +Kim_Patch; got it 21:31:33 zakim, list participants 21:31:33 As of this point the attendees have been Susann_Keohane, Marc_Johlic, John_Foliot, Katie_Haritos-Shea, Mary_Jo_Mueller, Rich_Schwerdtfeger, Kurosawa_Takeshi, Cynthia_Shelly, 21:31:36 ... Janina_Sajka, Michael_Cooper, Joanie_Diggs, JasonJGW, Ben_Peters, James_Craig, Kim_Patch 21:31:38 zakim, drop Monterey 21:31:38 Monterey is being disconnected 21:31:40 WAI_Indie()11:00AM has ended 21:31:40 Attendees were Susann_Keohane, Marc_Johlic, John_Foliot, Katie_Haritos-Shea, Mary_Jo_Mueller, Rich_Schwerdtfeger, Kurosawa_Takeshi, Cynthia_Shelly, Janina_Sajka, Michael_Cooper, 21:31:40 ...
Joanie_Diggs, JasonJGW, Ben_Peters, James_Craig, Kim_Patch 21:32:37 kp: will review specs when they're published 21:33:20 various: here's a bunch of stuff you can read 21:33:24 kp: please send me refs 21:34:27 janina_ has joined #indie-ui 21:35:41 richardschwerdtfeger has joined #indie-ui 21:38:13 Ryladog_ has joined #indie-ui 21:44:28 rrsagent, make minutes 21:44:28 I have made the request to generate http://www.w3.org/2014/10/28-indie-ui-minutes.html MichaelC 21:59:18 kurosawa_ has joined #indie-ui 22:09:01 janina_ has joined #indie-ui 22:12:52 kurosawa_ has joined #indie-ui 22:27:30 jcraig has joined #indie-ui 22:30:24 MichaelC_ has joined #indie-ui 22:30:45 kurosawa has joined #indie-ui 22:40:22 richardschwerdtfeger has joined #indie-ui 23:00:15 richardschwerdtfeger has joined #indie-ui 23:11:33 richardschwerdtfeger has joined #indie-ui 23:15:32 jcraig has joined #indie-ui 23:34:10 Zakim has left #indie-ui 23:49:21 richardschwerdtfeger has joined #indie-ui 23:58:08 jcraig has joined #indie-ui 00:01:10 jcraig has joined #indie-ui 00:10:08 MichaelC has joined #indie-ui 00:40:41 jcraig has joined #indie-ui 01:04:44 jcraig has joined #indie-ui 01:32:04 jcraig has joined #indie-ui 03:47:47 janina_ has joined #indie-ui 03:48:37 janina_ has changed the topic to: IndieUI Teleconference; Wednesday 12 November at 22:00Z for 60 minutes; Zakim 46343# 10:56:09 smaug has joined #indie-ui