16:37:01 RRSAgent has joined #ua
16:37:01 logging to http://www.w3.org/2010/08/03-ua-irc
16:37:03 RRSAgent, make logs public
16:37:03 Zakim has joined #ua
16:37:05 Zakim, this will be WAI_UAWG
16:37:05 ok, trackbot, I see WAI_UAWG(WGroup)12:00PM already started
16:37:06 Meeting: User Agent Accessibility Guidelines Working Group Teleconference
16:37:06 Date: 03 August 2010
16:39:09 greg has joined #ua
16:39:44 Kim has joined #ua
16:40:11 rrsagent, make logs public
16:40:17 rrsagent, make minutes
16:40:17 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html jeanne
16:40:42 present+ Jim, Greg, Jeanne, Kim, Kelly
16:41:01 kford has joined #ua
16:41:52 +AllanJ
16:42:20 rrsagent, make minutes
16:42:20 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html kford
16:43:08 Jeanne, can you paste a link to the doc again?
16:50:41 http://www.w3.org/WAI/UA/2010/ED-UAAG20-20100802/MasterUAAG20100802.html
16:51:32 topic: Principle 2
17:15:26 [writing assignments]
17:16:03 2.1.3 reference http://lists.w3.org/Archives/Public/w3c-wai-ua/2008OctDec/0050.html
17:16:04 Jeanne and Kim were discussing the need for a stable point of regard; for example, when increasing text size in a long document, the point of regard should be saved.
17:20:19 -AllanJ
17:38:50 +AllanJ
17:39:42 +[IPcaller]
17:41:21 Jan has joined #ua
17:42:15 topic: 2.1.3 Accessible Alternative
17:42:34 Existing wording: 2.1.3 Accessible Alternative: If a feature is not supported by the accessibility architecture(s), provide an equivalent feature that does support the accessibility architecture(s). Document the equivalent feature in the conformance claim. (Level A)
17:44:12 New proposed rewording, intent and example:
17:44:14 1. If an item of the user agent user interface cannot be exposed through the platform accessibility architecture, then provide a ("separate but equal") equivalent alternative that is exposed through the platform accessibility architecture.
17:44:15 a. Users need to be able to carry out all tasks provided by the user agent. The purpose of this SC is to ensure that when circumstances do not allow direct accessibility to some items in the user agent, there is an accessible option that will let them complete their task.
17:44:17 b. For example, the user agent provides a single, complex control for 3-dimensional manipulation of a virtual object. This custom control cannot be represented in the platform accessibility architecture, so the user agent provides the user the option to achieve the same functionality through an alternate user interface, such as a panel with several basic controls that adjust the yaw, spin, and roll independently.
17:44:42 1. If an item of the user agent user interface cannot be exposed through the platform accessibility architecture, then provide an equivalent alternative that is exposed through the platform accessibility architecture.
17:44:43 a. Users need to be able to carry out all tasks provided by the user agent. The purpose of this SC is to ensure that when circumstances do not allow direct accessibility to some items in the user agent, there is an accessible option that will let them complete their task.
17:44:45 b. For example, the user agent provides a single, complex control for 3-dimensional manipulation of a virtual object.
This custom control cannot be represented in the platform accessibility architecture, so the user agent provides the user the option to achieve the same functionality through an alternate user interface, such as a panel with several basic controls that adjust the yaw, spin, and roll independently.
17:47:29 Jan notes that they originally chose "feature" over "item" because all the functionality that is grouped together in the default user interface should also be grouped together in the alternative UI, rather than having to go elsewhere for one portion of it. The idea is to think of features as a whole as working or not working, rather than breaking them down to the level of individual controls.
17:48:57 Kelly notes that in his workplace, the term "feature" is used for very large chunks of functionality that include controls, behaviors, interactions, etc.
17:49:57 Kelly refers to things like "spell checking" as a feature that includes numerous menu items, dialog boxes, controls, hotkeys, etc.
17:51:07 Jan: ATAG has high-level guidance as well, equivalent to Greg's point that this general idea can apply to almost anything in the document.
17:51:44 JR: Notes the "applicability notes" construct in ATAG 2.0: http://www.w3.org/TR/2010/WD-ATAG20-20100708/#part_a
17:51:57 JR: and http://www.w3.org/TR/2010/WD-ATAG20-20100708/#part_b
17:58:50 This discussion started because Greg argued that 2.1.3 is just a single instance of the general rule that would apply to every SC in this document.
18:00:20 That is, all *functionality* needs to be available through an accessible user interface, but not every user interface element. For example, I believe that it's acceptable to have a toolbar button that lacks keyboard access as long as the command is available through an accessible menu system.
18:02:43 I think I sent email to the list on April 8 on this topic, discussing scoping and exceptions.
18:03:31 I think eventually we'll be able to get rid of 2.1.3, as it will be redundant with some more general SC or conformance rule.
18:06:39 2.1.3 Accessible Alternative: If a component of the user agent user interface cannot be exposed through the platform accessibility architecture, then provide an equivalent alternative that is exposed through the platform accessibility architecture.
18:06:41 Intent: Users need to be able to carry out all tasks provided by the user agent. The purpose of this SC is to ensure that when circumstances do not allow direct accessibility to some items in the user agent, there is an accessible option that will let them complete their task.
18:06:42 Example: The user agent provides a single, complex control for 3-dimensional manipulation of a virtual object. This custom control cannot be represented in the platform accessibility architecture, so the user agent provides the user the option to achieve the same functionality through an alternate user interface, such as a panel with several basic controls that adjust the yaw, spin, and roll independently.
18:07:08 Jeanne suggested "component". Jan suggests "functionality".
18:08:34 Think about replacing "component" with "functionality" and appropriate wording.
18:09:05 rrsagent, make minutes
18:10:09 topic: 2.1.1
18:10:10 Intent for 2.1.1
18:10:10 Computers, including many smart phones, have accessibility features built into the operating system. Some well-known APIs for the Windows operating system are: MSAA, IAccessible2, [more]. Wherever technically possible, support the existing accessibility APIs.
18:10:20 Examples:
18:10:20 Browser A is developing a new user interface button bar for their Microsoft Windows product. The developer codes a call to the MSAA API for the functionality.
18:10:42 We didn't try to put together the list of resources, but that will be needed.
18:11:44 SC: 2.1.1 Platform Accessibility Architecture: Support a platform accessibility architecture relevant to the operating environment. (Level A)
18:13:09 Issue: Ensure UAAG docs have fully updated references to the various accessibility APIs.
18:13:09 Created ISSUE-72 - Ensure UAAG docs have fully updated references to the various accessibility APIs; please complete additional details at http://www.w3.org/WAI/UA/tracker/issues/72/edit .
18:13:30 topic: 2.1.2
18:13:31 My only suggestion would be to clarify in the Intent paragraph that this is about APIs for programmatic access (with assistive technology), rather than about all the "accessibility features built into the operating system".
18:13:56 2.1.2 Name, Role, State, Value, Description: For all user interface components including the user interface, rendered content, and alternative content, make available the name, role, state, value, and description via a platform accessibility architecture. (Level A)
18:14:11 The information that assistive technology requires is the Name (component name), the Role (purpose, such as alert, button, checkbox, etc.), State (current status, such as busy, disabled, hidden, etc.), Value (information associated with the component, such as the data in a text box, the position number of a slider, the date in a calendar widget), and Description (user instructions about the component).
18:14:11 For every component developed for the user agent, pass this information to the appropriate accessibility platform architecture or application program interface (API). Embedded user agents, like media players, can pass Name, Role, State, Value and Description via the WAI-ARIA techniques.
18:14:32 Example for browser (not complete)
18:14:39 A browser is developing a component to search a listing of files stored in folders. The text box to enter the search terms is coded to pass the following information: Name =
18:14:39 State
18:14:39 STATE_FOCUSABLE
18:14:39 STATE_SELECTABLE
18:15:12 Example for embedded media player using WAI-ARIA
18:15:15 A media player implements a slider to control the sound volume. The developer codes the component to pass the following information to the accessibility API:
18:15:15 Name = Volume control
18:15:15 Role = Slider
18:15:15 States & Values
18:15:15 aria-valuenow
18:15:17 The slider's current value.
18:15:18 aria-valuemin
18:15:22 The minimum of the value range.
18:15:22 aria-valuemax
18:15:24 The maximum of the value range.
18:15:27 Description
18:15:29 aria-describedby = 'Use the right or left arrow key to change the sound volume.'
18:16:10 Does the phrase "rendered content, and alternative content" in the SC include generated content, or do we need to add that explicitly?
18:16:44 To me, generated content comes from CSS.
18:17:02 It probably should be explicitly included.
18:17:21 We had an item in UAAG10 concerning this.
18:18:29 We can address it either in the SC explicitly or in the glossary entry for rendered content.
18:19:38 action: jeanne to Add "generated content" to the SC 2.1.2
18:19:38 Created ACTION-419 - Add "generated content" to the SC 2.1.2 [on Jeanne Spellman - due 2010-08-10].
18:20:16 Kelly questions why 2.1.2 and 2.1.6 are separate.
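A minimal markup sketch of the volume-slider example above, following the standard WAI-ARIA slider pattern (the standard attribute names are aria-valuemin and aria-valuemax). The id values and the current value of 35 are illustrative assumptions; only the role, range, and description text come from the example as minuted.

<!-- Illustrative only: a custom volume slider exposing Name, Role, State/Value,
     and Description through WAI-ARIA, per the 2.1.2 media player example. -->
<span id="volume-help">Use the right or left arrow key to change the sound volume.</span>
<div id="volume-slider"
     role="slider"
     tabindex="0"
     aria-label="Volume control"
     aria-valuemin="0"
     aria-valuemax="100"
     aria-valuenow="35"
     aria-describedby="volume-help"></div>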
18:20:40 UAAG10 CSS generated content is 6.9; it was a P2.
18:45:09 topic: Accessibility API reference
18:45:16 Microsoft Active Accessibility (MSAA) msdn.microsoft.com/en-us/library/ms971310.aspx
18:45:17 User Interface (UI) Automation msdn.microsoft.com/en-us/library/ms747327.aspx
18:45:19 GNOME Accessibility Toolkit (ATK) library.gnome.org/devel/atk/
18:45:20 KDE Assistive Technology Service Provider Interface (AT-SPI) accessibility.kde.org/developer/atk.php
18:45:22 Mac Accessibility API http://developer.apple.com/ue/accessibility/
18:45:23 IAccessible2 accessibility.linuxfoundation.org/a11yspecs/ia2/docs/html/ , www-03.ibm.com/able/open.../open_source_windows.html
18:45:25 Accessibility API cross reference www.mozilla.org/access/platform-apis
18:45:26 PDF Accessibility API Reference www.adobe.com/devnet/acrobat/pdfs/access.pdf
19:00:04 Topic: 2.1.5
19:00:09 SC: 2.1.5 Write Access: If the user can modify the state or value of a piece of content through the user interface (e.g., by checking a box or editing a text area), the same degree of write access is available programmatically. (Level A)
19:00:29 Intent for 2.1.1
19:00:29 Computers, including many smart phones, have accessibility features built into the operating system. Some well-known APIs for the Windows operating system are: MSAA, IAccessible2, UI Automation, [more]. Wherever technically possible, support the existing accessibility APIs.
19:00:55 MT that was 2.1.1, sorry.
19:01:02 Intent for 2.1.5
19:01:02 If the user can affect the user interface using any form of input, the same effect can be achieved with assistive technologies. It is more reliable for the assistive technology to directly control the state, rather than having to simulate controls.
19:01:21 Examples for 2.1.5:
19:01:21 A volume control slider in a media player can be set directly to the desired value, e.g. the user can speak "Volume 35%".
19:01:33 A check box with a tri-state value, e.g. "checked, unchecked and mixed", where the keystrokes to achieve the desired setting differ depending on the current state. The user can directly select the value when the control is programmatic.
19:03:18 -[IPcaller]
19:05:11 Jan_ has joined #ua
19:05:25 Possible revision to Intent:
19:05:30 If the user can affect the user interface using any form of input, the same effect can be achieved through programmatic access.
19:05:31 It is often more reliable for the assistive technology to use the programmatic method of access versus attempting to simulate mouse or keyboard input.
19:06:38 Topic: 2.1.6 Properties
19:06:51 2.1.6 Properties: If any of the following properties are supported by the accessibility platform architecture, make the properties available to the accessibility platform architecture: (Level A)
19:06:53 * (a) the bounding dimensions and coordinates of rendered graphical objects
19:06:54 * (b) font family of text
19:06:56 * (c) font size of text
19:06:57 * (d) foreground color of text
19:06:59 * (e) background color of text
19:07:00 * (f) change state/value notifications
19:07:02 * (g) selection
19:07:03 * (h) highlighting
19:07:05 * (i) input device focus
19:07:06 * Intent of Success Criterion 2.1.6:
19:07:08 These properties are all used by assistive technology to provide alternative means for the user to view or navigate the content, or to accurately create a view of the user interface and rendered content.
19:07:09 * Examples of Success Criterion 2.1.6:
19:07:11 • Kiara loads a new version of a popular web browser for the first time.
She puts her screen reader into an "explore mode" that lets her review what is appearing on the screen. Her screen reader uses the bounding rectangle of each element to tell her that items from the menu bar all appear on the same horizontal line, which is below the window's title bar.
19:07:15 • Kiara is using a screen reader at a telephone call center. The Web application displays caller names in different colors depending on their banking status. Kiara needs to know this information to appropriately respond to each customer immediately, without taking the time to look up their status through other means.
19:07:19 • Max uses a screen magnifier that only shows him a small amount of the screen at one time. He gives it commands to pan through different portions of a Web page, but then can give it additional commands to quickly pan back to positions of interest, such as the text matched by the recent Search operation, text that he previously selected by dragging the mouse, or the text caret, rather than having to manually pan through the document searching for them.
19:11:01 rrsagent, make minutes
19:11:01 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html jeanne
19:45:50 topic: 2.1.4
19:45:56 2.1.4 Programmatic Availability of DOMs: If the user agent implements one or more DOMs, they must be made programmatically available to assistive technologies. (Level A)
19:45:58 • Intent of Success Criterion 2.1.4:
19:45:59 User agents (and other applications) and assistive technologies use a combination of DOMs, accessibility APIs, native platform APIs, and hard-coded heuristics to provide an accessible user interface and accessible content (http://accessibility.linuxfoundation.org/a11yspecs/atspi/adoc/a11y-dom-apis.html). It is the user agent's responsibility to expose all relevant content to the platform accessibility API. Alternatively, the user agent must respond to requests for information from APIs.
19:46:03 • Examples of Success Criterion 2.1.4:
19:46:04 In user agents today, an author may inject content into a web page using CSS (generated content). This content is written to the screen and the CSS DOM. The user agent does not expose this generated content from the CSS DOM (as per the CSS recommendation) to the platform accessibility API or to the HTML DOM. This generated content is non-existent to an assistive technology user. The user agent should expose all information from all DOMs to the platform accessibility API.
19:46:08 A web page is a compound document containing HTML, MathML, and SVG. Each has a separate DOM. As the user moves through the document, they are moving through multiple DOMs. The transition between DOMs is seamless and transparent to the user and their assistive technology. All of the content is read and all of the interaction is available from the keyboard regardless of the underlying source code or the respective DOM.
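A minimal sketch of the CSS generated-content case in the 2.1.4 example above; the class name and text are hypothetical. The generated "Note: " prefix appears on screen and in the CSS object model, but has no corresponding node in the HTML DOM, which is why it can be invisible to assistive technology unless the user agent exposes it.

<style>
  /* Illustrative only: author-injected content via CSS generated content */
  p.note::before { content: "Note: "; }
</style>
<p class="note">The prefix rendered before this sentence is generated by CSS and does not exist as text in the HTML DOM.</p>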
19:46:15 • Related Resources for Success Criterion 2.1.4:
19:46:17 o www.w3.org/TR/SVG/svgdom.html
19:46:19 o www.w3.org/TR/MathML/chapter8.html
19:46:21 o www.w3.org/TR/DOM-Level-2-HTML/
19:46:23 o www.w3.org/TR/DOM-Level-2-Style/
19:46:25 o https://developer.mozilla.org/en/gecko_dom_reference
19:46:27 o http://developer.apple.com/mac/library/documentation/AppleApplications/Conceptual/SafariJSProgTopics/Tasks/DOM.html
19:46:30 o http://msdn.microsoft.com/en-us/library/ms533050%28VS.85%29.aspx
19:46:33 o www.adobe.com/devnet/acrobat/pdfs/access.pdf
19:46:34 o www.w3.org/2004/CDF/
19:46:36 o dev.w3.org/2006/cdf/cdi-framework/
19:46:38 o www.w3.org/TR/CDR/
19:50:47 Suggest changing "expose all relevant content to the platform accessibility API." to "expose all of its user interface and relevant content through the platform accessibility API, as this is the only approach which lets assistive technology interact with all software on the platform without having to implement separate solutions for each."
19:53:38 Change "This content is written to the screen and the CSS DOM. The user agent does not" to merely "This content is written to the screen, but the user agent does not.", as the generated content is not "written to" the CSS DOM, but rather it is written to the screen based on formatting instructions in the CSS, after it is parsed into the CSS DOM.
19:54:19 Kelly and I both think the last sentence of the Intent can be removed.
19:57:00 A key factor for 2.1.4 is that in many cases the DOM exposes richer content than can be exposed through the platform API. For example, the HTML DOM would expose attributes such as a link destination and whether or not it should be opened in a new window, which are not part of the generic set of properties that can be exposed through MSAA and equivalents.
19:57:32 That is the real reason why the DOM needs to be exposed *in addition to* exposing content through the platform accessibility API.
19:58:25 Accessibility APIs at some level are abstracting data from a more robust source.
19:58:36 I agree with the removal of the last sentence of the Intent.
19:58:50 A DOM will usually have more details than an API specific to accessibility can provide.
20:00:46 Example: a page with 5 links and text. The UA loads all the information into the DOM, then exposes it (automatically or on request) to the relevant consumers (rendering engine, accessibility API).
20:01:31 The accessibility API can ask what is at location x,y, or what the children of element z are.
20:10:30 My comment above about 2.1.4 being about exposing the DOM in addition to the platform API would change the ending of the first example, etc.
20:12:00 close action-418
20:12:00 ACTION-418 Copy proposals 3.1.4, 3.11 general intent, 3.11.1 specific intent, 3.11.1, 4 & 5 Examples, and 3.13.1 from minutes of 02-08-2010. Put in the Guidelines Master and the Survey for 5 August. closed
20:13:05 4.6.3 Match Found: When there is a match, the user is alerted and the viewport moves so that the matched text content is at least partially within it. The user can search for the next instance of the text from the location of the match.
20:13:07 4.6.3
20:13:09 Intent
20:13:10 It is important for the user to easily recognize where a search will start from.
20:13:12 Example: Jules has low vision and uses a magnified screen. She frequently searches for terms that appear multiple times in a document that contains a lot of repetition. It is important that the viewport moves after each search so she can easily track where she is in the document.
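Returning to the 2.1.4 discussion above, a minimal sketch of a link whose destination, target window, and destination language are available through the HTML DOM but are not part of the generic property set exposed by MSAA-style platform APIs. The URL and attribute values are hypothetical.

<!-- Illustrative only: href, target, hreflang, and type live in the HTML DOM;
     a generic platform accessibility API typically exposes only name, role,
     and state for this element. -->
<a href="http://example.com/report.pdf"
   target="_blank"
   hreflang="fr"
   type="application/pdf">Annual report</a>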
20:13:13 4.6.4 Alert on No Match: The user is notified when there is no match or after the last match in content (i.e., prior to starting the search over from the beginning of content).
20:13:15 4.6.4
20:13:16 Intent
20:13:18 It is important for users to get clear, timely feedback so they don't waste time waiting or, worse, issue a command based on a wrong assumption. It is important during a search that users are informed when there is no match or when the search has reached the beginning of the document.
20:13:21 Example:
20:13:23 Dennis uses a screen reader. As soon as he gets a message that there is no match, he goes on to search for something else. If he does not get a message, he wastes time retrying the search to make sure there is not a match.
20:13:26 4.6.5 Advanced Find: The user agent provides an accessible advanced search facility, with case-sensitive and case-insensitive search options, and the ability for the user to perform a search within all content (including hidden content and captioning) for text and text alternatives, for any sequence of characters from the document character set.
20:13:30 4.6.5
20:13:32 Intent:
20:13:34 Searching is much more useful when the user can specify whether case matters in a search and when the user can search alternative text.
20:13:37 Examples:
20:13:39 Dennis uses a screen reader. He wants to find all the instances of his friend Bill in a blog post about finances. He needs to specify case in order to avoid stopping at instances of "bill". Later, he searches for his friend's name in a blog post about poetry where the author never uses capital letters. In this instance he specifies that case does not matter.
20:13:44 Dennis remembers a portion of a caption for something he had seen before that he wants to find. He needs to be able to search on the caption.
20:19:41 rrsagent, make minutes
20:19:41 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html AllanJ
20:20:40 Re 4.6.3 Intent, it should address the portion of the SC about scrolling the window to show the match. That's important so that users don't have to hunt through the document for the match. The SC doesn't really address "recognize where the search will start from", as it only provides that on successive searches, not the initial search.
20:21:13 rrsagent, make minutes
20:21:13 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html jeanne
20:22:15 Re 4.6.3, it seems like this is one of those SCs that is almost pointless, as it's hard to imagine a user agent not doing it already.
20:23:56 4.6.3 rewritten Example: Jules has low vision and uses a magnified screen. She frequently searches for terms that appear multiple times in a document that contains a lot of repetition. It is important that the viewport moves and, if necessary, her screen scrolls after each search so she can easily track where she is in the document.
20:25:24 Re 4.6.4, I think you should include the terrible results of real-world cases where user agents don't do this: the user keeps searching through the document again and again, without realizing they're just seeing the same matches over and over again.
20:25:59 Topic: 2.1.7 timely communication
20:26:06 2.1.7 Timely Communication: For APIs implemented to satisfy the requirements of this document, ensure that programmatic exchanges proceed at a rate such that users do not perceive a delay. (Level A)
20:26:07 Intent: Conveying information for accessibility can often involve extensive communication between a user agent, an accessibility API, a document object model, and end user interaction. The objective is to ensure that the end user does not perceive a delay when interacting with the user agent.
20:26:09 Example:
20:26:11 Bonita accesses her web browser with a speech input program. She navigates to a web page and speaks the name of a link she wants to click. The link is activated with the same speed as it would be if a mouse had been used to click the link.
20:26:13 Resources:
20:26:15 Insert something about performance and classifications.
20:26:37 Note: This changes the wording of the SC slightly.
20:30:43 It drops the parenthetical "(for non-web-based user agents)".
20:31:20 Re 2.1.7 Intent, the interaction also includes the assistive technology program.
20:33:14 Re 2.1.7 Intent, you might end with something akin to: "Users would find a noticeable delay between their key press and the response unacceptable, whether or not they are using assistive technology."
20:33:55 Updated 2.1.7:
20:34:01 2.1.7 Timely Communication: For APIs implemented to satisfy the requirements of this document, ensure that programmatic exchanges proceed at a rate such that users do not perceive a delay. (Level A)
20:34:02 Intent: Conveying information for accessibility can often involve extensive communication between a user agent, an accessibility API, a document object model, assistive technology, and end user interaction. The objective is to ensure that the end user does not perceive a delay when interacting with the user agent.
20:34:04 Example:
20:34:06 Bonita accesses her web browser with a speech input program. She navigates to a web page and speaks the name of a link she wants to click. The link is activated with the same speed as it would be if a mouse had been used to click the link.
20:34:08 Resources:
20:34:09 Insert something about performance and classifications.
20:34:30 Another good example would be that a user presses the tab key to move the focus to another button and their screen reader immediately speaks the name of that button, rather than making them wait for a second or two.
20:34:41 Sounds good.
20:37:32 Update again to 2.1.7:
20:37:38 2.1.7 Timely Communication: For APIs implemented to satisfy the requirements of this document, ensure that programmatic exchanges proceed at a rate such that users do not perceive a delay. (Level A)
20:37:39 Intent: Conveying information for accessibility can often involve extensive communication between a user agent, an accessibility API, a document object model, assistive technology, and end user interaction. The objective is to ensure that the end user does not perceive a delay when interacting with the user agent.
20:37:41 Example:
20:37:42 Bonita accesses her web browser with a speech input program. She navigates to a web page and speaks the name of a link she wants to click. The link is activated with the same speed as it would be if a mouse had been used to click the link.
20:37:44 Arthur is browsing a web page with a screen reader. As he tabs from link to link, the text of each link instantly appears on his braille display.
20:37:46 Resources:
20:37:48 Insert something about performance and classifications.
20:39:02 rrsagent, make minutes
20:39:02 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html kford
20:39:04 kelly +1
21:17:48 Topic: 4.1.12 Specify preferred keystrokes
21:18:05 Adding my text for this SC and such, but I don't want to ruin the dialog that's going on.
21:18:17 4.1.12 Specify preferred keystrokes:
21:18:54 4.1.12 Specify preferred keystrokes: The user can override any keyboard shortcut, including recognized author-supplied shortcuts (e.g., accesskeys) and user interface controls, except for conventional bindings for the operating environment (e.g., for access to help). (Level AA)
21:18:56 Intent:
21:18:58 Some users may be able to hit certain keys on the keyboard with greater ease than others. Assistive technology software typically has extensive keyboard commands as well. The goal of this SC is to enable the user to be in control of what happens when a given key is pressed and to use the keyboard commands that meet his or her needs.
21:19:00 Example:
21:19:01 Laura types with one hand and finds keys on the left side of the keyboard easier to press. She browses to a web page and notices that the author has assigned access keys using keys from the right side of the keyboard. She opens a dialog in the user agent and reassigns the access keys from the web page to the left side of the keyboard home row.
21:19:03 Elaine's screen magnification program uses alt+m to increase the size of the magnified area of the screen. She notices that in her web browser, alt+m is a hotkey for activating a home button, which stops her from being able to control her magnification software. She opens a hotkey reassignment feature in the user agent and sets alt+o to be the new hotkey for the home button. Her screen magnification software now works correctly.
21:23:15 Topic: 3.13.1 again, new updates
21:23:23 • Intent of Success Criterion 3.13.1:
21:23:24 Users who use only the keyboard or screen readers need to be able to easily discover information about a link, including the title of the link, whether the link target is a web page, PDF, etc., and whether the link goes to a new page, opens a new user agent with a new page, or goes to a different location in the current page. This information makes navigation of Web content quicker and easier, and gives the user an expectation of what will happen upon link activation.
21:23:28 • Examples of Success Criterion 3.13.1:
21:23:29 • Robert, who uses a screen reader, needs to know whether a given link will automatically open in a new page or a new window. The browser indicates this information so he can discover it before he makes a decision to click on a link.
21:23:31 • Maria has an attention disorder; new windows opening are a large distraction. She needs to know whether a given link will automatically open in a new page or a new window. The browser indicates this information so she can decide not to follow a link that opens a new window.
21:28:47 Jeanne has put the 3.13.1 text into the document. I haven't done the earlier ones yet.
21:31:42 close action-419
21:31:43 ACTION-419 Add "generated content" to the SC 2.1.2 closed
21:33:33 Anyone have an opinion on my text?
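A minimal markup sketch of the kind of author-supplied shortcut referred to in the 4.1.12 draft above: HTML accesskeys that SC 4.1.12 would let a user such as Laura remap in the user agent. The key choices and link targets are hypothetical.

<!-- Illustrative only: author-assigned access keys on the right side of the
     keyboard, which the user agent's reassignment dialog would let the user
     override. -->
<a href="/home" accesskey="p">Home</a>
<a href="/search" accesskey="m">Search</a>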
21:41:05 rrsagent, make minutes
21:41:05 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html jeanne
21:43:08 Good stuff, Kelly.
22:01:56 action: jeanne to update document and survey with Kim's draft of 4.6.3, 4.6.4, 4.6.5 (see rewrites)
22:01:56 Created ACTION-420 - Update document and survey with Kim's draft of 4.6.3, 4.6.4, 4.6.5 (see rewrites) [on Jeanne Spellman - due 2010-08-10].
22:03:41 action: jeanne to update document with Kford's draft of 4.1.12 from minutes http://www.w3.org/2010/08/03-ua-minutes.html#item10
22:03:41 Created ACTION-421 - Update document with Kford's draft of 4.1.12 from minutes http://www.w3.org/2010/08/03-ua-minutes.html#item10 [on Jeanne Spellman - due 2010-08-10].
22:05:47 rrsagent, make minutes
22:05:47 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html kford
22:09:04 Kim and I were normalizing terminology related to "focus" in 3.11, and are ready to begin writing Intent and Examples (a few are done). Our work in progress is in https://docs.google.com/Doc?docid=0ASiGLIaAlHSKZGR3d3FrbWJfMjMzZHJtemhuY3o&hl=en
22:09:16 topic 3.13.1 & 2
22:09:22 Problems with 3.13.1.
22:09:24 SC wording:
22:09:25 3.13.1 Basic Link Information: The following information is provided for each link (Level A):
22:09:27 • (a) link element content,
22:09:28 • (e) new viewport: whether the author has specified that the resource will open in a new viewport.
22:09:30 Should 'link' be 'anchor', to differentiate it from the 'link' element in HTML?
22:09:31 Anchor (Link) element content includes 'href', 'title', 'target' (opening in a new window), 'hreflang' (language of the destination page), protocol (from the href), destination file type (from the href), and character set of the destination page. If we include all of these as part of 'link element content', the SC will overlap all of 3.13.2. Since all of the information is available to the UA, suggest removing 3.13.2. If the developers will go to the effort of exposing the target (new window), they can do all of them.
22:10:00 s/topic 3.13/topic: 3.13
22:10:27 Topic: 3.11
22:10:38 Kim and I were normalizing terminology related to "focus" in 3.11, and are ready to begin writing Intent and Examples (a few are done). Our work in progress is in https://docs.google.com/Doc?docid=0ASiGLIaAlHSKZGR3d3FrbWJfMjMzZHJtemhuY3o&hl=en
22:11:21 rrsagent, make minutes
22:11:21 I have made the request to generate http://www.w3.org/2010/08/03-ua-minutes.html AllanJ
22:13:21 Did the call just end?
22:14:07 Everybody just vanished.
22:14:26 zakim, bye
22:14:26 leaving. As of this point the attendees were [Microsoft], AllanJ, [IPcaller]
22:14:26 Zakim has left #ua
22:14:37 rrsagent, bye
22:14:37 I see 3 open action items saved in http://www.w3.org/2010/08/03-ua-actions.rdf :
22:14:37 ACTION: jeanne to Add "generated content" to the SC 2.1.2 [1]
22:14:37 recorded in http://www.w3.org/2010/08/03-ua-irc#T18-19-38
22:14:37 ACTION: jeanne to update document and survey with Kim's draft of 4.6.3, 4.6.4, 4.6.5 (see rewrites) [2]
22:14:37 recorded in http://www.w3.org/2010/08/03-ua-irc#T22-01-56
22:14:37 ACTION: jeanne to update document with Kford's draft of 4.1.12 from minutes http://www.w3.org/2010/08/03-ua-minutes.html#item10 [3]
22:14:37 recorded in http://www.w3.org/2010/08/03-ua-irc#T22-03-41