Minutes of UAAG mobile examples 12 October 2012

UAWG,

Kim Patch and I sat down with Kathy Wahlbin and Mark Sadecki and wrote 
mobile examples for Implementing UAAG. We finished Principles 1, 2 & 3. 
I have updated the Implementing document with the new examples, linked
from the UA home page.

Here are the minutes from the meeting with the notes of our discussion 
and drafts of examples.

Minutes: http://www.w3.org/2012/10/12-ua-minutes.html


                                - DRAFT -

     User Agent Accessibility Guidelines Working Group Teleconference

12 Oct 2012

    See also: [2]IRC log

       [2] http://www.w3.org/2012/10/12-ua-irc

Attendees

    Present
           MIT-G451, Kim_Patch, Mark_Sadecki, Jeanne_Spellman,
           Kathy_Wahlbin

    Regrets
    Chair
           Kim

    Scribe
           jeanne

Contents

      * [3]Topics
          1. [4]1.1
          2. [5]1.2
          3. [6]1.3
          4. [7]1.4
          5. [8]1.5 & 1.6
          6. [9]1.7
          7. [10]1.8
          8. [11]2.1
          9. [12]2.4
         10. [13]2.5
         11. [14]2.6
         12. [15]2.7
         13. [16]2.8
         14. [17]2.9
         15. [18]2.11
         16. [19]2.12
         17. [20]Principle 3
      * [21]Summary of Action Items
      __________________________________________________________

    <trackbot> Date: 12 October 2012

1.1

    1.1.1 Example: Brin is deaf. The video player she is using has
    a button displayed beneath the playing video that indicates
    that captions are available. She clicks the button to toggle
    the captions on so she can understand the video. On her mobile
    phone, Brin touches a video, which displays the controls
    including the "display caption" control.

    <Kathy> Ben has low vision. In the mobile settings dialog box,
    he chooses to always display the alternative ("fallback")
    content for embedded objects, such as videos and images. On the
    mobile phone, the text version of the images is shown.

    1.1.2

    <Kathy> Ben has low vision. In the mobile settings dialog box,
    he chooses to always display the alternative ("fallback")
    content for embedded objects, such as images. Images become
    pixelated, so on the mobile phone the text version of the
    images is preferred.

    <Kathy> Ben has low vision. In the mobile settings dialog box,
    he chooses to always display the alternative ("fallback")
    content for embedded objects, such as images. Images become
    pixelated on the mobile phone, so he prefers the text version
    of the images.

    1.1.3

    Jaime is watching a video on her mobile phone and wants to turn
    on the caption controls. She has her phone configured so that
    she has a persistent control that will allow her to access the
    video controls. The button for the controls can be moved around
    the small screen, and after 3 seconds, it becomes transparent.

    <Kathy> Jaime is deaf and prefers to always display captions on
    her mobile phone. She has set her global settings on the phone
    to turn on closed captions. All videos displayed on the phone
    will automatically display captions.

    <Kathy> Jaime is deaf and prefers to always display captions on
    her mobile phone. She has set her video settings on the phone
    to turn on closed captions. All videos displayed on the phone
    will automatically display captions.

    <mark_> 1.1.4 *NEW EXAMPLE* Ben has low vision and prefers to
    display the longer alternative content (@alt or @title) on his
    desktop browser where his display allows. When using his mobile
    device, Ben has configured his device to display the shortest
    alternative content available for non-text content.

    1.1.3 example: Ben has low vision that becomes worse throughout
    the day as he becomes more tired. He keeps a floating control
    on his mobile phone that allows one touch access to his
    configuration. The floating control can be easily moved around
    the screen so it is not in the way of other controls, and it
    becomes translucent after it is idle for a few seconds.

    1.1.3: Ben has low vision that becomes worse throughout the day
    as he becomes more tired. He keeps a floating control on his
    mobile phone that allows one touch access to his configuration
    so that he can change the font size. The floating control can
    be easily moved around the screen so it is not in the way of
    other controls, and it becomes translucent after it is idle for
    a few seconds.

    <mark_> 1.1.5 When Tom watches narrow-aspect video on a
    wide-aspect screen or in landscape mode on his mobile device,
    he moves the window displaying sign language interpretation to
    the side, allowing the primary video to take up the entire
    height of the screen without the interpretation getting in the
    way.

1.2

    1.2.1: the Intent is not clear. In particular, the second
    paragraph doesn't seem like it belongs here; it actually
    belongs with the SC on overflow control. Recommend moving that
    paragraph and putting a link under related resources.

    Adding mobile examples to 1.2 would not add value.

1.3

    <Kathy> X wants a visible focus indicator to know what element
    on the page has focus so when gestures are used on the mobile
    phone, he will know what element will be activated.

    1.3.2 - no mobile example needed.

    1.3.1 - George has limited hand use and uses custom gestures on
    his mobile phone. He wants a visible focus indicator to know
    what element on the page has focus so when gestures are used on
    the mobile phone, he will know what element will be activated.

1.4

    <Kathy> Ben has low vision. In the mobile settings dialog box,
    he chooses large text for the font size. All applications on
    the mobile phone display text in a large font.

    Discussion of the zoom feature vs. text configuration. A person
    with limited hand mobility would not want extra gestures for
    zoom, so the default font size would be more important. The
    Apple Music player has a fixed size for artist and title.

1.5 & 1.6

    While mobile applies, further examples would not add value.

    <mark_> 1.5 Add personas and disabilities to examples

1.7

    Discussion of existing stylesheets on mobile. Safari has a
    bookmarklet that allows you to import stylesheets.
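
    A minimal TypeScript sketch of how such a stylesheet-importing
    bookmarklet might behave (an assumption for illustration, not
    Safari's actual implementation): it injects a user style sheet
    into the current page, here using the yellow-on-black scheme
    from Lee's example below.

        // Hypothetical bookmarklet body: inject a high-contrast user
        // style sheet into the current page. "!important" is what lets
        // these rules win over the author's style sheet.
        function applyUserStyles(css: string): void {
          const style = document.createElement("style");
          style.textContent = css;
          document.head.appendChild(style);
        }

        applyUserStyles(`
          body, body * {
            color: yellow !important;
            background-color: black !important;
          }
        `);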

    <mark_> 1.7.1, 1.7.2, 1.7.3 Lee has low vision and finds text
    easiest to read on her mobile device when it is presented in
    yellow on a black background. She has configured her browser to
    override the author stylesheets to always display text in her
    browser using this color scheme.

    <Kathy> Tanya browses to a new website on her mobile phone and
    finds that the site is not optimized for mobile devices. She
    alters the stylesheet to provide better layout and larger
    fonts. The custom settings for the stylesheet are saved and
    applied when she returns.

1.8

    Discussion of the applicability of 1.8 to mobile. Existing
    mobile browsers do not have multiple viewports. While they may
    in the future, adding mobile examples may confuse readers.

    <Kathy> Mattias has ADHD and finds text easiest to read if it
    is highlighted in blue as it is being read out loud. This
    highlight and text color should be configurable and override
    the author stylesheets so text is readable and has sufficient
    color contrast.

    The above example is for 1.7.3.

    <mark_> 1.8.2 Lee typically views web content on her mobile
    phone at a high level of zoom, frequently positioning elements
    outside the viewport. When moving between focusable elements,
    the viewport automatically scrolls to the element currently in
    focus.
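
    As an illustration only (1.8.2 describes user agent behavior),
    a speculative TypeScript sketch of how a page script or
    extension could approximate that behavior with standard DOM
    APIs:

        // Sketch: whenever focus moves, scroll the newly focused
        // element into the viewport. "nearest" keeps the scrolling
        // minimal, which matters at high zoom levels like Lee's.
        document.addEventListener("focusin", (event: FocusEvent) => {
          const target = event.target as HTMLElement | null;
          target?.scrollIntoView({ block: "nearest", inline: "nearest" });
        });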

    Protofluid (a JavaScript library) can show different screen
    sizes and orientations.

    The RWD bookmarklet for Chrome does the same thing.
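
    A rough TypeScript sketch of what such a preview tool does
    (assumed behavior, not Protofluid's or the Chrome bookmarklet's
    actual code): reload the current page inside an iframe sized to
    a chosen viewport.

        // Sketch: preview the current (same-origin) page at a given
        // viewport size by loading it into a fixed-size iframe.
        function previewViewport(width: number, height: number): void {
          const frame = document.createElement("iframe");
          frame.src = location.href;
          frame.style.width = `${width}px`;
          frame.style.height = `${height}px`;
          frame.style.border = "1px solid #999";
          document.body.replaceChildren(frame);
        }

        previewViewport(320, 480); // portrait phone; swap values for landscape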

    [22]http://www.disabilitystatistics.org/glossary.cfm?g_id=273&view=true

      [22] http://www.disabilitystatistics.org/glossary.cfm?g_id=273&view=true

    [23]http://www.disabilitystatistics.org/ <- statistics for U.S.
    Disabilities

      [23] http://www.disabilitystatistics.org/

    <mark_> 1.8.6 example 1 *EDIT* "She opens a web application
    ^THAT^ uses small text fonts"

    1.8.6 When Tanya views a web site on her mobile phone, she
    first scans the website at a very small size to decide where
    she wants to zoom in. The zoom feature increases the size of
    both text and images.

    1.8.7 - change the Xu example to a mobile device.

    <Admin> 1.8.4 Terry has memory issues. She configures her
    mobile computer so that scrollbars are always on so she can
    instantly see where she is in a document.

    <mark_> 1.8.6 Ally has cognitive issues that make it difficult
    to orient. When looking at a map on her mobile device, she must
    frequently zoom in to view her current location or destination
    and zoom out to put the location into the context of the larger
    map.

    <mark_> 1.8.7 last sentence of first example *EDIT* The text is
    reduced to a co^M^fortable

    <scribe> ACTION: Jeanne to propose to split up the IER for
    1.8.8 Viewport History, 1.8.9 Open on Request & 1.8.10 Do not
    take focus. [recorded in
    [24]http://www.w3.org/2012/10/12-ua-minutes.html#action01]

    <trackbot> Created ACTION-769 - Propose to split up the IER for
    1.8.8 Viewport History, 1.8.9 Open on Request & 1.8.10 Do not
    take focus. [on Jeanne F Spellman - due 2012-10-19].

    1.8.11 - same UI doesn't need a mobile example.

    <mark_> 1.8.8 Ray is blind. His mobile device automatically
    opens location links and calendar dates found on web pages in
    native apps available on the device. When he returns to the
    browser, focus on the original link is maintained.

    1.8.12 Reflowing Zoom. When Frank is using his mobile phone to
    read a web page, he zooms in to read an article on a web site.
    He configures his mobile phone so that the text reflows and
    zoomed content always fits in one column.

    <Admin> 1.8.13: Add to Jamie:

    <Admin> Jamie also uses bookmarks on her mobile phone to cut
    down on scrolling.

    <mark_> 1.10.1 *EDIT* Example 2 "Courtney has a cogn^T^ive
    disability that"

    1.10.2 there is no difference on the mobile devices. No example
    needed.

2.1

    2.1.1 Karen uses gestures to navigate her mobile phone. As
    focus moves from one element to another, there is a visible
    focus indicator.

    2.1.2 will have a speech example from Kim

    2.1.3 Keyboard Trap does not apply to mobile.

    scribe: group could not come up with any examples of keyboard
    trap on mobile devices.

    2.1.4 Kathy will write

    2.1.5 does not apply

    2.1.6 is applicable only to the keyboard.

    <Kathy> 2.1.4 Ari uses Voiceover on his iPhone to navigate a
    webpage. He selects an item and is able to activate the element
    using gestures. This requires sufficient screen real estate to
    perform gestures without changing focus.

    2.2.1 George is blind and uses a screenreader on his computer
    and the voice announce feature of his mobile phone. When
    completing a web form on his phone, he uses the Next key to
    advance through the form. If George goes past the next form
    field, or wishes to return to a previous form field, he can use
    the back key.

    <Admin> 2.1.2 Jeremy is a speech-input user who cannot use his
    hands to control his mobile computer. He opens a webpage using
    a speech command. The webpage has a search field, and normally
    comes up with the keyboard focus in the search field. Jeremy
    sees the indicator in the search field and knows he does not
    have to navigate to the search field before saying a search
    term.

    <mark_s> 2.1.6 George is blind and uses gestures on his mobile
    device to move focus to the top of the page, return to the
    previous web page and activate links.

    <Kathy> 2.2.4 Jeff has a mobility impairment. He uses gestures
    to navigate the page. When he reaches the last active element
    on the page there is an indicator that the end of the page is
    reached before changing focus (e.g. wrapping to the top,
    switching pages).

    <Admin> 2.3.1 On her mobile phone, Mary uses a single speech
    command to launch the app, rather than having to use multiple
    commands to page through screens to find the app icon and
    activate it.

    <Kathy> 2.3.4 Neta has a repetitive strain injury. She relies
    on gestures and shortcuts to complete tasks. Using a specialized
    command, she can pull up a list of all the commands that can be
    completed in that context.

    2.3.2 When reading email on her tablet, Mary touches a control
    which opens a toolbar with a setting to display the accesskeys
    and other direct commands that the author created. She sees
    that a 3-finger swipe will delete the current email.

2.4

    2.4.2 is more important for mobile, but it is not platform
    dependent.

2.5

    2.5.1 When Armand is using the speech feature on his smartphone
    while surfing the web, he can navigate from heading to heading
    using gesture commands.

    <Admin> 2.3.3 Mary cannot use the mouse or keyboard due to a
    repetitive strain injury. On her mobile phone, Mary uses a
    single speech command to launch the app, rather than having to
    use multiple commands to page through screens to find the app
    icon and activate it.

    <Admin> Corrected 2.3.1 Mary cannot use the mouse or keyboard
    due to a repetitive strain injury. On her mobile phone, Mary
    uses a single speech command to select the app, rather than
    having to use multiple commands to page through screens to find
    the app icon and activate it.

    2.5.2 Armand is blind. When he is using the speech feature on
    his smartphone while surfing the web, he can navigate from
    heading to heading using gesture commands.

2.6

    <Kathy> 2.6.1 Ingrid has low vision. When navigating a page
    with a smartphone, she can use both keyboard and gestures to
    navigate within the page.

2.7

    <mark_s> 2.7.1 Betty is a low vision user and has a highly
    customized color palette defined in her browser. By saving her
    customizations to a cloud-based storage service, her
    preferences can easily be transferred to other desktop and
    mobile browsers that she uses.

    2.7.2 Kathy accidentally turns on a zoom feature on her
    smartphone and cannot figure out how to turn it off. She
    gestures to navigate to the preferences menus and selects a
    command to reset preferences to default.

    <Kathy> 2.7.5 Jeanette is a low vision user who has configured
    her smartphone to show text in a particular font and size with
    specific color settings. She has recently upgraded her phone
    and sets up her new phone by transferring these settings
    through the Bluetooth features on her phone.

    <Admin> 2.7.4 Jan is easily confused by new interfaces. She
    tries out the voiceover capabilities on her mobile phone, but
    then can't figure out how to turn them off. She connects the
    device to her computer and restores the default settings.

2.8

    2.8.1 has a mobile example

    2.8.2 is the same for mobile; an example would not add value.

2.9

    no added value with mobile examples

    2.10 Flash

    no added value with mobile

2.11

    2.11.2 same for mobile, no added value.

    <mark_s> 2.11.1 Betty is a low vision user and has difficulty
    reading text on her mobile device when it is displayed over a
    background image. Using her user-defined style sheet, she can
    prevent all background images from being rendered in her
    browser.

    2.11.3 Evan has configured his mobile phone so that any audio
    or video file displays a placeholder with a triangle "play"
    icon, so that he can control when the audio or video starts.

2.12

    <mark_s> 2.12.1 Armand is blind and uses a Bluetooth braille
    keyboard to interact with his mobile device. He should expect
    his mobile browser to accept and properly process input from
    any braille keyboard supported by his mobile operating system.

Principle 3

    <Kathy> 3.3.2 Neta has a repetitive strain injury. She relies
    on gestures and shortcuts to complete tasks. Using a specialized
    command, she can pull up a list of all the gesture commands
    available with descriptions of how they function.

Summary of Action Items

    [NEW] ACTION: Jeanne to propose to split up the IER for 1.8.8
    Viewport History, 1.8.9 Open on Request & 1.8.10 Do not take
    focus. [recorded in
    [25]http://www.w3.org/2012/10/12-ua-minutes.html#action01]

    [End of minutes]

Received on Saturday, 13 October 2012 15:45:05 UTC