17:02:14 RRSAgent has joined #ua
17:02:14 logging to http://www.w3.org/2014/10/02-ua-irc
17:02:16 RRSAgent, make logs public
17:02:16 Zakim has joined #ua
17:02:18 Zakim, this will be WAI_UAWG
17:02:18 ok, trackbot, I see WAI_UAWG()1:00PM already started
17:02:19 Meeting: User Agent Accessibility Guidelines Working Group Teleconference
17:02:19 Date: 02 October 2014
17:02:27 rrsagent, set logs public
17:03:15 Chair: Jim Allan and Kelly Ford
17:03:20 zakim, who is on the phone?
17:03:20 On the phone I see Greg_Lowney, Jeanne
17:03:24 Agenda+ TPAC
17:03:26 Agenda+ Writing tests
17:03:51 Jan has joined #ua
17:04:05 +Jim_Allan
17:04:22 +Kim_Patch
17:04:30 +[IPcaller]
17:04:43 zakim, [IPcaller] is really Jan
17:04:43 +Jan; got it
17:06:29 agenda+ Charter timeline
17:07:11 scribe: allanj
17:07:22 open item q
17:07:27 open item 1
17:07:55 registration paid for invited experts
17:08:17 +[Microsoft]
17:09:39 I'll be here http://www.ramada.com/hotels/california/sunnyvale/ramada-silicon-valley/hotel-overview
17:10:28 still working on travel.
17:16:15 all, discussion of travel and lodging.
17:16:52 zakim, close item 1
17:16:52 agendum 1, TPAC, closed
17:16:53 I see 2 items remaining on the agenda; the next one is
17:16:53 2. Writing tests [from allanj]
17:17:05 zakim, open item 2
17:17:05 agendum 2. "Writing tests" taken up [from allanj]
17:21:06 wiki page for testing, greg to create magic script for numbers and text
17:21:15 https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/Technique_Development_Assignments
17:22:07 mobile task force, have a wiki list, pick task, due monday, in survey, then survey reviewed at meeting
17:23:09 if we get tests done quickly, and hopefully few comments, then do LC and CR simultaneously
17:23:56 INDIVIDUAL PAGE TEMPLATE
17:23:58 The individual pages are linked and named by the SC#
17:23:59 I will set up a template that people can copy to create the individual page. Note that there can be multiple tests for each SC.
17:24:01 For each test:
17:24:02 Test Assertion
17:24:04 Procedure
17:24:05 Expected Result
17:24:46 assertion, specific thing you are testing in that test.
17:25:10 1 sentence
17:25:52 different test for browser or audio player, separate assertion for each.
17:26:36 need a test for every SC
17:27:43 http://www.w3.org/WAI/AU/CR20/TestPrep20131206.html
17:27:58 http://www.w3.org/WAI/AU/2013/ATAG2-10April2012PublicWD-Tests
17:33:51 http://w3c.github.io/UAAG/UAAG20/#gl-obs-env-conventions
17:34:48 js: don't overthink the tests. keep it simple and generic
17:34:52 http://www.w3.org/TR/2014/NOTE-WCAG20-TECHS-20140311/G4
17:34:57 brb
17:37:07 we (UAWG) will have to perform all of the tests, and match to implementations
17:38:25 kp: the examples will help write the test
17:39:18 topic: Sample test 2.4.1
17:39:26 2.4.1 Text Search: The user can perform a search within rendered content, including rendered text alternatives and rendered generated content, for any sequence of printing characters from the document character set. (Level A)
17:40:32 ja: do we need an html page?
17:40:49 gl: what about all the other formats? can't do them all
17:41:14 jr: atag has an accessible page, an inaccessible page to test against
17:42:17 gl: are there sample test pages at least in html to test against
17:42:37 js: before/after page from EO
17:43:14 jr: really difficult to use, separate from the chrome of the EO pages
17:45:56 Assertion: The user can perform a search within rendered content, including rendered text alternatives and rendered generated content, for any sequence of printing characters from the document character set.
17:46:57 1. load content with text, text alternatives, and generated content
17:47:07 Test 0001 Assertion: All editing-views enable text search where any text content that is editable by the editing-view is searchable, results can be made visible to authors and given focus, authors are informed when no results are found and search can be made forwards or backwards.
17:47:08 If the authoring tool does not allow the editing of text content (e.g. because it is a graphics editor), then select SKIP.
17:47:10 For each editing view that enables the editing of text content:
17:47:12 Load the accessible test content file (any level), which contains non-text content with text alternatives, in the editing view.
17:47:13 Choose a word that is repeated in the text and then determine whether a search function exists for the editing view that can find all of the instances of the word. In web-based tools, the search function may be part of the user agent. If this is not possible, then select FAIL.
17:47:15 When a match is found, check whether the match can be presented and given focus in the editing view. If this is not done, then select FAIL
17:47:16 Determine whether search is possible forwards and backwards. If it is not, then select FAIL
17:47:18 Choose a search term that is not in the content (e.g. a nonsense word) and search for it. If no indication is made of the failure of the search, then select FAIL
17:47:19 If the editing view enables editing of text alternatives for non-text content, choose a search term from within the text alternative. If the term cannot be found, then select FAIL.
17:47:20 gl: what about searching for punctuation, etc.
17:47:21 Go to the next editing view that enables the editing of text content (if any).
17:47:22 Select PASS (all of the editing views must have passed)
17:51:55 jr: create a page with a block of text with all kinds of characters, an image with alt of the same block of characters, and a paragraph "this paragraph has generated text preceding it" with the block of characters generated from css
17:53:55 jr: create one big test file.
17:56:44 1. load page (with rendered text, alternative text, generated text), 2. search for a known string, 3. observe/record results
17:57:18 note: need to create a page of content
18:00:44 2a. in rendered text (if pass go to next / fail), 2b. in alternative text (if pass go to next / fail), 2c. in generated text (if pass go to next / fail), 2d. search for foreign language character (if pass go to next / fail)
18:04:50 1. load page (with rendered text, alternative text, generated text, foreign language characters),
18:04:52 2. search for a known string in rendered text (if pass go to next, else fail),
18:04:53 3. search for a known string in alternative text (if pass go to next, else fail),
18:04:55 4. search for a known string in generated text (if pass go to next, else fail),
18:04:56 5. search for foreign language character (if pass go to next, else fail),
18:04:58 6. mark PASS
18:06:38 remove 6
18:06:39 expected results
18:06:41 2-5 are true
18:07:13 "known string" = string of text on the test page to have search success
18:07:45 brb
18:08:09 back
18:08:49 js: how exact do we need to be in defining the test (concern about cross-technology applications)
18:10:59 2-5 either PASS or are N/A
18:11:00 in PDF not sure there is generated text. how to make sure? do we need to?
18:12:00 gl: at the beginning state assumptions that some technologies may not have all features (e.g. PDF does not have generated content)
18:13:58 js: there will be a comment area for each test, to explain N/A
18:15:42 RESOLUTION: Use the WCAG model of writing tests with Procedure and Expected Results.
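[scribe note: the test page jr describes (a text block, an image whose alt repeats the block, and a paragraph with CSS-generated content, plus foreign-language characters for step 5) could be produced by a small generator script. This is only a sketch; the file name, the known string, and the foreign-language string are hypothetical placeholders, since the group has not yet fixed the actual content.]

```python
# Sketch of a generator for the proposed 2.4.1 Text Search test page.
# KNOWN_STRING, FOREIGN_STRING, and the output file name are placeholders.
from pathlib import Path

KNOWN_STRING = "xylophone42"      # planted so a search can succeed (step 2)
FOREIGN_STRING = "Grüße-日本語"    # non-ASCII characters for step 5

PAGE = f"""<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>UAAG 2.4.1 Text Search test page (draft)</title>
<style>
/* the generated-content search target is injected via CSS (step 4) */
p.generated::before {{ content: "generated-{KNOWN_STRING} "; }}
</style>
</head>
<body>
<p>Rendered text containing {KNOWN_STRING} and {FOREIGN_STRING}.</p>
<img src="missing.png" alt="alternative text containing alt-{KNOWN_STRING}">
<p class="generated">This paragraph has generated text preceding it.</p>
</body>
</html>
"""

def write_test_page(path: str = "uaag-241-test.html") -> Path:
    """Write the draft test page to disk and return its path."""
    out = Path(path)
    out.write_text(PAGE, encoding="utf-8")
    return out

if __name__ == "__main__":
    print(write_test_page())
```

[The manual steps remain: load the generated page in the user agent under test, search for the known string in each of the three renderings, then search for the foreign-language string, recording pass/fail per step.]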
18:17:24 SC # stem: full text
18:17:30 procedure:
18:17:37 expected results:
18:18:36 2.4.1 done....as the Count would say .... ONE ah ah ah...
18:19:11 send tests in by monday, in survey by tuesday
18:20:33 meeting on thursday - discuss only disagreements. severely limit wordsmithing
18:20:57 tests are NOT NORMATIVE, we can change anytime
18:21:29 "delayed wordsmithing gratification"
18:22:06 zakim, open item 3
18:22:06 agendum 3. "Charter timeline" taken up [from jeanne]
18:23:42 how long will writing tests take?
18:23:46 about 130 SC = 2 each per week = 4 months
18:24:37 ja: 10 tests a week, could finish testing in 4 months. perhaps...
18:26:27 27 sc
18:28:42 27 gls, 45 sc in gl 1
18:29:36 -Jan
18:29:51 -Kim_Patch
18:30:19 44 sc in gl 2
18:30:39 14 in gl 3
18:30:52 -Jim_Allan
18:30:58 -Jeanne
18:31:05 6 sc in gl 4
18:31:06 -[Microsoft]
18:31:07 -Greg_Lowney
18:31:07 WAI_UAWG()1:00PM has ended
18:31:07 Attendees were Greg_Lowney, Jeanne, Jim_Allan, Kim_Patch, Jan, [Microsoft]
18:31:16 6 in gl 5
18:32:12 rrsagent, make minutes
18:32:12 I have made the request to generate http://www.w3.org/2014/10/02-ua-minutes.html allanj
18:32:21 zakim, please part
18:32:21 Zakim has left #ua
18:52:55 rrsagent, please part
18:52:55 I see no action items