13:01:00 RRSAgent has joined #webtv
13:01:00 logging to http://www.w3.org/2014/06/25-webtv-irc
13:01:02 RRSAgent, make logs world
13:01:04 Zakim, this will be
13:01:04 I don't understand 'this will be', trackbot
13:01:05 Meeting: Web and TV Interest Group Teleconference
13:01:05 Date: 25 June 2014
13:01:08 gmandyam has joined #webtv
13:01:20 yosuke has joined #webtv
13:01:22 aldafu has joined #webtv
13:01:41 zakim, this is 932881
13:01:41 ok, ddavis; that matches UW_WebTVIG()9:00AM
13:01:50 +??P2
13:01:53 zakim, ??P2 is me
13:01:53 +ddavis; got it
13:02:01 zakim, who is here?
13:02:02 + +49.303.aaaa
13:02:02 On the phone I see gmandyam, ddavis, +49.303.aaaa
13:02:04 On IRC I see aldafu, yosuke, gmandyam, RRSAgent, Zakim, ddavis, wuwei, jcverdie, MarkS, tobie, schuki, timeless_, trackbot
13:02:30 Zakim: +49.303.aaaa is me
13:02:32 Bin_Hu has joined #webtv
13:02:57 +EricP
13:03:04 Zakim, +49.303.aaaa is me
13:03:04 +aldafu; got it
13:03:31 +[IPcaller]
13:03:41 + +1.650.946.aabb
13:03:51 zakim, aabb is me
13:03:51 +Bin_Hu; got it
13:04:27 zakim, who is here?
13:04:27 On the phone I see gmandyam, ddavis, aldafu, wuwei, [IPcaller], Bin_Hu
13:04:29 On IRC I see Bin_Hu, aldafu, yosuke, gmandyam, RRSAgent, Zakim, ddavis, wuwei, jcverdie, MarkS, tobie, schuki, timeless_, trackbot
13:04:29 zakim, mute me
13:04:29 ddavis should now be muted
13:05:06 zakim, [+IPcaller] is me
13:05:07 sorry, yosuke, I do not recognize a party named '[+IPcaller]'
13:05:32 zakim, [IPcaller] is me
13:05:32 +yosuke; got it
13:06:30 zakim, unmute me
13:06:30 ddavis should no longer be muted
13:06:38 scribenick: ddavis
13:06:42 scribe: Daniel
13:07:28 Previous minutes: http://www.w3.org/2014/06/11-webtv-minutes.html
13:07:33 https://www.w3.org/2011/webtv/wiki/Media_APIs#Iterations_and_Timeline
13:08:20 yosuke: Here is a timeline of the use cases and requirements.
13:08:31 yosuke: Today, we'll finalise the use cases.
13:08:35 zakim, mute me
13:08:35 ddavis should now be muted
13:08:58 yosuke: After that, we'll work on the requirements and gap analysis.
13:09:25 yosuke: We'll have a one-day face-to-face meeting at TPAC in October.
13:09:33 yosuke: Any comments or questions about the schedule?
13:09:47 +CyrilRa
13:09:59 Topic: Use cases
13:10:32 https://www.w3.org/2011/webtv/wiki/New_Ideas
13:11:18 yosuke: Let's check whether each use case is worth pursuing.
13:11:21 zakim, unmute me
13:11:21 ddavis should no longer be muted
13:13:18 ddavis: Audio fingerprinting is when the app listens to a short clip of audio and compares that to a database of existing media. It can then offer more information to the user or, for example, share it on social networks.
13:13:58 yosuke: I think we can include this use case. We may be able to implement this system using client and server technology, but there may be a better way using web APIs.
13:14:53 ddavis: I agree. My instinct is that it may already be possible with existing APIs, but we should investigate this during the gap analysis phase.
13:15:00 yosuke: OK, so let's include this.
13:17:19 ddavis: Audio watermarking is similar, but the media is detected by audio tones that are inaudible to humans. These are unique to each piece of media.
13:17:29 yosuke: Let's deal with this use case in the IG as well.
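[Editor's sketch] A minimal illustration of the capture side of the audio fingerprinting and watermarking use cases discussed above, assuming a hypothetical /lookup service and a deliberately crude spectral-peak "fingerprint". It only shows that listening to a short clip is already expressible with getUserMedia and the Web Audio API; whether that is sufficient is exactly the question deferred to the gap analysis.

```typescript
// Hypothetical sketch: capture ~3 seconds of microphone audio, derive a crude
// spectral fingerprint, and send it to an (assumed) server-side lookup service.
async function identifyAmbientAudio(): Promise<unknown> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  source.connect(analyser);

  const bins = new Uint8Array(analyser.frequencyBinCount);
  const fingerprint: number[] = [];

  // Record the dominant frequency bin ten times over roughly three seconds.
  for (let i = 0; i < 10; i++) {
    await new Promise((resolve) => setTimeout(resolve, 300));
    analyser.getByteFrequencyData(bins);
    let peak = 0;
    for (let b = 1; b < bins.length; b++) {
      if (bins[b] > bins[peak]) peak = b;
    }
    fingerprint.push(peak);
  }

  stream.getTracks().forEach((track) => track.stop());
  await ctx.close();

  // "/lookup" is a placeholder endpoint backed by a fingerprint database.
  const res = await fetch("/lookup", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ fingerprint }),
  });
  return res.json(); // e.g. programme title and links to share on social networks
}
```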
13:17:46 ddavis: The next use case has been removed because it's already part of the HTML spec.
13:18:18 whyun has joined #webtv
13:18:19 ddavis: The two synchronisation use cases have not changed since the last call.
13:18:58 yosuke: I think these could be included too.
13:19:27 ddavis: I'd be interested to hear if anyone thinks these are too similar to the second-screen use cases we did in the previous gap analysis iteration.
13:19:43 wuwei has joined #webtv
13:20:00 Bin_Hu: In the first round, synchronisation was covered but the scenario was slightly different.
13:20:13 +??P14
13:20:20 q+
13:20:30 Bin_Hu: In this scenario, the media is played at identical times, so it's different.
13:20:50 Bin_Hu: The media streams may not be identical, but they're played at the same time.
13:21:02 zakim, ??P14 is me
13:21:02 +whyun; got it
13:21:14 zakim, mute me
13:21:14 whyun should now be muted
13:21:18 Bin_Hu: We can investigate to see if there's any overlap.
13:22:08 yosuke: There may be some overlap, but it's worth investigating.
13:22:26 gmandyam: We've discussed this in ATSC. One example is tracks from multiple cameras.
13:22:38 gmandyam: There's scepticism as to how important that use case is.
13:23:18 gmandyam: It could be looked into, but I won't go into too much technical detail on this call.
13:23:51 yosuke: I understand your point - there have been some ideas from the broadcasting industry to use multiple cameras and synchronise those screens.
13:24:02 yosuke: This is supposed to increase engagement.
13:24:19 yosuke: That kind of use case has been suggested but not actually adopted.
13:24:32 CyrilRa has joined #webtv
13:24:46 yosuke: There are movements in web technologies that combine multiple views into one data source.
13:25:18 yosuke: There are JavaScript frameworks that can achieve this - you can see the same object in different places within the DOM.
13:25:54 yosuke: It's good to look into this use case again.
13:26:43 yosuke: Next - interactive overlay.
13:27:00 Bin_Hu: This is from a feature that's already available on native platforms.
13:27:18 Bin_Hu: An event is triggered by in-band or out-of-band triggering mechanisms.
13:27:28 Bin_Hu: In-band means the events are embedded in the media stream.
13:27:50 Bin_Hu: Such as text or other tracks.
13:28:02 Bin_Hu: Out-of-band is at the platform level.
13:28:19 Bin_Hu: Platform triggers are passed to an API - the app can receive these events.
13:28:40 Bin_Hu: The web app can overlay the triggered content on top of the current media screen.
13:28:53 Bin_Hu: The use case is just how the user perceives this type of event.
13:29:37 Bin_Hu: The more important part is that this use case requires a more generic API to handle in-band and out-of-band triggers.
13:30:18 yosuke: I think this use case is also interesting because we don't have a generic API to cover this.
13:30:34 yosuke: External standards organisations are defining their own APIs for this.
13:30:56 yosuke: Currently, web standards are heading towards more declarative or generic APIs, so it's a good time to tackle this use case.
13:31:41 yosuke: Next use case is Clean Audio.
13:34:02 ddavis: This was submitted by Janina from the media accessibility sub-group.
13:34:12 gmandyam: How is this different to just synchronisation.
13:34:28 s/synchronisation./synchronisation?/
13:34:55 yosuke: I'm not sure this use case is a special case. I added a comment that this use case conflicts with EME.
13:35:13 yosuke: EME doesn't allow users to modify or manage audio tracks.
13:35:34 yosuke: Synchronisation use cases don't include modification.
13:37:18 ddavis: Increasing or decreasing certain frequencies differentiates it from the other use cases IMO.
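[Editor's sketch] A rough illustration of the frequency-adjustment point: the code below routes an unprotected media element through a Web Audio peaking filter so the speech band can be boosted per viewer. The 2.5 kHz centre frequency, Q, and gain values are illustrative assumptions, and, as noted in the discussion, this path is not available for EME-protected audio.

```typescript
// Illustrative only: give the viewer a per-person "speech boost" control by
// routing an (unprotected) <video> element through a Web Audio peaking filter.
function attachCleanAudioControl(video: HTMLVideoElement): (gainDb: number) => void {
  const ctx = new AudioContext();
  const source = ctx.createMediaElementSource(video);

  // Peaking filter centred on the speech band; the numbers are placeholders.
  const speechBoost = ctx.createBiquadFilter();
  speechBoost.type = "peaking";
  speechBoost.frequency.value = 2500; // Hz, roughly where consonant detail sits
  speechBoost.Q.value = 1.0;
  speechBoost.gain.value = 0;         // dB, adjusted per viewer via the setter

  source.connect(speechBoost);
  speechBoost.connect(ctx.destination);

  // The returned setter lets the app tune the boost to the individual viewer.
  return (gainDb: number) => {
    speechBoost.gain.value = gainDb;
  };
}

// Usage (hypothetical element):
// const setBoost = attachCleanAudioControl(document.querySelector("video")!);
// setBoost(6); // +6 dB around 2.5 kHz for this viewer
```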
13:38:31 [pasted excerpt from the Clean Audio use case] "...tune the app to the frequencies that are best for that particular viewer. ... This is a very important strategy. ... It could also be possible to use the same app in the cinema watching movies."
13:39:32 ddavis: Even if it does affect EME, it's still relevant for content that's not protected.
13:39:59 yosuke: Whether or not EME allows this, the accessibility task force should talk to the HTML Media Task Force.
13:40:46 ddavis: I can suggest that to Mark Sadecki.
13:41:03 ACTION: ddavis to bring up issue of frequency-changing affecting EME
13:41:03 Created ACTION-204 - Bring up issue of frequency-changing affecting eme [on Daniel Davis - due 2014-07-02].
13:41:25 yosuke: I think at least we should keep this use case as a specific example of synchronisation.
13:42:18 yosuke: We can see if it's different depending on feedback from Mark (accessibility group).
13:42:59 yosuke: I put a note about the web and TV accessibility initiative at the bottom of the page.
13:43:22 yosuke: Mark S, Janina and I exchanged some ideas.
13:43:38 q+
13:43:38 http://5stardata.info/
13:43:44 yosuke: One was a concept of 5-star accessibility, similar to the 5-star open data measurement process.
13:44:18 yosuke: This has helped increase awareness of linked open data on the web, and a similar thing could be done for accessibility.
13:44:52 yosuke: There could be a rating system for accessibility to see the achievement of a site.
13:45:18 gmandyam: Another thing we've been discussing in ATSC is personalisation information, for example closed caption preferences.
13:45:35 gmandyam: Extensions to the runtime engine are necessary for this feature so that apps can get access to the user's preferences.
13:45:59 gmandyam: There's also the question of standardised preference storage. Is this something the accessibility efforts have been considering?
13:46:36 gmandyam: This is a case where the persistent storage mechanisms (AppCache, cookies) are not sufficient and there are cross-domain issues, but data like this is universally applicable.
13:46:51 yosuke: The team is now creating media accessibility guidelines.
13:47:00 yosuke: I'm not sure it covers this point.
13:48:18 yosuke: We're not sure that the guidelines will cover this and other efforts in other SDOs.
13:48:41 yosuke: We can check this and make sure things are in line.
13:49:10 q?
13:49:20 ack gmandyam
13:49:43 yosuke: A further thing we discussed is to make the accessibility guidelines relevant for the TV industry.
13:50:07 gmandyam: I don't like other standards bodies creating new web APIs for their use cases.
13:50:44 gmandyam: If the accessibility group can come up with a recommendation, that would be useful.
13:51:47 yosuke: The third idea is a work in progress: deciding the use cases and working with the accessibility sub-group.
13:51:59 yosuke: There is also work going on in the TV Control API Community Group.
13:52:28 yosuke: This group will define a new API and they'd like to make sure that it satisfies the accessibility requirements.
13:53:06 Bin_Hu: The TV Control API CG has so far been collecting technical requirements - we plan to finish this input stage by July 6th.
13:53:25 Bin_Hu: So far we have input from Mozilla and the BBC regarding channel scan, etc.
13:53:40 Bin_Hu: One use case talks about accessibility-related requirements, but it's very simple.
13:54:07 Bin_Hu: So we're still collecting use cases. Next we'll work with the editors to address the technical requirements.
13:54:52 Bin_Hu: One requirement is the ability to show subtitles, and the other is the ability to play supplementary audio tracks (descriptions).
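[Editor's sketch] A small illustration of how those two requirements map onto the existing HTMLMediaElement track interfaces (not the TV Control API the CG is defining): subtitles via textTracks and an audio-description track via audioTracks, which the HTML spec defines but browsers implement unevenly. The selection logic and the language default are assumptions.

```typescript
// Sketch only: enable subtitles and an audio-description track using the
// generic HTMLMediaElement track interfaces.
function enableAccessibilityTracks(video: HTMLVideoElement, lang = "en"): void {
  // 1. Show subtitle/caption tracks matching the viewer's language.
  for (const track of Array.from(video.textTracks)) {
    const match =
      (track.kind === "subtitles" || track.kind === "captions") &&
      track.language.startsWith(lang);
    track.mode = match ? "showing" : "disabled";
  }

  // 2. Prefer a supplementary audio-description track when one is present.
  //    audioTracks is in the HTML spec but not universally implemented (nor
  //    in all default DOM typings), hence the defensive cast.
  const audioTracks = (video as any).audioTracks as
    | { length: number; [i: number]: { kind: string; enabled: boolean } }
    | undefined;
  if (audioTracks) {
    for (let i = 0; i < audioTracks.length; i++) {
      const t = audioTracks[i];
      // "descriptions" and "main-desc" are the audio-description kinds the
      // HTML spec defines for audio tracks.
      t.enabled = t.kind === "descriptions" || t.kind === "main-desc";
    }
  }
}
```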
13:55:18 Bin_Hu: We are not quite clear if those use cases would satisfy the accessibility guidelines.
13:55:58 jcverdie has joined #webtv
13:56:11 Bin_Hu: Another question is how accessibility APIs could benefit the whole TV industry, considering this TV Control API is only a part of hybrid TV.
13:56:20 Bin_Hu: It's still a grey area.
13:57:43 yosuke: So we should start with the big picture and, when we dig into the TV Control API, address accessibility there if it helps.
13:58:13 yosuke: Daniel and I will talk about how to proceed with the accessibility team.
13:58:39 Topic: Info for IG members about testing
13:58:51 yosuke: The co-chairs are re-visiting testing efforts.
13:59:08 yosuke: The chairs sent a message to other SDOs to ask about their thoughts on testing.
13:59:20 yosuke: The email was sent on June 19th and we're waiting for replies.
13:59:30 yosuke: We'd like to share them when we have some replies.
13:59:47 yosuke: I think that's all for now.
13:59:58 Topic: Summary
14:00:24 yosuke: How about creating a spreadsheet that summarises the use cases?
14:00:38 ddavis: What's the best format for that?
14:00:52 yosuke: The format we used for the first round is good.
14:01:02 ddavis: OK, so we can copy it and change the data.
14:01:22 yosuke: So we will create that spreadsheet to work collaboratively online.
14:01:39 yosuke: Then we can work together on the gap analysis and requirements.
14:01:53 yosuke: Any other business?
14:01:56 q?
14:02:10 q+ a11y
14:02:13 ACTION: yosuke and ddavis to create online docs for requirements/gap analysis
14:02:13 Created ACTION-205 - And ddavis to create online docs for requirements/gap analysis [on Yosuke Funahashi - due 2014-07-02].
14:02:17 unmute me
14:02:23 ack wuwei
14:02:41 wuwei: I'm from the accessibility team.
14:02:55 wuwei: Talking about the use cases, what's the timeline for discussing them?
14:03:29 https://www.w3.org/2011/webtv/wiki/Media_APIs#Iterations_and_Timeline
14:03:53 yosuke: For this second iteration of work, today we finalised the list of use cases.
14:04:27 yosuke: From today, we'll polish the use cases and extract requirements, then start a gap analysis.
14:05:18 wuwei: I want to make sure we can implement the requirements for accessibility in this second round.
14:05:28 wuwei: I'd like to join the discussion about accessibility.
14:06:08 yosuke: Thank you very much. Meeting adjourned.
14:06:09 -gmandyam
14:06:12 -Bin_Hu
14:06:13 -whyun
14:06:15 -ddavis
14:06:16 -yosuke
14:06:18 -CyrilRa
14:06:22 -wuwei
14:06:51 rrsagent, make minutes
14:06:51 I have made the request to generate http://www.w3.org/2014/06/25-webtv-minutes.html ddavis
14:07:54 -aldafu
14:07:55 UW_WebTVIG()9:00AM has ended
14:07:55 Attendees were gmandyam, ddavis, aldafu, wuwei, +1.650.946.aabb, Bin_Hu, yosuke, CyrilRa, whyun
14:28:22 jcverdie has joined #webtv
15:00:49 jcverdie has joined #webtv
15:59:31 glenn has joined #webtv
16:11:04 Zakim has left #webtv