W3C

- DRAFT -

Web and TV Interest Group Teleconference

25 Jun 2014

See also: IRC log

Attendees

Present
Regrets
Chair
Yosuke
Scribe
Daniel

Contents

    Topics
        1. Use cases
        2. Info for IG members about testing
        3. Summary
    Summary of Action Items


<trackbot> Date: 25 June 2014

<scribe> scribenick: ddavis

<scribe> scribe: Daniel

Previous minutes: http://www.w3.org/2014/06/11-webtv-minutes.html

<yosuke> https://www.w3.org/2011/webtv/wiki/Media_APIs#Iterations_and_Timeline

yosuke: Here is a timeline of the use cases and requirements.
... Today, we'll finalise the use cases
... After that, we'll work on the requirements and gap analysis
... We'll have a one-day face-to-face meeting at TPAC in October
... Any comments or questions about the schedule?

Use cases

<yosuke> https://www.w3.org/2011/webtv/wiki/New_Ideas

yosuke: Let's check whether each use case is worth pursuing.

ddavis: Audio fingerprinting is when the app listens to a short clip of audio and compares it to a database of existing media. It can then offer more information to the user or, for example, share it on social networks.

yosuke: I think we can include this use case. We may be able to implement this system using client and server technology but there may be a better way using web APIs.

ddavis: I agree. My instinct is it could be possible already with existing APIs but we should investigate this during the gap analysis phase.

yosuke: OK, so let's include this.
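
As a rough illustration of how far existing APIs might already go, here is a minimal sketch that captures a short audio clip with getUserMedia and the Web Audio API and posts spectrum data to a matching service. The /match endpoint and its response shape are hypothetical; a production fingerprinter would compute a compact hash rather than send raw spectra.

    // Sketch: capture ~3 seconds of ambient audio and send spectrum frames
    // to a fingerprint-matching service. "/match" is a hypothetical endpoint.
    async function captureAndMatch(): Promise<void> {
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
      const ctx = new AudioContext();
      const analyser = ctx.createAnalyser();
      analyser.fftSize = 2048;
      ctx.createMediaStreamSource(stream).connect(analyser);

      // Sample the frequency spectrum every 100 ms for ~3 seconds.
      const frames: number[][] = [];
      for (let i = 0; i < 30; i++) {
        const bins = new Uint8Array(analyser.frequencyBinCount);
        analyser.getByteFrequencyData(bins);
        frames.push(Array.from(bins));
        await new Promise((resolve) => setTimeout(resolve, 100));
      }
      stream.getTracks().forEach((track) => track.stop());

      // A real implementation would hash these frames into a compact fingerprint.
      const res = await fetch("/match", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ frames }),
      });
      console.log("Possible match:", await res.json());
    }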

ddavis: Audio watermarking is similar, but the media is detected via audio tones that are inaudible to humans. These tones are unique to each piece of media.

yosuke: Let's deal with this use case in the IG as well.

ddavis: The next use case has been removed because it's already part of the HTML spec.
... The two synchronisation use cases have not changed since the last call.

yosuke: I think these could be included too.

ddavis: I'd be interested to hear if anyone thinks these are too similar to second-screen use cases we did in the previous gap analysis iteration.

Bin_Hu: In the first round, synchronisation was covered but the scenario was slightly different.
... In this scenario, the media is played at identical times so it's different.
... The media stream may not be identical but it's played at the same time.
... We can investigate to see if there's any overlap.

yosuke: There may be some overlap but it's worth investigating.
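
For reference during the gap analysis, a minimal sketch of one way to keep two media elements playing at the same time with existing APIs. The element IDs are hypothetical and the drift threshold is illustrative.

    // Sketch: keep a secondary stream (e.g. an alternative camera angle)
    // in step with a main stream.
    const main = document.getElementById("main-video") as HTMLVideoElement;
    const secondary = document.getElementById("alt-camera") as HTMLVideoElement;

    const DRIFT_THRESHOLD = 0.2; // seconds of drift tolerated before correcting

    setInterval(() => {
      const drift = secondary.currentTime - main.currentTime;
      if (Math.abs(drift) > DRIFT_THRESHOLD) {
        // Hard seek on large drift; small drift could instead be corrected
        // smoothly by nudging secondary.playbackRate.
        secondary.currentTime = main.currentTime;
      }
    }, 250);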

gmandyam: We've discussed this in ATSC. One example is tracks from multiple cameras.
... There's scepticism as to how important that use case is.
... It could be looked into but I won't go into too much technical detail on this call.

yosuke: I understand your point - there have been some ideas from the broadcasting industry to use multiple cameras and synchronise those screens.
... This is supposed to increase engagement.
... That kind of use case has been suggested but not actually adopted.
... There are movements in web technologies that combine multiple views into one data source.
... There are JavaScript frameworks that can achieve this - you can see the same object in different places within the DOM.
... It's good to look into this use case again.
... Next - interactive overlay.

Bin_Hu: This is from a feature that's already available on native platforms.
... An event is triggered by in-band or out-band triggering mechanisms.
... In-band means the events are embedded in the media stream.
... Such as text or other tracks.
... Out-band is at the platform level.
... Platform triggers are passed to an API - the app can receive these events.
... The web app can overlay the triggered content on top of the current media screen.
... The use case is just how the user perceives this type of event.
... The more important part is that this use case requires a more generic API to handle in-band and out-band triggers.
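
As a point of comparison for the gap analysis, in-band triggers can already be approximated with the TextTrack API, using a hidden metadata track whose cues carry the overlay payload. The JSON payload format and the showOverlay helper are hypothetical; out-of-band (platform-level) triggers have no equivalent generic web API, which is the gap this use case identifies.

    // Sketch: a hidden "metadata" text track delivers in-band triggers as cues.
    const video = document.querySelector("video") as HTMLVideoElement;
    const triggers = video.addTextTrack("metadata", "triggers");
    triggers.mode = "hidden"; // fire cue events without rendering the cues

    // In a real stream the cues arrive in-band; here one is added manually.
    triggers.addCue(new VTTCue(10, 15, JSON.stringify({ overlay: "poll", id: "q1" })));

    triggers.addEventListener("cuechange", () => {
      const cues = triggers.activeCues;
      if (!cues) return;
      for (let i = 0; i < cues.length; i++) {
        showOverlay(JSON.parse((cues[i] as VTTCue).text));
      }
    });

    // Hypothetical helper that renders the triggered content.
    function showOverlay(data: { overlay: string; id: string }): void {
      const div = document.createElement("div");
      div.className = "overlay";
      div.textContent = `Overlay triggered: ${data.overlay} (${data.id})`;
      document.body.appendChild(div);
    }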

yosuke: I think this use case is also interesting because we don't have a generic API to cover this.
... External standards organisations are defining their own APIs for this.
... Currently, web standards are heading towards more declarative or generic APIs so it's a good time to tackle this use case.
... The next use case is Clean Audio.

ddavis: This was submitted by Janina from the media accessibility sub-group.

gmandyam: How is this different to just synchronisation?

yosuke: I'm not sure this use case is distinct from synchronisation. I added a comment that this use case conflicts with EME.
... EME doesn't allow users to modify or manage audio tracks.
... Synchronisation use cases don't include modification.

ddavis: Increasing or decreasing certain frequencies differentiates it from the other use cases IMO.
... The idea is that viewers could tune the app to the frequencies that are best for that particular viewer.
... This is a very important strategy.
... It could also be possible to use the same app in the cinema when watching movies.
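
A minimal sketch of the frequency-tuning idea using the Web Audio API's BiquadFilterNode, assuming unprotected content. The filter values are illustrative; routing a media element through such a graph is exactly what EME-protected content would prevent, which is the conflict noted above.

    // Sketch: boost the speech band of a playing video for a hard-of-hearing
    // viewer. A real app would expose these values as per-viewer settings.
    const audioCtx = new AudioContext();
    const element = document.querySelector("video") as HTMLVideoElement;
    const source = audioCtx.createMediaElementSource(element);

    const speechBoost = audioCtx.createBiquadFilter();
    speechBoost.type = "peaking";
    speechBoost.frequency.value = 2500; // roughly the speech-intelligibility band
    speechBoost.Q.value = 1.0;
    speechBoost.gain.value = 6; // dB, tunable by the viewer

    // Note: building this graph requires access to the decoded audio,
    // which protected (EME) content withholds.
    source.connect(speechBoost).connect(audioCtx.destination);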

ddavis: Even if it does affect EME it's still relevant for content that's not protected.

yosuke: Whether or not EME allows this, the accessibility task force should talk to the HTML Media Task Force.

ddavis: I can suggest that to Mark Sadecki.

<scribe> ACTION: ddavis to bring up issue of frequency-changing affecting EME [recorded in http://www.w3.org/2014/06/25-webtv-minutes.html#action01]

<trackbot> Created ACTION-204 - Bring up issue of frequency-changing affecting eme [on Daniel Davis - due 2014-07-02].

yosuke: I think at least we should keep this use case as a specific example of synchronisation.
... We can see if it's different depending on feedback from Mark (accessibility group).
... I put a note about the web and TV accessibility initiative at the bottom of the page.
... Mark S, Janina and I exchanged some ideas.

<yosuke> http://5stardata.info/

yosuke: One was a concept of 5-star accessibility, similar to the 5-star open data measurement process.
... This has helped increase awareness of linked open data on the web and a similar thing could be done for accessibility.
... There could be a rating system for accessibility showing how much a site has achieved.

gmandyam: Another thing we've been discussing in ATSC is personalisation information, for example closed-caption preferences.
... Extensions to the runtime engine are necessary for this feature so that apps can get access to the user's preferences.
... There is standardised storage of preferences. Is this something the accessibility efforts have been considering?
... This is a case where the persistent storage mechanisms (AppCache, cookies) are not sufficient and there are cross-domain issues, but data like this is universally applicable.
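
To make the limitation concrete: a preference like this is easy to store with existing web storage, but it stays scoped to a single origin, so it cannot follow the user across TV apps from different domains. A minimal sketch, with a hypothetical key name and preference shape:

    // Sketch: store a closed-caption preference with web storage.
    // localStorage is origin-scoped - the cross-domain limitation above.
    interface A11yPrefs {
      captions: boolean;
      captionLanguage: string;
    }

    function savePrefs(prefs: A11yPrefs): void {
      localStorage.setItem("a11y-prefs", JSON.stringify(prefs));
    }

    function loadPrefs(): A11yPrefs | null {
      const raw = localStorage.getItem("a11y-prefs");
      return raw ? (JSON.parse(raw) as A11yPrefs) : null;
    }

    savePrefs({ captions: true, captionLanguage: "en" });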

yosuke: The team is now creating media accessibility guidelines.
... I'm not sure it covers this point.
... We're not sure that the guidelines will cover this and other efforts in other SDOs.
... We can check this and make sure things are in line.
... A further thing we discussed is making the accessibility guidelines relevant for the TV industry.

gmandyam: I don't like other standards bodies creating new web APIs for their use cases.
... If the accessibility group can come up with a recommendation that would be useful.

yosuke: The third idea is a work in progress: deciding the use cases and working with the accessibility sub-group.
... There is also work going on in the TV Control API Community Group.
... This group will define a new API and they'd like to make sure that it satisfies the accessibility requirements.

Bin_Hu: The TV Control API CG has so far been collecting technical requirements - we plan to finish this input stage by July 6th.
... So far we have input from Mozilla and the BBC regarding channel scan, etc.
... One use case talks about accessibility-related requirements but it's very simple.
... So we're still collecting use cases. Next we'll work with the editors to address the technical requirements.
... One requirement is the ability to show subtitles, the other is the ability to play supplementary audio tracks (descriptions).
... We are not quite clear if those use cases would satisfy the accessibility guidelines.
... Another question is how accessibility APIs could benefit the whole TV industry, considering this TV Control API is only a part of hybrid TV.
... It's still a grey area.
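
For comparison, the two requirements above can be sketched against the existing HTMLMediaElement track APIs, independently of the TV Control API being defined. Note that audioTracks is in the HTML spec but not universally implemented, and the track kinds below are illustrative.

    // Sketch: the two requirements against existing HTMLMediaElement APIs.
    const tv = document.querySelector("video") as HTMLVideoElement;

    // Requirement 1: show subtitles.
    for (const t of Array.from(tv.textTracks)) {
      t.mode = t.kind === "subtitles" ? "showing" : "disabled";
    }

    // Requirement 2: enable a supplementary audio track (audio description).
    // audioTracks is spec'd in HTML but not in all browsers, hence the cast.
    const audioTracks = (tv as any).audioTracks;
    if (audioTracks) {
      for (let i = 0; i < audioTracks.length; i++) {
        audioTracks[i].enabled = audioTracks[i].kind === "descriptions";
      }
    }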

yosuke: So we should start with the big picture, and when we dig into the TV Control API, we can address accessibility there if it helps.
... Daniel and I will talk about how to proceed with the accessibility team.

Info for IG members about testing

yosuke: The co-chairs are re-visiting testing efforts.
... The chairs sent a message to other SDOs asking for their thoughts on testing.
... The email was sent on June 19th and we're waiting for replies.
... We'd like to share them when we have some replies.
... I think that's all for now.

Summary

yosuke: How about creating a spreadsheet that summarises the use cases?

ddavis: What's the best format for that?

yosuke: The format we used for the first round is good.

ddavis: OK, so we can copy it and change the data.

yosuke: So we will create that spreadsheet to work collaboratively online.
... Then we can work together on the gap analysis and requirements.
... Any other business?

<scribe> ACTION: yosuke and ddavis to create online docs for requirements/gap analysis [recorded in http://www.w3.org/2014/06/25-webtv-minutes.html#action02]

<trackbot> Created ACTION-205 - And ddavis to create online docs for requirements/gap analysis [on Yosuke Funahashi - due 2014-07-02].

wuwei: I'm from the accessibility team.
... Regarding the use cases, what's the timeline for discussing them?

<yosuke> https://www.w3.org/2011/webtv/wiki/Media_APIs#Iterations_and_Timeline

yosuke: For this second iteration of work, today we finalised the list of use cases.
... From today, we'll polish the use cases and extract requirements, then start a gap analysis.

wuwei: I want to make sure we can implement the requirements for accessibility in this second round.
... I'd like to join the discussion about accessibility.

yosuke: Thank you very much. Meeting adjourned.

Summary of Action Items

[NEW] ACTION: ddavis to bring up issue of frequency-changing affecting EME [recorded in http://www.w3.org/2014/06/25-webtv-minutes.html#action01]
[NEW] ACTION: yosuke and ddavis to create online docs for requirements/gap analysis [recorded in http://www.w3.org/2014/06/25-webtv-minutes.html#action02]
 
[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.138 (CVS log)
$Date: 2014/06/25 14:12:25 $
