See also: IRC log
<ddavis> scribenick: ddavis
<yosuke> http://www.w3.org/2011/webtv/wiki/New_Ideas
yosuke: So there are some use
cases from IG members and several concerning
accessibility.
... So going through the wiki page...
... The first use case is audio fingerprinting.
<kaz> scribenick: kaz
<inserted> Audio Fingerprinting
ddavis: ATSC has put out a call for
proposals on this
... Facebook has announced they're adding watermarking
capability to their mobile apps
<gmandyam> Just to clarify: The ATSC put out a call-for-proposals for watermarking video/audio with an embedded low rate data channel (around 1 kbps), where the user device could extract information in the broadcast feed to retrieve additional data related to the content being played.
paul: uncertain about what to do...
ddavis: would add clarification
paul: great
gmandyam: video watermark and
audio watermark
... audio fingerprinting for location identification,
etc.
... think video watermarking fits the description here
paul: what do you mean by "video
watermarking" and "audio watermarking"?
... e.g. using the microphone for content recognition?
ddavis: maybe there are two separate use cases here
yosuke: agree
ddavis: would split them then
jcverdie: concerned about patents
ddavis: thought about that...
paul: this is more about APIs
ddavis: need to investigate
paul: EME also has APIs for
content protection
... may be related to patents
ddavis: EME itself is an
exception
... would investigate that point
<inserted> Media Playback Adjustment
ddavis: adjust media playback speed, etc.
<PaulHiggs> <video> can change speed
<PaulHiggs> but audio is likely to be muted
yosuke: thinks this feature is
important for broadcasting services
... any other questions?
aldafu: playback rate for
video
... possible to do it
ddavis: but audio may be muted
<ddavis> scribenick: ddavis
kaz: When we investigate this,
maybe we can consider extending it. For example, with Japanese
videos, if we play back at a higher speed, the player can play
the video and sound faster while the speech is reproduced at
its original pitch.
... So there's a special speech playback technology. The sound
is not muted and the sound quality is not changed.
yosuke: I think modern VCRs have
similar functions.
... So when we play back at double speed the voice may be
high-pitched, which we don't want.
<PaulHiggs> how does the re-sync occur?
<aldafu> ddavis: check http://www.w3.org/2010/05/video/mediaevents.html
yosuke: If the HTML5 media element doesn't have that function, it's worth considering.
<aldafu> ddavis, you can actually control playback rate and audio goes along for me
<aldafu> ddavis, not really, it becomes pretty warped
<kaz> [some smart recorders have the capability to preserve the audio pitch]
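The discussion above can be sketched in code: the HTML5 media element already exposes a `playbackRate` property, and some engines offer pitch preservation. This is a minimal sketch; the 0.5–2.0 clamping range is an illustrative assumption, not part of any spec:

```typescript
// Clamp a requested playback rate to a range where speech usually
// stays intelligible; the 0.5-2.0 bounds are illustrative, not spec-defined.
function choosePlaybackRate(requested: number): number {
  const MIN = 0.5;
  const MAX = 2.0;
  return Math.min(MAX, Math.max(MIN, requested));
}

// In a browser (not run here), this could be applied as:
//   const video = document.querySelector("video")!;
//   video.preservesPitch = true;   // keep original pitch where supported
//   video.playbackRate = choosePlaybackRate(1.5);
```

Whether the audio stays audible (rather than muted or "warped", as noted above) depends on the engine's pitch-correction support.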
ddavis: Next use case is about media synchronisation. For example, two people watching the same content on two devices (e.g. next to each other on the train) would like to have the videos synchronised so they can laugh at the same jokes at the same time.
<PaulHiggs> there are several industry initiatives on screen synchronization
<inserted> Media Stream Synchronization
yosuke: Using watermarking can take time so it may not work.
PaulHiggs: There are a few other industry organisations looking at synchronising streams - AKA screen synchronisation or companion device synchronisation.
yosuke: This use case has some
new perspective.
... The broadcasting industry is syncing a broadcast stream and
video, whereas this use case is about syncing two videos.
PaulHiggs: I read a couple of
things in this - there's multi-camera angle synchronisation and
multi-video synchronisation.
... It could be 2.5 use cases. The multi-angle one could be on
the same device or separate devices.
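One way to approach the two-device case: exchange playback positions over the network, then converge by nudging the playback rate for small drifts and seeking for large ones. The thresholds and gain below are illustrative assumptions, not anything proposed in the discussion:

```typescript
// Decide how to correct drift between a local player and a peer.
// The 1 s hard-seek threshold and the ±5% rate nudge are illustrative.
interface SyncAction {
  seekTo?: number; // absolute position to jump to, if any (seconds)
  rate: number;    // playbackRate to apply
}

function syncStep(localTime: number, peerTime: number): SyncAction {
  const drift = peerTime - localTime; // positive: we are behind the peer
  if (Math.abs(drift) > 1.0) {
    return { seekTo: peerTime, rate: 1.0 }; // large drift: hard seek
  }
  // small drift: adjust the rate slightly so the players converge smoothly
  const nudge = Math.max(-0.05, Math.min(0.05, drift * 0.1));
  return { rate: 1.0 + nudge };
}
```

Rate-nudging avoids the visible jumps that repeated seeking would cause, which matters for the "laugh at the same jokes" scenario above.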
<scribe> ACTION: ddavis to clarify use case 1 description [recorded in http://www.w3.org/2014/05/28-webtv-minutes.html#action01]
<trackbot> Created ACTION-194 - Clarify use case 1 description [on Daniel Davis - due 2014-06-04].
<scribe> ACTION: ddavis to check current ability to change playback rate for HTML media element. [recorded in http://www.w3.org/2014/05/28-webtv-minutes.html#action02]
<trackbot> Created ACTION-195 - Check current ability to change playback rate for html media element. [on Daniel Davis - due 2014-06-04].
<scribe> ACTION: ddavis to split synchronisation use case (#4) [recorded in http://www.w3.org/2014/05/28-webtv-minutes.html#action03]
<trackbot> Created ACTION-196 - Split synchronisation use case (#4) [on Daniel Davis - due 2014-06-04].
<inserted> Triggered Interactive Overlay UC
yosuke: Bin, could you explain use case 4 please?
Bin_Hu: When you watch TV,
depending on the service provider you'll be shown an overlay
promoting another channel, for example.
... The content is triggered by content in the main
stream.
... There are a few requirements that can be extracted from
this.
... So the trigger could be within the stream, or it could be
within the platform and associated with a channel.
... Also, you may be viewing a baseball game, for example, and
there could be a trigger that shows an overlay.
... This overlay could show a different team.
... These overlays must be valid only during that 15- or
30-second window.
... Or it could be triggered by a particular actor in the
content.
<PaulHiggs> can you clarify why this only applies to a "Hybrid STB"?
Bin_Hu: This trigger must be technology-agnostic.
<PaulHiggs> HbbTV and MPEG DASH already provide "standardized" events
Bin_Hu: HTML5 may be enhanced by
adding event types to support these different triggers.
... On the other hand, in HTML5 there may be some other ways,
specifically in specs for the TV industry.
yosuke: These trigger event types
may be related to the Media In-Band Resource Community
Group.
... They're looking at MPEG events and tracks
Bin_Hu: Right, it may be part of that or it could be something for another group to look into.
PaulHiggs: In the use case you say this applies to a hybrid STB. Any reason why it has to be a hybrid STB?
Bin_Hu: That's just an example. Maybe IP STB or other is also applicable.
jcverdie: I'm confused about the difference between your use case and a basic web app that gets web notifications from the server.
Bin_Hu: For example with sports,
you may have a trigger engine within the STB, e.g. at 7:00pm,
so the user may be offered other content.
... But how to enable those events on the web platform is the
issue.
... How to create the same experience on top of playing content
(e.g. Channel 5)?
... The native platform is able to use such trigger
engines.
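The validity window Bin mentioned (an overlay that is only active for 15 or 30 seconds after its trigger fires) could be modelled as a simple interval check. The `Trigger` shape here is an illustrative assumption, not a proposed API:

```typescript
// An in-band or platform trigger with a bounded validity window.
interface Trigger {
  startTime: number; // media time at which the trigger fires (seconds)
  duration: number;  // how long the overlay stays valid (e.g. 15 or 30)
}

// True only while the media time falls inside the trigger's window.
function isTriggerActive(t: Trigger, mediaTime: number): boolean {
  return mediaTime >= t.startTime && mediaTime < t.startTime + t.duration;
}
```

In a browser, such triggers might surface as text-track cues on the media element, with the page reacting to cue-change events; the interval check above is the technology-agnostic core.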
yosuke: Any other comments?
... Bin, I think this is interesting so I'd like to do some gap
analysis of this use case and the Media In-Band Resource
CG.
... I'll do that quickly and put the results on the mailing
list.
Bin_Hu: Great, thank you.
kaz: I'd also like to join the gap analysis work because there's a slight possibility of MMI work being related.
<PaulHiggs> what is MMI?
yosuke: Do we have any of the accessibility people here today?
kaz: No but we can invite them for the next time.
<PaulHiggs> is this "accessibility" for those with disabilities?
ddavis: These are additional parts to the use cases from the first round.
kaz: These are not necessarily brand new use cases. We can start by briefly checking the use-case additions that have been submitted.
yosuke: So you suggest we walk through them quickly?
kaz: Yes, some of them.
... Maybe we can just pick up the accessibility portion from
some of them.
yosuke: Do you have any suggestions about which to look at?
kaz: Mark Sadecki (W3C) and John Foliot (member) created these. John considered media accessibility requirements so let's look at UC 8.
<kaz> Accessibility Extension Use Case 8
yosuke: The use case is Download
and Go.
... The original use case is when the user downloads content on
to a tablet to watch later.
... Accessibility requirement is that whenever a media file is
downloaded, all related resources (captions, etc.) must also be
available for download.
... The download of the supplemental resources should maintain
the hierarchy.
... The file directory structure is preserved in the
download.
... For example, putting all files in a zip or cab file.
kaz: So this means we have to
think about how to pack all related content.
... This is similar to the EPUB format for digital books.
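The packaging requirement above can be sketched as collecting the media file plus its supplemental resources under their original relative paths before zipping them. The caption and audio-description file names below are purely illustrative:

```typescript
// Given a media file path, list the related accessibility resources
// that should travel with it, preserving the directory hierarchy.
// The ".en.vtt" caption and ".desc.mp3" names are illustrative.
function relatedResources(mediaPath: string): string[] {
  const base = mediaPath.replace(/\.[^./]+$/, ""); // strip the extension
  return [
    mediaPath,          // the media file itself
    `${base}.en.vtt`,   // captions
    `${base}.desc.mp3`, // audio description
  ];
}
```

Because the returned paths keep their directory prefixes, archiving them directly (e.g. into a zip) preserves the hierarchy, as the use case requires.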
yosuke: The reality of the
industry is that the recorder splits the information from the
original stream.
... I think we can suggest something to the TV industry.
kaz: We could also think about
media synchronisation for this purpose as well.
... Not only in-band information but also out-of-band
information.
yosuke: I think there are two
ways to deal with these requirements.
... One way is to improve these use cases.
... But to me, these sound like more fundamental
requirements.
... It sounds similar to GGIE, which is now discussing the
nodes between original creators and consumers, and how content
should not change between them.
... We could create a note or guideline about essential
requirements for web and TV.
... Which way do you think fits into W3C's way of working
best?
kaz: Maybe we could start with a note or guidelines and if needed pick up additional requirements or use cases later.
ddavis: So we can get general, more common accessibility issues as a note or set of guidelines, and then particular ones may be better as a new use case.
yosuke: People from the industry
can reference what kind of requirements they need to satisfy by
looking at our guidelines. For something more specific we can
have a separate use case.
... So Kaz or Daniel, could you feed this back to the
accessibility group?
ddavis: Yes
yosuke: Should we invite them to this call?
kaz: Yes, we can do that.
yosuke: Should the IG do
something before the next call?
... I can classify their requirements into general ones and
specific ones.
ddavis: That sounds very useful to me.
kaz: we can also send a call for participation on this topic.
<gmandyam> Giri signing off - thanks.
ddavis: The deadline for use case submissions is the end of this week, right?
kaz: Yes, but we can polish use cases after that.
yosuke: So we've gone through all the use cases.
<scribe> ACTION: Yosuke to classify the accessibility requirements into general ones and specific ones. [recorded in http://www.w3.org/2014/05/28-webtv-minutes.html#action04]
<trackbot> Created ACTION-197 - Classify the accessibility requirements into general ones and specific ones. [on Yosuke Funahashi - due 2014-06-04].
ddavis: With 4K content starting to increase (e.g. on Netflix), is this something that can affect our use cases? Do we need new ones?
yosuke: In Japan broadcasters
have started looking at captions for 4K but it's still at a
very early stage.
... I'm not sure the caption people have considered 4K. It may
be better to ask them if there are gaps or problems.
ddavis: OK, I'll ask them.
yosuke: During the last TPAC the work of the Community Group was migrated into the Working Group.
ddavis: I can speak to the Timed Text Working Group
<scribe> ACTION: ddavis to ask Timed Text WG about 4k affecting captioning. [recorded in http://www.w3.org/2014/05/28-webtv-minutes.html#action05]
<trackbot> Created ACTION-198 - Ask timed text wg about 4k affecting captioning. [on Daniel Davis - due 2014-06-04].
yosuke: There may be other
non-captioning issues with 4K.
... We can take a step-by-step approach, looking at captioning
first.
... Another alternative is to also discuss other issues within
the IG.
... What do you think?
kaz: I think a step-by-step approach is better.
yosuke: Let's go with a step-by-step approach.
PaulHiggs: If we're going to
start looking at future technologies, there are all kinds of
things about gesture- and speech-based input/control.
... Are you expecting use cases on those?
PaulHiggs: So we could focus on 4K but then we don't look at something else.
yosuke: Any topic that IG members
or the industry is interested in can be the topic for these use
cases.
... For example, Netflix is bringing 4K content so it's worth
looking at.
... The Japanese industry will also start broadcasting 4K
trials during this year's World Cup. So it's not a far-future
technology.
... If there's something that can affect web users, it could be
a relevant topic.
ddavis: There's also a requirement that any use case is fine as long as somebody has ownership of it.
PaulHiggs: Should Netflix be bringing this to the discussion?
yosuke: What about asking the
stakeholders to submit issues or problems?
... There are some research facilities dealing with 4k - we can
ask them about issues and web standards.
kaz: You can also bring your ideas to the wiki.
PaulHiggs: What I'm trying to say
is let's solve a problem where there is one.
... I don't think we can answer what they need - only they can
answer that.
yosuke: Paul's point is good, so
how about creating an invitation or questionnaire for 4k
stakeholders?
... We can create the text and ask stakeholders, sharing the
result with the IG.
... Based on that we can decide what to do next.
<scribe> ACTION: ddavis and yosuke to create questionnaire for 4k stakeholders about web standards issues. [recorded in http://www.w3.org/2014/05/28-webtv-minutes.html#action06]
<trackbot> Created ACTION-199 - And yosuke to create questionnaire for 4k stakeholders about web standards issues. [on Daniel Davis - due 2014-06-04].
yosuke: Thank you all. Let's do follow-up work on the mailing list.
<kaz> [ adjourned ]