W3C

- DRAFT -

Media & Entertainment IG monthly call

07 Apr 2020

Agenda

Attendees

Present
Chris_Needham, John_Riviello, Fuqiao_Xue, Steve_Morris, Kazuhiro_Hoya, Takio_Yamaoka, Nigel_Megitt, Will_Law, Yajun_Chen, Rob_Smith, Francois_Daoust, Kazuyuki_Ashimura, Pierre_Lemieux, Gary_Katsevman, Ali_Begen, Garrett_Singer, Huaqi_Shan, Tatsuya_Igarashi, Larry_Zhao, Zhaoxin_Tan, Barbara_Hochgesang, Peipei_Guo
Regrets
Chair
Chris
Scribe
cpn, kaz, tidoust

Contents


<cpn> scribenick: cpn

Introduction

Chris: We have one main topic today: the web media integration proposal.
... If we have time, we can give short updates on our other active work.

Web Media Integration

John and Steve's slides

John: Interop issues arise with web media APIs on devices that have hardware AV decoders
... One example is multiple video elements in a page where the device has only a single decoder
... For background, we discussed this in the CTA WAVE project and the W3C Web Media API CG
... There's an issue link in the webmediaporting repo, and we'd like to move discussion to a new repo
... HbbTV is also discussing this, and we felt W3C was the best place to bring these best practices together
... The goal is to create a guidelines document as a Note; the IG charter allows for this
... We want participation from browser vendors and device manufacturers, to bring expertise
... If we find discrepancies, we can raise them with the Media WG or WHATWG
... The first thing is to identify and capture the issues

Steve: I'll introduce a few examples, things that device manufacturers and integrators encounter with media playback on devices
... Typically this comes down to what happens when APIs acquire limited resources
... A media element where you call load() or play(): when does the scarce resource actually get claimed for use by a media element?
... What happens if it fails, how does it fail, and what is the state of the media element?
... Another example is pre-loading, as done for ad insertion with multiple media elements. What happens if you only have one hardware decoder?
... Related to this is also releasing resources: implicit or explicit release, does it remain claimed?
... What about different sources: MSE vs file resource URLs?
... This causes issues for manufacturers and also application developers
... What happens when there's resource contention? E.g., a video element and Web Audio that also wants to use the decoder?
... Is it pre-empted? There's no good answer right now.
... What happens now is that these problems are solved unofficially by manufacturers individually, or by other standards groups such as WAVE or HbbTV
... Integrators working across multiple markets may have to implement differently
... The other category of problem is timing: delays through the media decoding pipeline, which vary across platforms
... How do we deal with these delays? Which parts of the algorithms are affected?
... It could take up to half a second to initialise the hardware decoder. On some platforms you may need to reconfigure, the application would need to know where this delay happens
... Similarly with media play times, if there's latency, what time does currentTime report?
... This could be off by 750 milliseconds, but it would affect seamless playback between two pieces of video or between video and additional content
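
[For illustration, a minimal sketch of the failure mode Steve describes, assuming decoder exhaustion surfaces either as a rejected play() promise or as a MediaError; the specs do not currently pin down which, and that ambiguity is the interoperability gap at issue. The URL content.mp4 is hypothetical.]

    const video = document.createElement('video');
    video.src = 'content.mp4'; // hypothetical URL
    document.body.appendChild(video);

    video.addEventListener('error', () => {
      // On some devices a second video element fails here with
      // MEDIA_ERR_DECODE once the single hardware decoder is claimed.
      console.log('media error:', video.error && video.error.code);
    });

    video.play()
      .then(() => console.log('decoder acquired, playback started'))
      .catch((e) => {
        // On other devices the failure surfaces here instead, and the state
        // of the element (readyState, error) after rejection varies.
        console.log('play() rejected:', e.name, 'readyState:', video.readyState);
      });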

<kaz> scribenick: kaz

webmediaporting issue 30

John: We welcome everyone's feedback.
... Once we create a new repo, we would direct people there
... (possibly me-media-integration-guidelines?)

(chris rejoins)

John: Request for editor/co-editors, let's determine best place for the discussion.

Questions & Answers

Chris: Any questions?

<scribe> scribenick: cpn

Chris: We (BBC) are interested in this, not sure we can edit though. I will ask my colleagues.

Barbara: Similar for us at Intel. Is this also of interest to the Media WG?

John: We're thinking of starting in the IG but would need their help where we identify interop issues

Will: Are there other aspects to consider, such as cameras or microphones? Is there precedent in W3C for this, and a single place to bring it?

Kaz: There's been discussion in the Device and Sensors WG and the WoT WG.
... The WoT WG is gathering use case descriptions, including media use cases.
... This question is good input for that.

<Barbara_Hochgesang> Intel would be interested in this. Will have to check on who. Doubt as editor but could provide expertise input.

Will: What John's raising is important, there'd need to be strict guidelines for how it's done and interop tests.

Kaz: The MIDI API is for controlling instruments, but also for devices such as video editors.
... WoT scope includes this and IoT, this point should also be discussed with them.

Francois: I'm not aware of groups who've tackled this. The other examples, camera and microphone, also seem to be media, so this IG could cover those too.
... If you want to extend beyond media hardware, e.g., network hardware integration, that would be beyond the scope for this IG.
... Could be a Pandora's box to open this up too far.
... A question: The Web Media API CG is working on the web media integration topic. What's the overlap with the web media porting document in the CG?

John: This would be separate. There's no additional work going into that; work never got started on it
... I think it would be good to create a new location for this, to avoid confusion with that document

Will: I can support John's point of view on that
... We could deprecate that document and point to the IG document instead

Chris: Is there not still interest in the broader integration issues in webmediaporting, e.g., local storage limits, etc.?

Will: There is interest, but not active contribution on it
... The IG document would be referenced by the WAVE device playback spec

John: And this could lead to updates to W3C specs

<scribe> scribenick: tidoust

Chris: Some of the issues that Steve described would be good input to the Media WG right now. There's overlap with Media Capabilities. Codec transitions for ad insertion use cases would also be good input. It's in the explainer, not sure if it's in the spec yet.
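
[A sketch of the Media Capabilities overlap Chris mentions: decodingInfo() answers whether the device can decode a given configuration at all, smoothly, and power-efficiently, but not whether a hardware decoder is free right now, which is the resource-contention question raised above. The configuration values are illustrative.]

    const config = {
      type: 'media-source', // querying for MSE playback
      video: {
        contentType: 'video/mp4; codecs="avc1.640028"',
        width: 1920,
        height: 1080,
        bitrate: 8000000,
        framerate: 30,
      },
    };

    navigator.mediaCapabilities.decodingInfo(config).then((result) => {
      console.log('supported:', result.supported,
                  'smooth:', result.smooth,
                  'powerEfficient:', result.powerEfficient);
    });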

<Barbara_Hochgesang> On use cases, video conferencing would be impacted. WebRTC community group may have input?

<cpn> scribenick: cpn

Steve: We want to look at the economies of the market and lifetime of devices, e.g., TV devices don't receive software updates after being shipped, or after a year
... So you end up with a mixed set of capabilities in the field.
... The capabilities API could really help, together with integration guidelines, to deliver a good user experience.

Chris: Recommend bringing to Media WG for Media Capabilities and also describing integration in the IG document.

Igarashi: Does this also apply to software based devices?

John: Yes, it's not limited to just hardware

Igarashi: Do you see similar issues with software implementations?

John: Yes we do, and our goal is to identify the interoperability issues, and document them as best practices to inform developers

Igarashi: So the purpose is to clarify the behaviour of the media elements?

John: Yes

Kaz: In this case, I want to suggest further discussion with WoT.
... We can think about the hardware and software features, and have a description of which capabilities use each.
... WoT has been working on device descriptions, and now seeks input on data resources such as media streaming between devices.

Nigel: Thinking about the impact of the start-up time: is the issue that users don't get a timely response, so it's not possible to deliver a user experience with the right timeliness, or is it a coding problem, where a Promise kind of approach would work?

Steve: It depends on the use case.
... With things like media time, you need to be careful when querying at the start vs later on
... Are the reported timings actually different in practice?
... They are, depending on where in the pipeline the time is taken, and cumulative error can build up
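
[A sketch of the Promise-style approach Nigel raises, as one way to absorb the start-up delay: wait for the 'playing' event before sampling currentTime, so the decoder initialisation time (up to half a second on some devices, per Steve) is absorbed before the application takes its timing reference. startAndSample is a hypothetical helper, not an existing API.]

    function startAndSample(video) {
      return new Promise((resolve, reject) => {
        video.addEventListener('playing', () => {
          // Even here, currentTime can differ from the frame actually on
          // screen depending on where in the pipeline it is sampled, so
          // per-device calibration may still be needed.
          resolve(video.currentTime);
        }, { once: true });
        video.play().catch(reject);
      });
    }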

Next steps

<Zakim> tidoust, you wanted to discuss IG process to adopt the document

<tidoust> scribenick: tidoust

Francois: For the IG to host this document, I would imagine that we'd need to run a CfC in the IG to hear from everyone. Also, for that to work, an editor seems a prerequisite

<kaz> scribenick: kaz

Chris: Would like to ask my co-Chairs about starting up a new activity.

Igarashi: We need to involve browser vendors if it's related to browser implementation.

Chris: That's true, one person already commented on issue #30, and I would like to build on that collaboration with them.
... Regarding a CfC, the proposal is to create an IG repo, like me-media-integration-guidelines.

Igarashi: Is this really an integration issue? Might be more of a media implementation issue.

<nigel> +1 I was thinking the same

<nigel> Unless the specs that define media time reporting API are ambiguous

Igarashi: We'd like to clarify the behavior of the media APIs?

Chris: Clarifications to media APIs could be part of the results.

Pierre: So media API guidelines?

Igarashi: Sounds like "implementation guidelines".

Pierre: What's the name in your mind then?

Igarashi: Implementation issues for media APIs, something that focuses more on the goal of this work.

Pierre: Would it be better not to include "issue" in the name?

Igarashi: What is the most critical issue here?
... Accuracy of media playback?

Chris: It seems to be acquisition and releasing of limited resources, and resource contention.

Igarashi: In that case, what about "media capability detection"? Is that the goal?

<Barbara_Hochgesang> And dependent on the network

<nigel> I think the name needs to be more tightly scoped to the problem. Media Interoperability Guidelines is really generic, and could be mistaken for, e.g. CMAF.

<nigel> Hardware Resource Availability Guidelines?

<cpn> scribenick: cpn

Kaz: I'm OK with the currently proposed name. If we create a TF it could cover other topics, so let's clarify the scope; if the target is just issue #30, the name could be OK.

<JohnRiv> Nigel, Dan from the Chromium team mentioned in a comment on the current issue that there is more than just hardware: "Hardware codecs are not our only constrained resource. CPU memory, CPU cycles, DMA bandwidth, GPU memory, and network bandwidth are all impacted."

Pierre: It should become apparent what the name should be as work proceeds. I suggest creating a PR against the MEIG repo, and whoever does that can choose a name.

<tidoust> [The editor-to-be could also just create a repo under his own name and share it with the IG, we can migrate it afterwards]

John: I can do that.

Chris: Are we ok with this approach?

(no objections)

Next MEIG calls

Chris: The next MEIG call is on May 5th.
... Also April 20th for Media Timed Events.
... A next call for the Bullet Chatting Task Force will be announced when a date is decided. We will discuss offline.

<kaz> [adjourned]

<JohnRiv> thank you everyone

Summary of Action Items

Summary of Resolutions

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.152 (CVS log)
$Date: 2020/06/23 06:44:11 $