W3C

- DRAFT -

MEIG Monthly Call

01 Dec 2020

Agenda

Attendees

Present
Kaz_Ashimura, Chris_Needham, Chris_Seeger, Francois_Daoust, Gary_Katsevman, Kazuhiro_Hoya, Leonard_Rosenthol, Nigel_Megitt, Pierre-Anthony_Lemieux, Takio_Yamaoka, Tatsuya_Igarashi, Will_Law, Zachary_Cava
Regrets
Chair
Chris
Scribe
tidoust

Contents


<scribe> scribe: tidoust

Agenda recap

ChrisN: I wanted us to look at outcomes from the TPAC meetings: color (HDR and WCG), which Pierre will report on; CMAF, Media Capabilities and MSE; the Media Integration Guidelines, for which we need some next steps; and commonalities between the WebRTC architecture and streaming media delivery
... We're running a survey to find a possibly better meeting time

ChrisS: I'm new to the group. On the broadcast side, I'm interested in still graphics signaling and its integration into ATSC 3.0 and the HbbTV hybrid mode, where content may be pulled from the Web

ChrisN: OK, let's come back to that.

Kaz: Maybe not for today, but we might want to look into the MiniApps proposal at some point.
... Proposal for a new platform for Web apps, which may affect media playback as well.
... We had a tutorial on MiniApps today for the Japanese Members, and there were 65 attendees,
... so there seems to be interest

ChrisN: If there are specific aspects that are media/streaming related, we should discuss that, indeed.
... Let's follow-up offline.


<cpn> Chris Needham's slides for today's topics

HDR and WCG

Pierre: Inaugural call last week. Outcomes: discussion will happen in the Color on the Web CG, next meeting on Dec 9 at 6am UTC. I just sent a calendar invite to the group's reflector.
... We expect a bunch of technology demos on how the Web (WebGPU, videos) can better handle HDR and WCG.

ChrisN: This IG is interested in media industry requirements in general, but given that color support is broader in scope, it makes sense to do it in a separate CG.
... I would expect that activity to be looking at static image support
... I don't think that we have something specific on that for now.
... An initial report has been prepared in the Color on the Web CG.
... My personal thought is that we would be extending the responsive images mechanism.
... So that the browser can choose which image to render.
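
[Scribe note: a minimal TypeScript sketch of how the responsive images mechanism could be extended for HDR stills, assuming the dynamic-range media feature proposed in Media Queries Level 5; the asset names and formats are hypothetical, not anything agreed by the group.]

    // Build a <picture> so the browser can pick an HDR or SDR still.
    // Browsers that do not recognise "(dynamic-range: high)" simply
    // fall back to the SDR <img>.
    function buildHdrAwarePicture(): HTMLPictureElement {
      const picture = document.createElement('picture');

      const hdrSource = document.createElement('source');
      hdrSource.media = '(dynamic-range: high)';
      hdrSource.srcset = 'hero-hdr.avif';   // hypothetical HDR asset
      hdrSource.type = 'image/avif';

      const sdrImg = document.createElement('img');
      sdrImg.src = 'hero-sdr.jpg';          // hypothetical SDR fallback
      sdrImg.alt = 'Hero image';

      picture.append(hdrSource, sdrImg);
      return picture;
    }

    document.body.appendChild(buildHdrAwarePicture());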

<kaz> ITU-T H.Sup19 : Usage of video signal type code points

ChrisS: I'd note the ITU-T standard that consolidates video signal type signaling.
... There are 37 standards that need to be consolidated.
... Having a single document would be good

Leonard: One of the things we did in the CG is establish a formal liaison with the ICC, where some of this work also happens.

ChrisN: Any document that you could share on that, Chris?

ChrisS: I'll grab that and share.

ChrisN: I would suggest focusing the discussions in the CG itself. We don't necessarily have all of the people interested in that space on today's call.

Leonard: You're also correct that static images are being addressed by the Color on the Web CG as well.

Pierre: Again, if you're interested in Color on the Web, please JOIN the CG. Just one click away!

CMAF, MSE, and Media Capabilities

ChrisN: We had a call last month where folks from the Chrome team clarified how these technologies work together. Chris Cunningham put together an explainer document as a result of that discussion.

HDR Capability Detection

ChrisN: I invite people to review that explainer
... The possibility of using a polyfill library was also mentioned, to account for the different CMAF capabilities.
... I don't know yet whether people have looked into it already, or whether anyone is willing to work on the polyfill.
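
[Scribe note: a minimal sketch, in TypeScript, of the kind of HDR capability query under discussion, using navigator.mediaCapabilities.decodingInfo(); the codec string, resolution, bitrate and HDR metadata values are illustrative assumptions, not values taken from the explainer.]

    // Ask whether an HDR10 (PQ, BT.2020) HEVC stream can be decoded for MSE playback.
    async function supportsHdr10Hevc(): Promise<boolean> {
      const info = await navigator.mediaCapabilities.decodingInfo({
        type: 'media-source',
        video: {
          contentType: 'video/mp4; codecs="hvc1.2.4.L153.B0"',
          width: 3840,
          height: 2160,
          bitrate: 20_000_000,
          framerate: 60,
          hdrMetadataType: 'smpteSt2086',   // static HDR10 metadata
          colorGamut: 'rec2020',
          transferFunction: 'pq',
        },
      });
      return info.supported && info.smooth && info.powerEfficient;
    }

    supportsHdr10Hevc().then(ok => console.log('HDR10 HEVC playback:', ok));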

Will: I'm involved on the WAVE side, but not aware of anyone willing to contribute to the polyfill. WAVE does not develop software. We need to find some mechanism to develop the software, perhaps look for sponsors, etc.

ChrisN: I think you're right, it needs some engineering effort.
... It could ultimately be a thing that is an open source project.

Will: WAVE was thinking of an open source GitHub project. The problem is maintenance, e.g. diffing and updating it every 6 months.
... It's more difficult to find people willing to commit to it.

ChrisN: Would it be useful to reach out to the media player library developers? A common library could be useful
... It does seem that we need to answer some of these questions.
... We need to do some implementation work there.
... That could be a good way to follow up.
... Next part is CMAF in the MSE Byte Stream Format Registry. There is a document being worked on in the WAVE project.
... Open questions on whether to merge with the ISO BMFF Byte Stream Format spec or whether to have a distinct Byte Stream Format spec.
... Whether browsers will strictly validate the content or take a more liberal approach.
... I know that John Simmons is leading that activity.
... The other thing that I'd like to point out is that we have the DataCue activity where we look at emsg box support.
... Some questions e.g. around how the events get mapped to the media timeline
... We have a monthly call; the next one will be 2-3 weeks from now.
... Some specific technical work that we need to do to figure out how timing relates to MSE interfaces.
... How much do we want to do in-band vs. out-of-band.
... Or handled separately from the media processing
... If you're interested to contribute, let me know
... At the end of the meeting, we said that we'd organize a follow-up meeting between W3C and CTA WAVE. No date confirmed for now; we'll keep you updated.
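
[Scribe note: a minimal TypeScript sketch of appending CMAF segments through MSE, tying back to the byte stream format discussion above; it assumes the existing ISO BMFF registration ('video/mp4') accepts the CMAF content, which is precisely one of the open questions, and the codec string and segment URLs are hypothetical.]

    const mimeType = 'video/mp4; codecs="avc1.640028"';

    // Append a CMAF init segment and one media segment to a SourceBuffer.
    async function playCmaf(video: HTMLVideoElement): Promise<void> {
      if (!MediaSource.isTypeSupported(mimeType)) {
        throw new Error('H.264 CMAF not supported by this MSE implementation');
      }
      const mediaSource = new MediaSource();
      video.src = URL.createObjectURL(mediaSource);
      await new Promise<void>(resolve =>
        mediaSource.addEventListener('sourceopen', () => resolve(), { once: true }));

      const sourceBuffer = mediaSource.addSourceBuffer(mimeType);
      for (const url of ['cmaf/init.mp4', 'cmaf/segment-1.m4s']) {   // hypothetical URLs
        const data = await (await fetch(url)).arrayBuffer();
        sourceBuffer.appendBuffer(data);
        await new Promise(resolve =>
          sourceBuffer.addEventListener('updateend', resolve, { once: true }));
      }
    }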

Media Integration Guidelines

ChrisN: We talked about this during the TPAC meeting. Good input from the Chrome team. We would also like to get input from other browser vendors and integrators in embedded devices.
... Should we flesh out some content in the doc already? Or do we want more input from browser vendors first?
... More content would perhaps help gather additional feedback.
... Certainly happy to point people from Apple and Firefox at the issues to get more responses.
... If people have input from TV embedded browsers, that kind of input would be very welcome as well.

Browser Media Architecture

ChrisN: We had a really interesting conversation with the WebRTC WG, where they presented their vision of where they are going.
... Two "parallel" worlds: WebRTC world, and what we have for streaming delivery with MSE, HTMLMediaElement. As we introduce the capabilities as the proposed Insertable Streams, APIs such as WebCodecs, and others that give you access to low-level interfaces, are there particular use cases that we're interested in looking at?
... How are we going to make use of these additional capabilities that browser vendors are introducing?
... It occurs to me that this would be a good way to review the proposals to make sure that they meet our needs.
... Anyone willing to share experience in this area?
... We'll keep track of developments, but it does not feel like something that people on this call today have a particular need for.
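
[Scribe note: a minimal TypeScript sketch of the low-level access discussed above, using the proposed WebCodecs VideoDecoder; WebCodecs is still in incubation at the time of this meeting, so shapes may change, and getEncodedChunks() is a hypothetical demuxer, not a platform API.]

    // Decode encoded video chunks directly with WebCodecs, bypassing MSE.
    declare function getEncodedChunks(): AsyncIterable<EncodedVideoChunk>;

    async function decodeWithWebCodecs(canvas: HTMLCanvasElement): Promise<void> {
      const ctx = canvas.getContext('2d')!;
      const decoder = new VideoDecoder({
        output: (frame: VideoFrame) => {
          ctx.drawImage(frame, 0, 0);   // render each decoded frame onto the canvas
          frame.close();                // frames must be released promptly
        },
        error: (e: DOMException) => console.error('decode error', e),
      });
      decoder.configure({ codec: 'vp09.00.10.08' });   // VP9 profile 0, illustrative

      for await (const chunk of getEncodedChunks()) {
        decoder.decode(chunk);
      }
      await decoder.flush();
    }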

MEIG Meeting schedule

ChrisN: We have been thinking about our call schedule, and how the timing is favorable for Europe, and more difficult for others.
... The question is: should we alter the meeting schedule to make it friendlier for other timezones?
... We could cycle between times.
... Also, regarding summer/winter time changes, should we keep the time constant in UTC, and thus constant for Asia?
... There is an ongoing survey, please respond to it.
... Most people who responded indicated that they are fine with the current timing, but 3 people, based in Asia I believe, indicated that it makes it difficult for them to participate.

Pierre: Have you set a deadline for responses?

ChrisN: This meeting, but I don't actually remember whether I said so in the message I sent to the mailing-list.

Pierre: You could send a reminder today and give them until the end of the week.

ChrisN: Yes, let's do that.

Igarashi: How about asking this call's attendees?
... Preference for rotating or not rotating?

Takio: A fixed time is useful for me (easier to plan my schedule), and the current time is acceptable.

ChrisN: The idea that we share the pain between us seems fair. I think that is what is driving that discussion.

Igarashi: How about US people?

ChrisN: The survey response suggested that people were OK.

Pierre: In the reminder, please insist that people need to voice their difficulties. We cannot make a good decision if we don't hear from people.
... I guess we could just give cycling times a try

ChrisS: Rotating is fine from my perspective.

ChrisN: The proposed times would be difficult for people on the East Coast though

Leonard: Pretty much "business as usual" in international standards committees

Hoya: I think this time slot is good for us because it will not be interrupted by other things during working hours.

<igarashi> I would first like to know the views of those who have difficulty with the current time slot

<igarashi> my preference is not to rotate the time slot

Hybrid mode in ATSC 3.0, HbbTV 2.0

ChrisS: How are we going to carry static graphics and render them consistently across possibly multiple devices?
... Not much information from me, but I'm willing to hear about that.
... In linear broadcast, you may render images on top of video, and you need to understand the display capabilities to pick the appropriate content (e.g. HDR/SDR)
... That would be applicable to our teams at Sky.
... Once we get more information, we may provide more input on how it affects our workflows
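
[Scribe note: complementing the responsive-images sketch earlier, a minimal TypeScript sketch of a scripted display-capability check for choosing HDR vs SDR overlay graphics, again assuming the proposed dynamic-range media feature; the asset names are hypothetical.]

    // Pick an HDR or SDR overlay graphic based on what the display can show.
    // Unknown media features evaluate to false, so SDR stays the safe default.
    function selectOverlayAsset(): string {
      const hdrCapable = window.matchMedia('(dynamic-range: high)').matches;
      return hdrCapable ? 'overlay-hdr.avif' : 'overlay-sdr.png';   // hypothetical assets
    }

    const overlay = new Image();
    overlay.src = selectOverlayAsset();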

ChrisN: I would suggest to do that as part of our activities in the Color on the Web CG.

<igarashi> +1 to hoya-san on not interrupting day work

Leonard: I'm pretty sure we did a report on static image formats in the Color on the Web CG.

ChrisS: There's a GitHub document that I've seen. The one I saw may need a bit more detail, e.g. it does not mention PQ in some places.

<inserted> High Dynamic Range and Wide Gamut Color on the Web

ChrisN: I produced this document a few years ago. This was initially just an internal document for my BBC colleagues to figure out what we needed. Then I shared it with the Color on the Web CG which adopted it as a CG deliverable.
... Since then, I've hardly touched it. Chris Lilley expanded it.
... I certainly agree that it lacks details.

Pierre: I suggest that we put that on the agenda for our first meeting.
... More importantly, we really need a practical example: someone who tries to do what they want to do and shares what breaks.

ChrisS: We can do a lot of that. With the Olympics for instance.
... That's the biggest hole that we have right now. We can provide some examples.

Pierre: If you can walk us through this during the next CG call, that would be great.
... A lot of these documents are very aspirational. At the end of the day, it would be good to focus on what people want to do right away.
... Actual code, e.g. actual canvas code.

Leonard: +1, and "here is what I try to accomplish".

Pierre: We need to identify specific gaps.

ChrisN: This feels like a good agenda item for this next meeting.
... Open question on whether this document should be dropped or updated, and whether more editors could help.
... I plan to join next call.

ChrisS: If we could include a table with still graphic formats and how HDR support would be signaled for each, that would be easier to absorb.

ChrisN: That's doable; the key is to have someone who could act as editor. I'm not sure I'm in a position to volunteer to continue doing that.
... In principle, I agree that a summary table is a good suggestion.

Igarashi: How do we get feedback from ATSC and HbbTV? Feedback from the industry. The IG may help as a liaison point.

ChrisN: That's true.

Next meeting

ChrisN: Next meeting in January, wish you a merry Christmas in advance.

<kaz> [adjourned]

Summary of Action Items

Summary of Resolutions

[End of minutes]

Minutes manually created (not a transcript), formatted by David Booth's scribe.perl version (CVS log)
$Date: 2020/12/01 16:44:53 $