14:53:40 RRSAgent has joined #me
14:53:45 logging to https://www.w3.org/2024/02/06-me-irc
14:53:45 Zakim has joined #me
14:53:58 Meeting: MEIG meeting
14:54:18 Chair: Chris_Needham
14:55:09 Agenda: https://www.w3.org/events/meetings/689f94d6-7a6a-4fd3-9723-2ad37b09bb64
14:59:40 tidoust has joined #me
14:59:40 ryo has joined #me
14:59:41 nigel has joined #me
15:01:12 present+ Kaz_Ashimura, Chris_Needham, Tatsuya_Igarashi, John_Riviello, Kinji_Matsumura, Ohmata, Ryo_Yasuoka
15:01:30 Chair+ Tatsuya_Igarashi
15:01:45 Present+ Nigel_Megitt
15:02:02 present+ Francois_Daoust
15:02:28 present+ Hisayuki_Ohmata
15:02:36 present- Ohmata
15:02:38 rrsagent, make log public
15:03:16 agenda: https://lists.w3.org/Archives/Public/public-web-and-tv/2024Jan/0001.html
15:03:23 ohmata has joined #me
15:03:43 JohnRiv has joined #me
15:03:44 scribe+ cpn
15:03:47 scribe+ nigel
15:04:40 slideset: https://github.com/w3c/media-and-entertainment/files/14183262/2024-02-06-W3C-MEIG-Meeting-Media-Capabilities.pdf
15:05:14 [slide 1]
15:05:27 Topic: Intro and agenda
15:05:35 Chris: Welcome to the first MEIG meeting of the year.
15:05:54 .. Following up on discussions in the Media WG about media capabilities.
15:06:01 .. They've done some triage and prioritisation.
15:06:19 .. I saw that there were a number of issues where the WG could use wider industry input,
15:06:23 .. which is what today is about.
15:06:41 .. There are some proposed features raised in issues that the WG has not prioritised.
15:06:59 .. We can re-evaluate those, discuss whether they continue to be useful, and provide advice to the Media WG
15:07:08 .. to help with that prioritisation.
15:07:13 [slide 2]
15:07:17 [slide 3]
15:07:24 Chris: Recap:
15:07:33 .. Media Capabilities API is a browser API.
15:07:49 .. Provides info to the page about the browser's ability to decode and play various media formats.
15:08:01 .. Also for encoding and transmission, which applies more in a WebRTC context.
15:08:18 .. This one API works in both a streaming media context and also in a WebRTC context.
15:08:38 .. One interesting aspect of the design is that the Media Capabilities API is intended to focus on
15:08:49 .. the decoding and encoding capabilities; the ability to render the decoded media
15:08:59 .. is not really in scope, apart from a couple of exceptions.
15:09:04 .. It's a design choice.
15:09:13 .. e.g. things to do with the properties of the display are excluded.
15:09:34 .. It works by providing a MIME type and, if it's HDR content, parameters to describe the use of HDR metadata
15:09:37 .. and colour spaces.
15:09:54 .. The information you get back indicates whether the format can be decoded, and if so, whether playback is expected to be smooth.
15:10:03 Louay has joined #me
15:10:07 .. This is in some implementations dependent on real-time feedback, based on previous experience
15:10:14 present+ Louay_Bassbouss
15:10:14 .. that the browser might accumulate.
15:10:33 present+
15:10:34 .. A flag tells you if playback is power efficient, which could be due to hardware acceleration.
15:10:44 .. In some cases software decoding could be as efficient.
15:10:59 .. I'm focusing on MSE decoding and playback today, rather than encoding or WebRTC,
15:11:06 .. but if you're interested in those we can talk about them.
15:11:16 .. This is intended to get your feedback on these open questions.
15:11:28 [slide 4]
15:11:46 igarashi has joined #me
15:11:59 Chris: Implementation status. There are differences in how up to date the implementations are
15:12:05 .. with the draft specification.
15:12:10 .. Particular items for today:
15:12:15 .. Text Track Capabilities
15:12:24 .. Ability to decode multiple streams simultaneously
15:12:28 .. Transitions (e.g. for ad insertion)
15:12:38 .. Rendering and Decoding Capabilities
15:12:56 .. Open question: are there other requirements that should be prioritised beyond the ones I chose today?
15:13:03 .. Any thoughts or initial questions?
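[The recap above describes querying the API with a MIME type plus HDR parameters, and reading back the supported/smooth/power-efficient results. A minimal sketch of such a query follows; the codec string, resolution, bitrate, and HDR values are illustrative assumptions, not figures from the meeting.]

```javascript
// A MediaDecodingConfiguration for an MSE ("media-source") playback query.
// All concrete values below are illustrative examples only.
const configuration = {
  type: 'media-source', // streaming context; 'webrtc' is the other context discussed
  video: {
    contentType: 'video/mp4; codecs="hvc1.2.4.L153.B0"', // example HEVC Main 10 string
    width: 3840,
    height: 2160,
    bitrate: 20_000_000, // bits per second
    framerate: 50,
    // HDR metadata and colour-space parameters, as mentioned in the recap:
    transferFunction: 'pq',
    colorGamut: 'rec2020',
    hdrMetadataType: 'smpteSt2086',
  },
};

// In a browser, navigator.mediaCapabilities.decodingInfo() resolves to the
// three flags discussed above. Guarded so the sketch is inert outside a browser.
async function checkDecoding(config) {
  if (typeof navigator === 'undefined' || !navigator.mediaCapabilities) {
    return null; // not running in a browser (e.g. plain Node.js)
  }
  const info = await navigator.mediaCapabilities.decodingInfo(config);
  // info.supported      — can the format be decoded at all?
  // info.smooth         — is playback expected to be smooth?
  // info.powerEfficient — e.g. hardware-accelerated decoding
  return info;
}
```

[As noted in the recap, in some implementations the `smooth` result reflects playback statistics the browser has accumulated, so the same query can return different answers over time on the same device.]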
15:13:23 no group questions
15:13:35 Topic: Text track capabilities
15:13:36 [slide 5]
15:13:43 Chris: 3 GitHub issues from review.
15:13:59 .. Accessibility HR highlighted that audio and video media are often accompanied by text tracks,
15:14:14 .. either embedded or separate. They asked if Media Capabilities' scope would include that.
15:14:35 .. w3c/media-capabilities#157 is "text tracks not supported"
15:14:51 .. Raised by Mike Dolan, who pointed to a general need: detection for TTML, IMSC, WebVTT is out of scope.
15:14:56 .. There's some discussion in the thread.
15:15:10 .. The other issue, 99, is more specific.
15:15:30 .. w3c/media-capabilities#99 is about SEI messages in the video stream carrying 608 or 708 captions.
15:15:46 .. As I understand it, of the major browser engines, only Safari has support for embedded timed text media.
15:15:52 .. Chrome and Firefox don't do that.
15:16:11 Nigel: Can you explain embedded timed text media?
15:16:30 Chris: I was avoiding using the term "in-band".
15:17:01 Nigel: Is it something in the manifest, or something multiplexed with the video itself? Or an HTML track element?
15:17:27 Chris: I see the HTML element as a separate thing. Text Track Cues can be programmatically
15:17:33 .. added to a track object.
15:17:39 .. This depends on how you're providing the media.
15:17:55 .. For example, if you were to give an HTML