Meeting minutes
Slideset: https://
<McCool> audio is very quiet
angel: today there will be 2 presentations, then we will have enough time for questions and discussion
<McCool> much better
Lin: today we are going to share use cases from 3 companies
… China Mobile, telco operator, for live-streaming of sports events and music concerts
<angel_> this session is planned to be recorded, any objections?
Lin: also Metaverse Convention
… ByteDance, will talk about Education live-streaming, e-commerce, next-gen conference
… Alibaba will focus on e-commerce
Lin: a few use cases using "Cloud Box"
<chunming> TPAC Demo relevant to WebRTC and Web&Network IG
<chunming> https://
<chunming> https://
<angel_> Lin's slides
Lin: 5 steps for RTC Cloud
… during all these processes, we have done a lot of optimization
Lin: another scenario, live e-commerce, which allows interactive product showcases
Lin: another use case for e-commerce, anchor initiates live streaming via RTC client
Lin: next scenario is about Metaverse Convention
… everything is in your browser
… use your mouse to control the view
… kind of a cloud game powered by RTC
… virtualise the characters
… we call it cloud rendering in the platform
Lin: these are the basic steps how we designed this system
… we are still using WebRTC, but a cloud based one
… what we call RTC-C
… there are also some other methods to push and pull streams
Lin: here are a few requirements we collected from these use cases
… an API to set the frame-freeze duration threshold
… also a few parameters to add to the Statistics API
… when the connection is bad, the current freezeCount estimation algorithm can't fully meet our needs
… e.g., when the linear average of the durations of the last 30 rendered frames is 200ms, a long frame is not counted as a freeze
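The issue raised here can be sketched as follows. The heuristic behind `freezeCount` (per the provisional WebRTC Statistics spec, as we understand it) flags a rendered frame as a freeze when its inter-frame delay reaches `max(3 × avg, avg + 150ms)`, with `avg` taken over the last 30 rendered frames; the function name `isFreeze` is illustrative, not part of any API:

```javascript
// Sketch of the freeze heuristic behind RTCInboundRtpStreamStats.freezeCount:
// a rendered frame counts as a "freeze" when its inter-frame delay is at least
//   max(3 * avgFrameDuration, avgFrameDuration + 150 ms),
// where avgFrameDuration is the linear average over the last 30 rendered frames.
function isFreeze(frameDurationMs, last30DurationsMs) {
  const avg =
    last30DurationsMs.reduce((sum, d) => sum + d, 0) / last30DurationsMs.length;
  return frameDurationMs >= Math.max(3 * avg, avg + 150);
}

// On a bad connection rendering at ~5 fps (average frame duration 200 ms),
// the threshold rises to 600 ms, so a 400 ms stall that users perceive as
// stutter is not counted as a freeze -- the gap raised in the minutes.
const badConnection = Array(30).fill(200);
console.log(isFreeze(400, badConnection)); // false: below the 600 ms threshold
console.log(isFreeze(650, badConnection)); // true
```

This illustrates why the threshold scales with the average frame duration: on an already-slow stream, long frames become "normal" and individual stalls stop registering as freezes.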
Q1: have you filed the bug against the spec? what's the bug number?
answer: will do
Q2: for one particular codec, work is done in another standards body (the IETF), so W3C might not be able to do anything about it
answer: yes we know, would like to explore a few future steps to solve the problem
<jesup> the cloud game could pass audio and video for users close to the end user directly to the user, and let the local client merge and decide where/if to display. Probably one would do this only for a few streams. The client would need to render the environment and decode N streams, and encode 1 stream.
Lin: one more issue, it always fails to start the audio playback device when the iOS app is in the background
… here are the steps to reproduce the bug
… the bug was already reported to Chromium
https://
<jesup> You've reported this bug, good. There's no W3C work required here I believe; it doesn't change the API visible to JS/user code
Lin: as for the next step, we wonder if we can create a Task Force, perhaps within the WNIG, or even a new CG, to further discuss these requirements and explore the solutions
Angel: next is another presentation from ByteDance
… from Shishen Zhang
jesup: why do you need the video there while playing the game?
Shishen: the game is using anchors, also they need the live chats
Q3: for the game's sake, this is quite similar to what we did for @@, we need a WebRTC CDN, when a lot of users watch the screen
Shishen: they don't want to abandon the CDN
… as it has been used a long time, a lot of vendors still rely on it
Song: to add a few more comments
… the regular CDN is often for broadcasting
… but the CDN Shishen mentioned was more for general usage
Q4: there has been some more discussion about key frames
Q5: we have an API for generating key frames, every 5s
… unless you want more precise key-frame timing
<jesup> My question was that for many games, it should be possible to run the game as a normal multiple-user game running in all the clients, exchanging data with a server which maintains game state but doesn't render the game. Rendering is done locally. In this case, the bandwidth required is far lower, though you would still try to use realtime channels, such as DataChannels. Audio and video can be shared simultaneously as needed, and the se[CUT]
Q5: but I'm not sure about the implementation status
McCool: I have a question about RTCC in the IoT device usage
Song: is that about the case 3, the bug in the iOS app in the background?
McCool: we can follow up later, would like to know more details about the IoT use cases
… please reach out to me
Lin: it's a very good question
… in China, a lot of vendors also use similar technologies in the IoT cases
McCool: I wonder if anybody has adopted it, and how
… in the IoT, we are also looking at the roadmaps
jesup: for the car game example, another possible approach is to stream the video directly to the user interacting with the others
… with any associated video and audio mapped to it
… there may be issues with background noise
… this might help to minimize the delay
… there can be a lot of ways to reduce the latency
… just some advice
song: would like to thank all for providing these use cases
… some of them need further discussion
… please write down your use cases and proposals
… so we can follow up with the WebRTC WG and the WNIG
… even though these use cases were from 2 very different companies
<angel_> https://
song: so I encourage people to look at the demos from the web-networks IG
<jesup> lin: for example, see Hubs
song: in the distributed network area
Fraunhofer: thanks for the cases
… we also want to see more testing in these cases, e.g., in cloud gaming
… we are also working on other ways other than WebRTC
… like WebTransport
… please also reach out to us, we'd like to have more f2f discussion about it
<McCool> (know we are overtime, but I would like to summarize Distributed Workers briefly and relate to WebRTC and edge computing)
Fraunhofer: regarding the performance, to improve the user experience
McCool: since Distributed Workers is mentioned, I'd like to summarize it
… I have a video for it, please watch it
[adjourned]