W3C

– DRAFT –
WebRTC TPAC 2022 - Day 1

12 September 2022

Attendees

Present
BenWagner, bernard, Bradley_Needham, Cullen, DanSanders, dom, Elad, EricCarlson, Henrik, hta, jeff, JIB, Jinkyu, Kentsaku, Louay_Bassbouss, MarkFolz, MartinThomson, MichaelZou, Mike English, Orphis, PatrickRockhill, Peter_Thatcher, Philipp, Randell, riju, Tatsuya, Thomas, Tony, Tove, TuukaToivonen, Varun, Will, Xiameng, Youenn, YvesL
Regrets
-
Chair
bernard, hta, jan-ivar
Scribe
dom, youenn_

Meeting minutes

Slideset: https://lists.w3.org/Archives/Public/www-archive/2022Sep/att-0000/WEBRTCWG-2022-TPAC.pdf

<Bernard> Dom volunteers to take notes. Yeah!

<Bernard> We will not be recording the session.

State of the Union

[Slide 13]

HTA: our charter started in 2011 - we're about 11 years old
… the last 2 revisions of our charter have had very little change
… we're set to define APIs to enable real-time communications in the browser
… we've reached Rec for one of our documents - although that doc is still evolving, as Dom will talk about later today
… others of our docs are functionally stable and broadly used, but not progressing much on the Rec track - we've not been very good at pushing to the last stages
… mediacapture-main has been close to ready for a couple of years, but still needs some tidying
… not a lot of work on redesigning current concepts - are they just good enough, or are people just working around their limitations?
… ongoing work on new features; gaze correction, face detection, etc
… We need to focus on getting mediacapture-main and webrtc-stats out the door
… Stream transforms and media codecs are challenging our basic model, requiring us to rethink it
… In terms of what we're not doing: we've stopped work on mediacapture-depth
… no active work on 3D & spatial awareness - talked about it, but not specific proposals
… Some of the work is being done elsewhere: EME, streaming of stored media (see MoQ in IETF)
… integration with other work (WebGPU, WebNN) could also be usefully done
… Changes since last year: additional clarity on what we're doing and not doing
… not many new resources dedicated to progress the work

<fluffy> One small side note on the slide: I see the MoQ work being used for live and real-time media as well as stored.

HTA: a lot of usage of WebRTC out there
… including outside the browser

WebRTC-NV Use Cases

[Slide 20]

[Slide 21]

Bernard: reviewing the history of the WebRTC NV use cases - the initial draft came 27 months before the pandemic and WebCodecs
… pandemic brought technologies to the mass market
… in TPAC 2021, we talked about the relevance of NV use cases
… Tim Panton submitted a number of new use cases which arose during the pandemic
… were added in Nov 21

[Slide 22]

Bernard: how do these use cases relate to our current work? Not much
… still plenty of requirements not covered by our existing work
… 4/11 use cases have matching API proposals
… unclear consensus on some of these use cases
… dependency on changes in data transport that the WG hasn't been working on
… how close are we to satisfying these use cases?
… extending WebRTC vs using WebCodecs over a new transport

issue #62 / PR #75

Bernard: Game streaming needs ultra-low latency (<100ms) but not huge scale
… low latency broadcast on the other hand can live with ~1s latency, but needs scalability
… most game streaming services use WebRTC today
… quite a few of the low latency broadcast solutions also using WebRTC
… the WISH WG in IETF is looking at ingestion via WebRTC

[Slide 23]

[Slide 24]

Bernard: Game streaming typically has A/V going in one direction, with data going in the other direction
… sometimes with P2P support

Bernard: this needs lower-level control over the data transport (req N15)
… high resolution video processing (N37)
… and control over the jitter buffer / rendering delay (N38)

Harald: re rendering delay, it's often outside the control of the app or browser - it's part of the OS
… that might turn into a requirement to know it rather than control it

<Bernard> Harald: may need to know it, not control rendering delay

fluffy: re games, there is also controller input (incl access to USB devices) where latency also matters

<Bernard> Fluffy: APIs from controller...

youenn: would this relate to the gamepad API?

fluffy: that API isn't fast enough - it may need either implementation improvements or API changes

hta: may not be in scope for the group, but useful to report to the relevant groups

<Bernard> Tim Panton: data channel in workers

TimP: if communication can happen from a worker, this would help

Bernard: we've heard repeated support for worker support for data channels

<martinthomson> my impression is that the player often isn't using a web browser when playing the game here, so it might not have a direct effect on this particular use case

Bernard: particularly important in the context of webcodecs in a worker

[Slide 25]

Bernard: low latency broadcast is different both in terms of latency and scale
… this can extend to distributed auctions and betting
… limited interactivity, needs NAT
… this also ties to encrypted media
… that use case also comes with additional control of data transport (N15)
… and DRM (N36)

<steely_glint> I believe Stadia is in the browser.

Bradley: working on the gamepad API extension for multitouch support (for Sony)
… sending the gamepad data to the server as quickly as possible, directly from the hardware, would be ideal

<martinthomson> a new RTP media type?

Bernard: would this fall under the gamepad API?

Bradley: not sure where it should live - but it probably should not be part of the gamepad API, which is meant for interpretation in the client

thomas: in the ultra-low latency use case, there is also a distributed music playing use case
… challenging to do without very low latency

bernard: we had a discussion about a 3rd use case around distributed music playing
… different from game given that media has to flow in both directions

hta: we have some prior experience with taking input from one place and moving it to another without touching the main thread
… with stream forwarding
… once we have them transferable
… that might be something we could cogitate on more: connecting a stream of gamepad events to an RTP stream and having "magic happen"

<Bernard> Harald: transferrable streams can connect a stream of gamepad events from main thread to a worker thread.
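
For illustration of the idea Harald sketches above, a minimal JavaScript sketch of feeding gamepad snapshots into a transferable ReadableStream and handing it to a worker (the worker script name and the snapshot shape are assumptions; how the worker then forwards the data to the network is left open, since that is the part that does not exist today):

  // Main thread: poll the Gamepad API and enqueue plain snapshots,
  // since Gamepad objects themselves are not transferable.
  const worker = new Worker("gamepad-sender-worker.js"); // assumed worker script
  const gamepadStream = new ReadableStream({
    start(controller) {
      const poll = () => {
        const pad = navigator.getGamepads()[0];
        if (pad) {
          controller.enqueue({
            timestamp: pad.timestamp,
            axes: Array.from(pad.axes),
            buttons: pad.buttons.map((b) => b.value),
          });
        }
        requestAnimationFrame(poll);
      };
      requestAnimationFrame(poll);
    },
  });
  // ReadableStream is transferable, so the worker can read it off the main thread
  // with gamepadStream.getReader() and forward snapshots over whatever transport it has.
  worker.postMessage({ gamepadStream }, [gamepadStream]);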

TimP: there is a clear definition in the USB spec of what a gamepad sends

youenn: there is a discussion about gamepad in a worker happening in their repo

fluffy: sending gamepad input directly via RTP is valuable, but could apply to other modalities (e.g. gesture tracking)
… would be nice to have something flexible to deal with that variety of input

HTA: will these use cases be proposed for consensus?

<fluffy> If it is interesting for anyone, some work I am doing on game input over RTP https://datatracker.ietf.org/doc/html/draft-jennings-dispatch-game-state-over-rtp-00

Bernard: once we finalize the requirements in the PR, we will bring it to CfC

Developer Engagement

[Slide 27]

TimP: there is unmet demand out there, possibly for lack of resources
… how could we get more resources into this group?

TimP: I was interviewing a devrel expert on how to build a community
… part of it is figuring out what's valuable to people who show up, and the blockers for people who don't show up

[Slide 28]

TimP: what's valuable to the people who are showing up?
… it's about the value of standards - they grow the market, avoid some legal complexities, provide certainty to developers
… and it builds on shared expertise, giving better results than a de-facto standard

[Slide 29]

TimP: What's valuable and unaddressed for people who aren't showing up?
… not sure we know
… who would we want to show up?

[Slide 30]

TimP: What are the blockers for people who don't show up? incl those who did try but didn't stick
… some of the blockers are legal (not allowed to contribute)
… some are because they're not seeing progress on the issues that matter to them
… also issues with cumbersome process or a hostile atmosphere
… limitations due to the fact that we're only dealing with half of the problem, given the inactivity of RTCWEB in the IETF

[Slide 31]

TimP: Part of what is needed is to make sure people don't feel they're wasting their time or not being listened to

[Slide 32]

TimP: a possible solution would be to create a network of users - not people building browsers or WebRTC stacks
… could provide very useful feedback & input
… could be done under the Chatham House Rule

<hta> discuss-webrtc@googlegroups.com is one place where such a forum could be advertised.

<fippo> hta: for that discuss-webrtc would have to be moderated for first-time posters more often than the current "every second saturday" :-(

dom: similar experience with the WebAuthn Adoption Community Group
… would be happy to work with you in exploring a similar path for WebRTC

TimP: started sharing that idea with a few folks, getting some interesting feedback

[Slide 33]

TimP: we also need to broaden our input
… incl from non-browser projects (e.g. pion)
… they have a lot more flexibility in experimenting with APIs than in a browser context
… using them as a sandbox for changes, or as input to our own design process

bradley: one of the big things for me has been working on the gamepad API and its chromium implementation
… one of the frustrations is that shipping an experimental feature requires an extension to the standard in order to iterate
… hard to iterate quickly

TimP: maintaining a fork of Chromium is a substantial effort

fluffy: I feel what you're saying on all of this; an interesting premise is that this group is often seen as a group of implementors
… instead of a negotiation between users and implementors
… the reasons I don't come here are similar to what you describe
… I'm not sure having a separate group will help if we don't fix the root cause

TimP: I accept that risk - but feels better than doing nothing

fluffy: right - but unless the input gets listened to, it will still feel like a waste of time

TimP: right, this group would have to commit to listening to the input
… it also helps decouple the legal issues, and provides a different audience

Bernard: with the next generation of APIs, we're also not talking to a single group - there is WebTransport, Web Media...
… up to 7 groups each owning one aspect of that architecture
… providing a single entry point would help

<martinthomson> is steely_glint talking about forming a community group?

TimP: that expands the scope of what I had in mind

hta: the place I currently see for this kind of developer input is the chrome bug tracker, where they're requesting non-standard features
… there is a community willing to provide input but not sure how to contribute it
… Google should not be the org doing the outreach

TimP: +1 - it should be somewhat separate from the browser makers community
… not quite sure yet how to manage the input to the group
… if there is feeling this is worth constructing, I'll take a stab at it

hta: I support this

youenn: at a previous TPAC, there was a session where we invited developers for feedback

Henrik: what would be the output of this group? concrete API proposals? PoC implementations?

TimP: really good question - would be dependent on what this group would accept
… not standards proposals though
… it could be a prototype in Pion
… and then translated into W3C speak
… it could also be useful when we have questions about developer ergonomics

HTA: next step is on TimP's shoulders

WebRTC-PC

[Slide 36]

[Slide 37]

WebRTC revision process

[Slide 38]

[Slide 39]

[Slide 42]

dom: a future CfC to move forward with agreed changes in webrtc-pc
… plus discussions on moving away from webrtc-extensions.

Bernard: there are a number of changes in webrtc-pc that we could move back from webrtc-extensions

Bernard: a number of features were moved out of webrtc-pc into webrtc-extensions and are now implemented. So should we move them back to webrtc-pc?

dom: yes

Bernard: e.g. maxFramerate has been implemented

<martinthomson> "at least one browser" isn't something that I consider sufficient for moving stuff

youenn: what is the process for moving things directly to webrtc-pc?

dom: we would provide annotations (experimental for instance).

youenn: it would reduce the editors' burden to go directly to webrtc-pc.

hta: hearing support to move in that direction

<vr000m> Does this change confuse the reader about what has been implemented and what has not? That was one of the main reasons to split the work into extensions, i.e., good ideas not yet implemented.

hta: there is also value in pushing docs to Rec, e.g. mediacapture-main and webrtc-stats

dom: where possible a module is good; some changes are deep in the main spec, so it is good to be able to update the main WebRTC spec.

orphis: what about webrtc 2.0?

dom: we could go with WebRTC 1.1... 2.0.
… might be better to move away from versioning
… e.g. renaming "WebRTC 1.0" to just "WebRTC" instead.

The room seems to agree with the overall direction.

PR #2763: add relayProtocol to RTCIceCandidate

[Slide 42]

fippo: propose to fix the inconsistency between stats and -pc on candidate objects
… also remove the unimplemented url field of the event interface

youenn: it's exposed in Safari

hta: it makes more sense to have it in the candidate than in the event

fippo: I'll make a PR for that
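
For context, a minimal sketch of the two access paths being discussed; relayProtocol on RTCIceCandidate is what PR #2763 proposes (not yet standard), while today the same information has to be correlated from the local-candidate stats (pc is assumed to be an existing RTCPeerConnection):

  pc.onicecandidate = async ({ candidate }) => {
    if (!candidate || candidate.type !== "relay") return;

    // Proposed by PR #2763: read it directly off the candidate object.
    console.log("proposed:", candidate.relayProtocol);

    // Today: look it up in webrtc-stats instead.
    const report = await pc.getStats();
    for (const s of report.values()) {
      if (s.type === "local-candidate" && s.candidateType === "relay") {
        console.log("via stats:", s.relayProtocol); // "udp" | "tcp" | "tls"
      }
    }
  };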

[Slide 43]

Simulcast negotiation

[Slide 44]

[Slide 45]

Jan-Ivar: looking from easiest to hardest issue
… starting with #2732
… Chrome and Safari throw on > 16 characters
… propose we align with that

fluffy: strongly object to that without having this done in IETF
… agree with fixing it, but in the right place

<martinthomson> fluffy: RFC 8851-bis?

youenn: this is only from the sender side

jan-ivar: this would only affect addTransceiver, not offers to receive simulcast

Byron: if JS tries to use a rid that the impl can't handle, would OperationError be appropriate?

Henrik: could reject the offer?

Byron: there isn't an offer at that point of the discussion

hta: background on the choice of 16 - this used to cause a Chrome crash
… with inconsistent checks (one at 256 vs one at 16)
… we did hit someone sending a 17-character rid

Orphis: the limit of 16 characters was in an older version of webrtc-pc

HTA: we can also commit to filing an IETF issue

Jan-ivar: but this change only impacts what the browser would send

<Orphis> jib: https://www.w3.org/TR/2017/CR-webrtc-20171102/#dom-rtcpeerconnection-addtransceiver

RESOLUTION: limit rid to 16 in addTransceiver and file issue in IETF on RID length in general
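
A minimal sketch of what the resolution would mean for application code (the exact error type was not settled above, so the catch below is illustrative only):

  const pc = new RTCPeerConnection();

  // Fine: rids of 16 characters or fewer.
  pc.addTransceiver("video", {
    sendEncodings: [{ rid: "q" }, { rid: "h" }, { rid: "f" }],
  });

  // Expected to be rejected once the limit is specified: a 17-character rid.
  try {
    pc.addTransceiver("video", {
      sendEncodings: [{ rid: "a".repeat(17) }],
    });
  } catch (e) {
    console.log("rid rejected:", e.name);
  }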

[Slide 46]

jan-ivar: there are other cases where Chromium and Safari throw on specific valid RIDs

cullen: getting the inconsistency between rfc8851 and rfc8852 fixed and aligning with it sounds better

Byron: 8852 is more restrictive than 8851
… aligning with the more restrictive is probably more prudent
… we can take that to the IETF

jan-ivar: so do that before making the change?

hta: we should at least add a note that - and _ can be problematic

<vr000m> ABNF on 8851 is:

<vr000m> rid-syntax = %s"a=rid:" rid-id SP rid-dir [ rid-pt-param-list / rid-param-list ]
<vr000m> rid-id = 1*(alpha-numeric / "-" / "_")

hta: not unreasonable to limit what we send within the constraints of the RFC

youenn: we should document something in the spec right now and add a link to the IETF discussion

hta: +1

cullen: 8851 is what defines the rid syntax
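
For illustration, a sketch of a client-side pre-check along the lines discussed: the stricter alphanumeric-only reading attributed to RFC 8852 above, with the 16-character cap coming from the earlier resolution rather than from either RFC:

  // RFC 8851's rid-id grammar also allows "-" and "_"; the stricter check avoids them.
  const RID_STRICT = /^[A-Za-z0-9]{1,16}$/;

  function checkRid(rid) {
    if (!RID_STRICT.test(rid)) {
      console.warn(`rid "${rid}" may be rejected or stripped by some implementations`);
    }
    return rid;
  }

  checkRid("hi");     // ok
  checkRid("hi-res"); // warns: contains "-"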

[Slide 47]

Issue #2734: addTransceiver does not check for missing rid properties

hta: +1

youenn: +1

Issue #2733: addTransceiver does not check for uniqueness of rid

[Slide 48]

hta: +1

youenn: +1 as long as there is no web compat issue

Issue #2762: Simulcast: Implementations do not fail

[Slide 49]

[Slide 50]

hta: one edge case I've encountered: if you get a remote offer with 3 layers
… and then get a later remote offer with 2 (a reduced number of layers), I've implemented it as removing the top layer
… if you then make a local offer, should you send 2 or 3? i.e. should we generate an offer that tries to re-expand it?

Byron: maybe a bad idea? I don't know

hta: not sure what the right way is, but the spec should say something

jan-ivar: that will block this issue

hta: I'll make a note on the issue
… but what's on the slide is fine - it's just another edge case to consider

Issue 2764: What is the intended behavior of rollback of remote simulcast offer?

[Slide 51]

Byron: the reverse situation is more complicated - a previously negotiated simulcast session with 3 layers and a re-offer with 2 that gets rolled back - what should happen?

henrik: the rollback has to reset everything, otherwise it becomes difficult to re-purpose the transceiver

youenn: I'm surprised by the divergence, it may be a bug in Safari

[ran out of time to finalize the session on simulcast]

WebRTC Stats

[Slide 59] What to do with RTP stats lifetime

Henrik: in favour of Proposal A
… stream stats do not have any useful information before packets are sent or received

<vr000m> I like Proposal A as well, as there is no data before the packet is sent or received

jib: proposal A is good to me

fippo: proposal A might be a breaking change as people might rely on it
… a risky change

jib: anything relying on this would only work in some browsers anyway.

youenn: how will you ensure backwards compat?

henrik: would ship with a flag to disable the change

RESOLUTION: Proposal A, but implementors need to check web compatibility

Henrik will prototype the change
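
A minimal sketch of the practical consequence of Proposal A: outbound-rtp / inbound-rtp entries would only appear once packets have actually flowed, so applications must not assume they exist right after negotiation:

  async function logRtpStats(pc) {
    const report = await pc.getStats();
    for (const stat of report.values()) {
      if (stat.type === "outbound-rtp") {
        console.log("send", stat.ssrc, stat.bytesSent);
      } else if (stat.type === "inbound-rtp") {
        console.log("recv", stat.ssrc, stat.bytesReceived);
      }
      // Under Proposal A, neither branch runs until packets are sent/received.
    }
  }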

[Slide 60] When are stats destroyed

Issue #668 - When are RTP streams destroyed?

Harald: one reason for not deleting stats is to be able to get the total number of bytes.

hta: nobody has complained that this hasn't been possible though

jib: +1
… part of the reasons for eternal stats was to keep them available after pc.close() which would be lost here
… probably fine since nobody has complained about this
… SSRC change is hard to find in the WebRTC stack

fluffy: SSRC change in the remote side is hard to detect; they're rare but can happen

henrik: the point of the PR is to expose the current set of streams as identified by the impl

fluffy: transceivers going away I understand; not sure how to detect an SSRC going away though (e.g. a long mute)

henrik: renegotiation might be the practical approach

RESOLUTION: agreement to delete stats when sender/receiver is reconfigured.

Issue #643 - Do we agree removing "sender", "receiver" and "transceiver" stats is a good idea?

[Slide 61]

[Slide 62]

RESOLUTION: close issue 643 as "yes, it was a good idea to remove them for WebRTC stats"

youenn: it may even be worth removing them from -provisional

henrik: maybe, but worth discussing separately

Issue #666 - powerEfficientEncoder/powerEfficientDecoder

[Slide 63]

henrik: proposed PR #670

youenn: there is a fingerprinting issue with encoderImplementation
… we need to solve it before adding a new surface
… at least in safari, this would expose new information not exposed by encoderImplementation

henrik: how about tying this to getUserMedia permission?

jib: I think tying this to gUM would be a mistake
… powerEfficient was moved to media capabilities to avoid this privacy issue

<Tim_Panton> Not in favour of linkage to get user media.

youenn: the issue with media capabilities is that it's about what can be done, not what is being done
… e.g. if all hardware encoders are being used

jib: isn't the fingerprinting part that the system has hw-capability?

<Tim_Panton> GUM should be irrel

<Bernard> Agree. Media Capabilities has no dependency on gUM

<Tim_Panton> irrelevant to a recv only decoder

youenn: exposing that there is hw-capability *and* that it is being used is additional fingerprinting

jib: re encoderImplementation exposing this, it's only in some implementations - and we should limit that too

youenn: let's revive the fingerprinting issue
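
For reference, a sketch of the Media Capabilities route jib refers to, which answers what *can* be power-efficient rather than which encoder a live connection is actually using (the configuration values are arbitrary examples):

  async function canEncodeEfficiently() {
    const info = await navigator.mediaCapabilities.encodingInfo({
      type: "webrtc",
      video: {
        contentType: "video/VP9",
        width: 1280,
        height: 720,
        bitrate: 1500000,
        framerate: 30,
      },
    });
    return info.supported && info.powerEfficient;
  }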

Issue #662 - Don't expose so many RTCCodecStats!

[Slide 64]

henrik: proposing PR #669

jib: could also be obtained from media capabilities

orphis: or with getParameters

RESOLUTION: Adopt #669
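
A minimal sketch of how codec information is already reachable without RTCCodecStats, per the suggestions just above (pc is assumed to be an existing, negotiated RTCPeerConnection):

  const [sender] = pc.getSenders();

  // What this sender ended up with after negotiation (via getParameters).
  for (const codec of sender.getParameters().codecs) {
    console.log(codec.payloadType, codec.mimeType, codec.clockRate);
  }

  // What the implementation supports at all, via the static capabilities API.
  console.log(RTCRtpSender.getCapabilities("video").codecs);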

WebRTC-Extensions

[Slide 66]

Issue #98: Disabling hardware encoding/decoding

[Slide 68]

[Slide 69]

youenn: why would this succeed more than setCodecPreferences?

bernard: it's simpler

fippo: I have an implementation for it, very short

dom: how could this be used for fingerprinting?

fippo: by turning it on and off, this may be surface more details about codecs profiles

youenn: re gating this to reloading - would this be the top level page or iframe?

jib: to prevent fingerprinting, tying this to the top-level context would make it a bigger deterrent

hta: let's iterate on privacy issues in the bug
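
For context on youenn's question, a sketch of the existing setCodecPreferences mechanism; it steers selection by codec rather than by hardware vs. software implementation, which is the gap issue #98 is about (pc is assumed to be an existing RTCPeerConnection):

  const transceiver = pc.addTransceiver("video");
  const { codecs } = RTCRtpReceiver.getCapabilities("video");

  // Example: put VP8 first so it is preferred during negotiation.
  const preferred = [
    ...codecs.filter((c) => c.mimeType === "video/VP8"),
    ...codecs.filter((c) => c.mimeType !== "video/VP8"),
  ];
  transceiver.setCodecPreferences(preferred);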

Issue #112: header extension API: do we need enabled in setParameters?

[Slide 70]

[support for the removal]

Issue #111: Integration of congestion control across SCTP and media

[Slide 71]

[Slide 72]

<vr000m> What would be the API impact? wouldn't the existing APIs be sufficient input for the implementors?

hta: there are actual cases where congestion control conflicts prove problematic, e.g. large file uploads in Meet

youenn: is webrtc priority addressing this?

fluffy: the RTP and the data can go to different destinations, which further complicates the congestion control
… feels like an implementation quality issue
… the right solution is probably to have them both on top of QUIC

Bernard: main use cases are in game streaming
… a bit distinct from this, with current congestion control on the datachannel

Peter: if we have a good API for encoded streams, then you could send your A/V/data on a single transport, or you could encode them together and send them over RTP
… which would ensure they get treated as a whole for congestion control

<jesup_> +1 to cullen; people do want this. not here though

hta: hearing consensus that #111 can be closed as no action needed from this group

RESOLUTION: close #111

Summary of resolutions

  1. limit rid to 16 in addTransceiver and file issue in IETF on RID length in general
  2. Proposal A, but implementors need to check web compatibility
  3. agreement to delete stats when sender/receiver is reconfigured.
  4. close issue 643 as "yes, it was a good idea to remove them for WebRTC stats"
  5. Adopt #669
  6. close #111
Minutes manually created (not a transcript), formatted by scribe.perl version repo-links-187 (Sat Jan 8 20:22:22 2022 UTC).