IRC log of webrtc on 2011-11-01

Timestamps are in UTC.

00:00:05 [DanD]
anant: what are the different attack possibilities? They should be captured
00:01:03 [DanD]
juberti: What's unique is that you can send it in a peer-to-peer way, with no server involved
00:01:37 [DanD]
hta: You said data must be encrypted
00:03:42 [DanD]
hta: being encrypted will take care of some concerns
00:03:57 [stakagi]
stakagi has joined #webrtc
00:04:36 [DanD]
hta: it would make more sense for it to have its own constructor and then be attached to a peerConnection
00:05:13 [DanD]
Milan: Question about ack
00:05:51 [DanD]
juberti: The choices considered for the wire protocol make it useful
00:06:17 [DanD]
Milan: Protocol has an ack and it doesn't need to be exposed
00:06:56 [DanD]
Milan: an example with the ack would be useful to understand
00:07:12 [DanD]
juberti: I'll take it as an action point
00:07:33 [DanD]
Stefan: we can conclude this session
00:07:57 [DanD]
juberti: I'll have it updated and sent to the mailing list for review
00:08:23 [DanD]
fluffy: this is just the API proposal, not the actual implementation, right?
00:09:06 [DanD]
fluffy: We're moving along with this until we figure out the implementation.
00:09:44 [DanD]
juberti: Requirements came from the wire protocol
00:10:09 [DanD]
fluffy: looks good. Can we build it?
00:10:40 [DanD]
fluffy: That's what I'm concerned about, and maybe we should relax our requirements
00:12:10 [francois]
[ref possible alignment with WebSockets, perhaps change "sendMessage" to "send"]
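[Scribe note: a minimal sketch of the naming alignment being suggested. The "channel" object, how it is obtained, and the illustrative URL are assumptions for illustration only, not the proposed API.]

  // WebSocket already exposes a send() method:
  var ws = new WebSocket("wss://example.org/chat");  // illustrative URL
  ws.onopen = function () { ws.send("hello via the server"); };
  // The peer-to-peer data API could mirror that shape:
  //   channel.sendMessage("hello peer");  // name in the current proposal
  //   channel.send("hello peer");         // WebSocket-aligned name being suggested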
00:13:33 [DanD]
francois: there's a process called "feature at risk"
00:14:30 [francois]
RRSAgent, draft minutes
00:14:30 [RRSAgent]
I have made the request to generate http://www.w3.org/2011/11/01-webrtc-minutes.html francois
00:16:04 [francois]
Topic: MediaStream
00:17:45 [francois]
-> http://www.w3.org/2011/04/webrtc/wiki/images/1/1c/MediaStream_TPAC_2011.odp MediaStream slides (odp format)
00:18:04 [francois]
scribe: francois
00:18:12 [francois]
[going through slides]
00:18:34 [francois]
cullen: why do audio tracks come first?
00:18:54 [francois]
adam: if the last track is not a video track, you can assume there's no video in there.
00:19:07 [francois]
... there used to be 2 lists.
00:19:18 [francois]
anant: the order doesn't have to correspond to anything.
00:19:28 [francois]
cullen: there's another ordering in SDP.
00:19:30 [francois]
anant: not related.
00:19:44 [francois]
cullen: wondering whether that ordering could be the same.
00:19:56 [francois]
... just strikes me as something weird.
00:20:01 [ekr]
ekr has joined #webrtc
00:20:48 [francois]
DanD: I think we should be explicit that the order does not have to match that of SDP
00:20:53 [hiroki]
hiroki has joined #webrtc
00:21:03 [francois]
anant: the only people who have to worry about that are browser vendors; it doesn't need to be exposed to users.
00:21:23 [francois]
stefan: I liked it better when there were two different lists.
00:21:41 [francois]
adam: it was easier to query whether there is audio or video.
00:21:55 [francois]
... Moving on to definitions.
00:22:14 [kermit]
kermit has joined #webrtc
00:22:21 [francois]
... MediaStream represents a stream of media data. Do I need to go through it?
00:23:03 [francois]
cullen: I find this definition fascinating. Can you have stereo audio in two tracks? Are voice and video one track? Audio and DTMF? No idea.
00:23:41 [francois]
anant: a track is the lowest you can go. Having 5.1 audio in one track looks weird.
00:24:12 [juberti]
what about comfort noise?
00:24:18 [juberti]
is that the same track as audio?
00:24:37 [francois]
cullen: we need some grouping for synchronization, but that's a separate thing.
00:25:31 [francois]
anant: the getObjectURL function is on the MediaStream, right? That's when you assign a stream to a video element.
00:25:55 [francois]
cullen: presumably, if I have a stream with 3 video streams, I want to send it to 3 different video elements.
00:26:34 [francois]
anant: media fragment API could be used to select the track you're interested in.
00:26:41 [francois]
s/API//
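[Scribe note: a rough sketch of the idea just discussed — attaching a whole MediaStream to a video element and using a Media Fragments-style "#track=" selector to pick one track. The getUserMedia calling convention, URL.createObjectURL() accepting a MediaStream, and the "video1" track name are all assumptions, not settled API.]

  navigator.getUserMedia({ audio: true, video: true }, function (stream) {
    var video = document.querySelector("video");
    // Attach the whole stream, then select the track of interest via a fragment.
    video.src = URL.createObjectURL(stream) + "#track=video1";
  }, function (err) { console.error(err); });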
00:27:05 [Ruinan]
Ruinan has joined #webrtc
00:27:10 [francois]
DanD: as long as we all agree on what's inside, we're in good shape.
00:27:26 [francois]
... This is a good start for a glossary.
00:28:16 [francois]
cullen: let's say that the graphics card has VP8 support. You can't assume that the clone happens before the decoding happens.
00:29:26 [francois]
[discussion on GStream and tracks]
00:30:12 [derf]
s/GStream/gstreamer/
00:30:46 [francois]
anant: I think gstreamer has two separate track-like objects for stereo audio.
00:31:04 [francois]
tim: surely, 5.1 audio is one source for gstreamer.
00:32:29 [francois]
adam: the motivation for removing the parallel between MediaStreamTrack and media track is that audio was a list of multiple tracks whereas video was a single exclusive track.
00:32:58 [francois]
hta: basically one MediaStreamTrack is one stream of audio.
00:33:17 [francois]
cullen: stereo is two tracks, 5.1 is 6 tracks. That's very easy to deal with.
00:33:36 [francois]
anant: you want to be able to disable audio tracks.
00:33:56 [francois]
tim: how do I know which track is the rear right and so on?
00:34:32 [francois]
DanD: technically, with 3D video, you'll want to sync those two tracks.
00:35:29 [francois]
francois: 6 tracks for 5.1 audio means disabling audio is disabling 6 tracks.
00:35:38 [francois]
anant: we can add a layer at MediaStream level.
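[Scribe note: a small sketch of the "layer at MediaStream level" idea — muting every audio track in one call when 5.1 audio is carried as six tracks. The audioTracks list and the enabled flag follow the draft under discussion; the helper name setAudioEnabled and the localStream variable are purely illustrative.]

  function setAudioEnabled(stream, enabled) {
    // With 5.1 carried as six tracks, muting audio touches six entries.
    for (var i = 0; i < stream.audioTracks.length; i++) {
      stream.audioTracks[i].enabled = enabled;
    }
  }
  // setAudioEnabled(localStream, false);  // disable all audio channels at once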
00:35:40 [Kangchan]
Kangchan has joined #webrtc
00:36:13 [francois]
burn: the real world allows both, combined or not.
00:36:52 [francois]
cullen: the question is: if something is jointly coded with multiple channels, is that one track?
00:37:24 [francois]
... If that's one track with a bunch of channels, the fact that it could be represented as two tracks sounds like a complete disaster.
00:37:51 [francois]
... We need some abstraction layer to ease the life of Web developers.
00:39:30 [francois]
hta: in the case of 4 microphones, you want to send 4 tracks. With 6, you want to send 6 tracks.
00:39:41 [Gang]
Gang has joined #webrtc
00:39:50 [francois]
anant: I think early implementations will only support one or two channels at most.
00:40:10 [francois]
tim: there are plenty of places where we can get audio that is not one channel.
00:40:21 [francois]
anant: right, from files, for instance.
00:42:08 [francois]
anant: my preference is to stick to a MediaStreamTrack as the lowest thing.
00:42:50 [francois]
adam: moving on. An instance of a MediaStreamTrack can only belong to one MediaStream.
00:43:28 [francois]
anant: noting that "track" is really not the same thing as a track in music, so we need to be explicit about that in the doc, so as not to create additional confusion.
00:43:58 [derf]
s/music/container formats, etc./
00:44:08 [francois]
RRSAgent, draft minutes
00:44:08 [RRSAgent]
I have made the request to generate http://www.w3.org/2011/11/01-webrtc-minutes.html francois
00:46:22 [francois]
[meeting adjourned]
00:46:23 [francois]
RRSAgent, draft minutes
00:46:23 [RRSAgent]
I have made the request to generate http://www.w3.org/2011/11/01-webrtc-minutes.html francois
00:46:41 [Zakim]
-Justin_Uberti
00:50:42 [Zakim]
-Prospector_AB
00:50:43 [Zakim]
UW_(WebRTC)12:00PM has ended
00:50:45 [Zakim]
Attendees were Justin_Uberti, Prospector_AB
00:51:01 [hiroki]
hiroki has joined #webrtc
00:56:00 [francois]
Zakim, bye
00:56:00 [Zakim]
Zakim has left #webrtc
01:09:30 [ekr]
ekr has joined #webrtc
01:10:05 [ekr_]
ekr_ has joined #webrtc
03:21:18 [lgombos]
lgombos has joined #webrtc
03:26:45 [Mani]
Mani has joined #webrtc
04:56:07 [ekr]
ekr has joined #webrtc
05:05:19 [hta]
hta has joined #webrtc
05:23:41 [JonathanJ]
JonathanJ has joined #webrtc
05:29:32 [stakagi]
stakagi has joined #webrtc
05:42:30 [howard]
howard has joined #webrtc
05:46:34 [howard]
howard has left #webrtc
05:49:42 [Gang]
Gang has joined #webrtc
06:04:02 [hta]
hta has joined #webrtc