IRC log of audio on 2012-05-09

Timestamps are in UTC.

18:50:39 [RRSAgent]
RRSAgent has joined #audio
18:50:39 [RRSAgent]
logging to
18:50:43 [olivier]
trackbot, start
18:50:43 [trackbot]
Sorry, olivier, I don't understand 'trackbot, start'. Please refer to for help
18:50:47 [olivier]
trackbot, start meeting
18:50:49 [trackbot]
RRSAgent, make logs world
18:50:51 [trackbot]
Zakim, this will be 28346
18:50:51 [Zakim]
ok, trackbot; I see RWC_Audio()3:00PM scheduled to start in 10 minutes
18:50:52 [trackbot]
Meeting: Audio Working Group Teleconference
18:50:52 [trackbot]
Date: 09 May 2012
18:51:02 [olivier]
18:51:10 [olivier]
Agenda+ Specs Roadmap
18:51:28 [olivier]
Agenda+ JavaScriptNode buffer size and delay (ISSUE-13 and ISSUE-14)
18:52:26 [gabriel]
gabriel has joined #audio
18:54:34 [chrislowis]
chrislowis has joined #audio
18:56:08 [Zakim]
RWC_Audio()3:00PM has now started
18:56:15 [Zakim]
18:56:31 [chrislowis]
Zakim, P0 is chrislowis
18:56:31 [Zakim]
sorry, chrislowis, I do not recognize a party named 'P0'
18:56:37 [chrislowis]
Zakim, ??P0 is chrislowis
18:56:37 [Zakim]
+chrislowis; got it
18:56:53 [chrislowis]
Hi olivier!
18:57:11 [roc]
roc has joined #audio
18:57:27 [mdjp]
mdjp has joined #audio
18:57:55 [Zakim]
18:58:05 [gabriel]
zakim, ??P1 is me
18:58:05 [Zakim]
+gabriel; got it
18:58:11 [Zakim]
18:58:32 [olivier]
Chair: Olivier
18:58:38 [olivier]
Scribe: Chris Lowis
18:58:46 [olivier]
ScribeNick: chrislowis
18:59:27 [jussi]
jussi has joined #audio
18:59:40 [chrislowis]
Zakim, who is on the call?
18:59:40 [Zakim]
On the phone I see chrislowis, gabriel, ??P7
19:00:04 [roc]
I think I joined
19:00:09 [roc]
it's completely silent
19:00:14 [Zakim]
19:00:34 [mdjp]
mdjp - same problem
19:00:50 [jussi]
Zakim, ??P9 is me
19:00:50 [Zakim]
+jussi; got it
19:01:08 [jussi]
for me VOIP worked fine
19:01:14 [chris]
chris has joined #audio
19:01:20 [Zakim]
19:01:25 [chrislowis]
mdjp, olivier: ?
19:01:29 [olivier]
19:01:43 [Zakim]
19:01:43 [olivier]
got in
19:01:51 [olivier]
zakim, ??P10 is me
19:01:51 [Zakim]
+olivier; got it
19:01:54 [jernoble]
jernoble has joined #audio
19:02:01 [chrislowis]
mdjp: try again?
19:02:06 [olivier]
zakim, who is here?
19:02:06 [Zakim]
On the phone I see chrislowis, gabriel, ??P7, jussi, ??P8, olivier
19:02:07 [mdjp]
Zakim, ??P8 is me
19:02:07 [Zakim]
On IRC I see jernoble, chris, jussi, mdjp, roc, chrislowis, gabriel, RRSAgent, Zakim, olivier, F1LT3R, colinbdclark, kinetik, kennyluck, shepazu, foolip, trackbot, paul_irish
19:02:07 [Zakim]
+mdjp; got it
19:02:23 [chrislowis]
roc: are you ??P7 ?
19:02:28 [olivier]
Regrets: Alistair
19:02:31 [roc]
guess so
19:02:38 [olivier]
yes I think so
19:02:39 [chrislowis]
Zakim, ??P7 is roc
19:02:39 [Zakim]
+roc; got it
19:02:40 [Zakim]
19:03:02 [chrislowis]
Zakim, who is noisy?
19:03:07 [jussi]
might be me
19:03:09 [Zakim]
19:03:11 [jussi]
although I'm muted
19:03:15 [Zakim]
chrislowis, listening for 12 seconds I heard sound from the following: jussi (54%), olivier (3%)
19:03:17 [jussi]
I think
19:03:20 [jernoble]
Zakim, ++P12 is jernoble
19:03:20 [Zakim]
sorry, jernoble, I do not recognize a party named '++P12'
19:03:26 [olivier]
zakim, mute jussi
19:03:26 [Zakim]
jussi should now be muted
19:03:27 [jernoble]
Zakim, ??P12 is jernoble
19:03:27 [Zakim]
+jernoble; got it
19:03:32 [jussi]
19:03:42 [olivier]
jussi, ack-ing you will unmute you
19:03:48 [olivier]
zakim, who is here?
19:03:48 [Zakim]
On the phone I see chrislowis, gabriel, roc, jussi (muted), mdjp, olivier, CRogers, jernoble
19:03:51 [Zakim]
On IRC I see jernoble, chris, jussi, mdjp, roc, chrislowis, gabriel, RRSAgent, Zakim, olivier, F1LT3R, colinbdclark, kinetik, kennyluck, shepazu, foolip, trackbot, paul_irish
19:04:01 [olivier]
shepazu, are you joining?
19:04:04 [jussi]
olivier: alright
19:04:50 [Zakim]
19:05:12 [olivier]
zakim, agenda?
19:05:12 [Zakim]
I see 2 items remaining on the agenda:
19:05:13 [Zakim]
1. Specs Roadmap [from olivier]
19:05:13 [Zakim]
2. JavaScriptNode buffer size and delay (ISSUE-13 and ISSUE-14) [from olivier]
19:05:16 [olivier]
zakim, take up agendum 1
19:05:16 [Zakim]
agendum 1. "Specs Roadmap" taken up [from olivier]
19:05:40 [olivier]
19:06:13 [chrislowis]
olivier: today I want to talk about:
19:06:24 [chrislowis]
olivier: 1) what are we going to do with our two specs
19:06:38 [chrislowis]
olivier: 2) What are we going to do with UC and recs document
19:07:23 [chrislowis]
olivier: Last week we noted that there was a lot of buy-in for the Web Audio API, but less so for the MediaStream API.
19:07:43 [chrislowis]
olivier: today I'd like to hear from the two editors about what they think.
19:08:01 [chrislowis]
olivier: I've had a quick conversation with roc.
19:08:24 [chrislowis]
olivier: It feels like we will proceed with the Web Audio API and try to fold the Mediastream API in as a note.
19:09:30 [chrislowis]
shepazu: given what I've seen, a good strategy would be to keep in mind the Use Cases around streaming that roc contributed, especially around the consistency of video and audio.
19:10:12 [chrislowis]
roc: I think that what you are proposing is reasonable
19:10:34 [chrislowis]
roc: There are still issues I have around synchronisation, and I still have a strong desire for tight integration with MediaStreams.
19:10:46 [chrislowis]
roc: and some way of processing different types of media.
19:11:25 [chrislowis]
roc: I want to try and figure out how to integrate the two. I'd like the semantics of mediastreams and web audio nodes to match
19:11:57 [chrislowis]
roc: I think we can do that while still keeping compatibility for the people who are currently using the web audio api
19:12:31 [chrislowis]
olivier: I think the biggest question was the necessity of having consistency between web audio and mediastreams.
19:13:05 [chrislowis]
olivier: Chris Rogers, could you respond to the question about how the web audio spec could better integrate with Mediastreams?
19:13:31 [chrislowis]
CRogers: it's a good time to talk about this as we are starting to prototype mediastreams in Chrome right now.
19:14:02 [chrislowis]
CRogers: I put a proposal of how the two might integrate based on roc's use cases, which were a very useful starting point.
19:14:06 [chris]
19:14:17 [chrislowis]
19:15:17 [chrislowis]
chris: this document is my best first stab at how the integration might work. We're going to try it in a prototype.
19:16:20 [chrislowis]
chris: it's two new methods on the audioContext, so it's fairly light-weight.
19:16:46 [chrislowis]
chris: we'd like to show our progress with WebRTC
19:17:09 [chrislowis]
olivier: I'd say go ahead and add it to the spec, noting that exactly how they'd work is still under discussion.
19:17:53 [chrislowis]
chris: there's still a lot of discussion, such as how to deal with multiple audio and video tracks.
19:18:10 [chrislowis]
chris: it's useful to be able to split out the streams and deal with them separately.
19:19:10 [chrislowis]
roc: in mediastream processing spec you can get a video stream from a canvas and use that for overlays. That's a logical way of doing that. With multiple audio tracks you can mix them together too.
19:19:47 [chrislowis]
chris: I'm sure we'll find some cases where we'll overlay video tracks together in a canvas, but I suspect it'll be more normal that each video track will have a separate layout on a page.
19:19:57 [chrislowis]
chris: e.g. video conference-type application.
19:20:53 [chrislowis]
roc: if you're going to process multiple tracks you will need some API to allow them to be mixed. To keep the simple case simple the default behaviour might be to mix them together by default, then have an API to allow them to be split.
19:21:01 [olivier]
19:21:20 [chrislowis]
chris: even without the Web Audio API in the picture you'd still need to cope with multiple streams. So maybe that is the best default behaviour.
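The default-mixing behaviour discussed above can be sketched as a plain function. This is purely illustrative, not taken from either spec; the function name, the use of raw sample arrays, and the hard-clipping strategy are all my assumptions:

```javascript
// Hypothetical sketch of "mix all audio tracks together by default":
// sum the samples of each track, then clamp to the [-1, 1] range.
// Tracks are plain arrays of samples of equal length.
function mixTracks(tracks) {
  if (tracks.length === 0) return [];
  const length = tracks[0].length;
  const mixed = new Array(length).fill(0);
  for (const track of tracks) {
    for (let i = 0; i < length; i++) {
      mixed[i] += track[i];
    }
  }
  // Naive hard clipping; a real implementation might normalise instead.
  return mixed.map(s => Math.max(-1, Math.min(1, s)));
}
```

An API to split the tracks out again, as roc suggests, would then be the opt-in path for applications that need per-track processing.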
19:22:31 [chrislowis]
olivier: I'd like to give everyone the chance to register their objections to this and document in our rechartering that we'll focus on the web audio api going forward.
19:22:56 [chrislowis]
olivier: roc: if we're going to republish your work as a note, do you need some time to reflect any changes you've made?
19:23:15 [chrislowis]
roc: I think we'll try to publish it the way it is, there's not much point in changing it now.
19:23:26 [chrislowis]
shepazu: if we decide later to change anything we can update the note.
19:24:07 [chrislowis]
olivier: it's quite important to have it in a state we're happy with as it's a cornerstone of our work on the web audio api, so if you'd like to make changes feel free.
19:24:30 [chrislowis]
roc: at the moment it reflects the implementation so it makes sense to keep it as it is, even though there's a couple of things I could change.
19:25:16 [olivier]
RESOLVED: the group will publish the mediastream processing API as a note
19:25:50 [olivier]
RESOLVED: our new charter will document the focus on the web audio API as our audio processing spec
19:26:15 [chrislowis]
olivier: Moving to the 2nd question:
19:26:24 [chrislowis]
(What are we going to do with UC and recs document)
19:27:17 [chrislowis]
olivier: if you go to the spec today there is a section called Use Cases and Requirements.
19:27:49 [chrislowis]
olivier: my question is whether the work we have done this winter on the UC&R document should go into the spec as an informative section. Or whether we'd rather keep it in the wiki.
19:28:37 [chrislowis]
chris: my preference would be to take it from the wiki into a separate html file but to link to it from the spec.
19:28:59 [chrislowis]
chris: I think the wiki made more sense when we were brainstorming the ideas. It could be formatted more nicely as a formal document.
19:29:29 [chrislowis]
olivier: that's pretty close to my preference: take the use cases and requirements, turn it into a working draft and eventually to publish it as a note when it's more mature.
19:29:37 [chrislowis]
olivier: then we'd link that note from the spec.
19:29:50 [chrislowis]
olivier: and note which use cases we considered out of scope for the document.
19:30:04 [chrislowis]
19:30:32 [chrislowis]
chris: I would put them in the use cases and requirements doc.
19:30:37 [chrislowis]
olivier: agreed. Objections?
19:30:42 [chrislowis]
None noted.
19:31:22 [chrislowis]
olivier: Then I'll start preparing that draft. The door is open for volunteers to take that on.
19:32:13 [olivier]
RESOLVED: the group will publish the use cases & requirements as a WD, with a view to publish as a note
19:32:30 [olivier]
RESOLVED: features left out of scope for the v1 of web audio API will be documented in the UC&R Note
19:32:57 [olivier]
zakim, next agendum
19:32:57 [Zakim]
agendum 2. "JavaScriptNode buffer size and delay (ISSUE-13 and ISSUE-14)" taken up [from olivier]
19:33:36 [olivier]
19:33:41 [olivier]
19:33:41 [trackbot]
ISSUE-13 -- JavaScriptNode Delays -- raised
19:33:41 [trackbot]
19:33:46 [olivier]
19:33:46 [trackbot]
ISSUE-14 -- Default value for bufferSize in createJavaScriptNode() -- raised
19:33:46 [trackbot]
19:33:49 [olivier]
19:33:49 [olivier]
19:35:16 [chrislowis]
I'll start with ISSUE-13. mdjp, would you explain a little more?
19:36:19 [gabriel]
zakim, I should be muted
19:36:19 [Zakim]
I don't understand 'I should be muted', gabriel
19:36:46 [jussi]
Zakim, mute gabriel
19:36:46 [Zakim]
gabriel should now be muted
19:36:55 [olivier]
zakim, mute mdjp
19:36:55 [Zakim]
mdjp should now be muted
19:37:05 [chrislowis]
mdjp: the main thing is to make people aware that when using the JS node that there is a delay introduced by the node.
19:37:30 [mdjp]
I should be muted now
19:37:41 [olivier]
zakim, unmute mdjp
19:37:41 [Zakim]
mdjp should no longer be muted
19:37:54 [chrislowis]
chris: yes, the JS audio node has an inherent latency due to its buffering. We talked about adding a latency method to query what the latency is on a node.
19:38:11 [jernoble]
19:38:13 [chrislowis]
chris: the point is that we need a latency attribute on the AudioNode.
19:38:29 [chrislowis]
chris: there's a Rendering Time attribute that supplies additional information.
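The inherent delay chris describes comes from buffering: a JavaScriptNode must accumulate a full buffer of samples before its processing callback runs, so the minimum added latency is one buffer's duration. A minimal sketch of that arithmetic (the helper name is mine, not from the spec):

```javascript
// Rough sketch: the minimum latency a buffering JavaScriptNode adds,
// in seconds, is one buffer's worth of samples (bufferSize / sampleRate).
// Double-buffering in a real implementation can make the effective
// delay larger than this lower bound.
function jsNodeLatencySeconds(bufferSize, sampleRate) {
  return bufferSize / sampleRate;
}

// e.g. a 4096-sample buffer at 44.1 kHz adds at least ~93 ms of delay,
// which is the kind of value a latency attribute could expose.
```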
19:38:41 [chris]
19:38:44 [olivier]
ack jer
19:39:07 [chrislowis]
jernoble: are we asking about adding an attribute to query the latency on all nodes, not just the JS node?
19:39:10 [chrislowis]
chris: yes.
19:39:37 [chrislowis]
chris: in the link above there's an attribute called event time to determine where you are in the playback stream.
19:39:52 [roc]
19:39:59 [chrislowis]
chris: it provides a way for JS nodes to synchronise themselves with other nodes.
19:40:24 [chrislowis]
olivier: so using this you could compensate in other parts of the graph for this delay.
19:40:35 [olivier]
ack roc
19:40:45 [chrislowis]
chris: yes, so using this you would be able to compensate for this.
19:41:21 [chrislowis]
roc: in mediastream processing you don't need to query nodes for the latency. If we can avoid it in the web audio api that would be best for authors.
19:42:00 [chrislowis]
chris: my feeling is we can't do everything automatically - there are some cases where you would want to compensate and some where you wouldn't, and the system wouldn't be able to reliably detect which mode to be in.
19:42:24 [chrislowis]
chris: cf. the Logic Audio 9 screenshot previously on the list.
19:42:25 [roc]
19:43:20 [chrislowis]
olivier: could you see a case where the developer might add latency deliberately and this would hurt that?
19:43:41 [chrislowis]
chris: Referring to a thread on the list where this was discussed.
19:43:58 [chrislowis]
olivier: could you find that thread and add it to ISSUE-13?
19:44:04 [chrislowis]
chris: sure.
19:44:07 [olivier]
ack roc
19:45:18 [chrislowis]
roc: forcing developers to do latency calculations themselves is something we should avoid.
19:45:33 [chrislowis]
chris: my feeling is the latency compensation should be opt-in rather than opt-out.
19:45:43 [olivier]
19:45:49 [chrislowis]
olivier: thanks :)
19:46:20 [chrislowis]
19:46:57 [chrislowis]
olivier: perhaps we could add something to the next WD of the spec, to look for feedback.
19:47:59 [chrislowis]
chris: one of the examples in the previous thread I mentioned was someone playing a MIDI synth along with a generated sequence - if people are trying to use the API in a simple way they won't understand where the delay is coming from.
19:48:36 [chrislowis]
olivier: do you (chris) have any notion of the performance issues caused by always-on latency compensation?
19:48:42 [chrislowis]
chris: in terms of CPU load?
19:49:15 [olivier]
ack chrislo
19:49:15 [chrislowis]
olivier: yes. I don't think it would have an appreciable effect. I'm not sure. I'm more concerned about the impact on delays.
19:49:46 [olivier]
chrislowis: wanted to point out that the problem we ran into was when using synthesis, when phase was important.
19:50:00 [olivier]
… we were missing a few native building blocks, like additions
19:50:11 [olivier]
… the more native blocks, the less of an issue it becomes
19:50:18 [chris]
19:50:30 [chrislowis]
chris: I'd like to bring up the playback time attribute --^
19:51:14 [chrislowis]
chris: the playback time is a timestamp, so you know exactly when things happen and can synchronise events in a JS node exactly.
19:51:35 [chrislowis]
olivier: is this just a matter of documentation?
19:52:05 [chrislowis]
chris: we should still have the latency attribute, to handle both synthesis and the generation of note events.
19:52:15 [roc]
seems to me that using the playbackTime attribute gives you the information you need about latency
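The compensation roc alludes to can be sketched as follows: given the playbackTime of the buffer currently being filled (the time at which its first sample will actually be heard), an author can place events at an exact sample position. This is an illustrative sketch only; the function name and the scheduling model are assumptions, not anything defined in the spec:

```javascript
// Illustrative sketch: given playbackTime (when the first sample of the
// buffer being filled will be heard) and an absolute event time, return
// the sample offset within the buffer at which the event should be
// rendered, or -1 if the event falls outside this buffer.
function eventSampleOffset(playbackTime, eventTime, bufferSize, sampleRate) {
  const offset = Math.round((eventTime - playbackTime) * sampleRate);
  return (offset >= 0 && offset < bufferSize) ? offset : -1;
}
```

Under this model the author never queries latency directly; everything is expressed in terms of when samples will be heard.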
19:52:49 [olivier]
19:52:51 [jussi]
19:52:52 [olivier]
ack jussi
19:53:36 [roc]
19:53:39 [olivier]
zakim, mute jussi
19:53:39 [Zakim]
jussi should now be muted
19:53:40 [chrislowis]
jussi: if the implementation could put in a default value for good latency that would be an option.
19:53:43 [jussi]
19:53:48 [olivier]
ack roc
19:54:11 [trackbot]
ISSUE-14 -- Default value for bufferSize in createJavaScriptNode() -- raised
19:54:12 [trackbot]
19:54:17 [chrislowis]
roc: in mediastreams processing the implementation always chooses the buffer size.
19:54:30 [chrislowis]
olivier: what does it choose as the default?
19:55:17 [chrislowis]
roc: it'll be implementation-dependent, depending on whether you're buffering ahead. In my implementation it changes dynamically depending on the amount of buffering. I can dig up more information off-call.
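One way an implementation might pick a buffer size automatically, in the spirit of what roc describes, is to choose the smallest power-of-two buffer whose duration covers a target latency. This is entirely hypothetical; neither spec defines such an algorithm, and the function name, bounds, and strategy are my assumptions:

```javascript
// Hypothetical sketch of implementation-chosen buffer sizing: pick the
// smallest power-of-two buffer (within allowed bounds) whose duration
// is at least the target latency. A real implementation might also
// adapt the size dynamically at runtime, as roc notes.
function chooseBufferSize(targetLatencySeconds, sampleRate,
                          minSize = 256, maxSize = 16384) {
  const neededSamples = targetLatencySeconds * sampleRate;
  let size = minSize;
  while (size < neededSamples && size < maxSize) {
    size *= 2;
  }
  return size;
}
```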
19:55:33 [chrislowis]
olivier: would this make interoperability difficult?
19:56:12 [chrislowis]
chris: we had a bit of a discussion about this with Joe a few months ago. We were going back and forth on whether we should go with roc's suggestion, or whether we should allow the developer to specify the buffer size.
19:56:34 [olivier]
ack jussi
19:56:42 [chrislowis]
chris: I don't really have a firm opinion either way, it's a tough question. roc's suggestion has a lot of merit, and I think joe agrees.
19:56:45 [olivier]
mute jussi
19:56:53 [chrislowis]
chris: jussi - do you have any objections?
19:57:01 [chrislowis]
jussi: I'm not sure, need to think about it.
19:57:02 [roc]
19:57:13 [chrislowis]
chris: me too. It's really hard! I think we should talk about it on the list.
19:57:19 [olivier]
ack roc
19:57:20 [chrislowis]
chris: in principle I agree with roc on this.
19:58:20 [chrislowis]
roc: interop won't be a problem if implementations that are widely used *do* vary the buffer size dynamically, but if devs start working around things, it'll be tricky.
19:58:38 [chrislowis]
olivier: I think the way forward is to continue the discussion on the list.
19:59:18 [chrislowis]
olivier: good time to wrap up the call as we're reaching an hour. AOB?
19:59:28 [chrislowis]
None noted.
20:00:04 [chrislowis]
olivier: same time next week.
20:00:08 [chrislowis]
shepazu: before we go...
20:00:22 [chrislowis]
shepazu: did we resolve to go forward with publication?
20:00:33 [chrislowis]
olivier: we decided we would, but we haven't talked about logistics.
20:00:44 [chrislowis]
olivier: we have a recorded resolution.
20:00:59 [chrislowis]
shepazu: we'll have to wait a couple of weeks, as there's a moratorium on publication at the moment.
20:01:15 [jussi]
thanks everyone! bye
20:01:17 [chrislowis]
olivier: let's figure out the logistics after the AC meeting.
20:01:25 [Zakim]
20:01:26 [Zakim]
20:01:27 [Zakim]
20:01:28 [gabriel]
20:01:31 [Zakim]
20:01:32 [Zakim]
20:01:33 [Zakim]
20:01:36 [mdjp]
bye all
20:01:36 [Zakim]
20:01:37 [olivier]
rrsagent, make minutes
20:01:37 [RRSAgent]
I have made the request to generate olivier
20:01:47 [Zakim]
20:06:47 [Zakim]
disconnecting the lone participant, jernoble, in RWC_Audio()3:00PM
20:06:48 [Zakim]
RWC_Audio()3:00PM has ended
20:06:48 [Zakim]
Attendees were chrislowis, gabriel, jussi, olivier, mdjp, roc, CRogers, jernoble, Doug_Schepers
20:51:46 [automata]
automata has joined #audio
20:59:28 [roc]
roc has joined #audio
21:00:11 [roc]
roc has left #audio
22:40:26 [colinbdclark]
colinbdclark has joined #audio
22:41:15 [colinbdclark_]
colinbdclark_ has joined #audio
22:45:31 [colinbdclark_]
colinbdclark_ has joined #audio
23:11:09 [colinbdclark]
colinbdclark has joined #audio
23:22:11 [colinbdclark_]
colinbdclark_ has joined #audio
23:26:25 [paul___irish]
paul___irish has joined #audio
23:26:58 [colinbdclark_]
colinbdclark_ has joined #audio
23:47:19 [colinbdclark_]
colinbdclark_ has joined #audio
23:48:23 [automata]
automata has joined #audio
23:50:55 [colinbdclark_]
colinbdclark_ has joined #audio