This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019. Please see the home page for more details.
Audio-ISSUE-54 (MediaElementAudioSourceNode): MediaElementAudioSourceNode [Web Audio API]
http://www.w3.org/2011/audio/track/issues/54
Raised by: Philip Jägenstedt
On product: Web Audio API

HTMLMediaElement is integrated into the Web Audio API using the MediaElementAudioSourceNode interface and createMediaElementSource(). This looks like an unnecessarily complicated way of making the connection. Suggestion: extend the HTML AudioTrack interface [1] with an "AudioSourceNode node" member. This would also allow us to use multiple audio tracks from a single HTMLMediaElement.

[1] http://www.whatwg.org/specs/web-apps/current-work/multipage/the-video-element.html#audiotrack
Ian Hickson and I discussed the trade-off between these two approaches a while back. He suggested that the current approach (with a create() method) keeps the two APIs well-factored and avoids any changes to the HTMLMediaElement spec. Also, I don't think the create() approach is very complex at all. It's true that it adds one more line of JS, but that is a very small cost, and it's consistent with the other source nodes such as createBufferSource(). Plenty of people are already using this API, and nobody has complained about this particular aspect.
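For context, the create() approach being defended here looks roughly like the following sketch (browser-only; the element selector and filter settings are illustrative, not from the spec discussion):

```javascript
// Sketch of the current approach: one extra line of JS wires an
// HTMLMediaElement into the Web Audio graph.
const audioCtx = new AudioContext();
const mediaElement = document.querySelector('audio');

// The "one more line" under discussion: create a source node from the element.
const source = audioCtx.createMediaElementSource(mediaElement);

// From here the node behaves like any other source node, e.g. it can be
// routed through a filter before reaching the speakers.
const filter = audioCtx.createBiquadFilter();
filter.type = 'lowpass';
filter.frequency.value = 1000;

source.connect(filter);
filter.connect(audioCtx.destination);
```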
Philip, any thoughts given the recent changes in http://dvcs.w3.org/hg/audio/rev/9224fb26e77d ?
That change was in response to Bug 17346 and looks fine to me. As for createMediaElementSource() vs a property on AudioTrack, I'm fine with keeping a create method for consistency and API separation. What remains unresolved is the issue of multi-track resources. If instead of a createMediaElementSource we had a createAudioTrackSource, the issue of what to do with media elements without any audio goes away, and it's also possible to process the audio streams separately. Should I file a new bug for this?
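The alternative being proposed might look something like the sketch below. Note that createAudioTrackSource() is hypothetical (it is the shape of the proposal, not a method in any shipped spec), and the selector is made up for illustration:

```javascript
// Hypothetical API sketch -- createAudioTrackSource() does not exist;
// it is the per-track alternative proposed in this issue.
const audioCtx = new AudioContext();
const video = document.querySelector('video');

// With per-track sources, each audio track of a multi-track resource
// could be processed independently, and an element with no audio tracks
// simply yields nothing to connect.
for (const track of video.audioTracks) {
  const source = audioCtx.createAudioTrackSource(track); // hypothetical
  source.connect(audioCtx.destination);
}
```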
(In reply to comment #3)
> What remains unresolved is the issue of multi-track resources. If instead of
> a createMediaElementSource we had a createAudioTrackSource, the issue of
> what to do with media elements without any audio goes away, and it's also
> possible to process the audio streams separately. Should I file a new bug
> for this?

Makes sense. Let's keep it here - care to change/clarify the issue name? (If you can... not sure what the typical Bugzilla rights are these days.)
OK, reopened and changed name.
Web Audio API issues have been migrated to Github. See https://github.com/WebAudio/web-audio-api/issues
Closing. See https://github.com/WebAudio/web-audio-api/issues for an up-to-date list of issues for the Web Audio API.