This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019. Please see the home page for more details.
Audio-ISSUE-72 (ConvolverNodeState): ConvolverNode state modification [Web Audio API]
http://www.w3.org/2011/audio/track/issues/72
Raised by: Philip Jägenstedt
On product: Web Audio API

When buffer and normalize are modified, when do the changes take effect? If modifications are not applied atomically, spikes (or dropouts) can occur depending on the order of setting and the previous state. Related: https://www.w3.org/2011/audio/track/issues/28

Further, setting normalize to true is defined using the phrasing "when the buffer attribute is set", which implies that the order in which the attributes are set matters. However, no such order dependence exists when setting normalize to false. If an order dependence is intended, the spec ought to be changed to either commit the changes atomically after the script thread has finished, or provide a setter taking both a buffer and a normalize flag.
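The ordering hazard described above can be sketched with a mock object rather than a real ConvolverNode. The 0.5 scale factor here is an arbitrary stand-in for normalization; the only point is that the normalization decision is latched at buffer-set time, mirroring the "when the buffer attribute is set" phrasing:

```javascript
// Mock of a convolver whose normalization takes effect only when the
// buffer is set, as the quoted spec phrasing implies. Not real Web Audio
// API code; the 0.5 scale is an arbitrary illustrative value.
class MockConvolver {
  constructor() {
    this.normalize = true; // spec default
    this._scaled = null;
  }
  setBuffer(samples) {
    // The normalization decision is latched here, so a later change
    // to `normalize` has no effect on an already-set buffer.
    this._scaled = this.normalize ? samples.map(x => x * 0.5) : samples.slice();
  }
}

const a = new MockConvolver();
a.normalize = false;   // order 1: normalize set before buffer
a.setBuffer([1, 1]);   // buffer kept unnormalized

const b = new MockConvolver();
b.setBuffer([1, 1]);   // order 2: buffer set first, so it is normalized
b.normalize = false;   // too late; the stored buffer stays normalized
```

With the same two statements in different orders, `a` ends up unnormalized and `b` normalized, which is exactly the order dependence the issue objects to.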
Much more detail added here: https://dvcs.w3.org/hg/audio/rev/4df179094971
The change mostly covers the questions asked. Feedback on the new changes:

- The algorithm defined in the C++ (?) function calculateNormalizationScale would be much better defined in pseudocode, and could probably be more compact. The code also seems to depend on internal data structures specific to a particular implementation.
- The text "A mono, stereo, or 4-channel <code>AudioBuffer</code> containing the (possibly multi-channel) impulse response" is confusing. What does "possibly multi-channel" mean in this context? Can a mono AudioBuffer be multi-channel?
- Editorial: "Normative requirements for multi-channel convolution matrixing are described <a href="#Convolution-reverb-effect">here</a>". Please don't use "here"-links.
- It is unspecified what should happen if you first set the buffer attribute to an AudioBuffer "buf" and later make changes to your locally referenced "buf" (or, for that matter, modify the array returned by buffer.getChannelData(k) directly).
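For illustration, an RMS-based computation along the lines of what calculateNormalizationScale appears to do can be written compactly in JavaScript. The constants and the structure below follow the spec text as I read it, with the sample-rate compensation step omitted; treat this as a sketch, not the normative algorithm:

```javascript
// Sketch of an RMS-based normalization scale, roughly following the
// spec's calculateNormalizationScale. Constants are taken on trust from
// the spec text; the sample-rate compensation factor is omitted.
function calculateNormalizationScale(channels /* array of sample arrays */) {
  const GainCalibration = 0.00125; // spec constant (assumed)
  const MinPower = 0.000125;       // lower clamp to avoid huge scales

  let sumOfSquares = 0;
  let totalSamples = 0;
  for (const channel of channels) {
    for (const x of channel) sumOfSquares += x * x;
    totalSamples += channel.length;
  }

  // RMS power over all channels and frames
  let power = Math.sqrt(sumOfSquares / totalSamples);
  if (!isFinite(power) || power < MinPower) power = MinPower;

  return GainCalibration / power;
}
```

A pseudocode formulation of this shape would avoid the implementation-specific data structures the C++ version depends on.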
...additionally, I'd suggest that the buffer + normalize attributes be replaced by a single setImpulseResponse(AudioBuffer buffer, boolean normalize) method. That would make the interface much clearer, and avoid possible problems with modifying the AudioBuffer after setting it.
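A sketch of what such a combined setter could look like, with a plain array standing in for AudioBuffer (this is a hypothetical API, not part of the spec):

```javascript
// Hypothetical combined setter, as suggested above. Committing both
// values in one call removes the attribute-ordering hazard, and copying
// the samples means later mutation of the caller's buffer has no effect.
class ConvolverSketch {
  setImpulseResponse(buffer, normalize) {
    this._impulse = Float32Array.from(buffer); // defensive copy
    this._normalize = normalize;               // committed atomically with it
  }
}
```

An alternative to the defensive copy is to transfer ownership of the buffer, as the next comment suggests.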
Indeed, transferring ownership of the buffer and neutering it would avoid any problems with later modification.
Web Audio API issues have been migrated to Github. See https://github.com/WebAudio/web-audio-api/issues
Closing. See https://github.com/WebAudio/web-audio-api/issues for the up-to-date list of issues for the Web Audio API.