Proposal for fixing race conditions

We need to avoid having implementation details (e.g. whether or when data
is copied internally) affect observable output. This can be an issue when
JS passes an array to an API (or gets an array from an API) and later
modifies the array. We need to specify what happens in all such cases.
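
For concreteness, here's a minimal sketch of the kind of race I mean (plain
Web Audio calls, nothing proposed yet): whether the late write below is
audible currently depends on whether and when the implementation copies the
channel data internally.

  var ctx = new AudioContext();
  var buffer = ctx.createBuffer(1, ctx.sampleRate, ctx.sampleRate); // 1 second
  var data = buffer.getChannelData(0);
  for (var i = 0; i < data.length; i++)
    data[i] = Math.sin(2 * Math.PI * 440 * i / ctx.sampleRate);

  var source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start(0);

  // Racy: does this write change what we hear? Without spec text the answer
  // depends on whether the UA copied the data when .buffer was assigned.
  setTimeout(function () {
    for (var i = 0; i < data.length; i++) data[i] = 0;
  }, 100);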

I believe these are the remaining issues not already addressed in the spec:

1) AudioContext.createPeriodicWave(Float32Array real, Float32Array imag)
I propose copying the data during the createPeriodicWave call.
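
A sketch of the intended behaviour (not spec text): mutating the arrays after
the call must not affect the resulting wave.

  var ctx = new AudioContext();
  var real = new Float32Array([0, 0]);
  var imag = new Float32Array([0, 1]);       // fundamental only: a sine
  var wave = ctx.createPeriodicWave(real, imag);

  imag[1] = 0;                               // mutation after the call...
  var osc = ctx.createOscillator();
  osc.setPeriodicWave(wave);                 // ...must not silence the
  osc.connect(ctx.destination);              // oscillator, because the data
  osc.start(0);                              // was copied in createPeriodicWave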

2) AudioParam.setValueCurveAtTime(Float32Array values, double startTime,
double duration)
I propose copying the data during the setValueCurveAtTime call.
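
Same idea, sketched (not spec text): the curve that actually applies is the
one captured at call time.

  var ctx = new AudioContext();
  var gain = ctx.createGain();
  gain.connect(ctx.destination);
  var osc = ctx.createOscillator();
  osc.connect(gain);
  osc.start(0);

  var curve = new Float32Array([0, 1, 0]);   // fade in, then out
  gain.gain.setValueCurveAtTime(curve, ctx.currentTime, 2.0);

  curve[1] = 0;   // a later mutation must not flatten the ramp; the values
                  // were copied when setValueCurveAtTime was called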

3) WaveShaperNode.curve
I propose copying the data when the attribute is assigned to.
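
Sketch of the intended behaviour: the copy happens at assignment, so later
writes to the array don't change the shaping already in effect.

  var ctx = new AudioContext();
  var shaper = ctx.createWaveShaper();
  shaper.connect(ctx.destination);

  var curve = new Float32Array([-1, 0, 1]);  // linear (identity) transfer curve
  shaper.curve = curve;                      // data is copied here

  curve[2] = -1;   // must not alter the node's behaviour after the fact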

4) AudioBuffer.getChannelData(unsigned long channel)
This is the tricky one. Proposal:
Define a spec-level operation "acquire AudioBuffer contents" which delivers
the current contents of the AudioBuffer to whatever operation needs them,
replaces the AudioBuffer's internal Float32Arrays with new Float32Array
objects containing copies of the data, and neuters the previous
Float32Arrays.
[IMPORTANT: Implementations can and should optimize this operation so that
a) multiple "acquire contents" operations on the same AudioBuffer (with no
intervening calls to getChannelData) return the same shared data; b)
replacing the internal Float32Arrays with new Float32Arrays happens lazily
at the next getChannelData (if any); and thus c) no data copying actually
happens during an "acquire contents" operation. Let me know if this is
unclear; it's terrifically important.]
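
To make that concrete, here's a rough model in JS (acquireContents and
_channels are purely illustrative names, not proposed API; the real operation
lives inside the UA, and ArrayBuffer.prototype.transfer is used here only to
show what neutering/detaching looks like to script):

  function acquireContents(audioBuffer) {
    // audioBuffer._channels stands in for the internal Float32Arrays that
    // getChannelData hands out.
    var old = audioBuffer._channels;
    // Hand the current contents to whatever operation needs them; detaching
    // ("neutering") the old arrays means scripts can no longer race on them.
    var contents = old.map(function (a) {
      return new Float32Array(a.buffer.transfer());
    });
    // Replace the internal arrays with fresh copies so getChannelData keeps
    // working afterwards.
    audioBuffer._channels = contents.map(function (c) {
      return new Float32Array(c);
    });
    return contents;
  }

Under the optimization described above, the "fresh copies" step is deferred
until the next getChannelData call, and repeated acquires keep returning the
same shared contents, so nothing is actually copied inside this operation.
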
Then:
-- Every assignment to AudioBufferSourceNode.buffer "acquires the contents"
of that buffer, and the result is what gets used by the AudioBufferSourceNode
(see the example after this list).
-- Immediately after the dispatch of an AudioProcessingEvent, the UA
"acquires the contents" of the event's outputBuffer. (This is similar to
what the spec already says; the difference is that "acquire contents"
neuters the existing arrays, which is observable and lets the UA avoid a
copy.)
-- Every assignment to ConvolverNode.buffer "acquires the contents" of that
buffer for use by the ConvolverNode.
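
For example, under this proposal:

  var ctx = new AudioContext();
  var buffer = ctx.createBuffer(1, 1024, ctx.sampleRate);
  var data = buffer.getChannelData(0);
  // ... fill data ...

  var source = ctx.createBufferSource();
  source.buffer = buffer;        // acquires the contents of 'buffer'

  console.log(data.length);      // 0: the old array was neutered, so there is
  data[0] = 1;                   // no array left to race on; this write is a no-op

  var data2 = buffer.getChannelData(0);  // a fresh copy; mutating it cannot
                                         // affect what the source node plays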

Additional minor comments:
OfflineAudioCompletionEvent.renderedBuffer should specify that a fresh
AudioBuffer is used for each event. AudioProcessingEvent.inputBuffer and
outputBuffer should specify that fresh AudioBuffers are used for each event.
The text "This AudioBuffer is only valid while in the scope of the
onaudioprocess function. Its values will be meaningless outside of this
scope." is itself meaningless :-). If we specify that inputBuffer is always
a fresh AudioBuffer object, I think nothing else needs to be said.
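
For instance, with a ScriptProcessorNode (which dispatches
AudioProcessingEvents), the following would then be well-defined:

  var ctx = new AudioContext();
  var proc = ctx.createScriptProcessor(1024, 1, 1);
  var lastInput = null;

  proc.onaudioprocess = function (event) {
    // inputBuffer and outputBuffer are fresh AudioBuffer objects every time,
    // so a reference kept from the previous callback is simply a different
    // object, not a mysteriously "invalid" one.
    if (lastInput !== null)
      console.log(event.inputBuffer !== lastInput);   // true
    lastInput = event.inputBuffer;

    // Pass the input through to the output for this block.
    event.outputBuffer.getChannelData(0)
        .set(event.inputBuffer.getChannelData(0));
  };
  proc.connect(ctx.destination);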

Everything above matches what we've actually implemented, except for
createPeriodicWave, which isn't fully implemented yet.

Rob
