[Bug 20698] New: Need way to determine "performance.now()" time of current audio output

https://www.w3.org/Bugs/Public/show_bug.cgi?id=20698

            Bug ID: 20698
           Summary: Need way to determine "performance.now()" time of
                    current audio output
    Classification: Unclassified
           Product: AudioWG
           Version: unspecified
          Hardware: PC
                OS: All
            Status: NEW
          Severity: normal
          Priority: P2
         Component: Web Audio API
          Assignee: crogers@google.com
          Reporter: joe@noteflight.com
        QA Contact: public-audio@w3.org

Use case:

If one needs to display a visual cursor in relation to some onscreen
representation of an audio timeline (e.g. a cursor on top of music notation or
DAW clips), then knowing the real-time coordinates of the audio currently
coming out of the speakers is essential.

However, on any given implementation an AudioContext's currentTime may report
a time that is somewhat ahead of the actual audio signal emerging from the
device, by a fixed amount.  If a sound is scheduled (even very far in advance)
to be played at time T, the sound will actually be heard when
AudioContext.currentTime = T + L, where L is a fixed latency.
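
For illustration, a minimal sketch of that relationship (the latency value L,
the decoded buffer someDecodedBuffer, and the one-second scheduling offset are
assumptions; L is not something the script can currently discover through the
Web Audio API):

    // Hypothetical fixed output latency; in practice the script has no way
    // to query this value today.
    var L = 0.03;  // e.g. 30 ms

    var ctx = new AudioContext();
    var source = ctx.createBufferSource();
    source.buffer = someDecodedBuffer;  // assumed to be decoded elsewhere
    source.connect(ctx.destination);

    var T = ctx.currentTime + 1.0;  // schedule one second ahead
    source.start(T);

    // The sound becomes audible not when currentTime reaches T, but roughly
    // when currentTime reaches T + L, so a cursor drawn from currentTime
    // alone runs ahead of the audible audio by L seconds.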

On Jan 16, 2013, at 2:05 PM, cwilso@google.com wrote:

It's problematic to incorporate the scheduling of other real-time events (even
knowing precisely "what time it is" from the drawing function) without a
better understanding of the latency.

The idea we reached (I think Chris proposed it, but I can't honestly remember)
was to have a performance.now()-referenced clock time on AudioContext that
would tell you when the AudioContext.currentTime reading was taken (or when
that time will occur, if it's in the future); that would allow you to
synchronize the two clocks.  The more I've thought about it, the more I like
this approach - having something like AudioContext.currentSystemTime in the
window.performance.now() reference frame.
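
For illustration, a minimal sketch of how the two clocks could be reconciled,
assuming the proposed (and currently nonexistent) currentSystemTime attribute,
taken here to mean the performance.now() timestamp, in milliseconds, at which
ctx.currentTime was (or will be) valid:

    var ctx = new AudioContext();

    // Map a time on the audio clock (seconds) to the performance.now()
    // clock (milliseconds).
    function audioTimeToPerformanceTime(audioTime) {
      var offsetMs = ctx.currentSystemTime - ctx.currentTime * 1000;
      return audioTime * 1000 + offsetMs;
    }

    // Map a performance.now() timestamp (milliseconds) back to the audio
    // clock (seconds).
    function performanceTimeToAudioTime(perfTimeMs) {
      var offsetMs = ctx.currentSystemTime - ctx.currentTime * 1000;
      return (perfTimeMs - offsetMs) / 1000;
    }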

On Jan 16, 2013, at 3:18 PM, Chris Rogers <crogers@google.com> wrote:

The general idea is that the different underlying platforms/OSs can have very
different latency characteristics, so I think you're looking for a way to
query the system to find out what it is.  I think that something like
AudioContext.presentationLatency is what we're looking for.  Presentation
latency is the time difference between when you tell an event to happen and
the actual time when you hear it.  So, for example, with source.start(0), you
would hope to hear the sound right now, but in reality you will hear it after
some (hopefully) small delay.  One example where this could be useful is if
you're trying to synchronize a visual "playhead" to the actual audio being
scheduled...
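
For illustration, a minimal sketch of driving a visual playhead from the
proposed (and currently nonexistent) presentationLatency attribute; the
drawPlayhead() helper and the playbackStartTime variable are assumptions made
for this example:

    var ctx = new AudioContext();
    var playbackStartTime = 0;  // audio-clock time at which playback started

    function drawFrame() {
      // What the listener hears right now is whatever was scheduled
      // presentationLatency seconds earlier on the audio clock.
      var audibleTime = ctx.currentTime - ctx.presentationLatency;
      drawPlayhead(audibleTime - playbackStartTime);  // offset into the piece
      requestAnimationFrame(drawFrame);
    }
    requestAnimationFrame(drawFrame);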

I believe the goal for any implementation should be to achieve as low a
latency as possible, one which is on par with desktop/native audio software on
the same OS/hardware that the browser runs on.  That said, as with other
aspects of the web platform (page rendering speed, cache behavior, etc.),
performance is something which is tuned (and hopefully improved) over time for
each browser implementation and OS.


Received on Thursday, 17 January 2013 14:15:14 UTC