Bug 14970 - <video> Expose statistics for tracking playback quality (framerate information)
Status: NEW
Product: HTML WG
Classification: Unclassified
Component: HTML5 spec
Version: unspecified
Hardware: All All
Importance: P2 enhancement
Target Milestone: ---
Assigned To: Silvia Pfeiffer
QA Contact: HTML WG Bugzilla archive list
Reported: 2011-11-28 22:44 UTC by Max Kanat-Alexander
Modified: 2013-03-11 22:20 UTC
CC: 13 users

Description Max Kanat-Alexander 2011-11-28 22:44:03 UTC
It would be useful for the <video> element to expose information about framerate.

At least one concrete use-case is aggregating these statistics so that an organization can prove that <video> playback is looking just as good as Flash playback for the same videos, from the user's perspective, across a wide range of clients. Also, an organization may want to use these statistics in the aggregate to make sure that client changes they make don't impact the general viewing experience.

There are currently various quality-related statistics proposed here:

http://wiki.whatwg.org/wiki/Video_Metrics#Proposal
Comment 1 Michael[tm] Smith 2011-11-28 22:51:39 UTC
Note that this bug was split out from bug 12399, which is an LC1 bug.
Comment 2 Eric Carlson 2011-11-29 14:52:49 UTC
(In reply to comment #0)
> It would be useful for the <video> element to expose information about
> framerate.
> 
What exactly does "framerate" mean?
Comment 3 Andrew Scherkus 2011-11-29 18:20:58 UTC
I agree with Eric that "framerate" is ill-defined. A video is composed of frames that are individually timestamped. A frame rate implies that each frame is timestamped at a constant interval, but that isn't always the case.

Check out the notes from the playback metrics session at OVC 2011:
http://openetherpad.org/ovc11-standards-for-browser-video-statistics
Comment 4 Max Kanat-Alexander 2011-11-29 21:35:10 UTC
(In reply to comment #2)
> What exactly does "framerate" mean?

Part of this bug would be defining what statistics we want. Rather than framerate per se, what we really want to know is (a) how well the video is playing for the user in terms of non-network-related aspects, and (b) what's causing it to play poorly or well.
Comment 5 Ian 'Hixie' Hickson 2011-12-07 01:07:06 UTC
If the goal is to "prove that browser-native video renders better than Flash", I'd be concerned about asking the browser for the evidence...
Comment 6 Silvia Pfeiffer 2011-12-07 07:50:07 UTC
(In reply to comment #5)
> If the goal is to "prove that browser-native video renders better than Flash",
> I'd be concerned about asking the browser for the evidence...

That might be a side effect. But what you're really after when measuring the performance of video at the client is the quality at which the video is presented to users. That may have nothing to do with the browser: it can have many different causes, including poor network performance, machine overload from other processes (so the video decoder starves), or a poor video card.

The idea is that if a user complains to a publisher that their experience is bad, the publisher has a means to track down exactly what is causing that poor experience.
Comment 7 Ian 'Hixie' Hickson 2011-12-07 21:01:52 UTC
> The idea is that if a user complains to a publisher that their experience is
> bad that the publisher has a means to track down exactly what is causing that
> poor experience.

Ah well that's an interesting use case that wasn't brought up before.

If that's the use case, it seems like the best API would be something that returned a list of components involved in the display of the video, and for each one gave some sort of performance metric. The components could be UA-defined, since different UAs could have different components, but could e.g. be "network", "decoding", and "display".

Each one would then have an attribute saying what fraction of the media stream it was handling per unit time, and an attribute saying whether this performance was constrained by hardware limitations (e.g. pegging the CPU, the cache, the network, or GPU bandwidth), by software limitations (e.g. the decoding can only happen at the display rate because the software doesn't know how to buffer decoded frames), or whether it was being artificially constrained to maintain a good user experience (e.g. the download could go faster but is being throttled by the client because the user might want to use the bandwidth for other things).

So e.g. if the network was downloading a 30-minute video as fast as it could, at a rate that would take 15 minutes, it would have the value "2" (twice real time) and "hardware" (it's going as fast as it can). We'd probably want some sort of indicator of regularity, too, e.g. to report cases where the decoding is nominally an ideal 25 fps but is actually doing 50 frames one second and zero the next.
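A rough sketch of the shape this proposal describes (this API was never specified; all names here are invented for illustration, and the data is a mock rather than something a browser exposes):

```javascript
// Hypothetical per-component playback stats, as sketched in comment 7.
// `rate` is the fraction of the media stream handled per unit real time
// (2 = twice real time); `constraint` says what is limiting that rate.
function describeComponent(c) {
  return `${c.name}: ${c.rate}x real time, constrained by ${c.constraint}`;
}

// Mock of what a hypothetical video.playbackComponents list might contain.
const components = [
  { name: "network",  rate: 2, constraint: "hardware" }, // link is saturated
  { name: "decoding", rate: 1, constraint: "software" }, // paced to display rate
  { name: "display",  rate: 1, constraint: "policy"   }, // deliberately throttled
];

components.forEach(c => console.log(describeComponent(c)));
```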
Comment 8 Silvia Pfeiffer 2011-12-07 21:42:01 UTC
This is going in the right direction.

The idea of the metrics listed at http://wiki.whatwg.org/wiki/Video_Metrics#Proposal is to provide the measurements needed to calculate, for each individual component, what fraction of the media stream it was handling per unit time:

The network component would report how many bytes of video it has received, since when, and how much of that time was spent waiting. This allows calculating the bitrate at which the video is being received.

The decoding component would report per video and audio track how many bytes it was given to decode and how many it was actually able to decode.

The rendering component would report how many frames it was given by the decoder, how many of those it presented, and how many had to be dropped because they arrived too late.
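The kinds of derived values these counters enable can be sketched as follows (the field names are invented, loosely modeled on the whatwg Video_Metrics proposal; the numbers are made up):

```javascript
// Mock cumulative counters for the three components described above.
const stats = {
  network:   { bytesReceived: 5000000, downloadTimeMs: 10000, waitTimeMs: 2000 },
  decoding:  { bytesDelivered: 4800000, bytesDecoded: 4800000 },
  rendering: { framesDelivered: 750, framesPresented: 720, framesDropped: 30 },
};

// Received bitrate in bits per second, excluding time spent waiting.
const activeMs = stats.network.downloadTimeMs - stats.network.waitTimeMs;
const bitrate = (stats.network.bytesReceived * 8) / (activeMs / 1000);

// Fraction of delivered frames dropped because they arrived too late.
const dropRatio = stats.rendering.framesDropped / stats.rendering.framesDelivered;

console.log(bitrate, dropRatio);
```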

I think these components are generic rather than UA-specific. Likewise, these measures are neither UA-specific nor encoding-format-specific.

There are two ways of approaching these measurements: you can measure from the start of video download, or you can measure over a fixed time window (e.g. 100ms). The latter gives a rate that can be plotted directly, but the former provides more accurate information that can be polled by JS at whatever resolution is required, with the rate calculated from differences between polls.
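The cumulative-counter approach mentioned above can be illustrated with a small sketch (the snapshot objects are invented stand-ins for whatever counters the API would expose):

```javascript
// Derive a rate from two snapshots of a cumulative counter, as comment 8
// suggests: poll at any resolution and difference the readings.
function rateBetween(prev, next) {
  const dtSeconds = (next.timeMs - prev.timeMs) / 1000;
  return (next.framesPresented - prev.framesPresented) / dtSeconds;
}

// Two mock polls of a hypothetical cumulative framesPresented counter.
const t0 = { timeMs: 0,    framesPresented: 0  };
const t1 = { timeMs: 2000, framesPresented: 50 };

console.log(rateBetween(t0, t1)); // frames per second over this interval
```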
Comment 9 Ian 'Hixie' Hickson 2011-12-09 22:10:54 UTC
(In reply to comment #8)
> 
> I think these components are not UA specific, but generic.

We've already received implementation feedback to the contrary, which is why I think it makes sense to make this a UA-defined list of components rather than a fixed list.

For the given use case, it doesn't matter if every browser has the same list, since the use case is specifically about determining why specific cases render poorly.
Comment 10 Ian 'Hixie' Hickson 2012-02-10 00:40:13 UTC
Implementation feedback on the idea in comment 7, intended to address the use case in comment 6, would be helpful at this point.
Comment 11 contributor 2012-07-18 04:38:09 UTC
This bug was cloned to create bug 17803 as part of operation convergence.
Comment 12 Robin Berjon 2013-01-21 15:58:50 UTC
Mass move to "HTML WG"
Comment 13 Robin Berjon 2013-01-21 16:01:36 UTC
Mass move to "HTML WG"