This is an archived snapshot of W3C's public Bugzilla bug tracker, decommissioned in April 2019.

Bug 26462 - <video> Allow VideoTracks and AudioTracks to report the ranges over which they are valid
Summary: <video> Allow VideoTracks and AudioTracks to report the ranges over which they are valid
Status: RESOLVED WONTFIX
Alias: None
Product: WHATWG
Classification: Unclassified
Component: HTML
Version: unspecified
Hardware: Other All
Importance: P3 normal
Target Milestone: Needs Impl Interest
Assignee: Ian 'Hixie' Hickson
QA Contact: contributor
 
Reported: 2014-07-30 00:13 UTC by Ian 'Hixie' Hickson
Modified: 2017-07-21 09:40 UTC
CC List: 6 users

Description Ian 'Hixie' Hickson 2014-07-30 00:13:27 UTC
Philip proposes having seekable and buffered TimeRanges on the AudioTrack and VideoTrack objects so one could tell which ones actually have data for any given time.
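As a rough sketch, the proposed attributes might be used from script like this (the per-track buffered/seekable attributes are the proposal itself and do not exist today; the track list APIs are the standard ones):

  const video = document.querySelector('video');

  video.addEventListener('loadedmetadata', () => {
    // Proposed, not shipped: each VideoTrack/AudioTrack would expose
    // TimeRanges, mirroring HTMLMediaElement.buffered and .seekable.
    for (let i = 0; i < video.videoTracks.length; i++) {
      const track = video.videoTracks[i];
      const ranges = track.buffered;  // hypothetical per-track attribute
      for (let j = 0; j < ranges.length; j++) {
        console.log(track.id || track.label, 'has data from',
                    ranges.start(j), 'to', ranges.end(j));
      }
    }
  });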
Comment 1 Ian 'Hixie' Hickson 2014-07-30 00:13:53 UTC
Aaron, could you describe the use cases you referenced in bug 24977 comment 22?
Comment 2 Aaron Colwell 2014-07-30 16:33:35 UTC
(In reply to Ian 'Hixie' Hickson from comment #1)
> Aaron, could you describe the use cases you referenced in bug 24977 comment
> 22?

Sure. They are similar to ones I've talked about before, where alternate tracks only appear in sections of the timeline. The main case in broadcast television is where the TV show has alternate language tracks, but the commercials do not. One could also imagine a case where multiple camera views are available for a sporting event, but the follow-up programming offers only a single view.

I know in the past you have said such transitions should be handled by requesting a different resource. I suppose that is one solution, but it is unfortunate that we aren't accommodating an existing broadcast model, especially since it appears that we have all the objects in place to do so.


I'm really on the fence about Philip's suggestion. On the one hand, it would provide enough information for the application to determine which tracks are available at different parts of the timeline, both for the types of broadcasts I've mentioned above and for things like chained Ogg files. On the other hand, it feels like it is adding complexity for outlier cases.

I guess my main concern is how we can do something reasonable when the user seeks to a section of the timeline where one of the enabled/selected tracks doesn't exist. Conceptually it seems like the track shouldn't be in the xxxTracks list while you are in a part of the timeline where it doesn't exist, but I realize that this may not be intuitive for some developers, and it isn't clear what to do when the old selected track becomes available again. At least adding/removing the tracks would let applications hook into the addtrack & removetrack events so they could trigger logic to select a new track when the transition occurs.
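To sketch that last point (assuming, purely for illustration, that the UA adds/removes tracks at those transitions and that the stream labels its audio tracks with languages):

  const video = document.querySelector('video');
  const preferredLanguage = 'es';

  // If the track we were listening to disappears (e.g. a commercial break
  // with no alternate-language audio), fall back to whatever is left.
  video.audioTracks.addEventListener('removetrack', (e) => {
    if (e.track.language === preferredLanguage && video.audioTracks.length > 0) {
      video.audioTracks[0].enabled = true;
    }
  });

  // When the preferred-language track reappears, switch back to it.
  video.audioTracks.addEventListener('addtrack', (e) => {
    if (e.track.language === preferredLanguage) {
      e.track.enabled = true;
    }
  });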

Ideally, I'd like to figure out a way to make it "just work" like it does for a TV. I believe this means defining fallback rules and introducing some sort of concept like a "default track", but I don't have a concrete proposal.
Comment 3 Ian 'Hixie' Hickson 2014-07-30 20:16:14 UTC
So the use case would be something like a YouTube video that has a secondary video track occasionally giving different angles: for example, a football game with multiple angles while the game is being shown, and just one angle while the video is showing the commentators. YouTube would then show some custom UI to let the user switch between the different angles, but only while the game itself is being shown?
Comment 4 Aaron Colwell 2014-08-01 16:57:52 UTC
(In reply to Ian 'Hixie' Hickson from comment #3)
> So the use case would be something like a YouTube video that had a secondary
> video track occasionally giving different angles, e.g. a football game with
> multiple angles when the video was showing the game, and just one angle when
> the video was showing the commentators, where YouTube would be showing some
> custom UI to allow the user to switch between the different angles, but only
> while the game itself is being shown?

Yes, that example would work. Presumably custom controls would not necessarily be needed, since the default controls could also facilitate track selection. I think the key question is: what happens when the selected track "goes away" or "ends" but the presentation itself hasn't ended yet?

My main concern is that it feels like something needs to be defined at the UA level because the web application may not be able to react fast enough to provide a seamless transition.
Comment 5 Ian 'Hixie' Hickson 2014-08-01 19:59:39 UTC
Well if there's no custom control, there's no need for the API, right?

What _should_ happen when you're on a track and that track has no data? Does it matter if the track has no data for the next hour vs the next 5 seconds? Does it matter if it's audio (multiple tracks can be enabled at once) or video (one at a time)? What if you're in a media controller situation with one media resource playing in two <video>s, one showing the main video track and the other showing the sign-language track, where the sign-language track only has data while there's speech to sign?
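For reference, that last scenario is roughly the following setup (a sketch using mediaGroup/MediaController as specced at the time, and assuming the resource marks its sign-language video track with kind "sign"):

  // Two <video> elements share one resource and stay in sync via a
  // MediaController; one shows the main video track, the other the
  // sign-language track.
  function setUpSignLanguageView(src) {
    const main = document.createElement('video');
    const signing = document.createElement('video');
    main.src = signing.src = src;
    main.mediaGroup = signing.mediaGroup = 'programme';  // shared controller

    signing.addEventListener('loadedmetadata', () => {
      for (let i = 0; i < signing.videoTracks.length; i++) {
        const track = signing.videoTracks[i];
        if (track.kind === 'sign') track.selected = true;  // assumed labelling
      }
    });

    document.body.append(main, signing);
    return main.controller;  // the shared MediaController
  }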
Comment 6 Ian 'Hixie' Hickson 2014-09-08 23:37:11 UTC
I've defined the case of missing media data as just being silent/transparent black, so that resolves most of comment 5.

If it's just something for the UA, then as far as I can tell, the browser can already do this. Is the problem here just for UAs? Or do we need an API?

Is there implementation interest for an API from more than one vendor here?
Comment 7 Robert O'Callahan (Mozilla) 2014-09-09 01:22:07 UTC
I'm not opposed to adding per-track seekable and bufferedRanges, but this isn't something that's important to us at the moment.
Comment 8 Aaron Colwell 2014-09-09 01:58:14 UTC
(In reply to Robert O'Callahan (Mozilla) from comment #7)
> I'm not opposed to adding per-track seekable and bufferedRanges, but this
> isn't something that's important to us at the moment.

+1. I don't think this is high priority for us either.

For MSE at least, the SourceBuffer.buffered attribute provides enough info and is essentially a proxy for per-track information. Large MSE users like Netflix and YouTube tend to use demuxed data anyway, where each SourceBuffer only handles one track.
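For example, in a demuxed MSE setup the per-track ranges fall out naturally (codec strings here are placeholders):

  const video = document.querySelector('video');
  const ms = new MediaSource();
  video.src = URL.createObjectURL(ms);

  ms.addEventListener('sourceopen', () => {
    // One SourceBuffer per track, so each buffer's .buffered attribute is
    // effectively per-track information.
    const videoBuf = ms.addSourceBuffer('video/webm; codecs="vp9"');
    const audioBuf = ms.addSourceBuffer('audio/webm; codecs="opus"');

    // ... append media segments to videoBuf/audioBuf as they arrive ...

    // Later, e.g. before seeking, inspect what the audio track has buffered:
    const r = audioBuf.buffered;
    for (let i = 0; i < r.length; i++) {
      console.log('audio buffered', r.start(i), '-', r.end(i));
    }
  });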
Comment 9 Ian 'Hixie' Hickson 2014-09-09 16:08:12 UTC
Ok, I'm marking this as "Needs Implementor Interest"; when you decide you're ready to implement something like this, mention it here and I'll spec it.
Comment 10 Anne 2017-07-21 09:40:35 UTC
If this becomes important to someone at some point, please file an issue at https://github.com/whatwg/html/issues/new. Thanks!