This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019. Please see the home page for more details.

Bug 20714 - timestampOffset in live case
Summary: timestampOffset in live case
Status: RESOLVED LATER
Alias: None
Product: HTML WG
Classification: Unclassified
Component: Media Source Extensions
Version: unspecified
Hardware: PC Windows NT
Importance: P2 normal
Target Milestone: ---
Assignee: Adrian Bateman [MSFT]
QA Contact: HTML WG Bugzilla archive list
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2013-01-20 16:32 UTC by Cyril Concolato
Modified: 2013-03-25 19:14 UTC
4 users

See Also:


Attachments
example of a timeline after joining a live stream 2h after the start (4.08 KB, image/png)
2013-01-21 11:31 UTC, Cyril Concolato

Description Cyril Concolato 2013-01-20 16:32:10 UTC
When joining a live session, the timestamp of the first coded frame received is typically not zero. For proper playback (i.e. so that the player does not stall), the web app needs to set the timestampOffset attribute of the SourceBuffer. In some cases, such as MPEG-DASH, an approximate timestamp might be provided in the manifest. In many cases, however, the web app will have to parse the media data in JavaScript, which is not optimal. It would be preferable to be able to force an automatic offset, or to set/force the timestamp to zero. Alternatively, it could be interesting to have a method that asks the media engine to parse some data and return its timestamp range, so that the app can then transfer the data to the SourceBuffer with the right timestampOffset.
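[Editor's note: the manifest-driven workflow described above can be sketched as follows. This is a minimal illustration, not code from the report; `offsetForLiveJoin` and `manifestStartTime` are hypothetical names, and the MIME type is just an example.]

```javascript
// Hypothetical helper: compute the timestampOffset that maps a live stream
// whose first coded frame has timestamp `firstFrameTimestamp` (in seconds)
// to t=0 on the media timeline.
function offsetForLiveJoin(firstFrameTimestamp) {
  return -firstFrameTimestamp;
}

// Browser-only usage, guarded so the pure helper above also works outside
// a browser. Assumes the manifest supplies an approximate start time for
// the first segment, as DASH sometimes does via presentationTimeOffset.
if (typeof MediaSource !== "undefined") {
  const mediaSource = new MediaSource();
  mediaSource.addEventListener("sourceopen", () => {
    const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    const manifestStartTime = 7200; // e.g. joining 2h after the stream began
    sb.timestampOffset = offsetForLiveJoin(manifestStartTime);
    // ...fetch the first media segment and append it here...
  });
}
```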
Comment 1 Steven Robertson 2013-01-20 19:03:29 UTC
I think the assumption that playback must begin at t=0 may be wrong here. Are you encountering problems when you append media data starting at t>0, and then seeking to that time to begin playback? If so, that's an implementation bug, not a spec issue.

Note that you can append media data and then examine the buffered ranges to determine the timestamps of the media data that was appended.

By what means are you fetching media data? In most scenarios (e.g. DASH, HLS) the timestamp of the media data being fetched is provided in the manifest which describes chunk URLs, so you don't even need to wait for parsing to finish.

(IMHO, t=0 should be the natural start time of the stream, not the join time. This lets users watching on multiple devices (or users who refresh the page) have the same timestamps; it allows users to rewind to before they started watching a stream; it allows people to talk about and link to particular timecodes in a stream, and have them be consistent; and it allows the same content to be used in a VOD setting after the live presentation has concluded without different player logic. Of course, all of that is application-level policy, but it has bearing on this issue because it suggests that the proposed "force the timestamp to zero" functionality may not be widely used.)
Comment 2 Cyril Concolato 2013-01-21 11:31:55 UTC
Created attachment 1316 [details]
example of a timeline after joining a live stream 2h after the start
Comment 3 Adrian Bateman [MSFT] 2013-01-29 04:08:45 UTC
(In reply to comment #2)
> Created attachment 1316 [details]
> example of a timeline after joining a live stream 2h after the start

Apps can draw whatever custom UI they like for the seek buffer. It's probably quite likely that apps with live content will want a custom UI indicating whether it is possible to seek back and by how much.
Comment 4 Cyril Concolato 2013-01-29 07:35:00 UTC
Had problems replying the first time. Replying again.

(In reply to comment #1)
> I think the assumption that playback must begin at t=0 may be wrong here.
> Are you encountering problems when you append media data starting at t>0,
> and then seeking to that time to begin playback? If so, that's an
> implementation bug, not a spec issue.
I'm not having problems appending; the problem is mostly with the UI, as depicted in the attachment.

> 
> Note that you can append media data and then examine the buffered ranges to
> determine the timestamps of the media data that was appended.
Yes, but you cannot change the timestamps any more. I think it's a general problem that in order to append properly (i.e. without creating a gap) you need to parse the data first to get the timestamp.

> 
> By what means are you fetching media data? In most scenarios (e.g. DASH,
> HLS) the timestamp of the media data being fetched is provided in the
> manifest which describes chunk URLs, so you don't even need to wait for
> parsing to finish.
In my understanding, HLS does not provide such a timestamp. DASH provides the presentationTimeOffset attribute, but only in some cases. In my case, I'm also envisaging simple chunked-transfer delivery of live data without a manifest, similar to IceCast and ShoutCast.
 
> (IMHO, t=0 should be the natural start time of the stream, not the join
> time. This lets users watching on multiple devices (or users who refresh the
> page) have the same timestamps; it allows users to rewind to before they
> started watching a stream; it allows people to talk about and link to
> particular timecodes in a stream, and have them be consistent; and it allows
> the same content to be used in a VOD setting after the live presentation has
> concluded without different player logic. Of course, all of that is
> application-level policy, but it has bearing on this issue because it
> suggests that the proposed "force the timestamp to zero" functionality may
> not be widely used.)
I agree that your use case makes sense, but you assume that a live event is a one-shot event for which streaming starts at the beginning of the event (timestamp = 0). If you record a single program on a 24/7 TV channel, the timestamp of the first AU (access unit) won't be zero, and you'd have the same problem.

On second thought, maybe the problem is not about shifting the timestamps but about properly indicating to the browser that it will never be able to seek before the first data received, so that it can adjust the timeline UI. I'm unclear about how the buffered and seekable attributes are filled in with MSE (and HTML5 media elements in general).
Comment 5 Cyril Concolato 2013-01-29 07:37:21 UTC
(In reply to comment #3)
> (In reply to comment #2)
> > Created attachment 1316 [details]
> > example of a timeline after joining a live stream 2h after the start
> 
> Apps can draw whatever custom UI they like for the seek buffer. It's
> probably quite likely that apps with live content will want a custom UI
> indicating whether it is possible to seek back and by how much.
Why wouldn't it be possible for the app to indicate directly to the built-in UI that seek is not possible (either before the first data or outside of a seeking window), without requiring custom UI?
Comment 6 Adrian Bateman [MSFT] 2013-01-29 14:26:27 UTC
(In reply to comment #5)
> (In reply to comment #3)
> > (In reply to comment #2)
> > > Created attachment 1316 [details]
> > > example of a timeline after joining a live stream 2h after the start
> > 
> > Apps can draw whatever custom UI they like for the seek buffer. It's
> > probably quite likely that apps with live content will want a custom UI
> > indicating whether it is possible to seek back and by how much.
> Why wouldn't it be possible for the app to indicate directly to the built-in
> UI that seek is not possible (either before the first data or outside of a
> seeking window), without requiring custom UI?

Everything is possible if it is specified. There are lots of different options that people might want - our goal for v1 has been to keep it simple and address the most common goals intrinsically, gain implementation experience and see what is common in sites and libraries, and then consider enhancements subsequently.
Comment 7 Cyril Concolato 2013-01-29 15:26:07 UTC
(In reply to comment #6)
> (In reply to comment #5)
> > (In reply to comment #3)
> > > (In reply to comment #2)
> > > > Created attachment 1316 [details]
> > > > example of a timeline after joining a live stream 2h after the start
> > > 
> > > Apps can draw whatever custom UI they like for the seek buffer. It's
> > > probably quite likely that apps with live content will want a custom UI
> > > indicating whether it is possible to seek back and by how much.
> > Why wouldn't it be possible for the app to indicate directly to the built-in
> > UI that seek is not possible (either before the first data or outside of a
> > seeking window), without requiring custom UI?
> 
> Everything is possible if it is specified. There are lots of different
> options that people might want - our goal for v1 has been to keep it simple
> and address the most common goals intrinsically, gain implementation
> experience and see what is common in sites and libraries, and then consider
> enhancements subsequently.
I understand. Consider this bug report as feedback from using the current spec for live cases. I think enhancements are needed; whether or not they should be part of v1 ... maybe.
Comment 8 Aaron Colwell (c) 2013-01-30 00:24:35 UTC
Hi Cyril,

If I understand correctly, I think you are bringing up several different issues.

1. The default UI for streams that don't start at 0 is suboptimal.
2. There is no way to disable seeking.
3. Media data has to be parsed in JavaScript to force a 0 start time.

For items 1 & 2, I think you have competing requirements for display that may be specific to your personal preferences for live controls. Like Adrian says, custom UI controls can make it easier for you to control seeking and display the timeline however you wish. In the case of live, it could be considered a feature that you can seek back into the buffered data.

For item 3, you don't have to parse the data in JavaScript. You could do the following to make the timeline start at 0 without parsing the data in JavaScript:

sourceBuffer.append(firstMediaSegment);                          // probe append, to learn the segment's timestamps
sourceBuffer.timestampOffset = -sourceBuffer.buffered.start(0);  // shift subsequent appends so the timeline starts at 0
sourceBuffer.remove(sourceBuffer.buffered.start(0), sourceBuffer.buffered.end(0)); // discard the probe data (the offset only affects future appends)
sourceBuffer.append(firstMediaSegment);                          // re-append at the new offset

You only have to do this once, at the beginning, and it will make the timeline start at 0. I think this will make the default UI look closer to what you want, but the displayed time will be 0-based instead of starting at the join time. I've used this trick with some internal MSE live demos I've written, so I know it works.
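[Editor's note: the trick above can also be written against the later asynchronous appendBuffer()/updateend model of MSE, which is a sketch under that assumption rather than the API as it stood in 2013; `rebaseToZero` is an illustrative name.]

```javascript
// Append once to learn the segment's timestamps, rewind the timeline to 0,
// then re-append. Each buffer operation completes on an "updateend" event.
function rebaseToZero(sourceBuffer, firstMediaSegment, done) {
  const onFirstAppend = () => {
    sourceBuffer.removeEventListener("updateend", onFirstAppend);
    const start = sourceBuffer.buffered.start(0);
    const end = sourceBuffer.buffered.end(0);
    sourceBuffer.timestampOffset = -start; // future appends land at t=0
    const onRemove = () => {
      sourceBuffer.removeEventListener("updateend", onRemove);
      const onReappend = () => {
        sourceBuffer.removeEventListener("updateend", onReappend);
        done();
      };
      sourceBuffer.addEventListener("updateend", onReappend);
      sourceBuffer.appendBuffer(firstMediaSegment); // re-append at the new offset
    };
    sourceBuffer.addEventListener("updateend", onRemove);
    sourceBuffer.remove(start, end); // discard the probe data
  };
  sourceBuffer.addEventListener("updateend", onFirstAppend);
  sourceBuffer.appendBuffer(firstMediaSegment); // probe append
}
```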

I would prefer to keep things as they are for v1. I don't think what I propose here will be too high a burden for web developers. If it is, then we'll use their feedback to craft a solution for v2.
Comment 9 Cyril Concolato 2013-02-15 10:55:28 UTC
Hi Aaron,

(In reply to comment #8)
> Hi Cyril,
> 
> If I understand correctly, I think you are bringing up several different
> issues.
> 
> 1. The default UI for streams that don't start at 0 is suboptimal.
> 2. There is no way to disable seeking.
> 3. Media data has to be parsed in JavaScript to force a 0 start time.
I think you've summarized the issues quite well. 

> For items 1 & 2, I think you have competing requirements for display that
> may be specific to your personal preferences for live controls. 
'Competing' with each other, or with someone else's requirements? The latter, I presume.

> Like Adrian
> says, custom UI controls can make it easier for you to control seeking and
> display the timeline however you wish. In the case of live, it could be
> considered a feature that you can seek back into the buffered data.
I agree, although reading the spec it is not clear to me when a buffered range is seekable. The 'seekable' attribute, given its name, is clearer.
Anyway, there are cases where you know you won't be able to seek outside the range that has been passed to the SourceBuffer. There should be a way to inform the built-in UI about that.

> 
> For item 3, you don't have to parse the data in JavaScript. You could do the
> following to make the timeline start at 0 without parsing the data in JavaScript.
> 
> sourceBuffer.append(firstMediaSegment);
> sourceBuffer.timestampOffset = -sourceBuffer.buffered.start(0);
> sourceBuffer.remove(sourceBuffer.buffered.start(0),
> sourceBuffer.buffered.end(0))
> sourceBuffer.append(firstMediaSegment);
> 
> You only have to do this once at the beginning and it will make the timeline
> start at 0. I think this will make the default UI look closer to what you
> want, but the displayed time will be 0 based instead of starting at the join
> time. I've used this trick with some internal MSE live demos I've written so
> I know it works.
This might have consequences for the media element state, trigger events, and so on, which you might not want.
Comment 10 Aaron Colwell (c) 2013-03-12 22:01:16 UTC
I believe calling abort("timestampOffset") before appending anything will now allow an application to make sure the presentation always starts at zero, without having to do the elaborate dance I outlined in a previous comment. The timestampOffset attribute will reflect the offset being applied once the first append completes, so custom UIs can adjust the display of the current playback time if necessary.

We could add the following attribute if you really want to be able to signal that HTMLMediaElement.seekable should return an empty TimeRanges:

partial interface MediaSource {
    attribute boolean seekable;
};

- Initially set to true.
- If true then existing behavior will be used.
- If false then an empty TimeRanges object will be returned by HTMLMediaElement.seekable.
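[Editor's note: the proposed semantics can be expressed as a small pure function. This is an illustrative sketch; the attribute above was only a proposal, and `effectiveSeekable` is a hypothetical name.]

```javascript
// Given the proposed MediaSource.seekable flag and the ranges the element
// would otherwise report, compute what HTMLMediaElement.seekable should
// return. Ranges are modeled as [start, end] pairs instead of a real
// TimeRanges object.
function effectiveSeekable(mediaSourceSeekable, existingRanges) {
  return mediaSourceSeekable ? existingRanges : []; // empty when flag is false
}
```

With the flag set to false the built-in UI would see no seekable range at all, which is a coarser control than the seek-window signaling Cyril asked about.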
Comment 11 Adrian Bateman [MSFT] 2013-03-25 17:56:04 UTC
(In reply to comment #10)
> We could add the following attribute if you really want to be able to signal
> that
> HTMLMediaElement.seekable should return an empty TimeRanges.
> partial interface MediaSource {
>     attribute boolean seekable;
> };
> 
> - Initially set to true.
> - If true then existing behavior will be used.
> - If false then an empty TimeRanges object will be returned by
> HTMLMediaElement.seekable.

I think applications are going to want more control than seekable or not seekable. As I said before, I don't think this is necessary as a v1 feature. Applications can build custom UI with whatever seek logic they want. Once we get more implementation experience of what people commonly want to build, especially for the live scenario, we can consider what platform support this requires.

I recommend RESOLVE, LATER.
Comment 12 Aaron Colwell (c) 2013-03-25 19:14:49 UTC
Sounds reasonable to me.