

Multi-device Timing Community Group

Timing mechanisms allow operations to be executed at the correct time. The Web already has several mechanisms supporting timed operations, including setTimeout and setInterval, as well as controllers for media frameworks and animations. However, the Web lacks support for multi-device timing. A multi-device timing mechanism would allow timed operations across Web pages hosted by different devices. Multi-device timing is particularly important for the broadcasting industry, as it is the key enabler for web-based secondary device offerings. More generally, multi-device timing has wide utility in communication, collaboration and multi-screen presentation. This Community Group aims to define a common, multi-device, timing mechanism and a practical programming model. This will improve the Web as a platform for time-sensitive, multi-device Web applications. Charter: http://webtiming.github.io


Note: Community Groups are proposed and run by the community. Although W3C hosts these conversations, the groups do not necessarily represent the views of the W3C Membership or staff.

Drafts / licensing info

Timing Object Draft Specification


HbbTV 1.5 experiments

Hi all,

In collaboration with Vicomtech-IK4, we did some tests on HbbTV 1.5 to see whether we could exploit Shared Motion [1] to add sync capabilities to existing smart TVs.  HbbTV 1.5 has no explicit support for synchronization, and while HbbTV 2.0 will bring this, many existing smart TVs will never get that upgrade.  If you are interested and have knowledge of HbbTV, we welcome any input on this initial experiment.

We quickly discovered that the TV's media element is unable to provide a good user experience when slaved to a Shared Motion.  It lacks variable playback rate, and skip operations are very slow.  Our approach was therefore to request the media element to play from a given position.  This is not very accurate, but instead of trying to correct playback on the TV, we adjust the Shared Motion to match what the TV actually does.  In this way, we have re-created a master-slave relation, with one master (the TV/Chromecast) and as many slaves as you want.
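To illustrate the idea (this is only a sketch, not the code we ran, and not the exact Shared Motion API): the shared timeline is periodically nudged to whatever position the TV reports. The query()/update() calls below follow the timing object draft vector; the names, interval and threshold are placeholders.

  // Sketch: let the shared timeline follow the TV instead of the other way around.
  // "motion" is assumed to expose query()/update() as in the timing object draft;
  // "video" is the TV's media element. Interval and threshold are illustrative only.
  function followTV(video, motion) {
    setInterval(function () {
      if (video.paused) return;
      var reported = video.currentTime;        // position reported by the TV
      var shared = motion.query().position;    // current shared timeline position
      // Only correct the shared motion when the TV has drifted noticeably,
      // so that small currentTime fluctuations are ignored.
      if (Math.abs(reported - shared) > 0.5) {
        motion.update({position: reported, velocity: video.playbackRate});
      }
    }, 1000);
  }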

Here is a video we made of our experiment with a Panasonic TV [2]:

Interestingly, we see that the currentTime reported by this TV fluctuates within a window of about 250 ms.  We are, however, able to select the better samples and in that way provide a consistent experience.  The TV we tested did need calibration, but this seemed to be hardware-specific and consistent across skips, reloads and other content.

This test is of course IP-based.  We have asked for input on doing the same for broadcast content. It appears we would only have stream events to provide an estimated time (possibly accurate to within a second), but perhaps even relatively rough estimates of the current position could be extracted and used to make user-friendly transitions between broadcast and IP-delivered content?

Of course, HbbTV 2.0 devices should be much better at all of this, and provide local synchronization to boot.  However, we believe this experiment opens up an interesting transition phase, where current smart TVs can provide at least some additional functionality for a vast number of users.

If there is any interest in testing other HbbTV devices, we're very willing to provide a simple web application for testing, including manual calibration.  Please let us know!

Regards, Njål

[1]: Motion Corporation

[2]: https://youtu.be/Be_z4MiY9oI

Multi-device timing at NABShow 2016

Njål and I are going to NABShow 16-21 April in Las Vegas [1,2] to promote the timing object and the W3C Multi-device Timing CG. If you, or any of your colleagues, are attending NAB, please come by our booth in the Future Park [3].

The booth is hosted by EU FP7 project MediaScape [4]. The invitation came as a result of demos at IBC 2015, where MediaScape project lead partner Vicomtech [5] showed flexible and tightly synced multi-device adaptation in regular Web browsers. The MediaScape project uses the Shared Motion approach to distributed timing and control in Web browsers, and has been central in pushing for standardization of the Timing Object through the Multi-device Timing CG initiative.

Vicomtech, represented by Mikel Zorilla and Esther Novo, will demonstrate the many fruits of the MediaScape approach, with a particular emphasis on multi-device adaptation, while Norut (Njål and I) will focus on distributed control and synchronization for IP-based services.

Also, if you are interested in discussing the commercial opportunities implied by Web-based timing, you may set up a meeting through our American partners, Glen Sakata [6] or Chris Lennon [7] at MediAnswers [8]. They have a long track record within the American broadcasting industry, and a keen understanding of the opportunities the Timing Object and Shared Motion can offer to the industry.

Best, Ingar

[1] http://www.nabshow.com

[2] NAB: premier trade association for America’s radio and TV broadcasters

[3] Booth no. SU16713, booth name Vicomtech-IK4. On the exhibition floor plan, go to the South Hall (Upper); the Future Park is on the right-hand side.

[4] http://mediascapeproject.eu/

[5] http://www.vicomtech.org/

[6] gsakata@medianswers.tv

[7] clennon@medianswers.tv

[8] http://medianswers.tv/

Multi-device timing in W3C highlights

The Multi-device Timing CG is mentioned in the recent W3C highlights of March 2016 [1], under the entertainment section. Web synchronization is also mentioned as overlapping work in the telecommunications section. Finally, cross-device synchronization is one of three headlight topics for 2016.

Though not explicitly referenced, multi-device temporal control is also relevant for other highlight activities, such as the Second Screen Working Group (synchronization and distributed control), HTML Media (timing provides Web apps with more flexible control over video content, as an addition or alternative to MSE), and Digital Marketing (ads exploiting temporal context, timed multi-screen presentations, etc.).

[1] https://www.w3.org/2016/03/w3c-highlights/ 

Inter-device sync

Hi all. After a bug report about our variable playbackRate detector being too eager, I ran a proper stress test of the new media sync code in Firefox and Chrome on Linux.  I made a video if you want to have a look.  Modern browsers are amazing. 🙂

currentTime reporting, Chromium on Android

Hi all,

I’d like to give a quick update on some developments in Chrome, and on some relevant points we have discovered.

The Unified Media Pipeline (which can be enabled on the Android Dev and Beta channels via chrome://flags) gives us variable playback rates on Android, allowing for much tighter sync and a nicer user experience. A bug report tracking this development can be found at [1].
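For readers unfamiliar with why variable playback rate matters: with it, a player can be nudged toward the shared timeline instead of skipping. A simplified version of the idea (not the actual code of our sync library) looks roughly like this; the gain and thresholds are illustrative, and "to" is assumed to expose query() as in the timing object draft.

  // Sketch: keep a media element close to a timing object by nudging playbackRate.
  // Thresholds and gain are illustrative, not the values used by our sync library.
  function nudge(video, to) {
    var skew = to.query().position - video.currentTime;  // positive: video is behind
    if (Math.abs(skew) > 1.0) {
      video.currentTime += skew;                          // far off: skip to the target
      video.playbackRate = 1.0;
    } else {
      // small errors: speed up or slow down slightly instead of skipping
      video.playbackRate = 1.0 + Math.max(-0.25, Math.min(0.25, skew));
    }
  }
  // e.g. setInterval(function () { nudge(video, timingObject); }, 500);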

During testing we did, however, discover some discrepancies (which we have communicated), and I think this reflects a need for developers to have good testing systems for timing-sensitive issues. The particular issue we noticed is that currentTime appears slightly off, and it is off by a different amount for audio and video. This produces an audible echo when played back on multiple devices within hearing range.  EDIT: This discrepancy was actually caused by another change that made us prematurely detect variable playback rate as broken.  After fixing that, there are still some open issues on some devices, but the difference between media types is gone.  Of course, video has a higher bitrate and is therefore also more prone to errors on resource-limited devices.

The reporting of currentTime is quite critical for multi-device playback, with two main concerns:

  1. consistency – how much does currentTime reporting vary?
  2. correctness – how closely does currentTime reflect the actual player state?

This leads to the question: how do developers measure the correctness of currentTime reporting? Consistency is quite easy to measure; we even have a page for that [2]. For our sync library, we smooth the values, which seems to work quite well. Correctness is trickier, and the way we have done it for now is to pick a reference player (we use Chrome on a Linux machine) and play back synchronized content in both browsers. An example can be seen in [3], testing Chrome vs Firefox about a year ago. Chrome produced some audio artifacts as we synchronized it back then; you can hear them in the right ear.
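To make the consistency part concrete: it boils down to comparing currentTime against a monotonic clock and looking at how much the difference varies, and smoothing is essentially averaging that difference. A minimal sketch of the idea (not the code behind [2] or our sync library, and assuming playbackRate 1.0):

  // Sketch: sample (performance.now, currentTime) pairs during steady playback.
  // If reporting were perfect, the difference between the two clocks would stay constant;
  // the spread of the differences measures consistency, their average gives a smoothed skew.
  function sampleSkew(video, count, done) {
    var diffs = [];
    var timer = setInterval(function () {
      if (video.paused) return;
      diffs.push(video.currentTime - performance.now() / 1000);
      if (diffs.length >= count) {
        clearInterval(timer);
        var spread = Math.max.apply(null, diffs) - Math.min.apply(null, diffs);
        var avg = diffs.reduce(function (a, b) { return a + b; }, 0) / diffs.length;
        done({spread: spread, smoothedSkew: avg});   // both in seconds
      }
    }, 100);
  }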

If I’m not mistaken, the currentTime reported to JavaScript is actually specified as being an estimate, as it is a snapshot that does not change during execution of JS. We got quite good results when sampling currentTime immediately after a timeupdate event has been emitted, but this varies between browsers. I actually believe a bug report could be filed against HTML5 on this, as very little guidance is provided to implementers when it comes to the correctness of currentTime.

Possible suggestions for a solution could be:

  1. Add a new currentTimeTS property, providing a tuple of [currentTime, timestamp] to reflect when the sample was taken. This would allow us to compensate for the time that passes between taking the sample and reading the value.
  2. Require that currentTime be updated to a correct value immediately before the timeupdate event is emitted. This would likely mean saving [currentTime, timestamp] internally and then estimating currentTime just before reporting it.
  3. Add a currentTime (or currentTimeTS) property to the timeupdate event itself, similar to 1 or 2.

Solution 1 could even be writable, providing a way to specify that players need to compensate for buffering time, latency of decoding subsystems, etc. This would give us a solid foundation for very nice user experiences, as a skip during playback could then behave as if playback resumed immediately, even if nothing is visible yet due to a lack of data.
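To illustrate suggestion 1 (nothing below exists in any browser today; currentTimeTS is the hypothetical property proposed above), a consumer would compensate for the age of the sample roughly like this:

  // Hypothetical: currentTimeTS is the proposed [currentTime, timestamp] pair.
  // The reader compensates for the time elapsed since the sample was taken.
  function estimateCurrentTime(video) {
    var t = video.currentTimeTS;                   // proposed property, not part of HTML5 today
    var age = (performance.now() - t[1]) / 1000;   // seconds since the sample was taken
    return t[0] + age * video.playbackRate;
  }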

I’d love to hear about any experiences or suggestions you might have in this respect.

Njål
[1]: https://bugs.chromium.org/p/chromium/issues/detail?id=263654
[2]: http://fanoli01.itek.norut.no/Analysis/
[3]: https://www.youtube.com/watch?v=lfoUstnusIE

Multi-device Timing has come to the Web

Hi all.

This announces the successful integration of Shared Motion by Motion Corporation as the first timing provider for the timing object.

http://webtiming.github.io/timingsrc/

The timingsrc library is now complete, with source code, documentation, example code and live demos.

In other words: a new programming model for timed multi-device Web applications is ready for your experimentation.
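If you want a quick feel for the programming model before diving into the documentation, the core of it is the timing object's vector together with its query/update pair. The sketch below follows the draft spec and the timingsrc documentation, but treat the exact names (the TIMINGSRC global, the "change" event) as assumptions and consult the link above.

  // Sketch of the programming model; verify names against the timingsrc documentation.
  var to = new TIMINGSRC.TimingObject();   // hook up a timing provider here for multi-device use
  to.on("change", function () {
    var v = to.query();                    // {position, velocity, acceleration, timestamp}
    console.log("timeline at", v.position, "moving at", v.velocity);
  });
  // start the shared timeline at position 0, moving at 1 unit per second
  to.update({position: 0.0, velocity: 1.0});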

Enjoy – and please help spread the word!

Best,

Ingar and Njål

Web documentation for timingsrc

Hi all.

I announced https://github.com/webtiming/timingsrc a little earlier, a GitHub repository for the Multi-device Timing Community Group. The repository currently includes an implementation of the TimingObject, along with TimingConverters and tools for sequencing and media synchronization.

I also want to announce that the documentation for this code will be available at

http://webtiming.github.io/timingsrc/

Best regards, Ingar

First implementation of Timing Object draft spec

Hi all

I’ve created a new repository https://github.com/webtiming/timingsrc for the Multi-device Timing Community Group.

The repository now holds a near-complete JavaScript implementation of the Timing Object draft spec. The repository additionally includes key tools and concepts to use with the TimingObject when programming timed Web applications:

  • timing converters for chaining timing objects (see the sketch after this list)
  • sequencing tools (Sequencer, IntervalSequencer, SetPointCallback, SetIntervalCallback)
  • mediasync for synchronization of HTML5 media
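The converter idea is simple: a converter is itself a timing object, but its query() presents a transformed view of its parent's timeline. A stripped-down illustration of a skew converter (this shows the concept only, not the timingsrc implementation):

  // Concept illustration, not the timingsrc code: a skew converter behaves like a
  // timing object whose position is offset from its parent's position by a fixed amount.
  function SkewedView(parent, skew) {
    this.query = function () {
      var v = parent.query();   // {position, velocity, acceleration, timestamp}
      return {
        position: v.position + skew,
        velocity: v.velocity,
        acceleration: v.acceleration,
        timestamp: v.timestamp
      };
    };
  }
  // e.g. var subtitleTimeline = new SkewedView(to, -2.0);  // runs 2 s behind the main timeline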

Support for timing providers is currently not included (but will be added quite soon, thereby opening up multi-device timing).

The timing object, with its associated concepts, effectively defines a new programming model for precisely timed Web pages and Web applications. Even without support for precise distributed timing and control (through timing providers), it should still represent a significant improvement on the state of the art. Further improvement to the synchronization of HTML5 media likely requires timing support from Web browsers.

You are all invited to start playing with this programming model – and to share your feedback with the group. The code is licensed under LGPL.

The current implementation deviates from the spec on a few minor points. For some of these points I’ll be suggesting changes to the spec. I’ll come back to this later though.

Ingar

 

Media Sync for Timing Object

Hi all.

This announces the publication of MediaSync, a generic and open-source JavaScript library for timed (synchronized) playback of HTML5 audio and video. The MediaSync library time-aligns media playback with the progression of the timing object.

http://webtiming.github.io/mediasync/

http://webtiming.github.io/timingsrc/

The timing object draft spec defines *timed playback mode* as a possible extension to existing standards for HTMLMediaElements [1]. However, until standardization and implementation are a reality, the Multi-device Timing CG makes available a generic and open-source JavaScript library for timed playback, integrated with the timing object/shared motion. MediaSync is published [2] as research output from MediaScape [3], an ongoing FP7 EU project. As part of the dissemination and standardization work of MediaScape, MediaSync is also made available to the Multi-device Timing CG. The Web page linked above includes API documentation, example usage and demonstrations. Source code is available at [2].
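Usage is meant to be roughly a one-liner per media element. The sketch below shows the intended shape only; the exact entry point and option names should be taken from the documentation linked above, not from this sketch.

  // Sketch only; check the MediaSync documentation for the exact entry point and options.
  var video = document.getElementById("player");
  var to = new TIMINGSRC.TimingObject();     // or a timing object backed by Shared Motion
  var sync = MediaSync(video, to);           // keeps video.currentTime aligned with "to"
  to.update({position: 0.0, velocity: 1.0}); // everything bound to "to" starts playing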

And, not to forget, integration with timing object/shared motion enables HTML5 video and audio to take part in globally synced multi-device playback!

Njål and Ingar

[1]: http://webtiming.github.io/timingobject/#media-elements-and-the-timing-object

[2]: https://github.com/mediascape/mediasync

[3]: http://mediascapeproject.eu/

 

Sequencer for Timing Object

Hi all.

This announces the publication of a generic and open-source JavaScript Sequencer for the timing object.

http://webtiming.github.io/sequencer/

http://webtiming.github.io/timingsrc/

The timing object draft spec defines a new TimingTextTrack [1] as a mechanism for precisely timed presentation of timed data, integrated with the timing object. However, until standardization and implementation are a reality, the Multi-device Timing CG makes available a generic and open-source JavaScript Sequencer integrated with the timing object/shared motion. The Sequencer is published [2] as research output from MediaScape [3], an ongoing FP7 EU project. As part of the dissemination and standardization work of MediaScape, the Sequencer is also made available to the Multi-device Timing CG. The Web page linked above includes API documentation, example usage and demonstrations. Source code is available at https://github.com/webtiming/sequencer and https://github.com/webtiming/timingsrc.

Please note that the API and implementation of the Sequencer are similar, but not identical, to the proposed TimingTextTrack. The differences should be fairly small, though. The Sequencer implementation currently has no known issues and supports all the goals defined in the timing object spec:

  • data independence (not tied to a specific media format)
  • UI independence (no predefined UI; custom UIs are built with plain Web development)
  • precise timing (millisecond precision)
  • expressive controls (suitable for a variety of media products)
  • dynamic data (supports live data sources and modification of temporal aspects during playback)
  • simple usage
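To give an idea of how such precise timing is achieved (this sketches the general technique, not the Sequencer's actual implementation): instead of polling, the timing object's vector is used to compute exactly when the next cue boundary will be crossed, and a single timeout is scheduled for that instant.

  // Sketch of the sequencing technique (acceleration ignored for brevity):
  // given cue boundaries on the timeline, schedule a timeout for the next crossing.
  function scheduleNext(to, boundaries, onBoundary) {
    var v = to.query();                          // {position, velocity, ...} per the draft spec
    if (v.velocity === 0) return;                // paused: no boundary will be crossed
    var next = Infinity;
    boundaries.forEach(function (b) {
      var dt = (b - v.position) / v.velocity;    // seconds until boundary b is crossed
      if (dt > 0 && dt < next) next = dt;
    });
    if (next !== Infinity) {
      setTimeout(function () {
        onBoundary(to.query().position);         // fire "enter"/"exit" style callbacks here
        scheduleNext(to, boundaries, onBoundary);  // re-arm for the following boundary
      }, next * 1000);
    }
    // a real implementation would also re-schedule whenever the timing object changes
  }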

And, not to forget, integration with timing object/shared motion enables Sequencers to take part in globally synced multi-device playback!

Ingar

[1]: http://webtiming.github.io/timingobject/#timed-data-and-the-timing-object

[2]: https://github.com/mediascape/sequencer

[3]: http://mediascapeproject.eu/