
Multi-device Timing Community Group

Timing mechanisms allow operations to be executed at the correct time. The Web already has several mechanisms supporting timed operations, including setTimeout and setInterval, as well as controllers for media frameworks and animations. However, the Web lacks support for multi-device timing. A multi-device timing mechanism would allow timed operations across Web pages hosted by different devices. Multi-device timing is particularly important for the broadcasting industry, as it is the key enabler for web-based secondary device offerings. More generally, multi-device timing has wide utility in communication, collaboration and multi-screen presentation. This Community Group aims to define a common, multi-device, timing mechanism and a practical programming model. This will improve the Web as a platform for time-sensitive, multi-device Web applications. Charter : http://webtiming.github.io

Note: Community Groups are proposed and run by the community. Although W3C hosts these conversations, the groups do not necessarily represent the views of the W3C Membership or staff.

Drafts / licensing info

Timing Object Draft Specification


Bug report: Safari playbackRate

For the most precise synchronization of HTML5 media, and for the best user experience (avoiding audiovisual artifacts), we depend on dynamically adjusting the variable playbackRate of the media element. This works across browsers, but we have identified a subtle bug in the implementation of variable playbackRate in Safari, resulting in a terrible experience.
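
For context, this is roughly how playbackRate-based sync works (a simplified sketch, not the actual MediaSync code; getTargetPosition is a hypothetical stand-in for whatever timing source, e.g. a timing object, the media element should follow):

    // Simplified sketch of playbackRate-based media sync (not the MediaSync library).
    // "getTargetPosition" is a hypothetical function returning the ideal playback
    // position in seconds, e.g. read from a timing object.
    function stepSync(video, getTargetPosition) {
      var skew = getTargetPosition() - video.currentTime;  // + means video is behind
      if (Math.abs(skew) > 1.0) {
        // Large error: seek instead of adjusting the rate.
        video.currentTime = getTargetPosition();
        video.playbackRate = 1.0;
      } else {
        // Small error: speed up or slow down slightly to catch up smoothly,
        // clamping the correction so the tempo change stays unobtrusive.
        var correction = Math.max(-0.1, Math.min(0.1, skew * 0.5));
        video.playbackRate = 1.0 + correction;
      }
    }

    // Run the adjustment a few times per second, e.g.:
    // setInterval(function () { stepSync(videoElement, getTargetPosition); }, 250);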

There seems to be a side effect when playbackRate is modified, causing the value of currentTime to pause for a short time interval, about 0.1–0.3 seconds.

We’ve reported the bug to Apple. Hopefully they’ll be able to fix it.

https://bugs.webkit.org/show_bug.cgi?id=163433

Ingar and Njål

Multi-device Timing CG at TPAC 2016

Hi all.

The Multi-device Timing CG is set up for a session at this year’s TPAC in Lisbon. The session is scheduled for Thursday 22 Sept, 15:30 to 17:30 [1].

The session will be hosted by Francois Daoust (W3C) and myself.

The agenda is quite simple. I’ll start by giving a presentation and possibly some demos covering:

  • CG status
  • introduction to multi-device timing
  • reality & goals for timing in the Web platform
  • proposal – timing object + online timing providers
  • applications/use-cases

Then we’ll go on to discuss next steps:

  • need for standardization
  • attracting support
  • relation to related initiatives
  • CG organization/activities

Best regards,

Ingar Arntzen, chair.

[1] https://www.w3.org/2016/09/TPAC/schedule.html

Multi-device Timing at IBC 2016

Hi all.

Njål, Francois and I are publishing a new paper on multi-device timing at IBC 2016. The paper is titled (rather boldly ;)) “Timing: Small step for developers, giant leap for the media industry” and is included as a supporting paper in the paper session “Enhancing the Multi-screen Experience through Synchronisation and Personalisation”. For those present at IBC this year, session details may be found via the session link.

This paper isn’t overly technical; it focuses instead on how the industry currently deals with timing, and points out the opportunities that would come from adopting the multi-device timing approach (i.e. timing object + shared motion).

PDF: Borch_IBC2016-final

Njål will be representing the Multi-device Timing CG at IBC, so he is the guy to talk to.

Best, Ingar

Timing Object at NABShow 2016

Njål and I are just back from NABShow 2016, 16-21 April in Las Vegas, where MediaScape partners Norut and Vicomtech promoted the timing object and the W3C Multi-device Timing Community Group.

Njål Borch at the Timing Object booth – NABShow 2016

You may find our main leaflet here : Timing object in a nutshell

Multi-device sync

Our setup in the Futures Park booth was fairly simple: four laptops and two smartphones. As you can see in the picture, we used the laptops to present a selection of HTML5 videos synchronized across the different screens (using Shared Motion and the MediaSync library). Two laptops were cabled, two on WiFi. We used Firefox and Chrome browsers. One smartphone was used for controls (play, pause and time-shifting of the timing object, as well as switching between videos). The other smartphone was used to present the audio of the video. We also brought two pairs of headphones, one connected to a laptop and one connected to the smartphone. This way, by using both headphones together, our audience could verify echoless sync between smartphone and laptop. We also made sure to reload the Web browsers to demonstrate how quickly sync is regained – a fraction of a second, as long as video data is available. The demos ran in perfect synchrony for four consecutive days, without so much as a glitch. That’s impressive – especially considering the poor networking conditions in the NAB exhibition hall!

Reactions to the demonstrations were overwhelmingly positive. Many people expressed excitement that there was an initiative aiming at improved support for timing on the Web platform. People were also struck by the quality of the synchronization, as well as the prospect of doing this globally. Some were curious about use cases, whereas others immediately recognized the need for timing and synchronization in various broadcasting applications, be it live streaming, ad insertion, tiled screen setups, timed UGC, collaborative viewing or remote control. We mentioned concrete use cases such as secondary device applications and alternative audio tracks on secondary devices (accessibility, etc.). We also presented higher-level value propositions, such as timing consistency in UX and the important role of timing with respect to integration and interoperability between heterogeneous media systems. Finally, we received some very concrete expressions of interest from central players. We’ll let you know when those interests materialize.

So, a big thanks to Norut, Vicomtech and MediaScape for an excellent show at NAB!
The next major event for the Multi-device Timing CG will likely be a F2F in Lisbon at TPAC 2016 in September.

Best regards, Ingar

Sequencing with the Timing Object

We have just published a paper on sequencing in Web multimedia. Sequencing is about activation and deactivation of media items at the correct time during media playback. The paper highlights the importance of decoupling sequencing logic from data formats, timing/control and UI in Web-based multimedia.

  • Data-independent sequencing implies broad utility as well as simple integration of different data types and delivery methods in multimedia applications.
  • UI-independent sequencing simplifies integration of new data types into visual and interactive components.
  • Integration with the Timing Object ensures that sequencing tasks may trivially be synchronized and remote controlled, both in single-page media presentations as well as global, multi-device media experiences (e.g. through Shared Motion).

In short, we see precise, distributed sequencing as a fundamental building block in multi-device timed multimedia.

The Sequencer is presented as a generic programming tool for timed Web-based multimedia, implemented in JavaScript and based on setTimeout. Though timing errors within a couple of milliseconds are often acceptable in Web applications, this also indicates that future improvements to setTimeout would be beneficial.
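
To give a flavour of what sequencing involves (an illustrative sketch only, not the Sequencer’s actual API), the snippet below schedules activation and deactivation callbacks for a single cue with setTimeout, relative to the current position of a timing object:

    // Illustrative sequencing sketch (not the Sequencer API from the paper).
    // "timingObject" is assumed to expose query() returning { position } in seconds,
    // as in the Timing Object draft; a cue is { key, start, end, data }.
    function scheduleCue(timingObject, cue, onEnter, onExit) {
      var now = timingObject.query().position;
      if (cue.end <= now) return;                      // cue already passed
      if (cue.start > now) {
        setTimeout(function () { onEnter(cue); }, (cue.start - now) * 1000);
      } else {
        onEnter(cue);                                  // already inside the interval
      }
      setTimeout(function () { onExit(cue); }, (cue.end - now) * 1000);
    }

    // Example: show a subtitle between 10 s and 14 s on the shared timeline.
    // scheduleCue(to, { key: "sub1", start: 10, end: 14, data: "Hello" },
    //             function (c) { console.log("enter", c.key); },
    //             function (c) { console.log("exit", c.key); });

A real sequencer must also reschedule such timeouts whenever the timing object skips, pauses or changes velocity – bookkeeping the sketch above ignores.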

The paper will be presented at the ACM MMSys’16, Special section for Media Synchronization, Klagenfurt, Austria, May 10-13.

The paper is available in the ACM library here or from Norut here.

HbbTV 1.5 experiments

Hi all,

In collaboration with Vicomtech-IK4, we did some tests with HbbTV 1.5 to see if we could exploit Shared Motion [1] to add sync capabilities to existing smart TVs. HbbTV 1.5 does not have any explicit support for synchronization, and while HbbTV 2.0 will bring this, many existing smart TVs will not get these upgrades. If you are interested and have knowledge about HbbTV, we welcome any input on this initial experiment.

We quickly discovered that the media element of the TV is unable to provide a good user experience when slaved to a Shared Motion. It lacks variable playback rate, and skip operations are very slow. Our approach was therefore to request the media element to play from a given position. This will not be very accurate, but instead of trying to correct the playback on the TV, we adjust the Shared Motion to match what the TV does. In this way, we’ve re-created a master-slave relation, with one master (the TV/Chromecast) and however many slaves you want.

Here is a film we made from our experiment with a Panasonic TV [2]:

Interestingly, we see that the currentTime reported by this TV fluctuates by around 250 ms. We are, however, able to select the better samples and in that way provide a consistent experience. The TV we tested did need calibration, but the offset seemed to be hardware specific and consistent across skips, reloads and other content.
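
To make the idea concrete, here is a simplified, hypothetical sketch of the approach: sample the skew between the TV’s currentTime and the Shared Motion, take the median over a short window to suppress the fluctuations, and adjust the motion (not the TV) when the drift becomes noticeable. The motion.query()/motion.update() calls are assumptions standing in for the actual Shared Motion API.

    // Hypothetical sketch of the TV-as-master approach (not our actual HbbTV code).
    // "tvMedia" is the TV's media object exposing currentTime in seconds; "motion"
    // is assumed to expose query() -> { position, velocity } and
    // update(position, velocity), standing in for the Shared Motion API.
    var skews = [];

    function pollTv(tvMedia, motion) {
      // Skew between what the TV reports and where the shared motion thinks we are.
      skews.push(tvMedia.currentTime - motion.query().position);
      skews = skews.slice(-5);                 // keep a short window of samples
      if (skews.length < 5) return;

      // Use the median skew as a robust estimate, since individual currentTime
      // readings fluctuated by roughly +/- 250 ms on the TV we tested.
      var sorted = skews.slice().sort(function (a, b) { return a - b; });
      var median = sorted[Math.floor(sorted.length / 2)];

      // Adjust the shared motion rather than the TV, so browser-based slaves
      // follow whatever the TV actually does.
      if (Math.abs(median) > 0.15) {
        var v = motion.query();
        motion.update(v.position + median, v.velocity);
        skews = [];                            // start fresh after an adjustment
      }
    }

    // setInterval(function () { pollTv(tvMedia, motion); }, 500);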

This test is of course IP based. We asked for some input on how to do this for broadcast content. It appears that we would only have stream events to provide an estimated time (possibly to within a second), but perhaps even relatively rough estimates of the current position could be extracted and make for user-friendly transitions between broadcast and IP-delivered content?

Of course, HbbTV 2.0 devices should be much better at all of this, and provide local synchronization to boot. However, we believe this experiment opens up an interesting transition phase, where current smart TVs can provide at least some additional functionality for a vast number of users.

If there is any interest in testing other HbbTV devices, we’re very willing to provide a simple web application for testing, including manual calibration. Please let us know!

Regards, Njål

[1]: Motion Corporation

[2]: https://youtu.be/Be_z4MiY9oI

Multi-device timing at NABShow 2016

Njål and I are going to NABShow, 16-21 April in Las Vegas [1,2], to promote the timing object and the W3C Multi-device Timing CG. If you, or any of your colleagues, are attending NAB, please come by our booth in the Futures Park [3].

The booth is hosted by EU FP7 project MediaScape [4]. The invitation came as a result of demos at IBC 2015, where MediaScape project lead partner Vicomtech [5] showed flexible and tightly synced multi-device adaptation in regular Web browsers. The MediaScape project uses the Shared Motion approach to distributed timing and control in Web browsers, and has been central in pushing for standardization of the Timing Object through the Multi-device Timing CG initiative.

Vicomtech, represented by Mikel Zorilla and Esther Novo, will demonstrate the many fruits of the MediaScape approach, with a particular emphasis on multi-device adaptation, while Norut (Njål and I) will focus on distributed control and synchronization for IP-based services.

Also, if you are interested in discussing the commercial opportunities implied by Web-based timing, you may set up a meeting through our American partners, Glen Sakata [6] or Chris Lennon [7] at MediAnswers [8]. They have a long track record within the American broadcasting industry, and a keen understanding of the opportunities the Timing Object & Shared Motion can offer the industry.

Best, Ingar

[1] http://www.nabshow.com

[2] NAB: premier trade association for America’s radio and TV broadcasters

[3] Booth no. SU16713, booth name Vicomtech-IK4. Find the booth via this link: go to the South Hall (Upper); the Futures Park is on the right side.

[4] http://mediascapeproject.eu/

[5] http://www.vicomtech.org/

[6] gsakata@medianswers.tv

[7] clennon@medianswers.tv

[8] http://medianswers.tv/

Multi-device timing in W3C highlights

The Multi-device Timing CG is mentioned in recent W3C highlights [1] of March 2016, under the entertainment section. Web synchronization is also mentioned as overlapping work in the telecommunications section. Finally, cross-device synchronization is one of three headlight topics for 2016.

Though not explicitly referenced, multi-device temporal control is also relevant for other highlighted activities, such as the Second Screen Working Group (synchronization and distributed control), HTML Media (timing provides Web apps with more flexible control over video content, as an addition or alternative to MSE), and Digital Marketing (ads exploiting temporal context, timed multi-screen, etc.).

[1] https://www.w3.org/2016/03/w3c-highlights/ 

Inter-device sync

Hi all. After a bug report about our variable playbackRate detector being too eager, I did a proper stress test of Firefox and Chrome on Linux to test the new media sync code. I made a video if you want to have a look. Modern browsers are amazing. 🙂
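
For the curious, a detector of this kind can be sketched roughly as follows (a hypothetical illustration, not the actual detector in the media sync code): set a non-default playbackRate on a muted element, let it play for a second, and check whether currentTime advanced at roughly the requested rate.

    // Hypothetical sketch of a variable-playbackRate probe (not the MediaSync detector).
    // Resolves true if currentTime advances at roughly the requested rate.
    function probeVariablePlaybackRate(video) {
      return new Promise(function (resolve) {
        video.muted = true;
        video.playbackRate = 1.5;
        var start = video.currentTime;
        video.play();
        setTimeout(function () {
          var advanced = video.currentTime - start;   // media seconds played in ~1 s
          video.pause();
          video.playbackRate = 1.0;
          // Expect about 1.5 s of media per second of wall clock; use a generous
          // margin so the probe is not "too eager" in declaring support missing.
          resolve(advanced > 1.2 && advanced < 1.8);
        }, 1000);
      });
    }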