

Multi-device Timing Community Group

Timing mechanisms allow operations to be executed at the correct time. The Web already has several mechanisms supporting timed operations, including setTimeout and setInterval, as well as controllers for media frameworks and animations. However, the Web lacks support for multi-device timing. A multi-device timing mechanism would allow timed operations across Web pages hosted by different devices. Multi-device timing is particularly important for the broadcasting industry, as it is the key enabler for web-based secondary device offerings. More generally, multi-device timing has wide utility in communication, collaboration and multi-screen presentation. This Community Group aims to define a common, multi-device, timing mechanism and a practical programming model. This will improve the Web as a platform for time-sensitive, multi-device Web applications. Charter : http://webtiming.github.io



Note: Community Groups are proposed and run by the community. Although W3C hosts these conversations, the groups do not necessarily represent the views of the W3C Membership or staff.


First Draft of Timing Object Specification published by Multi-device Timing Community Group

On 2015-08-28 the Multi-device Timing Community Group published the first draft of the following specification:

Participants contribute material to this specification under the W3C Community Contributor License Agreement (CLA).

If you have any questions, please contact the group on their public list: public-webtiming@w3.org. Learn more about the Multi-device Timing Community Group.

Multi-device MPEG DASH

Hi all,

I did some experiments using the Shaka MPEG DASH player, which lends itself quite nicely to synchronization using the Timing Object/Shared Motion. I used the DashCast streamer to create a looping live stream with a VOD window of about 5 minutes. The demonstration shows one live window and three windows under my own control. The controls allow me to switch between my own navigation and the live position seamlessly.

Note that the audio sounds a bit strange while the system is adjusting, as it’s a screen cast from a single box. Also notice that even though the files are hosted on localhost, the sync is global. As I had issues with encoding audio through DashCast, the audio track is actually a separate audio element on a loop, but it is of course still synchronized. The test picture from http://vm2.dashif.org/livesim/testpic_2s/Manifest.mpd does play very nicely with audio too, but it’s a bit boring to watch.

Regards, Njål

 

Linear/Temporal Composition on the Web

We have just published a paper highlighting the importance of precise timing for Web-based media, both in single-device and multi-device scenarios. The paper may serve as a high-level introduction to the approach and tasks suggested for the Multi-device Timing Community Group.

https://sites.google.com/site/mediasynchronization/Paper4_Arntzen_webComposition_CR.pdf

The paper will be presented at the Media Synchronization Workshop 2015 in conjunction with the ACM TVX 2015 in Brussels.

Timed data and multi-device playback

Hi all, time for a new demonstration!

As we are working on the HTML5 Timing Object, it is nice to visualize how it can be used.  In this demonstration, we show how we map data onto a timeline. This is loaded into a “sequencer” (or MovingCursor) – basically a generalized track element that can be controlled by an HTMLTimingObject.  We then play it back using an HTMLTimingObject connected to a Shared Motion.  This of course includes videos, operations, data and so on.  This is in contrast to the “standard” way of hooking timed data onto a media element, then playing the “master” track.
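The core of such a sequencer can be sketched in a few lines (the names below are ours, illustrative only, not a standard API): it is essentially a lookup from a timeline position to the set of cues active at that position.

```javascript
// Illustrative sequencer core (hypothetical names, not a standard API):
// given cues pinned to [start, end) intervals on a timeline, return the
// cues active at a given timeline position.
function activeCues(cues, position) {
  return cues.filter(c => c.start <= position && position < c.end);
}

// Example timeline data: items pinned to intervals on the timeline.
const cues = [
  { start: 0, end: 4, data: "intro" },
  { start: 4, end: 9, data: "scene 1" },
  { start: 7, end: 12, data: "overlay" },
];
```

A real sequencer would additionally fire enter/exit events as the controlling timing object's position crosses cue boundaries, and would handle skips and reverse playback; the lookup above is only the core idea.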

In this demo, we show how we create the small Motion Corp videos. We do not use standard linear video editing software (we find it less flexible), but stay in the pure HTML5 world. This allows us to avoid “flattening” – the process of merging “layers” of data into a single stream – and keep all data elements open for the entire process. The final YouTube films are simple screen captures of a separate browser window (we control it from a different browser, possibly on a different machine too).

In case some of you are interested in experiencing such a tool first hand, we’ve made it available online – note however that the Shared Motion powering it is shared between everyone in the world – so if more than one person is active, they will share the experience!  We did this to avoid logins or any other kind of difficulty for people to play with it – if it becomes a problem let us know and we’ll make it private.

Draft HTMLTimingObject

Hi all,

A first draft for the HTMLTimingObject is out and we are inviting your feedback.

http://webtiming.github.io/timingobject/

The HTMLTimingObject in a nutshell:

The Web already defines a rich set of concepts and frameworks for timed operations, e.g., HTMLMediaElement, HTMLTrackElement, HTMLMediaController, WebAnimation, SMIL and WebAudio. Naturally, we would like to construct timed media by combining the strengths of different frameworks. After all, composition is a key feature of the Web. Unfortunately, this is not so easy. Timing is internal and custom to each framework, and timing models tend to be pulse-based, imprecise and non-deterministic.

The HTMLTimingObject is proposed to address this issue. It is precise, expressive and deterministic. This makes it an excellent basis for timed coordination of heterogeneous frameworks. Perhaps even more important, the HTMLTimingObject is designed specifically to integrate with Shared Motion, a generic timing mechanism for the Web. This way, we envision the HTMLTimingObject as a common basis for timed coordination both in single-device and multi-device Web applications.
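As a rough sketch of what “deterministic” means here (class and property names below are ours, not the draft's API): the state of a timing object can be reduced to a vector of (position, velocity, acceleration, timestamp), and the current position is computed from that vector rather than sampled from a clock tick.

```javascript
// Minimal, illustrative timing-object sketch (not the draft API).
// State is a vector (position, velocity, acceleration, timestamp);
// query() computes the current position deterministically from it.
class TimingObjectSketch {
  constructor(clock = () => Date.now() / 1000) {
    this.clock = clock; // clock in seconds
    this.vector = { position: 0, velocity: 0, acceleration: 0, timestamp: clock() };
  }
  update(v) {
    // Merge a partial vector update with the current state and restamp it.
    const now = this.clock();
    const cur = this.query();
    this.vector = {
      position: v.position !== undefined ? v.position : cur.position,
      velocity: v.velocity !== undefined ? v.velocity : cur.velocity,
      acceleration: v.acceleration !== undefined ? v.acceleration : cur.acceleration,
      timestamp: now,
    };
  }
  query() {
    // p(t) = p0 + v0*dt + (a/2)*dt^2 ;  v(t) = v0 + a*dt
    const dt = this.clock() - this.vector.timestamp;
    const { position: p, velocity: v, acceleration: a } = this.vector;
    return {
      position: p + v * dt + 0.5 * a * dt * dt,
      velocity: v + a * dt,
      acceleration: a,
    };
  }
}
```

Because any two clients holding the same vector and the same clock compute the same position, agreement does not depend on message timing – which is what makes this a good basis for multi-device coordination.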

Francois also introduces the draft on the webtiming mailing list https://lists.w3.org/Archives/Public/public-webtiming/2015Mar/0004.html

The Motion Corp band

Hi all,

We created this little and slightly silly demonstration of multi-device timing. The devices play separate Ogg or MP3 tracks. You can test it yourself if you like at http://mcorp.no/examples/multitrack/ – note that all devices that connect will be in sync (it’s not limited per user), so it’s a global performance! Here is the description available on YouTube:

The MotionCorp Band plays up with a cover of “Silent Man” by Speak Softly. This is a recording from their first practice together, so don’t expect the London Symphonic Orchestra!

The band members are:
Asus Zenbook on percussion
Samsung Note 8 on the Synth (he’s not very good and got the most forgiving instrument, but don’t tell him!)
The Nexus 7 brothers (2013 and 2012) on the Piano
Samsung Galaxy S4 on vocals
and finally a Nexus 5 and a Samsung Galaxy Tab from all the way back in 2010 on backing vocals.

They run various Android versions (4.2, 4.3, 4.4 and 5.0) – all of them run Firefox. All synchronization is done using the inMotion global timing service from the Motion Corporation.

Timed Multi-device Internet Radio

Hi All,

In order to demonstrate multi-device timing, we create demonstrations from time to time. You might have seen the Carnival video posted earlier, which demonstrated video synchronization of on demand content.

As a second demonstration, we’ve looked into live audio. People are used to FM radios, which allow them to turn on as many radios as they like. By nature, these are synchronized, and having the radio on in both the kitchen and the living room is perfectly normal. Web-based radios, on the other hand, tend to be all over the place, with several seconds of difference. While some systems exist for synchronized playback of on-demand content (e.g. Sonos), we have had a look at live, Web-based playback. We created a small proxy server to timestamp the data, allowing clients to receive a normal Ogg Vorbis audio stream with a known start time. Below is a recording.
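The seek logic that a known start time enables can be sketched as a small calculation (function names are ours, illustrative only; the actual proxy code is not published here): given the stream's wall-clock start time and a shared clock, the correct playback position is simply their difference.

```javascript
// Illustrative: where in a live stream should a client be playing, given
// the stream's timestamped start (seconds on a shared wall clock) and the
// current shared-clock time? Names are ours, not from the actual proxy.
function liveStreamPosition(streamStartTime, sharedNow) {
  const pos = sharedNow - streamStartTime;
  if (pos < 0) {
    throw new RangeError("stream has not started yet");
  }
  return pos;
}

// How far off is a client, given its media element's current position?
function playbackSkew(mediaCurrentTime, streamStartTime, sharedNow) {
  return mediaCurrentTime - liveStreamPosition(streamStartTime, sharedNow);
}
```

Every client that knows the start time and shares the clock computes the same target position, so the radios agree without talking to each other.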

If you want more information about any of these demonstrations, don’t hesitate to ask.

Njål

MTCG Task Overview

Before we start any concrete work within the Multi-device Timing Community Group (MTCG) we have made efforts to provide some overview over the scope of this group, as well as the possible tasks we could undertake.

https://www.w3.org/community/webtiming/tasks/

This document introduces an HTML5TimingElement as the central concept, and defines 8 tasks that all relate in some way to this concept. In terms of prioritization, we’ll likely focus initially on T1. HTML5TimingElement API and T2. Timing-enabled HTML5MediaElement.

Welcome Multi-device Timing!

Thank you all for endorsing and joining the multi-device timing group!

To start out with something a bit inspirational, here is a little demo we made highlighting one of the more trivial use-cases for multi-device timing – collaborative video.

The demo shows a Chrome browser and a Firefox browser playing the same video at the same time. The demo aims to expose current limitations of timed operations in HTML5.

You will notice that the video is a screen capture, so the two browsers are in fact running on a single device. However, there is no local communication going on. The two browsers are completely ignorant of each other. They are only connected across the Internet, via Shared Motion, our implementation of multi-device timing. So, running this demo on multiple devices is only a matter of opening the link on multiple devices.

 

Some nerdy details below:

It’s a horribly difficult video to synchronise due to all the changes in angles, flashes and hefty rhythms, but we like a challenge.

The video is 30 frames per second, while our screen (used for screen recording) refreshes 60 times per second. Ideally, each browser should update the frame shown on screen every second refresh. But as our browsers are not synchronised with the video card, we tend to hit the right frame, but sometimes at the wrong time with respect to the video card! So instead of both browsers showing frame X for the same two refreshes, one will show it first, then both, then the last one. This is surprisingly visible with large blinks, large movements and so on.
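The arithmetic behind this is simple (a sketch with our own helper name): at 30 fps on a 60 Hz screen, each video frame should occupy exactly two consecutive refresh slots.

```javascript
// Which video frame should be on screen at a given display refresh?
// fps = video frame rate, hz = display refresh rate (illustrative helper).
function expectedFrame(refreshIndex, fps, hz) {
  // Each frame spans hz / fps consecutive refreshes.
  return Math.floor(refreshIndex * fps / hz);
}
```

With fps = 30 and hz = 60, refreshes 0 and 1 both map to frame 0, refreshes 2 and 3 to frame 1, and so on; a browser whose repaint is not phase-locked to the video card can land one refresh early or late, which is the one-refresh disagreement visible in the recording.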

We also focus a bit on reloading, as this is important in the Web domain. The multi-device timing service gives precise timing info within fractions of a second, but the video needs more time to adjust. We use a variable playbackRate to adjust slowly, as this generally gives the best user experience.
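A minimal version of such an adjustment (our own sketch, not the demo's actual controller) nudges the playback rate in proportion to the measured skew, clamped so the change stays unobtrusive, and falls back to a hard seek for large skews.

```javascript
// Illustrative skew controller: skew = mediaPosition - targetPosition (seconds).
// Small skews are absorbed by gently varying the playback rate; large skews
// are corrected with a hard seek instead.
function adjustmentFor(skew, opts = {}) {
  const { seekThreshold = 1.0, gain = 0.5, maxRateDelta = 0.1 } = opts;
  if (Math.abs(skew) > seekThreshold) {
    return { seek: true, rate: 1.0 };
  }
  // Playing ahead (positive skew) -> slow down; behind -> speed up.
  const delta = Math.max(-maxRateDelta, Math.min(maxRateDelta, -skew * gain));
  return { seek: false, rate: 1.0 + delta };
}
```

The thresholds and gain above are made-up placeholders; in practice they would be tuned so that rate changes stay below what listeners notice.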

The multi-device timing service is much more precise than the video though, so a point to take away is that the HTML5 video element is really the weak point here with regard to precise timing. This is something we would like this CG to address.

Another point worth noting is that this kind of precision in multi-device HTML5 playback, though feasible, is by no means easy. Our results depend on the development of specific technical concepts for synchronisation (MediaStateVectors) as well as dedicated engineering efforts.

However, it should not be that way. This should be easy! With Web support for multi-device timing, all of this complexity should be encapsulated, and programmers should only have to connect a video with a multi-device media controller to make this work. That should be about 3 lines of JavaScript. Incidentally, that is precisely what you’ll find in our demo code 🙂
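In spirit (this is a hypothetical sketch, not the demo's actual code or a published API), the programmer-facing part could be as small as:

```javascript
// Hypothetical glue code: keep a media element following a timing object.
// `timing` is assumed to expose query() -> { position } on a shared
// timeline; `video` is any object with currentTime and playbackRate.
function follow(video, timing, seekThreshold = 1.0) {
  const { position } = timing.query();
  const skew = video.currentTime - position;
  if (Math.abs(skew) > seekThreshold) {
    video.currentTime = position;           // large skew: hard seek
    video.playbackRate = 1.0;
  } else {
    video.playbackRate = 1.0 - skew * 0.5;  // small skew: gentle adjustment
  }
  return skew;
}
```

In a browser this would run periodically, e.g. from a timeupdate handler or a setInterval; it is written here against plain objects so the logic is easy to test.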

Ingar and Njål

Call for Participation in Multi-device Timing Community Group

The Multi-device Timing Community Group has been launched:


Timing mechanisms allow operations to be executed at the correct time. The Web already has several mechanisms supporting timed operations, including setTimeout and setInterval, as well as controllers for media frameworks and animations. However, the Web lacks support for multi-device timing. A multi-device timing mechanism would allow timed operations across Web pages hosted by different devices. Multi-device timing is particularly important for the broadcasting industry, as it is the key enabler for web-based secondary device offerings. More generally, multi-device timing has wide utility in communication, collaboration and multi-screen presentation. This Community Group aims to define a common, multi-device, timing mechanism and a practical programming model. This will improve the Web as a platform for time-sensitive, multi-device Web applications.

Charter : http://webtiming.github.io


In order to join the group, you will need a W3C account.

This is a community initiative. This group was originally proposed on 2015-02-03 by Ingar Arntzen. The following people supported its creation: Ingar Arntzen, François Daoust, Dominique Hazaël-Massieux, Yosuke Funahashi, Njål Borch, Ryoichi Kawada. W3C’s hosting of this group does not imply endorsement of the activities.

The group must now choose a chair. Read more about how to get started in a new group and good practice for running a group.

We invite you to share news of this new group in social media and other channels.

If you believe that there is an issue with this group that requires the attention of the W3C staff, please email us at site-comments@w3.org

Thank you,
W3C Community Development Team