Publications
2018 : Chapter “Media Synchronization on the Web”
https://link.springer.com/chapter/10.1007/978-3-319-65840-7_17
Authors: Ingar M. Arntzen, Njål T. Borch and François Daoust.
Book chapter appears in “MediaSync: Handbook on Multimedia Synchronization”
https://www.springer.com/gp/book/9783319658391
We also provide the author version of this chapter. Please cite the original chapter published by Springer. You may also request the original version of the chapter by emailing the authors directly, or request it via ResearchGate.
2016 : Timing: Small step for developers, giant leap for the media industry
This paper isn’t overly technical, focusing instead on how the industry deals with timing, surveying some well-known products and pointing out the benefits and potential of the multi-device timing approach (i.e. timing object + shared motion).
IBC 8 Sept. 2016. Amsterdam. Advances in Technology/Paper Session: Enhancing the Multi-screen Experience through Synchronisation & Personalisation.
2016 : Data-independent sequencing with the Timing Object; A JavaScript sequencer for single-device and multi-device Web media
In this paper we highlight the importance of isolating sequencing logic from media formats, authoring models, timing/control, UI and media players. Instead, generic sequencing logic is made available as a standalone programming concept, enabling a wide array of applications and use cases in Web media.
Ingar M. Arntzen and Njål T. Borch. 2016. “Data-independent sequencing with the timing object: a JavaScript sequencer for single-device and multi-device web media”. In Proceedings of the 7th International Conference on Multimedia Systems (MMSys ’16). ACM, New York, NY, USA, Article 24, 10 pages.
http://dl.acm.org/citation.cfm?id=2910614
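The core idea, sequencing driven only by a position on a timeline with cues as plain data, can be sketched in a few lines of plain JavaScript (illustrative only; the paper’s actual sequencer also handles enter/exit events and dynamic cue sets):

```javascript
// Illustrative sketch of data-independent sequencing: cues are plain
// (interval, data) pairs, and the sequencer needs only a timeline
// position -- it knows nothing about media formats, players or UI.
function activeCues(cues, position) {
  return cues.filter(c => c.start <= position && position < c.end);
}

// Example timeline with two overlapping cues.
const cues = [
  { start: 0, end: 4, data: "intro subtitle" },
  { start: 2, end: 8, data: "background image" },
];
```

Because cues carry arbitrary data, the same sequencing logic can drive subtitles, images, or any other timed component.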
2015 : Timingsrc: A programming model for timed Web applications, based on the Timing Object. Precise timing, synchronization and control for single-device and multi-device Web applications.
This is the documentation and implementation of the new programming model implied by the introduction of the timing object. The code is open sourced on GitHub.
http://webtiming.github.io/timingsrc/
2015 : Report: Evaluating timed playback of HTML5 Media
In this report we provide an extensive analysis of the timing aspects of HTML5 Media Elements, across a variety of browsers, operating systems and media formats. In particular, we investigate how playback compares to the progression of the local clock and how players respond to time-shifting and adjustments in playback rate.
http://norut.no/sites/norut.no/files/norut_tromso_rapport_28-2015.pdf
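The kind of drift analysis the report describes can be sketched as follows (an illustrative reconstruction, not the report’s actual instrumentation; the `playbackDrift` function and its sample format are our own):

```javascript
// Illustrative drift measurement: sample the player's currentTime
// against the local clock, then compare observed media time with the
// ideal progression at the given playbackRate.
function playbackDrift(samples, playbackRate = 1.0) {
  // samples: [{ clock, mediaTime }, ...], both in seconds
  const first = samples[0];
  return samples.map(s => {
    const expected = first.mediaTime + playbackRate * (s.clock - first.clock);
    return s.mediaTime - expected; // positive: player runs ahead of clock
  });
}
```

In a browser, the samples would come from periodically reading `video.currentTime` alongside the local clock.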
2015 : Timing Object Draft Specification
This specification defines the timing object. The timing object is a local object that may be used by Web clients to ensure precisely timed operation as well as flexible timing control. If multiple timing-sensitive components take direction from the same timing object, their behaviour will be precisely aligned in time (synchronized). Crucially, this is also the case in distributed settings. A central motivation for the timing object is that it may be connected to an online timing resource. This way, the local timing object is a gateway to precisely timed operations, both in single-device and multi-device scenarios.
http://webtiming.github.io/timingobject/
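The deterministic core of a timing object, a state vector (position, velocity, acceleration, timestamp) from which the current state is computed on demand, can be sketched in plain JavaScript (a minimal illustration; the class and field names loosely follow the draft specification but are not the full API):

```javascript
// Minimal illustrative sketch of a timing object's deterministic core.
class SimpleTimingObject {
  constructor(vector = { position: 0, velocity: 0, acceleration: 0 }) {
    this._vector = { ...vector, timestamp: this._now() };
  }
  _now() {
    return Date.now() / 1000; // seconds; a real implementation uses a monotonic clock
  }
  // Compute the current state from the stored vector and elapsed time.
  query() {
    const v = this._vector;
    const d = this._now() - v.timestamp;
    return {
      position: v.position + v.velocity * d + 0.5 * v.acceleration * d * d,
      velocity: v.velocity + v.acceleration * d,
      acceleration: v.acceleration,
      timestamp: v.timestamp + d,
    };
  }
  // Change the motion; all components querying this object stay aligned.
  update(changes) {
    this._vector = { ...this.query(), ...changes, timestamp: this._now() };
  }
}
```

Since `query()` is a pure function of the vector and the clock, any number of components (or, with a synchronized vector, any number of devices) computing from the same vector observe the same motion.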
2015 : Multi-device Linear Composition on the Web
This paper highlights the importance of Web support for linear (temporal) composition, and indicates how the Multi-device Timing Community Group is working towards this goal by proposing standardization of the Timing Object and integration with solutions for online timing objects.
Ingar M. Arntzen, Njål T. Borch, François Daoust, Dominique Hazael-Massieux, “Multi-device Linear Composition on the Web, Enabling Multi-device Linear Media with HTMLTimingObject and Shared Motion”, Media Synchronization Workshop 2015, ACM TVX Brussels, June 2015.
https://sites.google.com/site/mediasynchronization/Paper4_Arntzen_webComposition_CR.pdf
2014 : Report : Distributed Synchronization of HTML5 Media
In this report we analyze the quality of synchronization we can expect when synchronizing HTML5 audio and video elements on multiple devices using Shared Motion. We demonstrate that the concept of Shared Motion enables sub-frame synchronization for video, and near perfect synchronization for audio. Experiments are conducted in real-world scenarios.
2013 : Composite Media: A new paradigm for online media
This paper indicates the ultimate purpose of Shared Motion which is to enable a new model for online media. Composite Media represents a shift away from the classical model, where online media is essentially made from linear media content and a media player, and media motion is an internal concern of the media player.
In the new model, media motion is represented as an explicit resource. Online media is then made from two types of resources, linear media content and media motions, synthesized into a presentation at the client side. Shared Motion allows precisely synchronized media control across the Internet, thereby allowing distributed media components to take part in a single, consistent media experience.
Ingar M. Arntzen and Njål T. Borch. 2013. “Composite Media, A new paradigm for online media”. In Proceedings of the 2013 NEM Summit (Networked Electronic Media). Eurescom, Nantes, France, 105-110.
http://www.mcorp.no/publications/compositemedia2013.pdf
2013 : The Media State Vector; A unifying concept for multi-device media navigation.
This paper presents a technical concept and a solution for distributed, server-based motion synchronization for Web agents.
Ingar M. Arntzen, Njål T. Borch and Christopher P. Needham. 2013. “The Media State Vector: A unifying concept for multi-device media navigation”. In Proceedings of the 5th Workshop on Mobile Video (MoVid ’13). ACM, New York, NY, USA, 61-66.
http://dl.acm.org/citation.cfm?id=2457427
There is also a full 12-page version (2012) of this paper that additionally discusses and defines the proper requirements for distributed motion synchronization. Unfortunately, this paper was rejected by reviewers, so we made it publicly available on the Web instead.
http://www.mcorp.no/publications/mediastatevector2012.pdf