Headlights2016

From W3C Wiki

Status: Work in progress

W3C management reviews W3C priorities and resources on an ongoing basis, and annually (typically mid-year) explores major re-allocations of resources to align with important trends. W3C staff identifies (on an internal wiki) some potential directions for W3C (technical, organizational, etc.). We then launch task forces to develop proposals with community participation.

If you are interested in participating in a task force, please contact the lead(s). Note that these topics are in development; W3C has not allocated resources to them other than to develop proposals.

See also Headlights2014.

Schedule

  • September-November 2015: W3C staff proposes ideas
  • November 2015: W3C management selects proposals that need deeper investigation
  • End of January 2016: W3C management reviews the results of these investigations and uses them as input to prioritization discussions

Projects

Web and Virtual Reality

As Virtual Reality products and solutions gain momentum, we should make sure we understand if and how the Web can play a role in this space, and chart the standardization needs that emerge from it.

More detailed description

A number of industry players have announced major investments or initial products in Virtual Reality: among others, Facebook acquired Oculus, Microsoft has released an SDK for its VR headset, Google is promoting its Cardboard VR solution, Samsung has released a consumer-grade VR headset, and Sony is known to be finalizing its own. Mozilla is conducting research around WebVR, an effort to make the Web a platform for Virtual Reality.

All these trends point toward the need to ensure that the Web can indeed be such a platform. We already have some of the initial pieces: 3D rendering with WebGL, movement detection with the device orientation API, audio processing via Web Audio, and some crude video processing via canvas. But most if not all of these are already known to be insufficient for compelling VR development, and a more in-depth analysis would likely reveal additional missing pieces.
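
As a rough illustration of how far these existing pieces go when combined, the sketch below (plain TypeScript against standard browser APIs, assuming a page with a canvas element; drawScene is a placeholder, not an existing API) feeds device orientation events into a WebGL render loop:

```typescript
// Latest device orientation, in degrees, as reported by the browser.
let orientation = { alpha: 0, beta: 0, gamma: 0 };

window.addEventListener('deviceorientation', (event) => {
  orientation = {
    alpha: event.alpha ?? 0,  // rotation around the z axis
    beta: event.beta ?? 0,    // rotation around the x axis
    gamma: event.gamma ?? 0,  // rotation around the y axis
  };
});

// Placeholder for real rendering code: derive a view matrix from the
// orientation, set up shaders, and draw the 3D scene.
function drawScene(gl: WebGLRenderingContext, o: typeof orientation): void {
  gl.clearColor(0, 0, 0, 1);
  gl.clear(gl.COLOR_BUFFER_BIT);
  // ... use o.alpha / o.beta / o.gamma to orient the camera and render ...
}

function start(): void {
  const canvas = document.querySelector('canvas');
  const gl = canvas?.getContext('webgl');
  if (!gl) {
    console.error('WebGL is not available');
    return;
  }
  const render = () => {
    drawScene(gl, orientation);
    requestAnimationFrame(render);
  };
  requestAnimationFrame(render);
}

start();
```

Efforts such as Mozilla's WebVR research aim, among other things, at replacing this kind of approximation with proper headset tracking and rendering.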

This headlight project proposes to review in more depth the state of the art of native VR platforms, their market adoption, and the use cases they enable, and to use that information to determine the role the Web can or should play in this space, building in particular on existing or previous related efforts.

Distributed Web Applications

Users are surrounded by screens and devices, which they increasingly use in combination, creating a trend towards applications that span multiple devices. Which Web standards are missing or need updating to provide a seamless user experience across multiple devices?

More detailed description

The Web and TV community has highlighted the importance of companion screens for several years. Most interactive TV systems, in particular those based on Web technologies, define APIs to create companion screen experiences. Most devices embed technologies that allow them to discover and pair with other devices, or to advertise themselves as possible companion devices. Wireless keyboards and mice for TVs have been around for some time now. TV sets, as well as screens extended with HDMI dongles, have also joined the collection of "regular" devices (smartphone, tablet, laptop) that the user has available. Various protocols may be used under the hood to connect devices: Apple AirPlay, Google Cast, Netflix's DIAL, the Wi-Fi Alliance's Miracast, etc. The Web platform has a strong role to play in the success of multi-device applications: the different devices that a user may want to use in combination are likely to come from different vendors and run different systems, so interoperability is key.

The development of the Presentation API illustrates the need for standardization at the application level in this space. The Presentation API is just a starting point, though. How could it be extended to cover the discovery and control of other types of devices? How can a Web application bootstrap a user experience that may involve an arbitrary set of devices? From an authoring perspective, how can developers and designers manage the explosion of combinations that might arise and create a responsive multi-device layout?
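
As a point of reference, the sketch below shows what this starting point looks like from a Web application, assuming a browser that implements the Presentation API; the companion page URL, the message format, and the local interface names are illustrative only:

```typescript
// Minimal structural types for the subset of the Presentation API used here,
// in case the compiler's DOM typings do not include it.
interface SecondScreenConnection {
  send(message: string): void;
  addEventListener(type: 'message', listener: (event: MessageEvent) => void): void;
}
interface SecondScreenRequest {
  start(): Promise<SecondScreenConnection>;
}

// PresentationRequest is the constructor defined by the Presentation API.
const PresentationRequestCtor = (window as any).PresentationRequest as
  new (urls: string[]) => SecondScreenRequest;

async function startCompanionExperience(): Promise<void> {
  // The user agent discovers available displays (TV, HDMI dongle, ...) and
  // prompts the user to pick one; the application never sees the underlying protocol.
  const request = new PresentationRequestCtor(['https://example.org/companion.html']);
  const connection = await request.start();

  connection.addEventListener('message', (event) => {
    console.log('From the presentation page:', event.data);
  });
  connection.send(JSON.stringify({ action: 'show', item: 'related-content' }));
}

// start() must be called from a user gesture, e.g. a "cast" button.
document.querySelector('#cast')?.addEventListener('click', () => {
  startCompanionExperience().catch((err) => console.error('Presentation failed:', err));
});
```

Whatever protocol is used to reach the display (AirPlay, Google Cast, etc.) stays hidden behind the API, which is what makes the approach interoperable across vendors.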

This headlight project proposes to study these aspects of distributed Web applications and to determine the updates to Web standards that may be needed to enable them.

Cross-device Synchronization

More detailed description

The need to synchronize content arises naturally whenever data or resources are shared across devices, tasks, or users. In Web applications, this happens when a user is engaged in a companion device experience (e.g. watching TV while browsing related content on her tablet), when different users are engaged in a collaborative task using different devices (e.g. online collaborative editing, or synchronized video watching), or when different devices are combined to create a single, unified interface (e.g. video wall, distributed music playback, digital signage scenarios). Synchronization requirements depend on the underlying use case, ranging from within 100ms for non-media, non-visual use cases, down to within 25ms for video synchronization and within 10ms for echoless audio playback.

To synchronize both media and non-media content across devices, Web applications must have a way to:

  1. tell the user agent to follow an external clock;
  2. share and control a timeline;
  3. drive media playback so that it follows that timeline.

Cross-device synchronization can already be achieved to some extent using Web technologies, but the synchronization of media content typically requires hacking around media players, as media support in HTML5 was not designed with cross-device synchronization in mind. The Timing Object specification, developed by the Multi-Device Timing Community Group, defines an API that exposes cross-device synchronization primitives to Web applications and proposes an extension to HTML5 media elements to make use of these primitives.
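
The sketch below illustrates the kind of workaround needed today: the shared timeline is modelled as a (position, velocity, timestamp) vector, similar in spirit to the motion model used by the Timing Object draft, and the application periodically nudges an HTMLMediaElement to follow it. The thresholds, names, and polling approach are illustrative assumptions, not spec material, and the way the vector is shared across devices (e.g. over a WebSocket) is left out:

```typescript
// The shared timeline, extrapolated from a last known (position, velocity) sample.
interface TimelineVector {
  position: number;   // media position in seconds at `timestamp`
  velocity: number;   // 1 = normal playback, 0 = paused
  timestamp: number;  // performance.now() value when the vector was sampled (ms)
}

function timelineNow(v: TimelineVector): number {
  return v.position + v.velocity * (performance.now() - v.timestamp) / 1000;
}

// Keep `video` roughly on the shared timeline: hard seek on large errors,
// adjust playbackRate on small ones. Assumes the video is already playing;
// the 1s and 0.5 constants are arbitrary illustrative values.
function followTimeline(video: HTMLVideoElement, vector: TimelineVector): void {
  setInterval(() => {
    const target = timelineNow(vector);
    const skew = video.currentTime - target;  // positive: the video is ahead
    if (Math.abs(skew) > 1.0) {
      video.currentTime = target;             // too far off: jump
      video.playbackRate = 1;
    } else {
      video.playbackRate = 1 - skew * 0.5;    // small error: drift back gently
    }
  }, 500);
}

// Example: follow a timeline that started playing at position 0 "now".
const video = document.querySelector('video') as HTMLVideoElement;
followTimeline(video, { position: 0, velocity: 1, timestamp: performance.now() });
```

A standard primitive along the lines of the Timing Object proposal would let the user agent perform this adjustment natively, presumably with better accuracy than script-level polling can achieve.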

This headlight project intends to assess the feasibility of this approach and the possibility of progressing this work on the standardization track.