Orientation Sensor

W3C Working Draft, 2 September 2021

This version:
https://www.w3.org/TR/2021/WD-orientation-sensor-20210902/
Latest published version:
https://www.w3.org/TR/orientation-sensor/
Editor's Draft:
https://w3c.github.io/orientation-sensor/
Previous Versions:
Version History:
https://github.com/w3c/orientation-sensor/commits/main/index.bs
Feedback:
public-device-apis@w3.org with subject line “[orientation-sensor] … message topic …” (archives)
Issue Tracking:
Orientation Sensor Issues Repository
Editors:
Kenneth Rohde Christiansen (Intel Corporation)
Anssi Kostiainen (Intel Corporation)
Former Editors:
Mikhail Pozdnyakov (Intel Corporation)
Alexander Shalamov (Intel Corporation)
Test Suite:
web-platform-tests on GitHub

Abstract

This specification defines a base orientation sensor interface and concrete sensor subclasses to monitor the device’s physical orientation in relation to a stationary three dimensional Cartesian coordinate system.

Status of this document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.

This document was published by the Devices and Sensors Working Group as a Working Draft. This document is intended to become a W3C Recommendation.

If you wish to make comments regarding this document, please send them to public-device-apis@w3.org (subscribe, archives). When sending e-mail, please put the text “orientation-sensor” in the subject, preferably like this: “[orientation-sensor] …summary of comment…”. All comments are welcome.

Publication as a Working Draft does not imply endorsement by the W3C Membership. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This document was produced by a group operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 15 September 2020 W3C Process Document.

The Devices and Sensors Working Group is pursuing modern security and privacy reviews for this specification in consideration of the amount of change in both this specification and in privacy and security review practices since the horizontal reviews took place on 14 October 2019. Similarly, the group is pursuing an update to the Technical Architecture Group review for this specification to account for the latest architectural review practices.

1. Introduction

The Orientation Sensor API extends the Generic Sensor API [GENERIC-SENSOR] to provide generic information describing the device’s physical orientation in relation to a three dimensional Cartesian coordinate system.

The AbsoluteOrientationSensor class inherits from the OrientationSensor interface and describes the device’s physical orientation in relation to the Earth’s reference coordinate system.

Other subclasses describe the orientation in relation to other stationary directions, such as true north, or to non-stationary directions, such as the device’s own z-position, drifting towards its latest most stable z-position.

The data provided by the OrientationSensor subclasses are similar to data from DeviceOrientationEvent, but the Orientation Sensor API has the following significant differences:

  1. The Orientation Sensor API represents orientation data in WebGL-compatible formats (quaternion, rotation matrix).

  2. The Orientation Sensor API satisfies stricter latency requirements.

  3. Unlike DeviceOrientationEvent, the OrientationSensor subclasses explicitly define which low-level motion sensors are used to obtain the orientation data, thus obviating possible interoperability issues.

  4. Instances of OrientationSensor subclasses are configurable via the SensorOptions constructor parameter (see the non-normative sketch after this list).
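
The following non-normative sketch illustrates differences 1 and 4: DeviceOrientationEvent delivers Euler angles in degrees, whereas an OrientationSensor subclass delivers a unit quaternion and is configured through its constructor options (the frequency value below is arbitrary).

// DeviceOrientationEvent exposes Euler angles (degrees).
window.addEventListener("deviceorientation", event => {
  console.log(event.alpha, event.beta, event.gamma);
});

// OrientationSensor subclasses expose a unit quaternion ([x, y, z, w])
// that can be handed to WebGL math libraries directly.
const sensor = new RelativeOrientationSensor({ frequency: 30 });
sensor.onreading = () => console.log(sensor.quaternion);
sensor.start();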

2. Use Cases and Requirements

The use cases and requirements are discussed in the Motion Sensors Explainer document.

3. Examples

This example subscribes to the onreading event handler and fills a sixteen-element Float32Array with the latest rotation matrix each time a new reading is available:

const sensor = new AbsoluteOrientationSensor();
const mat4 = new Float32Array(16);
sensor.start();
sensor.onerror = event => console.log(event.error.name, event.error.message);

sensor.onreading = () => {
  sensor.populateMatrix(mat4);
};

The next example polls the sensor at 60 Hz and reads the latest rotation matrix inside a requestAnimationFrame() callback. populateMatrix() throws if no reading is available yet, so the call is wrapped in try/catch:

const sensor = new AbsoluteOrientationSensor({ frequency: 60 });
const mat4 = new Float32Array(16);
sensor.start();
sensor.onerror = event => console.log(event.error.name, event.error.message);

function draw(timestamp) {
  window.requestAnimationFrame(draw);
  try {
    sensor.populateMatrix(mat4);
  } catch(e) {
    // mat4 has not been updated.
  }
  // Drawing...
}

window.requestAnimationFrame(draw);

4. Security and Privacy Considerations

There are no specific security and privacy considerations beyond those described in the Generic Sensor API [GENERIC-SENSOR].

5. Model

The OrientationSensor class extends the Sensor class and provides a generic interface for representing device orientation data.

To access the Orientation Sensor sensor type’s latest reading, the user agent must invoke the request sensor access abstract operation for each of the low-level sensors used by the concrete orientation sensor. The table below describes the mapping between concrete orientation sensors and the permission tokens defined by the low-level sensors.

OrientationSensor subclass    Permission tokens
AbsoluteOrientationSensor     "accelerometer", "gyroscope", "magnetometer"
RelativeOrientationSensor     "accelerometer", "gyroscope"

The AbsoluteOrientationSensor is a policy-controlled feature identified by strings "accelerometer", "gyroscope", and "magnetometer". Its default allowlist is 'self'.

The RelativeOrientationSensor is a policy-controlled feature identified by strings "accelerometer" and "gyroscope". Its default allowlist is 'self'.
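
For example, a document could delegate these features to a cross-origin frame through the allow attribute; the frame URL below is a placeholder:

<!-- Illustrative only: delegate the features required by AbsoluteOrientationSensor. -->
<iframe src="https://third-party.example/orientation.html"
        allow="accelerometer; gyroscope; magnetometer">
</iframe>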

A latest reading for a Sensor of Orientation Sensor sensor type includes an entry whose key is "quaternion" and whose value contains a four-element list. The elements of the list are equal to the components of a unit quaternion [QUATERNIONS] [Vx * sin(θ/2), Vy * sin(θ/2), Vz * sin(θ/2), cos(θ/2)], where V is the unit vector (whose elements are Vx, Vy, and Vz) representing the axis of rotation, and θ is the rotation angle about the axis defined by the unit vector V.

Note: The quaternion components are arranged in the list as [q1, q2, q3, q0] [QUATERNIONS], i.e. the components representing the vector part of the quaternion go first and the scalar part component, which is equal to cos(θ/2), goes last. This order is used for better compatibility with most existing WebGL frameworks; however, other libraries could use a different order when exposing a quaternion as an array, e.g. [q0, q1, q2, q3].
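
As a non-normative illustration, a reading could be reordered for a library that expects the scalar component first; sensor is assumed to be a started OrientationSensor subclass instance with a reading available:

// OrientationSensor exposes [x, y, z, w] (vector part first).
const [x, y, z, w] = sensor.quaternion;
// Some libraries expect the scalar part first, i.e. [w, x, y, z].
const scalarFirst = [w, x, y, z];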

The concrete OrientationSensor subclasses that are created through sensor-fusion of the low-level motion sensors are presented in the table below:

OrientationSensor subclass    Low-level motion sensors
AbsoluteOrientationSensor     Accelerometer, Gyroscope, Magnetometer
RelativeOrientationSensor     Accelerometer, Gyroscope

Note: The Accelerometer, Gyroscope, and Magnetometer low-level sensors are defined in the [ACCELEROMETER], [GYROSCOPE], and [MAGNETOMETER] specifications respectively. Sensor fusion is platform-specific and can happen in software or in hardware, e.g. on a sensor hub.

This example code explicitly queries permissions for AbsoluteOrientationSensor before calling start().
const sensor = new AbsoluteOrientationSensor();
Promise.all([navigator.permissions.query({ name: "accelerometer" }),
             navigator.permissions.query({ name: "magnetometer" }),
             navigator.permissions.query({ name: "gyroscope" })])
       .then(results => {
             if (results.every(result => result.state === "granted")) {
               sensor.start();
               // ...
             } else {
               console.log("No permissions to use AbsoluteOrientationSensor.");
             }
       });

Another approach is to simply call start() and subscribe to the onerror event handler.

const sensor = new AbsoluteOrientationSensor();
sensor.start();
sensor.onerror = event => {
  if (event.error.name === 'SecurityError')
    console.log("No permissions to use AbsoluteOrientationSensor.");
};

5.1. The AbsoluteOrientationSensor Model

The AbsoluteOrientationSensor class is a subclass of OrientationSensor which represents the Absolute Orientation Sensor.

For the absolute orientation sensor the value of latest reading["quaternion"] represents the rotation of a device’s local coordinate system in relation to the Earth’s reference coordinate system, defined as a three dimensional Cartesian coordinate system (x, y, z) where the x-axis is tangential to the ground and points east, the y-axis is tangential to the ground and points towards true north, and the z-axis is perpendicular to the ground and points up towards the sky.

The device’s local coordinate system is the same as defined for the low-level motion sensors. It can be either the device coordinate system or the screen coordinate system.

Note: The figure below represents the case where the device’s local coordinate system and the Earth’s reference coordinate system are aligned; therefore, the orientation sensor’s latest reading would represent 0 (rad) [SI] rotation about each axis.

Figure: AbsoluteOrientationSensor coordinate system.

5.2. The RelativeOrientationSensor Model

The RelativeOrientationSensor class is a subclass of OrientationSensor which represents the Relative Orientation Sensor.

For the relative orientation sensor the value of latest reading["quaternion"] represents the rotation of a device’s local coordinate system in relation to a stationary reference coordinate system. The stationary reference coordinate system may drift due to the bias introduced by the gyroscope, thus the rotation value provided by the sensor may also drift over time.

The stationary reference coordinate system is defined as an inertial three dimensional Cartesian coordinate system that remains stationary as the device hosting the sensor moves through the environment.

The device’s local coordinate system is the same as defined for the low-level motion sensors. It can be either the device coordinate system or the screen coordinate system.

Note: The relative orientation sensor data could be more accurate than that provided by the absolute orientation sensor, as it is not affected by magnetic fields.

6. API

6.1. The OrientationSensor Interface

typedef (Float32Array or Float64Array or DOMMatrix) RotationMatrixType;

[SecureContext, Exposed=Window]
interface OrientationSensor : Sensor {
  readonly attribute FrozenArray<double>? quaternion;
  undefined populateMatrix(RotationMatrixType targetMatrix);
};

enum OrientationSensorLocalCoordinateSystem { "device", "screen" };

dictionary OrientationSensorOptions : SensorOptions {
  OrientationSensorLocalCoordinateSystem referenceFrame = "device";
};

6.1.1. OrientationSensor.quaternion


Returns a four-element FrozenArray whose elements contain the components of the unit quaternion representing the device orientation. In other words, this attribute returns the result of invoking get value from latest reading with this and "quaternion" as arguments.
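
A minimal non-normative sketch of reading the attribute; quaternion is null until the first reading is delivered, so the value is guarded:

const sensor = new AbsoluteOrientationSensor();
sensor.onreading = () => {
  const q = sensor.quaternion; // FrozenArray [x, y, z, w]
  if (q !== null) {
    console.log(`x=${q[0]} y=${q[1]} z=${q[2]} w=${q[3]}`);
  }
};
sensor.start();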

6.1.2. OrientationSensor.populateMatrix()


The populateMatrix() method populates the given object with the rotation matrix converted from the value of latest reading["quaternion"] [QUATCONV], as shown below:

Figure: Converting quaternion to rotation matrix.

where W = cos(θ/2), X = Vx * sin(θ/2), Y = Vy * sin(θ/2), and Z = Vz * sin(θ/2) are the components of the unit quaternion stored in latest reading["quaternion"] (see § 5 Model).

The rotation matrix is flattened into the targetMatrix object in column-major order, as described in the populate rotation matrix algorithm below; a non-normative JavaScript sketch follows the algorithm steps.

To populate rotation matrix, the populateMatrix() method must run these steps or their equivalent:
  1. If targetMatrix is not of type defined by RotationMatrixType union, throw a "TypeError" exception and abort these steps.

  2. If targetMatrix is of type Float32Array or Float64Array with a size less than sixteen, throw a "TypeError" exception and abort these steps.

  3. Let quaternion be the result of invoking get value from latest reading with this and "quaternion" as arguments.

  4. If quaternion is null, throw a "NotReadableError" DOMException and abort these steps.

  5. Let x be the value of quaternion[0]

  6. Let y be the value of quaternion[1]

  7. Let z be the value of quaternion[2]

  8. Let w be the value of quaternion[3]

  9. If targetMatrix is of Float32Array or Float64Array type, run these sub-steps:

    1. Set targetMatrix[0] = 1 - 2 * y * y - 2 * z * z

    2. Set targetMatrix[1] = 2 * x * y - 2 * z * w

    3. Set targetMatrix[2] = 2 * x * z + 2 * y * w

    4. Set targetMatrix[3] = 0

    5. Set targetMatrix[4] = 2 * x * y + 2 * z * w

    6. Set targetMatrix[5] = 1 - 2 * x * x - 2 * z * z

    7. Set targetMatrix[6] = 2 * y * z - 2 * x * w

    8. Set targetMatrix[7] = 0

    9. Set targetMatrix[8] = 2 * x * z - 2 * y * w

    10. Set targetMatrix[9] = 2 * y * z + 2 * x * w

    11. Set targetMatrix[10] = 1 - 2 * x * x - 2 * y * y

    12. Set targetMatrix[11] = 0

    13. Set targetMatrix[12] = 0

    14. Set targetMatrix[13] = 0

    15. Set targetMatrix[14] = 0

    16. Set targetMatrix[15] = 1

  10. If targetMatrix is of DOMMatrix type, run these sub-steps:

    1. Set targetMatrix.m11 = 1 - 2 * y * y - 2 * z * z

    2. Set targetMatrix.m12 = 2 * x * y - 2 * z * w

    3. Set targetMatrix.m13 = 2 * x * z + 2 * y * w

    4. Set targetMatrix.m14 = 0

    5. Set targetMatrix.m21 = 2 * x * y + 2 * z * w

    6. Set targetMatrix.m22 = 1 - 2 * x * x - 2 * z * z

    7. Set targetMatrix.m23 = 2 * y * z - 2 * x * w

    8. Set targetMatrix.m24 = 0

    9. Set targetMatrix.m31 = 2 * x * z - 2 * y * w

    10. Set targetMatrix.m32 = 2 * y * z + 2 * x * w

    11. Set targetMatrix.m33 = 1 - 2 * x * x - 2 * y * y

    12. Set targetMatrix.m34 = 0

    13. Set targetMatrix.m41 = 0

    14. Set targetMatrix.m42 = 0

    15. Set targetMatrix.m43 = 0

    16. Set targetMatrix.m44 = 1
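
The Float32Array branch of these steps corresponds to the following non-normative JavaScript sketch; the helper name quaternionToMat4 and the error messages are illustrative only:

function quaternionToMat4(quaternion, targetMatrix) {
  // Steps 1-2: validate the target matrix.
  if (!(targetMatrix instanceof Float32Array) || targetMatrix.length < 16) {
    throw new TypeError("targetMatrix must be a Float32Array with at least sixteen elements");
  }
  // Step 4: no reading available.
  if (quaternion === null) {
    throw new DOMException("No reading available", "NotReadableError");
  }
  // Steps 5-8: unpack the quaternion components.
  const [x, y, z, w] = quaternion;
  // Step 9: write the rotation matrix, flattened in column-major order.
  targetMatrix.set([
    1 - 2 * y * y - 2 * z * z, 2 * x * y - 2 * z * w,     2 * x * z + 2 * y * w,     0,
    2 * x * y + 2 * z * w,     1 - 2 * x * x - 2 * z * z, 2 * y * z - 2 * x * w,     0,
    2 * x * z - 2 * y * w,     2 * y * z + 2 * x * w,     1 - 2 * x * x - 2 * y * y, 0,
    0,                         0,                         0,                         1
  ]);
  return targetMatrix;
}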

6.2. The AbsoluteOrientationSensor Interface

[SecureContext, Exposed=Window]
interface AbsoluteOrientationSensor : OrientationSensor {
  constructor(optional OrientationSensorOptions sensorOptions = {});
};

To construct an AbsoluteOrientationSensor object the user agent must invoke the construct an orientation sensor object abstract operation for the AbsoluteOrientationSensor interface.

Supported sensor options for AbsoluteOrientationSensor are "frequency" and "referenceFrame".
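
A brief non-normative sketch of passing both supported options; the frequency value is arbitrary:

// Request readings expressed in the screen coordinate system at about 60 Hz.
const sensor = new AbsoluteOrientationSensor({ frequency: 60, referenceFrame: "screen" });
sensor.start();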

6.3. The RelativeOrientationSensor Interface

[SecureContext, Exposed=Window]
interface RelativeOrientationSensor : OrientationSensor {
  constructor(optional OrientationSensorOptions sensorOptions = {});
};

To construct a RelativeOrientationSensor object the user agent must invoke the construct an orientation sensor object abstract operation for the RelativeOrientationSensor interface.

Supported sensor options for RelativeOrientationSensor are "frequency" and "referenceFrame".
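
Similarly, a non-normative sketch for RelativeOrientationSensor; when referenceFrame is omitted it defaults to "device", and the constructor can throw, for example a SecurityError when sensor access is blocked by permissions policy:

let sensor;
try {
  sensor = new RelativeOrientationSensor({ frequency: 30 });
  sensor.onreading = () => console.log(sensor.quaternion);
  sensor.start();
} catch (error) {
  // For example SecurityError (blocked by permissions policy) or
  // ReferenceError (interface not exposed in this context).
  console.log(error.name, error.message);
}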

7. Abstract Operations

7.1. Construct an Orientation Sensor object

input

orientation_interface, an interface identifier whose inherited interfaces contain OrientationSensor.

options, an OrientationSensorOptions object.

output

An OrientationSensor object.

  1. Let allowed be the result of invoking check sensor policy-controlled features with the interface identified by orientation_interface.

  2. If allowed is false, then:

    1. Throw a SecurityError DOMException.

  3. Let orientation be a new instance of the interface identified by orientation_interface.

  4. Invoke initialize a sensor object with orientation and options.

  5. If options.referenceFrame is "screen", then:

    1. Define local coordinate system for orientation as the screen coordinate system.

  6. Otherwise, define local coordinate system for orientation as the device coordinate system.

  7. Return orientation.
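
The steps above can be summarized with the following non-normative JavaScript sketch; checkSensorPolicyControlledFeatures() and initializeSensorObject() are hypothetical stand-ins for the corresponding Generic Sensor API operations, and localCoordinateSystem models an internal concept rather than an exposed attribute:

function constructOrientationSensor(orientationInterface, options) {
  // Steps 1-2: check the sensor policy-controlled features.
  if (!checkSensorPolicyControlledFeatures(orientationInterface)) {
    throw new DOMException("Access denied by permissions policy", "SecurityError");
  }
  // Steps 3-4: create a new instance and initialize it as a sensor object.
  const orientation = Object.create(orientationInterface.prototype);
  initializeSensorObject(orientation, options);
  // Steps 5-6: pick the local coordinate system from the referenceFrame option.
  orientation.localCoordinateSystem =
      options.referenceFrame === "screen" ? "screen" : "device";
  // Step 7.
  return orientation;
}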

8. Automation

This section extends the automation section defined in the Generic Sensor API [GENERIC-SENSOR] to provide mocking information about the device’s physical orientation in relation to a three dimensional Cartesian coordinate system for the purposes of testing a user agent’s implementation of AbsoluteOrientationSensor and RelativeOrientationSensor APIs.

8.1. Mock Sensor Type

The AbsoluteOrientationSensor class has an associated mock sensor type which is "absolute-orientation"; its mock sensor reading values dictionary is defined as follows:

dictionary AbsoluteOrientationReadingValues {
  required FrozenArray<double>? quaternion;
};

The RelativeOrientationSensor class has an associated mock sensor type which is "relative-orientation"; its mock sensor reading values dictionary is defined as follows:

dictionary RelativeOrientationReadingValues : AbsoluteOrientationReadingValues {
};
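
For instance, a mock sensor reading value carrying the identity rotation could look like the following non-normative snippet; how it is injected is defined by the automation machinery of the Generic Sensor API:

// quaternion [0, 0, 0, 1] encodes θ = 0, i.e. no rotation about any axis.
const mockReading = { quaternion: [0, 0, 0, 1] };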

9. Acknowledgements

Tobie Langel for the work on Generic Sensor API.

10. Conformance

Conformance requirements are expressed with a combination of descriptive assertions and RFC 2119 terminology. The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in the normative parts of this document are to be interpreted as described in RFC 2119. However, for readability, these words do not appear in all uppercase letters in this specification.

All of the text of this specification is normative except sections explicitly marked as non-normative, examples, and notes. [RFC2119]

A conformant user agent must implement all the requirements listed in this specification that are applicable to user agents.

The IDL fragments in this specification must be interpreted as required for conforming IDL fragments, as described in the Web IDL specification. [WEBIDL]

Index

Terms defined by this specification

Terms defined by reference

References

Normative References

[ACCELEROMETER]
Anssi Kostiainen; Alexander Shalamov. Accelerometer. 24 July 2021. CR. URL: https://www.w3.org/TR/accelerometer/
[GENERIC-SENSOR]
Rick Waldron; et al. Generic Sensor API. 29 July 2021. CR. URL: https://www.w3.org/TR/generic-sensor/
[GEOMETRY-1]
Simon Pieters; Chris Harrelson. Geometry Interfaces Module Level 1. 4 December 2018. CR. URL: https://www.w3.org/TR/geometry-1/
[GYROSCOPE]
Anssi Kostiainen; Mikhail Pozdnyakov. Gyroscope. 24 July 2021. CR. URL: https://www.w3.org/TR/gyroscope/
[INFRA]
Anne van Kesteren; Domenic Denicola. Infra Standard. Living Standard. URL: https://infra.spec.whatwg.org/
[MAGNETOMETER]
Anssi Kostiainen; Rijubrata Bhaumik. Magnetometer. 24 July 2021. WD. URL: https://www.w3.org/TR/magnetometer/
[PERMISSIONS-POLICY-1]
Ian Clelland. Permissions Policy. 16 July 2020. WD. URL: https://www.w3.org/TR/permissions-policy-1/
[RFC2119]
S. Bradner. Key words for use in RFCs to Indicate Requirement Levels. March 1997. Best Current Practice. URL: https://datatracker.ietf.org/doc/html/rfc2119
[WEBIDL]
Boris Zbarsky. Web IDL. 15 December 2016. ED. URL: https://heycam.github.io/webidl/

Informative References

[HTML]
Anne van Kesteren; et al. HTML Standard. Living Standard. URL: https://html.spec.whatwg.org/multipage/
[QUATCONV]
Alan H. Watt; Mark Watt. Advanced Animation and Rendering Techniques, page 362. 1992. Informational. URL: https://www.cs.cmu.edu/afs/cs/academic/class/15462-s14/www/lec_slides/3DRotationNotes.pdf
[QUATERNIONS]
Quaternion. URL: https://en.wikipedia.org/wiki/Quaternion
[SI]
SI Brochure: The International System of Units (SI), 8th edition. 2014. URL: http://www.bipm.org/en/publications/si-brochure/

IDL Index

typedef (Float32Array or Float64Array or DOMMatrix) RotationMatrixType;

[SecureContext, Exposed=Window]
interface OrientationSensor : Sensor {
  readonly attribute FrozenArray<double>? quaternion;
  undefined populateMatrix(RotationMatrixType targetMatrix);
};

enum OrientationSensorLocalCoordinateSystem { "device", "screen" };

dictionary OrientationSensorOptions : SensorOptions {
  OrientationSensorLocalCoordinateSystem referenceFrame = "device";
};

[SecureContext, Exposed=Window]
interface AbsoluteOrientationSensor : OrientationSensor {
  constructor(optional OrientationSensorOptions sensorOptions = {});
};

[SecureContext, Exposed=Window]
interface RelativeOrientationSensor : OrientationSensor {
  constructor(optional OrientationSensorOptions sensorOptions = {});
};

dictionary AbsoluteOrientationReadingValues {
  required FrozenArray<double>? quaternion;
};

dictionary RelativeOrientationReadingValues : AbsoluteOrientationReadingValues {
};