Generic Sensor API

W3C Working Draft, 30 May 2017

This version:
https://www.w3.org/TR/2017/WD-generic-sensor-20170530/
Latest published version:
https://www.w3.org/TR/generic-sensor/
Editor's Draft:
https://w3c.github.io/sensors/
Feedback:
public-device-apis@w3.org with subject line “[generic-sensor] … message topic …” (archives)
GitHub (new issue, level 1 issues, all issues)
Editors:
Tobie Langel (Intel Corporation)
Rick Waldron (JS Foundation)
Other:
Test suite, version history

Abstract

This specification defines a framework for exposing sensor data to the Open Web Platform in a consistent way. It does so by defining a blueprint for writing specifications of concrete sensors along with an abstract Sensor interface that can be extended to accommodate different sensor types.

Status of this document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.

This document was published by the Device and Sensors Working Group as a Working Draft. This document is intended to become a W3C Recommendation.

If you wish to make comments regarding this document, please send them to public-device-apis@w3.org (subscribe, archives). When sending e-mail, please put the text “generic-sensor” in the subject, preferably like this: “[generic-sensor] …summary of comment…”. All comments are welcome.

Publication as a Working Draft does not imply endorsement by the W3C Membership. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This document was produced by a group operating under the 5 February 2004 W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 1 March 2017 W3C Process Document.

1. Introduction

Increasingly, sensor data is used in application development to enable new use cases such as geolocation, counting steps or head-tracking. This is especially true on mobile devices where new sensors are added regularly.

Exposing sensor data to the Web has so far been both slow-paced and ad-hoc. Few sensors are already exposed to the Web. When they are, it is often in ways that limit their possible use cases (for example by exposing abstractions that are too high-level and which don’t perform well enough). APIs also vary greatly from one sensor to the next which increases the cognitive burden of Web application developers and slows development.

The goal of the Generic Sensor API is to promote consistency across sensor APIs, enable advanced use cases thanks to performant low-level APIs, and increase the pace at which new sensors can be exposed to the Web by simplifying the specification and implementation processes.

This specification lacks an informative section with examples for developers. It should illustrate different uses of the API, including using it in conjunction with requestAnimationFrame.
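A hedged, non-normative sketch of the latter use case follows. It assumes a hypothetical Gyroscope sensor type exposing x, y and z attributes (no such type is defined by this specification) and an application-supplied rotate() helper; the latest available reading is simply sampled once per animation frame:

let gyroscope = new Gyroscope({ frequency: 60 });
gyroscope.start();

function draw(frameTimestamp) {
    window.requestAnimationFrame(draw);
    // Use the latest available reading, if any, to update the scene.
    if (gyroscope.timestamp !== null) {
        rotate(gyroscope.x, gyroscope.y, gyroscope.z);
    }
}
window.requestAnimationFrame(draw);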

2. Scope

This section is non-normative.

The scope of this specification is currently limited to specifying primitives which enable exposing data from local sensors.

Exposing remote sensors or sensors found on personal area networks (e.g. Bluetooth) is out of scope. As work in these areas matures, it is possible that common, lower-level primitives will be found, in which case this specification will be updated accordingly. This should have little to no effect on implementations, however.

This specification also does not currently expose a sensor discovery API. This is because the limited number of sensors currently available to user agents does not warrant such an API. Using feature detection, such as described in §4 A note on Feature Detection of Hardware Features, is good enough for now. A subsequent version of this specification might specify such an API, and the current API has been designed with this in mind.

3. Background

This section is non-normative.

This section is ill-named. It principally covers default sensors and explains the reasoning behind them. It should be renamed accordingly and moved, either to another section of the spec or to an external explainer document.

The Generic Sensor API is designed to make the most common use cases straightforward while still enabling more complex use cases.

Most devices deployed today do not carry more than one sensor of each sensor type. This shouldn’t come as a surprise since use cases for more than one sensor of a given type are rare and generally limited to specific sensor types, such as proximity sensors.

The API therefore makes it easy to interact with the device’s default (and often unique) sensor for each type simply by instantiating the corresponding Sensor subclass.

Indeed, without specific information identifying a particular sensor of a given type, the default sensor is chosen.

Listening to geolocation changes:
let sensor = new GeolocationSensor({ accuracy: "high" });

sensor.onchange = function(event) {
    var coords = [sensor.latitude, sensor.longitude];
    updateMap(null, coords, sensor.accuracy);
};

sensor.onerror = function(error) {
    updateMap(error);
};
sensor.start();

Note: extensions to this specification may choose not to define a default sensor when doing so wouldn’t make sense. For example, it might be difficult to agree on an obvious default sensor for proximity sensors.

In cases where multiple sensors of the same type may coexist on the same device, specification extensions will have to define ways to uniquely identify each one.

For example, checking the pressure of the left rear tire:
var sensor = new DirectTirePressureSensor({ position: "rear", side: "left" });
sensor.onchange = _ => console.log(sensor.pressure);
sensor.start();

4. A note on Feature Detection of Hardware Features

This section is non-normative.

Feature detection is an established Web development best practice. Resources on the topic are plentiful on and offline and the purpose of this section is not to discuss it further, but rather to put it in the context of detecting hardware-dependent features.

Consider the below feature detection examples:

if (typeof Gyroscope === "function") {
    // run in circles...
}

if ("ProximitySensor" in window) {
    // watch out!
}

if (window.AmbientLightSensor) {
    // go dark...
}

// etc.

All of these tell you something about the presence and possible characteristics of an API. They do not tell you anything, however, about whether that API is actually connected to a real hardware sensor, whether that sensor works, whether it’s still connected, or even whether the user is going to allow you to access it. Note that you can check the latter using the Permissions API [PERMISSIONS].
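For example, assuming a sensor gated behind the "geolocation" permission name (actual permission names are defined by extension specifications), the current permission state can be queried up front:

navigator.permissions.query({ name: "geolocation" }).then(result => {
    if (result.state === "granted") {
        // Access has already been granted; starting the sensor
        // should not prompt the user.
    } else if (result.state === "prompt") {
        // The user agent will ask the user when the sensor is started.
    } else {
        // "denied": degrade gracefully.
    }
});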

In an ideal world, information about the underlying status would be available upfront. The problem with this is twofold. First, getting this information out of the hardware is costly, in both performance and battery time, and would sit in the critical path. Secondly, the status of the underlying hardware can evolve over time: the user can revoke permission, the connection to the sensor may be severed, the operating system may decide to limit sensor usage below a certain battery threshold, etc.

Therefore, an effective strategy is to combine feature detection, which checks whether an API for the sought-after sensor actually exists, and defensive programming which includes:

  1. checking for errors thrown when instantiating a Sensor object,

  2. listening to errors emitted by it,

  3. handling all of the above gracefully so that the user’s experience is enhanced by the possible usage of a sensor, not degraded by its absence.

try { // No need to feature detect thanks to try..catch block.
    var sensor = new GeolocationSensor();
    sensor.start();
    sensor.onerror = error => gracefullyDegrade(error);
    sensor.onchange = _ => updatePosition(sensor.latitude, sensor.longitude);
} catch(error) {
    gracefullyDegrade(error);
}

5. Security and privacy considerations

This section needs to be reorganized. It probably needs a section that lists threats and one that lists mitigation strategies, with links between both.

Privacy risks can arise when sensors are used with each other, in combination with other functionality, or when used over time, specifically with the risk of correlation of data and user identification through fingerprinting. Web application developers using these JavaScript APIs should consider how this information might be correlated with other information and the privacy risks that might be created. The potential risks of collection of such data over a longer period of time should also be considered.

Variations in sensor readings as well as event firing rates offer the possibility of fingerprinting to identify users. User agents may reduce the risk by limiting event rates available to web application developers.

Note: do we really want this mitigation strategy?

Frequency polling in periodic reporting mode might allow the fingerprinting of hardware or implementation types, by probing which actual frequencies are supported by the platform.

Minimizing the accuracy of a sensor’s readout generally decreases the risk of fingerprinting. User agents should not provide unnecessarily verbose readouts of sensors data. Each sensor type should be assessed individually.

If the same JavaScript code using the API can be used simultaneously in different window contexts on the same device, it may be possible for that code to correlate the user across those two contexts, creating unanticipated tracking mechanisms.

User agents should consider providing the user an indication of when the sensor is used and allowing the user to disable it. Additionally, user agents may consider allowing the user to verify past and current sensor use patterns.

Web application developers that use sensors should perform a privacy impact assessment of their application taking all aspects of their application into consideration.

The ability to detect the full working set of sensors on a device can form an identifier and could be used for fingerprinting.

A combination of selected sensors can potentially be used to form an out of band communication channel between devices.

Sensors can potentially be used in cross-device linking and tracking of a user.

5.1. Mitigation Strategies

5.1.1. Secure Context

Sensor readings are explicitly flagged by the Secure Contexts specification [POWERFUL-FEATURES] as a high-value target for network attackers. Thus all interfaces defined by this specification or extension specifications must only be available within a secure context.

5.1.2. Top-Level Browsing Context

Sensor readings must only be available in the top-level browsing context to avoid the privacy risk of sharing the information defined in this specification (and specifications extending it) with contexts unfamiliar to the user.

Note: Feature Policy should allow securely relaxing those restrictions once it matures.

5.1.3. Losing Focus

When the top-level browsing context loses focus, or when a nested browsing context of a different origin gains focus (for example, when the user carries out an in-game purchase using a third-party payment service from within an iframe), the top-level browsing context suddenly finds itself in a position to carry out a skimming attack against the browsing context that has gained focus.

To mitigate this threat, readings of sensors running in a top-level browsing context must not be delivered in such cases. A security check is run before sensor readings are delivered to ensure that.

5.1.4. Visibility State

Sensor readings must only be available in browsing contexts that are visible to the user, that is, whose visibility state is "visible". A security check is run before sensor readings are delivered to ensure that.

Certain use cases require sensors to have background access. Using a more complex PermissionDescriptor (e.g. with a boolean allowBackgroundUsage = false dictionary member) might be the solution to relax this restriction.

5.1.5. Permissions API

Access to sensor readings must be controlled by the Permissions API [PERMISSIONS]. User agents may use a number of criteria to grant access to the readings. Access may be granted without prompting the user.

5.2. Mitigation strategies applied on a case by case basis

Each sensor type will need to be assessed individually, taking into account the use cases it enables and its particular threat profile. While some of the below mitigation strategies are effective for certain sensors, they might also hinder or altogether prevent certain use cases.

Note: These mitigation strategies can be applied constantly or temporarily, for example when the user is carrying out specific actions, when other APIs which are known to amplify the level of the threat are in use, etc.

5.2.1. Limit maximum polling frequency

User agents may mitigate certain threats by limiting the maximum polling frequency. What upper limit to choose depends on the sensor type, the kind of threats the user agent is trying to protect against, the expected resources of the attacker, etc.

Limiting the maximum polling frequency prevents use cases which rely on low latency or high data density.

5.2.2. Stopping the sensor altogether

This is obviously a last-resort solution, but it can be extremely effective if it is temporary, for example to prevent password skimming attempts when the user is entering credentials on a different origin [RFC6454] or in a different application.

5.2.3. Limit number of delivered readings

An alternative to limiting the maximum polling frequency is to limit the number of sensor readings delivered to Web application developers, regardless of what frequency the sensor is polled at. This allows use cases which have low latency requirements to increase the polling frequency without increasing the amount of data provided.

Discarding intermediary readings prevents certain use cases, such as those relying on certain kinds of filters.

5.2.4. Reducing accuracy

Reducing the accuracy of sensor readings or sensor reading timestamps might also help mitigate certain threats, thus user agents should not provide unnecessarily verbose readouts of sensors data.

However, certain use cases require highly accurate readings, especially when operations carried out on the readings, or time deltas calculated from the timestamps, increase inaccuracies exponentially.

Note: while adding random bias to sensor readings has similar effects, it shouldn’t be used in practice as it is easy to filter out the added noise.

5.2.5. Keeping the user informed about API use

User agents may choose to keep the user informed about current and past use of the API.

Note: this does not imply keeping a log of the actual sensor readings which would have issues of its own.

6. Concepts

6.1. Sensors

A sensor measures different physical quantities and provides corresponding raw sensor readings, which are a source of information about the user and their environment.

Each reading is composed of the values of the different physical quantities measured by the sensor at time tₙ.

Known, predictable discrepancies between raw sensor readings and the corresponding physical quantities being measured are corrected through calibration.

Known but unpredictable discrepancies need to be addressed dynamically through a process called sensor fusion.

Calibrated raw sensor readings are referred to as sensor readings, whether or not they have undergone sensor fusion.

6.2. Sensor Types

Different sensor types measure different physical quantities such as temperature, air pressure, heart-rate, or luminosity.

For the purpose of this specification we distinguish between high-level and low-level sensor types.

Sensor types which are characterized by their implementation are referred to as low-level sensors. For example a Gyroscope is a low-level sensor type.

Sensors named after their readings, regardless of the implementation, are said to be high-level sensors. For instance, geolocation sensors provide information about the user’s location, but the precise means by which this data is obtained is purposefully left opaque (it could come from a GPS chip, network cell triangulation, wifi networks, etc. or any combination of the above) and depends on various, implementation-specific heuristics. High-level sensors are generally the fruits of applying algorithms to low-level sensors—for example, a pedometer can be built using only the output of a gyroscope—or of sensor fusion.

That said, the distinction between high-level and low-level sensor types is somewhat arbitrary and the line between the two is often blurred. For instance, a barometer, which measures air pressure, would be considered low-level for most common purposes, even though it is the product of the sensor fusion of resistive piezo-electric pressure and temperature sensors. Exposing the sensors that compose it would serve no practical purpose; who cares about the temperature of a piezo-electric sensor? A pressure-altimeter would probably fall in the same category, while a nondescript altimeter—which could get its data from either a barometer or a GPS signal—would clearly be categorized as a high-level sensor type.

Because the distinction is somewhat blurry, extensions to this specification (see §10 Extensibility) are encouraged to provide domain-specific definitions of high-level and low-level sensors for the given sensor types they are targeting.

Sensor readings from different sensor types can be combined together through a process called sensor fusion. This process provides higher-level or more accurate data (often at the cost of increased latency). For example, the readings of a three-axis magnetometer need to be combined with the readings of an accelerometer to provide a correct bearing.
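As a purely illustrative, non-normative sketch, assuming hypothetical Magnetometer and Accelerometer sensor types exposing x, y and z attributes and an application-supplied updateCompass() helper:

let magnetometer = new Magnetometer({ frequency: 10 });
let accelerometer = new Accelerometer({ frequency: 10 });
magnetometer.start();
accelerometer.start();

magnetometer.onchange = () => {
    // With the device held flat, a heading can be derived from the
    // magnetometer alone. In practice, the accelerometer readings are
    // needed to estimate the device's tilt (pitch and roll) and to
    // compensate the magnetic field vector before computing the heading.
    let heading = Math.atan2(magnetometer.y, magnetometer.x) * 180 / Math.PI;
    updateCompass(heading, accelerometer.x, accelerometer.y, accelerometer.z);
};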

Smart sensors and sensor hubs have built-in compute resources which allow them to carry out calibration and sensor fusion at the hardware level, freeing up CPU resources and lowering battery consumption in the process.

But sensor fusion can also be carried out in software. This is particularly useful when performance requirements can only be met by relying on application-specific data. For example, head tracking for virtual or augmented reality applications requires extremely low latency to avoid causing motion sickness. That low latency is best provided by using the raw output of a gyroscope and waiting for quick rotational movements of the head to compensate for drift.

Note: sensors created through sensor fusion are sometimes called virtual or synthetic sensors. However, this specification makes no practical difference between them, preferring instead to differentiate sensors as to whether they are named after the kind of readings they produce (high-level sensors) or after how the sensor is implemented (low-level sensors).

6.3. Reporting Modes

This feature is at risk. It is not clear whether there is value in splitting up sensor types between those that fire events at regular intervals and those which don’t.

Sensors have different reporting modes. When sensor readings are reported at regular intervals, at an adjustable frequency measured in hertz (Hz), the reporting mode is said to be periodic. On sensor types with support for periodic reporting mode, periodic reporting mode is triggered by requesting a specific frequency.

Sensor types which do not support periodic reporting mode are said to operate in an implementation specific way. When the reporting mode is implementation specific, sensor readings may be provided at regular intervals, irregularly, or only when a reading change is observed. This allows user agents more latitude to carry out power- or CPU-saving strategies, and support multiple hardware configurations. Periodic reporting mode, on the other hand, allows a much more fine-grained approach and is essential for use cases with, for example, low latency requirements.

Sensors which support periodic reporting mode fall back to implementation specific reporting mode when no requirements are made as to what frequency they should operate at.

Note: reporting mode is distinct from, but related to, sensor reading acquisition. If sensors are polled at regular intervals, as is generally the case, the reporting mode can be either periodic or implementation specific. However, when the underlying implementation itself only provides sensor readings when it measures change, perhaps because it relies on smart sensors or a sensor hub, the reporting mode cannot be periodic, as that would require data inference.
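By way of illustration, requesting a frequency is what selects periodic reporting mode, while omitting it leaves the sensor in its implementation specific reporting mode (GeolocationSensor being, as elsewhere in this document, a hypothetical sensor type):

// Periodic reporting mode: readings are requested at 5 Hz.
let periodic = new GeolocationSensor({ frequency: 5 });
periodic.start();

// No frequency requested: the sensor falls back to the
// implementation specific reporting mode.
let adhoc = new GeolocationSensor();
adhoc.start();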

This lacks a description of the different data acquisition modes, notably polling vs. on change, both at the platform and HW layer.

It would be useful to describe the process of sensor polling and how increased sensor polling frequency decreases latency.

A definition of sensor accuracy and how it affects thresholds, and thus "on change" sensors, would be useful.

7. Model

A diagram would really help here.

7.1. Sensor Type

A sensor type has an associated interface whose inherited interfaces contains Sensor.

A sensor type has a set of associated sensors.

If a sensor type has more than one sensor, it must have a set of associated identifying parameters to select the right sensor to associate with each new Sensor object.

A sensor type may have a default sensor.

A sensor type has an associated PermissionName.

Note: multiple sensor types may share the same PermissionName.

A sensor type has a permission revocation algorithm.

To invoke the permission revocation algorithm with PermissionName permission_name, run the following steps:

  1. For each sensor_type which has an associated PermissionName permission_name:

    1. For each sensor in sensor_type’s set of associated sensors,

      1. Invoke the revoke sensor permission abstract operation with sensor as argument.

7.2. Sensor

A sensor has an associated set of activated Sensor objects. This set is initially empty.

A sensor has an associated latest reading map which holds the latest available sensor readings.

Does the latest reading map need to be tied to an origin?

The latest reading map contains an entry whose key is "timestamp" and whose value is a high resolution timestamp of the time at which the latest reading was obtained expressed in milliseconds that passed since the time origin. latest reading["timestamp"] is initially set to null, unless the latest reading map caches a previous reading.

The other entries of the latest reading map hold the values of the different quantities measured by the sensor. The keys of these entries must match the attribute names defined by the sensor type's associated interface, so that the getter of the foo attribute can simply return latest reading["foo"].

The value of each latest reading entry is initially set to null.
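As a non-normative illustration, assuming a hypothetical AmbientLightSensor extension whose interface defines an illuminance attribute, its getter would simply mirror the corresponding entry of the latest reading map:

let sensor = new AmbientLightSensor();
sensor.onchange = () => {
    // Equivalent to latest reading["illuminance"] and latest reading["timestamp"].
    console.log(sensor.illuminance, sensor.timestamp);
};
sensor.start();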

A sensor supports periodic reporting mode if its associated sensor type does.

A sensor has an associated reporting flag which is initially unset.

A sensor has an associated periodic reporting mode flag which is initially unset.

A sensor has an associated current polling frequency which is initially null.

8. API

8.1. The Sensor Interface

[SecureContext]
interface Sensor : EventTarget {
  readonly attribute boolean activated;
  readonly attribute DOMHighResTimeStamp? timestamp;
  void start();
  void stop();
  attribute EventHandler onchange;
  attribute EventHandler onactivate;
  attribute EventHandler onerror;
};

dictionary SensorOptions {
  double? frequency;
};

A Sensor object has an associated sensor.

8.1.1. Sensor lifecycle

[Figure: Sensor lifecycle state diagram. Constructing the object places it in the "idle" state; start() transitions it from "idle" to "activating"; from "activating" it either transitions to "activated" or returns to "idle" on error; stop() or an error returns it from "activated" to "idle".]

8.1.2. Sensor internal slots

Instances of Sensor are created with the internal slots described in the following table:

Internal Slot Description (non-normative)
[[state]] The current state of the Sensor object, which is one of "idle", "activating", or "activated". It is initially "idle".
[[desiredPollingFrequency]] The requested polling frequency. It is initially unset.
[[lastEventFiredAt]] The high resolution timestamp of the latest sensor reading that was sent to observers of the Sensor object, expressed in milliseconds that passed since the time origin. It is initially null.
[[waitingForUpdate]] A boolean which indicates whether the observers have been updated or whether the object is waiting for a new reading to do so. It is initially true.
[[identifyingParameters]] A sensor type-specific group of dictionary members used to select the correct sensor to associate with this Sensor object.

8.1.3. Sensor.activated

The getter of the activated attribute must run these steps or their equivalent:
  1. If this.[[state]] is "activated", return true.

  2. Otherwise, return false.

8.1.4. Sensor.timestamp

The getter of the timestamp attribute returns latest reading["timestamp"].

8.1.5. Sensor.start()

The start() method must run these steps or their equivalent:
  1. Let sensor_state be the value of sensor_instance.[[state]].

  2. If sensor_state is either "activating" or "activated", then return.

  3. Set sensor_instance.[[state]] to "activating".

  4. Run these sub-steps in parallel:

    1. let connected be the result of invoking the Connect to Sensor abstract operation.

    2. If connected is false, then abort these steps.

    3. Let permission_state be the result of invoking the Request Sensor Access abstract operation, passing it sensor_instance as argument.

    4. If permission_state is "granted",

      1. Invoke Register a Sensor Object passing it sensor_instance as argument.

    5. Otherwise, if permission_state is "denied",

      1. let e be the result of creating a "NotAllowedError" DOMException.

      2. Invoke the Handle Errors abstract operation, passing it e and sensor_instance as arguments.

8.1.6. Sensor.stop()

The stop() method must run these steps or their equivalent:
  1. If sensor_instance.[[state]] is "idle", then return.

  2. Set sensor_instance.[[state]] to "idle".

  3. Run these sub-steps in parallel:

    1. Invoke Unregister a Sensor passing it sensor_instance as argument.

8.1.7. Sensor.onchange

onchange is an EventHandler which is called whenever a new reading is available.

Issue #205 on GitHub: “Agree on event names”

Right now, no one's happy with onchange.

There was a vague consensus around "onread"/"onreading" at some point, and also a will to move towards the "ondata"/"onchange" Node.js-inspired model (which seems a poor fit for the Web). See notably #152 (comment) and following thread.

Maybe "onupdate" is more appropriate, or even "onsample" (if we're consistent with the #209 renaming).

Either way, this also depends on whether we have different reporting modes (i.e. data is updated when there's a change vs. periodically), so this is probably where we might need to start.

8.1.8. Sensor.onactivate

onactivate is an EventHandler which is called when this.[[state]] transitions from "activating" to "activated".

8.1.9. Sensor.onerror

onerror is an EventHandler which is called whenever an exception cannot be handled synchronously.

8.1.10. Event handlers

The following are the event handlers (and their corresponding event handler event types) that must be supported as attributes by the objects implementing the Sensor interface:

event handler event handler event type
onchange change
onactivate activate
onerror error

8.2. The SensorErrorEvent Interface

[SecureContext, Constructor(DOMString type, SensorErrorEventInit errorEventInitDict)]
interface SensorErrorEvent : Event {
  readonly attribute Error error;
};

dictionary SensorErrorEventInit : EventInit {
  required Error error;
};

8.2.1. SensorErrorEvent.error

Gets the Error object passed to SensorErrorEventInit.

9. Abstract Operations

9.1. Construct Sensor Object

input

options, a SensorOptions object.

output

sensor_instance, a Sensor object.

  1. If the incumbent settings object is not a secure context, then:

    1. throw a SecurityError.

  2. If the browsing context is not a top-level browsing context, then:

    1. throw a SecurityError.

  3. Let sensor_instance be a new Sensor object,

  4. If sensor supports periodic reporting mode and options.frequency is present, then

    1. Set sensor_instance.[[desiredPollingFrequency]] to options.frequency.

    Note: there is no guarantee that the requested options.frequency can be respected. The actual frequency can be calculated using Sensor timestamp attributes, as sketched after this algorithm.

  5. If identifying parameters in options are set, then:

    1. Set sensor_instance.[[identifyingParameters]] to identifying parameters.

  6. Set sensor_instance.[[state]] to "idle".

  7. Return sensor_instance.
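As a non-normative sketch of the note in step 4 above, a Web application developer can estimate the frequency actually obtained from consecutive timestamp values (GeolocationSensor again being a hypothetical sensor type):

let sensor = new GeolocationSensor({ frequency: 60 });
let previousTimestamp = null;

sensor.onchange = () => {
    if (previousTimestamp !== null) {
        // Timestamps are expressed in milliseconds since the time origin.
        let actualFrequency = 1000 / (sensor.timestamp - previousTimestamp);
        console.log("Actual frequency: " + actualFrequency.toFixed(1) + " Hz");
    }
    previousTimestamp = sensor.timestamp;
};
sensor.start();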

9.2. Connect to Sensor

input

sensor_instance, a Sensor object.

output

a boolean.

  1. If sensor_instance.[[identifyingParameters]] is set and sensor_instance.[[identifyingParameters]] allows a unique sensor to be identified, then:

    1. let sensor be that sensor,

    2. associate sensor_instance with sensor.

    3. Return true.

  2. If the sensor type of sensor_instance has an associated default sensor and there is a corresponding sensor on the device, then

    1. associate sensor_instance with default sensor.

    2. Return true.

  3. let e be the result of creating a "NotReadableError" DOMException.

  4. Invoke the Handle Errors abstract operation, passing it e and sensor_instance as arguments.

  5. Return false.

9.3. Register a Sensor Object

input

sensor_instance, a Sensor object.

output

None

  1. Let sensor be the sensor associated with sensor_instance.

  2. Add sensor_instance to sensor’s set of activated Sensor objects.

  3. Invoke the Set Sensor Settings abstract operation, passing it sensor as argument.

9.4. Unregister a Sensor

input

sensor_instance, a Sensor object.

output

None

  1. Let sensor be the sensor associated with sensor_instance.

  2. Remove sensor_instance from sensor’s set of activated Sensor objects.

  3. If sensor’s set of activated Sensor objects is empty,

    1. Unset the periodic reporting mode flag.

    2. Set current polling frequency to null.

    3. Update the user-agent-specific way in which sensor readings are obtained from sensor to no longer provide readings.

    4. Abort these steps.

  4. Invoke the Set Sensor Settings abstract operation, passing it sensor as argument.

9.5. Revoke sensor permission

input

sensor, a sensor.

output

None

  1. let activated_sensors be sensor’s associated set of activated Sensor objects.

  2. For each s of activated_sensors,

    1. Remove s from activated_sensors.

    2. let e be the result of creating a "NotAllowedError" DOMException.

    3. Invoke the Handle Errors abstract operation, passing it e and s as arguments.

  3. Unset sensor’s periodic reporting mode flag.

  4. Set sensor’s current polling frequency to null.

  5. Update the user-agent-specific way in which sensor readings are obtained from sensor to no longer provide readings.

9.6. Set Sensor Settings

input

sensor, a sensor.

output

None

  1. Let settings_changed be false.

  2. Let is_periodic be the result of invoking the Is Current Reporting Mode Periodic abstract operation, with sensor as argument.

  3. If is_periodic is false and the periodic reporting mode flag is set, then

    1. set settings_changed to true.

    2. Unset the periodic reporting mode flag.

  4. Otherwise if is_periodic is true and the periodic reporting mode flag is unset, then

    1. set settings_changed to true.

    2. Set the periodic reporting mode flag.

  5. Let frequency be the result of invoking the Find the polling frequency of a Sensor abstract operation, with sensor as argument.

  6. If frequency is different from sensor’s current polling frequency,

    1. set settings_changed to true.

    2. Set current polling frequency to frequency.

  7. If settings_changed is true

    1. Invoke the Observe a Sensor abstract operation, passing it sensor as argument.

This abstract operation needs to return settings_changed instead of invoking the Observe a Sensor abstract operation itself.

9.7. Observe a Sensor

This needs to be refactored into an abstract operation that has access to the Sensor instance sensor_instance that just got started.

input

sensor, a sensor.

output

None

  1. If sensor’s latest reading["timestamp"] is not null, invoke the update observers abstract operation passing it sensor_instance and latest reading["timestamp"] as arguments.

  2. Otherwise, poll sensor immediately.

    Issue #214 on GitHub: “Sensors unable to provide their state when instantiated”

    Some HW sensors on some platforms are unable to provide their state upon instantiation and sometimes take a very long time to provide any reading whatsoever.

    This is related to an old issue (see #87), was a concern in previous APIs and this behavior is currently in violation of the spec.

    Perhaps the spec should special case such sensors and force them to advertise their inability to provide the current state upfront (e.g. via a dedicated event?)

  3. If sensor’s periodic reporting mode flag is set,

    1. let frequency be the current polling frequency, capped by the upper and lower bounds of the underlying hardware.

      Should this max polling frequency be reflected in the Sensor interface? E.g. Through a dedicated attribute?

      Does the max polling frequency affect the reporting frequency? If so, should we advise the developer of this issue? E.g. via a dedicated event?

    2. Poll sensor at frequency.

    3. Hook into the requestAnimationFrame framework [HTML] to invoke the update latest reading abstract operation with every new frame passing it sensor and the latest sensor reading as arguments.

      Relying on requestAnimationFrame gives us a perfect point to buffer readings > 60Hz and to pass them together with every new frame. That’s a level 2 feature.

      Figure out how to handle sensors/platforms that push the data rather than wait for it to be polled.

  4. If the periodic reporting mode flag is unset,

    1. the user-agent can decide on the best reporting strategy for this particular sensor and sensor type.

      This needs to be defined better.

9.8. Is Current Reporting Mode Periodic

input

sensor, a sensor.

output

result, a boolean.

  1. Let result be false.

  2. For each sensor_instance in sensor’s set of activated Sensor objects:

    1. if sensor_instance.[[desiredPollingFrequency]] is set,

      1. set result to true, then break.

  3. return result.

9.9. Find the polling frequency of a Sensor

input

sensor, a sensor.

output

frequency, a frequency.

  1. Let frequency be null.

  2. For each sensor_instance in sensor’s set of activated Sensor objects:

    1. let f be sensor_instance.[[desiredPollingFrequency]].

    2. if f is set and f is greater than frequency,

      1. set frequency to f.

  3. return frequency.
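The two abstract operations above are simple reductions over the sensor’s set of activated Sensor objects. A non-normative JavaScript sketch, in which activatedSensorObjects is an array standing in for that set and desiredPollingFrequency stands for the [[desiredPollingFrequency]] internal slot (null when unset), could read:

function isCurrentReportingModePeriodic(activatedSensorObjects) {
    // True as soon as one observer has requested a specific frequency.
    return activatedSensorObjects.some(
        instance => instance.desiredPollingFrequency !== null);
}

function findPollingFrequency(activatedSensorObjects) {
    // The highest requested frequency wins; null when none is requested.
    let frequency = null;
    for (let instance of activatedSensorObjects) {
        let f = instance.desiredPollingFrequency;
        if (f !== null && (frequency === null || f > frequency)) {
            frequency = f;
        }
    }
    return frequency;
}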

9.10. Update latest reading

input

sensor, a sensor.

reading, a sensor reading.

reading_timestamp, the timestamp at which sensor was polled.

The timestamp needs to be specified more precisely, see issue #155.

output

None

  1. If sensor’s reporting flag is set,

    1. abort these steps.

  2. If reading_timestamp is equal to latest reading["timestamp"],

    1. abort these steps.

  3. Set sensor’s reporting flag.

  4. If the result of invoking the security check is "insecure", then abort these steps.

    Issue #223 on GitHub: “Should a "suspended" state be added”

    Sensors would be in this state when the security check would return "insecure".

    Additionally, or alternatively, an "suspend" event could be fired in such cases.

    Follow-up questions:

    • What should the "activated" attribute getter return in such cases?
    • Do we need a "state" or "status" attribute instead?
    • Should an "activate" event be fired once the security check returns "secure" again?
    • Etc., etc.
  5. Set latest reading["timestamp"] to reading_timestamp.

  6. For each key → value pair of latest reading:

    1. If key is "timestamp", continue.

    2. Set latest reading[key] to the corresponding value of reading.

    Maybe compare value with corresponding value of reading to see if there’s a change that needs to be propagated.

  7. Unset sensor’s reporting flag.

9.11. Update Observers

input

sensor_instance, a Sensor object.

timestamp, a high resolution timestamp.

output

None

  1. If sensor_instance.[[state]] is "activating":

    1. Set sensor_instance.[[state]] to "activated".

    2. Fire an event named "activate" at sensor_instance.

  2. If sensor_instance.[[waitingForUpdate]] is true, then

    Should we fire delayed readings? Or should we just drop readings instead?

    1. Set sensor_instance.[[waitingForUpdate]] to false.

    2. Fire an event named "change" at sensor_instance.

    3. Set sensor_instance.[[lastEventFiredAt]] to timestamp.

    Issue #215 on GitHub: “Use simple event dispatch mechanism instead of task source (queued)”

    Short summary:

    Sensors are operating in simple Publish & Subscribe model, therefore, queued event processing is not required and over-complicates the API and its implementation.

    Detailed explanation:

    Event queue is suitable for cases where ordered task processing is needed. For example, networking, databases or APIs that provide sync API and manage async task queue. You can call write(data); multiple times and each write is ordered task, if error happens, all pending tasks are removed from queue and 'onerror' event task is put to the queue.

    Problems with queued task processing:

    • The 'onchange' event would need to carry reading data, otherwise when the event queue is processed, the event would not be in sync with Sensor.reading
    • When we attach reading to an event, we will have another problem: event.reading != Sensor.reading, which might be confusing for developers.

    Pros for simple event:

    • onchange is synchronized with Sensor.reading
    • No need to carry extra data in event.
    • No need to manage queue, we don't even need a queue, since we always have 1 update per event loop cycle per Sensor instance.
    • Simpler for browser vendors to implement.

    Proposed resolution:

    • Use simple event dispatch mechanism
    • Remove task source related information from the spec

9.12. Security Check

input

None

output

A string whose value is either "secure" or "insecure".

  1. Let document be the top-level browsing context's active document.

  2. Let current_visibility_state be the result of running the steps to determine the visibility state of document.

  3. If current_visibility_state is not "visible", then return "insecure".

  4. If the currently focused area of the current top-level browsing context is a nested browsing context whose active document's origin is not same origin-domain as document’s origin, then return "insecure".

  5. Let has_focus be the result of running the has focus steps passing it document as argument.

  6. If has_focus is false, then return "insecure".

  7. If the user agent loses focus, then return "insecure".

    Issue #2716 on GitHub: “Focus state is unspecified when user agent itself loses focus.”

    This makes it difficult to normatively stop certain behaviors (such as collecting sensor readings) when the app is unfocused but stays visible.

    This can notably allow PIN skimming attacks using sensors.

  8. Return "secure".

Note: user agents are encouraged to stop polling the sensors when the security check would return "insecure" in order to reduce resource consumption, notably battery usage.
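A rough, non-normative approximation of this check from script, which cannot observe every condition above (such as which nested browsing context currently holds focus), could be expressed as:

function securityCheck() {
    // Approximates the visibility state and focus conditions only.
    if (document.visibilityState !== "visible") return "insecure";
    if (!document.hasFocus()) return "insecure";
    return "secure";
}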

9.13. Handle Errors

input

sensor_instance, a Sensor object.

error, an exception.

output

None

  1. Set sensor_instance.[[state]] to "idle".

  2. Fire an event named "error" at sensor_instance using SensorErrorEvent with its error attribute initialized to error.

9.14. Request Sensor Access

input

sensor_instance, a Sensor object.

output

state, a permission state.

  1. Let sensor be the sensor associated with sensor_instance.

  2. Let permission_name be the PermissionName associated with sensor.

  3. Let state be the result of requesting permission to use permission_name.

  4. Return state.

10. Extensibility

This section is non-normative.

Its purpose is to describe how this specification can be extended to specify APIs for different sensor types.

Extension specifications are encouraged to focus on a single sensor type, exposing both high-level and low-level interfaces as appropriate.

10.1. Security

All interfaces defined by extension specifications should only be available within a secure context.

10.2. Naming

Sensor interfaces for low-level sensors should be named after their associated sensor. So for example, the interface associated with a gyroscope should be simply named Gyroscope. Sensor interfaces for high-level sensors should be named by combining the physical quantity the sensor measures with the "Sensor" suffix. For example, a sensor measuring the distance at which an object is from it may see its associated interface called ProximitySensor.

Attributes of the Sensor subclass that hold sensor reading values should be named after the full name of these values. For example, the Thermometer interface should hold the sensor reading's value in a temperature attribute (and not a value or temp attribute). A good starting point for naming are the Quantities, Units, Dimensions and Data Types Ontologies [QUDT].

10.3. Unit

Extension specifications must specify the unit of sensor readings.

As per the Technical Architecture Group’s (TAG) API Design Principles [API-DESIGN-PRINCIPLES], all time measurement should be in milliseconds. All other units should be specified using, in order of preference, and with the exception of temperature (for which Celsius should be favored over Kelvin), the International System of Units (SI), SI derived units, and Non-SI units accepted for use with the SI, as described in the SI Brochure [SI].

10.4. Exposing High-Level vs. Low-Level Sensors

So far, specifications exposing sensors to the Web platform have focused on high-level sensors APIs. [GEOLOCATION-API] [ORIENTATION-EVENT]

This was a reasonable approach for a number of reasons. Indeed, high-level sensors:

However, an increasing number of use cases such as virtual and augmented reality require low-level access to sensors, most notably for performance reasons.

Providing low-level access enables Web application developers to leverage domain-specific constraints and design more performant systems.

Following the precepts of the Extensible Web Manifesto [EXTENNNNSIBLE], extension specifications should focus primarily on exposing low-level sensor APIs, but should also expose high-level APIs when there are clear benefits in doing so.

10.5. When is Enabling Multiple Sensors of the Same Type Not the Right Choice?

TODO: provide guidance on when to:

10.6. Definition Requirements

The following definitions must be specified for each sensor type in extension specifications:

An extension specification may specify the following definitions for each sensor type:

10.7. Extending the Permission API

Provide guidance on how to extend the Permission API [PERMISSIONS] for each sensor type.

10.8. Example WebIDL

Here’s example WebIDL for a possible extension of this specification for proximity sensors.

[SecureContext, Constructor(optional ProximitySensorOptions proximitySensorOptions)]
interface ProximitySensor : Sensor {
    readonly attribute unrestricted double distance;
};

dictionary ProximitySensorOptions : SensorOptions {
    double? min = -Infinity;
    double? max = Infinity;
    ProximitySensorPosition? position;
    ProximitySensorDirection? direction;
};

enum ProximitySensorPosition {
    "top-left",
    "top",
    "top-right",
    "middle-left",
    "middle",
    "middle-right",
    "bottom-left",
    "bottom",
    "bottom-right"
};

enum ProximitySensorDirection {
    "front",
    "rear",
    "left",
    "right",
    "top",
    "bottom"
};
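And a corresponding, purely illustrative usage sketch (this specification defines no concrete sensor types):

let sensor = new ProximitySensor({ position: "bottom", direction: "front" });
sensor.onchange = _ => console.log(sensor.distance);
sensor.onerror = error => console.error(error);
sensor.start();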

11. Acknowledgements

First and foremost, I would like to thank Anssi Kostiainen for his continuous and dedicated support and input throughout the development of this specification, as well as Mikhail Pozdnyakov, Alexander Shalamov, Rijubrata Bhaumik, and Kenneth Rohde Christiansen for their invaluable implementation feedback, suggestions, and research that have helped inform the specification work.

Special thanks to Rick Waldron for driving the discussion around a generic sensor API design for the Web, sketching the original API on which this is based, providing implementation feedback from his work on Johnny-Five, and continuous input during the development of this specification.

Special thanks to Boris Smus, Tim Volodine, and Rich Tibbett for their initial work on exposing sensors to the web with consistency.

Thanks to Anne van Kesteren for his tireless help both in person and through IRC.

Thanks to Domenic Denicola and Jake Archibald for their help.

Thanks also to Frederick Hirsch and Dominique Hazaël-Massieux (via the HTML5Apps project) for both their administrative help and technical input.

Thanks to Tab Atkins for making Bikeshed and taking the time to explain its subtleties.

Thanks to Lukasz Olejnik and Maryam Mehr for their contributions around privacy and security.

The following people have greatly contributed to this specification through extensive discussions on GitHub: Anssi Kostiainen, Boris Smus, chaals, Claes Nilsson, Dave Raggett, David Mark Clements, Domenic Denicola, Dominique Hazaël-Massieux (via the HTML5Apps project), Francesco Iovine, Frederick Hirsch, gmandyam, Jafar Husain, Johannes Hund, Kris Kowal, Lukasz Olejnik, Marcos Caceres, Marijn Kruisselbrink, Mark Foltz, Mats Wichmann, Matthew Podwysocki, pablochacin, Remy Sharp, Rich Tibbett, Rick Waldron, Rijubrata Bhaumik, robman, Sean T. McBeth, smaug----, Tab Atkins Jr., Virginie Galindo, zenparsing, and Zoltan Kis.

We’d also like to thank Anssi Kostiainen, Dominique Hazaël-Massieux, Erik Wilde, and Michael[tm] Smith for their editorial input.

Conformance

Document conventions

Conformance requirements are expressed with a combination of descriptive assertions and RFC 2119 terminology. The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in the normative parts of this document are to be interpreted as described in RFC 2119. However, for readability, these words do not appear in all uppercase letters in this specification.

All of the text of this specification is normative except sections explicitly marked as non-normative, examples, and notes. [RFC2119]

Examples in this specification are introduced with the words "for example" or are set apart from the normative text with class="example", like this:

This is an example of an informative example.

Because this document doesn’t itself define APIs for specific sensor types (that is the role of extensions to this specification), all examples are inevitably (wishful) fabrications. Although all of the sensors used as examples would be great candidates for building atop the Generic Sensor API, their inclusion in this document does not imply that the relevant Working Groups are planning to do so.

Informative notes begin with the word "Note" and are set apart from the normative text with class="note", like this:

Note, this is an informative note.

Conformant Algorithms

Requirements phrased in the imperative as part of algorithms (such as "strip any leading space characters" or "return false and abort these steps") are to be interpreted with the meaning of the key word ("must", "should", "may", etc) used in introducing the algorithm.

Conformance requirements phrased as algorithms or specific steps can be implemented in any manner, so long as the end result is equivalent. In particular, the algorithms defined in this specification are intended to be easy to understand and are not intended to be performant. Implementers are encouraged to optimize.

Conformance Classes

A conformant user agent must implement all the requirements listed in this specification that are applicable to user agents.


References

Normative References

[DOM]
Anne van Kesteren. DOM Standard. Living Standard. URL: https://dom.spec.whatwg.org/
[HR-TIME-2]
Ilya Grigorik; James Simonsen; Jatinder Mann. High Resolution Time Level 2. URL: https://w3c.github.io/hr-time/
[HTML]
Anne van Kesteren; et al. HTML Standard. Living Standard. URL: https://html.spec.whatwg.org/multipage/
[INFRA]
Anne van Kesteren; Domenic Denicola. Infra Standard. Living Standard. URL: https://infra.spec.whatwg.org/
[PAGE-VISIBILITY]
Jatinder Mann; Arvind Jain. Page Visibility (Second Edition). 29 October 2013. REC. URL: https://www.w3.org/TR/page-visibility/
[PERMISSIONS]
Mounir Lamouri; Marcos Caceres. The Permissions API. URL: https://w3c.github.io/permissions/
[POWERFUL-FEATURES]
Mike West. Secure Contexts. URL: https://w3c.github.io/webappsec-secure-contexts/
[RFC2119]
S. Bradner. Key words for use in RFCs to Indicate Requirement Levels. March 1997. Best Current Practice. URL: https://tools.ietf.org/html/rfc2119
[WebIDL]
Cameron McCormack; Boris Zbarsky; Tobie Langel. Web IDL. URL: https://heycam.github.io/webidl/

Informative References

[API-DESIGN-PRINCIPLES]
Domenic Denicola. API Design Principles. 29 December 2015. URL: https://w3ctag.github.io/design-principles/
[EXTENNNNSIBLE]
The Extensible Web Manifesto. 10 June 2013. URL: https://extensiblewebmanifesto.org/
[GEOLOCATION-API]
Andrei Popescu. Geolocation API Specification 2nd Edition. URL: http://dev.w3.org/geo/api/spec-source.html
[ORIENTATION-EVENT]
Rich Tibbett; et al. DeviceOrientation Event Specification. URL: https://w3c.github.io/deviceorientation/spec-source-orientation.html
[QUDT]
Ralph Hodgson; et al. QUDT - Quantities, Units, Dimensions and Data Types Ontologies. 18 March 2014. URL: http://www.qudt.org/
[RFC6454]
A. Barth. The Web Origin Concept. December 2011. Proposed Standard. URL: https://tools.ietf.org/html/rfc6454
[SI]
SI Brochure: The International System of Units (SI), 8th edition. 2014. URL: http://www.bipm.org/en/publications/si-brochure/

IDL Index

[SecureContext]
interface Sensor : EventTarget {
  readonly attribute boolean activated;
  readonly attribute DOMHighResTimeStamp? timestamp;
  void start();
  void stop();
  attribute EventHandler onchange;
  attribute EventHandler onactivate;
  attribute EventHandler onerror;
};

dictionary SensorOptions {
  double? frequency;
};

[SecureContext, Constructor(DOMString type, SensorErrorEventInit errorEventInitDict)]
interface SensorErrorEvent : Event {
  readonly attribute Error error;
};

dictionary SensorErrorEventInit : EventInit {
  required Error error;
};