WebXR Hand Input Module - Level 1

W3C Working Draft, 19 April 2022

This version:
https://www.w3.org/TR/2022/WD-webxr-hand-input-1-20220419/
Latest published version:
https://www.w3.org/TR/webxr-hand-input-1/
Editor's Draft:
https://immersive-web.github.io/webxr-hand-input/
History:
https://www.w3.org/standards/history/webxr-hand-input-1
Feedback:
GitHub
Inline In Spec
Editor:
(Google [Mozilla until 2020])
Participate:
File an issue (open issues)
Mailing list archive
W3C’s #immersive-web IRC

Abstract

The WebXR Hand Input module expands the WebXR Device API with the functionality to track articulated hand poses.

Status of this document

This section describes the status of this document at the time of its publication. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.

The Immersive Web Working Group maintains a list of all bug reports that the group has not yet addressed. This draft highlights some of the pending issues that are still to be discussed in the working group. No decision has been taken on the outcome of these issues including whether they are valid. Pull requests with proposed specification text for outstanding issues are strongly encouraged.

This document was published by the Immersive Web Working Group as a Working Draft using the Recommendation track. This document is intended to become a W3C Recommendation.

Publication as a Working Draft does not imply endorsement by W3C and its Members. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This document was produced by a group operating under the 1 August 2017 W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 2 November 2021 W3C Process Document.

This WebXR Hand Input Module is designed as a module to be implemented in addition to the WebXR Device API; it was originally included in the WebXR Device API, which was divided into a core specification and modules.

1. Introduction

On some XR devices it is possible to get fully articulated information about the user’s hands when they are used as input sources.

This API exposes the poses of each of the user’s hand skeleton joints. This can be used to perform gesture detection or to render a hand model in VR scenarios.

2. Initialization

If an application wants to view articulated hand pose information during a session, the session MUST be requested with an appropriate feature descriptor. The string "hand-tracking" is introduced by this module as a new valid feature descriptor for articulated hand tracking.

The "hand-tracking" feature descriptor should only be granted for an XRSession when its XR device has physical hand input sources that support hand tracking.

The user agent MAY gate support for hand based XRInputSources based upon this feature descriptor.

NOTE: This means that if an XRSession does not request the "hand-tracking" feature descriptor, the user agent may choose to not support input controllers that are hand based.
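
For example, an application might request the feature at session creation time. This is a minimal sketch; listing "hand-tracking" under optionalFeatures rather than requiredFeatures lets the session start on devices without hand tracking:

const session = await navigator.xr.requestSession("immersive-vr", {
  optionalFeatures: ["hand-tracking"]
});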

3. Physical Hand Input Sources

An XRInputSource is a physical hand input source if it tracks a physical hand. A physical hand input source supports hand tracking if it supports reporting the poses of one or more skeleton joints defined in this specification.

Physical hand input sources MUST include the input profile name of "generic-hand-select" in their profiles.

For many physical hand input sources, there can be overlap between the gestures used for the primary action and the squeeze action. For example, a pinch gesture may indicate both a "select" and "squeeze" event, depending on whether you are interacting with nearby or far away objects. Since content may assume that these are independent events, user agents MAY, instead of surfacing the squeeze action as the primary squeeze action, surface it as an additional "grasp button", using an input profile derived from the "generic-hand-select-grasp" profile.
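
For example, content can inspect an input source’s profiles array to tell the two shapes apart. This is a sketch; any further derived profile names beyond "generic-hand-select-grasp" are up to the user agent:

for (const source of session.inputSources) {
  if (source.profiles.includes("generic-hand-select-grasp")) {
    // Squeeze is surfaced as an additional "grasp button" rather than as
    // the primary squeeze action.
  } else if (source.profiles.includes("generic-hand-select")) {
    // Squeeze, if supported, is surfaced as the primary squeeze action.
  }
}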

3.1. XRInputSource

partial interface XRInputSource {
   [SameObject] readonly attribute XRHand? hand;
};

The hand attribute on a physical hand input source that supports hand tracking will be an XRHand object giving access to the underlying hand-tracking capabilities. hand will have its input source set to this.

If the XRInputSource belongs to an XRSession that has not been requested with the "hand-tracking" feature descriptor, hand MUST be null.
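
For example, an application can detect hand-tracked input sources as they appear. This sketch assumes a session requested with the "hand-tracking" feature descriptor as described in § 2:

session.addEventListener("inputsourceschange", (event) => {
  for (const source of event.added) {
    if (source.hand !== null) {
      // This is a physical hand input source that supports hand tracking.
      console.log(`Hand tracking available for the ${source.handedness} hand`);
    }
  }
});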

3.2. Skeleton Joints

A physical hand input source is made up of many skeleton joints.

A skeleton joint for a given hand can be uniquely identified by a skeleton joint name, which is an enum of type XRHandJoint.

A skeleton joint may have an associated bone that it is named after and used to orient its -Z axis. The associated bone of a skeleton joint is the bone that comes after the joint when moving towards the fingertips. The tip and wrist joints have no associated bones.

A skeleton joint has a radius which is the radius of a sphere placed at its center so that it roughly touches the skin on both sides of the hand. The "tip" skeleton joints SHOULD have an appropriate nonzero radius so that collisions with the fingertip may work. Implementations MAY offset the origin of the tip joint so that it can have a spherical shape with nonzero radius.

This list of joints defines the following skeleton joints and their order:

Skeleton joint | Skeleton joint name | Index
Wrist | wrist | 0
Thumb Metacarpal | thumb-metacarpal | 1
Thumb Proximal Phalanx | thumb-phalanx-proximal | 2
Thumb Distal Phalanx | thumb-phalanx-distal | 3
Thumb Tip | thumb-tip | 4
Index finger Metacarpal | index-finger-metacarpal | 5
Index finger Proximal Phalanx | index-finger-phalanx-proximal | 6
Index finger Intermediate Phalanx | index-finger-phalanx-intermediate | 7
Index finger Distal Phalanx | index-finger-phalanx-distal | 8
Index finger Tip | index-finger-tip | 9
Middle finger Metacarpal | middle-finger-metacarpal | 10
Middle finger Proximal Phalanx | middle-finger-phalanx-proximal | 11
Middle finger Intermediate Phalanx | middle-finger-phalanx-intermediate | 12
Middle finger Distal Phalanx | middle-finger-phalanx-distal | 13
Middle finger Tip | middle-finger-tip | 14
Ring finger Metacarpal | ring-finger-metacarpal | 15
Ring finger Proximal Phalanx | ring-finger-phalanx-proximal | 16
Ring finger Intermediate Phalanx | ring-finger-phalanx-intermediate | 17
Ring finger Distal Phalanx | ring-finger-phalanx-distal | 18
Ring finger Tip | ring-finger-tip | 19
Little finger Metacarpal | pinky-finger-metacarpal | 20
Little finger Proximal Phalanx | pinky-finger-phalanx-proximal | 21
Little finger Intermediate Phalanx | pinky-finger-phalanx-intermediate | 22
Little finger Distal Phalanx | pinky-finger-phalanx-distal | 23
Little finger Tip | pinky-finger-tip | 24

Figure: Visual aid demonstrating the joint layout.

3.3. XRHand

enum XRHandJoint {
  "wrist",

  "thumb-metacarpal",
  "thumb-phalanx-proximal",
  "thumb-phalanx-distal",
  "thumb-tip",

  "index-finger-metacarpal",
  "index-finger-phalanx-proximal",
  "index-finger-phalanx-intermediate",
  "index-finger-phalanx-distal",
  "index-finger-tip",

  "middle-finger-metacarpal",
  "middle-finger-phalanx-proximal",
  "middle-finger-phalanx-intermediate",
  "middle-finger-phalanx-distal",
  "middle-finger-tip",

  "ring-finger-metacarpal",
  "ring-finger-phalanx-proximal",
  "ring-finger-phalanx-intermediate",
  "ring-finger-phalanx-distal",
  "ring-finger-tip",

  "pinky-finger-metacarpal",
  "pinky-finger-phalanx-proximal",
  "pinky-finger-phalanx-intermediate",
  "pinky-finger-phalanx-distal",
  "pinky-finger-tip"
};

[Exposed=Window]
interface XRHand {
    iterable<XRHandJoint, XRJointSpace>;

    readonly attribute unsigned long size;
    XRJointSpace get(XRHandJoint key);
};

The XRHandJoint enum defines the various joints that each XRHand MUST contain.

Every XRHand has an associated input source, which is the physical hand input source that it tracks.

Each XRHand object has a [[joints]] internal slot, which is an ordered map of pairs with the key of type XRHandJoint and the value of type XRJointSpace.

The ordering of the [[joints]] internal slot is given by the list of joints under skeleton joints.

[[joints]] MUST NOT change over the course of a session.

The value pairs to iterate over for an XRHand object are the list of value pairs with the key being the XRHandJoint and the value being the XRJointSpace corresponding to that XRHandJoint, ordered by list of joints under skeleton joints.

If an individual device does not support a joint defined in this specification, it MUST emulate it instead.

The size attribute MUST return the number 25.

When the get(jointName) method is invoked on an XRHand this, the user agent MUST run the following steps:
  1. Let joints be the value of this's [[joints]] internal slot.

  2. Return joints[jointName]. (This implies returning undefined for unknown jointName.)
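
For example, because XRHand is a pair iterator over joint names and joint spaces, it can be iterated in skeleton-joint order, consumed directly by Map, or queried for a single joint. A minimal sketch, assuming inputSource is a hand-tracked XRInputSource:

const hand = inputSource.hand;
console.log(hand.size);              // 25

// Iterate joints in the order given by the skeleton joints table.
for (const [jointName, jointSpace] of hand) {
  // jointSpace is the XRJointSpace for jointName.
}

// The pair iterator also allows building a Map in one step.
const jointSpaces = new Map(hand);

// Or look a single joint up by name.
const indexTip = hand.get("index-finger-tip");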

3.4. XRJointSpace

[Exposed=Window]
interface XRJointSpace: XRSpace {
  readonly attribute XRHandJoint jointName;
};

The native origin of an XRJointSpace is the position and orientation of the underlying joint.

The native origin of the XRJointSpace may only be reported when native origins of all other XRJointSpaces on the same hand are being reported. When a hand is partially obscured the user agent MUST either emulate the obscured joints, or report null poses for all of the joints.

Note: This means that when fetching poses you will either get an entire hand or none of it.

By default, this precludes faithfully exposing polydactyl or oligodactyl hands; however, given the fingerprinting concerns, such support would likely need to be a separate opt-in anyway. See Issue 11 for more details.

The native origin has its -Y direction pointing perpendicular to the skin, outwards from the palm, and its -Z direction pointing along the associated bone, away from the wrist.

For tip skeleton joints, where there is no associated bone, the -Z direction is the same as that of the associated distal joint, i.e. the direction is along that of the previous bone. For the wrist skeleton joint, the -Z direction SHOULD point roughly towards the center of the palm.

Every XRJointSpace has an associated hand, which is the XRHand that created it.

jointName returns the joint name of the joint it tracks.

Every XRJointSpace has an associated joint, which is the skeleton joint corresponding to the jointName.
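
For example, the pointing direction of a fingertip can be derived by rotating the joint space’s local -Z axis by the joint pose’s orientation. This is a sketch using getJointPose(), defined in § 4.1; frame, inputSource, and referenceSpace are assumed to come from the surrounding application:

const tipSpace = inputSource.hand.get("index-finger-tip");
const pose = frame.getJointPose(tipSpace, referenceSpace);
if (pose) {
  const { x, y, z, w } = pose.transform.orientation;
  // Rotate (0, 0, -1) by the orientation quaternion to obtain the direction
  // along the distal bone, away from the wrist.
  const dirX = -2 * (x * z + w * y);
  const dirY = -2 * (y * z - w * x);
  const dirZ = -(1 - 2 * (x * x + y * y));
  // (dirX, dirY, dirZ) is the fingertip's pointing direction in referenceSpace.
}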

4. Frame Loop

4.1. XRFrame

partial interface XRFrame {
    XRJointPose? getJointPose(XRJointSpace joint, XRSpace baseSpace);
    boolean fillJointRadii(sequence<XRJointSpace> jointSpaces, Float32Array radii);

    boolean fillPoses(sequence<XRSpace> spaces, XRSpace baseSpace, Float32Array transforms);
};

The getJointPose(XRJointSpace joint, XRSpace baseSpace) method provides the pose of joint relative to baseSpace as an XRJointPose, at the XRFrame's time.

When this method is invoked, the user agent MUST run the following steps:

  1. Let frame be this.

  2. Let session be frame’s session object.

  3. If frame’s active boolean is false, throw an InvalidStateError and abort these steps.

  4. If baseSpace’s session or joint’s session are different from session, throw an InvalidStateError and abort these steps.

  5. Let pose be a new XRJointPose object in the relevant realm of session.

  6. Populate the pose of joint in baseSpace at the time represented by frame into pose, with force emulation set to false.

  7. If pose is null return null.

  8. Set pose’s radius to the radius of joint, emulating it if necessary.

  9. Return pose.
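
For example, a frame callback might read the pose of every joint of each tracked hand. This is a sketch; referenceSpace is assumed to be an XRReferenceSpace obtained earlier from the session:

function onXRFrame(time, frame) {
  frame.session.requestAnimationFrame(onXRFrame);

  for (const source of frame.session.inputSources) {
    if (!source.hand) continue;
    for (const [jointName, jointSpace] of source.hand) {
      const jointPose = frame.getJointPose(jointSpace, referenceSpace);
      if (!jointPose) break;   // poses are all-or-nothing per hand (see § 3.4)
      // Use jointPose.transform and jointPose.radius to render or hit-test
      // the joint named jointName.
    }
  }
}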

The fillJointRadii(sequence<XRJointSpace> jointSpaces, Float32Array radii) method populates radii with the radii of the jointSpaces, and returns a boolean indicating whether all of the spaces have a valid pose.

When this method is invoked on an XRFrame frame, the user agent MUST run the following steps:

  1. Let frame be this.

  2. Let session be frame’s session object.

  3. If frame’s active boolean is false, throw an InvalidStateError and abort these steps.

  4. For each joint in the jointSpaces:

    1. If joint’s session is different from session, throw an InvalidStateError and abort these steps.

  5. If the length of jointSpaces is larger than the number of elements in radii, throw a TypeError and abort these steps.

  6. Let offset be a new number with the initial value of 0.

  7. Let allValid be true.

  8. For each joint in the jointSpaces:

    1. Set the float value of radii at offset as follows:

      If the user agent can determine the poses of all the joints belonging to the joint’s hand:
        Set the float value of radii at offset to the radius of joint.

      Otherwise:
        Set the float value of radii at offset to NaN.
        Set allValid to false.

    2. Increase offset by 1.

  9. Return allValid.

NOTE: if the user agent can’t determine the pose of one of the spaces belonging to an XRHand, all the other spaces of that XRHand must also not have a pose.
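
For example, all 25 radii of a hand can be queried in a single call into a pre-allocated buffer. A sketch; hand and frame are assumed to come from the current animation frame callback as in the earlier examples:

const jointSpaces = Array.from(hand.values());
const radii = new Float32Array(jointSpaces.length);
if (frame.fillJointRadii(jointSpaces, radii)) {
  // radii[i] is the radius, in meters, of the joint at jointSpaces[i].
} else {
  // At least one radius could not be determined; those entries are NaN.
}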

The fillPoses(sequence<XRSpace> spaces, XRSpace baseSpace, Float32Array transforms) method populates transforms with the matrices of the poses of the spaces relative to the baseSpace, and returns a boolean indicating whether all of the spaces have a valid pose.

When this method is invoked on an XRFrame frame, the user agent MUST run the following steps:

  1. Let frame be this.

  2. Let session be frame’s session object.

  3. If frame’s active boolean is false, throw an InvalidStateError and abort these steps.

  4. For each space in the spaces sequence:

    1. If space’s session is different from session, throw an InvalidStateError and abort these steps.

  5. If baseSpace’s session is different from session, throw an InvalidStateError and abort these steps.

  6. If the length of spaces multiplied by 16 is larger than the number of elements in transforms, throw a TypeError and abort these steps.

  7. Let offset be a new number with the initial value of 0.

  8. Initialize pose as follows:

      If fillPoses() was called previously:
        The user agent MAY let pose be the same object as used by an earlier call.

      Otherwise:
        Let pose be a new XRPose object in the relevant realm of session.
  9. Let allValid be true.

  10. For each space in the spaces sequence:

    1. Populate the pose of space in baseSpace at the time represented by frame into pose.

    2. If pose is null, perform the following steps:

      1. Set 16 consecutive elements of the transforms array, starting at offset, to NaN.

      2. Set allValid to false.

    3. If pose is not null, copy all elements from pose’s matrix member to the transforms array, starting at offset.

    4. Increase offset by 16.

  11. Return allValid.

NOTE: if any of the spaces belonging to the same XRHand return null when populating the pose, all the other spaces of that XRHand must also return null when populating the pose.
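
For example, fillPoses() lets an application fill one pre-allocated Float32Array with all joint transforms per frame, avoiding a getJointPose() call and an XRJointPose allocation per joint. A sketch, with hand, frame, and referenceSpace assumed as in the earlier examples:

const jointSpaces = Array.from(hand.values());
const transforms = new Float32Array(16 * jointSpaces.length);
if (frame.fillPoses(jointSpaces, referenceSpace, transforms)) {
  for (let i = 0; i < jointSpaces.length; i++) {
    // Column-major 4x4 matrix for the joint at jointSpaces[i], e.g. for
    // instanced rendering of joint meshes.
    const matrix = transforms.subarray(16 * i, 16 * i + 16);
  }
}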

4.2. XRJointPose

An XRJointPose is an XRPose with additional information about the size of the skeleton joint it represents.

[Exposed=Window]
interface XRJointPose: XRPose {
    readonly attribute float radius;
};

The radius attribute returns the radius of the skeleton joint in meters.

The user agent MUST set radius to an emulated value if the XR device cannot determine this value, either in general or in the current animation frame (e.g. when the skeleton joint is partially obscured).
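
For example, the radius can drive a simple sphere test for fingertip “touch” interactions. A sketch; target is assumed to be a point expressed in the same referenceSpace:

const tipPose = frame.getJointPose(hand.get("index-finger-tip"), referenceSpace);
if (tipPose) {
  const p = tipPose.transform.position;
  const dx = p.x - target.x;
  const dy = p.y - target.y;
  const dz = p.z - target.z;
  const touching = Math.sqrt(dx * dx + dy * dy + dz * dz) <= tipPose.radius;
}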

5. Privacy & Security Considerations

The WebXR Hand Input API is a powerful feature that carries significant privacy risks.

Since this feature returns new sensor data, the User Agent MUST ask for explicit consent from the user at session creation time.

Data returned from this API MUST NOT be so specific that one can detect individual users. If the underlying hardware returns data that is too precise, the User Agent MUST anonymize this data before revealing it through the WebXR Hand Input API.

This API MUST only be supported in XRSessions created with XRSessionMode of "immersive-vr" or "immersive-ar". "inline" sessions MUST NOT support this API.

When anonymizing the hands data, the UA can follow these guidelines:

Changes

Changes from the First Public Working Draft 22 October 2020

Conformance

Document conventions

Conformance requirements are expressed with a combination of descriptive assertions and RFC 2119 terminology. The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in the normative parts of this document are to be interpreted as described in RFC 2119. However, for readability, these words do not appear in all uppercase letters in this specification.

All of the text of this specification is normative except sections explicitly marked as non-normative, examples, and notes. [RFC2119]

Examples in this specification are introduced with the words “for example” or are set apart from the normative text with class="example", like this:

This is an example of an informative example.

Informative notes begin with the word “Note” and are set apart from the normative text with class="note", like this:

Note, this is an informative note.

Conformant Algorithms

Requirements phrased in the imperative as part of algorithms (such as "strip any leading space characters" or "return false and abort these steps") are to be interpreted with the meaning of the key word ("must", "should", "may", etc) used in introducing the algorithm.

Conformance requirements phrased as algorithms or specific steps can be implemented in any manner, so long as the end result is equivalent. In particular, the algorithms defined in this specification are intended to be easy to understand and are not intended to be performant. Implementers are encouraged to optimize.

References

Normative References

[HTML]
Anne van Kesteren; et al. HTML Standard. Living Standard. URL: https://html.spec.whatwg.org/multipage/
[INFRA]
Anne van Kesteren; Domenic Denicola. Infra Standard. Living Standard. URL: https://infra.spec.whatwg.org/
[RFC2119]
S. Bradner. Key words for use in RFCs to Indicate Requirement Levels. March 1997. Best Current Practice. URL: https://datatracker.ietf.org/doc/html/rfc2119
[SERVICE-WORKERS-1]
Alex Russell; et al. Service Workers 1. 19 November 2019. CR. URL: https://www.w3.org/TR/service-workers-1/
[WEBIDL]
Edgar Chen; Timothy Gu. Web IDL Standard. Living Standard. URL: https://webidl.spec.whatwg.org/
[WEBXR]
Brandon Jones; Manish Goregaokar; Rik Cabanier. WebXR Device API. 31 March 2022. CR. URL: https://www.w3.org/TR/webxr/
[WEBXR-AR-MODULE-1]
Brandon Jones; Manish Goregaokar; Rik Cabanier. WebXR Augmented Reality Module - Level 1. 17 March 2022. WD. URL: https://www.w3.org/TR/webxr-ar-module-1/

IDL Index

partial interface XRInputSource {
   [SameObject] readonly attribute XRHand? hand;
};

enum XRHandJoint {
  "wrist",

  "thumb-metacarpal",
  "thumb-phalanx-proximal",
  "thumb-phalanx-distal",
  "thumb-tip",

  "index-finger-metacarpal",
  "index-finger-phalanx-proximal",
  "index-finger-phalanx-intermediate",
  "index-finger-phalanx-distal",
  "index-finger-tip",

  "middle-finger-metacarpal",
  "middle-finger-phalanx-proximal",
  "middle-finger-phalanx-intermediate",
  "middle-finger-phalanx-distal",
  "middle-finger-tip",

  "ring-finger-metacarpal",
  "ring-finger-phalanx-proximal",
  "ring-finger-phalanx-intermediate",
  "ring-finger-phalanx-distal",
  "ring-finger-tip",

  "pinky-finger-metacarpal",
  "pinky-finger-phalanx-proximal",
  "pinky-finger-phalanx-intermediate",
  "pinky-finger-phalanx-distal",
  "pinky-finger-tip"
};

[Exposed=Window]
interface XRHand {
    iterable<XRHandJoint, XRJointSpace>;

    readonly attribute unsigned long size;
    XRJointSpace get(XRHandJoint key);
};

[Exposed=Window]
interface XRJointSpace: XRSpace {
  readonly attribute XRHandJoint jointName;
};

partial interface XRFrame {
    XRJointPose? getJointPose(XRJointSpace joint, XRSpace baseSpace);
    boolean fillJointRadii(sequence<XRJointSpace> jointSpaces, Float32Array radii);

    boolean fillPoses(sequence<XRSpace> spaces, XRSpace baseSpace, Float32Array transforms);
};

[Exposed=Window]
interface XRJointPose: XRPose {
    readonly attribute float radius;
};

Issues Index

By default, this precludes faithfully exposing polydactyl or oligodactyl hands; however, given the fingerprinting concerns, such support would likely need to be a separate opt-in anyway. See Issue 11 for more details.