XAUR draft

From Accessible Platform Architectures Working Group

XAUR - User Needs and Requirements for Augmented and Virtual Reality [DRAFT]

[NOTE: This page is an early draft. XR Accessibility User Requirements is now published as a Working Group Note.]

Document Scope

XAUR aims to outline the user needs of people with disabilities and users of Assistive Technologies in emerging technologies such as Immersive, Augmented and Mixed Reality environments (XR).

Status

This document is a [DRAFT], developed as part of initial discovery into potential accessibility related user needs and requirements for XR. These will be used as the basis for further development in the RQTF/APA, where this work is ongoing. This document does not represent a formal working group position, nor does it represent a set of technical requirements that a developer or designer must strictly follow. It aims to outline what is needed or required by the user of these technologies and experiences.

Related Documents

Other documents that relate to this and represent current work in the RQTF/APA are:

  • XR Semantics Module - this document outlines proposed accessibility requirements that may be used in a modular way in Immersive, Augmented or Mixed Reality (XR). A modular approach may help us to define clear accessibility requirements that support XR accessibility user needs, as they relate to the immersive environment, objects, movement, and interaction accessibility. Such a modular approach may help the development of clear semantics, designed to describe specific parts of the immersive ecosystem. In immersive environments it is imperative that the user can understand what objects are and what their purpose is, as well as other qualities and properties including interaction affordance, size, form, shape, and other inherent properties or attributes.
  • WebXR Standards and Accessibility Architecture Issues - this document is informative and aims to outline some of the challenges in understanding the complex technical architecture and processes behind how XR (Virtual, Augmented and Mixed reality) environments are currently rendered. To make these environments accessible and provide a quality user experience it is important to also understand the nuances and complexity of accessible user interface design and development for the 2D web. Any attempt to make XR accessible also needs to be based on meeting the practical user needs and requirements of people with disabilities (outlined in this document).

Definitions of Augmented Reality

Augmented Reality definitions vary but converge on the notion of computer-mediated interactions involving overlays on the real world. These overlays may be informational or interactive, depending on the application.

Definitions of Virtual Reality

Virtual Reality definitions vary but converge on the notion of immersive computer-mediated experiences. They involve interaction with objects, people and environments using a range of controls. These experiences are often multi-sensory and may be used for educational, therapeutic or entertainment purposes.

What does XR mean?

As with the WebXR API spec, this document uses the acronym XR throughout to refer to the spectrum of hardware, applications, and techniques used for Virtual Reality, Augmented Reality, and other related technologies. Examples include, but are not limited to:

  • Immersive or augmented environments used for education, gaming, multimedia, 360° content and other applications.
  • Head mounted displays, whether they are opaque, transparent, or utilise video passthrough.
  • Mobile devices with positional tracking.
  • Fixed displays with head tracking capabilities.

The important commonality between them is that they all offer some degree of spatial tracking with which to simulate a view of virtual content, as well as navigation and interaction with the objects within these environments.

Terms like "XR Device", "XR Application", etc. are generally understood to apply to any of the above. Portions of this document that only apply to a subset of these devices will be indicated as appropriate.
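
As a minimal illustration of the WebXR API referred to above, the following sketch feature-detects XR support and requests an immersive session. It is a sketch only: it assumes a WebXR-capable browser (and WebXR type definitions for TypeScript), and the error handling is illustrative rather than prescriptive.

    // Minimal sketch: feature-detect WebXR and request an immersive VR session.
    // Assumes a WebXR-capable browser; error handling is illustrative only.
    async function startXR(): Promise<void> {
      if (!navigator.xr) {
        console.log("WebXR not available; fall back to a 2D experience.");
        return;
      }
      const supported = await navigator.xr.isSessionSupported("immersive-vr");
      if (!supported) {
        console.log("Immersive VR not supported on this device.");
        return;
      }
      // Optional features are requested up front; an accessible application
      // should degrade gracefully when they are unavailable.
      const session = await navigator.xr.requestSession("immersive-vr", {
        optionalFeatures: ["local-floor"],
      });
      session.addEventListener("end", () => {
        console.log("XR session ended; restore the 2D interface.");
      });
    }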

Purpose of XR

XR has an extensive range of purposes, from education, gaming and multimedia to immersive communication and many others. It is currently evolving at a very fast rate and is not yet mainstream. This will change: as computing power increases and both hardware and the quality of the user experience improve, XR will be more commonly used for the performance of everyday practical tasks, for therapeutic uses, for education and for entertainment.

XR and Multimodality

Modality relates to modes of sense perception such as sight, hearing, touch and so on. Accessibility can be thought of as supporting multi-modal requirements and the transformation of content or aspects of a user interface from one mode to another that will support various user needs.

Considering various modality requirements in the foundation of XR means these platforms will be better able to support accessibility related user needs. There will be many modality aspects for the developer and/or content author to consider:

NOTE: XR authors and content designers will also need access to tools that support the multimodal requirements listed below.

The following Inputs and Outputs can be considered modalities that should be supported in XR environments.

Various Input Modalities

The following are examples of some of the diverse input methods used by people with disabilities. NOTE: In many real world applications these input methods may be combined. A sketch of how such modalities might be abstracted behind common actions follows this list.

  • Speech - this is where a user's voice is the main input. Using a range of voice commands, a user should be able to navigate an XR environment and interact with the objects in that environment using their voice alone.
  • Keyboard - this is where the keyboard alone is the user's main input. A user should be able to navigate an XR environment and interact with the objects in that environment using the keyboard alone.
  • Switch - this is where a single-button switch alone is the user's main input. A user should be able to navigate an XR environment and interact with the objects in that environment using a switch alone. This switch may be used in conjunction with an Assistive Technology scanning application within the XR environment that allows the user to select directions for navigation, and macros for communication and interaction.
  • Gesture - this is where gesture-based controllers are the main input. A user should be able to navigate an XR environment, interact with the objects in that environment, and make selections using gestures alone.
  • Eye Tracking - this is where an eye tracking application is the main input. A user should be able to navigate an XR environment and interact with the objects in that environment using eye tracking alone.
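
The sketch below illustrates one way an application might abstract these modalities behind a common set of actions, so that speech, keyboard, switch, gesture and eye tracking inputs can all drive the same behaviour. All names and types here are hypothetical; no existing XR standard defines this structure.

    // Hypothetical sketch: route diverse input modalities to common actions.
    // The names (Action, InputBinding, dispatch) are illustrative only.
    type Action = "move-forward" | "select" | "open-menu";
    type Modality = "speech" | "keyboard" | "switch" | "gesture" | "eye-tracking";

    interface InputBinding {
      modality: Modality;
      trigger: string; // e.g. a voice command, key code, or gesture id
      action: Action;
    }

    const bindings: InputBinding[] = [
      { modality: "speech", trigger: "go forward", action: "move-forward" },
      { modality: "keyboard", trigger: "ArrowUp", action: "move-forward" },
      { modality: "switch", trigger: "press", action: "select" },
      { modality: "eye-tracking", trigger: "dwell", action: "select" },
    ];

    // Each device reports (modality, trigger); the environment only ever
    // sees Actions, so any modality can drive any behaviour.
    function dispatch(modality: Modality, trigger: string): Action | undefined {
      return bindings.find(b => b.modality === modality && b.trigger === trigger)?.action;
    }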

Various Output Modalities

The following is a list of outputs that can be available to a user to help them understand, interact with and 'sense' feedback from an XR application. Some of these are in common use on the Web, while others are exploratory (such as olfactory and gustatory).

  • Tactile - this is using the sense of touch, commonly referred to as haptics.
  • Visual - this is using the sense of sight, such as 2D and 3D graphics.
  • Auditory - this is using the sense of sound, such as rich spatial audio or surround sound (see the spatial audio sketch after this list).
  • Olfactory - this is the sense of smell.
  • Gustatory - this is the sense of taste.
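
On the web, spatial audio for the auditory modality is commonly implemented with the Web Audio API's PannerNode. The following minimal sketch positions a sound source in 3D space relative to the listener; the audio buffer and the coordinates are placeholders, not values from this document.

    // Minimal Web Audio sketch: position a sound source in 3D space.
    // The audio buffer and coordinates are placeholders.
    const ctx = new AudioContext();
    const panner = new PannerNode(ctx, {
      panningModel: "HRTF",     // head-related transfer function for realism
      distanceModel: "inverse", // volume falls off with distance
      positionX: 2,             // to the listener's right
      positionY: 0,
      positionZ: -1,            // slightly in front of the listener
    });

    const source = ctx.createBufferSource();
    // source.buffer would be assigned decoded audio data here.
    source.connect(panner).connect(ctx.destination);
    source.start();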

Understanding XR and accessibility challenges

Understanding XR, Mixed Reality and related technologies presents various challenges that are very technical. They include issues with hardware, software, interaction design, design principles, semantics and more. These 'basic' technical complexities are substantial. In addition, many designers and authors may neither know nor have access to people with disabilities from whom they can build a solid set of user needs and requirements. In short, they may simply not understand what user needs they are trying to meet when making XR accessible.

Some of the issues in XR, for example in gaming, for people with disabilities include:

  • Over-emphasis on motion controls. There are many motion controllers that emphasise using your body to control the experience. Some games with XR components may lock out traditional control methods when a VR headset is being used; the user should always be able to use a range of input mechanisms.
  • VR headsets needing the user to be in a particular physical position to play. The user should not have to be in a particular physical position to play a game or perform some action, or there should be the ability to remap these 'physical positions' to other controls (such as by using WalkinVRDriver).
  • Games and hardware being locked to certain manufacturers. Consoles should allow full button remapping on standard game controllers to different types of AT, such as switches. These remapping preferences should be portable across a range of hardware devices and software.
  • Gamification of VR forcing game dynamics on the user. Some users may wish to just explore an immersive environment without the 'game', or any particular challenge.
  • Audio design lacking spatial accuracy. Sound design needs particular attention and can be critical to a good user experience for people with disabilities. Indeed, the auditory experience of a game or other immersive environment may 'be' the experience.

These issues come from an original article on AbleGamers by AJ Ryan.

There is also a range of other disabilities that will need to be considered in making XR accessible. It is beyond the scope of this document to describe them all in detail. General categories or types of disabilities are:

  • Auditory disabilities
  • Cognitive disabilities
  • Neurological disabilities
  • Physical disabilities
  • Speech disabilities
  • Visual disabilities

A person may have one of these disabilities or a combination of several. Each of these 'types' will be presented as a user need that should be met, and understanding these needs is crucial to rising to the range of interesting challenges XR designers and authors will face when supporting accessibility and multimodality in XR environments.

These challenges may include:

  • Understanding specific diverse user needs and how they relate to XR.
  • Successfully identifying modality needs that are not obvious - but still need to be supported.
  • Having suitable authoring tools for designers that support accessibility requirements in XR.
  • Using languages, platforms and engines that support accessibility semantics.
  • Successfully abstracting XR applications by providing accessible alternatives for content and interaction.
  • The provision of specific commands within the VR environment (e.g., to go directly to a specified location or to follow another user) which assist with non-visual navigation.
  • The use of virtual assistive technologies (e.g., a white cane via a haptic device) to provide non-visual feedback. The research identified that if the same audio cues associated with a real-world infrared white cane were used in an immersive environment, users were able to effectively centre themselves in the middle of pathways and walk successfully through virtual doorways based on the same audio feedback as used in the equivalent real-world device (Maidenbaum & Amedi, 2015).

XR controllers and accessibility

As mentioned, there is a range of input devices that may be used. Supporting these controllers requires an understanding of what they are and how they work. There is a variety of alternative gaming controls that may be very useful in XR environments and applications, for example the Xbox Adaptive Controller.

While XR is the experience, the controller is king: it plays a critical part in overcoming complexity, mediating usability challenges, and helping the user work with sensory substitution devices.

Controllers such as the Xbox Adaptive Controller and other switch-type inputs allow the user to remap keyboard inputs to control virtual environments. These powerful customisations may allow the user to "do that thing that is difficult" for them with ease. In conjunction with such a controller, users with limited mobility can also simulate actions in the XR environment that they would not be able to perform physically. WalkinVRDriver is a good example of this, where motion range, position and orientation can be set to the user's ability.

Controller Challenges

Customisation of control inputs

Giving the user the ability to modify their input preferences or use a variety of input devices. The remapping of keys used to control movement or interaction in virtual environments is not currently required by WCAG; it is nevertheless noted in the literature as desirable. A sketch of what a portable remapping preference could look like follows.
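
To make this concrete, the following is a hypothetical sketch of a portable remapping preference. The data shape is an assumption for illustration, not an existing standard or API.

    // Hypothetical sketch of a portable input remapping preference.
    // The shape of this data is an assumption, not an existing standard.
    interface RemapProfile {
      user: string;
      // Maps a physical input (key, button, switch) to a virtual action.
      mappings: Record<string, string>;
    }

    const profile: RemapProfile = {
      user: "example-user",
      mappings: {
        Space: "select",
        KeyW: "move-forward",
        Switch1: "open-menu", // a single-button switch reused for menus
      },
    };

    // Serialising the profile lets preferences travel with the user
    // across hardware devices and software, as suggested above.
    const portable = JSON.stringify(profile);
    console.log(portable);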

Using multiple diverse inputs simultaneously

A user with a disability may have several input devices. A user may switch 'mode' of interaction, or the tools used, and should be able to do so without degradation into a poor user experience where they lose focus on a task and cannot return to it, or make unforced errors, accidental inputs and so on.

Consistent tracking with multiple inputs

There may be tracking issues when switching input devices. A tracking issue is where the user's focus may be lost, or modified in unpredictable or unwanted ways; this can push the user to make unwanted inputs or choices.
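
One way to mitigate such tracking issues is to keep a single source of truth for focus that every input device reads and writes, so that switching modality cannot silently move or drop focus. The sketch below is hypothetical; the class and method names are illustrative only.

    // Hypothetical sketch: one focus manager shared by all input devices,
    // so a modality switch does not lose or reset the user's focus.
    class FocusManager {
      private focusedId: string | null = null;

      focus(objectId: string): void {
        this.focusedId = objectId;
      }

      // Any device (keyboard, switch, eye tracker) reads the same state.
      current(): string | null {
        return this.focusedId;
      }
    }

    const focusManager = new FocusManager();
    focusManager.focus("door-handle");   // set via eye tracking, for example
    console.log(focusManager.current()); // unchanged after switching to keyboard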

Outputs sent to multiple devices will need to be synchronised.

Usability and affordances in XR

An XR application should have a high level of usability for someone with a disability who is using AT. Some challenges in translating interaction models may be:

  • How can a user understand the affordance models used in XR interactions? Or can this be mediated by their own interaction preferences and controllers?
  • What interactions are allowed or not allowed?
  • How can an accessibility-abstracted XR experience, focussed on supporting a different modality, successfully interact with another?

User Needs/Scenarios definition

This document outlines various accessibility related 'user needs' for XR. These 'user needs' should drive accessibility requirements for XR and its related architecture. These come from people with disabilities who use Assistive Technology (AT) and wish to see the features described available within XR enabled applications.

User needs are framed in a range of 'Scenarios' (which can be thought of as similar to 'User Stories').

Use case definition

'Use cases' in the context of this document refer to aspects that technically support accessibility user needs. References to 'use cases' will mean technical application-level interactions that may directly or indirectly facilitate any given user need.

XR User Needs

XR user needs are dependent on context and domain. These domains are as varied as education, gaming, health, multimedia, communications and travel.

The following are examples of user needs/scenarios that need to be considered in XR in some of these domains. They are not exhaustive, but are presented to help orientate the reader to some baseline user needs and requirements.

Immersive Environments and Experiences

Some of the many challenges with accessible gaming and immersive environments include the use of extremely complex input devices; control schemes that require a high degree of precision, timing and simultaneous action; the ability to distinguish subtle differences in busy visual and audio information; and having to juggle multiple complex goals and objectives. There are also very useful accessibility guidelines currently available that are specific to gaming.

The following outline some accessibility user needs in immersive environments:

  • A blind user wants to navigate, identify locations and objects within a gaming or immersive environment.
  • A mobility impaired user wants to interact with items in an immersive environment in a way that doesn't require particular bodily movement to perform any given action. For example, they should have a way of standing up in the environment without having to do so physically.
  • A user with a cognitive disability needs to use symbol sets for personalising games or training to help people on the autism spectrum.
  • A user with limited mobility needs buttons and other controls in XR to have a sufficiently large target size: a user should not need very fine motor control to be able to activate a hit target.
  • A user with limited mobility wants to be able to use Voice Commands when gaming, to navigate, interact and communicate with others in XR environments.
  • A user who has colour blindness can have their environment skinned to suit their particular luminosity and colour contrast requirements.
  • A screen magnification user needs to be able to check the context of their view in XR environments and track where their focus is.
  • A screen magnification user will need to be made aware of critical messaging and alerts in XR environments. They may need to route those messages to second screens.
  • The user needs to be able to reset and calibrate orientation/view in a device independent way.
  • The user needs to be able to set a time limit for any immersive session. Some users may easily lose track of time, or too much time in any XR experience may adversely affect their mental health.
  • Visual reflow of on-screen content needs to be customisable, depending on the context of presentation in XR environments.

Education

The following are some user needs for education and training in XR. These may be abstracted educational environments, created and customised based on user needs and preferences.

  • Teach Deaf people Sign Language in an engaging environment where text and motion are the primary forms of interaction. Sign language will need to be supported in XR, as this may be the primary language of a deaf person.
  • Help a Deaf user access audio information equivalents in real-time.
  • Encourage people with cognitive disabilities to engage in activities designed to learn and stimulate their cognitive function.

Health

  • Disability simulation for learning about disability within a virtual environment. Someone who wants to understand more about the needs of people with disabilities could use XR for simulation of vision impairments, low-vision, blindness, low physical movement or ability.
  • Post-injury patients in rehabilitation want to build mobility and improve co-ordination, as well as simulate tasks that they have yet to perform in the real world but will need to. For example, balance re-training for a user recovering from an operation or a stroke, or training for new prostheses.
  • User of new Assistive Technology wants to prepare to interact with a new environment. For example, with a new cane for a blind user.

Multimedia

  • Subtitling, audio description, captioning and signing of audio content for blind users or for deaf and hard of hearing users. This could be in gaming, education and so on.
  • Customisation of subtitling, audio description and captioning of audio content for low vision users who need to modify the size, colour contrast or positioning.

Communications

  • A deaf-blind user communicating via an RTC application in XR. In these interactions a deaf-blind user may need to route information to a braille or other device, or they may choose to have certain types of output routed in particular ways, say descriptions of items to a braille display and conversations using RTT, or they may have a bespoke system they need supported.

Transport and Travel

  • Navigating a real-world environment or tourist landmark. A blind user may be able to explore a building they have never been in before by taking a tour beforehand. A blind user will need to understand the direction they are moving, objects they come across, and be able to orientate themselves via various landmarks or notable places of use or interest within that environment.
  • Provide a safe place for someone whose cognitive disability means they can be overwhelmed when travelling somewhere new, even in an XR environment. This could be accessed via some quick key or shortcut.
  • Use of helper avatars. Users with cognitive disabilities may benefit from having 'helper avatars' that they can bring with them in AR as they navigate new environments. This may help a person who may be overwhelmed by their environment.
  • A blind user will need to interact with objects and services in a simulated travel environment.

A hearing impaired user may need to have such descriptions visually presented to them or signed by an avatar.

Shopping Online

  • A vision impaired user puts on gloves and a headset, complete with video and audio, so they can interact with a virtual menu system. They enable a self-voicing option and have each shopping category or item description spoken to them as they gesture to trigger both movement and interaction. When navigating a virtual store, the user may get more detail about items that are closer to them, and be able to virtually “select” an item.


References

Acknowledgements