WoT technical

From Accessible Platform Architectures Working Group

Accessibility and Web of Things (WoT)

The following items were suggested by Joshue O Connor (W3C/WAI) for further technical discussion within the Accessible Platform Architectures Working Group (APA) and the Research Questions Task Force (RQTF), concerning the Web of Things (WoT) space and its potential need to support accessibility use cases and requirements for people with disabilities.

Status

This document is a [DRAFT] and forms part of initial discovery into potential technical issues relating to the Web of Things. These items will be used as the basis for further discussion in the RQTF/APA. The work is ongoing and aims to identify technical issues that relate to accessibility requirements for the Web of Things. This document does not represent a formal working group position. Previous work in this space can be seen in Accessibility and Web of Things.

Accessibility schema for WoT

In conversation with Michael McCool (Principal Engineer, Intel) about the need for better accessibility semantics, he suggested creating an iot:accessibility type scheme and submitting it to the Web of Things Working Group (WoT WG) for review.
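
As a purely illustrative sketch of what such a scheme might look like (no iot:accessibility vocabulary exists yet; the context IRI and term names below are invented for discussion), accessibility terms could be added to a Thing Description through its JSON-LD context and used to annotate interaction affordances:

  {
    "@context": [
      "https://www.w3.org/2019/wot/td/v1",
      { "iot": "https://example.org/iot-accessibility#" }
    ],
    "title": "Lobby Door",
    "iot:accessMode": ["auditory", "visual"],
    "actions": {
      "open": {
        "title": "Open door",
        "iot:spokenPrompt": "The lobby door is now opening",
        "forms": [ { "href": "https://door.example.com/actions/open" } ]
      }
    }
  }

Whether such annotations belong in the Thing Description itself, in a linked external resource, or in a separate vocabulary is exactly the kind of question that review by the WoT WG could help answer.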

Accessibility Linked Data and JSON-LD

Linked Data is a way to create a network of standards-based, machine-interpretable data across different documents and Web sites. It allows an application to start at one piece of Linked Data and follow embedded links to other pieces of Linked Data that are hosted on different sites across the Web.

JSON-LD is a lightweight syntax to serialise Linked Data in JSON. Its design allows existing JSON to be interpreted as Linked Data with minimal changes. JSON-LD introduces:

  • a universal identifier mechanism for JSON objects via the use of IRIs,
  • a way to disambiguate keys shared among different JSON documents by mapping them to IRIs via a context,
  • a mechanism in which a value in a JSON object may refer to a JSON object on a different site on the Web,
  • the ability to annotate strings with their language,
  • a way to associate datatypes with values such as dates and times,
  • and a facility to express one or more directed graphs, such as a social network, in a single document.

JSON-LD is a format used in the WoT community. Could JSON-LD be useful to point to, or consume aspects of, existing specs that support accessibility semantics? Can existing accessibility user agents support this kind of data? Would a middleware accessibility API abstraction be needed, or can JSON-LD provide useful semantic hooks for AT out of the box, providing extra semantics in combination with Thing Descriptions?
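
For discussion purposes, the fragment below shows how these JSON-LD features could carry accessibility-relevant metadata alongside a thing (it borrows schema.org terms and uses placeholder example.com IRIs; it is not an agreed vocabulary): the @context maps plain keys to IRIs, @id gives the thing a global identifier, the name is language-tagged, dateModified carries a datatype, and manual points to a resource on another site.

  {
    "@context": {
      "schema": "http://schema.org/",
      "xsd": "http://www.w3.org/2001/XMLSchema#",
      "name": "schema:name",
      "accessibilityFeature": "schema:accessibilityFeature",
      "manual": { "@id": "schema:subjectOf", "@type": "@id" },
      "dateModified": { "@id": "schema:dateModified", "@type": "xsd:dateTime" }
    },
    "@id": "https://devices.example.com/things/thermostat-7",
    "name": { "@value": "Hallway thermostat", "@language": "en" },
    "accessibilityFeature": "largePrint",
    "manual": "https://vendor.example.com/manuals/thermostat-7",
    "dateModified": "2020-06-01T12:00:00Z"
  }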

Accessibility Object Model (AOM) and Thing Descriptions

The Accessibility Object Model project aims to improve certain aspects of the user and developer experiences concerning the interaction between web pages and assistive technology.

In particular, AOM is concerned with improving the developer experience around:

  • building Web Components which are as accessible as a built-in element;
  • expressing and modifying the semantics of any Element using DOM APIs;
  • expressing semantic relationships between Elements;
  • expressing semantics for visual user interfaces which are not composed of Elements, such as canvas-based user interfaces;
  • understanding and testing the process by which HTML and ARIA contribute to the computation of the accessibility tree.

By reducing the friction experienced by developers in creating accessible web pages, and filling in gaps in what semantics may be expressed via DOM APIs, the APIs proposed in the Accessibility Object Model aim to improve the user experience of users interacting with web pages via assistive technology.

A Thing Description describes the metadata and interfaces of Things, where a Thing is an abstraction of a physical or virtual entity that provides interactions to and participates in the Web of Things. Thing Descriptions provide a set of interactions based on a small vocabulary that makes it possible both to integrate diverse devices and to allow diverse applications to interoperate. Thing Descriptions, by default, are encoded in a JSON format that also allows JSON-LD processing. The latter provides a powerful foundation to represent knowledge about Things in a machine-understandable way. A Thing Description instance can be hosted by the Thing itself or hosted externally when a Thing has resource restrictions (e.g., limited memory space) or when a Web of Things-compatible legacy device is retrofitted with a Thing Description.
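
For reference, a minimal Thing Description along the lines of the introductory examples in the W3C WoT Thing Description specification (trimmed here, with a placeholder id and example.com URLs) looks like the following; any accessibility-related semantics would need to be layered onto this kind of structure:

  {
    "@context": "https://www.w3.org/2019/wot/td/v1",
    "id": "urn:dev:ops:32473-WoTLamp-1234",
    "title": "MyLampThing",
    "securityDefinitions": {
      "basic_sc": { "scheme": "basic", "in": "header" }
    },
    "security": ["basic_sc"],
    "properties": {
      "status": {
        "type": "string",
        "forms": [ { "href": "https://mylamp.example.com/status" } ]
      }
    },
    "actions": {
      "toggle": {
        "forms": [ { "href": "https://mylamp.example.com/toggle" } ]
      }
    }
  }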

What is the role of the AOM in the Web of Things? How would it relate to WoT Thing Descriptions? For example, are there use cases for combining AOM-style rich semantic descriptions with a Thing Description? We need to identify the gaps. The AOM may, for instance, help with expressing semantic relationships between elements, or with describing dynamic thing output.

WoT needs to support interoperability between web-enabled devices and sensors by allowing semantically rich data to be transported between nodes, or updated to reflect their purpose, their state and other useful information.

WoT Convergence

While the WoT space is evolving quickly, it seems there is a need for greater consensus and convergence across different aspects of the WoT standards. For example, there is not currently tight agreement on what a thing is. This will be problematic when it comes to the next item in this list, 'Dynamic Things and Accessibility'. To understand the purpose of a thing and how that purpose may impact accessibility, we need a common understanding of what a thing 'is', especially when things interact dynamically.

Dynamic Things and Accessibility

Dynamic combined thing output: things need semantics for purpose. So what happens when two things that do A/B work together and create C/D? Rich descriptions of this combined output are needed if it has an impact on the user, or on what a user needs to know about a thing's current 'state'. This combined output needs to be understood in order to support multi-modal abstractions for accessibility, or role/state/purpose/output semantics that can be ported from thing to thing.
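
As a very rough, hypothetical sketch of the kind of description that might be needed (none of the acc: terms below exist; the structure simply reuses the existing Thing Description 'links' and 'events' constructs), a composed 'entrance assistant' built from a door sensor and a lighting thing might describe its combined output like this:

  {
    "@context": [
      "https://www.w3.org/2019/wot/td/v1",
      { "acc": "https://example.org/iot-accessibility#" }
    ],
    "title": "Entrance assistant (door sensor + lighting)",
    "links": [
      { "href": "https://door.example.com/td" },
      { "href": "https://lights.example.com/td" }
    ],
    "events": {
      "personApproaching": {
        "description": "Raised when the door sensor and the lighting act together",
        "acc:purpose": "Announce that the entrance is being lit for an approaching person",
        "acc:outputModes": ["audio", "visual"],
        "data": { "type": "string" },
        "forms": [ { "href": "https://assistant.example.com/events/personApproaching" } ]
      }
    }
  }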

Engagement and Support for multimodal requirements

How can we help to develop, within the WoT group and related interest groups, a clearer understanding of how WoT can support the needs of people with disabilities who have multimodal requirements, or facilitate their interactions with WoT-enabled environments and technologies?

Support for Profiling and Preferences

How can a user with accessibility requirements get support from things in a sensor-enabled environment via profiling of these requirements? Can a sensor-enabled network or machine-to-machine interaction be supported by profiling, or modify machine-to-machine output as needed to meet some purpose? This kind of pattern could take the form of 'opt-in' systems for wayfinding with user tokens, or profile details being passed within a WoT network. This information could also contain personalisation or customised preferences for different user requirements.
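
One way such a profile could travel within a WoT network is as a small, machine-readable preferences document attached to an opt-in user token. The sketch below is entirely hypothetical (the pref: vocabulary and its terms are invented here for illustration, and any real design would need to address consent, privacy and security):

  {
    "@context": { "pref": "https://example.org/a11y-preferences#" },
    "@id": "urn:uuid:9f2c6c53-1a1f-4d6a-b7a3-2f5f1d9d0c42",
    "pref:preferredOutput": ["audio", "haptic"],
    "pref:avoidOutput": ["visualOnly"],
    "pref:textSize": "large",
    "pref:wayfinding": {
      "pref:stepFreeRoute": true,
      "pref:announceLandmarks": true
    }
  }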

Current real-world WoT implementations

Look at existing industrial partner implementations of WoT architectures and assess what the accessibility challenges are in real-world applications.

Useful current work for technical specification creators

The Framework for Accessible Specification of Technologies (FAST) advises creators of technical specifications how to ensure their technology meets the needs of users with disabilities. It primarily addresses web content technologies, but also relates to any technology that affects web content sent to users, including client-side APIs, transmission protocols, and interchange formats. Specifications that implement these guidelines make it possible for content authors and user agents to render the content in an accessible manner to people with a wide range of abilities.