The IndieUI Working Group is no longer chartered to operate. The scope of its work is expected to be taken up by several other Working Groups:

  • Web Platform may develop device-abstracted events similar in scope to IndieUI: Events;
  • CSS may express some of the context properties from IndieUI: User Context as Media Queries;
  • ARIA may build a module of additional context properties from IndieUI: User Context.

Resources from the IndieUI Working Group remain available to support long-term institutional memory, but this information is of historical value only.

Events/Authoring Guide

The following guidance was previously part of the requirements, but it seems better suited to authors, so it is collected in this authoring guide.

Notification event

  • The device should post a notification event to the UI object, either directly or via an event bubbled from a descendant object (see the sketch after this list).
  • The notification should not provide any device-specific information about the device that sent it.
  • The device must ensure that a keyboard is capable of sending the notification.
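A minimal sketch of posting such a notification, using the standard DOM CustomEvent API as a stand-in. IndieUI itself never shipped, so the event name "notify" and the detail payload here are assumptions rather than a specified interface; the point is only the bubbling behavior and the absence of device-specific data.

  // A minimal sketch, assuming a browser DOM environment (TypeScript).
  // The event name "notify" and the detail shape are hypothetical,
  // not an IndieUI API.
  function postNotification(target: EventTarget, message: string): void {
    target.dispatchEvent(new CustomEvent("notify", {
      // bubbles: true lets a descendant object deliver the notification
      // up to the containing UI object.
      bubbles: true,
      // The detail carries only the logical message; nothing about
      // whichever device (keyboard, touch, mouse) produced it.
      detail: { message },
    }));
  }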

Multi-Input

  • An interaction through any input type (screen, keyboard, mouse) must not prevent using a different input type for subsequent interactions.
  • Authors should not have to write multiple code paths for the same type of interaction event (e.g., pan, zoom, rotate); see the sketch after this list.
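As a sketch of the single-code-path goal, the snippet below normalizes several physical inputs into one hypothetical logical event, "zoomrequest", so the zoom logic is written once. The event name, the detail shape, and the element id "map" are assumptions for illustration.

  // Every physical input funnels into the same hypothetical logical event.
  function requestZoom(target: EventTarget, scaleDelta: number): void {
    target.dispatchEvent(new CustomEvent("zoomrequest", {
      bubbles: true,
      detail: { scaleDelta },
    }));
  }

  const map = document.getElementById("map") as HTMLElement;

  // Physical inputs each translate to the same logical request...
  map.addEventListener("wheel", (e) =>
    requestZoom(map, e.deltaY < 0 ? 1.1 : 0.9));
  map.addEventListener("keydown", (e) => {
    if (e.key === "+") requestZoom(map, 1.1);
    if (e.key === "-") requestZoom(map, 0.9);
  });

  // ...and the author handles zooming in one place, with no device branching.
  let scale = 1;
  map.addEventListener("zoomrequest", (e) => {
    scale *= (e as CustomEvent<{ scaleDelta: number }>).detail.scaleDelta;
    map.style.transform = `scale(${scale})`;
  });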

Optional point-of-regard

  • An interaction can have an explicit or implicit point of regard. For example, zooming with a touch screen explicitly zooms around a specific point, while zooming with a keyboard may use an implicit point such as the center of the object (see the sketch below).
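A sketch of resolving an optional point of regard, assuming a hypothetical request shape: use the explicit point when the input supplies one (e.g., a pinch midpoint), otherwise fall back to an implicit point such as the element's center.

  // Hypothetical request shape; x/y are present only when the input
  // supplies an explicit point of regard.
  interface ZoomRequest {
    scaleDelta: number;
    x?: number;
    y?: number;
  }

  function pointOfRegard(el: HTMLElement, req: ZoomRequest): { x: number; y: number } {
    if (req.x !== undefined && req.y !== undefined) {
      return { x: req.x, y: req.y };          // explicit, e.g. pinch midpoint
    }
    const rect = el.getBoundingClientRect();  // implicit fallback: the center
    return { x: rect.left + rect.width / 2, y: rect.top + rect.height / 2 };
  }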

Units of interaction that correspond precisely with units of display on a screen

  • It should be possible to implement a map such that all touch-screen interactions keep the same points positioned under the user's fingers. That is, a user could touch two locations on a map, then perform a complex series of pinch, pan, and rotate gestures, and at the end still have those two locations under their fingers (see the sketch after this list).
  • The units of interaction must be at least as granular as pixel units; e.g., it should be possible to drag the object exactly one pixel on the screen with a mouse.
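The two-finger requirement pins down a unique translate-rotate-scale (similarity) transform: given the two touch points at the start of the gesture (p1, p2) and their current positions (q1, q2), exactly one such transform maps p1 to q1 and p2 to q2. Below is a sketch of deriving it with the standard DOMMatrix API; the function and type names are illustrative.

  interface Pt { x: number; y: number; }

  // Returns the unique translate+rotate+scale transform mapping p1->q1 and
  // p2->q2. Applying it to the map keeps both points under the fingers.
  function similarityFromTouches(p1: Pt, p2: Pt, q1: Pt, q2: Pt): DOMMatrix {
    const pv = { x: p2.x - p1.x, y: p2.y - p1.y }; // starting finger vector
    const qv = { x: q2.x - q1.x, y: q2.y - q1.y }; // current finger vector
    const scale = Math.hypot(qv.x, qv.y) / Math.hypot(pv.x, pv.y);
    const radians = Math.atan2(qv.y, qv.x) - Math.atan2(pv.y, pv.x);
    // Rotate and scale about p1, then translate p1 onto q1.
    return new DOMMatrix()
      .translate(q1.x, q1.y)
      .rotate((radians * 180) / Math.PI)  // DOMMatrix.rotate takes degrees
      .scale(scale)
      .translate(-p1.x, -p1.y);
  }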

Multiple concurrent and inter-related interactions

  • Panning, zooming, and rotation may all occur with the same type of gesture on a touch screen (e.g., two fingers moving on the screen). A single hardware-level event (e.g., one finger moving) may result in multiple logical interactions (e.g., rotate and zoom), which should ideally be applied as a single operation, as in the sketch below.
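A sketch of applying inter-related interactions as one operation: logical pan, zoom, and rotate updates arriving from the same gesture are accumulated and committed as a single composed transform per animation frame, so the user never sees a half-applied intermediate state. All names here are illustrative, and the element id "map" is assumed.

  const mapEl = document.getElementById("map") as HTMLElement;
  let current = new DOMMatrix();                   // cumulative map transform
  const pending = { dx: 0, dy: 0, scale: 1, degrees: 0 };
  let scheduled = false;

  // Handlers for logical pan/zoom/rotate events would all call queue().
  function queue(u: Partial<typeof pending>): void {
    pending.dx += u.dx ?? 0;
    pending.dy += u.dy ?? 0;
    pending.scale *= u.scale ?? 1;
    pending.degrees += u.degrees ?? 0;
    if (!scheduled) {
      scheduled = true;
      requestAnimationFrame(commit);               // one commit per frame
    }
  }

  function commit(): void {
    current = current
      .translate(pending.dx, pending.dy)
      .rotate(pending.degrees)
      .scale(pending.scale);
    mapEl.style.transform = current.toString();    // single DOM write
    Object.assign(pending, { dx: 0, dy: 0, scale: 1, degrees: 0 });
    scheduled = false;
  }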