Mapping game users’ and designers’ needs to technologies

This workshop session was part of the Workshop on Web games. Report written by Indira Knight.

Summary

The “Mapping game users’ and designers’ needs to technologies” workshop was designed to get participants to think about specific users’ needs and how they could be fulfilled with web game technologies. The user groups were touch/haptics researchers, UX designers, gamers, hardware developers and serious games designers (educational and therapeutic applications), all groups who could be using web game technologies. The outputs of the workshop are a matrix and a series of speculative design ideas. The matrix shows the user groups and the ideas that could help them. The speculative design ideas looked at how these groups could be using the web in the future, and what would need to be implemented for this to happen.

Themes have been identified from the outputs of the workshop. One of the biggest themes was around APIs, with suggestions for new APIs or for extending existing ones, for example the Gamepad API. Social/collaborative gaming was another large theme, with suggestions such as creating an open source social graph, permission-based access for friends and a proximity API. Other themes include metadata, standards, latency, documentation and UI.

The speculative design ideas are varied, but a few of them would give the user more agency: for example, being able to easily disconnect from touch, and having open source natural language processing. There were also ideas around having 3D model texture and temperature as material properties, native support for 3D in the browser, and expanding access to instrumentation and tooling for performing UX studies in the browser. The documents produced from this workshop can be analysed further to find practical solutions that fulfil these user groups’ needs.

The Workshop

The workshop was run in three groups and had two parts. The first part was to create the matrix. This was done by giving participants two minutes per user group to solo-brainstorm solutions for that user. For the second part of the workshop, participants chose which user group they wanted to work on and, in groups, discussed how that user would be using the web in three years’ time. They thought about the use, but also about what would need to be in place to get there. There was time at the end of the workshop for each group to feed back its solutions to everyone.

The analysis of the workshop is in three sections. Section one is the speculation on future uses. The second section is a thematic analysis of the solo brainstorming that mapped users’ needs to technologies. The third section is a matrix of the solo brainstorming.

Speculation on future uses

This was the final part of the workshop. Participants chose which user group they wanted to work on, and those groups discussed how the user would be using the web in three years’ time. They thought about the use, but also about what would need to be in place to get there.

Touch/Haptics

Idea

This group thought about full body haptics and the need for parental/safety controls. There could be an API for this, so you could choose a body part and send a signal describing the touch. There should be a kill switch, so the person can abort the touch and disconnect from it.
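No such API exists today; the TypeScript sketch below is purely illustrative of the idea as described, and every name in it (TouchSignal, BodyHaptics, send, abort, disconnect) is invented.

```
// Hypothetical sketch only - none of these interfaces exist in any browser.
// The important part is that abort() and disconnect() are always available
// to the person on the receiving end of the touch.
interface TouchSignal {
  region: "hand" | "arm" | "back" | "leg"; // body part to address
  intensity: number;                        // 0..1
  pattern: number[];                        // on/off durations in milliseconds
}

interface BodyHaptics {
  send(signal: TouchSignal): Promise<void>; // describe and deliver a touch
  abort(): void;                            // kill switch: stop all output immediately
  disconnect(): void;                       // sever the session entirely
}

declare const haptics: BodyHaptics;         // assumed to be granted via a permission prompt

haptics.send({ region: "hand", intensity: 0.3, pattern: [100, 50, 100] });
haptics.abort(); // a parental/safety control would gate send() and always expose abort()
```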

Idea

Higher level APIs (3D model tog): 3D model texture and temperature as material properties. WebXR extensions: VR controller gloves that vibrate at positions and could heat up if the hardware supported it. A vibration vest, like an audio channel? Custom hardware over HID (USB/Bluetooth).

UX design

Idea

WebXR is like the early days of the mobile web: it was initially hard to prototype for mobile, and there needs to be a way to sketch a VR experience. There needs to be a way to edit in AR and VR, with drag and drop; tooling will need to change. There needs to be a polyfill approach to explore different ideas, which can then be turned into solid tooling.

Idea
  • Stylus/touch/voice interface - natural behaviours -> digital objects
  • Native support for 3D in the browser
  • Visual programming tools/libraries for creation, easy for non-programmers, consistent across platforms
  • Open source library of objects/pre-fabs
  • Visual “language” for creating/assigning behaviours
  • Common/central repository - github equivalent + standards
Idea

Future Designer Needs:

  • Near future: Expanding Speech APIs to include the ability to give the browser commands. This would give the designer the ability to quickly map phrases to actions in a game (a sketch using today’s Web Speech API follows this list).
  • Near future: Expanding the access to instrumentation and tooling for performing UX studies in the browser. Designers need the ability to capture and analyze massive amounts of input data while a player tests a game. This would require the browser to expand the permission model for accessing and storing data that could fingerprint a user. This would need to be behind a launch flag or something that a user would have to take direct action to turn on. Tooling should include eye tracking via the webcam, cursor tracking, touch tracking, gamepad input tracking, etc.
  • Somewhere between now and the near future: Give designers the ability to prototype and test new interfaces via a declarative interaction model similar to A-Frame. The UX designer would be able to declare which objects are interactive and which aren’t: declare if an object can be grabbed, touched, picked up, static, etc. (This led to a discussion about the SMIL 3.0 spec and how we could borrow from it.)
  • Near future: Rethink text inputs and standardize mobile inputs in web games.
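As a rough illustration of the first point, the speech recognition half of the Web Speech API can already be used to map phrases to game actions, though support is limited and prefixed (Chrome exposes it as webkitSpeechRecognition). The command table below is invented for the example.

```
// Minimal TypeScript sketch using the (prefixed, unevenly supported) Web Speech API.
// The command-to-action table is made up; a real game would dispatch into its own
// input system. Requires microphone permission.
const commands: Record<string, () => void> = {
  "jump": () => console.log("player jumps"),
  "open map": () => console.log("map opened"),
};

const Recognition =
  (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
const recognizer = new Recognition();
recognizer.continuous = true;

recognizer.onresult = (event: any) => {
  const result = event.results[event.results.length - 1];
  const phrase = result[0].transcript.trim().toLowerCase();
  commands[phrase]?.(); // run the mapped action if the phrase is a known command
};

recognizer.start();
```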

Gamers

Idea

The technology for what gamers want is already available. Gamers want better content, so there need to be incentives to build better games.

The strength of the web is that it is a good place for a social experience. At the moment social graphs are proprietary (e.g. Facebook). For web games there would need to be an open source social graph for gamers to be able to play with friends.

Idea
  • WebAssembly - bridging the gap to streaming “cloud gaming”
  • OpenXR - breaking down the walls between traditional gaming categories (“2D”, “side scroller”, “3D”, “VR”); it could be the same game with different forms of deployment

Hardware developer

Idea

Hardware developers need low level access exposed in the browser, and device level interfaces. It would be good to have a standard so different devices can be linked, so that people using a HoloLens can interact with people using a phone, along with a way of mapping values between devices.
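Some low level access already exists behind permission prompts in Chromium-based browsers, notably Web Bluetooth and WebUSB; the sketch below shows the shape of those APIs. The service name and vendor ID are placeholders, and both calls must be triggered by a user gesture.

```
// Sketch of device access that ships today in Chromium browsers (behind
// permission prompts). The filters below are placeholders, and the objects are
// cast loosely because these APIs are not yet part of the standard DOM typings.
async function connectCustomHardware() {
  // Web Bluetooth: ask the user to pick a device exposing a given GATT service.
  const bluetooth = (navigator as any).bluetooth;
  const btDevice = await bluetooth.requestDevice({
    filters: [{ services: ["battery_service"] }], // placeholder service
  });
  await btDevice.gatt.connect(); // read/write GATT characteristics from here

  // WebUSB: ask the user to pick a USB device by vendor ID.
  const usb = (navigator as any).usb;
  const usbDevice = await usb.requestDevice({
    filters: [{ vendorId: 0x1234 }], // placeholder vendor ID
  });
  await usbDevice.open(); // then claim an interface and transfer data
}
```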

Serious games (educational, therapies and treatments)

Idea

In the future people could get virtual hugs, or have adaptive early learning tools. The developer community and the public should be made more aware of this area and its potential for development. There need to be machine learning APIs and open speech APIs (natural language processing, NLP). Currently NLP is controlled by big companies such as Google or Microsoft; there is space for an open version that can be easily used in web projects.

Themes

While the ideas from the brainstorming were being put into the matrix, themes started to appear. There is a short write-up of some of these themes below, followed by a full list of themes. Some of the ideas overlap between themes, so they have been added where they are most specific.

Haptics

One of the main themes for haptics was having a standard format for the different touch experiences from the hardware, so that there can be standards for remapping touch. Another theme was around metadata: for example, 3D file formats could include data about how haptics should feel for an object, and there could be a data format or schema for consensual interactions.

There were a lot of suggestions for haptics APIs: extending existing APIs such as the Gamepad API or WebXR, creating a new haptics API, or creating a generic sensor API that can be used for rapid prototyping with pressure, vibration, etc.
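For context, the two pieces these suggestions would build on are the Vibration API and Chrome’s non-standard vibrationActuator extension to the Gamepad API; a minimal sketch of both, with invented effect values, is below.

```
// What already exists: navigator.vibrate() (Vibration API) and the non-standard
// vibrationActuator extension to the Gamepad API shipped in Chromium. Both need
// feature detection, and the effect values below are arbitrary examples.
if ("vibrate" in navigator) {
  navigator.vibrate([200, 100, 200]); // on/off pattern in ms, on devices with a motor
}

window.addEventListener("gamepadconnected", () => {
  const pad = navigator.getGamepads()[0];
  const actuator = (pad as any)?.vibrationActuator; // not yet in standard typings
  actuator?.playEffect("dual-rumble", {
    duration: 300,        // ms
    strongMagnitude: 0.8, // low-frequency rumble motor
    weakMagnitude: 0.4,   // high-frequency rumble motor
  });
});
```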

Social and collaborative games

There were thoughts on how gamers could play with friends. A couple of the suggestions were an open source social graph, and access to phone contacts to make social gaming possible without the need to interact with a social network.

Machine Learning

There were a lot of suggestions for APIs. A couple of larger, connected sub-themes within the API theme are a machine learning API and a speech API, which could itself include machine learning.

Networks

Peer-to-peer networking and WebRTC were mentioned a few times. Frictionless login/single sign-on across browsers and devices was also mentioned.
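As a rough sketch of the WebRTC pieces mentioned, an unordered, no-retransmit data channel is the usual fit for low-latency game state; signalling (exchanging offers, answers and ICE candidates) is assumed to happen over some existing channel and is left out here.

```
// Minimal WebRTC data-channel sketch for game state. Signalling is omitted:
// the offer/answer and ICE candidates still have to be exchanged somehow.
const pc = new RTCPeerConnection();

const channel = pc.createDataChannel("game-state", {
  ordered: false,    // don't stall fresh updates behind lost packets
  maxRetransmits: 0, // drop stale state instead of retransmitting it
});

channel.onopen = () => {
  channel.send(JSON.stringify({ player: 1, x: 10, y: 4 })); // example payload
};

channel.onmessage = (event) => {
  const state = JSON.parse(event.data);
  // apply the remote player's state to the local simulation here
  console.log("remote state", state);
};
```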

Testing

There were some ideas around testing, including creating unique URLs for each game level/location (for quick debugging). It was suggested that web game platforms would have to provide more QA to ensure quality.
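The unique-URL idea needs nothing new from the platform; a sketch using the existing history API and the URL hash is below, where loadLevel() stands in for whatever the game uses to jump to a level.

```
// Sketch of "a unique URL per level" using the existing history API and URL hash.
// loadLevel() is a stand-in for the game's own level loader.
declare function loadLevel(name: string): void;

function goToLevel(name: string) {
  history.pushState({ level: name }, "", `#level=${encodeURIComponent(name)}`);
  loadLevel(name);
}

// When a tester opens (or pastes) a URL, jump straight to the encoded level.
const match = location.hash.match(/level=([^&]+)/);
if (match) loadLevel(decodeURIComponent(match[1]));
```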

Theme list

Haptic API
A haptic API that can send a signal and describe a touch:
  • Intent oriented actions API
  • How to translate sensor data into a specific type of haptic event? Need for haptic definitions, e.g. light touch, rapid movement, squeeze/pinch
  • API for haptics support with standardized interface to selectively enable/disable remap body zones
  • Extension to WebXR device API
  • Need for generic sensor API for rapid prototyping, pressure, vibration, IR
  • Flexible abstraction of app intents to available physical devices
    • Portable to different systems
    • Future proof
    • Implemented/defined in OpenXR
  • Vibration API (navigator.vibrate), Haptics.js
  • API to drive touch/feeling hardware
  • API to get sensor readings of hand position
  • Google’s vibration extension for the Gamepad API
Other APIs
  • Touchscreen API
  • Projection mapping API support. Let browsers expose depth mapped surface
  • Cache APIs to speed up shader and JIT runs
  • PWA
  • Really just need a better supported Gamepad API
  • No room for AAA experiences
  • Create events API emulating hardware events to JS
  • Generic Web <-> IOT device interaction API/protocol that can work with any device long term
  • Standardised phone sensor API
  • Scalability APIs make it simple to scale from 10s of players to 10,000s of players
  • Web Proximity API
  • Virtual controls portability API
Bluetooth
  • Bluetooth/wireless protocols access (expand WebUSB) (haptics)
  • Make USB and bluetooth available in browsers. Maybe other devices via I2C, I2A etc
  • Bluetooth API + WebXR
Social/collaborative
  • An open source social graph (currently available through social networks)
  • API to access phone contacts to make social gaming possible without the need to interact with a social network
  • Interactive design of VR apps in VR, ‘Tiltbrush for experiences’, multi-user over the web
  • Design tools that support moving, resizing, locking, text and action linking for collaborative design
  • Quickly implementing shared states between instances of the game (basically multiplayer) with IPFS
    • P2P data channel
  • Use WebRTC data channels
  • Permission based access for friends, list to invite others to games
  • P2P networking w/discovery
  • Open social graph
  • QUIC support for webRTC
  • Proximity API?
Speed
  • Fast content delivery
  • Layered caching system
  • Proper threading for AAA experience
Machine Learning
  • Open source NLP
    • Better voice activation for VR
  • Use machine learning to read design sketches and convert to DIVs
  • Expand the ability to utilize on device machine learning. ML without a library
  • Shared machine learning models
  • Speech recognition and speech synthesis API
    • Speech recognition API
    • Open speech APIs
    • NLP on the web; would need device level access to NLP APIs
  • Web APIs to encode and upload media to a server for ML processing and then download + decode + play
    • ML APIs
Prototyping
  • VR/AR - Tooling needs to change, start with polyfill approach to explore ways to create prototypes, then turn that into solid tooling
  • A declarative behavior standard
    • High level declarative 3D language
  • Sharing environments
  • Scratch like editor
  • Open database of 3D positioning data for AR
  • Modular event system with custom controls
  • File format for UX that could be integrated into content creation tools with 2D & 3D support
  • Draw pictures and it turns into HTML
  • A-Frame gesture/voice to 3D object
  • Primitive toolkit
Metadata
  • Description of touch
  • Metadata to declare hardware requirements for a web game
  • Meta data to declare what inputs a game needs to be playable?
  • Data format or schema for consensual interactions
  • Haptic cloud
Sensors
  • Smell sensors
  • Low cost facial expression sensors
  • Communication with personal health devices (fitbit etc)
  • Communication with IoT devices
Interactions
  • Voice UI
  • Neural UI
Controllers
  • Physical objects that serve as a bridge between reality and VR
  • Warmth from controllers
Game Engines
  • Export to WebXR from engines/tools
Testing
  • Instrumentation for real time play testing feedbacks i.e. eye tracking, cursor tracking, input tracking
  • Unique URLs for each game level/location (for quick debugging)
  • Game platform would have to provide more QA to ensure quality (similar to console development)
  • Tools and distribution system built into the browser or VR
Standardization
  • Standardized virtual avatars
  • Standardize the way web games are packaged
  • Standardized texture format
  • Open standard for social multiplayer games? I.e. data format that can describe a turn in a game?
  • 3D avatar file format (VRM?)
  • Voice interaction language for a given app (voice xml?)
  • 3D file formats could include data about how haptics should feel for an object?
  • Standards on inspector/debug exposed to lib/engine/tool
  • Standardised volume controls
  • Real emphasis on standardizing mobile device APIs
  • Security concerns vs service PWA is the solution, external sensor GPIO like standard API (pi/?)
Ease of use
  • Add icon to home screen pop up on first visit
    • Better web support for “downloading an app to your homescreen”
  • Improve wireless technology to support greater freedom of movement
  • Localized
    • Browser level cross device progress saving
  • Persistence and storage improvements
  • Easy cloud save for cross device game portability
  • Frictionless login/single sign on across browsers/devices
    • Login/sign in as fast as platform
Development
  • Tools to simulate old device tech
Offline
  • Offline, spec to declare all files necessary i.e. a manifest
  • Use service workers for offline gaming (see the sketch after this list)
  • PWA offline caching of game assets
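As a rough illustration of the offline items above, a minimal service worker can pre-cache game assets and serve them cache-first. The file names below are invented, and the worker is assumed to be registered from the page with navigator.serviceWorker.register("/sw.js").

```
// Minimal service-worker sketch (sw.js) for offline play. The asset list is
// invented; a real game would generate it at build time.
const GAME_ASSETS = ["/index.html", "/game.js", "/textures/atlas.png"];

self.addEventListener("install", (event: any) => {
  // Pre-cache the core game files so the game can start with no network.
  event.waitUntil(
    caches.open("game-v1").then((cache) => cache.addAll(GAME_ASSETS))
  );
});

self.addEventListener("fetch", (event: any) => {
  // Serve cached assets first, falling back to the network for anything else.
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});
```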
Audio
  • Music composition tools targeting web audio
  • Audio feedback
  • Accurate sound to reinforce haptics - iPhone clack on keyboard
Models/3D
  • Enable automatic reconstruction of actual locations using existing street view databases
  • Work with WebML CG
  • Need 3D library of assets to supply creators with to be able to create environments quickly and for custom situations
  • Procedurally generated environments from simple input but detailed enough for believability
  • Figma like 3D canvas rather than artboard
    • Figma plugin for VR to design HUDs
  • Real time mesh reconstruction is an issue on low end devices
  • Could you leverage existing web/graphics APIs to deliver customisable “prefabs” (house, cafe, etc)
  • Higher level asset/3D model/3D CSS for client-side customization
  • Asset database for rapid prototyping
Frameworks
  • Rewriting input class of most popular frameworks
Latency
  • Low latency device communication
  • High bandwidth/low latency, high frequency communication protocol for full body haptic feedback in WebXR: I.E. 100+ channels of haptic control
  • Low latency network API
  • Web API for low-latency transmission of touch data “I tickle you at location x”
Protocols
  • Portability: Standard format for hardware to describe its inputs/outputs for haptics, therefore allowing standard remapping
  • Haptic web platform. Expose “mapping” functionality for arbitrary events/hardware, like game keyboard mapping.
User
  • Open source documentation of the user requirements
  • Opt in on the web platform (output haptics)
  • Ownership of virtual objects
Machine Learning
  • On device speech recognition
  • Local ML models
  • ML for character models to give users agency
  • ML via WebGPU or Web ML
Designing
  • VR UX sandbox sketch like
  • Maquette - 3D UI using blobs
Monetization
  • Make them pay for the production costs of making good games
  • Monetization API
  • Web payments API - micro transactions
Documentation
  • A clear timeline for specs
    • When will specs be stable
    • When will they be implemented
    • What is the commitment of the vendors so far
    • Easier to make plan/investment
  • Provide a mapping of the user interactions and the existing APIs; find out what’s missing, what’s possible and what is not
Accessibility
  • Debugging accessibility inspecting contrast/font size
  • A11Y
Data
  • Easy integration of real-life data sources
  • Data API
Privacy/Security
  • Account management authorization user IO
  • Auth and ID services?
  • Granular storage controls with user involvement

Matrix

Existing APIs/technologies
Touch/Haptics research
  • Integration with haptic feedback in game pad
  • Google’s vibration extension for gamepad API
  • Further extension to gamepad API
  • Generic sensor API
  • Flexible abstraction of app intents to available physical devices
  • Portable to different systems
  • Future proof
  • Implemented/defined in OpenXR
  • 3D file formats could include data about how haptics should feel for an object?
  • Gamepad API extended to include haptics
  • Vibration API (navigator.vibrate), Haptics.js
  • Phone vibrations on touching interactive DOM objects
  • WebRTC
  • Existing Web Audio as a means to generate signals over time
  • WebUSB, HID standard
  • Bluetooth, Web Bluetooth
UX designers
  • Use A-Frame’s entity component system - it’s HTML-ish too!
  • Instrumentation for real time play testing feedbacks i.e. eye tracking, cursor tracking, input tracking
  • Better voice activation for VR
  • Cooperation in web
  • High level declarative 3D language 
  • Figma plugin for VR to design HUDs
  • Speech recognition API
  • Better use case descriptions
  • Better classify games - mini game, cloud gaming, highly IA games
  • Middleware for ? in 3D space
  • Self-consciousness, proprioception
  • Use gamepad APIs
  • Audio feedback
  • VR UX sandbox, sketch-like
  • Amazon Sumerian - how to create fast mock-ups in VR
  • Unique URLs for each game level/location (for quick debugging)
  • Unity and A-Frame
  • Babylon.js
  • Domain of tools
  • Live edit REPL
Gamers
  • PWA
  • Quickly implementing shared states between instances of the game basically multiplayer with IPFS
  • Add icon to home screen pop up on first visit
  • WASM threads
  • Use WebRTC data channels
  • Use service workers
  • Social platform abstraction
  • P2P data channel
  • Low latency network API
  • Better home screen support
  • Easy to access immersive gaming on mobile web
  • PWA
  • Make them pay for the production costs of making good games
  • Send gamers to appropriate category and quality of games through discoverability
  • Web payments API - micro transactions 
  • Granular storage controls with user involvement
  • Federated Auth/ID
  • UDP support?
  • WebRTC peer to peer coms
Hardware developers
  • Expand WebUSB
  • Expand and standardize mobile sensor access
  • Allow greater access to low level devices with low latency connections
  • Allow low level integration with microcontrollers (WebUSB, Web Serial)
  • WebRTC
  • Presentation API
  • Bluetooth API + WebXR
  • Just give them bluetooth access
  • Generic Web <-> IOT device interaction API/protocol that can work with any device long term
Serious games designers
  • Voice interaction language for a given app (voice xml?)
  • Speech interaction as default way
  • Conversational design
  • Associate musical tones with numbers and letters to help young children learn current tech
  • Speech recognition and speech synthesis API
  • User avatar for browsers
  • Medical students learning how to perform surgeries through AR environments
  • Universal design
  • On device speech recognition
  • Local ML models
  • VR UX sandbox, sketch-like
  • WebRTC
  • Could you leverage existing web/graphics APIs to deliver customisable “prefabs” (house, cafe, etc)
Desired APIs/technologies
Touch/Haptics research
  • Bluetooth/wireless protocols access (expand WebUSB)
  • High bandwidth/low latency, high frequency communication protocol for full body haptic feedback in WebXR: I.E. 100+ channels of haptic control
  • Portability: Standard format for hardware to describe its inputs/outputs for haptics, therefore allowing standard remapping 
  • Real Time feedback visualization with graphical representation
  • Neural UI
  • API for haptics support with standardized interface to selectively enable/disable remap body zones
  • Low level game input API
  • Haptic web platform. Expose “mapping” functionality for arbitrary events/hardware, like game keyboard mapping.
  • Haptics controller vibration based on audio signal
  • API to drive touch/feeling hardware
  • API to get sensor readings of hand position
  • Web API for low-latency transmission of touch data “I tickle you at location x”
  • Provide a mapping of the user interactions and the existing APIs; find out what’s missing, what’s possible and what is not
  • Extension to WebXR device API
  • Introduce higher level API for animated haptics events
  • Universal haptics API
  • Universal 6DoF controller Web API
  • Bluetooth API + driver registry
  • Vibration API, fake force
  • Need for generic sensor API for rapid prototyping: pressure, vibration, IR
  • Minimum interaction requirement
  • Haptics portability device to device
  • Opt in on the web platform (output haptics)
  • Abstracted definitions of a ‘touch’
  • Identifying devices - commercial devices (Xbox controller), custom devices, API for input devices
UX designers
  • Modular event system with custom controls
  • AI
  • File format for UX that could be integrated into content creation tools with 2D & 3D support
  • Design tools that supports moving, resizing, locking, text, action linking for collaborative design
  • Improve keyboard interaction on mobile games
  • Improve augmented reality out of screen (hologram)
  • Interactive design of VR app in VR ‘Tiltbrush for experiences’ multi user over the web
  • Use machine learning to read design sketches and convert to DIVs
  • Figma like 3D canvas rather than artboard
  • Standards on inspector/debug exposed to lib/engine/tool
  • Node graphs
  • Maquette - 3D UI using blobs
  • Primitive toolkit
  • Visual programming, e.g. Twine, Scratch
  • Front-end GameUI prototyping tool
Gamers
  • Low level game input API
  • Voice UI
  • Hardware store for easy discovery
  • Easy interaction
  • Permission based access for  friends, list to invite others to games
  • Test responsive websites to view across many devices
  • PWA offline caching of game assets
  • Standardized texture format
  • Web GPU
  • Meta data to declare what inputs a game needs to be playable?
  • Offline: spec to declare all files necessary, i.e. a manifest
  • Open standard for social multiplayer games? I.e. a data format that can describe a turn in a game?
  • Music composition tools targeting web audio
  • Cache API’s for speed up shader and JIT runs
  • Browser level cross device progress saving
  • P2p networking w/discovery
  • Proper threading for AAA experience
  • Better web support for “downloading an app to your homescreen”
  • Standardised volume controls
  • PWA
  • Really just need a better supported gamepad API
  • no room for AAA exp
  • PWA + easy p2p (webRTC) service
  • Devs should provide better, clearer timelines for the existing tech, e.g. Gamepad API, WebGPU, Touch Events API
  • PWA game engine à la Unity3D - sound + UX should be dealt with by the game engine
  • Gamers don’t play on phones?
  • Need better locomotion rather than teleport
  • Web Proximity API
  • Invisible links
  • Virtual controls portability API
  • Monetization
  • Monetization API
  • Ownership of virtual objects
  • Account mgmt authorization user IO
  • security/privacy
  • Fingerprinting can infer a lot of potentially private user data
  • Easy cloud save for cross device game portability 
  • Open social graph
  • Auth and ID services?
  • Frictionless login/single sign-on across browsers/devices
  • Login/sign-in as fast as platform
  • QUIC support for WebRTC
  • PWA offline
Hardware developers
  • A11Y
  • Web GPIO API
  • Metadata to declare hardware requirements for a web game
  • Smell sensors in games
  • Low cost facial expression sensor, probably AI driven making VR video conferencing actually productive, read reactions
  • Create events API emulating hardware events to JS
  • Make USB and Bluetooth available in browsers. Maybe other devices via I2C, I2A etc
  • PWA + multiuser + standard hardware
  • Proximity API?
  • Standardised phone sensor API
  • Enhanced permissions when installed native
Serious games designers
Game engines
Touch/Haptics research
  • Represent new interaction in game
  • Intent oriented actions API
  • Rewriting input class of most popular frameworks
  • How to translate sensor data into a specific type of haptic event? Need for haptic definitions, e.g. light touch, rapid movement, squeeze/pinch
  • Visual programming through game engine (Unity?) editor
  • Accurate sound to reinforce haptics - iPhone clack on keyboard
  • Warmth from controller
UX designers
  • Export to WebXR from engines/tools
  • Declarative behaviour standard
  • Geocached
  • Sharing of environments
  • Engine or library fits needs
  • Niche tools for each use case
  • Sketching interactive patterns
  • Sketching interaction using sketching to create interactive prototype
  • Draw pictures and it turns into HTML
Gamers
  • Tools to simulate old device tech
  • Demo games
Hardware developers
  • A clear timeline for specs
  • When will specs be stable
  • When will they be implemented
  • What is the commitment of the vendors so far
  • Easier to make plan/investment
  • Input === output
  • Tracking precision will be different between devices - hololens vs mobile phone
Serious games designers
  • Speech -> WebML
  • Remote server
  • Local ML
  • Work with WebML CG
  • Open source documentation of the user requirements
  • Hardware limitations drives game design
  • Tethered vs untethered
  • Inside-out tracking vs outside-in
  • Optimization is key - Frame drops & hitches could increase anxiety
  • ML Models for interactions need to be created to allow users to have agency in approaching a situation
  • Prefabs
  • Cross platform game design for variable hardware access
Game platforms
Touch/Haptics research
  • Haptic cloud
UX designers
  • Scratch like editor
  • Open database of 3D positioning data for AR
  • Do it all in tools
Gamers
  • Fast and easy logging in and authenticating
  • Blockchain
  • Cloud gaming
  • Fast content delivery
  • Layered caching system
  • Standardization of virtual avatars
  • Facial expression detection and remote avatar expressions on twitch like multi user experiences
  • API to access phone contacts to make social gaming possible without the need to interact with a social network
  • Standardize the way web games are packaged
  • Game platform would have to provide more QA to ensure quality (similar to console development)
  • Shared context content
  • Art asset portability, game to game
Hardware developers
  • Real emphasis on standardizing mobile device APIs (actually too slow)
  • Security concerns vs service PWA is the solution, external sensor GPIO like standard API (pi/?)
  • Real time mesh reconstruction is an issue on low end devices
  • Sensor differences between web enabled devices needs custom calibration for tracking
  • Web compatibility vs specific capability vs privacy
  • Low level hardware port access for design
Serious games designers
  • Big enough library of recorded words for text-to-speech
  • localized
Additional hardware
Touch/Haptics research
  • Air puff
  • User can “read” content on the page using haptic feedback, letter, images, translate into vibrations
  • Low latency device communication
  • Gloves
  • Sensors within a baby’s crib to monitor vitals
UX designers
  • Physical objects that serve as a bridge between reality and VR
  • VR prototypes
Gamers
Hardware developers
  • Hardware API standardisation - H/W manufacturers conform to USB-HID standard
  • Optical user interfaces
Serious games designers
  • Improve wireless technology to support greater freedom of movement
  • Chip implants/biotech
Other
Touch/Haptics research
  • Controllers
  • Images
UX designers
  • 3D model viewer with places you can “tap” and it will change to another model (like 2D tools)
  • Use mouse inside VR for fast and precise interactions
  • Easy showing and previewing of prototypes
  • A-Frame gesture/voice to 3D object
Gamers
  • Make the platform more attractive to AAA devs
  • Persistence and storage improvements
Hardware developers
  • iOS broke the trust
  • Push for experiences only the web can do, to help entice browser makers
Serious games designers
  • Need 3D library of assets to supply creators with to be able to create environments quickly and for custom situations
  • Need to be 1:1 representative visually to allow the user to practice coping skills
  • Procedurally generated environments from simple input but detailed enough for believability
  • Regulation thing to ensure people don’t get triggered
  • Debugging accessibility inspecting contrast/font size
  • K-12 Education distribution pipeline
  • Asset management localisation
  • AI/ML for natural language response, are there existing web focused AI/ML models?