Minutes - Day 2/2

Minutes taken by Karen Myers, Dominique Hazaël-Massieux and Chris Needham. Many thanks to them!

See also minutes of day 1


  1. Audio & games
    1. Interoperability/Reusability of high level WebAudio components
    2. Porting Wwise to the web
    3. Audio Device Client, Better and faster audio I/O on the web
  2. Cloud Gaming
    1. Advancing the Gamepad specification
    2. Reducing the latency of inputs on the Web
    3. WebCodecs & WebTransport
  3. Gender inclusiveness in localization
  4. Breakout Sessions
    1. More 3D Controls
    2. Discoverability and Monetization
    3. Web Games in Hosted Apps
    4. Threads
    5. Networking and Games
    6. Audio
    7. Web Assembly
    8. Accessibility Clinic
  5. Next steps for standardization
  6. Wrap Up

Audio & games

Interoperability/Reusability of high level WebAudio components

Michel: I'm from the University of Nice
... I'm the national coordinator of a big research project, dealing with musicians, composers
... [Example of guitar music]
... Everything here is done on the web
... This was a guitar amplifier simulation, multi-track player
... Everything is real-time
... Some impressive WebAudio developments, synthesizers, vocoders, written in JavaScript or WASM
... Building a Web Audio graph to process the sound
... So far, no standard for reusing existing code, except for copy/paste or including a library
... [Picture of a native DAW]
... Four competing standards for plugins, complicated for developers to port
... In early 2018, we started working on an open plugin standard for Web Audio
... We wanted to bring native developers and low-level DSP developers to the web
... Some low level languages for DSP
... We noticed that the existing standards are not web aware
... Use URIs, dealing with async events, packaging, web components, JS modules
... Guidelines for publishing and reusing components
... Example of virtual pedal board for guitar effects
... Some are coded in Faust, cross compiled to WASM
... C++ synthesizers cross compiled to WASM
... [Shows PedalBoard demo]
... We designed a Web Audio plugin standard, using a toolchain for C and C++
... Extending to other languages such as MAX MSP, used by audio developers
... For web aware distributing and packaging
... Connect to your web audio graph
... These are loaded on demand
... It's easy to load them; here's Amped Studio, which uses the same web component
... Another commercial DAW managed to load our plugins in a few minutes
... There's a Faust compiler, compiles to WASM, write and run DSP code, run directly in the browser
... We embedded a GUI builder for creating components
... Apply styling, then publish on a remote server
... It generated a web component, with a page for loading and testing it
... We'll automate this to publish on a public web server, provide documentation etc
... Once published, you can try it in other hosts
... The host scans remote repositories, can drag and drop plugins
... [Demo of a synthesizer connected to a guitar amplifier]
... You can plug in MIDI controllers and your electric guitar
... We have a free SDK on GitHub
... Unit tests to help developers test their plugins
... We have developers, a first online IDE for making low level DSP algorithms in WASM
... You can develop using classic web development tools
... Can we make a Rocksmith game, like guitar hero but with a real guitar?
... Yes, we have efficient effects, pitch detection
... Problem with latency, recording live audio is OK on Mac OS, 12-23ms
... Higher in Firefox, but should be the same
... It's not playable on Windows, 81 to 100 ms
... The Web Audio WG said there's a solution that should be shipped
... You need the right OS level driver
... The Web Audio API has an outputLatency property, but it's not implemented yet
... Some work is needed to make this more consistent
... I work with online music schools, customers using Windows
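The graph-building and on-demand plugin loading Michel describes can be sketched as below. `createInstance` and the plugin module URL are hypothetical stand-ins for whatever the plugin standard specifies; `connectChain` is just a small illustrative helper.

```javascript
// Pure helper: connect a list of AudioNode-like objects in series.
function connectChain(nodes) {
  for (let i = 0; i < nodes.length - 1; i++) {
    nodes[i].connect(nodes[i + 1]);
  }
  return nodes;
}

// Browser-only part (not executed here): wire guitar input through a
// dynamically loaded effect plugin to the speakers.
async function setupGuitarChain(audioCtx, pluginUrl) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const input = audioCtx.createMediaStreamSource(stream);
  // Hypothetical plugin factory, loaded on demand as an ES module.
  const { createInstance } = await import(pluginUrl);
  const plugin = await createInstance(audioCtx);
  connectChain([input, plugin.audioNode, audioCtx.destination]);
}
```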

Porting Wwise to the web

Philippe: I'm a developer at Audiokinetic in Montreal
... I'll talk about our attempt to make Wwise run on the web
... The process of porting a big C++ game will be similar
... Wwise is an authoring tool for Mac and Windows, used by sound designers to create game audio
... Mixing, advanced effects, LPF, everything
... It creates sound banks that need to be packaged with the game
... There's an SDK that's integrated into the game, or via Unreal or Unity
... This can read the sound banks and execute the audio in the game
... If we want the web to become a major gaming platform, middleware is important
... We have indie and AAA games launching with Wwise
... Wwise is popular because it's cross platform, we support mobile OSs to high powered consoles and PCs, VR
... The web can become another major gaming platform, we want to help get there
... Getting started: just make a sound, in C++ code, and port it to the Web
... Started with a Linux port, Emscripten provides a POSIX environment
... WASM is an architecture that's neither x86 nor ARM; some of our code paths rely on those
... SSE emulation for SIMD, which we couldn't get working
... Wwise has an abstraction layer for atomics. We used some compiler intrinsics that produced linker errors
... We replaced them with Emscripten-specific functions
... pthreads, used workers instead, by far our biggest blocker
... First attempt resulted in lots of deadlocks
... An event manager thread that reads game events from a queue, updates game state, produces audio samples into a ring buffer
... The thread can be woken up in many ways. Usually there's a separate audio thread that works differently on every platform
... Takes the audio data and copies it to the hardware
... It's a heartbeat for the system, keeps it alive
... It was problematic for Emscripten, as that was based on the old ScriptProcessorNode in Web Audio, so on the same thread
... The Wwise SDK comes with some blocking API calls for loading/unloading bank files
... This was an exercise to see how much friction C++ developers will encounter when porting to the web
... It was important to do better than that
... We ended up disabling threading, do all on the main thread, to avoid deadlocks
... Makes it impossible for the game to run, can't do advanced effects
... Second attempt, replace SDL dependency with something based on Web Audio API and AudioWorklet
... Worklet is separate from the main application. How to share the ring buffer with the processor program?
... We looked at how emscripten does it for pthreads. We saw their worker imports the main app module
... The app module detects which environment it's in
... From our processor program, we imported the main game module, didn't work
... The main JS file used APIs not supported in the audio worklet global scope, but took a long time to figure out due to lack of error information
... If we compile as a self-contained ES6 module, the AudioWorklet can import it
... But the worker global scope doesn't support importing ES6 modules
... We ended up initialising the processor program using postMessage, with pointers to the audio buffers
... This works as long as the WASM heap doesn't change during execution
... We were able to reenable threading. It works
... AudioWorklet is an improvement over ScriptProcessorNode
... Threading is difficult to get right. The pthread is so different, it's hard to get working
... Debugging was difficult, hard to get source maps
... Could have done more if we'd had access to a proper debugger
... We wish there were a lower level API for controlling the number of samples per frame, channel interleaving, and the final channel downmix
... Not possible with AudioWorklet, but could be optimised
... SharedArrayBuffer - we need this, if you want C++ games to run on the web, need to fix this. If there's no threading, there's no AAA games on the web
... Excited to hear Mozilla has a solution coming, what about Safari, mobile browsers?
... We want to support the web platform, not just Chrome. Don't want to go back to "works on browser X"
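The handoff Philippe describes, where a worker produces samples into a ring buffer backed by a SharedArrayBuffer and an AudioWorkletProcessor consumes them, can be sketched as a single-producer/single-consumer queue. Sizes and index layout here are illustrative, not Wwise's actual implementation.

```javascript
const FRAMES = 1024;  // ring capacity in samples (illustrative)
const HEADER = 2;     // [0] = write index, [1] = read index

function createRing() {
  const sab = new SharedArrayBuffer((HEADER + FRAMES) * 4);
  return { idx: new Int32Array(sab, 0, HEADER),
           data: new Float32Array(sab, HEADER * 4, FRAMES), sab };
}

// Producer side (e.g. the Emscripten pthread/worker running the sound engine).
function push(ring, samples) {
  const w = Atomics.load(ring.idx, 0);
  const r = Atomics.load(ring.idx, 1);
  const free = FRAMES - ((w - r + FRAMES) % FRAMES) - 1; // keep one slot free
  const n = Math.min(free, samples.length);
  for (let i = 0; i < n; i++) ring.data[(w + i) % FRAMES] = samples[i];
  Atomics.store(ring.idx, 0, (w + n) % FRAMES);
  return n;
}

// Consumer side (inside AudioWorkletProcessor.process()).
function pull(ring, out) {
  const w = Atomics.load(ring.idx, 0);
  const r = Atomics.load(ring.idx, 1);
  const avail = (w - r + FRAMES) % FRAMES;
  const n = Math.min(avail, out.length);
  for (let i = 0; i < n; i++) out[i] = ring.data[(r + i) % FRAMES];
  Atomics.store(ring.idx, 1, (r + n) % FRAMES);
  return n;
}
```

On the worklet side, the SharedArrayBuffer would arrive via `processorOptions` or a port message before `process()` starts pulling, matching the postMessage initialization of Philippe's second attempt.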

Francois: Is audio for games solved now?

Preston: Some engines that use Wwise have web targets, what do they currently use for web export?

Philippe: Games using Unity can use the web right now, currently the Wwise plugin doesn't work with it
... The Unity and Unreal engine plugins are done in-house, we're working on the web effort now

Audio Device Client, Better and faster audio I/O on the web

Hongchan: I'm a spec editor for the Web Audio API
... ADC is a proposal, working on the design in the CG
... Want to collect feedback from industry experts
... ADC offers low level I/O, better access to the audio hardware, a dedicated scope that runs on a real time priority thread
... Why have a new API for audio?
... We want to close the app gap for audio
... All native platforms have low level audio APIs. What do we have on the web?
... Does Web Audio serve as a low level audio API?
... We're getting to a v1 Recommendation now
... When it was reviewed by TAG in 2013, feedback was around lack of extensibility
... You can't extend an AudioNode, also ScriptProcessorNode was broken
... So we came up with AudioWorklet, where you can write JS code for custom audio processing
... The most crucial improvement, we have isolation of audio processing code from the main JS thread
... Now seeing exciting things on the web, can easily use WASM code as well
... Developers now asking how to port to AudioWorklet
... Put WASM code into AudioWorklet and AudioWorkletNode
... This is a building block, part of the audio graph
... The limitation of the Web Audio API still applies, the render buffer size is 128 sample frames, less than 3ms at 44.1kHz
... Sensible for small building blocks like oscillators or filters, but not for an entire application such as a DAW
... Audio developers take advantage of AudioWorklet, SharedArrayBuffer, and ?
... The AudioWorklet is limited to 128 sample frames, so you want to use a Worker: set up your own rendering, use the AudioWorklet as an audio callback, and use a SAB
... The audio thread is higher priority, but the worker thread is low priority
... ADC is supposed to be the lowest layer, no redundant overhead, perfect for a WASM approach
... Gives you better access to hardware without affecting user privacy, free from the Web Audio API limitation of a fixed buffer size
... Layering of Web Audio on top of ADC
... We'll push for the idea of a real time priority thread for JS audio processing
... Game engines can take advantage of ADC for their audio code
... Unlocks many new large scale applications for audio
... You can configure hardware related properties, hardware delay, head mounted devices
... Also useful if you have custom WASM decoder for teleconferencing app
... Here's how to set up constraints (using the capability and constraint pattern)
... You can set up sample rate, channel count, callback buffer size
... Pass to the capability function
... use addModule() to launch your processing code
... You can import an ES6 module, use it to process input and output buffers, from the audio hardware
... You can register your processing function, using setDeviceCallback()
... Core design issues discussed in the CG now, we want to discuss integration plans with WASM, Web Audio, WebRTC
... I talked with Chrome security team. Real-time thread doesn't introduce a problem
... Enabled via a flag in Chrome, want to collect feedback
... Initial feedback from testing is positive
... Mitigate by only allowing ADC to be enabled from a top level document
... https://github.com/WebAudio/web-audio-cg
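The flow Hongchan outlines can be sketched as below. ADC is a proposal still under discussion in the CG, so the names here (`getAudioDeviceClient`, `addModule`, `setDeviceCallback`) follow the talk's slides and may change; treat everything as illustrative.

```javascript
// Pure helper: build the constraint dictionary from the talk
// (sample rate, channel count, callback buffer size).
function buildConstraints({ sampleRate = 48000, channelCount = 2,
                            callbackBufferSize = 512 } = {}) {
  return { sampleRate, channelCount, callbackBufferSize };
}

// Browser-only sketch (not executed here).
async function startClient(navigatorLike) {
  const constraints = buildConstraints({ callbackBufferSize: 256 });
  // Hypothetical entry point; how the client is obtained is one of the
  // open design questions in the CG.
  const client = await navigatorLike.getAudioDeviceClient(constraints);
  await client.addModule('dsp-processor.js'); // launches the processing code
  // Inside the module, setDeviceCallback() registers the audio callback,
  // which then runs on a real-time priority thread, free of the
  // 128-frame render quantum.
  return client;
}
```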

Navid: In Philippe's slides, there was an event loop with a queue. Do you have to ?

Philippe: Where should the algorithm run? It's in a Web Worker generated by Emscripten, generated by the pthread emulation
... You could suffer from audio starvation. Still an issue, no clear answer

Francois: Can you create Web Audio on top of ADC?

Hongchan: Yes. We have some ideas there. Currently the AudioContext constructor doesn't have an option to set the ?
... Another idea is to have a getContext method

Dave_Evans: Was there a reason that the ADC code uses a JavaScript file URL?

Hongchan: We used the established pattern, may want to revisit that

Dave_Evans: Preferable not to have to do it as files; some kind of object

Philippe: Important to isolate from the global scope, can do with files

David: Doing that, we're not trusting the web developer, shouldn't be too protective
... A lot of people share the pain, using createObjectURL from a Blob

Philippe: As middleware developers, it's hard to ask clients to put a JS file on their server

David: People use bundlers to package their code, except for one tiny file for the worker

Francois: SharedArrayBuffer is a common requirement

Luke: There's a cross browser plan, two new HTTP headers, cross origin opener policy and cross origin embedder policy
... Opt in for subresources
... When both headers are set, we can enable SAB, so there's no hazard from Spectre attacks
... Had lots of cross browser iteration, implementations in Firefox nearing completion
... We'll enable in Nightly to get feedback
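The two headers Luke describes are the cross-origin opener and embedder policies; serving both makes a page cross-origin isolated, which is the condition for re-enabling SharedArrayBuffer:

```http
Cross-Origin-Opener-Policy: same-origin
Cross-Origin-Embedder-Policy: require-corp
```

Subresources then opt in individually, e.g. via CORS or a Cross-Origin-Resource-Policy response header.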

Francois: Also enabling on mobile?

Luke: Mobile has extra limits on number of processes you can create
... Could allow it to be enabled on mobile

Philippe: Someone said web games are most popular on mobile right now

Francois: What are the plans for ADC?

Hongchan: I want to talk about integration, it'll be useful for audio, but we want to talk about how to integrate with other parts of the web platform

Francois: To summarise, it seems audio is going in the right direction, from a gaming perspective. There are ongoing plans to solve some more advanced issues.

Cloud Gaming

Advancing the Gamepad specification

Kelvin: director at Sony in PlayStation Division; PlayStation Now is our game streaming service
... will focus on gamepad as it relates to games on the Web
... How many of you have worked with the Gamepad API?

[quite a few hands go up]

Kelvin: how many are satisfied with it?

[one hand goes up]

Kelvin: that matches our own experience!
... we want to share some of the ideas we have to improve the situation
... Initially we tried to get our whole UI done through the Web layer - but the UI is controlled via a gamepad, and it turned out the gamepad API limitations prevented us from going that way
... when we realized that, we worked with other teams and consoles to see if there was interest in fixing this
... on the other side, we heard from the gamepad api creators that there wasn't enough demand to bring changes / momentum to the API
... we concluded we needed to help with the spec
... the Gamepad API in 2014 provided some basic gamepad functionality
... nothing has changed since then
... it is still a WD, never made it to CR
... as we surveyed the group of users, we heard interest in the following changes:
... - standardize gamepad inputs
... there is variation both across controllers AND across browsers
... which makes it very complicated to use it
... - support for modern controller features, e.g. touch surfaces, light indicators, haptics, accelerometer, gyroscope
... We divided our work into 2 work streams: v1 is about finishing the current gamepad spec, bringing it to Rec after clarifications and fixes to privacy & security aspects

Issues on Gamepad API

Kelvin: v2 is about adding support for modern gamepad features
... for touchpad support, there is variation in shapes of the surfaces, support for single/multi-touch
... we have a proposal that is planned for implementation in Chrome and Firefox soon
... Another addition is for Light indicator
... this is more complicated than it may look at first, and is key to guiding users in detecting active controllers
... we have a proposal https://github.com/w3c/gamepad/issues/67
... planned for implementation in Chrome and Firefox
... I'm interested in feedback - is this the right direction?
... Thank you!
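Since the API is poll-based, as the discussion below notes, a host samples `navigator.getGamepads()` every frame and derives button-press events itself. A minimal sketch; `detectEdges` and `startPolling` are illustrative helper names.

```javascript
// Pure helper: compare two button-state snapshots and report new presses.
function detectEdges(prev, curr) {
  const pressed = [];
  for (let i = 0; i < curr.length; i++) {
    if (curr[i] && !prev[i]) pressed.push(i);
  }
  return pressed;
}

// Browser-only loop (not executed here): poll once per animation frame.
function startPolling(onPress) {
  let prev = [];
  function frame() {
    const pad = navigator.getGamepads()[0]; // first connected gamepad
    if (pad) {
      const curr = pad.buttons.map(b => b.pressed);
      for (const i of detectEdges(prev, curr)) onPress(i);
      prev = curr;
    }
    // Note: a press shorter than one frame can be missed when polling.
    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}
```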

Nell: what about eventing?

Kelvin: we looked into it - some controllers are events-based, others are poll-based
... we aren't actively working on it - interested in getting contributors

Nell: there are lots of ongoing conversations in WebXR around controller input
... What about round vs square pad? is that looked at in a different issue?

Kelvin: no, this is part of the proposal in issue 67 - please bring your input

Philippe: what about haptics? it didn't appear in your v2 description?

Kelvin: Google is working on a proposal for haptics - will appear in the same repo

Philippe: what about controller identification? user id? device id?
... how do you manage it in the gamepad api?

Kelvin: it's not clearly defined at the moment; the default is that the first gamepad plugged in is assigned to player 1, the second to player 2
... what's less clear is what to do when a gamepad gets unplugged
... that isn't clearly defined yet in v1

Philippe: but the identification is bound to the player number?

Kelvin: correct

Matthew: from an accessibility perspective, we would need button remapping, or the ability to use 2 controllers instead of 1 for ease of use
... there is also the need to recognize other types of input as controllers, or consider new controllers such as the Xbox Adaptive Controller

Kelvin: we looked at this at the product level, but haven't addressed this at the API level
... it's a bit tricky to manage button remapping at the gamepad API level - we still need to put more thoughts into this

Matthew: ideally, we wouldn't have to ask the game developers to have to deal with it, but be managed transparently

Francois: you mentioned light indicators - any work for controllers that include screens?

Kelvin: not aware of this

Nell: a quick update on where WebXR is heading on this
... we have the notion of input source
... which can be a hand, a controller, etc
... we distinguish tracked and untracked input sources
... we expose the gamepad API as part of this input source
... one challenge is that VR controllers have a lot more diversity in their form factors
... we've converged toward a shared model with a hierarchy of interaction profiles, from most to least specific to enable fallback
... the InputSource object has a selectstart and selectend event
... we have an open issue about whether other buttons should trigger events - want to collaborate with the gamepad crew
... we're also building a registry of controller inputs

Kelvin: how dependent are you on the gamepad API?

Nell: we're using the interface and exposing it in the InputSource object - not as navigator.gamepads
... the way we structured it, we knew about changes coming to the gamepad API
... we can add additional properties to the InputSource as the gamepad API evolves and requires it

Kelvin: our controllers tend to be poll-based - how will you manage that?

Nell: data on controllers needs to be bound to the particular pose of the player and their devices - sync is key
... so we need to align to an event model, and ask the UA to deal with polling to keep the event loop aligned with the display framerate

Reducing the latency of inputs on the Web

Navid: I'm about to talk about input latency in general, and how we're looking at reducing it
... it's particularly important in Cloud Gaming for which we want to reduce client side latency as much as possible

Reducing input latency on the web

Navid: I'm part of the Chromium input dev team
... we're looking at making user interaction as smooth as possible, bringing new capabilities, and reducing developer pain points
... Today, I'll focus on latency of input but touch a bit on the 2 other aspects
... there are lots of sources of latency
... some come from how developers write their code e.g. long running tasks
... some come from how browsers work - either for architectural reasons, some for security or privacy reasons
... To help with developers-induced latency, we have guidelines on how to avoid these problems - e.g. requestIdleCallback, workers, or the new proposal isInputPending
... But I'm particularly interested in browser-induced latency
... for instance, browsers suppress touch events if you do screen touch in a small region (called slop region) because they need to detect whether this is a gesture or a scroll
... there are efforts to align user inputs with requestAnimationFrame
... efforts to move processing off the main thread eg. with worklets
... One source of latency is having workers and worklets but not having access to input events
... Workers have a very limited API surface
... you can delegate work off the main thread with workers and worklets
... but there is still the bottleneck that all the input still goes through the main thread
... Some animations aren't based on user input (e.g. time-based animations or scrolling) - I'll focus on those that depend on user input
... A few use cases: purely local gaming with offloading off the main thread, network-based streaming of games
... low-latency drawing, taking advantage of the offscreen canvas API that helps with rendering
... low-latency interactive audio - e.g. when playing audio based on user input
... or interactive animations based on user input
... [showing video]
... our goal is that when the main thread is under pressure, user input in a separate worker would still keep good performance
... so we want to duplicate or mirror user input into separate workers
... we have a proposed API - it focuses on user input events that are limited to a given target
... in that case, we could delegate events of a specific target to a separate worker
... with that approach, we can completely avoid blocking the main thread thanks to the usage of the compositor
... as a result, we can push events to the network at no cost
... it's fully polyfillable from a functionality perspective (but of course not from a performance perspective)
... we are at the early stages of the design - we have an explainer on which we would like to get feedback
... we already had one question about gamepad - we may have to deal with it differently
... please get in touch if you're interested
... This was one of the topic - input on workers
... The other one is non-rAF aligned events
... today, browsers delay events until requestAnimationFrame - the assumption is that events only matter when the display gets updated
... as an optimization
... some apps may be sensitive to the more precise timeline, e.g. when shipping events to the network
... this has led to this notion of high frequency updates - available behind a flag in Chrome
... it's a potential performance footgun - we're using different event names to make its potential impact clear
... I'm happy to hear about more sources of potential latency
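The developer-side mitigation Navid mentions, breaking long tasks into chunks and yielding to the event loop when input is waiting, can be sketched as below. `drainChunk` is an illustrative helper, and `navigator.scheduling.isInputPending()` was behind a Chrome flag/origin trial at the time.

```javascript
// Pure helper: take up to `budget` items from a queue and process them.
function drainChunk(queue, budget, processItem) {
  let done = 0;
  while (queue.length > 0 && done < budget) {
    processItem(queue.shift());
    done++;
  }
  return done;
}

// Browser-only driver (not executed here): yield whenever input is pending.
function processQueue(queue, processItem) {
  function step() {
    while (queue.length > 0) {
      drainChunk(queue, 10, processItem);
      if (navigator.scheduling && navigator.scheduling.isInputPending()) {
        setTimeout(step, 0); // yield so the pending input can be handled
        return;
      }
    }
  }
  step();
}
```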

<leonie> http://bit.ly/reduce-input-latency

Kasper: if we get input in the worker with offscreencanvas, can we run the whole game engine out of the main thread?

Navid: yes, that's the point

Kasper: that's cool

Myles: will that work on all Chrome platforms?

Navid: in the Chrome architecture, our browser thread is always independent from individual renderer used by Web sites, which should help with dealing with that approach
... one more thing
... a couple of new input capabilities we're looking at
... Pointer Events v2 for better drawing features
... an addition to pointer lock to remove OS mouse acceleration from events
... fancy mouse (e.g. 3D buttons)
... we really need feedback to justify these additions
... On developer points, we're working on reducing inconsistencies across browsers
... Gamepad is poll-based so has to be handled separately, including for security reasons
... one problem with that is that you may easily miss a button press with that approach
... I also gave input to the sensor APIs

WebCodecs & WebTransport

Peter: I work at Google on Chrome and have been involved in WebRTC
... want to talk about WebCodecs & WebTransport that can help with cloud gaming
... 1st issue is that WebRTC is not a great fit for cloud gaming (although it can be used)
... likewise, MSE is not great for cloud gaming
... Web sockets or RTC Data channels aren't a good fit either
... this has led to the design of these 2 new primitives I'll present
... you can think of WebRTC and MSE as two high-level APIs well suited for their use cases
... for cloud gaming, these high level APIs aren't a good enough fit; we could look at a dedicated cloud gaming API, but we thought low level APIs would serve the purpose better
... we've seen lots of feedback that a UDP-like API for client-server would go a long way
... Likewise, getting MSE to fulfill always-more-advanced codecs needs is going to be difficult
... this leads to the notion of looking at WebCodecs and WebTransport - similar to some of the primitives in OSs (e.g. Android)
... no particular challenge for Codecs
... for UDP, there are security challenges
... but QUIC to the rescue - QUIC is a new protocol that gives security, low latency congestion control, reliable and unreliable transport, serves as the basis of HTTP/3 and already has good library support
... It benefits games: faster loading, and more network resilient
... In the context of cloud gaming, HTTP/3 is too high level - that's where WebTransport comes into play
... Streaming today can be done:
... - with WebRTC (ICE, DTLS, SRTP) - all the depacketizing/buffering/decoding/rendering is done entirely by the browser
... that makes it hard to use in a number of contexts and is limited by RTP
... one workaround is to transmit it via the RTC data channel
... it remains hard to use in a client-server architecture, is not consistent across browsers
... Another approach is to use WASM to deal with the encoding and decoding, and transmit over WebSocket
... that's not ideal from a battery/CPU perspective - no access to hardware encoder/decoder
... We can do better
... WebTransport is based on QUIC, combines the advantages of datachannels and websockets with other benefits
... including ease of use in client/server architecture, with pluggable congestion control and a better API
... likewise, for codecs, with WebCodecs we can expose the features implicitly provided by MSE and WebRTC and expose them to the Web in a Web friendly way
... WebTransport is in origin-trial - we're looking for customers
... WebCodecs is less advanced - want to find interested parties to prioritize the work
... We're looking for customers and want to learn more about applicable use cases, and constraints of their use
... this isn't just applicable to Cloud Gaming, but applicable to gaming more broadly: low latency game state push, low latency game assets streaming, in-game comms, low-latency server-based machine learning, transcoding
... get new codecs support faster without waiting for browser implementations
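The client-server flow Peter describes can be sketched with WebTransport datagrams. The API shape here (`transport.datagrams` with writable/readable streams) follows the origin-trial draft and may differ from what eventually ships; the packet layout is purely illustrative.

```javascript
// Pure helper: pack an input event as [sequence (uint32 LE), button, state].
function packInput(seq, button, state) {
  const buf = new Uint8Array(6);
  new DataView(buf.buffer).setUint32(0, seq, true);
  buf[4] = button;
  buf[5] = state ? 1 : 0;
  return buf;
}

// Browser-only sketch (not executed here).
async function sendInputs(url) {
  const transport = new WebTransport(url); // QUIC under the hood
  await transport.ready;
  const writer = transport.datagrams.writable.getWriter();
  let seq = 0;
  // Unreliable, unordered delivery suits low-latency input: a lost packet
  // is better replaced by fresher state than retransmitted.
  document.addEventListener('keydown', (e) => {
    writer.write(packInput(seq++, e.keyCode, true));
  });
}
```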

Steve: we (Sony) are definitely interested in both

Chris: strong support from BBC as well
... what about synchronized events along your media stream

Peter: the metadata could be transmitted in whatever format you want
... and the app could control the synchronization when rendering via WebCodec

Chris: we're working on a DataCue API - event messages delivered in a media container
... I'd like to be able to carry forward these requirements in that API

Peter: right now, the WebCodec API is below containers
... but all of this would be in the control of the application - they can implement these additions without having to wait for browser implementations

Kelvin: for client-server, how far is the implementation on that?
... interest from other browser vendors?

Peter: client-server: work hasn't started yet; you can emulate it with the current P2P implementation
... I have an almost-working demo for client/server
... wrt other browser vendors - Microsoft is very involved in WebTransport
... on WebCodecs, Paul from Mozilla is involved in the work

Francois: is transport a bottleneck in multi-user today?

DaveEvans: have been desperate for UDP support - this looks very interesting

Philippe: WebCodecs looks very interesting for us
... what about caching?

Peter: this would not integrate with browser cache at all

Bernard: but with HTTP3+QUIC, you get that for free

Peter: but not with WebTransport

Philippe: I think it's important to keep the two APIs separate - we're interested in WebCodecs without necessarily using WebTransport

Francois: would this be only in the main thread?

Peter: the idea for both of these is that they would work off the main thread
... that's a current limitation of WebRTC Data channel
... making these new APIs transferable should not be too much of an issue

Francois: how do you render the output of a WebCodec?

Peter: one output is a MediaStreamTrack
... it could also be tied to WebAudio
... for video, it's more challenging for performance reasons

Yang_Tencent: I'm wondering what's the difference with WebRTC from a gamer perspective?

Peter: are you referring to media or data channel?

Yang: both

Peter: for data channel (the closest thing to WebTransport), the differences are:
... - the protocol is QUIC rather than SCTP, which is easier to deploy on servers, requires fewer RTTs for connection establishment and has improved congestion control
... and it can be used off the main thread
... Compared to WebRTC media stack, all the media/transport is tightly coupled
... with WebTransport and WebCodecs we decouple this and provide a lot more flexibility, which is needed for Cloud Gaming
... this work started in the context of WebRTC NV (Next Version)

Yang: is this used in Stadia?

Peter: [no comment]
... WebTransport is incubated in the WICG
... WebCodecs is looking for expression of support - might land in the MediaWG

Gender inclusiveness in localization

Francois: We are going to talk about localization
... and diversity and inclusion; two topics in one
... how you localize your games and how do you adapt an inclusive perspective
... we are going to talk about that, have a short discussion, and then break for lunch

Gabriel: Gabriel Tutone. We are with Keywords Studios
... we supply many support needs

Elina: Elina Bytskevich. Our goal is to raise awareness and have an educated debate
... we want to make sure we reach everyone, and that everyone is reflected
... What does "genderize" mean?
... dictionary definition vs. other systems in practice

Gabriel: 6,900 languages; different grammatical gender

Elina: What is gender inclusive language?
... gender neutrality should be used when gender is unknown or indeterminable
... in gender languages, the male gender language was the default
... but more recently academics are using gender neutral language
... There have always been female gaming players
... Let's see what developers have done

Gabriel: First example is The Sims
... in Sims 4 they removed binary gender descriptions
... characters can choose how to dress, and how characters behave
... Fallen London has a new approach for choosing gender
... instead of an option to select a gender, they ask for a form of address
... which may not match how players identify
... Sunless Sea
... uses gender neutrality
... how do we introduce 'gender-inclusive' language?
... In English there are some options; but not so much in other languages when there are gender-specific names
... Some solutions would be duplicating gender structures in the source files
... this adds more strings
... could also consider using different sentence structures
... to avoid gender-specific references
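The string duplication Gabriel describes can be sketched as a string table with per-gender variants plus a neutral fallback, the same shape as ICU MessageFormat's select. The table contents are invented for illustration.

```javascript
// Each genderized string multiplies the entries translators must maintain -
// the cost mentioned in the talk.
const strings = {
  welcome: {
    female: "Bienvenue chère amie",
    male: "Bienvenue cher ami",
    other: "Bienvenue cher.e ami.e", // inclusive written form (see talk)
  },
};

// Pure helper: pick the variant for the player's gender, falling back to
// the neutral form when the gender is unknown or indeterminable.
function t(key, gender) {
  const entry = strings[key];
  return entry[gender] || entry.other;
}
```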

Elina: Let's see how languages are doing with inclusivity
... In French, gender-inclusive nouns have been created
... a more popular solution from what I see online
... is that people add a gendered ending between two full stops
... such as cher.e.s ami.e.s
... In German you can use an asterisk
... In Russian, it becomes more complicated
... we have three genders: masculine, feminine and neutral
... it is challenging to work around
... linguists typically use second person plural
... If I translate the sentence, 'yesterday I found a nice cake recipe'
... it is difficult to translate into Russian in a neutral way
... professional names may refer to women
... but in Russian it would only be masculine or feminine

Gabriel: In Spanish, the masculine refers to groups
... such characters cannot be pronounced or reproduced in text to speech, which is problematic for people with disabilities
... In Italian, it is common within the LGBT community to replace the a and o endings in some words
... I have not seen this much in everyday language, but it is used in the LGBT community and hopefully its usage will expand
... Like Spanish, there is the possibility to use @ to indicate plural
... but if it's used for every word, the text would be full of symbols

Elina: Using non-gender terms is not inclusive
... for developers, adding strings is more time consuming
... gender inclusive language is not supported in text to speech

Gabriel: We have not overcome challenges to localization
... we would like to find some solutions for gender localization
... some companies have programs to actively avoid gender biases
... with more women and non-binary people, we need to be more reflective
... thank you

Francois: Many thanks for the presentation. Any feedback from people who have had experience in localization?

Christian: It's more about use of the language
... when I talk to you, I wonder why this is really necessary?
... sometimes, it's annoying in German
... when you use not the asterisks but use both words
... then text to speech becomes unlistenable to me

Gabriel: you are mentioning that it feels unnatural to read the language and official documents
... You want to make sure that the player feels included in the game
... good to start to advance in language and localized text
... make sure the player feels like the protagonist

Dom: Thank you for this eye-opening presentation
... there is an interesting tension between gender inclusiveness and accessibility, particularly with pronunciations
... that is way beyond games and the web
... I wonder if you know of any local or global effort in turning this written form into pronounceable one
... there is a W3C effort to make certain words more pronounceable
... I guess it could be a way to broaden the usage
... but they need to exist. Is anything happening?

Elina: The language is not there yet
... we need time before the language catches up
... discussions about gender inclusiveness
... traditionally, there have been forums and web discussions
... and how the written language evolved
... a couple days ago I read an interesting article on how to represent the written language in the speech
... there was no way in Russian
... unfortunately it sounds bizarre
... need to say the actor and the actress
... need to say both for inclusiveness

Gabriel: A lot of formal academic languages are dictated by academies
... and some of them are behind
... the modern day thinking and cultural age
... I don't know of any particular associations that are trying to push it through
... languages are always evolving
... will be interesting to see how language evolve to include gender neutral options

Elina: Discussions have been happening in small, marginal groups, but today people are becoming more and more familiar
... I think it's a matter of time
... the Russian standard dates from 1968
... it has not changed, but it's time

Bjorn, King: I'd like to add to the discussion of inclusiveness and diversity in games

... good that we care and think about this actively
... at King we are very conscious about this
... we collaborated with the MIT Game Lab
... and identified diversity in avatars
... and with Activision Blizzard we found blind spots in the diversity of the characters
... it is more challenging with text to speech
... we should not ignore it

Elina: We see this happening

Indira: this takes me back to the '80s, when new terms sounded strange
... like chairman to chair
... we have to start somewhere, and games is a good place to start

Elina: My Mom teaches English and she asked me why this is important
... they don't understand why this is an important topic
... this is an important moment so that people will be more aware and learn to be more inclusive

Nell: I expect there are some folks who may be uncomfortable with this topic
... that feeling of being uncomfortable may be similar to the feeling of those who do not feel included
... challenge to put forth
... is that you express a reaction that you don't like it
... but maybe there would be people willing to speak up if your reaction is not to shut down those conversations

Dom: Follow-up to question I asked earlier
... the web has played a role in getting more understanding in the needs of gender-less communities
... LGBTQA communities
... wonder if there is an opportunity for the technical world to help by making it easier to distribute new language, localization patterns
... idea of how to promote the adoption of these new conventions
... I don't have a specific proposal
... but the seed I am trying to plant is whether this approach could help to simplify this
... fuzzy but pointing at a direction

Elina: What I can think of
... on any given webform
... you are asked to mention your gender or not
... a good point for languages to start to use phrasing that is accurate for the gender chosen
... If there is a question, 'have you ever been to a tropical country?' this would refer to me as a woman

Gabriel: Many languages use masculine as the default
... we are more focused on the localization aspects; make it gender-neutral in translation
... cannot think of anything on the technological side

Elina: Add more variables and more strings
... so the proper forms are used dependent upon the contexts
... first pronunciations

Francois: we don't know how to pronounce these forms

Matthew, Paciello Group: This has been very enlightening
... in terms of accessibility
... I know W3C is doing something in pronunciation
... I think one of the specs has gone into consensus
... but I will definitely look into it

Francois: someone just took an action
... One thing I forgot to say: when I was researching the topic on localization
... I chatted with i18n people at W3C and they pointed me to the editors of the Multilingual magazine
... who had just published an issue on games.
... They put me in touch with several authors, including Keyword Studios. Thanks for that!
... They also sent some copies of the magazines, which you'll find at the back of the room. Please help yourself.

Gabriel: If you want to read further

Francois: yes, the article is there

Francois: many thanks for this talk and for the discussion

Breakout Sessions

Francois: We have eight topics and three rooms
... we will divide this main meeting room into two; two groups in parallel
... you can move tables around
... Here's how it's going to work
... You will have 50 minutes per break-out session
... and 10 minutes in between to switch rooms
... I will ask each breakout session to take notes
... you can create a channel on irc, such as #games-@
... or if you want to use another system, Google docs, and Office document is fine
... you have to take notes so that we know what you have been talking about
... Eight topics
... First one is 3D controls
... Three rooms: Stinger is this room to be divided into two
... Origami is room we had yesterday
... Discoverability and Monetization clinic; two in one
... Second on networking and games, web assembly Q&A in this room
... Gandolf
... Threads is the first session; web games in hosted apps, such as Facebook Instant Games, Baidu
... what it means
... third room is session on audio and 3D controls on the web
... possibility of a 3D element
... take a photo
... First session starts five minutes from now 1:05 to 1:55
... second session 2:05 to 2:55
... and then we come back to this room
... and each individual session reports outcomes of their discussions for two minutes each
... is that clear for everyone
... Ready, set, go

More 3D Controls

Yasushi: Want a way to have 3D on the web be declarative
... to know what models look like
... explore how to do this using web components
... augmented reality; see what the next steps are for a tag that is native in the browser
... browser rendering not JS
... run natively and run as many
... see where the benefits are
... this was first time we started to see what approach...
... seeing and comparing
... to the 3D elements
... how to go from there
... what we did not know
... there was not a W3C person; not sure what the next steps are
... Khronos also was there...how to make the models look the same
... some other groups discussing
... how a tag should look; features it should have
... next steps were not clear

Neil: we wondered if a liaison might be interesting
... super open to collaboration with W3C
... our processes are pretty open
... have a common discussion on web use cases and the glTF standard

Francois: W3C is usually pretty open to that; usually constraints come from the other side, e.g. because we work in public.

Discoverability and Monetization

See the minutes of the breakout session.

Tom Greenaway: The idea was to think about ways that the open web can take inspiration from the closed ecosystems like WeChat and Baidu
... and improve monetization and discoverability of games
... we discussed embedded metadata formats
... to allow developers to describe and declare the capabilities of their games
... or an open web catalogue of games, to filter, etc.
... or maybe metadata to communicate what that game is about, a video or gif
... almost like a trailer for the game
... on the monetization front

Tom Greenaway: a few different sides
... limiting the spending of users
... parents can spend money on credit card to have funds put aside
... limit how much money they can spend
... gaps with different advertising models
... like rewarded video ads
... and rediscoverability of ads and the trust users have in games
... that user trust determines whether they are willing to spend or not
... anything else anyone from group wants to raise?

Dom: On discoverability
... what the video games and schema.org discussed? https://schema.org/VideoGame

Tom: yes, existing discovery on schema.org was discussed
... it's more about a game, the facts related to any sort of game and the properties that exist
... but don't describe the input types; more about game director, who made the music
... but won't help with filtering or categorizing games
... game genre is not included

Francois: capabilities as well

Tom: yes, input types
... you could declare this game requires touch or keyboard and mouse
... device, an open web catalogue could filter

Francois: And monetization schemes as well?

Tom: yes, the scheme could say which monetization is available

Dom: And are there things that already exist, like in App Stores?

Tom: Do App stores have a common vocab?
... yes, AppStore, Google Play Store have existing vocabularies
... someone raised that maybe those are too restricted; be more extensible
... many more ways to describe a game; maybe a tagging model
... whether game requires connectivity; monetization more straight forward
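
A hedged sketch of what such an extensible catalogue entry could look like; only `name` and the `@context`/`@type` wrapper are real schema.org terms, while the capability and monetization properties are invented here to illustrate the gaps the session identified (they are not a real or proposed vocabulary):

```javascript
// Hypothetical metadata record: a schema.org VideoGame entry extended
// with invented properties for input types, connectivity and
// monetization schemes.
const gameEntry = {
  "@context": "https://schema.org",
  "@type": "VideoGame",
  name: "Example Quest",
  // Invented extension properties discussed in the session:
  inputTypes: ["touch", "gamepad"],
  requiresConnectivity: false,
  monetization: ["rewarded-ads", "in-app-purchases"],
};

// An open web catalogue could then filter games on declared capabilities:
function supportsInput(entry, input) {
  return (entry.inputTypes || []).includes(input);
}
```

A catalogue looking for touch-capable offline games would then check `supportsInput(entry, "touch") && !entry.requiresConnectivity`.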

Francois: Thank you

Web Games in Hosted Apps

See the minutes of the breakout session.

Ping: Baidu Smart Game is a new kind of game emerging
... with standard programming language
... powered by rendering provided by the super apps
... new kind of game emerging in mobile ecosystems
... special but very common; has special details
... we discussed the design and implementation part of this native rendering of this JS binding
... and also audio and the file system
... we discussed with Facebook about the rendering
... how to get better @
... and provide flexible, programmable APIs with developers
... today we don't get out many new inputs
... for this specification
... but we will summarize more and get more
... our experience in this run-time implementation
... and new needs from our end users
... such as record audio, service workers and web assembly
... we also support distribution of native apps
... and support
... and new AI framework was discussed
... we proposed a new web AR
... much to do with this kind of APIs, implementations and user scenarios
... this was today's discussion

Francois: Anyone want to add something
... I attended part of this session
... the games run in a hosted app
... there are things you cannot do there that you can do in the web environment
... and conversely
... some of the new [super] APIs could be brought to the web; that could be interesting

Ping: We are compatible with web specifications
... run time, JS execution, binding framework


Threads

See the minutes of the breakout session.

David: complexity of web worker
... may be stuff we can use
... web assembly
... immutable
... sharable
... using it from the thread
... discussed idea
... if we can make a web worker without making a file
... and use it from the calling thread
... import, like library from main thread
... that is the discussion we had; pretty interesting
... see what's next
... questions?

Francois: what are the next steps?

David: I don't know
... we have a lot of motivation
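
The "web worker without making a file" idea has a common workaround today, which may be what a first-class API would replace: serialize the function into a script and load it through a Blob object URL. A minimal sketch using standard browser APIs (note that closures over main-thread variables do not survive the serialization, one limitation motivating a real API):

```javascript
// Turn a self-contained function into worker source text by wrapping
// it in an immediately invoked call, so it runs on worker startup.
function workerSourceFrom(fn) {
  return `(${fn.toString()})();`;
}

// Browser-only: spawn a Worker from that source with no separate file,
// by pointing the Worker constructor at a Blob-backed object URL.
function workerFromFunction(fn) {
  const blob = new Blob([workerSourceFrom(fn)], { type: "text/javascript" });
  return new Worker(URL.createObjectURL(blob));
}
```

For example, `workerFromFunction(function () { postMessage("ready"); })` starts a worker that immediately posts a message back to the main thread.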

Networking and Games

See the minutes of the breakout session.

Peter: We discussed if the web transport API is useful
... discussion that it would be, mainly in a client/server context
... both for @ and cloud gaming
... VR
... discussion on how to transition from web sockets to web transport and what polyfills could look like there
... some advanced discussion on congestion control
... how web app can customize that
... web codecs
... use it more easily synchronize it
... with video things

Francois: Any ideas in mind?

Peter: control over when things render
... specify when things render at the same times


Audio

Paul: For audio we started talking about
... device clients
... lots of control on audio processing
... pipeline
... explained where AudioContext is not the fastest way to do things
... talked about why and how we will make it work with WASM
... such a big request from vendors
... we then continued to engage with web transport and web codecs people
... how those three APIs would be used together to form a coherent solution
... such as @ network, games, reimplement
... emphasis was on performance
... then we talked about needs for XR, VR
... mostly about positional audio
... talked about whether or not it is feasible to create a credible high-fidelity solution, such as
... Omnitone
... and various things about input
... real immersion
... also talked a bit about solution to offload parts of the processing to hardware
... specialized for audio or more general purpose
... I think that's it
... take-away is
... at the end of the web API, now we are gathering use cases and figuring out solutions
... make sure all APIs play nice with others out there

Web Assembly

See notes taken during the breakout session.

Mathias: we talked about Web Assembly
... and we talked about a suggestion that when we create new APIs, we create them with Web Assembly in mind
... @ and object management
... user can keep control over memory
... talked about memory mapping Web Assembly
... nothing got on yet
... talked about the obfuscation and protection of assets; WA doesn't do anything here
... web crypto might be something for game devs to use
... run time
... code being able to unlock resources to protect assets
... nothing WA specific
... are there resources for 'regular' web developers to start using Web Assembly
... example was going through C
... how do regular JS coders jump into it
... Assembly Script was a suggestion
... exception handling is coming in Web Assembly
... discussion about shared array buffer
... and is there feature detection
... some currently, but might change
... load time and efficiency for the modules
... question of where Web Assembly is with loading and shared libraries
... recent changes have paved the way for load time dynamically
... @6 module
... another question how to do cross-origin sharing to save loading time
... slight discussion on caching
... recap from yesterday's talks
... lastly, there was a question if there is a different cost calling from JS to WA
... blog post with recent benchmark numbers; should be very fast. Hope to provide link to that blog post
... that covers it

Francois: Thank you
... interesting that it's in your session that you discussed asset protection, which was raised yesterday
... I don't see any solution proposals
... more about using tools you have

Mathias: Current state of things I guess

Francois: and some thinking ahead
... thank you for this summary

Accessibility Clinic

Matthew: We did not do much of a clinic
... but we discussed a few different things
... Some general questions how blind people play beat 'em up games
... talked about authoring tool access
... consuming of content is great but creating is better
... work going on in that respect, last week, the Unity team conveyed how to make games more accessible
... some big interest in authoring tool access which is great
... we talked about things that came up yesterday
... how to provide some form of accessibility, UI focus
... how we might look into that
... note the issues and opportunities arising from the inclusive language talk earlier today
... accessibility people may already be aware of those things
... there is work going on that might be helpful there
... we talked about interesting, innovative projects in XR
... MS produced Seeing VR
... allows to do some forms of accessibility for XR
... things like processing that just work
... color filters
... post processing effects
... as developer, more in way of semantics
... in scenes you can guide users around...
... outline things of interest, use higher contrast, use a line to guide people to the right place
... loads of stuff being done in XR that is very exciting
... also talked about progression of awareness of accessibility; gone from 'what is it' to 'how do I do it'
... and many console games have at least one accessibility feature, from changing text
... to use audio, spatial audio, higher contrast features like that
... a lot of progress being made
... even cloud streaming games, cross device, XR
... real opportunities there
... plenty of things that I have learned from this workshop over the past two days
... thanks for everybody
... all the talks were very interesting
... if anyone has questions about accessibility, please contact me via my Twitter account

Francois: thank you for the report
... that concludes the break-out sessions
... we will break for 15 minutes and then we will conclude the workshop
... and we will ask you what we do now; different options..
... see if there is agreement, and see if there are people who are interested to do the letter
... and writing a letter to Santa Claus...
... then we will wrap up
... short break; back at 3:55

Next steps for standardization

Francois: Different types of groups at W3C.
... CGs are open to all, for preliminary technical work
... BGs are open to all, for business requirements
... IGs are for W3C members, to set roadmaps for a technology space (Media & Entertainment, Payments)
... Could we have a Games IG?
... Actual standardisation happens in WGs, strong royalty free policy
... We'll review the topics from the workshop. Options:
... 1. do nothing, if we don't have the right people, or if it's already covered
... 2. want to refine scope, could happen in a BG or IG to look at use cases and requirements
... 3. For an understood technical problem, incubate in a CG, new or existing
... 4. when there's a solution with some support, move to standardisation, charter a WG
... Who's willing to support, drive the work, participate?
... First topic: Threading, e.g., PromiseTask proposal, sharing objects between threads, worker creation that takes a function

Dom: Useful to write up typed objects proposal, problem in WASM for GC language support, useful primitives

David: Where to take worker creation with a function?

Dom: likely WHATWG, but useful to write it up first, then we can determine where it goes
... could bring to a Game IG or CG?

Francois: That makes us jump directly to the last thing I wanted to cover
... What do we do next with games per se? Do we want to create a community around game for use cases and requirements at W3C?
... There's an inactive games CG, we could make a games IG, if people are willing to participate
... That could be a natural home for writing such proposals, then take to WHATWG, TC39, W3C, etc.
... Show of interest in a games activity at W3C?
... [20 or so]
... That's a good score. Interesting. I'll reach out to you after the workshop to evaluate that possibility then.
... Web Assembly: Debugging, DWARF proposal. There's already a CG and WG, so no need for more structure, just want to input to those
... DWARF: Please contribute to the CG
... Garbage collection?
... everything covered already
... 3D rendering: universal textures, second-gen PBR
... Interest in glTF and how it integrates to the Web
... Semantic level for glTF, have dialog between Khronos group and W3C
... Then, the 3D model viewer

Neil: An action is to set up a simple liaison between Khronos and W3C

Nell: File an issue in the Immersive Web repo to start discussion on the web side
... There's people already interested in that, in the CG, so makes sense to start there

Francois: WebGPU, going in the right direction, it's needed. Discussed shading language, can we usefully say something?

Myles: Useful to gather evidence, use cases, bring to the WebGPU CG
... We're already gathering use cases in the CG, individual input welcome
... Universal support for a WG/CG model, similar to Immersive Web

Francois: So the workshop can voice our general support
... Asset loading. Status of background fetch?

Andrew: It's in WICG, Google actively working on it, don't know of other intents to implement yet

Francois: Useful to flag our support for background fetch
... Caching of common libraries. No solution there, there are privacy issues. What needs to be discussed further?

Kasper: It's straightforward on our platform, needs engine support. Discussion among engines on how to support
... If engine code isn't coupled to game code, can optimise by reusing

Francois: So nothing to do there
... Best practice for games developers. A games IG could do that. Who's interested?
... [2 or 3]
... It's doable, but needs a lot of investment

@@4: A lot of it is engine specific

Diego: Also include MDN people, the place developers would go

Francois: Storage

Andrew: Game developers are interested in defining multiple buckets. No API proposals yet
... It's a mishmash of WHATWG and Service Workers. IndexedDB in Web Apps WG?
... I'll take this and raise the issues with people. Need to figure out how to expose the buckets

Francois: Accessibility. Mapping between native assistive APIs and AOM
... Alignment already exists. When rendering content in a canvas, AOM is the equivalent of ARIA for non-DOM content
... Educating developers
... Web IDL bindings from native code to AOM. The take away is we have some things already in place, but needs further exploration

Matthew: Early prototyping could be done using Rust, which has the bindings
... Looks to be the most concrete thing to do first, then I'll send suggestions to the APA WG, which reviews specs and give advice to other groups
... The issue of pronunciation is more W3C, there's a Task Force, may be in there, I can raise it there

Francois: Audio. OutputLatency mentioned as not supported

Raymond: It's in the spec, not implemented yet

Michel: Audio driver selection, on Windows, can have multiple drivers in the OS. Some applications need low latency driver
... Not necessarily related to the Web Audio spec, useful to have a way to know if the right driver is installed for low latency

Raymond: Audio Device Client is in the Audio CG, people active on it

Francois: People can join the CG to help it progress

Raymond: We have a proposed solution, but need input from participants to tell us if it meets the needs for a low level audio interface

Francois: SharedArrayBuffer support. There's an agreed path to a solution. Our report can emphasise the need for SAB
... High priority threads. Anything to capture?

Raymond: May not need to be standardised, could be a quality of implementation issue. It just need to be made safe
... This will be in Web Worker spec, not part of audio spec

Francois: Gamepad inputs. The gamepad spec hasn't evolved, but help needed, in Web Apps WG
... Call to action, to give support, or contribute to get additional features
... Alignment with WebXR, need communication between the two groups
... Input latency. Allow inputs in workers: addEventTarget(), also non-rAF aligned input events

Navid: pointerrawupdate is already being discussed
... Input in workers, WICG incubation, we need partners, first users to give feedback, can use origin trial
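
A hedged sketch of how a page might opt into the higher-frequency events mentioned here, falling back to ordinary `pointermove` where `pointerrawupdate` is not yet supported; the event name is real and being incubated, but the helper function is invented for illustration:

```javascript
// Feature-detect pointerrawupdate: supporting browsers expose the
// corresponding on* handler property on event targets.
function rawInputEventName(target) {
  return "onpointerrawupdate" in target ? "pointerrawupdate" : "pointermove";
}

// Usage in a browser (illustrative): listen at the highest input rate
// available, e.g. for low-latency drawing or aiming:
//   window.addEventListener(rawInputEventName(window), handleInput);
```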

Francois: Streaming

Peter: WebTransport is in WICG, the p2p version is in an origin trial in Chrome, if people want to experiment

Francois: We had expressions of support, people more interested in client/server than peer to peer
... WebCodecs, synchronisation.

Peter: There's a Discourse post on WebCodecs, show support there to get it to WICG

Francois: Localisation. We discussed pronunciation, Matthew following up
... Discoverability and monetisation: a vocabulary or schema?

Tom: I'd really like to explore that
... Could resurrect the video game schema CG, not active

Dom: Rewarded ads are missing?

Tom: The payments WG is focused on checkout and transactions, maybe for another group

Francois: There's a payments BG
... could be a natural home for that topic

Tom: I can look into that

Francois: 3D controls, a possible 3d model viewer element. The immersive Web CG would be a good first home for that
... Set up a lightweight liaison with Khronos group
... Web games in hosted apps. A game activity in W3C could be useful, there's already discussion on hosted apps in the Chinese IG
... They'll bring that discussion to wider W3C
... We have some support for creation of a game activity. I'll contact some of you to see who's willing to drive this, we'd need a Chair
... This takes time, drafting a Charter, getting approval to create an IG
... Should we have future events? Generally we have workshops as one-off events
... Interest in meeting again?
... [about 25]
... What didn't we cover?
... Asset protection? Can discuss in a game activity

Ricardo: Platforms such as Steam provide multi-user, peer to peer, if you want to do a multi-user game you need WebRTC, hard to set up servers etc. Want this provided as a service, so you can focus on making the game

Francois: What does that require?

Ricardo: Main issue is with WebRTC you can make a p2p connection, but it doesn't always work between home and corporate networks. Needs infrastructure, TURN
... Playstation and Steam ecosystems provide this, so you don't have to do it yourself

Tom: Is this a standardisation issue, or a service that needs providing?

Ricardo: More about having a service, it's a big missing piece

Francois: Any other topics?

Binh: We make games for children, in Europe we have GDPR, requires consent, but children can't give consent
... Useful to discuss more

Edgar: Experiences will be VR and AR, need to keep this in mind. Maybe not ready yet

Francois: Good point. We didn't emphasize XR in this workshop because we already ran other workshops on XR.
... Please use your contacts to build the community

Wrap Up

David: Thank you very much for coming. I'm proud of what we've done here

[End of minutes]