Quality of Implementation/Audio
From Core Mobile Web Platform Community Group
Audio is essential in games, where it is nearly always used in the following way:
- there is a background audio track playing,
- short sounds (effects) are triggered by in-game or user-generated events.
While there is a great deal of interesting work going on in implementations and spec development of low-level audio APIs, game developers rely nearly exclusively on the HTML5 AUDIO element for their needs.
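The pattern above can be sketched with plain Audio elements: one looping element for the background track, plus a small pool of elements cycled round-robin for effects, so triggering a new effect never has to wait for an old one to finish. This is a minimal illustrative sketch, not code from the group; the `createChannelPool` helper and all names in it are invented here, and the factory is injected so the pooling logic is independent of the DOM.

```javascript
// Sketch of the common game-audio pattern: a fixed pool of playback
// channels, reused round-robin for short effect sounds. The `factory`
// argument creates one channel (in a browser: () => new Audio()).
function createChannelPool(size, factory) {
  const channels = [];
  for (let i = 0; i < size; i++) channels.push(factory());
  let next = 0;
  return {
    // Assign the effect to the next channel in the cycle and play it.
    play(src) {
      const channel = channels[next];
      next = (next + 1) % size;
      channel.src = src;
      channel.play();
      return channel;
    }
  };
}

// In a browser this would be used roughly like:
//   const music = new Audio("music.ogg");
//   music.loop = true;
//   music.play();
//   const effects = createChannelPool(8, () => new Audio());
//   jumpButton.onclick = () => effects.play("jump.ogg");
```

A pool of 8 channels matches the "8 simultaneous channels" figure below; with fewer, a burst of events silently steals channels from effects that are still playing, which is exactly the cut-off behaviour described next.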
Currently, there are three quality-of-implementation issues which are hampering the development of games on the Mobile Web Platform:
- Some browsers cannot play enough audio elements at the same time, preventing the use of background music or cutting off effect sounds when new ones are triggered. Playing 8 channels simultaneously would be sufficient for most use cases.
- Latency is so high in some browsers (up to 0.8 s on iOS) that it makes sound effects impossible to use at all. Sub-10 ms latency would be optimal.
- Playing sound files generates audio artifacts in some browsers.
Possible next steps:
- add non-normative implementation guidance to the spec;
- develop tests: testing this is hard, but tests would be useful.