
The WebXR Device API

WebXR provides the foundations to bring immersive experiences to the Web, by enabling the rendering of content in Web browsers to augmented reality (AR) & virtual reality (VR) devices.

The WebXR Device API is coming to browsers in the not-too-distant future. In this talk, Ada Rose Cannon explains why it has evolved from WebVR and some of the important new features it brings to the Immersive Web. See slides.


Video hosted by WebCastor on their StreamFizz platform.


Ada Rose Cannon, Developer advocate at Samsung for the Web browser Samsung Internet, and co-chair of the W3C Immersive Web Working Group.

Ada Rose is passionate about virtual reality and other new front-end Web technologies. She writes and talks about graphics on the Web, offline-first Web apps, and front-end animation performance.



Hi, I'm Ada Rose Cannon. I've just come from a full day of chairing the Immersive Web Working Group, so forgive me if I'm a little tired.

But I'm here to talk to you today about the WebXR Device API, which is the API developed by the Immersive Web Working Group to allow access to the displays and sensors of VR hardware so you can make virtual reality and augmented reality experiences.

I work for the web browser Samsung Internet.

Has anyone heard of Samsung Internet before?

A few more hands than usual, that's pretty good.

So yeah, it's a pretty cool browser.

I like it.

They pay me.

That's enough of that.

So the WebXR Device API is based around WebGL, so you can render a scene using information you've got from the hardware and send it back to the device.

It ends up with a pretty good result because you can...

The goal of the WebXR Device API is to allow you to build one experience that works across devices, and not just across VR headsets: between mobile phones and headsets, between virtual reality devices and augmented reality devices, so you can build one experience which progressively enhances toward whatever the user has.

I mentioned handsets because one of the really nice features of the old WebVR API was that people built these things where you could look through the device as if it was a magic window, as a way to onboard people onto a fully immersive experience.

It was a way to give people like a little taster of your VR scene in order to persuade them to have a go at full virtual reality.

This was such a popular pattern that worked so well, we've made it a first-class citizen in the WebXR Device API. So inline sessions can be embedded inside a web page, allowing you to do virtual reality which users can view just by moving their phone around.

And then, should they decide to, they can push the button and put it into a headset if they've got something like a Gear VR, or they can fire it up on their computer or on a standalone headset.
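The magic-window pattern she describes can be sketched roughly as follows. The mode names `'inline'` and `'immersive-vr'` and the `isSessionSupported()`/`requestSession()` calls are the real WebXR API; the helper function itself is just an illustration, not code from the talk.

```javascript
// Sketch of the "magic window" pattern: prefer a fully immersive session
// when the hardware supports it, otherwise fall back to an inline session
// embedded in the page. `xr` is navigator.xr (or a stand-in for testing).
async function pickSessionMode(xr) {
  if (await xr.isSessionSupported('immersive-vr')) return 'immersive-vr';
  if (await xr.isSessionSupported('inline')) return 'inline';
  return null; // no WebXR support at all
}

// In a browser you would then do something like:
//   const mode = await pickSessionMode(navigator.xr);
//   const session = await navigator.xr.requestSession(mode);
```

Note that immersive sessions must be requested from a user gesture (the "push the button" step above), while inline sessions can start immediately.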

The end result of building VR and AR on the web is that it does end up being the fastest way to get a new developer building their Virtual Reality experiences for the first time.

Some of the JavaScript libraries, like Babylon.js and A-Frame, are fantastic for getting people started building VR really quickly, and they don't have to worry about specializing their experiences for specific pieces of hardware.

They can build it once and they can try it out on a variety of devices.

And it is really nice seeing an experience which someone has built and tested on a cardboard because that's all they had and I can then take it home and fire it up on my HTC Vive and have an amazing experience.

We've recently tweaked the way we've been working in the Immersive Web Working Group.

We've broken up the API into modules, so the WebXR spec is more or less a replacement for WebVR.

The AR parts and the gamepad parts have been moved out into modules which are being worked on separately, in parallel, letting us iterate much faster on these components without having to do releases of the full API.

Who's involved in the Immersive Web Working Group is kinda cool.

There's loads of big companies involved.

I'll see if I can remember this slide without looking behind me.

Because I've just been in meetings with them all day long and it'll be embarrassing if I can't.

So we have Samsung, 'cause that's me, and we have Google, Amazon, Apple, Magic Leap, Facebook 'cause they do the core Oculus hardware... I think I've got everyone. Oh, Microsoft, I'm so sorry. Microsoft and Google.

Forgot the really big ones yeah.

But loads of cool people doing amazing work and it's a super positive and fantastic working group and I like being involved.

So when can we expect it to land?

'cause I've been talking about it and been talkin' it up.

Well it's kind of landing-ish at the moment so we can expect it to start landing in a browser near you over the next six months in a variety of browsers.

It's looking to come to Samsung Internet in Q2 next year, which is pretty cool.

But if you want to use it today, there is a polyfill you can use.

You can get the polyfill from our GitHub, and you can use it to start experimenting and testing with the API.
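A common way to use the polyfill is to load it only when native WebXR is missing. The feature check below is a small helper of our own; the `webxr-polyfill` package name and the `new WebXRPolyfill()` usage are from the Immersive Web Community Group's polyfill, but treat the loading snippet as a sketch rather than the one true setup.

```javascript
// Pure feature check: does this navigator object already expose WebXR?
function needsPolyfill(nav) {
  return !nav || !('xr' in nav);
}

// In a browser you might then do (package name assumed from the
// immersive-web/webxr-polyfill repo):
//   if (needsPolyfill(navigator)) {
//     const { default: WebXRPolyfill } = await import('webxr-polyfill');
//     new WebXRPolyfill(); // patches navigator.xr in place
//   }
```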

So I've mentioned WebVR a lot in this talk. Has anyone built stuff for WebVR before?

Oh okay I expected more of you 'cause it was really popular and unfortunately it has been deprecated.

It was a really great way to start fleshing out and working out what we needed from an API to access immersive hardware.

But it had numerous issues that stopped it being future-proof, and it was kind of holding us back a little bit.

So it pivoted into WebVR 2.0 and WebAR for a while but nowadays it's all just WebXR which is the umbrella term for everything and should be one API that handles all your needs.

I'm now going to go into a little bit of detail, in my last couple of minutes, about the modules, because WebXR largely replaces what WebVR did.

It's a way to access the sensors and displays of the VR headset, and it's just VR right now.

AR is a separate module which I'm going to talk about last.

But first I'm going to talk about one of my favorite things which is the controllers, because it's cool to pick stuff up and throw stuff.

The way controllers work in XR is surprisingly complicated, because it's difficult to map what the user has in their hand not only to a real device but to a virtual representation of that real device, in a way that is responsive but also secure and doesn't expose too much information as a fingerprinting target.

The XRInputSource is what contains the information for our VR controllers.

So it contains information which allows you to access the position and rotation of the device in real space.

It also has a Gamepad object, and this Gamepad object is like a gamepad you'd get from the getGamepads() API, but it doesn't itself appear in there.

So the Gamepad API is reserved for things like GameCube controllers or PlayStation controllers or SNES controllers or whatever you're using.
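Reading an XRInputSource per frame looks roughly like this. The property names (`inputSources`, `gripSpace`, `gamepad`, `buttons`) come from the WebXR and Gamepad specs; the helper and the "buttons[0] is the trigger" convention are illustrative, since the actual button layout is what the input profiles (below) describe.

```javascript
// Illustrative helper: is the primary trigger on this input source pressed?
// By common convention on XR controllers, buttons[0] is the trigger, but
// the real mapping comes from the controller's input profile.
function readTrigger(inputSource) {
  const gamepad = inputSource.gamepad;
  if (!gamepad || gamepad.buttons.length === 0) return false;
  return gamepad.buttons[0].pressed;
}

// In a real XRFrame callback you would combine this with the pose:
//   for (const source of session.inputSources) {
//     const pose = frame.getPose(source.gripSpace, refSpace); // position + rotation
//     if (pose && readTrigger(source)) pickUpObject(pose);    // pickUpObject is yours
//   }
```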

But we can use those two pieces of information to actually build the rest of it.

So we don't provide a 3D model, or any of that information, natively to you.

Because this information would quickly get out of date.

If we built it into the platform, our experiences would quickly degrade over time.

So that is the kind of thing we are doing as an open-source library.

So we are providing profiles, which map the physical hardware to the virtual model of the controller, as JSON files.

We have 3D assets kindly provided by the hardware manufacturers for the controllers.

So you can get an actual representation of the controller you are using.

And when you combine the information from the gamepad with the JSON files, you can work out which nodes of the controller to move, and there's a JavaScript library, which is also in the repo, which does all the interpolation and mapping for you.

So you don't have to worry about any of the really difficult stuff, which means that by including this, no matter what headset you're using with what controllers, you'll always have a good representation of what you've got in your hands.
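The core idea behind those profile JSON files can be sketched like this. The shape of `exampleProfile` below is invented for illustration; the real files in the webxr-input-profiles repository are considerably richer, and the library does the lookup and interpolation for you.

```javascript
// Invented, simplified stand-in for an input-profile JSON file: it maps
// gamepad button/axis indices to named nodes in the controller's 3D model.
const exampleProfile = {
  components: {
    trigger: { gamepadButtonIndex: 0, node: 'trigger_node' },
    thumbstick: { gamepadAxesIndices: [0, 1], node: 'thumbstick_node' },
  },
};

// Work out which model node a given gamepad button should animate.
function nodeForButton(profile, buttonIndex) {
  for (const component of Object.values(profile.components)) {
    if (component.gamepadButtonIndex === buttonIndex) return component.node;
  }
  return null; // this button doesn't drive any visible part of the model
}
```

With a mapping like this, pressing gamepad button 0 tells the renderer to animate `trigger_node` in the controller's 3D asset.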

If you wanna play with this today, you can.

And instead of having an actual gamepad, there are sliders which represent the different inputs from the Gamepad API, so you can slide them around and see how the 3D model behaves.

So I'm hoping we can start seeing this used soon to have people having real XR experiences.

We have plans for Augmented Reality as well.

So the main deal here is the immersive-ar enum.

So WebXR handles virtual reality by allowing you to do inline sessions, or sessions which are immersive in the VR headset.

Well, in AR we only have one, which is immersive-ar, and it handles both cases. For mobile AR, if you are using something like ARCore or ARKit, it opens up in kind of a new full-screen window.

It can't be used inline with the page, but it can still be used on handsets, allowing you to use your phone to place objects in the scene.

And it also works with immersive headsets like the Magic Leap and the HoloLens.

So if you are like me and you can't afford the expensive devices, you can still build something targeting a mobile phone and you have a pretty good expectation that it will work on the Immersive Headsets.
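That single-enum design means the same request covers handheld AR phones and AR headsets. The `'immersive-ar'` mode and the two `navigator.xr` calls are the real API (the AR mode comes from the separate AR module, so it may not be supported everywhere); the helper itself is a sketch.

```javascript
// Sketch: request an AR session if the device offers one, whether that's
// a phone running ARCore/ARKit or a headset like Magic Leap or HoloLens.
// Returns null when AR isn't available so the caller can fall back to VR.
async function startAR(xr) {
  if (!(await xr.isSessionSupported('immersive-ar'))) return null;
  return xr.requestSession('immersive-ar');
}

// In a browser (from a user gesture):
//   const session = await startAR(navigator.xr);
//   if (!session) { /* fall back to 'immersive-vr' or 'inline' */ }
```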

It's a lot of work and we could do with some help.

So there are a lot of resources which we could do with help building, and which normal web developers can help with.

So we have WebXR samples which we could do with people testing and making pull requests to.

We have tests which need writing and fixing.

We want you to use the Polyfill and if you find bugs please file issues and PRs.

It would be really fantastic to have more people involved in this effort.

And try it out, give it a go.

Build XR experiences.

Thank you so much.