Community & Business Groups

Augmented Reality Community Group

The W3C Augmented Reality Community Group is an open forum for collaborative discussions about the intersection of Augmented Reality and the Web, or more simply the Augmented Web. This forum welcomes discussions about related standards, the standardisation process, related market developments and the broader social implications of this new generation of the web.

We believe that the Augmented Web brings a unique perspective that pushes standards, APIs, hardware technologies and the broader web platform to the edge of their performance limits. The Augmented Web embraces the changes brought about by HTML5 and other related standards including Geolocation, DeviceOrientation, DeviceMotion, WebGL, Web Audio, Media Capture & Streams and WebRTC. The Augmented Web brings all of these disparate technologies together into a single, coherent new vision of the web.
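As a rough illustration (a minimal sketch only, not part of any specification, and with hypothetical page setup), here is how a single page might draw on three of those building blocks at once: Geolocation for position, DeviceOrientation for heading, and Media Capture & Streams for the camera view.

```typescript
// Minimal sketch of an "Augmented Web" page wiring three of the standards
// mentioned above together. Purely illustrative; a real page would handle
// permissions, errors and fallbacks far more carefully.

// Where is the user? (Geolocation API)
navigator.geolocation.watchPosition((pos) => {
  console.log("lat/lon:", pos.coords.latitude, pos.coords.longitude);
});

// Which way is the device pointing? (DeviceOrientation events)
window.addEventListener("deviceorientation", (e) => {
  console.log("alpha/beta/gamma:", e.alpha, e.beta, e.gamma);
});

// What does the camera see? (Media Capture & Streams)
navigator.mediaDevices
  .getUserMedia({ video: { facingMode: "environment" } })
  .then((stream) => {
    const video = document.querySelector("video");
    if (video) video.srcObject = stream;
  })
  .catch((err) => console.warn("camera unavailable:", err));
```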

This group will not produce specifications.

Instead it aims to build an integrated community voice that reaches out to all of the other relevant working groups and standards bodies to ensure that the Augmented Web perspective is clearly represented and considered. Our goal is to help ensure that the disparate standards and APIs being planned and implemented by these other groups can be seamlessly integrated into this new vision for the Augmented Web.

Read more about goals and operating guidelines in the Charter: http://www.w3.org/community/ar/wiki/Charter

Note: Community Groups are proposed and run by the community. Although W3C hosts these conversations, the groups do not necessarily represent the views of the W3C Membership or staff.

No Reports Yet Published

Full Mixed Reality in the web is here…finally!

This group has been dormant for quite a while now while we’ve been waiting for the standards to stabilise and for performance to improve. But now it’s time to move to the next stage.

At Augmented World Expo this week I presented a session on Computer Vision in the browser and showed the first public demonstration of Natural Feature Tracking working in a standard web browser. This is a watershed moment and signals that AR has now really arrived on the web.

Here’s a link to my session http://www.augmentedworldexpo.com/agenda/session/175703

Here’s a link to my slides https://www.slideshare.net/robman/computer-vision-now-working-in-over-2-billion-web-browsers

Here’s a link to a video of the first demo that shows Milgram’s Mixed Reality Continuum all running in a browser https://www.youtube.com/watch?v=fhJSfqBYjKE

If you’re not familiar with Milgram’s Continuum then I’d recommend you check out this great post from Mark Billinghurst https://medium.com/startup-grind/what-is-mixed-reality-60e5cc284330 (more discussion on this soon).

And I’ll post a video of our NFT demo and more information about it here soon too.
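For context, the basic browser-side pipeline behind this kind of in-page computer vision can be sketched roughly as follows: pull camera frames from a getUserMedia stream into a canvas and hand the raw pixels to a tracker. This is only an illustrative sketch, not the code behind the demo, and the `track` callback is hypothetical.

```typescript
// Illustrative sketch only: grab camera frames in the browser and pass the
// raw pixels to a (hypothetical) feature tracker on every animation frame.
async function startTracking(track: (frame: ImageData) => void) {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();

  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  const ctx = canvas.getContext("2d");
  if (!ctx) return;

  const step = () => {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    track(ctx.getImageData(0, 0, canvas.width, canvas.height));
    requestAnimationFrame(step);
  };
  requestAnimationFrame(step);
}
```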

Right now I’m heading off to Web3D 2017 where I’ll be running a BoF session on AR in the web. We’ll be covering how AR is working in the web right now, then discussing what APIs are required for the near future.

We’ll also be discussing the fragmentation that’s already starting to form with the AR, VR and MR Community Groups – it would be great if we could all build a shared vision to work towards.

Recently we also launched an update to our awe.media platform that makes it easy for anyone to create, view and share location-based AR using just their web browser. And you can do this on any device – mobile, tablet, desktop or head-mounted display/glasses.

We’ll be extending this very soon, adding our new Natural Feature Tracking, Visual Search and lots more features. This is all built using our open source awe.js framework that was first released back in 2012.

So you can see that there’s a lot going on and, finally, Mixed Reality has arrived in the web. It’s clear that the web is how Mixed Reality can gain massive adoption!

If you’re at Web3D please come along and join in the discussion. And I’ll post a summary here after the event and then kick off some more detailed discussions on the mailing list after that. Now’s the time for some specific proposals to move us forward.

It’s time to wake up the W3C!

Who would like to work with me on my new open source augmented reality SDK?

Who here is interested in a new augmented reality SDK? I plan to reboot my open source augmented reality project. Details are given below.

If you think this is a good idea and would like to join the community and be part of something big, then sign up at this page: http://tagar.launchrock.com/

What is tagAR?

TagAR is an open source augmented reality SDK for Android (initially, until we port it to other platforms). TagAR will provide an open, easy to understand and easy to use SDK. Initially, tagAR will implement GPS location-based tagging, and we will add other types of augmented reality as well.

OpenGL will be used for rendering the graphics; it is an industry standard and provides an easy to use graphics library.

How is tagAR different from all the other augmented reality SDKs and apps? Why make another SDK/app? What is unique about tagAR?

There are many SDKs available, some of very good quality, but most of them are merely free to use or require a fee. They are closed source and controlled by a single company, which can change whatever it wants at a moment’s notice, causing disruption in the apps that use those SDKs. The SDKs that require a fee are also often too expensive for individual developers.

There are open source alternatives, but they are few, and their repositories have often not been updated for a long time.

Furthermore, some of the open source SDKs do not use OpenGL, which limits them to certain applications and allows little customization.

tagAR, on the other hand, will use OpenGL and benefit from its rendering power. 3D models, videos and images can be loaded and played or shown in place of tags, which truly can make tagAR a powerful SDK.

The tagAR code base will be well commented and documented, with an easy to understand design that further improves the performance and quality of the SDK. tagAR will be easy to use and develop, with an emphasis on extensibility and maintainability of the code.

The code repository will be public on github.com, so developers interested in augmented reality can use the SDK in their own apps and also help build tagAR.

We will accept pull requests, but will only merge code that meets the SDK’s requirements and quality specifications, helping ensure a well-formed code base with as few bugs as possible.

What benefit will developers get from this work?

Developers will be part of building a state of the art augmented reality SDK and will get a chance to submit their own code. Most of all, because the SDK is open source, they will be able to use it in their own apps, improve it, and submit those improvements to be added to the SDK.

How will I generate revenue from this?

I will set up a cloud service offering free and paid packages. The cloud service will provide APIs through which users can add and manage their own data. If developers do not want to use our cloud service they can set up their own, but we recommend ours since it will be affordable.

We can also generate revenue by providing dedicated support in addition to community support, and by building features on individual request.

Why augmented reality?

Augmented reality is the future of computing, digital media and the way we interact with our surroundings. Augmented reality can be used in a number of different fields such as tourism, medicine, manufacturing, retail, etc.

By building tagAR we hope that we and our community will help augmented reality grow and usher in a new era.

What type of augmented reality?

There are many different types of augmented reality, such as location-based tagging browsers and computer vision based detection and recognition of objects.

Initially, tagAR will be a location-based tagging browser, where tags can be added at a particular location and users can see and interact with those tags in their augmented reality browser.
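As a rough sketch of the core calculation behind such a location-based tagging browser (with hypothetical helper names, not the planned tagAR API): given the device’s GPS fix and a tag’s coordinates, compute the distance and compass bearing to the tag so it can be drawn when the camera faces that heading.

```typescript
// Hypothetical helpers for a location-based tagging browser: distance and
// bearing from the device to a tag, so the tag can be drawn at the matching
// compass heading. Not part of the tagAR API.

interface LatLon { lat: number; lon: number; }

const toRad = (deg: number) => (deg * Math.PI) / 180;
const toDeg = (rad: number) => (rad * 180) / Math.PI;

// Great-circle (haversine) distance in metres.
function distanceMetres(a: LatLon, b: LatLon): number {
  const R = 6371000; // mean Earth radius in metres
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Initial bearing from the device to the tag, in degrees clockwise from north.
function bearingDegrees(from: LatLon, to: LatLon): number {
  const dLon = toRad(to.lon - from.lon);
  const y = Math.sin(dLon) * Math.cos(toRad(to.lat));
  const x =
    Math.cos(toRad(from.lat)) * Math.sin(toRad(to.lat)) -
    Math.sin(toRad(from.lat)) * Math.cos(toRad(to.lat)) * Math.cos(dLon);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}
```

A tag roughly a kilometre to the north-east of the device, for example, would come back with a bearing near 45°, so the browser would only draw it when the device’s compass heading points that way.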

The Augmented Web leaps forward and the year has only just started

I hope you had a great break over the holiday period.  You’ll notice that at the beginning of the year the group page was updated with the new charter and now we’re all set for 2013.

It’s great to see that the Augmented Web has taken some great leaps forward, with WebRTC now supported by Chrome, Firefox and Opera on the desktop as well as by Opera on Android and Bowser on iOS.  WebGL is also making good progress, with coverage of almost every platform except iOS.  There is also good progress being made on enabling Web Workers to process Canvas elements.
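A quick way to see where any given browser sits on this curve is simple feature detection of the APIs mentioned above. This is only an illustrative sketch using the standard property names; real pages would also need to handle vendor prefixes and fallbacks.

```typescript
// Illustrative capability check for the Augmented Web building blocks
// discussed above. Detection only; no polyfills or prefix handling.
function detectWebGL(): boolean {
  const canvas = document.createElement("canvas");
  return !!canvas.getContext("webgl");
}

const support = {
  geolocation: "geolocation" in navigator,
  deviceOrientation: "DeviceOrientationEvent" in window,
  mediaCapture: !!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia),
  webgl: detectWebGL(),
  webWorkers: typeof Worker !== "undefined",
};

console.table(support);
```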

However, the debate about device enumeration and fingerprinting continues to be a major quagmire.

There is also good progress being made by the Web3D community in refining their AR extensions to the X3D specification.  It’s interesting to review the comparison of their previous proposals.

And today the ISO SC24 WG9 starts a workshop to refine their Augmented Reality Continuum Reference Model – which is where I am right now.

So the momentum continues and expect to see a lot more interesting examples and demos as more people start to realise the power the Augmented Web is unleashing.

Proposing a new charter and a new start

The new year is approaching fast and the intersection of Augmented Reality and the Web has come a long way (see my recent position paper at the 7th AR Standards meeting).  I think it’s now time for this group to step up and build a real voice within the AR, web and standards communities.

In order to make this happen I’ve created a proposed charter so we can all collaborate on this.  I’ve also added a poll for the group so we can all vote on adopting this new charter.  The poll closes on the 1st of January, 2013, so this is perfect timing to start the new year with a clear direction and some real momentum.

I also think that we should start the new year with a single chair for this group and I’ll be starting a more detailed discussion about that on the mailing list soon.

So get ready to get collaborating.  There’s a lot of work to be done and there’s a lot of topics that need much broader discussion.  I can’t wait to see this group fire up and I’m looking forward to working with you all to help realise the amazing vision that the Augmented Web is quickly growing into.