
Metaverse cloud rendering on the web

By Dr. Louay Bassbouss

See also the slides.

Transcript

Hello, my name is Louay Bassbouss from Fraunhofer FOKUS. Today, I'm excited to demonstrate "Metaverse Cloud Rendering on the Web." This technology aligns perfectly with Edge Rendering, a prominent use case currently under discussion within the W3C Web & Networks Interest Group.

Let's start by exploring the concept of Cloud Rendering. It empowers compute-intensive 3D applications and immersive metaverse experiences on nearly any device, simply through a web browser.

While Cloud Gaming is perhaps the best-known application, the technology is just as relevant in other domains, such as the industrial metaverse. There, it can render digital twin experiences and run simulations on edge or cloud servers.

To evaluate the performance of edge or cloud-rendered applications under various network conditions, compute capabilities, codecs, and 3D engines, we've developed the 1ClickMetaverse Testbed. It earns its name, 1ClickMetaverse, because you can easily enter the experience with just a single click from your browser, eliminating the need for any application installations.

Key web technologies, including WebRTC, WebCodecs, and WebTransport, play an important role in enabling seamless Cloud Rendering experiences on the web.

This diagram provides an overview of the high-level components of the cloud rendering architecture.

Notice that WebRTC serves as the primary protocol for streaming audio and video feeds from cloud-rendered applications to the browser. User interactions, like mouse or keyboard inputs, are transmitted to the cloud renderer through WebRTC data channels.
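To make the input path concrete, here is a minimal sketch of forwarding mouse and keyboard events over a WebRTC data channel. The channel label "input" and the JSON message format are assumptions for illustration, not the Testbed's actual protocol.

```typescript
// Sketch: forward pointer/keyboard input to the cloud renderer over a
// WebRTC data channel. Label and message shape are illustrative assumptions.
const pc = new RTCPeerConnection();
const inputChannel = pc.createDataChannel("input", { ordered: true });

function sendInput(event: object): void {
  // Only send while the channel is open; otherwise drop the event.
  if (inputChannel.readyState === "open") {
    inputChannel.send(JSON.stringify(event));
  }
}

document.addEventListener("keydown", (e) =>
  sendInput({ type: "keydown", code: e.code })
);
document.addEventListener("mousemove", (e) =>
  sendInput({ type: "mousemove", x: e.clientX, y: e.clientY })
);
```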

The signaling server employs WebSockets to facilitate the exchange of WebRTC signaling data, establishing a peer-to-peer connection between the web client and the cloud renderer.
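For context, such a signaling exchange over WebSockets might look roughly like the following sketch; the server URL and the message shapes are assumptions for illustration.

```typescript
// Sketch: exchange SDP and ICE candidates with the cloud renderer through
// a WebSocket-based signaling server. URL and message format are assumed.
const signaling = new WebSocket("wss://example.org/signaling");
const pc = new RTCPeerConnection();

pc.onicecandidate = (e) => {
  if (e.candidate) {
    signaling.send(JSON.stringify({ type: "candidate", candidate: e.candidate }));
  }
};

signaling.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.type === "offer") {
    // The cloud renderer offers its audio/video tracks; the browser answers.
    await pc.setRemoteDescription({ type: "offer", sdp: data.sdp });
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    signaling.send(JSON.stringify({ type: "answer", sdp: answer.sdp }));
  } else if (data.type === "candidate") {
    await pc.addIceCandidate(data.candidate);
  }
};
```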

Furthermore, rendering and WebRTC performance metrics are captured and reported to a metrics server via HTTP, with these metrics visualized through a comprehensive monitoring dashboard accessible via a web browser.
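As an illustration, WebRTC metrics could be collected from the browser's statistics API and posted to a metrics endpoint along these lines; the endpoint URL and payload shape are assumptions, not the Testbed's actual reporting format.

```typescript
// Sketch: sample inbound video stats and report them via HTTP POST.
// Endpoint and payload are illustrative assumptions.
async function reportStats(pc: RTCPeerConnection): Promise<void> {
  const report = await pc.getStats();
  const samples: Record<string, unknown>[] = [];
  report.forEach((stat) => {
    // Keep only inbound video stats relevant to rendering/streaming quality.
    if (stat.type === "inbound-rtp" && stat.kind === "video") {
      samples.push({
        timestamp: stat.timestamp,
        framesPerSecond: stat.framesPerSecond,
        jitter: stat.jitter,
        packetsLost: stat.packetsLost,
      });
    }
  });
  await fetch("https://example.org/metrics", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(samples),
  });
}

// Example: report every five seconds.
// setInterval(() => reportStats(pc), 5000);
```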

Since most metaverse experiences and 3D applications are built using popular gaming engines like Unity or Unreal Engine, we've introduced a modular approach by bundling these applications as Docker containers. This simplifies the deployment of 3D applications within the Testbed and allows for easy integration of new rendering engines in the future.

Now, let's dive into the demo. As you can see, the client is nothing more than a web browser. It displays a list of available metaverse experiences already running in the cloud. With just one click, these experiences can be quickly loaded and displayed to the user because they're already running in the cloud. The user can control the avatar in this scene using a mouse and keyboard. All interactions are captured and sent to the cloud renderer for further processing. The cloud renderer then streams the video via WebRTC, which is displayed to the user through the HTML Video element. As you can see, when we show the controls of the video element, the client only needs to display the video, with no additional processing required.
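On the playback side, attaching the remotely rendered stream to the video element can be sketched as follows; the element id "remote-view" is an assumption for illustration.

```typescript
// Sketch: display the cloud-rendered stream in a plain <video> element.
const video = document.getElementById("remote-view") as HTMLVideoElement;
const pc = new RTCPeerConnection();

pc.ontrack = (event) => {
  // The renderer's audio/video arrive as ordinary MediaStream tracks;
  // the browser only decodes and displays them, with no 3D work client-side.
  if (video.srcObject !== event.streams[0]) {
    video.srcObject = event.streams[0];
    void video.play();
  }
};
```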

In the second scene, we showcase a simple industrial metaverse application for a digital twin experience. These applications hold immense potential in design, engineering, testing, simulation, and training. An important feature of the Testbed is its metrics reporting capabilities, which you can observe in this scene. These metrics are not only displayed within the application but also reported to a metrics server and visualized in a dashboard accessible through a web browser.

Here's another example of cloud rendering, showcasing a virtual world for bike training. Cloud rendering makes it possible to deliver this experience on TVs with limited graphical processing capabilities, which cannot render the same experience locally. This experience also demonstrates the integration of physical devices, like the bike trainer, which sends data such as speed to the TV via Bluetooth. The web client transmits this data to the cloud renderer through WebRTC DataChannels, and the renderer applies it to the avatar.
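A rough sketch of such a device bridge follows, assuming Web Bluetooth access to a standard cycling speed service and an already-open data channel to the renderer; the GATT names, message format, and parsing are simplified assumptions rather than the demo's actual implementation.

```typescript
// Sketch: read trainer measurements via Web Bluetooth and forward them
// to the cloud renderer over a WebRTC data channel (assumed setup).
async function connectTrainer(dataChannel: RTCDataChannel): Promise<void> {
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: ["cycling_speed_and_cadence"] }],
  });
  const server = await device.gatt!.connect();
  const service = await server.getPrimaryService("cycling_speed_and_cadence");
  const characteristic = await service.getCharacteristic("csc_measurement");

  characteristic.addEventListener("characteristicvaluechanged", (event) => {
    const value = (event.target as any).value as DataView;
    // Forward the raw measurement; the cloud renderer derives the speed
    // and applies it to the avatar.
    if (dataChannel.readyState === "open") {
      dataChannel.send(
        JSON.stringify({ type: "csc", bytes: Array.from(new Uint8Array(value.buffer)) })
      );
    }
  });
  await characteristic.startNotifications();
}
```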

A critical aspect of a cloud rendering pipeline is connectivity. Latency and packet loss significantly impact the user experience. This is where 5G technology comes into play, allowing for metaverse experiences to be rendered on the edge of the 5G network, reducing latency. Through 5G Network Slicing, it's even possible to introduce network slices tailored to the requirements of remote rendering and streaming.

In this demo, you can witness a multiuser experience accessed via browsers on three different devices connected via 4G, 5G, and WiFi. It's worth noting that within this demonstration, a single cloud-rendered session serves all three users, with each of them allocated a virtual camera within the scene.

Thank you for joining me for this demonstration. For further information, please refer to the links provided on this slide.

1ClickMetaverse: www.fokus.fraunhofer.de/go/metaverse

Web & Networks IG: www.w3.org/web-networks/
