Cloud-based 360° Video Playout on TV
Presenter: Louay Bassbouss
Demo from Web & Networks Interest Group
The Cloud-based 360° Video Playout allows the viewing of high-quality 360° videos on devices with constrained capabilities such as TV sets. It reduces the required bandwidth and processing resources by rendering the field of view (FOV) in the cloud in advance and only streaming the selected FOV to the client. Each FOV video is provided as an individual stream. Transition videos, which enable a smooth switch between FOVs, are generated and provided as individual streams as well.
The 360° player implementation in the browser uses multiple Web APIs such as fetch (with readable streams), MSE, EME and the video element. The player requests video segments of the current FOV until the user switches to another view. In this case, the MSE buffer, which contains video segments of the current FOV, is emptied (from a certain position), and video segments of the transition video or of the new FOV video are fetched from the CDN and appended to the MSE buffer. Information about network latency and actual throughput is essential to determine the bitrate level of the new segments to fetch in order to get them at the right time, and to calculate the position at which the buffer will be emptied.
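The buffer-emptying position described above can be sketched as a small calculation. The function below is an illustrative assumption, not the demo's actual code: it estimates how long fetching one segment of the new FOV will take from the measured latency and throughput, and returns the earliest segment index at which the buffer can safely be emptied and refilled.

```javascript
// Sketch (hypothetical names and a simplified timing model):
// estimate the earliest segment index at which the MSE buffer
// can be emptied and replaced with segments of the new FOV.
function earliestSwitchSegment({
  currentTimeSec,      // current playback position
  segmentDurationSec,  // duration of one DASH segment
  latencySec,          // measured network latency
  throughputBps,       // measured throughput in bits per second
  segmentSizeBits,     // expected segment size at the chosen bitrate
}) {
  // Time needed to request and download one segment of the new FOV.
  const fetchTimeSec = latencySec + segmentSizeBits / throughputBps;
  // Segment currently being played; it must not be flushed.
  const playingIndex = Math.floor(currentTimeSec / segmentDurationSec);
  // First segment boundary reachable once the download completes.
  const reachableIndex = Math.ceil(
    (currentTimeSec + fetchTimeSec) / segmentDurationSec
  );
  return Math.max(playingIndex + 1, reachableIndex);
}
```

With a faster network the player can flush earlier (switching feels more responsive); with higher latency or lower throughput the flush position moves further ahead in the buffer.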
My name is Louay Bassbouss, and I work for Fraunhofer Institute for Open Communication Systems.
In this video, I am going to demonstrate our solution, Cloud-based 360° Video Playout on TV, as one of the Web and Networks Breakout Session demos.
If you want to know more about the solution, please visit the link displayed on this slide.
Now let's start with the benefits of the Cloud-based 360° Video Playout.
First, it enables 360° video playback on devices with limited processing capabilities like TVs by rendering the field of view videos in advance.
It reduces bandwidth consumption by 80 to 90%, since only the current field of view is streamed to the client and not the entire 360° video.
It also supports multiple video codecs and streaming formats like MPEG-DASH.
It uses existing streaming infrastructures, and Web APIs for the delivery and playback.
And finally, it enables playback of high-quality 360° videos.
You can see on this slide a 24K equirectangular frame.
And on this slide, the corresponding field of view frame with 4K resolution, rendered from the previous 24K equirectangular frame.
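A quick back-of-the-envelope comparison makes the resolution gap concrete. The exact dimensions of the "24K" frame are not stated in the video, so the 2:1 equirectangular size of 23040x11520 below is an assumption for illustration only:

```javascript
// Illustrative pixel-count comparison; the 24K equirectangular
// dimensions (23040x11520) are an assumed example, not a stated fact.
const equirect = { w: 23040, h: 11520 }; // assumed "24K" 2:1 frame
const fov = { w: 3840, h: 2160 };        // 4K field of view frame

// How many times more pixels the full 360° frame carries than one FOV.
const ratio = (equirect.w * equirect.h) / (fov.w * fov.h);
```

Under this assumption, a single FOV frame carries only a small fraction of the full frame's pixels, which is why streaming just the pre-rendered FOV saves so much bandwidth.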
Let's take a look at how the cloud pre-rendering approach works.
In the first step, the field of view videos are pre-rendered in the cloud.
Then the pre-rendered videos are packaged as DASH streams and made available in a cloud storage.
In the third step, the DASH streams are delivered to clients via CDNs.
In the last step, the DASH streams are played back in a dedicated player, which can use, for example, the fetch and MSE APIs to request and play the DASH segments.
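The third and fourth steps imply that the player can address any segment of any FOV stream on the CDN. A minimal sketch of such segment addressing is shown below; the base URL and naming template are hypothetical examples, not the naming actually used by the demo:

```javascript
// Sketch of CDN segment addressing (hypothetical URL template).
const CDN_BASE = "https://cdn.example.com/360"; // assumed base URL

// Map a FOV stream id and a segment index to a CDN segment URL,
// e.g. ("fov_090", 4) -> ".../fov_090/seg_00004.m4s".
function segmentUrl(streamId, segmentNumber) {
  const n = String(segmentNumber).padStart(5, "0");
  return `${CDN_BASE}/${streamId}/seg_${n}.m4s`;
}
```

In a real player, the returned URL would be passed to fetch and the response bytes appended to an MSE SourceBuffer.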
Let's take a closer look at the structure of the generated videos.
So we have two types of pre-rendered videos: Static Field of View Videos, which are recordings of the field of view with the virtual camera fixed at predefined static positions - in the example shown on this slide, we have four static positions: 0 degrees, 90 degrees, 180 degrees and 270 degrees; and Transition Videos, which are required to enable a smooth transition between the static fields of view - these are generated by rotating the virtual camera at a constant speed, starting from each static field of view position and in different directions.
The example on this slide considers only the left and right directions.
In this example, 12 DASH streams are generated in total, four Static Field of View Videos, four Transition Videos to the Left, and four Transition Videos to the Right.
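The 12 streams of this example configuration can be enumerated programmatically. The sketch below uses an assumed naming scheme (stream ids are illustrative, not the demo's actual identifiers):

```javascript
// Enumerate the DASH streams of the example configuration:
// 4 static positions x (static + left transition + right transition).
// The "type_degrees" naming is an illustrative assumption.
const positions = [0, 90, 180, 270];
const types = ["static", "left", "right"];

const streams = positions.flatMap((deg) =>
  types.map((type) => `${type}_${String(deg).padStart(3, "0")}`)
);
// e.g. "static_000", "left_000", "right_000", "static_090", ...
```

Adding more static positions or more rotation directions multiplies the stream count accordingly, which is the main storage trade-off of the pre-rendering approach.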
The solution is already used in production by several broadcasters as part of their HbbTV services.
Let's have a closer look at the Biathlon 360° live streaming on TV, which was provided by the German public broadcaster ZDF during the Biathlon World Cup last year.
TV screen with German audio commentary - the German public broadcaster ZDF offers an HbbTV application that can be launched via the red or green remote control buttons.
TV screen shows the intro page of the 360° live streaming on TV during the Biathlon World Cup in Oberhof, Germany.
TV screen with German instructions for using the 360° live stream: the launched HbbTV application integrates the 360° web player using the cloud rendering approach.
Now let's take a closer look at the web player, with a focus on the buffering algorithm.
We will use the same example shown before with four Static Field of View positions, and with Left and Right Transition Videos.
This is by the way, the same configuration used for the Biathlon live stream.
The player starts playing the field of view video at the static position of zero degrees and keeps one additional segment in the MSE SourceBuffer.
During the playback, the player continues to fetch field of view segments.
Now the user presses the left button, which causes the player to start buffering left transition video segments, starting with segment number four.
The user releases the button during the playback of segment number five, but since segment number six of the new field of view cannot be downloaded in time, segment number seven will be requested instead.
So now let's watch the sequence of this example at normal speed.
The most important challenge of this approach is the accurate prediction of the timing for downloading video segments.
This is already identified as an important requirement in the Web and Networks group.
Let's look at the example again to see where the prediction is required.
When the user presses the left button while playing segment number three, the player decides to download segment number four of the left transition video at 270 degrees, and not segment number five of the left transition video at zero degrees.
Similarly, when the user releases the left button while playing segment number five, the player decides to download segment number seven of the Static Field of View at 270 degrees.
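The two decisions from the example follow the same rule: target segment n+1 of the new stream, and skip further ahead when n+1 cannot arrive before its playback deadline. The following sketch captures that rule under an assumed, simplified timing model (the function and parameter names are illustrative, not the demo's actual code):

```javascript
// Sketch of the switch decision (simplified, assumed timing model):
// on a button press or release during segment n, target segment n+1
// of the new stream; if it cannot be downloaded before the end of
// the current segment, skip ahead until the deadline can be met.
function switchSegment(playingSegment, {
  fetchTimeSec,          // estimated time to download one segment
  timeLeftInSegmentSec,  // playback time left in the current segment
  segmentDurationSec,    // duration of one segment
}) {
  if (fetchTimeSec <= timeLeftInSegmentSec) {
    return playingSegment + 1; // n+1 arrives in time
  }
  // Skip ahead far enough for the download to finish before playback.
  const extra = Math.ceil(
    (fetchTimeSec - timeLeftInSegmentSec) / segmentDurationSec
  );
  return playingSegment + 1 + extra;
}
```

With the example's numbers, a press during segment three yields segment four of the transition video, while a release during segment five with a slow download skips segment six and yields segment seven of the static field of view.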
Thank you for watching this video.
If you have questions, please visit the solution page or contact me.