MetaStream: Live Volumetric Content Capture, Creation, Delivery, and Rendering in Real Time

System Overview

Abstract

While recent work has explored on-demand streaming of volumetric content, little effort has been devoted to live volumetric video streaming, which has the potential to enable even more exciting applications than its on-demand counterpart. To fill this critical gap, in this paper we propose MetaStream, which is, to the best of our knowledge, the first practical system for live volumetric content capture, creation, delivery, and rendering for immersive applications such as virtual, augmented, and mixed reality. To address the key challenge of meeting stringent latency requirements while processing and streaming massive amounts of 3D data, MetaStream integrates several innovations into a holistic system, including dynamic camera calibration, edge-assisted object segmentation, cross-camera redundant point removal, and foveated volumetric content rendering. We implement a prototype of MetaStream using commodity devices and extensively evaluate its performance. Our results demonstrate that MetaStream achieves low-latency volumetric video streaming at close to 30 frames per second over WiFi networks. Compared to state-of-the-art systems, MetaStream reduces end-to-end latency by up to 31.7% while improving visual quality by up to 12.5%.
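To make the cross-camera redundant point removal idea more concrete, the sketch below shows one simple way such deduplication could be done: point clouds from multiple calibrated cameras are merged in a shared world frame and collapsed to one representative point per voxel. This is an illustrative approximation only, not MetaStream's actual algorithm; the function name merge_point_clouds and the voxel_size parameter are hypothetical choices for this example.

```python
# Hypothetical sketch of cross-camera redundant point removal via
# voxel-grid deduplication. NOT MetaStream's exact method; names and
# parameters here are illustrative assumptions.
import numpy as np

def merge_point_clouds(clouds, voxel_size=0.01):
    """Merge per-camera point clouds (assumed already aligned to a common
    world frame, e.g., after camera calibration) and keep a single
    representative point per occupied voxel.

    clouds: list of (N_i, 3) float arrays of XYZ points (meters).
    voxel_size: edge length of the deduplication grid (meters).
    Returns an (M, 3) array with cross-camera duplicates removed.
    """
    merged = np.vstack(clouds)                              # all cameras together
    keys = np.floor(merged / voxel_size).astype(np.int64)   # voxel index per point
    # Keep the first point encountered in each occupied voxel.
    _, first_idx = np.unique(keys, axis=0, return_index=True)
    return merged[np.sort(first_idx)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two cameras observing overlapping regions of the same object:
    # the second view is a slightly perturbed copy of the first.
    cam_a = rng.uniform(0.0, 1.0, size=(50_000, 3))
    cam_b = cam_a + rng.normal(scale=0.002, size=cam_a.shape)
    fused = merge_point_clouds([cam_a, cam_b], voxel_size=0.01)
    print(f"{cam_a.shape[0] + cam_b.shape[0]} points in, {fused.shape[0]} out")
```

Coarsening voxel_size removes more redundancy (and bandwidth) at the cost of geometric detail, which is the basic trade-off any such deduplication step has to balance against visual quality.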

Publication
The 29th Annual International Conference on Mobile Computing and Networking (MobiCom '23)
Yongjie Guan
Ph.D. Student

My research interests include mixed reality and wireless edge computing systems.

Xueyu Hou
Ph.D. Student

My research interests include distributed machine learning and networking systems.