Experiencing Real-time 3D Interaction with Depth Maps for Mobile Augmented Reality in DepthLab

Maksym Dzitsiuk
Luca Prasso
Ivo Duarte
Jason Dourgarian
Joao Afonso
Jose Pascoal
Josh Gladstone
Nuno Moura e Silva Cruces
Shahram Izadi
Konstantine Nicholas John Tsotsos
Adjunct Publication of the 33rd Annual ACM Symposium on User Interface Software and Technology, ACM (2020), pp. 108-110

Abstract

We demonstrate DepthLab, a wide range of experiences built on the ARCore Depth API, which allows users to detect the shape and depth of the physical environment with a mobile phone. DepthLab encapsulates a variety of depth-based UI/UX paradigms, including geometry-aware rendering (occlusion, shadows, texture decals), surface interaction behaviors (physics, collision detection, avatar path planning), and visual effects (relighting, 3D-anchored focus and aperture effects, 3D photos). We have open-sourced our software at https://github.com/googlesamples/arcore-depth-lab to facilitate future research and development in depth-aware mobile AR experiences. With DepthLab, we aim to help mobile developers effortlessly integrate depth into their AR experiences and amplify the expression of their creative vision.
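
For illustration, the sketch below shows the kind of per-frame depth lookup the ARCore Depth API exposes to mobile developers, which underlies depth-based effects such as occlusion and collision detection. It uses the ARCore Java API (Session, Config.DepthMode, Frame.acquireDepthImage16Bits, a newer equivalent of the acquireDepthImage call available when this demo was published); the helper class and method names are illustrative only, and DepthLab itself ships as a Unity (C#) project in the repository above.

    import android.media.Image;
    import com.google.ar.core.Config;
    import com.google.ar.core.Frame;
    import com.google.ar.core.Session;
    import com.google.ar.core.exceptions.NotYetAvailableException;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    // Illustrative helper class; not part of DepthLab or the ARCore SDK.
    public final class DepthLookup {

      // Enable depth on an ARCore session if the device supports it.
      public static void enableDepth(Session session) {
        Config config = session.getConfig();
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
          config.setDepthMode(Config.DepthMode.AUTOMATIC);
        }
        session.configure(config);
      }

      // Read the depth (in millimeters) at depth-image pixel (x, y) of the
      // current frame. Returns -1 if depth data is not yet available.
      public static int getMillimetersDepth(Frame frame, int x, int y) {
        try (Image depthImage = frame.acquireDepthImage16Bits()) {
          // The depth image has a single 16-bit plane; each value is a
          // distance in millimeters along the camera's viewing direction.
          Image.Plane plane = depthImage.getPlanes()[0];
          int byteIndex = x * plane.getPixelStride() + y * plane.getRowStride();
          ByteBuffer buffer = plane.getBuffer().order(ByteOrder.nativeOrder());
          return Short.toUnsignedInt(buffer.getShort(byteIndex));
        } catch (NotYetAvailableException e) {
          return -1;
        }
      }
    }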