Multi-Camera Lighting Estimation for Photorealistic Front-Facing Mobile AR

Yiqin Zhao
Tian Guo
Association for Computing Machinery, New York, NY, USA (2023), 68–73

Abstract

Lighting estimation plays an important role in virtual object composition, including in mobile augmented reality (AR) applications. Prior work often targets recovering lighting from the physical environment to support photorealistic AR rendering. Because the common workflow is to use a backward-facing camera to capture the overlay of the physical world and virtual objects, we refer to this usage pattern as backward-facing AR. However, existing methods often fall short of supporting emerging front-facing virtual try-on applications, in which a mobile user leverages a front-facing camera to explore the effect of products such as glasses or hats in different styles. This lack of support can be attributed to the unique challenges of obtaining 360° HDR environment maps, an ideal lighting representation, from the front-facing camera. In this paper, we propose to leverage a dual-camera streaming setup (front- and backward-facing) to perform multi-view lighting estimation. Our approach results in improved rendering quality and visually coherent AR try-on experiences. Our contributions include energy-conserving data capture, high-quality environment map generation, and parametric directional light estimation.
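The last contribution mentioned above, parametric directional light estimation, typically amounts to extracting a dominant light direction from a recovered HDR environment map. The paper's abstract does not specify the method, so the following is only a minimal NumPy sketch of one common approach, assuming an equirectangular map in linear RGB; the function name, coordinate convention, and solid-angle weighting are illustrative assumptions, not details from the paper.

```python
import numpy as np

def dominant_light_direction(env_map: np.ndarray) -> np.ndarray:
    """Estimate a single dominant directional light from an
    equirectangular HDR environment map (H x W x 3, linear RGB).

    Returns a unit 3D direction vector pointing toward the light.
    NOTE: this is a hypothetical sketch, not the paper's method.
    """
    h, w, _ = env_map.shape

    # Rec. 709 luma weights give per-pixel luminance.
    luminance = env_map @ np.array([0.2126, 0.7152, 0.0722])

    # Solid-angle weighting: equirectangular rows near the poles
    # cover less of the sphere, so down-weight them by sin(theta).
    theta_rows = ((np.arange(h) + 0.5) / h) * np.pi
    weighted = luminance * np.sin(theta_rows)[:, None]

    # Take the brightest solid-angle-weighted pixel as the light.
    iy, ix = np.unravel_index(np.argmax(weighted), weighted.shape)

    # Pixel coordinates -> spherical angles -> Cartesian direction
    # (Y-up convention, assumed for illustration).
    phi = ((ix + 0.5) / w) * 2.0 * np.pi - np.pi  # azimuth in [-pi, pi)
    th = ((iy + 0.5) / h) * np.pi                 # polar angle in [0, pi]
    return np.array([
        np.sin(th) * np.sin(phi),
        np.cos(th),
        -np.sin(th) * np.cos(phi),
    ])

# Usage with a placeholder map (a real HDR environment map would
# come from the paper's dual-camera generation pipeline):
env = np.random.rand(256, 512, 3).astype(np.float32)
print(dominant_light_direction(env))
```

A single argmax is a simplification; practical systems often cluster or fit the brightest region to stabilize the estimate across frames.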
