- Julien Valentin
- Adarsh Kowdle
- Jonathan T. Barron
- Neal Wadhwa
- Max Dzitsiuk
- Michael John Schoenberg
- Vivek Verma
- Ambrus Csaszar
- Eric Lee Turner
- Ivan Dryanovski
- Joao Afonso
- Jose Pascoal
- Konstantine Nicholas John Tsotsos
- Mira Angela Leung
- Mirko Schmidt
- Onur Gonen Guleryuz
- Sameh Khamis
- Vladimir Tankovich
- Sean Fanello
- Shahram Izadi
- Christoph Rhemann
Abstract
Augmented reality (AR) for smartphones has matured from a technology for early adopters, available only on select high-end phones, to one that is truly available to the general public. One of the key breakthroughs has been in low-compute methods for six-degree-of-freedom (6DoF) tracking on phones using only the existing hardware (camera and inertial sensors). 6DoF tracking is the cornerstone of smartphone AR, allowing virtual content to be precisely locked on top of the real world. However, to give users the impression of truly believable AR, mobile depth is required. Without depth, even simple effects such as a virtual object being correctly occluded by the real world are impossible. Yet requiring a dedicated mobile depth sensor would severely restrict access to such features. In this article, we provide a novel pipeline for mobile depth that supports a wide array of mobile phones and uses only the existing monocular color sensor. Through several technical contributions, we provide the ability to compute low-latency dense depth maps using only a single CPU core on a wide range of (medium- to high-end) mobile phones. We demonstrate the capabilities of our approach on high-level AR applications including real-time navigation and shopping.
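To make the role of depth concrete, the sketch below shows how a dense depth map enables the occlusion effect the abstract mentions: a virtual pixel is drawn only where it is closer to the camera than the estimated real-world surface. This is a minimal illustration of depth-based compositing in general, not the paper's pipeline; all array names and the blending scheme are hypothetical.

```python
import numpy as np

def composite_with_occlusion(camera_rgb, virtual_rgb, virtual_alpha,
                             scene_depth, virtual_depth):
    """Per-pixel occlusion test using an estimated depth map.

    camera_rgb:    (H, W, 3) live camera frame
    virtual_rgb:   (H, W, 3) rendered virtual content
    virtual_alpha: (H, W)    coverage of the virtual layer in [0, 1]
    scene_depth:   (H, W)    estimated real-world depth per pixel
    virtual_depth: (H, W)    depth of the rendered virtual content
    (All names are illustrative placeholders.)
    """
    # The virtual content is visible only where it exists (alpha > 0)
    # and is nearer to the camera than the real scene.
    visible = (virtual_alpha > 0.0) & (virtual_depth < scene_depth)

    # Alpha-blend the virtual layer over the camera frame where visible;
    # elsewhere the real-world pixel wins, producing correct occlusion.
    alpha = np.where(visible, virtual_alpha, 0.0)[..., None]
    return (1.0 - alpha) * camera_rgb + alpha * virtual_rgb
```

Without a per-pixel `scene_depth`, the test above degenerates to always drawing the virtual layer on top, which is exactly the "floating object" artifact that dense mobile depth is meant to eliminate.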