Information Geometry for Regularized Optimal Transport and Barycenters of Patterns

Neural Computation (2019), to appear

Abstract

We propose a new divergence on the manifold of probability distributions, building upon the entropic regularization of optimal transportation problems. As shown in [7], regularizing the optimal transport problem with an entropic term brings several computational benefits. However, because of that regularization, the resulting approximation of the optimal transport cost does not define a proper distance or divergence between probability distributions. We recently introduced a family of divergences connecting the Wasserstein distance and the Kullback–Leibler (KL) divergence from an information-geometry point of view (see [3]). However, that proposal failed to retain key intuitive aspects of the Wasserstein geometry, such as translation invariance, which plays a key role in the more general problem of computing optimal transport barycenters. The divergence proposed in this work retains these properties and admits an intuitive interpretation.
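For readers unfamiliar with the entropic regularization referenced above, the standard formulation from [7] replaces the optimal transport cost min_{P ∈ U(p,q)} ⟨P, C⟩ with min_{P ∈ U(p,q)} ⟨P, C⟩ − ε H(P), which can be solved by Sinkhorn matrix scaling. The sketch below is our own minimal NumPy illustration of that classical algorithm, not code from the paper; the variable names, the value of ε, and the 1-D Gaussian example are assumptions made for demonstration.

```python
# Minimal sketch of entropic-regularized optimal transport via Sinkhorn
# iterations, in the spirit of [7] (Cuturi, 2013). Illustrative only;
# not the divergence construction proposed in this paper.
import numpy as np

def sinkhorn(p, q, C, epsilon=0.1, n_iters=500):
    """Approximate the entropic-regularized OT cost between histograms p, q.

    Solves min_{P in U(p,q)} <P, C> - epsilon * H(P) by alternately
    rescaling the Gibbs kernel K = exp(-C / epsilon) to match marginals.
    """
    K = np.exp(-C / epsilon)           # Gibbs kernel
    u = np.ones_like(p)
    for _ in range(n_iters):
        v = q / (K.T @ u)              # scale columns to match marginal q
        u = p / (K @ v)                # scale rows to match marginal p
    P = u[:, None] * K * v[None, :]    # regularized transport plan
    return np.sum(P * C), P            # approximate transport cost and plan

# Usage: two translated bumps on a 1-D grid; the squared-distance ground
# cost is what makes the resulting geometry sensitive to translations.
x = np.linspace(0.0, 1.0, 50)
C = (x[:, None] - x[None, :]) ** 2     # squared Euclidean ground cost
p = np.exp(-((x - 0.3) ** 2) / 0.01); p /= p.sum()
q = np.exp(-((x - 0.7) ** 2) / 0.01); q /= q.sum()
cost, plan = sinkhorn(p, q, C)
```

As the abstract notes, the quantity returned by such a scheme is cheap to compute but is not itself a proper divergence between p and q, which is the gap the proposed construction addresses.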
