Regularized Optimal Transport is Ground Cost Adversarial
Abstract
Regularizing Wasserstein distances has proved to be the key in the recent advances of optimal transport (OT) in machine learning. Most prominent is the entropic regularization of OT, which not only allows for fast computations and differentiation using the Sinkhorn algorithm, but also improves stability with respect to data and accuracy in many numerical experiments. Theoretical understanding of these benefits remains unclear, although recent statistical works have shown that entropy-regularized OT mitigates classical OT's curse of dimensionality. In this paper, we adopt a more geometrical point of view, and show using Fenchel duality that any convex regularization of OT can be interpreted as ground cost adversarial. This incidentally gives access to a robust dissimilarity measure on the ground space, which can in turn be used in other applications. We propose algorithms to compute this robust cost, and illustrate the interest of this approach empirically.
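As a point of reference for the entropic regularization and the Sinkhorn iterations mentioned above, here is a minimal numerical sketch in Python. It is not the paper's implementation: the function name `sinkhorn` and the defaults `eps`, `n_iters`, and `tol` are our own illustrative choices.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=1000, tol=1e-9):
    """Sinkhorn iterations for entropy-regularized OT.

    a, b : source/target histograms (nonnegative, summing to 1)
    C    : ground cost matrix of shape (len(a), len(b))
    eps  : entropic regularization strength (illustrative default)
    Returns the transport plan P and the transport cost <P, C>.
    """
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)             # rescale columns to match b
        u_new = a / (K @ v)           # rescale rows to match a
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    P = u[:, None] * K * v[None, :]   # transport plan with marginals (a, b)
    return P, np.sum(P * C)

# Toy usage: uniform histograms on 3 and 4 points with a random cost.
rng = np.random.default_rng(0)
a = np.full(3, 1 / 3)
b = np.full(4, 1 / 4)
C = rng.random((3, 4))
P, cost = sinkhorn(a, b, C)
print(P.sum(axis=1), P.sum(axis=0), cost)  # marginals approximate a and b
```

Each iteration is a pair of diagonal rescalings of the kernel `K`, which is what makes the scheme fast and differentiable, as the abstract notes.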
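The following is a hedged sketch of the Fenchel duality argument alluded to in the abstract; the notation is ours, not fixed by the text above. Write $\mathrm{OT}_C(\mu,\nu)$ for the unregularized OT cost with ground cost $C$, $\Pi(\mu,\nu)$ for the transport polytope, and let $R$ be a closed convex regularizer with convex conjugate $R^*$. Since $R(P) = \sup_{Q} \langle Q, P\rangle - R^*(Q)$ by Fenchel-Moreau,
$$
\min_{P \in \Pi(\mu,\nu)} \langle C, P\rangle + R(P)
\;=\; \min_{P \in \Pi(\mu,\nu)} \sup_{Q} \big( \langle C + Q, P\rangle - R^*(Q) \big)
\;=\; \sup_{Q} \big( \mathrm{OT}_{C+Q}(\mu,\nu) - R^*(Q) \big),
$$
where the last step swaps the min and sup (justified, e.g., by Sion's minimax theorem, since $\Pi(\mu,\nu)$ is compact and the objective is linear in $P$ and concave in $Q$). The inner problem is then a classical OT problem under an adversarially perturbed ground cost $C+Q$, which is the "ground cost adversarial" reading of regularized OT.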