- Aude Genevay
- Lénaïc Chizat
- Francis Bach
- Marco Cuturi
- Gabriel Peyré
Abstract
Optimal transport (OT) and maximum mean discrepancies (MMD) are now routinely used in machine learning to compare probability measures. We focus in this paper on \emph{Sinkhorn divergences} (SDs), a regularized variant of OT distances which can interpolate, depending on the regularization strength ε, between OT (ε=0) and MMD (ε=∞). Although the tradeoff induced by that regularization is now well understood computationally (OT, SDs and MMD require respectively $O(n^3\log n)$, $O(n^2)$ and $n^2$ operations given a sample size $n$), much less is known in terms of their \emph{sample complexity}, namely the gap between these quantities when evaluated using finite samples \emph{vs.} their respective densities. Indeed, while the sample complexities of OT and MMD stand at two extremes, $1/n^{1/d}$ for OT in dimension $d$ and $1/\sqrt{n}$ for MMD, that for SDs has only been studied empirically. In this paper, we \emph{(i)} derive a bound on the approximation error made with SDs when approximating OT, as a function of the regularizer ε, \emph{(ii)} prove that the optimizers of regularized OT are bounded in a Sobolev (RKHS) ball independent of the two measures, and \emph{(iii)} provide the first sample complexity bound for SDs, obtained by reformulating SDs as a maximization problem in an RKHS. We thus obtain a scaling in $1/\sqrt{n}$ (as in MMD), with a constant that, however, depends on ε, making the bridge between OT and MMD complete.
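For context, a standard way to write the entropy-regularized OT problem that SDs are built on (the notation $c$, $\Pi(\mu,\nu)$, $W_\varepsilon$, $u$, $v$ is assumed here rather than taken from the abstract): for a cost function $c$ and regularization strength ε > 0,
\[
W_\varepsilon(\mu,\nu) \;=\; \min_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\,\mathrm{d}\pi(x,y) \;+\; \varepsilon\,\mathrm{KL}\!\left(\pi \,\|\, \mu \otimes \nu\right),
\]
where $\Pi(\mu,\nu)$ denotes the set of couplings with marginals μ and ν. Its dual is an unconstrained maximization over potentials $(u,v)$,
\[
W_\varepsilon(\mu,\nu) \;=\; \max_{u,\,v} \int u\,\mathrm{d}\mu + \int v\,\mathrm{d}\nu \;-\; \varepsilon \int e^{\frac{u(x)+v(y)-c(x,y)}{\varepsilon}}\,\mathrm{d}\mu(x)\,\mathrm{d}\nu(y) \;+\; \varepsilon,
\]
which is the kind of maximization problem that, per points \emph{(ii)} and \emph{(iii)}, can be restricted to a ball of an RKHS to obtain the $1/\sqrt{n}$ sample complexity bound.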