Provably Fast Finite Particle Variants of SVGD via Virtual Particle Stochastic Approximation

Aniket Das
NeurIPS 2023 (to appear)

Abstract

Stein Variational Gradient Descent (SVGD) is a popular nonparametric variational inference algorithm which simulates an interacting particle system to approximate a target distribution. While SVGD has demonstrated promising empirical performance across various domains, and its population (i.e., infinite-particle) limit dynamics are well studied, its behavior in the finite-particle regime is far less understood. In this work, we design two computationally efficient variants of SVGD, namely VP-SVGD and RB-SVGD, with provably fast finite-particle convergence rates. By introducing the notion of \emph{virtual particles}, we develop novel stochastic approximations of the population-limit SVGD dynamics in the space of probability measures, which are exactly implementable using only a finite number of particles. Our algorithms can be viewed as specific random-batch approximations of SVGD, which are computationally more efficient than ordinary SVGD. We establish that the $n$ particles output by VP-SVGD and RB-SVGD, run for $T$ steps, are i.i.d. samples from a distribution whose Kernel Stein Discrepancy to the target is at most $O(T^{\nicefrac{-1}{6}})$ under standard assumptions. Our results hold under a mild growth condition on the potential function, which is significantly weaker than the isoperimetric assumptions (e.g., the Poincaré inequality) or information-transport conditions (e.g., Talagrand's inequality $\mathsf{T}_1$) generally considered in prior work. As a corollary, we consider the convergence of the empirical measure (of the particles output by VP-SVGD and RB-SVGD) to the target distribution and demonstrate a \emph{double-exponential improvement} over the best known finite-particle analysis of SVGD.
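For intuition, the sketch below shows a vanilla SVGD update together with a naive random-batch estimator of the update direction, which is the flavor of "random-batch approximation" the abstract alludes to. This is an illustrative sketch only, not the paper's exact VP-SVGD or RB-SVGD construction (in particular, it omits the virtual-particle stochastic approximation that drives the convergence guarantees); the function names, RBF kernel choice, and all parameter values are our own assumptions.

```python
import numpy as np

def rbf_kernel(x, y, h):
    # k(x, y) = exp(-||x - y||^2 / (2 h^2)) and its gradient w.r.t. x
    # (an illustrative kernel choice; the analysis applies to suitable kernels generally)
    diff = x - y
    k = np.exp(-np.dot(diff, diff) / (2.0 * h**2))
    grad_x = -(diff / h**2) * k
    return k, grad_x

def svgd_step(particles, grad_log_p, h=1.0, step=0.1, batch=None, rng=None):
    """One SVGD step on an (n, d) array of particles.

    If `batch` is given, the update direction is estimated from a random
    subset of particles -- a simple random-batch approximation, cheaper than
    the full O(n^2) interaction but NOT the paper's VP-SVGD/RB-SVGD updates.
    """
    n, d = particles.shape
    idx = np.arange(n)
    if batch is not None:
        rng = rng or np.random.default_rng()
        idx = rng.choice(n, size=batch, replace=False)
    updates = np.zeros_like(particles)
    for i in range(n):
        phi = np.zeros(d)
        for j in idx:
            k, grad_k = rbf_kernel(particles[j], particles[i], h)
            # Stein operator applied to the kernel: attraction toward high
            # log-density plus a repulsive term that keeps particles spread out
            phi += k * grad_log_p(particles[j]) + grad_k
        updates[i] = phi / len(idx)
    return particles + step * updates

# Example: approximate a standard Gaussian target, for which
# grad_log_p(x) = -x (a hypothetical toy setup).
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))
for _ in range(200):
    x = svgd_step(x, grad_log_p=lambda z: -z, h=1.0, step=0.05, batch=10, rng=rng)
```

A per-step batch reduces the interaction cost from $O(n^2 d)$ to $O(nKd)$ for batch size $K$; the paper's contribution is showing that carefully constructed variants of this idea retain, and in fact admit provably fast, finite-particle convergence guarantees.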