Automatic Structured Variational Inference
Abstract
Probabilistic programming is concerned with the symbolic specification of probabilistic models for which inference can be performed automatically. Stochastic variational inference with gradient-based automatic differentiation offers an attractive option as the default method for (differentiable) probabilistic programming. However, the performance of any (parametric) variational approach depends on the choice of an appropriate variational family. Here, we introduce automatic structured variational inference (ASVI), a fully automated method for constructing structured variational families, inspired by the closed-form update in conjugate Bayesian models. These pseudo-conjugate families incorporate the forward pass of the input probabilistic program and can therefore capture complex statistical dependencies. Pseudo-conjugate families have the same space and time complexity as the input probabilistic program and are therefore tractable for a very large class of models, including both continuous and discrete variables. We provide a fully automatic implementation in TensorFlow Probability. We validate our automatic variational method on a wide range of both low- and high-dimensional inference problems, including models with deep learning components.
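To make the "pseudo-conjugate" idea concrete, the following is a minimal illustrative sketch (not the paper's implementation; the names `lam`, `alpha`, and the toy chain model are our own assumptions). For a Gaussian node whose prior mean is given by the model's forward pass, the surrogate interpolates that forward-pass mean with a learned value, so setting the interpolation weight to one recovers the prior dependency structure, while setting it to zero reduces to a mean-field approximation:

```python
import random

def asvi_mean(prior_mean, lam, alpha):
    """Convex combination of the model's forward-pass mean and a learned alpha.

    lam = 1 keeps the prior's conditional mean (full dependency on parents);
    lam = 0 ignores the parents entirely (mean-field-like behavior).
    """
    return lam * prior_mean + (1.0 - lam) * alpha

# Toy chain model: z1 ~ N(0, 1), z2 ~ N(z1, 1).
# Illustrative (not learned) variational parameters for each node:
lam1, alpha1 = 0.3, 2.0
lam2, alpha2 = 0.7, -1.0

random.seed(0)
m1 = asvi_mean(0.0, lam1, alpha1)   # surrogate mean for z1  ->  1.4
z1 = random.gauss(m1, 1.0)          # sample z1 from the surrogate
m2 = asvi_mean(z1, lam2, alpha2)    # forward pass reused: z2's prior mean is z1

print(m1, m2)
```

Because the surrogate samples `z2` conditionally on the sampled `z1` through the model's own forward pass, the variational family retains the statistical dependencies of the program at the same space and time cost.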