Google Research

TensorFlow Distributions

Programming Language Design and Implementation (2017)

Abstract

Within the past year, deep probabilistic programming has significantly expanded the reach of probabilistic systems with neural networks, exploiting modern compute capabilities such as accelerators, batching, automatic differentiation, and varying numerical precision. Fundamental to these systems is the ability to build and manipulate probability distributions. However, existing distribution libraries lack modern tools necessary for deep probabilistic programming. To this end, we developed TensorFlow Distributions. TensorFlow Distributions defines two abstractions: ds.Distributions are probability distributions with fast, numerically stable methods for sampling, log density, and many statistics; ds.Bijectors are cached, composable, volume-tracking transformations. As examples, we demonstrate modern high-dimensional distributions and transformations not possible with previous libraries: PixelCNN, (inverse) autoregressive flows, and reversible residual networks. TensorFlow Distributions is widely used at Google in production systems; it is used across research labs in Google Brain and DeepMind; and it is the backend of the Edward probabilistic programming language.
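The two abstractions named in the abstract can be illustrated with a minimal pure-Python sketch. This is not the library's actual API (which lives in TensorFlow); it only shows the change-of-variables idea behind a volume-tracking Bijector: the log density of a transformed variable is the base log density at the inverse image plus the log-determinant of the inverse Jacobian. The class and method names below mirror the concepts (`Normal`, `Exp`, `TransformedDistribution`) but are illustrative assumptions, not the library's signatures.

```python
import math

class Normal:
    """Toy scalar Normal distribution with a numerically direct log density."""
    def __init__(self, loc, scale):
        self.loc, self.scale = loc, scale

    def log_prob(self, x):
        z = (x - self.loc) / self.scale
        return -0.5 * z * z - math.log(self.scale) - 0.5 * math.log(2 * math.pi)

class Exp:
    """Toy bijector y = exp(x); tracks volume change via the log-det-Jacobian."""
    def forward(self, x):
        return math.exp(x)

    def inverse(self, y):
        return math.log(y)

    def inverse_log_det_jacobian(self, y):
        # d/dy log(y) = 1/y, so log|det J| = -log(y)
        return -math.log(y)

class TransformedDistribution:
    """Density of Y = bijector.forward(X) via the change-of-variables formula."""
    def __init__(self, base, bijector):
        self.base, self.bijector = base, bijector

    def log_prob(self, y):
        x = self.bijector.inverse(y)
        return self.base.log_prob(x) + self.bijector.inverse_log_det_jacobian(y)

# Composing Normal with Exp yields a log-normal distribution.
lognormal = TransformedDistribution(Normal(0.0, 1.0), Exp())
```

Composing simple bijectors this way is how the library builds the richer transformations the abstract mentions, such as autoregressive flows; caching of forward/inverse evaluations (omitted here) avoids recomputation during sampling.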
