Graph Normalizing Flows

Aviral Kumar
Jamie Kiros
Jenny Liu
Jimmy Ba
NeurIPS 2019 (to appear)

We introduce Graph Normalizing Flows (GNFs), a new, reversible graph neural network (GNN) model for prediction and generation. On supervised tasks, GNFs perform similarly to GNNs, but at a significantly reduced memory footprint, allowing them to scale to larger graphs. In the unsupervised case, we combine GNFs with a novel graph auto-encoder to create a generative model of graph structures. Our GNF model is permutation invariant, generating entire graphs with a single feed-forward pass, and achieves results competitive with state-of-the-art autoregressive models, while being better suited to parallel computing architectures.
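The reversibility that gives GNFs their reduced memory footprint can be illustrated with an affine coupling layer over node features: half of each node's features parameterize a scale and shift of the other half via message passing, so the layer can be inverted exactly without storing intermediate activations. The sketch below is a hypothetical, untrained numpy illustration of this idea (the `gnn_message` aggregator and the tanh scale/shift functions are stand-ins, not the paper's architecture):

```python
import numpy as np

def gnn_message(h, adj):
    # Mean aggregation over neighbors; a stand-in for a learned
    # message-passing network (hypothetical, for illustration only).
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    return adj @ h / deg

def coupling_forward(h, adj):
    # Affine coupling: split node features into halves; each half is
    # scaled/shifted using messages computed from the other half, so
    # the transformation is exactly invertible.
    h1, h2 = np.split(h, 2, axis=1)
    m = gnn_message(h1, adj)
    s, t = np.tanh(m), m          # toy scale/shift functions (assumed)
    h2 = h2 * np.exp(s) + t
    m2 = gnn_message(h2, adj)
    s2, t2 = np.tanh(m2), m2
    h1 = h1 * np.exp(s2) + t2
    return np.concatenate([h1, h2], axis=1)

def coupling_inverse(h, adj):
    # Exact inverse: recompute the same messages in reverse order and
    # undo each shift/scale, so no forward activations need be stored.
    h1, h2 = np.split(h, 2, axis=1)
    m2 = gnn_message(h2, adj)
    s2, t2 = np.tanh(m2), m2
    h1 = (h1 - t2) * np.exp(-s2)
    m = gnn_message(h1, adj)
    s, t = np.tanh(m), m
    h2 = (h2 - t) * np.exp(-s)
    return np.concatenate([h1, h2], axis=1)
```

Because each node's update depends only on aggregated neighbor messages, the layer is equivariant to node permutations, and a round trip through `coupling_forward` then `coupling_inverse` recovers the input features exactly.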