Metastable Mixing of Markov Chains: Efficiently Sampling Low Temperature Exponential Random Graphs
Abstract
In this paper we consider the problem of sampling from the low-temperature exponential random graph model (ERGM). The usual approach is via Markov chain Monte Carlo, but strong lower bounds have been established showing that any local Markov chain for the ERGM suffers from an exponentially large mixing time due to metastable states.
We instead consider metastable mixing, a notion of approximate mixing within a collection of metastable states. In the case of the ERGM, we show that Glauber dynamics with an appropriately chosen
$G(n,p)$ initialization has a metastable mixing time of $O(n^2\log n)$ to within total variation distance $\exp(-\Omega(n))$.
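To make the sampling procedure concrete, the following is a minimal sketch of single-edge Glauber dynamics for an ERGM, started from a $G(n,p)$ draw. The edge-plus-triangle Hamiltonian, the function name \texttt{glauber\_ergm}, and the parameters \texttt{p\_init}, \texttt{beta1}, \texttt{beta2}, and \texttt{num\_steps} are illustrative assumptions, not the paper's exact specification; in particular, the ``right'' choice of the initialization density \texttt{p\_init} comes from the paper's analysis and is simply taken as an input here.
\begin{verbatim}
import numpy as np

def glauber_ergm(n, p_init, beta1, beta2, num_steps, rng=None):
    """Single-edge Glauber dynamics for an illustrative ERGM.

    Assumed Hamiltonian (for illustration only):
        H(G) = beta1 * (#edges) + beta2 * (#triangles),
    with stationary law proportional to exp(H(G)).
    The chain is started from a G(n, p_init) random graph.
    """
    rng = np.random.default_rng() if rng is None else rng
    # G(n, p_init) start: symmetric 0/1 adjacency matrix, no self-loops.
    A = (rng.random((n, n)) < p_init).astype(np.int8)
    A = np.triu(A, 1)
    A = A + A.T

    for _ in range(num_steps):
        # Pick a uniformly random edge slot {i, j}.
        i, j = rng.choice(n, size=2, replace=False)
        # Change in H from adding the edge {i, j}: beta1 for the edge
        # itself plus beta2 per triangle it closes, i.e. per common
        # neighbor of i and j.
        common = int(np.dot(A[i], A[j]))
        delta = beta1 + beta2 * common
        # Resample the edge from its conditional distribution given
        # the rest of the graph.
        prob_edge = 1.0 / (1.0 + np.exp(-delta))
        A[i, j] = A[j, i] = int(rng.random() < prob_edge)
    return A
\end{verbatim}
Running the chain for on the order of $n^2\log n$ updates, e.g. \texttt{glauber\_ergm(n=200, p\_init=0.3, beta1=-1.0, beta2=0.1, num\_steps=2_000_000)}, matches the number of Glauber steps in the mixing-time bound stated above; the specific parameter values here are arbitrary placeholders.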