State Space LSTM Models with Particle MCMC Inference
Abstract
Long Short-Term Memory (LSTM) is one of the most powerful sequence models.
Despite its strong performance, however, it lacks the interpretability of
state space models. In this paper, we present a way to combine the best of both
worlds by introducing State Space LSTM (SSL) models, which generalize the earlier
work of Zaheer et al. (2017) on combining topic models with LSTMs. Unlike
Zaheer et al. (2017), however, we make no factorization assumptions in our
inference algorithm. We present an efficient sampler based on sequential Monte
Carlo (SMC) methods that draws from the joint posterior directly. Experimental
results confirm the superiority and stability of this SMC inference algorithm on
a variety of domains.
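
To make the SMC idea concrete, the following is a minimal bootstrap particle filter for a toy linear-Gaussian state space model in Python. This is an illustrative sketch only: the AR(1) transition, Gaussian emission, and all parameter names (phi, sigma_x, sigma_y) are assumptions made for the example; it is not the SSL model or the particle MCMC sampler developed in this paper.

    # Minimal bootstrap particle filter (SMC) for a toy linear-Gaussian
    # state space model: x_t = phi*x_{t-1} + N(0, sigma_x^2),
    # y_t = x_t + N(0, sigma_y^2). Illustrative sketch only.
    import numpy as np

    def bootstrap_particle_filter(y, num_particles=500, phi=0.9,
                                  sigma_x=1.0, sigma_y=0.5, seed=0):
        """Return filtered posterior means E[x_t | y_{1:t}] for each t."""
        rng = np.random.default_rng(seed)
        T = len(y)
        particles = rng.normal(0.0, sigma_x, size=num_particles)
        means = np.empty(T)
        for t in range(T):
            # Propagate particles through the transition model.
            particles = phi * particles + rng.normal(0.0, sigma_x,
                                                     size=num_particles)
            # Weight particles by the observation likelihood p(y_t | x_t).
            log_w = -0.5 * ((y[t] - particles) / sigma_y) ** 2
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            means[t] = np.sum(w * particles)
            # Multinomial resampling to combat weight degeneracy.
            idx = rng.choice(num_particles, size=num_particles, p=w)
            particles = particles[idx]
        return means

    if __name__ == "__main__":
        # Simulate from the assumed model, then filter the noisy observations.
        rng = np.random.default_rng(1)
        x = np.zeros(50)
        for t in range(1, 50):
            x[t] = 0.9 * x[t - 1] + rng.normal()
        y = x + rng.normal(0.0, 0.5, size=50)
        print(bootstrap_particle_filter(y)[:5])

Each pass propagates particles through the transition, reweights them by the emission likelihood, and resamples; the same propagate-weight-resample loop underlies the joint-posterior sampler described in the abstract, with the LSTM-parameterized transition taking the place of the AR(1) step.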