Amortized Inference of Variational Bounds for Learning Noisy-OR
Abstract
Classical approaches to approximate inference depend on cleverly designed variational distributions and bounds. Modern approaches employ amortized variational inference, which uses a neural network to approximate any posterior without leveraging the structure of the generative model. In this paper, we propose Amortized Conjugate Posterior (ACP), a hybrid approach that takes advantage of both types of approaches. Specifically, we use classical methods to derive specific forms of posterior distributions and then learn the variational parameters using amortized inference. We study the effectiveness of the proposed approach on the noisy-OR model and compare it to both classical and modern approaches for approximate inference and parameter learning. Our results show that the proposed method outperforms or is on par with other approaches.
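As background, the noisy-OR model referenced above defines the conditional probability of an observed binary variable $x_i$ given binary latent parents $\mathbf{z} = (z_1, \dots, z_K)$; a standard parameterization (the paper's exact notation may differ) is

$$
p(x_i = 1 \mid \mathbf{z}) \;=\; 1 - (1 - q_{i0}) \prod_{j=1}^{K} (1 - q_{ij})^{z_j},
$$

where $q_{ij}$ is the probability that active parent $z_j$ alone turns on $x_i$, and $q_{i0}$ is a leak probability accounting for activation with no active parents.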