MaskGAN: Better Text Generation via Filling in the ____

William Fedus
Ian Goodfellow
Andrew M. Dai
ICLR (2018)

Abstract

Recurrent neural networks (RNNs) are a common method of generating text token by token. These models are typically trained via maximum likelihood, a procedure known in this context as teacher forcing. However, models trained this way often degrade when used to generate new text: at sampling time, tokens later in the sequence are conditioned on prefixes of the model's own samples, prefixes that may never have been observed during training. We explore Generative Adversarial Networks (GANs) as an alternative to teacher forcing for generating discrete sequences.
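To make the train/sample mismatch concrete, the following is a minimal sketch (not the paper's code) contrasting the two conditioning regimes. The `next_token` function is a hypothetical stand-in for an RNN's per-step output distribution; only the shape of the two loops matters here.

```python
"""Sketch: teacher forcing vs. free-running sampling for an
autoregressive token model. `next_token` is a hypothetical stand-in
for an RNN; a real model would condition via its hidden state."""
import random

VOCAB = ["the", "cat", "sat", "on", "mat", "<eos>"]

def next_token(prefix):
    # Hypothetical model: sample a next token given a prefix.
    random.seed(hash(tuple(prefix)) % (2**32))
    return random.choice(VOCAB)

def teacher_forcing_steps(target):
    """Training-time conditioning: step t sees the *ground-truth*
    prefix target[:t], so the model never observes its own mistakes."""
    return [(target[:t], target[t]) for t in range(len(target))]

def free_running_generate(max_len=8):
    """Sampling-time conditioning: step t sees the model's *own*
    earlier samples, which can form prefixes never seen in training --
    the mismatch (exposure bias) described above."""
    prefix = []
    for _ in range(max_len):
        tok = next_token(prefix)
        if tok == "<eos>":
            break
        prefix.append(tok)
    return prefix

if __name__ == "__main__":
    target = ["the", "cat", "sat", "<eos>"]
    print("teacher-forced (prefix, target) pairs:",
          teacher_forcing_steps(target))
    print("free-running sample:", free_running_generate())
```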
In particular, we consider a conditional GAN that fills in missing text conditioned on the surrounding context. We show qualitative and quantitative evidence that this produces more realistic text samples than a model trained with maximum likelihood. We also propose a new task that quantitatively measures the quality of RNN-produced samples.
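As an illustration of the fill-in-the-blank setup, the sketch below (an assumption about the task format, not the paper's released code) masks a random subset of tokens and asks a generator to reconstruct them from the visible context; in MaskGAN the generator is a seq2seq model trained adversarially against a discriminator that scores each filled-in token.

```python
# Sketch of the fill-in-the-blank conditioning. MASK, mask_sequence,
# and fill_in are illustrative names, not the paper's API.
import random

MASK = "<m>"

def mask_sequence(tokens, mask_rate=0.5, rng=None):
    """Replace each token with MASK with probability mask_rate,
    keeping the surrounding context visible to the generator."""
    rng = rng or random.Random(0)
    return [MASK if rng.random() < mask_rate else t for t in tokens]

def fill_in(masked, generator):
    """`generator` is a hypothetical callable mapping (masked
    sequence, position) to a token for each masked slot."""
    return [generator(masked, i) if t == MASK else t
            for i, t in enumerate(masked)]

# Usage with a trivial stand-in generator:
sentence = "the cat sat on the mat".split()
masked = mask_sequence(sentence)
print(masked)
print(fill_in(masked, lambda seq, i: "?"))
```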