Imputer: Sequence Modelling via Imputation and Dynamic Programming

Chitwan Saharia
Geoffrey Everest Hinton
Mohammad Norouzi
Navdeep Jaitly
William Chan
ICML (2020)
Abstract

We present the Imputer, a neural sequence model that generates output sequences iteratively via imputations. The Imputer is a constant-time generation model, requiring a fixed number of generation iterations independent of the number of input or output tokens. The Imputer can be trained to marginalize over all possible alignments between the input and output sequences, and over all possible generation orders. We present a tractable dynamic programming training algorithm and show that it optimizes a lower bound on the log-likelihood. We apply the Imputer to end-to-end speech recognition, where it outperforms all prior non-autoregressive models and achieves results competitive with autoregressive models. On LibriSpeech test-other, the Imputer achieves 11.1 WER, outperforming CTC at 13.0 WER and LAS at 12.5 WER.
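The marginalization over alignments mentioned in the abstract is in the CTC family: a dynamic program sums the probabilities of every monotonic alignment (with blank tokens) that collapses to the target sequence. Below is a minimal illustrative sketch of such a CTC-style forward pass, not the paper's actual training algorithm; the function name, the blank-token convention, and the toy inputs are assumptions for the example.

```python
import numpy as np

BLANK = 0  # assumed id of the blank token

def ctc_forward(probs, target):
    """Sum, via dynamic programming, the probabilities of all monotonic
    alignments (with blanks) of length T that collapse to `target`.

    probs:  (T, V) array, per-frame distributions over V labels.
    target: list of label ids (no blanks).
    Returns p(target | probs), the marginal over alignments.
    """
    T, _ = probs.shape
    # Interleave blanks around the target: length 2U + 1.
    ext = [BLANK]
    for u in target:
        ext += [u, BLANK]
    S = len(ext)

    alpha = np.zeros((T, S))
    alpha[0, 0] = probs[0, BLANK]       # start with a blank...
    if S > 1:
        alpha[0, 1] = probs[0, ext[1]]  # ...or with the first label
    for t in range(1, T):
        for s in range(S):
            a = alpha[t - 1, s]                       # stay
            if s > 0:
                a += alpha[t - 1, s - 1]              # advance one step
            # Skip over a blank, allowed only between distinct labels
            # (the CTC repeated-label rule).
            if s > 1 and ext[s] != BLANK and ext[s] != ext[s - 2]:
                a += alpha[t - 1, s - 2]
            alpha[t, s] = a * probs[t, ext[s]]
    # Valid alignments end on the last label or the trailing blank.
    return alpha[T - 1, S - 1] + (alpha[T - 1, S - 2] if S > 1 else 0.0)
```

For a toy check with two frames, a vocabulary of {blank, a} and uniform per-frame probabilities 0.5, the alignments collapsing to "a" are (a, blank), (blank, a), and (a, a), each with probability 0.25, so the marginal is 0.75 — which the recursion reproduces.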