Inference Strategies for Machine Translation with Conditional Masking

Julia Kreutzer
George Foster
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) (to appear)

Abstract

Conditional masked language model (CMLM) training has proven successful for non-autoregressive and semi-autoregressive sequence generation tasks, such as machine translation. Given a trained CMLM, however, it is not clear what the best inference strategy is. We formulate masked inference as a factorization of conditional probabilities of partial sequences, show that this factorization does not harm performance, and investigate a number of simple heuristics motivated by this perspective. We identify a thresholding strategy that has advantages over the standard "mask-predict" algorithm, and provide analyses of its behavior on machine translation tasks.
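The paper's exact decoding procedures are not reproduced on this page, but the abstract's contrast between mask-predict and a confidence-thresholding strategy can be sketched roughly as follows. This is a minimal Python sketch, assuming a hypothetical cmlm_predict stand-in for a trained CMLM and a sentinel MASK id; the masking schedule, the threshold tau, and the iteration counts are illustrative assumptions, not the paper's settings.

    import numpy as np

    MASK = -1  # sentinel id for a masked position (illustrative; real systems use a vocabulary entry)

    def cmlm_predict(tokens, rng):
        # Stand-in for a trained CMLM: returns (predicted token ids,
        # per-position confidences). Hypothetical interface for illustration.
        preds = np.where(tokens == MASK, rng.integers(0, 100, size=tokens.shape), tokens)
        confs = rng.random(tokens.shape)
        return preds, confs

    def mask_predict(length, iterations=4, seed=0):
        # Standard mask-predict (Ghazvininejad et al., 2019): re-mask a
        # linearly decaying number of the lowest-confidence tokens each round.
        rng = np.random.default_rng(seed)
        tokens = np.full(length, MASK)
        for t in range(1, iterations + 1):
            tokens, confs = cmlm_predict(tokens, rng)
            if t == iterations:
                break  # final pass: keep all predictions
            n_mask = int(length * (1 - t / iterations))  # linear schedule
            tokens[np.argsort(confs)[:n_mask]] = MASK    # re-mask least confident
        return tokens

    def threshold_decode(length, tau=0.7, max_iters=10, seed=0):
        # Thresholding variant: re-mask every position whose confidence falls
        # below tau; stop once all positions clear the threshold. tau and
        # max_iters are illustrative assumptions, not the paper's values.
        rng = np.random.default_rng(seed)
        tokens = np.full(length, MASK)
        for _ in range(max_iters):
            tokens, confs = cmlm_predict(tokens, rng)
            below = confs < tau
            if not below.any():
                break
            tokens[below] = MASK
        return tokens

    print(mask_predict(8))      # fixed-budget re-masking
    print(threshold_decode(8))  # confidence-adaptive re-masking

The design difference the sketch is meant to surface: mask-predict re-masks a fixed, decaying number of positions per iteration regardless of how confident the model is, while a thresholding strategy lets the amount of re-masking adapt to model confidence, which is the kind of behavior the abstract's analysis examines.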
