David Reitter

David Reitter studies natural language processing in the contexts of dialogue and generative models. His recent work addresses the generation of grounded text, which makes only claims supported by available evidence and which appears, for example, in chat and summarization systems. Dr. Reitter has studied ways to detect claims that are not grounded in evidence; these evaluation methods have become the basis for generative models across many Google products.

Reitter's research interests cover diverse areas of computational cognitive science. With academic colleagues, he started a subfield of psycholinguistics that uses large-scale observational datasets to understand how the mind processes language and allows us to engage in conversation.

David Reitter has authored more than 120 papers in cognitive psychology and computer science, and wrote Aquamacs, a widely used software package. Prof. Reitter joined Google from Penn State, where he directed an NSF-funded research group on computational cognition and language processing. He holds a Ph.D. from the University of Edinburgh and prior degrees in linguistics and computer science; he was a fellow at MIT's Media Lab Europe, working on multimodal user interfaces, and a post-doc at Carnegie Mellon University, working on cognitive modeling.

Authored Publications
Evaluating Attribution in Dialogue Systems: The BEGIN Benchmark
Nouha Dziri
Tal Linzen
Transactions of the Association for Computational Linguistics, 10 (2022), 1066–1083
CONQRR: Conversational Query Rewriting for Retrieval with Reinforcement Learning
Ellen Wu
Yi Luan
Hannaneh Hajishirzi
Mari Ostendorf
The 2022 Conference on Empirical Methods in Natural Language Processing (2022)
Increasing Faithfulness in Knowledge-Grounded Dialogue with Controllable Features
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers) (2021), pp. 704–718
Indirect Associations in Learning Semantic and Syntactic Lexical Relationships
Matthew A. Kelly
Moojan Ghafurian
Robert L. West
Journal of Memory and Language, 115 (2020), 104153
Like a Baby: Visually Situated Neural Language Acquisition
Alexander G. Ororbia II
Ankur Mali
Matthew A. Kelly
57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy (2019)