
Jacob Eisenstein
I work on computational linguistics and natural language processing. One of my main research interests is language variation and change: making NLP systems robust to it, and using computational techniques to measure and understand it.
Authored Publications
MD3: The Multi-Dialect Dataset of Dialogues
Clara Rivera, Dora Demszky, Devyani Sharma
Interspeech (2023) (to appear)
Dialect-robust Evaluation of Generated Text
Jiao Sun, Elizabeth Clark, Tu Vu, Sebastian Gehrmann
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Association for Computational Linguistics, Toronto, Canada (2023), pp. 6010-6028
Attributed Question Answering: Evaluation and Modeling for Attributed Large Language Models
Pat Verga, Jianmo Ni
arXiv (2022)
The MultiBERTs: BERT Reproductions for Robustness Analysis
Steve Yadlowsky, Jason Wei, Naomi Saphra, Iulia Raluca Turc
2022
Adapting Language Models to Temporal Knowledge
Bhuwan Dhingra
Transactions of the ACL (2021)
Underspecification Presents Challenges for Credibility in Modern Machine Learning
Dan Moldovan, Ben Adlam, Babak Alipanahi, Alex Beutel, Christina Chen, Jon Deaton, Matthew D. Hoffman, Shaobo Hou, Neil Houlsby, Ghassen Jerfel, Yian Ma, Diana Mincu, Akinori Mitani, Andrea Montanari, Christopher Nielsen, Thomas Osborne, Rajiv Raman, Kim Ramasamy, Jessica Schrouff, Martin Gamunu Seneviratne, Shannon Sequeira, Harini Suresh, Victor Veitch, Steve Yadlowsky, Xiaohua Zhai, D. Sculley
Journal of Machine Learning Research (2020)