Google Research

Timothy Dozat


I joined Google as a research scientist in January 2019. My current work falls into two broad categories: neural network architectures (and some of the theory behind them), emphasizing language model pretraining and distillation; and "classic" NLP tasks, such as part-of-speech tagging and parsing. I've also recently been collaborating with teams working on the Google Assistant. I received my PhD in Linguistics from Stanford University, where I worked under Chris Manning on developing Universal Dependencies and building neural parsers that could reproduce it. I also dabbled in convex optimization at one point, and I might come back to it someday.
