
Jacob Austin
Jacob Austin is an AI Resident at Google Research on the Brain Team, working on program synthesis and probabilistic modeling. His work at Google has focused on program synthesis in real-world programming languages using large neural language models, and on diffusion models for discrete state spaces.
Authored Publications
Resolving Code Review Comments with Machine Learning
Alexander Frömmgen
Peter Choy
Elena Khrapko
Marcus Revaj
2024 IEEE/ACM 46th International Conference on Software Engineering: Software Engineering in Practice (ICSE-SEIP) (to appear)
PaLM: Scaling Language Modeling with Pathways
Aakanksha Chowdhery
Sharan Narang
Jacob Devlin
Maarten Bosma
Hyung Won Chung
Sebastian Gehrmann
Parker Schuh
Sasha Tsvyashchenko
Abhishek Rao
Yi Tay
Noam Shazeer
Nan Du
Reiner Pope
James Bradbury
Guy Gur-Ari
Toju Duke
Henryk Michalewski
Xavier Garcia
Liam Fedus
David Luan
Barret Zoph
Ryan Sepassi
David Dohan
Shivani Agrawal
Mark Omernick
Marie Pellat
Aitor Lewkowycz
Erica Moreira
Rewon Child
Oleksandr Polozov
Zongwei Zhou
Brennan Saeta
Michele Catasta
Jason Wei
Kathy Meier-Hellstern
arXiv:2204.02311 (2022)
Beyond In-Place Corruption: Insertion and Deletion In Denoising Probabilistic Models
Daniel Dunning Woo Johnson
Rianne van den Berg
ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models (2021)
Show Your Work: Scratchpads for Intermediate Computation with Language Models
Maxwell Nye
Guy Gur-Ari
Henryk Witold Michalewski
David Bieber
David Martin Dohan
Aitor Lewkowycz
Maarten Paul Bosma
David Luan
Augustus Odena
(2021)
Program Synthesis with Large Language Models
Augustus Odena
David Martin Dohan
Ellen Jiang
Henryk Michalewski
Maarten Paul Bosma
Maxwell Nye
(2021)
Structured Denoising Diffusion Models in Discrete State-Spaces
Daniel Dunning Woo Johnson
Jonathan Ho
Rianne van den Berg
Advances in Neural Information Processing Systems (2021) (to appear)