
Oriol Vinyals
Oriol Vinyals is a Principal Scientist at Google DeepMind and a team lead of the Deep Learning group. His work focuses on deep learning and artificial intelligence. Prior to joining DeepMind, Oriol was part of the Google Brain team. He holds a Ph.D. in EECS from the University of California, Berkeley, and is a recipient of the 2016 MIT TR35 innovator award. His research has been featured multiple times in The New York Times, the Financial Times, WIRED, and the BBC, and his articles have been cited over 70,000 times. He served as program chair for the International Conference on Learning Representations (ICLR) in 2017 and 2018, and has been an area chair for many editions of the NeurIPS and ICML conferences. Some of his contributions, such as seq2seq, knowledge distillation, and TensorFlow, are used in Google Translate, text-to-speech, and speech recognition, serving billions of queries every day. He was also the lead researcher of the AlphaStar project, which created an agent that defeated a top professional at the game of StarCraft, achieved Grandmaster level, and was featured on the cover of Nature. At DeepMind he continues working on his areas of interest, which include artificial intelligence, with particular emphasis on machine learning, deep learning, and reinforcement learning.
Authored Publications
Emergent abilities of large language models
Barret Zoph
Colin Raffel
Dani Yogatama
Jason Wei
Liam B. Fedus
Maarten Paul Bosma
Percy Liang
Sebastian Borgeaud
Tatsunori B. Hashimoto
Yi Tay
TMLR (2022)
Pointer Graph Networks
Matthew C. Overlan
Razvan Pascanu
Charles Blundell
Thirty-fourth Conference on Neural Information Processing Systems (NeurIPS 2020)
Reinforced Genetic Algorithm Learning for Optimizing Computation Graphs
Aditya Paliwal
Felix Gimeno
Vinod Gopal Nair
Yujia Li
Miles Lubin
International Conference on Learning Representations (ICLR) (2020)
Relational inductive biases, deep learning, and graph networks
Peter Battaglia
Jessica Blake Chandler Hamrick
Victor Bapst
Alvaro Sanchez
Vinicius Zambaldi
Mateusz Malinowski
Andrea Tacchetti
David Raposo
Adam Santoro
Ryan Faulkner
Caglar Gulcehre
Francis Song
Andy Ballard
Justin Gilmer
Ashish Vaswani
Kelsey Allen
Charles Nash
Victoria Jayne Langston
Chris Dyer
Nicolas Heess
Daan Wierstra
Matt Botvinick
Yujia Li
Razvan Pascanu
arXiv (2018)
Hierarchical Representations for Efficient Architecture Search
Hanxiao Liu
Karen Simonyan
Chrisantha Fernando
Koray Kavukcuoglu
International Conference on Learning Representations (2018)
Temporal Modeling Using Dilated Convolution and Gating for Voice-Activity-Detection
Gabor Simko
Aäron van den Oord
ICASSP 2018
Prediction errors of molecular machine learning models lower than hybrid DFT error
Felix Faber
Luke Hutchinson
Huang Bing
Justin Gilmer
Sam Schoenholz
Steven Kearnes
Patrick Riley
Anatole von Lilienfeld
Journal of Chemical Theory and Computation (2017)