
YaGuang Li
YaGuang Li is a senior staff research engineer at Google DeepMind. He co-led the finetuning effort for Gemini 1.5 and Gemini 1.0 for Gemini Advanced. He was also a core contributor to LaMDA and PaLM 2, working on pre-training, instruction tuning, and improving serving efficiency. Prior to joining Google, YaGuang received his Ph.D. in Computer Science from the University of Southern California and his Master's degree in Computer Science from the Institute of Software at the University of Chinese Academy of Sciences in 2014.
Authored Publications
LaMDA: Language Models for Dialog Applications
Aaron Daniel Cohen
Alena Butryna
Alicia Jin
Apoorv Kulshreshtha
Ben Zevenbergen
Chung-ching Chang
Cosmo Du
Daniel De Freitas Adiwardana
Dehao Chen
Dmitry (Dima) Lepikhin
Erin Hoffman-John
Igor Krivokon
James Qin
Jamie Hall
Joe Fenton
Johnny Soraker
Kathy Meier-Hellstern
Maarten Paul Bosma
Marc Joseph Pickett
Marcelo Amorim Menegali
Marian Croak
Maxim Krikun
Noam Shazeer
Rachel Bernstein
Ravi Rajakumar
Ray Kurzweil
Romal Thoppilan
Steven Zheng
Taylor Bos
Toju Duke
Tulsee Doshi
Vincent Y. Zhao
Will Rusch
Yuanzhong Xu
Zhifeng Chen
arXiv (2022)
HyperPrompt: Prompt-based Task-Conditioning of Transformers
Cosmo Du
Steven Zheng
Vamsi Aribandi
Yi Tay
Yun He
Zhao Chen
Zhe Zhao
ICML (2022)