
Andrew Hard
I'm a Software Engineer at Google DeepMind, where I specialize in post-training for the Gemini Live model. My research interests include RL for post-training as well as multi-turn conversation simulation and optimization. Past projects have included federated learning for "OK Google" models and for keyboard next-word prediction. I received a PhD in high-energy particle physics for work related to the discovery of the Higgs boson with the ATLAS experiment at CERN.
Research Areas
Authored Publications
Learning from straggler clients in federated learning
Ehsan Amid
Rohan Anil
arXiv (2024)
Diurnal or Nocturnal? Federated Learning of Multi-branch Networks from Periodically Shifting Distributions
Chen Zhu
Jakub Konečný
Tom Goldstein
International Conference on Learning Representations (2022) (to appear)
Mixed Federated Learning: Joint Decentralized and Centralized Learning
Karan Singhal
Satyen Kale
arXiv (2022)
A Field Guide to Federated Optimization
Jianyu Wang
Gauri Joshi
Maruan Al-Shedivat
Galen Andrew
A. Salman Avestimehr
Katharine Daly
Deepesh Data
Suhas Diggavi
Hubert Eichner
Advait Gadhikar
Antonious M. Girgis
Filip Hanzely
Chaoyang He
Samuel Horvath
Martin Jaggi
Tara Javidi
Satyen Chandrakant Kale
Sai Praneeth Karimireddy
Jakub Konečný
Sanmi Koyejo
Tian Li
Peter Richtarik
Karan Singhal
Virginia Smith
Mahdi Soltanolkotabi
Weikang Song
Sebastian Stich
Ameet Talwalkar
Hongyi Wang
Blake Woodworth
Honglin Yuan
Manzil Zaheer
Mi Zhang
Tong Zhang
Chunxiang (Jake) Zheng
Chen Zhu
arXiv (2021)
Jointly Learning from Decentralized (Federated) and Centralized Data to Mitigate Distribution Shift
NeurIPS 2021 Workshop on Distribution Shifts (2021)
Training Keyword Spotting Models on Non-IID Data with Federated Learning
Aishanee Shah
Cameron Nguyen
Niranjan Subrahmanya
Pai Zhu
Interspeech (2020)
Federated Learning for Mobile Keyboard Prediction
Chloé M Kiddon
Hubert Eichner
(2019)