- Peter Kairouz
- H. Brendan McMahan
- Brendan Avent
- Aurélien Bellet
- Mehdi Bennis
- Arjun Nitin Bhagoji
- K. A. Bonawitz
- Zachary Charles
- Graham Cormode
- Rachel Cummings
- Rafael G. L. D'Oliveira
- Salim El Rouayheb
- David Evans
- Josh Gardner
- Zachary Garrett
- Adrià Gascón
- Badih Ghazi
- Phillip B. Gibbons
- Marco Gruteser
- Zaid Harchaoui
- Chaoyang He
- Lie He
- Zhouyuan Huo
- Ben Hutchinson
- Justin Hsu
- Martin Jaggi
- Tara Javidi
- Gauri Joshi
- Mikhail Khodak
- Jakub Konečný
- Aleksandra Korolova
- Farinaz Koushanfar
- Sanmi Koyejo
- Tancrède Lepoint
- Yang Liu
- Prateek Mittal
- Mehryar Mohri
- Richard Nock
- Ayfer Özgür
- Rasmus Pagh
- Mariana Raykova
- Hang Qi
- Daniel Ramage
- Ramesh Raskar
- Dawn Song
- Weikang Song
- Sebastian U. Stich
- Ziteng Sun
- Ananda Theertha Suresh
- Florian Tramèr
- Praneeth Vepakomma
- Jianyu Wang
- Li Xiong
- Zheng Xu
- Qiang Yang
- Felix X. Yu
- Han Yu
- Sen Zhao
Abstract
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile devices or whole organizations) collaboratively train a model under the orchestration of a central server (e.g., service provider), while keeping the training data decentralized. FL embodies the principles of focused data collection and minimization, and mitigates many of the systemic privacy risks and costs resulting from traditional, centralized machine learning and data science approaches. Motivated by the explosive growth in FL research, this paper discusses recent advances and presents a comprehensive list of open problems and challenges.
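To make the setting concrete, below is a minimal sketch of a federated averaging loop (FedAvg, McMahan et al., 2017), the canonical baseline algorithm for this setting. The paper's abstract does not prescribe this code; the linear model, squared loss, client data, and hyperparameters here are illustrative assumptions only.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=1):
    """One client's local training: a few gradient steps on a linear
    model with squared loss, starting from the server's weights."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of 0.5 * MSE
        w -= lr * grad
    return w

def federated_averaging(clients, rounds=10, lr=0.1):
    """Server loop: broadcast the model, collect locally trained
    weights, and aggregate them weighted by client dataset size.
    Raw (X, y) data never leaves a client; only weights are shared."""
    dim = clients[0][0].shape[1]
    w = np.zeros(dim)
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in clients:
            updates.append(local_update(w, X, y, lr))
            sizes.append(len(y))
        w = np.average(updates, axis=0, weights=np.array(sizes, float))
    return w

# Toy usage (hypothetical data): three clients holding private
# samples of the same underlying linear task.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (40, 25, 60):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=n)))
print(federated_averaging(clients, rounds=50))  # approx. [2.0, -1.0]
```

Weighting the aggregation by each client's example count mirrors the standard FedAvg rule, so the result approximates training on the pooled data while the data itself stays decentralized.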