
Wennan Zhu
Wennan Zhu is a research scientist working on algorithmic game theory and mechanism design. She received her Ph.D. in Computer Science from Rensselaer Polytechnic Institute, where she worked on social choice algorithms. Visit her homepage for more information.
Authored Publications
Auto-bidding and Auctions in Online Advertising: A Survey
Ashwinkumar Badanidiyuru Varadaraja
Ariel Schvartzman
Hanrui Zhang
Kelly Spendlove
Georgios Piliouras
Haihao (Sean) Lu
Andres Perlroth
Christopher Liaw
Mingfei Zhao
ACM SIGecom Exchanges, 22 (2024)
Abstract
In this survey, we summarize recent developments in research fueled by the growing adoption of automated bidding strategies in online advertising. We explore the challenges and opportunities that have arisen as markets embrace autobidding, and cover a range of topics in this area, including bidding algorithms, equilibrium analysis and efficiency of common auction formats, and optimal auction design.
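To give a concrete flavor of the bidding algorithms the survey covers, below is a minimal sketch (not taken from the survey itself) of a value-maximizing autobidder that paces its spend with a single dual multiplier; the value/price model, the update rule, and every name in it are illustrative assumptions.

```python
# Minimal sketch (illustrative only): budget pacing for a value-maximizing
# autobidder. The bid is the value shaded by a dual multiplier mu, and mu is
# raised when the bidder spends faster than its per-round budget.

def pace_bids(values, prices, budget, eta=0.05):
    """Bid value / (1 + mu) in each auction; adjust mu from realized spend."""
    mu = 0.0                                  # dual multiplier on the budget constraint
    spend = 0.0
    per_round_budget = budget / len(values)   # target spend per auction
    bids = []
    for value, price in zip(values, prices):
        bid = value / (1.0 + mu)              # shaded bid induced by the multiplier
        bids.append(bid)
        cost = price if bid >= price else 0.0 # win a second-price auction at `price`
        spend += cost
        # Gradient-style update: increase mu after overspending, relax otherwise.
        mu = max(0.0, mu + eta * (cost - per_round_budget))
    return bids, spend

# Toy usage with hypothetical per-auction values and competing prices.
values = [1.0, 0.8, 1.2, 0.5, 0.9]
prices = [0.6, 0.7, 0.4, 0.3, 0.8]
bids, spend = pace_bids(values, prices, budget=2.0)
print(bids, spend)
```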
A Field Guide to Federated Optimization
Suhas Diggavi
Chaoyang He
Mahdi Soltanolkotabi
Maruan Al-Shedivat
Chen Zhu
Peter Richtarik
Honglin Yuan
Ameet Talwalkar
Sebastian Stich
Sanmi Koyejo
Hongyi Wang
Deepesh Data
Blake Woodworth
Filip Hanzely
A. Salman Avestimehr
Tian Li
Jianyu Wang
Samuel Horvath
Antonious M. Girgis
Mi Zhang
Advait Gadhikar
Martin Jaggi
Gauri Joshi
Tara Javidi
Virginia Smith
Sai Praneeth Karimireddy
Karan Singhal
Jakub Konečný
Manzil Zaheer
Satyen Chandrakant Kale
Chunxiang (Jake) Zheng
Weikang Song
Galen Andrew
Katharine Daly
Tong Zhang
Hubert Eichner
arXiv (2021)
Abstract
Federated learning and analytics are a distributed approach for collaboratively learning models (or statistics) from decentralized data, motivated by and designed for privacy protection. The distributed learning process can be formulated as solving federated optimization problems, which emphasize communication efficiency, data heterogeneity, compatibility with privacy and system requirements, and other constraints that are not primary considerations in other problem settings. This paper provides recommendations and guidelines on formulating, designing, evaluating and analyzing federated optimization algorithms through concrete examples and practical implementation, with a focus on conducting effective simulations to infer real-world performance. The goal of this work is not to survey the current literature, but to inspire researchers and practitioners to design federated learning algorithms that can be used in various practical applications.
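As a rough illustration of the federated optimization setup described above, here is a minimal FedAvg-style sketch assuming quadratic client losses; the client model, sampling scheme, and all function names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Minimal FedAvg-style sketch (illustrative; not the paper's code).
# Each client i holds a quadratic loss f_i(x) = 0.5 * ||x - c_i||^2 as a
# stand-in for its local data; the server averages sampled clients' updates.

def client_update(x, center, lr=0.1, local_steps=5):
    """Run a few local gradient steps on one client's loss."""
    x = x.copy()
    for _ in range(local_steps):
        grad = x - center                 # gradient of 0.5 * ||x - center||^2
        x -= lr * grad
    return x

def fedavg(centers, rounds=20, clients_per_round=3, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros_like(centers[0])         # global model
    for _ in range(rounds):
        chosen = rng.choice(len(centers), clients_per_round, replace=False)
        updates = [client_update(x, centers[i]) for i in chosen]
        x = np.mean(updates, axis=0)      # uniform server averaging
    return x

centers = [np.array([1.0, 0.0]), np.array([0.0, 1.0]),
           np.array([2.0, 2.0]), np.array([-1.0, 0.5])]
print(fedavg(centers))                    # drifts toward the mean of client optima
```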
Federated Heavy Hitters with Differential Privacy
Haicheng Sun
Vivian (Wei) Li
International Conference on Artificial Intelligence and Statistics (AISTATS) 2020
Abstract
The discovery of heavy hitters (most frequent items) in user-generated data streams drives improvements in the app and web ecosystems, but can incur substantial privacy risks if not done with care. To address these risks, we propose a distributed and privacy-preserving algorithm for discovering the heavy hitters in a population of user-generated data streams. We leverage the sampling property of our distributed algorithm to prove that it is inherently differentially private, without requiring additional noise. We also examine the trade-off between privacy and utility, and show that our algorithm provides excellent utility while also achieving strong privacy guarantees. A significant advantage of this approach is that it eliminates the need to centralize raw data while also avoiding the significant loss in utility incurred by local differential privacy. We validate our findings both theoretically, using worst-case analyses, and practically, using a Twitter dataset with 1.6M tweets and over 650k users. Finally, we carefully compare our approach to Apple's local differential privacy method for discovering heavy hitters.
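To illustrate the sampling idea informally, here is a minimal sketch of prefix-by-prefix heavy-hitter discovery over sampled users; the thresholds, round counts, and function names are assumptions for illustration and do not reproduce the paper's exact protocol or its privacy analysis.

```python
import random
from collections import Counter

# Illustrative sketch of sampling-based heavy-hitter discovery: each round, a
# small random sample of users votes on one-character extensions of prefixes
# discovered so far, and a prefix survives only if enough sampled users share
# it. Seeing only a small sample (rather than every user) is what bounds any
# single user's influence. All parameters below are assumptions.

def discover_heavy_hitters(user_words, rounds=10, sample_size=30,
                           threshold=3, max_len=10, seed=0):
    rng = random.Random(seed)
    prefixes = {""}                                   # discovered prefixes (flattened trie)
    for _ in range(rounds):
        sample = rng.sample(user_words, min(sample_size, len(user_words)))
        votes = Counter()
        for word in sample:
            for length in range(1, min(len(word), max_len) + 1):
                candidate = word[:length]
                if candidate[:-1] in prefixes:        # only extend known prefixes
                    votes[candidate] += 1
        prefixes |= {p for p, c in votes.items() if c >= threshold}
    # Report full words that were themselves discovered as prefixes.
    return sorted(w for w in set(user_words) if w in prefixes)

# Toy usage: two popular words and many rare ones held by single users.
users = ["hello"] * 40 + ["world"] * 30 + ["rare" + str(i) for i in range(30)]
print(discover_heavy_hitters(users))                  # likely ['hello', 'world']
```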