
Srinadh Bhojanapalli
I am a research scientist at Google Research in New York. Earlier, I was a research assistant professor at TTI Chicago. I obtained my PhD at The University of Texas at Austin, where I was advised by Prof. Sujay Sanghavi.
My research is primarily focused on designing statistically efficient algorithms for large-scale machine learning problems. I am interested in non-convex optimization, matrix and tensor factorization, neural networks, and sub-linear time algorithms.
Authored Publications
Efficient Language Model Architectures for Differentially Private Federated Learning
Yanxiang Zhang
Privacy Regulation and Protection in Machine Learning Workshop at ICLR 2024 (2024) (to appear)
Dual-Encoders for Extreme Multi-label Classification
Nilesh Gupta, Devvrit Khatri, Inderjit Dhillon
International Conference on Learning Representations (ICLR) (2024)
On Emergence of Activation Sparsity in Trained Transformers
Zonglin Li, Chong You, Daliang Li, Ke Ye
International Conference on Learning Representations (ICLR) (2023)
Teacher's pet: understanding and mitigating biases in distillation
Aditya Krishna Menon
Transactions on Machine Learning Research (2022)
Coping with label shift via distributionally robust optimisation
Jingzhao Zhang, Aditya Krishna Menon, Suvrit Sra
International Conference on Learning Representations (2021)
Understanding Robustness of Transformers for Image Classification
Daliang Li, Thomas Unterthiner
Proceedings of the IEEE/CVF International Conference on Computer Vision (2021) (to appear)
$O(n)$ Connections are Expressive Enough: Universal Approximability of Sparse Transformers
Chulhee Yun
Advances in Neural Information Processing Systems (2020)
Does label smoothing mitigate label noise?
Aditya Krishna Menon
International Conference on Machine Learning (2020) (to appear)
Modifying Memories in Transformer Models
Chen Zhu, Daliang Li, Manzil Zaheer
arXiv preprint (2020)
Semantic label smoothing for sequence to sequence problems
Himanshu Jain, Aditya Krishna Menon, Seungyeon Kim
EMNLP (2020) (to appear)