Katya’s main research interests lie broadly in continuous optimization, focusing on convex optimization, derivative-free optimization, and large-scale methods for Big Data and Machine Learning applications, and she has co-organized multiple workshops on these topics. Katya is currently the Editor-in-Chief of the MOS-SIAM Series on Optimization and an associate editor of the SIAM Journal on Optimization and of Mathematical Programming, Series A. She is a recipient of the 2015 Lagrange Prize in Continuous Optimization, along with Andrew R. Conn and Luis Nunes Vicente, for their book Introduction to Derivative-Free Optimization. Katya’s research is supported by grants from AFOSR, DARPA, NSF, and Yahoo.
While visiting Google NYC, Katya is working on several projects, such as improving the convergence of methods for federated learning, using derivative-free methods for model predictive control, and integrating derivative-free optimization with hyperparameter optimization. Katya is generally interested in understanding and improving optimization for deep learning and hopes to obtain interesting empirical and theoretical results that will carry over to her future research.