- Adrien B. Taylor
- Yoel Drori
Abstract
We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, nonsmooth, and strongly convex minimization. The design technique takes a method that performs a series of subspace searches and constructs a fixed-step method with the same worst-case guarantees. We show that this technique yields optimal algorithms in the smooth and nonsmooth cases, and use it to devise a new method for smooth strongly convex minimization with an improved convergence rate compared to Nesterov's celebrated fast gradient methods.
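To make the contrast concrete, the sketch below shows what a fixed-step first-order method looks like, using Nesterov's fast gradient method (the baseline the abstract compares against) on an arbitrary smooth quadratic; it is an illustration of the fixed-step format only, not the construction or the new method described in the paper, and the objective, step budget, and helper names are chosen purely for the demo.

```python
import numpy as np

def fast_gradient_method(grad, L, x0, N):
    """Nesterov's fast gradient method for an L-smooth convex objective.

    Every coefficient is fixed in advance: no line search or subspace
    search is performed at run time, in contrast to subspace-search methods.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(N):
        x_next = y - grad(y) / L          # fixed step of length 1/L along the gradient at y
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # fixed momentum coefficient
        x, t = x_next, t_next
    return x

# Demo: minimize the smooth convex quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0, 100.0])
L = 100.0                                  # largest eigenvalue = smoothness constant
x_final = fast_gradient_method(lambda x: A @ x, L, x0=np.ones(3), N=200)
print(x_final)                             # close to the minimizer at the origin
```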