Efficient first-order methods for convex minimization: a constructive approach
Abstract
We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, nonsmooth, and strongly convex minimization. Given a method that performs a series of subspace searches, this design technique constructs a fixed-step method with the same worst-case guarantees.
We show that this technique yields optimal algorithms in the smooth and nonsmooth cases, and use it to devise a new method for smooth strongly convex minimization whose convergence rate improves on that of Nesterov's celebrated fast gradient methods.