Efficient first-order methods for convex minimization: a constructive approach

Abstract

We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex minimization. The technique builds upon a certain variant of the conjugate gradient method to construct a family of methods such that (a) all methods in the family share the same worst-case guarantee as the base conjugate gradient method, and (b) the family includes a fixed-step first-order method. We demonstrate the effectiveness of the approach by deriving optimal methods for the smooth and non-smooth cases, including new methods that forego knowledge of the problem parameters at the cost of a one-dimensional line search per iteration, and a universal method for the union of these classes that requires a three-dimensional search per iteration. In the strongly convex case, we show how numerical tools can be used to perform the construction, and show that the resulting method offers an improved worst-case bound compared to Nesterov’s celebrated fast gradient method.
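For context, the strongly convex result is stated relative to Nesterov's fast gradient method (FGM), the standard accelerated baseline. Below is a minimal sketch of the classical FGM for an L-smooth convex objective, not the constructed method from the paper; the quadratic test problem, the step size 1/L, and the function names are illustrative assumptions.

```python
# A minimal sketch of Nesterov's fast gradient method (FGM), the baseline the
# paper compares against; this is NOT the paper's constructed method.
import numpy as np

def fast_gradient_method(grad, x0, L, iterations=100):
    """Run FGM for an L-smooth convex objective, given a gradient oracle."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iterations):
        x_next = y - grad(y) / L                           # fixed-step gradient step at y
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum parameter update
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation (momentum) step
        x, t = x_next, t_next
    return x

# Usage on a toy quadratic f(x) = 0.5 * x^T A x - b^T x, where L is the
# largest eigenvalue of A (assumed known here, unlike the paper's
# parameter-free line-search variants).
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_hat = fast_gradient_method(lambda x: A @ x - b, np.zeros(3), L=100.0, iterations=500)
print(x_hat)  # approaches the minimizer A^{-1} b = [1, 0.1, 0.01]
```

Note that FGM above uses a fixed step size 1/L and therefore requires the smoothness constant; the abstract's line-search variants trade this knowledge for a one-dimensional search per iteration.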

Citation (APA)

Drori, Y., & Taylor, A. B. (2020). Efficient first-order methods for convex minimization: a constructive approach. Mathematical Programming, 184(1–2), 183–220. https://doi.org/10.1007/s10107-019-01410-2
