Smooth convex optimization


Abstract

In this chapter, we study the complexity of solving optimization problems formed by differentiable convex components. We start by establishing the main properties of such functions and deriving the lower complexity bounds, which are valid for all natural optimization methods. After that, we prove the worst-case performance guarantees for the Gradient Method. Since these bounds are quite far from the lower complexity bounds, we develop a special technique, based on the notion of estimating sequences, which allows us to justify the Fast Gradient Methods. These methods appear to be optimal for smooth convex problems. We also obtain performance guarantees for these methods aimed at generating points with a small norm of the gradient. In order to treat problems with set constraints, we introduce the notion of a Gradient Mapping. This allows an automatic extension of methods for unconstrained minimization to the constrained case. In the last section, we consider methods for solving smooth optimization problems defined by several functional components.
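For a convex function f with an L-Lipschitz gradient, the Gradient Method with step size 1/L attains a worst-case rate of O(L/k) in function value, while the lower complexity bound for this problem class is of order O(L/k^2); the Fast Gradient Methods close this gap and are therefore optimal. The sketch below is only illustrative: it uses one common momentum rule (the (t_k - 1)/t_{k+1} coefficient), not the chapter's own estimating-sequences construction, and the quadratic test problem and function names are assumptions made here for the example.

import numpy as np

def gradient_method(grad_f, x0, L, num_iters):
    # Plain gradient method with step size 1/L.
    # For convex f with L-Lipschitz gradient, the worst-case guarantee
    # in function value is of order O(L / k).
    x = np.array(x0, dtype=float)
    for _ in range(num_iters):
        x = x - grad_f(x) / L
    return x

def fast_gradient_method(grad_f, x0, L, num_iters):
    # One common variant of a Fast Gradient Method (Nesterov-type acceleration).
    # The momentum coefficient (t - 1) / t_next is a standard choice; the
    # chapter derives its schemes via estimating sequences, so this is a
    # sketch of the idea, not the exact algorithm. Worst-case rate: O(L / k^2).
    x = np.array(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(num_iters):
        x_next = y - grad_f(y) / L                          # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0   # update of the momentum parameter
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)    # extrapolation (momentum) step
        x, t = x_next, t_next
    return x

if __name__ == "__main__":
    # Hypothetical test problem: minimize the convex quadratic
    # f(x) = 0.5 * x^T A x - b^T x, whose gradient is A x - b.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    A = A.T @ A                        # positive semidefinite Hessian
    b = rng.standard_normal(5)
    L = np.linalg.eigvalsh(A).max()    # Lipschitz constant of the gradient
    grad = lambda x: A @ x - b
    x_star = np.linalg.solve(A, b)
    print("GM  error:", np.linalg.norm(gradient_method(grad, np.zeros(5), L, 200) - x_star))
    print("FGM error:", np.linalg.norm(fast_gradient_method(grad, np.zeros(5), L, 200) - x_star))

On such a problem the accelerated scheme typically reaches a given accuracy in far fewer iterations than the plain Gradient Method, reflecting the O(L/k) versus O(L/k^2) guarantees quoted above.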

Cite

APA

Nesterov, Y. (2018). Smooth convex optimization. In Springer Optimization and Its Applications (Vol. 137, pp. 59–137). Springer International Publishing. https://doi.org/10.1007/978-3-319-91578-4_2
