The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions

Abstract

In this paper, we study the convergence rate of the gradient (or steepest descent) method with fixed step lengths for finding a stationary point of an L-smooth function. We establish a new convergence rate, and show that the bound may be exact in some cases, in particular when all step lengths lie in the interval (0, 1/L]. In addition, we derive an optimal step length with respect to the new bound.
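For context, the method in question is the iteration x_{k+1} = x_k - t_k ∇f(x_k) with fixed step lengths t_k. Below is a minimal Python sketch (illustrative only, not code from the paper) of this iteration with a constant step length t in (0, 1/L], applied to an assumed L-smooth quadratic test function; the function names and test problem are hypothetical choices made for this example.

    # Minimal sketch of the gradient method with a fixed step length t in (0, 1/L].
    # Illustrative only; the test function and names are assumptions, not the paper's code.
    import numpy as np

    def gradient_method(grad, x0, t, num_iters):
        """Run x_{k+1} = x_k - t * grad(x_k); return the final point and the
        smallest gradient norm observed (the quantity bounded in such analyses)."""
        x = np.asarray(x0, dtype=float)
        best_grad_norm = np.linalg.norm(grad(x))
        for _ in range(num_iters):
            x = x - t * grad(x)
            best_grad_norm = min(best_grad_norm, np.linalg.norm(grad(x)))
        return x, best_grad_norm

    # Example: f(x) = 0.5 * x^T A x is L-smooth with L = largest eigenvalue of A.
    A = np.diag([1.0, 10.0])            # L = 10 for this quadratic
    grad_f = lambda x: A @ x
    L = 10.0
    x_final, g_min = gradient_method(grad_f, x0=[1.0, 1.0], t=1.0 / L, num_iters=50)
    print(g_min)                        # smallest gradient norm over the run

As background (a standard textbook bound, not the new bound established in the paper), with t = 1/L one has min_{0 ≤ k ≤ N} ||∇f(x_k)||^2 ≤ 2L(f(x_0) - f*)/(N + 1), where f* is the infimum of f; the paper sharpens this type of worst-case guarantee and derives an optimal fixed step length with respect to its new bound.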

Cite

APA
Abbaszadehpeivasti, H., de Klerk, E., & Zamani, M. (2022). The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions. Optimization Letters, 16(6), 1649–1661. https://doi.org/10.1007/s11590-021-01821-1
