A modified limited memory steepest descent method motivated by an inexact super-linear convergence rate analysis


This article is free to access.

Abstract

How to choose the step size of the gradient descent method has been a popular subject of research. In this paper we propose a modified limited memory steepest descent method (MLMSD). In each iteration, a selection rule picks a single step size from a candidate set computed by Fletcher's limited memory steepest descent method (LMSD), instead of going through all the step sizes in a sweep as in Fletcher's original LMSD algorithm. MLMSD is motivated by an inexact super-linear convergence rate analysis. The R-linear convergence of MLMSD is proved for a strictly convex quadratic minimization problem. Numerical tests are presented to show that our algorithm is efficient and robust.
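To illustrate the candidate-set idea described in the abstract, the sketch below applies LMSD-style steps to a strictly convex quadratic f(x) = ½xᵀAx − bᵀx: the candidate step sizes are the reciprocals of the Ritz values of A on the span of the most recent gradients, and one candidate is chosen per iteration. This is a minimal sketch under stated assumptions, not the paper's algorithm: the selection rule used here (take the shortest candidate step, 1/θ_max) is a hypothetical placeholder for the paper's rule, and direct access to the Hessian A is a simplification of how Fletcher's LMSD computes Ritz values in practice.

```python
import numpy as np

def mlmsd_sketch(A, b, x0, m=5, max_iter=500, tol=1e-10):
    """Illustrative LMSD-style iteration for f(x) = 0.5*x'Ax - b'x.

    Candidate step sizes are reciprocals of the Ritz values of A on the
    span of the last m gradients; one candidate is used per iteration.
    The selection rule below (shortest candidate step) is a hypothetical
    placeholder, not the selection rule proposed in the paper.
    """
    x = x0.astype(float).copy()
    g = A @ x - b                      # gradient of the quadratic
    grads = [g.copy()]                 # limited memory of recent gradients
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        G = np.column_stack(grads[-m:])
        Q, _ = np.linalg.qr(G)         # orthonormal basis of the gradient span
        theta = np.linalg.eigvalsh(Q.T @ A @ Q)  # Ritz values of A
        alpha = 1.0 / theta.max()      # candidate set {1/theta_i}; pick shortest
        x -= alpha * g
        g = A @ x - b
        grads.append(g.copy())
    return x
```

Because each Ritz value lies between the extreme eigenvalues of A, every candidate step stays within the stable step-size range for the quadratic, so this particular selection yields monotone descent; the paper's interest is in smarter selections that recover faster (super-linear-like) behavior.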

Citation (APA)

Gu, R., & Du, Q. (2021). A modified limited memory steepest descent method motivated by an inexact super-linear convergence rate analysis. IMA Journal of Numerical Analysis, 41(1), 247–270. https://doi.org/10.1093/imanum/drz059
