How to choose the step size of the gradient descent method has been a popular subject of research. In this paper we propose a modified limited memory steepest descent method (MLMSD). In each iteration, a selection rule picks a unique step size from a candidate set computed by Fletcher's limited memory steepest descent method (LMSD), instead of going through all the step sizes in a sweep as in Fletcher's original LMSD algorithm. MLMSD is motivated by an inexact super-linear convergence rate analysis. The R-linear convergence of MLMSD is proved for strictly convex quadratic minimization problems. Numerical tests show that the algorithm is efficient and robust.
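To illustrate the general idea, the sketch below runs gradient descent on a strictly convex quadratic f(x) = ½xᵀAx − bᵀx with LMSD-style candidate step sizes, namely the reciprocals of the Ritz values of A restricted to the span of the most recent gradients. The per-iteration selection rule used here (take the most conservative candidate, i.e., the largest Ritz value) is only a hypothetical placeholder for illustration; the actual selection rule proposed in the paper is not reproduced.

```python
import numpy as np

def mlmsd_sketch(A, b, x0, m=5, max_iter=500, tol=1e-8):
    """Gradient descent on f(x) = 0.5*x'Ax - b'x with LMSD-style candidate
    step sizes (reciprocal Ritz values of A on the span of recent gradients).

    The per-iteration choice below (largest Ritz value, i.e., the most
    conservative step) is a hypothetical placeholder, not the selection
    rule proposed in the paper.
    """
    x = x0.astype(float).copy()
    g = A @ x - b
    grads = [g.copy()]
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Candidate step sizes: 1/theta for Ritz values theta of A restricted
        # to the subspace spanned by the last m gradients.
        G = np.column_stack(grads[-m:])
        Q, _ = np.linalg.qr(G)
        T = Q.T @ (A @ Q)
        ritz = np.linalg.eigvalsh(0.5 * (T + T.T))
        ritz = ritz[ritz > 0]
        alpha = 1.0 / ritz.max()      # placeholder selection rule
        x = x - alpha * g
        g = A @ x - b
        grads.append(g.copy())
    return x

# Example on a random strictly convex quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)
b = rng.standard_normal(50)
x = mlmsd_sketch(A, b, np.zeros(50))
print(np.linalg.norm(A @ x - b))      # residual should be small
```

Because the largest Ritz value is at least the Rayleigh quotient of the current gradient, this placeholder step never exceeds the exact line-search (Cauchy) step, so the objective decreases monotonically on the quadratic; it is meant only to show the structure shared by LMSD and MLMSD, not their convergence behavior.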
Gu, R., & Du, Q. (2021). A modified limited memory steepest descent method motivated by an inexact super-linear convergence rate analysis. IMA Journal of Numerical Analysis, 41(1), 247–270. https://doi.org/10.1093/imanum/drz059