Explicit stabilised gradient descent for faster strongly convex optimisation

Abstract

This paper introduces the Runge–Kutta Chebyshev descent method (RKCD) for strongly convex optimisation problems. This new algorithm is based on explicit stabilised integrators for stiff differential equations, a powerful class of numerical schemes that avoid the severe step size restriction faced by standard explicit integrators. For optimising quadratic and strongly convex functions, this paper proves that RKCD nearly achieves the optimal convergence rate of the conjugate gradient algorithm, and that the suboptimality of RKCD diminishes as the condition number of the quadratic function worsens. It is established that this optimal rate is also obtained for a partitioned variant of RKCD applied to perturbations of quadratic functions. In addition, numerical experiments on general strongly convex problems show that RKCD outperforms Nesterov’s accelerated gradient descent.
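To illustrate the kind of integrator the paper builds on, the sketch below applies one damped first-order Runge–Kutta–Chebyshev (RKC) "super-step" to the gradient flow x'(t) = -∇f(x). This is a generic RKC recursion, not the authors' exact RKCD scheme or parameter choices; the function name rkc_gradient_step, the step size h, the stage count s and the damping eps are illustrative assumptions.

import numpy as np

def rkc_gradient_step(x0, grad_f, h, s, eps=0.05):
    # One s-stage damped Chebyshev super-step for x' = -grad_f(x).
    # (Sketch only: the paper's RKCD ties h, s and the damping to the
    # smoothness and strong-convexity constants; those choices are omitted.)
    w0 = 1.0 + eps / s**2
    # Chebyshev polynomials T_j(w0) and derivatives T_j'(w0) by recurrence.
    T = np.zeros(s + 1)
    dT = np.zeros(s + 1)
    T[0], T[1] = 1.0, w0
    dT[0], dT[1] = 0.0, 1.0
    for j in range(2, s + 1):
        T[j] = 2.0 * w0 * T[j - 1] - T[j - 2]
        dT[j] = 2.0 * T[j - 1] + 2.0 * w0 * dT[j - 1] - dT[j - 2]
    w1 = T[s] / dT[s]

    # Three-term stage recurrence of the classical (first-order) RKC method,
    # with f replaced by the negative gradient.
    k_prev2 = x0
    k_prev1 = x0 - (w1 / w0) * h * grad_f(x0)
    for j in range(2, s + 1):
        mu = 2.0 * w0 * T[j - 1] / T[j]      # weight on the previous stage
        nu = -T[j - 2] / T[j]                # weight on the stage before that
        mu_t = 2.0 * w1 * T[j - 1] / T[j]    # weight on the gradient evaluation
        k = mu * k_prev1 + nu * k_prev2 - mu_t * h * grad_f(k_prev1)
        k_prev2, k_prev1 = k_prev1, k
    return k_prev1

Repeating such super-steps, with s grown according to an estimate of the condition number, is the style of iteration the paper analyses; the default eps above is only a placeholder, not the tuned damping derived in the paper.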

Citation (APA)

Eftekhari, A., Vandereycken, B., Vilmart, G., & Zygalakis, K. C. (2021). Explicit stabilised gradient descent for faster strongly convex optimisation. BIT Numerical Mathematics, 61(1), 119–139. https://doi.org/10.1007/s10543-020-00819-y
