Convergence analysis of the Fast Subspace Descent method for convex optimization problems

  • Chen, L.
  • Hu, X.
  • Wise, S. M.

Abstract

The full approximation storage (FAS) scheme is a widely used multigrid method for nonlinear problems. In this paper, a new framework to design and analyze FAS-like schemes for convex optimization problems is developed. The new method, the Fast Subspace Descent (FASD) scheme, which generalizes the classical FAS scheme, can be recast as an inexact version of nonlinear multigrid methods based on space decomposition and subspace correction. The local problem in each subspace can be simplified to a linear problem, and one gradient descent iteration (with an appropriate step size) is enough to ensure global linear (geometric) convergence of FASD.
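To make the idea of subspace correction with a single gradient step concrete, the listing below is a minimal sketch for a convex quadratic objective f(x) = 0.5 x^T A x - b^T x, with one gradient descent correction (and a conservative step size) per subspace in each sweep. The two-subspace decomposition, the step-size rule, and the helper name subspace_descent are illustrative assumptions, not the paper's exact FASD construction or analysis.

import numpy as np

def subspace_descent(A, b, prolongations, x0, sweeps=100, tol=1e-8):
    """Successive subspace correction: one gradient step per subspace per sweep."""
    x = x0.copy()
    # Conservative step size per subspace: 1 / lambda_max(I_k^T A I_k).
    steps = [1.0 / np.linalg.norm(Ik.T @ A @ Ik, 2) for Ik in prolongations]
    for _ in range(sweeps):
        for Ik, alpha in zip(prolongations, steps):
            g = Ik.T @ (A @ x - b)           # gradient of f restricted to range(I_k)
            x -= alpha * (Ik @ g)            # one gradient-descent correction in the subspace
        if np.linalg.norm(A @ x - b) < tol:  # global residual check
            break
    return x

# Usage: 1D Laplacian with a "coarse" subspace (linear interpolation) and the full
# "fine" space (identity); this two-level split is a hypothetical choice for illustration.
n = 15
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
coarse = np.zeros((n, (n - 1) // 2))
for j in range((n - 1) // 2):
    i = 2 * j + 1
    coarse[i - 1, j], coarse[i, j], coarse[i + 1, j] = 0.5, 1.0, 0.5
fine = np.eye(n)
x = subspace_descent(A, b, [coarse, fine], np.zeros(n))
print(np.linalg.norm(A @ x - b))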

Citation (APA)

Chen, L., Hu, X., & Wise, S. M. (2020). Convergence analysis of the Fast Subspace Descent method for convex optimization problems. Mathematics of Computation, 89(325), 2249–2282. https://doi.org/10.1090/mcom/3526
