We study the ℓ¹-regularized least squares optimization problem in a separable Hilbert space. We show that the iterative soft-thresholding algorithm (ISTA) converges linearly, without any assumption on the linear operator involved or on the problem. The result is obtained by combining two key concepts: the notion of extended support, a finite set containing the support, and the notion of conditioning over finite-dimensional sets. We prove that ISTA identifies the extended support of the solution after a finite number of iterations, and we derive linear convergence from the conditioning property, which is always satisfied for ℓ¹-regularized least squares problems. Our analysis extends to the entire class of thresholding gradient algorithms, for which we provide a conceptually new proof of strong convergence, as well as convergence rates.
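For orientation, the following is a minimal finite-dimensional sketch of the ISTA iteration discussed above, applied to min_x (1/2)||Ax − y||² + λ||x||₁. The function names soft_threshold and ista, the step size 1/||A||², and the fixed iteration count are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    # Componentwise soft-thresholding: the proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, y, lam, n_iter=500):
    # Step size gamma = 1/L, where L = ||A||^2 is the Lipschitz constant
    # of the gradient of the smooth term 0.5 * ||Ax - y||^2.
    L = np.linalg.norm(A, 2) ** 2
    gamma = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)  # gradient of the least squares term
        x = soft_threshold(x - gamma * grad, gamma * lam)
    return x
```

In line with the abstract, the iterates of such a scheme eventually have a support contained in a fixed finite set (the extended support), after which the dynamics are effectively finite-dimensional.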
Garrigos, G., Rosasco, L., & Villa, S. (2020). Thresholding gradient methods in Hilbert spaces: Support identification and linear convergence. ESAIM: Control, Optimisation and Calculus of Variations, 26. https://doi.org/10.1051/cocv/2019011