Backward-forward least angle shrinkage for sparse quadratic optimization

Abstract

In compressed sensing and the statistics community, dozens of algorithms have been developed to solve ℓ1-penalized least-squares regression, but constrained sparse quadratic optimization (SQO) remains an open problem. In this paper, we propose backward-forward least angle shrinkage (BF-LAS), which provides a scheme for solving general SQO, including sparse eigenvalue minimization. BF-LAS starts from the dense solution, iteratively shrinks unimportant variables' magnitudes to zero in the backward step to minimize the ℓ1 norm, decreases important variables' gradients in the forward step to optimize the objective, and projects the solution onto the feasible set defined by the constraints. The importance of a variable is measured by its correlation with respect to the objective and is updated via least angle shrinkage (LAS). We show promising performance of BF-LAS on sparse dimension reduction. © 2010 Springer-Verlag.
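The backward/forward/projection pattern described in the abstract can be illustrated with a minimal sketch. This is a generic implementation of that three-step loop for a toy sparse eigenvalue problem (minimize xᵀAx subject to ‖x‖₂ = 1), not the authors' exact BF-LAS algorithm: the importance split, step sizes, and soft-thresholding schedule below are assumptions made for illustration.

```python
import numpy as np

def bf_shrinkage_sketch(A, lam=0.1, n_iter=200, step=0.05):
    """Illustrative backward-forward shrinkage for min x'Ax s.t. ||x||_2 = 1
    with an l1 sparsity penalty. A generic sketch of the pattern in the
    abstract, NOT the published BF-LAS update rules."""
    # dense start: eigenvector of the smallest eigenvalue (the dense solution)
    x = np.linalg.eigh(A)[1][:, 0].copy()
    for _ in range(n_iter):
        grad = 2.0 * A @ x                   # gradient of the quadratic objective
        corr = np.abs(grad)                  # importance proxy: |correlation| with the objective
        important = corr >= np.median(corr)  # split variables by importance (assumed rule)
        # forward step: gradient descent on the important variables
        x[important] -= step * grad[important]
        # backward step: shrink unimportant magnitudes toward zero (soft threshold)
        x[~important] = np.sign(x[~important]) * np.maximum(
            np.abs(x[~important]) - lam * step, 0.0)
        # projection onto the feasible set (here, the unit sphere)
        nrm = np.linalg.norm(x)
        if nrm > 0:
            x /= nrm
    return x
```

On a diagonal matrix the sparse and dense minimizers coincide, so the loop simply stays at the sparse eigenvector; the sketch is only meant to make the three-step structure concrete.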

Citation (APA)

Zhou, T., & Tao, D. (2010). Backward-forward least angle shrinkage for sparse quadratic optimization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6443 LNCS, pp. 388–396). https://doi.org/10.1007/978-3-642-17537-4_48
