Bag of pursuits and neural gas for improved sparse coding

Abstract

Sparse coding employs low-dimensional subspaces in order to encode high-dimensional signals. Finding the optimal subspaces is a difficult optimization task. We show that stochastic gradient descent is superior to MOD and K-SVD, both state-of-the-art methods, at finding the optimal subspaces. The improvement is most significant in the difficult setting of highly overlapping subspaces. We introduce the so-called "Bag of Pursuits", which is derived from Orthogonal Matching Pursuit. It provides an improved approximation of the optimal sparse coefficients, which in turn significantly improves the performance of the gradient descent approach as well as of MOD and K-SVD. In addition, the "Bag of Pursuits" makes it possible to employ a generalized version of the Neural Gas algorithm for sparse coding, which leads to an even more powerful method. © Springer-Verlag Berlin Heidelberg 2010.
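As background, the pursuit that the "Bag of Pursuits" generalizes is Orthogonal Matching Pursuit: greedily pick the dictionary atom most correlated with the current residual, then re-fit all selected coefficients by least squares. The sketch below is a minimal, illustrative NumPy implementation of standard OMP, not the paper's method; the dictionary, signal, and variable names are assumptions for the example.

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal Matching Pursuit: approximate signal x with at most
    k columns (unit-norm atoms) of dictionary D."""
    residual = x.copy()
    support = []
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        # select the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit all selected coefficients jointly by least squares
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ sol
    coeffs[support] = sol
    return coeffs

# toy example: an orthonormal dictionary, so two atoms recover x exactly
D = np.eye(4)[:, :3]
x = np.array([1.0, 0.0, 2.0, 0.0])
a = omp(D, x, 2)  # sparse coefficient vector with two nonzeros
```

The "Bag of Pursuits" of the paper explores several such greedy selection paths rather than a single one, yielding a set of candidate coefficient vectors from which a better approximation can be chosen.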

Citation (APA)

Labusch, K., Barth, E., & Martinetz, T. (2010). Bag of pursuits and neural gas for improved sparse coding. In Proceedings of COMPSTAT 2010 - 19th International Conference on Computational Statistics, Keynote, Invited and Contributed Papers (pp. 327–336). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-7908-2604-3_30
