Sc2Net: Sparse LSTMs for sparse coding ∗


Abstract

The iterative shrinkage-thresholding algorithm (ISTA) is one of the most popular optimization solvers for computing sparse codes. However, ISTA suffers from the following problems: 1) ISTA employs a non-adaptive updating strategy, learning the parameters on each dimension with a fixed learning rate; such a strategy may lead to inferior performance due to the lack of diversity. 2) ISTA does not incorporate historical information into its updating rules, although such information has been shown to speed up convergence. To address these issues, we propose a novel formulation of ISTA (named adaptive ISTA) by introducing a novel adaptive momentum vector. To solve the proposed adaptive ISTA efficiently, we recast it as a recurrent neural network unit and show its connection with the well-known long short-term memory (LSTM) model. With the newly proposed unit, we present a neural network (termed SC2Net) that computes sparse codes in an end-to-end manner. To the best of our knowledge, this is one of the first works to bridge the ℓ1-solver and the LSTM, and it may provide novel insights for understanding model-based optimization and LSTMs. Extensive experiments show the effectiveness of our method on both unsupervised and supervised tasks.
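For reference, the baseline the abstract builds on can be sketched as plain ISTA: a gradient step on the reconstruction term with a fixed learning rate, followed by soft-thresholding. This is a minimal illustrative sketch (function and variable names are our own, not from the paper), not the paper's adaptive variant:

```python
import numpy as np

def ista(D, x, lam=0.1, alpha=None, n_iter=100):
    """Plain ISTA for: min_z 0.5 * ||x - D z||^2 + lam * ||z||_1.

    D: dictionary (m x n), x: signal (m,), lam: sparsity weight,
    alpha: fixed step size (defaults to 1/L, L the Lipschitz constant).
    """
    if alpha is None:
        # Spectral norm of D squared is the Lipschitz constant of the gradient.
        alpha = 1.0 / np.linalg.norm(D, 2) ** 2
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)      # gradient of the smooth (quadratic) term
        u = z - alpha * grad          # non-adaptive step: same rate on every dimension
        # Soft-thresholding (shrinkage) operator yields exact zeros.
        z = np.sign(u) * np.maximum(np.abs(u) - alpha * lam, 0.0)
    return z
```

Note that `alpha` is the same scalar for every coordinate and every iteration, and no past gradients are reused: these are exactly the two limitations (no per-dimension adaptivity, no historical information) that the adaptive momentum vector in adaptive ISTA is introduced to address.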


Zhou, J. T., Di, K., Du, J., Peng, X., Yang, H., Pan, S. J., … Goh, R. S. M. (2018). Sc2Net: Sparse LSTMs for sparse coding ∗. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 4588–4595). AAAI press. https://doi.org/10.1609/aaai.v32i1.11721
