Semi-supervised sparse coding

Abstract

Sparse coding approximates a data sample as a sparse linear combination of basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes in a semi-supervised manner, where only a few training samples are labeled. Using the manifold structure spanned by both labeled and unlabeled samples, together with the constraints provided by the known labels, we estimate class labels for all samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels can be predicted directly from the sparse codes by a linear classifier. By solving for the codebook, sparse codes, class labels, and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed method over supervised sparse coding methods on partially labeled data sets.
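The ingredients named in the abstract (a codebook, sparse codes, and a linear classifier trained on the codes of the labeled samples) can be sketched as follows. This is a hypothetical, simplified illustration, not the paper's algorithm: it alternates ISTA-style sparse coding with a least-squares codebook update, then fits a ridge classifier on the labeled codes, and omits the manifold-based label estimation term of the full joint objective. All function and parameter names (`semi_supervised_sparse_coding`, `alpha`, `beta`, `n_codewords`) are invented for this sketch.

```python
import numpy as np

def semi_supervised_sparse_coding(X, Y, labeled_mask, n_codewords=8,
                                  alpha=0.1, beta=1.0, n_iter=30, seed=0):
    """Simplified sketch (not the paper's full method).

    X: (d, n) data matrix, one sample per column.
    Y: (c, n) one-hot label matrix; only labeled columns are used.
    labeled_mask: boolean (n,) marking the labeled samples.
    """
    rng = np.random.default_rng(seed)
    d, n = X.shape
    D = rng.standard_normal((d, n_codewords))
    D /= np.linalg.norm(D, axis=0, keepdims=True)  # unit-norm codewords
    S = np.zeros((n_codewords, n))
    for _ in range(n_iter):
        # 1) Sparse coding step: a few ISTA iterations for
        #    min_S ||X - D S||_F^2 + alpha * ||S||_1 with D fixed.
        L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the gradient
        for _ in range(5):
            grad = D.T @ (D @ S - X)
            Z = S - grad / L
            S = np.sign(Z) * np.maximum(np.abs(Z) - alpha / L, 0.0)
        # 2) Codebook step: least-squares update, then renormalize columns.
        D = X @ S.T @ np.linalg.pinv(S @ S.T + 1e-8 * np.eye(n_codewords))
        D /= np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-8)
    # 3) Linear classifier on the labeled sparse codes (ridge regression),
    #    reflecting the assumption that labels are predictable from the codes.
    Sl, Yl = S[:, labeled_mask], Y[:, labeled_mask]
    W = Yl @ Sl.T @ np.linalg.pinv(Sl @ Sl.T + beta * np.eye(n_codewords))
    return D, S, W
```

In the paper's formulation these three steps are coupled in one objective (together with the estimated labels of the unlabeled samples) rather than solved in sequence; the sketch only shows the role each variable plays.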

Citation (APA)

Wang, J. J. Y., & Gao, X. (2014). Semi-supervised sparse coding. In Proceedings of the International Joint Conference on Neural Networks (pp. 1630–1637). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2014.6889449
