Joint clustering and feature selection

Abstract

Due to the absence of class labels, unsupervised feature selection is much more difficult than supervised feature selection. Traditional unsupervised feature selection algorithms usually select features so as to preserve the structure of the data set. Inspired by recent developments in discriminative clustering, we propose in this paper a novel unsupervised feature selection approach via Joint Clustering and Feature Selection (JCFS). Specifically, we integrate the Fisher score into the clustering framework and select those features for which the Fisher criterion is maximized and the manifold structure is best preserved simultaneously. We also reveal the connections between JCFS and other clustering and feature selection methods, such as discriminative K-means, JELSR and DCS. Experimental results on real-world data sets demonstrate the effectiveness of the proposed algorithm. © 2013 Springer-Verlag Berlin Heidelberg.
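The abstract does not give the exact JCFS objective (which also enforces manifold preservation via the data's graph structure), but the general pseudo-label idea can be illustrated with a rough sketch: alternate a clustering step with a per-feature Fisher-score ranking computed against the current cluster assignments. The sketch below is an assumption-laden approximation, not the authors' algorithm; all function and parameter names are hypothetical.

```python
# Illustrative sketch only: alternating K-means clustering with Fisher-score
# feature ranking on pseudo-labels. This is NOT the JCFS formulation from the
# paper, which additionally preserves manifold structure.
import numpy as np
from sklearn.cluster import KMeans


def fisher_scores(X, labels):
    """Per-feature Fisher score: between-cluster scatter / within-cluster scatter."""
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for k in np.unique(labels):
        Xk = X[labels == k]
        between += len(Xk) * (Xk.mean(axis=0) - overall_mean) ** 2
        within += len(Xk) * Xk.var(axis=0)
    return between / (within + 1e-12)


def cluster_and_select(X, n_clusters=3, n_features=10, n_iters=5):
    """Hypothetical joint loop: cluster on selected features, re-rank all features."""
    selected = np.arange(X.shape[1])  # start from all features
    labels = None
    for _ in range(n_iters):
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X[:, selected])
        scores = fisher_scores(X, labels)           # score against pseudo-labels
        selected = np.argsort(scores)[::-1][:n_features]  # keep top-scoring features
    return selected, labels
```

For example, calling `cluster_and_select(X, n_clusters=3, n_features=10)` on a data matrix `X` returns the indices of the ten features that best separate the discovered clusters under the Fisher criterion, together with the final cluster labels.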

Citation (APA)

Du, L., & Shen, Y. D. (2013). Joint clustering and feature selection. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7923 LNCS, pp. 241–252). Springer Verlag. https://doi.org/10.1007/978-3-642-38562-9_25
