Semi-supervised feature selection via rescaled linear regression


Abstract

With the rapid increase of complex and high-dimensional sparse data, demand has grown for new feature selection methods that exploit both labeled and unlabeled data. Least squares regression based feature selection methods usually learn a projection matrix and evaluate the importance of features using that matrix, a practice that lacks theoretical justification. Moreover, these methods cannot find a solution of the projection matrix that is both globally optimal and sparse. In this paper, we propose a novel semi-supervised feature selection method that can learn a globally optimal and sparse solution of the projection matrix. The new method extends the least squares regression model by rescaling the regression coefficients with a set of scale factors, which are then used to rank the features. We show that the new model can learn a globally optimal and sparse solution. Moreover, the introduction of scale factors provides a theoretical explanation for why the projection matrix can be used to rank features. A simple yet effective algorithm with proved convergence is proposed to optimize the new model. Experimental results on eight real-life data sets show the superiority of the method.
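The abstract's central idea, scoring features by the row norms of a regression projection matrix learned with a sparsity-inducing model, can be illustrated with a small sketch. This is not the authors' RLSR optimizer; it is a generic, closely related approach: ℓ2,1-regularized least squares solved by iteratively reweighted least squares, where the normalized row norms play the role of the scale factors described in the abstract. The regularization strength `gamma` and the synthetic data are illustrative assumptions.

```python
import numpy as np

def l21_feature_ranking(X, Y, gamma=0.1, n_iter=30, eps=1e-8):
    """Rank features by row norms of a projection matrix W learned with
    an l2,1-regularized least squares model (IRLS sketch, not the
    authors' exact RLSR algorithm)."""
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix; starts as identity
    for _ in range(n_iter):
        # Closed-form weighted ridge step: (X'X + gamma*D) W = X'Y
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
        row_norms = np.linalg.norm(W, axis=1)
        # Reweighting that induces row sparsity (l2,1 surrogate)
        D = np.diag(1.0 / (2.0 * np.maximum(row_norms, eps)))
    scores = np.linalg.norm(W, axis=1)
    theta = scores / scores.sum()  # normalized "scale factors"
    return np.argsort(-theta), theta

# Illustration on synthetic data where only feature 2 carries signal
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
Y = (3.0 * X[:, 2] + 0.1 * rng.standard_normal(100)).reshape(-1, 1)
rank, theta = l21_feature_ranking(X, Y)
```

Because the reweighting shrinks uninformative rows of W toward zero, the normalized scores concentrate on the informative feature, which is what makes them usable as a feature ranking.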

Citation (APA)
Chen, X., Nie, F., Yuan, G., & Huang, J. Z. (2017). Semi-supervised feature selection via rescaled linear regression. In IJCAI International Joint Conference on Artificial Intelligence (pp. 1525–1531). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2017/211
