Feature selection at the discrete limit


Abstract

Feature selection plays an important role in many machine learning and data mining applications. In this paper, we propose to use the L2,p norm for feature selection, with emphasis on small p. As p → 0, feature selection becomes a discrete feature selection problem. We provide two algorithms: a proximal gradient algorithm and a rank-one update algorithm, which is more efficient at large regularization λ. We give closed-form solutions of the proximal operator at p = 0 and p = 1/2. Experiments on real-life datasets show that features selected at small p consistently outperform features selected at p = 1 (the standard L2,1 approach) and other popular feature selection methods.
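As an illustration of the p = 0 case mentioned above: for an L2,0 penalty applied row-wise, the proximal operator is commonly understood to reduce to hard thresholding of each row's Euclidean norm (a row is kept if λ·1 is cheaper than zeroing it, i.e. if its squared norm exceeds 2λ). The sketch below assumes exactly this standard reduction; the function name `prox_l20_rows` is illustrative, not from the paper.

```python
import numpy as np

def prox_l20_rows(W, lam):
    """Row-wise proximal operator of lam * ||W||_{2,0} (illustrative sketch).

    For each row w, min_x 0.5*||x - w||^2 + lam*1[x != 0] is solved by
    hard thresholding: keep w unchanged if ||w||^2 > 2*lam, else set it to 0.
    """
    W = np.asarray(W, dtype=float)
    out = W.copy()
    row_sq = np.sum(W ** 2, axis=1)      # squared L2 norm of each row (feature)
    out[row_sq <= 2.0 * lam] = 0.0       # zero out rows below the threshold
    return out
```

With λ = 0.5 the threshold on the squared row norm is 1, so a row like [1, 1] (squared norm 2) survives while [0.5, 0.5] (squared norm 0.5) is discarded — mirroring how the discrete limit selects or drops whole features.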

Citation (APA)
Zhang, M., Ding, C., Zhang, Y., & Nie, F. (2014). Feature selection at the discrete limit. In Proceedings of the National Conference on Artificial Intelligence (Vol. 2, pp. 1355–1361). AI Access Foundation. https://doi.org/10.1609/aaai.v28i1.8919
