Feature Selection via l1-Penalized Squared-Loss Mutual Information


Abstract

Feature selection is a technique for screening out less important features. Many existing supervised feature selection algorithms use redundancy and relevance as the main criteria for selecting features. However, feature interaction, potentially a key characteristic of real-world problems, has not received much attention. As an attempt to take feature interaction into account, we propose l1-LSMI, an l1-regularization-based algorithm that maximizes a squared-loss variant of mutual information between selected features and outputs. Numerical results show that l1-LSMI performs well in handling redundancy, detecting non-linear dependency, and considering feature interaction. © 2013 The Institute of Electronics, Information and Communication Engineers.
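To illustrate the selection criterion, the sketch below computes a simple plug-in estimate of squared-loss mutual information (SMI), SMI = 0.5 · Σ p(x)p(y)(p(x,y)/(p(x)p(y)) − 1)², for discrete variables. Note this is only an illustration of the SMI quantity being maximized; the paper's actual LSMI estimator is a least-squares density-ratio method for continuous data, and the function name here is hypothetical.

```python
import numpy as np

def smi_plugin(x, y):
    """Plug-in SMI estimate for discrete x, y:
    0.5 * sum_{a,b} p(a) p(b) * (p(a,b) / (p(a) p(b)) - 1)^2.
    SMI is zero iff x and y are independent (like mutual information),
    which is why maximizing it over feature subsets favors relevant features.
    """
    x, y = np.asarray(x), np.asarray(y)
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    # Empirical joint distribution p(x, y)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1.0)
    joint /= len(x)
    # Marginals p(x), p(y)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    # Density ratio r(x, y) = p(x, y) / (p(x) p(y))
    ratio = joint / (px * py)
    return 0.5 * float(((ratio - 1.0) ** 2 * px * py).sum())
```

For perfectly dependent binary variables this estimate is 0.5, and for a constant (independent) second variable it is 0, matching the intuition that features maximizing SMI with the output are informative.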

Citation (APA)

Jitkrittum, W., Hachiya, H., & Sugiyama, M. (2013). Feature Selection via l1-Penalized Squared-Loss Mutual Information. IEICE Transactions on Information and Systems, E96-D(7), 1513–1524. https://doi.org/10.1587/transinf.E96.D.1513
