Feature selection is a technique for screening out less important features. Many existing supervised feature selection algorithms use redundancy and relevance as the main criteria for selecting features. However, feature interaction, potentially a key characteristic of real-world problems, has received little attention. To take feature interaction into account, we propose l1-LSMI, an l1-regularization-based algorithm that maximizes a squared-loss variant of mutual information between selected features and outputs. Numerical results show that l1-LSMI performs well in handling redundancy, detecting nonlinear dependency, and accounting for feature interaction. © 2013 The Institute of Electronics, Information and Communication Engineers.
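The building block behind l1-LSMI is an estimator of squared-loss mutual information (SMI), obtained by fitting a density-ratio model by least squares, which admits a closed-form solution. The sketch below is illustrative only, not the paper's full method (it omits the l1-penalized feature-weight optimization): it assumes Gaussian kernels, uses all samples as kernel centers, and fixes the kernel width `sigma` and regularizer `lam` by hand, whereas a practical implementation would tune them, e.g. by cross-validation. The function names `gauss_kernel` and `lsmi` are hypothetical.

```python
import numpy as np

def gauss_kernel(a, b, sigma):
    # Pairwise Gaussian kernel matrix between rows of a and rows of b.
    d2 = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2.0 * a @ b.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lsmi(x, y, sigma=1.0, lam=1e-3):
    """Least-squares SMI estimate between x (n, d) and y (n, dy).

    Fits a linear-in-parameters model of the density ratio
    p(x, y) / (p(x) p(y)) by ridge-regularized least squares;
    sigma and lam are fixed here for simplicity (an assumption).
    """
    n = x.shape[0]
    Kx = gauss_kernel(x, x, sigma)          # Kx[i, l] = K(x_i, x_l)
    Ky = gauss_kernel(y, y, sigma)          # Ky[i, l] = L(y_i, y_l)
    # h_l = (1/n) sum_i K(x_i, x_l) L(y_i, y_l)
    h = np.mean(Kx * Ky, axis=0)
    # H_{l,l'} = (1/n^2) [sum_i K(x_i,x_l) K(x_i,x_l')] [sum_j L(y_j,y_l) L(y_j,y_l')]
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2
    alpha = np.linalg.solve(H + lam * np.eye(n), h)
    # Plug-in SMI estimate from the fitted ratio model.
    return 0.5 * h @ alpha - 0.5
```

On strongly dependent data this estimate should come out clearly larger than on independent data, which is what makes it usable as a feature-selection score to be maximized under an l1 constraint.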
CITATION STYLE
Jitkrittum, W., Hachiya, H., & Sugiyama, M. (2013). Feature Selection via l1-Penalized Squared-Loss Mutual Information. IEICE Transactions on Information and Systems, E96-D(7), 1513–1524. https://doi.org/10.1587/transinf.E96.D.1513