A novel feature selection method based on normalized mutual information

136 citations · 62 Mendeley readers
Abstract

In this paper, a novel feature selection method based on a normalization of the well-known mutual information measure is presented. Our method is derived from an existing approach, the max-relevance and min-redundancy (mRMR) approach. We propose, however, to normalize the mutual information used in the method so that neither the relevance term nor the redundancy term can dominate. We employ commonly used recognition models, including the Support Vector Machine (SVM), k-Nearest-Neighbor (kNN), and Linear Discriminant Analysis (LDA), to compare our algorithm with the original mRMR and a recently improved version of it, the Normalized Mutual Information Feature Selection (NMIFS) algorithm. To avoid data-specific conclusions, we conduct our classification experiments on various datasets from the UCI machine learning repository. The results confirm that our feature selection method is more robust than the others with regard to classification accuracy. © 2011 Springer Science+Business Media, LLC.
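The abstract does not give the authors' exact formulas, but the general mRMR-with-normalized-MI idea it describes can be sketched as follows: greedily pick the feature that maximizes normalized relevance to the class label minus average normalized redundancy to the already-selected features. This is an illustrative sketch only; the normalization shown (MI divided by the minimum of the two entropies) and the helper names are assumptions, not the paper's actual method.

```python
import numpy as np
from collections import Counter

def entropy(x):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(x)
    return -sum((c / n) * np.log2(c / n) for c in Counter(x).values())

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete sequences."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def normalized_mi(x, y):
    """MI scaled to [0, 1]; one of several possible normalizations."""
    hx, hy = entropy(x), entropy(y)
    if hx == 0 or hy == 0:  # a constant variable carries no information
        return 0.0
    return mutual_info(x, y) / min(hx, hy)

def select_features(X, y, k):
    """Greedy mRMR-style selection with normalized MI (illustrative).

    Score(f) = NMI(f, y) - mean_{s in selected} NMI(f, s)
    """
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k and remaining:
        best, best_score = None, -np.inf
        for f in remaining:
            relevance = normalized_mi(X[:, f].tolist(), y)
            redundancy = (np.mean([normalized_mi(X[:, f].tolist(),
                                                 X[:, s].tolist())
                                   for s in selected])
                          if selected else 0.0)
            score = relevance - redundancy
            if score > best_score:
                best, best_score = f, score
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy example: feature 0 matches the label, feature 2 duplicates
# feature 0 (redundant), feature 1 is weakly informative.
X = np.array([[0, 1, 0], [1, 0, 1], [0, 0, 0],
              [1, 1, 1], [0, 1, 0], [1, 0, 1]])
y = [0, 1, 0, 1, 0, 1]
print(select_features(X, y, 2))  # feature 0 is picked first
```

Because the score penalizes redundancy, the duplicate feature 2 loses to the weaker but non-redundant feature 1 in the second round, which is the behavior the normalization is meant to preserve.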

CITATION (APA)

Vinh, L. T., Lee, S., Park, Y. T., & D’Auriol, B. J. (2012, July). A novel feature selection method based on normalized mutual information. Applied Intelligence. https://doi.org/10.1007/s10489-011-0315-y
