A hybrid genetic algorithm for feature selection based on mutual information


Abstract

Feature selection aims to reduce the dimensionality of patterns for classificatory analysis by selecting the most informative features rather than irrelevant and/or redundant ones. In this study, a hybrid genetic algorithm for feature selection is presented that combines the advantages of both wrappers and filters. Two stages of optimization are involved. The outer optimization stage performs the global search for the best subset of features in a wrapper fashion, in which the mutual information between the predicted labels of a trained classifier and the true classes serves as the fitness function for the genetic algorithm. The inner optimization stage performs the local search in a filter manner, in which an improved estimate of the conditional mutual information acts as an independent measure for feature ranking. This measure takes into account not only the relevance of a candidate feature to the output classes but also its redundancy with the features already selected. The inner and outer optimizations cooperate with each other to achieve both high global predictive accuracy and high local search efficiency. Experimental results demonstrate both parsimonious feature selection and excellent classification accuracy of the method on a range of benchmark data sets. © 2009 Springer US.
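The inner, filter-style ranking stage can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes discrete-valued features and uses a relevance-minus-redundancy score (in the spirit of mRMR) as a simple stand-in for the chapter's improved conditional mutual information estimate; all function names and the data layout are illustrative.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Estimate I(X; Y) in bits from two paired sequences of discrete values."""
    n = len(x)
    px = Counter(x)
    py = Counter(y)
    pxy = Counter(zip(x, y))
    mi = 0.0
    for (xi, yi), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint / ((px[xi] / n) * (py[yi] / n)))
    return mi

def rank_candidates(features, selected, labels):
    """Score each candidate feature by its relevance to the class labels minus
    its average redundancy with the already-selected features, and return the
    candidates sorted from best to worst.

    features: dict mapping feature name -> list of discrete values per sample
    selected: set of feature names already chosen
    labels:   list of class labels per sample
    """
    scores = {}
    for name, column in features.items():
        if name in selected:
            continue
        relevance = mutual_information(column, labels)
        redundancy = (
            sum(mutual_information(column, features[s]) for s in selected) / len(selected)
            if selected else 0.0
        )
        scores[name] = relevance - redundancy
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

For example, a feature identical to the labels carries one full bit of information about a balanced binary class and ranks above a feature that is nearly independent of the labels.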

Citation (APA)

Huang, J., & Rong, P. (2009). A hybrid genetic algorithm for feature selection based on mutual information. In Information Theory and Statistical Learning (pp. 125–152). Springer US. https://doi.org/10.1007/978-0-387-84816-7_6
