Feature subset selection using differential evolution

Abstract

One of the fundamental motivations for feature selection is to overcome the curse of dimensionality. This chapter develops a novel feature selection algorithm that combines the Differential Evolution (DE) optimization technique with statistical feature distribution measures. The new algorithm, referred to as DEFS, adapts the DE floating-point optimizer to a combinatorial optimization problem, namely feature selection. The proposed DEFS substantially reduces the computational cost while delivering strong performance. DEFS is tested as a search procedure on several datasets of varying dimensionality. Practical results indicate the significance of the proposed DEFS in terms of solution optimality and memory requirements. © 2009 Springer Berlin Heidelberg.
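The core idea described above — driving a continuous DE optimizer through a combinatorial subset-selection problem — can be sketched as follows. Each DE individual is a real-valued weight vector over all features, and the subset is decoded by taking the k highest-weighted features. Note this is a minimal illustrative sketch, not the authors' exact method: the actual DEFS also incorporates statistical feature distribution measures, which are omitted here, and the toy fitness function (a sum of assumed per-feature relevance scores) is a stand-in for a real subset-evaluation criterion.

```python
import random

def defs_sketch(relevance, k, pop_size=30, gens=100, F=0.5, CR=0.9, seed=0):
    """Illustrative DE-based feature subset selection (not the exact DEFS).

    relevance : assumed per-feature relevance scores (toy fitness criterion)
    k         : number of features to select
    """
    rng = random.Random(seed)
    n = len(relevance)

    def decode(vec):
        # Map a continuous weight vector to a subset:
        # indices of the k largest weights are the selected features.
        return sorted(range(n), key=lambda i: -vec[i])[:k]

    def fitness(vec):
        # Toy criterion: sum of relevance scores of the decoded subset.
        # A real implementation would use a classifier score or a
        # statistical distribution measure instead.
        return sum(relevance[i] for i in decode(vec))

    # Random initial population of continuous weight vectors.
    pop = [[rng.random() for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1/bin: mutate with three distinct other individuals,
            # then binomial crossover with the current individual.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(n)
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(n)]
            # Greedy selection: keep the trial if it is at least as good.
            if fitness(trial) >= fitness(pop[i]):
                pop[i] = trial

    best = max(pop, key=fitness)
    return decode(best), fitness(best)

# Hypothetical example: 3 informative features among 8.
relevance = [1.0, 1.0, 1.0] + [0.1] * 5
subset, score = defs_sketch(relevance, k=3)
```

Because the weights are ranked rather than thresholded, the DE mutation and crossover operators can stay entirely in the continuous domain, which is what lets a float-vector optimizer search a discrete subset space.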

Citation (APA)

Khushaba, R. N., Al-Ani, A., & Al-Jumaily, A. (2009). Feature subset selection using differential evolution. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5506 LNCS, pp. 103–110). https://doi.org/10.1007/978-3-642-02490-0_13
