One of the fundamental motivations for feature selection is to overcome the curse of dimensionality. This chapter develops a novel feature selection algorithm that combines the Differential Evolution (DE) optimization technique with statistical feature distribution measures. The new algorithm, referred to as DEFS, adapts DE's floating-point optimizer to the combinatorial problem of feature subset selection. The proposed DEFS substantially reduces computational cost while delivering strong performance. DEFS is tested as a search procedure on datasets of varying dimensionality. Experimental results demonstrate the significance of the proposed DEFS in terms of solution optimality and memory requirements. © 2009 Springer Berlin Heidelberg.
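The abstract describes applying DE, a real-valued optimizer, to the combinatorial task of choosing a feature subset. The sketch below is a minimal illustration of that idea, not the paper's actual DEFS: it assumes a simple top-k decoding (each candidate is a float vector over all features, and the k features with the largest weights form the subset), whereas the paper additionally uses statistical feature distribution measures. The toy fitness function and all parameter values are illustrative.

```python
import numpy as np

def de_feature_select(fitness, n_features, k, pop_size=20, gens=100,
                      F=0.5, CR=0.9, seed=0):
    """Select k of n_features by DE over real-valued weight vectors.

    This is an illustrative sketch: each population member is a float
    vector, decoded to a subset by taking the k largest weights.
    """
    rng = np.random.default_rng(seed)
    pop = rng.random((pop_size, n_features))

    def decode(v):
        # Subset = indices of the k largest weights in the float vector.
        return np.argsort(v)[-k:]

    scores = np.array([fitness(decode(v)) for v in pop], dtype=float)
    for _ in range(gens):
        for i in range(pop_size):
            # Classic DE/rand/1 mutation from three distinct other members.
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, size=3, replace=False)]
            mutant = a + F * (b - c)
            # Binomial crossover; force at least one mutant component.
            cross = rng.random(n_features) < CR
            cross[rng.integers(n_features)] = True
            trial = np.where(cross, mutant, pop[i])
            s = fitness(decode(trial))
            if s >= scores[i]:  # greedy selection (ties accepted to allow drift)
                pop[i], scores[i] = trial, s
    return sorted(int(j) for j in decode(pop[np.argmax(scores)]))

# Toy demonstration: reward subsets overlapping a known "informative" set.
informative = {0, 2, 5}
best = de_feature_select(lambda idx: len(informative & set(idx)),
                         n_features=10, k=3)
print(best)
```

In a real wrapper setting, the fitness function would instead evaluate a classifier's accuracy on the candidate subset; the DE loop itself is unchanged.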
Khushaba, R. N., Al-Ani, A., & Al-Jumaily, A. (2009). Feature subset selection using differential evolution. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5506 LNCS, pp. 103–110). https://doi.org/10.1007/978-3-642-02490-0_13