Feature construction and dimension reduction using genetic programming

Abstract

This paper describes a new approach to using genetic programming (GP) for feature construction in classification problems. Rather than wrapping a particular classifier to construct a single feature, as in most existing methods, this approach uses GP to construct multiple (high-level) features from the original features. These constructed features are then used by decision trees for classification. Because feature construction is independent of classification, the fitness function is designed based on class dispersion and entropy. The approach is evaluated on 12 benchmark classification problems and compared with the standard decision tree method, using the original features alone and using a combination of the original and constructed features. The results show that the new approach outperforms the standard way of using decision trees on these problems in terms of classification performance, dimension reduction, and the size of the learned decision tree. © Springer-Verlag Berlin Heidelberg 2007.
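The core idea, constructing a new feature as a GP expression over the original features and scoring it by a class-entropy-based fitness rather than by a wrapped classifier, can be sketched as follows. This is a minimal stdlib-only illustration, not the authors' implementation: the toy dataset, the operator set, the median-threshold split, and the random-search stand-in for full GP evolution are all assumptions made for brevity. The fitness here is the weighted class entropy after splitting the constructed feature at its median, a simple proxy for the paper's dispersion/entropy measure (lower is better).

```python
import math
import random

random.seed(0)

# Toy dataset (hypothetical): two original features, binary class labels.
X = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(60)]
y = [1 if x0 + 0.5 * x1 > 0.75 else 0 for x0, x1 in X]

# Function set for the GP expression trees.
OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
}

def random_tree(depth=2):
    """Grow a random expression tree over the original features."""
    if depth == 0 or random.random() < 0.3:
        return ("x", random.randrange(2))  # terminal: original feature index
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, xs):
    """Evaluate a tree on one instance to get the constructed feature value."""
    if tree[0] == "x":
        return xs[tree[1]]
    return OPS[tree[0]](evaluate(tree[1], xs), evaluate(tree[2], xs))

def entropy(labels):
    """Shannon entropy of a class-label list."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def fitness(tree):
    """Weighted class entropy after splitting the constructed feature
    at its median value; independent of any particular classifier."""
    vals = [evaluate(tree, xs) for xs in X]
    thresh = sorted(vals)[len(vals) // 2]
    left = [yi for v, yi in zip(vals, y) if v <= thresh]
    right = [yi for v, yi in zip(vals, y) if v > thresh]
    n = len(y)
    score = 0.0
    if left:
        score += len(left) / n * entropy(left)
    if right:
        score += len(right) / n * entropy(right)
    return score

# Random search over candidate trees, standing in for full GP evolution
# (selection, crossover, mutation) to keep the sketch short.
best = min((random_tree() for _ in range(200)), key=fitness)
print("constructed-feature entropy:", fitness(best))
print("original class entropy:    ", entropy(y))
```

In the paper's full setup, multiple such trees are evolved (one per constructed feature) and the resulting feature set is handed to a decision tree learner; since conditioning cannot increase entropy, a good constructed feature drives the split entropy well below the class entropy of the raw labels.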

Citation (APA)

Neshatian, K., Zhang, M., & Johnston, M. (2007). Feature construction and dimension reduction using genetic programming. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4830 LNAI, pp. 160–170). Springer Verlag. https://doi.org/10.1007/978-3-540-76928-6_18
