Reducing dimensionality to improve search in semantic genetic programming

Abstract

Genetic programming approaches are moving from analysing the syntax of individual solutions to looking into their semantics. One common definition of the semantic space in the context of symbolic regression is an n-dimensional space, where n corresponds to the number of training examples. In problems where this number is high, the search process can become harder as the number of dimensions increases. Geometric semantic genetic programming (GSGP) explores the semantic space by performing geometric semantic operations; by construction, the fitness landscape seen by GSGP is guaranteed to be conic. Intuitively, a lower number of dimensions can make search more feasible in this scenario, decreasing the chances of overfitting the data and reducing the number of evaluations required to find a suitable solution. This paper proposes two approaches for dimensionality reduction in GSGP: (i) applying existing instance selection methods as a pre-processing step before the training points are given to GSGP; and (ii) incorporating instance selection into the evolution of GSGP. Experiments on 15 datasets show that GSGP performance is improved by using instance reduction during the evolution.
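The core ideas in the abstract (a program's semantics as a point in an n-dimensional space defined by the n training cases, geometric operators acting on that space, and instance selection as a way to shrink n) can be illustrated with a minimal Python sketch. This is not the authors' implementation: the function names are hypothetical, and the uniform random subset below merely stands in for the instance selection methods the paper actually evaluates.

```python
import numpy as np

def semantics(program, X):
    """Semantic vector of a program: its outputs on the n training inputs.
    This vector is a point in an n-dimensional semantic space, where n is
    the number of training examples."""
    return np.array([program(x) for x in X])

def geometric_crossover(s_parent1, s_parent2, r):
    """Geometric semantic crossover viewed in semantic space: the offspring's
    semantics is a convex combination of the parents' semantics, so it lies
    on the segment between them (r drawn uniformly from [0, 1])."""
    return r * s_parent1 + (1 - r) * s_parent2

def select_instances(X, y, keep_ratio=0.5, seed=None):
    """Illustrative pre-processing step in the spirit of approach (i):
    keep a subset of the training instances, reducing the dimensionality
    of the semantic space from n to roughly keep_ratio * n. A random
    subset is a placeholder only."""
    rng = np.random.default_rng(seed)
    n = len(X)
    idx = rng.choice(n, size=max(1, int(keep_ratio * n)), replace=False)
    return X[idx], y[idx]

# Toy usage: target f(x) = x^2 on 100 training points, reduced to 50.
X = np.linspace(-1, 1, 100)
y = X ** 2
X_red, y_red = select_instances(X, y, keep_ratio=0.5, seed=0)

p1 = lambda x: x            # candidate program 1
p2 = lambda x: 2 * x ** 2   # candidate program 2
s_child = geometric_crossover(semantics(p1, X_red), semantics(p2, X_red), r=0.3)
rmse = np.sqrt(np.mean((s_child - y_red) ** 2))  # fitness in the reduced space
print(f"offspring RMSE on reduced training set: {rmse:.4f}")
```

The sketch only shows why fewer training instances means a lower-dimensional semantic space; the paper's second approach, interleaving instance selection with the GSGP run itself, is not reproduced here.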

Citation (APA)

Oliveira, L. O. V. B., Miranda, L. F., Pappa, G. L., Otero, F. E. B., & Takahashi, R. H. C. (2016). Reducing dimensionality to improve search in semantic genetic programming. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9921 LNCS, pp. 375–385). Springer Verlag. https://doi.org/10.1007/978-3-319-45823-6_35
