Grammar guided genetic programming for flexible neural trees optimization


Abstract

In our previous studies, Genetic Programming (GP), Probabilistic Incremental Program Evolution (PIPE), and Ant Programming (AP) have been used for the optimal design of the Flexible Neural Tree (FNT). In this paper, Grammar Guided Genetic Programming (GGGP) is employed to optimize the architecture of the FNT model. Based on predefined instruction sets, a flexible neural tree model can be created and evolved. This framework allows input variable selection, over-layer connections, and different activation functions for the individual nodes. The free parameters embedded in the neural tree are optimized by a particle swarm optimization (PSO) algorithm. Empirical results on stock index prediction problems indicate that the proposed method outperforms neural network and genetic programming forecasting models. © Springer-Verlag Berlin Heidelberg 2007.
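The abstract does not reproduce the update equations for the parameter-optimization step. As a rough, hedged illustration only, the sketch below shows a generic particle swarm optimization loop of the kind the paper refers to; the function and parameter names (`pso_optimize`, `objective`, the inertia/acceleration constants) are hypothetical choices for this sketch, not taken from the paper, and the objective would in practice be the prediction error of a fixed tree with respect to its embedded activation-function parameters.

```python
import numpy as np

def pso_optimize(objective, dim, n_particles=30, n_iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-1.0, 1.0)):
    """Minimize `objective` over `dim` real parameters with a basic PSO loop.

    Hypothetical sketch: in the FNT setting, `dim` would be the number of free
    parameters embedded in a candidate tree and `objective` its forecast error.
    """
    lo, hi = bounds
    pos = np.random.uniform(lo, hi, size=(n_particles, dim))  # particle positions
    vel = np.zeros_like(pos)                                   # particle velocities
    pbest = pos.copy()                                         # personal best positions
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()                 # global best position
    gbest_val = pbest_val.min()

    for _ in range(n_iters):
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        # Standard velocity update: inertia + cognitive pull + social pull.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if vals.min() < gbest_val:
            gbest, gbest_val = pos[vals.argmin()].copy(), vals.min()
    return gbest, gbest_val
```

In the paper's framework, such a loop would run as an inner optimization after GGGP has fixed the tree structure, alternating structure search and parameter tuning.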

Citation (APA)

Wu, P., & Chen, Y. (2007). Grammar guided genetic programming for flexible neural trees optimization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4426 LNAI, pp. 964–971). Springer Verlag. https://doi.org/10.1007/978-3-540-71701-0_108
