Extending genetic programming to evolve perceptron-like learning programs


Abstract

We extend genetic programming (GP) with a local memory and vectorization to evolve simple, perceptron-like programs capable of learning by error correction. The local memory allows a scalar value or a vector to be stored and manipulated within a local scope of the GP tree. Vectorization consists of grouping input variables and processing them as vectors. We demonstrate that these extensions, together with an island model, allow the evolution of general perceptron-like programs, i.e., programs that work for any number of inputs. This is unlike standard GP, where inputs are represented explicitly as scalars, so that scaling up the problem would require evolving a new solution. Moreover, we find that vectorization allows programs to be represented more compactly and facilitates the evolutionary search. © 2010 Springer-Verlag.
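The abstract describes the target behavior rather than the evolved code itself. As a rough illustration of what a general, vector-based perceptron-like learner looks like, the sketch below implements the classic error-correction (perceptron) rule in Python: the weight vector plays the role of the local memory, and all inputs are handled as a single vector, so the same program works for any number of inputs. The function and parameter names (perceptron_like_program, eta, epochs) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def perceptron_like_program(samples, eta=0.1, epochs=10):
    """Illustrative error-correction learner (not the authors' evolved program).

    The weight vector acts as the 'local memory'; inputs are processed as one
    vector, so the same code handles any input dimensionality.
    """
    n_inputs = samples[0][0].shape[0]
    w = np.zeros(n_inputs)            # local memory: one weight per input
    b = 0.0                           # bias term
    for _ in range(epochs):
        for x, target in samples:     # target is -1 or +1
            output = 1.0 if np.dot(w, x) + b >= 0 else -1.0
            error = target - output   # error correction: non-zero only on mistakes
            w += eta * error * x      # vectorized update over all inputs at once
            b += eta * error
    return w, b

# Usage: learn logical AND on 2 inputs; the same code would handle any input count.
data = [(np.array([0.0, 0.0]), -1), (np.array([0.0, 1.0]), -1),
        (np.array([1.0, 0.0]), -1), (np.array([1.0, 1.0]), 1)]
w, b = perceptron_like_program(data)
```

Because the update is written over whole vectors rather than per-scalar inputs, the same logic scales to any input dimensionality, which is the kind of generality the vectorized GP representation is meant to capture.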

Citation (APA)

Suchorzewski, M. (2010). Extending genetic programming to evolve perceptron-like learning programs. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6114 LNAI, pp. 221–228). https://doi.org/10.1007/978-3-642-13232-2_27
