Evolution of cubic spline activation functions for artificial neural networks

Abstract

The most common (often the only) choice of activation function (AF) for the multi-layer perceptrons (MLPs) widely used in research, engineering, and business is the logistic function. Among the reasons for this popularity are its boundedness in the unit interval, the fast computability of the function and its derivative, and a number of amenable mathematical properties in the realm of approximation theory. However, considering the huge variety of problem domains in which MLPs are applied, it is intriguing to suspect that specific problems call for specific activation functions. Biological neural networks, with their enormous variety of neurons mastering a set of complex tasks, may be taken as motivation for this hypothesis. We present a number of experiments evolving the structure and activation functions of generalized multi-layer perceptrons (GMLPs), using the parallel netGEN system to train the evolved architectures. For the evolution of activation functions we employ cubic splines, and we compare the evolved cubic spline ANNs with evolved sigmoid ANNs on synthetic classification problems that allow conclusions with respect to the shape of the decision boundaries. An interesting observation concerning Minsky's Paradox is also reported. © Springer-Verlag Berlin Heidelberg 2001.
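To make the idea concrete, the following is a minimal sketch of a cubic spline activation function as contrasted with the logistic sigmoid. The knot positions and values here are hypothetical placeholders (the paper evolves them with a genetic algorithm; this sketch only shows how such an AF would be evaluated): a natural cubic spline is fitted through a few control points and clamped to its end values outside the knot range, mimicking the saturation of a sigmoid.

```python
import math


def logistic(x):
    """Standard logistic sigmoid, the baseline AF discussed in the paper."""
    return 1.0 / (1.0 + math.exp(-x))


def natural_cubic_spline(xs, ys):
    """Return a callable evaluating the natural cubic spline through (xs, ys).

    xs must be strictly increasing. Boundary second derivatives are zero
    (the 'natural' condition); outside the knot range the function is
    clamped to its end values -- a saturation choice made for this sketch,
    not something prescribed by the paper.
    """
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]
    # Tridiagonal system for the knot second derivatives M_0..M_n,
    # with M_0 = M_n = 0 for a natural spline (Thomas algorithm).
    a = [0.0] * (n + 1)  # sub-diagonal
    b = [1.0] * (n + 1)  # diagonal
    c = [0.0] * (n + 1)  # super-diagonal
    d = [0.0] * (n + 1)  # right-hand side
    for i in range(1, n):
        a[i] = h[i - 1]
        b[i] = 2.0 * (h[i - 1] + h[i])
        c[i] = h[i]
        d[i] = 6.0 * ((ys[i + 1] - ys[i]) / h[i]
                      - (ys[i] - ys[i - 1]) / h[i - 1])
    # Forward elimination.
    for i in range(1, n + 1):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    # Back substitution.
    M = [0.0] * (n + 1)
    for i in range(n - 1, 0, -1):
        M[i] = (d[i] - c[i] * M[i + 1]) / b[i]

    def spline(x):
        if x <= xs[0]:
            return ys[0]
        if x >= xs[n]:
            return ys[n]
        i = 0
        while x > xs[i + 1]:
            i += 1
        t0, t1 = xs[i + 1] - x, x - xs[i]
        return (M[i] * t0 ** 3 / (6 * h[i])
                + M[i + 1] * t1 ** 3 / (6 * h[i])
                + (ys[i] / h[i] - M[i] * h[i] / 6) * t0
                + (ys[i + 1] / h[i] - M[i + 1] * h[i] / 6) * t1)

    return spline


# Hypothetical evolved knots giving a roughly sigmoid-shaped, bounded AF.
spline_af = natural_cubic_spline([-4.0, -2.0, 0.0, 2.0, 4.0],
                                 [0.0, 0.1, 0.5, 0.9, 1.0])
```

In an evolutionary setting, the genome would encode the knot values (and possibly positions), so that mutation and crossover reshape the AF; a non-monotone set of knots yields activation shapes, and hence decision boundaries, that a logistic unit cannot produce.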

Citation (APA)

Mayer, H. A., & Schwaiger, R. (2001). Evolution of cubic spline activation functions for artificial neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2258 LNAI, pp. 63–73). Springer Verlag. https://doi.org/10.1007/3-540-45329-6_10
