Estimating the size of neural networks from the number of available training data

Abstract

Estimating a priori the size of a neural network needed to achieve high classification accuracy is a hard problem. Existing studies provide theoretical upper bounds on the size of neural networks, but these bounds are unrealistic to implement. This work provides a computational study for estimating the size of a neural network, using the size of the available training data as the estimation parameter. We also show that the size of a neural network is problem dependent, and that the number of available training data alone suffices to determine the size of the network required for a high classification rate. For our experiments we use a threshold neural network that combines the perceptron algorithm with simulated annealing, and we test our results on datasets from the UCI Machine Learning Repository. Based on our experimental results, we propose a formula for estimating the number of perceptrons that must be trained in order to achieve high classification accuracy. © Springer-Verlag Berlin Heidelberg 2007.
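The paper's proposed size-estimation formula and network construction are given in the full text and are not reproduced here. As a rough illustration of the training primitive the abstract describes, the following is a minimal Python sketch (not the paper's code) of a single threshold perceptron trained with perceptron-rule updates whose acceptance is governed by a simulated-annealing criterion on the training error. All function names, hyperparameters, and the toy data are illustrative assumptions.

```python
import numpy as np

def train_perceptron_sa(X, y, n_iter=5000, T0=1.0, alpha=0.999, lr=0.1, seed=0):
    """Sketch: train one threshold perceptron (labels in {-1, +1}) using
    perceptron updates filtered by a simulated-annealing acceptance test."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])          # append a constant bias input
    w = rng.normal(scale=0.01, size=d + 1)        # small random initial weights

    def errors(w):
        # number of misclassified training samples under the threshold unit
        return int(np.sum(np.sign(Xb @ w) != y))

    best_w, best_e = w.copy(), errors(w)
    T = T0
    for _ in range(n_iter):
        i = rng.integers(n)                       # pick a random training sample
        if np.sign(Xb[i] @ w) == y[i]:
            T *= alpha                            # correct: just cool and move on
            continue
        w_new = w + lr * y[i] * Xb[i]             # perceptron update on the mistake
        dE = errors(w_new) - errors(w)
        # accept improving moves outright; accept worsening moves with
        # Boltzmann probability exp(-dE / T), as in simulated annealing
        if dE <= 0 or rng.random() < np.exp(-dE / max(T, 1e-12)):
            w = w_new
            if errors(w) < best_e:
                best_w, best_e = w.copy(), best_e if False else errors(w)
        T *= alpha                                # geometric cooling schedule
    return best_w, best_e

# Example usage on a hypothetical, linearly separable toy problem:
X = np.array([[0., 0.], [1., 1.], [0., 1.], [1., 0.]])
y = np.array([-1, 1, -1, 1])                      # class depends on the first feature
w, e = train_perceptron_sa(X, y)
```

In hybrid schemes of this kind, the annealing criterion lets occasional error-increasing updates through early in training, which can help escape weight configurations where the pure perceptron rule stalls on non-separable data; a network of such trained perceptrons is then what the paper's formula sizes from the number of training samples.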

Citation (APA)

Lappas, G. (2007). Estimating the size of neural networks from the number of available training data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4668 LNCS, pp. 68–77). Springer Verlag. https://doi.org/10.1007/978-3-540-74690-4_8
