Repeated measures multiple comparison procedures applied to model selection in neural networks


Abstract

One of the main research concerns in neural networks is finding the appropriate network size in order to balance the trade-off between overfitting and poor approximation. In this paper, the choice among competing models fitted to the same data set is addressed by applying statistical methods for model comparison. The study was conducted to find a range of models that perform equally well as the cost of complexity varies. If they do, the generalization error estimates should be about the same across the set of models. If they do not, the estimates should differ, and our job consists of analyzing the pairwise differences between the smallest generalization error estimate and each of the others, in order to bound the set of models that might yield equal performance. The method is illustrated on polynomial regression and RBF neural networks. © Springer-Verlag Berlin Heidelberg 2001.
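The selection procedure the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the paper's exact repeated-measures procedure: it uses a plain paired t statistic over per-fold error estimates and a fixed critical value (2.262, the two-sided 5% point for 9 degrees of freedom, i.e. 10 folds) to decide which models are statistically indistinguishable from the one with the smallest estimated generalization error. The function and variable names are assumptions for the sketch.

```python
import math

def paired_t(a, b):
    """Paired t statistic for two equal-length samples of per-fold errors.
    Assumes the fold-wise differences are not all identical (nonzero variance)."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / math.sqrt(var / n)

def indistinguishable_models(fold_errors, t_crit):
    """fold_errors: dict mapping model name -> list of per-fold error estimates
    (same folds for every model, so the samples are paired).
    Returns (best, keep): the model with the smallest mean error, and the list
    of models whose paired t statistic against it stays within t_crit, i.e.
    models that might perform equally well."""
    best = min(fold_errors, key=lambda m: sum(fold_errors[m]) / len(fold_errors[m]))
    keep = [best]
    for m, errs in fold_errors.items():
        if m == best:
            continue
        if abs(paired_t(errs, fold_errors[best])) <= t_crit:
            keep.append(m)
    return best, keep
```

For example, given 10-fold error estimates for three models where one tracks the best model closely and one is uniformly worse, the procedure keeps the first and rejects the second, bounding the set of candidate models as described in the abstract.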

Citation (APA)

Guerrero Vázquez, E., Yañez Escolano, A., Galindo Riaño, P., & Pizarro Junquera, J. (2001). Repeated measures multiple comparison procedures applied to model selection in neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2085 LNCS, pp. 88–95). Springer Verlag. https://doi.org/10.1007/3-540-45723-2_10
