Equivalent number of degrees of freedom for neural networks


Abstract

The notion of an equivalent number of degrees of freedom (e.d.f.) for neural network modeling from small datasets was introduced in Ingrassia and Morlini (2005). The e.d.f. is much smaller than the total number of parameters and does not depend on the number of input variables. We generalize our previous results and discuss the use of the e.d.f. in the general framework of multivariate nonparametric model selection. Through numerical simulations, we also investigate the behavior of model selection criteria such as AIC, GCV and BIC/SBC when the e.d.f. is used in place of the total number of adaptive parameters in the model.
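To illustrate the substitution the abstract describes, the sketch below computes the standard Gaussian-likelihood forms of AIC, BIC/SBC and GCV with the degrees of freedom passed as a parameter, so that the e.d.f. can be supplied in place of the raw parameter count. This is a hedged illustration of the general criteria, not the paper's specific procedure; the function name and formulas (classical definitions with RSS/n as the variance estimate) are our assumptions.

```python
import math

def selection_criteria(rss, n, df):
    """Model selection criteria for a fitted regression model.

    rss : residual sum of squares
    n   : sample size
    df  : degrees of freedom; following the abstract's proposal,
          pass the e.d.f. here rather than the total number of
          adaptive parameters in the network.
    """
    sigma2 = rss / n  # ML estimate of the error variance
    aic = n * math.log(sigma2) + 2 * df
    bic = n * math.log(sigma2) + df * math.log(n)  # also called SBC
    gcv = sigma2 / (1.0 - df / n) ** 2  # generalized cross-validation
    return aic, bic, gcv
```

Because the e.d.f. is much smaller than the parameter count, using it as `df` yields a lighter complexity penalty, which is precisely what the simulations in the paper examine.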

Citation (APA)

Ingrassia, S., & Morlini, I. (2007). Equivalent number of degrees of freedom for neural networks. In Studies in Classification, Data Analysis, and Knowledge Organization (pp. 229–236). Kluwer Academic Publishers. https://doi.org/10.1007/978-3-540-70981-7_26
