Asymptotic law of likelihood ratio for multilayer perceptron models


Abstract

We consider regression models involving multilayer perceptrons (MLP) with one hidden layer and additive Gaussian noise. The data are assumed to be generated by a true MLP model, and the parameters of the MLP are estimated by maximizing the likelihood of the model. When the number of hidden units of the model is over-estimated, the Fisher information matrix of the model is singular, and the asymptotic behavior of the likelihood ratio (LR) statistic is unknown, or can even be divergent if the set of possible parameters is too large. This paper deals with this case and gives the exact asymptotic law of the LR statistic. Namely, if the parameters of the MLP lie in a suitable compact set, we show that the LR statistic converges to the maximum of the square of a Gaussian process indexed by a class of limit score functions. © 2008 Springer-Verlag Berlin Heidelberg.
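The statement in the abstract can be sketched in formulas. This is only an illustrative rendering with assumed notation (the symbols $\ell_n$, $\Theta$, $\theta^0$, $W$, and $\mathcal{D}$ are not taken verbatim from the paper):

```latex
% Log-likelihood \ell_n over n observations, MLE over a suitable compact
% parameter set \Theta, and a true parameter \theta^0. The LR statistic is
LR_n \;=\; 2\Bigl(\,\sup_{\theta\in\Theta}\ell_n(\theta)\;-\;\ell_n(\theta^0)\Bigr).

% Claimed limit: with (W_d)_{d\in\mathcal{D}} a centered Gaussian process
% indexed by the class \mathcal{D} of limit score functions,
LR_n \;\xrightarrow{\;d\;}\; \sup_{d\in\mathcal{D}} \bigl(W_d\bigr)^2 .
```

In a regular model, $LR_n$ would converge to a $\chi^2$ distribution; here the singularity of the Fisher information matrix under over-parametrization replaces that limit by the supremum of a squared Gaussian process.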

Citation (APA)

Rynkiewicz, J. (2008). Asymptotic law of likelihood ratio for multilayer perceptron models. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5263 LNCS, pp. 186–195). Springer Verlag. https://doi.org/10.1007/978-3-540-87732-5_21
