Bayesian regularization in constructive neural networks


Abstract

In this paper, we study the incorporation of Bayesian regularization into constructive neural networks. The degree of regularization is controlled automatically within the Bayesian inference framework and hence requires no manual setting. Simulations show that regularization, with input training using a full Bayesian approach, produces networks with better generalization performance and lower susceptibility to over-fitting as the network size increases. Regularization with input training under MacKay's evidence framework, however, does not produce a significant improvement on the problems tested.
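The automatic control of the regularization degree that the abstract refers to can be illustrated with MacKay's evidence approximation in its simplest setting, Bayesian linear regression. This is a minimal sketch (not the paper's constructive-network algorithm): the weight-prior precision `alpha` and noise precision `beta` are re-estimated from the data via the effective number of parameters `gamma`, so no regularization strength is set by hand. All names and the polynomial-basis example are illustrative assumptions.

```python
import numpy as np

def evidence_regression(Phi, t, n_iter=50):
    """Bayesian linear regression with MacKay-style evidence re-estimation.

    alpha (prior precision, i.e. regularization strength) and beta
    (noise precision) are updated automatically from the data, so the
    degree of regularization needs no manual tuning.
    """
    N, M = Phi.shape
    alpha, beta = 1.0, 1.0                            # initial precisions
    eig = np.linalg.eigvalsh(Phi.T @ Phi)             # eigenvalues of the data term
    for _ in range(n_iter):
        A = alpha * np.eye(M) + beta * Phi.T @ Phi    # posterior precision matrix
        m = beta * np.linalg.solve(A, Phi.T @ t)      # posterior mean of the weights
        lam = beta * eig
        gamma = np.sum(lam / (lam + alpha))           # effective number of parameters
        alpha = gamma / (m @ m)                       # re-estimate prior precision
        beta = (N - gamma) / np.sum((t - Phi @ m) ** 2)  # re-estimate noise precision
    return m, alpha, beta

# Usage (illustrative): fit noisy sine data with a degree-5 polynomial basis.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
t = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(30)
Phi = np.vander(x, 6, increasing=True)
m, alpha, beta = evidence_regression(Phi, t)
```

Because `alpha` shrinks toward zero only insofar as the data support large weights, over-parameterized models are penalized automatically, which is the behaviour the abstract reports for the full Bayesian approach on growing networks.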

Citation (APA)
Kwok, T. Y., & Yeung, D. Y. (1996). Bayesian regularization in constructive neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1112 LNCS, pp. 557–562). Springer Verlag. https://doi.org/10.1007/3-540-61510-5_95
