In this paper, we propose a learning method called Corrected Error Backpropagation, which maximizes a corrected log-likelihood that acts like the Akaike Information Criterion. To maximize the corrected log-likelihood, we introduce a temperature parameter, and we derive an optimal schedule for it. Applying our method to a linear regression model on the Boston house price estimation problem and to multi-layered perceptrons on the DELVE datasets, we obtain good results. © 2009 Springer Berlin Heidelberg.
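As a rough illustration of the idea, the sketch below computes an AIC-style corrected log-likelihood for a Gaussian linear regression, with a temperature parameter `tau` weighting the complexity correction. This is a minimal sketch under assumed forms: the abstract does not give the paper's exact correction term or temperature schedule, so the penalty `tau * k` and the function name are illustrative, not the authors' formulation.

```python
import numpy as np

def corrected_log_likelihood(X, y, beta, tau=1.0):
    """AIC-style corrected log-likelihood for Gaussian linear regression.

    tau is a hypothetical temperature weighting the complexity penalty;
    tau = 1 recovers an AIC-like correction (up to the conventional
    factor of 2). The paper's actual correction may differ.
    """
    n = len(y)
    resid = y - X @ beta
    sigma2 = np.mean(resid ** 2)           # MLE of the noise variance
    # Gaussian log-likelihood evaluated at the MLE variance
    ll = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = X.shape[1] + 1                     # regression weights plus noise variance
    return ll - tau * k                    # temperature-weighted complexity penalty

# Synthetic data stands in for the Boston housing problem here.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 3))])
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=100)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(corrected_log_likelihood(X, y, beta_hat, tau=1.0))
```

Raising `tau` penalizes model complexity more heavily, which is how a schedule over the temperature can drive automatic model selection.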
CITATION STYLE
Sekino, M., & Nitta, K. (2009). Automatic model selection via corrected error backpropagation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5507 LNCS, pp. 220–227). https://doi.org/10.1007/978-3-642-03040-6_27