Automatic model selection via corrected error backpropagation

Abstract

In this paper, we propose a learning method called Corrected Error Backpropagation, which maximizes a corrected log-likelihood that behaves like the Akaike Information Criterion. To maximize the corrected log-likelihood, we introduce a temperature parameter, and we also derive an optimal schedule for this parameter. Applying the method to a linear regression model on the Boston house price estimation problem and to multilayer perceptrons on the DELVE datasets gives good results. © 2009 Springer Berlin Heidelberg.
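
The sketch below is a rough illustration only, not the authors' algorithm (the full paper is available via the DOI below). It performs gradient ascent on an AIC-style penalized log-likelihood for linear regression, where the parameter-count penalty is smoothed by a temperature T that is annealed toward zero during training, so irrelevant weights are driven to zero. The gate tanh(|w|/T), the annealing schedule, the synthetic data, and the fixed noise variance are all assumptions made for this example.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: only the first 3 of 10 features are relevant.
n, d = 200, 10
X = rng.normal(size=(n, d))
true_w = np.zeros(d)
true_w[:3] = [2.0, -1.5, 0.5]
sigma = 0.5  # noise standard deviation, treated as known for simplicity
y = X @ true_w + rng.normal(scale=sigma, size=n)

def corrected_objective(w, T):
    # Gaussian log-likelihood minus a smooth, temperature-controlled
    # surrogate of the AIC parameter-count penalty (log L - k).
    resid = y - X @ w
    loglik = (-0.5 * n * np.log(2 * np.pi * sigma**2)
              - 0.5 * np.sum(resid**2) / sigma**2)
    soft_count = np.sum(np.tanh(np.abs(w) / T))  # -> number of nonzero weights as T -> 0
    return loglik - soft_count

def gradient(w, T):
    resid = y - X @ w
    grad_loglik = X.T @ resid / sigma**2
    # d/dw tanh(|w|/T) = sign(w) * (1 - tanh^2(|w|/T)) / T
    t = np.tanh(np.abs(w) / T)
    grad_penalty = np.sign(w) * (1 - t**2) / T
    return grad_loglik - grad_penalty

w = np.zeros(d)
lr = 1e-4
for epoch in range(3000):
    T = max(1.0 * 0.998**epoch, 1e-2)  # illustrative annealing schedule
    w += lr * gradient(w, T)

print("learned weights:", np.round(w, 2))
print("corrected objective:", round(corrected_objective(w, T), 2))

Run as a plain script; the irrelevant weights shrink toward zero as T decreases, which is the model-selection effect the penalty is meant to produce in this toy setting.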

Cite

CITATION STYLE

APA

Sekino, M., & Nitta, K. (2009). Automatic model selection via corrected error backpropagation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5507 LNCS, pp. 220–227). https://doi.org/10.1007/978-3-642-03040-6_27
