Model selection and information criterion

Abstract

In this chapter, the problem of estimating model parameters from observed data, as in regression and function approximation, is considered, and a method of evaluating the goodness of a model is introduced. Starting from so-called leave-one-out cross-validation and investigating the asymptotic statistical properties of the estimated parameters, a generalized Akaike information criterion (AIC) is derived for selecting an appropriate model from several candidates. Beyond model selection, the concept of information criteria provides an assessment of the goodness of a model in various situations. Finally, an optimization method using regularization is presented as an example.
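As a rough illustration of the ideas summarized in the abstract, the sketch below compares polynomial regression models both by leave-one-out cross-validation and by the Gaussian least-squares form of AIC, n log(RSS/n) + 2k up to an additive constant. This is not the chapter's own derivation; the synthetic data, the function names, and the use of NumPy are assumptions made for this example.

```python
# Illustrative sketch only: model selection for polynomial regression
# via leave-one-out cross-validation and an AIC score.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: noisy samples of a smooth target function (assumed setup).
n = 50
x = np.linspace(-1.0, 1.0, n)
y = np.sin(np.pi * x) + rng.normal(scale=0.3, size=n)

def fit_poly(x, y, degree):
    """Least-squares polynomial fit; returns the coefficient vector."""
    return np.polyfit(x, y, degree)

def loo_cv_error(x, y, degree):
    """Leave-one-out cross-validation: refit with each point held out
    and average the squared prediction errors."""
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        coef = fit_poly(x[mask], y[mask], degree)
        pred = np.polyval(coef, x[i])
        errors.append((y[i] - pred) ** 2)
    return float(np.mean(errors))

def aic(x, y, degree):
    """AIC for a Gaussian-noise least-squares fit, up to a constant:
    n * log(RSS / n) + 2k, with k the number of polynomial coefficients."""
    coef = fit_poly(x, y, degree)
    rss = np.sum((y - np.polyval(coef, x)) ** 2)
    k = degree + 1
    return len(x) * np.log(rss / len(x)) + 2 * k

# Both criteria should prefer a moderate degree over severe over/underfitting.
for degree in range(1, 8):
    print(f"degree {degree}: LOO-CV = {loo_cv_error(x, y, degree):.4f}, "
          f"AIC = {aic(x, y, degree):.2f}")
```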

Citation

Murata, N., & Park, H. (2009). Model selection and information criterion. In Information Theory and Statistical Learning (pp. 333–354). Springer US. https://doi.org/10.1007/978-0-387-84816-7_14
