In this chapter, the problem of estimating model parameters from observed data, as in regression and function approximation, is considered, and methods for evaluating the goodness of a model are introduced. Starting from leave-one-out cross-validation and investigating the asymptotic statistical properties of estimated parameters, a generalized Akaike information criterion (AIC) is derived for selecting an appropriate model from several candidates. Beyond model selection, the concept of information criteria provides an assessment of the goodness of a model in various situations. Finally, an optimization method using regularization is presented as an example.
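The two model-assessment tools named in the abstract, leave-one-out cross-validation and AIC, can be illustrated with a minimal sketch. The example below is not from the chapter itself: it is a hypothetical polynomial-regression setup in which both criteria are computed for candidate degrees, with AIC written in its common least-squares form n·log(RSS/n) + 2k.

```python
import numpy as np

def loo_cv_error(x, y, degree):
    """Leave-one-out CV: fit on n-1 points, test on the held-out point."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        coef = np.polyfit(x[mask], y[mask], degree)
        pred = np.polyval(coef, x[i])
        errs.append((pred - y[i]) ** 2)
    return np.mean(errs)

def aic(x, y, degree):
    """AIC for Gaussian least squares: n*log(RSS/n) + 2k, k = number of coefficients."""
    coef = np.polyfit(x, y, degree)
    rss = np.sum((np.polyval(coef, x) - y) ** 2)
    n, k = len(x), degree + 1
    return n * np.log(rss / n) + 2 * k

# Synthetic data: a noisy sine curve (illustrative choice, not from the chapter)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, size=x.shape)

# Compare candidate polynomial degrees under both criteria
scores = {d: loo_cv_error(x, y, d) for d in range(1, 8)}
aic_scores = {d: aic(x, y, d) for d in range(1, 8)}
best_cv = min(scores, key=scores.get)
best_aic = min(aic_scores, key=aic_scores.get)
```

Both criteria penalize the underfit linear model heavily and, as the chapter's asymptotic analysis suggests, tend to agree on a moderate degree for this kind of data.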
CITATION STYLE
Murata, N., & Park, H. (2009). Model selection and information criterion. In Information Theory and Statistical Learning (pp. 333–354). Springer US. https://doi.org/10.1007/978-0-387-84816-7_14