The behaviour of the Akaike information criterion when applied to non-nested sequences of models

Abstract

A typical approach to selecting between models of differing complexity is to choose the model with the minimum Akaike Information Criterion (AIC) score. This paper examines a common scenario in which there is more than one candidate model with the same number of free parameters, a setting that violates the conditions under which AIC was derived. The main result is a novel upper bound that quantifies the poor performance of AIC in this setting; crucially, the bound does not depend on the sample size and therefore does not vanish even asymptotically. In addition, an AIC-like criterion for sparse feature selection in regression models is derived, and simulation results for denoising a signal by wavelet thresholding show that the new criterion is competitive with SureShrink thresholding. © 2010 Springer-Verlag.
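To make the selection rule concrete, the sketch below scores candidate regression models with the standard Gaussian-error form of AIC and picks the minimiser over all feature subsets of a fixed size. This is only an illustration of the generic minimum-AIC rule and of the non-nested, equal-parameter-count setting the abstract describes; it does not reproduce the paper's upper bound or its AIC-like criterion for sparse feature selection, and the function names and synthetic data are illustrative assumptions.

```python
# Illustrative sketch (not the paper's method): minimum-AIC selection among
# non-nested candidate models that all have the same number of parameters.
import numpy as np
from itertools import combinations

def aic_gaussian(y, X):
    """AIC for ordinary least squares with Gaussian errors:
    AIC = n * log(RSS / n) + 2k, up to an additive constant."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + 2 * k

def select_min_aic(y, X_full, subset_size):
    """Score every feature subset of a fixed size and return the minimiser.
    All candidates share the same parameter count and are non-nested,
    which is the scenario the paper analyses."""
    p = X_full.shape[1]
    return min(
        (aic_gaussian(y, X_full[:, list(cols)]), cols)
        for cols in combinations(range(p), subset_size)
    )  # (AIC score, chosen column indices)

# Example usage with synthetic data (assumed for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
y = X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.5, size=100)
print(select_min_aic(y, X, subset_size=2))
```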

Citation (APA)

Schmidt, D. F., & Makalic, E. (2010). The behaviour of the Akaike information criterion when applied to non-nested sequences of models. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6464 LNAI, pp. 223–232). https://doi.org/10.1007/978-3-642-17432-2_23
