Information criteria for model selection

Abstract

The rapid development of modeling techniques has brought many opportunities for data-driven discovery and prediction. However, it also raises the challenge of selecting the most appropriate model for any particular data task. Information criteria, such as the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), have been developed as a general class of model selection methods with profound connections to foundational ideas in statistics and information theory. Many perspectives and theoretical justifications have been developed to understand when and how to use information criteria, often depending on the particular data circumstances. This review article revisits information criteria by summarizing their key concepts, evaluation metrics, fundamental properties, interconnections, recent advancements, and common misconceptions to enrich the understanding of model selection in general. This article is categorized under: Data: Types and Structure > Traditional Statistical Data; Statistical Learning and Exploratory Methods of the Data Sciences > Modeling Methods; Statistical and Graphical Methods of Data Analysis > Information Theoretic Methods; Statistical Models > Model Selection.
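As a brief illustration of the two criteria named in the abstract, the sketch below (a minimal example, not taken from the article) compares candidate Gaussian linear models using AIC = 2k - 2 ln L and BIC = k ln(n) - 2 ln L, where k is the number of estimated parameters, n the sample size, and L the maximized likelihood. The simulated data, the choice of candidate models, and the use of NumPy here are illustrative assumptions.

```python
import numpy as np

def gaussian_aic_bic(y, X):
    """Fit y ~ X by ordinary least squares and return (AIC, BIC).

    k counts the regression coefficients plus the error variance.
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n                        # MLE of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)  # maximized Gaussian log-likelihood
    k = p + 1                                         # coefficients + variance
    aic = 2 * k - 2 * loglik
    bic = k * np.log(n) - 2 * loglik
    return aic, bic

# Illustrative comparison: a linear vs. a quadratic candidate model on simulated data.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=200)
y = 1.0 + 0.5 * x + rng.normal(scale=0.3, size=200)

X_lin = np.column_stack([np.ones_like(x), x])
X_quad = np.column_stack([np.ones_like(x), x, x**2])

print("linear    AIC/BIC:", gaussian_aic_bic(y, X_lin))
print("quadratic AIC/BIC:", gaussian_aic_bic(y, X_quad))
# Lower values are preferred; BIC's log(n) penalty weighs the extra quadratic term more heavily.
```

In this toy setting the linear model generated the data, so both criteria typically favor it; the review surveys when such agreement holds and when AIC and BIC can point to different models.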

Citation

Zhang, J., Yang, Y., & Ding, J. (2023, September 1). Information criteria for model selection. Wiley Interdisciplinary Reviews: Computational Statistics. John Wiley and Sons Inc. https://doi.org/10.1002/wics.1607
