Mixture models for classification

Abstract

Finite mixture distributions provide efficient approaches to model-based clustering and classification. The advantages of mixture models for unsupervised classification are reviewed. The article then focuses on the model selection problem. The usefulness of taking the modeling purpose into account when selecting a model is advocated in both the unsupervised and supervised classification contexts. This point of view has led to the definition of two penalized likelihood criteria, ICL and BEC, which are presented and discussed. Criterion ICL approximates the integrated completed likelihood and is concerned with model-based cluster analysis. Criterion BEC approximates the integrated conditional likelihood and is concerned with generative models of classification. The behavior of ICL for choosing the number of components in a mixture model, and of BEC for choosing a model minimizing the expected error rate, is analyzed in contrast with standard model selection criteria.
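
For orientation, the two criteria named in the abstract can be recalled in the forms usually given in the model-based clustering literature; the display below is a reminder under that reading, not a quotation of the article, and the notation (x, z, θ̂, ν, n) is introduced here for illustration.

\[
\mathrm{ICL}(m, K) \;=\; \log p(\mathbf{x}, \hat{\mathbf{z}} \mid m, \hat{\theta}_{m,K}) \;-\; \frac{\nu_{m,K}}{2} \log n ,
\]

where $\hat{\theta}_{m,K}$ is the maximum likelihood estimate, $\hat{\mathbf{z}}$ the MAP assignment of the $n$ observations to the $K$ components, and $\nu_{m,K}$ the number of free parameters of model $m$; equivalently, ICL is BIC minus the estimated entropy of the posterior membership probabilities, so it favors well-separated components. BEC, as usually reported, approximates the integrated conditional likelihood $p(\mathbf{z} \mid \mathbf{x}, m)$ by

\[
\mathrm{BEC}(m) \;\approx\; \log p(\mathbf{x}, \mathbf{z} \mid m, \hat{\theta}) \;-\; \log p(\mathbf{x} \mid m, \tilde{\theta}) ,
\]

the BIC-type penalties of the numerator and denominator cancelling, with $\tilde{\theta}$ the maximizer of the marginal likelihood of $\mathbf{x}$; readers should check the article itself for the exact statements.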

Citation (APA)

Celeux, G. (2007). Mixture models for classification. In Studies in Classification, Data Analysis, and Knowledge Organization (pp. 3–14). Kluwer Academic Publishers. https://doi.org/10.1007/978-3-540-70981-7_1
