Global EM learning of finite mixture models using the Greedy Elimination Method


Abstract

The standard learning method for finite mixture models is the Expectation-Maximization (EM) algorithm, which performs hill-climbing from an initial solution to obtain a local maximum-likelihood solution. However, because the solution space is large and multimodal, EM is prone to producing inconsistent and sub-optimal solutions over multiple runs. This paper presents a novel global greedy learning method, the Greedy Elimination Method (GEM), to alleviate these problems. GEM is simple to implement for any finite mixture model, yet effective at enhancing the global optimality and consistency of the solutions. It is also very efficient, as its complexity grows only linearly with the number of data patterns. GEM is demonstrated on clustering synthetic datasets using the Gaussian mixture model, and on clustering the shrinking spiral dataset using the Mixture of Factor Analyzers. © 2006 Springer-Verlag London.
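The sensitivity to initialization that motivates GEM can be seen with a minimal EM implementation. The sketch below is not the paper's GEM algorithm; it is a plain pure-Python EM fit of a two-component 1-D Gaussian mixture, run from several random starts, to illustrate how different initializations can converge to different local optima (the function name `em_gmm_1d` and all parameters are illustrative assumptions, not from the paper):

```python
import math
import random

def em_gmm_1d(data, n_iter=50, seed=0):
    """Fit a two-component 1-D Gaussian mixture by EM from one random start.

    Illustrative sketch only -- this is standard EM, not the paper's GEM.
    """
    rng = random.Random(seed)
    # Random initial means drawn from the data; unit variances, equal weights.
    mu = [rng.choice(data), rng.choice(data)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p) or 1e-300  # guard against numerical underflow
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp) or 1e-300
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return sorted(mu)

# Two well-separated clusters around 0 and 10.
rng = random.Random(42)
data = ([rng.gauss(0.0, 1.0) for _ in range(100)]
        + [rng.gauss(10.0, 1.0) for _ in range(100)])

# Each seed gives a different random initialization; collecting the fitted
# means over several runs exposes the run-to-run inconsistency of local EM
# that the paper's global method is designed to reduce.
fits = {tuple(round(m, 1) for m in em_gmm_1d(data, seed=s)) for s in range(5)}
print(fits)
```

Under the abstract's framing, GEM would replace the naive multiple-restart strategy shown here with a greedy global search whose cost grows only linearly in the number of data patterns.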

Citation (APA)

Chan, Z. S. H., & Kasabov, N. (2006). Global EM learning of finite mixture models using the Greedy Elimination Method. In Research and Development in Intelligent Systems XXII - Proceedings of AI 2005, the 25th SGAI International Conference on Innovative Techniques and Applications of Artificial Intelligence (pp. 37–45). Springer London. https://doi.org/10.1007/978-1-84628-226-3_4
