A Gaussian mixture model (GMM) estimates a probability density function using the expectation-maximization (EM) algorithm. However, EM is sensitive to its initial parameters and may converge to a poor local optimum, yielding inconsistent estimates. This paper analytically shows that the performance of a GMM, measured by Kullback-Leibler divergence, can be improved by averaging a committee of GMMs trained from different initial parameters. Simulations on synthetic datasets demonstrate that a committee of as few as 10 models outperforms a single model. © Springer-Verlag Berlin Heidelberg 2004.
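A minimal sketch of the committee idea described in the abstract, assuming scikit-learn's GaussianMixture as the base model: several GMMs are fitted with different random initializations, their densities are uniformly averaged, and the KL divergence from a known generating density is estimated by Monte Carlo. The synthetic data, committee size of 10, and KL estimator below are illustrative choices, not the paper's exact experimental setup.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic 1-D data from a known two-component Gaussian mixture.
true_means, true_stds, true_weights = [-2.0, 3.0], [1.0, 0.5], [0.4, 0.6]
comp = rng.choice(2, size=2000, p=true_weights)
X = rng.normal(np.take(true_means, comp), np.take(true_stds, comp)).reshape(-1, 1)

def true_log_pdf(x):
    # Log density of the known generating mixture.
    pdfs = [w * np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
            for w, m, s in zip(true_weights, true_means, true_stds)]
    return np.log(np.sum(pdfs, axis=0))

# Committee of GMMs that differ only in their random initialization.
committee = [GaussianMixture(n_components=2, init_params="random",
                             random_state=seed).fit(X)
             for seed in range(10)]

def committee_log_pdf(x):
    # Committee density = uniform average of the member densities.
    member_logs = np.stack([m.score_samples(x) for m in committee])
    return np.log(np.mean(np.exp(member_logs), axis=0))

# Monte Carlo estimate of KL(p_true || q) on fresh samples from p_true.
comp_t = rng.choice(2, size=5000, p=true_weights)
X_test = rng.normal(np.take(true_means, comp_t),
                    np.take(true_stds, comp_t)).reshape(-1, 1)

kl_single = np.mean(true_log_pdf(X_test[:, 0]) - committee[0].score_samples(X_test))
kl_committee = np.mean(true_log_pdf(X_test[:, 0]) - committee_log_pdf(X_test))
print(f"KL(single)    ~ {kl_single:.4f}")
print(f"KL(committee) ~ {kl_committee:.4f}")
```

With random restarts, individual members occasionally converge to poor local optima; averaging their densities smooths out these failures, which is consistent with the committee advantage the paper reports.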
CITATION STYLE
Lee, H. J., & Cho, S. (2004). Combining Gaussian mixture models. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3177, 666–671. https://doi.org/10.1007/978-3-540-28651-6_98