Continuous Gaussian Mixture Modeling

Abstract

When the projection of a collection of samples onto a subset of basis feature vectors has a Gaussian distribution, those samples have a generalized projective Gaussian distribution (GPGD). GPGDs arise in a variety of medical images as well as in some speech recognition problems. We will demonstrate that GPGDs are better represented by continuous Gaussian mixture models (CGMMs) than by finite Gaussian mixture models (FGMMs). This paper introduces a novel technique for the automated specification of CGMMs: height ridges of goodness-of-fit. For GPGDs, Monte Carlo simulations and ROC analysis demonstrate that classifiers using CGMMs defined via goodness-of-fit height ridges provide consistent labelings and, compared to FGMMs, better true-positive rates (TPRs) at low false-positive rates (FPRs). The CGMM-based classification of gray and white matter in an inhomogeneous magnetic resonance (MR) image of the brain is demonstrated.
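
The abstract contrasts finite and continuous Gaussian mixtures without giving formulas or code. As a rough, hedged illustration of that distinction (not the paper's actual formulation or its goodness-of-fit height-ridge procedure), the sketch below compares a 1-D FGMM, a weighted sum of a few Gaussian components, with a 1-D CGMM, Gaussian components whose mean varies continuously with a parameter t and which are mixed by integrating over t. All function and parameter names (fgmm_pdf, cgmm_pdf, mu_curve, t_grid) are hypothetical, and the uniform mixing weight is an assumption made only for illustration.

# Illustrative sketch only (not from the paper): FGMM vs. CGMM densities in 1-D.
import numpy as np
from scipy.stats import norm

def fgmm_pdf(x, means, sigmas, weights):
    # FGMM density: a finite, weighted sum of Gaussian components.
    return sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, sigmas))

def cgmm_pdf(x, mu_curve, sigma, t_grid):
    # CGMM density: Gaussian components whose mean traces a continuous curve
    # mu_curve(t); the uniform average over t_grid approximates the mixing integral.
    vals = np.array([norm.pdf(x, mu_curve(t), sigma) for t in t_grid])
    return vals.mean(axis=0)

if __name__ == "__main__":
    x = np.linspace(-4.0, 8.0, 7)
    # Three fixed components versus means sweeping continuously over [0, 4].
    print(np.round(fgmm_pdf(x, [0.0, 2.0, 4.0], [1.0] * 3, [1 / 3] * 3), 4))
    print(np.round(cgmm_pdf(x, lambda t: t, 1.0, np.linspace(0.0, 4.0, 101)), 4))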

Citation (APA)
Aylward, S., & Pizer, S. (1997). Continuous Gaussian mixture modeling. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1230, pp. 176–189). Springer-Verlag. https://doi.org/10.1007/3-540-63046-5_14
