The principle of maximum entropy is a powerful framework that can be used to estimate class posterior probabilities for pattern recognition tasks. In this paper, we show how this principle is related to the discriminative training of Gaussian mixture densities using the maximum mutual information criterion. This leads to a relaxation of the constraint that the covariance matrices be positive (semi-)definite. Thus, we arrive at a conceptually simple model that allows us to estimate a large number of free parameters reliably. We compare the proposed method with other state-of-the-art approaches in experiments on the well-known US Postal Service handwritten digit recognition task. © Springer-Verlag Berlin Heidelberg 2002.
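The connection sketched in the abstract can be illustrated with a minimal example: a log-linear (maximum entropy) classifier over second-order features of the input. Each class discriminant x^T Λ_c x + λ_c^T x + α_c has the same functional form as the log of a Gaussian density, but Λ_c is left unconstrained rather than being tied to a positive (semi-)definite covariance; training by gradient ascent on the conditional log-likelihood corresponds to the maximum mutual information criterion. All names and hyperparameters below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): a maximum entropy
# model with second-order features, trained discriminatively. Dropping the
# positive-definiteness constraint means the quadratic term is just another
# unconstrained weight block.

rng = np.random.default_rng(0)

def features(X):
    """Second-order feature map per sample: [1, x, vec(x x^T)]."""
    n, d = X.shape
    quad = (X[:, :, None] * X[:, None, :]).reshape(n, d * d)
    return np.hstack([np.ones((n, 1)), X, quad])

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, n_classes, lr=0.1, steps=500):
    """Gradient ascent on the mean conditional log-likelihood (MMI)."""
    F = features(X)
    W = np.zeros((n_classes, F.shape[1]))  # unconstrained parameters
    Y = np.eye(n_classes)[y]               # one-hot targets
    for _ in range(steps):
        P = softmax(F @ W.T)
        W += lr * (Y - P).T @ F / len(X)   # gradient of log p(y|x)
    return W

def predict(W, X):
    return softmax(features(X) @ W.T).argmax(axis=1)

# Toy usage: two 2-D Gaussian blobs standing in for image feature vectors.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = train(X, y, n_classes=2)
acc = (predict(W, X) == y).mean()
```

Because the objective is the conditional likelihood rather than the joint likelihood of a generative Gaussian model, no covariance estimate ever needs to be inverted or kept positive definite, which is what makes the large-parameter regime tractable.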
Keysers, D., Och, F. J., & Ney, H. (2002). Maximum entropy and Gaussian models for image object recognition. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2449 LNCS, pp. 498–506). Springer Verlag. https://doi.org/10.1007/3-540-45783-6_60