Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities

  • Subhashis Ghosal
  • Aad W. van der Vaart


Abstract

We study the rates of convergence of the maximum likelihood estimator (MLE) and posterior distribution in density estimation problems, where the densities are location or location-scale mixtures of normal distributions with the scale parameter lying between two positive numbers. The true density is also assumed to lie in this class, with the true mixing distribution either compactly supported or having sub-Gaussian tails. We obtain bounds for Hellinger bracketing entropies for this class, and from these bounds we deduce the convergence rates of (sieve) MLEs in Hellinger distance. The rate turns out to be (log n)^κ/√n, where κ ≥ 1 is a constant that depends on the type of mixtures and the choice of the sieve. Next, we consider a Dirichlet mixture of normals as a prior on the unknown density. We estimate the prior probability of a certain Kullback-Leibler type neighborhood and then invoke a general theorem that computes the posterior convergence rate in terms of the growth rate of the Hellinger entropy and the concentration rate of the prior. The posterior distribution is also seen to converge at the rate (log n)^κ/√n, where κ now depends on the tail behavior of the base measure of the Dirichlet process.
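The "general theorem" referenced in the abstract trades entropy growth against prior mass, in the style of Ghosal, Ghosh and van der Vaart (2000). A minimal schematic version is sketched below; the neighborhood definitions, the sieve sets P_n, and the constants c and M are illustrative assumptions for exposition, not quoted from this paper.

    % Schematic posterior-rate theorem (illustrative sketch; constants and
    % neighborhoods are assumptions, not quoted from the paper).
    % Here d is the Hellinger distance, p_0 the true density, \Pi the prior,
    % and K, V a Kullback-Leibler divergence and its second-moment analogue.
    % Suppose \varepsilon_n \to 0 with n\varepsilon_n^2 \to \infty, and
    \[
      \log N(\varepsilon_n, \mathcal{P}_n, d) \le n\varepsilon_n^2
      \quad \text{(entropy growth)},
    \]
    \[
      \Pi\bigl(p : K(p_0,p) \le \varepsilon_n^2,\; V(p_0,p) \le \varepsilon_n^2\bigr)
      \ge e^{-c\, n\varepsilon_n^2}
      \quad \text{(prior concentration)}.
    \]
    % Then, for a sufficiently large constant M,
    \[
      \Pi\bigl(p : d(p, p_0) > M\varepsilon_n \mid X_1, \dots, X_n\bigr)
      \longrightarrow 0
      \quad \text{in } P_0\text{-probability},
    \]
    % i.e. the posterior contracts at rate \varepsilon_n. For the
    % normal-mixture classes treated in the paper, the entropy and
    % prior-mass bounds yield \varepsilon_n = (\log n)^{\kappa}/\sqrt{n}.

In this framing, the paper's Hellinger bracketing entropy bounds verify the first condition, and the estimate of the prior probability of a Kullback-Leibler type neighborhood verifies the second.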

Author-supplied keywords

  • Bracketing
  • Dirichlet mixture
  • Entropy
  • Maximum likelihood
  • Mixture of normals
  • Posterior distribution
  • Rate of convergence
  • Sieve
