Optimal Kullback–Leibler aggregation in mixture density estimation by maximum likelihood

  • Dalalyan, A. S.
  • Sebbar, M.

Abstract

We study the maximum likelihood estimator of the density of $n$ independent observations, under the assumption that this density is well approximated by a mixture with a large number of components. The main focus is on statistical properties with respect to the Kullback–Leibler loss. We establish risk bounds taking the form of sharp oracle inequalities, both in deviation and in expectation. A simple consequence of these bounds is that the maximum likelihood estimator attains the optimal rate $((\log K)/n)^{1/2}$, up to a possible logarithmic correction, in the problem of convex aggregation when the number $K$ of components is larger than $n^{1/2}$. More importantly, under the additional assumption that the Gram matrix of the components satisfies the compatibility condition, the obtained oracle inequalities yield the optimal rate in the sparsity scenario: if the weight vector is (nearly) $D$-sparse, we get the rate $(D\log K)/n$. As a natural complement to our oracle inequalities, we introduce the notion of nearly-$D$-sparse aggregation and establish matching lower bounds for this type of aggregation.
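To make the object of study concrete: for known component densities $f_1, \dots, f_K$, the estimator maximizes the empirical log-likelihood $\frac{1}{n}\sum_{i=1}^n \log\big(\sum_{k=1}^K \theta_k f_k(X_i)\big)$ over weight vectors $\theta$ in the probability simplex, which is a concave maximization problem. The sketch below (not the authors' code) shows how such weights can be computed with the classical EM fixed-point iteration for mixture weights; the function name, the Gaussian components, and all numerical settings are illustrative assumptions.

```python
import numpy as np

def mle_mixture_weights(F, n_iter=500, tol=1e-10):
    """EM fixed-point iteration for the weight vector of a mixture.

    F : (n, K) array with F[i, k] = f_k(X_i), the k-th component density
        evaluated at the i-th observation (components assumed known).
    Returns theta maximizing (1/n) * sum_i log(sum_k theta_k * F[i, k])
    over the probability simplex; this objective is concave in theta.
    """
    n, K = F.shape
    theta = np.full(K, 1.0 / K)            # uniform initialization
    for _ in range(n_iter):
        mix = F @ theta                    # mixture density at each X_i
        resp = (F * theta) / mix[:, None]  # posterior "responsibilities"
        theta_new = resp.mean(axis=0)      # EM update: average responsibility
        converged = np.max(np.abs(theta_new - theta)) < tol
        theta = theta_new
        if converged:
            break
    return theta

# Toy usage (hypothetical setup): aggregate K = 50 Gaussian location
# components on n = 1000 draws whose true density matches one component.
rng = np.random.default_rng(0)
X = rng.normal(loc=1.0, size=1000)
centers = np.linspace(-3, 3, 50)
F = np.exp(-0.5 * (X[:, None] - centers[None, :]) ** 2) / np.sqrt(2 * np.pi)
theta_hat = mle_mixture_weights(F)
print(theta_hat.round(3))  # mass concentrates on components near center 1.0
```

In this toy example the fitted weight vector is nearly sparse, with most of its mass on the few components closest to the true location, which is exactly the sparsity scenario in which the $(D \log K)/n$ rate above applies.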

Citation
Dalalyan, A. S., & Sebbar, M. (2018). Optimal Kullback–Leibler aggregation in mixture density estimation by maximum likelihood. Mathematical Statistics and Learning, 1(1), 1–35. https://doi.org/10.4171/msl/1-1-1
