Statistical estimation of the Kullback–Leibler divergence


Abstract

Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for estimates of the Kullback–Leibler divergence between two probability measures in R^d that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain k-nearest neighbor statistics for a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results lies also in the treatment of mixture models; in particular, they cover mixtures of nondegenerate Gaussian measures. The corresponding asymptotic properties of related estimators for the Shannon entropy and cross-entropy are strengthened. Some applications are indicated.
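The abstract does not spell out the estimator, but the k-nearest neighbor construction it refers to is well known: given samples X_1, ..., X_n ~ P and Y_1, ..., Y_m ~ Q in R^d, the classical Wang–Kulkarni–Verdú estimate compares, for each X_i, the distance rho_k(i) to its k-th nearest neighbor among the other X's with the distance nu_k(i) to its k-th nearest neighbor among the Y's, via D_hat = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)). The following Python sketch implements that classical form purely as an illustration of the kind of statistic the paper studies; the function name and parameter choices are the sketch's own, not the paper's exact construction.

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of D(P || Q) from samples x ~ P and y ~ Q.

    Classical Wang-Kulkarni-Verdu form, shown only to illustrate the
    k-nearest-neighbor statistics the paper builds on.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # rho_k(i): distance from x_i to its k-th nearest neighbor among the
    # other x's (query k+1 points because the nearest one is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu_k(i): distance from x_i to its k-th nearest neighbor among the y's.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    # D_hat = (d / n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1))
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))


# Sanity check against a closed-form value: for P = N(0, 1) and Q = N(1, 1),
# D(P || Q) = 0.5, and the estimate should approach it as n and m grow.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 1))
y = rng.normal(1.0, 1.0, size=(5000, 1))
print(knn_kl_divergence(x, y, k=5))
```

Note that, unlike this sketch, the paper's contribution is the asymptotic theory (unbiasedness and L2-consistency) for such estimators under conditions broad enough to include Gaussian mixtures.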

Cite

APA

Bulinski, A., & Dimitrov, D. (2021). Statistical estimation of the Kullback–Leibler divergence. Mathematics, 9(5), 1–36. https://doi.org/10.3390/math9050544
