Application of offset estimator of differential entropy and mutual information with multivariate data

Abstract

Numerical estimators of differential entropy and mutual information can be slow to converge as sample size increases. The offset Kozachenko-Leonenko (KLo) method described here is an offset version of the Kozachenko-Leonenko estimator that can markedly improve convergence. Its use is illustrated in applications to the comparison of trivariate data from successive scene color images and the comparison of univariate data from stereophonic music tracks. Publicly available code for KLo estimation of both differential entropy and mutual information is provided for R, Python, and MATLAB computing environments at https://github.com/imarinfr/klo.
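For context, the classical (non-offset) Kozachenko-Leonenko estimator on which the KLo method builds can be sketched as follows. This is a minimal illustration of the standard k-nearest-neighbour formula, H ≈ ψ(N) − ψ(k) + log V_d + (d/N) Σ log ε_i, not the offset variant introduced in the article; for that, see the authors' repository linked above.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Classical Kozachenko-Leonenko estimate of differential entropy (nats).

    NOTE: this sketches the standard estimator only, not the offset (KLo)
    version described in the article.
    x : (n, d) array of n samples in d dimensions.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Distance to the k-th nearest neighbour of each point
    # (k + 1 because each point is its own nearest neighbour at distance 0).
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, -1]
    # Log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1).
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

# Example: bivariate standard normal, whose true differential entropy
# is log(2*pi*e) ~ 2.838 nats.
rng = np.random.default_rng(0)
samples = rng.standard_normal((4000, 2))
estimate = kl_entropy(samples, k=3)
```

Slow convergence of this plain estimator with increasing sample size is precisely the behaviour the offset correction is designed to improve.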

Citation (APA)

Marin-Franch, I., Sanz-Sabater, M., & Foster, D. H. (2022). Application of offset estimator of differential entropy and mutual information with multivariate data. Experimental Results, 3. https://doi.org/10.1017/exp.2022.14
