An estimate of mutual information that permits closed-form optimisation

Abstract

We introduce a new estimate of the mutual information between a dataset and a target variable that can be maximised analytically and has broad applicability in machine learning and statistical pattern recognition. This estimate has previously been employed implicitly as an approximation to quadratic mutual information. In this paper we study its properties in more detail and provide a derivation from the perspective of pairwise interactions. From this perspective, we show a connection between our proposed estimate and Laplacian eigenmaps, which had not previously been shown to be related to mutual information. Compared with other popular measures of mutual information, which can be maximised only through an iterative process, ours can be maximised far more efficiently and reliably via a closed-form eigendecomposition.
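The closed-form optimisation the abstract refers to can be illustrated with a small sketch. The snippet below is a hypothetical NumPy illustration, not the paper's estimator: it assumes the objective reduces, as with Laplacian eigenmaps, to maximising a quadratic pairwise form tr(Vᵀ Xᵀ W X V) over an orthonormal projection V, a problem an eigendecomposition solves exactly. The function name top_projection and the label-agreement weights W are illustrative assumptions.

```python
# Hypothetical sketch: closed-form maximisation of a quadratic pairwise
# objective via eigendecomposition. The weight construction (same label ->
# positive weight) is an illustrative assumption, not the paper's estimator.
import numpy as np

def top_projection(X, y, k=2):
    """Return a d-by-k projection V maximising tr(V^T X^T W X V), V^T V = I."""
    X = X - X.mean(axis=0)                      # centre the data
    same = (y[:, None] == y[None, :]).astype(float)
    W = same - same.mean()                      # crude pairwise interaction weights
    M = X.T @ W @ X                             # the quadratic form to maximise
    M = (M + M.T) / 2.0                         # symmetrise for numerical safety
    vals, vecs = np.linalg.eigh(M)              # closed form: one eigendecomposition
    return vecs[:, np.argsort(vals)[::-1][:k]]  # eigenvectors of the k largest eigenvalues

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                   # toy data: 100 samples, 5 features
y = (X[:, 0] > 0).astype(int)                   # toy binary target
print(top_projection(X, y).shape)               # (5, 2)
```

An iterative mutual-information maximiser would need many gradient steps and can stall in local optima; here the global optimum of the (assumed) quadratic surrogate comes from a single symmetric eigendecomposition, which is the efficiency and reliability advantage the abstract claims.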

Citation (APA)

Liu, R., & Gillies, D. F. (2013). An estimate of mutual information that permits closed-form optimisation. Entropy, 15(5), 1690–1704. https://doi.org/10.3390/e15051690
