Mutual information between discrete and continuous data sets

Abstract

Mutual information (MI) is a powerful method for detecting relationships between data sets. There are accurate methods for estimating MI that avoid problems with "binning" when both data sets are discrete or when both data sets are continuous. We present an accurate, non-binning MI estimator for the case of one discrete data set and one continuous data set. This case applies when measuring, for example, the relationship between base sequence and gene expression level, or the effect of a cancer drug on patient survival time. We also show how our method can be adapted to calculate the Jensen-Shannon divergence of two or more data sets. © 2014 Brian C. Ross.
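For illustration only, the snippet below is a minimal sketch (not code from the paper) of estimating MI between a discrete label and a continuous measurement using scikit-learn's mutual_info_classif, a k-nearest-neighbor estimator designed for this mixed case; the synthetic data, seed, and neighbor count are assumptions made for the example.

# Minimal sketch: non-binning MI estimate between a discrete label and a
# continuous measurement (synthetic data, chosen only for illustration).
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)

# Hypothetical data: a discrete class label (e.g. a base at one sequence
# position) and a continuous value (e.g. an expression level) whose mean
# depends on the label.
labels = rng.integers(0, 4, size=2000)                        # discrete data set
expression = rng.normal(loc=labels.astype(float), scale=1.0)  # continuous data set

# mutual_info_classif takes a 2-D matrix of continuous features and a 1-D
# discrete target; the k-nearest-neighbor estimate is returned in nats.
mi = mutual_info_classif(expression.reshape(-1, 1), labels,
                         discrete_features=False, n_neighbors=3,
                         random_state=0)
print(f"Estimated MI: {mi[0]:.3f} nats")

The estimate is nonzero here because the continuous values shift with the label; for independent data sets it should be close to zero.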

Citation (APA)

Ross, B. C. (2014). Mutual information between discrete and continuous data sets. PLoS ONE, 9(2). https://doi.org/10.1371/journal.pone.0087357
