On estimating mutual information for feature selection


Abstract

Mutual Information (MI) is a powerful concept from information theory used in many application fields. In practice, MI must often be estimated from available data. We compare state-of-the-art methods for estimating MI from continuous data, focusing on their usefulness for the feature selection task. Our results suggest that many methods are practically relevant for feature selection regardless of their theoretical limitations or benefits. © 2010 Springer-Verlag Berlin Heidelberg.
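The paper evaluates several MI estimators for continuous data; the abstract does not name them, so as a minimal illustration only (not necessarily one of the methods compared in the paper), a simple histogram plug-in estimate of MI can be used to rank candidate features. All names below are hypothetical:

```python
import math
import random

def mutual_information(x, y, bins=8):
    """Histogram (plug-in) estimate of I(X;Y) in nats.

    Discretizes each continuous variable into equal-width bins, then
    computes sum over cells of p(x,y) * log(p(x,y) / (p(x) * p(y))).
    A crude but common baseline estimator; biased for small samples.
    """
    n = len(x)

    def bin_index(v, lo, hi):
        # Map value v into one of `bins` equal-width bins over [lo, hi].
        if hi == lo:
            return 0
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)

    lx, hx = min(x), max(x)
    ly, hy = min(y), max(y)
    joint, px, py = {}, {}, {}
    for xi, yi in zip(x, y):
        bx, by = bin_index(xi, lx, hx), bin_index(yi, ly, hy)
        joint[(bx, by)] = joint.get((bx, by), 0) + 1
        px[bx] = px.get(bx, 0) + 1
        py[by] = py.get(by, 0) + 1

    mi = 0.0
    for (bx, by), c in joint.items():
        # p(x,y) / (p(x) p(y)) = c*n / (count_x * count_y)
        mi += (c / n) * math.log(c * n / (px[bx] * py[by]))
    return mi

# Feature selection by MI ranking: a dependent feature should score
# higher than an independent one (synthetic data, fixed seed).
random.seed(0)
x_target = [random.gauss(0, 1) for _ in range(5000)]
feat_dep = [v + 0.3 * random.gauss(0, 1) for v in x_target]  # informative
feat_ind = [random.gauss(0, 1) for _ in range(5000)]          # uninformative

scores = {
    "feat_dep": mutual_information(x_target, feat_dep),
    "feat_ind": mutual_information(x_target, feat_ind),
}
print(sorted(scores, key=scores.get, reverse=True))
```

Ranking features by such a score is the basic MI filter approach; the choice of estimator (binning, kernel, nearest-neighbour) is exactly what the paper compares.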

Citation (APA)

Schaffernicht, E., Kaltenhaeuser, R., Verma, S. S., & Gross, H. M. (2010). On estimating mutual information for feature selection. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6352 LNCS, pp. 362–367). https://doi.org/10.1007/978-3-642-15819-3_48
