Mutual Information (MI) is a powerful concept from information theory with applications in many fields. Practical tasks often require estimating MI from available data. We compare state-of-the-art methods for estimating MI from continuous data, focusing on their usefulness for the feature selection task. Our results suggest that many methods are practically suitable for feature selection regardless of their theoretical limitations or benefits. © 2010 Springer-Verlag Berlin Heidelberg.
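To make the estimation problem concrete, the following is a minimal sketch of one of the simplest approaches to estimating MI from continuous samples: a plug-in estimator that discretizes each variable into equal-width bins and computes MI from the joint histogram. This is an illustrative assumption on our part, not a method singled out by the paper (which compares several estimators, including more sophisticated ones); the function name and bin count are hypothetical choices.

```python
import math
import random

def mutual_information_binned(x, y, bins=8):
    """Estimate MI (in nats) between two continuous samples.

    Plug-in estimator: discretize each variable into `bins` equal-width
    bins, then compute MI from the empirical joint distribution.
    Simple but biased for small samples; shown here only to illustrate
    the estimation task the paper studies.
    """
    def bin_index(v, lo, hi):
        if hi == lo:
            return 0
        i = int((v - lo) / (hi - lo) * bins)
        return min(i, bins - 1)  # clamp the maximum value into the last bin

    lx, hx = min(x), max(x)
    ly, hy = min(y), max(y)
    n = len(x)
    joint, px, py = {}, {}, {}
    for xv, yv in zip(x, y):
        i, j = bin_index(xv, lx, hx), bin_index(yv, ly, hy)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] = px.get(i, 0) + 1
        py[j] = py.get(j, 0) + 1
    # MI = sum over cells of p(x,y) * log( p(x,y) / (p(x) p(y)) )
    mi = 0.0
    for (i, j), c in joint.items():
        mi += (c / n) * math.log(c * n / (px[i] * py[j]))
    return mi

# In a feature selection setting, features are ranked by their estimated
# MI with the target: a strongly dependent feature should score higher
# than an irrelevant one.
random.seed(0)
x = [random.gauss(0, 1) for _ in range(2000)]
y_dependent = [xi + random.gauss(0, 0.1) for xi in x]
y_irrelevant = [random.gauss(0, 1) for _ in range(2000)]
print(mutual_information_binned(x, y_dependent))
print(mutual_information_binned(x, y_irrelevant))
```

The dependent pair yields a clearly higher score than the irrelevant one. The bias and variance of such estimators, and whether they matter in practice for feature ranking, is exactly the question the paper investigates.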
Schaffernicht, E., Kaltenhaeuser, R., Verma, S. S., & Gross, H. M. (2010). On estimating mutual information for feature selection. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6352 LNCS, pp. 362–367). https://doi.org/10.1007/978-3-642-15819-3_48