Measuring Dependence with Matrix-based Entropy Functional


Abstract

Measuring the dependence of data plays a central role in statistics and machine learning. In this work, we summarize and generalize the main idea behind existing information-theoretic dependence measures into a higher-level perspective via Shearer's inequality. Based on this generalization, we propose two measures, the matrix-based normalized total correlation and the matrix-based normalized dual total correlation, to quantify the dependence of multiple variables in arbitrary dimensional spaces without explicitly estimating the underlying data distributions. We show that our measures are differentiable and statistically more powerful than prevalent ones. We also apply our measures to four machine learning problems, namely gene regulatory network inference, robust machine learning under covariate shift and non-Gaussian noise, subspace outlier detection, and understanding the learning dynamics of convolutional neural networks, to demonstrate their utility, advantages, and implications for those problems.
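To make the abstract's construction concrete, the sketch below illustrates how a total-correlation-style dependence measure can be computed from matrix-based Rényi entropies: each variable is mapped to a trace-normalized Gram matrix, entropy is taken from the matrix's eigenvalues, and the joint entropy comes from the Hadamard product of the per-variable matrices. This is a minimal illustration assuming the commonly used Gaussian-kernel Gram matrix and a simple bound-based normalizer; the kernel width, the α value, and the normalization are assumptions and may differ from the paper's exact definitions.

```python
import numpy as np

def gram_matrix(x, sigma=1.0):
    """Trace-normalized Gaussian (RBF) Gram matrix for one variable of shape (n,) or (n, d)."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return K / np.trace(K)  # unit trace, as the entropy functional requires

def matrix_renyi_entropy(A, alpha=1.01):
    """Matrix-based Renyi entropy: S_alpha(A) = log2(sum_i lambda_i(A)^alpha) / (1 - alpha)."""
    eigvals = np.clip(np.linalg.eigvalsh(A), 0.0, None)  # guard tiny negative eigenvalues
    return np.log2(np.sum(eigvals ** alpha)) / (1.0 - alpha)

def joint_gram(mats):
    """Joint Gram matrix via the Hadamard product of per-variable matrices, re-normalized."""
    J = mats[0].copy()
    for A in mats[1:]:
        J = J * A
    return J / np.trace(J)

def normalized_total_correlation(variables, alpha=1.01, sigma=1.0):
    """Total correlation sum_i S(A_i) - S(joint), divided by an upper bound so it lies in [0, 1].

    The normalizer (sum of marginal entropies minus the largest one) is only an assumed
    choice for illustration; the paper's normalization may differ.
    """
    mats = [gram_matrix(v, sigma) for v in variables]
    marginals = [matrix_renyi_entropy(A, alpha) for A in mats]
    tc = sum(marginals) - matrix_renyi_entropy(joint_gram(mats), alpha)
    return tc / (sum(marginals) - max(marginals))

# Usage: two strongly dependent variables plus one independent variable.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = x + 0.1 * rng.normal(size=200)   # dependent on x
z = rng.normal(size=200)             # independent of x and y
print(normalized_total_correlation([x, y, z]))
```

Because the estimate depends only on eigenvalues of Gram matrices, no density estimation is needed, and the quantity is differentiable with respect to the samples, which is what allows the measures to be used as training objectives in the downstream applications listed above.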

Citation (APA)

Yu, S., Alesiani, F., Yu, X., Jenssen, R., & Principe, J. (2021). Measuring Dependence with Matrix-based Entropy Functional. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 12B, pp. 10781–10789). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i12.17288
