Exploring high dimension large data correlation analysis with mutual information and application


Abstract

Applying information entropy theory, we present a measure of dependence for multivariate relationships: the high-dimensional maximal mutual information coefficient (HMIC). It is a maximal information-based nonparametric exploration (MINE) statistic for identifying and classifying relationships in large data sets, and it generalizes the maximal information coefficient (MIC), which measures dependence between pairs of variables. To reduce the computational complexity of HMIC, an improved uniform grid based on the data-grid idea is proposed. In addition, an optimal single-axis partition algorithm (SAR) is built to ensure the feasibility of the HMIC measurement. Finally, we apply HMIC to analyze data sets of physical measurements among college students.
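The paper's exact algorithms (the improved uniform grid and the SAR partition) are not reproduced in this abstract. As a rough illustration of the underlying idea, the sketch below estimates a grid-based, normalized mutual-information coefficient in the two-variable (MIC-like) case that HMIC generalizes; the function names `grid_mi` and `grid_mic` and the fixed uniform bin count are illustrative assumptions, not the authors' method.

```python
import numpy as np
from collections import Counter

def grid_mi(x, y, bins=8):
    """Estimate mutual information (in nats) between x and y by
    discretizing each axis with a uniform grid of `bins` cells."""
    # Interior bin edges; np.digitize maps samples to cells 0..bins-1.
    cx = np.digitize(x, np.linspace(x.min(), x.max(), bins + 1)[1:-1])
    cy = np.digitize(y, np.linspace(y.min(), y.max(), bins + 1)[1:-1])
    n = len(x)
    pxy = Counter(zip(cx, cy))   # joint cell counts
    px = Counter(cx)             # marginal counts on the x-axis
    py = Counter(cy)             # marginal counts on the y-axis
    mi = 0.0
    for (a, b), c in pxy.items():
        p = c / n
        # p * log( p(x,y) / (p(x) p(y)) ), written with raw counts
        mi += p * np.log(p * n * n / (px[a] * py[b]))
    return mi

def grid_mic(x, y, bins=8):
    # MIC-style normalization: divide by log of the grid resolution,
    # so a deterministic relationship scores near 1 and noise near 0.
    return grid_mi(x, y, bins) / np.log(bins)
```

HMIC extends this idea to more than two variables, where searching over all grids becomes expensive; the paper's uniform-grid and SAR constructions address exactly that cost.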

CITATION STYLE

APA

Jiang, Y. S., Zhang, D. K., Wang, X. M., & Zhu, W. Y. (2016). Exploring high dimension large data correlation analysis with mutual information and application. In Advances in Intelligent Systems and Computing (Vol. 443, pp. 361–371). Springer Verlag. https://doi.org/10.1007/978-3-319-30874-6_34
