Measuring independence between statistical randomness tests by mutual information

Abstract

The analysis of independence between statistical randomness tests has received considerable attention in the literature recently. Detecting dependencies between statistical randomness tests makes it possible to identify tests that measure similar characteristics, and thus to minimize the number of tests that need to be applied. In this work, a method for detecting statistical dependency based on mutual information is proposed. The main advantage of using mutual information is its ability to detect nonlinear correlations, which cannot be captured by the linear correlation coefficient used in previous work. The method is applied to analyze the correlations among the tests of the National Institute of Standards and Technology (NIST) battery, which is used as a standard in the evaluation of randomness. The experimental results show the existence of statistical dependencies between the tests that had not been previously detected.
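As a rough illustration of the idea (not the authors' implementation), the sketch below estimates the mutual information between the p-value sequences of two hypothetical randomness tests using a simple histogram-based estimator, and contrasts it with the Pearson correlation coefficient. The test names, the synthetic data, and the choice of estimator are assumptions made for illustration only; the paper's actual estimation procedure may differ.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Histogram-based estimate of mutual information I(X; Y) in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()              # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)    # marginal of X
    py = pxy.sum(axis=0, keepdims=True)    # marginal of Y
    nonzero = pxy > 0                      # avoid log(0) terms
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

# Illustrative data: p-values produced by two hypothetical tests on the same sequences.
rng = np.random.default_rng(0)
p_test_a = rng.uniform(size=5000)
# A nonlinear (non-monotonic) dependence that the linear correlation coefficient misses.
p_test_b = np.abs(2 * p_test_a - 1) + 0.05 * rng.normal(size=5000)

pearson = np.corrcoef(p_test_a, p_test_b)[0, 1]
mi = mutual_information(p_test_a, p_test_b)
print(f"Pearson correlation: {pearson:.3f}")  # near zero despite the dependence
print(f"Mutual information:  {mi:.3f} nats")  # clearly positive, dependence detected
```

In this toy example the Pearson coefficient is close to zero because the relationship is non-monotonic, while the mutual information is clearly positive, which is the kind of dependency the abstract argues a linear correlation analysis would overlook.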

Citation (APA)
Karell-Albo, J. A., Legón-Pérez, C. M., Madarro-Capó, E. J., Rojas, O., & Sosa-Gómez, G. (2020). Measuring independence between statistical randomness tests by mutual information. Entropy, 22(7). https://doi.org/10.3390/e22070741
