The fundamentals of information theory and their applications to testing statistical hypotheses have been known and available for some time. A new and heterogeneous body of statistical procedures based on information measures is currently developing, scattered throughout the literature. In this paper a unification is attained by consistent application of the concepts and properties of information theory. Our aim is to examine a wide range of divergence type measures and their applications to statistical inference, with special emphasis on multinomial and multivariate normal distributions. The “maximum likelihood” and “minimum discrepancy” principles are combined here to derive new approaches to discrimination between two groups or populations. To study the asymptotic properties of divergence statistics, we propose a unified expression, called the (h, φ)-divergence, which includes most divergences as particular cases. Under different assumptions it is shown that the asymptotic distributions of the (h, φ)-divergences are either normal or chi-square. From these results a wide range of statistical hypotheses about the parameters of one or two populations can be tested. Examples are given to clarify the discussion and provide simple illustrations. © 1994 Academic Press, Inc.
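As an illustrative sketch (the notation below follows the standard construction of this family in the divergence literature and is not quoted verbatim from the paper), the (h, φ)-divergence between two discrete distributions P = (p_1, …, p_M) and Q = (q_1, …, q_M) is obtained by composing an increasing function h with a Csiszár φ-divergence:

$$D_{\varphi}^{h}(P, Q) = h\!\left( \sum_{i=1}^{M} q_i \,\varphi\!\left( \frac{p_i}{q_i} \right) \right), \qquad \varphi \text{ convex},\ \varphi(1) = 0,\ h \text{ increasing},\ h(0) = 0.$$

Taking h(x) = x and φ(x) = x log x recovers the Kullback-Leibler divergence; other choices of (h, φ) yield measures such as the Rényi and Havrda-Charvát divergences. For a multinomial sample of size n with observed proportions $\hat{p}$, a normalized statistic of the form $2n\, D_{\varphi}^{h}(\hat{p}, p_0) / \big(h'(0)\, \varphi''(1)\big)$ is the kind of quantity whose limiting chi-square behaviour under the null hypothesis p = p_0 the abstract refers to; the precise regularity conditions and degrees of freedom are stated in the paper itself.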
Salicrú, M., Morales, D., Menéndez, M. L., & Pardo, L. (1994). On the applications of divergence type measures in testing statistical hypotheses. Journal of Multivariate Analysis, 51(2), 372–391. https://doi.org/10.1006/jmva.1994.1068