Certain Inequalities in Information Theory and the Cramer-Rao Inequality

  • Kullback, S.

Abstract

The Cramer-Rao inequality provides, under certain regularity conditions, a lower bound for the variance of an estimator [7], [15]. Various generalizations, extensions, and improvements of the bound have been made by Barankin [1], [2], Bhattacharyya [3], Chapman and Robbins [5], Fraser and Guttman [11], Kiefer [12], and Wolfowitz [16], among others. Further consideration of certain inequality properties of a measure of information discussed by Kullback and Leibler [14] yields a greater lower bound for the information measure (formula (4.11)) and leads to a result that may be regarded as a generalization of the Cramer-Rao inequality, the latter following as a special case. The results are used to define discrimination efficiency and estimation efficiency at a point in parameter space.
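For orientation, the standard textbook forms of the two quantities the abstract refers to are sketched below; these are the classical statements, not the paper's own sharpened bound (formula (4.11)), which is not reproduced here. The notation I(1:2) follows Kullback and Leibler [14]; the regularity conditions are the usual ones.

% Kullback-Leibler information measure between densities f_1 and f_2
% (the measure whose inequality properties the paper sharpens):
I(1\!:\!2) = \int f_1(x)\,\log\frac{f_1(x)}{f_2(x)}\,d\lambda(x).

% Classical scalar Cramer-Rao inequality: for T an unbiased estimator of
% \theta based on n independent observations, under regularity conditions,
\mathrm{Var}_\theta(T) \;\ge\; \frac{1}{n\,I(\theta)},
\qquad
I(\theta) = E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{\!2}\right].

The paper's generalized inequality recovers this bound as a special case, as the abstract states.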

Citation (APA)

Kullback, S. (1954). Certain Inequalities in Information Theory and the Cramer-Rao Inequality. The Annals of Mathematical Statistics, 25(4), 745–751. https://doi.org/10.1214/aoms/1177728660
