An extended Čencov characterization of the information metric

  • Campbell, L. L.
87 citations · 30 Mendeley readers

Abstract

Čencov has shown that Riemannian metrics which are derived from the Fisher information matrix are the only metrics which preserve inner products under certain probabilistically important mappings. In Čencov’s theorem, the underlying differentiable manifold is the probability simplex $\sum_{i=1}^{n} x_i = 1$, $x_i > 0$. For some purposes of using geometry to obtain insights about probability, it is more convenient to regard the simplex as a hypersurface in the positive cone. In the present paper Čencov’s result is extended to the positive cone. The proof uses standard techniques of differential geometry but does not use the language of category theory.
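
As background (a sketch for orientation, not quoted from the paper): on the open simplex, tangent vectors $u$, $v$ at a point $x$ satisfy $\sum_i u_i = \sum_i v_i = 0$, and the Fisher information metric is, up to a constant factor,

\[
  \langle u, v \rangle_x = \sum_{i=1}^{n} \frac{u_i v_i}{x_i},
  \qquad \sum_{i=1}^{n} x_i = 1,\ x_i > 0 .
\]

The extension characterized in the paper concerns Riemannian metrics on the positive cone $\{x \in \mathbb{R}^n : x_i > 0\}$, of which the simplex is a hypersurface, that preserve inner products under the same probabilistically important mappings.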

Citation (APA)

Campbell, L. L. (1986). An extended Čencov characterization of the information metric. Proceedings of the American Mathematical Society, 98(1), 135–141. https://doi.org/10.1090/s0002-9939-1986-0848890-5
