Information geometry and sufficient statistics

Abstract

Information geometry provides a geometric approach to families of statistical models. The key geometric structures are the Fisher quadratic form and the Amari–Chentsov tensor. In statistics, the notion of a sufficient statistic expresses the criterion for passing from one model to another without loss of information. This raises the question of how the geometric structures behave under such sufficient statistics. While this is well studied in the case of finite sample size, in the infinite case we encounter technical problems concerning the appropriate topologies. Here, we introduce notions of parametrized measure models and tensor fields on them that exhibit the right behavior under statistical transformations. Within this framework, we can then handle the topological issues and show that the Fisher metric and the Amari–Chentsov tensor on statistical models are characterized, uniquely up to a constant, within the class of symmetric 2-tensor fields and 3-tensor fields by their invariance under sufficient statistics, thereby achieving a full generalization of the original result of Chentsov to infinite sample sizes. More generally, we decompose Markov morphisms between statistical models in terms of statistics. In particular, a monotonicity result for the Fisher information follows naturally.
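The invariance and monotonicity properties described in the abstract can be illustrated numerically in the finite sample size case. The sketch below is not taken from the paper; the one-parameter family and the two statistics are illustrative choices. It computes the Fisher information of a family on a 3-point sample space, then compares it with the Fisher information of the pushforward under a sufficient statistic (equal, since the conditional distribution within the merged outcomes does not depend on the parameter) and under a non-sufficient statistic (strictly smaller, by monotonicity).

```python
import numpy as np

def fisher_info(p, theta, h=1e-5):
    """One-parameter Fisher information I(theta) = sum_x p_x(theta) * (d/dtheta log p_x(theta))^2,
    with the derivative estimated by a central finite difference."""
    p0 = np.asarray(p(theta))
    dlogp = (np.log(p(theta + h)) - np.log(p(theta - h))) / (2 * h)
    return float(np.sum(p0 * dlogp ** 2))

# Illustrative family on the 3-point sample space {0, 1, 2}, 0 < theta < 1:
family = lambda t: np.array([t / 2, t / 2, 1 - t])

# Sufficient statistic: merge outcomes 0 and 1. Their conditional
# distribution (1/2, 1/2) does not depend on theta, so no information is lost.
sufficient = lambda t: np.array([t, 1 - t])

# Non-sufficient statistic: merge outcomes 1 and 2 instead.
lossy = lambda t: np.array([t / 2, 1 - t / 2])

theta = 0.3
I_full = fisher_info(family, theta)        # analytically 1/(theta*(1-theta))
I_suff = fisher_info(sufficient, theta)    # equal to I_full: invariance
I_lossy = fisher_info(lossy, theta)        # strictly smaller: monotonicity
```

Here `I_full` equals `I_suff` up to discretization error, while `I_lossy` is strictly smaller, matching the invariance of the Fisher metric under sufficient statistics and the monotonicity result stated in the abstract.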

Cite (APA)

Ay, N., Jost, J., Lê, H. V., & Schwachhöfer, L. (2015). Information geometry and sufficient statistics. Probability Theory and Related Fields, 162(1–2), 327–364. https://doi.org/10.1007/s00440-014-0574-8
