We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L^2 spaces and Poincaré inequalities to provide a better understanding of the decrease in Fisher information implied by results of Barron and Brown. We show that if the standardized Fisher information ever becomes finite, then it converges to zero.
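To make the abstract easier to parse, the following is a minimal sketch of the quantities it refers to, using the standard definitions of Fisher information, standardized Fisher information, and relative entropy with respect to a matching Gaussian; these definitions and the normalization of the sums are assumed from the general literature rather than quoted from the paper itself.

```latex
% A minimal sketch of the standard quantities the abstract refers to.
% The definitions below are the usual ones in this literature and are
% assumed here; they are not reproduced verbatim from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Let $X$ have smooth density $f$, mean $0$ and variance $\sigma^2$, and let
$Z \sim N(0,\sigma^2)$ have density $\phi$. Fisher information, standardized
Fisher information, and relative entropy are
\[
  I(X) = \int \Bigl(\frac{f'(x)}{f(x)}\Bigr)^{2} f(x)\,dx, \qquad
  J_{\mathrm{st}}(X) = \sigma^{2} I(X) - 1, \qquad
  D(X \,\|\, Z) = \int f(x)\log\frac{f(x)}{\phi(x)}\,dx .
\]
For i.i.d.\ copies $X_1, X_2, \ldots$ of $X$ and normalized sums
$S_n = (X_1 + \cdots + X_n)/\sqrt{n}$, the rate discussed in the abstract is
\[
  J_{\mathrm{st}}(S_n) = O(1/n)
  \quad\text{and}\quad
  D(S_n \,\|\, Z) = O(1/n),
\]
under suitable conditions on $X$ (the abstract points to Poincar\'e
inequalities as the relevant tool).

\end{document}
```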
Citation
Johnson, O., & Barron, A. (2004). Fisher information inequalities and the central limit theorem. Probability Theory and Related Fields, 129(3), 391–409. https://doi.org/10.1007/s00440-004-0344-0