We investigate the asymptotic normality of the posterior distribution in the discrete setting, when the model dimension increases with the sample size. We consider a probability mass function θ0 on ℕ \ {0} and a sequence of truncation levels (kn)n satisfying kn³ ≤ n · inf_{1≤i≤kn} θ0(i). Let θ̂n denote the maximum likelihood estimate of (θ0(i))_{1≤i≤kn} and let Δn(θ0) denote the kn-dimensional vector whose i-th coordinate is √n(θ̂n(i) − θ0(i)) for 1 ≤ i ≤ kn. We check that, under mild conditions on θ0 and on the sequence of prior probabilities on the kn-dimensional simplices, after centering and rescaling, the variation distance between the posterior distribution recentered around θ̂n and rescaled by √n and the kn-dimensional Gaussian distribution N(Δn(θ0), I⁻¹(θ0)) converges in probability to 0. This theorem can be used to prove the asymptotic normality of Bayesian estimators of Shannon and Rényi entropies. The proofs are based on concentration inequalities for centered and non-centered chi-square (Pearson) statistics. The latter allow us to establish posterior concentration rates with respect to the Fisher distance rather than the Hellinger distance, as is commonplace in non-parametric Bayesian statistics. © 2009, Institute of Mathematical Statistics. All rights reserved.
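The Bernstein–von Mises phenomenon described above can be illustrated numerically. The sketch below is not from the paper: it fixes a small truncation level k and a hypothetical pmf theta0, uses a uniform Dirichlet prior (so the posterior for multinomial counts is Dirichlet by conjugacy), recenters posterior draws around the MLE, rescales by √n, and compares the empirical covariance to the inverse Fisher information diag(θ0) − θ0θ0ᵀ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: fixed truncation level k and pmf theta0 on {1, ..., k}
k, n = 4, 100_000
theta0 = np.array([0.4, 0.3, 0.2, 0.1])

# Observe multinomial counts and form the MLE theta_hat_n = counts / n
counts = rng.multinomial(n, theta0)
mle = counts / n

# Uniform Dirichlet(1, ..., 1) prior: posterior is Dirichlet(1 + counts)
post = rng.dirichlet(1.0 + counts, size=20_000)

# Recenter around the MLE and rescale by sqrt(n)
z = np.sqrt(n) * (post - mle)

# BvM predicts covariance close to I^{-1}(theta0) = diag(theta0) - theta0 theta0^T
inv_fisher = np.diag(theta0) - np.outer(theta0, theta0)
emp_cov = np.cov(z, rowvar=False)
print(np.max(np.abs(emp_cov - inv_fisher)))
```

The printed maximal entrywise deviation is small relative to the entries of I⁻¹(θ0), consistent with the total-variation convergence in the theorem; the remaining gap is Monte Carlo noise plus the O(1/√n) discrepancy between θ̂n and θ0.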
Boucheron, S., & Gassiat, E. (2009). A Bernstein–von Mises theorem for discrete probability distributions. Electronic Journal of Statistics, 3, 114–148. https://doi.org/10.1214/08-EJS262