We consider the asymptotic behavior of posterior distributions when the model is misspecified. Given a prior distribution and a random sample from a distribution P_0, which may not be in the support of the prior, we show that the posterior concentrates its mass near the points in the support of the prior that minimize the Kullback-Leibler divergence with respect to P_0. An entropy condition and a prior-mass condition determine the rate of convergence. The method is applied to several examples, with special interest in infinite-dimensional models. These include Gaussian mixtures, nonparametric regression and parametric models. © Institute of Mathematical Statistics, 2006.
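The central object can be sketched in standard notation (the symbols below are generic notation, not taken verbatim from the paper): with a prior $\Pi$ on a model $\mathcal{P}$ and i.i.d. observations $X_1,\dots,X_n \sim P_0$, the posterior concentrates near the Kullback-Leibler projection of $P_0$ onto the model,

```latex
\[
  P^{*} \;=\; \operatorname*{arg\,min}_{P \in \mathcal{P}} \, \mathrm{KL}(P_0 \,\|\, P),
  \qquad
  \mathrm{KL}(P_0 \,\|\, P) \;=\; \int \log \frac{dP_0}{dP} \, dP_0,
\]
\[
  \Pi\bigl( P : d(P, P^{*}) > \varepsilon \mid X_1,\dots,X_n \bigr)
  \;\longrightarrow\; 0
  \quad \text{in } P_0\text{-probability, for every } \varepsilon > 0,
\]
```

with the rate of this convergence governed by the entropy and prior-mass conditions mentioned above.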
Kleijn, B. J. K., & Van Der Vaart, A. W. (2006). Misspecification in infinite-dimensional Bayesian statistics. Annals of Statistics, 34(2), 837–877. https://doi.org/10.1214/009053606000000029