Checking for prior-data conflict using prior-to-posterior divergences


Abstract

When using complex Bayesian models to combine information, checking consistency of the information contributed by different components of the model for inference is good statistical practice. Here a new method is developed for detecting prior-data conflicts in Bayesian models based on comparing the observed value of a prior-to-posterior divergence to its distribution under the prior predictive distribution for the data. The divergence measure used in our model check is a measure of how much beliefs have changed from prior to posterior, and can be thought of as a measure of the overall size of a relative belief function. It is shown that the proposed method is intuitive, has desirable properties, can be extended to hierarchical settings, and is related asymptotically to Jeffreys' and reference prior distributions. In the case where calculations are difficult, the use of variational approximations as a way of relieving the computational burden is suggested. The methods are compared in a number of examples with an alternative but closely related approach in the literature based on the prior predictive distribution of a minimal sufficient statistic.
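The check described in the abstract can be illustrated in a toy setting. The sketch below is not the authors' implementation; it assumes a conjugate normal-normal model (known data variance), uses the Kullback-Leibler divergence from posterior to prior as the prior-to-posterior divergence, and compares the observed divergence to its distribution under the prior predictive of the sample mean via Monte Carlo. All function names are illustrative.

```python
import numpy as np

def kl_normal(m1, s1, m0, s0):
    # KL( N(m1, s1^2) || N(m0, s0^2) ), closed form for univariate normals
    return np.log(s0 / s1) + (s1**2 + (m1 - m0)**2) / (2 * s0**2) - 0.5

def prior_to_posterior_divergence(ybar, n, sigma, mu0, tau0):
    # Conjugate update: prior mu ~ N(mu0, tau0^2), y_i ~ N(mu, sigma^2)
    post_prec = 1 / tau0**2 + n / sigma**2
    post_var = 1 / post_prec
    post_mean = post_var * (mu0 / tau0**2 + n * ybar / sigma**2)
    # Divergence from posterior to prior, as a function of the data
    return kl_normal(post_mean, np.sqrt(post_var), mu0, tau0)

def conflict_pvalue(ybar_obs, n, sigma, mu0, tau0, n_sim=20000, seed=0):
    # Tail probability of the observed divergence under the prior
    # predictive distribution of the sample mean; small values
    # indicate prior-data conflict.
    rng = np.random.default_rng(seed)
    d_obs = prior_to_posterior_divergence(ybar_obs, n, sigma, mu0, tau0)
    # Prior predictive of the sample mean: N(mu0, tau0^2 + sigma^2/n)
    ybar_sim = rng.normal(mu0, np.sqrt(tau0**2 + sigma**2 / n), n_sim)
    d_sim = prior_to_posterior_divergence(ybar_sim, n, sigma, mu0, tau0)
    return np.mean(d_sim >= d_obs)
```

With a prior centered far from the observed sample mean the tail probability is near zero (conflict flagged); when prior and data agree it is large.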

Citation (APA)

Nott, D. J., Wang, X., Evans, M., & Englert, B. G. (2020). Checking for prior-data conflict using prior-to-posterior divergences. Statistical Science, 35(2), 234–253. https://doi.org/10.1214/19-STS731
