Log-linear pool to combine prior distributions: A suggestion for a calibration-based approach

Abstract

An important issue in group decision making is the suitable aggregation of experts' beliefs about a parameter of interest. Two widely used combination methods are the linear and log-linear pools, but a problem arises when the weights have to be selected. This paper provides a general decision-based procedure for obtaining the weights of a log-linear pooled prior distribution. The procedure uses the Kullback-Leibler divergence as a calibration tool. No information about the parameter of interest is assumed before the experts' beliefs are considered. A pooled prior distribution is then obtained whose expected calibration is optimal in the Kullback-Leibler sense. When no other information is available to the decision-maker before experimental data are obtained, the methodology generally leads to the selection of the most diffuse pooled prior. In most cases, a difficulty arises because the marginal distribution associated with the noninformative prior distribution is improper; for these cases, an alternative procedure is proposed. Finally, two applications show how the proposed techniques can be easily applied in practice. © 2012 International Society for Bayesian Analysis.
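To make the combination rule concrete, the following is a minimal sketch (not taken from the paper) of a log-linear pool of normal expert priors, pi(theta) proportional to the product of pi_i(theta)^w_i, together with a Kullback-Leibler divergence that could serve as a calibration score. The specific priors, weights, and diffuse reference distribution are illustrative assumptions, not the authors' choices.

```python
import numpy as np

def log_linear_pool_normal(means, variances, weights):
    """Log-linear (geometric) pool of normal priors N(mu_i, s2_i):
    pi(theta) proportional to prod_i N(theta; mu_i, s2_i)^w_i,
    which is again normal with precision sum_i w_i / s2_i."""
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = np.asarray(weights, dtype=float)
    pooled_precision = np.sum(weights / variances)
    pooled_mean = np.sum(weights * means / variances) / pooled_precision
    return pooled_mean, 1.0 / pooled_precision

def kl_normal(mu0, s2_0, mu1, s2_1):
    """KL(N(mu0, s2_0) || N(mu1, s2_1)); one possible way to score how far
    a pooled prior is from a reference (e.g., diffuse) distribution."""
    return 0.5 * (np.log(s2_1 / s2_0) + (s2_0 + (mu0 - mu1) ** 2) / s2_1 - 1.0)

# Two hypothetical experts' priors and a candidate weight vector (weights sum to 1).
mu, s2 = log_linear_pool_normal(means=[0.0, 2.0], variances=[1.0, 4.0],
                                weights=[0.6, 0.4])
print(mu, s2)
print(kl_normal(mu, s2, 0.0, 10.0))  # divergence from a diffuse reference prior
```

In this sketch one would compare candidate weight vectors by such a divergence-based score; the paper's actual calibration criterion and treatment of improper noninformative priors are more involved than this illustration.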

Citation (APA)
Rufo, M. J., Martín, J., & Pérez, C. J. (2012). Log-linear pool to combine prior distributions: A suggestion for a calibration-based approach. Bayesian Analysis, 7(2), 411–438. https://doi.org/10.1214/12-BA714
