Asymptotically constant-risk predictive densities when the distributions of data and target variables are different

Abstract

We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback-Leibler risk when the distributions of the data and the target variables are different but share a common unknown parameter. It is known that the Kullback-Leibler risk is asymptotically equal to the trace of the product of two matrices: the inverse of the Fisher information matrix for the data and the Fisher information matrix for the target variables. We assume that this trace has a unique maximum point with respect to the parameter. We construct asymptotically constant-risk Bayesian predictive densities using a prior depending on the sample size. Further, we apply the theory to the subminimax estimator problem and to prediction based on the binary regression model. © 2014 by the authors; licensee MDPI, Basel, Switzerland.
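The quantity at the heart of the abstract, tr(I_X(θ)⁻¹ I_Y(θ)), can be evaluated numerically for a toy one-parameter logistic model, in the spirit of the binary-regression application the abstract mentions. The following sketch is purely illustrative: the covariates `x_data` and `x_target` are assumptions, not values from the paper, and in the scalar-parameter case the trace reduces to a ratio of Fisher informations.

```python
# Hedged sketch: evaluating the scalar analogue of tr(I_X(theta)^{-1} I_Y(theta))
# for a toy Bernoulli/logistic model. Covariates are illustrative assumptions.
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def fisher_info_bernoulli(theta, x):
    """Fisher information of one Bernoulli(sigmoid(theta * x)) observation
    with respect to theta: I(theta) = x^2 * p * (1 - p)."""
    p = sigmoid(theta * x)
    return x**2 * p * (1.0 - p)

def risk_trace(theta, x_data, x_target):
    """Scalar analogue of tr(I_X^{-1} I_Y): the target variable's Fisher
    information divided by the data's Fisher information."""
    return fisher_info_bernoulli(theta, x_target) / fisher_info_bernoulli(theta, x_data)

# When x_data != x_target, the trace varies with theta, so it can have a
# unique maximum point -- the key assumption stated in the abstract.
thetas = np.linspace(-3.0, 3.0, 601)
vals = [risk_trace(t, x_data=1.0, x_target=2.0) for t in thetas]
theta_star = thetas[int(np.argmax(vals))]
print(theta_star)  # grid maximiser of the trace
```

For these particular covariates the trace is symmetric in θ and peaks at θ = 0; under the paper's assumption such a unique maximiser is what the sample-size-dependent prior is built around.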

Citation (APA)

Yano, K., & Komaki, F. (2014). Asymptotically constant-risk predictive densities when the distributions of data and target variables are different. Entropy, 16(6), 3026–3048. https://doi.org/10.3390/e16063026
