Bayesian regression based on principal components for high-dimensional data

6 citations · 17 readers (Mendeley)

This article is free to access.

Abstract

The Gaussian sequence model can be obtained from the high-dimensional regression model through principal component analysis, and it is shown to be equivalent to the original high-dimensional regression model in terms of prediction. Under a sparsity condition, we investigate posterior consistency and convergence rates for the Gaussian sequence model. In particular, we examine two modeling strategies: Bayesian inference with and without covariate selection. For Bayesian inference without covariate selection, we obtain consistency results for the estimators and posteriors under normal priors with constant and decreasing variances, and for the James-Stein estimator; for Bayesian inference with covariate selection, we obtain convergence rates for the Bayesian model averaging (BMA) and median probability model (MPM) estimators, and for the posterior under a variable selection prior. Based on these results, we conclude that variable selection is essential in high-dimensional Bayesian regression. A simulation study confirms this conclusion. The methodologies are applied to a climate prediction problem. © 2013 Elsevier Inc.
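The reduction described in the abstract, regressing the response on the leading principal component scores of the design matrix rather than on the original covariates, can be illustrated with a minimal sketch. This is not the authors' implementation; the sample sizes, the number of retained components `k`, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative high-dimensional setting: n samples, p >> n covariates
n, p, k = 50, 200, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 2.0                      # sparse true coefficient vector
y = X @ beta + 0.1 * rng.standard_normal(n)

# Principal components via SVD of the centered design matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                   # scores on the top-k components

# Because the score columns are orthogonal (Z'Z = diag(s^2)), least squares
# in the component coordinates reduces to a coordinatewise (diagonal) solve,
# which is the sequence-model form the abstract refers to.
theta = Z.T @ (y - y.mean()) / (s[:k] ** 2)
y_hat = y.mean() + Z @ theta
```

The orthogonality of the component scores is what makes the reduced problem a Gaussian sequence model: each coordinate of `theta` is estimated independently, so shrinkage rules such as James-Stein or selection priors can be applied coordinate by coordinate.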

Citation (APA)

Lee, J., & Oh, H. S. (2013). Bayesian regression based on principal components for high-dimensional data. Journal of Multivariate Analysis, 117, 175–192. https://doi.org/10.1016/j.jmva.2013.02.002
