Spiked Dirichlet process priors for Gaussian process models

Abstract

We expand a framework for Bayesian variable selection for Gaussian process (GP) models by employing spiked Dirichlet process (DP) prior constructions over set partitions of the covariates. Our approach results in a nonparametric treatment of the distribution of the covariance parameters of the GP covariance matrix that, in turn, induces a clustering of the covariates. We evaluate two prior constructions: the first employs a mixture of a point mass and a continuous distribution as the centering distribution for the DP prior, thereby clustering all covariates; the second employs a mixture of a spike and a DP prior with a continuous centering distribution, which induces clustering of the selected covariates only. DP models borrow information across covariates through model-based clustering. In particular, our simulation results show a reduction in posterior sampling variability and, in turn, improved predictive performance. In our model formulations, we accomplish posterior inference by employing novel combinations and extensions of existing algorithms for inference with DP prior models, and we compare performance under the two prior constructions. Copyright © 2010 Terrance Savitsky and Marina Vannucci.
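As a rough illustration of the first construction described above, the sketch below draws covariance parameters for a set of covariates from a truncated stick-breaking DP whose centering distribution is a spike-and-slab mixture of a point mass at zero and a continuous density. The Gamma(1, 1) slab, the spike probability `pi0`, and the truncation level are illustrative assumptions for this sketch, not the authors' actual specification; the point is only to show how shared atoms induce clustering of covariates and how the spike component sets some covariates' parameters exactly to zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def spiked_dp_draw(n_cov, alpha=1.0, pi0=0.5, trunc=50):
    """One prior draw of per-covariate covariance parameters from a
    truncated DP(alpha, G0), where the centering distribution is the
    spike-and-slab mixture G0 = pi0 * delta_0 + (1 - pi0) * Gamma(1, 1).
    (Gamma slab, pi0, and truncation level are illustrative choices.)"""
    # Stick-breaking weights of the truncated DP
    v = rng.beta(1.0, alpha, size=trunc)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    w /= w.sum()  # renormalize mass lost to truncation
    # Atoms from the spike-and-slab base: a zero atom excludes a covariate
    spike = rng.random(trunc) < pi0
    atoms = np.where(spike, 0.0, rng.gamma(1.0, 1.0, size=trunc))
    # Each covariate selects an atom; ties in `labels` cluster covariates
    labels = rng.choice(trunc, size=n_cov, p=w)
    return atoms[labels], labels

theta, labels = spiked_dp_draw(n_cov=10)
```

Covariates sharing a label receive identical parameter values, which is the model-based clustering through which the DP borrows information across covariates; those assigned a zero atom are effectively deselected.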

Citation (APA)

Savitsky, T., & Vannucci, M. (2010). Spiked Dirichlet process priors for Gaussian process models. Journal of Probability and Statistics. https://doi.org/10.1155/2010/201489
