We study Bayesian procedures for sparse linear regression when the unknown error distribution is endowed with a non-parametric prior. Specifically, we place a symmetrized Dirichlet process mixture of Gaussians prior on the error density, where the mixing distributions are compactly supported. For the prior on the regression coefficients, a mixture of point masses at zero and continuous distributions is considered. Under the assumption that the model is well specified, we study the behavior of the posterior as the number of predictors diverges. The compatibility and restricted eigenvalue conditions yield the minimax convergence rate of the regression coefficients in the ℓ1- and ℓ2-norms, respectively. In addition, strong model selection consistency and a semi-parametric Bernstein-von Mises theorem are proven under slightly stronger conditions.
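To make the prior construction concrete, the following is a minimal sketch of the model hierarchy described above; the specific notation (phi_sigma for the Gaussian kernel, DP(alpha, G_0) for the Dirichlet process, pi and g for the slab weight and density) is illustrative and not taken verbatim from the paper.

\begin{align*}
  y_i &= x_i^\top \beta + \varepsilon_i, \qquad \varepsilon_i \overset{\text{iid}}{\sim} f, \quad i = 1, \dots, n, \\
  f(x) &= \tfrac{1}{2}\bigl\{ f_P(x) + f_P(-x) \bigr\}, \qquad
      f_P(x) = \int \phi_\sigma(x - \mu)\, dP(\mu, \sigma), \\
  P &\sim \mathrm{DP}(\alpha, G_0), \qquad P \text{ supported on a compact set}, \\
  \beta_j &\overset{\text{iid}}{\sim} (1 - \pi)\,\delta_0 + \pi\, g, \qquad j = 1, \dots, p,
\end{align*}

where \phi_\sigma denotes the N(0, \sigma^2) density, \delta_0 is a point mass at zero (the "spike"), and g is a continuous slab density. The symmetrization of the mixture density and the compactly supported mixing distribution correspond to the error prior described in the abstract, while the point-mass mixture on each \beta_j encodes sparsity.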
Chae, M., Lin, L., & Dunson, D. B. (2019). Bayesian sparse linear regression with unknown symmetric error. Information and Inference, 8(3), 621–653. https://doi.org/10.1093/imaiai/iay022