Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression

Abstract

We study the asymptotic properties of Lasso+mLS and Lasso+Ridge under the sparse high-dimensional linear regression model: Lasso selects the predictors and then modified Least Squares (mLS) or Ridge estimates their coefficients. First, we propose a valid inference procedure for parameter estimation based on a parametric residual bootstrap after Lasso+mLS and Lasso+Ridge. Second, we derive the asymptotic unbiasedness of Lasso+mLS and Lasso+Ridge. More specifically, we show that their biases decay at an exponential rate and that they can achieve the oracle convergence rate of s/n (where s is the number of nonzero regression coefficients and n is the sample size) for mean squared error (MSE). Third, we show that Lasso+mLS and Lasso+Ridge are asymptotically normal. They have an oracle property in the sense that they select the true predictors with probability converging to 1, and the estimates of the nonzero parameters have the same asymptotic normal distribution that they would have if the zero parameters were known in advance. In fact, our analysis is not limited to using Lasso in the selection stage; it applies to any model selection criterion whose probability of selecting a wrong model decays at an exponential rate.
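
To make the two-stage procedure concrete, the following is a minimal Python sketch (using numpy and scikit-learn): Lasso selects the support, the selected coefficients are then re-estimated by least squares and by Ridge, and a residual bootstrap loop resamples the fit for inference. The function names, the regularization parameters lasso_alpha and ridge_alpha, the use of plain OLS in place of the paper's mLS (which adds a technical modification we omit), and the resampled-residual bootstrap recipe are all illustrative assumptions, not the paper's exact specification.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression, Ridge

def lasso_plus_refit(X, y, lasso_alpha=0.1, ridge_alpha=1.0):
    """Two-stage estimators: Lasso selects a support, then the selected
    coefficients are re-estimated by OLS (standing in for mLS) and by
    Ridge. Assumes centered data (no intercept). Returns full-length
    coefficient vectors with zeros off the selected support."""
    n, p = X.shape

    # Stage 1: Lasso selects the predictors (nonzero coefficients).
    lasso = Lasso(alpha=lasso_alpha, fit_intercept=False).fit(X, y)
    support = np.flatnonzero(lasso.coef_)

    beta_mls = np.zeros(p)
    beta_ridge = np.zeros(p)
    if support.size == 0:  # Lasso selected no predictors
        return support, beta_mls, beta_ridge

    XS = X[:, support]
    # Stage 2a: least squares on the selected support
    # (plain OLS here; the paper's mLS modifies this estimator).
    beta_mls[support] = LinearRegression(fit_intercept=False).fit(XS, y).coef_
    # Stage 2b: Ridge on the selected support.
    beta_ridge[support] = Ridge(alpha=ridge_alpha, fit_intercept=False).fit(XS, y).coef_
    return support, beta_mls, beta_ridge

def residual_bootstrap(X, y, B=500, lasso_alpha=0.1, ridge_alpha=1.0, seed=0):
    """Residual bootstrap after Lasso+mLS, sketched from the abstract:
    resample centered residuals of the two-stage fit, rebuild y*, and
    rerun the full two-stage procedure on each bootstrap sample."""
    rng = np.random.default_rng(seed)
    _, beta, _ = lasso_plus_refit(X, y, lasso_alpha, ridge_alpha)
    resid = y - X @ beta
    resid -= resid.mean()  # center the residuals before resampling
    boot = np.empty((B, X.shape[1]))
    for b in range(B):
        y_star = X @ beta + rng.choice(resid, size=len(y), replace=True)
        _, boot[b], _ = lasso_plus_refit(X, y_star, lasso_alpha, ridge_alpha)
    return beta, boot
```

Percentile intervals over the bootstrap draws (e.g. np.percentile(boot, [2.5, 97.5], axis=0)) then give coefficient-wise confidence intervals of the kind whose validity the paper establishes for Lasso+mLS and Lasso+Ridge.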

Citation (APA)

Liu, H., & Yu, B. (2013). Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression. Electronic Journal of Statistics, 7(1), 3124–3169. https://doi.org/10.1214/13-EJS875
