Oracle inequalities and optimal inference under group sparsity

Abstract

We consider the problem of estimating a sparse linear regression vector β* under a Gaussian noise model, for the purpose of both prediction and model selection. We assume that prior knowledge is available on the sparsity pattern: the set of variables is partitioned into prescribed groups, only a few of which are relevant to the estimation process. This group sparsity assumption suggests considering the Group Lasso method as a means to estimate β*. We establish oracle inequalities for the prediction and ℓ2 estimation errors of this estimator. These bounds hold under a restricted eigenvalue condition on the design matrix. Under a stronger condition, we derive bounds for the estimation error in mixed (2, p)-norms with 1 ≤ p ≤ ∞. When p = ∞, this result implies that a thresholded version of the Group Lasso estimator selects the sparsity pattern of β* with high probability. Next, we prove that the rate of convergence of our upper bounds is optimal in a minimax sense, up to a logarithmic factor, for all estimators over a class of group sparse vectors. Furthermore, we establish lower bounds for the prediction and ℓ2 estimation errors of the usual Lasso estimator. Using this result, we demonstrate that the Group Lasso can achieve an improvement in the prediction and estimation errors as compared to the Lasso.
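For context, the Group Lasso estimator referred to above minimizes a squared-error criterion penalized by a sum of group-wise ℓ2 norms. A common formulation (the paper's exact normalization and group weights may differ) is

\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^p} \Big\{ \tfrac{1}{n}\,\|y - X\beta\|_2^2 + 2\lambda \sum_{j=1}^{M} \sqrt{|G_j|}\;\|\beta_{G_j}\|_2 \Big\},

where G_1, ..., G_M are the prescribed groups and β_{G_j} denotes the restriction of β to group G_j. The following Python sketch computes this estimator by proximal gradient descent (ISTA) with block soft-thresholding; the function names, the fixed iteration count, and the √|G_j| group weights are illustrative assumptions, not the paper's prescription.

```python
import numpy as np

def block_soft_threshold(v, t):
    """Prox of t * ||.||_2: shrink an entire coefficient group toward zero."""
    nrm = np.linalg.norm(v)
    return np.zeros_like(v) if nrm <= t else (1.0 - t / nrm) * v

def group_lasso_ista(X, y, groups, lam, n_iter=1000):
    """Proximal-gradient sketch for the group Lasso objective
        (1/(2n)) * ||y - X beta||_2^2 + lam * sum_j sqrt(|G_j|) * ||beta_{G_j}||_2.
    `groups` is a list of index arrays partitioning {0, ..., p-1}."""
    n, p = X.shape
    beta = np.zeros(p)
    L = np.linalg.norm(X, ord=2) ** 2 / n   # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n     # gradient of the smooth part
        z = beta - step * grad
        for g in groups:                    # group-wise proximal step
            beta[g] = block_soft_threshold(z[g], step * lam * np.sqrt(len(g)))
    return beta
```

The model selection result described in the abstract corresponds to thresholding this output: keep exactly the groups j whose fitted norm ||β̂_{G_j}||_2 exceeds a suitable threshold τ, e.g. `[j for j, g in enumerate(groups) if np.linalg.norm(beta[g]) > tau]`.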

Cite

APA
Lounici, K., Pontil, M., van de Geer, S., & Tsybakov, A. B. (2011). Oracle inequalities and optimal inference under group sparsity. Annals of Statistics, 39(4), 2164–2204. https://doi.org/10.1214/11-AOS896
