This article investigates variable selection in high dimension for a non-parametric regression model. In many concrete situations, we are concerned with estimating a non-parametric regression function f that may depend on a large number p of input variables. Unlike standard procedures, we do not assume that f belongs to a class of regular functions (Hölder, Sobolev, . . . ); we only assume that f is square-integrable with respect to a known product measure. Furthermore, in some situations, only a small number s of the coordinates actually affects f, in an additive manner. In this context, we prove that, with only O(s log p) random evaluations of f, one can identify the relevant input variables with overwhelming probability. Our proposed method is an unconstrained ℓ1-minimization procedure based on Sobol's method. One step of this procedure relies on support recovery using ℓ1-minimization and thresholding. More precisely, we use a thresholded-LASSO to faithfully uncover the significant input variables. In this framework, we prove that one can relax the mutual incoherence property (known to require O(s^2 log p) observations) and still ensure faithful recovery from O(s^α log p) observations for any 1 ≤ α ≤ 2.
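The Sobol method mentioned above can be made concrete with the classical pick-freeze estimator of first-order Sobol indices: for each coordinate i, re-evaluate f on a second sample that "freezes" coordinate i and resamples all the others, and estimate S_i = Cov(Y, Y_i)/Var(Y). The sketch below is an illustration under stated assumptions, not the paper's randomized high-dimensional procedure: the test function f, the sample size N, and the crude cut-off on the estimated indices (where the paper uses a thresholded-LASSO) are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
p, s, N = 20, 3, 20000  # ambient dimension, sparsity, Monte Carlo sample size

# Illustrative sparse additive test function: only the first s coordinates
# matter (the paper treats an unknown f with this kind of structure).
def f(X):
    return X[:, 0] + X[:, 1] + X[:, 2]

def pick_freeze_sobol(f, i, N, p, rng):
    """First-order Sobol index of input i via the pick-freeze estimator:
    S_i = Cov(Y, Y_i) / Var(Y), where Y_i reuses ("freezes") coordinate i
    of the first sample and resamples all other coordinates."""
    X = rng.standard_normal((N, p))
    Xi = rng.standard_normal((N, p))
    Xi[:, i] = X[:, i]                       # freeze the i-th coordinate
    Y, Yi = f(X), f(Xi)
    m = 0.5 * (Y.mean() + Yi.mean())         # pooled mean estimate
    num = (Y * Yi).mean() - m * m            # covariance estimate
    den = 0.5 * (Y @ Y + Yi @ Yi) / N - m * m  # variance estimate
    return num / den

S = np.array([pick_freeze_sobol(f, i, N, p, rng) for i in range(p)])
# Crude threshold on the estimated indices; the paper's method replaces
# this naive step with a thresholded-LASSO to handle high dimension.
relevant = np.flatnonzero(S > 0.1)
print(relevant)  # the s active coordinates
```

For this additive f with i.i.d. standard normal inputs, the true indices are S_i = 1/3 for the three active coordinates and 0 otherwise, so the cut-off cleanly separates them at this sample size. Note this naive coordinate-by-coordinate loop costs O(p) evaluations per index, which is exactly what the paper's O(s log p) randomized design avoids.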
Citation
De Castro, Y., & Janon, A. (2015). Randomized pick-freeze for sparse Sobol indices estimation in high dimension. ESAIM - Probability and Statistics, 19, 725–745. https://doi.org/10.1051/ps/2015013