We present a greedy method for simultaneously performing local bandwidth selection and variable selection in nonparametric regression. The method starts with a local linear estimator with large bandwidths and incrementally decreases the bandwidth of each variable for which the derivative of the estimator with respect to that bandwidth is large. The method, called rodeo (regularization of derivative expectation operator), conducts a sequence of hypothesis tests to threshold derivatives and is easy to implement. Under certain assumptions on the regression function and sampling density, it is shown that the rodeo applied to local linear smoothing avoids the curse of dimensionality, achieving near-optimal minimax rates of convergence in the number of relevant variables, as if these variables were isolated in advance. © Institute of Mathematical Statistics, 2008.
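As a rough illustration of the greedy scheme summarized above, the sketch below implements a rodeo-style bandwidth-shrinkage loop at a single query point using local linear regression with a product Gaussian kernel. It is a minimal sketch, not the paper's exact algorithm: the helper names (`local_linear_fit`, `rodeo`), the constants `c0`, `beta_shrink`, `h_min`, the threshold choice lambda_j = s_j * sqrt(2 log(n c_n)) with c_n = log n, and the assumption of a known noise variance `sigma2` are illustrative choices standing in for the estimates and constants specified in the paper.

```python
import numpy as np

def local_linear_fit(X, Y, x, h):
    """Local linear fit at x with product Gaussian kernel weights.
    Returns the fitted value, coefficients, and intermediates reused below."""
    n, d = X.shape
    Z = np.hstack([np.ones((n, 1)), X - x])        # local design [1, X_i - x]
    w = np.exp(-0.5 * np.sum(((X - x) / h) ** 2, axis=1))  # product Gaussian weights
    B = Z.T @ (w[:, None] * Z)                     # Z^T W Z
    Binv = np.linalg.pinv(B)
    beta = Binv @ (Z.T @ (w * Y))                  # weighted least squares coefficients
    return beta[0], beta, Z, w, Binv               # beta[0] is the estimate at x

def rodeo(X, Y, x, sigma2, c0=1.0, beta_shrink=0.9, h_min=1e-3):
    """Greedy hard-thresholding rodeo at a single point x (sketch).
    sigma2 is treated as known here; the paper plugs in an estimate.
    c0, beta_shrink, h_min, and the threshold constant are illustrative."""
    n, d = X.shape
    h = np.full(d, c0 / np.log(np.log(n)))         # large initial bandwidths (needs n >= 3)
    active = list(range(d))
    cn = np.log(n)                                 # illustrative threshold constant
    while active:
        _, beta, Z, w, Binv = local_linear_fit(X, Y, x, h)
        resid = Y - Z @ beta
        still_active = []
        for j in active:
            # derivative of the kernel weights with respect to h_j
            dw = w * (X[:, j] - x[j]) ** 2 / h[j] ** 3
            g = Binv[0] @ (Z.T * dw)               # e_1^T B^{-1} Z^T dW/dh_j
            Zj = g @ resid                         # estimated d m_hat(x) / d h_j
            # influence of Y on Z_j, used for its conditional standard deviation
            Lj = g - (g @ Z) @ (Binv @ (Z.T * w))
            sj = np.sqrt(sigma2 * np.sum(Lj ** 2))
            lam = sj * np.sqrt(2.0 * np.log(n * cn))   # hard threshold for |Z_j|
            if abs(Zj) > lam and h[j] * beta_shrink > h_min:
                h[j] *= beta_shrink                # large derivative: keep shrinking h_j
                still_active.append(j)
            # otherwise h_j is frozen and variable j leaves the active set
        active = still_active
    m_hat, *_ = local_linear_fit(X, Y, x, h)       # final fit with selected bandwidths
    return m_hat, h
```

Under this sketch, a call such as `rodeo(X, Y, x0, sigma2=1.0)` returns the fitted value at `x0` together with the final bandwidth vector; variables whose bandwidths were driven down are the ones the greedy tests flagged as relevant, while irrelevant variables keep their large initial bandwidths and are effectively smoothed out.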
Citation
Lafferty, J., & Wasserman, L. (2008). Rodeo: Sparse, greedy nonparametric regression. Annals of Statistics, 36(1), 28–63. https://doi.org/10.1214/009053607000000811