A kernel smoother is an intuitive estimate of a regression function or conditional expectation; at each point x0 the estimate of E(Y | x0) is a weighted mean of the sample Yi, with observations close to x0 receiving the largest weights. Unfortunately this simplicity has flaws. At the boundary of the predictor space, the kernel neighborhood is asymmetric and the estimate may have substantial bias. Bias can be a problem in the interior as well if the predictors are nonuniform or if the regression function has substantial curvature. These problems are particularly severe when the predictors are multidimensional. A variety of kernel modifications have been proposed to provide approximate and asymptotic adjustment for these biases. Such methods generally place substantial restrictions on the regression problems that can be considered; in unfavorable situations, they can perform very poorly. Moreover, the necessary modifications are very difficult to implement in the multidimensional case. Local regression smoothers fit low-order polynomials in x locally at x0, and the estimate of f(x0) is taken from the fitted polynomial at x0. They automatically, intuitively and simultaneously adjust for both the biases above to the given order and generalize naturally to the multidimensional case. They also provide natural estimates for the derivatives of f, an approach more attractive than using higher-order kernel functions for the same purpose. © 1993, Institute of Mathematical Statistics. All Rights Reserved.
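The following is a minimal sketch (not from the paper) contrasting the two estimators described above: a plain kernel smoother, which returns a locally weighted mean of the Yi, and a local linear regression, which fits a weighted straight line near x0 and evaluates it there. The Gaussian kernel, the bandwidth h, and the toy data are assumptions chosen for illustration only.

```python
import numpy as np

def gaussian_kernel(u):
    """Gaussian weight function; any symmetric kernel would serve."""
    return np.exp(-0.5 * u**2)

def kernel_smooth(x0, x, y, h):
    """Weighted mean of y, with weights centered at x0 (Nadaraya-Watson form)."""
    w = gaussian_kernel((x - x0) / h)
    return np.sum(w * y) / np.sum(w)

def local_linear(x0, x, y, h):
    """Fit a weighted line in (x - x0) and take the fitted value at x0."""
    w = gaussian_kernel((x - x0) / h)
    X = np.column_stack([np.ones_like(x), x - x0])   # intercept and slope terms
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]          # beta[0] estimates f(x0); beta[1] estimates f'(x0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 1, 200))
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)
    # At the boundary (x0 = 0) the kernel neighborhood is one-sided, so the
    # weighted mean is pulled toward the interior; the local linear fit
    # adjusts for this automatically.
    for x0 in (0.0, 0.5):
        print(x0, kernel_smooth(x0, x, y, h=0.1), local_linear(x0, x, y, h=0.1))
```

As the comments note, the local linear fit returns a slope coefficient as well, which is the sense in which local regression yields natural derivative estimates.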
Hastie, T., & Loader, C. (1993). Local regression: Automatic Kernel carpentry. Statistical Science, 8(2), 120–129. https://doi.org/10.1214/ss/1177011002