Efficient and fast spline-backfitted kernel smoothing of additive models

Abstract

A great deal of effort has been devoted to inference for additive models over the last decade. Among existing procedures, kernel-type estimators are too computationally costly to implement for high dimensions or large sample sizes, while spline-type estimators come with no asymptotic distribution theory or uniform convergence results. We propose a one-step backfitting estimator of the component functions in an additive regression model: spline estimators in the first stage, followed by kernel/local linear estimators in the second. Under weak conditions, the proposed estimator's pointwise distribution is asymptotically equivalent to that of a univariate kernel/local linear estimator, so the dimension is effectively reduced to one at any point. This dimension reduction holds uniformly over an interval under the assumption of normal errors. Monte Carlo evidence supports the asymptotic results for dimensions ranging from low to very high and sample sizes from moderate to large. The proposed confidence band is applied to the Boston housing data for linearity diagnosis. © 2007 The Institute of Statistical Mathematics, Tokyo.

Citation (APA)

Wang, J., & Yang, L. (2009). Efficient and fast spline-backfitted kernel smoothing of additive models. Annals of the Institute of Statistical Mathematics, 61(3), 663–690. https://doi.org/10.1007/s10463-007-0157-x
