We propose two new methods for nonparametric inference, applicable to both deconvolution and regression with errors in explanatory variables. The two approaches involve kernel or orthogonal series methods. They are based on defining a low order approximation to the problem at hand and proceed by constructing relatively accurate estimators of that quantity, rather than attempting to estimate the true target functions consistently. Both techniques could, of course, be employed to construct consistent estimators, but in many important settings (e.g. when the errors are Gaussian) consistency is, from a practical viewpoint, an unattainable goal. We rephrase the problem in a form for which an explicit, interpretable, low order approximation is available. The only information that we require about the error distribution (the errors-in-variables distribution, in the case of regression) is its low order moments, which are readily obtainable from a rudimentary analysis of indirect measurements of the errors, e.g. through repeated measurements. In particular, we do not need to estimate a function, such as a characteristic function, that expresses detailed properties of the error distribution. This feature, coupled with the fact that all our estimators are defined explicitly in terms of readily computable averages, makes the methods particularly economical in computing time.
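As an illustrative sketch only (not the authors' estimator), the spirit of a low order, moment-based deconvolution correction can be shown as follows. If one observes W = X + U with error U of unknown distribution, a first-order Taylor approximation gives f_X ≈ f_W − (Var(U)/2) f_W″, which requires only the error's second moment; that moment can be estimated from repeated measurements via Var(W1 − W2) = 2 Var(U). All variable names, sample sizes, and bandwidths below are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated latent signal X and Gaussian measurement error U; we observe W = X + U.
# (Gaussian error is used only for the simulation; the correction below does not
# assume a parametric error distribution, only its second moment.)
n = 2000
X = rng.normal(0.0, 1.0, n)
sigma_u = 0.5
# Two repeated noisy measurements of each X, used only to estimate Var(U).
W1 = X + rng.normal(0.0, sigma_u, n)
W2 = X + rng.normal(0.0, sigma_u, n)

# Low order moment of the error from repeated measurements:
# Var(W1 - W2) = 2 Var(U), so no estimate of the full error distribution
# (e.g. its characteristic function) is needed.
sigma2_hat = np.var(W1 - W2, ddof=1) / 2.0

def kde(points, grid, h):
    """Gaussian kernel density estimate of the observed data on `grid`."""
    z = (grid[:, None] - points[None, :]) / h
    return np.exp(-0.5 * z**2).mean(axis=1) / (h * np.sqrt(2.0 * np.pi))

def kde_second_deriv(points, grid, h):
    """Second derivative of the Gaussian KDE (phi''(z) = (z^2 - 1) phi(z))."""
    z = (grid[:, None] - points[None, :]) / h
    return ((z**2 - 1.0) * np.exp(-0.5 * z**2)).mean(axis=1) / (h**3 * np.sqrt(2.0 * np.pi))

grid = np.linspace(-4.0, 4.0, 161)
h = 0.35  # assumed bandwidth, not tuned
f_w = kde(W1, grid, h)
# First-order deconvolution correction: f_X ~ f_W - (Var(U)/2) f_W''.
# Everything is an explicit, readily computable average over the data.
f_x_approx = f_w - 0.5 * sigma2_hat * kde_second_deriv(W1, grid, h)
```

At the mode the correction raises the density estimate (f_W″ < 0 there), partially undoing the flattening caused by measurement error; the whole computation is a handful of vectorized averages, which illustrates the economy in computing time mentioned above.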
Carroll, R. J., & Hall, P. (2004). Low order approximations in deconvolution and regression with errors in variables. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 66(1), 31–46. https://doi.org/10.1111/j.1467-9868.2004.00430.x