An exponential inequality for the distribution function of the kernel density estimator, with applications to adaptive estimation

Abstract

It is shown that the uniform distance between the distribution function F_n^K(h) of the usual kernel density estimator (based on an i.i.d. sample from an absolutely continuous law on ℝ) with bandwidth h and the empirical distribution function F_n satisfies an exponential inequality. This inequality is used to obtain sharp almost sure rates of convergence of ‖F_n^K(h_n) − F_n‖_∞ under mild conditions on the range of bandwidths h_n, including the usual MISE-optimal choices. Another application is a Dvoretzky–Kiefer–Wolfowitz-type inequality for ‖F_n^K(h) − F‖_∞, where F is the true distribution function. The exponential bound is also applied to show that an adaptive estimator can be constructed that efficiently estimates the true distribution function F in sup-norm loss and, at the same time, estimates the density of F, if it exists (but without assuming it does), at the best possible rate of convergence over Hölder balls, again in sup-norm loss. © 2008 Springer-Verlag.
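The two quantities compared in the abstract can be computed directly: the empirical distribution function F_n, and the distribution function F_n^K(h) of the kernel density estimator, which for a kernel K with integrated kernel 𝒦(x) = ∫_{−∞}^{x} K du equals (1/n) Σ_i 𝒦((x − X_i)/h). The following minimal sketch (not from the paper; the Gaussian kernel, standard normal sample, and bandwidth h = n^{−1/5} are illustrative choices) evaluates the sup-distance ‖F_n^K(h) − F_n‖_∞ approximately over a grid:

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def ecdf(sample, x):
    """Empirical distribution function F_n evaluated at points x."""
    return np.searchsorted(np.sort(sample), x, side="right") / len(sample)

def kernel_cdf(sample, x, h):
    """Distribution function F_n^K(h) of a Gaussian-kernel density
    estimator: (1/n) * sum_i Phi((x - X_i) / h), where Phi is the
    standard normal CDF (the integrated Gaussian kernel)."""
    z = (x[:, None] - sample[None, :]) / h
    erf = np.vectorize(math.erf)
    return np.mean(0.5 * (1.0 + erf(z / math.sqrt(2.0))), axis=1)

n = 500
sample = rng.standard_normal(n)   # i.i.d. draws from N(0, 1)
h = n ** (-1 / 5)                 # bandwidth of MISE-optimal order
grid = np.linspace(-4.0, 4.0, 801)

# Approximate sup-norm distance over the grid (the true sup is over all of R).
sup_dist = np.max(np.abs(kernel_cdf(sample, grid, h) - ecdf(sample, grid)))
print(f"sup |F_n^K(h) - F_n| ~ {sup_dist:.4f}")
```

For bandwidths of this order the computed distance is small (well below the n^{−1/2} fluctuations of F_n around F would suggest at first glance is impossible), which is the phenomenon the exponential inequality quantifies.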

Citation (APA):

Giné, E., & Nickl, R. (2009). An exponential inequality for the distribution function of the kernel density estimator, with applications to adaptive estimation. Probability Theory and Related Fields, 143(3–4), 569–596. https://doi.org/10.1007/s00440-008-0137-y
