Let $\mathbb{P}_n * K_{h_n}(x) = n^{-1} h_n^{-d} \sum_{i=1}^{n} K\big((x - X_i)/h_n\big)$ be the classical kernel density estimator based on a kernel $K$ and $n$ independent random vectors $X_i$, each distributed according to an absolutely continuous law $\mathbb{P}$ on $\mathbb{R}^d$. It is shown that the processes $f \mapsto \sqrt{n} \int f \, d(\mathbb{P}_n * K_{h_n} - \mathbb{P})$, $f \in \mathcal{F}$, converge in law in the Banach space $\ell^\infty(\mathcal{F})$, for many interesting classes $\mathcal{F}$ of functions or sets, some $\mathbb{P}$-Donsker, some just $\mathbb{P}$-pregaussian. The conditions allow for the classical bandwidths $h_n$ that simultaneously ensure optimal rates of convergence of the kernel density estimator in mean integrated squared error, thus showing that, subject to some natural conditions, kernel density estimators are 'plug-in' estimators in the sense of Bickel and Ritov (Ann. Statist. 31:1033–1053, 2003). Some new results on the uniform central limit theorem for smoothed empirical processes, needed in the proofs, are also included. © 2007 Springer-Verlag.
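For concreteness, here is a minimal numerical sketch (not from the paper) of the estimator and the smoothed empirical process in dimension $d = 1$. It assumes a standard Gaussian kernel, a bandwidth of the classical MISE-optimal order $h_n \asymp n^{-1/5}$, $\mathbb{P} = N(0,1)$ taken as known so the centering $\int f \, d\mathbb{P}$ can be computed, and $\mathcal{F}$ the indicators of half-lines; the helper names `kde` and `smoothed_functional` are hypothetical.

```python
import numpy as np

# Hypothetical illustration in d = 1: Gaussian kernel, bandwidth h_n ~ n^{-1/5}
# (the classical MISE-optimal order for twice-differentiable densities),
# and P = N(0, 1) treated as known so that the centering term can be computed.

def kde(x, sample, h):
    """(P_n * K_h)(x) = n^{-1} h^{-1} sum_i K((x - X_i) / h), K the standard normal density."""
    u = (x[:, None] - sample[None, :]) / h
    return (np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)).mean(axis=1) / h

def smoothed_functional(f, sample, h, grid):
    """Riemann-sum approximation of sqrt(n) * int f d(P_n * K_h - P) for P = N(0, 1)."""
    n, dx = sample.size, grid[1] - grid[0]
    p_hat = kde(grid, sample, h)
    p_true = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)
    return np.sqrt(n) * np.sum(f(grid) * (p_hat - p_true)) * dx

rng = np.random.default_rng(0)
n = 2000
sample = rng.standard_normal(n)     # X_1, ..., X_n i.i.d. from P = N(0, 1)
h_n = n ** (-1 / 5)                 # bandwidth of MISE-optimal order
grid = np.linspace(-8.0, 8.0, 4001)

# Evaluate the smoothed empirical process at a few functions f in F,
# here indicators of half-lines (a P-Donsker class in d = 1).
for a in (-1.0, 0.0, 1.0):
    f = lambda x, a=a: (x <= a).astype(float)
    print(f"a = {a:+.1f}: {smoothed_functional(f, sample, h_n, grid):+.4f}")
```

The printed values correspond to one realization of the smoothed empirical process at three points of $\mathcal{F}$; under the conditions of the paper these finite-dimensional evaluations are asymptotically distributed like the $\mathbb{P}$-Brownian bridge indexed by $\mathcal{F}$.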
Giné, E., & Nickl, R. (2008). Uniform central limit theorems for kernel density estimators. Probability Theory and Related Fields, 141(3–4), 333–387. https://doi.org/10.1007/s00440-007-0087-9