A unified jackknife theory for empirical best prediction with M-estimation

116 citations · 32 Mendeley readers

Abstract

The paper presents a unified jackknife theory for a fairly general class of mixed models, which includes some of the widely used mixed linear models and generalized linear mixed models as special cases. The paper develops jackknife theory for the important, but so far neglected, prediction problem for the general mixed model. For estimation of fixed parameters, a jackknife method is considered for a general class of M-estimators, which includes the maximum likelihood, residual maximum likelihood and ANOVA estimators for mixed linear models, as well as the recently developed method of simulated moments estimators for generalized linear mixed models. For both the prediction and estimation problems, a jackknife method is used to obtain estimators of the mean squared errors (MSE). Asymptotic unbiasedness of the MSE estimators is shown to hold essentially under certain moment conditions. Simulation studies support the theoretical results.
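To illustrate the basic delete-one jackknife mechanism underlying such MSE estimators, here is a minimal sketch in NumPy. It computes only the variability term of the form ((m-1)/m) Σᵢ (θ̂₋ᵢ − θ̂)², using the sample mean as a stand-in estimator rather than the paper's empirical best predictor with M-estimated parameters; the function name is ours, not from the paper.

```python
import numpy as np

def jackknife_variability(estimator, data):
    """Delete-one jackknife variability term ((m-1)/m) * sum_i (theta_{-i} - theta)^2.

    This is the kind of replicate-based term that enters jackknife MSE
    estimators; `estimator` maps an array of observations to a scalar.
    """
    m = len(data)
    theta_full = estimator(data)
    # Re-estimate with each observation deleted in turn.
    replicates = np.array([estimator(np.delete(data, i)) for i in range(m)])
    return (m - 1) / m * np.sum((replicates - theta_full) ** 2)

rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=200)

# For the sample mean, this term reduces algebraically to the classical
# variance estimate s^2 / m, which gives a quick sanity check.
jk = jackknife_variability(np.mean, sample)
classical = sample.var(ddof=1) / len(sample)
print(jk, classical)
```

In the paper's setting the estimator would be an M-estimator of the model parameters and the target a predictor of a mixed effect, with an additional bias-correction term; the sketch above shows only the replicate-difference computation.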

Citation (APA)
Jiang, J., Lahiri, P., & Wan, S. M. (2002). A unified jackknife theory for empirical best prediction with M-estimation. Annals of Statistics, 30(6), 1782–1810. https://doi.org/10.1214/aos/1043351257
