Monte Carlo likelihood inference for missing data models

  • Y. J. Sung
  • C. J. Geyer

We describe a Monte Carlo method to approximate the maximum likelihood estimate (MLE), when there are missing data and the observed data likelihood is not available in closed form. This method uses simulated missing data that are independent and identically distributed and independent of the observed data. Our Monte Carlo approximation to the MLE is a consistent and asymptotically normal estimate of the minimizer theta* of the Kullback-Leibler information, as both Monte Carlo and observed data sample sizes go to infinity simultaneously. Plug-in estimates of the asymptotic variance are provided for constructing confidence regions for theta*. We give Logit-Normal generalized linear mixed model examples, calculated using an R package.
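The simulated-likelihood idea in the abstract — approximate the intractable observed-data likelihood by averaging the complete-data likelihood over i.i.d. simulated missing data drawn independently of the observations, then maximize the approximation — can be sketched in a few lines. The sketch below uses a deliberately simple toy model (a Bernoulli outcome with a standard normal latent effect), a fixed importance density equal to the latent density, and a crude grid search; these are illustrative assumptions, not the paper's Logit-Normal GLMM examples or its R package.

```python
import numpy as np

rng = np.random.default_rng(0)

def expit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy missing-data model (an illustrative assumption, not the paper's
# Logit-Normal GLMM): y_i | u_i ~ Bernoulli(expit(theta + u_i)),
# with latent (missing) u_i ~ N(0, 1) and unknown theta.
theta_true = 1.0
n = 2000
u = rng.standard_normal(n)
y = rng.random(n) < expit(theta_true + u)   # observed data only; u is discarded

# Simulated missing data: m i.i.d. draws from an importance density h,
# drawn once and independently of the observed data. Here h = N(0, 1),
# the latent density itself, so the importance weights cancel.
m = 500
u_sim = rng.standard_normal(m)

def mc_loglik(theta):
    """Monte Carlo approximation of the observed-data log-likelihood:
    L_i(theta) ~= (1/m) * sum_j f_theta(y_i | u_j)."""
    p = expit(theta + u_sim)                         # shape (m,)
    lik = np.where(y[:, None], p, 1.0 - p).mean(axis=1)  # shape (n,)
    return float(np.log(lik).sum())

# Maximize the Monte Carlo log-likelihood over a grid -- a crude stand-in
# for the numerical optimizer a real implementation would use.
grid = np.linspace(-1.0, 3.0, 401)
theta_hat = float(grid[int(np.argmax([mc_loglik(t) for t in grid]))])
print(theta_hat)  # close to theta_true as n and m grow
```

As the abstract states, this Monte Carlo maximizer converges (as both n and m grow) to the minimizer of the Kullback-Leibler information, which under correct model specification is the true parameter.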

Author-supplied keywords

  • asymptotic theory
  • convergence
  • convex sets, cones
  • EM algorithm
  • empirical process
  • generalized linear mixed model
  • generalized linear models
  • maximum likelihood
  • mixed models
  • model misspecification
  • Monte Carlo
  • pedigrees
  • sequences

