General maximum likelihood empirical Bayes estimation of normal means


Abstract

We propose a general maximum likelihood empirical Bayes (GMLEB) method for the estimation of a mean vector based on observations with i.i.d. normal errors. We prove that under mild moment conditions on the unknown means, the average mean squared error (MSE) of the GMLEB is within an infinitesimal fraction of the minimum average MSE among all separable estimators which use a single deterministic estimating function on individual observations, provided that the risk is of greater order than (log n)^5/n. We also prove that the GMLEB is uniformly approximately minimax in regular and weak ℓ_p balls when the order of the length-normalized norm of the unknown means is between (log n)^{κ₁}/n^{1/(p∧2)} and n/(log n)^{κ₂}. Simulation experiments demonstrate that the GMLEB outperforms the James-Stein and several state-of-the-art threshold estimators in a wide range of settings without much downside. © Institute of Mathematical Statistics, 2009.
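The GMLEB idea described above can be sketched in a few lines: estimate the unknown prior on the means by a nonparametric maximum likelihood fit (a Kiefer-Wolfowitz-style NPMLE, here approximated on a grid via EM), then apply the posterior mean under the fitted prior as the separable estimating function. This is a minimal illustrative sketch, not the authors' implementation; the grid size, iteration count, and unit error variance are assumptions made here for simplicity.

```python
import numpy as np

def gmleb(x, grid_size=200, n_iter=200):
    """Illustrative sketch of general maximum likelihood empirical Bayes.

    Model: x_i = theta_i + N(0, 1).
    Step 1: approximate the NPMLE of the prior G on a grid of support
    points via EM updates for the mixture weights (grid size and
    iteration count are illustrative choices, not from the paper).
    Step 2: plug the fitted prior into the posterior mean E[theta | x],
    a single deterministic (separable) rule applied coordinatewise.
    """
    x = np.asarray(x, dtype=float)
    t = np.linspace(x.min(), x.max(), grid_size)   # candidate support points for G
    w = np.full(grid_size, 1.0 / grid_size)        # prior weights, start uniform
    # Likelihood matrix: standard normal density phi(x_i - t_j)
    L = np.exp(-0.5 * (x[:, None] - t[None, :]) ** 2) / np.sqrt(2 * np.pi)
    for _ in range(n_iter):
        post = L * w                               # unnormalized posterior over the grid
        post /= post.sum(axis=1, keepdims=True)
        w = post.mean(axis=0)                      # EM update of the mixture weights
    # Posterior mean under the fitted prior: sum_j t_j w_j phi(x_i - t_j) / sum_j w_j phi(x_i - t_j)
    num = (L * w) @ t
    den = (L * w).sum(axis=1)
    return num / den
```

On data with many identical or sparse means, this rule shrinks strongly toward the mass points that the NPMLE discovers, which is the mechanism behind the adaptive behavior claimed in the abstract.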

Citation (APA)
Jiang, W., & Zhang, C. H. (2009). General maximum likelihood empirical Bayes estimation of normal means. Annals of Statistics, 37(4), 1647–1684. https://doi.org/10.1214/08-AOS638
