On general maximum likelihood empirical Bayes estimation of heteroscedastic iid normal means

Abstract

We propose a general maximum likelihood empirical Bayes (GMLEB) method for the estimation of heteroscedastic normal means with known variances. The idea is to plug the generalized maximum likelihood estimator of the unknown prior into the oracle Bayes rule. From the point of view of restricted empirical Bayes, the general empirical Bayes aims at a benchmark risk smaller than that of linear empirical Bayes methods when the unknown means are i.i.d. variables. We prove an oracle inequality which states that, under mild conditions, the regret of the GMLEB is of smaller order than (log n)^5/n. The proof is based on a large deviation inequality for the generalized maximum likelihood estimator. The oracle inequality implies that the GMLEB is adaptive minimax in ℓ_p balls when the order of the norm of the ball is larger than ((log n)^{5/2}/√n)^{1/(p∧2)}. We demonstrate the superb risk performance of the GMLEB through simulation experiments.
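The abstract only states the plug-in idea, so the following is a minimal sketch of how a GMLEB-style estimator can be computed. It assumes a grid-based approximation of the unknown prior fitted by EM (one common way to approximate the generalized maximum likelihood estimator; the grid size, tolerance, and the function name gmleb are illustrative choices, not taken from the paper), followed by the posterior-mean (oracle Bayes) rule evaluated at the fitted prior.

```python
import numpy as np

def gmleb(x, sigma, n_grid=200, n_iter=500, tol=1e-8):
    """Sketch of a general maximum likelihood empirical Bayes (GMLEB) estimator.

    x     : observed means, X_i ~ N(theta_i, sigma_i^2)
    sigma : known standard deviations (same length as x)
    Returns plug-in posterior-mean estimates of the theta_i.
    """
    x = np.asarray(x, dtype=float)
    sigma = np.asarray(sigma, dtype=float)

    # Grid approximation of the support of the unknown prior G.
    grid = np.linspace(x.min(), x.max(), n_grid)

    # Likelihood matrix: L[i, j] = N(grid[j], sigma_i^2) density at x[i].
    z = (x[:, None] - grid[None, :]) / sigma[:, None]
    L = np.exp(-0.5 * z**2) / (np.sqrt(2 * np.pi) * sigma[:, None])

    # EM iterations for the (approximate) nonparametric MLE of the mixing weights.
    w = np.full(n_grid, 1.0 / n_grid)
    for _ in range(n_iter):
        # E-step: posterior probability that theta_i sits at each grid point.
        num = L * w[None, :]
        post = num / num.sum(axis=1, keepdims=True)
        # M-step: update the mixing weights.
        w_new = post.mean(axis=0)
        if np.max(np.abs(w_new - w)) < tol:
            w = w_new
            break
        w = w_new

    # Plug the fitted prior into the oracle Bayes rule (posterior mean).
    num = L * w[None, :]
    return (num * grid[None, :]).sum(axis=1) / num.sum(axis=1)


# Usage: heteroscedastic observations of i.i.d. means drawn from a two-point prior.
rng = np.random.default_rng(0)
n = 1000
theta = rng.choice([0.0, 3.0], size=n)
sigma = rng.uniform(0.5, 2.0, size=n)
x = theta + sigma * rng.normal(size=n)
theta_hat = gmleb(x, sigma)
print("MSE of GMLEB:", np.mean((theta_hat - theta) ** 2))
print("MSE of raw X:", np.mean((x - theta) ** 2))
```

The toy comparison at the end mirrors the kind of risk improvement over the unshrunken estimate that the abstract's simulation experiments report, though the specific simulation design here is illustrative.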

Citation (APA)

Jiang, W. (2020). On general maximum likelihood empirical Bayes estimation of heteroscedastic iid normal means. Electronic Journal of Statistics, 14(1), 2272–2297. https://doi.org/10.1214/20-EJS1717
