The data-constrained generalized maximum entropy estimator of the GLM: Asymptotic theory and inference

Abstract

Maximum entropy methods of parameter estimation are appealing because they impose no additional structure on the data other than that explicitly assumed by the analyst. In this paper we prove that the data-constrained generalized maximum entropy (GME) estimator of the general linear model is consistent and asymptotically normal. The approach we take in establishing the asymptotic properties concomitantly identifies a new, computationally efficient method for calculating GME estimates. Formulae are developed to compute asymptotic variances and to perform Wald, likelihood ratio, and Lagrange multiplier tests on model parameters. Monte Carlo simulations assess the performance of the GME estimator in both large- and small-sample settings. Furthermore, we extend our results to maximum cross-entropy estimators and indicate a variant of the GME estimator that is unbiased. Finally, we discuss the relationship of GME estimators to Bayesian estimators, pointing out the conditions under which an unbiased GME estimator would be efficient.
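The paper's own derivation is not reproduced on this page, but as context for the data-constrained GME estimator the abstract describes, the sketch below illustrates the standard dual (unconstrained) formulation commonly used to compute GME estimates of a linear model y = Xβ + e. Each coefficient and each error is written as a convex combination of analyst-chosen support points, and the entropy objective is concentrated into a smooth convex function of one Lagrange multiplier per observation. The toy data, the support matrices Z and V, and the use of scipy are illustrative assumptions, not material taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp, softmax

rng = np.random.default_rng(0)

# Toy data (illustrative only): y = X @ beta + e with n observations, k regressors.
n, k = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

# Analyst-chosen supports: Z (k x M) for the coefficients, V (length J) for the errors.
Z = np.tile(np.linspace(-10.0, 10.0, 5), (k, 1))
V = 3.0 * y.std() * np.array([-1.0, 0.0, 1.0])

def probs(lam):
    """Coefficient and error probabilities implied by the multipliers lam."""
    a = X.T @ lam                              # one score per coefficient
    P = softmax(-Z * a[:, None], axis=1)       # k x M coefficient probabilities
    W = softmax(-np.outer(lam, V), axis=1)     # n x J error probabilities
    return P, W

def dual(lam):
    """Concentrated (dual) GME objective, minimized over the n multipliers."""
    a = X.T @ lam
    return (lam @ y
            + logsumexp(-Z * a[:, None], axis=1).sum()
            + logsumexp(-np.outer(lam, V), axis=1).sum())

def dual_grad(lam):
    """Gradient equals the data-constraint residual y - X beta(lam) - e(lam)."""
    P, W = probs(lam)
    return y - X @ (P * Z).sum(axis=1) - W @ V

res = minimize(dual, np.zeros(n), jac=dual_grad, method="BFGS")
P_hat, _ = probs(res.x)
beta_gme = (P_hat * Z).sum(axis=1)             # beta_k = sum_m z_km * p_km
print("GME estimate:", beta_gme)
print("OLS estimate:", np.linalg.lstsq(X, y, rcond=None)[0])
```

The support matrices act as an implicit prior: the estimate is pulled toward the mean of each coefficient's support, so the choice of Z and V matters most in small samples.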

Citation (APA)

Mittelhammer, R., Cardell, N. S., & Marsh, T. L. (2013). The data-constrained generalized maximum entropy estimator of the GLM: Asymptotic theory and inference. Entropy, 15(5), 1756–1775. https://doi.org/10.3390/e15051756
