Asymptotic theory for maximum likelihood estimates in reduced-rank multivariate generalized linear models


Abstract

Reduced-rank regression is a dimensionality reduction method with many applications. The asymptotic theory for reduced-rank estimators of parameter matrices in multivariate linear models has been studied extensively. In contrast, few theoretical results are available for reduced-rank multivariate generalized linear models. We develop M-estimation theory for concave criterion functions that are maximized over parameter spaces that are neither convex nor closed. These results are used to derive the consistency and asymptotic distribution of maximum likelihood estimators in reduced-rank multivariate generalized linear models when the response and predictor vectors have a joint distribution. We illustrate our results in a real-data classification problem with binary covariates.
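To fix ideas, here is a minimal sketch of classical reduced-rank regression in the linear case, the setting whose asymptotic theory the abstract says is well studied. It fits ordinary least squares and then truncates the coefficient matrix to its best rank-r approximation via the SVD. This is an illustrative simplification (the textbook reduced-rank estimator truncates in a covariance-weighted metric), and all names and the toy data are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 200, 6, 4, 2          # samples, predictors, responses, target rank

# Simulate a true rank-r coefficient matrix B = A @ C (p x q).
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))
X = rng.normal(size=(n, p))
Y = X @ B_true + 0.1 * rng.normal(size=(n, q))

# Step 1: full-rank OLS estimate of the p x q coefficient matrix.
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Step 2: reduced-rank estimate — keep the top r singular triplets of B_ols.
U, s, Vt = np.linalg.svd(B_ols, full_matrices=False)
B_rr = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

print(np.linalg.matrix_rank(B_rr))  # 2
```

The GLM extension studied in the paper replaces the least-squares criterion with a log-likelihood; the rank constraint is what makes the parameter space non-convex and non-closed, which is why the authors need the M-estimation theory they develop.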

Citation (APA)

Bura, E., Duarte, S., Forzani, L., Smucler, E., & Sued, M. (2018). Asymptotic theory for maximum likelihood estimates in reduced-rank multivariate generalized linear models. Statistics, 52(5), 1005–1024. https://doi.org/10.1080/02331888.2018.1467420
