Let the kp-variate random vector X be partitioned into k subvectors Xᵢ of dimension p each, and let the covariance matrix Ψ of X be partitioned analogously into p × p submatrices Ψᵢⱼ. The common principal component (CPC) model for dependent random vectors assumes the existence of an orthogonal p × p matrix β such that βᵗΨᵢⱼβ is diagonal for all (i, j). After a formal definition of the model, normal-theory maximum likelihood estimators are obtained. The asymptotic theory for the estimated orthogonal matrix is derived by a new technique of choosing proper subsets of functionally independent parameters. © 2000 Academic Press.
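The block below is a minimal sketch, not taken from the paper, of the model structure the abstract describes: it assembles a kp × kp covariance matrix whose p × p blocks are Ψᵢⱼ = β Λᵢⱼ βᵗ with a shared orthogonal β, and then checks that βᵗΨᵢⱼβ is diagonal for every pair (i, j). The variable names (beta, Lam, Psi), the dimensions, and the choice of cross-covariance blocks are illustrative assumptions; the paper's actual contribution, maximum likelihood estimation of β and the associated asymptotics, is not attempted here.

```python
import numpy as np

# Illustrative construction (not the paper's estimator): every p x p block
# Psi_ij of the kp x kp covariance matrix is diagonalized by the same
# orthogonal matrix beta.

rng = np.random.default_rng(0)
p, k = 3, 2

# Draw a random orthogonal p x p matrix beta (QR of a Gaussian matrix).
beta, _ = np.linalg.qr(rng.standard_normal((p, p)))

# Diagonal "spectral" blocks Lambda_ij: Lambda_ii positive, and the
# cross blocks (i != j) chosen small so the assembled kp x kp matrix
# remains a valid (symmetric positive definite) covariance matrix.
Lam = np.zeros((k, k, p))
for i in range(k):
    Lam[i, i] = rng.uniform(1.0, 3.0, size=p)
for i in range(k):
    for j in range(i + 1, k):
        cross = 0.2 * np.sqrt(Lam[i, i] * Lam[j, j])
        Lam[i, j] = Lam[j, i] = cross

# Assemble Psi block by block: Psi_ij = beta @ diag(Lambda_ij) @ beta.T
Psi = np.zeros((k * p, k * p))
for i in range(k):
    for j in range(k):
        Psi[i*p:(i+1)*p, j*p:(j+1)*p] = beta @ np.diag(Lam[i, j]) @ beta.T

# Check the defining property: beta.T @ Psi_ij @ beta is diagonal for all (i, j).
for i in range(k):
    for j in range(k):
        block = beta.T @ Psi[i*p:(i+1)*p, j*p:(j+1)*p] @ beta
        off_diag = block - np.diag(np.diag(block))
        assert np.allclose(off_diag, 0.0)

print("All blocks of Psi are simultaneously diagonalized by beta.")
```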
Neuenschwander, B. E., & Flury, B. D. (2000). Common principal components for dependent random vectors. Journal of Multivariate Analysis, 75(2), 163–183. https://doi.org/10.1006/jmva.2000.1908