Principal components regression to mitigate the effects of multicollinearity

ISSN: 0015-749X

Abstract

One consequence of multicollinearity among the structural independent variables of a regression model is that variables are frequently deleted as a means of proceeding with sensible hypothesis tests. Principal components regression has the advantage of avoiding model specification error due to variable deletion. The technique works as follows: the independent variables are orthogonalized into their principal components; components with low information content are deleted; the model is estimated by ordinary least squares; then the principal component estimators are converted into coefficients in the original parameter space, where a judgment about their contribution is made via an F-test. An example using tree growth data is presented to demonstrate the merits of principal components regression over variable deletion. Results indicate that "correct" structural specification does not have to be compromised through variable deletion when collinearity is present. This has implications for large-scale regression models, in which an increased number of independent variables in the specification may promote a level of collinearity that is not conducive to making statistical inferences. Analytical methods such as principal components, which adjust for the effects of collinearity on the variable selection process, are merited. © 1991 by the Society of American Foresters.
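The procedure summarized in the abstract can be illustrated with a short numerical sketch. The code below is not the authors' implementation: it standardizes the predictors, extracts their principal components, discards the low-variance components, fits the retained component scores by ordinary least squares, and then maps the component coefficients back to the original parameter space. The function name pcr_fit, the use of NumPy, and the caller-supplied choice of how many components to retain are assumptions for illustration; the article's F-test for judging the back-transformed coefficients is not reproduced here.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Minimal sketch of principal components regression.

    Standardize X, keep the leading `n_components` principal components,
    regress y on their scores by OLS, then convert the component
    estimators back to coefficients on the original variables.
    """
    # Standardize predictors so components reflect the correlation structure
    X_mean, X_std = X.mean(axis=0), X.std(axis=0, ddof=1)
    Z = (X - X_mean) / X_std

    # Eigen-decomposition of the correlation matrix yields the components
    corr = Z.T @ Z / (len(y) - 1)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]            # largest variance first
    V = eigvecs[:, order][:, :n_components]      # retained loadings

    # Scores of the retained components are mutually orthogonal regressors
    T = Z @ V

    # OLS on the component scores; the intercept is handled by centering y
    y_mean = y.mean()
    gamma = np.linalg.lstsq(T, y - y_mean, rcond=None)[0]

    # Convert component estimators back to the original parameter space
    beta_std = V @ gamma          # coefficients for standardized predictors
    beta = beta_std / X_std       # coefficients in original units
    intercept = y_mean - X_mean @ beta
    return intercept, beta

# Hypothetical usage with collinear predictors (x3 is nearly x1 + x2):
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=100), rng.normal(size=100)
X = np.column_stack([x1, x2, x1 + x2 + rng.normal(scale=0.01, size=100)])
y = 2.0 + x1 - x2 + rng.normal(size=100)
print(pcr_fit(X, y, n_components=2))
```

Dropping the smallest-variance component removes the direction along which the collinear predictors carry almost no independent information, which is what stabilizes the back-transformed coefficient estimates without deleting any variable from the specification.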

Citation (APA)

Morzuch, B. J., & Ruark, G. A. (1991). Principal components regression to mitigate the effects of multicollinearity. Forest Science, 37(1), 191–199.
