Multicollinearity: A tale of two nonparametric regressions

  • De Veaux R
  • Ungar L

Abstract

The most popular form of artificial neural network, feedforward networks with sigmoidal activation functions, and a new statistical technique, multivariate adaptive regression splines (MARS), can both be classified as nonlinear, nonparametric function estimation techniques, and both show great promise for fitting general nonlinear multivariate functions. In comparing the two methods on a variety of test problems, we find that MARS is in many cases both more accurate and much faster than neural networks. In addition, MARS is interpretable owing to the basis functions that make up the final predictive equation. This suggests that MARS could be used in many of the applications where neural networks are currently being used. However, MARS exhibits problems in choosing among predictor variables when multicollinearity is present. Neural networks, by contrast, do not share this problem because of their redundant architecture, and are better able to predict in this situation. To improve the ability of MARS to deal with multicollinearity, we first use principal components to reduce the dimensionality of the input variables before invoking MARS. Using data from a polymer production run, we find that the resulting model retains the interpretability and improves the accuracy of MARS in the multicollinear setting.
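
The principal-components preprocessing step described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' code: it assumes scikit-learn's PCA and the third-party py-earth package (pyearth) as a MARS implementation, and it substitutes simulated collinear predictors for the polymer production data.

    # Sketch of the PCA-then-MARS idea: project the collinear inputs onto a
    # few principal components, then fit MARS on the component scores.
    # Assumes scikit-learn and the third-party py-earth package are installed.
    import numpy as np
    from sklearn.decomposition import PCA
    from pyearth import Earth  # open-source MARS implementation (assumed)

    rng = np.random.default_rng(0)
    n = 500

    # Simulate multicollinear predictors: columns 2 and 3 are near-copies of column 1.
    x1 = rng.normal(size=n)
    X = np.column_stack([
        x1,
        x1 + 0.01 * rng.normal(size=n),
        x1 + 0.01 * rng.normal(size=n),
        rng.normal(size=n),
    ])
    y = np.sin(X[:, 0]) + 0.5 * X[:, 3] + 0.1 * rng.normal(size=n)

    # Step 1: reduce the correlated inputs to a small number of components.
    pca = PCA(n_components=2)
    Z = pca.fit_transform(X)

    # Step 2: fit MARS on the component scores instead of the raw inputs.
    mars = Earth(max_degree=2)
    mars.fit(Z, y)
    print(mars.summary())  # basis functions of the fitted model

Fitting MARS on the component scores removes the need to choose among nearly identical predictors, which is the failure mode the abstract attributes to MARS under multicollinearity.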

Cite (APA)
De Veaux, R. D., & Ungar, L. H. (1994). Multicollinearity: A tale of two nonparametric regressions. In Selecting Models from Data: Artificial Intelligence and Statistics IV (pp. 393–402). Springer. https://doi.org/10.1007/978-1-4612-2660-4_40
