A systematic evaluation of the benefits and hazards of variable selection in latent variable regression. Part II. Practical applications

Abstract

Leave-multiple-out cross-validation (LMO-CV) is compared to leave-one-out cross-validation (LOO-CV) as the objective function in variable selection for four real data sets. Two data sets stem from NIR spectroscopy and two from quantitative structure-activity relationships. In all four cases, LMO-CV outperforms LOO-CV with respect to prediction quality, model complexity (number of latent variables) and model size (number of variables). The number of objects left out in LMO-CV has an important effect on the final results: it controls both the number of latent variables in the final model and the prediction quality. The results of variable selection need to be validated carefully in a validation step that is independent of the variable selection, because the internal figures of merit (i.e. anything derived from the objective function value) do not correlate well with the external predictivity of the selected models. This is most obvious for LOO-CV: without further constraints, it always shows the best internal figures of merit and the worst prediction quality. © 2002 John Wiley & Sons, Ltd.
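
To make the comparison concrete, the following is a minimal sketch of how a cross-validation score can serve as the objective function in variable selection for a latent variable regression (PLS) model. It is not the authors' protocol: the greedy forward-selection loop, the synthetic data, the fixed two latent variables, and the specific LMO split (20 repeats, 30% of objects left out) are illustrative assumptions chosen only to contrast LOO-CV and LMO-CV as objective functions.

```python
# Sketch only: LOO-CV vs. LMO-CV as the objective function in a simple
# greedy variable-selection loop for PLS regression. Data and settings
# are illustrative assumptions, not those of the paper.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))                                    # 60 objects, 20 candidate variables
y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=60)    # only 5 variables are informative

def cv_score(X_sub, y, cv):
    """Negative mean squared error of a PLS model under the given CV scheme."""
    n_lv = min(2, X_sub.shape[1])          # at most 2 latent variables (assumed)
    model = PLSRegression(n_components=n_lv)
    return cross_val_score(model, X_sub, y, cv=cv,
                           scoring="neg_mean_squared_error").mean()

def greedy_forward_selection(X, y, cv, max_vars=8):
    """Greedy forward selection using the CV score as the objective function."""
    selected, remaining, best_score = [], list(range(X.shape[1])), -np.inf
    while remaining and len(selected) < max_vars:
        scores = {j: cv_score(X[:, selected + [j]], y, cv) for j in remaining}
        j_best = max(scores, key=scores.get)
        if scores[j_best] <= best_score:
            break                           # no improvement in the objective: stop
        best_score = scores[j_best]
        selected.append(j_best)
        remaining.remove(j_best)
    return selected, best_score

loo = LeaveOneOut()                                                   # leave-one-out CV
lmo = ShuffleSplit(n_splits=20, test_size=0.3, random_state=0)        # leave-multiple-out CV

for name, cv in [("LOO-CV", loo), ("LMO-CV", lmo)]:
    sel, score = greedy_forward_selection(X, y, cv)
    print(f"{name}: selected variables {sorted(sel)}, internal score {score:.3f}")
```

As the abstract stresses, the internal score printed above is not a reliable measure of predictivity; any model selected this way should additionally be assessed on data that played no role in the variable selection.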

Citation (APA)

Baumann, K., Von Korff, M., & Albert, H. (2002). A systematic evaluation of the benefits and hazards of variable selection in latent variable regression. Part II. Practical applications. Journal of Chemometrics, 16(7), 351–360. https://doi.org/10.1002/cem.729
