A systematic evaluation of the benefits and hazards of variable selection in latent variable regression. Part II. Practical applications

  • Baumann K
  • von Korff M
  • Albert H

Leave-multiple-out cross-validation (LMO-CV) is compared to leave-one-out cross-validation (LOO-CV) as the objective function in variable selection for four real data sets: two from NIR spectroscopy and two from quantitative structure–activity relationships. In all four cases, LMO-CV outperforms LOO-CV with respect to prediction quality, model complexity (number of latent variables) and model size (number of variables). The number of objects left out in LMO-CV has an important effect on the final results: it controls both the number of latent variables in the final model and the prediction quality. The results of variable selection need to be validated carefully with a validation step that is independent of the variable selection, because the internal figures of merit (i.e. anything derived from the objective function value) do not correlate well with the external predictivity of the selected models. This is most obvious for LOO-CV: without further constraints, it always shows the best internal figures of merit and the worst prediction quality. Copyright © 2002 John Wiley & Sons, Ltd.

Author-supplied keywords

  • Cross-validation
  • PCR
  • PLS
  • Tabu search
  • Variable selection

