Bootstrap bias corrected cross validation applied to super learning



Abstract

The super learner algorithm combines the predictions of multiple base learners to improve predictive quality. The default method for verifying super learner results is nested cross validation; however, this technique is computationally very expensive. Tsamardinos et al. proposed that nested cross validation can be replaced by resampling when tuning the hyper-parameters of learning algorithms. The main contribution of this study is to apply this idea to the verification of the super learner. We compare the new method with other verification methods, including nested cross validation. Tests were performed on artificial data sets of diverse sizes and on seven real biomedical data sets. The resampling method, called Bootstrap Bias Correction, proved to be a reasonably precise and very cost-efficient alternative to nested cross validation.
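The Bootstrap Bias Correction idea described in the abstract can be sketched as follows: pool the out-of-fold predictions of every candidate configuration (here, every base learner or super learner variant), then repeatedly bootstrap the samples, select the best-scoring configuration on each bootstrap sample, and evaluate that selection on the corresponding out-of-bag samples. This is an illustrative reconstruction, not the authors' code; the function and variable names are hypothetical.

```python
import numpy as np

def bbc_cv(pred_matrix, y, n_boot=500, rng=None):
    """Bootstrap Bias Corrected estimate of the selection process.

    pred_matrix : (n_samples, n_configs) out-of-fold class predictions,
                  one column per candidate configuration.
    y           : (n_samples,) true labels.
    Returns the bias-corrected accuracy estimate.
    """
    rng = np.random.default_rng(rng)
    n = len(y)
    scores = []
    for _ in range(n_boot):
        boot = rng.integers(0, n, n)            # in-bag indices, drawn with replacement
        oob = np.setdiff1d(np.arange(n), boot)  # out-of-bag indices
        if oob.size == 0:
            continue
        # Select the configuration with the best in-bag accuracy ...
        in_bag_acc = (pred_matrix[boot] == y[boot, None]).mean(axis=0)
        best = int(np.argmax(in_bag_acc))
        # ... and score it only on samples the selection never saw.
        scores.append((pred_matrix[oob, best] == y[oob]).mean())
    return float(np.mean(scores))
```

Because the winning configuration is chosen on the in-bag sample but scored out-of-bag, the optimistic bias of "pick the best and report its score" is corrected without refitting any model, which is what makes the method so much cheaper than nested cross validation.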

Citation (APA)
Mnich, K., Kitlas Golińska, A., Polewko-Klim, A., & Rudnicki, W. R. (2020). Bootstrap bias corrected cross validation applied to super learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12139 LNCS, pp. 550–563). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-50420-5_41
