Stopping criteria for ensemble-based feature selection


Abstract

Selecting the optimal number of features in a classifier ensemble normally requires a validation set or cross-validation. In this paper, feature ranking is combined with Recursive Feature Elimination (RFE), an effective technique for eliminating irrelevant features when the feature dimension is large. The stopping criteria are based on the out-of-bootstrap (OOB) estimate and on class separability, both computed on the training set, thereby obviating the need for a validation set. Multi-class problems are handled using the Error-Correcting Output Coding (ECOC) method. Experimental investigation on natural benchmark data demonstrates the effectiveness of these stopping criteria. © Springer-Verlag Berlin Heidelberg 2007.
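To illustrate the core idea, the following is a minimal sketch (not the authors' implementation) of RFE driven by an OOB stopping criterion: a bagged ensemble is refit as the lowest-ranked feature is recursively dropped, and elimination stops once the OOB score, computed on the training set alone, stops improving. The function name, the use of scikit-learn's random forest as the ensemble, and the `patience` parameter are all assumptions for illustration.

```python
# Hypothetical sketch of RFE with an out-of-bootstrap (OOB) stopping
# criterion; not the paper's implementation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

def rfe_with_oob_stop(X, y, patience=3, random_state=0):
    """Recursively drop the lowest-ranked feature; stop when the OOB
    score (estimated on the training set, no validation set needed)
    has not improved for `patience` elimination steps."""
    active = list(range(X.shape[1]))            # indices of surviving features
    best_score, best_features, stale = -np.inf, list(active), 0
    while len(active) > 1:
        clf = RandomForestClassifier(
            n_estimators=100, bootstrap=True, oob_score=True,
            random_state=random_state)
        clf.fit(X[:, active], y)
        score = clf.oob_score_                  # OOB estimate from training data
        if score > best_score:
            best_score, best_features, stale = score, list(active), 0
        else:
            stale += 1
            if stale >= patience:               # stopping criterion triggered
                break
        # RFE step: eliminate the feature with the lowest importance rank
        worst = int(np.argmin(clf.feature_importances_))
        del active[worst]
    return best_features, best_score

X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=5, random_state=0)
feats, score = rfe_with_oob_stop(X, y)
print(len(feats), round(score, 2))
```

Because the OOB score is a by-product of bootstrap training, no data has to be held out to decide when to stop eliminating features, which is the practical appeal of the criterion described in the abstract.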

Citation (APA)

Windeatt, T., & Prior, M. (2007). Stopping criteria for ensemble-based feature selection. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4472 LNCS, pp. 271–281). Springer Verlag. https://doi.org/10.1007/978-3-540-72523-7_28
