Selecting the optimal number of features for a classifier ensemble normally requires a validation set or cross-validation. In this paper, feature ranking is combined with Recursive Feature Elimination (RFE), an effective technique for discarding irrelevant features when the feature dimension is large. Stopping criteria are based on the out-of-bootstrap (OOB) estimate and on class separability, both computed on the training set, thereby obviating the need for a validation set. Multi-class problems are handled using the Error-Correcting Output Coding (ECOC) method. Experiments on natural benchmark data demonstrate the effectiveness of these stopping criteria. © Springer-Verlag Berlin Heidelberg 2007.
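The procedure the abstract outlines — rank features, recursively eliminate the lowest-ranked one, and stop when the out-of-bootstrap (OOB) estimate on the training set degrades — can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the nearest-centroid base classifier, the centroid-spread feature ranking, and the `tol` stopping threshold are stand-in assumptions, and the paper's ensemble base learners and ECOC decoding are omitted.

```python
import numpy as np

def oob_accuracy(X, y, n_boot=25, seed=0):
    """OOB estimate: train on bootstrap samples, score on the left-out points.

    Because the left-out (out-of-bootstrap) points act as a built-in test
    set, no separate validation set is needed -- the idea behind the
    paper's stopping criterion.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    classes = np.unique(y)
    accs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)              # bootstrap sample (with replacement)
        oob = np.setdiff1d(np.arange(n), idx)    # points left out of this sample
        if oob.size == 0 or len(np.unique(y[idx])) < len(classes):
            continue  # degenerate bootstrap: skip
        # nearest-centroid base classifier (a simple stand-in, not the paper's)
        cent = np.stack([X[idx][y[idx] == c].mean(axis=0) for c in classes])
        d = np.linalg.norm(X[oob, None, :] - cent[None, :, :], axis=2)
        pred = classes[d.argmin(axis=1)]
        accs.append((pred == y[oob]).mean())
    return float(np.mean(accs))

def rfe_with_oob_stop(X, y, tol=0.02):
    """RFE sketch: drop the lowest-ranked feature until OOB accuracy drops."""
    feats = list(range(X.shape[1]))
    best = oob_accuracy(X[:, feats], y)
    while len(feats) > 1:
        # rank features by spread of class centroids (illustrative ranking only)
        classes = np.unique(y)
        cent = np.stack([X[:, feats][y == c].mean(axis=0) for c in classes])
        score = cent.std(axis=0)                 # per-feature class separation
        trial = [f for f, _ in sorted(zip(feats, score), key=lambda t: t[1])][1:]
        acc = oob_accuracy(X[:, trial], y)
        if acc < best - tol:                     # OOB-based stopping criterion
            break
        best, feats = max(best, acc), trial
    return feats, best
```

On well-separated synthetic data with a few noise features appended, the loop typically eliminates the noise dimensions first and halts once removing a feature costs more than `tol` in OOB accuracy.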
CITATION STYLE
Windeatt, T., & Prior, M. (2007). Stopping criteria for ensemble-based feature selection. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4472 LNCS, pp. 271–281). Springer Verlag. https://doi.org/10.1007/978-3-540-72523-7_28