The feature selection problem involves discovering a subset of features such that a classifier built only from this subset has better predictive accuracy than a classifier built from the entire set of features. Ensemble methods such as Bagging and Boosting have been shown to improve the performance of classifiers to a remarkable degree, but surprisingly have not been applied to other parts of the classification process. In this paper, we apply the ensemble approach to feature selection by proposing a systematic way of combining the various outcomes of a feature selection algorithm. The proposed framework, named STochFS, is shown empirically to improve the performance of well-known feature selection algorithms. © Springer-Verlag Berlin Heidelberg 2005.
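The abstract does not detail the stochastic process itself, but the core idea of combining multiple feature-selection outcomes can be illustrated with a minimal sketch: run a base selector on several random resamples of the data and keep the features that are chosen in a large enough fraction of the runs. The correlation-based scorer, the bootstrap sampling scheme, and the voting threshold below are illustrative assumptions, not the STochFS algorithm.

```python
# Illustrative ensemble feature selection by vote aggregation.
# NOTE: this is a sketch under assumed design choices, not the STochFS method;
# the base selector, sampling scheme, and threshold are all assumptions.
import random
from collections import Counter


def select_top_k(X, y, k):
    """Toy base selector: rank features by absolute Pearson correlation with y."""
    n_features = len(X[0])

    def corr(j):
        col = [row[j] for row in X]
        mx, my = sum(col) / len(col), sum(y) / len(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(col, y))
        sx = sum((a - mx) ** 2 for a in col) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return abs(cov / (sx * sy)) if sx and sy else 0.0

    return sorted(range(n_features), key=corr, reverse=True)[:k]


def ensemble_select(X, y, k=1, n_runs=20, vote_frac=0.5, seed=0):
    """Run the base selector on bootstrap resamples and keep the features
    selected in at least vote_frac of the runs."""
    rng = random.Random(seed)
    votes = Counter()
    n = len(X)
    for _ in range(n_runs):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap sample
        Xs, ys = [X[i] for i in idx], [y[i] for i in idx]
        votes.update(select_top_k(Xs, ys, k))
    return sorted(f for f, v in votes.items() if v >= vote_frac * n_runs)


# Tiny synthetic data: feature 0 determines the label, features 1-2 are noise.
random.seed(1)
X = [[i, random.random(), random.random()] for i in range(40)]
y = [1 if row[0] > 19 else 0 for row in X]
print(ensemble_select(X, y, k=1))
```

Aggregating over resampled runs stabilizes the selection: a feature that is informative only by chance in one sample rarely accumulates enough votes across all runs.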
CITATION STYLE
De Souza, J. T., Japkowicz, N., & Matwin, S. (2005). STochFS: A framework for combining feature selection outcomes through a stochastic process. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3721 LNAI, pp. 667–674). Springer Verlag. https://doi.org/10.1007/11564126_71