We investigated a "sample-feature-subset" approach that extends both bagging and the random subspace method. In this procedure, we collect subsets of the training samples in each class and then remove redundant features from each subset, so that the subsets are represented in different feature spaces. We constructed one-against-other component classifiers by feeding these subsets to a base classifier and combined their outputs by majority voting. Experimental results showed that this approach outperformed the random subspace method. © Springer-Verlag Berlin Heidelberg 2007.
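The procedure summarized above could be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the per-class sampling sizes, the random feature subsets (a stand-in for the paper's redundancy-based feature removal), the nearest-centroid base classifier, and all function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_component(X, y, target, n_features):
    """One-against-other component classifier, trained on a sample subset
    represented in its own feature subset (illustrative sketch)."""
    pos = np.flatnonzero(y == target)
    neg = np.flatnonzero(y != target)
    # sample subset: draw part of each side so both classes are represented
    idx = np.concatenate([rng.choice(pos, size=max(1, len(pos) // 2), replace=False),
                          rng.choice(neg, size=max(1, len(neg) // 2), replace=False)])
    # feature subset: random here; the paper instead removes redundant features
    feats = rng.choice(X.shape[1], size=n_features, replace=False)
    Xs, ys = X[idx][:, feats], (y[idx] == target)
    # illustrative base classifier: nearest class centroid in the subspace
    c_pos, c_neg = Xs[ys].mean(axis=0), Xs[~ys].mean(axis=0)

    def predict(Z):
        Zs = Z[:, feats]
        return np.linalg.norm(Zs - c_pos, axis=1) < np.linalg.norm(Zs - c_neg, axis=1)

    return predict

def fit_ensemble(X, y, classes, n_components=5, n_features=2):
    # several one-against-other components per class
    return [(c, make_component(X, y, c, n_features))
            for c in classes for _ in range(n_components)]

def predict_ensemble(components, classes, Z):
    # combine the components by majority voting
    votes = np.zeros((len(Z), len(classes)))
    for target, predict in components:
        votes[:, classes.index(target)] += predict(Z)
    return [classes[i] for i in votes.argmax(axis=1)]
```

Because each component sees its own sample subset and its own feature subset, the ensemble varies along both axes that bagging and the random subspace method vary individually.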
CITATION STYLE
Kudo, M., Shirai, S., & Tenmoto, H. (2007). A combination of sample subsets and feature subsets in one-against-other classifiers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4472 LNCS, pp. 241–250). Springer Verlag. https://doi.org/10.1007/978-3-540-72523-7_25