Several SVM ensemble methods integrated with under-sampling for imbalanced data learning

Abstract

Imbalanced data learning (IDL) is one of the most active and important fields in machine learning research. This paper explores the effectiveness of four different SVM ensemble methods integrated with under-sampling for IDL. Experimental results on 20 UCI imbalanced datasets show that the two new ensemble algorithms proposed in this paper, CABagE (bagging-style) and MABstE (boosting-style), produce SVM ensemble classifiers that recognize the minority class better than existing ensemble methods. Further analysis of the experimental results indicates that MABstE has the best overall classification performance, which we attribute to its more robust example-weighting mechanism. © 2009 Springer.
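The abstract does not spell out the CABagE or MABstE algorithms themselves. As a rough illustration of the general idea they build on, here is a minimal sketch of a bagging-style ensemble in which each member is trained on an independently under-sampled (class-balanced) subset of the data and predictions are combined by majority vote. All names here are hypothetical, and the `ThresholdStump` is a trivial stand-in for the SVM base learner used in the paper.

```python
import random
from collections import Counter


def undersample(X, y, minority_label, rng):
    """Balance the data by randomly under-sampling the majority class."""
    minority = [(x, t) for x, t in zip(X, y) if t == minority_label]
    majority = [(x, t) for x, t in zip(X, y) if t != minority_label]
    balanced = minority + rng.sample(majority, len(minority))
    rng.shuffle(balanced)
    xs, ts = zip(*balanced)
    return list(xs), list(ts)


class ThresholdStump:
    """Toy stand-in for an SVM: predicts the class whose mean
    (on the first feature) is closest to the input."""

    def fit(self, X, y):
        self.means = {}
        for label in set(y):
            vals = [x[0] for x, t in zip(X, y) if t == label]
            self.means[label] = sum(vals) / len(vals)

    def predict(self, x):
        return min(self.means, key=lambda lab: abs(x[0] - self.means[lab]))


class UnderSampledBaggingEnsemble:
    """Bagging-style ensemble: each member sees a freshly balanced sample."""

    def __init__(self, base_factory, n_members=5, seed=0):
        self.base_factory = base_factory
        self.n_members = n_members
        self.rng = random.Random(seed)
        self.members = []

    def fit(self, X, y):
        counts = Counter(y)
        minority_label = min(counts, key=counts.get)
        for _ in range(self.n_members):
            Xs, ys = undersample(X, y, minority_label, self.rng)
            clf = self.base_factory()
            clf.fit(Xs, ys)
            self.members.append(clf)

    def predict(self, x):
        votes = Counter(m.predict(x) for m in self.members)
        return votes.most_common(1)[0][0]


# Demo: 90 majority-class points near 0, 10 minority-class points near 1.
X = [[0.3 * i / 89] for i in range(90)] + [[0.8 + 0.2 * i / 9] for i in range(10)]
y = [0] * 90 + [1] * 10

ens = UnderSampledBaggingEnsemble(ThresholdStump, n_members=5, seed=1)
ens.fit(X, y)
```

Because every member trains on a balanced subset, the ensemble is far less biased toward the majority class than a single classifier trained on the raw data; the paper's boosting-style MABstE additionally re-weights examples across rounds, which this sketch does not model.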

Citation (APA)

Lin, Z., Hao, Z., Yang, X., & Liu, X. (2009). Several SVM ensemble methods integrated with under-sampling for imbalanced data learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5678 LNAI, pp. 536–544). https://doi.org/10.1007/978-3-642-03348-3_54
