Applying threshold SMOTE algorithm with attribute Bagging to imbalanced datasets


Abstract

Synthetic minority over-sampling technique (SMOTE) is an effective over-sampling technique designed specifically for learning from imbalanced data sets. However, SMOTE generates synthetic samples with a degree of blindness. This paper proposes a novel approach for the imbalanced learning problem, based on a combination of the Threshold SMOTE (TSMOTE) and Attribute Bagging (AB) algorithms. TSMOTE takes full advantage of majority samples to adjust the neighbor-selection strategy of SMOTE in order to control the quality of the new samples. Attribute Bagging, a well-known ensemble learning algorithm, is also used to improve the predictive power of the classifier. A comprehensive suite of experiments is conducted on 7 imbalanced data sets collected from the UCI machine learning repository. Experimental results show that TSMOTE-AB outperforms SMOTE and other previously known algorithms. © 2013 Springer-Verlag.
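For readers unfamiliar with the base technique, the sketch below illustrates the standard SMOTE interpolation step in Python (NumPy only). The function name `smote_like_oversample` and its parameters are illustrative, not taken from the paper; TSMOTE's adjustment of the neighbor-selection strategy using majority-class information is only described at a high level in the abstract and is not reproduced here.

```python
import numpy as np

def smote_like_oversample(X_min, n_new, k=5, seed=None):
    """Generate n_new synthetic minority samples by interpolating between
    a seed sample and one of its k nearest minority neighbours.

    This is the standard SMOTE interpolation step only; the paper's TSMOTE
    additionally uses majority-class samples to filter candidate neighbours
    (details are not given in the abstract).
    """
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # Distances from the seed to every other minority sample.
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # k nearest, excluding the seed itself
        j = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)

# Minimal usage example with random data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_minority = rng.normal(size=(20, 4))
    X_new = smote_like_oversample(X_minority, n_new=10, k=3, seed=0)
    print(X_new.shape)  # (10, 4)
```

In the combined TSMOTE-AB approach, an oversampled training set like this would then be used with Attribute Bagging, i.e., training an ensemble of classifiers on random subsets of the attributes and aggregating their predictions.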


CITATION STYLE

APA

Wang, J., Yun, B., Huang, P., & Liu, Y. A. (2013). Applying threshold SMOTE algorithm with attribute Bagging to imbalanced datasets. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8171 LNAI, pp. 221–228). https://doi.org/10.1007/978-3-642-41299-8_21
