Two Novel SMOTE Methods for Solving Imbalanced Classification Problems

43 citations · 58 Mendeley readers · Open access

Abstract

The imbalanced classification problem has long been an important challenge in neural networks and machine learning. The synthetic minority oversampling technique (SMOTE) is an effective method for dealing with imbalanced classification, but it has a drawback: noise samples may participate in the process of synthesizing new samples, so the resulting synthetic samples can lack rationality, which reduces the classification performance of the network. To remedy this shortcoming, two improved SMOTE methods are proposed in this paper: the center-point SMOTE (CP-SMOTE) method and the inner-and-outer SMOTE (IO-SMOTE) method. CP-SMOTE generates new samples by finding several center points and then linearly combining the minority samples with their corresponding center points. IO-SMOTE divides the minority samples into inner and outer samples and then uses the inner samples as much as possible in the subsequent process of generating new samples. Numerical experiments show that, compared with no sampling and the conventional SMOTE method, CP-SMOTE and IO-SMOTE achieve better classification performance.
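The abstract's description of CP-SMOTE (find center points, then interpolate each minority sample toward its center) can be sketched roughly as follows. This is a minimal illustration of the idea only, not the paper's implementation: the function name, parameters, and the use of plain Lloyd's k-means as the center-finding step are all assumptions.

```python
import numpy as np

def cp_smote(X_min, n_centers=2, n_new=20, n_iter=10, rng=None):
    """Sketch of the CP-SMOTE idea: cluster the minority samples into a
    few center points, then synthesize new samples by linear interpolation
    between each minority sample and its assigned center.
    (Illustrative only; the paper's exact center-finding step may differ.)
    """
    rng = np.random.default_rng(rng)
    # Plain Lloyd's k-means on the minority class, standing in for
    # whatever center-finding procedure the paper actually uses.
    centers = X_min[rng.choice(len(X_min), n_centers, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(X_min[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_centers):
            if np.any(labels == k):
                centers[k] = X_min[labels == k].mean(axis=0)
    # Interpolation step: x_new = x + lam * (center - x) with lam ~ U(0, 1),
    # so every synthetic point lies between a real minority sample and its
    # cluster center (and hence inside the minority region, away from noise).
    idx = rng.integers(0, len(X_min), n_new)
    lam = rng.random((n_new, 1))
    return X_min[idx] + lam * (centers[labels[idx]] - X_min[idx])
```

Because each synthetic point is a convex combination of a real sample and a cluster mean, it stays inside the bounding box of the minority class, unlike conventional SMOTE, which can interpolate toward a noisy neighbor.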

Citation (APA)
Bao, Y., & Yang, S. (2023). Two Novel SMOTE Methods for Solving Imbalanced Classification Problems. IEEE Access, 11, 5816–5823. https://doi.org/10.1109/ACCESS.2023.3236794
