Adaptive FH-SVM for Imbalanced Classification

Abstract

Support vector machines (SVMs) are powerful learning methods that have long been popular among machine learning researchers due to their strong performance on both classification and regression problems. However, the traditional SVM, which uses the Hinge loss, cannot handle class imbalance because it applies the same loss weight to every class. Recently, the Focal loss has been widely used in deep learning to address imbalanced datasets, and its effectiveness has attracted attention in many fields, such as object detection and semantic segmentation. Inspired by the Focal loss, we reconstruct the Hinge loss with the scaling factor of the Focal loss, yielding the FH loss, which not only handles class imbalance but also preserves the distinctive properties of the Hinge loss. Because trading off positive and negative accuracy is difficult in imbalanced classification, the FH loss pays more attention to the minority class and to misclassified instances in order to improve the accuracy of each class and thereby reduce the influence of imbalance. In addition, because an SVM with the FH loss is difficult to solve directly, we propose an improved model with a modified FH loss, called Adaptive FH-SVM. The algorithm solves the optimization problem iteratively and adaptively updates the FH loss of each instance. Experimental results on 31 binary imbalanced datasets demonstrate the effectiveness of the proposed method.
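The abstract only names the ingredients of the FH loss, not its exact formula, but the general construction it describes can be sketched: a standard hinge term rescaled by a focal-style modulating factor, so that confident, correctly classified instances contribute little and hard or misclassified instances dominate. The sigmoid mapping from the margin to a pseudo-probability below is our assumption for illustration, not necessarily the paper's exact definition.

```python
import math

def hinge_loss(y, f):
    # Standard hinge loss max(0, 1 - y*f), with labels y in {-1, +1}
    # and decision value f = w.x + b.
    return max(0.0, 1.0 - y * f)

def fh_loss(y, f, gamma=2.0):
    # Hypothetical FH-style loss: the hinge term is rescaled by a
    # focal-style factor (1 - p_t)^gamma, so well-classified instances
    # (large p_t) are down-weighted and hard ones are emphasized.
    # Mapping the margin y*f to a pseudo-probability via a sigmoid is
    # an assumption made for this sketch.
    p_t = 1.0 / (1.0 + math.exp(-y * f))  # confidence in the correct class
    return (1.0 - p_t) ** gamma * hinge_loss(y, f)
```

Under this sketch, a misclassified instance (say `y = -1`, `f = 0.5`) keeps a large loss, while a correctly classified instance with the same |margin| is damped much more strongly, which is the behavior the abstract attributes to the focal scaling factor.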

Cite


Wang, Q., Tian, Y., & Liu, D. (2019). Adaptive FH-SVM for Imbalanced Classification. IEEE Access, 7, 130410–130422. https://doi.org/10.1109/ACCESS.2019.2940983
