Incremental Cost-Sensitive Support Vector Machine with Linear-Exponential Loss


Abstract

Incremental learning, or online learning, is a branch of machine learning that has attracted increasing attention in recent years. For large-scale and dynamic data problems, incremental learning outperforms batch learning because it processes newly arriving data efficiently. However, class imbalance, which frequently arises in online classification, poses a considerable challenge for incremental learning; a severe imbalance can render the learning system useless. Cost-sensitive learning is an important paradigm for handling class imbalance and is widely used in many applications. In this article, we propose an incremental cost-sensitive learning method to tackle class imbalance in the online setting. The proposed algorithm is based on a novel cost-sensitive support vector machine that uses the linear-exponential (LINEX) loss to impose a high misclassification cost on the minority class and a low cost on the majority class. Using half-quadratic optimization, we first derive the algorithm for the cost-sensitive support vector machine, called CSLINEX-SVM∗. We then propose the incremental cost-sensitive algorithm, ICSL-SVM. Numerical experiments demonstrate that the proposed incremental algorithm outperforms several conventional batch algorithms, with the exception of the proposed CSLINEX-SVM∗.
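For illustration only, below is a minimal NumPy sketch of how a LINEX-style penalty can encode class-dependent costs. It uses the standard linear-exponential loss from the literature, e^{au} − au − 1, applied to a hinge-style margin shortfall; the weights c_minority and c_majority, the shape parameter a, and the margin formulation are assumptions for this sketch, not the paper's CSLINEX-SVM∗ or ICSL-SVM formulation.

```python
import numpy as np

def linex_loss(u, a=1.0):
    """Generic linear-exponential (LINEX) loss: exp(a*u) - a*u - 1.
    It grows exponentially on one side of zero and roughly linearly on
    the other, which is what makes an asymmetric penalty possible."""
    return np.exp(a * u) - a * u - 1.0

def cost_sensitive_linex_surrogate(margins, y, c_minority=5.0, c_majority=1.0, a=1.0):
    """Illustrative asymmetric surrogate loss (assumed form, not the paper's):
    penalize the margin shortfall with a LINEX loss, weighted by a
    class-dependent cost. Here the minority class is labeled +1."""
    shortfall = np.maximum(0.0, 1.0 - margins)         # hinge-style margin violation
    costs = np.where(y == 1, c_minority, c_majority)   # heavier penalty for minority class
    return costs * linex_loss(shortfall, a=a)

# Usage example: two minority (+1) and two majority (-1) samples with equal
# margin violations receive different penalties because of the class costs.
margins = np.array([0.2, -0.5, 0.2, -0.5])
labels = np.array([1, 1, -1, -1])
print(cost_sensitive_linex_surrogate(margins, labels))
```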

Citation (APA)

Ma, Y., Zhao, K., Wang, Q., & Tian, Y. (2020). Incremental Cost-Sensitive Support Vector Machine with Linear-Exponential Loss. IEEE Access, 8, 149899–149914. https://doi.org/10.1109/ACCESS.2020.3015954
