LINEX Support Vector Machine for Large-Scale Classification

27 citations · 19 Mendeley readers · Free to access

Abstract

The traditional soft-margin support vector machine (C-SVM) typically uses the hinge loss to build a classifier under the maximum-margin principle. However, C-SVM relies only on the support vectors, which discards the information carried by the remaining data. The least squares support vector machine (LS-SVM) was subsequently proposed with the squared ($\ell_2$) loss; it replaces the inequality constraints with equality constraints and takes all instances into account. Yet the squared loss is still not ideal, because it punishes instances on both sides of the center plane equally, which does not match the reality that instances lying between the two center planes deserve a heavier penalty than the others. To this end, we propose a novel SVM method that adopts the asymmetric LINEX (linear-exponential) loss, which we call LINEX-SVM. The LINEX loss treats instances differently according to their importance: it imposes a heavy penalty on points lying between the two center planes and only a light penalty on points outside their corresponding center planes. Comprehensive experiments validate the effectiveness of LINEX-SVM.
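For intuition, below is a minimal Python sketch comparing the hinge loss, the squared loss, and a LINEX-style loss on margin violations. It uses the standard LINEX form b*(exp(a*u) - a*u - 1); the exact parameterization in the paper (its shape parameters and how the margin term enters the objective) is not reproduced here, so the names `a` and `b` and the choice of u = 1 - y*f(x) as the argument are illustrative assumptions.

```python
import numpy as np

def linex_loss(u, a=1.0, b=1.0):
    """Standard LINEX (linear-exponential) loss: b * (exp(a*u) - a*u - 1).

    Asymmetric for a != 0: with a > 0 it grows exponentially for u > 0 and
    only roughly linearly for u < 0. The parameter names `a`, `b` are
    illustrative assumptions, not the paper's notation.
    """
    return b * (np.exp(a * u) - a * u - 1.0)

def hinge_loss(u):
    """Hinge loss on the margin violation u = 1 - y * f(x) (C-SVM)."""
    return np.maximum(0.0, u)

def square_loss(u):
    """Squared (l2) loss, as in LS-SVM; symmetric in u."""
    return u ** 2

# Compare the three losses: u > 0 corresponds to a point falling between the
# two center planes (a margin violation), u < 0 to a point safely outside.
# The LINEX loss penalizes the former much more heavily than the latter.
u = np.linspace(-2.0, 2.0, 9)
print(np.column_stack([u, hinge_loss(u), square_loss(u), linex_loss(u, a=1.5)]))
```

The printed table makes the asymmetry visible: for negative violations the LINEX values stay small (near-linear growth), while for positive violations they climb exponentially, which is the behavior the abstract attributes to points between the two center planes.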

Citation (APA)

Ma, Y., Zhang, Q., Li, D., & Tian, Y. (2019). LINEX Support Vector Machine for Large-Scale Classification. IEEE Access, 7, 70319–70331. https://doi.org/10.1109/ACCESS.2019.2919185
