Abstract
Support Vector Machine (SVM) was originally proposed as a binary classification model and has achieved great success in many applications. In practice, however, problems often involve more than two classes, so it is natural to extend SVM to a multi-class classifier. Many works construct a multi-class classifier from binary SVMs, such as the one-versus-rest strategy (OvsR), the one-versus-one strategy (OvsO), and Weston's multi-class SVM. The first two decompose the multi-class problem into multiple binary classification subproblems, each requiring its own binary classifier. Weston's multi-class SVM is instead formed by enforcing risk constraints and imposing a specific regularization, such as the Frobenius norm; it is not derived by maximizing the margin between the hyperplane and the training data, which is the original motivation of SVM. In this paper, we propose a multi-class SVM model from the perspective of maximizing the margin between the training points and the hyperplane, and analyze the relation between our model and other related methods. Experiments show that our model achieves better or comparable results when compared with other related methods.
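The decomposition strategies mentioned above (OvsR and OvsO) reduce a K-class problem to several binary ones. As a minimal sketch of the one-versus-rest idea, the snippet below trains one binary classifier per class (class k versus all the rest) and predicts the class whose classifier scores highest. The binary learner here is a simple perceptron standing in for a binary SVM, and all function names and the toy data are illustrative assumptions, not taken from the paper.

```python
# One-versus-rest decomposition sketch. The binary learner is a
# perceptron used as a stand-in for a binary SVM (illustrative only).

def train_perceptron(X, y, epochs=50, lr=0.1):
    """Train a binary linear classifier; labels y are in {-1, +1}."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            score = sum(wj * xj for wj, xj in zip(w, xi)) + b
            if yi * score <= 0:  # misclassified -> perceptron update
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b

def ovr_fit(X, labels):
    """One binary classifier per class: class k versus the rest."""
    models = {}
    for k in sorted(set(labels)):
        y = [1 if l == k else -1 for l in labels]
        models[k] = train_perceptron(X, y)
    return models

def ovr_predict(models, x):
    """Pick the class whose binary classifier gives the highest score."""
    def score(wb):
        w, b = wb
        return sum(wj * xj for wj, xj in zip(w, x)) + b
    return max(models, key=lambda k: score(models[k]))

# Toy 2-D data: three well-separated clusters, one per class.
X = [(0.0, 0.0), (0.2, 0.1), (5.0, 0.0), (5.1, 0.2), (0.0, 5.0), (0.1, 5.2)]
labels = [0, 0, 1, 1, 2, 2]
models = ovr_fit(X, labels)
print(ovr_predict(models, (5.0, 0.1)))  # point near cluster 1 -> 1
```

The one-versus-one strategy differs only in the decomposition: it trains one classifier per pair of classes and predicts by majority vote, which trades more (but smaller) subproblems for the K classifiers of one-versus-rest.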
Citation
Xu, J., Liu, X., Huo, Z., Deng, C., Nie, F., & Huang, H. (2017). Multi-class support vector machine via maximizing multi-class margins. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 0, pp. 3154–3160). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2017/440