Efficient decision trees for multi-class support vector machines using entropy and generalization error estimation

Abstract

We propose new methods for support vector machines that use a tree architecture for multi-class classification. At each node of the tree, we select an appropriate binary classifier using entropy and generalization error estimation, group the examples into positive and negative classes based on the selected classifier, and train a new classifier for use in the classification phase. The proposed methods classify in time between O(log₂ N) and O(N), where N is the number of classes. We compare the performance of our methods with traditional techniques on datasets from the UCI machine learning repository using 10-fold cross-validation. The experimental results show that the methods are well suited to problems that require fast classification or involve a large number of classes, since the proposed methods run much faster than the traditional techniques while providing comparable accuracy.
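
To make the tree-structured classification idea concrete, below is a minimal Python sketch (using scikit-learn's SVC) of a binary SVM decision tree. It is not the authors' algorithm: the paper selects each node's partition with entropy and generalization error estimation, whereas this sketch substitutes a simple 2-means split of class centroids as a stand-in criterion. The names SVMTreeNode, build_tree, and predict_one are hypothetical and chosen only for illustration; the sketch shows how prediction walks a single root-to-leaf path, so the number of binary SVM evaluations per example ranges from about log₂ N for a balanced tree to N for a degenerate one, matching the complexity range stated in the abstract.

```python
# Illustrative sketch only: a binary tree of SVMs for N-class classification.
# Each internal node splits its remaining classes into two groups and trains
# one binary SVM to route examples; the split criterion here (2-means on
# class centroids) is an assumption, standing in for the paper's entropy and
# generalization-error-based selection.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC


class SVMTreeNode:
    def __init__(self, classes):
        self.classes = list(classes)   # classes reachable from this node
        self.svm = None                # binary router (internal nodes only)
        self.left = None               # subtree for the "negative" group
        self.right = None              # subtree for the "positive" group

    @property
    def is_leaf(self):
        return len(self.classes) == 1


def build_tree(X, y, classes):
    node = SVMTreeNode(classes)
    if node.is_leaf:
        return node
    # Split the classes into two groups by clustering their centroids.
    centroids = np.vstack([X[y == c].mean(axis=0) for c in node.classes])
    groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(centroids)
    if groups.min() == groups.max():           # degenerate split: fall back to halves
        groups = np.array([i % 2 for i in range(len(node.classes))])
    neg = [c for c, g in zip(node.classes, groups) if g == 0]
    pos = [c for c, g in zip(node.classes, groups) if g == 1]
    # Train a binary SVM that routes examples toward the positive or negative group.
    mask = np.isin(y, node.classes)
    Xn, yn = X[mask], np.isin(y[mask], pos).astype(int)
    node.svm = SVC(kernel="rbf", gamma="scale").fit(Xn, yn)
    node.left = build_tree(X, y, neg)
    node.right = build_tree(X, y, pos)
    return node


def predict_one(node, x):
    # Walk from the root to a leaf; at most depth-of-tree SVM evaluations.
    while not node.is_leaf:
        node = node.right if node.svm.predict(x.reshape(1, -1))[0] == 1 else node.left
    return node.classes[0]


if __name__ == "__main__":
    from sklearn.datasets import load_digits
    X, y = load_digits(return_X_y=True)
    tree = build_tree(X, y, np.unique(y))
    print("prediction:", predict_one(tree, X[0]), "true label:", y[0])
```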

Citation (APA)

Kantavat, P., Kijsirikul, B., Songsiri, P., Fukui, K. I., & Numao, M. (2018). Efficient decision trees for multi-class support vector machines using entropy and generalization error estimation. International Journal of Applied Mathematics and Computer Science, 28(4), 705–717. https://doi.org/10.2478/amcs-2018-0054
