Comparative study of standalone classifier and ensemble classifier

Abstract

Ensemble learning is a machine learning approach that can address weak classification performance. Standalone classifiers often produce poor results, so combining them with ensemble methods can improve their performance scores. Ensemble learning comprises several methods; in this study, three of them are compared against the standalone classifiers support vector machine, Naïve Bayes, and decision tree. Bagging, AdaBoost, and voting are the ensemble methods combined with these classifiers and then compared with the standalone versions. On a dataset of 1670 Twitter mentions of tourist attractions, the ensemble methods showed no specific improvement in accuracy and precision, since they produced the same results as the standalone decision tree. The bagging method showed a significant improvement in recall, f-measure, and area under the curve (AUC). Overall, the standalone decision tree and the decision tree with AdaBoost achieved the highest accuracy and precision scores, while the support vector machine with bagging achieved the highest recall, f-measure, and AUC.
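The comparison described in the abstract can be illustrated with a minimal scikit-learn sketch (not the authors' code). It assumes scikit-learn >= 1.2 and substitutes a synthetic binary dataset of the same size for the 1670 labelled tweets, which are not reproduced here; the base-learner parameters are placeholders rather than the study's actual configuration.

```python
# Sketch: standalone SVM, Naive Bayes, and decision tree versus bagging,
# AdaBoost, and voting ensembles, scored on the five metrics from the study.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier, VotingClassifier
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

# Synthetic stand-in for the 1670-tweet corpus (binary sentiment labels assumed).
X, y = make_classification(n_samples=1670, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Standalone base learners.
svm = SVC(probability=True, random_state=0)
nb = GaussianNB()
tree = DecisionTreeClassifier(random_state=0)

models = {
    "SVM": svm,
    "Naive Bayes": nb,
    "Decision tree": tree,
    # Ensembles built from the same base learners.
    "SVM + bagging": BaggingClassifier(
        estimator=SVC(probability=True, random_state=0), random_state=0),
    "Decision tree + AdaBoost": AdaBoostClassifier(
        estimator=DecisionTreeClassifier(random_state=0), random_state=0),
    "Voting (SVM, NB, tree)": VotingClassifier(
        estimators=[("svm", svm), ("nb", nb), ("tree", tree)], voting="soft"),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    y_prob = model.predict_proba(X_test)[:, 1]
    print(f"{name}: acc={accuracy_score(y_test, y_pred):.3f} "
          f"prec={precision_score(y_test, y_pred):.3f} "
          f"rec={recall_score(y_test, y_pred):.3f} "
          f"f1={f1_score(y_test, y_pred):.3f} "
          f"auc={roc_auc_score(y_test, y_prob):.3f}")
```

On the real tweet data, a text-vectorization step (e.g. TF-IDF) would precede the classifiers; it is omitted here because the paper's preprocessing pipeline is not described in the abstract.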

Citation (APA)

Priasni, T. O., & Oswari, T. (2021). Comparative study of standalone classifier and ensemble classifier. Telkomnika (Telecommunication Computing Electronics and Control), 19(5), 1747–1754. https://doi.org/10.12928/TELKOMNIKA.V19I5.19508
