Ensembles of decision trees for imbalanced data

Abstract

Ensembles of decision trees are considered for imbalanced datasets. Conventional decision trees (C4.5) and trees designed for imbalanced data (CCPDT: Class Confidence Proportion Decision Tree) are used as base classifiers. Ensemble methods designed for imbalanced data, based on undersampling and oversampling, are considered. Conventional ensemble methods, not specific to imbalanced data, are also studied: Bagging, Random Subspaces, AdaBoost, Real AdaBoost, MultiBoost and Rotation Forest. The results show that the choice of ensemble method is much more important than the type of decision tree used as base classifier. Rotation Forest is the ensemble method with the best results. Among the decision tree methods, CCPDT shows no advantage. © 2011 Springer-Verlag.
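The kind of comparison the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' experimental setup: it uses scikit-learn's Bagging and AdaBoost (two of the conventional ensembles studied) over CART trees rather than C4.5 or CCPDT, and the synthetic dataset with a 95/5 class ratio is an assumption for the demo.

```python
# Illustrative sketch (assumed setup, not the paper's): compare a single
# decision tree against two conventional ensembles on imbalanced data,
# scoring with AUC, which is insensitive to the class ratio.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic imbalanced dataset: roughly 95% majority / 5% minority class.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "Single tree": DecisionTreeClassifier(random_state=0),
    "Bagging": BaggingClassifier(DecisionTreeClassifier(random_state=0),
                                 n_estimators=50, random_state=0),
    "AdaBoost": AdaBoostClassifier(n_estimators=50, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: test AUC = {auc:.3f}")
```

In a setup like this the ensembles typically beat the single tree by a clear margin, mirroring the paper's broader finding that the ensemble method matters more than the base tree.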

Citation (APA)

Rodríguez, J. J., Díez-Pastor, J. F., & García-Osorio, C. (2011). Ensembles of decision trees for imbalanced data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6713 LNCS, pp. 76–85). https://doi.org/10.1007/978-3-642-21557-5_10