An experimental study about simple decision trees for bagging ensemble on datasets with classification noise

Abstract

Decision trees are simple structures used in supervised classification learning. Their classification results can be notably improved by ensemble methods such as Bagging, Boosting or Randomization, which are widely used in the literature; among these, Bagging outperforms Boosting and Randomization in situations with classification noise. In this paper, we present an experimental study of different simple decision tree methods as base classifiers for the bagging ensemble in supervised classification, showing that simple credal decision trees (based on imprecise probabilities and uncertainty measures) outperform classical decision tree methods in this type of procedure when applied to datasets with classification noise. © 2009 Springer Berlin Heidelberg.
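The experimental setup the abstract describes can be sketched with scikit-learn. Note the hedge: the paper's credal decision trees (built from imprecise probabilities and uncertainty measures) are not available in scikit-learn, so standard CART trees stand in as the base classifier; the sketch only illustrates the bagging-under-classification-noise protocol, not the paper's method, and the dataset, noise rate, and ensemble size are illustrative choices, not the paper's.

```python
# Sketch of the experimental procedure: train a single decision tree and a
# bagging ensemble of trees on data whose training labels have been
# artificially corrupted (classification noise), then compare on clean
# test labels. Standard CART trees stand in for the paper's credal trees.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Inject classification noise: flip roughly 20% of the training labels.
noise_rate = 0.20
flip = rng.random(len(y_tr)) < noise_rate
y_noisy = np.where(flip, 1 - y_tr, y_tr)

# Single tree vs. bagging ensemble (BaggingClassifier's default base
# estimator is a DecisionTreeClassifier), both fit on the noisy labels.
single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_noisy)
bagged = BaggingClassifier(n_estimators=100, random_state=0).fit(X_tr, y_noisy)

# Evaluate both on the uncorrupted test labels.
print("single tree accuracy:", single.score(X_te, y_te))
print("bagged trees accuracy:", bagged.score(X_te, y_te))
```

On noisy training labels like these, the bagged ensemble typically degrades less than the single tree, which is the effect the paper's study measures across base-classifier variants.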

APA

Abellán, J., & Masegosa, A. R. (2009). An experimental study about simple decision trees for bagging ensemble on datasets with classification noise. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5590 LNAI, pp. 446–456). https://doi.org/10.1007/978-3-642-02906-6_39
