This paper presents an extension of a standard decision tree classifier, namely, the C4.5 algorithm. This extension allows C4.5 to handle training data with uncertain labels, where uncertainty is modeled within the possibility theory framework. The extension mainly concerns the attribute selection measure: the possibility distributions within a partition are clustered in order to assess the homogeneity of that partition. The paper also provides a comparison with previously proposed possibilistic decision tree approaches. © 2009 Springer Berlin Heidelberg.
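The core idea can be sketched in a few lines: each training instance carries a possibility distribution over the class labels, and a partition is considered homogeneous when its distributions collapse into few clusters under a similarity measure. The sketch below is an illustrative assumption, not the paper's actual measure; the similarity function, greedy clustering scheme, and `threshold` parameter are all hypothetical stand-ins.

```python
def similarity(pi1, pi2):
    # Assumed similarity between two possibility distributions:
    # 1 minus the mean absolute difference of their degrees.
    return 1.0 - sum(abs(a - b) for a, b in zip(pi1, pi2)) / len(pi1)

def cluster_distributions(dists, threshold=0.9):
    # Greedy single-pass clustering: join the first cluster whose
    # representative (its first member) is similar enough, else
    # open a new cluster. A stand-in for the paper's clustering step.
    clusters = []
    for d in dists:
        for c in clusters:
            if similarity(c[0], d) >= threshold:
                c.append(d)
                break
        else:
            clusters.append([d])
    return clusters

def homogeneity(dists, threshold=0.9):
    # Fewer clusters means a more homogeneous partition;
    # normalized so a single cluster scores 1.0.
    return 1.0 / len(cluster_distributions(dists, threshold))

# Near-identical distributions fall into one cluster (score 1.0),
# while conflicting ones split into two (score 0.5).
print(homogeneity([[1.0, 0.2], [1.0, 0.2], [1.0, 0.3]]))  # → 1.0
print(homogeneity([[1.0, 0.0], [0.0, 1.0]]))              # → 0.5
```

In an attribute selection measure of this kind, the attribute whose induced partitions maximize such a homogeneity score would be chosen for the split.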
CITATION STYLE
Jenhani, I., Benferhat, S., & Elouedi, Z. (2009). On the use of clustering in possibilistic decision tree induction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5590 LNAI, pp. 505–517). https://doi.org/10.1007/978-3-642-02906-6_44