On the use of clustering in possibilistic decision tree induction


Abstract

This paper presents an extension of a standard decision tree classifier, namely, the C4.5 algorithm. This extension allows C4.5 to handle training data with uncertain class labels, where the uncertainty is modeled within the possibility theory framework. The extension mainly concerns the attribute selection measure, in which the possibility distributions of a partition are clustered in order to assess the homogeneity of that partition. The paper also provides a comparison with previously proposed possibilistic decision tree approaches. © 2009 Springer Berlin Heidelberg.
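The abstract does not specify how the possibility distributions are clustered or how the resulting clusters are turned into an attribute selection score, so the following is only a minimal sketch of the general idea: each instance carries a possibilistic label (a possibility distribution over classes), the instances reaching a node are partitioned by a candidate attribute, the labels in each branch are clustered, and a branch is considered more homogeneous when its labels collapse into fewer clusters. The distance measure (Manhattan), the one-pass "leader" clustering, the threshold value, and all function names here are illustrative assumptions, not the measure proposed in the paper.

```python
import numpy as np

def poss_distance(pi1, pi2):
    # Manhattan distance between two possibility distributions
    # (an assumed dissimilarity; the paper's exact measure may differ).
    return np.abs(np.asarray(pi1) - np.asarray(pi2)).sum()

def leader_clustering(distributions, threshold=0.3):
    # One-pass "leader" clustering: a distribution joins the first existing
    # cluster whose leader is within `threshold`, otherwise it starts a new one.
    leaders = []
    for pi in distributions:
        if not any(poss_distance(pi, leader) <= threshold for leader in leaders):
            leaders.append(pi)
    return leaders

def partition_homogeneity(distributions, threshold=0.3):
    # Score in (0, 1]: equals 1 when all possibilistic labels fall into one cluster.
    if not distributions:
        return 1.0
    return 1.0 / len(leader_clustering(distributions, threshold))

def attribute_score(values, distributions, threshold=0.3):
    # Weighted average homogeneity over the branches induced by an attribute,
    # analogous to how C4.5 weights per-branch impurity by branch size.
    n = len(values)
    score = 0.0
    for v in set(values):
        idx = [i for i, x in enumerate(values) if x == v]
        subset = [distributions[i] for i in idx]
        score += (len(idx) / n) * partition_homogeneity(subset, threshold)
    return score

if __name__ == "__main__":
    # Toy data: possibilistic labels over 3 classes for 6 instances.
    labels = [
        [1.0, 0.2, 0.0], [1.0, 0.1, 0.0], [0.9, 0.3, 0.1],  # mostly class 0
        [0.0, 1.0, 0.2], [0.1, 1.0, 0.0], [0.0, 0.8, 1.0],  # mixed classes 1/2
    ]
    attr_a = ["x", "x", "x", "y", "y", "y"]  # separates the two groups
    attr_b = ["x", "y", "x", "y", "x", "y"]  # mixes them
    print("attribute A:", attribute_score(attr_a, labels))  # higher score
    print("attribute B:", attribute_score(attr_b, labels))  # lower score
```

In this sketch the attribute that best separates similar possibilistic labels receives the higher score, which is the role the clustering-based measure plays in place of C4.5's entropy-based gain.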

Citation (APA)

Jenhani, I., Benferhat, S., & Elouedi, Z. (2009). On the use of clustering in possibilistic decision tree induction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5590 LNAI, pp. 505–517). https://doi.org/10.1007/978-3-642-02906-6_44
