We examine a new approach to building decision trees by introducing a geometric splitting criterion based on the properties of a family of metrics on the space of partitions of a finite set. The criterion can be adapted to the characteristics of the data set and the needs of the user, and it yields decision trees that are smaller and have fewer leaves than trees built with standard methods, while achieving comparable or better accuracy. © Springer-Verlag Berlin Heidelberg 2006.
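The metric splitting criterion described above can be illustrated with a minimal sketch. Assuming the Shannon special case of the generalized conditional entropy, the quantity d(π, σ) = H(π | σ) + H(σ | π) is a metric on partitions of a finite set, and a split attribute can be chosen by minimizing the distance between the partition it induces and the target class partition. The function names (`metric_distance`, `best_split`) and the dictionary-of-columns data layout are illustrative assumptions, not taken from the paper.

```python
from collections import Counter
from math import log2


def entropy(labels):
    """Shannon entropy of the partition induced by `labels`."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())


def conditional_entropy(labels, given):
    """H(labels | given): weighted entropy of `labels` within each block of `given`."""
    n = len(labels)
    total = 0.0
    for g in set(given):
        block = [lab for lab, gv in zip(labels, given) if gv == g]
        total += len(block) / n * entropy(block)
    return total


def metric_distance(a, b):
    """d(a, b) = H(a | b) + H(b | a), a metric on partitions (Shannon case)."""
    return conditional_entropy(a, b) + conditional_entropy(b, a)


def best_split(attributes, target):
    """Pick the attribute whose induced partition is metrically closest to the class partition."""
    return min(attributes, key=lambda name: metric_distance(attributes[name], target))
```

For example, given `target = ['y', 'y', 'n', 'n']`, an attribute with values `[1, 1, 0, 0]` induces exactly the class partition (distance 0) and is preferred over one with values `[1, 0, 1, 0]`.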
CITATION STYLE
Simovici, D. A., & Jaroszewicz, S. (2006). Generalized conditional entropy and a metric splitting criterion for decision trees. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3918 LNAI, pp. 35–44). https://doi.org/10.1007/11731139_7