Numerical attributes in decision trees: A hierarchical approach

Abstract

Decision trees are probably the most popular and commonly used classification model. They are built recursively in a top-down fashion (from general concepts to particular examples) by repeatedly splitting the training dataset. When this dataset contains numerical attributes, binary splits are usually performed by choosing the threshold value that minimizes the impurity measure used as the splitting criterion (e.g. C4.5's gain ratio or CART's Gini index). In this paper we propose the use of multi-way splits for continuous attributes in order to reduce tree complexity without decreasing classification accuracy. This is achieved by intertwining a hierarchical clustering algorithm with the usual greedy decision tree learning process. © Springer-Verlag Berlin Heidelberg 2003.
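To make the two ideas in the abstract concrete, the following minimal Python sketch shows (a) the standard binary split for a numeric attribute, choosing the threshold that minimizes weighted Gini impurity, and (b) an illustrative multi-way split built by agglomeratively merging adjacent value intervals. The merging criterion here (least increase in weighted impurity) is an assumption for illustration, not the paper's exact hierarchical clustering algorithm.

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a sequence of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_binary_threshold(values, labels):
    """Standard CART-style binary split: try midpoints between consecutive
    distinct values and return (threshold, weighted Gini) of the best one."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best = (None, float("inf"))
    for i in range(1, n):
        if pairs[i][0] == pairs[i - 1][0]:
            continue  # no threshold fits between equal values
        thr = (pairs[i][0] + pairs[i - 1][0]) / 2
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        w = (len(left) * gini(left) + len(right) * gini(right)) / n
        if w < best[1]:
            best = (thr, w)
    return best

def multiway_intervals(values, labels, k):
    """Illustrative multi-way split: start with one interval per distinct
    value, then repeatedly merge the adjacent pair whose merge increases
    total weighted impurity the least, until k intervals remain.
    Returns a list of (low, high) interval bounds."""
    pairs = sorted(zip(values, labels))
    groups = []
    for v, l in pairs:
        if groups and groups[-1][0][-1] == v:
            groups[-1][0].append(v)
            groups[-1][1].append(l)
        else:
            groups.append(([v], [l]))
    while len(groups) > k:
        def merge_cost(i):
            a, b = groups[i][1], groups[i + 1][1]
            merged = a + b
            return len(merged) * gini(merged) - (len(a) * gini(a) + len(b) * gini(b))
        j = min(range(len(groups) - 1), key=merge_cost)
        groups[j] = (groups[j][0] + groups[j + 1][0], groups[j][1] + groups[j + 1][1])
        del groups[j + 1]
    return [(g[0][0], g[0][-1]) for g in groups]
```

For example, with values `[1, 2, 3, 10, 11, 12]` and labels `[0, 0, 0, 1, 1, 1]`, the binary search finds the threshold 6.5 with zero impurity, while a three-way request on `[1, 2, 3, 10, 11, 12, 20, 21]` with labels `[0, 0, 0, 1, 1, 1, 0, 0]` recovers the three pure intervals, something a single binary split cannot express.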

Citation (APA)

Berzal, F., Cubero, J. C., Marín, N., & Sánchez, D. (2003). Numerical attributes in decision trees: A hierarchical approach. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2810, 198–207. https://doi.org/10.1007/978-3-540-45231-7_19
