Evaluation of decision tree pruning with subadditive penalties


Abstract

Recent work on decision tree pruning [1] has brought to the attention of the machine learning community the fact that, in classification problems, the use of subadditive penalties in cost-complexity pruning has a stronger theoretical basis than the usual additive penalty terms. We implement cost-complexity pruning algorithms with general size-dependent penalties to confirm the results of [1]: namely, that the family of pruned subtrees selected by pruning with a subadditive penalty of increasing strength is a subset of the family selected using additive penalties. Consequently, this family of pruned trees is unique, nested, and efficiently computable. However, in spite of the better theoretical grounding of cost-complexity pruning with subadditive penalties, we found no systematic improvement in the generalization performance of the final classification tree selected by cross-validation using subadditive penalties instead of the commonly used additive ones. © Springer-Verlag Berlin Heidelberg 2006.
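The cost-complexity criterion the abstract refers to can be sketched as R_alpha(T) = R(T) + alpha * phi(|T|), where R(T) is the training error of a pruned subtree T, |T| its number of leaves, and phi the penalty function: phi(n) = n in the classical additive (CART) case, and a concave function such as sqrt(n) in the subadditive case. The minimal sketch below is illustrative only, not the authors' implementation; the candidate subtrees and the sqrt penalty are hypothetical choices.

```python
import math

def penalized_cost(error, n_leaves, alpha, penalty):
    """Cost-complexity criterion: R_alpha(T) = R(T) + alpha * phi(|T|)."""
    return error + alpha * penalty(n_leaves)

# Additive penalty phi(n) = n (the classical CART choice).
additive = lambda n: n
# One example of a subadditive penalty, phi(n) = sqrt(n) (illustrative).
subadditive = lambda n: math.sqrt(n)

def best_subtree(candidates, alpha, penalty):
    """Pick the pruned subtree minimizing the penalized cost.

    `candidates` is a list of (training_error, n_leaves) pairs, one per
    candidate pruned subtree of the full tree (a toy representation)."""
    return min(candidates,
               key=lambda c: penalized_cost(c[0], c[1], alpha, penalty))

# Hypothetical candidates: larger subtrees fit the training data better.
candidates = [(0.30, 1), (0.15, 3), (0.08, 7), (0.05, 15)]
print(best_subtree(candidates, alpha=0.02, penalty=additive))
print(best_subtree(candidates, alpha=0.05, penalty=subadditive))
```

Sweeping alpha from 0 upward yields the nested family of pruned subtrees mentioned in the abstract; the paper's finding is that the subadditive sweep selects a subset of the subtrees selected by the additive sweep.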

Citation (APA)

García-Moratilla, S., Martínez-Muñoz, G., & Suárez, A. (2006). Evaluation of decision tree pruning with subadditive penalties. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4224 LNCS, pp. 995–1002). Springer Verlag. https://doi.org/10.1007/11875581_119
