Decision Tree Induction: Using Entropy for Attribute Selection

  • Bramer M

Abstract

This chapter examines some alternative strategies for selecting attributes at each stage of the TDIDT decision tree generation algorithm and compares the sizes of the resulting trees for a number of datasets. The risk of obtaining decision trees that are entirely meaningless is highlighted, underlining the importance of a good choice of attribute selection strategy. One of the most widely used strategies is based on minimising entropy (or, equivalently, maximising information gain), and this approach is illustrated in detail.

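The entropy-based selection strategy mentioned in the abstract can be sketched briefly. The snippet below is a minimal illustration, not the chapter's implementation: the toy dataset, attribute indices, and function names are invented for the example. It computes the entropy of a set of class labels, the information gain of splitting on a given attribute, and then chooses the attribute with maximum gain, as a TDIDT-style algorithm would at each node.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    counts = Counter(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(rows, labels, attr_index):
    """Reduction in entropy obtained by splitting on the attribute at attr_index."""
    base = entropy(labels)
    # Partition the class labels by the attribute's values.
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(label)
    weighted = sum(len(part) / len(labels) * entropy(part)
                   for part in partitions.values())
    return base - weighted

def best_attribute(rows, labels, attr_indices):
    """Pick the attribute with maximum information gain (i.e. minimum resulting entropy)."""
    return max(attr_indices, key=lambda i: information_gain(rows, labels, i))

# Hypothetical toy dataset: (outlook, windy) -> play
rows = [("sunny", "false"), ("sunny", "true"), ("overcast", "false"),
        ("rain", "false"), ("rain", "true")]
labels = ["no", "no", "yes", "yes", "no"]
print(best_attribute(rows, labels, [0, 1]))  # index of the attribute chosen at this node
```

In this toy example, splitting on attribute 0 (outlook) yields a higher information gain than attribute 1 (windy), so it would be selected for the current node; minimising the weighted entropy of the resulting subsets is equivalent to maximising this gain.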
Citation (APA)

Bramer, M. (2020). Decision Tree Induction: Using Entropy for Attribute Selection (pp. 49–62). https://doi.org/10.1007/978-1-4471-7493-6_5
