Splitting choice and computational complexity analysis of decision trees

3 citations · 20 Mendeley readers

Abstract

This research explores several theoretical properties of decision trees that provide support for decision-tree-based applications. First, many splitting criteria are available during tree growing, and the splitting bias caused by missing values and by variables with many possible values affects which variable is chosen; the results show that the Gini index is superior to information entropy because it is less affected by this bias. Second, missing values increase the chance that noise variables are chosen for splitting, whereas they do not for informative variables. Third, involving many noise variables in the tree-building process affects the computational complexity; the results show that the increase in computational complexity is linear in the number of noise variables. Consequently, methods that extract additional information from the original data at the cost of a higher variable dimension can still be considered in real applications.
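For context, the two splitting criteria compared in the abstract are usually defined as follows; these are the standard textbook forms, not formulas reproduced from the paper. For a node $t$ with class proportions $p_k$, a candidate split is scored by the impurity reduction it achieves:

\[
\mathrm{Gini}(t) = 1 - \sum_{k} p_k^{2},
\qquad
\mathrm{Entropy}(t) = -\sum_{k} p_k \log_2 p_k,
\]
\[
\Delta I(\mathrm{split}) = I(t) - \sum_{i} \frac{n_i}{n}\, I(t_i),
\]

where $I$ is either criterion, the $t_i$ are the child nodes produced by the split, and $n_i/n$ is the fraction of samples routed to child $i$. The abstract's first finding is that the variable selected to maximize this reduction is less distorted by missing values and by high-cardinality variables when $I$ is the Gini index than when it is the entropy.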

Citation (APA)

Zhao, X., & Nie, X. (2021). Splitting choice and computational complexity analysis of decision trees. Entropy, 23(10), 1241. https://doi.org/10.3390/e23101241
