Enhancing the efficiency of decision tree C4.5 using average hybrid entropy


Abstract

Building an efficient and effective decision tree is important because of its numerous applications in data mining and machine learning. Various modifications have been made to the splitting criterion of the decision tree, and different entropy measures have been proposed by different scholars. Shannon's entropy, Rényi's entropy, and Tsallis entropy are measures that affect the overall efficiency of the C4.5 decision tree. This research implements a new average hybrid entropy that combines statistical properties of Rényi's and Tsallis entropy: the average hybrid entropy is the average of the maxima of Rényi's and Tsallis entropy. The overall idea is to apply the average hybrid entropy on the basis of instances and to integrate those instances after pruning, which simplifies the pruning process and gives better results. Experiments on three standard datasets taken from the UCI repository (Credit-g, Diabetes, and Glass) show that the average hybrid entropy yields more efficient results.
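The splitting criterion described above is built from Rényi and Tsallis entropies. The sketch below shows one plausible way to compute such a hybrid measure for the class distribution at a candidate C4.5 split. It is only an illustration of the idea, not the authors' implementation: the parameter grids for α and q, and the reading of "average between the maxima" as the mean of the per-family maxima over those grids, are assumptions not specified in the abstract.

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Rényi entropy (bits) of a discrete distribution p, alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # ignore empty classes
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy of a discrete distribution p, q != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def average_hybrid_entropy(p, alphas=(0.5, 2.0), qs=(0.5, 2.0)):
    """Average of the largest Rényi and largest Tsallis values over
    assumed candidate parameter grids -- one reading of the paper's
    'average between the maxima' of the two entropies."""
    r_max = max(renyi_entropy(p, a) for a in alphas)
    t_max = max(tsallis_entropy(p, q) for q in qs)
    return (r_max + t_max) / 2.0

# Example: class proportions at a candidate split node
print(average_hybrid_entropy([0.7, 0.2, 0.1]))
```

In a C4.5-style learner, this function would simply replace Shannon entropy inside the information-gain (or gain-ratio) computation when ranking candidate splits.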

Citation (APA)

Rani, P., Kaur, K., & Kaur, R. (2019). Enhancing the efficiency of decision tree C4.5 using average hybrid entropy. In Communications in Computer and Information Science (Vol. 955, pp. 119–134). Springer Verlag. https://doi.org/10.1007/978-981-13-3140-4_12
