Forest CERN: A new decision forest building technique

Abstract

Persistent efforts continue to be made toward more accurate decision forest building techniques. In this paper, we propose a new decision forest building technique called “Forest by Continuously Excluding Root Node (Forest CERN)”. The key feature of the proposed technique is that it strives to exclude attributes that appeared in the root nodes of previous trees by imposing penalties on them, so that they are obstructed from appearing in some subsequent trees. Penalties are gradually lifted so that those attributes can reappear after a while. In addition, our technique uses bootstrap samples to generate a predefined number of trees. The goal of the proposed algorithm is to maximize tree diversity without impeding individual tree accuracy. We present elaborate experimental results involving fifteen widely used data sets from the UCI Machine Learning Repository. The experimental results indicate the effectiveness of the proposed technique in most cases.
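
To make the idea concrete, the following is a minimal sketch of a CERN-style forest in Python, assuming NumPy and scikit-learn decision trees. The penalty scheme shown here (temporarily withholding recently used root attributes and decaying their penalties after every tree) is a hypothetical simplification for illustration, not the paper's actual penalty formula, and the class name ForestCERNSketch is invented for this example.

    # Sketch of a Forest CERN-style ensemble, assuming NumPy arrays as input.
    # The penalty mechanism is a simplified stand-in for the paper's method.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    class ForestCERNSketch:
        def __init__(self, n_trees=100, penalty=1.0, decay=0.5, random_state=None):
            self.n_trees = n_trees
            self.penalty = penalty      # penalty placed on a fresh root attribute
            self.decay = decay          # fraction by which penalties shrink each round
            self.rng = np.random.default_rng(random_state)
            self.trees = []             # list of (tree, feature_indices) pairs

        def fit(self, X, y):
            n_samples, n_features = X.shape
            penalties = np.zeros(n_features)
            for _ in range(self.n_trees):
                # Bootstrap sample, as stated in the abstract.
                idx = self.rng.integers(0, n_samples, n_samples)
                # Attributes still carrying a heavy penalty are withheld so that
                # recent root attributes cannot immediately dominate new trees.
                allowed = np.flatnonzero(penalties < 0.5)
                if allowed.size == 0:
                    allowed = np.arange(n_features)
                tree = DecisionTreeClassifier(
                    random_state=int(self.rng.integers(1 << 31)))
                tree.fit(X[idx][:, allowed], y[idx])
                # Identify the attribute used at the root and penalise it;
                # older penalties are gradually lifted via the decay factor.
                root = tree.tree_.feature[0]
                penalties *= self.decay
                if root >= 0:                      # root may be a leaf (-2)
                    penalties[allowed[root]] = self.penalty
                self.trees.append((tree, allowed))
            return self

        def predict(self, X):
            # Majority vote over all trees in the forest.
            votes = np.stack([t.predict(X[:, cols]) for t, cols in self.trees])
            preds = []
            for col in votes.T:
                vals, counts = np.unique(col, return_counts=True)
                preds.append(vals[np.argmax(counts)])
            return np.array(preds)

Under the same assumptions, usage would look like model = ForestCERNSketch(n_trees=100).fit(X_train, y_train) followed by model.predict(X_test). With the default decay of 0.5, a root attribute is withheld for two subsequent trees before it becomes eligible again, mimicking the gradual lifting of penalties described above.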

Citation (APA)

Adnan, M. N., & Islam, M. Z. (2016). Forest CERN: A new decision forest building technique. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9651, pp. 304–315). Springer Verlag. https://doi.org/10.1007/978-3-319-31753-3_25
