Decision tree learning using a Bayesian approach at each node

Abstract

We explore the problem of learning decision trees using a Bayesian approach, called TREBBLE (TREe Building by Bayesian LEarning), in which a population of decision trees is generated by constructing trees using probability distributions at each node. Predictions are made either by using Bayesian Model Averaging to combine information from all the trees (TREBBLE-BMA) or by using the single most likely tree (TREBBLE-MAP), depending on what is appropriate for the particular application domain. We show on benchmark data sets that this method is more accurate than the traditional decision tree learning algorithm C4.5 and is as accurate as the Bayesian method SimTree, while being much simpler to understand and implement. In many application domains, such as help-desks and medical diagnosis, a decision tree needs to be learned from a prior tree (provided by an expert) and some (usually small) amount of training data. We show how TREBBLE-MAP can be used to learn a single tree that performs better than using either the prior tree or the training data alone.
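To make the two prediction modes concrete, the following is a minimal, hypothetical sketch (not the authors' implementation, and not the paper's node-level sampling procedure) of how predictions from a population of trees with posterior weights could be combined: TREBBLE-BMA-style averaging over all trees versus a TREBBLE-MAP-style choice of the single highest-posterior tree. The toy trees and weights are placeholders for illustration only.

```python
# Hypothetical sketch: combining a population of decision trees by
# Bayesian Model Averaging (BMA) versus using only the MAP tree.
from collections import defaultdict

def bma_predict(trees, weights, x):
    """Average class probabilities over all trees, weighted by posterior weight."""
    total = sum(weights)
    scores = defaultdict(float)
    for tree, w in zip(trees, weights):
        for label, p in tree(x).items():   # each tree returns P(class | x)
            scores[label] += (w / total) * p
    return max(scores, key=scores.get)

def map_predict(trees, weights, x):
    """Use only the single most probable (MAP) tree for prediction."""
    best_tree = max(zip(trees, weights), key=lambda tw: tw[1])[0]
    probs = best_tree(x)
    return max(probs, key=probs.get)

# Toy "trees": functions from a feature dict to class probabilities.
t1 = lambda x: {"yes": 0.8, "no": 0.2} if x["age"] > 30 else {"yes": 0.3, "no": 0.7}
t2 = lambda x: {"yes": 0.6, "no": 0.4} if x["income"] > 50 else {"yes": 0.1, "no": 0.9}

trees, weights = [t1, t2], [0.7, 0.3]      # weights stand in for tree posteriors
x = {"age": 42, "income": 40}
print(bma_predict(trees, weights, x))       # averages over both trees
print(map_predict(trees, weights, x))       # trusts only the highest-weight tree
```

The design difference the abstract highlights is visible here: BMA keeps every tree's vote in proportion to its posterior weight, while MAP discards all but one tree, which yields a single interpretable tree at the cost of ignoring model uncertainty.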

Citation (APA)

Andronescu, M., & Brodie, M. (2009). Decision tree learning using a Bayesian approach at each node. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5549 LNAI, pp. 4–15). https://doi.org/10.1007/978-3-642-01818-3_4
