A difficult problem associated with traditional decision tree rule induction algorithms is how to achieve common currency between the actual classification accuracy and the distribution of this accuracy between the outcome classes. This paper introduces a novel method which shows that it is possible to obtain decision trees exhibiting both good performance and balance. The method first induces a decision tree, which is then fuzzified using a Genetic Algorithm (GA). Each solution generated by the GA produces a set of fuzzy regions which are mapped onto all nodes in the tree. The GA's fitness function consists of two components, classification accuracy and balance, which are optimised concurrently. Three alternative functions are defined, each of which imposes different penalties on the classification accuracy depending on the selected weighting associated with the balance component. The method is applied to two real-world data sets and is shown to achieve a high degree of common currency between accuracy and balance.
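The two-component fitness described above can be sketched as follows. This is a minimal illustrative assumption of how an accuracy term and a balance term might be combined under a selectable weighting; the function names, the balance measure, and the single weighted form shown here are hypothetical and do not reproduce the paper's three specific fitness functions.

```python
def balance(per_class_accuracy):
    """Illustrative balance measure: 1 minus the spread between the
    best- and worst-classified outcome classes (assumption, not the
    paper's definition)."""
    return 1.0 - (max(per_class_accuracy) - min(per_class_accuracy))


def fitness(overall_accuracy, per_class_accuracy, w=0.5):
    """Hypothetical weighted fitness: w controls how strongly the
    balance component penalises raw classification accuracy."""
    return (1.0 - w) * overall_accuracy + w * balance(per_class_accuracy)


# A highly accurate but imbalanced candidate tree versus a slightly
# less accurate but well-balanced one: with equal weighting, the
# balanced candidate scores higher.
imbalanced = fitness(0.90, [0.99, 0.55], w=0.5)
balanced = fitness(0.85, [0.86, 0.84], w=0.5)
```

Raising `w` shifts the GA's search toward solutions that distribute accuracy evenly across the outcome classes, at some cost to overall accuracy, which mirrors the trade-off the abstract describes.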
Citation:
Crockett, K. A., Bandar, Z., & Al-Attar, A. (1999). Optimising Decision Classifications Using Genetic Algorithms. In Artificial Neural Nets and Genetic Algorithms (pp. 191–195). Springer Vienna. https://doi.org/10.1007/978-3-7091-6384-9_33