Optimal Decision Trees for Nonlinear Metrics

12 citations · 28 Mendeley readers

Abstract

Nonlinear metrics, such as the F1-score, Matthews correlation coefficient, and Fowlkes–Mallows index, are often used to evaluate the performance of machine learning models, in particular when facing imbalanced datasets that contain more samples of one class than the other. Recent optimal decision tree algorithms have shown remarkable progress in producing trees that are optimal with respect to linear criteria, such as accuracy, but nonlinear metrics remain a challenge. To address this gap, we propose a novel algorithm based on bi-objective optimisation, which treats misclassifications of each binary class as a separate objective. We show that, for a large class of metrics, the optimal tree lies on the Pareto frontier. Consequently, we obtain the optimal tree by using our method to generate the set of all nondominated trees. To the best of our knowledge, this is the first method to compute provably optimal decision trees for nonlinear metrics. Our approach leads to a trade-off when compared to optimising linear metrics: the resulting trees may be more desirable according to the given nonlinear metric at the expense of higher runtimes. Nevertheless, the experiments illustrate that runtimes are reasonable for the majority of the tested datasets.
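The key observation in the abstract — that for many nonlinear metrics the optimal tree lies on the Pareto frontier of (false negatives, false positives) — can be illustrated with a small sketch. This is not the paper's tree-search algorithm; it only shows the final selection step under assumed inputs: a hypothetical list of candidate trees summarised as (fn, fp) pairs, from which we keep the nondominated ones and pick the pair maximising F1.

```python
def pareto_front(points):
    """Keep nondominated (fn, fp) pairs: no other pair is <= in both coordinates."""
    return sorted(
        p for p in set(points)
        if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)
    )

def f1_score(fn, fp, pos):
    """F1 from false negatives, false positives, and the number of positive samples.

    F1 = 2*TP / (2*TP + FP + FN) is nonlinear in (fn, fp), unlike accuracy.
    """
    tp = pos - fn
    return 2 * tp / (2 * tp + fp + fn) if tp > 0 else 0.0

# Hypothetical candidate trees, each reduced to its (fn, fp) error counts.
candidates = [(0, 9), (1, 4), (2, 2), (3, 1), (2, 5), (5, 0)]
pos = 10  # assumed number of positive samples in the dataset

front = pareto_front(candidates)           # (2, 5) is dominated by (2, 2) and dropped
best = max(front, key=lambda p: f1_score(p[0], p[1], pos))
```

Because F1 (like the other metrics named above) only improves when fn or fp decreases with the other fixed, a dominated tree can never beat its dominator, so scanning the nondominated set suffices.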

Citation (APA)

Demirovic, E., & Stuckey, P. J. (2021). Optimal Decision Trees for Nonlinear Metrics. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 5A, pp. 3733–3741). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i5.16490
