Abstract
Regression trees are one of the oldest forms of AI models, and their predictions can be computed without a calculator, which makes them broadly useful, particularly for high-stakes applications. Within the large literature on regression trees, there has been little effort towards fully provable optimization, mainly due to the computational hardness of the problem. This work proposes a dynamic-programming-with-bounds approach to the construction of provably-optimal sparse regression trees. We leverage a novel lower bound based on an optimal solution to the k-Means clustering problem on one-dimensional data. We are often able to find optimal sparse trees in seconds, even for challenging datasets that involve large numbers of samples and highly-correlated features.
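The intuition behind the k-Means lower bound mentioned above can be sketched as follows: a regression tree with k leaves predicts the mean label of each leaf, so its squared error can never beat the best partition of the labels alone into k groups, which is exactly the optimal 1-D k-means objective. The sketch below (a hypothetical illustration, not the paper's implementation; the function name and O(k·n²) dynamic program are assumptions) computes that optimal 1-D k-means cost exactly.

```python
def kmeans_1d_lower_bound(labels, k):
    """Optimal 1-D k-means SSE on `labels` via dynamic programming.

    A valid lower bound on the training SSE of any regression tree
    with at most k leaves, since leaves predict group means.
    """
    x = sorted(labels)
    n = len(x)
    # Prefix sums of values and squares give O(1) segment SSE:
    # SSE(x[i:j]) = sum(x^2) - (sum(x))^2 / m  for a segment of size m.
    p1 = [0.0] * (n + 1)
    p2 = [0.0] * (n + 1)
    for i, v in enumerate(x):
        p1[i + 1] = p1[i] + v
        p2[i + 1] = p2[i] + v * v

    def sse(i, j):  # squared error of segment x[i:j] around its mean
        m = j - i
        s = p1[j] - p1[i]
        return (p2[j] - p2[i]) - s * s / m

    INF = float("inf")
    # dp[c][j]: minimal SSE of splitting the first j sorted labels
    # into c contiguous clusters (optimal 1-D clusters are contiguous).
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for c in range(1, k + 1):
        for j in range(c, n + 1):
            dp[c][j] = min(dp[c - 1][i] + sse(i, j) for i in range(c - 1, j))
    return dp[k][n]
```

For example, `kmeans_1d_lower_bound([1, 2, 8, 9], 2)` returns 1.0 (clusters {1, 2} and {8, 9}, each contributing SSE 0.5): no 2-leaf regression tree on these labels can achieve a lower training SSE.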
Zhang, R., Xin, R., Seltzer, M., & Rudin, C. (2023). Optimal Sparse Regression Trees. In Proceedings of the 37th AAAI Conference on Artificial Intelligence, AAAI 2023 (Vol. 37, pp. 11270–11279). AAAI Press. https://doi.org/10.1609/aaai.v37i9.26334