ControlBurn: Feature Selection by Sparse Forests

Abstract

Tree ensembles distribute feature importance evenly amongst groups of correlated features. As a result, the ranking of each feature in the correlated group is suppressed, which reduces interpretability and complicates feature selection. In this paper we present ControlBurn, a feature selection algorithm that uses a weighted LASSO-based penalty to prune unnecessary features from tree ensembles, just as low-intensity fire reduces overgrown vegetation. Like the linear LASSO, ControlBurn assigns all the feature importance of a correlated group of features to a single feature. Moreover, the algorithm is efficient: it requires only a single training iteration, unlike iterative wrapper-based feature selection methods. We show that ControlBurn performs substantially better than feature selection methods with comparable computational cost on datasets with correlated features.
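
The sketch below illustrates the general idea described in the abstract, not the authors' implementation: fit a forest, then solve a non-negative LASSO over the trees in which each tree's penalty is weighted by the number of features it uses, and finally select the union of features used by the trees that receive nonzero weight. It assumes scikit-learn's RandomForestRegressor and Lasso; the regularization strength, the column-rescaling trick, and helper names such as feats_per_tree are illustrative choices, not part of the paper.

```python
# Hypothetical sketch of a weighted, non-negative LASSO over the trees of a forest.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=500, n_features=20, n_informative=5, random_state=0)

forest = RandomForestRegressor(n_estimators=200, max_depth=4, random_state=0).fit(X, y)

# Column t of A holds the predictions of tree t; c[t] counts the features it splits on.
A = np.column_stack([tree.predict(X) for tree in forest.estimators_])
feats_per_tree = [set(tree.tree_.feature[tree.tree_.feature >= 0])
                  for tree in forest.estimators_]
c = np.maximum(np.array([len(f) for f in feats_per_tree], dtype=float), 1.0)

# A weighted penalty sum_t c[t] * w[t] is equivalent to a plain L1 penalty
# after rescaling column t of A by 1 / c[t].
lasso = Lasso(alpha=0.1, positive=True, max_iter=10_000)
lasso.fit(A / c, y)
w = lasso.coef_ / c  # weights on the original (unscaled) trees

selected = sorted(set().union(*(feats_per_tree[t] for t in np.flatnonzero(w > 1e-8))))
print("selected features:", selected)
```

Because whole trees are dropped rather than individual feature scores, a correlated group tends to be represented by whichever feature the surviving trees split on, mirroring the LASSO's behavior on correlated predictors.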

Citation (APA)

Liu, B., Xie, M., & Udell, M. (2021). ControlBurn: Feature Selection by Sparse Forests. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 1045–1054). Association for Computing Machinery. https://doi.org/10.1145/3447548.3467387
