Causal Transfer Random Forest: Combining Logged Data and Randomized Experiments for Robust Prediction

Abstract

It is often critical for prediction models to be robust to distributional shifts between training and testing data. From a causal perspective, the challenge is to distinguish stable causal relationships from unstable spurious correlations across such shifts. We describe a causal transfer random forest (CTRF) that combines existing training data with a small amount of data from a randomized experiment to train a model that is robust to feature shifts and therefore transfers to a new target distribution. Theoretically, we justify the robustness of the approach against feature shifts using results from causal learning. Empirically, we evaluate the CTRF on both synthetic data and real-world experiments in the Bing Ads platform, including a click prediction task and an end-to-end counterfactual optimization system. The proposed CTRF produces robust predictions and outperforms baseline methods in the presence of feature shifts.
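The setting the abstract describes can be illustrated with a minimal sketch: a large logged dataset in which a spurious feature correlates with the label, a small randomized experiment that breaks that correlation, and a test distribution where the spurious relationship flips. This is not the paper's CTRF algorithm — it only shows the data regime and a naive pooling baseline; the dataset sizes, noise levels, and feature construction are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_data(n, spurious_sign, randomized=False):
    """x_c causes y; x_s is only spuriously correlated with y (illustrative)."""
    x_c = rng.normal(size=n)
    y = (x_c + 0.3 * rng.normal(size=n) > 0).astype(int)
    if randomized:
        # The experiment randomizes x_s, severing its link to y.
        x_s = rng.normal(size=n)
    else:
        x_s = spurious_sign * (2.0 * y - 1.0) + 0.5 * rng.normal(size=n)
    return np.column_stack([x_c, x_s]), y

X_log, y_log = make_data(5000, spurious_sign=+1)                  # logged data
X_rct, y_rct = make_data(500, spurious_sign=0, randomized=True)   # small experiment
X_test, y_test = make_data(2000, spurious_sign=-1)                # shifted test set

# Naive pooling baseline (not CTRF): one forest on logged data alone,
# one on logged + randomized data concatenated.
f_logged = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_log, y_log)
f_pooled = RandomForestClassifier(n_estimators=100, random_state=0).fit(
    np.vstack([X_log, X_rct]), np.concatenate([y_log, y_rct]))

acc_logged = f_logged.score(X_test, y_test)
acc_pooled = f_pooled.score(X_test, y_test)
print(f"logged-only accuracy under shift: {acc_logged:.3f}")
print(f"pooled accuracy under shift:      {acc_pooled:.3f}")
```

A forest trained only on logged data tends to exploit the spurious feature and degrade when the correlation flips at test time; the point of the paper is that simply pooling the small experiment is not enough, and the CTRF instead uses the randomized data to guide which splits are trustworthy.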

Citation (APA)

Zeng, S., Bayir, M. A., Pfeiffer, J. J., Charles, D., & Kiciman, E. (2021). Causal Transfer Random Forest: Combining Logged Data and Randomized Experiments for Robust Prediction. In WSDM 2021 - Proceedings of the 14th ACM International Conference on Web Search and Data Mining (pp. 211–219). Association for Computing Machinery, Inc. https://doi.org/10.1145/3437963.3441722
