Adjusting for Confounders with Text: Challenges and an Empirical Evaluation Framework for Causal Inference

  • Weld G
  • West P
  • Glenski M
  • et al.

Abstract

Causal inference studies using textual social media data can provide actionable insights on human behavior. Making accurate causal inferences with text requires controlling for confounding, which could otherwise introduce bias. Recently, many different methods for adjusting for confounders have been proposed, and we show that these existing methods disagree with one another on two datasets inspired by previous social media studies. Evaluating causal methods is challenging, as ground truth counterfactuals are almost never available. Presently, no empirical evaluation framework for causal methods using text exists, and as such, practitioners must select their methods without guidance. We contribute the first such framework, which consists of five tasks drawn from real world studies. Our framework enables the evaluation of any causal inference method using text. Across 648 experiments and two datasets, we evaluate every commonly used causal inference method and identify their strengths and weaknesses to inform social media researchers seeking to use such methods, and guide future improvements. We make all tasks, data, and models public to inform applications and encourage additional research.
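The confounder adjustment the abstract refers to can be illustrated with a minimal, generic sketch (this is illustrative synthetic data, not the paper's method or datasets): when a confounder drives both treatment and outcome, a naive difference in means is biased, while controlling for the confounder, here via ordinary least squares, recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical synthetic data: a confounder z influences both
# treatment assignment and the outcome.
z = rng.normal(size=n)                           # confounder
t = (z + rng.normal(size=n) > 0).astype(float)   # treatment, correlated with z
y = 2.0 * t + 3.0 * z + rng.normal(size=n)       # true treatment effect = 2.0

# Naive estimate: raw difference in mean outcomes, biased upward by z.
naive = y[t == 1].mean() - y[t == 0].mean()

# Adjusted estimate: regress y on [intercept, t, z]; the coefficient
# on t estimates the treatment effect once z is controlled for.
X = np.column_stack([np.ones(n), t, z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
adjusted = beta[1]

print(f"naive: {naive:.2f}, adjusted: {adjusted:.2f}")
```

With text data the confounder is not a single observed column like `z` but must be inferred from high-dimensional language features, which is precisely why the methods the paper evaluates can disagree.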

Cite

APA

Weld, G., West, P., Glenski, M., Arbour, D., Rossi, R. A., & Althoff, T. (2022). Adjusting for Confounders with Text: Challenges and an Empirical Evaluation Framework for Causal Inference. Proceedings of the International AAAI Conference on Web and Social Media, 16, 1109–1120. https://doi.org/10.1609/icwsm.v16i1.19362
