Treatment effect estimates from a regression discontinuity design (RDD) have high internal validity. However, the arguments supporting the design apply to a subpopulation that is narrower than, and usually different from, the population of substantive interest in evaluation research. The disconnect between the RDD population and the evaluation population of interest suggests that RDD evaluations lack external validity. New methodological research offers strategies for studying, and sometimes improving, external validity in RDDs. This article examines four techniques: comparative RDD, covariate matching RDD, treatment effect derivatives, and statistical tests for local selection bias. The goal of the article is to help evaluators understand the logic, assumptions, data requirements, and reach of these new methods.
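The local nature of the RDD estimate can be made concrete with a minimal sharp-RDD sketch. This example is not from the article; the simulated data, cutoff, bandwidth, and use of statsmodels are illustrative assumptions. Units at or above a cutoff of the running variable are treated, and a local linear regression in a neighborhood of the cutoff recovers the treatment effect for units near that threshold, which is exactly the narrow subpopulation the abstract refers to.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative sharp RDD: units with running variable x >= 0 receive treatment.
rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(-1, 1, n)                             # running variable, centered at the cutoff (c = 0)
d = (x >= 0).astype(float)                            # treatment indicator
y = 1.0 + 0.5 * x + 0.4 * d + rng.normal(0, 0.3, n)   # outcome with a 0.4 jump at the cutoff

# Local linear regression within a bandwidth h of the cutoff, allowing
# separate slopes on each side; the coefficient on d is the RDD estimate.
h = 0.25
keep = np.abs(x) <= h
X = sm.add_constant(np.column_stack([d[keep], x[keep], d[keep] * x[keep]]))
fit = sm.OLS(y[keep], X).fit(cov_type="HC1")
print("Estimated effect at the cutoff:", fit.params[1])
```

The estimate applies to observations close to the cutoff; extrapolating it to the broader evaluation population is the external validity question the four techniques in the article address.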
Wing, C., & Bello-Gomez, R. A. (2018). Regression Discontinuity and Beyond. American Journal of Evaluation, 39(1), 91–108. https://doi.org/10.1177/1098214017736155