Analyzing Regression-Discontinuity Designs With Multiple Assignment Variables

2013 ◽  
Vol 38 (2) ◽  
pp. 107-141 ◽  
Author(s):  
Vivian C. Wong ◽  
Peter M. Steiner ◽  
Thomas D. Cook

Stats ◽
2021 ◽  
Vol 4 (4) ◽  
pp. 893-915
Author(s):  
Albert Whata ◽  
Charles Chimedza

In this paper, we determine treatment effects when treatment assignment is based on two or more cut-off points of covariates rather than on one cut-off point of a single assignment variable, using methods referred to as multivariate regression discontinuity designs (MRDD). A major finding of this paper is new evidence that both matric points and household income strongly affect the probability of eligibility for funding from the National Student Financial Aid Scheme (NSFAS) to study for a bachelor’s degree program at universities in South Africa. This evidence will inform policymakers and educational practitioners about the effects of matric points and household income on eligibility for NSFAS funding. The availability of the NSFAS grant greatly impacts students’ decisions to attend university or seek other opportunities elsewhere. Using the frontier MRDD analytical results, barely scoring 25 or more matric points, compared to scoring fewer than 25, increases the probability of eligibility for NSFAS funding by a statistically significant 3.75 percentage points (p-value = 0.0001 < 0.05) for students whose household income is less than R350,000 (≈US$2500). Therefore, we have shown that the frontier MRDD can be employed to determine the causal effects of barely meeting the requirements of one assignment variable among subjects who either meet or fail to meet the requirements of the other assignment variable.
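The frontier comparison the abstract describes, the effect of barely clearing the matric-points cut-off among students who already satisfy the income rule, can be sketched in a few lines. This is a minimal illustration under assumed variable names, with a simple difference in means within a bandwidth standing in for the authors' full frontier MRDD estimator:

```python
import numpy as np

def frontier_mrdd_effect(matric, income, eligible,
                         matric_cut=25, income_cut=350_000, bandwidth=3):
    """Difference in eligibility rates just above vs. just below the
    matric-points cut-off, restricted to subjects who satisfy the income
    rule (income < income_cut).  A narrow-bandwidth difference in means
    stands in for the local linear fits used in practice."""
    mask = income < income_cut                  # stay on the income side of the frontier
    m, y = matric[mask], eligible[mask]
    near = np.abs(m - matric_cut) <= bandwidth  # keep observations near the cut-off
    above = y[near & (m >= matric_cut)]
    below = y[near & (m < matric_cut)]
    return above.mean() - below.mean()
```

On simulated data in which eligibility jumps by a known amount at the frontier, the function recovers a difference close to that amount, which is the basic logic behind the 3.75-percentage-point estimate reported above.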


2020 ◽  
pp. 1-17
Author(s):  
Erin Hartman

Abstract Regression discontinuity (RD) designs are increasingly common in political science. They have many advantages, including a known and observable treatment assignment mechanism. The literature has emphasized the need for “falsification tests” and ways to assess the validity of the design. When implementing RD designs, researchers typically rely on two falsification tests, based on empirically testable implications of the identifying assumptions, to argue the design is credible. These tests, one for continuity in the regression function for a pretreatment covariate and one for continuity in the density of the forcing variable, use a null of no difference in the parameter of interest at the discontinuity. Common practice can incorrectly conflate a failure to reject this null, which is merely an absence of evidence of a flawed design, with affirmative evidence that the design is credible. The well-known equivalence testing approach addresses this problem, but how to implement equivalence tests in the RD framework is not straightforward. This paper develops two equivalence tests tailored for RD designs that allow researchers to provide statistical evidence that the design is credible. Simulation studies show the superior performance of equivalence-based tests over tests of difference as used in current practice. The tests are applied to the close elections RD data presented in Eggers et al. (2015b).
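The equivalence-testing idea can be illustrated with a two one-sided tests (TOST) sketch on a difference in covariate means at the cutoff: rejecting both one-sided nulls supports the claim that any discontinuity is smaller than a chosen margin. This is a generic TOST with a normal approximation, for illustration only; the paper's tests are tailored to the RD local-polynomial setting:

```python
import math

def tost_equivalence_pvalue(below, above, margin):
    """TOST p-value for equivalence of means within +/- margin.
    A small p-value is evidence that the covariate difference at the
    discontinuity lies inside the equivalence range."""
    nb, na = len(below), len(above)
    mb, ma = sum(below) / nb, sum(above) / na
    vb = sum((x - mb) ** 2 for x in below) / (nb - 1)
    va = sum((x - ma) ** 2 for x in above) / (na - 1)
    diff = ma - mb
    se = math.sqrt(vb / nb + va / na)
    norm_cdf = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    p_lower = 1 - norm_cdf((diff + margin) / se)  # H0: diff <= -margin
    p_upper = norm_cdf((diff - margin) / se)      # H0: diff >= +margin
    return max(p_lower, p_upper)                  # TOST: both nulls must be rejected
```

Unlike a test of difference, a well-powered TOST that fails to reject leaves the design's credibility an open question rather than silently certifying it.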


2017 ◽  
Vol 3 (2) ◽  
pp. 134-146
Author(s):  
Matias D. Cattaneo ◽  
Gonzalo Vazquez-Bare

2020 ◽  
Vol 8 (1) ◽  
pp. 164-181
Author(s):  
Cristian Crespo

Abstract This paper elaborates on administrative sorting, a threat to internal validity that has been overlooked in the regression discontinuity (RD) literature. Variation in treatment assignment near the threshold may still not be as good as random even when individuals are unable to precisely manipulate the running variable. This can be the case when administrative procedures, beyond individuals’ control and knowledge, affect their position near the threshold non-randomly. If administrative sorting is not recognized, it can be mistaken for manipulation, preventing the running variable from being fixed and leading researchers to discard viable RD designs.
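Sorting of either kind typically surfaces as bunching in the density of the running variable at the threshold. The sketch below uses a two-sided binomial count comparison just below versus just above the cutoff as a simple stand-in for formal density tests such as McCrary's; the function name and bandwidth choice are illustrative assumptions:

```python
import math

def density_balance_pvalue(running, cutoff, bandwidth):
    """Two-sided binomial p-value for equal counts just below vs. just
    above the cutoff.  A small p-value signals bunching at the threshold."""
    below = sum(1 for x in running if cutoff - bandwidth <= x < cutoff)
    above = sum(1 for x in running if cutoff <= x <= cutoff + bandwidth)
    n, k = below + above, min(below, above)
    # exact binomial tail under the null of equal density (p = 0.5)
    tail = sum(math.comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)
```

A small p-value signals bunching, but as the abstract argues, bunching produced by administrative procedures beyond individuals' control is not the same as manipulation by individuals, so a failed density check alone should not lead to discarding the design.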


2008 ◽  
Vol 142 (2) ◽  
pp. 615-635 ◽  
Author(s):  
Guido W. Imbens ◽  
Thomas Lemieux
