Latest News and Comment from Education

Saturday, January 11, 2014

Litigating DC IMPACT: The real usefulness of the Dee/Wyckoff Regression Discontinuity Design | School Finance 101

Posted on January 11, 2014

Much has been made of late regarding the erroneous classification of 44 teachers in Washington, DC as ineffective, leaving them facing job consequences. These erroneous ratings stemmed from an "error" in the calculation of the teachers' total scores, as acknowledged by the consulting firm applying the ratings. That is, in this case, the consultants simply did not carry out their calculations as intended. This is not to suggest by any stretch that the intended calculations are necessarily more accurate or precise than the unintended error. Indeed, there may well be, and likely are, far more than these 44 teachers whose ratings fall arbitrarily and capriciously in the zone where they face employment consequences.
So, how can we tell… how can we identify such teachers? Well, DC's own evaluation study of IMPACT provides one useful road map, and even a list of individuals arbitrarily harmed by the evaluation model. As I've stated on many, many occasions, it is simply inappropriate to make bright-line distinctions through fuzzy data. Teacher evaluation data are fuzzy. Yet teacher evaluation systems like IMPACT impose on those data many bright lines, or cut points, to make important consequential decisions: distinctions which are unwarranted, distinctions which characterize as substantively different individuals who simply are not.
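To make that point concrete, here is a minimal, hypothetical sketch of what a bright-line cutoff does to noisy scores. It is not drawn from the IMPACT data; the score scale, the cutoff, and the size of the measurement error are all assumptions chosen purely for illustration.

```python
# A hypothetical illustration of bright lines imposed on fuzzy data.
# All numbers (scale, cutoff, error size) are assumed, not from IMPACT.
import numpy as np

rng = np.random.default_rng(0)

n_teachers = 4000
true_score = rng.normal(300, 40, n_teachers)      # assumed "true" effectiveness
noise_sd = 25                                      # assumed measurement error
observed = true_score + rng.normal(0, noise_sd, n_teachers)

cutoff = 250  # assumed bright-line rating cutoff

# Teachers whose observed rating sits within one standard error of the cutoff:
# with this much noise, which side of the line they land on is close to chance.
near_line = np.abs(observed - cutoff) < noise_sd
# Teachers whose observed rating lands on the opposite side of the line
# from their "true" score.
misclassified = (observed < cutoff) != (true_score < cutoff)

print(f"teachers within one SE of the cutoff: {near_line.sum()}")
print(f"teachers on the 'wrong' side of the line: {misclassified.sum()}")
```

The point of the sketch is simply that whenever a consequential cut point sits inside the band of measurement error, some teachers will be classified differently from otherwise indistinguishable peers.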
Nowhere is this more clearly acknowledged than in Tom Dee and Jim Wyckoff's choice of regression discontinuity to evaluate the effect of being placed in different performance categories.
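For readers unfamiliar with the design, the following is a minimal regression discontinuity sketch on simulated data. It is emphatically not the Dee/Wyckoff analysis; the rating scale, bandwidth, and the size of the simulated jump at the cutoff are assumptions made only to show the logic: compare outcomes for those who land just above versus just below the rating cut point.

```python
# A minimal regression discontinuity sketch on simulated (not IMPACT) data.
import numpy as np

rng = np.random.default_rng(1)

n = 3000
score = rng.uniform(200, 300, n)   # assumed rating (the "running variable")
cutoff = 250                        # assumed category cut point
below = score < cutoff              # e.g., placed in the lower category

# Simulated next-year outcome: a smooth trend in the rating plus a jump of
# 5 points at the cutoff, which is the effect the design is meant to recover.
outcome = 0.1 * (score - cutoff) + 5.0 * below + rng.normal(0, 3, n)

# Local linear fits within an assumed bandwidth on each side of the cutoff.
bandwidth = 15
left = below & (score >= cutoff - bandwidth)
right = ~below & (score < cutoff + bandwidth)

b_left = np.polyfit(score[left] - cutoff, outcome[left], 1)
b_right = np.polyfit(score[right] - cutoff, outcome[right], 1)

# RD estimate: the gap between the two fitted lines at the cutoff itself.
rd_effect = np.polyval(b_left, 0.0) - np.polyval(b_right, 0.0)
print(f"estimated effect of falling just below the cutoff: {rd_effect:.2f}")
```

The design works precisely because teachers just above and just below the line are essentially interchangeable, which is the same reason the bright line itself is an arbitrary basis for consequential decisions.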