Latest News and Comment from Education

Sunday, November 6, 2011

MPR’s Unfortunate Sidestepping around Money Questions in the Charter CMO Report « School Finance 101

Let me start by pointing out that Mathematica Policy Research, in my view, is an exceptional research organization. They have good people. They do good work and have done much to inform public policy in what I believe are positive ways. That’s why I found it so depressing when I started digging through the recent report on Charter CMOs – a report that, as framed, was intended to explore the differences in effectiveness, practices, and resources of charter schools operated by various Charter Management Organizations.
First, allow me to point out that I believe the “relative effectiveness of CMOs” is not necessarily the right question – though it does have particular policy relevance when framed that way. Rather, I believe that the right questions at this point are not about charter versus non-charter, or KIPP versus Imagine or White Hat, but about what these schools are doing, and whether we have evidence that it works (across a broad array of …


When VAMs Fail: Evaluating Ohio’s School Performance Measures

Any reader of my blog already knows that I’m a skeptic of the usefulness of value-added models for guiding high-stakes personnel decisions in schools. As I’ve explained on previous occasions, statistical models fit to large numbers of data points – lots of teachers or lots of schools – might provide us with some useful information on the extent of variation in student outcomes across schools or teachers, and might reveal some useful patterns, but it’s generally not a useful exercise to try to say anything about any one single point within the data set. Yes, teacher “effectiveness” estimates are based on many data points across the students taught by that teacher, but they are still highly unstable – unstable to the point where, even as a researcher hoping to find value in this information, I’ve become skeptical.
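As an aside, a toy simulation can make the instability point concrete. The sketch below is purely illustrative and not from the post; the class size and the variance parameters are invented assumptions, chosen only to show the mechanism. It treats each teacher’s estimate as the mean score of one class of students, then checks how well estimates from two years agree when the true teacher effects are identical in both years:

```python
# A minimal, purely illustrative sketch (not from the post): why
# teacher value-added estimates can be unstable from year to year.
# Class size and variance parameters are invented assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_teachers = 1000
class_size = 25        # assumed students per teacher per year
sd_teacher = 0.15      # assumed SD of true teacher effects (test-score units)
sd_student = 1.0       # assumed SD of student-level noise

true_effect = rng.normal(0.0, sd_teacher, n_teachers)

def estimate_one_year():
    # Each teacher's estimate is the mean score of a single class,
    # so it carries the noise of only `class_size` students.
    noise = rng.normal(0.0, sd_student, (n_teachers, class_size))
    return true_effect + noise.mean(axis=1)

year1 = estimate_one_year()
year2 = estimate_one_year()

# True effects are identical in both years, yet the estimates
# correlate only modestly because of classroom-level noise.
print("year-to-year correlation:", np.corrcoef(year1, year2)[0, 1])
```

Under these made-up parameters the year-to-year correlation comes out around 0.35: averaging over a single class of 25 students leaves most of the variation in each estimate as noise rather than true teacher effect.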
However, I had been holding out more hope that school-level aggregate information on student growth – value-added estimates – might be more useful, mainly because it represents a higher level of aggregation. That is, …
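The aggregation point can be seen in the same invented simulation, run at the school level instead; again, every number here is an assumption for illustration, not an empirical estimate:

```python
# Extending the same toy setup to the school level: with ~500 students
# per school (an invented figure), the noise in each estimate shrinks
# and year-to-year estimates become far more stable.
import numpy as np

rng = np.random.default_rng(1)

n_schools = 1000
students_per_school = 500   # assumed; roughly 20x one classroom
sd_school = 0.15            # assumed SD of true school effects
sd_student = 1.0            # assumed SD of student-level noise

true_effect = rng.normal(0.0, sd_school, n_schools)

def estimate_one_year():
    # A school's estimate averages over far more students than a
    # single class, so the noise term shrinks accordingly.
    noise = rng.normal(0.0, sd_student, (n_schools, students_per_school))
    return true_effect + noise.mean(axis=1)

print("school-level year-to-year correlation:",
      np.corrcoef(estimate_one_year(), estimate_one_year())[0, 1])
```

With these assumptions the correlation rises to roughly 0.9, which is the intuition behind holding out more hope for school-level growth measures than for teacher-level ones.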