A Quick Look At The ASA Statement On Value-Added

Posted on the Shanker Blog, August 26, 2014



Several months ago, the American Statistical Association (ASA) released a statement on the use of value-added models in education policy. I’m a little late getting to this (and might be repeating points that others made at the time), but I wanted to comment on the statement, not only because I think it’s useful to have the ASA add its perspective to the debate on this issue, but also because its statement seems to have become one of the staple citations for those who oppose the use of these models in teacher evaluations and other policies.
Some of these folks claimed that the ASA supported their viewpoint – i.e., that value-added models should play no role in accountability policy. I don’t agree with this interpretation. To be sure, the ASA authors described the limitations of these estimates, and urged caution, but I think that the statement rather explicitly reaches a more nuanced conclusion: That value-added estimates might play a useful role in education policy, as one among several measures used in formal accountability systems, but this must be done carefully and appropriately.*
Much of the statement puts forth the standard, albeit important, points about value-added (e.g., moderate stability between years/models, potential for bias, etc.). But there are, from my reading, three important takeaways that bear on the public debate about the use of these measures and that are not always so widely acknowledged.
First, the authors state: “Estimates from VAMs should always be accompanied by measures of precision and a discussion of the assumptions and possible limitations of the model.” The imprecision of value-added estimates gets a great deal of attention (not all of it well-informed), but, while there are exceptions, many states/districts haven’t made much of an effort to candidly report or discuss this imprecision (or, even better, what can be done about it).
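As a purely hypothetical sketch of what that kind of reporting could look like (the numbers and the helper function below are invented for illustration, not drawn from any state’s system), a value-added point estimate might be published together with its standard error and a confidence interval rather than on its own:

```python
# A minimal, hypothetical sketch: report a value-added point estimate with
# a measure of its precision. All names and numbers here are assumptions.

def report_vam_estimate(label, estimate, std_error):
    """Print a value-added estimate alongside a 95% confidence interval."""
    lower, upper = estimate - 1.96 * std_error, estimate + 1.96 * std_error
    print(f"{label}: estimate = {estimate:+.2f} student-SD units "
          f"(SE = {std_error:.2f}), 95% CI = [{lower:+.2f}, {upper:+.2f}]")

# Two hypothetical teachers: similar point estimates, very different precision.
report_vam_estimate("Teacher A", 0.10, 0.08)   # interval spans zero
report_vam_estimate("Teacher B", 0.12, 0.03)   # interval excludes zero
```

Even this small example makes the basic point visible: two teachers with similar point estimates can carry very different amounts of uncertainty, and readers cannot see that unless the precision is reported.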
Moreover, states’ and districts’ discussion of even the most basic assumptions and issues hasn’t always been particularly robust, and in some cases it has become highly politicized, with officials (again, with exceptions) downplaying the concerns. In Florida, for instance, some value-added supporters (including state officials) have claimed that, because the estimates from the state’s model are not strongly associated with subsidized lunch eligibility, factors associated with poverty don’t influence individual teachers’ scores. This is a painfully oversimplified conclusion (the reality is far more nuanced).
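To see why that inference is too quick, consider a rough, hypothetical simulation (this is not Florida’s model or its data; every parameter below is an assumption chosen only for illustration). Scores can show only a weak statewide correlation with a poverty measure even when teachers in the highest-poverty classrooms are systematically shortchanged by a meaningful amount:

```python
# A hedged, purely illustrative simulation (all parameters are assumptions):
# a weak overall correlation between value-added scores and poverty can
# coexist with a concentrated, poverty-related distortion for some teachers.

import numpy as np

rng = np.random.default_rng(0)
n = 5000

poverty = rng.uniform(0, 1, n)            # share of each teacher's students in poverty
true_effect = rng.normal(0, 0.15, n)      # teachers' "true" effects (student-SD units)
noise = rng.normal(0, 0.20, n)            # estimation error

# Assume the model shortchanges only the highest-poverty classrooms.
bias = np.where(poverty > 0.8, -0.10, 0.0)
score = true_effect + bias + noise

corr = np.corrcoef(score, poverty)[0, 1]
gap = score[poverty > 0.8].mean() - score[poverty <= 0.8].mean()

print(f"Correlation of scores with poverty: {corr:.2f}")             # weak (around -0.1)
print(f"Average score gap for high-poverty teachers: {gap:.2f} SD")  # roughly -0.10
```

A correlation near zero across all teachers, in other words, does not rule out a sizable poverty-related distortion for a subset of them.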
It could be that states and districts are hesitant to lay it all out because these models are already controversial, and any statements in which states/districts are blunt about their limitations, or about addressing imprecision, would …