Sunday, June 17, 2012

Curmudgeon: Value-added measures don't measure up for Evaluation

Value-Added Measures don't make a good foundation for a teacher evaluation system.

A comment on a Joanne Jacobs article:
VAM measures the amount of improvement your students make. There are a number of ways to do this. Some of the early VAM methods were highly unstable. More sophisticated methods do seem to hold up well from year to year and also correlate with positive long term outcomes such as lower teen pregnancy rates and better education and employment as adults.

And here I thought I was supposed to be teaching math.
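For the record, here is roughly what the simplest flavor of value-added, a gain-score regression, actually computes. This is a toy sketch with made-up scores and teacher names; real VAM systems layer multi-year mixed-effects models on top of the same basic idea.

```python
# A toy illustration only: hypothetical data, hypothetical names.
# Real VAM systems use far more elaborate statistical models than this.
from collections import defaultdict
from statistics import mean

# Hypothetical records: (teacher, pretest score, posttest score)
students = [
    ("Smith", 62, 70), ("Smith", 55, 61), ("Smith", 80, 83),
    ("Jones", 58, 72), ("Jones", 71, 74), ("Jones", 66, 75),
]

# Fit one overall line: expected posttest given pretest (simple regression).
xs = [pre for _, pre, _ in students]
ys = [post for _, _, post in students]
x_bar, y_bar = mean(xs), mean(ys)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

# A teacher's "value added" is the average residual of their students:
# how far above or below the predicted posttest their students landed.
residuals = defaultdict(list)
for teacher, pre, post in students:
    predicted = intercept + slope * pre
    residuals[teacher].append(post - predicted)

for teacher, resid in residuals.items():
    print(f"{teacher}: value-added estimate = {mean(resid):+.2f}")
```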

I don’t believe VAM is anything on which to base bonuses or terminations. “Seem to correlate” does not mean “cause” … and that’s for the best of the measurements.

And what of all the poor ones? “Some of the early VAM methods were highly unstable.” (“Unstable” is a charitable term for “any resemblance to a consistent reality is neither implied nor intended.”)

It means that the results cannot be trusted for grading the students who took the tests (that’s stated plainly and explicitly in the administrator’s notes), and it means that the tests are even worse at evaluating the teacher, who didn’t take them.
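To see why “unstable” is the right word, consider a toy simulation (every number here is invented for illustration): give each teacher a fixed true effect, add student-level noise of a plausible size, and check how often a top-quartile teacher in one year stays top-quartile the next.

```python
# A toy simulation, not a model of any real VAM: assume each teacher has a
# fixed true effect, but each year's scores add independent student noise.
import random
random.seed(1)

N_TEACHERS = 100
STUDENTS_PER_CLASS = 25
TRUE_EFFECT_SD = 2.0    # spread of genuine teacher effects (score points)
NOISE_SD = 12.0         # student-level noise: ability, test error, luck

true_effect = [random.gauss(0, TRUE_EFFECT_SD) for _ in range(N_TEACHERS)]

def yearly_estimate(effect):
    # Observed class-average gain = true effect + averaged student noise.
    noise = [random.gauss(0, NOISE_SD) for _ in range(STUDENTS_PER_CLASS)]
    return effect + sum(noise) / STUDENTS_PER_CLASS

year1 = [yearly_estimate(e) for e in true_effect]
year2 = [yearly_estimate(e) for e in true_effect]

# How often does a "top quartile" teacher in year 1 repeat in year 2?
cut1 = sorted(year1)[3 * N_TEACHERS // 4]
cut2 = sorted(year2)[3 * N_TEACHERS // 4]
top1 = {i for i, v in enumerate(year1) if v >= cut1}
top2 = {i for i, v in enumerate(year2) if v >= cut2}
print(f"top-quartile teachers who repeat next year: {len(top1 & top2)} of {len(top1)}")
```

With only 25 students per class, the noise swamps much of the signal, and a large share of the “top” teachers reshuffle from one year to the next.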