Latest News and Comment from Education

Wednesday, October 7, 2015

Teacher: Testing is supposed to provide ‘rich’ student data — but here’s why it’s actually worthless - The Washington Post

One of the big ideas driving school reform today is that data is king, and the more of it the better for teachers, who can use it to improve and tailor instruction to their students. The problem, according to many teachers, is that the data they get from standardized tests doesn’t help them much because it doesn’t carry any real meaning. This is clearly explained in this post by Peter Greene, a veteran English teacher in a small Pennsylvania town, who first published it on his lively Curmudgucation blog.
Greene talks about value-added measurement, or VAM, the method by which teacher evaluations are now being linked to student standardized test scores. VAM formulas supposedly can tease out from the test scores how much “value” a teacher adds to a student’s academic progress while factoring out every other influence on the student’s test performance (including hunger, illness, etc.). Assessment experts say VAM should not be used to evaluate teachers because the method isn’t reliable enough, but school reformers are doing it anyway.
How VAM formulas generally work is that each student is assigned a “predicted” score on a state-mandated test, based on the past performance of that student and of other students. If a student exceeds the predicted score, the teacher is credited with “adding value.” If the student falls short of the predicted score, the teacher is held responsible, and that score counts negatively toward his or her evaluation. (You can read here about one teacher whose top-scoring students actually hurt his evaluation.)
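To make the arithmetic concrete, here is a minimal sketch in Python of the predicted-versus-actual comparison described above. It is not any state’s actual formula (real models such as PVAAS run covariate-adjusted regressions over several years of scores); the scores and the simple averaging here are assumptions for illustration only.

```python
# Hypothetical, simplified illustration of a value-added calculation.
# Real VAM models are far more elaborate; the numbers below are made up.

students = [
    # (predicted score, actual score) -- both assumed for illustration
    (1520, 1560),  # beat the prediction: counts as "adding value"
    (1480, 1450),  # fell short: counts against the teacher
    (1600, 1600),  # matched the prediction: neutral
]

# The teacher's "value added" is taken to be the average gap between
# actual and predicted scores across his or her students.
residuals = [actual - predicted for predicted, actual in students]
value_added = sum(residuals) / len(residuals)

# A positive number credits the teacher; a negative number counts
# against the evaluation.
print(f"Mean residual (the teacher's 'value added'): {value_added:+.1f}")
```

Note how everything hinges on the predicted scores: a student who scores well but slightly below an aggressive prediction still counts against the teacher, which is how even top-scoring students can hurt an evaluation.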
Here’s Greene’s piece on just how value-less supposedly valuable value-added data really is.
By Peter Greene
It’s autumn in Pennsylvania, which means it’s time to look at the rich data to be gleaned from our Big Standardized Test (called PSSA for grades 3-8, and Keystone Exams at the high school level). We love us some value-added data crunching in PA (our version is called PVAAS, an early version of the value-added baloney model).
This is a model that promises far more than it can deliver, but it also makes up a sizable chunk of our school evaluation model, which in turn is part of our teacher evaluation model. Of course the data crunching and collecting is supposed to have many valuable benefits, not the least of which is unleashing a pack of rich and robust data hounds who will chase the wild beast of low student achievement up the tree of instructional re-alignment. Like every other state, we have been promised that the tests will have classroom teachers swimming in a vast vault of data, like Scrooge McDuck in a vault of gold bullion.