Monday, October 5, 2015

PSSAs: Emphasis on standardized testing has gone too far

For a few years, measuring education performance using standardized tests was all about making “adequate yearly progress.”
But don’t look for any “progress” in the newest crop of Pennsylvania standardized test scores — and not just because the scores are all lower.
Even the Pennsylvania Department of Education admits “it is not useful to directly compare students’ scores on the new assessment to students’ scores from previous assessments.”
That’s because the most recent test is radically different from the test given the year before. Further, halfway through the process, the department changed the “cut score,” making it even harder to score well.
Or, as Pottsgrove School Board President Justin Valentine said at a recent board meeting, it is like being “told a home run is 305 feet and having them change it to 350 feet in the middle of the game.”
This was all done to align the test — the Pennsylvania System of School Assessment, or PSSA — with the commonwealth’s new “core” standards, a variation on the national Common Core standards being implemented, with some controversy, nationwide.
School officials throughout the region got the word out before last week’s release that parents should be prepared for lower scores because of the changes. But a growing movement in Pennsylvania also questions whether the emphasis on tests — “how did my school do?” — has become more about reputation than about actual learning.
Or more about the community’s challenges than about the opportunity schools provide to overcome those challenges. In a policy brief published last year, Ed Fuller, executive director of the Department of Education Policy at Penn State’s College of Education, wrote that School Performance Profile (SPP) scores are more closely tied to things like poverty and a parent’s education than to school effectiveness.
“The SPP scores are more accurate at identifying the percentage of economically disadvantaged students in a school than at identifying the effectiveness of a school,” Fuller wrote in his brief, which matched data about the economic demographics of students with their schools’ scores.
Standardized tests are also big business.
“Strength in Numbers,” a 2012 Brown Center on Education Policy report for The Brookings Institution, updated in June, found that states spend about $1.7 billion a year on testing.
Six major national vendors dominate that market with Data Recognition Corp., used in Pennsylvania, ranking as the second most expensive, with an average cost of $39 per student, the report found.
In the period from 2009 to 2012, Pennsylvania spent more than $32 million per year on testing.