So about that actionable data...
One of the frequently offered reasons for the Big Standardized Tests is that they are supposed to provide information that will allow classroom teachers to "inform instruction," to tweak our teaching to better address whatever weaknesses the results reveal.
Our BS Tests are called the Keystones (we're the Keystone State-- get it?). They are not a state requirement yet-- the legislature has blinked a couple of times now and kicked that can down the road. Because these tests are norm-referenced, aka graded on a curve, using them as a graduation requirement is guaranteed to result in the denial of diplomas for some huge number of Pennsylvania students. However, many local districts, like my own, make them a local graduation requirement in anticipation of the day when the legislature has the nerve to pull the trigger (right now 2019 is the year it all happens). The big difference with a local requirement is that we can offer an alternative assessment; our students who never pass the Keystones must complete the Binder of Doom-- a huge collection of exercises and assessment activities that allow them to demonstrate mastery. It's no fun, but it beats not getting a diploma because you passed all your classes but failed one bad standardized test.
Why do local districts attach stakes to the Keystones? Because our school rating and our individual teacher ratings depend upon those test results.
So it is with a combination of curiosity and professional concern that I try to find real, actionable data in the Keystone results, to see if there are things I can do, compromises I can make, even insights I can glean from breaking that data down.
The short answer is no. Let me walk you through the long answer. (We're just going to stick to the ELA results here).
The results come back to the schools from the state in the form of an enormous Excel document.