Friday, May 15, 2015


Standardized Test Scores and Family Income 






[Image: tucson-school-score-map_allclouds.jpg]

    This bears repeating whenever the subject of high-stakes testing comes up. There’s a very strong correlation between standardized test scores and family income. Test scores are higher in areas with high family income and lower in areas with low family income. It’s true in Tucson. It’s true in Phoenix. It’s true across the United States. It’s true in developed countries around the world (and probably even more true in developing countries).

    Based on information from international testing, we know that test score inequality is higher in the U.S. than in other developed countries, but so is income inequality.

    I created a few maps of the Tucson area a while back with schools' state grades and median household income to demonstrate how neatly the test score/family income correlation works out.

    The map at the top of the post shows the state grades of all the schools in the Tucson area. Since state grades are mostly a reflection of the schools’ AIMS scores, a high grade generally means high test scores. I generated the map from the Department of Education website, then added colored clouds to emphasize the grade clusters. As you can see, Marana, Oro Valley, the Foothills and Vail have mostly A schools with a smattering of B schools thrown in. The B schools cluster just below the Foothills, the C schools are scattered from the center to the east of the city, and the D schools are mainly in the south and southwest areas.

    Anyone who’s lived around here for a while doesn’t really need the second map to understand how closely the school scores align with the incomes of families living in those areas, but in the map below, the distribution of median household incomes lays any questions to rest.



    [Image: PIMA-NOT-CHARTER-AND-MEDIAN-INCOME-w_clouds.jpg]
       
      Blue areas have the highest incomes, and the pinkish areas have the lowest incomes—the lighter the shade, the lower the income. The triangles are schools and their state grades. The colored clouds for the school grades match up incredibly closely with the income levels. (Note: Since the person who made this map for me left out the charter schools, the clouds on the two maps are close but not a perfect match.)
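
      If you want to put a number on how tightly the grades track income, rather than just eyeballing the clouds, here’s a minimal sketch of one way to do it. To be clear, the file name, column names and grade-to-number coding below are illustrative assumptions, not the actual data behind these maps; the idea is simply to rank schools by their state grades and by the median household income around them, then compute a rank correlation.

# Illustrative sketch only. The CSV file, its columns and the grade coding
# are assumptions for demonstration, not the data used to build the maps above.
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical table: one row per school, with its state letter grade and
# the median household income of the surrounding census tract or ZIP code.
schools = pd.read_csv("tucson_schools.csv")  # columns: school, state_grade, median_income

# Treat letter grades as an ordinal scale (A highest, F lowest).
grade_rank = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}
schools["grade_score"] = schools["state_grade"].map(grade_rank)

# Spearman's rank correlation fits here because letter grades are ordinal,
# not continuous scores.
rho, p_value = spearmanr(schools["grade_score"], schools["median_income"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")

      A strongly positive rho would be the numerical version of what the two maps show visually: as neighborhood income goes up, so does the school’s letter grade.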

      I've gone through the same process with a map of the Phoenix area and arrived at similar results, and all the literature indicates you'll find the same correlation elsewhere.

      What does this mean? It means we pretty much know how schools’ test scores are going to turn out, with minor variations and occasional exceptions. Generally, schools with high-income students will get high test scores, and schools with low-income students will get low test scores. In other words, most of what we “learn” from our arduous, costly regimen of high-stakes testing, we already know. The scores have less to do with successful schools and failing schools than with the socioeconomic status of the kids who walk through the door.

      As for the exceptions, they tell us less than we might hope. The tests are such blunt instruments, and gaming the tests to increase student scores, legally and illegally, is so prevalent, that minor variations from what we expect tell us little about a school’s student achievement. As for those situations where the results are very different from what we might expect (usually schools with low-income students who get high test scores), all too often the unbelievable results turn out to be just that: unbelievable. We should have learned that lesson from schools in Atlanta, Georgia, and Washington, D.C., and most likely Arizona as well, and we have plenty of indications it’s true elsewhere. We’ve been fooled too many times. We shouldn’t let amazingly high test scores make fools of us again.

      Of course, some schools really do get great results, and not because they cherry-picked their students or because they gamed the test. Some schools genuinely do an exceptional job with their students. Unfortunately, we haven’t figured out how to catch that lightning, put it in a bottle and transport it to other schools. It’s not like building a thousand identical McDonald’s and turning out identical burgers and fries at each location. Education doesn’t work that way. Sure, we can keep testing every student every year, year after year, and tell schools with low-income students, “Your scores are too low. Be exceptional! Be exceptional!” We can also tell writers whose style and substance aren’t up to snuff, “Your writing isn’t good enough. Be Shakespeare! Be Shakespeare!” Unfortunately, that doesn’t mean those schools will end up being exceptional or those writers will end up being Shakespeare.

      You want to test kids so we can get a snapshot of their achievement level and a rough idea of how well their schools are teaching them? Fine. Administer standardized tests to 3rd grade, 6th grade and 10th grade students. It’ll cost less, take up far less teaching time, create far less stress for students and staff, and yield results as good as or better than what we’re getting from this insane regimen of yearly high-stakes tests.