Were kids actually smarter decades ago?: Student achievement data trends (NAEP, etc.)
To understand the successes and failures of public education in the United States and Texas (as measured by achievement tests), I ask the following research questions in this post: How have the nation's public schools performed on the NAEP over the past several decades? Were kids actually smarter 30 years ago in the U.S.? And how does Texas perform on its state-mandated exams, and how does it compare to the nation and its peer states (California and New York) in national achievement data over the past several decades?

To answer these questions, I first examine K-12 achievement data from the TAAS, TAKS, and STAAR, the succession of state-mandated exam regimes in Texas. I then use data from the NAEP to examine national trends over the last several decades (Spoiler Alert: Our kids in public schools aren't as stupid as the media and school "reformers" tell us). Finally, I rank the Lone Star State on the NAEP relative to all U.S. states and its most comparable peer states (California and New York). For international comparisons between all U.S. states and the world, see the post Who's Smarter Than Texans: Math and Science Test Scores Compared to the World and Nation.

Note: Lest you believe I have become enamored with high-stakes tests, in a post later this week I will discuss our nation's obsession with measuring outputs via high-stakes tests relative to inputs (school finance).
First, let's start with the Texas data (skip these sections if you don't give a flying into Fukushima Airport about Texas), then I will move to the national data and comparisons.
Texas Assessment of Academic Skills (TAAS)
Figure 1 shows that for All Students, the percent of students passing went from 55 percent in 1994 to 92 percent by