Latest News and Comment from Education

Wednesday, May 9, 2012

Further Thoughts on Computer Scoring of Student Writing - Dana Goldstein

Further Thoughts on Computer Scoring of Student Writing

Today Slate published a piece I've been working on for about a month on computer assessment of student writing. You may have read about a new University of Akron study that found computer programs and human readers award similar grades to student writing samples. The results are impressive, but their applicability is limited. As I discuss in the article, the study looked exclusively at the current generation of state standardized tests, which require students to write essays far less sophisticated than the ones we hope they will write once the Common Core is fully implemented in 2014.
The recent push for automated essay scoring comes just as we're on the verge of making standardized essay tests much more sophisticated, in ways robo-graders will have difficulty handling. One of the major goals of the new Common Core curriculum standards, which 45 states have agreed to adopt, is to supplant the soft-focus "personal essay" writing that currently predominates in American classrooms with more evidence-driven, subject-specific writing. The creators of the Common Core hope machines can soon score these essays cheaply and quickly, saving states money in a time of harsh education budget cuts.