Latest News and Comment from Education

Sunday, August 9, 2015

CURMUDGUCATION: Another Writing Robo-teacher

This summer the University of Delaware was happy to unveil yet more research on yet another attempt to argue that computer software has a place in writing instruction.

Being up front

As a high school English teacher, I've thought about this a great deal, and written about it on several occasions (here, here, and here, for example). And mostly I think actual useful essay-grading computers are about as probable as unicorns dancing with chartreuse polar bears in fields of asparagus. We could safely label me "Predisposed to be Skeptical."

And yet I'm determined to have an open mind. But I've been down this road before, and I recognize the Big Red Flags when I see them.

Red Flag #1: Who's Paying for This?

Assistant professor Joshua Wilson, from UD's School of Education, set out to see if the software PEGWriting could be used not just to score student writing, but to inform and assist instruction throughout the year. Why would he want to look into that?


The software Wilson used is called PEGWriting (which stands for Project Essay Grade Writing), based on work by the late education researcher Ellis B. Page and sold by Measurement Incorporated, which supports Wilson's research with indirect funding to the University. 

So, the software maker paid for and perhaps commissioned this research. Just to be clear, the fact that there's no direct quid pro quo makes it worse-- if your funding pays directly for my project, the funding and the project can go away together and life goes on for the rest of my department. But if I'm doing research on your product over here while you're paying for, say, all our office furniture over there, the stakes are much higher.

At any rate, this is clear built-in bias. Anything else?

Red Flag #2: You're Scoring What??!!

The software uses algorithms to measure more than 500 text-level variables to yield scores and feedback regarding the following characteristics of writing quality: idea development, organization, style, word choice, sentence structure, and writing conventions such as spelling and grammar. 

First, I know you think it's impressive that it's measuring 500 variables ("text-level"-- as opposed to some other level?? Paper-level?), but it's not. It's like telling me that you have a vocabulary of 500 words. Not so impressive, given the nature of language.
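To see why a pile of "text-level variables" is less impressive than it sounds, consider how easy they are to crank out. The sketch below is purely illustrative-- it is not PEGWriting's actual algorithm (its 500 variables are proprietary), just a handful of the kind of surface counts software can compute without ever touching meaning:

```python
# Toy illustration of "text-level variables": surface features a
# computer can count trivially. NOT PEGWriting's actual algorithm;
# just the flavor of countable feature such systems rely on.
import re

def text_features(essay: str) -> dict:
    """Return a few surface-level counts for an essay."""
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "avg_word_length": sum(len(w) for w in words) / max(len(words), 1),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "unique_word_ratio": len({w.lower() for w in words}) / max(len(words), 1),
        "comma_count": essay.count(","),
    }

# Every one of these numbers is identical whether the essay's claims
# are true, false, or pure baloney.
features = text_features("Abe Lincoln was a great peacemaker. He sought peace.")
print(features["word_count"], features["sentence_count"])
```

Multiply a list like this by a few combinations and ratios and you reach 500 variables quickly-- none of which knows whether Lincoln was actually a peacemaker.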

But beyond that-- PEGWriting wants to market itself as being tuned into six trait writing. I have no beef with the six traits-- I've used them myself for decades. And that's how I know that no software can possibly do what this software claims it can do.

Idea development? Really? I will bet you dollars to donuts that if I take my thesis statement ("Abe Lincoln was a great peacemaker") and develop it with absolute baloney support ("Lincoln helped