Thursday, November 28, 2019

CURMUDGUCATION: AI: Bad Data, Bad Results

AI: Bad Data, Bad Results

Once upon a time, when you took computer programming courses, you had two things drilled into you:

1) Computers are dumb. Fast and indefatigable, but dumb.

2) Garbage in, garbage out.

The rise of artificial intelligence is supposed to make us forget both of those things. It shouldn't. It especially shouldn't in a field like education, which is packed with cyber-non-experts and far too many people who think that computers are magic and AI computers are super-shiny magic. Too many folks in the Education Space get most of their current computer "training" from people who have something to sell.


AI is too often used inappropriately, when all we've really got is a fancy algorithm, but no actual intelligence, artificial or otherwise. We're supposed to get past that with software that can learn, except that we haven't got that sorted out either.

Remember Tay, the Microsoft intelligent chatbot that learned to be a horrifying racist? Tay actually had a younger sister, Zo, who was supposed to be better, but turned out arguably worse in different ways. Facial recognition programs still misidentify Black faces.

The pop culture notion, long embedded in all manner of fiction, is that a cold, logical computer would be ruthlessly objective. Instead, what we learn over and over and over and over and over and over again is that a computer is ruthlessly attached to whatever biases are programmed into it.
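That point can be sketched in a few lines of Python. The data and names below are invented for illustration: a toy "model" that does nothing but count label frequencies will faithfully reproduce whatever skew is in its training data, no matter how "objectively" it computes.

```python
# Toy illustration of "garbage in, garbage out": a trivial "model" that
# memorizes label frequencies from its training data. If the data is
# skewed, the model's perfectly "logical" predictions are skewed too.
from collections import Counter

def train(examples):
    """Count how often each (feature, label) pair appears."""
    return Counter(examples)

def predict(counts, feature):
    """Return the label most often seen with this feature."""
    candidates = {lbl: n for (feat, lbl), n in counts.items() if feat == feature}
    return max(candidates, key=candidates.get)

# Hypothetical biased training set: loan decisions that historically
# tracked zip code rather than any real measure of merit.
biased_data = (
    [("zip_A", "approve")] * 9 + [("zip_A", "deny")] * 1
    + [("zip_B", "approve")] * 1 + [("zip_B", "deny")] * 9
)

model = train(biased_data)
print(predict(model, "zip_A"))  # approve -- the model reproduces the skew
print(predict(model, "zip_B"))  # deny -- same bias, now automated
```

Nothing in the arithmetic is unfair; the unfairness rode in with the data, which is exactly the trap real machine-learning systems fall into at scale.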

Wired just published an article about how tweaking the data used to train an AI could be the new…