Latest News and Comment from Education

Monday, August 20, 2018

Schools are using AI to track their students — Quartz

Schools are using AI to track what students write on their computers


Over 50 million K-12 students will go back to school in the US this month. For many of those using a school-issued computer, every word they type will be tracked.
Under the Children’s Internet Protection Act (CIPA), any US school that receives federal funding is required to have an internet-safety policy. As school-issued tablets and Chromebook laptops become more commonplace, schools must install technological guardrails to keep their students safe. For some, this simply means blocking inappropriate websites. Others, however, have turned to software companies like Gaggle, Securly, and GoGuardian to surface potentially worrisome communications to school administrators.
These Safety Management Platforms (SMPs) use natural-language processing to scan through the millions of words typed on school computers. If a word or phrase might indicate bullying or self-harm behavior, it gets surfaced for a team of humans to review.
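To make the mechanics concrete, the workflow described above, scanning typed text for risk phrases and queueing any hits for human reviewers, can be sketched in a few lines of Python. This is a deliberately simplified illustration: the RISK_PHRASES list, the categories, and the review_queue handoff are assumptions made for the example, not how Gaggle, Securly, or GoGuardian actually work, and real platforms rely on much larger lexicons plus machine-learned models rather than a hard-coded list.

```python
import re

# Hypothetical, simplified phrase list for illustration only.
RISK_PHRASES = {
    "self-harm": ["hurt myself", "want to die", "kill myself"],
    "bullying": ["everyone hates you", "you're worthless"],
}

def scan_document(text: str) -> list[dict]:
    """Return a list of 'incidents' found in one document."""
    incidents = []
    lowered = text.lower()
    for category, phrases in RISK_PHRASES.items():
        for phrase in phrases:
            for match in re.finditer(re.escape(phrase), lowered):
                incidents.append({
                    "category": category,
                    "phrase": phrase,
                    # Keep a little surrounding context for the human reviewer.
                    "context": text[max(0, match.start() - 40): match.end() + 40],
                })
    return incidents

def review_queue(documents: dict[str, str]) -> list[dict]:
    """Scan every document and collect incidents for human reviewers."""
    queue = []
    for doc_id, text in documents.items():
        for incident in scan_document(text):
            queue.append({"doc_id": doc_id, **incident})
    return queue

if __name__ == "__main__":
    docs = {"essay-001": "Some days I just want to die and nobody notices."}
    for item in review_queue(docs):
        print(item)
```

Even in this toy version, the important design point from the article is visible: the software only surfaces candidates, and a team of humans decides whether an "incident" is real before anything reaches the school.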
In an age of mass school shootings and rising student suicides, SMPs can play a vital role in preventing harm before it happens. Each of these companies has case studies where an intercepted message helped save lives. But the software also raises ethical concerns about the line between protecting students’ safety and protecting their privacy.
“A good-faith effort to monitor students keeps raising the bar until you have a sort of surveillance state in the classroom,” Girard Kelly, the director of privacy review at Common Sense Media, a non-profit that promotes internet-safety education for children, told Quartz. “Not only are there metal detectors and cameras in the schools, but now their learning objectives and emails are being tracked too.”

The debate around SMPs sits at the intersection of two topics of national interest—protecting schools and protecting data. As more and more schools go one-to-one, the industry term for assigning every student a device of their own, the need to protect students’ digital lives is only going to increase. Over 50% of teachers say their schools are one-to-one, according to a 2017 survey from Freckle Education, meaning there’s a huge market to tap into.
But even in an age of student suicides and school shootings, when do security precautions start to infringe on students’ freedoms?

Safety, managed

The most popular SMPs all work slightly differently. Gaggle, which charges roughly $5 per student annually, is a filter on top of popular tools like Google Docs and Gmail. When the Gaggle algorithm surfaces a word or phrase that may be of concern—like a mention of drugs or signs of cyberbullying—the “incident” gets sent to human reviewers before being passed on to the school. Securly goes one step beyond classroom tools and gives schools the option to perform sentiment analysis on students’ public social media posts. Using AI, the software is able to process thousands of student tweets, posts, and status updates to look for signs of harm. 
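As a rough illustration of what "sentiment analysis" on public posts can mean in practice, here is a sketch using NLTK's off-the-shelf VADER analyzer to flag strongly negative posts. The threshold and the idea of escalating on a single low score are assumptions made for this example; Securly has not published its model, and a production system would combine many more signals than one sentiment score.

```python
# Illustrative only: flag very negative public posts for human review using
# NLTK's VADER sentiment analyzer. The threshold and workflow are assumptions,
# not any vendor's actual system.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

analyzer = SentimentIntensityAnalyzer()

def flag_negative_posts(posts, threshold=-0.6):
    """Return posts whose compound sentiment score falls below the threshold."""
    flagged = []
    for post in posts:
        # 'compound' ranges from -1.0 (most negative) to 1.0 (most positive).
        score = analyzer.polarity_scores(post)["compound"]
        if score <= threshold:
            flagged.append({"post": post, "score": score})
    return flagged

if __name__ == "__main__":
    sample = [
        "Had a great day at practice!",
        "I hate everything and I can't take this anymore.",
    ]
    for item in flag_negative_posts(sample):
        print(item)
```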
Kelly thinks SMPs help normalize surveillance from a young age. In the wake of the Cambridge Analytica scandal at Facebook and other recent data breaches from companies like Equifax, we have the opportunity to teach kids the importance of protecting their online data, he said.
“There should be a whole gradation of how this [software] should work,” Daphne Keller, the director of the Stanford Center for Internet and Society (and mother of two), told Quartz. “We should be able to …”

Continue reading: Schools are using AI to track their students — Quartz