TWIEtR: Student data privacy, algorithm risk

This week in edtech reports was a bit creepy. Two reports were released looking at the potential dark side of educational technology: surveillance (student data monitoring) and bias (algorithmic personalization).

As always, TWIEtR is all about fact-based reports (surveys, research and so on) that catch my eye on Twitter and in other news feeds. Subscribing to these weekly updates by email is easy: Just enter your email address above “Notify Me” in the left navigation.

The Electronic Frontier Foundation, which has a decided point of view about these things (you might recall it filed a Federal Trade Commission complaint against Google in December 2015 over alleged student data scanning in Google Apps for Education), released its “Spying on Students” report. Indeed, it was that FTC complaint that began the campaign leading to this report.

The new EFF report uses charged language in its examination of data gathered by devices and software used in K-12 education (one example: “Surveillance Culture Starts in Grade School”). It also examined the privacy policies of 152 edtech services and surveyed more than 1,000 students, parents, teachers, and school administrators.

Some of the disconnects and risks, especially of “free” products that schools increasingly rely upon, are indisputable, even if the report’s language tends toward the attention-getting.

As the EFF warns, “Student laptops and educational services are often available for a steeply reduced price, and are sometimes even free. However, they come with real costs and unresolved ethical questions. Throughout EFF’s investigation over the past two years, we have found that educational technology services often collect far more information on kids than is necessary and store this information indefinitely.”

Not specifically about edtech — but with strong implications for the continued rush to computer-based, personalized learning — is a new report from the RAND Corporation, “An Intelligence in Our Image: The Risks of Bias and Errors in Artificial Intelligence.”

Yes, algorithms (or sets of rules for machine problem solving and learning) power personalization. But these software-embedded instructions are created by people. And people are, well, flawed. Science-fiction fans might remember Colossus: The Forbin Project and many other cautionary tales of that be-careful-what-you-wish-for-in-tech sub-genre.

Less dramatic than that 1970 film, but perhaps more realistic, is Benjamin Herold’s discussion with the RAND authors, in Education Week, of the report’s implications for education technology.

And one more thing:

Yes, we have no flying cars or personal jet packs. But we also have no books being ground using hand-crank power to flow knowledge through wires into students’ heads.

“Back to the Future of Edtech: A Meditation” is a deeply interesting piece in EDUCAUSE Review, by EDUCAUSE President and CEO John O’Brien, about how visions of the application of technology to education have changed over the decades — and what they say about us and our aspirations at the time.

There are also claims about what tech could do immediately. Did you know Royal’s manual portable typewriter promised to raise grades up to 38% … in 1958?

It’s a long, fun, and thoughtful read. A bonus: there are lots of pictures and video examples.