The COVID-19 outbreak has led to more than 90% of the world’s students not physically attending schools. As a result, schools worldwide are turning to online learning service providers, or education technology commonly known as EdTech.

Market analysts expect the emerging “smart education” market to be worth about $680 billion by 2027. But the EdTech industry’s ability to collect massive amounts of data on students and its potential to transform curricula has groups like the American Civil Liberties Union (ACLU) concerned.

In March, the ACLU warned:

“Now that the COVID-19 pandemic has created an unprecedented opportunity for EdTech companies to make the use of their privacy-violating educational products nearly universal, there is a real risk that these companies, under the guise of a generous act, will use this opportunity to create personal information dossiers on an entire generation of young Americans.”


A look at some existing EdTech projects offers insights into what’s ahead. For example, one nonprofit, Enlearn, provides an online platform where students interact with materials that adapt to their progress, and teachers get real-time data about individual students.

Enlearn’s artificial intelligence (AI) software “continuously learns and improves itself over time.” Enlearn’s privacy policy is vague, allowing the nonprofit to share personal data with third parties to enforce its own terms of service, to prevent fraud, to address technical issues, to protect its own legal rights or to carry out the requests of its users.

Another EdTech player, KidAptive, offers insights and analytics to companies that build education-related apps on its cloud-based adaptive learning platform (ALP). KidAptive’s ALP can identify when children are guessing answers, working too quickly or too slowly, or failing to re-check their answers.

Both Enlearn and KidAptive gain insights into the behavioral and psychological patterns of children and turn those insights into valuable data, which can then be turned into algorithms to refine their other products.

Independent researcher Alison McDowell thinks we’re witnessing the commodification of children as data. “By collecting our children’s data through these programs, these private companies get to train their AI systems with public assets,” McDowell told The Defender.

Some EdTech companies are collecting biometric data, including data about children’s educational, social and emotional skills, and using the data to train and improve their AI.

For example, Affectiva created facial recognition software that can be used to detect the cognitive and social behaviors of children by analyzing how they react to stimuli during an “educational” game.

In 2018, Massachusetts Institute of Technology (MIT) researchers conducted classroom trials of a teaching aid that monitored students’ attentiveness using infrared beams able to read each student’s corneal pulse. MIT also tested sensor-equipped glasses on students. The glasses were capable of analyzing students’ interpersonal skills and reactions. When combined with an AI program, the data collected by the sensors could determine whether a child belonged in gifted classes or needed special help.

The money for many of these EdTech programs comes from behemoths like the Gates Foundation, which had a hand in funding both Enlearn and KidAptive. Enlearn was launched with $9 million in Gates Foundation grant funds. KidAptive’s seed money was provided by the NewSchools Venture Fund, to which the Gates Foundation has given at least $50 million.

The Gates Foundation started shaping EdTech in the U.S. in 2011, when the foundation teamed up with the Carnegie Corporation to fund, to the tune of $100 million, an EdTech initiative called inBloom. inBloom aimed to create a centralized platform for data sharing and curricula. The initiative collected information from public school students, including test scores, attendance, health records, race and ethnicity data and economic and disability status. inBloom stored the information on Amazon servers, then analyzed and distributed the data to specified for-profit third parties.

inBloom’s privacy policy originally stated that it “cannot guarantee the security of the information stored in inBloom or that the information will not be intercepted when it is being transmitted.”

The 2013 launch of inBloom was met with strong public backlash over privacy concerns and coincided with Edward Snowden’s revelations that the National Security Agency (NSA) was collecting data on U.S. civilians. After parents in several U.S. states brought lawsuits against inBloom, the initiative was shut down in 2014.

In May, during the height of the COVID outbreak, New York Gov. Andrew Cuomo announced that his state would work with the Gates Foundation to “reimagine” New York’s education system through a public-private partnership, InnovateEDU. InnovateEDU and the Gates Foundation plan to create the “next generation learning models and tools.”

LandingZone, an InnovateEDU project, collects, aggregates and mines every student’s data from Google Classroom.

McDowell said privacy concerns extend beyond just what EdTech companies might do with students’ data. There are also concerns about what the national surveillance apparatus might do with EdTech’s predictive behavior modeling. According to McDowell:

“Through AI engineering platforms, some of which are tied to the defense industry, they can essentially mold children. We don’t want these people teaching our children.”

InnovateEDU is a member of the National Defense Industrial Association (NDIA), a trade group established in 1919, in the aftermath of World War I, to keep the defense industry mobilized. According to its website, the NDIA believes in “investing in the professional growth of the defense industry” and in “promoting national security and defense.”

One software company at the forefront of EdTech, Clever, was funded by Peter Thiel’s venture capital firm. Thiel is co-founder of Palantir, the big data company instrumental in NSA surveillance and whose seed funding came from the Central Intelligence Agency.

Clever helps school districts aggregate hundreds of school-related AI apps, like Vmath, TypingClub and SpringBoard, into a single portal. One of the apps hosted by Clever, the “e-hallpass,” even tracks how much time students spend in the bathroom.

According to Clever’s website, more than 60% of U.S. K–12 schools use Clever learning games and applications. That adds up to about 15 million students per month, a number that will likely grow given that Clever provides the service to school districts for free.

“It’s a consumption model of education — where students simply consume content, and this content is connected to artificial intelligence and predictive behavior modeling,” McDowell said.

While the development of much of this EdTech was already underway, the pandemic has disrupted traditional schooling and created a surge in the use of this technology and the data collected from students each day. McDowell said this surge raises questions regarding surveillance and the control of dissent:

“If a high schooler writes a paper on a certain controversial topic and takes a certain political stance, the AI, depending on who it belongs to, might profile the student in such a way and discourage such behavior with future assignments.”

According to the Pioneer Institute, federal, state and local governments spent more than $30 billion in 2018 to implement social-emotional-learning monitoring in K–12 public schools, prioritizing predatory learn-for-data digital schemes over traditional relationships between students and human teachers.

This is the first in a series of articles examining Big Tech’s impact on children in schools.