
In Short

Public Schools, Private Eyes: How EdTech Monitoring Is Reshaping Public Schools

A Lawsuit Seeks Transparency in Texas Schools


As K-12 public schools in the United States adopt a growing range of educational technologies, many are also implementing digital surveillance tools that monitor students. A 2023 survey of education technology (edtech) surveillance systems suggests that such monitoring is now widespread in the classroom. Despite research flagging the dangers of edtech surveillance, there is a persistent lack of transparency around the efficacy, use, and misuse of this technology.

Numerous schools have contracted with edtech surveillance companies, including several of the industry’s largest vendors. These surveillance systems use artificial intelligence (AI) to analyze students’ web activity, correspondence, and keyword searches for indications of their state of mind, whether they are being bullied, or even their potential for committing an act of violence. This monitoring occurs through three primary pathways: school-issued devices, school-managed internet connections, or school-managed accounts (e.g., learning management systems or email accounts). Research indicates, however, that the edtech companies spearheading classroom surveillance technologies often provide little evidence of their products’ efficacy and disclose little about how their systems work.

In March, the Knight First Amendment Institute filed a lawsuit to uncover how AI-powered monitoring technologies are being used on school-issued devices in North Texas’s Grapevine-Colleyville Independent School District. The lawsuit was filed after the school district denied a request from the Knight Institute to release records on its use of edtech surveillance. The school district claimed that the requested records could not be disclosed to the Knight Institute due to potential risks to network security. Notably, it remains unclear how the Knight Institute’s request would jeopardize the security of the district’s network. Still, the Knight Institute narrowed its request to focus on information such as flagged content types and emails referencing the district’s edtech surveillance vendors. Despite the revised scope of the information request and the legally tenuous nature of the network security invocation, the district continued to withhold the records. The Knight Institute’s lawsuit seeks to compel the release of these records to promote transparency and enable a public evaluation of whether the district’s digital monitoring practices are appropriate and what effects they may have on students. Without access to this information, the scope and nature of the district’s monitoring practices remain opaque. Those most directly affected by this technology—students, teachers, and parents—remain inadequately informed about how their data is being used, despite their rightful expectation of transparency, clarity, and accountability.

How Are Current State and Federal Policies Driving School Surveillance?

Texas has embraced edtech surveillance platforms more widely than perhaps any other state. In the last decade, spending on technology in Texas schools rose 66 percent per student, compared to only a 28 percent increase in spending on social services. The Knight Institute v. Grapevine-Colleyville Independent School District lawsuit highlights concerns about the lack of transparency and accountability in the use of these technologies.

In October 2022, the Biden administration released its Blueprint for an AI Bill of Rights, which warned that “continuous surveillance and monitoring should not be used in education…where the use of such surveillance technologies is likely to limit rights, opportunities, or access.” Building on this foundation, in 2023, President Biden signed an executive order on the safe, secure, and trustworthy development of AI. That order has since been rescinded and replaced by an executive order from President Trump that takes a deregulatory, private-sector-led approach, with significantly fewer references to bias, accountability, and data privacy.

As AI expands in education, bolstered by nearly 70 companies signing the Trump administration’s pledge on AI education, the federal offices that in the past provided guidance no longer exist. Given the White House’s push for rapid AI implementation, edtech surveillance tools could enter education more quickly than in previous years and with fewer brakes and guardrails, particularly for those who are already most vulnerable. Recent policy shifts are urgent signs that pressure on states and school districts to expand surveillance and data collection on already marginalized communities is well underway. Participating fully in the classroom can then come with risks to these students’ safety. One survey found that a small but concerning number of students flagged by school monitoring software have been contacted by immigration enforcement directly. Similarly, increasingly stringent legislation targeting trans students risks being enabled by invasive monitoring of those students.

How Does the Wider Legal Context Shape Student Surveillance?

The Children’s Internet Protection Act (CIPA) stipulates that schools “filter obscene content” by “monitoring the online activities of minors.” In an effort to protect minors online, CIPA requires schools and libraries that receive federal funding for internet access to use filters and monitoring tools to block pornography and other content deemed harmful or obscene. When CIPA was enacted in 2000, students went online primarily on shared school computers, making teachers among those best situated to prevent access to harmful content through simple filtering software. Today, critics argue CIPA is frequently used as a justification for the constant surveillance students are subjected to, well beyond a reasonable interpretation of the law.

This surveillance has a disparate impact on low-income students. The federal E-Rate program, which provides affordable broadband access and other services for schools and libraries, is more commonly used by schools with a higher percentage of students eligible for the National School Lunch Program. E-Rate recipients must comply with CIPA, often through the use of web filtering, to keep explicit content from children. Since students in lower-income districts are more likely to rely on school-issued devices, they are subject to more pervasive monitoring than students in wealthy districts who have access to personal devices. In the past decade, the Federal Communications Commission, CIPA’s administering body, has done little to clarify the boundaries of CIPA’s monitoring requirements.

Regulations like the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA) are designed to safeguard student privacy and safety, but they have also been used to justify restricting access to certain educational platforms and extractive data use. For example, 2011 amendments to FERPA broadened the scope of who could be granted access to student records; the amendments redefined “authorized representative” to include nongovernmental actors who may represent schools. This change allows edtech surveillance companies to operate as educational partners, permitting schools to disclose student data to these companies in the same capacity as school officials. This loophole is even more problematic because of how FERPA is enforced: enforcement focuses on schools, through the possible loss of federal funding, rather than holding the companies accountable.


Why Does EdTech Surveillance Matter for Students?

The edtech surveillance industry, now a booming market, is dominated by figures with limited ties to pedagogy or the realities of student learning; as one analysis highlights, the founders of popular surveillance companies are professional entrepreneurs, cybersecurity officials, and law enforcement professionals. The disconnect between who builds these technologies and what students actually need in the classroom helps explain why edtech surveillance often fails to support meaningful learning. The ability to learn and explore without a constant fear of monitoring, criminalization, and retaliation is critical for students in the public education system. School investment in edtech surveillance platforms of unproven efficacy diverts funding and attention away from resources that truly promote school safety; prioritizing surveillance over those resources seems like a poor decision.

While surveillance raises concerns for all students, its harms are not distributed evenly. There is clear evidence that edtech surveillance often reinforces existing biases, leading to disproportionate discipline and scrutiny for already marginalized groups. Because edtech surveillance systems operate by flagging behaviors deemed “anomalous,” research suggests that “disabled students are more likely to be flagged as potentially suspicious…simply because of the ways disabled people already exist.” Platforms are built to flag behavior that deviates from a perceived norm, so natural differences in how students with disabilities communicate or interact with technology can be misinterpreted as suspicious or threatening. Students with disabilities who use assistive technologies or interact differently in chat platforms and digital tools are more likely to be flagged or disciplined, even when they have school accommodations.

Monitoring also disproportionately flags students of color for inappropriate behavior, further perpetuating racial disparities in school discipline. For example, a monitoring system that flags students’ online messages for potential cyberbullying could rely on natural language processing algorithms, many of which have been shown to misclassify African American English as offensive. Thus, it’s possible that some content from Black students is disproportionately flagged, making them more likely to be referred for disciplinary action. In other cases, flagged students are not first approached by school administrators but are instead referred to law enforcement, even for relatively benign infractions.

Queer and trans students also face misuse of surveillance technologies inside and outside the classroom. Monitoring algorithms can flag LGBTQ+-related content as explicit or inappropriate. While these systems aim to prevent cyberbullying, they can end up creating an environment where no space—online or offline—feels private or safe. This kind of surveillance discourages free exploration and learning about sexuality and gender, and it can wrongly flag these students as disciplinary concerns simply because the content they engage with references sex or sexual identity. Similarly, the sharing of LGBTQ+ students’ data with companies can lead to unwelcome infringements of student privacy. One survey of nationally representative samples of 6th to 12th grade students found that nearly 30 percent of students reported being outed as a result of their school’s digital activity monitoring.

Is It Possible to Have Student Safety Without Privacy and Transparency?

A central justification for the promotion of surveillance technologies in educational settings is their purported ability to keep students safe. While safeguards are needed to protect K–12 students in school and in the digital sphere, evidence suggests that these tools often fail to deliver that safety. The manner in which these surveillance tools can label students as potential threats, play fast and loose with student data, and create punitive learning environments suggests that, rather than fostering security, they can undermine it.

With even fewer guardrails in place than in years past, edtech surveillance companies may be poised to accelerate the transformation of schools into engines of data extraction and control, deepening existing inequities. The lack of transparency surrounding Grapevine-Colleyville’s use of AI-powered monitoring technologies reflects a national pattern of limited public oversight in school surveillance practices. The ongoing lawsuit seeks to obtain public records to clarify how these technologies are being used, emphasizing the importance of transparency in evaluating their impact. Without access to such information, it remains difficult for researchers, policymakers, and communities to make informed judgments about the legitimacy and impact of these technologies. As school districts continue to deepen their reliance on these technologies, they risk normalizing invasive surveillance without adequate information. If this trend goes unchallenged, students will be learning under conditions of surveillance, sorting, and suspicion rather than trust, care, and agency.

More About the Authors

Kéah Sharma

Program Associate, Teaching, Learning & Technology
