Cajon Valley Union School District meeting in El Cajon on Feb. 14, 2023. / Photo by Ariana Drehsler

In 2014, the Cajon Valley Union School District began distributing laptops to all of its K-8 students. It was an ambitious, forward-looking program, part of a suite of technology-minded reforms that helped earn the district accolades like entry into the Digital Promise League of Innovative Schools.  

Once students had the laptops, district leaders wanted a way to monitor their activities. By the following year, Cajon Valley had added a new tool to its digital utility belt: Gaggle. The company monitors student activity using AI and has been criticized by activists, students and even U.S. Senators for the breadth of information it scans. 

A striking uptick in student mental health crises nationwide, supercharged by the pandemic, has spurred the growth of student monitoring technology like Gaggle. The company now monitors nearly 6 million students in 49 states. 

But how best to deploy robust surveillance tools like Gaggle, if at all, is still up in the air.  

How It Works

Gaggle pairs AI with a team of content moderators to scan all activity on the laptops the district distributed to students and on the suite of Google applications available to them. It can scan content even when those applications are used on a personal device. Gaggle’s software looks for keywords associated with issues like self-harm, drug use, violence and sexual assault, and notifies district leaders when concerning content is created. It can also identify if a student is sending sensitive pictures of themselves. The content doesn’t even have to be shared. If something is written in a private Google document, Gaggle has access to that too.  

In the most serious instances, Jeremy Boerner’s phone rings. He works as director of student development for Cajon Valley. 

“An example of that would be a student contemplating suicide, or a description of an act of violence in progress or about to happen,” Boerner said. He’s seen both of those examples. 

If the program flags what it views as an immediate danger, and a Gaggle employee agrees it’s concerning, a call is made to Boerner and other district employees. Sometimes in the middle of the night. Gaggle then “provides the context and the information that we would need to look deeper into that and possibly contact the parents to work with us on making sure that the child is safe,” Boerner said.  

According to El Cajon Police Department records, Gaggle has identified a handful of crimes over the past five years, including one instance of lewd or lascivious acts with a minor. Emails also show a handful of communications between district officials, Gaggle, the El Cajon Police Department and the Internet Crimes Against Children Task Force. 

None of this is happening in secret, Boerner said. The district notifies parents about the software when the family is issued the devices, and parents are then required to sign an agreement that the student will use the technology appropriately. So far, he said, parents have been universally positive about the program. 

“I’ve only had parents be incredibly grateful that someone contacted them and allowed them to follow through with their child,” Boerner said. In one instance, the district notified a parent that their child was sending pictures of themselves to someone; Boerner said the parent was relieved that there was some oversight of the technology. 

Cajon Valley has been so bullish on Gaggle that it allowed the company to produce a case study based on the district’s experience. In it, Jonathan Guertin, Cajon Valley’s chief technology officer, applauds the software and the human touch it brings. Guertin claimed the program has helped save a student’s life during every school year it has been used.

He said it can be difficult for some students to ask for help if they’re experiencing mental health issues, but now that students know Gaggle is watching he believes “comments are purposefully written in Google Docs to reach out for help.”  

Monitoring like this, he said, is all part and parcel of the brave new internet age kids are living in, and it’s the district’s responsibility to ensure there are guardrails.

“You can’t put adult tools in the hands of students and not provide some kind of guidance,” Guertin said. “I’m a privacy guy myself, but at the same time, you’re using district equipment, you’re using district communication tools. I think we’re obligated to ensure that those tools are used appropriately,” he said. 

Some Aren’t Sold on the Technology  

Jason Kelley is the activism director at the Electronic Frontier Foundation and the co-lead of the nonprofit’s student privacy working group. He said the notion that kids are typing notes into private Google Docs to get attention from school officials seems overly idealistic. 

“It could happen … but I think that most students actually have a fairly adversarial relationship with their school officials. They may have teachers that they trust, or counselors that they trust, but I don’t think young people trust the surveillance system that the school uses more than the counselor or the teacher,” Kelley said. 

Kelley has studied GoGuardian, a Gaggle competitor. One of his chief concerns with such technologies is the lack of context often provided with reports and the propensity for false positives. According to Kelley, the services don’t do a very good job distinguishing genuinely concerning communications from, for example, a message where a student jokes they want to die because of a zit on their face. Student journalists have themselves complained that the technology “crosses the line between security and privacy.” 

Advocates like Kelley say the technology could put kids from marginalized communities at greatest risk. A recent Vice News documentary, for example, did a deep dive on Gaggle and surfaced instances of LGBTQ+ kids feeling targeted by the monitoring program because “gay,” “lesbian” and “transgender” were flagged words. Earlier this year, the company stopped monitoring for LGBTQ+ keywords. 

The fact that companies like Gaggle forward some reports to police led U.S. Senators Edward Markey and Elizabeth Warren to write in a 2022 report that the technology may be “exacerbating the school-to-prison pipeline by increasing law enforcement interactions with students.” The senators wrote that their “investigation confirms the need for federal action to protect students’ civil rights, safety, and privacy.” 

For Kelley, another concern is the potential chilling effect of 24/7 monitoring. Ultimately, young people’s lives are inextricably linked with the internet, Kelley said. Whether they’re connecting with friends, writing something in a Google Doc or sending emails, kids nowadays type everything into their computers or phones, and for some the district-issued computer may be the only one they have access to. 

“So, you’re essentially surveilling the entire life of a young person. You can imagine how difficult it would be to self-censor, but how necessary it must feel when you know that not only is some of the content you’d like to see blocked, but the words you’re saying are scanned and sent to officials or school counselors or administrators or teachers,” he said. 

“Whether you’re teaching a young person that they should self-censor, or you’re teaching them that it almost doesn’t matter because they’re going to be surveilled forever, you’re sending a terrible message to young people about what school ought to be,” Kelley said. 

Kathryn Gray is a Voice of San Diego intern.

Jakob McWhinney is Voice of San Diego's education reporter. He can be reached by email at and followed on Twitter @jakobmcwhinney.

Join the Conversation


  1. Interesting piece. In theory, parents are supposed to be monitoring their child’s use of the internet. In practice, it’s virtually impossible to do.

    As a parent, I’d want to be alerted if an AI feature flagged my child’s computer behavior so that I could help, and every parent would want to prevent an act of aggression against other students. The question is really, how much of their child’s privacy are they willing to give up for these safety features?

    If Gaggle sounds too invasive, then I recommend you trade in your smart phone for a flip phone because almost every app you use–including Google–monitors your interactions (and locations) and monetizes that data, often through the use of algorithms. Since the vast majority of us are already giving up our privacy for the convenience and pleasure with phone apps, an AI system for student safety seems like a positive. My only concern would be what else the companies that monitor students use on the internet do with the information besides alert the school and parents.

  2. So basically the parents want this but some weird guy without a kid in the school thinks it’s a privacy concern? Same privacy advocates that had zero percent support for banning street light cameras?

