Texas schools are rapidly scaling up the use of technology that monitors email, web history and social media posts of potentially millions of students, often without their knowledge or consent, a Dallas Morning News investigation has found.
Legal and privacy experts have long raised concerns about this technology and questioned its effectiveness in detecting potential threats. Despite those worries, Texas’ schools have spent millions of tax dollars on these services since 2015.
The proliferation of student surveillance has been fueled by nationwide fears about school shootings, suicides and cyberbullying. Among school districts, no state has more contracts with digital surveillance companies than Texas, according to GovSpend, a company that tracks government spending.
Using school records and purchasing data, The News examined some of the most widely used monitoring technologies in Texas schools: Social Sentinel, Gaggle, Securly and GoGuardian. In the past six years, more than 200 districts statewide have used these technologies. At least 28 are in North Texas, including some of the largest districts — Dallas, Carrollton-Farmers Branch, Carroll, Irving, Coppell and Grapevine-Colleyville ISDs have all used one of the four services.
The companies offer services ranging from public social media monitoring to tracking nearly everything a student does on a device. Contracts for these services range from a few hundred dollars to six figures, depending on the service and the size of the school or district.
Social Sentinel says it scans more than a billion posts on social media every day against more than 450,000 words and phrases that indicate potential harm. It then uses artificial intelligence to identify potential threats of violence and suicide.
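Social Sentinel has not published how its matching works. As a rough, hypothetical illustration of keyword-list scanning of this general kind, here is a minimal sketch; the phrase list and function names are invented and do not reflect the company's actual system:

```python
# Hypothetical sketch of phrase-list scanning of public posts.
# The watch list and logic are invented for illustration only;
# the real system is proprietary and far larger (450,000+ phrases).

WATCH_PHRASES = {"hurt myself", "bring a gun", "end it all"}

def flag_post(text: str) -> list:
    """Return the watch phrases found in a post (case-insensitive)."""
    lowered = text.lower()
    return sorted(p for p in WATCH_PHRASES if p in lowered)

def scan_posts(posts: list) -> list:
    """Scan a batch of posts, keeping only those that matched a phrase."""
    results = []
    for post in posts:
        hits = flag_post(post)
        if hits:
            results.append((post, hits))
    return results
```

Simple matching like this cannot judge context, which is why vendors layer machine learning on top of it; the gap between matching a phrase and understanding it is where false alerts come from.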
Social Sentinel co-founder Gary Margolis has said in news interviews that the service doesn’t surveil or monitor students, because it merely scans public social media posts.
But Amelia Vance, the director of youth and education privacy at the Future of Privacy Forum, disagrees.
“It’s absolutely a monitoring or surveillance tool,” Vance said.
“The privacy concerns with this are, of course, never-ending.”
A spokesperson for Navigate 360, the parent company of Social Sentinel, disputed The News’ description of how its technology works but declined to specify what was incorrect.
“We do not comment publicly on safety systems and processes associated with past, future or current customers,” the spokesperson wrote. “Exposing their systematic approach increases the risks to the lives they protect.”
The other services, Securly’s Auditor and Gaggle, use artificial intelligence and teams of human reviewers to flag content in Gmail and Google Docs that is deemed concerning.
Mike Jolley, a director at Securly, said the company encourages all school districts to be transparent with students and parents.
“The idea is to be transparent and let them know why you’re doing it, because they’re going to find out anyway,” Jolley said. “We just want to look out for you and make sure you’re safe.”
Gaggle disputed the notion that it was a surveillance tool and said its technology has saved thousands of students’ lives by averting suicides.
Carrollton-Farmers Branch ISD said it used Gaggle multiple times to help students in crisis.
“There have been multiple incidents in which students at risk have been identified and imminent self-harm was avoided,” said Dawn Parnell, the district’s chief communications officer.
In some instances, students wrote letters detailing plans to hurt themselves and saying goodbye to their families. Others talked about running away to meet up with people they had met online, Parnell said.
GoGuardian’s “Beacon” monitors students even more closely, capturing everything they type into their web browsers for signs of imminent violence or self-harm. Any such content triggers an email or text message alert to school officials. The company also offers products that let schools block potentially objectionable websites, as well as services that allow teachers to remotely see and control students’ computers.
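GoGuardian has not disclosed Beacon's internals. A pipeline like the one described above can be pictured, very loosely, as typed text being checked against risk terms, with a match producing an alert record for school officials. Everything in this sketch, including the term list and names, is invented for illustration:

```python
# Hypothetical sketch of a typed-content alert pipeline.
# The risk terms, class names and fields are invented for
# illustration and are not GoGuardian's actual design.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

RISK_TERMS = ("kill myself", "shoot up", "suicide plan")

@dataclass
class Alert:
    student_id: str
    matched_term: str
    timestamp: str  # ISO 8601, UTC

def check_typed_text(student_id: str, typed: str) -> Optional[Alert]:
    """Return an Alert if the typed text contains a risk term, else None."""
    lowered = typed.lower()
    for term in RISK_TERMS:
        if term in lowered:
            return Alert(
                student_id=student_id,
                matched_term=term,
                timestamp=datetime.now(timezone.utc).isoformat(),
            )
    return None
```

In a real deployment, an alert record like this would be handed to a notification service that emails or texts designated staff.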
“When students go online, they can also encounter inappropriate or even dangerous content,” said Jeff Gordon, GoGuardian’s director of public relations. “Schools and educators rely on GoGuardian’s products to help create safer digital learning environments so that students can maximize their learning potential on school system technology.”
When the University of North Texas bought Social Sentinel in 2015, it did not tell students that their public social media posts could be monitored.
UNT police Chief Ed Reynolds said he wasn’t concerned about the privacy implications of the technology because it captured only public social media posts.
Reynolds said UNT started using the service after seeing an increase in students talking online about hurting themselves.
“What we were trying to do was be proactive,” he said.
School officials from Stafford, Smyer, Woodville, Vidor and Blooming Grove ISDs also said they never notified parents or students about the surveillance.
“It’s something that just happens behind the scenes,” Rick Hartley, superintendent of Blooming Grove ISD, said of Gaggle.
Gaggle CEO Jeff Patterson said in a statement that the company encourages schools to be transparent about its service but leaves it to districts to decide how they communicate with students and parents.
There is little public information about how the technologies work and how well they perform aside from the companies’ claims. Two of the four companies contacted by The News said they had not given their data and code to independent researchers. The other two companies did not comment.
Patterson said Gaggle does not provide student data to outside parties.
“We do not have a third-party analysis of our work, simply because of the significant privacy issues that would be raised if we were to export this sensitive data and information to an independent researcher for review and examination,” he said.
In patent documents, Securly says its web filter can track every website a student visits and where the student was while visiting it. Jolley said the company tracks device locations so districts can keep tabs on their property and can be alerted if students search for topics like suicide.
Although Social Sentinel’s tool is designed to detect threats on campuses, it also can inadvertently capture social media posts from people who are not students.
That’s because the company’s system considers the location listed in a user’s social media profile when deciding whether to issue an alert, according to BuzzFeed News data provided to The News. The service also looks at whether posters follow an official campus account, even if they never attended the school.
Margolis, the Social Sentinel founder, has said in interviews that the service has a 90% success rate in detecting the proper context of the 1 billion social media posts it scans every day. But a report from the Center for Democracy and Technology, a nonprofit based in Washington, D.C., found that even the highest-performing machine learning systems correctly catch the meaning of harmful content only about 80% of the time.
An analysis of Social Sentinel data by BuzzFeed News found that the service generated a flood of alerts that appeared to contain nonthreatening information.
Tammy Dowdy, director of communications for Dickinson ISD, told The News that the district is rethinking its use of Social Sentinel after receiving a large number of alerts for nonthreatening posts.
The Center for Democracy and Technology also found that many popular machine learning tools were more likely to misinterpret women and Black people, sometimes even classifying their speech as another language altogether.
Vance, the Future of Privacy Forum director, also said these services may lead to fewer students searching for resources they need to get help.
“Kids aren’t going to look up the National Suicide Hotline,” Vance said. “They’re not trying to access things if they know that that’s going to get them flagged.”
Schools may not be using these services only to stop shootings and suicides.
In an email, a Social Sentinel sales director told a Stephen F. Austin State University police detective that the service was used not only to prevent suicides and shootings but also for “Forestalling potentially volatile protests/demonstrations.”
News reports also show some schools have used the technology to monitor protesters. A report from Criminal Legal News in 2018 found that East Carolina University used Social Sentinel to identify a non-student activist who was critical of the university’s chancellor. A year later, NBC News found that UNC-Chapel Hill had used a geofence, a virtual boundary for a geographic area, to scan the accounts of activists who were protesting a Confederate statue on campus. Documents obtained by NBC show that UNC paid Social Sentinel $73,500 in 2016 for a three-year contract for the service.
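A geofence, as described above, is simply a virtual boundary around a geographic area. A self-contained way to picture it is a check of whether a post's coordinates fall within a fixed radius of a campus, using the standard haversine great-circle distance; the coordinates and radius below are illustrative, not drawn from any vendor's system:

```python
# Illustrative geofence check: is a point within a circular boundary
# around a campus? Uses the haversine formula for great-circle
# distance. The center point and radius are arbitrary examples.

import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def inside_geofence(lat, lon, center_lat, center_lon, radius_km):
    """True if (lat, lon) lies within radius_km of the fence center."""
    return haversine_km(lat, lon, center_lat, center_lon) <= radius_km
```

Real systems typically use polygonal boundaries rather than circles, but the principle is the same: location data attached to a post determines whether it is pulled into the monitoring net.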
In a blog post, Margolis wrote that he came up with the idea for Social Sentinel while he was police chief at the University of Vermont, after an officer told him about a planned protest at the university’s executive offices and about recent crimes, both of which had been posted on Facebook.
Legal scholars said it’s unclear at what point this type of monitoring may cross a legal line. Aziz Huq, a law professor at the University of Chicago, said monitoring technologies are a serious concern, but current case law is unclear about whether their use constitutes a search under the Fourth Amendment and whether students would be protected from such a search.
A recent U.S. Supreme Court decision may shed new light on whether schools can take action against students for their off-campus speech.
The court ruled in June that a Pennsylvania school district had violated the free-speech rights of a student when it punished her for a vulgar Snapchat message, but the justices declined to set a universal standard for what counts as “off-campus” speech.
“[F]rom the student speaker’s perspective, regulations of off-campus speech, when coupled with regulations of on-campus speech, include all the speech a student utters during the full 24-hour day,” Justice Stephen Breyer wrote. “When it comes to political or religious speech that occurs outside school or a school program or activity, the school will have a heavy burden to justify intervention.”
But students hoping the recent decision would limit schools’ authority to surveil may be disappointed. Emily Suski, a law professor at the University of South Carolina, said schools already have extensive surveillance authority under state laws designed to protect against cyberbullying. Breyer’s suggestion that teachers stand in place of parents while students are at school may grant even greater surveillance authority, Suski said.
The risks of these services may extend beyond students’ privacy. Harold Krent, a law professor at Chicago-Kent College of Law, said schools could face greater legal liability if they fail to act on an alert from one of these services. Schools could also see more legal trouble if the platforms give students a false sense of security.
At least one state has tried to limit these technologies. In 2014, California passed a law requiring students and parents to be informed about any technology schools are considering that collects social media data.
Giovanni Capriglione, a Republican state representative from Southlake, told The News he plans to introduce legislation addressing the lack of transparency associated with monitoring technologies and said Texas’ laws don’t go far enough in protecting students.
“Parents do not know, unfortunately, that their kids are part of a mass surveillance effort,” Capriglione said. “And we need to address that, quite simply.”
While some have worked to limit these technologies, many districts believe a Clinton-era federal law, the Children’s Internet Protection Act, already requires them to adopt monitoring technologies. A marketing video posted to Gaggle’s YouTube page mentions the law explicitly in its pitch to school districts.
If these technologies weren’t already required, a bill from Sen. John Cornyn, R-Texas, would have made them all but mandatory. In October 2019, Cornyn introduced the RESPONSE Act, a bill he said would help prevent school shootings. Had it passed, the bill would have amended a 1930s communications law to require schools receiving federal communications funding to adopt technology that detects the online activity of children at risk of harming themselves or committing extreme violence against others.
Cornyn’s office did not respond to requests for comment.
Cornyn isn’t the only prominent Texan to promote monitoring technologies for schools. In his 2018 school safety plan, Gov. Greg Abbott mentioned Social Sentinel and technologies like it as a potential way to address threats made on social media against schools.
Though some see no problems with these monitoring technologies, a large body of legal, psychological and sociological research suggests that being surveilled, even when no action is taken on the information, has measurable effects on those being monitored.
A study of more than 1,200 largely younger adults conducted by Oxford researchers found that after being informed about a government surveillance program, survey participants were much less likely to speak or write about certain things, share personal content or engage with social media, and were more cautious about the language they used online. Furthermore, researchers found that the younger the internet user, the greater the chilling effect they reported.
“Kids need to have the intellectual ability to make mistakes, to suggest ideas outside the box, to give the wrong answer in class, to just test out their mind,” Vance said. “And they are much less likely to do that when they’re being surveilled.”
Dallas Morning News investigative reporter Lauren McGaughy contributed to this report.