They argue that the negligent design of the major social media platforms has rewired student psychology and changed the way children think, behave, and learn, leaving teachers to manage the fallout.
“The algorithmic designs underlying the social media products are causing significant disruption and harm to the education system and to the student population,” says Duncan Embury, lawyer for the school boards. “That takes all kinds of forms.”
These harms include eroding the attention and focus required for learning and driving up safety-related incidents involving sexting, cyberbullying, and emotional dysregulation, says Embury, who practises at Neinstein Personal Injury Lawyers. “Things that the schools are encountering on a day-to-day, hour-to-hour, minute-to-minute basis that are correlated, we say, to the algorithmic designs underlying social media products.”
While the algorithms are proprietary, Embury hopes that the documentary and oral discovery process will yield further information on how the algorithms were coded, how they were designed to work, and the intentions behind them.
A spokesperson for TikTok told Law Times that the company has “industry-leading safeguards such as parental controls, an automatic 60-minute screen time limit for users under 18, age restrictions on features like push notifications, and more.” TikTok’s team of safety professionals continually evaluates emerging practices and insights to support user well-being, the spokesperson said.