Seattle schools sue TikTok, Meta and other platforms over youth 'mental health crisis'
They've hooked students into 'excessive use and abuse,' the complaint states.
Seattle public schools have sued the tech giants behind TikTok, Facebook, Instagram, YouTube and Snapchat, accusing them of creating a "mental health crisis among America's Youth." The 91-page lawsuit, filed in US district court, states that the companies exploit the addictive nature of social media, fueling rising anxiety, depression and thoughts of self-harm among students.
"Defendants’ growth is a product of choices they made to design and operate their platforms in ways that exploit the psychology and neurophysiology of their users into spending more and more time on their platforms," the complaint states. "[They] have successfully exploited the vulnerable brains of youth, hooking tens of millions of students across the country into positive feedback loops of excessive use and abuse of Defendants’ social media platforms."
Harmful content pushed to users includes extreme diet plans, encouragement of self-harm and more, according to the complaint. That, it says, has contributed to a 30 percent increase between 2009 and 2019 in the number of students who reported feeling "so sad or hopeless... for two weeks or more in a row that [they] stopped doing some usual activities."
"Defendants’ misconduct has been a substantial factor in causing a youth mental health crisis, which has been marked by higher and higher proportions of youth struggling with anxiety, depression, thoughts of self-harm, and suicidal ideation," the complaint continues. "The rates at which children have struggled with mental health issues have climbed steadily since 2010 and by 2018 made suicide the second leading cause of death for youths."
That, in turn, hurts students' academic performance, making them "less likely to attend school, more likely to engage in substance use, and to act out, all of which directly affects Seattle Public Schools’ ability to fulfill its educational mission."
Section 230 of the US Communications Decency Act generally shields online platforms from liability for content posted by third parties. However, the lawsuit argues that the provision doesn't protect social media companies from liability for recommending, distributing and promoting content "in a way that causes harm."
"We have invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their wellbeing," a Google spokesperson told Axios. "For example, through Family Link, we provide parents with the ability to set reminders, limit screen time and block specific types of content on supervised devices."
"We've developed more than 30 tools to support teens and families, including supervision tools that let parents limit the amount of time their teens spend on Instagram, and age verification technology that helps teens have age-appropriate experiences," Meta's global head of safety Antigone Davis said in a statement. "We'll continue to work closely with experts, policymakers and parents on these important issues." TikTok has yet to react, but Engadget has reached out to the company.
Critics and experts have recently accused social media companies of exploiting teens and children. Meta whistleblower Frances Haugen, for one, testified to Congress that "Facebook's products harm children." Eating disorders expert Bryn Austin wrote in a 2021 Harvard article that social media content can send teens into "a dangerous spiral." And the issue has caught the attention of legislators, who proposed the Kids Online Safety Act (KOSA) last year.