This story was originally published by CalMatters.
The Meta researcher’s tone was alarmed.
“oh my gosh yall IG is a drug,” the user experience specialist allegedly wrote to a colleague, referring to the social media platform Instagram. “We’re basically pushers… We are causing Reward Deficit Disorder bc people are binging on IG so much they can’t feel reward anymore.”
The researcher concluded that users’ addiction was “biological and psychological” and that company management was keen to exploit the dynamic. “The top down directives drive it all towards making sure people keep coming back for more,” the researcher added.
The conversation recently surfaced in filings in a long-simmering lawsuit in federal court in California. Condensing complaints from hundreds of school districts and state attorneys general, including California’s, the suit alleges that social media companies knew about risks to children and teens but pushed ahead with marketing their products to them, putting profits above kids’ mental health. The suit seeks monetary damages and changes to the companies’ business practices.
The suit and a similar one filed in Los Angeles Superior Court target Facebook, Instagram, YouTube, TikTok, and Snap. The cases are exposing embarrassing internal conversations and findings at the companies, particularly Facebook and Instagram owner Meta, further tarnishing their brands in the public eye. They are also testing a particular vector of attack against the platforms, one aimed not so much at alarming content as at the design and marketing decisions that accelerated harms. The upshot, some believe, could be new forms of regulation, including at the federal level.
One document discussed during a hearing this week included a 2016 email from Mark Zuckerberg about Facebook’s live videos feature. In the email, the Meta chief wrote, “we’ll need to be very good about not notifying parents / teachers” about teens’ videos.
“If we tell teens’ parents about their live videos, that will probably ruin the product from the start,” he wrote, according to the email.
In slides summarizing internal tech company documents, released this week as part of the litigation, an internal YouTube discussion suggested that accounts belonging to minors, in violation of YouTube policies, remained active on the platform for years, producing content an average of “938 days before detection – giving them plenty of time to create content and continue putting themselves and the platform at risk.”
A spokesperson for Meta didn’t immediately respond to requests for comment.
A YouTube spokesperson, José Castañeda, described the slide released this week as “a cherry-picked view of a much larger safety framework” and said the company uses more than one tool to detect underage accounts and takes action whenever it finds one.
In court, the companies have argued that they are making editorial decisions protected by the First Amendment. The federal trial is set for June; the state court litigation, meanwhile, moved into jury selection this week, increasing the pressure on the companies.
While the state and federal cases differ slightly, the core argument is the same: that social media companies deliberately designed their products to hook young people, leading to disastrous but foreseeable consequences.
“It’s led to mental health issues, serious anxiety, depression, for many. For some, eating disorders, suicidality,” said Previn Warren, co-lead counsel on the case in federal court. “For the schools, it’s been lost control over the educational environment, inability of teachers to really control their classrooms and teach.”
A federal suit
Meta and other companies have faced backlash for years over their treatment of kids on their platforms, including Facebook and Instagram. Parents, lawmakers and privacy advocates have argued that social media contributed to a mental health crisis among young people and that tech companies failed to act when that fact became clear.
Those allegations gained new scrutiny last month when a brief citing still-sealed documents in the federal suit became public.
While the suit also names TikTok, Snap, and Google as defendants, the filing includes allegations against Meta that are especially detailed.
In the more than 200-page filing, for example, the plaintiffs argue that Meta deliberately misled the public about how damaging its platforms were.
Warren pointed to claims in the brief that Meta researchers found 55% of Facebook users had “mild” problematic use of the platform, while 3.1% had “severe” problems. Zuckerberg, according to the brief, pointed out that 3% of billions would still be millions of people.
But the brief claims the company published research noting only that “we estimate (as an upper bound) that 3.1% of Facebook users in the US experience problematic use.”
“That’s a lie,” Warren said.