By Effie Webb
Over the past week, we’ve been going through a treasure trove of nearly 6,000 pages of court filings that draw on internal emails, research and documents from major social media companies.
They were released by a California court as part of ongoing multi-district litigation brought by more than 1,800 plaintiffs, including parents and school districts, which alleges that platforms including TikTok, Meta, Google and Snapchat knowingly built addictive products that were harmful to children.
Some of these documents have already been reported on, mostly those containing internal emails from Meta. But the focus on Meta has let some other companies off the hook.
Some of our most alarming discoveries concern Snapchat, a platform with an especially young user base. Nearly two-thirds of 13- to 17-year-olds in the U.K. have a Snapchat profile, according to Ofcom. And the company claims it has nearly a billion monthly active users.
The documents we found show that some users have deeply unhealthy relationships with the platform, specifically concerning issues of addiction, anxiety and body image.
They also show that staff were well aware of this. In fact, Snapchat staff have repeatedly raised concerns about the risks to young users’ mental health posed by some of the platform’s key features — all of which are still central to the app today.
The company told us that the allegations in the case “fundamentally misrepresent our platform,” adding: “The safety and well-being of our community is a top priority.”
The material includes extracts from thousands of pages of testimony and internal company documents. Some of the original documents remain under seal, meaning we have only seen the legal excerpts.
‘Accidentally addictive’
One issue that recurs in the emails is the platform’s potentially addictive qualities. In an internal planning discussion from 2020, one employee apparently hinted at the company’s unspoken ambition to make the platform more addictive to its users.

Other documents make reference to some disturbing correspondence from users who had been blocked from the platform. According to documents referenced in the plaintiffs’ legal arguments, one user appears to have contacted the company, threatening to kill themselves if their account was not restored.
Another wrote to the company, directly addressing the CEO and founder Evan Spiegel, in an attempt to get her account back online.
“Please, I’m begging you to help me,” she wrote.
Central to many internal conversations around user addiction was Snapchat’s Streaks feature, which keeps count of the consecutive days two users have exchanged a snap.
In an email chain from 2017, one employee describes it as having caused “mass psychosis” among users.

By the following year, concerns around the feature had reached the very top. In an August 2018 email, Spiegel wrote that it had “created anxiety” in some users.
In a separate internal note, a senior product manager described it as “accidentally addictive” and “somewhat unhealthy” — but said the company was unwilling to remove the feature because it had become so popular.


The feature remained, and concerns continued. In 2023, a senior communications executive emailed a colleague with feedback from focus groups.
Parents had said their teenagers felt a “seemingly uncontrollable need to ‘keep their streaks alive,’” the executive wrote, and linked the behaviour to screen addiction.
Yet ultimately, these concerns did not lead the company to remove Streaks. It remains a signature part of Snapchat today.
Lexi Hazam, one of the lead lawyers in the federal litigation, told us families have been “paying the price” for decisions made behind closed doors by social media companies.
She said: “Internal evidence shows they knew their products were addictive and harmful to youth yet they kept parents and teachers in the dark.”
‘Lens dysmorphia’
Another key Snapchat feature is Lenses, which are filters that allow a user to alter their face using augmented reality. The documents show that concerns were repeatedly raised by staff about how they promote narrow and potentially damaging beauty standards. Internal testing found that so-called “beautification” lenses that slimmed the face or enlarged the eyes were especially popular.

In an incident documented in the plaintiffs’ legal arguments, one Black employee reported that when she had used a popular anime-style filter, it lightened her skin, narrowed her nose and lips, turned her eyes blue and smoothed away her braids. It is unclear exactly which filter was being referred to, but many similar anime-style filters remain popular and available today.
One internal report of a 2020 survey of young users described their responses to the lenses as “troubling.” Teenagers said filters made them look “decent instead of being a horrendous ugly black girl,” “thinner and lighter” and that the filter “hides my ugliness.”
The report noted that users were “left feeling bad about how they look,” calling the pattern “Lens dysmorphia.”

This appears to have been an ongoing concern. The plaintiffs’ legal documents describe an episode when early testing of a new lens prompted complaints from users who said it worsened their body image and triggered disordered-eating behaviors.
After both the communications and policy teams advised against launching it, the team developing the feature replied, saying “there is no way this lens will not be launched,” given the millions of new users it would attract.
This in turn prompted one senior communications executive to email her colleagues, laying out in no uncertain terms the ethical concerns her team had about the lens in question.

It is unclear which specific lens this exchange referred to or whether it was ultimately released, but lenses and filters that alter users’ faces remain widely available on the platform.
The company spokesperson told us that Snapchat was built to encourage “self-expression and authentic connections.”
The company said that the quotes in the legal filing were “cherry-picked” excerpts taken out of context, and that some points in the legal documents rely on user feedback reflecting only a small percentage of users.
“We’ve built safeguards, launched safety tutorials, partnered with experts, and continue to invest in features and tools that support the safety, privacy, and well-being of all Snapchatters,” they said.
Originally published by The Bureau of Investigative Journalism.
Effie Webb is an artificial intelligence (AI) fellow and reporter on The Bureau of Investigative Journalism’s Big Tech team. Before joining the team, she reported for Business Insider, investigating the global data worker industry behind AI, automation’s impact on jobs, and how big tech’s race to monetize generative AI is reshaping society. She has reported on health and wellness misinformation, Tesla, health tech, and cybercrime across print and documentary.
