Newly unsealed court filings allege that Meta concealed evidence indicating that Facebook and Instagram can harm users’ mental health.
The claims stem from a United States class-action lawsuit filed by school districts and are based on internal research known as Project Mercury.
Internal documents reveal that in a 2020 study conducted with the research firm Nielsen, users who deactivated Facebook and Instagram for a week reported lower levels of depression, anxiety, loneliness, and social comparison.
Rather than publishing the results or pursuing further research, Meta reportedly ended the project, internally blaming the negative outcomes on the “existing media narrative” surrounding the company.
Privately, Meta staff reassured Clegg, the company’s then-head of global public policy, that the study’s conclusions were indeed valid.
“The Nielsen study does show causal impact on social comparison,” an unnamed staff researcher allegedly wrote, appending an unhappy face emoji.
Another staffer worried that keeping quiet about negative findings would be akin to the tobacco industry “doing research and knowing cigs were bad and then keeping that info to themselves.”
The filing alleges that despite Meta’s own research showing a causal link between its platforms and negative mental health effects, the company told Congress it could not quantify whether its products were harmful to teenage girls.
On Saturday, Meta spokesperson Andy Stone said the study was halted because of methodological flaws and emphasized that the company has worked diligently to make its products safer.
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.
The claim that Meta concealed evidence of social media harms is among several allegations in a late Friday filing by Motley Rice, a law firm representing school districts in lawsuits against Meta, Google, TikTok, and Snapchat.
The plaintiffs broadly contend that the companies deliberately kept known risks of their products hidden from users, parents, and educators.