AG Ferguson statement on unsealed federal complaint against Meta for harming youth mental health

Submitted by the Office of the Attorney General

Attorney General Bob Ferguson issued the following statement today on his office’s federal lawsuit against Meta for knowingly harming youth mental health. The social media company recently agreed to unseal information it had previously designated confidential.

The full complaint is now public, minus identifying information for certain non-executive employees, revealing specific details of Meta’s unlawful conduct.

“The evidence is clear — Mark Zuckerberg and Meta’s top executives knew and disregarded the extensive risks that addictive features on Instagram and Facebook posed to children,” Ferguson said. “They ignored repeated warnings from their employees and researchers and exploited harmful features to maximize profit. My office will continue doing everything we can to protect the mental health of Washington youth.”

Ferguson is suing Meta in U.S. District Court for the Northern District of California, as part of a bipartisan coalition of 42 state attorneys general. The federal lawsuit, filed by 33 of those states, accuses Meta of putting profits before the well-being of millions of children and teens by intentionally targeting them with harmful features to get them hooked for life. Internal documents show the tech company knew the risks those features posed and not only ignored them but publicly downplayed them in violation of the Consumer Protection Act.

Highlights from the unsealed complaint

Meta CEO Mark Zuckerberg ignored internal documents detailing a consultation with “21 independent experts around the world” who found that filters with cosmetic surgery effects “can have severe impacts on both the individuals using the effects and those viewing the images.” Experts told Meta that children were particularly vulnerable, as were those with a history of eating disorders and mental illness. Instagram’s head of public policy wrote to Zuckerberg that outside experts were “nearly unanimous on the harm here.” Zuckerberg canceled a meeting to discuss these issues, then subsequently vetoed a proposal to ban the filters. He dismissed the concerns as “paternalistic.”

In response to the veto, then-vice president of product design wrote in an email to Zuckerberg: “I respect your call on this and I’ll support it, but want to just say for the record that I don’t think it’s the right call given the risks … I just hope that years from now we will look back and feel good about the decision we made here.”

Internal emails show that Meta disregarded well-documented research on the psychological harm to youth when they are inundated with notifications. For example, an internal Meta document discussing “Problematic Facebook Use” stated that “smartphone notifications caused inattention and hyperactivity among teens, and they reduced productivity and well-being.” Despite this knowledge, the company pursued a strategy for “Teen Growth” by “[leveraging] teens’ higher tolerance for notifications to push retention and engagement.”

Meanwhile, internal documents and emails between top executives show that Meta has long known the frequency of its notifications is problematic, but continued the practice to maximize engagement. In fact, the then-vice president of analytics said in an email: “Fundamentally I believe that we have abused the notifications channel as a company.” In June 2018, an internal presentation called “Facebook ‘Addiction’” proposed that Meta reduce notifications to curb problematic use. To date, Instagram does not offer users a setting to permanently disable all notifications. At most, users can opt to pause all notifications for up to eight hours at a time or pause notifications for a specific category. After notifications are disabled, Meta pressures users to reinstate them.

Meta executives repeatedly ignored or declined requests to fund proposed well-being initiatives and strategies that were intended to reduce harmful features on Instagram and Facebook. For example, in April 2019, Meta’s then-vice president of research emailed Zuckerberg proposing well-being investments on the platforms, pointing out, “There is increasing scientific evidence (particularly in the US…) that the average net effect of [Facebook] on people’s well-being is slightly negative.” Meta’s leadership team declined to fund the initiative. Requests like these, which involved internal discussions between multiple top executives at both Instagram and Facebook over several years, were repeatedly denied.