* Plaintiffs allege Meta hid product risks from users and authorities
* Meta accused of ineffective youth safety features and prioritizing growth over safety
* Meta opposed unsealing of internal documents in court
* Meta allegedly ignored valid research findings on mental health impacts
* TikTok allegedly influenced National PTA to publicly support its safety claims
By Jeff Horwitz
Nov 22 (Reuters) - Meta shut down internal research into
the mental health effects of Facebook and Instagram after
finding causal evidence that its products harmed users' mental
health, according to unredacted filings in a class action by
U.S. school districts against Meta and other social media
platforms.
In a 2020 research project code-named "Project Mercury,"
Meta scientists worked with survey firm Nielsen to
gauge the effect of "deactivating" Facebook and Instagram,
according to Meta documents obtained via discovery. To the
company's disappointment, "people who stopped using Facebook for
a week reported lower feelings of depression, anxiety,
loneliness and social comparison," internal documents said.
Rather than publishing those findings or pursuing additional
research, the filing states, Meta called off further work and
internally declared that the negative study findings were
tainted by the "existing media narrative" around the company.
Privately, however, staff assured Nick Clegg, Meta's
then-head of global public policy, that the conclusions of the
research were valid.
"The Nielsen study does show causal impact on social
comparison," (unhappy face emoji), an unnamed staff researcher
allegedly wrote. Another staffer worried that keeping quiet
about negative findings would be akin to the tobacco industry
"doing research and knowing cigs were bad and then keeping that
info to themselves."
Despite Meta's own work documenting a causal link between
its products and negative mental health effects, the filing
alleges, Meta told Congress that it had no ability to quantify
whether its products were harmful to teenage girls.
In a statement Saturday, Meta spokesman Andy Stone said the
study was stopped because its methodology was flawed and that
the company has worked diligently to improve the safety of its
products.
"The full record will show that for over a decade, we have
listened to parents, researched issues that matter most, and
made real changes to protect teens," he said.
PLAINTIFFS ALLEGE PRODUCT RISKS WERE HIDDEN
The allegation that Meta buried evidence of social media harms
is just one of many in a filing made late Friday by Motley Rice, a law
firm suing Meta, Google, TikTok and Snapchat
on behalf of school districts around the country. Broadly, the
plaintiffs argue the companies have intentionally hidden the
internally recognized risks of their products from users,
parents and teachers.
TikTok, Google and Snapchat did not immediately respond to
requests for comment.
Allegations against Meta and its rivals include tacitly
encouraging children below the age of 13 to use their platforms,
failing to address child sexual abuse content and seeking to
expand the use of social media products by teenagers while they
were at school. The plaintiffs also allege that the platforms
attempted to pay child-focused organizations to defend the
safety of their products in public.
In one instance, TikTok sponsored the National PTA and then
internally boasted about its ability to influence the
child-focused organization. Per the filing, TikTok officials
said the PTA would "do whatever we want going forward in the
fall... [T]hey'll announce things publicly, [t]heir CEO will do
press statements for us."
By and large, however, the allegations against the other
social media platforms are less detailed than those against
Meta. The internal documents cited by the plaintiffs allege:
1. Meta intentionally designed its youth safety features to be ineffective and rarely used, and blocked testing of safety features that it feared might be harmful to growth.
2. Meta required users to be caught 17 times attempting to traffic people for sex before it would remove them from its platform, which a document described as "a very, very, very high strike threshold."
3. Meta recognized that optimizing its products to increase teen engagement resulted in serving them more harmful content, but did so anyway.
4. For years, Meta stalled internal efforts to prevent child predators from contacting minors due to growth concerns, and pressured safety staff to circulate arguments justifying its decision not to act.
5. In a 2021 text message, Mark Zuckerberg said that he wouldn't say that child safety was his top concern "when I have a number of other areas I'm more focused on like building the metaverse." Zuckerberg also shot down or ignored requests by Clegg to better fund child safety work.
Meta's Stone disputed these allegations, saying the
company's teen safety measures are effective and that the
company's current policy is to remove accounts as soon as they
are flagged for sex trafficking.
He said the suit misrepresents Meta's efforts to build safety
features for teens and parents, and called the company's safety
work "broadly effective."
"We strongly disagree with these allegations, which rely on
cherry-picked quotes and misinformed opinions," Stone said.
The underlying Meta documents cited in the filing are not
public, and Meta is opposing their unsealing.
A hearing on the filing is set for January 26 in federal court
in the Northern District of California.