Feb 27 (Reuters) - Meta Platforms (META) said on Thursday it had resolved an error that flooded the personal Reels feeds of Instagram users worldwide with violent and graphic videos.
It was not immediately clear how many people were affected
by the glitch. Meta's comments followed a wave of complaints on
social media about violent and "not safe for work" content in
Reels feeds, despite some users having enabled the "sensitive
content control" setting meant to filter such material.
"We have fixed an error that caused some users to see
content in their Instagram Reels feed that should not have been
recommended. We apologize for the mistake," a spokesperson for
Meta said.
Meta did not disclose the cause of the error.
Meta's moderation policies have come under scrutiny after it
decided last month to scrap its U.S. fact-checking program on
Facebook, Instagram and Threads, three of the world's biggest
social media platforms with more than 3 billion users globally.
Violent and graphic videos are prohibited under Meta's policies, and the company typically removes such content to protect users, with exceptions for videos that raise awareness of issues such as human rights abuses and conflict.
The company has in recent years leaned more heavily on automated moderation tools, a reliance that is expected to grow with the shift away from fact-checking in the United States.
Meta has faced criticism for failing to effectively balance content recommendations with user safety, as seen in the spread of violent content during the Myanmar genocide, Instagram's promotion of eating disorder content to teens and misinformation during the COVID-19 pandemic.
(Reporting by Surbhi Misra and Akash Sriram in Bengaluru;
Editing by Saumyadeb Chakrabarty)