Dec 3 (Reuters) - Despite widespread concern that
generative AI could interfere with major elections around the
globe this year, the technology had limited impact across Meta
Platforms' (META) apps, the tech company said on Tuesday.
Coordinated networks of accounts seeking to spread
propaganda or false content largely failed to build a
significant audience on Facebook and Instagram or use AI
effectively, Nick Clegg, Meta's president of global affairs,
told a press briefing. The volume of AI-generated misinformation
was low and Meta was able to quickly label or remove the
content, he said.
The snapshot from Meta comes as misinformation experts say AI
content has so far failed to significantly sway public opinion,
and notable deepfake videos and audio, including a clip of President
Joe Biden's voice, have been quickly debunked.
Coordinated networks of accounts attempting to spread false
content are increasingly shifting their activities to other
social media and messaging apps with fewer safety guardrails, or
are operating their own websites in order to stay online, Clegg
said.
Even as Meta said it was able to take down about 20 covert
influence operations on its platforms this year, the company has
retreated from the more stringent content moderation it employed
during the previous U.S. presidential election in 2020.
After hearing feedback from users who complained that their
content had been removed unfairly, Meta will aim to protect free
expression and be more precise in enforcing its rules, Clegg
said.
"We feel we probably overdid it a bit," he said. "While
we've been really focusing on reducing prevalence of bad
content, I think we also want to redouble our efforts to improve
the precision and accuracy with which we act on our rules."
The move is also a response to pushback from some Republican
lawmakers who have questioned what they say is censorship of
certain viewpoints on social media. In an August letter to the
U.S. House of Representatives Judiciary Committee, Meta CEO Mark
Zuckerberg said he regretted some content take-downs the company
made in response to pressure from the Biden administration.
(Reporting by Sheila Dang in Austin; Editing by Chizu Nomiyama)