Representational image
Social media companies are increasingly relying on fact-checks by their users, thereby offloading the responsibility for moderating factually incorrect content in controversial posts.
From next week, Meta will begin testing a crowdsourced Community Notes approach across Facebook, Instagram and Threads, starting with the US. In news verification circles, it is widely seen as Meta founder Mark Zuckerberg’s way of currying favour with the Trump administration, which in the past popularised the phrase “alternative facts”.
Meta has said its new content moderation tool incorporates the same open-source algorithm that powers X’s Community Notes and it plans to modify the algorithm to better serve its Facebook, Instagram and Threads apps.
Elon Musk’s X popularised the concept of Community Notes, which relies heavily on users to police the site for misinformation. X began testing the programme before Musk acquired the company in 2022, but Musk, now a senior adviser to US President Donald Trump, accelerated it and largely did away with the fact-checking labels Twitter once applied to misleading posts.
The crowdsourced approach works for topics on which there is broad consensus, because users with different political opinions have to agree on a fact-check before it is publicly added to a post. That process can take time, and politically charged posts may go unchecked for a long time.
“Meta won’t decide what gets rated or written — contributors from our community will. And to safeguard against bias, notes won’t be published unless contributors with a range of viewpoints broadly agree on them,” Meta has said.
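X's published open-source algorithm infers viewpoints from rating patterns using matrix factorization rather than explicit labels, and Meta says it will adapt that algorithm for its apps. As a loose illustration only, the core idea of requiring cross-viewpoint agreement before a note is published can be sketched like this (the function, thresholds and explicit viewpoint labels are all simplifications for clarity, not the actual Meta or X method):

```python
from collections import defaultdict

def should_publish(ratings, threshold=0.6, min_per_group=2):
    """Hypothetical sketch: publish a note only if contributors across
    viewpoint groups broadly agree it is helpful.

    ratings: list of (viewpoint_group, is_helpful) tuples, e.g.
             [("left", True), ("right", True), ...]
    """
    by_group = defaultdict(list)
    for group, helpful in ratings:
        by_group[group].append(helpful)

    # Require at least two distinct viewpoint groups to weigh in at all.
    if len(by_group) < 2:
        return False

    # Every group must have enough raters AND a clear majority
    # finding the note helpful; one dissenting bloc sinks it.
    for votes in by_group.values():
        if len(votes) < min_per_group:
            return False
        if sum(votes) / len(votes) < threshold:
            return False
    return True
```

The design choice this illustrates is why politically charged posts can sit unchecked: a note rated helpful only by one side of a divide never clears the agreement bar, so it is never shown.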
So far, around 200,000 people (in the US) have signed up to become Community Notes contributors “and the waitlist remains open for those who wish to take part in the programme”.
Community Notes is not applicable to advertisements but it can be attached to “almost any other forms of content, including posts by Meta, our executives, politicians and other public figures”. Posts that are appended with Community Notes can’t be appealed and there is no penalty for content that’s flagged.
Meta, which has more than three billion users globally, originally announced its Community Notes days after Nick Clegg, the former UK Deputy Prime Minister, announced he was stepping down as Meta’s president of global affairs to be replaced by the prominent Republican Joel Kaplan.
In January, Zuckerberg said in an Instagram post titled “It’s time to get back to our roots around free expression”: “We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes, even if they accidentally censor just one per cent of posts, that’s millions of people, and we’ve reached a point where it’s just too many mistakes and too much censorship. The recent elections also feel like a cultural tipping point towards, once again, prioritising speech.”
There is little data on the usefulness of Community Notes, even on X. MediaWise, a media literacy programme at the Poynter Institute, found in July that only about 6 per cent of the drafted Community Notes on posts about immigration became public, and only 4 per cent of drafted fact-checks on posts about abortion were published, according to The New York Times.
Meta’s decision to end its traditional fact-checking programme may remind some of what the late US politician and diplomat Daniel Patrick Moynihan said: “Everyone is entitled to his own opinion, but not his own facts.”