Meta, the company behind Facebook and Instagram, is ending its fact-checking program and replacing it with a system similar to X’s Community Notes, the firm’s CEO Mark Zuckerberg says.
In the last few days, it has become clear to some observers that Zuckerberg – possibly pushed or threatened by the incoming Donald Trump administration – is moving in the direction of MAGA.
On Monday, he appointed UFC chief executive and Trump ally Dana White to Meta's board of directors. Last week, Nick Clegg, the company's liberal-leaning global affairs chief, was replaced by prominent Republican Joel Kaplan.
Finally, on Tuesday, Zuckerberg announced in a new video that Meta will now end its fact-checking program with trusted third-party partners, citing a shifting political landscape and a desire to embrace free speech.
The fact-checking mechanism will be replaced with a community-driven system. X, the platform owned by billionaire Elon Musk, already uses Community Notes.
Zuckerberg speaks about censorship
“We're gonna get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms,” Zuckerberg said in the video.
“More specifically, here's what we're going to do. [...] We're going to get rid of fact-checkers and replace them with community notes similar to X, starting in the US.”
According to Zuckerberg, Meta will also be eliminating some content policies around hot-button issues including immigration and gender – but the company will continue to aggressively moderate content related to drugs, terrorism, and child exploitation.
The Meta CEO even criticized “governments and legacy media” for pushing “to censor more and more.”
Meta’s independent fact-checking program was launched in 2016 and has grappled with the spread of controversial and sometimes dangerous content, such as misinformation about elections, anti-vaccination scare stories, violence, and hate speech.
Now, the social media giant is drastically changing its content moderation policy. And while Zuckerberg’s announcement was characteristically measured in tone, Meta’s newly appointed chief of global affairs, Kaplan, was pretty blunt when speaking to Fox on Tuesday.
According to Kaplan, Meta’s partnerships with fact-checkers were “well-intentioned at the outset, but there’s just been too much political bias in what they choose to fact-check and how.”
Kaplan also said that professional fact-checkers – usually experienced journalists and editors – were “so-called experts” who brought “their own biases into the program.”
“Over time, we ended up with too much content being fact-checked that people would understand to be legitimate political speech and debate,” Kaplan added in a separate statement.
Do Community Notes actually work?
To be fair, though, studies have shown that fact-checkers have mostly had to deal with right-leaning Facebook posts, as these are more likely to contain lies or misinformation.
In late 2023, a study by Northeastern University in Boston, published in the journal Nature Communications Psychology, found that when confronted with fake news, Republicans are more likely than Democrats to believe the false headlines.
Last year, Angie Drobnic Holan, director of the International Fact-Checking Network at the Poynter Institute – which launched the network in 2015 – said in a comment that fact-checking wasn’t censorship: “It adds to the public debate and doesn’t suppress it.”
“The truth is that no fact-checker has been given authority by any tech platform to take down content. The fact-checkers I work with would rather see inaccurate content contextualized and labeled, so it can remain part of the public record and the public debate,” wrote Drobnic Holan.
Once Meta’s Community Notes program is up and running, the company will not write the notes or decide which ones appear on the platforms. Meta said, “Just like they do on X, Community Notes will require agreement between people with a range of perspectives to help prevent biased ratings.”
But that might not help, recent Washington Post research has revealed. In October, the newspaper published a report saying that on X, the Community Notes requirement to secure agreement from reviewers with opposing political viewpoints is actually hampering the project.
In other words, there are some topics on which political opponents are never going to agree – which means posts on those topics aren’t getting Community Noted, either.
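To illustrate why an agreement requirement can stall notes on polarizing topics, here is a minimal, hypothetical sketch in Python. It is not X’s or Meta’s actual algorithm – X’s system uses a bridging model that infers rater perspectives from past rating behavior rather than explicit labels – and the Rating structure, the viewpoint labels, and the publishing threshold below are illustrative assumptions only.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical, simplified illustration of a "cross-perspective agreement" rule.
# Not the real Community Notes algorithm, which scores notes with a bridging
# model rather than explicit viewpoint labels.

@dataclass
class Rating:
    rater_viewpoint: str   # assumed label for illustration, e.g. "left" or "right"
    helpful: bool

def note_is_published(ratings: list[Rating], min_per_side: int = 3) -> bool:
    """Publish a note only if enough raters from each viewpoint found it helpful."""
    helpful_by_side = defaultdict(int)
    for r in ratings:
        if r.helpful:
            helpful_by_side[r.rater_viewpoint] += 1
    return all(helpful_by_side[side] >= min_per_side for side in ("left", "right"))

# On a polarizing topic, one side may never rate the note helpful,
# so the note is never shown, which is the failure mode the Post described.
contested = [Rating("left", True)] * 10 + [Rating("right", False)] * 10
consensus = [Rating("left", True)] * 5 + [Rating("right", True)] * 5

print(note_is_published(contested))  # False: no cross-perspective agreement
print(note_is_published(consensus))  # True: both sides rated it helpful
```

Under this kind of rule, a note that only one political camp finds helpful never clears the threshold, no matter how many ratings it collects.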