Meta is discontinuing the use of independent fact-checkers on Facebook and Instagram, opting instead for a “community notes” system in which users contribute to verifying the accuracy of posts, similar to the approach used by X.
In a video accompanying a blog post on Tuesday, CEO Mark Zuckerberg stated that third-party moderators were “too politically biased” and emphasized the company’s shift back to promoting “free expression.”
This decision aligns with efforts by Zuckerberg and other tech leaders to strengthen ties with U.S. President-elect Donald Trump, who will take office later this month. Trump and his Republican allies have criticized Meta’s fact-checking practices, accusing the company of suppressing conservative viewpoints. After the announcement, Trump praised the move, stating Meta had “made significant progress.”
When asked if Zuckerberg’s decision was influenced by past threats from Trump, the president-elect responded, “Probably.”
Joel Kaplan, a prominent Republican who is replacing Sir Nick Clegg as Meta’s global affairs chief, described the reliance on independent moderators as “well-intentioned” but acknowledged it had led to excessive censorship.
The announcement has drawn criticism from campaigners against online hate speech, who argue that the change is aimed at appeasing Trump. Ava Lee of Global Witness, an organization focused on holding tech companies accountable, condemned the move as a political strategy to sidestep responsibility for managing hate speech and disinformation.
“By framing this as a stand against censorship, Zuckerberg is sidestepping the platforms’ role in enabling harmful content,” Lee said.
Emulating X
Meta is replacing its current fact-checking system, introduced in 2016, with a “community notes” feature, starting in the U.S. The existing program relies on independent organizations to review posts flagged as potentially false or misleading. Posts deemed inaccurate are labeled with additional information and pushed lower in users’ feeds.
The new system, inspired by a similar feature on X (formerly Twitter), allows users with differing viewpoints to collaborate on adding context or clarification to posts. Elon Musk, who expanded the feature after acquiring the platform, described Meta’s adoption as “cool.”
Meta clarified that it does not intend to remove third-party fact-checkers in the UK or EU for now. The company also reassured users that its approach to handling content related to suicide, self-harm, and eating disorders remains unchanged.
Full Fact, a European fact-checking organization involved in Meta’s current program, rejected claims of bias in its work. Chris Morris, the organization’s CEO, criticized Meta’s decision, calling it a “disappointing and regressive step” with potentially global consequences for managing misinformation.
‘Facebook jail’
Fact-checkers and content moderators are often described as the internet’s first responders. Meta executives, however, believe they have been overly involved in policing content.
Joel Kaplan, in a statement on Tuesday, acknowledged issues such as unnecessary censorship, wrongful account restrictions—dubbed “Facebook jail”—and delayed resolutions. He noted these problems have prompted a shift in approach.
Mark Zuckerberg admitted in a video announcement that the adjustment involves compromises. “We’ll identify less harmful content,” he explained, “but we’ll also reduce the unintended removal of innocent users’ posts and accounts.”
This new strategy contrasts with stricter regulations in the UK and Europe, where tech giants are being held increasingly accountable for hosted content, under threat of heavy fines.
Currently, Meta’s reduced oversight applies only in the U.S., reflecting a cautious, region-specific implementation.
‘A radical swing’
Meta’s blog post stated its intention to reverse the “mission creep” in its content moderation policies, arguing that rules should not prohibit statements permissible on TV or in Congress.
The shift coincides with preparations for Donald Trump’s inauguration on January 20. Tech leaders have responded in various ways—some congratulating Trump publicly, while others, like Zuckerberg, have met him at Mar-a-Lago. Meta has also contributed $1 million to Trump’s inauguration fund.
In a video statement, Zuckerberg described the recent elections as a cultural shift back toward prioritizing free speech. Meta reportedly informed Trump’s team of its policy changes prior to making them public, according to The New York Times.
The replacement of Sir Nick Clegg, a former UK deputy prime minister, by Joel Kaplan as Meta’s president of global affairs is seen as a sign of the company’s evolving moderation strategy and political stance. Additionally, Meta announced Dana White, a Trump ally and UFC president, would join its board of directors.
Kate Klonick, a law professor at St. John’s University, noted that these developments align with a broader trend toward reduced moderation, which she said has felt inevitable, particularly after Elon Musk’s acquisition of X. Klonick explained that while platforms previously faced pressure to address harassment, hate speech, and disinformation, they are now moving sharply in the opposite direction, driven by political dynamics.