In a groundbreaking shift, Meta CEO Mark Zuckerberg has announced the end of Facebook and Instagram’s third-party fact-checking program, signaling a renewed commitment to free expression across the company’s platforms. The move, described as a return to Meta’s foundational values, aims to simplify content moderation policies, minimize mistakes, and promote open discourse.
Meta Abandons Fact-Checking Program for Community Notes
During a video announcement on Tuesday, Zuckerberg emphasized the company’s intention to move away from third-party fact-checkers and adopt a model similar to the Community Notes feature on X (formerly Twitter). This approach will rely on platform users to provide context and commentary on posts rather than outsourcing content verification to independent agencies.
“We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms,” Zuckerberg said. “More specifically, we’re going to get rid of fact-checkers and replace them with Community Notes similar to X, starting in the U.S.”
Joel Kaplan, Meta’s Chief Global Affairs Officer, echoed Zuckerberg’s sentiments during an exclusive interview on Fox & Friends. Kaplan emphasized that this change represents a “reset in favor of free expression” and an effort to reduce political bias in content moderation.
The Problem with Third-Party Fact-Checking
Meta introduced third-party fact-checking in response to widespread misinformation concerns after the 2016 U.S. presidential election. However, the system has faced increasing criticism for perceived political bias and inconsistent application.
“It has become clear there is too much political bias in what third-party fact-checkers choose to address,” Kaplan stated in an interview with Fox News Digital. “Basically, they get to fact-check whatever they see on the platform, and that creates an imbalance.”
Kaplan explained that Community Notes will offer a more democratic approach to content moderation. Instead of relying on external experts, platform users will have the opportunity to provide their insights on posts. If these contributions gain support across a diverse range of users, they will be highlighted as context for others to view.
“We believe this is a more transparent and balanced system compared to outsourcing decisions to so-called experts who may carry their own biases into the program,” Kaplan added.
Content Moderation Rules to Be Revised
Meta’s overhaul doesn’t stop at removing third-party fact-checkers. The company is also revising its internal content moderation policies, focusing on fostering open discussions on sensitive topics such as immigration, gender identity, and political discourse.
“We want to make sure that conversations around sensitive issues can happen freely on our platforms without fear of unwarranted censorship,” Kaplan said.
He admitted that automated moderation systems often flag or remove content that doesn’t actually violate community standards. These technical errors have fueled frustrations among users and amplified perceptions of bias in content management.
Focus on High-Severity Violations
While Meta is relaxing restrictions on many types of content, Kaplan clarified that the company will continue to enforce strict rules against posts promoting terrorism, child exploitation, and illegal activities. These remain non-negotiable boundaries.
“Our focus will shift toward illegal and high-severity violations,” Kaplan said. “This ensures we address genuine threats while allowing for broader freedom of expression.”
Political Content to Become User-Centric
Meta is also planning to give users more control over the type of political content they see on their feeds. Those interested in more political discussions will be able to customize their preferences, while others can reduce exposure to such content.
“We want to personalize the experience so that every user feels comfortable and in control of their newsfeed,” Kaplan explained.
Zuckerberg’s Letter to Congress: Political Pressure on Meta
Last year, Mark Zuckerberg admitted in a letter to the U.S. House Judiciary Committee that Meta had faced significant pressure from the Biden administration regarding content moderation policies. Topics like COVID-19 misinformation, satire, and political commentary were heavily scrutinized.
“When U.S. companies face pressure from their own government to censor content, it sets a dangerous precedent for authoritarian regimes worldwide,” Kaplan noted. “This is why we see this moment as an opportunity to reset our relationship with the government and align with our core values.”
Trump Administration: A Key Partnership for Meta
Kaplan hinted at a cooperative relationship with the incoming Trump administration, which has historically championed free speech principles on social media platforms.
“We have a new administration that values free expression and opposes government pressure on tech companies to censor content,” Kaplan stated. “This is an opportunity to work together to uphold First Amendment principles and promote American innovation in technology.”
Meta’s Return to Its Roots
Zuckerberg’s announcement underscores a broader shift at Meta, which aims to reduce unnecessary intervention in conversations while addressing genuinely harmful content. Kaplan highlighted that Meta intends to change not just its policies but also how those policies are enforced.
“We’re not just revising rules on paper; we’re fundamentally changing how we approach enforcement,” he said.
Community Notes: A Transparent Approach to Moderation
Meta’s adoption of a Community Notes-style model reflects a broader industry trend where social media platforms rely more on collective user contributions than centralized decision-making by experts.
Under the new system:
- Users can attach context to posts.
- Notes will become visible only after they gain agreement from a diverse range of users.
- The system aims to reduce bias by democratizing the content moderation process.
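The “agreement across a diverse range of users” requirement can be illustrated with a toy rule. The sketch below is purely hypothetical: Meta has not published its implementation, and X’s production system actually scores notes with matrix factorization over full rater histories rather than a simple threshold. The function name, groupings, and thresholds here are illustrative assumptions only.

```python
# Illustrative sketch of a "diverse agreement" visibility rule.
# NOT Meta's or X's actual algorithm; it only demonstrates the idea
# that a note is shown when raters from different viewpoint groups
# independently find it helpful.

from collections import defaultdict

def note_is_visible(ratings, min_ratings_per_group=2, threshold=0.7):
    """ratings: list of (viewpoint_group, is_helpful) tuples.

    Returns True only if at least two distinct viewpoint groups have
    each supplied `min_ratings_per_group` ratings, and every such
    group rates the note helpful at or above `threshold`.
    """
    by_group = defaultdict(list)
    for group, helpful in ratings:
        by_group[group].append(helpful)

    # Keep only groups with enough ratings to count.
    qualified = {g: votes for g, votes in by_group.items()
                 if len(votes) >= min_ratings_per_group}

    # Require agreement across at least two distinct groups.
    if len(qualified) < 2:
        return False
    return all(sum(votes) / len(votes) >= threshold
               for votes in qualified.values())

# A note rated helpful by both viewpoint groups becomes visible:
print(note_is_visible([("left", True), ("left", True),
                       ("right", True), ("right", True)]))   # True
# One-sided support, however strong, is not enough:
print(note_is_visible([("left", True), ("left", True),
                       ("left", True), ("left", True)]))     # False
```

The design point the sketch captures is that raw vote counts are deliberately insufficient: visibility hinges on cross-group consensus, which is what proponents mean when they call the approach less susceptible to one-sided campaigns.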
Balancing Freedom and Responsibility
While Meta’s changes aim to promote free speech, Kaplan acknowledged the need to strike a balance between open dialogue and responsible platform management. Certain types of content, such as hate speech, threats of violence, and illegal activities, will continue to be tightly regulated.
“We’re building systems that enable free expression without compromising the safety and integrity of our platforms,” Kaplan assured.
Global Impact of Meta’s Policy Shift
Meta’s decision to dismantle its fact-checking program is likely to have global ramifications. Regulators and rival platforms worldwide have often treated Meta’s moderation policies as a benchmark for their own digital governance standards.
This shift is expected to spark debates about free speech, misinformation, and the role of social media platforms in shaping public discourse.
Meta’s Future Vision
Meta’s restructuring represents a major gamble but also an opportunity to rebuild trust with its user base. Zuckerberg and Kaplan both emphasized their commitment to creating an environment where users feel empowered to express themselves while maintaining platform integrity.
“This isn’t just about changing policies; it’s about restoring trust and building a platform where free speech thrives,” Kaplan concluded.
Final Thoughts
Meta’s decision to scrap third-party fact-checking in favor of a Community Notes-style system marks a significant turning point in the tech giant’s history. By shifting moderation responsibilities to users and simplifying content policies, Meta aims to create a space that prioritizes open dialogue while addressing harmful content effectively.
As the world watches these changes unfold, the success of this new approach will depend on how effectively Meta balances free expression with platform safety. For now, Zuckerberg’s message is clear: Meta is returning to its roots, and free speech is at the heart of this transformation.