
Zuckerberg says Meta is ditching fact-checkers for community-driven oversight


Meta announced Tuesday, Jan. 7, that it will replace its third-party fact-checking program with a community-driven system called Community Notes. The change will begin rolling out in the United States on Facebook, Instagram and Threads, with plans for global implementation in 2025, according to CEO Mark Zuckerberg.


In a video statement, Zuckerberg explained that the transition is aimed at simplifying policies, reducing moderation errors and prioritizing free expression. He criticized the existing fact-checking system as overly complicated, leading to errors that impacted millions of users.

“So we’re going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms. More specifically, here’s what we’re going to do,” Zuckerberg said. “First, we’re going to get rid of fact checkers and replace them with community notes similar to X starting in the U.S.”


Meta’s fact-checking initiative, launched in 2016, involved partnerships with independent organizations certified by the International Fact-Checking Network and the European Fact-Checking Standards Network. These groups reviewed flagged content, assessed its accuracy and assigned ratings such as “False,” “Altered” or “Missing Context.”

Under the outgoing system, flagged posts had their visibility reduced, and users received notifications before sharing content rated inaccurate. Repeat offenders faced penalties, including reduced reach and restrictions on monetization. Zuckerberg said the complexity of the system led to unintended consequences.

“So we built a lot of complex systems to moderate content. But the problem with complex systems is they make mistakes even if they accidentally censor just 1% of posts. That’s millions of people. And we’ve reached a point where it’s just too many mistakes in too much censorship,” Zuckerberg said.

Community Notes, modeled after a similar feature implemented by X, will rely on user contributions to flag and add context to posts. Meta plans to refine the system in the United States before expanding it globally. The company described the move as part of an effort to enhance transparency and involve users more directly in content moderation.

Meta emphasized that strict oversight will remain for content related to terrorism, child exploitation and drug-related issues. While the company acknowledged concerns about the potential for increased misinformation, it expressed confidence that refinements to Community Notes will mitigate these risks.


[Craig Nigrelli]

META ANNOUNCED TUESDAY IT IS REPLACING ITS THIRD-PARTY FACT-CHECKING PROGRAM WITH A COMMUNITY-DRIVEN SYSTEM CALLED COMMUNITY NOTES. CEO MARK ZUCKERBERG SAYS THE TRANSITION, WHICH BEGINS IN THE UNITED STATES ACROSS PLATFORMS LIKE FACEBOOK, INSTAGRAM, AND THREADS, WILL EXPAND GLOBALLY IN 2025.

MARK ZUCKERBERG

META CEO

“So we’re going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms. More specifically, here’s what we’re going to do. First, we’re going to get rid of fact checkers and replace them with community notes similar to X starting in the US.”

[Craig Nigrelli]

ZUCKERBERG SAYS THAT THE CHANGE IS INTENDED TO SIMPLIFY POLICIES, REDUCE MODERATION ERRORS, AND PRIORITIZE FREE EXPRESSION. HE CHARACTERIZES THE EXISTING SYSTEM AS OVERLY COMPLEX.

META WILL MAINTAIN STRICT OVERSIGHT FOR ISSUES SUCH AS TERRORISM, CHILD EXPLOITATION, AND DRUG-RELATED CONTENT WHILE RELYING ON COMMUNITY INPUT FOR BROADER CONTENT MODERATION.

THE OUTGOING FACT-CHECKING PROGRAM, LAUNCHED IN 2016, INVOLVED PARTNERSHIPS WITH INDEPENDENT ORGANIZATIONS CERTIFIED BY THE INTERNATIONAL FACT-CHECKING NETWORK AND THE EUROPEAN FACT-CHECKING STANDARDS NETWORK. THESE GROUPS REVIEWED FLAGGED CONTENT AND ASSIGNED RATINGS SUCH AS “FALSE,” “ALTERED,” OR “MISSING CONTEXT.”

POSTS DEEMED INACCURATE HAD THEIR VISIBILITY REDUCED, AND REPEAT OFFENDERS FACED PENALTIES SUCH AS REDUCED REACH OR MONETIZATION RESTRICTIONS.

MARK ZUCKERBERG

META CEO

“So we built a lot of complex systems to moderate content. But the problem with complex systems is they make mistakes even if they accidentally censor just 1% of posts. That’s millions of people. And we’ve reached a point where it’s just too many mistakes in too much censorship.”

[Craig Nigrelli]

COMMUNITY NOTES, INSPIRED BY A SIMILAR SYSTEM IMPLEMENTED BY X, INVOLVES USERS CONTRIBUTING CONTEXT TO FLAGGED POSTS.

META PLANS TO REFINE THE SYSTEM DOMESTICALLY BEFORE ITS GLOBAL ROLLOUT. THE COMPANY DESCRIBES THE MOVE AS PART OF A BROADER EFFORT TO FOSTER TRANSPARENCY AND USER PARTICIPATION IN CONTENT MODERATION.

META ACKNOWLEDGES CONCERNS FROM CRITICS ABOUT THE POTENTIAL FOR INCREASED MISINFORMATION BUT EXPRESSES CONFIDENCE THAT THE PROGRAM’S REFINEMENTS AND OVERSIGHT MECHANISMS WILL ADDRESS THESE CHALLENGES.

FOR MORE OF OUR UNBIASED, STRAIGHT FACT REPORTING –  DOWNLOAD THE STRAIGHT ARROW NEWS APP TODAY, OR LOG ON TO SAN.COM.