Meta’s decision to end its third-party fact-checking programme and replace it with a Community Notes system has sparked controversy and concern among digital rights advocates. A South African non-profit organisation, the Campaign on Digital Ethics (CODE), has criticised Meta’s move as a “reckless and dangerous gamble” that could have serious consequences.
In a recent announcement, Meta, the parent company of social media platforms including Facebook and Instagram, revealed that it would phase out its third-party fact-checking programme in favour of a new Community Notes system. The new system will allow users to flag posts they believe contain misleading or false information, shifting the platforms towards a community-driven approach to content moderation.
Mark Zuckerberg, CEO of Meta, defended the decision in a video titled “More speech and fewer mistakes”, stating that fact-checking organisations had shown bias in their moderation practices. He emphasised the importance of free speech and expressed a desire to simplify content policies while reducing moderation errors.
However, Kavisha Pillay, executive director of CODE, raised concerns about the consequences of this shift. Pillay pointed to the harms of poor content moderation, citing examples such as election interference, violence, hate speech, and the spread of conspiracy theories. She warned that dismantling third-party fact-checking could lead to a surge in misinformation, deepen division, and harm vulnerable communities on a global scale.
Pillay argued that unmoderated platforms often amplify harmful perspectives and create an environment where harassment, hate speech, and disinformation thrive unchecked. She called for continued improvement in content moderation practices rather than a complete abandonment of safeguards.
In response to Meta’s decision, CODE has urged the South African government to develop robust regulatory frameworks similar to the European Union’s Digital Services Act, which would hold tech companies accountable for the content they host and amplify on their platforms. CODE has also called for a nationwide digital literacy campaign to empower the public to navigate the digital landscape responsibly.
As Meta prepares to roll out these changes across its major social media platforms, Facebook, Instagram, and Threads, which collectively reach over 3 billion users worldwide, the debate over content moderation and free speech in the digital age continues to intensify. It remains to be seen how the shift towards community-driven moderation will affect the online landscape and the spread of misinformation.