TikTok Removes Over 900,000 Videos in South Africa for Violating Guidelines
TikTok, the popular short-form video platform, revealed in its Q3 Community Guidelines Enforcement Report that it removed more than 900,000 videos in South Africa for violating its Community Guidelines.
Of the 928,334 videos removed in South Africa, 83.6% were taken down within 24 hours. Globally, TikTok's proactive detection rate now stands at 98.2%, with over 97.9% of violating content removed before any community member reported it.
Most Enforced Policies
- Regulated Goods and Commercial Activities: 96.9% of removals happened before any user report
- Sensitive and Mature Themes: 99.5% of removals happened before any user report
- Mental and Behavioral Health: 99.8% of removals happened before any user report
Fake Likes and Engagements
Additionally, TikTok removed over one billion fake likes and more than 350 million fake engagements globally, and prevented the creation of over 347 million fake accounts using AI during the same period.
With millions of pieces of content posted to TikTok every day, the platform continues to invest in technologies that improve content understanding and risk assessment so that harmful content can be removed before it reaches viewers.
Safety and Integrity
TikTok remains committed to inspiring creativity and bringing joy while prioritizing the safety and well-being of its South African community. The platform employs dedicated trust and safety professionals and leverages advanced technology to ensure compliance with its Community Guidelines, Terms of Service, and Advertising Policies.
As other tech companies face scrutiny over the content shared on their platforms, TikTok continues to put transparency and platform safety at the forefront. With ongoing developments such as the 30-day reprieve granted by US President Donald Trump, it will be interesting to see how the platform evolves.
You can access the full report here.
Misinformation on Other Platforms
Meta, the parent company of Facebook and Instagram, recently announced changes to its fact-checking program. The new Community Notes system allows users to flag potentially misleading or false information in posts and add context for clarity.
However, the decision to end the third-party fact-checking program drew criticism from a South African non-profit organization, which highlighted the potential implications of the move.
As tech companies navigate the challenges of misinformation and content moderation, TikTok and other platforms continue to adapt and enhance their approaches to ensure a safe and engaging online environment for users.