As reported by Nairobi News, TikTok has released its Community Guidelines Enforcement Report for Q2 2024, outlining efforts to moderate content in Kenya amid heightened government scrutiny. The report comes after a petition in Parliament last year called for a ban on the platform, citing concerns over the spread of inappropriate content. Although the Kenyan Parliament rejected the petition in September 2023, lawmakers urged TikTok to strengthen its moderation measures to address public concerns about user safety and content quality.
In response, TikTok’s report reveals that more than 360,000 videos in Kenya were removed for policy violations in Q2 2024, accounting for 0.3% of all content uploaded in the region. Notably, 99.1% of these videos were identified and taken down proactively, before any user reported them. Alongside these removals, TikTok suspended 60,465 accounts in Kenya, 57,262 of which were suspected of belonging to users under the age of 13, in line with the platform’s youth protection policies.
On a global scale, TikTok removed over 178 million videos in June 2024 alone, 144 million of them through automated detection tools. This reliance on AI-powered moderation reflects TikTok’s ongoing investment in technology that can detect and remove potentially harmful content quickly, often before any user sees it.
Looking forward, TikTok plans to continue enhancing its AI-driven moderation tools, which currently achieve a proactive content detection rate of 98.2%. The company says this investment aims to reduce harmful content across the platform, improve the user experience, and meet regulatory expectations worldwide.