
TikTok's reputation has taken another hit: a recent investigation by Global Witness found that the social media platform's algorithm exposes teenagers to inappropriate content. Even with "restricted mode" activated, a setting intended to filter out adult material, young users were still served sexual content tailored to their searches. The findings, reported by UNN, point to a significant failure in TikTok's parental control measures.
The investigation centered on fake accounts registered with an age of 13. Despite having restricted mode switched on, these accounts quickly encountered explicit content in the "you might like" suggestions, pointing to a systemic problem with how the platform moderates content visibility for younger audiences.
“TikTok must ensure that its platform prioritizes children's well-being, especially when offering safety features that parents and children rely on,” the Global Witness study emphasizes.
Global Witness representatives said they were shocked by the findings and called on TikTok to overhaul its approach to child safety. With a vast and growing base of young users, the platform has a responsibility to provide a secure environment for its audience, and the investigation has prompted calls for more robust measures to prevent exposure to harmful content.
In related news, UNN previously reported concerns about AI-generated videos on TikTok spreading misleading narratives about peace and conflict. This adds another layer of complexity to the content moderation challenges facing the platform and raises further questions about the effectiveness of its current systems.