A recent report from the human rights group Global Witness raises concerns about TikTok’s recommendation algorithm, which it says directed sexually explicit content to accounts registered as belonging to children. Researchers set up four accounts purporting to belong to 13-year-olds and activated the safety settings intended to limit exposure to mature content. Despite these measures, the accounts still received recommendations for explicit sexual material.
The study found that search terms leading to such content appeared in the app’s “you may like” suggestions, even though the accounts had not searched for anything. The recommended content reportedly included videos of women in sexually suggestive scenarios and, in some cases, explicit pornography. Researchers noted that those posting this material embed it within otherwise innocuous videos to evade moderation. According to Global Witness, the findings highlight TikTok’s failure to restrict inappropriate content effectively.
In response, TikTok stated that it prioritizes providing safe and age-appropriate experiences for its users, and said it took immediate action after being alerted to the issues. However, follow-up investigations found that similar sexual content was still being recommended to the test accounts.
The report also underscores the legal context surrounding online content for minors. Since 25 July 2025, the Online Safety Act’s Children’s Codes have imposed obligations on platforms to prevent minors from accessing pornography and other harmful content. Global Witness conducted its research after these codes came into effect, underlining the case for regulatory intervention.
Comments left by other users on the recommended videos suggested they were similarly confused about why such sexualized content was being surfaced. This raises broader questions about TikTok’s algorithm and its effectiveness in safeguarding young users from inappropriate material.
Source: https://www.bbc.com/news/articles/c708v7qkeg1o?at_medium=RSS&at_campaign=rss

