Instagram's Reels algorithm is under intense scrutiny after a recent report found that young users are being recommended sexually explicit and harmful videos more frequently than disclosed. The unsettling findings stem from two experiments conducted by the Wall Street Journal and Laura Edelson, a computer science professor at Northeastern University. Over seven months, they created new accounts registered as minors that engaged with Instagram Reels, skipping standard content to focus on racier material. Within just 20 minutes, these accounts were inundated with promotions for "adult sex-content creators" and offers for nude photos.
Despite these revelations, Instagram maintains that accounts tagged as minors are subject to the platform's strictest content-control measures. Meta spokesperson Andy Stone responded to the report, stating, "This was an artificial experiment that doesn't match the reality of how teens use Instagram." He added: "As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months."
The findings echo past internal reports from Meta revealing the algorithm's tendency to push more explicit content to younger audiences than to adults. Similar tests on TikTok and Snapchat did not produce the same recommendation patterns, suggesting the problem is specific to Instagram.
This comes amid broader industry concerns over social media's role in facilitating online child exploitation. The November report highlighted Instagram's algorithm recommending explicit content even to adult users following child accounts. Meanwhile, a February investigation unearthed troubling trends of exploitative parents and adult users monetizing children's images on Instagram. In response, Meta has faced multiple lawsuits, including one in December accusing the company of creating a "marketplace for predators." In 2023, Meta launched a child safety task force and introduced new safety tools geared towards curbing these issues.
Elsewhere, Meta's competitor X has also adjusted its adult content policies, now permitting properly labeled adult content while promising to shield underage users from such material. Notably, X does not specify penalties for mislabeling, raising further concerns about content moderation across social platforms.