Instagram Fails Teens: New Report Says Meta's Safety Tools Fall Short

  • By Cole
  • Oct. 5, 2025, 8:05 a.m.

Instagram’s Safety Tools Under Fire

Despite ongoing scrutiny from lawmakers, lawsuits, and researchers, Instagram’s safety measures for teens have been labeled "woefully ineffective" in a report by former employee and whistleblower Arturo Bejar alongside four nonprofit groups. The findings suggest Meta, Instagram’s parent company, is failing to genuinely address these pressing concerns, opting instead for headline-grabbing updates that fall short of meaningful change.

Meta has responded defensively to these criticisms, arguing that the report misrepresents its efforts to make Instagram safer for teenagers. According to Meta, its Teen Accounts offer industry-leading protections and parental controls, though the report evaluates 47 of 53 safety features as ineffective or non-existent. The report emphasizes the difference between improving design for safety and censoring content, stressing that accountability is crucial in safeguarding young users.

The Report’s Stark Findings

The report, published by Bejar with the support of Cybersecurity For Democracy at New York University, Northeastern University, and nonprofits like the Molly Rose Foundation, Fairplay, and ParentsSOS, details significant gaps in Instagram's safety net. Out of 53 safety features, only eight were found to effectively protect teens without limitations. This is a critical issue as Instagram's very design – not its content moderation – is under scrutiny.

“Holding Meta accountable for deceiving young people and parents about how safe Instagram really is, is not a free speech issue,” the report insists.

Despite Instagram’s promises to restrict adult contact with minors and hide inappropriate content, the report found that children can still be exposed to harmful interactions and content through various Instagram features. Alarmingly, the report points out that minors could still receive unwanted sexual advances with no effective way to report these incidents to Meta.

Meta’s Defense and Recommendations for Improvement

Meta has dismissed the report as "misleading" and "dangerously speculative," arguing that it judges the company's tools against capabilities Meta never promised. For example, the company never claimed its comment-hiding feature would allow users to explain why they were hiding comments. Nevertheless, the report’s findings suggest that Instagram is not as safe as it claims, with features like disappearing messages posing risks for illicit activities.

The report also highlights disturbing algorithmic recommendations that expose teens to age-inappropriate content, including violence and self-harm. Children under 13 have reportedly been encouraged by Instagram’s algorithms to engage in sexualized behavior. To tackle these issues, the authors suggest Meta implement "red-team" testing, create simple reporting mechanisms for teens, and provide transparency by publishing data on teen experiences.

As New Mexico Attorney General Raúl Torrez continues a lawsuit against Meta for endangering children, the call for actionable change grows louder. Until significant improvements are made, the report warns that Instagram will remain a risky digital space for teens.

Cole

Cole covers the infrastructure of the creator economy - OnlyFans, Fansly, Patreon, and the rules that move money. Ex–fact-checker and recovering musicologist, he translates ToS changes, fees, and DMCA actions into clear takeaways for creators and fans. His column Receipts First turns hype into numbers and next steps. LA-based; sources protected; zero patience for vague PR.