Meta Under Fire For Alleged Safety Failures On Instagram

  • By Imani
  • Nov. 30, 2025, 6:05 a.m.

Meta's Alleged "17x" Policy Under Scrutiny

In a dramatic turn, a new legal filing accuses social media giant Meta of enforcing a controversial "17x" policy on Instagram. Under the alleged policy, accounts used for sex trafficking could rack up 16 violations and face suspension only on the 17th strike. The filing claims Meta, owner of Facebook and Instagram, compromised children's safety to boost user engagement and profits.

The lawsuit, which also names tech giants Google, Snap, and TikTok, accuses the companies of deliberately addicting children to their platforms despite knowing the harm. School districts, states including California, parents, and students make up the coalition bringing the accusations.

Companies React to Allegations

The accusations have sparked significant backlash, with Meta denying all claims. A Meta spokesperson stated, "We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture." Despite these denials, the filing lays out further claims of Meta's failure to heed warnings about the harmful impact of its platforms on minors.

“Despite earning billions of dollars in annual revenue – and its leader being one of the richest people in the world – Meta simply refused to invest resources in keeping kids safe,” claimed the Friday filing in Oakland’s U.S. District Court.

Snap took a similar defensive stance, highlighting its platform's unique features that don't rely on likes or comparisons. A representative stated, "We’ve built safeguards, launched safety tutorials, partnered with experts, and continue to invest in features and tools that support the safety, privacy, and well-being of all Snapchatters." Meanwhile, Google and TikTok have yet to provide any public comment on the lawsuit.

Internal Reports and Safety Measures

According to the filing, internal reports revealed worrying lapses in safety measures. For instance, an Instagram feature allegedly recommended nearly 2 million minors to adults with predatory intentions in 2023. Additionally, the document states that a Meta internal audit found more than a million inappropriate recommendations of adults to teen users on a single day in 2022.

The legal document criticizes Meta's reluctance to enhance protective measures despite posting a $62.4 billion profit last year. It accuses the company of delaying privacy settings designed to shield children from adult predators because of the settings' potential impact on user engagement metrics.

The Larger Picture: Impact on Mental Health and Education

The lawsuit also raises concerns over the broader impact of social media on children's mental health and educational environments. It claims platforms like Instagram and Facebook have deeply infiltrated classrooms, causing distractions and contributing to a national youth mental health crisis. Meta's own studies reportedly highlighted the negative emotional effects of social media, but these insights were allegedly sidelined in favor of maintaining user engagement.

Former Meta Vice President Brian Boland backed up these claims, stating in a deposition, "My feeling then and my feeling now is that they don't meaningfully care about user safety." His testimony adds another layer to a multifaceted lawsuit that continues to reverberate across the tech and education sectors.

Author: Imani

Imani follows the money: payouts, contracts, lawsuits, and platform enforcement. With a background in entertainment PR and paralegal work, she breaks complex stories into plain-English playbooks for creators. Her series Follow the Money connects drama to data - who benefits, who pays, and what to do next. Calm, sourced, and courtroom-ready; DTLA is her second office.