Australia Cracks Down on AI Nudify Tools: The Digital Duty of Care Revolution

  • By Cole
  • Sept. 14, 2025, 7:50 a.m.

Australia's Bold Move Against AI Nudify Tools

In a decisive move, the Australian government has unveiled plans to ban "nudify" tools that use artificial intelligence to fabricate nude images. The initiative is part of a broader strategy to establish a "digital duty of care" online, which would require tech firms to proactively address online harms and take meaningful steps to curb access to these tools.

But how will Australia make this ban effective? The government is considering several strategies, including geoblocking access to nudify websites and collaborating with tech platforms to limit exposure to these apps. The goal is clear: protect individuals from fabricated images that can cause severe emotional and reputational harm.

Inside the World of Nudify Tools

Nudify tools are easily accessible through app stores and websites, using AI to create realistic, albeit fake, sexually explicit images from everyday photos. The alarming ease of access and use is a major concern, as these images can lead to bullying, harassment, and more severe consequences like anxiety and self-harm.

In mid-2024, legal actions highlighted the scale of the issue: one lawsuit revealed that 16 popular nudify sites had amassed over 200 million visits. A subsequent study found these platforms were earning millions of dollars, supported by tech services provided by major companies like Google and Amazon.

“This is not just about technology; it's about ensuring the safety and dignity of individuals in the digital space,” said a spokesperson for the eSafety Commissioner.

Legal Hurdles and Tech Company Efforts

While sharing non-consensual deepfake images of adults is illegal across most of Australia, creating those images is not universally criminalized. Laws are stricter when it comes to minors, with severe penalties for generating or sharing deepfake material depicting children.

Despite the legal gray areas, some tech companies are already moving against these tools. Apple and Discord have removed nudify apps from their platforms, while Meta has filed lawsuits against companies that violate its policies on non-consensual imagery. These actions signal a growing sense of corporate responsibility to tackle the issue head-on.

Beyond Bans: The Role of Education

However, government restrictions and corporate responsibility alone won't eliminate the threat of nudify tools. Essential to this fight is comprehensive education for young people about digital privacy, rights, and respectful online interactions. This education should focus on affirmative consent and the critical assessment of digital content.

Effective bystander interventions and robust support systems for victims are also crucial. These initiatives will equip individuals to challenge harmful behaviors and provide much-needed support to those affected by deepfake abuses.

Author: Cole

Cole covers the infrastructure of the creator economy: OnlyFans, Fansly, Patreon, and the rules that move money. Ex-fact-checker and recovering musicologist, he translates ToS changes, fees, and DMCA actions into clear takeaways for creators and fans. His column Receipts First turns hype into numbers and next steps. LA-based; sources protected; zero patience for vague PR.