An unsettling trend is alarming Australians as AI-powered 'nudifying' apps become increasingly prevalent. These apps can transform a harmless street photo into explicit, pornographic content in seconds. An investigation has shed light on how schoolgirls are being virtually 'undressed' and subjected to degrading, sexualised scenarios online.
Caitlin Roper from Collective Shout has voiced concerns about the unchecked proliferation of these apps, which threaten the privacy and dignity of countless individuals. Even without a social media presence, anyone's image can be captured in public spaces, such as streets and schools, and manipulated without consent.
A 'nation-leading' law recently took effect in South Australia, criminalising AI-generated deepfake images with penalties of up to $20,000 in fines or four years' imprisonment. Yet, according to Roper, these measures are insufficient because the apps remain accessible globally. Her analysis of 20 such apps highlighted the ease with which AI can be used to exploit images of women and girls.
"It's completely unregulated. What they would often have is a little checkbox, saying 'I confirm that I have consent, that I'm not using children, or that I'm not violating local laws'," Roper stated. "But obviously, these sites intend to undress, degrade, and humiliate."
Alarmingly, the apps are designed primarily to target images of women and girls, and are often marketed towards men and boys. Some platforms even offer users financial incentives to invite others, spreading their reach further.
While Victoria and South Australia have specific laws against the non-consensual creation and distribution of deepfake content, activists argue these focus too heavily on distribution over creation. Dr. Asher Flynn, from the Australian Research Council Centre of Excellence for the Elimination of Violence Against Women, advocates for the act of creation itself to be criminalised.
Roper suggests that a coordinated global effort could curb this menace, proposing measures such as geo-blocking and restrictions imposed by digital platforms. "We do have some laws in Australia," she noted, "but they're not working as intended."
For anyone affected by these issues or seeking support, resources are available through the Australian Centre to Counter Child Exploitation. In emergencies, contact Triple Zero (000) or local authorities.