OpenAI CEO Sam Altman and Elon Musk have sparked conversations about AI's role in creating adult content. Altman wants an “adult version” of ChatGPT, while Musk backs Grok generating R-rated material. A recent study from George Mason University adds fuel to the debate, finding that more than half of U.S. teens surveyed have created or received AI-generated nude images.
The study, published in the journal PLOS ONE by Chad Steel, surveyed 557 U.S. teens aged 13 to 17, with parental consent. The anonymous responses reveal a troubling use of AI-powered nudification apps.
The data is unsettling: 55.3% of teens reported creating or receiving AI-generated nude images of themselves or others. Even more disturbing, 36.3% reported having their images generated without consent, and 33.2% reported having such images shared without permission. It’s a clear indication that AI tools are circumventing traditional notions of consent and privacy.
Across demographics, male teens were more likely to create and share these images, both with and without consent. The findings signal a shift in digital behavior, with potential long-term consequences for young people's lives.
Historically, creating or sharing intimate images required the subject's participation, but AI nudification tools are changing the rules. Anyone with a photo and access to these apps can create fake nudes without the subject's knowledge. The repercussions mirror those of child exploitation material, causing dehumanization and ongoing personal harm.
“Teens are no longer just digital natives but AI-natives. ‘Nudification’ is their new ‘sexting,’ with more complex consent issues,” says Steel.
The hope is that this data will drive lawmakers and educators to address these digital dangers before they escalate further.