Mara Wilson, remembered fondly by many from her iconic role in Matilda, is voicing serious concerns about the impact of deepfake technology on young actors today. In a candid interview with The Guardian, Wilson highlighted the dangers these digital manipulations pose, recalling her own unsettling experiences with image misuse and exploitation.
Wilson, whose image was used to create child sexual abuse material (CSAM) when she was a child, warns that deepfake technology compounds these harms, threatening the safety and careers of emerging actors. "Before I was even in high school, my image had been used for child sexual abuse material," she shared, stressing how easily young stars become targets online.
"Once I was an adult, I worried about the other kids who had followed after me. Were similar things happening to the Disney stars, the Stranger Things cast, the preteens making TikTok dances and smiling in family vlogger YouTube channels?"
Wilson's fears aren't unfounded: image-generating software is now widely accessible, meaning almost anyone can exploit the technology, and young performers are especially vulnerable.
The threat isn't limited to dedicated deepfake websites. According to The 19th News, Grok, Elon Musk's AI tool built into X (formerly Twitter), gained image-generating capabilities in late 2025. The tool has since been misused to create explicit, non-consensual images, including fake images of women and children, sparking significant backlash.
Despite reassurances from X's safety team that illegal content would be dealt with severely, concerns persist. The platform's decision to make Grok's features available only to paid users has drawn criticism, as experts argue the paywall could be seen as profiting from potential abuse.
Amid these growing concerns, legislative efforts like the Defiance Act aim to hold platforms accountable. As reported by The 19th News, Senator Dick Durbin, a co-sponsor, criticized X for its inadequate response to harmful content. "Even after these terrible, deepfake, harming images are pointed out to Grok and to X, formerly Twitter, they did not respond," he stated, advocating for the act's necessity to ensure accountability and justice for victims.
Introduced with bipartisan support, the Defiance Act, along with the Take It Down Act, aims to provide a framework for tackling deepfake abuse. Omny Miranda Martone, CEO of the Sexual Violence Prevention Association, expressed optimism that these legislative moves will help mitigate the emotional and financial damage done to victims.
As the conversation around deepfake dangers grows, Wilson and many others hope for legal protections to keep young creators safe in the digital age.