Trigger warning: This article contains graphic depictions of sexual abuse. Reader discretion is advised.
Over the course of a week, Evie, 22, discovered more than 100 digitally altered, sexualized images of herself online, including one in which she appeared fully naked.
Using Grok, xAI's chatbot, users have created thousands of nonconsensual deepfake pornography images and plastered them across the internet.
With just a simple prompt entered into Grok, users have removed clothing, digitally altered real bodies, and placed women and children in sexually explicit positions and scenarios.
Upon seeing the realistic-looking images of herself, Evie tells The Independent, “I’m just so shocked there are people out there who can do this – and that there are so many people who will defend it and come up with excuses for this when it’s blatantly a huge violation.”
That sense of violation is echoed by others who say they’ve been targeted in similar ways.
For Kendall Mayes, the realization came while casually scrolling through X one weekend afternoon.
She noticed users prompting Grok to digitally undress women in photos, transforming everyday images into sexualized ones. Mayes uses X to share photos with friends and follow the news. She never thought it would happen to her.
Then it did.
A stranger, using an old photo of her, asked Grok to dress her in a “tight, clear, transparent bikini.” Her white top vanished, replaced by a sheer bikini that made her look almost completely naked. The waistband of her jeans and belt blurred into thin, translucent strings. It wasn’t her body, but it looked like it.
Shocked and angry, Mayes blocked the account, hoping the situation would end there. Instead, it escalated.
Soon, her replies were flooded with additional AI-generated images of her body, rendered in see-through swimwear and skin-tight latex outfits. Many appeared to come from anonymous accounts targeting other women as well. While some accounts were eventually suspended, several altered images of Mayes remained publicly visible on X at the time of publication.
“It’s something I would just never hope on anyone,” Mayes says. “Even my enemies.”
For Emma, an ASMR content creator and Grok deepfake victim, the issue feels systemic.
“Women are being asked to give up their bodies whenever they post a photo of themselves now,” she says. “Anything they post now has the ability to be undressed, in any way a person wants, and they can do whatever they want with that photo of you.”
Ashley St. Clair didn’t go looking for pornographic images of herself either. The writer and public figure learned that sexually explicit images bearing her face were being generated and shared online by users of an AI chatbot. The images weren’t real — but they were convincing enough to cause immediate harm.
Some, she later alleged in court filings, depicted her in degrading sexual scenarios, including “unlawful images of her in sex positions, covered in semen, virtually nude, and images of her as a child naked.”
“I was disgusted and horrified,” she tells the Guardian.
One image was especially disturbing. “I felt horrified, I felt violated, especially seeing my toddler’s backpack in the back of it,” St. Clair said of an image in which she had been put into a bikini, turned around, and bent over.
“It’s another tool of harassment. Consent is the whole issue,” she says. “People are saying, well, it’s just a bikini, it’s not explicit. But it is a sexual offence to non-consensually undress a child.”
She says she has also seen images that go far beyond swimwear. “I am also seeing images where they add bruises to women, beat them up, tie them up, mutilate them. These sickos used to have to go to the dark depths of the internet, and now it is on a mainstream social media app.”
Among the prompts she documented were requests to depict her as a “14-year-old stripped into a string bikini” and another to “put the girl in a bikini made out of floss.”
St. Clair repeatedly asked for the content to stop. Instead, it continued.
In her lawsuit against xAI, the company behind Grok, she described the experience as “a nightmare that will never stop,” explaining to the Associated Press that once explicit deepfakes exist, victims lose control over where their image travels or how it is used.
What made the experience especially devastating, she said, was the sense that her identity had been hijacked — repurposed into sexual content she never consented to, couldn’t fully remove, and might never escape.
And she is just one of the many women exploited through deepfake pornography.
Different Lives, Same Pattern of Harm
These women came from various backgrounds, geographic locations, and life experiences, but the harm they describe is strikingly similar.
In addition to the initial shock, disgust, panic, and trauma of seeing themselves depicted doing something they never did, these women report a sudden loss of control over their identity, fear of reputational damage, and anxiety over not knowing who had seen the images or who might see them next.
Researchers describe this as image-based sexual abuse, a category that includes nonconsensual deepfake pornography and is associated with long-term psychological distress, social withdrawal, and feelings of sexual violation (Henry, N., Flynn, A., & Powell, A. (2024). Image-based sexual abuse: A critical review of harms, victim experiences, and responses. Trauma, Violence, & Abuse, 25(1), 1–15. https://doi.org/10.1177/15248380231123456).
Despite the images being “fake,” victims consistently report that the impact feels deeply real, particularly because the content is sexual, public, and persistent (Citron, D. K., & Chesney, R. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107(6), 1753–1820. https://doi.org/10.15779/Z38RV0D15J).
The Deepfake Explosion
These stories are unfolding against a backdrop of rapid escalation.
In this most recent Grok/X scandal alone, an estimated 1.8 to 3 million nonconsensual sexualized images were created on the platform in less than a month.
SQ Magazine reports that deepfake files are projected to reach 8 million in 2025, up from 500,000 in 2023.
The ability to create manipulated imagery and video has existed for a while (think Photoshop), but AI now makes such content far easier to create and far more realistic, and there is little stopping anyone from generating and sharing it.
As Megan Cutter, the chief of victim services for the Rape, Abuse & Incest National Network, tells Rolling Stone, “It’s not that abuse is new; it’s not that sexual violence is new. It’s that this is a new tool, and it allows for proliferation at a scale that I don’t think we’ve seen before — and that I’m not sure we’re prepared to navigate as a society.”
Even children are increasingly at risk.
According to the Childlight Global Child Safety Institute, reports of AI-generated child sexual abuse material increased by more than 1,300% from 2023 to 2024, rising from roughly 4,700 reports to over 67,000 in a single year (Childlight Global Child Safety Institute. (2025). Global threats report: The rise of AI-generated child sexual abuse material. University of Edinburgh).
While not all deepfake victims are minors, experts warn that the same tools used to target adults are increasingly being used to exploit children (American Academy of Pediatrics. (2025). Artificial intelligence–generated sexual images of minors: Clinical, legal, and ethical considerations).
Legal systems around the world are struggling to keep up. Survivors often face slow takedowns, jurisdictional barriers, and platforms that lack clear accountability mechanisms (Citron, D. K., & Chesney, R. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107(6), 1753–1820. https://doi.org/10.15779/Z38RV0D15J). And once deepfake porn exists, the burden of stopping it often falls on the victim, not on the technology or the people who misuse it.
Megan Cutter says it’s almost impossible to stop the image from circulating once it’s created, and it leaves lasting wounds.
“Once the image is created, even if it’s taken down from the place where it was initially posted, it could have been screenshotted, downloaded, shared,” Cutter says. “That’s a really complex thing for people to grapple with.”
If you are a survivor of deepfake pornography, Cutter suggests screenshotting and recording everything: every place you see the image, everything you know about its creation, and where it was sent. She also recommends filing a report with local law enforcement and using tools like StopNCII.org, a free Revenge Porn Helpline tool that can help find and remove nonconsensual intimate images.
The Reality Behind the Screen
Deepfake pornography is often discussed as a future risk or a technological curiosity. But for survivors like Evie, Kendall Mayes, Emma, and Ashley St. Clair, and for countless other women, it is already a lived reality, one that reshapes how they move through the world, how safe they feel online, and how much control they have over their own bodies and identities.
A reminder: Image-based sexual abuse is sexual abuse.
Creating nonconsensual intimate deepfakes is sexual abuse.
These survivor stories make one thing clear:
When AI is used to sexually exploit someone without consent, the damage doesn’t stay digital.
Your Support Matters Now More Than Ever
Most kids today are exposed to porn by the age of 12. By the time they’re teenagers, 75% of boys and 70% of girls have already viewed it (Robb, M. B., & Mann, S. (2023). Teens and pornography. San Francisco, CA: Common Sense), often before they’ve had a single healthy conversation about it.
Even more concerning: over half of boys and nearly 40% of girls believe porn is a realistic depiction of sex (Martellozzo, E., Monaghan, A., Adler, J. R., Davidson, J., Leyva, R., & Horvath, M. A. H. (2016). “I wasn’t sure it was normal to watch it”: A quantitative and qualitative examination of the impact of online pornography on the values, attitudes, beliefs and behaviours of children and young people. Middlesex University, NSPCC, & Office of the Children’s Commissioner). And among teens who have seen porn, more than 79% use it to learn how to have sex (Robb, M. B., & Mann, S. (2023). Teens and pornography. San Francisco, CA: Common Sense). That means millions of young people are getting sex ed from violent, degrading content, which becomes their baseline understanding of intimacy. Among the most popular porn videos, 33% to 88% contain physical aggression and nonconsensual violence-related themes (Fritz, N., Malic, V., Paul, B., & Zhou, Y. (2020). A descriptive analysis of the types, targets, and relative frequency of aggression in mainstream pornography. Archives of Sexual Behavior, 49(8), 3041–3053. https://doi.org/10.1007/s10508-020-01773-0; Bridges, A. J., et al. (2010). Aggression and sexual behavior in best-selling pornography videos: A content analysis. Violence Against Women).
From increasing rates of loneliness, depression, and self-doubt to distorted views of sex, reduced relationship satisfaction, and riskier sexual behavior among teens, porn is impacting individuals, relationships, and society worldwide (Fight the New Drug. (2024, May). Get the facts [series of web articles]. Fight the New Drug).
This is why Fight the New Drug exists—but we can’t do it without you.
Your donation directly fuels the creation of new educational resources, including our awareness-raising videos, podcasts, research-driven articles, engaging school presentations, and digital tools that reach youth where they are: online and in school. It equips individuals, parents, educators, and youth with trustworthy resources to start the conversation.
Will you join us? We’re grateful for whatever you can give—but a recurring donation makes the biggest difference. Every dollar directly supports our vital work, and every individual we reach decreases sexual exploitation. Let’s fight for real love:



