
Grok Allows Sexual Abuse in “Mass Digital Undressing Spree”

Grok users sexualized real people’s photos without consent. What followed was a wave of AI-generated sexual images targeting women and children. This isn’t about innovation—it’s about how easily technology can enable abuse at scale.

January 6, 2026

Disclaimer: Fight the New Drug is a non-religious and non-legislative awareness and education organization. Some of the issues discussed in the following article relate to legislation. Though our organization is non-legislative, we fully support the regulation of already illegal forms of pornography and sexual exploitation, including the fight against sex trafficking, and safeguards to protect children.

Recently, X’s AI chatbot, Grok, has been under fire after it was used to generate sexualized images of real people, most often young women, and in several documented cases, children.

According to reporting by Reuters, BBC News, and The Guardian, users uploaded photos of women and girls into Grok and prompted the AI to digitally remove clothing or otherwise alter the images in sexualized ways, all without the subjects’ consent. The result was what journalists described as a “mass digital undressing spree,” where real people’s likenesses were turned into sexual content at scale.

The people targeted were not public figures opting into experimentation. They weren’t only celebrities, who are so often exploited this way; they included ordinary users and everyday children whose images were taken, manipulated, and sexualized without their permission.

The targets were real people.
And the responsibility does not stop with the users who typed the prompts.

What happened?

Grok, X’s AI chatbot, is being used as a type of nudify app to digitally undress people and place them in sexually compromising positions. Most of those depicted did not consent to having their images altered or shared.

Although Grok’s December update blocks fully nude requests, users have prompted it to strip countless women down to their underwear or tiny bikinis, or to reposition them. Some sources describe women depicted with oil, or even something resembling semen, on their bodies. Unsurprisingly, the majority of images featured women under 30, with some subjects as young as 5.

Related: Artificial Intelligence Helps Produce Child Sexual Abuse Material

Stranger Things actress Nell Fisher (Holly in Stranger Things Season 5) had an image of herself altered to portray her in a banana-print bikini. She is only 14 years old.

Elon Musk initially responded to the flood of outcry over the disturbing images with a string of laughing/crying emojis. He eventually said, “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.”

However, we’re not sure that’s been the case.

Grok apologized publicly, too, if it can even be called an apology, given that Grok is not a person. The post features the AI chatbot apologizing on its own behalf, stating that its creation of portrayals of young girls aged 12-16 was a “failure in safeguards.”

Nudify apps have been around for a while, and deepfake technology keeps improving, leaving more and more victims to suffer the consequences. This mass exploitation was more than a failure in safeguards; this was Grok allowing sexual abuse to thrive on its platform.

Nonconsensual Sexualized Imagery Is Sexual Abuse—Even If It’s AI-Generated

Creating sexualized images of someone without their consent—whether through deepfakes, “nudifier” tools, or AI image manipulation—is a recognized form of image-based sexual abuse.

Survivors often report anxiety, depression, fear for their safety, reputational damage, and long-term psychological harm (Citron, D. K. (2019). Sexual privacy. Yale Law Journal, 128(6), 1870–1960).

And when the targets are children, the harm is even more severe. Sexualized images of minors—real or simulated—are widely recognized as child sexual abuse material because they sexualize a child’s likeness and contribute to demand for exploitation (Europol. (2023). Facing reality? Law enforcement and the challenge of deepfake technology. Europol Innovation Lab).

Sexual abuse does not require touch.


This Technology Didn’t Appear Overnight—The Risk Was Known

AI “nudifier” tools have existed for years. Experts have repeatedly warned that image-generation technology would be used to sexually exploit women and children if meaningful safeguards weren’t in place (Chesney, R., & Citron, D. K. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107(6), 1753–1820. https://doi.org/10.15779/Z38RV0J9S9).

Related: A Terrifying Deepfake Site Can “Nudify” Images of Women with One Click

What makes Grok different—and more dangerous—is scale.

By integrating an image-generating AI directly into a major social media platform, X dramatically lowered the barrier to creating sexual abuse imagery. No specialized software. No technical skill. Just a photo and a prompt.

When access becomes effortless, abuse multiplies.

Calling This a ‘Safeguard Failure’ Minimizes Harm

After backlash, Grok issued an apology, describing the incident as a “failure in safeguards.” But language like that reframes sexual exploitation as a technical inconvenience rather than a human rights issue.

A system that repeatedly generates sexualized images of real people when prompted is not neutral.
It is functioning exactly as it was allowed to function.

French, Indian, and Malaysian authorities have since launched investigations into whether Grok violated laws related to obscenity and child protection, signaling that governments recognize the seriousness of this harm.

Responsibility Doesn’t End With the User

Yes, individuals who use Grok to sexualize others are responsible for their actions. But platforms and developers are not passive bystanders. If the technology didn’t exist, neither would the abuse.

When companies deploy powerful tools without robust consent protections, age safeguards, enforcement mechanisms, or rigorous testing, especially on platforms already struggling with abuse moderation, they share responsibility for the outcomes. They give perpetrators the power and tools to abuse.

Foreseeable harm that isn’t prevented is still harm.

When speed, engagement, and innovation are prioritized over safety, the cost is paid by victims—not companies.

This Is Part of a Larger Pattern of Sexual Exploitation

At Fight the New Drug, we’ve consistently shown how new technologies often replicate the same dynamics seen in the pornography industry:

  • Women’s bodies treated as content
  • Consent treated as optional
  • Sexual entitlement normalized
  • Children left unprotected

AI-generated sexual imagery doesn’t exist in a vacuum. It reinforces a pornified culture that already normalizes sexual objectification and voyeurism—especially toward women and girls.

When abuse is automated, scaled, and normalized, the number of victims skyrockets.

If platforms are serious about protecting users, accountability must go beyond apologies and policy updates.

That includes:

  • Preventing the sexualization of real people without explicit consent
  • Enforcing zero tolerance for sexualized images of minors
  • Designing AI systems that default to human dignity, not exploitation
  • Removing abusive content quickly and transparently
  • Cooperating fully with investigations and survivor-led takedown efforts

What You Can Do

If you come across nonconsensual or sexually abusive content:

  • Report it immediately on the platform
  • Do not engage or share, even to criticize
  • Support survivors by believing them and amplifying resources

If you or someone you know has been targeted by image-based sexual abuse, there are organizations that can help with takedowns, legal options, and emotional support.

Technology does not get a free pass simply because the harm happened behind a screen. When AI tools enable sexual exploitation, the companies that build and deploy them must be held accountable. Reject sexual exploitation and abuse. Say no to porn.

Your Support Matters Now More Than Ever

Most kids today are exposed to porn by the age of 12. By the time they’re teenagers, 75% of boys and 70% of girls have already viewed it (Robb, M. B., & Mann, S. (2023). Teens and pornography. San Francisco, CA: Common Sense), often before they’ve had a single healthy conversation about it.

Even more concerning: over half of boys and nearly 40% of girls believe porn is a realistic depiction of sex (Martellozzo, E., Monaghan, A., Adler, J. R., Davidson, J., Leyva, R., & Horvath, M. A. H. (2016). “I wasn’t sure it was normal to watch it”: A quantitative and qualitative examination of the impact of online pornography on the values, attitudes, beliefs and behaviours of children and young people. Middlesex University, NSPCC, & Office of the Children’s Commissioner). And among teens who have seen porn, more than 79% use it to learn how to have sex (Robb, M. B., & Mann, S. (2023). Teens and pornography. San Francisco, CA: Common Sense). That means millions of young people are getting sex ed from violent, degrading content, which becomes their baseline understanding of intimacy. Among the most popular porn, 33%-88% of videos contain physical aggression and nonconsensual, violence-related themes (Fritz, N., Malic, V., Paul, B., & Zhou, Y. (2020). A descriptive analysis of the types, targets, and relative frequency of aggression in mainstream pornography. Archives of Sexual Behavior, 49(8), 3041-3053. https://doi.org/10.1007/s10508-020-01773-0; Bridges, A. J., et al. (2010). Aggression and sexual behavior in best-selling pornography videos: A content analysis. Violence Against Women).

From increasing rates of loneliness, depression, and self-doubt, to distorted views of sex, reduced relationship satisfaction, and riskier sexual behavior among teens, porn is impacting individuals, relationships, and society worldwide (Fight the New Drug. (2024, May). Get the facts [series of web articles]. Fight the New Drug).

This is why Fight the New Drug exists—but we can’t do it without you.

Your donation directly fuels the creation of new educational resources, including our awareness-raising videos, podcasts, research-driven articles, engaging school presentations, and digital tools that reach youth where they are: online and in school. It equips individuals, parents, educators, and youth with trustworthy resources to start the conversation.

Will you join us? We’re grateful for whatever you can give—but a recurring donation makes the biggest difference. Every dollar directly supports our vital work, and every individual we reach decreases sexual exploitation. Let’s fight for real love: