How would you feel if we told you that someone could pull a fully clothed photo of you from the internet and digitally “undress” it in seconds, all without your permission?
And on top of that, they could share a link to that now-nude photo with anyone in the world: your crush, your mom, your grandmother, even your whole school?
If the experiences of deepfake survivors are any indication, that kind of image-based sexual abuse would be traumatizing to anyone and could lead to a whole host of severe and disruptive mental health effects.
Here’s the worst part, though: this deepfake technology exists today, and it is being used against everyday people, particularly women.
What is a “deepfake”?
Before we discuss the disturbing technology we described above, let’s talk about deepfakes.
Deepfakes are a digital phenomenon that has emerged over the last few years. A “deepfake” is a digital manipulation of a real person’s face or voice, created using deep learning technology that “learns” from existing data, such as photos and videos of that person.
That data can be used to manufacture fake audio, images, or videos of someone, making them appear to do or say things they never actually did or said in real life.
In this YouTube video, you’ll see several examples of deepfakes: Elon Musk’s face replacing Charlie Sheen’s, Mike Tyson’s face replacing Oprah’s, and Kevin Hart’s face superimposed on Mr. T’s, among others.
The video shows the fun and relatively harmless side of deepfakes, but the fact of the matter is that deepfakes raise serious ethical concerns. When researchers at AI cybersecurity company Deeptrace searched the internet in 2019, they found that of the almost 15,000 deepfakes online, 96% were pornographic.
In other words, thousands of sexually explicit videos and images feature people who never consented to appear in them.
Which brings us to the deepfake tool that has recently been making the rounds and causing concern.
The deepfake tool undressing thousands
In 2020, a deepfake website launched, claiming to have developed its own “state of the art” deep-learning image-translation algorithms to “nudify” female bodies.
In other words, any user can upload a photo of a clothed female or female-presenting subject, and the site will “undress” the photo for them.
We’re not going to name it here or include any identifying information that could lead someone to the site.
According to HuffPost, when someone uploads a photo of a female or female-presenting subject, the results are strikingly realistic, often with no glitches or visual clues that the photo is a deepfake. While the site claims not to save any uploaded or “nudified” photos, it does give the uploader a link to the nude photo that can be shared with anyone.
As of this summer, the site has already received more than 38 million visits in 2021.
And given the “referral” program through which users can publicly share a personalized link and earn rewards for each new person who clicks on it, it wouldn’t be surprising to see that number climb even higher.
Why does this tool exist, and is anyone trying to address it?
It’s clear this kind of site can cause immeasurable harm to anyone depicted—namely women in this case—so how is it possible that it’s still up and running? Is anyone trying to do something about it?
The first question is easy to answer: demand. The millions and millions of hits the site has received in a matter of months show that plenty of people want to use it.
And where there’s demand, there’s money. Site users are normally only able to “nudify” one picture for free every two hours. Alternatively, they can pay a cryptocurrency fee to skip the wait time for a specified number of additional photos.
The second question is a bit more difficult to answer. After HuffPost reached out to the site’s original web host, the host terminated its services and the site briefly went offline. However, it was back up less than a day later with a new host.
It’s currently unknown who operates the site; the operators didn’t respond to repeated interview requests from several media outlets.
Regardless of who the creators are, it’s clear who the consumers are: the U.S. has reportedly been by far the site’s leading source of traffic.
U.S. lawmakers have shown little concern about abusive deepfakes beyond their potential to cause political problems. Social media companies have shown even less, responding slowly to complaints about nonconsensual pornographic content (real or fake), for which they tend to face no liability.
Unfortunately, United Kingdom-based deepfake expert Henry Ajder calls the situation “really, really bleak” because the tool’s “realism has improved massively” and because the “vast majority of people using these [tools] want to target people they know.”
Weaponizing deepfake technology against celebrities and influencers has victimized many—and now, everyday women have become targets in unthinkable ways.
Why this matters
What it comes down to is this: pornographic deepfakes are nonconsensual porn.
Deepfake pictures and videos are created without the victim’s consent, knowledge, or input. The victim has no control over what they appear to do or say, or over who sees and shares the content. This is a very serious offense.
Victims of image-based sexual abuse, including nonconsensual porn like deepfakes, can face catastrophic emotional consequences. According to one qualitative study of revenge porn survivors, participants experienced many disruptive mental health issues after victimization that affected their daily lives. Participants generally turned to negative coping mechanisms, such as denial and self-medicating, in the period soon after they were victimized.
Nearly all participants experienced a general loss of trust in others, and many experienced more severe and disruptive effects: the study’s inductive analysis revealed posttraumatic stress disorder (PTSD), anxiety, depression, suicidal thoughts, and several other mental health impacts.
Studies like these clearly show that deepfakes are degrading, humiliating, and potentially life-ruining. There is no place for websites that create them.
Click here to learn what you can do if you’re a victim of revenge porn or deepfake porn.
Support this resource
Thanks for reading our article! Fight the New Drug is a registered 501(c)(3) nonprofit, which means the educational resources we create are made possible through donations from people like you. Join Fighter Club for as little as $10/month and help us educate on the harms of porn!