Cover screenshot from a deepfakes video featuring images of Emma Watson, retrieved from BBC.
You know how you can face-swap with just about anything using filters from Snapchat or Instagram? And it usually results in something hilarious?
Now, imagine that face-swapping concept, but with the most graphic porn available and regular people like you or your friends. Not so funny, right?
Introducing “deepfakes”
It’s known as “deepfakes,” or “deepfakes porn,” a growing trend of “fake” pornography that’s going viral online and making victims out of celebrities, public figures, and otherwise innocent social media users, like 22-year-old law student Noelle Martin.
In the darker corners of the web, deepfakes creators are taking orders and selling custom-made clips like it's the next big thing. As the technology continues to improve, demand keeps climbing and the product keeps getting more convincing.
What is “deepfakes” porn?
Deepfakes pornography uses the latest (and remarkably easy to use) AI technology to graft the faces of non-consenting people onto pre-existing explicit material, creating fake pornographic images, videos, and GIFs. The result is convincing content that depicts extremely intimate, explicit acts, all created without the permission of the person portrayed.
Related: Webcam Sex Scams: Inside The Dark World Of Online Blackmail
The term comes from the Reddit user "deepfakes," who began posting pornographic videos made with this face-swapping technology featuring celebrities including Gal Gadot, Scarlett Johansson, Aubrey Plaza, and Taylor Swift. A Windows program called FakeApp has made it easy to create the videos once a user has gathered enough photos of their subject, which is easy to do with celebrities.
This type of pornography is closely related to revenge porn or hacked celebrity photos, which can be weaponized to humiliate a person and ruin their identity or reputation. And although revenge porn and hacked photos are different from deepfakes, since the original material doesn't need to be altered, the ethical issues of consent and the objectification of victims are similar.
Wait, what does this mean?
While deepfakes porn typically features actresses or other celebrity women, new and easily accessible software is making it simple for anyone to stitch footage of one person's face onto another's. That opens up the victim pool to anyone with a social media or online presence (a.k.a. much of the known world), which means it's probably time to up those privacy settings.
To add insult to a severe invasion of privacy, there aren't really legal ramifications for users who upload this nonconsensual content, since the videos aren't made from illegally stolen intimate photos. As one reporter in Wired wrote, "You can't sue someone for exposing the intimate details of your life when it's not your life they're exposing."
As this technology becomes more common and easier to use, the authenticity of all sorts of videos (not just pornographic ones) will increasingly be in question. Video evidence could become inadmissible in court, while pornographic material could exploit unwitting celebrities and social media users. This dangerously blurs the line between reality and mimicry, and anyone, from a celebrity to you, could fall victim to its creators.
So, how can we fight this technological tyrant, if we can?
How do we fight back?
Technological advances can be used for both the constructive and the destructive. For instance, face-swapping technology similar to what's behind this nonconsensual content also re-created Princess Leia's face at the end of Rogue One, brought Paul Walker back to the big screen in Furious 7, and more recently was used to plaster Nicolas Cage's face all over iconic movie scenes, which, let's face it, is meme gold.
But there is nothing funny or acceptable about ruining someone’s image for sexual entertainment.
While notable porn sites like Pornhub have previously promised to ban deepfakes porn, their users are still managing to upload and spread these videos and images at an alarming rate. Other sites where deepfakes porn has become popular are taking a more aggressive approach, like the GIF site Gfycat, which says it's figured out a way to train an artificial intelligence to spot fraudulent videos. With this developing technology, Gfycat, which has at least 200 million active daily users, hopes to bring a more comprehensive approach to kicking deepfakes off its site for good.
Related: 6 Ways Ditching Porn Can Improve Your Dating Game
Other websites, like Reddit and Twitter, are relying on users to report deepfakes content on their platforms. It's not a very developed system, but at least they're taking a stand against this unacceptable content.
We fight against porn and sexual exploitation because we don’t believe anyone’s violation should be sexual entertainment. Join us, and choose to create a culture that stops the demand for nonconsensual deepfakes pornography and sexual exploitation altogether.
Get Involved
Expose the porn industry for what it really is and speak out against this harmful trend. SHARE this article to spread the word on the harms of porn and add your voice to the conversation.
Spark Conversations
This movement is all about changing the conversation about pornography and stopping the demand for sexual exploitation. When you rep a tee, you can spark meaningful conversation on porn’s harms and inspire lasting change in individuals’ lives, and our world. Are you in? Check out all our styles in our online store, or click below to shop: