With nothing more than a few photographs and some free software, it’s now possible to quickly graft anyone’s face onto pornographic images, videos, and GIFs. The results are far from convincing right now, but they’ll get better fast, according to a report by The Verge.
As first reported by Motherboard, a Reddit user named “deepfakes” has been applying these tools to create content for a pornographic subreddit. Using a combination of open-source AI (artificial intelligence) software, including Google’s TensorFlow, deepfakes pasted the faces of celebrities like Scarlett Johansson, Taylor Swift, and Maisie Williams onto X-rated GIFs. He even created a full video with the face of Wonder Woman star Gal Gadot inserted into an incest-themed porn scene. (At the time of writing, this content has been removed from its original web host, though it’s not clear who removed it.)
Related: Parasite Porn: How Your Photos Could Be Stolen & Photoshopped Onto Porn (VIDEO)
It’s true that popular apps like Snapchat have offered face-swapping tech for years, but any user could tell you that the results are less than convincing, which makes them all the more hilarious, IMO. Here’s the kicker with this new tech: to make a fake video in Photoshop, you’d have to edit every individual frame by hand. AI automates that normally arduous process, and machine learning does most of the hard work for you, even if you have only a little technical knowledge, as the sketch below shows.
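To make that concrete, here is a minimal sketch of what the automation looks like: a script steps through a video frame by frame and hands each frame to a trained model instead of a human editor. The `swap_face` function is a hypothetical stand-in for such a model, and the file names are placeholders; only the frame handling, done with the open-source OpenCV library, is real.

```python
# A minimal sketch of AI-assisted frame editing, not a working face-swapper.
# swap_face() is a hypothetical placeholder for a trained model; the file
# names are placeholders too. Frame I/O uses OpenCV (pip install opencv-python).
import cv2

def swap_face(frame):
    """Stand-in for a trained face-swap model (hypothetical)."""
    return frame  # a real model would return the frame with the face replaced

reader = cv2.VideoCapture("input.mp4")
fps = reader.get(cv2.CAP_PROP_FPS)
size = (int(reader.get(cv2.CAP_PROP_FRAME_WIDTH)),
        int(reader.get(cv2.CAP_PROP_FRAME_HEIGHT)))
writer = cv2.VideoWriter("output.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, size)

while True:
    ok, frame = reader.read()
    if not ok:  # end of video
        break
    writer.write(swap_face(frame))  # the model, not a person, edits every frame

reader.release()
writer.release()
```

The point is the loop: once a model exists, a few lines of code apply it to thousands of frames with no further human effort.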
It doesn’t take much imagination to see how this could be a big problem. Imagine a world where it takes little effort to make fake porn of a classmate or a scorned ex. And in our status-update-driven society, where it’s all too easy to gather pics of someone from their social accounts, there’s no doubt that this could become a widespread issue.
Related: Facebook Joins Fight Against Sex Trafficking With Face Scanning Technology
Experts in the AI community say they’re dismayed and disappointed by this use of the technology, but some see it as an opportunity to have a much-needed conversation about the future of digital imagery. Alex Champandard, a researcher who builds AI-powered creative tools, tells The Verge that there needs to be better public understanding of what technology can do. He also suggests that researchers can and should build programs that are better at spotting such fakes.
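To illustrate what Champandard is suggesting, here is a hedged sketch of one kind of fake-spotting program: a small TensorFlow/Keras classifier trained to label face images as real or fake. The architecture, image size, and training data shown are illustrative assumptions, not any published detection method.

```python
# An illustrative sketch of a real-vs-fake image classifier in TensorFlow/Keras.
# The architecture and 128x128 input size are assumptions; the training data
# (labeled real and fake face crops) is assumed to exist and is not shown.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(128, 128, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # output near 1 = likely fake
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(face_images, labels, epochs=10)  # hypothetical labeled dataset
```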
“Parasite Porn” Is Just the Start
While AI-assisted fake video porn is just becoming possible, “parasite porn” photos have unfortunately been around for a while.
Recently, ABC News in Australia reported (link contains triggering content) the story of Noelle Martin, a now-22-year-old Australian law student. In her freshman year of college, Noelle was in her dorm room scrolling through her photos. She came across a selfie she had taken when she was 17 and decided to drop it into Google Image Search to see what would come up.
What she found made her sick.
Noelle’s search brought up multiple porn sites where her picture had been posted. Already disgusted, she was even more shocked to discover photoshopped and manipulated images of her face pasted onto the naked, and even sexually explicit, bodies of other women.
The comments on the pictures were almost worse: vulgar, crude, and threatening, calling for her sexual violation and assault. All of this had been done with her personal photos, and she had no idea.
Related: 14-Year-Old Girl Sues Facebook For Failing to Remove Revenge Porn
Noelle was a victim of “parasite porn,” a growing trend in which someone steals ordinary photos and alters them to be sexually explicit. This is all done without the knowledge or consent of those pictured, and unfortunately, it can be nearly impossible to get such images removed once they are posted.
In Noelle’s case, one webmaster even threatened to send the images to her parents and the dean of her university if she did not allow them to remain on the site. That same webmaster later agreed to remove the images, but only if Noelle sent him sexually explicit photos of herself. She refused, and she has since been working with the Australian government to take action against these porn sites.
Why This Matters
Beyond the effects that porn has on its viewers, it is important that we understand the harm it inflicts on those used to create it. Even, and especially, when they’re unwilling participants.
Recent disturbing trends like revenge porn, parasite porn, and fake video porn can completely destroy the reputation and livelihood of those depicted. Not only are these trends creepy, they are simply wrong. And it’s up to us to spread awareness and stop the demand. Are you with us?