
Why the Number of Explicit Deepfake Videos on Mainstream Porn Sites is Rapidly Increasing

September 22, 2020
TRIGGER WARNING

49,081.

According to the deepfake detection company Sensity, that's the number of deepfakes that had been found on the web as of June 2020, many of which were explicit.

Deepfakes are fake videos or audio recordings that are made to look and sound just like the real thing. In some cases, a deepfake might involve someone editing LeBron James into a video of their pickup basketball game with friends. While that is still a problem because it uses someone's name and likeness without their permission, the issue becomes far more serious when the video is pornographic in nature.

Related: Here’s What It’s Like To See Yourself In A Deepfake Porn Video

Now, it's more common for people to use this technology to make pornographic deepfakes of celebrities, musicians, and actresses than to make deepfake videos that aren't pornographic in nature.

WIRED, an American magazine that looks at how emerging technologies affect culture, economics, and politics, recently reported that hundreds of these explicit videos are uploaded to the world's biggest porn websites every month. These videos rack up millions of views, and the porn sites that host them fail to remove them.

Wondering how this could be? Let’s get into the details.


Taylor Swift, Natalie Portman, and more are being featured in deepfake videos

“Even if it’s labeled as, ‘Oh, this is not actually her,’ it’s hard to think about that. I’m being exploited,” Hollywood actress Kristen Bell told Vox in June after finding out that deepfakes were being made using her image.

Other deepfake videos, which have hundreds of thousands or millions of views, feature A-list celebrities such as Natalie Portman, Billie Eilish, Taylor Swift, Anushka Shetty, and Emma Watson. Watson's video alone, a 30-second clip using her face in a pornographic video, has been viewed more than 23 million times. And many of these celebrities have been frequent deepfake targets since the trend emerged in 2018.

Related: Would AI-Generated Nudes Solve The Ethical Problems Of Porn Sites?

The fact that the above names are all women is not a coincidence, either. A report released by Sensity last year found that 96% of the deepfake videos found online in July 2019 were pornographic, and almost all of them focused on women.


What are porn sites doing about this?

The quick answer? Not much.

Sensity’s 2019 analysis shows that the top four deepfake porn websites received more than 134 million views last year. Some of the videos were even requested, and their creators can be paid in bitcoin.

And while the number of videos being released and the number of views on each video continue to climb (up to 1,000 deepfake videos are uploaded to porn sites every month, and the total number of videos as of June 2020 is about three times what it was in July 2019), porn sites aren't doing anywhere near enough to take down this illegal content or keep it from being uploaded.

Related: Deepfake Porn Videos Of Celebrities Are Just Another Form Of Sexual Exploitation

Given the money-making potential of deepfakes, thanks to the ad space surrounding these uploads on porn sites, the sites' inaction is no surprise, especially considering how many visitors they receive.

Note Pornhub’s “Prohibited Uses” information on their Terms of Service page in this screenshot taken in late September 2020:

[Screenshot: Pornhub Terms of Service, "Prohibited Uses" section]

Deepfake porn violates a number of these terms, and yet there are reportedly numerous deepfake porn videos on Pornhub. In fact, the search bar reportedly auto-fills "deepfakesporn" when a user types in just the first few letters of that word.

XVideos and Xnxx, porn sites that also host numerous deepfake videos, are two of the three most-visited porn sites and two of the ten most-visited websites overall in the world, and each receives as many or more visitors than Wikipedia, Amazon, and Reddit.

Related: Are Porn Sites Protecting Victims Of Nonconsensual Content? We Investigated, Here’s What We Found

When asked about the increasing deepfake problem, Alex Hawkins, the VP of popular porn site xHamster, said that, while the company doesn’t have a specific policy for deepfakes, it treats them “like any other nonconsensual content.” More specifically, Hawkins said that the company removes videos if people’s images are used without permission:

“We absolutely understand the concern around deepfakes, so we make it easy for it to be removed. Content uploaded without necessary permission being obtained is in violation of our Terms of Use and will be removed once identified.”

Regardless, xHamster apparently needed WIRED's help to identify dozens of apparent deepfake videos on its site. And although such videos are widely considered to target, harm, and humiliate the women at their center, Sensity CEO and Chief Scientist Giorgio Patrini says, "The attitude of these websites is that they don't really consider this a problem." Moreover, the number of people featured in deepfakes keeps increasing.

Patrini added that Sensity has seen a growing number of deepfakes made of Instagram, Twitch, and YouTube influencers, which leads him to think that ordinary members of the public will eventually and inevitably be targeted with similar videos.


Why this matters

As the underlying artificial intelligence technology needed to make deepfake videos advances and gets cheaper and easier to use, the risk grows that any one of us could end up in such a video. That risk is magnified further by the fact that some of the world's most popular porn sites, sites full of deepfakes, seem to refuse to take nonconsensual video creation seriously.

Clare McGlynn, a professor at the Durham Law School who specializes in porn regulations and sexual abuse images, agrees. “What this shows is the looming problem is going to come for non-celebrities,” she said. “This is a serious issue for celebrities and others in the public eye. But my longstanding concern, speaking to survivors who are not celebrities, is the risk of what is coming down the line.”

What it comes down to is this: even when deepfake porn is supposedly "not" exploitative because it uses AI-generated imagery, it still is.

Although they have not faced literal sexual assault, survivors of nonconsensual video creation face many disruptive mental health issues that affect their daily lives. In some cases, there are striking similarities between the mental health effects of sexual assault and those of nonconsensual video creation.

Related: What I Did When I Found Out My Partner Posted Photos Of Me To Porn Sites

How many more people must be exploited before we condemn the creation of deepfakes?

This is one of the many reasons we refuse to click on porn sites: they profit from the creation and distribution of nonconsensual content.

Will you join us?

Click here to learn what you can do if you're a victim of revenge porn or deepfake porn.
