
“I Wasn’t in Control of My Body”: How the Porn Industry Cashes In on Nonconsensual Content

Even though porn sites are not moderating in the way they profess to be, there is an army of women who are waging war on nonconsensual porn. 

“My body, my face. I went into a state of shock; I really don’t know how to explain that feeling. It was like I wasn’t in control of my body, I was just shaking uncontrollably and couldn’t catch my breath.”

Those are the words of Aysha* (name has been changed) as reported by Glamour after she was sent a screenshot which showed about 30 explicit photos and videos of her.

All of those pictures and videos had been taken five years prior when she was 21 with a guy she had been seeing at the time. The private content had been accessed through an iCloud hack, and it had been nonconsensually shared on a number of porn sites.

Related: 7 Things You Can Do If You’re A Victim Of Deepfakes Or Revenge Porn

When Aysha attempted to find the content online, she found that all of it had been posted under her full name, and that it was reportedly trending on Pornhub.

“For the two weeks after my images were leaked, it was just panic attack after panic attack,” she said. “I couldn’t sleep alone, and had to keep a bucket next to my bed. The police came round and took notes but didn’t have any advice for me. I kept messaging Pornhub and just wasn’t getting anywhere, until I lied that I was underage in the content, then it finally got removed.”

That’s right—Aysha reportedly had to lie about her age to have numerous nonconsensual pictures and videos of her removed from Pornhub. However, because the removal took so long, and because Pornhub had a download feature on all content—which has since been disabled—people had the chance to download the content. Within days of the photos and videos being deleted, they were reuploaded.

Wondering how this could be?

Let’s get the facts.

Related: Are Porn Sites Protecting Victims Of Nonconsensual Content? We Investigated, Here’s What We Found


Pornhub’s moderation process

According to a spokesperson for Pornhub, the porn site has an “extensive team of human moderators that work around the clock to review and remove illegal content,” even though that number was reported to be just twelve moderators in this Glamour article. (The figure has since been removed from the article.)

The spokesperson also said that Pornhub uses Vobile, software which automatically digitally fingerprints every video removed from the site, and scans all new uploads for potential matches to anything that has been removed previously.

Related: Pornhub Reportedly Profits From Nonconsensual Videos And Real Rape Tapes—Here Are The Latest Examples

Even though Pornhub used to have a “download” button right next to its content, the moderation team and fingerprinting software should, presumably, be stopping reuploads that come from those downloads. But that doesn’t seem to be the case, as Aysha’s story shows.

So, what’s going on?

The lack of justice for nonconsensual porn victims

It turns out that some major flaws exist when it comes to addressing illicit nonconsensual porn and holding responsible parties accountable for sharing content without permission.

Kate Isaacs, the activist who launched the #NotYourPorn campaign to raise awareness and hold the porn industry accountable for its actions, says that “it’s completely legal for porn sites to be hosting and profiting from nonconsensual content.”

Why?

Because “the law only applies to individuals, not commercialized companies…” In other words, Pornhub itself cannot be held responsible for profiting off of nonconsensual porn on its site. You can only hold accountable the seemingly untraceable individual who posted the content on the site.

Related: Their Private Videos Were Nonconsensually Uploaded To Pornhub, And Now These Women Are Fighting Back

Kate says that this reality is clearly exhibited by the “amount of content that comes up when you search anything like ‘leaked’ or ‘stolen’ – there’s a lot of it, and it’s dressed up as ‘fantasy.’” What it comes down to is that “there’s no way for them to tell if that content is consensual or not.”

And, even when victims try to hold individuals who’ve shared nonconsensual nudes accountable, the language in the legislation makes it nearly impossible for a victim to get justice.


Dr. Kelly Johnson, an assistant professor in Durham University’s Department of Sociology, says the legislation’s wording, “disclosing private sexual photographs and films with intent to cause distress,” means that “you have to prove that someone shared it with the intent to cause distress to the victim.”

According to Dr. Johnson, this means “if you report someone for sharing your nudes against your will, they could just say to the police, ‘I didn’t think she’d ever find out’ or ‘I was just doing it for a laugh,’” making it easy for the perpetrator to avoid being held accountable.

Even though porn sites are not moderating in the way they profess to be and the law has a number of loopholes that keep people from being fully protected, there is an army of women who are waging war on nonconsensual porn.

The fight to get nonconsensual porn site Anon-IB shut down

Anon-IB is a porn database that allows people to track nonconsensual porn victims (who are often underage) by name and location.

Emilia, the 23-year-old from Rhode Island who is leading the charge against the site, had nude photos of herself shared to the database by an ex.

After first being alerted to her presence on the site, she scrolled further down and found pictures of six other girls she had graduated from high school with. Each was underage when their photos were taken, and all photos included their names and locations.

This led Emilia to create a TikTok video warning women about Anon-IB. The video garnered more than 9.5 million views and hundreds of comments from women, many of whom are survivors of nonconsensual porn.

While Anon-IB has been extremely difficult to take down, the attention Emilia’s video has received, the resulting media coverage, and the growing outrage at the site are making a difference. Authorities have at times shut the site down, and it has repeatedly been forced to change URLs, almost certainly harming its popularity and revenue streams.


Why this matters

When porn sites are reported to knowingly allow users to upload videos and images without checking them, they are effectively facilitating the perpetuation of nonconsensual porn.

And that facilitation leads people like Aysha and Emilia to suffer.

That’s one of the numerous reasons why we refuse to click.

Click here to learn what you can do if you are a victim of image-based abuse or revenge porn.