
Are Porn Sites Protecting Victims Of Nonconsensual Content? We Investigated, Here’s What We Found

It might surprise you to learn that nonconsensual porn is not uncommon. But are porn sites doing all they can to prevent and remove it? Spoiler alert: no. Here's what our investigation found.

Trigger warning: this article contains descriptions of sexual assault, child sexual abuse, and sex trafficking.

In 2009, a 14-year-old named Rose was abducted and raped in a 12-hour overnight attack by two men, with a third man video recording the assault.

Sometime after the attack, students from Rose’s school shared a link on MySpace. It led to Pornhub and to videos with hundreds of thousands of views and titles like “teen crying and getting slapped around,” “teen getting destroyed,” and “passed out teen.”

They were all of Rose. From that night, from that assault.

In November 2019, a 15-year-old girl who had been missing for a year was discovered in videos on Pornhub. She had become a victim of sex trafficking, and yet the porn site hosted 58 videos of her being raped by her trafficker and other sex buyers.

Related: Pornhub Reportedly Profits From Nonconsensual Videos And Real Rape Tapes—Here Are The Latest Examples

These are not the only stories of nonconsensual content being uploaded to Pornhub, but they are two real and verified examples.

Honestly, we have no way of knowing how much content on Pornhub is nonconsensual or what percentage of its profits come from it. We only know that illegal and abusive videos exist on the site when they shouldn’t, and that Pornhub and other sites could be doing more to eradicate them. That is what this article aims to uncover.


More nonconsensual porn, and lots of it

Other examples of available nonconsensual content include “revenge porn,” or private images or videos posted by the ex-partner of the victim depicted. Some have had private accounts hacked and their images uploaded by an unknown perpetrator. Other victims of nonconsensual content have been secretly filmed in locker rooms or showers and “upskirted” on public transportation. Some are children forced to perform sex acts for an online audience, and others are groomed by adults online asking for nude pics. Some are deepfakes of ex-partners’ faces grafted onto porn performers’ bodies in horrifyingly convincing videos.

All are victims of image-based sexual abuse perpetrated partially or entirely online.

Free porn tube sites like Pornhub thrive on user-uploaded content. They encourage anyone, anywhere, to upload porn, and lots of it, with seemingly no review in place before the content becomes publicly available for consumption. Some of that content is nonconsensual, like what we’ve described above.

Pornhub’s infamous “year in review” reports feature mind-boggling examples of how heavily people use its upload function. Just consider the more than 1.36 million hours of new content uploaded to the site in 2019 alone. It would take a person 169 years to watch it all, and that’s just the new content, not the videos already on the site.

Related: Their Private Videos Were Nonconsensually Uploaded To Pornhub, And Now These Women Are Fighting Back

The porn giant brags about its 115 million visits to the site each day. The combination of new videos and millions of eyeballs browsing the site is enticing to some advertisers, and this is one way Pornhub makes money. As it turns out, it has continued to sell ad space against, and profit from, illegal and abusive material, too.

To many survivors, discovering that recorded footage of their abuse has been uploaded is emotionally devastating. Not all, but many report signs of post-traumatic stress disorder or trauma akin to that of rape victims, responses supported and illustrated by studies that examine the impact of “revenge porn” on survivors.


A growing movement to hold tech companies responsible

It might surprise you to learn that nonconsensual porn is not uncommon.

“The first thing people need to understand is that any system that allows you to share photos and videos is absolutely infested with child sexual abuse,” Alex Stamos, professor at Stanford and former security chief at Facebook and Yahoo, said to The New York Times.

Related: Content On Pornhub Reportedly Normalizes And Promotes Racism And Racist Stereotypes

Last year, the newspaper investigated the rise of child abuse material online and discovered that tech companies aren’t putting enough effort toward monitoring for illegal imagery.

The investigation looked at companies and tech platforms, including social media platforms. Facebook, they found, does scan for illegal material and accounts for the majority of the content flagged by tech companies, but even it is not using all available resources to detect harmful or illegal content.


It’s important to note here that Facebook and other social media sites have few incentives to better monitor content, partly because of Section 230 of the Communications Decency Act, which allows platforms like Facebook to argue, in certain situations, that they are not publishers and therefore not responsible for the content their users upload.

In other words, Mark Zuckerberg, the co-founder of Facebook, isn’t who gets sued if a guy in Kansas uploads intimate pictures of his ex-girlfriend to Facebook with the intent to humiliate her. That guy in Kansas is responsible. This is understandable, but these cases are rarely so simple, and sometimes tech platforms can be held accountable for facilitating sexual abuse or, in severe cases, human trafficking.

Still, there are roadblocks for victims to get any sense of justice.

For starters, it is very difficult to remove images or videos once they have been shared online. Angela Chaisson, Principal at Chaisson Law, told us in an interview that getting images removed from a site like Pornhub is next to impossible:

“I will often tell a client that it’s just not worth the effort that it takes, which is a very unsatisfactory thing to say to a client as a lawyer. It’s like whack-a-mole. If you get them taken down from one place, they pop up in another.”

Recent public outrage suggests some people believe the platform should also be held responsible for not monitoring illegal or abusive content.

Related: “Hit That”: Do Both Pop Culture And Porn Culture Normalize The Abuse Of Women?

Pornhub’s response seems to be unconditional denial. Here’s part of a statement Pornhub Vice President Blake White sent to the Daily Dot, responding to calls to shut the site down because of nonconsensual images and videos:

“Pornhub has a steadfast commitment to eradicating and fighting any and all illegal content on the internet, including nonconsensual content and child sexual abuse material. Any suggestion otherwise is categorically and factually inaccurate. While the wider tech community must continue to develop new methods to rid the internet of this horrific content, Pornhub is actively working to put in place state-of-the-art, comprehensive safeguards on its platform to combat this material. These actions include a robust system for flagging, reviewing and removing all illegal material, employing an extensive team of human moderators dedicated to manually reviewing all uploads to the site, and using a variety of digital fingerprinting solutions. We use automated detection technologies such as YouTube’s CSAI Match and Microsoft’s PhotoDNA as added layers of protection to keep unauthorized content off the site. We also use Vobile, a state-of-the-art fingerprinting software that scans any new uploads for potential matches to unauthorized materials to protect against any banned video being re-uploaded to the platform. We are actively working on expanding our safety measures and adding new features and products to our platform to this end, as they become available.”

Still, despite its professed dedication to eradicating nonconsensual content, the Internet Watch Foundation recently investigated the site and confirmed over 100 cases of child sexual abuse material on Pornhub. In response to this report, the site pointed out that this is less than 1% of the website’s content.

(Let’s do some quick math. Even if just 0.1% of the roughly 6.83 million videos uploaded in 2019 alone were nonconsensual, that’s still 6,830 videos in one year. In our opinion, anything higher than 0% of the content exploiting children is still far too high. One exploited child, or even one exploited adult, is too many.)


Pornhub’s White also recently released a statement that details how the company believes their work to protect victims is having a positive impact.

Kate Isaacs, the founder of UK anti-revenge porn movement #NotYourPorn, disagrees. She worked with The Times to investigate whether illegal content was hosted on Pornhub, and spoiler alert: they found it, too.

“There’s a level of delusion,” Isaacs said. “I genuinely think they think they are helping people more than they are.”

What does content moderation actually mean?

There are multiple options and technologies available to help monitor content, some of which Pornhub mentioned in its statement, but so far none is a perfect solution, even when deployed thoroughly. At a minimum, monitoring needs to be a two-step process: automated screening backed by human review.

Still, here are a few methods that are used by big tech companies to monitor for illicit content all over the world, some of which Pornhub claims to use.

Human moderators

Facebook famously employs thousands of human moderators who rapidly click through image after image, flagging those that may be against the site’s community rules.

Related: More Than 80 Men Were Sexually Exploited And Secretly Filmed For This Guy’s Porn Site

We spoke with Karl Muth, economist and professor at Northwestern University, who told us human moderators are not the best way forward. No matter how little companies pay these legions of moderators, it is still expensive, not to mention traumatizing for the employees. No doubt it would be even more so for moderators of Pornhub content.

Reverse image search and sourcing technologies

Muth mentioned other options, such as reverse image search technologies. They are good at scanning through images to find copies, but not helpful in discovering new problematic images, since the original image or some copy of it is needed to conduct the search. There are also sourcing or tracing technologies that don’t necessarily look at the image itself but examine where it comes from.

So if an image was shared yesterday on a group chat suspected of sharing illegal material, and is then uploaded to Pornhub, that could be a red flag.

Hashing images

In 2009, Hany Farid, then a professor at Dartmouth College and now at the University of California, Berkeley, developed software with Microsoft to detect illegal images. Basically, PhotoDNA converts an image to a greyscale version, divides it into a grid, analyzes each of the smaller sections, and then creates a “hash,” or digital signature, of the image made up of numbers and letters. That hash can then be compared against a database of confirmed sexual abuse images to identify a match.
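To make the idea concrete, here is a minimal sketch of a grid-based image signature in Python. It is not the actual PhotoDNA algorithm, which is proprietary and far more robust; the grid size, the per-cell averages, and the distance comparison are all illustrative assumptions.

```python
# A simplified, illustrative grid signature. This is NOT real PhotoDNA;
# it only demonstrates the general idea described above:
# greyscale -> grid -> per-cell features -> compact signature -> compare.
from PIL import Image

GRID = 8    # 8x8 grid of cells
CELL = 16   # each cell is 16x16 pixels after resizing

def grid_signature(path: str) -> list[int]:
    """Return per-cell average intensities (0-255) for an image."""
    img = Image.open(path).convert("L").resize((GRID * CELL, GRID * CELL))
    pixels = img.load()
    signature = []
    for gy in range(GRID):
        for gx in range(GRID):
            total = 0
            for y in range(CELL):
                for x in range(CELL):
                    total += pixels[gx * CELL + x, gy * CELL + y]
            signature.append(total // (CELL * CELL))
    return signature

def distance(sig_a: list[int], sig_b: list[int]) -> int:
    """Sum of absolute per-cell differences; lower means more similar."""
    return sum(abs(a - b) for a, b in zip(sig_a, sig_b))

# Hypothetical usage: compare a new upload against signatures of confirmed images.
# known = [grid_signature(p) for p in confirmed_abuse_images]
# if any(distance(grid_signature("upload.jpg"), s) < THRESHOLD for s in known):
#     flag_for_human_review("upload.jpg")
```

In a real system the signature would be designed to survive resizing, cropping, and recompression, but the workflow is the same: hash the upload, compare it against a database of known material, and flag close matches.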

Hashing is used by the Internet Watch Foundation and many other organizations, but it too is limited because it requires comparison against a database of confirmed cases, meaning new or unreported material can slip through the cracks, as it likely does in Pornhub’s case.

What Pornhub reportedly isn’t doing to help victims

These are methods Pornhub could utilize, but as far as reports and evidence suggest, it isn’t using them to their fullest ability. Despite its claims of proactively removing illicit content, there is little evidence that the company monitors content uploaded to its sites; instead, it mainly relies on users to report illegal content once it’s already been posted.

Here’s how it supposedly works.

In theory, a nonconsensual porn victim submits a content removal request to Pornhub with links to the images or videos to be removed. If the victim doesn’t want their content re-uploaded to Pornhub, they are referred to Vobile to “fingerprint” their content, which supposedly makes it impossible to reupload. Or does it?

Related: Ukrainian Gynecologist Accused Of Sharing Hidden Cam Footage Of Patients With Porn Sites

Last month, VICE tested the reporting system and found that it doesn’t always work this way: with minor edits, the fingerprinting could be circumvented. Pornhub may delete the original upload, but modified copies can continue to spread.
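As a tiny illustration of the underlying problem, consider exact file hashing, the crudest form of matching: changing even a single byte produces a completely different digest. Vobile’s real fingerprinting is perceptual and proprietary, so this is only a sketch of the general principle that sufficiently edited copies stop matching.

```python
# Illustration only: exact hashes, the weakest form of "fingerprinting,"
# stop matching after any edit at all. Robust perceptual fingerprints
# tolerate small edits, but VICE's test suggests they can still be
# defeated with enough modification.
import hashlib

original = b"fake video bytes for demonstration"
edited = original + b"\x00"  # a trivial one-byte change

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(edited).hexdigest())  # entirely different digest, so no match
```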

But let’s backtrack. Does Pornhub quickly respond to every request to remove nonconsensual content in the first place, like they claim?

Activists and victims have complained that Pornhub isn’t always cooperative and responsive to content removal requests.

For example, after 14-year-old Rose found videos of her assault on Pornhub, she sent messages to Pornhub over a six-month period, repeatedly asking for the videos to be removed. She explained that the videos were of her assault and that she was a minor. She never received a response. It wasn’t until she had the idea to open a new email account and pretend to be a lawyer threatening legal action on her behalf that the videos were taken down, within two days.

Related: Pornhub Reportedly Refused To Remove Videos Of This Minor’s Sexual Assault—Until She Posed As Her Own Lawyer

Pornhub responded to Rose’s experience in a statement saying that this happened under different company leadership in 2009 and that it has better practices now. But after working with victims in the UK very recently, Kate Isaacs has said the site’s response has been inconsistent: sometimes responsive, sometimes silent.

Cara Van Dorn, an associate at the law firm Sanford Heisler Sharp, which has been representing some of the women involved in the notorious Girls Do Porn case, said to VICE:

“We had reached out to [Pornhub’s parent company] many times over the years and it wasn’t until the start of trial and obtaining numerous favorable rulings demonstrating undeniable strength of our case that [Pornhub] finally decided to start taking action. It’s not really ‘believing victims’ when it takes a team of lawyers and years of litigation before you lift a finger. It’s not really ‘doing the right thing’ when you only act when it is in your self-interest.”

Note that the first Girls Do Porn videos of trafficked women started to get uploaded to porn sites like Pornhub around 2015, and the women depicted pleaded for years for the removal of this content once they were made aware of it through doxing and harassment. They were reportedly met with silence and inaction.


Pornhub finally removed the official Girls Do Porn channel in October 2019 after the company owners were arrested for sex trafficking, but copies of hundreds of the videos still remain on multiple free porn sites. VICE reported that the videos are hosted alongside banner ads that Pornhub still profits from.

Related: 22 Women Paid $12.7 Million And Given Rare Ownership Rights In GirlsDoPorn Lawsuit

The main problem with content moderation that relies largely on reporting after content has been posted is that it puts the burden on victims to find, flag, and fingerprint their own abusive images or videos. This process can be traumatic, and after all of that effort, it isn’t guaranteed to work.

Here’s what porn sites could do to better protect victims

Pornhub has the opportunity to set a precedent for other porn tube sites and the adult industry as a whole to minimize the spread of nonconsensual content. But will they take advantage of it while the world’s eyes are on them?

Here are a few bare minimum things Pornhub and other porn sites could do to minimize the spread of nonconsensual content.

Note that even exploitation-free porn is not harm-free and as an organization, we maintain that porn is harmful to consumers, relationships, and society as shown by decades of studies done by major institutions.

Related: Even If All Porn Was Consensual, Would There Be Any Issue With Watching It?

Monitor images and videos

The first move that would make a difference is a genuine policy to thoroughly monitor all content on the platform and to review content before it is available for public consumption. Using available technologies and investing in those currently being developed could alleviate the burden on victims to find and report their own content.

Scan for search terms and titles

Scanning for and banning search terms associated with child sexual abuse material, such as “teens” or “lolita,” is a no-brainer place to start. When Pornhub was asked why it hosted many videos with titles like “teen abused while sleeping” or “extreme teen abuse,” it responded:

“We allow all forms of sexual expression that follow our Terms of Use, and while some people may find these fantasies inappropriate, they do appeal to many people around the world and are protected [forms of expression].”

Related: Why This Massively Popular Porn Site Doesn’t Care If Their Content Shows Rape

But the issue isn’t just people finding fantasies “inappropriate,” it’s finding that much of the content on Pornhub—consensually uploaded and not—promotes and glorifies the rape, abuse, and exploitation of minors and men and women around the world. In any other industry, for any other tech company, this would not be tolerated.

Scan for personal information

Many abuse videos include the full name or other personally identifying details of the victim depicted. Often victims are doxed in the title or comments, but if a victim’s name has already been reported to Pornhub, the site could easily scan for other mentions and remove personal information that, when public, could cause further harm.
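A basic version of both of these scans could be a simple text screen over titles and comments. The sketch below assumes a hypothetical moderation pipeline; the banned-term list and the reported-names list are placeholders for illustration, not Pornhub’s actual data or systems.

```python
# A minimal sketch of title/comment screening. BANNED_TERMS and
# reported_names are illustrative placeholders only.
import re

BANNED_TERMS = {"lolita", "passed out teen", "teen abused"}  # illustrative examples
reported_names = {"jane doe"}                                # names victims have reported

def screen_text(text: str) -> list[str]:
    """Return the reasons a title or comment should be blocked or sent to review."""
    lowered = text.lower()
    reasons = []
    for term in BANNED_TERMS:
        if term in lowered:
            reasons.append(f"banned term: {term}")
    for name in reported_names:
        # Word-boundary match so a reported name isn't matched inside another word.
        if re.search(rf"\b{re.escape(name)}\b", lowered):
            reasons.append(f"reported victim name: {name}")
    return reasons

print(screen_text("teen abused while sleeping"))  # ['banned term: teen abused']
```

Even a crude screen like this would catch the exact titles quoted above before they ever reached a search results page.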

Make verification count

It can be difficult to distinguish between real nonconsensual content and videos made by professional studios intending to mimic abusive situations. There is a difference, even though violent porn (professional or not) that dramatizes abuse has been shown to have concerning effects on consumers. Properly verifying users who upload content would help minimize the chance that this violent content comes from trafficked individuals.

Across social media platforms, the blue checkmark helps users find verified accounts of public figures. Pornhub uses a similar blue check system, but the barriers to getting it are reportedly low. All a person needs to do is upload a verification image: a photo of themselves with their Pornhub username and Pornhub’s website written on a piece of paper or on their body. Pornhub accepts either.

The 15-year-old trafficking victim we mentioned before was “verified” with the blue checkmark on Pornhub, misleading consumers to believe she was a consenting adult performer. The reality was very different. Clearly, Pornhub needs a better and more reliable system, not that it would completely filter out all exploitative content. Consider how even established professional porn performers are often exploited and abused on set in the name of sexual entertainment.

Remove the download feature

To download any video on the site, all you need is a login. That’s it. The ease with which site users can download videos is in large part why there are still hundreds of Girls Do Porn videos featuring verified trafficking victims online, as well as endless copies of other abusive or exploitative content.

Reportedly, Pornhub offers “model” profiles the option to customize download settings for their uploaded videos, allowing consumers to save their clips for a price. Model profiles can set a price per video download between $0.99 and $150, hypothetically enabling users who upload illicit content to profit directly from videos of exploited or trafficked individuals.

Predictably, downloaded, edited, manipulated, and pirated copies of saved videos are reuploaded daily, ensuring that content of trafficked and exploited individuals can never truly be erased from the site or other porn tube sites.

Incentivizing meaningful change

The main problem with any of these suggestions is that Pornhub doesn’t have many incentives to change. After all, it’s easier to launch a “clean the beach” style campaign whose good publicity drowns out the bad than it is to genuinely invest in monitoring tools.

Monitoring content is an investment and generally not one that brings a monetary return—in fact, it’s often the opposite. But it’s the responsible and ethical thing to do, and the very least porn sites can do since they knowingly or otherwise profit from exploited individuals and nonconsensual content.

Related: How This Guy Reportedly Posted Revenge Porn Of His Ex To Pornhub Where It Got 1,000+ Views

Karl Muth explained that because it’s not a revenue-rich venture, it’s unlikely a tech company would hire a top product manager and assign them to content moderation when they could be maximizing revenue in another department. Facebook is a prime example of this.

“I think that’s why this issue has ended up in the corporate social responsibility backwater of the conversation rather than being an area where people develop and apply cutting edge solutions,” he said. “As long as there’s only one Katie Hill a year, does anybody on Facebook’s board care?”

In response to the many, many cases of nonconsensual porn that have come to light in recent months, we’re compelled to ask: Does Pornhub as a brand and a company truly, deeply care when there are other revenge porn cases shared across their site? What about child abuse images or sex trafficking? Or deepfakes and spycam porn?

Related: PayPal, Kraft, And Unilever: Why These Big Companies Recently Stopped Working With Pornhub

The bottom line is that the negative consequences for failing to monitor content don’t seem severe enough for sites to take action, even considering the social fallout Pornhub is experiencing.

Even in a hypothetical world where Pornhub and other similar sites are perfectly held accountable and image-based sexual abuse and child sexual abuse material is successfully removed every time, the demand remains. This is a cultural problem.

Still, this all traces back to the issue that porn sites seem to encourage the demand for everything from young-looking performers to abusive porn by hosting this content in the first place and only responding to criticism now that there’s been some pressure for change.

It’s up to each of us as consumers to choose who will profit from our screen time. As a pro-love, pro-sex, anti-porn organization, we know they won’t be profiting from us.