
Pornhub Just Removed Over Half of the Site’s Content in a Purge of Unverified Videos

Will these changes be enough to prevent abuse and undo the damage that has been done to countless victims of underage content, revenge porn, nonconsensual porn, sex trafficking, sexual exploitation, and other forms of image-based abuse?

December 14, 2020

Trigger warning: this article discusses sexual assault, child sexual abuse, and sex trafficking.

This morning, Pornhub started purging the entire platform of unverified videos, deleting over 10 million videos from the site. The only videos that will remain are those uploaded by official content partners and content from members of its model affiliate program. This means a very significant portion of the site’s content will disappear.

According to Vice, before the purge began on Sunday evening, the counter on Pornhub’s search bar showed around 13.5 million hosted videos, a large number of them from unverified accounts. As of 9 a.m. Monday morning, that counter showed only 4.7 million, and it was still dropping.

“As part of our policy to ban unverified uploaders, we have now also suspended all previously uploaded content that was not created by content partners or members of the Model Program,” according to Pornhub’s announcement. “This means every piece of Pornhub content is from verified uploaders, a requirement that platforms like Facebook, Instagram, TikTok, YouTube, Snapchat and Twitter have yet to institute.”

This is a huge shift in the way the site has operated since its founding in 2007. Until now, it has largely been a hub where anyone, anywhere could upload videos of any kind with little security protocol in place.

Their long-overdue actions today are the result of a cascade of events over the last week. First came award-winning journalist Nicholas Kristof’s eye-opening New York Times exposé on the site’s practice of hosting illicit and exploitative content. A day later, Mastercard and Visa announced that they would investigate the presence of unlawful material on the site, prompting Pornhub to immediately announce changes to site security and protocol.

Related: The New York Times Exposé That Helped Spark The Possible Beginning Of The End Of Pornhub

These announced changes allegedly include:

  • Uploads will be accepted only from verified users, not everyone
  • The option to download videos from the platform will be removed
  • Moderation will be significantly expanded, including a larger moderation team

Nicholas Kristof, author of the recent New York Times opinion piece, also posted a Twitter thread about the changes.

Kristof also reported that Pornhub representatives told him these new policies apply to all MindGeek sites, not just Pornhub. MindGeek is Pornhub’s parent company that also owns and operates some of the world’s other most popular porn sites.

Subsequently, Visa and Mastercard officially announced they would be cutting ties with Pornhub and MindGeek following their investigations’ findings. Now, it seems as though Pornhub is purging unverified content to try and retain some credibility in the public’s eye. No doubt the card companies’ actions pushed Pornhub to make changes and follow through.

Related: Visa And Mastercard Sever Ties With Pornhub Due To Abusive Content On The Site

It’s been a rollercoaster of a week for Pornhub, and the situation is still evolving. Still, will their promised changes and site purge be enough to undo the damage that has been done to countless victims of underage content, revenge porn, nonconsensual porn, sex trafficking, sexual exploitation, and other forms of image-based abuse? Only time will tell, but probably not. Some questions we have, in light of the newly announced changes:

  • Will other types of image-based abuse like deepfakes be removed from the site?
  • What systems of accountability will be introduced so the public can be assured that Pornhub is keeping their word?

To fully explain why we believe these new changes will not be enough to undo the damage that has been done, or even to prevent new issues from arising despite the mass deletion of videos, we need to look at how porn sites have operated over the last decade and how the problems of nonconsensual and exploitative content got so bad.

Related: Are Porn Sites Protecting Victims Of Nonconsensual Content? We Investigated, Here’s What We Found

Nonconsensual and underage content on Pornhub

In 2009, a 14-year-old named Rose was abducted and raped in a 12-hour overnight attack by two men, while a third man recorded the assault on video.

Sometime after the attack, students from Rose’s school shared a link on MySpace. It led to Pornhub and revealed videos with hundreds of thousands of views and titles like “teen crying and getting slapped around,” “teen getting destroyed,” and “passed out teen.”

They were all of Rose. From that night, from that assault.

In November 2019, a 15-year-old girl who had been missing for a year was discovered in videos on Pornhub. She had become a victim of sex trafficking, and yet the porn site hosted 58 videos of her being raped by her trafficker and other sex buyers. Note that these videos were purportedly uploaded by a verified user, the same type of user that Pornhub says will be the only kind allowed to upload content to the platform.

Related: Pornhub Reportedly Profits From Nonconsensual Videos And Real Rape Tapes—Here Are The Latest Examples

These are not the only stories of nonconsensual content being uploaded to Pornhub, but they are two real and verified examples.

Honestly, we have no way of knowing how much content on Pornhub is nonconsensual or what percentage of their profits come from it. We only know that illegal and abusive videos exist when they shouldn’t, and that Pornhub and other sites could be doing more to eradicate them. That is what this article aims to examine.


More nonconsensual porn, and lots of it

Other examples of available nonconsensual content on Pornhub include “revenge porn,” or private images or videos posted by the ex-partner of the victim depicted. Some have had private accounts hacked and their images uploaded by an unknown perpetrator. Other victims of nonconsensual content have been secretly filmed in locker rooms or showers and “upskirted” on public transportation. Some are children forced to perform sex acts for an online audience, and others are groomed by adults online asking for nude pics. Some are deepfakes of ex-partners’ faces grafted onto porn performers’ bodies in horrifyingly convincing videos.

All are victims of image-based sexual abuse perpetrated partially or entirely online.

Free porn tube sites like Pornhub thrive off of user-uploaded content. Formerly, Pornhub specifically encouraged anyone, anywhere to upload porn, and lots of it, with seemingly no review in place before content became available to the public for consumption—some of it nonconsensual content like what we’ve described above. Now, with the new security measures in place on Pornhub, it seems only verified accounts will be able to upload content, but bear in mind that verified accounts have uploaded abusive content before. Not only that, but users can easily go to other porn sites that still allow user-uploaded content without any verification in place.

Pornhub’s infamous “year in review” reports feature mind-boggling examples of how often people took advantage of its (now supposedly suspended) user upload feature. Just consider the more than 1.36 million hours of new content that was uploaded to the site in 2019 alone. It would take a person 169 years to watch it all, and that’s just the new content, not the existing videos on the site.

Related: Their Private Videos Were Nonconsensually Uploaded To Pornhub, And Now These Women Are Fighting Back

The porn giant brags that the site receives 115 million visits each day. The combination of new videos and millions of eyeballs browsing the site is enticing to some advertisers—this is one way Pornhub makes money. As it turns out, the company has continued to sell ad space against, and profit from, illegal and abusive material, too.

For many survivors, discovering that recorded footage of their abuse has been uploaded is emotionally devastating. Not all, but many report signs of post-traumatic stress disorder or trauma akin to that of rape victims, responses supported and illustrated by studies that examine the impact of “revenge porn” on survivors.


A growing movement to hold tech companies responsible

It might surprise you to learn that nonconsensual porn is not uncommon.

“The first thing people need to understand is that any system that allows you to share photos and videos is absolutely infested with child sexual abuse,” Alex Stamos, professor at Stanford and former security chief at Facebook and Yahoo, said to The New York Times.

Related: Content On Pornhub Reportedly Normalizes And Promotes Racism And Racist Stereotypes

Last year, the newspaper investigated the rise of child abuse material online and found that tech companies are not putting serious effort into monitoring for illegal imagery.

The investigation looked at companies and tech platforms, including social media platforms. Facebook, they found, does scan for illegal material and accounts for the majority of content flagged by tech companies, but even Facebook is not using all of the detection resources available to it.

It’s important to note here that Facebook and other social media sites have few incentives to better monitor content, partly because of Section 230 of the Communications Decency Act, which allows platforms like Facebook to argue in certain situations that they are not publishers and therefore not responsible for the content their users upload.

In other words, Mark Zuckerberg, the co-founder of Facebook, isn’t who gets sued if a guy in Kansas uploads intimate pictures of his ex-girlfriend to Facebook with the intent to humiliate her. That guy in Kansas is responsible. This is understandable, but these cases are rarely so simple, and sometimes tech platforms can be held accountable for facilitating sexual abuse or, in severe cases, human trafficking.

Still, there are roadblocks for victims to get any sense of justice.

For starters, it is very difficult to remove images or videos once they have been shared online. Angela Chaisson, Principal at Chaisson Law, told us in an interview that getting images removed from a site like Pornhub is next to impossible:

“I will often tell a client that it’s just not worth the effort that it takes, which is a very unsatisfactory thing to say to a client as a lawyer. It’s like whack-a-mole. If you get them taken down from one place, they pop up in another.”

Recent public outrage suggests some people believe the platform should also be held responsible for not monitoring illegal or abusive content.

Related: “Hit That”: Do Both Pop Culture And Porn Culture Normalize The Abuse Of Women?


Pornhub’s response has largely been unconditional denial. Here’s part of a statement Pornhub Vice President Blake White sent to the Daily Dot in March of 2020, responding to calls to shut the site down over nonconsensual images and videos. The company now seems to acknowledge that nonconsensual content reaches the site, but this statement largely embodies its response over the last few years to victims of exploitation or abuse who begged to have content removed:

“Pornhub has a steadfast commitment to eradicating and fighting any and all illegal content on the internet, including nonconsensual content and child sexual abuse material. Any suggestion otherwise is categorically and factually inaccurate. While the wider tech community must continue to develop new methods to rid the internet of this horrific content, Pornhub is actively working to put in place state-of-the-art, comprehensive safeguards on its platform to combat this material. These actions include a robust system for flagging, reviewing and removing all illegal material, employing an extensive team of human moderators dedicated to manually reviewing all uploads to the site, and using a variety of digital fingerprinting solutions. We use automated detection technologies such as YouTube’s CSAI Match and Microsoft’s PhotoDNA as added layers of protection to keep unauthorized content off the site. We also use Vobile, a state-of-the-art fingerprinting software that scans any new uploads for potential matches to unauthorized materials to protect against any banned video being re-uploaded to the platform. We are actively working on expanding our safety measures and adding new features and products to our platform to this end, as they become available.”

Still, despite their alleged dedication to eradicating nonconsensual content, the Internet Watch Foundation recently investigated the site and confirmed over 100 cases of child sexual abuse material on Pornhub. But in response to this report, the site pointed out that this is less than 1% of the website’s content.

(Let’s do some quick math. Even if just 0.1% of the 6.83 million videos uploaded in 2019 alone were nonconsensual, that’s still 6,830 videos in one year. In our opinion, any amount of content that exploits children is far too much. One exploited child, or even one exploited adult, is too many.)

Pornhub’s White also released a statement in 2020 that details how the company believes their work to protect victims is having a positive impact.

Kate Isaacs, the founder of UK anti-revenge porn movement #NotYourPorn, disagrees. She worked with The Times to investigate whether Pornhub hosted illegal content and, spoiler alert, they found it too.

“There’s a level of delusion,” Isaacs said. “I genuinely think they think they are helping people more than they are.”

So, given the site’s new security restrictions, will Pornhub vet all previous content and remove it if it has been uploaded by non-verified users and/or is nonconsensual or exploitative in nature?


What does content moderation actually mean?

There are multiple options and technologies available to help monitor content, some of which Pornhub mentioned in its statement and even in today’s announcement about updated security measures, but so far none is a perfect solution, even when deployed thoroughly. At a minimum, monitoring needs to be a two-step process: reviewing content before it is published, and continually rescanning what is already live as new abuse reports and detection databases come in.
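
To make that two-step idea concrete, here’s a minimal sketch of such a pipeline in Python. Everything in it is invented for illustration (the class and function names are ours, not any site’s actual code); it simply shows uploads being held for review before publication, then the live catalog being rescanned later.

```python
# A minimal sketch of two-step moderation. All names here are invented for
# illustration; this is an architectural outline, not any site's real code.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Upload:
    video_id: str
    uploader: str

@dataclass
class ModerationPipeline:
    pending: list[Upload] = field(default_factory=list)
    live: list[Upload] = field(default_factory=list)

    def submit(self, upload: Upload) -> None:
        # Step 1: nothing goes live until it has been reviewed.
        self.pending.append(upload)

    def review(self, passes_checks: Callable[[Upload], bool]) -> None:
        # Publish only uploads that pass human and automated checks.
        self.live.extend(u for u in self.pending if passes_checks(u))
        self.pending.clear()

    def rescan(self, still_allowed: Callable[[Upload], bool]) -> None:
        # Step 2: periodically re-check the live catalog, since abusive
        # material is often identified only after victims report it.
        self.live = [u for u in self.live if still_allowed(u)]
```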

Still, here are a few methods that are used by big tech companies to monitor for illicit content all over the world, some of which Pornhub claims to use.

Human moderators

Facebook famously employs thousands of human moderators who rapidly click through image after image, flagging those that may be against the site’s community rules.

Related: More Than 80 Men Were Sexually Exploited And Secretly Filmed For This Guy’s Porn Site

We spoke with Karl Muth, economist and professor at Northwestern University, who told us human moderators are not the best way forward. No matter how little companies pay these legions of moderators, it is still expensive, not to mention traumatizing for the employees. No doubt it would be even more so for moderators of Pornhub content.

Reverse image search and sourcing technologies

Muth mentioned other options, such as reverse image search technologies. These are good at scanning through images to find copies, but not helpful for discovering new problematic images, since the original image or some copy of it is needed to conduct the search. There are also sourcing or tracing technologies that don’t necessarily look at the image itself but examine where the image comes from.

So if an image was shared yesterday on a group chat suspected of sharing illegal material, and is then uploaded to Pornhub, that could be a red flag.
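
We can only guess at how such provenance systems work internally, but here is a toy sketch of the idea in Python: keep a watchlist of fingerprints of files seen circulating in suspect sources, and flag any upload whose fingerprint matches. The watchlist contents are placeholders we invented.

```python
# A toy sketch of a provenance check: flag uploads whose exact bytes were
# already seen circulating in a suspect source. Watchlist data is invented.
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical: fingerprints harvested from a monitored group chat.
watchlist = {fingerprint(b"placeholder for a file seen in a flagged channel")}

def provenance_red_flag(upload_bytes: bytes) -> bool:
    """True if this upload matches a file from a known suspect source."""
    return fingerprint(upload_bytes) in watchlist

print(provenance_red_flag(b"placeholder for a file seen in a flagged channel"))  # True
```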

Hashing images

In 2009, Hany Farid, then a professor at Dartmouth College and now at the University of California, Berkeley, developed software with Microsoft to detect illegal images. Basically, PhotoDNA converts an image to a greyscale version, divides it on a grid, analyzes each of the smaller sections, and then creates a “hash,” or digital signature, of the image made up of numbers and letters. That hash can then be compared against a database of hashes of confirmed sexual abuse images to identify a match.
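
PhotoDNA itself is proprietary, so the sketch below is only a toy analogue of the greyscale-grid-hash idea, using a much simpler “average hash.” It assumes Pillow is installed; the filename and the “known hash” value are placeholders, not real data.

```python
# A toy perceptual hash in the spirit of (but far simpler than) PhotoDNA:
# greyscale the image, shrink it to an 8x8 grid, and record which cells are
# brighter than average as a 64-bit signature.
from PIL import Image  # pip install Pillow

def average_hash(path: str, grid: int = 8) -> int:
    img = Image.open(path).convert("L").resize((grid, grid))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits; a small distance suggests a near-duplicate.
    return bin(a ^ b).count("1")

# Compare a new upload against hashes of confirmed abusive images.
known_hashes = {0x8F3C42A10B7E95D6}  # placeholder value, not real data
upload_hash = average_hash("new_upload.jpg")  # hypothetical filename
if any(hamming_distance(upload_hash, h) <= 5 for h in known_hashes):
    print("Potential match with known material; route to human review.")
```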

Hashing is used by the Internet Watch Foundation and many other organizations, but it too is limited: because it requires comparison against a database of confirmed cases, new or unreported images can slip through the cracks, as they likely do in Pornhub’s case.

What Pornhub reportedly isn’t doing to help victims

These are methods Pornhub could utilize, but as far as reports and evidence suggest, it doesn’t use them to their fullest. Despite its claims of being proactive about removing illicit content, and its new promises to be even more proactive, there is little evidence that Pornhub currently reviews content uploaded to its sites before it goes live. Instead, the company mainly relies on users to report illegal content once it has already been posted.

Here’s how it supposedly works.

In theory, a nonconsensual porn victim submits a content removal request to Pornhub with links to the images or videos to be removed. If the victim doesn’t want their content re-uploaded to Pornhub, they are referred to Vobile to “fingerprint” their content, which supposedly makes it impossible to reupload. Or does it?

Related: Ukrainian Gynecologist Accused Of Sharing Hidden Cam Footage Of Patients With Porn Sites

In February 2020, VICE tested the reporting system and found that it doesn’t always work this way: with minor edits, the fingerprinting could be circumvented. Meaning, Pornhub may delete the original upload, but modified copies can continue to spread.
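
Vobile’s fingerprinting is proprietary, so we can’t say exactly how those edits defeat it, but the underlying brittleness is easy to demonstrate. With an exact (cryptographic) hash, changing even one byte of a file, standing in here for trimming, re-encoding, or watermarking a video, yields a completely different fingerprint; that’s why robust matching needs perceptual fingerprints compared with a similarity threshold, like the Hamming-distance sketch above.

```python
# Demonstrates why exact-match fingerprints are brittle: one flipped bit
# (standing in for any minor edit to a video) changes the hash entirely,
# so a naive exact lookup finds no match for the edited copy.
import hashlib

original = b"stand-in bytes for an uploaded video file"
edited = bytearray(original)
edited[0] ^= 0x01  # a minimal "edit"

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(edited)).hexdigest())  # completely different
```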

But let’s backtrack. Does Pornhub quickly respond to every request to remove nonconsensual content in the first place, like they claim?

Activists and victims have complained that Pornhub isn’t always cooperative and responsive to content removal requests.

For example, after 14-year-old Rose found videos of her assault on Pornhub, she sent messages to the site over a six-month period, repeatedly asking for the videos to be removed. She explained that the videos were of her assault and that she was a minor. She never received a response. It wasn’t until she had the idea to open a new email account and pose as a lawyer threatening legal action on her behalf that the videos were taken down, within two days.

Related: Pornhub Reportedly Refused To Remove Videos Of This Minor’s Sexual Assault—Until She Posed As Her Own Lawyer

Pornhub responded to Rose’s experience in a statement saying this happened under different company leadership in 2009 and that they have better practices now. But after working with victims in the UK very recently, Kate Isaacs has said their response remains inconsistent: sometimes responsive, sometimes silent.

Cara Van Dorn, an associate at the law firm Sanford Heisler Sharp, which has been representing some of the women involved in the notorious Girls Do Porn case, said to VICE:

“We had reached out to [Pornhub’s parent company] many times over the years and it wasn’t until the start of trial and obtaining numerous favorable rulings demonstrating undeniable strength of our case that [Pornhub] finally decided to start taking action. It’s not really ‘believing victims’ when it takes a team of lawyers and years of litigation before you lift a finger. It’s not really ‘doing the right thing’ when you only act when it is in your self-interest.”

Note that the first GirlsDoPorn videos of trafficked women started to get uploaded to porn sites like Pornhub around 2015, and the women depicted pleaded for years for the removal of this content once they were made aware of it through doxing and harassment. They were reportedly met with silence and inaction.


Pornhub finally removed the official GirlsDoPorn channel in October 2019 after the company’s owners were arrested for sex trafficking, but copies of hundreds of the videos still remain on multiple free porn sites (including, reportedly, Pornhub itself). VICE reported that the videos are hosted against banner ads that Pornhub still profits from. Note that the GirlsDoPorn channel was verified, so even if it still existed today, Pornhub’s newly announced security features wouldn’t necessarily have prevented the channel’s abusive uploads. Verification does not solve everything.

Related: 22 Women Paid $12.7 Million And Given Rare Ownership Rights In GirlsDoPorn Lawsuit

The main problem with content moderation that relies largely on reporting after content has been posted is that it puts the burden on victims to find, flag, and fingerprint their own abusive images or videos. This process can be traumatic, and after all of that effort, it isn’t guaranteed to work.

Here’s what porn sites could do to better protect victims

Pornhub has the opportunity to set a precedent for other porn tube sites and the adult industry as a whole to minimize the spread of nonconsensual content. But will they take advantage of it while the world’s eyes are on them? The new security features announced today are a start, but how well will they be implemented? 

Here are a few bare-minimum things Pornhub and other porn sites could do to minimize the spread of nonconsensual content. Remember that Pornhub’s December 8th announcement of new site security measures does take some of these steps, but other porn sites are woefully behind and put hardly any of these measures into practice.

Note that even exploitation-free porn is not harm-free and as an organization, we maintain that porn is harmful to consumers, relationships, and society as shown by decades of studies done by major institutions.

Related: Even If All Porn Was Consensual, Would There Be Any Issue With Watching It?

Monitor images and videos

The first move that would make a difference is a genuine policy to thoroughly monitor all content on the platform and review content before it is available for public consumption. Using technologies available and investing in those currently being developed could alleviate the burden on victims to find and report their own content.

Pornhub’s December 8th announcement says: “Pornhub’s current content moderation includes an extensive team of human moderators dedicated to manually reviewing every single upload, a thorough system for flagging, reviewing and removing illegal material, robust parental controls, and utilization of a variety of automated detection technologies.”

But if their current moderation system is simply going to continue as-is, it isn’t good enough. See the complaints from abuse victims documented above.

Scan for search terms and titles

Scanning for and banning search terms associated with child sexual abuse material, such as “teens” or “lolita,” is a no-brainer place to start for any and all porn companies. When Pornhub was asked why it hosted many videos with titles like “teen abused while sleeping” or “extreme teen abuse,” the company responded:

“We allow all forms of sexual expression that follow our Terms of Use, and while some people may find these fantasies inappropriate, they do appeal to many people around the world and are protected [forms of expression].”

But the issue isn’t just people finding fantasies “inappropriate,” it’s finding that much of the content on Pornhub—consensually uploaded and not—promotes and glorifies the rape, abuse, and exploitation of minors and men and women around the world. In any other industry, for any other tech company, this would not be tolerated.

In Pornhub’s announcement today, they said that “while the list of banned keywords on Pornhub is already extensive, we will continue to identify additional keywords for removal on an ongoing basis. We will also regularly monitor search terms within the platform for increases in phrasings that attempt to bypass the safeguards in place.” We’ll have to see how effective these keyword bans truly are in the future, and we hope other porn sites will follow suit.
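
Pornhub hasn’t published its banned-keyword list or its matching logic, so the sketch below is only our guess at the minimum such screening involves: normalize a query to defeat leetspeak and spacing tricks, then check it against a banned list. The terms shown are illustrative, not Pornhub’s actual list.

```python
# A minimal sketch of search-term screening with crude normalization to catch
# bypass spellings ("t33n", "t e e n"). Banned terms here are illustrative.
import re

BANNED_TERMS = {"teen abuse", "lolita"}

LEET = str.maketrans(
    {"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t", "@": "a", "$": "s"}
)

def normalize(query: str) -> str:
    q = query.lower().translate(LEET)
    q = re.sub(r"[^a-z]+", " ", q)  # strip punctuation and digit tricks
    return re.sub(r"\s+", " ", q).strip()

def is_blocked(query: str) -> bool:
    q = normalize(query)
    squashed = q.replace(" ", "")  # catch "t e e n"-style spacing
    return any(t in q or t.replace(" ", "") in squashed for t in BANNED_TERMS)

print(is_blocked("T33N abu$e while sleeping"))  # True
```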

Related: Why This Massively Popular Porn Site Doesn’t Care If Their Content Shows Rape

Scan for personal information

Many abuse videos include the full name or other personally identifying details of the victim depicted. Victims are often doxed in the title or comments, but once a victim’s name has been reported to Pornhub, the site could easily scan for other mentions and remove personal information that, left public, could cause further harm.
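
A scan like that doesn’t require exotic technology. Here is a minimal sketch of the idea, with invented names and titles standing in for real reported data:

```python
# A minimal sketch of scanning titles and comments for names that victims
# have already reported. All names and titles below are invented.
import re

reported_names = {"jane doe"}  # hypothetical names from removal requests

def mentions_reported_name(text: str) -> set[str]:
    """Return any reported names appearing in a title or comment."""
    lowered = re.sub(r"\s+", " ", text.lower())
    return {name for name in reported_names if name in lowered}

for title in ["Jane   Doe humiliated on camera", "unrelated video title"]:
    hits = mentions_reported_name(title)
    if hits:
        print(f"Flag for removal: {title!r} mentions {sorted(hits)}")
```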

The December 8th change to site security features did not address this issue, as far as we are aware.

Make verification count

It can be difficult to distinguish between real nonconsensual content and videos made by professional studios intending to mimic abusive situations. There’s a difference, even though violent porn (professional or not) that fantasizes abuse has been shown to have concerning effects on consumers. Properly verifying the users who upload content would at least reduce the chance that this violent content comes from trafficked individuals.

Across social media platforms, the blue checkmark helps users find verified accounts of public figures. Pornhub uses a similar blue check system, and now only verified users can apparently upload content, but the barriers to getting verified are reportedly low. All a person needs to do is upload a verification image: a photo of themselves with their Pornhub username and Pornhub’s website written on a piece of paper or on their body. Pornhub accepts either.

The 15-year-old trafficking victim we mentioned before was “verified” with the blue checkmark on Pornhub, misleading consumers to believe she was a consenting adult performer. The reality was very different. Also consider that the now-defunct GirlsDoPorn production company’s channel was verified on Pornhub (at its peak, the site’s 20th most popular channel) and hosted dozens of videos of trafficked women.

Clearly, Pornhub needs a better and more reliable system, though even that would not completely filter out all exploitative content. Consider how even established professional porn performers are often exploited and abused on set in the name of sexual entertainment. As of this writing, many other porn sites haven’t even begun to implement an upload verification system like this one.

Remove the download feature

On many porn sites, to download any video on the site, all you need is a login. That’s it. On Pornhub, this change was announced today: “Effective immediately, we have removed the ability for users to download content from Pornhub, with the exception of paid downloads within the verified Model Program. In tandem with our fingerprinting technology, this will mitigate the ability for content already removed from the platform to be able to return.”

Previously, the ease with which site users could download videos was in large part why there are still hundreds of GirlsDoPorn videos featuring verified trafficking victims online, as well as endless copies of other abusive or exploitative content. Will Pornhub now remove these videos retroactively?

Pornhub offers the option for “model” profiles on the platform to customize their download settings for their uploaded videos, allowing consumers to save their clips for a price. That’s what they referenced in the announcement of changes today. Model profiles can set a price per video download between $.99 and $150. Still, this possibly enables users with illicit content to profit directly from videos of exploited or trafficked individuals.

Predictably, downloaded, edited, manipulated, and pirated copies of saved videos are reuploaded daily, ensuring that content of trafficked and exploited individuals can never truly be erased from the site or other porn tube sites, even taking into account the fingerprint technology Pornhub purports to use.

Incentivizing meaningful change

The main problem with any of these suggestions is that Pornhub has had few incentives to change, until now. Visa and Mastercard’s investigations into the site directly prompted these changes.

Monitoring content is an investment and generally not one that brings a monetary return—in fact, it’s often the opposite. But it’s the responsible and ethical thing to do, and the very least porn sites can do since they knowingly or otherwise profit from exploited individuals and nonconsensual content.

Related: How This Guy Reportedly Posted Revenge Porn Of His Ex To Pornhub Where It Got 1,000+ Views

Karl Muth explained that because it’s not a revenue-rich venture, it’s unlikely a tech company would hire a top product manager and assign them to content moderation when they could be maximizing revenue in another department. Facebook is a prime example of this.

“I think that’s why this issue has ended up in the corporate social responsibility backwater of the conversation rather than being an area where people develop and apply cutting edge solutions,” he said. “As long as there’s only one Katie Hill a year, does anybody on Facebook’s board care?”

In response to the many, many cases of nonconsensual porn that have come to light in recent months, we’re compelled to ask: Does Pornhub as a brand and a company truly, deeply care when there are other revenge porn cases shared across their site? What about child abuse images or sex trafficking? Or deepfakes and spycam porn? Would they have announced these changes today had it not been for the Visa and Mastercard investigations into MindGeek?

Related: PayPal, Kraft, And Unilever: Why These Big Companies Recently Stopped Working With Pornhub

The bottom line is the negative consequences for failing to monitor content don’t seem to be severe enough for sites to take action, even considering the social fallout Pornhub is experiencing.

Even in a hypothetical world where Pornhub and other similar sites are perfectly held accountable and image-based sexual abuse and child sexual abuse material is successfully removed every time, the demand remains. This is a cultural problem.

Still, this all traces back to the fact that porn sites encourage demand for everything from young-looking performers to abusive porn by hosting this content in the first place, responding to criticism only now that there is real pressure to change.

It’s up to each of us as consumers to choose who will profit from our screen time. As a pro-love, pro-sex, anti-porn organization, we know they won’t be profiting from us.