
Pornhub Allegedly Only Recently Started Reporting Child Abuse and Nonconsensual Content on the Site

The world now knows how involved the world’s largest porn company has been in reporting child abuse content uploaded to its sites, and its record appears disturbingly lacking.

Disclaimer: Fight the New Drug is a non-religious and non-legislative awareness and education organization. Some of the issues discussed in the following article are legislatively affiliated. Though our organization is non-legislative, we fully support the regulation of already illegal forms of pornography and sexual exploitation, and we support the fight against sex trafficking.


The biggest brand in the porn industry has been under intense scrutiny for years by survivors and anti-exploitation advocates, but media attention and public pressure have increased significantly since December 2020.

Pornhub and its lesser-known parent company, MindGeek, were exposed in an article published in The New York Times by award-winning journalist Nicholas Kristof. Kristof gave visibility to survivors’ stories and to years of information indicating that Pornhub reportedly profits from CSAM and nonconsensual content.

These allegations had been reported previously by several outlets, including ourselves, but Kristof’s article was able to grab the attention of payment services like Visa, Discover, and Mastercard, which suspended services to MindGeek after independently confirming the claims of CSAM on Pornhub.

Related: MindGeek, Pornhub’s Parent Company, Sued For Reportedly Hosting Videos Of Child Sex Trafficking

MindGeek announced policy changes at the end of 2020, including removing the download feature on Pornhub and only allowing uploads from verified users. They also suspended all content on Pornhub that was originally uploaded by unverified users, which turned out to be over 10 million videos, or over half of the porn site’s content library.

While each of these steps is a move in the right direction, some survivors felt Pornhub’s motives were disingenuous, and that the company only reacted to protect its bottom line, not out of concern or respect for victims of sexual abuse and exploitation.

As one survivor said in reference to Pornhub finally removing nonconsensually shared videos of her sexual abuse in December, “Had they done this back in 2018 when I first contacted them, my life would look much different now. They never cared about my wellbeing and profited off these illegal activities.”

In February 2021, Canadian lawmakers opened an investigation into MindGeek’s dealings. While the porn company is headquartered in Luxembourg, its main office is in Montreal. The Standing Committee on Access to Information, Privacy, and Ethics (ETHI) called on MindGeek executives, CEO Feras Antoon and COO David Tassillo, to testify, and their statements were mixed with mistruths about MindGeek’s dedication to victims of abuse.

Most recently, the committee heard witness statements from leaders in child protection services that further weakened the MindGeek executives’ testimonies about their content moderation and reporting of illegal content.

Related: 13 Times MindGeek Executives Reportedly Didn’t Tell The Full Truth To Canadian Lawmakers

Pornhub’s failure to report revealed

“Our two-decade-long social experiment with an unregulated internet has shown that tech companies are failing to prioritize the protection of children online,” said Lianna McDonald, Executive Director of the Canadian Centre for Child Protection, in her opening statement to the ETHI committee.

Along with McDonald, other child protection leaders spoke to the committee about the dire issue of CSAM and nonconsensual content online, plus MindGeek’s responsibilities.

A recent letter sent to the Canadian committee, signed by 525 organizations and 104 survivors from 65 countries, stated MindGeek appears to have violated Canada’s child protection laws which require internet service providers to notify police of instances of CSAM on their sites. For the 10 years this law has been active, MindGeek has reportedly not done so.

As their defense, the MindGeek executives pointed to their “partnership” with the National Center for Missing and Exploited Children (NCMEC). John F. Clark, President and CEO of NCMEC, dispelled this idea at the start of his testimony.

“I would like to clarify for the committee,” Clark began, “that NCMEC and Pornhub are not partners. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC, but this does not create a partnership… as Pornhub recently claimed during some of their testimony.”

Later, Clark revealed that in 2020, Pornhub made over 13,000 reports of CSAM to the CyberTipline operated by NCMEC; however, about 9,000 of those were duplicates. For comparison, Facebook made 15 million reports to NCMEC in 2019, by far the largest number from any tech company. That does not necessarily mean there is more abusive content on Facebook, but that the social media platform is doing a better job of identifying and reporting it.

Related: 525 Organizations And 104 Survivors Sign Letter Urging Canada Lawmakers To Investigate MindGeek, Pornhub’s Parent Company


Pornhub’s failures in content moderation

“‘We’ll do better’ isn’t a defense. It’s a confession,” said Daniel Bernhard, the Executive Director of Friends of Canadian Broadcasting, in reference to MindGeek executives’ promises to improve their content moderation.

“Of course, Pornhub’s leaders try to blame everyone but themselves,” said Bernhard. He told the ETHI committee that Canadian law does in fact hold platforms responsible for what their users post in two circumstances: first, if a platform promotes illegal content it knows of in advance and publishes anyway; and second, if a platform is made aware of illegal content post-publication but neglects to remove it.

MindGeek and Pornhub could very clearly be liable under both circumstances.

Related: What’s Going On With Pornhub? A Simplified Timeline Of Events

For one, the company claims human moderators view every single piece of content uploaded to its sites, a claim that is easy to doubt given the sheer volume of hours uploaded relative to the stated number of moderators. Still, the MindGeek executives testified to this, perhaps unknowingly implying that the company was aware of the abusive or underage content and allowed it to be uploaded anyway. Second, survivor testimonies make clear that MindGeek reportedly neglected to remove nonconsensual content after being notified by victims.

Several survivors have expressed how frustrating their experiences have been in dealing with Pornhub when trying to get their content removed. The company is reportedly slow at responding to take-down requests and sometimes asks victims to provide proof the video is nonconsensual. Instead of erring on the side of caution that a video may be nonconsensual, Pornhub reportedly told at least one victim to submit copyright take-down notices. Some survivors only received a response from Pornhub after pretending to be someone older with more authority, like their parent or lawyer.

“We also noticed in some instances a very strong reluctance on their part to take down material,” said John F. Clark of NCMEC. Over the years, many victims have reached out to NCMEC saying they have not received positive responses from Pornhub. On their behalf, NCMEC has communicated directly with Pornhub to request the content removal, which was reportedly granted.

Clark also shared that it was only after the MindGeek executives’ testimonies in early February that the company entered into an agreement with NCMEC to access its hashing databases. These databases are collections of digital fingerprints, or “hashes,” of confirmed CSAM and sexually exploitative content; companies use them in conjunction with hashing technologies that scan their platforms for copies of the abusive material.
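To illustrate the principle of hash matching, here is a minimal sketch in Python. To be clear, this is not MindGeek’s or NCMEC’s actual system, and the database entry is a placeholder: real deployments use perceptual hashing tools like Microsoft’s PhotoDNA, which can match images that have been resized or re-encoded, whereas the plain cryptographic hash below only catches byte-for-byte identical copies.

```python
import hashlib

# Hypothetical set of hashes of confirmed abusive material, standing in
# for a database like NCMEC's. The entry below is a fake placeholder.
KNOWN_ABUSE_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def should_block_upload(path: str) -> bool:
    """Flag an upload whose hash matches a known-abuse hash."""
    return file_hash(path) in KNOWN_ABUSE_HASHES
```

The key point is that the scanning technology is only as useful as the database behind it: without access to a set of known hashes like NCMEC’s, there is nothing to match uploads against.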

Related: These Exploitation Survivors Boldly Testified Against Pornhub To Canada’s Parliament

This is standard practice, which raises an obvious question: how has MindGeek been utilizing the hashing technologies it claims to have used for years if it has only just asked for access to NCMEC’s databases? Clark added that, despite the new agreement, Pornhub had not yet taken steps to actually access the databases. This revelation raises serious questions about Pornhub’s ability to moderate its content.


Holding platforms to account

MindGeek may not be responsible for originally creating the nonconsensual content uploaded to its sites, but countless survivors have said that the dissemination of that content on Pornhub made their trauma worse.

From NCMEC’s work with survivors, John F. Clark said the trauma suffered by victims of this online, image-based abuse is unique. “The continued sharing and recirculation of a child’s sexually abusive images and videos inflicts significant revictimization on the child.”

Over the course of these hearings, the true scope of the situation is being revealed. It’s clear that the world’s largest porn company, founded in 2004, which claims to care about victims of CSAM and nonconsensual abuse, has reportedly only very recently put basic safeguards in place.

And these incredibly basic safeguards were put in place seemingly not because of the hundreds, possibly thousands, of victims of image-based abuse, trafficking, and child exploitation begging for the videos and images of their violations to be removed, but reportedly because the company wanted to protect its financial success and preserve its bottom line.

Related: The New York Times Exposé That Helped Spark The Possible Beginning Of The End Of Pornhub

However, it is worth noting that MindGeek is not the only company with this problem. Clark shared that no other adult platform is currently reporting to or working with NCMEC. Surely this needs to change to make a difference for survivors of abuse.

Lianna McDonald at the Canadian Centre for Child Protection echoed the sentiments from the committee witnesses:

“While the spotlight is currently focused on MindGeek, we wanted to make it clear that this type of online harm is occurring daily across many mainstream and not so mainstream companies operating websites, social media services, and any number of them could have been put under this microscope as MindGeek has by this committee. It is clear whatever companies claim that they are doing to keep CSAM off their servers, it is not enough.”

We agree, which is why we joined 525 organizations and 104 survivors from 65 countries in signing a letter calling on the Canadian government to encourage law enforcement to launch a criminal investigation into MindGeek.

While MindGeek needs to be held accountable, our work of educating on the harmful effects of porn and stopping the demand for abusive content is far from over. This is only the beginning.