Disclaimer: Fight the New Drug is a non-religious and non-legislative awareness and education organization. Some of the issues discussed in the following article are legislatively-affiliated. Including links and discussions about these legislative matters does not constitute an endorsement by Fight the New Drug. Though our organization is non-legislative, we fully support the regulation of already illegal forms of pornography and sexual exploitation, including the fight against sex trafficking.
When Taylor was 14, her boyfriend secretly made a video of her performing a sex act.
Taylor, now age 18, said she went to school the next day thinking life was normal, but noticed that her peers kept looking at their phones and then at her, laughing. The video had been posted on Pornhub. Later, she said she attempted suicide twice.
Not only was Taylor a victim of image-based sexual abuse when that video was filmed and shared without her consent, but it is also considered child sexual abuse material because she was underage. To add further pain to the experience, Pornhub profited from Taylor’s video by hosting ads on the same webpage.
“They made money off my pain and suffering,” she said.
Taylor is one of many victims of image-based sexual abuse (IBSA) and child sexual abuse material (CSAM) shared on porn sites and social media platforms. Her story was published along with other heart-wrenching accounts in a recent opinion column by the Pulitzer Prize-winning journalist Nicholas Kristof, and a subsequent follow-up piece five days later.
Listen: Nicholas Kristof’s Interview with Consider Before Consuming, a Podcast by Fight the New Drug
Kristof’s articles continued the work of many journalists and advocates in exposing the side of the internet that most people do not want to think about.
“The Children of Pornhub”
The article, titled “The Children of Pornhub,” gives visibility to Pornhub’s questionable business practices, nonconsensual content reportedly posted on the platform, and anti-exploitation advocates including the Internet Watch Foundation, Traffickinghub, and the National Center for Missing and Exploited Children.
And while not mentioned in the article, the work of many other advocates and survivors made this article possible.
The fact is nonconsensual imagery and child sexual abuse material exist on porn sites and even on the tech platforms you likely use every day.
The initial op-ed points out that when it comes to CSAM and IBSA, this is not a divisive political or religious issue. This is about sexual abuse on the internet. In a sentiment that is also reflected in Fight the New Drug’s mission, Kristof wrote, “It should be possible to be sex-positive and Pornhub negative.”
The response to the first article was immediate, resulting in Pornhub announcing three major policy changes they say will prevent the sharing and continuous circulation of nonconsensual images and videos.
For some, this marks a step in the right direction, but to other victims and advocates, it is too little, too late.
Regardless, this far-reaching article has been years in the making. Countless anti-exploitation advocates and survivors of trafficking, child abuse, and image-based abuse have worked tirelessly since long before December 2020 to expose the questionable business practices of the porn industry and the proliferation of nonconsensual content on porn sites.
An eye-opening op-ed
Cali, Serena, Xela, Nicole, Lydia, Susan, Jessica, and Leo told their stories and were quoted in Nicholas Kristof’s initial December 4th article. Doubtless there were many more, like Rose Kalemba and Avri Sapir, two victims of child sexual abuse and outspoken advocates for change in the way porn sites have responded to their requests for the removal of rape videos.
Each of their experiences reveals the pain victims endure. They may be bullied or ostracized by their social networks, drop out of or struggle through school, and see their future careers put at risk because explicit videos surface when a potential employer googles their name.
Victims of IBSA and CSAM often experience devastating mental health impacts such as depression, anxiety, and post-traumatic stress disorder. Some victims even attempt suicide. In Taylor’s case, she tried twice.
Related: 7 Cases Of Nonconsensual Porn And Rape Tapes Pornhub Doesn’t Want Consumers To Know About
“I may never be able to get away from this,” Cali said. “I may be 40 with eight kids, and people are still masturbating to my photos.”
“An assault eventually ends, but Pornhub renders the suffering interminable,” Kristof wrote.
Alongside the stories of young female victims, Kristof’s first article criticized porn sites and social media companies for profiting from nonconsensual and abusive material. While the focus was largely on Pornhub because it is a household name and represents a significant chunk of the adult industry, he did not forget to mention MindGeek, the parent company that owns Pornhub and about 100 other websites and production companies, as well as competitors like XVideos, which he described as having “even fewer scruples.”
He also pointed out an important but often missed fact: CSAM exists on mainstream sites like Twitter, Reddit, and the suite of platforms owned by Facebook. Even Google, which blocks porn advertisements, still supports the business model of porn sites by returning explicit videos as the top hits for searches like “young porn.”
Kristof called out American Express, Visa, and Mastercard for providing the services to pay for this content and even the entire country of Canada, where MindGeek is headquartered, for ignoring the company’s pattern of profiting from exploitation.
In a statement to Kristof, Pornhub said they are “unequivocally committed to combating child sexual abuse material” and that they have instituted an “industry-leading” safety policy to identify and remove abusive material from the site.
If this statement was referring to Pornhub’s content moderation team, then we would say it is far from effective. Moderators review new uploads, checking for age and abuse, but both are extremely difficult to confirm.
One Pornhub moderator told Kristof that there are about 80 moderators worldwide, compared with Facebook’s 15,000. Let’s do some quick math. If there really are only 80 moderators reviewing “every single upload,” which totaled 1.39 million hours of new video in 2019, that works out to roughly 17,375 hours of review per moderator per year. A standard 40-hour work week for 52 weeks equals just 2,080 hours a year. Even watching uploads at increased playback speed, the numbers do not add up.
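For anyone who wants to check that back-of-envelope math, here is a minimal sketch in Python. The upload figure comes from Pornhub’s own 2019 statistics and the moderator count from Kristof’s reporting; everything else is simple division:

```python
# Back-of-envelope check of the moderation workload described above.
new_upload_hours = 1_390_000  # hours of video uploaded to Pornhub in 2019
moderators = 80               # approximate worldwide moderator count, per Kristof

review_hours_each = new_upload_hours / moderators  # ~17,375 hours per moderator
full_time_year = 40 * 52                           # 2,080 working hours per year

print(f"Review workload per moderator: {review_hours_each:,.0f} hours/year")
print(f"Full-time working hours:       {full_time_year:,} hours/year")
print(f"Playback speed needed to keep pace: {review_hours_each / full_time_year:.1f}x")
```

In other words, each moderator would have to watch video at more than 8x speed for every working hour of the year just to keep up with new uploads, never mind reviewing anything twice.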
A former MindGeek employee said the focus was maximizing profits, and as a moderator, the goal was to “let as much content as possible go through.”
Backlash brings new changes
The responses to Nicholas Kristof’s article were almost immediate. Justin Trudeau, Prime Minister of Canada, said he was “extremely concerned” and that his government is working with law enforcement to address sex trafficking and child abuse material. Perhaps unsatisfied with this statement, a group of Canadian lawmakers sent a letter to the Attorney General stating they were “appalled at the lack of enforcement by Canada’s justice system.” This was their second letter, a follow-up to their first plea in March 2020 for the Canadian government to act.
Also quick to respond was American Express, which clarified that it does not work with adult websites, while Mastercard and Visa opened investigations into the allegations of illegal content on Pornhub.
Days later, both payment services stopped processing payments on the tube site after confirming unlawful content. Visa said payments are suspended for MindGeek but their review is still ongoing, and Mastercard said they are continuing to investigate potential illegal content on other websites.
Visa and Mastercard cutting ties with Pornhub follows PayPal, which stopped working with MindGeek after a groundbreaking investigation by the Sunday Times UK in 2019. That reporting similarly uncovered child sexual abuse material and nonconsensual content on Pornhub and helped pave the way for the recent New York Times article.
Before Visa and Mastercard cut ties, Pornhub showed signs of nervousness about losing its payment services and issued a statement this week adopting Kristof’s three suggestions: allow only verified users to post content, ban downloads, and expand moderation.
The policy change was met with frustration from victims and advocates who have been vocal in their criticism of Pornhub’s mistreatment of victims well before Visa and Mastercard’s investigations.
For example, Rose Kalemba requested Pornhub remove a video of her sexual assault but was met with silence until she pretended in an email to be her own lawyer, and English footballer Leigh Nichols reported a nonconsensual video recorded when she was under 18 and posted to Pornhub, only to later have the site advertise the video and her full name on the Pornhub homepage.
Despite these hostile experiences and many more pleas for change from survivors, it took the New York Times to spur Visa and Mastercard into action, and only then did Pornhub respond with policy changes. Of course, this is enough to make anyone suspicious of Pornhub’s priorities.
Related: Visa And Mastercard Sever Ties With Pornhub Due To Abusive Content On The Site
Following Pornhub’s statement, Kristof responded with skeptical optimism, tweeting:
“A great deal depends on how responsibly Pornhub implements these [changes], and it has not earned my trust at all, but these seem significant.”
In contrast, Scott Berkowitz, president of the Rape, Abuse & Incest National Network (RAINN), said he did not think Pornhub’s changes would come anywhere close to fixing the problem of CSAM and IBSA online.
These may be dramatic changes for Pornhub, since their baseline for supporting victims is so low, yet we remain skeptical that these measures will work. More than anything, it is a shame it took the threat of losing Mastercard and Visa for Pornhub to make a change, and not the pleas of countless survivors of rape, abuse, and trafficking.
“Until Pornhub can brush off their egos and admit to the damage they’ve done to victims of image-based sexual abuse, they’re going to have serious problems,” Kate Isaacs, founder of #NotYourPorn, told us. “It’s time for them to put their hands up, admit they have not been doing nearly enough to protect victims and make huge changes to the platform.”
Can a leopard change its spots?
Pornhub’s current business model encourages the upload of hours, even years, of content and sells ads against that content, much like other mainstream tech companies. They pay lip service to victims of nonconsensual imagery and profess a zero-tolerance policy for child abuse material, but then, knowingly or unknowingly, position ads next to those videos, directly profiting from that material.
In a tweet, Kristof said MindGeek had informed him that the changes would apply to all MindGeek sites, not just Pornhub. But we are still wondering, are these new policies enough to change the nature of Pornhub? Or is this merely a gesture to boost public opinion? Let’s take a closer look.
The first change is to only allow verified users to upload content to Pornhub. Currently, only content partners such as porn production studios and members of Pornhub’s Model Program can upload content to the tube sites. Pornhub promised to implement a new verification process in the new year that would allow any user to upload after passing the protocol.
Barriers like this could reduce the amount of abusive material uploaded simply because uploading will not be as easy as it once was. That is a positive. Our concern is with the new verification process itself; we hope it will be more rigorous than the previous protocol, which did not require proof of age or identification.
The previous verification rules would be laughable except that they allowed videos of sexual assault to be uploaded and spread throughout the site. One story quoted in Kristof’s article was of a 15-year-old trafficking victim who was discovered in videos on Pornhub, but those videos came from a verified account. In another instance, a man was recently charged with producing child sexual abuse material of a 16-year-old girl, and Pornhub confirmed the videos came from the abuser’s verified account.
The second change is, in our minds, the biggest win and may make a genuine difference. Removing the download feature could significantly reduce the number of copies of IBSA and CSAM recirculating online. Victims live through a constant cycle: finding their video online, reporting it, and (if they are lucky) seeing it removed, only for a downloaded copy to be re-uploaded and circulated.
Avri Sapir was a victim of child sexual abuse from the time she was an infant until 15 years old. When videos of her abuse were uploaded to Pornhub, they were removed within a day, but Pornhub did not remove the accounts that made the initial uploads. Unsurprisingly, the videos were back online a few hours later.
“Surely [Pornhub] would fingerprint the videos as they claimed they always do,” Avri tweeted, “Report them to NCMEC [National Center for Missing and Exploited Children], & make sure they couldn’t be reposted, especially after I got so much attention for calling them out the first time they were posted, right? Wrong.”
Avri wrote that a video of her when she was 9, being assaulted and tortured, was again uploaded to Pornhub where it remained for weeks until NCMEC “forced” the tube site to take it down. By that time, the video had 20,684 views.
The supposedly state-of-the-art fingerprinting software Avri mentions, intended to prevent previously removed content from being re-uploaded, is anything but airtight. An investigation by VICE revealed how the software fails to truly protect victims, even as Pornhub points to it as proof that they stop abusive content from being shared.
“It’s like whack-a-mole,” Angela Chaisson, a Canadian attorney, explained to us. “If you get them taken down from one place, they pop up in another.” It is this constant fear of dissemination that makes victims feel like the abuse will never end. At this point in time, they are not wrong.
Of course, there are still ways to screenshot or download videos through alternative software, but simply removing the easy-access download feature could dissuade many consumers.
Finally, Pornhub says they will make big improvements in moderation with their newly established “Red Team,” which will sweep existing content on the platform for violations and look out for breakdowns in the moderation process. They said this will be in addition to their current “extensive team of human moderators dedicated to manually reviewing every single upload.” They did not confirm the size of this “extensive team” or suggest how many more will make up the Red Team. It seems unlikely MindGeek will rapidly hire thousands of moderators to rival Facebook’s numbers, but moderators alone are clearly not an effective way to prevent abusive material online; plenty of nonconsensual content is shared and spread across Facebook’s platforms, too.
This issue is bigger than Pornhub alone
Despite some of the flaws, Mary Anne Franks, President of the Cyber Civil Rights Initiative (CCRI), seems optimistic about the changes.
“We are glad that Pornhub has adopted several of our recommendations to underscore the importance of consent… It of course remains to be seen how effective these policies will be, but we welcome these important initial steps towards responsibility in an industry notorious for avoiding it.”
Indeed, the porn industry is notorious for ignoring the nonconsensual imagery and child abuse material found on its websites, and it has been equally woeful at assisting victims. Those victims were conspicuously absent from Pornhub’s statement and policy change. If MindGeek is genuine about protecting victims, whether or not Mastercard and Visa are involved, then we would suggest another quick fix: remove all content featuring the victims who have spoken out, and add their names and identifying features as banned search terms so the material cannot be discovered on its sites.
Victims of IBSA and CSAM suffer tremendously. The abuse they survive can have life-altering negative effects, including suicidal ideation. Unlike an assault, which eventually ends, images shared on social media and porn sites retraumatize the victim again and again.
This is not just a Pornhub issue, or even a porn industry issue. This is an issue of child sexual abuse material and image-based sexual abuse on the internet. It is not about political persuasions or religious beliefs; it is about stopping abuse and helping victims heal.
Kristof put it well when he said in his follow-up column published on December 9th, “The issue isn’t pornography but rape. It is not prudishness to feel revulsion at global companies that monetize sexual assaults on children; it’s compassion.”
If you’re a survivor of abuse, exploitation, or trafficking, reach out to us. We want to support you and connect you with survivor-led resources.