
XVideos, World’s Most Popular Porn Site, Reportedly Hosts Nonconsensual Content & Child Exploitation

XVideos is the most-visited porn site in the world, and it’s also the latest tube site under fire for reportedly hosting abusive content.  


In a recent investigative op-ed for The New York Times, journalist Nicholas Kristof recounted the tragic experience of a 16-year-old girl in Perth, Australia. This teen Snapchatted a nude photo of herself to her boyfriend at the time, with the message, “I love you. I trust you.”

Without consent, the boyfriend took a screenshot of the Snap and shared it with five of his friends, who then shared it with 47 other friends. Before long, more than 200 students at the teen’s school had seen her image. One person uploaded it to a porn site with her name and school.

The teen stopped attending school and self-medicated with drugs. Her family moved to a different city and then a different state, but she felt she could not escape. At 21 years old, she died by suicide.

Related: Study Reveals Image-Based Abuse Victims Suffer Similar Trauma As Sexual Assault Victims

The harmful effects of nonconsensually sharing private sexual images, also known as image-based sexual abuse (IBSA), are serious, and the devastation victims experience is made exponentially worse if their images or videos are uploaded to porn sites.

This is why XVideos is the latest tube site under fire for reportedly hosting this abusive content.


A quick timeline of events

In December 2020, Nicholas Kristof wrote a different op-ed that went viral. It exposed Pornhub for reportedly hosting and profiting from nonconsensual content, or IBSA, and child sexual abuse material (CSAM), and criticized the company for its reportedly poor treatment of victims who appealed to the website to remove abusive images.

In response to these allegations, Pornhub announced a series of improvements, including removing the download feature and only permitting verified users to upload content. Yet it wasn't enough to retain payment services like Mastercard, Discover, and Visa, which independently verified the existence of illicit content on Pornhub and severed ties with the company.

Related: The New York Times Exposé That Helped Spark The Possible Beginning Of The End Of Pornhub

At the start of this year, the Canadian government opened an inquiry into Pornhub’s dealings and reviewed its parent company, MindGeek. The hearings included testimony from the porn company’s executives and from victims whose lives have been dramatically altered by videos of their abuse being nonconsensually shared and promoted on Pornhub. The outcome of these discussions is still unclear, but one thing we know for sure is that our culture cannot go back to pretending we are unaware of the kind of content on porn sites. Nonconsensual content on porn tube sites that rely on user-generated material has been proven to be common, accessible, and devastating to those who are victimized.

While Kristof’s article garnered a lot of outrage and attention about Pornhub and MindGeek, one pornographer said to Kristof that his reporting was a gift to Pornhub’s competitor—that the journalist was like “Santa Claus” to XVideos. That has proven to be true, so far.

When Pornhub deleted 10 million videos from its site that had been uploaded by unverified users, many upset porn consumers flocked to its rival site, XVideos—a site with far fewer scruples than Pornhub, and seemingly less oversight.


What we know about XVideos

The issue of nonconsensual content and CSAM online is much bigger than Pornhub. It exists on other porn sites, social media platforms, and just about any site that allows users to upload content.

XVideos and Pornhub are free adult tube sites that have competed for years for the top spot in the porn industry. Currently, XVideos ranks as the number one porn site in the world, and the seventh most visited website on the internet, with its sister site XNXX.com close behind at number nine. Since the controversy, Pornhub has tumbled out of the top ten to number 13.

This is part of the reason why Kristof’s reporting was described as a “gift” to XVideos. As Pornhub has been under fire, consumers have migrated from Pornhub to XVideos where there are very few restrictions on content.

Related: Massive Porn Site XVideos Investigated For Hosting Videos Of Child Sexual Abuse And Exploitation

The website guides consumers to videos that claim to be of children. It was reported that searching for “young” returned similar suggestions including “tiny,” “girl,” “boy,” “jovencita,” and “youth porn.” And if that wasn’t concerning enough, the page where CSAM can be reported on XVideos says that if there are no details included with the report, it “will be ignored.”

Why would they ignore even a single report of CSAM, details included or not? Are there so many CSAM reports they have to go through that they have to discard some for lack of detail? Does this seem to be a reporting system from a site that takes claims of CSAM as seriously as it should?

XVideos content report page, April 2021

Earlier this year, Czech authorities announced an investigation into XVideos, which is based in the Czech Republic, after the allegations that the company was enabling and allowing users to upload and share IBSA, nonconsensual content, and CSAM. Under this pressure, XVideos has removed some underage search terms, but even searching for “twelve” suggested other terms including “elementary” and “training bra.”

These efforts at making the site safer from abusive content are hardly comprehensive. Just in the last few days, advocates have found reportedly clear examples of CSAM and nonconsensual content on the site without having to search very thoroughly. Examples of real porn titles on the site include references to "getting f—ed awake," "passed out" wives, and 18-year-olds taking advantage of schoolchildren. Some of these types of videos look staged, but many look very real.

Related: What’s Going On With Pornhub? A Simplified Timeline Of Events


The role of payment services and search engines

Going forward, Kristof suggests a couple of steps. He called on credit card companies and payment services to abandon XVideos in a similar way they cut ties with Pornhub. After the publication of his op-ed, PayPal contacted Kristof to announce that its service would no longer be available for purchasing advertising on XVideos or its sister sites.

The second point is perhaps more challenging to fix. XVideos and Pornhub rely on search engines to drive traffic to their sites. Kristof called out Google in particular, but also Yahoo and Bing, for enabling the abusive content on sites like XVideos. If such a claim is making you raise your eyebrows, let us explain.

Related: This Child Abuse Expert Says Many Abusers Started By Watching Mainstream Porn

When consumers are looking for porn, a common place to start is Google. Searching for an explicit term on any of the major search engines returns links to Pornhub and XVideos, which constantly vie for the top listing. This also holds true for search terms suggestive of underage or nonconsensual material. For example, in Kristof's research, he typed a series of terms into Google, including "schoolgirl" and "rape unconscious girl," which returned links to XVideos advertising these types of videos.

But wait, if people go looking for something, they'll eventually find it…right? Does Google really have the capability to redirect people away from searches yielding nonconsensual content and CSAM? Yes, it does.

To prove this, Kristof googled a few things, one of which was “how do I commit suicide?” and the top results returned suicide prevention hotlines. If that simple redirection is possible for suicide or searching for ways to “poison my husband,” why not for illicit content?


Why this matters

Not all content on XVideos is actual IBSA or CSAM. Many of the links Google returns for a search term like "schoolgirl" will likely be of porn performers who mimic these so-called "genres," but this blend of professional videos mixed with nonconsensual and abusive content makes it even more challenging for consumers to tell the difference between the two.

Related: MindGeek, Pornhub’s Parent Company, Sued For Reportedly Hosting Videos Of Child Sex Trafficking

Perhaps another question we should be asking is, how did such abusive scenarios become porn genres in the first place?

A study published this year found that one in eight video titles on the home pages of three major porn tube sites alone (XVideos, Pornhub, and XHamster) described sexual violence or nonconsensual conduct. Many more of the videos themselves showed violence. Videos portrayed women caught on spy cams in changing rooms, unconscious women being raped, and some even depicted children or adults trying to fight back against an assault. The report also found that the most common form of sexual violence shown was between family members, and frequent terms used to describe the videos included "abuse," "annihilation," and "attack." The researchers concluded that these sites are "likely hosting" unlawful material.

Another study analyzing the acts portrayed in porn videos suggests that between 33.9% and 88.2% of popular porn scenes contain violence or aggression, depending on the definition used, and that women are the targets of that violence approximately 97% of the time.

Related: How To Report Child Sexual Abuse Material If You Or Someone You Know Sees It Online

These findings, combined with victims' accounts and the explicit search terms themselves, are difficult to digest. But we keep covering the issue of abusive and exploitative content on porn sites to raise awareness that, in turn, reduces the demand enabling the perpetration of sexual violence and exploitation.

There is still a long way to go in stopping online exploitation, including IBSA and CSAM. Videos that show a "7th grader" or an "unconscious girl" are not sexual fantasy and entertainment—they are exploitation. Exploitation, rape, sexual assault, and sex abuse are not sexy.

To learn more about how even exploitation-free porn is not harm-free to consumers, click here. 
