
xHamster Reportedly Uses an Unpaid, Untrained, Volunteer Team to Moderate Content

Porn site xHamster reportedly uses volunteers to help the site’s content moderation efforts. The entry requirements are low, support is minimal, and the collection of volunteers is questionable at best.

October 14, 2021

How do you determine the age of a person by looking at a photo or video of them online?

It is easy to tell the difference between a toddler and a grandfather, obviously. Even the difference between a 10-year-old and a 20-year-old is pretty clear. But what about the difference between a 15-year-old and a 19-year-old girl? What if they both have a full face of makeup on?

Turns out, identifying a minor at a glance is really difficult to do. It is particularly difficult on porn sites because of the popularity of “teen” porn. This search term has topped the charts for years, and in response, porn productions cast young adults who could pass as teens in appearance.

Mixed in with adults-pretending-to-be-teens porn, though, are actual videos of minors and nonconsensual abusive content on popular mainstream porn sites. How can the average consumer tell the difference? Well, they can’t.

Related: Let’s Talk About Ethical Porn—Is It Really Exploitation-Free?

This is where content moderation comes in, and xHamster, the world’s fourth most popular adult site with over 910 million monthly visits, is approaching this practice in a concerning way.

According to a VICE investigation, porn site xHamster is recruiting volunteers to help the site’s content moderation efforts. The entry requirements are low, support is minimal, and the collection of volunteers is questionable at best.


Why do porn sites need content moderation?

The unfortunate truth is that the porn industry has an extensive history of profiting from nonconsensual content and abuse, often ignoring victims’ pleas to remove abusive content. [Kristof, N. (2021). Why do we let corporations profit from rape videos? New York Times. Retrieved from https://www.nytimes.com/2021/04/16/opinion/sunday/companies-online-rape-videos.html; Kristof, N. (2020). The children of Pornhub. New York Times. Retrieved from https://www.nytimes.com/2020/12/04/opinion/sunday/pornhub-rape-trafficking.html]

Porn is a powerful, multi-billion dollar industry. Three porn sites—XVideos, XNXX, and Pornhub—all rank among the top 20 most trafficked websites in the world. [Similarweb. (2021). Top websites ranking. Retrieved from https://www.similarweb.com/top-websites/] And while Pornhub has received the most scrutiny in the last year, it’s important to remember that virtually every major porn site has had issues with nonconsensual content and abuse. [Kristof, N. (2021). Why do we let corporations profit from rape videos? New York Times. Retrieved from https://www.nytimes.com/2021/04/16/opinion/sunday/companies-online-rape-videos.html; Burgess, M. (2020). Deepfake porn is now mainstream, and major sites are cashing in. Wired. Retrieved from https://www.wired.co.uk/article/deepfake-porn-websites-videos-law; Kristof, N. (2020). The children of Pornhub. New York Times. Retrieved from https://www.nytimes.com/2020/12/04/opinion/sunday/pornhub-rape-trafficking.html; Meineck, S., & Alfering, Y. (2020). We went undercover in xHamster’s unpaid content moderation team. VICE. Retrieved from https://www.vice.com/en/article/akdzdp/inside-xhamsters-unpaid-content-moderation-team]

Most websites have terms of service stating the kinds of content that are prohibited on their platform, and porn sites often declare they have zero tolerance for nonconsensual content like image-based sexual abuse, but in reality, many cases of abusive material, including child abuse material, exist on porn sites.

Related: 34 Trafficking And Abuse Survivors Sue Pornhub For Reportedly Profiting From Their Exploitation

It is currently left up to websites to review user-uploaded content that may violate their terms of service. Content moderation can take the form of human moderators, reverse image search and sourcing technologies, and hashing. No single approach is effective on its own, so sites should ideally combine several methods. But this doesn’t happen often.
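
Of these, hashing is the most automatable: a fingerprint of each upload is compared against a database of fingerprints of known abusive material. Below is a minimal sketch of that idea in Python, using the open-source Pillow and imagehash libraries; the hash database, distance threshold, and file path are hypothetical stand-ins for production systems like PhotoDNA.

```python
# Minimal sketch of hash-based moderation. The known-hash database and
# the distance threshold below are hypothetical; real deployments use
# curated hash lists (e.g., from NCMEC) and carefully tuned thresholds.
from PIL import Image
import imagehash

# Hypothetical perceptual hashes of known abusive images.
KNOWN_ABUSIVE_HASHES = [
    imagehash.hex_to_hash("d1d1b1a3c4f0e8e8"),
]

# Perceptual hashes survive small edits (resizing, re-encoding), so we
# compare by Hamming distance instead of exact equality.
MAX_HAMMING_DISTANCE = 5  # assumption; not taken from any real system

def matches_known_abusive(path: str) -> bool:
    """Return True if the upload is close to any known abusive hash."""
    upload_hash = imagehash.phash(Image.open(path))
    return any(
        upload_hash - known <= MAX_HAMMING_DISTANCE
        for known in KNOWN_ABUSIVE_HASHES
    )

if __name__ == "__main__":
    if matches_known_abusive("upload.jpg"):  # hypothetical file
        print("Block the upload and escalate to human review.")
```

Note that hash matching only catches material that has already been identified and fingerprinted; it cannot flag newly created abusive content, which is why human review remains necessary.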

Porn sites say they follow a series of moderation steps to ensure no nonconsensual content appears on their platforms, but so many unacceptable pieces of content slip through the cracks that we wonder if this is really true. In 2020, Fight the New Drug investigated porn sites’ moderation claims, and this is what we found.

Also in 2020, Pornhub—owned by a company that owns and controls much of the mainstream porn industry, MindGeek—was exposed for reportedly hosting and profiting from numerous cases of child exploitation and nonconsensual content.

The xHamster volunteer approach

So what’s going on with the fourth-most popular porn site?

xHamster is currently the 22nd most visited website in the world, higher than WhatsApp and eBay. It is a free tube site with user-generated content and has reportedly asked for volunteers to help review the newly uploaded material, according to the VICE investigation.

The result is the “Reviewers Club,” a group of international volunteers who check whether the people in an image or video are under 18.

What training do these individuals receive? Each moderator is given a 480-word manual with 38 images explaining what kinds of content are permitted. To give you some context for how brief that is, you’ve read over 480 words in this article alone.

For their investigation, VICE created an xHamster account to apply to be a moderator. Once approved, they corresponded with other moderators, asking why they do this job for free. This is what they said:

  • One wrote that underage images are “forbidden” and they want to protect others from viewing that content.
  • Another said xHamster is his favorite porn site. He described himself as a porn and sex addict looking for something new.
  • Another hoped volunteering would grant “privileges” like access to private profiles and other content on the site.

Related: “I Wasn’t In Control Of My Body”: How The Porn Industry Cashes In On Nonconsensually Shared Images

It’s important to note that these volunteer moderators do their work anonymously on their private xHamster accounts. The porn company does not know the identities of its moderators, only their usernames.

According to VICE, moderators choose from 11 different buttons when reviewing an image. Seven recommend deletion for reasons such as copyright infringement, a minor in the image, animal abuse, or—most classily—visibility of human fecal matter. If the moderator is unsure, they can press “Skip” and leave it to another reviewer, or “Other” and write in their reasoning for a photo’s removal.

For an image to be deleted, it needs to be recommended by multiple moderators, but if the moderators flagged the image for different reasons, VICE reports the photo stays online.
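
Based on VICE’s description, the deletion logic might work something like the following sketch. The reason labels and the exact vote threshold are assumptions; xHamster has not published either.

```python
# Sketch of the consensus rule VICE describes: an image is deleted only
# when enough moderators flag it for the *same* reason. The reason
# labels and the vote threshold here are assumptions.
from collections import Counter

VOTES_REQUIRED = 2  # "multiple moderators"; the real number isn't public

def should_delete(flags: list[str]) -> bool:
    """flags holds one deletion reason per moderator who flagged the image."""
    counts = Counter(flags)
    # Flags for different reasons don't combine, which is why an image
    # flagged once as "minor" and once as "copyright" stays online.
    return any(n >= VOTES_REQUIRED for n in counts.values())

print(should_delete(["minor", "minor"]))      # True: consensus on one reason
print(should_delete(["minor", "copyright"]))  # False: split votes, stays up
```

If this is accurate, the rule has an obvious failure mode: two moderators can agree that an image should come down, yet it stays online simply because they disagreed about why.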


Mistakes, like flagging an image that remains online, are recorded in a moderator’s personal statistics. According to an older version of the manual, a moderator who classifies over 15% of images incorrectly will permanently lose their place in the Reviewers Club. xHamster did not respond to VICE’s request to clarify if this rule still applies.
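
That rule reduces to a simple ratio check, sketched below. Only the 15% threshold comes from the manual; the function and field names are hypothetical illustration.

```python
# Sketch of the older manual's 15% error-rate rule. Only the threshold
# comes from the manual; everything else here is hypothetical.
ERROR_RATE_LIMIT = 0.15

def still_in_reviewers_club(correct: int, incorrect: int) -> bool:
    total = correct + incorrect
    if total == 0:
        return True  # nothing reviewed yet, nothing to judge
    return incorrect / total <= ERROR_RATE_LIMIT

print(still_in_reviewers_club(correct=90, incorrect=10))  # True: 10% errors
print(still_in_reviewers_club(correct=80, incorrect=20))  # False: 20%, removed
```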

xHamster reportedly claims to have “legions” of moderators, but this is difficult to verify. The volunteer moderators communicate through an xHamster account called “RClub,” which had only 130 friends as of October 2020. Additionally, xHamster stated it has a paid staff that reviews the content previously checked by the volunteer moderators.

Removing harmful content on xHamster

Moderators click through thousands of photos a day looking for minors, but there are substantial gaps in the xHamster manual for what else they should be checking for.

For example, the brief manual does not include secret filming or voyeurism as prohibited material. xHamster moderators seem to have been advised about this by the site’s administration. One moderator wrote that “hidden cam, voyeur, upskirt, all OK unless there is some other violation.” Another said, “I don’t like it either and it’s a crime here in the USA, but xHamster admin said it’s voyeurism and OK.”

There have been other complaints that xHamster has failed to remove deepfake pornography despite reports of its abusive nature. Alex Hawkins, VP of xHamster, said that the company doesn’t have a specific deepfake policy but treats deepfakes like other nonconsensual content.

If he means the nonconsensual content like upskirting and secret voyeurism, then are we to assume that deepfakes are “okay” to xHamster, too?

Related: If A Porn Performer Is Abused During Filming, Where Do They Report It?

One part of the manual seems to encourage moderators to brush aside their concerns if an image is borderline. It reads, “Do not remove any content if you’re not 100% sure that it’s illegal to be here.” The problem is that even trained and paid content moderators cannot determine if consent was given or be 100% sure of the age of a person in an image.

On this point, one moderator wrote, “Man, reviewing underage is impossible.”

Why this matters

It’s difficult to identify minors and nonconsensual content on porn sites. These volunteers receive minimal training and support, a problem that several moderators complained about to xHamster.

But even if these unpaid reviewers receive more guidance from xHamster, the site still does not know its moderators’ identities or motivations. Clearly, each reviewer has a different reason for participating, some because they already spend a chunk of their time consuming porn. Can these moderators really be relied on to accurately identify and report abusive content without bias?

Related: Are Porn Sites Protecting Victims Of Nonconsensual Content? We Investigated, Here’s What We Found

And besides, whether monitoring content is truly effective or not, there’s also the problem of cost. Consider the issues with moderation that Sarah Ditum poses for The Guardian. She’s referring to OnlyFans here, but the same concept applies to any for-profit porn site that seeks to moderate content:

“Moderation, especially pre-moderating all content, which is the only way to ensure nothing criminal slips through, is expensive. (It’s also traumatic for the moderators, who have to see all the terrible things that the rest of the world needs protecting from.) Using moderation to clamp down on users who make money would eat into profits even further. Platforms that host user-generated content of any kind are inevitably caught between the demands of decency and profit. When the content involved is porn, however, the push to extremity and the proximity to illegal activity make the tensions especially keen.”

What are the implications of that? The fact is, the vast, vast majority of people who regularly consume porn are logging onto free-to-view sites that do not prioritize moderation, safety, and rooting out exploitation.

The largest, most popular sites are also those that make the most money from views, clicks, and downloads. Moderation is not a profitable venture, and it is not often exhaustively utilized on popular porn sites.

Related: Would “Exploitation-Free” Porn Be Harm-Free For Consumers?

Until there is a foolproof method for ensuring child abuse and nonconsensual material is not posted and shared online, consumers and victims alike must rely on a questionable patchwork of content moderation methods to protect victims from online harassment.

xHamster seems to be one of many porn sites trying to signal progress in stopping the spread of exploitative content, but when its approach is held up to the light, significant flaws appear.

Victims of nonconsensual content deserve better, and porn consumers deserve to know that they could be watching someone’s abuse.

So while the concept of fully moderated porn sites may be seen as a solution to the ethics issues in the porn industry, unfortunately, there is still no way to guarantee that moderation is fully effective. And regardless, porn can still have serious negative effects on the consumer and their relationships. Is watching worth it?

For the sake of victims and consumers alike, it’s time to stop the demand.