
This Popular Porn Site Uses an Unpaid, Untrained, Volunteer Team to Evaluate Illicit Content

Porn site xHamster uses volunteers to help the site’s content moderation efforts. The entry requirements are low, support is minimal, and the collection of volunteers is questionable at best.

January 6, 2021

How do you determine the age of a person by looking at a photo or video of them online? It is easy to tell the difference between a toddler and a grandfather, obviously. Even the difference between a 10-year-old and a 20-year-old is pretty clear. But what about the difference between a 15-year-old and a 19-year-old girl? What if they both have a full face of makeup on?

Turns out, identifying a minor at a glance is really difficult. It is particularly difficult on porn sites because of the popularity of “teen” porn. This search term has topped the charts for years, and in response, porn productions cast young adults who could pass as teens in appearance.

Mixed in with adults-pretending-to-be-teens porn, though, are actual videos of minors and nonconsensual abusive content. How can the average consumer tell the difference? Well, they can’t.

Related: Are Porn Sites Protecting Victims Of Nonconsensual Content? We Investigated, Here’s What We Found

This is where content moderation comes in, and the world’s fourth most-popular adult site, with over 910 million monthly visits, is approaching this practice in a concerning way.

According to a VICE investigation, porn site xHamster is recruiting volunteers to help the site’s content moderation efforts. The entry requirements are low, support is minimal, and the collection of volunteers is questionable at best.


Why do porn sites need content moderation?

Most websites have terms of service stating the kinds of content that are prohibited on their platform, and porn sites often declare they have zero tolerance for nonconsensual content like image-based sexual abuse. But in reality, many cases of abusive material, including child abuse material, exist on porn sites.

It is currently left up to websites to review user-uploaded content that may violate their terms of service. Content moderation can take the form of human moderators, reverse image search and sourcing technologies, and hashing. No single approach is effective on its own, so sites should combine several. Porn sites say they practice a series of moderation steps to ensure no nonconsensual content is on their sites, but so many unacceptable pieces of content slip through the cracks that we wonder if this is really true. Earlier in 2020, Fight the New Drug investigated porn sites’ moderation claims, and this is what we found.
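For readers unfamiliar with the term, “hashing” here means fingerprinting each upload and comparing it against fingerprints of material already identified as abusive. The sketch below shows the basic idea in Python; the hash list and file path are hypothetical, and it uses an exact SHA-256 digest purely for illustration, whereas production tools like PhotoDNA rely on perceptual hashes that survive resizing and re-encoding.

```python
# Minimal sketch of hash-based moderation, for illustration only.
# A fingerprint of each new upload is checked against fingerprints of
# material already identified as abusive. The hash value below is a
# hypothetical placeholder, not a real entry from any database.
import hashlib

# Hypothetical set of digests of previously identified abusive files.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block(path: str) -> bool:
    """Flag an upload that exactly matches a known abusive file."""
    return file_digest(path) in KNOWN_ABUSE_HASHES
```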

Related: 7 Cases Of Nonconsensual Porn And Rape Tapes Pornhub Doesn’t Want Consumers To Know About

The xHamster volunteer approach

xHamster is currently the 22nd most visited website in the world, ranking higher than WhatsApp and eBay. It is a free tube site built on user-generated content, and it has asked for volunteers to help review newly uploaded material.

The result is the “Reviewers Club,” a group of international volunteers who check whether the people in an image or video are under 18. What training do these individuals receive? Each moderator is given a 480-word manual with 38 images explaining what kinds of content are permitted. To give you some context for how brief that is, you’ve almost read 480 words in this article alone.

For their investigation, VICE created an xHamster account to apply to be a moderator. Once approved, they corresponded with other moderators, asking why they do this job for free.

One wrote that underage images are “forbidden” and they want to protect others from viewing that content.

Another said xHamster is his favorite porn site. He described himself as a porn and sex addict looking for something new.

Another hoped volunteering would grant “privileges” like access to private profiles and other content on the site.

Related: “I Wasn’t In Control Of My Body”: How The Porn Industry Cashes In On Nonconsensually Shared Images

It’s important to note that these volunteer moderators do their work anonymously through their private xHamster accounts. The porn company does not know the identity of its moderators, only their usernames.

According to VICE, moderators have 11 different buttons to choose from when reviewing an image. Seven recommend deletion for reasons such as copyright infringement, a minor in the image, animal abuse, or—most classily—visibility of human fecal matter. If the moderator is unsure, they can press “Skip” and leave it to another reviewer, or “Other” and write in their reasoning for a photo’s removal.

For an image to be deleted, it needs to be recommended by multiple moderators, but if the moderators flagged the image for different reasons, VICE reports the photo stays online.
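Put another way, VICE’s description suggests the deletion rule works roughly like the sketch below: an image comes down only when enough moderators flag it for the same reason, while mismatched flags leave it online. The reason names and the two-flag threshold are assumptions for illustration, not xHamster’s actual implementation.

```python
from collections import Counter

# Hypothetical deletion reasons modeled on those VICE describes.
DELETE_REASONS = {"minor", "copyright", "animal_abuse", "other"}
REQUIRED_AGREEMENT = 2  # assumed threshold; the real number is not public

def review_decision(flags: list[str]) -> str:
    """Delete only if enough moderators flagged the same reason."""
    counts = Counter(reason for reason in flags if reason in DELETE_REASONS)
    if counts and max(counts.values()) >= REQUIRED_AGREEMENT:
        return "delete"
    return "keep"  # mismatched reasons leave the image online, per VICE

print(review_decision(["minor", "minor"]))      # -> delete
print(review_decision(["minor", "copyright"]))  # -> keep
```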

Mistakes, like flagging an image that remains online, are recorded in a moderator’s personal statistics. According to an older version of the manual, a moderator who classifies over 15% of images incorrectly will permanently lose their place in the Reviewers Club. xHamster did not respond to VICE’s request to clarify if this rule still applies.

xHamster claims to have “legions” of moderators, but this is difficult to verify. The volunteer moderators communicate through an xHamster account called “RClub,” which only had 130 friends as of October. Additionally, xHamster stated it has paid staff who review the content previously checked by the volunteer moderators.


Removing harmful content on xHamster

Moderators click through thousands of photos a day looking for minors, but there are substantial gaps in the xHamster manual for what else they should be checking for.

For example, the brief manual does not list secret filming or voyeurism as prohibited material. xHamster moderators seem to have been advised about this by the site’s administration. One moderator wrote that “hidden cam, voyeur, upskirt, all OK unless there is some other violation.” Another said, “I don’t like it either and it’s a crime here in the USA, but xHamster admin said it’s voyeurism and OK.”

There have been other complaints that xHamster has failed to remove deepfake pornography despite reports of its abusive nature. Alex Hawkins, VP of xHamster, said the company doesn’t have a specific deepfake policy but treats deepfakes like other nonconsensual content. If that means nonconsensual content like upskirting and secret filming, which moderators say is allowed, are we to assume that deepfakes are “okay” to xHamster, too?

Related: If A Porn Performer Is Abused During Filming, Where Do They Report It?

One part of the manual seems to encourage moderators to brush aside their concerns if an image is borderline. It reads, “Do not remove any content if you’re not 100% sure that it’s illegal to be here.” The problem is that even trained and paid content moderators cannot determine if consent was given or be 100% sure of the age of a person in an image.

On this point, one moderator wrote, “Man, reviewing underage is impossible.”


Why this matters

It’s difficult to identify minors and nonconsensual content on porn sites. These volunteers receive minimal training and support, a problem that several moderators complained about to xHamster.

But even if these unpaid viewers receive more guidance from xHamster, the site still does not know its moderators’ identities or motivations. Clearly, each reviewer has a different reason for participating, some because they already spend a chunk of their time consuming porn. Can these moderators really be relied on to accurately identify and report abusive content without bias?

Related: What You Should Know About Porn And Child Predators On TikTok

Until there is a foolproof method for ensuring child abuse and nonconsensual material is not posted and shared online, consumers and victims alike are left relying on a patchwork of content moderation methods to protect against online harassment.

xHamster is just the latest porn site trying to signal they are making progress in the right direction, but when held up to the light, there are significant flaws in their approach. Victims of nonconsensual content deserve better.
