Moderators Sue TikTok Due to Mental Health Issues from Seeing So Many Violent Videos

Former moderators have filed a lawsuit against TikTok for allegedly not protecting them from the emotional trauma caused by reviewing hundreds of “highly toxic and extremely disturbing” videos.

August 15, 2022

Trigger warning: This article contains descriptions of disturbing, abusive content.

When you hop on social media sites like TikTok, YouTube, Facebook, and Instagram, you probably don’t have to worry about coming across real, disturbing content: videos of people being shot in the face or of a child being beaten, or graphic, abusive images of underage children.

That’s all thanks to people like Ashley Velez, a Las Vegas mother of two boys, who accepted a job last year as a moderator for TikTok.

Related: There’s a Serious Porn Problem on Popular Social Media Platforms

She was excited to “be the front line of defense” in protecting social media users from seeing toxic content, but she quickly realized that she had gotten much more than she bargained for—it was only a matter of months before she quit her job.

Her experience is representative of many people who have signed on to help keep social media sites safe only to be victimized themselves in the process. And issues like these seem to be getting worse.

Velez leads a class-action suit against TikTok

As you might already know, TikTok is a Chinese-owned short-form video-sharing app that is home to more than 1 billion monthly active users.

A large part of TikTok’s success is attributable to its roughly 10,000 moderators worldwide, who police videos on the platform to ensure it remains an endless feed of lighthearted content rather than a cesspool of violent and distressing videos.

Related: TikTok Videos of Underage Teens are Reportedly Being Stolen and Uploaded to Pornhub

Like any employer, part of TikTok’s job is to ensure its employees are able to operate in a safe work environment. However, according to Velez and fellow former TikTok moderator Reece Young, the company did not provide such an environment.

For that reason, Velez and Young have filed a federal lawsuit seeking class-action status against TikTok and its parent company, ByteDance. More specifically, they allege the company was negligent and broke California labor laws by failing to protect moderators from the emotional trauma caused by reviewing hundreds of “highly toxic and extremely disturbing” videos every week, including videos of animal cruelty, torture, and even the execution of children.

A lawyer from the Joseph Saveri Law Firm, the firm that filed Velez and Young’s case, said, “You see TikTok challenges, and things that seem fun and light, but most don’t know about this other dark side of TikTok that these folks are helping the rest of us never see.”

Why is toxic content posted in the first place?

You might be wondering why these moderators’ jobs are so miserable. Why do they have to look at and remove such terrible content in the first place?

Sadly, the answer is that people upload this content because there’s a demand for it.

In particular, Velez mentioned that “underage nude children” is one type of content that frequently has to be filtered out. And her experience as a moderator is corroborated by outside research.

Related: Why TikTok Has Become a Magnet For Child Predators

In 2019, a BBC investigation of TikTok revealed hundreds of sexually explicit comments under videos posted by minors, including children as young as nine. While TikTok eventually removed the comments after they were reported, the tech giant failed to suspend the commenters’ accounts.

The findings of the BBC investigation are not a surprise considering there are also communities within TikTok that encourage “soliciting images of boys and girls” and certain hashtags for sharing nudes.

TikTok was also named to the National Center on Sexual Exploitation’s Dirty Dozen List for 2020 because it had become known as a “hunting ground” for predators, giving strangers easy access to children to groom, abuse, and traffic. And Forbes identified the app as a “magnet for predators.”

Grooming is happening faster than ever on TikTok

The unfortunate reality is that TikTok seems to be a hub for child abusers as well as child abuse images.

For child predators, the process of manipulating a child and building trust has traditionally been relatively drawn out. Now, however, grooming can happen almost instantaneously thanks to sites like TikTok.

In fact, according to some experts, young girls are sharing self-generated sexual content online “within seconds” of going on apps similar to TikTok.

The Internet Watch Foundation (IWF), an organization that specializes in removing child abuse imagery from the internet, recently took action on more than 37,000 reports of self-generated sexual images and videos of children. The IWF warns that girls were the victims in 92% of the child sexual abuse content it removed, and that 80% of that material showed girls between the ages of 11 and 13.

Related: “My Uncle Is One Of My Fans”: Real Horror Stories from OnlyFans Creators

The IWF’s chief executive, Susie Hargreaves, suggests that young, naive girls are highly vulnerable to exploitation because they’re often not emotionally mature enough to realize what is happening, are easily flattered by predators’ compliments, and are unaware of the potential consequences.

Hargreaves said that it’s not uncommon for perpetrators to ask young children to get undressed “within seconds” of getting on the app. “These children clearly do not realize it is an adult coercing and tricking them into doing things… Just because a child is in their bedroom, it does not necessarily mean they are safe.”

Porn worsens existing issues of extreme content

From social media platforms to mainstream porn sites, sexualization of young people, especially girls, is a huge moneymaker across the internet. Consider how the “teen” porn category has topped Pornhub’s charts for nearly a decade.

Moderators are necessary because of the demand for content involving underage or underage-looking people, as well as the demand for taboo and exploitative material like real videos of rape, torture, or abuse.

Unfortunately, porn does a lot to normalize, promote, and facilitate this persistent problem of extreme content online.

Related: How Many People are on Porn Sites Right Now? (Hint: It’s a Lot.)

Porn consumers are often drawn to the secrecy, shock value, and taboo nature of available content, all of which offer varying ways to feed a desire for novelty and excitement. And consumers who consistently view this type of material may find their sexual interests eventually deviating in very unexpected directions. In one 2016 study, researchers found that 46.9% of respondents reported that, over time, they began watching pornography that had previously disinterested or even disgusted them. [Wéry, A., & Billieux, J. (2016). Online sexual activities: An exploratory study of problematic and non-problematic usage patterns in a sample of men. Computers in Human Behavior, 56, 257-266. doi:10.1016/j.chb.2015.11.046]

These findings are consistent with other research showing that changing tastes and escalation are not uncommon experiences among porn consumers. [Bőthe, B., Tóth-Király, I., Zsila, Á., Griffiths, M. D., Demetrovics, Z., & Orosz, G. (2017). The development of the Problematic Pornography Consumption Scale (PPCS). The Journal of Sex Research, 55(3), 395-406. doi:10.1080/00224499.2017.1291798] [Downing, M. J., Schrimshaw, E. W., Scheinmann, R., Antebi-Gruszka, N., & Hirshfield, S. (2016). Sexually explicit media use by sexual identity: A comparative analysis of gay, bisexual, and heterosexual men in the United States. Archives of Sexual Behavior, 46(6), 1763-1776. doi:10.1007/s10508-016-0837-9]

It’s worth asking: would Velez and Young be subjected to such horrible content if the porn industry hadn’t normalized and perpetuated abusive material on such a massive scale in the first place?

To learn more about how porn sites profit from abusive and nonconsensual content, click here. 

Support this resource

Thanks for taking the time to read through this article! As a 501(c)(3) nonprofit, we’re able to create resources like this through the support of people like you. Will you help keep our educational resources free as we continue raising awareness about the harms of porn and sexual exploitation?

DONATE