
The Horrifying Videos Overseas TikTok Moderators Are Exposed To

Artificial intelligence isn't perfect, so humans are still used to review a majority of the most horrifying videos on the platform. How does it affect them?

September 13, 2022

Trigger warning: This article contains descriptions of violent and abusive videos. Reader discretion is advised.

“The devil of this job is that you get sick slowly—without even noticing it… You think it’s not a big deal, but it does affect you.”

These are the words of Wisam, a former content moderator, who shared his experiences working for TikTok with the online news site Business Insider.

The critical job of overseas moderators

When you watch a video on Facebook (now owned by Meta), Instagram, TikTok, or YouTube, among other prominent social media platforms, you don’t have to worry about seeing a group of teenagers beat an old man with an axe, a man use a hunting gun to shoot himself, or someone violently murder a cat.

The reason you don’t have to see that content is that people like Wisam and his coworkers are watching it for you.

Related: Moderators Sue TikTok Due to Mental Health Issues from Seeing So Many Violent Videos

But why? Why can’t social media sites simply use technology like artificial intelligence and machine learning to moderate content?

While TikTok and other social media sites do use artificial intelligence to help review content, the technology isn’t always perfect—especially when it comes to videos that feature languages other than English.

For this reason, humans are still used to review a majority of the most horrifying videos on the platform. And their work is essential, ensuring that advertisements from reputable companies like Nike don’t appear alongside porn or violent material.

The moderating experiences of Imani and Samira

Imani was 25 years old when, in September 2020, she took a job as a content moderator for TikTok through the temporary hire agency Majorel. Living in a single-bedroom apartment in Casablanca, Morocco, Imani was brought in to help support the company’s growth in the Middle East.

Although she had a bachelor’s degree in English, she took the $2-an-hour job because she was struggling to find work during the first part of the pandemic and because her husband, a technician, could not support their infant daughter on his own.

Related: TikTok Videos of Underage Teens are Reportedly Being Stolen and Uploaded to Pornhub

The work was so mentally distressing that she left the job almost as quickly as she had started it, and she says she is still dealing with the effects of the work about two years later.

Imani is not alone in her story.

Nine current and former content moderators in Morocco who worked for TikTok through Majorel described to Business Insider experiences of severe psychological distress as a result of their jobs.

Samira, 23, was one of them. In addition to the distress of viewing gruesome content, Samira stated that she and her colleagues were treated like “robots.”


She was tasked with reviewing 200 videos every hour while maintaining an accuracy score of 95%. This score was calculated based on how closely her tags matched those of more senior content moderators who watched the same videos.

Related: What You Should Know About Porn and Child Predators on TikTok

However, three months into her job, the goalposts shifted, and her manager bumped her video-per-hour quota up to 360. In other words, Samira had just 10 seconds to review each video while still being held to an extremely high accuracy requirement. Understandably, she ended up leaving, too.

It’s a sad truth, but it seems the cost of keeping social media sites safe is victimizing others in the process. Just ask Imani and Samira.

Why is toxic content uploaded to sites like TikTok?

It’s pretty simple, actually. This content exists because there’s a massive demand for it.

In particular, one type of content in heavy demand is “underage nude children,” according to Ashley Velez, a Las Vegas-based moderator. Unfortunately, Velez’s account is supported by plenty of other sources.

Related: There’s a Serious Porn Problem on Popular Social Media Platforms

Forbes identified the platform as a “magnet for predators,” while the National Center on Sexual Exploitation named TikTok to its 2020 Dirty Dozen List because of the platform’s reputation for giving strangers easy access to children who can then be groomed, abused, and trafficked.

The BBC also conducted its own investigation into TikTok’s toxic sexual content. It found whole communities that encouraged “soliciting images of boys and girls” and hashtags dedicated specifically to sharing nudes. In addition, the investigation revealed hundreds of sexually explicit comments on videos of children as young as nine years old.

Clearly, this is a massive issue, especially given how much porn normalizes, promotes, and facilitates the existing and persistent issue of extreme content online.

Why this matters

Whether it’s gory, sexually violent, or otherwise disturbing, user-generated content on platforms often leaves room for unthinkable, illicit, and explicit material. As TikTok and other content platforms like it grow, the graphic content will grow too, and hopefully the moderating force with it.

Related: Real Stories of People Finding Hidden Cameras in Bathrooms and Hotels

But for the moderating force to grow and improve, will the mental and emotional well-being of exploited individuals be sacrificed in the process?

What it comes down to is this: if there were no demand for explicit content, it wouldn’t be uploaded, and if it weren’t uploaded, people like Imani and Samira might still have their jobs without the negative mental health effects.

