
There’s a Serious Porn Problem on Popular Social Media Platforms

With huge numbers of users comes huge responsibility, and not a single site has developed a porn-proof filtering system or a thorough enough moderation team.

58.4% of the world’s population uses social media. The average daily usage is 2 hours and 27 minutes. –Smart Insights

Social media has become a staple of modern communication and self-expression, but it's also a major hub for all types of shocking content, including porn and, sometimes, child exploitation. But why is there so much porn on social media?

With huge numbers of users comes huge responsibility, and it seems that not a single site has developed a porn-proof filtering system or a thorough enough moderation team.

Here, we take a look at some of the most popular platforms and their reported struggles to keep hardcore content from plaguing users. We've ranked the sites from most users to least, using Smart Insights' data on how many people are on each platform.

Knowing what you’re dealing with as a user of any of these sites can be helpful when looking for content to report, block, and flag.

Facebook – 2.91 billion active users

In 2021, our affiliates at Bark discovered that if someone searched any letter in Facebook’s search bar and then navigated to the video results, they would see a long list of mostly sexual options.

Previously, the BBC had attempted to report 100 sexualized images of children, and found that Facebook eventually removed just 18 of them.

The BBC report also said Facebook took no action when it was notified that five convicted child predators had active Facebook accounts, explicitly violating the company’s rules.

More recently, a BBC investigation alleged that Facebook hasn't responded adequately to employee concerns about how the platform is used globally. In particular, it points to drug cartels' and trafficking organizations' use of the platform to attract, advertise, and sell women.

Related: Report Alleges Facebook Didn’t Fix Systems that Allow Human Traffickers to Recruit Victims on the Platform

Employees state that while some of the illegal groups or pages they flag are taken down, dozens of others are left active. In other cases, only content tied to a group is removed, while the illicit group or organization itself stays up.

And as part of a slow but steady file leak, the Guardian revealed a few years ago that Facebook had faced at least one surge in revenge porn and sexual extortion cases. The company ended up disabling over 14,000 accounts involved in these disputes, and 33 of the cases involved children. It's not clear how this compares to other periods, since Facebook doesn't divulge specific figures, but that's no small number.

And those are just the highlights of the porn problems on the world's largest social site.

YouTube – 2.56 billion users

Reports say that spam commenters are flooding popular YouTube channels with links containing suspected scams and directing young audiences toward porn sites.

An investigation found comments sections on popular channels were bombarded with spam comments despite YouTube claiming to have addressed the problem in 2019.

Using an open-source spam detection algorithm, the iNews investigation found that on a number of popular channels with millions of subscribers, spam comments made up around one in five comments.
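The investigation doesn't publish the internals of the tool it used, but open-source comment-spam detectors generally work by pattern matching and scoring. Here's a minimal, hypothetical Python sketch of that idea; the patterns, threshold, and sample comments below are illustrative assumptions, not the investigation's actual code:

```python
# Hypothetical sketch of rule-based comment-spam detection, the general
# technique behind open-source tools like the one the iNews investigation
# describes. The patterns below are illustrative, not the real ruleset.
import re

SPAM_PATTERNS = [
    re.compile(r"https?://\S+"),                                   # external links
    re.compile(r"(check|visit)\s+my\s+(page|profile|channel)", re.I),
    re.compile(r"18\s*\+|adult\s+content", re.I),                  # adult-content bait
]

def looks_like_spam(comment: str) -> bool:
    """Flag a comment if it matches any known spam pattern."""
    return any(p.search(comment) for p in SPAM_PATTERNS)

comments = [
    "Loved this video, thanks for uploading!",
    "Check my profile for 18+ content https://example.com",
    "Great editing as always.",
    "visit my page for a free giveaway http://example.net",
    "Who else is watching in 2022?",
]
spam_share = sum(looks_like_spam(c) for c in comments) / len(comments)
print(f"Spam share: {spam_share:.0%}")  # prints 40% for this toy sample
```

Running a detector like this over every comment on a channel and dividing flagged comments by the total is how an investigation arrives at a figure like "one in five."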

WhatsApp – 2 billion users

A 2019 article by TechCrunch, a tech industry news publisher, summarized the findings of two Israeli NGOs. The article "details how WhatsApp chat groups are being used to spread illegal child pornography."

The ultra-popular cross-platform messaging and calling app is owned by Facebook and uses privacy technology, including encryption.

More specifically, it uses end-to-end encryption, a privacy technology that, in WhatsApp's words, "ensures only you and the person you're communicating with can read what's sent and nobody in between, not even WhatsApp. Your messages are secured with locks, and only the recipient and you have the special keys needed to unlock and read your messages."

To put it simply, end-to-end encryption lets people share child abuse images in a way that keeps their messaging threads unreadable to the rest of the world, meaning they can abuse children and distribute the material to others without getting caught.

And unfortunately, that’s what’s happening.
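For readers curious about the mechanics just described, here is a minimal sketch of the public-key idea underlying end-to-end encryption, using the PyNaCl library. This illustrates the general concept only; it is not WhatsApp's actual implementation, which is based on the more elaborate Signal protocol:

```python
# Minimal sketch of end-to-end encryption's core idea using PyNaCl
# (pip install pynacl). Illustrative only; WhatsApp's real system uses
# the Signal protocol, not this simplified scheme.
from nacl.public import PrivateKey, Box

# Each user generates a key pair; the private key never leaves the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"hello Bob")

# Any server relaying `ciphertext` sees only unreadable bytes.
# Only Bob, holding his private key, can decrypt the message.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello Bob"
```

Because the service provider never holds the private keys, it cannot read the messages it relays, which is exactly why this design frustrates moderation.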

Instagram – 1.47 billion active users

While there are no reported hard numbers on the amount of porn on this hugely popular platform, Instagram has a serious issue with child predators.

It's no secret that the internet has fueled child sex abuse like never before and given abusers direct access to victims. But what might be lesser known is just how directly social media platforms like Instagram provide that unprecedented access to kids.

Related: WATCH: 37-Year-Old Goes Undercover on Instagram as 11-Year-Old—Here’s What She Learned About Child Predators

To explore the world of social media predators and learn just how easy it is for child abusers to gain unfettered access to minors, the Bark team used Photoshop to transform one of their own 37-year-old team members.

With various altered images of her, they created fabricated social media profiles of a hypothetical 15-year-old, 12-year-old, and 11-year-old. What they discovered may shock you.

The Bark investigation is a peek into a terrifying reality where adults manipulate teens and children into abusive relationships. If there's good news, it's that grooming follows recognizable steps, and you can learn them to protect yourself or your child.

While Instagram prides itself on banning specific hashtags associated with pornographic content, it has a long way to go before it can be considered totally appropriate and safe for users.

TikTok – 1 billion users

As you might already know, TikTok is a Chinese-owned short-form video-sharing app that is home to more than 1 billion monthly active users.

A large part of TikTok's success is attributable to its roughly 10,000 moderators worldwide, who police videos on the platform to keep it an endless feed of lighthearted content rather than a cesspool of violent and distressing videos.

Part of TikTok's job as an employer, like that of any employer, is to ensure its employees can operate in a safe work environment. According to former TikTok moderators, however, the company did not provide one.

Related: Moderators Sue TikTok Due to Mental Health Issues from Seeing So Many Violent Videos

For that reason, former moderators have filed a federal lawsuit seeking class-action status against TikTok and its parent company, ByteDance. They allege the company was negligent and broke California labor laws by failing to protect moderators from the emotional trauma caused by reviewing hundreds of "highly toxic and extremely disturbing" videos every week, including videos of animal cruelty, torture, and even the execution of children.

A lawyer from Joseph Saveri Law Firm, the firm that filed the lawsuit, said, "You see TikTok challenges, and things that seem fun and light, but most don't know about this other dark side of TikTok that these folks are helping the rest of us never see."

Snapchat – 557 million active users

With more than 100 million daily active users, Snapchat was once the go-to social media app for millennials.

Ever heard of "premium Snapchat"? It's the latest way porn creators are reaching their target audiences. The term is unofficial, but the accounts themselves are very real.

A premium account is a normal Snapchat account, but it's private, and the account owner can charge a fee for access. Overwhelmingly, people, often porn performers, use these premium accounts to offer explicit sexual footage of themselves. If you pay someone for premium access, you can expect a regular influx of their self-made porn, straight to your Snapchat.

Related: 4 Confessions About the Secret Porn World of “Premium” Snapchat

Because these accounts are private, premium account holders largely avoid being flagged and deleted by Snapchat: the people paying for access want the explicit content and are unlikely to report the account.

These account holders, mostly women, advertise their premium accounts on other social platforms like Facebook and Instagram and accept payment through Venmo, PayPal, or whatever money transfer app they prefer. It's like porn on demand, but much more easily disguised.

Parents: if your child is looking at "premium" Snapchat accounts, it can be detected with the right tools. Bark is software that can alert you when it picks up pornographic or sexting activity on your child's device. Try Bark for 30 days at no cost today.

Pinterest – 444 million active users

Pinterest’s section on Pin Etiquette states, “We do not allow nudity or hateful content.” Period. Furthermore, Pinterest’s terms of service prohibit “any content that…is defamatory, obscene, pornographic, vulgar or offensive.” Pinterest community manager Enid Hwang elaborates, “Photographic images that depict full-frontal nudity, fully exposed breasts and/or buttocks are not allowed on Pinterest.”

Still, we've been getting messages from Fighters who say they've found blatantly hardcore pornographic content in their regular feeds, and from what we've seen, the issue seems even worse for users who have marked their gender as "male" on the site.

And, as with Instagram, there are no hard numbers for how many explicit posts are actually on the site, but it's on our radar as a problematic platform for questionable content.

Twitter – 436 million active users

Worse than the porn on Twitter is the sexual exploitation reportedly taking place on the platform.

The National Center on Sexual Exploitation (NCOSE) Law Center, The Haba Law Firm, and The Matiasic Firm have jointly filed a federal lawsuit against Twitter on behalf of minors who were trafficked on the platform.

John Doe #1, now 18 years old, and John Doe #2 say they were 13 when a sex trafficker posing as a 16-year-old girl tricked them into sending pornographic videos of themselves through the Snapchat app. A few years later, when they were in high school, links to those videos began appearing on Twitter in January 2020.

Related: Twitter Sued for Reportedly Distributing and Profiting from Child Abuse Images

The plaintiffs say they alerted law enforcement about the tweets and urgently requested that Twitter remove them. Using Twitter's reporting system, which according to its policies is designed to catch and stop the distribution of illegal material like child sexual abuse material (CSAM), the Doe family verified that John Doe was a minor and insisted that the videos be taken down immediately.

Instead of removing the videos, NCOSE reports, Twitter did nothing, even reporting back to John Doe that the video in question did not in fact violate any of its policies.

Reportedly, Twitter refused to take down the content until nine days later, when a Department of Homeland Security agent contacted the company and urged action. By that point, according to the lawsuit, the posts had received 167,000 views and 2,223 retweets.

Why this matters

These reports show just how much porn has taken over the internet and our online social experiences, especially for teens.

It's no secret that porn is everywhere, and now it seems to have taken hold of our Explore pages on Instagram, our feeds on Facebook and Twitter, and our For You pages on TikTok.

And if you're a parent and you do catch your child watching porn, don't shame them. Ask them questions about how they feel, and listen to their answers. Point out the differences between porn sex and healthy sex, noting that the former lacks intimacy, connection, and boundary-setting.

Related: We Need to Talk About Porn. Is It As Harmless As Society Says It Is?

While some pornographic social media accounts will ask users if they are over 18, teens can simply lie to get around these flimsy age restrictions. Though parents can install programs on kids’ devices that block pornography, today’s tech-savvy kids know how to get around these (and they can always use their friends’ unblocked phones instead).

The reality is that most teenagers will inevitably be exposed to porn—and parents must talk to them about it.

Click here to access our conversation guide and see how you can talk to your loved ones about porn.
