
Twitter Sued for Reportedly Distributing and Profiting from Child Abuse Images

Plaintiff John Doe was horrified to find out that explicit videos of himself, made at age 13 under duress by traffickers, were posted to Twitter.

Portions of the following article were originally shared in a press release by the National Center on Sexual Exploitation.

Disclaimer: Fight the New Drug is a non-religious and non-legislative awareness and education organization. Some of the issues discussed in the following article are legislatively-affiliated. Including links and discussions about these legislative matters does not constitute an endorsement by Fight the New Drug. Though our organization is non-legislative, we fully support the regulation of already illegal forms of pornography and sexual exploitation, including the fight against sex trafficking.

The National Center on Sexual Exploitation Law Center (NCOSE), The Haba Law Firm, and The Matiasic Firm have jointly filed a federal lawsuit against Twitter on behalf of minors who were trafficked on the social media platform, which boasts over 330 million users.
 
The plaintiffs, John Doe #1 and John Doe #2, both minors, were reportedly harmed by Twitter’s distribution of material depicting their sexual abuse, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified by the plaintiffs and their parents. The case, John Doe v. Twitter, was filed in the United States District Court for the Northern District of California.

Related: Apple Fights Child Abuse Images By Scanning Users’ Uploaded iCloud Photos

John Doe #1, now 18 years old, and John Doe #2 say they were 13 when a sex trafficker posing as a 16-year-old girl tricked them into sending pornographic videos of themselves through the social media app Snapchat. A few years later, when they were in high school, links to those videos began appearing on Twitter in January 2020.


The plaintiffs say they alerted law enforcement about the tweets and urgently requested that Twitter remove them. Using Twitter’s reporting system, which according to its policies is designed to catch and stop the distribution of illegal material like child sexual abuse material (CSAM), the Doe family verified that John Doe was a minor and that the videos needed to be taken down immediately.

Instead of removing the videos, NCOSE reports, Twitter did nothing, even reporting back to John Doe that the video in question did not violate any of its policies.

Twitter reportedly refused to take down the content until nine days later, when a Department of Homeland Security agent contacted the company and urged action. By that point, according to the lawsuit, this lack of care and proper attention had allowed the posts to receive 167,000 views and 2,223 retweets.

“As John Doe’s situation makes clear, Twitter is not committed to removing child sex abuse material from its platform. Even worse, Twitter contributes to and profits from the sexual exploitation of countless individuals because of its harmful practices and platform design,” said Peter Gentala, senior legal counsel for the National Center on Sexual Exploitation Law Center. “Despite its public expressions to the contrary, Twitter is swarming with uploaded child pornography and Twitter management does little or nothing to prevent it.”

The John Does are now suing Twitter for its involvement in and profiting from their sexual exploitation, which violates the Trafficking Victims Protection Reauthorization Act and various other protections afforded by law.

In August 2021, a federal judge found Twitter may have benefitted financially from ad revenue generated by tweets containing child sexual abuse material.

Read NCOSE’s full press release here.


Is this the first time Twitter has shared child abuse images?

Accessing CSAM used to be difficult, like finding a needle in a haystack. Today, child exploitation imagery is shared through P2P (file-sharing) networks, encrypted messaging applications like WhatsApp, social media, and adult pornography sites, and it has even appeared as a suggested search option on Microsoft Bing. It’s easily accessible on Twitter these days, too, as this lawsuit clearly shows.

It seems obvious that such abuse should be eradicated. The question is, how? Is such a mission even possible? And if so, whose responsibility is it to end child porn?

These are urgent questions, and the problem has been made worse not only by child abusers and exploiters sharing CSAM on the platform, but also by the adult industry at large.

Related: How Mainstream Porn Is Connected To Arrests For Child Abuse Images

Since the start of the COVID-19 pandemic, the rise of the “not-safe-for-work” subscription-based site OnlyFans has shown just how prolific child exploitation images are on Twitter, specifically.

Followers on OnlyFans pay sexual content creators a monthly subscription fee that ranges anywhere from $4.99 to $49.99. Creators can also charge a minimum of $5 for tips or paid private messages, which are the real moneymaker for those with a loyal subscriber base. And while OnlyFans has an age verification system that tries to ensure sexual content creators are over 18, it can be easily bypassed.

Many OnlyFans creators use Twitter to advertise nudes for sale and drive traffic to their profiles, particularly through trending hashtags like #teen and #barelylegal. And while there clearly are underage creators on OnlyFans, on the flip side, many adult creators give the illusion of being under 18 to grow their fan base.


In the BBC’s 2020 documentary about underage content on OnlyFans, Yoti, a platform that helps individuals prove their identity online, scanned 20,000 Twitter accounts using the hashtags #nudesforsale and #buymynudes, which are commonly used to direct followers to OnlyFans, to detect how many of the users were underage. In just one day, out of 7,000 profiles where faces could be detected and analyzed, they found that 33%, or over 2,500 profiles, were very likely underage.

Related: Report Reveals One-Third Of Online Child Sex Abuse Images Are Posted By Kids Themselves

Clearly, the rise in popularity of OnlyFans is driving an influx of content created by underage users, legally defined as child exploitation imagery, even outside of the platform itself.

While this data helps us understand the scale of the issue when it comes to underage girls being attracted to and exploited on these platforms, it’s clearly just the tip of the iceberg.

Click here to learn how to report child sexual abuse images.
