Portions of the following article were originally shared in a press release by the National Center on Sexual Exploitation.
Disclaimer: Fight the New Drug is a non-religious and non-legislative awareness and education organization. Some of the issues discussed in the following article are legislatively affiliated. Though our organization is non-legislative, we fully support the regulation of already illegal forms of pornography and sexual exploitation, including the fight against child exploitation and child exploitation images.
The National Center on Sexual Exploitation (NCOSE) Law Center, The Haba Law Firm, and The Matiasic Firm have jointly filed a federal lawsuit against Twitter on behalf of a minor who was trafficked on the social media platform, which boasts over 330 million users.
The plaintiff, John Doe, a minor, was harmed by Twitter’s distribution of material depicting his sexual abuse, and by Twitter’s knowing refusal to remove the images of his sexual abuse (child pornography) when notified by the plaintiff and the plaintiff’s parents. The case, John Doe v. Twitter, was filed in the United States District Court for the Northern District of California.
At age 16, Plaintiff John Doe was horrified to learn that sexually graphic videos of himself, made at age 13 under duress by sex traffickers, had been posted to Twitter. Both John Doe and his mother, Jane Doe, contacted the authorities and Twitter. Using Twitter’s reporting system, which according to its policies is designed to catch and stop illegal material like child sexual abuse material (CSAM) from being distributed, the Doe family verified that John Doe was a minor and requested that the videos be taken down immediately.
“As John Doe’s situation makes clear, Twitter is not committed to removing child sex abuse material from its platform. Even worse, Twitter contributes to and profits from the sexual exploitation of countless individuals because of its harmful practices and platform design,” said Peter Gentala, senior legal counsel for the National Center on Sexual Exploitation Law Center. “Despite its public expressions to the contrary, Twitter is swarming with uploaded child pornography and Twitter management does little or nothing to prevent it.”
Instead of removing the videos, NCOSE reports, Twitter did nothing, even responding to John Doe that the video in question did not in fact violate any of its policies.
This lack of care and proper attention allowed the CSAM depicting John Doe to accumulate over 167,000 views before direct involvement from a law enforcement officer prompted Twitter to remove the material. John Doe is now suing Twitter for its involvement in and profiting from his sexual exploitation, which violates the Trafficking Victims Protection Reauthorization Act and various other protections afforded by law.
Is this the first time Twitter has shared child abuse images?
Accessing CSAM used to be difficult, like finding a needle in a haystack. Today, child exploitation material is shared through peer-to-peer (P2P) file-sharing networks, encrypted messaging applications like WhatsApp, social media, and adult pornography sites; it has even surfaced as a suggested search option on Microsoft Bing. It is also easily accessible on Twitter these days, as this lawsuit clearly shows.
It seems obvious that such abuse should be eradicated. The question is, how? Is such a mission even possible? And if so, whose responsibility is it to end child pornography?
These are urgent questions, made all the more pressing not only by child abusers and exploiters sharing CSAM on the platform, but by the adult industry at large, too.
In the last year, the “not-safe-for-work” subscription site OnlyFans has shown just how prolific child exploitation images are on Twitter specifically. Followers on OnlyFans pay sexual content creators a subscription fee ranging anywhere from $4.99 to $49.99 a month. Creators can also charge tips of $5 or more and fees for private messages, which are the real moneymakers for those with a loyal subscriber base. And while OnlyFans has an age verification system intended to ensure sexual content creators are over 18, it can be easily bypassed.
Many OnlyFans creators use Twitter to advertise nudes for sale and drive traffic to their profiles, particularly through trending hashtags like #teen and #barelylegal. And while there clearly are underage creators on OnlyFans, on the flip side, many adult creators cultivate the illusion of being under 18 to grow their fan base.
Yoti, a platform that helps individuals prove their identity online, recently scanned 20,000 Twitter accounts using the hashtags #nudesforsale and #buymynudes, which are commonly used to direct followers to OnlyFans, to detect how many of those users were underage. In just one day, out of 7,000 profiles where faces could be detected and analyzed, they found that over 2,500 profiles, roughly a third, were very likely underage.
Clearly, the rise in popularity of OnlyFans is fueling an influx of underage content, legally defined as child exploitation imagery, even outside of the platform itself.
While this data helps us understand the scale at which underage girls are drawn to these platforms and exploited on them, it is clearly just the tip of the iceberg.