Earlier this year, Twitter shut down a proposed plan to start an OnlyFans adult content competitor due to a high risk of child exploitation images being shared—but what will happen now that the site is being run differently?
Over the last few months, hardly a news corporation, social media platform, or newspaper has failed to mention Twitter’s purchase and takeover by Elon Musk, the world’s richest man, CEO of Tesla, and founder of SpaceX.
In case you missed it, Musk agreed in April to purchase the social media platform for $44 billion before becoming worried about Twitter’s excessive bot problem and trying to back out. Twitter’s lawyers sued to force him to complete the proposed deal or pay a $1 billion termination fee. Long story short, he closed the deal in late October and officially purchased Twitter.
Now, here we are—Musk owns Twitter, has laid off over half of Twitter’s employees, and is reportedly putting the company at risk of billions in fines after his changes to the service bypassed its standard data governance processes.
With all of these changes, especially since so much of the team that moderates Twitter for abuse has been laid off or has left, a persistent issue needs to be addressed now more than ever: Twitter’s problem with child sexual exploitation material (CSEM), commonly referred to as “child porn.”
Twitter’s child porn issues ruined its plan to create an OnlyFans competitor
According to internal documents and Twitter employees, dating from before Musk’s takeover in October 2022, the social media company is in massive need of investment to remove illegal and abusive content.
Twitter acknowledged this huge gap in safety in the spring of 2022 when it began to seriously consider monetizing the adult content it allows on its platform. Essentially, Twitter’s plan was to create an OnlyFans competitor that would give adult content creators the ability to sell paid subscriptions with Twitter keeping a percentage of fees.
Twitter leadership reportedly knew the platform would lose money from the advertisers who generate the vast majority of the free platform’s revenue, but they decided the profits outweighed the costs. For comparison, OnlyFans is projecting $2.5 billion in revenue this year alone and is already profitable, even though it has been around for 10 fewer years than Twitter.
With their eyes on profits, in March of 2022, Twitter put money toward a new project called Adult Content Monetization (ACM). To ensure ACM would be safe and free of CSEM, Twitter convened a team of 84 employees, called the “Red Team,” whose goal, according to documents obtained by The Verge, was “to pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to [do] this safely and responsibly.”
Right before the final go-ahead to launch porn monetization, the Red Team discovered that “Twitter cannot accurately detect child nudity at scale.” Or, to put it another way, the company couldn’t safely allow adult creators to sell subscriptions because the company wasn’t—and still isn’t—effectively detecting harmful sexual content on the platform.
If Twitter couldn’t consistently remove child sexual exploitation content from the platform, how could it safely begin to monetize adult porn?
The report found that launching ACM would very likely only worsen the problem, because allowing creators to put content behind a paywall would mean that even more illegal material would make its way to Twitter, and more of it would be harder to track, report, and remove.
With Musk’s impending purchase of the platform, Twitter made the call to delay ACM indefinitely.
What happened to porn content monetization after Musk bought Twitter?
Now that Twitter is under Musk’s leadership, it is unclear whether ACM will move forward, but it seems very likely that some version of video monetization will launch despite the current risks.
According to a November 1st report by The Washington Post, Twitter appeared to be rushing out a monetization feature referred to as Paywalled Video after only brief internal reviews.
Initial reports say that, similar to ACM, the feature carries very high risks related to “copyrighted content, creator/user trust issues, and legal compliance.”
Musk more or less confirmed that report on Saturday, November 5th.
“When a creator composes a tweet with a video, the creator can enable the paywall once a video has been added to the tweet,” reads an internal email describing the new, not-yet-announced video feature, the Post reports. Creators can reportedly then choose from a preset list of prices, such as $1, $2, $5, or $10. It’s unclear how much of that fee Twitter will retain.
One Twitter employee, who spoke on the condition of anonymity to discuss internal plans, said it seemed like a feature that would probably be used at least partly for adult content.
Twitter estimates that about 13% of its content is explicit pornography, according to Reuters, which included the figure in a September story about how Twitter was losing its most active users. Explicit content, along with cryptocurrency content, was among the fastest-growing areas of English-language Twitter, Reuters reports.
How has Twitter handled child exploitation in the past and present?
Twitter shut down ACM to keep illegal sexual content from proliferating on the platform, but what has it done about the illegal sexual content already there?
Twitter’s spokesperson, Katie Rosborough, said in August 2022 that “Twitter has zero tolerance for child sexual exploitation,” is aggressively fighting the problem, has “invested significantly in technology and tools to enforce” their policies, and works to “stay ahead of bad-faith actors.”
Historically, Twitter’s actions have, unfortunately, suggested otherwise, even before Musk’s takeover. In fact, Twitter is being sued by survivors of child sex trafficking who allege the platform profited from images of their exploitation.
A February 2021 report from the company’s Health team said, “While the amount of child sexual exploitation (CSE) online has grown exponentially, Twitter’s investment in technologies to detect and manage the growth has not.”
Consider that Meta reportedly spends more on safety features for tracking and reporting abusive content than Twitter’s entire annual revenue.
Additionally, PhotoDNA, the primary hash-matching database tech platforms use to catch known CSEM, isn’t doing the job it once did when it was created in 2009. Twitter’s working group cited an analysis performed by the National Center for Missing and Exploited Children which found that, of the 1 million reports submitted each month, 84% contain new CSEM, none of which would be flagged by PhotoDNA. That’s a big deal because it would mean that hundreds of thousands of newly reported items each month go undetected by hash-matching, leaving Twitter blind to a very large amount of the illegal content coming onto its platform.
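To illustrate why that gap exists, here is a minimal sketch of how hash-based matching works in general. PhotoDNA’s actual algorithm is proprietary and far more robust (it’s a perceptual hash designed to survive resizing and minor edits), so this stand-in uses a simple “average hash” and a hypothetical known_hashes set. The core limitation it shows is the same, though: the system can only flag images whose fingerprints are already in the database, so newly created material passes through unflagged.

```python
# A minimal sketch of hash-based image matching, assuming a generic
# average-hash workflow. PhotoDNA's real algorithm is proprietary and far
# more robust; `known_hashes` here is a hypothetical stand-in for a
# database of fingerprints of previously identified material.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: shrink it, convert to
    grayscale, then record whether each pixel is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def is_known(path: str, known_hashes: set[int]) -> bool:
    """Flag an image only if its fingerprint already exists in the database.
    A newly created image will almost never match, which is why the 84% of
    reports containing *new* material slip past this kind of check."""
    return average_hash(path) in known_hashes
```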
On top of that, some major advertisers, including Dyson, Mazda, Forbes, and PBS Kids, suspended their marketing campaigns or removed their ads from parts of Twitter as of late September 2022 because their promotions appeared alongside tweets soliciting child pornography, the companies told Reuters.
On November 20th, Elon Musk said that the problem of child sexual exploitation content on Twitter is his “priority #1.”
Twitter simply hasn’t been doing enough to fight CSEM—it remains to be seen whether that will change, especially given that reports show this issue will worsen if the platform pursues content monetization and paywalls.
Why this matters
Despite Twitter’s lack of comprehensive action thus far, here’s how other companies are stepping up to fight child exploitation images.
Apple is rolling out new features in its iOS updates aimed at addressing the issue of CSEM, and teams at Google are changing their search algorithms to suppress search results for revenge porn and protect survivors.
Fight the New Drug and our Fighters are committed to raising awareness on the harmful effects of porn in part because of ties to child and adult exploitation.
To learn how to report child sexual exploitation material when you see it, click here.