
2026 Dirty Dozen List: How Big Tech Is Fueling Sexual Exploitation

The 2026 Dirty Dozen list names major platforms and AI tools, including Amazon, TikTok, and Grok, for their role in sexual exploitation. Here’s what the research reveals.

What do Amazon’s child-like sex dolls, Snapchat’s sextortion set-up, and Grok’s sexual companions all have in common?

They are all featured on NCOSE’s 2026 Dirty Dozen list as top contributors to sexual exploitation.

Every year, the National Center on Sexual Exploitation (NCOSE) releases its “Dirty Dozen” list highlighting major companies contributing to sexual exploitation.

When it began, the Dirty Dozen list featured prostitution and porn sites, overtly sexually explicit businesses clearly selling objectification, sprinkled with a few other brands and platforms. Today’s Dirty Dozen list includes everyday companies that most of us are using.

The companies named this year aren’t obscure. They’re part of daily routines: shopping, scrolling, learning, and messaging. And according to NCOSE’s investigations, each plays a role in shaping a culture where harmful content is created, distributed, and normalized, in ways that are especially harmful to kids.

Let’s take a closer look at several of the most influential entries from this year’s report and their dirty track record of exploitation.

Amazon: Fueling Exploitation via Sex Dolls

Amazon has built its reputation on convenience and accessibility. But NCOSE’s 2026 investigation highlights how that same accessibility extends to some products that raise serious concerns.

According to the report, Amazon continues to sell sex dolls, including products designed with exaggerated, hypersexualized features and, in some cases, dolls that resemble minors or reinforce harmful sexual interests (National Center on Sexual Exploitation, 2026, https://endsexualexploitation.org/amazon/). Some of the sex dolls are even photographed holding stuffed animals.

NCOSE argues that these products do more than exist in isolation; they contribute to a cultural environment where people are reduced to objects for consumption.

And for those of you thinking, “Well, it’s better that pedophiles are exploiting child-like dolls than real kids,” think again.

A report from the Australian Institute of Criminology found there’s no evidence these dolls help prevent child sexual abuse. Instead, the research suggests they can actually reinforce harmful thinking patterns, like the justifications offenders use to excuse their behavior, and may normalize dangerous fantasies rather than reduce them.

Related: Australia and the UK Banned Import of These Disturbing Child Sex Dolls

And this isn’t just theoretical. People who have worked directly with offenders have seen similar patterns play out in real life.

Kristi McVee, a former police detective and specialist child interviewer, said, “During my time as a child abuse detective, I encountered childlike sex dolls in the possession of offenders who were already consuming child sexual abuse material or had directly harmed children. These dolls weren’t harmless ‘fantasy aids’; they were part of a pattern — tools used to rehearse, reinforce, and justify dangerous sexual interests in children. They serve to desensitize offenders, feeding an escalation towards real-world abuse. It was never an innocent, stand-alone behaviour or a ‘preference’; it was always part of a bigger, darker picture of risk…”

So by slowly re-allowing sex dolls onto its platform, Amazon is directly fueling the demand for child exploitation.

When a global retailer integrates these products into its marketplace, it doesn’t just reflect demand; it helps define what we see as normal and acceptable.

Grok (xAI): AI-Generated Exploitation

Artificial intelligence has introduced a new layer to this conversation, while quietly getting very good at removing others.

Just a few months ago, after updates to Grok Imagine, users generated thousands of sexually explicit AI images every hour, some involving minors and many nonconsensual. Users asked Grok to remove clothing layers or reposition individuals in sexually compromising positions, exploiting thousands of real people.

Related: Grok Allows Sexual Abuse in “Mass Digital Undressing Spree”

NCOSE’s report raises even more concerns about Grok, noting that the chatbot has demonstrated the ability to produce explicit sexual content and simulate exploitative or abusive scenarios through its “Companion” chatbots.

Grok-generated sexualized avatars encourage sexually explicit interactions, and that, coupled with practically nonexistent age verification and child safety measures, makes it a cesspool of exploitation.

NCOSE emphasizes that systems like this can “simulate and normalize exploitative interactions” in ways that feel highly personalized (National Center on Sexual Exploitation, 2026, https://endsexualexploitation.org/grok/).

That personalization matters because of how the brain processes reward and repetition. Neural pathways strengthen through repeated, highly stimulating experiences, especially when those experiences are novel or tailored to the individual (Pitchers et al., 2013). AI tools like Grok don’t just increase access to content; they allow users to co-create it, reinforcing those pathways in a more immersive way.

Chromebooks: When School Devices Become Exploitation Access Points

Chromebooks are a staple in classrooms across the United States, making their inclusion on the list particularly significant.

NCOSE’s 2026 findings point to widespread concerns about insufficient filtering systems on school-issued devices. The report explains that students are often able to bypass safeguards and access explicit content, even during school hours (National Center on Sexual Exploitation, 2026, https://endsexualexploitation.org/chromebooks/).

Users report seeing pornography, experiencing sextortion, and being contacted by predators.

In one case, predators targeted a 14-year-old with explicit emails and chats sent through a school-issued email account. In another, a 10-year-old used her Chromebook to access Discord, where she was sent explicit and abusive images.

We know children access pornography even at school and on school-issued devices, which is why it’s crucial that parents have ongoing, healthy conversations about the dangers that exist even on school computers.

Related: How Many Students Watch Porn at School?

Our Live Presentation program provides students with warnings and education on the harms of pornography and online safety to help them make informed decisions before these types of interactions occur.


When that exposure occurs in an educational setting, it becomes part of a young person’s developmental environment, influencing beliefs and behaviors long before healthy conversations about relationships take place.

TikTok: When Algorithms Reward Predators

TikTok’s influence is undeniable, and while some of its recent marketing tries to make it seem “safe” for teens, NCOSE’s report highlights how its algorithm not only encourages sexual content but also makes the platform a hub for collecting and exchanging CSAM (child sexual abuse material).

According to the investigation, TikTok’s algorithms favor live stream content with gifts, which “incentivizes sexual content.” This kind of algorithmic reinforcement has real-world implications. Studies show that exposure to sexualized media can influence body image, expectations, and social behavior, particularly among adolescents (Fardouly & Vartanian, 2016). On a platform designed to maximize engagement, those patterns can scale rapidly.

Related: “Post-in-Private” How TikTok Accounts Are Hiding Child Sexual Abuse Material

The platform also played a major role in the sextortion of a 14-year-old boy, which ended in his suicide.

In one Florida case, a child sexual abuse ring collected 1 million CSAM videos and images via TikTok; in another, a Minnesota man met and groomed a 15-year-old girl through the platform. Cases like these show how predators can use the platform to exploit children.

TikTok promotes sexualized content and amplifies trends that encourage objectification, especially among younger users (National Center on Sexual Exploitation, 2026, https://endsexualexploitation.org/tiktok/). Its new 16+ accounts boast increased safety but lack key features to really keep kids safe.

NCOSE also notes that TikTok is used to attract new OnlyFans consumers and models. TikTok “pimps” use the platform to recruit new creators, pushing the false promise of getting rich quick.

Discord: Hidden Networks Behind Closed Doors

Discord is often associated with community and connection, but NCOSE’s 2026 report highlights how its structure can also enable harm.

Discord is no stranger to NCOSE’s Dirty Dozen list; this is its fifth time making the cut.

The investigation points to the use of private and invite-only servers where exploitative content is shared and normalized. Because these spaces operate with limited oversight, harmful material can circulate within tight-knit communities without the same level of visibility or accountability found on more public platforms (National Center on Sexual Exploitation, 2026, https://endsexualexploitation.org/discord/).

Discord is used by predators to collect new CSAM from kids and to trade it with other criminals.

“FBI: Raleigh man sexually exploited teen, abused baby on video after meeting online.”

“Monroe County man learns sentence after sexually exploiting children using Roblox, Discord, Snapchat.”

“Man accused of internet child sex crimes allegedly met victims on Roblox, Discord.”

In addition to connecting kids with predators, Discord often exposes minors to pornography. By requiring users only to check an “I am over 18” box to access sexually explicit content, the platform leaves kids of all ages exposed to pornography.

What makes this particularly concerning is the role of social reinforcement. When behaviors are validated within a group, they can become more deeply ingrained. Over time, these environments can shape norms and expectations in ways that extend beyond the platform itself.


Mark Zuckerberg (Instagram, Facebook, WhatsApp)

Rather than naming just Facebook or Instagram, NCOSE calls out Mark Zuckerberg himself to emphasize leadership accountability across Meta’s ecosystem.

The report details ongoing concerns, including the spread of child sexual abuse material and the use of messaging features to facilitate grooming within Facebook, Instagram, Messenger, and WhatsApp. Despite years of scrutiny and internal reports, NCOSE argues that systemic issues remain unresolved (National Center on Sexual Exploitation, 2026, https://endsexualexploitation.org/zuckerberg/).

NCOSE reveals Instagram connected “1.4 million potentially dangerous adults to teens” in a single day. Meta’s AI chatbot lets kids have romantic and sensual conversations. NCOSE also notes that Meta documents show over 100,000 children were exploited across its platforms, and that Meta has a “17 strikes” policy before suspending accounts for trafficking.

Not great.

The Other Dirty Dozen: A Broader System at Work

While these examples highlight some of the most widely used platforms, they are part of a broader group included in NCOSE’s 2026 Dirty Dozen list. Other entries include additional tech platforms, infrastructure providers, and digital services that, according to the report, contribute in various ways to the creation, distribution, or normalization of exploitative content. Taken together, the full list illustrates a larger ecosystem, one where multiple layers of technology and business intersect to shape how content is accessed and understood.

NCOSE’s full 2026 list also includes several additional companies and platforms identified for specific concerns:

Android
NCOSE highlights Android for its role in providing access to apps and online spaces where exploitative content can be easily reached, particularly given the platform’s global reach and flexibility. The report emphasizes that insufficient safeguards across apps and browsers can leave minors vulnerable to exposure.

Apple App Store
According to NCOSE, the Apple App Store has hosted apps that facilitate sexual exploitation or provide access to explicit material, despite its stated safety standards. The investigation points to inconsistencies in enforcement, where harmful apps can still reach users—including young audiences.

Snapchat
NCOSE flags Snapchat for features like disappearing messages and private sharing, which have been linked to sextortion and grooming. The report notes that these design choices can make it easier for harmful content to be shared quickly and with limited accountability.

Steam
Steam is included due to the availability of sexually explicit and exploitative video games on its platform. NCOSE highlights concerns about games that simulate sexual violence or normalize harmful scenarios, making them interactive rather than passive experiences.

Telegram
NCOSE’s report points to Telegram as a platform where large volumes of exploitative material can be distributed through encrypted channels and private groups. The investigation emphasizes how these features allow content to spread widely while remaining difficult to monitor or remove.

X (formerly Twitter)
X is included for its permissive approach to explicit content and ongoing concerns about inconsistent moderation of exploitative material. NCOSE notes that the platform enables widespread sharing of adult content, which can increase exposure and normalize it.

Final Thought: Awareness Is Where Change Begins

The Dirty Dozen list is not just a collection of companies and names; it’s a snapshot of how technology, business, and culture intersect.

It shows how everyday communication platforms, gaming hubs, and even online shopping spaces can fuel sexual exploitation right under our noses.

It’s evident that technology is evolving faster than the safety measures designed to protect kids from pornography and sexual exploitation.

It’s critical we are aware of where and how sexual exploitation exists, and its impacts.

Because when people understand how these systems work, and the dangers lurking on the platforms we use every day, they are better equipped to make informed choices about what they consume and support.

At Fight the New Drug, that’s what this conversation is about.

Awareness, so we can build something better.

Your Support Matters Now More Than Ever

Most kids today are exposed to porn by the age of 12. By the time they’re teenagers, 75% of boys and 70% of girls have already viewed it (Robb, M.B., & Mann, S., 2023, Teens and Pornography, Common Sense Media), often before they’ve had a single healthy conversation about it.

Even more concerning: over half of boys and nearly 40% of girls believe porn is a realistic depiction of sex (Martellozzo et al., 2016). And among teens who have seen porn, more than 79% use it to learn how to have sex (Robb & Mann, 2023). That means millions of young people are getting sex ed from violent, degrading content, which becomes their baseline understanding of intimacy. Among the most popular porn, 33%–88% of videos contain physical aggression and nonconsensual, violence-related themes (Fritz et al., 2020; Bridges et al., 2010).

From increasing rates of loneliness, depression, and self-doubt, to distorted views of sex, reduced relationship satisfaction, and riskier sexual behavior among teens, porn is impacting individuals, relationships, and society worldwide (Fight the New Drug, 2024, Get the Facts).

This is why Fight the New Drug exists—but we can’t do it without you.

Your donation directly fuels the creation of new educational resources, including our awareness-raising videos, podcasts, research-driven articles, engaging school presentations, and digital tools that reach youth where they are: online and in school. It equips individuals, parents, educators, and youth with trustworthy resources to start the conversation.

Will you join us? We’re grateful for whatever you can give—but a recurring donation makes the biggest difference. Every dollar directly supports our vital work, and every individual we reach decreases sexual exploitation. Let’s fight for real love: