For the first time ever, the National Center for Missing & Exploited Children (NCMEC) has released midyear statistics in response to a sharp increase in crimes against children, and it’s a gut punch.
The organization saw an unprecedented surge in online crimes against children in the first half of 2025, calling it “a wake-up call” for parents, tech platforms, and communities alike.
The numbers are staggering.
Between January and June of 2025, NCMEC received more than 518,000 reports of online enticement, compared to just under 293,000 during the same period in 2024—an increase of roughly 77%.
Reports of child sex trafficking jumped from 5,976 to 62,891, more than ten times higher, and financial sextortion cases rose just over 70%, from about 13,800 to 23,600. But perhaps the most shocking finding is how rapidly generative AI is being weaponized. NCMEC received over 440,000 AI-related reports in the first half of this year alone, compared to fewer than 7,000 during the same period in 2024, a 6,341% increase.
“These are not just numbers,” NCMEC President and CEO Michelle DeLaune said. “They represent children being targeted and victimized at an alarming rate.”
At Fight the New Drug, we know pornography and sexual exploitation don’t exist in silos; they’re part of the same digital ecosystem that fuels the growing crisis NCMEC is warning about. What this report shows, in real time, is that predators are adapting faster than our systems are protecting children.
AI has become the newest tool in the exploitation playbook.
Offenders are now generating fake sexual images of real children, using public social media photos as source material, or using AI to simulate live interactions. These aren’t “deepfakes” for entertainment—they’re crimes that blur the line between what’s real and what’s been manipulated, and they devastate real victims in the process.
NCMEC’s CyberTipline, the central hub for these reports, was initially designed to handle large volumes of child exploitation data, but this year’s spike is unlike anything they’ve seen.
“The data paints a stark picture of how technology is both enabling new forms of abuse and amplifying old ones,” DeLaune said. The report even highlighted stories of victims coerced into sharing explicit images, only to be extorted for money or threatened with exposure. Some of these children were driven to self-harm or suicide.
Protecting Children
The lines between pornography, sexual exploitation, trafficking, enticement—and now AI-enabled harm—are blurring. When children are being groomed, coerced, blackmailed, or manipulated through digital means, what we’re seeing is not an isolated “online problem,” but a reflection of the broader culture of exploitation. And that culture is fed in part by accessible sexualized content, permissive norms, and silence around the real consequences.
Consider the context, too: though the total number of CyberTipline reports fell from 36.2 million in 2023 to 20.5 million in 2024, NCMEC cautions this does not mean less harm. Rather, it points to changes in reporting patterns (including a new “bundling” feature for viral content) and platform behavior. The takeaway? The threat is evolving, not shrinking.
For parents, educators, and teens, this means the conversation about “internet safety” has to evolve too. It’s less about “don’t talk to strangers” (though that remains valid) and more about this: what do we do when fake profiles, AI-generated images, covert financial extortion, or pornified content show up in the spaces young people inhabit? According to NCMEC’s data on online enticement, it’s not a matter of if but when.
The line between pornography, sexual exploitation, and trafficking has never been thinner.
When children are objectified, commodified, or manipulated online—whether through AI-generated images, grooming, or the demand for explicit content—it feeds into the same cultural normalization of exploitation that we’ve been fighting to expose.
This isn’t a distant issue—it’s happening in the same online spaces where kids chat, play games, and share photos. And that means we all have a role to play. Talking with teens about pornography and exploitation risks can’t wait until “later.” Parents and educators need to stay informed about new technology and the ways it’s being used to manipulate. Awareness isn’t just protection—it’s prevention.
Behind the numbers
One of the biggest shifts revealed in the report is how generative AI is changing the game.
NCMEC states that enticement is no longer the only—or even primary—pathway. Offenders are increasingly using generative AI to create explicit images of children without direct contact or coercion, simply by manipulating publicly available images or school photos, then threatening disclosure or blackmail.
In addition, analysis from the Internet Watch Foundation (IWF) backs this up: AI-generated child sexual abuse material (CSAM) has gone from rare to extremely common in a matter of months, with videos almost indistinguishable from real abuse footage.
The cultural context cannot be ignored. The fact that generative AI can create hyper-realistic images and videos of children in sexualized settings—without ever needing a live victim to “consent”—is a profound shift in the landscape. As IWF noted, “AI-generated child sexual abuse videos … verifiably increased from just two in the same period last year to over a thousand this year.”
On the policy front, legislation is reacting. In the US, the TAKE IT DOWN Act, passed in 2025, aims to require platforms to remove non-consensual deepfakes and manipulated intimate content. Australia’s online safety regulator has taken action too, issuing legally enforceable transparency notices to tech companies like Google and Meta over how they handle child sexual abuse content.
We’re also seeing shifts in platform tactics and grooming. For example, apps and gaming platforms like Roblox are rolling out open-source AI systems aimed at detecting predatory language in chats, but the scale and speed of harm are so great that safety measures are racing to keep up.
We also see tech platforms themselves incentivizing addictive design—recommender algorithms, infinite scrolls, engagement loops. These design patterns make kids spend more time online, expose them to more content, and increase opportunities for grooming or manipulation.
The cultural issue runs deeper. Our society is still dealing with a normalization of sexualized content, easy access, and vague boundaries between what is “just online” and what is real. At the same time, children’s digital lives are evolving faster than we’re trained to understand them. Studies show children exposed to one type of online risk are more likely to face other risks.
Porn culture, sexual exploitation, trafficking—they are part of the same ecosystem that thrives when children are objectified, easily manipulated, or coerced. The NCMEC data just spotlights the latest layer of that ecosystem.
At FTND, we believe that education is empowerment. The more people understand how exploitation spreads, the more power we all have to stop it. The latest NCMEC report proves one thing beyond doubt: online safety is no longer optional.
If we want to protect kids, we need to look beyond the symptoms—beyond “content moderation” or “app settings”—and start addressing the root causes of demand, objectification, and exploitation.
This report isn’t just a wake-up call. It’s a warning—and an invitation to act.
Because behind every number is a child who deserved better.
Your Support Matters Now More Than Ever
Most kids today are exposed to porn by the age of 12. By the time they’re teenagers, 75% of boys and 70% of girls have already viewed it (Robb & Mann, 2023, Teens and Pornography, Common Sense)—often before they’ve had a single healthy conversation about it.
Even more concerning: over half of boys and nearly 40% of girls believe porn is a realistic depiction of sex (Martellozzo et al., 2016, “I wasn’t sure it was normal to watch it,” Middlesex University, NSPCC, & Office of the Children’s Commissioner). And among teens who have seen porn, more than 79% use it to learn how to have sex (Robb & Mann, 2023). That means millions of young people are getting sex ed from violent, degrading content, which becomes their baseline understanding of intimacy. Among the most popular porn, 33%–88% of videos contain physical aggression and nonconsensual violence-related themes (Fritz et al., 2020, Archives of Sexual Behavior; Bridges et al., 2010, Violence Against Women).
From increasing rates of loneliness, depression, and self-doubt, to distorted views of sex, reduced relationship satisfaction, and riskier sexual behavior among teens, porn is impacting individuals, relationships, and society worldwide (Fight the New Drug, 2024, “Get the Facts”).
This is why Fight the New Drug exists—but we can’t do it without you.
Your donation directly fuels the creation of new educational resources, including our awareness-raising videos, podcasts, research-driven articles, engaging school presentations, and digital tools that reach youth where they are: online and in school. It equips individuals, parents, educators, and youth with trustworthy resources to start the conversation.
Will you join us? We’re grateful for whatever you can give, but a recurring donation makes the biggest difference. Every dollar directly supports our vital work, and every individual we reach decreases sexual exploitation. Let’s fight for real love.



