
Deepfakes Are on the Rise—Here’s What You Need To Know

A 2019 report showed that deepfake creation would double yearly and that 96% of existing deepfakes were nonconsensual pornography.

QTCinderella is one of the most famous female gamers on Twitch. The 28-year-old, whose real name is Blaire, has a following of over 500,000 on Twitter and over 900,000 on Twitch. While she is known for her popularity on the streaming platform and for creating the Streamer Awards, she most recently made headlines as the latest high-profile victim of deepfake porn.

What’s the story?

At the end of January 2023, a prominent gamer accidentally revealed, while streaming, a site hosting nonconsensual, AI-generated sexual images of multiple female Twitch streamers, including QTCinderella. This led to a spike in traffic to the deepfake site and, according to the victims, a flood of harassment via DMs and comments, including screenshots of their faces from the deepfake content.


Shortly after this happened, QTCinderella responded. She talked about the pain of having her face used in a fake porn video that was widely distributed. She called out the deepfake creators, as well as the prominent gamer who had been responsible for increasing the visibility of deepfake porn featuring her and other female Twitch streamers. In her words:

“This is what it looks like to feel violated. This is what it feels like to be taken advantage of, this is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.”

QTCinderella is not the first high-profile woman to become a victim of deepfake porn. Many celebrities, including Emma Watson, Natalie Portman, Billie Eilish, and Taylor Swift, have been victims of this growing and often damaging phenomenon.

Related: Deepfake Porn Videos of Celebrities are Just Another Form of Sexual Exploitation

Let’s take a step back: what are deepfakes?

We’ve mentioned the term a few times now… but what exactly are deepfakes? And why should we keep them on our radar?

Deepfakes are images, videos, or audio that have been digitally manipulated to look and/or sound like the real thing. Think of that Princess Leia remake: she looks pretty darn close to the original, right? Thanks to improvements in tech, content can be created that imitates the original image or audio with increasing precision, making it look like people are doing or saying things they never did.

One end of the spectrum might be something harmless and funny, like a Snapchat face swap. At the other end, and increasingly common, is nonconsensual deepfake porn, where images of everyday people or celebrities are taken and digitally merged into existing pornographic content to create a frighteningly realistic image or video.

Easy to make and increasingly prevalent

Deepfake creation is on the rise. More specifically, deepfake porn is on the rise. A 2019 report by Sensity, one of the top deepfake-detection companies, showed that deepfake creation was doubling yearly. Even more concerning, the report found that 96% of existing deepfakes were nonconsensual pornography.

Related: New BBC Documentary Reveals How Easy It Is to Create Legitimate-Looking Deepfakes Porn

Deepfakes have only continued to grow, backed by improvements in artificial intelligence and other technologies that have reduced the time and difficulty of creating one while making the results more convincing, sometimes yielding imitation content that is almost impossible to recognize as fake (Danielle K. Citron & Robert Chesney, Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security, 107 California Law Review 1753 (2019)).

Increasingly, free or cheap apps and deepfake services offer the ability to take a few images of someone and, within minutes, produce an imitation pornographic image or video of them.


The big dilemma: consent

Of course, deepfake porn raises many issues: a violation of digital privacy, a violation of trust. But one of the largest is consent. Whether the victim is a celebrity or a private individual, their image or voice is often taken without their permission or knowledge and used to generate shockingly realistic pornographic content of them. That content can be shared among strangers, or among the victim's peers and family, and is sometimes even used for blackmail.

Related: 7 Things You Can Do If You’re a Victim of Deepfakes or Revenge Porn

The frustration and sadness expressed by QTCinderella and other streamers upon discovering nonconsensual deepfake porn of themselves is a reminder that, as one 2019 research paper put it, "the psychological damage [for victims] may be profound" (Citron & Chesney, 2019).

What can we do?

It’s clear that nonconsensual pornography, whether real or not, can cause serious harm to the victims.

One thing that can be done is to reduce the demand for nonconsensual pornography generally. The fewer consumers who seek out that content, the less demand there will be for deepfake nonconsensual pornography. In other words, if we can stop the clicks on real content created or shared without consent, we can bring down the demand for fake content, too.

No one deserves to be exploited or have their image used without consent as a form of sexual entertainment to be created and shared among others. Are you with us?

Support this resource

Thanks for reading our article! Fight the New Drug is a registered 501(c)(3) nonprofit, which means the educational resources we create are made possible through donations from people like you. Join Fighter Club for as little as $10/month and help us educate on the harms of porn!
