QTCinderella is known as one of the most famous female gamers on Twitch. The 28-year-old, whose real name is Blaire, has a following of over 500,000 on Twitter and over 900,000 on Twitch. While she is known for her popularity on the platform and for creating the Streamer Awards, she has most recently made headlines as the latest high-profile victim of deepfake porn.
What’s the story?
At the end of January 2023, a prominent gamer accidentally revealed, while streaming, a site hosting nonconsensual, AI-generated images of multiple female Twitch gamers, including QTCinderella. This led to a spike in traffic to that deepfake site and, according to the victims, a flood of harassment via DMs or comments, including screenshots of their faces from the deepfake content.
Shortly after this happened, QTCinderella responded. She talked about the pain of having her face used in a fake porn video and widely distributed. She called out the deepfake creators and the prominent gamer who had been responsible for increasing the visibility of deepfake porn that included her and other female Twitch streamers. In her words:
“This is what it looks like to feel violated. This is what it feels like to be taken advantage of, this is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.”
QTCinderella is not the first high-profile woman to become a deepfake porn victim. Many celebrities, including Emma Watson, Natalie Portman, Billie Eilish, and Taylor Swift, have been victims of this growing and often damaging phenomenon.
Let’s take a step back: what are deepfakes?
We’ve mentioned the term a few times now… but what exactly are deepfakes? And why should we keep them on our radar?
Deepfakes are images, videos, or other content that has been digitally manipulated to look and/or sound like the real thing. Think of that Princess Leia remake: she looks pretty darn close to the real thing, right? Thanks to improvements in tech, content can be created that imitates the original image or audio with increasing precision, making it look like people are doing or saying things they never did.
While one end of the spectrum might be something harmless and funny, like a Snapchat face swap, the other end, an increasingly common occurrence, is nonconsensual deepfake porn: images of ordinary people or celebrities are taken and digitally spliced into existing pornographic content to create a frighteningly realistic-looking image or video.
Easy to make and increasingly prevalent
Deepfake creation is on the rise. More specifically, deepfake porn is on the rise. A 2019 report by Sensity, one of the leading deepfake-detection companies, projected that deepfake creation would double yearly. Even more concerning, the report found that 96% of existing deepfakes were nonconsensual pornography.
Deepfakes have only continued to grow, backed by improvements in artificial intelligence and other technologies that have reduced the time and difficulty of creating one while also improving their realism, yielding results that are almost impossible to recognize as fake.
The big dilemma: consent
Of course, deepfake porn raises many issues: a violation of digital privacy, a violation of trust… but one of the largest is consent. Whether the victim is a celebrity or an ordinary person, their image or voice is often taken without their permission or knowledge and used to generate shockingly realistic pornographic content of them. That content can be shared among strangers, or among the victim’s peers and family, and is sometimes even used as blackmail.
The frustration and sadness with which QTCinderella and other gamers responded to discovering nonconsensual deepfake porn of themselves is a reminder that, as one 2019 research paper put it, “the psychological damage [for victims] may be profound.”
What can we do?
It’s clear that nonconsensual pornography, whether real or not, can cause serious harm to the victims.
One thing we can do is reduce the demand for nonconsensual pornography generally. The fewer consumers who seek out real content created or shared without consent, the less demand there will be for its deepfake counterpart. In other words, stopping the clicks for real nonconsensual content also brings down the demand for fake content.
No one deserves to be exploited or have their image used without consent as a form of sexual entertainment to be created and shared among others. Are you with us?
Support this resource
Thanks for reading our article! Fight the New Drug is a registered 501(c)(3) nonprofit, which means the educational resources we create are made possible through donations from people like you. Join Fighter Club for as little as $10/month and help us educate on the harms of porn!