Have you heard of the term “deepfake”? Maybe not, but we bet you know what it is. Remember that Princess Leia Star Wars “remake”? Or the viral fake Mark Zuckerberg video?
Over the last few years, countless falsified “deepfake” videos and images have been created that recreate the voices and appearances of politicians, celebrities, and average people with frightening, ever-increasing realism.
A new BBC Three documentary about deepfakes
Recently, the BBC released a documentary on this urgent issue entitled “Deepfake Porn: Could You Be Next?” The film highlights the most common use of deepfakes: porn.
In fact, deepfake research company Sensity AI has found that the overwhelming majority of deepfake videos made in recent years are nonconsensual pornography. Reportedly, over 90% of the deepfake videos available online are explicit in nature.
In this latest BBC documentary, journalist Jess Davies shows the disturbing reality of how this shocking statistic plays out in people’s everyday lives.
The film tells the stories of three women who are victims of deepfake porn, meaning their faces were digitally edited into pornographic videos to create pornography of them without their consent. Alongside interviews with two creators of deepfake porn, it shows the detrimental psychological effects of deepfake pornography on victims, as well as the creators’ dismissive attitude toward these women.
The documentary demonstrates how celebrities or figures in the public eye are often the targets of deepfake pornography. Of the three women, one is an American state senator and another a public campaigner in the UK who actively worked to remove millions of nonconsensual and child pornography videos from Pornhub in a movement called #NotYourPorn.
Famous or not, deepfakes can cause serious harm
UK campaigner Kate Isaacs expressed the horror she felt after finding a pornographic deepfake of herself publicly posted on Twitter:
“It’s really, really scary to watch, because you’re thinking, ‘Oh my god, that’s me, I’m in a porn video and everyone’s going to see—my employers, my grandmother, my friends’…You feel vulnerable, because your body is out there, but you have a complete lack of memory of being filmed.”
She believes she was targeted because of her involvement in the #NotYourPorn campaign, which contributed to the removal of 10 million nonconsensual images from Pornhub.
After the video was posted, her home and work address were also shared online, and she began to receive rape and death threats from strangers. In the documentary, she stated that they would “follow me home while it was dark, and [said] that they were going to rape me, film it and upload it to the Internet…that threat against you is just terrifying. It was one of the scariest things I’ve ever been through.”
However, while Kate may have been targeted because of her reputation as a campaigner, the documentary also drives home the point that deepfake technology is no longer restricted to celebrities; it is becoming so fast and accessible that ordinary people are increasingly at risk.
Advancements in technology make it easier than ever
Apps now exist that can generate legitimate-looking deepfake content in under 10 seconds, some of which are available to users as young as 12 years old.
Jess, the journalist conducting the interviews for the BBC and a victim of image-based abuse herself, points out the increase in threads and forums for deepfakes and deepnudes, in which content created by amateurs or more professional creators is uploaded and shared publicly—entirely without the victim’s consent or knowledge.
Even well-known platforms have been found to share nonconsensual deepfake porn of everyday women who are private citizens.
To date, a Google search for “deepfake porn” returns almost 59 million results. In its 2019 “State of Deepfakes” report, Sensity AI predicted that the number of deepfakes online would more than double annually, with 96% of existing deepfake videos being nonconsensual pornography.
What to do if you’re a deepfake target
These statistics highlight the ease with which this content can be, and is being, produced. Given the devastating effects deepfake pornography can have on victims, and how easily these deepfakes can be created, what can you do if you or someone you know becomes a victim of this terrible abuse?
Thankfully, there is help available to you. Here’s what you can do:
Connect with organizations that can help victims
A growing number of organizations can assist victims of this type of abuse. Here are a few to consider:
- The Cyber Civil Rights Initiative, which has a 24/7 hotline, provides legal referrals, and can share resources for removing your content online.
- For guidance on getting images taken down, the Digital Millennium Copyright Act (DMCA) website can help.
- In the UK and elsewhere, the Revenge Porn Helpline will help anyone. Email [email protected] or, if in the UK, call 0345 6000 459 for support.
Record evidence of the abuse and seek legal action and protection
If you have been victimized by deepfakes or any other kind of nonconsensual content, keep a thorough record of the content posted online without your consent, using screenshots. Also keep a record of your requests to have it taken down, whether on social media or other platforms, and of any conversations you have with the person who uploaded your content.
Legal action should be considered thoughtfully, as there can be obstacles, such as retaliation by a partner or by the uploader of the content. However, according to the Electronic Frontier Foundation, victims can sue for defamation or for being portrayed in a “false light.”
They could also file a “right of publicity” claim, alleging that the deepfake’s makers profited from their image without permission.
Take care of yourself—this is not your fault
Fake or not, the shame and humiliation victims face when deepfake pornography is created and shared are real.
Seek out mental health resources if you need them. The Rape, Abuse & Incest National Network (RAINN) offers a guide to finding a therapist. Most importantly, remember that this abuse is not your fault.
Hope for the future
No matter the circumstances, and no matter what our porn-saturated culture normalizes, the exploitation of another human being is never acceptable, whether the images are “real” or digitally engineered.
It may be discouraging to live in a world where there is demand for nonconsensual porn of real people, but there is also hope for the future.
Victims can become empowered survivors who speak out about their experiences and fight for change. Each of us can help stop the demand for exploitative content and expose the industry that normalizes and fuels this abuse.