Jennifer (name has been changed) shared naked pictures of herself with her partner, as many couples do in our digital age of relationships.
After they stopped dating, Jennifer’s ex tried to use her images against her by creating a website filled with her nude images and telling Jennifer that he was going to send the link to her friends and family. Jennifer begged for the site to be taken down. Her ex offered to sell her the domain for $75,000, and she responded with a lawsuit.
Jennifer’s story was shared in a new book by cyber abuse lawyer Carrie Goldberg. Unfortunately, Jennifer’s story is one of millions throughout the world and one of many Goldberg writes about in Nobody’s Victim: Fighting Psychos, Stalkers, Pervs, and Trolls.
Researchers are referring to this harassment as image-based sexual abuse. The motives vary, but it is largely a gendered issue, with men perpetrating attacks on women. Not always, of course, but the trend is significant.
This new world of sexual abuse is devastating for victims. As technology evolves, so do the tactics used to harass and humiliate, while the laws meant to protect victims have lagged behind over the last decade.
A bill making “revenge porn” a federal crime in the U.S. took another step forward when Congress advanced the SHIELD Act—an amendment to the Violence Against Women Reauthorization Act of 2021 that criminalizes the nonconsensual distribution of nude or sexually explicit images. If the amendment becomes law, offenders could face two years in prison for distributing, or even threatening to distribute, nonconsensual images.
Many victims have not been physically assaulted—a requirement traditionally associated with sexual abuse—and yet research shows how their betrayal and loss of privacy trigger similar traumatic responses.
One survivor named Deborah explained it this way to researchers:
“It’s a type of rape, it’s just the digital version, like you’re still being exploited, you’re still being made very vulnerable against your will…you’re being raped, just in a very different way, it’s a new version of it.”
Revenge porn is the most well-known tactic under the image-based sexual abuse umbrella. It came into the mainstream about 10 years ago as angry ex-partners began to post and share private images of the people they had been in relationships with. The nude images in “revenge porn” may have been taken and shared consensually, but they were never intended for a public audience.
Not all victims of revenge porn are attacked by a vengeful ex. Some have had their digital accounts hacked and private images stolen, and many never find out the identity of their perpetrator.
This is partially why the term “revenge porn” is problematic. It not only implies the victim did something to deserve an aggressive response, but also makes it difficult to prosecute without proof of such motive. This category should be defined not by retaliation but by the sharing of private images without consent.
Many survivors prefer the terms “nonconsensual porn” or “cyber sexual abuse,” but nothing has quite stuck in our culture’s vocabulary like “revenge porn.” This is why it is important to understand it as a part of the broader issue of image-based sexual abuse.
It’s worth noting that while the majority of victims of image-based sexual abuse are female, there are large numbers of men and LGBTQ+ individuals affected by their images being shared without consent, with some reports suggesting the numbers are becoming nearly equal.
“Secret filming” refers to recorded footage of a person in private without their knowledge. Also called “spycam porn,” it includes everything from hidden cameras in public bathrooms, hotel rooms, and college locker rooms to secretly filming a sexual encounter with a partner. In that last example, the sex may have been consensual, but the recording isn’t.
It is bad enough to be recorded without your knowledge and later find out that another person possesses intimate footage of you. It is another level of privacy violation when these videos or photos are shared online to porn sites that thrive off “secret” or “stolen” videos of innocent victims oblivious to the camera in the room.
Real examples of secret filming have ended up on mainstream sites like Pornhub.
A woman named Gina Martin was at a music festival when a man she didn’t know stuck his phone between her legs and took pictures up her skirt. She took the phone and the guy straight to the authorities but they told her not to expect much. That experience began her 18-month campaign to make upskirting—the act of taking nonconsensual photos underneath a person’s clothing—a crime in England.
These “creepshots” are shared among men and uploaded to forums where other contributors like and comment on the images in some twisted competition of harassment.
Deepfake technology uses artificial intelligence to manipulate video, and it’s actually a lot quicker and easier than it sounds. With a collection of photos of a person’s face, it’s possible to create realistic face swaps in any video.
In the political world, deepfakes are feared as the next wave of falsified news, and they are getting trickier to identify. They have the power to ruin reputations if, for example, a fake video spreads on social media of a political leader saying or doing something unethical.
But most often, the victims are women and porn performers. Celebrities were some of the first victims with Gal Gadot’s face being grafted onto a porn performer’s body. The trend quickly spread to non-famous women: work colleagues, friends, or girls at school. It’s a digital fantasy where a consumer can put their crush or dream girl into the porn film of their choice without their knowledge or consent.
When it comes to deepfakes, there are serious ethical concerns. When researchers at AI cybersecurity company Deeptrace searched the internet in 2019, they found that of the almost 15,000 deepfakes online, 96% were pornographic. And of those pornographic deepfakes, the report reveals that essentially all of them were of women.
This is incredibly violating to both the woman whose face is pasted into a film where it doesn’t belong and the performer whose body and work is used to abuse another woman.
The toll of image-based sexual abuse
Data on how many people are affected by the various forms of image-based sexual abuse are scarce. Studying each category is an enormous and complex task, as the boundaries between them are often blurred.
For example, when a sexual assault is mixed with image-based sexual abuse, it creates a true horror story. As if being raped weren’t bad enough, some perpetrators film the assault. The very evidence that should implicate a perpetrator in a crime is instead used to silence the victim under threat of the video being released online. Despite the threats, such videos are often released anyway, compounding the victim’s trauma.
The numbers we do have: roughly 10.4 million Americans, and 1 in 5 Australians, have been threatened with or experienced the publication of their explicit images without their consent. In 2018, over 22,000 women in South Korea protested against secret filming, and that same year the number of deepfake videos online—96% of which were pornographic—doubled.
This is a global problem affecting all ages, genders, identities, races, and orientations.
The tragedy is that there is a demand for image-based sexual abuse. Niche porn sites dedicated to each of these categories encourage more creation and consumption of nonconsensual images, and even the most popular porn sites profit from nonconsensual videos despite claiming to have a zero-tolerance policy.
We live in a #MeToo world with a growing awareness of sexual abuse, assault, and harassment. It’s time we add image-based sexual abuse to the conversation, not as some future form of abuse but as one happening right now. It’s up to each of us to fight the demand: to refuse to click, and to think before sharing.