Disclaimer: Fight the New Drug is a non-religious and non-legislative awareness and education organization. Some of the issues discussed in the following article are legislatively-affiliated. Including links and discussions about these legislative matters does not constitute an endorsement by Fight the New Drug. Though our organization is non-legislative, we fully support the regulation of already illegal forms of pornography and sexual exploitation, including the fight against sex trafficking.
A bill making “revenge porn” a federal crime in the U.S. took another step forward as Congress moved partway through passing the SHIELD Act, an amendment to the Violence Against Women Reauthorization Act of 2021 that would criminalize the nonconsensual distribution of nude or sexually explicit images.
If the amendment becomes law, offenders could face up to two years in prison for distributing, or even threatening to distribute, nonconsensual images.
The amendment, which has bipartisan support, would make it a crime “to knowingly use any means or facility of interstate or foreign commerce to distribute an intimate visual depiction of an individual” if the offender has knowledge of or reckless disregard for the subject’s reasonable expectation of privacy or lack of consent.
It does provide exceptions for situations related to public concern, like legal proceedings, law enforcement activity, and reporting illegal activity.
Related: “Am I In Porn?”: This Tool Searches Porn Sites To See If Your Images Are Used In Videos
A devastating global issue
Although researchers hope to see revenge porn—called “image-based sexual abuse” by anti-abuse advocates—decrease as it becomes more widely discussed and condemned, studies are revealing the opposite.
In fact, according to a recent survey conducted by leading law firm Slater and Gordon, the number of people who have been victims of image-based sexual abuse in the UK has doubled in the last two years.
The first survey was commissioned in 2019 after a number of the firm’s family lawyers noticed a trend of image-based sexual abuse as divorce proceedings turned toxic. The researchers describe results from this year’s survey as “deeply worrying” and “shocking” compared to just two years ago.
More than 40% of the 2,000 people surveyed knew someone who had been a victim of image-based sexual abuse, and 22% knew someone who had been threatened with it. About 40% of threats came from an ex-partner, 18% from a friend, and 11% from a family member. Fewer than one-third of the victims surveyed reported threats to police, slightly down from 2019.
About 15% of those surveyed between the ages of 18 and 45 said intimate sexual images of them had been shared without their consent—an 8% increase since 2019. About 1 in 10 people admitted to sharing or threatening to share an explicit image—more than double the number in 2019. Women accounted for more than three-quarters of victims in the survey.
Related: 7 Things You Can Do If You’re A Victim Of Deepfakes Or Revenge Porn
One in five offenders said they “wanted to scare” the victim, 25% said sharing the image was “just a laugh,” and another 25% considered the image “their property” to share.
Motives for distributing revenge porn included punishing someone for ending a relationship or attempting to force someone to stay in a relationship. Refuge, a UK-based domestic abuse charity, describes this means of controlling and manipulating victims as a “devastating form of domestic abuse” that ruins lives.
Holding responsible parties accountable
In today’s digital age, it’s the norm for people to share naked pictures of themselves with their partner. “So what’s the big deal?” some may ask. And if a person doesn’t want their photos leaked, some may say, “Shouldn’t they simply not send them in the first place?”
This way of thinking is just as common as it is harmful. Here’s why.
For many people, after a relationship ends, their ex uses nude images as a means of coercion, manipulation, and harassment. One woman’s ex created a website filled with her nude images and threatened to send the link to her friends and family unless she paid him $75,000. Another woman was sent a screenshot showing over 30 explicit photos and videos of her, taken five years earlier, at age 21, with someone she was dating at the time. The private content had been accessed through an iCloud hack and shared nonconsensually on a number of porn sites.
All of the content had been posted using her full name and was trending on Pornhub. After repeated requests to have the content removed, Pornhub only listened after she lied and said she was underage in the content.
That’s right—she had to lie about her age for Pornhub to listen and take action. But at the time, Pornhub had a download feature on all content, so the photos and videos of her were reuploaded in just a matter of days.
These experiences are just a couple out of millions throughout the world.
The many forms image-based abuse takes
While the term “revenge porn”—the most well-known tactic under the image-based sexual abuse umbrella—implies this only happens to lovers scorned when a relationship goes sour, it isn’t always inflicted by a vengeful ex. Some people have their digital accounts hacked and are never able to identify their perpetrators. Some are victims of other image-based sexual abuse like secret filming, upskirting, or deepfakes.
As technology evolves, so do the tactics used to harass, humiliate, and harm victims. Abusers’ motives vary, and while image-based sexual abuse isn’t always perpetrated by men against women, it is a gendered issue in which that is most often the case.
Consider the impact image-based sexual abuse has on people’s personal lives—their relationships, social circles, families, and employment. Many have been forced to change their names, careers, and education paths. Their worst nightmare is broadcast for the world to see with no way to escape it. Instead, they have to relive the abuse again and again.
Related: XVideos, World’s Most Popular Porn Site, Reportedly Hosts Nonconsensual Content & Child Exploitation
And while sometimes the content is intimate images made public, too often it’s actually filmed sexual assault.
One 18-year-old woman was reportedly violently gang-raped by three older men. Video of her abuse quickly filled the first five pages of search engine results for her full name, and Google didn’t remove the image-based sexual abuse results. The fate of many suffering victims is left in the hands of companies that decide, case by case, what gets taken down and what doesn’t. And even when content is removed, it’s almost always reposted at a new URL.
Even for victims who haven’t been physically assaulted, studies show that image-based sexual abuse survivors suffer similar trauma to sexual assault survivors—including PTSD, anxiety, depression, trust issues, suicidal thoughts, and negative coping mechanisms like self-medicating. The mental health and daily lives of many survivors are severely disrupted after they’ve been victimized.
The worst part is, image-based sexual abuse can be indistinguishable from mainstream porn. Consumers’ general lack of awareness is apparent in the fact that real abusive content has millions of views on mainstream sites.
Minimal accountability for deeply harmful content
It’s a tragedy that there’s a demand for image-based sexual abuse at all. And we’re not just talking on niche porn sites—it infiltrates mainstream porn platforms with or without consumers’ knowledge.
Major flaws exist when it comes to addressing the issue of nonconsensual porn and holding responsible parties accountable, including and especially the porn industry. Focus is often put on holding individuals responsible for sharing content without permission, but what about the companies that facilitate and profit from abusive content?
Currently, it’s completely legal for porn sites to host and profit from nonconsensual videos and images. Shocking, but true.
Related: Homepages Of 3 Most Popular Porn Sites Heavily Feature Sexual Violence, Study Finds
Keywords like “leaked,” “stolen,” or even “revenge porn” are dressed up as “fantasy” or “funny” and yield millions of search results on these platforms. Porn sites that host user-uploaded content too often knowingly let material go live unchecked, and ignore requests to take down nonconsensual content.
Individuals who post nonconsensual content can be virtually untraceable, and the porn industry gets a free pass for hosting and glamorizing it. This will be the case unless we take a stand for the individuals who suffer, and help stop the demand for image-based sexual abuse.