“Revenge porn” is a global problem. Now, thanks to one survivor, victims of nonconsensual pornography can scan the internet and find where their images ended up.
But it’s heartbreaking that this solution is needed in the first place.
More and more, we hear about how technology can be, and is being, used to solve some of the world’s toughest problems. Sometimes, though, it creates new ones.
The issue of image-based sexual abuse (IBSA), more commonly (though less accurately) known as “revenge porn,” is one of these urgent issues. In a nutshell, IBSA is the “non-consensual distribution and/or creation of private sexual images,” which can be fueled by different motivations.
The form of IBSA most visible in mainstream culture is that motivated by someone trying to get back at, blackmail, or threaten an ex-partner using their sexual content, hence the “revenge” in “revenge porn.”
Related: Here’s What It’s Like To See Yourself In A Deepfake Porn Video
However, the majority of IBSA, or “nonconsensual pornography” as it’s also called, encompasses a wider range of practices and motivations. These include abuses like secret filming and voyeurism, deepfakes, sextortion, and upskirting, committed not by angry ex-partners but by hackers, trolls, stalkers, or domestic abusers.
A closer look at the numbers
This abuse is becoming more prominent in our technological society, where it is easier than ever to create, edit, upload, send, and store content. It’s found mostly among younger people, particularly those in their 20s.
Looking at the numbers, 1 in 12 U.S. adults report having been a victim of IBSA, and 1 in 20 report having been a perpetrator.
Related: Are Porn Sites Protecting Victims Of Nonconsensual Content? We Investigated, Here’s What We Found
In 2019, a survey of 6,000 respondents across the UK, Australia, and New Zealand found that 1 in 3 reported being a victim of IBSA. It is speculated that the number of IBSA victims in the U.S. grew even higher during the COVID-19 pandemic, due to the increase in sharing of sexually explicit content.
So is there hope? Absolutely.
Despite these growing numbers, the same force that has enabled the rise in IBSA cases can also be used to stop it: technology can be turned against these abusive practices.
This is exactly what one woman, who goes by “Tisiphone,” is attempting to do.
Related: What I Did When I Found Out My Partner Posted Photos Of Me To Porn Sites
A victim of IBSA herself, Tisiphone has taken it upon herself to help others by creating a facial recognition tool that can detect whether someone has been a victim of IBSA.
Alecto AI: a new app to combat image-based sexual abuse
Tisiphone, a 25-year-old Chinese woman, was shocked one spring afternoon when a friend called with horrible news: there was reportedly an intimate video of her on Pornhub that had been filmed without her consent when she was a teenager.
She stated, “The incident happened maybe seven years ago. I was really young…I had no idea that monster had secretly filmed me until I saw my video on Pornhub.” The discovery of the video was devastating, as it can be for many victims of IBSA, and even pushed Tisiphone to attempt suicide, thinking, “I can’t live anymore. I don’t want to live anymore.”
Since that tragic moment, Tisiphone has used her skills in tech to fight back. She created Alecto AI, an app that scans a user’s face and then searches online for possible content featuring them. Named for the Greek Fury Alecto, meaning “unceasing in anger,” the app, Tisiphone hopes, can help thousands of victims like herself.
Related: Google Takes Steps To Suppress Search Results For Revenge Porn And Protect Survivors
In an interview, she stated that the most difficult part for survivors was searching for and locating their content, which can often be scattered all over the internet, all “while being forced to relive our trauma over and over again.” According to her, Alecto AI is the “powerful, unbiased, and compassionate” tool that can help survivors with this challenging process.
“We can’t defend ourselves unless we have access to technology that can help us do so,” she said.
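Alecto AI’s internals haven’t been made public, but the general approach it describes, matching a scan of the user’s face against faces found in online images, can be illustrated in a few lines. Below is a minimal sketch using the open-source face_recognition library; it is not Alecto AI’s code, and the file and folder names are hypothetical stand-ins.

```python
# A minimal sketch of face matching with the open-source `face_recognition`
# library. Illustrative only; NOT Alecto AI's code. The file and folder
# names below are hypothetical.
from pathlib import Path

import face_recognition


def find_matches(reference_photo: str, candidate_dir: str,
                 tolerance: float = 0.6) -> list[str]:
    """Return paths of candidate images with a face matching the reference."""
    # Encode the user's face as a 128-dimensional embedding.
    reference_image = face_recognition.load_image_file(reference_photo)
    encodings = face_recognition.face_encodings(reference_image)
    if not encodings:
        raise ValueError("No face found in the reference photo.")
    reference = encodings[0]

    matches = []
    for path in sorted(Path(candidate_dir).glob("*.jpg")):
        image = face_recognition.load_image_file(str(path))
        # A candidate image may contain zero or more faces; check each one.
        for encoding in face_recognition.face_encodings(image):
            # Lower distance means more similar; `tolerance` is the cutoff
            # (0.6 is the library's conventional default).
            if face_recognition.face_distance([reference], encoding)[0] <= tolerance:
                matches.append(str(path))
                break
    return matches


if __name__ == "__main__":
    for hit in find_matches("my_selfie.jpg", "downloaded_frames/"):
        print("Possible match:", hit)
```

The tolerance threshold is the key design trade-off in any tool like this: a looser cutoff surfaces more potential matches for a survivor to review, but also more false positives. It is also where the accuracy disparities Tisiphone criticizes in existing tools tend to creep in, since face-matching models are only as balanced as the data they were trained on.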
A growing toolkit to fight IBSA
Alecto AI, expected to launch by the end of 2021, is not the first app that leverages facial recognition software to detect IBSA.
Others, such as Rumiki and Facepinpoint, have appeared over the last few years. Big tech has also stepped up: earlier this year, Google changed its search algorithm to suppress results that could contain IBSA content.
Related: Would AI-Generated Nudes Solve The Ethical Problems Of Porn Sites?
And in 2019, Facebook launched an artificial intelligence tool to help victims, though it did not rely on facial recognition software, since the company did not want to involve victims directly.
However, Tisiphone noted in an interview that existing tools lack appropriate security measures for protecting victims’ sensitive information and are less accurate when used on women or people of color, a critical consideration given that the vast majority of IBSA victims are women.
Alecto AI aspires to apply a more “human-centered” approach to its technology and to decentralize the process so individuals do not have to rely on major platforms.
The fight forward
IBSA is a difficult reality that more and more people, particularly women, find themselves victims of, but it is also being tackled on more and more fronts.
In the U.S., lawmakers are discussing how to hold perpetrators accountable under the law, and tech platforms are developing new strategies to deal with IBSA.
Related: This U.S. Bill Could Make Sharing “Revenge Porn” A Federal Crime
Emerging initiatives like Alecto AI may prove incredibly useful for survivors, and they offer hope that, together with support organizations, survivors can address and recover from their trauma.
Click here to learn what you can do if you’re a victim of revenge porn or deepfake porn.