Imagine waking up to find your face in explicit photos you never created. For 25-year-old Hannah Grundy, that nightmare became reality. The BBC first reported Hannah's story after she received an ominous message from an anonymous sender in her inbox. The sender attached a link with the warning: “[This] contains disturbing material.”
What Hannah found was far more disturbing than she could ever have imagined. The link led to a website called “The Destruction of Hannah,” which hosted more than 600 deepfakes: explicit, AI-generated photos of Hannah that she had never taken. The website listed her address and Instagram handle, and she later found out that her phone number had been leaked as well.
These fabricated photos altered Hannah's life forever. She recalls how she “immediately [felt] unsafe” after learning about the website. Hannah and her partner installed cameras, set up location tracking on Hannah's devices, and kept knives by their bed. She even began wearing a watch that would alert someone if her heart rate rose or dropped. Terror and humiliation followed Hannah wherever she went.
And Hannah's story isn't an isolated one. Georgia Harrison, a reality TV star who has appeared on The Only Way Is Essex and Love Island, faced a similar violation when her former boyfriend shared a private video of them having sex. The BBC also covered Georgia's story after the video, shared without her consent, spread widely online. Georgia described the “feeling of humiliation, a violation of literally being de-clothed without your consent,” and said the knowledge of that video is “always something in the back of [her] mind.” She still “really struggles” with its circulation.
Georgia and Hannah are not alone in their struggle with image-based sexual abuse. The rise of deepfake pornography is alarming: a 2023 study by Security Hero found that 98% of deepfake videos are pornographic.[1] The ability to create pornography from nothing more than a single photo of someone is a powerful tool. Security Hero also noted that the number of deepfake videos is growing rapidly, with a 550% increase in deepfake porn videos between 2019 and 2023, and that 99% of all deepfake porn targets women.
The Harms of Deepfake Pornography
The spread of this phenomenon is having devastating social, psychological, and career consequences for its victims. Consider first the social effects. One of the biggest problems with image-based sexual abuse is its inescapability: once material reaches the internet, it is virtually impossible to remove. Victims of deepfake abuse therefore suffer a twofold trauma: the non-consensual violation of their privacy and dignity, and the near-impossible task of getting the photos or videos taken down once they have been leaked.
Both Georgia and Hannah described their struggle to get the videos and photos removed. When Hannah went to the police to have her photos taken down, she was met with disbelief and apathy. One officer asked what she had done to end up with these photos online; another pointed to an explicit photo and said she looked “cute” in the skimpy outfit. The police's response made Hannah feel like she was “making a big deal out of nothing.” But for Hannah, this deeply traumatizing experience was “life-changing,” and for her “and for the other girls, [the photos are] forever… they will always be on the internet.” In the end, it wasn't the police who got the photos removed; Hannah had to lead her own private effort to get the website taken down. She found herself retreating inward, isolating from those around her. Like Hannah, Georgia felt the social inescapability of the situation: even years after the video was leaked, she continued to receive messages about it, with people “sending [her] clips of it.” These social effects carry heavy psychological consequences as well.
Deepfake pornography also causes severe mental health problems. A 2024 study published in Psychiatry Research reviewed nine years of research on the mental health effects of consensual and non-consensual sharing of sexual images. Nine of the studies it surveyed found an increased likelihood of “depression, anxiety, non-suicidal self-harm, and suicidal ideation”[2] following the non-consensual sharing of sexual images.
The repercussions for professional life are equally devastating. Victims face public humiliation and struggle to keep their jobs or advance in their careers because of content they had nothing to do with. A 2022 study in the International Review of Victimology found that image-based sexual abuse directly affected women's employment and education; women in customer service jobs described difficulty facing customers and concentrating at work.[3] Fear of how others might perceive them, or of whether a stranger had seen their photos, was a real burden victims of image-based sexual abuse had to carry.
Resources for Victims
The rise of deepfake pornography is not just a technological problem; it is a blatant violation of human rights with devastating psychological, social, and career consequences. That's why we need Fighters like you to educate yourselves on the harms of pornography, especially in a world of growing AI-generated content. And if you've been a victim of deepfake pornography or image-based sexual abuse, we've got resources to help you. You're not alone. Keep fighting!
Citations
1. 2023 State of Deepfakes: Realities, Threats, and Impact. (2023). SecurityHero.io. https://www.securityhero.io/state-of-deepfakes/#overview-of-current-state
2. Seigfried-Spellar, K. C., Petronio, C., Leshner, G., & Tuttle, H. (2024). The psychological impact of image-based sexual abuse victimization. Psychiatry Research, 330, 1155207. https://doi.org/10.1016/j.psychres.2024.1155207
3. Henry, N., & Powell, A. (2022). The prevalence and impacts of deepfake pornography: An exploratory study. Journal of Forensic Practice, 24(1), 29–42. https://doi.org/10.1177/02697580211063659