What if something as simple as applying an Instagram filter could be used to ruin someone’s life?
According to experts, because of advances in deepfake technology, this could very likely be a reality sooner than we think.
The dangerous future of deepfake porn
In recent years, deepfake technology has been used to fabricate content of real people—particularly celebrities and public figures, though everyday people can be victims, too. The face of a celebrity or any private individual can be seamlessly superimposed onto the body of a porn performer, for example, to create a convincing but illicit product.
This concept, unfortunately, has taken the internet by storm.
Explicit content depicting innocent individuals has emerged rapidly online, and the most popular porn sites seem to have no problem with streaming and capitalizing on it.
According to Shamir Allibhai, CEO of the video verification company Amber, within just a matter of years the general public will have the ability to create massive amounts of deepfake content.
Allibhai warns that the day is not far off when creating deepfake porn will be as easy as applying an Instagram filter, and that the ramifications for individuals, relationships, and society could be catastrophic.
In a recent interview with Daily Star Online, Allibhai shared, “It will soon be as easy to create a fake as it is to add an Instagram filter and women will be the primary target of the weaponization of this deepfake technology.”
“The havoc is two-fold,” he explained. “At a primary level, relationships will be broken, people will be blackmailed. On a deeper level, society will become cynical if we don’t have video veracity solutions in effect. We will evolve to become distrusting and view everything with skepticism. And when this happens, it will chip at trust among citizens, a foundation of democracies.”
According to Daily Star, there is no specific law covering deepfake porn in the UK, but recent technological advances are generating calls to create one. Allibhai believes deepfakes could disrupt the global judicial system, and even result in innocent people who are portrayed in the videos going to jail.
Explicit content of innocent people generated from scratch
As the software used to make deepfakes has advanced, concerns around this technology have escalated in recent years—meaning Allibhai’s warnings aren’t the first and certainly won’t be the last.
But his analysis brings to the surface something worth considering: what would our world look like if this technology that’s so often used to exploit were easily accessible in everyone’s pocket?
Imagine a world where anyone could take an image of their ex, a teacher, family member, or complete stranger and instantly create explicit content that’s virtually impossible to discern from something real. Now consider the fact that this reality may not be too far off.
Allibhai explains, “Initially, deepfakes will be manipulations of existing audio/video evidence, such as that from CCTV, voice recorders, police body cams, and bystanders’ cell phones. Humans have notoriously weak hearing as compared to our sight: I would bet that we get fooled by fake audio first. It is also much easier to create believable fake audio than it is to create believable fake video. In the future, video will be generated from scratch, with no basis in actual footage.”
Currently, the majority of deepfake images and videos are created by manipulating one or more existing images. But Allibhai’s analysis reiterates that fake content can be generated entirely from scratch, and that this technology will soon be available to the masses.
You can help stop the demand for deepfake porn
When harsh realities like these are brought to the forefront, it can seem difficult to identify solutions or understand how to make an impact. But the reality is, the supply of deepfake porn only exists because there’s a demand for it, so let’s start there.
The porn industry reinforces an attitude of sexual entitlement—the idea that anyone should be able to consume whatever and whomever they’d like as fantasy at any time. But when individuals refuse to click, stream, or share deepfake porn—or any porn, for that matter, given how tricky it can be to discern real content from fake—the demand decreases. A decrease in demand triggers a reduction in supply, and potentially in the use of these technologies to exploit and harm individuals, relationships, and society.
And if you or someone you know has been a victim of deepfake porn, there are steps you can take to protect yourself.
Share what you know about the harms of porn, even if your voice makes a difference for just one person. Because that’s how real change is made—one person at a time.
Every individual deserves to understand how porn can harm them, those involved in its production, and the people they love. Everyone deserves to have all the facts so they can consider them before consuming. Everyone deserves more than porn.