Another tech startup founder seems to have woken up one morning and decided that what the internet needed was more nude pictures.
A new company is selling AI-generated nude images of women for one dollar each. We’ll call it “Nonexistent Nudes” so as not to give it free advertising.
One of the founders described the venture to VICE as a new opportunity in the porn industry. He asked to remain anonymous because he and his associates did not want their names publicly attached to such a controversial company.
The company uses generative adversarial network (GAN) technology trained on a dataset—in this case, a collection of photographs of real-life nude women. A generator network produces candidate images while a discriminator network judges whether they look real, and the generator gradually “learns” to create new pictures the discriminator can no longer tell apart from the originals.
Technically, the women in these new images do not exist in the real world, yet they look real.
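To make the adversarial “learning” concrete, here is a minimal toy sketch in pure NumPy—an illustration only, not the company’s actual system. A one-line “generator” learns to mimic a simple 1-D number distribution (standing in for real photographs), while a logistic “discriminator” tries to tell real samples from fake ones; all names and parameters are invented for this example.

```python
# Toy 1-D GAN: generator G(z) = a*z + b learns to imitate samples
# from N(4, 1.25), while discriminator D(x) = sigmoid(w*x + c)
# estimates the probability that x came from the real data.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_real(n):
    # Stand-in for "real photographs": numbers drawn from N(4, 1.25).
    return rng.normal(4.0, 1.25, n)

a, b = 0.1, 0.0   # generator parameters
w, c = 0.1, 0.0   # discriminator parameters
lr, batch = 0.05, 64

for step in range(2000):
    z = rng.normal(size=batch)
    real, fake = sample_real(batch), a * z + b

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_w = np.mean((d_real - 1.0) * real) + np.mean(d_fake * fake)
    grad_c = np.mean(d_real - 1.0) + np.mean(d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator update (non-saturating loss): push D(fake) toward 1,
    # i.e. make fakes the discriminator mistakes for real data.
    d_fake = sigmoid(w * fake + c)
    upstream = (d_fake - 1.0) * w      # dLoss/dG for -log D(G(z))
    a -= lr * np.mean(upstream * z)
    b -= lr * np.mean(upstream)

fakes = a * rng.normal(size=1000) + b
print(f"generated mean: {fakes.mean():.2f} (real mean: 4.0)")
```

The image-generating version works the same way in principle, except that the generator and discriminator are deep neural networks and the samples are pixel grids rather than single numbers.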
For now, the images are only of women who are nude from the waist up, appear to be between 20 and 40 years old, and are white. The co-founder said the company plans to add photos of men, despite there being little demand for male nudes, and to build 3D models capable of creating custom pictures and videos.
“I think this is probably the first chance that anyone in the world has ever had to buy AI-generated pornographic content, so in a sense, each customer gets to be a part of porn and AI history,” the co-founder said.
And yet, some things online don’t need to exist; they create more problems than they solve.
How did we get here?
The porn industry has a long history of adopting new technologies, which often find success.
VHS was famously chosen over Betamax in the 1970s (although the story is contested), despite being the lower-quality format, simply because the porn industry preferred it. As a result, VHS dominated the home video market for decades.
Virtual reality promises a whole new world of opportunities but often amounts to a gimmick at tech conferences; it, too, has turned to the porn industry to legitimize the technology in the eyes of the rest of society. Now the porn industry is harnessing artificial intelligence, already prevalent in many business settings, for profit.
So far, two well-known but problematic applications are sex dolls and deepfakes.
Deepfakes paved the way for Nonexistent Nudes. The technology was initially used to superimpose a female celebrity’s face onto a porn performer’s body in an adult video. Deepfakes are now made of noncelebrity women as well and are considered a form of image-based sexual abuse.
The results are realistic but incredibly damaging to the women involved: the woman inserted into the video feels a loss of control over the image of her own body, and the porn performer sees her work taken without consent and weaponized against another woman.
Manipulating a video differs from generating new and unique images, which brings us to DeepNude, an app that used technology similar to Nonexistent Nudes’ to “undress” images of women.
The app’s algorithm was trained on thousands of images of naked women, generating realistic-looking nude body parts from fully clothed ones. In 2019, the creator took the app offline after severe backlash and criticism over its objectification of women.
Now, we have AI-generated nudes of realistic-looking but not real-life women. To some, this solves the ethical issues of porn, but this technology is not without its problems.
Not harming anyone, is it?
While Nonexistent Nudes doesn’t manipulate real women’s bodies and faces like deepfakes, its AI-generated images are still based on real women.
The ethical question with this technology is: were those images shared consensually? As mentioned, the algorithm learns from a dataset, a collection of photos, but where did those images come from?
Most image datasets raise these questions because it is difficult to determine the provenance of an image found online. Even major companies like Microsoft have gotten it wrong: in 2019, the largest dataset of faces used to train facial recognition systems was taken down because the people in it had never consented to the use of their images.
When it comes to datasets of nude or pornographic images, the content is often stolen from people who sell sex or scraped from Pornhub, which we know includes nonconsensual and illicit photos.
According to the Nonexistent Nudes co-founder, their datasets are ethically sourced. He didn’t specify which datasets were used but said the company runs a reverse-image search to verify each image’s origin. That process is time-consuming and not foolproof; it is difficult to ascertain whether subjects ever consented to having their images shared online.
Even if they did, would they want to be a part of an algorithm churning out fake nude images?
To give the benefit of the doubt, let’s assume the dataset includes no copyrighted or nonconsensual imagery. Is this acceptable then? Some argue that, since people will view porn regardless, AI-generated porn is preferable to videos of victims coerced into porn scenes.
Still, AI-generated images ultimately cannot solve the ethical problems of the porn industry.
More nude images don’t remove the motivations of perpetrators of image-based sexual abuse who upload and share nonconsensual sexual images for a whole host of reasons: control, attention-seeking, entitlement, humiliation, or to build up social standing.
The problem of objectification
More nude images also don’t solve the problem of objectification. Treating a person as if they are an object or a “means to an end” can lead to feelings of inadequacy, anxiety, and depression.
Porn is not an accurate representation of how everyday people look or how sex and intimacy work in real-life relationships, yet research shows that porn can, and does, shape the way consumers think about others and about sex.
Real connection starts with seeing others as whole people with unique thoughts, feelings, dreams, struggles, and lives. Viewing people as products is harmful to individuals, relationships, and, ultimately, society as a whole.
The collective private actions of millions affect the larger culture—objectifying others privately on our screens doesn’t inspire respect and dignity in public. The private impacts the public—that’s how culture works.
If we want a culture of proper respect and equality, we need to consider, talk about, and treat others as whole people—not as objects.