
Would AI-Generated Nudes Solve the Ethical Problems of Porn Sites?

July 7, 2020

Another tech startup founder seems to have woken up one morning and decided what the internet needed was more nude pictures.

A new company is selling AI-generated nude images of women for one dollar each. We’ll call it “Nonexistent Nudes” so as not to give them free advertising. One of the founders described the venture to VICE as a new opportunity in the porn industry, while also requesting to remain anonymous because he and his associates did not want their names publicly attached to their controversial company.

The company uses generative adversarial network (GAN) technology, trained on a dataset—in this case, a collection of photographs of real-life nude women—so the algorithm can “learn” from the images and create new pictures. Technically, the women in these new images do not exist in the real world, but they look real.
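To make the “learning” idea concrete: a GAN pits two models against each other—a discriminator that learns what real data looks like, and a generator that adjusts its output until the discriminator can no longer tell the difference. The toy sketch below is a loose, stdlib-only analogy using single numbers instead of images (all names and the update scheme are invented for illustration; real GANs use neural networks and gradient descent).

```python
import random

random.seed(0)

REAL_MEAN = 5.0  # the "real data": numbers drawn from around 5

def real_sample():
    return random.gauss(REAL_MEAN, 1.0)

# Discriminator: keeps a running estimate of what "real" looks like.
disc_estimate = 0.0

# Generator: a single parameter it adjusts to fool the discriminator.
gen_mean = 0.0

def disc_score(x):
    # Higher score = looks more "real" to the discriminator.
    return -abs(x - disc_estimate)

lr = 0.05
for step in range(2000):
    # Discriminator learns from real data.
    disc_estimate += lr * (real_sample() - disc_estimate)
    # Generator proposes a tweaked output and keeps it if the
    # discriminator rates it as more "real" than its current one.
    candidate = gen_mean + random.choice([-0.1, 0.1])
    if disc_score(candidate) > disc_score(gen_mean):
        gen_mean = candidate

print(round(gen_mean, 1))  # settles near REAL_MEAN
```

The key point the article relies on is visible even in this toy: the generator never copies any single real sample, yet everything it produces is shaped entirely by the real data it was trained against—which is why the provenance of the training images matters.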

Related: 7 Things You Can Do If You’re A Victim Of Deepfakes Or Revenge Porn

For now, the images are only of women nude from the waist up, appearing to be between 20 and 40 years old, and white. The co-founder said the company plans to add images of men, despite little demand for male nudes, and to build 3D models capable of generating custom pictures and videos.

“I think this is probably the first chance that anyone in the world has ever had to buy AI-generated pornographic content, so in a sense, each customer gets to be a part of porn and AI history,” the co-founder said.

And yet there are some things online that don’t need to exist, and create more problems than they set out to solve.


How did we get here?

There is a long history of new technologies finding success after being adopted by the porn industry.

VHS famously won out over Betamax in the late 1970s despite being the lower-quality format, reportedly because the porn industry preferred it (though the story is contested). As a result, VHS dominated the home video market for decades.

Related: Here’s What It’s Like To See Yourself In A Deepfake Porn Video

Virtual reality promises a whole new world of experiences but has largely remained a gimmick at tech conferences, and it too has turned to the porn industry to legitimize the technology to the rest of society. Artificial intelligence is slightly different: it is already used successfully in many business settings and is now turning to the porn industry for profit. So far, its two best-known but most problematic applications there are sex dolls and deepfakes.

Deepfakes paved the way for Nonexistent Nudes to exist. The technology was first used to superimpose the face of a female celebrity onto a female porn performer’s body in an adult video. Deepfakes are now made of noncelebrity women and are considered a form of image-based sexual abuse. The results are realistic but incredibly damaging to the women involved: the woman inserted into the video, who feels a loss of control over her body’s image, and the porn performer, whose work has been taken without her consent and weaponized against another woman.

Related: What Exactly Is “Image-Based Abuse,” And Why Is It So Hurtful?

Manipulating a video is quite different from generating new and unique images, which brings us to DeepNude. This was an app that used technology similar to Nonexistent Nudes’ to “undress” images of women. The app’s algorithm used GANs trained on thousands of images of naked women, so when a user uploaded a photo of a fully clothed woman, the app would generate fake but realistic-looking nude body parts. The app was taken offline in 2019 by its creator after severe backlash and criticism over how it objectified women.

Now, we have AI-generated nudes of realistic-looking but not real-life women. To some, this solves the ethical issues of porn, but this technology is not without its own problems.


Not harming anyone, is it?

While Nonexistent Nudes does not mix and match the bodies and faces of real women the way deepfakes do, its AI-generated images were still based on real women. The ethical question with this technology is: were those source images used with consent? As we mentioned, the algorithm learns from a dataset, or collection of images—but where did those images come from?

Most image datasets raise these ethical questions because it is difficult to verify the provenance of an image online. Even major companies like Microsoft have gotten it wrong. Last year, the company took down the largest dataset of faces used to train facial recognition systems because the people included had not consented to the use of their images. When it comes to datasets of nude or pornographic images, the content is often stolen from people who sell sex or scraped from Pornhub, which we know hosts nonconsensual and illicit images.

Related: Quarantine To Blame For Surge In OnlyFans Subscribers And Sextortion Cases

According to the Nonexistent Nudes co-founder, their datasets are ethical. He did not specify which datasets the company uses but said they are using reverse-image search to confirm the origin of each image. Not only does this sound like a time-consuming task, but it still isn’t 100% reliable, because it’s nearly impossible to be sure the subject of an image consented to it being shared online in the first place. Even if they did, would they want to be part of an algorithm churning out fake nude images?

To give them the benefit of the doubt, let’s assume the dataset doesn’t include any copyrighted or nonconsensual imagery. Is this acceptable then? Some may say people are going to look at porn regardless, so surely AI-generated versions are better than videos of victims forced into a porn scene.

Related: Private Images From When I Was 15 Years Old Ended Up On Porn Sites—And They Haven’t Been Removed

Still, ultimately, AI-generated images cannot truly solve the ethical problems in the porn industry. More nude images don’t remove the motivations of perpetrators of image-based sexual abuse who upload and share nonconsensual sexual images for a whole host of reasons: control, attention-seeking, entitlement, humiliation, or to build up social standing.


The problem of objectification

More nude images also don’t solve the problem of objectification. Treating a person as if they are an object or a “means to an end” can lead to feelings of inadequacy, anxiety, and depression.

Even if these women don’t exist in the world, they are made to appear as if they do, which could have similar effects on consumers and the people they interact with in real life. People are not products or a collection of interchangeable parts. Let’s not treat them that way.
