
Man in Japan Uses Deepfake Technology to De-pixelate Pornographic Content

The country witnessed the first criminal case involving the use of deepfake technology. The crime? Using deepfake tech to de-pixelate porn.

November 23, 2021

News about deepfakes has been creeping into newsfeeds for the last several years, but last month, big headlines were made in Japan.

The country witnessed the first criminal case involving the use of deepfake technology. The crime? Using deepfake tech to de-pixelate censored pornographic content.

What’s the full story?

Website owner Masayuki Nakamoto, 43, was found to have used the artificial intelligence-driven tool “DeepFake” to depixelate—or in other words, unblur—the genitalia of several Japanese adult performer images. This is a crime in Japan, where an obscenity law prohibits the explicit display of genitalia.

Deepfakes are digital manipulations of someone’s body or voice. They often involve face swapping, in which one person’s face is superimposed on a body that is not theirs, or placed in a fabricated context that appears real.

Related: This Woman Developed An App For Image-Based Abuse Survivors After Finding Her Video On A Porn Site

However, this case was different. Rather than swapping faces, Nakamoto used a large number of uncensored nude images found online to train his software to reconstruct the blurred genitalia in the porn videos, restoring them to explicit form.


He sold over 10,000 videos of this manipulated content online, making over 11 million yen (around $96,000 USD). He was arrested over just a fraction of these, 10 fake photos, and charged with copyright violation and displaying uncensored content. However, he faced no charges related to sexual exploitation or privacy, because there are currently no laws in Japan criminalizing the use of AI for these purposes.

This may seem like just another deepfakes story, but it shows new ways the technology is being used to manipulate media.

Deepfakes: a growing threat?

This isn’t the first time de-pixelating technology has been used. In fact, as early as 2016, researchers warned that pixelation was a weak form of privacy protection: using standard image recognition techniques, they had created a tool that could reconstruct content strikingly similar to what had been blurred.
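To see why pixelation is weaker than it looks, consider that mosaicking is a deterministic transform: an attacker who has a set of candidate images can pixelate each candidate and simply match it against the blurred target. The following sketch (a hypothetical illustration, not the researchers’ actual tool, which used more sophisticated image recognition) demonstrates the idea with NumPy:

```python
# Minimal sketch of a "template attack" on pixelation, assuming the attacker
# holds a candidate set that contains the original image. Pixelation here is
# simple mosaicking: averaging each block x block tile.
import numpy as np

def pixelate(img, block=4):
    """Mosaic a grayscale image by averaging each block x block tile."""
    h, w = img.shape
    out = img.astype(float).copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]
            tile[:] = tile.mean()  # in-place: tile is a view into out
    return out

def identify(target_pixelated, candidates, block=4):
    """Return the index of the candidate whose pixelated form best matches."""
    dists = [np.sum((pixelate(c, block) - target_pixelated) ** 2)
             for c in candidates]
    return int(np.argmin(dists))

rng = np.random.default_rng(0)
candidates = [rng.integers(0, 256, (16, 16)).astype(float) for _ in range(5)]
blurred = pixelate(candidates[3])          # "censor" candidate 3
print(identify(blurred, candidates))       # → 3: the blur did not hide it
```

The point is that blurring discards detail but still leaks enough information to distinguish the original from alternatives; this is the same weakness the 2016 research exploited at scale.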

This technology has only improved with time. Now, as this case demonstrates, it is being applied with more troubling implications for privacy and legal systems.

Related: Here’s What It’s Like To See Yourself In A Deepfake Porn Video

This is especially concerning because deepfake creation has increased significantly since the term was first coined around 2017, requiring less technical skill each year while producing more realistic results.

Just to get an idea, tech experts estimate that the number of expert-made deepfakes doubles every six months. While most deepfakes out there are meant for entertainment, the flip side of this technology is that it can be used to ruin reputations, mislead on a mass scale, and exploit others, often with few legal repercussions.

A closer look at deepfake content

When deepfakes first started growing in popularity, they were overwhelmingly used to create deepfake porn of celebrities. However, this expanded to include women of many professions, and increasingly, private individuals.

Images or videos from social media accounts were often enough to produce a realistic deepfake. In fact, by 2019, the vast majority of deepfakes online were nonconsensual pornography.

One of the leading companies publishing on the current state of deepfakes, Sensity, reported in its annual report in 2019 that 96% of deepfakes on the internet were nonconsensual pornography. The 2020 report found that, “Reputation attacks by defamatory, derogatory, and pornographic fake videos still constitute the majority [of deep fakes] by 93%.”

Related: What I Did When I Found Out My Partner Posted Photos Of Me To Porn Sites

Sensity’s reports consider only certified deepfake videos that harm, or have the potential to harm, public figures. Other reports, which look at a wider range of deepfakes, including those created by individuals for personal use, find that almost 1 in 5 deepfake videos is pornographic, and that the majority of non-pornographic deepfakes are made for entertainment, with face swaps among the most common manipulations.

However different the numbers, most reports warn about the rise of pornographic deepfakes and the concerns they raise about issues like privacy and consent.

Consider this: in 2020, over 600,000 women were nonconsensually “stripped” through special deepfake bots on the social media platform Telegram. Within a minute, users could strip any image they wanted at virtually no cost. A few months later, a deepfake website launched that allowed the same thing, “nudifying” images of women with a single click.

One report found that there are over 100,000 members of underground deepfake porn communities that create and share nonconsensual deepfake porn.

Get The Facts

A growing form of sexual exploitation

At the end of the day, what these troubling trends highlight is that individuals, primarily women, are increasingly having their image and/or voice used in ways they never intended or consented to for the sexual gratification of someone else.

Sometimes the victims aren’t even aware their image has been exploited, because their consent was never sought nor considered in the making of their deepfake. In the worst of cases, deepfake porn can even be used to extort or blackmail victims.

Related: Google Takes Steps To Suppress Search Results For Revenge Porn And Protect Survivors

Ultimately, pornographic deepfakes are often nonconsensual pornography, a type of sexual exploitation and a serious crime that leaves victims severely affected.

And while deepfakes involving face swaps for a laugh with friends may be harmless enough, nonconsensual deepfake porn is no joke.

To read more about how the porn industry profits from nonconsensual content, click here.
