Is DeepNude Immoral? Should It Be Illegal?

  • DeepNude is immoral and should be illegal.

  • DeepNude is not immoral but should be illegal.

  • DeepNude is immoral but should not be illegal.

  • DeepNude is not immoral and should not be illegal.


Rembrandt Brown

This Horrifying App Undresses a Photo of Any Woman With a Single Click
The $50 DeepNude app dispenses with the idea that deepfakes were about anything besides claiming ownership over women’s bodies.
By Samantha Cole
Vice.com
Jun 26 2019

A programmer created an application that uses neural networks to remove clothing from images of women, making them look realistically nude.

The software, called DeepNude, takes a photo of a clothed person and creates a new, naked image of that same person. It swaps clothes for naked breasts and a vulva, and only works on images of women. When Motherboard tried using an image of a man, it replaced his pants with a vulva. While DeepNude works with varying levels of success on images of fully clothed women, it appears to work best on images where the person is already showing a lot of skin. We tested the app on dozens of photos and got the most convincing results on high-resolution images from Sports Illustrated Swimsuit issues.


Since Motherboard discovered deepfakes in late 2017, the media and politicians have focused on the dangers they pose as a disinformation tool. But the most devastating use of deepfakes has always been against women: whether by experimenting with the technology using images without women's consent, or by maliciously spreading nonconsensual porn on the internet. DeepNude is an evolution of that technology that is easier to use, and whose output is faster to create, than deepfakes. DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women’s bodies.

"This is absolutely terrifying," Katelyn Bowden, founder and CEO of revenge porn activism organization Badass, told Motherboard. "Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public."

This is an “invasion of sexual privacy,” Danielle Citron, professor of law at the University of Maryland Carey School of Law, who recently testified to Congress about the deepfake threat, told Motherboard.

“Yes, it isn’t your actual vagina, but... others think that they are seeing you naked,” she said. “As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore.”


An image of Tyra Banks, before (left) and after (right) using the DeepNude app. Censoring via Motherboard

DeepNude launched on June 23 as a website that shows a sample of how the software works, along with downloadable Windows and Linux applications.

Motherboard downloaded the application and tested it on a Windows machine. It installed and launched like any other Windows application and didn't require technical expertise to use. In the free version of the app, the output images are partially covered with a large watermark. In a paid version, which costs $50, the watermark is removed, but a stamp that says "FAKE" is placed in the upper-left corner. (Cropping out the "fake" stamp or removing it with Photoshop would be very easy.)

Motherboard tested it on more than a dozen images of women and men, in varying states of dress, from fully clothed to string bikinis, and a variety of skin tones. The results vary dramatically, but when fed a well-lit, high-resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic. The algorithm accurately fills in details where clothing used to be: the angles of the breasts beneath the clothing, nipples, and shadows.

An image of Natalie Portman, before (left) and after (right) using the DeepNude app. Censoring via Motherboard

But it's not flawless. Most images, and low-resolution images especially, produced some visual artifacts. DeepNude failed entirely on some photographs with unusual angles, lighting, or clothing that seemed to throw off the neural networks it uses. When we fed it an image of the cartoon character Jessica Rabbit, it distorted and destroyed the image altogether, throwing stray nipples onto a blob of a figure.

In an email, the anonymous creator of DeepNude, who requested to go by the name Alberto, told Motherboard that the software is based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017. Pix2pix uses generative adversarial networks (GANs), in which a generator network is trained on a huge dataset of images (in the case of DeepNude, more than 10,000 nude photos of women, the programmer said) while a discriminator network tries to tell its output apart from real images, so that each improves against the other. This algorithm is similar to what's used in deepfake videos, and what self-driving cars use to "imagine" road scenarios.
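
To make that training dynamic concrete, here is a minimal sketch of a pix2pix-style adversarial update in PyTorch. It illustrates the general GAN technique only; the toy networks, the L1 loss weight, and all names are assumptions for illustration, not DeepNude's actual code (the real pix2pix uses a U-Net generator and a PatchGAN discriminator).

Code:
# Minimal pix2pix-style GAN training step (illustrative sketch only).
import torch
import torch.nn as nn

# Tiny stand-ins that keep the example self-contained and runnable.
generator = nn.Sequential(nn.Conv2d(3, 3, kernel_size=3, padding=1), nn.Tanh())
discriminator = nn.Sequential(nn.Conv2d(6, 1, kernel_size=3, padding=1))

adv_loss = nn.BCEWithLogitsLoss()  # adversarial (real vs. fake) loss
l1_loss = nn.L1Loss()              # pix2pix's pixel-wise reconstruction term

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(source, target):
    """One adversarial update on a paired (input image, ground truth) batch."""
    fake = generator(source)

    # Discriminator: learn to tell real (input, output) pairs from fakes.
    opt_d.zero_grad()
    real_logits = discriminator(torch.cat([source, target], dim=1))
    fake_logits = discriminator(torch.cat([source, fake.detach()], dim=1))
    d_loss = (adv_loss(real_logits, torch.ones_like(real_logits)) +
              adv_loss(fake_logits, torch.zeros_like(fake_logits)))
    d_loss.backward()
    opt_d.step()

    # Generator: fool the discriminator while staying close to the target.
    opt_g.zero_grad()
    fake_logits = discriminator(torch.cat([source, fake], dim=1))
    g_loss = (adv_loss(fake_logits, torch.ones_like(fake_logits)) +
              100.0 * l1_loss(fake, target))
    g_loss.backward()
    opt_g.step()

source = torch.randn(1, 3, 64, 64)  # dummy paired input batch
target = torch.randn(1, 3, 64, 64)  # dummy paired ground-truth batch
train_step(source, target)

Looping train_step over a large paired dataset is, at a high level, how the generator learns to produce images the discriminator can no longer distinguish from real ones.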

The algorithm only works with women, Alberto said, because images of nude women are easier to find online—but he's hoping to create a male version, too.


"The networks are multiple, because each one has a different task: locate the clothes. Mask the clothes. Speculate anatomical positions. Render it," he said. "All this makes processing slow (30 seconds in a normal computer), but this can be improved and accelerated in the future."

Deepfake videos, by comparison, take hours or days to render a believable face-swapped video. Even for a skilled editor, manually using Photoshop to realistically change a clothed portrait into a nude would take several minutes.
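
As a rough illustration of how the staged pipeline Alberto describes might be composed, here is a minimal Python sketch. Every function below is a toy placeholder standing in for a trained network; the stage names follow his description, but the names and bodies are assumptions, not the app's real code.

Code:
# Hypothetical decomposition of the four stages the creator describes.
import numpy as np

def locate_clothes(img):
    # Stand-in detector: treat brighter-than-average pixels as "clothing."
    return img.mean(axis=-1) > img.mean()

def mask_clothes(img, region):
    # Zero out the detected region so the renderer knows what to fill.
    masked = img.copy()
    masked[region] = 0.0
    return masked

def estimate_anatomy(img, masked):
    # Stand-in for the network that predicts body structure under clothing.
    return np.zeros_like(img)

def render(masked, anatomy, region):
    # In-paint only the masked region from the predicted layout.
    out = masked.copy()
    out[region] = anatomy[region]
    return out

def process(image):
    region = locate_clothes(image)             # stage 1: locate the clothes
    masked = mask_clothes(image, region)       # stage 2: mask the clothes
    anatomy = estimate_anatomy(image, masked)  # stage 3: speculate anatomy
    return render(masked, anatomy, region)     # stage 4: render the result

image = np.random.rand(256, 256, 3).astype(np.float32)
output = process(image)

Splitting the job across specialized networks like this lets each stage be trained and debugged separately, which is consistent with the roughly 30-second processing time he reports.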

Why DeepNude was created
Alberto said he was inspired to create DeepNude by ads for gadgets like X-Ray glasses that he saw while browsing magazines from the 1960s and 70s, which he had access to during his childhood. The logo for DeepNude, a man wearing spiral glasses, is an homage to those ads.

"Like everyone, I was fascinated by the idea that they could really exist and this memory remained," he said. "About two years ago I discovered the potential of AI and started studying the basics. When I found out that GAN networks were able to transform a daytime photo into a nighttime one, I realized that it would be possible to transform a dressed photo into a nude one. Eureka. I realized that x-ray glasses are possible! Driven by fun and enthusiasm for that discovery, I did my first tests, obtaining interesting results."

Alberto said he continued to experiment out of "fun" and curiosity.

"I'm not a voyeur, I'm a technology enthusiast,” he said. “Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That's why I created DeepNude."

Unprompted, he said he's always asked himself whether the program should have ever been made: "Is this right? Can it hurt someone?" he asked.

An image of Gal Gadot, before (left) and after (right) using the DeepNude app. Censoring via Motherboard

An image of Kim Kardashian, before (left) and after (right) using the DeepNude app. Censoring via Motherboard

"I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial)," he said, noting that DeepNude doesn't transmit images itself, only creates them and allows the user to do what they will with the results.

"I also said to myself: the technology is ready (within everyone's reach)," he said. "So if someone has bad intentions, having DeepNude doesn't change much... If I don't do it, someone else will do it in a year."

Better, and much worse, than deepfakes
In the year and a half since Motherboard discovered deepfakes on Reddit, the machine learning technology they employ has moved at breakneck speed. Algorithmic face-swaps have gone from requiring hundreds of images and days of processing time in late 2017 to requiring only a handful of images, or even just text inputs, and a few hours of time in recent months.

Motherboard showed the DeepNude application to Hany Farid, a computer science professor at UC Berkeley who has become a widely cited expert on the digital forensics of deepfakes. Farid was shocked by this development, and by the ease with which it can be done.

"We are going to have to get better at detecting deepfakes, and academics and researchers are going to have to think more critically about how to better safeguard their technological advances so that they do not get weaponized and used in unintended and harmful ways," Farid said. "In addition, social media platforms are going to have to think more carefully about how to define and enforce rules surrounding this content. And, our legislators are going to have to think about how to thoughtfully regulate in this space."


Deepfakes have become a widespread, international phenomenon, but platform moderation and legislation have so far failed to keep up with this fast-moving technology. In the meantime, women are victimized by deepfakes and left behind in favor of a more US-centric political narrative. Though deepfakes have been weaponized most often against unconsenting women, most headlines and political fear have focused on their fake-news potential.

Even bills like the DEEPFAKES Accountability Act, introduced earlier this month, aren't enough to stop this technology from hurting real people.

"It’s a real bind—deepfakes defy most state revenge porn laws because it’s not the victim’s own nudity depicted, but also our federal laws protect the companies and social media platforms where it proliferates," attorney Carrie Goldberg, whose law firm specializes in revenge porn, told Motherboard. "It’s incumbent on the public to avoid consumption of what we call at my office humili-porn. Whether it’s revenge porn or deepfakes, don’t click or link or share or like! That’s how these sites make money. People need to stop letting their Id drive internet use and use the internet ethically and conscientiously."

DeepNude is easier to use and more accessible than deepfakes have ever been. Whereas deepfakes require a lot of technical expertise, huge datasets, and access to expensive graphics cards, DeepNude is a consumer-facing app that is easier to install than most video games, and it can produce a believable nude in 30 seconds with the click of a single button.
 
Editor's note and update shortly after publication:

Editor's note, June 27 1:05 p.m. EST: This story originally included five side-by-side images of various celebrities and DeepNude-manipulated images of those celebrities. While the images were redacted to not show explicit nudity, after hearing from our readers, academic experts, and colleagues, we realized that those images could do harm to the real people in them. We think it's important to show the real consequences that new technologies unleashed on the world without warning have on people, but we also have to make sure that our reporting minimizes harm. For that reason, we have removed the images from the story, and regret the error.



Update, June 27, 3:03 p.m. EST: The creator of DeepNude announced that he's taken down the app. Read more here.
 
Nothing wrong with this app. It's just a digital version of people's imaginations. If everyone was a talented artist, they could simply draw naked pictures of whoever they saw. That wouldn't be illegal if the person is in the public domain. This app just facilitates that. Banning this app would be trying to legislate people's imaginations.
 
Nothing wrong with this app. It's just a digital version of people's imaginations. If everyone was a talented artist, they could simply draw naked pictures of whoever they saw. That wouldn't be illegal if the person is in the public domain. This app just facilitates that. Banning this app would be trying to legislate people's imaginations.
This. Back then, people used Photoshop (for pictures).
 
Many of the female celebrities already have REAL nudes out there anyway. In the pro-LGBT/#metoo era, we'll see less female nudity in mainstream movies, and then I could see this taking off. I think it's the regular, non-famous women who need to be alarmed.
 
Where? I read not to download any apps that claimed to be "deep nude" because they would install malware.

The official DeepNude account posted the code behind the app on GitHub. It was written in Python, and I scrolled through it to make sure there wasn't anything nefarious going on. I pulled down the code a while back.

Looks like GitHub removed it. But I'm sure plenty of folks have the source code.
 
DeepNude is not illegal, as it would be seen as a form of parody and covered by the First Amendment (in America, of course; other countries may differ).

However, unquestionably it is immoral.
 
I'm away from my computer at the moment. Looks like GitHub took down the official repository. However, from a quick scroll through, this one looks legit. Again, use caution.

Code:
https://github.com/lwlodo/deep_nude

Source code; use caution, but it looks as though this is from the original creators, as they (deepinstruction) were the last ones to commit any changes to the code.
 
The other side of this is that the app now gives women who release nudes and then regret it afterwards an alibi. Just like when people post something and then claim their account was "hacked," women who get any negative backlash from releasing nudes will simply claim they are deep nudes.
 
Nothing wrong with this app. It's just a digital version of people's imaginations. If everyone was a talented artist, they could simply draw naked pictures of whoever they saw. That wouldn't be illegal if the person is in the public domain. This app just facilitates that. Banning this app would be trying to legislate people's imaginations.

I think you're right on principle.

But this will quickly evolve into video.

When it's a 12-year-old getting gangbanged, it will be harder to be so clear-cut.

Even for adult women, that can be dangerous.
 
This shit should be mad illegal, and easily worth a year in prison. Mandatory minimum for immorality. This is fucked up for women.

I wonder if they do this shit to men, on the homo side. Bet shit changes when a deepfake of Trump Jr. pops up on Fox News.
 
This shit should be mad illegal, and easily worth a year in prison. Mandatory minimum for immorality. This is fucked up for women.

I wonder if they do this shit to men, on the homo side. Bet shit changes when a deepfake of Trump Jr. pops up on Fox News.
A year in prison for pornography, you say? Yeah, sounds like a great idea.
 
Deepfakes/Deepnudes are going to make our society into even more of a clusterfuck than it already is. Technology is outpacing our ethical and philosophical imaginations (and certainly our legislation).

Doesn't help that most of the folks who will be grappling with this stuff on a legislative level are some ancient antediluvian fucks who can barely use email :smh:
 