Deep fakes: Kats are now putting Celebrity faces on porn sluts to make new PORN... (Da Remix)

fonzerrillii

BGOL Elite Poster
Platinum Member
Fucking just noticed that my previous thread was a casualty of the purge... So this is the reup.




We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now
A user-friendly application has resulted in an explosion of convincing face-swap porn.



In December, Motherboard discovered a redditor named 'deepfakes' quietly enjoying his hobby: swapping celebrities' faces onto porn performers' bodies. He made several convincing porn videos of celebrities—including Gal Gadot, Maisie Williams, and Taylor Swift—using a machine learning algorithm, his home computer, publicly available videos, and some spare time.

Since we first wrote about deepfakes, the practice of producing AI-assisted fake porn has exploded. More people are creating fake celebrity porn using machine learning, and the results have become increasingly convincing. Another redditor even created an app specifically designed to allow users without a computer science background to create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available, and accompanied with instructions that walk novices through the process.


These are developments we and the experts we spoke to warned about in our original article. They have arrived with terrifying speed.

Shortly after Motherboard published its story two months ago, deepfakes created a subreddit named after himself and dedicated to his practice. In that short time, it has already amassed more than 15,000 subscribers. Within the community, the word “deepfake” itself is now a noun for the kind of neural-network-generated fake video its namesake pioneered.

Another user, called 'deepfakeapp,' created FakeApp, a user-friendly application that allows anyone to recreate these videos with their own datasets. The app is based on deepfakes' algorithm, but deepfakeapp created FakeApp without the help of the original deepfakes. I messaged deepfakes, but he didn’t respond to a request for comment on the newfound popularity of his creation.

Deepfakeapp told me in a Reddit direct message that his goal with creating FakeApp was to make deepfakes’ technology available to people without a technical background or programming experience.

“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” he said. “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”


In early January, shortly after Motherboard’s first deepfakes story broke, I called Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, to talk about the implications of this technology on society at large: “I think we’re on the cusp of this technology being really easy and widespread,” he told me, adding that deepfakes were pretty difficult to make at the time. “You can make fake videos with neural networks today, but people will be able to tell that you’ve done that if you look closely, and some of the techniques involved remain pretty advanced. That’s not going to stay true for more than a year or two.”

In fact, that barely stayed true for two months. We counted dozens of users who are experimenting with AI-assisted fake porn, some of whom have created incredibly convincing videos.

Redditor UnobtrusiveBot put Jessica Alba’s face on porn performer Melanie Rios’ body using FakeApp. “Super quick one - just learning how to retrain my model. Around 5ish hours - decent for what it is,” they wrote in a comment.



Fakes posted in the subreddit have already been pitched as real on other websites; a deepfake of Emma Watson taking a shower was reuploaded by CelebJihad—a celebrity porn site that regularly posts hacked celebrity nudes—with a caption claiming the “never-before-seen video above is from my private collection, and appears to feature Emma Watson fully nude and flaunting her naked sex organs while showering with another girl.”

Other redditors have trained models on video from celebrities’ public Instagram stories and used them to transfer those faces onto nude Snapchats posted by amateurs: “I lucked out that this amateur does similar silly dancing moves and facial expressions as Chloe sometimes does in her instagram stories,” the creator of a deepfake of actress Chloe Bennet wrote.



Most of the posts in r/deepfakes so far are porn, but some users are also creating videos that show the far-reaching implications of a technology that allows anyone with sufficient raw footage to convincingly place any face in any video. A user named Z3ROCOOL22 combined footage of Hitler with Argentina’s president Mauricio Macri.

According to deepfakeapp, anyone who can download and run FakeApp can create one of these videos with only one or two high-quality videos of the faces they want to fake. The subreddit’s wiki states that FakeApp is “a community-developed desktop app to run the deepfakes algorithm without installing Python, Tensorflow, etc.,” and that all one needs to run it is a “good GPU [graphics processing unit, the kind that high-end 3D video games require] with CUDA support [NVIDIA’s parallel computing platform and programming model].” If users don't have the proper GPU, they can also rent cloud GPUs through services like Google Cloud Platform. Running the entire process, from data extraction to frame-by-frame conversion of one face onto another, would take about eight to 12 hours if done correctly. Other people have reported spending much longer, sometimes with disastrous results.
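The core idea behind the algorithm described above can be illustrated with a minimal sketch. This is not FakeApp's actual code (which is built on TensorFlow and learned convolutional networks); it is a toy numpy illustration of the architecture the deepfakes approach is known for: a single shared encoder that learns a common face representation, plus one decoder per identity, where "swapping" means encoding person A's frame and reconstructing it with person B's decoder. All dimensions and names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

FACE_DIM = 64 * 64   # flattened 64x64 face crop (illustrative size)
LATENT_DIM = 128     # size of the shared latent representation

# Shared encoder weights — in the real pipeline these are learned from
# thousands of extracted video frames, not drawn at random.
W_enc = rng.normal(scale=0.01, size=(FACE_DIM, LATENT_DIM))

# One decoder per identity, each trained to reconstruct only that person.
W_dec_a = rng.normal(scale=0.01, size=(LATENT_DIM, FACE_DIM))  # person A
W_dec_b = rng.normal(scale=0.01, size=(LATENT_DIM, FACE_DIM))  # person B

def encode(face):
    """Map a flattened face crop into the shared latent space."""
    return np.tanh(face @ W_enc)

def decode(latent, W_dec):
    """Reconstruct a face crop using one identity's decoder."""
    return latent @ W_dec

def face_swap(face_a):
    """Encode person A's face, then decode it with person B's decoder."""
    return decode(encode(face_a), W_dec_b)

frame_a = rng.random(FACE_DIM)  # stand-in for one extracted video frame
swapped = face_swap(frame_a)
print(swapped.shape)            # same size as the input crop
```

Because both decoders read from the same latent space, pose and expression from A's frame survive the swap while B's decoder renders B's facial features — which is why the process needs hours of per-frame extraction, training, and conversion rather than a single pass.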


An incredibly easy-to-use application for DIY fake videos—of sex and revenge porn, but also political speeches and whatever else you want—that moves and improves at this pace could have society-changing impacts on the way we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences.




“Socially and culturally, this is exploitative but quite survivable,” Jay Owens, digital media analyst and research director at audience intelligence platform Pulsar told me in an email. “Viral videos and celebrity media already operate on a plane of pure entertainment—but this'll only get sexier and meme-ier and lulzier and ever-more unreal.”

Deborah Johnson, professor emeritus of applied ethics at the University of Virginia’s School of Engineering, told me there’s no doubt this technology will get so good that it’ll be impossible to tell the difference between an AI-generated face swap and the real thing.

“You could argue that what’s new is the degree to which it can be done, or the believability, we’re getting to the point where we can’t distinguish what’s real—but then, we didn’t before,” she said. “What is new is the fact that it’s now available to everybody, or will be... It’s destabilizing. The whole business of trust and reliability is undermined by this stuff.”

https://motherboard.vice.com/en_us/article/bjye8a/reddit-fake-porn-app-daisy-ridley
 

fonzerrillii

BGOL Elite Poster
Platinum Member
This is Hands down the Best one that I've seen and it has Sound....


Natalie Portman... :eek:





HAHAHAHA...
ANN COULTER
https://www.pornhub.com/view_video.php?viewkey=ph5a7496b08753d


Also Imgur and giphy have started deleting these Deepfake gifs....... This is going to end up being a pretty good Fair use argument.

What is the difference between a deepfake and a chick using Photoshop to put a Celeb's head on a Pornstar for a fake image?
 

fonzerrillii

BGOL Elite Poster
Platinum Member
Thank god you didn't actually post the Coulter one.... a ban would have been in order.... I think someone did this thread tho !!!


It was me..... The thread was deleted when HNIC purged the board last week. I came in to update and saw that it didn't make it.
 

fonzerrillii

BGOL Elite Poster
Platinum Member
Can anyone stop the rise of fake celebrity porn?
Jay Hathaway
Feb 2 at 5:03AM

Pornhub says it is working to take down the AI-made videos.


A new type of invasive porn is spreading on the internet. Using controversial technology, celebrities like Daisy Ridley, Natalie Dormer, and Taylor Swift have become unwitting porn stars, their faces realistically rendered into lewd positions and sex scenes, and you could be next.

Deepfakes, named after a Reddit user who started the trend, pair GIFs and videos with machine learning to convincingly paste one person’s face onto another person’s body. Anyone with a sufficiently powerful graphics processor and a few hours to kill (or a whole day, for better quality) can make a deepfake porn video. FakeApp, the program used to create them, has already passed 100,000 downloads, and there’s a growing audience for fake celebrity porn on Reddit and 4chan.

The ability to turn anyone into a porn star, as long as you have enough high-quality images of their face, raises some serious ethical and practical questions. It’s hard to say whether deepfakes are legal, considering they touch unsettled areas of intellectual property law, privacy law, and brand-new revenge statutes that vary from state to state. Regardless, who’s willing to host this stuff? And is it possible to stop it?

Reddit’s r/deepfakes community has received the most attention thus far, thanks to the initial report by Motherboard on Jan. 24 and other press coverage, but posters there are also conflicted about the NSFW nature of their work. The outside attention has been mostly critical, calling out deepfakes as a heinous privacy violation and the beginning of a slippery slope toward fake porn of non-famous people. Revenge porn is already ruining lives and tying the legal system in knots, and porn-making neural networks could severely compound the issue.

Some on r/deepfakes have proposed splitting into two or more subreddits to create a division between those who want to advance facial recognition as a consumer technology and those who just want to jerk off to fake videos of their favorite Game of Thrones actresses.

“This should be someplace that you can show your classroom, or parents, or friends, for a tangible example of what machine learning is capable of, but instead it’s just a particularly creepy kind of porn,” wrote one poster.

“And I say that as someone who has had a kink for fake celebrity porn since 2001,” he added.



These rare flashes of conscience are the reason that anonymous posters on 4chan have argued that Reddit shouldn’t be the home of deepfakes. They feel it’s too liberal—too feminist and “social justice warrior,” as the troll parlance goes—to reliably keep the porn coming.

“[T]here needs to be a proper website to host this stuff regardless, reddit is full of sjws,” wrote one anonymous user.

Some of the earliest deepfake porn posts have already been removed from Reddit, forcing frantic users to break out their backup copies. (There are always backup copies.)

Complicating matters even further, the fakes are mostly hosted outside of Reddit itself. They were initially being uploaded to popular GIF-hosting site Gfycat, but the site took swift action to remove them. “Our terms of service allow us to remove content we find objectionable,” Gfycat told the Daily Dot via email. “We find this content objectionable and are actively removing it from our platform.” Gfycat doesn’t even need to make that call, though—deepfakes violate the site’s terms of service, and they’re being taken down.

Deepfake posters then took to Pornhub, which is one of the largest streaming porn providers online and also allows community uploads. One Reddit poster, who claims to be based in Ukraine, uploaded 27 deepfake porn videos, including some that had been deleted from Reddit.

But while Pornhub says it won’t tolerate “nonconsensual” porn like these celebrity deepfakes, removal is a game of Whack-A-Mole, with new videos continuing to crop up.

“Regarding deepfakes, users have started to flag content like this and we are taking it down as soon as we encounter the flags,” a Pornhub spokesperson told the Daily Dot. “We encourage anyone who encounters this issue to visit our content removal page so they can officially make a request.”


Redditors have started to post their deepfakes on another file-sharing site, SendVid, which keeps links private unless the uploader decides to pass them around. SendVid responds to copyright takedown notices, though, so it may only be a stopgap for Reddit’s porn fiends.

That gives 4chan, where porn of every imaginable stripe has thrived for years, the advantage in the bid to become the internet’s deepfake porn clearinghouse. The site hosts its own GIFs and video files, and its famously permissive content policies mean they’re unlikely to be taken down without a legal threat. It’s not yet clear whether such threats are coming. Gfycat declined to say whether it had received any takedown notices for celebrity porn deepfakes.

The main challenge of hosting porn on 4chan is that threads there expire after a time limit. Several deepfakes threads have come and gone, but anyone looking for specific content will have to post a request and hope someone saved it. Right now, it looks like deepfakes are destined for collectors’ hard drives. From there, they might be periodically reposted to 4chan or packaged as ZIP and RAR files on sharing sites like Mega and Mediafire, or shared via BitTorrent trackers hosted outside the U.S.

When it comes to ill-gotten nude photos, though, 4chan is relatively tame compared to forums like anon-ib, the image board connected to the 2015 “Fappening” celebrity hacking incident. The site is dedicated to scoring “wins”—a.k.a. nudes—of everyone from celebrities to cam performers to cosplayers. Users can post requests for “wins” or revenge porn of any woman, and other posters often deliver. It seems like a natural home for deepfakes collectors.


Another alternative is private rooms on Discord, the Slack-like group chat app favored by gamers and various organized troll groups. Motherboard reports that some of the most disturbing deepfake experiments—starring people’s friends and classmates—are already being shared in Discord chatrooms.

Like the real celebrity nudes that leaked during the Fappening, deepfakes might end up relegated to the seediest, most pop-up-ad-ridden corners of the internet, but it’ll be extremely difficult to get rid of them altogether.

https://www.dailydot.com/unclick/fake-celebrity-porn-deepfakes/
 

playahaitian

Rising Star
Certified Pussy Poster
damn...

@fonzerrillii you realize a whole lot of #metoo claims gonna be doubted now?

It's like EVERYTHING we saw on Star Trek, Fringe and X-Files is REALLY possible now...

and if you a REAL conspiracy theorist?

with who we have in office now?

creating the term fake news?

and claiming the pussy grab tapes were faked...

I think we about to hit the big one soon
 

p5ych3

Curry Is My God
BGOL Patreon Investor
damn. some of this shit got me doing side by side facial comparisons... they are really good.
 

fonzerrillii

BGOL Elite Poster
Platinum Member
Now I want to see one with Candace Patton since those Fappening pics may never see the light of day :lol:

Man that shit is never going to drop. I honestly wish I never saw that Fucking video of her going over lines in that hotel... that way I could just say that the shit doesn’t exist. It’s the fact that I know the set exists that’s fucking killing me.


It’s killing me fam... I check every single day for that shit. Candice and Kristen Kreuk are the only reason I’ve been keeping the thread going.
 

ThaBurgerPimp

Rising Star
BGOL Patreon Investor
Man that shit is never going to drop. I honestly wish I never saw that Fucking video of her going over lines in that hotel... that way I could just say that the shit doesn’t exist. It’s the fact that I know the set exists that’s fucking killing me.


It’s killing me fam... I check every single day for that shit. Candice and Kristen Kreuk are the only reason I’ve been keeping the thread going.

Should be a bunch of deepfakes for her and Allison Mack since they part of some sex club :lol:
 