Good lord... How many motherfuckers came in here to say they don't know any of these women... WHEN YOU CLEARLY KNOW WHO THESE CHICKS ARE!!
It's the same horseshit I heard for almost 5 years doing the Fappening shit.
Dudes make 10 comments in the Star Wars Force Awakens and Last Jedi threads... but then when you see Daisy Ridley's face, you're like "I don't know who this chick is."
Or you talk about Natalie Portman doing her thing in "The Professional"... yet you claim you don't know what she looks like.
OK, enough with my rant...
Anyway, more news....
This Deepfake shit really has Hollywood scared. Now Variety is talking about it.
‘Deepfakes’ Will Create Hollywood’s Next Sex Tape Scare
By Janko Roettgers
@jank0
Natalie Portman. Emma Watson. Taylor Swift. Sexually explicit videos featuring these and a number of other female celebrities have surfaced online in recent days, foreshadowing what could be Hollywood’s next big sex-tape nightmare.
Only this time, the videos in question haven't been stolen by hackers, or commissioned by porn studios, starring barely-lookalikes. Instead, these new clips have been made with the help of artificial intelligence (AI) technology capable of swapping the faces of porn stars with those of famous actresses and other celebrities.
These so-called Deepfakes — a combination of "fake" and "deep learning" — first started popping up online in December, when a Reddit user began to post explicit videos seemingly featuring celebrities. The user in question told Motherboard at the time that he was using images found via Google image search, as well as stock photos and YouTube videos, to "train" AI algorithms — essentially giving them an idea of what a celebrity's face would look like at any given moment.
Then, he applied that knowledge to the task of swapping out the face of a porn star in an explicit video with that of the celebrity in question. The results are clips that are more often than not convincing enough to look like hardcore porn featuring Hollywood’s biggest stars.
In January, the phenomenon took an unexpected turn when another Reddit user published an app that lets anyone without much technical knowledge produce their own Deepfake videos. The app has been downloaded more than 100,000 times since its release, according to its creator — and new clips are being uploaded by the dozens.
In fact, Deepfakes have gotten so much attention that the GIF hosting platform Gfycat began to remove the clips this week, deeming them objectionable. Most of the clip producers simply switched to other video hosting platforms. Reddit, which is being used to exchange links to these clips, has yet to comment on the phenomenon.
Meanwhile, even some of the users seeking out these clips on Reddit are starting to have second thoughts about the technology. Some hold out hope that there may be a silver lining for Hollywood stars. "This is a blessing in disguise for all the celebs" who've seen their private pictures and videos leaked, argued one user who calls himself AnyNamesLeftAnymore. "Now if you have (an explicit) video with your boyfriend that gets leaked? You have infinite plausible deniability."
But others fear that the repercussions of this technology could go far beyond sex and celebrities. "Imagine not being able to hold politicians accountable with video anymore because it could be fake," said Reddit user ThatPickelGuy.
"This is turning into an episode of 'Black Mirror'," wrote a user who goes by the name CapntainIceberg. Reddit user hammerthefish agreed: "Nothing is real anymore."
http://variety.com/2018/digital/news/hollywood-sex-tapes-deepfakes-ai-1202685655/
For those that want to work on doing a couple with some Black Celebs but don't understand the Tech...
Here is the user friendly App..