It is not technically illegal to take a woman’s face, paste it onto someone else’s body, and pass the result off as real. The fact that AI technology can create convincing yet false footage of people participating in events they were never part of should be disturbing to anyone with a pulse.
It is infinitely more disturbing that people are spending their free time and perfectly usable brainpower tinkering with this technology, which is anything but beneficial or humanity-enhancing, and which comes with easy access to images of every person under the sun. And, to be clear, anyone who uses these tools should be regarded with suspicion.
The app in question is an easy-to-use application that lets anyone recreate these face-swapped videos using their own datasets. It’s bad enough that the availability of this app, its very existence, violates the principles of consent that every human being should hold dear, but what does it mean for detecting fake news? What about video evidence in a murder trial? Is blackmail now obsolete, or is it easier than ever? Are rules and reality simply less important than they used to be?

The fake porn that surfaced in December was the work of a single anonymous guy playing a celebrity face-swap game. He produced fake porn videos featuring Taylor Swift, Scarlett Johansson, and Daisy Ridley. They say that if you look closely, you won’t be fooled, but who is going to check every video for signs of fabrication? Plus, the technology will only get more advanced; we can be sure of that.

Until now, it was difficult for anyone without a serious background in computer science (or, I suppose, without particularly good skills or charisma) to create AI-powered fake porn from the comfort of their own home. This app changes that. All the tools needed to create these videos are readily available for free and come with instructions that would walk even a child through the steps.
Ultimately, its makers hope to refine it to the point where users can simply select a video on their computer, download the neural network associated with a particular face from a public library, and swap that face into the video with the press of a button.
I don’t know about you, but I sleep well at night knowing that these developers are working tirelessly to improve my ability to create and distribute fake sex videos of people.
While celebrity women may be the primary targets at the moment, anyone with a bone to pick, a few screws loose, a grudge against an ex-lover, or a rival applying for the same job can now act on it. I thought we were already in an era of zero class, but this settles it. The app may be legal, but I can’t imagine any ethical defense of it. Every time such a video is made, the consent of the person whose face is being swapped in is completely ignored, and they are exposed to humiliation, abusive comments, and real potential harm. These fake videos are not cool. They’re scary.