QotD: “Welcome to the age of ‘deepfake’ porn: Your starring role in a sex film is just a few selfies away”

Thanks to an easy-to-use face-mapping app called FakeApp, Reddit and the rest of the internet are awash with clips of practically any (usually female) celebrity you can think of, engaged in a cornucopia of curious sex acts. What’s particularly unsettling here — beyond the obvious complete lack of consent — is the popularity of female celebrities who first found fame as children. Emma Watson and Maisie Williams are “faves” on the most popular subreddits dedicated to the ‘pretend’ porn.

The celebrity angle is tailor-made to have screenshots of these videos plastered all over the websites of The Sun and MailOnline, all dressed up as public interest (aka the public is interested in masturbating over celebrities they otherwise wouldn’t see having sex). But the bigger issue will come as these technologies are picked up by the kind of people for whom revenge porn has long been an attractive weapon.

For you to be turned into a machine-learning-enabled porn performer, starring in clips posted to every porn site on the web, an ex won’t need images of you naked. Every selfie you’ve ever posted will be enough. A few hundred photos and any one of us could be convincingly cast in truly unsettling and upsetting scenes. How do you explain to your employer that you didn’t film a sex tape when there’s a clip that convincingly shows your face mapped on to the body of a porn star who bears at least a reasonable resemblance to you?

The law has only just caught up with the selfie culture and the pervasive nature of sexting, revenge porn and smartphones in every single person’s hand. Now, legislators will need to get their heads around the new implications for image rights. Your theoretical disgruntled ex will own the rights to photos of you taken by them, but you own your image rights.

The difference, though, between you and me and the celebrities who will now be spinning up their legal teams to issue takedown notices and get dirty “deepfake” videos of themselves taken down, is that we don’t have those resources. It’s extremely difficult to get videos pulled down, and they spread across sites with frightening speed.

Mic Wright, New Statesman, full article here