Deepfakes and Digitally Manipulated Pornography: The Ethical Questions Raised by Better Tech
The term ‘deepfakes’ refers to porn that has been digitally manipulated, usually with machine-learning software, so that a performer appears to be a famous person. Examples online include Emma Watson and Taylor Swift, whose faces have been ‘swapped’ onto porn scenes in which they’re stripping, masturbating or having sex. How does the ability to digitally manipulate pornography and create deepfakes raise ethical questions?
You’d be hard-pressed to find someone who had a moral objection to you fantasising about a celebrity. Most people have, at one point or another, imagined themselves in a clinch with someone famous – it’s so common we likely wouldn’t even include this in our definition of ‘taboo fantasies.’
There are whole genres of erotica – called slash fiction – dedicated to writing stories about film and TV characters, or celebrities, getting it on with each other or themselves. People draw sexy pictures of celebrities too, and are not averse to photoshopping the face of someone they like onto a naked body that they’ve lifted from porn.
Are you feeling uncomfortable yet? Because somewhere along this spectrum – from internal fantasy to external realisation of it – the ethics of ‘realising’ your celebrity fantasy become a lot more complicated.
And now advances in technology are pushing us to condemn – and perhaps even regulate – the more extreme end of this spectrum.
Deepfakes and Personalised Porn
Recently, The Verge published a fascinating article on the legality of ‘deepfakes’ porn. Face-swapped porn has been around for a while, of course, and it has caused a lot of justifiable discomfort on the grounds that it’s invasive and non-consensual.
What separates more recent attempts is the ease with which they can be made, and the accuracy of the finished product.
What previously took specialist knowledge and equipment can now be done fairly simply, by almost anyone: there’s even an app that helps people create their own deepfakes. And they’re frighteningly accurate.
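Part of what made face-swapping suddenly easy is that the underlying architecture is conceptually simple: reportedly, most consumer deepfake tools train one shared encoder (which learns pose, expression and lighting common to both faces) alongside a separate decoder per identity. The sketch below is a deliberately toy illustration of that swap step only – the dimensions are made up, the weights are random rather than trained, and real systems use deep convolutional networks trained on thousands of images:

```python
import numpy as np

# Toy sketch of the shared-encoder / per-identity-decoder architecture
# reportedly used by deepfake face-swap tools. Everything here is
# illustrative: random linear maps stand in for trained neural networks.

rng = np.random.default_rng(0)

FACE_DIM = 64 * 64   # a flattened 64x64 greyscale face crop
LATENT_DIM = 128     # compressed pose/expression representation

# One encoder shared by BOTH identities: it learns features that are
# common to any face (head angle, mouth shape, lighting).
encoder = rng.normal(size=(LATENT_DIM, FACE_DIM)) * 0.01

# One decoder per identity: each learns to render that person's face.
decoder_a = rng.normal(size=(FACE_DIM, LATENT_DIM)) * 0.01
decoder_b = rng.normal(size=(FACE_DIM, LATENT_DIM)) * 0.01

def encode(face):
    return encoder @ face

def swap_a_to_b(face_of_a):
    """Encode person A's face, then decode with B's decoder.

    Because the encoder is shared, the latent code carries A's pose and
    expression; B's decoder renders them with B's appearance. This is
    the core of the swap.
    """
    return decoder_b @ encode(face_of_a)

frame = rng.normal(size=FACE_DIM)  # stand-in for one video frame's face crop
swapped = swap_a_to_b(frame)
print(swapped.shape)
```

The point of the sketch is the asymmetry: training is expensive, but once the decoders exist, swapping is a cheap per-frame operation – which is why an app can package it for anyone.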
As Megan Farokhmanesh explains, writing for The Verge:
“Although there are benign applications of this technology — it’s harmless to swap in actor Nicolas Cage for a bunch of goofy cameos — it’s a lot less cute in the hands of someone with more malicious goals, like placing unwilling participants in explicit sex videos. Photoshopped pornography is already a common harassment tool deployed against women on the internet; a video makes the violation far more active, and harder to identify as forged.”
It’s not surprising, then, that many people are frightened of this technology. In legal terms, the Verge article explains that there are some laws that could be used to prosecute those who make these videos (or at least force them to take them down) – copyright or defamation, for instance.
But taking those cases to court is tricky. Even in countries like the UK, which have revenge porn laws, the law often doesn’t cover these particular instances. The UK Revenge Porn Hotline explains:
“Unfortunately, the UK law on revenge porn does not currently include images that have been photoshopped to look intimate and/or sexual. It would however include an image that had been edited in some way if the original image was intimate/sexual in nature to begin with.
“For example, a person who has shared an intimate image of their former partner in order to cause them distress, would still be breaking the law even if they edited the image to change the victim’s body shape. But a person who uses part of an image of their ex-partner from a non-sexual image and photoshops it onto a person posing in a sexual manner, would not be committing this specific offence.”
So there is plenty to worry about. And understandably most of the discussion we’ve seen around the technology that makes deepfakes possible is about how to control the way it is used so that non-consenting individuals don’t have their likenesses co-opted for custom made porn.
But if this technology is going to become quicker and easier to use, it’s not just porn that will be affected.
How to Make Obama Say Anything
Similar technology – using AI to map facial movements – has already been demonstrated on videos of Obama speeches, with a realistic effect.
Check out this video, which uses real audio of Barack Obama to generate new footage of him speaking those words.
Effectively, the program learns how to make the mouth in the Obama video move naturally in sync with whatever audio is fed in. Michael Nunez, writing for Mashable, explains just why this is so concerning:
“Scientists — or anyone, really — can literally put words in Obama’s mouth by converting audio sounds into mouth movements, then blending the movements into old video footage. The video looks incredibly realistic, and, to an untrained eye, would appear to be real. … Soon, the same artificial intelligence system could be used to make fake videos about other celebrities or even regular people like you and me.”
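The core idea – learning a mapping from audio to mouth movement from observed footage – can be illustrated with a deliberately tiny example. The actual research reportedly used a recurrent neural network and detailed mouth keypoints; here we just fit a single linear map from an invented per-frame ‘loudness’ feature to an invented ‘mouth openness’ value, purely to show the learn-then-apply structure:

```python
import numpy as np

# Toy illustration of audio-driven lip sync: learn how mouth shape
# relates to audio from training footage, then apply the learned
# mapping to new audio. The feature names and the linear model are
# assumptions for illustration, not the published method.

rng = np.random.default_rng(1)

# Synthetic "training footage": in real work, these pairs come from
# hours of video where audio and mouth position are observed together.
loudness = rng.uniform(0.0, 1.0, size=200)            # per-frame audio feature
openness = 0.8 * loudness + rng.normal(0, 0.02, 200)  # per-frame mouth opening

# Fit openness ~ w * loudness + b by least squares.
X = np.column_stack([loudness, np.ones_like(loudness)])
(w, b), *_ = np.linalg.lstsq(X, openness, rcond=None)

def predict_mouth(new_audio_feature):
    """Predicted mouth opening for a new frame of audio."""
    return w * new_audio_feature + b

print(f"learned slope: {w:.2f} (true slope in the synthetic data was 0.8)")
```

Once the mapping is learned, any new audio clip can drive the mouth – which is exactly why footage synthesised this way is so hard to identify as forged.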
What do we do, in a world where anyone could create videos of us saying – or doing – almost anything? Banning this technology is almost impossible, and it’s also not usually the best option – after all, everything that humans have invented can be used both for good and bad purposes.
The problem is not that this technology exists: it’s that people are using it to do things for which they don’t have the subject’s consent. So let’s explore what this tech could do if people did have the consent of those involved.
Ethical Applications of Deepfake Technology
Megan Farokhmanesh talks about the funny or playful videos that people create – like inserting Nicolas Cage cameos into films. There are educational (and consensual!) applications too, such as this video by Tom Scott illustrating the possibilities of deepfake technology. But the sex side has plenty of ethical applications as well.
As with virtual reality, one of the key applications of this tech could be in consensual porn production. Adult performers who want to offer custom content to their fans could use deepfake technology to insert paying customers into films they’ve made themselves.
Imagine if you could pay your favourite porn performer to swap your face into a custom-made scene, creating a video that looks like you’re having sex with them? Or even swap your own face over theirs, so you could essentially perform as them in the scene? The key difference between this and other videos is consent: it’s possible to create this kind of porn ethically as long as all the people involved – the performers and those whose faces are swapped in – are doing it as part of an agreed, consensual interaction.
Equally, you could create videos of your own – swapping your face and your partner’s to make your very own uncanny-valley homemade scene. You could see how you would look inhabiting a different body – larger, smaller, with a different shape or genitals – and live out some of your fantasies on screen.
And let’s get even more creative and start thinking about gaming: there are plenty of online games in which you’re the main character. Imagine if it were possible to customise your character not just by picking their outfit, but by uploading your own face as well?
The sextech application of this would mean that you could participate in first-person roleplay games, and actually see yourself (or a close approximation of yourself) doing all the things your in-game character was able to do too.
These are just a few possibilities, but we suspect that deepfake technology, like any other technology, could have some helpful (and pleasurable!) ethical applications. The difficulty, of course, is ensuring that there are laws in place to prevent the unethical side, while also educating people on the importance of consent in every sexual interaction – digital as well as physical.
We opened this article by talking about the ethics of personal fantasies – from imaginary sex with your favourite celebrity, through slashfic and photoshopped porn, to the more advanced digital manipulation of video. Wherever it is possible to insert celebrities (or yourself) into sexual scenarios, there will be people who want to do that. But it’s important to draw clear lines between what is OK and what isn’t.
The question is not whether people will use deepfake technology in a sexual context, but how we can encourage them to do it right: with a focus on ethical behaviour, consent and empathy.