A new documentary directed by Sophie Compton and Reuben Hamlyn follows a graduate student as she attempts to uncover who created deepfake porn videos of her – and why
Having intimate pictures or videos of yourself shared non-consensually is nothing short of a nightmare scenario for most women – but imagine being sent an explicit video of yourself that you’d never even filmed. A video in which your face has been transposed onto someone else’s body.
This is what happened to 22-year-old Taylor Klein, the central figure of the new documentary Another Body. Directed by filmmakers Sophie Compton and Reuben Hamlyn, Another Body begins with an acquaintance messaging Taylor, a graduate student, on Facebook with a link to a pornographic video of her he’d discovered online. Taylor despairs as she realises she has been the victim of non-consensual deepfake porn – and that there’s basically nothing the police can do. After a bit of digging, Taylor discovers that deepfake porn videos have also been made of one of her coursemates, Julia, and the two team up to uncover who did this to them and why. They come to the conclusion that their coursemate and former friend Mike likely created the videos as an act of ‘revenge’ after both Taylor and Julia refused to continue acting as stand-in therapists for him.
There are laws in place to protect victims of revenge porn: in the UK, sharing intimate content of someone without their consent has been illegal since 2015, while in the US, 48 out of 50 states have anti-revenge porn laws. But, as Another Body explores, there are comparatively few laws protecting people who have deepfake porn videos made of them without their consent. In Connecticut, where Taylor is based, creating deepfake pornography of someone without their consent is not illegal. At the federal level in the US, Section 230 of the Communications Decency Act shields companies from liability for user-posted content – including the deepfake porn videos they host. Things are slightly better in the UK – sharing non-consensual deepfake porn was made illegal this week as the Online Safety Act was passed into law – but it’s still unclear how this law will actually be enforced.
Another Body shines a light on the urgent need to tackle the creation and distribution of non-consensual deepfake porn, and is part of the #MyImageMyChoice campaign spearheaded by Compton and Hamlyn which seeks to amplify the voices of intimate image abuse survivors. Below, we speak to the pair about how they discovered Taylor’s story, why tech companies won’t moderate deepfake porn properly, and how we can tackle the root cause of this kind of abuse: misogyny.
How did you discover Taylor’s story?
Sophie Compton: We spent about three months on 4chan, trying to track what was happening in these communities. We always knew that we wanted to tell the story from a survivor’s perspective, so it felt really important to find the right person. We actually did a reverse image search of one of the deepfakes we found on 4chan, and it turned out to be one of Taylor – the Pornhub profile came up too. We could see that these videos were deepfakes, and assumed that it was probably a case of impersonation as well.
We spent quite a lot of time thinking through whether we should approach her. What if we were the first people to tell her about this situation? We felt like we should ultimately let her know, but also wanted to do that with resources and care – and transparency about the fact that we were making a film. But I think prioritising what she was going through above the needs of the film was really important to set the dynamic that enabled us to make the film with her.
I thought it was really clever that you used deepfake technology to grant Taylor and Julia anonymity. I find discussions about new technologies such as AI or deepfakes often frame the technology itself as inherently bad, when in reality, the issue usually lies with individuals or companies who use these technologies to exploit people. Would you agree with that?
Reuben Hamlyn: Yeah, I think our position is that the technology itself isn’t intrinsically evil. There are positive uses of it and there are negative uses of it. The problem is that in the current landscape, in the vast majority of cases, it’s used to cause harm: 96 per cent of deepfakes online are non-consensual pornography.
But the flip side is that this technology could be revolutionary, if properly harnessed, in documentary and video journalism – enabling sources to tell their stories in a way that couldn’t be done before. Deepfake technology translates the real emotional expression of the subject onto the face of another. So when you see Taylor smile in the film, that’s the original Taylor smiling – it’s just translated onto another’s face. There’s an emotional truth to what you’re watching.
But I think it has to be done in a way that’s very conscientious and sensitive, and most importantly, involves the consent of all parties – both the original subject and the ‘face veil’, the person whose face you see. It’s also very important to disclose that to the audience.
It was cheering to see Taylor and Julia team up to uncover who had done this to them, but at the same time, it was depressing to think that they shouldered the burden of essentially solving the crime when they were the victims. Is it the case that the police are often unable to help victims of non-consensual deepfake porn?
Sophie Compton: The police, historically, have more often than not been victim-blaming, minimising and completely ill-equipped to deal with online crime. I don’t think we’ve come across a single victim-survivor who’s had a good experience with the police. There’s often a victim-blaming framing to the questioning: ‘what did you do to cause someone to do this?’, as Taylor heard, or, ‘do you act in porn?’ – not that that’s relevant.
There are also no federal laws [about non-consensual deepfake porn] in the US, and the fact that there hasn’t been this legislative framework means that the police haven’t been trained. [We’ve now got] a law in the UK which will hopefully set a new cultural standard, but the reality of police being able to do the open source investigation, or actually pursue people online who are using VPNs, seems unbelievably unlikely. A real sea change around the education of the police feels really important.
The UK law – is that the Online Safety Act?
Sophie Compton: Yeah. So, [sharing] deepfakes is now criminalised. We’re really curious to see what happens next. How is this actually going to affect these sites which host deepfake porn? Are they still going to be accessible in the UK? How will Google respond, given that they now have a duty to try and address some of this illegal content? What will the regulation be like? It’s gonna be really interesting.
I often get the sense that ‘digital’ crimes generally aren’t taken seriously – there’s this misconception that victims can just log off or block the perpetrator. And on the other side of the coin, perhaps perpetrators don’t fully grasp just how damaging their actions are when they’re hiding behind a screen. Was that something that you were trying to convey in the documentary – the idea that things that happen online have very real consequences?
Reuben Hamlyn: That was one of our guiding lights when we were making this – to show how a crime like this, even if it exists in ‘the digital world’, really reverberates through life and has material consequences. I think we learnt from our research period on 4chan, where we were looking at the way this is discussed by the perpetrators, that there’s a real lack of recognition of the harm it can cause. At the same time, someone will discuss a video with no recognition of the potential harm to the woman, yet it’s also used as a form of ‘revenge porn’, to humiliate someone – so there’s a bit of cognitive dissonance there.
“The people who go on 4chan with the best of intentions can end up pushed towards political extremes in a way that is quite dangerous” – Reuben Hamlyn
Where do tech companies fit into this? What role do they have to play in stamping out non-consensual deepfake porn?
Reuben Hamlyn: Tech companies are enabling the creation, dissemination and distribution of deepfake pornography, and this happens in many different ways.
For example, you have companies like Google driving people to websites that host the deepfake pornography, or to tutorials on YouTube for how to make a deepfake. Then you have internet service providers, which have the capacity to block access to these websites. Verizon in the United States did block Mr Deepfakes [one of the biggest deepfake porn websites] for a while, but it has since unblocked it – all internet service providers could block access to it, and they choose not to. There are all sorts of different tech companies extracting value from the industry of non-consensual deepfake pornography.
Sophie Compton: I think the tech companies have been really trying to hold the line that, for the internet to exist, it needs to be a free-market, complete free-for-all space – and that’s how it’s functioning. Google and Facebook and other companies have said ‘we don’t have an editorial function, we’re just the platform, we don’t actually make decisions about what is prioritised’. I think that’s just completely untrue – they make these decisions all the time.
I wanted to talk a bit about misogyny as well. Mike [the alleged creator of the deepfakes of Taylor and Julia] clearly had no empathy and no respect for Taylor and Julia, and worryingly, he wasn’t an anomaly at all – as evidenced by the 4chan forum, where swathes of men request that deepfakes be made of innocent women. It seems to me that the root cause of the issue is misogyny – would you agree with that?
Sophie Compton: Absolutely, 100 per cent. I think what you see on these forums is, in a way, what’s happening inside the locker room. It’s all of the conversations that you sense are happening, but you don’t often see. We’re seeing the objectification and the dehumanisation and the complete disregard for female lives and female agency.
The other thing that feels quite chilling about them, and I think it’s because they’re online spaces, is the humour. It’s all tied up with humour and bravado and bonding. We were so shocked to see how much male social bonding was happening in those spaces – it’s an inverse of the kind of community that we all need, built on the back of this misogyny.
Reuben Hamlyn: I think the people that are posting on 4chan are posting under the assumption that there will not be any consequences. They’re posting anonymously and most of them will be using a VPN if they’re doing anything which is criminal. So the mask really comes off, and people’s worst desires and beliefs can just spew out onto these forums. These people aren’t saying these things when they’re out in the world or in their workplace because there’s a risk of being fired or ostracised, but those beliefs are still held and being activated and reinforced by participation in these forums.
There are some threads on 4chan which are very innocuous and quite sweet, actually – threads dedicated to the appreciation of manga or anime. You have these kids who go on these threads, and then someone refers them to a post on the ‘adult requests’ forum or the extreme right-wing political board. Then they start to entrench themselves in these communities, adopt these beliefs and mimic what they’re seeing there. The people who go on 4chan with the best of intentions can end up pushed towards political extremes in a way that is quite dangerous.
Sophie Compton: We do get into quite nuanced conversations around how you address some of this. We do support introducing criminal law that says that deepfakes are illegal because we do think that this practice is so abusive, but we really have a lot of reservations around actually applying that criminal law to individual people. If you try and imprison the people that are making these videos, like Mike, will that really solve the problem?
These sites that are dedicated to deepfake porn should absolutely not be allowed to exist, let alone develop into thriving businesses where people are profiting off of this abuse. But if you think about the problems with the forums more broadly, I would have major reservations around completely shutting them all down because I don’t know if that would really achieve the desired effect. There will always be forums that are horrible, let’s be real. But Google shouldn’t be promoting those sites. Or maybe 4chan needs to take a more active role around what’s happening within its communities. We need to get to a place where we’re not saying either ‘shut the whole thing down’ or ‘it needs to be a free-for-all’ – we need to get into the grey areas.
Another Body is out now in the US and on November 24 in the UK.