Illustration Marianne Wilson

Will the Online Safety Bill really stop deepfake porn?

The proposed legislation seeks to criminalise sharing non-consensual deepfake pornography – but campaigners aren’t celebrating just yet

When Kate Isaacs saw an explicit video of herself pop up on Twitter, her immediate reaction was fear. She had no recollection of the encounter. “Seeing yourself having sex in an online space is scary enough,” she recalls, “but seeing yourself having sex with someone you don’t remember, in a place you don’t remember, is so traumatising.”

As it turned out, the video wasn’t actually of Isaacs – someone had edited her face into a porn scene. She’d been deepfaked. “When I realised it was me but it wasn’t my body, it was a different kind of panic,” she continues. “That feeling of not having control or consent over my own image – my own body – filled me with dread. It felt like a form of sexual violence.”

Isaacs believes she was targeted because of her activism, specifically her work with #NotYourPorn, a campaign she co-founded to fight the non-consensual sharing of intimate images. Increasingly, though, it’s not just people in the public eye falling victim to deepfakes. In the five years since the software took off online, more and more sites have sprung up that make creating non-consensual pornography of anyone – a celebrity, someone you know, or a stranger you found online – easier and more accessible. According to government statistics, one such website received 38 million hits in the first eight months of 2021 alone.

Unsurprisingly, the effect on victims – who, according to 2019 statistics, are exclusively women – is monumental. “It can be very isolating,” says Clare McGlynn, a law professor at Durham University and expert in image-based sexual abuse. “The harms are constant, as the material is forever out there on the internet, with each new viewing or distribution being experienced as abuse.”

In light of this disturbing and relentless rise – and following the tireless work of activists like Isaacs and recommendations by the Law Commission – the UK government announced two weeks ago that it would criminalise non-consensual deepfake pornography. The legislation is set to form part of the controversial Online Safety Bill, which returned to Parliament this month after years of delays.

As well as cracking down on deepfakes, the newly proposed measures will tackle intimate image abuse more broadly. If passed, the bill will outlaw the use of hidden cameras or covert photography to take explicit images without someone’s consent, as well as ‘downblousing’. The UK has already outlawed revenge porn and upskirting, and has previously announced that cyberflashing would be criminalised under the Online Safety Bill.

Right now, there are very few details about what the new legislation will actually look like. All we know is that the sharing or sending of non-consensual deepfake porn is set to be criminalised; there’s no mention of its creation or possession. The law is also unlikely to apply to retrospective cases.

As for what could be included, Chris Evans, a barrister at Richard Nelson LLP, says the legislation will need a clear definition of what constitutes deepfake pornography. And, to meaningfully reduce or eradicate non-consensual deepfakes, convictions will likely need to carry significant punishments. However, as Evans adds, “whether significant punishments prevent offending is an entirely different consideration”. He notes that the most important details will lie in how the legislation proposes to detect material – and, in turn, the likelihood of perpetrators being apprehended.

For McGlynn, one essential element for inclusion is granting automatic anonymity to those who report deepfake porn to the police, “as is the case for all sexual offence complaints”. She points out that when victims are not guaranteed anonymity, they’re reluctant to report the crime in the first place, and are then more likely to withdraw from prosecutions.

Although it’s light on details, the announcement has largely been welcomed by activists and victims. But whether there’s major cause for celebration remains to be seen. The Online Safety Bill still has a number of hurdles to clear, and if it’s not passed by April, it’ll be scrapped completely. What’s more, previous delays could mean the laws won’t even be fit for purpose if and when they’re eventually enacted. And if outdated proposals are rushed through, there’s a higher chance their terms will be drawn too narrowly. (Intimate-image abuse legislation has previously fallen victim to loopholes, including a focus on “intent to cause distress or humiliation” rather than consent, which has led to a lack of prosecutions.)

Then there’s the question of how effectively this legislation can be policed. As legal expert Alan Collins observed in Cosmopolitan, there’ll likely be arguments about what’s consensual and what’s not – for example, what happens if once-consensual deepfakes are shared with a third party? What’s more, victim-blaming narratives could wrongly claim that if someone has shared an intimate image online already, they’ve willingly consented to it being deepfaked.

It’s also simply a question of resources – particularly when combined with the difficulty of piercing online anonymity. “It’s unlikely that an under-resourced police force will have the ability to commit significant resources to investigate offences that are committed behind the screen of anonymity on the internet,” says Evans. Although he suggests that a “close working relationship with internet service providers and social media giants” might help in identifying deepfakes, he believes it’s more likely that prosecutions will happen at a “local level, where individuals make complaints against known perpetrators and evidence can be relatively easily obtained by seizure of devices”.

Besides, it can feel hopeless putting faith in a broken criminal justice system that not only fails survivors of sexual violence time and time again but – as we’ve seen with the Metropolitan Police – seems to have a habit of protecting the perpetrators in its own ranks. Criminalising an issue also doesn’t address its root cause. As Andrea Simon, director of the End Violence Against Women Coalition, said in a statement, what’s actually needed is investment in methods, like education and public campaigns, that can prevent this kind of violence in the first place.

Another reason to remain sceptical of the legislation is its inclusion in the Online Safety Bill in the first place, which – like the US’s FOSTA-SESTA legislation before it – will make sex workers’ lives much harder. Under the bill, tech firms must clamp down on content “inciting or controlling prostitution for gain”, which will likely lead to the removal of sex work ads – harming sex workers, who’ll be forced offline and into more dangerous ways of working.

“It seems like they’re folding the deepfake criminalisation into the bill to bring people on side, in order to quietly push through laws that further oppress and endanger sex workers,” says Countess Diamond, a dominatrix and representative for United Sex Workers. “We all want to protect children’s wellbeing online, but broad sweeping legislation which harms sex workers – an already marginalised group – is not the way to do it.”

The legislation’s inclusion in a bill so damaging to sex workers might also dissuade adult creators who’ve had their content stolen for deepfakes from reporting it. “It’s not just [victims who are edited into videos] who are taken advantage of, but sex workers as well,” says adult creator Tanya Tate. “It’s literally our bodies at work. Deepfakes dehumanise the sex worker – they trivialise us to nothing more than a puppet.” Despite this, adds Tate, “the problems of sex workers are not a priority regarding legislation”.

Nonetheless, the proposals to outlaw non-consensual deepfake pornography are a step in the right direction. And, as Isaacs says, the significance of the government acknowledging the lasting distress deepfakes can cause shouldn’t be downplayed. But the headline-grabbing announcement needs more scrutiny. Given the valid criticisms of the Online Safety Bill – as well as its multiple setbacks and insecure future – and the urgency and complexity of intimate-image abuse, it’s worth asking whether this legislation belongs in the bill at all. Is the government cramming it into something that’s destined to die? Is this a way of pacifying activists and victims without committing to any meaningful change?

“It always feels like violence against women and girls seems to get shelved in instances like this,” Isaacs tells Dazed. “We need separate laws. Aside from the fact that we need to review online harms as a whole, image-based sexual abuse is a whole other part that deserves the same amount of attention as those laws that are committed against people in the physical world. It just doesn’t have the breadth or depth to make real change in the next two years – it’s something that needs to be reviewed and updated almost yearly, and it needs a dedicated team to do that.”

“It’s difficult to celebrate these huge wins – and they are huge wins – because you realise how much further you have to go,” she concludes. “I remain optimistic, but not naïve.”
