Thu, 12 December 2024


The rise of AI 'porn': 'The levels of deepfake abuse now are just enormous'

Image-based abuse (Illustration by Tracy Worrall)


‘Deepfake porn’ is on the rise. Sienna Rodgers meets the women fighting to criminalise the creation of non-consensual intimate images

“That wasn’t a photo from my Instagram. It was something I had sent directly to one person, and that’s when my heart just dropped. I just knew that it was my best friend, Alex.”

It was almost four years ago that ‘Jodie’ from Cambridge received an anonymous email directing her to Reddit. She followed the links to find a “cesspit” of users posting real images of women, then asking others to digitally remove their clothes and abuse them online.

Among them were pictures taken from Jodie’s Instagram, which was private, posted by someone requesting that others ‘deepfake’ her – i.e. edit using artificial intelligence tools – into pornographic images. A thread of deepfaked images and videos of her in “sexually explicit situations” followed.

Investigating the commissioning user’s profile in a bid to work out who they might be, she found an image that she had not posted on social media but instead sent only to her best friend. She had found the culprit. “It’s still one of the worst moments of my life,” recalls the 26-year-old, now a campaigner who uses the name Jodie when talking about her experience.

She took screenshots and went to the police. The first station she visited concluded that no crime had been committed. A second nearby police station found it was the sexually explicit captions that were criminal rather than the deepfakes themselves, Jodie says. The perpetrator had also targeted 14 other women.

“He was referring to us as ‘sluts’ and ‘whores’. It was disgusting language. But for me, that’s not what felt like the crime. It felt like the image abuse was the crime,” Jodie tells The House.

“He was someone I trusted. For me, this was just the ultimate betrayal. He knew my history of deep distrust in men for various reasons, and he used that against me to humiliate me to strangers on the internet. He posted my location, identifying information about who I was, and the police didn’t deem this to be harassment.”

“Some deepfakers have created desktop apps that make it extremely quick and easy for anyone to engage in this practice”

Jodie says she had to collect evidence herself and was not given a liaison officer. She suspects she would have received greater support if the crime was considered sexually motivated. Alex Woolf, a former BBC Young Composer of the Year, was charged under the Communications Act and ultimately given a 20-week prison sentence suspended for two years.

“When I said to the police that I wanted to read my victim impact statement, their response was, ‘His parents will be there. It will really upset them.’ The way we were treated as victims was absolutely shocking. At times, it felt like we were the perpetrators and he was the victim,” Jodie says.

“There are pornographic images of me on the internet. They will live there forever. And all I got was £100 compensation, which paid for one therapy session, and I got a restraining order against my best friend. That doesn’t feel like justice to me.”

Contacted by The House about Jodie’s case, a Metropolitan Police spokesperson said: “The Met haven’t always got it right but is committed to improving our response to tackling violence against women and girls. This is a priority and we continue to learn from our past mistakes and ensure victims are at the forefront of everything we do. 

“In respect of this case, Met officers were determined to take the offences seriously. Despite independent legal advice claiming this would be exceptionally difficult to prove, Met officers thoroughly examined the allegations and identified an offence which was successfully prosecuted with the majority of victims attending the hearing and given the opportunity to present their victim personal statements.” (Jodie notes that just four of the 14 victims read impact statements.)

The Met added: “We commend the bravery of the original victim-survivor who came forward and hope that the conviction has brought her a small measure of comfort.”

A 2023 study by Security Hero found that 98 per cent of all deepfake videos online are pornographic, and 99 per cent of them depict women. Its research also revealed that new tools mean it takes less than 25 minutes to create a 60-second deepfake porn video of anyone, using one clear image of their face and at no cost.

Sophie Compton, a 31-year-old British film director, has helped raise awareness of AI ‘porn’ with her 2023 documentary, Another Body. It follows an American university student, ‘Taylor’ (not her real name), who is targeted and discovers – in an echo of Jodie’s story – that the perpetrator was her former flatmate, ‘Mike’. She tells the police but finds that creating such deepfakes is not illegal in her US state. Mike is eventually called by the police, neither confirms nor denies that he did it, and faces no further action.

“The levels of deepfake abuse now are just enormous and there’s now an entire industry that’s been built around this practice… It should not have been allowed to get to this point,” says Compton.

“The practice began on forums like 4chan and Reddit, where people can be anonymous. Specifically 4chan. It was this community of people that were quite committed and technologically savvy, trying to advance the cause of deepfaking, and who had absolutely no moral qualms about it at all.

“Interestingly, there were all these bizarre, twisted debates about the ethics of one deepfaker copying the style of another deepfaker, with complete disregard to the identity violation of the woman that was at the core of this,” she explains.

“During the pandemic, people in the community basically realised that there were no consequences, and nobody was coming for them. It’s become an attractive commercial proposition, and a lot more people have been brought into this world. Some deepfakers have created desktop apps that make it extremely quick and easy for anyone to engage in this practice.

“Now you have apps on [social media and instant messaging service] Telegram where you can put in an image and it will strip that – if it’s a female body, it will strip it naked. These are called nudify apps. There’s been a huge explosion of them over the last year.”

Compton adds: “There are ‘custom’ deepfakes, where you specifically commission a ‘talented’ deepfaker to make a video of somebody; then there are people monetising the creation of tools that enable many people to make deepfakes.”

In Compton’s documentary, Taylor is deepfaked to conceal her identity from the audience. It is 10 minutes before the viewer is told this is the case, so we assume the video of her talking is genuine. The director says this shows the technology can have “really positive applications”; as an activist, she does not object to the AI itself.

“In journalism, for example, a source can speak out and you can express the emotion, character and essence of that person, but still protect their identity… This problem, we didn’t think, was a technological problem. It was a problem of the internet architecture and the complete lack of regulation of online spaces.”

(Illustration by Tracy Worrall)

Sharing intimate images without consent will be a ‘priority offence’ under the Online Safety Act (passed into law but not yet implemented). However, this criminalises the sharing rather than the creation of sexually explicit deepfakes.

While the last government did announce it would criminalise their creation, this fell away as the Criminal Justice Bill was dropped when the general election was called. Now, after Labour committed to “banning the creation of sexually explicit deepfakes” in its manifesto, the new government has promised to introduce legislation to this effect.

Nonetheless, Baroness (Charlotte) Owen has chosen it as the subject of her Private Members’ Bill. The 31-year-old former special adviser to former prime minister Boris Johnson has put forward the Non-Consensual Sexually Explicit Images and Videos (Offences) Bill, due to have its Second Reading in the Lords on 13 December, to introduce a comprehensive law.

Under the Online Safety Act, Owen says, “There’s this major loophole that means anyone can use a ‘nudification’ app or go on one of these huge websites which create these images for you – the largest of which has 13.4 million hits every single month – and anyone can own an intimate image or put you in a porn video without your consent under our current law.”

But if the government has pledged to act, why the need for her bill? “I will not be happy until something is written into law,” replies Owen. “I remember going to look through the King’s Speech – it wasn’t there. Victims were saying, ‘Why isn’t it in the King’s Speech? Where is it?’ Anyone who understands this issue knows it needs to have been done yesterday.”

“The way in which the online world finds different ways to abuse women is like water – it always seems to find the gaps”

It is not only speed but also careful drafting that makes her bill necessary, Owen says. Her bill would not only make it illegal to create these images but also to ask someone else to create one for you. “If you’re on a Reddit forum, you could have 10 different people from 10 different countries, all under different jurisdictions. We need to make it so that no one can ask someone else to do it, to get around the law,” she explains.

Her bill defines ‘intimate’ as being a depiction “that a reasonable person would consider to be sexual”. This seeks to bring into scope ‘semen images’, where those depicted have semen added onto a photo of them, digitally or otherwise. It also hopes to be future-proof by specifying that as well as taking or digitally creating a photo or video, it will cover “otherwise capturing” one (screenshotting, for example).

“The way in which the online world finds different ways to abuse women is like water – it always seems to find the gaps. Unless we seek in some way to put in our legislation that we’re going to try and future-proof, we’re forever going to be playing whack-a-mole.”

Durham University law professor Clare McGlynn, who has been researching online abuse including image-based abuse for the last 10 years, helped Owen with the drafting of her bill. She says a clearer and more comprehensive law will make it easier to force online platforms such as Google to take stronger action on deepfake sexual abuse.

“I don’t think that will mean specifically de-listing all of those sites,” she says of the deepfake-dedicated pornography websites, which host 90 per cent of such material. “What would be good is if the platforms took a much more proactive approach to down-ranking them. At the moment, even though they’ve made it more difficult to just search for, say, ‘deepfake porn’, the material is still very easily accessible, and they’re still hosting adverts for these nudify apps and similar.”

McGlynn adds: “A new criminal law can send that message to society that this is no longer acceptable, and that helps deter people, but hopefully also helps change attitudes and compliance.” She would also like to see changes in the civil law, making it easier for people to get material deleted.

Owen is not the only legislator looking to apply pressure in this area. Labour MP Jess Asato, who worked for children’s charity Barnardo’s before entering Parliament this year, is campaigning to ban nudifying tools and apps.

“I was very pleased when the creation and sharing of nude AI child sexual abuse material was made illegal,” she says. “The problem I realised once I met with [not-for-profit organisation] Internet Matters when I became an MP was that the apps and tools online are still freely available and legal to use. That felt, to me, to be an absurd situation.

“Many children who have been watching pornography – because that is essentially completely regularised now in society – will sometimes see these apps as something fun to do, not realising that in taking a photo of sometimes their friends they are creating illegal child sexual abuse material.”

Asato highlights that “app stores in themselves were not included in the Online Safety Act, despite a lot of campaigning for them to be so”. “There are a number of areas in which app stores need to bear responsibility for a lot of the harm that is out there. It does feel like a missed opportunity that the Online Safety Act didn’t bring them sufficiently into the scope of the new regime,” the MP says.

So, what are the next steps for the Private Members’ Bill starting in the Lords? “The government should do the right thing and back the bill. They don’t want to look like they’ve been dragged kicking and screaming into fulfilling one of their own manifesto commitments,” Owen says.

A Ministry of Justice spokesperson told The House: “Sexually explicit deepfakes are degrading, harmful, and, more often than not, misogynistic. We refuse to tolerate the violence against women and girls that stains our society, which is why we’re looking at options to ban their creation as quickly as possible.”

Owen herself received significant criticism online after her appointment to the Lords; at 30, she was the youngest person ever made a life peer, and some accused her of being undeserving. Does this form part of her interest in online abuse? And does she know whether she has personally been affected by sexually explicit deepfakes?

“I’ve never actually checked. Obviously, it’s something I was aware was a risk when I went into this,” she says, before adding: “This isn’t about me. This is about the victims of sexually explicit deepfakes. I am hugely attached to this bill, and I want to give them the path to justice that they deserve.”
