Crime And Policing Bill Expected To Criminalise AI Models Used To Make Child Sexual Abuse Images
AI technology is being exploited to produce child sexual abuse material (Alamy)
A new offence to prosecute tech firms which enable the creation of AI-generated child sexual abuse material is under consideration for inclusion in the government’s Crime and Policing Bill, PoliticsHome understands.
Sources close to the government told PoliticsHome that ministers are looking at using the Crime and Policing Bill to close a loophole that protects AI firms from being prosecuted over the creation of child abuse images. The bill is expected to be introduced to Parliament in the spring.
While it is already illegal in the UK to possess or distribute child sexual abuse material (CSAM), including AI-generated or computer-generated content, the AI models and tools used to create it are not themselves illegal.
The Crime and Policing Bill could change this, enabling the prosecution of companies that make the AI models and files which are then used to create deepfake CSAM.
Campaigners from organisations such as the Internet Watch Foundation (IWF) and Baroness Kidron, a crossbench peer, have been advocating for this change for some time.
Kidron tabled an amendment to the former government’s Data Protection and Digital Information Bill last year, which aimed to close this loophole.
Speaking in the House of Lords in April 2024, she said “it could hardly be more important or necessary”, and that she was “quite taken aback” that the government had not adopted her amendment at the time.
She spoke again on the topic in Parliament in December, arguing that these AI models were “specific, specialist and currently beyond the reach of the police”, and explained that they “allow paedophiles to generate bespoke CSAM scenarios of unimaginable depravity”.
“A surprising number of people think that AI abuse is a victimless crime, and I want to make it clear that it is not,” she continued.
“If your image is used in this way, you are a victim; if you are forced to watch or copy such imagery, you are a victim; and if you are a child whose real-life abuse is not caught because you are lost in a sea of AI-generated material, you are a victim. Then there is the normalisation of sexual violence against children, which poisons relationships—intimate, familial, across generations, genders and sexes.”
The issue of AI image generators being used to make CSAM is getting worse. The IWF found 20,000 AI-generated images had been posted on one forum in a single month, with more than 3,000 of these images involving criminal acts.
Derek Ray-Hill, Interim Chief Executive at the IWF, said: “The frightening speed with which AI image generators have developed has meant laws must be tightened up to respond.
“We have long campaigned for changes to be made to ensure child sexual abuse laws are updated in line with emerging harms and to prevent AI technology from being exploited to create child sexual abuse material.
“The proliferation of child sexual imagery online fuels real world violence and makes the children whose likenesses have been co-opted into this imagery victims all over again. The commodification and creation of this harmful material must stop, and we welcome anything which will help stem this abuse.”
Professor Clare McGlynn, a professor specialising in the legal regulation of pornography, sexual violence and online abuse, told PoliticsHome that it would be “very positive” to see this change being introduced, as “AI is being used to create ever more extreme child sexual abuse images”.
The Crime and Policing Bill will also criminalise the creation of sexually explicit deepfake images of adults, as the law currently only covers victims under the age of 18. It will include other changes to protect young people, including stronger measures to tackle knife sales online and introducing mandatory reporting of child sexual abuse.
A government spokesperson said: “Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims. UK law is clear – creating, possessing or distributing child sexual abuse images, including those that are AI generated, is illegal.
“We continue to invest in law enforcement agencies to support their efforts in identifying offenders and are committed to taking action against AI tools being optimised to create child sexual abuse material, including through legislation.”