Online Safety Bill must strengthen child protection after record surge of online child sexual abuse in 2021
Discussions about the Online Safety Bill have often been dominated by debates about freedom of speech, online anonymity, or abuse of public figures. These are all important topics to be discussed ahead of regulations, but the most pressing issue which this Bill needs to tackle is online child sexual abuse.
The Internet Watch Foundation (IWF) recently reported that there were 252,000 URLs containing images or videos of children being sexually abused in 2021, compared with 153,000 in the previous year.
The IWF also report that a growing number of these images are self-generated, meaning the child posted or sent the image themselves, in many cases as a result of grooming. These figures are truly horrifying, though sadly not surprising given recent data and trends.
According to the NSPCC, offences of sexual communication with a child had been increasing in the three years prior to lockdown, but there were at least 1,220 offences in the first three months of lockdown in 2020 alone. The majority of grooming cases now begin online with likes on Instagram or Facebook, followed by direct messages on these apps or others like Snapchat.
One in seven 11–18-year-olds have been asked to send sexual messages of themselves online. The pandemic and subsequent lockdowns have made this problem worse. Children spent longer indoors, longer on their smartphones or iPads and longer on social media. Abusers have been able to exploit and target children during this time more than ever before.
A key issue at the moment is that companies are not incentivised enough to take action against these crimes. Between April 2020 and March 2021, 5,441 offences of sexual communication with a child were recorded, an increase of around 70 per cent on recorded crimes in 2017/18. Almost half of the offences used apps owned by Facebook (now Meta). However, in the last six months of 2020 Facebook removed fewer than half as much child abuse content as it had previously, due to two technology failures.
The upcoming Online Safety Bill must require platforms to proactively take action in order to meet their duty of care to users and incentivise them to bring forward innovative technology to help with this work.
The Bill needs to put in place a fully systemic approach for identifying, reporting and removing child sexual abuse material online. Social media sites in particular need to be held accountable and must prioritise the removal of first-generation child sexual abuse material, including self-generated images.
The government must also ensure the Online Safety Bill draws on organisations like the IWF, who are incredibly experienced at removing illegal and abusive content from the internet. Their skills and expertise must be utilised as part of the Bill's core mission.
It’s vital that the Bill reinforces that children posting sexually explicit or indecent material of themselves will not be criminalised. Exploitation is never a child’s fault, and methods must continue to be developed which allow victims to securely report their own images and access support if needed.
The Bill must also strengthen Ofcom’s powers to investigate services where it believes the risk factors for child sexual exploitation and abuse are high and allow Ofcom to consider a wider range of risk factors when deciding whether to take enforcement action.
The Online Safety Bill presents a unique opportunity to make huge improvements in child protection and safeguarding. I urge the government to focus its attention on drafting legal changes that can help keep all children safe; this threat will only escalate without them.
Sarah Champion is the Labour MP for Rotherham.