Social war: Will the Online Safety Bill keep us safe?

From Myspace to the Metaverse, our online space has changed dramatically over the last two decades – and now legislation is having to play catch-up. Georgina Bailey reports on whether the much-heralded Online Safety Bill can bring about the changes needed.

On Tuesday 21 November 2017, the Russell family woke to find their world had changed forever. Molly, the youngest of the family’s three daughters, had died by suicide six days before her 15th birthday. Her father, Ian, describes her as an inquisitive and caring child, bright and “intriguing” – but in the months leading up to her death, she suffered with anxiety and depression. The coroner’s inquest is still ongoing, but the Russell family believe content promoting self-harm and suicidal ideation on social media drove their daughter to her death.

“When we looked at what Molly had seen and saved and liked on her social media accounts, our reaction was just one of horror,” Ian Russell says. As well as graphic images of self-harm, the Russells discovered another vein of content that was “less obviously shocking” but whose cumulative effect was highly damaging: cartoons and graphics promoting suicidal ideation.

“We had to limit the amount of time we spent looking at that content, it was affecting us personally. If we had to do that, as adults, I don’t know how a 14-year-old could ever be expected to get through that without massively detrimental effect on their wellbeing,” Russell says.

One that sticks in Russell’s mind is a simple piece of text saying, “Who would love a suicidal girl?”. “It sums it up,” he says. “That’s nothing illegal. But it engenders feelings of hopelessness and pointlessness. And it’s that worthlessness that is one of the hardest and most dangerous emotions within a person who is struggling with a mental health [issue].”

The bill makes a fantastic first step, but I don't think it will be the answer. It's an ongoing process.

Molly’s death sparked sympathy and outrage across Westminster, amid calls for a greater understanding of the potential harms online for children and young people. It inspired what was then called the Online Harms White Paper, a joint proposal from the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, which came out in April 2019. Its core proposal was the introduction of a new statutory duty of care for internet companies, overseen by an independent regulator.

The legislation, published in draft form as the Online Safety Bill in May 2021, is meant to cover a range of harms as well as illegal content. The draft bill currently defines harmful content as that which “gives rise to a reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals”.

These harms are divided into three categories: illegal activities (such as online child sexual exploitation and abuse, terrorism, serious violence, hate crime, the sale of drugs and weapons); content that would be harmful to children but not adults (including commercial pornography); and legal content which may be harmful when accessed by adults (this might be content about eating disorders, suicide or self-harm). Disinformation, including anti-vaccination information, would also be classified as harmful.

There are concerns, however, that the bill doesn’t have enough teeth, or could push harm onto smaller platforms. Analysis by the NSPCC in September found that the bill failed in 10 out of its 27 indicators to protect children from avoidable abuse. This included the bill’s requirement that a “significant” number of children use a service before it is subject to the child safety duty. The NSPCC say this is likely to mean that “harm is displaced, rather than tackled”, with sites such as OnlyFans and Telegram exempt from requirements to “protect children from age-inappropriate and harmful content”.

Russell would like to see tougher criminal sanctions for senior managers at online firms to embed a culture of harms prevention. “They have a culture of running things their way, and prioritising profit. If you want to change the culture of a company to design in safety from the beginning of the process, from the launch of a new algorithm, the conception of a new platform, then you need to change the culture,” Russell says. “And if you want to change the culture of a company that has been steaming in one direction for all its existence, you need the biggest impetus possible to make it reconsider and renavigate. The focusing of minds that criminal liability for senior management would add… would make a big difference.”

At the moment, the bill’s provisions mean senior managers could face criminal sanctions for failing to comply with information requests from the regulator, Ofcom, in a timely manner; the new Culture Secretary, Nadine Dorries, has indicated that the implementation period for this will shorten from two years after the bill has passed to within months. Companies can also be fined up to £18m or 10 per cent of global annual turnover, whichever is higher, for non-compliance, and Ofcom can take business disruption measures such as blocking services from the UK.

Dr Joe Mulhall, head of research at HOPE Not Hate, believes criminal sanctions should carry up to seven years in prison (the current proposal is two), in line with sanctions for managers of financial institutions that break regulations.

Russell and Mulhall both highlight the testimony of Facebook whistleblower Frances Haugen as a factor in pushing for harsher sanctions. In evidence to Congress and Parliament, Haugen claimed platforms knew about personally and societally harmful content and disinformation, and even promoted it via algorithms designed to boost engagement.

Haugen told Parliament’s joint committee: “I have seen lots of research that says that kind of engagement-based ranking prioritises polarising, extreme divisive content. It does not matter if you are on the left or on the right, it pushes you to the extremes, and it fans hate. Anger and hate is the easiest way to grow on Facebook.”

Mulhall says many in the field feel “almost gaslit” by Facebook on the back of the allegations. “This has really created a sense of urgency around why legislation is so necessary, because we cannot trust these major platforms,” he says. “They have had a decade or more to try and get their houses in order. Previously many of us felt they were failing, but they were trying. It now seems clear that they were failing, and often they knew they were failing, but didn’t care.”

Representatives of Facebook and Twitter did not respond to requests for interview.

The bill also seeks to tackle online abuse, something that has come further to the forefront since the death of Sir David Amess.

In recent weeks, Labour MPs Jess Phillips and Naz Shah have been involved in separate court cases in which the defendants were found guilty of making death threats towards them.

Phillips says there are very few forums on the internet where she hasn’t experienced abuse. “I have experienced pile-ons, dreadful rape threats, death threats in online spaces – both secret Facebook groups and public Facebook groups, plus Twitter more generally. I’ve had Reddit threads with really harrowing descriptions of how people will kill me. I have experienced porn sites with images of me on, both images that are of me and my cleavage and images that aren’t of me, but purport to be.”

Both Shah and Phillips rarely engage with their social media – they only see abuse because others send it to them or because of police involvement. Their attitude towards the abuse and threats is startling; it has become normalised for them, and for many other MPs.

Shah recalls recently sitting next to Kim Leadbeater, sister of the murdered MP Jo Cox, in the Commons: “I said: ‘I’m not around next week because I’m potentially in court over death threats to me and my kids’. She looked at me and said, ‘you didn’t even flinch’. And it’s become that normal that you don’t flinch. It shouldn’t just make us flinch, it should make us sick. And the reality is that it doesn’t because it has become part and parcel of the job.”

Shah and Phillips agree with Haugen’s assessment that individual harms online cannot be separated from wider societal harms, as the bill currently does. When it comes to abuse, both believe there should be provisions for social media firms to work with police to identify anonymous accounts accused of trolling, while protecting anonymity for those who need it online. For Shah, the bill also needs to tackle harms and abuse on smaller platforms, which she describes as breeding grounds for hate.

At a recent appearance in front of the joint committee scrutinising the bill, Dorries said the concept of “societal harm” was “too complex” to put into law.

However, the department is taking on many recommendations from the Law Commission’s Communications Offences report, as well as from across Parliament. When the bill returns in the new year, it is likely to look noticeably different to its current form. Whether the changes suggested can ever successfully regulate something as continuously changing as the online space remains to be seen.
