Tech Companies Are Allowing Harmful Misinformation – And Government Seems Powerless To Stop Them
Misinformation spread across social media helped to fuel much of the violent disorder across the UK in the past week (Alamy)
The violent disorder across many UK towns and cities in the past few weeks shocked the nation – but the surge of online misinformation which helped to fuel it should come as no surprise.
For years, MPs from all parties have called on the Government to do more to tackle misinformation and disinformation. The Online Safety Bill itself took years to pass into law in 2023, having first been proposed four years earlier.
There were many warnings of the possible harmful impact of online misinformation ahead of the 2024 UK general election and many other elections happening around the world this year. British charity Full Fact even developed a tool to allow people to test whether politicians were making false claims online.
But now, months after the Online Safety Act passed into law, these fears have not abated. The tide of far-right, racist violence that swept over the UK this month has deep roots in online disinformation, particularly the sharing of a false name connected with the suspect in the Southport stabbings of young girls. According to Bloomberg, UK authorities suspect foreign state actors used bots and fake accounts to amplify these misleading posts.
For some politicians, including Mayor of London Sadiq Khan, these incidents have shown that the Online Safety Act must be revisited “very, very quickly”. Chris Webb, the new MP for Blackpool South – which has been particularly affected by racist violence – told The Rundown podcast that new laws tackling online incitement should be brought in sooner.
There are indications that the UK is looking at steps to tighten regulation, as Prime Minister Keir Starmer told broadcasters on Friday that the Government is "going to have to look more broadly at social media after this disorder". He added that social media was "not a law-free zone", with one man sentenced on Friday for stirring up racial hatred online.
Bloomberg reported that officials are considering reviving the “legal but harmful” provision in the Online Safety Act. This clause, which was removed when the bill was going through Parliament, would give the regulator Ofcom more power to force social media companies to take down harmful content.
But further regulation could take years to enact, and the act as it stands is already likely to take several more years to come into full effect.
With the issue ongoing, there are also signs that some of the top technology firms are currently refusing to cooperate with the Government. On Monday, Technology Secretary Peter Kyle met with executives from major firms including X, Meta and TikTok – but the Financial Times reported that X resisted calls to take down posts that the Government’s National Security Online Information Team (NSOIT) had deemed a threat to national security.
This is not a new problem. A former tech adviser from the last government told PoliticsHome that X have been “very challenging” to work with for some time, particularly since Musk took ownership of the platform in 2022.
Last year, then-technology secretary Michelle Donelan met with X and other social media firms to address misinformation being spread about the Israel/Gaza conflict – the last “high profile” meeting between a secretary of state and the top firms before Kyle’s meeting this week.
“All of them were receptive to it, except for X, who were very challenging to work with,” the former adviser said.
“Their person in the UK is so low down in the food chain, they don’t have any influence… There is no-one there to talk to.”
They added that the Government would ask X for information and to send on their latest policies relating to online misinformation, but then would not hear anything back. TikTok and Meta, by contrast, were more “helpful”.
The former adviser added that the Government also tried to include smaller companies such as Telegram and Signal in their talks, but found they were “very hostile” to the idea of working with the Government.
So if the Government has known for so long that X and other platforms are so resistant to countering harmful misinformation, why has nothing been done about it already?
NSOIT – formerly the Counter Disinformation Unit – is the government body responsible for tackling harmful mis- and disinformation. But according to many of those who have worked closely with the unit, it has very limited powers and is "not fit for purpose".
The former tech adviser told PoliticsHome that they would be “staggered” if foreign states were not involved in the most recent bouts of disinformation, but explained that NSOIT only had the power to flag to social media companies where there were the hallmarks of foreign interference. It was then up to social media companies to decide what to do with that information: “The Government has no jurisdiction over that.”
The Government and Ofcom also face different challenges with different platforms. Encrypted apps such as WhatsApp, Telegram and Signal make it particularly difficult to identify harmful information. Full Fact’s Head of Artificial Intelligence Andrew Dudfield told PoliticsHome that the fact that people use a wide range of social media platforms makes it a particularly difficult space to regulate.
“Some of the content that people are consuming is distributed via closed messaging platforms or relatively open messaging platforms, and sometimes it's on very large social media online platforms or through online forums, and that environment is changing,” he said.
“And it is increasingly hard to see how we have a single place where everybody is consuming the same information, because that fragmentation will continue to happen.”
Jonathan Brash, the new Labour MP for Hartlepool, said that it was evident it was “not just one social media platform” that had contributed to stirring up the riots in his town and elsewhere.
“Part of the problem here is that different social media platforms target different groups, which means the groups who are not targeted by that really have no idea that it's happening," he said.
"There's a lot of this stuff that I, you and other people simply won't see, but it's clearly there.”
He added that while much of the focus was on X, Facebook also hosted many local groups which he had seen to be sharing misinformation and provoking criminal behaviour.
Telegram is a smaller platform which has faced criticism for hosting far-right groups that have plotted attacks on immigration centres and shared information on making weapons. The former tech adviser said these smaller platforms were much more difficult to regulate: “Telegram and Signal are almost designed to facilitate that kind of thing.”
However, banning them, in their view, would be unlikely to solve the problem, as it would create a “whack-a-mole” situation where people would just move to another platform. They explained that the Government therefore considered having a smaller number of powerful firms to be an "advantage” in the tech sector, as it is more in these larger firms’ interests to uphold their reputation and work with the Government.
This makes the hold of these firms – and the individuals who run them – over the Government incredibly powerful.
Tech Secretary Kyle told The Times this week that he considered the Government’s relationship with tech billionaires such as X owner Elon Musk as “much more akin to the negotiations with fellow secretaries of state in other countries, simply because of the scale and scope that they have”.
Musk, meanwhile, has been comparing the UK to the Soviet Union, spreading the conspiracy theory that Starmer is overseeing “two-tier policing” across different communities, and sharing a falsified news article about the UK deporting rioters (which he has since deleted). PoliticsHome has contacted The Department for Science, Innovation and Technology (DSIT) throughout the week to ask what the Government will do to specifically hold Musk and his platform to account, but no details have yet been provided.
Andy McDonald, Labour MP for Middlesbrough and Thornaby East – one of the areas worst affected by the rioting – said that he had been “incredibly frustrated at the weakness of the system in responding to things where people are going way beyond what is reasonable comment or observation”, and echoed calls to look at the legislation again.
“We need to have the power to enforce removal of harmful content and ensure financial penalties that are appropriate to the harm done are levied against these organisations, but it is difficult to do because of the international nature of these platforms,” he said.
“It's a huge undertaking where everybody's got to play their part. We don't want to be in a situation where, like in the People’s Republic of China, the Government has the ability to close things down, shut that communication off. We do want to preserve and treasure our freedom of expression, but when the organisations themselves will not exercise control, it only leaves governments to pursue more and more draconian measures.
“They [the social media companies] have surely got to realise that no government is going to sit back and tolerate this any longer. So work with us, or you leave no alternative than to see alternative methods of constraining you, constraining the content online. That's not where we need to be.”
Hartlepool MP Brash agreed that it was "clearly" the case that the regulations needed to be tightened.
“If people stood up in a town square with a megaphone and said some of the things that were going on social media, they'd be arrested for inciting violence, they'd be arrested for inciting racial hatred," he said.
"There's no question about that, so the rules that apply everywhere else have got to apply on social media platforms as well.”
Both Brash and McDonald said that the number of anonymous accounts allowed on social media platforms was part of the problem.
“I think this is so cowardly and insidious that people are not prepared to say who they are, but they're prepared to say the most dreadful thing and encourage others to do the same,” McDonald said.
“I do think it robs us of the opportunity of holding people to account.”
Full Fact’s Dudfield added that revisiting the legislation could include a framework for identifying information gaps and working with the companies to pre-emptively assess where misinformation might take hold, and to prevent it.
“In thinking about how do we make sure that we're introducing reliable information from authoritative sources, where it is most needed?” he asked.
“How do you prioritise the promotion of good information, rather than necessarily just restricting the bad information?”
A Government spokesperson said: “Our immediate focus is working with the social media companies to tackle content that has contributed to the disorder of the past week.
“This includes proactively referring content for platforms to assess and take action on and to ensure they were actively engaging with law enforcement on criminal content. People are already being charged for offences committed online.
“In a sector where innovation and trends develop so rapidly, the Government continually assesses the law’s ability to keep up. The Online Safety Act was designed to tackle illegal content and protect children. As Ofcom has set out, platforms should take action now to tackle the harmful content on their sites.”
The prime minister has said the police are to remain on “high alert” throughout the weekend, but most are hoping that the worst of the street violence is over. However, the question of how Starmer and his ministers intend to reckon with some of the world's most powerful tech giants remains open.
X, Facebook, Telegram and Signal have been contacted for comment.