2024 Could Be "Acid Test" For Democracies Against The Threat Of Misinformation
Elections are taking place in some of the most populous countries in the world in 2024 (Alamy)
More people across the world will vote in elections in 2024 than in any other year in history. But rather than signalling the flourishing of global democracy, politicians and experts have warned that it could mean the world is more under threat from misinformation and disinformation than ever before.
The UK government must call a general election by the end of 2024, and will hold local elections and regional mayoral elections in spring.
The polling of the British public is likely to coincide with elections around the world: eight of the ten most populous countries are holding votes next year, including the US, Russia, Mexico, Bangladesh, Indonesia, and Pakistan. There will also be a European Parliament election, and Brazil and Turkey are holding local and municipal elections in which the whole country can participate.
John Penrose, Conservative MP for Weston-super-Mare and the UK’s former Anti-Corruption Champion, told PoliticsHome he believed that so many elections taking place in the same year would present a huge challenge.
He said that while it was important to prevent foreign interference in domestic elections, a number of other groups could also pose a threat via misinformation.
“It might be religious groups, commercial or organised criminal groups… all sorts of people have an interest in trying to sway public debates in a particular direction particularly at the time of an election,” he said.
“The stakes are about as high as they can possibly get and the next 12 to 15 months will be a real acid test about whether or not the democracies can protect the quality of their political debate and their free speech. And their adherence to truth [will be] evidence of how serious we are about that.”
Penrose said that in order to protect democracy, the UK and other countries would have to ensure people have access to “informed free speech” and warned that better systems were needed in order to prevent “systematic, organised lying or bias”.
“If the price of freedom is eternal vigilance, then this is an arms race,” he said.
“We've upgraded our laws, we now need to implement those laws really, really carefully. But we aren't going to be able to assume that this is job done and we can relax for the next 20 years. This is going to be a continuous thing.”
Why do multiple elections at the same time present such a risk? Glen Tarman, Head of Policy and Advocacy at fact-checking organisation Full Fact, said that while every country has its own “specific context” for misinformation to thrive, misinformation “flows over borders”.
“It's highly likely that there will be some narratives and memes that get repurposed and reworked in different territories, in different election settings,” he told PoliticsHome.
“It could be that there are spillover effects and those who want to cause mischief could learn how to do so in one setting and apply it elsewhere.
“We know that the cost of coming into the market of misinformation and disinformation is reduced massively because the tools are now cheaper, easier to use, and they need [fewer] resources to be involved with. So we could find that the fact that there's so many elections taking place over a short period of time does heighten the risk in different territories.”
The elections are also likely to come at a time of continuing global conflicts, including between Israel and Gaza and between Russia and Ukraine. As online disinformation surrounding both wars continues to spread far and wide across the internet, serious questions have been raised over whether the UK government’s Counter Disinformation Unit is “fit for purpose”.
“Currently...third sector organisations like Full Fact, or private entities like Logically AI... are having to plug the gap for the government here in calling out this disinformation,” shadow minister Alex Davies-Jones previously told PoliticsHome.
With a combination of elections and wars presenting ample motivation for bad actors to spread inaccurate information in their interests, the concern is that rapidly developing technologies such as generative AI could provide the tools to make this easier than ever.
In November, the UK hosted the first global AI Safety Summit at Bletchley Park, inviting world leaders to discuss and understand the threats posed by AI to future societies and economies. However, some experts criticised the summit for not explicitly setting out protections against misinformation in elections, especially with billions of people heading to the polls in 2024.
Professor Jack Stilgoe, a lecturer in Science and Technology Policy at University College London, said it should be up to governments to come together to research such risks and propose solutions.
“If you’re having a political summit about AI, one of the key things that you should be worried about is the fact that next year maybe about half of the world's population is going to vote in elections, huge amounts are at stake there," Stilgoe said.
"AI-generated or AI-spread misinformation could be a gigantic risk to those political systems, which is something that we can easily anticipate and it's something that needs real political attention.
“The opportunity is to bring together governments to agree rules and norms, to understand and research these things… Rather than just hoping that the tech industry will be able to mark its own homework, we need to have other people able to come in and do technology assessments, to actually work out what the scale and scope of these risks are.”
The potential risks were highlighted in October when, on the first day of the Labour Party conference, a video that appeared to show Labour leader Keir Starmer swearing at his staff began to circulate on social media. It transpired that the video was a deepfake, its audio altered by AI with disturbing realism.
Speaking to PoliticsHome shortly before the AI Safety Summit in November, former Secretary of State for Science, Innovation and Technology Chloe Smith said she believed the UK government’s current approach, under which many of the threats posed by AI would be addressed by adapting existing legislation, was “correct”.
“There's an example here of why the approach in the AI white paper is correct, because we already have a body of election law that sets out a range of election offences,” she said.
“What we perhaps might need to focus on is how regulators and their partners and the police will be able to deliver their role throughout next year.”
There are others who believe the government has not gone far enough. Penrose, who plans to table an amendment to the Media Bill which he hopes will help tackle online bias and misinformation, said that AI represented a “bleeding edge” new frontier, and that government needed to act fast to respond to it and to hold large internet companies to account for the power they hold over data and information.
“We can always, and we should always, not take no for an answer,” he told PoliticsHome.
“We can and should and have already passed laws which bind these big companies. They are very powerful, but they aren't more powerful than the big developed economies with democratic governments and I think they want to comply with sensible laws.”
Tarman said that one of the reasons he believes the risk of misinformation will be so high in 2024 is that the UK has “not moved forwards” with provisions and measures to ensure safe elections.
“We've had massive missed opportunities with the Online Safety Act, with the Elections Bill, and we haven't learned the lessons of 2016 and many other elections that have happened around the world since,” he said.
“In so many ways, we are in a worse position than we were a number of years ago.”
He argued that the regulator Ofcom should already be setting out exactly how it will seek to mitigate misinformation harms and put out guidance around elections specifically.
“Ofcom could be setting out now what social media companies should be doing to address this disinformation in elections and help individuals and society as a whole, but we don't know if they're going to do that or when they're going to do it,” Tarman continued.
“Media literacy is massively under-resourced. It's clear that there's a massive need and it's troubling within the context of elections that the role that we can all play is being undermined by lack of action.
“The Department of Science, Innovation and Technology has a minuscule budget in this area. The Department for Education hasn't done its job for years around this area. So media literacy is one where we could have spent these years really building resilience to bad information in the UK population, but unfortunately the programmes and projects have been too small.”
Full Fact is also campaigning for a committee to advise Ofcom on these matters – which was promised in the Online Safety Act – to be set up by the middle of next year.
Tarman warned that while the focus is often on foreign actors, there is an “absolute risk” that the UK’s own political parties could either inadvertently or deliberately engage in and spread misinformation.
“There is an absolute risk that the political parties will not use standards of truth and honesty in their communications to influence our votes,” he said.
“It's absolutely vital that the political parties commit that the claims they make are honest and truthful and that they adhere to those standards. We also want to see them not trying to gain our vote by using misleading formats, like campaign materials dressed up as news.
“We've seen year after year in elections around the world that the domestic political actors themselves and their supporters are the ones we must also make sure adhere to the highest standards.
“The political parties have been too silent on this issue. They should share with us the principles that they will adopt, they should invite feedback, and we should allow the space for new good standards to come forward. We really need to remember that it's those who are seeking our vote who we should be most concerned about in terms of truth and honesty and bad information.”
A spokesperson for the Department for Science, Innovation and Technology said: “Through the Defending Democracy Taskforce, the Government is engaging with a wide range of stakeholders to protect the UK’s democratic processes and institutions, and to ensure they are secure and resilient to any threats of interference.
“Under the Online Safety Act, platforms will be required to swiftly remove illegal misinformation and disinformation content – including that which is user-generated using AI – as soon as they become aware of it.”