
Disinformation poses a significant threat to the public’s faith in democracy


With the election fast approaching, the start of 2024 has seen renewed concern about the threat that disinformation poses to UK politics and democracy.

Thinks Insight & Strategy’s applied behavioural science team have conducted an online randomised controlled trial (RCT) to explore the threat posed by “election disinformation” – false content seeking to undermine faith in our elections and politics – and test a simple intervention with the potential to help limit its spread.

Our large-scale experiment involved 1,650 regular social media users browsing a simulated feed containing 30 posts in a random order: 15 legitimate posts and 15 examples of disinformation (including five of “election disinformation”). Participants were randomly allocated to different “arms” of the experiment, including a “control group”, in which the posts were presented with no additional information, and an “inoculation” arm, in which participants played a short 45-second “inoculation game” (a quiz designed to improve their ability to spot false or manipulated content) before seeing the same feed as the control group. The experiment measured the “reaction rate”: the number of times participants in each arm responded to the content in ways likely to lead to its amplification by social media platform algorithms (for example, liking, loving or sharing). Alongside the trial, we conducted focus groups and a nationally representative survey of 2,000 UK adults.
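
For readers curious about the shape of such an analysis, the sketch below is a minimal, hypothetical illustration of how reactions in a two-arm trial of this kind might be tallied and compared. The records, arm labels and counts are invented for illustration and are not drawn from the study's actual data.

```python
from collections import Counter

# Hypothetical per-participant records: (arm, reacted_to_disinformation)
# In the real trial each participant saw 15 disinformation posts; here we
# simply flag whether a participant reacted to any of them.
records = [
    ("control", True), ("control", False), ("control", True),
    ("inoculation", False), ("inoculation", False), ("inoculation", True),
    # ... the actual study involved 1,650 participants
]

reacted = Counter()
totals = Counter()
for arm, did_react in records:
    totals[arm] += 1
    reacted[arm] += did_react  # True counts as 1, False as 0

# Compare the share of participants in each arm who reacted to disinformation
for arm in totals:
    rate = reacted[arm] / totals[arm]
    print(f"{arm}: {reacted[arm]}/{totals[arm]} reacted ({rate:.0%})")
```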

Significant minorities in the public are open to questioning the integrity of our democracy

The research makes clear that election disinformation poses a genuine threat. Significant minorities in the public are open to questioning the integrity of our democracy; asked to choose which of two statements better reflected their view, 30 per cent of survey respondents selected “elections in the UK are often manipulated and/or rigged” (60 per cent selected “elections in the UK are free and fair”). In the RCT, an audio deepfake purporting to capture Keir Starmer berating his staff had the highest “reaction rate” of all the political or election disinformation we tested, suggesting that such content is particularly prone to amplification.

But it’s not all bad news. Our results point towards a potentially valuable weapon in the fight against online disinformation. Playing a 45-second “inoculation game” was associated with a meaningful and statistically significant reduction in behaviours that would contribute to the spread of disinformation. Compared to those in the control arm, the odds that participants in the inoculation arm would “react” to disinformation fell by 42 per cent. Considered across billions of social media interactions, this is a genuinely worthwhile reduction.
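
As a rough guide to what a 42 per cent fall in the odds means in practice, the snippet below shows how an odds ratio is computed from a simple 2×2 table of reactions by arm and converted into a percentage reduction. The counts are purely illustrative and are not the study's figures.

```python
def odds(reacted: int, did_not: int) -> float:
    """Odds of reacting = reactions divided by non-reactions."""
    return reacted / did_not

# Illustrative counts only: reactions vs non-reactions to disinformation posts.
control_odds = odds(reacted=500, did_not=1000)      # odds = 0.50
inoculated_odds = odds(reacted=290, did_not=1000)   # odds = 0.29

odds_ratio = inoculated_odds / control_odds         # roughly 0.58
reduction = 1 - odds_ratio                          # roughly 0.42, i.e. 42 per cent
print(f"Odds ratio: {odds_ratio:.2f} -> {reduction:.0%} lower odds of reacting")
```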

Obviously, a 45-second game is not a panacea. For a start, inoculation only addresses the spread of false content, doing nothing about the producers of disinformation (whether roguish politicians, conspiracy theorists or hostile states) or the cynicism and disengagement that underpin public belief in it. Nonetheless, these results (alongside those of multiple academic studies on inoculation) do suggest that, if a similar game were engaged with by a sizeable proportion of social media users, it could meaningfully reduce the spread of disinformation.

Compared to many academic versions, our game was simple and short by design. A further-improved version could have an even stronger impact. In the focus groups, participants played the game and told us they enjoyed it. But they also suggested making it competitive and shareable, to encourage more people to play. Imagine a counter-disinformation version of Wordle, with messaging honed to focus specifically on election disinformation, backed by a genuinely smart campaign to drive traffic. There are individuals and organisations out there with the expertise to make that happen. The evidence tells us it would have a positive impact. Whether the election comes in May, November or even next January, time is running out. Forty-five seconds to protect the election.

Ben Shimshon, co-founder & CEO, Thinks Insight & Strategy

Max Mawby, founder, Thinks Applied Behavioural Science Team
