September 3, 2024

The Southport Riots: Online Disinformation and Offline Harm

On 29 July 2024, a 17-year-old boy fatally stabbed three children at a dance studio in Southport, United Kingdom (UK), injuring another ten people - mostly children - in the process. Police arrested the suspect almost immediately but did not disclose his personal details, in line with media guidelines concerning underage suspects. In the absence of official information, some sections of the public drew their own conclusions, prompting a large volume of disinformation to circulate online; this disinformation contributed strongly to violent protests in Southport and several cities across England and Northern Ireland during the first half of August. The most prominent disinformation narratives featured anti-immigrant and Islamophobic sentiment, global conspiracy theories and rhetoric undermining civic integrity.

Several political figures and disinformation actors framed the Southport attack as an immigration issue by claiming, without evidence, that the suspect was an immigrant known to UK authorities as a threat. The day after the attack, populist UK MP Nigel Farage shared a video falsely claiming there were reports that the attack was committed by someone being monitored by the country’s security services and questioning whether the government was telling the truth about the suspect. After being called out for this misleading narrative, Farage alleged that far-right, self-proclaimed misogynist influencer Andrew Tate was the source. Tate made a video shortly after the attack, stating the government and media “do not want to highlight how ridiculous it is that they allow military-aged males, combatants, to flood our shores” and that “the soul of the Western man is so broken that when the invaders slaughter your daughters, you do absolutely f***ing nothing.”

Many far-right sites used white supremacist messaging to claim that the attack laid bare a “two-tier justice system” in the UK that allegedly favours migrants over the white British population. British nationalist group Patriotic Alternative opined that “we, the White British people, are the victims of an anti-white multicultural tyranny that has dispossessed us of power and turned us into second-class citizens in our own land.” Another site claimed that the BBC displayed a two-tier justice bias by calling a counter-protest on 5 August “peaceful” while ignoring that “mobs of Muslim men” rioted in the streets and harassed a journalist. X/Twitter owner Elon Musk amplified claims of a two-tier justice system by using the hashtag #TwoTierKier about the Prime Minister in a post about Southport. Musk further elevated rhetoric from far-right UK actors claiming the country was on the brink of civil war, stating in another post that “civil war is inevitable.”

The two-tier justice system narrative also fed into a long-standing conspiracy theory that the UK government is using “weaponised immigration” to destroy the country by letting in large numbers of migrants by boat. Notably, these narratives also featured heavily in previous panics over alleged “Muslim grooming gangs” committing widespread child sexual abuse in the UK - despite research from the UK Home Office showing that “likely no one community or culture is uniquely predisposed to offending.”

Conspiracy theories regarding the alleged role of the UK government in the attack were a prominent undercurrent in these narratives. Some sites argued that the police response to the riots would serve as a pretext for an authoritarian crackdown on civil liberties and restrictions on daily life. Other sites variously claimed that the government orchestrated the attack using “paid thugs” to start a race war, organised the riots to destroy working people, or planned the riots as part of a globalist scheme led by George Soros - a common antisemitic conspiracy theory - to destroy the UK.

Internationally, Russian media channels amplified anti-immigrant, Islamophobic and conspiratorial narratives on official state-sponsored platforms. Pravda-EN, the English language version of the Kremlin-aligned outlet, claimed that the perpetrator was an “Arab” and cast doubt on police reports stating that the perpetrator was born in Cardiff. Another post explicitly tied the attack to Islam by stating that the British people “staged pogroms because of the local Wahhabis,” a reference to an Islamic reform movement often associated with extremism. Articles on other Russian sites accused Prime Minister Starmer of blaming the far right for the riots while ignoring government dereliction in migration policies, and claimed that British people might be forced to “take up arms” or flee the UK due to the alleged erosion of cultural identity.

Far-right Telegram channels played an outsized role in perpetuating disinformation and organising the riots. One of the main channels responding to the attack belonged to the UK chapter of the Active Club Network movement, a decentralised network of neo-Nazi white supremacist groups. Telegram took down the channel, but members simply set up a new channel and migrated over 70% of their members. One of the messages in this channel suggested that England was on the brink of collapse and encouraged members to take immediate, proactive steps to resist perceived threats to cultural identity or survival, chiefly through combat skills training. Other Telegram channels espoused an “accelerationist” ideology. Accelerationism is the idea that modernity, liberalism and capitalism contain inherent flaws that will lead to societal collapse and usher in profound transformations. Proponents endorse political violence and/or terrorism to “accelerate” this downfall. These and similar channels promoted the Great Replacement conspiracy theory - the false idea that there is a plot to replace white populations with non-white immigrants - and the Kalergi plan, an antisemitic conspiracy theory claiming there is a plot to eradicate European whites to the alleged benefit of a Jewish elite.

The rapid proliferation of these disinformation narratives, including on major US-based platforms, highlighted severe gaps in enforcement efforts against dangerous content. Multiple media organisations, including The Telegraph, BBC, CNBC and Le Monde, have reported on the prevalence of Southport-related dangerous content on the UAE-headquartered social media platform Telegram. Politico called out Telegram for allowing disinformation to spread with an “unparalleled level of impunity” and only removing some of the most egregious content.

The increasingly clear links between the riots and online disinformation prompted UK authorities to warn people against spreading false and inflammatory information and to put platforms on notice for failing to curb it. Over a dozen people have been arrested, charged or jailed for online hate offences related to the attack and the riots, and others may face legal action as authorities continue to investigate the riots and related online offences. The UK government confirmed that foreign states had amplified disinformation that inspired the riots, but the full extent of this foreign information manipulation and interference (FIMI) is still under investigation.

What is patently clear about these riots, however, is that online disinformation played a direct, significant and terrible role in contributing to offline harm. These narratives channelled protesters’ anger toward physical targets associated with Muslims and immigrants: mosques and temporary accommodation centres housing refugees were attacked, and immigration lawyers received threats. Businesses owned by immigrants and non-white people were also attacked, as was a police station in Sunderland. Over 100 police officers from multiple cities were reportedly injured during these riots and violent protests.

It is difficult to quantify how many of these harms could have been prevented had disinformation not spread so rapidly. Still, one fact remains clear: if technology platforms and governments fail to act against online disinformation, similar situations will recur. Anti-migrant narratives, Islamophobia and conspiracy theories had proliferated across the internet - and incited violent offline actions in the UK and other countries - for years before this attack. Left unchecked, this pattern is likely to persist and intensify.

The rapid spread of disinformation about the Southport suspect demonstrates the urgent need for action. According to the Center for Countering Digital Hate, false information about the suspect's identity reached over 420,000 viewers on X (formerly Twitter) alone. On the day of the stabbing, it appeared in at least 2,632 posts across various platforms. This swift propagation of online disinformation - and its potential to provoke offline harm - demands an equally rapid response.

Concerningly, Telegram’s failure to remove disinformation narratives proliferating on its platform provides further evidence that industry self-regulation is insufficient to safeguard users. Telegram’s terms of service prohibit promoting violence, but its content moderation actions and enforcement of this policy remain opaque. The lack of data on the extent of content removals - or on how Telegram defines content “promoting violence” - makes it difficult to hold the platform accountable and to assess the extent to which it is being proactive or negligent.

European policymakers have already established protocols within the EU’s Digital Services Act that require tech platforms to address crises like the Southport riots and that provide for audits of platforms’ risk mitigation efforts. The UK has its own framework for tech accountability in the Online Safety Act, set to take effect next year. However, critics argue that the law fails to adequately address content that is “legal but harmful.” The Southport riots illustrate how malicious actors exploit this gap, causing real-world harm to ordinary people.