Weaponising Biases

People have always sought to disinform others – all the way back to the days of ancient Rome. In the 21st century the key difference is technology, which brings unprecedented speed and scale as well as new tools for creating increasingly convincing false content.

Yet humans remain subject to the same biases we’ve always had, making us uniquely vulnerable to the spread of disinformation online. Smart disinformation purveyors have learned how to exploit these psychological biases through the new mediums available to them. Here is a sample of these biases:

Bias blind spot

This refers to the common tendency for people to notice all the flaws in an opponent’s argument while failing to recognise any of their own – which explains why nobody thinks they’re biased. When we’re faced with the task of deciding whether a piece of information is true, the bias blind spot kicks in. We may ask “Can I believe this?” when the information is belief-consistent, and “Must I believe this?” when the information challenges our core beliefs.

Third-person effect

In the third-person effect, people believe that mass media messages have a greater effect on others than on themselves. That’s one of the reasons why propaganda is so effective. People think they personally are immune to it while only others are affected. This also goes some way to explaining the mentality driving the belief in conspiracy theories, where people think they’ve been enlightened while everyone else is still deceived by the mainstream media.

Declinism

The belief that a society or institution is tending towards decline. In particular, declinism manifests as a predisposition to view the past favourably but the future negatively. Combined with a strong sense of national pride and exceptionalism, declinism helps explain why slogans like ‘Make America Great Again’ and ‘Take Back Control’ were such effective messages for the Trump campaign and Brexit respectively.

Confirmation bias

Common across all forms of social media, this refers to people’s tendency to search for or interpret information in a way that confirms their preexisting views, while ignoring or dismissing information that challenges those views. This is one major reason why people are more likely to click on disinformation headlines that reinforce their views. Social media design leverages this human vulnerability to great effect with algorithms and filter bubbles.

Bandwagon effect 

Also known as the ‘herd mentality,’ the bandwagon effect is the tendency to believe something is true or good simply because many other people believe it to be so. On social media, the bandwagon effect helps disinformation purveyors spread their messages by providing social proof through posts that get numerous likes or shares.

False consensus effect

People have a tendency to overestimate the extent to which their own values and ideas are ‘normal’, assuming that the majority of others share them. In group settings, such as on social media, the false consensus effect can lead us to believe that our group’s views reflect those of the population as a whole. Social media heightens the false consensus effect because of algorithms that keep serving us content which matches our existing views.

Availability cascades

Availability cascades explain why certain false beliefs become fact in the minds of many. They are a self-reinforcing process in which a collective belief gains increasing plausibility by constant repetition. Beliefs that seem to explain a complex social or political topic in a simple way are particularly prone to becoming part of availability cascades.

Hostile media effect

This refers to the tendency for an individual to perceive news coverage as biased against their personal position on a certain issue. It helps to explain why conspiracy theories tend to thrive and why ‘alternative’ media sources like InfoWars can gain such a large following.

Backfire effect 

Here, attempts to correct someone’s misperceptions (e.g. in response to disinformation) can instead end up strengthening their views. When confronted with attitude-inconsistent information, our instinct is to defend our deeply held beliefs, causing us to cling to them more than ever – resulting in the backfire effect. This is one reason why attempting to ‘debunk’ people’s incorrect beliefs using fact-checked sources may not always work as well as expected.

Social identity theory (‘Us vs. them’)

People boost their own self-esteem by identifying as members of an ingroup, then reinforce that self-esteem by favouring their ingroup while acting negatively towards a perceived outgroup. This behaviour encourages tribalism and can lead to deeper divisions between groups. Football teams and political parties are common examples, but people also form such groups on social media.

In a world that revolves around digital technology, it’s easy to forget that all humans are subject to similar behavioural quirks and biases. Having a working knowledge of these biases is important for gaining a more nuanced understanding of the disinformation problem – one that goes beyond oversimplified explanations. It lets us see what lies beneath the surface of sophisticated disinformation campaigns, many of which tap into basic human psychology.
