Today I’m going to explain how disinformation hacks your brain.
In our fast-moving online world, even the most aware of us can be taken in by disinformation. As humans, our minds work in certain ways that can leave us vulnerable to deception – and no-one is immune.
Our perceptions are not as reliable as we might like to imagine, and the online environment amplifies these flaws. In this post, I’ll discuss five important psychological traits that dictate how disinformation hacks your brain.
5 Ways Disinformation Hacks Your Brain
Confirmation Bias

Search engines give us access to all the world’s information simply by typing a few words into a search bar.
Because of confirmation bias, people tend to search only for information that reinforces their beliefs. Furthermore, even if what they find is disinformation, the effect of confirmation bias makes them less likely to question its veracity.
For example, let’s take someone who already dislikes Donald Trump. Perhaps they might search Google for “why is Trump still president?”
This search produces a slew of articles critical of Trump, feeding into the person’s existing beliefs. Consequently, this becomes a vulnerable moment during which disinformation can easily find a foothold in the mind.
Social Proof

The term ‘social proof’ was first used by Robert Cialdini in his seminal marketing book, Influence. It’s a way of building trust in a person, a product or a message, by demonstrating that many people approve of it.
The bandwagon effect is the motivating force driving social proof. It dictates that if something seems popular, people will feel compelled to join in.
Social proof is especially important in today’s environment of information overload. Because there are so many options available to us, we need a shortcut to help us cut through the noise and determine which ones to trust.
For marketers, social proof is an essential tool. It’s also a powerful weapon in the arsenal of disinformation. Automated accounts, or bots, play a major role in building social proof around certain messages, including false ones.
Liking, sharing and replying to these messages creates an illusion of widespread approval, which attracts more people to trust them. This may snowball, causing the message to go viral.
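This snowball dynamic can be sketched as a simple threshold model of collective behaviour (in the spirit of Granovetter’s classic model). The numbers below are purely illustrative: each hypothetical user shares a message only once enough other people appear to have shared it, so a small seed of fake bot engagement can tip an otherwise dormant message into a full cascade.

```python
def cascade_size(thresholds, bot_boost):
    """Threshold model of the bandwagon effect.

    Each real user shares a message once the number of visible shares
    reaches their personal threshold. `bot_boost` is the fake 'social
    proof' seeded by bots. Returns how many real users end up sharing.
    """
    shares = bot_boost
    remaining = sorted(thresholds)
    while remaining and remaining[0] <= shares:
        remaining.pop(0)  # this user joins the bandwagon
        shares += 1
    return shares - bot_boost

# Seven real users; none is willing to share first (lowest threshold is 2)
users = [2, 3, 3, 4, 5, 6, 8]
print(cascade_size(users, 0))  # 0 - with no fake likes, nothing spreads
print(cascade_size(users, 2))  # 7 - two bot likes trigger a full cascade
```

The point of the toy model is that the outcome flips from zero spread to total spread with only two units of manufactured approval, which is exactly why faked engagement is so valuable to disinformation campaigns.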
There’s a lot more to say about the role of social proof in disinformation. I’ll explore it in more detail in a follow-up post. For now, remember that it’s easy to fake online popularity, so likes and retweets aren’t always a reliable indicator of grassroots public opinion.
The False Consensus Effect
We like to think that many other people share our beliefs, preferences, values and habits, even when that’s not actually the case. Behavioural psychology calls this overestimation the false consensus effect. It is rooted in our self-esteem and our desire to conform as part of a social group: our need to fit in.
Online, the false consensus effect is amplified in two main ways:
- By algorithms that show us opinions reflecting our own (the filter bubble effect);
- By our habit of engaging only with others who support our views (the echo chamber effect).
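The filter bubble arises mechanically from engagement-based ranking, which can be illustrated with a minimal sketch: posts matching a user’s past interests float to the top while everything else sinks. The tags and scoring rule here are hypothetical simplifications, not any platform’s actual algorithm.

```python
def rank_feed(posts, user_interests):
    """Score each post by how many of the user's interest tags it
    matches, then sort the feed so the closest matches come first."""
    def score(post):
        return len(set(post["tags"]) & user_interests)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "tags": ["politics", "opinion"]},
    {"id": 2, "tags": ["science"]},           # outside the bubble
    {"id": 3, "tags": ["politics", "cats"]},
]
feed = rank_feed(posts, {"politics", "cats"})
print([p["id"] for p in feed])  # [3, 1, 2] - unrelated views sink to the bottom
```

Even this three-line ranking rule reliably pushes content that confirms the user’s existing interests to the top of the feed, which is the filter bubble in miniature.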
Disinformation that taps into the false consensus effect can find a fertile environment to take root, grow and mutate. Social media helps this happen. No matter how convinced you are of a certain view, you should never forget that other people may well think differently.
Tribalism

Humans are social animals. Consequently, gaining the approval of a like-minded group is important for boosting our self-esteem. We reinforce this sense of self-esteem by behaving in ways that favour our own group (known as the in-group).
For example, we might post on social media about the positive traits of our in-group, which is relatively harmless in itself. However, every in-group needs an out-group.
Furthermore, where there’s in-group loyalty there may also be out-group derogation – negative attitudes and behaviour towards the out-group. This conflict between groups of all kinds is a form of tribalism. It plays a huge role in how disinformation hacks your brain.
In emotive issues like politics, which tap into aspects of people’s identities, tribalism can morph into a force of dangerous power. Violence can easily follow. In fact, tribalism is the driving force behind many human conflicts.
Disinformation leverages the human tendency for tribalism by creating and disseminating adversarial narratives. These inflame existing divisions, creating a sense of ‘us vs them’. We can observe many cases of this in recent political events.
Examples include Trump supporters vs Clinton supporters in the US, Leavers vs Remainers in the UK, Muslims vs Buddhists in Myanmar, Han fans vs Tsai fans in Taiwan’s recent presidential election.
The Backfire Effect
You might expect people to stop believing in disinformation once they’re told it’s untrue. This seems logical; however, human psychology doesn’t always work that way. The root of the problem is found (once again) in our self-esteem.
Furthermore, when certain beliefs become embedded in our worldview, they also become part of our identity. If one of those beliefs is challenged, it’s as if someone is shaking up the very foundations of that identity.
Challenges to our identity can be psychologically painful. In response, we may cling tighter to the original belief, making it even stronger. Because the attempted correction backfires, this process is known as the backfire effect.
How Disinformation Hacks Your Brain: Key Takeaways
- Human psychology makes us susceptible to disinformation
- In a world of information overload, we seek shortcuts to help us navigate. However, these shortcuts, such as social proof, can be gamed.
- Much of our online behaviour has its roots in self-esteem and identity.
- Simply ‘debunking’ disinformation may not be effective, due to the backfire effect.
- Adversarial narratives are a common feature of disinformation, found in many situations worldwide. They can lead to tribalism, which risks real-life violence.