
Disinformation, Dopamine, And How To Quit Feeding The Trolls

I used to spend a lot of time arguing with strangers on the Internet.

It normally went like this: I would post something political, the trolls would arrive, and I’d get stuck in endless back-and-forths with them. It was a futile endeavour. But somehow I felt compelled to keep on engaging.

On Twitter or Facebook, new notifications would keep coming in, alerting me to yet another comment. Then I’d engage again. It wasted my time, raised my anxiety levels, and made me feel more aggressive.

I’d gotten stuck in a dopamine loop.

This happens to us a lot, for example when we can’t stop refreshing our social media feeds. But what exactly is dopamine and what does it do?

How dopamine creates addiction

Dopamine is a neurotransmitter: a chemical messenger produced in the brain. It affects many functions, such as thinking, mood, attention and motivation. It also plays a critical role in triggering our drive to seek and search. In short, dopamine makes us want to search out information.

Two parts are at play in the dopamine loop: the ‘wanting’ part and the ‘liking’ part. The wanting part makes you take action (i.e. searching), while the liking part satisfies you and makes you stop searching. But the wanting part is stronger than the liking part, and that’s how we get trapped in endless-scroll mode.

Another important part of the dopamine system relates directly to the issue of engagement with trolling comments.

The system is very sensitive to cues that indicate the possibility of a reward. On social media, notifications are the cues, and they make the addiction even stronger. Anticipation is key: if we got a reward every time we logged in, the pull would actually weaken. It’s the uncertainty that gets us hooked, a pattern psychologists call intermittent (or variable-ratio) reinforcement.
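To make that concrete, here’s a toy Python sketch, my own illustration rather than anything from a real platform’s code, contrasting a predictable reward schedule with an unpredictable one. The 30% reward probability is an arbitrary assumption.

```python
import random

random.seed(42)  # reproducible run

def fixed_schedule(checks: int) -> list[bool]:
    """Reward on every check-in: fully predictable, nothing to anticipate."""
    return [True] * checks

def variable_schedule(checks: int, p: float = 0.3) -> list[bool]:
    """Reward on a random ~30% of check-ins: each check is a small gamble."""
    return [random.random() < p for _ in range(checks)]

for name, rewards in [("fixed", fixed_schedule(10)),
                      ("variable", variable_schedule(10))]:
    pattern = "".join("R" if r else "." for r in rewards)
    print(f"{name:>8}: {pattern}")
```

On the fixed schedule every check-in pays off, so checking holds no suspense. On the variable schedule you never know which check-in will deliver, and that uncertainty is precisely what slot machines, and notification feeds, exploit.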

So how did these behavioural psychology elements get incorporated into our Internet experience in the first place?

Engineered addiction and trolling

It all started with a researcher called BJ Fogg and his Persuasive Technology Lab at Stanford University. In September 2007, Fogg and his students began work on a formula that would change the world — and all of our daily lives.

They built Facebook apps using techniques from behavioural psychology, techniques that could engineer addiction, such as the dopamine loop I described above. Fogg developed a formula, his Behaviour Model (often written B = MAT), suggesting that people will act when three forces converge: motivation, ability, and trigger.

Let’s apply this formula to a hostile online interaction. The motivation is your desire to convince the other person that your opinion is right, or to get ‘one up’ on them; the trigger is seeing another notification on your social media app, indicating that your opponent has responded; and the ability is having your phone to hand, which lets you check right away and respond.
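To see how the three forces combine, here’s a minimal Python sketch of the formula. The numeric scores and the threshold are invented placeholders for illustration, not values from Fogg’s published work.

```python
def behaviour_occurs(motivation: float, ability: float, trigger_present: bool) -> bool:
    """Toy Fogg-style model: action happens only when a trigger arrives
    while motivation and ability are jointly high enough."""
    ACTION_THRESHOLD = 0.5  # arbitrary illustrative cut-off
    return trigger_present and (motivation * ability) > ACTION_THRESHOLD

# The hostile-interaction example from above:
motivation = 0.9   # you badly want to win the argument
ability = 0.8      # your phone is already in your hand
trigger = True     # a notification just arrived

print(behaviour_occurs(motivation, ability, trigger))   # True: you reply
print(behaviour_occurs(motivation, 0.0, trigger))       # False: no phone, no reply
```

Remove any one of the three forces, for instance by silencing the trigger, and the behaviour stops. That observation underpins the practical fixes later in this post.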

These techniques tap directly into the workings of the human brain, making them extremely powerful.

So why do social media platforms leverage such addictive and manipulative design practices?

Simple: their profitability depends on it.

With an ad-supported business model, the more time users spend on the app, the more profit the company makes. All that time you spend arguing with trolls on Facebook is making the platform more valuable to potential advertisers.

Dopamine and disinformation

Arguing online also affects our susceptibility to disinformation. The dopamine loop gives a powerful tool to those who seek to divide us. It perpetuates a sense of an adversarial environment and keeps us always on the attack.

When we divide ourselves into tribes and adopt a hostile attitude towards a perceived outgroup, we risk becoming more susceptible to harmful disinformation. We are more likely to share content that is cognitively resonant with our existing beliefs and which reflects the attitudes of our tribe.

The dopamine loop also affects our interactions with agreeable content. When we post something that our tribe members approve of, we’ll receive dopamine hits via our notifications of likes and encouraging comments. That boosts our self-esteem and keeps us coming back for more.

So what can we do to fix the trolling problem and reduce our susceptibility to disinformation?

Breaking out of the loop

Short-term practical solutions mainly involve adapting our devices to help break the dopamine loop. For example, we could make our phones less appealing by changing the screen to grayscale mode, or switching off all notifications.

But we can also tackle adversarial online behaviour in another way.

‘Don’t feed the trolls’ has become an Internet cliché. But it starts to make sense when thinking about the dopamine loop and anticipatory reward.

Everyone who posts online is looking for a response; they want to perpetuate the dopamine loop. If we can maintain our self-control by denying them that response, then we can break the cycle.

I’ve managed to break my own addiction to engaging with trolls. On Twitter, I simply mute the conversation. I can’t see the person anymore, and I receive no notifications of any of their comments. It makes the troll invisible to me, breaking the dopamine loop and allowing my brain to focus on other things.

On Facebook, I simply turn off notifications for a particular post. This has the same effect as muting on Twitter. Both platforms also offer a blocking option. I don’t normally use this because it gives trolls the satisfaction of knowing they’ve got a response. Muting is better, because it means they’re left wondering if I’m ignoring them. They just keep on yelling into the void.

Battleground or community

If we could all break the cycle and quit feeding the trolls, then adversarial disinformation and influence ops could lose much of their power. The online environment would feel like more of a community instead of a battleground. In turn, this may help reduce polarisation.

But it has to be done en masse. A handful of people breaking the cycle won’t be enough to change the overall environment. As social media is designed to be addictive, a wider intervention would be necessary to encourage people to do this.

Of course, the social media platforms have all the power. They could redesign their structures to destroy these dopamine loops and disincentivise disinformation. But their ad-driven business model means they have no incentive to do so.

Nevertheless, we can still improve our individual online experiences by taking steps to break the dopamine loop and reduce our susceptibility to disinformation. Part of doing this is disengaging from users who aim to trap us in never-ending adversarial debates around polarising topics.

5 Ways Our Minds Make Us Susceptible to Online Disinformation

In our fast-moving online world, even the most aware of us can be taken in by disinformation. As humans, our minds work in certain ways that can leave us vulnerable to deception, and no one is immune.

Our perceptions are not as reliable as we might like to imagine, and the online environment amplifies these flaws. In this post, I’ll discuss five important psychological traits that affect how we process information and, in turn, shape our online behaviour.

Confirmation Bias

Search engines give us access to all the world’s information; all we have to do is type a few words into a search bar.

But thanks to confirmation bias, people tend to search only for information that reinforces their beliefs. Even if what they find is disinformation, confirmation bias makes them less likely to question its veracity.

For example, take someone who already dislikes Donald Trump. They might search Google for “why is Trump still president?” This produces a slew of articles critical of Trump, feeding into the person’s existing beliefs. It’s a vulnerable moment, one in which disinformation can easily slip in.

Social Proof

The term ‘social proof’ was coined by Robert Cialdini in his seminal book on persuasion, Influence. It’s a way of building trust in a person, a product or a message by demonstrating that many people approve of it. The bandwagon effect is the motivating force behind social proof: if something seems popular, people feel compelled to join in.

Social proof is especially important in today’s environment of information overload. With so many options available to us, we need a shortcut to help us cut through the noise and determine which ones to trust.

For marketers, social proof is an essential tool. But it’s also a powerful weapon in the arsenal of disinformation. Bots play a major role in building social proof around certain messages, including false ones. Liking, sharing and replying to these messages creates an illusion of widespread approval, which attracts more people to trust them. This may snowball, causing the message to go viral.
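Here’s a toy Python simulation of that snowball dynamic. The probabilities and the size of the bot seed are invented numbers, meant only to illustrate how a small faked head start can change the outcome.

```python
import random

def simulate_spread(seed_shares: int, viewers: int = 5000,
                    base: float = 0.005, boost: float = 0.0005) -> int:
    """Each viewer is likelier to share the more shares the post already
    shows: a crude model of the bandwagon effect."""
    random.seed(0)  # reproducible run
    shares = seed_shares
    for _ in range(viewers):
        p = min(1.0, base + boost * shares)
        if random.random() < p:
            shares += 1
    return shares

print(simulate_spread(seed_shares=0))    # organic start: spreads slowly
print(simulate_spread(seed_shares=200))  # botnet fakes 200 shares: snowballs
```

The same post shown to the same audience crawls without the fake head start, but with a couple of hundred manufactured shares it looks popular enough that real users pile on.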

There’s a lot more to say about the role of social proof in disinformation, and I’ll explore it in more detail in a follow-up post. For now, remember that online popularity can easily be faked, and isn’t always a reliable indicator of grassroots public opinion.

False Consensus Effect

We all like to think that our beliefs, preferences, values and habits are widely shared, even though this may not be so. This overestimation is known as the false consensus effect. It is rooted in our self-esteem and our desire to conform, to fit in as part of a social group.

Online, the false consensus effect is amplified in two main ways: 1) by means of algorithms that show us opinions reflecting our own (filter bubble effect), and 2) our habit of engaging only with others who support our views (echo chamber effect).
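As a sketch of the first mechanism, here’s a toy similarity-based ranker in Python. The stance scores are invented placeholders; real feed algorithms weigh far more signals, but the bubble-forming tendency is the same.

```python
def engagement_score(post_stance: float, user_stance: float) -> float:
    """Score a post higher the closer its stance sits to the user's own
    (stances live on a -1.0 to 1.0 scale). Purely illustrative."""
    return 1.0 - abs(post_stance - user_stance)

posts = {
    "strongly agrees with you": 0.9,
    "neutral report": 0.0,
    "strongly disagrees with you": -0.9,
}
user_stance = 0.8  # the user's own position

feed = sorted(posts, key=lambda p: engagement_score(posts[p], user_stance),
              reverse=True)
print(feed)
# ['strongly agrees with you', 'neutral report', 'strongly disagrees with you']
```

Rank by agreement and dissenting views sink out of sight. The user never chose to exclude them; the ordering quietly did it for them.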

Disinformation that taps into the false consensus effect can find a fertile environment to take root, grow and mutate. Social media helps this happen. No matter how wedded you are to a certain view, always keep in mind that other people might think very differently.

Tribalism

Humans are social animals, so gaining the approval of a like-minded group is important for boosting our self-esteem. We reinforce this sense of self-esteem by behaving in ways that favour our own group (known as the in-group).

For example, we might post on social media about the positive traits of our in-group. This is relatively harmless in itself. But every in-group needs an out-group. Where there’s in-group loyalty there may also be out-group derogation – negative attitudes and behaviour towards the out-group. This conflict between groups of all kinds can be referred to as tribalism.

Emotive issues like politics tap into aspects of people’s identities, and there tribalism can morph into a force of dangerous power. Violence can easily follow; indeed, tribalism lies at the root of many human conflicts.

Disinformation leverages the human tendency for tribalism by creating and disseminating adversarial narratives. These inflame existing divisions, creating a sense of ‘us vs them’. We can observe many cases of this in recent political events.

Examples include Trump supporters vs Clinton supporters in the US, Leavers vs Remainers in the UK, Muslims vs Buddhists in Myanmar, and supporters of Han Kuo-yu vs supporters of Tsai Ing-wen in Taiwan’s 2020 presidential election.

Backfire Effect

You might expect people would stop believing in disinformation if they are told it’s untrue. This seems logical, but human psychology doesn’t always work that way. The root of the problem is found (once again) in our self-esteem.

When certain beliefs become embedded in our worldview, they also become part of our identity. If one of those beliefs is challenged, it’s as if someone is shaking up the very foundations of that identity.

Challenges to our identity can be psychologically painful. In response, we may cling tighter to the original belief, making it even stronger. The attempt to correct backfires, so this process is known as the backfire effect.

Key Takeaways

  • Human psychology makes us susceptible to disinformation
  • In a world of information overload, we rely on shortcuts, such as social proof, to help us navigate. But these shortcuts can be gamed
  • Much of online behaviour has its roots in aspects of self-esteem and identity
  • Simply ‘debunking’ disinformation may not be effective, because of the backfire effect
  • Adversarial narratives are a common feature of disinformation, found in many situations worldwide. They can lead to tribalism, which risks real-life violence