
Category: Disinformation In Politics

Coronavirus Conspiracy Theories, Tribalism And Public Health

During the pandemic, large crowds of Trump supporters took to the streets of US cities, demanding an end to coronavirus restrictions, such as lockdown and the wearing of masks. Britain saw similar issues, albeit on a smaller scale.

Why are some people so determined to ignore public health advice? Part of the answer may be found by examining political tribalism and coronavirus conspiracy theories.

In this post, I’ll explain how coronavirus conspiracy theories and disinformation leverage tribalism to influence people’s behaviour.

Divided societies, universal threat

When the pandemic first hit, some hoped that the shared experience of facing universal threat would bring warring political tribes together. But it seems the opposite is happening. This is partly driven by an organised and sustained campaign of disinformation and coronavirus conspiracy theories.

In the UK and US, government responses to the virus have been unlike those of many other countries. Portugal, Germany, New Zealand, Canada and South Korea have already managed to regain some level of control over the spread of the virus.

In contrast, both the UK and the US were slow to implement lockdown measures. Both gave their populations mixed messages about how to handle the pandemic. Both countries’ leaders have displayed a cavalier attitude towards the virus.

Political tribalism in the UK and the US is now affecting their populations’ response to the coronavirus crisis. This tribalism is a hangover from 2016, the same force that played a role in the election of Trump and the vote for Brexit – polarising the populations in the process.

Coronavirus conspiracy theories demonise groups

A sustained torrent of coronavirus disinformation has compounded these issues. In particular, numerous coronavirus conspiracy theories have eroded trust in public institutions among some segments of the population. Distrust of experts is nothing new. It’s been a central feature of tribal politics since 2016 and shows no sign of dissipating in this pandemic.

Common coronavirus conspiracy theories include:

  • The virus is a hoax, or no worse than the flu
  • The virus is a bioweapon
  • 5G technology causes or spreads the virus
  • There is a plan to enforce mandatory vaccinations
  • Military martial law is imminent

Tribalism means viewing the world as ‘us vs them’, with ‘us’ being superior and ‘them’ being threatening. This perspective is inherent in these coronavirus conspiracy theories.

Many revolve around the demonisation of a particular group (e.g. elites, the establishment, experts, the WHO, China, and so on). True believers view anyone who supports the demonised group as being part of it. And so the tribal divisions persist.

These coronavirus conspiracy theories cast doubt on the public health situation. They promote distrust of expert advice and official organisations. The result is shifts in population behaviour, e.g. people refusing to follow lockdown, wear masks or practise social distancing.

From Facebook to the streets

The situation has become particularly dangerous in the US, with its current protests. Here the role of social media comes under the spotlight.

Private Facebook groups have been key sites for inciting and organising these protests. Some groups are large, such as ‘Michiganders Against Excessive Quarantine’ or ‘Reopen Virginia’ (the latter with over 18,000 members).

Both groups are full of talk of coronavirus conspiracy theories, such as the example below from the Michigan group.

Source: https://twitter.com/willsommer/status/1250838111992647680

Below is an example comment from the ‘Reopen Virginia’ group. This user is calling for civil unrest, while also demonising the outgroup (‘leftist Democrats’). The post has attracted significant engagement, both comments and likes.

Source: https://twitter.com/jaredlholt/status/1250842215435337728/photo/3

These posts show how belief in tribal coronavirus conspiracy theories can lead to virus scepticism and denial. It can also trigger people to take real-life protest action, which risks turning violent.

Furthermore, it’s not easy to know who is producing these comments. Do they reflect the views of genuine American citizens? Or are some of the comments being astroturfed by those who seek to create social unrest?

Coronavirus conspiracy theories are a problem for other social media platforms too. YouTube hosts thousands of videos discussing all kinds of conspiracy theories in great detail. The platform recently changed its policies in an attempt to crack down on coronavirus conspiracy theories and 5G content. But it’s likely too little, too late.

The trouble is, platform takedowns are viewed as a sign of elite censorship in the minds of people already suspicious of experts and official organisations. This adds even more fuel to the fire of coronavirus conspiracy theories.

Local groups are key targets

Private local Facebook groups are a prime target for influence operations. They have already been identified as key battle sites for the US 2020 election, where influence operatives aim to manipulate the political narratives in key swing states.

Targeting local Facebook groups is an effective way to do this. As well as conducting voter suppression in these groups, influence operations can also push people to protest on the streets.

It’s difficult for researchers and analysts to study private Facebook groups in aggregate, as tools such as CrowdTangle don’t allow access to private groups.

These groups are hotspots for US 2020 manipulation activities. Facebook should monitor them carefully. Its moderators should look out not only for signs of voter suppression, but also for coordinated attempts to incite populations to violence.

We must take coronavirus conspiracy theories seriously

These times of heightened fear offer a prime opportunity for disinformation purveyors to influence the outcome of the US 2020 election.

When political tribalism is so entrenched, fact checking and counter disinformation messaging campaigns may be less effective on a large scale. Instead, they risk exacerbating people’s existing suspicions of the establishment and ‘elites’.

Coronavirus conspiracy theories are not trivial. They risk causing harm on a massive scale, by encouraging populations to ignore public health advice and by inciting real-life violence.

It’s essential that social media companies take coronavirus conspiracy theories seriously, particularly within private groups. Whether or not they do so may end up as a key deciding factor of the US 2020 election. 


6 Things I’ve Learned About COVID-19 Disinformation

Disinformation thrives on chaos. A global pandemic is about as chaotic as it gets.

For those who seek to spread disinformation, COVID-19 presents a far grander opportunity than either the 2016 US election or the vote on Brexit. The upcoming 2020 US presidential election further fans the flames.

That’s why it’s important to stop and take stock of lessons learned from the front lines of disinformation tracking.

I’ve been studying cross-platform coronavirus narratives for the last month or so. Here are a few of the things I’ve found.

What I’ve learned about COVID-19 disinformation


1. Q is a key player in disinformation efforts

Qanon is a mega-conspiracy narrative that encompasses a whole range of smaller ones. Its basic premise is that Donald Trump is in league with a shadowy figure called Q.

Together, Trump and Q fight against a group of elite paedophiles entrenched within the mainstream media and the Democratic Party.

Former presidential candidate Hillary Clinton and current candidate Joe Biden have both been major targets of Q’s accusations.

Every so often, Q releases tantalising nuggets of new information (called ‘Q drops’) for his followers to chew over.

These have sparked a whole ecosystem of pervasive social media content, from Twitter threads to entire YouTube channels.

Q and his followers have leveraged coronavirus disinformation heavily. Q-related themes and activity underpin many of the most widely spread coronavirus conspiracies.

Those include coronavirus being either a hoax or a bioweapon, 5G causing the virus, a supposed plan to enforce mandatory vaccinations, and the imminent arrival of military martial law.

2. Mainstream media is pushing disinformation narratives

Conservative media sources in the US, such as Fox News, play a significant role in promoting narratives that draw on conspiracies, including coronavirus disinformation. They claim it’s ‘not a big deal’, or ‘just like the flu’, or ‘all a big hoax’.

Although these stories may be less colourful than those of the average Q acolyte, they are still risky.

Coming from established media sources gives these narratives the social proof they need to appear credible in the minds of their audiences.

What’s more, this scenario means less work for those who intend to manipulate public opinion around the coronavirus.

They no longer have to waste time crafting convincing content, but can simply engage with organic content that already exists. And that’s exactly what they’re doing, with a firm eye on the US 2020 election.

3. Coronavirus tribalism is prevalent

Pitting ‘us’ against ‘them’ is at the core of most disinformation, including conspiracy theories. The narratives can take many forms, but always come down to one group (the ingroup) facing off against a predefined opposing group (the outgroup).

For Qanon, it’s Q’s followers who are the ‘enlightened’ ingroup, joining forces with him and Trump to battle the predatory elites. In British politics, we see ‘patriotic’ supporters of Brexit setting themselves against ‘treacherous’ Remainers (and vice versa).

Tribalism even filters down to matters of life or death, i.e. the coronavirus. On social media, I’ve noticed a recurring adversarial narrative emerging around how best to respond to the pandemic.

One camp downplays the severity of the virus, claiming measures such as the lockdown are an overreaction. The other camp is strongly in favour of lockdown and promotes WHO advice to Stay At Home. Each camp supports their own and attacks the other, often in derogatory and aggressive ways.

It’s dangerous when people are already suspicious of ‘elites’ and experts. They have a tendency to dismiss guidance from governments and public health organisations, which can lead to the flouting of virus mitigation measures. Real world harms can result.

4. Virus fears are being monetised

The chaos and fear of a global pandemic have spawned many opportunities for leveraging the attention economy.

As well as conspiracy theories, there are many examples of people making money via coronavirus disinformation, by tapping into people’s fear, boredom, and increased need for answers.

I’ve identified two main ways of doing this. The first is through creating highly clickable content about the virus.

This content may or may not be factual; it doesn’t matter to the creator, as long as it brings in the clicks.  Content is published on websites festooned with online ads, where each click brings extra ad dollars to the site owner.

The second way is to create content on topics such as ‘miracle cures’, which then feeds into attempts to sell products. Vitamin C is a prime example.

It’s a cynical exploitation of people’s fears about the virus and their need to regain a sense of control.

These ‘miracle cures’ are not scientifically proven. They provide a false sense of security, which may lead to individuals choosing not to self isolate and spreading the virus as a result.

5. Takedowns have a ‘backfire effect’ 

Takedowns are a necessary part of tackling the coronavirus disinformation problem.

However, denying bad actors freedom of reach can also strengthen the impetus behind conspiracy theories by feeding into an existing sense of elite suppression.

Here, conspiracy theorists view the platforms as part of the elite, keeping the ‘truth’ hidden from the people.

Conspiracy theorists are quick to react to takedowns, working them into their coronavirus disinformation narratives.

With 5G, a trend has sprung up of referring to it as ‘5gee’ or similar permutations. This is an attempt to avoid the keyword being picked up by moderators or analysts who are tracking it.
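To make the tracking problem concrete, here is a minimal sketch (in Python, using invented example posts rather than real data) of how an analyst might widen a keyword match to catch common ‘5G’ spelling variants. The pattern is an illustrative heuristic only; real monitoring would need a far broader and constantly updated set of rules.

```python
import re

# A loose pattern for '5G' spelling variants seen in conspiracy posts:
# '5G', '5 G', '5-G', '5gee', 'five g', 'five gee', etc.
# Illustrative heuristic only, not a production moderation rule.
FIVE_G_PATTERN = re.compile(
    r"\b(?:5|five)\s*[-_.]?\s*g(?:ee)?\b",
    re.IGNORECASE,
)

# Hypothetical example posts for demonstration only.
posts = [
    "They don't want you to know what 5gee is really doing!",
    "New five gee masts went up overnight near the hospital...",
    "Totally unrelated post about gardening.",
]

for post in posts:
    if FIVE_G_PATTERN.search(post):
        print("possible 5G-variant mention:", post)
```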

For conspiracy adherents, this sense of persecution further reinforces their existing worldview and makes them more likely to cling to it. In this way, takedowns can produce a ‘backfire effect’.

6. Platform responses are shifting 

Social media companies are frequently accused of not doing enough to reduce the flood of misleading content that overwhelms their platforms.

I don’t think they’re reluctant to act, but they have to balance enforcement with being seen to support free speech.

Finding that balance can be challenging when addressing conspiracy theories, as opposed to purely false information.

Most conspiracy theories are spun up like candy floss around a small kernel of truth.

A typical post will build a whole story around how some real life event is of possible significance to the wider narrative arc.

The difference between opinion and actual false information is not always clear-cut. This creates murky territory for the platforms. 

But things have shifted after some conspiracy theories, such as the one about 5G causing coronavirus, triggered real life harms.

A recent video by notorious conspiracy theorist David Icke was pulled from YouTube just days after it was released, heralding a change in approach.

A growing amount of research indicates that coronavirus conspiracy theories form a central part of coordinated influence operations.  

We can no longer afford to overlook the role of conspiracy theories and disinformation in influence operations.

Tribalism In The Time Of Coronavirus

As I write this, the world has descended into a major crisis, with effects more far-reaching than anything I’ve experienced in my lifetime. A powerful virus has swept onto the scene and is now ripping its way through the world. Barely any country has been spared.

Here in the UK, the coronavirus crisis is getting worse by the day. But merely observing the city streets on this sunny spring Sunday would give no indication of the gravity of the situation. Indeed, some UK tourist spots, notably Snowdon, experienced their ‘busiest day in living memory’. That’s quite something at a time when a highly contagious virus is on the loose.

In contrast, the streets of Paris, Lisbon and Barcelona are deserted. Most EU countries have issued a decisive response, putting their populations under strict lockdown to try and curb the spread of the virus. The UK government hasn’t followed suit.

Britain is saddled with unfortunate leadership in such a time of crisis. Messages from central government have been unclear and have arrived far too late. Many people have died. Amid frenzied warnings from other countries, tribalism, rooted in the impulses that drove Brexit, still bisects British society, even influencing how we respond to questions of health, and possibly of life and death.

Brexit tribalism could be seen as a barometer for who will approve or disapprove of Boris Johnson’s handling of the coronavirus situation. No scientific study has yet been conducted to prove or disprove this, but research from Cambridge has shown that Leave (and Trump) voters have a strong tendency to believe conspiracy theories.

So if I may hypothesise for a moment, it would go as follows.

Those who believe Johnson is doing well, and who don’t think self-isolation is necessary, are more likely to be Leave voters. Those who believe Johnson is getting it wrong, and that we should follow the majority of the EU (and the world) into lockdown, are more likely to be Remain voters.

I can’t help but wonder if these divided attitudes are linked to the government’s aggressively anti-EU narrative. Could it possibly be that our leaders are reluctant to implement lockdown because it would mean them falling into line with the EU? The British government can’t possibly be seen to do that. On the contrary, it must do the exact opposite. After all, there’s a voter base to keep happy.

This tribal stance has filtered down to the population. People’s cavalier real-life behaviour at a critical juncture risks the health and safety of us all.

We’ve gone beyond Brexit concerns now. Freedom of movement is no longer the most important thing at stake. Continued tribal attitudes in the UK could now lead to significant numbers of deaths. The reckoning has arrived. No matter what side of the political spectrum we’re on, we must ensure that tribalism does not cloud our actions on tackling the virus, as the New European so rightly points out.

There’s another factor influencing public opinion around coronavirus: online disinformation. It’s been a key part of turbocharging existing tribal divisions.

Based on my research so far, I’ve seen the following positions solidifying into recurring narratives. Many are from sources that originate in the United States, but the shared language and overlapping ideologies mean they can mostly be considered as UK-relevant too.  

Narratives primarily from conservative/right-wing/pro-Leave sources:

  • The coronavirus is a hoax used as a smokescreen for elites to take control of society
  • It’s no worse than the flu, so there’s no need to believe WHO or UN advice (in fact we shouldn’t trust them because they may be part of the elite conspiracy)
  • Social distancing is unnecessary / too extreme
  • China is to blame for all this. To quote Trump, coronavirus is ‘the Chinese virus’ 

Narratives primarily from liberal/left-wing/centrist/pro-Remain sources:

  • The coronavirus is real, serious, and affects everyone 
  • It can’t be compared to flu
  • We should trust advice from WHO/UN and other legitimate experts
  • Social distancing, and possibly lockdown, is necessary to save lives across the wider population.

Most of the disinformation that I’ve observed so far plays on the core narrative strands in the first group. People targeted by these narratives might well be less likely to take the virus seriously and more likely to carry on with a semblance of normal life, thus continuing the pandemic. This unhelpful behaviour is exacerbated by the population spending more time at home and hence online, seeking out constant updates on this critical global threat.

In the next post, I will unravel the coronavirus disinformation narratives in more detail, providing data-driven examples. It’s critical to understand the why behind the seeding of this disinformation, so I’ll also discuss the various incentives that are driving it.


How Disinformation Hacks Your Brain

Today I’m going to explain how disinformation hacks your brain.

In our fast-moving online world, even the most aware of us can be taken in by disinformation. As humans, our minds work in certain ways that can leave us vulnerable to deception – and no-one is immune.

Our perceptions are not as reliable as we might like to imagine, and the online environment amplifies these flaws. In this post, I’ll discuss five important psychological traits that dictate how disinformation hacks your brain.

5 Ways Disinformation Hacks Your Brain

Confirmation Bias

Search engines give us access to all the world’s information simply by typing a few words into a search bar.

Because of confirmation bias, people tend to search only for information that reinforces their beliefs. Furthermore, even if what they find is disinformation, the effect of confirmation bias makes them less likely to question its veracity.

For example, let’s take someone who already dislikes Donald Trump. Perhaps they might search Google for “why is Trump still president?”

This search produces a slew of articles critical of Trump, feeding into the person’s existing beliefs. Consequently, this becomes a vulnerable moment during which disinformation can easily find a foothold in the mind.

Social Proof

The term ‘social proof’ was first used by Robert Cialdini in his seminal marketing book, Influence. It’s a way of building trust in a person, a product or a message, by demonstrating that many people approve of it.

The bandwagon effect is the motivating force driving social proof. It dictates that if something seems popular, people will feel compelled to join in.

Social proof is especially important in today’s environment of information overload. Because there are so many options available to us, we need a shortcut to help us cut through the noise and determine which ones to trust.

For marketers, social proof is an essential tool. It’s also a powerful weapon in the arsenal of disinformation. Devices such as bots play a major role in building social proof around certain messages, including false ones.

Liking, sharing and replying to these messages creates an illusion of widespread approval, which attracts more people to trust them. This may snowball, causing the message to go viral.

There’s a lot more to say about the role of social proof in disinformation. I’ll explore it in more detail in a follow up post. For now, remember that it’s easy to fake online popularity, so likes and retweets aren’t always a reliable indicator of grassroots public opinion.
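As a purely illustrative sketch of that snowball dynamic (every number below is an assumption, not a measurement from any platform), consider a toy model in which each visible like slightly raises the chance that the next viewer also engages. Seeding a post with a hundred fake likes can multiply the genuine engagement it eventually attracts.

```python
import random

def simulate_engagement(viewers, seed_likes, base_rate=0.01,
                        boost_per_like=0.0005, cap=0.2):
    """Toy bandwagon model: each visible like slightly raises the chance
    that the next viewer also engages. All parameters are illustrative
    assumptions, not empirical values."""
    likes = seed_likes
    for _ in range(viewers):
        p = min(cap, base_rate + boost_per_like * likes)
        if random.random() < p:
            likes += 1
    return likes

random.seed(42)
organic = simulate_engagement(viewers=2000, seed_likes=0)
boosted = simulate_engagement(viewers=2000, seed_likes=100)  # 100 fake likes from bots

print(f"organic-only likes: {organic}")
print(f"with 100 seeded fake likes: {boosted} (includes the fakes)")
```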

The False Consensus Effect

We like to think that many other people share our beliefs, preferences, values and habits, even when that’s not actually the case. Behavioural psychology calls this overestimation the false consensus effect. It relates to our self-esteem and our desire to conform as part of a social group: the need to fit in.

Online, the false consensus effect is amplified in two main ways:

  • By means of algorithms that show us opinions reflecting our own (filter bubble effect),
  • By our habit of engaging only with others who support our views (echo chamber effect).

Disinformation that taps into the false consensus effect can find a fertile environment to take root, grow and mutate. Social media helps this happen. No matter how convinced you are of a certain view, you should never forget that other people may well think differently.

Tribalism

Humans are social animals. Consequently, gaining the approval of a likeminded group is important for boosting our self-esteem. We reinforce this sense of self-esteem by behaving in ways that favour our own group (known as the in-group).

For example, we might post on social media about the positive traits of our in-group, which is relatively harmless in itself. However, every in-group needs an out-group.

Furthermore, where there’s in-group loyalty there may also be out-group derogation – negative attitudes and behaviour towards the out-group. This conflict between groups of all kinds is a form of tribalism. It plays a huge role in how disinformation hacks your brain.

In emotive issues like politics, which tap into aspects of people’s identities, tribalism can morph into a force of dangerous power. Violence can easily follow. In fact, tribalism is the driving force behind many human conflicts.

Disinformation leverages the human tendency for tribalism by creating and disseminating adversarial narratives. These inflame existing divisions, creating a sense of ‘us vs them’. We can observe many cases of this in recent political events.

Examples include Trump supporters vs Clinton supporters in the US, Leavers vs Remainers in the UK, Muslims vs Buddhists in Myanmar, Han fans vs Tsai fans in Taiwan’s recent presidential election.

The Backfire Effect

You might expect people to stop believing in disinformation once they’re told it’s untrue. That seems logical, but human psychology doesn’t always work that way. The root of the problem lies (once again) in our self-esteem.

When certain beliefs become embedded in our worldview, they also become part of our identity. If one of those beliefs is challenged, it’s as if someone is shaking the very foundations of that identity.

Challenges to our identity can be psychologically painful. In response, we may cling even tighter to the original belief, making it stronger still. The attempted correction backfires, which is why this process is known as the backfire effect.

Summary: How Disinformation Hacks Your Brain

  • Human psychology makes us susceptible to disinformation
  • In a world of information overload, we seek shortcuts to help us navigate. However, these shortcuts, such as social proof, can be gamed.
  • Much of online behaviour has its roots in aspects of self-esteem and identity.
  • Simply ‘debunking’ disinformation may not be effective, due to the backfire effect.
  • Adversarial narratives are a common feature of disinformation, found in many situations worldwide. They can lead to tribalism, which risks real-life violence.


Astroturfing: A Quick Example from Facebook

What is Astroturfing?

Astroturfing is not new. Its history stretches back to the days of newspapers and pamphlets. But astroturfing has become a major concern in today’s ‘post-truth’ information environment.

The Guardian defines astroturfing as “the attempt to create an impression of widespread grassroots support for a policy, individual, or product, where little such support exists.”

The ‘grassroots’ part is where the name comes from: that bright green fake grass you might remember from the school sports field.


Social media is a prime environment for astroturfing campaigns. User attention spans are low, knee-jerk reactions are prevalent, and ‘likes’ are an addictive form of currency.

Illusion becomes reality when fake engagement intersects with genuine social media users. They are more likely to engage with seemingly popular posts because of social proof – a psychological effect in which people like or support things that already seem popular with others.

An Example of Astroturfing

Let’s take a look at an example of suspected astroturfing on Facebook. Our starting point is the official Facebook page of the UK’s current prime minister, Boris Johnson.

Underneath every post on his page, especially those about Brexit, we can see hundreds of responses. That’s not unusual for the page of a public figure. But the style of those responses seems artificial.

Screenshot of Boris Johnson’s Facebook page, with a selection of comments about Brexit.

They are all very similar: short utterances of praise for Boris Johnson, repeating words and phrases such as ‘brilliant’, ‘fantastic’, and ‘support Boris 100%’. Each comment has also attracted a large number of positive reaction emojis: ‘like’, ‘love’ and ‘laugh’.

This behaviour is odd. Genuine people do support Johnson, of course. But it’s suspicious for so many to comment on his posts in this distinctive and repetitive way. This looks very much like an astroturfing campaign.
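One rough way to put a number on that repetitiveness is to measure how similar the comments are to one another. The sketch below (using invented comments in the style described above, not data scraped from Facebook) computes a simple word-overlap score; an unusually high average similarity across many comments is a signal of possible coordination, though never proof on its own.

```python
from itertools import combinations

def jaccard(a: str, b: str) -> float:
    """Word-overlap similarity between two comments (0 = no shared words, 1 = identical word sets)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not wa or not wb:
        return 0.0
    return len(wa & wb) / len(wa | wb)

def mean_pairwise_similarity(comments: list[str]) -> float:
    """Average similarity over every pair of comments in the sample."""
    pairs = list(combinations(comments, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Invented examples for illustration only, not real scraped comments.
repetitive = ["Brilliant, support Boris 100%",
              "Fantastic! Support Boris 100%",
              "Brilliant Boris, support you 100%",
              "Fantastic, brilliant, support Boris"]
varied = ["I disagree with the backstop proposals entirely",
          "She's doing her best in an impossible situation",
          "What happens to the Irish border under this deal?",
          "Good speech, but the numbers don't add up"]

print(f"repetitive comments: {mean_pairwise_similarity(repetitive):.2f}")
print(f"varied comments:     {mean_pairwise_similarity(varied):.2f}")
```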

More genuine engagement

Now let’s contrast this with the page of his predecessor, Theresa May, specifically her Brexit-related posts. Here we can see a very different scenario, which immediately feels far more genuine.

Screenshot of Theresa May’s Facebook page showing a sample of typical comments about Brexit. Note the contrast with Johnson’s page.

Responses to May’s posts are more varied in content, tone and length. Some commenters disagree with her. Others support her. But most commenters use more depth and sophistication of language than the short repetitive replies to posts on Johnson’s page.

The responses on May’s page are more likely to be ‘organic’ (i.e. from real people who behave naturally). In contrast, it appears that Johnson’s page is the subject of astroturfing techniques, which may include fake comments and even fake followers.

Facebook locks its data down tight, so it’s hard to run further analysis to determine for certain whether the Johnson supporters are part of an organised campaign. But we can draw insights from recent examples.

Donald Trump used fake Facebook followers during the US presidential campaign. Researchers discovered that over half of the followers on his page came from countries known as hubs for Facebook ‘like farms’.

It is common for like farms to exist in developing countries such as the Philippines and India, where much of the population speaks English and the US dollar stretches a long way.

The farms offer customers the opportunity to buy fake Facebook likes and Twitter follows, to use for astroturfing the impression of popular support.

As well as likes, customers can purchase fake engagement, usually in the form of comments. This may explain the unusual commenting activity on Johnson’s page.

Why astroturfing matters

Astroturfing matters because it’s a deliberate attempt to manipulate perceptions of popular opinion, with potentially dangerous results.

Although astroturfing has been a feature of political campaigning for decades, the social media environment gives it enormous power. Social media users have become far more susceptible to its effects than newspaper readers ever were.

When combined with disinformation and conspiracy theories, astroturfing has the potential to cause all sorts of social and political chaos. Many would argue that it already has.