Category: The Psychology of Disinformation

4 Simple Ways To Stop Sharing Fake News and Disinformation

Introduction

Fake news, more specifically known as disinformation, is a major problem that shows no sign of going away. If anything, it’s evolving in new ways to become more nefarious than before. Elections are always major flashpoints for fake news, and the US presidential election of 2020 is no exception. 

Many individuals and organizations are working hard to come up with ways to fight fake news and disinformation. In the meantime, ordinary internet users can also do their part to help.

In this post, I’ll discuss four simple ways that you can stop sharing fake news and disinformation.


Break Out Of Dopamine Loops


What is a dopamine loop and how does it relate to fake news and disinformation?

Dopamine is a chemical in the brain that affects functions such as mood, attention and motivation. It also plays a key role in affecting our desire to seek out new things – like information. 

Dopamine loops consist of two parts: wanting and liking. ‘Wanting’ compels you to keep searching for new information, while ‘liking’ is the part that makes you feel satisfied once you find it.

The trouble is, in the dopamine loop, wanting is stronger than liking. This traps us in the loop, constantly compelled to seek out new information. 

The original designers of social media knew all about dopamine loops. They built their platforms with the loops in mind, knowing they would keep users hooked, and that hooked users are more likely to view online ads.  

So how does the dopamine loop relate to fake news and disinformation? One major way that we get dopamine hits online is through receiving notifications on social media.

You know, those little red numbers in the top corner of Facebook. Think about how you feel when you open your profile and see a bunch of them waiting for you. You feel good, right? This is dopamine firing in your brain. 

Sharing content with your friends and followers is a great way to get notifications, which gives you even more dopamine. But this is where we find the danger from fake news and disinformation.

When we share to get dopamine hits, we’re less likely to take the time to check whether the information we’re sharing is actually true. After all, we’re constantly in a state of information overload. 

Our addiction to dopamine makes us vulnerable. One way to stop spreading fake news and disinformation is to break that addiction: avoid getting stuck in dopamine loops, constantly refreshing your social media apps in the hunt for fresh information and notifications. 

Quick ways to break the dopamine loop:

  • Turning off your social media notifications altogether
  • Switching your device to grayscale mode (making it less likely to produce a dopamine hit)
  • Pausing to take a few deep breaths before sharing any content 

But there’s another critical way to stop sharing fake news and disinformation…

Avoid Heated Arguments Online


The internet is full of trolls. No matter what you say online, it often feels like someone is always ready to attack you for it. Your natural instinct is to strike back. That’s a very human response. But it risks making you more likely to share fake news and disinformation.  

Why? Because arguing online is another way to get trapped in a dopamine loop. Your antagonist keeps responding, so you keep getting more notifications. You keep arguing back, and the cycle continues.

Often, you’ll share a piece of online content, perhaps a news article, to prove your point and get ‘one up’ on your opponent. When doing so, you probably don’t take the time to fact-check the article. That’s where the danger is. 

What’s more, some online trolls aren’t just random users: they’re part of coordinated inauthentic behavior campaigns designed to sow division and hostility around certain topics (usually political ones).

These campaigns usually involve fake news and disinformation too. By arguing with these political trolls, you’re giving them exactly what they want. 

Luckily, there’s an easy way to avoid being drawn into online political arguments. On Twitter, it’s the mute function (either mute conversation, or mute user). On Facebook, you can turn off notifications about a specific post.

These features are great, because they allow you to break out of the dopamine loop and the troll has no idea. They just carry on yelling into the void. Meanwhile, you carry on with your day and remain blissfully unaware.

Check Your Confirmation Biases


Confirmation bias plays a key role in increasing our likelihood of sharing fake news and disinformation. But what exactly is it?

Confirmation bias is our natural tendency to search for, favor and easily believe information that fits with our existing worldview. 

Let’s look at how confirmation bias works in practice. Suppose you’re a Trump supporter and you see a tweet (falsely) claiming that US presidential candidate Joe Biden has dementia.

You don’t like Biden, so thanks to confirmation bias, you’re very likely to hit retweet without even stopping to question whether the claim is true. 

You also know that your Twitter followers (who have similar worldviews) will appreciate your sharing this tweet. They’re likely to give it lots of attention, including retweets and favorites – i.e. plenty of extra dopamine for you. 

However, if you saw a similar tweet questioning Trump’s mental health, it’s far more likely that you’d be skeptical of it. Of course, this works in the other direction too. Confirmation bias is not unique to either end of the political spectrum. 

Confirmation bias is dangerous because it makes people automatically believe (and probably share) content that fits their worldviews, without stopping to check its veracity. 

If you really want to stop sharing fake news and disinformation, you have to approach your social media use knowing that you have confirmation bias. You have to consciously ask yourself what exactly compels you to share a certain post.

It’s not easy, but it’s a necessary step to help stop sharing fake news and disinformation.

Consider Content Incentives For Fake News


Finally, I want to discuss the incentives of social media content. Every post and article on the internet has a certain incentive behind it. For many content creators, publishing articles is a way to drive traffic to their websites, to earn money from online ads. This is their main incentive.

But the social media space is noisy, so those articles need to stand out. That’s why you’ll see so many overblown clickbait titles that often bear little relation to the actual content of the article.

In particular, politics is a highly charged and emotive topic, so it’s often used to catch attention and drive site traffic. That’s how the infamous Macedonian teenagers made money from pushing pro-Trump fake news in 2016.

Another incentive in content creation is to push a specific worldview, perhaps on behalf of a foreign government. The Kremlin uses this technique a lot.

In the early days of the 2020 pandemic, I found that Russia-linked news sites were pushing conspiracy theory narratives (e.g. the dangers of 5G, Bill Gates being responsible for the coronavirus, the virus being a hoax). These showed up consistently on social media, for example in US- and UK-based Facebook groups.  

Before sharing something on social media, consider the incentives of its creator. Are you truly happy to help that clickbait website make more ad money, or to help a hostile foreign government promote its worldview to your fellow countrymen?

Summary

In this article, I presented four simple ways to stop sharing fake news and disinformation. I talked about the following points:

  • How to break out of dopamine loops 
  • How to avoid heated arguments online 
  • Why you should check your confirmation biases
  • Why you should consider the incentives of content

Are you doing any of these already? Let us know in the comments.

Dopamine and Disinformation: How To Quit Feeding The Trolls

Dopamine and disinformation are intrinsically linked. In this article, I’ll explain how this works, and look at some ways to lessen the effects.

I used to spend a lot of time arguing with strangers on the Internet.

It normally went like this: I would post something political and the trolls would arrive. I’d get stuck in endless back-and-forths with them. It was a futile endeavour. But somehow I felt compelled to keep on engaging.

On Twitter or Facebook, new notifications would keep coming in, alerting me to yet another comment. Then I’d engage again. It wasted my time, raised my anxiety levels, and made me feel more aggressive.

I’d gotten stuck in a dopamine loop.

This happens to us a lot, for example when we can’t stop refreshing our social media feeds. But what exactly is dopamine and what does it do?

How dopamine creates addiction

Dopamine is a chemical created in the brain. It affects many functions, such as thinking, mood, attention and motivation. It also plays a critical role in triggering our desire to seek out and search. In short, dopamine makes us want to search out information.

Two parts are at play in the dopamine loop. There’s the ‘wanting’ part and the ‘liking’ part. The wanting part makes you take action (i.e. searching), while the liking part satisfies you and makes you stop searching. But the wanting part is stronger than the liking part. And that’s how we get trapped in endless scroll mode.

Another important part of the dopamine system relates directly to the issue of engagement with trolling comments.

The system is very sensitive to cues that indicate the possibility of a reward. On social media, notifications are the cues. They make the addiction even stronger. Anticipation is key. If we got a reward every time we logged in, it would weaken the addiction. It’s the uncertainty that gets us hooked.
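One way to make this concrete: if we treat each check of the app as a hit-or-miss reward, the ‘suspense’ can be modelled as the variance of that outcome. The sketch below is a toy framing of my own, not a formula from the behavioural literature; it simply shows that a guaranteed reward carries no suspense, while an unpredictable one maximises it.

```python
# Toy model: 'suspense' as the variance of a hit-or-miss (Bernoulli)
# reward with probability p. This framing is an illustration, not a
# formula from the neuroscience literature.

def reward_uncertainty(p: float) -> float:
    """Variance of a Bernoulli reward with probability p."""
    return p * (1 - p)

# Guaranteed rewards (p = 1.0) carry no suspense at all; a coin-flip
# reward (p = 0.5) maximises it.
for p in [0.0, 0.25, 0.5, 0.75, 1.0]:
    u = reward_uncertainty(p)
    print(f"reward probability {p:4.2f} | uncertainty {u:.3f} " + "#" * int(u * 40))
```

A reward that arrived every single time would carry zero uncertainty, which is exactly why it would weaken the hook; maximum unpredictability sits at the coin-flip point.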

So how did these behavioural psychology elements get incorporated into our Internet experience in the first place?

Engineered addiction and trolling

It all started with a researcher called BJ Fogg and his Persuasive Technology Lab at Stanford University. In September 2007, Fogg and his students began work on a formula that would change the world — and all of our daily lives.

They built Facebook apps using techniques from behavioural psychology, techniques that could engineer addiction, such as the dopamine loop that I described above. Fogg developed a formula, suggesting that people will act when three forces converge: motivation, trigger, and ability.

Let’s apply this formula to a hostile online interaction. The motivation is your desire to convince the other person that your opinion is right, or to get ‘one up’ on them; the trigger is seeing another notification on your social media app, indicating that your opponent has responded; and the ability is having your phone to hand, which lets you check right away and respond.

These techniques tap directly into the workings of the human brain, making them extremely powerful.
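To make the convergence concrete, here’s a minimal Python sketch of the motivation-trigger-ability idea as applied to the troll-reply scenario. The multiplicative score and the threshold value are my illustrative assumptions, not part of Fogg’s published model.

```python
# Illustrative toy model of the three-forces idea described above:
# a behaviour fires when motivation, ability and a trigger converge.
# The multiplicative score and the 0.25 threshold are assumptions
# made for illustration; they are not part of Fogg's published work.

from dataclasses import dataclass

@dataclass
class Moment:
    motivation: float  # 0.0-1.0: how much you want to 'win' the argument
    ability: float     # 0.0-1.0: how easy it is to act (phone in hand = high)
    trigger: bool      # whether a cue is present (e.g. a new notification)

def will_reply(m: Moment, threshold: float = 0.25) -> bool:
    """The behaviour fires only if a trigger is present AND
    motivation * ability clears the threshold."""
    return m.trigger and (m.motivation * m.ability) > threshold

# Heated thread, phone in hand, notification just arrived: you reply.
print(will_reply(Moment(motivation=0.9, ability=0.8, trigger=True)))   # True

# Same motivation, but notifications muted: no trigger, no reply.
print(will_reply(Moment(motivation=0.9, ability=0.8, trigger=False)))  # False

# Phone out of reach: ability drops and the behaviour stays dormant.
print(will_reply(Moment(motivation=0.9, ability=0.2, trigger=True)))   # False
```

Note that the behaviour never fires without a trigger, however motivated you are, and that lowering ability alone can also keep it below threshold. This is the logic behind the mitigations discussed later: muting removes the trigger, and putting the phone out of reach reduces ability.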

So why do social media platforms leverage such addictive and manipulative design practices?

Simple: their profitability depends on it.

The ad-supported business model means that users spending more time on the app leads to more profit for the company. All that time you spend arguing with trolls on Facebook is making the platform more valuable to potential advertisers.

Dopamine and disinformation

Arguing online also relates to dopamine and disinformation. It can make us more susceptible to the latter. The dopamine loop gives a powerful tool to those who seek to divide us. It perpetuates a sense of an adversarial environment and keeps us always on the attack.

When we divide ourselves into tribes and adopt a hostile attitude towards a perceived outgroup, we risk becoming more susceptible to harmful disinformation. We are more likely to share content that is cognitively resonant with our existing beliefs and which reflects the attitudes of our tribe.

The dopamine loop also affects our interactions with agreeable content. When we post something that our tribe members approve of, we’ll receive dopamine hits via our notifications of likes and encouraging comments. That boosts our self-esteem and keeps us coming back for more.

So what can we do to fix the trolling problem and reduce our susceptibility to disinformation?

Breaking out of the loop

Short-term practical solutions mainly involve adapting our devices to help break the dopamine loop. For example, we could make our phones less appealing by changing the screen to grayscale mode, or switching off all notifications.

But we can also tackle adversarial online behaviour in another way.

‘Don’t feed the trolls’ has become an Internet cliché. But it starts to make sense when thinking about the dopamine loop and anticipatory reward.

Everyone who posts online is looking for a response. They want to perpetuate the dopamine loop. If we can maintain our self control by denying them that response, then we can break the cycle.

I’ve managed to break my own addiction to engaging with trolls. On Twitter, I simply mute the conversation. I can’t see the person anymore, and I receive no notifications of any of their comments. It makes the troll invisible to me, breaking the dopamine loop and allowing my brain to focus on other things.

On Facebook, I simply turn off notifications for a particular post. This has the same effect as muting on Twitter. Both platforms also offer a blocking option. I don’t normally use this because it gives trolls the satisfaction of knowing they’ve got a response. Muting is better, because it means they’re left wondering if I’m ignoring them. They just keep on yelling into the void.

Battleground or community

If we could all break the cycle and quit feeding the trolls, then adversarial disinformation and influence ops could lose much of their power. The online environment would feel like more of a community instead of a battleground. In turn, this may help reduce polarisation.

But it has to be done en masse. A handful of people breaking the cycle won’t be enough to change the overall environment. As social media is designed to be addictive, a wider intervention would be necessary to encourage people to do this.

Of course, the social media platforms have all the power. They could redesign their structures to destroy these dopamine loops and disincentivise disinformation. But their ad-driven business model means they don’t have an incentive to do so.

Nevertheless, we can still improve our individual online experiences by taking steps to break the dopamine and disinformation cycle. Part of doing this is to disengage from users who aim to trap us in never-ending adversarial debates around polarising topics.

Analysing Trump’s Medical Disinformation on Facebook

US president Donald Trump shocked the world this week with his latest piece of medical disinformation.

Trump claimed that injecting disinfectant into the body could be an ‘interesting’ way to cure COVID-19.

He later tried to back-pedal, claiming he was being sarcastic. But that wasn’t how most of the world took it.

Dangers of medical disinformation

The mainstream media and the public widely lambasted this dangerous medical disinformation.

Amid the furore over Trump’s remarks, a major disinfectant firm issued a statement urging the public not to inject or drink any of their products.

However, members of pro-Trump Facebook groups dedicated to conspiracy theories displayed quite the opposite reaction. 

I examined some of these groups to provide comment for an article in CodaStory. I’d previously gathered a list of these groups because of their strong focus on various ‘corona disinformation conspiracies’.

These include 5G causing the virus, the virus being a US bioweapon, and Bill Gates having orchestrated the ‘virus hoax’ in his ambition to enforce a worldwide vaccine programme. 

Many of the groups also centred around the Qanon conspiracy theory.

Pro-Trump Facebook reactions

You might expect the suggestion of injecting bleach to be a step too far even for these largely pro-Trump groups. Not so. 

In my initial observation of the groups, I noticed three distinct ways in which the members attempted to account for Trump’s bizarre medical disinformation.

First, some claimed that Trump was just ‘playing the media’: anyone who believed he meant what he said must be stupid.

These commenters also attributed the negative media coverage to ‘yet another’ attempt by the MSM (mainstream media), liberals, or Democrats to smear Trump.

Second, some commenters claimed that the media had quoted Trump ‘out of context’. According to them, he was speaking ‘more generally’ about possible ways to treat COVID-19.

Others highlighted a fact-check article from far-right news outlet Breitbart. But no-one acknowledged the videos of Trump making these claims, there for everyone to see and hear. 

The third claim relates more closely to another strand of COVID-19 medical disinformation: ‘miracle cures’. One commenter claimed that Trump must have been referring to UV light therapy and ozone therapy, treatments that already exist.

Things got more interesting when the commenter drew links between the medical disinformation about bleach and the popular narrative of ‘Vitamin C as miracle cure’.

They claimed that taking Vitamin C causes hydrogen peroxide to build up in the body. Since hydrogen peroxide has a disinfectant effect, they argued, Trump’s comments had a basis in medical fact.

Rationalising medical disinformation

These three counter-narratives about Trump’s medical disinformation all attempt to rationalise an influential figure making a dangerous and irresponsible remark.

Tribal attitudes drive many of these rationalisations: for example, the claims that the media purposefully misinterpreted Trump’s comments as part of a ‘libs’ or ‘Dems’ smear attack. Once again, this reinforces the existing divide between populist pro-Trump narratives and the mainstream.

The question remains: How many of these Facebook group members are genuine American citizens? Facebook itself is the only entity that could properly attribute the accounts. And it doesn’t seem to be giving much away.

I suspect group members are a mix of genuine Trump supporters and astroturfers working to stir up tribal hatred of the ‘other side’.

Tribal attitudes can be dangerous, particularly in relation to public health. People in the pro-Trump tribe are more likely to challenge messages from the perceived ‘outgroup’ (‘experts’ and the ‘MSM’) such as critical public health advice from the WHO.

A similar dynamic has fuelled recent anti-lockdown protests across the US, which may already have spread the virus further and compromised the entire country. Astroturfing was certainly a factor there; there’s no reason why it couldn’t be influencing these groups too.

Coronavirus Conspiracy Theories, Tribalism And Public Health

During the pandemic, large crowds of Trump supporters took to the streets of US cities, demanding an end to coronavirus restrictions, such as lockdown and the wearing of masks. Britain saw similar issues, albeit on a smaller scale.

Why are some people so determined to ignore public health advice? Part of the answer may be found by examining political tribalism and coronavirus conspiracy theories.

In this post, I’ll explain how coronavirus conspiracy theories and disinformation leverage tribalism to influence people’s behaviour.

Divided societies, universal threat

When the pandemic first hit, some hoped that the shared experience of facing universal threat would bring warring political tribes together. But it seems the opposite is happening. This is partly driven by an organised and sustained campaign of disinformation and coronavirus conspiracy theories.

In the UK and US, government responses to the virus have been unlike those of many other countries. Portugal, Germany, New Zealand, Canada and South Korea have already managed to regain some level of control over its spread.

In contrast, both the UK and the US were slow to implement lockdown measures. Both gave their populations mixed messages about how to handle the pandemic. Both countries’ leaders have displayed a cavalier attitude towards the virus.

Political tribalism in the UK and the US is now affecting their populations’ response to the coronavirus crisis. This tribalism is a hangover from 2016, the same force that played a role in the election of Trump and the vote for Brexit – polarising the populations in the process.

Coronavirus conspiracy theories demonise groups

A sustained torrent of coronavirus disinformation has compounded these issues. In particular, numerous coronavirus conspiracy theories have eroded trust in public institutions among some segments of the population. Distrust of experts is nothing new. It’s been a central feature of tribal politics since 2016 and shows no sign of dissipating in this pandemic.

Common coronavirus conspiracy theories include:

  • 5G causing the virus
  • The virus being a US bioweapon
  • The virus being a hoax
  • Bill Gates orchestrating the pandemic to enforce a worldwide vaccine programme

Tribalism means viewing the world as ‘us vs them’, with ‘us’ being superior and ‘them’ being threatening. This perspective is inherent in these coronavirus conspiracy theories.

Many revolve around the demonisation of a particular group (e.g. elites, the establishment, experts, the WHO, China, and so on). True believers view anyone who supports the demonised group as being part of it. And so the tribal divisions persist.

These coronavirus conspiracy theories cast doubt on the public health situation. They promote distrust of expert advice and official organisations. The result is shifts in population behaviour, e.g. people refusing to follow lockdown, wear masks or practise social distancing.

From Facebook to the streets

The situation has become particularly dangerous in the US, with its current protests. Here the role of social media comes under the spotlight.

Private Facebook groups have been key sites for inciting and organising these protests. Some groups are large, such as ‘Michiganders Against Excessive Quarantine’ or ‘Reopen Virginia’ (the latter with over 18,000 members).

Both groups are full of talk of coronavirus conspiracy theories, such as one example from the Michigan group (screenshot: https://twitter.com/willsommer/status/1250838111992647680).

Another example comes from the ‘Reopen Virginia’ group (screenshot: https://twitter.com/jaredlholt/status/1250842215435337728/photo/3). The user called for civil unrest while also demonising the outgroup (‘leftist Democrats’). The post attracted significant engagement, in both comments and likes.

These posts show how belief in tribal coronavirus conspiracy theories can lead to virus scepticism and denial. It can also trigger people to take real-life protest action, which risks turning violent.

Furthermore, it’s not easy to know who is producing these comments. Do they reflect the views of genuine American citizens? Or are some of the comments being astroturfed by those who seek to create social unrest?

Coronavirus conspiracy theories are a problem for other social media platforms too. YouTube hosts thousands of videos discussing all kinds of conspiracy theories in great detail. The platform recently changed its policies in an attempt to crack down on coronavirus conspiracy theories and 5G content. But it’s likely too little, too late.

The trouble is, platform takedowns are viewed as a sign of elite censorship in the minds of people already suspicious of experts and official organisations. This adds even more fuel to the fire of coronavirus conspiracy theories.

Local groups are key targets

Private local Facebook groups are a prime target for influence operations. They have already been identified as key battle sites for the US 2020 election, where influence operatives aim to manipulate the political narratives in key swing states.

Targeting local Facebook groups is an effective way to do this. As well as activity such as voter suppression in these groups, influence operations can also compel populations to protest on the streets.

It’s difficult for researchers and analysts to study private Facebook groups in aggregate, as tools such as CrowdTangle don’t allow access to private groups.

These groups are hotspots for US 2020 manipulation activities. Facebook should monitor them carefully. Its moderators should look out not only for signs of voter suppression, but also for coordinated attempts to incite populations to violence.

We must take coronavirus conspiracy theories seriously

These times of heightened fear offer a prime opportunity for disinformation purveyors to influence the outcome of the US 2020 election.

When political tribalism is so entrenched, fact-checking and counter-disinformation messaging campaigns may be less effective on a large scale. Instead, they risk exacerbating people’s existing suspicions of the establishment and ‘elites’.

Coronavirus conspiracy theories are not trivial. They risk causing harm on a massive scale, by encouraging people to ignore public health advice and by instigating real-life violence.

It’s essential that social media companies take coronavirus conspiracy theories seriously, particularly within private groups. Whether or not they do so may end up as a key deciding factor of the US 2020 election. 


6 Things I’ve Learned About COVID-19 Disinformation

Disinformation thrives on chaos. A global pandemic is about as chaotic as it gets.

For those who seek to spread disinformation, COVID-19 presents a far grander opportunity than either the 2016 US election or the vote on Brexit. The upcoming 2020 US presidential election further fans the flames.

That’s why it’s important to stop and take stock of lessons learned from the front lines of disinformation tracking.

I’ve been studying cross-platform coronavirus narratives for the last month or so. Here are a few of the things I’ve found.

What I’ve learned about COVID-19 disinformation


1. Q is a key player in disinformation efforts

Qanon is a mega conspiracy narrative that encompasses a whole range of smaller ones. Its basic premise is that Donald Trump is in league with a shadowy figure called Q.

Together, Trump and Q fight against a group of elite paedophiles entrenched within the mainstream media and the Democratic Party.

Former presidential candidate Hillary Clinton and current candidate Joe Biden have both been major targets of Q’s accusations.

Every so often, Q releases tantalising nuggets of new information (called ‘Q drops’) for his followers to chew over.

These have sparked a whole ecosystem of pervasive social media content, from Twitter threads to entire YouTube channels.

Q and his followers have leveraged coronavirus disinformation extensively. Q-related themes and activity underpin many of the most widely spread corona conspiracies.

Those include the virus being either a hoax or a bioweapon, 5G causing it, a supposed plan to enforce mandatory vaccinations, and the imminent arrival of martial law.

2. Mainstream media is pushing disinformation narratives

Conservative media sources in the US, such as Fox News, play a significant role in promoting narratives that draw on conspiracies, including coronavirus disinformation. They claim it’s ‘not a big deal’, or ‘just like the flu’, or ‘all a big hoax’.

Although these stories may be less colourful than those of the average Q acolyte, they are still risky.

Coming from established media sources gives these narratives the social proof they need to seem credible in the minds of their audiences.

What’s more, this scenario means less work for those who intend to manipulate public opinion around the coronavirus.

They no longer have to waste time crafting convincing content, but can simply engage with organic content that already exists. And that’s exactly what they’re doing, with a firm eye on the US 2020 election.

3. Coronavirus tribalism is prevalent

Pitting ‘us’ against ‘them’ is at the core of most disinformation, including conspiracy theories. The narratives can take many forms, but always come down to one group (the ingroup) facing off against a predefined opposing group (the outgroup).

For Qanon, it’s Q’s followers who are the ‘enlightened’ ingroup, joining forces with him and Trump to battle the predatory elites. In British politics, we see ‘patriotic’ supporters of Brexit setting themselves against ‘treacherous’ Remainers (and vice versa).

Tribalism even filters down to matters of life and death, such as the coronavirus. On social media, I’ve noticed a recurring adversarial narrative emerging around how best to respond to the pandemic.

One camp downplays the severity of the virus, claiming measures such as the lockdown are an overreaction. The other camp is strongly in favour of lockdown and promotes WHO advice to Stay At Home. Each camp supports their own and attacks the other, often in derogatory and aggressive ways.

It’s dangerous when people are already suspicious of ‘elites’ and experts. They have a tendency to dismiss guidance from governments and public health organisations, which can lead to the flouting of virus mitigation measures. Real world harms can result.

4. Virus fears are being monetized

The chaos and fear of a global pandemic has spawned many opportunities for leveraging the attention economy.

As well as conspiracy theories, there are many examples of people making money via coronavirus disinformation, by tapping into people’s fear, boredom, and increased need for answers.

I’ve identified two main ways of doing this. The first is through creating highly clickable content about the virus.

This content may or may not be factual; it doesn’t matter to the creator, as long as it brings in the clicks.  Content is published on websites festooned with online ads, where each click brings extra ad dollars to the site owner.

The second way is to create content on topics such as ‘miracle cures’, which then feeds into attempts to sell products. Vitamin C is a prime example.

It’s a cynical exploitation of people’s fears about the virus and their need to regain a sense of control.

These ‘miracle cures’ are not scientifically proven. They provide a false sense of security, which may lead to individuals choosing not to self isolate and spreading the virus as a result.

5. Takedowns have a ‘backfire effect’ 

Takedowns are a necessary part of tackling the coronavirus disinformation problem.

However, denying bad actors freedom of reach can also strengthen the impetus behind conspiracy theories by feeding into an existing sense of elite suppression.

Here, conspiracy theorists view the platforms as part of the elite, keeping the ‘truth’ hidden from the people.

Conspiracy theorists are quick to react to takedowns, working them into their coronavirus disinformation narratives.

With 5G, a trend has sprung up of referring to it as ‘5gee’ or similar permutations. This is an attempt to avoid the keyword being picked up by moderators or analysts who are tracking it.

For conspiracy adherents, this sense of persecution further reinforces their existing worldview. It makes them more likely to cling to it. In this way, a ‘backfire effect’ has occurred. 

6. Platform responses are shifting 

Social media companies are frequently accused of not doing enough to reduce the flood of misleading content that overwhelms their platforms.

I don’t think they’re reluctant to do so; rather, they have to balance enforcement against being seen as supportive of free speech.

Finding that balance can be challenging when addressing conspiracy theories, as opposed to purely false information.

Most conspiracy theories are spun up like candy floss around a small kernel of truth.

A typical post will build a whole story around how some real life event is of possible significance to the wider narrative arc.

The difference between opinion and actual false information is not always clear-cut. This creates murky territory for the platforms. 

But things have shifted after some conspiracy theories, such as the one about 5G causing coronavirus, triggered real life harms.

A recent video by notorious conspiracy theorist David Icke was pulled from YouTube just days after it was released, heralding a change in approach.

A growing amount of research indicates that coronavirus conspiracy theories form a central part of coordinated influence operations.  

We can no longer afford to overlook the role of conspiracy theories and disinformation in influence operations.