Disinformation thrives on chaos. A global pandemic is about as chaotic as it gets.
For those who seek to spread disinformation, COVID-19 presents a far grander opportunity than either the 2016 US election or the vote on Brexit. The upcoming 2020 US presidential election further fans the flames.
That’s why it’s important to stop and take stock of lessons learned from the front lines of disinformation tracking.
I’ve been studying cross-platform coronavirus narratives for the last month or so. Here are a few of the things I’ve found.
What I’ve learned about COVID-19 disinformation
1. Q is a key player in disinformation efforts
Qanon is a mega-conspiracy narrative that encompasses a whole range of smaller ones. Its basic premise is that Donald Trump is in league with a shadowy figure called Q.
Together, Trump and Q fight against a group of elite paedophiles entrenched within the mainstream media and the Democratic Party.
Former presidential candidate Hillary Clinton and current candidate Joe Biden have both been major targets of Q’s accusations.
Every so often, Q releases tantalising nuggets of new information (called ‘Q drops’) for his followers to chew over.
These have sparked a whole ecosystem of pervasive social media content, from Twitter threads to entire YouTube channels.
Q and his followers have leveraged coronavirus disinformation to great effect. Q-related themes and activity underpin many of the most widely spread coronavirus conspiracies.
Those include claims that coronavirus is either a hoax or a bioweapon, that 5G causes the virus, that there is a plan to enforce mandatory vaccinations, and that martial law is imminent.
2. Mainstream media is pushing disinformation narratives
Conservative media sources in the US, such as Fox News, play a significant role in promoting narratives that draw on conspiracies, including coronavirus disinformation. They claim the virus is ‘not a big deal’, that it’s ‘just like the flu’, or that ‘it’s all a big hoax’.
Although these stories may be less colourful than those of the average Q acolyte, they are still risky.
Coverage by established media sources provides the social proof needed to make these narratives more credible in the minds of their audiences.
What’s more, this scenario means less work for those who intend to manipulate public opinion around the coronavirus.
They no longer have to waste time crafting convincing content, but can simply engage with organic content that already exists. And that’s exactly what they’re doing, with a firm eye on the US 2020 election.
3. Coronavirus tribalism is prevalent
Pitting ‘us’ against ‘them’ is at the core of most disinformation, including conspiracy theories. The narratives can take many forms, but always come down to one group (the ingroup) facing off against a predefined opposing group (the outgroup).
For Qanon, it’s Q’s followers who are the ‘enlightened’ ingroup, joining forces with him and Trump to battle the predatory elites. In British politics, we see ‘patriotic’ supporters of Brexit setting themselves against ‘treacherous’ Remainers (and vice versa).
Tribalism even filters down to matters of life or death, i.e. the coronavirus. On social media, I’ve noticed a recurring adversarial narrative emerging around how best to respond to the pandemic.
One camp downplays the severity of the virus, claiming measures such as the lockdown are an overreaction. The other camp is strongly in favour of lockdown and promotes WHO advice to Stay At Home. Each camp supports their own and attacks the other, often in derogatory and aggressive ways.
It’s dangerous when people are already suspicious of ‘elites’ and experts. They tend to dismiss guidance from governments and public health organisations, which can lead to the flouting of virus mitigation measures. Real-world harms can result.
4. Virus fears are being monetised
The chaos and fear of a global pandemic have spawned many opportunities for exploiting the attention economy.
As well as conspiracy theories, there are many examples of people making money via coronavirus disinformation, by tapping into people’s fear, boredom, and increased need for answers.
I’ve identified two main ways of doing this. The first is through creating highly clickable content about the virus.
This content may or may not be factual; it doesn’t matter to the creator, as long as it brings in the clicks. Content is published on websites festooned with online ads, where each click brings extra ad dollars to the site owner.
The second way is to create content on topics such as ‘miracle cures’, which then feeds into attempts to sell products. Vitamin C is a prime example.
It’s a cynical exploitation of people’s fears about the virus and their need to regain a sense of control.
These ‘miracle cures’ are not scientifically proven. They provide a false sense of security, which may lead to individuals choosing not to self-isolate and spreading the virus as a result.
5. Takedowns have a ‘backfire effect’
Takedowns are a necessary part of tackling the coronavirus disinformation problem.
However, denying bad actors freedom of reach can also strengthen the impetus behind conspiracy theories by feeding into an existing sense of elite suppression.
Here, conspiracy theorists view the platforms as part of the elite, keeping the ‘truth’ hidden from the people.
Conspiracy theorists are quick to react to takedowns, working them into their coronavirus disinformation narratives.
With 5G, a trend has sprung up of referring to it as ‘5gee’ or similar permutations. This is an attempt to avoid the keyword being picked up by moderators or analysts who are tracking it.
For conspiracy adherents, this sense of persecution further reinforces their existing worldview. It makes them more likely to cling to it. In this way, a ‘backfire effect’ has occurred.
6. Platform responses are shifting
Social media companies are frequently accused of not doing enough to reduce the flood of misleading content that overwhelms their platforms.
I don’t think they’re reluctant to act, but they have to balance removing content against being seen as supportive of free speech.
Finding that balance can be challenging when addressing conspiracy theories, as opposed to purely false information.
Most conspiracy theories are spun up like candy floss around a small kernel of truth.
A typical post will build a whole story around how some real life event is of possible significance to the wider narrative arc.
The difference between opinion and actual false information is not always clear-cut. This creates murky territory for the platforms.
But things have shifted after some conspiracy theories, such as the claim that 5G causes coronavirus, triggered real-world harms.
A recent video by notorious conspiracy theorist David Icke was pulled from YouTube just days after it was released, heralding a change in approach.
A growing amount of research indicates that coronavirus conspiracy theories form a central part of coordinated influence operations.
We can no longer afford to overlook the role of conspiracy theories and disinformation in influence operations.