Disinformation thrives on chaos, and a global pandemic is about as chaotic as it gets. For those who seek to disinform, the coronavirus presents a far grander opportunity than either the 2016 US election or the vote on Brexit. The upcoming 2020 US presidential election further fans the flames.
With that in mind, it’s important to regularly stop and take stock of lessons learned from the front lines of disinformation tracking. I’ve been studying cross-platform coronavirus narratives for the last month or so. Here are a few of the things I’ve found.
1. Q is a major player
Qanon is a mega conspiracy narrative that encompasses a whole range of smaller ones. The basic premise of Qanon has Donald Trump in league with a shadowy figure called Q. Together, Trump and Q are fighting against a group of elite paedophiles entrenched within the mainstream media and the Democratic Party.
Former presidential candidate Hillary Clinton and current candidate Joe Biden have both been major targets of Q’s accusations. Every so often, Q releases tantalising nuggets of new information (called ‘Q drops’) for his followers to chew over. These have sparked a whole ecosystem of pervasive social media content, from Twitter threads to entire YouTube channels.
Q and his followers have leveraged coronavirus disinformation to great effect. Q-related themes and activity underpin many of the most widely spread coronavirus conspiracies, including claims that the virus is a hoax or a bioweapon, that 5G causes it, that mandatory vaccinations are planned, and that military martial law is imminent.
2. Mainstream media is pushing conspiracy narratives
Conservative media sources in the US, such as Fox News, play a significant role in promoting narratives that draw on conspiracies, including around the coronavirus. They claim it’s ‘not a big deal’, or it’s ‘just like the flu’, or, ‘it’s all a big hoax’.
Although these stories may be less colourful than those of the average Q acolyte, they are still risky. Their provenance in established media sources provides the social proof needed to make the narratives more credible in the minds of their audiences.
What’s more, this scenario means less work for those who intend to manipulate public opinion around the coronavirus. They no longer have to waste time crafting convincing content, but can simply engage with organic content that already exists. And that’s exactly what they’re doing, with a firm eye on the US 2020 election.
3. Coronavirus tribalism is prevalent
Pitting ‘us’ against ‘them’ is at the core of most disinformation, including conspiracy theories. The narratives can take many forms, but always come down to one group (the ingroup) facing off against a predefined opposing group (the outgroup).
For Qanon, it’s Q’s followers who are the ‘enlightened’ ingroup, joining forces with him and Trump to battle the predatory elites. In British politics, we see ‘patriotic’ supporters of Brexit setting themselves against ‘treacherous’ Remainers (and vice versa).
Tribalism even filters down to matters of life and death, such as the coronavirus. On social media, I’ve noticed a recurring adversarial narrative emerging around how best to respond to the pandemic. One camp downplays the severity of the virus, claiming measures such as the lockdown are an overreaction, while the other camp is strongly in favour of lockdown and promotes WHO advice to Stay At Home. Each camp supports its own and attacks the other, often in derogatory and aggressive ways.
When people are already suspicious of ‘elites’ and experts, there’s a real tendency to dismiss guidance from governments and public health organisations, which can lead to the flouting of virus mitigation measures. Real world harms can result.
4. Virus fears are being monetised
The chaos and fear of a global pandemic have spawned many opportunities for leveraging the attention economy. In addition to conspiracy theories, there are many examples of people making money by tapping into the fear, confinement, and increased search for answers.
I’ve identified two main ways of doing this. The first is through creating highly clickable content about the virus. This content may or may not be factual; it doesn’t matter to the creator, as long as it brings in the clicks. The content is published on websites festooned with online ads, where each click brings extra ad dollars to the site owner.
The second way is to create content on topics such as ‘miracle cures’, which then feeds into attempts to sell products. Vitamin C is a prime example. It’s a cynical exploitation of people’s fearfulness about the virus and their need to somehow regain a sense of control.
These ‘miracle cures’ are not scientifically proven. They provide a false sense of security, which may lead individuals to forgo self-isolation and spread the virus as a result.
5. Takedowns have a ‘backfire effect’
Takedowns are a necessary part of tackling the disinformation problem, denying bad actors freedom of reach. But they can also strengthen the impetus behind conspiracy theories by feeding into an existing sense of elite suppression. Here, the platforms are viewed as part of the elite, working together to keep the ‘truth’ hidden from the people.
Conspiracy theorists are quick to react to takedowns by working them into their narratives. With 5G, a trend has sprung up of referring to it as ‘5gee’ or similar permutations, in an attempt to stop the keyword being picked up by moderators or analysts who are tracking it.
For conspiracy adherents, this sense of persecution further reinforces their existing worldview, making them more likely to cling onto it. In this way, a ‘backfire effect’ has occurred.
6. Platform responses are shifting
Social media companies are frequently accused of not doing enough to stem the flood of misleading content that overwhelms their platforms. I don’t believe they’re reluctant to act, but they have to balance intervention with being seen as supportive of free speech. Finding that balance is especially challenging with conspiracy theories, as opposed to purely false information.
Most conspiracy theories are spun up like candy floss around a small kernel of truth. A typical post will build a whole story around how some real life event is of possible significance to the wider narrative arc. This creates murky territory for the platforms because the difference between opinion and actual false information is not always clear-cut.
But things have shifted after some conspiracy theories, such as the one about 5G causing coronavirus, triggered real life harms. A recent video by notorious conspiracy theorist David Icke was pulled from YouTube just days after it was released, heralding a change in approach.
A growing amount of research indicates that coronavirus conspiracy theories form a central part of coordinated influence operations. We can no longer afford to overlook the role of conspiracy theories in influence operations.