4 Things I’ve Learned From Analysing Russia-Aligned COVID-19 Coverage

Much social unrest has emerged amid COVID-19, such as anti-lockdown protests, attacks on 5G masts, and violent reactions when asked to wear masks. As I write this, a murky far-right group called ‘UK Freedom Movement’ is organising a new spate of anti-lockdown protests around the UK.

This month I’ve been reviewing Russia-aligned news sites. I’ve been looking for key narratives on COVID-19 and the US election. I’ve examined two types of sites: those directly linked to the Russian state, and those with a similar political stance. Many sites share the same core group of authors.
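
To give a flavour of the mechanics, here’s a minimal sketch (in Python) of the kind of narrative tally this involves. It’s illustrative only: the file layout, the keyword sets, and the idea that simple keyword matching is enough are my assumptions for the example, not the actual review method.

```python
# Hypothetical sketch: tally which key narratives appear across a folder of
# saved article texts. File layout and keyword sets are illustrative assumptions.
from collections import Counter
from pathlib import Path

NARRATIVES = {
    "gates_cabal": ["bill gates", "globalist", "mandatory vaccine"],
    "herd_immunity": ["herd immunity", "lockdown", "sweden"],
    "china_scapegoat": ["bioweapon", "information warfare", "new cold war"],
}

def tally_narratives(articles_dir: str) -> Counter:
    """Count how many saved articles mention each narrative's keywords."""
    counts: Counter = Counter()
    for path in Path(articles_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8").lower()
        for narrative, keywords in NARRATIVES.items():
            if any(keyword in text for keyword in keywords):
                counts[narrative] += 1  # one article can match several narratives
    return counts

print(tally_narratives("articles/"))
```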

Here are some of my findings, related to the current discussions on social unrest, conspiracy theories and the infodemic.

They’re consistent across websites

Topics covered on these sites reflect COVID-19 conspiracy narratives found on social media since the pandemic began. Here are three prime examples.

Bill Gates the ‘criminal globalist’

The Microsoft co-founder features regularly, from the Kremlin-funded news outlet InfoRos to the Russia-aligned news site Fort Russ. Narratives unfold along similar lines.

They claim that Gates is the ‘criminal globalist’ ringleader of a cabal using coronavirus as a smokescreen to impose mandatory tracking and mandatory vaccines.

Justifications for singling out Gates usually cite his prescient 2015 TED talk, in which he highlighted the global risk of a pandemic, or the Gates Foundation’s funding of the WHO.

Herd immunity vs lockdown

Another key narrative centres on the benefits of herd immunity, often juxtaposed against the negatives of lockdown. Sweden is the poster child for herd immunity. Lockdown is presented as a corrupt government-led attempt to remove people’s basic freedoms.

It’s not hard to imagine how this framing could provoke people who value personal freedom above all else, fuelling events like the anti-lockdown protests that have been cropping up across the US and UK.

The smouldering culture war of Trump and Brexit has opened up a new battle line: ‘lockdown vs herd immunity’. As a result, pandemic control efforts are at risk.

Scapegoating China

China is presented as an innocent player in the pandemic. The US, in turn, is accused of waging information warfare against China in order to pin the blame for the coronavirus on it.

In some articles, the authors claim that the pandemic could create a ‘New Cold War’ between the US and China, with severe consequences for the global economy.

Other sites take it even further, claiming that COVID-19 could spark a nuclear war between the US and a newly formed Russia/China alliance.

They claim COVID-19 will reshape the world 

Another popular theme is how the outcome of the 2020 US election, combined with the effects of the coronavirus, will cause the US to lose its hegemony. The result will be a shift towards multilateralism.

Some sites claim coronavirus will cause Western governments to “face a legitimacy crisis like never before”, eventually causing so much chaos that it will reshape the global order.

To reinforce this point, they highlight how the US has failed to protect its people from the coronavirus and so can no longer be called a superpower. Multilateralism is presented as inevitable, given the unprecedented crisis the world now faces.

Anti-imperialism has been a key feature of pro-Russian media for decades. It overlaps with certain far-left lines of thinking, especially among those who critique Western military actions around the world.

They don’t support Trump

“Voters now must choose between Donald Trump, an unstable, incompetent president whose blatant narcissism has been on full display as the nation suffers from coronavirus, and the former vice-president who will diligently represent the rich and govern for their good above all others.”

American Herald Tribune

We often assume that Russia-aligned media is pro-Trump. In fact, many of these news sources criticise Trump as much as Biden. Criticisms of Trump include his poor handling of the pandemic and his ‘imperialist shenanigans’ in foreign policy.

Coverage of Biden often paints him as sleazy, citing the recent Tara Reade case as evidence. Some articles suggest he may have dementia. Framing both candidates as hopeless choices could be a subtle attempt at voter suppression.

They frame themselves as ‘independent’ thinkers

Most of these websites present themselves as bastions of independent thought. They encourage readers to go beyond the mainstream and discover ‘new’ perspectives.

This reflects a common refrain among social media conspiracy theorists, who talk about the need to “do your own research”. In practice, that often translates as “using Google or YouTube to find content that reinforces one’s existing views”.

Pro-Russia news sites tap into this way of thinking. They use it as a defining aspect of their reporting. It’s a message likely to resonate with the exact kind of person who questions everything.

What’s the link to real-life unrest?

Looking at these websites in aggregate, it’s easy to see how their typical narratives link to social unrest during the pandemic.

I’ve noticed the same themes popping up over and over on social media. Ordinary citizens share them in mainstream Facebook groups (e.g. local news and discussion groups).

These ideas have become rooted in public consciousness. They drive a growing sense of distrust in Western governments, particularly in the UK and US, where populations are already polarised. Both countries have handled the pandemic badly, so it’s easier to create scepticism among a fearful population.

If we were to survey the beliefs of anti-lockdown protesters, 5G mast attackers, and those lashing out over masks, I bet we’d find echoes of the same narratives found across these ‘alternative’ news websites, many of them either funded by the Russian government or publishing work from the same core group of authors.

6 Things I’ve Learned From Tracking Coronavirus Disinformation

Disinformation thrives on chaos, and a global pandemic is about as chaotic as it gets. For those who seek to disinform, the coronavirus presents a far grander opportunity than either the 2016 US election or the vote on Brexit. The upcoming 2020 US presidential election further fans the flames.

With that in mind, it’s important to regularly stop and take stock of lessons learned from the front lines of disinformation tracking. I’ve been studying cross-platform coronavirus narratives for the last month or so. Here are a few of the things I’ve found.

1. Q is a major player

Qanon is a mega-conspiracy narrative that encompasses a whole range of smaller ones. Its basic premise has Donald Trump in league with a shadowy figure called Q. Together, Trump and Q are fighting against a group of elite paedophiles entrenched within the mainstream media and the Democratic Party.

Former presidential candidate Hillary Clinton and current candidate Joe Biden have both been major targets of Q’s accusations. Every so often, Q releases tantalising nuggets of new information (called ‘Q drops’) for his followers to chew over. These have sparked a whole ecosystem of pervasive social media content, from Twitter threads to entire YouTube channels.

Q and his followers have leveraged coronavirus disinformation to great effect. Q-related themes and activity underpin many of the most widely spread corona conspiracies, including claims that the virus is a hoax or a bioweapon, that 5G causes it, that mandatory vaccinations are planned, and that military martial law is imminent.

2. Mainstream media is pushing conspiracy narratives

Conservative media sources in the US, such as Fox News, play a significant role in promoting narratives that draw on conspiracies, including around the coronavirus. They claim it’s ‘not a big deal’, that it’s ‘just like the flu’, or that ‘it’s all a big hoax’.

Although these stories may be less colourful than those of the average Q acolyte, they are still risky. Coming from established media sources gives the narratives the social proof they need to seem credible in the minds of their audiences.

What’s more, this scenario means less work for those who intend to manipulate public opinion around the coronavirus. They no longer have to waste time crafting convincing content, but can simply engage with organic content that already exists. And that’s exactly what they’re doing, with a firm eye on the US 2020 election.

3. Coronavirus tribalism is prevalent

Pitting ‘us’ against ‘them’ is at the core of most disinformation, including conspiracy theories. The narratives can take many forms, but always come down to one group (the ingroup) facing off against a predefined opposing group (the outgroup).

For Qanon, it’s Q’s followers who are the ‘enlightened’ ingroup, joining forces with him and Trump to battle the predatory elites. In British politics, we see ‘patriotic’ supporters of Brexit setting themselves against ‘treacherous’ Remainers (and vice versa).

Tribalism even filters down to matters of life and death, such as how best to respond to the pandemic. On social media, I’ve noticed a recurring adversarial narrative emerging around the response. One camp downplays the severity of the virus, claiming measures such as lockdown are an overreaction; the other camp strongly favours lockdown and promotes WHO advice to Stay At Home. Each camp supports its own and attacks the other, often in derogatory and aggressive ways.

When people are already suspicious of ‘elites’ and experts, there’s a real tendency to dismiss guidance from governments and public health organisations, which can lead to the flouting of virus mitigation measures. Real world harms can result.

4. Virus fears are being monetised 

The chaos and fear of a global pandemic have spawned many opportunities for exploiting the attention economy. Beyond conspiracy theories, there are many examples of people making money by tapping into fear, confinement, and the heightened search for answers.

I’ve identified two main ways of doing this. The first is creating highly clickable content about the virus. This content may or may not be factual; that doesn’t matter to the creator, as long as it brings in the clicks. The content is published on websites festooned with online ads, where each click earns extra ad dollars for the site owner.

The second way is to create content on topics such as ‘miracle cures’, which then feeds into attempts to sell products. Vitamin C is a prime example. It’s a cynical exploitation of people’s fearfulness about the virus and their need to somehow regain a sense of control.

These ‘miracle cures’ are not scientifically proven. They provide a false sense of security, which may lead individuals to skip self-isolation and spread the virus as a result.

5. Takedowns have a ‘backfire effect’ 

Takedowns are a necessary part of tackling the disinformation problem, denying bad actors freedom of reach. But they can also strengthen the impetus behind conspiracy theories by feeding an existing sense of elite suppression. Here, the platforms are viewed as part of the elite, working together to keep the ‘truth’ hidden from the people.

Conspiracy theorists are quick to react to takedowns by working them into their narratives. With 5G, a trend has sprung up of writing ‘5gee’ or similar permutations, in an attempt to stop the keyword being picked up by the moderators and analysts tracking it.
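
From the analyst’s side, the practical counter is to track the permutations rather than the bare keyword. Here’s a minimal sketch of that idea in Python; the variant pattern is an illustrative guess, not a list drawn from real moderation tooling.

```python
import re

# Match '5G' plus common evasive spellings ('5gee', '5 gee', 'five gee', ...).
# The variant set is illustrative, not an exhaustive real-world list.
FIVE_G_VARIANTS = re.compile(r"\b(?:5\s*g(?:ee)?|five\s*gee?)\b", re.IGNORECASE)

def mentions_5g(text: str) -> bool:
    """Return True if the text contains any known 5G spelling variant."""
    return bool(FIVE_G_VARIANTS.search(text))

posts = [
    "The 5gee masts are the real cause!",
    "Lovely weather today.",
    "Ask yourself who benefits from five gee.",
]
print([post for post in posts if mentions_5g(post)])  # flags the first and third
```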

For conspiracy adherents, this sense of persecution further reinforces their existing worldview, making them more likely to cling to it. In this way, takedowns can produce a ‘backfire effect’.

6. Platform responses are shifting 

Social media companies are frequently accused of not doing enough to reduce the flood of misleading content that overwhelms their platforms. I don’t believe they’re reluctant to do so, but they have to balance it with being seen as supportive of free speech. Finding that balance can be challenging when addressing conspiracy theories, as opposed to purely false information.

Most conspiracy theories are spun up like candy floss around a small kernel of truth. A typical post builds a whole story around how some real-life event might be significant to the wider narrative arc. This creates murky territory for the platforms, because the line between opinion and outright false information is not always clear-cut.

But things have shifted after some conspiracy theories, such as the claim that 5G causes the coronavirus, triggered real-life harms. A recent video by notorious conspiracy theorist David Icke was pulled from YouTube just days after it was released, heralding a change in approach.

A growing body of research indicates that coronavirus conspiracy theories form a central part of coordinated influence operations. We can no longer afford to overlook the role they play in those operations.

Disinformation, Dopamine, And How To Quit Feeding The Trolls

I used to spend a lot of time arguing with strangers on the Internet.

It normally went like this: I would post something political, the trolls would arrive, and I’d get stuck in endless back-and-forths with them. It was a futile endeavour. But somehow I felt compelled to keep on engaging.

On Twitter or Facebook, new notifications would keep coming in, alerting me to yet another comment. Then I’d engage again. It wasted my time, raised my anxiety levels, and made me feel more aggressive.

I’d gotten stuck in a dopamine loop.

This happens to us a lot, for example when we can’t stop refreshing our social media feeds. But what exactly is dopamine and what does it do?

How dopamine creates addiction

Dopamine is a chemical messenger produced in the brain. It affects many functions, such as thinking, mood, attention and motivation. It also plays a critical role in triggering our desire to seek: in short, dopamine makes us want to search out information.

Two parts are at play in the dopamine loop. There’s the ‘wanting’ part and the ‘liking’ part. The wanting part makes you take action (i.e. searching), while the liking part satisfies you and makes you stop searching. But the wanting part is stronger than the liking part. And that’s how we get trapped in endless scroll mode.

Another important part of the dopamine system relates directly to the issue of engagement with trolling comments.

The system is very sensitive to cues that indicate the possibility of a reward. On social media, notifications are those cues, and they make the addiction even stronger. Anticipation is key: if we got a reward every time we logged in, the pull would weaken. It’s the uncertainty that gets us hooked.
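
This is what behavioural psychologists call a variable-ratio reward schedule, and a toy simulation makes the contrast concrete. The probabilities below are arbitrary assumptions, not real platform figures.

```python
import random

def simulate_checks(n_checks: int, reward_probability: float, seed: int = 1) -> list:
    """Simulate repeated feed checks, where each check yields a 'reward'
    (a new like, reply, or follower) with the given probability."""
    rng = random.Random(seed)
    return [rng.random() < reward_probability for _ in range(n_checks)]

# Fixed schedule: every check pays off. Predictable, so it quickly loses its pull.
print(simulate_checks(10, reward_probability=1.0))
# Variable schedule: only some checks pay off, and you can't tell which in advance.
# That unpredictability is what drives the compulsive re-checking described above.
print(simulate_checks(10, reward_probability=0.3))
```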

So how did these behavioural psychology elements get incorporated into our Internet experience in the first place?

Engineered addiction and trolling

It all started with a researcher called BJ Fogg and his Persuasive Technology Lab at Stanford University. In September 2007, Fogg and his students began work on a formula that would change the world — and all of our daily lives.

They built Facebook apps using techniques from behavioural psychology, techniques that could engineer addiction, such as the dopamine loop that I described above. Fogg developed a formula, suggesting that people will act when three forces converge: motivation, trigger, and ability.

Let’s apply this formula to a hostile online interaction. The motivation is your desire to convince the other person that your opinion is right, or to get ‘one up’ on them; the trigger is seeing another notification on your social media app, indicating that your opponent has responded; and the ability is having your phone to hand, which lets you check right away and respond.
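
Read as a sketch, the formula says: no behaviour without a trigger, and motivation and ability must clear some threshold together. The multiplicative reading and the threshold value below are illustrative assumptions on my part, not Fogg’s published numbers.

```python
from dataclasses import dataclass

@dataclass
class Moment:
    motivation: float  # 0-1: how badly you want to reply and 'win'
    ability: float     # 0-1: how easy replying is (phone in hand, one tap away)
    trigger: bool      # a notification has just arrived

ACTION_THRESHOLD = 0.5  # illustrative: where the 'action line' sits is assumed

def will_engage(moment: Moment) -> bool:
    """Fogg-style check: behaviour fires when a trigger arrives while
    motivation x ability is above the action line."""
    return moment.trigger and moment.motivation * moment.ability > ACTION_THRESHOLD

# A notification lands while you're annoyed and your phone is in your hand:
print(will_engage(Moment(motivation=0.9, ability=0.8, trigger=True)))   # True
# Same mood, but notifications are switched off, so no trigger:
print(will_engage(Moment(motivation=0.9, ability=0.8, trigger=False)))  # False
# Phone in another room: ability drops and the product falls below the line:
print(will_engage(Moment(motivation=0.9, ability=0.3, trigger=True)))   # False
```

Notice that the practical fixes discussed later, such as switching off notifications or making the phone less appealing, attack the trigger and ability terms rather than motivation.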

These techniques tap directly into the workings of the human brain, making them extremely powerful.

So why do social media platforms leverage such addictive and manipulative design practices?

Simple: their profitability depends on it.

The ad-supported business model means that the more time users spend on the app, the more profit the company makes. All that time you spend arguing with trolls on Facebook makes the platform more valuable to potential advertisers.
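
A back-of-envelope calculation shows the incentive. Every number here is made up for illustration; real ad loads and CPM rates vary widely.

```python
# Illustrative only: more minutes on the feed -> more ad impressions -> more revenue.
minutes_per_day = 45            # assumed time a user spends in-app daily
impressions_per_minute = 4      # assumed ad load
cpm_dollars = 2.50              # assumed revenue per 1,000 impressions
daily_revenue = minutes_per_day * impressions_per_minute * cpm_dollars / 1000
print(f"${daily_revenue:.2f} per user per day")  # $0.45 under these assumptions
```

Multiply that by hundreds of millions of users and the design incentive is obvious: every extra minute spent arguing is revenue.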

Dopamine and disinformation

Arguing online also affects our susceptibility to disinformation. The dopamine loop gives a powerful tool to those who seek to divide us. It perpetuates a sense of an adversarial environment and keeps us always on the attack.

When we divide ourselves into tribes and adopt a hostile attitude towards a perceived outgroup, we risk becoming more susceptible to harmful disinformation. We are more likely to share content that is cognitively resonant with our existing beliefs and which reflects the attitudes of our tribe.

The dopamine loop also affects our interactions with agreeable content. When we post something that our tribe members approve of, we’ll receive dopamine hits via our notifications of likes and encouraging comments. That boosts our self-esteem and keeps us coming back for more.

So what can we do to fix the trolling problem and reduce our susceptibility to disinformation?

Breaking out of the loop

Short-term practical solutions mainly involve adapting our devices to help break the dopamine loop. For example, we could make our phones less appealing by changing the screen to grayscale mode, or switching off all notifications.

But we can also tackle adversarial online behaviour in another way.

‘Don’t feed the trolls’ has become an Internet cliché. But it starts to make sense when thinking about the dopamine loop and anticipatory reward.

Everyone who posts online is looking for a response; they want to perpetuate the dopamine loop. If we can maintain our self-control and deny them that response, we can break the cycle.

I’ve managed to break my own addiction to engaging with trolls. On Twitter, I simply mute the conversation. I can’t see the person anymore, and I receive no notifications of any of their comments. It makes the troll invisible to me, breaking the dopamine loop and allowing my brain to focus on other things.

On Facebook, I simply turn off notifications for a particular post. This has the same effect as muting on Twitter. Both platforms also offer a blocking option. I don’t normally use this because it gives trolls the satisfaction of knowing they’ve got a response. Muting is better, because it means they’re left wondering if I’m ignoring them. They just keep on yelling into the void.

Battleground or community

If we could all break the cycle and quit feeding the trolls, then adversarial disinformation and influence ops could lose much of their power. The online environment would feel like more of a community instead of a battleground. In turn, this may help reduce polarisation.

But it has to be done en masse. A handful of people breaking the cycle won’t be enough to change the overall environment. As social media is designed to be addictive, a wider intervention would be necessary to encourage people to do this.

Of course, the social media platforms have all the power. They could redesign their structures to destroy these dopamine loops and disincentivise disinformation. But their ad driven business model means they don’t have an incentive to do so.

Nevertheless, we can still improve our individual online experiences by taking steps to break the dopamine loop and reduce our susceptibility to disinformation. Part of that is disengaging from users who aim to trap us in never-ending adversarial debates over polarising topics.

Analysing ‘Bleachgate’ Responses in Pro-Trump Facebook Groups

Much of the world was shocked this week as Donald Trump claimed injecting disinfectant into the body could be an ‘interesting’ way to cure COVID-19. He later tried to back-pedal, claiming he was being sarcastic. But that wasn’t how most of the world took it.

The dangerous comments were widely lambasted across the mainstream media and among much of the ordinary public. Such was the furore over Trump’s remarks that a major disinfectant firm even issued a statement urging the public not to inject or drink any of their products.

But members of Facebook groups dedicated to conspiracy theories displayed quite the opposite reaction. 

I examined some of these groups to provide comment for an article in CodaStory. I’d previously compiled this list of groups because of their strong focus on various ‘corona conspiracies’.

These include 5G causing the virus, the virus being a US bioweapon, and Bill Gates having orchestrated the ‘virus hoax’ in his ambition to enforce a worldwide vaccine programme. Many of the groups also centre on the Qanon conspiracy theory.

You might expect the suggestion of injecting bleach to be a step too far even for these largely pro-Trump groups. Not so. 

In my initial observation of the groups, I noticed three distinct ways in which the members attempted to account for Trump’s bizarre statement.

First, that Trump was just ‘playing the media’, and that anyone who believes he meant what he said must be stupid. Commenters also attributed the negative media coverage to ‘yet another’ mainstream media (MSM), liberal, or Democrat attempt to smear Trump.

Second, some commenters claimed Trump had been quoted ‘out of context’. According to them, he was speaking ‘more generally’ about possible ways to treat COVID-19.

Some highlighted a fact-check article from the far-right news outlet Breitbart. But nowhere did anyone acknowledge that Trump had been videoed making these claims for everyone to see and hear.

The third claim relates more closely to the other COVID-19 ‘miracle cures’. One commenter asserted that Trump must have been referring to UV light therapy and ozone therapy, which already exist.

Things got more interesting when the commenter linked the injecting bleach comments to the popular ‘Vitamin C as miracle cure’ narrative.

They claimed that taking Vitamin C causes hydrogen peroxide to build up in the body. Since hydrogen peroxide has a disinfectant effect, the argument went, Trump’s comments actually have a basis in medical fact.

These three counter-narratives about Trump’s comments all attempt to rationalise what would normally be seen as an influential figure making a dangerous and irresponsible remark.

Rationalisations like these are rooted in tribal attitudes: take, for example, the claims that Trump’s comments were purposefully misinterpreted as part of a ‘libs’ or ‘Dems’ smear attack. Once again, this reinforces the existing divide between populist pro-Trump narratives and the mainstream.

The question remains: How many of these Facebook group members are genuine American citizens? Facebook itself is the only entity that could properly attribute the accounts. And it doesn’t seem to be giving much away.

I suspect group members are a mix of genuine Trump supporters and astroturfers working to stir up tribal hatred of the ‘other side’.

Tribal attitudes can be dangerous, particularly in relation to public health. People in the pro-Trump tribe are more likely to reject messages from the perceived ‘outgroup’ (‘experts’ and the ‘MSM’), including critical public health advice from the WHO.

A similar dynamic has fuelled recent anti-lockdown protests across the US, which may already have spread the virus further and compromised the entire country. Astroturfing was certainly a factor there; there’s no reason why it couldn’t be influencing these groups too.