In 2020, World Health Organization Director-General Tedros Adhanom Ghebreyesus sounded the alarm about scientific misinformation circulating online. “We’re not just battling the virus,” he said in a February 8, 2020, media briefing. “We’re also battling the trolls and conspiracy theorists that push misinformation and undermine the outbreak response.” A week later, in a speech at the Munich Security Conference, Tedros put it in terms that went viral on Twitter: “We’re not just fighting an epidemic, we’re fighting an infodemic.”
Although Tedros was speaking nearly a year before COVID-19 vaccines would become available to the general public, his comments were prescient. The online anti-vaccine movement is currently one of the largest threats to global health, especially in the face of the lingering pandemic. Anti-vaccine supporters share incorrect and misinformed views about COVID-19 vaccines and other vaccines, claiming they are harmful, cause autism, or can be used as tools of population control. The movement was already lively and growing before the pandemic, but it has gained further momentum since vaccines became the most promising tools for controlling COVID-19.
At this historic moment, many people are hesitant about vaccines. It is therefore of utmost importance to ensure they can access trustworthy sources of information without being misled by the unfounded anti-vaccine discourse and rhetoric they often encounter on social media. Specific strategies to tackle the anti-vaccine infodemic are lacking, and new approaches need to be identified, especially with regard to social media.
In a study published in PLOS ONE last year, we analyzed the behavior of anti-vaccine supporters on Twitter. We found that the anti-vaccine community is made up of many profiles that share content produced by a few influencers with large numbers of followers. Results show that, before his Twitter profile was suspended, former US President Donald Trump was the main anti-vaccine influencer: while he did not share anti-vaccine content himself, his tweets were widely shared by the anti-vaccine community. Other influencers include his son Donald Trump Jr. and public figures supportive of Trump’s presidency, including actor and producer James Woods and conservative activist Charlie Kirk.
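To make the idea concrete, the short sketch below ranks accounts in a toy retweet network by how widely their tweets are reshared. It is not the pipeline of our PLOS ONE analysis; the edge list, the account names, and the use of PageRank as an importance measure are purely illustrative assumptions.

```python
# Minimal sketch (not the study's actual pipeline): rank accounts by how
# widely their content is reshared within a hypothetical retweet network.
import networkx as nx

# Directed edges: (retweeter, original_author) -- "retweeter shared author's tweet".
retweets = [
    ("user_a", "influencer_1"), ("user_b", "influencer_1"),
    ("user_c", "influencer_1"), ("user_c", "influencer_2"),
    ("user_d", "influencer_2"), ("user_e", "user_a"),
]

G = nx.DiGraph()
G.add_edges_from(retweets)

# In-degree = number of accounts that retweeted each author;
# PageRank gives a network-wide importance score.
in_degree = dict(G.in_degree())
pagerank = nx.pagerank(G)

for account in sorted(pagerank, key=pagerank.get, reverse=True):
    print(f"{account}: retweeted by {in_degree.get(account, 0)} accounts, "
          f"PageRank {pagerank[account]:.3f}")
```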
Influencers can also play a positive role in fighting misinformation. Given the reach of actors, athletes, and other celebrities, health organizations such as the WHO should consider lobbying highly followed influencers to share positive messages about vaccines, highlighting their efficacy and safety when these attributes have been demonstrated in clinical trials and supported by pharmacovigilance monitoring, as they have been in the case of several COVID-19 vaccines.
Following Trump’s suspension from Twitter, it may be tempting to think that censoring anti-vaccine supporters would be another viable strategy to halt the infodemic. However, individuals, organizations, or corporations may take advantage of censorship to silence dissidents or make profits. Social media companies in particular, because they are private corporations with their own commercial interests, should not become the arbiters of free speech.
Short of banning spreaders of misinformation, some have proposed that we rely on artificial intelligence to stem the tide of false news. Unfortunately, developing algorithms that can precisely spot and weed out misinformation may take a long time, or may even prove impossible. We therefore propose that training consumers’ skills and improving their understanding of digital information become an absolute priority.
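As a rough illustration of what such an approach involves, and of why it falls short, consider the toy classifier below: a bag-of-words model trained on a handful of invented examples. It does not reflect any deployed system; it only shows that a purely lexical detector has little to go on once a claim is reworded.

```python
# Toy illustration of an automated misinformation detector (TF-IDF features +
# logistic regression). The labeled examples are invented for this sketch.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Vaccines underwent randomized controlled trials before approval.",
    "Health agencies continuously monitor vaccine safety reports.",
    "Vaccines contain microchips for population control.",
    "The pandemic is a hoax staged by ruling elites.",
]
labels = [0, 0, 1, 1]  # 0 = reliable, 1 = misinformation (toy labels)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# A reworded claim that shares almost no vocabulary with the training examples
# gives a purely lexical model little basis for a decision.
print(model.predict(["Tiny tracking devices are hidden inside the shots."]))
```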
One possible intervention in this vein—albeit one that would raise ethical issues that need to be discussed first—is the introduction of basic tests of critical thinking skills for social media users who wish to share medically sensitive information. There is debate among researchers about whether education level plays a relevant role in shaping vaccination views, but what seems to matter most is the underlying ability to critically evaluate information, independent of knowledge in specific domains. If social media users cannot reach a minimum score on a critical thinking test, appropriate disclaimers could be added below their tweets and posts. Stronger measures could partially or completely limit the reach or visibility of posts coming from users who failed the test.
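A minimal sketch of what such a gating rule might look like follows; the score scale, thresholds, and actions are hypothetical placeholders chosen for illustration, not a worked-out policy.

```python
# Hypothetical gating rule for medically sensitive posts: all values below
# are illustrative assumptions, not a proposed standard.
def moderation_action(critical_thinking_score: float,
                      pass_threshold: float = 0.6,
                      reach_limit_threshold: float = 0.4) -> str:
    """Decide how to handle a medically sensitive post from a given user."""
    if critical_thinking_score >= pass_threshold:
        return "publish normally"
    if critical_thinking_score >= reach_limit_threshold:
        return "publish with disclaimer attached"
    return "publish with disclaimer and reduced reach"

for score in (0.8, 0.5, 0.2):
    print(f"score {score:.1f} -> {moderation_action(score)}")
```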
It is important to underline that such interventions constitute short-term solutions to end the acute phase of the anti-vaccine infodemic on social media. A long-term strategy is also needed. For this, we need to give all users of these platforms tools to understand and properly contextualize the credibility of information they are exposed to, enabling an easier and more efficient distinction between what is fact and what is propaganda. In fact, as a recent study suggested, prompting users to consider the accuracy of information online could help counter misinformation.
We also envision the development of simple, long-term interventions to help members of the general public strengthen their critical thinking skills. Such skills include understanding the concepts of causation versus correlation, source credibility, statistical significance, experimental controls, replication, and reproducibility. Interventions to teach these skills could come in the form of online games, given their value as indirect teaching tools. For example, games in which users are asked to produce fake news themselves have been shown to be effective in training individuals to distinguish online misinformation from trusted knowledge.
These strategies could also be effective in countering misinformation related to other conspiracy theories, not only anti-vaccine discourse. In fact, data show that anti-vaccine supporters often believe in several other conspiracy theories. They may believe that COVID-19 is a hoax, or that the pandemic was planned by covert ruling elites. Vaccines fit into this broader conspiratorial picture because they are sometimes imagined to be agents of population and/or mind control. For many in the anti-vaccine community, COVID-19 serves as a massive cover-up that allows global vaccination campaigns to take place and ultimately reduce or stupefy the world’s population.
The tendency to believe in several conspiracy theories is likely due to how people come across various types of information on social media. For instance, if a person believes the 2020 US presidential election was rigged and lingers on or clicks posts supporting this belief, then they will likely also be exposed to anti-vaccine views, thanks to social media platforms’ polarization-reinforcing algorithms. This creates two diametrically opposed communities—one supportive of vaccines and one opposing their use—with little in common, and therefore no room for discussion. Social media platforms should ideally act to prevent or lessen social polarization by facilitating communication between groups with opposing views.
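A toy simulation can illustrate the dynamic (it is not a description of any platform’s actual ranking system): if a feed simply ranks posts by how closely they match a user’s current stance, the user keeps seeing like-minded content and that stance is rarely challenged.

```python
# Toy simulation: rank posts by closeness to the user's current stance and let
# consumed content nudge the stance toward the feed average. All numbers are
# invented; no real platform's algorithm is being described.
import random

random.seed(0)
belief = 0.7                                          # user's stance on a -1..1 axis
posts = [random.uniform(-1, 1) for _ in range(200)]   # slant of each candidate post

for step in range(5):
    # "Engagement" proxy: posts whose slant is closest to the current stance rank highest.
    feed = sorted(posts, key=lambda slant: abs(slant - belief))[:5]
    # Consuming like-minded content pulls the stance slightly toward the feed average.
    belief = 0.9 * belief + 0.1 * (sum(feed) / len(feed))
    print(f"step {step}: stance = {belief:+.2f}, feed slants = "
          + ", ".join(f"{s:+.2f}" for s in feed))
```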
Furthermore, according to the results of our PLOS ONE study, anti-vaccine supporters tend to use emotional language in their tweets, including expressions of rage, sadness, fear, or joy. In contrast, pro-vaccine individuals tend to use sterile, impersonal language when tweeting. This latter cohort includes healthcare organizations, which would probably benefit from a change in communication strategy: they should use language that is relatable to people’s experiences while remaining rigorous from a scientific point of view.
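The rough sketch below shows the kind of lexicon-based emotion counting this comparison points to; it is not the method used in our PLOS ONE study, and the word lists and example tweets are invented for illustration.

```python
# Rough lexicon-based emotion profile of a tweet; word lists and examples
# are illustrative only, not the study's actual lexicon or data.
EMOTION_WORDS = {
    "anger": {"outrage", "furious", "poison", "scandal"},
    "fear": {"dangerous", "deadly", "terrified", "risk"},
    "joy": {"grateful", "relieved", "hope", "protected"},
    "sadness": {"loss", "grief", "mourn", "suffering"},
}

def emotion_profile(tweet: str) -> dict:
    """Count how many emotion-lexicon words appear in a tweet."""
    tokens = {token.strip(".,!?").lower() for token in tweet.split()}
    return {emotion: len(tokens & words) for emotion, words in EMOTION_WORDS.items()}

print(emotion_profile("Furious at this scandal, these shots are dangerous and deadly!"))
print(emotion_profile("Phase 3 trial data indicate high efficacy; adverse events were monitored."))
```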
It is imperative to halt the circulation of skewed, misinformed, or false views about vaccines, as they have the power to influence hesitant people, who are unsure or unaware of the effectiveness and safety of vaccines. Challenging the anti-vaccine rhetoric could save many lives.