The Spread of Misinformation Reveals Existing Fears and the Polarisation of Society

Network World

Sharon Jose

Sharon Jose is a graduate in international political economy from King's College London. Her research interests include the political economy of the environment, development economics, and foreign policy.

12th April 2022

Photo: Camilo Jimenez


The uncertainties of the COVID-19 pandemic have heightened general fear and eroded trust among people. It is important to understand the implications of distrust and polarisation between ethnicities, political ideologies and the like – they can affect all spheres of life and make misinformation especially dangerous. Even though, in recent years, social media platforms have taken several measures to curb the spread of misinformation, these measures have proved insufficient, as conspiracy theories remain widespread. The proliferation of such theories can be attributed to both a lack of confidence in institutions and a lack of trust in a society with inadequate rule of law (OECD, 2020). This existing fear and distrust feed cognitive biases, distorting rational judgement and interpretation as individuals construct their own 'subjective reality'. The amplification systems of social media, wherein posts with greater engagement are promoted more widely regardless of the accuracy of their content, further accelerate the spread of misinformation, triggering people's preexisting fears and emotional biases. With misinformation exacerbating social inequalities and polarisation, it is crucial that governments build trust among their citizens, creating the foundations for a more secure society.


Factors influencing the spread of misinformation

Since the 2016 US presidential election, social media companies have taken several measures to regulate misinformation. One major policy change involved the flagging of posts containing false information, determined and regulated by an internal team along with third-party fact-checking organisations that have partnered with these platforms (Mosseri, 2017). According to Facebook, this change is supposed to reduce the reach of flagged misinformation by 80%. However, Avaaz (2020) finds that "global health misinformation spreading networks on Facebook reached an estimated 3.8 billion views in 2020 spanning at least five countries — the United States, the UK, France, Germany, and Italy". One major reason for the inefficacy of this policy change is that the time lag between a post's publication and its review by fact-checkers is long enough for the false information to reach millions of viewers. The amplification systems built into social media algorithms reward content that attracts the most attention or engagement by extending its reach. This works against these organisations' efforts to stop the spread of misinformation on their platforms.


The extensive impact of misinformation is shaped not only by the algorithms but also by another major factor: cognitive biases. Cognitive biases shape how we passively absorb information and, consequently, how we interact with fake news. Human minds tend to focus on headlines, which can often be misleading, and social media signals about a post's popularity can distort our assessment of the accuracy of its content (CITS, n.d.). Confirmation bias is one of the most common cognitive biases, whereby people are more likely to believe information that confirms their preexisting beliefs or prejudices and consequently tend to disregard information that challenges them. This bias is further exploited when algorithms tailor posts to those who are more likely to believe or agree with such content. For instance, according to confirmation bias research, people are more likely to believe social media posts claiming that the US army played a role in introducing the coronavirus in Wuhan if they are already biased against the US (BBC, 2021). Social media algorithms exploit this confirmation bias, producing echo chambers in which people are exposed to similar content and to people with similar ideas, reinforcing their existing beliefs.


To combat the spread of misinformation, social media companies began labeling false content and attaching accurate information and credible website links to posts containing fraudulent content. However, these efforts have not been very effective: about 41% of misinformation posts on Facebook have not been labeled as false, even though fact-checkers have debunked about 65% of the unlabeled content (Avaaz, 2021). Moreover, many continue to believe in conspiracy theories such as those that relate to the role of government institutions in controlling the masses through vaccines, or that certain ethnicities are responsible for the spread of the virus. Such conspiracy theories expose biases against certain groups within society and reveal a lack of faith in doctors and institutions. For instance, conspiracy theories disseminated by QAnon, a largely conservative Christian community in America with a political bias towards Trump, misinform people on the dispensability of masks, lockdowns, and vaccines (Timberg & Dwoskin, 2021). Their support for Trump involved downplaying the threat of the virus. The spread of such misinformation has shown how vulnerable human minds are to manipulation that exploits the fears and vulnerabilities deeply embedded within their psyches.


Xenophobia and a lack of trust in government institutions have been exploited by misinformation posts, generating further suspicion. For instance, marginalised communities that do not trust that the government is trying to protect them by extension distrust the vaccines that the same government advocates for, making them vulnerable to misinformation on vaccine efficacy. Research has also found that vulnerable populations, such as women and people of colour, are more susceptible to misinformation that breeds suspicion of government institutions (Seo et al., 2020). The fear and uncertainty generated by the COVID-19 pandemic have aggravated these existing anxieties, pushing people to search for prompt answers and thus leaving them susceptible to misinformation. Misinformation, therefore, can worsen social inequalities and political polarisation.



The extensive spread of misinformation during the pandemic has been driven by a complex interplay of technological and psychological factors. The amplification systems of social media algorithms have worsened the consequences of cognitive biases among the global population, making the regulation of false content online very challenging. Social media companies, despite their policy changes, have failed to substantially curb the impact of misinformation.


The factors that contribute to the spread of misinformation need to be deconstructed before long-term solutions can be devised. Since a lack of faith in institutions and xenophobic attitudes can further fuel confirmation bias, it is important for governments to first gain the trust of their citizens, reduce social inequalities and ensure justice within their respective countries. Therefore, managing the challenge of misinformation begins with building strong foundations for a secure society.



Avaaz (2020) How Facebook can Flatten the Curve of the Coronavirus Infodemic. Avaaz.


BBC News. (2021, August 22). Wuhan lab leak theory: How Fort Detrick became a centre for Chinese conspiracies. BBC News.


CITS. (n.d.). Why we fall for fake news. Center for Information Technology and Society, UC Santa Barbara. Retrieved February 20, 2022.


Library of Congress. (2021). Germany: Network Enforcement Act amended to better fight online hate speech. The Library of Congress.


Mosseri, A. (2017, April 7). Working to Stop Misinformation and False News. Facebook.


OECD. (2020). Transparency, communication and trust: The role of public communication in responding to the wave of disinformation about the new Coronavirus. OECD.


Seo, H., Blomberg, M., Altschwager, D., & Vu, H. T. (2020). Vulnerable populations and misinformation: A mixed-methods approach to underserved older adults' online information assessment. New Media & Society. doi:10.1177/1461444820925041


Timberg, C., & Dwoskin, E. (2021, March 11). QAnon groups on Telegram seethe with covid denialism and vaccine misinformation. The Washington Post.

