What’s wrong with misinformation?

Science & Technology Studies, 2024


Brian Martin




Abstract

Strangely, few recent studies of misinformation have given attention to the concept of misinformation itself. An examination of several studies of Covid misinformation shows them to be implicitly premised on the analyst’s unquestioned possession of the truth: there is no attention to struggles over who decides what counts as misinformation, and no mention of the possibility that views labelled misinformation might offer reasonable alternative perspectives. This approach has limitations, especially when understood in the context of research on public scientific controversies: ethical and political disagreements are obscured, and social analysts become de facto supporters of scientific orthodoxy.

Introduction

In recent years, there has been a huge increase in the number of researchers and government officials expressing concern about misinformation, along with its cousin disinformation. Misinformation refers to sincerely held false claims whereas disinformation refers to intentional falsehoods; here, ‘misinformation’ will be used throughout. Much of the commentary sees misinformation as a serious social problem, causing many citizens to subscribe to incorrect views with potential dangers to public health and political decision-making.  

Curiously, in the outpouring of scholarly research on misinformation there is very little attention to the concept of misinformation itself. Authors in this field seem to assume they, or authorities on whom they rely, can unambiguously distinguish between truth and falsity. They align themselves with the truth and hence turn their attention to the reasons why some people subscribe to false beliefs. Few misinformation researchers mention alternative epistemological frameworks such as constructivism, relativism, pragmatism or, more generally, postmodernism and poststructuralism. These perspectives problematise claims to truth in various ways, including by seeing them as tools in social struggles, by pointing to their positionality, and by rejecting the idea of a grand narrative for understanding the world.

Within science studies, a prominent framework has been the sociology of scientific knowledge or SSK (Barnes, 1974; Bloor, 1976; Mulkay, 1979). In what is called the strong program, SSK investigations adhere to four principles: causality, impartiality, symmetry and reflexivity. Impartiality and symmetry are most relevant here. Impartiality specifies that knowledge claims should be scrutinised regardless of whether they are judged right or wrong, while symmetry specifies that knowledge claims, whether judged right or wrong, should be explained using the same conceptual tools. This is in contrast with an approach in which the social analyst accepts one perspective as correct and only studies why people believe otherwise, an approach called the sociology of error. Note that SSK is a methodological prescription: an analyst can seek to explain both successful and unsuccessful knowledge claims using the same conceptual tools while personally believing in objective truth. In other words, within science studies, relativism is commonly treated as a method rather than a belief system.

There are various ways to refer to the social-science approach to knowledge in which reasons are sought only for incorrect beliefs, in other words unsuccessful knowledge claims. It is sometimes called positivism, even though the term positivism historically has a range of meanings. From an SSK perspective, it might be called partiality and asymmetry. For the purposes here, these terms, along with the ‘sociology of error,’ will be used to refer to an approach to the study of knowledge in which the analyst assumes knowledge of the truth — at least as currently understood — and seeks only to study the reasons why people believe otherwise.

Outside of STS, to refer to truth might once have been deemed straightforward, but in 2016 two events triggered the rapid spread of the idea of ‘post-truth’: Brexit and the election of Donald Trump. ‘Post-truth’ was chosen by Oxford Dictionaries as the word of the year, with this definition: “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief” (https://languages.oup.com/word-of-the-year/2016/). There was an outpouring of informed commentary by journalists and philosophers (e.g., Ball, 2017; d’Ancona, 2017; Davis, 2017; McIntyre, 2018), including STS scholars (Sismondo, 2017). However, not all commentators used the Oxford definition, and they had different views about whether post-truth was a new development, whether it was a dangerous development, and much else. The most sophisticated STS-informed examination of post-truth, also the most unorthodox, was by Fuller (2018, 2020), who posited that post-truth involves questioning the assumptions underlying the “game” of searching for the truth. In other words, it was not a disagreement about truth claims but rather a disagreement about the bases by which truth claims are judged. Fuller argues there is little new about post-truth.

Post-truth concerns relate to studies of misinformation in at least two ways. The concept of misinformation assumes knowledge of the truth, and hence implicitly decries post-truth, at least in the sense of the Oxford definition. In addition, the concept of misinformation assumes agreement with the rules for seeking the truth, and hence is contrary to the perspective on post-truth presented by Fuller.

There is nothing inherently wrong with studying only the beliefs of one side of a controversial issue: it offers one way of understanding issues and can yield insights. At the same time, it cuts off or obscures insights available from other perspectives. We might expect scholars adopting this approach to justify their choice, including by noting the availability of other analytic frameworks, discussing the strengths and weaknesses of different frameworks, and acknowledging the assumptions involved in choosing their own. Given that the term misinformation signals acceptance of currently dominant ideas, we might expect studies to include a careful examination of the assumptions underlying the term.

Initially I inspected a diverse range of publications, from the natural and social sciences, dealing with misinformation, and found they all accepted the current scientific orthodoxy uncritically. Rather than undertaking a comprehensive search, I chose to look more closely at a small selection of articles, limiting the choice in several ways. First, I looked only at studies of Covid misinformation, given that disputes concerning Covid easily fit within the longstanding tradition of controversy studies, unlike disputes about political matters. I relied especially on Google Scholar with the search term “Covid misinformation,” looking for articles about this topic generally rather than, for example, in a specific country. Second, I picked studies that had been highly cited, suggesting that other researchers considered them credible and relevant. Because older papers were likely to acquire more citations as the field rapidly expanded, I chose one article published in 2020 (the most highly cited in that year, and overall: Roozenbeek et al.), one published in 2021 (the most highly cited in that year: Gabarron et al.) and one published in 2022 (Caceres et al.), adding a fourth article from 2022 (van der Linden), a review article and therefore of special interest for understanding approaches to the topic. Note that citation figures will have changed since I selected these articles. Third, I selected only articles that were open access, making it easier for others to check the analysis.

In reading these articles, I looked for indications that the concept of misinformation was open to critical inquiry. Indications could include:

- mention of struggles over who gets to decide what counts as misinformation;
- acknowledgement that authorities, including scientific authorities, have sometimes been wrong or have themselves spread misinformation;
- recognition that people holding views labelled misinformation might have plausible reasons for their beliefs.

Studies of Covid misinformation are a subset of the wider attention to misinformation in a range of domains. Separate investigations are needed into the assumptions made in misinformation studies in these other domains, as well as in studies of fake news, conspiracy theories, and critiques of the misinformation agenda (e.g., Schmidt et al., 2023).

In the following three sections, the four chosen articles are discussed with special attention to indications, or lack thereof, that misinformation could be a questionable concept. The emphasis here is not on whether assertions claimed to be misinformation are false but on the concept of misinformation.

Roozenbeek et al.

“Susceptibility to misinformation about COVID-19 around the world,” by Jon Roozenbeek et al., was published in Royal Society Open Science in 2020. The authors surveyed people in five countries — UK, Ireland, US, Spain and Mexico — with a total of 4400 respondents, asking questions about beliefs concerning Covid, personal health behaviours, numeracy skills, and trust in science, among others. The questions about Covid were chosen to reveal beliefs that were either right or wrong, in other words, either correct information or misinformation.

With this extensive data, the authors performed numerous statistical tests, looking for correlations between the beliefs and intentions of the respondents. For example, the authors found that “increased susceptibility to misinformation negatively affects people’s self-reported compliance with public health guidance about COVID-19, as well as people’s willingness to get vaccinated against the virus and to recommend the vaccine to vulnerable friends and family” (Roozenbeek et al., 2020: 1).

The rationale for the study is given in the first sentence in the abstract: “Misinformation about COVID-19 is a major threat to public health.” One innovative contribution of the paper is to use a sample covering five countries, given that most previous studies looked only at US populations. Another is the wide number of variables examined for correlations with beliefs in misinformation.

Roozenbeek et al.’s treatment of ‘misinformation’ in a positivist, asymmetrical manner is apparent in their failure to note that there could be a struggle over who decides what counts as misinformation. Another indication is their central object of study, “susceptibility to misinformation.” They do not examine the reasons people believe in correct information; for them, it would be strange to talk about “susceptibility to information.” This one-sided examination of reasons is an example of the sociology of error.

Respondents were asked about several claims concerning Covid, with some of them deemed misinformation. “The false claims were based on the World Health Organization’s ‘Mythbusters’ page” (Roozenbeek et al., 2020: 4). Thus, the authors treated WHO views as true and contrary views as false, without any question. There is no indication that these WHO views could be subject to rational, sensible disagreement on any grounds. There is no mention of WHO or any other authorities ever being wrong.

Most of Roozenbeek et al.’s misinformation items deviate strongly from conventional ideas; some verge on the absurd. Nevertheless, some of them might be considered to be associated with plausible claims. One question was “The coronavirus was bioengineered in a military lab in Wuhan.” This is not the same as what is now commonly referred to as the lab-leak theory of the origin of Covid, which posits that the coronavirus accidentally escaped from a civilian lab in Wuhan where gain-of-function research on bat coronaviruses was being carried out. It nevertheless shares a key feature with the lab-leak theory: bioengineering of coronaviruses in a Wuhan virology lab. Since Roozenbeek et al.’s article appeared, a growing body of writing has given credibility to the lab-leak hypothesis (Wade, 2024), along with evidence of a covert, coordinated effort to deny and discredit this hypothesis (Gutentag et al., 2023).

Another statement used by Roozenbeek et al. to measure beliefs in misinformation was “Gargling salt water or lemon juice reduces the risk of infection from Coronavirus.” Several years after their paper appeared, a study found that gargling and nasal washing with a solution of water and bicarbonate of soda greatly reduced the duration of Covid infection (Wang et al., 2023). This is different from gargling with salt water or lemon juice, but suggests that believing in a preventive treatment along these lines may not be as absurd as it might seem on the surface. The point here is that the authors do not raise the possibility that what, when they wrote, was deemed misinformation might contain elements that later gain credibility.

Gabarron et al.

In 2021, a paper by Elia Gabarron and two co-authors was published in the Bulletin of the World Health Organization titled “COVID-19-related misinformation on social media: a systematic review.” As the title indicates, this was not a direct study of misinformation but rather a systematic review, in essence a study of studies. Gabarron et al. used several databases, such as Google Scholar, to identify primary empirical studies of misinformation on social media in the early months of the pandemic, and then examined the studies to find those of the highest quality, of which they found 22. They assessed these studies in various ways, for example for reported rates of misinformation on social media, concluding that “COVID-19-related misinformation on social media is an important issue, both in terms of the amount of misinformation in circulation and the consequences for people’s behaviour and health” (Gabarron et al., 2021: 460).

Gabarron et al. do not mention struggles over who gets to determine truth, or that authorities might disseminate misinformation. A complication is that Gabarron et al.’s paper is a systematic review, so they are reporting on other studies of misinformation on social media, not their own. Nevertheless, a couple of findings are revealing. One is that half of the 22 studies they examined “did not categorize the specific type of COVID-19-related misinformation” (Gabarron et al., 2021: 456), indicating that many studies of misinformation applied the label without specifying the claims said to be wrong.

Gabarron et al. (2021: 459) say “little is known about the relative importance of the different reasons why people propagate misinformation”. This focus on those who believe wrong information, without any mention of the reasons people believe correct information, is characteristic of the sociology of error, and is compatible with Gabarron et al. adopting a positivist, asymmetrical approach to knowledge.

Caceres et al.; van der Linden

In 2022, “The impact of misinformation on the COVID-19 pandemic” was published in AIMS Public Health (Caceres et al., 2022). It reviews studies of Covid misinformation and summarises themes in these studies, including the risk of vaccine misinformation, the influence of social media, the role of trusted sources of information, measures that can be taken against misinformation, and recommendations for dealing with misinformation. Throughout this review, there is no discussion of struggles over who gets to decide what counts as misinformation, and no mention that people subscribing to views said to be misinformation might have good reasons for their beliefs. In short, Caceres et al.’s review is based on a positivist, asymmetrical approach to knowledge.

Also in 2022, “Misinformation: susceptibility, spread, and interventions to immunize the public” was published in Nature Medicine. Its author, Sander van der Linden, was a co-author of Roozenbeek et al. (2020). It is a review article covering susceptibility to misinformation, the spread of misinformation, and how this spread can be limited by discrediting misinformation before or after people encounter it, known respectively as prebunking and debunking. The paper includes several paragraphs (van der Linden, 2022: 461) about the challenges of defining and operationalising the concept of misinformation, including that in some circumstances experts change their views fairly rapidly. However, this is treated as a problem for misinformation researchers, not a problem with the concept of misinformation. Overall, van der Linden seems to rely on a positivist framework, for example referring to susceptibility to misinformation (a sociology-of-error approach) and not mentioning vested interests, misinformation endorsed by authorities, people having good reasons for distrusting experts, or struggles over who gets to decide what is considered misinformation.

Controversy studies

Within science studies, there is a long-standing subfield commonly called controversy studies (Engelhardt and Caplan, 1987; Kleinman et al., 2005, 2008, 2010; Mazur, 1981; Nelkin, 1979). A wide variety of scientific controversies have been studied, some of them internal to the scientific community, such as over gravitational waves (Collins, 2017), and many involving citizen campaigners, such as those over nuclear power, pesticides, vaccination, microwaves and GMOs. Martin and Richards (1995) classified controversy studies into four approaches:

- positivist, in which the analyst accepts one position as correct and explains only departures from it;
- group politics, which explains the course of controversies through the interplay of contending interest groups;
- constructivist, which applies the symmetry principle of SSK to the knowledge claims of all parties;
- social structural, which relates knowledge claims to broader social structures.

Martin and Richards (1995) used examples from debates over fluoridation and over vitamin C and cancer, pointing out the strengths and limitations of each of these four approaches.

Disputes over knowledge about Covid can readily be studied as a public scientific controversy. Many of the features of earlier and long-standing public scientific controversies are readily recognisable in claims and counterclaims concerning Covid, including the presence of a dominant view backed by scientific authorities, the role of powerful vested interests (the pharmaceutical industry in this case), the existence of dissident doctors and scientists, and emotional contestation by members of the public. Contrary to the name “scientific controversy,” an important feature of these sorts of controversies is that they are not just about science but also involve disagreements over ethics and decision-making. An example is the dispute over lockdowns, which involves not just “the science” but also judgements about the relative importance of preventing the spread of the coronavirus versus freedom to travel and interact with others, or the value to children of attending school. While many believed the danger from Covid was the overwhelming consideration, this was not strictly a scientific matter but involved human values. The implication is that referring to misinformation without acknowledging these non-scientific dimensions is to take a position on them, without acknowledgement. The very term misinformation, in the context of a public scientific controversy, thus builds in a set of unstated judgements.

None of the Covid-misinformation articles examined here mentions research on scientific controversies or notes that Covid issues involve more than scientific matters. This is not to say that respondents’ answers to questions are necessarily rational, well-informed or justifiable, only that social researchers into Covid misinformation have not provided a full picture of the social context of their investigations, but rather made implicit presumptions about the controversial issues they are studying.

Studies of misinformation can be related to the deficit model of science communication. According to Bucchi (1998), the canonical model of science communication is positivist, with scientific knowledge transmitted, in distorted and simplified form, to the public. In the deficit model, members of the public are assumed to lack scientific understanding and to need correct information provided by scientists, a process intended to make people support “science,” which in practice means trusting scientific authorities. However, studies show that providing more information, filling the supposed deficit, is not an effective way to change attitudes or to build trust in science (Sturgis and Allum, 2004). Discrediting or censoring ‘misinformation’ can be interpreted as a way to prevent people developing or maintaining wrong ideas, by reducing their exposure to them, and thus aligns with the deficit model. If the push to increase public understanding of science is thought of as ‘selling science’ (Nelkin, 1987), countering misinformation can be seen as an attempt to hobble competition in the marketplace of ideas.

Some members of the public have useful insights, including about how their own social location influences their beliefs, as in the famous study of Cumbrian sheep farmers in Britain in relation to sources of radioactivity (Wynne, 1992). Studies of Covid misinformation seem to assume that members of the public who do not subscribe to the currently dominant scientific view are deficient in knowledge. However, some people may have personal experience of both the disease and potential remedies, and the shortcomings of government policies, that are not adequately considered by scientists.

Simis et al. (2016) argue that one reason for the persistence of the deficit model, despite its shortcomings, is that it provides a simple fix for policy problems, specifically via reform of the science curriculum to make citizens better educated about science. The idea of addressing misinformation is a similarly simple fix for policymakers, which may help explain the burgeoning level of research in the area, implicitly relying on a version of the deficit model. Suldovsky (2016) gives another reason for the persistence of the deficit model: it gives scientific authorities ‘epistemic privilege.’ The concept of misinformation assumes that sort of privilege.

Conclusion

In many studies of misinformation, researchers assume they have access to the truth (or its best available approximation) and that their task is to explain why some people reject this truth, while seeking ways to overcome this rejection. The acceptance of currently dominant scientific knowledge claims is signalled by the term ‘misinformation’ itself, especially when it is not critically examined. This is not inherently problematic, but it does limit investigators to a ‘sociology of error,’ in which the primary task is to explain belief in falsehoods, while reasons for belief in scientific truth are unstudied. Furthermore, when studying controversial issues, a focus on scientific-medical misinformation obscures the role of ethical and political disagreements. With this approach, analysts become de facto supporters of the current scientific orthodoxy and associated ethical and social stances.

Using a positivist, asymmetrical approach means forgoing insights available from other approaches to studying scientific controversies: group politics, constructivist and social structural. Studies of misinformation seldom even acknowledge that there is a scientific controversy in which some highly credentialed and published experts disagree with the orthodox position. They also have the limitation that when orthodox views change, a new explanation is needed for why some people disagree with the new orthodoxy, a problem most obvious in relation to Covid as the lab-leak origin theory changed from being labelled a conspiracy theory to being treated as a serious possibility.

Accepting current scientific orthodoxy means not considering the role of social structures, which can influence the generation and acceptance of knowledge claims. In the case of Covid, pharmaceutical companies and their government and medical allies have, according to critics, played an important role in promoting vaccination as the solution to the pandemic and denigrating treatment with non-patentable drugs (e.g., Kory, 2023). Whatever one’s assessment of the role of vested interests in responses to the pandemic, they deserve consideration, but such consideration is absent from studies using the framework of misinformation.

More generally, invocation of ‘misinformation’ provides a pretext for censorship on the grounds that members of the public should not be exposed to incorrect ideas. In the vast body of commentary on censorship and free speech, a key idea is that open intellectual engagement and expression of values is vital to create better policies and practices (Baker, 1989; Barendt, 2005; Hare and Weinstein, 2009). Understanding the role of the misinformation label in ongoing struggles over free speech can help in challenging censorship, especially censorship that protects vested interests.

What’s wrong with ‘misinformation’? It is a loaded term, built on unstated epistemological assumptions, that implicitly denigrates anyone who questions orthodoxy, and limits the scope of social scientific investigation. Stretching this point, it might be said that the concept of misinformation, by offering a misleadingly narrow and one-sided understanding, is itself a form of misinformation.

One implication is to be wary whenever the term misinformation is used. Another is to deal with issues in terms of the arguments and evidence, without automatically assuming one side is correct, and without applying stigmatising labels.

Acknowledgements

For valuable comments, I thank Elia Gabarron, Kelly Gates, Sue Curry Jansen, Lorraine Pratley, Sander van der Linden and two anonymous reviewers.

References

Baker CE (1989) Human Liberty and Freedom of Speech. New York: Oxford University Press.

Ball J (2017) Post-truth: How Bullshit Conquered the World. London: Biteback Publishing.

Barendt E (2005) Freedom of Speech, 2nd edition. Oxford: Oxford University Press.

Barnes B (1974) Scientific Knowledge and Sociological Theory. London: Routledge and Kegan Paul.

Bellos D and Montagu A (2024) Who Owns This Sentence? A History of Copyrights and Wrongs. London: Mountain Leopard Press.

Bloor D (1976) Knowledge and Social Imagery. London: Routledge and Kegan Paul.

Bucchi M (1998) Science and the Media: Alternative Routes in Scientific Communication. London: Routledge.

Caceres MMF, Sosa JP, Lawrence JA, et al. (2022) The Impact of Misinformation on the COVID-19 Pandemic. AIMS Public Health 9(2): 262–277.

Collins H (2017) Gravity’s Kiss: The Discovery of Gravitational Waves. Cambridge: MIT Press.

d’Ancona M (2017) Post Truth: The New War on Truth and How to Fight Back. London: Ebury Press.

Davis E (2017) Post-truth: Peak Bullshit and What We Can Do about It. London: Little, Brown.

Engelhardt HT and Caplan AL (eds) (1987) Scientific Controversies: Case Studies in the Resolution and Closure of Disputes in Science and Technology. Cambridge: Cambridge University Press.

Fuller S (2018) Post-truth: Knowledge as a Power Game. London: Anthem.

Fuller S (2020) A Player’s Guide to the Post-truth Condition: The Name of the Game. London: Anthem.

Gabarron E, Oyeyemi SO and Wynn R (2021) COVID-19-related Misinformation on Social Media: A Systematic Review. Bulletin of the World Health Organization 99: 455–463A.

Goldacre B (2012) Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients. London: Fourth Estate.

Gutentag A, Woodhouse L and Shellenberger M (2023) Covid Origins Scientist Denounces Reporting on His Messages as a “Conspiracy Theory.” Public, 21 July. Available at: https://public.substack.com/p/covid-origins-scientist-denounces (accessed 25 June 2024).

Hare I and Weinstein J (eds) (2009) Extreme Speech and Democracy. Oxford: Oxford University Press.

Kleinman DL, Kinchy AJ and Handelsman J (eds) (2005) Controversies in Science and Technology: From Maize to Menopause. Madison: University of Wisconsin Press.

Kleinman DL, Cloud-Hansen KA, Matta C and Handelsman J (eds) (2008) Controversies in Science and Technology: From Climate to Chromosomes. New Rochelle: Mary Ann Liebert.

Kleinman DL, Delborne JA, Cloud-Hansen KA and Handelsman J (eds) (2010) Controversies in Science and Technology: From Evolution to Energy. New Rochelle: Mary Ann Liebert.

Kory P with McCarthy J (2023) The War on Ivermectin: The Medicine that Saved Millions and Could Have Ended the Pandemic. New York: Skyhorse.

Martin B and Richards E (1995) Scientific Knowledge, Controversy, and Public Decision-making. In: Jasanoff S, Markle GE, Petersen JC and Pinch T (eds) Handbook of Science and Technology Studies. Thousand Oaks: Sage, pp. 506–526.

Mazur A (1981) The Dynamics of Technical Controversy. Washington: Communications Press.

McIntyre L (2018) Post-truth. Cambridge: MIT Press.

Mulkay M (1979) Science and the Sociology of Knowledge. London: Allen and Unwin.

Nelkin D (ed) (1979) Controversy: Politics of Technical Decisions. Beverly Hills: Sage.

Nelkin D (1987) Selling Science: How the Press Covers Science and Technology. New York: W. H. Freeman.

Roozenbeek J, Schneider CR, Dryhurst S, et al. (2020) Susceptibility to Misinformation about COVID-19 around the World. Royal Society Open Science 7: 201199.

Schmidt S, Lowenthal A, Wyatt T, et al. (2023) Report on the Censorship-Industrial Complex: The Top 50 Organizations to Know. Racket News, 11 May. Available at: https://www.racket.news/p/report-on-the-censorship-industrial-74b (accessed 25 June 2024).

Simis MJ, Madden H, Cacciatore MA and Yeo SK (2016) The Lure of Rationality: Why Does the Deficit Model Persist in Science Communication? Public Understanding of Science 25(4): 400–414.

Sismondo S (2017) Post-truth? Social Studies of Science 47(1): 3–6.

Sturgis P and Allum NC (2004) Science in Society: Re-evaluating the Deficit Model of Public Attitudes. Public Understanding of Science 13: 55–74.

Suldovsky B (2016) In Science Communication, Why Does the Idea of the Public Deficit Always Return? Exploring Key Influences. Public Understanding of Science 25(4): 415–426.

van der Linden S (2022) Misinformation: Susceptibility, Spread, and Interventions to Immunize the Public. Nature Medicine 28 (March): 460–467.

Wade N (2024) The Story of the Decade: New Documents Strengthen — Perhaps Conclusively — the Lab-Leak Hypothesis of Covid-19’s Origins. City Journal, 25 January. Available at: https://www.city-journal.org/article/new-documents-bolster-lab-leak-hypothesis (accessed 25 June 2024).

Wang T, Zhang Y, Zhang R, et al. (2023) Efficacy of Nasal Irrigation and Oral Rinse with Sodium Bicarbonate Solution on Virus Clearance for COVID-19 Patients. Frontiers in Public Health 11: 1145669.

Wynne B (1992) Misunderstood Misunderstanding: Social Identities and Public Uptake of Science. Public Understanding of Science 1: 281–304.