Our Complicated History with Disinformation

By Alicia Wanless

Humans have long had a complicated relationship with disinformation and information manipulation. People have always struggled to discern fact from fiction, and there have always been those who attempt to influence the information environment for their own gain. While new technology can exacerbate this already complicated relationship with information, any response to the problem of disinformation must be human-centred.

A Tale as Old as Time

Examples of our complicated relationship with information and its various malformations can be found among the ancient Greeks. Early in the fourth century BCE, Thucydides, the father of scientific history, bemoaned the little pains “the vulgar take in the investigation of truth, accepting readily the first story that comes to hand.” In his famous funeral oration, the Athenian leader Pericles complained that people hear and accept what they want, and that it can be difficult to persuade them otherwise if they do not share one’s views.1

Sophists could be hired to teach the art of rhetoric, helping people persuade voting citizens to take their side, even to enter disastrous wars. Teaching deliberate persuasion was enough of a problem in the new democracy that the playwright Aristophanes centred an entire play, Clouds, around it.

Bust of Thucydides. Photo: Shakko via Wikimedia Commons (licensed under CC BY-SA 3.0)
Bust of Pericles. Photo: Vatican Museums via Wikimedia Commons (public domain)
Bust of Aristophanes. Photo: Alexander Mayatsky via Wikimedia Commons (licensed under CC BY-SA 4.0)

Even politicians who engaged in persuasion themselves derided the situation. Cleon complained that people attended public debates as a form of entertainment as if “to see a sight, take [their] facts on hearsay, judge of the practicability of a project by the wit of its advocates, and trust for the truth as to past events not to the fact which [they] saw more than to the clever strictures which [they] heard.”2

While some of these examples might qualify as disinformation, many of them can be better categorized as information pollution.

Toxins and Taints

Information pollution is the presence of low-quality information, and it includes an array of types that vary in the degree to which they degrade the information environment. At the lower end of the spectrum are irrelevant or unsolicited messages, such as spam email, and redundant or empty information that contributes little to knowledge, such as many forms of entertainment. At the other end of the scale is information that misleads or is false, including disinformation, as well as narratives that provoke high-arousal emotions such as fear or anger.3 By contrast, higher-quality information, as defined in data science, is accurate, complete, timely, unique and coherent.4

Information pollution thus covers low-quality types of information beyond outright disinformation. While not lies as such, these degrade the information ecosystem in which they occur and make it harder for people within it to discern fact from fiction. Moreover, as with pollution in the physical environment, polluting the information environment can happen both deliberately and unintentionally. Information pollution has always been with us and always will be; it is simply something we humans seem to produce naturally.

The Trouble with Technology

The situation today is arguably much worse than in ancient Greece, thanks to three key recent shifts in our information environment. First, modern information communication technologies move information (and pollution) farther and faster than ever before. Second, social media have changed the way people can share information, enabling some to obfuscate their identities more easily and others to reach mass audiences without filters. Third, these shifts are happening at a time when the same technologies generate ever more data on individuals. Some actors, equipped with know-how drawn from fields such as cognitive psychology, use this data to target and influence audiences. To make matters worse, few regulations control this process.

Tumult and Pestilence

While we have always been susceptible to information pollution, society is also reeling from a global pandemic. Information pollution becomes more problematic in uncertain situations, where there is a lack of immediate information to explain current events.5 The amygdala, the same part of the brain that is activated by fear, also reacts during ambiguous situations such as a crisis.6 Seeking answers to alleviate the stress of uncertainty, people may invent explanations where none can be found and thereby contribute to information pollution.

Moreover, such situations can lead to a greater prevalence of magical thinking; people might seek simplified causes to explain outcomes.7 Uncertainty also causes people to want to fit into groups more, encouraging greater group cohesion in the short term.8 This type of response can exacerbate feelings of us versus them and increase the challenges of addressing disinformation.

Taking Control

But simply becoming aware of our susceptibilities can help us move forward. And because information pollution is a very human problem, there are simple things all of us can do to address it.

  1. Remember we are all human. This point is as important in dealing with others who might have fallen victim to disinformation as it is in dealing with ourselves. We are all susceptible to information pollution. Setting aside judgment and having empathy is key. We will all make mistakes.
  2. Be patient and ask questions. When faced with someone who believes disinformation, try to identify what fear has led the person into this thinking trap. Being combative or insulting will not change someone’s mind. Understanding—and demonstrating your understanding through verbal acknowledgement—of the underlying reasons why a person fell for disinformation might present opportunities to surface accurate information in a non-combative manner. If you can’t muster the patience in the moment, try gently changing the topic and come back to it when you have the wherewithal to engage.
  3. Be mindful when reacting to information. Much information pollution is designed to provoke some sort of reaction in audiences. It’s only natural to feel a response, but before reacting, try to be aware that you are susceptible to information pollution. Instead of responding or resharing on social media, ask yourself what it is about this information that is provoking you. Assess the provenance of the information and review it thoroughly for bias or inaccuracies. Critical thinking, like any other skill, needs to be exercised to stay sharp.
  4. Help surface accurate information. As information professionals, you are better placed than most to help surface accurate information to audiences, whether on the job or on social media. Sharing accurate information is best done proactively. There is no need to wait and refute disinformation as it arises; be a regular source of trusted information so that when crises hit, your friends and family know to come to you. If you don’t have an immediate answer (for example, in a crisis when an answer can’t be found), consider surfacing information that might help foster awareness of the other vulnerabilities, such as disinformation, that we might face in that uncertainty.

Alicia Wanless is the director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace.

1 Thucydides, History of the Peloponnesian War, trans. Richard Crawley (Ottawa: East India Publishing Company, 2021), 45.
2 Ibid., 73.
3 Jonah Berger, Contagious: Why Things Catch On (New York: Simon & Schuster, 2013).
4 Behrouz Ehsani-Moghaddam, Ken Martin, and John A. Queenan, “Data Quality in Healthcare: A Report of Practical Experience with the Canadian Primary Care Sentinel Surveillance Network Data,” Health Information Management 50, no. 1/2 (2021): 88–92.
5 W. Timothy Coombs, Ongoing Crisis Communication: Planning, Managing, and Responding, 3rd ed. (Thousand Oaks, CA: SAGE, 2012), 141.
6 Paul J. Whalen, “Fear, Vigilance, and Ambiguity: Initial Neuroimaging Studies of the Human Amygdala,” Current Directions in Psychological Science 7, no. 6 (1998): 177–188.
7 Giora Keinan, “Effects of Stress and Tolerance of Ambiguity on Magical Thinking,” Journal of Personality and Social Psychology 67, no. 1 (1994): 48.
8 Richard J. Crisp, Social Psychology: A Very Short Introduction (Oxford: Oxford University Press, 2015), 50.