information-disorder-series · Part 2

The Value of Being Wrong

If information is measured by uncertainty reduction rather than truth, some wrong information is exactly right. What that means for how we think about misinformation.

Aldu Cornelissen · 6 min read
misinformation information theory trust uncertainty cognitive science fact-checking

Miroslav Holub published this brief prose poem in 1977.[1]

Albert Szent-Györgyi used to tell this story from the war:

From a small Hungarian unit in the Alps a young lieutenant sent out a scouting party into the icy wastes. At once it began to snow, it snowed for two days and the party did not return. The lieutenant was in distress: he had sent his men to their deaths.

On the third day, however, the scouting party was back. Where had they been? How had they managed to find their way? Yes, the men explained, we certainly thought we were lost and awaited our end. When suddenly one of our lot found a map in his pocket. We felt reassured. We made a bivouac, waited for the snow to stop, and then with the map found the right direction. And here we are.

The lieutenant asked to see that remarkable map in order to study it. It wasn’t a map of the Alps but the Pyrenees.

Fact-checking, content moderation, and media literacy campaigns all rest on the premise that correct information is valuable and incorrect information is harmful. Accuracy and usefulness are assumed to run in the same direction. Yet the map was incorrect, and it worked. The premise is not as solid as it appears.

The question is whether this is an outlier or a pattern. Wrong information has been quietly useful across centuries of science, engineering, and human cognition. The misinformation industry has never had to confront this.

Arguing that wrong information can be valuable is not a defence of propaganda.[2] The problem the industry addresses is genuine. But the premise that truth is the right measure of information’s value does not survive contact with the history of how human knowledge actually develops.

Science has repeatedly been powered by wrong theories. Engineering has run for decades on incorrect models. Language is built on falsehoods that turn out to be cognitively indispensable. These are not exceptions.

Three examples

Science

In the early 18th century, Georg Ernst Stahl proposed that all flammable substances contain “phlogiston,” a fire-like element released during combustion.

The theory was completely wrong. Combustion is oxidation: a burning substance gains oxygen rather than releasing anything. Stahl had the direction backwards.

And yet, working within the phlogiston framework, 18th-century chemists made extraordinary discoveries. Joseph Priestley discovered oxygen, calling it “dephlogisticated air.”[3] Henry Cavendish discovered hydrogen and showed that burning it produces water.[4] All while using a theory that was wrong.

The wrong theory delivered the tools for its own replacement. When Lavoisier built modern chemistry, he did so from discoveries phlogiston research had produced.

Imre Lakatos argued that theories should not be judged as “right” or “wrong” but by their capacity to generate new discoveries. He called these productive rules of thumb “heuristics.”[5] Phlogiston had overwhelmingly positive heuristics for a century. Its wrongness is irrelevant to its contribution.

Engineering

In 1712, Newcomen built the first commercial steam engine; in 1769, Watt dramatically improved it, and the industrial revolution followed. The First Law of Thermodynamics wasn’t formulated until the 1840s. For nearly a century, engineers built engines that worked without understanding why they worked. Rules of thumb, trial and error, and incorrect models produced reliable results. The engines did not care.

Carnot makes the same point in engineering. In 1824, he laid the theoretical foundations for thermodynamics while explicitly using the wrong caloric theory of heat.[6] Correct conclusions can be reached from incorrect premises. Phlogiston was one wrong theory. The industrial revolution ran on a century of them.

Psychology

Metaphors say things that are not literally true. We talk about time as money and ideas as things we can grasp because abstraction is hard to think without analogy.

Lakoff and Johnson showed that metaphor is not a decorative feature of language.[7] It is one of the main ways thought handles abstraction, by mapping it onto concrete experience. Some metaphors go further than that: they are constitutive. They do not just describe a problem; they frame how it is interpreted and what responses make sense. Declare a “war on drugs” and you have constituted a law-enforcement response rather than a public health one. The metaphor precedes the policy.

Vaihinger took the claim further: not that some falsehoods are useful, but that thought itself is built on them.[8] Thinking, he argued, is a biological function that has become a conscious art, an art of adjustment whose primary instrument is the construction of useful fictions. We know the fictions are false. We use them anyway, because they work. This holds not just in individual minds but in institutions, policy frameworks, and professional practice.

Phlogiston and the steam engine were wrong about the world. Metaphor is more basic: it is how thought gets hold of the world at all.

Fact-checking has no instrument for the useful fiction.

Why the wrong map worked

The soldiers’ map belongs to the same pattern. Why did a map of the wrong mountain range save them?

The map served functions beyond accuracy. It reduced psychological uncertainty, enabled action, created coordination, and transformed despair into purposeful activity.

Each of those functions is a form of uncertainty reduction. In Shannon’s terms,[9] the map was doing exactly what information is supposed to do. The fact that it described the wrong mountain range was irrelevant.
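Shannon’s point can be made concrete with a toy calculation (the four-direction setup and the probabilities here are my own illustration, not from the source): information is measured in bits of uncertainty removed, and the formula never asks whether the resulting belief is true.

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

# Before the map: four possible directions home, all equally likely.
prior = [0.25, 0.25, 0.25, 0.25]

# After consulting the map (any map): belief collapses onto one direction.
posterior = [0.97, 0.01, 0.01, 0.01]

reduction = entropy(prior) - entropy(posterior)
print(f"uncertainty before the map: {entropy(prior):.2f} bits")
print(f"uncertainty after the map:  {entropy(posterior):.2f} bits")
print(f"uncertainty reduced:        {reduction:.2f} bits")
```

Nothing in the arithmetic distinguishes an Alps map from a Pyrenees map: the measure responds only to how sharply belief is concentrated, which is exactly the gap between accuracy and usefulness the essay is pointing at.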

Karl Weick, who popularised this story in management literature, concluded: “When you are lost, any old map will do.”[10]

Information serves multiple functions. Accuracy is only one of them. In conditions of high uncertainty, the primary need is often not truth but a basis for action. A wrong-but-actionable map beats paralysing uncertainty.[11]

The blind spot

Fact-checking was built to evaluate truth. It was not built for uncertainty, coordination, or psychological relief. But information often serves exactly those functions. The question is whether fact-checking has any mechanism for addressing those needs.

Fact-checking rests on a binary: information is true or it is false, and false information harms. That binary has no slot for a map that is wrong and lifesaving, a theory that is wrong and productive, a metaphor that is literally false and constitutive of how a problem gets framed and what responses become thinkable. This is not a failure of application; it is a structural gap in the instrument.

Dan Williams argues that misinformation is often a symptom of institutional distrust, not its cause.[12] This compounds the problem. People reach for useful fictions precisely because credentialled correct information has failed them. Fact-checking cannot act on trust, and an approach that suppresses the cognitive infrastructure people fall back on when trust collapses is working against itself.

Any old map will do

Those Hungarian soldiers did not need an accurate map. They needed something to believe in, something to act on, something to coordinate around. The Pyrenees gave them that.

Usefulness and accuracy are different properties, and fact-checking has no instrument for the difference. The first piece in this series argued that the institutions once responsible for navigating uncertainty (journalism, expertise, public health) lost the trust that made them functional. When they did, people did not wait for the map to become accurate. They reached for what was available.

What is available now is a technology millions of people are already consulting as epistemic counsel, without any design for that role, without accountability structures, and without a public mandate. Whether that remains accidental is the question the series turns to next.

Footnotes

  1. Holub, M. (1977). “Brief Thoughts on Maps.” Times Literary Supplement, 4 February. https://geography.wisc.edu/histcart/wp-content/uploads/sites/12/2017/07/brdsht23UPDATED.pdf

  2. The distinction is consent: the soldiers chose the wrong map; no one slipped it into their pocket.

  3. American Chemical Society (n.d.). “Joseph Priestley, Discoverer of Oxygen.” https://www.acs.org/education/whatischemistry/landmarks/josephpriestleyoxygen.html

  4. Levere, T.H. (2026). “Henry Cavendish.” Encyclopaedia Britannica. https://www.britannica.com/biography/Henry-Cavendish

  5. Lakatos, I. (1978). The Methodology of Scientific Research Programmes. Cambridge University Press.

  6. Mendoza, E. (n.d.). “Sadi Carnot.” Encyclopaedia Britannica. https://www.britannica.com/biography/Sadi-Carnot-French-scientist

  7. Lakoff, G. & Johnson, M. (1980). Metaphors We Live By. University of Chicago Press.

  8. Vaihinger, H. (1924). The Philosophy of ‘As If’, trans. C.K. Ogden. Kegan Paul.

  9. Shannon, C.E. (1948). “A Mathematical Theory of Communication.” Bell System Technical Journal, 27(3), 379-423. https://ia803209.us.archive.org/27/items/bstj27-3-379/bstj27-3-379_text.pdf

  10. Weick, K.E. (1995). Sensemaking in Organizations. Sage.

  11. Nationalism, religion, and ideology persist in conditions of social dislocation for exactly this reason: not because they are true, but because people are lost, and these are maps.

  12. Williams, D. (2023). “Misinformation is often the symptom, not the disease.” Conspicuous Cognition (Substack). https://www.conspicuouscognition.com/p/misinformation-is-often-the-symptom

Information Disorder Series · Part 2 of 3

  1. Disinformation Is Dead
  2. The Value of Being Wrong ← you are here
  3. A League of LLMs


Aldu Cornelissen

Murmur Intelligence monitors digital influence operations and narrative control efforts globally, with a particular focus on the African continent.