information-disorder-series · Part 1

Disinformation Is Dead

The misinformation industry has been solving the wrong problem. Reframing it through Shannon's information theory reveals what was missed.

Aldu Cornelissen 15 min read

That headline is not a fact. But you’re reading this, so I’ll try to make it worth your time.

Misinformation, disinformation, fake news. The World Economic Forum has ranked them as the number one short-term global risk for two consecutive years.[1] Entire industries have emerged to combat them. Yet people within those industries are questioning whether they have been fighting the wrong battle. Whitney Phillips argues that the media ecosystem’s response to falsehoods often amplifies them.[2] Dan Williams contends that misinformation is a symptom of institutional distrust, not its cause.[3] Renée DiResta, after years building detection tools, now argues the field should stop chasing content and start understanding propaganda as a structural problem.[4]

We have a real problem. We have had it before. We now have it at new scale, with some new dimensions. But it is not an entirely foreign concept, nor is it solved.

I’ve had a niggling issue with the concept myself. Since we (Murmur Intelligence) do a lot of work in this area, I had to think through my position and try to convey it in a series of articles. This is the first in that series.

The problem with problem definitions

I was taught that many unsolved problems are really errors in problem definition. It sounds glib, but it means this: if your definition of a problem is incorrect, any solution will at best be an accidental placebo, and at worst make things worse. Problem definition is a taken-for-granted phase of problem solving, consistently bypassed until it demands a proper look.

That is exactly what happened with the disinformation industry. The problem was defined as one of truth and falsity, not necessarily out loud, but implicitly. The natural results are the solutions we see: fact-checking, content moderation, and verification.

But what if the problem was never really about truth and falsity? The ‘post-truth’ era was an early and naïve attempt at labelling the situation:

“relating to and denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”

The first read of that definition could easily miss the dynamics at play. Does it say that objective facts are ‘no longer’ influential, or that they ‘never were’? The term ‘post-truth’ hints that objective facts lost their power over public opinion, not that they never had it. The idea that we somehow lost the primacy of truth in the public domain is absurd. Truth has always been a contested concept, mediated far less by the facts than by whom you trust to vouch for them. Julian Baggini identifies the mechanism precisely:[5]

“One reason is that there is major disagreement and uncertainty concerning what counts as a reliable source of truth. For most of human history, there was some stable combination of trust in religious texts and leaders, learned experts and the enduring folk wisdom called common sense. Now, it seems, virtually nothing is universally taken as an authority. This leaves us having to pick our own experts or simply to trust our guts.”

He makes the same mistake as the ‘post-truth’ framing in treating religious texts and leaders as a once-reliable source of truth; holy wars and conquests point against that. More importantly, he introduces two concepts central to the reframing I want to achieve: uncertainty and trust in a reliable source. My approach draws not from philosophy, as his does, but from information theory.

Shannon’s missing variable

Claude Shannon, the father of information theory, defined ‘information’ in his 1948 paper as a measure of uncertainty reduction: the information content of a message is the degree to which it narrows the set of possibilities for the receiver.[6] That’s it. No truth, no falsity; just two variables, bits and uncertainty: content, and the degree to which it resolves what the receiver does not know.
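In standard notation (this is the textbook formulation, not a quotation from the 1948 paper), the receiver’s uncertainty is the entropy of their distribution over possible states, and a message’s information is the entropy it removes:

```latex
% Entropy: the receiver's uncertainty over possible states X, in bits
H(X) = -\sum_{x} p(x)\,\log_2 p(x)

% Information conveyed by a message M: the uncertainty it resolves
I = H(X) - H(X \mid M)
```

Note that neither term references truth. A message that narrows the receiver’s possibilities is informative by this measure whether or not it is accurate.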

Shannon was careful about social extrapolation; five years after publishing, he warned against a “scientific bandwagon” of applying information theory to fields where its relevance had not been established.[7] This is a fair caution. I am not claiming that social information processing is mathematically identical to signal encoding. The claim is narrower: Shannon’s framework gives us a vocabulary that the disinformation industry lacks. It forces attention to both sides of the equation: the message and the receiver’s prior state of uncertainty, where the current frame fixates almost entirely on the message.

A simple proposition: you can create “disinformation” simply by increasing uncertainty at the receiver, without changing a single bit of content. Phrased differently, disinformation can be created not by manipulating facts, but by manipulating certainty in receivers. That is why an often-repeated mantra circulates in the countless ‘disinformation workshops’ around the world: disinformation isn’t something you ‘dislike’. Attendees need constant reminding that disliking a claim does not make it disinformation. The primacy of uncertainty keeps revealing the double-edged sword of ‘truth’. It rears its head every time, and even seasoned disinformation experts aren’t spared.

For instance, dial uncertainty to 100%, and all bits become noise. Dial it to 0%, and everything resolves into signal. The content didn’t change; the receiver’s capacity to process it did. When uncertainty spikes (a novel pathogen, violence in a place once felt safe) the threshold for accepting information drops, not the facts themselves. Get attacked in the safety of your own home, and see your preferences shift to information sources that help you deal with the increase in uncertainty.
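To make the receiver-dependence concrete, here is a toy sketch (my construction for illustration, not a model from Shannon or from this series): the same message, applied as a Bayesian update, resolves a different number of bits depending on the receiver’s prior uncertainty.

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a probability distribution {state: p}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def update(prior, likelihood):
    """Bayesian update of a prior over states, given one fixed message."""
    unnorm = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

# The same message -- identical likelihoods, i.e. identical 'bits' ...
likelihood = {"safe": 0.9, "danger": 0.1}

# ... received by a calm receiver versus an alarmed one.
calm = {"safe": 0.95, "danger": 0.05}     # low prior uncertainty
alarmed = {"safe": 0.5, "danger": 0.5}    # maximal prior uncertainty

for name, prior in [("calm", calm), ("alarmed", alarmed)]:
    post = update(prior, likelihood)
    resolved = entropy(prior) - entropy(post)
    print(f"{name}: {entropy(prior):.2f} -> {entropy(post):.2f} bits "
          f"({resolved:.2f} bits resolved)")
```

Not a bit of the message changes between the two runs; only the receiver’s prior does. The alarmed receiver both needs more from the message and is left less certain after it, which is one way to read why spikes in uncertainty shift source preferences.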

Information is a simple relationship: bits reducing uncertainty at a receiver. What gives bits that capacity is the question that follows.

Trust reduces uncertainty

The reframing illuminates the actual function of our knowledge-creation institutions: journalism, academia, expertise. Their primary value to society isn’t producing accurate information. It’s reducing uncertainty, mediated through trust.

Think of flight attendants. They’re trained to appear completely calm, even during genuine emergencies. Why? Because passenger uncertainty about safety is managed through visible cues of trustworthiness, not through passengers independently verifying aircraft maintenance logs.

The expert class and Fourth Estate aren’t immune to Shannon’s equation. During COVID, the shifting guidance on masks (from discouraging them, to recommending them, to mandating them) increased uncertainty.[8] It was rarely accompanied by clear explanations. Consistency is critical for public trust, and the flip-flopping eroded it. By 2023, Gallup found trust in mass media had fallen to just 31%, down from 72% in 1976.[9]

They fell into what I call the epistemological trap: fixating on being right about the facts, while haemorrhaging the trust that made their uncertainty-reduction function possible. The trap has structural roots. Journalism rewards being first and being right. Academic incentives reward novel findings. Both reward the production of accurate content. Neither systematically rewards maintaining the audience’s trust, which is a different and often competing objective. A newsroom that publishes a retraction is being accurate; it is also eroding trust. A public health agency that reverses guidance is following the science; it is also raising uncertainty. The trap is that the institution’s internal logic (pursue truth) can work against its social function (reduce uncertainty through trust).

Getting it wrong once is far more damaging than getting it right a hundred times. Psychologist Paul Slovic demonstrated this experimentally: when people rated the impact of hypothetical news events about nuclear plant managers, trust-destroying events (an accident, officials caught lying) registered as having “very powerful impact” roughly twice as often as trust-building events (public meetings, clean safety records). He calls it the asymmetry principle.[10] Trust is easier to destroy than to create, and media incentives ensure that trust-destroying events reach far more people than trust-building ones. That’s not fair, but that’s the market.[11] Your primary value proposition is uncertainty reduction mediated through trust. Lose the latter, and the former becomes unattainable. The question is whether fact-checking has any mechanism for recovering trust once it has been spent.

Case study: white genocide

Consider “white genocide” in South Africa, a narrative that reached the Trump administration and briefly became central to American foreign policy discourse. In August 2018, Trump posted about “the large scale killing of farmers” and ordered his Secretary of State, Mike Pompeo, to investigate.[12]

Just to say it out loud: there is no systematic persecution of white people in South Africa. But that is not the issue.

There is a real uncertainty that manifests as fear. Decades of political rhetoric, opportunistic amplification of farm attacks,[13] and a plague of brutal crime cannot be dismissed with “get in line.” When this narrative reached opportunistic actors in Washington, no amount of fact-checking could stop them from achieving their objective.

Why? Because “white genocide” functions as a metaphor. It doesn’t need to be literally true to be utilised as a concept. Energy spent fact-checking whether there is a white genocide in South Africa is energy spent tending to the claims, not the uncertainty.

And here’s what struck me: not a single piece that I personally read at the time acknowledged the genuine plight of rural safety. The response was always something like “yes, but crime is worse in the townships.” President Ramaphosa in 2020: “the majority of victims of violent crime are Black and poor.”[14] Minister Lamola in 2025: “crime affects everyone, irrespective of race.”[15] The ISS framing: farm murders represent “0.2% of overall murders.”[16] Nobody mentioned response times in isolated areas. Nobody discussed the unfunded National Rural Safety Strategy. By this logic, we’re not allowed to save the rhino until the panda is saved first.[17]

The uncertainty-reduction lens predicts what factchecking cannot explain: why the narrative persisted despite being repeatedly debunked. The claims were addressed (“there is no white genocide”) but the uncertainty was not. What would addressing the uncertainty have looked like? Acknowledging the fear as real. Publishing rural response-time data alongside crime statistics. Funding and publicising the National Rural Safety Strategy. Treating the uncertainty as the entry point rather than the claim. None of this requires agreeing with the “white genocide” framing. It requires recognising that people who are afraid do not process corrections the way people who are safe do.

When you fact-check without addressing the underlying uncertainty, all you’re doing is handing people a cipher: a key they use to rephrase and adjust their claims while maintaining their positions. Implementing smarter anti-plagiarism measures just produces smarter plagiarism. More opinions and questions flood the ‘market of ideas’ in the increasing state of uncertainty, accepted by more and more people and becoming anti-fragile against any counter.

The AfriForum case sharpens the point. AfriForum explicitly rejects the “white genocide” label; they have taken media outlets to the press ombudsman for attributing it to them, and won.[18] Their stated position is that farm murders are a serious, racially inflected crime pattern that the government refuses to prioritise. But their international campaigns (dossiers delivered to the Trump administration, presentations to the UN framing “Kill the Boer” as a “genocidal call”, farm murder statistics that fact-checkers have found inconsistent with official police data[19]) reach international audiences who interpret “genocidal call” and “targeted killings” as confirmation of genocide. AfriForum rejects the conclusion. The premises they advance reach audiences who draw it anyway.

This is the class of communication that fact-checking cannot resolve. The message as sent is not the message as received, and no verification can close that gap. The government response (“crime affects everyone”) is technically accurate while being functionally dismissive. Neither side has incentive to close the gap: AfriForum’s international profile grows with attention to farm murders; the government avoids engaging with rural safety by folding it into aggregate statistics. Information disorder, in this case, is not a coordination failure but a strategic equilibrium, and equilibria don’t yield to better facts.

Why fact-checking fails

In July 2025, I was on a panel[20] at LIRNEasia’s “Day of Information Disorder” in Colombo, where researchers had just presented findings from an experimental study that exposed over 1,500 Sri Lankan participants to five counter-misinformation interventions and measured what worked.[21] Almost every intervention produced a measurable effect. The condition: the content had to be simple. Once the claims became complex, the effect vanished. Fact-checking works, precisely, on narrow content. The field deployed it as a comedically large brush.

Walter et al. reviewed 30 studies and found that fact-checking has a positive effect on belief accuracy, but the effect is modest and weakens significantly when claims are complex or campaign-related.[22] The “backfire effect” (where corrections strengthen false beliefs) appears less common than initially thought.[23] But modest and non-backfiring is not the same as effective.

People generally agree on obvious truths. But when stories become complex, what people regard as “true” becomes contextual. So, when one fact-checks complex claims, what happens? Believers adjust to positions like “yes, but if it was true…” or “yes, but it’s a good example of…” or “I’m just asking questions.”

Fact-checking rarely convinces. Instead, it becomes a cat-and-mouse game. You train producers of non-approved information to subvert the system with opinions, qualified claims, “big if true” hedging. You’re improving their storytelling while losing the war.

The deeper issue is that the content-correction approach assumes people are gullible, easily infected by bad ideas. The cognitive science suggests the opposite. Humans have mechanisms of epistemic vigilance that evaluate the source and plausibility of incoming information.[24] These mechanisms make people pig-headed, not credulous, predisposing them to reject information at odds with their pre-existing beliefs. Hugo Mercier’s Not Born Yesterday is the definitive treatment of this research.[25] Mass persuasion is extremely difficult precisely because we’re built to resist it.

This connects directly to the uncertainty problem. Epistemic vigilance is a defence against false claims: it filters content. But it has no defence against rising uncertainty. When uncertainty increases, the filter doesn’t break; it tightens. People become more selective about sources, not less, and they select for trust, not for accuracy. Disinformation succeeds not by sneaking past epistemic vigilance but by operating on a different channel entirely: raising uncertainty until the receiver’s source preferences shift.

Dan Williams’s framing is more precise: misinformation is often better understood as a symptom of institutional distrust and political polarisation, not their cause.[26] If that’s true, targeting the content while ignoring underlying conditions is treating a fever by breaking the thermometer.

The real problem

The “truth era” was itself a useful fiction.[27]

Information has always functioned through trust-mediated uncertainty reduction. Stories, myths, metaphors: these were humanity’s information technologies for millennia. They worked not because they were literally true, but because they enabled action, coordination, and meaning. Steven Shapin showed that even the Scientific Revolution depended on social trust: seventeenth-century English scientists established facts not through method alone but through codes of gentlemanly conduct (honour, civility, reputation) that made certain people’s testimony count as credible.[28] Knowledge-making was always a collective enterprise mediated by trust, and it still is.

What we call the “truth era” was really a historically brief period of high institutional trust, roughly the post-war decades through the mid-1970s. Robert Putnam’s data traces the arc: trust in government fell from 70% in 1966 to 25% by 1998; civic participation, church attendance, and voluntary association membership all declined in tandem.[29] The institutions vouched for the facts, and we processed them as signal. Now that trust has eroded, we’re seeing how information always worked. We just didn’t notice because the scaffolding was invisible.

The obsession with literal truth is an Enlightenment hangover. Before the seventeenth century, no culture expected all claims to be verified by independent observation; testimony, authority, and narrative coherence were the dominant modes of establishing what was known. The Enlightenment introduced a powerful but narrow standard: claims should be judged by empirical evidence and replication. That standard transformed science. But applying it to all information (myth, metaphor, narrative, political speech) is a category error. You can’t fact-check a metaphor. You can’t peer-review a myth. You can’t replicate a story in a lab. They serve different functions, and they always have.

The misinformation industry has had its run. It emerged from the shocks of 2016, accelerated through COVID, and has produced an industry that struggles to demonstrate impact while generating significant backlash. The blunt assessment from misinformation researchers themselves: “Regardless of the type of content moderation, the practice alone is not effective at reducing belief in misinformation or at limiting its spread.”[30]

Shannon’s insight from 1948 remains quietly relevant: information has no truth attribute. It’s signals that reduce uncertainty. And you can’t reduce uncertainty without trust.

The problem is broken trust and unaddressed uncertainty.

What that reframe demands (in practice, not in principle) is where the series goes next.

Footnotes

  1. WEF (2025). Global Risks Report 2025. World Economic Forum. https://www.weforum.org/press/2025/01/global-risks-report-2025-conflict-environment-and-disinformation-top-threats/

  2. Phillips, W. (2018). The Oxygen of Amplification. Data & Society Research Institute. https://datasociety.net/library/oxygen-of-amplification/

  3. Williams, D. (2023). “Misinformation is often the symptom, not the disease.” Conspicuous Cognition (Substack). https://www.conspicuouscognition.com/p/misinformation-is-often-the-symptom

  4. DiResta, R. (2024). “America’s Information War Self-Own.” The Next Move (Substack). https://www.thenextmove.org/p/americas-information-war-self-own

  5. Baggini, J. (n.d.). “What can philosophy add to the post-truth crisis?” The Times Literary Supplement. https://www.the-tls.com/philosophy/contemporary-philosophy/post-truth-philosophers-essay-julian-baggini

  6. Shannon, C.E. (1948). “A Mathematical Theory of Communication.” Bell System Technical Journal, 27(3), 379-423. https://ia803209.us.archive.org/27/items/bstj27-3-379/bstj27-3-379_text.pdf

  7. Shannon, C.E. (1956). “The Bandwagon.” IRE Transactions on Information Theory, 2(1), 3. https://doi.org/10.1109/TIT.1956.1056774

  8. Reuters (2020). “Fact check: Outdated video of Fauci claiming masks don’t help has been shared as if recent.” 8 October. https://www.reuters.com/article/idUSKBN26T2T9

  9. Gallup (2023). “Americans’ Trust in Media Remains at Trend Low.” Gallup, October. https://news.gallup.com/poll/651977/americans-trust-media-remains-trend-low.aspx

  10. Slovic, P. (1993). “Perceived Risk, Trust, and Democracy.” Risk Analysis, 13(6), 675-682. https://doi.org/10.1111/j.1539-6924.1993.tb01329.x

  11. An institution that wants to build trust must therefore play by rules that structurally disadvantage it: slow to accumulate, fast to destroy, with no market incentive to change those rules.

  12. de Greef, K. & Karasz, P. (2018). “Trump cites false claims of widespread attacks on white farmers in South Africa.” New York Times, 23 August. https://www.nytimes.com/2018/08/23/world/africa/trump-south-africa-white-farmers.html

  13. Findlay, K. (2018). “A society divided: land, farm attacks and #WhiteGenocide.” Superlinear. https://www.superlinear.co.za/a-society-divided-land-farm-attacks-and-whitegenocide/

  14. Ramaphosa, C. (2020). Presidential newsletter, 12 October.

  15. Lamola, R. (2025). Media briefing, 12 May.

  16. Newham, G. (2025). “Violent crime and the myth of South Africa’s ‘white genocide’.” ISS Today, 26 May. https://issafrica.org/iss-today/violent-crime-and-the-myth-of-south-africa-s-white-genocide

  17. The universalising deflection (aggregate statistics deployed to dismiss particular suffering) is available to any side of this argument, and in this case both deploy it with equal facility.

  18. Press Council of South Africa (2024). “AfriForum vs Sunday World.” Ruling, 2 December. https://presscouncil.org.za/2024/12/02/afriforum-vs-sunday-world/

  19. The Citizen (2018). “Are Afriforum lobbying in Australia with accurate stats?” https://www.citizen.co.za/news/south-africa/are-afriforum-lobbying-in-australia-with-accurate-stats/

  20. The panel recording: most of what became this series was already showing from my ramblings, even if clumsily so.

  21. LIRNEasia (2025). “Day of Information Disorder”: Evidence-Based Solutions for a Resilient Digital Age. LIRNEasia, 3 July. https://lirneasia.net/2025/07/day-of-information-disorder-evidence-based-solutions-for-a-resilient-digital-age/

  22. Walter, N., Cohen, J., Holbert, R.L. & Morag, Y. (2020). “Fact-checking: A meta-analysis of what works and for whom.” Political Communication, 37(3), 350-375. https://doi.org/10.1080/10584609.2019.1668894

  23. Wood, T. & Porter, E. (2019). “The elusive backfire effect: Mass attitudes’ steadfast factual adherence.” Political Behavior, 41(1), 135-163. https://doi.org/10.1007/s11109-018-9443-y

  24. Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G. & Wilson, D. (2010). “Epistemic Vigilance.” Mind & Language, 25(4), 359-393. https://doi.org/10.1111/j.1468-0017.2010.01394.x

  25. Mercier, H. (2020). Not Born Yesterday: The Science of Who We Trust and What We Believe. Princeton University Press. https://press.princeton.edu/titles/13944.html

  26. Williams, D. (2023). “Misinformation is often the symptom, not the disease.” Conspicuous Cognition (Substack). https://www.conspicuouscognition.com/p/misinformation-is-often-the-symptom

  27. The term ‘post-truth’ is itself a useful fiction by this definition: it explained nothing but gave people something to point at, which is precisely how it spread.

  28. Shapin, S. (1994). A Social History of Truth: Civility and Science in Seventeenth-Century England. University of Chicago Press.

  29. Putnam, R. (2000). Bowling Alone: The Collapse and Revival of American Community. Simon & Schuster.

  30. Alhabash, S. (2025). Quoted in “Meta shift from fact-checking to crowdsourcing spotlights competing approaches in fight against misinformation and hate speech.” The Conversation, 15 January. https://theconversation.com/meta-shift-from-fact-checking-to-crowdsourcing-spotlights-competing-approaches-in-fight-against-misinformation-and-hate-speech-246854

Information Disorder Series · Part 1 of 3

  1. Disinformation Is Dead (this article)
  2. The Value of Being Wrong
  3. A League of LLMs


Aldu Cornelissen

Murmur Intelligence monitors digital influence operations and narrative control efforts globally, with a particular focus on the African continent.