Mar 29, 2026 · 6 min read

Fallacy of the Single Cause: Why Nothing Has Just One Reason

Video games cause violence. Immigration caused the recession. Sugar causes hyperactivity. These claims share a common flaw: they reduce complex, multi-causal phenomena to a single convenient factor. The Fallacy of the Single Cause — also called causal oversimplification — is one of the most seductive errors in reasoning, because it satisfies our hunger for clean explanations in a genuinely messy world. When something goes wrong, we want to point at something. The fallacy obliges by pointing at one thing and pretending the story ends there.

What the Fallacy Looks Like

The Fallacy of the Single Cause occurs when a complex outcome with multiple contributing factors is attributed entirely — or overwhelmingly — to a single cause. The claim need not be that the named factor played no role; it may genuinely have contributed. The error is the exclusion of other relevant causes and the implication that removing or addressing this one factor would fully explain or fix the outcome.

Examples span every domain:

  • Public health: "Obesity is caused by lack of willpower." (Genetics, urban food environments, socioeconomic factors, gut microbiome, sleep deprivation, and medication side effects also contribute.)
  • Economics: "The 2008 financial crisis was caused by greedy bankers." (Regulatory failures, decades of deregulation, securitisation structures, rating agency incentives, and macroeconomic imbalances were all implicated.)
  • Education: "Poor school results are caused by bad teachers." (Poverty, family instability, nutrition, class size, school funding, and cultural attitudes toward education all interact.)
  • Crime: "Crime is caused by immigration." (Research consistently shows immigration is not a driver of crime rates; socioeconomic marginalisation, policing patterns, and data collection artefacts are far more explanatory.)

Notice that in each case, the named cause may be a real factor — or may not be — but even when it is, it doesn't operate alone. The fallacy lies in presenting it as the whole story.

Why We Are Drawn to Single Causes

The appeal of the single cause is deeply rooted in how the human mind processes information. We are pattern-seeking creatures who evolved to identify causal relationships quickly — predator heard in the bushes, fruit found on that tree. Rapid, single-factor causal attribution was computationally efficient and often good enough on the savanna. In complex modern systems — economies, ecosystems, bodies, societies — it tends to mislead.

Psychologists call the underlying tendency causal reductionism: the disposition to simplify causal webs into linear chains. Paired with the availability heuristic, which makes us weight causes that come easily to mind, we reliably land on whatever explanation is most vivid, most recent, or most politically convenient — regardless of its actual explanatory power.

Media amplifies this tendency. A headline reading "Study links social media to depression in teenagers" compresses a relationship that researchers describe in cautious, probabilistic language — correlation, confounds, effect sizes that explain perhaps 1-3% of variance — into something that sounds like a clean causal verdict. The single cause is more publishable than the messy truth.
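The gap between "links to" and "explains" is easy to quantify: a correlation coefficient r corresponds to r² of variance explained. A quick arithmetic sketch — the r values below are hypothetical, chosen only to match the 1-3% range mentioned above:

```python
# Convert a correlation coefficient r into "variance explained" (r squared).
# The r values are hypothetical, in the range often reported for
# social-media/well-being studies (roughly |r| = 0.10 to 0.17).
correlations = [0.10, 0.15, 0.17]

for r in correlations:
    variance_explained = r ** 2  # coefficient of determination
    print(f"r = {r:.2f}  ->  explains {variance_explained:.1%} of outcome variance")
```

Even the top of that range leaves some 97% of the variance to other factors — which is exactly what a single-cause headline erases.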

Root Cause Analysis: A Method Gone Wrong

In engineering and management, root cause analysis (RCA) is a systematic method for identifying the underlying causes of failures. Techniques like the "5 Whys" — ask "why" five times to drill down from symptoms to root causes — are valuable tools. But they carry an embedded assumption: that there is a root cause, singular. In genuinely complex failures, this assumption breaks down.
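One way to see the embedded assumption is to model causes as a graph. In the hypothetical sketch below (all event names are invented for illustration), a naive "5 Whys" walk follows a single parent at each step, while the full set of contributing causes branches — so the linear walk necessarily drops whole limbs of the causal web:

```python
# Hypothetical cause graph: each event maps to the events that contributed to it.
causes = {
    "outage": ["bad deploy", "no failover"],
    "bad deploy": ["missing test", "schedule pressure"],
    "no failover": ["config drift"],
    "missing test": [],
    "schedule pressure": [],
    "config drift": [],
}

def five_whys(event, depth=5):
    """Naive 5 Whys: follow only the FIRST contributing cause at each step."""
    chain = [event]
    for _ in range(depth):
        parents = causes.get(chain[-1], [])
        if not parents:
            break
        chain.append(parents[0])  # the single-cause assumption is baked in here
    return chain

def all_contributors(event):
    """Systemic view: collect every ancestor cause, not just one chain."""
    seen = set()
    stack = [event]
    while stack:
        node = stack.pop()
        for parent in causes.get(node, []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

print(five_whys("outage"))         # one linear chain of "root causes"
print(all_contributors("outage"))  # the full set of contributing causes
```

The linear walk surfaces one "root cause" and silently discards the others — a structural picture of what naive RCA does to a genuinely multi-causal failure.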

The 1986 Challenger disaster is a canonical case. The immediate cause was an O-ring failure in cold temperatures. But investigations revealed interlocking causes: engineering warnings that were overridden, organisational culture that prioritised schedule over safety, communication failures between NASA and Morton Thiokol, and managerial decision-making structures that filtered out dissenting voices. Fixing the O-ring material without addressing the organisational pathologies would have left the deeper causes untreated — and eventually produced another disaster by a different mechanism. The Rogers Commission's report was explicit: the cause was systemic, not singular.

When root cause analysis is applied naively, it tends to stop at a cause that is nameable, punishable, and politically convenient — a scapegoat — rather than continue to the systemic conditions that made the named cause possible. This is the fallacy of the single cause institutionalised as a methodology.

The Philosophy of Causation

Philosophers of causation have long recognised that most outcomes have multiple necessary contributors. J. L. Mackie formalised this with the concept of INUS conditions: a cause is typically an Insufficient but Necessary part of an Unnecessary but Sufficient condition. In plain terms: a cause is usually one needed part of a larger combination of factors that together suffice to produce the outcome — and that combination is only one of several routes by which the outcome could arise. Isolating one factor and calling it "the cause" misrepresents this structure.
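Mackie's structure can be written out as Boolean logic. This sketch uses his much-cited house-fire illustration (factor names are my own): the short circuit is insufficient on its own, necessary within its conjunct, and the conjunct is sufficient but unnecessary, because arson is another route to the same outcome:

```python
def fire(short_circuit, flammable_material, no_sprinkler, arson):
    # INUS structure: (A and B and C) or D.
    # Each conjunct is sufficient for the fire, but neither is necessary.
    return (short_circuit and flammable_material and no_sprinkler) or arson

# Insufficient: the short circuit alone does not produce the fire.
assert fire(True, False, False, False) is False
# Necessary (within its conjunct): drop it and that route fails.
assert fire(False, True, True, False) is False
# Sufficient: the full conjunct produces the fire.
assert fire(True, True, True, False) is True
# Unnecessary: arson produces the same fire by a different route.
assert fire(False, False, False, True) is True
```

Calling the short circuit "the cause" of the fire names a real contributor while hiding both the enabling conditions it needed and the alternative routes that existed.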

The Bradford Hill criteria, developed in epidemiology to evaluate causal claims, include specificity (does the cause reliably produce this and only this effect?), biological plausibility, a dose-response relationship, and consistency across studies. Single-cause claims in public discourse rarely survive serious application of these criteria, because most real-world causal relationships are probabilistic, contextual, and dose-dependent — not categorical.

Political and Social Consequences

The single-cause fallacy is not merely an intellectual error; it has real-world consequences. Policies designed around single-cause attributions tend to be ineffective because they address only one thread of a tangled causal web. "Just say no" anti-drug campaigns assumed that drug use was caused by exposure to drugs and peer pressure — intervening in the decision at the point of offer. Decades of evidence show that addiction has neurobiological, psychological, and socioeconomic components that peer-resistance training does not address. The campaigns failed not because they were poorly executed but because they were premised on a single-cause model of a multi-cause phenomenon.

Single-cause attributions also enable scapegoating. When an economy turns sour, immigrants, speculators, or minorities become the named cause — absorbing explanatory blame that properly belongs to structural factors like credit cycles, monetary policy, and regulatory failure. The cognitive convenience of a single human group to blame, combined with the confirmation bias that makes supporting evidence easy to find once you're looking for it, makes this pattern both recurrent and dangerous.

Systems Thinking as the Antidote

The corrective to the single-cause fallacy is not refusing to identify causes — it is holding multiple causes simultaneously and mapping how they interact. Systems thinking, developed in engineering and management science and popularised by Donella Meadows in Thinking in Systems, frames outcomes as products of feedback loops, delays, and interacting variables rather than linear cause-and-effect chains.

In practice, this means asking: What other factors contributed? What conditions had to be present for this cause to have its effect? What would have happened if this factor were removed but others remained? Could a different trigger have produced the same outcome through different pathways? These questions don't always have clean answers, but asking them is more epistemically honest than settling for one.
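Those counterfactual questions can be made concrete with a toy model. In the hypothetical sketch below, an outcome's probability is a logistic function of several contributing factors; "removing" one factor lowers the probability without eliminating the outcome, because the other pathways remain (the factor names and weights are invented):

```python
import math

def outcome_probability(factors, weights, bias=-3.0):
    """Toy multi-cause model: probability rises with each contributing factor."""
    score = bias + sum(weights[name] * value for name, value in factors.items())
    return 1 / (1 + math.exp(-score))  # logistic squashing to (0, 1)

# Hypothetical contributing factors (1.0 = present), with invented weights.
weights = {"factor_a": 1.5, "factor_b": 1.0, "factor_c": 1.2, "named_cause": 0.8}
factors = {name: 1.0 for name in weights}

baseline = outcome_probability(factors, weights)

# Counterfactual: remove only the popularly "named cause", keep everything else.
factors["named_cause"] = 0.0
without_named = outcome_probability(factors, weights)

print(f"with all factors:    {baseline:.2f}")
print(f"named cause removed: {without_named:.2f}")
# The drop measures that factor's real contribution -- a fraction of the
# outcome, not the whole of it.
```

The gap between the two numbers is the honest answer to "how much did this factor contribute?" — and in a multi-cause system it is almost never the whole answer.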

When you next hear a confident single-cause claim — whether about health, society, economics, or crime — it is worth treating it as a hypothesis rather than an explanation. The question is never just "did this factor contribute?" but "how much, compared to what else, under which conditions, and for whom?" The mess in that question is the truth.

Sources & Further Reading

  • Mackie, J. L. "Causes and Conditions." American Philosophical Quarterly 2, no. 4 (1965): 245–264.
  • Hill, A. B. "The Environment and Disease: Association or Causation?" Proceedings of the Royal Society of Medicine 58, no. 5 (1965): 295–300.
  • Meadows, D. H. Thinking in Systems: A Primer. Chelsea Green Publishing, 2008.
  • Presidential Commission on the Space Shuttle Challenger Accident (Rogers Commission). Report, 1986.
  • Pearl, J., & Mackenzie, D. The Book of Why: The New Science of Cause and Effect. Basic Books, 2018.
  • Wikipedia: Fallacy of the single cause
