Neglect of Probability: Scared of the Wrong Things
In the months after the September 11 attacks, millions of Americans stopped flying and drove instead. Traffic fatalities rose sharply — researchers estimate an additional 1,500 deaths in the year following the attacks, as people substituted a statistically more dangerous activity (driving) for a statistically safer one (flying) because flying felt more terrifying. The fear was psychologically real. The probability assessment was catastrophically wrong. This is neglect of probability in its starkest, most fatal form.
The Bias Defined
Neglect of probability (also called probability neglect) refers to the tendency to respond to risks based on the vividness and emotional impact of potential outcomes — rather than on the actual probability of those outcomes occurring. In extreme cases, the size of the probability stops mattering altogether: a frightening outcome with a 1-in-10,000 chance and the same outcome with a 1-in-10-million chance provoke much the same response, while an unfrightening risk of either size barely registers. The mathematical magnitude of risk disappears; what remains is a binary: "scary" or "not scary."
The concept was formalised by Cass Sunstein in a 2002 paper, building on the broader framework of heuristics and biases developed by Daniel Kahneman and Amos Tversky. Related constructs include the affect heuristic (Paul Slovic's term for evaluating risk based on emotional response) and psychic numbing (the failure of large numbers to produce proportionally larger emotional responses).
Flying vs. Driving: The Classic Illustration
Commercial aviation is, by almost every measure, more than an order of magnitude safer per kilometre travelled than private automobile travel. The fatality rate per billion passenger-kilometres for air travel in developed countries is typically around 0.07 deaths; for road travel it is closer to 3.1 — roughly 40 to 50 times higher. Yet fear of flying (aviophobia) affects an estimated 25–40% of the population, while a comparable fear of driving is far rarer.
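The arithmetic behind that comparison is worth making explicit. A minimal sketch in Python, using the illustrative rates quoted above; real figures vary by country, year, and how exposure is measured:

```python
# Per-kilometre fatality rates quoted above, in deaths per billion
# passenger-kilometres. Illustrative figures only; actual rates vary
# by country, year, and measurement method.
AIR_DEATHS_PER_BN_KM = 0.07
ROAD_DEATHS_PER_BN_KM = 3.1

relative_risk = ROAD_DEATHS_PER_BN_KM / AIR_DEATHS_PER_BN_KM
print(f"Driving is roughly {relative_risk:.0f}x deadlier per km than flying")
# -> Driving is roughly 44x deadlier per km than flying
```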
What drives the discrepancy?
- Controllability: In a car, you feel you have agency. On a plane, you are entirely in the hands of others. We systematically underestimate risks we feel in control of, and overestimate risks where control is surrendered — even when that control is largely illusory.
- Catastrophic imagery: A plane crash is spectacular, total, and widely covered. A road accident is common, dispersed, and invisible in the news cycle. The vividness of plane crashes makes them feel more common; the mundanity of road deaths makes them disappear into the background.
- Dread: Paul Slovic's research identified "dread" — the sense of uncontrollability, catastrophic potential, and inequitable distribution of risk — as the primary driver of perceived risk, entirely separate from actual probability. Aviation scores high on dread; driving scores low.
Sharks, Coconuts, and the Availability Heuristic
Shark attacks generate enormous media attention. Globally, they kill roughly 5–10 people per year. Falling coconuts are estimated to kill around 150 people annually (a widely repeated though loosely sourced figure) — yet no one develops coconutphobia. Hippos kill approximately 500 people per year in Africa; mosquitoes, through malaria and other diseases, kill hundreds of thousands. The hierarchy of public fear bears essentially no relationship to the hierarchy of actual danger.
This reflects the availability heuristic: we assess probability by how easily examples come to mind. A shark attack is a vivid, emotionally loaded image — a predator, blood, helplessness, a photogenic monster. It is heavily covered in news and entertainment. A coconut falling on someone's head does not generate a Discovery Channel special. The availability heuristic and neglect of probability reinforce each other: the more vividly we can imagine a risk, the higher our probability estimate; the higher our emotional response, the less we bother checking the actual numbers.
Terrorism and the Policy Response
Sunstein's 2002 analysis was explicitly motivated by post-9/11 policy responses. The attacks killed approximately 3,000 people — a genuine atrocity. The subsequent decade of security measures, two wars, and mass surveillance programmes cost, by various estimates, trillions of dollars and hundreds of thousands of lives. The probability of an American being killed in a terrorist attack in any given year is, statistically, comparable to the probability of being killed by lightning or by a bee sting — roughly 1 in 3.5 million.
This is not an argument that terrorism is unimportant. Terrorism is designed to exploit exactly this cognitive mechanism: to create a level of fear and societal disruption wildly disproportionate to the actual harm inflicted. The asymmetry is the point. A bomb in a marketplace kills dozens while reshaping the security posture of entire nations, because humans cannot assess small probabilities of catastrophic events neutrally. Probability disappears; dread takes over.
The result is systematic misallocation of resources. Sunstein coined the term "probability neglect" specifically to argue that regulatory and policy responses to risk should be calibrated to probability, not to outrage — even when the outrage is politically irresistible. If we applied the same degree of dread to car accidents (predictably killing 38,000 Americans per year) as to terrorism (predictably killing dozens), the policy landscape would look entirely different.
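To see how stark the mismatch is, here is a rough annualised comparison using the figures quoted above; the US population of roughly 330 million is an assumption for illustration:

```python
# Annualised per-capita risk comparison using the figures quoted above.
# The population figure (~330 million) is an assumption for illustration.
US_POPULATION = 330_000_000
ROAD_DEATHS_PER_YEAR = 38_000
TERRORISM_RISK_PER_YEAR = 1 / 3_500_000  # the estimate quoted earlier

road_risk = ROAD_DEATHS_PER_YEAR / US_POPULATION
print(f"Annual road-death risk:      1 in {1 / road_risk:,.0f}")
print("Annual terrorism-death risk: 1 in 3,500,000")
print(f"Ratio: roughly {road_risk / TERRORISM_RISK_PER_YEAR:,.0f}x")
# -> the road risk comes out roughly 400x the terrorism risk
```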
Insurance and the Lottery Effect
Neglect of probability distorts financial behaviour in two opposite directions: it makes us over-insure low-probability high-salience risks and under-insure high-probability low-salience ones.
People routinely purchase extended warranties on household electronics (high salience, low actual failure rate), travel insurance for cheap domestic trips, and flight insurance at airports (purchased precisely when flying fear peaks) — all of which tend to be poor value. Simultaneously, many people are chronically underinsured for health, disability, and property risks that are statistically far more likely to affect them, but which don't trigger the visceral dread response.
Lottery ticket purchasing is the purest example of the reverse asymmetry: a very small probability of a very large gain is treated as a viable prospect, not because people are innumerate, but because the large outcome generates an emotionally compelling image. "I might win £10 million" is vivid and motivating; "I have a 1 in 14 million chance of winning" is abstract and unmotivating. The probability is neglected; the dream is purchased.
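The expected-value arithmetic makes the neglect visible. A minimal sketch, assuming a hypothetical £2 ticket price against the jackpot and odds quoted above, and counting only the jackpot:

```python
# Expected value of the lottery ticket described above. The £2 ticket
# price is a hypothetical; the jackpot and odds are the quoted figures.
# Smaller prizes are ignored for simplicity.
JACKPOT = 10_000_000          # £10 million
P_WIN = 1 / 14_000_000        # roughly the old UK 6/49 jackpot odds
TICKET_PRICE = 2.00           # assumed for illustration

expected_winnings = P_WIN * JACKPOT
print(f"Expected winnings per ticket: £{expected_winnings:.2f}")  # ~£0.71
print(f"Expected loss per ticket:     £{TICKET_PRICE - expected_winnings:.2f}")
```

Real lotteries pay smaller prizes too, which improves the expected value somewhat without changing the conclusion: the ticket buys a vivid dream, not a sound bet.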
The Connection to Other Probability Biases
Neglect of probability is part of a family of related distortions. In prospect theory, Kahneman and Tversky showed that people systematically overweight small probabilities relative to their objective magnitude — but this overweighting is not uniform. It is strongest when the outcome is emotionally extreme, and weakest when the stakes are modest and unemotional. The probability-outcome interaction is key: we do not simply ignore small probabilities, we weight them selectively, rounding them up towards "it could happen to me" when the outcome is vivid and down towards zero when it leaves us cold.
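The shape of that overweighting can be made concrete. Tversky and Kahneman's 1992 follow-up to prospect theory proposed a one-parameter weighting function, w(p) = p^γ / (p^γ + (1 − p)^γ)^(1/γ); the sketch below uses their estimated γ ≈ 0.61 for gains (the affect-dependence of the weighting described above is not modelled here):

```python
# One-parameter probability weighting function from Tversky & Kahneman
# (1992): w(p) = p^g / (p^g + (1 - p)^g)^(1 / g). With g < 1, small
# probabilities are overweighted and mid-to-large ones underweighted.
def weight(p: float, gamma: float = 0.61) -> float:
    """Decision weight attached to an objective probability p."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.001, 0.01, 0.1, 0.5, 0.9):
    print(f"p = {p:<5}  w(p) = {weight(p):.3f}")
# p = 0.001 receives weight ~0.014 -- more than ten times too heavy --
# while p = 0.5 receives only ~0.42.
```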
This selective weighting connects directly to base rate neglect, where population-level frequencies are ignored in favour of case-specific evidence, and to the availability heuristic, where ease of recall substitutes for frequency estimation. All three produce predictable patterns of risk over- and under-estimation that diverge from objective probability in systematic, not random, ways.
Psychic Numbing and Large Numbers
There is a troubling flip side to neglect of probability: when numbers get large, we stop feeling their magnitude. Stalin is alleged to have observed that "one death is a tragedy, a million deaths is a statistic." The psychological research bears this out. Paul Slovic and colleagues have demonstrated that the emotional response to a single identified victim ("little Sophie, who needs surgery") is often stronger than the response to a named group of ten, which is in turn stronger than the response to a statistical description of 100,000. This is "psychic numbing" — the failure of large numbers to generate proportionally large empathy or concern.
The result is paradoxical: we feel more for the one than for the many. Charitable appeals featuring a single face outperform those citing statistics about mass suffering. Humanitarian crises affecting millions receive less per-capita attention than single dramatic rescue stories. The probability of encountering a moving individual story is low — but when we do encounter it, the emotional response overwhelms any probabilistic reasoning about relative priorities.
What Better Probability Reasoning Looks Like
Correcting neglect of probability doesn't require becoming emotionless. It requires developing practices that give probability its due weight:
- Seek denominator data: Never evaluate a risk in isolation. How many people were exposed? What is the rate? The numerator (number of events) without the denominator (number of opportunities) tells you nothing; a minimal sketch of this habit follows the list.
- Compare risks explicitly: Ask "compared to what?" Fear of flying becomes calibrated when you compare it to the risk of driving to the airport. Fear of terrorism becomes calibrated when you compare it to the risk of drowning in a bathtub.
- Distinguish probability from consequences: A low-probability catastrophic risk (nuclear war) may warrant extraordinary precaution for reasons unrelated to its expected value. But the precaution should be explicitly justified on those grounds — not via inflated probability estimates driven by dread.
- Notice when you're responding to vividness, not numbers: The question is not "does this scare me?" but "how likely is this, and how does that compare to other things I'm not scared of?"
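As promised above, a sketch of the denominator habit, turning the raw counts from earlier in this piece into comparable annual rates; both exposed-population figures are assumptions for illustration:

```python
# Converting raw death counts into per-capita annual rates. The counts
# are the figures quoted earlier; both population denominators are
# crude assumptions (world population; Africa's population).
def one_in_n(events_per_year: int, exposed_population: int) -> float:
    """Annual risk expressed as '1 in N': N = population / events."""
    return exposed_population / events_per_year

sharks = one_in_n(10, 8_000_000_000)    # ~10 deaths/yr, world population
hippos = one_in_n(500, 1_200_000_000)   # ~500 deaths/yr, Africa (assumed)

print(f"Shark attack: 1 in {sharks:,.0f} per year")
print(f"Hippo attack: 1 in {hippos:,.0f} per year")
# The hippo risk is ~300x the shark risk, yet the shark owns the nightmares.
```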
We are not built to feel probability. We feel stories, images, and emotions — and then we confabulate probability estimates that justify our feelings. Understanding neglect of probability won't make plane crashes less terrifying. But it might keep you on the plane and off the statistically more dangerous road.
Sources & Further Reading
- Sunstein, C. R. "Probability Neglect: Emotions, Worst Cases, and Law." Yale Law Journal 112, no. 1 (2002): 61–107.
- Slovic, P., Fischhoff, B., & Lichtenstein, S. "Facts versus Fears: Understanding Perceived Risk." In Judgment Under Uncertainty: Heuristics and Biases, ed. Kahneman, Slovic, & Tversky. Cambridge University Press, 1982.
- Gigerenzer, G. Risk Savvy: How to Make Good Decisions. Viking, 2014.
- Kahneman, D., & Tversky, A. "Prospect Theory: An Analysis of Decision under Risk." Econometrica 47, no. 2 (1979): 263–291.
- Slovic, P. "If I Look at the Mass I Will Never Act: Psychic Numbing and Genocide." Judgment and Decision Making 2, no. 2 (2007): 79–95.
- Gaissmaier, W., & Gigerenzer, G. "Statistical Thinking as the Foundation for Evidence-Based Medicine." In Better Doctors, Better Patients, Better Decisions, MIT Press, 2011.
- Wikipedia: Neglect of probability