Mar 29, 2026 · 7 min read

The Availability Heuristic: Why Vivid Examples Hijack Our Sense of Risk

A plane crashes. For days, the wreckage is everywhere — on television, on your phone, in conversations at work. A week later, you find yourself gripping your armrest at takeoff, heart rate elevated, wondering whether this flight was really necessary. Meanwhile, you drove to the airport through busy motorway traffic without a second thought. The statistics say you were in vastly more danger on the road. Your brain says something different. This is the availability heuristic at work.

What Is the Availability Heuristic?

In 1973, cognitive psychologists Amos Tversky and Daniel Kahneman published a landmark paper describing a class of mental shortcuts — heuristics — that humans use when estimating probability and frequency. One of the most influential was the availability heuristic: the tendency to judge how likely or common something is based on how easily examples of it come to mind.

The logic of the heuristic is not entirely unreasonable. In a world with limited information, the ease with which you can retrieve examples of something often does correlate with its frequency. You can probably think of more common words than rare ones; you can recall more familiar faces than obscure ones. Mental availability is a rough proxy for real-world prevalence.

The problem is that mental availability is also heavily influenced by factors that have nothing to do with actual frequency: recency, emotional intensity, vividness, media coverage, and personal salience. When these factors dominate, our intuitions about probability can be dramatically, systematically wrong.

The Classic Demonstration

Tversky and Kahneman demonstrated the heuristic with elegant simplicity. They asked participants whether there are more words in English that begin with the letter K, or more words in which K is the third letter. Most people confidently said the first category is larger. In fact, the third-letter category is roughly twice as large — but words starting with K are far easier to retrieve from memory. We search our mental lexicon by first letter, so words beginning with K flood in quickly. Third-position K's require more effortful reconstruction.

The same mechanism extends to risk estimation. In a study asking people to estimate causes of death, dramatic, newspaper-worthy causes — tornadoes, plane crashes, shark attacks — were consistently overestimated relative to quiet, statistically dominant killers like heart disease, stroke, and diabetes. The dramatic causes are memorable, visual, and emotionally charged. The mundane ones are neither, even though they kill orders of magnitude more people every year.

Fear of Flying, Comfort with Driving

Aviation safety provides one of the cleanest real-world illustrations. Commercial aviation, per mile travelled, is roughly 50 to 100 times safer than car travel, depending on the country and methodology used. Yet fear of flying is far more common than fear of driving. The explanation lies almost entirely in availability. Plane crashes are rare enough to be globally newsworthy; they are covered exhaustively and emotionally. Car crashes kill over 1.35 million people annually worldwide — but the deaths are dispersed, undramatic, and individually unreported. No global news cycle marks the 3,700 road deaths that happen every single day.

After major aviation disasters, ticket sales consistently drop. After the September 11 attacks, the shift from flying to driving in the United States was estimated to have caused approximately 1,600 additional road fatalities in the months that followed — people substituting a safer mode of transport for a more dangerous one, because the dangers of flying had become vividly available in their minds.

Media Coverage and the Distortion of Risk

News media operates on a logic that systematically amplifies availability bias. The selection criteria for news — novelty, drama, emotional charge, conflict — preferentially capture rare, dramatic events. Dog bites man is not news; man bites dog is. The result is a news diet that presents a profoundly distorted picture of what is actually dangerous, common, or significant.

This has measurable effects on public risk perception. Studies have shown that populations regularly overestimate the prevalence of violent crime, despite long-term declining trends in most developed countries, because crime is covered dramatically and persistently. In the United States, annual surveys show that a substantial majority of respondents believe crime is increasing in their country — a perception that has persisted even through sustained periods of documented decline, kept alive partly by availability of vivid crime narratives.

Fearmongering as a rhetorical strategy works precisely by exploiting the availability heuristic: flood the information environment with vivid examples of a threat, and the audience will perceive the threat as far more prevalent and imminent than the data support.

Availability in Personal Decision-Making

The heuristic operates beyond risk estimation. It shapes how we evaluate plans, candidates, products, and possibilities.

When we assess whether a strategy will work, we tend to anchor on similar cases that come easily to mind — typically our own experience or recently encountered examples, rather than the full base rate. An entrepreneur who has personally met several successful startup founders may dramatically overestimate the general success rate of startups, because successes are more visible and memorable than the large silent majority of failures.

The same mechanism affects hiring, diagnosis, and investment. A doctor who recently saw a rare disease may be more likely to diagnose it in subsequent patients, because it is now cognitively available. An investor who recently lost money in tech stocks may avoid the sector even when base-rate analysis suggests it is the rational choice. A manager who recently had a bad experience with an employee from a particular background may unconsciously apply that experience as a proxy for the entire group.

Availability and Confirmation Bias

The availability heuristic interacts closely with confirmation bias. Once we hold a belief, we selectively notice and remember information that confirms it — which makes confirming examples more cognitively available, which reinforces the belief, which increases the availability of confirming examples. The two biases form a self-reinforcing loop that can maintain false beliefs in the face of contradicting evidence.

For example: someone who believes a particular ethnic group is disproportionately criminal will notice and remember news stories confirming that belief, and will have those stories readily available when estimating group criminality. Disconfirming stories — members of the group acting prosocially — are noticed less and remembered less, making them less available. The perceived evidence base is systematically skewed.

Availability Cascades

Legal scholar Cass Sunstein and economist Timur Kuran coined the term availability cascade to describe a self-reinforcing process by which a perceived risk gains increasing prominence through a chain of social attention. A story gains attention, which leads to more reporting, which leads to more public concern, which leads to more reporting, which leads to policy responses calibrated to perceived rather than actual risk.

The classic example is the Alar pesticide scare of 1989 in the United States, when a 60 Minutes segment on the chemical used on apples triggered a national panic, collapsed the apple industry, and prompted sweeping regulatory action — all in response to a risk that the scientific evidence, even at the time, did not support at the levels being communicated. The cascade ran on availability, not on data.

Correcting for the Heuristic

Awareness of the availability heuristic does not automatically correct for it — a finding that is itself well-documented in the cognitive bias literature. Simply knowing that you might be overestimating the risk of plane crashes does not make the fear disappear. But explicit debiasing strategies can help:

  • Seek base rates. When estimating probability, actively look for statistical data rather than relying on recalled examples. "What is the actual frequency of X?" is a more reliable question than "How many times have I heard about X?"
  • Consider what you're not hearing. If vivid cases are easy to recall, ask what the unremarkable cases look like and how many of them exist. The absence of dramatic stories about car accidents doesn't mean car accidents aren't happening.
  • Correct for media selection effects. News selects for the unusual. If something is in the news, it is probably precisely because it is not representative of what normally happens.
  • Use reference class forecasting. When evaluating a plan or decision, ask: of all cases similar to this one, what fraction succeeded? This deliberately forces you off individual vivid examples and onto aggregate data.
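The gap between an availability-based judgment and a reference-class judgment can be made concrete with a toy calculation. All the numbers below are hypothetical — the point is only how far an estimate built from recalled examples can drift from one built on the full base rate:

```python
# Illustrative sketch (all numbers hypothetical): an availability-based
# probability estimate versus a reference-class (base-rate) estimate.

def availability_estimate(recalled_successes: int, recalled_failures: int) -> float:
    """Naive estimate from whatever examples come easily to mind."""
    total = recalled_successes + recalled_failures
    return recalled_successes / total

def base_rate_estimate(successes: int, total: int) -> float:
    """Estimate from the full reference class, including the silent majority."""
    return successes / total

# An entrepreneur can recall 8 founders who succeeded and only 2 who failed,
# because successes are more visible and memorable than failures.
vivid = availability_estimate(recalled_successes=8, recalled_failures=2)

# A hypothetical reference class: of 1,000 comparable startups, 100 succeeded.
actual = base_rate_estimate(successes=100, total=1_000)

print(f"availability-based estimate: {vivid:.0%}")   # 80%
print(f"reference-class estimate:    {actual:.0%}")  # 10%
```

The mechanics are trivial; the debiasing work lies entirely in forcing yourself to assemble the denominator — the unremarkable, unreported cases that never came to mind.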

Sources & Further Reading

  • Tversky, Amos, and Daniel Kahneman. "Availability: A Heuristic for Judging Frequency and Probability." Cognitive Psychology 5 (1973): 207–232.
  • Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011.
  • Kuran, Timur, and Cass R. Sunstein. "Availability Cascades and Risk Regulation." Stanford Law Review 51, no. 4 (1999): 683–768.
  • Slovic, Paul. "Perception of Risk." Science 236, no. 4799 (1987): 280–285.
  • Wikipedia: Availability heuristic
