Confirmation Bias: The Filter We Don't Know We're Wearing
Two people read the same news article about immigration policy. One comes away convinced it proves mass immigration is economically beneficial. The other concludes it demonstrates exactly the opposite. The article hasn't changed. Their pre-existing beliefs have. This is confirmation bias — the tendency to search for, interpret, favour, and recall information in a way that confirms or supports what you already believe. It is arguably the single most pervasive and consequential cognitive bias that humans exhibit.
The Three Faces of Confirmation Bias
Confirmation bias is not a single process but a cluster of related tendencies that operate at different stages of cognition.
Biased Search for Information
When we seek information, we tend to search in ways likely to generate confirming evidence. If you believe a supplement improves memory, you are more likely to search "does X improve memory?" than "is X effective?" — and the search results will reflect the framing of your query. We choose news sources that share our worldview, follow social media accounts that reinforce our views, and ask questions in ways that invite agreement rather than challenge. The classic demonstration is Wason's 2-4-6 task: told that the triple 2-4-6 conforms to a hidden rule, and asked to discover that rule by proposing further triples, people almost always propose triples that fit the rule they believe is operating, rather than triples designed to reveal that they are wrong.
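The logic of the 2-4-6 task can be made concrete with a short sketch. This is a hypothetical illustration, not Wason's materials: it assumes the hidden rule from the original study ("any strictly increasing triple") and a tester who wrongly believes the rule is "each number increases by 2".

```python
def hidden_rule(triple):
    """The experimenter's actual rule: the numbers strictly increase."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """The tester's belief: each number is 2 more than the last."""
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Confirming tests: triples chosen to FIT the tester's hypothesis.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]

# A disconfirming test: a triple that deliberately violates the hypothesis.
disconfirming_test = (1, 2, 10)

# Every confirming test gets a "yes" from the hidden rule, so the wrong
# hypothesis survives unchallenged, no matter how many tests are run.
assert all(hidden_rule(t) for t in confirming_tests)

# The disconfirming test ALSO gets a "yes" even though it breaks the
# hypothesis -- the only kind of result that exposes the error.
assert hidden_rule(disconfirming_test)
assert not my_hypothesis(disconfirming_test)

print("Confirming tests never falsified the wrong hypothesis;")
print("only the disconfirming test revealed it.")
```

The point the sketch makes is structural: tests generated from inside the hypothesis can only ever agree with it, so the information needed to reject it never arrives.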
Biased Interpretation
When we encounter ambiguous information, we interpret it in ways consistent with our existing beliefs. In a 1979 study by Charles Lord, Lee Ross, and Mark Lepper, participants who either supported or opposed capital punishment were presented with the same two fictional studies on whether capital punishment deters crime — one supporting deterrence, one opposing it. Rather than updating toward a middle position, both groups became more extreme in their original views. They rated the study that supported their position as more methodologically sound, and found elaborate reasons to dismiss the one that challenged it. The same ambiguous data fed polarisation rather than convergence.
Biased Memory
We do not remember the past as a neutral archive. We selectively encode and recall memories in ways that preserve the coherence of our self-narrative and world-view. Studies of eyewitness memory have shown that people are more likely to remember details consistent with their prior expectations of a situation. After a relationship ends, people tend to rewrite their memory of the relationship's history in ways consistent with the conclusion they've now reached. Facts that fit our beliefs are more accessible; facts that contradict them fade more quickly. This is the mechanism by which the availability heuristic and confirmation bias reinforce each other.
Origins and Evolution
The term "confirmation bias" was popularised by English psychologist Peter Wason, who documented biased hypothesis testing in his 2-4-6 rule study (1960) and later in the famous Wason Selection Task. But the phenomenon itself was noted far earlier. Francis Bacon described something similar in 1620 in his Novum Organum: "The human understanding when it has once adopted an opinion… draws all things else to support and agree with it." Bacon recognised it as a fundamental obstacle to scientific progress, which is partly why scientific practice developed counterweights such as peer review, replication, and, more recently, pre-registered hypotheses.
Evolutionary psychologists have suggested that confirmation bias may have adaptive roots. In a social environment where group cohesion and loyalty mattered enormously, defending your group's beliefs and being a reliable ally may have been more immediately valuable than being accurate. Belief updating in response to every piece of contradicting evidence might signal untrustworthiness. This doesn't make confirmation bias useful in a modern information environment, but it may explain why it is so deeply embedded.
In Politics and Media
The political consequences of confirmation bias are enormous. Partisan media ecosystems are partly a product of it and partly a cause. Once people have formed political identities, they preferentially consume media that confirms those identities, interpret ambiguous political events through the lens of their prior beliefs, and remember politically confirming information better than disconfirming information. Political polarisation in many democracies is partly a story of confirmation bias at scale — accelerated by algorithmic media that detects and amplifies the bias to maximise engagement.
In political debate, confirmation bias often masquerades as principled conviction. People do not typically say "I am interpreting this evidence to confirm my prior belief." They say "the evidence clearly supports my position" — and they genuinely believe it. The bias is mostly unconscious. This makes it particularly hard to address through argument: telling someone they are exhibiting confirmation bias usually generates defensive dismissal, which is itself confirmation of the threatening idea that their reasoning is compromised.
Bulverism — explaining why someone holds a wrong view without first demonstrating that it is wrong — is confirmation bias's rhetorical sibling: both assume the conclusion and work backward to rationalise it.
In Science and Medicine
Confirmation bias is not limited to lay reasoning. It operates in scientific research too, which is why good scientific methodology is designed to counteract it. Publication bias — the tendency of journals to publish positive results over null results — creates a systematic distortion in the scientific literature: we see the studies that found effects, not the many that didn't. And researchers who believe in a hypothesis work harder to "fix" anomalous results that contradict it than anomalous results that support it: disconfirming anomalies get debugged away, while confirming ones are accepted at face value.
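Publication bias can be made concrete with a toy simulation (a hypothetical sketch, not a model of any real literature): run many studies of a true-null effect, "publish" only the results that reach a conventional significance threshold, and compare the published record with the full record.

```python
import random
import statistics

random.seed(0)

def run_study(n=30):
    """One study of a true-null effect: the mean of n standard-normal draws."""
    sample = [random.gauss(0, 1) for _ in range(n)]
    mean = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    z = mean / se
    # Crude two-sided test at roughly p < 0.05.
    return mean, abs(z) > 1.96

results = [run_study() for _ in range(2000)]
published = [m for m, significant in results if significant]

# All studies together centre on the true effect of zero...
all_mean = statistics.fmean(m for m, _ in results)
# ...but the published subset, filtered for significance, does not.
pub_mean_abs = statistics.fmean(abs(m) for m in published)

print(f"studies run: {len(results)}, published: {len(published)}")
print(f"mean effect, all studies:      {all_mean:+.3f}")
print(f"mean |effect|, published only: {pub_mean_abs:+.3f}")
```

The true effect is zero, and the full set of studies says so; the published subset, being exactly the studies that happened to cross the significance threshold, reports a substantial effect. A reader who sees only the published half sees confirmation everywhere.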
In medicine, confirmation bias affects diagnosis. A doctor who forms an early hypothesis about what is wrong with a patient will selectively attend to symptoms that confirm the hypothesis and may dismiss symptoms that point elsewhere. A study of diagnostic error in internal medicine found that failure to revise an initial diagnosis in response to contradicting evidence — a direct expression of confirmation bias — was a significant contributor to misdiagnosis.
The Semmelweis Reflex — the institutional rejection of new findings that contradict established practice — can be seen as a collective form of confirmation bias: entire professional communities defending their existing frameworks against evidence that challenges them.
In Everyday Relationships
Interpersonal relationships are not immune. Once we have formed an impression of another person — positive or negative — we interpret their subsequent behaviour through that lens. A person we like does something ambiguous; we give them the benefit of the doubt. A person we distrust does the same thing; we see it as evidence of what we suspected all along. The first impression bias documented in social psychology is partly a function of confirmation bias: once formed, first impressions become self-confirming filters.
In workplaces, managers who form early views about which employees are high-performers give those employees more challenging assignments, more feedback, more credit for ambiguous successes, and less blame for ambiguous failures — creating actual performance differences that then seem to retrospectively justify the original judgment. The prophecy fulfils itself, and confirmation bias hides the mechanism.
Can We Correct for It?
Debiasing research has produced mixed results. Simply knowing about confirmation bias does not reliably reduce it — a humbling finding. People who score highly on measures of cognitive sophistication and need for cognition are often better at finding sophisticated justifications for their pre-existing beliefs, not more immune to having them. Intelligence can be a tool for rationalisation as easily as it is a tool for truth-seeking.
What does help, to some degree:
- Active disconfirmation: Explicitly ask "what would have to be true for my belief to be wrong?" and then go looking for that evidence rather than waiting for it to arrive.
- Considering the opposite: Deliberately generating arguments for the opposing view before evaluating evidence — not as a rhetorical exercise but as a genuine attempt to understand the strongest case against your position.
- Pre-mortem analysis: When planning, assume the plan has failed and ask why. This forces engagement with disconfirming scenarios that confirmation bias would otherwise screen out.
- Structured peer review: Having people who hold different views evaluate your reasoning, with an explicit mandate to find flaws, catches more confirmation bias than self-review.
- Slowing down: Confirmation bias is more pronounced under time pressure and cognitive load. The bias operates fastest when we're processing quickly. Deliberate, effortful reasoning is more likely to catch it.
The Meta-Level
One of the most unsettling features of confirmation bias is that it applies to beliefs about confirmation bias. People who believe others are particularly susceptible to it — especially those with different political views — may be confirming their own prior view of those people's irrationality. Intellectual humility about one's own susceptibility is both the appropriate response and the hardest to maintain.
Sources & Further Reading
- Wason, P. C. "On the Failure to Eliminate Hypotheses in a Conceptual Task." Quarterly Journal of Experimental Psychology 12, no. 3 (1960): 129–140.
- Lord, Charles G., Lee Ross, and Mark R. Lepper. "Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence." Journal of Personality and Social Psychology 37, no. 11 (1979): 2098–2109.
- Nickerson, Raymond S. "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises." Review of General Psychology 2, no. 2 (1998): 175–220.
- Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011.
- Wikipedia: Confirmation bias