Argument from Personal Incredulity: "I Can't Understand It, So It Must Be False"
"I just can't see how a random mutation could produce something as complex as the human eye." "There's no way millions of years of random chance could build a functioning cell from scratch." "If we really evolved from apes, why are there still apes?" Each of these statements has a common structure: the speaker announces that they cannot personally comprehend how something could be true — and uses that personal incomprehension as evidence that it isn't. This is the argument from personal incredulity, and it is among the most psychologically natural — and epistemically dangerous — fallacies in the catalogue.
The Basic Structure
The argument from personal incredulity takes this form:
P1: I cannot understand (or imagine, or conceive of) how X could be true.
C: Therefore, X is false (or probably false, or unproven).
The flaw is immediate: the limits of one person's comprehension are not evidence about the nature of reality. The universe is not obligated to be comprehensible to any given mind. What a person finds inconceivable depends on their education, experience, cognitive biases, and the complexity of the subject matter — none of which are properties of the world they're attempting to understand.
This might seem obvious. But the fallacy persists precisely because it feels like evidence. Incomprehension is experienced internally as a genuine signal — something is wrong, something doesn't add up, something refuses to resolve into a coherent picture. That signal is valuable in many contexts. When your car mechanic's explanation doesn't quite make sense, your confusion might indicate you're being deceived. But when a physicist explains quantum entanglement and it doesn't make intuitive sense, your confusion is almost certainly tracking the limits of your intuition, not a flaw in quantum mechanics.
The Dunning-Kruger Dimension
The argument from personal incredulity has a deep relationship with the Dunning-Kruger effect — the well-documented cognitive phenomenon in which people with limited knowledge in a domain systematically overestimate their competence, while experts tend to be more aware of what they don't know.
In the original 1999 study by Justin Kruger and David Dunning, participants who scored in the bottom quartile on tests of logical reasoning, grammar, and humour consistently estimated their own performance as above average. They lacked the metacognitive skill to recognise their incompetence — precisely because that metacognitive skill is part of what competence in the domain requires.
The connection to personal incredulity is direct: the less someone knows about evolutionary biology, cosmology, or climate science, the more likely they are to find the expert consensus incredible. Their incomprehension is not a neutral observation — it is partly constituted by their lack of background knowledge. But because they lack that background, they cannot easily distinguish "this is incomprehensible because it's wrong" from "this is incomprehensible because I don't know enough to understand it."
A biologist who has spent twenty years studying molecular genetics does not find the emergence of complex structures from natural selection "incredible." They find it mechanistically clear, even boring in its inevitability. Their comprehension is not evidence either — but the asymmetry in who finds what credible is itself informative about where the fallacy tends to operate.
Evolution: The Case Study
No scientific theory has attracted more arguments from personal incredulity than evolutionary biology. The reasons are partly sociological (evolution challenges certain religious cosmologies) and partly structural: natural selection and deep time are genuinely counterintuitive. Human minds evolved to track objects, agents, and events on human timescales. We are not naturally equipped to intuit what one hundred million years of accumulated selective pressure can produce.
Richard Dawkins coined the phrase "argument from personal incredulity" in The Blind Watchmaker (1986), specifically in response to the claim that natural selection cannot explain the complexity of biological organisms. His response was simple: the fact that something seems implausible to an uninformed observer is not evidence against it. The eye — often cited as "irreducibly complex" by proponents of intelligent design — in fact has a well-documented evolutionary pathway, with functional intermediate forms, from light-sensitive patches to pinhole eyes to lensed eyes, visible in living species across the animal kingdom.
The creationist argument typically conflates two claims: "I cannot imagine the evolutionary pathway for X" and "there is no evolutionary pathway for X." The first is a psychological report. The second is an empirical claim. They are not equivalent, and treating them as equivalent is the fallacy.
Flat Earth and the Geography of Disbelief
The modern flat-earth movement provides perhaps the purest contemporary case study in personal incredulity. Its adherents are not, by and large, unintelligent people — they are people who have applied genuine critical thinking in a domain where they lacked the necessary background, and arrived at confident wrongness.
A common flat-earth argument runs: "I've looked out at the ocean and it looks flat to me. I've flown in a plane and don't feel the curvature. Therefore the earth is flat." This is personal incredulity at scale: the person's direct perceptual experience seems inconsistent with a spherical earth — the curvature isn't visible at normal altitudes and distances — and they use that perceptual incongruity as evidence against the scientific consensus.
What is missing is the knowledge that the earth is large enough that its curvature is imperceptible at ground level without instruments, and that numerous independent lines of evidence — the hulls of departing ships disappearing before their masts, different constellations at different latitudes, the circular shadow of the earth on the moon during lunar eclipses — all converge on the spherical model. Personal perception is being used to override a vast, cross-disciplinary, independently-reproducible body of evidence. This is the fallacy in its most epistemically stark form.
"I Can't See How" vs. "There Is No Way"
The core logical error in personal incredulity is treating a first-person epistemic state as a third-person fact about the world. "I can't see how X is possible" describes something about the speaker's cognitive model. "X is not possible" describes something about reality. These are very different claims, and sliding between them without argument is the fallacy.
Philosophers distinguish between epistemic possibility (compatibility with what the speaker knows) and metaphysical possibility (compatibility with the way reality actually is). Personal incredulity systematically confuses them: because something is not epistemically possible for the speaker — because they can't integrate it into their current understanding — they conclude it is not metaphysically possible either.
This is a category error. History is littered with things that were not epistemically possible for some people — heavier-than-air flight, germ theory, plate tectonics, quantum superposition — that turned out to be not only metaphysically possible but actual.
When Incredulity Is (Partially) Justified
Not all incredulity is fallacious. There is a legitimate version: when a claim seems implausible, that implausibility should raise the evidential bar, not end the inquiry. Carl Sagan's principle — "extraordinary claims require extraordinary evidence" — is not an argument from personal incredulity. It is a calibrated response to prior probability: claims that conflict with well-established knowledge need to overcome a higher evidential burden precisely because the established knowledge represents accumulated, cross-validated evidence.
The difference is in how the incredulity is used:
- Fallacious: "I can't understand how this is true, therefore it's false."
- Legitimate: "This seems implausible given what I know — I'll require stronger evidence before accepting it."
The first closes inquiry. The second calibrates it. The first treats incomprehension as a conclusion. The second treats it as a reason to dig deeper.
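The "calibration" reading of Sagan's principle can be made precise in Bayesian terms. This is an interpretive sketch, not a claim from the sources above, and all the numbers in it are illustrative assumptions: a low prior does not make a claim false, it just means the same piece of evidence moves belief much less.

```python
def posterior_probability(prior, likelihood_ratio):
    """Bayesian update in odds form.

    likelihood_ratio = P(evidence | claim true) / P(evidence | claim false).
    """
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# An ordinary claim (prior 0.5): evidence 10x more likely if true
# than if false is enough to make it very probable.
ordinary = posterior_probability(prior=0.5, likelihood_ratio=10)

# An extraordinary claim (prior 0.001): the very same evidence
# barely moves it. It needs much stronger evidence, not dismissal.
extraordinary = posterior_probability(prior=0.001, likelihood_ratio=10)

print(round(ordinary, 3))       # ≈ 0.909
print(round(extraordinary, 3))  # ≈ 0.01
```

The fallacious move, in these terms, would be to treat the low prior itself as the conclusion ("probability ~0, case closed") rather than as the starting point for an update.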
The Self-Examination Problem
Personal incredulity is uncomfortable to acknowledge in oneself because it masquerades as principled scepticism. The person deploying it typically feels that they are being rigorously sceptical — refusing to accept something that doesn't make sense. That impulse is not wrong. The error is in the inference: from "doesn't make sense to me" to "doesn't make sense."
A useful discipline is to ask: Is my difficulty understanding this a problem with the theory, or a problem with my background knowledge? Have I made a good-faith effort to understand the best version of the explanation? Am I encountering the real argument, or a simplified version that omits the components that would resolve my confusion?
These questions are effortful. Personal incredulity, like most fallacies, succeeds partly because it is cognitively cheap. Comprehension is expensive; incomprehension feels like a finding.
Related Fallacies
- Appeal to Ignorance (argumentum ad ignorantiam) — "It hasn't been proven, therefore it's false." Shares the structure of treating an absence (of understanding, of proof) as positive evidence.
- God of the Gaps — a specific application of personal incredulity in which unexplained phenomena are attributed to divine intervention. The incomprehension is used to insert a particular explanation rather than simply reject another.
- Appeal to Complexity — the argument that because something is complex, it must have been designed. Often underlies incredulity-based objections to natural processes.
Sources & Further Reading
- Dawkins, R. (1986). The Blind Watchmaker. Norton. (Chapter 1 introduces the phrase "argument from personal incredulity.")
- Kruger, J., & Dunning, D. (1999). "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments." Journal of Personality and Social Psychology, 77(6), 1121–1134.
- Sagan, C. (1995). The Demon-Haunted World: Science as a Candle in the Dark. Random House.
- Shermer, M. (2002). Why People Believe Weird Things, 2nd ed. Holt.
- Pennycook, G., et al. (2015). "On the reception and detection of pseudo-profound bullshit." Judgment and Decision Making, 10(6), 549–563.