

Mar 29, 2026 · 7 min read

Curse of Knowledge: Why Experts Can't Explain Things Simply

You know the feeling. You explain something that seems perfectly obvious to you, and the other person stares back blankly. You repeat it, maybe slightly louder. Still nothing. And somewhere in the back of your mind, a small, uncharitable thought surfaces: How can they not get this? That thought is itself a symptom. You have been cursed — not by ignorance, but by knowledge.

The Tapping Study

In 1990, Stanford doctoral student Elizabeth Newton conducted a deceptively simple experiment that became one of the most cited illustrations in the psychology of communication. She divided participants into two roles: "tappers" and "listeners." Tappers were given a list of well-known songs — "Happy Birthday," "The Star-Spangled Banner" — and asked to tap the rhythm on a table with their finger. Listeners had to identify the song.

The results were striking. Of 120 songs tapped, listeners identified only 2.5% correctly — three songs in total. But before the experiment, tappers were asked to predict how often listeners would succeed. Their estimate: 50%.
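The size of that gap is worth pausing on. A quick back-of-the-envelope check of the figures reported above (3 correct out of 120, against a predicted 50%) shows just how far off the tappers' intuition was:

```python
# Figures as reported from Newton's 1990 tapping study (cited above).
songs_tapped = 120
correctly_identified = 3

actual_rate = correctly_identified / songs_tapped  # listeners' real success rate
predicted_rate = 0.50                              # tappers' average prediction

print(f"Actual success rate:    {actual_rate:.1%}")                    # 2.5%
print(f"Predicted success rate: {predicted_rate:.0%}")                 # 50%
print(f"Overestimate:           {predicted_rate / actual_rate:.0f}x")  # 20x
```

The tappers did not just misjudge the odds; they overestimated them by a factor of twenty.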

The tappers were not lying or boasting. They genuinely believed listeners would recognise the song. The problem was that as a tapper tapped, they could hear the full song playing in their head — melody, lyrics, instrumentation. All that rich mental music was completely invisible to the listener, who received only a series of disconnected knocks. The tappers could not turn off the music in their minds. They could not hear what the listener heard. They were cursed by what they knew.

The Bias Defined

The term "Curse of Knowledge" was introduced to the broader public by economists Colin Camerer, George Loewenstein, and Martin Weber in their 1989 paper in the Journal of Political Economy, where they studied how well-informed traders systematically overestimated how much less-informed traders knew. Newton's 1990 study gave it its most memorable experimental demonstration.

The mechanism is this: once a piece of knowledge is deeply integrated into your mental model of the world, it becomes almost impossible to reconstruct what your world looked like before that knowledge existed. Expertise rewires the brain. The neural pathways that once represented confusion have been replaced by pathways representing fluency. There is no "undo" button. You cannot simulate ignorance from inside knowledge.

This is related to but distinct from overconfidence — the Curse of Knowledge is not about overestimating your own abilities, but about underestimating the cognitive distance between your mental model and someone else's.

Where It Shows Up

The Classroom

The most obvious domain is teaching. Expert teachers are paradoxically at risk of being worse explainers than near-peer tutors — students who learned the material recently and can still remember what confusion felt like. A mathematics professor who has taught calculus for thirty years has so thoroughly automated the manipulation of derivatives and integrals that the steps feel trivially obvious. The gaps in a beginner's understanding are genuinely difficult for them to perceive, because from inside deep expertise, there are no gaps to perceive.

This is not a failure of intelligence or care. It is a structural cognitive problem. The professor's knowledge is too complete to model incompleteness from the outside. Research in educational psychology consistently finds that expert explanations contain fewer bridging steps, more assumed background knowledge, and more jargon than novice learners can process — not because experts are lazy, but because the bridges feel unnecessary from inside their understanding.

Medicine

The Curse of Knowledge in medicine can have serious consequences. When a physician explains a diagnosis or treatment plan, they carry in their head a rich context of anatomy, physiology, pharmacology, and clinical experience that has been built over years. The patient carries none of this. Studies of patient-doctor communication consistently find that patients recall less than half of what physicians tell them in consultations — and that physicians systematically overestimate how much patients understand.

Medical informed consent is built on the assumption that explanation produces understanding. The Curse of Knowledge suggests that the connection between explanation and understanding is far less reliable than we assume, particularly when the explainer and the listener have vastly different knowledge bases. Plain language requirements in medical communication exist precisely because expert physicians tend to unconsciously default to the vocabulary and conceptual structure of their training.

Software and Product Design

Every confusing user interface is, at some level, a monument to the Curse of Knowledge. The designers who built it understood the system deeply. They knew what each menu option did, what the error messages meant, what the "advanced settings" were for. That knowledge made the interface feel intuitive to them — because their brains filled in all the gaps automatically. The user, approaching the software for the first time with no such background knowledge, encounters a system that seems to assume context they don't have.

The tech industry has increasingly recognised this through concepts like "beginner's mind" and user testing with actual novices. But the pull of the curse is strong. When you know how something works, you cannot reliably predict what someone who doesn't will find confusing; the only dependable way to find out is to watch novices try to use your product, because self-simulation fails.

Writing

Steven Pinker devoted a chapter to the Curse of Knowledge in his 2014 book The Sense of Style, arguing that it is the primary cause of bad academic and technical writing. Writers know what they mean. Because they know what they mean, they assume context that hasn't been established, use terms before defining them, and skip explanatory steps that feel redundant from the inside. The result is prose that communicates efficiently between one expert and another but leaves everyone else behind.

The fix, Pinker argues, is not dumbing down but rather "classic style" — writing as if guiding a reader to look at something you can both see, building the mental model step by step rather than assuming it exists already.

The Hindsight Bias Connection

The Curse of Knowledge has a temporal cousin: hindsight bias (sometimes called the "I knew it all along" effect). In hindsight bias, knowing how something turned out makes it feel like that outcome was always predictable. In both cases, the mechanism is the same: knowledge, once acquired, retroactively overwrites the feeling of not-knowing. We cannot reconstruct our pre-outcome uncertainty any more than the expert can reconstruct their pre-expertise ignorance.

Both biases are rooted in what psychologists call knowledge contamination: the information we currently have corrupts our ability to simulate accurately what it was like — or what it would be like — to lack that information. The tapper hears music. The outcome-knower sees inevitability. The expert sees obviousness. None of them can switch it off.

Debiasing Strategies

The Curse of Knowledge is not fully curable, but several approaches reduce its impact:

  • Concrete examples before abstractions: Expert knowledge tends to be stored abstractly. Starting with a specific, tangible example gives the novice a hook before the abstract framework is introduced.
  • Explain to a specific person: Rather than imagining an abstract "audience," imagine explaining to one real person you know who is unfamiliar with the subject. This makes the knowledge gap more concrete and easier to bridge.
  • Test with actual novices: The only fully reliable test of whether an explanation works is to watch someone who doesn't know the material try to use it. Self-assessment by experts is structurally unreliable.
  • The "five-year-old" test: Not literally for five-year-olds, but forcing yourself to articulate the core concept in the simplest possible language reveals which parts you can actually explain and which parts you are covering with jargon that hides gaps.
  • Near-peer review: Ask someone who learned the material recently — not years ago — to review your explanation. Their memory of confusion is more recent and less contaminated.

The Curse of Knowledge is, in a sense, the price of mastery. You cannot become truly expert in something without losing your ability to see it as a beginner sees it. Understanding this is not an excuse for poor communication — it is a reason to design communication processes that compensate for a bias we cannot simply decide to stop having.

Sources & Further Reading

  • Camerer, C., Loewenstein, G., & Weber, M. "The Curse of Knowledge in Economic Settings." Journal of Political Economy 97, no. 5 (1989): 1232–1254.
  • Newton, E. L. "Overconfidence in the Communication of Intent: Heard and Unheard Melodies." Doctoral dissertation, Stanford University, 1990.
  • Pinker, S. The Sense of Style: The Thinking Person's Guide to Writing in the 21st Century. Viking, 2014.
  • Heath, C., & Heath, D. Made to Stick: Why Some Ideas Survive and Others Die. Random House, 2007.
  • Hinds, P. J. "The Curse of Expertise." Journal of Experimental Psychology: Applied 5, no. 2 (1999): 205–221.
  • Wikipedia: Curse of knowledge
