The Illusory Truth Effect: How Repetition Manufactures Belief
You have heard it before. You can't quite remember where, but it sounds familiar. And somehow, that familiarity feels like a signal of truth. The statement seems credible — not because you've evaluated the evidence, not because a trustworthy source has vouched for it, but simply because it rings a bell. This is the illusory truth effect: the well-documented cognitive phenomenon that repeated exposure to a claim increases our tendency to believe it, regardless of whether it is actually true.
The Discovery
The illusory truth effect was first systematically documented by Lynn Hasher, David Goldstein, and Thomas Toppino in a 1977 study. Participants rated how valid they believed a series of statements to be — some true, some false — across three sessions spaced two weeks apart, with a subset of statements repeated in every session. The finding was striking: repeated statements were rated as more valid than new ones, even when participants had no conscious memory of having seen them before. Mere repetition made claims feel more credible.
The effect proved robust across decades of replication. It occurs with trivia questions, general knowledge claims, and — crucially — even when participants are explicitly warned that some statements are false, and even when they initially know the statement is false. Familiarity exerts a pull toward perceived truth that is difficult to suppress through explicit knowledge.
Why Repetition Feels Like Truth
The cognitive mechanism underlying the illusory truth effect is what psychologists call processing fluency. When we encounter a stimulus we've seen before, we process it more easily — more fluently — than a novel stimulus. The brain runs the pattern, recognises it, and returns a sense of ease. And crucially, the brain uses this ease as a heuristic: familiar things are easier to process, and easier-to-process things are interpreted as more likely to be true.
This is not entirely irrational. In most ordinary circumstances, familiarity is indeed a signal worth something: things we've encountered before have often been vetted by others, encountered in reliable contexts, or confirmed by experience. The problem arises in an environment where familiarity can be artificially manufactured — where repetition is not a product of genuine prevalence and reliability, but of deliberate strategic exposure.
Processing fluency also explains why the effect persists even when people know a statement is false. The fluency signal is automatic and preconscious; the explicit knowledge that something is false is held in a different, slower, more effortful cognitive system. When the two systems interact — fast fluency versus slow explicit evaluation — fluency can exert a pull even against the explicit verdict. The result is a kind of epistemic erosion: with enough repetition, even known falsehoods feel progressively more plausible.
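The dual-system dynamic above can be sketched as a toy model. This is an illustrative simulation, not a published formula: the saturation curve, the weights, and the function names (`fluency`, `truth_rating`) are all assumptions chosen to show the qualitative pattern — a claim known to be false still drifts toward "feels true" as exposures accumulate.

```python
import math

# Toy model (illustrative assumptions, not a published formula):
# a truth rating is a weighted blend of an automatic fluency signal,
# which grows with repetition, and a slower explicit-knowledge signal.

def fluency(exposures: int) -> float:
    """Processing fluency rises with repetition but saturates (0..1)."""
    return 1 - math.exp(-0.5 * exposures)

def truth_rating(exposures: int, knows_false: bool, w_fluency: float = 0.4) -> float:
    """Blend fast fluency with slow explicit evaluation (0 = false, 1 = true)."""
    explicit = 0.0 if knows_false else 0.5  # known-false vs. merely uncertain claim
    return w_fluency * fluency(exposures) + (1 - w_fluency) * explicit

# Even a claim the reader explicitly knows is false drifts upward with repetition:
for n in (0, 1, 5, 10):
    print(n, round(truth_rating(n, knows_false=True), 3))
```

The point of the sketch is the shape, not the numbers: explicit knowledge lowers the baseline but never cancels the fluency term, which is the "epistemic erosion" described above.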
Propaganda and Political Repetition
Propagandists discovered the illusory truth effect long before psychologists named it. Joseph Goebbels's infamous dictum — "repeat a lie often enough and it becomes the truth" — is a precise operational description of the mechanism (though the precise phrasing may be apocryphal, the strategy was undeniably deliberate). Nazi propaganda systematically deployed repetition of specific claims about Jews, communists, and Germany's enemies. The claims were not primarily designed to convince through argument; they were designed to normalise through saturation.
The technique did not end with the Second World War. Political communication strategists across the ideological spectrum have understood that repetition of key phrases, slogans, and frames — regardless of their factual content — shapes perceived truth. "Death panels." "Crooked Hillary." "Stop the steal." "Weapons of mass destruction." Whatever their factual relationship to reality, these phrases were repeated at sufficient volume and frequency that they became cognitively available reference points, acquiring a solidity that evidence alone would not have given them.
This works particularly well because of how it interacts with confirmation bias: repetition that aligns with existing beliefs is processed as confirming evidence. And the availability heuristic amplifies the effect: frequently repeated claims are cognitively available, and cognitive availability reads as frequency, which reads as prevalence, which reads as truth.
Advertising and Commercial Repetition
The advertising industry has operated on the illusory truth effect for at least a century. Early advertising theorist Thomas Smith wrote in 1885 that a consumer needs to see an advertisement at least 20 times before purchasing — a suspiciously precise figure, but one that correctly identifies the mechanism. Modern advertising science has refined this considerably.
"The most trusted name in news." "The world's favourite airline." "Because you're worth it." "Just do it." These slogans are not arguments; they do not offer evidence for the claims embedded in them. They are vehicles of repetition. By the time a consumer has heard "the most trusted name in news" four hundred times across multiple platforms and contexts, the association between the brand and the quality of trustworthiness has been installed through sheer familiarity, independent of any journalistic track record.
Repetition also functions to embed specific product associations with emotional states. The illusory truth effect blends with classical conditioning: if a brand is consistently paired with images of happiness, health, or desirability, the association becomes automatic. The fluency of the brand extends to the fluency of the feeling it has been paired with. This is advertising working not on beliefs but on intuitions — which are harder to resist precisely because they feel like direct perception rather than persuasion.
Fake News and the Information Environment
The digital information environment has created conditions for the illusory truth effect to operate at unprecedented scale and speed. Social media platforms that prioritise engagement amplify content that generates reaction — which tends to be emotionally charged, novel, and partisan. False information, as studies of Twitter sharing patterns have found, consistently spreads faster and wider than accurate information, partly because it tends to be more novel and emotionally arousing.
The result is that false claims can accumulate repetition at enormous speed. A false claim about a political figure, a vaccine, or a public event can reach millions of repetitions within hours of its creation, establishing a familiarity baseline that will influence credibility assessments for months or years afterward. Even when fact-checkers issue corrections, research on correction effects suggests that the correction typically reaches a smaller audience than the original false claim, is processed less fluently (because it disrupts the established mental model), and leaves a residue of the original false claim even in people who explicitly accept the correction.
This "continued influence effect" — where a corrected falsehood continues to influence judgments even after the correction is consciously accepted — is a direct expression of the illusory truth effect's depth. Fluency-based credibility is harder to undo than to install.
The Interaction with Other Biases
The illusory truth effect does not operate in isolation. Its power is multiplied by several interacting mechanisms:
- Source amnesia: People remember claims better than where they heard them. A statement heard on a known disinformation website can, after enough repetitions from enough sources, lose its association with that origin and acquire a generic familiarity that feels like broad endorsement.
- Social conformity: Widely repeated claims signal that many people believe them — triggering the bandwagon effect alongside the illusory truth effect. The two mechanisms converge on the same conclusion through different routes.
- Illusory correlation: Repeated co-occurrence of two things in the same context creates an impression of causal relationship, independent of any actual correlation. This is how misinformation about vaccine side effects works: repeated juxtaposition of vaccines and adverse events creates a felt association that statistical evidence cannot easily dislodge.
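The illusory-correlation point can be made concrete with a 2x2 contingency table. The numbers below are hypothetical, chosen only to illustrate the trap: repetition makes the vivid co-occurrence cell cognitively available, but the actual association depends on all four cells.

```python
# Hypothetical 2x2 table: "vaccinated?" x "adverse event?".
# Attention fixes on the vivid co-occurrence cell (a); the correct
# judgment requires comparing rates across all four cells.

def phi(a: int, b: int, c: int, d: int) -> float:
    """Phi coefficient for a 2x2 table [[a, b], [c, d]] (-1..1, 0 = no association)."""
    num = a * d - b * c
    den = ((a + b) * (c + d) * (a + c) * (b + d)) ** 0.5
    return num / den

# Adverse events occur at the same base rate with or without the vaccine:
a, b = 50, 950    # vaccinated:   event / no event
c, d = 50, 950    # unvaccinated: event / no event

print("salient co-occurrences:", a)            # what repetition makes available
print("actual correlation:", phi(a, b, c, d))  # 0.0 — no association at all
```

Fifty repeated "vaccinated person had an adverse event" stories create a felt association, while the full table shows a correlation of exactly zero — which is why the felt association is so hard to dislodge with statistics.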
Resistance and Inoculation
Can the illusory truth effect be resisted? Research suggests some approaches, though none is fully effective:
- Accuracy prompts: Asking people to briefly reflect on the accuracy of a claim before sharing it has been shown in some studies to increase the quality of information shared on social media. The prompt activates deliberate processing, partially counteracting fluency-based automatic evaluation.
- Inoculation theory: Prebunking — warning people about specific disinformation techniques, including the role of repetition, before they encounter false claims — builds resistance more effectively than fact-checking after the fact. If people understand that repetition is a manipulation technique, they can apply deliberate scepticism to familiar-feeling claims.
- Reduce source amnesia: Explicitly labelling sources at the point of first exposure — and attaching warnings to low-quality sources — makes it harder for false claims to shed their provenance through repeated sharing.
- Notice familiarity without trusting it: Training yourself to ask "does this feel true because I've evaluated it, or because I've simply heard it a lot?" introduces a check that fluency-based credibility can otherwise bypass.
The Uncomfortable Implication
The most uncomfortable implication of the illusory truth effect is that critical evaluation is not sufficient protection. A person who prides themselves on careful reasoning can still have their credibility assessments shifted by repetition, because the mechanism operates below the level of deliberate reasoning. This is not a failure of intelligence; it is a feature of how human cognition is structured. The effective response is not more effort at individual evaluation but structural interventions in the information environment — reducing the reach of disinformation at the source, designing platforms that don't preferentially amplify repetition, and building pre-emptive media literacy that specifically targets fluency-as-truth.
Knowing about the illusory truth effect is the beginning, not the end. The effect will try to work on you even as you read this description of how it works.
Sources & Further Reading
- Hasher, Lynn, David Goldstein, and Thomas Toppino. "Frequency and the Conference of Referential Validity." Journal of Verbal Learning and Verbal Behavior 16, no. 1 (1977): 107–112.
- Fazio, Lisa K., et al. "Knowledge Does Not Protect Against Illusory Truth." Journal of Experimental Psychology: General 144, no. 5 (2015): 993–1002.
- Pennycook, Gordon, et al. "Prior Exposure Increases Perceived Accuracy of Fake News." Journal of Experimental Psychology: General 147, no. 12 (2018): 1865–1880.
- Lewandowsky, Stephan, et al. "Misinformation and Its Correction: Continued Influence and Successful Debiasing." Psychological Science in the Public Interest 13, no. 3 (2012): 106–131.
- Vosoughi, Soroush, Deb Roy, and Sinan Aral. "The Spread of True and False News Online." Science 359, no. 6380 (2018): 1146–1151.
- Wikipedia: Illusory truth effect