You’re scrolling through your feed when you see a shocking claim—maybe that a new study “proves” coffee causes cancer, or that a celebrity was arrested for something scandalous. You raise an eyebrow, but it sticks in your mind. A few days later, you hear that the claim was false, debunked by experts.
Here’s the problem: a month later, when someone asks, “Hey, isn’t coffee bad for you?” you hesitate. The original claim still lingers, even though you know it was wrong.
This is the continued influence effect, a psychological phenomenon in which misinformation keeps affecting our thinking even after we’ve been told it’s false. By the time the truth arrives, the damage is already done.
Our brains latch onto information quickly but aren’t so great at letting it go. When we first hear something, it becomes a mental shortcut, an easy reference point. Even when we later learn it’s false, our minds often fail to update properly.
Psychologists believe this happens because first impressions stick. The first version of a story we hear feels like the real version, while corrections feel like an addition rather than a replacement. Misinformation also tends to be more emotional and attention-grabbing than the truth. A false claim about a politician being corrupt sparks outrage; the correction that the accusation was baseless feels dull by comparison. The more times we hear something, the more familiar it feels, and familiarity breeds belief.
Take the infamous claim that vaccines cause autism. This falsehood originated in a 1998 study that was later retracted, yet it continues to shape public opinion decades later. Despite overwhelming scientific evidence disproving any link, the idea has stuck.
Ironically, trying to correct misinformation can sometimes make it stronger. Repeating a false claim, even to debunk it, reinforces it in people’s minds. Corrections also demand more mental effort: a falsehood like “vaccines cause autism” is simple, while debunking it requires explanations of immune responses, epidemiology, and scientific studies. People resist changing their minds, especially when the false information aligns with their existing beliefs. Once a claim becomes part of someone’s identity, rejecting it feels like admitting they were wrong, which is uncomfortable.
The "Obama birther" conspiracy is a perfect example. Even after repeated fact-checks confirming that Barack Obama was born in the U.S., many Americans still believed the false claim. The controversy itself kept the myth alive, and the more people saw it discussed, the more it felt like a legitimate question rather than a settled fact.
Misinformation doesn’t just influence what we believe—it rewires how we remember. Studies show that when people are exposed to false claims, they later struggle to recall whether they were true or false. In one experiment, participants read fictional crime reports containing false details. Even after being told which information was incorrect, they continued to rely on it when making decisions about the case. The misinformation had already shaped their perception.
A similar study found that even when false headlines were labeled as “false,” people who saw them multiple times were more likely to believe them later. Just seeing misinformation repeatedly made it feel more familiar, and familiarity often wins over fact-checking.
Once misinformation takes hold, it’s hard to erase. Debunking helps, but it’s rarely enough. The most effective strategy isn’t correcting falsehoods after they spread—it’s preventing them from taking hold in the first place.
Psychologists have found that prebunking works better than debunking. Teaching people how misinformation spreads before they encounter it makes them more resistant to it. If you know in advance that certain headlines use emotional manipulation to mislead, you’re less likely to fall for them.
The source of a claim matters just as much as its content. When people first hear something, they often remember the claim but forget where it came from. That’s why misinformation spreads so easily from shady websites while legitimate sources struggle to correct it. Before believing a claim, it’s worth asking: Who benefits from me believing this?
Truth also needs to be more engaging. If misinformation spreads through emotional, compelling narratives, fact-checking needs to do the same. Dry, technical corrections don’t stick. One effective approach is the “truth sandwich”—repeating the correct information before and after addressing the false claim. Instead of saying, “Vaccines don’t cause autism,” a better correction would be: “Vaccines are safe and save millions of lives. The claim that they cause autism is false and based on a debunked study. In reality, vaccines are one of the greatest medical advancements in history.” By reinforcing the truth at the beginning and end, the false claim doesn’t dominate the conversation.
Once a false idea spreads, it rarely disappears completely. Even when people accept a correction, traces of the misinformation remain in their memory, waiting to resurface. That’s why the best defense isn’t just fact-checking—it’s teaching people how to spot misinformation before it spreads.
Next time you hear a viral claim, don’t just ask, “Is this true?” Ask, “Will I still remember that it was false a month from now?” Because that’s what really matters.