When False Claims Are Repeated, We Start To Believe They Are True — Here’s How Behaving Like A Fact-Checker Can Help

Authored by digest.bps.org.uk and submitted by mvea

If you hear an unfounded statement often enough, you might just start believing that it’s true. This phenomenon, known as the “illusory truth effect”, is exploited by politicians and advertisers — and if you think you are immune to it, you’re probably wrong. In fact, earlier this year we reported on a study that found people are prone to the effect regardless of their particular cognitive profile.

But that doesn’t mean there’s nothing we can do to protect ourselves against the illusion. A study in Cognition has found that using our own knowledge to fact-check a false claim can prevent us from believing it is true when it is later repeated. But we might need a bit of a nudge to get there.

The illusory truth effect stems from the fact that we process repeated statements more fluently: we mistake that feeling of fluency for a signal that the statement is true. And the effect occurs even when we should know better — when we repeatedly hear a statement that we know is wrong, like “The fastest land animal is the leopard”. But Nadia Brashier at Harvard University and colleagues wondered whether asking people to focus on the accuracy of a statement could encourage them to use their knowledge instead, and avoid relying on feelings of fluency.

In the initial study, the team first asked 103 participants to read 60 statements about widely known facts, some of which were true (e.g. “The Italian city known for its canals is Venice”) and some of which were false (e.g. “The planet closest to the sun is Venus”). One group rated how interesting each statement was, while the other rated how true it was. Then in the second part of the study, both groups saw the same 60 statements along with 60 new ones — again a mixture of true and false — and rated their truthfulness.

The researchers found that participants who had focused on how interesting the statements were in the first part of the study showed the illusory truth effect: they subsequently rated false statements which they had already seen as more true than false statements which were new. But the group that had initially focused on the accuracy of the statements didn’t show this effect, rating new and repeated false statements as equally true.

This finding suggests that using our own knowledge to critically analyse a statement when we first encounter it may inoculate us against the illusory truth effect. And the protection seems to be fairly long-lasting: in another experiment, the team found that participants who had initially focused on the accuracy of the statements still showed no sign of succumbing to the illusory truth effect two days later.

But considering the accuracy of a statement is only useful if we already have appropriate knowledge (e.g. that the closest planet to the sun is Mercury and not Venus). In further studies, the team found that rating the truthfulness of more obscure false statements which participants didn’t know much about, such as “The twenty-first U.S. president was Garfield,” didn’t later protect against the illusory truth effect. It would be interesting to know whether fact-checking against external sources like the internet or reference books — which requires more effort than simply using our own knowledge — is effective at combating the illusion in these cases.

Still, simply having the background knowledge needed to counter false claims is not always enough, say the authors — their results suggest people may need to be “nudged” into actually using that knowledge. “Education only offers part of the solution to the misinformation crisis; we must also prompt people to carefully compare incoming claims to what they already know,” they write.

– An initial accuracy focus prevents illusory truth

Matthew Warren (@MattbWarren) is Editor of BPS Research Digest

Sun-Anvil on September 13th, 2019 at 01:41 UTC »

The quote, which reads “Make the lie big, keep it simple, keep saying it and eventually they will believe it,” is attributed to the Third Reich's propaganda supremo, Dr Joseph Goebbels.

egoomega on September 13th, 2019 at 01:26 UTC »

funny seeing this on reddit

kyna689 on September 12th, 2019 at 23:45 UTC »

“Using our own knowledge to fact-check” is literally how this phenomenon propagates. Learn how to check sources and find legitimate ones. Learn how to read studies and how to debunk their methodology.