We ignore what doesn’t fit with our biases – even if it costs us

Authored by newscientist.com and submitted by mvea

We can’t help but be more welcoming of information that confirms our biases than facts that challenge them. Now an experiment has shown that we do this even when it means losing out financially.

Most research on confirmation bias has focused on stereotypes that people believe to be true, says Stefano Palminteri at École Normale Supérieure (ENS) in Paris. In such experiments, people hold on to their beliefs even when shown evidence that they are wrong. “People don’t change their minds,” says Palminteri.

But those kinds of beliefs tend not to have clear repercussions for the people who hold them. If our biases cost us financially, would we realise that they are not worth holding on to?


To find out, Palminteri and his colleagues at ENS and University College London set 20 volunteers a task that involved learning to associate made-up symbols with financial reward. In the first of two experiments, the volunteers were shown two symbols at a time and had to choose between them. They then received a financial reward that varied depending on their choice.

By repeating this multiple times, the volunteers learned how much the various symbols were worth. However, they could only see this information for the symbols they had chosen.

In the second experiment, the same volunteers were again asked to choose between pairs of abstract symbols. This time, they were told the value of both the symbol they had chosen and the one they hadn’t.

The first experiment helped the volunteers learn which symbols were most valuable, while the second was designed to show them that the symbols they hadn’t chosen could be worth more.

However, the second experiment did not change the participants’ preferences. Despite now seeing that the symbols they passed over were often worth more, they continued to choose the ones they had learned to favour in the first experiment. This meant they kept dismissing symbols that would have paid them more.


This suggests that people generally ignore new information that counters their beliefs, even though doing so costs them financially, says Palminteri. “It’s as if you don’t hear the voices in your head telling you that you’re wrong, even if you lose money,” he says.
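To make the idea concrete, here is a minimal Python sketch of a learner doing a task like the one described above: two symbols with probabilistic payoffs, a first phase where only the chosen symbol’s payoff is revealed, and a second phase where both are. The payoff probabilities, learning rates and update rule are illustrative assumptions, not the study’s actual parameters or model; the learner simply weights outcomes that confirm its current choice more heavily than outcomes that contradict it, which is enough for the phase-1 favourite to survive complete feedback.

```python
import random

random.seed(1)

# Illustrative, assumed payoff probabilities -- not the study's actual values.
# Each symbol pays 1 point with some probability, otherwise nothing.
PAYOFF_PROB = {"A": 0.5, "B": 0.7}  # B is objectively the better symbol


def payoff(symbol):
    """Probabilistic reward for a symbol."""
    return 1.0 if random.random() < PAYOFF_PROB[symbol] else 0.0


# Learned value estimates for each symbol.
Q = {"A": 0.0, "B": 0.0}

# Assumed bias parameters: outcomes that confirm the current choice are
# learned from more strongly than outcomes that contradict it.
LR_CONFIRM, LR_DISCONFIRM = 0.30, 0.05


def update(symbol, outcome, chosen):
    """Biased value update: for a chosen symbol, a better-than-expected payoff
    confirms the choice; for an unchosen symbol, a worse-than-expected one does."""
    error = outcome - Q[symbol]
    confirms_choice = (error > 0) == chosen
    rate = LR_CONFIRM if confirms_choice else LR_DISCONFIRM
    Q[symbol] += rate * error


def choose():
    """Pick the symbol the learner currently values most (greedy choice)."""
    return max(Q, key=Q.get)


# Phase 1: partial feedback -- only the chosen symbol's payoff is revealed.
# The learner starts with A (ties break toward A here) and, rewarded often
# enough, sticks with it, so it never finds out what B is worth.
for _ in range(50):
    symbol = choose()
    update(symbol, payoff(symbol), chosen=True)

# Phase 2: complete feedback -- both payoffs are shown on every trial, but
# good news about the unchosen symbol counts as disconfirming and is
# down-weighted, so the phase-1 preference tends to survive.
for _ in range(50):
    symbol = choose()
    other = "B" if symbol == "A" else "A"
    update(symbol, payoff(symbol), chosen=True)
    update(other, payoff(other), chosen=False)

print(Q)  # Q["A"] typically stays above Q["B"], despite B's higher true value
```

In this toy setup the favoured symbol’s estimated value stays inflated while the better symbol’s estimate is held down, so the greedy learner keeps picking the symbol that pays less, mirroring the behaviour reported above.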

Palminteri hopes that we can learn to be aware of our own biases, but says that will be hard – if a person believes they are not biased, it is difficult to shift this belief. And even if some people are aware they are biased, it is probably impossible to eliminate all their biases. “Complete objectivity is probably something we will never fully achieve,” says Palminteri.

Our faith in our biases can make us believe we are right even when we are wrong. “In the end, people will have the impression that they are performing better than they actually are,” says Palminteri. “That could increase self-confidence, and provide a motivational boost.”

Journal reference: PLoS Computational Biology, DOI: 10.1371/journal.pcbi.1005684

IgnisDomini on September 4th, 2017 at 14:36 UTC »

Just to stop people from getting on their high horse about how unbiased they are:

People who believe they are less biased than their peers usually turn out to be more biased.

tacotaskforce on September 4th, 2017 at 13:57 UTC »

Maybe I am completely misunderstanding this experiment, but this sounds like it has to do with risk aversion, not bias.

runner-33 on September 4th, 2017 at 12:51 UTC »

This could also have an impact on science, since bias prevents us from interpreting experimental results with an open mind.

To put it differently: Are the best scientists the ones with the lowest confirmation bias?