Forming beliefs in a world of filter bubbles

Authored by mpib-berlin.mpg.de and submitted by mvea

Study examines how people deal with diverging information

Why do so many Republicans still believe that the recent US presidential election was fraudulent? Is it possible to reach coronavirus deniers with factual arguments? A study by researchers at the Max Planck Institute for Human Development and the University of Amsterdam provides insights into what stops people from changing their minds. Their findings have been published in the journal Proceedings of the Royal Society B.

By talking to other people and observing their behavior, we can learn new things, acquire new skills, and adapt to changing conditions. But what if the information provided by the social environment is inconsistent or contradictory? In a recent study, researchers from the Max Planck Institute for Human Development and the University of Amsterdam have investigated how people deal with information from diverse social sources, and how they use that information to form beliefs. “The internet, in particular, has dramatically changed the structure and dynamics of social interactions. The availability of social sources is to some extent controlled by algorithms: what we see is biased in favor of our own preferences. At the same time, the internet gives us access to potentially conflicting views,” says lead author Lucas Molleman, associate research scientist in the Center for Adaptive Rationality at the Max Planck Institute for Human Development and postdoc at the University of Amsterdam.

The researchers first conducted an experimental study with 95 participants from the United States. Participants completed an adapted version of the Berlin Estimate AdjuStment Task (BEAST), which reliably measures individuals’ use of social information. They were shown images of groups of animals and asked to estimate the number of animals in each. They were then shown the estimates of three other participants and asked to make a second estimate. The further participants moved their second estimate toward those of their peers, the more strongly they were considered to have relied on social information.
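
The press release does not spell out how “use of social information” is scored. As a rough illustration, BEAST-style tasks commonly quantify it as the fraction of the distance toward the social information that the second estimate covers; the function name, the choice of the peer mean as the social reference point, and the clipping below are assumptions made for this sketch, not the authors’ published procedure:

```python
def adjustment_weight(first, second, peer_estimates):
    """Fraction of the distance toward the peers' mean covered by the
    revised estimate: 0 means sticking with one's own estimate, 1 means
    fully adopting the social information. Illustrative sketch only."""
    social = sum(peer_estimates) / len(peer_estimates)  # assumed: mean of the three peer estimates
    if social == first:
        return 0.0  # no gap to move across
    s = (second - first) / (social - first)
    return max(0.0, min(1.0, s))  # clip small overshoots

# Example: first guess 80 animals, peers report 100, 104 and 108, revised guess 90
print(adjustment_weight(80, 90, [100, 104, 108]))  # ~0.42
```

In this example, moving from 80 to 90 against a peer mean of 104 yields a weight of about 0.42, a compromise that still favors the original estimate.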

Across 30 rounds of the task, the researchers varied the conditions of the study, presenting participants with estimates that deviated to a greater or lesser extent from their own estimate, and that were more or less extreme. The results showed that whether participants integrated information from the social environment in their second estimate depended on whether and how strongly their peers’ estimates deviated from each other and from their own estimate. Participants were most likely to adjust their estimates when their peers were in close agreement with each other and their estimates were not too different from the participant’s own. Higher variation in peers’ estimates reduced their impact on the participant’s own judgment. In general, participants gave more weight to their own initial estimate than to their peers’ estimates.

Overall, three adjustment strategies were identified: (1) sticking to one’s original estimate, (2) adopting the estimate of one of the three peers, or (3) compromising between one’s original estimate and the peer estimates. The relative frequency of these strategies differed significantly between study conditions. When participants observed a single peer who closely agreed with them, they were more likely to stick to their original estimate or to adopt the estimate of that near peer. When no peer was in close agreement with them, participants were more likely to compromise by adjusting their estimate towards, but rarely beyond, that of the nearest peer.
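
A minimal sketch of how those three strategies might be labeled from the raw estimates; the tolerance of one animal and the order of the checks are illustrative assumptions, not the classification rule from the paper:

```python
def classify_strategy(first, second, peer_estimates, tol=1.0):
    """Label a revised estimate as one of the three strategies described
    above. Tolerance and rule form are assumptions for illustration."""
    if abs(second - first) <= tol:
        return "stick"       # (1) kept the original estimate
    if any(abs(second - p) <= tol for p in peer_estimates):
        return "adopt"       # (2) moved onto one peer's estimate
    return "compromise"      # (3) moved toward, but not onto, a peer

print(classify_strategy(80, 81, [100, 104, 108]))   # 'stick'
print(classify_strategy(80, 104, [100, 104, 108]))  # 'adopt'
print(classify_strategy(80, 90, [100, 104, 108]))   # 'compromise'
```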

“Our experiment quantifies how people weigh their own prior beliefs and the beliefs of others. In our context, there is actually no reason to assume that one’s own estimate is better than anyone else’s. But what we see here is an effect known in psychology as ‘egocentric discounting’: namely, that people put more weight on their own beliefs than on those of others,” explains co-author Alan Novaes Tump, postdoc at the Center for Adaptive Rationality of the Max Planck Institute for Human Development. “What’s more, our study reveals that this weighting is strongly impacted by the consistency of others’ beliefs with one’s own: people are more likely to heed information that confirms their own beliefs.”

Building on these findings, the researchers developed a model that integrates the observed adjustment strategies and captures the finding that people pay particular attention to social information confirming their personal judgements. Using simulations, they then investigated how people would behave in real-life situations. For example, they simulated a typical filter bubble, in which social information tends to come from like-minded people. They also simulated typical attempts to change people’s minds by confronting them with information inconsistent with their own beliefs. Finally, they investigated how people react to being simultaneously exposed to different groups with extreme beliefs. Their simulations suggest that confirmation effects can lead people to ignore divergent social information, exacerbating filter bubble effects and making attitudes more extreme.
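
The press release describes the model only qualitatively. A toy sketch of confirmation-weighted updating, in which social information far from one’s own belief is discarded and the rest is egocentrically discounted (the parameter values, the distance cutoff, and the 0-100 attitude scale are all assumptions for illustration, not the authors’ model), reproduces the qualitative outcome described above:

```python
import random

def update_belief(belief, peers, self_weight=0.7, tolerance=10.0):
    """One round of confirmation-weighted updating. Peers whose beliefs lie
    more than `tolerance` away are ignored; the rest are averaged and mixed
    in with weight 1 - self_weight (egocentric discounting). All values are
    illustrative assumptions, not parameters from the paper."""
    confirming = [p for p in peers if abs(p - belief) <= tolerance]
    if not confirming:
        return belief  # divergent social information is discarded outright
    social = sum(confirming) / len(confirming)
    return self_weight * belief + (1 - self_weight) * social

# Two camps holding distant beliefs on an assumed 0-100 attitude scale
random.seed(1)
agents = [random.gauss(20, 3) for _ in range(10)] + [random.gauss(80, 3) for _ in range(10)]
for _ in range(50):
    # every agent updates on all other agents' current beliefs
    agents = [update_belief(agents[i], agents[:i] + agents[i + 1:]) for i in range(len(agents))]
print(sorted(round(a) for a in agents))  # two tight clusters near 20 and 80; no convergence
```

Because each camp’s beliefs lie outside the other’s tolerance window, every agent discards the opposing camp’s information, and repeated updating only tightens each cluster around its own position.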

“Although our study was experimental in design, our model helps explain many contemporary phenomena. It shows how the way people process social information can exacerbate filter bubbles on the internet, and why public debates often become polarized as people quickly become impervious to opposing arguments. As interactions increasingly take place online, people can often find information that confirms their existing beliefs, making them less willing to listen to alternatives,” says co-author Wouter van den Bos, adjunct research scientist in the Center for Adaptive Rationality at the Max Planck Institute for Human Development and associate professor at the University of Amsterdam.

In future studies, the researchers want to integrate further aspects of reality into the model to find out, for example, whether it matters if social information comes from a friend, a stranger, an expert, or someone with the same or a different political partisanship. They are also investigating how other people influence individuals’ altruistic giving and compliance with social norms.

Molleman, L., Tump, A. N., Gradassi, A., Herzog, S. M., Jayles, B., Kurvers, R. H. J. M., & van den Bos, W. (2020). Strategies for integrating disparate social information. Proceedings of the Royal Society B: Biological Sciences. https://doi.org/10.1098/rspb.2020.2413

SirBuzzKill777 on November 27th, 2020 at 15:54 UTC »

What's funny is that everyone reading this thinks it applies to others they know, but not to themselves.

buckzer0 on November 27th, 2020 at 14:07 UTC »

Marshall McLuhan described this as the wind tunnel effect; I believe he started talking about fractured, niche, and push interactions in the late 70s.

CrucialLogic on November 27th, 2020 at 13:40 UTC »

You have to wonder how much of a part social media companies play in this. If they build algorithms that present information similar to what the user has viewed in the past, surely that can be a large part of what drives such unbalanced viewing. Facebook, Reddit, Amazon, and all sorts of companies try this with the aim of keeping their "customers" more engaged, which keeps them coming back to the site.