Most people don’t know how to properly evaluate evidence.
As a constant barrage of news events and the spread of misinformation make clear, human beings are far from ideal thinkers. They are prone to errors in judgment, bias, inconsistency, and self-serving delusions, and they fall prey to any number of conspiracy theories and other false beliefs.
Part of the work of psychology is understanding all the common ways human thinking can go wrong, and a new study from the University of California, Berkeley casts light on a ubiquitous error that can contribute to the formation of false beliefs — including conspiracy theories like QAnon, Holocaust denial, the rejection of climate science, and others.
To study how such beliefs form, Louis Marti — a Ph.D. student in psychology — and his team examined a simple scenario in which subjects were expected to form false beliefs. In an online experiment, 500 adults were asked to identify whether each shape in a series was “daxxy.” The term was meaningless, but the subjects were not told this.
The researchers found that subjects’ confidence about what “daxxy” meant was driven by only a handful of recent guesses labeled “correct.” Even when a large number of their earlier guesses had been deemed “incorrect,” a short run of positive feedback was enough to make them confident in their beliefs.
“What we found interesting is that they could get the first 19 guesses in a row wrong, but if they got the last five right, they felt very confident,” Marti said. “It’s not that they weren’t paying attention, they were learning what a Daxxy was, but they weren’t using most of what they learned to inform their certainty.”
What this shows, the authors of the study argue, is that people were much more interested in their most recent experiences, not the totality of the evidence, when considering whether they should have confidence that a certain theory is true. And in the real world, they argue, this cognitive error can result in people failing to search for further evidence because they have attained confidence in their beliefs. “If you think you know a lot about something, even though you don’t, you’re less likely to be curious enough to explore the topic further, and will fail to learn how little you know,” said Marti.
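The gap between recency-driven confidence and evidence-driven confidence can be sketched in a few lines of code. This is a toy illustration, not the study’s actual analysis; the function names and the five-trial window are assumptions chosen to mirror the 19-wrong, 5-right pattern Marti describes:

```python
def recency_confidence(feedback, window=5):
    """Confidence based only on the last `window` trials (the bias)."""
    recent = feedback[-window:]
    return sum(recent) / len(recent)

def evidence_confidence(feedback):
    """Confidence based on the totality of the feedback."""
    return sum(feedback) / len(feedback)

# 19 incorrect guesses (0) followed by 5 correct ones (1),
# mirroring the pattern described in the article.
feedback = [0] * 19 + [1] * 5

print(recency_confidence(feedback))   # 1.0 — fully confident
print(evidence_confidence(feedback))  # ~0.21 — mostly wrong overall
```

A learner using only the recent window reports complete certainty, while the full record shows the guesses were wrong most of the time.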
“If you use a crazy theory to make a correct prediction a couple of times, you can get stuck in that belief and may not be as interested in gathering more information,” added study senior author Celeste Kidd, who is an assistant professor of psychology at UC Berkeley.
We can imagine how this might reinforce real-world conspiracy theories. For example, the QAnon conspiracy theory centers on an anonymous user of the message board 4chan who alleges that the “deep state” is engaged in an underhanded plot to subvert President Donald Trump. Much of the theorizing is baseless and often provably untrue, but the new study suggests that it might not take much “evidence” for people to be hooked into believing in the conspiracy. If just a handful of predictions or assertions made by the conspiracists were confirmed, or appeared to be confirmed to some degree, many observers might take those few data points as justifying high confidence in the theory — even if the vast majority of the theory’s claims were demonstrably wrong.
But there’s no reason to think this phenomenon applies only to outlandish beliefs like conspiracy theories. It is just as likely to have effects in people’s everyday lives, influencing how one thinks about the best route for one’s daily commute, whether one has an allergy to gluten, or whether one is secretly hated by co-workers.
Hopefully, knowing about these kinds of biases and errors can help us avoid falling into these traps. Sometimes, the wisest course is to acknowledge that you lack sufficient evidence to make a judgment on a given topic.
“If your goal is to arrive at the truth, the strategy of using your most recent feedback, rather than all of the data you’ve accumulated, is not a great tactic,” Marti said.