People Who Jump to Conclusions Show Other Kinds of Thinking Errors

Belief in conspiracy theories and overconfidence are two tendencies linked to hasty thinking


How much time do you spend doing research before you make a big decision? The answer for many of us, it turns out, is hardly any. Before buying a car, for instance, most people make two or fewer trips to a dealership. And when picking a doctor, many individuals simply rely on recommendations from friends and family rather than consulting medical professionals or resources such as health-care websites and articles evaluating physicians, according to an analysis published in the journal Health Services Research.

We are not necessarily conserving our mental resources to spend them on even weightier decisions. One in five Americans spends more time planning their upcoming vacation than they do on their financial future. There are people who go over every detail exhaustively before making a choice, and it is certainly possible to overthink things. But a fair number of individuals are quick to jump to conclusions. Psychologists call this way of thinking a cognitive bias, a tendency toward a specific mental mistake. In this case, the error is making a call based on the sparsest of evidence.

In our own research, we have found that hasty judgments are often just one part of larger error-prone patterns in behavior and thinking. These patterns have costs. People who tend to make such jumps in their reasoning often choose a bet in which they have low chances of winning instead of one where their chances are much better.


To study jumping, we examined decision-making patterns among more than 600 people from the general population. Because much of the work on this type of bias comes from studies of schizophrenia (jumping to conclusions is common among people with the condition), we borrowed a thinking game used in that area of research.

In this game, players encountered someone who was fishing from one of two lakes: in one lake, most of the fish were red; in the other, most were gray. The fisher would catch one fish at a time and stop only when players thought they could say which lake was being fished. Some players had to see many fish before making a decision. Others—the jumpers—stopped after only one or two.
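To get a feel for how much (or how little) evidence one or two fish provide, here is a minimal sketch of the underlying probability problem. The article does not state the lakes’ color proportions, so the 60/40 split below is an assumption borrowed from classic versions of this “beads task”:

```python
# Sketch of the fishing (beads) task: two lakes, one mostly red, one mostly gray.
# The 60/40 proportions are assumed for illustration; the study's actual
# proportions are not given in the article.
P_RED = {"A": 0.60, "B": 0.40}  # P(red fish | lake)

def posterior_lake_a(catches, prior=0.5):
    """P(fisher is at lake A) after a sequence of 'red'/'gray' catches."""
    p = prior
    for fish in catches:
        like_a = P_RED["A"] if fish == "red" else 1 - P_RED["A"]
        like_b = P_RED["B"] if fish == "red" else 1 - P_RED["B"]
        p = p * like_a / (p * like_a + (1 - p) * like_b)  # Bayes' rule
    return p

print(posterior_lake_a(["red"]))      # ~0.60 after one red fish
print(posterior_lake_a(["red"] * 5))  # ~0.88 after five red fish
```

Under these assumptions, a player who stops after a single red fish is acting on only 60 percent confidence, which is why deciding after one or two draws counts as jumping.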

We also asked participants questions to learn more about their thought patterns. We found that the fewer fish a player waited to see, the more errors that individual made in other types of beliefs, reasoning and decisions.

For instance, the earlier people jumped, the more likely they were to endorse conspiracy theories, such as the idea that the Apollo moon landings had been faked. Such individuals were also more likely to believe in paranormal phenomena and medical myths, such as the idea that health officials are actively hiding a link between cell phones and cancer.

Jumpers made more errors than nonjumpers on problems that require thoughtful analysis. Consider this brainteaser: “A baseball bat and ball cost $1.10 together. The bat costs $1 more than the ball. How much does the ball cost?” Many respondents leaped to the conclusion of 10 cents, but a little thought reveals the right answer to be five cents. (It’s true; think the problem through.)
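For readers who want the algebra spelled out: let $b$ be the ball’s price in dollars, so the bat costs $b + 1$.

```latex
\begin{aligned}
b + (b + 1) &= 1.10\\
2b &= 0.10\\
b &= 0.05
\end{aligned}
```

The intuitive 10-cent answer fails the check: a 10-cent ball would make the bat $1.10 and the total $1.20.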

In a gambling task, people with a tendency to jump were more often lured into choosing inferior bets over those in which they had a better chance of winning. Specifically, jumpers fell into the trap of focusing on the number of times a winning outcome could happen rather than the full range of possible outcomes.
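This trap is often called ratio bias. As a minimal sketch with illustrative numbers (the article does not report the exact bets participants saw), compare a gamble with nine winning outcomes out of 100 against one with a single winning outcome out of 10:

```python
# Illustrative only: these numbers come from classic "ratio bias" studies,
# not from the bets the article's participants actually faced.
bets = {
    "big bowl":   {"winners": 9, "total": 100},  # more ways to win...
    "small bowl": {"winners": 1, "total": 10},   # ...but better odds
}

for name, bet in bets.items():
    p = bet["winners"] / bet["total"]
    print(f"{name}: {bet['winners']}/{bet['total']} -> P(win) = {p:.2f}")

# A jumper anchors on the raw count of winning outcomes (9 > 1); working
# through the full range of outcomes shows the small bowl is the better bet.
```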

Jumpers also had problems with overconfidence: on a quiz about U.S. civics, they overestimated the chance that their answers were right significantly more than other participants did—even when their answers were wrong.
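One common way to quantify overconfidence, sketched here with made-up numbers (the article does not describe its exact scoring), is to subtract actual accuracy from average stated confidence:

```python
# Toy calibration check: positive scores indicate overconfidence.
# The data below are invented for illustration.
correct = [True, False, False, True, False]          # quiz outcomes
confidence = [0.90, 0.80, 0.85, 0.95, 0.70]          # self-rated P(correct)

accuracy = sum(correct) / len(correct)               # 0.40
mean_confidence = sum(confidence) / len(confidence)  # 0.84
print(f"overconfidence = {mean_confidence - accuracy:+.2f}")  # +0.44
```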

The gap in decision quality between those who jumped and those who did not remained even after we took intelligence (based on a test of verbal intellect) and personality differences into account. Our data also suggested the gap was not merely the result of jumpers rushing through our tasks.

So what is behind jumping? Psychological researchers commonly distinguish between two pathways of thought: automatic, known as system 1, which reflects ideas that come to the mind easily, spontaneously and without effort; and controlled, or system 2, comprising conscious and effortful reasoning that is analytical, mindful and deliberate.

We used several assessments that teased apart how automatic our participants’ responses were and how much they engaged in deliberate analysis. We found that jumpers and nonjumpers were equally swayed by automatic (system 1) thoughts. The jumpers, however, did not engage in controlled (system 2) reasoning to the same degree as nonjumpers.

It is system 2 thinking that helps people counterbalance mental contaminants and other biases introduced by the more knee-jerk system 1. Put another way, jumpers were more likely to accept the conclusions they made at first blush without deliberative examination or questioning. A lack of system 2 thinking was also more broadly connected to their problematic beliefs and faulty reasoning.

Happily, there may be some hope for jumpers: Our work suggests that training that targets their biases can help people think more deliberatively. Specifically, we adapted a method called metacognitive training from schizophrenia research and created a self-paced online version of the intervention. In this training, participants are confronted with their own biases. For example, we ask people to tackle puzzles; when they make mistakes related to specific biases, the errors are called out so that participants can learn about the missteps and consider other ways of thinking through the problem at hand. This intervention helps to chip away at participants’ overconfidence.

We plan to continue this work to trace other problems introduced by jumping. Also, we wonder whether this cognitive bias offers any potential benefits that could account for how common it is. In the process, we aim to give back to schizophrenia research. In some studies, as many as two thirds of people with schizophrenia who express delusions also exhibit a jumping bias when solving simple, abstract probability problems, in comparison with up to one fifth of the general population.

Schizophrenia is a relatively rare condition, and much about the connection between jumping and judgment issues is not well understood. Our work with general populations could potentially fill this gap in ways that help people with schizophrenia.

In everyday life, the question of whether we should think things through or instead go with our gut is a frequent and important one. Recent studies show that even gathering just a little bit more evidence may help us avoid a major mistake. Sometimes the most important decision we make can be to take some more time before making a choice.

Carmen Sanchez is an assistant professor at the University of Illinois at Urbana-Champaign’s Gies College of Business. She studies the development of misbeliefs, decision-making and overconfidence.


David Dunning is a social psychologist and a professor of psychology at the University of Michigan. His research focuses on the psychology of human misbelief, particularly false beliefs people hold about themselves.

This article was originally published with the title “Leaps of Confusion” in Scientific American Magazine Vol. 326 No. 2 (February 2022), p. 68
doi:10.1038/scientificamerican0222-68