The science behind why people think they're right when they're actually wrong

There may be a psychological reason why some people aren’t just wrong in an argument — they’re confidently wrong.

According to a study published Wednesday in the journal PLOS One, it comes down to believing you have all the information you need to form an opinion, even when you don’t.

“Our brains are overconfident that they can arrive at a reasonable conclusion with very little information,” said Angus Fletcher, a professor of English at Ohio State University, who co-wrote the study.

Fletcher, along with two psychology researchers, set out to measure how people make judgments about situations or people based on their confidence in the information they have — even if it’s not the whole story.

“People leap to judgments very quickly,” he said.

The researchers recruited nearly 1,300 people with an average age of about 40. Everyone read a fictitious story about a school that was running out of water because its local aquifer was drying up.

About 500 people read a version of the story that was in favor of the school merging with another school, presenting three arguments supporting the move and one neutral point.

Another 500 people read a story with three arguments in favor of staying separate, plus the same neutral point.

The final 300 people, the control group, read a balanced story that included all seven arguments — three pro-merge, three pro-separate and the one neutral.

After reading, the researchers asked participants about their opinions on what the school should do and how confident they were that they had all the information they needed to make that judgment.

The surveys revealed that people were much more likely to agree with whichever set of arguments they had read, whether in favor of merging or staying separate, and that they were often confident they had enough information to form that opinion. Those who had read only one point of view also reported more confidence in their opinions than the control group, which had read both sides.

Half of the participants in each group were then asked to read the opposing side’s information, which contradicted the article they had already read.

Although participants were confident in their opinions when they had read arguments in favor of only one solution, they were often willing to change their minds once presented with all of the facts. They also reported being less confident in their ability to form an opinion on the topic.

“We thought that people would really stick to their original judgments even when they received information that contradicted those judgments, but it turns out if they learned something that seemed plausible to them, they were willing to totally change their minds,” Fletcher said, adding that the research highlights how people fail to consider whether they have all of the information about a situation.

However, the researchers noted the findings may not apply to issues about which people hold pre-established ideas, as is often the case with politics.

“People are more open-minded and willing to change their opinions than we assume,” Fletcher said. However, “this same flexibility doesn’t apply to long-held differences, such as political beliefs.”

Todd Rogers, a behavioral scientist at the Harvard Kennedy School of Government, likened the findings to the “invisible gorilla” study, which illustrated the psychological phenomenon of “inattentional blindness,” in which a person fails to notice something obvious because they are focused on something else.

“This study captures that with information,” Rogers said. “There seems to be a cognitive tendency to not realize the information we have is inadequate.”

The study also parallels a psychological phenomenon called the “illusion of explanatory depth,” in which people overestimate how well they understand a topic, said Barry Schwartz, a psychologist and professor emeritus in social theory and social action at Swarthmore College in Pennsylvania.

The idea is that if you ask the average person whether they know how a toilet works, they will likely reply that they do. But when asked to explain it, they quickly realize they don’t actually know how a toilet works, just how to get one to work by pressing a lever.

“It’s not just that people are wrong. It’s that they are so confident in their wrongness that is the problem,” Schwartz said.

The antidote, he added, is “being curious and being humble.”

The researchers and Schwartz agreed it was encouraging and surprising that people in the study who were later presented with new information were open to changing their minds, as long as that information seemed plausible.

“This is reason to have a tiny bit of optimism that, even if people think they know something, they are still open to having their minds changed by new evidence,” Schwartz said.

Kaitlin Sullivan

Kaitlin Sullivan is a contributor for NBCNews.com who has worked with NBC News Investigations. She reports on health, science and the environment and is a graduate of the Craig Newmark Graduate School of Journalism at City University of New York.
