Study reveals that people can mistakenly believe they have all the information they need, even when they don’t
If you think you’re right in a disagreement with a friend or colleague, a new study suggests why you may be wrong. Researchers found that people naturally assume they have all the information they need to make a decision or support their position, even when they don’t. They call this the “illusion of information adequacy.”
“We found that, in general, people don’t stop to think about whether there might be more information that would help them make a more informed decision,” said study co-author Angus Fletcher, a professor of English at The Ohio State University and a member of the university’s Project Narrative.
“If you give people a few pieces of information that seem to line up, most people will say ‘that looks right’ and stick with it.” The study is published in the journal PLOS ONE.
The study involved 1,261 Americans who participated online. They were divided into three groups, each of which read an article about a fictional school that lacked adequate water. One group read an article that gave only the reasons the school should merge with another school that had sufficient water; a second group’s article gave only the reasons to remain separate and hope for other solutions; and a third, control group read all the arguments, both for merging and for remaining separate.
The results showed that the two groups that read only half of the story – just the arguments for the merger or just the arguments against it – still believed they had enough information to make a good decision, Fletcher said. Most of them said they would follow the recommendations in the article they had read.
“Those who had only half the information were actually more certain about their decision to merge or stay separate than those who had the full story,” Fletcher said. “They were quite sure their decision was the right one, even though they didn’t have all the information.”
Additionally, participants who had half the information said they thought most other people would make the same decision they did.
The study did offer one piece of good news, Fletcher said. Some of the participants who had read only one side of the story later read the arguments for the other side, and many of them were willing to change their minds about their decision once they had all the information.
This may not always work, especially on deep-rooted ideological issues. In such cases, people may not trust the new information or may try to reframe it to fit their preexisting views.
“But most interpersonal conflicts have nothing to do with ideology,” Fletcher said. “They are simple misunderstandings in the course of everyday life.”
According to Fletcher, these findings complement research on so-called naïve realism, the belief that one’s subjective understanding of a situation is the objective truth. Research on naïve realism often focuses on how people understand the same situation differently.
But the illusion of information adequacy shows that people can share the same interpretation, provided they both have enough information. Fletcher, who studies how the power of stories influences people, said people should make sure they have the full story of a situation before taking a stance or making a decision.
“As we discovered in this study, people’s default is to believe they know all the relevant facts, even when they don’t,” he said. “When you disagree with someone, the first thing you should do is ask yourself: ‘Is there something I’m missing that would help me see their point of view and better understand their position?’ That is the way to fight this illusion of information adequacy.”
REFERENCE
Gehlbach, H., Robinson, C. D., & Fletcher, A. (2024). The illusion of information adequacy. PLOS ONE, 19(10), e0310216. https://doi.org/10.1371/journal.pone.0310216