Weekend Reads | When Do We Have Enough Information to Make a Decision?
If you’ve spent any time on social media sites, you are probably familiar with the Dunning–Kruger effect, in which people with limited skills at a particular task tend to overestimate their abilities (and those with high skills tend to slightly underestimate their abilities). This weekend’s read is a study of what might be a “sister theory” to Dunning–Kruger: people asked to make a decision or render a judgment based on limited information tend to overestimate how complete their understanding of the situation is. The authors, a trio of researchers from Johns Hopkins, Stanford, and Ohio State, call this “the illusion of information adequacy.”
This illusion manifests in a few ways. First, we tend to believe that our knowledge is more complete than it really is. Second, we believe that our personal, subjective view represents an objective understanding of reality. Third, we assume that other rational people will share our reactions, opinions, and behaviors.

The late comedian George Carlin had a simple way of explaining this: everyone who drives slower than I do is an “idiot,” and everyone who drives faster is a “maniac.” The authors propose a related example many of us have experienced: being stopped behind another car at an intersection and growing impatient that it isn’t moving forward — until a pedestrian crossing the street pops into our view, one that the driver ahead could clearly see but we couldn’t.

Former U.S. Defense Secretary Donald Rumsfeld famously referred to the “unknown unknowns”: the things we don’t know that we don’t know. It takes humility to admit our own ignorance, and even more to admit that we can’t grasp how ignorant we are. That makes “adequate information” a tricky concept: when do we have enough information to make a sound, informed decision, and how do we know when we do?
The researchers designed an experiment to test for the “illusion of information adequacy.” They invented a scenario in which there is a proposal to merge two local schools into a single, larger one, and divided their test subjects into three groups: the first read a “pro” document laying out three arguments in favor of the merger; the second read a “con” document with three arguments against it; and the third, “control” group read a document containing both the arguments for and against. All of the test subjects were then asked to make a recommendation and to rate how adequate their information was for making a decision, how confident they were in the recommendation they had made, and how much they believed others would agree with it.
You probably won’t be surprised to learn that 88% of the “pro” group recommended merging, 77% of the “con” group recommended against it, and the “control” group that saw all the arguments was split close to 50-50.
The researchers then gave the “pro” test subjects the opportunity to read the “con” document, and vice versa — and they asked them the questions again. They predicted that people would be a little stubborn and would tend to stick with their original recommendation. Not so much: Both groups ended up nearly evenly split after reading all the arguments, about the same as the control group.
The most interesting result, however, was how the answers to the other questions changed. After seeing just one side of the argument, both the “pro” and “con” test subjects, on average, rated their confidence in their original recommendation highly and believed it represented a consensus view. But after reading the other side’s arguments — that is, after increasing the amount of information they had — they rated both their confidence and their sense of consensus lower.
Perhaps this is a partial explanation for the Dunning–Kruger effect: we presume we have adequate information, but as we gain more knowledge and broaden our perspective, we come to appreciate how much we don’t know and how varied other viewpoints are, and we lose confidence in our own judgments.
The researchers suggest that the “information adequacy” phenomenon is critical to how our society approaches some of its most polarizing issues, including abortion, affirmative action, and the Israel–Palestine conflict. And they offer a couple of “practical strategies for improving people’s social perspective taking capacities.” One is simply to encourage people to ask, “Do I have enough information?” This question, they argue, “infuses humility” and makes us more willing to give others the benefit of the doubt by assuming that some relevant information may be missing. They also suggest group activities in which people collectively brainstorm how to learn more about an issue or about someone else’s beliefs, rather than leaving it to individuals to “do their own research.”
Management consultants might warn against taking this too far: constantly questioning the adequacy of our information until “analysis paralysis” sets in and we become unable to make any decision for fear that we don’t have the right information. Again, knowing what counts as “adequate information” is itself a kind of wisdom. And, as goofy as the phrase sounds, there is also wisdom in acknowledging our “unknown unknowns”: that we often don’t know what we don’t know.