Could I be wrong? Exploring research on cognitive bias, curiosity, intellectual humility, and lifelong learning
A few years ago, I asked a sample of adults to think about all of the disagreements that they have with other people, from minor disagreements about relatively unimportant issues to major disagreements about important matters. Then, I asked them to estimate the percentage of disagreements they have with other people in which they are the one who is correct.
Only 4% of the respondents indicated they were right less than half of the time, and only 14% said they were right half of the time. The vast majority—a whopping 82%—reported that, when they disagreed with other people, they were usually the one who was right! (Pause a moment to ask yourself the same question: In what percentage of the disagreements that you have with other people are you the one who’s right?)
Research on the overconfidence bias shows that people regularly overestimate their abilities, knowledge, and beliefs. For example, when researchers ask people how certain they are that their answers to questions of fact are correct, people’s confidence consistently exceeds the actual accuracy of their answers. Psychologist Scott Plous has noted that overconfidence is not only the most pervasive bias that plagues human thinking and decision-making, but it’s also the most “catastrophic” in that it leads to bad decisions and other negative outcomes.
The first step in dealing with overconfidence is for people to realize that much of what they believe to be true might, in fact, be incorrect. Psychologists call this awareness of one’s fallibility “intellectual humility.”
People who are intellectually humble know that their beliefs, opinions, and viewpoints are fallible because they realize that the evidence on which those beliefs rest could be limited or flawed, or that they may lack the expertise or ability to understand and evaluate it. Intellectual humility involves understanding that we can’t fully trust our beliefs and opinions because we might be relying on faulty or incomplete information, or might be unable to grasp all the details.
Of course, it rarely feels like our beliefs are wrong, and we must usually behave as if our beliefs are true or else we’ll be paralyzed by uncertainty and indecision. But people who are high in intellectual humility keep in mind that whatever they believe to be true could be wrong and, thus, they might need to revise their views at any time.
Features of intellectual humility
Because they realize that their beliefs might be wrong, intellectually humble people pay more attention to the quality of the evidence on which their beliefs are based. In one of our studies, for example, people high in intellectual humility who read an article about the value of dental flossing were more attentive to the quality of its evidence, more clearly distinguishing good from bad reasons to floss.
In another study in which participants read sentences about controversial topics, intellectually humble participants spent more time reading sentences that expressed viewpoints counter to their own opinions than participants low in intellectual humility, suggesting that they were thinking more deeply about ideas with which they disagreed. (Low and high intellectual humility participants didn’t differ in the time they spent reading sentences consistent with their attitudes.) Along the same lines, a study by Tenelle Porter and Karina Schumann found that people higher in intellectual humility were more interested in understanding the reasons that people disagree with them.
These and other findings suggest that people high in intellectual humility pay greater attention to the evidence for and against their beliefs and spend more time thinking about beliefs with which others disagree. Not surprisingly, people who are aware that their views might be wrong are more inclined to think about the accuracy of their beliefs than people who assume that they’re right about most things.
Intellectual humility is also associated with the desire to learn new information. People who are high in intellectual humility score higher in epistemic curiosity, which is the motivation to pursue new knowledge and ideas. Their higher curiosity seems to be motivated both by the fact that they enjoy learning new information and by the distress they feel when they lack information or do not understand something. High intellectual humility is also associated with the degree to which people enjoy thinking, mulling over issues, and solving intellectual problems: people higher in intellectual humility simply like to think more than people lower in intellectual humility do.
The trouble with too much confidence
Intellectual humility is fundamentally a meta-cognitive construct—that is, it involves people’s thoughts about their thoughts—but it often manifests in people’s emotions and behavior.
Most notably, in disagreements with other people, people high in intellectual humility are more open to other people’s views and less dogmatic regarding their beliefs and opinions. People who recognize that their beliefs are fallible take other people’s perspectives more seriously and recognize the value of divergent opinions.
Several studies also show that intellectually humble people are less inclined to disparage those who hold viewpoints different from their own. In contrast, people who are lower in intellectual humility have stronger emotional reactions when others disagree with them and are more likely to disregard or disparage people who hold different views.
Given that they are more open to other people’s ideas and less contentious when others disagree with them, people higher in intellectual humility are better liked than those lower in intellectual humility. Even after only 30 minutes of interaction, people rate those who are high in intellectual humility more positively than those who are low. Ironically, know-it-alls often don’t seem to know that other people don’t like know-it-alls.
This doesn’t mean that people high in intellectual humility don’t mind being wrong. They do, but for reasons that differ from those of people low in intellectual humility. People high in intellectual humility sometimes find their ignorance and intellectual limitations troubling—not because they lose disagreements with other people but because they want to understand the world.
People who are intolerant of views that differ from theirs can also stifle open and honest discussions. For example, leaders who are not open to divergent ideas inhibit group members from offering their views, potentially short-circuiting creative and valuable ideas. In contrast, intellectually humble leaders who are open to alternative views may motivate others to contribute more ideas to discussions.
What influences intellectual humility?
First, given that virtually every personal characteristic has at least a weak genetic basis, it would be surprising if intellectual humility were not partly heritable. Indirect support for this idea comes from the fact that intellectual humility correlates with both overconfidence and openness, both of which show evidence of genetic influences.
Learning also plays a role in intellectual humility as children observe how parents, teachers, and others express certainty and uncertainty about their beliefs, manage disagreements with other people, and change—or do not change—their minds when evidence warrants. Some parents may also encourage their children to explain and justify their beliefs, attitudes, and decisions, thereby teaching the importance of basing one’s views on evidence and reason. Parents also differ in the degree to which they encourage their children to be open to new ideas and experiences, which may contribute to intellectual humility.
Education, especially higher education, may also affect intellectual humility—but in two opposing ways.
On one hand, the more people learn, the more they see how much they do not know and come to realize that knowledge is exceptionally complicated, nuanced, and endless. On the other hand, the more people learn, the more justifiably confident they become in the areas in which they develop expertise. An expert should obviously be more confident of their beliefs in that domain than a non-expert. Although no direct evidence exists, education may increase intellectual humility overall while (justifiably) increasing certainty—and lowering intellectual humility—in areas in which a person is an expert.
Across a number of beliefs, intellectual humility is curvilinearly related to the extremity of people’s beliefs: people with moderate beliefs tend to be higher in intellectual humility than people who hold extreme beliefs. Put differently, people with more extreme views—for example, people whose political views are farther toward the left or right—tend to be lower in intellectual humility and, thus, less willing to consider that their viewpoints might be incorrect than people who hold moderate beliefs. This pattern may occur because moderate views often acknowledge the complexity and equivocal nature of complicated issues.
How can we become more intellectually humble?
In an ideal world, people’s judgments about the accuracy of their beliefs, opinions, and viewpoints would be perfectly calibrated to how accurate those beliefs actually are. People make the best decisions about what to believe and what to do when their judgments of their own correctness are accurate.
Unfortunately, most of us overestimate the accuracy of our beliefs and opinions, often badly, with little consideration of the possibility that we might be wrong. Fortunately, people can increase in intellectual humility both through a personal decision to be more intellectually humble and through interventions that help people confront their intellectual overconfidence and take steps to reduce it.
None of us thinks that our beliefs and attitudes are incorrect; if we did, we obviously wouldn’t hold those beliefs and attitudes. Yet, despite our sense that we are usually correct, we must accept that our views may sometimes turn out to be wrong. This kind of humility isn’t simply virtuous—the research suggests that it results in better decisions, relationships, and outcomes. So, the next time you feel certain about something, you might stop and ask yourself: Could I be wrong?
– Mark Leary, Ph.D., is Garonzik Family Emeritus Professor of Psychology and Neuroscience at Duke University. He is the former president of the Society for Personality and Social Psychology. Copyright Greater Good. Based at UC-Berkeley, Greater Good highlights groundbreaking scientific research into the roots of compassion and altruism.