
When 99% Confident Leads to Wrongness 40% of the Time: List of Biases in Judgment and Decision-Making, Part 9

This week, we’re looking at one of my ‘favorite’ biases: once you know about it, it can be quite comical to spot in others (and in yourself, if you still fall for it from time to time). From Wikipedia: the overconfidence effect “is a well-established bias in which someone’s subjective confidence in their judgments is reliably greater than their objective accuracy, especially when confidence is relatively high.” That’s a bit jargon-y, so let me illustrate with a simple example.

In fact, this example comes from a lecture on negotiation that I talked about in one of my previous posts. In case you were wondering, the lecture comes from Professor Margaret Neale at Stanford. It was brilliant! There was so much information packed into it that I remember listening a few different times and still pulling out nuggets of wisdom. Anyway, I digress. The example Prof. Neale uses is particularly on point for illustrating the overconfidence effect.

She has a pop bottle partly filled with paper clips. She tells the crowd that she wants them to guess how many paper clips are in the bottle, and she walks up and down the aisles so they can get a closer look. She instructs the crowd to write down their answer. Then, she asks them to write down a range within which they are 100% (she may say 99%, I don’t remember) sure the number of paper clips falls. Essentially, she was asking for a confidence interval. I think she also told them that she was sure there weren’t more than 1,000,000 paper clips in there. After some time, she tells the audience how many were in there and asks if anyone got it right (no one raises their hand). She then says something to the effect of, “For how many of you did the number of paper clips fall within the range?” Maybe about 35% of the room raised their hand. 35%! She exclaims that this is terrible, given that all of these people were 100% (or 99%) sure that the number would fall within their range. In fact, in a room that size, there should have been only a handful of people whose range wasn’t met (if the 99% figure was being used, rather than 100%). Prof. Neale then explains that this is the overconfidence effect: the audience was asked to estimate something about which they knew nothing, and then asked to rate their confidence. Knowing that they knew nothing about the topic, it would have been logical for the audience to give a very wide confidence interval (between 10 paper clips and 20,000 paper clips) — or even bigger!
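
To make the arithmetic behind Prof. Neale’s point concrete, here’s a minimal sketch in Python. The audience size of 200 is my assumption (the lecture doesn’t give a figure), as is the use of the 99% level; the roughly 35% hands-raised figure comes from the story above.

```python
# A sketch of why only "a handful" of calibrated 99% intervals should miss.
# Assumed numbers: an audience of 200 and a 99% confidence level; neither
# figure is given precisely in the lecture.

from math import comb

n_people = 200          # hypothetical audience size
p_miss = 0.01           # a truly calibrated 99% interval misses 1% of the time

expected_misses = n_people * p_miss
print(f"Expected misses if everyone were calibrated: {expected_misses:.0f}")  # ~2 people

# Probability that 65% or more of a calibrated audience would miss
# (only ~35% raised their hands, so ~65% missed):
observed_misses = int(n_people * 0.65)  # 130 people
p_at_least = sum(
    comb(n_people, k) * p_miss**k * (1 - p_miss)**(n_people - k)
    for k in range(observed_misses, n_people + 1)
)
print(f"P(>= {observed_misses} misses | calibrated): {p_at_least:.3e}")  # effectively zero
```

If everyone’s intervals had been honestly calibrated, roughly two people should have missed; instead, something like 130 did. That gap is the overconfidence effect in numbers.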

This happens in more ways than simply estimating the number of paper clips in a bottle. We also see it with investors. When asked, fund managers typically report having performed above average. In fact, 74% report having delivered above-average service, while the remaining 26% report having rendered average service. Notice what’s missing: not a single manager reports below-average performance, which can’t be true of the group as a whole.
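
The arithmetic is worth spelling out. Here’s a toy illustration in Python, with made-up fund returns (not from any real survey): whatever the distribution of actual performance, at most half of a group can sit above its own median, yet 74% claim to be above average.

```python
# A toy illustration with assumed data (not from any real survey):
# at most half of a group can sit above its own median, yet 74%
# of managers claim to be "above average."

import statistics

returns = [4.1, 5.3, 2.8, 6.0, 3.5, 4.9, 5.1, 2.2, 4.4, 3.9]  # made-up fund returns (%)
median = statistics.median(returns)
above = sum(r > median for r in returns)
print(f"Actually above the median: {above}/{len(returns)}")  # 5/10 = 50%
print("Self-reported above average: 74% -- the gap is the overconfidence.")
```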

Another place that we see the overconfidence effect show up is with the planning fallacy: “Oh yeah, I can finish that in two weeks…”

Ways for Avoiding the Overconfidence Effect

1) Know what you know (and don’t know)

The fastest way to slip into the trap of the overconfidence effect is to start making “confident” predictions about things you don’t know about. Guessing the number of paper clips in a bottle is something most of us have little to no expertise in, so give a wide confidence interval. Likewise, if you have no experience managing a project, it might be in your best interest not to make a prediction about how long it will take to complete (the planning fallacy, again).

2) Is this person really an expert?

Sometimes, you’ll hear someone display a level of confidence that makes you think they know what they’re talking about, and that confidence alone might bias you into believing what they’re saying. It’s important to ask whether this person is actually an expert in the field or whether they’re succumbing to the overconfidence effect.

From Scott Plous’s book The Psychology of Judgment and Decision Making: “Overconfidence has been called the most ‘pervasive and potentially catastrophic’ of all the cognitive biases to which human beings fall victim. It has been blamed for lawsuits, strikes, wars, and stock market bubbles and crashes.”


The Confirmation Bias — What Do You Really Know: List of Biases in Judgment and Decision-Making, Part 6

Well, here we are into the sixth week of biases in judgment and decision-making. Every Monday, I look at my list of cognitive biases and I see that we’ve still got quite a few weeks to go until I’ve exhausted the biases that I want to talk about. This week was a toss-up: I was trying to decide between the fundamental attribution error and the confirmation bias. After flipping a bottle cap (seriously, there wasn’t a coin close by), I’ve decided to talk about the confirmation bias.

Like last week’s bias, the confirmation bias is easy to understand from its definition: it’s the tendency to seek out information that confirms one’s previously held beliefs. In a journal article that’s been cited over 1,000 times, Ray Nickerson stated:

If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.

Why is the confirmation bias so loathed? Well, as Nickerson points out, it may be the root cause of many disputes, both on an individual and an international level. Let’s think about this for a second: in the world of objectivity “out there,” there are any number of possibilities, while in the world of subjectivity “inside my head,” there are only the possibilities I can imagine. Humans, on the whole, tend to fear change (a Google search on the topic returns over 600,000,000 results!). To allay those fears, I’m going to prefer information that conforms to my previously held beliefs. As a result, when I look “out there,” I’m going to be unconsciously looking for things that are already “inside my head.” Let’s take this discussion out of the abstract, because there are plenty of examples of it.

If you’re still not convinced and think you’re “beyond” the confirmation bias, I would urge you to try to solve the problem on this site. If you give the problem its due respect, I bet you’ll be surprised by how your solution compares to the actual one.
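
A classic problem in this vein is Wason’s 2-4-6 task (I’m not claiming it’s the exact problem at the link; it’s just the canonical demonstration of testing a hypothesis only with cases that confirm it). Here’s a minimal Python sketch of how it plays out:

```python
# A minimal sketch of Wason's classic 2-4-6 task, the standard laboratory
# demonstration of the confirmation bias in hypothesis testing.

def hidden_rule(triple):
    """The experimenter's actual rule: any strictly increasing numbers."""
    a, b, c = triple
    return a < b < c

def guessed_rule(triple):
    """A typical subject's hypothesis: numbers increasing by 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Positive test strategy: only trying triples that CONFIRM your hypothesis.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
for t in confirming_tests:
    print(t, "-> fits the hidden rule?", hidden_rule(t))  # all True: the guess feels verified

# The disconfirming test most subjects never run:
falsifying_test = (1, 2, 3)  # breaks the guessed rule...
print(falsifying_test, "-> fits the hidden rule?", hidden_rule(falsifying_test))  # ...yet still True
```

The only way to discover the real rule is to run tests designed to fail your hypothesis, which is exactly what the suggestions below ask you to do with your beliefs.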

Ways for Avoiding the Confirmation Bias

As with other cognitive biases, being aware that there is such a thing as the confirmation bias is really important. It can be hard to change something if you don’t know that there’s something to be changed.

1) Seek out contradictory ideas and opinions

This is something I’ve written about before. If at all possible, you’ve got to make sure you’re getting information that runs counter to your beliefs from somewhere. If not, there’s little chance for growth and expansion. This can be difficult for some, so I’ve outlined ways to do it in the post referenced above.

2) Seek out people with contradictory ideas and opinions

I answered a question on Quora last November where I ranked these two strategies for avoiding the confirmation bias first and second. Some folks might find it difficult to seek out people with opposing views, which is why I suggest starting with contradictory views in print (or some other form of media). However, in my experience, speaking with someone who holds views opposed to mine (assuming they’re genuinely seeking out opposing views, too) can be quite enriching. A real-life person can usually put up a better defense when your “confirmation bias” is activated. Similarly, you can do the same for them.

3) What do you really know?

My last suggestion for avoiding the confirmation bias is to always question what it is that you know. This can sound tedious, but if you get into the habit of questioning “how” or “why” you know something, you’d be surprised how ‘thin’ the argument for it can be. For instance, let’s say you hold a racial stereotype that ethnicity “x” is bad at driving. When you’re on the highway, you notice that someone from ethnicity “x” cuts you off. Instead of going into a tizzy about ethnicity “x,” you might stop and remember that, in fact, of all the times you’ve been cut off, ethnicity “x” is the ethnicity that cuts you off the least. This is a crude example, but I think you get the idea. Just to emphasize my point: questioning your deeply held beliefs is a good way of countering the confirmation bias.

So, what do you really know?


When the Data Don’t Match Your Beliefs

By now, you’ve no doubt seen (or at least heard about) Karl Rove, the noted Republican strategist, challenging the decision of the network for which he is a contributor (Fox News) to call Ohio for President Obama. If you haven’t, it’s worth checking out. It’s a good display of the data not matching one’s beliefs. While Rove has had experience with networks calling states prematurely, the data made all the networks pretty confident in calling Ohio for President Obama.

Cognitive biases are not unique to Karl Rove; we all have them. Similarly, there is a tendency to discount data that don’t fit one’s previously held beliefs. This past week, I finally cracked Jim Collins’ new book, Great By Choice. I really liked Good to Great (and even included the story of the Stockdale Paradox a few months ago!).

Within the first 10 pages of the book, Collins writes about “entrenched myths” and “contrary findings.” That is, as part of Collins’ (and his team’s) research, they found that some previously held beliefs did not hold true when looking at the data. In case you’re interested, I’ve included them below. Take a look:

Entrenched myth: Successful leaders in a turbulent world are bold, risk-seeking visionaries.
Contrary finding: The best leaders we studied did not have a visionary ability to predict the future. They observed what worked, figured out why it worked, and built upon proven foundations. They were not more risk taking, more bold, more visionary, and more creative than the comparisons. They were more disciplined, more empirical, and more paranoid.

Entrenched myth: Innovation distinguishes 10X companies in a fast-moving, uncertain, and chaotic world.
Contrary finding: To our surprise, no. Yes, the 10X cases innovated, a lot. But the evidence does not support the premise that 10X companies will necessarily be more innovative than their less successful comparisons; and in some surprise cases, the 10X cases were less innovative. Innovation by itself turns out not to be the trump card we expected; more important is the ability to scale innovation, to blend creativity with discipline.

Entrenched myth: A threat-filled world favors the speedy; you’re either the quick or the dead.
Contrary finding: The idea that leading in a “fast world” always requires “fast decisions” and “fast action”—and that we should embrace an overall ethos of “Fast! Fast! Fast!”—is a good way to get killed. 10X leaders figure out when to go fast, and when not to.

Entrenched myth: Radical change on the outside requires radical change on the inside.
Contrary finding: The 10X cases changed less in reaction to their changing world than the comparison cases. Just because your environment is rocked by dramatic change does not mean that you should inflict radical change upon yourself.

Entrenched myth: Great enterprises with 10X success have a lot more good luck.
Contrary finding: The 10X companies did not generally have more luck than the comparisons. Both sets had luck—lots of luck, both good and bad—in comparable amounts. The critical question is not whether you’ll have luck, but what you do with the luck that you get.