
The Confirmation Bias — What Do You Really Know: List of Biases in Judgment and Decision-Making, Part 6

Well, here we are into the sixth week of biases in judgment and decision-making. Every Monday, I look at my list of cognitive biases and I see that we’ve still got quite a few weeks to go until I’ve exhausted the biases that I want to talk about. This week was a toss-up: I was trying to decide between the fundamental attribution error and the confirmation bias. After flipping a bottle cap (seriously, there wasn’t a coin close by), I’ve decided to talk about the confirmation bias.

Like last week's bias, the confirmation bias is easy to understand from its definition: it's the tendency to seek out information that confirms one's previously held beliefs. In a journal article that's been cited over 1,000 times, Ray Nickerson stated:

If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.

Why is the confirmation bias so loathed? Well, as Nickerson points out, it may be the root cause of many disputes, both on an individual and an international level. Let's think about this for a second: in the world of objectivity "out there," there are any number of possibilities. In the world of subjectivity "inside my head," there are only the possibilities that I can imagine. Humans, on the whole, tend to fear change (a Google search for that phrase returns over 600,000,000 results!). To allay those fears, I'm going to prefer information that conforms to my previously held beliefs. As a result, when I look "out there," I'm unconsciously going to be looking for things that are already "inside my head." Let's take this discussion out of the abstract, because there are plenty of examples of it.

If you're still not convinced and think you're "beyond" the confirmation bias, I would urge you to try to solve the problem on this site. If you give the problem its due respect, I bet you'll be surprised by the difference between your solution and the actual one.

Ways to Avoid the Confirmation Bias

As with other cognitive biases, simply being aware that the confirmation bias exists is really important. It's hard to change something if you don't know that there's something to be changed.

1) Seek out contradictory ideas and opinions

This is something that I've written about before. If at all possible, make sure that you're getting information that runs counter to your beliefs from somewhere. If not, there's little chance for growth and expansion. This can be difficult for some, so I've outlined ways to do it in the post I referenced above.

2) Seek out people with contradictory ideas and opinions

I answered a question on Quora last November where I ranked these two ways of avoiding the confirmation bias first and second. Some folks might find it more difficult to seek out people with opposing views, which is why I suggest starting with contradictory views in print (or some other medium). However, in my experience, speaking with someone who holds views opposing mine (assuming they are equally earnest in their endeavor to seek out opposing views) can be quite enriching. A real-life person can usually put up a better defense when your confirmation bias is activated. Similarly, you can do the same for them.

3) What do you really know?

My last suggestion for avoiding the confirmation bias is to always question what it is that you know. This can sound tedious, but if you get into the habit of asking "how" you know something or "why" you know it, you'd be surprised how "thin" the argument for it can be. For instance, let's say that you hold a racial stereotype that ethnicity "x" is bad at driving. When you're on the highway, you notice that someone from ethnicity "x" cuts you off. Instead of going into a tizzy about ethnicity "x," you might stop and remember that, in fact, of all the times you've been cut off, ethnicity "x" is the ethnicity that has cut you off the least. This is a crude example, but I think you get the idea. To emphasize my point: questioning your deeply held beliefs is a good way of countering the confirmation bias.

So, what do you really know?

If you liked this post, you might like one of the other posts in this series:

Can the Discourse in American Politics Be Saved: The Lost Art of Democratic Debate

I came across a tweet earlier this morning that linked to a TEDTalk given by Michael Sandel in 2010. I’ve written about Prof. Sandel’s course “Justice,” so naturally, I was interested to see his TEDTalk. The title: “The lost art of democratic debate.”

Of course, given the election tomorrow and the absurd hyper-partisanship in the US right now, I thought it would be interesting to hear what Prof. Sandel had to say, even though he said it two years ago. Ironically, two years ago, members of Congress were still at odds with one another (over healthcare), and there's still discussion about healthcare in the US today.

As a quick primer to the video, you may want to check out what I wrote on golf being a sport last summer.

After watching the video, I’d love to hear what you think of what Prof. Sandel has proposed. Do you think discussing the morality of ideas will make Congress less partisan and more productive?