He’s Not as Bad as He Seems and She’s Not as Good as She Seems: List of Biases in Judgment and Decision-Making, Part 11

Hello Hello! It’s been a little more than three weeks since my last post. I’ve finished up the requirements for the MBA, so I should be back to writing posts with some regularity. Since today’s Monday, I thought I’d restart the series of Monday posts about cognitive biases. Today’s cognitive bias: the halo effect.

The halo effect is essentially believing that someone is good at something because they were good at something else. For instance, celebrities are often cited when talking about the halo effect. Many regard celebrities as beautiful and, as a result of the halo effect, surmise that these celebrities must also be smart (when, in fact, this is not always the case).

There’s a famous study about the halo effect from 40 years ago: Nisbett and Wilson’s 1977 experiment. Here’s an excerpt:

Two different videotaped interviews were staged with the same individual—a college instructor who spoke English with a European accent. In one of the interviews the instructor was warm and friendly, in the other, cold and distant. The subjects who saw the warm instructor rated his appearance, mannerisms, and accent as appealing, whereas those who saw the cold instructor rated these attributes as irritating. These results indicate that global evaluations of a person can induce altered evaluations of the person’s attributes, even when there is sufficient information to allow for independent assessments of them. Furthermore, the subjects were unaware of this influence of global evaluations on ratings of attributes. In fact, the subjects who saw the cold instructor actually believed that the direction of influence was opposite to the true direction. They reported that their dislike of the instructor had no effect on their ratings of his attributes but that their dislike of his attributes had lowered their global evaluations of him.

Ways for Avoiding the Halo Effect

1) Different strengths for different tasks

One of the easiest ways to avoid falling into the trap of the halo effect is to notice that different tasks require different skills/strengths. Just because someone is good at climbing mountains doesn’t mean that they would make a good politician; the strengths/skills required for those two tasks are different. Put another way, think about the strengths/skills required for a particular task before evaluating whether someone would be good at that task.

2) Notice other strengths (or weaknesses)

It’s been said that, “nobody’s perfect.” When someone is good at one thing, there’s a good chance that they won’t be good at something else. Noticing that this person isn’t good at something else may help to quell the urge to assume that they’re good at everything.

If you liked this post, you might like one of the other posts in this series:

Tying Up Loose Ends: Food for Thought and Brief Hiatus

Since moving to the new domain (www.JeremiahStanghini.com), this has been the longest time between posts. The last post I wrote was on April 5th. The hiatus from posting will continue for a little while after this post because I’m working on the last requirements for finishing my MBA. There are about 3 weeks left until the end of exam period, so I’ve got a few papers/presentations to finish and a lot of grading of papers/exams.

Whenever I open my computer I see the list of posts that I’ve been meaning to write. In an effort to “clear out some mental space,” I thought I’d do what I’ve done a couple of times in the past and flush out my list of posts to write. In this way, the list will be fresh for when I come back (save for the few cognitive biases that I still want to write about). So, without further ado, here are some of the things that I had planned on expanding upon. I hope you enjoy!

Cars and Transportation — It’d be really cool if they could *feasibly* develop a car that could transform. A car that could be single-passenger when commuting, but expand/transform into 2, 3, or 4 seats when necessary.

Political Ideology — What if a given political ideology’s thoughts/plans don’t work unless they can be fully implemented? A split in Parliament/Congress would keep them stuck at a watered-down version. But what if either party having total control would actually be worse than this middle ground between the two ideas?

LeBron James vs. Michael Jordan — A few weeks before the conversation about LeBron vs. Jordan started, I’d had it on my to do list to write about it. I was a bit peeved when the conversation started (without me), but there were some interesting (and some not) things written about it. I think it’s extremely difficult to compare players across decades. It’s akin to comparing players across sports! I remember a few years ago when there was talk that Alex Rodriguez would be the greatest baseball player ever. I think it’s safe to say that conversation has died down a little.

Fear of Public Speaking — I was thinking back to one of the first times I had to stand up in front of a group of people and give a speech. I don’t even remember what I spoke about — but I do remember one of the speeches from my classmates who did quite well (it was about the NBA dunk contest). As I watch some folks present in front of rooms, I can empathize with their nervousness. Heck, even I still get a bit nervous sometimes. One thing I’ve learned — it’s really about repetition. The more times I’ve spoken in front of groups of people, the less nervous I get the next time I go up there. (On a slightly related note: I’d say another key factor in minimizing fear of public speaking is the extent to which you’re prepared to speak on the topic. Read: know your stuff!)

Focus on Labor — I’ve never been the CEO or a highly placed Vice President of a company, but from an outsider’s perspective, I always have a hard time understanding the lack of focus on the labor force. At times, it really looks like labor is the key to success. If the labor force is well taken care of, production and profits tend to do well. It reminds me of that post I did about sustainability and pitchers. The relation here is that when management takes care of the labor force, it is with an eye towards long-term sustainability.

Life, Liberty, and Property? — Why is property so valued? What about nomadic peoples, or the Indigenous peoples of North America, who show us that land isn’t something to be owned? What about animals? They don’t seem to own land.

Star Trek: Inheritance — This is an episode from the final season of Star Trek: The Next Generation. The gist of it is that Data has to decide whether or not he’s going to tell his mother that she is an android (when she believes she’s a human). In thinking about this episode, I wondered about the ethics of telling someone they aren’t who they think they are. What about an adopted child?

Social Entrepreneurship — George Mason University‘s Center For Social Entrepreneurship has a massive open online course (MOOC) in social entrepreneurship. If you wanna learn about social entrepreneurship, this is a great place to start!

“I AM” — I saw the movie I AM quite some time ago and there were some cool things that stood out to me. I’ll be brief:

  • The HeartMath Institute — check them out! They’re doing some fascinating work.
  • Animals are more likely to cooperate than we may have first thought. There was a reference to a journal article about how a herd of deer decided to go in a given direction after hydrating at a water hole.
  • Rumi poetry is medicine for the soul.
  • I am continually amazed at the kinds of things that are correlated with Random Number Generators.
  • Did you know that the word “Love” appears 95 times in Darwin’s “The Descent of Man?”
  • A great quote that Desmond Tutu read: “God looked at me and said, all I have is you.”

And so that clears off most of my list. Look for a new post sometime in the next month, but probably not for the next 3 weeks. Happy end of April and early May!

Protection from Nuclear War: Look to the Cockroaches

Yesterday, I saw a post from Mental Floss about whether cockroaches would be able to survive a nuclear war. That is, not whether the cockroaches would put up a fight in a nuclear war, but whether they would survive the radiation from a nuclear war that happened where they lived.

The post cited research done by MythBusters that concluded cockroaches have a much higher tolerance for radiation than humans do.

Does anyone else see an opportunity for innovation here?

If I were a scientist (ethical conundrums aside), I might be interested in seeing how much radiation cockroaches could withstand before it affects their ability to function. Why? Because then I would want to study what it is about cockroaches that allows them to withstand such radiation. Then, I’d want to see if I could design some sort of protection for humans. To be fair, it’d be very hard to get this past any kind of Institutional Review Board (IRB). That is, the IRB would probably balk at any kind of research where humans were being used to test the strength of some kind of cockroach shield. Though, I imagine that scientists might be able to work around this by using human cells in the lab, right?

Do Percentages Matter in a One-Time Decision?

I write a lot about decision-making. It’s clearly something that interests me. As a result, I often find myself thinking about how to make better decisions or how to help people make better decisions. That’s why I’m already up to Part 10 of that series on decision-making (and I’ve got at least 4 more to go). I’m not including today’s post as part of that series, but it serves as an interesting addendum. Meaning, it should at least give you something to think about. So, here we go!

As I said, I often find myself thinking about how to optimize decisions. Often, when people are trying to make a decision about something in the future, there may be percentages attached to the success of a decision. For example, if you’re the elected leader of a country, you might have to decide about a mission to go in and rescue citizens that are being held hostage. When you’re speaking with your military and security advisors, they may tell you the likelihood of success of the different options you have on the table.

I was going to end the example there and move into my idea, but I think it might make it easier to understand, if I really go into detail on the example.

So, you’re the President of the United States and you’ve got citizens who are being held hostage in Mexico (though not by the government of Mexico). The Chairman of the Joint Chiefs of Staff presents a plan of action for rescuing the citizens. After hearing the plan, you ask the Chairman what the chance of success is, and he tells you 60%. The other option you have is to continue to pursue a diplomatic solution in tandem with the Mexican government. As the President, what do you do?

So, my wondering is whether that 60% number really matters all that much. In fact, I would argue that the only “numbers” that would be useful in this situation are 100%, 0%, or whether the number is greater than 50 or less than 50 (to keep this a list of three numbers, we could call this last one ‘x’). This sounds silly, right? A mission that has an 80% chance of success would make you more inclined to choose that mission, right? The problem is that 20% of the time, that mission is still going to fail. And my point is that since this is a one-time decision (meaning, it’s astronomically unlikely that the identical situation would occur again), there won’t be iterations such that the mission succeeds 80% of the time.

I suppose the argument against this idea is that a mission with only a 51% chance of success carries a 49% chance of failure, and one would presume that there are more factors that might lead to failure at those percentages (or at least a higher chance of those failures coming to fruition).
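One way to poke at the idea is with a quick expected-value calculation. The payoffs below are entirely made up for illustration, but they suggest a middle ground: with symmetric stakes, the greater-than-50/less-than-50 threshold really is all that matters, while with asymmetric stakes the precise percentage changes the answer even for a one-time decision.

```python
# Expected value of acting: succeed with probability p (+gain),
# fail with probability 1 - p (-loss). Payoffs are illustrative only.
def expected_value(p, gain, loss):
    return p * gain - (1 - p) * loss

# Symmetric stakes: the precise percentage doesn't change the choice.
# Acting beats doing nothing (value 0) exactly when p > 0.5.
assert expected_value(0.51, 1, 1) > 0
assert expected_value(0.80, 1, 1) > 0

# Asymmetric stakes: suppose failure costs 3x what success gains.
# Now a 60% mission is a bad bet, but an 80% mission is a good one,
# so the precise number matters even for a one-shot decision.
assert expected_value(0.60, 1, 3) < 0   # 0.6*1 - 0.4*3 = -0.6
assert expected_value(0.80, 1, 3) > 0   # 0.8*1 - 0.2*3 = +0.2
```

In other words, the 60% figure starts to matter once the cost of failure and the value of success aren’t equal, which in hostage-rescue scenarios they rarely are.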

I realize that this idea is off-the-wall, but I’d be interested to read an article in a math journal that explains why this is wrong (using reasoning beyond what I’ve explained here) or… why it’s right!

WRAP — An Acronym from Decisive: List of Biases in Judgment and Decision-Making, Part 10

I recently came across a post from Farnam Street that seems like it would make a great addition to the series we’ve been exploring over the last 10 weeks (biases in judgment and decision-making). So, instead of going over another bias today, I thought I’d share the information I found and tie it back into our series. Check back next week for a new bias (it could be functional fixedness, the hindsight bias, the status quo bias, or maybe the recency/primacy effect(s)…)

The author of the Farnam Street blog summarized some of the work in Chip and Dan Heath’s new book: Decisive: How to Make Better Choices in Life and Work. Given that our series is about decision-making, this book seems like it would be absolutely on-point, with regard to the closing of each post (How to avoid *blank*).

I haven’t yet read the book, but I did want to include a brief excerpt (courtesy of Farnam Street), along with a lead-in from Farnam Street:

The Heaths came up with a process to help us overcome these villains and make better choices. “We can’t deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you’ll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

There’s also a handy picture that’s included (again, courtesy of Farnam Street):

As we can see, the Heaths have offered four universal ways for avoiding biases in judgment and decision-making. If we recall some of the different ways for avoiding biases that we’ve discussed over the last 9 weeks, many of them can be collapsed into one of the categories listed above. In case you’re a bit hazy, here are some of the biases that we’ve talked about before that have a “way for avoiding” that falls into one of the categories above:

So, if you’re having trouble remembering the different ways for avoiding the biases we’ve talked about, all you have to do is remember “W-R-A-P!”

When 99% Confident Leads to Wrongness 40% of the Time: List of Biases in Judgment and Decision-Making, Part 9

This week, we’re looking at one of my ‘favorite’ biases, in that once you know it, it can be quite comical to spot in others (and in yourself, if you still fall for it from time to time). From Wikipedia: the overconfidence effect “is a well-established bias in which someone’s subjective confidence in their judgments is reliably greater than their objective accuracy, especially when confidence is relatively high.” That’s a bit jargon-y, so let me illustrate with a simple example.

In fact, this example comes from a lecture I heard about negotiation and talked about in one of my previous posts. In case you were wondering, the lecture comes from Professor Margaret Neale at Stanford. It was brilliant! There was so much information packed into the lecture. I remember listening to it a few different times and still pulling out nuggets of wisdom. Anyway, I digress. The example that Prof. Neale uses is particularly on point for illustrating the overconfidence effect.

She has a pop bottle filled (but not to the top) with paper clips. She tells the crowd she wants them to guess how many paper clips are in the bottle, and she walks up and down the aisles so they can get a closer look. She instructs the crowd to write down their answer. Then, she asks them to write down a range within which they could be 100% sure (she may say 99%, I don’t remember) that the number of paper clips fell. Essentially, she was asking for a confidence interval. I think she also told them that she was sure there weren’t more than 1,000,000 paper clips in there.

After some time, she tells the audience how many paper clips were in the bottle. She asks if anyone got it exactly right (no one raises their hand). She then says something to the effect of, “For how many of you did the number of paper clips fall within your range?” Maybe 35% of the room raised their hand. 35%! She exclaims that this is terrible, given that all of these people were 100% (or 99%) sure that the number would fall within their range. In fact, she said that in a room that size, there should have been only a handful of people whose range wasn’t met (if the 99% figure was being used, rather than 100%).

Prof. Neale then goes on to explain that this is the overconfidence effect. The audience was being asked to estimate something about which they knew nothing, and then asked to rate their confidence. Knowing that they knew nothing about the topic, it would have been logical for the audience to give a very large confidence interval (between 10 paper clips and 20,000 paper clips, or even wider!).
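Prof. Neale’s demonstration can be mimicked with a quick simulation. The numbers below (the true count, the audience size, the spread of guesses) are purely hypothetical, but they show the core point: if everyone were genuinely 99% sure, about 99% of the ranges would contain the truth, and narrow ranges simply can’t deliver that.

```python
import random

random.seed(1)

TRUE_COUNT = 637   # hypothetical number of paper clips in the bottle
JUDGES = 1_000     # hypothetical audience size

def hit_rate(interval_width):
    # Each judge centers a range of the given width on their own noisy guess.
    hits = 0
    for _ in range(JUDGES):
        guess = TRUE_COUNT + random.gauss(0, 200)  # guesses scatter widely
        if abs(guess - TRUE_COUNT) <= interval_width / 2:
            hits += 1
    return hits / JUDGES

# Judges who say "99% sure" but give narrow ranges miss far more than
# 1% of the time; only genuinely wide ranges earn that confidence.
print(hit_rate(200))    # narrow range: most ranges miss the true count
print(hit_rate(2_000))  # wide range: nearly every range contains it
```

The fix isn’t guessing better; it’s widening the stated range until it honestly matches the stated confidence.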

This happens in more ways than simply estimating the number of paper clips in a bottle. We also see it with investors. When asked, fund managers typically report having performed above-average service. In fact, 74% report having delivered above-average service, while the remaining 26% report having rendered average service. In other words, essentially nobody admits to being below average.

Another place that we see the overconfidence effect show up is with the planning fallacy: “Oh yeah, I can finish that in two weeks…”

Ways for Avoiding the Overconfidence Effect

1) Know what you know (and don’t know)

The fastest way to slip into the trap of the overconfidence effect is to start making “confident” predictions about things that you don’t know much about. Guessing the number of paper clips in a bottle is something that most of us have little to no expertise in, so give yourself a large confidence interval. If you have no experience in managing a project, it might be in your best interest not to make a prediction about how long it will take to complete the project (the planning fallacy).

2) Is this person really an expert?

Sometimes, you’ll hear someone displaying a level of confidence in a given situation that makes you think they know what they’re talking about. As a result, it might bias you into believing what they are saying. It’s important to know if this person is an expert in this field, or if maybe they’re succumbing to the overconfidence effect.

From Scott Plous’s book The Psychology of Judgment and Decision Making: “Overconfidence has been called the most ‘pervasive and potentially catastrophic’ of all the cognitive biases to which human beings fall victim. It has been blamed for lawsuits, strikes, wars, and stock market bubbles and crashes.”

If you liked this post, you might like one of the other posts in this series:

Trying to Form a New Habit: Take a Vacation

Have you ever wanted to make changes in your life, but haven’t been able to stick to those changes? What about a New Year’s Resolution? If I’m being honest, there have been changes that I’ve tried to make that I haven’t been able to keep up. However, I think I may have discovered a trick to making it easier to stick to a new habit. (Truth be told, I’m probably not the first person to make this discovery, but I don’t remember reading it in any of the literature on habit-forming and/or making changes. Of course, that doesn’t mean it isn’t there.)

There have been some new habits that I’ve tried to form over the past couple of weeks. One of those habits is practicing French. (I’m Canadian and I think I ought to know both of the national languages. Plus, it makes good sense to be able to speak more than one language, and since I had some training in French, I thought it was the best one to start with.) Anyways, I make a point to practice French at least once a day. It hasn’t been as easy as I thought it would be, though.

If you’ve ever tried to create a new habit, you know what it’s like: you’re used to doing certain things throughout your day and as a result, it can be difficult to try to squeeze something else into the day — even if you’ve removed some of the other things that you used to do!

I returned from a trip this past Monday. As a result, I thought that this was a perfect time to try to establish a new routine. Having been away from my “regular” routine for 10+ days, I could now impose a new one. I’ve only been doing it for a few days, but so far, it’s been working great. If we look at it from a physics standpoint, it makes sense: the way I went about my day was an “object in motion,” and until that object was acted upon, it was going to maintain its course. My earlier attempts to affect its course weren’t strong enough, but when I left the country, the object was acted upon strongly enough to change course. Inertia is the relevant concept here: an object resists a change to its state of motion (or rest).

So, if you’re trying to make some changes in your life, consider going on vacation or getting out of town for a few days to shake up your routine. It just may be the change you need to make the change you need!

 

Situations Dictate Behavior: List of Biases in Judgment and Decision-Making, Part 8

We’re into the 8th week of cognitive biases. A couple of weeks ago, I was trying to decide between the confirmation bias and the fundamental attribution error and decided on the confirmation bias. I’m not sure why I decided to go with the gambler’s fallacy last week (as opposed to the fundamental attribution error), so I thought I’d “circle back” and pick up the fundamental attribution error… in case you were really pining for it.

The fundamental attribution error may sound complicated (I mean, hey, there are three words!), but it’s actually quite simple once you get the hang of it. Normally, I explain the bias and then provide examples, but I think walking through an example first will help to solidify the understanding of this bias. In a study done in 1967, Jones and Harris asked participants to assess whether a person was pro- or anti-Castro based on an essay the person had written. In one group, participants were told that the essayists were able to choose whether they wanted to write for the pro-side or the anti-side. Naturally, when participants believed that essayists had chosen their side, they rated those essayists as having correspondingly positive (or negative) feelings towards Castro. In the second group, participants were told that the essayists had their position determined by a coin flip. Meaning, the essayists had no control over whether they were going to write a positive or negative essay about Castro. It was all left up to chance (the situation!). Despite knowing this, participants, on average, still rated the positive essays as a sign that those essayists had a positive view of Castro, and the negative essays as a sign that those essayists had a negative view of Castro. Participants were blind to the situational constraints.

So that’s the fundamental attribution error — the tendency to explain someone’s behavior by their personality while overlooking the situation, even though the situation often dictates the behavior. If you’re looking for some more examples:

  • You call up your friend and find out that they’ve done nothing all day. You assume that your friend is lazy. In fact, your friend was up all night caring for their sick grandmother.
  • You’re sitting at a stop light when it turns green. You advance into the intersection only to nearly be smashed into by someone who runs the red light. You scoff at the person for running the red light. Little did you know, that person was racing to get his pregnant wife to the hospital as she’d just gone into labor. (Ironically, you’d done something similar a week earlier.)
  • Mitt Romney’s declaration that 47% of the population who don’t pay income taxes will categorically support larger government “because those ‘who are dependent upon government, who believe that they are victims, who believe the government has a responsibility to care for them’ can never be persuaded to ‘take personal responsibility and care for their lives.'” In actuality, the 47% of the population who don’t pay income taxes are “…not some distinct parasite class, but rather ordinary, hard-working people who either already have paid or will soon be paying quite substantial taxes.”

Ways for Avoiding the Fundamental Attribution Error

1a) Empathy

As with many of the other biases, empathy is one of the quickest ways to thwart its power over you. If I put myself in the shoes of another, I’m more likely to understand that there might be more going on in the situation than I can see from my perspective. For instance, if we look at the red light example from above, by empathizing with the driver who runs the red light, I have a much higher chance of understanding that their running the red light is not a demonstration of disregard for the world around them, but maybe a sign that there’s something urgent to be taken care of.

1b) “Why Would a Rational Person Behave This Way?”

The above sentence is essentially a way to create a sense of empathy, but in case empathy is an ambiguous term, I’ve marked this ‘way’ 1b. Asking yourself this question will make it easier to consider the other factors contributing to a situation.

Note: While the fundamental attribution error tells us that people make the mistake of devaluing situational factors, it’s important not to sway too far the other way and totally discount the personality factors that might be contributing to a situation. For those folks who do sway too far toward situational factors, there’s a bias for that, too: the actor-observer effect.

If you liked this post, you might like one of the other posts in this series:

Don’t Fall for the Gambler’s Fallacy: List of Biases in Judgment and Decision-Making, Part 7

A little later in the day than I would have liked, but today’s cognitive bias is the gambler’s fallacy. The bias gets its name from, as you’d expect, gambling. The easiest example to think of is flipping a coin. If you flip a coin 4 times and each of those 4 times the coin turned up heads, you’d expect the coin to turn up tails on the next flip (or at least have a higher chance of turning up tails), right? WRONG!

The odds are exactly the same on the 5th flip as the 6th flip as the 66th flip as the 11,024th flip. Why? Because each coin flip is an independent event. (Note: we’re ignoring, for the time being, any effects that quantum reality might have on a given event in the past and the future.) So, every time you flip a coin, that’s an independent event, unaffected by earlier events.

Another important example is the reverse fallacy. That is, thinking that heads is “hot” because it’s come up a number of times in a row, and that there’s therefore a better chance heads will come up again, is equally mistaken. Again, each flip is an independent event, unaffected by previous events.
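A minimal simulation makes the independence concrete: generate a pile of five-flip sequences, keep only the ones that start with four heads, and check how the fifth flip behaves.

```python
import random

random.seed(0)

# Estimate P(heads on flip 5 | flips 1-4 were all heads) by simulation.
# If past flips mattered, this conditional rate would drift from 0.5.
streaks = 0
heads_after_streak = 0
for _ in range(200_000):
    flips = [random.random() < 0.5 for _ in range(5)]
    if all(flips[:4]):                  # first four flips were all heads
        streaks += 1
        heads_after_streak += flips[4]  # did the "due for tails" flip land heads?

print(heads_after_streak / streaks)     # hovers right around 0.5
```

No matter how long the streak you condition on, the next flip stays a coin toss; neither “tails is due” nor “heads is hot” survives the arithmetic.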

The gambler’s fallacy is sometimes called the Monte Carlo fallacy because of a famous night in 1913 at the Monte Carlo Casino where, on roulette, black came up 26 times in a row. A number of gamblers reasoned that red was due because such a long run of black was so unlikely. As the story goes, they lost millions.

Other examples of the gambler’s fallacy:

  • Childbirth: “we’ve had 3 boys, so we’re going to have a girl now…”
  • Lottery: “I’ve lost 3,000 times, so I’m due for a win…”
  • Sports: “Player X is playing really well, they’re bound to start playing bad…”
  • Stock market: “Stock X has had 7 straight down days, so it’s bound to go up on this next trading day…”

Ways for Avoiding the Gambler’s Fallacy

1) Independent Events vs. Dependent Events

The biggest way to avoid the gambler’s fallacy is to understand the difference between an independent event and a dependent event. In the classic example, the odds of a coin landing on heads or tails are very nearly 50/50 (I say “very nearly” because some contend that the “heads side” weighs slightly more, giving it a tiny advantage). An example of a dependent event would be picking cards from a deck. There are 52 cards in a deck; if you pick one card without replacing it, the odds of drawing any particular remaining card rise ever so slightly (1 in 51 instead of 1 in 52).
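The contrast between the two kinds of events can be written out in a few lines of exact arithmetic; the fractions here are just standard deck probabilities.

```python
from fractions import Fraction

# Independent: a fair coin's chance of heads is 1/2 on every flip,
# no matter what happened before.
p_heads = Fraction(1, 2)

# Dependent: drawing cards without replacement shifts the remaining odds.
p_first_ace = Fraction(4, 52)             # 4 aces among 52 cards
# If the first card drawn was NOT an ace, the chance that the next
# card is an ace rises slightly: still 4 aces, but only 51 cards left.
p_next_ace_after_non_ace = Fraction(4, 51)

assert p_next_ace_after_non_ace > p_first_ace   # the past draw mattered
```

The deck has a memory (cards leave it); the coin doesn’t, and that one distinction is the whole fallacy.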

If you liked this post, you might like one of the other posts in this series:

The Confirmation Bias — What Do You Really Know: List of Biases in Judgment and Decision-Making, Part 6

Well, here we are into the sixth week of biases in judgment and decision-making. Every Monday, I look at my list of cognitive biases and I see that we’ve still got quite a few weeks to go until I’ve exhausted the biases that I want to talk about. This week was a toss-up: I was trying to decide between the fundamental attribution error and the confirmation bias. After flipping a bottle cap (seriously, there wasn’t a coin close by), I’ve decided to talk about the confirmation bias.

Like last week, the confirmation bias is easy to understand in its definition: it’s the tendency to seek out information that confirms one’s previously held beliefs. In a journal article that’s been cited over 1000 times, Ray Nickerson stated:

If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.

Why is the confirmation bias so loathed? Well, as Nickerson points out, it may be the root cause of many disputes, both on an individual and an international level. Let’s think about this for a second: in the world of objectivity “out there,” there are any number of possibilities. In the world of subjectivity “inside my head,” there are only the possibilities that I can imagine. Humans, on the whole, tend to fear change (there are over 600,000,000 results for that search on Google!). In order to allay those fears, I’m going to prefer information that already conforms to my previously held beliefs. As a result, when I look “out there,” I’m going to unconsciously be looking for things that are already “inside my head.” Let’s take this discussion out of the abstract, because there are plenty of examples of it.

If you’re still not convinced and think you’re “beyond” the confirmation bias, I would urge you to try to solve the problem on this site. If you give the problem its due respect, I bet you’ll be surprised at how your solution compares to the actual one.

Ways for Avoiding the Confirmation Bias

As with other cognitive biases, being aware that there is such a thing as the confirmation bias is really important. It can be hard to change something if you don’t know that there’s something to be changed.

1) Seek out contradictory ideas and opinions

This is something that I’ve written about before. If at all possible, you’ve got to be sure that you’re getting information that is counter to your beliefs from somewhere. If not, there’s little chance for growth and expansion. This can be difficult for some, so I’ve outlined ways to do this on the post I referenced above.

2) Seek out people with contradictory ideas and opinions

I answered a question on Quora last November where I placed these two ways for avoiding the confirmation bias one and two. Some folks might find it a little more difficult to seek out people with opposing views and that’s why I suggest starting with seeking out contradictory views in print (or some other form of media) to begin. However, in my experience, speaking with someone who has opposing views to mine (assuming that they are also altruistic in their endeavor to seek out opposing views) can be quite enriching. A real-life person can usually put up a better defense when your “confirmation bias” is activated. Similarly, you can do the same for them.

3) What do you really know?

My last suggestion for avoiding the confirmation bias is to always be questioning what it is that you know. This can sound tedious, but if you get into the habit of questioning “how” you know something or “why” you know something, you’d be surprised how ‘thin’ the argument is for some of the things you know. For instance, let’s say that you hold a racial stereotype that ethnicity “x” is bad at driving. When you’re on the highway, you notice that someone from ethnicity “x” cuts you off. Instead of going into a tizzy about ethnicity “x,” you might stop and remember that, in fact, of all the times you’ve been cut off, ethnicity “x” is the ethnicity that cuts you off the least. This is a simplified example, but I think you get the idea. Just to emphasize my point: I would argue that questioning your deeply held beliefs is a good way of countering the confirmation bias.

So, what do you really know?

If you liked this post, you might like one of the other posts in this series: