
WRAP — An Acronym from Decisive: List of Biases in Judgment and Decision-Making, Part 10

I recently came across a post from Farnam Street that seems like it would make a great addition to the series we’ve been exploring over the last 10 weeks (biases in judgment and decision-making). So, instead of going over another bias today, I thought I’d share the information I found and tie it back into our series. Check back next week for a new bias (it could be functional fixedness, the hindsight bias, the status quo bias, or maybe the recency/primacy effect(s)…).

The author of the Farnam Street blog summarized some of the work in Chip and Dan Heath’s new book, Decisive: How to Make Better Choices in Life and Work. Given that our series is about decision-making, this book seems absolutely on-point with regard to the closing section of each post (ways for avoiding *blank*).

I haven’t yet read the book, but I did want to include a brief excerpt along with its lead-in, both courtesy of Farnam Street:

The Heaths came up with a process to help us overcome these villains and make better choices. “We can’t deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you’ll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

There’s also a handy picture that goes along with this (again, courtesy of Farnam Street).

As we can see, the Heaths have offered four universal ways for avoiding biases in judgment and decision-making. If we recall the different ways for avoiding biases that we’ve discussed over the last 9 weeks, many of them can be collapsed into one of the categories listed above. In case you’re a bit hazy: many of the biases we’ve talked about before have a “way for avoiding” that falls squarely into one of the WRAP categories.

So, if you’re having trouble remembering the different ways for avoiding the biases we’ve talked about, all you have to do is remember “W-R-A-P!”

Situations Dictate Behavior: List of Biases in Judgment and Decision-Making, Part 8

We’re into the 8th week of cognitive biases. A couple of weeks ago, I was trying to decide between the confirmation bias and the fundamental attribution error and decided on the confirmation bias. I’m not sure why I decided to go with the gambler’s fallacy last week (as opposed to the fundamental attribution error), so I thought I’d “circle back” and pick up the fundamental attribution error… in case you were really pining for it.

The fundamental attribution error may sound complicated (I mean, hey, there are three words!), but it’s actually quite simple once you get the hang of it. Normally, I explain the bias and then provide examples, but I think talking through an example first will help to solidify the understanding of this bias.

In a study done in 1967, researchers asked participants to assess whether a person was pro- or anti-Castro based on an essay the person had written. In one group, participants were told that the essayists had been free to choose whether they wanted to write for the pro-side or the anti-side. Of course, when participants believed that the essayists had chosen their side, they rated the writers of pro-Castro essays as having positive feelings towards Castro and the writers of anti-Castro essays as having negative feelings towards him. In the second group, participants were told that each essayist’s position had been determined by a coin flip. Meaning, the essayists had no control over whether they were going to write a positive or negative essay about Castro; it was all left up to chance (the situation!). Despite knowing this, participants still, on average, rated the positive essays as a sign that those essayists had a positive view of Castro, and the negative essays as a sign that those essayists had a negative view of him. In other words, participants largely ignored the situational constraints.

So that’s the fundamental attribution error: our tendency to attribute a person’s behavior to their personality, rather than to the situation, even when the situation is what dictates the behavior. If you’re looking for some more examples:

  • You call up your friend and find out that they’ve done nothing all day. You assume that your friend is lazy. In fact, your friend was up all night caring for their sick grandmother.
  • You’re sitting at a stop light when it turns green. You advance into the intersection only to nearly be smashed into by someone who runs the red light. You scoff at the person for running the red light. Little did you know that the person was racing to get their pregnant wife to the hospital, as she’d just gone into labor. (Ironically, you’d done something similar the week before.)
  • Mitt Romney’s declaration that 47% of the population who don’t pay income taxes will categorically support larger government “because those ‘who are dependent upon government, who believe that they are victims, who believe the government has a responsibility to care for them’ can never be persuaded to ‘take personal responsibility and care for their lives.'” In actuality, the 47% of the population who don’t pay income taxes are “…not some distinct parasite class, but rather ordinary, hard-working people who either already have paid or will soon be paying quite substantial taxes.”

Ways for Avoiding the Fundamental Attribution Error

1a) Empathy

As with many of the other biases, empathy is one of the quickest ways to thwart this bias’s power over you. If I put myself in the shoes of another, I’m more likely to understand that there might be more going on in the situation than I can see from my perspective. For instance, if we look at the red light example from above, by empathizing with the driver who runs the red light, I have a much better chance of understanding that their running the red light is not a demonstration of their disregard for the world around them, but perhaps a sign that there’s something urgent to be taken care of.

1b) “Why Would a Rational Person Behave This Way?”

The question above is essentially a way to create a sense of empathy, but in case empathy seems like an ambiguous term, I’ve marked this ‘way’ as 1b. Asking yourself this question will make it easier to consider the other factors contributing to a situation.

Note: While the fundamental attribution error tells us that people make the mistake of devaluing situational factors, it’s important not to swing too far the other way and totally discount the personality factors that might be contributing to a situation. For those folks who do lean too heavily on situational explanations of behavior, there’s a bias for that, too: the actor-observer effect.

If you liked this post, you might like one of the other posts in this series.

The Confirmation Bias — What Do You Really Know: List of Biases in Judgment and Decision-Making, Part 6

Well, here we are in the sixth week of biases in judgment and decision-making. Every Monday, I look at my list of cognitive biases and see that we’ve still got quite a few weeks to go until I’ve exhausted the biases I want to talk about. This week was a toss-up: I was trying to decide between the fundamental attribution error and the confirmation bias. After flipping a bottle cap (seriously, there wasn’t a coin close by), I decided to talk about the confirmation bias.

Like last week’s bias, the confirmation bias is easy to understand from its definition: it’s the tendency to seek out information that confirms one’s previously held beliefs. In a journal article that’s been cited over 1,000 times, Ray Nickerson stated:

If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.

Why is the confirmation bias so loathed? Well, as Nickerson points out, it may be the root cause of many disputes, both on an individual and an international level. Let’s think about this for a second: in the world of objectivity “out there,” there are any number of possibilities, while in the world of subjectivity “inside my head,” there are only the possibilities that I can imagine. Humans, on the whole, tend to fear change (there are over 600,000,000 Google results on the subject!). In order to allay those fears, I’m going to prefer information that already conforms to my previously held beliefs. As a result, when I look “out there,” I’m going to be unconsciously looking for things that are already “inside my head.” Let’s take this discussion out of the abstract, because there are plenty of examples of it.

If you’re still not convinced and think you’re “beyond” the confirmation bias, I would urge you to try to solve the problem on this site. If you give the problem its due respect, I bet you’ll be surprised by the difference between your solution and the actual one.

Ways for Avoiding the Confirmation Bias

As with other cognitive biases, being aware that there is such a thing as the confirmation bias is really important. It can be hard to change something if you don’t know that there’s something to be changed.

1) Seek out contradictory ideas and opinions

This is something that I’ve written about before. If at all possible, you’ve got to be sure that you’re getting information that runs counter to your beliefs from somewhere. If not, there’s little chance for growth and expansion. This can be difficult for some, so I’ve outlined ways to do it in the post I referenced above.

2) Seek out people with contradictory ideas and opinions

I answered a question on Quora last November where I ranked these two ways for avoiding the confirmation bias first and second. Some folks might find it a little more difficult to seek out people with opposing views, and that’s why I suggest starting with contradictory views in print (or some other form of media). However, in my experience, speaking with someone who holds views opposed to mine (assuming they’re also genuine in their endeavor to seek out opposing views) can be quite enriching. A real-life person can usually put up a better defense when your “confirmation bias” is activated. Similarly, you can do the same for them.

3) What do you really know?

My last suggestion for avoiding the confirmation bias is to always be questioning what it is that you know. This can sound tedious, but if you get into the habit of questioning “how” or “why” you know something, you’d be surprised how ‘thin’ the argument is for some of the things you know. For instance, let’s say that you hold a racial stereotype that ethnicity “x” is bad at driving. When you’re on the highway, you notice that someone from ethnicity “x” cuts you off. Instead of going into a tizzy about ethnicity “x,” you might stop and remember that, in fact, of all the times you’ve been cut off, ethnicity “x” is the ethnicity that cuts you off the least. It’s a crude example, but I think you get the idea. Just to emphasize my point: I would argue that questioning your deeply held beliefs is a good way of countering the confirmation bias.

So, what do you really know?

If you liked this post, you might like one of the other posts in this series.