Tag Archives: Planning fallacy

Ways For Avoiding Cognitive Biases: List of Biases in Judgment and Decision-Making, Part 16

It’s Monday, so that means it’s time for another cognitive bias. However, I’ve finished the list of cognitive biases that I wanted to highlight. Of course, there are many more biases that could be discussed, but I thought those 14 were some of the more important cognitive biases. With today’s post, I thought I would review all of the ways for avoiding the biases, categorized by bias. So, I’ll list each bias and recount the ways that I suggested for avoiding the bias.

This is going to be a jam-packed post (over 3,000 words!). I highly recommend bookmarking this post and coming back to it as a reference. Alrighty, with that being said, let's start with the sunk cost fallacy.

Ways for Avoiding the Sunk Cost Fallacy

So, now that we've looked at the sunk cost fallacy, how can we avoid it? Well, the first step in avoiding the sunk cost fallacy is recognizing it. Hopefully, the above examples have given you an idea of how this bias can arise. There are two other ways I want to highlight that you can use to avoid this trap.

1) What am I assuming?

The crux of the sunk cost fallacy is an assumption: that because you've already spent money on X, you should keep spending money on X. If you look at what it is that you're assuming about a situation, you just might find that you're about to step into the sunk cost trap.

2) Are there alternatives?

Related to the question above is the idea of alternatives. You're not bound to a decision because you've made a similar decision in the past. Just because you bought the ticket to go to the movie, if another activity presents itself as more enticing, you're allowed to choose that one instead. In fact, if you sit down to watch the movie and it's bad, you're allowed to get up and walk out. Don't fall into the sunk cost trap thinking that you have to stay because you paid for it. There are any number of things you could be doing: going for a walk, calling an old friend, etc.

Ways for Avoiding Loss Aversion

As with the sunk cost fallacy, one of the most important ways to avoid loss aversion is to recognize it. That is, knowing that humans have a tendency toward loss aversion is an important first step in not falling into its trap.

1) What’s the big picture?

In our example of golf, that might mean knowing where you are in relation to the other players you're competing with in the tournament (rather than where your ball is in relation to the hole and what specific stroke you're about to hit). In business, one might examine a decision about one business unit in relation to the entire company (rather than looking myopically at the one business unit).

2) Am I afraid of losing something?

This may seem like an obvious solution, but it's pretty important. If, before making a decision, you can ask yourself (or have your team ask itself), "Am I afraid to lose something here?" you might find that you are, and that realization could help you or your company avoid falling into the trap of loss aversion.

3) Do you really expect to never lose anything — ever?

Loss is inevitable. Sometimes, you won't make that par putt (or that birdie putt). Sometimes, when you negotiate a deal, you won't get the best deal. Sometimes, the decision to sell that business unit might result in losses somewhere else. If you can come to grips with the fact that every decision you make won't be perfect and that sometimes you will lose, you may begin to shift your expectations about loss.

Ways for Avoiding the Endowment Effect

1) Am I emotional?

A seemingly obvious way to avoid the endowment effect is assessing whether our emotions are involved. Don't get me wrong, emotions are a good thing, but they're a surefire way to end up overvaluing things that you own. That is, if you find yourself overly connected to something, your emotions might be getting in the way.

2) Independent Evaluation

This dovetails nicely with the idea of being unemotional. To guard against succumbing to the endowment effect, be sure to get an independent appraisal of whatever it is you're looking to sell. While you'll still have the final say on what you sell and how much you sell it for, having a second pair of eyes look at your side of the "deal" might help you determine whether your judgment's clouded.

3) Empathy

I wasn't going to include this initially, but after reading the research, it certainly fits. Before I go on, I should clear up a possible point of confusion: I just suggested asking whether one is emotional, and now I'm saying to practice empathy? For those wondering, being emotional is not the same thing as being empathetic. Back to empathy and the endowment effect. In situations where we're selling something, researchers found there to be an empathy deficit when the endowment effect was present. So, to counter this, you should try to empathize with the person you're negotiating with.

Ways for Avoiding the Planning Fallacy

With the first three biases I talked about, awareness was a key step in overcoming the bias. While you could make that argument for the planning fallacy, one of the hallmarks of [the fallacy] is that people know they've erred in the past and still make the mistake of underestimating. So, we'll need to move beyond awareness to help us defend against this bias.

1) Data is your friend

No, I don’t mean Data from Star Trek (though Data would probably be quite helpful in planning), but now that I think about it, Data (the character) might be a good way to position this ‘way for avoiding the planning fallacy.’ For those of you not familiar, Data is a human-like android. In thinking about this way for avoiding the planning fallacy, think about how Data might estimate the length of time it would take to complete a project. It would be very precise and data-driven. Data would likely look at past projects and how long it took for those to be finished to decide the length of time needed for this new project. To put it more broadly, if you have statistics on past projects (that were similar) absolutely use them in estimating the completion time of the new project.
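To make that concrete, here's a minimal sketch of what "letting the data decide" might look like. The past-project durations below are made up purely for illustration; the point is that the estimate comes from what similar projects actually took, not from how different this one feels.

```python
import statistics

# Hypothetical durations (in days) of similar past projects -- illustration only.
past_durations_days = [12, 18, 15, 22, 14, 19, 25, 16]

# A typical outcome: the median of the reference class.
median_estimate = statistics.median(past_durations_days)

# A more cautious estimate: roughly the 90th percentile of past durations.
p90_estimate = sorted(past_durations_days)[int(0.9 * (len(past_durations_days) - 1))]

print(f"Typical (median) duration: {median_estimate} days")
print(f"Cautious (90th percentile) estimate: {p90_estimate} days")
```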

2) Get a second opinion

When we think about the project completion time of one project in relation to another project, we often think about the nuances that make this project different from that project — and by extension — why this project won’t take as long as that project. Planning fallacy. If you can, ask someone who has experience in project completion in the area for which you’re estimating. When you ask this person, be sure not to tell them all the “various ways why this project is different,” because it probably isn’t and it’s only going to cloud the predictive ability of the person you’re asking. You’re probably going to hear an estimate that’s larger than you thought, but I bet you that it’s probably a lot closer to the real project completion time than the estimate you made based on thinking about the ways that this project was going to be different than all the other projects like it.

Ways for Avoiding the Framing Effect

1) Reframe the question

It may seem obvious, but you’d be surprised how many people don’t consider “reframing” the frame with which they are looking at a situation. For instance, in the example from earlier, instead of looking at it as a choice between Program A and Program B, someone could reframe Program A so that it looks like Program C and do the same with Program B, so that it looks like Program D. As a result, one would then be getting a “fuller” picture of their choice.

2) Empathy — assume someone else’s perspective

Many choices implicate someone else in the situation. As a result, it might be worth it to put yourself in the shoes of that other person to see how they would view a given situation. This is similar to the reframe, but is more specific in that it might serve to help the person remove themselves a little bit from the decision. That is, when we're faced with a choice, our personal biases can have a big impact on the decision we make. When we imagine how someone else might make this decision, we're less likely to succumb to our personal biases.

3) Parse the question

Some questions present us with a dichotomous choice: are apples good or bad? Should we exercise in the morning or the evening? Are gap years helpful or harmful? When faced with a question like this, I would highly recommend parsing the question. That is, are we sure that apples can only be good or bad? Are we sure that exercising in the morning or the evening are our only options? Often times, answers to questions aren't simply this or that. In fact, more times than not, there is a great deal of grey area. Unfortunately, when the question is framed in such a way, it makes it very difficult to see the possibility of the grey area.

Ways for Avoiding the Confirmation Bias

As with other cognitive biases, being aware that there is such a thing as the confirmation bias is really important. It can be hard to change something if you don’t know that there’s something to be changed.

1) Seek out contradictory ideas and opinions

This is something that I've written about before. If at all possible, you've got to be sure that you're getting information that runs counter to your beliefs from somewhere. If not, there's little chance for growth and expansion. This can be difficult for some, so I've outlined ways to do this in the post I referenced above.

2) Seek out people with contradictory ideas and opinions

I answered a question on Quora last November where I ranked these two ways for avoiding the confirmation bias first and second. Some folks might find it a little more difficult to seek out people with opposing views, and that's why I suggest starting with contradictory views in print (or some other form of media). However, in my experience, speaking with someone who has opposing views to mine (assuming that they are also genuine in their endeavor to seek out opposing views) can be quite enriching. A real-life person can usually put up a better defense when your "confirmation bias" is activated. Similarly, you can do the same for them.

3) What do you really know?

My last suggestion for avoiding the confirmation bias is to always be questioning what it is that you know. This can sound tedious, but if you get into the habit of questioning "how" you know something or "why" you know something, you'd be surprised how 'thin' the argument is for some of the things you know. For instance, let's say that you have a racial stereotype that ethnicity "x" is bad at driving. When you're on the highway, you notice that someone from ethnicity "x" cuts you off. Instead of going into a tizzy about ethnicity "x," you might stop and remember that, in fact, of all the times that you've been cut off, ethnicity "x" is the ethnicity that cuts you off the least. This is a simplistic example, but I think you get the idea. Just to emphasize my point: I would argue that questioning your deeply held beliefs is a good way of countering the confirmation bias.

Ways for Avoiding the Gambler’s Fallacy

1) Independent Events vs. Dependent Events

The biggest way to avoid the gambler's fallacy is to understand the difference between an independent event and a dependent event. In the classic example, the odds of a coin landing on heads or tails are (negligibly) 50/50; I say negligibly because there are those who contend that the "heads side" weighs more and thus gives it a slight advantage. An example of a dependent event would be picking cards from a deck. There are 52 cards in a deck, and if you pick one card without replacing it, the odds of drawing any particular one of the remaining 51 cards increase (ever so slightly, from 1/52 to 1/51).
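If it helps, here's a tiny sketch (with made-up draws) of the distinction. A coin flip's odds don't budge no matter what streak preceded it, while removing a card from the deck really does change the odds for the next draw.

```python
import random

# Independent events: a fair coin has no memory. Even after five heads in a
# row, the next flip is still 50/50.
streak = ["H", "H", "H", "H", "H"]   # an illustrative run of heads
p_next_heads = 0.5                   # unchanged by the streak

# Dependent events: drawing cards without replacement changes the odds.
deck = list(range(52))
drawn = deck.pop(random.randrange(len(deck)))  # remove one card at random

# Before the draw, any specific card had a 1/52 chance of coming up next;
# now each remaining card has a 1/51 chance.
print(f"P(next flip is heads) after {''.join(streak)}: {p_next_heads}")
print(f"P(a specific remaining card) before: {1/52:.4f}, after: {1/51:.4f}")
```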

Ways for Avoiding the Fundamental Attribution Error

1a) Empathy

As with many of the other biases, empathy is one of the quickest ways to thwart its power over you. If I put myself in the shoes of another, I'm more likely to understand that there might be more going on in the situation than I can see from my perspective. For instance, if we look at the red light example from above, by empathizing with the driver who runs the red light, I have a much higher chance of understanding that their running the red light is not a demonstration of their disregard for the world around them, but maybe a sign that there's something urgent to be taken care of.

1b) “Why Would a Rational Person Behave This Way?”

The above question is essentially a way to create a sense of empathy, but in case empathy is an ambiguous term, I've marked this 'way' 1b. Asking yourself this question will make it easier to consider the other factors contributing to a situation.

Ways for Avoiding the Overconfidence Effect

1) Know what you know (and don’t know)

The fastest way to slip into the trap of the overconfidence effect is to start making "confident" predictions about things that you don't know about. Guessing the number of paper clips in a bottle is something that most of us have little to no expertise in, so give yourself a wide confidence interval. If you have no experience in managing a project, it might be in your best interest not to make a prediction about how long it will take to complete the project (planning fallacy).

2) Is this person really an expert?

Sometimes, you’ll hear someone displaying a level of confidence in a given situation that makes you think they know what they’re talking about. As a result, it might bias you into believing what they are saying. It’s important to know if this person is an expert in this field, or if maybe they’re succumbing to the overconfidence effect.

Ways for Avoiding the Halo Effect

1) Different strengths for different tasks

One of the easiest ways to avoid falling into the trap of the halo effect is to notice that there are different skills/strengths required for different tasks. As such, just because someone is good at climbing mountains doesn't mean that they would make a good politician. The strengths/skills required for those two tasks are different. Put another way, think about the strengths/skills required for a particular task before evaluating whether someone would be good at that task.

2) Notice other strengths (or weaknesses)

It's been said that "nobody's perfect." When someone is good at one thing, there's a good chance that they won't be good at something else. Noticing that this person isn't good at something else may help to quell the urge to assume that this person is good at everything.

Ways for Avoiding the Primacy/Recency Effect(s)

How you avoid these two biases really depends on the context of the decision you're making. For instance, if you want people to remember something, you probably don't want to give them a long list (thereby inviting one of these two biases). There are some general ways to mitigate these biases, though.

1) Keep a record (write down the data)

One of the simplest ways that either of these biases can have an impact on a decision is when there isn't a record of the data. If you're just making a decision based on what you remember, there will be an unwarranted weighting toward the beginning or the end. As a result, keeping a record of the choices can make it easier to evaluate all choices objectively.

2) Standardized data

As I mentioned earlier in this post, it's important that the data by which you're evaluating a choice be standardized. As we looked at in number one, keeping data isn't always enough. It's important that the data be uniform across choices, so an evaluation can be made. In this way, it's easier to look at earlier choices and later choices equally, whereas if this weren't instituted, there might be a slight bias toward the beginning or the end. This tip works for situations like making a purchase (and gathering data), interviewing candidates, or anything that can be analogized to either of these two.
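As a rough sketch of what "standardized data" might look like in practice, imagine scoring every interview candidate on the same criteria and comparing totals, rather than relying on whoever is freshest in memory. The names and scores below are entirely made up.

```python
# Score every candidate on the same criteria so early and late interviews are
# judged on the record, not on memory. All names and numbers are hypothetical.
criteria = ["experience", "communication", "problem_solving"]

candidates = {
    "Candidate A (interviewed first)": {"experience": 4, "communication": 3, "problem_solving": 5},
    "Candidate B (interviewed last)":  {"experience": 3, "communication": 4, "problem_solving": 4},
}

for name, scores in candidates.items():
    total = sum(scores[c] for c in criteria)
    print(f"{name}: total score = {total}")
```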

Ways for Avoiding Functional Fixedness

1) Practice, practice, practice

Probably the easiest and most effective way of overcoming functional fixedness is to practice. What does that mean? Well, take a box of miscellaneous things and see if you can design something fun/creative. The emphasis should be on using those things in a way that they weren’t designed. For instance, if you’re using a toolbox, you might think about how you can use something like wrenches to act as “legs” of a table or as a conductive agent for an electrical circuit.

2) Observant learning — Find examples

Another good way of overcoming functional fixedness is to look at other examples of people who have overcome functional fixedness. When I was giving a presentation on functional fixedness to a group (of college students) about a year ago, I showed the video below. About halfway through the video, one of them remarked: “So, basically, it’s how to be a college student 101.”

Ways for Avoiding the Status Quo Bias

1) Independent Evaluation

It really can be as easy as this. Have someone (or do it yourself) do a cost-benefit analysis on the situation/decision. In this way, you’ll be able to see the pros/cons of your decision in a new light. Of course, you may still succumb to the status quo bias, but you might be less likely to do so.

2) Role Reversal

While the independent evaluation makes “good sense” in trying to avoid this bias, doing some sort of role reversal will probably be the most effective. That is, look at the decision/situation from the other perspective. If it’s a negotiation, imagine that you’re in your negotiating partner’s shoes and you’re actually doing the trade from that side. Evaluate the deal. This may help to shake loose the status quo bias.

Ways for Avoiding the Hindsight Bias

1) Write it down!

This might be a bit tedious, but it’s a surefire way to guard against the hindsight bias. I’ve read a few articles about folks who’ve documented every prediction that they’ve ever made. While this had more to do with their profession (forecasting, stocks, etc.) it might be something you want to consider.
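If you want to go one step further than writing predictions down, you can score them. Here's a minimal sketch of a prediction log (the entries are invented) scored with a Brier score, which measures how far your stated confidence was from what actually happened; lower is better.

```python
from datetime import date

# A hypothetical prediction log: what was predicted, how confident you were,
# and (filled in later) whether it came true.
predictions = [
    {"date": date(2013, 1, 5), "claim": "Project ships by March", "confidence": 0.9, "came_true": False},
    {"date": date(2013, 2, 1), "claim": "Team hits Q1 target",    "confidence": 0.6, "came_true": True},
]

# Brier score: mean squared gap between stated confidence and the outcome.
brier = sum(
    (p["confidence"] - (1.0 if p["came_true"] else 0.0)) ** 2 for p in predictions
) / len(predictions)

print(f"Brier score across {len(predictions)} predictions: {brier:.3f}")
```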

2) “I knew it all along!”

Have you ever found yourself saying, "I knew it all along," or "I was sure that was going to happen"? These are good indicators that you're probably operating under the hindsight bias. When you catch yourself saying these phrases, stop and think about what has happened in the situation. Chances are that you've "short-circuited" and you're not thinking about what's happened to cause that situation.

WRAP — An Acronym from Decisive: List of Biases in Judgment and Decision-Making, Part 10

I recently came across a post from Farnam Street that seems like it would make a great addition to the series we’ve been exploring over the last 10 weeks (biases in judgment and decision-making). So, instead of going over another bias today, I thought I’d share the information I found and tie it back into our series. Check back next week for a new bias (it could be functional fixedness, the hindsight bias, the status quo bias, or maybe the recency/primacy effect(s)…)

The author of the Farnam Street blog summarized some of the work in Chip and Dan Heath’s new book: Decisive: How to Make Better Choices in Life and Work. Given that our series is about decision-making, this book seems like it would be absolutely on-point, with regard to the closing of each post (How to avoid *blank*).

I haven’t yet read the book, but I did want to include a brief excerpt (courtesy of Farnam Street), along with a lead-in from Farnam Street:

The Heaths came up with a process to help us overcome these villains and make better choices. “We can’t deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you’ll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

There’s also a handy picture that’s included (again, courtesy of Farnam Street):

As we can see, the Heaths have offered four universal ways for avoiding biases in judgment and decision-making. If we recall some of the different ways for avoiding biases that we’ve discussed over the last 9 weeks, many of them can be collapsed into one of the categories listed above. In case you’re a bit hazy, here are some of the biases that we’ve talked about before that have a “way for avoiding” that falls into one of the categories above:

So, if you’re having trouble remembering the different ways for avoiding the biases we’ve talked about, all you have to do is remember “W-R-A-P!”

When 99% Confident Leads to Wrongness 40% of the Time: List of Biases in Judgment and Decision-Making, Part 9

This week, we're looking at one of my 'favorite' biases, in that once you know it, it can be quite comical to spot in others (and in yourself, if you still fall for it from time to time). From Wikipedia: the overconfidence effect "is a well-established bias in which someone's subjective confidence in their judgments is reliably greater than their objective accuracy, especially when confidence is relatively high." That's a bit jargon-y, so let me illustrate with a simple example.

In fact, this example comes from a lecture I heard about negotiation and talked about in one of my previous posts. In case you were wondering, the lecture comes from Professor Margaret Neale at Stanford. It was brilliant! There was so much information packed into the lecture. I remember listening to it a few different times and still pulling out nuggets of wisdom. Anyway, I digress. The example that Prof. Neale uses is particularly on point for illustrating the overconfidence effect.

She has a pop bottle partly filled with paper clips. She tells the crowd that she wants them to guess how many paper clips are in the bottle, and she walks up and down the aisles so they can get a closer look. She instructs the crowd to write down their answer. Then, she asks them to write down a range within which they could be 100% (she may say 99%, I don't remember) sure that the true number of paper clips fell. Essentially, she was asking for a confidence interval. I think she also told them that she was sure there weren't more than 1,000,000 paper clips in there.

After some time, she tells the audience how many were in there. She asks if anyone got it right (no one raises their hand). She then says something to the effect of, "For how many of you did the number of paper clips fall within your range?" Maybe 35% of the room raised their hand. 35%! She exclaims that this is terrible given that all of these people were 100% (or 99%) sure that the number would fall within their range. In fact, she said that in a room that size, there should have only been a handful of people whose range missed the true number (if the 99% figure was being used, rather than 100%).

Prof. Neale then goes on to explain that this is the overconfidence effect. The audience was being asked to estimate something about which they knew nothing, and then asked to rate their confidence. Knowing that they knew nothing about the topic, it would have been logical for the audience to give a very wide confidence interval (between 10 paper clips and 20,000 paper clips), or even wider.
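Here's a minimal sketch of the calibration check Prof. Neale is effectively running (all numbers invented): if everyone's "99% sure" ranges really deserved that label, nearly all of them should contain the true count.

```python
# Hypothetical paper-clip exercise: the true count and a few "99% sure" ranges.
true_count = 712
ranges_99 = [
    (300, 600),
    (500, 900),
    (650, 700),
    (100, 2000),
]

# Count how many stated ranges actually contain the true count.
hits = sum(low <= true_count <= high for low, high in ranges_99)
print(f"{hits} of {len(ranges_99)} ranges contained the true count "
      f"({hits / len(ranges_99):.0%}); a well-calibrated 99% range should hit ~99% of the time.")
```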

This happens in more ways than just estimating the number of paper clips in a bottle. We also see this with investors. When asked, fund managers typically report having performed above-average service. In fact, 74% report having delivered above-average service, while the remaining 26% report having rendered average service.

Another place that we see the overconfidence effect show up is with the planning fallacy: “Oh yeah, I can finish that in two weeks…”

Ways for Avoiding the Overconfidence Effect

1) Know what you know (and don’t know)

The fastest way to slip into the trap of the overconfidence effect is to start making "confident" predictions about things that you don't know about. Guessing the number of paper clips in a bottle is something that most of us have little to no expertise in, so give yourself a wide confidence interval. If you have no experience in managing a project, it might be in your best interest not to make a prediction about how long it will take to complete the project (planning fallacy).

2) Is this person really an expert?

Sometimes, you’ll hear someone displaying a level of confidence in a given situation that makes you think they know what they’re talking about. As a result, it might bias you into believing what they are saying. It’s important to know if this person is an expert in this field, or if maybe they’re succumbing to the overconfidence effect.

From Scott Plous's book, The Psychology of Judgment and Decision Making: "Overconfidence has been called the most 'pervasive and potentially catastrophic' of all the cognitive biases to which human beings fall victim. It has been blamed for lawsuits, strikes, wars, and stock market bubbles and crashes."

If you liked this post, you might like one of the other posts in this series:

Perspective and the Framing Effect: List of Biases in Judgment and Decision-Making, Part 5

Since I was going to talk about the framing effect last week (and opted for the planning fallacy instead because of circumstances), I thought I’d get into the framing effect this week. The framing effect is a very easy bias to understand, in that it’s not as complicated in its description as some of the other biases are. In short, the framing effect is how people can react differently to choices depending on whether the circumstances are presented as gains or losses.

The famous example of the framing effect comes from a paper by Kahneman (who I’ve mentioned before) and Tversky in 1981:

Problem 1: Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimate of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. [72 percent]

If Program B is adopted, there is 1/3 probability that 600 people will be saved, and 2/3 probability that no people will be saved. [28 percent]

As you can see from the percentages in brackets, people opted for the sure thing. Now, let’s look at the second part of this study:

If Program C is adopted 400 people will die. [22 percent]

If Program D is adopted there is 1/3 probability that nobody will die, and 2/3 probability that 600 people will die. [78 percent]

Did you notice something? Program C is identical to Program A, and yet the percentage of people who were opting for Program C dropped tremendously! Similarly, notice that Program D’s percentage went way up — even though it’s the same thing as Program B. This is the framing effect in action. Is it frightening to you that we’re so susceptible to changing our mind based simply on how a choice is framed? If it’s not, it certainly should be.
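If you want to convince yourself that the two framings really do describe the same choices, a quick arithmetic check does it (using the numbers quoted above):

```python
# The Asian disease problem: 600 lives at risk.
total_at_risk = 600

# Program A ("200 saved") and Program C ("400 die") are the same outcome.
saved_A, die_C = 200, 400
assert saved_A == total_at_risk - die_C

# Program B and Program D are also identical in expectation.
expected_saved_B = (1 / 3) * 600 + (2 / 3) * 0                    # framed as lives saved
expected_saved_D = total_at_risk - ((1 / 3) * 0 + (2 / 3) * 600)  # framed as deaths
print(expected_saved_B, expected_saved_D)  # both 200.0
```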

Ways for Avoiding the Framing Effect

1) Reframe the question

It may seem obvious, but you’d be surprised how many people don’t consider “reframing” the frame with which they are looking at a situation. For instance, in the example from earlier, instead of looking at it as a choice between Program A and Program B, someone could reframe Program A so that it looks like Program C and do the same with Program B, so that it looks like Program D. As a result, one would then be getting a “fuller” picture of their choice.

2) Empathy — assume someone else’s perspective

Many choices implicate someone else in the situation. As a result, it might be worth it to put yourself in the shoes of that other person to see how they would view a given situation. This is similar to the reframe, but is more specific in that it might serve to help the person remove themselves a little bit from the decision. That is, when we're faced with a choice, our personal biases can have a big impact on the decision we make. When we imagine how someone else might make this decision, we're less likely to succumb to our personal biases.

3) Parse the question

Some questions present us with a dichotomous choice: are apples good or bad? Should we exercise in the morning or the evening? Are gap years helpful or harmful? When faced with a question like this, I would highly recommend parsing the question. That is, are we sure that apples can only be good or bad? Are we sure that exercising in the morning or the evening are our only options? Often times, answers to questions aren’t simply this or that. In fact, more times than not, there is a great deal of grey area. Unfortunately, when the question is framed in such a way, it makes it very difficult to see the possibility of the grey area.

If you liked this post, you might like one of the other posts in this series:

Get a Second Opinion Before You Succumb to the Planning Fallacy: List of Biases in Judgment and Decision-Making, Part 4

I know that I said that I was going to be talking about a new bias in judgment and decision-making every Monday, and I know that today is Tuesday. To be honest, I underestimated how long it would take me to prepare for my seminar in International Relations. Aside: if you want to challenge yourself, take a course in a subject you know very little about and be amazed at how much you feel like you've been dropped into the ocean and told to swim! It can be a little unnerving at first, but if you're into exploring and open to new experiences, it can be quite satisfying. Anyway, yesterday I'd planned to talk about the framing effect, but since I so conveniently demonstrated the planning fallacy, I thought I'd talk about that instead.

The consequence of this post being written/published today is directly related to my falling into the trap of the planning fallacy. I planned for the preparation for my International Relations class to take a certain amount of time. When that time lasted longer than I had anticipated, I had no time left to write about a bias in judgment and decision-making. The planning fallacy is our tendency to underestimate how long we’ll need to complete a task — especially when we’ve had experiences where we’ve underestimated similar tasks.

This is something that even the best of us fall prey to. In fact, one of the biggest names in cognitive biases, Daniel Kahneman (Nobel Prize in economics, but a PhD in psychology!), has said that even he still has a hard time with the planning fallacy. Of course, this doesn't make it permissible for us not to try to prevent the effects of the planning fallacy.

Before we get into ways for avoiding the planning fallacy, I want to share an excerpt from an oft-cited study when discussing the planning fallacy [emphasis added]:

Participants were provided with a series of specific confidence levels and were asked to indicate the completion time corresponding to each confidence level. In this manner, the participants indicated times by which they were 50% certain they would finish their projects (and 50% certain they would not), 75% certain they would finish, and 99% certain they would finish. When we examined the proportion of subjects who finished by each of these forecasted times, we found evidence of overconfidence. Consider the academic projects: only 12.8% of the subjects finished their academic projects by the time they reported as their 50% probability level, only 19.2% finished by the time of their 75% probability level, and only 44.7% finished by the time of their 99% probability level. The results for the 99% probability level are especially striking: even when they make a highly conservative forecast, a prediction that they feel virtually certain that they will fulfill, people’s confidence far exceeds their accomplishments.

There were a lot of numbers/percentages offered in the excerpt, so I've also included a visual representation of the data in a graph below. This graph comes from a book chapter by a couple of the same authors, but it is about the data in the preceding excerpt.

[Graph: proportion of participants who finished by their 50%, 75%, and 99% confidence-level forecasted completion times]

Ways for Avoiding the Planning Fallacy

With the first three biases I talked about, awareness was a key step in overcoming the bias. While you could make that argument for the planning fallacy, one of the hallmarks of [the fallacy] is that people know they’ve erred in the past and still make the mistake of underestimating. So, we’ll need to move beyond awareness to help us defend against this bias.

1) Data is your friend

No, I don’t mean Data from Star Trek (though Data would probably be quite helpful in planning), but now that I think about it, Data (the character) might be a good way to position this ‘way for avoiding the planning fallacy.’ For those of you not familiar, Data is a human-like android. In thinking about this way for avoiding the planning fallacy, think about how Data might estimate the length of time it would take to complete a project. It would be very precise and data-driven. Data would likely look at past projects and how long it took for those to be finished to decide the length of time needed for this new project. To put it more broadly, if you have statistics on past projects (that were similar) absolutely use them in estimating the completion time of the new project.

2) Get a second opinion

When we think about the project completion time of one project in relation to another project, we often think about the nuances that make this project different from that project — and by extension — why this project won’t take as long as that project. Planning fallacy. If you can, ask someone who has experience in project completion in the area for which you’re estimating. When you ask this person, be sure not to tell them all the “various ways why this project is different,” because it probably isn’t and it’s only going to cloud the predictive ability of the person you’re asking. You’re probably going to hear an estimate that’s larger than you thought, but I bet you that it’s probably a lot closer to the real project completion time than the estimate you made based on thinking about the ways that this project was going to be different than all the other projects like it.

If you liked this post, you might like the first three posts in this series: