Ways For Avoiding Cognitive Biases: List of Biases in Judgment and Decision-Making, Part 16

It’s Monday, so that means it’s time for another cognitive bias. However, I’ve finished the list of cognitive biases that I wanted to highlight. Of course, there are many more biases that could be discussed, but I thought those 14 were some of the more important cognitive biases. With today’s post, I thought I would review all of the ways for avoiding the biases, categorized by bias. So, I’ll list each bias and recount the ways that I suggested for avoiding the bias.

This is going to be a jam-packed post (over 3,000 words!), so I highly recommend bookmarking it and coming back to it as a reference. Alrighty, with that being said, let’s start with the sunk cost fallacy.

Ways for Avoiding the Sunk Cost Fallacy

So, now that we’ve looked at the sunk cost fallacy, how can we avoid it? Well, the first step in avoiding the sunk cost fallacy is recognizing it. Hopefully, the above examples have given you an idea of how this bias can arise. There are two other ways I want to highlight that you can use to avoid this trap.

1) What am I assuming?

The crux of the sunk cost fallacy is an assumption. That is, you’re assuming that because you’ve already spent money on X, you should keep spending money on X. If you look at what it is you’re assuming about a situation, you just might find that you’re about to step into the sunk cost trap.

2) Are there alternatives?

Related to the above is considering alternatives. You’re not bound to a decision because you’ve made a similar decision in the past. Just because you bought the ticket to the movie, if another activity presents itself as more enticing, you’re allowed to choose that one instead. In fact, if you sit down to watch the movie and it’s bad, you’re allowed to get up and walk out. Don’t fall into the sunk cost trap of thinking you have to stay because you paid for it. There are any number of things you could be doing instead: going for a walk, calling an old friend, etc.

Ways for Avoiding Loss Aversion

As with the sunk cost fallacy, one of the most important ways to avoid loss aversion is to recognize it. That is, knowing that humans have a tendency toward loss aversion is an important first step in not falling into its trap.

1) What’s the big picture?

In our example of golf, that might mean knowing where you are in relation to the other players you’re competing with in the tournament (rather than where your ball is in relation to the hole and what specific stroke you’re about to hit). In business, one might examine a decision about one business unit in relation to the entire company (rather than looking myopically at the one business unit).

2) Am I afraid of losing something?

This may seem like an obvious solution, but it’s pretty important. If, before making a decision, you can ask yourself (or have your team ask itself), “Am I afraid to lose something here?” you might find that you are, and that realization could help you or your company avoid falling into the trap of loss aversion.

3) Do you really expect to never lose anything — ever?

Loss is inevitable. Sometimes, you won’t make that par putt (or that birdie putt). Sometimes, when you negotiate a deal, you won’t get the best deal. Sometimes, the decision to sell that business unit might result in losses somewhere else. If you can come to grips with the fact that every decision you make won’t be perfect and that sometimes you will lose, you may begin to shift your expectations about loss.

Ways for Avoiding the Endowment Effect

1) Am I emotional?

A seemingly obvious way to avoid the endowment effect is to assess whether our emotions are involved. Don’t get me wrong, emotions are a good thing, but they’re a surefire route to overvaluing the things you own. That is, if you find yourself overly connected to something, your emotions might be getting in the way.

2) Independent Evaluation

This dovetails nicely with the idea of being unemotional. To guard against succumbing to the endowment effect, be sure to get an independent appraisal of whatever it is you’re looking to sell. While you’ll still have the final say on what you sell and how much you sell it for, having a second pair of eyes look at your side of the “deal” might help you determine whether your judgment’s clouded.

3) Empathy

I wasn’t going to include this initially, but after reading the research, it certainly fits. Before I go on, I should say that folks might be confused: I just suggested asking whether one is emotional, and now I’m saying to practice empathy? For those wondering, being emotional is not the same thing as being empathetic. Back to empathy and the endowment effect. In situations where we’re selling something, researchers found there to be an empathy deficit when the endowment effect was present. So, to counter this, you should try to empathize with the person you’re negotiating with.

Ways for Avoiding the Planning Fallacy

With the first three biases I talked about, awareness was a key step in overcoming the bias. While you could make that argument for the planning fallacy, one of the hallmarks of the fallacy is that people know they’ve erred in the past and still make the mistake of underestimating. So, we’ll need to move beyond awareness to help us defend against this bias.

1) Data is your friend

No, I don’t mean Data from Star Trek (though Data would probably be quite helpful in planning), but now that I think about it, Data (the character) might be a good way to position this ‘way for avoiding the planning fallacy.’ For those of you not familiar, Data is a human-like android. In thinking about this way for avoiding the planning fallacy, think about how Data might estimate the length of time it would take to complete a project. He would be very precise and data-driven. Data would likely look at past projects and how long those took to finish in deciding the length of time needed for the new project. To put it more broadly: if you have statistics on past projects (that were similar), absolutely use them in estimating the completion time of the new project.
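To make this “outside view” concrete, here’s a minimal sketch in Python (with made-up durations) of estimating a new project from the distribution of similar past projects rather than from the new project’s particulars:

```python
import statistics

# Hypothetical durations (in days) of past projects similar to the new one.
past_durations = [34, 41, 28, 55, 38, 47, 62, 33]

# The "outside view": ignore the nuances of the new project and estimate
# from the distribution of comparable past projects instead.
typical = statistics.median(past_durations)
pessimistic = sorted(past_durations)[int(0.8 * len(past_durations))]  # ~80th pct

print(f"Plan for about {typical} days; budget up to {pessimistic}.")
```

The median resists being skewed by one unusually fast project, and quoting a pessimistic figure alongside it bakes in the buffer that our inside-view estimates tend to leave out.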

2) Get a second opinion

When we think about the completion time of one project in relation to another, we often think about the nuances that make this project different from that one, and, by extension, why this project won’t take as long. Planning fallacy. If you can, ask someone who has experience completing projects in the area for which you’re estimating. When you ask this person, be sure not to tell them all the “various ways why this project is different,” because it probably isn’t, and it’s only going to cloud their predictive ability. You’re probably going to hear an estimate that’s larger than you thought, but I’d bet it’s a lot closer to the real completion time than the estimate you made by thinking about all the ways this project was going to be different from every other project like it.

Ways for Avoiding the Framing Effect

1) Reframe the question

It may seem obvious, but you’d be surprised how many people don’t consider “reframing” the frame with which they are looking at a situation. For instance, in the example from earlier, instead of looking at it as a choice between Program A and Program B, someone could reframe Program A so that it looks like Program C and do the same with Program B, so that it looks like Program D. As a result, one would then be getting a “fuller” picture of their choice.
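The earlier example isn’t reproduced in this post, but assuming it’s the classic “disease problem” from the framing research, a quick sketch shows why the reframe gives a fuller picture: the gain-framed and loss-framed programs describe identical outcomes.

```python
TOTAL = 600  # people at risk in the classic disease problem

# Gain frame: Program A saves 200 for sure; Program B saves all 600
# with probability 1/3 (and nobody with probability 2/3).
saved_A = 200
saved_B = (1 / 3) * TOTAL

# Loss frame: the same options restated. Program C: 400 die for sure;
# Program D: all 600 die with probability 2/3.
die_C = 400
die_D = (2 / 3) * TOTAL

# Each pair describes the same outcome; only the frame differs.
print(saved_A == TOTAL - die_C)                 # A and C are the same program
print(round(saved_B) == TOTAL - round(die_D))   # B and D are the same program
```

Seeing both frames side by side makes it obvious that a preference reversal between the two versions comes from the wording, not the options.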

2) Empathy — assume someone else’s perspective

Many choices implicate another in a situation. As a result, it might be worth it to put yourself in the shoes of that other person to see how they would view a given situation. This is similar to the reframe, but is more specific in that it might serve to help the person remove themselves a little bit from the decision. That is, when we’re faced with a choice, our personal biases can have a big impact on the decision we make. When we imagine how someone else might make this decision, we’re less likely to succumb to our personal biases.

3) Parse the question

Some questions present us with a dichotomous choice: are apples good or bad? Should we exercise in the morning or the evening? Are gap years helpful or harmful? When faced with a question like this, I would highly recommend parsing the question. That is, are we sure that apples can only be good or bad? Are we sure that exercising in the morning or the evening are our only options? Oftentimes, answers to questions aren’t simply this or that. In fact, more times than not, there is a great deal of grey area. Unfortunately, when the question is framed in such a way, it makes it very difficult to see the possibility of the grey area.

Ways for Avoiding the Confirmation Bias

As with other cognitive biases, being aware that there is such a thing as the confirmation bias is really important. It can be hard to change something if you don’t know that there’s something to be changed.

1) Seek out contradictory ideas and opinions

This is something that I’ve written about before. If at all possible, you’ve got to be sure that you’re getting information that is counter to your beliefs from somewhere. If not, there’s little chance for growth and expansion. This can be difficult for some, so I’ve outlined ways to do this on the post I referenced above.

2) Seek out people with contradictory ideas and opinions

I answered a question on Quora last November where I placed these two ways for avoiding the confirmation bias one and two. Some folks might find it a little more difficult to seek out people with opposing views and that’s why I suggest starting with seeking out contradictory views in print (or some other form of media) to begin. However, in my experience, speaking with someone who has opposing views to mine (assuming that they are also altruistic in their endeavor to seek out opposing views) can be quite enriching. A real-life person can usually put up a better defense when your “confirmation bias” is activated. Similarly, you can do the same for them.

3) What do you really know?

My last suggestion for avoiding the confirmation bias is to always be questioning what it is that you know. This can sound tedious, but if you get into the habit of questioning “how” you know something or “why” you know something, you’d be surprised how ‘thin’ the argument is for something that you know. For instance, let’s say that you have a racial stereotype that ethnicity “x” is bad at driving. When you’re on the highway, you notice that someone from ethnicity “x” cuts you off. Instead of going into a tizzy about ethnicity “x,” you might stop and remember that, in fact, of all the times that you’ve been cut off, ethnicity “x” is the ethnicity that cuts you off the least. This is a crude example, but I think you get the idea. Just to emphasize my point: I would argue that questioning your deeply held beliefs is a good way of countering the confirmation bias.

Ways for Avoiding the Gambler’s Fallacy

1) Independent Events vs. Dependent Events

The biggest way to avoid the gambler’s fallacy is to understand the difference between an independent event and a dependent event. In the classic example, the odds of a coin landing on heads or tails are (negligibly) 50/50. I say negligibly because there are those who contend that the “heads side” weighs more and thus gives it a slight advantage. An example of a dependent event would be picking cards from a deck. There are 52 cards in a deck, and if you pick one card without replacing it, the odds of drawing any particular remaining card rise (ever so slightly), from 1 in 52 to 1 in 51.
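A quick simulation (a sketch, not anything rigorous) makes the distinction concrete: for a fair coin, the flip after a streak of five tails still comes up heads about half the time, while drawing cards without replacement really does change the odds.

```python
import random

random.seed(42)

# Independent events: after a streak of 5 tails, the next flip is still ~50/50.
def heads_rate_after_streak(n_flips=200_000, streak_len=5):
    flips = [random.random() < 0.5 for _ in range(n_flips)]  # True = heads
    next_after_streak = [
        flips[i]
        for i in range(streak_len, n_flips)
        if not any(flips[i - streak_len:i])  # previous 5 flips were all tails
    ]
    return sum(next_after_streak) / len(next_after_streak)

# Dependent events: drawing without replacement changes the odds.
def odds_of_specific_card(deck_size=52, drawn=1):
    return 1 / (deck_size - drawn)  # chance of any one specific remaining card

print(round(heads_rate_after_streak(), 2))  # ~0.5: the coin is never "due"
print(1 / 52, odds_of_specific_card())      # 0.0192... rises to 0.0196...
```

The coin has no memory, so the streak tells you nothing about the next flip; the deck does have “memory,” because each draw removes a card from the pool.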

Ways for Avoiding the Fundamental Attribution Error

1a) Empathy

As with many of the other biases, empathy is one of the quickest ways to thwart its power over you. If I put myself in the shoes of another, I’m more likely to understand that there might be more going on in the situation than I can see from my perspective. For instance, if we look at the red light example from above, by empathizing with the driver who runs the red light, I have a much better chance of understanding that their running the red light is not a demonstration of disregard for the world around them; maybe there’s something urgent to be taken care of.

1b) “Why Would a Rational Person Behave This Way?”

The above question is essentially a way to create a sense of empathy, but in case empathy is an ambiguous term, I’ve marked this ‘way’ 1b. Asking yourself this question will make it easier to consider the other factors contributing to a situation.

Ways for Avoiding the Overconfidence Effect

1) Know what you know (and don’t know)

The fastest way to slip into the trap of the overconfidence effect is to start making “confident” predictions about things that you don’t know about. Guessing the number of paper clips in a bottle is something that most of us have little to no expertise in, so give a wide confidence interval. If you have no experience managing a project, it might be in your best interest not to make a prediction about how long it will take to complete (planning fallacy).

2) Is this person really an expert?

Sometimes, you’ll hear someone displaying a level of confidence in a given situation that makes you think they know what they’re talking about. As a result, it might bias you into believing what they are saying. It’s important to know if this person is an expert in this field, or if maybe they’re succumbing to the overconfidence effect.

Ways for Avoiding the Halo Effect

1) Different strengths for different tasks

One of the easiest ways to avoid falling into the trap of the halo effect is to notice that there are different skills/strengths required for different tasks. As such, just because someone is good at climbing mountains doesn’t mean that they would make a good politician. The strengths/skills required for those two tasks are different. Put another way, think about the strengths/skills required for a particular task before evaluating whether someone would be good at that task.

2) Notice other strengths (or weaknesses)

It’s been said that “nobody’s perfect.” When someone is good at one thing, there’s a good chance they won’t be good at something else. Noticing that this person isn’t good at something else may help to quell the urge to assume that they’re good at everything.

Ways for Avoiding the Primacy/Recency Effect(s)

How you avoid these two biases really depends on the context of the decision you’re making. For instance, if you want people to remember something, you probably don’t want to give them a long list (thereby invoking the possibility of one of these two biases). There are some general ways to mitigate these biases, though.

1) Keep a record (write down the data)

One of the simplest ways that either of these biases can have an impact on a decision is when there isn’t a record of the data. If you’re making a decision based only on what you remember, there will be an unwarranted weighting toward the beginning or the end. As a result, keeping a record of the choices can make it easier to evaluate all of them objectively.

2) Standardized data

As I mentioned earlier in this post, it’s important that the data by which you’re evaluating a choice be standardized. As we saw in number one, keeping data isn’t always enough; it’s important that the data be uniform across choices so an evaluation can be made. In this way, it’s easier to weigh earlier and later choices equally, whereas otherwise there might be a slight bias toward the beginning or the end. This tip works for situations like making a purchase (and gathering data), interviewing candidates, or anything that can be analogized to either of these.

Ways for Avoiding Functional Fixedness

1) Practice, practice, practice

Probably the easiest and most effective way of overcoming functional fixedness is to practice. What does that mean? Well, take a box of miscellaneous things and see if you can design something fun/creative. The emphasis should be on using those things in a way that they weren’t designed. For instance, if you’re using a toolbox, you might think about how you can use something like wrenches to act as “legs” of a table or as a conductive agent for an electrical circuit.

2) Observant learning — Find examples

Another good way of overcoming functional fixedness is to look at other examples of people who have overcome functional fixedness. When I was giving a presentation on functional fixedness to a group (of college students) about a year ago, I showed the video below. About halfway through the video, one of them remarked: “So, basically, it’s how to be a college student 101.”

Ways for Avoiding the Status Quo Bias

1) Independent Evaluation

It really can be as easy as this. Have someone (or do it yourself) do a cost-benefit analysis on the situation/decision. In this way, you’ll be able to see the pros/cons of your decision in a new light. Of course, you may still succumb to the status quo bias, but you might be less likely to do so.

2) Role Reversal

While the independent evaluation makes “good sense” in trying to avoid this bias, doing some sort of role reversal will probably be the most effective. That is, look at the decision/situation from the other perspective. If it’s a negotiation, imagine that you’re in your negotiating partner’s shoes and you’re actually doing the trade from that side. Evaluate the deal. This may help to shake loose the status quo bias.

Ways for Avoiding the Hindsight Bias

1) Write it down!

This might be a bit tedious, but it’s a surefire way to guard against the hindsight bias. I’ve read a few articles about folks who’ve documented every prediction that they’ve ever made. While this had more to do with their profession (forecasting, stocks, etc.) it might be something you want to consider.

2) “I knew it all along!”

Have you ever found yourself saying, “I knew it all along,” or “I was sure that was going to happen”? These are good indicators that you’re probably operating under the hindsight bias. When you catch yourself saying these phrases, stop and think about what has happened in the situation. Chances are that you’ve “short-circuited” and you’re not thinking about what actually caused that situation.

Cell Phones and Driving: Do You Value Your Life?

A couple of days ago, I happened to be in the car when NPR’s The Kojo Nnamdi Show was playing. It just so happened that it was “Tech Tuesday,” and they were talking about new findings on distracted driving. Some of the findings would probably shock most people. For instance, would you have guessed that there is no (statistically) significant difference between talking on a cell phone with bluetooth and without bluetooth? I wouldn’t have. And, in fact, part of me thinks that the study maybe wasn’t designed optimally for testing the hypothesis, but I didn’t read the journal article.

One of the more interesting parts of the conversation was when one of the callers brought up the point about having cell phones automatically “lock” themselves when the car is in motion. One of the guests pointed out that this is already out there. She mentioned that there were apps that would “lock” the phone if the car is in motion. Then, Kojo brought up the point about passengers in the car — would they still be able to use their phones in the car? At this point, the guest then explained that getting around the “locked” phone is not too difficult.

After listening to this exchange, I realized that car safety (à la cell phones) is a choice. That is, it’s a choice by the driver. It’s probably not possible to completely legislate away a person’s ability to use their cell phone while driving (meaning: it likely wouldn’t hold up in court), so it becomes a choice for the driver. Does the driver want to increase their chances of causing an accident? Because that’s what happens when a driver decides to use their cell phone while driving: they’re increasing their chances of causing (or being in) an accident. To take this down a psychological tangent, it’s possible that they don’t value their life (as much as the next person) and so they’re willing to take this kind of risk.

As I got out of the car and began walking to my destination, my thoughts floated back to the 2008 book Nudge (I think I’ve mentioned it on here before). I was trying to think of a way that we, as a society, could help nudge people to make better choices behind the wheel. Is there some way we could nudge drivers away from using their cell phones?

Hindsight is Always 20/20: List of Biases in Judgment and Decision-Making, Part 15

While it’s a little later than I would have liked, it’s still Monday (at least in EDT). Today’s cognitive bias: the hindsight bias. Like many of the previous biases, this one is exactly how it sounds. In fact, there’s even a handy idiom to help you remember the gist of it: “Hindsight’s 20/20.”

So, what is the hindsight bias? It’s the idea that when a course of events is looked at after it has happened, it seems quite predictable. ‘I knew that was gonna happen.’ This often happens in spite of someone not having thought those events were going to happen. That is to say, even if they thought there was little likelihood of an event happening, after the fact, they would think it was obviously going to happen. Let me explain it further with an easy example.

Remember back to when you were applying to college/university? Let’s say a letter comes in the mail telling you that you’ve been accepted. When you tell your parents about it, your mom gets really excited and says that she knew it all along. Meanwhile, she had previously expressed doubts that you were going to get accepted. That’s the hindsight bias. Like I did with the gambler’s fallacy, I’ll list some other common ways we can see the hindsight bias affecting us:

  • You tell your friend that you think it’s going to rain later that day — and it does! So, you say something to the effect of, “I was sure it was going to rain!”
  • You give your number out at the bar, but the person doesn’t call you for a few days. When the person eventually calls, you tell yourself that you were sure he was going to call.
  • You’re getting ready to go on a trip and you tell your friend that you’re sure you’re going to forget something. When you get to your destination, it turns out you did forget something, so you tell your friend that you knew it was going to happen.

These are some everyday examples, but hindsight bias has proven to be very important in the judicial system. For instance: “Hindsight bias results in being held to a higher standard in court. The defense is particularly susceptible to these effects since their actions are the ones being scrutinized by the jury. Due to the hindsight bias, defendants will be judged as being capable of preventing the bad outcome.”

Ways for Avoiding the Hindsight Bias

1) Write it down!

This might be a bit tedious, but it’s a surefire way to guard against the hindsight bias. I’ve read a few articles about folks who’ve documented every prediction that they’ve ever made. While this had more to do with their profession (forecasting, stocks, etc.) it might be something you want to consider.
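As a minimal sketch of what such a log might look like (the predictions and dates below are invented), you could record each forecast with a probability before the fact and then score it once the outcome is known, for instance with a Brier score:

```python
from datetime import date

# A tiny prediction log: write down each forecast *before* the event,
# along with a stated probability, then record the actual outcome later.
log = [
    # (prediction, date made, stated probability, actual outcome)
    ("It will rain on Saturday",       date(2013, 6, 1), 0.9, True),
    ("I'll get the acceptance letter", date(2013, 6, 3), 0.4, True),
    ("The project ships on time",      date(2013, 6, 5), 0.8, False),
]

# Brier score: mean squared gap between probability and outcome.
# 0 is perfect foresight; 0.25 is what always guessing 50/50 would score.
brier = sum((p - int(actual)) ** 2 for _, _, p, actual in log) / len(log)
print(round(brier, 3))
```

With the probabilities written down in advance, there’s no room to retroactively claim “I knew it all along”; the log says exactly how sure you were.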

2) “I knew it all along!”

Have you ever found yourself saying, “I knew it all along,” or “I was sure that was going to happen”? These are good indicators that you’re probably operating under the hindsight bias. When you catch yourself saying these phrases, stop and think about what has happened in the situation. Chances are that you’ve “short-circuited” and you’re not thinking about what actually caused that situation.

If you liked this post, you might like one of the other posts in this series:

Is Sunshine Really the Best Disinfectant: Edward Snowden, PRISM, and the NSA

In keeping with the theme from yesterday’s post about Edward Snowden and the leaks about PRISM and the NSA, I thought I’d share something I was reminded of while watching some of the coverage earlier this week. Before doing that, though, if you haven’t already, and regardless of your position on whether he should or shouldn’t have done this, I would urge you to read the article and watch the clip about him in The Guardian.

A couple of days ago I happened to catch a segment of Morning Joe where one of the journalists who broke the story about the NSA, Glenn Greenwald, was on. The clip is about 20 minutes and there’s an interesting exchange between one of the hosts and Greenwald. The part I’d like to highlight today happens towards the end of the segment. I think it was Willie Geist who asked the question and included the phrase, “Sunshine is the best disinfectant,” in reference to getting the information about these programs out in the open. This reminded me of a paper I wrote for a Public Administration class and I thought it might be useful if I detailed some of the research I used for that paper.

The idea that “sunshine is the best disinfectant” with regard to public administration stems from the idea of government reform. In a 2006 paper in Public Administration Review, Paul C. Light defined four tides of government reform:

All government reform is not created equal. Some reforms seek greater efficiency through the application of scientific principles to organization and management, whereas others seek increased economy through attacks on fraud, waste, and abuse. Some seek improved performance through a focus on outcomes and employee engagement, whereas others seek increased fairness through transparency in government and access to information. Although these four approaches are not inherently contradictory — and can even be found side by side in omnibus statutes such as the 1978 Civil Service Reform Act — they emerge from very different readings of government motivations.

These approaches also offer an ideology for every political taste: scientific management for those who prefer tight chains of command and strong presidential leadership; the war on waste for those who favor coordinated retrenchment and what one inspector general once described as “the visible odium of deterrence” (Light 1993); a watchful eye for those who believe that sunshine is the best disinfectant for misbehavior; and liberation management for those who hope to free agencies and their employees from the oppressive rules and oversight embedded in the three other philosophies. [Emphasis Added]

My point in sharing this article wasn’t to say that the idea that sunshine is the best disinfectant is good or bad, but merely to put it in context with some other ways of reforming government. You can decide for yourself which you prefer. In fact, there’s a handy table for differentiating the four:

The Four Tides of Reform

And one more interesting table that shows you how government reform in the US has changed since 1945:

Patterns in Reform Philosophy

What’s the Status Quo From the Other Side: List of Biases in Judgment and Decision-Making, Part 14

It’s Monday, so you know what that means — cognitive bias! When I write that, I sort of imagine a “live television audience shouting in chorus: cognitive bias!” Wouldn’t that be fun? Well, maybe it wouldn’t, but it’s kind of funny to think about. I’ve only got a couple of more biases that I’d like to mention, so let’s get right to today’s — the status quo bias.

The status quo bias, like many of the previous biases we’ve talked about, is exactly what it sounds like: a preference for how things currently are. You may even look at this bias as some people’s inability to accept change, or a fear of change, but that probably wouldn’t be completely accurate. Let’s go back to one of the journal articles we looked at with previous biases, Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias:

A large-scale experiment on status quo bias is now being conducted (inadvertently) by the states of New Jersey and Pennsylvania. Both states now offer a choice between two types of automobile insurance: a cheaper policy that restricts the right to sue, and a more expensive one that maintains the unrestricted right. Motorists in New Jersey are offered the cheaper policy as the default option, with an opportunity to acquire an unrestricted right to sue at a higher price. Since this option was made available in 1988, 83 percent of the drivers have elected the default option. In Pennsylvania’s 1990 law, however, the default option is the expensive policy, with an opportunity to opt for the cheaper kind. The potential effect of this legislative framing manipulation was studied by Hershey, Johnson, Meszaros, and Robinson (1990). They asked two groups to choose between alternative policies. One group was presented with the New Jersey plan while the other was presented with the Pennsylvania plan. Of those subjects offered the New Jersey plan, only 23 percent elected to buy the right to sue whereas 53 percent of the subjects offered the Pennsylvania plan retained that right. On the basis of this research, the authors predict that more Pennsylvanians will elect the right to sue than New Jerseyans. Time will tell.

Another example:

One final example of a presumed status quo bias comes courtesy of the Journal of Economic Perspectives staff. Among Carl Shapiro’s comments on this column was this gem: “You may be interested to know that when the AEA was considering letting members elect to drop one of the three Association journals and get a credit, prominent economists involved in that decision clearly took the view that fewer members would choose to drop a journal if the default was presented as all three journals (rather than the default being 2 journals with an extra charge for getting all three). We’re talking economists here.”

You can see how important this bias would be for your life in making decisions. Should I sell my house (when the market’s hot) or should I hold onto it? You might be more liable to hold onto your house, even though there are economic gains to be had by selling it and, in fact, there are economic losses by keeping it!

As we’ve mentioned with some of the other biases, this bias can operate in tandem with other biases. For instance, think about the scenario I just mentioned and how that might also be similar to the endowment effect or loss aversion.

Ways for Avoiding the Status Quo Bias

1) Independent Evaluation

It really can be as easy as this. Have someone (or do it yourself) do a cost-benefit analysis on the situation/decision. In this way, you’ll be able to see the pros/cons of your decision in a new light. Of course, you may still succumb to the status quo bias, but you might be less likely to do so.

2) Role Reversal

While the independent evaluation makes “good sense” in trying to avoid this bias, doing some sort of role reversal will probably be the most effective. That is, look at the decision/situation from the other perspective. If it’s a negotiation, imagine that you’re in your negotiating partner’s shoes and you’re actually doing the trade from that side. Evaluate the deal. This may help to shake loose the status quo bias.

If you liked this post, you might like one of the other posts in this series:

If All You Have is a Hammer…: List of Biases in Judgment and Decision-Making, Part 13

The popular ending to the title of this post is, “…everything looks like a nail.” I’m sure you’ve heard this phrase (or some variant thereof) before, right? I bet you didn’t know that it represents an important cognitive bias, though. In fact, I didn’t know that this phrase was popularized by one of the giants of psychology, Abraham Maslow. It comes from a book he published in 1966, The Psychology of Science: A Reconnaissance. The sentiment behind this phrase is a concept known as functional fixedness.

One of the easiest ways to explain this concept is with a different example — the candle problem. Dan Pink does an excellent job of explaining this in the opening of a TED Talk he gave a few years ago. I’ve set the video to start just before he begins talking about the candle problem. At about the 3-minute mark, the explanation of functional fixedness ends, but he goes on to talk about an experiment with functional fixedness. That is, he couches the importance of functional fixedness in management theory. I’d urge you to come back and watch the remaining 15+ minutes after you’ve finished reading this post:

So, as we can see from the video, it’s hard for people to imagine the box as something other than a receptacle for the tacks. Similarly, when we’re holding the “proverbial hammer,” everything appears as if it’s a nail. One of the most important consequences of functional fixedness is how it contributes to a dearth of creativity. If you’re a manager in a company, maybe you’re not thinking about how you can position your employees to maximize their impact on realizing profits. It’s also possible that you’re not seeing a creative way to reassemble your raw materials (or resources) to design a product that will create a new market!

Ways for Avoiding Functional Fixedness

1) Practice, practice, practice

Probably the easiest and most effective way of overcoming functional fixedness is to practice. What does that mean? Well, take a box of miscellaneous things and see if you can design something fun/creative. The emphasis should be on using those things in a way that they weren’t designed. For instance, if you’re using a toolbox, you might think about how you can use something like wrenches to act as “legs” of a table or as a conductive agent for an electrical circuit.

2) Observant learning — Find examples

Another good way of overcoming functional fixedness is to look at other examples of people who have overcome functional fixedness. When I was giving a presentation on functional fixedness to a group (of college students) about a year ago, I showed the video below. About halfway through the video, one of them remarked: “So, basically, it’s how to be a college student 101.”

If you liked this post, you might like one of the other posts in this series:

How Do I Break a Habit? First, Notice

Last year, Charles Duhigg published a great book called The Power of Habit: Why We Do What We Do in Life and Business. It has been really well received, garnering almost 900 five- and four-star ratings out of 1050+. I haven’t had the chance to read it yet, but I have seen many interviews with Duhigg explaining the principles from the book, and videos like the one I’ve embedded below with some animation.

A few days ago, I noticed the video embedded on Farnam Street and I thought it’d be a good idea to share it with all of you. It’s one of the best summaries I’ve seen Duhigg give on the principles from the book. In fact, it’s one of the best summaries I’ve seen on habits, in general. If you’re interested in habits, another good person to read (or listen to) is BJ Fogg. Without further ado, here’s the clip from Duhigg:

There’s so much good information in there, but the piece I want to draw your attention to is near the very end:

Studies have shown that if you can diagnose your habits, you can change them in whichever way you want.

That’s really important because this thinking wasn’t always the case. Sometimes, folks will tell you that you need to focus on the cue, while others will say you need to focus on the reward. As Duhigg suggests, you can focus on whichever aspect you want, so long as you’ve diagnosed the habit. Happy habit-breaking!

Neither the Beginning nor the End — Remember the Middle: List of Biases in Judgment and Decision-Making, Part 12

It’s Monday, so you know what that means — another cognitive bias! This week, I thought I’d combine two because they’re essentially two sides of the same coin. They are: the primacy effect and the recency effect. Believe it or not, these biases are just what they sound like. The primacy effect is the idea that we overweight information we received ‘near the beginning,’ and the recency effect is the idea that we overweight information we received ‘more recently.’

These biases are usually studied in the context of memory and recall. That is, the primacy effect being that people tend to have a better likelihood of remembering information from the beginning and the recency effect being that people tend to have a better likelihood of remembering information they received more recently.

We can certainly see how this would affect our ability to make bias-free decisions. Let’s say that you’re shopping for a new television. You put in a few days’ worth of research. The biases we’ve just mentioned tell us that we might be more likely to give undue weight to the information we found in the beginning (or the information we found most recently). While the purchase of a TV might not be an “important” decision, what if we were interviewing candidates for a job? We might be more likely to view the people at the beginning more favorably, or the people towards the end more favorably. This is part of the reason companies use things like standardized questions during the interview as a way to institute some continuity from one interviewee to the next.

Ways for Avoiding the Primacy/Recency Effect(s)

How you avoid these two biases really depends on the context of the decision you’re making. For instance, if you want people to remember something, you probably don’t want to give them a long list (thereby invoking the possibility of one of these two biases to happen). There are some general ways to mitigate these biases, though.

1) Keep a record (write down the data)

One of the simplest ways that either of these biases can have an impact on a decision is when there isn’t a record of data. If you’re just making a decision based on what you remember, there will be an unnecessary weighting for the beginning or the end. As a result, keeping a record of the choices can make it easier to evaluate all choices objectively.

2) Standardized data

As I mentioned earlier in this post, it’s important that the data by which you’re evaluating a choice be standardized. As we saw in number one, keeping data isn’t always enough. It’s important that the data be uniform across choices, so an evaluation can be made. In this way, it’s easier to weigh earlier and later choices equally, whereas if this weren’t instituted, there might be a slight bias towards the beginning or the end. This tip works for situations like making a purchase (and gathering data), interviewing candidates, or anything that can be analogized to either of these two.
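The two tips above can be combined in a simple sketch: record every candidate’s answers to the same standardized questions, then rank from the written record rather than from memory. The names, questions, and scores below are invented for the example:

```python
# Score interview candidates on the same standardized questions, so that
# earlier and later candidates are compared on a written record rather
# than on what the interviewer happens to remember.
# Candidate names, questions, and scores are invented for the example.

QUESTIONS = ["problem solving", "communication", "domain knowledge"]

def average_score(scores):
    """Average a candidate's per-question scores (each on a 1-5 scale)."""
    # Every candidate must have answered exactly the same questions,
    # which is what makes the comparison fair.
    assert set(scores) == set(QUESTIONS), "non-standardized answer sheet"
    return sum(scores.values()) / len(scores)

candidates = {
    "first interviewee": {"problem solving": 4, "communication": 3, "domain knowledge": 5},
    "last interviewee": {"problem solving": 3, "communication": 4, "domain knowledge": 4},
}

# Rank from the recorded scores, not from the order people were seen.
ranked = sorted(candidates, key=lambda name: average_score(candidates[name]), reverse=True)
print(ranked)
```

Because the ranking is computed from the recorded, uniform scores, being the first or the most recent interviewee confers no advantage by itself.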

If you liked this post, you might like one of the other posts in this series:

Morgan Freeman Explains Physics to The Daily Show’s Jon Stewart

A few nights ago, Morgan Freeman was on The Daily Show promoting his new movie that comes out next week, “Now You See Me.” There’s a science-fiction part to the movie, but that’s not how we ended up with Morgan Freeman explaining physics to Jon Stewart. Morgan Freeman is also a narrator for “Through the Wormhole” on the Discovery Channel. As a result, Freeman knows (at least a little!) about science. I remember getting pretty psyched when I was watching the show live a few nights ago and Freeman says, “It’s easy, I’ll explain it to you,” when referring to concepts from physics. Anyhow, I tried to embed the clip below, but it seems you can’t embed Comedy Central clips on WordPress, so I found a version of the interview on YouTube. It’ll be here until it, invariably, gets taken down from YouTube.

I wanted to talk briefly about one of the examples that Freeman used: the balloon. Several years ago, I saw some videos of a physicist — Nassim Haramein — and they were captivating, to say the least. The way that he presents the material makes it seem very logical, but my scientific literacy isn’t such that I’d be able to say he was right or he was wrong. I know that Haramein’s partnered with one of the big names in science, but like I said, I don’t have the scientific literacy to debunk anything.

From looking on the internet, it seems that a number of people don’t think that Haramein has stumbled onto anything and a number of people do think that he’s stumbled onto something. Like I said, I really don’t know, but I do remember Haramein using the same balloon example that Freeman used in talking to Jon Stewart. You’ve seen the Freeman clip above, so I’ll embed a clip of Haramein talking about the balloon. This video has more in it than just the balloon, so I’ve set the video to start at the time when Haramein begins to talk about the balloon (short intro into it). You can probably stop watching around 37:30 to get the gist of the point I’m making.

Pretty cool, eh?

When I lived on Kauai, I did have the chance to meet with Haramein a few times and he certainly seems like a nice guy. We never chewed on science concepts, but I probably wouldn’t have been able to keep up for too long. If you’re interested in Haramein’s work, I’d urge you to look into his new website: The Resonance Project. It looks like he’s got a new movie coming out called: The Connected Universe. He also has a couple of other movies: Crossing the Event Horizon and Black Whole. If you watched the clip above, you saw a short snippet of Crossing the Event Horizon. There are 4 DVDs in the set. If you don’t have that kind of time, you might want to start with Black Whole — it’s only about an hour and a half.

 

The “Health Halo Effect”: Organic Labels on Food

A couple of days ago I restarted that series on cognitive biases with a post about the Halo Effect. I recently came across a study that applied the Halo Effect, but specifically, to health. The study sought to see whether labeling food organic made folks think that the food was healthier. An excerpt:

115 people were recruited from a local shopping mall in Ithaca, New York to participate in this study. Participants were asked to evaluate 3 pairs of products — 2 yogurts, 2 cookies and 2 potato chip portions. One item from each food pair was labeled “organic,” while the other was labeled “regular.” The trick to this study was: all of the product pairs were organic and identical! Participants were asked to rate the taste and caloric content of each item, and how much they would be willing to pay for the items. A questionnaire also inquired about their environmental and shopping habits. Even though these foods were all the same, the “organic” label greatly influenced people’s perceptions.

It certainly seems like there’s evidence here for the “health halo effect.” Something that I wonder about, though — the placebo effect. I haven’t written about the placebo effect, but I imagine that most of you know what it means: it’s the idea that an inert substance can prove to have an effect on someone’s health. We can apply the placebo effect to situations outside of medicine.

In this instance, we might posit that the people who were eating the food labeled organic believed that it would taste better — and so it did. I don’t think that this hypothesis could be evaluated from the data from this study, but it would be an intriguing follow-up.