Don’t Fall for the Gambler’s Fallacy: List of Biases in Judgment and Decision-Making, Part 7

A little later in the day than I would have liked, but today’s cognitive bias is the gambler’s fallacy. The bias gets its name from, as you’d expect, gambling. The easiest example to think of is flipping a coin. If you flip a coin 4 times and each of those 4 times the coin turns up heads, you’d expect the coin to turn up tails on the next flip (or at least think tails has a better chance of coming up), right? WRONG!

The odds are exactly the same on the 5th flip as on the 6th flip as on the 66th flip as on the 11,024th flip. Why? Because each flip of the coin is an independent event. (Note: we’re ignoring, for the time being, any effects that quantum reality might have on the relationship between past and future events.) So, every time you flip a coin, that’s an independent event, unaffected by earlier events.
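
If you want to see this independence for yourself, here’s a minimal Python sketch (my own illustration, not from any of this post’s sources) that simulates a million flips of a fair coin and estimates the chance of heads immediately after a run of four heads:

```python
import random

def prob_heads_after_streak(n_flips=1_000_000, streak=4, seed=42):
    """Estimate P(heads | the previous `streak` flips were all heads)."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads
    after_streak = [
        flips[i]
        for i in range(streak, n_flips)
        if all(flips[i - streak:i])  # the previous `streak` flips were all heads
    ]
    return sum(after_streak) / len(after_streak)

print(prob_heads_after_streak())  # hovers around 0.5 -- the streak doesn't matter
```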

Another important example is the reverse fallacy. That is, if we think that heads is “hot” because it has come up a number of times in a row, believing that heads is more likely on the next flip is also a fallacy. Again, each flip is an independent event, unaffected by previous events.

The fallacy is so named because of a famous instance of it at the Monte Carlo Casino where, on roulette, black came up 26 times in a row. A number of gamblers reasoned that red was due because such a long run of black was so unlikely. As the story goes, they lost millions.

Other examples of the gambler’s fallacy:

  • Childbirth: “we’ve had 3 boys, so we’re going to have a girl now…”
  • Lottery: “I’ve lost 3,000 times, so I’m due for a win…”
  • Sports: “Player X is playing really well, they’re bound to start playing badly…”
  • Stock market: “Stock X has had 7 straight down days, so it’s bound to go up on this next trading day…”

Ways for Avoiding the Gambler’s Fallacy

1) Independent Events vs. Dependent Events

The biggest way to avoid the gambler’s fallacy is to understand the difference between an independent event and a dependent event. In the classic example, the odds of a fair coin landing on heads or tails are essentially 50/50 (I say essentially because there are those who contend that the “heads side” weighs more, giving it a slight advantage). An example of a dependent event would be picking cards from a deck. There are 52 cards in a deck, and if you pick one card without replacing it, the odds of drawing any particular one of the remaining 51 cards increase (ever so slightly, from 1/52 to 1/51).
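
To make the contrast concrete, here’s a small Python sketch (again, my own illustration) of the two kinds of events:

```python
from fractions import Fraction

# Independent event: a fair coin has the same odds on every flip,
# no matter what came before.
print(Fraction(1, 2))   # P(heads) on any flip: 1/2

# Dependent event: drawing cards without replacement changes the odds.
# The chance of drawing one specific card (say, the ace of spades):
print(Fraction(1, 52))  # from a full deck: 1/52
print(Fraction(1, 51))  # after one other card is removed: 1/51, slightly higher
```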

If you liked this post, you might like one of the other posts in this series:

The Confirmation Bias — What Do You Really Know: List of Biases in Judgment and Decision-Making, Part 6

Well, here we are in the sixth week of this series on biases in judgment and decision-making. Every Monday, I look at my list of cognitive biases and see that we’ve still got quite a few weeks to go until I’ve exhausted the biases I want to talk about. This week was a toss-up: I was trying to decide between the fundamental attribution error and the confirmation bias. After flipping a bottle cap (seriously, there wasn’t a coin close by), I’ve decided to talk about the confirmation bias.

Like last week’s bias, the confirmation bias is easy to understand in its definition: it’s the tendency to seek out information that confirms one’s previously held beliefs. In a journal article that’s been cited over 1,000 times, Raymond Nickerson stated:

If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.

Why is the confirmation bias so loathed? Well, as Nickerson points out, it may be the root cause of many disputes, both on an individual and an international level. Let’s think about this for a second: in the world of objectivity “out there,” there are any number of possibilities. In the world of subjectivity “inside my head,” there are only the possibilities that I can imagine. Humans, on the whole, tend to fear change (a Google search on the subject returns over 600,000,000 results!). To allay those fears, I’m going to prefer information that conforms to my previously held beliefs. As a result, when I look “out there,” I’m going to unconsciously be looking for things that are already “inside my head.” Let’s take this discussion out of the abstract, because there are plenty of examples of it.

If you’re still not convinced and think you’re “beyond” the confirmation bias, I would urge you to try to solve the problem on this site. If you give the problem its due respect, I bet you’ll be surprised by the difference between your solution and the actual one.

Ways for Avoiding the Confirmation Bias

As with other cognitive biases, being aware that there is such a thing as the confirmation bias is really important. It can be hard to change something if you don’t know that there’s something to be changed.

1) Seek out contradictory ideas and opinions

This is something that I’ve written about before. If at all possible, you’ve got to be sure that you’re getting information that runs counter to your beliefs from somewhere. If not, there’s little chance for growth and expansion. This can be difficult for some, so I’ve outlined ways to do it in the post I referenced above.

2) Seek out people with contradictory ideas and opinions

I answered a question on Quora last November where I ranked these two ways of avoiding the confirmation bias first and second. Some folks might find it a little more difficult to seek out people with opposing views, which is why I suggest starting with contradictory views in print (or some other form of media). However, in my experience, speaking with someone who holds views opposed to mine (assuming they are also earnest in their endeavor to seek out opposing views) can be quite enriching. A real-life person can usually put up a better defense when your confirmation bias is activated. Similarly, you can do the same for them.

3) What do you really know?

My last suggestion for avoiding the confirmation bias is to always be questioning what it is that you know. This can sound tedious, but if you get into the habit of questioning “how” or “why” you know something, you’d be surprised how “thin” the argument for much of what you know turns out to be. For instance, let’s say that you hold a stereotype that drivers of ethnicity “x” are bad at driving. When you’re on the highway, you notice that someone of ethnicity “x” cuts you off. Instead of going into a tizzy about ethnicity “x,” you might stop and remember that, in fact, of all the times that you’ve been cut off, drivers of ethnicity “x” have cut you off the least. This is a crude example, but I think you get the idea. Just to emphasize my point: questioning your deeply held beliefs is a good way of countering the confirmation bias.

So, what do you really know?

If you liked this post, you might like one of the other posts in this series:

Perspective and the Framing Effect: List of Biases in Judgment and Decision-Making, Part 5

Since I was going to talk about the framing effect last week (and opted for the planning fallacy instead because of circumstances), I thought I’d get into the framing effect this week. The framing effect is a very easy bias to understand; its description isn’t as complicated as those of some of the other biases. In short, the framing effect is the tendency for people to react differently to the same choice depending on whether it is presented as a gain or as a loss.

The famous example of the framing effect comes from a 1981 paper by Kahneman (whom I’ve mentioned before) and Tversky:

Problem 1: Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimate of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. [72 percent]

If Program B is adopted, there is 1/3 probability that 600 people will be saved, and 2/3 probability that no people will be saved. [28 percent]

As you can see from the percentages in brackets, people opted for the sure thing. Now, let’s look at the second part of this study:

If Program C is adopted 400 people will die. [22 percent]

If Program D is adopted there is 1/3 probability that nobody will die, and 2/3 probability that 600 people will die. [78 percent]

Did you notice something? Program C is identical to Program A, and yet the percentage of people opting for Program C dropped tremendously! Similarly, notice that Program D’s percentage went way up, even though it’s the same thing as Program B. This is the framing effect in action. Is it frightening that we’re so susceptible to changing our minds based simply on how a choice is framed? If it’s not, it certainly should be.
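
One way to see through the frame is to compute the expected outcomes directly. Here’s a quick sketch of the arithmetic (my own, using only the numbers from the problem):

```python
# Expected number of lives saved under each program (600 people at risk).
program_a = 200                      # 200 saved for sure
program_b = (1/3) * 600 + (2/3) * 0  # 200 saved in expectation
program_c = 600 - 400                # "400 will die" = 200 saved for sure
program_d = (1/3) * 600 + (2/3) * 0  # "1/3 chance nobody dies" = 200 in expectation

print(program_a, program_b, program_c, program_d)  # 200 200.0 200 200.0
```

All four programs save 200 lives in expectation; only the wording differs.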

Ways for Avoiding the Framing Effect

1) Reframe the question

It may seem obvious, but you’d be surprised how many people don’t consider “reframing” the frame through which they’re looking at a situation. For instance, in the example from earlier, instead of looking at it as a choice between Program A and Program B, someone could reframe Program A so that it reads like Program C, and do the same with Program B so that it reads like Program D. As a result, one gets a “fuller” picture of the choice.

2) Empathy — assume someone else’s perspective

Many choices involve another person. As a result, it might be worth putting yourself in that person’s shoes to see how they would view the situation. This is similar to the reframe, but it’s more specific in that it can help you remove yourself a little bit from the decision. That is, when we’re faced with a choice, our personal biases can have a big impact on the decision we make. When we imagine how someone else might make the decision, we’re less likely to succumb to those biases.

3) Parse the question

Some questions present us with a dichotomous choice: are apples good or bad? Should we exercise in the morning or the evening? Are gap years helpful or harmful? When faced with a question like this, I would highly recommend parsing the question. That is, are we sure that apples can only be good or bad? Are we sure that morning and evening are our only options for exercising? Oftentimes, the answer to a question isn’t simply this or that. In fact, more often than not, there is a great deal of grey area. Unfortunately, when a question is framed as an either/or, it becomes very difficult to see that grey area.

If you liked this post, you might like one of the other posts in this series:

Get a Second Opinion Before You Succumb to the Planning Fallacy: List of Biases in Judgment and Decision-Making, Part 4

I know that I said I was going to talk about a new bias in judgment and decision-making every Monday, and I know that today is Tuesday. To be honest, I underestimated how long it would take me to prepare for my seminar in International Relations. An aside: if you want to challenge yourself, take a course in a subject you know very little about and be amazed at how much you feel like you’ve been dropped into the ocean and told to swim! It can be a little unnerving at first, but if you’re into exploring and open to new experiences, it can be quite satisfying. Anyway, yesterday I’d planned to talk about the framing effect, but since I so conveniently demonstrated the planning fallacy, I thought I’d talk about that instead.

This post being written and published a day late is a direct result of my falling into the trap of the planning fallacy. I planned for the preparation for my International Relations class to take a certain amount of time. When it took longer than I had anticipated, I had no time left to write about a bias in judgment and decision-making. The planning fallacy is our tendency to underestimate how long we’ll need to complete a task, even when we’ve underestimated similar tasks before.

This is something that even the best of us fall prey to. In fact, one of the biggest names in cognitive biases, Daniel Kahneman (a Nobel Prize in economics, but a PhD in psychology!), has said that even he still has a hard time with the planning fallacy. Of course, that doesn’t excuse the rest of us from trying to prevent its effects.

Before we get into ways for avoiding the planning fallacy, I want to share an excerpt from a study that’s oft-cited in discussions of the planning fallacy [emphasis added]:

Participants were provided with a series of specific confidence levels and were asked to indicate the completion time corresponding to each confidence level. In this manner, the participants indicated times by which they were 50% certain they would finish their projects (and 50% certain they would not), 75% certain they would finish, and 99% certain they would finish. When we examined the proportion of subjects who finished by each of these forecasted times, we found evidence of overconfidence. Consider the academic projects: only 12.8% of the subjects finished their academic projects by the time they reported as their 50% probability level, only 19.2% finished by the time of their 75% probability level, and only 44.7% finished by the time of their 99% probability level. The results for the 99% probability level are especially striking: even when they make a highly conservative forecast, a prediction that they feel virtually certain that they will fulfill, people’s confidence far exceeds their accomplishments.

There were a lot of numbers/percentages in that excerpt, so I’ve also included a visual representation of the data in the graph below. The graph comes from a book chapter by a couple of the same authors and depicts the data in the preceding excerpt.

[Graph: proportion of participants who finished their projects by their 50%, 75%, and 99% confidence forecasts]
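
To make the overconfidence gap easier to see, here’s a short Python snippet encoding the figures quoted in the excerpt (nothing here beyond the reported numbers):

```python
# Stated confidence level vs. fraction of participants who actually
# finished by the corresponding forecasted time (figures quoted above).
forecasts = {0.50: 0.128, 0.75: 0.192, 0.99: 0.447}

for confidence, finished in forecasts.items():
    gap = confidence - finished
    print(f"{confidence:.0%} confident -> {finished:.1%} finished "
          f"(overconfidence gap: {gap:.1%})")
```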

Ways for Avoiding the Planning Fallacy

With the first three biases I talked about, awareness was a key step in overcoming the bias. While you could make that argument for the planning fallacy, one of the hallmarks of [the fallacy] is that people know they’ve erred in the past and still make the mistake of underestimating. So, we’ll need to move beyond awareness to help us defend against this bias.

1) Data is your friend

No, I don’t mean Data from Star Trek, though now that I think about it, Data (the character) might be a good way to frame this strategy for avoiding the planning fallacy. For those of you not familiar, Data is a human-like android. Think about how Data would estimate the length of time needed to complete a project: he would be precise and data-driven, looking at past projects and how long they took to finish in order to decide how much time the new project needs. To put it more broadly: if you have statistics on similar past projects, absolutely use them in estimating the completion time of the new project.
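
As a minimal sketch of what “letting the data decide” might look like (the function and the numbers here are hypothetical, for illustration only):

```python
from statistics import mean

def estimate_days(past_durations, gut_estimate):
    """Estimate a project's duration from comparable past projects.

    past_durations: actual durations (in days) of similar past projects.
    gut_estimate: your intuitive estimate for the new project.
    """
    historical_average = mean(past_durations)
    # When intuition and history disagree, lean on history.
    return max(gut_estimate, historical_average)

# Four past projects that each "felt like" two weeks at the outset:
print(estimate_days([21, 25, 19, 30], gut_estimate=14))  # 23.75
```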

2) Get a second opinion

When we think about the completion time of one project in relation to another, we often think about the nuances that make this project different from that project, and, by extension, why this project won’t take as long. Planning fallacy. If you can, ask someone who has experience completing projects in the area you’re estimating. When you ask this person, be sure not to tell them all the “various ways this project is different,” because it probably isn’t, and doing so will only cloud their predictive ability. You’ll probably hear an estimate that’s larger than you expected, but I bet it’s a lot closer to the real completion time than the estimate you made by thinking about all the ways this project was going to be different from every other project like it.

If you liked this post, you might like the first three posts in this series:

The Endowment Effect – Yours Isn’t Always Better: List of Biases in Judgment and Decision-Making, Part 3

Two weeks ago, I wrote about the pitfalls of the sunk cost fallacy. Last week I alerted you to the bias of loss aversion. Since I mentioned the endowment effect last week, I thought it’d be good to cover it sooner rather than later, so this week, we’ll look at the endowment effect.

The endowment effect can be tricky in that if it’s not described in the right way, it’s likely to be misinterpreted. In short, it means that people want more money for something than they’d be willing to pay for it. Put differently: we overvalue what we own. You can see a simple example of this in the course of a negotiation. When negotiating with someone, we’ll probably overvalue what we bring to the table. Someone may offer you $50 for your 25-year-old keyboard (piano), but you think it’s worth at least $75. Barring any outside appraisal, the endowment effect is likely at play here.

Now here’s where it might get a little confusing, so bear with me: one of the possible explanations for the endowment effect is that humans are loss-averse. Remember loss aversion from last week? The idea that we’d rather avoid losses than reap rewards. Applying that to our example above: let’s say the piano is actually worth $35, you want $75, and you’re being offered $50. Loss aversion feeds the endowment effect, which causes you to overestimate the value of the piano. As a result, you’re forgoing a $15 gain ($50 offered minus the $35 the piano is actually worth).

Let’s look at another example, this time from sports. Oftentimes, general managers have their eye on certain players. They believe this player will fill the void their team has, and that if they could only sign that one player, all of their troubles would be solved. Throughout the courtship of said player, the general manager is already imagining the player as part of the team. In so doing, the general manager is likely to end up overpaying. Why? Because of the endowment effect. The general manager feels that the player they’re about to acquire is already theirs, so not acquiring the player would feel like losing him. And because they already imagine the player to be on their team, they’ll overvalue the player.

Though this example comes from sports, we can see the skeleton of it and apply it to just about any situation where someone “wants” something and has already imagined it as their own.

Before we get into some ways of avoiding the endowment effect, I want to make sure I convey that the endowment effect applies to more than just things. Consider, for example, your customers (if you own a business). It’s never easy to fire a customer, but, as we’ve learned, sometimes it must be done. As you might imagine, it can be quite hard to fire a customer because, among other reasons, we tend to overvalue that customer.

Ways for Avoiding the Endowment Effect

1) Am I emotional?

A seemingly obvious way to avoid the endowment effect is to assess whether our emotions are involved. Don’t get me wrong, emotions are a good thing, but they are a surefire way to end up overvaluing the things you own. That is, if you find yourself overly attached to something, your emotions might be getting in the way.

2) Independent Evaluation

This dovetails nicely with the idea of being unemotional. To guard against succumbing to the endowment effect, be sure to get an independent appraisal of whatever it is of yours that you’re looking to sell. While you’ll still have the final say on what you sell and how much you sell it for, having a second pair of eyes look at your side of the “deal” can help you determine whether your judgment’s clouded.

3) Empathy

I wasn’t going to include this initially, but after reading the research, it certainly fits. Before I go on, I should address a possible confusion: I just suggested asking whether one is emotional, and now I’m saying to practice empathy? For those wondering, being emotional is not the same thing as being empathetic. Back to empathy and the endowment effect: in situations where we’re selling something, researchers found an empathy deficit when the endowment effect was present. So, to counter it, try to empathize with the person you’re negotiating with.


Loss Aversion and the Big Picture: List of Biases in Judgment and Decision-Making, Part 2

I think I’m going to make a habit of posting something new to my series on biases in judgment and decision-making every Monday. Last Monday, we looked at sunk costs. Today, we’re going to look at loss aversion.

As much as I can, I’m trying to write about the different biases by themselves. Sunk costs are closely associated with loss aversion, so I could have included loss aversion in the first post. Similarly, the endowment effect is closely associated with loss aversion, so I could have written about it here. But learning about the biases one at a time makes it easier to focus on each bias for its week. So, without further ado: loss aversion.

Loss aversion is the idea that we’d rather avoid losses than reap rewards. Put more simply: we’d prefer not to lose something than to acquire something. Like we did with the sunk cost fallacy, let’s look at some examples of loss aversion to get a better understanding of this bias. The implication of loss aversion is that someone who loses $100 (or $1,000) will lose more satisfaction than someone who gains $100 (or $1,000) will gain satisfaction. If we imagine a satisfaction scale where both people in the above example start at 0, the person who loses money will end up farther from 0 (in absolute terms) than the person who gains money. This is a rather basic example, so let’s look at something a little juicier: golf.
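
Prospect theory captures this asymmetry with a value function that is steeper for losses than for gains. Below is a deliberately simplified, linear Python sketch; the loss-aversion coefficient of roughly 2.25 comes from Tversky and Kahneman’s later estimates and is my assumption here, not a figure from this post’s sources:

```python
def felt_value(outcome, loss_aversion=2.25):
    """Simplified prospect-theory value: losses loom larger than gains.

    The 2.25 coefficient is an illustrative assumption (roughly the
    figure Tversky & Kahneman estimated); the real value function is
    also curved, which this linear sketch ignores.
    """
    if outcome >= 0:
        return outcome
    return loss_aversion * outcome

print(felt_value(100))   # 100: a $100 gain feels like +100
print(felt_value(-100))  # -225.0: a $100 loss feels like -225
```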

In golf, the difference between winning and losing is sometimes one stroke (or one putt). Here’s a short excerpt from Kahneman’s book, Thinking, Fast and Slow:

Every stroke counts in golf, and in professional golf every stroke counts a lot. Failing to make par is a loss, but missing a birdie putt is a foregone gain, not a loss. Pope and Schweitzer reasoned from loss aversion that players would try a little harder when putting for par (to avoid a bogey) than when putting for a birdie. They analyzed more than 2.5 million putts in exquisite detail to test that prediction.

They were right. Whether the putt was easy or hard, at every distance from the hole, the players were more successful when putting for par than for a birdie. The difference in their rate of success when going for par (to avoid a bogey) or for a birdie was 3.6%. This difference is not trivial. Tiger Woods was one of the “participants” in their study. If in his best years Tiger Woods had managed to putt as well for birdies as he did for par, his average tournament score would have improved by one stroke and his earnings by almost $1 million per season. [Emphasis added]

That’s an incredible statistic. The only difference between putting for par and putting for birdie is that missing par registers as a loss, and yet professional golfers are 3.6% more successful when putting for par? Wow! As the excerpt said, that difference was worth almost $1 million per season for Tiger Woods in his best years.

Ways for Avoiding Loss Aversion

As with the sunk cost fallacy, one of the most important ways to avoid loss aversion is to recognize it. That is, to know that humans have a tendency for loss aversion is an important first step in not falling into the trap of loss aversion.

1) What’s the big picture?

In our golf example, that might mean knowing where you stand in relation to the other players you’re competing with in the tournament (rather than just where your ball is in relation to the hole and which stroke you’re about to hit). In business, one might examine a decision about one business unit in relation to the entire company (rather than looking myopically at that one unit).

2) Am I afraid of losing something?

This may seem like an obvious solution, but it’s pretty important. If, before making a decision, you can think to yourself (or have your team ask itself), “Am I afraid to lose something here?”, you might find that you are, and that realization could help you or your company avoid falling into the trap of loss aversion.

3) Do you really expect to never lose anything — ever?

Loss is inevitable. Sometimes, you won’t make that par putt (or that birdie putt). Sometimes, when you negotiate a deal, you won’t get the best deal. Sometimes, the decision to sell that business unit might result in losses somewhere else. If you can come to grips with the fact that every decision you make won’t be perfect and that sometimes you will lose, you may begin to shift your expectations about loss.

Ignore Sunk Costs: List of Biases in Judgment and Decision-Making, Part 1

It can be really fun to write a series of posts on a particular topic. By my count, I’ve done this at least seven times so far. Today, I’d like to start what I hope will be an oft-read series on biases in judgment and decision-making (to some, cognitive biases). Because of my background in psychology and my interest in decision-making, I thought it would be wise to share with you the things that I’ve learned either through the classes I’ve taken (the classes I’ve taught!) or the research I’ve read. With each bias, my goal is to explain the bias and offer some possible avenues for not falling into the trap of the bias. Today, we start with one of the big ones: the sunk cost fallacy.

Sunk costs are costs that have already been incurred and can’t be recovered. For instance, let’s say you buy an apple and bite into it. The money you used to buy that apple can’t be recovered: it’s a sunk cost. Now let’s say the apple doesn’t taste very good (maybe it’s inorganic). You might say, “well, I’ve already paid for the apple, so I might as well eat it.” NO! That’s the sunk cost fallacy! Just because you’ve already bought the apple and paid for it doesn’t mean you have to eat it. If it tastes bad, by golly, don’t eat it!

That’s a rather basic example of the sunk cost fallacy, so let’s look at one that might seem a bit more applicable. Sunk costs often come into the fray when they’re contrasted with future costs. Let’s say you’ve bought a subscription to a newspaper or a magazine. Because of your subscription, you get a discount when it’s time to renew. Now, let’s say that during that year of your subscription, you discovered another newspaper/magazine that you preferred (maybe The Economist?). When it comes time to renew, you weigh two options: subscribing to The Economist or continuing with your current subscription. You find out that the discounted renewal price for your current newspaper/magazine is the same as the price of The Economist. You say to yourself, “Well, I’ve already subscribed to this newspaper and spent so much money on it, so I might as well keep subscribing.” NO! That’s the sunk cost fallacy. The money you’ve spent on that subscription can’t be recovered! You can’t get it back. As a result, it shouldn’t affect the decision you make now about whether to choose it or The Economist.
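
The decision rule buried in this example is simple: compare only the future costs and benefits of each option, and leave the already-spent money out of it entirely. A toy Python sketch (the option names mirror the example; the numbers are made up):

```python
def best_renewal(options):
    """Pick the option with the highest future payoff.

    Note what's absent: money already spent (the sunk cost) never
    enters the calculation.
    """
    return max(options, key=lambda o: o["enjoyment"] - o["price"])

# Both renewals cost the same, as in the example above.
options = [
    {"name": "current magazine", "price": 50, "enjoyment": 60},
    {"name": "The Economist",    "price": 50, "enjoyment": 80},
]
print(best_renewal(options)["name"])  # The Economist
```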

There’s one more quick example that I want to highlight: war. From a paper by a professor at Princeton:

The United States has invested much in attempting to achieve its objectives. In addition to the many millions of dollars that have been spent, many thousands of lives have been lost, and an even greater number of lives have been irreparably damaged. If the United States withdraws from Vietnam without achieving its objectives, then all of these undeniably significant sacrifices would be wasted. [Emphasis added]

Pay particular attention to that last sentence. That is the sunk cost fallacy in action.

Ways for Avoiding the Sunk Cost Fallacy

So, now that we’ve looked at the sunk cost fallacy, how can we avoid it? Well, the first step in avoiding the sunk cost fallacy is recognizing it. Hopefully, the above examples have given you an idea of how this bias can arise. There are two other ways I want to highlight that can help you avoid this trap.

1) What am I assuming?

The crux of the sunk cost fallacy is an assumption: that because you’ve already spent money on X, you should keep spending money on X. If you look at what you’re assuming about a situation, you just might find that you’re about to step into the sunk cost trap.

2) Are there alternatives?

Related to assumptions is the question of alternatives. You’re not bound to a decision because you’ve made a similar decision in the past. Just because you bought the ticket to the movie, if another activity presents itself as more enticing, you’re allowed to choose that one instead. In fact, if the movie turns out to be bad once you sit down to watch it, you’re allowed to get up and walk out. Don’t fall into the sunk cost trap by thinking you have to stay because you paid. There are any number of things you could be doing instead: going for a walk, calling an old friend, etc.

Why We Lie, Cheat, and Steal: The Truth About Dishonesty

I’ve just finished the 5th week of my 4th year of graduate school. For folks who have been in graduate school this long, there’s usually a development of research interests. Because of the nature of my time in graduate school (1 year in a PhD program, 1 year completing my first Master’s, and now into year two of an MBA), I never really had to declare my research interests or choose a dissertation topic. For my first Master’s, though, I did have to write a final paper, and it was on a topic that would probably appear on a list of my “research interests” if I were asked: intuition and decision-making. Fittingly, I’m now working with a professor at George Mason University to test whether one can improve the conditions for one’s intuition (in the context of decision-making).

If I were to list another research interest, I’d have to say it would be ethics or morals. Coincidentally, during my time as an undergrad, I worked on a research project with a psychology professor where we were examining (among other things) people’s moral judgments. I’ve had an RSA Animate talk bookmarked for about two weeks, and I just finished watching it. I think you’ll enjoy it.

It was given by Dan Ariely on the content of his new book, The Honest Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves. Ariely is also the researcher I referenced a few months ago when I was talking about the research on Americans’ perceptions and misperceptions of wealth inequality. I’ve pulled a few important quotes from the video:

“The magnitude of dishonesty we see in society is by good people who think they’re doing good, but in fact cheating just a little bit, but because there’s so many of them — of us — it has a tremendous economic impact.”

“You can go and say to yourselves, ‘the chef really wants their food to be eaten, and it’s really owned by a conglomerate that is really not that good.’ Some things lend themselves to a much higher degree of rationalization.”

“At some point, many people switch and start cheating all the time. And we call this switching point the ‘what the hell’ effect. It turns out we don’t have to be 100% good to think of ourselves as good. But if at some point you don’t think of yourself as good, you might as well enjoy. And many people, by the way, report this same thing with diets.”

“Your motivation influences how you see reality.”

Tying Up Loose Ends — Again

Earlier this year, I did a post where I talked about a number of ideas in one go. That post served a couple of interconnected purposes: 1) it emptied my “posts to write” list, and 2) it allowed me to flood that list with some new ideas. (I said the purposes were interconnected.) My list has again started to grow, so I thought I would do another one of those posts to flush out the list. There are a couple of ideas that I won’t include in this post because I do want to write a “fuller” post on them, so look for some posts in the next few days about “balance,” “The Stockdale Paradox,” and the idea that “every game (in a season) counts equally.”

The Enneagram — Through my exposure to transpersonal psychology, I was introduced to the Enneagram. I don’t know this for a fact, but my suspicion is that the Enneagram is highly underutilized relative to its helpfulness in understanding one’s self and others.

Life’s all about making decisions — One of my interests is “decision-making.” Books, literature, research: I’m fascinated by how humans make decisions. On that note, one of the things I’ve learned is that life is — really — all about making decisions. More importantly though, it’s important to put yourself in situations that allow you to make good choices. Let me say that again: “It’s important to put yourself in situations that allow you to make good choices.”

Measuring outcomes in the non-profit sector — I’ve talked before about my time working with a non-profit, and I also took a class on the sector this summer. The thing that struck me most about the non-profit sector is the lack of ways to measure outcomes. That is not to say that there aren’t ways to measure outcomes in the non-profit sector, but compared to the for-profit sector, it seems that, for whatever reason, there aren’t as many established and agreed-upon ways to measure outcomes.

Reframing your life — Many people, myself included, sometimes get caught up in choosing things they want to do (career-wise). An important realization on that front: it’s not what you want to do for the “rest of your life,” but simply, what you want to do “for right now.” Meaning, it’s okay to change your mind later and move into a different position, field, or industry.

Psychological reasons why good people do bad things — I came across a piece a few days ago that recounts a number of reasons why good people do bad things. I think it’s really important to understand the underlying psychological concepts that contribute to these errors in “decision-making.”

Adding General Managers to the Organization Could Improve Ethical Decision-Making

I’ve mentioned where I’m working for the summer. As I don’t currently live in the city where I work, I have a bit of a commute. And as I don’t yet have an iPhone or an iPad (with which to read something), I’ve kept my subscription to The Economist. Reading it on my commute, I got to an article from the Schumpeter column.

At first, I was a bit skeptical, but as I read on, it made me think of the post I recently wrote about adding general managers to the organization. Here’s an excerpt from the Schumpeter column [emphasis added]:

But is it wise to be so obsessed with speed? High-speed trading can lead to market meltdowns, as almost happened on May 6th 2010, unless automatic breaks are installed. And is taking one’s time so bad? Regulators are always warning people not to buy things in the heat of the moment. Procrastinators have a built-in cooling-off period. Businesses are forever saying that they need more creativity. Dithering can help. Ernest Hemingway told a fan who asked him how to write a novel that the first thing to do was to clean the fridge. Steven Johnson, a writer on innovation, argues that some of the best new products are “slow hunches”. Nestlé’s idea of selling coffee in small pods went nowhere for three decades; now it is worth billions.

These thoughts have been inspired by two (slowly savoured) works of management theory: an obscure article in the Academy of Management Journal by Brian Gunia of Johns Hopkins University; and a popular new book, “Wait: The Art and Science of Delay”, by Frank Partnoy of University of San Diego. Mr Gunia and his three co-authors demonstrated, in a series of experiments, that slowing down makes us more ethical. When confronted with a clear choice between right and wrong, people are five times more likely to do the right thing if they have time to think about it than if they are forced to make a snap decision. Organisations with a “fast pulse” (such as banks) are more likely to suffer from ethical problems than those that move more slowly. (The current LIBOR scandal engulfing Barclays in Britain supports this idea.) The authors suggest that companies should make greater use of “cooling-off periods” or introduce several levels of approval for important decisions.

I find this rather on-point with what I was saying in that post. By having more layers of approval (by way of the general managers), there would, undoubtedly, be more time factored into the decision-making process. As a result, this *may* lead to fewer instances of the poor decision-making we’ve seen recently at companies like Barclays and JP Morgan.