It’s a beautiful, warm October day here in the Bay Area. And yet, I have to admit that warm autumn days can sometimes give me a slight sense of foreboding, because they remind me of the seasons when we’ve had our big natural disasters. Of course, earthquakes are not weather related, but the mind plays tricks and makes up cause and effect as a protection mechanism. Earthquakes are especially vulnerable to this because, so far, they are unpredictable in any useful way, so a folk-predictability arises.

Cause-and-effect myth-making is a flavor of misleading memories, one of the three red flags of bad decision-making that Andrew Campbell, Jo Whitehead and Sydney Finkelstein wrote about in a Harvard Business Review piece last year (“Why Good Leaders Make Bad Decisions,” Feb. 2009, 60-66). The other two are inappropriate self-interest and distorting attachments. The presence of any one of these factors can throw off the judgment of the best and smartest decision-makers.

What happens is that our brains engage in “high levels of unconscious thinking.” The end result is that you think you know the right thing to do or the right decision to make, but it’s colored by the undue influence of a past experience, a strong attachment to someone involved in the situation, or your own interests in the outcome. According to the HBR article, you can’t help making a bad decision if any of these factors is involved, unless you take some counter-balancing action. They recommend these steps (I’m paraphrasing):

  • Get exposed to additional experience and thinking,
  • Build in more discussion and chances to hear dissenting views, and
  • Seek additional review.

Another look at the foibles of our psychological makeup is available in the book Sway by Ori Brafman and Rom Brafman. The book’s subtitle is “The Irresistible Pull of Irrational Behavior.” Sway is an exciting, if somewhat disturbing, tour of the “hidden currents and forces” underneath our ordinary exteriors. These include (and I’m paraphrasing here too):

  • loss aversion (the willingness to do almost anything to avoid losses),
  • value attribution (the tendency to go with first impressions), and
  • the diagnosis bias (ignoring all evidence that contradicts those first impressions).

Sadly, the Brafmans report that “the more there is on the line, the easier it is to get swept up into an irrational decision.” They too have suggestions for working against our instinctual responses, and there is overlap with the HBR steps, but these proposals seem to involve a little more personal work.

To combat our fierce desire to avoid loss, the key is to take a long-term view. It turns out that the loss we fear most is short term. The antidote to value attribution is a commitment to observation, to seeing things for what they really are. And overcoming the diagnosis bias involves truly keeping an open mind. It also helps to open up decision-making processes to others.

So, if you are engaged in any decision-making, it may be worth paying attention to these instinctual “hidden currents.” Particularly if the decisions you need to make are very important, there’s a good chance that you can improve your decision-making outcome by opening your process up to review, dissenting opinions, and consideration of the long-term perspective. That’s what the psychological studies are saying, in any case!