A recent conversation about leadership that I had with Paul Carroll (Editor in Chief of InsuranceThoughtLeadership.com) identified two new guest bloggers to share content this month.

Both have lessons for leaders wanting to improve their decision-making, especially for that next big decision that really matters.

In this guest post, Chunka Mui shares how not to flub your next big decision as a leader.

As well as drawing on an understanding of behavioural biases (a popular topic on our blog too), all the traps Chunka describes are highly applicable to the decisions customer insight leaders need to make today.

Over to Chunka…

Business is a contact sport. Some companies win, while others lose. That won’t change. There is no way to guarantee success. Make the best decisions you can, then fight the battle in the marketplace.

Yet research into more than 2,500 large corporate failures that Paul Carroll and I did found that many big decisions are doomed as they come off the drawing board—before first contact with the competition. Why?

The short answer is that humans are far from rational in their planning and decision-making. Psychological and anthropological studies going back decades, including those of Solomon Asch, Stanley Milgram, Irving Janis, Donald Brown and, more recently, Dan Ariely, consistently demonstrate that even the smartest among us face huge impediments when making complicated decisions, such as those involved in setting strategy.

In other words, humans are hard-wired to come up with bad decisions. Formulating good ones is very difficult because of five natural tendencies:

1. Fallacious assumptions

If “point of view is worth 80 IQ points,” as Alan Kay says, people often start out in a deep hole.

One problem is the anchoring bias: we subconsciously tend to work from whatever spreadsheet, forecast or other formulation we're presented with. We tend to tinker rather than question whether the assumptions are right or the ideas are even worth considering. Even when we know a situation requires more sophisticated analysis, it's hard for us to dislodge the anchors.

Another strike against expansive thinking is what psychologists call the survivorship bias: We remember what happened; we don’t remember what didn’t happen. We are encouraged to take risks in business, because we read about those who made “bet the company” decisions and reaped fortunes—and don’t read about those who never quite made the big time because they made “bet the company” decisions and lost.

2. Premature closure

People home in on an answer prematurely, long before they have evaluated all the information.

We get a first impression of an idea in much the same way we get a first impression of a person. Even when people are trained to withhold judgment, they find themselves evaluating information as they go along, forming a tentative conclusion early in the process. Premature conclusions, like first impressions, are hard to reverse.

A study of analysts in the intelligence community, for instance, found that, despite their extensive training, analysts tended to come to a conclusion very quickly and then “fit the facts” to that conclusion. A study of clinical psychologists found that they formed diagnoses relatively rapidly and that additional information didn’t improve those diagnoses.

3. Confirmation bias

Once people start moving toward an answer, they look to confirm that their answer is right, rather than hold open the possibility that they’re wrong.

Although science is supposed to be the most rational of endeavors, it constantly demonstrates confirmation bias. Ian Mitroff’s The Subjective Side of Science shows at great length how scientists who had formulated theories about the origins of the Moon refused to capitulate when the moon rocks brought back by Apollo 11 disproved their theories; the scientists merely tinkered with their theories to try to skirt the new evidence.

Max Planck, the eminent physicist, said scientists never really give up their biases, even when those biases are discredited. The scientists just slowly die off, making room for younger scientists who didn't grow up with the errant biases. Planck could just as easily have been describing most business people.

4. Groupthink

People conform to the wishes of the group rather than ask tough questions, especially if there is a strong person in the leadership role.

Our psyches lead us to go along with our peers and to conform, in particular, to the wishes of authority figures. Numerous psychological experiments show that humans will go along with the group to surprising degrees.

From a business standpoint, ample research, supported by numerous examples, suggests that even senior executives, as bright and decisive as they typically are, may value their standing with their peers and bosses so highly that they'll bend to the group's wishes, especially when the subject is complicated and the answers aren't clear, as is always the case in strategy setting.

5. Failure to learn from past mistakes

People tend to explain away their mistakes rather than to acknowledge their errors, making it impossible to learn from them.

Experts are actually more likely to suffer from overconfidence than the rest of us. After all, they're experts. Studies have found that people across all cultures tend to think highly of themselves even when they shouldn't. They also blame problems on bad luck rather than take responsibility and learn from failures. Our rivals may succeed through good luck, but not us; we earned our way to the top.

While it's been widely found that some 70% of corporate takeovers hurt the stock-market value of the acquiring company, studies find that roughly three-quarters of executives report that the takeovers they were involved in were successes.

The really aware decision makers (the sort who read articles like this one) realize the limitations they face. So, they redouble their efforts, insisting on greater vigilance and deeper analysis.

The problem is that this isn't enough. As the long history of corporate failures shows, vigilant and analytical executives can still come up with demonstrably bad strategies.

The solution is not simply to be more careful. Accept that the tendency toward decision-making errors is deeply ingrained, and adopt devil's advocates and other explicit mechanisms to counter those tendencies.

Thanks for sharing that, Chunka & Paul. I hope it is helpful advice for all you customer insight leaders.

More leadership tips tomorrow.