It's an uncomfortable truth that more than half the decisions we make fail - sometimes with dire consequences in the realm of business.
Just a few examples: Studies show that up to 80% of mergers and acquisitions fail to deliver the desired revenues, leaving shareholders to shoulder the losses. And 8 out of 10 startups never make it past their second year.
Then there are poor decisions that reflect badly on the company, like Nestlé's aggressive marketing of infant formula in poor countries with limited access to clean water, or Ford's dogged unwillingness to recall the Pinto model, despite the vehicle's tendency to burst into flames.
We all have systemic biases, argue IESE professors Roberto García-Castro and Miguel Angel Ariño, but they don't have to define us. The title of the professors' new book, Wonderful Decisions, sets the tone.
The book proposes a simple three-step model to improve decision-making. The model classifies all possible consequences of a decision into three kinds of results - extrinsic, intrinsic and transcendent. Extrinsic results are the visible effects of a decision, intrinsic are what the decision-maker learns, and transcendent are what others who are affected by the decision learn.
The model also serves to reduce the influence of biases in three ways:
- Underscoring the need for a thorough evaluation of the potential consequences of a decision, before the results are fully known. Extrinsic, intrinsic and transcendent consequences should all be explored.
- Emphasizing the resolution of future problems, not just the problem at hand.
- Providing a complete characterization of human motivation - no consequence, result or motive behind a decision should be left unexplained.
Finally, once a decision has been made, managers should evaluate all three kinds of results following the model. That is to say, the extrinsic, intrinsic and transcendent impacts should be evaluated based on their effectiveness, efficiency and consistency, respectively.
Learning to recognize our blind spots
Every driver knows there are blind spots in their field of vision. It's no different in management.
Oversimplifying thorny challenges is typical of poor decision-making, the authors stress. It comes from a good impulse: to navigate complexity by reducing reality to simpler forms and mental shortcuts - heuristics - that allow executives to solve complex problems by drawing on experience, guile and intuition.
But these shortcuts are also prone to all kinds of biases, which can lead to systemic blind spots. These include: evaluating gains and losses asymmetrically, exhibiting preference reversals (i.e., preferring A to B and then, simply because of the order in which they appear, favoring B to A), making different choices depending on the wording of a problem, making time-inconsistent decisions, and wrongly applying one's own motives to others' decision-making.
Broadly speaking, these biases can be grouped into three categories: cognitive, intertemporal and interpersonal.
Cognitive bias stems from three common mental shortcuts that lead to flawed decision-making:
- Representativeness: When we judge something based on its similarity to our expectations, rather than on unbiased likelihood.
- Availability: If you've never been subject to a cyberattack, you probably underestimate the risk. This bias occurs when we use only the most recent or accessible information to inform a decision.
- Anchoring: When people make estimates starting from an initial value that is adjusted to yield the final answer. Anchoring distorts judgment, especially when making decisions in uncertain situations.
Intertemporal bias occurs when we project our current mindset and assumptions onto the past and future. For example, when we face complex decisions, the future often appears shrouded in a fog of uncertainty, and we inadvertently attach far too much weight to the present.
If intertemporal bias leads us to magnify the present, interpersonal bias overemphasizes our own highly subjective perspective when judging ambiguous situations and other people's actions and intentions. For example, when people encounter information that is favorable to their own interests, they tend to accept it uncritically. By contrast, negative or contrarian information usually produces a more critical assessment.
We can all recognize ourselves in these definitions of bias, which is why the model presented by García-Castro and Ariño is so promising. By improving the decision-making process along the dimensions the authors identify, we have a much better chance of avoiding mistakes and, indeed, of producing something wonderful.