insights@IMD No.14
Article

Why smart people make stupid decisions

December 2012

More than 75 executives attended an IMD Discovery Event that presented an overview of behavioral finance, behavioral pricing and related fields. Participants’ eyes were opened to the traps of linear decision making in dynamic and complex systems, and they were able to reflect on their own biases and irrational behaviors and apply the insights to their day-to-day work.

Why do smart people make stupid decisions? There are two ways of looking at it. The first is that smart people tend to hold higher offices in an organization and are therefore in a position to make important decisions, whose consequences are magnified within the organization. When a CEO makes a bad decision, everyone in the organization knows about it. The second way of looking at the question involves considering how organizations search for answers to business problems. If the question they are trying to answer is not correctly framed, the search will lead to the wrong answer, which could trigger a stupid decision in that particular context. The challenge for managers lies in understanding that most problems are part of a dynamic and complex overall system.

The logic of failure

To illustrate the complexity of strategic systems thinking, Professor Michel invited three participants to play his fishing simulation game. The rules were simple: Catch fish! Each participant could choose how many ships to put to sea in each period, or indeed how many to move back to harbor. All participants immediately started by launching boats to catch fish; after the first successful results, two managers sent out more boats to catch even more fish. However, this caused the overall fish population to decline, and because the replacement rate was not fast enough to replenish stocks, catches also fell, even though the number of boats was subsequently reduced.

Why was it so difficult? What led to failure? One factor is that managers, by their very nature, want to manage – they want to do something. But fishing requires a stable environment, an environment in equilibrium. Thus, to succeed at the simulation, the managers needed to be patient and observe how the fish population was growing and declining over time. However, getting to the point of maximizing returns would mean waiting 10 years. “But,” as one participant put it, “by that time the CEO would have been fired!”
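The dynamics that tripped up the players can be captured in a few lines of code. The sketch below is a minimal illustration in Python – the growth rate, catch rate and fleet sizes are assumptions chosen for illustration, not the parameters of the actual simulation used at the event.

```python
# Minimal sketch of the fishing-game dynamics (all parameters are
# illustrative assumptions, not those of the actual simulation).
# The fish stock regenerates logistically; each boat catches a fixed
# share of the remaining stock every period.

def simulate(boats_per_period, stock=1000.0, growth=0.3, capacity=1000.0,
             catch_per_boat=0.05):
    """Return the catch per period for a given fleet-size policy."""
    catches = []
    for boats in boats_per_period:
        catch = min(stock, boats * catch_per_boat * stock)  # harvest this period
        stock -= catch
        stock += growth * stock * (1 - stock / capacity)    # logistic regrowth
        catches.append(round(catch, 1))
    return catches

# An impatient fleet: keep adding boats after early success.
greedy = simulate([2, 4, 8, 12, 12, 12, 8, 4, 2, 2] + [2] * 10)
# A patient fleet: hold a modest, constant number of boats.
patient = simulate([3] * 20)

print("greedy :", greedy)   # catches rise, then collapse along with the stock
print("patient:", patient)  # catches settle at a steady, sustainable level
```

Even in this toy version, the impatient policy enjoys a few bumper periods before catches collapse along with the stock, while the patient policy produces smaller but lasting returns.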

In contrast to this simple experiment, managers work within complex systems, where they face three main challenges:

1. Complexity: There are many interdependent variables, and changing one can have an unexpected outcome because of its effect on the others. As complexity increases, so does the potential for problems. It is important for managers to look at patterns, not isolated facts.

2. Dynamics: Things keep changing on their own, at their own pace, which means managers do not have total control. Furthermore, whenever change is introduced into a system, it has direct – expected – effects, but there are also unexpected indirect effects, which can have a greater impact than the expected effects. Variables themselves evolve over time. In the fishing example, the fish population had its own dynamics. So although adding a boat to the system might not seem like much, it had a severe negative impact on the overall fish population.

3. Lack of transparency (causal ambiguity): Managers often cannot see a direct relationship between variables – and even when they can, sometimes they choose to ignore it.

Typical mistakes managers make include linear thinking, which fails to take the ramifications of effects into account. In the fishing game, for instance, the assumption was that more ships would lead to a bigger catch. People also have problems perceiving the real magnitude of non-linear growth. For example, 2⁶⁴ doesn’t seem as “big” as its written-out value – 18,446,744,073,709,551,616 – which, represented as grains of rice, would be roughly a thousand times the world’s annual rice production. Often managers are unable to understand and interpret the feedback they receive from the market, and they may ignore facts that do not fit their picture. When a company starts changing its strategy, the leaders are impatient to see immediate results, and if these do not materialize quickly enough, what could have been a good idea is killed. Other pitfalls are encapsulation, or “falling in love” with a particular aspect while ignoring much more important ones, and, conversely, thematic vagabonding, or wildly switching from one solution to another without giving any one of them a chance to work.
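The arithmetic behind that number is easy to verify. The short sketch below does so in Python, using a rough grain weight and a rough annual production figure that are assumptions for illustration only.

```python
# A minimal sketch of the rice arithmetic (grain weight and annual
# production figures are rough, illustrative assumptions).
grains = 2 ** 64                      # doubling 64 times
grams_per_grain = 0.025               # ~25 mg per grain of rice (assumption)
tonnes = grains * grams_per_grain / 1_000_000
annual_production_tonnes = 480e6      # rough world annual rice output (assumption)

print(f"{grains:,} grains")           # 18,446,744,073,709,551,616
print(f"{tonnes:.2e} tonnes, roughly {tonnes / annual_production_tonnes:,.0f} "
      f"times annual production")
```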

One way to deal with these issues is to use dynamic modeling or causal loop diagrams (see box below) – as a basis for discussion, not to prove you are right or wrong. They can be used to check different viewpoints and to reflect on the underlying assumptions. Small differences in a model can have dramatic consequences, and introducing one new variable can change the whole dynamic of a system. For maximum effectiveness, however, there should be enough variables (15 to 20 is good) to give the full picture.

Confused by statistics

A powerful tool that can help decision makers understand complex systems is statistical analysis. However, as Dr Grize pointed out, a healthy dose of caution is advisable when using statistical materials, since they can be biased and, if not used correctly, may lead managers to incorrect conclusions. Some simple lessons to remember are:

Lesson 1: Sample size matters – there is more variability in statistics based on small samples than on large ones. Thus, when comparing results from samples of different sizes, it is important to take this size difference into account.
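The effect is easy to see in a quick simulation – a minimal sketch, assuming an underlying rate of 10% and two arbitrary sample sizes chosen for illustration:

```python
import random

# Simulate survey results for the same underlying 10% rate (assumption)
# with a small and a large sample, repeated many times.
random.seed(1)
TRUE_RATE = 0.10

def observed_rates(sample_size, repeats=1000):
    rates = []
    for _ in range(repeats):
        hits = sum(random.random() < TRUE_RATE for _ in range(sample_size))
        rates.append(hits / sample_size)
    return rates

for n in (20, 2000):
    rates = observed_rates(n)
    print(f"n={n:5d}: observed rate ranges from "
          f"{min(rates):.2%} to {max(rates):.2%}")
# Small samples can show anything from 0% to well over 25%;
# large samples stay close to the true 10%.
```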

Lesson 2: Beware of the regression to the mean effect, which can deceive even experienced scientists. For example, when speed cameras were placed at notoriously dangerous traffic junctions in the UK, it was claimed that 100 lives a year were saved. However, about 50% of the decline in accidents would have occurred anyway because of regression to the mean. Because cameras were put in places where the number of accidents was disproportionately high, by chance alone there were likely to be fewer accidents once the cameras were installed.[1] It is important to use a reference group when evaluating change, rather than relying on repeated observations before and after the change.
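Regression to the mean can even be reproduced with random numbers alone. In the minimal sketch below (the number of sites, the accident rate and the "camera" selection are purely illustrative assumptions), the worst sites improve the following year even though nothing changes:

```python
import random

# Each site has the same true accident rate; yearly counts vary by chance.
random.seed(2)
SITES = 1000
TRUE_MEAN = 5  # average accidents per site per year (assumption)

def yearly_counts():
    # Counts drawn via a simple binomial approximation, for illustration only
    return [sum(random.random() < TRUE_MEAN / 100 for _ in range(100))
            for _ in range(SITES)]

year1 = yearly_counts()
year2 = yearly_counts()

# "Install cameras" at the 50 worst sites in year one -- but change nothing.
worst = sorted(range(SITES), key=lambda i: year1[i], reverse=True)[:50]
before = sum(year1[i] for i in worst) / 50
after = sum(year2[i] for i in worst) / 50
print(f"worst sites: {before:.1f} accidents before, {after:.1f} after")
# Counts at the selected sites drop toward the overall mean purely by chance.
```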

Lesson 3: Numbers alone are not enough! Subject matter knowledge is essential for a correct interpretation of statistics. Sometimes looking at the big picture vs. the small picture can give contradictory results. Leaders need to be able to interpret the numbers to define the “strategy.”
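One classic way the big and small pictures contradict each other is when results are aggregated across very different groups. The sketch below uses made-up numbers (an illustrative assumption, not data from the session): option A wins within every segment, yet option B wins on the combined total. Deciding which comparison matters requires knowing how the cases were assigned – that is, subject matter knowledge.

```python
# Minimal sketch (all numbers are illustrative assumptions): option A wins
# within every segment, yet option B looks better in the combined figures,
# because the two options were applied to very different mixes of cases.
results = {
    # segment: (successes_A, trials_A, successes_B, trials_B)
    "easy cases": (81, 87, 234, 270),
    "hard cases": (192, 263, 55, 80),
}

tot_a = tot_b = n_a = n_b = 0
for segment, (sa, na, sb, nb) in results.items():
    print(f"{segment}: A {sa / na:.0%} vs B {sb / nb:.0%}")  # A ahead in both
    tot_a += sa; n_a += na
    tot_b += sb; n_b += nb

print(f"combined  : A {tot_a / n_a:.0%} vs B {tot_b / n_b:.0%}")  # B ahead overall
```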

Lesson 4: Correlation does not prove causation – a confounding variable may be at play. Without a controlled experiment, it is impossible to prove cause and effect.
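A small simulation illustrates the point. In this sketch the variable names and effect sizes are purely illustrative assumptions; the two outcome variables never influence each other, yet they end up strongly correlated because both are driven by a shared hidden factor:

```python
import random
random.seed(3)

# A hidden confounder (purely illustrative) drives both outcome variables;
# neither outcome causes the other.
n = 500
confounder = [random.gauss(0, 1) for _ in range(n)]
outcome_x = [c + random.gauss(0, 0.5) for c in confounder]
outcome_y = [c + random.gauss(0, 0.5) for c in confounder]

def corr(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

print(f"correlation: {corr(outcome_x, outcome_y):.2f}")  # strong, around 0.8
# The correlation comes entirely from the shared confounder.
```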

Lesson 5: A statistically significant result does not mean that it is practically relevant and, conversely, if something is not statistically significant, this does not imply that it is not important. A result is said to be statistically significant if it is unlikely to have occurred by chance.
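The sketch below illustrates the difference, using a normal approximation and illustrative numbers: the same tiny difference in rates is nowhere near significant in a small sample, yet "highly significant" in a very large one – without becoming any more relevant to the business.

```python
import math

def p_value_two_proportions(p1, p2, n):
    """Two-sided p-value for the difference between two observed proportions,
    each from a sample of size n (normal approximation)."""
    pooled = (p1 + p2) / 2
    se = math.sqrt(2 * pooled * (1 - pooled) / n)
    z = abs(p1 - p2) / se
    return math.erfc(z / math.sqrt(2))  # two-sided tail probability

# A tiny 0.5-point difference in rates (illustrative numbers):
print(p_value_two_proportions(0.100, 0.105, n=500))      # ~0.8: not significant
print(p_value_two_proportions(0.100, 0.105, n=500_000))  # far below 0.001:
# "significant", yet hardly relevant in practice
```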

Behavioral finance and biases

Statistical failures can also have an impact in the financial markets, as Professor Bris explained. Millions of dollars are spent each year on research techniques in an effort to understand the message of the market, but behavioral finance and its underlying psychological biases come into play. Traditional finance theory argues that markets are efficient and the smart guys will outperform the stupid ones, naturally weeding out all the inefficient or non-rational investors. But hedge funds “play stupid” to inflate the bubble, and it is the individual investor, for the most part, who suffers. Finance theory breaks down and rational investors become rationally irrational. Some of the biases associated with investor psychology are:

Value of money: Experimental research has shown that people behave differently when they gamble with chips instead of with real money. And we can also see this in the financial markets – derivatives are the chips.

Overconfidence: Investors greatly overestimate the precision of their forecasts. This can be due to self-attribution bias – giving oneself credit for investment successes and blaming bad luck for failures; and to hindsight bias – the “I knew all along” syndrome.

Optimism: People have an unrealistically rosy view of their abilities and prospects – 90% think they are above average: better drivers, better looking, and the like.

Representativeness: People often believe that they see patterns in data that are completely random.

Conservatism: Once they have formed an opinion, people are slow to change it, even in the face of new evidence. Indeed, they can even misread evidence that goes against them as being in their favor.

Preferring risk to certain losses: Most people prefer a 50% chance of a $1,000 loss to a certain loss of $500, which goes against generally assumed risk aversion.
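The arithmetic behind that last preference is easy to check. The sketch below uses a stylized prospect-theory value function with Kahneman and Tversky's commonly cited parameter estimates – an illustrative assumption, not something presented at the event: the expected losses are identical, yet the sure loss "feels" worse than the gamble.

```python
# Minimal sketch: expected values are identical, but a prospect-theory-style
# value function (parameters from Kahneman & Tversky, used illustratively)
# makes the sure loss feel worse than the gamble.
LAMBDA, BETA = 2.25, 0.88  # loss-aversion and curvature parameters (assumption)

def value(x):
    """Subjective value of an outcome x under a stylized prospect-theory curve."""
    return -LAMBDA * (-x) ** BETA if x < 0 else x ** BETA

expected_sure   = -500                     # certain loss of $500
expected_gamble = 0.5 * -1000 + 0.5 * 0    # 50% chance of losing $1,000

felt_sure   = value(-500)
felt_gamble = 0.5 * value(-1000) + 0.5 * value(0)

print(expected_sure, expected_gamble)        # -500 -500.0: same expected loss
print(round(felt_sure), round(felt_gamble))  # ~-534 vs ~-491: the gamble hurts less
```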

In sum, humans are not very good at rational choice. When it comes to investing, we often make emotional mistakes, and because the financial markets are so big, these mistakes can have a massive impact. We need to know ourselves as managers and investors and be aware of our psychological biases.

Avoiding stupid strategic decisions

Many companies and managers make strategic mistakes because they focus solely on the customer or the product but do not clearly see the job to be done. Companies like Polaroid and Kodak failed to see the writing on the wall when disruptive innovations were introduced to their industry and did not consider the job to be done and its functional, emotional and social dimensions. Professor Yu used the example of a US milkshake company to show how thinking about strategic decisions should revolve around three questions: 1) What is the job to be done? 2) What experiences in purchase and use must we provide to do the job perfectly? 3) What, and how, should we integrate to provide a difficult-to-copy holistic experience? The company hired an ethnographer to help answer these questions. People bought the milkshake to consume as breakfast on their way to work, so the firm added fruit to make the shake thicker and improve its nutritive value. It also installed a prepaid dispenser to reduce customers’ wait time – they could simply swipe and go. By understanding the underlying causal mechanism that leads a consumer to make a purchase decision, a company can reintegrate its activities to deliver a seamless experience for its customers.

Be aware of behavioral pricing

Using the results of several online surveys he had conducted, Professor Michel introduced the idea of behavioral pricing, which reflects how people relate to pricing information based on personal biases. For example, consider the following two scenarios:

1. Assume that you bought 1,000 shares in UBS, the largest Swiss bank, almost at the peak price of CHF 70 per share in 2007. Today, the share is trading at CHF 12, up from the all-time low of CHF 9.34. Would you sell the shares in the next seven days or not?

2. Assume that you bought 1,000 shares in UBS, the largest Swiss bank, for CHF 9.40 in 2011, just above the all-time low. Today, the share is trading at CHF 12, up from the all-time low of CHF 9.34. Would you sell the shares in the next seven days or not?

If you answered the two questions differently, you have just demonstrated behavioral pricing at work. In economic theory the two scenarios are equivalent; what changes is your perception. From an economic point of view, the price you paid should not matter – the only rational question is whether to sell or hold at today's price.
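The underlying economics can be made explicit with a deliberately simplistic sketch. The "expected future price" below is a placeholder assumption, not a forecast: whatever you paid for the shares, the sell-or-hold comparison is the same.

```python
# Minimal sketch: the rational sell/hold question ignores the purchase price.
current_price = 12.0
expected_future_price = 13.0   # whatever the investor believes (assumption)

def rational_decision(purchase_price):
    # The purchase price is a sunk cost: it never enters the comparison.
    return "hold" if expected_future_price > current_price else "sell"

print(rational_decision(purchase_price=70.00))  # scenario 1 -> hold
print(rational_decision(purchase_price=9.40))   # scenario 2 -> hold (same answer)
```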

Principle 1 – Sunk costs: In economics and business decision making, sunk costs are costs that cannot be recovered once they have been incurred. In traditional microeconomic theory, sunk costs are irrelevant to a decision. Behavioral economics suggests the opposite: that sunk costs greatly affect actors’ decisions, because humans are inherently loss averse and thus normally act irrationally when making economic decisions.

Principle 2 – Anchoring: This is a psychological rule of thumb by which people start with an implicitly suggested reference point (the “anchor”) and make adjustments to it to reach an acceptable number. For example, asked to choose between two cameras, 70% of customers chose a $540 model with a 9/10 consumer-report rating over a $270 model rated 4/10. When an $840 camera rated 6/10 was added to the mix, 85% of customers bought the $540 one.

Principle 3 – Mental accounting: The process whereby people code, categorize and evaluate economic outcomes is referred to as mental accounting. According to theorists, people group their assets into a number of mental accounts. Thus, people who discovered they had lost a €100 ticket for a show just as they got to the theater were less inclined to buy another ticket than people who lost the €100 cash on the way to buy the ticket. For the first group, the budget for the “entertainment” mental account had been spent, but according to economic theory there is no difference between the scenarios.

Principle 4 – Endowment effect: According to the endowment effect hypothesis, people place a higher value on objects they own than on objects that they do not. You don’t need the professional version of a piece of software until you own it.

Principle 5 – Loss aversion: People strongly prefer avoiding losses to acquiring gains. Thus, when making an offer to a client for, say, a consulting project, start the proposal by showing them everything that they can have, before presenting the more modest options. Similarly, a car dealer selling new cars should start with a model that includes all the options: as the dealer moves down to the cheaper models with fewer options, the customer feels the pain with every option removed. In other words, the sunroof that is taken out has more perceived value than the same sunroof that is added.

Mind the trap

At the end of an enlightening day, participants came up with a number of takeaways to help reduce the risk of falling into the traps revealed in the presentations and making stupid decisions:

• Aim to understand the underlying dynamics of a system.

• Ask questions to understand the context of the numbers.

• Employ multidisciplinary teams to explore all viewpoints and challenge the model.

• Above all, be aware of your own behavioral biases!

References

1. Connor, Steve. “Speed cameras ‘do not save as many lives as claimed.’” Independent, 10 September 2008.

 

Discovery Events are exclusively available to members of IMD’s Corporate Learning Network. To find out more, go to wwwtest.imd.org/cln

Contact

Research Information & Knowledge Hub for additional information on IMD publications
