
The secret of success: Do your homework!

To err is human, but how often do we get it wrong? More often than we realise, according to new research, and the biggest mistake corporations make is not being prepared.

In September 1999, the Mars Climate Orbiter was lost due to a simple miscalculation. NASA reported that the loss of the US$125 million probe was the result of the ground crew’s failure to convert English units to metric units. While the Orbiter example may be extreme, human error - miscalculation, systematic bias or “fat-fingers” - costs businesses billions of dollars a year and is responsible for the creation and mis-marketing of some very forgettable products: think Ford Edsel or New Coke.

Organisations need accurate forecasts and estimates to decide where to invest, how much and how fast. And, while they are becoming more sophisticated in the way they use the increasing mass of statistical data which enters their orbit to improve productivity and create or sell new products, human judgment remains a vital part of this decision-making process.

Big blunders

It’s accepted that humans, even experts, get things wrong at times, but how often do they make really big blunders? More frequently than expected, according to Miguel Lobo, INSEAD Assistant Professor of Decision Sciences, and the biggest mistake businesses can make is not acknowledging the fact. “Companies grossly under-invest in the collection of information,” Lobo told Knowledge. “They rely too much on a single piece of information and they fail to make contingency plans for when things go horribly wrong.”

Take the slump in semiconductor demand during the early 1990s. According to Lobo, IT giant Intel massively under-invested in fabrication capacity. When the boom driven by the dot-com bubble came, it was caught out. The income lost by not being prepared far exceeded the US$1 billion outlay for a new plant. “It was an error in estimation,” Lobo says. “An error in not having thought about the consequences of the world not being as you think it will be.”

The Normal Curve

For decades scientists have used the bell-shaped normal curve to depict the distribution of uncertainties, and previous research assumed that the size of errors in human judgment followed this distribution. “I wondered how often they fell at the ends of this curve,” Lobo says. “I had a hunch it was going to be more often than expected. That motivated me to do the research.”

Using 17 databases and over 20,000 forecasts from two groups of respondents – a panel of MBA graduates and 50 economists – Lobo’s research, ‘Human Judgment is Heavily Tailed’, found that really big mistakes, at the tail ends of the normal curve, occurred much more frequently than expected. “What surprised me was how consistent it was across a lot of different tasks. The size of the typical error changed, but the frequency of large errors was fairly constant.”

The huge variety of the economists’ forecasts included big systematic errors, while the panel of MBA graduates, asked to estimate quantities such as the number of countries in the United Nations, the value of daily global oil production, and the market capitalisation of Google, often gave wildly inaccurate responses. “Surprisingly, no matter what we asked, whether it was something they knew a lot about or something outside their field, very large errors occurred at a predictable rate,” says Lobo. “We looked at, for example, an error so large it should happen only once in 1,000 times – an extremely rare event, when a forecast is completely wrong… it turns out those mistakes happen 10 times more often than the standard distribution would predict.”
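
To get a feel for what that kind of gap means, here is a minimal sketch in Python. It uses a rescaled Student-t distribution purely as a stand-in for heavy-tailed judgment errors – an illustrative choice, not the specific model in Lobo’s paper – and compares how often a “one-in-1,000” error actually turns up.

```python
from scipy.stats import norm, t

# How large is an error that the bell curve says should occur only
# once in 1,000 forecasts?
p_rare = 1e-3
z = norm.isf(p_rare)              # about 3.09 standard deviations

# Stand-in for heavy-tailed judgment errors: a Student-t with 3 degrees
# of freedom, rescaled to have the same standard deviation as the normal.
# (Illustrative assumption, not the model estimated in Lobo's research.)
df = 3
scale = (df / (df - 2)) ** 0.5
p_heavy = t.sf(z * scale, df)

print(f"Normal model:       1 in {1 / p_rare:,.0f}")
print(f"Heavy-tailed model: 1 in {1 / p_heavy:,.0f}")
# The same 'extreme' error becomes several times more frequent once the
# tails are heavy, in the spirit of the article's ten-fold finding.
```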

The cost of mistakes

Errors of judgment become costly mistakes when the faulty figures are accepted by senior managers who make decisions based on too little advice. Because of biases like assumed similarity (people tend to project their preferences onto others) or over-confidence bias (people often grossly underestimate the uncertainty they face, as well as their own ignorance), managers are not arming themselves with enough, or appropriate, information, says Lobo. “The majority of people under-appreciate the importance of collecting advice or opinions from a diverse range of people. Asking more people from different backgrounds gives better value.”
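
A toy simulation illustrates the point – all figures below are invented, and each adviser’s error is assumed to be independent, which is exactly what diversity of backgrounds helps achieve:

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 100.0          # the quantity being estimated (made-up figure)
noise_sd = 20.0             # spread of an individual adviser's error (assumed)

# Each adviser gives an independent noisy estimate; the decision-maker
# averages them. A toy model of pooling diverse, independent opinions.
n_trials = 10_000
for n_advisers in (1, 5, 25):
    estimates = true_value + rng.normal(0.0, noise_sd, size=(n_trials, n_advisers))
    typical_error = np.abs(estimates.mean(axis=1) - true_value).mean()
    print(f"{n_advisers:>2} advisers: typical error ~ {typical_error:.1f}")
# Pooling more independent views shrinks the typical error roughly with the
# square root of the number of advisers consulted.
```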

Even when plenty of information is gathered, decision-makers still tend to lock onto one piece of information or one person’s judgment. “We often latch on to a salient piece of information without looking at the whole picture.” Microsoft spent years trying to develop tablets and got it completely wrong when guessing who would use the technology and for what purpose, assuming it would be a niche market. Lobo also notes the big mistakes made by mobile phone manufacturer Nokia. “They thought ‘We don’t know what’s going to be popular next, so we’ll make phones of every type’ and they still missed the transition to large touch-screen PDAs.”

Being prepared

While the first lesson for avoiding costly mistakes is to broaden the sources of information, the second, and equally important, lesson is to stay aware that things go wrong more often than one expects and to ensure contingency plans are in place. Many organisations, particularly auditing companies and large mining and pharmaceutical firms, now better understand the extent of human error and make sure they are prepared when, say, an oil strike isn’t made or a drug test produces unexpected results.

“Over the last decade, companies have been paying increasing attention to real options in their decision-making - looking not just at the value of a capital investment under a reference forecast, but also taking into account its consequences under different and uncertain economic conditions and market outcomes,” says Lobo.
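
A rough sketch of that way of thinking, with all probabilities and payoffs invented purely for illustration, shows how a scenario-weighted view of an investment can differ from the single reference forecast:

```python
# Value an investment not only under the reference forecast but across a
# handful of scenarios. (All figures below are hypothetical.)
scenarios = {
    "boom":      {"prob": 0.25, "payoff": 300.0},   # demand far above forecast
    "reference": {"prob": 0.50, "payoff": 120.0},   # the single-point forecast
    "slump":     {"prob": 0.25, "payoff": -80.0},   # demand collapses
}
investment_cost = 100.0

reference_value = scenarios["reference"]["payoff"] - investment_cost
expected_value = sum(s["prob"] * s["payoff"] for s in scenarios.values()) - investment_cost

print(f"Value under the reference forecast: {reference_value:.0f}")
print(f"Probability-weighted value:         {expected_value:.0f}")
# Looking only at the reference case hides both the downside that contingency
# plans are meant to cover and the upside that flexibility can capture.
```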

It’s better to take action early – to gather knowledge and have back-up plans in place – so as to avoid compounding mistakes. “It’s about risk management,” he warns. “Looking at all possible outcomes and how to protect yourself. You have to think about the different ways people can be wrong.”


Miguel Sousa Lobo is Assistant Professor of Decision Sciences at INSEAD.
