

Embrace the Fuzzy Crystal Ball


Complex models employed to forecast the future as accurately as possible are a poor way to plan for uncertainty.

Phil used to be a very senior financial executive. When asked for a number, he would typically answer with a rough ballpark, such as, “It’s about 5%”. He’d then be peppered with questions about how he had arrived at that figure. After a while, he got tired of this questioning and started bringing a stack of financials to every meeting. From then on, instead of giving an approximate but effectively accurate answer, he would turn to his printout, thumb through the pages, point to a specific line at random and say, “It is 4.96%.” The questions stopped. The oracle had spoken.

Phil’s experience is not unique. Humans tend to dislike uncertainty. For example, many people are happy to play roulette, despite its inherent risk and expectation of loss, yet few are willing to take a wager whose odds are not clearly defined, even when they can choose which side of the bet to take.

Models, particularly those with a veneer of complexity and sophistication, cater to this aversion. Various academic studies suggest that seemingly more precise numbers can act as more potent anchors, and that when given more information, people become more (over)confident in their judgment even though their actual performance does not improve. Data does not guarantee knowledge.

Looking at that 4.96% figure, we are all the more likely to anchor on it and to feel unwarranted confidence in its precision. In fact, many people don’t want to be bothered with details, particularly when data go against their prior beliefs. For example, studies are less likely to change people’s minds when they provide more information about the way they were conducted. Knowledge can be a curse.

Anchored in risk models

Naturally, this has implications for risk management. Risk modeling has made incredible strides, particularly in financial markets, and experts have far more sophisticated indicators of their risk positions than ever before. Along the way, however, businesspeople stopped being able to follow a conversation built on esoteric mathematical concepts. If you can’t convince, confuse.

But even simpler things are being missed in the conversation. For example, one of the most common measures of risk in the financial sector is Value at Risk (VAR). VAR estimates the loss a firm should not expect to exceed, at a given confidence level, over a certain period, given the positions it holds. The higher the VAR, the greater the risk. Recently, VAR measures have dropped. Unfortunately, the reason is that the indicator is typically based on the previous five years of experience: as observations from the financial crisis rolled out of the calculation window, the value of the indicator fell. Needless to say, the underlying risk has not been affected. The indicator is not the risk.
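To make the mechanics concrete, here is a minimal sketch of a historical-simulation VAR calculation in Python. The function name, the five-year window, the 99% confidence level and the simulated return series are illustrative assumptions, not any firm’s actual model.

import numpy as np

def historical_var(daily_returns, confidence=0.99, window_years=5, trading_days=252):
    # Historical-simulation VAR: the daily loss exceeded on only (1 - confidence)
    # of days in the lookback window. Hypothetical helper, not any bank's model.
    window = window_years * trading_days
    recent = np.asarray(daily_returns)[-window:]            # keep only the most recent five years
    return -np.percentile(recent, (1 - confidence) * 100)   # loss at the 1% tail of that window

# Illustrative only: the same five-year window with and without a crisis year in it.
rng = np.random.default_rng(0)
calm = rng.normal(0.0003, 0.01, 5 * 252)
with_crisis = np.concatenate([rng.normal(-0.002, 0.04, 252), calm[252:]])

print(f"VAR, calm window only:      {historical_var(calm):.2%}")
print(f"VAR, crisis year in window: {historical_var(with_crisis):.2%}")

With these toy numbers, the second figure comes out far larger than the first; let the crisis year drop out of the window and the indicator falls, even though the exposure it is meant to describe does not.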

Models aren’t reality

At the same time, the most notable problems have arisen from uncertainty, as unpredictable surprises swamp businesses. For example, the various banks that failed or suffered in 2008 all had wonderfully complex risk models, yet they had often failed to consider the possibility of a major increase in correlations among individual instruments, and even asset classes, in a cascade that eventually expanded to include regulatory and structural changes to markets. Reality is a stubborn thing.
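Why do rising correlations matter so much? A small sketch, assuming an equal-weight portfolio of 50 assets that each carry 20% volatility and share a single constant pairwise correlation, shows how quickly diversification evaporates when everything starts moving together.

import numpy as np

def portfolio_vol(n_assets=50, asset_vol=0.20, rho=0.1):
    # Volatility of an equal-weight portfolio with constant pairwise correlation rho:
    # portfolio variance = w' * Cov * w.
    w = np.full(n_assets, 1.0 / n_assets)
    corr = np.full((n_assets, n_assets), rho)
    np.fill_diagonal(corr, 1.0)
    cov = corr * asset_vol ** 2
    return float(np.sqrt(w @ cov @ w))

for rho in (0.1, 0.5, 0.9):
    print(f"pairwise correlation {rho:.1f} -> portfolio volatility {portfolio_vol(rho=rho):.1%}")

At low correlations the portfolio looks far safer than any single position; push the correlation towards one and that comfort largely disappears, which is precisely the scenario the 2008 models had not considered.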

The real surprise, perhaps, is that the models had not anticipated this, because it had happened before. Twenty years ago, a major hedge fund, Long-Term Capital Management (LTCM), was run by finance veterans, a small army of PhDs and no fewer than two Nobel Prize winners. In 1998, it nearly caused a global financial meltdown when the same kind of increase in asset correlations occurred. We learn from history that we do not learn from history.

Manage uncertainty, not just risk

Yet while these uncertainties are, by their very definition, not something that can be included in the typical risk model, that does not mean they should simply be ignored. Nor is blaming failure on a Black Swan particularly useful. Careful consideration of, and preparation for, the types of disruption that unknown events could cause is still prudent planning and can aid business operations in times of trouble. Plans may be useless, but planning is indispensable.

On the principle that it is never too late to start today, risk professionals can themselves take steps to better frame their environment, including the formatting, vocabulary and visual display of their information. Such steps might seem basic compared with the advanced mathematics and thought that go into today’s risk models, but they offer a very real prospect for risk professionals to bridge the gap between managing risk and managing uncertainty. Sometimes, less is more. While Phil can save a lot of time by presenting overly precise answers, he will gain far more insightful, thoughtful discussion and input if he starts the conversation by acknowledging uncertainty and embracing the fuzziness inherent in any crystal ball.

 

Gilles Hilary is an INSEAD Professor of Accounting and Control and The Mubadala Chaired Professor in Corporate Governance and Strategy. He is also a contributing faculty member to the INSEAD Corporate Governance Initiative.

Chris Lobello is a Financial Consultant.


