There is no doubt we are terribly bad at forecasting. Even the smartest among us are. Even the best and the brightest, whom we have tasked with saving the world from financial annihilation, are. Take Ben Bernanke, Chairman of the Federal Reserve. In 2004, he declared, in a speech ominously titled “The Great Moderation”: “One of the most striking features of the economic landscape over the past twenty years or so has been a substantial decline in macroeconomic volatility. This […] makes me optimistic for the future.” You might want to read the full transcript of the “Great Moderation” talk here, because it makes for fascinating reading on how wrong experts can be at forecasting. And it’s not just Ben. In fact, political, economic and business history is littered with forecasts and predictions that turned out to be ridiculously wrong: the commercial potential of the Xerox machine and of Nespresso, the possibility of heavier-than-air flight, the market for mobile phones, prosperity just around the corner, Japan as Number One. Our hopelessness at forecasting is a confirmed fact.
Getting forecasts wrong is expensive (as we taxpayers all know by now). While forecasts are often approximately right, sometimes they go spectacularly wrong. Often, these failures involve rare events with a very low probability of occurring. Because the probability is low, we think we can get away with it. However, as Nassim Taleb remarked in coining the expression “Black Swan”, low probability sometimes goes with high impact. In fact, the case can be made that all our efforts to reduce the probability of these events (i.e. to reduce volatility, as Ben would say) only increase the cost we pay when they do occur. The less likely, the more expensive, in a way.
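The arithmetic behind this asymmetry can be made concrete with a back-of-the-envelope sketch. All the numbers below are invented purely for illustration; the point is only that expected loss is probability times impact, so a very rare event can still dominate:

```python
# Toy illustration (made-up numbers): why a rare, high-impact event can
# cost more, on average, than a frequent but mild one.

def expected_cost(probability: float, impact: float) -> float:
    """Expected loss per period: chance of the event times its cost."""
    return probability * impact

# A "mild" shock: happens often, costs little.
mild = expected_cost(probability=0.20, impact=1_000)        # about 200 per period

# A "black swan": 100x rarer, but 10,000x more damaging.
swan = expected_cost(probability=0.002, impact=10_000_000)  # about 20,000 per period

print(mild, swan)  # the rare event dominates despite its low probability
```

Of course, the real problem Taleb points to is worse than this sketch suggests: for true black swans we do not even know the probability or the impact in advance, so the expected value cannot honestly be computed at all.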
Of course there is a behavioral element to our use of forecasts, and probably an agency element as well. Put succinctly, managers who use forecasts bet (so to speak) that the probability of a rare event occurring during their tenure is, by definition, low, so why bother? Après moi le déluge (“after me, the flood”). At the macroeconomic level, they have been more or less right for thirty years. But again, when they were proved wrong, the price was huge.
But why do we need forecasting in the first place? Mostly it has to do with our deep epistemological assumptions about how we do business, and how we think in general. In the words of Christine Moorman and Anne Miner, we usually assume that “composition (or design) of an activity occurs first and is then followed by implementation or execution.” Plan first, then execute. In this we carry the heritage of Plato, who promoted a separation between the world of ideas and the world of nature, and that of Descartes, who separated the spirit and the body. Such a separation is the foundation of our strategic thinking. And that is where we return to forecasting: we need to forecast because once we have composed the activity (i.e. drawn up the action plan), we will implement it, and the action requires the environment not to change, so to speak; otherwise, the assumptions we made during the composition phase will turn out wrong and the plan will not work. Hence, in this mode of thinking, our ability to successfully execute the plan depends entirely on our ability to forecast the conditions under which the plan will be executed. We need forecasting because we need control, and we need control because of how we conceive our action, in business or in other matters.
Control also helps us optimize, the holy grail of modern business. Again, Taleb points to the dangers of optimization: optimization means the elimination of redundancy, which in turn increases fragility. We can only optimize at the expense of robustness, just as a Formula One car is optimized for pure speed and rather useless for the weekly shopping run. Needless to say, we get no control at all, as recent events have proved, even to people as smart as Ben. Such control is a matter of belief, not reality.
What to do then? There are two approaches that can be proposed, and they come from very different horizons. The first is the consequential approach, proposed by Taleb. Essentially, it goes like sex education for teenagers: we’d rather have you not doing it, but because we know you will do it anyway, we’ll tell you a few things to limit the possible damage. Taleb’s approach is to acknowledge that for many reasons we will keep making ridiculous forecasts, and that avoiding them is an uphill battle we are sure to lose. What we can do, then, is find a way to make our actions more robust in case our forecasts turn out to be wrong, even, or especially, when the error is large. In other words, we should strive to make our forecasting errors less consequential. How do we do that? By building redundancies, broadly speaking. Just as nature has given us two lungs and two eyes, we would not concentrate all our production in one single factory in China, for instance. Of course, should everything go well, concentrating it would be rewarded with a truly optimized supply chain; this is what Apple does today. Should something go wrong, however, the extent of that very optimization will define the cost of the black swan.
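The redundancy trade-off can be sketched numerically. The simulation below is a toy model with invented numbers, not a claim about any real supply chain: running two sites costs more every year, but the penalty for losing all supply at once becomes far less likely, so the redundant setup can come out ahead on average:

```python
import random

# Toy comparison (illustrative numbers only): one optimized factory
# versus two redundant ones, when each site can fail independently.

def yearly_cost(n_sites: int, failure_prob: float, run_cost: float,
                disruption_cost: float, trials: int = 100_000,
                seed: int = 42) -> float:
    """Average yearly cost over many simulated years: fixed running
    costs, plus a large penalty only if ALL sites fail at once."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        all_down = all(rng.random() < failure_prob for _ in range(n_sites))
        total += n_sites * run_cost + (disruption_cost if all_down else 0.0)
    return total / trials

optimized = yearly_cost(n_sites=1, failure_prob=0.05, run_cost=100,
                        disruption_cost=50_000)
redundant = yearly_cost(n_sites=2, failure_prob=0.05, run_cost=100,
                        disruption_cost=50_000)
print(optimized, redundant)  # redundancy costs more to run, less overall
```

The design choice is the essay's point in miniature: the second site is pure waste in every year where nothing goes wrong, which is exactly why optimizers eliminate it, and exactly why its absence defines the cost of the black swan.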
A second, very different approach is the transformative approach. Here we learn from entrepreneurs, the economic actors who specialize in dealing with uncertainty. Recent research (known under the name of Effectuation) has shown that entrepreneurs do not try to predict the future. They do not work with forecasts. Rather than predicting the future in order to control it, they control the future to avoid having to predict it. This is a more optimistic approach than Taleb’s. Entrepreneurs enact their environment by working with stakeholders through a process of co-creation, highlighting the very social nature of the effort. They do so using a set of heuristics, meaning that while they don’t have a deterministic “plan then act” approach, they do not proceed randomly either. In the field of international relations, such an approach was recently described by Stephen Krasner as based on orienting principles, as opposed to having a Grand Strategy (see Krasner’s article here).
To conclude, let’s imagine a world where we acknowledge that we don’t know how to predict or forecast many things outside of the natural world, and where we actually stop doing it. And let’s start from there. That may not seem like much, and as one of my students asked me recently, “what do you propose instead?” I argued that, on the contrary, it is a lot: knowing that you don’t know, and avoiding costly mistakes. Let’s call it the Hippocratic Oath of economics: above all, do no harm. In any case, let’s be clear: there is nothing to propose instead, and there is no point trying to find our car keys under the lamppost. I am the spirit that negates…
Note: Nassim Taleb’s analysis of these different points can be found in a podcast from the Library of Economics and Liberty, in a conversation that is fascinating if sometimes erratic.
Update 2011/07/17: on this topic, read “Overcoming our aversion to acknowledging our ignorance”, a very interesting article on Cato Unbound by Dan Gardner and Philip Tetlock.