Tag Archives: black swan

Geopolitics, Investing, and the Little Book of Psychic Cold Reading

Milo’s latest advice for investors and business people trying to come to grips with geopolitics is now available on Forbes.com. It’s called “Geopolitics, Investing, and the Little Book of Psychic Cold Reading”.

Risk, Uncertainty and Black Swans: Theoretical Differences and Practical Implications

Every time Milo and I teach about how organizations can make sense of their environments, we are confronted with the difficulty of explaining why uncertainty is so different from risk and why understanding that difference matters to entrepreneurs and managers. In this article, we address those questions and discuss the practical implications that flow from them.

Continue reading

Constructing Cassandra Now Available

Our new book on strategic surprise, Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947-2001, is now available for pre-order worldwide.

Interested readers in North America can read reviews and order it via Amazon.com or Barnes & Noble; in the UK you can use Amazon.co.uk; in the rest of the EU, you may wish to use Amazon.fr or Amazon.de; and in Asia you may wish to use Amazon.co.jp.

If you do order it, thank you. Naturally, if you have any questions about the book, please ask us.

Culture of risk vs. culture of uncertainty: a crucial management issue

Recently I had a discussion with a friend, a colonel in the Army, about the culture of risk among senior officers and, by extension, in management. The culture of risk is an important question for any organization.

To understand the culture of risk, we must first distinguish between two types of risk. Type One risk is the risk of doing something that leads to an error or a bad result. It’s a reasonable assumption that the majority of our time in school and in higher education is designed to teach us how to reduce such risks.

Type Two risk is the opposite: it is the risk of not doing something that could be valuable. Of course, the two are linked: the more one reduces the risk of doing something, the more one increases the risk of not doing something valuable. The trouble is that we tend to focus more on Type One risks than on Type Two risks. On one level, this makes sense: after all, failure is very visible – a disaster, a lost war, a failed product launch, etc. In contrast, forfeited opportunity is invisible: we do not see what valuable things our caution has prevented us from doing, and no one is punished for not having invented something.

Our education, liability laws and corporate governance structures push us towards a culture of Type One risk avoidance, i.e. to reduce the risk of failure (Sarbanes-Oxley, anyone?). This is obviously a problem for innovation in the long term, but it doesn’t even reliably protect us. If it did nothing else, the financial crisis that began in 2008 demonstrated that the very institutions entrusted with managing risk failed to do so properly. In short, we focus on risk avoidance at the expense of opportunity creation, and we don’t even avoid risk very well!

Continue reading

Managing the unexpected – On the work of Karl E. Weick and Kathleen M. Sutcliffe

Karl Weick has long been known for his work on organization theory. In particular, his work focuses on how organizations make sense of complex and uncertain environments. Among Weick’s most famous works is his study of the Mann Gulch fire, an initially banal forest fire in 1949 that went wrong and resulted in the deaths of 13 firefighters. Weick’s analysis shows that in such conditions, a professional team faces what he calls a ‘cosmology episode’, i.e. an event so unexpected and powerful that it destroys the victims’ will and ability to act, in particular to act as a group. It is a great piece of scholarship.

“Managing the Unexpected” explores how organizations can handle the unexpected. To do so, Weick and Sutcliffe chose to study organizations that are specifically created with that end in mind, which they call High-Reliability Organizations (HROs): firefighters, the crew of a submarine, the control center of a nuclear plant, etc. What principles do these organizations implement in order to operate reliably?

Continue reading

Welcome to Extremistan! Why some things cannot be predicted and what that means for your strategy

In an earlier post about forecasting, I mentioned the work of Nassim Taleb on the concept of the black swan. In his remarkable book, “The Black Swan”, Taleb describes at length the characteristics of environments that can be subject to black swans (unforeseeable, high-impact events).

When we make a forecast, we usually base it, explicitly or implicitly, on an assumption of continuity in a statistical series. For example, a company building its sales forecast for next year considers past sales, estimates a trend based on those sales, makes some adjustments based on current circumstances and then generates a sales forecast. The hypothesis (or rather assumption, as it is rarely made explicit) in this process is that each additional year is not fundamentally different from the previous years.

In other words, the distribution of possible values for next year’s sales is assumed to be Gaussian (or “normal”): the probability that sales stay close to their current level is very high; the probability of an extreme variation (doubling, or dropping to zero) is very low. In fact, the larger the variation envisaged, the lower the probability that it will occur. As a result, it seems reasonable to discard extreme values from the forecast: no marketing director is working on an assumption of sales dropping to zero.
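To make the stakes of that assumption concrete, here is a minimal sketch in Python (using scipy.stats; all figures are hypothetical, chosen only for illustration) comparing the probability of an extreme outcome – sales doubling – under a Gaussian model and under a fat-tailed alternative:

```python
from scipy import stats

# Hypothetical figure for illustration only: current annual sales of 100 units.
current_sales = 100.0

# "Mediocristan": the implicit Gaussian model, with next year's sales
# normally distributed around the current level (standard deviation 10).
p_double_gaussian = stats.norm.sf(2 * current_sales, loc=current_sales, scale=10.0)

# "Extremistan": a deliberately crude fat-tailed alternative, where the ratio
# of next year's sales to this year's follows a Pareto law with tail
# exponent alpha = 1.5.
p_double_fat_tailed = stats.pareto.sf(2.0, b=1.5)

print(f"P(sales double), Gaussian model:   {p_double_gaussian:.1e}")   # about 8e-24
print(f"P(sales double), fat-tailed model: {p_double_fat_tailed:.2f}")  # about 0.35
```

Under the Gaussian model, a doubling of sales is so improbable that discarding it costs nothing; under even this crude fat-tailed model, it happens roughly one year in three. Which of the two worlds you are in – Mediocristan or Extremistan, in Taleb’s vocabulary – is therefore not a statistical nicety but the heart of the matter.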

Continue reading

The Fragility of the Future (and Your Strategy)

Today I was reminded of the perils of forecasting while reviewing a Department of Defense document, the Joint Operating Environment 2010.

“JOE 2010”, as it’s called, is designed to provide the various branches of the US Armed Forces with a joint perspective on likely global trends, possible shocks and their future operating environment. If you’re interested in geopolitics and strategy, I recommend that you take a look.

Apart from its inherent interest, JOE 2010 opens with a defense planning timeline that business and financial strategy practitioners – and anyone who consumes their work – would do well to bear in mind. I have reproduced it verbatim here:

1900 If you are a strategic analyst for the world’s leading power, you are British, looking warily at Britain’s age-old enemy, France.

1910 You are now allied with France, and the enemy is now Germany.

Continue reading

We have met the enemy and he is, er, forecasting

There is no doubt we are terribly bad at forecasting. Even the smartest among us are. Even the best and the brightest, whom we have tasked with saving the world from financial annihilation, are. Take Ben Bernanke, the future Chairman of the Federal Reserve. In 2004, he declared, in a speech ominously titled “The Great Moderation”: “One of the most striking features of the economic landscape over the past twenty years or so has been a substantial decline in macroeconomic volatility. This […] makes me optimistic for the future.” You might want to read the full transcript of the “Great Moderation” talk here, because it makes for fascinating reading on how wrong experts can be at forecasting. And it’s not just Ben. In fact, political, economic and business histories are littered with forecasts and predictions that turned out to be ridiculously wrong: from the commercial potential of the Xerox machine or of Nespresso, to the possibility of heavier-than-air flight, to the market for mobile phones, to prosperity just around the corner, to Japan as Number One. Our hopelessness at forecasting is a confirmed fact.

Continue reading