I’m not supposed to be blogging. Philippe and I have a book deadline at SUP this week. We have a Forbes piece due soon, too. And I have a speech to prepare for an Institutional Investor Forum in mid-September. So I’m going to make this quick…
Tonight I went out for dinner with a stack of reading to catch up on. Over indifferent Italian, I read two articles that I have to share: one because it's so smart, and the other because it's the opposite: it parrots several popular misconceptions.
As Milo and I have argued before, the environments and issues businesses deal with are more complex than traditional strategy models admit. Business issues today display high levels of uncertainty, they can behave non-linearly, and they can be vulnerable to "black swans", i.e. low-probability but high-impact events that disrupt even the best-formulated strategies. The added difficulty for strategists and managers is that nonlinear environments often appear linear for an extended period (think US house prices). As a result, some conclude that what seems to be an essentially linear pattern (prices fluctuate a bit around a 'long-term trend' but always rise) is linear in reality, until a radical change occurs that completely disrupts previously assumed patterns (e.g. prices fall dramatically). In short, people often assume an environment is linear and predictable when in fact the continuity we observe is only a particular case of limited duration. To make matters worse, in many nonlinear systems change is not nicely spread over the years: most of the cumulative change occurs in one single, often dramatic, occurrence. In the language of engineering, some things don't "fail gracefully" (e.g. a bridge that breaks suddenly instead of bending slowly).
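To make the extrapolation trap concrete, here is a minimal sketch (with entirely invented numbers) of a series that rewards a linear trend fit for ten periods and then breaks, in the manner of the house-price example above:

```python
# Toy illustration: a series can look comfortably linear for years, so a
# trend extrapolation fits beautifully -- right up until a regime change
# makes the "trend" meaningless. All numbers below are invented.

# Ten periods of steady growth, then a sudden break.
series = [100, 103, 106, 109, 112, 115, 118, 121, 124, 127, 90]

# Fit a straight line to the apparently linear first ten points
# (simple least squares, done by hand to avoid any dependencies).
n = 10
xs = list(range(n))
ys = series[:n]
x_mean = sum(xs) / n
y_mean = sum(ys) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

predicted = intercept + slope * n   # what the trend "promises" for period 10
actual = series[n]                  # what the nonlinear system delivers

print(f"trend prediction: {predicted:.1f}, actual: {actual}")
# The fit was perfect in-sample, and the forecast misses by 40 points.
```

The point is not that trend-fitting is always wrong, but that an excellent in-sample fit tells you nothing about whether the underlying system fails gracefully.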
Not a “graceful failure”.
Milo and I are organizing a workshop on Tuesday, May 29, on ACH (Analysis of Competing Hypotheses). ACH is a tool originally developed by Richards Heuer at the CIA to analyze complex and uncertain situations. It is widely used in intelligence and international politics, but Milo and I think it applies equally well to strategic decision making in business. ACH uses a deceptively simple framework that draws on ideas from the scientific method, cognitive psychology and decision analysis to overcome a common but immensely important bias: the fact that we tend to perceive what we expect to perceive rather than what actually exists.
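The core mechanical step of ACH is a matrix: each piece of evidence is rated against every hypothesis, and hypotheses are then ranked by how much evidence is inconsistent with them, since ACH tries to disprove rather than confirm. A minimal sketch of that scoring step, with hypotheses and ratings invented purely for illustration:

```python
# Toy ACH consistency matrix. Ratings: "C" = consistent with the
# hypothesis, "I" = inconsistent, "N" = neutral. The competitor
# scenario and evidence items below are invented for illustration.

ratings = {
    "Rival cut prices last quarter":      {"H1: price war": "C", "H2: exit market": "I"},
    "Rival froze hiring":                 {"H1: price war": "N", "H2: exit market": "C"},
    "Rival signed 10-year supplier deal": {"H1: price war": "C", "H2: exit market": "I"},
}

def inconsistency_score(hypothesis):
    """Count the evidence items rated inconsistent with this hypothesis."""
    return sum(1 for row in ratings.values() if row.get(hypothesis) == "I")

# ACH ranks hypotheses by LEAST disconfirming evidence, not most support.
hypotheses = ["H1: price war", "H2: exit market"]
for h in sorted(hypotheses, key=inconsistency_score):
    print(h, "->", inconsistency_score(h), "inconsistent item(s)")
```

The counting rule is deliberately crude; Heuer's point is that forcing yourself to rate every item against every hypothesis, and to focus on disconfirmation, is what counters the bias of seeing only what you expect.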
Recently I had a discussion with a friend who is a colonel in the Army about the culture of risk among senior officers and, by extension, in management. The culture of risk is an important question for any organization.
To understand the culture of risk, we must first distinguish between two types of risk. Type One risk is the risk of doing something that leads to an error or a bad result. It's a reasonable assumption that most of our time in school and in higher education is spent learning how to reduce such risks.
Type Two risk is the opposite: the risk of not doing something that could be valuable. Of course, the two are linked: the more one reduces the risk of doing something, the more one increases the risk of not doing something valuable. The trouble is that we tend to focus more on Type One than on Type Two risks. On one level, this makes sense: after all, failure is very visible – a disaster, a lost war, a failed product launch, etc. In contrast, forfeited opportunity is invisible: we do not see what valuable things our caution has prevented us from doing, and no one is punished for not having invented something. Our education, liability laws and corporate governance structures push us towards a culture of Type One risk avoidance, i.e. towards reducing the risk of failure (Sarbanes-Oxley, anyone?). This is obviously a problem for innovation in the long term, but it doesn't even reliably protect us. If it demonstrated nothing else, the financial crisis that began in 2008 showed that the institutions entrusted to manage risk failed to do so properly. In short, we focus on risk avoidance at the expense of opportunity creation, and we don't even avoid risk very well!
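The trade-off between the two risks can be made concrete with a toy portfolio of candidate projects. In this sketch (all probabilities and payoffs invented), raising the bar for approval shrinks the visible Type One exposure while silently inflating the invisible Type Two loss:

```python
# Toy illustration of the Type One / Type Two trade-off described above.
# Each candidate project has an invented success probability and payoff;
# the organization only approves projects above a risk threshold.

projects = [
    {"name": "safe upgrade", "p_success": 0.95, "payoff": 10},
    {"name": "new market",   "p_success": 0.60, "payoff": 50},
    {"name": "moonshot",     "p_success": 0.20, "payoff": 400},
]

def evaluate(min_p_success):
    """Return (expected visible failures, expected forfeited value)."""
    approved = [p for p in projects if p["p_success"] >= min_p_success]
    rejected = [p for p in projects if p["p_success"] < min_p_success]
    # Type One exposure: expected number of approved projects that fail.
    type_one = sum(1 - p["p_success"] for p in approved)
    # Type Two loss: expected value of the projects we never attempted.
    type_two = sum(p["p_success"] * p["payoff"] for p in rejected)
    return type_one, type_two

for threshold in (0.1, 0.5, 0.9):
    failures, forfeited = evaluate(threshold)
    print(f"threshold {threshold}: expected failures {failures:.2f}, "
          f"forfeited value {forfeited:.1f}")
```

The cautious organization at the 0.9 threshold almost never fails visibly, yet forfeits far more expected value than the permissive one, and nothing in its accounts will ever show it.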
As I explain to my students at IE, the most any business school can hope to do is move you from unconscious ignorance to conscious ignorance of a subject. In other words, a course can lay a firm foundation in a subject and then provide a jumping-off point for future self-study. After my MIAF course "Geopolitics and Investing", that usually prompts the question: where should I begin such self-study? How do I start to learn to generate "geopolitical alpha"?
As I said in an earlier post, there are certain key books that point you towards how to think like an intelligence analyst. Because the skills of an intelligence analyst and a geopolitical investor overlap so much, I would say that investors interested in geopolitics should start with those key books. In particular, if you haven't mastered the critical thinking and the basic analytic techniques described in Thinking in Time, Essence of Decision and The Thinker's Toolkit, you are still in kindergarten as far as intelligence analysis is concerned. Heuer's Psychology of Intelligence Analysis (downloadable free from the CIA's site here) is also immensely valuable. None of these books will teach you geopolitical analysis per se, but they will give you a solid foundation in non-quantitative analysis.
One investor gets a grip on Geopolitics
Joseph Nye, an eminent political scientist at Harvard, wrote a book about “soft power” a few years ago. He followed that volume up by devoting a chapter to the concept in last year’s book The Future of Power. So what is “soft power”?
According to Nye, whereas "hard power" grows out of a country's military or economic might, soft power "arises from the attractiveness of a country's culture, political ideals, and policies." In The Future of Power, Nye examines what it means to be powerful in the twenty-first century, and how the US might set about retaining its place in the world. He thinks soft power will be an important part of the mix, and I tend to agree.
But while I’m generally optimistic about the future of America’s place in the international order, one historical parallel related to soft power disturbs me: the degree to which the threat of terrorism has led the US to create embassy buildings that appear to cower before contemporary threats.
The difficulty of anticipating strategic surprises is often ascribed to a 'signal-to-noise' problem, i.e. to the inability to pick up the so-called 'weak signals' that foretell such surprises. In fact, monitoring weak signals has become a staple of competitive intelligence, all the more so since the development of information technology that allows the accumulation and quasi-automatic processing of massive amounts of data. The idea is that identifying weak signals will enable an organization to detect a problem (or an opportunity) early and, hence, to react more quickly and more appropriately. For instance, a firm can detect a change in consumer attitudes by spending time with the most advanced consumers, as Nokia did in the early 1990s, a move that enabled the firm to realize that the mobile phone was becoming a fashion item.
Karl Weick has long been known for his work on organization theory. In particular, his work focuses on how organizations make sense of complex and uncertain environments. Among Weick's most famous works is his study of the Mann Gulch fire, an initially banal forest fire in 1949 that went wrong and resulted in the deaths of 13 firefighters. Weick's analysis shows that in such conditions, a professional team faces what he calls a 'cosmological event', i.e. an event so unexpected and powerful that it destroys the victims' will and ability to act, in particular to act as a group. It is a great piece of scholarship.
“Managing the Unexpected” explores how organizations can handle the unexpected. To do so, Weick and Sutcliffe chose to study organizations that are specifically created with that end in mind, which they call High-Reliability Organizations (HROs): firefighters, the crew of a submarine, the control center of a nuclear plant, etc. What principles do these organizations implement in order to operate reliably?