Tag Archives: forecasting

Geopolitics: Shortcuts For Spotting Good And Bad Analysis

If you enjoyed the piece in Forbes earlier in the week about the similarities between poor geopolitical analysis and psychic cold-reading, an expanded version, “Geopolitics: Shortcuts For Spotting Good And Bad Analysis,” is now available on Seeking Alpha.

A Slipshod Analyst’s Best Friend.


Drivers of Prediction Accuracy in World Politics…Keep digging, Tetlock!

Philip Tetlock and his team have just released an interesting article entitled “The Psychology of Intelligence Analysis: Drivers of Prediction Accuracy in World Politics” in the Journal of Experimental Psychology: Applied (January 12, 2015).  Their article summarizes the findings of the Intelligence Advanced Research Projects Activity (IARPA) tournament that we drew our readers’ attention to in 2013.  If you’re interested in intelligence analysis, forecasting or geopolitics, the article is certainly worth your time.  Nevertheless, we have our differences with Messrs. Tetlock et al.
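For readers unfamiliar with how “prediction accuracy” is scored in tournaments of this kind, the usual yardstick is a proper scoring rule such as the Brier score: the average squared gap between the probabilities a forecaster assigned and what actually happened. The sketch below is our own illustration of one common formulation, with invented numbers; it is not code or data from Tetlock’s paper.

```python
# A minimal sketch (our illustration, invented numbers) of a common form of
# the Brier score: the mean squared gap between the probability assigned to
# "the event happens" and the outcome (1 if it happened, 0 if not).
# Lower scores are better.

def brier_score(probabilities, outcomes):
    """probabilities: chances given to 'event happens' (0 to 1);
    outcomes: 1 if the event happened, 0 if it did not."""
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(outcomes)

outcomes = [1, 0, 1, 1, 0]                    # what actually happened
sharp_forecaster = [0.9, 0.2, 0.8, 0.7, 0.1]  # confident and well calibrated
hedger = [0.5] * 5                            # always sits on the fence

print(brier_score(sharp_forecaster, outcomes))  # ~0.038
print(brier_score(hedger, outcomes))            # 0.25
```

The point of the toy example: a forecaster who commits to probabilities and is usually right is rewarded, while perpetual fence-sitting earns a mediocre score by construction.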

Some of the article’s conclusions are part of the received wisdom of forecasting.  For example, they conclude, “We developed a profile of the best forecasters; they were better at inductive reasoning, pattern detection, cognitive flexibility, and open-mindedness. They had greater understanding of geopolitics, training in probabilistic reasoning, and opportunities to succeed in cognitively enriched team environments. Last but not least, they viewed forecasting as a skill that required deliberate practice, sustained effort, and constant monitoring of current affairs.”  (Hurrah, and here’s to Open Sources!  The important drivers of geopolitics are not remotely secret.)  While these conclusions might sound intuitive, it is useful to document that they stand up to sustained scrutiny in a controlled experiment.

Some of Tetlock and his team’s other conclusions also jibe with our (more sociological) approach to understanding the challenges of forecasting. Among other things, they find that when it comes to anticipating major geopolitical events, teams outperform individuals, and laymen can be trained to be effective analysts using only open sources.

The publication of this article, however, is also an excellent occasion to remind people of the shortcomings of a psychological approach to understanding success and failure in intelligence and geopolitical analysis. As we explore in Constructing Cassandra, purely psychological approaches present intermediate-level theories: they do not necessarily conflict with – but also do not entirely transcend – competing approaches to the problem (such as those presented by studies of organizational behavior or discussions of the “politicization” of intelligence).

Moreover, while the new paper certainly analyses the collective dynamics of information processing (a huge step forward compared with simple “psychological biases” work), without an underpinning in the sociology of knowledge some key root questions about intelligence analysis are left unaddressed: exactly which questions are asked, by whom, in response to what, and why?  As analysts seek to answer them, who gets ignored, when and why?  Which questions are simply rejected, and how and why does that happen?

As Wohlstetter wrote in 1962, “The job of lifting signals out of a confusion of noise is an activity that is very much aided by hypotheses.”  As I discussed last May at the Spy Museum in Washington, that remains true in today’s “Big Data” environment, and Tetlock’s experiments are a worthy attempt to determine who individually and collectively most effectively does that “lifting”, or sorting, of signals from noise.

One more failure of imagination...


BUT what the IARPA work and Tetlock’s experiments do not address is the root cause of surprise, which in our view is the “problem of the wrong puzzle” or, in intelligence terms, bad Tasking (AKA “failures of imagination,” that phrase so beloved of the 9/11 Commission and now often wheeled out as a deus ex machina after a surprise has occurred).

In contrast, we believe the question of Tasking is vital, and that the systematic and sustained study of “Cassandras” – those who give warning but are ignored – is interesting exactly because their imaginations don’t fail, yet, for reasons that extend well beyond the merely psychological, their warnings (which should result in Tasking or further analysis) are ignored.  In other words, given a particular set of questions, who answers them best is quite interesting.  More interesting, however, is which questions are not being asked, and who is excluded from the debate.  Tetlock’s work does not directly address these dilemmas, but we think the answers lie in the realm of the culture and identity of the organization performing the analysis.

Until the role that the culture and identity of analytic teams and intelligence agencies as a whole play in producing surprise is systematically addressed, we will have more strategic surprises than necessary.  The beginning of a cure for any problem is a sound diagnosis, and our diagnosis is that the core challenges of intelligence analysis are socially constructed.  In short, our hats are off to Dr. Tetlock and his team, but they need to dig deeper!

Naturally, we would welcome your comments on the IARPA research or Constructing Cassandra, and if you enjoyed this blog post, why not subscribe?

Our new Forbes piece: Play it Like Steve Jobs: Three Questions for Business Leaders to Ask When Surprise Hits

Our latest post on Forbes proposes a simple framework for leaders to apply when confronted with a strategic surprise – that 3am call… In short, don’t rush into action, no matter how urgent things seem to be! Read the post here.


Our new Forbes piece: Lady Gaga World President by 2030? Why the forecasters so often get it wrong

Our latest post on Forbes, a reflection on the limits of forecasting after the publication of the National Intelligence Council’s Global Trends 2030 report, is available here. In short, don’t predict, construct.


Forecasting World Events – Call for Participants

We may be looking for you.


If you’re reading this blog, you’re probably the sort of person that the US Intelligence Advanced Research Projects Activity (IARPA) is looking for: IARPA is now recruiting new participants for its online research study, Forecasting World Events.

The Forecasting World Events study involves making predictions about current issues that you select from various categories, like international relations, global politics, economics, business, and other areas.  If you’d like to try to participate, click HERE.

Once you sign up at the website, they will send you a background questionnaire.   After you complete the questionnaire, they will send you an e-mail to let you know if you have been selected.  The initial questionnaire only takes about 20 minutes, and the prediction study itself is really interesting and quite quick to do every few weeks.

PS To understand the methodological background of the study, we recommend Tetlock’s Expert Political Judgment: How Good Is It? How Can We Know?

PPS If you enjoyed this post, why not subscribe to our blog?  Thanks.

Repeat after me: “Why won’t the price mechanism work?” – Energy independence and neo-Malthusian commodity fears

I’m not supposed to be blogging.  Philippe and I have a book deadline at SUP this week.  We have a Forbes piece due soon, too.  And I have a speech to prepare for an Institutional Investor Forum in mid-September.  So I’m going to make this quick…

Tonight I went out for dinner with a stack of reading to catch up on.  Over indifferent Italian, I read two articles that I have to share.  One I want to share because it’s so smart, and the other I want to share because it’s the opposite, but it parrots several popular misconceptions.

Continue reading

Geopolitics and Investing: A Reading List

As I explain to my students at IE, the most any business school can hope to do is move you from unconscious ignorance to conscious ignorance of a subject.  In other words, a course can lay a firm foundation in a subject and then provide a jumping-off point for future self-study.  My MIAF course “Geopolitics and Investing” usually prompts the question, “Where should I begin such self-study?  How do I start to learn to generate ‘geopolitical alpha’?”

As I said in an earlier post, there are certain key books that point you towards how to think like an intelligence analyst.  Because the skills of an intelligence analyst and a geopolitical investor overlap so much, I would also say that investors interested in geopolitics should start with those key books.  In particular, if you haven’t mastered the critical thinking and the basic analytic techniques described in Thinking in Time, Essence of Decision and The Thinker’s Toolkit, you are still in kindergarten as far as intelligence analysis is concerned.  Heuer’s Psychology of Intelligence Analysis (downloadable free from the CIA’s site here) is also immensely valuable.  None of these books will teach you geopolitical analysis per se, but they will give you a solid foundation in non-quantitative analysis.

One investor gets a grip on Geopolitics

Continue reading

Welcome to Extremistan! Why some things cannot be predicted and what that means for your strategy

In an earlier post about forecasting, I mentioned Nassim Taleb’s work on the concept of the black swan. In his remarkable book, “The Black Swan”, Taleb describes at length the characteristics of environments that can be subject to black swans (unforeseeable, high-impact events).

When we make a forecast, we usually explicitly or implicitly base it on an assumption of continuity in a statistical series. For example, a company building its sales forecast for next year considers past sales, estimates a trend based on these sales, makes some adjustments based on current circumstances and then generates a sales forecast.  The hypothesis (or rather assumption, as it is rarely explicit) in this process is that each additional year is not fundamentally different from the previous years. In other words, the distribution of possible values for next year’s sales is Gaussian (or “normal”): the probability that sales are the same is very high; the probability of an extreme variation (doubling or dropping to zero) is very low. In fact, the higher the envisaged variation, the lower the probability that such variation will occur.  As a result, it is reasonable to discard extreme values in the forecasts:  no marketing director is working on an assumption of sales dropping to zero.
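To make that contrast concrete, here is a small sketch (our own illustration with made-up figures, not an example from Taleb’s book). Both simulated worlds have the same average sales, but only in the Gaussian world is it reasonable to discard the extreme scenarios; in the fat-tailed “Extremistan” version, a doubling of sales is no longer a freak event.

```python
# A minimal sketch of the assumption described above (invented numbers).
# In the Gaussian world sales vary normally around the trend, so extreme
# outcomes are vanishingly rare; in the fat-tailed world the shocks follow a
# Pareto distribution, and extreme outcomes are common enough that discarding
# them from the forecast would be reckless.
import random

random.seed(1)
trend = 100.0   # hypothetical sales forecast produced by the usual process
n = 100_000

# Gaussian world: noise with a standard deviation of 10% of the forecast
gaussian_world = [random.gauss(trend, 10) for _ in range(n)]

# Fat-tailed world: Pareto shocks with shape 1.5, rescaled so the average is
# also ~100 (a Pareto(1.5) variable has mean 3, hence the division by 3)
fat_tailed_world = [trend * random.paretovariate(1.5) / 3 for _ in range(n)]

def p_extreme(sales):
    """Share of scenarios where sales collapse to zero or at least double."""
    return sum(1 for s in sales if s <= 0 or s >= 2 * trend) / len(sales)

print(f"P(extreme outcome), Gaussian:   {p_extreme(gaussian_world):.4f}")   # ~0.0000
print(f"P(extreme outcome), fat-tailed: {p_extreme(fat_tailed_world):.4f}") # ~0.07
```

In the Gaussian case a doubling of sales sits ten standard deviations from the mean, so ignoring it costs nothing; in the fat-tailed case roughly one scenario in fifteen is “extreme”, even though the average looks reassuringly familiar.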

Continue reading

The Fragility of the Future (and Your Strategy)

Today I was reminded of the perils of forecasting while reviewing  a Department of Defense document, the Joint Operating Environment 2010.

“JOE 2010,” as it’s called, is designed to provide the various branches of the US Armed Forces with a joint perspective on likely global trends, possible shocks and their future operating environment.  If you’re interested in geopolitics and strategy, I recommend that you take a look.

Apart from its inherent interest, JOE 2010 opens with a defense planning timeline that business and financial strategy practitioners – and anyone who consumes their work  – would do well to bear in mind.  I have reproduced it verbatim here:

1900 If you are a strategic analyst for the world’s leading power, you are British, looking warily at Britain’s age-old enemy, France.

1910 You are now allied with France, and the enemy is now Germany.

Continue reading

Gresham’s Law of Strategy: Why Bad Advice Drives out Good Advice

Near the end of a seminal essay on strategic surprise, Richard Betts writes, “The intelligence officer may perform most usefully by not offering the answers sought by authorities, but by offering questions, acting as a Socratic agnostic, nagging decision makers into awareness of the full range of uncertainty, and making authorities’ calculations harder rather than easier.”  I believe that the same should be true for corporate strategy consultants:  often their job is to make long-range calculations harder rather than easier.

Why, then, is the opposite so often true?  In a world in which surprise, disruption and the unanticipated are rife, why do strategists who promise to make calculations easier rather than harder often succeed?  I think a phenomenon that I call “Gresham’s Law of Strategic Advice” is at work.

E pluribus unum

Continue reading