Despite formidable developments in business strategy over the last fifty years, organizations keep being disrupted by events they should have seen coming but didn’t, or by events they saw coming but were unable to avoid or take advantage of. In 1971, NCR was surprised by the rapid rise of electronic cash registers and lost its leadership of the market. In 2007, Nokia was unable to react to the launch of the iPhone, an event the Finnish firm dismissed as minor, and is now struggling to survive. In 2011, the Arab uprisings came as a complete surprise to everybody: not just businesses and governments, but the people involved as well. And the list goes on: if strategy is about addressing the key challenges an organization faces, then the general failure to prepare for (let alone prevent) the economic and political crises the world has faced since 2008 is a massive failure of strategy. Hence it is no surprise that in a 2011 survey conducted by the consulting firm Booz, fully 53% of senior executives did not think their company’s strategy would be successful. Houston, we have a problem…with strategy.
So what is the problem? It could be with managers themselves, who either neglect or misuse the great tools of strategy. Certainly, poor managers exist. Undoubtedly, classic tools are routinely misused: Porter’s five forces model is often more a shopping list for a fill-in-the-blanks exercise than a tool for thinking about a firm’s environment. But this explanation is wholly unconvincing: large multinationals, staffed by managers from the best business schools and advised by leading strategy consulting firms, also failed miserably to anticipate fundamental upheavals in their economic, business and political environments. Those people knew the tools, knew how to use them, did indeed use them, and now find themselves in a sorry state. So, if it is not (primarily) the managers, could it be the tools themselves?
Traditionally, strategy making rests on three pillars. The first is that strategic thinking is reductionist. It assumes that what affects a business can be reduced to a few variables (all traditional, quantifiable business factors), and that the business environment is linear. Once we know the few variables that move a particular environment, all we have to do is pull on them to obtain the desired result. A will lead to B, and B will lead to C. Knowing we want to reach C, we have to prepare for A and then execute. The second pillar is that strategic thinking is deterministic. Most schools of strategy making assume that the environment is exogenous and simply a given: the point of strategy making in such a fixed world is for an organization to find the right fit with this environment, with little, if any, ability to influence it. Key to achieving success with this approach is the ability to make accurate predictions about the future, hence paving the way for efficient action towards the predicted future. The driving philosophy is that “to the extent we can predict the future, we can control it, or at least our piece of it.” The third pillar is that strategic thinking is positivist. It assumes a detachment of the observer (the strategist) from the facts (the environment and the issues at hand), just as a natural scientist deals with chemicals in the lab, or a physicist deals with motion.
The problem is, as one article puts it, that “God gave Physicists the easy problems”. While we are not advocating the essentially reactive strategic posture that often results from the “cult of complexity”, it is essential to understand the nature of the strategic environment and its relationship to the task of strategists. Many of the issues businesses and governments deal with are human, socially constructed, and nonlinear. Such systems display five characteristics:
- Nonlinear systems are synergistic, not additive. The big picture must be kept in mind and the urge to simplify resisted. We cannot separate the different parts of the system: in the seventies, the resilience of the Soviet Union could not be assessed by looking only at the economy, or the military, or the population, but only by looking at the whole. Here lies the danger of “analysis”, a word that comes from the Greek for “breaking up”. One must also do synthesis, i.e. look at the whole, and not just at the parts.
- Nonlinear systems have uncertain cause-and-effect relationships. Side effects and unintended consequences must be considered inevitable. Fighting a drug war and seizing large quantities of drugs raises their price, making the trade more attractive for traffickers. Well-meaning treaty-writers in 1919 could ban large-scale artillery for the Germans, only to see Nazi Germany not only break the treaty but also launch research into an alternative to artillery: rockets and ballistic missiles!
- The behavior of nonlinear systems cannot be repeated. Just because prices have gone up for several years does not mean they will go up next year; hence arguments by analogy will never apply precisely. There can be only one Vietnam war and one launch of the iPad, and the Internet can only be invented once. Some events might come close, but they will differ, and the differences may have significant implications that are impossible to assess.
- There is a disproportionate relationship between input and output. For instance, the Tunisian revolution was sparked by a street vendor who immolated himself after being harassed by the police, even though hundreds of people had been harassed by the police for years with little consequence. Conversely, massive inputs can come to naught, as illustrated by the various “packages” of money poured into the economy in the last few years, or by companies’ unsuccessful investments in R&D.
- Issues cannot be looked upon as natural facts, unchanging regardless of who the observer is. Nature exhibits many nonlinear systems, such as population growth or the weather, but most of the systems businesses and governments deal with are social in nature, i.e. they concern people. The adoption of an innovation, healthcare, public debt, etc. are not just complex mechanistic systems: they involve human characteristics such as emotions, desires, fears and aspirations. Even defining such issues is a problem, as different people will come up with different definitions and views.
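The disproportionate relationship between input and output can be made concrete with a minimal simulation (our illustration, not from the original post) of the logistic map, a textbook nonlinear system that is chaotic at r = 4:

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n).
# At r = 4 the system is chaotic: two trajectories starting a hair's
# breadth apart end up completely decorrelated, so a tiny difference
# in input produces a disproportionate difference in output.

def max_divergence(x0, y0, r=4.0, steps=60):
    """Iterate two trajectories and return the largest gap between them."""
    x, y = x0, y0
    worst = abs(x - y)
    for _ in range(steps):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        worst = max(worst, abs(x - y))
    return worst

# Inputs differ by 0.0000001; the resulting gap dwarfs the input gap.
print(max_divergence(0.2, 0.2000001))
```

This is the same property that defeats the "pull variable A to obtain result C" logic of reductionist strategy: in a nonlinear system, the size of an outcome bears no proportional relationship to the size of the action that triggered it.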
There are five implications of these characteristics.
- The first is that it is impossible to predict future states of the system. A small action might have big consequences, or no consequence at all. Without clear causality, or even a clear direction of causality, it is impossible to know anything for certain in advance. Because behaviors cannot be repeated, learning doesn’t work (and in any case may be ambiguous). Things happen, but why? Sales go up, but is it because we increased the marketing budget or simply because consumers are more optimistic? Will sales go up again if we increase the budget again? The impossibility of prediction, and the alternative offered by what we call “deep understanding”, is discussed in our post “Deep Understanding Beats Prediction”.
- The second is that one should strive to be approximately right rather than exactly wrong. In recognition of the nature of the environment you face, when you must speak about the future, seek not to make exact predictions but forecasts that simply define the scope of the possible; and beware of Gresham’s Law of Strategy.
- The third implication is that the identity of the actors involved is a key element in dealing with the issues. That concerns both who makes strategy – their identity, culture and organization – and who is the ‘subject’ of strategy. Indeed, the vision of scientists conducting an experiment on a closed system must be replaced with that of a self-organizing system made of actors who influence each other. This point is discussed in our post “Start With Who You Are“.
- The fourth implication is that it is impossible to deal with nonlinear systems by focusing only on a few short-term variables. One does not stop a recession simply by reducing interest rates. We can only hope to deal with complex environments by basing our thinking on a reasonably rich representation of this environment over the long term, not on short-term simplifications of it. This point is developed in our post “Start with Geostrategy, or call it Tactics“.
- The final implication is that it is essential to interrogate anomalies: data or incidents that somehow “don’t fit”, seem weird or don’t make sense should receive immediate attention, as they could be pointers to a shift in the system as a whole.
In sum, the approach above means that strategy making is more a craft, or an art, than a science, and it pays to understand the limits of your knowledge.
See Part I of this series: Deep Understanding Beats Prediction. If you enjoyed this post, please sign up to receive updates when we post something new.
Note: This post builds on the work of Josh Kerbel, a CIA analyst.