We are all too easily astonished by the course of events. Other sections in this chapter show how we can organise ourselves in order to lessen the chances of this happening. Here, however, we ask what factors contribute to a tendency for astonishment. The text is arranged in two parts:
First, we look at the sources of surprise, asking why it tends to be the case that 'no plan long survives contact with reality.'
Second, we note four systematic sources from which error, and the avoidance of error, seem to flow.
Change, like death and taxes, is always with us. However, in most industries the pace of change is accelerating, even to an extent where some observers have begun to question whether it is possible to take a proactive stance in such an environment. In short, some argue that it is only possible to respond to immediate circumstances and that it is not useful to seek a long-term strategy. Some even go so far as to claim that the accepted wisdom that 'strategic planning is essential for success' is no longer valid in this context.
Despite this pessimistic view, many organisations are beginning to recognise that in order to establish competitive advantage it is necessary to have command of the skills and competencies which they foresee will shape their market's conditions. The time taken to acquire and develop these necessary skills is increasing; it is therefore crucial to recognise which skills to develop and to begin developing them at the appropriate time.
This means that superior strategy relies on an organisation's ability to develop a 'clearer view of the future' than its competitors. Thus, strategic planning is essential to achieve competitive advantage, aiding the organisation to choose the appropriate skills and competencies to invest in and to develop.
Figure 1: Forecasts and scenarios.
However, in a world of growing complexity, this is no longer an easy task. The range of options and alternative futures which face organisations today is much greater than that which faced firms from the 1950s to the 1970s, owing to technologically driven changes in industry structures, regulatory influences which have blurred boundaries between hitherto clearly distinct markets, and the changing nature of work and leisure, to name but three forces of change.
In this environment, making correct assumptions about future trends is difficult if not impossible. One approach that a growing number of organisations have adopted is to work with scenarios. Scenarios are diverging - but plausible - views of the future. They differ from forecasts in that they explore possible futures rather than predict a single point future. As Michael Porter, author of Competitive Advantage has pointed out, scenarios offer "an internally consistent view of what the future might turn out to be - not a forecast".
Figure 2: Converging and diverging systems.
Past forecasts can be compared with contemporary outcomes. The result is seldom happy. We have to be conscious of the fact that we bring our own frames of reference when developing scenarios. It is worthwhile to look for ways of exposing what seems systematically to have been unanticipated by futurists.
We set out to make such a series of studies. We decided not to study how to make organisations more able to adapt, and instead to focus on the ways in which organisations "looked in the wrong place". What were the systematics of visions of the future and their connection with the actual development?
We looked back at examples of several types of forecasting to see what happened after the forecasts were made. We looked at history, at science fiction, at some major forecasts in the public domain and examples relating to the take-up of technology.
Through looking at military history, we realised that errors were made through failure to learn, failure to anticipate and failure to adapt. Analysis of over 20 scenario projects then reinforced these findings: we saw "not looking at the future at all" (failure to learn) and "looking in the wrong place" (failure to anticipate) as major reasons why those scenarios subsequently proved inadequate or unhelpful to the commissioning organisation. These failures lay in being wrong in detail or in failing to see major new trends emerging. They were distinct from the very large proportion of scenario exercises where failure to adapt - to act appropriately when a scenario became imminent - was observed in spite of the existence of the scenarios inside the corporation or organisation.
One clear point to emerge was that there were some assumptions which were common to many of the forecasts that turned out wrong. We believe this list should be useful to planners in general. After all, most organisations run an annual budget cycle which is necessarily based on explicit and implicit assumptions. Shocks and paradigm changes can change the validity of these assumptions overnight and so knock plans sideways in the short or medium term. Examples such as a major typhoon or the fall of the Berlin Wall can make the current year's budget difficult to achieve or even render it meaningless.
Figure 3: The proper domains of scenarios, sensitivities and firms' intentions.
The purpose of this paper is therefore to share what we learnt from our analysis and to highlight how organisations might improve their ability to anticipate the future.
Even competent organisations fail to forecast correctly. Eliot Cohen and John Gooch, in their book Military Misfortunes analyse a number of occasions upon which competent military organisations have failed. They identify and reject two syndromes: (a) "The Man in the Dock", where the person at the top is fired for being at the top at the time of disaster, and (b) "The Man on the Couch", where the blame is on the psychology of the people drawn to and promoted into top positions.
They suggest instead that there are three kinds of organisational failure: failure to learn, failure to anticipate and failure to adapt.
They use a number of examples to illustrate that one out of the three failures is damaging, two create a very serious situation, and all three together almost always produce catastrophe.
As an example of failure to learn, they cite US antisubmarine warfare in 1942. Even before they entered the war, the US studied the antisubmarine warfare experience of the British. However, their assumption was that the problem was primarily a technological one. Only later did they realise that they also needed to learn from British experience of organisation and operations.
Israel showed a failure to anticipate in the period before the Yom Kippur War in 1973. Israel relied on the assumption that Egypt would not attempt another war of conquest and failed to see that President Sadat might attempt a quite different kind of war, one designed to inflict just enough damage to restore Egyptian self-confidence.
In 1915, at Suvla Bay on the Gallipoli Peninsula, the British Army demonstrated a sad failure to adapt. Successful with a surprise landing, the British applied recent experience of defensive operations and did not move rapidly inland, failing to see and take the opportunity of a great victory.
In 1940, the French Army and Air Force experienced all three failures, leading to catastrophe. Although the French had good information about the German blitzkrieg in Poland, they did not learn the lessons. They assumed they would have to fight a World War 1 type of war of position, failing to anticipate the new German war of movement. When the crisis broke, they failed to adapt. Although some leaders began to learn and adapt in a hurry, the Army could not change its ideas and methods quickly enough - it ran out of time.
This analysis helped us to frame our questions about scenarios and forecasting and we decided to concentrate on failures to learn and to anticipate.
On the basis of 20 years' experience with scenario planning, one of the authors - Barbara Heinzen - has developed a list of characteristic reasons why scenarios are not effective in helping the organisation in question to anticipate the future. The list was further refined in work conducted with the Chatham House Forum. The main reasons are given below.
Why organisations fail to anticipate the future:
Not only large organisations but also departments, teams and individuals are vulnerable to these factors. We tested these common sources of error against some selected "shocks" such as the Barings collapse, the Asian financial crisis, and the Maxwell fraud. This showed that in these three examples, the signs were visible - and even widely commented on in the media - for a significant period before the "unexpected" or even "unthinkable" did in fact happen. People in the organisation assumed things were going along satisfactorily, or perhaps preferred not to know they were not.
It became clear that failures like "not paying attention" and "the culture of obedience" are important, and these should be relatively easy for management to overcome. However, even if the organisation does have a function specifically tasked with watching the competitive environment, and takes steps to dispel the tendency towards a culture of obedience, it is easy to miss paradigm shifts and shocks. Putting a great deal of effort into planning while still "not asking the right questions" lies at the root of many failures.
The same reasons might also prevent organisations from foreseeing huge successes. For instance, Virgin moved ahead in a way that the financial services sector and airline industry were unable to forecast because of their assumptions about a pop music label boss who wore a jumper to work. It was assumed that nobody would buy financial services from Richard Branson or trust him to run an airline.
Remembering that organisations should ask the right questions is one thing. Knowing what the right questions are is something else.
We can be clear about 'What We Know'. Also, we have some idea of 'What We Know We Don't Know'. Unhappily, a large area of what is important is hidden from us - it is 'What We Don't Know We Don't Know', and so we cannot ask questions about it.
Figure 4: Knowing what we know, and do not know.
Scenario planning can help us by emphasising that a range of possible futures should be considered. In this environment - which is non-judgemental - even unlikely but possible paradigm changes which affect a business, such as the oil price shock of the 1970s, can be considered analytically. In most scenario planning methodologies, wild cards can also be used: elements which are independent of particular scenarios and which could happen under any of them. In trying to anticipate shocks and paradigm changes, individuals try to do two things - improve their confidence in the current "known" domain, and extend the range of events which they can include and prompt themselves to think about. This is a search for the "truth", but we can never forecast perfectly, as the future is forever unknowable. Scenarios, however, help us to explore the ways it might turn out. If we can improve our ability to anticipate shocks and paradigm changes we can extend the area where our forecasts are likely to be more reliable. Our annual budgets will be better founded and more stable, too.
Long before scenario planning was heard of, H G Wells was visualising different futures based on scientific progress. Aldous Huxley's Brave New World posits a world in which the technological capability which is currently creating Dolly the sheep is applied to humans.
Many other science fiction writers have explored how new scientific and technological knowledge might affect our lives. In general, it seems that technology forecasting has a better record than forecasts about human behaviour, which remains various and often quite unpredictable. There has also been a tendency to over-estimate the capacity of governments to do things, or even to see the need to do certain things. What has often been under-estimated is the capacity of people acting as individuals or in small loose groups to do things, relying on their own common sense.
Some remarkable, socially-credible science fiction, aimed at the near future:
Holy Fire; Islands in the Net.
Snow Crash, or even more remarkable, The Diamond Age.
Distress; Axiomatic; for the speculative, Diaspora
Forecasts aimed at public attitudes
Similar assumptions were visible in the work done in the 1960s by Herman Kahn, in The Year 2000. Some of the one hundred things he expected to see by the year 2000 were:
While he over-estimated the potential of governments to implement big projects, he under-estimated the paradigm change arising from the technology changes based on semi-conductors.
He was also part of the movement trying to get the public to think about the unthinkable. While much of his focus was on the effect of nuclear warfare, he was also part of the Club of Rome group. In 1972, the Club of Rome published its book The Limits to Growth, in which it forecast massive growth of population and industrialisation, along with pollution and the depletion of resources, leading to collapse and catastrophe within a century. Exponential growth was the main focus of attention, and probably the biggest error in the analysis.
At that time, ideas of an ever-increasing population and the exhaustion of resources were popular to the point of being unchallengeable. The forecast did not foresee the exploration of deep sea beds, which found more oil, or that efficiency in fuel use would reduce the need for it; it did not consider that the use of steel would decrease, thus reducing pollution. Nor did it envisage the smaller families which, combined with the increased education of women, have led to zero population growth in many countries.
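The core of this critique can be made concrete with a small sketch. The numbers below are purely illustrative assumptions, not the inputs used in The Limits to Growth: the point is only that extrapolating a fixed exponential growth rate diverges sharply from a logistic model in which growth is damped as a carrying capacity is approached.

```python
import math

def exponential(p0, r, years):
    """Pure exponential extrapolation: the growth rate never slows."""
    return [p0 * math.exp(r * t) for t in range(years + 1)]

def logistic(p0, r, k, years):
    """Logistic growth: the same initial rate, progressively damped
    as the population approaches an assumed carrying capacity k."""
    out, p = [], p0
    for _ in range(years + 1):
        out.append(p)
        p += r * p * (1 - p / k)  # simple yearly (Euler) step
    return out

# Illustrative assumptions only: 3.8bn people in 1972, 2% annual
# growth, and a hypothetical 11bn ceiling.
exp_path = exponential(3.8, 0.02, 100)
log_path = logistic(3.8, 0.02, 11.0, 100)
print(f"After 100 years, exponential: {exp_path[-1]:.1f}bn")
print(f"After 100 years, logistic:    {log_path[-1]:.1f}bn")
```

Starting from identical conditions, the exponential path roughly septuples over the century while the logistic path flattens below its ceiling - which is why a forecast built on unchecked exponentials reads as catastrophe.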
The authors proposed an international group to deal with the problems as they saw them. It is instructive to see how embedded the forecasters were in the period of big government, and how difficult it is for forecasters to reject orthodoxy. At any time an orthodoxy exists: a dominant logic which controls our perceptions of reality. It is promoted by different forecasters who have studied the same body of information with much the same set of themes and values in mind. The orthodoxy seems obvious, and becomes self-reinforcing. Research grants are available for carrying the orthodoxy further, but not usually for overturning it.
Forecasts of technological change
Steven Schnaars has studied an orthodoxy that he calls "the myth of rapid technological change". He notes that in the 1960s, for example, tremendous change was forecast for the way transportation would develop, including commercial passenger rockets, VTOL and supersonic planes, automatic vehicle separation on new "smart highways", and the use of nuclear power in all forms of transport.
The people who made these forecasts now seem to have been enthusiasts enamoured of technological wonder. They went wrong because they fell in love with exotic technologies just because they were exotic. It was easy for them to believe what they wanted to believe. They also failed to pay attention to the less romantic matters of commercial fundamentals. Many of the ideas were simply too expensive to be practical. They made some wrong assumptions about human behaviour as well. Consumers might have agreed that they wanted better mass transit systems, but few were happy at the idea of sitting behind a nuclear engine in a computer-controlled bus. Not all technology is wanted merely because it exists.
Some ideas have come to fruition, but not very quickly. The gradual use of the fax machine is an example of timing and complexity weakening an essentially correct projection. Quick uptake was originally predicted, but initially it was too expensive and took too long to transmit a document. Eventually, 20 years later, the fax machine achieved a mass market through improvements in price and performance.
It is easy to see now that the microwave oven was always a good idea, but it achieved success 25 years later than expected. It was only with changes in lifestyle - women working and the improvement of ready and frozen meals to gourmet status - that the ovens proved to fulfil a useful domestic role. An erratic convergence between technical potential, market economics, societal factors and individual values often defines which products will or will not succeed, and purely technological potential is often insufficient to create a product. It needs its context, which in turn changes the offering. Managing this issue of non-linearity is discussed on this CD, under 'Knowledge Management'.
Mechanisms for creating orthodoxy
At any time, the current orthodoxy has two potential sources. One is that the way we do things is obvious and right and that extrapolation of our present behaviour and knowledge will give us a valid picture of what is going to happen. The second is a kind of collective wishful thinking, which accumulates layers. If we want to believe in an attractive idea and others do too, a momentum can develop; increasingly more people find reasons for believing and do not look for reasons to doubt, especially if the end is a noble one, like saving whales and rainforests, or powering cities with wind energy.
The media reinforce current orthodoxies. Having discovered authorities in any field, the media return repeatedly to the same people, reinforcing their standing and their opinions by their frequent appearances. The media also write and speak in a set of clichés and stereotypes, which contribute to a climate of thinking in terms of right/wrong, good/bad, without discriminating among circumstances. The media reinforce the current orthodoxy until boredom and the possibility of a dramatic overturning of the orthodoxy in favour of a new one start a new cycle.
Even the most solid organisations, which carefully question their own policies, are not immune to this outside influence. We need to remind ourselves to challenge the received wisdom that comes through the media. We tend to let information about the world come to us, but we should go out actively to see if there are other facts and opinions. We need to develop our own diverse sources.
When did the orthodoxy on the Internet change - from "a hacker's tool" to "an essential help to commerce" - and did we forecast its development?
Four sources of error in forecasting appear to emerge from these analyses. Checking ideas, assumptions and plans for sensitivity to these factors may help organisations to improve the quality of the assumptions they make about the future.
(1) The Individual is unboxed
The first error is that planners' assumptions about the behaviour of people, which may have been accurate in previous decades, are no longer valid in the current world. The basic framework of a hierarchy of needs - starting with meeting our basic needs for food, clothing and shelter, and moving on to needs for self-expression and self-actualisation - should warn us that people widen the range of choices they make once food and shelter needs are met. And since today most people in the West are not prompted by memories of hunger or cold, people's behaviour becomes increasingly difficult to forecast. The common reason for the failure of a number of forecasts, particularly the technology-driven ones, was that people were more sensible and capable of adapting than the forecasters or planners expected. A failure to understand human motivation and behaviour can allow paradigm shifts and shocks to take us by surprise.
(2) Government cannot do it
The second error is a major political and military paradigm shift, caused by the comparative retreat of governments. Many Western governments are trying to withdraw from the approach they took in the postwar period. Partly it is because the ability to control their environment has decreased, as finance moves around the globe more easily, large movements of guest workers and immigrants continue and technology makes the international transfer of ideas faster and more copious. At the same time, the public's demand for government services constantly increases, not diminishes. While privatisation satisfies some expectations by replacing the government in supplying services, demographic and employment pressures reduce governments' ability to fulfil their postwar role.
In the bipolar world of the Cold War, the effort by the United States to stay ahead in technology meant that government development funding was large and assured. This resulted in a stream of spin-offs for civilian and commercial exploitation. Now that the Soviet threat has disappeared, funds for research and development have been reduced. The main drivers for technological change must now come from private enterprise. Will the sources and types of technological advancement therefore be harder to forecast? The effect of this paradigm shift is far-reaching, yet many forecasts continue to assume that the role of the government in this field will remain significant.
(3) Technology will be used if it is useful
The third source of common error is in time scales of adoption of technological innovation. Often, the nature of a development is forecast correctly, but the timing is over-optimistic. A good idea attracts enthusiasts who assume that consumers will be equally keen. Forecasting the timing of crucial developments requires an understanding of the other components which are needed to form a total system.
For example, computer hardware needed a popular standard operating system before mass PC use could take off. An important lesson is that a forecast which does not materialise in the expected timescale may not be wrong in its essentials, only in its timescale, so it should not be discarded too quickly. The other components may come from totally different fields - as in the case of the microwave oven discussed earlier.
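The point about timing can be sketched with a simple logistic adoption curve. Every number here is an invented, illustrative assumption - the years and steepness are hypothetical - but the sketch shows how a forecast can be right about the eventual market and still look badly wrong for decades if an enabling component arrives late.

```python
import math

def adoption_share(t, midpoint, steepness=0.5):
    """Logistic S-curve: fraction of the eventual market reached in
    year t, where midpoint is the year half the market has adopted."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

# Same technology, same eventual market; only the midpoint moves,
# because (hypothetically) the enabling components arrive 20 years late.
forecast = [adoption_share(t, midpoint=10) for t in range(41)]
actual   = [adoption_share(t, midpoint=30) for t in range(41)]

print(f"Year 15 - forecast: {forecast[15]:.0%}, actual: {actual[15]:.0%}")
print(f"Year 40 - forecast: {forecast[40]:.0%}, actual: {actual[40]:.0%}")
```

At year 15 the forecast looks like a failure - over 90% predicted, almost nothing delivered - yet by year 40 both curves are essentially saturated. Judged too early, a forecast wrong only in its timescale is indistinguishable from one wrong in its essentials.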
The question we should keep asking ourselves is: who would want any of these goods, and what would they use them for? It provides a useful counterpoint at a time of hype.
A fourth source of error is a change in public attitudes. For centuries up to the turn of this century, Western intellectual thought embraced the idea of continual progress towards greater scientific certainty and a more perfect state of being. Ultimately, everything would be explained and all problems would have solutions. The experience of the 20th century has disillusioned many and preoccupations with worries about issues such as pollution, the nuclear threat and ethnic conflict have challenged our assumptions about the nature of progress. Now, we do not think that things will necessarily get better. We think we might do well if we can merely sustain things. This loss of optimism is more marked, perhaps, in Europe than in the United States.
Organisations also need to make better use of the forecasts they already have. A forecast should be seen as an experiment. It is too often seen as a discrete unit with an end point, but in fact, it is only a step along the road, leading to the next forecast. It is a snapshot of the future, as taken now, and there may well be more in a picture taken tomorrow. Organisations can learn more from a forecast, as from any experiment, if they keep reviewing it. Each time, we are trying something out to see how it looks. Even "failures" can be useful if we use them for learning.
For instance, the figure represents a forecast, made in 1980, of the use of IT in the home. It is 90% accurate - most of the functions are those found in today's multi-media PCs. But it raises a laugh because it looks so old-fashioned - like an aircraft console - who would want it in their living space? In 1980, the effects of the semi-conductor revolution were beginning to be visible, in terms of the reduced size, power consumption and price of computing. A review of this even two years later would have enabled the useful 90% to be extracted from the erroneous assumptions about the hardware used as the delivery vehicle.
Figure 5: Yesterday's future.
In the section above we summarised the main sources of error we have found in forecasts. These are important to individuals both at work and at home. In a changing world, individuals find that they are responsible for planning for their own futures, because organisations will not and governments say they cannot. If individuals can increase their forecasting skills for personal reasons, they can contribute more to their organisation's forecasting. This should be welcomed. We started with the assumption that organisations needed to be able to take a view of the future in order to gain competitive advantage.
Although most organisations like to claim that they are open to new thoughts, in practice they behave as though they value orthodoxy among staff. Some people develop unorthodox views anyway and they should be made to feel they are valued. But we cannot rely on people being brave and we need to actively encourage people to think outside the box.
In many organisations, most people are busy achieving their tasks and only a few people at the centre have authority and time to question assumptions. And, although a greater variety of views can help, greater quantity can slow things down and too many inputs at corporate level can be destabilising. We need to manage a change to encourage more people to participate in a helpful way. What we want, of course, in both the personal and corporate worlds, is intelligent and correct forecasting.
Getting it right implies realising that in both worlds we carry our sets of assumptions around with us and we have seen that clinging to current orthodox assumptions is a major source of forecasting error.
We need some more tools to help us improve our forecasting. Scenario planning is well known to be valuable in encouraging people to think outside orthodoxies, and so are various forms of modelling and simulation. But perhaps the toolbox needs restocking and new methodologies should be developed to lever the benefits from what is essentially a less well defined, but equally relevant, branch of knowledge management.
Organisations operate on the basis of forecasts, whether the forecasts are carefully prepared and assessed, or whether they consist of a set of common and unquestioned assumptions. Most forecasts get things wrong and all assumptions are steadily going out of date.
This review has highlighted the prevalence of four major sources of error: