David Omand – How Spies Think – 10 Lessons in Intelligence – Part 6

Lesson 4: Strategic notice. We do not have to be so surprised by surprise

Early in the blustery spring morning of 14 April 2010 an Icelandic volcano with a near-unpronounceable name (Eyjafjallajökull) exploded, throwing a cloud of fine ash high into the sky. The debris was quickly swept south-east by the regular jet stream across the Atlantic until the skies above Northern Europe were filled with ash. Deep under the Icelandic ice-sheet, melt water from the heat of the magma had flowed into the site of the eruption, rapidly cooling the lava and causing the debris to be rich in corrosive glass particles. These are known to pose a potential hazard if ingested by aircraft jet engines. The next day, alarmed air traffic authorities decided they had to play it safe, since no one had prescribed in advance the specific particle sizes and levels below which engines were considered not to be at risk and thus safe to fly. They closed airspace over Europe and grounded all civil aviation in the biggest shut-down since the Second World War.[1]

Yet there had been warning that such an extreme event might one day occur, an example of strategic notice, the fourth component of the SEES model of intelligence analysis. The government authorities in Iceland had for years been asking the airlines to determine the density and type of ash that is safe for jet engines to fly through. Had those tests been carried out, the disruption in 2010 would have been far less severe. There would still have been no immediate forewarning of the volcano about to explode, but sensible preparations would have been in place for when it did.

The lesson is that we need sufficiently early notice of future developments that might pose potential danger to us (or might offer us opportunities) so that we are prepared to take precautionary measures just in case. Strategic notice enables us to anticipate. Governments had strategic notice of possible coronavirus pandemics – the COVID-19 outbreak should not have caught us unprepared.

There is an important difference between having strategic warning of the existence of a future risk and a prediction of when such a risk might materialize. Scientists cannot tell us exactly when a specific volcano will erupt (or when a viral disease will jump from animals to humans). But there can be warning signs. Based on historical data, some sense of the scale and frequency of eruptions can be given. In the Icelandic case it was to be expected that some such volcanic activity would occur within the next fifty years. But before the volcanic events of April 2010, aviation authorities and aircraft engine manufacturers had not recognized that they needed to prepare. Instead they had implicitly accepted the precautionary principle[2] that if any measurable volcanic ash appeared in the atmosphere they would issue an advisory notice that all planes should be grounded, even at the cost of considerable disruption to passengers.

The airlines had known of the baseline precaution of grounding planes in the event of volcanic ash appearing in the atmosphere, but they had not thought through in advance how such a major global dislocation would be handled. After the April 2010 closure of European airspace, the effects rapidly cascaded around the world. Planes were diverted on safety grounds to countries for which the passengers did not have visas, so they could not leave the airport to get to hotels. Coming at the end of the Easter holidays, the closure left school parties unable to return for the start of term. Nobody had considered whether stranded passengers should have priority over new passengers for booking on flights when they restarted. For millions of people the result was misery, camping in airports until aviation was finally allowed to resume just over a week later. At the same time, test flights were rapidly organized by the aero-engine manufacturers. These provided data on which calibrated judgements could be made of when it is safe enough to fly through ash clouds. By the end of a week of chaos and confusion, 10 million passengers had been affected, with the aviation industry facing losses of over £1bn.

The same thing happened in the 1982 Falklands crisis. The British government was given strategic warning by the JIC that Argentine patience might run out, in which case the Junta could take matters into its own hands. That warning could have prompted the stationing of naval forces as a credible deterrent, while a permanent solution could have been created by extending the runway to handle long-distance transport aircraft and by stationing fast jets (as has since been done). That would have been expensive. But the expense pales in comparison with the loss of over 1,000 lives, not to mention the estimated cost of over £3bn involved in recovering the Islands for the Crown once they had been lost.

‘I just say it was the worst, I think, moment of my life’ was how Margaret Thatcher later described the surprise loss of the Falklands: yet she and her senior Cabinet members and the officials supporting them had not understood beforehand the dangers they were running. It was painful for me as a member of the Ministry of Defence to have to recognize later that we had all been so preoccupied by other problems, including managing defence expenditure, that we failed to pay sufficient attention to the vulnerability of the Falklands. We implicitly assumed (magical thinking) that the need would never arise. It was a salutary lesson learned early in my career and one that stayed with me.

Living with surprise

The fourth stage of the SEES method involves acquiring strategic notice of the important longer-term developments that could affect you. If you do not have these at the back of your mind, the chances are that you will not have prepared either mentally or physically for the possibility of their occurring. Nor will you be sufficiently alert to spot their first signs. We will experience what is known to intelligence officers as strategic surprise.

The distinction between strategic and tactical surprise is an old one in military history. It is often hard for a general to conceal the strategy being followed. But when it comes to choosing tactically when and where to attack, a commander can obtain the advantages of surprise by, for example, picking a point in the enemy’s defences where at least initially he will have the advantage. In 1944 the Germans knew perfectly well that the Allies were preparing a major landing of US, British and Canadian troops on the continent of Europe. That intent was no surprise since the strategy of opening a second front in Europe was well known. But the tactics that would be adopted, the date of the invasion, and exactly where and how the landings would be mounted were secrets carefully kept from the German High Command. Come 6 June 1944, the Allies landed in Normandy and enjoyed the immediate advantage of tactical surprise.

A tragic example of tactical surprise was the events of 7 July 2005, when terrorist suicide bombers with rucksack bombs struck at the London Underground network and surface transport during the morning rush hour. Fifty-two innocent passengers lost their lives and very many more suffered horrific injuries. The attacks came without intelligence warning and the shock across London and round the world was considerable. But they were not a strategic surprise to the authorities.

The likelihood of terrorist attacks in London in 2005 had been assessed by the Joint Terrorism Analysis Centre based in MI5 headquarters. Intelligence had indicated that supporters of Al Qaida living inside the UK had both the capability and the intent to mount some form of domestic terror attack. The possibility that the London Underground system would be an attractive target for terrorist suicide bombers had been anticipated, plans drawn up and staff trained just in case. A full-scale live rehearsal of the response to a terrorist attack on the Underground, including the emergency services and the hospitals that would be receiving casualties, had been held in September 2003. Just as well, since many practical lessons were learned that helped the response two years later.[3] The same can be said of the 2016 pandemic exercise and the COVID-19 outbreak in 2020. Exercises can never fully capture the real thing, but if events come as a strategic surprise the damage done will be far greater.

The same is true for all of us. We have, for example, plenty of strategic notice that our possessions are at risk of theft, which is why we should think about insurance. If we do get our mobile phone stolen we will certainly feel it as an unwelcome tactical surprise, but if insured we can console ourselves that however inconvenient it is not as bad as if it had been a strategic surprise as well.

Forestalling surprise

Intelligence communities have the duty of trying to forestall unwelcome surprises by spotting international developments that would spell real trouble.[4] In 1973 Israeli intelligence was carefully monitoring Egypt for evidence that President Sadat might be preparing to attack. Signs of mobilization were nevertheless discounted by the Israelis. That was because the Israeli Director of Military Intelligence, Major General Eli Zeira, had convinced himself that he would have strategic notice of a coming war. He reasoned that without major new arms imports from Russia, and a military alliance with Syria, Egypt would be bound to lose. Since no such imports or alliance with Syria had been detected, he was certain war was not coming. What he failed to spot was that President Sadat of Egypt also knew that and had no illusions about defeating Israel militarily. His plan was to launch a surprise attack to seize the Sinai Peninsula, call for a ceasefire and then negotiate from strength in peace talks. It was a crucial report from Israel’s top spy inside Egypt (the highly placed agent Ashraf Marwan was actually the millionaire son-in-law of Gamal Abdel Nasser, Egypt’s second President), arriving literally on the eve of Yom Kippur, that gave Israel just enough time to mobilize to resist the attack when it came. The near disaster for Israel provides a warning of the dangerous double power of magical thinking: not only imagining that the world will somehow of its own accord fit in with your desires, but also interpreting all evidence to the contrary so as to confirm your belief that all is well.

An important conclusion is that events which take us unawares will force us to respond in a hurry. We did not expect it to happen today, but it has happened. If we have not prepared for the eventuality we will be caught out, red-faced, improvising rapidly to recover the situation. That includes ‘slow burn’ issues that creep up on us (like COVID-19) until we suddenly realize with horror that some tipping point has been reached and we are forced to respond. Climate change due to global warming is one such ‘slow burn’ issue. It has been evident to scientists for decades and has now reached a tipping point with the melting of polar ice and weather extremes. It is only very recently, however, that this worsening situation has become a matter for general public concern.

The creation of ISIS in Syria and Iraq is another example, where intelligence officers only slowly began to recognize that something significant and dangerous was afoot as the terrorists began to occupy and control areas of the two countries. The failure of strategic notice was not to see how a combination of jihadist participation in the civil war in Syria and the strength of the remnants of the Sunni insurgency in Iraq could create a power vacuum. The early signals of major risks may be weak, and hard to perceive against the generally noisy background of reality. For example, we have only recently recognized the re-emergence of state threats through digital subversion and propaganda, and the possibility of highly damaging cyberattacks against critical infrastructure such as power and telecommunications.

The product of the likelihood of something happening (a probability) and a measure of its impact if it does arise gives us what is called the expected value of the event. We are all familiar with the principle from assessing the expected value of a bet: the combination of the chances of winning and the payoff (winnings minus our stake) if we do. At odds of 100 to 1 the chances are low but the payoff correspondingly large, and vice versa with an odds-on favourite. We also know that the expected value of a series of separate bets can be calculated by simply adding the individual net values together. Wins are sadly usually quickly cancelled out by losses. The effect is even more evident with games like roulette, in which, with a fair wheel, there is no skill involved in placing a bet. Over a period, the bookmakers and casino operators will always make their turn, which is why they continue to exist (but punters will still come back for more because of the non-monetary value to them of the thrill of the bet).
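
As a minimal sketch of that arithmetic (the probabilities and stakes below are illustrative assumptions, not figures from the text):

```python
# Expected value of a bet: chance of winning times the payoff, minus the
# chance of losing times the stake. Numbers here are illustrative only.

def expected_value(p_win: float, winnings: float, stake: float) -> float:
    """Expected net return of a single bet."""
    return p_win * winnings - (1 - p_win) * stake

# A 100-to-1 long shot: low chance of winning, large payoff if it comes in.
long_shot = expected_value(p_win=0.008, winnings=100.0, stake=1.0)

# An odds-on favourite: high chance of winning, small payoff.
favourite = expected_value(p_win=0.6, winnings=0.5, stake=1.0)

# Expected values of separate bets simply add together.
print(f"long shot {long_shot:+.3f}, favourite {favourite:+.3f}, "
      f"series {long_shot + favourite:+.3f}")
# Both come out slightly negative: that margin is the bookmaker's 'turn'.
```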

Very unlikely events with big consequences (events that in our experience we do not expect to happen to us, or only rarely) do nevertheless sometimes pop up and surprise us.[5] They are sometimes referred to as ‘long-tailed’ risks because of the way they lie at the extreme end, or tail, of the probability distribution of outcomes rather than in the ‘expected’ middle range. An example might be the 2007 global financial crash.[6] Our intuition can also mislead us into thinking that the outcome of some event that concerns us is as likely to be above the average (median) as below it, since so many large-scale natural processes are governed by the so-called ‘normal’ bell-shaped symmetrical probability distribution. But there are important exceptions in which there is a sizeable long tail of bad outcomes.
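
A short simulation makes the long-tail point concrete (the distributions and the threshold are illustrative choices, not drawn from the text): extreme outcomes that are vanishingly rare under a bell curve turn up regularly under a heavy-tailed one.

```python
# Compare how often 'extreme' outcomes occur under a normal (bell-shaped)
# distribution versus a classic long-tailed one (Pareto). Illustrative only.
import random

random.seed(1)
N = 100_000
threshold = 5.0  # an outcome five units beyond the centre counts as extreme

normal_hits = sum(random.gauss(0, 1) > threshold for _ in range(N))
# paretovariate(alpha) returns values >= 1; shift so the range starts at 0.
pareto_hits = sum(random.paretovariate(1.5) - 1 > threshold for _ in range(N))

print(f"extreme outcomes in {N:,} draws -> normal: {normal_hits}, "
      f"long-tailed: {pareto_hits}")
# The bell curve produces essentially none; the long tail produces thousands.
```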

That idea of expected value can be elaborated into what is known to engineers as the risk equation, to provide a measure of overall loss or gain. We learned the value of this approach when I was the UK Security and Intelligence Coordinator in the Cabinet Office constructing the UK counter-terrorism strategy, CONTEST, after 9/11.[7] Our risk equation separated out the factors that contribute to the overall danger to the public so that actions could be designed to reduce each of them, as shown below.
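
The figure itself is not reproduced in this extract, but in the schematic form that the next paragraph describes (a reconstruction, with hypothetical factor names), it reads:

```latex
\text{Risk} \;=\;
\underbrace{P(\text{attack attempted})}_{\text{reduce by pursuit and prevention}}
\times
\underbrace{P(\text{attack succeeds}\mid\text{attempted})}_{\text{reduce by protective security}}
\times
\underbrace{\text{Impact}}_{\text{reduce by preparedness}}
```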

We can thus reduce the probability of terrorists attempting to attack us by detecting and pursuing terrorist networks and by preventing radicalization to reduce the flow of new recruits. We reduce society’s vulnerability to particular types of attack by more protective security measures, such as better airport screening. And we reduce the cost to the public of an attack, if the terrorists get through our defences, by preparing the emergency services to face the initial impact when an attack takes place and by investing in infrastructure that can be repaired quickly. This logic is a major reason why CONTEST remains the UK’s counter-terrorism strategy, despite the strategy having now seen five Prime Ministers and nine Home Secretaries. Military planners would recognize this lesson as applying ‘layered defence’,[8] just as the thief after an expensive bicycle might first have to climb over a high wall into the garden, dodge round the burglar alarms, break into the shed, then undo the bicycle lock. The chance of the thief succeeding undetected goes down with each layer of security that is added (and so does your overall risk).
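
In probability terms, if the layers can be treated as independent, the thief’s chance of getting through them all is the product of the per-layer chances. A minimal sketch (the per-layer probabilities are illustrative assumptions):

```python
# 'Layered defence': the chance of defeating every layer is the product of
# the chances of defeating each one, assuming the layers are independent.
# The per-layer probabilities below are illustrative assumptions.

layers = {
    "climb the garden wall": 0.7,
    "dodge the burglar alarm": 0.5,
    "break into the shed": 0.6,
    "undo the bicycle lock": 0.4,
}

p_thief_succeeds = 1.0
for layer, p in layers.items():
    p_thief_succeeds *= p
    print(f"after '{layer}': {p_thief_succeeds:.3f}")

# Four moderately porous layers leave the thief under a 10% chance of
# succeeding undetected, so the owner's overall risk falls accordingly.
```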

The search for strategic notice of long-term developments is sometimes referred to as horizon scanning, as in looking for the tops of the masts of the enemy ships just appearing. Global banks, consultancies and corporations such as Shell have acquired a reputation for horizon scanning to help their strategy and planning departments.[9] But we should remember that some important developments are like ships that have not yet put to sea – they may never come to threaten us if pre-emptive measures are taken early enough.

In the UK the chief scientists of government departments have assumed a leading role in providing strategic notice, an example being the UK Ministry of Defence’s 2016 Global Strategic Trends report looking ahead to 2045.[10] Another example is the 2016 report by the UK Chief Scientific Adviser on the potentially revolutionary implications of blockchain technology for government agencies, banks, insurance companies and other private-sector organizations.[11] The headline message is clear: watch out, a new disruptive technology is emerging that has the potential to transform the working of any organization that relies on keeping records. In the natural world, too, we have strategic notice of many serious issues that should make governments and companies sit up. One of the top risks flagged up by the UK government has long been a virus pandemic, alongside terrorism and cyberattacks.

It is worth bearing in mind when new technology appears which might pose risks to humans or the environment that most scientists prefer to hold back from offering advice until there is solid evidence on which to reach judgements. That understandable caution leads to the special category of ‘epistemic’ risk arising from a lack of knowledge or of agreed understanding, because experts are reluctant to commit as to whether the harm will ever crystallize or because they disagree among themselves as to its significance.

It is hard to predict when some theoretical scientific advance will result in brand-new technology that will impact heavily on our lives. Of the 2.5 million new scientific papers published each year,[12] very few represent breakthroughs in thinking. Even when a theoretical breakthrough opens the possibility of a revolution in technology, it may be years in gestation. Quantum computing provides a striking example where we have strategic notice of its potential, once such a machine is built, to factorize the very large numbers on which commercial encryption systems rely for secure internet communication and online payments. At the time of writing, however, no workable quantum computer at scale has been built that can fulfil the promise of the theory: it could be decades ahead. But we know that the impact when and if it happens will be significant. Wise governments will therefore be investing (as the US and the UK are[13]) in developing new types of cryptography that will be more resistant to quantum computers when they arrive; and no doubt asking their intelligence agencies to report any signs that somewhere else the practical problems of implementation look like being cracked.

At a personal level, where we find some of our risks easily visualized (such as coming back to the car park to find the car gone) and the costs are low, we can quickly learn to manage the risk (this causes us to get into the habit of checking whether we locked the car). Other personal risks that are more abstract, although more dangerous, may be unconsciously filed as the kind that happen to other people (such as returning home to find the fire brigade outside as a result of a short circuit in old wiring). We look but do not see the danger, just as in everyday life we can hear but not listen.

The term ‘risk’ conventionally carries the meaning of something bad that could happen. But as the economist F. H. Knight concluded many years ago, without risk there is no profit.[14] A further lesson in using strategic notice is how it can give advance notice of long-term opportunities that might present themselves. Perhaps the chosen route for the future high-speed rail link will go through the middle of the village (threat), or a station on the line will be built in a nearby town (opportunity).

Strategic notice has even become a fashionable theory governing the marketing of products in the internet age. Rather than the traditional clustering of products around common types of goods and services, it is increasingly the quirky niche products or services, which might appear at first sight unlikely to have mass appeal, that go viral on social media and quickly generate large returns. Entrepreneurs expect most such outlier efforts to fail, but those that succeed more than make up for the failures in profits earned. Who would have thought a few years ago that sportswear such as brightly coloured running shoes and jogging bottoms, then confined to the gym, would for many become staple outdoor wear?

Providing ourselves with strategic notice

Bangladesh Climate Geo-engineering Sparks Protests

April 4, 2033 – Dhaka

Bangladesh became the first country to try to slow climate change by releasing a metric ton of sulphate aerosol into the upper atmosphere from a modified Boeing 797 airplane in the first of six planned flights to reduce the warming effects of solar radiation. The unprecedented move provoked diplomatic warnings by 25 countries and violent public protests at several Bangladeshi Embassies, but government officials in Dhaka claimed its action was ‘critical to self-defense’ after a spate of devastating hurricanes, despite scientists’ warnings of major unintended consequences, such as intensified acid rain and depletion of the ozone layer.

Note the date on that news report. That surprising glimpse of the future in 2033 was included in the 2017 report on Global Trends published by the US National Intelligence Council.[15] The intelligence officers drafting the report included such headlines to bring to life their strategic assessments of possible developments out to 2030 and beyond, and the disruptive game-changers to be expected between now and then.

The then chair of the US National Intelligence Council, Professor Greg Treverton, explained in his foreword to the 2017 report that he examined global trends to identify their impact on power, governance and cooperation. In the near future, absent very different personal, political and business choices, he expects the current trajectory of trends and power dynamics to play out among rising international tensions. But what he expects to happen twenty years or more thereafter is explored through three stories or scenarios. The NIC report discusses the lessons these scenarios provide regarding potential opportunities and trade-offs in creating the future, rather than just responding to it.

It is possible to do long-term forecasting, starting with now and trying to work forwards into the far future on the basis of mega-trends in technology, national wealth, population and so on. That quickly runs into the problem that there are too many possible intersecting options that humankind might or might not take to allow an overall estimate of where we will end up. That problem is getting worse with the interdependencies that globalization has brought. One of the fascinating aspects of the US NIC report quoted above is the use of ‘backcasting’ as well as forecasting: working backwards from a number of postulated long-term scenarios to identify the factors that might influence which of those futures we might end up near. It is important in all such work to challenge conventional wisdom (an approach known as red teaming). When preparing the report the US NIC team visited thirty-five countries, including the UK, and canvassed ideas from academics and former practitioners such as myself, as well as serving government and military planners.

Using risk management in practice

A number of things have to go right in order for strategic warning to be translated into effective action at both national and international level, inside the private sector and in the home. Forecasts of risk need to be communicated effectively to those who can make use of the information. They in turn must be able to mobilize some kind of response to reduce, mitigate or transfer the risk. And there must be the capacity to learn lessons from experience of this process.

Advised by the assessments of the Joint Intelligence Committee, and by the National Risk Assessment[16] from the Civil Contingencies Secretariat, the UK’s National Security Council, chaired by the Prime Minister, promulgates the strategic threat priorities for government.[17] The 2015 Risk Assessment identified a major human pandemic as one of the most significant national security risks (in terms of likelihood and impact) facing the UK. A test of plans in 2016 exposed major gaps in capability. By the time COVID-19 struck in 2020 there was at least a national biological security strategy in place, although shortcomings still emerged, along with shortages of essential protective equipment.

A comparable role must be played by the boards of companies, charitable organizations and government agencies in ensuring that major risks are identified and monitored and that plans for managing them are in place. A useful lesson I have learned is to divide the risks into three groups. The first group consists of the risks that lie outside the influence of the business, such as a major disease outbreak: these are known as exogenous risks. The second group comprises the risks inherent in the nature of the business: banks suffer fraud, retailers suffer pilfering or ‘shrinkage’ of stock, trucking companies have accidents, and so on. The final group consists of the risks the company has taken on itself, such as an investment in a major IT upgrade on which the viability of the whole enterprise depends.

There is nothing most companies can do to eliminate the risks in the first group. But they can conduct periodic impact assessments and exercise contingency plans. Even MI6 got caught out in September 2000, when a dissident republican terrorist fired a small rocket at its Vauxhall Cross headquarters and the police then declared the building a crime scene and refused to allow staff back in until their investigations were complete, a more than trivial problem for an organization that has to operate 24/7.

For the second group of risks, those inherent in the nature of the business, discussion should be around the systems of control – for example, for cash flow, and whether there is sufficient pooling or transfer of risk through insurance or commercial alliance or partnership.

For the third category, the questions a company board must ask itself are much more pointed. Since the future of the organization depends on managing such changes successfully, directors need to ensure that they personally have visibility of progress and that they allocate enough of their time to ensuring that the key change managers have access to the expertise, delegated authority and finance needed for success.
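
A minimal sketch of how such a three-way register might be kept (the categories follow the text; the example entries, likelihoods and impacts are illustrative assumptions):

```python
# A toy risk register grouping risks the way the text suggests, scored by
# expected value (likelihood times impact). Entries are illustrative only.
from dataclasses import dataclass
from enum import Enum

class Group(Enum):
    EXOGENOUS = "outside our influence: assess impact, exercise contingency plans"
    INHERENT = "goes with the territory: control, pool or insure the risk"
    SELF_IMPOSED = "taken on by our own decisions: sustained board attention"

@dataclass
class Risk:
    name: str
    group: Group
    likelihood: float  # chance of occurring in a given year
    impact: float      # loss in millions if it does occur

    @property
    def expected_value(self) -> float:
        return self.likelihood * self.impact

register = [
    Risk("major disease outbreak", Group.EXOGENOUS, 0.02, 50.0),
    Risk("stock shrinkage", Group.INHERENT, 0.90, 0.5),
    Risk("critical IT upgrade fails", Group.SELF_IMPOSED, 0.10, 20.0),
]

for risk in sorted(register, key=lambda r: r.expected_value, reverse=True):
    print(f"{risk.name:<26} [{risk.group.name}] "
          f"EV {risk.expected_value:.2f}m: {risk.group.value}")
```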

We can all follow such a three-part discussion of the risks we face, even at the level of the family. Do we have sufficient medical insurance from work to cover unforeseen traffic and other accidents, or do we need extra cover? Is there adequate holiday insurance? Who has to meet the upfront cost of cancellations due to external disruption (such as COVID-19 in 2020 or the shutdown in 2018 of the busy Gatwick airport in the UK due to illegal drone flights over the runway)? Who has spare keys to the car or house in case of loss?

Conclusions: strategic notice

Having strategic notice of possible futures means we will not be so surprised by surprise. In this chapter we have looked at the perils of surprise, at not having strategic notice, and at what it means to say something is likely to happen. We examined the nature of surprise itself, sudden crises and slow-burn crises, how to think about the likelihood of events, and strategies for managing their risk. We looked at some of the ways in which long-term risks can be spotted and at the importance of communicating the results to achieve an alerted, but not alarmed, public. To learn to live with the expectation of surprise we should:

Search for strategic notice of relevant developments in technology, international and economic affairs, the environment and potential threats.

Think in terms of the expected value of events and developments (probability times impact), not just their likelihood.

Think about a register of your major risks and use a risk equation to identify and link the factors that contribute to the overall expected loss or gain.

Use strategic notice to spot opportunities as well as identify dangers.

Accept that you will usually suffer tactical surprise even when you have avoided strategic surprise.

Beware magical thinking, believing that one event occurs, or does not occur, as a result of another without plausible causation and thus wishing away what strategic notice is telling you.

Group risks into those you can do nothing about (but might want to prepare for and exercise contingency plans); those that go with the territory (where you can take sensible precautions); and those risks that accompany your major decisions (since your future depends on them you need to ensure they get the right priority and attention).
