Lesson 3: Estimations

Predictions need an explanatory model as well as sufficient data
In mid-August 1968, I was driving an elderly Land Rover with friends from university along the Hungarian side of the border with Czechoslovakia on the first stage of an expedition to eastern Turkey. To our surprise we found ourselves having to dodge in and out of the tank transporters of a Soviet armoured column crawling along the border. We did not realize – and nor did the Joint Intelligence Committee in London – that those tank crews already had orders to cross the border and invade Czechoslovakia as part of a twin strategy of intimidation and deception being employed by Yuri Andropov, then KGB chairman, to undermine the reform-minded government in Prague of Alexander Dubček.
US, UK and NATO intelligence analysts were aware of the Soviet military deployments, which could not be hidden from satellite observation and signals intelligence (I joined GCHQ a year later and learned how that had been done). The Western foreign policy community was also following the war of words between Moscow and Prague over Dubček’s reform programme. They shared Czech hopes that, in Dubček’s memorable campaign slogan, ‘socialism with a human face’ would replace the rigidities of Stalinist doctrine.
Dubček had run for the post of First Secretary of the Party on a platform of increased freedom of the press and of speech and movement; an economic emphasis on consumer goods; a reduction in the powers of the secret police; and even the possibility of multi-party elections. Dubček was in a hurry, with the wind of popular support behind him. But he was clearly
and repeatedly ignoring warnings from Moscow that he was going too far too fast. In 1968, Prague was at risk of slipping from under Moscow’s control.
In the JIC, senior intelligence and policy officials met with representatives of the ‘5-eyes’ to consider whether Moscow would use
military force as it had done in Hungary in 1956.2 This is the stage of analysis that the layperson might consider the most important, trying to predict for the policymakers what will happen next. This is very satisfying when it is achieved, although intelligence professionals shun the word ‘prediction’ as an overstatement of what is normally possible.
Analysts had no difficulty explaining the massing of tanks just on the other side of the Czech border as putting pressure on the reformist Czech government. The JIC analysts must have felt they had had good situational awareness and a credible explanation of what was going on at a military level. But they failed to take the next step and forecast the invasion and violent crushing of the reform movement. They reasoned that the Soviet Union would hold back from such crude direct intervention given the international condemnation that would undoubtedly follow. That verb ‘reasoned’ carries the explanation of why the analysts got it wrong: they were reasonable people trying to predict the actions of an unreasonable regime. When they put themselves in the shoes of the decisionmakers in Moscow, they still thought exclusively from their own perspective.
We now know from historical research much more than the analysts would have known at the time about the resolve of the Soviet leadership to crush the Czech reforms. Western intelligence analysts would probably have come to a different conclusion about the Soviet willingness to take huge risks if they had known of the active measures being taken against the Czech reformers, masterminded by Yuri Andropov, Head of the KGB.
That the key inner adviser to President Brezhnev in Moscow was Andropov should have triggered alarm. Andropov had form. As Soviet Ambassador in Budapest in 1956, he had played a decisive role in convincing the Soviet leader, Nikita Khrushchev, that only the ruthless use of military force would end the Hungarian uprising. It was a movement that had started with student protests but had ended up with an armed revolt to install a new government committed to free elections and a withdrawal from the Warsaw Pact.
One of the main instruments being employed by Andropov was the use of ‘illegals’. The West found that out much later, in 1992, with the reporting of Vasili Mitrokhin, the Soviet KGB archivist and MI6 source. He revealed how specially selected and trained KGB officers had been sent in 1968 into Czechoslovakia, disguised as tourists, journalists, businessmen and students, equipped with false passports from West Germany, Austria, the UK, Switzerland and Mexico. Each illegal was given a monthly allowance of $300, travel expenses and enough money to rent a flat, in the expectation that the Czech dissidents would more readily confide in Westerners. Their role was not only to penetrate reformist circles such as the Union of Writers, radical journals, the universities and political groupings, but also to take ‘active measures’ to blacken the reputation of the dissidents. The Soviet Prime Minister loudly complained of Western provocations and sabotage (with the alleged uncovering of a cache of American weapons and with a faked document purporting to show a US plan for overthrowing the Prague regime). He used such arguments to justify Soviet interference in Czechoslovak affairs even though they were, in fact, the work of the KGB ‘illegals’.
In August 1968, under the pretext of preventing an imperialist plot, the Soviet Union despatched armies from Russia and four other Warsaw Pact countries to invade Czechoslovakia, taking over the airport and public buildings and confining Czech soldiers to barracks. Dubček and his colleagues were flown to Moscow under KGB escort, where, under considerable intimidation, they accepted the reality of complying with the demands of their occupiers.
Today we have seen Moscow using all these tactics from the Soviet playbook to prevent Ukraine orientating itself towards the EU. Yet, despite their understanding of Soviet history, Western analysts failed to predict the Russian seizure of Crimea and the armed intervention in eastern Ukraine. Analysts knew of past Soviet use of methods involving intimidation, propaganda and dirty tricks, including the use of the little grey men of the KGB infiltrated into Czechoslovakia in 1968. Yet the appearance of ‘little green men’ in Ukraine, as the Russian special forces were dubbed by the media, came as a surprise.
Modelling the path to the future
The task of understanding how things will unfold is like choosing the most likely route to be taken across a strange country by a traveller you have equipped with a map that sets down only some of the features of the landscape. You know that all maps simplify to some extent; the perfect map, as described satirically by Jonathan Swift in Gulliver’s Travels, is one that has a scale of 1 to 1 and thus is as big and detailed as the ground being
mapped.3 There are blank spots on the traveller’s map: ‘here be dragons’, as the medieval cartographers labelled areas where they did not have enough information. The important lesson is that reality itself has no blank spots: the problems you encounter are not with reality but with how well you are able to map it.
An example of getting the modelling of future international developments right was the 1990 US National Intelligence Council estimate ‘Yugoslavia Transformed’ a decade after the death of its autocratic ruler, the
former Partisan leader Marshal Tito.4 The US analysts understood the dynamics of Tito’s long rule. He had forged a federation from very different and historically warring peoples: Serbs, Croats, Slovenes and Bosnian Muslims. As so often happens with autocrats ruling divided countries (think about Iraq under Saddam, Libya under Gaddafi), Tito ruled by balancing the tribal loyalties. For every advantage awarded to one group there had to be counter-balancing concessions in other fields to the other groups. Meanwhile a tough internal security apparatus loyal to Tito and the concept of Yugoslavia identified potential flashpoints to be defused and dissidents to be exiled. After Tito’s death the centre could not long hold. The Serb leadership increasingly played the Serb nationalist and religious card and looked for support to Moscow. The Croats turned to the sympathy of Catholic fellowship in Germany. The Bosnian Muslims put their faith in the international community and the United Nations for protection. The US 1990 estimate summarized the future of the former Yugoslavia in a series of unvarnished judgements that read well in the light of subsequent developments in the Balkans as described in the previous chapter:
Yugoslavia will cease to function as a federal state within one year and will probably break up within two. Economic reform will not stave off the break-up …
There will be a protracted armed uprising by the Albanians in Kosovo. A full-scale, interrepublic war is unlikely but serious intercommunal
violence will accompany the breakup and will continue thereafter. The violence will be intractable and bitter.
There is little that the US and its European allies can do to preserve Yugoslav unity. Yugoslavs will see such efforts as contradictory to advocacy of democracy and self-determination … the Germans will pay lip service to the idea of Yugoslav integrity, whilst quietly accepting the dissolution of the Yugoslav state.
In London, analysts shared the thrusts of the US intelligence assessment on Yugoslavia. But the government of John Major did not want to get involved in what promised to be an internecine Balkan civil war, always the bloodiest kind of conflict. The Chiefs of Staff could see no British interest worth fighting for. I recall attending the Chiefs of Staff Committee and reporting on the deteriorating situation but having Bismarck’s wisecrack thrown back at me, that the pacification of the turbulent Balkans was not worth the healthy bones of a single Pomeranian grenadier.
There can be many reasons for failure to predict developments correctly. One of the most common reasons is simply the human temptation to indulge in magical thinking, imagining that things will turn out as we want without any credible causal explanation of how that will come about. We do this to shield ourselves from the unwelcome truth that we may not be able to get what we want. The arguments over the handling of the UK Brexit process say it all.
The choice between being more right or less wrong
It is easy to criticize analysts when they fail to warn of some aggressive act. They know that they will be accused of an intelligence failure. As a rule of thumb, analysts will tend to risk a false positive by issuing a warning estimate rather than risk the accusation of failure after a negative report failed to warn. The costs of not having a timely warning if the event does happen are usually greater than the costs of an unnecessary warning when it does not. Cynics might also argue that analysts are realists and they know that if they issue a warning but the event does not take place there will be many exculpatory reasons that can be deployed for events not turning out that way. On the other hand, if policymakers are badly surprised by events after a failure to warn there will be no excuses accepted.
Analysts are faced in those circumstances with an example of the much-studied false-positive/false-negative quality control problem.5 This is the same dilemma faced by car manufacturers, who inspect cars as they leave the factory and have to set the testing to a desired rate of defective vehicles passing the inspection (taken to be safe but actually not, a false positive). They know that such vehicles are likely to break down and have to be recalled at great cost, and that the company’s reputation and sales will suffer; but they know as well that if too many vehicles are wrongly rejected as unsafe (taken to be unsafe but actually not, a false negative) the company will also incur large unnecessary costs in reworking them. This logic applies even more forcibly with medicines and foodstuffs. As consumers we expect foods labelled as nut-free to be just that, in order to avoid the potentially lethal risk to those allergic to nuts. The consequence, however, is that the manufacturer will need a rigorous testing system achieving very low false-positive rejection rates, and that will put up the false-negative rejection rates, which is likely to add significant cost to the product. We can expect the cursor on most manufacturing industry inspection systems to be set towards avoiding more false positives at the expense of more false negatives. The software industry, however, is notorious, for cost reasons, for tolerating a high false-positive rate, preferring to issue endless patches and updates as the customers themselves find the flaws the hard way by actually using the software.
An obvious application in intelligence and security work is in deciding whether an individual has shown sufficient associations with violent extremism to be placed on a ‘no-fly’ list. Policymakers would want the system to err on the side of caution. That means accepting rather more false negatives, which will of course seriously inconvenience an individual falsely seen as dangerous because they will not be allowed to fly, as the price for having a very low level of false positives (falsely seen as safe when not, which could lead, in the worst case, to a terrorist destroying a passenger aircraft by smuggling a bomb on board). Another example is the design of algorithms for intelligence agencies to pull out information relating to terrorist suspects from digital communications data accessed in bulk. Set the cursor too far in the direction of false positives and too much material of no intelligence interest will be retrieved, wasting valuable analyst time and risking unnecessary invasion of privacy; set the cursor too
far towards false negatives and the risk rises of not retrieving the material being sought and of terrorists escaping notice. There is no optimal solution possible without weighing the relative penalties of a false positive as against a false negative. At one extreme, as we will see in the next chapter, is the so-called precautionary principle, whereby the risk of harm to humans means there can be no false positives. Application of such a principle inevitably drives the false-negative rate, and the costs that come with it, much higher.
The false-positive/false-negative dilemma occurs with algorithms that have to separate data into categories. Such algorithms are trained on a large set of historic data where it is known which category each example falls into (such as genuinely suspect/not-suspect), and the AI program then works out the most efficient indicators to use in categorizing the data. Before the algorithm is deployed into service, however, the accuracy of its output needs to be assessed against the known characteristics of the input. Simply setting the rule at a single number so that, say, 95 per cent of algorithmic decisions are expected to be correct in comparison with the known training data is likely to lead to trouble, depending upon the ratio of false positives to false negatives in the result and the penalty associated with each. One way of assessing the algorithm in its task is to define its precision as the number of true positives as a proportion of all the positives that the algorithm thinks it has detected in the training data. Accuracy is often measured as the number of true positives and true negatives as a proportion of the total number in the training set. A statistical technique that can be useful with big data sets is to chart the true-positive rate against the false-positive rate at each setting of the rule and to look at the area under the resulting curve (AUC) as a measure of overall success in the task.7
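As a concrete sketch of these measures, the fragment below computes precision, accuracy and a trapezium-rule AUC for a small set of invented scores and labels; the data and the 0.5 threshold are illustrative assumptions, not anything taken from a real system.

```python
# Sketch of the metrics discussed above, on invented data.
# Convention: "positive" means the algorithm flags the item as suspect.

def confusion(scores, labels, threshold):
    """Count true/false positives and negatives at a decision threshold."""
    tp = fp = tn = fn = 0
    for s, y in zip(scores, labels):
        flagged = s >= threshold
        if flagged and y:
            tp += 1
        elif flagged and not y:
            fp += 1
        elif not flagged and y:
            fn += 1
        else:
            tn += 1
    return tp, fp, tn, fn

def auc(scores, labels):
    """Area under the ROC curve (true-positive rate charted against
    false-positive rate), integrated with the trapezium rule."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = [(0.0, 0.0)]
    for t in sorted(set(scores), reverse=True):
        tp, fp, _, _ = confusion(scores, labels, t)
        points.append((fp / neg, tp / pos))
    return sum((x1 - x0) * (y0 + y1) / 2
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

# Invented training data: the algorithm's score and the known true category.
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,    1,   0,   0,   1,   0]

tp, fp, tn, fn = confusion(scores, labels, threshold=0.5)
precision = tp / (tp + fp)            # true positives among flagged items
accuracy = (tp + tn) / len(labels)    # correct decisions overall
```

Moving the threshold up trades false positives against false negatives, which is exactly the ‘cursor’ described above, while the AUC summarizes performance across every possible threshold setting at once.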
Reluctance to act on intelligence warnings
The policy world may need shaking into recognizing that they have to take warnings seriously. In April 1993 I accompanied the British Defence Secretary, Malcolm Rifkind, to the opening of the Holocaust Museum in Washington. The day started with a moving tribute at Arlington Cemetery to the liberators of the concentration camps. I remembered the sole occasion my father had spoken to me of the horror of entering one such just liberated
camp in 1944 when he was serving as an officer in the Black Watch on the Eighth Army A Staff. It was a memory that he had preferred to suppress. Later that day Elie Wiesel, the Nobel Peace Prize winner, spoke passionately in front of President Bill Clinton, President Chaim Herzog of Israel and a large crowd of dignitaries about the need to keep the memory of those horrors alive. He issued an emotional appeal to remember the failure of the Allied powers to support the Warsaw Ghetto uprising and the Jewish
resistance.8 He quoted the motto chiselled in stone over the entrance to the Holocaust Museum: ‘For the dead and the living we must bear witness’. Then, turning directly to face President Clinton and the First Lady, Hillary Clinton, he reminded them: ‘We are also responsible for what we are doing with those memories … Mr President, I cannot not tell you something. I have been in the former Yugoslavia last Fall … I cannot sleep since over what I have seen. As a Jew I am saying that we must do something to stop the bloodshed in that country! People fight each other and children die. Why? Something, anything, must be done.’
His message, genocide is happening again in Europe, and it is happening on your watch, Mr President, and the Allies are once again doing nothing, was heard in an embarrassed silence, followed by loud applause from the survivors of the camps who were present. Later that year the UN Security Council did finally mandate a humanitarian operation in Bosnia, the UN Protection Force (UNPROFOR), for which the UK was persuaded to provide a headquarters and an infantry battle group. As the opening of the previous chapter recounted, that small peacekeeping force in their blue helmets and white-painted vehicles sadly proved inadequate faced with the aggression of both Bosnian Serbs and Croats, and was helpless to stop the massacre of Bosnian Muslims at Srebrenica in the summer of 1995.
Providing leaders with warnings is not easy. The ancient Greek myth of Cassandra, one of the princesses of Troy and daughter of King Priam, relates that she was blessed by the god Apollo with the gift of foreseeing the future. But when she refused the advances of Apollo she was placed under a curse which meant that, despite her gift, no one would believe her. She tried in vain to warn the inhabitants of Troy to beware Greeks bearing gifts. The giant wooden horse, left by the Greeks as they pretended to lift the siege of the city, was nevertheless pulled inside the walls. Odysseus and his soldiers who were hidden inside climbed out at night and opened the city gates to the invading Greek Army. As Cassandra had cried out in the
streets of Troy: ‘Fools! ye know not your doom … Oh, ye believe not me,
though ne’er so loud I cry!’9 Not to have their warnings believed has been the fate of many intelligence analysts over the years and will be again. The phenomenon is known to the intelligence world as the Cassandra effect.
It might have been doubts about Cassandra’s motives that led to her information being ignored. In 1982 there were warnings from the captain of the ice patrol ship HMS Endurance in the South Atlantic, who was monitoring Argentine media, that the point was close at which the Junta would lose patience with diplomatic negotiations. But these warnings were discounted with a very human reaction of ‘Well, he would say that, wouldn’t he’, given that his ship was to be withdrawn from service under the cuts in capability imposed by the 1981 defence expenditure review. It is also quite possible that Cassandra might have made too many predictions in the past that led to nothing and created what is known as warning fatigue. We know this as crying wolf, from Aesop’s fable. That might in turn imply that the threshold for warning was set too low: the whole village should not have been turned out on a single shout of ‘wolf’ (but remember the earlier discussion of false positives and false negatives, and how raising the warning threshold increases the risk of a real threat being ignored). Sending signals which lead to repeated false alarms is an ancient tactic to inure the enemy to the real danger. Warnings also have to be sufficiently specific to allow sensible action to be taken. Simply warning that there is a risk of possible political unrest in the popular holiday destination of Ruritania does not help the tourist know whether or not to cancel their holiday on the Ruritanian coast.
Perhaps poor Cassandra was simply not thought a sufficiently credible source for reasons unconnected with the objective value of her intelligence reporting. Stalin was forewarned of the German surprise attack on the Soviet Union in 1941 by reports from well-placed Soviet intelligence sources, including the Cambridge spies, some of whom had access to Bletchley Park Enigma decrypts of the German High Command’s signals. But he discounted the reporting as too good to be true and therefore assumed a deliberate attempt by the Allies to get him to regard Germany as an enemy and to discount the guarantees of peace in the 1939 Molotov–Ribbentrop non-aggression pact that Stalin had approved two years earlier.
A final lesson from the failure of the Trojans to act on Cassandra’s warning might be that the cost of preventive action can be seen as too great.
Legend has it that the Trojans were concerned about angering their gods if they refused the Greek offering of the wooden horse. We may ignore troubling symptoms if we fear that a visit to the doctor will result in a diagnosis that prevents us being able to fly to a long-promised holiday in the sun.
Expressing predictions and forecasts as probabilities
It is sadly the case that only rarely can intelligence analysts be definitive in warning what will happen next. Most estimates have to be hedged with caveats and assumptions. Analysts speak therefore of their degree of belief in a forward-looking judgement. Such a degree of belief is expressed as a probability of being right. This is a different use of probability from that associated with gambling games like dice or roulette, where the frequency with which a number comes up provides data from which the probability of a particular outcome is estimated. When we throw a fair die we know that the probability that the next throw will come up with a six is 1/6. We know the odds we ought to accept on a bet that this is what will happen. That is the frequentist interpretation of probability. By analogy, we think of the odds that intelligence analysts would rationally accept on their estimate being right. That is the measure of their degree of belief in their judgement.
It is of course a subjective interpretation of probability.10 Intelligence analysts prefer – like political pollsters – forecasts that attach a probability to each of a range of possible outcomes. For example, the US Director of National Intelligence, Dan Coats, predicted in a worldwide threat assessment given to the Senate Intelligence Committee that competitors such as Russia, China and Iran ‘probably already are looking to the 2020 U.S. elections as an opportunity to advance their
interests’.11 ‘Probably’ here is likely to mean 55–70 per cent, which can be thought of as the gambling odds the analysts should accept for being right (a judgement of around 70 per cent probable equates roughly to a bookmaker offering odds of 2 to 1 on).
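The translation between a degree of belief and betting odds can be sketched directly. The conversion p/(1−p) is the standard one; the example value is simply the top of the ‘probably’ range quoted above.

```python
# Convert a probability of being right into equivalent fractional odds.
# p > 0.5 gives odds "on" (p to 1-p); p < 0.5 gives odds "against".

def to_odds(p):
    """Return (ratio, direction), e.g. to_odds(0.7) -> (2.33..., 'on')."""
    if not 0.0 < p < 1.0:
        raise ValueError("probability must be strictly between 0 and 1")
    if p >= 0.5:
        return p / (1 - p), "on"
    return (1 - p) / p, "against"

ratio, direction = to_odds(0.70)   # top of the 'probably' range
# 0.70 corresponds to about 2.3 to 1 on; a flat 2 to 1 on would imply
# a probability of exactly two-thirds.
```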
When a forecast outcome is heavily dependent on external events, that is usually expressed as an assumption so that readers of the assessment understand that dependency. The use of qualifying words such as ‘unlikely’, ‘likely’ and so on is standardized by professional intelligence analysts. The
UK yardstick was devised by the Professional Head of the Intelligence Analysis profession (PHIA) in the Cabinet Office, and is in use across the British intelligence community, including with law enforcement. The example of the yardstick below is taken from the annual National Strategic
Assessment (NSA) by the UK National Crime Agency.12

Probability and Uncertainty
Throughout the NSA, the ‘probability yardstick’ (as defined by the Professional Head of Intelligence Assessment (PHIA)) has been used to ensure consistency across the different threats and themes when assessing probability. The following defines the probability ranges considered when such language is used:
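The yardstick’s ranges can be set out as data, as in the sketch below. The boundary figures follow openly published versions of the PHIA yardstick and should be treated as illustrative, since exact boundaries vary slightly between editions; the deliberate gaps between the ranges are a feature of the UK approach.

```python
# Approximate PHIA probability-yardstick ranges (from openly published
# versions; exact boundaries vary slightly between editions).
# Note the deliberate gaps between the ranges.
YARDSTICK = [
    ("remote chance",         0.00, 0.05),
    ("highly unlikely",       0.10, 0.20),
    ("unlikely",              0.25, 0.35),
    ("realistic possibility", 0.40, 0.50),
    ("likely/probable",       0.55, 0.75),
    ("highly likely",         0.80, 0.90),
    ("almost certain",        0.95, 1.00),
]

def term_for(p):
    """Return the yardstick term covering probability p, or None if p
    falls into one of the deliberate gaps between ranges."""
    for term, low, high in YARDSTICK:
        if low <= p <= high:
            return term
    return None
```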
The US Intelligence Community also has published a table showing how to express a likelihood in ordinary language (line 1 of the table below) and in probabilistic language (line 2 of the table, with the corresponding
confidence level in line 3).13
One difference between the approach taken by the UK and the US analysts is in the use of gaps between the ranges in the UK case. The intention is to avoid the potential problem with the US scale over what term you use if your judgement is ‘around 20 per cent’. Two analysts can have a perfectly reasonable, but unnecessary, argument over whether something is ‘very unlikely’ or ‘unlikely’. The gaps obviate the problem. The challenge
is over what to do if the judgement falls within one of the gaps. If an analyst can legitimately say that something is ‘a 75–80 per cent chance’, then they are free to do so. The yardstick is a guide and a minimum standard, but analysts are free to be more specific or precise in their judgements if they can. It is sensible to think in 5 or 10 per cent increments to discourage unjustified precision for which the evidence is unlikely to be available. I recommend this framework in any situation in which you have to make a prediction. It is very flexible, universally applicable, and extremely helpful in aiding your decisionmaking and in communicating it to others. You could start off by reminding yourself, the next time you say it is ‘unlikely’ to rain, that that still leaves a one in five chance of a downpour. You might well accept that level of risk and not bother with a coat. But if you were badly run down after a bout of flu, even a 20 per cent chance of getting soaked and developing a fever would be a risk not worth running. That is an example of examining the expected value of the outcome, not just its likelihood, formed by multiplying together the probability of an event and a measure of the consequences for you of it happening.
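The expected-value reasoning in the rain example can be made explicit; the cost figures below are invented purely for illustration.

```python
# Expected cost = probability of the event x the cost to you if it happens.

def expected_cost(probability, cost_if_it_happens):
    return probability * cost_if_it_happens

P_RAIN = 0.20   # 'unlikely' rain still leaves a one-in-five chance

# For a healthy walker a soaking is a minor nuisance (invented cost: 5).
minor = expected_cost(P_RAIN, cost_if_it_happens=5)

# For someone run down after flu a soaking risks a fever (invented cost: 100).
severe = expected_cost(P_RAIN, cost_if_it_happens=100)

# The probability is identical, but the expected costs differ twentyfold,
# which is why the two people sensibly make different decisions about a coat.
```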
The limits of prediction
The science fiction writer Isaac Asimov in his Foundation and Empire books imagined a future empirical science of psychohistory, where recurring patterns in civilizations on a cosmic scale could be modelled
using sociology, history and mathematical statistics.14 Broad sweeps of history could, Asimov fantasized, be forecast in the same way as statistical mechanics allows the behaviour of large numbers of molecules in a gas to be predicted, although the behaviour of individual molecules cannot (being subject to quantum effects). Asimov’s fictional creator of psychohistory, Dr Hari Seldon, laid down key assumptions that the population whose behaviour was being modelled should be sufficiently large and that the population should remain in ignorance of the results of the application of psychohistorical analyses because, if it became so aware, there would be feedback changing its behaviour. Other assumptions include that there would be no fundamental change in human society and that human nature and reactions to stimuli would remain constant. Thus, Asimov reasoned, the occurrence of times of crisis at an intergalactic scale could be forecast, and
guidance provided (by a hologram of Dr Seldon) by constructing time vaults that would be programmed to open when the crisis was predicted to arise and the need would be greatest.
Psychohistory will remain fantasy. Which is perhaps just as well. The main problem with such ideas is the impossibility of sufficiently specifying the initial conditions. Even with deterministic equations in a weather-forecasting model, after a week or so the divergence between what is forecast and what is observed becomes too large to allow the prediction to be useful. And often in complex systems the model is non-linear, so small changes can quickly become large ones. There are inherent limits to forecasting reality. Broad sweeps may be possible but not detailed predictions. There comes a point when the smallest disturbance (the iconic flapping of a butterfly’s wings) sets in train a sequence of cascading changes that tip weather systems over, resulting in a hurricane on the other side of the world. The finer the scale being used to measure forecasts in international affairs, the more variables that need to be taken into account, the greater the number of imponderables and assumptions, and the less
accurate the long-term forecast is liable to be.15
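The sensitivity to initial conditions described above can be illustrated with the logistic map, a standard toy model of chaotic behaviour; the parameter r = 4.0 and the starting values are conventional illustrative choices, not anything drawn from a real forecasting model.

```python
# Two runs of the logistic map x -> r * x * (1 - x), a standard toy
# chaotic system, started a billionth apart. The tiny initial
# difference grows until the two 'forecasts' bear no relation to
# each other -- the butterfly effect in miniature.

def trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.400000000)
b = trajectory(0.400000001)   # differs only in the ninth decimal place

early_gap = abs(a[5] - b[5])  # after 5 steps: still microscopically small
late_gap = max(abs(x - y) for x, y in zip(a[30:], b[30:]))
# after 30+ steps: the gap is of the same order as the values themselves
```

Broad statistical properties of such a system remain predictable even when individual trajectories are not, which is the distinction the text draws between broad sweeps and detailed predictions.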
Even at the level of physical phenomena, not every activity is susceptible to precise modelling. Exactly when a radioactive atom will spontaneously decay cannot be predicted, although the number of such events in a given time can be known in terms of its probability of occurrence. The exact path a photon of light or an electron will take when passing through a narrow pair of slits can likewise only be predicted in advance in terms of probabilities (the famous double-slit experiment that demonstrates one of the key principles of quantum physics).
Secrets, mysteries and complex interactions
There is a deeper way of looking at intelligence, and that is to distinguish between secrets and mysteries. Secrets can be found out if the seeker has the ingenuity, skill and the means to uncover them. Mysteries are of a different order. More and more secrets will not necessarily unlock the mystery of a dictator’s state of mind. But intelligence officers trying to get inside the mind of a potential adversary have to do their best to make an assessment, since that will influence what the policymakers decide to do
next. Inferences can certainly be drawn, based on knowledge of the individuals concerned and on a reading of their motivations, together with general observation of human behaviour. But such a judgement will depend on who is making it. A neutral observer might come to a different view from an observer in a country at risk of being invaded.
Mysteries have a very different evidential status. They concern events that have not yet happened (and therefore may never happen). Yet it is solutions to such mysteries that the users of intelligence need. From the moment early in 1982 when the Argentine Junta’s Chief of Naval Staff, and chief hawk over the issue, Admiral Anaya, issued secret orders to his staff to begin planning the Falkland Islands invasion, there were secrets to collect. But whether, when it came to the crunch, the Junta as a whole would approve the resulting plan and order its implementation would remain a mystery until much later.
To make matters harder, there is often an additional difficulty due to the complex interactions16 involved. We now know in the case of the Junta in 1982 that it completely misread what the UK reaction would be to an invasion of the Falkland Islands. And, just as seriously, the Junta did not take sufficient account of the longstanding US/UK defence relationship in assessing how the US would react. It may not have recognized the personal relationship that had developed between the UK’s Defence Secretary, John Nott, and his US counterpart, Caspar Weinberger. Margaret Thatcher’s iron response in sending a naval Task Force to recover the Islands met with Weinberger’s strong approval, in part because it demonstrated to the Soviet Union that armed aggression would not be allowed to pay.
These distinctions are important in everyday life. There are many secrets that can in principle be found out if your investigations are well designed and sufficiently intrusive. In your own life, your partner may have kept texts on their phone from an ex that they have kept private from you. Strictly speaking, these are secrets that you could probably find a way of accessing covertly (I strongly advise you don’t. Your curiosity is not a sufficient reason for violating their privacy rights. And once you have done so, your own behaviour towards your partner, and therefore your partner’s towards you, is likely unconsciously to change). But whether you uncover the secrets or not, the mystery of why your partner kept them and whether they ever intend in the future to contact the ex remains unanswered, and not even your partner is likely to be certain of the answer. You would have the
secret but not the answer to the mystery, and that answer is likely to depend upon your own behaviour over coming months that will exercise a powerful influence on how your partner feels about the relationship. Prediction in such circumstances of complex interactions is always going to be hard.
Missing out on the lessons of Chapter 2 and leaping from situational awareness to prediction – for example, by extrapolating trends or assuming conditions will remain the same – is a common error, known as the inductive fallacy. It is equivalent to weather forecasting by simply looking out of the window and extrapolating: most of the time tomorrow’s weather follows naturally from today’s, but not when there is a rapidly developing weather front. Ignoring the underlying dynamics of weather systems will mean you get the forecast right much of the time but inevitably not always. And when you are wrong, as you are bound to be from time to time, you are liable to be disastrously wrong – for example, as a flash flood develops or an unexpected hurricane sweeps in. That holds as true for international affairs as it does for the rest of life: if you rely on assumptions, when you get it wrong, you get it really wrong. Experts are as likely to fall
I am fond of the Greek term phronesis to describe the application of practical wisdom to the anticipation of risks. As defined by the art historian Edgar Wind, it describes how good judgement can be applied to human conduct: a sound practical instinct for the course of events, an almost indefinable hunch that anticipates the future by remembering the past and thus judges the present correctly.18
Conclusions: estimates and predictions
Estimates of how events may unfold, and predictions of what will happen next, are crucially dependent on having a reliable explanatory model, as well as sufficient data. Even if we are not consciously aware of doing this, when we think about the future we are mentally constructing a model of our current reality and reaching judgements about how our chosen explanatory model would behave over time and in response to different inputs or stimuli. It will help to have identified what are the most important factors that are likely to affect the outcome, and how sensitive that outcome might
be to changes in circumstances. We are here posing questions of the ‘what next and where next?’ type. In answering them we should:
Avoid the inductive fallacy of jumping straight from situational awareness to prediction and use an explanatory model of how you think the key variables interact.
Be realistic about the limitations of any form of prediction, expressing results as estimates between a range of likely possibilities. Point predictions are hazardous.
Express your degree of confidence in your judgements in probabilistic language, taking care over consistent use of terms such as ‘likely’.
Remember to consider those less likely but potentially damaging outcomes as well as the most probable.
Be aware that seeking to reduce the level of false positives implies accepting an increased level of false negatives, and vice versa.
Do not confuse the capability of an individual or organization to act with an intent to act on their part.
Be aware of your cultural differences and prejudices when explaining the motivations and intent of another.
Distinguish between what you conclude based on information you have and what you think based on past experience, inference and intuition (secrets, mysteries and complexities).
Beware your own biases misleading you when you are trying to understand the motives of others.
Give warnings as active deliberative acts based on your belief about how events will unfold and with the intent of causing a change in behaviour or policy.
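The trade-off between false positives and false negatives noted in the list above can be illustrated with a toy warning system: scores for genuine threats and for innocent activity overlap, and wherever the alert threshold is set, pushing one error rate down pushes the other up. All numbers are invented for illustration.

```python
import random

random.seed(7)  # fixed seed for a reproducible sketch

# Hypothetical warning scores: genuine threats score higher on average
# than innocent activity, but the two distributions overlap.
threats = [random.gauss(2.0, 1.0) for _ in range(10_000)]
innocent = [random.gauss(0.0, 1.0) for _ in range(10_000)]

def rates(threshold):
    """Error rates if we raise an alert whenever the score >= threshold."""
    false_neg = sum(s < threshold for s in threats) / len(threats)
    false_pos = sum(s >= threshold for s in innocent) / len(innocent)
    return false_pos, false_neg

# Raising the threshold cuts false alarms but misses more real threats.
for t in (0.5, 1.0, 1.5):
    fp, fn = rates(t)
    print(f"threshold {t}: false positives {fp:.2f}, false negatives {fn:.2f}")
```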
A bill on guarantees of immunity for the former president has been submitted to the State Duma. It would bring federal law into line with the latest version of the Russian Constitution (adopted by a vote in the summer of 2020). The document not only complicates the procedure for stripping a former president of immunity, but in effect also allows the former head of state to go unpunished for certain crimes committed after resignation.
After the adoption of this bill, both the current and any former president could be deprived of immunity only for grave and especially grave crimes. That is, all crimes of small and medium gravity committed by a former president of Russia would remain unpunished. Meduza re-read the Criminal Code and listed most of the crimes that Dmitry Medvedev and Vladimir Putin could, in theory, get away with. Moreover, this concerns both past and future deeds.
Crimes against life and health Murder committed in a state of passion (Art. 107) Murder committed in excess of the limits of necessary defense (Art. 108) Causing death by negligence (Article 109) Driving a person to suicide or attempted suicide (part 1 of article 110) Inclination to commit suicide (part 1 and part 3 of Art. 110¹) Facilitating suicide (part 2 and part 3 of Art. 110) Intentional infliction of medium-gravity harm to health (Article 112) Infliction of grave or moderate harm to health in a state of passion (Article 113) Causing grave or moderate harm to health when the limits of necessary defense are exceeded (Article 114) Intentional infliction of slight harm to health (Article 115) Beating (Art. 116) Torment (part 1 of article 117) Causing grievous bodily harm through negligence (Article 118) Threats of murder or grievous bodily harm (Article 119) Coercion to remove human organs or tissues for transplantation (Article 120) Infection with a venereal disease (Art. 121) Infection with HIV (part 1, part 2 and part 4 of article 122) Obstruction of the provision of medical care (Art. 124¹) Leaving in danger (Art. 125)
Crimes against freedom, honor and dignity of the person Kidnapping (part 1 of article 126) Unlawful deprivation of liberty (parts 1 and 2 of article 127) Use of slave labor (Part 1 of Art. 127²) Libel (art. 128¹)
Crimes against sexual inviolability and sexual freedom of the person Compulsion to conduct of a sexual nature (Art. 133) Sexual intercourse and other actions of a sexual nature with a person under the age of sixteen (part 1 of article 134) Depraved actions (part 1 of article 135)
Crimes against constitutional human and civil rights and freedoms Discrimination (art. 136) Violation of privacy (art. 137) Violation of the secrecy of correspondence, telephone conversations, postal, telegraph or other messages (Article 138) Illegal circulation of special technical means intended for secretly obtaining information (Article 138¹) Violation of the inviolability of the home (Article 139) Obstruction of the exercise of electoral rights or of the work of election commissions (art. 141) Violation of the procedure for financing an election campaign (Article 141¹) Illegal issuance and receipt of a ballot paper, a referendum ballot paper or an all-Russian vote ballot paper (Art. 142²) Obstruction of the legitimate professional activities of journalists (parts 1 and 2 of article 144) Violation of copyright and related rights (parts 1 and 2 of article 146) Infringement of inventive and patent rights (Art. 147) Violation of the right to freedom of conscience and religion (Article 148) Obstruction of an assembly, rally, demonstration, procession or picket, or of participation in them (Art. 149)
Crimes against family and minors Involvement of a minor in the commission of a crime (part 1 of article 150) Involvement of a minor in the commission of antisocial acts (part 1 and part 2 of article 151) Retail sale of alcoholic beverages to minors (Art. 151¹) Involvement of a minor in the commission of acts that pose a danger to the life of the minor (Art. 151²) Substitution of a child, committed from mercenary or other base motives (Article 153) Illegal adoption (Article 154) Failure to fulfil the duties of raising a minor (Article 156) Failure to pay funds for the maintenance of children or disabled parents (Article 157)
Property crimes Theft (part 1 and part 2 of article 158) Fraud (part 1, part 2 and part 5 of article 159) Credit fraud (part 1 and part 2 of Art.159¹) Fraud in receiving payments (part 1 and part 2 of Art. 159²) Fraud using electronic means of payment (part 1 and part 2 of article 159³) Fraud in the field of insurance (part 1 and part 2 of article 159⁵) Fraud in the field of computer information (part 1 and part 2 of article 159⁶) Misappropriation or waste (part 1 and part 2 of article 160) Robbery (part 1 of article 161) Extortion (part 1 of article 163) Causing property damage by deception or breach of trust (Article 165) Unlawful seizure of a car or other vehicle without the purpose of theft (part 1 of article 166) Intentional destruction or damage to property (Article 167) Destruction or damage to property by negligence (Article 168)
Crimes in the field of economic activity Illegal business (art. 171) Production, purchase, storage, transportation or sale of goods and products without labeling and (or) applying information provided for by the legislation of the Russian Federation (part 1, part 1¹, part 3 and part 5 of article 171¹) Illegal organization and conduct of gambling (parts 1 and 2 of Art. 171²) Illegal production and (or) circulation of ethyl alcohol, alcoholic and alcohol-containing products (Art. 171³) Illegal retail sale of alcoholic and alcohol-containing food products (Article 171⁴) Illegal banking (part 1 of article 172) Illegal formation (creation, reorganization) of a legal entity (Art. 173¹) Illegal use of documents for the formation (creation, reorganization) of a legal entity (Art. 173²) Legalization (laundering) of funds or other property acquired by other persons in a criminal way (part 1, part 2 and part 3 of article 174) Legalization (laundering) of funds or other property acquired by a person as a result of a crime committed by him (part 1, part 2 and part 3 of Art. 174¹) Acquisition or sale of property knowingly obtained by criminal means (part 1 and part 2 of article 175) Unlawful receipt of a loan (Article 176) Malicious evasion of repayment of accounts payable (Article 177) Restriction of competition (part 1 of article 178) Coercion to complete a transaction or to refuse to complete it (part 1 of article 179) Illegal use of means of individualization of goods (works, services) (part 1, part 2 and part 3 of article 180) Illegal receipt and disclosure of information constituting commercial, tax or banking secrets (part 1, part 2 and part 3 of article 183) Unlawful influence on the result of an official sports competition or a spectacular commercial competition (part 1 of article 184) Market manipulation (part 1 of article 185³) Unlawful use of insider information (part 1 of article 185⁶) Illegal export from the Russian Federation or transfer of raw materials, materials, equipment, technologies, scientific and technical information, or illegal performance of work (provision of services) that can be used to create weapons of mass destruction, weapons and military equipment (parts 1 and 2 of Article 189) Illegal circulation of amber, jade or other semi-precious stones, precious metals, precious stones or pearls (part 1, part 2 and part 4 of article 191) Acquisition, storage, transportation or processing, for the purpose of sale, or the sale, of knowingly illegally harvested timber (Article 191¹) Performing currency transactions to transfer funds in foreign currency or the currency of the Russian Federation to the accounts of non-residents using forged documents (part 1 and part 2 of article 193¹) Evasion of payment of customs duties levied from an organization or an individual (parts 1 and 2 of article 194) Smuggling of cash and (or) monetary instruments (Art. 200¹) Smuggling of alcoholic beverages and (or) tobacco products (part 1 of Art. 200²)
Crimes against the interests of service in commercial and other organizations Commercial bribery (part 1, part 2, part 5 and part 6 of Art.204) Mediation in commercial bribery (part 1 and part 2 of Art.204¹) Petty commercial bribery (Article 204²)
Public Safety Crimes Public calls for terrorist activities, public justification of terrorism or propaganda of terrorism (Part 1 of Art. 205²) Failure to report a crime (art. 205⁶) Knowingly false reporting of an act of terrorism (part 1 and part 2 of article 207) Public dissemination of knowingly false information about circumstances posing a threat to the life and safety of citizens (Art.207¹) Public dissemination of knowingly false socially significant information, which entailed grave consequences (Article 207²) Calls for riots or participation in them, as well as calls for violence against citizens (part 3 of article 212) Hooliganism (part 1 of article 213) Vandalism (Art. 214) Decommissioning of life support facilities (part 1 and part 2 of Art. 215²) Unlawful entry into a guarded object (Art.215⁴) Illegal handling of nuclear materials or radioactive substances (part 1 and part 2 of article 220) Theft or extortion of nuclear materials or radioactive substances (part 1 of article 221) Illegal acquisition, transfer, sale, storage, transportation or carrying of weapons, their main parts, ammunition (part 1 of article 222) Illegal acquisition, transfer, sale, storage, transportation or carrying of explosives or explosive devices (part 1 of article 222¹) Illegal manufacture of weapons (Article 223) Negligent possession of firearms (Article 224)
Crimes against public health and public morals Illegal acquisition, storage, transportation, manufacture, processing of narcotic drugs, psychotropic substances or their analogues, as well as illegal acquisition, storage, transportation of plants containing narcotic drugs or psychotropic substances, or their parts containing narcotic drugs or psychotropic substances (part 1 Article 228) Illegal acquisition, storage or transportation of precursors of narcotic drugs or psychotropic substances, as well as illegal acquisition, storage or transportation of plants containing precursors of narcotic drugs or psychotropic substances, or their parts containing precursors of narcotic drugs or psychotropic substances (Article 228³) Illegal production, sale or shipment of precursors of narcotic drugs or psychotropic substances, as well as illegal sale or shipment of plants containing precursors of narcotic drugs or psychotropic substances, or parts thereof containing precursors of narcotic drugs or psychotropic substances (Part 1 of Art.228.) 
Induction to the consumption of narcotic drugs, psychotropic substances or their analogues (part 1 of article 230) Illegal cultivation of plants containing narcotic drugs or psychotropic substances or their precursors (part 1 of article 231) Organization or maintenance of dens, or the systematic provision of premises, for the consumption of narcotic drugs, psychotropic substances or their analogues (part 1 of Art. 232) Illegal issuance or forgery of prescriptions or other documents giving the right to receive narcotic drugs or psychotropic substances (Article 233) Illegal circulation of strong or poisonous substances for the purpose of sale (part 1 and part 2 of article 234) Illegal practice of medical or pharmaceutical activities (Article 235) Illegal production of medicines and medical devices (part 1 of article 235¹) Circulation of counterfeit, substandard and unregistered medicines and medical devices, and circulation of counterfeit dietary supplements (part 1 of Art. 238¹) Creation of a non-profit organization that infringes upon the personality and rights of citizens (Article 239) Involvement in prostitution (part 1 of article 240) Receiving sexual services from a minor (Art. 240¹) Organization of prostitution (part 1 of article 241) Illegal production and circulation of pornographic materials or objects (part 1 and part 2 of article 242) Destruction or damage of cultural heritage objects (historical and cultural monuments) of the peoples of the Russian Federation included in the unified state register of cultural heritage objects (historical and cultural monuments) of the peoples of the Russian Federation, identified cultural heritage objects, natural complexes, objects taken under state protection, or cultural values (part 1 of article 243) Illegal search for and (or) removal of archaeological objects from their places of occurrence (part 1 and part 2 of article 243²) Destruction or damage of military graves, as well as of monuments, steles, obelisks and other memorial structures or objects that perpetuate the memory of those killed in the defense of the Fatherland or its interests, or that are dedicated to the days of military glory of Russia (Article 243⁴) Desecration of the bodies of the dead and the places of their burial (Article 244) Cruelty to animals (Article 245)
Environmental crimes Water pollution (Art. 250) Air pollution (art. 251) Pollution of the marine environment (Art. 252) Violation of the legislation of the Russian Federation on the continental shelf and on the exclusive economic zone of the Russian Federation (Article 253) Damage to the land (Art. 254) Violation of the rules for the protection and use of subsurface resources (Article 255) Illegal extraction (catch) of aquatic biological resources (Art. 256) Violation of the rules for the protection of aquatic biological resources (Article 257) Illegal hunting (art. 258) Illegal extraction and circulation of especially valuable wild animals and aquatic biological resources belonging to the species included in the Red Book of the Russian Federation and (or) protected by international treaties of the Russian Federation (part 1 and part 1¹ of article 258¹) Destruction of critical habitats for organisms listed in the Red Book of the Russian Federation (Article 259) Illegal felling of forest plantations (parts 1 and 2 of article 260) Destruction or damage of forest plantations (part 1 and part 2 of article 261) Violation of the regime of specially protected natural areas and natural objects (Article 262)
Crimes against traffic safety and transport operation Violation of the rules of the road and the operation of vehicles (part 1, part 2, part 3 and part 5 of article 264) Destruction of vehicles or means of communication (Article 267) Committing, from hooligan motives, actions that threaten the safe operation of vehicles (Art. 267¹) Violation of the rules ensuring the safe operation of transport (Article 268)
Crimes in the field of computer information Unlawful access to computer information (part 1, part 2 and part 3 of article 272) Creation, use and distribution of malicious computer programs (part 1 and part 2 of article 273) Violation of the rules for the operation of means of storage, processing or transmission of computer information and information and telecommunication networks (Art.274) Unlawful influence on the critical information infrastructure of the Russian Federation (part 1 of article 274¹)
Crimes against the foundations of the constitutional system and state security Public calls to carry out extremist activities (Article 280) Public calls for the implementation of actions aimed at violating the territorial integrity of the Russian Federation (Art.280¹) Incitement to hatred or enmity, as well as humiliation of human dignity (part 1 of article 282) Disclosure of state secrets (part 1 of article 283) Unlawful receipt of information constituting a state secret (part 1 of article 283¹) Loss of documents containing state secrets (Article 284)
Crimes against state power, interests of civil service and service in local self-government bodies Taking a bribe (part 1 of article 290) Giving a bribe (part 1 and part 2 of article 291) Mediation in bribery (part 1 of article 291¹) Petty bribery (art. 291²)
Crimes against justice Obstruction of justice and of preliminary investigation (Art. 294) Threats or violent actions in connection with the administration of justice or the conduct of a preliminary investigation (part 1, part 2 and part 3 of article 296) Disrespect for the court (Art. 297) Defamation of a judge, juror, prosecutor, investigator, person conducting an inquiry, or an employee of the compulsory enforcement bodies of the Russian Federation (Article 298¹) Coercion to testify (part 1 of article 302) Knowingly false denunciation (part 1 and part 2 of article 306) Knowingly false testimony, expert or specialist opinion, or incorrect translation (Article 307) Refusal of a witness or victim to testify (Article 308) Bribery or coercion to give testimony, to evade giving testimony, or to give an incorrect translation (part 1, part 2 and part 3 of article 309) Disclosure of data from a preliminary investigation (Art. 310) Concealment of crimes (Art. 316)
Crimes against the order of administration Use of violence against a government official (part 1 of article 318) Insulting a government official (Article 319) Disclosure of information on security measures applied to an official of a law enforcement or regulatory body (Article 320) Illegal crossing of the State border of the Russian Federation (part 1 of article 322) Organization of illegal migration (part 1 of article 322¹) Fictitious registration of a citizen of the Russian Federation at the place of stay or at the place of residence in a residential building in the Russian Federation, and fictitious registration of a foreign citizen or stateless person at the place of residence in a residential building in the Russian Federation (Art. 322²) Illegal change of the State border of the Russian Federation (Article 323) Purchase or sale of official documents and state awards (Article 324) Theft or damage of documents, stamps or seals, or theft of excise stamps, special stamps or conformity marks (Article 325) Unlawful seizure of the state registration plate of a vehicle (Art. 325¹) Forgery or destruction of a vehicle identification number (Article 326) Forgery, manufacture or circulation of forged documents, state awards, stamps, seals or letterheads (Art. 327) Production, sale or use of counterfeit excise stamps, special stamps or conformity marks (part 1 and part 2 of article 327¹) Forgery of documents for medicines or medical devices or of packaging of medicines or medical devices (part 1 and part 2 of Art. 327²) Desecration of the State Emblem of the Russian Federation or the State Flag of the Russian Federation (Article 329) Arbitrariness (Article 330) Malicious evasion of duties defined by the legislation of the Russian Federation on non-profit organizations performing the functions of a foreign agent (Article 330¹) Failure to comply with the obligation to submit a notification that a citizen of the Russian Federation has citizenship (nationality) of a foreign state or a residence permit or other valid document confirming the right to his permanent residence in a foreign state (Art. 330²)
Crimes against the peace and security of mankind Public calls to unleash an aggressive war (Article 354) Rehabilitation of Nazism (Article 354¹)
AN ANALYST SEES: FOUR LESSONS IN ORDERING OUR THOUGHTS
Lesson 1: Situational awareness Our knowledge of the world is always fragmentary and incomplete, and is sometimes wrong
London, 11 p.m., 20 April 1961. In room 360 of the Mount Royal Hotel, Marble Arch, London, four men are waiting anxiously for the arrival of a fifth. Built in 1933 as rented apartments and used for accommodation by US Army officers during the war, the hotel was chosen by MI6 as a suitably anonymous place for the first face-to-face meeting of Colonel Oleg Penkovsky of Soviet military intelligence, the GRU, with the intelligence officers who would jointly run him as an in-place agent of MI6 and CIA. When Penkovsky finally arrived he handed over two packets of handwritten notes on Soviet missiles and other military secrets that he had smuggled out of Moscow as tokens of intent. He then talked for several hours explaining what he felt was his patriotic duty to mother Russia in exposing to the West the adventurism and brinkmanship of the Soviet leader, Nikita Khrushchev, and the true nature of what he described as the rotten two-faced Soviet regime.
The huge value of Penkovsky as a source of secret intelligence came from the combination of his being a trained intelligence officer and his access to the deepest secrets of the Soviet Union – military technology, high policy and personalities. He was one of the very few with his breadth of access allowed to visit London, tasked with talent spotting of possible
sources for Soviet intelligence to cultivate in Western business and scientific circles.
Penkovsky had established an acquaintance with a frequent legitimate visitor to Moscow, a British businessman, Greville Wynne, and entrusted him with his life when he finally asked Wynne to convey his offer of service to MI6. From April 1961 to August 1962 Penkovsky provided over 5500 exposures of secret material on a Minox camera supplied by MI6. His material alone kept busy twenty American and ten British analysts, and his 120 hours of face-to-face debriefings occupied thirty translators, producing 1200 pages of transcript.
At the same time, on the other side of the Atlantic, intelligence staffs worried about the military support being provided by the Soviet Union to Castro’s Cuba. On 14 October 1962 a U2 reconnaissance aircraft over Cuba photographed what looked to CIA analysts like a missile site under construction. They had the top secret plans Penkovsky had passed to MI6 showing the typical stages of construction and operation for Soviet medium-range missile sites. In the view of the CIA, without this information it would have been very difficult to identify which type of nuclear-capable missiles were at the launch sites and track their operational readiness. On 16 October President Kennedy was briefed on the CIA assessment and shown the photographs. By 19 October he was told a total of nine such sites were under construction and had been photographed by overflights. On 21 October the British Prime Minister, Harold Macmillan, was informed by President Kennedy that the entire US was now within Soviet missile range with a warning time of only four minutes. Macmillan’s response is recorded as ‘now the Americans will realize what we here in England have lived through these past many years’. The next day, after consultation with Macmillan, the President instituted a naval blockade of Cuba.
The Cuban missile crisis is a clear example of the ability intelligence has to create awareness of a threatening situation, the first component of the SEES model of intelligence analysis. The new evidence turned US analysts’ opinion on its head. They had previously thought the Soviets would not dare to attempt introducing nuclear missile systems in the Western hemisphere. Now they had a revised situational awareness of what the United States was facing.
There is a scientific way of assessing how new evidence should alter our beliefs about the situation we face, the task of the first stage of the SEES method. That is the Bayesian approach to inference, widely applied in
intelligence analysis, modern statistics and data analysis.3 The method is named after the Rev. Thomas Bayes, the eighteenth-century Tunbridge Wells cleric who first described it in a note on probability found among his papers after his death in 1761.
The Bayesian approach uses conditional probability to work backwards from seeing evidence to the most likely causes of that evidence existing. Think of the coin about to be tossed by a football referee to decide which side gets to pick which goal to attack in the first half of the game. To start with it would be rational to estimate that there is a 50 per cent probability that either team will win the toss. But what should we think if we knew that in every one of the last five games involving our team and the same referee we had lost the toss? We would probably suspect foul play and reduce our belief that we stand an even chance of winning the toss this time. That is what we describe as the conditional probability, given that we now know the outcome of previous tosses. It is different from our prior estimate. What Bayesian inference does in that case is give us a scientific method of starting with the evidence of past tosses to arrive at the most likely cause of those results, such as a biased coin.
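The coin-toss reasoning can be sketched numerically. The numbers here are invented for illustration: a 5 per cent prior that the referee is using a crooked coin, and a hypothetical bias under which we lose each toss with probability 0.9. Five lost tosses in a row then turn a fringe suspicion into roughly an even bet.

```python
# Two hypotheses about the referee's coin: fair, or biased against us.
# Both the 0.05 prior and the 0.9 bias are assumed values for illustration.
prior_fair, prior_biased = 0.95, 0.05

# Likelihood of the observed evidence (five lost tosses in a row)
# under each hypothesis.
like_fair = 0.5 ** 5    # 1/32
like_biased = 0.9 ** 5  # about 0.59

# Bayesian update: posterior is proportional to prior times likelihood,
# normalized by the overall probability of the evidence.
evidence = prior_fair * like_fair + prior_biased * like_biased
posterior_biased = prior_biased * like_biased / evidence

print(f"posterior probability the coin is biased: {posterior_biased:.2f}")
```

The striking feature is how quickly an unlikely hypothesis gains ground when the evidence fits it far better than the alternative.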
Bayesian inference helps us to revise our degree of belief in the likelihood of any proposition being true given our learning of evidence that bears on it. The method applies even when, unlike the coin-tossing example, we only have a subjective initial view of the likelihood of the proposition being true. An example would be the likelihood of our political party winning the next election. In that case it might then be new polling evidence that causes us to want to revise our estimate. We can ask ourselves how far the new evidence helps us discriminate between alternative views of the situation or, as we should term them, alternative hypotheses, about what the outcome is likely to be. If we have a number of alternatives open to us, and the evidence is more closely associated with one of them than the alternatives, then it points us towards believing more strongly that that is the best description of what we face.
The Bayesian method of reasoning therefore involves adjusting our prior degree of belief in a hypothesis on receipt of new evidence to form a posterior degree of belief in it (‘posterior’ meaning after seeing the
evidence). The key to that re-evaluation is to ask the question: if the hypothesis was actually true how likely is it that we would have been able to see that evidence? If we think that evidence is strongly linked to the hypothesis being true, then we should increase our belief in the hypothesis.
The analysts in the Defense Intelligence Agency in the Pentagon had originally thought it was very unlikely that the Soviet Union would try to introduce nuclear missiles into Cuba. That hypothesis had what we term a low prior probability. We can set this down precisely using notation that will come in handy in the next chapter. Call the hypothesis that nuclear missiles would be introduced N. We can write their prior degree of belief in N as a prior probability p(N) lying between 0 and 1. In this case, since they considered N very unlikely, they might have given p(N) a probability value of 0.1, meaning only 10 per cent likely.
The 14 October 1962 USAF photographs forced them to a very different awareness of the situation. They saw evidence, E, consistent with the details Penkovsky had provided of a Soviet medium-range nuclear missile installation under construction. The analysts suddenly had to face the possibility that the Soviet Union was introducing such a capability into Cuba by stealth. They needed to find the posterior probability p(N|E) (read as the reassessed probability of the hypothesis N given the evidence E where the word ‘given’ is written using the vertical line |).
The evidence in the photographs was much more closely associated with the hypothesis that these were Soviet nuclear missile launchers than any alternative hypothesis. Given the evidence in the photographs, they did not appear to be big trucks carrying large pipes for a construction site, for instance. The chances of the nuclear missile hypothesis being true given the USAF evidence will be proportionate to p(E|N), which is the likelihood of finding that evidence on the assumption that N is true. That likelihood was estimated as much greater than the overall probability that such photographs might have been seen in any case (which we can write as p(E)). The relationship between the nuclear missile hypothesis and the evidence seen, that of p(E|N) to p(E), is the factor we need to convert the prior probability p(N) to the posterior probability that the decisionmaker needs, p(N|E).
The Rev. Bayes gave us the rule to calculate what the posterior probability is:
p(N|E) = p(N) × [p(E|N) / p(E)]
Or, the new likelihood of something being the case given the evidence is found by adjusting what you thought was likely (before you saw the evidence) by how well the new evidence supports the claim of what could be happening.
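Bayes's rule translates directly into code. The prior p(N) = 0.1 comes from the text; the two likelihoods below are assumptions made for this sketch, with p(E) obtained by the rule of total probability.

```python
def bayes_update(prior: float, p_e_given_h: float, p_e: float) -> float:
    """Posterior p(H|E) = p(H) * p(E|H) / p(E)."""
    return prior * p_e_given_h / p_e

p_N = 0.1               # prior from the text: missiles thought unlikely
p_E_given_N = 0.9       # assumed: photos very likely if missiles present
p_E_given_not_N = 0.01  # assumed: such photos rare otherwise

# Overall probability of seeing the evidence (total probability)
p_E = p_E_given_N * p_N + p_E_given_not_N * (1 - p_N)

posterior = bayes_update(p_N, p_E_given_N, p_E)
print(f"p(N|E) = {posterior:.2f}")
```

Under these assumed likelihoods the posterior jumps from 0.1 to about 0.91; had the photographs been equally likely with or without missiles, p(E|N) would equal p(E) and the prior would be left unchanged.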
This is the only equation in this book. Despite wanting to talk as plainly as possible, I’ve included it because it turns words into precise calculable conditional likelihoods, which is what so much of modern data science is about. In the next chapter we examine how we can apply Bayes’s great insight to work backwards, inferring from observations the most likely causes of what we see.
The example of the Cuban missile crisis shows Bayesian logic in action to provide new situational awareness. For example, if the analysts had felt that the photographs could equally well have been of a civil construction site and so the photographs were equally likely whether or not N was true (i.e. whether or not these were nuclear missile launchers) then p(E|N) would be the same as p(E), and so the factor in Bayes’s rule is unity and the posterior probability is no different from the prior. The President would not be advised to change his low degree of belief that Khrushchev would dare try to introduce nuclear missiles into Cuba. If, on the other hand, E would be much more likely to be seen in cases where N is true (which is what the Penkovsky intelligence indicated), then it is a strong indicator that N is indeed true and p(E|N) will be greater than p(E). So p(N|E) rises significantly. For the Pentagon analysts p(N|E) would have been much nearer to 1, a near certainty. The President was advised to act on the basis that Soviet nuclear missiles were in America’s backyard.
Kennedy’s key policy insight in 1962 was recognition that Khrushchev would only have taken such a gamble over Cuba having been persuaded that it would be possible to install the missiles on Cuba covertly, and arm them with nuclear warheads before the US found out. The US would then have discovered that the Soviet Union was holding at immediate risk the entire Eastern seaboard of the US, but would have been unable to take action against Cuba or the missiles without running unacceptable risk. Once the missiles had been discovered before they were operational, it was then the Soviet Union that was carrying the risk of confrontation with the naval blockade Kennedy had ordered. Kennedy privately suggested a face-saving
ladder that Khrushchev could climb down (by offering later withdrawal of the old US medium-range missiles based in Turkey), which Khrushchev duly accepted. The crisis ended without war.
The story of President Kennedy’s handling of the Cuban missile crisis has gone down as a case study in bold yet responsible statecraft. It was made possible by having situational awareness – providing the what, who, where and when that the President needed based on Penkovsky’s intelligence on technical specifications about Soviet nuclear missiles, their range and destructive power, and how long they took to become operational after they were shipped to a given location. That last bit of intelligence persuaded Kennedy that he did not need to order air strikes to take out the missile sites immediately. His awareness of the time he had gave him the option of trying to persuade Khrushchev that he had miscalculated.
Bayesian inference is central to the SEES method of thinking. It can be applied to everyday matters, especially where we may be at risk of faulty situational awareness. Suppose you have recently been assigned to a project that looks, from the outside, almost impossible to complete successfully on time and in budget. You have always felt well respected by your line manager, and your view of the situation is that you have been given this hard assignment because you are considered highly competent and have an assured future in the organization. However, at the bottom of an email stream that she had forgotten to delete before forwarding, you notice that your manager calls you ‘too big for your boots’. Working backwards from this evidence you might be wise to infer that it is more likely your line manager is trying to pull you down a peg or two, perhaps by getting you to think about your ability to work with others, by giving you a job that will prove impossible. Do try such inferential reasoning with a situation of your own.
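The email example can be run through the same machinery. Every number here is an assumption for illustration; the point is only the direction of the update.

```python
# H: "my manager rates me highly". E: the 'too big for your boots'
# remark found in the email stream. All probabilities are assumed.
p_h = 0.8               # prior: you have always felt well respected
p_e_if_h = 0.05         # such a remark is unlikely if H is true
p_e_if_not_h = 0.6      # quite likely if she wants to deflate you

p_e = p_e_if_h * p_h + p_e_if_not_h * (1 - p_h)   # total probability
p_h_given_e = p_e_if_h * p_h / p_e

print(f"belief in H after seeing the email: {p_h_given_e:.2f}")
```

A strong prior of 0.8 falls to 0.25 once the awkward evidence is weighed, which is the inference the text suggests you should draw.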
Most intelligence analysis is a much more routine activity than the case of the Cuban missile crisis. The task is to try to piece together what’s going on by looking at fragmentary information from a variety of sources. The Bayesian methodology is the same in weighing information in order to be able to answer promptly the decisionmakers’ need to know what is happening, when and where and who is involved.
When data is collected in the course of intelligence investigations, scientific experiments or just in the course of web browsing and general observation, there is a temptation to expect that it will conform to a known
pattern. Most of the data may well fit nicely. But some may not. That may be because there are problems with the data (source problems in intelligence, experimental error for scientists) or because the sought-for pattern is not an accurate enough representation of reality. It may be that the bulk of the observations fit roughly the expected pattern. But more sensitive instruments or sources with greater access may also be providing data that reveals a new layer of reality to be studied. In the latter case, data that does not fit what has been seen before may be the first sighting of a new phenomenon that cries out to be investigated, or, for an intelligence officer, that could be the first sign that there is a deception operation being mounted. How to treat such ‘outliers’ is thus often the beginning of new insights. Nevertheless, it is a natural human instinct to discard or explain away information that does not fit the prevailing narrative. ‘Why spoil a good story’ is the unconscious thought process. Recognizing the existence of such cases is important in learning to think straight.
Penkovsky had quickly established his bona fides with MI6 and the CIA. But our judgements depend crucially on assessing how accurate and reliable the underlying information base is. What may be described to you as a fact about some event of interest deserves critical scrutiny to test whether we really do know the ‘who, what, where and when’. In the same way, an intelligence analyst would insist, when receiving a report from a human informant, on knowing whether this source had proved to be regular and reliable, like Penkovsky, or was a new untested source. Like the historian who discovers a previously unknown manuscript describing some famous event in a new way, the intelligence officer has to ask searching questions about who wrote the report and when, and whether they did so from first-hand knowledge, or from a sub-source, or even from a sub-sub-source with potential uncertainty, malicious motives or exaggeration being introduced at each step in the chain. Those who supply information owe the recipient a duty of care to label carefully each report with descriptions to help the analyst assess its reliability. Victims of village gossip and listeners to The Archers on BBC Radio 4 will recognize the effect.
The best way to secure situational awareness is when you can see for yourself what is going on, although even then be aware that appearances can be deceptive, as optical illusions demonstrate. It would always repay treating with caution a report on a social media chat site of outstanding bargains to be had on a previously unknown website. Most human eye-
witness reporting needs great care to establish how reliable it is, as criminal courts know all too well. A good intelligence example where direct situational awareness was hugely helpful comes from the Falklands conflict. The British authorities were able to see the flight paths of Argentine air force jets setting out to attack the British Task Force because they had been detected by a mountaintop radar in Chile, and the Chilean government had agreed their radar picture could be accessed by the UK.
Experienced analysts know that their choice of what deserves close attention and what can be ignored is a function of their mental state at the
time.4 They will be influenced by the terms in which they have been tasked but also by how they may have unconsciously formulated the problem. The analysts will have their own prejudices and biases, often from memories of previous work. In the words of the tradecraft primer for CIA officers:
‘These are experience based constructs of assumptions and expectations both about the world in general and more specific domains. These constructs strongly influence what information analysts will accept – that is, data that are in accordance with analysts’ unconscious mental models are more likely to be perceived and remembered than information that is at
odds with them.’5 Especial caution is needed therefore when the source seems to be showing you what you had most hoped to see.
The interception and deciphering of communications and the product of eavesdropping devices usually have high credibility with intelligence analysts because it is assumed those involved do not realize their message or conversation is not secure and therefore will be speaking honestly. But that need not be the case, since one party to a conversation may be trying to deceive the other, or both may be participating in an attempt to deceive a third party, such as the elaborate fake communications generated before the D-Day landings in June 1944 to create the impression of a whole US Army Corps stationed near Dover. That, combined with the remarkable double agent operation that fed back misleading information to German intelligence, provided the basis of the massive deception operation mounted for D-Day (Operation Fortitude). The main purpose was to convince the German High Command that the landings in Normandy were only the first phase with the main invasion to follow in the Pas de Calais. That intelligence-led deception may have saved the Normandy landings from disaster by persuading the German High Command to hold back an entire armoured division from the battle.
Unsubstantiated reports (at times little more than rumour) swirl around commercial life and are picked up in the business sections of the media and are a driver of market behaviour. As individuals, the sophisticated analysts of the big investment houses may well not be taken in by some piece of market gossip. But they may well believe that the average investor will be, and that the market will move as a consequence; they therefore have to make their investment decisions as if the rumour were true. It was that insight that enabled the great economist John Maynard Keynes to make so much money for his alma mater, King’s College Cambridge, in words much quoted today in the marketing material of investment houses: ‘successful investing is
anticipating the anticipation of others’.6 Keynes described this process in his General Theory as a beauty contest:
It is not a case of choosing those which, to the best of one’s judgment, are really the prettiest, nor even those which average opinion genuinely thinks the prettiest. We have reached the third degree where we devote our intelligences to anticipating what average opinion expects the average opinion to be. And there are
some, I believe, who practise the fourth, fifth and higher degrees.7
The Penkovsky case had a tragic ending. His rolls of film had to be delivered by dead drop in the teeth of Soviet surveillance using methods later made famous by John le Carré’s fictional spies, including the mark on the lamppost to indicate there was material to pick up. That task fell to Janet Chisholm, the wife of Penkovsky’s SIS case officer working under diplomatic cover in the Moscow Embassy. She had volunteered to help and was introduced to Penkovsky during one of his official visits to London. It was no coincidence therefore that her children were playing on the pavement of Tsvetnoy Boulevard while she watched from a nearby bench, at the exact moment Oleg Penkovsky in civilian clothes walked past. He chatted to the children and offered them a small box of sweets (that he had been given for that purpose during his meeting in London) within which were concealed microfilms of documents that Penkovsky knew would meet London’s and Washington’s urgent intelligence requirements. Similar drops of film followed. She was, however, later put under routine surveillance and by mischance she was seen making a ‘brush contact’ with a Russian who the KGB could not immediately identify but who triggered further
investigations. That and other slips made by Penkovsky led finally to his arrest. His go-between, the British businessman Greville Wynne, was then kidnapped during a business trip to Budapest, and put on show trial in Moscow alongside Penkovsky. Both were found guilty. Penkovsky was severely tortured and shot. Wynne spent several years in a Soviet prison until exchanged in 1964 in a spy swop for the convicted KGB spy Gordon Lonsdale (real name Konon Molody) and his cut-outs, an antiquarian bookseller and his wife, Peter and Helen Kroger, who had helped him run a spy ring against the UK Admiralty research establishment at Portland.
The digital revolution in information gathering
Today a Penkovsky could more safely steal secret missile plans by finding a way of accessing the relevant database. That is true for digital information of all kinds if there is access to classified networks. Digital satellite imagery provides global coverage. The introduction of remotely piloted aircraft with high-resolution cameras provides pin-sharp digitized imagery for operational military, security and police purposes, as well as for farming, pollution control, investigative journalism and many other public uses. At any incident there are bound to be CCTV cameras and individuals with mobile phones (or drones) that have high-resolution cameras able to take video footage of the event – and media organizations such as broadcasters advertise the telephone numbers to which such footage can be instantly uploaded. Every one of us is potentially a reconnaissance agent.
There is the evident risk that we end up with simply too much digital data to make sense of. The availability of such huge quantities of digitized information increases the importance of devising artificial intelligence
algorithms to sort through it and highlight what appears to be important.8 Such methods rely upon applying Bayesian inference to learn how best to search for the results we want the algorithms to detect. They can be very powerful (and more reliable than a human would be) if the task they are given is clear-cut, such as checking whether a given face appears in a large set of facial images or whether a specimen of handwriting matches any of those in the database. But these algorithms are only as reliable as the data on which they were trained, and spurious correlations are to be expected.
The human analyst is still needed to examine the selected material and to add meaning to the data.9
At the same time, we should remember that the digital world also provides our adversaries with ample opportunities to operate anonymously online and to hack our systems and steal our secrets. Recognition of these cyber-vulnerabilities has led the liberal democracies to give their security and intelligence agencies access to powerful digital intelligence methods, under strict safeguards, to be able to search data in bulk for evidence about those who are attacking us.
One side effect of the digitization of information is the democratization of situational awareness. We can all play at being intelligence analysts given our access to powerful search engines. Anyone with a broadband connection and a mobile device or computer has information power undreamed of in previous eras. There is a new domain of open-source intelligence, or OSINT. We use this ourselves when trying to decide which party to vote for in an election and want to know what each candidate stands for, or ascertaining the level of property prices in a particular area, or researching which university offers us the most relevant courses. The Internet potentially provides the situational awareness that you need to make the right decision. But like intelligence officers you have to be able to use it with discrimination.
The tools available to all of us are remarkable. Catalogues of image libraries can be searched to identify in fractions of a second a location, person, artwork or other object. Google Images has indexed over 10 billion photographs, drawings and other images. By entering an address almost anywhere in the world, Google Street View will enable you to see the building and take a virtual drive round the neighbourhood with maps providing directions and overlays of information. The position of ships and shipping containers can be displayed on a map, as can the location of trains across much of Europe.
With ingenuity and experience, an internet user can often generate situational awareness to rival that of intelligence agencies and major
broadcasting corporations. The not-for-profit organization Bellingcat10 is named after Aesop’s fable in which the mice propose placing a bell around the neck of the cat so that they are warned in good time of its approach but none will volunteer to put the bell on it. Bellingcat publishes the results of non-official investigations by private citizens and journalists into war
crimes, conditions in war zones and the activities of serious criminals. Its most recent high-profile achievement was to publish the real identities of the two GRU officers responsible for the attempted murder of the former MI6 agent and GRU officer Sergei Skripal and his daughter in Salisbury and the death of an innocent citizen.
It requires practice to become proficient in retrieving situational information from the 4.5 billion indexed pages of the World Wide Web (growing by about one million documents a day) and the hundreds of thousands of accessible databases. Many sites are specialized and may take skill, effort and inclination to find (a location map of fishing boats around the UK, for example, should you ever want to know, can be found at fishupdate.com).
Although huge, the indexed surface web accessible by a search engine is estimated to be only 0.03 per cent of the total Internet. Most of the Internet, the so-called deep web, is hidden from normal view, for largely legitimate reasons since it is not intended for casual access by an average user. These are sites that can only be accessed if you already know their location, such as corporate intranets and research data stores, and most will be password-protected. In addition to the deep web, a small part of the Internet is the so-called ‘dark web’ or ‘dark net’ with its own indexing, which can only be reached if specialist anonymization software such as Tor is being used to
hide the identity of the inquirer from law enforcement.11 The dark net thus operates according to different rules from the rest of the Internet that has become so much a part of all of our daily lives. An analogy for the deep web would be the many commercial buildings, research laboratories and government facilities in any city that the average citizen has no need to access, but when necessary can be entered by the right person with the proper pass. The dark net, to develop that cityscape analogy, can be thought of like the red-light district in a city with a small number of buildings (sometimes very hard to find), where access is controlled because the operators want what is going on inside to remain deeply private. At one time, these would have been speakeasies, illegal gambling clubs, flophouses and brothels, but also the meeting places of impoverished young artists and writers, political radicals and dissidents. Today it is where the media have their secure websites which their sources and whistleblowers can access anonymously.
I guess we have all cursed when clicking on the link for a web page we wanted brought up the error message ‘404 Page Not Found’. Your browser communicated with the server, but the server could not locate the web page where it had been indexed. The average lifespan of a web page is under 100 days so skill is needed in using archived web material to retrieve sites that have been mislabelled, moved or removed from the web. Politicians may find it useful that claims they make to the electorate can thus quickly disappear from sight, but there are search methods that can retrieve old web
pages and enable comparison with their views today.12 Most search engines use asterisks to denote wild cards, so a query that includes ‘B*n Lad*n’ will search through the different spellings of his name such as Ben Laden, Bin Laden (the FBI-preferred spelling), Bin Ladin (the CIA-preferred spelling) and so on. Another useful lesson is the use of the tilde, the ~ character on the keyboard. So prefacing a query term with ~ will result in a search for synonyms as well as the specific query term, and will also look for alternative endings. Finally, you can ask the search to ignore a word by placing a minus in front of it, as –query. The meta-search engine Dogpile will return answers taken from other search engines, including from Google and Yahoo.
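Exact wildcard syntax varies between search engines, but the shell-style pattern matching in Python's standard library is a reasonable stand-in to show how the asterisk behaves.

```python
from fnmatch import fnmatchcase

# 'B*n Lad*n' with shell-style wild cards matches the variant
# spellings the text mentions; fnmatchcase here stands in for a
# search engine's own wildcard handling.
pattern = "B*n Lad*n"
for name in ["Ben Laden", "Bin Laden", "Bin Ladin", "Bin Salman"]:
    verdict = "match" if fnmatchcase(name, pattern) else "no match"
    print(f"{name}: {verdict}")
```

The first three variants match because each asterisk absorbs the differing letters, while the last name fails on the literal ‘Lad’.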
The order in which results are presented to you after entering a search query into a search engine can give a misleading impression of what is important. The answers that are returned (miraculously in a very small fraction of a second) may have been selected in a number of different ways. The top answer may be as a result of publicity-based search – a form of product placement where a company, interest group or political party has paid to have its results promoted in that way (or has used one of the specialist companies that offer for a fee to deliver that result to advertisers). A search on property prices in an area will certainly flag up local estate agents who have paid for the marketing advantage of appearing high up on the page. The answers will also take account of the accumulated knowledge in the search database of past answers, and also which answers have been most frequently clicked for further information (a popularity-based search, thus tapping into a form of ‘wisdom of the crowd’). This can be misleading. While it may be interesting to see the results of a search for information about university courses that has been sorted by what were the most popular such searches, it is hardly helpful if what you want to know about is all the courses available that match your personal interests.
Finally, and perhaps most disturbingly, the suggested answers to the query may represent a sophisticated attempt by the algorithm to conduct a personalized search by working out what it is that the user is most likely to want to know (in other words, inferring why the question is being asked) from the user’s previous internet behaviour and any other personal information about the individual accessible by the search engine. Two different people entering the same search terms on different devices will therefore get a different ranking of results. My query ‘1984?’ using the Google Chrome browser and the Google search engine brings up George Orwell’s dystopian novel along with suggestions of how I can most conveniently buy or download a copy. Helpfully, the Wikipedia entry on the book is also high up on the first page of the 1.49 billion results I am being offered (in 0.66 seconds). The same query using the Apple Safari browser and its search engine brings up first an article about the year 1984 telling me it was a leap year. And a very different past browsing history might highlight references to the assassination of Indira Gandhi in 1984, or news that the release of the forthcoming film Wonder Woman 1984 has been postponed to 2020. Internet searching is therefore a powerful tool for acquiring the components of situational awareness. That is, for as long as we can rely on an open Internet. If the authorities were to have insisted that the search algorithms did not reference Orwell’s book in response to queries from their citizens about 1984 then we would indeed have entered Orwell’s dystopian world. That, sadly, is likely to be the ambition of authoritarian regimes that will try to use internet technology for social control.
Conclusions: lessons in situational awareness
In this chapter, we have been thinking about the first stage of SEES, the task of acquiring what I have termed situational awareness, knowing about the here and now. Our knowledge of the world is always fragmentary and incomplete, and is sometimes wrong. But something has attracted our attention and we need to know more. It may be because we have already thought about what the future may bring and had strategic notice of areas we needed to monitor. Or it may be that some unexpected observation or report we have received triggers us to focus our attention. There are lessons we can learn about how to improve our chances of seeing clearly what is
going on when answering questions that begin with ‘who, what, where and when’.
We should in those circumstances:
Ask how far we have access to sufficient sources of information.
Understand the scope of the information that exists and what we need to know but do not.
Review how reliable the sources of information we do have are.
If time allows, collect additional information as a cross-check before reaching a conclusion.
Use Bayesian inference to use new information to adjust our degree of belief about what is going on.
Be open and honest about the limitations of what we know, especially in public, and be conscious of the public reactions that may be triggered.
Be alive to the possibility that someone is deliberately trying to manipulate, mislead, deceive or defraud us.
In Russia, everyone is used to the fact that the government’s position on acute political, economic and even social issues does not originate from the agencies that are supposed to be responsible for those tasks. The Kremlin always has the final word. Often this does not mean President Vladimir Putin himself, but rather officials from his Presidential Executive Office. To get a fuller understanding of what Putin’s administration does, we asked Meduza political correspondent Andrey Pertsev to break down what exactly the Presidential Executive Office is, the scope of its formal (and informal) responsibilities, and the limits of its influence over what happens in Russia.
Russian political journalists regularly refer to the “Kremlin” as shorthand for the Presidential Executive Office of Russia (also known as the Presidential Administration of Russia, abbreviated in Russian to “AP”). The Moscow Kremlin became the official seat of the country’s top leadership almost immediately after the October Revolution in 1917, and the Russian Federation continued this tradition after the Soviet Union fell; the Kremlin Senate houses Russian President Vladimir Putin’s office.
The Presidential Executive Office itself is actually located in a complex of buildings in another part of town: on Staraya Square in the eastern part of Moscow’s Kitay-Gorod district. As a former official from the Presidential Executive Office recalled in conversation with Meduza, the AP’s offices used to be located inside the Kremlin’s so-called “14th building.” When that building was demolished, the office moved to Number Four Staraya Square.
The Presidential Executive Office of Russia was established under Boris Yeltsin in 1991. However, the regulations governing its work did not appear until 1993. An updated version of these rules from 1996 states that the AP “creates the conditions” for the president to determine the basic directions of domestic and foreign policy, resolve personnel issues, and ensure the “coordinated functioning” of the bodies of state power. This range of responsibilities is also set out in the current regulations on the presidential administration, which date back to 2004. These also state that the AP “exercises control” over the implementation of the president’s decisions.
In addition, the Presidential Executive Office is responsible for preparing the head of state’s annual address and the draft laws the president submits to parliament, “supporting interaction with political parties,” collecting and analysing information “on economic, political, and legal processes in the country and abroad,” as well as “organizing scientific research work, including with the involvement of experts.”
Formally, the Presidential Executive Office is simply the head of state’s working office. In practice, however, the more powers the Russian president holds, whether formally or informally, the more influential the AP becomes.
Several Meduza sources working in or close to people in the presidential administration maintain that the AP became an “operational hub” with a developed internal political bloc sometime between the early and mid-2000s. This happened under the influence of then First Deputy Chief of Staff Vladislav Surkov, who built a political system managed directly from inside the Kremlin. Thanks to Putin’s personal popularity, the ruling party, United Russia, achieved strong results in parliamentary elections (Putin was the party’s chairman from 2008 to 2012 and was then replaced by Dmitry Medvedev). Under unwritten rules, the systemic opposition parties (for example the Communist Party and the Liberal Democratic Party) were also required to consult with the AP, for instance when planning candidate nominations.
Although Surkov’s role in managing Russian politics was an open secret, the first person to discuss it publicly was businessman Mikhail Prokhorov, who, ahead of the 2011 parliamentary elections, was leading the party Pravoye Delo (Right Cause). Prokhorov called Vladislav Surkov the “main puppet master of the political process” and accused him of attempting to influence the formation of party lists.
Under Surkov’s successor, First Deputy Chief of Staff Vyacheslav Volodin, the administration’s influence over the systemic opposition parties only increased. Deputies from the Communist Party of the Russian Federation (KPRF), the Liberal Democratic Party (LDPR), and A Just Russia began running as candidates in gubernatorial elections (which were partially reintroduced in Russia in 2012). In reality, these candidates were selected inside the AP. At the same time, big businesses could make agreements with Putin directly on appointing their proteges as governors.
The presidential administration also holds sway over youth policy and cooperation with the civil society activists loyal to the authorities; this is managed by its public projects office under the leadership of Sergey Novikov, a longtime associate of the current First Deputy Chief of Staff Sergey Kiriyenko.
However, the Presidential Executive Office’s near-unlimited power in the public policy sphere doesn’t mean that it’s truly omnipotent. For example, the AP doesn’t run the government and isn’t involved in selecting candidates for the cabinet. Many key decisions on the economy and finance, and regarding law enforcement agencies, are made during meetings of the Security Council. It’s helpful to imagine the Security Council, which includes some of the country’s key figures, as a kind of board of directors; an analogy that in turn makes the AP a managing office.
The Presidential Executive Office is made up of around 20 offices, including the Security Council Office, the Presidential Advisers’ Office, and the Presidential Chancellery. The current Chief of Staff is Anton Vaino, and there are two First Deputy Chiefs of Staff, Sergey Kiriyenko and Alexey Gromov. The presidential envoys to the federal districts, the State Duma, the Federation Council, and the Constitutional Court are also part of Putin’s administration.
The AP also has nine presidential aides on staff (three of whom also head departments), who are responsible for preparing proposals and analysis for the president regarding the areas they oversee. Often, these aides are former high-ranking officials. For example, among the current presidential aides are former Transport Minister Igor Levitin, former Culture Minister Vladimir Medinsky, and former Economic Development Minister Maxim Oreshkin. Andrey Belousov, who served as Russia’s Economic Development Minister from 2012 until 2013, was Putin’s aide for almost seven years; he then returned to the cabinet as First Deputy Prime Minister.
In addition, the president has six advisers, including the Chairman of the Presidential Council for Human Rights, Valery Fadeev. Children’s Rights Commissioner Anna Kuznetsova is also part of the AP, alongside Business Ombudsman Boris Titov and the Special Presidential Representative for Environmental Protection, Ecology, and Transport, Sergey Ivanov.
Figures occupying public-facing positions, such as the head of the Human Rights Council and the Children’s Rights Commissioner, have become noticeably more conservative and loyal to the authorities in recent years.
The Presidential Executive Office has a formal hierarchy subordinate to its Chief of Staff; however, a source close to the administration describes Anton Vaino as the “first among equals.” “It was always like this. Yes, the chief of the AP is the leader but, in effect, everything is tied to the president himself. The AP’s deputy chiefs can make independent decisions within their competences,” explains a former Kremlin official. “Yes, the short-list of candidates for governor goes through the domestic policy department, then through the first deputy head of politics, and then is agreed upon with the chief. But without the approval of the politics deputy, the chief can’t carry out his own decision, for example, to ‘push’ a senator. The relevant deputy can outmaneuver the decision, he has access to the president.”
According to the source, in reality, the status and influence of each presidential aide and adviser depends on their individual relationship with Putin. “If a person has direct access to the president [and] the opportunity to enter the cabinet at his request at any time, [whether] they’re an adviser or an aide isn’t so important.”
The AP has a Foreign Policy Directorate, which is not to be confused with Russia’s Foreign Affairs Ministry. According to Meduza’s Kremlin source, presidential aide Yuri Ushakov is in charge of the AP’s foreign policy department. “The directorate and Ushakov deal with issues that concern Putin directly. The organization of his visits, the organization of forums involving the president within Russia. The Foreign Affairs Ministry [handles] routine matters — regular meetings of international bodies, some routine statements. Although, for example, Foreign Minister Sergey Lavrov’s speeches are still sent to the AP,” the source explained.
Pro-Kremlin propaganda on television and from major, state-owned media outlets is carried out by subordinates under First Deputy Chief of Staff Alexey Gromov. Several sources told Meduza that Gromov holds weekly meetings with the heads of the press services at key government agencies, whose reports in turn act as sources for the rest of the news media. During these briefings, Gromov outlines the main topics for the week ahead and provides guidance on how to present the government’s agenda “correctly.”
“When it comes to statements about important news events that arise unexpectedly, press service directors coordinate directly with [the head of the presidential administration’s public relations and communications department Alexander] Smirnov or even with Gromov personally,” Meduza’s source said. Gromov holds similar meetings with the heads of major traditional media outlets. “If there’s an urgent, topical issue he can call some editor-in-chief personally and he often does so,” a source close to the AP said.
On the other hand, attacks on the opposition on social networks and messaging platforms are the sphere of the AP’s domestic policy bloc under the leadership of First Deputy Chief of Staff Sergey Kiriyenko. However, Gromov also has groups (“nets”) of Telegram channels that are loyal to him.
Kremlin officials do not and cannot have a coordinated view on all issues. Each bloc and department has its own area of responsibility that it cannot go beyond. “For example, when Vyacheslav Volodin was the AP’s first deputy chief and domestic policy curator, he could have had his own views — for example, on policy regarding Ukraine and the self-proclaimed republics of Donbas. But Vladislav Surkov oversaw this area as a presidential aide, so Volodin couldn’t interfere,” a Meduza source who worked in the AP in the early 2010s explained.
In reporting on Russia, the phrase “the Kremlin thinks” is more or less a journalistic cliché. Officials from the president’s office often share their own views, which don’t necessarily reflect a position agreed upon with the entire AP or Putin himself. “Often in some of his answers to journalists’ questions Presidential Press Secretary Dmitry Peskov also says that this is his personal opinion,” Meduza’s source underscored.
Russian media and Telegram channels often use the phrase “tower wars” to describe political in-fighting among influential groups within the Putin administration. You’ll also hear people say “the Kremlin has many towers,” referring to the number of rival groups. For example, it’s believed that the “Kovalchuk group” opposes the “Rotenberg group,” whereas the FSB doesn’t get along so well with the Interior Ministry. These rival groups can compete for spheres of influence, positions in the federal government, and for the governorships of major regions. Sometimes, traces of these struggles come into public view.
However, there’s an informal understanding that Kremlin officials shouldn’t get involved in lobbying battles themselves and are required to stay above these conflicts — regardless of the fact that they might be close to one of the groups involved (for example, Sergey Kiriyenko is considered close to the “Kovalchuk group”). “In any case, all of [the Kremlin’s] decisions on serious issues are collegial and coordinated. The final decision is up to the president, but the agreed upon point of view goes to him for approval,” Meduza’s Kremlin source explained.
Nevertheless, conflicts can emerge among officials with overlapping spheres of influence. For example, media curator/First Deputy Chief of Staff Alexey Gromov has always had a tense relationship with the AP’s domestic policy curators, since the media also falls within the domestic policy bloc’s area of responsibility.
For a long time, there was also covert competition between Sergey Kiriyenko and State Duma Speaker Vyacheslav Volodin, who didn’t want to hand over the levers of political influence — first and foremost, United Russia — to Kiriyenko completely. The fight ended predictably with Volodin’s people leaving key posts within the ruling party. The Kremlin’s number one rule worked: each member of the “power vertical” handles the sphere that Vladimir Putin has set out for them.
I am now a visiting professor teaching intelligence studies in the War Studies Department at King’s College London, at Sciences Po in Paris and also at the Defence University in Oslo. My experience is that it really helps to have a systematic way of unpacking the process of arriving at judgements and establishing the appropriate level of confidence in them. The model I have developed – let me call it by an acronym that recalls what analysts do as they look at the world, the SEES model – leads you through the four types of information that can form an intelligence product, derived from different levels of analysis:
Situational awareness of what is happening and what we face now.
Explanation of why we are seeing what we do and the motivations of those involved.
Estimates and forecasts of how events may unfold under different assumptions.
Strategic notice of future issues that may come to challenge us in the longer term.
There is a powerful logic behind this four-part SEES way of thinking. Take as an example the investigation of far-right extremist violence. The
first step is to find out as accurately as possible what is going on. As a starting point, the police will have had crimes reported to them and will have questioned witnesses and gathered forensic evidence. These days there is also a lot of information available on social media and the Internet, but the credibility of such sources will need careful assessment. Indeed, even well-attested facts are susceptible to multiple interpretations, which can lead to misleading exaggeration or underestimation of the problem.
We need to add meaning so that we can explain what is really going on. We do that in the second stage of SEES by constructing the best explanation consistent with the available evidence, including an understanding of the motives of those involved. We see this process at work in every criminal court when prosecution and defence barristers offer the jury their alternative versions of the truth. For example, why are the fingerprints of an accused on the fragments of a beer bottle used for a petrol bomb attack? Was it because he threw the bottle, or is the explanation that it was taken out of his recycling box by the mob looking for material to make weapons? The court
has to test these narratives and the members of the jury have then to choose the explanation that they think best fits the available evidence. The evidence rarely speaks for itself. In the case of an examination of extremist violence, in the second stage we have to arrive at an understanding of the causes that bring such individuals together. We must learn what factors influence their anger and hatred. That provides the explanatory model that allows us to move on to the third stage of SEES, when we can estimate how the situation may change over time, perhaps following a wave of arrests made by the police and successful convictions of leading extremists. We can estimate how likely it is that arrest and conviction will lead to a reduction in threats of violence and public concern overall. It is this third step that provides the intelligence feedstock for evidence-based policymaking.
The SEES model has an essential fourth component: to provide strategic notice of longer-term developments. Relevant to our example we might want to examine the further growth of extremist movements elsewhere in Europe or the impact on such groups were there to be major changes in patterns of refugee movements as a result of new conflicts or the effects of climate change. That is just one example, but there are very many others where anticipating future developments is essential to allow us to prepare sensibly for the future.
The four-part SEES model can be applied to any situation that concerns us and where we want to understand what has happened and why and what may happen next, from being stressed out at a situation at work to your sports team losing badly. SEES is applicable to any situation where you have information, and want to make a decision on how best to act on it.
We should not be surprised to find patterns in the different kinds of error tending to occur when working on each of the four components of the SEES process. For example:
Situational awareness suffers from all the difficulties of assessing what is going on. Gaps in information exist and often evoke a reluctance to change our minds in the face of new evidence.
Explanations suffer from weaknesses in understanding others: their motives, upbringing, culture and background.
Estimates of how events will unfold can be thrown out by unexpected developments that were not considered in the forecast.
Strategic developments are often missed due to too narrow a focus and a lack of imagination as to future possibilities.
The four-part SEES approach to assessment is not just applicable to affairs of state. At heart it contains an appeal to rationality in all our thinking. Our choices, even between unpalatable alternatives, will be sounder as a result of adopting systematic ways of reasoning. That includes being able to distinguish between what we know, what we do not know and what we think may be. Such thinking is hard. It demands integrity.
Buddhists teach that there are three poisons that cripple the mind: anger,
attachment and ignorance.7 We have to be conscious of how emotions such as anger can distort our perception of what is true and what is false. Attachment to old ideas with which we feel comfortable and that reassure us that the world is predictable can blind us to threatening developments. This is what causes us to be badly taken by surprise. But it is ignorance that is the most damaging mental poison. The purpose of intelligence analysis is to reduce such ignorance, thereby improving our capacity to make sensible decisions and better choices in our everyday lives.
On that fateful day in March 1982 Margaret Thatcher had immediately grasped what the intelligence reports were telling her. She understood what the Argentine Junta appeared to be planning and the potential consequences for her premiership. Her next words demonstrated her ability to use that insight: ‘I must contact President Reagan at once. Only he can persuade Galtieri [General Leopoldo Galtieri, the Junta’s leader] to call off this madness.’ I was deputed to ensure that the latest GCHQ intelligence was being shared with the US authorities, including the White House. No. 10 rapidly prepared a personal message from Thatcher to Reagan asking him to speak to Galtieri and to obtain confirmation that he would not authorize any landing, let alone any hostilities, and warning that the UK could not acquiesce in any invasion. But the Argentine Junta stalled requests for a Reagan conversation with Galtieri until it was much too late to call off the invasion.
Only two days later, on 2 April 1982, the Argentine invasion and military occupation of the Islands duly took place. There was only a small detachment of Royal Marines on the Islands and a lightly armed ice patrol ship, HMS Endurance, operating in the area. No effective resistance was possible. The Islands were too far away for sea reinforcements to arrive within the two days’ notice the intelligence had given us, and the sole
airport had no runway capable of taking long-distance troop-carrying aircraft.
We had lacked adequate situational awareness from intelligence on what the Junta was up to. We had failed to understand the import of what we did know, and therefore had not been able to predict how events would unfold. Furthermore, we had failed over the years to provide strategic notice that this situation was one that might arise, and so had failed to take steps that would have deterred an Argentine invasion. Failures in each of the four stages of SEES analysis.
All lessons to be learned.
How this book is organized
The four chapters in the first part of this book are devoted to the aforementioned SEES model. Chapter 1 covers how we can establish situational awareness and test our sources of information. Chapter 2 deals with causation and explanation, and how the scientific method of Bayesian inference allows us to use new information to alter our degree of belief in our chosen hypothesis. Chapter 3 explains the process of making estimates and predictions. Chapter 4 describes the advantage that comes from having strategic notice of long-term developments.
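As a rough numerical sketch of the Bayesian updating that Chapter 2 describes, the calculation below shows how one piece of new evidence shifts an analyst's degree of belief. The probabilities are invented purely for illustration; they are not taken from any real assessment in the book.

```python
# Bayesian updating: revise belief in a hypothesis as new evidence arrives.
# All numbers here are invented solely to illustrate the mechanics.

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Start by believing an invasion is 20% likely.
belief = 0.20

# A new intercept is judged nine times more likely to appear
# if an invasion is being prepared (0.9) than if it is not (0.1).
belief = update(belief, p_evidence_if_true=0.9, p_evidence_if_false=0.1)

print(round(belief, 2))  # belief rises to roughly 0.69
```

A single report that is far more probable under the hypothesis than under its negation more than triples the degree of belief here; further reports would be folded in the same way, each taking the updated belief as the new prior.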
There are lessons from these four phases of analysis in how to avoid different kinds of error, failing to see what is in front of us, misunderstanding what we do see, misjudging what is likely to follow and failing to have the imagination to conceive of what the future may bring.
Part Two of this book has three chapters, each drawing out lessons in how to keep our minds clear and check our reasoning.
We will see in Chapter 5 how cognitive biases can subconsciously lead us to the wrong answer (or to fail to be able to answer the question at all). Being forewarned of those very human errors helps us sense when we may be about to make a serious mistake of interpretation.
Chapter 6 introduces us to the dangers of the closed-loop conspiratorial mindset, and how it is that evidence which ought to ring alarm bells can too often be conveniently explained away.
The lesson of Chapter 7 is to beware deliberate deceptions and fakes aimed at manipulating our thinking. There is misinformation, which is false
but circulated innocently; malinformation, which is true but is exposed and circulated maliciously; and disinformation, which is false, and that was known to be false when circulated for effect. The ease with which digital text and images can be manipulated today makes these even more serious problems than in the past.
Part Three explores three areas of life that call for the intelligent use of intelligence.
The lessons of Chapter 8 are about negotiating with others, something we all have to do. The examples used come from extraordinary cases of secret intelligence helping to shape perceptions of those with whom governments have to negotiate, and of how intelligence can help build mutual trust – necessary for any arms control or international agreement to survive – and help uncover cheating. We will see how intelligence can assist in unravelling the complex interactions that arise from negotiations and confrontations.
Chapter 9 identifies how you go about establishing and maintaining lasting partnerships. The example here is the successful longstanding ‘5-eyes’ signals intelligence arrangement between the US, the UK, Canada, Australia and New Zealand, drawing out principles that are just as applicable to business and even to personal life.
The lesson of Chapter 10 is that our digital life provides new opportunities for the hostile and unscrupulous to take advantage of us. We can end up in an echo chamber of entertaining information that unconsciously influences our choices, whether over products or politics. Opinion can be mobilized by controlled information sources, with hidden funding and using covert opinion formers. When some of that information is then revealed to be knowingly false, confidence in democratic processes and institutions slowly ebbs away.
The concluding chapter, Chapter 11, is a call to shake ourselves awake and recognize that we are all capable of being exploited through digital technology. The lessons of this book put together an agenda to uphold the values that give legitimacy to liberal democracy: the rule of law; tolerance; the use of reason in public affairs and the search for rational explanations of the world around us; and our ability to make free and informed choices. When we allow ourselves to be over-influenced by those with an agenda, we erode our free will and that is the gradual erosion of an open society. Nobody should be left vulnerable to the arguments of demagogues or snake
oil salesmen. The chapter and the book therefore end on an optimistic note.
We can learn the lessons of how to live safely in this digital world.
David Omand was the first UK Security and Intelligence Coordinator, responsible to the Prime Minister for the professional health of the intelligence community, national counter-terrorism strategy and ‘homeland security’. He served for seven years on the Joint Intelligence Committee. He was Permanent Secretary of the Home Office from 1997 to 2000, and before that Director of GCHQ.
For Keir, Robert, Beatrice and Ada, in the hope that
you will grow up in a better world
Why we need these lessons in seeking independence of mind, honesty and integrity
Westminster, March 1982. ‘This is very serious, isn’t it?’ said Margaret Thatcher. She frowned and looked up from the intelligence reports I had handed her. ‘Yes, Prime Minister,’ I replied, ‘this intelligence can only be read one way: the Argentine Junta are in the final stages of preparing to invade the Falkland Islands, very likely this coming Saturday.’
It was the afternoon of Wednesday, 31 March 1982.
I was the Principal Private Secretary to the Defence Secretary, John Nott. We were in his room in the House of Commons drafting a speech when an officer from the Defence Intelligence Staff rushed down Whitehall with a locked pouch containing several distinctive folders. I knew immediately from the red diagonal crosses on their dark covers that they contained top secret material with its own special codeword (UMBRA), denoting that they came from the Government Communications Headquarters (GCHQ).
The folders contained decrypted intercepts of Argentine naval communications. The messages showed that an Argentine submarine had been deployed on covert reconnaissance around the Falklands capital, Port Stanley, and that the Argentine Fleet, which had been on exercises, was reassembling. A further intercept referred to a task force said to be due to arrive at an unstated destination in the early hours of Friday, 2 April. From their analysis of the coordinates of the naval vessels, GCHQ had concluded that the task force’s destination must be the Falkland Islands.
John Nott and I looked at each other with but one thought: the loss of the Falkland Islands would bring a major existential crisis for the government
of Margaret Thatcher: the Prime Minister must be told at once. We hurried down the Commons corridor to her room and burst in on her.
The last assessment she had received from the UK Joint Intelligence Committee (JIC) had told her that Argentina did not want to use force to secure its claim to the sovereignty of the Falkland Islands. However, the JIC had warned that if there was highly provocative action by the British towards Argentine nationals, who had landed illegally on the British South Atlantic island of South Georgia, then the Junta might use this as a pretext for action. Since the UK had no intention of provoking the Junta, the assessment was wrongly interpreted in Whitehall as reassuring. That made the fresh intelligence reports all the more dramatic. It was the first indication that the Argentine Junta was ready to use force to impose its claim.
The importance for us of being able to reason
The shock of seeing the nation suddenly pitched into the Falklands crisis is still deeply etched in my memory. It demonstrated to me the impact that errors in thinking can have. This is as true for all life as it is for national statecraft. My objective in writing this book therefore is an ambitious one: I want to empower people to make better decisions by learning how intelligence analysts think. I will provide lessons from our past to show how we can know more, explain more and anticipate more about what we face in the extraordinary age we now live in.
There are important life lessons in seeing how intelligence analysts reason. By learning what intelligence analysts do when they tackle problems, by observing them in real cases from recent history, we will learn how they order their thoughts and how they distinguish the likely from the unlikely and thus make better judgements. We will learn how to test alternative explanations methodically and judge how far we need to change our minds as new information arrives. Sound thinkers try to understand how their unconscious feelings as individuals, as members of a group and within an institution might affect their judgement. We will also see how we can fall victim to conspiracy thinking and how we can be taken in by deliberate deception.
We all face decisions and choices, at home, at work, at play. Today we have less time than ever to make up our minds. We are in the digital age, bombarded with contradictory, false and confusing information from more sources than ever. Information is all around us and we feel compelled to respond at its speed. There are influential forces at play ranged against us pushing specific messages and opinions through social media. Overwhelmed by all this information, are we less, or more, ignorant than in previous times? Today more than ever, we need those lessons from the past.
Looking over the shoulder of an intelligence analyst
Over the centuries, generals naturally learned the advantage that intelligence can bring. Governments today deliberately equip themselves with specialist agencies to access and analyse information that can help
them make better decisions.2 Britain’s Secret Intelligence Service (MI6) runs human agents overseas. The Security Service (MI5) and its law enforcement partners investigate domestic threats and conduct surveillance on suspects. The Government Communications Headquarters (GCHQ) intercepts communications and gathers digital intelligence. The armed forces conduct their share of intelligence gathering in their operations overseas (including photographic intelligence from satellites and drones). It is the job of the intelligence analyst to fit all the resulting pieces together. They then produce assessments that aim to reduce the ignorance of the decisionmakers. They find out what is happening, they explain why it is
happening and they outline how things might develop.3
The more we understand about the decisions we have to take, the less likely it is that we will duck them, make bad choices or be seriously surprised. Much of what we need can come from sources that are open to anyone, provided sufficient care is taken to apply critical reasoning to them.
Reducing the ignorance of the decisionmaker does not necessarily mean simplifying. Often the intelligence assessment has to warn that the situation is more complicated than they had previously thought, that the motives of an adversary are to be feared and that a situation may develop in a bad way. But it is better to know than not. Harbouring illusions on such matters leads to poor, or even disastrous, decisions. The task of the intelligence officer is
to tell it as it is to government. When you make decisions, it is up to you to do the same to yourself.
The work of intelligence officers involves stealing the secrets of the dictators, terrorists and criminals who mean us harm. This is done using human sources or technical means to intrude into the privacy of personal correspondence or conversations. We therefore give our intelligence officers a licence to operate by ethical standards different from those we would hope to see applied in everyday life, justified by the reduction in harm to the
public they can achieve.4 Authoritarian states may well feel that they can dispense with such considerations and encourage their officers to do whatever they consider necessary, regardless of law or ethics, to achieve the objectives they have been set. For the democracies such behaviours would quickly undermine confidence in both government and intelligence services. Consequently, intelligence work is carefully regulated under domestic law to ensure it remains necessary and proportionate. I should therefore be clear. This book does not teach you how to spy on others, nor should it encourage you to do so. I want, however, to show that there are lessons from the thinking behind secret intelligence from which we can all benefit. This book is a guide to thinking straight, not a manual for bad behaviour.
Nor does thinking straight mean emotionless, bloodless calculation. ‘Negative capability’ was how the poet John Keats described the writer’s ability to pursue a vision of artistic beauty even when it led to uncertainty, confusion and intellectual doubt. For analytic thinkers the equivalent ability is tolerating the pain and confusion of not knowing, rather than imposing ready-made or omnipotent certainties on ambiguous situations or emotional challenges. To think clearly we must have a scientific, evidence-based approach which nevertheless holds a space for that ‘negative capability’.
Intelligence analysts like to look ahead, but they do not pretend to be soothsayers. There are always going to be surprise outcomes, however hard we try to forecast events. The winner of the Grand National or the Indy 500 cannot be known in advance. Nor does the favourite with the crowds always come out in front. Events sometimes combine in ways that seem destined to confound us. Importantly, risks can also provide opportunities if we can use intelligence to position ourselves to take advantage of them.
Who am I to say this?
Intelligence agencies prefer to keep quiet about successes so that they can repeat them, but failures can become very public. I have included examples of both, together with a few glimpses from my own experience – one that spans the startling development of the digital world. It is sobering to recall that in my first paid job, in 1965, in the mathematics department of an engineering company in Glasgow, we learned to write machine code for the early computers then available using five-character punched paper tape for the input. Today, the mobile device in my pocket has immediate access to more processing power than there was then in the whole of Europe. This digitization of our lives brings us huge benefits. But it is also fraught with dangers, as we will examine in Chapter 10.
In 1969, fresh out of Cambridge, I joined GCHQ, the British signals intelligence and communications security agency, and learned of their pioneering work applying mathematics and computing to intelligence. I gave up my plans to pursue a doctorate in (very) theoretical economics, and the lure of an offer to become an economic adviser in HM Treasury. I chose instead a career in public service that would take me into the worlds of intelligence, defence, foreign affairs and security. In the Ministry of Defence (MOD), as a policy official, I used intelligence to craft advice for ministers and the Chiefs of Staff. I had three tours in the Private Office of the Secretary of State for Defence (serving six of them, from Lord Carrington in 1973 to John Nott in 1981) and saw the heavy burden of decisionmaking in crisis that rests at the political level. I saw how valuable good intelligence can be, and the problems its absence causes. When I was working as the UK Defence Counsellor in NATO Brussels it was clear how intelligence was shaping arms control and foreign policy. And as the Deputy Under Secretary of State for Policy in the MOD I was an avid senior customer for operational intelligence on the crisis in the former Yugoslavia. In that role I became a member of the Joint Intelligence Committee (JIC), the most senior intelligence assessment body in the UK, on which I served for a total of seven years.
When I left the MOD to go back to GCHQ as its Director in the mid-1990s, computing was transforming the ability to process, store and retrieve data at scale. I still recall the engineers reporting triumphantly to me that they had achieved for the first time stable storage of a terabyte of rapidly accessible data memory – a big step then, although my small laptop today has half as much again. Even more significantly, the Internet had arrived as an essential working domain for professionals, with the World Wide Web gaining in popularity and Microsoft's new Hotmail service making email a fast and reliable form of communication. We knew digital technology would eventually penetrate into every aspect of our lives and that organizations like GCHQ would have to change radically to cope.
The pace of digital change has been faster than predicted. Then, smartphones had not been invented, and nor of course had Facebook, Twitter, YouTube and all the other social media platforms and apps that go with them. What would become Google was at that point a research project at Stanford. Within this small part of my working lifetime, I saw those revolutionary developments, and much more, come to dominate our world. In less than twenty years, our choices in economic, social and cultural life have become dependent on accessing networked digital technology and learning to live safely with it. There is no way back.
When I was unexpectedly appointed Permanent Secretary of the Home Office in 1997, the post brought close contact with MI5 and Scotland Yard, which used intelligence in investigations to identify and disrupt domestic threats, including terrorist and organized crime groups. It was in that period that the Home Office drew up the Human Rights Act and legislation to regulate and oversee investigatory powers, ensuring a continual balancing act between our fundamental rights to life and security and the right to privacy in our personal and family life. My career as a Permanent Secretary continued with three years in the Cabinet Office after 9/11 as the first UK Security and Intelligence Coordinator. In that post, rejoining the JIC, I had responsibility for ensuring the health of the British intelligence community and for drawing up the first UK counter-terrorism strategy, CONTEST, still in force in 2020 as I write.
I offer you in this book my choice of lessons drawn from the world of secret intelligence, both from the inside and from the perspective of the policymaker as a user of intelligence. I have learned the hard way that intelligence is difficult to come by, always fragmentary and incomplete, and sometimes wrong. But used consistently and with an understanding of its limitations, I know it shifts the odds in the nation's favour. The same is true for you.