(Art design: Jean-Dominique Lavoix-Carli)
When delivering warnings, are we doomed to never be believed, sharing the same fate as Cassandra, the tragic character of Greek mythology? Or, on the contrary, can we hope to become as successful as the Pythia, the oracle priestess of Apollo at Delphi?
Her gift of prophecy having become a curse, Cassandra lives each tragedy three times: once when she foresees it, once when she fails to convince those who could prevent the disaster, and finally when she suffers the dire events herself. She lives through Troy’s fall, is abducted and raped, taken captive and then murdered (The Editors, “Cassandra“, Encyclopedia Britannica, 14 Feb. 2019; Seth L. Schein, “The Cassandra Scene in Aeschylus’ ‘Agamemnon’“, Greece & Rome, Vol. 29, No. 1, Apr. 1982, pp. 11-16).
By contrast, the Pythia, the famous oracle priestess of Apollo at Delphi, was an institution so successful that it lasted from ca. 800 BCE to AD 390/91 (Julia Kindt, “Hidden women of history: the priestess Pythia at the Delphic Oracle, who spoke truth to power“, The Conversation, 22 January 2019). Her prophecies were sought by kings and commoners alike, on public and individual matters (Ibid.). They were believed, became advice, and were richly rewarded (Ibid.).
Thus, how can we emulate the Pythia’s destiny rather than Cassandra’s fate? We need to find out what can make strategic foresight and early warning a successful activity and not a curse, and apply our findings to our work.
To help us in this endeavour, we shall notably build upon Christoph Meyer’s research on warning and conflict prevention (“Beyond the Cassandra Syndrome: Understanding the failure and success of warnings“, King’s College Lecture, 26 February 2014). Indeed, after highlighting the problems related to warnings and prevention, Meyer identifies three key elements that make a warning successful from the point of view of prevention (Ibid.). He furthermore suggests ways to bridge the “warning-response gap”.
To make sure strategic foresight and warning is successful, we shall first highlight that proper strategic foresight and warning needs, intrinsically, to be actionable. It must also walk a thin line between being useful to decision-makers and interfering with them. We shall particularly emphasise the challenge of impact assessment and suggest that a proper scenario tree is a key tool for offering policy alternatives to decision-makers, alongside involving policy-makers as stakeholders. We shall, second, turn to the hurdles linked to the reception of warnings by policy-makers, as identified by Meyer, and to ways forward. Finally, carefully comparing Cassandra and the Pythia, we shall single out important keys explaining why warning can be either a curse or a successful activity.
A real warning is actionable
No, not everything is an early warning
Of course, to be able to deliver a successful warning, we first need to make sure we communicate a real warning, and not just any opinion, brief or piece of information.
As Grabo* reminds us, indeed:
- A warning concerns a situation, an objective, an opportunity, a danger, a threat or a risk, which is specific and defined (the issue).
- “Warning” deals with the future. It seeks to anticipate and predict dynamics and events that do not yet exist. An analysis explaining solely the past or the present is NOT warning.
- A warning is not made only of facts, data and information, but results from analysis and synthesis.
(Cynthia M. Grabo and Jan Goldman, Anticipating Surprise: Analysis for Strategic Warning, Washington, D.C.: Center for Strategic Intelligence Research, Joint Military Intelligence College, 2002, pp. 4-16.)
Meyer’s research similarly highlights that any report, brief, piece of information, or even mere opinion, reinterpreted with hindsight, does NOT constitute a warning (Ibid.). Actually, it is because such warnings did not exist that we are faced with surprise, which is precisely what we seek to avoid. Even broad generic statements made before the events, if not backed up by proper analysis, cannot qualify as warnings. At best, they could be considered “proto-warnings”, which would still need to be substantiated and transformed into proper foresight and warning.
Proper warnings must be actionable. This means they must be specific and detailed enough to allow for proper action, and they must include an evaluation of probability as well as an impact assessment.
Using a real-life example, the beginning of a proper warning could look as follows:
As long as travel restrictions remain in place due to the COVID-19 situation, it is highly likely that production and sales of fake test certificates will prevail. Given the widespread technological means available, in the form of high-quality printers and different software, fraudsters are able to produce high-quality counterfeit, forged or fake documents.
Europol, “Early Warning Notification: The illicit sales of false negative COVID-19 test certificates“, February 2021. Emphasis added to the likelihood assessment (“it is highly likely”).
This warning, to be truly actionable and complete, would also need to include an impact assessment, tailored to the decision-makers receiving it.
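To make the components of an actionable warning more concrete, here is a minimal, illustrative sketch in Python, assuming a simple structured record. The class name EarlyWarning, its fields and the example values are invented for illustration only; they are not an official Europol or Grabo template.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EarlyWarning:
    """Minimal, illustrative structure for an actionable warning (assumed fields)."""
    issue: str                  # the specific situation, threat, risk or opportunity
    horizon: str                # the future period the warning covers
    likelihood: str             # evaluation of probability, e.g. "highly likely"
    impact: str                 # impact assessment, tailored to the recipient
    indicators: List[str] = field(default_factory=list)  # what to monitor next

    def is_actionable(self) -> bool:
        # Without both a likelihood evaluation and an impact assessment,
        # we have at best a "proto-warning".
        return bool(self.issue and self.likelihood and self.impact)

# Example loosely based on the Europol notification quoted above
warning = EarlyWarning(
    issue="Illicit sale of false negative COVID-19 test certificates",
    horizon="As long as COVID-19 travel restrictions remain in place",
    likelihood="Highly likely",
    impact="",  # the public notification does not include an impact assessment
    indicators=["seizures of forged certificates", "online offers of fake documents"],
)
print(warning.is_actionable())  # False: not yet fully actionable
```

In this sketch, the empty impact field is precisely what keeps the notification from being fully actionable in the sense discussed here.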
The problem with impact assessment
Assessing an impact may appear at first glance to be relatively easy. However, there are hidden traps in this apparently simple evaluation.
If we think about it, what do we need to do to assess impacts? Fundamentally, we need to judge and evaluate past and current policies and decisions, as well as those future policies that have already been decided. This is what Meyer highlights when he stresses that an impact assessment carries an implicit judgement on current policies and on what should be done (Ibid.). We thus judge what policy-makers and decision-makers are doing, have been doing and appear ready to do. This may easily lead to tension with our decision-makers, as they may not be ready for what they may perceive as a fault-finding exercise.
This potentially dangerous implicit judgement may help explain the absence of an impact assessment in Europol’s warning. The public nature of these warnings and the multinational character of the agency probably only add to the difficulty of impact assessment. We may imagine that classified versions of Europol’s warnings include such impact assessments, assuming member countries signalled clearly that they wanted them.
Furthermore, “simple” impact assessments, focusing on past and present policies and decisions, also invite criticism from decision-makers. They could complain that it is easy to point out future problems when no other solution is offered or suggested. Indeed, Meyer underlines that warnings often fail to make the case for the feasibility or existence of other courses of action, and that this is a flaw from the point of view of prevention (Ibid.).
Policy alternatives, scenario tree and decision-makers as stakeholders
Scenario tree and key decisions
If we want to consider alternative policies, then a solution – indeed the best solution – is to develop a complete scenario tree properly and to use it for our warnings. A scenario tree considers critical uncertainties. This implies that, most of the time, we also assess a range of possible actions with key decision points. We thus look at other possible courses of action, which should help policy-makers and decision-makers in their tasks. With a proper scenario tree, our strategic foresight and warning becomes fully actionable for prevention.
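As a purely illustrative sketch of how a scenario tree can attach alternative courses of action to a warning, consider the minimal Python structure below. The node labels, branch choices and probabilities are invented assumptions for the example, not part of any actual assessment.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """A node of a scenario tree: a decision point/uncertainty or an end-state."""
    label: str
    branches: List["Branch"] = field(default_factory=list)

@dataclass
class Branch:
    choice: str        # the decision or resolution of the uncertainty
    likelihood: float  # indicative probability of this branch
    child: Node

def courses_of_action(node: Node, path=(), prob=1.0):
    """Enumerate every path through the tree with its combined indicative likelihood."""
    if not node.branches:
        yield list(path) + [node.label], prob
        return
    for b in node.branches:
        yield from courses_of_action(b.child, path + (node.label, b.choice), prob * b.likelihood)

# Invented example: certificate fraud, with one key decision point
tree = Node("Travel restrictions persist", [
    Branch("Authorities deploy digital certificate verification", 0.6,
           Node("Forgery market shrinks")),
    Branch("No verification measures taken", 0.4,
           Node("Forgery market expands")),
])

for path, p in courses_of_action(tree):
    print(f"{' -> '.join(path)}  (indicative likelihood: {p:.0%})")
```

Each path printed by the sketch corresponds to one possible course of action with its indicative likelihood, which is the kind of material that makes a warning usable for prevention.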
However, with such a tree we also step further into the realm of policy-making. This could be seen as contradicting, for example, the position of the intelligence community, according to which the realm of action should be completely separated from all intelligence analysis, including strategic foresight and early warning analysis (e.g. Fingar 2009, Meyer 2014). On the contrary, practitioners in the fields of conflict prevention and risk management are not so adamant about this separation (Meyer 2014; ISO 31000:2018; for a summary on risk management, Helene Lavoix, “When Risk Management …“, The Red Team Analysis Society, 2019). They even see these two dimensions as linked and consider that policy options or alternatives must be attached to warnings.
Fundamentally, as long as the decision regarding policy choices remains with decision-makers, there should not be any issue of blurred responsibilities.
Including decision-makers as stakeholders in the strategic foresight and early warning process
Furthermore, in its final stages, the scenario tree related to our warnings could also be developed with members of the policy-making community. By making the latter stakeholders in the development of the final strategic foresight and warning products, we create conditions favourable to the acceptance of warnings.
Creating such substantiated, precise yet encompassing warnings, including an evaluation of probability and an impact assessment, preferably in the form of a scenario tree highlighting key possible decisions and impacts, is difficult. It also demands a lot of work. It is nonetheless feasible, and it is a necessary but not sufficient condition for achieving successful warnings. As such, it should be seen as an investment.
Thus, one of the keys to success is to consider decision-makers and the very object of strategic foresight and warning, i.e. decisions and actions, from the very start of the process. Doing so will smooth the last steps of that process, the delivery and communication of the warnings themselves. Yet, we still have to face many hurdles.
Receiving early warnings
Over-warning or “warning fatigue”
Notes and Bibliography
Featured image: Art design: Jean-Dominique Lavoix-Carli – Photos by Zack Jarosz from Pexels and from PxHere
Notes
* For some context on Grabo’s seminal work, see Hélène Lavoix, Communication of Strategic Foresight and Early Warning, The Red Team Analysis Society, 2021.
**The Dunning-Kruger effect: According to this bias, “the skills that engender competence in a particular domain are often the very same skills necessary to evaluate competence in that domain” (Kruger and Dunning, “Unskilled and Unaware of It…”, 1999). In other words, the less one knows about something, the better one thinks one is in that field.
Bibliography
Bar-Joseph, Uri, and Arie W. Kruglanski, “Intelligence Failure and Need for Cognitive Closure: On the Psychology of the Yom Kippur Surprise“, Political Psychology, Vol. 24, No. 1 (Mar., 2003), pp. 75-99.
Betts, Richard K., Surprise Attack: Lessons for Defense Planning, Brookings Institution Press, Dec 1, 2010.
Betts, Richard K., “Surprise Despite Warning: Why Sudden Attacks Succeed“, Political Science Quarterly, Vol. 95, No. 4 (Winter, 1980-1981), pp. 551-572
Cancian, Mark, Coping with Surprise in Great Power Conflicts, A Report of the CSIS International Security Program, February 2018.
Davis, Jack, “Improving CIA Analytic Performance: Strategic Warning,” The Sherman Kent Center for Intelligence Analysis Occasional Papers: Volume 1, Number 1, accessed September 12, 2011.
Doyle, Andrea, “Cassandra – Feminine Corrective in Aeschylus’ Agamemnon”, Acta Classica, vol. 51, 2008, pp. 57–75.
Fingar, Thomas, “Myths, Fears, and Expectations,” Payne Distinguished Lecture Series 2009, Reducing Uncertainty: Intelligence and National Security, Lecture 1, FSI Stanford, CISAC Lecture Series, March 11, 2009.
Fingar, Thomas, “Anticipating Opportunities: Using Intelligence to Shape the Future,” Payne Distinguished Lecture Series 2009 Reducing Uncertainty: Intelligence and National Security, Lecture 3, FSI Stanford, CISAC Lecture Series, October 21, 2009.
Grabo, Cynthia M., and Jan Goldman. Anticipating Surprise: Analysis for Strategic Warning. [Washington, D.C.?]: Center for Strategic Intelligence Research, Joint Military Intelligence College, 2002.
ISO 31000:2018 Guidelines (revised from the 2009 version), IEC 31010:2009, Risk assessment techniques, and ISO Guide 73:2009 Vocabulary.
Kindt, Julia, “Hidden women of history: the priestess Pythia at the Delphic Oracle, who spoke truth to power“, The Conversation, 22 January 2019.
Kruger, Justin, and David Dunning, “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments“, Journal of Personality and Social Psychology, vol 77, no 6, p 1121-1134, American Psychological Association (1999).
Lavoix, Helene, “Why the Messenger Got Shot and How to Avoid this Fate“, The Red Team Analysis Society, April 2021.
Lavoix, Helene, “Communication of Strategic Foresight and Early Warning“, The Red Team Analysis Society, 3 March 2021.
Lavoix, Helene, “When Risk Management Meets Strategic Foresight and Warning“, The Red Team Analysis Society, 2019.
Lavoix, Helene, “Revisiting Timeliness for Strategic Foresight and Warning and Risk Management“, The Red Team Analysis Society, 2018
Lavoix, Helene, “Ensuring a Closer Fit: Insights on making foresight relevant to policymaking”, Development (2014) 56(4).
Lavoix, Helene, “What makes foresight actionable: the cases of Singapore and Finland”, confidential commissioned report, US government, November 2010.
Meyer, Christoph O., “Beyond the Cassandra Syndrome: Understanding the failure and success of warnings“, King’s College Lecture, 26 February 2014.
Schein, Seth L., “The Cassandra Scene in Aeschylus’ ‘Agamemnon’“, Greece & Rome, Vol. 29, No. 1, Apr. 1982, pp. 11-16.
Schelling, Thomas, foreword to Roberta Wohlstetter, Pearl Harbor: Warning and Decision (Stanford, CA: Stanford University Press, 1962).