Strategic foresight and early warning are grounded in the idea of preventing surprise, and more specifically strategic surprise. However, if we move away from the general idea of “strategic surprise” and try to be specific, i.e. if we apply the concept to a particular threat or issue we seek to anticipate, then the …
Foreseeing the future, whatever the name given to the endeavour, includes two major tasks.
The first one is, of course, the analysis: the process through which the foresight, forecast, warning or, more broadly, anticipation is obtained.
The second one is less obvious, or rather so evident that it may be overlooked. It is, however, no less vital than the analysis. We need to deliver the output of the analytical process to those who need the foresight: the decision-makers or policy-makers. Ideally, the recipients will understand that output, because they will act on it; they need to integrate the new knowledge they receive into the decisions they will take.*
A huge challenge runs across these tasks: biases.
We must overcome the various natural and constructed biases – systematic mental errors – that limit human understanding. This article first presents the classical way we deal with biases: we consider them – quite rightly – as “enemies” and devote much effort to mitigating them. Then, considering the specificity of the delivery stage, it suggests that another strategy is necessary: we need to turn our usual approach on its head and befriend biases. In that case, scenarios become a tool of choice for an enhanced delivery of our foresight to decision-makers […]