Knowns and Unknowns in the `War on Terror': Uncertainty and the Political
Construction of Danger
Christopher Daase and Oliver Kessler
Security Dialogue 2007 38: 411
DOI: 10.1177/0967010607084994
Published by: SAGE Publications (http://www.sagepublications.com)
The Unknown
As we know,
There are known knowns.
There are things we know we know.
We also know
There are known unknowns.
That is to say
We know there are some things
We do not know.
But there are also unknown unknowns,
The ones we don’t know
We don’t know.1
1. Donald Rumsfeld, speaking at a US Department of Defense press conference, 12 February 2002; reprinted as a poem in Seely (2003: 2).
Introduction
Whoever turned Donald Rumsfeld's sentences from a press
conference into a poem, which circulated in the world press just
weeks before the start of the war against Iraq, had a fine sense for
the suggestive power with which the US Secretary of Defense encapsulated –
almost poetically – the central problem of foreign and security policy today:
uncertainty. According to Rumsfeld, there exist three forms of knowledge
and non-knowledge2 that are relevant for political and military decision-
making. First, there is certain knowledge (‘known knowns’), on the basis of
which political programmes and military strategies can be developed.
Second, there is some knowledge about what we do not know (‘known
unknowns’), uncertainties that can be tamed analytically and reintegrated
into the decisionmaking process as calculable risks. Third, there is non-
knowledge about what we do not know and cannot know (‘unknown
unknowns'). This is the wild card, the unpredictable that can overturn the
most careful planning. Interestingly enough, however, Rumsfeld conceals a
fourth category of knowledge and non-knowledge: the knowledge we do not
want to know. These are the things we could know but rather decide not to
know by forgetting, suppressing or repressing them. Rumsfeld’s poem thus
needs another verse, which might run like this:
Finally, there are unknown knowns
The knowns
We don’t want to know.3
The poem indicates that the widespread perception of political and military
decisions as being based on more or less firm knowledge is misleading.
Rather, knowledge and non-knowledge are equally constitutive for the
decisionmaking process. It is the relationship between what we know, what
we do not know, what we cannot know and what we do not like to know that
determines the cognitive frame for political practice. Scholars have long
acknowledged that, in international relations, uncertainty is endemic
(Schelling, [1960] 1980; Hughes, 1999; Cioffi-Revilla, 1998). But, after the end
of the Cold War, when old truths had lost their validity, the uncertainty was
acutely felt with regard to new international risks (Richardson, 1989; Beck,
1996; Daase, 2002). Since then, international terrorism has emerged as a
prime security concern that many claim requires bold new strategies because
of its shadowy character and its incalculable dangers.
2. We use the word 'non-knowledge' as the antonym of knowledge in order to reserve the term 'ignorance' and its pejorative connotations for a specific form of non-knowledge.
3. Of course, we do not claim that this poem could not be extended differently. We thank an anonymous referee for pointing out further alternatives and their normative implications.
This article investigates the knowns and unknowns in international security
and their impact on the 'war on terrorism'. In short, we claim that, in order to
explain counter-terror strategies, it is crucial to understand how knowledge
and non-knowledge in international politics are conceptualized, assessed
and managed. The first part is thus conceptual. We show how different com-
binations of knowledge and non-knowledge determine the kinds of danger we
perceive. In short, we identify threats, risks, catastrophes and ignorance as
such kinds of danger. The second part of the article is methodological. We
demonstrate how different notions of probability are used to determine the
magnitude of danger we face. Here, we identify two different ways in which
non-knowledge might be given: structured and unstructured. While uncertainty is treated as synonymous with risk in the former case, risk and
uncertainty are treated as distinct in the latter. As we will demonstrate,
these two notions of risk and uncertainty draw specific boundaries of politi-
cal responsibility. The third part of the article is political. We reveal how
concepts of danger and measures of probability are used to construct
counter-terror strategies and to justify action or inaction. In short, we discuss
deterrence, pre-emption, denial and negligence. We do not claim that these
strategies exhaust the list of measures taken, but we think that they are para-
digmatic for how knowledge and non-knowledge shape security policy.
There are many ways to categorize different forms of knowledge and non-
knowledge. Rumsfeld draws on a distinction that is widely used in the
sociology of knowledge and links two aspects: ontology and epistemology.
Both knowledge and non-knowledge have two sides. There is knowledge (or
non-knowledge) about things, and knowledge (or non-knowledge) about
ways to identify things. The first refers to the phenomena of reality as such,
which we might or might not know, thus having phenomenological or
factual knowledge or not. The second refers to the epistemological status of
such knowledge. We might or might not know methods of gaining know-
ledge. If we do know ways of escaping non-knowledge, we possess method-
ological knowledge.
Both kinds of knowledge and non-knowledge are independent, but linked.
We might have reliable methods of identifying observable facts, thus pro-
ducing known knowns. But, we might also have some methods of dealing with
phenomena that we are not 100% sure of, thus creating known unknowns. On
the other hand, we must accept that there might be things we do not even
dream of and have no method of anticipating; these are unknown unknowns.
But, there are also situations in which factual knowledge is available in
principle, but not used because it is ignored or repressed; this is the category
of the unknown knowns.
In the security field – traditionally defined4 – there is empirical knowledge
about the ‘things’ that could pose a danger to the security of the international
system or to national security. Such ‘things’ could be political actors, such as
Al-Qaeda, of which we know (or do not know) that they exist and are prepar-
ing for terror attacks. Also, such ‘things’ could be military capabilities, such
as nuclear weapons in the Iraqi stockpile, of which we know (or do not know)
that they exist and can be used. Finally, such ‘things’ could be motivations,
such as Iran’s intent to acquire nuclear weapons, which we might or might
not know. Depending on how much we know about actors, capabilities and
motivations, our factual knowledge about security dangers is more or less
complete.
But, in the security field, there is also methodological knowledge about the
techniques for obtaining factual knowledge, dealing with uncertainty and
assessing danger. This knowledge consists of concepts, theories and analyti-
cal methods to interpret the – supposedly raw – factual data about the
dangers in the world. Conceptual models help, for example, to describe
collective actors, such as Al-Qaeda, despite their complexity as unitary
rational agents. More or less sophisticated methods are used to assess mili-
tary capabilities, such as in Iraq prior to the war, without having firm
evidence. Theories in the field of International Relations (IR) allow analysts to infer
motivations, such as Iran's, even when direct information is not available.
Whether or not these methods are adequate is another matter. They are part
of the knowledge that is used to assess danger in international politics.
Combining empirical and methodological knowledge and non-knowledge
leads to the four forms of known knowns, known unknowns, unknown unknowns
and unknown knowns previously mentioned. Depending on the knowledge
we have, international dangers take on very specific forms. If the empirical
facts of a danger are known (i.e. if an enemy actor exists with malign inten-
tions and a sufficient military capability to do harm) and if this knowledge is
known to be reliable, a threat exists. This was paradigmatically the case
between the USA and the Soviet Union during the Cold War. If, however, the
factual knowledge is partial, yet methods exist for reducing the uncertainty,
we might speak of security risks. Such risks have been perceived after the end
of the Cold War with regard to nuclear proliferation, terrorism, organized
crime and similar issues. If no (or scant) factual knowledge about a danger
exists and a method for assessment is not available, the danger, if material-
ized, will appear as disaster. If, finally, the facts of danger are largely known
but, for whatever reason, this knowledge is neglected, suppressed or forgot-
ten, the danger is intensified by what we call ignorance.
4. We concentrate here on international and national security, although we acknowledge that societal and human security are also important.
[Table: empirical knowledge ('knowns'/'unknowns') cross-tabulated with methodological knowledge, yielding the four forms of danger discussed above.]
6. Within the Scholastic tradition, knowledge was grounded in authorities. Books were not seen as simple texts, but as 'speaking' persons. For an analysis, see Luhmann (1997).
7. There is a dominance of the past over the future, which can be seen in the justification of dynasties; see Koselleck (1979).
8. See Esposito (2002). Of course, the future existed before the 17th century, but the distinction past/future became the dominant form of observing the temporal condition of human existence. See also Luhmann (1991: 23).
redefined concept of nature ultimately opens the door for probability.9 There
is no need to consult the old authorities eternalized in their books; nature can
now be asked directly. The probabilis of signs turns into causes and becomes
the possibility to read nature’s laws.10
The emergence of probability, in other words, parallels an overall trans-
formation of societies manifested in the discovery of the subject, social con-
tract theory, the invention of positive law and subjective rights, and thus a
particular modern notion of knowledge and understanding of how the world
might be known. Ultimately, explanations of possible harm by sin or Fortuna
are replaced by contingency and thus uncertainty and risk.11 As Mary Douglas
(1992: 26) once said, ‘risk provides secular terms for rewriting scripture: not
the sins of the fathers but the risks unleashed by the fathers are visited on the
heads of their children’.12 13
What is of less interest at the moment is the institutional impact of risk in
the form of insurances,14 or the way in which the exact meaning of risk and its
relation to rationality, responsibility and blame changed over the centuries.15
Rather, what is of interest at this point is the particular way in which the
semantic field of probability emerged as it gave rise to a specific ambiguity
concerning the category of non-knowledge. Let us go back to Leibniz for a
moment.16 Leibniz shows that the notion of equal probability can have two
meanings. His first description is based on the principle of sufficient reason,
where he uses notions of opinio and aestimatio, two categories of Scholastic
epistemology. The expression that two cases have equal probability expli-
cates a degree of belief due to insufficient knowledge. In his second argu-
ment, Leibniz uses probability as degree of possibility, as aeque faciles seu
aeque possibiles. Leibniz uses facile, which refers back to Aristotle's notion of
potentia. Equal probability does not refer to a lack of knowledge, but to future
states of the world. This understanding is further developed by Laplace and
his definition of probability as relative frequency.
9. For introductions to probability theory, see Cohen (1989); Good (1987, 1950); Benenson (1984); Weatherford (1982).
10. It would be interesting to analyse this fusion of several semantics during the 17th century more deeply: the ascendancy of science, the emergence of probability, and risk. At the same time, risk is linked to a redefined understanding of necessity and security. We can only speculate that the Thirty Years War functioned as a catalyst in which the meanings of the past, of human life and of eternal life changed, accompanied by the demise of the church as a unifying force and the movement of knowledge from monasteries to universities. From this perspective, risk is irremediably linked to the rise of the territorial nation-state. For a discussion, see Kessler (2007).
11. The roots of risk go back to antiquity; although the term did not exist at that time, the functional equivalent of today's risk lies in antiquity's explanations of possible harm in terms of myths, gods and oracles. For a historical analysis of risk, see Luhmann (1991: Chapter 1); Bonss (1995); Ewald (1991); Bernstein (1998).
12. Rooted in probability theory, the basic metaphor becomes the bet, or gambling, where references to sin, the will of God or other rather esoteric explanations become illegitimate.
13. As Robert Castel (1991: 289) explains: 'a grandiose technocratic rationalizing dream of absolute control of the accidental, understood as the irruption of the unpredictable . . . a vast hygienist utopia plays on the alternate register of fear of security, inducing a delirium of rationality, an absolute calculative reason and a no less absolute prerogative of its agents, planners, and technocrats, administrators of happiness for a life to which nothing happens'.
14. Ewald (1991: 199).
15. Douglas & Wildavsky (1982).
16. Leibniz, quoted in Hacking (1975: 125).
These two very distinct and very modern definitions of probability – that is,
as relative frequency or degree of belief – are based on two different notions of
non-knowledge and uncertainty. The definition of probability as a relative
frequency characterizes uncertainty as a probability distribution over possi-
ble states of the world. For this setting to apply, uncertainty needs to be
characterized in terms of a distinct and exhaustive set of possible states of the
world and a probability distribution over this set. A die can only show a
number from one to six – and the ‘category’ of ‘three’ is different from ‘four’.
This framework is commonly known from expected utility theory and
mathematical risk analysis, where risk is embedded in broader assumptions
of rationality and axioms of probability calculus. As this literature shows,
and exactly because contingency is understood to be an ontological concept,
probabilities mirror objectively given facts. They provide a kind of secured
knowledge about possible states of the world.
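What the text calls structured uncertainty can be given a minimal formal sketch (our notation, not the authors'), using the die example:

```latex
% Structured uncertainty: an exhaustive set of states and a distribution over it.
% For a fair die, the state space is \Omega = \{1, \dots, 6\} and
P(X = k) = \tfrac{1}{6}, \qquad k \in \{1, \dots, 6\}, \qquad \sum_{k=1}^{6} P(X = k) = 1.
% Expected utility then weighs each outcome by its objectively given frequency:
\mathbb{E}[u(X)] = \sum_{k=1}^{6} P(X = k)\, u(k).
```

The point of the frequentist setting is precisely that the state space and the distribution are fixed in advance: nothing outside the six faces can occur.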
A second meaning of uncertainty emerges within the epistemological definition of probability as a degree of belief. Consider, for a moment, a situation
in which a person is located in a lit room and somebody switches off the
lights. In this dark and unfamiliar room, one has to find out what objects are
actually in the room and where they are located. What and where an object
is becomes a matter not of frequency, but of degree of belief.
Probability does not constitute knowledge of the world; rather, it constitutes
knowledge about our engagement with the world and includes qualitative
judgements about 'what is the case'. In this setting, uncertainty – that is, the
kind of relevant non-knowledge – is at first unstructured and needs to be
absorbed, via evidence or processes of trial and error, into manageable risk.
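The 'dark room' process of absorbing unstructured uncertainty through trial and error can be given a minimal Bayesian sketch (our illustration, not the authors'): a degree of belief is revised as evidence arrives,

```latex
% Degree of belief in hypothesis H (e.g. 'there is a table by the wall'),
% revised on evidence E (e.g. bumping into something) via Bayes' rule:
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}.
% Repeated updating gradually turns an initially unstructured uncertainty
% into a structured, and hence manageable, assessment of risk.
```

Unlike the frequentist case, nothing here requires an exhaustive list of possible states in advance; the probabilities qualify our engagement with the world, not the world itself.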
In this context, where uncertainty is juxtaposed to risk (Knight, 1921;
Keynes, 1937: 213–214), uncertainty changes the condition of possibility – or
impossibility – of action. It is subject to a different rationality, based on norms
and values and not on instrumental criteria. These conventions and norms
thus represent a different kind of knowledge, and at the same time constitute
the limits of a potential applicability of formal reasoning and the insurability
of the future. Beyond the particular ways in which uncertainty is translated into
manageable risks, one might also think of hierarchies or institutions as alternatives; however, these are not of particular relevance at the moment. More
important are the methodological implications these two notions of uncer-
tainty and risk have. Probability theories provide different conceptualiza-
tions of how the future and the present are connected via characterizing the
unknown. Risk names the boundary of what an individual can and does (not)
know, what lies in his responsibility or what is subject to, for example, a
‘higher force’.
Known Knowns
The known knowns of security politics represent the realm of available secured
knowledge in political decision processes. This is the paradigmatic case of
structured uncertainty that we encounter in relative frequency probability
theory and mathematical approaches to security studies. This view assumes
a well-defined ontology, that is, the existence of recursive patterns of ‘law’-
like regularities where our knowledge mirrors empirical facts.
This knowledge consists of the use of a particular vocabulary within a given
framework, that is, the concepts and special notions that demarcate the frame
for evaluating international political situations. Of course, how one names a
situation has crucial consequences for the political processes that follow. It
matters whether a group is named as 'terrorists', 'freedom fighters' or 'guerrillas'.
Example: Deterrence
Deterrence can serve here as a paradigmatic example for the particular kind
of (non-)knowledge of known knowns for three reasons. First, it became established
only at the onset of the Cold War (Freedman, 1989); second, it represents a
particular way of dealing with political knowledge deemed insufficient (e.g.
in relation to the motives and the military capacity of the USSR) (Schelling,
[1960] 1980); and, third, it became one of the known knowns of foreign and
security policy during the Cold War (Zagare & Kilgour, 2000). After the end
of the East–West conflict, and in particular with the start of the 'war against
terror’, the concept of deterrence has lost much of its usefulness, although
politicians and commentators like Rumsfeld and Wolfgang Schäuble have
repeatedly made demands for a ‘new form of deterrence’.
The basic mechanism of deterrence – in both its nuclear and its convention-
al form – is to avert an aggressive act by an opponent by threatening that
opponent with unacceptable losses should it carry out the act (George &
Smoke, 1974). For example, one could deny all the potential benefits of a
potential aggressive act (deterrence by denial) and/or one could threaten to
heavily punish unacceptable behaviour (deterrence by punishment) (Snyder,
1961). For this strategy to work, both the deterring and the deterred state
need to know what the potential military, political and propagandistic
benefits of their behaviour are, and to which particular punishments they are
highly sensitive. That is, the cost and loss functions of political action, the
epistemological question of qualities and of context, and the determination of the
'currency' need to be established in advance.
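The deterrence calculus described above can be condensed into a single inequality (our notation, not the authors'):

```latex
% Deterrence by punishment succeeds when, for the would-be aggressor,
% the expected cost of retaliation outweighs the expected benefit of acting:
p_{\text{retaliation}} \cdot C_{\text{punishment}} \;>\; B_{\text{aggression}}.
% Both sides must know B, C and the 'currency' in which they are measured --
% the fixed categories whose absence undermines deterrence of terrorists.
```

The inequality makes visible why the strategy presupposes known knowns: each term must be intelligible, and roughly commensurable, for both parties.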
This presupposition of fixed categories is also valid – cum grano salis – for
considerations on how to deter terrorists (Davis & Jenkins, 2002). However,
and in contrast to the framework of Mutually Assured Destruction (MAD),
the empirical and methodological knowns are not easily available. One of the
most frustrating elements of the terrorist attacks on New York and
Washington was precisely the non-existence of an identifiable actor and the
lack of any clear political end that could be served with the attack. It is
extremely difficult to identify or deny a potential gain if there is neither an
actor nor a common currency (Münkler, 2002) against which threats and
claims can be evaluated. Whereas one might deny an exchange of
prisoners in order to remove a potential benefit, the symbolic power of the collapsing
towers of the World Trade Center cannot be 'denied'. Consequently, an analogy between
MAD and terrorism is highly problematic. One cannot deter terrorists in the
way that one might deter states – particularly if the ultimate threat, death, is
not perceived as a threat but rather welcomed by suicide terrorists. The
benefit of the act is thus not some ultimate goal, but the act itself.
Quite apart from the aforementioned problems, the network-based organization of terrorist groups escapes the logic of deterrence (White, 1998: 32–43).
Example: Terrorism
In this context, the question of how big the threat of international terrorism
currently is can be conveniently answered by pointing to the next possible
act. The rationale behind many counter-terrorist acts is the constant threat of
a subsequent act – which is indubitably anticipated for some point in the
future. Within such a point of view, the current calm is only the calm before
the next storm. Taken seriously, this statement is both true and trivial.
However, it is exactly the triviality of this insight – that is, that a next terror-
ist ‘attack’ will happen – that allows proponents to structure and shape the
contemporary security policy discourse. There are a couple of reasons for
this. Most basically, there are two standard models for examining the risk of
terrorism – which are, however, both inadequate (Falkenrath, 2001). First,
there is the inquiry into the motivational structure and the extrapolation from
past terrorist activities. Second, there is the attempt to calculate the possible
risks from the expected losses and the probability of a certain state of the
world occurring. The former is preferred by terrorism and regional experts,
the latter by security practitioners and security experts.
The problem of the first method is that it cannot trace new developments
and spontaneous changes in the motivational structure. There is always a
first time, one could argue. Even the act of hijacking planes in order to
destroy skyscrapers was not ‘predictable’. Such behaviour was simply not on
the screen; it was unimaginable. Such methods of exploration are thus
inherently conservative, and they systematically underestimate the associated risks. The problem with the second method is that nobody can actually
‘calculate’ the loss of thousands of lives.17 If the risk of terrorism is defined in
common terms of probability and potential loss, then the focus on terrorist
‘world events’ consequently leads to a reduced importance of probabilities.
The improbability of the risk’s manifestation becomes irrelevant, as the costs
would reach infinity (Jenkins, 1999). The classic calculation of risks from
terror thus tends to overestimate and to dramatize terror. This overestimation affects not only the level of risk assessment but also
the formulation of a prudent 'risk policy': if one factor of the risk equation
goes to infinity, as in the case of a terrorist attack with nuclear
weapons, then there is no rational yardstick for counter-terrorist measures.
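The argument can be stated as a minimal formal sketch (our notation, not the authors'), defining risk in the standard way as expected loss:

```latex
% Risk as expected loss: probability p of an attack times the loss L it causes
R = p \cdot L.
% If the potential loss is taken to be unbounded (e.g. a nuclear attack on a city),
% then for any non-zero probability p > 0:
\lim_{L \to \infty} p \cdot L = \infty,
% so the improbability of the event no longer discriminates between policies:
% every counter-measure, however costly, appears justified.
```

This is why, as the text notes, the classic risk calculation tends to dramatize terror: once one factor is infinite, probabilities drop out of the assessment altogether.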
This point is close to a further characteristic of risks: their political nature.
For example, in 1998, the Rumsfeld Commission criticized the Clinton
administration for systematically playing down the security threat posed by
17. There certainly are attempts to place a value on human lives, and policymakers might use them. But there are two objections. First, it is almost trivial to see that the value placed on human lives changes from policy issue to policy issue and from country to country. Second, even granting such attempts, it is a far cry from them to a situation in which the 'loss' of an attack could be presented as a 'bill' due next month.
missiles, and the intelligence services for working with the wrong methodologies (Rumsfeld Commission, 1998: Sections II.A, II.G). In essence, this was
a battle within the US intelligence community over the
adequate methods for political risk analysis: which criteria permit the prudent
recognition and evaluation of an emerging threat, and thus effective
countermeasures? Back in 1995, the National Intelligence Estimate had
declared that (as summarized in a statement by the chairman of the National
Intelligence Council, Richard Cooper):
the Intelligence Community judges that in the next 15 years no country other than the
major declared nuclear powers will develop a ballistic missile that could threaten the
contiguous 48 states or Canada. (Cooper, 1996)
The Rumsfeld Commission (1998: Section II.A) report, on the other hand,
emphasized that:
concerted efforts by a number of overtly or potentially hostile nations to acquire ballis-
tic missiles with biological or nuclear payloads pose a growing threat to the United
States, its deployed forces and its friends and allies. . . . [North Korea, Iran and Iraq]
would be able to inflict major destruction on the U.S. within about five years of a
decision to acquire such a capability (ten years in the case of Iraq).
Three years after the 1995 National Intelligence Estimate, the Rumsfeld
Commission had thus drastically reduced the time horizon of a potential
threat and increased the level of perceived risk. The main idea of the report,
however, was to criticize traditional methodologies of threat analysis in the
defence community:
In analyzing the ballistic missile threat, the Commission used an expanded method-
ology. We used it as a complement to the traditional analysis in which a country’s
known program status is used to establish estimates of its current missile capabilities.
We believe this expanded approach provides insights into emerging threats that the pre-
vailing approaches used by the Intelligence Community may not bring to the surface.
(Rumsfeld Commission, 1998: Section II.G.)
Although the Rumsfeld Commission did not provide any probability assessment of the imagined developments, its results were taken by missile-defence
advocates as evidence that the US intelligence community had misjudged
the situation and severely underestimated the danger. The scolded National
Intelligence Council subsequently embraced this new method to emphasize
the possible and not the probable. The subsequent report, Foreign Missile
Developments and the Ballistic Missile Threat to the United States Through 2015,
thus reads:
North Korea could convert its Taepodong-1 space launch vehicle (SLV) into an ICBM
that could deliver a light payload (sufficient for a biological or chemical weapon) to the
United States, albeit with inaccuracies that would make hitting large urban targets
improbable. North Korea is more likely to weaponize the larger Taepodong-2 as an ICBM
that could deliver a several-hundred kilogram payload (sufficient for early generation
nuclear weapons) to the United States. Most analysts believe it could be tested at any
time, probably initially as an SLV, unless it is delayed for political reasons. (NIC, 1999)
We quote this passage to show how an emphasis on the possible rather than
the probable, together with the almost complete lack of political risk assessment in the
1999 National Intelligence Estimate, softened the established criteria used by
intelligence services to evaluate threats (Cirincione, 2000). The 'growing
threat' from the proliferation of delivery vehicles rested less on a dramatic
change in the armament behaviour of Iran, Iraq or North Korea than
on a change in the political parameters of US threat assessment. Put more
precisely, it was a politicization of risk analysis.
One can see the traces of this battle of methods even in the controversies
over how to explain the misinterpretation and false estimation of the intelli-
gence associated with decisionmaking concerning the Iraq War. As US
weapons inspector David Kay (2004) noted after the long and ultimately
unsuccessful search for evidence of a renewed weapons programme in
Saddam Hussein’s Iraq: ‘We were almost all wrong.’ In this context, it is
interesting to note that, a few weeks before the new war, US Vice-President
Dick Cheney (2003) argued that giving the inspectors more time would be
wrong, as more procrastination would increase the military risk posed by the
Iraqi regime: ‘The risks of inaction are far greater than the risk of action.’ This
thought served as central rationale for the military preventive strike: military
action was essential now to prevent future potential loss. This reasoning was
also behind the ‘pre-emptive strike’ doctrine by which the US government
undermined Article 51 of the United Nations Charter and the 'Caroline criteria'. Without specifying the conditions under which the supposed risks
would grow, Cheney's assertion is as empty as the statement that
the next terrorist attack will surely come. This assertion nicely illustrates the
argumentation of the Bush administration that the cognitive uncertainty of
the known unknowns was not seen as a problem to be solved, but as the justi-
fication for military action. Cheney’s argument is more than just a simple
assertion – it is a programme for how the USA will deal with non-knowledge
in international politics.
arios, both wild and tame, can gain more credibility in the telling than they
deserve.’ Cognitive studies speak here of an ‘Othello Effect’, as when false
accusations change Othello’s cognitive frame and ultimately lead him to
kill his beloved wife, Desdemona. The point is that even the most absurd
scenarios will gain in plausibility if a chain of potentialities changes cogni-
tion. They are thereby included in the realm of the possible, if not even the
probable: ‘Although the likelihood of the scenario dwindles with each step,
the residual impression is one of plausibility’ (Conetta & Knight, 1998: 38).
That this ‘Othello Effect’ is relevant for international relations is obvious: we
just need to look at the alleged connection between Saddam Hussein and Al-
Qaeda. What the US government tried to prove was disputed right from the
beginning. False evidence was repeatedly presented and repeatedly refuted.
But, that did not prevent the US government from attempting to present the
improbable yet possible connection of Iraq to terrorist networks, and the
improbable yet possible proliferation of an improbable yet possible nuclear
weapon to bin Laden as casus belli. As Donald Rumsfeld famously once said:
‘Absence of evidence is not evidence of absence’ – which is nothing else than
to admit that, under conditions of known unknowns and unknown unknowns,
different evidence criteria prevail. Unknown unknowns are not only hard to
identify, but also hard to refute.
George W. Bush, estimated the costs of the war and reconstruction of Iraq at
around US$100 billion, he was fired. Yet, President Bush now finds himself
in the awkward position of having to raise exactly three times18 as much
through national and international political processes.
Ignorance
If threats emerge because of wrong decisions and non-knowledge, this is
often called ignorance, a concept heavily used in heterodox economics to
refer to the necessary limits of human knowledge. The almost complete
absence of any plan for the postwar situation in Iraq and for the creation of a new
political order provides the best example. This became visible for the
first time at the beginning of April 2003, when scattered units of the Iraqi
Army shifted from conventional to guerrilla tactics. Only two things
were surprising: first, that they waited for so long before doing so; second,
that the Americans were so surprised when they did. We are surely not in a
position to provide answers or a complete analysis of how to ‘win the peace’,
but it is quite astonishing to see the extent to which institutional and political
blind spots led to such a misjudgement of the situation. The political
ignorance was surely due to the political project of the current US
administration, which simply ignored warning voices and somehow assumed
that the joy of freedom, as perceived and defined by the United States of
America, would bring about a political order by itself.
A second instance of political ignorance lies in Donald Rumsfeld’s decision
to turn a deaf ear to the warnings of the Joint Chiefs of Staff, and to fight the war with the
inadequate number of soldiers that he deployed. In fact, given US tactics, it is
surprising that the Iraqi Army was unable to profit from the self-induced
vulnerabilities, in particular the exposed US supply lines. The reason
for this blind spot was Rumsfeld’s determination to conduct the Iraq War as a
showcase of modern warfare. Behind this stood the idea that the so-called
Revolution in Military Affairs19 allows wars to be fought with maximum
hi-tech equipment and minimum manpower. As far as that goes, Rumsfeld
might be right. But, the military victory was simply worthless as soon as it
became obvious that, in order to ‘win the peace’, far more than technology
was needed. What was needed most of all was a political plan and the
personnel to secure the policing function. In addition, there is an institutional
ignorance on the part of the US military machinery, which seemingly was
unable to learn systematically from its experience in irregular warfare (Daase,
1999: 107–151). The USA has a remarkably short memory when it comes to
fighting insurgencies. The reason for that might be the predominance of a
strategic culture that regards irregular warfare as a deviant form of war and
glorifies conventional, hi-tech-driven warfare. It might also have its causes in
the traditional contempt of the military for ‘political questions’, which seems
to be rooted in the history of the US military (Huntington, 1957). It is this
disregard of political questions, itself rooted in the variety of knowledge –
and non-knowledge – structures, that has made the US Army appear ill-suited
to the current situation, as it is forced to relearn small-war tactics at
exorbitant cost (Weigley, 1973). The current war-after-the-war situation, we
think, supports our argument that both knowledge and non-knowledge are
constitutive for the formulation of security policy.

18 ‘Projected Iraq War Costs Soar’, Washington Post, 27 April 2006.
19 See the discussion on the Rumsfeld Commission above.
Conclusion
References
Baldwin, David A., 1971. ‘Thinking about Threats’, Journal of Conflict Resolution 15(1):
71–78.
Beck, Ulrich, 1996. ‘World Risk Society as Cosmopolitan Society? Ecological Questions in
a Framework of Manufactured Uncertainties’, Theory, Culture and Society 13(4): 1–32.
Benenson, Frederick C., 1984. Probability, Objectivity and Evidence. London: Routledge &
Kegan Paul.
Bernstein, Peter L., 1998. Against the Gods: The Remarkable Story of Risk. New York: John Wiley.
Bonss, Wolfgang, 1991. ‘Unsicherheit und Gesellschaft. Argumente für eine soziologische
Risikoanalyse’ [Uncertainty and Society: Arguments for a Sociological Risk
Analysis], Soziale Welt 42(2): 258–277.
Bonss, Wolfgang, 1995. Vom Risiko. Unsicherheit und Ungewissheit in der Moderne [About
Risk: Insecurity and Uncertainty in Modern Times]. Hamburg: Hamburger Edition HIS
Verlagsges.
Castel, Robert, 1991. ‘From Dangerousness to Risk’, in Graham Burchell, Colin Gordon &
Peter Miller, eds, The Foucault Effect: Studies in Governmentality. London: Harvester
Wheatsheaf (281–298).
Cheney, Dick, 2003. ‘Cheney Cites “Risks of Inaction” with Iraq’, CNN.com, 29 January
2003.
Cioffi-Revilla, Claudio, 1998. Politics and Uncertainty: Theory, Models and Application.
Cambridge: Cambridge University Press.
Cirincione, Joseph, 2000. ‘Assessing the Assessment: The 1999 National Intelligence
Estimate of the Ballistic Missile Threat’, Nonproliferation Review 7(1): 125–137.
Cohen, L. Jonathan, 1989. An Introduction to the Philosophy of Induction and Probability.
Oxford: Oxford University Press.
Cohen, Raymond, 1979. Threat Perception in International Crisis. Madison, WI: University of
Wisconsin Press.
Conetta, Carl & Charles Knight, 1998. ‘Inventing Threats’, Bulletin of the Atomic Scientists
54(2): 32–38.
Cooper, Richard N., 1996. ‘Emerging Missile Threats to North America During the Next
15 Years’, statement by the Chairman of the National Intelligence Council to the