Nuclear deterrence is anti-scientific

It’s well established that nuclear deterrence practices – that is, systems used to threaten nuclear violence – are anti-democratic (see here and here). What’s less often recognised is that nuclear deterrence is also anti-scientific. Here’s why.

Nuclear-armed states and their allies regularly claim that nuclear deterrence is the “supreme guarantee” of national security and “guarantees protection of national sovereignty and territorial integrity”. But in reality, the historical record shows that on multiple occasions it was dumb luck, not nuclear threats, that prevented the use of nuclear weapons and, most likely, nuclear war.

Setting aside the questionable ethics of basing the ‘security’ of a political community on nuclear threats that could easily lead to the extinction of humanity, it’s plausible that such threats cause some leaders to exercise caution in some cases. But the limited historical data available to the public – limited by severe state censorship, again reflecting the anti-democratic nature of nuclear deterrence – shows that when facing a potential nuclear war, the leaders who arguably had the most to lose, like Cuban leader Fidel Castro and West German Chancellor Konrad Adenauer during the 1962 Cuban Missile Crisis, actually pushed hard to escalate towards violent confrontation.

In other words, the absolutist claims of nuclear elites that nuclear deterrence ‘guarantees’ security are simply not credible. So why do those elites ignore reality and insist on making such non-credible claims, when the likely consequences of a major deterrence failure are so catastrophic? It’s because the anti-scientific logic of nuclear deterrence incentivises them to do so.

Science is not a set of beliefs about how the world works; it’s a method for approximating ‘true’ knowledge. The starting point for the scientific method is rational scepticism, which encourages us to doubt, question, and challenge assumptions and claims. Loosely speaking, the method goes like this: propose a hypothesis based on your existing theoretical framework, gather as much relevant evidence as you can, and try to disprove the hypothesis. To the extent that you can’t disprove it, the underlying theory gains credibility.

Science focuses on disproving, rather than proving, theories, partly because human perceptions and beliefs always affect how we see the world: trying to prove something can lead to confirmation bias – the tendency to look for information that supports your pre-existing beliefs – and thus to interpreting evidence in ways that support those beliefs. Scientific rationality therefore cautions against overstating knowledge claims, and seeks to avoid confirmation bias by questioning assumptions and actively searching for evidence that contradicts or casts doubt on our hypotheses. This is the opposite of the logic that drives nuclear deterrence practices.

Any garden-variety nuclear expert will tell you that deterrence is partly psychological – it relies not only on missiles or bombs, but also on an adversary’s perception of your willingness to use them. Think of a misbehaving child. If a parent threatens to punish the child for misbehaving and the child keeps acting out, the parent has to follow through with the punishment. Otherwise, the child learns that the threat is not credible, and they will feel free to misbehave in future. This is why politicians will blithely say in public that they’re willing to “authorise a nuclear strike that could kill 100,000 innocent men, women and children” – because if your adversary thinks you don’t have the stomach for nuclear devastation, fancy missiles are nothing more than a very expensive, catastrophically risky comfort blanket.

Another way to understand the anti-scientific nature of nuclear deterrence practices is to compare them to our scientific understanding of how gravity functions. Gravity’s effects are relative to context: if we weren’t on planet Earth, objects wouldn’t fall the way they do here. But since we are on Earth, when I drop an object, it falls to the ground a hundred times out of a hundred. Most importantly, nothing that anyone says or chooses to believe will change that fact. So when physicists say “gravity works”, it’s a purely descriptive statement. That’s not true for nuclear deterrence.

When advocates say “nuclear deterrence works”, they aren’t just describing the world; they’re also trying to influence it, so the statement is both descriptive and normative. Specifically, it’s intended as a self-fulfilling prophecy, aimed in part at convincing adversaries that nuclear threats are credible – which, advocates claim, lowers the likelihood of nuclear war. And while that goal is laudable, relying on deterrence practices as a means to achieve it has perverse and troubling side-effects – like incentivising nuclear elites to ignore reality.

If an adversary doesn’t believe your nuclear threats are credible, they might engage in nuclear or non-nuclear aggression on the assumption that there’ll be no nuclear response. During the 1961 Berlin crisis, for example, French President de Gaulle and British Prime Minister Macmillan worried (loc. 4864) that “Any suggestion that the United States might not use nuclear weapons immediately…could weaken deterrence and encourage the Soviets to take risks.” And should deterrence fail, the next step, according to the US Joint Chiefs of Staff, might need to be a US nuclear attack to “end the conflict” and “restore deterrence” – which sounds curiously similar to the ‘escalate to de-escalate’ strategy US officials accuse Russia of pursuing. That response would itself increase the likelihood of nuclear war, especially if your adversary follows the same deterrence logic.

So in sum, fear of deterrence failure creates incentives for advocates to exaggerate claims about the effectiveness of nuclear deterrence; to surrender to their confirmation biases and ignore evidence to the contrary; and to oppose any claims or policies that might cast doubt on the credibility of nuclear threats. In other words, nuclear deterrence practices are profoundly anti-scientific.
