Of nuclear weapons, politicians and cognitive biases
Every four years or so, the UK’s Royal Navy fires a Trident submarine-launched ballistic missile, in order to prove the systems on the launching submarines, to demonstrate to potential enemies the credibility of the weapon and to uncover any faults which need rectifying.
The last test was in June 2016, when HMS Vengeance (there is a whole blog to be written on the biases involved in naming these weapon systems…) launched a Trident II D5 missile with a dummy warhead from off the coast of Florida. It didn’t get far. Shortly after launch, it veered off course and was safely destroyed in mid-air.
A little context, before we go on… all weapon systems have a failure rate – or, if you prefer, a reliability rate. Trident has had 161 successful launches, according to data from Lockheed Martin (the main manufacturer); set against two ‘known’ failures since the system achieved maturity (the Vengeance firing and one in 2012 by the USN), that gives an apparent reliability rate of 98.77% – high for such a complex system. There are 16 launch tubes on HMS Vengeance and her sister submarines, each able to be loaded with a Trident missile, so there is plenty of redundancy if your aim is to destroy a single target. Subsequent firings of Trident II D5 (e.g. by USS Maryland in August 2016) have gone smoothly.
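The reliability figure above is simple arithmetic, and the value of those 16 tubes of redundancy can be sketched the same way. A minimal illustration – assuming, purely for illustration, that each launch fails independently with the same probability (real missile failures are rarely so tidy):

```python
# Apparent reliability from the figures quoted above
successes = 161          # successful launches (Lockheed Martin data)
failures = 2             # 'known' failures since the system achieved maturity
reliability = successes / (successes + failures)
print(f"Apparent reliability: {reliability:.2%}")   # prints 98.77%

# Redundancy: with 16 tubes aimed at one target, the chance that
# every missile fails (under the independence assumption) is tiny
p_all_fail = (1 - reliability) ** 16
print(f"P(at least one success): {1 - p_all_fail}")
```

The independence assumption is the weak point: a common design fault (such as the gyro issue discussed later) would affect every missile at once, which is exactly why test firings matter.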
Shortly after the HMS Vengeance test firing, a debate was scheduled in parliament on funding the replacements for the submarines carrying the Trident system. It appears that the government decided to suppress news of the Trident test failure. You can see more here on the impact of cognitive biases on strategic decision-making. Several cognitive biases would have been in action in this decision, but they would probably have started with ‘interest’ biases, before being reinforced by biases from the ‘pattern-recognition’ and ‘action-orientated’ families. (To keep the large number of cognitive biases in some form of manageable model, we categorise them in five large families. You can see how this works by looking at one of our cognitive bias test reports.)
Most governments’ bias is to suppress any defence news which could be embarrassing. This is driven by ‘interest’ biases: the government’s primary duty is defence of the realm, so anything which makes it look incompetent clashes with its main interest – remaining in power. ‘Pattern-recognition’ biases then kick in, because governments have used secrecy to obscure bungling for centuries. Established mechanisms exist to exert influence over the media if they get wind of the story (which further reinforces the pattern-recognition biases), and almost any government paper to do with nuclear deterrence tends to be stamped with a high security classification – again reinforcing the bias that all things to do with a nuclear deterrent should be secret. The short interval between the test firing and the debate in parliament would have reinforced ‘action-orientated’ biases.
The result was that information about the test remained suppressed for around six months – until the Sunday Times published the story on 22 January 2017. The resultant furore left the UK prime minister and her ministers looking as though they had been party to a cover-up before a major parliamentary debate on nuclear deterrence. In hindsight, the decision to conceal the test firing looks extraordinarily misjudged.
Of course, hindsight is very easy – in hindsight. The trick is to develop its far more useful cousin – foresight. This requires us to understand and mitigate our cognitive biases – both on the individual level and those we have when working in a team.
I have written about the use of alternative arguments to try to mitigate the effects of cognitive biases on decision-making. There are other more advanced techniques, but this one is always a good start. Here are a few…
Let’s start with the security issues. The Russians knew about the failure the day it happened – we have to notify them of test launches so that they don’t mistake them for an attack, and they can watch the whole thing on their satellites. Of course, this doesn’t mean that every other potential problem state necessarily knows too, but relying on the Kremlin to keep your secrets may not be the wisest way of running a western democracy.
Another related argument is: if it were to get out, would it not look like we were simply keeping the secret from our own people, since our potential adversaries may be more abreast of the issue than pretty much anyone else? There is an old adage that nothing is ‘Top Secret’, if more than two people know about it – otherwise, it is bound to leak out. How many people knew about this test failure? Once the secret was out, would this not be (at best) a propaganda gift for campaign groups which wish the UK to unilaterally disarm and (at worst) an excellent recruiting sergeant?
Going back to basics would also have generated useful counter arguments. Why do we conduct these tests? As mentioned above, one of the key reasons is to uncover faults in the design, or emerging faults, as the components age (which was probably the case with the missile’s gyro system in this instance). We can show that we have tested for problems, found one and intend to get it fixed.
Even the facile, unusable argument that it’s more of a deterrent if nobody knows where it is going to land (even your friends would be motivated to put pressure on adversaries not to upset you – also known as the ‘mad dog’ argument) would have been worth raising – not least because it may, in turn, generate useful alternative thinking. Humour often does.
We all have biases about nuclear weapons. My own bias is that it would be folly for the UK to give them up on the eve of protracted negotiations to withdraw from the EU. Nobody should want to sit down to hard negotiations when the French are on the opposite team and are the only nuclear power in the room. However, whatever your personal bias on the merits of the UK having nuclear weapons, the response by the UK Government to the failed Trident test appears unwise in the extreme.
Cognitive biases do that to you: control them, and foresight can be your friend; ignore them, and they make you hindsight’s fool.