Nuclear arms control has lessons for us as powerful technologies increase the scope for disaster
A Covid-19-style pandemic was both predictable and preventable, according to a panel of experts. The fact that it has resulted in a global disaster killing 3.3m people was largely due to a failure of governance and the lack of a co-ordinated international response, they say. “Global political leadership was absent,” conclude the two lead authors, Helen Clark, the former prime minister of New Zealand, and Ellen Johnson Sirleaf, the former president of Liberia, in the report published this week.

As the historian Niall Ferguson writes in his latest book Doom: The Politics of Catastrophe, the distinction drawn between “natural” and “man-made” disasters is often misleading. What matters is how humans anticipate and react to such events, which are foreseeable in their frequency if not in their particularity. And while it may be tempting to blame such disasters on incompetent leaders, they also reflect a broader societal incapacity to prepare and respond.

What is most unnerving about this failure is that humanity will soon face even bigger threats. The risks of environmental destruction, nuclear annihilation, cyberwarfare, bioterrorism and rogue artificial intelligence are easy to foresee and horrifying to contemplate. But trying to pre-empt such dangers is becoming harder as access to powerful technologies becomes easier and cheaper. Eliezer Yudkowsky, co-founder of the Machine Intelligence Research Institute, reckons that an alarmingly different kind of Moore’s Law is at work today: the minimum IQ needed to destroy the world drops by one point every 18 months.

It may be a mountainous challenge, but at least some smart researchers are on humanity’s case. In a paper for the Future of Humanity Institute in Oxford, Waqar Zaidi and Allan Dafoe analyse the earliest attempts to control the atomic bomb, highlighting some resonant lessons. In short, we should invest little hope in political leaders tackling these risks on their own initiative.
We must depend on scientific experts and civil society to supply the necessary knowledge and political impulse, as has been the case with the environmental movement.

Zaidi said in an interview he was astonished by how radical some early thinking on nuclear arms control had been and how relevant it was to our own times. The devastation of the second world war and the threat of a nuclear cataclysm had boosted support for the creation of the UN.

As early as 1944, Niels Bohr, the Danish physicist, had urged the wartime leaders of Britain and the US to put nuclear arms under international control. Later, leading scientists including Robert Oppenheimer, the “father of the atomic bomb”, and Albert Einstein, the Nobel Prize-winning physicist, argued that nuclear power should only be used for peaceful purposes. Their campaigning won some public support but also the hostility of the military establishment, which branded them politically naive and classified them as security risks.

For a while, the US toyed with such radical “idealist” thinking, reflected in the Baruch Plan presented to the newly created UN Atomic Energy Commission. But in a “realist” statement to the US Congress in 1946, Leslie Groves, who ran the Manhattan Project that built the atomic bomb, laid out a startling choice. “Either we must have a hard-boiled, realistic enforceable world agreement ensuring the outlawing of atomic weapons or we and our dependable allies must have an exclusive supremacy in the field,” he said.

Stalin’s determination to build his own bomb and growing distrust of the Soviet Union pushed the US into choosing the second path, triggering the start of a decades-long cold war.

Zaidi said if scientists wanted to influence the public debate they must learn how to mobilise political support. “Technological experts are essential because they have the credibility and sometimes the celebrity. Politicians never want to get ahead of public opinion but sometimes they respond to it.”

Intriguingly, as talk of a new cold war between the US and China fills the air, I heard one leading AI researcher this week express alarm that a new arms race would only encourage bad outcomes and call for international oversight. “What I would like is close to a functioning version of the UN with a set of guiding principles that all the big players would sign up to and cede power to,” he said.

Distracted politicians are always likely to delay and defer to “realist” arguments unless “idealist” scientific experts empowered by civil society can convince them otherwise.