Editor’s note: this hypertext transcript was made from the PDF original at http://www.bmeia.gv.at/fileadmin/user_upload/Zentrale/Aussenpolitik/Abruestung/HINW14/Presentations/HINW14_S2_Presentation_Eric_Schlosser.pdf. Left-click the local audio recording here – <HINW-ESchlosser2014.mp3> – to download the mp3 file to your machine. This presentation by Eric Schlosser was recorded on 8 December 2014 at the Vienna Conference on the Humanitarian Impact of Nuclear Weapons, at the Hofburg Vienna, in Austria.
Session II
Risk Drivers for Deliberate or Inadvertent Nuclear Weapons Use

Eric Schlosser is the author of Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. See also: “Nuclear ‘Command And Control’: A History Of False Alarms And Near Catastrophes,” NPR, 11 Aug 2014.

It is a great honor to be speaking here today, and about such an important subject. I spent six years researching a book on nuclear weapons. The sources for my book were tens of thousands of pages of documents obtained in the United States through the Freedom of Information Act, and the interviews that I did with bomber crew members, missile crew members, high-level Pentagon officials, Air Force generals, the people who ran the nuclear weapons enterprise in the United States. And what I found is that nuclear weapons really are not symbols of national power or symbols of national prestige. Nuclear weapons are machines. They are man-made machines, designed by human beings and maintained by human beings; and the reason that’s important is that all machines eventually go wrong. It is hard to think of a machine ever invented by mankind that hasn’t gone wrong eventually. Toasters catch on fire, microphones don’t work, cars crash. We’ve been making automobiles on a mass level for more than a century, and yet right now millions of automobiles are being recalled by some of the best brands, like Toyota, Lexus, and BMW, because their airbags may kill the passengers. In that case a technology designed to ensure the safety of passengers may kill them. Commercial airline travel is one of the safest forms of travel, and yet one of the most sophisticated airliners ever built, a Malaysia Airlines Boeing 777, is still missing nine months after its flight. So it’s hard to see how fallible, imperfect human beings can ever create a machine that’s perfect. And this is important to keep in mind with nuclear weapons, because nuclear weapons are the most dangerous machines ever invented.
And nuclear weapons are attached to other machines – to airplanes, to missiles – and then those machines are attached to other machines – to computers, communication systems, radar, early warning systems. And what you get is a very complex technological system. What I learned in my research is that human beings are very good at creating complex technological systems, not so good at controlling them, and not good at all at knowing what to do with them when something goes wrong. What I learned in looking at the history of nuclear weapons is that this technology has not been fully under control since the very beginning. In July of 1945, as the United States was preparing to detonate the first nuclear device, the scientists there, the most brilliant scientists and physicists in the world, were not certain whether the earth’s atmosphere would be set on fire and every living thing on earth killed by the first nuclear detonation. The physicist Hans Bethe thought that was impossible. The physicist Enrico Fermi thought the odds were one in ten, and no one was sure if he was kidding. But they did calculations for a year to see if the earth’s atmosphere would catch on fire because of a nuclear detonation, and they did not know the answer until they detonated that device in July 1945. There is the account of one of the physicists in the Manhattan Project, Victor Weisskopf, who was ten miles away from the blast, saw the fireball, felt the heat on his face at a distance of ten miles, and honestly believed the earth’s atmosphere had caught on fire. So from the very beginning of this technology there has been this sense that we have created it but maybe we have not fully and completely mastered it. One of the struggles during the Cold War was that the United States wanted nuclear weapons that were always available for immediate use, that could be used very quickly, and nuclear weapons that would always detonate when they were supposed to, so that they wouldn’t be duds.
But there was also a concern about having nuclear weapons that would never detonate by accident, that could never be stolen, that could never be used by someone without authorization. So in designing these machines there were two fundamentally different design goals: “always” versus “never”. And the kinds of mechanisms that you need to guarantee a weapon will always detonate are often very different from the mechanisms you need to ensure it will never detonate by accident. And again and again in the history of American weapons design we favored “always” over “never” and came very close to catastrophic accidents as a result. The Pentagon has released a list of 32 “Broken Arrows”: serious nuclear weapons accidents in American history. That’s the official list. Through the Freedom of Information Act, I received a list that said there were more than a thousand accidents involving American nuclear weapons just from 1950 to 1968. Some of those accidents were relatively trivial and minor, but some of the accidents on that secret list were more dangerous than the Broken Arrows on the official list. I’ll give you a few quick examples. In North Carolina in 1961, just a few days after John F. Kennedy was inaugurated, a B-52 bomber broke apart in the air. And when it broke apart, the centrifugal force as it was spinning pulled a lanyard in the cockpit. And that was the lanyard that a crew member was supposed to pull to drop a hydrogen bomb over enemy territory. Now this is a machine. It didn’t know that a human being had not pulled the lanyard. It was designed to just do its job. So it fell over North Carolina. It went through a series of arming steps. When it hit the ground, a firing signal was sent. And there was one switch in that weapon that prevented the full-scale detonation of a hydrogen bomb, hundreds of times more powerful than the Hiroshima bomb, in North Carolina. And it would have sent lethal radioactive fallout up the East Coast.
And that switch in that bomb was later found to be defective in many other bombs, and you would never ever be allowed to use it in a nuclear weapon today. So we were lucky. Then there were also other accidents in which very simple, seemingly trivial things could have led to catastrophic events. At a Minuteman missile silo in South Dakota, there was a problem with the intruder alarm, and a maintenance person went out to check on basically the burglar alarm and used a screwdriver instead of a fuse puller to start checking the fuses in a circuit board. And he pulled one out, no problem. Pulled out another one. Pulled out another, heard a loud explosion: this worker had created a short circuit and launched the warhead off the missile. The warhead didn’t detonate, but it might have. In my book I write at length about an accident in which a worker dropped a tool in a Titan II missile silo in Arkansas; the tool fell to the bottom of the silo, hit the ground, ricocheted, hit the missile, and created a leak of explosive rocket fuel. That missile later exploded. The warhead was blown off the missile and was lost for many hours in the woods. And we now know that that warhead could have detonated and destroyed much of the state of Arkansas. I’ll mention one last accident. At Grand Forks Air Force Base in North Dakota, someone forgot to put a little nut on a fuel strainer, a nut about that big [about one-quarter inch]. And when the pilot started the engines of a B-52 bomber, an engine caught on fire. And there were 12 nuclear weapons on that plane. And this engine was burning. And the fire could have gotten to the nuclear weapons, except the wind was blowing very strongly and blew the flames away from the bomb bay where the weapons were. And eventually they were able to put out the fire.
But I talked to one of the crew members of that plane, and he said, ‘The crazy thing is, if we had been assigned to a different parking space on the runway, the wind would have blown the fire towards the nuclear weapons.’ Those weapons could have detonated, and the city of Grand Forks, with 50,000 people, either would have been destroyed or would have been contaminated with plutonium. In that case the difference between safety and catastrophe was the assignment of a parking space. As long as nuclear weapons exist fully assembled, there will be a risk of catastrophic accident. And every single country that possesses nuclear weapons endangers its own citizens by having them, let alone its enemies. These are complex technological systems. In a very simple technological system, like an assembly line, if you have a problem you shut down the assembly line, you fix the problem, and then you turn the assembly line back on. In these complex technological systems there can be all kinds of things going on during an accident; you don’t even know what’s happening. In the case of Fukushima, to this day we don’t know what’s happening inside those reactors. These are complex technological systems, and they can be prone to common mode failures. One of the most widely deployed American hydrogen bombs during the Cold War, we now know, was vulnerable to a common mode failure. It had a wire, the arming wire, very close to the metal skin, and if prolonged heat was applied to the surface of that hydrogen bomb, we now know three things would happen as part of a common mode failure.
We are very lucky that weapon is no longer in our arsenal, but we could have had catastrophic accidents during the Cold War. There is a sociologist from Yale University, Charles Perrow, who talks about these complex technological systems in which accidents aren’t unexpected; accidents are actually normal. They’re normal accidents because they can be predicted in these sorts of systems. Now, I would like to say that there are no longer risks of accidents. Today America’s nuclear arsenal is much smaller, and our weapons have much more advanced safety devices than they did 40 or 50 years ago. But still, we have problems. In 2008 a maintenance person went to a Minuteman missile silo in Wyoming, opened the door, and realized that there had been a fire there. The fire alarm hadn’t gone off. And we’re very fortunate that the fire burned itself out and didn’t reach the missile. In 2010, at the same Air Force base, 50 Minuteman missiles went offline for an hour. And what that means is their launch crews couldn’t communicate with the missiles for an hour. It was later determined that a computer chip had come loose. But it raised concerns about [what] one of our speakers after me is going to talk about: it raised concerns that someone could hack into our nuclear command-and-control system and launch a missile in a cyber-attack. Now, I am very critical of my country’s management of its arsenal. But we invented this technology. We have more experience with this technology than any other country. And if we’ve come this close again and again to catastrophic accidents, think of other countries. I was able to write my book because the United States is the most transparent about its nuclear weapons system. We don’t know about the accidents in other countries. We do know that the Iraqi design for their weapon, seized after the first Gulf War, was a very unsafe design, and that Russia had a nuclear missile submarine that caught on fire in 2011 with 64 warheads on it.
When the odds of something are low, it still means it may happen. And again and again I was told by the people I interviewed: we were lucky to get out of the Cold War without a nuclear detonation. The problem with luck is that eventually it runs out. Thank you very much.
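Editor’s note: the closing point, that even very low odds accumulate over enough years, can be made concrete with a short calculation. The sketch below is an editorial illustration with entirely hypothetical numbers, not figures from the talk: if each year carries some small independent probability of a serious accident, the chance of at least one accident over many years grows steadily toward certainty.

```python
def prob_at_least_one(p_per_year: float, years: int) -> float:
    """Probability of at least one event over `years` independent trials,
    each with per-trial probability `p_per_year`."""
    return 1.0 - (1.0 - p_per_year) ** years

# Hypothetical example: a 1% annual chance of a serious accident,
# compounded over roughly 45 years of the Cold War.
p = prob_at_least_one(0.01, 45)
print(f"{p:.0%}")  # prints 36%
```

A one-in-a-hundred yearly risk thus becomes better than a one-in-three cumulative risk over the Cold War period, which is the arithmetic behind “the problem with luck is that eventually it runs out.”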