Atomic Bomb



ORIGINS
MASSIVE RETALIATION
NUCLEAR WEAPONS AND EUROPEAN SECURITY
MUTUAL ASSURED DESTRUCTION
BEYOND DETERRENCE
AFTER THE COLD WAR
BIBLIOGRAPHY

The first atomic bomb was tested in New Mexico in the United States in July 1945, and the second and third were used against the Japanese cities of Hiroshima and Nagasaki the next month, bringing World War II in the Pacific to a ferocious close. By this time Germany had surrendered, yet it was the European war, and the prospect of the Nazis getting the bomb first, that provided the stimulus to the wartime development of the bomb, and it was the demands of European security that continued to influence the development, deployment, and strategic thinking surrounding nuclear weapons.

The remarkable continuity of the Cold War that developed following World War II, and the apparent symmetry, with two alliances each dominated by a superpower owning a formidable nuclear arsenal, provided an unusually stable context. It meant that on the one hand there were always strategic reasons to develop new weapons and explore new strategies, yet on the other there was a limited risk of the weapons actually being used in anger. In part this was because of the caution and circumspection induced by the fear of nuclear war. Memoirs and archives all testify to the anxieties felt by political leaders at times of crisis when there was the slightest risk of having to make decisions that could lead to these weapons being used.

ORIGINS

By the late 1930s, scientific breakthroughs in nuclear physics were coming so quickly that the theoretical possibility of creating massive explosions through splitting individual atoms (nuclear fission) within a critical mass of material (uranium) and so producing a "chain reaction" was becoming widely recognized. Had Germany not expelled so many of its top scientists because they were Jews, it would have been well placed to turn the developing science into actual weapons. In the event, émigré scientists, first in Britain and then in the United States, fearful of this possibility, played critical roles in the wartime race to build the first weapons. Their work was completed just in time for the end of the war, and it remains a matter of speculation what would have happened had there been no opportunity to use these weapons in combat. The secret would not have been withheld from the Soviet Union, which was kept well informed by its spies inside the Anglo-American project.

After years of air raids of increasing horror, the destructive impact of the first atomic bombs was not in itself so shocking: the Allied air raids against Hamburg and Tokyo had inflicted more death and destruction. Apart from the new, insidious feature of radioactivity, the main difference lay in their efficiency. One bomb could achieve what would otherwise require the loads of two hundred heavy bombers. Nor were the new weapons of great importance in the onset of the Cold War in Europe. It has been argued that its nuclear monopoly gave the United States some confidence in bargaining over the shape of postwar Europe, but while there may have been momentary hopes of strategic advantage, the administration of Harry S. Truman (1884–1972) could not credibly threaten their use so soon after World War II (and very few weapons were actually available for use). The Soviet advantage lay in the fact that it was Soviet troops and commissars who actually controlled developments on the ground in Central and Eastern Europe. Although a combination of events, including the communist coup in Czechoslovakia (1948) and the Berlin blockade and airlift (1948–1949), convinced the Americans that they had to make a renewed commitment to Europe, reflected in the formation of the North Atlantic Treaty Organization (NATO) in April 1949, this was at first essentially a political more than a military move.

The limited potential relevance of the weapons to the fate of Europe was apparently confirmed in August 1949, when it was learned that the Soviet Union had tested its own device. Whatever advantages the United States had gained through its nuclear monopoly would eventually be neutralized. The Truman administration accepted that over time it would be necessary to build up conventional forces to counter Soviet strength on the ground if it wished to prevent the sort of push across the Iron Curtain witnessed in the summer of 1950, when communist forces invaded South Korea. As a result the United States and Britain began a major rearmament program.

MASSIVE RETALIATION

To buy time, and because he had no confidence that American restraint would be reciprocated by the Soviet Union, President Truman authorized the development of thermonuclear weapons. The move from fission to fusion weapons, based on the energy released as atomic nuclei fuse together, was almost as important as the original development of atomic bombs. There was now no natural limit on the destructive power of weapons: explosive yields could range from the low kilotons (equivalent to thousands of tons of TNT) to the high megatons (equivalent to millions of tons of TNT). In addition, new production facilities meant a move from scarcity to plenty.
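To give a sense of scale for these units, the standard TNT equivalence allows a rough conversion (an illustrative calculation of ours, not part of the original entry):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Standard convention: 1 ton of TNT is defined as 4.184 billion joules.
\[
1~\text{ton TNT} \equiv 4.184\times10^{9}~\text{J}
\]
% The yields described above therefore cover roughly four orders of magnitude:
\[
1~\text{kiloton} \approx 4.184\times10^{12}~\text{J},
\qquad
1~\text{megaton} \approx 4.184\times10^{15}~\text{J}
\]
% A 10-megaton thermonuclear weapon is thus roughly a thousand times
% as energetic as a 10-kiloton fission weapon.
\end{document}
```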

The administration of President Dwight Eisenhower (1890–1969), which came to power in January 1953, was sufficiently emboldened by these developments, including the success, after some false starts, of the thermonuclear program, to push nuclear weapons to the center of its strategy. This was in part because of the frustrating experience of the inconclusive conventional war fought in Korea and also because of the developing economic burden of conventional rearmament. So much had now been invested in nuclear weapons that the marginal costs of building up the arsenal made them a relatively cheap option compared with matching the Eastern bloc in conventional weapons. Most importantly, Eisenhower was confident in nuclear deterrence. He did not believe that the Soviet Union was in a rush to go to war and was instead prepared for a long haul, during which both sides might maneuver for strategic advantage and test the other's staying power. In this context he saw the great advantage of nuclear weapons as reminding Moscow just what dangers they would run if they ever tried to break out of the developing Cold War stalemate.

U.S. secretary of state John Foster Dulles (1888–1959) announced the new strategy in January 1954. He declared that in the future, rather than attempting to maintain large conventional forces around the Sino-Soviet periphery (the two communist giants were still treated as a single entity at this time), a U.S. response to aggression would be "at places and with means of our own choosing." This was interpreted as threatening nuclear attack against targets in the Soviet Union and China in response to conventional aggression anywhere in the world, and the doctrine became known as "massive retaliation." This was probably intended to be of more relevance to areas other than Europe. The French were battling it out in Vietnam at the time, and Eisenhower had been influenced by the apparent role of nuclear threats in getting the Soviets to agree to an armistice in the Korean War. The doctrine was widely criticized for placing undue reliance on nuclear threats, which would become less credible as Soviet nuclear strength grew. Few doubted that the United States would respond vigorously to any challenge in Europe, but the concern was that a limited challenge elsewhere would find the United States with few options other than the nuclear with which to respond, leaving it with a dire choice between "suicide or surrender."

Yet while the Eisenhower administration clearly put Europe in a higher category than its other security commitments, and did not remove from the continent the conventional forces that had been sent during the alarms of the early 1950s, the dependence on nuclear deterrence created problems. There was bound to be some doubt as to whether the Americans really would be prepared to sacrifice New York or Chicago for Paris or London. Furthermore, the allies were responsible for the security of the western part of a divided Germany and, much more difficult, the western part of a divided Berlin, stuck well inside East Germany and not obviously defensible by conventional means. The conventional buildup set in motion by the Truman administration had always envisaged rearming West Germany. So soon after the Nazi era, this was bound to be controversial. It took until 1954 for a formula to be found by which West Germany rearmed but was permitted no atomic, biological, or chemical weapons and was placed within NATO's military command. In return, the West German government sought a commitment from its new allies to the concept of "forward defense," so that any aggression would be held at the inner-German border, since so much of West Germany's population and industry was concentrated close to it. The German fear was that otherwise its territory would be used to provide a battleground, to be sacrificed, in extremis, to gain time for reinforcements to arrive from North America. Now that NATO was not going to attempt to match Soviet conventional forces, forward defense meant, in effect, that nuclear deterrence was linked to this border.

European governments came to see great advantages in the Eisenhower approach. It did not rely on a readiness to fight conventional wars, which could be almost as devastating for the continent as a nuclear war. One of the problems with conventional deterrence was that Moscow would not feel that its own territory was at risk so long as the combat could be confined to the center of Europe. In addition, this arrangement saved the Europeans the expense of sustaining large-scale conventional forces, especially as they were pessimistic as to the possibility of ever matching Warsaw Pact strength. Despite the evident Soviet concern over what it saw as the anomalous position of West Berlin, NATO countries grew increasingly doubtful that there was a serious risk of a World War III.

NUCLEAR WEAPONS AND EUROPEAN SECURITY

An important shift in American thinking came when John Fitzgerald Kennedy (1917–1963) became president in January 1961. Kennedy was not so sanguine about Soviet intentions, especially when he faced an early and severe crisis over West Berlin, which peaked when the Berlin Wall was constructed in August 1961. Nor did he and his senior aides feel any confidence in nuclear threats given the growing Soviet capacity for retaliation. How could a credible deterrent be fashioned out of an incredible nuclear threat? Kennedy wanted NATO to commit extra forces to raise the nuclear threshold—that is, the point at which nuclear weapons would be necessary to stave off conventional defeat. He was fortified by analyses that suggested that previous intelligence assessments had exaggerated the strength of the Warsaw Pact.

European governments resisted these arguments strongly. They argued that all war had to be deterred, not just nuclear war, and that conventional buildups would be expensive and ineffectual. The Americans kept up the pressure, but after the 1961 Berlin crisis (and the 1962 Cuban missile crisis), concern about a European war subsided. Meanwhile, demands in Vietnam reduced the spare military capacity of the United States. In 1967 a compromise was found in the doctrine of "flexible response." The Europeans recognized the U.S. requirement for an extended conventional stage, so that the first shots across the Iron Curtain would not lead automatically to a nuclear holocaust, and the United States accepted the need for a clear link between a land war in Europe and its own strategic nuclear arsenal. If the alliance looked like it was being overrun as a result of the conventional superiority of the Warsaw Pact, NATO reserved for itself the option of being the first to use nuclear weapons.

This link would be provided by short-range, tactical nuclear weapons (TNWs). These had first been introduced into the NATO inventory during the 1950s as nuclear equivalents to all types of conventional munitions—from mortar and artillery shells to air-delivered bombs, depth charges, and even mines. There was a hope that this extra firepower would make it possible to take on communist forces in limited nuclear wars without resorting to incredible threats of massive retaliation. Simulations of their use during the 1950s soon demonstrated, however, that they were not just more powerful conventional weapons but would cause great death and destruction, including among the people supposedly being defended. Warsaw Pact forces would, moreover, obtain comparable weapons of their own, neutralizing any Western advantage. Nor could there be confidence that nuclear use, once begun, would stop with TNWs. There could soon be escalation to strategic—intercontinental—nuclear use.

Though TNWs could not be considered ordinary weapons of war, their close integration with conventional forces meant that they were more likely than strategic forces to get entangled in a land war in Europe. This created an added risk for the Soviet Union. It was hard to demonstrate that under any circumstances nuclear war would be a rational option for NATO leaders, but once a major war was under way the circumstances would not be conducive to rationality and it was possible that, in the heat of the moment, some nuclear use might be authorized. The dynamics of escalation could lead to a potentially catastrophic conclusion. Deterrence did not require certainty that nuclear weapons would be used—only a small possibility. So TNWs provided a demonstration of the extent to which the fate of both superpowers was linked to stability in the center of Europe.

This meant that deterrence would depend less on the implementation of a clear threat than on the risk of matters getting out of control, which was an uncomfortable prospect. There was an intense alliance debate in the late 1970s over how to replace the first generation of TNWs. If they were made smaller and more precise, this would imply a readiness to use them to fight a nuclear war rather than simply to deter one, and a return to the idea that they were just more powerful forms of conventional weapons. This was the purpose of the so-called neutron bomb (actually a thermonuclear missile warhead or artillery shell of enhanced radiation and reduced blast), which was criticized for blurring the boundary between conventional and nuclear weapons and thereby making it much easier to go nuclear.

MUTUAL ASSURED DESTRUCTION

The debate over the neutron bomb, which ended with President Jimmy Carter (b. 1924) deciding not to deploy it, was the first major public discussion of these weapons since the 1950s. During the intervening years, most attention had been taken up with the problem of the strategic nuclear balance. The doctrine of massive retaliation had assumed that U.S. nuclear superiority could last for some time, but the Americans were slow off the mark in developing intercontinental ballistic missiles (ICBMs) and were stung when the Soviet Union apparently stole the lead—dispelling its reputation for technological backwardness—by launching the first artificial Earth satellite (Sputnik 1) in October 1957, not long after it had also tested the first ICBM. Now the fear was of Soviet superiority, and there was talk of a "missile gap." Of particular concern was that the Soviets might turn their advantage into a first-strike capability, enabling them to mount a surprise attack against U.S. air and missile bases and so render retaliation impossible—the only way to "win" a nuclear war. This possibility would be denied by the development of a second-strike capability: the capacity to launch a devastating riposte even after absorbing an enemy's attempted first strike. If both sides developed first-strike capabilities, future crises would be very tense, because each would feel pressure to gain the advantage by preemption. If, on the other hand, both sides enjoyed second-strike capabilities, the situation should be more stable, as there would be no premium attached to striking first.

In the event, technological developments favored the second strike. As ICBMs were deployed by the United States in the early 1960s, they were placed in hardened underground silos, so that only a direct hit—then unlikely—could destroy them. Submarine-launched ballistic missiles (SLBMs) would be harder still to hit, because they would be harder to find. In principle, effective defenses might shore up a first-strike capability, but the standards for defense against nuclear weapons had to be much higher than for conventional air raids, because of the impact of even one offensive weapon getting through—and the defensive systems would have to cope with thousands. Any improvements in radars and antimissile missiles were countered by even greater improvements in offensive systems—notably multiple independently targetable reentry vehicles (MIRVs), which could swamp any defenses, especially when combined with decoys. Civil defense promised scant protection to civilians: at best, there might be some chance of avoiding exposure to nuclear fallout.

During the 1960s, the U.S. secretary of defense Robert S. McNamara (b. 1916) argued that the situation was one of "mutual assured destruction" (which soon became known by its acronym MAD). This meant that the two superpowers were each able to impose "unacceptable damage" (defined as 25 percent of population and 50 percent of industry). This he considered the source of stability and he urged that all policies, from new weapons procurement to arms control measures, be geared toward this end. Although this approach encountered strong opposition, it was, by and large, followed through the Richard Nixon (1913–1994) and Carter administrations. Opponents warned that if MAD failed to deter, then any war would soon lead to genocide, and also that it suggested that nuclear weapons could only be used to deter other nuclear weapons, thereby adding to the risk of conventional aggression and so undermining the commitments made to allies to use nuclear weapons first on their behalf. Nor, it was argued, was there evidence that the Soviet Union had signed up to this theory. Soviet strategy appeared to envisage using nuclear weapons to obtain a decisive military advantage and reducing the damage that an enemy might do to Soviet territory (if necessary, by launching preemptive strikes).

The main effort to break out of MAD was made by President Ronald Reagan (1911–2004) in the 1980s. Initially he continued the search for offensive nuclear options that would allow the United States to "prevail" in a protracted war with the Soviet Union, but his most significant initiative was to call for a true ballistic missile defense system that could protect lives rather than avenge them, thereby rendering nuclear weapons "impotent and obsolete." The science was bound to defeat the ambition, given the diverse means of delivering nuclear weapons. Reagan coupled this effort with increasingly radical arms control proposals, developed in dialogue with the Soviet leader Mikhail Gorbachev (b. 1931) from 1985. In January 1986 Gorbachev set out a radical disarmament agenda leading toward a nuclear-free world by the end of the century, and Reagan was clearly not unsympathetic to this vision. The only difference was that Reagan saw his Strategic Defense Initiative as fitting in with this vision and Gorbachev did not.

BEYOND DETERRENCE

The shared disarmament agenda constituted a formidable challenge to the orthodox view that nuclear weapons were vital to West European security in the face of preponderant Soviet conventional forces. Yet in practice the Europeans tolerated the steady decline in the credibility of threats to use nuclear weapons first. They concluded that deterrence could survive on only the slightest risk of nuclear use, especially when it was so hard to construct a realistic scenario for a European war. In addition, Europe's own populations were unenthusiastic about allowing their security to depend on nuclear threats. This became apparent following NATO's decision in 1979 to modernize its intermediate nuclear forces (INF) with the Pershing II intermediate-range ballistic missile (IRBM) and the Tomahawk cruise missile. The idea was to convince the Soviet Union that it could be put directly at risk by systems based in Europe in a way that could not be achieved by TNWs. European governments had also expressed concern that during the Strategic Arms Limitation Talks (SALT II, signed in 1979) the United States concentrated on achieving symmetry between the nuclear forces of the two superpowers while paying little attention to the Warsaw Pact's superiority within the European theater in both nuclear and conventional weapons. After NATO's 1979 decision, however, large-scale protests sprang up in Europe and North America. The protests, voicing a concern that a new arms race was getting under way in Europe, took on special urgency following the 1979 Soviet invasion of Afghanistan and the election in 1980 of the hawkish Ronald Reagan.

The protests encouraged NATO to put less public stress on the requirements of flexible response and more on the need to match the deployment of the Soviet intermediate-range SS-20. In November 1981, at a time when there was real doubt that the NATO missiles would ever be deployed, Reagan offered to abandon the program if all SS-20s were removed. This "zero option" was rejected. The measure of the change during that decade was that, once deployment had taken place, Gorbachev agreed to the zero option. In December 1987, Gorbachev and Reagan signed the Intermediate-Range Nuclear Forces (INF) Treaty.

Reagan's interest in a nuclear-free world encouraged discussion of the possibility of a European nuclear force, independent of the United States and based on the French and British capabilities. The United Kingdom had always, officially at least, committed its strategic nuclear forces (which since the late 1960s had consisted of SLBMs) to NATO. Britain's rationale for maintaining a national nuclear force combined the political influence it could bring to bear on its allies, especially the United States, with a claim to be contributing to the overall deterrent posture. France, by contrast, had always had a much more nationalistic rationale, although it claimed that its force de frappe would defend allies. Neither country was eager, or really able, to take over the broader deterrent role from the United States; nor did their allies see them in that role. The alternative means of reinforcing deterrence was to introduce new TNWs, but the Germans could see that these would mean any nuclear war being confined to German soil (East and West), and the political climate was now against any new nuclear systems. Soon the whole issue appeared irrelevant, as European communism collapsed and the Cold War could be declared over. There was nothing left to deter, and the potential targets for NATO's short-range nuclear missiles were embracing liberal democracy and capitalism.

AFTER THE COLD WAR

The traditional calculus of European security was turned upside down. NATO now had conventional superiority—against all comers—and it was Russia that was contemplating nuclear first-use threats to shore up its security position. The nuclear danger in Europe was soon seen to lie less in the traditional threat of a rising and radical great power than in the chronic weakness of a declining one, raising questions about the control of nuclear systems within the former Soviet Union and about the revival of older conflicts and rivalries within Europe that had been suppressed during the Cold War. Instead of moving to introduce new TNWs, the Americans moved to encourage the Russians to eliminate all of theirs, to prevent them falling into the wrong hands, and triggered the process by announcing the withdrawal of their own systems from land and sea deployments. In the strategic arms reduction talks, both sides agreed to progressive reductions in the size of their arsenals, with the pace limited mainly by the cost of decommissioning.

While relations between the old nuclear antagonists had been transformed, new nuclear powers were starting to emerge. After the First Persian Gulf War in 1991, it was revealed just how advanced Iraq had become in its nuclear capability. During the 1990s concerns also grew about Iran and North Korea, while India and Pakistan actually tested nuclear weapons in 1998. Within Europe the major risk of proliferation came with the breakup of the former Soviet Union, but Ukraine, Belarus, and Kazakhstan accepted that they could not hold on to the systems inherited from the Soviet Union. NATO countries took the view that so long as other states had nuclear arsenals, and even a capacity to inflict death and destruction on a massive scale by other means, then it was only prudent to sustain arsenals of their own. What was less clear was whether they would consider nuclear use in response to the use of chemical or biological weapons. In practice because of conventional superiority, particularly in air power, they would have plenty of alternative means of responding without having to inflict massive destruction themselves.

The attacks on the World Trade Center in New York and the Pentagon in Washington, D.C., on 11 September 2001 raised the specter of superterrorists gaining access to nuclear or, more likely, chemical or biological weapons. This specter was used to justify the U.S.- and U.K.-led war against Iraq beginning in 2003, although the lack of subsequent evidence of these capabilities undermined the rationale. This episode nonetheless illustrated the extent to which nuclear weapons, having first been developed in the context of titanic struggles between great powers, when major centers of population had come to appear as natural and legitimate targets for attack, now had to be understood in a world in which the major powers were at peace, and conventional forces could be used with precision, but where weak powers and sub-state groups might try to gain some artificial strength through gaining access to the most destructive weapons.

See also Cold War; Disarmament; Gorbachev, Mikhail; NATO; Nuclear Weapons; Soviet Union; World War II.

BIBLIOGRAPHY

Bundy, McGeorge. Danger and Survival: Choices about the Bomb in the First Fifty Years. New York, 1988. An excellent account of the key decisions made by U.S. policymakers, in particular about nuclear weapons, by an academic who had been a key insider.

Freedman, Lawrence. The Evolution of Nuclear Strategy. 3rd ed. Houndmills, U.K., 2003. Summarizes the history of strategic thought in this area.

Gaddis, John Lewis, Philip H. Gordon, Ernest R. May, and Jonathan Rosenberg, eds. Cold War Statesmen Confront the Bomb: Nuclear Diplomacy since 1945. Oxford, U.K., 1999. Considers the impact on policymakers.

Garthoff, Raymond. The Great Transition: American–Soviet Relations and the End of the Cold War. Washington, D.C., 1994. An excellent discussion of the interaction of nuclear weapons and foreign policy as the Cold War came to an end.

Heuser, Beatrice. NATO, Britain, France, and the FRG: Nuclear Strategies and Forces for Europe, 1949–2000. New York, 1997. Covers the key debates within West European countries.

Holloway, David. Stalin and the Bomb. New Haven, Conn., 1994. The history of the early Soviet program.

Rhodes, Richard. The Making of the Atomic Bomb. New York, 1986. With its sequel below, among the best books on the development of the first nuclear weapons.

——. Dark Sun: The Making of the Hydrogen Bomb. New York, 1995.

Sagan, Scott D., and Kenneth N. Waltz. The Spread of Nuclear Weapons: A Debate. New York, 1995. Debates the big issues of post–Cold War nuclear war policy.

Zubok, Vladislav, and Constantine Pleshakov. Inside the Kremlin's Cold War: From Stalin to Khrushchev. Cambridge, Mass., 1996.

Lawrence Freedman

Atomic Bomb


The mushroom-shaped cloud associated with the above-ground detonation of an atomic bomb is one of the defining images to arise from the mid-twentieth century and represents one of its most challenging moral legacies. The scientific, technological, political, sociological, psychological, religious, and ethical ramifications of humankind's ability to harness and release fundamental forces of nature in a fraction of a second make the atomic bomb one of the preeminent issues of modern society and human existence.

Bomb Engineering

An atomic bomb is a weapon that derives its energy from a nuclear reaction in which the heavy nucleus of an atom such as uranium or plutonium splits into two parts, releasing two or three neutrons along with a vast quantity of energy. If these reactions can be induced rapidly and in quick succession across a critical mass of material, they produce a cataclysmic release of energy from a very small quantity of initial material.
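A back-of-the-envelope calculation, using standard physical constants rather than figures given in this entry, shows why so small a quantity of material can release so much energy:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Each fission of a uranium-235 nucleus releases about 200 MeV:
\[
E_{\mathrm{fission}} \approx 200~\text{MeV} \approx 3.2\times10^{-11}~\text{J}
\]
% One kilogram of uranium-235 contains
\[
N = \frac{1000~\text{g}}{235~\text{g/mol}} \times 6.022\times10^{23}~\text{mol}^{-1}
  \approx 2.6\times10^{24}~\text{nuclei},
\]
% so its complete fission would release
\[
E = N\,E_{\mathrm{fission}} \approx 8\times10^{13}~\text{J}
  \approx 20~\text{kilotons of TNT}.
\]
% In practice only a fraction of the material fissions before the
% weapon blows itself apart, so actual yields per kilogram are lower.
\end{document}
```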

Advances in the design of these weapons have focused on efficiency and effectiveness, including ways to produce purer initial materials, to induce and sustain more rapid reactions, and to produce similar effects with smaller amounts of material. As a result, a small warhead on a missile can now yield effects matching those generated in the 1950s by weapons so large that bombers had to be specially adapted to haul and drop them. Advances in weapons construction techniques further allow even relatively impure materials to be assembled into "dirty" bombs with limited yield but severe environmental effects.

Developments since the mid-1980s have posed new threats to world security, as an ever-expanding set of nations has gained access to raw materials suitable for constructing these devices. Global monitoring of these materials has become increasingly difficult, and nonstate groups have sought, and probably obtained, previously unavailable raw materials with which to construct small-scale nuclear devices for sinister purposes.

The technology behind atomic bombs dates to work in physics including the theoretical work of Albert Einstein at the beginning of the twentieth century and experimental work by Otto Hahn, Fritz Strassmann, Lise Meitner, Otto Robert Frisch, and others in Germany and Sweden in the late 1930s. Scientists in Germany, France, the United Kingdom, the Soviet Union, Japan, and the United States all realized that it might be possible to produce weapons of mass destruction as an extension of the work of the experimental physicists, but it was only in the United States that these efforts were organized and funded to achieve success.


State Construction

The Hungarian refugee physicist Leo Szilard organized his physics colleagues in the United States to petition President Franklin Delano Roosevelt to sponsor work on an atomic bomb, out of fear that the Germans were already well advanced in their own efforts. (This claim was later shown to be completely erroneous.) He enlisted the aid of Einstein in this cause, and Roosevelt responded in the fall of 1939 by devoting $6,000 in discretionary money to preliminary investigations by scientists. This sum had grown to $300,000 per year by 1941, with funds channeled through the National Bureau of Standards to hide their true purpose. By 1941 Vannevar Bush, president of the Carnegie Institution of Washington, had formed and chaired the Office of Scientific Research and Development to better harness the abilities of American scientists to contribute to the war effort. A series of experiments between 1940 and 1942, principally at the University of California at Berkeley and the University of Chicago and later at a remote site at Oak Ridge, Tennessee, established that a fission reaction could be created and controlled; that new elements created in such reactions could also be useful as bomb materials; and that uranium-235 could be separated by a number of different means from the much more abundant uranium-238, which is unsuitable for bombs. Several of these separation techniques involved highly reactive and corrosive materials, especially uranium hexafluoride, and the processes associated with producing the basic materials for atomic bombs generated a whole series of radioactive and dangerous by-products that continue to create problems of waste disposal and health effects to this day.

Bush appointed a secret National Academy of Sciences (NAS) committee in 1941 to advise on whether it was feasible to build an atomic bomb. The committee, chaired by the Nobel Prize–winning physicist Arthur Holly Compton of the University of Chicago, concluded in May 1941 that a further six months of intensive research was needed before a decision could be rendered. Bush was dissatisfied with this report and responded by appointing more engineers to the committee and asking it to reconsider and produce a new report. This second report, delivered on July 18, reached the same general conclusions as the first. By this point, however, Bush had received a secret report from British scientists concluding that an atomic bomb could conceivably be built within the next few years.

Bush used this report and his own persuasive powers to convince President Roosevelt to give his full backing to a large-scale effort to build the bomb. Roosevelt decreed that only four other people were to know: James B. Conant (Bush's deputy and president of Harvard University), Vice President Henry Wallace, Secretary of War Henry Stimson, and U.S. Army chief of staff George Marshall. Members of Congress were explicitly excluded from knowledge of the project and remained so throughout the war. The third and final NAS committee report, completed in November 1941, estimated the cost at $133 million (in 1940 dollars)—a vast underestimate for a project whose final cost reached roughly $2 billion, an enormous sum for a single program, though still a small fraction of total U.S. spending on World War II.

The U.S. Army Corps of Engineers became the vehicle by which this massive endeavor was hidden within the federal war budget, since its large construction contracts attracted little scrutiny. The project was turned over to the corps in June 1942 and code-named the Manhattan Engineer District (MED), after its proposed base of administrative operations in New York City. MED became known colloquially as the "Manhattan Project," even though building the atomic bomb had little to do with the city of New York. Colonel Leslie Groves, the army engineer who had supervised the building of the Pentagon in record time, was promoted to brigadier general and given command of the Manhattan Project.

General Groves swiftly commandeered equipment, supplies, human resources, and the best scientists who could be assembled, and created a series of centers in remote locations in Hanford, Washington; Oak Ridge; and Los Alamos, New Mexico, in addition to maintaining work at many universities and more than 200 corporations, including Stone and Webster, DuPont, Eastman Kodak, and Union Carbide. At its peak in 1944 the project employed more than 160,000 people. This workforce overcame tremendous scientific and technical problems, and the first atomic bomb was detonated successfully at Alamogordo, New Mexico, on July 16, 1945. Three weeks later, on August 6, 1945, the Enola Gay dropped the first atomic bomb used in war over Hiroshima, Japan—a uranium weapon whose destructive power came from roughly 64 kilograms of uranium-235. Two days later the Soviets declared war on Japan and invaded Manchuria, and on August 9 a second bomb, deriving comparable destructive power from a plutonium core of only about 6 kilograms, fell over Nagasaki, Japan, creating equally widespread destruction. The emperor of Japan announced his intent to accept the Potsdam Proclamation and surrender to the Allied forces on August 14, 1945, and the formal surrender took place on September 2.


Assessments

These first atomic bombs affected earth, water, air, and all living organisms in the targeted areas. The Hiroshima bomb delivered the equivalent energy of 13.5 kilotons of TNT, while the Nagasaki device, which derived greater yield from a far smaller fissile core, yielded 22 kilotons. The fireball radius was about 150 yards, with a peak temperature close to that at the center of the sun. The bombs leveled the cores of both cities with a huge shock wave moving at the speed of sound and heat radiation moving at the speed of light. Though sustained for only a few seconds, these effects vaporized entire structures and human beings, seriously burned thousands of others, and sowed radiation poisoning in human and animal tissue, water supplies, building remains, and the very earth itself, which would affect generations to come. J. Robert Oppenheimer, the scientific leader of the Manhattan Project, viewing the test explosion at Alamogordo, was reminded of the words of Shiva from the Bhagavad Gita, a Hindu scripture of India: "I am become death, the destroyer of worlds."

Many scientists associated with the Manhattan Project went on to take leading roles in organizations such as the American Nuclear Society, the Federation of Atomic (later American) Scientists, the Union of Concerned Scientists, and the international Pugwash movement, which sought to stop the spread of nuclear weapons and to educate the public about the brave new world humanity had entered with the creation and use of these devices. Einstein expressed deep regret at his own key role in getting the ear of President Roosevelt for Szilard, later writing that "the unleashed power of the atom has changed everything save our modes of thinking, and thus we drift toward unparalleled catastrophe ... [A] new type of thinking is essential if mankind is to survive and move toward higher levels." Szilard was appalled to learn that America had used the atomic bomb against Hiroshima and devoted himself to the postwar effort to restrict and control the development and use of nuclear weapons. Most nuclear scientists, however, went on to further government contract work, either on thermonuclear weapons more than a thousand times more powerful than those developed during the project or on peaceful uses of nuclear energy. Many scientists, joined by other scholars such as Pitirim Sorokin, Ruth Sivard, Alex Roland, Bruce Mazlish, Kenneth Waltz, and John Mearsheimer, agreed with the assessment of the nuclear scientist Donald York that such weapons rendered war on a large scale too horrific to contemplate and consequently saved hundreds of millions of lives in the standoff between the United States and the Soviet Union known as the Cold War (1945–1989).

Karl Jaspers, a noted German philosopher, argued in Atombombe und die Zukunft des Menschen (1958) that an entirely new way of thinking was required after the creation of the atomic bomb. The philosopher and mathematician Bertrand Russell argued in 1946, in "The Atomic Bomb and the Prevention of War" (Bulletin of the Atomic Scientists 2(5): 19), that the only way to prevent war was through an international government that possessed atomic weapons and was prepared to use them if nations would not heed its directives and settle their disputes amicably with one another.

In the years following the development and deployment of the atomic bomb, the United States and other nations went on to develop more powerful weapons and to test them repeatedly above and below ground. Tens of thousands of civilians and military personnel were exposed to increased amounts of radiation, many unwittingly. The balance of evidence, and the opinion of the majority of expert scientists who have studied the issue, suggests that for the most part the effects were quite minimal, although whether these low levels of exposure have long-term detrimental health effects can be neither demonstrated nor conclusively denied. Throughout this period, the government of the United States consistently assured the American public that there were no risks, despite voluminous information from scientists, and from classified studies it had commissioned, showing such a claim to be preposterous.

Various ethical arguments have been advanced against nuclear weapons. Some have argued, for example, that atomic weapons are "unnatural" and on this basis alone should be banned; but all armaments beyond sticks and stones fall under the same charge. The massive firebombings of British, German, and Japanese cities in World War II killed far more civilians, and in ways every bit as horrendous. While an atomic weapon is more than the "beautiful physics" that Enrico Fermi invoked when asked about his moral qualms over working on the bomb, it must be viewed on a long continuum in the technological evolution of warfare. Whether nations holding nuclear technologies can, and should be able to, prohibit others from acquiring such devices remains an open question, to be decided in sociopolitical processes that will include but not be wholly determined by ethical criticism. There is little question that human thought across a wide range of other subject areas has also been profoundly influenced by the genesis and spread of nuclear weapons. The future of the world rests, increasingly, in the hands of a very small number of individuals.


DENNIS W. CHEEK

SEE ALSO Baruch Plan; Einstein, Albert; Hiroshima and Nagasaki; International Relations; Limited Nuclear Test Ban Treaty; Oppenheimer, J. Robert; Rotblat, Joseph; Weapons of Mass Destruction.

BIBLIOGRAPHY

Badash, Lawrence. (1995). Scientists and the Development of Nuclear Weapons: From Fission to the Limited Test Ban Treaty, 1939–1963. Atlantic Highlands, NJ: Humanities Press. This study portrays the scientific work, ethical and human dimensions, and societal interactions of scientists who worked on nuclear weapons over this period of time.

Bauckham, Richard J., and R. John Elford, eds. (1989). The Nuclear Weapons Debate: Theological and Ethical Issues. London: SCM Press. Series of essays by theologians and philosophers who take up central issues concerning nuclear weapons.

Bergeron, Kenneth D. (2002). Tritium on Ice: The Dangerous New Alliance of Nuclear Weapons and Nuclear Power. Cambridge, MA: MIT Press. Examines ramifications of a U.S. decision in 2003 to allow conventional nuclear power plants to produce tritium, a radioactive form of hydrogen used to convert atomic bombs into hydrogen bombs.

Canaday, John. (2000). The Nuclear Muse: Literature, Physics and the First Atomic Bomb. Madison, WI: University of Wisconsin Press. A detailed analysis of a variety of texts produced by physicists before, during and after World War II.

Einstein, Albert. (1960). Einstein on Peace, ed. Otto Nathan and Heinz Norden. New York: Simon and Schuster. A series of writings on peace by one of the most famous physicists of all time, including many reflections on the nuclear age.

Gusterson, Hugh. (2004). People of the Bomb: Portraits of America's Nuclear Complex. Minneapolis: University of Minnesota Press. A portrait based on fifteen years of research at weapons laboratories that shows how the military-industrial complex built consent for its programs and transformed public culture and personal psychology since the beginning of the nuclear age.

Hacker, Barton C. (1994). Elements of Controversy: The Atomic Energy Commission and Radiation Safety in Nuclear Weapons Testing, 1947–1974. Berkeley: University of California Press. A balanced portrait of weapons testing programs in the United States and their effects, including documentary evidence, clinical, and epidemiological studies. Contains an extensive bibliography.

Hashmi, Sohail, and Steven Lee, eds. (2004). Ethics and Weapons of Mass Destruction: Religious and Secular Perspectives. New York: Cambridge University Press. Structured dialogues among representatives of various religious and secular traditions along with essays on weapons of mass destruction and an analysis of existing agreements among nations.

Palmer-Fernandez, Gabriel. (1996). Deterrence and the Crisis in Moral Theory: An Analysis of the Moral Literature on the Nuclear Arms Debate. New York: Peter Lang. A systematic comparative study of the dominant views on nuclear arms. It suggests the weapons and the plans that accompany them challenge traditional moral reasoning and understandings of the relationship between intentions and actions.

Rhodes, Richard. (1986). The Making of the Atomic Bomb. New York: Simon and Schuster. The author won a Pulitzer Prize for this massive study that is rich in the human, political, and scientific dimensions of the making and use of the atomic bomb.

Smith, Alice Kimball. (1970). A Peril and a Hope: The Scientists' Movement in America, 1945–47. Cambridge, MA: MIT Press. Explores the political influence of scientists on arms control in the period just after the dropping of the atomic bomb.

Atomic Bomb


The scientific discovery that would enable the creation of the atomic bomb occurred on the eve of World War II (1939–45). Experiments with uranium begun in 1934 by the Italian physicist Enrico Fermi (1901–1954) led to the discovery of nuclear fission in 1938. Scientists found that each fission of a uranium-235 nucleus releases roughly 100 million times more energy than is released per atom in a chemical reaction.
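That "100 million times" figure can be checked with a rough order-of-magnitude comparison (our arithmetic, using typical textbook values, not figures from this entry):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% One U-235 fission releases about 200 MeV; a typical chemical
% reaction releases on the order of a few electron volts per molecule.
\[
\frac{E_{\mathrm{fission}}}{E_{\mathrm{chemical}}}
  \approx \frac{2\times10^{8}~\text{eV}}{\sim 2~\text{eV}}
  \approx 10^{8}
\]
% That is, about one hundred million, consistent with the figure above.
\end{document}
```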

Most of the scientists who worked on nuclear fission experiments were German or Italian, and many fled their native countries as the German dictator Adolf Hitler (1889–1945) and the Nazis rose to power. Had these men not emigrated to America, it is quite possible that Hitler would have been the one to control the atomic bomb.

In the late 1930s, scientist Albert Einstein (1879–1955) wrote a letter to President Franklin D. Roosevelt (1882–1945; served 1933–45), encouraging a national effort for the development of an atomic bomb. The government did not move quickly. It was not until mid-1942 that a program, authorized by Roosevelt, began to build the bomb. The Manhattan Project was the name given to the work by a division established within the Army Corps of Engineers. The sole purpose of this project was to develop the atomic bomb.

The first nuclear bomb test was conducted on July 16, 1945, in New Mexico . The test was a success, detonating a bomb as powerful as 20,000 tons of TNT explosives. Within a month, two such bombs were dropped on Japan, killing an estimated 110,000 to 150,000 people and injuring another 200,000 or more. On August 15, six days after the second bomb was dropped, Japan announced its surrender, bringing World War II to an end.

By the early 1960s, tens of thousands of nuclear weapons existed across the globe, with the Soviet Union and the United States owning about 98 percent of them. By the end of 2007 there were still some 26,000 nuclear warheads in existence, more than 95 percent of them belonging to Russia and the United States.


atom bomb


at·om bomb (also a·tom·ic bomb) • n. a bomb that derives its destructive power from the rapid release of nuclear energy by fission of heavy atomic nuclei, causing damage through heat, blast, and radioactivity.

atomic bomb See nuclear weapon.

atom bomb. See nuclear energy.