by Stanley Goldberg
© 1995 by the History of Science Society, All rights reserved
There are four parts to this thematic essay. It begins with a general discussion of the relationship between science and technology, especially how that relationship was viewed in the United States before and after World War II. This leads to a brief definition of the concept of Big Science. We turn then to the project which more than any other became the symbol of Big Science: the building of the atomic bomb. We conclude with a very brief description of how one of the first postwar Big Science enterprises, high-energy physics, was established.
Science and Technology
Though they are often conflated and confused, science and technology are distinct activities. For the purposes of this essay, science includes activities directed toward understanding the way the world works. Since what we know about the world, we know through our senses, it is never possible to ascertain whether the theories and models that we devise for explaining various classes of phenomena are correct. In fact, the history of science is a history of the succession of such theories and models. These changes are driven partly by experience and partly by the social and cultural milieu in which science operates. Our inability to penetrate the cyclorama behind our experiential net means that, while eventually there is almost always agreement on what the phenomena are, there is never complete consensus on what those phenomena tell us about the nature of the world. This is reflected not only in the arguments scientists have over the proper interpretation of data, but also in the deep and divisive epistemological debates that have occupied philosophers, historians, scientists, and other scholars since at least the time of the Greeks. We cannot even agree on whether science makes progress in the sense that successive theories about the same class of phenomena represent closer approximations to the true laws governing the phenomena in question.
Technology includes all those activities devoted to manipulating the world of experience. Such manipulation may range from creating environments with particular characteristics, for example, greenhouses or the experimental arrangements used in the prosecution of science, to the invention of all manner of gadgets and tools: from cherry pitters to food processors, from cherry pickers to ladders, from chariots to locomotives, from barometers to computers, from magnetic compasses to navigational satellites, from needles to I-beams.

Earlier, I noted that science and technology are distinct from each other. Technology does not require science in order to proceed; it is not necessary to understand the world in order to be able to manipulate it. For example, people did not wait around for the development of plane geometry and the theory that space was Euclidean in order to invent the wheel. Nor did people wait for the development of the science of heat, thermodynamics, before inventing the steam engine. Such engines emerged out of a succession of elaborations of devices which can be traced back to the ancient world. This points up the fact that technological innovation is sui generis, requiring only prior technology. Furthermore, it is possible to speak of technological progress in terms that are easily specifiable.
Technology makes science possible in several ways. First, it provides the tools required for doing science: the measuring devices and the specialized environments necessary for the pursuit of scientific investigations. Second, it often turns up hitherto undetected phenomena which undermine the current consensus with regard to our understanding of how the world works and which often simultaneously suggest new directions in which to strike out. As our technologies become more sophisticated, we are able to make increasingly delicate measurements, which invariably show that theories we once thought sound can no longer stand the test of experience.
Some scholars, while acknowledging that in earlier times technology was totally independent of science, maintain that, increasingly since the middle of the nineteenth century, advances in scientific knowledge have been required for pointing technology in specific directions and for providing the theoretical underpinning necessary for technological development. This is reflected in the recently coined phrase, “science-based technology.” As we will see, this was not the case with the development of the atomic bomb. In general, careful examination of almost any of the major technologies which have emerged in the last half of the twentieth century shows that the basic drivers of innovation remain such things as market forces, social concerns, and inspired guesses prompted by technological bottlenecks.
This is not to say that science does not play any role in this process. First, the need to develop instrumentation in support of scientific investigations serves as an important inspiration for technological innovation. Second, the major focus of various scientific disciplines at any time often directs attention to phenomena which otherwise might not receive the intense study that leads to their exploitation. Science also provides a heuristic ambiance for investigation which encourages technological innovation. Working in close proximity to scientists who are willing to entertain seemingly contradictory and counterfactual hypotheses serves as a spur and an inspiration to raise possibilities for technological innovation that might not otherwise suggest themselves.
Prior to World War II, virtually no public money was available in the United States for scientific research in or out of university settings. Exceptions included funds for agricultural research and some grants for medical studies. Most research activities in physical science were small, bench-top affairs, requiring little in the way of expensive equipment or large investments of personnel. Problems were usually chosen to avoid such requirements. Some physical science departments in large and prestigious universities had professionally run machine shops and often employed glass-blowers, but such luxuries were the exception. Most research had to be accomplished during the summer or squeezed in around full teaching assignments. There were some exceptions. For example, Albert Michelson, the United States’ first Nobel Laureate in physics (1907), with the financial backing of wealthy industrialists, was able to organize a series of large-scale projects to determine the speed of light in a variety of contexts. After 1929, Ernest O. Lawrence, the inventor of the cyclotron, ran an enormous laboratory employing many scientists and technicians; his projects served as the vehicle for many Ph.D. candidates in nuclear physics and chemistry. Most of the money came from private foundations. Until the Second World War such projects were noteworthy for their rarity.
World War II radically changed the picture. In the spring of 1940, President Franklin Delano Roosevelt agreed to a plan put forth by Vannevar Bush (see Biographical Information at end of essay) for organizing and directing scientific research toward preparation for “the upcoming conflict.” Initially, the new organization, the National Defense Research Committee (NDRC), was funded from presidential emergency funds authorized by the Congress. A year later, Bush reorganized the NDRC under his newly formed Office of Scientific Research and Development (OSRD). Bush pioneered the introduction of a system in which the government contracted for research and development services from university scientists and private industry. This system forced a new relationship among government, university and the private sector which resulted in an unprecedented infusion of public resources into university based research. The research being sponsored covered the gamut of technical problems faced during the war and ranged from the development of radar and sonar to the perfection of sulfa drugs and penicillin and the invention of insecticides such as DDT. The initial investigations into the possibilities of the atomic bomb were included within Bush’s organization, but when it became clear that substantial production facilities would be required, Bush took steps to turn the project over to the Army Corps of Engineers, whence it acquired the name “Manhattan Engineer District,” a ruse designed to make it appear to be just another of the Corps’ district offices. Subsequently, it became known as “The Manhattan Project.”
After the war, these kinds of research activities did not suddenly end. Rather, government support for research in science intensified and proliferated. Several of the research centers of the Manhattan Project formed the core of a set of so-called “National Laboratories.” Universities often received contracts from specific agencies such as the Department of Defense, the Department of Agriculture, or the Department of Commerce. Congress created agencies such as the National Science Foundation which could provide large grants through university departments to individuals and teams of scientists.
In 1961, Alvin Weinberg, the Director of Research at the Oak Ridge National Laboratory, coined the term “big science” to take note of the fact that “many of the activities of modern science – nuclear physics, or elementary particle physics, or space research – require extremely elaborate equipment and staffs of large teams of professionals . . . .”(1) Weinberg went on to note a series of conflicts and problems which the emergence of big science had created. Among them were criteria for the allocation of scarce resources, mediation between the interests of competing laboratories and individuals, and equitable distribution of funds (and as a result talent) between large scale and small projects as well as between different regions of the country. Since Weinberg first called explicit attention to big science, the phenomenon and the resulting difficulties have intensified to what many today consider crisis proportions. The results of some of the larger physics projects now being pursued are reported in papers listing hundreds of individual co-authors. The legislative battles over the funding of the Human Genome Project, the Superconducting Super Collider, and research on AIDS are simply three of the better known struggles that mark current tensions in our society over the question of the proper niche that the social institutions of science should occupy. Not infrequently in such debates, reference is made to the need for another Manhattan Project-style effort. Of all the projects spawned by the war, it is the Manhattan Project which captured the public imagination after the war, and it is that project which more than any other has served as the prototype of big science.
The Manhattan Project
The history of the Manhattan Project may be conveniently divided into three periods, reflecting three major historiographical issues on which scholars have concentrated: 1) the decision to go forward with the program; 2) the relationship between the administrative and technical sides of the project; and 3) the decision to use the bomb on Japan.
The Decision to Build the Bomb
When the chemical atomic theory was introduced at the beginning of the nineteenth century, the atom was conceived of as the unanalyzable, uncuttable fundamental building block of all matter. Each of the chemical elements was composed of a different kind of atom, but each atom of a species was identical to every other. For example, all oxygen atoms were like all other oxygen atoms but different from the atoms of any other element: copper, zinc, hydrogen, etc.
By the turn of the twentieth century, the expanding range of known phenomena made such a view untenable, and by the mid 1930s the atomic theory, while more secure than it had been earlier, was much more complex and sophisticated. The notion of the atom as uncuttable had been abandoned. In its place was a complex solar-system-like model. At the center, the nucleus was pictured as being composed of two kinds of particles, positively charged protons and uncharged neutrons. A proton and a neutron have about the same mass. The number of protons determines the chemical nature of the atom, that is, what element it is. For example, hydrogen, the simplest atomic species, has only one proton; neon has 10; iron, 26. Uranium, with 92 protons, has the most of any naturally occurring element. But the same element can be composed of atoms having differing numbers of neutrons in the nucleus. For example, three kinds of hydrogen are known. The simplest nucleus contains but one proton. A nucleus with one proton and one neutron is still hydrogen but has twice the weight. It has acquired the name deuterium and has been dubbed “heavy hydrogen.” (Water made with deuterium instead of ordinary hydrogen is called “heavy water.”) Tritium, a third kind of hydrogen, contains one proton and two neutrons. Such variations of the same element are called isotopes of each other. Most naturally occurring elements are composed of mixtures of isotopes. Natural uranium ore, for example, is largely a mixture of two isotopes: uranium 238, containing 92 protons and 146 neutrons, and uranium 235, containing 92 protons and 143 neutrons. Uranium 238 is far more plentiful: in the earth’s crust, for every 139 atoms of uranium 238, one finds one atom of uranium 235.
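That 139-to-1 abundance ratio can be turned into a percentage with a one-line calculation. The following sketch is illustrative and not part of the original essay:

```python
# Natural uranium contains about 139 atoms of U-238 for every atom of U-235.
ratio_u238_per_u235 = 139

# Atom fraction of U-235 in natural uranium.
u235_fraction = 1 / (ratio_u238_per_u235 + 1)
print(f"U-235 makes up about {u235_fraction:.2%} of natural uranium")
```

In other words, less than one atom in a hundred of natural uranium is the rarer isotope, a scarcity that foreshadows the separation problem discussed later in the essay.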
Surrounding the nucleus, in a way somewhat analogous to the manner in which the planets orbit the sun, are negatively charged particles, electrons, one for each of the protons in the nucleus. Orbiting the hydrogen nucleus is one electron; orbiting the uranium nucleus are 92 electrons. Overall, the atom is electrically neutral. And the mass of the electrons is a negligible fraction of the mass of the proton or the neutron.
One of the most puzzling questions confronted by theoretical and experimental nuclear physics in the late 1930s was the nature of the forces which held the nucleus together. As is well known, like electric charges repel each other. Within most nuclei there are a number of positively charged protons in extremely close proximity. The repulsive electrical forces are enormous. Yet much of the matter of the world is quite stable, which suggests that there must be some extraordinarily powerful nuclear forces acting over extremely short ranges. That these forces must act over short range is suggested by the fact that when protons or other charged particles are used as projectiles to explore the nature and behavior of the nucleus, they are repelled. Yet within the nucleus these forces of repulsion must somehow be overcome; otherwise all nuclei would spontaneously disintegrate.
As it turns out, not all combinations of protons and neutrons form stable nuclei. For example, while hydrogen and deuterium are stable (in nature one finds one atom of deuterium for every six thousand atoms of hydrogen), tritium, composed of two neutrons and one proton, is unstable. Every once in a while the combination disintegrates spontaneously, one of the neutrons converting to a proton and ejecting an energetic electron from the nucleus. In doing so the nucleus is transformed from an isotope of hydrogen to an isotope of helium, the element containing two protons. It is not possible to predict the decay of a particular tritium nucleus, but measurements reveal that half of any portion of the substance will disintegrate over a period of about twelve years. Some species are much more active, with so-called “half-lives” of minutes or even seconds, while others have half-lives of centuries or longer. For example, uranium 238, while not stable, has a half-life of about 4.5 billion years. Unstable isotopes are said to be “radioactive.”
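Half-life behavior follows a simple rule: after a time t, the surviving fraction of a sample is (1/2)^(t/T), where T is the half-life. A brief illustrative sketch, using the roughly twelve-year tritium half-life mentioned above (the function and figures are added for illustration, not taken from the essay):

```python
def surviving_fraction(t_years, half_life_years):
    """Fraction of a radioactive sample remaining after t_years."""
    return 0.5 ** (t_years / half_life_years)

# Tritium, with a half-life of about 12 years:
print(surviving_fraction(12, 12))  # 0.5 -- half remains after one half-life
print(surviving_fraction(24, 12))  # 0.25 -- a quarter after two half-lives
print(surviving_fraction(60, 12))  # about 0.03 -- little is left after five
```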
The discovery of the neutron in 1932 reinvigorated experimental studies in nuclear physics. Being a neutral particle, the neutron should not experience repulsive electrical forces as it nears the nucleus. Laboratories around the world developed programs for bombarding samples of material with neutrons. Physicists were convinced that the resulting reactions would give new insights into the nature of the nucleus and nuclear forces. In this effort one of the chief tools was the so-called “atom smasher,” such as Ernest Lawrence’s cyclotron, for which he was to receive the 1939 Nobel Prize. The role played by these machines in physics is analogous to the role of the microscope in biology.
The favorite target for these experiments was uranium. The view prevailed that some of the uranium atoms might absorb neutrons and thus be transformed in the subsequent reactions into so-called transuranic elements. Between 1932 and 1938, physicists puzzled over the confusing results of these kinds of experiments. Finally, in late December 1938, two German scientists, Otto Hahn and Fritz Strassmann, announced that they had discovered that neutron bombardment of uranium occasionally resulted in the splitting of the uranium atom into two roughly equal pieces and the release of a considerable amount of energy. In other words, the atom did not just split; the pieces flew apart. Atom for atom, the amount of energy released was millions of times greater than the energy released in ordinary combustion or in the explosive chemical reaction undergone by a substance such as TNT. The phenomenon was soon given the descriptive name “fission,” an obvious reference to the splitting of cells in living systems.
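The “millions of times” comparison can be made concrete with rough modern figures, added here purely for illustration: a single fission of uranium 235 releases on the order of 200 MeV, while an energetic chemical reaction releases only a few electron volts per molecule.

```python
fission_energy_ev = 200e6  # roughly 200 MeV released per U-235 fission
chemical_energy_ev = 5.0   # a few eV per molecule for an energetic chemical reaction

ratio = fission_energy_ev / chemical_energy_ev
print(f"Fission releases roughly {ratio:.0e} times more energy per atom")
```

The ratio comes out in the tens of millions, which is why a few pounds of fissile material could rival thousands of tons of conventional explosive.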
The discovery of the phenomenon of fission occurred against a background of growing international tension. Much of the world watched with growing uneasiness as Germany rearmed and as it moved aggressively against German Jews and others considered to be of impure and tainted ethnic origins. In the course of 1938, Germany had marched into Austria and had also occupied the Sudetenland in Czechoslovakia. The United States and Germany had broken off diplomatic relations. France called up its reserves, and Great Britain’s internal politics were at the boiling point over the issue of how to deal with Hitler, while central Europe waited for the other shoe to drop.
Fission was a phenomenon which had not been anticipated, but the moment it was noticed, many of those who had been doing this kind of research realized that they had been overlooking the obvious. They also understood the possible implications for what was already being described as “the upcoming conflict,” World War II. Within days of the announcement by Hahn and Strassmann, laboratories around the world had duplicated the effect. For example, the moment he heard about the announcement, the United States physicist Luis Alvarez, who had been investigating the effects of bombarding uranium with neutrons at Lawrence’s laboratory at the University of California, Berkeley, replicated the Hahn and Strassmann result for an audience of theoretical and experimental physicists. The fissions of uranium were evident in a visual display on an oscilloscope. As he watched in fascination, the theoretical physicist J. Robert Oppenheimer spelled out the potential consequences: the controlled generation of power on a scale which until then had been only the dream of science fiction writers, and the development of weapons of hitherto unprecedented power. At that point there was no practical scheme for effecting such results, but the phenomenon had been revealed. Barring some trick of nature, there could be no question of what lay before us.
When the announcement of the discovery of fission was made, Niels Bohr, the Danish patron saint of atomic and nuclear physics, was preparing to sail to the United States to attend a conference being held during January 1939 in Washington, DC. The conference was being held at the Carnegie Institution of Washington, one of the premier centers in the United States for research in nuclear physics, and Bohr announced the discovery of fission to the assembled participants. Some of the participants immediately replicated the effect on Carnegie Institution equipment, but on a more ominous note, when Bohr made the announcement, a discussion ensued at the open meeting as to whether or not to close that part of the meeting to the press.
Over the next year-and-a-half, it was largely the activities of a small group of European scientists who had fled to the United States to escape Nazi tyranny and persecution that kept alive the idea of developing a technological program to explore the possibility of capitalizing on the energy released in nuclear fission. It was assumed that the Germans would make every effort to pursue the potential of fission technologies.
Opinions among scientists as to the prospects for exploiting the phenomenon of fission for controlled or explosive releases of energy ran the gamut from impossible to certain. The entreaties of immigrant scientists such as Leo Szilard and Eugene Wigner fell on deaf ears. Recall that there was no tradition of government support for this kind of research. Private foundations were already supporting research in nuclear physics but made no move to underwrite investigations examining the viability of fission technologies.
Undaunted, Szilard hit on the idea of using the fame and authority of his friend and colleague, Albert Einstein, to reach President Franklin Delano Roosevelt. He convinced Einstein to sign a letter informing Roosevelt in general terms of what might be technologically possible. The letter pointed out that Germany had already suspended export of uranium ores and warned Roosevelt of the consequences of Germany’s obtaining such a weapon before the United States did. Szilard next arranged for a confidant of FDR, Alexander Sachs, to deliver the letter to the President. Though Szilard had gotten Einstein’s signature in August, Sachs was not able to see the President until October. Roosevelt responded by authorizing a small sum of money, six thousand dollars, to set up a committee on uranium research to be headed by Lyman J. Briggs, director of the National Bureau of Standards.
Two questions had to be answered if fission was to be effectively exploited. First, which isotope of naturally occurring uranium, 235 or 238, was responsible for the observed incidents of fission? Second, if fission was to be useful, it had to be self-sustaining. The only obvious way that might be possible was if several neutrons were released during the fission process.
By the middle of 1940, with additional monies from the President, it was discovered that uranium 235 and not uranium 238 was responsible for the observed fission and that indeed, on the average, each fission resulted in the release of between two and three neutrons. Thus, a chain reaction would be possible, but as is almost always the case, in answering these questions, a whole host of new questions emerged.
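Two to three neutrons per fission is what makes a chain reaction possible: if each fission triggers, on average, k > 1 further fissions, the number of fissions multiplies by k every “generation.” A minimal sketch of that geometric growth (the multiplication factor and generation counts are illustrative assumptions, not figures from the essay):

```python
def fissions_in_generation(n, k=2.5):
    """Fissions occurring in generation n, starting from a single fission."""
    return k ** n

# With k = 2.5, growth becomes explosive within a few dozen generations:
print(fissions_in_generation(10))  # nearly ten thousand fissions
print(fissions_in_generation(50))  # about 8e19 fissions
```

Conversely, if absorption and escape push the average below one, the reaction dies out, which is why moderator purity and assembly geometry mattered so much in what follows.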
One set of questions surrounded the issue of how to build a practical device to take advantage of the chain reaction for the generation of power. The weight of informed opinion favored building a structure, at first called a “pile” and later a “reactor,” containing lumps of uranium embedded, lattice-like, in a moderator. The moderator’s sole function was to slow down by collision, without absorbing them, the fast-moving neutrons that emerged from fissioning uranium 235, until they eventually encountered other uranium 235 atoms and continued the fission process. The leading candidates for such a moderator were heavy water (i.e., water made with deuterium rather than ordinary hydrogen) and carbon. Heavy water was quite scarce and quite difficult to make. Carbon, on the other hand, had to be purified for this application to a degree which stretched the limits of industrial practice. Meanwhile, in examining the behavior of various isotopes as a function of combinations of protons and neutrons, some physicists predicted that should one be able to manufacture element 94 (that is, the element with two more protons than uranium) with 145 neutrons, it should be at least as prone to fission as uranium 235. The route to building this element began with the absorption of a neutron by uranium 238. This combination would prove to be unstable, and within a matter of days it would go through a series of nuclear transformations to become element 94. In other words, in a pile composed of natural uranium, the neutrons which emerged from the fission of uranium 235 would either end up fissioning other uranium 235 atoms or would be absorbed by the more prevalent uranium 238 and produce a new fissionable element, to become known as plutonium.
A second major set of questions surrounded the techniques that might be used to fashion a bomb. This required the invention of technologies based on the sudden rather than the controlled release of energy resulting from fission. The first question that had to be answered here was whether or not uranium 235 would fission on impact from very fast neutrons, as opposed to neutrons which had been slowed down by many collisions with inert carbon atoms. It turned out to be an easy question to answer: uranium 235 did fission when impacted by fast-moving neutrons. The basic outline of how to build a fission bomb was not difficult to conceive. It simply required assembling a sufficient quantity of uranium 235 in a very short time. Once that feat was accomplished, an explosion would follow.
What was meant by a “sufficient quantity”? The volume occupied by the uranium 235 had to be great enough that the chances of a neutron emerging from a fission and escaping were very small in comparison to the chances of its encountering and fissioning another uranium 235 atom. Such a mass was termed a “critical mass.” All one had to do was to bring two subcritical masses together. But this had to be done extremely rapidly and in the absence of any stray neutrons. Suppose, for example, that as the assembly was in progress, a neutron began the fissioning process. A chain reaction would begin, but it would fizzle because many of the neutrons would still be able to escape. While the exact amount of uranium 235 required was not known with any precision (estimates varied from ten to several hundred pounds), the best estimates suggested that the assembly would have to occur in less than a millionth of a second. And once that assembly was accomplished, a technique of some sort for injecting a supply of neutrons to start the reaction would have to be devised. A technique for assembly which immediately suggested itself was to use a small cannon of some sort. The projectile would be a subcritical mass of uranium 235, perhaps in the shape of a small sphere, and the target, which would have to be contained at the end of the gun barrel, would be another subcritical mass of uranium 235 in the shape of a doughnut or hollow cylinder. The muzzle velocity would have to be high enough to assemble the parts in a short enough time. At that instant, perhaps as a result of the act of assembly, another process would be initiated to release some neutrons. The entire technique was dubbed “gun assembly.”
So the basic outline for how an atomic bomb might be built was established. It was still not known just what constituted a critical mass, and it was not known if plutonium could be used instead of uranium 235, but the most pressing and seemingly intractable technical problem that now emerged was how to collect sufficient amounts of pure uranium 235 in order to assemble a critical mass. Isotopes of the same element are chemically identical, so chemical separation is impossible. The major distinguishing feature between the two isotopes is their weight. Uranium 235 weighs a little more than one percent less than uranium 238 and should therefore respond slightly differently to electrical or mechanical forces, but the difference is tiny and difficult to exploit. A variety of strategies was proposed, and though some of them showed promise, in the period between the fall of 1939 and the summer of 1941 not one of the six or seven being tried had produced definitive results.
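The fractional mass difference that every separation scheme had to exploit can be checked directly. The sketch below uses mass numbers as approximate atomic masses, an assumption adequate for this rough comparison:

```python
mass_u238 = 238  # approximate atomic mass of U-238 (mass number)
mass_u235 = 235  # approximate atomic mass of U-235 (mass number)

relative_difference = (mass_u238 - mass_u235) / mass_u238
print(f"U-235 is lighter than U-238 by about {relative_difference:.2%}")
```

A difference of barely over one percent is why every separation method, mechanical or electromagnetic, required enormous plants and repeated enrichment stages.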
As this work proceeded at various government and university laboratories under the direction of Lyman Briggs’s uranium committee, the scope of operations kept increasing and the costs mounted steadily. Money for the various projects was still being supplied from the president’s emergency funds. The major motivation for continuing the work was the fear that Germany must be working hard on the development of its own atomic bomb. It was assumed that German work on such a device had begun immediately after the discovery of fission at the end of 1938. Since American and British efforts did not begin in earnest until the end of 1939, the consensus was that the Germans must be at least a year ahead. In order not to alert or aid the Germans, a policy of informal censorship emerged which at first was voluntary but which, by the middle of 1941, had become official government policy. While no one commented on it publicly at the time, after the war many scientists not involved in the program reported that the rather sudden disappearance of papers on nuclear physics had signaled to them that a program to exploit fission must be going on in secret.
When President Roosevelt authorized the formation of the NDRC in the spring of 1940, he also agreed to Vannevar Bush’s suggestion that the Uranium Committee be made an administrative part of the new organization. At the time, Bush noted that it was not clear that anything of promise would come from this work, but in the event that it did, the development of nuclear technologies during war time would properly fall within the jurisdiction of NDRC. The transfer also meant that ever growing budgets for the projects of the Uranium Committee could more easily be accommodated within the structure of the NDRC.
For a year, Bush took no other actions with regard to the work of the Uranium Committee. Rumors persisted during that year that the Germans were working hard on the development of a fission bomb and some of those rumors suggested that they were making progress. British physicists and engineers who were investigating fission technology were concentrating exclusively on the prospects of building a bomb. In the United States, opinion was still divided not only on this issue, but on whether or not it would be possible to exploit the phenomenon to generate controlled power.
On the one hand, Bush realized that should the Germans obtain a weapon of such power as some experts expected the device to have, the war might well be lost before it began. On the other hand, advisors to Bush, including Harvard’s president James B. Conant, were skeptical that such a bomb was really in the offing. They pressured Bush to husband funds and material resources for those technical projects which were likely to be of use in the current conflict: radar, sonar, the proximity fuse, various chemical and medical technologies, and the like. In the meantime, as all these programs grew, funding by use of presidential discretionary funds became more and more problematic.
Finally, in April 1941, Bush decided that it was time to make a decision on whether to push ahead with nuclear technologies or to suspend such work for the duration of the war and turn the material and human resources then occupied in this work toward other, more pressing war-related technical projects. As he usually did in reaching a conclusion on such matters, Bush assembled a committee of experts to look into the question. In this case, he turned to the National Academy of Sciences and asked its president, Frank B. Jewett, retired president of the Bell Telephone Laboratories, to assemble such a committee. The committee was chaired by Chicago physicist and Nobel Laureate Arthur H. Compton. In May 1941, the committee reported that it was likely that fission technologies could be developed to power ships or to produce hitherto unavailable radioactive isotopes which might be used in warfare as poisons. There was also the slim chance that a weapon of unprecedented power might be possible. It recommended $350,000 of additional funding and continued research for six months, at which time the issue of whether or not to continue should again be examined. Both Bush and Conant were dismayed at the lack of focus on engineering questions. They immediately asked the committee to reconsider the question and augmented its membership with several engineers. In July the committee returned essentially the same recommendation. Bush now let the matter rest for several months. Though little progress was made over the summer on solving the problem of the separation of uranium isotopes, it was confirmed that element 94, plutonium, did undergo fission and in the process, like uranium 235, did emit neutrons. (The analysis had been done on microgram quantities of the new element, which had been produced using the cyclotron at Lawrence’s laboratory.)
Finally, in October, Bush asked the NAS committee to reconsider the matter again. But this time he charged the committee to concentrate on the very narrow question of whether or not it would be possible to engineer a fission bomb. The committee, further bolstered by the participation of chemists and chemical engineers, now reported back that with sufficient effort the creation of such a bomb within the next several years was a virtual certainty and that such a project would require $133 million.
Bush used this third report as a warrant to immediately begin reorganization of the work on uranium. He began by authorizing engineering studies for the construction of pilot plants and semi-works to proof-test various techniques for uranium separation and the production of plutonium in uranium-fueled fission reactors. He formally created three research centers for the project: one at Columbia University was headed by chemist and Nobel Laureate Harold Urey; another at the University of Chicago was headed by physicist and Nobel Laureate Arthur H. Compton; and the third at the University of California, Berkeley, was headed by Nobel Laureate physicist Ernest O. Lawrence.
The Columbia operation, under the code name Substitute Alloy Materials (SAM) Laboratory, took up the question of mechanical techniques for uranium isotope separation. There were two leading candidates: gaseous diffusion and the gas centrifuge. The work at Chicago initially concentrated on three problems: design of nuclear reactors for the production of plutonium; techniques for chemically separating the plutonium from uranium; and the design of the weapon itself. Later, weapons design would be transferred to a new laboratory at Los Alamos, New Mexico. The Chicago laboratory was dubbed The Metallurgical Laboratory (MET). The Berkeley operation, known as the Radiation Laboratory, initially concentrated on the development of electromagnetic techniques for separating uranium isotopes. Later the laboratory undertook many studies in support of the work at Los Alamos.
Bush realized that large sums of money would be required. For reasons of security and expedience, he did not wish to have to ask the Congress for funding. Therefore, Bush began the process of turning the project over to the Army Corps of Engineers within whose massive wartime budget such a project could be easily buried.
At about the same time that the Americans were deciding to push ahead with an all-out effort to build an atomic bomb, the Germans were deciding not to undertake a similar effort. The Germans were convinced that the war would be over very quickly. It did not seem likely that a fission weapon could be finished in time to play a role in their victory. For the rest of the war, German research on fission was restricted to low-level efforts to create a self-sustained, controlled chain reaction. The United States decision was driven largely by the belief that the Germans were working on a fission bomb and were probably ahead of the Americans. The Germans had no such fears. The German science and technology communities were recognized as world leaders; if they thought a fission bomb could not be created in time to be of use during the war, it was inconceivable to them that anyone else might be able to construct such a weapon.
Building the Bomb
Formal transfer of control to the Army Corps of Engineers did not take place until June, 1942. Colonel James C. Marshall, who had been District Engineer in the Syracuse, NY, area, was appointed as head of the project. Studies were immediately undertaken to identify a location for factories and pilot plants. Requirements included good supplies of water and power and relative isolation. The preferred site was a 57 square mile parcel of land along the Clinch River in the hills of eastern Tennessee, not far from Knoxville. It was to become known as Oak Ridge.
In general, however, there seemed to be little progress. This worried Bush. He had committed himself to an all-out project on the assumption that a weapon would be ready to be used during the war, and he had done so without consulting Congress. Large sums of money had already been committed, and some of the country’s scientists and engineers had been recruited to this project rather than to some other. As fall 1942 approached, Bush appealed to the leadership of the Corps to replace Colonel Marshall with a person having a more aggressive style. In response, after consulting with the highest authorities in the Army, the Chief of Engineers appointed Colonel Leslie R. Groves (See Biographical Information at end of essay).
Groves was deeply disappointed when he received orders to assume command of the Manhattan Engineering District. In the summer of 1942 he was hoping to be given command of a combat engineering division. This was not just a reflection of his patriotism and his sense of wanting to be at the center of combat action; it was also an indication of his intense ambition. The surest road to speedy promotion was successful command of troops. Since this route had been denied him, he was determined to make sure that the atomic bomb project, however improbable it seemed, was a success.
As the price of his appointment, Groves insisted on being promoted to Brigadier General, on the grounds that otherwise he would not command sufficient respect from the scientists to wield effective leadership. He was also told that he would have carte blanche with regard to procurement of material resources and manpower. Indeed, all during the war, in a climate in which resources were closely rationed and metered, the Manhattan Project was given the very highest priority – above munitions, above rubber, above the landing craft program, above aviation gasoline. Whenever this priority ranking was questioned, which happened often, Groves got the full, active backing of the highest authorities, including the Secretary of War, Henry Stimson, and even President Roosevelt.
Groves’s first task after taking command of the Manhattan Project was to survey the work then in progress. He was extremely dismayed at what he found. None of the laboratories seemed to have any sense of discipline. He was particularly appalled at what he judged to be a total lack of adequate security and secrecy. He was disheartened at how little progress had been made on the myriad technical puzzles which the project had to address before the bomb itself could be constructed, and he was unsettled by the realization that no one, including Vannevar Bush or Groves’s superiors in the War Department, had a realistic estimate of the magnitude of the construction project that would be required.
Groves remained undaunted. Between November 1 and the end of December 1942, he made a series of decisive moves. He believed that there would be no time to follow the usual industrial practices of building pilot plants and semi-works for new and untried processes. He ordered that steps be taken to move immediately to build the required industrial factories. He also decided that the production of plutonium and its separation from uranium would require a separate industrial complex and began the steps which led to the construction of the reactors and separation plants on a 500,000 acre reservation centered at Hanford, Washington. When Groves informed the other laboratory directors of the project that he was going to create a special laboratory at Los Alamos to design the bomb itself and that he was going to appoint J. Robert Oppenheimer as director, they all objected. Some thought such a laboratory would not be needed. All opposed the appointment of Oppenheimer as a person lacking experience in experimental physics and in administration. The Counter Intelligence Corps officer assigned to investigate Oppenheimer’s background told Groves that, given Oppenheimer’s close connections to West Coast communist and left-wing organizations, Oppenheimer should play no role in the project. But there was something that Groves saw in Oppenheimer that others did not (See Biographical Information at end of essay). Groves brushed aside all objections, and during the next three years Oppenheimer distinguished himself with his charismatic and inspired leadership of the Los Alamos laboratory.
By December, 1942, in but three months, Groves had committed the project to expenditures in excess of half-a-billion dollars, and before the end of the war in August, 1945, instead of the original estimated $133 million, the Manhattan Project had spent over $2 billion (equivalent to $20 billion in 1990 dollars). At its peak in late 1944 to early 1945, the project employed over 160,000 people in operations that stretched from coast to coast as well as to Canada.
Groves’s initial orders were to build the factories necessary to produce fissionable uranium and plutonium and to provide other services to the scientists as the need arose. Groves’s own interpretation of those orders was to do whatever was required to build the atomic bomb in the shortest time possible and thereby end the war. This attitude is reflected in the reorganization of Los Alamos in the summer of 1944. It was discovered then that, for technical reasons, the gun assembly technique could not be used to detonate plutonium: reactor-produced plutonium was contaminated with plutonium 240, whose high rate of spontaneous fission would set off a gun-assembled weapon prematurely. The only known alternative was a technique known as implosion, in which a collapsing, spherically symmetrical shock wave, produced by the simultaneous detonation of strategically placed ordinary explosives, would bring a subcritical mass together much more rapidly. However, research on this implosion technique had been carried on at a relatively low level and had not gotten very far. Now Groves and the rest of the project leadership faced the prospect of having committed more than $500 million to the production and purification of plutonium only to find that it would not be usable. Within two weeks, Los Alamos was radically reorganized. Development of implosion techniques was given top priority, and the number of personnel working at the laboratory suddenly increased by a factor of ten. A year later, it was an implosion bomb with plutonium as the active material that was successfully tested on the desert in southern New Mexico.
The Decision to Drop the Bomb
Recall that a major motivation for going forward with the development of a fission bomb had been the fear that the Germans would beat us to such a weapon. And it had been anxiety over this possibility which had been the expressed motivation of many of the individual scientists who had joined the project. Even though Germany surrendered on May 7, 1945, and even though it had been clear for six months that there would be no German bomb, the work of the Manhattan Project did not slow down. On the contrary, under the goading influence of General Groves, who was aided and abetted by J. Robert Oppenheimer and other project leaders, the work speeded up and intensified. It is true that fighting against the Japanese in the Pacific continued, but there was no question that Japan had all but lost the war.
Just what it would take to force the Japanese to surrender was not clear. Each of the major United States military services assumed that it alone could end the war. The Army Air Corps believed that continued fire bombing, which would level all major Japanese urban centers by September, would force the surrender. The Navy believed that its blockade of the Japanese islands would soon starve the Japanese into submission. Meanwhile, the Army was completing its plans for an invasion of the Japanese mainland, scheduled for October, 1945.
It was against this background that Secretary of War Henry Stimson (see Biographical Information at end of essay) created a committee called “the Interim Committee” to advise him on the development of long-range and short-term policy in regard to fission technology. This included recommendations on whether or not to use the bomb in the war against the Japanese. Among the inputs received by the committee was a report by a so-called scientific panel chaired by J. Robert Oppenheimer. The Interim Committee recommended, with the concurrence of the scientific panel, that the atomic bomb be used on Japan without warning. In anticipation of such a decision, four Japanese cities had been chosen as potential targets and held in reserve, off-limits for the conventional bombing attacks that continued with unremitting fury.
On August 6, 1945, the first atomic bomb, a uranium 235 gun-type assembly, was dropped on Hiroshima by the Enola Gay, a B-29 piloted by Colonel Paul Tibbets. That bomb contained about 200 pounds of uranium 235, all that had been separated up to that time. There would not be enough for another such bomb before January, 1946. The bomb produced an explosion equivalent to the detonation of 13.5 thousand tons of TNT and in an instant destroyed the city, killing tens of thousands of people outright. Many thousands more were killed in the ensuing firestorm. Significant numbers of Japanese were injured and eventually died from the burst of radiation which was emitted by the bomb as it detonated.
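The relationship between the 200 pounds of uranium and the 13.5 kiloton yield can be illustrated with a rough energy balance, assuming the standard textbook values of about 200 MeV released per fission and 4.184 × 10¹² joules per kiloton of TNT (figures not given in this essay):

```latex
E \approx 13.5\ \text{kt} \times 4.184\times10^{12}\ \tfrac{\text{J}}{\text{kt}}
  \approx 5.6\times10^{13}\ \text{J}

N_{\text{fissions}} \approx
  \frac{5.6\times10^{13}\ \text{J}}
       {200\ \text{MeV} \times 1.6\times10^{-13}\ \tfrac{\text{J}}{\text{MeV}}}
  \approx 1.8\times10^{24}

m_{\text{fissioned}} \approx
  \frac{1.8\times10^{24}}{6.0\times10^{23}\ \text{mol}^{-1}}
  \times 235\ \tfrac{\text{g}}{\text{mol}}
  \approx 0.7\ \text{kg}
```

On this estimate, less than one percent of the roughly 90 kilograms (200 pounds) of uranium 235 in the bomb actually fissioned; the rest was dispersed as the assembly blew itself apart.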
On August 8, as it had promised, the Soviet Union declared war on Japan. This came as a shock to the Japanese, who had hoped to convince the Russians to intercede on their behalf in obtaining terms of surrender more reasonable than the allied insistence on “unconditional surrender.” (While there is evidence that the Japanese thought that the Soviets would eventually declare war on Japan, they believed that if it did happen, it would not occur until after September, 1945.)
On August 9, 1945, a second B-29, Bock’s Car, carrying an implosion-assembly plutonium bomb, attacked Nagasaki. This explosion was estimated to be the equivalent of more than 20 thousand tons of TNT. Nagasaki was destroyed. By this time, Hanford was producing plutonium at a rate sufficient to fabricate three implosion bombs per month, and the Japanese population was warned by American radio propaganda broadcasts and by leaflet that unless they surrendered, they could expect the atomic bombings to continue without end. Finally, on August 14, the Japanese agreed to the allied terms for unconditional surrender.
Of course there is no way of knowing if the Japanese might have surrendered without the need of an invasion if the atomic bomb had not been used. However, the United States had long since broken Japanese codes. At the highest levels of the American government, it was well known during the spring and early summer of 1945 that a significant segment of the Japanese leadership was desperate to find a formula which would make surrender acceptable.
Since the end of the war there have been two major historiographical streams with regard to the use of the bomb. It has been argued by some that the bomb was used to shorten the war and that it resulted in a net saving of lives. This group can point to such evidence as the fact that the Army was indeed actively planning a full-scale October invasion of the Japanese mainland and that the transfer of significant numbers of troops from Europe to the Pacific had been ordered; there can be no question that the use of the bomb produced a sudden and dramatic end to the war.
A second group has argued that the bomb was used as the opening salvo in the cold war – that the use of the bomb was aimed at putting the Soviets on notice with regard to post-war adventurism. Evidence adduced for this position includes, among other things, the fact that one of Truman’s chief advisors, soon-to-be Secretary of State James Byrnes, desperately wanted to bring the war to an end before the Russians entered, and the fact that even before the war ended, the Joint Chiefs of Staff and civilian members of the Administration besides Byrnes were urging an aggressive posture toward the Soviets.
Proponents of these theories have treated them as mutually exclusive, but of course, there is no reason why a person who thought the bomb could be used to intimidate the Soviets might not also have believed that its use would result in a net saving of lives.
There are at least three other motivations for using the bomb on Japan before the end of the war which have been largely overlooked but which almost certainly played a role in the decision to use the bomb. The personal motivations of some of those involved in the decision making process and the momentum which the project had gathered between 1942 and 1945 are two of those factors. The third was the result of internal domestic politics. General Groves and some of the civilian leadership of the project often expressed the belief that if the war ended and the bomb had not been used, they would be confronted by a Congress likely to be incensed that the Manhattan Project had been funded and run without the explicit approval of the Congress. In fact, every time members of the Congress had tried to find out the purpose of the mysterious large expenditures which were being made by the Manhattan Engineering District, they had been stonewalled by Groves and by Secretary of War Stimson. Investigations and hearings by congressional oversight committees had been obstructed, leaving committee chairs frustrated and angry. It was widely assumed by the leadership of the Manhattan Project that the only way to avoid a very nasty post-war investigation was to make sure that the bomb ended the war.
The issues surrounding the decision to drop the bomb are complex. This brief summary can only suggest that complexity. It is an argument which can never be decided simply by enumeration of evidence. The questions concerning the decision to use the bomb frame a classic historical puzzle which energizes historians and which will forever fascinate a curious and caring public, but at the same time, the debate itself continues to illuminate issues of public policy.
High Energy Physics
At the end of the war, the United States public finally was able to get an overview of the degree to which technical advancements had contributed to the winning of the war. Of all of those achievements, it was the Manhattan Project and the atomic bomb which dominated the presentations in the media and which caught the imagination of the public as the archetype of the role that science had played in the war. The accomplishments of the Manhattan Project reinforced the view of the essential role that science plays as a precursor to technological innovation.
It was not surprising, then, that one of the consequences of the war was a continuation of the war-time policy of government support of scientific research in university-based laboratories. But everyone recognized that it would not be possible to continue such work under the conditions of secrecy which had existed during the war. Most of the scientists who had worked on the atomic bomb were anxious to return to the work that they had been doing prior to the war – investigation of fundamental questions of how the universe worked – in an environment of openness and collegiality. They wanted no part of working conditions which would require enforced secrecy. What emerged was a two-tier system in which secret, so-called “classified” research, and open, unclassified investigations were both supported.
One of the immediate beneficiaries of the infusion of federal money in support of scientific research was a field which became known as high energy physics, an outgrowth of prewar research in nuclear physics. Prior to the war the cyclotron and other types of so-called atom smashers had been among the more important tools used to investigate the nature of the atomic nucleus. Access to such machines had been limited. They were very costly machines, and only a few laboratories had been able to find the funds to construct them. Such research had been suspended during the war. Most of the people who had been engaged in this kind of research had worked on the fission bomb project. The very success of the project had stimulated a host of questions about the nature of the atom for which new atom smashers of higher energy than had thus far been available were needed.
Both government and university officials agreed that support of this kind of research should include money for building the machines, as well as support of the apprenticeship system for training undergraduate and graduate students. It was a way of ensuring a cadre of trained physicists and of supporting research which, it was believed, would play an increasingly important role in the defense of the country. The tradeoff brought funding on a scale which had hitherto been generally unavailable for research in nuclear physics. Big science was coming of age. The symbiotic relationship which was thus created resulted in the evolution of ever more sophisticated machines of higher and higher energy in pursuit of the study of the fundamental particles believed to make up the interior of atoms.
The support of high energy physics after the war became the model for the contract between university science and government which has been maintained ever since. Still, by the early 1970s support for high energy physics was waning. Many of the atom smashers that had been built in universities around the country were being shut down for lack of continued financial support and because they had become obsolete. The research questions now being asked required machines of hitherto unprecedented size and cost. The solution was to site one or two such machines at strategic locations in the country and make them available through a competitive evaluative process to qualified teams of scientists.
Such a compromise not only reflected the ever increasing cost of pursuing research in high energy physics, it was also a function of the competition for funds from other areas of science. For example, in recent years, there has been a call for large sums of money to support such programs as the Human Genome Program, or research to combat the threat represented by the spread of the AIDS virus.
Thus far, we have been unwilling to tamper with the fundamental assumption which emerged out of World War II that support of scientific and technological research is an important function of the Federal Government. Just how long that will be the case remains to be seen.
Vannevar Bush, who was born and raised in the Boston area, was trained as an engineer at Tufts University, from which he received his bachelor’s and master’s degrees in 1913. He taught mathematics and electrical engineering at Tufts through 1917 while completing a doctorate in engineering at Harvard and MIT. He then spent a short period of time as a consulting engineer in the private sector, but in 1919, he returned to MIT as a professor of electrical engineering. Bush’s most significant contribution to electrical engineering was the development of the differential analyzer, an analogue computer of considerable power which presaged the electronic computer.
But Bush’s real forte was in the management and administration of scientific and engineering organizations. In 1932, he became vice president and dean of engineering at MIT, a position which he occupied until 1938 when he resigned to assume the presidency of the prestigious Carnegie Institution of Washington, one of several important privately funded American centers of scientific research. At the same time he was appointed a member of the National Advisory Committee for Aeronautics (NACA). A year later he assumed the chairmanship of the NACA. The NACA had been created by Congress early in the century to oversee and coordinate scientific study of the problems of flight. Now, in 1939, in recognition of the approach of what would more and more be termed “the upcoming conflict,” President Franklin Delano Roosevelt ordered that, in the event of a national emergency, the NACA was to assume the role of consultant and research agency to the Joint Army and Navy Aeronautical Board.
During World War I, Bush had worked on the problem of submarine detection. He later reported how dismayed he had been at the administrative chaos and lack of coordination among groups charged with overseeing the effective prosecution of such research. For Bush the lesson to be learned from this experience was the necessity, in such a situation, of having access to the seats of power combined with a workable system for the delegation of authority. These were the only ways to effectively channel the contributions of talented scientific and engineering personnel in a crisis such as the one faced in World War I. As Bush later put it, “I knew that you couldn’t get anything done in that damn town [Washington] unless you organized under the wing of the President.”
By the spring of 1940, Vannevar Bush was well versed in the ins and outs of Washington politics. Together with some of the country’s leading administrators of academic and scientific institutions – Karl T. Compton (physicist and president of MIT), James B. Conant (chemist and president of Harvard University), Richard C. Tolman (physicist and dean of the Graduate School at the California Institute of Technology), and Frank Jewett (retired president of Bell Telephone Laboratories and president of the National Academy of Sciences [NAS]) – and two Roosevelt advisers, Arthur C. Cox and Harry Hopkins, Bush conceived of the National Defense Research Committee (NDRC), an organization for coordinating the efforts of government, the private sector, and universities with regard to the contributions that science could make to the development of new weapons. Harry Hopkins arranged for a personal meeting between Roosevelt and Bush at which the president approved the plan and ordered it funded out of emergency funds available to the President. Roosevelt appointed Bush to chair the committee.
While the role played by the NDRC in coordinating research and development in new weaponry, techniques of intelligence, psychological warfare, and the like was novel, the most distinctive aspect of the enterprise was the utilization of the contract system. Heretofore, most research and development on military-related problems had taken place within laboratories under the jurisdiction of the Departments of War or Navy. Bush introduced a contract system which exploited the facilities and expertise available in the private and university sectors and which delegated the responsibility for technical decisions to those organizations while NDRC maintained tight control over the administration and coordination of the overall effort. It was a scheme which was resisted by the research arm of the Navy, which prided itself on its history of accomplishments within its own facilities.
A year later, in June, 1941, at Bush’s behest, Roosevelt created a new organization having expanded powers and scope – the Office of Scientific Research and Development (OSRD) – which superseded and incorporated NDRC. Besides NDRC, OSRD included an Office of Field Services, a Committee on Medical Research, and several other adjunct organizations. Roosevelt appointed Bush to head OSRD. James B. Conant, who had served as Bush’s second within NDRC, was named NDRC chairman.
Leslie R. Groves was born in 1896, the son of an Army chaplain. He spent his youth at various Army posts, where his contact with Army officers influenced his determination to attend West Point. This was contrary to his parents’ advice, but his resolve on this matter was stiffened in 1913, his last year of high school. The family was living at Fort Lawton (Seattle), Washington. Even while finishing high school he undertook studies at the University of Washington. His academic record to this point was undistinguished. He moved to Boston in 1914 to attend MIT as a way of preparing himself for West Point. It had been his intention to complete an engineering course at MIT in three years, but the administration was not willing to fully credit the work he had done at the University of Washington, and after a little more than a year of pedestrian performance, he left MIT, having received a presidential appointment to West Point.
He entered in the summer of 1916. Because of American entry into World War I, the program at West Point was accelerated and Groves’s class was graduated in November 1918, a year-and-a-half ahead of schedule. By this time the war was over and there was little prospect for rapid advancement within the Army, but Groves had long since set his cap for a career in the Army Corps of Engineers. Between 1918 and 1931, Groves rose only to the rank of first lieutenant. None of his contemporaries fared better. In 1931 he was assigned as head of the Supply Section of the Chief of Engineers office in Washington. During this period his superiors began to take notice of his organizational and administrative skills. After he attended the Command and General Staff School at Leavenworth, Kansas in 1935-36, and the Army War College in 1939, he was assigned to the War Department General Staff. In July, 1940, Groves was appointed as special assistant on construction matters to the Quartermaster General and was promoted to major. During this period, when the United States had undertaken a massive rearmament program, the Quartermasters Corps had responsibility for all Army construction in the United States. Four months later, in November of 1940, Groves was promoted over thousands of his seniors to full colonel and was appointed Deputy Chief of Construction of the Office of the Quartermaster General. In December of 1940 he became Chief of Operations, Construction Division, Quartermaster General’s Office, a position he held until December of 1941, when all army construction operations within the United States were turned over to the Army Corps of Engineers. At that point, Groves was appointed Deputy Chief of Construction of the Corps. Thus, Groves now directed all Army construction within the United States, including all camps, ordnance plants, ports, and depots, as well as the Pentagon building, to which he personally gave a great deal of attention.
By the middle of 1941, Groves was overseeing the expenditure of $600 million a month. His ability to keep on top of the details of this enormous undertaking was matched by his skill in appointing to positions of responsibility officers whose judgement he could trust. Even though he was often brutish and tactless, Groves understood people and could quickly sort out those who had talents he required from those who would be a hindrance to his quickly accomplishing his goals. He showed great acumen in identifying civilian contractors who could produce as promised, on time and without waste. His rise through the ranks of Engineering Officers between 1936 and 1942 had been meteoric.
J. Robert Oppenheimer was born in 1904 in New York City. His parents, who were very well off, were connoisseurs and collectors of art, and in his youth Oppenheimer was encouraged to follow intellectual pursuits. He showed early enthusiasm and talent in mathematics and science, but at the same time he pursued his interests in language, philosophy, and the arts. He attended Harvard University, where he excelled in mathematical physics. One of his mentors, Professor Percy Bridgman, arranged for him to study with Max Born at Göttingen, Germany, and during the next two years Oppenheimer interacted with the world’s leading young theoretical physicists and performed brilliantly. He received his PhD at the age of twenty-three from Göttingen, and it was generally accepted that he would quickly take his place among the greats of theoretical physics.
In 1929, Oppenheimer returned to the United States with a then unique joint appointment at the University of California at Berkeley and the California Institute of Technology. Oppenheimer contributed a steady stream of significant papers on quantum mechanics, cosmology, and atomic theory, but his research career did not reflect the brilliance that his teachers and colleagues had anticipated. Meanwhile he distinguished himself as an exceptional teacher of theoretical physics and attracted a steady stream of bright students who were alternately charmed and inspired by his charismatic style and cowed and intimidated by his fearsome temper and a streak of remorseless sarcasm which he freely delivered when he was displeased with a student’s work.
Oppenheimer’s work in physics did not prevent his continued study of both Eastern and Western philosophy, or his growing interest in issues of social justice. He was just as likely to be found reading Sanskrit or a tract by Vladimir Lenin as he was to be studying a work in theoretical or experimental physics. His range of contacts included associates and friends from within the academic community as well as socially concerned political activists. His commitment to social causes was reflected in his membership in a number of organizations which were considered left or even communist.
Henry L. Stimson was appointed by Franklin Delano Roosevelt as his Secretary of War in 1940 at the age of seventy-three. It was the third time that Stimson, a life-long Republican, had held a cabinet-rank position and the second time he had been Secretary of War. Stimson graduated from Yale in 1888 and pursued a degree in law from Harvard, which he obtained in 1890. He was in private practice in New York City until 1906, when he served for three years as U.S. Attorney for the Southern District of New York. In 1910 he made an unsuccessful run for the office of governor. In 1911 he was appointed Secretary of War, the civilian head of the Department of the Army, by President Taft. In 1913, with the end of the Taft administration, Stimson returned to private life, but remained active in public service. At the beginning of World War I he was a prominent member of the so-called preparedness movement, and when America entered the war he served as a colonel in the artillery. Stimson returned to public office in May, 1927, as President Coolidge's personal representative to settle the Nicaraguan insurrection. His success in bringing about a peaceful solution to this crisis led to his being appointed Governor General of the Philippine Islands. President Herbert Hoover appointed him Secretary of State in 1929, and he occupied that position until Roosevelt and the Democrats came to power in 1933. His most significant contribution as Secretary of State was the elaboration of what became known as "the Stimson Doctrine" – American nonrecognition of territories and agreements achieved by aggression, a doctrine that was aimed at Japanese aggression against China. The doctrine was adopted by the League of Nations, and Japan acquiesced to this expression of world opinion by withdrawing from Shanghai. Stimson returned to private practice after the 1932 election, but retained an avid interest in international affairs.
He described himself later as a person who had, for thirty years, “championed international law and morality,” and had, all his life, “argued war itself must be restrained within the bounds of humanity.”
In the face of a certain global conflict, Roosevelt asked Stimson to once again accept the position of Secretary of War as a way of ensuring bipartisan support for the administration's steps to prepare the nation. The appointment was made on June 20, 1940, five days after Roosevelt had approved Bush's plan for the NDRC. Stimson's acceptance was no doubt motivated by his deeply felt commitment to the notion of "civilized war," but it also reflected his commitment to duty as a public servant. Still, the Republicans read him out of the party for what they interpreted as disloyalty.
From the beginning Stimson was committed to the marriage of the military with modern science. He made it clear to members of the War Department and to Vannevar Bush and his associates that it was his intention that the Army exploit science to the fullest. Those in the military who resisted this idea were severely chastised and even relieved of their responsibilities. He encouraged and supported all steps which ensured the effective application of scientific expertise to problems of weaponry and warfare.
Stimson was a master at delegating authority, and though this meant some loss of control with regard to his principles of "civilized warfare" (he was appalled at the indiscriminate bombing of cities), for the most part he kept a firm grip on the administrative controls of the Army. With regard to the running of the department, and in particular to keeping on top of the management of the Manhattan Project, Stimson was blessed in those with whom he had surrounded himself, especially Under Secretary of War Robert B. Patterson, Special Assistants Harvey H. Bundy and John J. McCloy, and Special Consultant George L. Harrison.
Social and Cultural Context
The social and cultural context of Big Science and of the Manhattan Project has been intricately woven into the thematic development and biographical descriptions above. The major points bear repeating.
Science and technology are not activities which develop in a vacuum, divorced from the social and cultural orientations of a civilization or a nation-state. On the contrary, the organization and values of the scientific and technological communities of a culture reflect very much the same overlapping web of local, regional, and national values that infuse all other elements of the society. The interconnections between technological innovation and the evolution of our culture are easy to demonstrate. One need only point to such devices as the internal combustion engine, the telephone, the fax machine, or the computer. An issue which deserves much more study is the leveling effect of such technologies on values and behavior in cultures which initially have little in common.
Science does not exert such a leveling influence. The situation is not unlike that in other abstract disciplines: music, for example. While there may be universal agreement with regard to the notes in a musical score, there is never agreement on how that score is to be interpreted.
There are also significant differences between cultures with regard to the place that science occupies in the social fabric of the culture. In the United States prior to World War II, it was widely accepted that there should be no general public support for science or the arts any more than there should be public support for the activities of private enterprise. This was not the case in most European cultures.
Accepted practice in the United States in this regard changed drastically during and immediately after World War II. The achievements of the Manhattan Project and the success of other government-supported war-related activities had a profound effect on United States culture, not just with regard to the involvement of government in the support of science, but of the arts as well.
Now, near the close of the twentieth century, as World War II and the Cold War fade into the past, a fascinating proposition to consider is that the effect of both was a large perturbation in what might be termed "traditional" American values with regard to science, education, and the arts. Perhaps, over the long run, we are in the process of returning to an arrangement which closely approximates the situation prior to World War II.
1. Alvin Weinberg, Reflections on Big Science (Cambridge/London: The MIT Press, 1967), p. 39.