Laboratories


LABORATORIES exist at the nexus of commerce, academic disciplines, and the State in early twenty-first-century America. Laboratories have been, and remain, a source of American military and economic power, and their products are indispensable elements of everyday life. How these organizations, ostensibly dedicated to the production of new knowledge and technologies, came to occupy such a central place in the landscape of American history is a story inseparable from the history of American science and technology and from the growth of the American state during the twentieth century.

The Nineteenth Century

Although laboratories existed to test materials for railroads and small chemical concerns, as well as in some of the nation's colleges and universities, such as Harvard's Lawrence Scientific School and Yale's Sheffield Scientific School, laboratories that we might recognize as such date from two distinct events—the 1862 Morrill Land Grant Act and the establishment of the Johns Hopkins University in 1876. The Land Grant Act provided each state in the Union with funds to establish an institution of higher learning with an emphasis on practical knowledge. Hopkins followed the German model of higher education, with its reverence for research, albeit with substantial local modification. Uniting these two temporally distinct events was a common recognition that laboratories are sites where the producers and consumers of technical knowledge bargain over a host of meanings.

Prior to World War II (1939–1945), the federal government supported research that might aid the development of the nation's natural resources. Hence, agriculture was a major beneficiary of Department of Agriculture funding through the experiment station at each state's land-grant institution. Successful researchers enrolled local farmers to support research and teaching that might aid the local agricultural economy. Distinctive and important results emerged from these local constellations, ranging from the famous Babcock butterfat test to the development of hybrid corn. Balancing local needs with their own agendas, land-grant researchers enacted the American laboratory's dilemma—those charged with the production of knowledge were often the least powerful actors in any given locale.

The founding of the Johns Hopkins University is central to understanding the laboratory's history in America. Until the 1980s, historians viewed the establishment of this new institution in Baltimore as simply an attempt to bring the German model of research to American soil and break with the traditional American college, with its emphasis on the production of morally solid citizens. Under Daniel Coit Gilman's leadership, the new university hired professors trained in European universities, including Henry Rowland (physics), Ira Remsen (chemistry), Henry Newell Martin (biology), and J. J. Sylvester (mathematics). However, far from abandoning the college's traditional function, the new institution's laboratories and seminar rooms became new sites for the production of both knowledge and upstanding citizens, the majority of whom became college teachers. Hopkins valued research, but research was inseparable from teaching. As Gilman once explained, "in the hunt for truth we are first men and then hunters"; the antebellum college's moral economy moved to the new university. So great was the connection between research and teaching that Remsen expressed dismay when Gilman left Hopkins to become the first president of the Carnegie Institution of Washington (CIW), a private research institution.

Research and No Teaching

Separating research from teaching was among the great social accomplishments of the twentieth century. Private philanthropy and the emergence of the corporate laboratory were crucial in achieving this division. Around 1900, General Electric (GE) and AT&T established the first industrial research laboratories in America. Rather than produce students and theses, these laboratories produced new technologies and patents, the new corporate currency. For example, after many failures, Willis Whitney's group at GE invented the ductile tungsten filament for light bulbs, creating an array of patents that made GE untouchable in this growing market. At AT&T, researchers patented various important aspects of radio so as to maintain the Bell System's monopoly on long-distance communication. Far from being a university in exile, the corporate laboratory invented the firm's future and protected its investments. Industrial research was always basic to corporate needs, but that did not mean such work was mundane or less intellectually sophisticated than university-based research. GE's Irving Langmuir won his 1932 Nobel Prize in Chemistry for his explanation of a basic GE problem: why did light bulbs darken over time?

The establishment of the Rockefeller Institute for Medical Research (now Rockefeller University) and of the CIW was also salient in separating research from teaching. Both were the products of the massive fortunes earned by the nineteenth century's great robber barons, but each had different ends. Rockefeller's Institute, founded in 1901, had as its mission the understanding and treatment of disease and the separation of biomedical research from the education of physicians. Sinclair Lewis's Arrowsmith offers a fine depiction of Institute life. The CIW, founded in 1902 with $10 million in U.S. Steel bonds, sought to find the "exceptional man" and free him from the distractions of everyday life with financial support. Finding the exceptional man proved difficult, and the CIW settled for the creation of an array of departments under recognized leaders in the natural and social sciences as well as the humanities. Only the natural science departments survived into the twenty-first century. Cleaving research from teaching allowed the laboratory to become portable and capable of existing in a variety of contexts.

War and the State

The two world wars facilitated the growth of U.S. laboratories in ways that had been heretofore unthinkable. World War I (1914–1918) provided American science with a new institution, the National Research Council (NRC) of the National Academy of Sciences, which administered the Rockefeller Foundation's massive postdoctoral fellowship program, providing American researchers with funds to study at elite institutions in the United States and Europe. These young researchers returned to take up faculty positions, establish laboratories, and effectively end America's reliance on Europe as a source of advanced training in the sciences. The 1920s also featured what one observer called a "fever of commercialized science," as laboratories spread throughout American industry. Although the Great Depression slowed the spread of industrial laboratories, the crisis also acted as a selection mechanism, allowing only those laboratories with independent sources of revenue or outstanding research to survive.

World War II and the massive mobilization of American science led by CIW President Vannevar Bush effectively made the nation's laboratories at one with the nation's security and prosperity. With the federal government's support, the Manhattan Project, the American atomic bomb project, created a whole set of laboratories—including Los Alamos, Oak Ridge, and the Metallurgical Laboratory. Equally important were the laboratories established to develop radar (the MIT Radiation Laboratory), the proximity fuze (the Johns Hopkins University Applied Physics Laboratory), and guided missiles (Caltech's Jet Propulsion Laboratory). Government patronage, and more specifically military patronage, heretofore unacceptable to the nation's scientific elite, propelled the laboratory into its central role in American life. Contrary to what many originally believed, American researchers found military problems a rich source of intellectually and technologically important questions. Even more importantly, there was someone eager to pay for answers—the armed services. Bush's famous 1945 report, Science—The Endless Frontier, together with the visible demonstration of scientific power at Hiroshima and Nagasaki, made the nation's laboratories and their products essential for America's coming struggle with the Soviet Union as well as for the country's future economic growth.

During the Cold War, military patronage supported the largest expansion of the nation's research capabilities in both university and corporate laboratories. Four basic projects dominated the nation's laboratories: the development of the ballistic missile; the various attempts to design and build adequate continental defense systems; the introduction of quantitative methods into the social sciences; and the development of new technologies of surveillance and interpretation for the purposes of intelligence gathering. One basic technology emerging from this quartet was the networked digital computer, a tool now indispensable in so many contexts, including the modern research laboratory. In the biomedical disciplines, the National Institutes of Health (NIH) supported a similar and equally expensive expansion, whose most visible endeavor was the Human Genome Project.

In 1990, one-sixth of the nation's scientists and engineers were employed in more than 700 federally funded laboratories, including sixty-five Department of Defense and Department of Energy institutions, with annual budgets ranging from $15 billion to $21 billion, depending on who and what is counted. Even with the Cold War's end and the lessening of federal funds, the nation's laboratories flourished as government and industry rushed to continue the vital business of innovation.

The Present and the Future

As of 2000, industry outspent the federal government as the laboratory's greatest patron, but much of that spending went to the laborious and difficult process of developing ideas into viable commercial products. University laboratories still account for the majority of basic research done in the United States. Although the events of 11 September 2001 will undoubtedly affect federal funding of research and lead to an increase in military research, the major areas in which laboratories will play a role will remain roughly as they were in 2000: biotechnology, including the massive private investment by the pharmaceutical industry as well as the ongoing attempts to harvest the work of the Human Genome Project; nanotechnology, the attempt to develop sophisticated miniature technologies to act in a variety of contexts, including the human body and the battlefield; and information technology, as researchers attempt to make computers ubiquitous, easy to use, and capable of mining the vast data archives created by government and industry. In the first and last of these domains, corporate laboratories will play vital roles as individual firms attempt to bring new therapies and new technologies to market. Nanotechnology will remain a ward of the state as researchers attempt to develop means of manipulating their newfound Lilliputian world effectively. If those efforts succeed, corporations will adopt the research just as they adopted the biotechnology research originally done in NIH-funded laboratories. The twenty-first century, like the twentieth, will be the laboratory's century.

BIBLIOGRAPHY

Dennis, Michael Aaron. "Accounting for Research: New Histories of Corporate Laboratories and the Social History of American Science." Social Studies of Science 17 (1987): 479–518.

Geiger, Roger L. Research and Relevant Knowledge: American Research Universities Since World War II. New York: Oxford University Press, 1993.

———. To Advance Knowledge: The Growth of American Research Universities, 1900–1940. New York: Oxford University Press, 1986.

Gusterson, Hugh. Nuclear Rites: A Weapons Laboratory at the End of the Cold War. Berkeley: University of California Press, 1996.

James, Frank A. J. L., ed. The Development of the Laboratory: Essays on the Place of Experiment in Industrial Civilization. Basingstoke, Hampshire: Macmillan Press Scientific and Medical, 1989.

Kevles, Daniel J. The Physicists: The History of a Scientific Community in Modern America. New York: Vintage Books, 1979.

Kohler, Robert E. Partners in Science: Foundations and Natural Scientists, 1900–1945. Chicago: University of Chicago Press, 1991.

Koppes, Clayton R. JPL and the American Space Program: A History of the Jet Propulsion Laboratory. New Haven, Conn.: Yale University Press, 1982.

Leslie, Stuart W. The Cold War and American Science: The Military-Industrial-Academic Complex at MIT and Stanford. New York: Columbia University Press, 1993.

Reich, Leonard S. The Making of American Industrial Research: Science and Business at GE and Bell, 1876–1926. New York: Cambridge University Press, 1985.

Rhodes, Richard. The Making of the Atomic Bomb. New York: Simon and Schuster, 1986.

Rosenberg, Charles E. No Other Gods: On Science and American Social Thought. Revised and expanded edition. Baltimore: Johns Hopkins University Press, 1997 [1976].

Michael Aaron Dennis
