Epidemics and Public Health
In its broadest sense, public health refers to organized efforts to protect and promote the physical and mental well-being of a community. These wide-ranging efforts can include such activities as combating epidemic outbreaks of disease; controlling chronic diseases; educating the public about disease prevention and health promotion; screening and vaccinating individuals; collecting and maintaining statistics about the incidence of diseases, births, deaths, and other vital events; guarding the food and water supplies through inspections and quality standards; and enacting laws and programs to preserve health. A critical component of public health work is epidemiology, the study of the incidence, distribution, and causes of health-related events in a given population. The scope, functions, and expectations of American public health have never been fixed; rather, they have changed significantly over the course of American history, reflecting advances in science and medicine, developments in society and politics, and trends in disease incidence.
Environmental Sources of Disease
The earliest American efforts to protect public health were sporadic, organized in response to outbreaks of widespread disease, and rooted in an understanding of environment and climate as the leading determinants of health and illness. Miasmatism, a prevailing medical explanation for illness, attributed diseases to miasma, foul odors or effluvia originating from the soil or decomposing organic matter. These "bad airs" could contribute to an imbalance of one or more of the body's four humors, which were believed to govern physical health. Medical ideas that emphasized the role of an individual's environment influenced the measures deployed against epidemics of disease. Filth, foul smells, and squalor, along with warm and wet climates, were implicated as the features of disease-prone areas. Outbreaks of fevers, malaria, cholera, yellow fever, and smallpox, which typically struck during the warm, wet spring and summer months and disproportionately affected those living in crowded, impoverished, and unsanitary conditions, reinforced medical beliefs about environmental influences on disease causation. Before the Civil War, measures against disease were directed by local health boards that were often hastily arranged and almost always temporary, in large part because, until well into the nineteenth century, public health was seen as a responsibility of private citizens rather than the elected government. Public health measures involved sanitation: regulating privies, cleaning streets, spreading lime, and removing garbage, dead animals, and stagnant pools of water. These functions were particularly critical because growing towns and cities still lacked sewer systems, water supply systems, and other infrastructure. Measures also controlled sources of environmental pollution such as refuse from butchers, slaughterhouses, tanneries, fishmongers, bone boilers, starch makers, clothes dyers, and other nuisance industries. Other ordinances regulated the quality of the food supply and governed the sale of foodstuffs during epidemics; during cholera outbreaks, for example, the sale and consumption of fruits, vegetables, meats, and fish were forbidden.
Although sanitary measures were the foundation of early public health efforts, public health also extended beyond cleaning the environment and regulating sources of filth. Following the experience of Europe, American port cities instituted quarantine practices beginning early in the 1700s with Charleston, South Carolina, and Boston, Massachusetts, leading the way. Arriving passengers and cargo were inspected for disease, and those passengers determined to be sick or a health threat to the community were isolated and usually returned to the country from which they arrived. As the nation expanded, towns and cities on inland waterways also followed quarantine measures, particularly during times of disease epidemics. The need for quarantine and isolation was driven by the belief that diseases were transmitted between persons by direct contact. In addition to quarantine stations used to inspect and isolate arriving passengers, many cities beginning in the early eighteenth century established pesthouses to segregate individuals whose health could threaten the health of the wider community.
Epidemics and Public Health in the Antebellum Period
Americans in the antebellum period were afflicted by two general categories of diseases. The first included such endemic ailments as fevers, malaria, respiratory afflictions, measles, mumps, dysenteries, diarrheas, and other gastrointestinal complaints, which people grudgingly accepted as among the unavoidable rigors of their daily lives. The second category included the much-feared epidemic diseases of smallpox, diphtheria, yellow fever, and cholera, which together inflicted their most punishing blows on the American population before the Civil War. Outbreaks of smallpox and yellow fever during the eighteenth century led to the passage of quarantine regulations by many cities.
Smallpox. While epidemics of yellow fever, cholera, and diphtheria were devastating but infrequent events in antebellum America, smallpox occurred much more regularly during the eighteenth century, and its harm could have been greater had it not been the first disease for which an effective prevention was devised (in the 1720s). During a smallpox outbreak in Boston in 1721, the Reverend Cotton Mather and the physician Zabdiel Boylston, following the recent practice of European physicians, inoculated their families and several hundred Bostonians, resulting in far fewer deaths among those inoculated than among those who contracted the disease naturally during the epidemic. Inoculation, which induced a mild case of smallpox so that the person could acquire immunity against the disease, was controversial on religious and medical grounds and resisted as dangerous because, when improperly performed, it could instigate a smallpox epidemic. Over the next half century the practice grew slowly, but by the end of the eighteenth century it was regularly practiced and accepted throughout the American colonies. Inoculation was supplanted by the much safer practice of vaccination, developed by Edward Jenner in 1798, which used cowpox vaccinia rather than active smallpox. The safety and growing acceptance of vaccination were reflected in 1813, when Congress supported a limited program of smallpox vaccination. Although state and municipal governments were generally apathetic to community health concerns in this early period, the banning of inoculation and the promotion of vaccination against smallpox were among the first state interventions into public health matters. The popularity of inoculation and the opposition to vaccination nevertheless remained strong during the first half of the nineteenth century, and by 1850 most states had passed laws forbidding inoculation.
Yellow Fever. In contrast to the regular incidence of smallpox, yellow fever appeared violently in the 1690s and then waned, occurring only occasionally during the first half of the eighteenth century. After an outbreak in Philadelphia in 1762, the disease did not appear again for three decades, until, in 1793, it returned with overwhelming virulence and mortality, annually wreaking fear and havoc in most American port cities until 1806. The social and medical responses to the outbreak of yellow fever in Philadelphia in August 1793 remain representative of the challenges public health efforts faced in the period. In the Philadelphia epidemic, the disease was not immediately recognized as yellow fever, in part because an outbreak had not occurred in many decades and in part because physicians disagreed about its origins and treatment. As the number of deaths rapidly climbed, nearly 20,000 of the city's 30,000 residents fled, including most federal, state, and municipal officials and civic leaders. A lack of funds, manpower, and civic leadership to enforce quarantine regulations allowed the epidemic to rage unabated, with hundreds of deaths weekly. The city's port officers and physicians did not have the funds and resources to inspect arriving ships and to isolate infected sailors and passengers, many arriving from Caribbean ports where yellow fever was epidemic. With many government officials having abandoned the city, Matthew Clarkson, Philadelphia's mayor, had to patch together emergency funds and volunteers to care for the sick and to enforce the quarantine laws. A volunteer committee of private citizens was quickly assembled by Clarkson to oversee the relief efforts, including soliciting food, money, and needed supplies from other towns. When the epidemic ended in November 1793, the death toll stood at more than 5,000, nearly one of every six residents of Philadelphia. In four months the disease had wiped out entire families, wreaking economic and social havoc on a scale never before seen in the new nation. (For comparison, 4,435 Americans died in battle over the eight years of the Revolutionary War.) Yellow fever returned to port cities along the Atlantic and Gulf coasts over the next decade, including major outbreaks in New York City in 1795 and 1798, when nearly 3,000 residents died. The responses to epidemic yellow fever did not change appreciably over the decade, as towns everywhere strengthened and enforced their quarantine regulations in hopes of avoiding local outbreaks of the disease. For much of the nineteenth century, yellow fever continued to appear in southern coastal towns but never with the severity of the earlier epidemics. The yellow fever epidemics underscored the need for organized public efforts to combat epidemic diseases, although many of the measures instituted during the epidemics, particularly local health boards, were allowed to lapse after the epidemics abated.
Cholera. In the 1820s, smallpox and yellow fever continued to afflict the American population, albeit at less destructive levels than had been experienced before. But in the early 1830s they were joined by a third epidemic disease, cholera. After appearing in South Asia in the late 1820s, cholera spread west into Russia and eastern Europe by 1830, and despite quarantine efforts by western European countries, the pandemic spread rapidly and widely throughout Europe, appearing in England in 1831. North American newspapers carried stories of the devastation Asiatic cholera was causing in Europe, and Americans grew to fear that cholera would soon arrive in their land. As they had with yellow fever, American cities resorted to the strict quarantine and inspection of arriving ships, passengers, and goods, but with no success. By June 1832, the first cases of cholera were reported in Quebec, Montreal, and New York. Political and civic leaders argued that cholera would not affect American cities and their people in the way it had in Europe. The first medical reports of cholera in New York City were dismissed by political and business leaders as unnecessarily inciting public panic and disturbing the city's economy and society. The New York City Board of Health nevertheless took limited actions after the first reported cases, establishing cholera hospitals to care for victims and requiring residents to remove rubbish, filth, and standing water and generally to clean the city. Cholera disproportionately affected those living in impoverished, crowded, squalid conditions without proper access to clean water. Health officials promoted the widespread belief that poverty and disease were the result of immoral behaviors. In doing so, they suggested that those who contracted cholera had brought the disease upon themselves through their vices. Such a position reassured the upper classes that their social status, economic wealth, and religious beliefs made them immune from contracting the disease. Despite the city's efforts to fend off the epidemic, nearly 3,000 New Yorkers died over six weeks in June and July 1832.
Cholera continued to spread throughout the United States during 1832 and 1833. Some cities, including Boston and Charleston, were spared, but towns and cities of all sizes throughout the United States were visited by cholera during the summer of 1832. Government officials and religious leaders called for days of fasting and prayer in hopes of divine intervention against the epidemic. As in the epidemics of yellow fever, the outbreak of cholera prompted many residents, including government officers and civic leaders, to flee the towns and their duties. In the absence of permanent boards of health, the remaining population established temporary boards of health to oversee sanitation efforts, to create hospitals, dispensaries, and other medical services, to enforce quarantine and sanitation laws, and to provide social services such as burials and the care of orphans and widows. By October 1832, the epidemic reached New Orleans, killing 5,000 residents (nearly 10 percent of the city's total population) and causing many thousands more to flee the city after the first reported cases. Mortality among those who remained in the city was very high. Occasional outbreaks of cholera continued through 1835, and then the disease disappeared, leading some Americans to believe that it would not return. However, a second major wave of cholera struck the United States between 1849 and 1854, by which time most cities had disbanded their boards of health and had returned to the sanitary practices that had existed before the first wave of 1832–1833. As in the earlier epidemics, Americans saw cholera as a disease that affected classes of people who were poor, immoral, irreligious, criminal, intemperate, or lazy. During the second epidemic wave of cholera, this belief about disease susceptibility was joined with nativist, anti-immigrant attitudes, perpetuating the idea that immigrants were disease-laden and brought the diseases with them. Irish immigrants in particular, arriving in the United States after the Irish famine of 1845–1850, were treated as scapegoats for the epidemic. Immigrants were affected by cholera in overwhelming numbers, not because they were biologically susceptible or because they harbored the disease, but because they lived in the overcrowded, squalid conditions of America's growing cities, where epidemic diseases spread rapidly.
Public Health in a Changing Nation. The cholera epidemics of 1832–1833 and 1849–1854 illustrate that the nation's towns and cities were unprepared for and unable to shoulder the dual burdens of industrialization and rapid population growth. This growth imposed enormous strains on the cities' limited infrastructure and environmental resources. Many cities had antiquated or only limited systems of open sewers and gutters and lacked effective systems to remove garbage, animal waste, and rubbish from the streets. Public water supplies often relied on private wells dug when the population and its needs were smaller. In short, environmental and social conditions were rife for the rapid spread of disease when it occurred. Coupled with these conditions was the political expectation that protecting the public health was the duty of private citizens and not a government responsibility. As a result, municipal and state governments did not consistently attend to public health matters unless circumstances forced their involvement, and even then such interventions were fleeting, often consisting of private citizen groups acting in an official capacity to enforce the two measures in antebellum public health's arsenal: sanitation and quarantine.
Public Health as a Permanent State Function
The practice of public health did not change appreciably during the antebellum period, as sanitation and quarantine remained the common responses to outbreaks of disease. The epidemics of cholera, yellow fever, and diphtheria, the continuing threat posed by smallpox, malaria, and other ailments, and the growing incidence of tuberculosis during the first half of the nineteenth century highlighted the need for permanent institutions to safeguard a community's health. In the decades before the Civil War, the general health of Americans began to decline, and rates of mortality and morbidity rose, largely as a consequence of urbanization and industrialization. Even smallpox, the one disease that early public health efforts had made some progress in controlling through vaccination, began to occur in growing numbers by the 1840s as vaccination programs were neglected.
Calls for Sanitary Reform. During the 1830s in Europe, there was widening recognition of the relationship between the health of communities and the conditions in which they lived. After epidemics of influenza and typhoid in 1837 and 1838, the British government instructed Edwin Chadwick to study the problem of sanitation. Published in 1842, Chadwick's report, The Sanitary Condition of the Labouring Population of Great Britain, argued that disease was directly tied to living conditions and that there was a dire need for public health reform. American physicians also identified appalling sanitary problems in the United States, including Benjamin McCready in an 1837 essay, "On the Influence of Trades, Professions, and Occupations in the United States in the Production of Disease," and John Griscom in his 1845 report The Sanitary Condition of the Laboring Population of New York, With Suggestions for Its Improvement. These studies documented the growing rates of illness among the working class and proposed public health reforms to address the problems.
The most forceful and important call for wider public health efforts by the government came from the Massachusetts physician Lemuel Shattuck in his Report of the Sanitary Commission of Massachusetts (1850). Its recommendations for reform were ahead of their time in light of the absence of national and state boards of health and the disorganization and transience of local boards of health. Among other things, the report called for the creation of a permanent state board of health; the collection and analysis of vital statistics and disease surveys from various localities, classes, and occupations; the creation of a smallpox vaccination program; the promotion of maternal, infant, and children's health; the protection of children's health through health education and the sanitation and ventilation of school buildings; the development of state programs to care for the mentally ill; and the instruction of physicians and nurses on matters of preventative medicine. Although highly praised, Shattuck's recommendations, like McCready's and Griscom's, were largely ignored by the medical and political communities. But they pointed to the growing realization of the need for permanent, state-sponsored public health programs.
Establishment of Public Health Boards. By the 1850s, spurred by the second wave of cholera epidemics between 1849 and 1854, cities and states began to address the grim sanitary conditions that contributed to the rapid spread of disease. Some sanitary reformers, through national conventions between 1857 and 1860, called for a national quarantine, for reform and standardization of local quarantine laws, and for greater study of cholera, yellow fever, and the effectiveness of quarantine in stemming their transmission. The conventions brought together the influential figures of the sanitary movement—Richard Arnold, Edward Barton, Jacob Bigelow, John Griscom, Edward Jarvis, Wilson Jewell, James Jones, Edwin Miller Snow, and others. John Duffy, a leading historian of American public health, has called these national conventions "the real beginning of the American sanitary revolution."
The Civil War disrupted the momentum of these new sanitary efforts, but the spread of disease by the movement of troops on both sides underscored the need for the measures for which reformers had been clamoring. As in peacetime, the responsibility for caring for the sick and the wounded and for rectifying the poor sanitary conditions of military camps fell on private volunteer groups, one of which evolved into the United States Sanitary Commission. The Sanitary Commission, headed by Frederick Law Olmsted, oversaw military relief efforts, lobbied for a stronger medical corps, and succeeded in securing the appointment of the reform-minded William Hammond as surgeon general of the army.
After the war, important reforms were made in public health in the United States. Most notably, cities and states began to create permanent boards of health to oversee public health efforts. Louisiana and Providence, Rhode Island, had established boards as early as 1855, but questions about the necessity of such bodies were answered only when a third epidemic wave of cholera struck the United States during 1866 and 1867. New York City organized its Metropolitan Board of Health in 1866, and many other cities, including Cincinnati and Saint Louis, which suffered 2,000 and 3,500 deaths respectively during the cholera epidemic, quickly followed. The boards of health organized and implemented the sanitary measures and enforced the quarantine and inspection laws reformers had been urging for three decades. Tasks that had long been considered the responsibilities of private citizens were increasingly assumed by the state. The new boards of health also expanded the reach and functions of public health in the closing decades of the nineteenth century, as the emphasis on controlling epidemic diseases declined and a greater interest in practical measures to prevent disease and preserve health emerged. New laws forbade the adulteration of milk, flour, meats, and other foods and regulated their quality. Boards demonstrated a commitment to reducing infant mortality and improving children's health by regulating school buildings and instituting programs of health screening and vaccination. The creation of health boards also spurred the professionalization of public health as a discipline, reflected in the founding of the American Public Health Association in 1872.
Expansion of Municipal Infrastructure. On a practical level, the greatest challenge in realizing the goals of the sanitary reformers was removing waste, garbage, standing water, and other pollution from America's crowded cities. Heavy rains and heavy use caused privies and sewers to overflow, creating unsanitary conditions ideal for the spread of disease and the contamination of wells and the water supply. The growing rates of typhoid, dysentery, and other enteric illnesses pointed to the need for new infrastructure. Beginning in the late 1860s and continuing into the early twentieth century, dozens of cities initiated programs, spanning many years, to build sewers and to create a safe and protected water supply, which involved the introduction of filtration methods in the 1890s and chlorination in 1908. Health boards also enacted ordinances to curb other sources of animal pollution, a great concern at a time when horses were relied on for transportation and labor and when many families, even in urban areas, kept farm animals. Stables and manure in the streets had to be dealt with, as well as hogs and other livestock that roamed the streets. The carcasses of dead cats, dogs, and abandoned horses were common features of most large cities in the late nineteenth century. Even as late as the 1890s, New York City officials contended with removing an average of 8,000 dead horses from the city's streets each year. Because of such circumstances, enforcing sanitation laws remained a central activity of many public health departments.
Public Health and the Federal Government. The federal government played little, if any, role in public health matters until the end of the nineteenth century. In 1798, the United States Marine Hospital Service was created to care for sick and injured merchant seamen, utilizing municipal and charity hospitals already in existence. Between 1830 and 1861, the federal government undertook a program of hospital construction for the Marine Hospital Service, with many of the hospitals built along inland waterways. The system of hospitals was reorganized in 1870, placing its headquarters in Washington, D.C., and assigning as its head the newly appointed surgeon general. In 1891, the federal government conferred on the Marine Hospital Service the task of inspecting newly arriving immigrants for disease at the nation's ports. As the service's quarantine and inspection functions broadened, its name was changed in 1902 to the Public Health and Marine Hospital Service, and again in 1912 to the Public Health Service.
The second major federal foray into public health at the end of the nineteenth century was an attempt to establish a permanent National Board of Health. A devastating outbreak of yellow fever in 1878 in southern and midwestern states prompted politicians and public health leaders in the American Public Health Association to press for a federal body to oversee a national quarantine, duties initially vested in the existing Marine Hospital Service. After the National Board of Health was established in 1879, it pursued work in the areas of disease contagion, sanitation, and food adulteration. But political infighting, a lack of resources, and the absence of a clear mandate led to its demise in 1883, when its funds and duties were fully transferred to the Marine Hospital Service.
Public Health after the Germ Theory of Disease
Medical and public health theory and practice were fundamentally transformed by the development of the germ theory of disease at the end of the nineteenth century. Beginning in the 1860s, Louis Pasteur, Joseph Lister, John Tyndall, Robert Koch, and other scientists offered competing versions of germ theory, which generally proposed that diseases were caused by specific microorganisms that did not arise spontaneously. Rejecting the prevailing miasmatic explanation that diseases were the products of chemical fermentation and organic decomposition, germ theory was emblematic of the prominence held by laboratory sciences such as bacteriology and physiology in the new experimental medicine that emerged in the second half of the nineteenth century. Although the germ theory of disease was not immediately accepted, experimental evidence mounted in its favor as the microorganisms responsible for tuberculosis, diphtheria, cholera, septicemia, cattle plague, and anthrax were identified.
For the public health community, which had long relied on sanitation to prevent outbreaks of disease, the germ theory of disease provided an important new means of controlling diseases by identifying infected individuals. Beginning in the mid-1880s, the goals of public health work shifted from sanitary measures, which were slow to realize their benefits, to the scientific control of disease through bacteriological and epidemiological work. Germ theory allowed public health officials to identify and isolate infected individuals, to use the laboratory to diagnose diseases, and to develop vaccines against infectious diseases. Public health departments began to collect statistics about the incidence of disease, and physicians became legally obligated to report to the local health department any individual they diagnosed with a contagious disease. Vaccination against smallpox was expanded at the close of the nineteenth century, and new human vaccines followed, including those against rabies (1885), plague (1897), diphtheria (1923), pertussis (1926), tuberculosis (1927), tetanus (1927), and yellow fever (1935). With the availability of new vaccines, many states instituted mandatory vaccination programs, which were often politically contentious and resisted by the public. Public opposition to a mandatory smallpox vaccination program in Massachusetts found its way to the United States Supreme Court in Jacobson v. Massachusetts (1905). In this groundbreaking case, the Court upheld and broadened state powers in public health matters, declaring that the state could compel private citizens to act when the health of the community was threatened.
Disease control efforts against tuberculosis and diphtheria were symbolic of public health's successes in curbing diseases using bacteriological science. Tuberculosis had been the leading cause of death in the United States during the nineteenth century and continued to afflict many Americans until the mid-twentieth century. The identification of the bacterium that causes tuberculosis allowed physicians and public health officials to definitively diagnose infected individuals and to isolate them, preventing the further spread of the disease. By the 1930s, the incidence of tuberculosis was one-sixth of what it had been in the 1870s. Diphtheria reached epidemic proportions beginning in the 1870s and killed thousands of Americans annually, particularly young children, over the next three decades. In the 1890s, the New York City Department of Health, under the leadership of Hermann Biggs, instituted a multiphase diphtheria control effort that involved diagnosing and isolating individuals with the disease; developing, testing, and distributing diphtheria antitoxin; and creating a program of screening, immunization, and public education. By the 1930s, the incidence of diphtheria in New York City was a fraction of its levels in the early 1870s. Advances in laboratory medicine, however, were not solely responsible for the successes public health enjoyed at the century's turn. Rather, disease campaigns needed the participation and support of physicians, politicians, public health officials, and the general public to succeed, and implementing disease control programs often involved complex social negotiations among these groups.
The development of a laboratory-based, quantitative public health also led to the appearance of new related disciplines, including epidemiology, sanitation engineering, vital statistics, public health nursing, and preventative medicine. New schools and academic departments in epidemiology, hygiene, and public health were established, with Yale creating the first department in 1915 and Johns Hopkins (1916) and Harvard (1922) following shortly after. At the turn of the twentieth century, public health, which had long been practiced by physicians and regarded as a subdiscipline of medicine, emerged as a field of its own, with its own professional organizations, journals, institutions, and practices.
The New Public Health and Beyond
By the 1910s, most municipal and state public health departments had established or had access to bacteriological laboratories to aid their disease control efforts. But it was also becoming clear that neither sanitation efforts nor laboratory tools alone could prevent or control disease. Public health efforts needed the participation and cooperation of the communities they aimed to protect. The career of Charles Chapin, superintendent of health in Providence, Rhode Island, for example, reflected the transitions public health underwent in this period. During the 1880s, Chapin led Providence's efforts to eliminate privies and introduce indoor plumbing, investigated filtration and other methods to protect the public water supply, and established the nation's first municipal bacteriological laboratory in 1888. Chapin's sanitation efforts were detailed in his landmark book, Municipal Sanitation in the United States (1901). Even before writing this book, Chapin had begun to move toward the belief that public health efforts needed to be based on laboratory science. His field investigations and statistical studies concluded that general sanitary measures were not effective means of preventing diseases because diseases were spread by person-to-person contact, and that personal hygiene was therefore a critical factor in their transmission. He proposed that public health departments be relieved of their sanitation duties in order to focus on the diagnosis and isolation of infected individuals and on public education to promote healthy personal hygiene and practices. In articulating these views in his book The Sources and Modes of Infection (1910), Chapin laid the foundation for the New Public Health movement, which in the early decades of the twentieth century brought together efforts grounded in bacteriology, public health education, and social hygiene and reform. These concerns resulted in new or expanded public health programs for the inspection of milk, the care of the mentally ill, the promotion of children's health, and the regulation of food and drug purity. New laws also addressed child labor and occupational health, protections particularly important for immigrants, who faced harsh, unhealthy living and working conditions.
Public health work in this period was closely associated with broader social concerns and reforms of the Progressive movement, as public health officials and social reformers advanced a relationship between individual/personal hygiene and community/social hygiene. In expanding disease control programs against tuberculosis, diphtheria, and venereal diseases, health departments engaged in public health education and took advantage of films, magazine and newspaper advertising, traveling exhibitions, and public lectures to spread their message of personal hygiene. Public health officials hoped these efforts would discourage unhealthy behaviors and practices that could spread disease and would promote such behaviors as washing hands, swatting flies, using handkerchiefs and spittoons, and avoiding common drinking cups by informing the public about how disease was spread. A newfound awareness of the presence of germs affected people's daily lives and practices, including the growing use of water closets, the removal of rugs and heavy linens in the home that could harbor germs, the use of smooth chrome, porcelain, and linoleum to protect against germs, and the raising of women's skirts so that dust and germs would not be trapped in their hems. While old products such as soap were marketed with renewed fervor, new products were developed to accommodate new hygienic practices, including disposable, one-use paper cups, clear plastic wrap to protect foods, and disinfectants for the body and home.
The expansion of public health work and the introduction of epidemiological practices grounded in bacteriology and the laboratory also raised perplexing concerns about how public health officials could best protect the community's health while preserving the civil liberties of individuals. Public protests against mandatory vaccination programs accompanied their expansion at the turn of the twentieth century, but there is no better illustration of the legal challenges public health officials faced than the case of Mary Mallon, an Irish immigrant cook who was diagnosed as a carrier of the typhoid fever bacterium and was incarcerated by New York City health officials to prevent her from spreading the disease. Laboratory tests confirmed that Mallon's body harbored Salmonella typhi, the bacterium that causes typhoid fever. As a healthy carrier, she did not manifest any of the disease's symptoms, but she could transmit the bacterium to others through her cooking. Derisively called "Typhoid Mary," Mallon defied orders to stop working, and New York City officials felt compelled to isolate her between 1907 and 1910 and then again permanently from 1915 until her death in 1938.
Mallon's case raised important questions about the scope and limits of state powers in public health matters at a time when the number of governmental and private agencies and organizations addressing public health issues was growing. In the pre–World War II United States, the Rockefeller Foundation supported both national and international public health programs against hookworm, malaria, and pellagra, while the Rosenwald Fund, the Milbank Memorial Fund, the Commonwealth Fund, the W. K. Kellogg Foundation, and the Metropolitan Life Insurance Company promoted programs of public health and preventative medicine. In addition to the Public Health Service, a plethora of federal agencies, including the Communicable Disease Center (now the Centers for Disease Control and Prevention), the National Institutes of Health, the Indian Health Service, the Children's Bureau, the Department of Agriculture, and various military departments, undertook public health and preventative medicine work and research. During World Wars I and II, the War Department particularly sought to curb the high incidence of venereal diseases among military personnel, which threatened the country's military and moral strength.
In the years immediately after World War II, the discovery and growing availability of new antibiotics such as penicillin, streptomycin, aureomycin, chloromycetin, terramycin, and the sulfonamides, and of new vaccines such as those against polio (1955; 1962), measles (1963), mumps (1967), and rubella (1969), contributed to further controlling infectious diseases that had long plagued the public. The incidence of and mortality from infectious diseases had been declining steadily since the late nineteenth century, but the availability of powerful new vaccines and chemotherapeutic agents brought the incidence of infectious diseases to a fraction of the levels at their worst, underscoring the changing patterns of disease. Although a global pandemic of influenza during 1918–1919 had killed between twenty and forty million people worldwide, including 600,000 people in the United States, epidemic and infectious diseases diminished as the leading killers of Americans. During the twentieth century, chronic, noninfectious illnesses and conditions became the leading causes of death in the United States. On one hand, the increasing number of deaths from heart disease, cancers, stroke, diabetes, liver disease, arteriosclerosis, and lung diseases pointed to the fact that Americans were surviving to an older age, at which they were afflicted by these degenerative conditions. On the other hand, unhealthy personal behaviors, such as rich diets, sedentary lifestyles, and addictions to alcohol, tobacco, and drugs, also contributed to the incidence of these illnesses. As a result of these changes in the incidence of disease, the goals and emphases of public health work shifted from combating diseases to preventing them and promoting sound health.
The considerable successes and confidence public health officials enjoyed in disease prevention and health promotion in the decades after World War II faced severe tests at the close of the twentieth century. Beginning in the 1980s, the global pandemic of acquired immune deficiency syndrome (AIDS) strained public health departments. Between 1981 and 2000, 774,467 cases of AIDS were reported in the United States; 448,060 people died of AIDS. Nearly one million other Americans were also infected by the human immunodeficiency virus (HIV), the virus that causes AIDS. The development of powerful antiretroviral therapies during the 1990s prolonged the lives of many Americans infected by HIV or suffering from AIDS. Further contributing to a public health crisis in which tuberculosis, malaria, sexually transmitted diseases, and other diseases again emerged as grave threats to community health were: the displacement of populations through immigration and political conflicts; the emergence of drug-resistant strains; the high rates of incarceration, homelessness, and intravenous drug use; the prevalence of mass air travel; the collapse of medical services in eastern Europe; the persistence of widespread poverty; and the progress of the AIDS pandemic, in which tuberculosis served as an opportunistic infection.
At the start of the twenty-first century, American public health officers and epidemiologists continued their work of disease prevention and health promotion in a world changed by the terrorist attacks of 11 September 2001. The outbreak of anthrax during 2001 and the threat of biological warfare and terrorism suggested that few public health departments were well equipped or well trained to handle a sudden, mass disease outbreak, and raised questions about whether mass vaccination against smallpox should be reinstituted. Globalization and commercialism continue to pose profound challenges for the American public health community as it promotes good health for an American population beset by obesity, diabetes, stress, violence, smoking, and drug use. Still-emerging threats such as Lyme disease, bovine spongiform encephalopathy (BSE, or "mad cow" disease), and infections of the Ebola and West Nile viruses point to the continuing need for American public health organizations to respond to disease threats, promote preventative measures, and, above all, adapt their mission to the times.
BIBLIOGRAPHY
Brandt, Allan M. No Magic Bullet: A Social History of Venereal Disease in the United States since 1880. Exp. ed. New York: Oxford University Press, 1987.
Duffy, John. The Sanitarians: A History of American Public Health. Urbana: University of Illinois Press, 1990.
Gostin, Lawrence O. Public Health Law: Power, Duty, Restraint. Berkeley: University of California Press; New York: Milbank Memorial Fund, 2000.
Hammonds, Evelynn Maxine. Childhood's Deadly Scourge: The Campaign to Control Diphtheria in New York City, 1880–1930. Baltimore: Johns Hopkins University Press, 1999.
Humphreys, Margaret. Yellow Fever and the South. New Brunswick, N.J.: Rutgers University Press, 1992.
———. Malaria: Poverty, Race, and Public Health in the United States. Baltimore: Johns Hopkins University Press, 2001.
Kraut, Alan M. Silent Travelers: Germs, Genes, and the "Immigrant Menace." New York: Basic Books, 1994.
Leavitt, Judith Walzer. The Healthiest City: Milwaukee and the Politics of Health Reform. Princeton, N.J.: Princeton University Press, 1982.
———. Typhoid Mary: Captive to the Public's Health. Boston: Beacon Press, 1996.
Markel, Howard. Quarantine!: East European Jewish Immigrants and the New York City Epidemics of 1892. Baltimore: Johns Hopkins University Press, 1997.
Melosi, Martin. The Sanitary City: Urban Infrastructure in America from Colonial Times to the Present. Baltimore: Johns Hopkins University Press, 2000.
Rogers, Naomi. Dirt and Disease: Polio before FDR. New Brunswick, N.J.: Rutgers University Press, 1992.
Rosen, George. A History of Public Health. Exp. ed. Baltimore: Johns Hopkins University Press, 1993.
Rosenberg, Charles E. The Cholera Years: The United States in 1832, 1849, and 1866. Chicago: University of Chicago Press, 1987.
Rosenkrantz, Barbara Gutmann. Public Health and the State: Changing Views in Massachusetts, 1842–1936. Cambridge, Mass.: Harvard University Press, 1972.
Rothman, Sheila M. Living in the Shadow of Death: Tuberculosis and the Social Experience of Illness in America. New York: Basic Books, 1994.
Shilts, Randy. And the Band Played On: Politics, People, and the AIDS Epidemic. New York: St. Martin's Press, 1987.
Smith, Susan Lynn. Sick and Tired of Being Sick and Tired: Black Women's Health Activism in America, 1890–1950. Philadelphia: University of Pennsylvania Press, 1995.
Tomes, Nancy. The Gospel of Germs: Men, Women, and the Microbe in American Life. Cambridge, Mass.: Harvard University Press, 1998.
D. George Joseph
See also Cholera; Meat Inspection Laws; Poliomyelitis; Sexually Transmitted Diseases; Smallpox; Tuberculosis; Urbanization; Yellow Fever.