Medicine and Surgery


The Colonial Period

The people who settled the New World were often seeking something that had eluded them in the Old World; thus few established practitioners of medicine were among the early settlers. The exceptions were the apprentice-trained midwives who came with the religious minorities settling in America and the "surgeons" who accompanied the gentlemen adventurers of early Virginia. Surgeons in the early modern era were, in general, apprentice-trained practitioners without university education who used external treatment—baths, massage, and ointments and salves—as well as the knife. However, many "surgeons" were general practitioners. Medicine was a university discipline, read rather than practiced, not only by those who took a medical degree but also by those taking arts and theology degrees. Physicians prescribed for patients after consultation; few examined their patients. If procedures were required, a surgeon was summoned. Because university education was expensive, the vast majority of physicians came from the successful middle orders of society. They practiced among those who could afford to pay for their expensive consultations. Physicians moved to the New World in significant numbers in the eighteenth century. Most were native sons who returned to the increasingly affluent colonies after a university education in Europe.

In the seventeenth and early eighteenth centuries, the highest concentrations of university-educated persons in the colonies were among the clergy and the political classes; not surprisingly, both groups gave medical advice. Among the most prominent of the medical political leaders were John Winthrop and his son, also named John. Both conducted an extensive medical advice correspondence among the population of the sparsely settled New England countryside. In the eighteenth century, John Wesley, the founder of the Methodist sect, wrote a guidebook, Primitive Physick (1747), for clergy and lay people to use in the provision of health care. It remained popular through much of the nineteenth century.

Among the prominent clerical medical advisors was Cotton Mather, who contributed to the introduction of variolation, or inoculation, as a smallpox preventive. Smallpox had been regularly epidemic in Boston, and in 1721 Mather sent a letter to the medical practitioners of Boston urging them to collectively undertake inoculation. When they did not respond, Mather convinced Zabdiel Boylston to undertake the procedure without the concurrence of his colleagues. The reaction was vigorous, with most medical practitioners opposed and most of the clergy supportive.

Mather's experience and the subsequent history of inoculation reveal many things about medicine in colonial America. First and foremost, the idea of professional authority among physicians was in its infancy. When the university-trained physician Dr. William Douglass tried to get the apprentice-trained surgeon Boylston to discontinue an experimental intervention, the clergy decided to intervene. The local political leaders did not stop the practice, although they did mandate that inoculated persons be quarantined as if they had natural smallpox. In Virginia in 1768, an inoculator started an epidemic by not confining his patients, and the legislature restricted the practice. In Marblehead, Massachusetts, in 1774, inoculation led to riots. Part of the problem was economic; Benjamin Franklin, a strong supporter of the practice, noted its cost was "pretty high," usually greater than a common tradesman "can well spare." In addition, it was impractical to take several weeks away from work (there was no sick leave) to convalesce from the procedure. When inoculation was practiced in the absence of a raging epidemic, it was the well-to-do who received the benefits and the working classes who were put at risk of the spreading disease.

Smallpox was only one of many infectious diseases to plague the colonists. Others included the childhood diseases of measles, mumps, and especially diphtheria; epidemics of gastrointestinal diseases, sometimes typhoid fever, sometimes dysentery, and sometimes food poisoning; respiratory disease, which occurred every winter; and the yellow fever brought episodically to American ports by trade with Africa and the West Indies. But the most common disease was the so-called intermittent fever. After 1619, in South Carolina, outbreaks of a particularly malignant intermittent fever began to occur. Sometimes it was called remittent fever because the disease never seemed to completely leave the patient. The malignant fever spread with the continuing arrival of African slaves and is widely believed to be an African variant of malaria. In new settlements the experience was so common as to be called "seasoning," a normal part of moving there. The planter patterns of Low Country South Carolina—winter on the plantation and summer in the city, the Sea Islands, or the uplands—were shaped by fear of the seasonal disease. (The seasonal pattern would later be explained as a result of malaria being communicated by a mosquito vector.)

By 1750 the colonies had several significant towns, some of which were becoming cities. In these towns the medical profession began to organize following European models. Local practitioners formed medical societies, which sometimes provided for regulation. New Jersey and Massachusetts both chartered societies. They examined apprentices completing their training, and through admission to membership in the society following their success on the examination, the apprentices were certified as competent to practice. This provided a transportable credential, allowing apprentices to move away from masters to establish a practice. The colonial governments supported professional efforts by denying uncertified practitioners the use of the courts to recover fees. The "license" of the medical society was not required to practice, and many practitioners of varying competency and training practiced quite successfully without the license.

With increases in colonial wealth, more and more young practitioners opted to visit European cities for further education. Usually they went to Edinburgh for a medical degree or London to walk the wards of the new voluntary hospitals, and some went to the Continent. In 1765 two such students, William Shippen and John Morgan, convinced the College of Philadelphia trustees to open a medical department (later the University of Pennsylvania). In 1766, Thomas Bond, a founder of the Pennsylvania Hospital (1751), offered to give clinical lectures for the new medical program, and in June 1768 the first medical degrees were awarded in the colonies. The Pennsylvania Hospital was a typical eighteenth-century British voluntary hospital, built by subscription of wealthy individuals who held the right to nominate patients for admission; the medical staff served gratis. Shippen also offered instruction in midwifery, following a new pattern of anatomically based, physician-mediated childbirth practices that he had learned in London.

The Nineteenth Century

In 1800 the nation had four medical schools: the University of Pennsylvania (Philadelphia), the College of Physicians and Surgeons (New York City), Harvard's Massachusetts Medical College, and Dartmouth in New Hampshire. The standard of education was two courses of identical lectures (each of approximately four months), a final examination of indifferent rigor, a thesis, and a three-year apprenticeship. The value of the lectures was to introduce material the student would not see in the practical apprenticeship and provide some theoretical underpinnings to future reading. The vast majority of practitioners remained apprentice-trained, and states steadily followed the lead of Massachusetts in exempting medical school graduates from licensure examination. The M.D. was even more transportable than the state license, and as the country expanded westward it became increasingly important.

Practice was eclectic at best; therapy had changed little in the early modern period despite new scientific knowledge. The heavy metals introduced in the Renaissance had become increasingly common therapeutic agents, and many practitioners and patients preferred the purgative calomel (mercurous chloride) as a means of depletion in place of bloodletting, particularly if the digestion was implicated. Botanical remedies were also common, and some practitioners favored indigenous remedies. Modern scholars often disparage the value of these remedies, but to contemporaries they seemed effective. The physician gave a name to an unusual and serious disease (diagnosis), he frequently could tell patient and family what to expect in the coming days (prognosis), and he prescribed a remedy that had obvious physiological activity and so presumably helped effect a cure. Since most patients recover from most diseases without intervention (a fact not appreciated in 1800), this approach worked to most people's satisfaction.

Challenges and reforms. The growth of scientific and medical societies in Europe in the eighteenth century had established that the community of practitioners determined the degree of theoretical and therapeutic innovation that one could advocate and remain within the profession. Samuel Hahnemann introduced homeopathy in the late eighteenth century, but most medical practitioners rejected his ideas. By 1844, however, there were enough homeopathic practitioners to sustain a national medical society, the American Institute of Homeopathy. Germans in Pennsylvania came under Hahnemann's influence, and in 1848 they obtained a charter for a medical college in Philadelphia, which was later named for Hahnemann. Many practitioners tried homeopathy as a result of the failure of traditional remedies, especially in the cholera pandemics of 1832 and 1849.

European ideas were not the only challenges to the medical profession in the early nineteenth century. A New England farmer, Samuel Thomson, developed a system of healing based on his understanding of indigenous plant remedies, which he patented in 1813. By 1840 Thomson had sold over 100,000 rights. Thomson's son, John, was the agent in Albany, New York, when in 1844 a patient died under his care. The medical society pressed the state to bring charges against John Thomson for practicing without a license and thus being responsible for the untimely death of a fellow citizen. Thomson collected a petition with over 50,000 signatures requesting the repeal of the licensing law in New York, and the legislature complied. By 1850 the licensure legislation in every state except Massachusetts (where it was not enforced) had been repealed in the name of equity.

At the close of the eighteenth century the French republic had combined the ancient professions of physician and surgeon into a new unitary profession, trained in a medical school that had science teaching and hospital-based clinical experiences. These new practitioners made important observations in pathology, particularly the correlation of clinical signs and symptoms with changes in the tissues observed at postmortem. New means of physical diagnosis—auscultation and percussion—became increasingly popular. Dissections were frequent and practitioners began to fully document the natural history of various diseases. Americans visited Paris in large numbers, some for a brief period, others for extended study.

In 1835 Jacob Bigelow of Harvard advanced a radical idea: the therapies of the medical profession, regular or sectarian, did not change the natural course of most cases of disease. His classic essay, "On Self-Limited Diseases," documented that patients who took no remedies recovered at rates similar to those treated by the various forms of practice. The idea of self-limited disease was strongly resisted by much of the profession—self-interest, self-esteem, and even feelings of humanity all combined to oppose it. Yet physicians became increasingly skeptical as the data accumulated and time passed. As medical leaders abandoned the older therapies but had no new ones to put in their place, a crisis of confidence grew between the patient, who had been comfortable being prescribed for, and the profession, which now had a reduced, relatively passive role. Charles Rosenberg has described the profound change produced by this crisis as the "therapeutic revolution" and suggests that it was one of the most important turning points in the history of medicine, since it led to reforms in the study and practice of medicine that ultimately excluded educated laymen from the medical decision-making process. The physician's increasingly scientific knowledge meant that doctor and patient no longer had a shared understanding of the therapeutic encounter.

In the years leading up to the Civil War these changes played out in a variety of ways. One was the "malpractice crisis," in which improvements in orthopedics had resulted in better limb salvage after bad fractures, but at the cost of some limb shortening and limping. Juries awarded oppressive damages on the assumption that all fractures could be healed perfectly. Another area of confusion was the role and nature of ether anesthesia, introduced in 1846. Did pain serve an important physiological function? Should it be relieved if the patient could stand the surgery without pain relief? Patients wanted anesthesia, but doctors worried about the long-term success of operations if the normal physiology was interrupted. The competition of increasing numbers of practitioners, the growth of sectarian practice, and the influence of new ideas from Paris resulted in a profound change in the nature of medical thinking in the 1830s and 1840s.

As some practitioners campaigned to "improve" the profession socially and scientifically, others were concerned about the propriety of the changes. The frustrations surrounding these controversies resulted in the organization of the American Medical Association in 1847. Essentially an organization of medical schools and interested practitioners, the AMA had as its primary goal the elevation of standards of practice through improved medical school education. Professor Martyn Paine of New York University, one of the strongest voices for an open and egalitarian profession, felt that those who wished to increase requirements for medical students were self-interested elitists unfairly attacking the honor of the profession for personal aggrandizement. Under the existing system, America was the only country in the world where every citizen could expect to receive care from a medical graduate with an M.D., and "reform" might put that distinction at risk. The schools, however, had no incentive to raise standards.

Post–Civil War changes. During the Civil War the diversity of education, knowledge, and skill within the medical profession was increasingly apparent. The experience focused attention on several areas where improvement was possible. The use of ambulances to evacuate the wounded from the battlefield led some physicians to advocate hospital-based, urban ambulance systems after the war. The experience with camp diseases and the failures of camp hygiene encouraged the sanitary reform movement in the later part of the nineteenth century. The failures of surgical knowledge were recognized as heavily based in anatomical ignorance and inspired anatomy acts in many states in the generation following the war. Perhaps most important, many Americans gained a knowledge of and experience with hospitals as places of quality medical care for all classes.

While America was engaged in the Civil War, scientists in France and Germany were creating modern physiology, cellular pathology, and the germ theory of disease causation. The development of a postwar urban, industrial culture in the United States, combined with the new germ theory as applied to antiseptic surgery, helped drive the rapid expansion of hospitals in American cities. The nation had a little over 600 hospitals at the centennial of 1876 and more than 2,000 by the beginning of the new century. The volunteer surgical staffs of the hospitals saw the advantage of allowing the charities to provide the capital investment for antiseptic and increasingly aseptic operating rooms, realizing that their paying, private patients would benefit from this new surgical environment. The traditional charities had never allowed staff practitioners to charge patients, and so surgeons campaigned to have the rules changed and to get private, paying beds added to the institutions. In addition, surgeons were frequently the driving force behind the establishment of nurse-training programs in the hospitals.

The passage of anatomy acts, starting in Pennsylvania (1867), improved basic education in medical science. The increased need for clinical experience was also recognized, and medical schools built relationships with the expanding community of hospitals. Apprenticeship declined rapidly. As professional leaders encouraged investment in improved education, they saw value in external certification of their graduates to limit the influence of students on curricula and admissions. The reintroduction of state licensure provided the needed extramural authority.

As licensing examinations grew more common, the older sects declined. However, there were (and are) always patients for whom medicine has no answers, and such failures supported new sectarian movements even as medicine made scientific progress. Andrew T. Still, a bonesetter, developed the system of osteopathic medicine in the 1870s, based on the distribution of nervous energy mediated by the spine. He used the science of the day and brought physiologists into the new school of thought in such a way that osteopathy remained open to new ideas and scientific change; it would be largely reintegrated with regular medicine in the 1960s. D. D. Palmer's "invention" of chiropractic (1894) did not allow for changes in theory, nor did Mary Baker Eddy's "discovery" of Christian Science (1875).

That medical science was progressing faster than any practitioner could master was beyond dispute; the AMA adopted sections for special-interest programming at its meetings and published their discussions in its periodicals. Many of the leaders of the sections were involved with other kinds of specialty organizations, the most common of which was the limited-membership society made up of professors who published in the specialty and devoted all or most of their time to its practice. Organized medicine at the close of the nineteenth century was dynamic and innovative, and it successfully portrayed medical progress as beyond the understanding of the nonpractitioner, the profession as trustworthy to keep its own house in order, and the whole enterprise as of sufficient benefit for society to take the risk of leaving the profession to govern itself.

The 1898 war with Spain and the resulting colonial empire served as an excellent demonstration of the possibilities of the new medicine in American society. As the war progressed and America's premier surgeons volunteered and publicized their successes, the power of surgery was obvious. William Borden took the new X-ray machine to war. Walter Reed went to Cuba and solved the puzzle of yellow fever—it was caused by a virus and transmitted by the bite of the mosquito. William Gorgas, the health officer in Havana, began a campaign to eliminate mosquitoes, and with military efficiency yellow fever went away. In the Philippines army doctors established health departments that controlled smallpox, plague, cholera, malaria, and a host of other tropical infections. The vital statistics of various cities and states confirmed what the army experience demonstrated: modern medicine saved lives through preventing illness. Modern surgeons could do amazing things—delivery by cesarean section, appendectomy, hernia repair—and new drugs such as antitoxins, aspirin, and nitroglycerine for the heart made other diseases much less fearsome and painful. So as the new century opened, the power of science was turning back disease, healthy people could be more productive and thus happier, and at least to many opinion molders the possibilities of the future seemed limitless.

The Twentieth Century

In 1901, John D. Rockefeller founded the Rockefeller Institute for Medical Research. Other new institutes followed, and hospitals and medical schools began to receive philanthropy on a scale America had never seen before. The AMA asked the Carnegie Foundation for the Advancement of Teaching to investigate medical education, and in 1910 its agent, Abraham Flexner, issued his muckraking sensation, Medical Education in the United States and Canada. Reform of medical education, which had been struggling, took on new definition from Flexner's report. He adopted Johns Hopkins as his model and evaluated every school by how well it matched that model; only one came close. Societal leaders urged the others to meet the new standard, a standard designed to educate practitioners who would keep pace with the best science. Philanthropy helped, both with endowments and with capital improvements. Tax support followed rapidly. Among the difficult challenges was the new entry requirement of a college education; also expensive was a university-controlled teaching hospital. It would take almost forty years before all schools could provide educational opportunities like those Johns Hopkins had offered in the early years of the century, but eventually the standard of educational excellence was achieved. The costs were high; those who could not afford college and full-time study could not become doctors. Even more unfortunate was the funneling of the limited philanthropy into traditional white male institutions, leaving women's medical colleges and black schools without the funds to survive the new push toward excellent education for every medical student. The profession became better trained but more homogeneous as the new century progressed.

Because the progress of medical science had made it difficult to teach practitioners everything they needed to know in even four years, much of the practical training was gained in postgraduate hospital service. In 1904, Pennsylvania became the first of many states to mandate the completion of an internship year prior to licensure. The AMA issued standards for an approved internship in 1919, although a generation would pass before the experience became essentially universal.

The surgical hospital needed nursing staff to assist in surgery and to care for patients in recovery and convalescence. As practice became more complex, more individuals played important roles: personnel were needed to give anesthetics, to do pathological tests and examinations, and to use the new X-ray machines and interpret the images that resulted. Many of these roles seemed to require special education and training, often beyond the basic M.D. Hospitals were expensive to capitalize and operate, and patients frequently could not pay the actual costs. By the second decade of the twentieth century, the increasing costs associated with modern medical care were attracting political attention worldwide. Germany had introduced sickness insurance in the 1880s, and in 1911 the British adopted workmen's health insurance. In various states insurance proposals were made. While these proposals frequently included hospitalization insurance, they most commonly called for the payment of what would come to be called sick leave (temporary salary replacement) and for workmen's compensation or medical and disability pay protecting the employer from liability. The medical profession was divided on the proposals, but in 1915, as World War I gathered fury and the proposals were labeled as German, the movement lost strength and faded before the end of the decade. During the 1920s malpractice suits grew rapidly as lawyers recommended a malpractice countersuit when patients were sued for uncollected hospital bills. Finally, as the decade drew to a close, Texas teachers contracted for hospitalization insurance with local hospitals, and the voluntary insurance program known as Blue Cross was born.

It was still better to stay healthy, and the early twentieth century saw continued progress in preventive medicine. Among the most dramatic changes was the study of industrial risks led by Alice Hamilton of Illinois. The rural South also saw rapid progress in public health, first through the Rockefeller hookworm eradication campaign and then through Public Health Service campaigns against malaria and pellagra. After proving that pellagra was a nutritional deficiency disease, Dr. Joseph Goldberger, a Public Health Service officer, attracted national attention in 1921 when he publicly charged that the southern system of subsistence tenant farming was starving the people of the South. However, the cure for the disease—a higher standard of living—was more difficult to achieve than the diagnosis. Rockefeller and Public Health Service trials as well as local public health initiatives sometimes made local progress in fighting malaria, which required capital investment in drainage and screens to limit the access of mosquitoes to humans, but its elimination (or at least that of its mosquito vector) was beyond the scope of public health expenditure. As the century progressed it became increasingly clear that the easy public health victories had been won, and further progress required participation of the people and changes in public priorities, neither easy to mobilize.

By the 1930s the problems of specialization had become a significant issue in the self-governance of the profession. A group of concerned surgeons formed the American College of Surgeons in 1913, restricting membership to surgeons who could document satisfactory experience in both training and practice and who were certified by their peers to practice according to high ethical standards. The College grew steadily and after World War I became an important force in education and specialization in the practice of surgery. The ophthalmologists took a different approach; their concern was with a group of "specialized eye doctors," the optometrists, who were marketing themselves as trained exclusively in eye problems. Ophthalmologists established a separate examination in eye medicine so that well-trained specialists could be certified as more specialized in eye problems than the average doctor. In 1917 the American Board of Ophthalmic Examinations was chartered, soon to be renamed the American Board of Ophthalmology, and otolaryngology followed in 1922. In 1930 gynecology and obstetrics formed a third board, and before 1940 all the basic specialty boards had been established. The common basis was three years of specialty training beyond internship and a satisfactory certifying examination.

Innovations after World War II. World War II fundamentally reshaped American medicine in many important ways: it changed the opinions of doctors on the virtues of specialty practice, it changed the opinion of many Americans on the importance of physician access, and it convinced many of the profound possibilities of medical research. Americans' expectations of medical care grew, as did the system for paying for that care. In 1940 about 9 percent of Americans were covered by hospitalization insurance; by 1950 that percentage was greater than half. Fringe benefits were made exempt from taxation in the war and, unlike wages, were not frozen, creating an incentive to increase fringe benefits to attract workers to war industry. In the military, Americans were referred to specialized care in the general hospitals. Those approved as specialists received higher rank and so better pay and quarters. Even more telling was the observation that the specialist in the hospital was in the rear while the generalist was in the front; there is no greater statement of social value than distance from bullets.

At the end of the war the Servicemen's Readjustment Act (GI Bill) paid tuition to enhance the education and training of servicemen returning to civilian life. Under the act, hospitals and medical schools were paid tuition for employing residents, whom they needed to staff the hospital anyway. The result was a rapid, almost exponential, expansion of residency positions across the nation. The greatest impact of the wartime medical experience was probably the faith created in the power of science to change the quality and perhaps the quantity of life for Americans. Penicillin was the most obvious of the medical miracles, but new immunizations, blood fractionation, DDT insecticide, and the "bug bomb" used against mosquitoes all contributed to the sense of rapid progress. The wealth of the postwar economy solved many of the social problems that had undermined health in the prewar years, and postwar Americans began to expect medicine to deliver on demand.

Streptomycin, the second great antibiotic, was valuable in the fight against tuberculosis, the last great nineteenth-century killer. As new epidemics emerged, particularly polio, the mobilization of medical research produced an immunization within a decade. The early 1950s saw the introduction of the heart-lung machine and open-heart surgery, the Kolff artificial kidney and the possibility of dialysis, as well as exceptional progress in disability management within the Veterans Administration. The reform of the VA medical system led to the association of new VA medical centers with medical schools in an effort to enhance care for veterans and increase the number of physicians available to serve the higher expectations of the public. Even more important was the growth of federal financing of medical research through the National Institutes of Health. With federal money through the VA and the NIH, the Flexnerian vision of American medical education finally became a complete reality. In the following decades, therapeutic interventions, especially drugs to control metabolic diseases, combat cancer, prevent pregnancy, and limit psychiatric symptoms, transformed illness. Diagnostic imaging and automated diagnostic tests transformed the doctor-patient encounter in ways both helpful to precision and harmful to the doctor's counseling role. For the first time in human history, life expectancy for those over sixty-five increased, fundamentally altering the demographic profile of American society.

With an increasingly scientific and specialized medical profession, the ancillary components of the health care system also changed. The goal of a professional nursing service became increasingly common, and with insurance revenue hospitals could pay fully qualified nurses. However, professions in America were educated in the university, not trained on the job, and leaders of American nursing increasingly looked to the bachelor's degree as the standard of basic training in nursing. The technical assistants of radiologists and pathologists were also increasingly educated in universities. As the personnel increased, so did the technical equipment required to practice the best medicine, such as new imaging and laboratory machines. This equipment, along with the infection-control changes in nurseries and operating rooms and the semiprivate rooms expected by insured suburbanites, required more capital and more management. A profession of hospital administration began to grow, and hospitals looked for new sources of support. The federal government was the obvious place to find large sums of money, and the 1946 Hill-Burton Act created a program of matching grants to build hospitals in underserved areas. The underserved areas that could raise matching funds were in the suburbs, and Hill-Burton hospitals sprang up across the nation at the edges of cities.

Paying for health care. The federal government might build hospitals and pay for medical research, but it would not organize ways to pay for medical care. The Truman administration reintroduced national health insurance after the war, but the proposal was defeated by the AMA. Through the 1950s the insurance system and the economy both expanded rapidly enough to give the appearance of moving toward universal coverage. By 1960 it had become obvious that voluntary insurance was meeting the needs of many but that some groups were left out, particularly the elderly. The Johnson administration's Great Society initiative provided both the elderly and the poor with new federal programs: Medicare for the elderly—an addition to the social security program that provided hospital insurance and, for a small fee, doctor's bill coverage—and Medicaid, a state grant program to provide coverage for the poor. The medical profession was persuaded to participate by having the programs reimburse on a cost-plus basis like private insurance rather than use an imposed fee schedule as most national insurance programs did. By 1971 the upward spiral of costs forced the government to reconsider its cost-plus approach and look for ways to retrench.

By the late 1960s health care began to be discussed as a "right." Increasingly there were those who felt that a right implied responsibility, and so individuals needed to be encouraged to take better care of themselves. In 1964 the surgeon general of the United States, on the advice of an expert committee, determined that the association between cigarettes and lung cancer in men was sufficient to issue a public warning. The nation grappled with the difficult question of how to change legal, if harmful, habits that were generations in the making. Like Goldberger's prescription of a better diet for southern tenant farmers in the early twentieth century, modern public health advice frequently requires massive social changes.

By the late 1980s health insurance had become a major personnel expense for business, and corporate leaders looked for ways to limit their costs. The idea of health maintenance organizations (HMOs) had been introduced in the early 1970s as a market-driven way to reduce utilization: if physician incomes were determined in advance by the number of patients they cared for and they had to pay for referrals, they would have an interest in holding down expenditures on patients in order to retain more of the money. As business turned increasingly to HMOs and fee-for-service declined as a percentage of the medical marketplace, the suspicion grew that medical decisions were being made on the basis of cost rather than patient interest. But when President Clinton attempted to overhaul the entire health care system in 1993, the general opinion was that it was not broken enough to take the chance of making it worse. Concerns over affirmative action and equity in medical school admissions suggest that society does not wish to return to the days when the well-to-do were the only ones able to become physicians. Concern with the uninsured remains politically important. Research and practice associated with fetal issues and genetics generate debate. Technological improvements continue to fascinate, as replacement parts, miniaturized diagnostics, and the increasing engineering marvels associated with modern medicine bespeak a commitment to excellence in health care undiminished in the twenty-first century.

BIBLIOGRAPHY

Dowling, Harry Filmore. Fighting Infection: Conquests of the Twentieth Century. Cambridge, Mass.: Harvard University Press, 1977.

Duffy, John. Epidemics in Colonial America. Port Washington, N.Y.: Kennikat Press, 1972.

———. The Sanitarians: A History of American Public Health. Urbana: University of Illinois Press, 1990.

Gevitz, Norman, ed. Other Healers: Unorthodox Medicine in America. Baltimore: Johns Hopkins University Press, 1988.

Grob, Gerald N. The Mad Among Us. New York: Free Press, 1994.

Kraut, Alan M. Silent Travelers: Germs, Genes, and the Immigrant Menace. New York: Basic Books, 1994.

Leavitt, Judith Walzer. Brought to Bed: Childbearing in America, 1750 to 1950. New York: Oxford University Press, 1986.

Ludmerer, Kenneth M. Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care. Oxford and New York: Oxford University Press, 1999.

Maulitz, Russell C., and Diana E. Long, eds. Grand Rounds: One Hundred Years of Internal Medicine. Philadelphia: University of Pennsylvania Press, 1988.

Morantz-Sanchez, Regina. Sympathy and Science: Women Physicians in American Medicine. Chapel Hill: University of North Carolina Press, 2000.

Numbers, Ronald L., ed. Compulsory Health Insurance: The Continuing American Debate. Westport, Conn.: Greenwood Press, 1982.

Rosenberg, Charles E. The Cholera Years: The United States in 1832, 1849, and 1866. 1962. Reprint, Chicago: University of Chicago Press, 1987.

———. The Care of Strangers: The Rise of America's Hospital System. New York: Basic Books, 1987.

Rutkow, Ira M., and Stanley B. Burns. American Surgery: An Illustrated History. Philadelphia: Lippincott, 1998.

Savitt, Todd L., and James H. Young, eds. Disease and Distinctiveness in the American South. Knoxville: University of Tennessee Press, 1988.

Starr, Paul. The Social Transformation of American Medicine. New York: Basic Books, 1982.

Stevens, Rosemary. In Sickness and in Wealth: American Hospitals in the Twentieth Century. New York: Basic Books, 1989.

———. American Medicine and the Public Interest. Rev. ed. Berkeley: University of California Press, 1998.

Dale C. Smith

See also American Medical Association; Anesthesia, Discovery of; Chemotherapy; Childbirth and Reproduction; Chiropractic; Clinical Research; Epidemics and Public Health; Health Care; Health Insurance; Health Maintenance Organizations; Homeopathy; Hospitals; Hygiene; Johns Hopkins University; Malaria; Medical Education; Medical Profession; Medical Research; Medical Societies; Medicare and Medicaid; Medicine, Alternative; Microbiology; Osteopathy; Smallpox.
