Laurie Garrett. Foreign Affairs. Volume 75, Issue 1, January/February 1996.
The Post-Antibiotic Era
Since World War II, public health strategy has focused on the eradication of microbes. Using powerful medical weaponry developed during the postwar period—antibiotics, antimalarials, and vaccines—political and scientific leaders in the United States and around the world pursued a military-style campaign to obliterate viral, bacterial, and parasitic enemies. The goal was nothing less than pushing humanity through what was termed the “health transition,” leaving the age of infectious disease permanently behind. By the turn of the century, it was thought, most of the world’s population would live long lives ended only by the “chronics”—cancer, heart disease, and Alzheimer’s.
The optimism culminated in 1978 when the member states of the United Nations signed the “Health for All, 2000” accord. The agreement set ambitious goals for the eradication of disease, predicting that even the poorest nations would undergo a health transition before the millennium, with life expectancies rising markedly. It was certainly reasonable in 1978 to take a rosy view of Homo sapiens’ ancient struggle with the microbes; antibiotics, pesticides, chloroquine and other powerful antimicrobials, vaccines, and striking improvements in water treatment and food preparation technologies had provided what seemed an imposing armamentarium. The year before, the World Health Organization (WHO) had announced that the last known case of smallpox had been tracked down in Ethiopia and cured.
The grandiose optimism rested on two false assumptions: that microbes were biologically stationary targets and that diseases could be geographically sequestered. Each contributed to the smug sense of immunity from infectious diseases that characterized health professionals in North America and Europe.
Anything but stationary, microbes and the insects, rodents, and other animals that transmit them are in a constant state of biological flux and evolution. Darwin noted that certain genetic mutations allow plants and animals to better adapt to environmental conditions and so produce more offspring; this process of natural selection, he argued, was the mechanism of evolution. Less than a decade after the U.S. military first supplied penicillin to its field physicians in the Pacific theater, geneticist Joshua Lederberg demonstrated that natural selection was operating in the bacterial world. Strains of staphylococcus and streptococcus that happened to carry genes for resistance to the drugs arose and flourished where drug-susceptible strains had been driven out. Use of antibiotics was selecting for ever-more-resistant bugs. More recently scientists have witnessed an alarming mechanism of microbial adaptation and change—one less dependent on random inherited genetic advantage. The genetic blueprints of some microbes contain DNA and RNA codes that command mutation under stress, offer escapes from antibiotics and other drugs, marshal collective behaviors conducive to group survival, and allow the microbes and their progeny to scour their environments for potentially useful genetic material. Such material is present in stable rings or pieces of DNA and RNA, known as plasmids and transposons, that move freely among microorganisms, even jumping between species of bacteria, fungi, and parasites. Some plasmids carry the genes for resistance to five or more different families of antibiotics, or dozens of individual drugs. Others confer greater powers of infectivity, virulence, resistance to disinfectants or chlorine, even such subtly important characteristics as the ability to tolerate higher temperatures or more acidic conditions. 
Microbes have appeared that can grow on a bar of soap, swim unabashed in bleach, and ignore doses of penicillin logarithmically larger than those effective in 1950.
In the microbial soup, then, is a vast, constantly changing lending library of genetic material that offers humanity’s minute predators myriad ways to outmaneuver the drug arsenal. And the arsenal, large as it might seem, is limited. In 1994 the Food and Drug Administration licensed only three new antimicrobial drugs, two of them for the treatment of AIDS and none an antibacterial. Research and development has ground to a near halt now that the easy approaches to killing viruses, bacteria, fungi, and parasites—those that mimic the ways competing microbes kill one another in their endless tiny battles throughout the human gastrointestinal tract—have been exploited. Researchers have run out of ideas for countering many microbial scourges, and the lack of profitability has stifled the development of drugs to combat organisms that are currently found predominantly in poor countries. “The pipeline is dry. We really have a global crisis,” James Hughes, director of the National Center for Infectious Diseases at the Centers for Disease Control and Prevention (CDC) in Atlanta, said recently.
Diseases Without Borders
During the 1960s, 1970s, and 1980s, the World Bank and the International Monetary Fund devised investment policies based on the assumption that economic modernization should come first and improved health would naturally follow. Today the World Bank recognizes that a nation in which more than ten percent of the working-age population is chronically ill cannot be expected to reach higher levels of development without investment in health infrastructure. Furthermore, the bank acknowledges that few societies spend health care dollars effectively for the poor, among whom the potential for the outbreak of infectious disease is greatest. Most of the achievements in infectious disease control have resulted from grand international efforts such as the expanded program for childhood immunization mounted by the U.N. Children’s Emergency Fund and WHO’s smallpox eradication drive. At the local level, particularly in politically unstable poor countries, few genuine successes can be cited.
Geographic sequestration was crucial in all postwar health planning, but diseases can no longer be expected to remain in their country or region of origin. Even before commercial air travel, swine flu in 1918-19 managed to circumnavigate the planet five times in 18 months, killing 22 million people, 500,000 in the United States. How many more victims could a similarly lethal strain of influenza claim in 1996, when some half a billion passengers will board airline flights?
Every day one million people cross an international border. One million a week travel between the industrial and developing worlds. And as people move, unwanted microbial hitchhikers tag along. In the nineteenth century most diseases and infections that travelers carried manifested themselves during the long sea voyages that were the primary means of covering great distances. Recognizing the symptoms, the authorities at ports of entry could quarantine contagious individuals or take other action. In the age of jet travel, however, a person incubating a disease such as Ebola can board a plane, travel 12,000 miles, pass unnoticed through customs and immigration, take a domestic carrier to a remote destination, and still not develop symptoms for several days, infecting many other people before his condition is noticeable.
Surveillance at airports has proved grossly inadequate and is often biologically irrational, given that incubation periods for many incurable contagious diseases may exceed 21 days. And when a recent traveler’s symptoms become apparent, days or weeks after his journey, the task of identifying fellow passengers, locating them, and bringing them to the authorities for medical examination is costly and sometimes impossible. The British and U.S. governments both spent millions of dollars in 1976 trying to track down 522 people exposed during a flight from Sierra Leone to Washington, D.C., to a Peace Corps volunteer infected with the Lassa virus, an organism that produces gruesome hemorrhagic disease in its victims. The U.S. government eventually tracked down 505 passengers, scattered over 21 states; British Airways and the British government located 95, some of whom were also on the U.S. list. None tested positive for the virus.
In the fall of 1994 the New York City Department of Health and the U.S. Immigration and Naturalization Service took steps to prevent plague-infected passengers from India from disembarking at New York’s John F. Kennedy International Airport. All airport and federal personnel who had direct contact with passengers were trained to recognize symptoms of Yersinia pestis infection. Potential plague carriers were, if possible, to be identified while still on the tarmac, so fellow passengers could be examined. Of ten putative carriers identified in New York, only two were discovered at the airport; the majority had long since entered the community. Fortunately, none of the ten proved to have plague. Health authorities came away with the lesson that airport-based screening is expensive and does not work.
Humanity is on the move worldwide, fleeing impoverishment, religious and ethnic intolerance, and high-intensity localized warfare that targets civilians. People are abandoning their homes for new destinations on an unprecedented scale, both in terms of absolute numbers and as a percentage of population. In 1994 at least 110 million people immigrated, another 30 million moved from rural to urban areas within their own country, and 23 million more were displaced by war or social unrest, according to the U.N. High Commissioner for Refugees and the Worldwatch Institute. This human mobility affords microbes greatly increased opportunities for movement.
The City as Vector
Population expansion raises the statistical probability that pathogens will be transmitted, whether from person to person or vector—insect, rodent, or other—to person. Human density is rising rapidly worldwide. Seven countries now have overall population densities exceeding 2,000 people per square mile, and 43 have densities greater than 500 people per square mile. (The U.S. average, by contrast, is 74.)
High density need not doom a nation to epidemics and unusual outbreaks of disease if sewage and water systems, housing, and public health provisions are adequate. The Netherlands, for example, with 1,180 people per square mile, ranks among the top 20 countries for good health and life expectancy. But the areas in which density is increasing most are not those capable of providing such infrastructural support. They are, rather, the poorest on earth. Even countries with low overall density may have cities that have become focuses for extraordinary overpopulation, from the point of view of public health. Some of these urban agglomerations have only one toilet for every 750 or more people.
Most people on the move around the world come to burgeoning metropolises like India’s Surat (where pneumonic plague struck in 1994) and Zaire’s Kikwit (site of the 1995 Ebola epidemic) that offer few fundamental amenities. These new centers of urbanization typically lack sewage systems, paved roads, housing, safe drinking water, medical facilities, and schools adequate to serve even the most affluent residents. They are squalid sites of destitution where hundreds of thousands live much as they would in poor villages, yet so jammed together as to ensure astronomical transmission rates for airborne, waterborne, sexually transmitted, and contact-transmission microbes.
But such centers are often only staging areas for the waves of impoverished people that are drawn there. The next stop is a megacity with a population of ten million or more. In the nineteenth century only two cities on earth—London and New York—even approached that size. Five years from now there will be 24 megacities, most in poor developing countries: Sao Paulo, Calcutta, Bombay, Istanbul, Bangkok, Tehran, Jakarta, Cairo, Mexico City, Karachi, and the like. There the woes of cities like Surat are magnified many times over. Yet even the developing world’s megacities are way stations for those who most aggressively seek a better life. All paths ultimately lead these people—and the microbes they may carry—to the United States, Canada, and Western Europe.
Urbanization and global migration propel radical changes in human behavior as well as in the ecological relationship between microbes and humans. Almost invariably in large cities, sex industries arise and multiple-partner sex becomes more common, prompting rapid increases in sexually transmitted diseases. Black market access to antimicrobials is greater in urban centers, leading to overuse or outright misuse of the precious drugs and the emergence of resistant bacteria and parasites. Intravenous drug abusers’ practice of sharing syringes is a ready vehicle for the transmission of microbes. Underfunded urban health facilities often become unhygienic centers for the dissemination of disease rather than its control.
The Emblematic New Disease
All these factors played out dramatically during the 1980s, allowing an obscure organism to amplify and spread to the point that WHO estimates it has infected a cumulative total of 30 million people and become endemic to every country in the world. Genetic studies of the human immunodeficiency virus that causes AIDS indicate that it is probably more than a century old, yet HIV infected perhaps less than .001 percent of the world population until the mid-1970s. Then the virus surged because of sweeping social changes: African urbanization; American and European intravenous drug use and homosexual bathhouse activity; the Uganda-Tanzania war of 1977-79, in which rape was used as a tool of ethnic cleansing; and the growth of the American blood products industry and the international marketing of its contaminated goods. Government denial and societal prejudice everywhere in the world led to inappropriate public health interventions or plain inaction, further abetting HIV transmission and slowing research for treatment or a cure.
The estimated direct (medical) and indirect (loss of productive labor force and family-impact) costs of the disease are expected to top $500 billion by the year 2000, according to the Global AIDS Policy Coalition at Harvard University. The U.S. Agency for International Development predicts that by then some 11 percent of children under 15 in sub-Saharan Africa will be AIDS orphans, and that infant mortality will soar fivefold in some African and Asian nations, due to the loss of parental care among children orphaned by AIDS and its most common opportunistic infection, tuberculosis. Life expectancy in the African and Asian nations hit hardest by AIDS will plummet to an astonishing low of 25 years by 2010, the agency forecasts.
Medical experts now recognize that any microbe, including ones previously unknown to science, can take similar advantage of conditions in human society, going from isolated cases camouflaged by generally high levels of disease to become a global threat. Furthermore, old organisms, aided by mankind’s misuse of disinfectants and drugs, can take on new, more lethal forms.
A White House-appointed interagency working group on emerging and reemerging infectious diseases estimates that at least 29 previously unknown diseases have appeared since 1973 and 20 well-known ones have reemerged, often in new drug-resistant or deadlier forms. According to the group, total direct and indirect costs of infectious disease in the United States in 1993 were more than $120 billion; combined federal, state, and municipal government expenditures that year for infectious disease control were only $74.2 million (neither figure includes AIDS, other sexually transmitted diseases, or tuberculosis).
The Real Threat of Biowarfare
The world was lucky in the September 1994 pneumonic plague epidemic in Surat. Independent studies in the United States, France, and Russia revealed that the bacterial strain that caused the outbreak was unusually weak, and although the precise figures for plague cases and deaths remain a matter of debate, the numbers certainly fall below 200. Yet the epidemic vividly illustrated three crucial national security issues in disease emergence: human mobility, transparency, and tensions between states up to and including the threat of biological warfare.
When word got out that an airborne disease was loose in the city, some 500,000 residents of Surat boarded trains and within 48 hours dispersed to every corner of the subcontinent. Had the microbe that caused the plague been a virus or drug-resistant bacterium, the world would have witnessed an immediate Asian pandemic. As it was, the epidemic sparked a global panic that cost the Indian economy a minimum of $2 billion in lost sales and losses on the Bombay stock market, predominantly the result of international boycotts of Indian goods and travelers.
As the number of countries banning trade with India mounted that fall, the Hindi-language press insisted that there was no plague, accusing Pakistan of a smear campaign aimed at bringing India’s economy to its knees. After international scientific investigations concluded that Yersinia pestis had indeed been the culprit in this bona fide epidemic, attention turned to the bacteria’s origin. By last June several Indian scientists claimed to have evidence that the bacteria in Surat had been genetically engineered for biowarfare purposes. Though no credible evidence exists to support it, and Indian government authorities vigorously deny such claims, the charge is almost impossible to disprove, particularly in a region rife with military and political tensions of long standing.
Even when allegations of biological warfare are not flying, it is often exceedingly difficult to obtain accurate information about outbreaks of disease, particularly from countries dependent on foreign investment or tourism or both. Transparency is a common problem; though there is usually no suggestion of covert action or malevolent intent, many countries are reluctant to disclose complete information about contagious illness. For example, nearly every country initially denied or covered up the presence of HIV within its borders. Even now, at least ten nations known to be in the midst of HIV epidemics refuse to cooperate with WHO, deliberately obfuscating incidence reports or declining to provide any statistics. Similarly, Egypt denies the existence of cholera bacteria in the Nile’s waters; Saudi Arabia has asked WHO not to warn that travelers to Mecca may be bitten by mosquitoes carrying viruses that cause the new, superlethal dengue hemorrhagic fever; few countries report the appearance of antibiotic-resistant strains of bacteria; and central authorities in Serbia recently rescinded an international epidemic alert when they learned that all the scientists WHO planned to send to the tense Kosovo region to halt a large outbreak of Crimean-Congo hemorrhagic fever were from the United States, a nation Serbia viewed with hostility.
The specter of biological warfare having raised its head, Brad Roberts of the Center for Strategic and International Studies is particularly concerned that the New Tier nations—developing states such as China, Iran, and Iraq that possess technological know-how but lack an organized civil society that might put some restraints on its use—might be tempted to employ bioweapons. The Federation of American Scientists has sought, so far in vain, a scientific solution to the acute weaknesses of verification and enforcement provisions in the 1972 Biological Weapons Convention, which most of the world’s nations have signed.
That treaty’s flaws, and the very real possibility of bioweapons use, stand in sharp focus today. Iraq’s threat in 1990-91 to use biological weapons in the Persian Gulf conflict found allied forces in the region virtually powerless to respond: the weapons’ existence was not verified in a timely manner, the only available countermeasure was a vaccine against one type of organism, and protective gear and equipment failed to stand up to windblown sand. Last June the U.N. Security Council concluded that Iraqi stocks of bioweaponry might have been replenished after the Gulf War settlement. More alarming were the actions of the Aum Shinrikyo cult in Japan in early 1995. In addition to releasing toxic sarin gas in the Tokyo subway on March 20, cult members were preparing vast quantities of Clostridium difficile bacterial spores for terrorist use. Though rarely fatal, clostridium infections often worsen as a result of improper antibiotic use, and long bouts of bloody diarrhea can lead to dangerous colon inflammations. Clostridium was a good choice for biological terrorism: the spores can survive for months and may be spread with any aerosol device, and even slight exposure can make vulnerable people (particularly children and the elderly) sick enough to cost a crowded society like Japan hundreds of millions of dollars for hospitalizations and lost productivity.
The U.S. Office of Technology Assessment has calculated what it would take to produce a spectacular terrorist bioweapon: 100 kilograms of a lethal sporulating organism such as anthrax spread over Washington, D.C., by a crop duster could cause well over two million deaths. Enough anthrax spores to kill five or six million people could be loaded into a taxi and pumped out its tailpipe as it meandered through Manhattan. Vulnerability to terrorist attacks, as well as to the natural emergence of disease, increases with population density.
A World at Risk
A 1995 WHO survey of global capacity to identify and respond to threats from emerging disease reached troubling conclusions. Only six laboratories in the world, the study found, met security and safety standards that would make them suitable sites for research on the world’s deadliest microbes, including those that cause Ebola, Marburg, and Lassa fever. Local political instability threatens to compromise the security of the two labs in Russia, and budget cuts threaten to do the same to the two in the United States (the army’s facility at Fort Detrick and the CDC in Atlanta) and the one in Britain. In another survey, WHO sent samples of hantaviruses (such as Sin Nombre, which caused the 1993 outbreak in New Mexico) and organisms that cause dengue, yellow fever, malaria, and other diseases to the world’s 35 leading disease-monitoring facilities. Only one—the CDC—correctly identified all the organisms; most got fewer than half right.
Convinced that newly emerging diseases, whether natural or engineered, could endanger national security, the CDC requested $125 million from Congress in 1994 to bolster what it termed a grossly inadequate system of surveillance and response; it received $7.3 million. After two years of inquiry by a panel of experts, the Institute of Medicine, a division of the National Academy of Sciences, declared the situation a crisis.
Today’s reality is best reflected in New York City’s battle with tuberculosis. Control of the W-strain of the disease—which first appeared in the city in 1991-92, is resistant to every available drug, and kills half its victims—has already cost more than $1 billion. Despite such spending, there were 3,000 TB cases in the city in 1994, some of which were the W-strain. According to the surgeon general’s annual reports from the 1970s and 1980s, tuberculosis was supposed to be eradicated from the United States by 2000. During the Bush administration the CDC told state authorities they could safely lower their fiscal commitments to TB control because victory was imminent. Now public health officials are fighting to get levels down to where they were in 1985—a far cry from elimination. New York’s crisis is a result of both immigration pressure (some cases originated overseas) and the collapse of the local public health infrastructure.
National preparedness has further eroded over the past five years in the face of budgetary constraints. Just as WHO cannot intercede in an epidemic unless it receives an invitation from the afflicted country, the CDC may not enter a U.S. state without a request from the state government. The U.S. system rests on an increasingly shaky network of disease surveillance and response by states and territories. A 1992 survey for the CDC showed that 12 states had no one on staff to monitor microbial contamination of local food and water; 67 percent of the states and territories had less than one employee monitoring the food and water of every one million residents. And only a handful of states were monitoring hospitals for the appearance of unusual or drug-resistant microbes.
State capacity rests on county and municipal public health, and there too weaknesses are acute. In October, dengue hemorrhagic fever, which had been creeping steadily northward from Brazil over the past eight years, with devastating results, struck in Texas. Most Texas counties had slashed their mosquito control budgets and were ill prepared to combat the aggressive Tiger mosquitoes from Southeast Asia that carry the virus. In Los Angeles County that month, a $2 billion budget shortfall drove officials to close all but 10 of the 45 public health clinics and to attempt to sell four of the county’s six public hospitals. Congress is contemplating enormous cuts in Medicare and Medicaid spending, which the American Public Health Association predicts would result in a widespread increase in infectious disease.
Bolstering research capacity, enhancing disease surveillance capabilities, revitalizing sagging basic public health systems, rationing powerful drugs to avoid the emergence of drug-resistant organisms, and improving infection control practices at hospitals are only stopgap measures. National security warrants bolder steps.
One priority is finding scientifically valid ways to use polymerase chain reaction (popularly known as DNA fingerprinting), field investigations, chemical and biological export records, and local legal instruments to track the development of new or reemergent lethal organisms, whether natural or bioweapons. The effort should focus not only on microbes directly dangerous to humans but on those that could pose major threats to crops or livestock.
Most emerging diseases are first detected by health providers working at the primary-care level. Currently there is no system, even in the United States, whereby the providers can notify relevant authorities and be assured that their alarm will be investigated promptly. In much of the world, the notifiers’ reward is penalties levied against them, primarily because states want to hush up the problem. But Internet access is improving worldwide, and a small investment would give physicians an electronic highway to international health authorities that bypassed government roadblocks and obfuscation.
Only three diseases—cholera, plague, and yellow fever—are subject to international regulation, permitting U.N. and national authorities to interfere as necessary in the global traffic of goods and persons to stave off cross-border epidemics. The World Health Assembly, the legislative arm of WHO, recommended at its 1995 annual meeting in Geneva that the United Nations consider both expanding the list of regulated diseases and finding new ways to monitor the broad movement of disease. The Ebola outbreak in Kikwit demonstrated that a team of international scientists can be mobilized to swiftly contain a remote, localized epidemic caused by known nonairborne agents.
Were a major epidemic to imperil the United States, the Office of Emergency Preparedness and the National Disaster Medical System (part of the Department of Health and Human Services) would be at the helm. The office has 4,200 private-sector doctors and nurses throughout the 50 states who are at its disposal and committed to rapid mobilization in case of emergency. The system is sound but should be bolstered. Participants should be supplied with protective suits, respirators, mobile containment laboratories, and adequate local isolation facilities.
As for potential threats from biological weapons, the U.S. Department of Energy has identified serious lapses in Russian and Ukrainian compliance with the Biological Weapons Convention. Large stockpiles of bioweapons are believed to remain, and employees of the Soviet program for biological warfare are still on the state payroll. Arsenals are also thought to exist in other nations, although intelligence on this is weak. The location and destruction of such weapons is a critical priority. Meanwhile, scientists in the United States and Europe are identifying the genes in bacteria and viruses that code for virulence and modes of transmission. Better understanding of the genetic mechanisms will allow scientists to manipulate existing organisms, endowing them with dangerous capabilities. It would seem prudent for the United States and the international community to examine that potential now and consider options for the control of such research or its fruits.
To guard against the proliferation of blood-associated diseases, the blood and animal exports industries must be closely regulated, plasma donors must be screened for infections, and an internationally acceptable watchdog agency must be designated to monitor reports of the appearance of new forms of such diseases. The export of research animals played a role in a serious incident in Germany in which vaccine workers were infected with the Marburg virus and in an Ebola scare in Virginia in which imported monkeys died from the disease.
Nobel laureate Joshua Lederberg of Rockefeller University has characterized the solutions to the threat of disease emergence as multitudinous, largely straightforward and commonsensical, and international in scope; “the bad news,” he says, “is they will cost money.”
Budgets, particularly for health care, are being cut at all levels of government. Dustin Hoffman made more money last year playing a disease control scientist in the movie Outbreak than the combined annual budgets for the U.S. National Center for Infectious Diseases and the U.N. Programme on AIDS/HIV.