Christopher Lawrence. Scientific Thought: In Context. Editor: K Lee Lerner & Brenda Wilmoth Lerner. Volume 1. Detroit: Gale, 2009.
Antibiotics and antiseptics are substances used internally and externally to kill or deter the growth of the biological agents that cause disease. For humans, sepsis (infection) is inextricably linked to the idea of bacterial or fungal infection, but this was not always the case. When a wound, perhaps a pinprick or maybe a major battle injury, becomes red, swollen, painful, and hot, it is said to be inflamed; if a white or greenish fluid called pus appears, sepsis has set in. Only at the end of the nineteenth century, however, was it agreed that sepsis was the result of infection by microorganisms. Similarly, when a person develops a fever, becomes short of breath, and coughs up red-tinged, green sputum, they might now be diagnosed as having pneumonia, a sort of internal infection caused by bacteria. Again, this view was not widely held until about 1900. Before this time, sepsis was considered to arise from causes internal to the body itself, with the addition of factors such as poor diet and a bad climate (not infectious agents). Today, the term sepsis is often reserved to describe an overwhelming infection in the blood with bacteria that produce toxins.
Historical Background and Scientific Foundations
Egyptian papyri from 3000 BC onwards contain references to entities that can be identified as inflammation, pus, and abscesses, but what these writers considered caused these signs of infection is a mystery. The first extended descriptions of sepsis occur in writings associated with the Greek physician Hippocrates (c.460-370 BC), although these texts are suspected to be the work of many different authors and sects. They are mainly observations about patients, not speculations on the cause of infection.
For the most part, the Hippocratic texts prized careful description of sickness and are devoid of any coherent theory. When theory was used, health was equated with a normal blending of the body’s humors, which in one manuscript are identified as blood, phlegm, and yellow and black bile. The writings give excellent accounts of inflammation and diseases such as puerperal sepsis (childbed fever) as seen at the bedside. In one text, pus was considered overheated blood. In the treatise Wounds, irrigation with fresh water is recommended as a treatment, and no dressing was permitted if suppuration (pus formation) occurred. These measures might seem basic antiseptic techniques to modern eyes, but there is no indication that the writers of the time considered sepsis to be caused by an infective agent. Poisonous gas emanating from marshes and clouds of insects are about the only specific disease-producing agents mentioned in the Hippocratic texts. Some ancient surgeons used gum resins such as myrrh and frankincense as antiseptics.
In the Roman world, the writer Varro (116-27 BC) described invisible animalcules (little animals) as the cause of disease, but his idea was never taken up by other authors. The greatest physician in Rome was the Greek Galen (AD 129-c.216). Galen endorsed Hippocratic observation but also sought to produce a coherent theory of health and disease. He regarded inflammation as a sort of tumor. Suppuration (the formation of pus) and its worst consequence, gangrene, were described by Galen as a humoral (body fluid) disorder. Galen also considered that pus could in certain circumstances indicate healing. What would be considered internal infections today, such as pneumonia or meningitis, Galen understood as individual deviations from good health caused by a weak constitution, poor diet, or bad living.
Galen’s works formed the basis of orthodox medicine until the late seventeenth century, and physicians’ observations and theories of sepsis used his works as guidelines. Two other threads, however, influenced the treatment of septic diseases and theories of their cause. First was the management of epidemic disease by religious and civil authorities, who sometimes relied on contemporary medical theory but also resorted to religion in the face of epidemics. In Vienna during some outbreaks of plague, for example, dying citizens were often brought to the church in the center of town in order to receive a miracle from God. It is probable that such movement of infected individuals facilitated the spread of the disease, and dependence upon miracles and penitence to cure or prevent plague and other infectious diseases likely hindered the evolution of scientific thought about the process of infection. Second, surgeons who were forced to deal with numerous infected battle wounds among soldiers helped practical solutions emerge for treating infections and, ultimately, preventing them. It is important to grasp that although in the modern era septic wounds and febrile diseases are thought of as different facets of the same sort of pathological process—infection with a disease-causing bacterium—this was by no means the case in the past.
Infection and Contagion
The terms infection and contagion originally meant pollution in the most general sense and referred to the religious and moral realm. In the Middle Ages, the isolation of lepers and quarantining of plague victims by no means implied that the causes of these diseases were invading organisms; the isolation was due to social perceptions about the person with the disease. The first figure to postulate living “seeds” as a cause of disease was a Renaissance physician, Girolamo Fracastoro (c.1478-1553), in a poem called Syphilis (the name of a Greek shepherd), which described the newly observed scourge of that name. In general, until the nineteenth century, physicians worked on the view that epidemic diseases could be explained by miasmas, the bad airs (from which the name for malaria is derived) caused by environmental pollution, whether natural (produced by swamps, for example) or artificial, as could be caused by sewage or animal matter left to decompose in the streets of towns.
The word antiseptic was introduced in the eighteenth century. At this time, increasing attention was turned to recording and explaining the diseases of populations, the beginning of the study of epidemiology. The British in particular, with their expanding army and navy, showed great concern with understanding the cause of epidemics in military camps and on naval vessels. Theoretical and experimental interests centered on understanding how the body’s vital processes resisted decay or putrefaction. Antiseptics, usually weak acids, were introduced as agents to assist cleansing operations, which were recognized as essential to maintaining the health of large numbers of men quartered together. But there was still no connection with infection in anything like a modern sense. At that time, many diseases were held to be varieties of septic or putrid fever. Thus, scurvy—a major maritime problem for a colonial power—was considered a putrid disease, caused by bad diet and poor air. It was thus agreed it might be prevented by green vegetables, ventilation aboard ship, and cleanliness assisted by regular disinfection of the decks with vinegar. Vinegar or lemon juice was also taken internally as an antiseptic.
In the countryside and small villages where large populations did not live in close quarters, wounds generally healed more often without complications from infection or sepsis. In urban areas and on battlefields, on the other hand, infection and delayed healing were the norm. Military and naval surgeons were at the forefront of treating septic wounds and using makeshift antiseptics, although they seldom agreed on the cause of sepsis. Until the late nineteenth century, many surgeons continued to consider the formation of white, odorless pus as a sign of wound healing. Instruments of cautery (hot irons) and boiling oil were widely used to destroy dead flesh and prevent putrefaction—the production of foul-smelling, bloodstained, green pus. The French surgeon, Ambroise Paré (1510-1590), famously and successfully dispensed with the cautery iron in favor of a wound dressing made of egg yolk, oil of roses, and turpentine. Long afterward, it was a tradition among carpenters to dip a cut finger in turpentine.
Rise of Surgery
In the late eighteenth and early nineteenth centuries, a number of factors began to draw together the previously separate realms of internal and external disease, thus laying the groundwork for the emergence of a single theory of the cause of sepsis. The most important of these factors was the rise in status of surgeons and their access to large numbers of hospital patients. In the eighteenth century, surgeons increasingly attempted to give their craft a theoretical grounding based upon animal experimentation and dissection. The most important figures in this movement were the Scottish surgeon John Hunter (1728-1793) and the French physician Xavier Bichat (1771-1802). Hunter and Bichat were relentless animal experimenters, and both worked on the processes of inflammation and pus formation. On the basis of their experiments, both arrived at similar theories: inflammation, pus formation, and infection could be local processes, initiated in tissues rather than in organs or the body as a whole.
An important constituent of this “rise of surgery” was the increasing emphasis on the hospital as an important site of medical treatment and education. By the middle of the nineteenth century in Europe, and slightly later in North America, general and specialty hospitals were commonplace in large towns and cities. London had twelve major teaching hospitals, each with hundreds of beds. The ancient hospital of St. Bartholomew’s treated more than 5,000 inpatients and over 100,000 outpatients in 1867. All of these hospitals had large numbers of surgical wards. Facilitated by the introduction of ether anesthesia in 1846, surgery by this time had become a complex process featuring relative surgical precision rather than quick knife-work. Although largely limited to bones and soft tissues, operations were often extensive.
The concentration of surgical patients in one area, however, was associated with a terrifying phenomenon known by various names, commonly, hospital gangrene or “hospitalism.” In this condition, a patient might enter a ward with an injury, often an open wound with a protruding broken bone. These compound fractures were frequently the product of accidents involving one of the many thousands of horse-drawn carts and carriages that packed the streets. The fractures were much feared and known to have a high mortality. In some cases, the wound became septic, the limb gangrenous, and then often the patient developed a high fever and death followed. Frequently, other patients in the same ward would develop the same condition, with the original patient considered the source of the infection (in the sense of pollution) in all the others.
Many public health reformers, notably Florence Nightingale (1820-1910), saw hospitalism as analogous to the febrile diseases of the city slums produced by overcrowding and filth. The cause of the disease, these reformers held, lay in environmental pollution; in the miasmas (toxic air) that were produced by a rotting wound, just as sewage and putrefying matter in the streets was thought to produce typhoid. Their proposed solution was to tear down the hospitals and rebuild them as spacious, airy pavilions in the countryside.
Most surgeons, although they might agree that festering injuries were the sources of disease, sought a different solution. For the most part, surgeons considered that the answer lay in management of the wound itself rather than the general environment. A large number of surgeons tried the simplest of dressings, or abandoned them altogether on the grounds that most uncomplicated injuries healed without the development of sepsis. Various substances such as milk or oil were used to encourage clean healing. This was known as the “clean” school, although the term clean referred to the tissues surrounding the wound itself, rather than, as it does today, to reducing bacteria to levels below those necessary to cause disease. The most important and controversial reformer who saw the answer to surgical sepsis in wound management was the English surgeon Joseph Lister (1827-1912).
Lister’s antiseptic treatment of wounds with carbolic acid caused a furor among surgeons of the day. Many of Lister’s colleagues claimed equally good results using simpler techniques. Lister did not hold a modern-type germ theory in the 1870s, when the promotion of his method was at its peak. Instead, Lister considered infection to be caused by some sort of fungal-like agent whose pathogenic (disease-causing) properties were shaped by the environment. Lister’s theory was one of putrefaction, not infection. Only after 1880 did Lister adopt modern germ theory, by which time surgeons were adopting aseptic methods (that is, heat sterilization of the whole surgical environment, clothes, instruments, etc., rather than chemical sterilization of the wound alone). It is very difficult to determine the causes of the real decline in surgical mortality in this period, as hospitals were becoming cleaner and more sanitary, nursing methods were improving, and patient nutrition was also improving. Finally, Lister’s antiseptic assault on wounds based on germ theory was an attempt to show that science, when applied to healing in large inner-city hospitals, was the route to progress. The true Listerian legacy is the demonstration that specific antiseptic agents could kill invisible living things that caused wound decay.
Specific Antibacterial Agents
The 1880s saw the establishment of modern germ theory based on microscopic study and on the growth of bacteria in vivo (in living things) and in vitro (in the laboratory). Unlike Lister’s germs, these bacteria were agents that produced specific diseases such as diphtheria and gonorrhea. As the new germ theory was created, so theories of immunity were devised. It was quickly recognized in the case of diphtheria that the lethal effects of the causative bacteria were brought about by a poison: diphtheria toxin. In the 1890s in Germany, Emil von Behring (1854-1917) and Shibasaburo Kitasato (1852-1931), two assistants of the German bacteriologist Robert Koch (1843-1910), showed that injections of small, non-lethal doses of the toxin into rabbits produced a serum capable of neutralizing it. This property of the serum they called “antitoxic.” These workers raised the serum in horses and, in 1891-1892, injected it into children with diphtheria. Although there were some deaths and equivocal results, overall there were many recoveries from a disease with a very high mortality (death rate). The first specific antibacterial agent had been made. These natural products were soon to be called antibodies. As important as the diphtheria therapy results themselves was the demonstration that specific bacterial diseases could be targeted by specific substances.
Paul Ehrlich (1854-1915) was another worker in Koch’s lab who studied immunity. Ehrlich conceived of antibodies as “magic bullets” and set out to discover whether chemicals could have similar effects. It was well known, for example, that various dyes stained particular cells. Ehrlich began by modifying an organic arsenic compound, atoxyl; compound No. 606 was found to kill the newly discovered spirochete that causes syphilis, Treponema pallidum. Compound No. 606 was renamed Salvarsan and in 1911, it was used to treat syphilis in humans. Ehrlich gave currency and a new meaning to an old term, chemotherapy, to describe the agent’s specific beneficial effects. Biologicals was the word eventually used to describe therapeutic and prophylactic agents (vaccines) produced from living bodies. There are now also synthetic biologicals, manufactured chemicals with the same structures as naturally produced products.
About the same time, European powers with their empires to protect were anxious to find cures for tropical diseases. In the 1920s, following Ehrlich’s train of thought, various chemotherapeutic agents were produced in Germany and the United Kingdom that were effective against the malaria parasite. Other agents were produced that combated the parasite of trypanosomiasis (sleeping sickness). Apart from Salvarsan, however, there was no chemotherapeutic agent effective internally against specific bacteria. Many commercial internal antiseptics were produced and were advertised as having a blunderbuss (scattered) effect on all bacteria, but such claims were never generally endorsed (carbolic, iodine, and other antiseptics killed most bacteria in external wounds).
Germany remained at the front of the chemotherapeutics industry until World War II (1939-1945). The country had industrialized later than Britain, and its industrial base was more modern. It was also heavily orientated to the chemical and dye industries. The American pharmaceutical industry was, until after World War II, largely orientated to mass producing drugs that had been synthesized elsewhere. In 1935 in Germany, Gerhard Domagk (1895-1964) published the results of experiments with the dye Prontosil red, showing that it could protect mice from infection with streptococci (bacteria associated with a variety of infections in humans, notably tonsillitis). French workers then showed this effect was owing to conversion of the dye in the body into a member of a group of chemical substances called sulphonamides. Hundreds of sulphonamides were then synthesized, and a small number were shown to be effective cures for human streptococcal infections, along with a few other bacterial diseases, notably gonorrhea. Sulphanilamide was the best known of these sulphonamide variants. Research showed that the sulphonamides did not kill bacteria but prevented them from multiplying. They also manifested a property eventually to be discovered as a drawback to antibiotics—bacteria became resistant to them.
Path to Penicillin
Between the two world wars, researchers used many different approaches to produce preventative and therapeutic agents that were effective against bacteria and tropical parasites. At that time, there was no agreed theory of bacterial action or of immunity. Some scientists claimed that chemicals were the way forward, while others considered antisera (serum containing antibodies) the best form of attack. The most effective, successful, and surprising therapy for bacterial infections turned out to be neither of these.
One powerhouse of bacteriological and immunological research at this time was Almroth Wright’s (1861-1947) Inoculation Department at St. Mary’s Hospital in London. (Wright had produced the first anti-typhoid vaccine before World War I.) Wright postulated a theory of immunity based on “opsonin,” a circulating blood substance, and had been caricatured for it as Sir Colenso Ridgeon by his good friend George Bernard Shaw in the play The Doctor’s Dilemma. Wright was eccentric and ruled his department sternly. Nonetheless, a stream of young men who were eventually to become distinguished medical scientists worked under him using the opsonin theory as a research tool. The whole ethos of Wright’s department was biological rather than chemical in focus. One of the gifted individuals who worked there was Alexander Fleming (1881-1955).
In Wright’s laboratory in 1922, Fleming discovered lysozyme. This is a protein found in substances such as blood, tears, human milk, and nasal secretions. Fleming discovered its action on a bacterial culture plate that had been contaminated, as was evidenced by the spread of an unwanted yellow bacterial colony. On a part of the plate where Fleming had added a drop of his own nasal mucus (he had a common cold at the time), nothing grew.
By 1928, Fleming had been appointed professor of bacteriology and deputy director of the Inoculation Department. Wright was still exercising imperious control and was very protective of Fleming. In August or September 1928, Fleming made a striking observation. He reported it in the British Journal of Experimental Pathology in 1929, writing that while working with variants of staphylococcus bacteria, “a number of culture plates were set aside on the laboratory bench.” When they were later examined, the plates had been “exposed to the air and they became contaminated.” It was then noticed “that around a large colony of a contaminating mold the staphylococcus colonies became transparent and were obviously undergoing lysis.” After numerous experiments, Fleming identified the mold as a species of the fungus Penicillium (later identified as Penicillium notatum).
The mold had probably been borne on the air from another laboratory two floors below. Fleming showed that it was something that the mold secreted that inhibited bacterial growth in Petri dishes. He also showed that it was non-toxic to animals and humans, and that it could be filtered. Bacteria affected by the filtrate included staphylococci, streptococci, the gonococcus of gonorrhea, the meningococcus of meningitis, the pneumococcus of pneumonia, and the diphtheria bacillus. He called this substance penicillin. He used it externally on the eyes of two of his colleagues with conjunctivitis (infection of the outer membranes of the eyes). The infections cleared up and the penicillin produced no toxic effects. Fleming also applied it to the infected amputation stump of a patient in St. Mary’s without success.
Why penicillin was not developed there and then into a widely available antibacterial agent is not entirely clear. One factor was that Fleming was a notoriously ineffective communicator. During World War II, the movie star Marlene Dietrich met Fleming, by then a hero, at a dinner party, and he scarcely spoke a word; at the end of the evening, however, he gave her a tiny glass jar containing the first penicillin culture. Second, attempts by two of Fleming’s colleagues in the lab to extract large quantities of stable penicillin failed, possibly because Wright’s staff lacked the requisite skills in chemistry. Third, it is not clear how far Fleming foresaw the role of penicillin; he may have considered it useful only as an agent to assist bacterial culture in the lab. This relates to a final factor: the ethos of Wright’s lab was hostile to the drug treatment of infections and much more conducive to the search for methods of promoting the body’s natural defenses. Wright assumed that the doctor of the future would be a vaccinator.
Further attempts to purify penicillin were made in 1931 by Harold Raistrick (1890-1971), director of the department of biochemistry at the London School of Hygiene and Tropical Medicine. Fleming supplied the Penicillium mold. The achievement of Raistrick’s team was to grow it on an artificial medium rather than the bullock’s heart broth that Fleming used, which allowed the mold to be grown in larger quantities. Nevertheless, large amounts of penicillin were not yet forthcoming. Similar failures occurred in the United States. It must be remembered that those who set out to extract penicillin at this time did so with various goals in mind. None began a project to produce massive quantities of a safe, effective antibacterial drug. Only hindsight shows that this might have been possible.
In 1931, Cecil George Paine (1905-1994), an English physician, employed penicillin therapeutically. Paine was a pathologist in Sheffield who had studied with Fleming at St. Mary’s. Fleming sent Paine a sample of the mold, and he repeated Fleming’s experiments. Paine produced the antibacterial filtrate and tested it on patients. He tried it first externally on staphylococcal skin infections, without success; even today, penicillin is not recognized as especially useful in topical diseases. The next cases were four babies with eye diseases: two with staphylococcal infections and two with gonococcal infections acquired from their mothers. All were cured except one of the staphylococcal cases. His next case was an adult with an eye injury infected with pneumococci. Paine irrigated the eye with the filtrate for two days, and the man recovered. Paine, however, gave up further trials because of the difficulties in producing penicillin.
It is quite likely that the introduction of sulphonamides in the mid-1930s strengthened the belief that drugs, whether synthetic chemicals or biologically produced substances, rather than vaccines, were the answer to bacterial infections. It was Howard Florey (1898-1968) who successfully engineered the production of penicillin in quantity. Florey was born in Adelaide, South Australia. By virtue of scholarships, he studied for two years at Adelaide University Medical School and, in 1922, was awarded a scholarship to Oxford University. He then moved to Cambridge University to study pathology. He spent a year in the United States as a Rockefeller Foundation fellow in 1925 and, in 1931, was appointed professor of pathology at Sheffield. In 1935, he took the chair of pathology at the Sir William Dunn School of Pathology in Oxford. In Florey’s department was the biochemist Ernst Boris Chain (1906-1979), a German refugee who had fled the Nazis. Under Florey, Chain first studied lysozyme, the substance discovered by Fleming in 1922. From lysozyme, Chain and Florey moved to penicillin, as both compounds appeared to have a similar action on bacteria.
Florey and Chain’s laboratory ran far from smoothly, however, and the two scientists eventually communicated only in writing. Money was short. Salaries of assistants could only just be met, and, by 1939, Norman Heatley (1911-2004), a gifted expert on microtechniques, was about to leave the group. The university was not sympathetic, nor was the Medical Research Council, the government’s grant-giving body. Nothing like the charitable money that was available in the United States for research existed in Britain. On September 3, 1939, Britain and Germany went to war. In November, Florey made an application for support to the Rockefeller Foundation in New York, a massive source of medical research funding. Florey’s application was for a broad study of microbial antagonists; Penicillium notatum came last on the list. Once again, the possibilities of mass producing penicillin as an antibacterial were not apparent. Penicillin was included because the lab workers happened to be culturing it; lysozyme was then higher up on the list of the department’s accomplishments. Nevertheless, the Rockefeller Foundation provided funds for salaries and equipment.
The money enabled the workers to expand their studies. Chain and Heatley took up the chemical problems of isolating and purifying the penicillin. Like earlier researchers, they had problems with low yields and instability. Eventually, however, using a variety of solvents and freeze-drying the product, they produced a powdered salt of penicillin. On May 25, 1940, penicillin was tested on mice. Eight white mice were injected in the abdomen with virulent streptococci. Four of the animals were left untreated; the other four were treated with different doses of penicillin. The untreated animals died; all the treated ones recovered. Florey apparently remarked, “It looks very promising.” The paper describing this work was published in the medical journal The Lancet on August 24, 1940. Fleming learned of their work through this paper and paid them a visit. Chain reportedly said that until then he had thought Fleming was dead.
On January 17, 1941, the first human trial of penicillin was made on a woman dying of breast cancer. After the injection, the patient developed a fever, which the researchers thought might be due to impurities in the drug, although they also considered that the effects might have been caused by the penicillin itself. The researchers then produced what they considered a pure preparation, and another, now more famous, patient was treated. Albert Alexander, a patient in the Radcliffe Infirmary, Oxford, was a forty-three-year-old policeman who had scratched his mouth with a rose thorn. The scratch became infected with Staphylococcus aureus, which eventually covered his body with abscesses. He had already lost the function of one eye to the increasingly overwhelming infection, and sulfonamides had had no effect. On February 12, 1941, Alexander was given an intravenous drip of penicillin. Four days later, his fever subsided and the infected areas were healing. On the fifth day, the supply of penicillin (which was already being re-extracted from the patient’s urine) ran out. Alexander remained in his improved state for ten days, then quickly deteriorated. He died on March 15, 1941, a month after starting treatment.
Initial attempts to interest the British pharmaceutical industry in penicillin failed. Florey and Heatley traveled to the United States, where the Committee on Medical Research (CMR) had been formed in June 1941 to organize research on problems related to national defense. The Chairman of CMR, Alfred N. Richards (1876-1966), was a professor of pharmacology at the University of Pennsylvania and was well connected to the American pharmaceutical industry. Richards used his government position and industrial connections to back clinical tests and large-scale production of penicillin. In 1943, groups of U.S. investigators were accredited to use the antibiotic.
The first patients treated included the victims of burns following a fire at the Cocoanut Grove nightclub in Boston. There was then a study of penicillin in military orthopedic cases at Bushnell General Hospital in Brigham City, Utah, followed by one at Halloran General Hospital on Staten Island, New York. Bushnell and Halloran were used as schools to teach penicillin therapy to medical officers. It was known that penicillin could be produced in reasonably large quantities by a fermentation process. So much interwar drug success had been based on synthesis, however, that the American government and the pharmaceutical industry invested hundreds of thousands of dollars in attempts to produce synthetic penicillin, to no avail. The army, on the other hand, backed the fermentation process of producing natural penicillin. The first large supply was produced by the Merck Corporation and was delivered to Britain in May 1943.
Extensive assessments of penicillin in the British army were carried out under Florey’s supervision in Algiers in 1943 by the British neurosurgeon Hugh Cairns (1896-1952). Results there in infected wounds—infested with maggots in hospitals thick with flies—were impressive. The drug was found to be effective against many conditions, notably gonorrhea, a sexually transmitted disease that was producing a severe shortage of effective military manpower. Penicillin also transformed the treatment of burns. By the close of the war, penicillin was being produced in quantities that exceeded military requirements. Diseases of civilian life, such as common boils, pneumonia, meningitis, diphtheria, scarlet fever, and rheumatic fever, were all initially susceptible to penicillin. It is worth asking, however, what exact role penicillin had in the decline of these disorders, as improved diet and living conditions were already producing a marked decline in their incidence (especially of diphtheria) before penicillin became available. The sexually transmitted diseases syphilis and gonorrhea, if patients were prepared to turn up for treatment, were also responsive. For their work on penicillin, Fleming, Chain, and Florey shared the 1945 Nobel Prize in Physiology or Medicine. Heatley’s invaluable contributions were overlooked and, some scientists noted, Fleming’s might have been overvalued.
Penicillin is usually described as the first antibiotic. The French mycologist Jean Paul Vuillemin (1861-1932) coined the term “antibiosis” in 1889 to convey a process by which life could be used to destroy life. Selman Waksman (1888-1973), the discoverer of streptomycin, is credited with inventing the word antibiotic (although antibiotic is also a Victorian term that meant opposed to belief in the possibility of life; Waksman may not have known this). Its definition is by no means fixed today. Broadly, an antibiotic is a drug that destroys microorganisms, but some definitions limit use of the term to drugs produced naturally by living things. Others include as antibiotics synthetic imitations of natural drugs, and some include any synthetic antibacterial (which would of course make Salvarsan the first antibiotic). Effective and amazing though penicillin was, many conditions resisted penicillin treatment. The most obvious example was tuberculosis (TB), predominantly a bacterial disease of poverty and overcrowding. Penicillin, however, was the signal to researchers and drug companies to try to produce other antibiotics.
Streptomycin was discovered in the laboratory of a soil microbiologist, Selman Waksman, at Rutgers University in New Jersey. The son of poor Russian Jewish parents, Waksman studied in Odessa before the family immigrated to the United States in 1910. Waksman, whose family members were farmers, matriculated at Rutgers College and graduated in 1915 with a degree in agriculture. In 1916, he went to the University of California at Berkeley, and, in 1918, he was awarded a PhD in biochemistry. He then returned to Rutgers as a researcher.
In 1939, Waksman began a study of the substances by which soil microbes destroy each other and of their possible value in human therapeutics. In 1940, his team isolated actinomycin, a toxic substance with antibiotic properties, from an Actinomyces organism. In 1942, another antibiotic, streptothricin, was found. This, too, was toxic but was active against many bacteria, including the tubercle bacillus. Encouraged by the discovery of streptothricin and fired by the development of penicillin treatment, Waksman studied no fewer than 10,000 different soil microbes. One of his assistants, Albert Schatz (1920-2005), isolated a strain of Actinomyces, Streptomyces griseus (in a clump of dirt taken from the throat of a sick chicken), which had antibiotic activity. Waksman named it streptomycin. Testing in 1944 produced the satisfying result that streptomycin had a curative action against tuberculosis.
On February 9, 1945, Waksman and Schatz made a joint patent application for streptomycin. According to Schatz, Waksman asked him to assign the patent rights to the Rutgers Research and Endowment Fund. Schatz agreed. By 1950, royalties from streptomycin had grown to more than a million dollars per year. Schatz filed suit against Waksman and the Rutgers Research Foundation. On March 25, 1950, at a closed pretrial hearing, Waksman admitted that he had made money from streptomycin. He had earned $350,000 before he turned the patents over to Rutgers, and he still held a ten percent interest in the drug. Rutgers and Waksman agreed to a settlement. Schatz was given $125,000 and three percent of future royalties. Fifteen other scientists and twelve laboratory assistants also received awards. In 1952, Waksman was named sole recipient of the Nobel Prize in Physiology or Medicine for the discovery of streptomycin.
Modern Cultural Connections
By weight, two million pounds of antibiotics were produced in the United States in 1954. By 2000, this figure had increased to 50 million pounds. The drug industry became heavily involved in the profitable production of antibiotics after 1945. Within a few years, Parke Davis had produced chloramphenicol; Lederle Laboratories had extracted aureomycin from a soil actinomycete; and Pfizer had extracted terramycin, also from an actinomycete, found near its factory in New Jersey. Chloramphenicol, aureomycin, and terramycin were the first broad-spectrum (active against a wide range of bacteria) antibiotics. The most widely used broad-spectrum antibiotic, tetracycline, was developed by a chemist at Pfizer who realized that terramycin and aureomycin shared a core structure. Erythromycin and vancomycin were developed by Eli Lilly in the 1950s, especially for penicillin-resistant cases. Beecham produced broad-spectrum semisynthetic variants of penicillin such as amoxicillin. In the 1960s, the cephalosporins (related to the penicillins) appeared. By the late 1990s, there was concern among scientists that few new families of antibiotics were being discovered.
In 1945, in an interview with The New York Times, Fleming warned that the misuse of penicillin could lead to the development of resistant forms of bacteria. In Britain, the Penicillin Act of 1948 was intended to limit, through prescription, the public’s access to the drug. By the 1950s, hospitals were finding that almost 50% of Staphylococcus aureus strains were penicillin resistant. Newborn babies in hospitals were sometimes infected, and post-operative infections became common. Worries that further resistant bacteria would breed in animals and spread to humans led, in Britain, to the Swann Committee report of 1969 and legislative controls.
Antibiotic resistance continues to increase. Although much of the increase is due to the widespread use of antibiotics in medicine, extensive use in veterinary practice is also to blame. Aerosol forms of antibiotics are often sprayed on fruit orchards, where low doses may reach bacteria beyond the intended targets and promote resistance. In medicine, some of the blame lies with doctors who prescribe antibiotics on demand, even though they are ineffective against viral and other non-bacterial infections. Antibiotic resistance is also encouraged by patients failing to complete the full course of prescribed antibiotic treatment.
Resistance can take various forms, but all of them evolve through changes in bacterial genes. These genes can encode enzymes that degrade the antibiotic, or they can alter the antibiotic’s binding site on the bacterial cell. A resistance gene can code for a mutated membrane transport protein that stops the antibiotic from entering the bacterial cell, or it can create mechanisms for exporting the antibiotic as soon as it enters the cell.
One of the most challenging organisms to have appeared recently is methicillin-resistant Staphylococcus aureus (MRSA). The bacterium is common and harmless while residing on the skin of most people, but when it enters a wound in a susceptible person it can cause serious infections and, in many cases, death. As with nineteenth-century hospitalism, MRSA infections are of special concern when acquired or spread among patients already in the hospital. Precautions against such infections include disinfectants and the careful placement of patients and staff in hospitals.
Primary Source Connection
The heavy use of antibiotics in livestock and food animals is controversial. Many scientists assert that the overuse of antibiotics threatens human health by strengthening harmful bacteria and promoting antibiotic resistance. Representative Louise Slaughter (D-New York) introduced the following bill in the United States House of Representatives in 2007. The bill calls for increased study, monitoring, and regulation of antibiotic use in animal agriculture.
To amend the Federal Food, Drug, and Cosmetic Act to preserve the effectiveness of medically important antibiotics used in the treatment of human and animal diseases.
IN THE HOUSE OF REPRESENTATIVES
February 8, 2007
Ms. SLAUGHTER introduced the following bill; which was referred to the Committee on Energy and Commerce
To amend the Federal Food, Drug, and Cosmetic Act to preserve the effectiveness of medically important antibiotics used in the treatment of human and animal diseases.
Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,…
SEC. 2. FINDINGS.
The Congress finds that—
(1)(A) in January 2001, a Federal interagency task force released an action plan to address the continuing decline in effectiveness of antibiotics against common bacterial infections, referred to as antibiotic resistance;
(B) the task force determined that antibiotic resistance is a growing menace to all people and poses a serious threat to public health; and
(C) the task force cautioned that if current trends continue, treatments for common infections will become increasingly limited and expensive, and, in some cases, nonexistent;
(2) antibiotic resistance, resulting in a reduced number of effective antibiotics, may significantly impair the ability of the United States to respond to terrorist attacks involving bacterial infections or a large influx of hospitalized patients;
(3)(A) any overuse or misuse of antibiotics contributes to the spread of antibiotic resistance, whether in human medicine or in agriculture; and
(B) recognizing the public health threat caused by antibiotic resistance, Congress took several steps to curb antibiotic overuse in human medicine through amendments to the Public Health Service Act (42 U.S.C. 201 et seq.) made by section 102 of the Public Health Threats and Emergencies Act (Public Law 106-505, title I; 114 Stat. 2315), but has not yet addressed antibiotic overuse in agriculture;
(4) in a March 2003 report, the National Academy of Sciences stated that—
(A) a decrease in antimicrobial use in human medicine alone will have little effect on the current situation; and
(B) substantial efforts must be made to decrease inappropriate overuse in animals and agriculture;
(5)(A) an estimated 70 percent of the antibiotics and other antimicrobial drugs used in the United States are fed to farm animals for nontherapeutic purposes, including—(i) growth promotion; and
(ii) compensation for crowded, unsanitary, and stressful farming and transportation conditions; and
(B) unlike human use of antibiotics, these nontherapeutic uses in animals typically do not require a prescription;
(6)(A) many scientific studies confirm that the nontherapeutic use of antibiotics in agricultural animals contributes to the development of antibiotic-resistant bacterial infections in people;
(B) the periodical entitled ‘Clinical Infectious Diseases’ published a report in June 2002, based on a 2-year review by experts in human and veterinary medicine, public health, microbiology, biostatistics, and risk analysis, of more than 500 scientific studies on the human health impacts of antimicrobial use in agriculture; and
(C) the report recommended that antimicrobial agents should no longer be used in agriculture in the absence of disease, but should be limited to therapy for diseased individual animals and prophylaxis when disease is documented in a herd or flock;
(7) the United States Geological Survey reported in March 2002 that—
(A) antibiotics were present in 48 percent of the streams tested nationwide; and
(B) almost half of the tested streams were downstream from agricultural operations;
(8) an April 1999 study by the General Accounting Office concluded that resistant strains of 3 microorganisms that cause food-borne illness or disease in humans—Salmonella, Campylobacter, and E. coli—are linked to the use of antibiotics in animals;
(9)(A) in January 2003, Consumer Reports published test results on poultry products bought in grocery stores nationwide showing disturbingly high levels of Campylobacter and Salmonella bacteria that were resistant to antibiotics used to treat food-borne illnesses; and
(B) further studies showed similar results in other meat products;
(10) in October 2001, the New England Journal of Medicine published an editorial urging a ban on nontherapeutic use of medically important antibiotics in animals;
(11)(A) in 1999, the European Union banned the practice of feeding medically important antibiotics to animals other than for disease treatment or control, and prior to that, individual European countries had banned the use of specific antibiotics in animal feed; and
(B) those countries have experienced no significant impact on animal health or productivity, food safety, or meat prices, and more importantly, levels of resistant bacteria have declined sharply;
(12) in 1998, the National Academy of Sciences noted that antibiotic-resistant bacteria generate a minimum of $4,000,000,000 to $5,000,000,000 in costs to United States society and individuals yearly;
(13) a year later, the National Academy of Sciences estimated that eliminating the use of all antibiotics as feed additives would cost each American consumer less than $5 to $10 per year;
(14) the American Medical Association, the American Public Health Association, the National Association of County and City Health Officials, and the National Campaign for Sustainable Agriculture, are among the more than 300 organizations representing health, consumer, agricultural, environmental, humane, and other interests that support enactment of legislation to phase out nontherapeutic use in farm animals of medically important antibiotics;
(15) the Federal Food, Drug, and Cosmetic Act (21 U.S.C. 301 et seq.)—
(A) requires that all drugs be shown to be safe before the drugs are approved; and
(B) places the burden on manufacturers to account for health consequences and prove safety;
(16)(A) the Food and Drug Administration recently modified the drug approval process for antibiotics to recognize the development of resistant bacteria as an important aspect of safety;
(B) however, most antibiotics currently used in animal production systems for nontherapeutic purposes were approved before the Food and Drug Administration began giving in-depth consideration to resistance during the drug-approval process; and
(C) the Food and Drug Administration has not established a schedule for reviewing those existing approvals; and
(17) certain non-routine uses of antibiotics in animal agriculture are legitimate to prevent animal disease; and
(18)(A) an April 2004 study by the General Accounting Office concluded that Federal agencies do not collect the critical data on antibiotic use in animals that they need to support research on human health risks; and
(B) the report recommends that the Department of Agriculture and the Department of Health and Human Services develop and implement a plan to collect data on antibiotic use in animals.
SEC. 3. PURPOSE.
The purpose of this Act is to preserve the effectiveness of medically important antibiotics used in the treatment of human and animal diseases by phasing out use of certain antibiotics for nontherapeutic purposes in food-producing animals …