Infant and Child Nutrition

Sara A Quandt. Cambridge World History of Food. Editors: Kenneth F Kiple & Kriemhild Conee Ornelas, Volume 2, Cambridge University Press, 2000.

Infant Feeding in Prehistory

Prehistoric patterns of human infant feeding are important for understanding the forces that have shaped the nutritional requirements of today’s infants and the biological capacities of their mothers to satisfy these requirements through lactation. The primary sources of data to reconstruct such patterns are observational studies of nonhuman primates in the wild, particularly the great apes, and ethnographic studies of contemporary peoples whose nomadic lifestyle and level of material culture probably approximate those of the first humans.

Within the mammalian class, virtually all primates—prosimians, monkeys, and apes, as well as humans—follow a K-strategy of reproduction. That is, they have a small number of infants, most born as singletons, in whom considerable parental attention and effort are invested. The consequence is that a relatively high number of offspring live to adulthood, with the success of such a strategy depending on close physical contact for protection. Thus, non-human primate parents carry, rather than cache, their infants—a constant contact that is reinforced by primate milks. These are uniformly high in sugar and low in satiety-producing fat—a milk composition more suited to a frequent-snacking pattern of eating than to one of large, isolated meals. Frequent nursing episodes enhance the amount and, possibly, the energy density of milk (Quandt 1984a). They also delay the return of ovulation, leading to longer birth intervals. Such a biobehavioral complex involving infant feeding, infant care, and birth spacing was doubtless encouraged by the selection process in primate evolution, and consequently, infant feeding became an integral part of the reproductive process, as is demonstrated by studies of human foragers.

The infants of the latter are normally kept in skin-to-skin contact with the mother, and nursing takes place frequently upon demand. Observational studies of the !Kung San of the Kalahari Desert, for example, show that mothers nurse their infants only briefly, but as often as four times per hour during the daytime for an average of two minutes per feeding. Cosleeping of mothers and infants is the rule, and infants also nurse at will during the night (Konner and Worthman 1980). Birth intervals in this noncontracepting group average 44 months (Lee 1979). Weaning among hunter-gatherers is generally gradual, with complete severance not occurring until the next child is born. Premasticated foods are gradually added to the diet, because there are few available foods that can be otherwise managed by very small children.

Infant Feeding in the Historical Period

From the Neolithic to the Renaissance

The Neolithic Revolution brought significant alterations in infant- and child-feeding practices. This transition from a foraging subsistence base to one that incorporated a variety of food production techniques began some 8,000 to 10,000 years ago and was first accomplished in at least six places, including Egypt, Mesopotamia, highland Peru, Mesoamerica, the Indus Valley, and China. Accompanying these changes in food sources were the origins of urbanism, stratified social organization, and writing and record-keeping, along with a host of other cultural traits that joined to comprise what we regard as civilization.

Data on infant-feeding practices come from a variety of artifacts of these civilizations. In addition to archaeological data, there are documentary sources, including art, inscriptions, and early writings. Interpretation of these data is aided by ethnographic analogy with contemporary nonindustrialized societies.

Food production, as opposed to foraging, brought two major changes in the feeding of young children. One of these was the addition of starchy, semisoft foods to the infant diet as a complement to breast milk. Food production meant that the grains or tubers from which such foods could be made were now dependable staples. Cooked into gruels of varying consistency, these foods offered infants and young children relatively innocuous flavor, readily digestible carbohydrate composition, and concentrated energy. Such foods, however, could lead to infectious diseases when contaminated and to equally serious undernutrition, as vegetable foods are deficient in essential amino acids. Consequently, if they were given in place of breast milk, they made for a protein-deficient diet.

With the second change, which came with the domestication of such animals as cows, sheep, goats, and camels, nonhuman milks became available to supplement or even to replace breast milk and to serve as nutrient-rich foods for children after weaning. Although there is no record of such milks in the New World or in the civilizations of eastern Asia, their use is well documented elsewhere.

Together, these changes in feeding were closely linked to the population explosion that accompanied the Neolithic. The newly available infant foods could shorten the period of intensive breast feeding, leading to shorter periods of postpartum infecundability and, thus, shorter intervals between births. Shorter intervals, in turn, required substitute foods for infants abruptly displaced from the breast by subsequent newborns, and animal milks and cereals served as such substitutes.

This does not, of course, mean that breast feeding was abandoned. Early civilizations of the Near East (c. 3000 B.C.) left abundant evidence of their infant-feeding attitudes and practices. Sculptures, paintings, and inscriptions all indicate that mothers were highly regarded and that breast feeding was practiced. For Egypt, medical papyri from about 1550 B.C. contain remedies and recommendations for a variety of nursing problems still addressed today. These include ways to increase milk supply, to ease breast pain, and to evaluate milk quality. Ointments, incantations, and magico-religious spells are also included.

Wet nursing seems to have been a well-accepted practice in Egypt and Mesopotamia. Among the ancient Egyptian royalty, wet nursing was apparently the usual method of feeding highborn infants, with the nurses accorded high status in the royal court (Robins 1993). During periods of Greek and Roman rule in Egypt, the practice changed so that wet nursing became even more widespread and was frequently performed by slave women. In addition to wet nursing, there is pictorial evidence of infants being fed animal milks from horn-shaped and decorated vessels. We cannot know how frequently such artificial feeding was practiced, but it seems likely that most infants were breast-fed. In the case of wet nursing, there were contracts that specified how long a wet nurse should suckle a child and how the child should be weaned before being returned to the parents (Fildes 1986).

Near Eastern texts also prescribed the weaning diet of young children. After several months of exclusive breast feeding, it was recommended that animal milks and eggs be added to the diet and that animal milks become the principal food after weaning, which occurred at 2 to 3 years of age. Other foods were not introduced until teeth were cut, and these were primarily fruits and vegetables. The feeding of corn and pulses to weaning infants is described in the Old Testament.

There is limited mention of infant feeding in the writings of the philosophers of ancient Greece. These make it clear that infants were breast-fed, usually from the first day after birth, but there is no indication if this initial feeding was done by the mother or another woman. (In this connection, Aristotle noted that the early milk is salty.) Wet nurses were used by the wealthy, but there was regional and temporal variation in whether they resided with the family and whether they were slaves. We have little evidence to indicate when children nursed by their own mothers were weaned or how, although it is probable that animal milks, cereals, and honey were employed in the task. Pottery feeding vessels of various designs have been found that were probably used during and after weaning to feed animal milks to children.

Roman physicians from the first and second centuries A.D. were the first to focus significant attention on the health of infants. The most influential of these, Soranus of Ephesus, wrote on gynecology in the second century, and his works were translated and circulated throughout Europe during medieval times (Atkinson 1991). The writings of Soranus and his contemporaries indicate some controversy over whether new arrivals should be given the mother’s breast immediately, or whether a wet nurse should substitute until the mother’s mature milk came in. There was also disagreement about whether or not newborns should be given purgatives to help remove the meconium.

Physicians discussed the personal qualities of wet nurses in their writings because there was widespread belief that those qualities could be transmitted via the nurses’ milk. Thus, Spartan nurses were highly valued for their ability to transmit stamina and good physical and mental health (Fildes 1986). Despite the recommendation of physicians that other foods be withheld until teeth were cut, their writings indicate that women were feeding cereals to infants as young as 40 days. Writers disagreed over giving wine to infants, as well as over the value of prechewed foods. Although the literary sources do not mention their use, feeding vessels are common artifacts from Roman times. Ages recommended for weaning ranged from 2 to 3 years (deMause 1974).

Ancient Ayurvedic medical texts from the second millennium B.C. through the seventh century A.D. contain clear and consistent discussions of pediatrics and infant feeding in India. Practices in the first few days of life were ritualized and quite similar to those still employed in India and among those of Indian heritage elsewhere. These included the feeding of ghee (clarified butter) and honey mixed with other substances in a specific order and at regular intervals over three days. On the fourth day, milk was expressed from the mother’s breasts, and then the infant was put to nurse. As in other civilizations, a wet nurse was employed if the mother was unable to nurse, and there is mention in the texts of the use of cow’s and goat’s milk if breast milk was insufficient. However, no feeding vessels are mentioned in the literature or have been found as artifacts from ancient India, so the extent (and method) of artificial feeding is unknown.

The concept of balance is found throughout the Ayurvedic writings on infant feeding. A wet nurse, for example, was to be neither fat nor thin, neither fickle nor greedy (Fildes 1986). The quality of milk could be disturbed by extreme emotions and could be brought back into balance with a specific diet. The first non-milk baby food in India was rice. The timing of this feeding varied, but in many cases coincided with the eruption of teeth. Weaning from the breast occurred in the third year.

Although the texts from ancient civilizations do not link feeding practices to infant health, some of those mentioned, such as giving infants cereals or animal milks within the early weeks of life, must have led to illnesses. Contamination of these foods probably occurred with some frequency, as climates were hot and sanitary practices for food storage were not well developed. Preference for spring weaning in some places, for example, was probably based on seasonal patterns of disease and the known association of weaning and disease.

Careful studies of populations undergoing the change from foraging to food production elsewhere indicate that an epidemiological transition also took place. The skeletal sequence from the Dickson Mounds, Illinois (A.D. 950 to 1300), for example, shows evidence of lower life expectancy at birth with the shift to food production (Goodman et al. 1984). Skeletons and dentition of adults bear evidence of infectious disease episodes and some deficiency diseases (Cassidy 1980). These same nutrition-related health difficulties probably characterized many populations of settled horticulturalists, with greater problems present among those who were most crowded and whose resources were seasonally strained.

From the Renaissance to the Eve of Industrialization

Most data bearing on infant and child nutrition during the centuries spanning the Renaissance and the beginning of industrialization are from Europe and its colonies. During this period, Europe was characterized by changing medical knowledge, social divisions, and cultural patterns. At the same time, great portions of the rest of the world were changed by the diffusion of European customs, much of this occurring within the context of slavery and colonial domination. In Europe and in the colonies, most infants of the sixteenth century were breast-fed by their mothers, and when mothers could not or did not nurse, wet nurses were employed. There were class differences in feeding practices throughout the period, with change coming sometimes from the upper and sometimes from the lower classes. There were regional differences as well, which make generalizations difficult and trends hard to describe in a simple fashion.

Neonatal feeding focused on the use of purgatives to help remove the fetal meconium from the intestines. It was believed that the consumption of milk before purging the meconium could lead to a dangerous coagulation in the bowels. The use of purgative substances such as almond oil, butter, and honey continued from ancient times, while wine, pharmaceutical purges (such as castor oil), and others, like salt water or soap, became popular in the late seventeenth century. These were administered at regular intervals over the first few days of life, a period during which women were advised not to breast-feed their infants.

In fact, the period of withholding breast milk could range from a few days up to a month. This was in part because a mother was believed to need time to recover from the delivery and, also, to become “clean” after the vaginal discharge had stopped. But in addition, the infant was thought to be at risk of harm from the mother’s impure colostrum and, as already mentioned, from the coagulation of colostrum and meconium (Fildes 1986). The effect of this delay in breast feeding was to deprive the infant of the immune properties of the colostrum and to cause dehydration and weight loss. In addition, it disrupted the supply and demand nature of lactation, so that mothers might lack sufficient milk to feed their babies properly.

But such ideas began to change in the mid–seventeenth century following the publication of a number of medical and midwifery texts that advocated the feeding of colostrum, although it doubtless took some time for such new notions to be put into practice. A major impetus for such change came from English lying-in hospitals, where, by 1750, purgative use was discouraged and breast feeding of infants within 24 hours was advocated to reduce the incidence of breast abscesses and infections, both of which were major causes of maternal mortality. However, an important by-product of this practice was a marked decline in infant mortality as well (Fildes 1980).

These lying-in hospitals were used by poor women for delivery and were the site of medical training for both physicians and midwives who became advocates of early breast feeding. At the same time, upper-class women were beginning to read and to be influenced by treatises on infant care that incorporated the new medical ideas about early feeding. These changes in neonatal feeding were probably more responsible for the decline in infant mortality observed in England from the late preindustrial to the industrial period than any other factor (Fildes 1980).

During the preindustrial era, maternal breast feeding was practiced by most mothers after the neonatal period. They nursed infants with no supplemental foods for at least one year and usually continued breast feeding well into the second year of life until the infant was capable of consuming the family diet. Weaning from the breast was sometimes gradual and sometimes abrupt, with bitter substances smeared on the breast to deter the child; or the child was sent to stay with friends until weaning was accomplished (Treckel 1989).

The actual timing of weaning was dictated by the seasons and by the dental development of the child. The fall and spring were believed to be the best seasons for severance from the breast, and weaning in the summer and winter months was discouraged. This was because of presumed dangers of summer infections and the perils of winter cold. Children were also believed to be at risk of illness and death until most of their teeth had erupted, and so weaning was postponed until this had occurred. Nipple shields were sometimes used to reduce discomfort for mothers.

Until the late seventeenth century, upper-class women were more likely to employ wet nurses than to breast-feed children themselves. Although women rationalized the practice as sparing the infant the delay in feeding caused by the withholding of colostrum, it probably owed more to ideas about sexuality and gender roles held by upper-class women. Because sexual intercourse was believed to hasten the resumption of menstruation and to impair both the quality and quantity of breast milk, sexual abstinence until weaning was encouraged by medical authorities. The employment of a wet nurse avoided this issue, and husbands were normally responsible for the decision to seek a wet nurse. In so deciding, husbands were able to assert patriarchal authority and establish “ownership” of the child. In other words, having a wet nurse rather than the mother feed the child (especially when the child was sent away to be nursed) focused attention on the woman as wife and diminished her identity as mother (Klapisch-Zuber 1985).

If wet nursing solved some problems, however, it brought with it others. Because breast milk was believed to transmit the characteristics of the woman who produced it, care had to be taken in choosing and supervising a wet nurse. The possibilities for the transmission of influences ranged from temperament to physical attributes such as hair color and complexion. It was best if a boy’s wet nurse had herself borne a boy, and a girl’s borne a girl, so that appropriate gender demeanor would be transmitted.

Earlier in this period, there was general acceptance throughout Europe of women who employed wet nurses, although many physicians and moralists condemned the practice. But after the Reformation, there was considerably more sentiment against wet nursing. Indeed, mothers who did not breast-feed were portrayed as evil and self-indulgent in both popular tracts and in sermons (Lindemann 1981), and there was concern that the practices of upper-class women might spread to the lower classes. In the American colonies, wet nursing was deplored because of the Puritan ethic, which encouraged women to devote themselves to motherhood and not indulge their sensual urges. Women who placed their infants with wet nurses were criticized for being vain and sinful (Treckel 1989).

Such sentiments apparently also resulted in a reduced use of wet nurses among members of the stricter Protestant sects in England and in some countries on the Continent (Lindemann 1981; Fildes 1986). However, in other places, most notably France, wet nursing continued at all levels of society. Elsewhere, by the late eighteenth century, upper-class women had begun to nurse their own infants. But in France, the use of wet nurses began to spread from the upper to the lower social strata, resulting in an increasing reliance on rural wet nurses by the lower and working classes of growing urban populations (Sussman 1982; Fildes 1988). Infants were sometimes sent a considerable distance into the countryside to live with a wet nurse, not returning until the prescribed age of weaning, which could be several years of age. This brought with it considerable infant mortality, as there was little supervision of wet nurses to ensure that infants were fed and cared for.

Perhaps because of its prevalence, however, wet nursing was more regulated in France than elsewhere. Starting in the seventeenth century, a number of recommandaresses were authorized in Paris by local judiciaries to require registration of those wet nurses who took infants home with them to rural parishes. These recommandaresses were later consolidated, and regulations were added to protect the lives of the children in question. For example, nurses were forbidden to take on more than two infants at a time, and still later, additional regulations set the prices for wet nursing. In 1769, the recommandaresses were abolished and replaced with a single authority that was designed to enhance communication between parents and nurses and to enforce the payment of wages to the nurse. Although the intentions of these regulations may have been good, the wet-nursing business was, in fact, highly entrepreneurial, with individuals contracting to bring infants to nurses in the countryside, competing against one another to do so, and more concerned with profits than with the welfare of parents, nurses, or infants. Infant mortality rates were extremely high.

One of the reasons wet nursing flourished in France was the doctrine of the Catholic Church, which solved the problem of the taboo on sexual intercourse during breast feeding by recommending that infants be placed with a wet nurse so that husbands could enjoy the conjugal relations they felt were owed to them. Other factors also contributed to the extensiveness of wet nursing. As we just noted, by the eighteenth century, large numbers of urban Frenchwomen had entered the workforce, and their work as artisans, shopkeepers, and domestics for upper-class families provided little opportunity to care for a baby. The crowded nature of living and working conditions in cities also made nursing a baby difficult, and a lack of safe breast milk substitutes left wet nursing as the only alternative.

The decline in wet nursing in France and elsewhere has been attributed to a number of factors. The foremost cultural theory—that of the “discovery of childhood”—has been the subject of considerable debate (Bellingham 1988). Philippe Aries (1962) has argued that in the past, child mortality was so high that adults invested little money or interest in children, who were frequently neglected or abused; wet nursing was, according to Aries, one form of such neglect. However, Edward Shorter (1976) has viewed the maternal investment in children less than 2 years of age as a product of modernization. As Enlightenment philosophers and physicians began to attack wet nursing and other child-care practices, upper-class mothers began to devote themselves to feeding, rearing, and caring for their own children. Such concerns for the welfare of the child gradually trickled down to the lower classes.

Infant Feeding in the Industrial Era

The Industrial Revolution of the nineteenth century brought dramatic changes in lifestyle that had drastic effects on infant-feeding practices. People flowed from traditional towns and villages to large urban communities, and with the concentration of the population in cities came overcrowding, contaminated water and milk supplies, and infectious diseases spawned by poor sanitation and crowded living conditions. Women entered the labor force in large numbers and, for both men and women, work was no longer based at home.

Enhanced access to education increased the literacy level of the population, and magazines, newspapers, and books became widely available. Included among these were infant-care manuals and numerous advertisements for infant foods, all of which heightened public demand for new products and services that would benefit their children (Fildes 1991). Moreover, improvements in the recording of vital statistics in the second half of the nineteenth century increased awareness of high infant-mortality rates, and—as inadequate nutrition was seen as a cause of disease and death—this also stimulated interest in infant feeding.

Research-oriented physicians and scientists developed theories of infant nutrition, and both practitioners dealing with breast-fed infants who were not thriving and mothers having problems with breast feeding sought nutritional solutions (Apple 1987). Although mothers’ milk was recognized as the best food for infants, that milk had to be of optimal quality, and this, both physicians and mothers believed, was not always the case. Breast milk quality might be compromised by the mother’s health, her behavior, even her disposition. Factors alleged to be necessary for the production of adequate milk ranged from consuming a good diet and getting sufficient rest to avoiding strong emotions and physical labor. Because these factors were thought to compromise the milk of wet nurses as well as mothers, wet nursing increasingly fell into disfavor.

Because wet nurses were used less, and women were working outside the home in large numbers, infants were fed foods other than breast milk by their caretakers (often, young girls) during the working day. Paps and porridges were made, frequently with bread and water flavored with a little milk and sugar, and kept warm all day on the stove to be fed as needed. When this food failed to soothe an infant, commercial baby tonics were given. Containing laudanum, these were probably very successful at quieting a hungry or fretful baby.

The infant-feeding practices of working-class Europe spread to other parts of the world as populations came under the economic domination of England and the other colonizing countries. Plantation systems and other labor-intensive industries needed a constant supply of workers, and returning mothers to work as soon as possible after giving birth was in the interest of the dominating powers. In the English-speaking Caribbean, for example, the African practice of prolonged breast feeding was discouraged among slaves. After returning to work, mothers left their infants in creches where they were fed paps and panadas by elderly caretakers in between nursing breaks (King and Ashworth 1987). These early supplemental foods were very similar to those fed in England at the time (Fildes 1986).

Such economic and social changes in the last half of the nineteenth century, as well as research into the chemical composition of milk in Europe and the United States, paved the way for the use of artificial milks or formula milks. Certainly, women were ready to adopt such milks. Most middle-class households could not afford a servant to breast-feed infants while the mother was engaged in activities outside the home, and working-class women had to wean their infants at a very young age to enter the labor force.

The earliest scientists to make a substantial impact on feeding practices through the development of formulas were Philip Biedert in Germany and Arthur Meigs in Philadelphia (Wood 1955). Biedert was the first to suggest modification of cow’s milk, noting its high protein content and hard-curd consistency relative to human milk. He recommended the addition of cream, whey, and sugar to make cow’s milk more digestible for human infants. Meigs published nutrient analyses of human and cow’s milk in 1884, which showed the higher carbohydrate and lower protein and fat content of human milk. He was able to demonstrate that the protein of human milk formed a softer curd than cow’s milk and thought that the addition of lime water was important to make cow’s milk more alkaline. In combining the work of Biedert with his own, Meigs was able to produce what was to be one of the most popular formulas for the transformation of cow’s milk into an approximation of human milk.

The problem of infant diarrhea provided the research impetus for therapeutic artificial milks, and two schools of thought developed. In the United States, Thomas Morgan Rotch (a professor of pediatrics at Harvard University) and his followers developed the complex “percentage method,” based on Meigs’s recommendations. Rotch advocated individualized infant-formula prescriptions that required considerable mathematical calculation and a hospital laboratory to produce.

In Europe at the same time a “caloric method” was developed by Heinrich Finkelstein, who hypothesized that infant diarrhea resulted from the fermentation of carbohydrates in the intestine. This theory led to the invention of “Eiweissmilch,” a low-carbohydrate, high-protein milk produced by a process of sieving curdled casein back into artificially soured milk. In contrast to Finkelstein, Adalbert Czerny concluded that infant diarrhea resulted from an intolerance of milk fat. Almost simultaneously, he developed a mixture of butter and flour that was added to milk.

By 1910, the percentage method had fallen into disfavor, and pediatricians were noting a disturbing number of occurrences of deficiency diseases, such as scurvy, among infants fed the artificial milks. These infants also suffered high rates of mortality (Levenstein 1983). Reformers advocated a series of different measures, ranging from educating the poor to breast-feed to regulating the milk supply so as to guarantee its cleanliness. But while physician-scientists were developing formulas that could be prescribed to mothers for home preparation, chemists were devising other alternatives to breast milk, and the success of these products indicated the existence of a previously unrecognized market. The subsequent commercial marketing of formulas, in contrast to their prescription by physicians, once again gave control of infant-feeding decisions to mothers, as in the past when they had decided between breast-feeding or employing wet nurses. By the end of the nineteenth century, mothers could choose from viable alternatives to breast feeding, and in the twentieth, their preferences made artificial feeding a cultural norm.

Such changes in infant feeding were part of the transformation of medicine into a scientific profession, and at the turn of the century, science was highly valued and viewed as the key to resolving numerous important problems (Rosenberg 1976). As physicians obtained more and more scientific knowledge, they became increasingly regarded as experts with privileged information (Apple 1987).

The “Scientific” Era

Infant feeding in the 1930s and 1940s differed somewhat from earlier practices; there was a general acceptance of simpler infant formulas, and researchers themselves were involved in the commercialization of formulas. Thereafter, as the dangers and difficulties of artificial feeding lessened, the flurry of research on infant nutrition declined as well (Apple 1987). By the middle of the twentieth century, most babies in the United States were bottle-fed with artificial milks, as were a high percentage of babies in the industrialized countries of Europe and in those under European influence. The belief was that artificial feeding, with its scientific basis and medical direction, was equal or superior to breast feeding.

No wonder then that survey data from the United States, Europe, and European-influenced countries showed a consistent decline in breast feeding during the twentieth century. The proportion of infants ever breast-fed in the United States dropped from 77 percent of those born between 1936 and 1940 to 25 percent by 1970. The duration of breast feeding also declined, from a 1930s average of 4.2 months to 2.2 months in the late 1950s (Meyer 1958, 1968; Hirschman and Hendershot 1979; Hendershot 1980, 1981).

Table VI.7.1. Percentage of first-born infants ever breast-fed between 1951 and 1970 in the United States, by ethnic group and education
Source: From Hirschman and Hendershot (1979).

Category 1951–55 1956–60 1961–65 1966–70
Ethnic group
White 49 43 39 29
Black 59 42 24 14
Hispanic 58 55 39 35
Education
< 9 yrs 62 53 40 32
9–11 yrs 50 40 29 17
12 yrs 45 40 32 23
13–15 yrs 57 48 50 35
> 15 yrs 46 50 69 57

There were striking demographic patterns of breast feeding during this period (Table VI.7.1). Although breast feeding was more common among blacks than whites earlier in the twentieth century, its subsequent decline was greater among blacks so that only 14 percent breast-fed by 1970, compared with 29 percent of whites and 35 percent of Hispanics. The relationship of education to breast feeding changed as well. In the early 1950s, the practice was most common among women with lower educational levels; better-educated mothers were less likely to breast-feed. But this trend was reversed by the 1970s—a phenomenon interpreted as a “trickling down” of values and behaviors from upper- to lower-class women (Salber, Stitt, and Babbott 1958; Meyer 1968). The incidence of breast feeding in the United States reached a low of 22 percent in 1972—a downward trend paralleled in England (Newson and Newson 1963) and elsewhere.

Although recommendations from professional medical groups stressed breast feeding throughout this period, a variety of factors worked against it. One was the continued high value placed on science and technology and their applications to medicine (Apple 1987), as physicians and nurses learned more and more about artificial feeding and less and less about breast feeding. Moreover, birth became more of a medical event, and mothers who delivered in hospitals frequently stayed as long as two weeks. During that time, infants were often artificially fed because they were separated from their mothers and only brought to them to be nursed at fixed intervals. Mothers were instructed to wear face masks while breast-feeding, their nipples were washed before feeding, and they were not allowed to hold their babies afterward. All these practices virtually guaranteed that mothers’ milk supplies would be inadequate and that their babies would require formula to supplement or replace breast milk.

That such hospital practices contributed to breast-feeding problems was recognized by some physicians (for example, Aldrich 1942). However, professional medical journals included recommendations from other practitioners who advised mothers to feed thickened cereal and other foods from an early age (Clein 1943; Stewart 1943). Others advocated feeding schedules, and some even recommended reducing the number of feedings to match family mealtimes (Clein 1943), reflecting cultural themes of regimentation and discipline applied to the infant (Millard 1990).

As already noted, by the 1970s the downward trend in breast feeding was reversed in the United States. More mothers initiated breast feeding, and they breast-fed for longer periods of time (Martinez and Nalezienski 1979; Hendershot 1981). This return to breast feeding paralleled the earlier decline, although now it was women of higher socioeconomic status who rediscovered the practice. As with the development of formula in the first half of the century, popular books and pamphlets stressed the scientific aspects of the “new” feeding method, which, in the case of breast feeding, focused on greater disease resistance, prevention of allergies, and enhanced mother–infant bonding.

The return to breast feeding in Western countries in the 1970s was fueled by growing health activism and advocacy by women. As women discovered that they knew relatively little about their bodies and that they were dependent on the largely male medical community, lay efforts of women educating women grew (Boston Women’s Health Book Collective 1976). La Leche League had been founded in 1956 as an organization focused on providing mothers with practical knowledge of breast feeding (La Leche League International 1963). The league filled a void. Few physicians had received any training in breast feeding, and lay breast-feeding knowledge had been lost as women turned to bottle feeding.

Breast feeding as promoted by La Leche League clashed sharply with other aspects of American culture. The League’s promotion of extended breast feeding until a child weaned itself, often at 2 or 3 years of age, was at odds with cultural values of fostering independence. Infant-centered feeding patterns, with frequent nursing during the day and nighttime nursing, contrasted with the prevailing norm of scheduled feedings and use of the clock (Millard 1990). Comparing studies of League and non-League mothers shows the sharp contrast in feeding styles. Whereas League mothers nursed an average of 15 times per day (Cable and Rothenberger 1984), non-League mothers averaged only 7 (Quandt 1986).

The upward trend in breast feeding reached a peak in 1982 with 61 percent of newborns breast-fed and two-thirds of those still breast-feeding at 3 months of age (Martinez and Krieger 1985). But as the decade wore on, interest in breast feeding once again eroded, and by its end only 52 percent of newborns and 18 percent of 6-month-olds were breast-fed. The decline in initiation and duration was greatest among non-white mothers, younger mothers, and those with less education (Ryan et al. 1991). A similar decline in breast feeding was recorded in Great Britain for the same period (Emery, Scholey, and Taylor 1990).

There are several possible explanations. Alan S. Ryan and colleagues (1991) suggest that there was a decline in attention given breast feeding in the public press at the same time that manufacturers of formula began to aggressively market their products through television and direct mailings to new parents. Economic pressures may also have contributed, because mothers in the labor force breast-fed less (Ryan and Martinez 1989). Indeed, even those not working outside the home expressed the need to be unencumbered by breast feeding if the need or opportunity to work arose (Wright, Clark, and Bauer 1993).

In addition to this cultural explanation of a decline in breast feeding, other scientists combined biological and behavioral perspectives into explanatory models that drew on rapid advances (during the 1980s) in knowledge of the hormonal control of milk production and milk volume (Stuart-Macadam and Dettwyler 1995). They pointed to the need for frequent stimulation of the nipple to maintain milk production (Quandt 1984b) and for baby-led feeding schedules to allow infants to regulate their own hunger and satiety levels through fat intake (Woolridge 1995). Sara A. Quandt (1984b) demonstrated that the introduction of beikost (nonmilk foods) to the diet of infants in the first 3 months of life resulted in declines in the number of breast feeds per day and, ultimately, in the duration of breast feeding. Thus, shorter durations of nursing at the population level appear to have a biobehavioral basis and are not simply the result of marketing of formula or women working outside the home.

Twentieth-Century Infant Feeding in the Developing World

In the second half of the twentieth century, considerable concern arose over changes in infant-feeding patterns occurring in developing countries. This was especially the case with changes thought to be the result of the marketing malpractice of multinational corporations (Chetley 1986). Scores of reports documented the promotion and sale of infant formula in developing countries, often to mothers who could not afford formula and who lived in circumstances where it could not be hygienically prepared. As a result, it was charged, low-income families were spending a large proportion of their incomes on products that were actually dangerous to the health of infants and also, because of the availability of breast milk, unnecessary. Indeed, dependence on formula meant erosion of breast-feeding knowledge and practices and increased poverty and infant mortality. Horror stories of “bottle babies” fed contaminated and overdiluted formula drew international attention, and pictures of infant graves marked by nursing bottles aroused anger toward the companies promoting the products.

After the publication, in 1974, of The Baby Killer (Muller 1974) and its German translation, Nestlé filed a libel suit against the Third World Action Group responsible for the translation. Although the group was found guilty of misrepresenting the formula manufacturer, it was assessed only a nominal fine, whereas the conduct of the formula manufacturer drew considerable public outcry. A number of public action groups joined to form the Infant Formula Action Coalition (INFACT), and in 1977, INFACT began to promote a consumer boycott of all Nestlé products. The boycott was extremely successful, with participation by large numbers of citizens as well as governmental agencies in the United States and Canada (Van Esterik 1989). In 1981, the World Health Organization (WHO) and the United Nations International Children’s Emergency Fund (UNICEF) adopted a code of marketing with which Nestlé agreed to abide. INFACT called off the boycott in 1984.

Scientific research on infant-feeding patterns in developing countries, however, provides a more complex picture than that conveyed by the problems arising from the simple substitution of bottled formula for breast feeding. In some areas, new feeding modalities and products were added to existing breast-feeding practices. The contents of infant-feeding bottles were highly varied. There was inter- and intracultural variation in how breast feeding was practiced before the promotion of formula by multinational corporations. For example, mixed breast and artificial feeding had been common in the Caribbean for centuries, whereas extended breast-feeding had been the norm in Africa and Asia (King and Ashworth 1987).

Data gathered around 1980 from large, nationally representative samples for the World Fertility Survey in 17 countries make possible some tentative generalizations about the state of breast-feeding practices in the second half of the twentieth century (Popkin et al. 1983). In Asia and the Pacific Islands, the practice continued to be very common, with virtually all mothers initiating breast feeding. Over 90 percent of infants were still breast-fed at 3 months of age and between 60 and 90 percent at 12 months. Breast feeding was slightly more common and of longer duration in rural than in urban areas. Supplementation of breast feeding was often delayed until after 12 months.

In Latin America, by contrast, as many as 20 percent of infants were never breast-fed, and by 6 months of age, less than 60 percent were breast-fed. The vast majority were weaned by 12 months. Rural–urban differences were pronounced, with more and longer duration of breast feeding in rural areas.

African breast-feeding rates were intermediate between those of Asia and Latin America. Overall, these findings indicate that a general decline in breast-feeding duration, but not breast-feeding incidence, had occurred. This change is probably tied to overall patterns of modernization (for example, changes in women’s participation in the labor force and changes in postpartum sex taboos) and not exclusively to the promotional effects of formula manufacturers.

International interest in infant-feeding practices in developing countries stems from a broader and longstanding concern about child survival and the effects of malnutrition on the long-term functional development of children (Mosley and Chen 1984). The link of breast feeding of short duration and abrupt weaning to kwashiorkor and extended, unsupplemented breast feeding to marasmus was established by the 1960s (Williams 1955; McLaren 1966; Cravioto et al. 1967) and attributed to a variety of adverse biological and social factors. Both human and animal studies demonstrated that the long-term consequences of these severe forms of protein–energy malnutrition included impaired mental development (Pollitt 1969; Winick 1969). More recently, a series of intervention and observational studies has shown that even marginal malnutrition results in growth stunting in early life and continued functional impairments later in life (Allen 1995). The results of such studies make it clear that achieving adequate nutrition for infants is the foundation for optimal health in whole populations.