H. Leon Abrams. The Cambridge World History of Food. Editors: Kenneth F. Kiple & Kriemhild Conee Ornelas. Volume 2, Cambridge University Press, 2000.
Early Humankind
Vegetarianism is a cultural and social, rather than a biological, phenomenon. Anatomically and physiologically, the digestive organs of the human species are designed for both animal and plant foods. Moreover, a global cross-cultural survey demonstrates that all cultures, past and present, have shown a preference for at least some form of animal fat and protein and that none has ever been totally vegetarian (Abrams 1987b: 207-23).
Current paleoanthropological research indicates that humans have been on this earth for some 3 to 4 million years. For over 99 percent of that time, humans were hunters and gatherers (Cohen 1977: 15; Johanson and White 1979: 321-30). From the Australopithecines to the inception of agriculture, humans gradually developed more efficient tools to obtain food, especially for hunting game. Homo erectus pursued large game, and early Homo sapiens, the late Paleolithic humans, were even more dedicated in this regard.
Indeed, the availability of game may well have dictated human settlement patterns. As population pressure mounted in Africa, the original home of humankind, herds of game dwindled, forcing people further afield to follow other herds. This ultimately led to human settlements in Asia, Europe, the Americas, and Australia – in all of the continents except Antarctica.
About 10,000 years ago, beginning in the Middle East, people finally started to raise plant foods, on the one hand because of their own growing numbers and, on the other, because so many game animals were scarce or had been hunted to extinction. And with sedentary agriculture came political organization and formal religions.
Religion and Diet
As people settled into sedentary agriculture, meat-eating prohibitions gradually became part of the tenets of some of their religions, such as Hinduism. Ultra-orthodox Hindus and Jains, for example, are strict vegetarians, but as anthropologist Marvin Harris points out, in earlier times the Hindus were beef eaters. Later, beef eating was restricted primarily to the Brahman caste, and finally the practice was prohibited for all Hindus.
Harris has argued that such a prohibition was the result of human population growth. Because cows were valuable as draft animals and as providers of milk and dung (as fuel for cooking), it was wasteful to use them for meat. In fact, Harris argues that the consumption of certain animals is prohibited in many cultures for pragmatic reasons (Harris 1977: 129-52) and that compliance is more likely when such prohibitions are codified as tenets of religion (Dwyer et al. 1973: 856-65; Mayer 1973: 32; Todhunter 1973: 286).
Other religious groups that today adhere to some type of vegetarianism include Buddhists, Seventh-Day Adventists, and orders such as the Roman Catholic Trappist monks. Religious justifications for vegetarianism include the beliefs that it is more spiritual, that it is more in harmony with nature, or that all animal life is sacred (Abrams 1980: 53-87).
Modern-Day Vegetarianism
A phenomenal escalation of vegetarianism, which began in the 1960s, has occurred in the Western world and has had little to do with religion (Erhard 1973: 4-12; Sonnenberg and Zolber 1973). The new vegetarians include a wide variety of individuals whose concerns range from what animals are fed and injected with, to environmental problems and world hunger, to the indictment of cholesterol and animal fats as causative factors in vascular diseases, cancers, and premature death.
There is also considerable variation in the practice of vegetarianism. At one extreme are the vegans, who exclude all types of animal protein from their diet and subsist solely on vegetables, seeds, nuts, and fruits. Moreover, their dietary beliefs extend to other daily activities. Vegans shun clothing made of animal products such as wool, silk, leather, fur, or pearls (Erhard 1974: 20-7). They abstain from consumer goods such as soap or cosmetics made with animal fat and brushes made of animal hair, and they refuse immunization with animal-derived sera or drugs, as well as toiletries whose safety has been established by animal testing.
Vegetarians toward the middle of the continuum choose a wider variety of foods and include certain animal protein foods in their diets. Lacto-vegetarians, for example, use dairy products; ovo-vegetarians, eggs; pollo-vegetarians, poultry; and pesco-vegetarians, fish. Some use more than one type of animal protein food, such as lacto-ovo vegetarians, who consume dairy products and eggs along with other foods.
At the opposite end of the vegetarian continuum from the vegans are the red-meat abstainers, who consume all animal protein foods except beef, lamb, and pork (Fleck 1976: 408-27).
Extreme Vegetarianism and Health
One form of extreme vegetarianism is the final stage of the Zen macrobiotic diet. This vegetarian phenomenon attained some popularity in the Western world during the 1960s but has declined since its founder’s death. The diet encompasses a life philosophy based on a loose reading of Buddhist symbolism and the ancient Chinese dualistic concept of yin and yang (Dwyer 1979: 1-2). Allegedly, humans achieve harmony in all of life’s manifestations by reaching a balance between the cosmic forces of yin and yang, which oppose even as they complement one another (Keen 1978): male/female, contraction/expansion, fire/water. In terms of food, such dualities are manifested in complements such as animal versus vegetable, mineral versus protein and fat, and bitter and salty versus sweet.
According to the philosophy behind the macrobiotic persuasion, disease results from yin/yang imbalances, which can be corrected by an ideal diet. The individual must follow 10 stages of an elaborate dietary regimen, with the final stage consisting solely of brown rice, salt, and fluids such as herbal teas. Reaching this final stage presumably assures perfect health, freedom from all disease, and spiritual enlightenment. It is in this last connection that macrobiotics has been especially popular: Some individuals have embraced it as an alternative to drug experimentation in a search for a “natural high” (Keen 1978).
Yet one account, The Death Diet (Christagu 1967: 43-7), has described the tragedy suffered by a young couple upon reaching the tenth stage of the macrobiotic diet. The man’s health was severely impaired, and the woman died. Her death certificate read, “Multiple vitamin and protein deficiencies precipitated the chain of events leading to her death” (Mayer 1975).
Total vegetarians are likely to suffer deficiencies of many of the chief nutrients, among them vitamin B12. All foods of animal origin contain vitamin B12, whereas plant sources of this vitamin are unreliable. The human body stores at least a thousand times the Recommended Dietary Allowance (RDA) of vitamin B12, an amount sufficient to maintain serum levels in some people for up to five years, although a wide range of serum levels is associated with deficiency. Thus, deficiency may not become apparent for some time while reserves of the vitamin are gradually being tapped (Ellis and Mumford 1967: 205; Ellis and Montegriffo 1970: 249). Vitamin B12 deficiency can also be difficult to detect because the high folic acid intake of vegetarians frequently masks its symptoms (Brown 1990: 175). All of this is particularly unfortunate because early detection of the deficiency is vital to avoid irreversible neurological impairment.
Some vegetarians, although aware of the risk of vitamin B12 deficiency, depend nonetheless on plants for this vitamin, despite the unreliability of such foods as a source. When found in plants, vitamin B12 is present only because of bacteria growing on the plant or because the plant has been contaminated by bacteria, especially from fecal matter. Spirulina and tempeh, used frequently by vegetarians as vitamin B12 sources, have virtually none and in fact provide B12 analogues that actually block the metabolism of functional vitamin B12 (Bailie 1987: 98-105).
Vitamin B12 deficiency can result in megaloblastic anemia, a grave condition. In the mid-1970s, a high incidence of megaloblastic anemia was reported among orthodox Hindus who had emigrated from India to England. The phenomenon was puzzling: The orthodox Hindus had not been vitamin B12 deficient while they lived in India, nor just after they emigrated, but developed the deficiency during their residence in England, even though their diet was similar in both countries. Investigations revealed that the cause lay in differences in the growing, processing, and packaging of foods in the two countries.
In India, where crops had minimal applications of costly pesticides, many plants were insect-contaminated, and the insects, or their eggs and larvae, contributed a supply of vitamin B12 to the diet. Additional opportunities for insect contamination were provided by minimally processed or packaged food. Thus, in India, the orthodox Hindus had adventitiously obtained sufficient vitamin B12 to prevent deficiency, a finding that helps to explain why extreme vegetarians in other developing countries may escape it as well. But ironically, the privileged in those countries, who can afford to purchase imported foods that are highly processed and protectively packaged, may suffer from an inadequate vitamin B12 intake (Rose 1976: 87).
In addition to lacking vitamin B12, extreme vegetarian diets are often deficient in protein and even in calories. Moreover, total vegetarian diets tend to be low in calcium and riboflavin (Raper and Hill 1973). Certain coarse, green leafy vegetables may be high in calcium, but the calcium is not well absorbed because of the high fiber content of the diet, and other minerals, including zinc, phosphorus, and iron, may also be poorly absorbed (Haviland 1967: 316-25; Reinhold et al. 1976: 163-80; Bodzy et al. 1977: 1139; Freeland, Ebangit, and Johnson 1978: 253).
Calcium absorption is also impaired by certain green leafy vegetables, such as spinach and Swiss chard, that contain oxalates. These compounds bind with calcium during digestion to form insoluble calcium oxalate, which cannot be utilized and is excreted in the feces (Albanese 1980). Moreover, whole grains, frequently consumed in large quantities by many vegetarians, are high in phytates, and these substances, like the oxalates, interfere with calcium absorption (Hegsted 1976: 509). Similarly, vegetarians may have low zinc levels because of the phytates and oxalates in their diets. Foods from animal sources, in contrast, provide dietary zinc without the inhibiting phytate and oxalate compounds (Prasad 1966: 225-38).
Zinc insufficiency is one of the greatest but least-known dangers of vegetarianism, according to the late Carl C. Pfeiffer, who commented wryly that “the light headed feeling of detachment that enshrouds some vegetarians can be caused by hidden zinc hunger, rather than by some mystical quality of the brown rice or other food consumed” (Pfeiffer 1975: 103). Among other things, low zinc levels are related to male infertility. In one study, nine male volunteers who intentionally ate a zinc-deficient diet for about six months developed mild zinc depletion, and their sperm counts fell to infertile levels (Prasad 1982).
Obtaining sufficient iron can also be a problem for vegetarians, and women on strict vegetarian diets, especially during the child-bearing years, may have real difficulties in this regard (Mayer 1973: 32). As with zinc, foods of animal origin are reliable iron sources, whereas foods of plant origin are not. Moreover, heme iron from foods of animal origin is absorbed and utilized far more efficiently than nonheme iron from plant foods: 10 to 30 percent can be absorbed from animal foods, compared with only 2 to 10 percent from plant foods. Finally, phytates and oxalates interfere with iron absorption (Finch 1969: 3).
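To make the practical import of these absorption figures concrete, consider a rough, purely illustrative calculation (the 10-milligram intake below is a hypothetical figure, not one drawn from the chapter or its sources):

\[ \text{iron absorbed} = \text{dietary iron} \times \text{absorption fraction} \]

\[ 10\ \text{mg} \times 0.10\text{–}0.30 \approx 1\text{–}3\ \text{mg (heme, animal sources)} \qquad 10\ \text{mg} \times 0.02\text{–}0.10 \approx 0.2\text{–}1\ \text{mg (nonheme, plant sources)} \]

At the cited extremes, then, the same quantity of dietary iron can yield several times more absorbed iron when it comes from animal foods than when it comes from plant foods.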
Normal plasma estrogen levels are necessary for women’s menstrual regularity. Because nutrition can affect hormone levels, dietary differences may affect menstrual patterns, as was shown in a study of evenly matched nonvegetarian and vegetarian premenopausal women at the Milton S. Hershey Medical Center in Hershey, Pennsylvania: Only 4.9 percent of the nonvegetarians experienced menstrual irregularities, whereas 26.5 percent of the vegetarians had difficulties. The probability of menstrual regularity was associated positively with protein and cholesterol intake and negatively with dietary fiber and magnesium intake (Pedersen, Bartholomew, and Dolence 1991: 879-85). Such results are consistent with the hypothesis that premenopausal vegetarian women have lower circulating estrogen concentrations (Lloyd, Schaeffer, and Walter 1991: 1005-10) and that these women may also have decreased reproductive capacity (Pedersen et al. 1991: 879-85). Another study at the Hershey Medical Center found that the frequency of menstrual irregularity was significantly higher in a group of lacto-ovo vegetarian women than in a matched group of nonvegetarian women (Lloyd et al. 1991: 1005-10).
Infants and children may suffer the most from extreme vegetarian diets. Commenting on this issue, the pediatrician George Kerr observed that “history has many examples of parents abusing children for the very best intention.… Specifically, should we start considering reporting these cases of growth failure on vegan diets to the authorities involved with child abuse?” (Kerr 1974: 675-6).
Infants breast-fed by women who are strict vegetarians have been found to be deficient in vitamin B12, and such lactating women are advised both to take a vitamin B12 supplement and to include soybean milk or fermented soybean foods in their diets (Dwyer 1979: 1-2). Vegetarian infants and children tend to be smaller and to grow at a slower rate than do children from the general meat-eating population. One factor may be the high-bulk/low-calorie content characteristic of vegetarian diets (Erhard 1973: 11).
Pediatricians have expressed concern about the vulnerability of infants and growing children, who are especially at risk from extreme vegetarianism. “Kokoh,” a special macrobiotic infant-feeding formula, was found to be capable of inducing kwashiorkor, a form of protein-energy malnutrition. Moreover, the American Academy of Pediatrics (AAP) has reported that infants fed solely on this formula from birth usually are underweight within a few months and below average in total body length (Abrams 1980: 57).
Other research and clinical observations have also shown that infants fed no animal protein fail to grow at a normal rate and may develop kwashiorkor. Infants who are breast-fed and then placed on vegan diets do not grow or develop normally, nor do they fare as well as infants fed vegetarian diets supplemented with cow’s milk (Erhard 1973: 10-12).
Vegetarianism and Chronic Diseases
Ancel Keys’s seminal reports (Keys 1956: 364; Keys, Anderson, and Grande 1957: 959), followed by others, suggested a direct relationship between diet, especially animal fats, and heart disease (Connor, Stone, and Hodges 1964: 1691-1705; West and Hayes 1968: 853; Anderson, Grande, and Keys 1976: 1184) and colon cancer (Burkitt 1971: 3, 1973: 274; Berg, Howell, and Silverman 1973: 915; Reiser 1973: 524, 1978: 865; Hill 1975: 31; Howell 1975; Nichols et al. 1976: 1384; Truswell 1978: 977-89). Conclusions drawn from these studies, however, remain controversial, and many other studies take issue with them (Mann et al. 1961: 31; Mann 1977: 644; Enstrom 1975: 432; Lyon, Klauber, and Gardner 1976: 129; Enig, Munn, and Keeney 1978: 2215; Glueck and Connor 1978: 727).
Nonetheless, the public has been led to believe that foods that contain saturated fat and cholesterol, such as eggs and red meats, are related to heart disease and colon cancer. In fact, some alarmed individuals have embraced a limited or total vegetarian diet in the belief that such measures will ward off these illnesses (Hardinge and Stare 1954: 83; West and Hayes 1968: 853; Ruys and Hickie 1976: 87; Sanders, Ellis, and Dickerson 1978: 805).
In truth, however, research has produced contradictory and puzzling findings. Seventh-Day Adventists, for example, have an educational health program for their members that advocates a well-balanced diet, including milk and eggs along with other foods. In other words, these lacto-ovo vegetarians eat adequate amounts of animal protein foods, although they exclude red meats. Seventh-Day Adventists have been found to have lower rates of cancer and coronary artery disease than the general American population (Wynder, Lemon, and Bross 1959: 1016-28). For cancer, their mortality is 50 to 70 percent lower than that of the general population (Phillips 1975: 3513). For coronary heart disease, mortality was lower in males under 65 years of age than in the general population, but not in males over 65, nor in females of any age (Phillips et al. 1978: S191).
Because Seventh-Day Adventists are red-meat abstainers, it was assumed that such abstention accounted for their lower incidence of heart disease relative to the general public. But other, confounding factors may play important roles in preventing these diseases. For example, Seventh-Day Adventists also abstain from tobacco, alcohol, and caffeinated beverages (Dwyer 1979: 1-2).
Members of the Church of Jesus Christ of Latter-day Saints (Mormons) provide another homogeneous group for study. Mormons are not red-meat abstainers, but their church urges moderation in lifestyle. Compared with the American average, Utah Mormons were found to have a 22 percent lower incidence of cancer in general and a 34 percent lower incidence of colon cancer (Lyon et al. 1976: 129). However, Puerto Ricans, who eat large amounts of animal fats, especially pork, have very low rates of colon and breast cancers (Enig et al. 1978: 2215). Such contradictions practically leap off the pages of a comparative study of the incidence of breast and colon cancers conducted in Finland and the Netherlands. People in these two countries eat similar amounts of animal fat per person per day, yet the incidence of breast and colon cancer in the Netherlands was nearly double that of Finland (Enig et al. 1978: 2215).
Nor are such contradictions confined to cancer. A study in Western Australia contradicted the notion that a high serum cholesterol level increases the risk of heart attack. Three groups of mothers and children were studied: those with high, those with medium, and those with low cholesterol levels. The groups showed no significant differences in their daily dietary intakes of proteins, fats, and carbohydrates, and obesity was ruled out as a factor because cholesterol levels did not differ between the obese and the nonobese.
The conclusion reached was that diet did not appear to account for differences in cholesterol levels in a culturally homogeneous group. Furthermore, the “correlation between habitual diet and the average serum cholesterol level is good between contrasting populations (for example, people of Japan and Finland)” but “within a given culture, people eating the same kind of food can have different serum lipids. Those who develop coronary heart disease do not necessarily eat differently from those who do not” (Hitchcock and Gracey 1977: 790).
More recently, additional confounding evidence has been contributed by the so-called French Paradox. The eating habits of people in Gascony, in the southwest of France, were studied for 10 years. This area is noted as the world’s leading producer of cholesterol-laden foie gras. People in Gascony consume goose and duck fat liberally, snack on dried duck skin, and eat more than twice as much foie gras as other French people and 50 times as much as Americans. Yet they were found to have the lowest incidence of death from cardiovascular disease in all of France. In Gascony, of 100,000 middle-aged men, about 80 die annually of heart attack, compared with 145 elsewhere in France and 315 in the United States (O’Neill 1991: 1, 22).
Fats
Because of the negative publicity that fats have received of late, we sometimes tend to forget that fat is essential for proper growth and development, even for life itself. Fat provides essential fatty acids necessary for cell membrane structure and for prostaglandins; it also acts as a carrier for the fat-soluble vitamins. Humans require different types of fats for different purposes, both structural and storage. Structural fats make possible the growth of the brain and the central nervous system and maintain cell membrane integrity. Structural fats are found inside the cell. Cell membranes are not mere envelopes but active parts of the cell, arranged as an orderly production line, creating some things, breaking down others, and transferring raw materials (nutrients) as needed from one place to another. Storage fats, no less important, are used for energy (Crawford and Marsh 1989: 120).
Vegetable Oils
It may be unfortunate that millions of people have been persuaded to change their choice of fats in the belief that plant oils containing predominantly polyunsaturated or monounsaturated fatty acids are good and fats from animals are bad. There is an argument that large amounts of polyunsaturated oils in the diet may be detrimental to health (Lyon 1977: 28-31). Although vegetable oils have been offered with promises of reduced risk of cardiovascular disease, some authorities find these claims to be misleading and inaccurate (Pinckney and Pinckney 1973: 31-8; Moore 1989: 40-95).
Eggs and Cholesterol
Cholesterol has also gotten a bad press, which tends to obscure its vital roles in human health. It is needed for the transmission of nerve impulses throughout the body, and, in fact, the nervous system cannot function properly with insufficient cholesterol. In addition, cholesterol is needed to produce sex hormones. Indeed, it may well be because cholesterol is so crucial that the human body does not depend solely on dietary sources but also manufactures cholesterol. If dietary cholesterol is insufficiently supplied, then the body manufactures more to meet its requirements. Thus, Roger J. Williams turned current wisdom on its head when he observed that “anyone who deliberately avoids cholesterol in his diet may be inadvertently courting heart disease” (Williams 1971: 73).
In response to public concern about eggs and their cholesterol content, food processors have countered by concocting products that contain egg white (the noncholesterol component) but no yolk (the cholesterol-containing component). These products were intended for omelets and scrambled egg dishes. M. K. Navidi and F. A. Kummerow of the Burnsides Research Laboratory, University of Illinois, undertook an experiment to determine the nutritional value of such replacer eggs. They fed one group of lactating rats exclusively on fresh eggs and another group on Egg Beaters. They intended to run the experiment for 40 days, but all of the rats on the egg substitute were dead within 17 days; they had been underweight, severely stunted in growth, and mottled in hair. All of the rats on fresh whole eggs were healthy and of normal size (Navidi and Kummerow 1974: 565).
Diet and Primitive Societies
In using the term “primitive,” we would note at the outset that there are no primitive peoples. When used by anthropologists, the term refers to the level of technological development of a culture, not to the people themselves.
Not many hunting and gathering societies have survived to the present, and those that have are rapidly becoming acculturated. Yet the few still in existence have been extensively studied, with such efforts contributing considerably to our knowledge of the diet and health of past peoples.
Among other things, such knowledge disputes the claim of vegetarians who defend their practice by asserting that individuals in primitive societies (and presumably our ancestors) were herbivores who lived primarily on plant foods. The evidence is that most hunting-and-gathering peoples consume large amounts of meat and other types of animal protein whenever these are available.
The Eskimos constitute a classic example. They lived almost entirely on a traditional diet of raw sea and land mammals, fish, and birds, and so long as that diet was followed, Eskimos remained in excellent health. Indeed, the meat and organs of these animals, including the liver and the adrenal glands, even provided an ample supply of vitamin C, so long as they were eaten raw or only lightly cooked; being heat-sensitive, vitamin C is destroyed in meats by normal cooking (Stefansson 1960: 58).
Although it was obvious that Eskimos could thrive on a diet of raw meat (Stefansson 1937), the notion persisted that such a diet would probably harm Europeans, which led to an interesting experiment. Under the auspices of the Russell Sage Institute of Pathology at Bellevue Hospital (an affiliate of the Cornell University Medical College of New York), the anthropologist Vilhjalmur Stefansson and his colleague Karsten Anderson volunteered to eat 2 pounds of raw lean meat and a half pound of raw fat daily, and nothing else, for a year. Both men thrived. After the year was up, Stefansson, who had previously lived for years on the Eskimo diet, continued on it for decades more and enjoyed sound health until his death at age 83. Similarly, Anderson was found to be in better physical condition after eating the traditional Eskimo diet for the year than when he began the experiment (Stefansson 1957: 60-89).
The Eskimo diet, then, also tends to confound the notion that meat and fats undermine human health. Paul Martin, for example, who spent years in the Arctic, found that although Eskimos consumed large amounts of animal fats, whale blubber, and seal oil, they did not have problems with cholesterol and, in fact, were remarkably free of degenerative diseases, especially those related to the heart and to blood pressure (Martin 1977: 25-8).
Another observer of indigenous Eskimo culture concluded that the native diet of raw land and sea mammals and fish provided all the nutrients needed for excellent health (Schaefer 1971: 8-16). Unfortunately, with their adoption of industrialized foods since World War II, consisting mainly of refined carbohydrates (sugar and white flour), chemically altered fats, and other highly processed foods, the Eskimos began to suffer from those degenerative diseases common to the modern industrialized world (Schaefer 1971: 8-16).
George V. Mann, of Vanderbilt University Medical College, who challenged Keys’s fat hypothesis, initiated several studies of the few available primitive societies still living largely on animal foods. He reasoned that if a high fat and cholesterol intake really did cause heart disease, then hypercholesterolemia and coronary heart disease ought to characterize the health of groups of people living on high-fat, high-cholesterol diets.
Mann first studied the Pygmies in the African rain forest. Almost untouched by civilization, these people have continued on a traditional diet of large amounts of meat, supplemented with plant foods. But Mann discovered that despite a high level of meat consumption, adult male Pygmies had the lowest cholesterol levels yet recorded (Mann et al. 1961).
Next, Mann observed the Masai of Tanganyika, a nomadic pastoral people living almost exclusively on meat, cow’s blood, and milk. His examination of 400 Masai adult males revealed very little cardiovascular disease and no signs of arteriosclerotic disease. As a result of his investigation, Mann concluded that the widely held notion that meat and milk cause coronary heart disease is unsupported by the evidence (Mann 1963: 104).
Modern Nutrition and Tooth Decay
Tooth decay has been known throughout human history, but only in relatively modern times has it become endemic. The evidence shows that tooth decay was rare throughout the Paleolithic period, which spanned some 3 to 4 million years.
When the hunting-and-gathering way of life gave way to stable agriculture, though, the switch led, among other things, to increased tooth decay. And it would seem that dependence on high-carbohydrate foods, such as grains and other plant foods, began to seriously undermine the health of those who did not find a way to maintain a favorable dietary balance between high-quality animal protein foods and lower-quality plant protein foods (Page and Abrams 1974: 188-203, 214-26; Abrams 1976: 102-12, 1978: 41-55).
Skeletal remains of European populations reveal a slow, steady increase in tooth decay from the Neolithic period, when agriculture first began, to modern times, when the rate escalated (Wells 1964; Abrams 1976: 102-12, 1978: 41-55). Such a deterioration in teeth is also vividly portrayed in the experience of the Lapps (Sami) of northern Europe, who until recently hunted in a manner similar to that of the late Paleolithic age and lived mainly on reindeer meat. They consumed about a pound of meat per person each day and were renowned for their healthy teeth: Eighteenth-century skeletal remains show a cavity rate of only 1.5 percent in Lapp teeth. But gradually the Lapps adopted the industrialized diet, and their cavity rate has soared to 85 percent (Pelto and Pelto 1979: 292-301).
Examination of the skulls of the ancient Norse, who subsisted almost entirely on fish, seafood, and animals, also showed that dental decay was rare (Page and Abrams 1974: 188-203, 214-26). But skeletal remains of prehistoric Indians in the California area, whose diet was primarily vegetarian, revealed that 25 percent of the population had some tooth decay (Pelto and Pelto 1979: 292-301). Substantially worse were the teeth of prehistoric Indians in New Mexico, whose diet was based on corn (maize), supplemented with a few other plant foods and small amounts of animal protein. Tooth decay was a problem for 75 percent of that population (Pelto and Pelto 1979: 292-301).
Such relationships between diet and dental decay, as well as other degenerative diseases, have also been well documented by Weston A. Price’s pioneering efforts around the globe. He found, for example, that so long as the Australian Aborigines, Polynesians, Eskimos, and other peoples followed their traditional diets, consisting of animal foods and unrefined, unprocessed plant foods, they had an exceptionally low incidence of caries and other health problems. But after adopting an industrialized diet, these same peoples developed rampant tooth decay and other degenerative diseases, as well as malformations in their offspring (Price 1989: 59-72).
Animal experiments have yielded similar results that underscore the observations on human populations. Protein-malnourished rat pups developed more caries than did pups amply supplied with protein, and rats fed adequate calories but no protein also had a high incidence of tooth decay (DiOrio, Miller, and Navia 1973: 856-65; Menaker and Navia 1973: 680-7; Navia 1979: 1-4). Such findings suggest that humans, especially infants and children, are more susceptible to tooth decay if deprived of adequate animal protein foods.
Diet and the Lower Primates
It is sometimes argued that the basic human diet should be plant food because closely related species of primates such as monkeys and apes are vegetarian. Yet, since World War II, ethological studies on primates in the wild have shown that monkeys and apes, despite a predominantly vegetarian diet, nonetheless seek out small animals, insects, insect eggs, and larvae as food (Abrams 1980: 75).
Marmosets and squirrel monkeys are especially fond of crickets and flies (Abrams 1980: 75). Baboons actively hunt, kill, and eat a variety of wild animals, and zoo marmosets have failed to mate until animal foods were added to their diets (Abrams 1978: 42-50). Jane Goodall has witnessed chimpanzees in the wild fashioning crude tools to gather termites as food and has observed them killing and eating arboreal monkeys, baby baboons, and other small animals (Goodall 1971).
Gibbons often eat rodents, birds, and small antelopes. Orangutans feed on larvae, beetles, nesting birds, and squirrels. And in captivity, gorillas show a preference for meat over vegetarian fare (Perry 1976: 165-85). In this connection, it should be noted that the desire of chimpanzees to eat meat does not stem from a lack of plant foods, which are abundant and constitute most of their diet (Eimerl and De Vore 1968: 142-52; Goodall 1971).
Such findings have made it necessary to reclassify monkeys and apes: Rather than herbivores, these animals are now recognized as omnivores, and the reason they do not eat more animal food than they do may be a limited ability to procure it in abundance. In seeking out the animal foods that they do consume, however, monkeys and apes are most likely driven by a basic need to meet nutritional requirements that can be satisfied only by animal protein (Abrams 1980: 76).
The tree shrew, the most primitive of all the living primates, is closest in structure to the fossil ancestors of the first primates, who date from about 70 million years ago. The tree shrew is almost exclusively an insectivore (Berg et al. 1973: 915). Such findings about the primate progenitors of humans strongly suggest that all primates, including humans, have a basic need for some animal protein in the diet (Abrams 1980: 76).
In moving from the lower primates to humans, we find that the available paleontological and archaeological data can tell us much about the diet of humans during most of their time on earth.
Humans as Hunters
Fossil remains of Australopithecus, the first humans (Dart 1969: 42-189; Leakey 1971: 215-90), have been found across the savannas of South and East Africa (Leakey 1971: 215-90). They ate their food raw but, unlike other primates, made and used crude stone tools to collect more food per unit of time. Their diet consisted of insects, small animals, and plant foods that could be eaten raw as found in nature (Biegert 1975: 717-27). They also scavenged, eating the remains of any large carcass they found, and apparently were able to obtain surprisingly large amounts of meat in this way (Schaller and Lowther 1969: 307-34).
Through mutation and natural selection, a more advanced human, Homo erectus, evolved about a million years ago. These people developed far more efficient tools and techniques for hunting large animals and spread from Africa into Asia and southern Europe (Constable 1973: 29-37, 70).
Between 35,000 and 70,000 years ago, modern Homo sapiens evolved. The late Paleolithic Homo sapiens is exemplified by Cro-Magnon man (Prideaux 1973). Homo sapiens continued to improve hunting techniques and followed game from Africa to the far north of Europe and, finally, to Siberia, the Americas, and Australia. Cro-Magnon’s religion and magic, reflected in extraordinary cave art, centered on the theme of the hunt for large game.
Both archaeological excavations and cave art reveal the variety of meat available. In Europe and Asia, for example, Cro-Magnon hunted bear, cave lion, hyena, wild horse, bison, woolly rhinoceros, reindeer, chamois, ibex, woolly mammoth, and deer. In the Americas, the quarry included the dire wolf, mammoth, giant beaver, giant sloth, American camel, and more than 100 other species (Prideaux 1973: 12, 14, 42-78). But beginning about 40,000 years ago, many of the large animal species began to die out worldwide, and by approximately 10,000 years ago, numerous animals had been hunted to extinction (Martin 1967: 32-8).

Anthropologist Sherwood L. Washburn has postulated that the quest for game was a significant factor in the evolution and intellectual development of humans. Beginning with the Australopithecines and continuing over eons until the domestication of plants and animals, humans learned to be ever more efficient hunters, to devise better strategies, and to invent and improve more effective tools as well as techniques for using them. That humans were forced to shift from hunting and gathering to sedentary agriculture because of a decreasing supply of large animals suggests that they were all too successful in perfecting their hunting techniques (Washburn 1961: 299).
Nutrients in Plants and Animals
It should be clear by now that our ancestors, both the lower primates and early humans, consumed much in the way of animal food, and with good reason, for the nutrients in meat are much more highly concentrated and are utilized more efficiently than those from plant foods. For example, venison provides 572 calories per 100 grams of weight, compared with only 100 calories from a similar amount of fruits and vegetables (White and Brown 1973: 68-94). Humans, then, developed the ability not only to hunt animals efficiently but also to digest them efficiently; through natural selection, both of these abilities afforded humans an adaptive advantage for survival in a changing environment (Bronowski 1972: 42-56).
A full day of foraging is required to obtain the amount of food value from plants that is found in one small animal. By eating animals that lived on grasses and leaves, humans could obtain a highly concentrated and complete food, converted from plants into meat. When humans moved into the cold regions of northern Europe and Asia, they had to rely on meat even more: Edible plant foods were available only seasonally, whereas game was available throughout the year (Campbell 1976: 253-302). Moreover, game did not need to be cooked, whereas many plant foods had to be cooked to be edible. Homo erectus may have used fire to a very limited extent some 300,000 years ago, but the evidence is sparse and questionable; fire’s general use, according to both paleontological and archaeological records, began only about 40,000 to 50,000 years ago (Abrams 1976: 102-12).
When the use of fire was extended to food preparation, the result was a great increase in the size and variety of the plant food supply. All of the major domesticated staple plant foods, such as wheat, barley, rice, millet, rye, and potatoes, require cooking before they are suitable for human consumption. In fact, in a raw state, many plants contain toxic or indigestible substances, or anti-nutrients. But after cooking, many of these undesirable substances are deactivated, neutralized, reduced, or released, and the starch and other nutrients in the plants are rendered absorbable by the digestive tract. The use of fire to cook plant foods therefore doubtless encouraged the domestication of these foods and was thus a vitally important factor in human cultural advancement (Abrams 1986: 24-9).
In summary, it seems clear that the natural diet of humans is an omnivorous one and that proteins from animal sources are vital to health and well-being. This is because proteins differ in quality, depending on the amino acids from which they are built. Some amino acids (“nonessential” or “dispensable”) can be synthesized in the human body; others (“essential” or “indispensable”) must be supplied by food. For optimal protein quality, all amino acids must be present and in optimal ratios with one another. No individual plant food meets these requirements, whereas all foods of animal origin do.
The egg represents the ideal-quality protein against which all other proteins are measured in terms of protein efficiency ratio (PER). At the top of the PER scale, the egg is followed closely by proteins from other animal sources, both organ and muscle meats. The PERs for plant foods such as legumes, grains, and seeds are much lower on the scale because all are limited by low levels of one or more amino acids and because their amino acids are not in good balance with one another. To obtain sufficient amounts of these limiting amino acids from a single plant food, one would have to consume enormous quantities of it, and the total diet would become imbalanced.
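For readers unfamiliar with the measure, the protein efficiency ratio is conventionally defined in standard nutrition references (this definition and the representative values below are drawn from such references, not from the chapter itself) as the weight gained by a growing test animal per gram of protein it consumes:

\[ \mathrm{PER} = \frac{\text{gain in body mass (g)}}{\text{protein consumed (g)}} \]

On commonly published scales of this kind, whole egg scores roughly 3.9, the casein reference protein about 2.5, and wheat gluten well under 1.0, although the precise figures vary with assay conditions.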
Perhaps in the future, with genetic engineering and other scientific and technological advances, it will become possible to develop plant foods that supply, in ideal balance, all of the amino acids and other nutrients required for optimal health. Meanwhile, it seems prudent to retain the traditional dietary wisdom of our predecessors and apply it, insofar as possible, to modern life, meaning that humans should eat a varied, balanced diet consisting of both animal and plant foods, all in moderation.