Human Nutritional Adaptation: Biological and Cultural Aspects

H. H. Draper. Cambridge World History of Food. Editors: Kenneth F. Kiple & Kriemhild Conee Ornelas. Volume 2. Cambridge University Press, 2000.

The extraordinary diversity of aboriginal food cultures testifies to the capacity of many combinations of foodstuffs to sustain human health and reproduction. From this diversity it is apparent that humans have no requirement for specific foods (with the qualified exception of breast milk, which can be replaced by the milk of other mammals but with less satisfactory results). Modern nutritional science has demonstrated that good health is dependent upon the consumption of a discrete number of biochemical compounds that are essential for normal metabolism but cannot be synthesized de novo in the body. These compounds or their metabolic precursors can be obtained from many different combinations of foods.

It is possible that there remain unidentified trace elements required in amounts so small that their essentiality cannot yet be demonstrated, although such elements are unlikely to be of any clinical importance in human nutrition. Current perceptions of the amounts of some nutrients required for optimal health (such as the relative amounts of various fatty acids necessary for the prevention of cardiovascular disease) will probably undergo further change. Nevertheless, the present state of nutritional knowledge is an adequate basis for evaluating the quality of different food cultures in terms of their ability to provide the nutrients required for nutritional health.

This chapter evaluates two contrasting food cultures: the carnivorous aboriginal diets of the Arctic Inuit and the traditional cereal-based diets consumed by the inhabitants of Southeast Asia and of Central and South America. Also, the current nutritional health of these populations is evaluated in terms of their adaptation to a modern diet and lifestyle.

Biological Adaptation to the Inuit Diet

Although nutritional anthropologists have concluded that the diets of most hunting and gathering societies were more varied than those of early agricultural societies, this clearly was not true of the aboriginal diet of the Inuit inhabitants of the high Arctic. Berries and the leafy parts of edible plants were available in the Subarctic region, but the diet of the Inuit residing 300 miles above the Arctic Circle was, for all practical purposes, carnivorous (that is, they were hunters but not gatherers).

Current diet recommendations in industrial societies promote the consumption of a mixture of foods belonging to four or more food groups: cereals, fruits and vegetables, meat and fish, and dairy products. The native diet of the Arctic Inuit—caribou, seal, and whale meat, augmented with lesser amounts of fish, birds, eggs, and the meat of other land mammals—is composed of foods belonging to only one of these groups.

Meat consumption in modern societies is being discouraged because of its propensity to cause cardiovascular disease. Yet, when the Inuit were first examined by ships’ doctors, they were found to be virtually free not only of vascular disease but of renal disease, hypertension, and diabetes as well (Thomas 1927; Robinhold and Rice 1970; Mouratoff and Scott 1973). These conditions are the main manifestations of malnutrition in modern societies. The factors underlying the successful adaptation of the Inuit to their extraordinarily narrow food base have been discussed elsewhere (Draper 1977) and are further examined in the context of current knowledge in the following sections.

Sources of Metabolic Fuel

The Inuit native diet frequently is characterized as being high in protein or high in fat, but from the standpoint of metabolic adaptation, its most important feature is its low content of carbohydrate. In the absence of plant sources of starch and sugars, carbohydrates in the diet were limited to small amounts of glycogen and the sugar moieties of the glycoproteins and glycolipids present in animal tissues.

The postprandial energy state following consumption of the modern mixed diet, in which carbohydrates constitute the main source of energy, is marked by the conversion of excess glucose to fat for energy storage. The postprandial state following consumption of a carnivorous diet is marked by the obligatory synthesis of glucose, which is essential for brain function and other metabolic processes. Inasmuch as fatty acids, the main source of energy in the diet, cannot be converted to glucose, most of the glucose required must be obtained from gluconeogenic amino acids. Glycerol, the backbone of dietary triglycerides (fat) and of phospholipids, is an additional but minor metabolic source of glucose.
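The scale of this obligatory glucose synthesis can be illustrated with a rough calculation. The figures are textbook approximations rather than measurements on Inuit subjects: the adult brain consumes roughly 120 g of glucose per day, classic estimates place the gluconeogenic yield of dietary protein near 57 g of glucose per 100 g of protein, and the glucose-sparing effect of ketone bodies is ignored.

$$\text{protein required} \approx \frac{120\ \text{g glucose/day}}{0.57\ \text{g glucose per g protein}} \approx 210\ \text{g protein/day}$$

An intake of this order is easily met by the protein intakes of several hundred grams per day reported for aboriginal Inuit adults (see below), leaving ample amino acids for the synthesis of body proteins.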

Because of its low carbohydrate content, the native Inuit diet had to be sufficiently high in protein, not only to supply the amino acids required for the synthesis of body proteins but to maintain normoglycemia as well. In some Inuit food subcultures, such as that of the caribou eaters of the Arctic tundra, protein (rather than fat) probably constituted the main source of energy in the diet, because caribou is much lower in fat than is modern beef.

The high rate of gluconeogenesis in the Inuit requires large amounts of enzymes (which are themselves proteins) for the conversion of amino acids to glucose and for the conversion of waste amino acid nitrogen to urea. These processes, respectively, account for the large livers and large urine volumes long associated with the Arctic Inuit. Increased enzyme synthesis in response to an increase in substrate concentration (so-called feed-forward regulation of enzyme reactions) is a major form of metabolic adaptation to changes in diet composition.

Sources of Vitamins

The vitamins of the large B-complex group function as cofactors for enzymes involved in the metabolism of many biochemical compounds, including amino acids, fatty acids, and glucose. Consequently, there is a natural association between the amounts of these vitamins and the amounts of enzyme proteins present in the tissues of animals. Most of the protein in the liver, where many of these transactions occur, consists of enzyme proteins. This association is most apparent in the case of vitamin B6, which functions exclusively in enzymatic transformations of amino acids, such as the aminotransferase reactions required to remove their nitrogen groups before their conversion to glucose.

As a result, the requirement for this vitamin is proportional to protein intake. Although the high-protein Inuit diet generates a high B6 requirement, the risk of B6 deficiency does not increase with protein intake, because of the strong association between this vitamin and protein in the diet. There is a similar, though weaker, association between other B vitamins and dietary protein. In contrast to cereal-based food cultures, carnivorous food cultures have no history of deficiencies of the B-complex vitamins. The adequacy of the aboriginal Inuit diet in these nutrients was further ensured by their custom of eating food in the fresh, frozen, or lightly cooked state.

The nutriture of the Inuit with respect to vitamin C, a vitamin generally associated with fruits and vegetables, has long held a fascination for nutritionists. As with other apparent mysteries of Inuit nutrition, however, their freedom from vitamin C deficiency is readily explicable in terms of current knowledge about the amount of vitamin C required for the prevention of scurvy, its concentration and stability in the diet, and the body’s capacity to store it. Prevention of scurvy requires 10 milligrams (mg) or less per day, an amount available from the fresh or frozen raw tissues of animals that synthesize the vitamin. The liver of land and sea mammals, often eaten fresh while still warm, is a good source of vitamin C. Seal liver, an Inuit delicacy, contains about 35 mg per 100 grams (g), an amount sufficient to provide for significant storage (Geraci and Smith 1979).

The epidemics of scurvy among Europeans in the eighteenth and nineteenth centuries were caused by the oxidation of vitamin C in grains and salt meats during long sea and land voyages and by its leaching from foods during cooking in water. Further, it is not necessary, as a recent advertisement claims, “to get your vitamin C every day.” The amount of vitamin C stored by a typical adult in the United States or Canada is sufficient to prevent a deficiency, in the absence of any intake, for about 30 days (Baker et al. 1971).
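Two back-of-envelope calculations make these points concrete. The first uses only the figures cited above; the second assumes a typical adult body pool of roughly 1,500 mg, with scurvy appearing after the pool falls below about 300 mg, values drawn from the vitamin C depletion literature rather than from this chapter.

$$\frac{10\ \text{mg/day}}{35\ \text{mg per 100 g}} \approx 30\ \text{g of fresh seal liver per day to prevent scurvy}$$

$$\frac{1{,}500\ \text{mg} - 300\ \text{mg}}{30\ \text{days}} \approx 40\ \text{mg/day mean depletion at zero intake}$$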

The need to envelop the body in heavy clothing against the cold during the long winter night deprives the Arctic Inuit of solar radiation, the main source of vitamin D in temperate and tropical climates. The Inuit experience demonstrates that at least in some environments, there is an absolute dietary requirement for vitamin D, a nutrient that some have proposed be reclassified as a hormone because it can be synthesized in adequate amounts in the skin with sufficient exposure to sunlight. The high concentrations of vitamin D in the oils of fish and fish-eating sea mammals make up for limited synthesis of vitamin D in the skin during the Arctic winter, as demonstrated by the lack of historical evidence of rickets in Inuit children and of osteomalacia in adults consuming the native diet. These oils also provide an abundance of vitamin A. The occurrence of toxic amounts of vitamin A in polar bear liver, accumulated through the fish and seal food chain of this species, forms the basis of one of the strongest Inuit food taboos.

The vitamin E nutriture of the Inuit is unusual because their carnivorous diet contains no cereal oils, the main source of vitamin E in the modern mixed diet. Further, the high concentration of polyunsaturated fatty acids from fish and marine mammals in this diet generates a high requirement for vitamin E to maintain their oxidative stability in the tissues. Nonetheless, analysis of the blood plasma of Northern Alaskan Inuit has revealed concentrations of vitamin E similar to those of the general U.S. population (Wo and Draper 1975).

The explanation for this finding lies in a difference in the forms of vitamin E present in carnivorous and mixed diets. Although the total amount of vitamin E present in the mixed diet is substantially higher than in the carnivorous diet, only about one-quarter of the total occurs in the form of alpha-tocopherol; the rest is present mainly as gamma-tocopherol, which has only about 10 percent as much biological activity. In contrast, animal tissues contain almost exclusively alpha-tocopherol. Consequently, the amount of vitamin E in the Inuit native diet, expressed as biological activity, is comparable to that in the mixed diet.
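A worked example, with an invented but plausible intake, shows how the arithmetic evens out. Suppose a mixed diet supplies 12 mg of total tocopherols per day, one-quarter as alpha-tocopherol and the remainder as gamma-tocopherol with one-tenth the activity:

$$(0.25 \times 12 \times 1.0) + (0.75 \times 12 \times 0.1) = 3.0 + 0.9 \approx 3.9\ \text{mg alpha-tocopherol equivalents}$$

A carnivorous diet supplying only about 4 mg per day, virtually all of it as alpha-tocopherol, thus delivers the same biological activity as a mixed diet containing three times as much total vitamin E.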

The concentration of vitamin E in the highly unsaturated oils of aquatic species is sufficient at the reduced oxygen tension and temperature of their natural aquatic environment, but it is inadequate to protect them from oxidative rancidity when they are exposed to air, as is evident from the pungent odor of volatile lipid peroxidation products that permeates Inuit villages in summer. In general, the concentration of vitamin E in plant oils and animal fats is proportional to the amount necessary to stabilize the unsaturated fatty acids they contain. For example, the inheritance of vitamin E and polyunsaturated fatty acids in maize is genetically linked, so that varieties high in polyunsaturated fatty acids are also high in vitamin E (Levy 1973). Fish, seal, and whale oils, on the other hand, impose a burden on the vitamin E requirement that must be borne by other food sources of the vitamin. Although Canadian Indians consuming a diet high in fish have been found to have plasma vitamin E levels below those of the general population (Desai and Lee 1974), dietary vitamin E deficiency has not been documented.

Sources of Inorganic Nutrients

Human requirements for inorganic nutrients (often loosely referred to as minerals) are qualitatively similar to those of other mammalian species. Hence, the carcasses of animals are a good source of these nutrients in human diets, provided they are eaten in their entirety, as they were (less the skin and compact bone) by the Arctic Inuit. The inorganic elements most frequently at risk of deficiency in human diets are iron and iodine. Iron deficiency, often precipitated by intestinal infections and diarrhea, is most prevalent among malnourished consumers of low-protein vegetarian diets. Iron is accumulated in animal tissues in the iron-binding liver proteins hemosiderin and ferritin, which, together with hemoglobin and other iron-containing proteins, provide an abundance of bioavailable iron in a carnivorous diet. Risk of iron deficiency is reduced by an adaptive feedback mechanism that increases the efficiency of iron absorption on a low-iron diet. This mechanism also serves to prevent iron toxicity on the high-meat Inuit diet by suppressing the absorption of excess dietary iron.

Iodine deficiency is a prevalent problem in various human populations, arising in some cases from its deficiency in the soil (for example, the “goiter belt” around Hudson Bay) and in other cases from the ingestion of plant goitrogens. Neither of these circumstances applies to the Inuit native diet, which is devoid of goitrogens and is high in iodine from foods of aquatic origin. This diet also contains an abundance of zinc, which is present in marginal amounts in cereal diets high in phytin, a component of fiber that inhibits zinc absorption.

Meat has a very low calcium content, and in the absence of dairy foods, which supply about 75 percent of the calcium in the food supply of most industrialized societies, the spongy trabecular bone of land and sea mammals and the soft bones of fish were an essential source of this element in the native Inuit diet. Bone chewing, a nutritional as well as a social custom, is now nearly extinct, and a low consumption of dairy products has resulted in a calcium intake among the Inuit that is both below the historic level and below the intake currently recommended in the United States for optimal development of the skeleton during growth and for its stability during aging. However, the intake is comparable to the amount delivered by cereal-based diets, and there is no clear evidence that the lowered intake results in a state of calcium deficiency.

The body has an ability to adapt to a range of calcium intakes by modulating the efficiency of calcium absorption from the intestine. This ability, which serves to protect against both calcium deficiency and toxicity, is one of the best understood of metabolic adaptations to a change in nutrient intake. It involves an increase in the synthesis of parathyroid hormone in response to a small decline in serum calcium concentration caused by a decrease in calcium intake. This hormone effects an increase in the renal synthesis of 1,25-dihydroxycholecalciferol, the active form of vitamin D, which, in turn, enhances the synthesis of a calcium transport protein in the intestinal epithelium that is required for the active absorption of calcium.

The result is an increase in the efficiency of calcium absorption following a reduction in intake that restores serum calcium homeostasis. Whether this mechanism is fully successful in enabling adequate amounts of calcium to be absorbed to maximize skeletal development during growth and minimize bone loss during aging is a question of current interest with respect to the relationship between calcium intake and the incidence of osteoporosis.
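The cascade just described is a classical negative-feedback loop, and its logic can be made concrete in a toy simulation. The sketch below (in Python) is purely illustrative: the constants and response curves, and the function simulate itself, are inventions for demonstration, not physiological data.

```python
# Toy model of the adaptive control of calcium absorption.
# Every constant and response curve here is an illustrative invention,
# not a physiological measurement.

def simulate(intake_mg_per_day, days=60):
    serum_ca = 10.0      # "normal" serum calcium, arbitrary units
    setpoint = 10.0
    efficiency = 0.30    # fraction of dietary calcium absorbed
    for _ in range(days):
        absorbed = intake_mg_per_day * efficiency
        # Toy assumption: absorbing 400 mg/day holds serum calcium
        # at the setpoint; absorbing less lets it drift downward.
        serum_ca += 0.05 * (setpoint * absorbed / 400.0 - serum_ca)
        # A small dip below the setpoint raises parathyroid hormone,
        # which raises renal 1,25-dihydroxycholecalciferol, which
        # induces the intestinal calcium transport protein.
        pth_signal = max(0.0, setpoint - serum_ca)
        efficiency = min(0.70, efficiency + 0.02 * pth_signal)
    return round(serum_ca, 2), round(efficiency, 2)

# Halving intake drives absorption efficiency up, restoring homeostasis:
print(simulate(1200))   # ample intake: efficiency stays near baseline
print(simulate(600))    # reduced intake: efficiency roughly doubles
```

The essential behavior, an inverse relation between intake and fractional absorption around a defended serum level, is the point; the particular numbers are not.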

The Inuit of Northern Alaska and Canada undergo a more rapid rate of age-related bone loss, a risk factor for osteoporosis, than Caucasians consuming the mixed diet (Mazess and Mathur 1974, 1975). The high-protein, high-phosphorus, low-calcium content of the native diet has been implicated in this phenomenon. However, the high rate of age-related bone loss in the Inuit does not appear to be associated with a high incidence of osteoporotic bone fractures.

This may be explained by a difference in bone morphology between the Inuit and Caucasians; the shorter, thicker bones of the Inuit may have a greater weight-bearing capacity than the longer, thinner bones of Caucasians. The rapid bone loss in the Inuit, nevertheless, may have relevance for the high incidence of fractures among elderly consumers of the “Western diet,” which contains about twice as much protein as necessary to meet the protein requirement and an excess of phosphorus arising from a natural association of this element with protein and from the widespread use of phosphate food additives.

Oxidation of the excess sulfur amino acids present in high-protein diets generates hydrogen ions and sulfate (sulfuric acid) that are excreted in the urine. On a high-protein (usually high-meat) diet, the urine consequently is acidic, whereas on a low-protein (usually high-cereal) diet it is near neutrality or slightly alkaline. Unopposed acidification of the renal filtrate decreases the reabsorption of calcium and increases its loss in the urine.

In experiments on adults fed purified proteins, this loss was found to amount to 300 mg per day, indicative of rapid bone loss, at a daily protein intake of 95 g, an amount well within the range of intakes on the mixed diet and low by Inuit standards (Linkswiler, Joyce, and Anand 1974). The protein intake of aboriginal Inuit adults in northwest Greenland has been estimated at nearly 400 g per day (Bang, Dyerberg, and Sinclair 1980). On a high-protein diet composed of normal foodstuffs, however, the decrease in calcium reabsorption by the proximal renal tubules caused by urine acidification is not unopposed.

The increased intake of phosphorus on high-protein diets results in a depression of serum calcium and a consequent increase in the synthesis of parathyroid hormone, which stimulates calcium reabsorption from the renal tubules and thereby counteracts the calciuretic action of excess dietary protein. Whether these opposing effects of dietary protein and phosphorus on urinary calcium excretion are always fully offsetting is not clear. However, it has become apparent that the most important nutrient ratio, from the standpoint of bone homeostasis, is not the ratio of calcium to phosphorus, which has received most attention, but the ratio of protein to phosphorus, which is fundamental to the maintenance of calcium balance on a high-protein diet. Whether the increased rate of bone resorption associated with excess dietary phosphate and protein observed on the Inuit diet and on the high-protein modern diet (Calvo, Kumar, and Heath 1988) results in increased bone loss and risk for osteoporosis is still controversial.
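The 300 mg per day loss measured in the purified-protein experiments conveys why “rapid bone loss” is an apt description. If such a loss were sustained and drawn entirely from the skeleton (an extreme assumption, since part would be met from the diet), then, taking the roughly 1,000 to 1,200 g of calcium in the adult skeleton as a reference:

$$300\ \text{mg/day} \times 365\ \text{days} \approx 110\ \text{g/year} \approx 10\ \text{percent of skeletal calcium per year}$$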

Metabolic Adaptation to Dietary Lipids

Danish studies on nutritional adaptation among the Inuit of northeast Greenland undertaken in the late 1960s and early 1970s led to an explanation for their low incidence of cardiovascular disease and to important innovations in methods for its prevention and treatment in modern societies. These studies revealed a protective effect of the unusual fatty acids present in the fish and marine mammal oils consumed in large quantities by the Inuit. Such oils contain polyunsaturated fatty acids with extremely long carbon chains (up to 22 carbon atoms) and unusually large numbers of double bonds (up to 6), characteristics that confer on them a low melting point that enables aquatic species to maintain membrane fluidity at the temperature of their environment. In contrast, the polyunsaturated fatty acids of cereal oils, the main source of fat in the modern mixed diet, contain up to 18 carbon atoms and up to 3 double bonds.

Another important difference is that the first double bond in most of the highly unsaturated fatty acids in the Inuit diet occurs between the third and fourth carbon atoms (n-3 fatty acids), whereas the first double bond in most of the fatty acids in cereals occurs between the sixth and seventh carbon atoms (n-6 fatty acids). The metabolic significance of this distinction is that the n-3 and n-6 fatty acids are precursors of two distinct groups of hormones, the prostacyclins and the prostaglandins, which have significantly different effects on the metabolism of blood lipids and, therefore, on the risk of heart attack. Prostacyclins reduce the plasma triglyceride (fat) level, as well as the propensity of the blood to clot (and hence the risk of an embolism). A negative effect of the high n-3 fatty acid content of the Inuit diet is its tendency to cause nosebleeds (Fortuine 1971).
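The shorthand used for these acids may help in reading the literature: a fatty acid written C:D(n-x) has C carbon atoms and D double bonds, with the first double bond x carbons from the methyl end of the chain. The principal marine and plant acids at issue include, for example:

$$\begin{aligned} \text{eicosapentaenoic acid (EPA)} &\quad 20{:}5(n\text{-}3)\\ \text{docosahexaenoic acid (DHA)} &\quad 22{:}6(n\text{-}3)\\ \text{linoleic acid} &\quad 18{:}2(n\text{-}6)\\ \alpha\text{-linolenic acid} &\quad 18{:}3(n\text{-}3) \end{aligned}$$

DHA, with 22 carbon atoms and 6 double bonds, is the extreme case mentioned above.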

As a result of these findings, increased consumption of fish has been recommended as part of the Western diet, and concentrates of n-3 fatty acids prepared from fish oil are used in the clinical management of cardiovascular disease. The n-3 fatty acids have become recognized as dietary essentials for both children and adults, and official recommendations have been issued relative to their desirable level of intake (National Research Council, U.S. Academy of Sciences 1989; Health and Welfare Canada 1990). This series of events serves as an example of the value of cross-cultural research as a source of information relevant to the nutritional health of all human societies.

There is little doubt that hypercholesterolemia is a risk factor for heart attacks, and a plasma cholesterol level of 200 mg per deciliter or less has been selected as a desirable goal. Diet recommendations for the prevention of heart disease in modern societies call for a reduction in cholesterol intake to 300 mg per day or less (National Research Council, U.S. Academy of Sciences 1989; Health and Welfare Canada 1990). In light of these recommendations, the rarity of cardiovascular disease among the Inuit (Thomas 1927), whose traditional diet is extraordinarily high in cholesterol, seems anomalous.

Electrocardiographic recordings made on Northern Alaskan Inuit adults in the 1970s (by which time only about 50 percent of their calories were derived from native foods) revealed an incidence of abnormalities only half that recorded in a reference population of U.S. Caucasian adults (Colbert, Mann, and Hursh 1978). Cholesterol intake from indigenous foods in the partially acculturated diet of a cohort of adult Inuit living in the northwest Alaskan village of Point Hope in 1970 averaged 918 mg per day (range 420 to 1,650 mg) (Feldman et al. 1978).

Studies on the epidemiology of hypercholesterolemia carried out between 1937 and 1961 showed “a strong tendency towards normocholesterolemia … among the unacculturated Eskimo groups but increased rates of elevated serum cholesterol among modernized Eskimos” (Feldman et al. 1978: 174). It is noteworthy that the plasma cholesterol levels of the Inuit during that period were similar to those of adults in the general population of the United States, despite a twofold difference or more in cholesterol intake.

The explanation for this discrepancy has been provided by epidemiological and laboratory research on the relationship between the intake of cholesterol in the diet and the level of cholesterol in the plasma. The Inuit experience is consistent with the conclusion of A. Keys, J. T. Anderson, and F. Grande (1965), the originators of the connection between dietary fat and heart disease, and with the observations of subsequent investigators: In the normal, healthy, adult population, the level of cholesterol in the diet has only a minor influence on blood cholesterol.

The current public preoccupation with dietary cholesterol and the promotion of “cholesterol-free” foods is attributable to the seemingly logical assumption that lowering cholesterol intake should lower plasma cholesterol. This is often true in patients with clinical hypercholesterolemia, a condition associated with genetic and various metabolic disorders, including diabetes and obesity. In such patients, the metabolic control mechanism that normally downregulates the synthesis of cholesterol in the body in response to a high intake in the diet is impaired.

This is, however, weak justification for imposing a limit of 300 mg per day on the intake of cholesterol by “the generally healthy population” for whom diet recommendations are issued (National Research Council, U.S. Academy of Sciences 1989; Health and Welfare Canada 1990). The “cholesterol free” label on foods has been used as a “red herring” to attract attention away from the more important characteristics of foods from the standpoint of their effect on plasma cholesterol, namely the amount and composition of the fat they contain.

Current nutrition recommendations in industrialized countries include a call for reduced consumption of “red meat” as a means of lowering the risk of cardiovascular disease. This edict seems paradoxical in view of the reported absence of this disease among Inuit consumers of the native meat diet. It is aimed at the consumption of beef, which bears the stigma (no longer fully justified) of being high in fat and saturated fatty acids and low in polyunsaturated fatty acids. (The term “red meat” itself seems anomalous, since the redder the meat, the lower it is in fat.)

Contrary to the general impression that all animal tissues are high in saturated and low in polyunsaturated fatty acids, the tissues of the fish and of the land and sea mammals that constituted most of the Inuit diet are the reverse. The lipids of seal, whale, walrus, and polar bear meat contain 15 to 25 percent polyunsaturated fatty acids, consisting mostly of fatty acids of the n-3 type (Wo and Draper 1975). In contrast to beef, caribou meat is so low in fat that the Inuit dip it in seal oil to improve its palatability. The lipids of caribou muscle have been found to contain 5.6 percent arachidonic acid and 15.4 percent linoleic acid of the n-6 polyunsaturated fatty acid series.

In contrast, values of 0.5 percent arachidonic acid and 2.5 percent linoleic acid have been reported for beef muscle (Link et al. 1970). Most of the polyunsaturated fatty acids in both caribou and beef muscle are located in the phospholipids of cell membranes. The phospholipids of beef muscle contain a level of polyunsaturates comparable to that in caribou muscle (Link et al. 1970), but in beef these have been swamped by saturated fatty acids as a result of the selection of beef animals for “marbling” with intramuscular fat to increase the palatability of their meat.

Current diet recommendations in the United States and Canada also call for a reduction in fat intake to 30 percent of calories or less as a further means of reducing the risk of cardiovascular disease (National Research Council, U.S. Academy of Sciences 1989; Health and Welfare Canada 1990). The experience of the Inuit, however, indicates that it is the composition of dietary fat, rather than its level in the diet, that is of primary importance in the prevention of heart disease.

This view is supported by the low incidence of cardiovascular disease among consumers of the “Mediterranean diet,” which contains a preponderance of monounsaturated fatty acids derived mainly from olive oil used in cooking. These acids replace more saturated than polyunsaturated fatty acids in the diet, thereby shifting the ratio in favor of polyunsaturates. The role of this ratio is also indicated by an analysis of changes in the composition of dietary fat in the United States from the 1960s through the 1980s, an interval during which the death rate from heart disease declined by about 25 percent (Stephen and Wald 1990).

The data show that there was an increase in the ratio of polyunsaturated to saturated fatty acids in the diet over this period from about 0.25 to 0.50. This increase was due mainly to a decrease in the intake of total fat, which entailed a discrimination against “visible fat” (such as the peripheral fat around the outside of steak) that consists mostly of saturated fatty acids. These observations further indicate that it is the composition of dietary fat, rather than the amount of fat consumed, that is of prime importance in the prevention of cardiovascular disease. Reducing total fat intake, however, is the only practical means of increasing the ratio of polyunsaturated to saturated fatty acids in the modern diet of industrialized countries.
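A simple calculation, using invented but representative percentages of energy, shows how discrimination against visible fat can double the ratio without any increase in polyunsaturate intake:

$$\text{P:S} = \frac{5\ \text{percent of energy as polyunsaturates}}{20\ \text{percent as saturates}} = 0.25 \quad\longrightarrow\quad \frac{5}{10} = 0.50$$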

The ratio of polyunsaturates to saturates in the native Inuit diet is difficult to estimate with accuracy and is highly variable, but it clearly exceeds the ratio of 0.50 calculated for the U.S. diet in the 1980s (Stephen and Wald 1990) and probably exceeds the 1.0 ratio proposed in modern diet recommendations. Inuit diets in northwestern Greenland in 1976 were estimated to have a ratio of 0.84, compared to a ratio of 0.24 in the diet of Danes (Bang et al. 1980). In addition to a reduction in total fat consumption to 30 percent of energy intake, current recommendations call for a 1:1:1 ratio of polyunsaturates to saturates to monounsaturates (Health and Welfare Canada 1990). The protection from cardiovascular disease afforded by the fatty acid profile of the native Inuit diet offers encouragement that implementation of these recommendations will confer a similar benefit on modern societies.

Adaptation to Dietary Sugars

The Inuit of western Greenland and northwestern Alaska are uniquely susceptible to congenital primary sucrase-isomaltase deficiency, a condition presumably related to the absence of sucrose from their traditional diet over many centuries (McNair et al. 1972; Raines, Draper, and Bergan 1973). Its epidemiology is familial, and studies on Greenland Inuit have indicated that it is due to homozygosity for an autosomal recessive gene. Its incidence has been estimated at about 3 percent in northwestern Alaska and 10 percent in a sample of hospital patients and staff in northeast Greenland.
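If the condition is indeed recessive, the reported incidences imply large carrier frequencies. Assuming Hardy-Weinberg equilibrium (an assumption of this illustration, not a finding of the studies cited) and letting q be the frequency of the deficiency allele:

$$q = \sqrt{0.03} \approx 0.17, \qquad 2pq = 2 \times 0.83 \times 0.17 \approx 0.29$$

That is, an incidence of 3 percent implies that nearly 30 percent of the northwestern Alaskan population carries one copy of the allele; the 10 percent figure for the Greenland sample would imply a carrier frequency above 40 percent.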

This deficiency has a major effect on the capacity of those Inuit affected to deal with the modern diet. For example, children are unable to eat an ice-cream cone or drink a bottle of carbonated beverage unless it contains a synthetic sweetener in place of sucrose. In this respect, sucrase deficiency differs from lactase deficiency, in which lactase production declines during growth but persists at a reduced level in adulthood. Sucrase deficiency, a recently recognized cause of diarrhea in Inuit children, is present in acute form from birth.

The options available for dealing with it include replacing sucrose with other sugars (such as invert sugar in honey), using synthetic sweeteners, or taking oral sucrase preparations made from the intestinal juices of animals. Sucrase-isomaltase deficiency appears to be absent among Alaskan Inuit residents of the Subarctic, where the traditional diet has contained sucrose present in fruits and berries.

As in most societies (excepting mainly those of northern European origin), Inuit children undergo a progressive decrease in the intestinal synthesis of lactase during growth to a level that limits the amount of milk and milk products that can be digested in adulthood (Kretchmer 1972). Based on the results of a standard lactose tolerance test—which involves administering 50 g of lactose (the amount present in a liter of cow’s milk), measuring the subsequent rise in plasma glucose, if any, and recording any adverse clinical reactions, such as abdominal cramps or diarrhea—about 70 percent of Inuit adults have been characterized as “lactose intolerant” (Gudmand-Hoyer and Jarnum 1969). However, if given a dose of 10 g of lactose (the amount present in a cup of milk), a large majority of Inuit adults experience no adverse symptoms and, therefore, from a nutritional standpoint, may be regarded as “lactose tolerant.”
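The test doses follow directly from the composition of milk (approximate figures; cow’s milk is roughly 5 percent lactose by weight):

$$1{,}000\ \text{mL milk} \times \frac{5\ \text{g lactose}}{100\ \text{mL}} = 50\ \text{g}, \qquad 200\ \text{mL (one cup)} \times \frac{5\ \text{g}}{100\ \text{mL}} = 10\ \text{g}$$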

With respect to their ability to digest lactose, the Inuit resemble a majority of the world’s population. Alaskan Inuit children routinely consume milk or other dairy products as part of a school lunch program. The breakpoint in lactose tolerance in adults typically falls between 10 and 20 g (Raines et al. 1973); intakes in this range can be repeated after an interval of several hours. Hence, for most individuals, lactose intolerance is not a serious impediment to the acquisition of the calcium and other nutrients present in dairy foods.

Adaptation to the Modern Diet

Historically, the greatest threat to the nutritional status of the Arctic Inuit was famine, triggered by failure to catch a bowhead whale or by the failure of the caribou to run. In such exigencies, seals were the most reliable dietary staple. The Inuit population was kept in balance with its food supply by periodic famines, by a high infant mortality, by an extremely high rate of fatal accidents among adults pursuing the hunting culture, and if necessary, by infanticide. They had no nutrition education and no need for it. Their custom of eating animals almost in their entirety provided them with all the nutrients necessary for nutritional health, assuming their foods were available in sufficient amounts. There were no “junk foods” in their diet. Food not eaten fresh was preserved in the frozen state in the ice cellars that are still prevalent in Arctic Inuit villages. The central nutritional imperative was simply to get enough to eat.

For the first time since they migrated across the Bering Strait several thousand years ago, the Inuit in the twentieth century have been confronted with the necessity of making significant food choices in order to be well fed. Fractionation of primary foodstuffs by the modern food-manufacturing industry has generated multitudes of products of highly variable nutritional quality that must be reassembled in specific combinations to ensure a balanced diet. This necessity has led to the development of a set of rules for selecting proper combinations of foods that are inculcated into children by their parents and teachers and communicated to the public through nutrition education programs. To assist consumers in following these rules, a list of the nutrients that processed foods contain must be put on their labels, including the contribution of one serving to the recommended daily intake of each nutrient. A large government bureaucracy is devoted to ensuring that foods actually contain the amounts of nutrients listed on the label, and another agency is responsible for determining whether claims made for their efficacy in the prevention of disease are valid.

To primitive peoples whose only previous nutritional imperative was to get enough to eat, dealing with the complexities of the modern diet presents serious problems of nutritional adaptation. These problems are reflected in the poor nutritional status of the Inuit and Amerindian inhabitants of the urban centers of Alaska and Canada. The nutritional health of urban Inuit is inferior to that of their forebears, as well as to that of their Arctic contemporaries who still maintain a semblance of the traditional diet and lifestyle (Colbert et al. 1978).

The younger generation of Inuit, like younger generations of most other native minorities, has abandoned the traditional diet almost entirely. The incidence of obesity, hypertension, and cardiovascular disease among Alaskan Inuit follows an increasing gradient from the high Arctic to the Subarctic to the modern cities of the south (Colbert et al. 1978). The incidence of diabetes is also increasing, but is still substantially lower than in Caucasians (Thouez et al. 1990; Young et al. 1990).

The decline in the nutritional status of the Inuit in the second half of the twentieth century has more to do with psychosocial factors than with a lack of nutritious foods. Educational and technical deficiencies, loss of social status in the community, deterioration of cultural values, lack of a sense of community, and discrimination in employment are factors in their poor nutritional status. The modern Inuit are at a watershed in social acculturation, unwilling to revert to their traditional lifestyle, yet unable to cope successfully with the complexities of an industrialized society.

Metabolic Adaptation to Cereal-Based Diets

The cereal-based diets consumed by the inhabitants of Southeast Asia and of Central and South America are in direct contrast to those consumed by the Inuit; that is, they are high in carbohydrates and low in protein and fat, rather than vice versa. Consequently, the bioenergetic transformations imposed by this diet are also reversed. Fat stored in the adipose tissues is formed primarily by lipogenesis from glucose released in carbohydrate digestion, rather than from the fatty acids released in fat digestion. As polyunsaturated fatty acids cannot be synthesized in the body, fat formed from glucose consists mainly of saturated and monounsaturated fatty acids. In contrast to the massive conversion of excess amino acids to glucose on the Inuit diet, most of the limited amounts of amino acids in cereal diets are used for the synthesis of tissue proteins.

In addition to food shortages caused by crop failures, cereal-based food cultures have been subject to epidemic deficiencies of specific nutrients, notably beriberi caused by thiamine deficiency in the case of rice diets and pellagra caused by niacin deficiency in the case of maize diets. These diseases have created a prevalent impression that cereal diets are low in nutritional quality, but, in fact, these afflictions were the result of a disruption of native food cultures by foreign influences or the failure of foreigners to adopt the native culture.

Replacement of brown rice with more prestigious polished rice, a minor food in Europe but the staple food in Southeast Asia, removed most of the thiamine from the diet and resulted in a classic example of so-called cultural malnutrition. The traditional method of preparing maize practiced by the Mexican Indians, which entailed grinding it in lime water, was not followed by the black and “poor white” populations of African and European origin who inhabited the southern United States.

Their maize diet had several nutritional liabilities. The niacin in maize is present in complex forms from which it must be released by some type of hydrolysis, which in the native Indian culture took the form of alkaline hydrolysis effected by grinding with lime water. Further, maize protein is low in tryptophan, an amino acid that can be converted to niacin in the body. Although pellagrins consumed substantial amounts of pork, it consisted mostly of fat (“sow-belly”) that contained little niacin, protein, or tryptophan. The Indian practice of grinding maize in lime water presumably arose out of experience. Not only was it instrumental in protecting them from the ravages of pellagra but it also contributed substantially to their intake of calcium, which is low in cereal grains.

Cereal diets are lower in protein than modern mixed diets and tend to be associated with protein-deficiency diseases, such as kwashiorkor, a severe condition affecting impoverished children during the postweaning period. It is marked by an edema that wrongly gives the impression that the children are of normal weight. When their serum protein level is restored by providing enough dietary protein, the edema dissipates, revealing severe emaciation (marasmus) caused by inadequate energy intake. This condition of protein–calorie malnutrition can be prevented by providing an adequate cereal-based weaning diet, indicating that the main etiological factor is a lack of food, rather than a lack of protein specifically.

When sufficient calories are supplied to prevent the utilization of dietary amino acids for energy production, these acids become available for the synthesis of tissue proteins. Most cereal-based diets, particularly if they contain (as they usually do) small amounts of animal products, such as fish, chicken, eggs, or meat, provide enough protein to satisfy the requirement for this nutrient. Diets based on cassava, a root vegetable with no redeeming nutritional qualities other than as a source of calories, are an exception to this generalization.

Vitamin B12 occurs only in foods of animal origin, which, therefore, must be included in all so-called vegetarian food cultures. Strict vegetarian diets, such as that of the “vegans,” can result in macrocytic anemia and irreversible neurological damage caused by a deficiency of this vitamin. On the other hand, most of the B-complex vitamins and vitamin E are plentiful in the bran and germ of cereal grains. The vegetables and fruits that normally constitute a major part of vegetarian diets provide folic acid, vitamin C, and vitamin K and are good sources of beta-carotene, a precursor of vitamin A in the body. Yet, until programs of vitamin A supplementation were instituted in the mid–twentieth century, vitamin A deficiency was the main cause of blindness among children in Indonesia and other Southeast Asian localities, despite the fact that edible plants capable of preventing it were readily available.

Cereal diets have been of particular interest from the standpoint of mineral nutrition. Phytin, a component of the fiber of cereals (particularly wheat), binds dietary zinc, preventing its absorption and, at high intakes, precipitating a zinc deficiency. Dwarfism among poor Middle Eastern children, long presumed to be of congenital origin, was found to be due to chronic zinc deficiency caused by the consumption of large quantities of unleavened wheat bread (Prasad et al. 1963). Yeast fermentation degrades phytin and, thereby, prevents zinc deficiency, a relationship that evidently escaped recognition in this food culture for generations, even though the relationship between fermentation and the production of alcohol (presumably a higher priority) was known and exploited. Phytin also binds calcium, but to a lesser extent, and the capacity to adapt to a low intake of this element aids in the prevention of calcium deficiency.

Unlike carnivorous diets, which in general contain adequate amounts of essential trace elements to meet human requirements (because these elements are also essential in the diet of food animals), the trace element content of plants often reflects the content of the soil on which they were grown. Keshan disease, an acute cardiomyopathy among children in the Keshan district of China, is caused by an extremely low concentration of selenium in local soils and, consequently, in the cereals grown on these soils. Locally grown plant foods constitute the bulk of the diet in this area. Such deficiencies are less likely to occur on mixed diets, which contain plant foods grown on a variety of soils, as well as foods of animal origin.

The low calcium content of cereal diets is of current interest with respect to the role of this element in the prevention of osteoporotic bone fractures in elderly adults. Paradoxically, the incidence of hip fractures associated with the cereal-based food cultures of Japan and other Asian countries appears to be lower than it is in Western countries where calcium intake, derived mainly from dairy foods, is substantially higher (Fujita 1992). Whether this apparent anomaly reflects a minor role of calcium in the prevention of hip fractures, a high calcium requirement generated by the high-protein Western diet, a difference in lifestyle, a difference in the incidence of falls, a genetic involvement, or the effect of some unidentified factor is a question raised by cross-cultural studies on the epidemiology of osteoporotic bone disease.

Adaptation to Dietary Toxins

Plants did not evolve with the idea that they should be good to eat. In fact, the synthesis by plants of substances toxic to their predators (humans, animals, insects, and microbes) has been a major factor in survival to the present day. Humans and animals have eaten them at their own peril, sorting out those that, based on experience, could be safely added to their diet. A. C. Leopold and R. Ardrey (1972) have developed the thesis that the presence of toxins in plants was an important determinant of the dietary habits of primitive societies.

In addition to natural insecticides and substances poisonous to animals, plants contain a multitude of compounds that make them unsafe for human consumption: hemagglutinins, enzyme inhibitors, cyanogens, antivitamins, carcinogens, neurotoxins, and allergens, among others. Accounts of their occurrence have been given elsewhere (Ory 1981; Lampe 1986). L. S. Gold and colleagues (1992) have argued that nearly all the carcinogens in the diet are of natural rather than—as widely perceived—industrial origin. Although rigorous toxicological testing of food additives on laboratory animals is required before they are approved for human consumption, no such tests have been applied to the natural toxins present in the diet. Broadening of testing for carcinogenicity to include these compounds recently has been recommended (Gold et al. 1992).

Compounds toxic to humans are more likely to be encountered in foods of plant origin because such foods, unlike animal tissues, have not been subjected to a form of toxicological screening through prior consumption by animals. Many plant toxins that are not overtly toxic to animals are at least partially catabolized by the enzymatic detoxification systems present in animal tissues, notably the microsomal mixed-function oxidase system of the liver. Some toxins in plants are destroyed by cooking, but this method of food preparation appears to have been practiced only during the last 20 percent of human existence (Grivetti 1981).

Removal of toxins from plants by selective breeding and food processing has made it possible to add many new foods to the modern mixed diet. In addition, commerce in foodstuffs has diluted the high concentrations of toxins present in plants grown in specific localities. For example, the soil in some areas of the western United States contains high levels of selenium, an element that is accumulated in the seed of wheat and other cereals. When these locally grown grains constitute the bulk of the diet fed to farm animals, they can cause severe symptoms of selenium toxicity and even death. It must be presumed that they also caused at least mild toxicity among the early settlers in the same areas, for whom locally grown grains made up a major part of the diet. Modern commerce in foodstuffs has resulted in dilution of high-selenium foods and removed the risk of toxicity among consumers of the mixed diet.

Numerous substances have been added to the modern diet to improve its nutritional quality and safety. The required listing of these supplementary nutrients and additives on the label of processed foods has created a prevalent impression among consumers that some products consist mainly of synthetic industrial chemicals of questionable safety. In fact, nearly all food supplements and additives are either natural substances or exact replicas of natural substances already present in the diet or in the body (Branen, Davidson, and Salminen 1989). Most cases of food poisoning in modern societies are caused by microbial toxins formed as a result of careless handling and preservation in the home.

The few additives of purely synthetic origin have been subjected to the exhaustive life-cycle testing on laboratory animals required by food safety laws. Two such substances are the lipid antioxidants, BHA and BHT, used to prevent the oxidation of polyunsaturated fatty acids in foods, a widespread problem in the food industry resulting in flavor deterioration and the formation of products that pose a possible health risk. The synthetic sweeteners used in low-calorie beverages were approved only after years, or even decades, of safety evaluation. As indicated, there are several grounds upon which the modern diet can be criticized, but the common impression that it is less safe, in toxicological terms, than so-called natural diets is without foundation. Indeed, the reverse is the case.

The epidemiology of cancer in modern societies shows correlations of uncertain interpretation with diets high in fat and protein, but none with the use of food additives (Doll 1992; Lutz and Schlatter 1992). There are correlations, probably signifying causation, between the presence of certain substances in the diet and a high incidence of specific cancers. These include stomach cancer among Japanese and other Asians who consume large quantities of heavily salted foods, and liver cancer in Madagascans who consume moldy peanuts containing the potent carcinogen aflatoxin B1. There is no current information as to the possible role of natural chemicals in the estimated 35 percent of cancers in industrialized societies that are “diet related” (Doll 1992). W. K. Lutz and J. Schlatter (1992) have pointed out that this figure is “provocatively close” to the prevalence of overnutrition (that is, obesity) in these societies.

There remain a number of toxicological risks associated with the modern diet that are preventable or avoidable: formation of microbial toxins in poorly preserved foods; overdosing or accidental poisoning with synthetic vitamins; ingestion of harmful bacteria, residual plant enzymes, and antinutrients resulting from inadequate cooking; consumption of carcinogenic amino acid derivatives produced by overcooking meats; and ingestion of natural toxicants in such foods as mushrooms. There are also numerous allergens present in foods that seriously affect food selection by individuals but which do not have an important influence on general food cultures. The occurrence of immunological toxicants in foods has been discussed by L. W. Mayron (1981).

Cultural Factors in Nutritional Adaptation

There are many dietary practices that are not explicable in terms of environmental, nutritional, or toxicological (that is, biological) determinants. They are attributable to traditions (religious, societal, and familial) that often distinguish one food culture from another, even when their practitioners share other common influences on food choices, such as food economics, social class, and occupational status. Traditional ceremonies surrounding the procurement of food are still in evidence among some hunting cultures, as exemplified in the ceremony held by the Arctic Inuit on the eve of the annual whale hunt, even though this hazardous undertaking is no longer necessary to meet their nutritional needs.

Religious prohibitions affect the consumption of beef, pork, dairy products, and fish, as well as processed foods that contain substances of animal origin. Adherents to some food cultures would rather starve than consume foods that are a common item in other cultures (for example, dog meat and insects). Food fads, prevalent in modern societies, have a major, but usually short-term, influence on food practices. The role of cultural factors as determinants of traditional and modern food habits is discussed in several reviews and books (Wing and Brown 1979; Grivetti 1981; Axelson 1986; Gordon 1987; Kittler and Sucher 1989).

Notwithstanding the strength of cultural traditions, studies on intersocietal migrants and aboriginal societies undergoing acculturation have shown that changes in food habits can occur relatively rapidly, particularly among the young. A diet survey conducted on the inhabitants of an Arctic Inuit village in the early 1970s showed that about 50 percent of calories were still derived from native foods in the diet of adults, compared to only about 25 percent in the diet of children, even though all foods were eaten at home (Raines and Heller 1978). A similar generation gap in the acceptance of new foods has been observed among Chinese immigrants to the United States (Axelson 1986).

Adaptation to a new food culture often begins with the preparation of new foods by traditional methods and with the selection of commercially prepared facsimiles of traditional foods, such as prepared tortillas by Mexican Americans (Axelson 1986). P. G. Kittler and K. Sucher (1989) compared the traditional food cultures of Native Americans, of the original European settlers, and of recent immigrants from other countries. Further, they traced the changes in native and foreign food cultures that have taken place to the present time. They concluded that the modern mixed diet reflects a strong influence of European food traditions on the dietary habits of all segments of the population, but that there is also clear evidence of an influence of all the major food cultures brought to America, except those brought by blacks. Consumption of traditional Native American foods is confined mainly to Indian reservations. The decimation of the Amerindian food culture is reflected in an extraordinarily high incidence of diseases that are the modern hallmarks of malnutrition in America.

Paradoxically, overnutrition in industrialized societies is more prevalent among the poor than among the affluent. Analysis of the disposal of family income in the United States indicates that expenditures for food are not closely related to income (Popkin and Haines 1981). This finding is attributable to the low cost of food, which makes it possible for even the poor to conserve family income by buying inexpensive food items.

The relationship between the cost of food and its nutritional quality is also weak. The poor nutritional status of blacks and Native Americans is not due in any major degree to an inability to buy nutritious food. It is, more importantly, a reflection of a lack of education that precludes understanding the health risks associated with obesity, a low social status lacking the peer pressure to maintain normal body weight that prevails among the middle and upper classes of society, and incomplete emergence from the agrarian culture of the nineteenth and early twentieth centuries. It was only some 50 years ago that obesity was regarded as a sign of good health and a source of energy that could be called upon in times of ill health. Popular figures in Western social culture (Shakespeare’s Falstaff and Santa Claus, for example) are portrayed as jolly, fat men. It is understandable, therefore, that the opposite perception of obesity has not yet permeated all levels of American society.