Alfred E Harper. Cambridge World History of Food. Editor: Kenneth F Kiple & Kriemhild Conee Ornelas, Volume 2, Cambridge University Press, 2000.
To view this topic in perspective, we need answers at the outset to two questions: “What are Recommended Dietary Allowances (RDA)?” and “What purpose do they serve?” RDA are a set of dietary standards; they are reference values for the amounts of essential nutrients and food sources of energy that should be present in human diets. Standards of this type, based on the best available scientific knowledge, are needed by policy administrators, public-health officials, physicians, dietitians, and educators who have responsibility for establishing food and health policy; for providing the public with reliable dietary advice; for planning nutritionally adequate food supplies for large groups of people; and for assessing the adequacy of diets consumed by individuals or populations.
Definition of RDA
RDA for essential nutrients are defined differently from RDA for food sources of energy. For essential nutrients, the RDA values are amounts judged to be high enough to meet the known physiological needs of practically any healthy person in a group that has been specified by age and sex. They exceed average requirements to ensure that few, if any, individuals who are consuming amounts of nutrients equivalent to the RDA will have inadequate intakes. In contrast, RDA values for food sources of energy are the average amounts needed by the members of each group. RDA are not, in themselves, general dietary recommendations, but as standards, they serve as the scientific basis for many aspects of practical nutrition and food and health policy (FNB 1989b).
Nomenclature
The name “Recommended Dietary Allowances” for these reference values was proposed by the Food and Nutrition Board (FNB) of the U.S. National Research Council (NRC) in 1941 when it first established a dietary standard (ADA 1941). As the term “RDA” was more and more widely used, it became evident that its meaning was not well understood. It was frequently used as a synonym for requirements or with the implication that RDA were dietary recommendations for use directly by the public. The response of the FNB was to increase the specificity of the definition in subsequent RDA reports and to include an introductory section explaining the uses of RDA (FNB 1974).
In an effort to distinguish more clearly between reference values and general dietary recommendations, some national and international committees have used other names for their dietary standards. A Canadian committee called its values simply a “Dietary Standard” (CCN 1940). The Food and Agriculture Organization and the World Health Organization of the United Nations (FAO/WHO) have usually entitled their reports “Nutrient Requirements,” with the dietary standards derived from the average requirements being designated as “Recommended Daily Intakes” (FAO/WHO 1970) or “Safe Levels of Intake” (FAO/WHO 1985). The United Kingdom adopted the term “Recommended Daily Amounts of Nutrients” (DHSS 1985). Canada (HWC 1983) and Australia (NHMRC 1990) have used “Recommended Nutrient Intakes.” Recently, the United Kingdom adopted the terms “Dietary Reference Values for Food Energy and Nutrients” and “Reference Nutrient Intakes” (DHSS 1991). These are the clearest and most specific terms proposed. It is hoped that they will be widely accepted and thereby reduce misunderstanding about the meaning of dietary standards.
About 40 countries have established national dietary standards; at least 10 national committees have adopted the term “RDA,” and several others have adopted the FAO/WHO standards (Truswell 1987, 1990). Although definitions of dietary standards differ slightly from one organization to another, and values for some nutrients differ from one national standard to another, the various sets of reference values (regardless of the names used for them) have all been designed for the same purposes. The term “RDA” is used for the most part in this chapter, but the equivalence of this and other terms for dietary standards should be kept in mind.
Historical Perspective
Prehistoric Knowledge of Food and Health
The essentiality of specific constituents of foods for human health and survival was discovered only during the nineteenth and early twentieth centuries. The concept of dietary standards and programs of dietary guidance based on knowledge of human needs for individual nutrients are, thus, of recent origin. However, as food is a basic biological necessity, our early ancestors undoubtedly learned through observation and much trial and error how to identify safe sources of nutriment from among the vast array of natural products available to them.
They discovered that some of these products were toxic and caused illness and even death. At the same time, they learned how to cope with several potentially dangerous foods and other dietary problems. For example, the indigenous peoples of the Amazon Basin found that the poisonous cassava root could be converted into a wholesome, edible product by steeping it in water, and peoples in the Arctic region discovered that scurvy, the debilitating and often fatal disease caused by a lack of ascorbic acid (vitamin C), was prevented or cured if they consumed extracts of evergreen needles (Clark 1968). Thus, the human species learned early to distinguish among natural products that were valuable sources of food, those that had medicinal properties, and those that contained poisons (Harper 1988). Oral transmission of such information can be viewed as the earliest form of dietary guidance.
In recent prehistoric times (12,000 to 5000 B.C.), the domestication of certain plants (especially cereal grains and legumes) and animals (sheep, goats, pigs, cattle) to provide safe and dependable sources of food (Cowan and Watson 1992) was undoubtedly based on this knowledge. Also, observations about the effects of certain foods and other natural products in treating diseases expanded and, by early historic times (3000 to 500 B.C.) in Egypt, Babylonia, India, and China, had become intimately associated with an organized practice of medicine. In all of these civilizations, extensive lists of prescriptions (pharmacopoeias) were compiled, and an immense number of natural materials, especially herbs (some of which have since been shown to contain effective medicinal components) but also some traditional foods, were used as remedies for a variety of ailments.
In Egypt, for example, the onion, a good source of ascorbic acid, was recommended as a cure for a disease resembling scurvy. At the same time, much of the accumulated knowledge about the use of natural products for the treatment of illnesses was subjective and unreliable and was intertwined with mythical and magical beliefs. Major diseases were attributed mainly to supernatural causes and restoration of health to the magical or supernatural powers of the remedy or the healer (Ackerknecht 1982). Apart from improved methods of obtaining a dependable supply of food and increased knowledge of the medicinal properties of natural products, concepts of foods and nutriture during the early historic period differed little from those of the prehistoric period.
Medical Dietetics—Hippocrates to the Renaissance
In Greece, a new attitude toward food and health began to emerge between 600 and 400 B.C. Physicians of the Hippocratic school of medicine in Cos rejected the belief that magical and supernatural influences determined human well-being. They accepted the viewpoint of the Ionian philosophers that the only reliable way to learn about the natural world was through direct observation and logical reasoning. Those physicians described the symptoms of their patients and the course of diseases in great detail with remarkable accuracy, and they recognized the recuperative powers of the body and the importance of food in maintaining or restoring health. They observed: “Growing bodies have the most innate heat; they therefore require the most food for otherwise their bodies are wasted.… In old persons the heat is feeble and they therefore require little fuel” (Adams 1952: 131).
Diet loomed large in both their diagnosis and treatment of diseases. The Greek physicians were concerned with what to feed, along with when, how much, and how often, and developed a complex system of dietetic medicine (Fidanza 1979). Despite their many astute observations, however, they believed that health depended upon the appropriate balance of four mystical humors—blood, phlegm, black bile, and yellow bile—and their associated qualities: warm, cold, moist, and dry. They believed that imbalances among these humors caused diseases (Ackerknecht 1982). They further believed that the basic elements of matter were fire, water, earth, and air—a concept proposed by Empedocles (490 to 430 B.C.) that remained the basis of theoretical chemistry until the eighteenth century (Brock 1993)—and that foods contained only a single source of nutriment: “aliment” (McCollum 1957: 63). These erroneous concepts of chemistry and human biology and reliance on subjective observations without controlled experimental studies prevented the early Greek physicians from achieving an understanding of the chemical nature of foods, the process of nutrition, and objective knowledge of relationships between diet and disease.
The Greek era of medicine culminated with Galen, who in the second century A.D. compiled an extensive record of Greek medical knowledge and his own contributions to it. He used the experimental method to establish that urine was formed in the kidney, not in the bladder, and recognized that food must undergo transformation in the body before it can be used to form tissues. But he, too, was limited by the chemical and biological concepts of the times. Of pertinence to the present topic, however, he codified rules of personal hygiene, of healthful living. He identified health-influencing factors over which the individual could exert some control—air, food and drink, activity, rest, sleep, evacuation, and passions of the mind—and he recommended moderation in all things (Whorton 1982).
Hippocratic and Galenic concepts dominated medicine in Europe until the seventeenth century, but they were gradually discarded as science progressed. The rules for healthful living, however, persisted and were a stimulus for many health movements in the eighteenth and nineteenth centuries (Burnham 1987), probably because they were compatible with the theology of the times in providing a moralistic solution for the sins of gluttony, sloth, and intemperance to which illness was attributed. Present-day dietary guidelines are a modern version of some of these rules. The objective remains the same—guidance for healthful living with moralistic overtones—but is now bolstered by scientific observations.
Knowledge of foods, nutrition, and relationships between diet and health advanced little for over 1,500 years following the death of Galen. With the decline of the Roman Empire and the rise of Christianity, interest shifted from the body to the soul; the social and religious environment was no longer conducive to investigations of natural phenomena. The concepts and teachings of the Greek philosophers and physicians became ossified, and it was only toward the end of the Renaissance that resistance to authoritarian doctrines and a rebirth of the scientific attitude opened the way for a new era of critical inquiry.
The nature of the transition is illustrated by the career of Paracelsus (1493-1541), a Swiss physician, alchemist, magician, and necromancer (Pachter 1951). He rejected the Galenic belief that diseases were caused by an imbalance of humors and proposed that they were caused by improper diet or defective body function. He taught that foods were sources of aliment, medicine, and poison, and asserted that “it is the dose that makes the poison.” Despite his enlightened approach, however, he accepted the magical “doctrine of signatures,” the belief that potential remedies exhibited signs of the conditions for which they were effective—the shape of the leaves of the liverwort, for example, indicated its value for treating diseases of the liver. Thus, reading the efforts of Paracelsus to express his concepts of biological processes without knowledge of basic chemistry is reminiscent of viewing Michelangelo’s unfinished statues, in which the partially formed figures appear to be struggling to break out of the blocks of marble.
The Enlightenment: The Impact of Science on Nutrition
By the beginning of the eighteenth century, the power of the scientific method to expand knowledge of natural phenomena had been demonstrated in the physical sciences through the investigations of Galileo Galilei (1564-1642) and Isaac Newton (1642-1727). The essentiality of a component of air for respiration had been established by Robert Boyle (1627-91) and his colleagues. The mystical elements assumed by the Greeks—Empedocles and Aristotle—and by the alchemists to be the basis of matter were being questioned. The anatomical observations of Andreas Vesalius (1514-64) had begun to undermine the authority of Galen, and William Harvey (1578-1657) made discoveries about the nature of blood circulation that opened up new approaches to the study of respiration (Wightman 1953). Yet knowledge of food and nutrition had advanced little. The Hippocratic belief that the state of health was determined by the balance of humors was still dominant in medicine, as was the concept that foods contained only a single source of nutriment.
There were, however, a few observations of links between foods, diet, health, and disease made during the seventeenth and eighteenth centuries that may be taken as harbingers of progress to come in nutrition (McCollum 1957; Guggenheim 1981). The effectiveness of citrus fruits and fresh vegetables as cures for scurvy was reported on several occasions during the seventeenth century (Carpenter 1988). Investigations of this disease by James Lind (1716-94) in 1753 led some 40 years later to the inclusion of citrus juice in the rations of sailors in the British Navy. Earlier Thomas Sydenham (1624-89), a prestigious London physician, had observed (in the 1670s) that a tonic of iron filings in wine produced clinical responses in anemic patients. Pellagra was noted by Gaspar Casal (1679-1759) to be associated with consumption of a poor diet, and in Germany, C. A. von Bergen noted in 1754 that liver, later found to be a good source of vitamin A, had been recommended in several places as a treatment for night blindness. But although Sydenham’s observations could be taken as early evidence of the essentiality of iron (McCollum 1957), and the formal action of requiring citrus juice in the rations of British sailors as the initial example of a dietary standard (Leitch 1942), they were not recognized as such at the time, probably because success in treating diseases with foodstuffs was attributed to their content of medicinal substances.
The scientific contributions of Antoine-Laurent Lavoisier (1743-94) to our knowledge of the nature of gases, the process of oxidation, and the chemical structure of matter, which built on the discoveries of many of his predecessors and colleagues (Brock 1993), finally opened the way for an understanding of the composition of foods and the functions of individual food constituents in nutrition. After Lavoisier had shown that during combustion, various metals and carbon combined with oxygen and released heat, he and Pierre Simon de Laplace demonstrated (between 1770 and 1790) that respiration by guinea pigs and human subjects was an analogous process and that oxidation of carbon compounds by the body was accompanied by the release of heat (Lusk 1964). Foods, it was now evident, were not just a source of building materials for body substance but were also a source of energy for physical activity and other body functions.
In 1816, François Magendie (1783-1855) concluded, from the results of experiments on dogs fed on diets that contained only carbohydrate or fat, that food was the source of nitrogen for the body. Although his experiments were less definitive than is often assumed (Carpenter 1994), his work led to the recognition that protein was an essential dietary constituent. During this time, the major constituents of foods had been separated and identified and, in 1827, William Prout (1785-1850), a London physician and scientist, proposed that the nutrition of higher animals was to be understood through consideration of the three major food components—carbohydrates, fats, and proteins. By 1850, largely through animal-feeding studies in Europe, particularly those conducted by Jean Baptiste Boussingault (1802-87) in France, five mineral elements—calcium, phosphorus, sodium, chloride, and iron—were known to be required in the diets of animals (McCollum 1957; Guggenheim 1981). The foundation for a science of nutrition had now been laid.
Nutrients as the Basis for Dietary Advice
After these advances, making dietary recommendations based on the chemical composition of foods was a concept finally approaching reality. During the 1830s, Boussingault in France, Gerrit Mulder (1802-80) in Holland, and Justus von Liebig (1803-73) in Germany had all proposed that the nitrogen content of a food could serve as an indicator of its nutritive value. Liebig began to teach that the “plastic” (nitrogen or protein) and “fuel” (carbohydrate plus fat) constituents of foods, together with a few minerals, represented the essentials of a nutritionally adequate diet. During the 1840s, Liebig and his colleagues in Giessen prepared extensive tables for predicting relative nutritive values of foods from their content of protein and energy-yielding constituents.
Mulder probably deserves credit for proposing the first dietary recommendations for a specific nutrient. In 1847, based on studies of Dutch army rations, he recommended 100 grams of protein daily for a laborer and 60 grams for a sedentary person (Todhunter 1954). In selecting a higher value for laborers, he was conforming with Liebig’s erroneous belief that muscle protein was degraded during physical activity and was replaced by proteins from foods. During the 1850s, Lyon Playfair (1818-98), a professor of chemistry at Edinburgh who also accepted Liebig’s hypothesis that muscular work involved breakdown of muscle protein, undertook extensive surveys of the diets consumed by men doing different types of work as a way of assessing protein needs. He took the results of his surveys (that laborers doing heavy physical work consumed much more protein than those who were less active) as support for Liebig’s hypothesis. His results were not surprising. If food is readily available, individuals doing hard work will eat more than those working less vigorously and will therefore consume more protein whether or not they need it. Nonetheless, in 1865 he concluded that healthy adults should consume 119 grams of protein daily in a diet providing about 3,000 kilocalories of energy (Munro 1964).
This approach of basing dietary recommendations on estimates of the amounts and types of foods consumed by a generally healthy segment of the population was continued throughout the century. In Germany, Carl von Voit, a colleague of Liebig, on the basis of surveys of the diets of German workers in different occupations, reached essentially the same conclusions as Playfair, as did Wilbur Atwater (1844-1907), who had worked with Voit and later studied the diets of workers in the United States. But these recommendations were not actually dietary standards; all were based only on observed intakes of active healthy people, not on measured needs.
An exception was one set of recommendations by Edward Smith (1819-74), a British physician and scientist who had done extensive experimental studies of energy expenditure and nitrogen excretion by human subjects. During the depression of the 1860s, he was requested by the medical officer of the British Privy Council to determine the quantity and quality of food that would avoid “starvation-diseases” among the unemployed (Leitch 1942). He concluded from his experimental observations that a nutritionally adequate diet for working men should provide about 80 grams of protein and about 3,000 kilocalories of energy and recommended about 10 percent less for women. His recommendations were the first true dietary standards based on experimental evidence of human requirements (Todhunter 1961).
Throughout the nineteenth century, dietary recommendations were directed exclusively toward meeting needs for protein and energy, in large measure because of the prestige of Liebig, who had asserted strongly that protein and energy-yielding nutrients (fats and carbohydrates), together with a few minerals, were the essentials of a nutritionally adequate diet. This concept was the basis for dietary advice provided to the public by the U.S. Department of Agriculture (USDA) between 1890 and 1905. Yet, as early as 1849, J. Pereira (1804-53), a British physician and Fellow of the Royal Society, had questioned the validity of Liebig’s views on the grounds that diets providing adequate quantities of protein and energy, but lacking variety, were associated with the occurrence of scurvy. Subsequently, several investigators in France, Britain, and Switzerland provided strong circumstantial evidence that foods contained many other unidentified substances essential for health (McCollum 1957; Guggenheim 1981). But it was only in the second decade of the twentieth century that Liebig’s concept of a nutritionally adequate diet was generally acknowledged to be untenable. This came after experiments had demonstrated that modifications of diet or provision of dietary supplements could cure diseases such as beriberi in humans and scurvy in guinea pigs, and that supplements of small amounts of milk or egg could prevent growth failure in rodents fed a diet consisting of purified proteins, carbohydrates, fats, and minerals.
Evidence that several diseases associated with the consumption of poor diets were caused by nutritional deficiencies (MRC 1932) began to affect dietary recommendations toward the end of World War I. A Food Committee of the British Royal Society, besides recommending 70 to 80 grams of protein and 3,000 kilocalories of food energy for the “average man,” stated that every diet should include “a certain proportion” of fresh fruits or green vegetables and that diets of all children should contain “a considerable proportion of milk” to protect against unknown dietary deficiencies (Leitch 1942). In response to this “newer knowledge of nutrition,” the USDA in 1916 developed a food guide based on five food groups and recommended that diets be selected from a wide variety of foods to ensure that both known and unknown nutrients would be consumed in adequate amounts (Hertzler and Anderson 1974).
Between 1925 and 1935, the importance of consuming protective foods was also emphasized in reports of the British Medical Association (BMA) (1933) and a League of Nations committee on nutrition-related policy issues (1937). The objective of dietary recommendations prior to this time had been, first, to prevent starvation and malnutrition during periods of economic depression and, second, to maintain the working capacity of those in the labor force and the army (Leitch 1942). The new recommendations were designed to improve the health of the entire population but were framed only in general terms because knowledge of human requirements for specific nutrients was sparse.
Dietary Standards—A New Concept
During the depression years from 1929 to 1935, E. Burnet and W. R. Aykroyd (1935) prepared a report on diet and health for the League of Nations. In it they emphasized the inadequate state of knowledge of human nutrition and the importance of investigating human needs for nutrients to provide a reliable base for dietary recommendations. An important outcome of the discussion of their report by the Assembly of the League was a statement calling attention to the need for, among other things, public education about the principles and practice of nutrition and the need to establish dietary standards (Harper 1985). This was the first time that a clear distinction was made between recommendations that represented food and nutrition policy and standards to establish that policy.
Between 1920 and 1940, rapid progress was made in advancing knowledge of the newly discovered essential nutrients. Chemical structures of some of the vitamins were determined, the effects of several vitamin and mineral deficiencies were described, and human needs for essential nutrients were investigated (McCollum, Orent-Keiles, and Day 1939; Rosenberg 1945). The knowledge needed for establishing dietary standards was becoming available.
Hazel Stiebling of the USDA, who had been a U.S. representative at meetings of League of Nations committees on food and nutrition problems, proposed the first set of standards for ensuring that diets used in feeding programs during the depression of the 1930s would be nutritionally adequate (Stiebling 1933). This standard for several essential nutrients—vitamins A and C, calcium, phosphorus, and iron—was subsequently expanded to include values for the vitamins thiamine and riboflavin (Stiebling and Phipard 1939). The values were derived from estimates of human requirements (USDA 1939). It is noteworthy that Stiebling and E. F. Phipard made separate recommendations for children of various ages and that they set their recommended intakes 50 percent above the average requirement to allow for variability among the requirements of individuals.
During this time, the League of Nations Technical Commission on Nutrition (1938) published recommended intakes for calcium and iron in addition to energy and protein. The commission did not make quantitative recommendations for vitamins but emphasized consumption of “protective foods”—meat, milk, leafy vegetables, fruits, eggs, organ meats, and fish—as important sources of vitamins and minerals and included specific recommendations for pregnant and lactating women. In 1939, the Canadian Council on Nutrition (CCN) (1940) proposed a “Canadian Dietary Standard” that included reference values for energy, protein, fat, calcium, iron, iodine, and vitamins C and D. Thus, by 1940 the principles for developing dietary standards had been established.
In 1940, at the request of the U.S. government, the NRC established a Committee on Food and Nutrition, which, a year later, was given permanent status as the Food and Nutrition Board (FNB). One of the first actions of the FNB in fulfilling its obligation of advising the federal government on problems relating to national defense was to prepare a set of dietary standards for adequate intakes of essential nutrients. The process by which this was done has been described by Lydia Roberts (1944, 1958), chair of the committee that was assigned this task. The set of values for nine essential nutrients adopted by the FNB, and designated as Recommended Dietary Allowances, was approved by the American Institute of Nutrition (the professional nutrition society of the United States) and subsequently accepted at a national conference on nutrition convened at the request of President Franklin Roosevelt in 1941. The reference values for seven of the nine essential nutrients included in this standard did not differ appreciably from those proposed by Stiebling and Phipard; the other two nutrients, niacin and vitamin D, were new additions.
The RDA publication has been revised nine times since 1941; the tenth edition was published in 1989. The number of nutrients for which RDA or safe and adequate intakes are established has expanded from 9 (ADA 1941) to 26 (FNB 1989b). Changes in the first seven editions have been summarized by Donald Miller and Leroy Voris (1969). The International Union of Nutritional Sciences has compiled detailed information on many national and international dietary standards (Truswell 1983).
The Process for Establishing Dietary Reference Values
The process by which dietary standards are established is described in the various RDA reports and has been discussed by Gordon Young (1964), Alfred Harper (1987a, 1994), Stewart Truswell (1990), Roger Whitehead (1992), and others cited by these authors. It involves appointment of a committee that assumes responsibility for selecting age-sex groups for which RDA are needed, evaluating information about human requirements for nutrients, and then establishing procedures for deriving appropriate recommended intakes from knowledge of the requirements for, and utilization of, nutrients.
Committees on Dietary Standards
In the United States, a committee appointed by the NRC and reporting to the FNB is assigned the task of setting RDA. In most other countries, committees to establish Recommended Nutrient Intakes are appointed by departments of health. The committees usually consist of 8 to 12 scientists with both a broad general knowledge of nutrition and expertise in one or more special areas. In brief, in the United States, individual members or subcommittees prepare background papers on one or more nutrients based on a critical evaluation of the scientific literature on human requirements. These papers are reviewed by the entire RDA committee at several meetings over a period of three years, and problems are identified and resolved, sometimes with the aid of consultants or workshops. After revisions, the final draft of the report is submitted to the NRC for further review and revision before the final version is published.
The international agencies—FAO and WHO—follow a different procedure. Their reports deal with only a few nutrients at a time. One report, for example, may be on energy and protein (FAO/WHO 1985), whereas another may discuss vitamin A, iron, folate, and vitamin B12 (FAO/WHO 1988). The two organizations jointly appoint a committee of 12 to as many as 20 scientists from various nations and a secretariat, consisting mainly, but not exclusively, of members of their staffs, to coordinate and facilitate the work of the committee. Its members (and those of the secretariat) prepare background papers on different aspects of requirements for the nutrients under consideration; they then meet together for about 10 days to discuss and resolve problems and prepare a draft report. This is circulated to the members for comment, after which a subgroup of the committee and the secretariat prepare the final report for publication.
Age-Sex Groups
The first step in setting RDA is to select age ranges and age-sex groups for which separate RDA values are needed and to specify body weights appropriate for each group. Most RDA reports currently include values for some 15 age-sex groups and for gestation and lactation. This number of groups is necessary for two reasons—first, because requirements for nutrients per unit of body weight decline as the growth rate declines and, second, because RDA for essential nutrients are expressed as amounts recommended daily per person in the specified group rather than as amount per unit of body weight. The range of ages included within a group must, therefore, be narrow during early life, when requirements per unit of body weight are high, and during adolescence, when body size is increasing rapidly. Also, requirements per person for females differ from those for males, mainly, but not exclusively, because of differences in body size, so that, except at young ages, separate groups are also employed for the two sexes. The age ranges and weights for groups frequently differ from one country to another, making exact comparisons among values from different reports difficult.
For individuals or groups whose weights deviate considerably from the specified weights, RDA should be adjusted on the assumption that they are proportional to body weight. For those judged to be substantially overweight, adjustment on the basis of lean body mass is thought to be more appropriate. Body weights for the age-sex groups are currently based on information obtained in national health surveys. They are not “ideal” or “desirable” weights. In the United States, the values are the median weights observed for the population (FNB 1989b); in Britain they correspond with the fiftieth percentile for body weight (DHSS 1991).
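As a rough illustration of the weight-proportional adjustment just described, the following sketch in Python scales a reference value by the ratio of actual to reference body weight; the nutrient amount and the weights are hypothetical and are not taken from any RDA report.

```python
def adjust_rda_for_weight(rda_per_person, reference_weight_kg, actual_weight_kg):
    """Scale an RDA-type value on the assumption that the requirement is
    proportional to body weight, as described in the text."""
    return rda_per_person * actual_weight_kg / reference_weight_kg

# Hypothetical example: a reference value of 60 units/day set for a 70 kg
# reference adult, adjusted for a 58 kg individual.
print(round(adjust_rda_for_weight(60.0, 70.0, 58.0), 1))  # about 49.7 units/day
```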
Establishing RDA for Essential Nutrients
The objective in establishing RDA for essential nutrients is to identify a level of intake high enough to ensure that the tissues of virtually all individuals in the population will contain enough of the nutrient to prevent impairment of health even if intake is inadequate for some period of time. The time differs from one nutrient to another, owing to differences in the ability of the body to store and conserve them. To accomplish this objective, the usual procedure is, first, to assemble and evaluate the information on human requirements for the different age-groups that have been studied; second, to estimate the variability among requirements of individuals; and, third, to assess the effects of various factors that influence nutrient needs, such as efficiency of absorption and utilization in the body, the bioavailability of the nutrient from major foodstuffs, and the contributions of precursors of nutrients in the food supply. The extent to which the last group of factors influences RDA differs from nutrient to nutrient and depends upon the nature of the foods consumed.
Estimation of Requirements
Methods that can be used for estimating requirements differ from one nutrient to another, and there is not always agreement as to the most appropriate criteria for establishing when a requirement has been met. Maintenance of a satisfactory rate of growth is accepted generally as an appropriate criterion for establishing that the requirements of infants and young children have been met. The amounts of nutrients consumed by infants growing well on a diet of human milk or a formula of proven quality are generally viewed as satisfactory, but not necessarily minimal, estimates of infant requirements.
With adults, several approaches have been used. In the case of nutrients for which naturally occurring dietary deficiencies have been encountered, the amounts needed to prevent or cure deficiency signs have been estimated both in practical studies and experimentally. These include thiamine, niacin, vitamins A and C, iron, and iodine. Experimental studies in which human subjects have been depleted of a nutrient until they develop signs of deficiency and then are replenished with different amounts until the signs disappear have been used to estimate requirements for ascorbic acid, vitamins A, E, and B6, riboflavin, and folic acid. These methods are based on the assumption that body reserves are adequately restored when deficiency signs disappear. Balance studies have been used to estimate requirements for protein, calcium, magnesium, and zinc. In these the minimal amount of nutrient required to just prevent loss from the body is estimated from measurements of intake of the nutrient and excretion of the nutrient or its metabolites by human subjects whose intakes are increased incrementally.
The body content (usually called the body pool) of a few nutrients, for example, vitamin B12, ascorbic acid, and iron, can be measured directly by isotopic methods; then, relationships among intake, rate of loss, and the development of signs of deficiency can be established. For several others, an indirect measure of the state of body reserves can be obtained from measurements, made while intake is varied, of the rate of excretion of the nutrient or one of its metabolites in urine (niacin, vitamin B6), of the concentration in blood (vitamin A), or of the activity of an enzyme or other protein of which the nutrient is a component (thiamine, selenium).
Confidence in estimates of requirements is increased when there is reasonable agreement between values obtained by two or more methods. With all of these methods, however, an element of judgment is required in selecting the point at which the requirement is met. What, for example, is the appropriate body pool size, blood level, or enzyme activity to equate with an adequate intake, especially when the responses to increasing or decreasing intakes of nutrients are characteristically curvilinear and do not have clear inflection points?
Appearance of deficiency signs is clear evidence of an inadequate intake and, thus, prevention of these signs provides a guide to the minimal amount required. But it is a matter of judgment when it comes to how much the body pool, blood level, or metabolic indicator should be above the value associated with the prevention of signs. As intake of most nutrients is increased, excretion of the nutrient or of its metabolic products increases, and a sharp rise in body losses serves as an indicator that intake is in excess of needs. But as intake of some nutrients (vitamins A and D, for example) is increased, they continue to accumulate to a point at which toxic reactions can occur.
The total number of subjects studied in experiments on requirements and the total amount of information available are less than desirable. For many nutrients, values for intermediate age-groups must be obtained by interpolation from what is known about the requirements of infants and young adults, because few subjects at other ages have been studied. Differences in the judgments of various national committees lead to differences in estimates of requirements. For a few nutrients, such as ascorbic acid and calcium, such differences in judgment are wide. But as a rule, the similarity of the values determined by different committees is impressive.
Individual Variability of Requirements
Individuals differ considerably in their requirements for essential nutrients owing to genetic differences among them. Stiebling and Phipard (1939) originally suggested that an intake of 50 percent above the average requirement should meet the needs of those with the highest requirements. L. B. Pett, C. A. Morrell, and F. W. Hanley (1945) assumed that requirements follow a Gaussian distribution, so RDA values would have to exceed average requirements by about three times the standard deviation of the requirement distribution in order to meet the needs of all individuals in the population. M. H. Lorstad (1971) subsequently proposed that a value two standard deviations above the average requirements (which would cover 97 to 98 percent of the population) would be an appropriate practical dietary standard. His proposal was adopted by FAO/WHO and has since been accepted generally. In view of the tendency of RDA committees to be generous rather than parsimonious in the selection of values for requirements, and of the ability of the body to conserve nutrients if intake is somewhat below the requirement without impairment of health, this would seem to be an acceptable approach.
In practice, for many nutrients there are not enough individual values to permit reliable estimates of the variability of requirements. In the case of nutrients for which the procedure has been applied, requirements follow a Gaussian distribution with coefficients of variation (the standard deviation expressed as a percentage of the mean) of about 15 (10 to 20) percent (Beaton 1994). If it is assumed that requirements for other nutrients follow a similar pattern, increasing the average requirement by about 30 percent should give an acceptable estimate for setting RDA generally (FAO/WHO 1988; Beaton 1991). Iron requirements of menstruating women, which are skewed toward the upper end of the range, are an exception.
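The arithmetic underlying these two paragraphs can be summarized in a brief sketch; the average requirement and the coefficients of variation used here are illustrative assumptions rather than values from any RDA report.

```python
def rda_from_average_requirement(average_requirement, cv=0.15):
    """Derive an RDA-type reference value as the average requirement plus two
    standard deviations, assuming a Gaussian distribution of requirements
    with the given coefficient of variation (SD divided by the mean)."""
    standard_deviation = cv * average_requirement
    return average_requirement + 2 * standard_deviation

# With the commonly assumed coefficient of variation of about 15 percent, the
# reference value comes out roughly 30 percent above the average requirement.
print(rda_from_average_requirement(100.0))        # 130.0
print(rda_from_average_requirement(100.0, 0.10))  # 120.0
```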
Other Considerations in Setting RDA
RDA are the amounts of nutrients that should be present in foods as they are eaten. Losses of nutrients that occur during processing and preparation are not taken into account in setting RDA, but factors that influence the utilization of nutrients in the food ingested are. These include the digestibility of the foods, the biological availability of the nutrient from the food, absorption, and the presence in foods of precursors of certain nutrients. The requirement for protein, for example, is a composite requirement for nine essential amino acids and for a nonspecific source of nitrogen. The requirement for protein, thus, depends upon the digestibility of the protein and the proportions of amino acids it contains. RDA for protein are based on requirements for protein of the highest quality.
Many plant foods contain carotenoids, precursors of vitamin A (retinol) that differ in vitamin A value. To take this into account, RDA for vitamin A are expressed as retinol equivalents. The contributions from the vitamin and its precursors will depend upon the nature of the food supply. The amino acid tryptophan is a precursor of the vitamin niacin, so the need for this vitamin will depend upon the tryptophan content of the protein consumed. Thus, the RDA is expressed as niacin-equivalents.
The biological availability of iron may range from as low as 3 percent for foods of plant origin to 20 percent for foods of animal origin; hence, RDA for this nutrient will depend upon the proportions of plant and animal products in the food supply. Differences in the biological availability of nutrients in national food supplies account for some of the differences in RDA for essential nutrients among countries. There is no evidence to suggest that physiological requirements for different racial groups differ significantly.
RDA for Food Sources of Energy
The RDA for energy, expressed as kilocalories or megajoules per day (1,000 kilocalories equal 4.184 megajoules), are actually requirements for the components of the diet that, upon oxidation in the body, yield energy—carbohydrates, fats, and proteins. Ordinarily the amount of food consumed by most healthy individuals is controlled spontaneously by the body, over time, to an amount that will provide just enough energy to balance the amount expended and maintain body weight relatively constant—often for a period of years. If the RDA for energy were set, as for essential nutrients, at the upper end of the requirement range, it would be a recommendation for overconsumption and, hence, overweight for most people, because intakes in excess of the amount needed are not excreted but stored. The RDA values for energy are, therefore, average requirements for each group, not reference values for each individual in the group.
Energy requirements of individuals can be estimated from knowledge of their resting (basal) energy expenditure, energy expended for physical activity, and energy lost as heat from the stimulation of metabolism after a meal (thermogenesis). Resting energy expenditure varies with body weight, age, and sex. It can be estimated from equations derived empirically from relationships among basal metabolic rate, body weight, age, sex, and height, which are based on measurements of energy expenditures determined directly by calorimetry. Energy required for activity is estimated by assigning each activity a factor that is a multiple of the resting expenditure and multiplying these factors by the amounts of time spent in the different activities (HWC 1990; Warwick 1990). The total of the energy required for activity and the resting energy expenditure, with an allowance for thermogenesis, yields the final value. The amount of food energy consumed daily, averaged over a period of several days, provides a reasonable estimate of the energy requirements of adults who are maintaining a stable body weight.
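A sketch of the factorial calculation described in this paragraph is given below. The resting expenditure, the activity factors, and the allowance for thermogenesis are illustrative assumptions, not the specific figures or equations adopted by any committee.

```python
def daily_energy_requirement(resting_kcal_per_hour, activity_schedule,
                             thermogenesis_fraction=0.10):
    """Factorial estimate of a daily energy requirement.

    resting_kcal_per_hour  -- resting (basal) energy expenditure
    activity_schedule      -- list of (hours, multiple of resting rate)
                              covering the 24 hours of the day
    thermogenesis_fraction -- allowance for diet-induced heat loss
    """
    assert sum(hours for hours, _ in activity_schedule) == 24
    expenditure = sum(hours * factor * resting_kcal_per_hour
                      for hours, factor in activity_schedule)
    return expenditure * (1 + thermogenesis_fraction)

# Hypothetical day: resting expenditure of 65 kcal/hour, 8 hours of sleep
# (factor 1.0), 8 hours of light work (1.7), and 8 hours of other activities (1.4).
schedule = [(8, 1.0), (8, 1.7), (8, 1.4)]
print(round(daily_energy_requirement(65.0, schedule)))  # roughly 2,350 kcal/day
```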
Uses of RDA
The original purpose of RDA was to “serve as a goal for good nutrition”—to act as reference values for planning nutritionally adequate food supplies and diets for large groups of people, and for devising dietary guidance systems based on sound scientific principles for the public. They have since been used in many other ways, some of which they were not designed for and for which they have limitations that are not always recognized. The uses can be separated into two general categories: (1) “standards for dealing with practical applications of nutrition,” and (2) “guidelines for food and nutrition regulations and policies.” This distinction is somewhat arbitrary because practical applications may arise from, or even be the basis for, policy decisions. Uses of RDA have also been grouped in another, more technical, way. Those related to planning food supplies and diets, offering dietary guidance, and serving as standards for food regulations and economic assistance programs have been categorized as prescriptive uses. Those related to assessment of the adequacy of nutrient intake, diets, and food supplies have been termed diagnostic (Beaton 1994). The former classification seems more appropriate for the present purpose, even though the separation between the categories is less distinct.
In using RDA as reference values for different purposes a question arises as to whether they apply equally well to individuals and groups. For energy, the situation is clear. RDA for energy are average values for groups; for applications involving individuals, energy needs of each person must be estimated separately. For essential nutrients the situation is more complex. RDA for essential nutrients exceed the requirements of nearly all persons in the specified groups, so a food supply that meets the RDA should be adequate for an individual and, provided it is distributed equitably, for all members of a group as well. But when RDA are employed as standards for assessing the adequacy of essential nutrient intakes, the answer is less clear. The problems that arise are considered in the following sections.
Uses Related to Practical Applications of Nutrition
Planning Food Supplies and Diets
The RDA are used to estimate the amounts of foods needed to provide large groups of people, such as the armed forces or populations of institutions, with nutritionally adequate diets. For this purpose, information is needed about the proportions of the population in various age-sex categories and about the composition of the available foods. The RDA for energy, being average requirements for each age-sex group, can be used to calculate directly the total amount of energy sources needed, as about half of the individuals in the group should consume more and half less than the RDA. The RDA for essential nutrients are then used as reference values to assure that foods selected and diets formulated provide the requisite amounts of essential nutrients. Allowance must also be made for losses that occur during food preparation. A similar procedure is used by international agencies to estimate the adequacy of both the quantity and quality of food supplies of nations in meeting the nutritional needs of their populations (FAO/WHO 1985).
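A minimal sketch of this group-planning arithmetic follows; the group sizes and the per-person energy values are invented for illustration and are not actual RDA figures.

```python
# Hypothetical age-sex groups in an institution:
# (group name, number of people, average energy RDA in kcal/day per person).
groups = [
    ("children",    120, 1800.0),
    ("adult women", 300, 2200.0),
    ("adult men",   280, 2900.0),
]

# Because the energy RDA is an average requirement for each group, the total
# food energy to be supplied can be calculated directly as a weighted sum.
total_kcal_per_day = sum(number * rda for _, number, rda in groups)
print(total_kcal_per_day)  # food energy to plan for, before preparation losses
```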
It should be emphasized that the objective in planning food supplies and diets is not merely to provide amounts of essential nutrients needed to meet RDA. Diets should be composed of a wide variety of palatable and acceptable foods so that they will meet psychological and social needs and be eaten in the required amounts over long periods of time. In trying to accomplish this, the amount of vitamin A provided, for example, will ordinarily vary greatly from one day to another because relatively few foods are rich sources of this vitamin. Although RDA are specified as daily amounts, this kind of day-to-day variability in the provision of individual nutrients need not be of concern as long as the total amount of each nutrient, averaged over several days, meets the RDA.
Nutrition Education
RDA are employed effectively by health professionals for planning nutritionally adequate diets, but they are not designed for use by the public. For people with limited knowledge of nutrition, RDA for nutrients must be translated into terms of foods. A common way of doing this is first to separate familiar foods into a few groups on the basis of the essential nutrients of which they are good sources, and then, using the RDA as reference values, to estimate the number of servings from each food group that must be eaten daily to meet essential nutrient needs.
This has been the basis for a food guide, developed by Louise Page and Esther Phipard (USDA 1957), designed to explain how to obtain the required amounts of essential nutrients from a specified number of servings selected from four familiar groups of foods. The number of servings recommended for adults provided the foundation of a nutritionally adequate diet with only 1,200 to 1,600 kilocalories daily; the extra energy needed was to be obtained from additional servings of any of a variety of foods.
The number and selection of servings were modified for children. Estimates of the nutrient content of model diets based on this food guidance system demonstrated that the principles underlying it are sound. The system has been widely used in nutrition education programs for teaching the principles of a nutritionally adequate diet. Many modifications have been proposed in order to adapt it for groups with different cultural backgrounds, lifestyles, and incomes (King et al. 1978). More recently, a modified version based on five food groups—the “food pyramid” (USDA 1992)—was designed as a guide for the total diet and for limiting fat consumption.
Therapeutic Nutrition
Although RDA are standards for nutrient intakes of healthy people, they serve equally well as a guide to the nutritional needs of persons with medical problems that do not affect food consumption or nutrient utilization. They also provide a baseline for modifying diets to meet the special nutritional needs of individuals who are ill. Loss of appetite, associated with many types of illness, can result in the wasting of tissues and depletion of body stores of nutrients that must be replenished. Infections of the gastrointestinal tract, disorders that cause malabsorption, systemic infections, injuries, renal failure, certain inborn errors of metabolism, and many other health problems alter nutritional needs in different ways and to different degrees, owing to the impairment of absorption or excretion or to increased metabolic wastage of nutrients. Dealing with the specific nutritional needs resulting from these conditions requires specific clinical and dietary management.
Designing New or Modified Foods
RDA are employed in formulating new food products composed of highly purified constituents, foods for special dietary purposes, or products designed as meal replacements, such as special diets for weight reduction, so as to ensure that these will provide appropriate amounts of essential nutrients. When RDA are used for this purpose, it is essential to remember that they are standards for amounts of nutrients to be consumed daily. In the case of products that are total diet replacements, especially those for weight reduction that provide restricted amounts of energy sources, the daily allotment should contain amounts of nutrients that meet the entire RDA.
Foods for general use should contain essential nutrients in reasonable proportion to the amount of energy derived from the product. The concept of “nutrient density”—the amounts of nutrients in the product expressed per 1,000 kilocalories—provides the basis of a method for accomplishing this goal (Hansen, Wyse, and Sorenson 1979). To provide a set of standards, RDA are expressed as the amount of each nutrient required per 1,000 kilocalories of the energy requirement. The quantities of nutrients in the product can then be adjusted to correspond with the standard.
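A brief sketch of the nutrient-density calculation described above; the nutrient amounts, energy values, and reference figures are hypothetical.

```python
def nutrient_density(nutrient_amount, kcal_in_food):
    """Amount of a nutrient supplied per 1,000 kilocalories of the food."""
    return nutrient_amount * 1000.0 / kcal_in_food

def density_standard(rda_nutrient, rda_kcal):
    """The nutrient RDA expressed per 1,000 kilocalories of the energy RDA."""
    return rda_nutrient * 1000.0 / rda_kcal

# Hypothetical example: a food supplying 1.2 mg of a nutrient and 250 kcal per
# serving, judged against a made-up RDA of 15 mg in a 2,900 kcal diet.
print(round(nutrient_density(1.2, 250.0), 2))    # 4.8 mg per 1,000 kcal
print(round(density_standard(15.0, 2900.0), 2))  # about 5.17 mg per 1,000 kcal
```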
Some shortcomings of this procedure should be recognized. In establishing standards for nutrient density, only a single RDA value is used for each nutrient—usually the value for the age-sex group with the highest requirement. For other groups, the standards therefore deviate in varying degrees from the RDA. Nutrient density values have also been proposed as guides to the relative nutritive values of foods. Caution should be exercised, however, in employing the nutrient density concept to compare the nutritive values of foods. The contributions of foods to meeting nutritional needs depend not just on differences in their nutrient content but also on differences in the amounts consumed. Two foods may have the same nutrient density, but if one is eaten in an amount only 10 percent that of the other, their contributions to meeting nutrient needs will differ by tenfold.
Assessing the Adequacy of Nutrient Intakes—Individuals
The amounts of essential nutrients consumed by individuals can be estimated from records of the quantities of different foods they eat and tables of food composition. The amounts of nutrients consumed can then be compared with the RDA. Unfortunately, comparisons of this type permit few firm conclusions about the adequacy of intakes. Because RDA exceed the requirements of most people, the probability of nutritional inadequacy is obviously remote for those whose intakes equal or exceed the RDA. But because requirements of individuals differ widely, it is not valid to conclude that an intake below the RDA is inadequate. All that can be said is that the further the intake falls below the RDA, the greater is the probability that the intake is inadequate. This is not a shortcoming of the RDA; dietary standards generally are not meant to be used for this purpose. The adequacy of essential nutrient intakes of individuals can be evaluated accurately only by clinical and biochemical assessment, not by comparing intakes with the RDA or with any other standard.
Inability to quantify the degree of nutrient inadequacy from knowledge of intakes below the RDA has, however, stimulated efforts to apply statistical methods (and the risk concept) as an alternative procedure for assessing the adequacy of essential nutrient intakes. Application of probability analysis, using information about requirements (not RDA) and intakes of essential nutrients, has been proposed as a way of obtaining a quantitative estimate of the risk of nutritional inadequacy (FNB 1986; FAO/WHO 1988; Beaton 1991, 1994).
From plots of the cumulative percentage distribution of requirements (assuming they are normally distributed) against intake, the probability that an intake below the highest requirement is inadequate can be estimated; for example, the probability that an intake equal to the average requirement is inadequate is 50 percent. Although a quantitative estimate of the probability (risk) of a particular intake being inadequate is obtained through this type of analysis, it does not establish whether the intake actually is inadequate. But it can help in deciding if an individual should be referred for specific biochemical tests.
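The probability calculation just described can be sketched as follows, assuming that requirements are normally distributed; the average requirement and the coefficient of variation are illustrative assumptions.

```python
import math

def probability_intake_inadequate(intake, average_requirement, cv=0.15):
    """Probability that a usual intake falls below an individual's requirement,
    assuming requirements are normally distributed with the given
    coefficient of variation (SD divided by the mean)."""
    sd = cv * average_requirement
    z = (intake - average_requirement) / sd
    # P(requirement > intake) = 1 - normal CDF evaluated at the intake.
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# An intake equal to the average requirement carries a 50 percent probability of
# being inadequate; an intake about two standard deviations higher (roughly the
# RDA) carries a probability of only 2 to 3 percent.
print(probability_intake_inadequate(100.0, 100.0))  # 0.5
print(probability_intake_inadequate(130.0, 100.0))  # about 0.023
```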
Uses Related to Food and Nutrition Policy
Nutritional standards are needed both for many food and nutrition programs and regulations and to provide a reliable scientific basis for interpreting the nutrient intake information that is employed to establish food and nutrition policy. For some of these uses, RDA are appropriate standards, but for others they are too complex and must be modified, or they are too limited in scope and must be used in conjunction with other standards.
Assessment of Nutritional Adequacy—Populations
Information about the adequacy of essential nutrient intakes in a population is used to identify potential health problems. In assessing the adequacy of intakes in a population, one must take into account not only the variability of requirements but also the variability of intakes. Food and, hence, nutrient intakes of individuals vary widely. Even in developed countries, where the average nutrient intake of the population exceeds the RDA, the intakes of many individuals are found to be less—and some much less—than the RDA (USDHHS/USDA 1986). This may occur for many reasons, such as poverty, illness, and ignorance, but the point is that even if the average intake of a population is in excess of the RDA, there is no assurance that all individuals within that population are meeting their requirements. Moreover, the problem of estimating the proportion of persons who have inadequate intakes remains unresolved.
From the results of dietary surveys of representative samples of large populations, the proportions of the population with different intakes can be determined. Then, with this information about the distribution of intakes, the same type of analysis that is used to assess the risk of inadequacy for an individual can be employed to assess the probable incidence (risk) of inadequate intakes in the population (FNB 1986; FAO/WHO 1988). This problem has been analyzed in detail by G. H. Beaton (1994), who concluded that only if the average intake of a population exceeds the average requirement by more than twice the coefficient of variation of the average intake will the probability of inadequate intakes in the group be low. This raises a question as to whether RDA for groups should be higher than those for individuals. The answer is not clear, but because the problem is important in relation to policy decisions, it is important to be aware of it. The actual prevalence of nutritional inadequacy in a population, as in an individual, can be assessed accurately only through clinical and biochemical observations, not by comparing the average population intake with a standard.
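Applied to survey data, the same probability approach yields a crude estimate of the expected prevalence of inadequate intakes in a group, as the following sketch illustrates; the intakes and the requirement parameters are made-up figures.

```python
import math

def probability_inadequate(intake, mean_requirement, sd_requirement):
    """P(requirement > intake) under a normal distribution of requirements."""
    z = (intake - mean_requirement) / sd_requirement
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical survey intakes for one nutrient (same units as the requirement).
survey_intakes = [70, 85, 95, 100, 110, 120, 130, 145, 160, 180]
mean_requirement, sd_requirement = 100.0, 15.0

# Expected prevalence of inadequacy: the average of the individual probabilities.
prevalence = sum(probability_inadequate(intake, mean_requirement, sd_requirement)
                 for intake in survey_intakes) / len(survey_intakes)
print(round(prevalence, 2))  # fraction of the group likely to have inadequate intakes
```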
It seems not to be sufficiently appreciated, as Beaton (1994) has emphasized, that estimates of the probability of an intake being inadequate are based on the observed distribution of individual requirements around the mean. For assessing the adequacy of nutrient intakes by probability analysis, a table of average requirements included in RDA reports would therefore be more useful than the table of RDA. As all committees use essentially the same information for establishing requirements, a higher degree of agreement might be anticipated in establishing average requirements than in establishing RDA. Publishing both sets of values together would also draw attention to the need for a critical assessment of criteria, both for establishing when a requirement has been met and for evaluating the process used to derive RDA from requirements.
Difficulties in trying to assess the adequacy of nutrient intakes using RDA as standards have also stimulated interest in alternative standards. Separate standards that represent the lower limits of acceptable nutrient intake have been developed in New Zealand, Norway, and Australia. These are to be used only for evaluating the adequacy of nutrient intakes (Truswell 1990). The FAO and WHO (1988) have published separate reference values for “basal requirements” and “safe levels of intake” (essentially RDA) for iron and for vitamin A.
The major problems in evaluating the adequacy of nutrient intakes from dietary information are not resolved by establishing new standards. Conclusions about inadequacy can still be drawn only in terms of probability or risk. Moreover, individuals cannot be identified as having inadequate intakes from dietary assessment alone, but as Truswell (1990) notes, intakes below the lower standard are more likely to be associated with deficiency than are intakes at some arbitrary level below the RDA.
Food Labeling
To provide consumers with information about the nutrient contributions of foods, the U.S. Food and Drug Administration (FDA) established a standard based on the RDA. Using the RDA directly for this purpose would have been unwieldy and complicated; to devise a standard with a single set of values for food labeling, the FDA selected the highest values from among the RDA for 20 nutrients and called these the U.S. Recommended Daily Allowances (NNC 1975). Nutrient contributions of products have been listed on labels as percentages of the U.S. RDA per serving. (The different meanings of the terms “U.S. RDA” and “RDA” have created some confusion.)
There are differing views about the purpose of food labels. According to one, the label should provide information that can readily be related to nutritional needs and thereby serve as a tool for nutrition education; according to another, the main purpose of nutrition information on the label is to enable the purchaser to compare easily the nutrient contributions of different products. Except for the adult male, the U.S. RDA are not useful guides to the nutrient needs of the various age-sex groups for which RDA are set. They do serve effectively, however, for the primary purpose for which they were designed—to provide an easy and simple standard for comparing products by the per-serving percentages of the U.S. RDA that they contain. U.S. RDA are also appropriate standards for programs that fortify foods with nutrients judged to be low in the food supply.
The FDA has recently established new standards and regulations for food labeling. The basic standard for essential nutrients—the Reference Daily Intake—is a population-weighted average of the 1989 RDA values for the age-sex groups beyond 4 years of age. The amount of a nutrient provided by a food product may be included on the label as a percentage of the Daily Reference Value—the amount of the nutrient that should be provided by a diet yielding 2,000 kilocalories. This system will permit simple comparisons of the nutrient contributions of different foods, just as the earlier system did, but the values will not relate closely to the RDA. It is essentially a modified form of the nutrient density concept and, as such, may eventually find a niche in nutrition education programs.
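For the labeling arithmetic itself, whether against the older U.S. RDA or the newer reference values, the per-serving percentage is a simple ratio. The sketch below is not from the source; the helper name and the nutrient amounts and reference value shown are hypothetical.

```python
# Illustrative sketch only: per-serving label percentage relative to a reference
# value (U.S. RDA or Reference Daily Intake). All numbers are hypothetical.
def percent_of_reference(amount_per_serving, reference_daily_value):
    """Percentage of the reference daily value supplied by one serving."""
    return 100.0 * amount_per_serving / reference_daily_value

# Hypothetical example: a serving supplying 18 mg of a vitamin against a
# 60 mg reference value would be labeled as providing 30 percent.
print(percent_of_reference(18, 60))   # 30.0
```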
Food Assistance and Welfare Programs
Most economic assistance programs of government agencies in the United States are direct or indirect food assistance programs. They include programs such as school lunches, supplementary food for women, infants, and children, and nutrition for the elderly. RDA are used as reference values for many of these programs. To assure that foods or meals supplied through these programs will be of high nutritional quality, it is required that they contain specified proportions of the RDA for certain nutrients. RDA are appropriate standards for meeting needs for essential nutrients, but they are not appropriate as the sole standards for planning meals or diets. As mentioned in relation to diet planning generally, planning meals requires, in addition, consideration of palatability, acceptability, and variety of the food supply, so that diets will be eaten and enjoyed over prolonged periods of time.
Meeting RDA has been an important consideration in developing the food plan that has served as the basis for allotting coupons in the food-stamp program, for licensing and certifying nursing homes and day-care centers, and for establishing the poverty level of income. Although RDA are appropriate standards for essential nutrient needs and are employed in conjunction with other standards, their use in this way has resulted in economic assistance programs being viewed as nutrition programs. This may make such programs more politically acceptable, but it also diverts attention from the need for broader and more comprehensive standards for them. It has also led to efforts to make the use of RDA in economic assistance programs part of the basis for setting RDA values, a procedure that, if accepted, would undermine their scientific validity as dietary standards.
Dietary Guidance for the Public
The RDA-based food guide system, employed in teaching the principles of a nutritionally adequate diet in nutrition education programs, was developed originally by the USDA as part of that agency’s response to the policy of providing dietary guidance for the public. Many aspects of such guidance, however, extend beyond the scope of RDA, such as desirable proportions in the diet of carbohydrates, fats, and other sources of food energy; maintaining a healthful body weight; and associations between diet and disease. These topics ordinarily are discussed in only broad terms in RDA publications, and some are addressed in dietary guidelines adopted jointly (as part of current health policy) by the USDA and the U.S. Department of Health and Human Services (USDHHS) (1990), by Canada (HWC 1992), and by many other countries (Truswell 1987).
To comply with current health policy, a modified food guide system has been substituted for the earlier one (USDA 1992). The new system guides selection of the entire diet from five food groups and emphasizes food selections that will reduce fat and cholesterol consumption; the earlier system provided only partial guidance, covering the selection of a basic diet of 1,200 to 1,800 kilocalories from four food groups, to be supplemented with other selections. The new system represents a shift in public dietary guidance in the United States, whereas in Canada the four-food-group system, with its emphasis on meeting essential nutrient needs, has been retained (albeit with new guidelines for reduced fat consumption) (HWC 1992). RDA continue, nonetheless, to serve as the scientific basis for selecting a nutritionally adequate diet in programs of dietary guidance.
RDA and Changing Health Policy
During the past 30 years, emphasis in health policy in the United States and many other developed nations has shifted toward the prevention of chronic and degenerative diseases. This shift has been accompanied by changes in dietary guidance that require a different base of knowledge from that underlying guidance based on the RDA. When the nature of these changes is examined, it is evident that they have implications for the concept of RDA.
Changing Emphasis in Food and Nutrition Policy
Changing Health Status
Health has improved dramatically in the United States during the twentieth century while dietary advice has been based on the RDA (USDHEW 1979). Nutritional deficiency diseases have been virtually eliminated; mortality from infectious diseases has declined sharply; the proportion of infants born who live to age 65 or older has increased from less than 40 percent to 80 percent or more; height at maturity has increased; and life expectancy has risen from less than 50 to about 75 years. These improvements have occurred in association with improved sanitation, diet, and medical care and a higher standard of living. The extent to which they are attributable to better diets cannot be established, but increased infant and childhood survival and increased growth rates and height at maturity could not have occurred unless diets had been nutritionally adequate (Harper 1987b).
A consequence of the high rate of survival of the young to old age, with a stable, then declining, birth rate, is that the proportion of people 65 years of age or older in the population has increased from 4 to 12 percent. Such aging of the population has been accompanied in most developed countries by an increase in the proportion of deaths from cardiovascular diseases and cancer, from about 20 percent of total deaths during the early 1900s to about 70 percent today.
New Directions in Dietary Advice
The emergence of chronic and degenerative diseases as the major causes of death, coupled with the perception that alterations in diet may increase life expectancy by delaying the onset of such diseases (the causes of which are poorly understood), has brought about a striking change in food, nutrition, and health policy. Much greater attention is now being given to dietary guidance for the “prevention” of diseases associated with aging than to advice on how to select a nutritionally adequate diet (USDHHS 1988; USDA/USDHHS 1990).
This change in policy evolved over a period of more than two decades (Truswell 1987), mainly in response to reports of associations between the type and quantity of fat in the diet and mortality from heart disease. Conclusions drawn from these associations and accepted as the bases for changes in health policy have, nonetheless, been controversial (FNB 1980), and skeptical views have been expressed by numerous experts.1 Later observations that high intakes of vegetables and fruits are associated with lower mortality from certain cancers expanded interest in diet modification as a disease prevention measure (USDHHS 1988; FNB 1989a). The underlying bases for these associations have not been established, and claims that certain known plant constituents (such as carotenoids) may be among the effective agents have led to further controversy. Despite this, however, the recommendation for increased consumption of vegetables and fruits is accepted generally, on several grounds, as appropriate nutrition policy.
In 1980, as the culmination of an effort to achieve consensus among conflicting views over the role of diet in delaying the onset of chronic and degenerative diseases, the USDA and USDHHS jointly adopted (and later revised) a set of “Dietary Guidelines for Americans” (USDA/USDHHS 1990). Two of the seven guidelines—“Eat a variety of foods” (a nutritionally adequate diet) and “Maintain healthy body weight”—are the essence of the RDA-based dietary guidance initiated decades ago for variety and moderation in food consumption. The others, recommending a diet low in fat (and especially in saturated fat and cholesterol), with plenty of vegetables, fruits, and grains (providing fiber) and only moderate use of sugars, salt, and alcohol, were proposed as guidelines for reducing the risk of developing chronic and degenerative diseases, as the text of the publication emphasizes.
The purpose of these guidelines is distinctly different from that of guidance based on the RDA. They are uniform dietary recommendations for the entire population, adopted to institute a health policy for the control of diseases that are not products of nutritional deficiencies and to which individuals differ greatly in susceptibility. They deal mainly with foods and food constituents that are not essential nutrients (energy sources, fiber, cholesterol) and are based on information that is beyond the scope of the RDA—indirect evidence of associations that cannot be accurately quantified. RDA, in contrast to these guidelines, are quantitative reference values for intakes of essential nutrients. They are part of the information base employed in developing programs to institute a policy of encouraging consumption of diets containing adequate amounts of nutrients, deficits of which will cause diseases in all individuals in the population. In a recently published revision of Dietary Guidelines for Americans (USDA/USDHHS 1995), although the guidelines themselves remain essentially the same, the text has been modified considerably. Much more emphasis has been placed on dietary guidance for maintaining health and much less on diet as a disease prevention measure. This change has improved the balance between dietary guidance based on well-established knowledge of the effects of essential nutrients (the RDA) and guidance based on evidence, much of it indirect, of probable health benefits from other components of foods (including unidentified substances).
The adoption of a policy that emphasizes dietary modification as a means of disease prevention has led to proposals that the RDA concept be modified to conform with the new direction in health policy (Hegsted 1993; Lachance 1993; IM 1994). To accomplish this, the RDA would be expanded to include all dietary constituents shown to influence susceptibility to chronic and degenerative diseases; some examples are carotenoids (plant pigments, some of which are precursors of vitamin A), cholesterol, and fiber (Lachance 1993, 1994). Moreover, observations that high intakes of certain vitamins (especially those that can function as antioxidants, such as vitamins E and C) may reduce the risk of heart disease and some cancers have given rise to proposals that such information be used in setting values for the RDA (IM 1994). Controversy over issues of this type delayed revision of the tenth RDA report for several years (Pellett 1988).
If the basis for setting RDA is modified to include consideration of the effects of substances (such as fiber) that cannot be quantified accurately, and of medicinal or pharmacological effects of vitamins or other nutrients (which require amounts well in excess of those that can be obtained from diet), the RDA will cease to be reliable reference values for meeting basic physiological needs (Truswell 1990; Harper 1994; Rosenberg 1994). Such a change would represent modification of a standard (the RDA) to conform with policy. The RDA, which are set on the basis of critical scientific evaluation, would become a vehicle for dietary guidance comparable to dietary guidelines, which are established by consensus. To maintain the independence and integrity of sources of information employed in making policy decisions, it is essential that the process of evaluating information used in setting policy be separated clearly from the process used to develop programs to institute policy. Continuing the approach used in most countries of maintaining a clear separation between the process for establishing dietary guidelines and that for setting dietary standards (Reference Nutrient Intakes; RDA) will accomplish this. Separate publication of dietary guidelines also provides an appropriate mechanism for presenting information about the potential health benefits (even tentative ones) of foods and food constituents independently of the RDA. Adoption of the British approach of calling the dietary standards “reference values” would be a further contribution toward maintaining such separation.