Food Labeling

Eliza M Mojduszka. Cambridge World History of Food. Editor: Kenneth F Kiple & Kriemhild Conee Ornelas, Volume 2, Cambridge University Press, 2000.

In the United States, the past three decades have witnessed tremendous changes in the way the public views the foods it buys. Unlike counterparts in the developing world, where problems of food availability and food quantities still dominate, consumers in the United States (and the West in general) have become increasingly interested in the nutritional quality of the foodstuffs they are offered. As a consequence, nutrition labeling has emerged to play a key role in government regulation of the food supply, in informing consumers about the constituents of the foods they eat, and in the formulation and marketing of food products by the manufacturers.

The importance of food labels has come about in spite of the fact that during the last 20 years or so, policy decisions regarding the implementation of nutrition labeling have been made in a political environment that emphasizes nonintervention in the operation of market economies. Recent legislation in industrialized countries has taken, for the most part, a minimalist approach to the regulation of nutritional quality. But although still controversial, labeling is seen as an acceptable “information remedy” that requires relatively little market intervention, and most Western nations have established some form of legislative guidelines for the regulation of nutrition labeling. With the implementation of the Nutrition Labeling and Education Act (NLEA) in 1994 (Caswell and Mojduszka 1996), the United States is currently at the forefront of establishing a mandatory and comprehensive national nutrition labeling policy.

The growing interest in nutrition policy reflects the understanding that foods represent major potential risks to public health because of such factors as foodborne organisms, heavy metals, pesticide residues, food additives, veterinary residues, and naturally occurring toxins. But although these are very real health hazards, scientists believe that the risks associated with nutritional imbalances in the composition of the diet as a whole are the most significant in the longer run, particularly in industrialized countries, where the high percentage of fat in daily diets seems to significantly threaten public health (Henson and Traill 1993). The recognition of nutritional quality as a value in itself has led to a separation of the nutritional from other food safety issues and a growing tendency to develop legislation targeted specifically at nutrition.

The latter, along with a perceived need for nutrition labels, is a reflection of the social changes that are influencing how, where, and when, as well as what, people eat (Wilkins 1994). Perhaps the most important change is the growing number of households that consist of two working parents or a single parent. With such a change in family structure, more meals are consumed in restaurants, more foods are eaten at home that have been prepared elsewhere, and convenience items predominate among foods prepared at home. In addition, the traditional three meals per day are often replaced with ad hoc meals as consumers employ an even greater variety of “heat and eat” foods. As a result, the consumption of traditional staple foods, such as cereals and potatoes, has decreased, and in most industrialized countries, the consumption of meat, milk, and milk products has markedly risen. The percentage of fat in the diets of citizens in industrialized countries has thus grown substantially in the last half century. Whereas regional variations persist, these current trends in eating patterns have generally meant a much greater intake of fats as a percentage of calories, which many view as a major factor in increasing morbidity and mortality from so-called chronic diseases such as cancer and coronary artery disease (Helsing 1991).

Such linkages between diet and health, and the communication of this knowledge to the general public, have led to an increased demand for higher quality, to which producers and retailers have responded by extending the variety of foods offered for sale. The nutritional composition of many foods has improved as well as their labeling; this, in turn, has stimulated an even more intensive marketing of the nutritional attributes of food products, as producers have recognized that an informed public will pay for better nutrition. Thus, as growing numbers of scholarly studies show, nutrition labels have become vital to public knowledge about food and, consequently, to improving public health (Caswell 1991).

The Evolution of Labeling Regulation

Despite the recent importance that labeling has achieved, concerns with labeling food products according to safety and purity have a long history. For example, throughout the medieval and early modern periods, foods of all kinds were identified by origin and grade, and regulatory “marks” on bread go back at least to the reign of Henry III in thirteenth-century Britain. Indeed, they were one of the most common forms of control imposed on food producers in the Middle Ages. Manufacturers also recognized the profitability of such regulation; the German “Brewing Purity Law” of 1516, imposed by the Bavarian court but readily adopted by German brewers, stands as an example that continues to this day as a marketing tool for beer.

The history of food regulation in its current form has its roots in aspects of late-nineteenth-century modernization and industrialization. Scientific chemical experimentation in the early 1800s made possible the sophisticated adulteration of food as well as tests of its quality. The later growth of international markets and transportation networks contributed to a decline of reliance on fresh food and advanced the increasingly impersonal aspects of food markets. The key to such markets was food durability, and as processing capabilities evolved through packaging, bottling, and canning in the nineteenth, and freezing in the twentieth century, so interests in marking and labeling food products by brand name and contents became increasingly important.

Because they presumably carried assurances of quality and consistency in an impersonal market, brand names substituted to some degree for the lack of face-to-face contact in modern retailing, and by 1900 food labels showing brand names were well established in the industrialized Western nations. In the United States, major brands dating from the late nineteenth century include those created by Joseph Campbell, H. J. Heinz, and P. D. Armour. By 1920, producers recognized that brand names were a significant factor in trade and worth protecting legally as the important “public face” of a company’s product line (Wilkins 1994). Beyond displaying brand names that promoted company reputation and increased consumer recognition, food labels sometimes included frequently specious claims about the content, quality, and healthfulness of their products.

During the late nineteenth century, there was a growing government interest in regulating the food supply through inspection and control of such deceptive labeling. To some extent this was a response to varying national standards for the import of meat products, but it was also because of an increasing public recognition of the potential health hazard of processed food. In the United States, public and government concerns with food purity helped propel the outspoken Dr. Harvey Wiley, a chemist from Purdue University, into work on food safety with the Department of Agriculture in 1883. As leader of the Bureau of Chemistry, the forerunner of the U.S. Food and Drug Administration (FDA), Wiley worked on a variety of food- and health-related issues.

The Pure Food and Drug Act of 1906 for which he had labored long and hard (FDA 1993) marked a major step toward labeling regulation and offers a classic example of how interactions between consumers, government, and industry shape the food regulatory environment. Various attempts to pass food and drug legislation had languished in Congress for over 15 years, while food poisoning scandals and investigative reports appearing in the scientific and popular press called growing attention to abuses in the food industry. And public outcry against the unsanitary conditions in the meat-packing industry in Chicago was inspired by the publication of Upton Sinclair’s 1905 novel, The Jungle.

All of this activity encouraged President Theodore Roosevelt to back legislation authorizing federal oversight of the food industry. The provisions of Wiley’s Pure Food and Drug Act, along with an accompanying Beef Inspection Act, authorized the federal inspection of meat-processing plants, established controls on food additives and preservatives, and tightened controls on food labels (FDA 1993). The latter, in a major step forward, were now required to correctly identify the producer so as to facilitate consumer complaints. Food labels were also to honestly list package contents; descriptive superlatives, such as “best” or “superfine,” had to be removed if the manufacturer could not prove the claim. If producers had products tested and registered by the Department of Agriculture, an official label could be used on these packages.

This provision, however, was weakened by the lack of postregistration oversight: Unless a consumer complained, producers could use official labels with impunity, even though product composition might have changed since initial registration. Thus, in a progression that became typical of industry reception of new food regulations, the powerful Chicago “Beef Trust” at first resisted the new legislation but then publicly adopted the federal stamp of approval in an effort to revive public confidence and increase flagging demand for its meat products (FDA 1993).

Other important regulatory measures were adopted in the United States during the following decades. In 1938 a new U.S. Food, Drug, and Cosmetic Act updated the 1906 Pure Food and Drug Act and tightened labeling regulations considerably. In addition to prohibiting statements that were false or misleading, the law prescribed that labels on all processed and packaged foods had to include the name of the food, its net weight, and the name and address of its manufacturer or distributor. A detailed list of ingredients was also required on certain products. The Food, Drug, and Cosmetic Act was subsequently updated, and in 1950, an Oleomargarine Act required clear labels on margarine spreads to distinguish them from butter. In 1957, the Poultry Act authorized the labeling of poultry products.

In 1969, the White House Conference on Food, Nutrition, and Health (FDA 1993) investigated dietary deficiencies in the United States and recommended that the federal government take a more active role in identifying the nutritional qualities of food, which set the stage for the differentiation of food safety from nutrition regulation. The 1970s saw an explosion of public interest in healthful foods, as fears of pesticides, food radiation, and mercury poisoning in fish, on the one side, and the rise of “health food,” organic farming, and consumer watchdog groups, on the other, encouraged a more activist approach to nutrition policy and food labeling.

Recent Labeling Regulation in the United States

The United States, as the world leader in the regulation of nutrition labeling, moved from partial controls in the 1970s and 1980s on voluntary provision of nutrition information (on product labels) to mandatory nutrition labeling in the 1990s. No other Western nation has similar mandatory labeling regulations. This move was, in part, a response to heightened public awareness of nutrition-related health issues in the United States. But it represented, as well, an acknowledgment on the part of federal health agencies that a significant portion of the population had poor diets. At the same time, manufacturers discovered that nutritional claims on packaging labels could be powerful marketing tools.

In 1973 the FDA issued rules requiring nutrition labeling on all packaged foods to which one or more nutrients had been added or on foods whose advertising included claims about their nutritional qualities and/or their importance in the daily diet. But for almost all other foods, nutrition labeling was voluntary. Voluntary nutrition labeling went into effect in 1975; if producers wished to include nutritional information on their packaging they could, but they were required to use a standardized, federally designed label, which remained in use until 1994.

From 1975 to 1984, food health claims for marketing purposes were prohibited; after 1984, however, some claims were allowed, following case-by-case review by the FDA for nutrient content and health claims (FDA 1993). However, this review process was lax, and little enforcement against deceptive claims took place, virtually inviting industry abuses. Examples included Salisbury steaks that were labeled “lowfat,” even though the meat product derived 45 percent of its total calories from fat, and labels on potato chips that concealed high levels of cholesterol-elevating saturated fat.

In 1988, Surgeon General C. Everett Koop released a report on nutrition and health in which, for the first time, the federal government formally recognized the role of diet in the etiology of certain chronic diseases and proposed labeling policies. And after 1989, growing public interest in nutrition, and several publicized cases of label disinformation, led to stricter state and federal scrutiny of nutrient content and health claims on food labels. In 1990, negotiation between the FDA and the United States Department of Agriculture (USDA), often made tense by lobbying on the part of special-interest groups, led to the enactment by Congress of the Nutrition Labeling and Education Act (NLEA) (Gorman 1991). This legislation, which went into effect in 1994, makes nutrition labeling mandatory and strictly regulates nutrient content and health claims. Nutrition labeling, however, remained voluntary for raw foodstuffs (primarily fruits, vegetables, and meats), though grocery stores were allowed to post general nutrition information at the point of sale. Other products exempted from labeling included those destined for consumption away from home, as, for example, in restaurants, hospitals, and other institutions. Small producers, with annual gross sales under $500,000 or food sales under $50,000, were also exempt.

NLEA regulations changed the nutrients that must be listed on the redesigned “Nutrition Facts” panel, which appears on the back side of product packaging. The new list emphasizes fats, sodium, and cholesterol—nutrients that most consumers worry about having too much of rather than too little—and requires that manufacturers indicate percentages of the following macro- and micronutrients on food labels: total calories, calories derived from fat, total fat, saturated fat, cholesterol, total carbohydrates, sugars, dietary fiber, protein, sodium, vitamins A and C, calcium, and iron. Information on the content of these nutrients is presented in quantitative amounts and as percentages of standardized dietary reference values, stated as “Percent of Daily Value.” Other nutrition labeling is permitted on a voluntary basis, including information on calories derived from saturated fat and polyunsaturated fat and listings on soluble fiber, insoluble fiber, potassium, and other essential vitamins and minerals.
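The “Percent of Daily Value” figures are simple arithmetic: the amount of a nutrient per serving is divided by a standardized daily reference amount and expressed as a whole percentage. The sketch below illustrates the calculation using the reference values adopted with the 1993 label regulations (which assume a 2,000-calorie diet); the function name and the example serving amounts are illustrative, not part of the regulation.

```python
# Illustrative calculation of the "Percent of Daily Value" figures that
# appear on the Nutrition Facts panel. The reference amounts below are the
# Daily Values adopted with the 1993 regulations (2,000-calorie diet).
DAILY_VALUES = {
    "total_fat_g": 65,
    "saturated_fat_g": 20,
    "cholesterol_mg": 300,
    "sodium_mg": 2400,
    "total_carbohydrate_g": 300,
    "dietary_fiber_g": 25,
}

def percent_daily_value(nutrient: str, amount_per_serving: float) -> int:
    """Return a nutrient's share of its Daily Value, as a whole percent."""
    return round(100 * amount_per_serving / DAILY_VALUES[nutrient])

# A serving containing 13 g of total fat supplies 20% of the Daily Value;
# 660 mg of sodium supplies 28%.
print(percent_daily_value("total_fat_g", 13))   # 20
print(percent_daily_value("sodium_mg", 660))    # 28
```

The panel thus lets a shopper compare products on a common scale without knowing the underlying reference amounts.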

The percentages given in the “Nutrition Facts” panel are based on standardized serving sizes to help consumers understand and compare the nutritional values of different foods. In fact, according to the NLEA, serving sizes are supposed to be consistent across product lines and close to the amounts people actually consume; the amounts of nutrients labeled are based on the serving or portion size supposedly eaten by an “average person” over the age of 4 and must be given in common household and metric measures. Packages that contain fewer than two servings are considered single-serving containers, and their nutrient contents have to be declared on the basis of the contents of the entire package.

In order to control the manipulation of food descriptors, the FDA has developed consistent and uniform definitions of nine nutrient content claims that expressly or implicitly characterize the level of a nutrient present in a product (such as the Quick Quaker Oats front label, which claims, in eye-catching yellow, that the product is a “lowfat food,” a “good source of fiber,” a “sodium-free food,” and a “cholesterol-free food”). These nutrient content claims are allowed on a voluntary basis, but manufacturers can use them only when a food meets particular nutritional standards. The nine descriptor terms include: free, low, high, source of, reduced, light (or lite), less, more, and fresh. Because these are key advertising terms and typically appear in marketing tag lines on product front labels, NLEA has encouraged producers to either reformulate their products or drop deceptive nutrient descriptors.

Until 1984, a food product making a claim about the relationship between a nutrient and a disease on its label was treated as a drug (FDA 1993). But in a major shift in nutrition regulatory policy, the NLEA, for the first time, allowed food labels to include voluntary health claims, linking a nutrient or food to the risk of specific diseases or health conditions. For example, within carefully defined limits, food labels are allowed to cite links between calcium and osteoporosis, sodium and hypertension, fat and cardiovascular disease, and fat and cancer (thus, the Quick Quaker Oats back label states in part that “[t]his product is consistent with the American Heart Association dietary guidelines” and that “diets low in saturated fat and cholesterol and high in fiber-containing grains … may reduce risk of heart disease”).

Recognizing that labels were of limited value without an informed and educated consumer who could understand them, the FDA has also begun a multiyear food labeling education campaign that involves participation by consumer advocates, industry groups and corporations, and other government agencies. Its purpose is to increase consumers’ knowledge and to assist them in making sound dietary choices in accordance with the USDA-designed “Dietary Guidelines for Americans.” The latter promotes a diet balanced between basic food groups and is usually represented in the form of a food pyramid.

The FDA estimates that in the United States the new food label will cost food processors between $1.4 and $2.3 billion over the next 20 years. But measured in monetary terms, the potential benefits to the public may well exceed the costs; health benefits potentially include decreased rates of coronary heart disease, cancer, osteoporosis, obesity, high blood pressure, and allergic reactions to food (FDA 1993).

Thus, the effects of nutrition labeling go far beyond serving as a shopping aid for consumers. The NLEA is also expected to have a significant impact on the functioning and performance of the food industry because manufacturers will redesign food nutritional profiles to meet changing consumer demands. In other words, for food companies the implementation of mandatory labeling requirements is likely to have important effects on both consumer demand and the marketing of foods (Caswell and Mojduszka 1996).

Nutrition Labeling in Europe

In post–World War II Europe, agricultural and food industry policies were mainly shaped by criteria such as the welfare of industrial workers, economic implications for farmers, national employment, and military security. Although nutrition experts advised Ministry of Agriculture officials in various European countries on the best crops to cultivate to overcome food shortages in the immediate postwar years, health and nutrition issues were rarely central considerations in agricultural planning but instead were seen as primarily relevant to developing countries.

In 1974, however, the World Food Conference stressed the need for all countries to adopt food safety and nutrition policies, regardless of their stage of development. In Europe, Norway was the first industrialized country to adopt such a policy. By the mid-1980s, a number of health departments in European nations, including West Germany, France, and Great Britain, had issued reports on nutrition and diet, and international organizations such as the World Health Organization (WHO) and the Food and Agriculture Organization (FAO)—both offices of the United Nations—also called attention to the role of nutrients in human health in industrialized societies, citing dietary problems associated with oils and fats.

By 1990, seven countries in Western Europe had established food safety and nutrition policies similar to those in the United States before NLEA (Helsing 1991). But although the majority of these programs recognized that information and public education were key components of a national nutrition policy, they stopped short of requiring labeling on all processed food products.

The regulation of food and nutrition labeling in Europe has taken place within the context of attempts to establish a common European market. In 1979, the European Community (EC) issued rules on food safety and consumer protection based on the “Principle of Mutual Recognition,” intended to standardize food health regulations to facilitate international trade. Under this rule, no EC country was to prevent the importation of a food product legally produced and sold in another country, even though the product might not be legally produced according to standards established by the importing country’s own laws. Such a rule eased trade restrictions in the EC but presented risks of lowered food quality and safety in countries with more stringent quality standards.

To address these concerns, the EC expanded regulation in the 1980s with new rules on product definition, quality differentiation, and labeling. Under these directives food labels were required to display information on product characteristics, including quality, quantity, and origin of ingredients. Producers were also prohibited from making health claims that link their product to the prevention or treatment of diseases. EC labeling policy was intended to supply European consumers with adequate information on nutrition and food quality to help them make informed choices in the grocery store without setting up rigid labeling regulations that might place burdens on individual national economies (Worley et al. 1994).

The variety of nutrition policies in Europe underscores the way variations in regional dietary patterns and differing political environments affect governments’ willingness to establish nutrition regulations; this is illustrated by a brief examination of food policy in Norway, Denmark, and Poland. In Norway, government planners chose a deliberately activist nutrition policy, unique in the extent to which it relied on government intervention. Norway’s readiness to adopt interventionist policies reflects in part the relative limits of the country’s domestic agricultural sector, based heavily on dairy and beef production, and its dependence on foreign imports of sugar and grains.

Adopted in 1975, Norway’s “Food Supply and Nutrition Policy” acknowledged the relationship between public-health objectives and food supply planning. The stated aims of the policy included encouraging healthy eating habits in the population, implementing the recommendations of the World Food Conference of 1974, and maximizing the domestic production of food to ensure the stability, security, and quality of the national food supply. Specific dietary changes recommended for the population—to reduce fat intake to 35 percent of total dietary energy, to increase the consumption of cereal and potatoes, to increase the proportion of polyunsaturated fat in the diet, and to increase the consumption of fish—reflected concerns over the dietary level of fat but also posed a potential threat to Norway’s domestic agriculture.

Acting with support from a coalition of industry, consumer, and government representatives, the Norwegian government used a variety of consumer food subsidies (many were already in place) together with agricultural policies intended to slowly raise domestic production of grains while lowering consumer consumption of saturated fats, so as to protect both consumer and industry interests. Information controls forbade misleading advertising, particularly for foods with poor nutritional profiles such as ice cream, chocolate, and soft drinks. Educational programs were also launched and special attention was given to the nutritional content of school courses at the nursery, primary, secondary, and higher levels. Voluntary nutrition labeling of processed foods aimed to better inform the population on the nutritional content of food products. By 1990 fat consumption had dropped to the 35 percent target level and overall agricultural production rose as planned, demonstrating the success of Norway’s combination of interventionist and information policies (Gormley, Downey, and O’Beirne 1987; Helsing 1991).

In Denmark, by contrast, food and nutrition policy came relatively late, and was poorly promulgated and coordinated. Acting in response to consumer interest in diet and health, the government adopted a formal nutrition policy in 1984. Unlike Norway, Denmark’s policy did not formulate specific nutrition policy goals or assign responsibility for coordination of policy activities. Although the government helped establish several research institutes focused on nutrition issues, little attention was given to providing nutrition information to the public. In 1990, Denmark’s nutrient pattern failed to show the same dramatic improvements that had been seen in Norway (Helsing 1991).

Poland’s experience with nutrition policy offers an exemplary case of the problems countries on the geographical and economic periphery of the European Union (EU) face in achieving standards deemed important by the Union. In Poland, as in other Eastern European countries with command economies before 1989, nutrition policy lagged behind and created problems for public health and international trade. Though the right of Polish consumers to truthful disclosure of information about the quality of food products was presumably guaranteed when the Polish government accepted United Nations Organization directives in 1985, advertising became the main—and not very satisfactory—source of information about food quality for most Polish people after 1989.

In April 1995, the Federation of Polish Consumers issued a report on “The State of Realization of Consumer Rights in the Polish Food Market” that called attention to shortcomings in the Polish regulatory system. The report made it clear that despite growing public awareness of nutrition issues, laws then in place that dealt with the nutritional composition of foods and their advertising were not effective because the Polish government lacked the power to enforce them. Thus, for example, it became impossible to fine manufacturers that used deceptive advertising because there was no government office to evaluate the nutritional claims made by advertisements.

An ineffective nutrition policy also affects foreign markets for Poland’s agricultural products, and it would seem that food labeling policy is necessary to make Polish food acceptable and competitive in international markets, as well as with imports reaching Poland. The regulation of nutrition labeling would give Polish manufacturers incentives to quickly come into compliance with higher standards. Such a step would also provide Polish consumers with knowledge to protect themselves from risks posed by foods and save them from buying costly products whose quality is misrepresented. Like other central and Eastern European nations where agriculture and food production are key sectors of the national as well as the export economy, Poland faces the need to put in place effective regulations and enforcement policies that ensure the quality of food products and the reliability and usefulness of information provided on food labels.

An international comparison of nutrition regulation can also underline the importance of food labeling, not only for its value in public education and for encouraging better health but also as an increasingly important component of international trade. The stiffer regulatory environment in the United States poses potential barriers to the import of food from other countries, even those in Western Europe, where regulations are less strict (Worley et al. 1994). As countries in both Eastern and Western Europe and North America transform their regulatory requirements to comply with international trade agreements, informational approaches to food regulation will play a more significant role and will likely lead toward the gradual convergence of international regulatory policies.

This is a slow-moving process, largely because labeling policies evolve in different political climates and in response to different regional diets, but also because food manufacturers typically oppose mandatory nutrition labeling of foods. Changes in national nutrition-labeling regulation have to be carefully negotiated between consumer, government, and industry representatives and involve important policy issues.

Nutrition Labeling and Policy Issues

Since a better awareness of linkages between diet and health has been achieved, governments of many countries have been forced to confront the issue of market intervention in order to regulate food quality. There are two main approaches to the implementation of nutrition policies by central governments. The first, called the minimalist approach, stresses the responsibility of the individual in the attainment of a healthful diet. The role of government is limited to the development and advocacy of dietary guidelines and to the provision and monitoring of information available to the consumer on the nutritional composition of foods. The planned interventionist approach, in contrast, maintains that government should actively intervene in many sectors of the economy to achieve optimum nutrition for all citizens.

With the exception of Norway, however, developed nations have thus far adopted minimalist approaches to nutrition regulation that rely on market forces to ensure the nutritional quality of food products and the nutritious diets of their populations. Governments have many reasons for hesitating to adopt planned interventionist programs. The most common can be grouped under the headings “scientific/philosophical” and “political.” Scientific/philosophical reasons include a lack of confidence in the scientific basis of most nutrition theories and thus a hesitation to intervene in nutrition-related health issues without “absolute” scientific proof. When combined with philosophical beliefs in the sufficiency of the market and a reluctance to interfere with the individual’s freedom of choice, these reasons combine to make a compelling argument for a minimalist approach.

Political reasons for the unwillingness of governments to adopt interventionist policies center primarily on the relative importance of the agricultural and food processing industry sectors in different countries. For example, in nations with economically highly ranked agriculture and food processing industries, like Denmark, the Netherlands, Ireland, and Belgium, politicians may be unwilling to interfere with food availability and choice because of resistance from powerful producers and various lobbies and a lack of a clear support from the public. Public demands for action, however, may encourage elected officials to address such issues (Gormley et al. 1987).

As already noted, though, even the implementation of a minimalist government nutrition policy has been politically controversial, especially in the United States and Great Britain, where the agricultural and the food processing industries were concerned that government efforts to implement dietary guidelines could threaten their survival. In fact, successful implementation of ideal dietary guidelines would imply reduced demand for meat and dairy products (two well-organized producer groups in the United States), eggs, fats, and sweets and increased demand for vegetables, fruits, cereals, and vegetable oils. This would represent a major shift in the diets of the populations and could potentially disrupt the food production sector. Not surprisingly, the regulation of nutrition has often been resisted by meat and dairy producers and other agricultural interests.

Nevertheless, despite the potential economic dislocation feared by advocates of a “hands-off” approach, there are compelling arguments for government intervention in nutrition regulation. Even though a degree of scientific uncertainty exists, most researchers agree that public health is well served by increased knowledge of nutrition and by the availability of nutritionally sound foods. Yet although producers will readily supply food with a positive nutrition profile when it is profitable for them to do so, the higher cost of producing such food is a likely deterrent. Consequently, without some form of regulation to control the truthfulness of label claims, producers who do market higher-quality food face unfair competition from others offering less expensive, lower-quality products.

Following the philosophical assumptions surrounding belief in the “free market,” some economists, policy makers, and food manufacturers have argued that relatively unregulated markets provide adequate information to consumers on the nutritional quality of foods, making government regulation of information disclosure unnecessary. In a perfectly operating market, product price will, theoretically, transmit all the information the consumer needs to make a rational, informed choice. A variety of products with different levels of quality and risks would be offered for sale at a variety of prices; higher nutritional quality would be signaled by higher prices. In a theoretically “free market,” consumers would choose among a variety of foods based on their preferences for products and the attributes of those products (Caswell and Mojduszka 1996).

The actual marketplace, however, is rarely free or perfect in the theoretical sense and typically fails to provide either accurate nutritional information or foods with good nutritional profiles. Manufacturers are better informed about the content and nutritional profiles of their products than consumers. Consequently, if information on the nutritional quality of food is inaccurate, or if public perceptions about the risks or hazards associated with certain food products are incorrect, then consumers are exposed to health risks far greater than they would have been willing to face if they were able to make informed decisions. These factors imply that government should intervene in the marketplace to ensure the provision of nutritious food and, thus, enhance consumer health.

Governments in market economies have three basic instruments that can be used to influence the nutritional quality of the food supply: providing the public with more reliable information about nutrition, enhancing production standards at processing plants, and confronting manufacturers with pecuniary measures, including taxes, subsidies, and fines (Kramer 1990). Because the political and philosophical climate of the last two decades has favored minimalist approaches to market intervention, government agencies have focused mostly on the first of these. Information remedies can take a variety of forms: the mandatory disclosure of information about the nature of products by manufacturers; controls on claims made for product promotion; provision of public information and education; government subsidies for the provision of information; and control of product names. Such measures are undertaken in an attempt to encourage people to demand foods with a high nutritional profile and, as we have already noted, to inspire manufacturers to produce higher-quality foods. The NLEA, as promulgated by the FDA, leans heavily on information remedies.

Policy experts consider information regulation appropriate, and particularly valuable, in the case of nutrition because without it, it is generally impossible for consumers to determine either the health safety or the nutritional quality of food, even after the product has been bought and consumed. Advocates of the minimalist approach also assume that the regulation of information will encourage consumers to purchase more healthful food bundles, in turn compelling producers to improve the nutritional formulation of their products without direct legislative intervention in the production process.

Though the FDA contends that food prices will show little change as a result of mandatory nutrition labeling, reliance on information remedies to regulate food nutrition does impose other costs on consumers by forcing them to spend time reading labels and, of course, learning about nutrition. This can be something of a problem because minimalist information remedies do little to control food advertising beyond limiting false or misleading claims, and even under the relatively stringent NLEA rules, producers and marketers are sometimes able to use vague nutrition-related claims to exert a powerful influence on consumer food choice. Finally, critics of the minimalist approach maintain that its intent is often severely weakened by a lack of formal government commitment to action. For example, educational or enforcement programs may have inadequate funding.

The effects of labeling regulation are difficult to measure. A 1996 survey in the United States showed mixed results: Whereas 58 percent of consumers polled said that they always read the nutrition label before buying a product for the first time (up from 52 percent in 1991), and more than half said they were trying to eat less fat, 46 percent agreed with the statement that “there is so much conflicting information about which foods are good for me that I’m not sure what to eat anymore” (Howe 1996).

Information remedies clearly have limits, and interventionist approaches, despite their current unpopularity, may ultimately offer consumers greater health protection by directly requiring producers to improve the quality of their foods. In a political climate that favors a minimalist approach, however, government adoption of nutrition regulation focused on information remedies can influence product design, advertising, consumer confidence in food quality, and consumer education about diet and health. The implementation of nutrition labeling regulation thus offers real health benefits to the general population and potentially improves both economic efficiency and public welfare (Caswell and Padberg 1992).


Before World War II, rules for information disclosure on food labels were generally understood as one component of more general food safety policies. After the war, nutrition was increasingly seen as an important factor in its own right, and by the 1970s, most developed countries had established nutrition policies that included provisions for consumer information and food labeling. Perhaps ironically, concerns with nutrition are greater in countries where there is abundant food for most citizens and malnutrition is a result of poor dietary habits rather than food shortages. In less developed countries, where nutrition deficits are often the result of a lack of food, regulation of nutrition through labeling may appear superfluous or unnecessary until supply problems are solved; in addition, neither consumers nor governments can afford the relatively high costs associated with labeling programs.

Nutrition labeling is only one of the possible policy tools available to governments as part of a modern program of nutrition and food safety regulation, but in an economic environment stressing free trade and nonintervention, it has become increasingly important. Product labeling and information disclosure requirements act as an interface between government requirements, manufacturer response, and consumer demand that encourages nonpunitive market incentives with relatively limited government interference. Nutrition labeling policies continue to evolve in an increasingly international environment and are now seen as possible nontariff barriers to the flow of international trade. Among the strongest incentives to modify nutrition labeling policy in the twenty-first century will be concerns with the compatibility of regulatory regimes across national boundaries.

Nutrition labels can provide an important benefit to the health of certain groups of consumers, but they also impose costs on consumers, producers, and government. The use of labels can require expensive monitoring programs in regulating industry and educational programs to encourage public awareness; producers may also be forced to invest in product reformulation and new label design. Labels are also something of a double-edged sword and, at times, have been used by manufacturers to misrepresent the nutrition profiles of certain foods in marketing campaigns. In the end, the success of nutrition labeling depends on a number of interacting factors, making effectiveness difficult to measure without broad studies that focus on changes in consumer awareness and dietary patterns, as well as on the introduction of new foods with more healthful nutrition profiles and improvements in the nutritional quality of existing food products.