DEADLY HARVEST
The Intimate Relationship Between Our Health & Our Food
by Geoff Bond

Chapter 2
The Farming Revolution and Its Consequences

In the last chapter, we explored the lifestyle of our ancient ancestors in our East African homeland and dubbed it the “Savanna Model.” Remarkably, our bodies have not changed significantly since then and, ideally, we would still live and feed ourselves the same way, even today. However, as history will show, things changed. We will now continue our human story by exploring how and why human lifestyles drifted away from this ideal.

Just before the dawn of recorded history, so-called advancements took place that set most of humanity on a path that led firmly away from our naturally adapted lifestyle. This process repeated itself throughout the world. As we will see, most humans were no longer nourishing their bodies to the best advantage.

This chapter puts the spotlight on the major departures from the ideal diet, departures which happened, for the most part, quite unwittingly. As the centuries rolled by, new techniques and new foods came along that led us ever further from our ancestral feeding patterns. From the 19th century, it became obvious that our food supply was not always nourishing populations properly.

It also became apparent that the food supply was increasingly vulnerable to dubious practices. We review how government agencies tried to correct both these matters and why we cannot rely on them to protect the public interest. This will give us a perspective on how we ended up with the food supply that we have.

THE MAJOR UPSET IN HUMAN NUTRITION 
By about 11,000 years ago, the human race had occupied the majority of the world’s land and there was nowhere else to go. The wandering bands of foragers still needed 100 to 200 square miles for each group of 50 people. Nevertheless, as their numbers were still multiplying, these groups came into conflict with one another over territory. They fought each other with increasing ferocity to protect their living space. The archeological evidence is clear: there are traces of Stone Age battlegrounds from Australia to Europe.


Then, a group of foragers stumbled on a solution to the problem. They discovered how to feed themselves on a much smaller area of land. Instead of wandering their territory in search of their next meal, they took control of their food supply. They had figured out a way to survive on 4 square miles instead of 200 square miles. This was a huge innovation for the human race: for the first time, humans stayed in one place and planted. It would not only change the social behavior of these bands forever, it would also cause a dramatic shift in the human diet, not always with positive results.

The place where it all started was the grasslands of what is now Kurdistan in northern Iraq. Historians call this lifestyle upheaval the “Farming Revolution.” It was important enough to mark the end of the Paleolithic Age and the beginning of the “Neolithic,” or New Stone Age, a period defined by the start of agriculture. The next important period—the Bronze Age—would not arrive until humans began using metals several thousand years later. However, for us, it is this agricultural revolution that still influences our life and health today.

The New Stone Age ushered in unforeseen consequences. Because those pioneer farmers were not able to cultivate their usual diet of foraged plants, they found themselves compelled to grow those items that it was possible to grow. For this reason, completely new food groups entered the diet. For the first time in the history of the human race, people started to eat grass seeds. Put plainly like that, it sounds faintly ridiculous, but today we eat grass seeds on a huge scale. Of course, we know them by another name: grains. Those first farmers took the grasses that surrounded their living space and learned to plant, grow, harvest, winnow, and mill their seeds. These grasses were the ancestors of today’s wheat and barley. This dietary change marked a massive upheaval in human nutrition. It was the first step—but a major one—away from our ancestral diet. We are only now beginning to understand the severity of the consequences.

Cereal and Legume Defined

Cereal is just another word for grain; cereals are all varieties of grasses. Legume, also known as “pulse,” is the collective term for lentils, chickpeas (garbanzo beans), peas, soybeans, peanuts, and similar seeds. Some authorities, such as the United States Department of Agriculture (USDA), use the term dry beans. Legumes are part of a very large family called Fabaceae, the pea family.

About the same time, the first farmers also discovered how to grow lentils and garbanzo beans (chickpeas). In this way, 11,000 years ago, and for the first time, we humans started to consume two completely new food categories: grains and legumes. It was the first demonstration that humans could harness nature. However, as we shall see, nature can play tricks on us. Just because we can consume something without an immediate negative reaction, it does not always mean that we should consume it. Indeed, humans can train themselves to eat almost anything. It is one of the lessons that we shall learn: contrary to what most of us fondly imagine, we cannot trust our instincts to tell us what to eat.

A TECHNOLOGICAL REVOLUTION AND A SHIFT IN SOCIETY 
The implementation of basic farming skills meant that other inventions had to follow. Farmers had to devise baskets, fences, hoes, and sickles; they had to build storage silos and houses. Humans found themselves on an exciting, yet demanding, treadmill of creation, manufacture, and construction.

All these pressures made for a huge change in human activity. Humans had exchanged the mobile, instinctive, and day-to-day existence of the forager for the responsibilities of the structured, disciplined, and productive life of one who farms and processes food. A remarkable adjustment had to be made: evolution had equipped humans with a mentality for survival in the savanna environment. Fortunately, some of the same qualities, such as ingenuity, fortitude, and persistence, could be pressed into service to make a success of this new existence. On the other hand, humans are not by nature tidy or given to planning for the future or to organizing large groups. These first farmers had to learn, the hard way, the skills to manage themselves on a larger scale and to make provision for the future.

Farming fixed people in one place, so they created the first permanent villages. This marked one of man’s most important shifts from animal clan-like life. The density of their populations increased vastly. As foragers, an individual would rarely see a group of more than 50 people; in a whole lifetime, he or she might encounter no more than 400 strangers. Today, we come across just as many on a single visit to the shopping mall.

There was a second, quite unexpected consequence that changed forever the way human society is organized. As foragers, humans lived day-to-day and hand-to-mouth. They gathered what they needed for the day and consumed it, then they repeated the process the next day. Everybody took part in the procurement of food—if not, they starved. With the advent of farming came a radical change: farmers had to produce food in advance of requirements and store it. This enabled the production of food surpluses. In a very short time, these surpluses were used to support artisans whose skills only indirectly helped food production. Here was the start of the “division of labor,” where individuals specialized in just one activity, such as the making of tools, baskets, or bricks.

The farming life was in many ways more insecure than the foraging one. The stores required protection from pilferers and bandits, so warrior castes arose. The total dependence on a successful harvest required the gods to be placated, so priestly castes came into being. As intermediaries between the people and the gods, the priesthoods in turn developed ever more complex rituals, sacrifices, taboos, and superstitions.


In forager societies, the barter system is well developed. Humans are very good at keeping a record in their heads of who owes what and to whom: they keep score and make sure that they leave no obligation unreturned and that nobody cheats or gets a free ride. It is easier to police this in a forager society. They know everybody with whom they are dealing and many of them are direct relatives. There is a high degree of trust. However, in these new, densely populated farming societies, this delicate balance breaks down. With the division of labor, the number of transactions multiplies. Farmers, tradesmen, artisans, and all the different occupations have to make deals with each other all the time—just to stay fed, get raw materials, and trade finished products. Furthermore, they are less likely to know each other or to have mutual kinship ties.

There was an urgent need to keep records of who did what—and owed what—to whom. This led to another revolution in human society: the invention of writing and numbering. In this way, an intellectual class of scribes and bookkeepers came into being. It is from this time that we have the first written records, or “history.” (Everything that happened before this time is known as “prehistory.”) Barter of goods was an unwieldy and inflexible way of trading on this scale, so money was invented. With that came special classes of financiers, money lenders, and accountants. With the multiplication of transactions between people who did not know each other, lawyers and judges were needed to draw up contracts and resolve conflicts. To manage these complex societies, a class of bureaucrats came into being. All these specialized groups were fed from the food surpluses.

The food reserves were a fabulous source of wealth quite unknown to hunter-gatherers. With concentration of knowledge in just a few hands, some people were able to commandeer an unequal share of these resources for themselves and their relatives. In this way, potentates, priests, and merchants accumulated vast amounts of power and wealth.

Dating the Earliest Farming Sites

The dates we give for the start of farming are the best estimates available. Exploration of ancient sites is ongoing and new sites could yet be discovered that push the earliest dates back in time. In addition, it is possible that improved dating technology will cause minor revisions to currently accepted dates. However, for our purposes, we are only interested in seeing the general pattern of when farming came about. It does not really matter exactly when it happened or, indeed, which farmers were the first. The essential points are that, in the grand sweep of human history, it is a recent occurrence and that farming started in certain localities and not in others.


In due course, many villages grew into cities; some cities became the centers of great empires such as Sumer, Babylon, and Egypt. With this evolution came a completely new way of organizing society. By necessity, and in a great many ways, the new society was at variance with our naturally adapted Savanna Model society. This dislocation affects us even more today, leading to all kinds of unwarranted stress and psychological disturbances.

The people who adopted farming had, unknowingly, grasped a tiger by the tail. Their population densities had grown well beyond the point where they could return to a simple forager existence. So, all peoples who depend on farming (and that includes us) have to put up with its inherent drawbacks. However, until now, we have not even realized the extent to which our lives are affected by this lack of harmony with our savanna-bred natures.

THE SPREAD OF FARMING
As biologist-historian Jared Diamond points out, only a few varieties of plants in the world lend themselves to being farmed and the farmer had little choice but to focus his efforts on those few [1]. This practical reality greatly reduces the variety of foods eaten. So it was for our first planters in Kurdistan. Instead of consuming plants from the hundreds of wild, foraged species, the farmers’ diet was now limited mostly to just four farmed species—wheat, barley, lentils, and beans. As the centuries rolled by, farmers gradually domesticated some fruits (such as apricots and apples) and vegetables (such as onions and leeks), but they remained a tiny part of the diet.

The farming techniques developed in Kurdistan spread rapidly to neighboring areas. Within 1,000 years, farming was practiced in the plains between the Tigris and Euphrates (in present-day Iraq) and eastwards to western Persia (Iran). Farming spread westward into areas of present-day Syria, Turkey, Lebanon, Jordan, Israel, and finally the Nile valley of Egypt. This whole region was the scene of the struggles and migrations of some of the earliest known peoples, including Sumerians, Assyrians, Akkadians, Semites, Babylonians, and Phoenicians. On a map, the area traced out looks rather like a French croissant (a crescent shape). In 1916, an American Egyptologist at the University of Chicago, James Henry Breasted, coined the term the Fertile Crescent, which became the byword for the cradle of farming (see map on pp. 10-11). Farming was also quickly taken up even further to the west, in Cyprus, Crete, and Greece, and in India to the east.

As time went by, other groups of people, quite independently, discovered how to farm grains using whatever resources were locally available. The Chinese began with millet about 7500 B.C. and moved on to rice about 1,000 years later. From there, rice cultivation spread to Burma, Indo-China, and India. Rye, which grows well in cold climates, was first harvested 3,000 years ago when agriculture spread to northern Europe. Oats came along only 1,000 years ago, also in northern Europe.


Recent Origins of Breakfast Cereals

The modern commercial concept of corn as a breakfast food originated in the vegetarian beliefs of the American Seventh-Day Adventists. In 1906, a Seventh-Day Adventist named Will Kellogg founded a company to make “Corn Flakes” for this niche market. Then, in the late 1950s, came a remarkable example of how smart advertising can dramatically change a nation’s eating habits. A new marketing campaign promoted “breakfast cereals” so persuasively that consumption skyrocketed. In just a generation, they became the chief food of choice at breakfast for an entire nation. Progressively, governments have required the cereals to be fortified (or, as the cereal companies prefer, “enriched”) with an ever-lengthening list of vitamins and minerals.

The Indians of Mexico were the first to cultivate corn (maize) 7,000 years ago. By ingenious selection of the best varieties, they gradually bred it from a normal grass seed into the much larger and plumper cob that we know today. Columbus brought corn back to Spain and it spread to similar climates in the Old World.

In the United States, the main communities cultivating corn were those living close to their Mexican counterparts in the Southwest. It was not until 200 A.D. that corn spread out from that area and then only to the Indians on the eastern seaboard, such as the Iroquois. Even so, it was regarded as a minor crop. Most of the other Indians of the United States—the Apache, Comanche, Sioux, Cheyenne, the Cahuilla in the south, and the Chinooks in the north—were hunter-gatherers. After the arrival of European farmers to America, wheat, not corn, was the main crop planted for human consumption. It may come as a surprise to learn that in the United States corn did not become a big item of human consumption until the 1950s. Until then, Americans only consumed corn in a minor way in the form of popcorn, corn on the cob, and hominy; corn’s main use was to fatten cows and hogs.

We have so far focused on grains because they were the storm troopers of the farming revolution. As the centuries rolled by, many more foods were brought into production (and others abandoned). In the next chapter, we will look at how these new foods were introduced and the consequences (for better or for worse) of human consumption. In the meantime, let us note that it took a long time for farmed products to become common around the world.

The peoples of ancient Kurdistan (northern Iraq) happened to be the first to develop farming, but as we have seen, later and quite independently, cereal farming was invented in China and Mexico. However, not all farming started with cereals: the Incas of Peru began with potatoes (5,500 years ago) and only later moved on to a grain-like seed called “quinoa.” The Indians of the eastern United States first cultivated the sunflower for its seeds 4,500 years ago. The tuber of a related sunflower species (we know it as the Jerusalem artichoke) was also eaten.

THE INDUSTRIALIZATION OF FARMING
Over the centuries, farmers gradually improved their techniques. Irrigation was an early innovation practiced by Sumerians and Egyptians alike: it improved yields and removed much of the uncertainty of unpredictable rainfall. Farmers learned to maintain soil fertility in several ways. They would plant a field with a different crop each year, and in one of the years they would leave the field unplanted and allow nature to replenish soil nutrients, a process known as “lying fallow.” The Romans knew that alternating a leguminous crop with a cereal crop improved the quality of the latter, though they did not know why. We now understand that legumes put an important nutrient back into the soil—nitrogen.

Farming percolated outwards from these centers, often by conquest, to neighboring territories, but the process was not always rapid. For example, the Celts, Germans, Anglo-Saxons, and Scandinavians did not farm until 2,500 years ago, a mere 100 generations past. Indeed, up to the present day, there are still a few non-farming populations: isolated forager bands of San Bushmen (southern Africa), Aborigines (Australia), Hadza (Tanzania), the fierce Sentinelese (Andaman Islands), and Aché (Paraguay) have escaped efforts to corral them into fixed hamlets and farms.

Farming always began with plants. However, where suitable animals existed, their domestication quickly followed. In the Fertile Crescent, sheep and goats were soon farmed. The same happened in China (pigs), Mexico (turkeys), and Peru (llamas and guinea pigs). The types of plants cultivated and breeds of animals raised were specific to the locality. But the plants and animals of the Fertile Crescent are the ones that spread to Europe and came to dominate the Western food supply until the late Middle Ages (around 1300 to 1500).

Farmers learned that spreading farmyard manure on the land improved the quality of the crop. Farmers were great naturalists: they watched out for the best growing plants and selected their seeds for the next planting. In this way, they developed varieties that possessed more desirable qualities: for example, they resisted disease better, had better yields, or were easier to harvest.

Yield
The term yield simply means the amount of crop that is produced by a given area of land. It is often measured as bushels per acre. (A bushel is about 9.3 gallons.) A good yield for wheat is 50 bushels per acre; for corn, it is 130 bushels per acre.
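For readers more used to metric units, here is a minimal calculation sketch (in Python) that converts these yield figures into tonnes per hectare. It assumes the standard U.S. test weights of roughly 60 pounds per bushel for wheat and 56 pounds per bushel for corn; actual weights vary with moisture and variety, so treat the results as rough figures only.

    # Convert yield in bushels/acre to tonnes/hectare.
    # Assumed test weights (standard U.S. values; they vary in practice):
    #   wheat ~60 lb per bushel, corn ~56 lb per bushel.
    LB_PER_KG = 2.20462
    ACRES_PER_HECTARE = 2.47105

    def to_tonnes_per_hectare(bushels_per_acre, lb_per_bushel):
        kg_per_acre = bushels_per_acre * lb_per_bushel / LB_PER_KG
        return kg_per_acre * ACRES_PER_HECTARE / 1000.0

    print(to_tonnes_per_hectare(50, 60))   # wheat: about 3.4 t/ha
    print(to_tonnes_per_hectare(130, 56))  # corn: about 8.2 t/ha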


The earliest farmers used hoes to till the ground. But as soon as they had domesticated cattle, oxen were available as a source of power. So, some ingenious person invented the first plows. They were already in use 5,000 years ago in present-day southern Iraq. The technique quickly spread to everywhere in the Fertile Crescent, including Israel and Egypt. The earliest known use in China is more recent, about 2,500 years ago. This basic plowing technique hardly changed for several thousand years although there were gradual improvements: more efficient plows were devised and draft animals became bigger and more powerful. Most farming centers followed this pattern, but in the Americas, no suitable animals were available, so the Aztecs and Incas continued to cultivate by hand.

The first farmers had to grind their cereal grains into flour. They did this with a device called a quern, which consisted of a flat stone with a rounded stone on top. A few grains were put between the two stones and someone pushed the rounded stone backwards and forwards to pulverize the grains into coarse flour. By Roman times, the quern had become a much bigger, rotary device operated by slaves or donkeys. About this time, there was an important advance: nature, in the form of flowing water, was harnessed to turn the millstones. These early “watermills” were built entirely of wood, including all the mechanism. In some areas where free-flowing water was not readily available (for example, Holland), the watermill technology was adapted to harness wind power; thus the windmill was born. The technology improved steadily over the ensuing centuries. It took the steam power of the Industrial Revolution to replace these mills in the 19th century.

The late Middle Ages in northern Europe saw two big leaps in farming practices. In England and Germany, it was discovered how to get three crops every two years instead of just two crops. This is known as the three-field system: one-third was planted in the fall for harvesting in early summer, one-third in the spring for harvesting in late summer, and one-third remained fallow. This increased production by 50%. Mediterranean countries like Italy, Greece, and Spain could not benefit from this innovation: unlike northern Europe, they do not have summer rain, which is essential for the system to work.

Secondly, the problem of feeding livestock during the hard winters held back northern Europe. The practice was to slaughter a large part of the herd in autumn and start again in the spring. The three-field system generated a surplus of fodder that farmers could feed to the beasts through the winter. But this could only work if there was a good way of preserving the fodder for several months. This led to a second major development—silage, a way of conserving fodder in deep pits and allowing it to ferment. This stops it from going rotten and preserves its nutrients. These two developments marked the rise in economic power of northern Europe during the Middle Ages to the detriment of the countries of southern Europe.


So, farming techniques improved, at least in the sense that farmers obtained higher production for the same effort. Farming had evolved in a slow and steady way from its early roots and most of the basic principles would have been familiar to a Sumerian from 5,000 years earlier. During all this time, no one knew what was happening to the nutrients in the plants and animals, but no one really thought about it either. They were being kept alive in an uncertain world and survival was the goal.

Plant Chemicals
In spite of 5,000 years of gradual improvement, there was one big area that remained a problem: the loss of crops to diseases and pests of all kinds. It was not until the 19th century that a pest was successfully controlled on a large scale. This was an infestation of vines in the 1840s by a kind of mildew, and it was cleared up by dusting with sulfur. However, advances on this front were slow. It took another century before the agricultural world was turned upside down by the Swiss chemist Paul Müller: in 1939, he discovered that the compound DDT was a highly effective and long-lasting insecticide. DDT’s ability to kill just about every insect, yet leave plants and warm-blooded animals apparently unharmed, was so successful that Müller received the Nobel Prize in 1948.

Research on poison gas in Germany during World War II led to the discovery of another group of yet more powerful insecticides—the most common being a compound called parathion. Some of these compounds were “systemic”— that is, the plant absorbed them into its tissues and became itself toxic to insects. Though low in cost, these compounds were toxic to humans and other warm-blooded animals.

These chemicals were designed to kill insects. However, there are other nuisances that harm crop yields: fungi, weeds, worms, and viruses. Attention was turned to developing fungicides, herbicides (to kill weeds), and vermicides (to kill worms), with almost equal success. Viruses cannot be attacked by chemicals, but they are transmitted from plant to plant by insects, worms, and other bugs; by killing the bugs, virus damage was controlled too. It seemed that almost any pest, disease, or weed problem could be mastered by suitable chemical treatment. Farmers foresaw a pest-free millennium. Crop losses were cut sharply, locust attacks were reduced to a manageable problem, and the new chemicals, by dramatically improving food production, saved the lives of millions of people.

But problems began surfacing in the early 1950s. In many crops, standard doses of DDT, parathion, and similar pesticides were found ineffective and had to be doubled or trebled. Resistant breeds of insects had developed. In addition, the powerful insecticides often destroyed helpful insects too. Resistant survivors soon produced worse outbreaks of pests than there had been before the treatment.

Soon, concern was expressed about pesticide residues. It was found that many birds and wild mammals retained considerable quantities of DDT in their bodies. Rachel Carson, in her 1962 book Silent Spring, sounded the alarm. Since that time, agriculturalists have tried to find a middle way between the well-tried traditional methods and the use of chemicals. Even so, chemicals have become ever more sophisticated and widespread, and they are not just restricted to controlling pests either. Fruit trees are sprayed to heighten the color of the fruit; they are even treated with hormones to get all the fruit ready for harvesting on a programmed day. Residues in foods are strictly controlled, but there are always some left on our plates. No one really knows the consequences of consuming them over a lifetime or the effects they have in combination with each other.

Plant Fertilizers
The ancient techniques of enriching the ground with manure had been known for a very long time. However, it was not until the 18th century that a chemical found naturally in India, saltpeter, was used to fertilize fields in England. Ground-up bones, especially if treated with sulfuric acid, were found to be useful too. All kinds of other materials were tried, such as powdered gypsum and blast furnace slag, but one of the most successful was guano. Guano is a massive thick deposit of bird droppings accumulated over the centuries in the Peruvian Lobos Islands.

It took a while for anyone to work out why these materials had their effect. Then, the brilliant English chemist Sir Humphry Davy, in an 1820 treatise, explained just what these fertilizers were doing. They were adding three bulk elements essential for plant growth: nitrogen, potassium, and phosphorus. Deposits of phosphorus and potassium were discovered in many parts of the world and their availability, even up to the present day, is not a problem. Sources of nitrogen (as in saltpeter) were scarce and its supply was not assured until, in 1909, the German chemist Fritz Haber discovered how to make nitrogen fertilizer from the nitrogen in the air. These three chemicals—nitrogen, potassium and phosphorus—still form the basis of all bulk fertilizers.

Plants grow in soil that contains a vast range of chemicals and they absorb them, even if they don’t need them. Over the years, scientists have identified those other elements that are essential to a healthy plant. They are needed in much smaller quantities (so they are known as “trace elements”) and there are only about 14 of them. They include chemicals like copper, zinc, manganese, and sulfur. With this discovery, it was possible to grow plants without soil altogether, just dangling their roots in nutrient-rich water. This system is known variously as “hydroponics,” “nutriculture,” and “soil-less culture.” A variant is used extensively in desert areas where plants can be grown, under suitable cover, with their roots in gravel or sand. Beautiful vegetables and fruit can be grown this way by just using these basic nutrients. However, what makes a plant grow is not always sufficient for animals and humans. We need those other trace elements that plants normally absorb when they grow in soil, such as iron, chromium, and selenium, even if the plants themselves do not need them.

Animal Husbandry
In parallel with the developments in pest control since World War II, animal husbandry came under examination. It is expensive to allow cattle, hogs, and chickens to roam freely, feeding as they choose. It is much cheaper to restrict their movements and give them feed that is designed to make them grow faster, fatter, and with less waste. Proteins, fats, and carbohydrates are the basic elements of animal nutrition, so does it matter where they come from? Yes, indeed it does. For example, cows’ natural food is found by browsing in trees and bushes. This might come as a surprise, because we think of cows sitting in a grassy meadow chewing the cud. It was only at the end of the Middle Ages that herdsmen discovered that, by feeding cows on “high energy” grass pastures, they would grow more quickly. We now know that this restricted, single-food diet changes the nutritional quality of the meat.

But cattlemen have gone one step further: corn is plentiful and easily made into a concentrated feed, and it fattens cows fast. But corn is not normal cow food at all—they cannot digest it properly and it disrupts the working of their intestines. Their colons become overgrown with bacteria, which in turn produce nasty toxins that get into the carcass. Cattlemen even have a name for this phenomenon: “bloody gut.” Ever cheaper sources of fodder were sought, however outlandish. Even the last swillings from the slaughterhouse floor were collected, dried, and pressed into cake as animal feed. In this way, we were treated to the ultimate spectacle of dead cows being fed to live cows. This practice allowed the incurable sickness bovine spongiform encephalopathy (BSE; familiarly known as “mad cow disease”) to spread in British herds and to fatally sicken many humans who ate the beef.

But that is only the start. Chickens would normally lay only about 170 eggs per year. With clever feeding, suitable lighting, and other stimulation, they now average 240 eggs per year. The ambition is to increase this to 700 eggs per year by the addition of sex hormones to speed up the chicken’s egg-production cycle. Hens are fed dyes to make their yolks bright yellow, dusted with insecticides against parasites, and fed antibiotics to stop them from getting sick in the crowded conditions.

Since 1993, dairy cows have been injected with the hormone known as rbST to increase milk production by up to 25%. Antibiotics have routinely been added to animal feed since the 1950s to increase growth rate. All these measures are sanctioned by government authorities, chief among them the U.S. Food and Drug Administration (FDA). But even this is not enough for some: the competitive pressures to produce cheap meat are so great that unscrupulous cattlemen inject their herds with illegal substances, such as muscle-building steroids.


Mechanization
Meanwhile, in the 19th century, another major development was taking place— mechanization. Early steam “traction engines” were developed for plowing. These were cumbersome but were a great improvement on the horse-drawn methods. Soon, they were supplanted by the internal combustion engine in the form of tractors. The first successful gasoline tractor was built in the United States in 1892. The number of tractors increased dramatically in America from 600 in 1907 to almost 3,400,000 by 1950. Thus, mechanization was a tremendous force for increasing productivity and reducing the need for farm labor.

Through all these changes, the nature of the plants was changed by selective breeding. Combine harvesters, tomato reapers, or cotton pickers need plants that grow in specific ways to work efficiently, so the plants were bred to be more suitable for mechanical harvesting. In this way, mechanization drove a trend to change plants for convenient handling. Many plants do not lend themselves to mechanized production, so they were no longer farmed.

“A chicken in every pot and a car in every garage”—that was the slogan used by Herbert Hoover in his 1928 presidential campaign. It is hard to imagine that, for the average American in those times, it was as rare to eat chicken as it was to own a car. Mechanization changed all that for both chickens and cars. Animals such as hogs and chickens could be kept in large sheds and reared in much more densely packed conditions. Their products became much cheaper. By the 1930s, farming had become so mechanized that this marked a major change: agriculture flipped from being a labor-intensive industry to one that used few farmhands but invested heavily in machines.

Plant Genetics
We have seen how ancient farmers selected the best plants for cultivation. This was a continuous process down through the centuries. Indeed, many plants that we know today are unrecognizably different from their wild ancestors. However, the process accelerated as commercial pressures of farming intensified. There have been some major successes. Millions more could be fed after the “green revolution” that occurred during the 1960s in Asian countries, when new, highly productive strains of rice were planted. However, often plants are modified for seemingly trivial reasons. Take, for example, wheat flakes: different varieties of wheat respond differently to milk. One of the major producers of breakfast cereal, General Mills, has a brand called “Wheaties.” They wanted a flake that curled on contact with milk and reduced sogginess in the breakfast bowl. General Mills undertook a development program to breed such a wheat and then contracted with farmers to supply only this variety [2]. What happened to the nutritional quality? Perhaps nothing changed, but no one cared to find out either.

This kind of plant breeding has a long history, but this does not automatically make it an acceptable thing to do. The whole point is that plants have been changed for a variety of reasons, but none of them has to do with nutritional value. We just do not know what has been lost or gained in the process. However, with plant breeding, at least scientists were working with combinations of genes that could have occurred in nature.

Carrot Color Frivolity
The ancestral wild carrot came from western Asia (the region of Turkey, Lebanon, and Syria). It was deep purple, skinnier than today’s carrot, and had a hard yellow core. As long ago as the 16th century, Dutch farmers thought it amusing to breed a carrot in the Dutch national color: bright orange. For us in the West, this is the “proper” carrot color; however, for the people of western Asia the carrot has remained purple. Now, growers want to jazz up their product and make it more appealing. Plant breeders are experimenting to make carrots with all kinds of hues, from white through primrose and bright red to black.

Since the 1970s, scientists have been artificially manipulating plant genes to achieve desired characteristics. Sometimes genes from a quite different species, or even an animal, are introduced to modify the plant genes. Their goals have been to make farming easier and cheaper by improving yields, and by producing crops resistant to pests, drought, salt, and weed-killers. A second objective is to make foods that transport well, are easily packaged, and have a long shelf life. It is an incredibly powerful technique that has few boundaries. A Supreme Court decision in 1980 made genetically modified organisms (GMOs) patentable, so there is a strong incentive for agri-business to focus on GMO plants and to ignore conventional breeds. The momentum is so great that it is like a runaway train hurtling into the darkness. No one knows what will come of it, but one thing is clear—the train is rushing us on an enforced journey away from our human origins.

FOOD PROCESSING, TRANSPORT, AND STORAGE
With the Industrial Revolution in full swing during the 19th century, cities grew to sizes never before seen in history. Chicago’s population increased 17-fold, from 30,000 in 1850 to 500,000 in 1870. New York City grew 25 times bigger, from 60,000 in 1800 to 1.5 million in 1870. In contrast, Babylon at the time of the Biblical exodus (1447 B.C.) had only about 60,000 people in total [3]. Feeding populations in these enormous agglomerations required novel methods. It was quite impossible to get most fresh foods to them in the normal way. Food had to be “preserved”—that is, processed in a way that stopped it from going bad. Meat and fish were a particular problem, but there were tried-and-true methods to conserve them: salting and smoking. Salt beef, bacon, cured ham, kippered herring, and bologna were just a few examples that took over the diet of city dwellers, replacing their fresh equivalents.


Wheat flour quickly goes rancid after milling. For this reason, since time immemorial, bakers only milled their flour when they were ready to use it. However, ingenious industrialists found that the problem lay in the wheat germ. By the simple expedient of removing the wheat germ as the grain was milled, flour would keep almost indefinitely. Mechanization was also brought to the traditional process of grinding cereal grains into flour. For 10,000 years, this had been achieved by grinding the grains between two stones. In the 19th century, that process changed. Steel had arrived, and the quirky millstones were replaced by banks of steel cylinders rotating at high speed. These progressively ground the grain down to ever finer particle sizes [4]. At every stage, there were sieves to separate the bits of outer husk (bran) from the flour itself. The whole lot was driven by steam-powered machinery—it was a tremendous advance in productivity. This procedure has continued right up to the present day.

The Importance of Micronutrients 
We know that there are many active compounds in the foods we eat, particularly fruits and vegetables. We are familiar with the “classic” micronutrients that have been identified over the past 100 years: vitamins A, B, C, and so on, and minerals like iron, selenium, zinc, and iodine. However, we now know that there are thousands of other micronutrient compounds that play a part in the smooth functioning of the body. In this book, we call them “background” micronutrients.

For example, there is the family of carotenoids, of which there are over 600. They give the color to carrots, oranges, tomatoes, and melons. There is the phenol family with over 5,000 members. They too are present in all fruits and vegetables, and strongly present in tea, coffee, and wine. And there are the 7,000 terpene compounds, which are omnipresent in all plant foods, particularly in spices and aromatic herbs. We must not forget the thousands of bioflavonoids, yet another vast range of compounds that are essential to health.

We know that all these micronutrients, both classic and background, are important to optimum health. We can’t define exactly how all these compounds work, but we ignore their importance at our peril.

The industrialization of milling and baking also changed the nature of bread. Bakers liked the new “refined” flour. It was uniform in size and free of bran and wheat germ, so bread-making became much more predictable. It did not need human skill to ensure that the bread baked properly every time, which meant that bread could be made on a production line too. But both the bran and the wheat germ had been stripped out of the bread. It was not until much later that scientists discovered that wheat germ is a powerhouse of important nutrients, including omega-3 oil, vitamin E, and choline (a nutrient usually grouped with the B vitamins). It was the precious and fragile omega-3 oil that went rancid so quickly. In one stroke, this processing deprived city populations of vital nutrients. As we shall see, this had surprising and unexpected negative consequences for the consumer.

The first patents for canning food were issued as early as 1810 in England; the United States soon followed. The technique involves sealing the food in the can and then heating it to over 200°F. Most animal foods can be preserved this way and a good many plant foods as well. Always the pressure is on to select variations of the food that withstand this treatment best. Some foods, like milk and fruit juices, are “pasteurized”—the food is heated very briefly to an elevated temperature and then sealed into bottles. No one thought particularly hard about what was happening. Heat, it was known, killed the harmful bacteria that cause food to rot, so that was good. It was less understood that heating also destroyed natural enzymes and many other micronutrients.

Systems of food transport became quicker and more reliable, so many more food products were grown for export to the burgeoning cities. Thus, varieties of plant were chosen that survived transportation well. Bulk storage systems improved with the development of refrigeration in the 1920s and of scientific techniques of “conditioning,” which sought to slow or prevent spoilage by careful control of moisture and gases in the silo. Plant varieties that stored well were favored.

THE FAST-FOOD INDUSTRY
In just the last 50 years, there has been a tremendous shift in the way families get their meals. In the year 1950, the average American spent $2,625 on food eaten at home and a further $724 on food eaten out [5]. (All money is expressed in year 2000 dollars.) In other words, about 20% of the food budget was spent on food eaten away from home. In contrast, by the year 2000, the average American had slashed the dollars spent on food eaten at home nearly in half, to $1,500. Meanwhile, spending on meals eaten out jumped by more than 50%, to $1,125. In other words, over 40% of the food budget is now spent on eating out.
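The percentages above follow directly from the dollar figures. Here is a minimal sketch (in Python, using only the numbers quoted in this paragraph) of the arithmetic.

    # Share of the total food budget spent on eating out,
    # using the year-2000-dollar figures quoted above.
    def eating_out_share(at_home, eaten_out):
        return eaten_out / (at_home + eaten_out)

    print(f"1950: {eating_out_share(2625, 724):.0%}")   # roughly 22%, i.e. about 20%
    print(f"2000: {eating_out_share(1500, 1125):.0%}")  # roughly 43%, i.e. over 40%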

An industry had sprung up to fulfill a need. Americans were earning more money but they had less free time, and more and more women were working. This “fast-food” industry, as it came to be known, provided attractive, tasty, and cheap food—and you did not even have to stop the car engine while collecting your order at the “drive-thru” window. It is hard to imagine that in 1950, McDonald’s had just one outlet (in San Bernardino, California). Today, they have over 11,000. They were quickly followed by imitators such as Burger King (now over 6,000 outlets) and Wendy’s (now over 3,500 outlets). These establishments specialized in a new type of food, the hamburger. However, other enterprising food suppliers introduced different but new foods to the scene: pizza and tacos, for example. Others reworked traditional ideas: fried chicken and sub-sandwiches.

Almost always, the accompaniment was french fries and a soft drink. The soft drink industry had gotten going earlier, partly encouraged by Prohibition in the 1920s. Even so, in 1940, the average American consumed only about 6 ounces per week. By 2000, that had jumped by 20 times to a gallon a week (128 ounces).

These changes are radical. At the 1950s family meal, hamburger, pizza, and tacos were either unknown or rarely served. Potatoes were not often served in the form of french fries; soft drinks were absent. All these changes have occurred just in living memory and we will later look at the consequences of this dramatic shift in feeding habits.

FOOD SAFETY
Like any traded item, food is vulnerable to cheating. The Code of Hammurabi, in 1750 B.C., laid out penalties for brewers who sold short weight. The Greeks and Romans struggled with vintners who dyed and flavored their wine. In 1202, King John of England proclaimed the first English food law, “the Assize of Bread,” which prohibited adulteration of bread with such ingredients as ground peas or beans.

As the food supply was industrialized during the 19th century, more and more foods were processed and packaged for sale. Unscrupulous merchants adulterated their products with cheaper and sometimes harmful substances, labels were deliberately misleading, inferior food was fancied up with dyes and artificial flavors, and food was preserved with noxious chemicals. In the United States, Abraham Lincoln set up the U.S. Department of Agriculture (USDA) with a division called the Bureau of Chemistry to look into such matters.

Following his investigation of food adulteration, in 1880 the chief chemist Peter Collier recommended a national food and drug law. The bill was defeated, and this was a portent of battles ahead. Why would Congress refuse to legislate for food purity? Only because there were powerful forces opposed to such measures: the nation’s food supply is the subject of a titanic battle between the food industry and the governments that try to regulate it for the public good.

In 1883, Dr. Harvey Wiley became chief chemist and took up the battle. He expanded the Bureau of Chemistry’s food adulteration studies and campaigned for a federal law. He was so vigorous and forthright that he was called the “Crusading Chemist.” Finally, a law was passed in 1906. The pure food regulations were scientifically sound, thorough, and gave strong powers to the Bureau of Chemistry to enforce them. Had these regulations been allowed to remain in place, they would have made America one of the healthiest places in the world to eat, but trouble was already on the horizon.

On signing the regulations, the Secretary of the Treasury complained that they were too severe on the food industries. The fishermen of Massachusetts wanted to keep borax; the dried fruit industry of California wanted to use sulfur dioxide; ketchup interests begged for benzoic acid. Very quickly, industry forces set about undermining the Bureau of Chemistry. The Secretary of Agriculture, James Wilson, was persuaded to set up a board, under chairman Ira Remsen, to protect the manufacturers. This “Remsen Board” started making its own recommendations to Wilson, who often upheld them, over the head of the Bureau of Chemistry. The Crusading Chemist’s success was short-lived. In Dr. Wiley’s words, “The food and drugs law became a hopeless paralytic.”

In spite of the restrictions and difficulties, Dr. Wiley battled on. He tried to stop the bleaching of flour, which often uses chlorine dioxide, a chemical akin to household bleach. In a test case, the Bureau of Chemistry sued the Lexington Mill and Elevator Company for contaminating flour with nitrogen peroxide, another bleaching chemical. The case took almost 10 years to complete as it went all the way to the United States Supreme Court. The Supreme Court finally ruled against bleached flour in 1919. But mysteriously, the USDA wrote the application guidelines in such a way as to make the ruling easily circumvented. Bleached flour has never been removed from commerce in the U.S. to this day. On the other hand, bleached flour has been banned in many European countries.

Dr. Wiley took on the Coca-Cola company for dispensing its wares without disclosing the ingredients. The two sides fought to a standstill. Coca-Cola made a concession: it removed cocaine from the formula. On the other hand, it retained the right to keep some of the ingredients secret from the public. In 1912, Dr. Wiley resigned in disgust and wrote: “The makers of unfit foods have taken possession of Food and Drug enforcement, and have reversed the effect of the law, protecting the criminals that adulterate food, instead of protecting the public health” [6].

The Bureau of Chemistry nevertheless continued with its work. Chipping away at the task, Congress passed a series of piecemeal laws: for example, requiring that food packages be “plainly and conspicuously marked on the outside of the package in terms of weight, measure, or numerical count” and banning labels that “may mislead or deceive.” In 1930, the Bureau of Chemistry was restructured into the Food and Drug Administration (FDA). Little by little, the law was clarified on a number of fronts. In 1958, manufacturers of new food additives were required to establish safety, and in 1960 the manufacturers of new color additives were required to do likewise. Even these gains are not quite what Dr. Wiley had in mind—he wanted food to be free of additives altogether.

That is the situation today. Manufacturers can make up a confection of anything they like, so long as no one ingredient is “harmful.” In all these matters, there is a further weakness: the FDA relies on the manufacturer’s own laboratory tests to prove safety. The confection can be totally without food value. Indeed, food can now officially be adulterated so long as it is declared on the label! In this way, hot chocolate, for example, has a poor cocoa content but a high level of cheap fillers, artificial colors, and flavors. Still, manufacturers have to be careful about health claims and they must declare somewhere on a label what is in the food.


Both from its founding as the Bureau of Chemistry and under its present banner, the FDA is supposed to be a guardian of the public interest. However, in many respects it gives the impression of being a watchdog that is conspiring with the burglars. This may be a somewhat unfair characterization, but the reality is that the FDA has to work in a highly political environment. The general public cannot therefore rely on the FDA’s protection from many of the dubious practices carried out by the food industry. We have to do that for ourselves.

GOVERNMENT EATING GUIDELINES
The story of this evolution of our diet is crucial to understanding why foods are having such a powerful impact on our health today. It took a very long time before it was recognized that many common diseases were linked to nutrition. One of the first was scurvy. The British navy used to lose more sailors to scurvy than to warfare until the 1790s. Then, a discovery by naval surgeon James Lind was put into practice. Sailors were fed lemon juice on long voyages and scurvy disappeared “as though by magic.” We now know that scurvy is caused by a deficiency of vitamin C.

Scurvy

Scurvy is a disease that has been known from ancient times, although it was rare. It became common among early Europeans who had to endure long winters in places like central Canada and among sailors on long sea voyages. Scurvy’s symptoms are swollen and bleeding gums, loosened teeth, sore joints, bleeding under the skin, slow wound healing, and anemia. If not treated, it results in death.

Also in the 18th century, it was found that rickets, a bone disease common in poor parts of cities, could be cured by the consumption of cod liver oil. We now know that rickets is caused by a deficiency of vitamin D. Pellagra is a disease that used to be common in the southern states of the U.S. where poorer people lived almost entirely on corn. In 1937, it was discovered that pellagra is caused by a deficiency of tryptophan, an essential amino acid that is unavailable in corn. It can easily be cured by eating small amounts of protein-rich foods.

The Japanese Navy used to lose 50% of its seamen to beriberi. They were eating a diet of polished white rice and not much else. In the 1870s, the Japanese reported that they could cure beriberi by feeding their sailors with some extra rations of vegetables and fish. We now know that beriberi is a disease caused by a deficiency of vitamin B1 (thiamine).

The list goes on, but the message is simple: for the last 250 years, more and more diseases have been linked to nutritional deficiencies. Governmental authorities have learned the lesson and, in a bid to improve the health of their populations, have started to advise them how to eat. One can imagine the early messages: “eat citrus fruit to avoid scurvy” and “eat beans (which are protein-rich) with your corn to avoid pellagra.” There were early attempts to smuggle essential nutrients into the food supply by “enriching” some foods: vitamin D was added to butter and calcium was added to flour.

Magic Bullet Mirage

An unfortunate side effect of these discoveries was the encouragement of the notion of a “magic bullet”—that is, one simple cure for one simple disease. As we shall see, this is too simplistic. Most of our modern diseases are due to a complex interaction of many factors that are going wrong at the same time.

For over a century, the U.S. government has been interested in helping Americans to choose a healthy diet. The agency charged with this responsibility is the USDA. As early as 1894, the USDA developed the first food composition tables and dietary recommendations. However, they found quite quickly that, to communicate the ideas to ordinary folk, they needed to group the various foods into categories. Then, they could give recommendations for each category.

This gave rise to the concept of “food groups.” These food groups have become entrenched, in various forms, in the way we think about our diets. For this reason, and because we will be using this concept as we move through the book, we will look at the story of food groups and how to interpret them.

The History of Food Groups
In 1917, the USDA established its first food groups: milk and meat; cereals; vegetables and fruits; fats and fat foods; and sugars and sugary foods. The government released a publication called “How to Select Foods” using these categories and called them “food groups” for the first time. These early recommendations thus divided the American food supply into five food groups. However, when the scheme was put into practice, dieticians and doctors found that it was too broad-brush and was easily circumvented. It was necessary to be more detailed, so, after due consultation and reflection, in 1933 the USDA published family food plans using 12 food groups. These subdivided the earlier large groups into more meaningful categories.

It might be thought that this was very clear. However, in 1942, the USDA issued a new food guide that reduced the number of food groups to what they called the “Basic Seven.” These were: green and yellow vegetables; oranges, tomatoes, and grapefruit; potatoes and other vegetables and fruit; milk and milk products; meat, poultry, fish, eggs, and dried peas and beans; bread, flour, and cereals; and butter and fortified margarine.

 


Twelve Food Groups
(USDA Categorization of 1933)

Milk
Lean Meat, Poultry, and Fish
Eggs
Dry Beans, Peas, and Nuts
Tomatoes and Citrus Fruits
Leafy Green and Yellow Vegetables
Other Vegetables and Fruits
Potatoes and Sweet Potatoes
Flours and Cereals
Butter
Other Fats
Sugars

It is interesting to see what has changed. “Potatoes and sweet potatoes” have been lumped in with “other vegetables and fruit.” “Eggs” and “dry beans, peas, and nuts” are lumped in with “meat, poultry, and fish.” The “butter” group has been expanded to “butter and fortified margarine.” The word lean has been dropped from the category “lean meat.” The “other fats” group and the “sugars” group have disappeared entirely.

This does not look like a move in the right direction, but worse is to come. In 1956, the “Basic Seven” groups were condensed to the “Basic Four”: milk and milk products; meat, fish, poultry, eggs, dry beans, and nuts; fruits and vegetables; and grains. This time the “green and yellow vegetables” group has disappeared.

“Butter and fortified margarine” has been dropped. “Oranges, tomatoes, and grapefruit” are lumped into the catch-all category “fruits and vegetables.” This took the simplification too far. In 1979, the USDA issued the “Hassle Free Guide to a Better Diet.” This added a fifth group—”fats, sweets, and alcohol”— to the Basic Four. The guide recommended moderation in the use of the fifth group and also mentioned calories and dietary fiber for the first time.

Finally, in 1980, the USDA released the first Dietary Guidelines for Americans. The only change was to split the “fruit and vegetable” group into separate groups. So, now we are up to six groups. The USDA, for ease of reference, condensed their designations to: the Grains group, the Vegetables group, the Fruit group, the Milk (Dairy) group, the Meat and Beans group, and the Fats, Oils, and Sweets group.

Six Food Groups
(USDA Categorization of 1980–2004)

Grains
Vegetables
Fruit
Milk (Dairy)
Meat and Beans
Fats, Oils, and Sweets


Since that time, the USDA has issued revisions to its Dietary Guidelines every five years, but the food group classification has remained broadly the same. More recently, the USDA mentioned “salt and sodium” and recommended moderation in its use. Finally, in 2005, the USDA issued the following redefinition of the food groups:

Six Food Groups
(USDA Categorization of 2005)

Grains
Vegetables
Fruit
Milk (Dairy)
Meat and Beans
Oils
The only food group that the USDA changed is the “Fats, Oils, and Sweets” group, which is now just the “Oils” group. Where have fat and sugar gone? The USDA has created a new concept: that of optional treats. If your daily calorie intake from the conventional food groups leaves a shortfall, you can top up with sugars and fats. By removing sugar and fat from the food groups altogether, the USDA is placating the sugar, snack-food, soft drink, and confectionery lobbies, and it is also pandering to consumers’ weakness for pleasurable and comfort foods.

In 1992, to give a pictorial presentation to the Dietary Guidelines, the USDA introduced the Food Guide Pyramid. This is a neat way of showing the priority to be given to each group as well as depicting the food groups themselves. However, as we shall see later, there are serious flaws both in the groupings and the priorities.

So, there is nothing special about the way our food supply is categorized today. Other categorizations have been used in the past and every few years the USDA reviews and makes changes to them. Most Americans will be familiar from their school days with the idea of food groups. However, depending on just what year they went to school, the food groups were different. No wonder people are confused.

Why are the contents of the food groups shuffled around so much? One of the reasons has to do with pressure groups. The sugar lobby did not like being singled out, so they got sugar dropped entirely in 1942. Only in 1980, and against bitter opposition, did the USDA get sugar mentioned again, but only as an afterthought in the “Fats, Oils, and Sweets” group. Likewise, butter and margarine were quietly merged into the same group. For similar reasons, the potato lobby got their product dropped as a food group in 1942; the potato and its french fry variant have remained submerged in the “Vegetable” group ever since. For this reason, in the minds of most Americans, a french fry has just the same value as a tomato. As for “lean meat” and “green and yellow vegetables,” they were leveled down and airbrushed out of special mention.

The Bond Effect Food Groups
When we look at the history of our food supply in the next chapter, we will follow the USDA’s broad food group categorizations because most people are familiar with them. However, some of the groupings are arbitrary, owing more to political expediency than to scientific rigor. So, we will subdivide some of the groups in a way that allows us to make important distinctions between the types of foods within them. We will also add groups that do not exist at the moment, namely a “Sugars and Sweeteners” group, a “Salt and Sodium” group, and a “Beverages” group. Finally, we will add fats back into the “Oils” group.

By the term vegetable, the USDA means any plant food that is not a fruit, grain, nut, or legume. Even after excluding these categories of plant food, the term vegetable covers a wide range of plant types. For reasons that will become clear later, we will divide the “Vegetable” group in two: “Vegetables (Starchy)” and “Vegetables (Non-Starchy).”

One USDA group, “Meat, Fish, Poultry, Dry Beans, Nuts and Eggs,” seems to lump these items together because they are, on the whole, protein-rich foods. However, not all protein-rich foods (for example, cheese) are included, and some protein-poor foods (for example, chestnuts) make the list. This USDA food group is just too incoherent for our purposes. There are significant differences among these items, so we will break the group down into three major classes. One is protein-rich foods of animal origin: “Meat, Poultry, Eggs, and Fish.” The other two are protein-rich foods of plant origin: “Dry Beans and Peas (or Legumes)” and “Nuts.”

By making these adjustments to the familiar USDA food groups, we will be able to highlight more precisely how foods conform to, or diverge from, the Savanna Model. As we proceed through the book, it will be necessary to make even more subtle distinctions, but for now, this breakdown will serve our purposes. Our modified groupings, then, are: Grains; Vegetables (Starchy); Vegetables (Non-Starchy); Fruits; Milk and Dairy; Meat, Poultry, Eggs, and Fish; Dry Beans and Peas; Nuts; Fats and Oils; Sugars and Sweeteners; Salt and Sodium; and Beverages. Table 2.1 (below) shows how the modified groups compare to the current USDA groups.

Dietary Guidelines for Americans
The categorization of the nation’s food supply into food groups is the first stage of a two-stage process. The second, and more important, stage is recommending to Americans how many servings of each food group they should be consuming every day. These recommendations are embodied in the impressive-sounding “Dietary Guidelines for Americans.” First instituted in 1980, they are revised every five years, on the decade and on the half decade.

 

Table 2.1 Comparison of Food Groups

USDA 2005 FOOD GROUPS → BOND EFFECT FOOD GROUPS
Grains (Bread, cereals, rice, and pasta) → Grains (Bread, cereals, rice, and pasta)
Vegetables → Vegetables (Starchy); Vegetables (Non-Starchy)
Fruit → Fruit
Milk and Dairy (Milk, Yogurt, and Cheese) → Milk, Yogurt, and Cheese
Meat and Beans (Meat, fish, poultry, dry beans, nuts, and eggs) → Meat, Poultry, Eggs, and Fish; Dry Beans and Peas; Nuts
Oils → Fats and Oils
(no USDA group) → Sugars and Sweeteners
(no USDA group) → Beverages
(no USDA group) → Salt and Sodium

However, these recommendations do not represent the best advice of impartial scientists. We have already had a glimpse of some of the political pressures at work. The USDA’s Dietary Guidelines are drawn up only after negotiation with all interested parties. These interested parties are powerful and include agro-industrialists, farmers, food lobbies, trade associations, labor unions, politicians, and financiers.

Every five years, we are treated to the spectacle of a new round of negotiations for an agreed text to put in the Dietary Guidelines. It is not edifying: each interest group brings the maximum of financial and political pressure to bear. Regrettably, in the mêlée, the scientists’ impartial advice is mostly watered down or abandoned. In other words, the USDA’s Dietary Guidelines are not a gold standard—on the contrary, they are a weak and deceitful compromise between all the competing interests.

This is a major cause for concern. In spite of their debased nature, these recommendations are then taught to our children in schools and used to design meals in hospitals, schools, prisons, and retirement homes. Worse, these recommendations become the dogma in which professional dieticians and nutritionists are trained. The conventional platitudes for healthy eating have become as sincere as a harlot’s kiss. Integrity has abandoned the field, leaving it wide open to all kinds of alternative dietary nostrums. Most of them are questionable, some are plausible, but none of them gets to the fundamentals. They cannot, for they do not know the truth about our human heritage. The whole point of this book is to provide those fundamentals and to do so in an honest and coherent way.

ADRIFT FROM THE IDEAL
In this chapter, we have reviewed the various ways in which diets have changed, not just over the last 50 or even 500 years, but over the last 50,000 years. We catalogued the Farming Revolution’s upheaval of our eating pattern 11,000 years ago, when grains and legumes entered the human diet for the first time. We recounted the major changes in farm practices and technology since that time. And we traced where today’s diet comes from and pinpointed when and how we drifted away from the ideal.

In the 1940s and 1950s, the Americans led the way in intensifying agriculture. Now, we have found scientific ways of altering foods for all kinds of reasons. When the supply of untouched fertile soils runs out, we find ways of pressing exhausted soils back into service using fertilizers, pesticides, and other chemicals. We are capable of producing what look like real plants but which simply do not contain the same nutrients. We can mass-produce animals on a production-line system, and their flesh, injected with hormones, fed with antibiotics, and dusted with insecticide, finds its way onto our plates. Foods are processed and refined in ever more sophisticated ways. Artificial dyes, fillers, preservatives, colorings, flavors, and odors are routinely used in manufactured foods. Foods are routinely adulterated with cheap, nutrition-free fillers and extenders. We have seen how government agencies try to hold the line, but their efforts are subverted by political and financial pressures. They cannot be relied upon to protect the public interest.

All this sounds alarming, but which factors are of primary importance and which of secondary importance? In this chapter, we have reviewed familiar territory concerning the intensification of agriculture, but the main theme has been that many of these foods are new to the human diet anyway. Perhaps these foods pose problems in themselves: no matter how pure or sullied they are, they may need to be treated with caution. In the next chapter, we will look more closely at the origins of the foods commonly available today and examine the consequences of consuming them.
