Nutrition Science News: April 1997
Paleolithic Nutrition:
Your Future Is In Your Dietary Past
Human genes, formed by millions of years
of evolution, are a bad match for highly processed modern diets
By Jack Challem
You are what you eat, and perhaps surprisingly, you also are what your
ancestors ate. Just as individual genetics and experiences influence your
nutritional requirements, millions of years of evolution have also shaped
your need for specific nutrients.
The implications? Your genes, which control every function of your body,
are essentially the same as those of your early ancestors. Feed these genes
well, and they do their job--keeping you healthy. Give these genes nutrients
that are unfamiliar or in the wrong ratios and they go awry--aging faster,
malfunctioning and leading to disease.
According to S. Boyd Eaton, M.D., one of the foremost authorities on
paleolithic (prehistoric) diets and a radiologist and medical anthropologist
at Emory University in Atlanta, modern diets are out of sync with our genetic
requirements. He makes the point that the less you eat like your ancestors,
the more susceptible you'll be to coronary heart disease, cancer, diabetes
and many other "diseases of civilization."1 To chart the right
direction for improving your current or future nutrition, you have to understand--and
often adopt--the diet of the past.
It helps to go back to the very beginning. Denham Harman, M.D., Ph.D.,
who conceived the free-radical theory of aging, also theorized that free
radicals were a major player in the origin and evolution of life on Earth.
According to Harman, professor emeritus of the University of Nebraska, Omaha,
free radicals most likely triggered the chemical reactions that led to the
first and simplest forms of life some 3.5 billion years ago. But because
free-radical oxidation can be destructive, antioxidant defenses--including
vitamins--likely developed soon after and ensured the survival of life.2
In fact, the first building blocks of life may have been created when
solar radiation oxidized compounds in the primordial oceans to produce pantetheine,
a form of the B-vitamin pantothenic acid, according to chemist Stanley L.
Miller, Ph.D., of the University of California, San Diego.3 Pantetheine is the cornerstone of coenzyme A--a molecule that helps amino acids link together and makes possible the creation of deoxyribonucleic acid (DNA) and ribonucleic acid (RNA), the molecules that carry our genetic information.
Over the next several billion years, many more molecules--amino acids,
lipids, vitamins and minerals--formed and helped construct the countless
forms of life. In turn, these life forms became dependent on essentially
the same group of nutrients.
According to Eaton, 99 percent of our genetic heritage dates from before
our biologic ancestors evolved into Homo sapiens about 40,000 years ago,
and 99.99 percent of our genes were formed before the development of agriculture
about 10,000 years ago.
Today's Diet, Yesterday's Genes
What we are--and were--can be deduced from paleontological data (mostly ancient bones and coprolites, or fossilized feces) and the observed habits of
hunter-gatherer tribes that survived into the 20th century, according to
Eaton.
Before the advent of agriculture about 10,000 years ago, all people were
hunter-gatherers; they gathered various fruits and vegetables to eat and
they hunted animals for their meat. Of course, the ratio of meat to vegetables varied with geographic location, climate and season. Until they began cultivating grains and raising livestock, people rarely, if ever, ate grains or drank animal milk.
With the spread of agriculture, people shifted from nomadic groups to
relatively stable and larger societies to tend the fields. Culture and knowledge
flourished. People also began consuming large amounts of grain, milk and
domesticated meat. When communities became more localized, humans became
more sedentary as well. Then, with the industrial revolution of the 18th
century, the human diet changed even more dramatically. Beginning around
1900, whole grains were routinely refined, removing much of their nutritional
value, and refined sugar became commonplace. Reflecting on the changes in
1939, nutritionist Jean Bogert noted, "The machine age has had the
effect of forcing upon the peoples of the industrial nations (especially
the United States) the most gigantic human feeding experiment ever attempted."4
Bogert was also disturbed by the growing use of refined grains and sugar
and the preference for processed foods over fresh fruits and vegetables.
During the past 40 years, the growth of fast-food restaurants has altered
the average diet more dramatically than Bogert could have imagined. People now rely even more heavily on processed foods rather than fresh ones. In fact, the many
dietary changes during the past 10,000 years have outpaced our ability to
genetically adapt to them, according to Eaton. "That the vast majority
of our genes are ancient in origin means that nearly all of our biochemistry
and physiology are fine-tuned to conditions of life that existed before
10,000 years ago," he says.5 Looked at in another way, 100,000 generations
of people were hunter-gatherers, 500 generations have depended on agriculture,
only 10 generations have lived since the start of the industrial age, and
only two generations have grown up with highly processed fast foods.
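As a rough check on that arithmetic, the sketch below converts Eaton's generation counts back into years. The 20-year generation length is an assumption chosen for illustration; the article itself does not state one.

```python
# Convert Eaton's generation counts into approximate years,
# assuming a 20-year human generation (an illustrative figure).
GENERATION_YEARS = 20

eras = {
    "hunter-gatherer": 100_000,
    "agricultural": 500,
    "industrial": 10,
    "fast-food": 2,
}

for era, generations in eras.items():
    years = generations * GENERATION_YEARS
    print(f"{era}: {generations:,} generations ~ {years:,} years")

# hunter-gatherer: 100,000 generations ~ 2,000,000 years
# agricultural: 500 generations ~ 10,000 years
# industrial: 10 generations ~ 200 years
# fast-food: 2 generations ~ 40 years
```

The agricultural and fast-food figures line up with the 10,000-year and 40-year spans cited earlier, which suggests a roughly 20-year generation is what Eaton had in mind.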
"The problem is that our genes don't know it," Eaton points
out. "They are programming us today in much the same way they have
been programming humans for at least 40,000 years. Genetically, our bodies
now are virtually the same as they were then."6
The Paleolithic Diet
By working with anthropologists, Eaton has created what many experts
consider a clear picture of our prehistoric diet and lifestyle. Today's
panoply of diets--from fast-food burgers to various concepts of balanced
diets and food groups--bears little resemblance, superficially or in actual
nutritional constituents, to the diet that the earliest H. sapiens and their evolutionary ancestors consumed over millions of years. For example, vitamin intake is lower today,
and the dietary fatty acid profile is substantially different from our evolutionary
diet. In other words, our diet today fails to provide the biochemical and
molecular requirements of H. sapiens.7
Here's how the major dietary constituents stack up past and present.
Carbohydrates: Early humans obtained about half of their calories
from carbohydrates, but these carbohydrates were rarely grains. Most carbohydrates
came from vegetables and fruit.
"Current carbohydrates often take the form of sugars and sweeteners.
... Products of this sort, together with items made from highly refined
grain flours, constitute empty calories ... devoid of accompanying essential
amino and fatty acids, vitamins, minerals and possibly phytochemicals,"
Eaton says.8
Fruits, vegetables and fiber: Over the course of one year, hunter-gatherers
typically consumed more than 100 different species of fruits and vegetables.
Today, fewer than 9 percent of Americans eat the recommended five daily
servings of fruits and vegetables, according to Gladys Block, Ph.D., a nutritional
epidemiologist at the University of California, Berkeley. Even people who
regularly do eat fruits and vegetables generally limit themselves to a handful
of different foods, she says.9 The fruits, vegetables, nuts and seeds eaten by early H. sapiens provided more than 100 g of fiber daily, far above the typical recommendation of 20 g to 30 g, and even further above what the average American actually eats. Additionally, Eaton says, "The fiber in preagricultural
diets came almost exclusively from fruits, roots, legumes, nuts and other
naturally occurring noncereal plant sources, so it was less associated with
phytic acid than is fiber from cereal grains [phytic acid interferes with
mineral absorption]."
Protein and fat: Early humans obtained about 30 percent of their calories from protein, although consumption varied with the season and geographic location. Much of this protein came from what people now call "game meat"--undomesticated animals such as deer and bison. Current dietary recommendations suggest much less protein--about 12 percent to 15 percent of total caloric intake.10
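To make those percentages concrete, here is a minimal sketch that converts percent-of-calories figures into grams of protein. The 2,000-calorie daily intake is a hypothetical assumption; the 4 calories per gram of protein is the standard conversion factor.

```python
# Convert a percent-of-calories protein figure into grams per day,
# assuming a hypothetical 2,000-calorie daily intake.
DAILY_CALORIES = 2000          # assumed intake, for illustration
CALORIES_PER_GRAM_PROTEIN = 4  # standard conversion factor

def protein_grams(percent_of_calories: float) -> float:
    """Grams of protein supplying the given share of daily calories."""
    return DAILY_CALORIES * (percent_of_calories / 100) / CALORIES_PER_GRAM_PROTEIN

print(protein_grams(30))  # paleolithic estimate: 150.0 g
print(protein_grams(12))  # low end of current advice: 60.0 g
print(protein_grams(15))  # high end of current advice: 75.0 g
```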
Based on contemporary studies of hunter-gatherer societies, it appears
early humans consumed relatively large amounts of cholesterol (480 mg daily),
but their blood cholesterol levels are extrapolated to have been much lower than those of the average American (about 125 mg vs. more than 200 mg per deciliter of blood). There are two main reasons for this.
First, domesticating animals increases their saturated fat levels and
alters the ratio of omega-6 to omega-3 fatty acids. Saturated fat is associated
with increased blood cholesterol levels. Most Americans consume an 11:1
ratio of omega-6 to omega-3 fatty acids. But a healthier ratio, based on evolutionary and anthropological data, would be in the range of 1:1 to 4:1.
In other words, our ancestors consumed a higher percentage of omega-3 fatty
acids--and we probably should, too.
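To see what closing that gap would mean in practice, consider the sketch below. The 11 g daily omega-6 intake is a purely hypothetical figure chosen to make the arithmetic easy; the point is how much omega-3 someone eating at an 11:1 ratio would need to add to reach the 4:1 or 1:1 targets without changing omega-6 intake.

```python
# Omega-3 needed to move from an 11:1 omega-6:omega-3 ratio toward
# the 4:1 and 1:1 targets, holding omega-6 intake constant.
# The 11 g omega-6 figure is hypothetical, for illustration only.
omega6_grams = 11.0
current_omega3 = omega6_grams / 11  # 1.0 g at an 11:1 ratio

for target_ratio in (4, 1):
    needed_omega3 = omega6_grams / target_ratio
    extra = needed_omega3 - current_omega3
    print(f"{target_ratio}:1 target -> {needed_omega3:.2f} g omega-3 "
          f"({extra:.2f} g more than at 11:1)")
```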
Second, hunting and gathering required considerable physical effort,
which means early humans exercised a lot, burned fat and lowered cholesterol
levels. "Their nomadic foraging lifestyle required vigorous physical
exertion, and skeletal remains indicate that they were typically more muscular
than we are today," says Eaton. "Life during the agricultural
period was also strenuous, but industrialization has progressively reduced
obligatory physical exertion."11
Vitamins and minerals: Game meats and wild plant foods contain higher amounts of vitamins and minerals, relative to their protein and carbohydrate content, than their domesticated counterparts.
Observes Eaton: "The fruits, nuts, legumes, roots and other noncereals
that provided 65 percent to 70 percent of typical hunter-gatherer subsistence
were generally consumed within hours of being gathered, with little or no
processing and often uncooked ... it seems inescapable that preagrarian
humans would generally have had an intake of more vitamins and minerals
and exceeded currently recommended dietary allowances."12
The difference in consumption of sodium and potassium--electrolyte minerals
necessary for normal heart function--is especially dramatic. According to
Eaton, the typical adult American consumes about 4,000 mg of sodium daily,
but less than 10 percent of this amount occurs naturally in food. The rest
is added during processing, cooking or seasoning at the table. Potassium
consumption is lower, about 3,000 mg daily.
In contrast, early humans consumed only an estimated 600 mg of sodium
but 7,000 mg of potassium daily. People, says Eaton, are the "only
free-living terrestrial mammals whose electrolyte intake exhibits this relationship."13
That reversed ratio could be one reason why people are so prone to hypertension
and other heart ailments.
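Using the article's own figures, a short sketch makes the reversal explicit: the modern sodium-to-potassium ratio is greater than 1, while the estimated paleolithic ratio is far below it.

```python
# Sodium-to-potassium ratios computed from the intakes quoted above
# (all values in mg per day).
diets = {
    "modern American": (4000, 3000),
    "paleolithic (estimated)": (600, 7000),
}

for diet, (sodium_mg, potassium_mg) in diets.items():
    print(f"{diet}: Na:K = {sodium_mg / potassium_mg:.2f}")

# modern American: Na:K = 1.33 (more sodium than potassium)
# paleolithic (estimated): Na:K = 0.09 (about 12 times more potassium)
```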
Although dietary vitamin and mineral levels in the past were 1.5 to 5
times higher than today, Eaton does not favor "megadoses" of vitamins.
However, there is evolutionary evidence that large doses of vitamin C may
be needed for optimal health. The reason has less to do with diet and more
to do with an evolutionary accident.
Vitamin C And Human Evolution
Evolution often zigzags rather than following a linear flow. One reason
is that a species might wipe out another by eating it. In addition, climatic changes and, more recently, industrial changes also destroy species. According to the theory of "punctuated equilibrium," proposed by Niles Eldredge, Ph.D., of the American Museum of Natural History, and Stephen Jay Gould, Ph.D., of Harvard University, catastrophic
events--such as an asteroid striking the Earth--can also dramatically shift
the course of evolution.14
One such catastrophic event, of unknown nature, affected the preprimate ancestors of humans sometime between 25 million and 70 million years ago, according
to biochemist Irwin Stone, Ph.D. This particular event led to a mutation
that prevented all of this species' descendants from manufacturing their
own vitamin C. At least some members of the species survived, eventually evolving into H. sapiens, because they lived in a lush equatorial region with vitamin
C-rich foods. But nearly all other species of animals, from insects to mammals,
continued to produce their own vitamin C.
While this theory of how our evolutionary ancestors lost their ability to produce vitamin C is generally accepted by scientists, Stone's other theory is more controversial: He contends that people never lost the need for large amounts of vitamin C, even though they lost the ability to make it. Based on animal data, he estimates that people might require 1.8 g to 13 g of vitamin C daily.15 This idea that people require large amounts of
vitamin C later became a cornerstone of Nobel laureate Linus Pauling's recommendations
for vitamin C in the treatment of colds and cancer. Ironically, losing the
ability to produce vitamin C actually may have accelerated the evolution
of primates into modern human beings, according to a new theory. Vitamin
C is an important antioxidant, and losing the ability to produce it would
have allowed the formation of a large number of free radicals. These excessive
free radicals would have caused large numbers of DNA mutations, contributing
to the aging process and diseases. Some of these mutations would also have
been inherited by offspring, creating many biological variations--one of which eventually became H. sapiens.16
A Diet For The Future
For much of history, human life expectancy was not particularly long. Two thousand years ago, it averaged a mere 22 years, and infections
and traumatic injury were the principal causes of death. Better hygiene
and sanitation have largely accounted for the dramatic improvement in life
expectancy in the 20th century.
Now, as people live longer, they are increasingly susceptible to greater amounts of free-radical damage and its principal end points--cardiovascular disease and cancer.
The question is: Where do we and our diets go from here? Our evolutionary
diet provides important clues to the "baseline" levels and ratios
of nutrients needed for health. The evidence suggests we should be eating
a lot of plant foods and modest amounts of game meat, but few grains and
no dairy products. With a clear understanding of this evolutionary baseline, we have an opportunity to adopt a better, more natural diet. We can also do a better job of individualizing and optimizing our nutritional requirements.
Judged against our evolutionary and paleolithic diets, it's clear that modern diets are on the wrong track--and are not satisfying our genetic requirements. In 1939, the same year Bogert bemoaned the rise
of highly refined foods, Nobel laureate Albert Szent-Gyorgyi, M.D., Ph.D.,
explored the importance of optimal (and not just minimal) requirements of
vitamins. Years later, Roger Williams, Ph.D., and Linus Pauling, Ph.D.,
would also promote the concept of optimal nutrition, based on providing
ideal levels of vitamins and other nutrients on a molecular level. Pauling eloquently observed that health depended on having the right nutritional molecules in the right amounts. To set a dietary course for the future, we have to recognize
how certain molecules shaped our lives over millions of years. Paleolithic
diets provide those clues and give us a sound foundation to build on, perhaps
to protect and prime our genes even further.
A note to those who don't believe in evolution: It's worth observing
that evolution describes the mechanism of how life develops, but says nothing
about whether a higher being was guiding the process. Regardless, the diet
of today is very different from, and not always as good as, the diet of
the past.
Jack Challem is based in Aloha, Ore., and has been writing for health
magazines for 20 years. He also publishes his own newsletter, The Nutrition
Reporter, which summarizes recent medical journal articles on vitamins.
1. Eaton, S.B., Eaton III, S.B., et al. "An evolutionary perspective enhances understanding of human nutritional requirements," J Nutr, 126:1732-40, June 1996.
2. Harman, D. "Aging: Prospects for further increases in the functional
life span," Age, 17:119-46, 1994.
3. Keefe, A.D., Newton, G.L., et al. "A possible prebiotic synthesis
of pantetheine, a precursor to coenzyme A," Nature, 373:683-85,
Feb. 23, 1995.
4. Bogert, L.J. Nutrition and Physical Fitness: 437. Philadelphia: Saunders, 1939.
5. Eaton, S.B., Shostak, M., et al. The Paleolithic Prescription:
A Program of Diet & Exercise and a Design for Living: 39. New York:
Harper & Row, 1988.
6. Eaton, Shostak, et al., op cit, 1988:41.
7. Eaton, Eaton III, et al., op cit, 1996.
8. Eaton, Eaton III, et al., op cit, 1996.
9. Patterson, B.H., Block, G., et al. "Fruit and vegetables in the American diet: Data from the NHANES II survey," Am J Public Health, 80:1443-49, December 1990.
10. Eaton, S.B., & Konner, M. "Paleolithic nutrition: A consideration of its nature and current implications," N Engl J Med, 312:283-89, Jan. 31, 1985.
11. Eaton, Eaton III, et al., op cit, 1996.
12. Eaton, Eaton III, et al., op cit, 1996.
13. Eaton, Eaton III, et al., op cit, 1996.
14. Eldredge, N. & Gould, S.J. "Punctuated equilibria: An alternative
to phyletic gradualism," Schopf, T.J.M., editor, Models in Paleobiology,
San Francisco: Freeman Cooper, 1972.
15. Stone, I. "Hypoascorbemia: The genetic disease causing the human
requirement for exogenous ascorbic acid," Perspectives in Biol and
Med, 10:133-34, 1966.
16. Challem, J.J. "Did the loss of endogenous ascorbate propel the evolution of Anthropoidea and Homo sapiens?" Med Hypotheses, in press.