Why Paleo? From Cave to Chronic Illness
by Chris Kresser
Excerpted from the book Your Personal Paleo Code by Chris Kresser.
Copyright © 2013 by Chris Kresser.
Reprinted with permission of Little, Brown and Company.
Consider the following:
• Diabetes and obesity combined affect more than a billion people worldwide, including one hundred million Americans.
• More than half of Americans are overweight; a full third are clinically obese.
• Heart disease causes four out of every ten deaths in the United States.
• One-third of Americans have high blood pressure, which contributes to almost eight hundred thousand strokes in the United States every year; stroke is the leading cause of serious, long-term disability. Worldwide, there are 12.7 million strokes annually.
• More than thirty-six million people are now living with dementia.
• Depression is now the leading cause of disability, affecting more than 120 million people worldwide.
I could go on, but I think you get the point. We’re getting fatter and sicker every year.
Now imagine, for a moment, a world where:
• Modern, chronic diseases, like diabetes, obesity, some cancers, autoimmune disorders, and heart disease, are rare or nonexistent.
• The world population is naturally lean and fit.
• We all age gracefully with strong bones, sharp vision, and normal blood pressure.
While this might sound like pure fantasy today, anthropological evidence suggests that this is exactly how human beings lived for the vast majority of our species’ evolutionary history.
Today, most people accept disorders like obesity, diabetes, and heart disease as normal. But while these problems may be common now, they’re anything but normal. Our species evolved roughly two million years ago, and for more than sixty-six thousand generations, humans were free of the modern diseases that today kill millions of people each year and make countless others miserable. In fact, the world I asked you to imagine above was the natural state of human life on this planet until the agricultural revolution, about eleven thousand years (366 generations) ago — roughly half of 1 percent of the time recognizably human beings have been here. It’s a tiny blip on the evolutionary time scale.
What happened? What transformed healthy and vital people free of chronic diseases into sick, fat, and unhappy people?
In a word? Mismatch.
Agriculture: The Worst Mistake in Human History?
Like it or not, we humans are animals. And like all animals, we have a species-appropriate diet and way of life.
When animals eat and live in accordance with the environment to which they’ve adapted, they thrive. Cats, with their sharp teeth and short intestinal tracts, evolved to be carnivores, so when we feed them grain-rich kibble, they develop kidney trouble and other woes. Cows naturally graze on grass; when they eat too much grain, harmful bacteria proliferate and make them sick. We humans face a similar mismatch. Our biology and genes evolved in a particular environment. Then that environment changed far faster than humans could adapt, with a few important exceptions that I’ll cover later in this chapter. The result? The modern epidemic of chronic disease.
For the vast majority of our existence, humans lived as Paleolithic hunter-gatherers, eating the meat they hunted, the fish they caught, and the vegetables, fruits, and tubers they picked while on the move. The agricultural revolution dramatically altered this food supply and way of life. People learned to stay put, planting crops and domesticating cows, sheep, goats, and pigs. Early farmers consumed foods that their hunter-gatherer predecessors didn’t eat, such as cereal grains, milk and meat from domesticated animals, and legumes and other cultivated plants.
While scientists have argued that these developments allowed our species to flourish socially and intellectually, the consequences of this shift from a Paleolithic to an agricultural diet and lifestyle were disastrous for human health. In evolutionary terms, eleven thousand years is the blink of an eye, not nearly long enough for humans to completely adapt to this new way of eating. This is why the influential scientist and author Jared Diamond called agriculture “the worst mistake in human history.” He argued that hunter-gatherers “practiced the most successful and longest-lasting lifestyle in human history” and were all but guaranteed a healthy diet because of the diversity and nutrient density of the foods they consumed. Once humans switched diets and became more sedentary, our species’ naturally robust health began to decline.
How do we know that agriculture has been so harmful to humanity? There are three main points of evidence:
• A decline in health among hunter-gatherer populations that adopted agriculture
• The robust health of contemporary hunter-gatherers
• The poor health of people who rely heavily on grains as a staple
Let’s look at each of these in more detail.
What happened when hunter-gatherers became farmers?
Studying bones gives scientists a window into the health of our distant ancestors and offers insight into what an optimal human diet might be. Some archaeologists and anthropologists today may have a better understanding of human nutrition than the average health-care practitioner!
So what have these scientists learned from examining the bones of humans who shifted from a Paleolithic hunter-gatherer lifestyle to an agricultural one? The fossil record shows a rapid and clear decline in health in places where agriculture was adopted. Tooth decay and anemia due to iron deficiency became widespread, average bone density decreased, and infant mortality increased. These changes resulted in large part from the nutritional stress of eating a diet inappropriate for our species.
We also shrank. Skeletal remains from Greece and Turkey indicate that the average height of hunter-gatherers at the end of the ice age was five feet nine inches for men and five feet five inches for women. After agriculture was adopted in these areas, average height fell to a low of five feet three inches for men and five feet for women. Archaeologists have found similar shrinkage in skeletons all over the world when populations shifted to agriculture.
Early farmers lost more than inches from their skeletons; they lost years from their lives. Anthropologist George Armelagos studied the remains of American Indians who lived in the Ohio River Valley around AD 1150. His team compared the skeletons of hunter-gatherers who had lived in the same area with those of the early farmers who followed them. The farmers had 50 percent more tooth-enamel defects (suggestive of malnutrition), four times as much iron-deficiency anemia, three times more bone lesions, and an overall increase in degenerative conditions of the spine. Their life expectancy at birth also dropped, from twenty-six years to nineteen years.
In their book The 10,000 Year Explosion, anthropologists Gregory Cochran and Henry Harpending argued that these dramatic declines in health were brought on by a major shift in the human diet. When hunter-gatherers switched to farmers’ diets, their average carbohydrate intake shot up while the amount of protein plummeted. The quality of that protein also decreased, since almost any type of meat has a desirable amino acid balance, whereas most plants do not. Vitamin shortages were common because the new diet was based on a limited set of crops and was lower in more nutrient-dense animal products. Evidence suggests that these early farmers, who depended on one or two starchy crops, like wheat or corn, may have developed vitamin-deficiency diseases such as beriberi, pellagra, rickets, and scurvy. Their hunter-gatherer ancestors, who ate a wide variety of foods rich in vitamins and minerals, rarely suffered from these diseases.
Because of “plentiful protein, vitamin D, and sunlight in early childhood,” our Paleo ancestors were “extremely tall” and had very good teeth and larger skulls and pelvises, according to one group of archaeologists. Their farming descendants, by contrast, suffered skull deformities caused by iron-deficiency anemia, had more tooth decay, were more prone to infectious diseases, and were much shorter, “apparently because subsistence by this time is characterized by a heavy emphasis on a few starchy food crops.” Farming may have offered our ancestors a more stable and predictable food supply, but this stability came at a great price.
Didn’t Our Paleo Ancestors Die Young?
A common question I hear from Paleo skeptics is something along the lines of “Didn’t Stone Age people die before their thirtieth birthday?”
It’s true that, on average, our Paleo ancestors died younger than we do. However, these averages don’t factor in challenges largely absent from modern American lives: high infant mortality, violence and accidents, infectious diseases, and lack of medical care. Hunter-gatherer populations had infant-mortality rates about thirty times higher than those in the United States today; early-childhood-mortality rates were more than one hundred times higher. These higher infant- and childhood-mortality rates were caused by accidents, trauma, exposure to the elements, violence, warfare, and untreated acute infectious diseases — issues that, fortunately, few of us in the developed world face. These untimely deaths had the net effect of dragging down average life expectancy. If, out of ten Paleo people, three died in infancy, two died during childhood from exposure to the elements, and two died as teenagers in warfare, then even if the remaining three lived long, healthy lives, the average life span in this hypothetical group would still be short.
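To make the arithmetic concrete (the ages here are illustrative, not drawn from the fossil record): suppose the three infants died at age one, the two children at age eight, the two teenagers at age sixteen, and the three survivors at age seventy-two. The group’s average life span works out to (3 × 1 + 2 × 8 + 2 × 16 + 3 × 72) ÷ 10 = 267 ÷ 10, or about twenty-seven years, even though nearly a third of its members lived into their seventies.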
Recent research that has taken the high infant-mortality rates of our Paleolithic ancestors into account suggests that if our Stone Age forebears survived childhood, they had life spans roughly equivalent to those of people living in industrialized societies today, with a range from sixty-eight to seventy-eight years. Even more important, they reached these ages without any signs of the chronic inflammatory and degenerative diseases that we consider to be normal in developed countries, including obesity, type 2 diabetes, gout, hypertension, cardiovascular disease, and some cancers. Sure, those of us living in modern industrialized societies might live a little longer than hunter-gatherers, on average. But most of our elderly people now suffer from painful and debilitating diseases, take several medications a day, and have an unsatisfactory quality of life. Fortunately, we don’t have to choose between eating like our ancestors and reaping the benefits of modern medicine. We can combine them to get the best of both worlds and enjoy long life spans without the degenerative diseases that are so common in the industrialized world.
Contemporary hunter-gatherers: A study in good health
Modern studies of contemporary hunter-gatherers — people who have had minimal exposure to industrial civilization and follow a traditional diet and lifestyle — suggest they are largely free of the chronic inflammatory diseases that have become epidemic in the industrialized world.
Anthropological and medical reports of these contemporary hunter-gatherers show they have far fewer modern illnesses, such as metabolic syndrome, cardiovascular disease, obesity, some cancers, and autoimmune disorders, than Westernized populations. In their study “The Western Diet and Lifestyle and Diseases of Civilization,” nutrition researcher Pedro Carrera-Bastos and his colleagues compared the health of traditional populations with the health of people living in industrialized societies. The contemporary hunter-gatherers were superior in every measure of health and physical fitness. They had:
• Lower blood pressure
• Excellent insulin sensitivity and lower fasting insulin levels (meaning they were less likely to develop type 2 diabetes)
• Lower fasting leptin levels (leptin is a hormone that regulates body fat)
• Lower body mass indexes and waist-to-height ratios (two common measures of healthy weight)
• Greater maximum oxygen consumption (a measure of physical fitness)
• Better vision
• Stronger bones
Let’s look at some examples of contemporary hunter-gatherer populations around the world that, at least until a short time ago, followed the traditional diet and lifestyle.
The Kitavans
Kitava is a small island in the Trobriand Islands archipelago in Papua New Guinea. Though not technically hunter-gatherers (they are horticulturalists), the Kitavans were, until recently, one of the last populations on earth still following a traditional diet similar in composition to Paleolithic diets. According to Dr. Staffan Lindeberg, who began studying the island’s residents in 1989 and later described his findings in the book Food and Western Disease, residents of Kitava subsisted “exclusively on root vegetables (yam, sweet potato, taro, tapioca), fruits (banana, papaya, pineapple, mango, guava, watermelon, pumpkin), vegetables, fish and coconuts.”
The Kitavans enjoyed excellent health. Dr. Lindeberg’s study of 2,300 Kitavans found that:
• None had ever experienced heart disease or a stroke (which was particularly remarkable because most Kitavans smoked, and smoking is one of the biggest risk factors for heart disease).
• They were very lean, with an average body mass index (BMI) of 20 in men and 18 in women. (By contrast, in 2010, the average BMI of Americans — both men and women — was 27, which is considered overweight and is only three points away from the obese category.)
• Compared to Westernized populations, Kitavans had very low levels of leptin and insulin, the hormones that regulate food intake and energy balance. Low levels of each are associated with leanness and overall metabolic health.
Most significant, Kitavans rarely suffered the diseases of aging that are so common in developed countries. Lindeberg noted, “The elderly residents of Kitava generally remain quite active up until the very end, when they begin to suffer fatigue for a few days and then die from what appears to be an infection or some type of rapid degeneration. Although this is seen in Western societies, it is relatively rare in elderly vital people. The quality of life among the oldest residents thus appeared to be good in the Trobriand Islands.”
A long, healthy life followed by an easy, quick death. Don’t we all want that?
The Inuit
The Inuit are a group of hunter-gatherers who live in the Arctic regions of Alaska, Canada, and Greenland. They eat primarily fish, seals, whale, caribou, walrus, birds, and eggs: a diet very high in fat and protein, with very few vegetables or fruits. They live in a harsh environment that is marginal at best for human habitation. Yet early explorers, physicians, and scientists unanimously reported that the Inuit they encountered enjoyed excellent health and vitality.
Dr. John Simpson studied the Inuit in the mid-1850s. He noted that the Inuit were “robust, muscular and active, inclining rather to sparseness, rather than corpulence, presenting a markedly healthy appearance. The expression of the countenance is one of habitual good humor. The physical constitution of both sexes is strong.” This is especially remarkable considering the inhospitable environment the Inuit lived in, and it’s a testament to the nutrient density of the animal foods that made up the majority of their diet.
Nearly a hundred years later, an American dentist named Weston A. Price noticed an alarming increase in tooth decay and other problems in his patients, and he set out to determine whether traditional peoples who had not adopted a Western diet suffered from the same problems. In 1933, he took a trip to the Arctic to visit the Inuit, one of many cultures he studied, and he was deeply impressed by what he found. He praised the Inuit’s “magnificent dental development” and “freedom from dental caries” (that is, they had no cavities).
It’s especially impressive that the Inuit enjoyed such robust good health when you consider that their diets were 80 to 85 percent fat, a percentage that would surely horrify the American Medical Association!
Aboriginal Australians
Aboriginal Australians, or Indigenous Australians, were the original inhabitants of the Australian continent and surrounding islands. They traditionally lived as hunter-gatherers, consuming mostly animal products — including land mammals, birds, reptiles, sea creatures, and insects — along with a variety of plants. The quality of their diet depended in large part on where they lived: the subtropical, coastal areas were lush and provided abundant food; the harsh desert interior offered less in terms of both diversity and amounts of food.
Nevertheless, numerous studies suggest that even those Aboriginal Australians living in marginal environments were free of modern diseases like obesity, diabetes, and heart disease. Weston Price described them as “a living museum preserved from the dawn of animal life on the earth.”
Even today, contemporary Aboriginal Australians who maintain a traditional lifestyle are lean and fit and show no evidence of obesity, insulin resistance, type 2 diabetes, or cardiovascular disease. A study published in 1991 found that this population had optimal blood pressure, fasting-glucose levels (high levels indicate diabetes), and cholesterol levels, with an average body mass index well below that of Australians living in urban environments.
Aboriginal Australians who make the transition from their traditional hunter-gatherer lifestyle to a Westernized lifestyle develop unusually high rates of diabetes, cardiovascular disease, and obesity, according to the same study, and Westernized Aboriginal Australians experience a dramatic improvement in metabolic and cardiovascular health when they return to their traditional ways.
These three traditional cultures have enjoyed good health with their traditional diets and lifestyles into the twenty-first century, although each eats a very different diet. This may indicate that what we don’t eat might be just as important as what we do.
Are people who eat more grains less healthy?
Another way to evaluate whether traditional Paleolithic diets are healthier than modern diets is to look at cultures and groups that consume large amounts of grains. Are they more likely to have health problems? There’s a great deal of research that says yes. Whole grains, legumes, nuts, and seeds contain compounds called phytates that bind to minerals such as calcium, iron, zinc, and manganese, making them more difficult to absorb. If a food contains nutrients that you can’t absorb, you’re not going to reap their benefits.
Studies show that children on vegetarian macrobiotic diets — “healthy” diets composed of whole grains (especially brown rice), legumes, vegetables, and some fruits — are deficient in vitamins and minerals and are more likely to develop rickets than their meat-eating peers. Breast-fed babies of mothers on macrobiotic diets may also get lower levels of vitamin B12, calcium, and magnesium, according to some research, which may delay their physical and cognitive development.
Cultures that are heavily dependent on grains often show signs of severe vitamin A and protein deficiencies, which make them more susceptible to infectious diseases. Dr. Edward Mellanby, the discoverer of vitamin D, compared the agricultural Kikuyu tribe with the pastoralist (livestock-raising) Masai tribe, who consume primarily the milk, blood, and flesh of the cows they raise. Dr. Mellanby discovered that the Kikuyu, who lived mainly on cereals, had a far higher incidence of bronchitis, pneumonia, tropical ulcers, and tuberculosis.
We’ve been raised to believe that healthy whole grains are nutritional marvels, but cereal grains like corn, wheat, and rice don’t deserve the label healthy. They’re inferior to animal products as a source of protein because they’re incomplete, meaning that they are missing one or more essential amino acids. (Essential amino acids are those that we can’t synthesize and therefore have to get from our diets.) They’re also lower in vitamins and minerals compared to meat and the variety of wild fruits and vegetables consumed by our ancestors.
The evidence suggests that when we eat grains at the expense of more nutritious foods — especially when those grains are not properly prepared to reduce phytates and toxins — our health suffers.
How Meat Made Us Human
Eating meat and cooking food are, quite literally, what made us human. The transition from a raw, exclusively plant-based diet to one that included meat and cooked food (as well as starchy tubers) is what enabled the brains of our pre-human ancestors to grow so rapidly.
Humans have exceptionally large, neuron-rich brains relative to body size compared to nonhuman primates. For example, gorillas have bodies that are three times larger than ours, but they have smaller brains with only about a third the number of neurons that we have. So why is it that the largest primates don’t also have the largest brains?
The answer is that the brain competes with other organs for resources in the body. Gorillas require a large, metabolically expensive digestive tract to process the high-fiber, low-calorie plant matter they consume. This doesn’t leave enough resources for larger, higher-performance brains (like ours). The human brain is an expensive metabolic tissue: it consumes 20 percent of total body energy even though it represents only 2 percent of body mass.
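To put those numbers in perspective: an organ that accounts for only 2 percent of body mass but 20 percent of energy use is burning fuel at ten times (20 ÷ 2) the rate of the body’s average tissue, gram for gram.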
The larger you are, the more you need to eat. The more you need to eat, the more time you have to spend feeding yourself. Gorillas, who are vegetarians, already spend as much as 9.6 hours of a twelve-hour day eating, in part because the fibrous plant matter they consume takes so long for their bodies to break down and absorb. In order to provide enough energy for a human-like brain, a gorilla would have to eat for an extra two hours a day! Likewise, early humans eating only raw vegetation would have needed to eat for more than nine hours a day to get enough calories to support their large brains.
Gathering food was both dangerous and time-consuming, so it is unlikely that our ancestors had a completely vegetarian raw diet. When they cooked their meat, it became easier for them to chew and therefore to digest and absorb, which increased both the calories available and the nutritional density of their diet.
As you’ll see in chapter 3 (which focuses on nutrient density), meat provides an ideal mix of amino acids, fats, and vitamins and minerals for brain growth and maintenance. Vitamin B12 — available only in animal foods — is particularly important for developing brains.
It’s possible to survive on a vegan or vegetarian diet today, but such diets are far from optimal or normal for our species. People choose not to eat meat for many reasons, including concerns about the ethical treatment of animals, the amount of resources consumed in raising animals for food, and religious beliefs. Those are complex issues beyond the scope of this book. My point is simply that we may not have become the humans we are without this nutritious food source.
The Industrial Revolution: Out of the Frying Pan and into the Fire
The agricultural revolution began humans’ transition away from sixty-six thousand generations of good health. But this shift wasn’t really complete until about six generations ago, when humans reached another milestone: the Industrial Revolution, which ushered in a new age of mass production, transportation, urbanization, and economic development.
Although the beginning of the Industrial Revolution dates back to the eighteenth century, its dietary effects didn’t become evident until the late 1800s. Improved transportation meant greater access to food for more people. Mass-production methods meant that items like white flour, table sugar, vegetable oil, dairy products, and alcohol could become fixtures at every table. White flour, for example, became widespread in the United States after 1850, but it didn’t reach the saturation point until the 1890s. People living in England in the mid-Victorian period, between 1850 and 1890, generally enjoyed great health and still ate a fairly preindustrial diet. With falling prices and improved transportation, however, by around 1900, modern foods made up about 70 percent of the total calories the average person consumed each day — a remarkable change when you consider that none of them was available for the vast majority of human history.
Another significant change that came with the Industrial Revolution was a decrease in the diversity of the human diet around the world. Paleolithic hunter-gatherers consumed a large variety of plant species, primarily fruits, tubers, and vegetables, as do their modern counterparts. (For example, the Alyawarra tribe in Central Australia consumes ninety-two different plant species, and the Tlokwa tribe in Botswana a hundred and twenty-six.) Thanks to improved railways, roads, and canals, a limited number of crops could be grown and shipped cheaply to every corner of the planet. Today, 80 percent of the world’s population lives on only four principal staple plants: wheat, rice, corn, and potatoes.
The food introduced on a large scale by the Industrial Revolution (and often grown with newly invented chemical pesticides) may be cheaper for us, but it isn’t better. A hundred grams of sweet potato (about half a potato) contains only about 90 calories, and a hundred grams (one small serving) of wild-game meat contains about 150 calories, but both of these foods contain a wide spectrum of beneficial micronutrients. By contrast, a hundred grams (less than a cup) of refined wheat flour contains 361 calories, the same amount of sugar contains 387 calories, and both have virtually no beneficial nutrients. A hundred grams of corn oil (about seven tablespoons), a staple of modern diets, contains a whopping 881 calories and has essentially no nutritional value.
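Put in per-gram terms (simply dividing the calorie figures above by 100 grams): sweet potato delivers about 0.9 calories per gram and wild game about 1.5, while refined flour delivers about 3.6, sugar about 3.9, and corn oil about 8.8. The industrial staples pack roughly two and a half to ten times the calories per gram of the traditional foods, with a fraction of the nutrients.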
Even worse, industrialization completely changed the way humans lived. In 1800, 90 to 95 percent of Americans lived in rural areas or in small villages. In 1900, about half the population resided in nonurban environments. Today, less than 16 percent of all Americans live in rural areas. People who moved to cities to work became more sedentary. Longer work hours meant less time in the sun and less sleep. Stress — chronic, unrelenting — became a fixture in everyday life. While the Industrial Revolution undoubtedly improved human health in many ways (e.g., greater protection against infectious disease and better emergency medical care), these benefits did not come without significant cost. We have the Industrial Revolution to thank for new diseases of civilization that were rare or virtually nonexistent in preindustrial cultures:
• In the early 1950s in Uganda, only 0.7 percent of people above the age of 40 showed evidence of having had heart attacks, according to an autopsy study. Today in Uganda, a country where the Western-style diet has taken hold, heart disease is the fourth leading cause of death.
• In Papua New Guinea, heart attacks were unknown prior to urbanization. Today, the rate of heart attacks is skyrocketing, with upward of 400,000 heart attacks a year in a population of 5.4 million people.
• Among the Pima Indians in Arizona, the first confirmed case of diabetes was reported in 1908. Thirty years later, there were twenty-one cases, and by 1967, the number had risen to five hundred. Today, half of all adult Pima Indians have diabetes.
• When some of the South Pacific people of Tokelau migrated to nearby New Zealand and switched to a Western diet, they developed diabetes at three times the rate of those who had remained in Tokelau.
As study after study shows, the more Westernized a traditional culture becomes, the more disease it experiences. Today obesity, diabetes, heart disease, and other chronic degenerative conditions affect well over a billion people worldwide and kill millions of people each year. It may be nearly impossible for you to imagine life without these disorders. Yet they’ve been common for only the past two hundred or so years, a tiny fraction of the time humans have existed on the planet.
We’re Still Evolving
I’ve argued that humans are mismatched with an agricultural diet because the environment changed faster than our species’ genes and biology could adapt. But this doesn’t mean that we haven’t developed any adaptations to agriculture or that human evolution stopped in the Paleolithic era.
In fact, the pace of genetic change in humans has actually increased during the past few thousand years. Evolutionary biologist Scott Williamson suggests that evolution is occurring one hundred times faster than its previous average over the six million years of hominid evolution and that as much as 10 percent of the genome shows evidence of recent evolution in European Americans, African Americans, and Chinese.
This rapid increase in genetic change has been driven by two factors, say anthropologists Gregory Cochran and Henry Harpending in their book The 10,000 Year Explosion:
• A significant change in environment, which increased the selective pressure to adapt to it
• A dramatic increase in population, which increased the likelihood that adaptive mutations would arise by chance
When a new food source becomes available to a population that lacks abundant alternatives, even a food its members can only partly digest, there is strong selective pressure to adapt so that they can consume it. That’s exactly what happened with milk. For most of our species’ history, humans produced lactase, the enzyme that digests lactose (milk sugar), only during infancy and early childhood. Since mother’s milk was the only lactose-containing food in the human diet at that time, there was simply no need for children to continue making lactase after they stopped breast-feeding, which occurred at about age four for most hunter-gatherers.
However, this all changed with the dawn of the agricultural revolution and the domestication of cattle, which made cow’s milk a readily available food source. Early farmers who relied heavily on grains were prone to mineral deficiencies, especially of calcium. Their skeletons, shorter than their hunter-gatherer predecessors’, suggest they also probably lacked vitamin D, which plays a role in skeletal development. Milk is rich in calcium, contains some vitamin D, is a complete protein, and may promote growth during childhood. It also provided hydration and sustenance during periods of drought. Individuals who carried a genetic mutation allowing them to digest milk beyond their breast-feeding years would have been favored by natural selection, and their genes would have spread rapidly through farming populations.
In fact, archaeological evidence and gene-mapping studies suggest that a genetic mutation allowing the continued production of lactase into adulthood originated about eight thousand years ago somewhere in Europe and spread rapidly thereafter. Today, approximately one-third of the global population produces lactase into adulthood. In cattle-herding tribes in East Africa, like the Tutsi, the rate is up to 90 percent, and in some Northern European countries, like Denmark and Sweden, the trait is present in up to 95 percent of people.
There are several other relatively recent changes — genetic and otherwise — that have influenced our response to modern foods. For example:
• Populations with historically high starch intake produce more amylase in their saliva than populations with lower starch intake. Amylase is an enzyme that begins breaking down starch, a complex carbohydrate, into simple sugars such as glucose.
• New versions of genes that affect insulin and blood-sugar regulation have also arisen in the relatively recent past. These mutations appear to increase carbohydrate tolerance and reduce the likelihood that a higher-carbohydrate diet will lead to problems like diabetes.
• Changes in the expression of certain genes (which can happen much faster than changes to the underlying genes themselves) may help some populations that rely on grains as staples to process them more effectively.
• Finally, changes in the gut microbiota — the beneficial microorganisms that live in our digestive tracts — can directly affect one’s ability to assimilate certain nutrients. Researchers have identified a type of bacteria in the colon of Japanese people that produces an enzyme that helps them digest seaweed (nori in particular). And some studies suggest that lactose intolerance can be eliminated simply by eating increasing amounts of yogurt containing live bacteria, which can naturally metabolize lactose.
So, our bodies have adapted in some ways to the challenges of an agricultural diet. Human innovation has also helped. As I mentioned in the previous section, cereal grains and legumes contain phytates, which bind to zinc, iron, calcium, and other minerals. The human gut is unable to break these bonds, which means that it’s difficult for us to absorb the minerals in grains. But traditional cultures soaked grains and grain flours in an acid medium (such as whey or lemon juice), fermented them, germinated (sprouted) them, or leavened them (for example, baking bread with natural sourdough starter), which significantly reduced their phytate content and thus made the minerals they contained more bioavailable (that is, easier to absorb).
Will Evolution Catch Up to Western Diets?
Humans, it would seem, are well adapted to Paleolithic foods like meat, vegetables, fruits, and tubers, because our species has been eating them for the vast majority of its evolutionary history, and the evidence shows that human health declined with the introduction of agricultural foods. However, the fact that a food wasn’t available during the Paleolithic era doesn’t necessarily mean we should avoid it entirely today. The genetic and cultural changes I’ve described above occurred (at least in part) to help humans adapt to an agricultural diet, and they do influence how individuals tolerate Neolithic foods. This explains why some people are able to include moderate amounts of dairy, grains, and/or legumes in their diets — especially when these foods are predigested by fermenting, soaking, sprouting, or leavening — without ill effect. (I’ll have more to say on this topic later in the book.) But these genetic changes don’t mean we can eat a diet high in cereal grains and low in animal protein without adverse health consequences. These adaptations are often simple mutations of single genes and can be relatively crude. For example, the mutation that enables people to digest milk beyond childhood simply breaks the genetic switch that is supposed to turn off lactase production after infancy. This rather haphazard fix reflects the short time frame in which it arose; it’s much easier for the body to break something that already exists than to create something new.
Eventually, it’s at least possible that humans could evolve a more complex adaptation (involving the coordinated action of several different genes) to a grain-heavy diet. This might include changes in the gastrointestinal tract that would allow better absorption of the nutrients in grains. But even if such an adaptation occurred, it wouldn’t change the fact that grains are far less nutrient dense than meats, fish, and vegetables — the staple foods of our Paleolithic ancestors. This is especially true when you take into account the bioavailability of nutrients, which is high in animal products and low in grains.
For these reasons, the best approach is to make the Paleolithic foods our species evolved to eat the foundation of your diet and then personalize it from there depending on your own unique combination of genetics, health status, activity level, life circumstances, and goals. That’s exactly what I’m going to show you how to do, starting in the very next chapter.
Obviously, a lot has changed since our Paleo ancestors roamed the earth, and most of us aren’t living like the contemporary hunter-gatherer populations I’ve mentioned in this section. How do we know their lifestyle is our best option today? Beyond the considerable anthropological record, there are several lines of modern, clinical evidence supporting the health benefits of a Paleo-template diet and lifestyle. These include:
• The high nutrient density of Paleo foods
• The minimal presence of toxins and antinutrients in Paleo foods
• The superior balance of fats in a Paleo diet
• The beneficial effects of the Paleo diet on gut bacteria
• The benefits of integrating physical activity throughout the day and minimizing sedentary time, the way our Paleo ancestors did
• The benefits of sleeping at least seven to eight hours a night and minimizing exposure to artificial light (although the latter was something our Paleo ancestors never had to contend with)
• The benefits of sun exposure (which go beyond vitamin D) and spending time outdoors
• The importance of pleasure, play, and social connection
I’ll cover each of these — and much more — in Steps 1 and 2. Again, the good news is that we don’t have to live in caves or roam the earth for food to enjoy the benefits of a Paleo-style diet. And there’s no need to run to a geneticist to see if you have the right alleles to digest milk or wheat.
Your Personal Paleo Code will lead you to the perfect diet. For now, I hope I’ve convinced you that a Paleo template is the right place to begin.
Ready? Let’s get started!
Excerpted from the book Your Personal Paleo Code by Chris Kresser.
Copyright © 2013 by Chris Kresser.
Reprinted with permission of Little, Brown and Company.
Consider the following:
â— Diabetes and obesity combined affect more than a billion people worldwide, including one hundred million Americans.
â— More than half of Americans are overweight; a full third are clinically obese.
â— Heart disease causes four out of every ten deaths in the United States.
â— One-third of Americans have high blood pressure, which contributes to almost eight hundred thousand strokes every year — the leading cause of serious, long-term disability. Annually, there are 12.7 million strokes worldwide.
â— More than thirty-six million people are now living with dementia.
â— Depression is now the leading cause of disability, affecting more than 120 million people worldwide.
I could go on, but I think you get the point. We’re getting fatter and sicker every year.
Now imagine, for a moment, a world where:
â— Modern, chronic diseases, like diabetes, obesity, some cancers, autoimmune disorders, and heart disease, are rare or nonexistent.
â— The world population is naturally lean and fit.
â— We all age gracefully with strong bones, sharp vision, and normal blood pressure.
While this might sound like pure fantasy today, anthropological evidence suggests that this is exactly how human beings lived for the vast majority of our species’ evolutionary history.
Today, most people accept disorders like obesity, diabetes, and heart disease as normal. But while these problems may be common now, they’re anything but normal. Our species evolved roughly two million years ago, and for more than sixty-six thousand generations, humans were free of the modern diseases that today kill millions of people each year and make countless others miserable. In fact, the world I asked you to imagine above was the natural state for humans’ history on this planet up until the agricultural revolution occurred, about eleven thousand years (366 generations) ago — less than 0.5 percent of the time recognizably human beings have been here. It’s a tiny blip on the evolutionary time scale.
What happened? What transformed healthy and vital people free of chronic diseases into sick, fat, and unhappy people?
In a word? Mismatch.
Agriculture: The Worst Mistake in Human History?
Like it or not, we humans are animals. And like all animals, we have a species-appropriate diet and way of life.
When animals eat and live in accordance with the environment to which they’ve adapted, they thrive. Cats, with their sharp teeth and short intestinal tracts, evolved to be carnivores, so when we feed them grain-rich kibble, they develop kidney trouble and other woes. Cows naturally graze on grass; when they eat too much grain, harmful bacteria proliferate and make them sick. We humans face a similar mismatch. Our biology and genes evolved in a particular environment. Then that environment changed far faster than humans could adapt, with a few important exceptions that I’ll cover later in this chapter. The result? The modern epidemic of chronic disease.
For the vast majority of existence, humans lived as Paleolithic hunter-gatherers, eating the meat they hunted, the fish they caught, and the vegetables, fruits, and tubers they picked while on the move. The agricultural revolution dramatically altered humans’ food supply and way of life. They learned to stay put, planting crops and domesticating cows, sheep, goats, and pigs. Early farmers consumed foods that their hunter-gatherer predecessors didn’t eat, such as cereal grains, milk and meat from domesticated animals, and legumes and other cultivated plants.
While scientists have argued that these developments allowed our species to flourish socially and intellectually, the consequences of this shift from a Paleolithic to an agricultural diet and lifestyle were disastrous for human health. In evolutionary terms, eleven thousand years is the blink of an eye, not nearly long enough for humans to completely adapt to this new way of eating. This is why the influential scientist and author Jared Diamond called agriculture “the worst mistake in human history.” He argued that hunter-gatherers “practiced the most successful and longest-lasting lifestyle in human history” and were all but guaranteed a healthy diet because of the diversity and nutrient density of the foods they consumed. Once humans switched diets and became more sedentary, our species’ naturally robust health began to decline.
How do we know that agriculture has been so harmful to humanity? There are three main points of evidence:
â— A decline in health among hunter-gatherer populations that adopted agriculture
â— The robust health of contemporary hunter-gatherers
â— The poor health of people who rely heavily on grains as a staple
Let’s look at each of these in more detail.
What happened when hunter-gatherers became farmers?
Studying bones gives scientists a window into the health of our distant ancestors and offers insight into what an optimal human diet might be. Some archaeologists and anthropologists today may have a better understanding of human nutrition than the average health-care practitioner!
So what have these scientists learned from examining the bones of humans who shifted from a Paleolithic hunter-gatherer lifestyle to an agricultural one? The fossil record shows a rapid and clear decline in health in places where agriculture was adopted. Tooth decay and anemia due to iron deficiency became widespread, average bone density decreased, and infant mortality increased. These changes resulted in large part from the nutritional stress of eating a diet inappropriate for our species.
We also shrank. Skeletal remains from Greece and Turkey indicate that the average height of hunter-gatherers at the end of the ice age was five nine for men and five five for women. After agriculture was adopted in these areas, the average height fell to a low of five three for men and five feet for women. Archaeologists have found similar shrinkage in skeletons all over the world when populations shifted to agriculture.
Early farmers lost more than inches from their skeletons; they lost years from their lives. Anthropologist George Armelagos studied the American Indians living in the Ohio River Valley in approximately AD 1150. His team compared the skeletons of hunter-gatherers that lived in the same area with those of the early farmers who followed them. The farmers had 50 percent more tooth-enamel defects (suggestive of malnutrition), four times as much iron-deficiency anemia, three times more bone lesions, and an overall increase in degenerative conditions of the spine. Their life expectancy at birth also dropped, from twenty-six years to nineteen years.
In their book The 10,000 Year Explosion, anthropologists Gregory Cochran and Henry Harpending argued that these dramatic declines in health were brought on by a major shift in the human diet. When hunter-gatherers switched to farmers’ diets, their average carbohydrate intake shot up while the amount of protein plummeted. The quality of that protein also decreased, since almost any type of meat has a desirable amino acid balance, whereas most plants do not. Vitamin shortages were common because the new diet was based on a limited set of crops and was lower in more nutrient-dense animal products. Evidence suggests that these early farmers, who depended on one or two starchy crops, like wheat or corn, may have developed vitamin-deficiency diseases such as beriberi, pellagra, rickets, and scurvy. Their hunter-gatherer ancestors, who ate a wide variety of foods rich in vitamins and minerals, rarely suffered from these diseases.
Because of “plentiful protein, vitamin D, and sunlight in early childhood,” our Paleo ancestors were “extremely tall,” had very good teeth, and larger skulls and pelvises, according to one group of archaeologists. Their farming descendants, by contrast, suffered skull deformities because of iron-deficiency anemia, had more tooth decay, were more prone to infectious diseases, and were much shorter, “apparently because subsistence by this time is characterized by a heavy emphasis on a few starchy food crops.” Farming may have offered our ancestors a more stable and predictable food supply, but this stability came at a great price.
Didn’t Our Paleo Ancestors Die Young?
A common question I hear from Paleo skeptics is something along the lines of “Didn’t Stone Age people die before their thirtieth birthday?”
It’s true that, on average, our Paleo ancestors died younger than we do. However, these averages don’t factor in challenges largely absent from modern American lives: high infant mortality, violence and accidents, infectious diseases, and lack of medical care. Hunter-gatherer populations had infant-mortality rates about thirty times higher than those in the United States today; early-childhood-mortality rates were more than one hundred times higher. These higher infant- and childhood-mortality rates were caused by accidents, trauma, exposure to the elements, violence, warfare, and untreated acute infectious diseases — issues that, fortunately, few of us in the developed world face. These untimely deaths had the net effect of dragging down average life expectancy. If, out of ten Paleo people, three died in infancy, two died during childhood from exposure to the elements, and two died as teenagers in warfare, then even if the remaining three lived long, healthy lives, the average life span in this hypothetical group would still be short.
Recent research that has taken the high infant-mortality rates of our Paleolithic ancestors into account suggests that if our Stone Age forebears survived childhood, they had life spans roughly equivalent to those of people living in industrialized societies today, with a range from sixty-eight to seventy-eight years. Even more important, they reached these ages without any signs of the chronic inflammatory and degenerative diseases that we consider to be normal in developed countries, including obesity, type 2 diabetes, gout, hypertension, cardiovascular disease, and some cancers. Sure, those of us living in modern industrialized societies might live a little longer than hunter-gatherers, on average. But most of our elderly people now suffer from painful and debilitating diseases, take several medications a day, and have an unsatisfactory quality of life. Fortunately, we don’t have to choose between eating like our ancestors and reaping the benefits of modern medicine. We can combine them to get the best of both worlds and enjoy long life spans without the degenerative diseases that are so common in the industrialized world.
Contemporary hunter-gatherers: A study in good health
Modern studies of contemporary hunter-gatherers — people who have had minimal exposure to industrial civilization and follow a traditional diet and lifestyle — suggest they are largely free of the chronic inflammatory diseases that have become epidemic in the industrialized world.
Anthropological and medical reports of these contemporary hunter-gatherers show they have far fewer modern illnesses, such as metabolic syndrome, cardiovascular disease, obesity, some cancers, and autoimmune disorders, than Westernized populations. In their study “The Western
Diet and Lifestyle and Diseases of Civilization,” nutrition researcher Pedro Carrera-Bastos and his colleagues compared the health of traditional populations with the health of people living in industrialized societies. The contemporary hunter-gatherers were superior in every measure of health and physical fitness. They had:
â— Lower blood pressure
â— Excellent insulin sensitivity and lower fasting insulin levels (meaning they were less likely to develop type 2 diabetes)
â— Lower fasting leptin levels (leptin is a hormone that regulates body fat)
â— Lower body mass indexes and waist-to-height ratios (one way of measuring optimal weight)
â— Greater maximum oxygen consumption (a measure of physical fitness)
â— Better vision
â— Stronger bones
Let’s look at some examples of contemporary hunter-gatherer populations around the world that, at least until a short time ago, followed the traditional diet and lifestyle.
The Kitavans
Kitava is a small island in the Trobriand Islands archipelago in Papua New Guinea. Though not technically hunter-gatherers (they are horticulturalists), the Kitavans were, until recently, one of the last populations on earth still following a traditional diet similar in composition to Paleolithic diets. According to Dr. Staffan Lindeberg in his 1989 book Food and Western Disease, residents of Kitava subsisted “exclusively on root vegetables (yam, sweet potato, taro, tapioca), fruits (banana, papaya, pineapple, mango, guava, watermelon, pumpkin), vegetables, fish and coconuts.”
The Kitavans enjoyed excellent health. Dr. Lindeberg’s study of 2,300 Kitavans found that:
â— None had ever experienced heart disease or a stroke (which was particularly remarkable because most Kitavans smoked, and smoking is one of the biggest risk factors for heart disease).
â— They were very lean, with an average body mass index (BMI) of 20 in men and 18 in women. (By contrast, in 2010, the average BMI of Americans — both men and women — was 27, which is considered overweight and is only three points away from the obese category.)
â— Compared to Westernized populations, Kitavans had very low levels of leptin and insulin, the hormones that regulate food intake and energy balance. Low levels of each are associated with leanness and overall metabolic health.
Most significant, Kitavans rarely suffered the diseases of aging that are so common in developed countries. Lindeberg noted, “The elderly residents of Kitava generally remain quite active up until the very end, when they begin to suffer fatigue for a few days and then die from what appears to be an infection or some type of rapid degeneration. Although this is seen in Western societies, it is relatively rare in elderly vital people. The quality of life among the oldest residents thus appeared to be good in the Trobriand Islands.”
A long, healthy life followed by an easy, quick death. Don’t we all want that?
The Inuit
The Inuit are a group of hunter-gatherers who live in the Arctic regions of Alaska, Canada, and Greenland. They eat primarily fish, seals, whale, caribou, walrus, birds, and eggs: a diet very high in fat and protein, with very few vegetables or fruits. They live in a harsh environment that is marginal at best for human habitation. Yet early explorers, physicians, and scientists unanimously reported that the Inuit they encountered enjoyed excellent health and vitality.
Dr. John Simpson studied the Inuit in the mid-1850s. He noted that the Inuit were “robust, muscular and active, inclining rather to sparseness, rather than corpulence, presenting a markedly healthy appearance. The expression of the countenance is one of habitual good humor. The physical constitution of both sexes is strong.” This is especially remarkable considering the inhospitable environment the Inuit lived in, and it’s a testament to the nutrient density of the animal foods that made up the majority of their diet.
Nearly a hundred years later, an American dentist named Weston A. Price noticed an alarming increase in tooth decay and other problems in his patients, and he set out to determine whether traditional peoples who had not adopted a Western diet suffered from the same problems. In 1933, he took a trip to the Arctic to visit the Inuit, one of many cultures he studied, and he was deeply impressed by what he found. He praised the Inuit’s “magnificent dental development” and “freedom from dental caries” (that is, they had no cavities).
It’s especially impressive that the Inuit enjoyed such robust good health when you consider that their diets were 80 to 85 percent fat, a percentage that would surely horrify the American Medical Association!
Aboriginal Australians
Aboriginal Australians, or Indigenous Australians, were the original inhabitants of the Australian continent and surrounding islands. They traditionally lived as hunter-gatherers, consuming mostly animal products — including land mammals, birds, reptiles, sea creatures, and insects — along with a variety of plants. The quality of their diet depended in large part on where they lived: the subtropical, coastal areas were lush and provided abundant food; the harsh desert interior offered less in terms of both diversity and amounts of food.
Nevertheless, numerous studies suggest that even those Aboriginal Australians living in marginal environments were free of modern diseases like obesity, diabetes, and heart disease. Weston Price described them as “a living museum preserved from the dawn of animal life on the earth.”
Even today, contemporary Aboriginal Australians who maintain a traditional lifestyle are lean and fit and show no evidence of obesity, insulin resistance, type 2 diabetes, or cardiovascular disease. A study published in 1991 found that this population had optimal blood pressure, fasting-glucose levels (high levels indicate diabetes), and cholesterol levels, with an average body mass index well below that of Australians living in urban environments.
Aboriginal Australians who make the transition from their traditional hunter-gatherer lifestyle to a Westernized lifestyle develop unusually high rates of diabetes, cardiovascular disease, and obesity, according to the same study, and Westernized Aboriginal Australians experience a dramatic improvement in metabolic and cardiovascular health when they return to their traditional ways.
These three groups of hunter-gatherers have enjoyed good health with their traditional lifestyles into the twenty-first century, although each eats a very different diet. This may indicate that what we don’t eat might be just as important as what we do.
Are people who eat more grains less healthy?
Another way to evaluate whether traditional Paleolithic diets are healthier than modern diets is to look at cultures and groups that consume large amounts of grains. Are they more likely to have health problems? There’s a great deal of research that says yes. Whole grains, legumes, nuts, and seeds contain compounds called phytates that bind to minerals such as calcium, iron, zinc, and manganese, making them more difficult to absorb. If a food contains nutrients that you can’t absorb, you’re not going to reap their benefits.
Studies show that children on vegetarian macrobiotic diets — “healthy” diets composed of whole grains (especially brown rice), legumes, vegetables, and some fruits — are deficient in vitamins and minerals and are more likely to develop rickets than their meat-eating peers. Breast-fed babies of macrobiotic mothers may be getting lower levels of vitamin B12, calcium, and magnesium, according to some research, which may result in these babies having delayed physical and cognitive growth.
Cultures that are heavily dependent on grains often show signs of severe vitamin A and protein deficiencies, which make them more susceptible to infectious diseases. Dr. Edward Mellanby, the discoverer of vitamin D, compared the agricultural Kikuyu tribe with the pastoralist (livestock-raising) Masai tribe, who consume primarily the milk, blood, and flesh of the cows they raise. Dr. Mellanby discovered that the Kikuyu, who lived mainly on cereals, had a far higher incidence of bronchitis, pneumonia, tropical ulcers, and tuberculosis.
We’ve been raised to believe that “healthy whole grains” are nutritional marvels, but cereal grains like corn, wheat, and rice don’t deserve the label. They’re inferior to animal products as a source of protein because they’re incomplete, meaning that they’re missing one or more essential amino acids. (Essential amino acids are those that we can’t synthesize and therefore have to get from our diets.) They’re also lower in vitamins and minerals than meat and the variety of wild fruits and vegetables consumed by our ancestors.
The evidence suggests that when we eat grains at the expense of more nutritious foods — especially when those grains are not properly prepared to reduce phytates and toxins — our health suffers.
How Meat Made Us Human
Eating meat and cooking food is quite literally what made us human. The transition from a raw, exclusively plant-based diet to one that included meat and cooked food (as well as starchy tubers) is what enabled the brains of our pre-human ancestors to grow so rapidly.
Compared with nonhuman primates, humans have exceptionally large, neuron-rich brains relative to body size. Gorillas, for example, have bodies three times larger than ours but smaller brains, with only about a third the number of neurons that we have. So why is it that the largest primates don’t also have the largest brains?
The answer is that the brain competes with other organs for resources in the body. Gorillas require a large, metabolically expensive digestive tract to process the high-fiber, low-calorie plant matter they consume. This doesn’t leave enough resources for larger, higher-performance brains (like ours). The human brain is an expensive metabolic tissue: it consumes 20 percent of total body energy even though it represents only 2 percent of body mass.
The larger you are, the more you need to eat. The more you need to eat, the more time you have to spend feeding yourself. Gorillas, who are vegetarians, already spend as much as 9.6 hours of a twelve-hour day eating, in part because the fibrous plant matter they consume takes so long for their bodies to break down and absorb. In order to provide enough energy for a human-like brain, a gorilla would have to eat for an extra two hours a day! Likewise, early humans eating only raw vegetation would have needed to eat for more than nine hours a day to get enough calories to support their large brains.
Gathering plant food was both dangerous and time-consuming, so it is unlikely that our ancestors ate a completely raw, vegetarian diet. When they cooked their meat, it became easier to chew and therefore to digest and absorb, which increased both the calories available to them and the nutritional density of their diet.
As you’ll see in chapter 3 (which focuses on nutrient density), meat provides an ideal mix of amino acids, fats, and vitamins and minerals for brain growth and maintenance. Vitamin B12 — available only in animal foods — is particularly important for developing brains.
It’s possible to survive on a vegan or vegetarian diet today, but such diets are far from optimal or normal for our species. People choose not to eat meat for many reasons, including concerns about the ethical treatment of animals, the resources consumed in raising animals for food, and religious beliefs. Those are complex issues beyond the scope of this book. My point is simply that we may not have become the humans we are without this nutritious food source.
The Industrial Revolution: Out of the Frying Pan and into the Fire
The agricultural revolution began humans’ transition away from sixty-six thousand generations of good health. But this shift wasn’t really complete until about six generations ago, when humans reached another milestone: the Industrial Revolution, which ushered in a new age of mass production, transportation, urbanization, and economic development.
Although the beginning of the Industrial Revolution dates back to the eighteenth century, its dietary effects didn’t become evident until the late 1800s. Improved transportation meant greater access to food for more people. Mass-production methods meant that items like white flour, table sugar, vegetable oil, dairy products, and alcohol could become fixtures at every table. White flour, for example, became widespread in the United States after 1850, but it didn’t reach the saturation point until the 1890s. People living in England in the mid-Victorian period, between 1850 and 1890, generally enjoyed great health and still ate a fairly preindustrial diet. With falling prices and improved transportation, however, by around 1900, modern foods made up about 70 percent of the total calories the average person consumed each day — a remarkable change when you consider that none of them was available for the vast majority of human history.
Another significant change that came with the Industrial Revolution was a decrease in the diversity of the human diet around the world. Paleolithic hunter-gatherers consumed a large variety of plant species, primarily fruits, tubers, and vegetables, as do their modern counterparts. (For example, the Alyawarra tribe in Central Australia consumes ninety-two different plant species, and the Tlokwa tribe in Botswana a hundred and twenty-six.) Thanks to improved railways, roads, and canals, a limited number of crops could be grown and shipped cheaply to every corner of the planet. Today, 80 percent of the world’s population lives on only four principal staple plants: wheat, rice, corn, and potatoes.
The food introduced on a large scale by the Industrial Revolution (and grown with newly invented pesticides containing toxins) may be cheaper for us, but it isn’t better. A hundred grams of sweet potato (about half a potato) contains only about 90 calories, and a hundred grams (one small serving) of wild-game meat contains about 150 calories, but both of these foods contain a wide spectrum of beneficial micronutrients. By contrast, a hundred grams (less than a cup) of refined wheat flour contains 361 calories, the same amount of sugar contains 387 calories, and both have virtually no beneficial nutrients. A hundred grams of corn oil (about seven tablespoons), a staple of modern diets, contains a whopping 881 calories and has essentially no nutritional value.
Even worse, industrialization completely changed the way humans lived. In 1800, 90 to 95 percent of Americans lived in rural areas or in small villages. In 1900, about half the population resided in nonurban environments. Today, less than 16 percent of all Americans live in rural areas. People who moved to cities to work became more sedentary. Longer work hours meant less time in the sun and less sleep. Stress — chronic, unrelenting — became a fixture in everyday life. While the Industrial Revolution undoubtedly improved human health in many ways (e.g., greater protection against infectious disease and better emergency medical care), these benefits did not come without significant cost. We have the Industrial Revolution to thank for new diseases of civilization that were rare or virtually nonexistent in preindustrial cultures:
● In the early 1950s in Uganda, only 0.7 percent of people above the age of 40 showed evidence of having had heart attacks, according to an autopsy study. Today in Uganda, a country where the Western-style diet has taken hold, heart disease is the fourth leading cause of death.
● In Papua New Guinea, heart attacks were unknown prior to urbanization. Today, the rate of heart attacks is skyrocketing, with upward of 400,000 heart attacks a year in a population of 5.4 million people.
● Among the Pima Indians in Arizona, the first confirmed case of diabetes was reported in 1908. Thirty years later, there were twenty-one cases, and by 1967, the number had risen to five hundred. Today, half of all adult Pima Indians have diabetes.
● When some of the South Pacific people of Tokelau migrated to nearby New Zealand and switched to a Western diet, they developed diabetes at three times the rate of those who had remained in Tokelau.
As study after study shows, the more Westernized a traditional culture becomes, the more disease it experiences. Today obesity, diabetes, heart disease, and other chronic degenerative conditions affect well over a billion people worldwide and kill millions of people each year. It may be nearly impossible for you to imagine life without these disorders. Yet they’ve been common for only the past two hundred or so years, a tiny fraction of the time humans have existed on the planet.
We’re Still Evolving
I’ve argued that humans are mismatched with an agricultural diet because the environment changed faster than our species’ genes and biology could adapt. But this doesn’t mean that we haven’t developed any adaptations to agriculture or that human evolution stopped in the Paleolithic era.
In fact, the pace of genetic change in humans has actually increased during the past few thousand years. Evolutionary biologist Scott Williamson suggests that evolution is occurring one hundred times faster than its previous average over the six million years of hominid evolution and that as much as 10 percent of the genome shows evidence of recent evolution in European Americans, African Americans, and Chinese.
This rapid increase in genetic change has been driven by two factors, say anthropologists Gregory Cochran and Henry Harpending in their book The 10,000 Year Explosion:
● A significant change in environment, which increased the selective pressure to adapt to it
● A dramatic increase in population, which increased the likelihood that adaptive mutations would arise by chance
If a new food source becomes available to a population without abundant food, there will be strong selective pressure to adapt to it, even if that food is only partly digestible at first. That’s exactly what happened with milk. For most of our species’ history, humans produced lactase, the enzyme that digests lactose (milk sugar), only during infancy and early childhood. Since mother’s milk was the only lactose-containing food in the human diet at that time, there was simply no need for children to continue making lactase after they stopped breast-feeding, which occurred at about age four among most hunter-gatherers.
However, this all changed with the dawn of the agricultural revolution and the domestication of cattle, which made cow’s milk a readily available food source. Early farmers who relied heavily on grains were prone to mineral deficiencies, especially calcium deficiency. Their skeletons, shorter than those of their hunter-gatherer predecessors, indicate that they probably also lacked vitamin D, which plays a role in skeletal development. Milk is rich in calcium, contains some vitamin D, provides complete protein, and may promote growth during childhood. It also provided hydration and sustenance during periods of drought. Individuals who carried a genetic mutation allowing them to digest milk beyond their breast-feeding years would have been favored by natural selection, and their genes would have spread rapidly through farming populations.
In fact, archaeological evidence and gene-mapping studies suggest that a genetic mutation allowing the continued production of lactase into adulthood originated about eight thousand years ago somewhere in Europe and spread rapidly thereafter. Today, approximately one-third of the global population produces lactase into adulthood. In cattle-herding tribes in East Africa, like the Tutsi, the rate is up to 90 percent, and in some Northern European countries, like Denmark and Sweden, the trait is present in up to 95 percent of people.
There are several other relatively recent changes — genetic and otherwise — that have influenced our response to modern foods. For example:
● Populations with historically high starch intake produce more amylase in their saliva than populations with lower starch intake. Amylase is an enzyme that breaks down starch, a complex carbohydrate, into simpler sugars that the body can absorb.
● New versions of genes that affect insulin and blood-sugar regulation have also arisen in the relatively recent past. These mutations appear to increase carbohydrate tolerance and reduce the likelihood that a higher-carbohydrate diet will lead to problems like diabetes.
● Changes in the expression of certain genes (which can happen much faster than changes to the underlying genes themselves) may help some populations that rely on grains as staples to process them more effectively.
● Finally, changes in the gut microbiota — the beneficial microorganisms that live in our digestive tracts — can directly affect one’s ability to assimilate certain nutrients. Researchers have identified a type of bacteria in the colon of Japanese people that produces an enzyme that helps them digest seaweed (nori in particular). And some studies suggest that lactose intolerance can be eliminated simply by eating increasing amounts of yogurt containing live bacteria, which can naturally metabolize lactose.
So, our bodies have adapted in some ways to the challenges of an agricultural diet. Human innovation has also helped. As I mentioned in the previous section, cereal grains and legumes contain phytates, which bind to zinc, iron, calcium, and other minerals. The human gut is unable to break these bonds, which makes it difficult for us to absorb the minerals in grains. But traditional cultures soaked grains and grain flours in an acid medium (such as whey or lemon juice), fermented them, germinated (sprouted) them, or leavened them (for example, by baking bread with a natural sourdough starter), all of which significantly reduced their phytate content and thus made the minerals they contained more bioavailable (that is, easier to absorb).
Will Evolution Catch Up to Western Diets?
Humans, it would seem, are well adapted to Paleolithic foods like meat, vegetables, fruits, and tubers because our species ate them for the vast majority of its history, and the evidence shows that human health declined with the introduction of agricultural foods. However, the fact that a food wasn’t available during the Paleolithic era doesn’t necessarily mean we should avoid it entirely today. The genetic and cultural changes I’ve described above occurred (at least in part) as humans adapted to an agricultural diet, and they do influence how individuals tolerate Neolithic foods. This explains why some people are able to include moderate amounts of dairy, grains, and/or legumes in their diets — especially when these foods are predigested by fermenting, soaking, sprouting, or leavening — without ill effect. (I’ll have more to say on this topic later in the book.) But these genetic changes don’t mean we can eat a diet high in cereal grains and low in animal protein without adverse health consequences. These adaptations are often simple mutations of single genes, and they can be relatively crude. For example, the mutation that enables people to digest milk beyond childhood simply breaks the genetic switch that would otherwise turn off lactase production after infancy. This rather haphazard fix reflects the short time frame in which it arose; it’s much easier for the body to break something that already exists than to create something new.
Eventually, it’s at least possible that humans could evolve a more complex adaptation (involving the coordinated action of several different genes) to a grain-heavy diet. This might include changes in the gastrointestinal tract that would allow better absorption of the nutrients in grains. But even if such an adaptation occurred, it wouldn’t change the fact that grains are far less nutrient dense than meats, fish, and vegetables — the staple foods of our Paleolithic ancestors. This is especially true when you take into account the bioavailability of nutrients, which is high in animal products and low in grains.
For these reasons, the best approach is to make the Paleolithic foods our species evolved to eat the foundation of your diet and then personalize it from there depending on your own unique combination of genetics, health status, activity level, life circumstances, and goals. That’s exactly what I’m going to show you how to do, starting in the very next chapter.
Obviously, a lot has changed since our Paleo ancestors roamed the earth, and most of us aren’t living like the contemporary hunter-gatherer populations I’ve mentioned in this section. How do we know their lifestyle is our best option today? Beyond the considerable anthropological record, there are several lines of modern, clinical evidence supporting the health benefits of a Paleo-template diet and lifestyle. These include:
● The high nutrient density of Paleo foods
● The minimal presence of toxins and antinutrients in Paleo foods
● The superior balance of fats in a Paleo diet
● The beneficial effects of the Paleo diet on gut bacteria
● The benefits of integrating physical activity throughout the day and minimizing sedentary time, the way our Paleo ancestors did
● The benefits of sleeping at least seven to eight hours a night and minimizing exposure to artificial light (although the latter was something our Paleo ancestors never had to contend with)
● The benefits of sun exposure (which go beyond vitamin D) and spending time outdoors
● The importance of pleasure, play, and social connection
I’ll cover each of these — and much more — in Steps 1 and 2. Again, the good news is that we don’t have to live in caves or roam the earth for food to enjoy the benefits of a Paleo-style diet. And there’s no need to run to a geneticist to see if you have the right alleles to digest milk or wheat.
Your Personal Paleo Code will lead you to your perfect diet. For now, I hope I’ve convinced you that a Paleo template is the right place to begin.
Ready? Let’s get started!
Chris Kresser is a licensed acupuncturist and practitioner of integrative medicine. He did his undergraduate work at UC Berkeley, and graduated from the Acupuncture and Integrative Medicine College in Berkeley. Kresser has a private practice in Berkeley, CA, and also consults with patients throughout the U.S. www.chriskresser.com