American society is inundated with food. There are thousands of options of what to eat and a myriad of ways to eat it. So why does food need defending? In In Defense of Food, Michael Pollan distinguishes between real food and processed food and how the reliance on the latter leads to a society plagued by Western diseases.
Since the mid-20th century, Americans have looked to the government and scientists to tell them what to eat and what not to eat. As we’ll see, a decision that was once successfully made by families guided by tradition was turned over to those who benefit most from consumer confusion.
The industrialization of food changed both how food was produced and its nutritional value. Farmers started using pesticides and chemical enhancements to fortify soil for faster production, tainting the plants grown in it. A reliance on three main staples—corn, soy, and wheat—infiltrated the industry in the form of refined carbohydrates, processed sweeteners, and hydrogenated fats. Nutritional science and industry successfully shifted the focus from real food to nutrients and implied that eating was solely intended to bring about physical health.
We have willingly handed the chef’s hat to the multi-million-dollar food and science industries, losing the culture of food along the way. We’ve become obsessed with health labels and eating by the numbers, but none of this has resulted in better health. In fact, an increase in the rate of heart disease, diabetes, cancer, and obesity is the result of the industrialization of food and the advent of the Western diet. Getting us back on track to healthier bodies and minds requires a paradigm shift about food production and food culture.
The only rule about eating that you need to understand is this: eat mostly plant-based real food in moderation. This idea is simple enough and should be easy to follow. What has prevented us from following this simple rule is the rise of nutritionism. Nutritionism is an ideology about nutritional science, not an actual branch of that science. The focus of nutritionism is on isolating certain nutrients—proteins, carbohydrates, certain fats, and antioxidants—as the cause of either good or bad health. The concept is “eat more of the good nutrients to live longer.”
But this narrow view of health and eating ignores the other advantages of food. These advantages include the holistic benefits of whole foods and a more traditional food culture.
The focus on nutrients led to the lipid hypothesis, a theory developed in the 1960s that states that fat and cholesterol, mostly from meat and dairy, lead to increased rates of heart disease. Originally, the hypothesis was used to urge consumers to eat less animal protein and dairy, but the powerful and influential meat and dairy industries fought back.
They used the concept of nutritionism to shift the blame from the unhealthy whole foods to unhealthy nutrients. Their efforts changed government guideline language from “eat less animal proteins and dairy” to “eat proteins and dairy that help reduce saturated fat.” With this shift, nutritionism gained a massive hold over America’s food culture.
The science and food industries tightened their grip on the American diet after winning another major battle against real food. In the early 20th century, the Food and Drug Administration stipulated that “imitation” food, meaning food made with artificial ingredients or chemically altered, must be labeled as such. Food manufacturers, now touting the benefits of nutritionism and processing foods to match, feared the negative response consumers might have to this label.
With the help of the American Heart Association, who recommended that foods be modified to reduce cholesterol and saturated fats, the food industry was successful in repealing the mandate. Now, any food-like product that equaled or surpassed real food in nutrient value could be sold without a warning label. The age of processed food kicked into high gear.
The issues with nutritionism and processed foods would be moot if we were actually getting healthier, but we’re not, thanks in large part to the lipid hypothesis. The lipid hypothesis made saturated fats the main enemy of health. People believed that removing saturated fats from food and replacing them with “good” fats, like hydrogenated seed oil, was the best way to improve health. The low-fat food revolution began, with hydrogenated oil becoming the fat of choice, added to all manner of foods to make them “healthier.” This decision had far-reaching consequences. Many believe the current obesity and diabetes epidemics in America correspond with the rise of low-fat foods in the 1970s.
The process of solidifying vegetable oil using hydrogen creates trans fat. Scientists and food marketers urged Americans to eat more trans fat in low-fat foods. Unfortunately, the evidence supporting this recommendation was flimsy from the start.
American eating habits during and after WWII formed the assumptions that led to the lipid hypothesis. During the war, meat and dairy were rationed and heart disease was low. After the war, eating resumed as normal, and heart disease increased. But the researchers failed to acknowledge that people ate less during the war and exercised by walking more because of rationed gasoline. Also, post-war eating became increasingly industrial, with...
For decades, Americans have sought to determine which foods support optimal health. A multi-million-dollar food marketing and science industry has dedicated its efforts to finding this answer. But the solution to what to eat can be boiled down to a basic principle: eat mostly plant-based real food in moderation. Seems easy, right? So why all the fuss?
The tried and tested days of tradition are gone. In those days, fresh and unprocessed foods dominated the culture and family guided eating decisions. Now, we have grocery stores full of processed foods and a government food pyramid about as trustworthy as the money-making schemes of the same name.
Food products no longer look like whole food. They are modified for mass production, easy consumption, and low cost. They come with health claims stamped on their packages to guide the would-be savvy shopper. The existence of a health claim on a food package should be the first red flag to consumers. Real food doesn’t need a health claim. Processed foods do.
We have given the power to decide what to put in our bodies to science, government, and media. The need for humans to be told what to eat, something our ancestors were able...
American eating habits took a sharp turn in the 20th century, when the focus on positive nutrition turned from whole food to nutrients (chemical and mineral compounds found in food). Although the theoretical aspects of this shift sprouted between the late-19th and mid-20th centuries, the effects didn’t become wholly obvious until the 1980s.
On grocery store shelves in the 1980s, food identified as food was being replaced by packages identified by their nutrient components, whether bad or good. Words like cholesterol, fiber, and saturated fats replaced eggs, grains, and meat. This new strategy suggested that the inclusion or exclusion of these substances equaled positive or negative health for consumers. Real food became antiquated. Nutrients took over as the shiny new, lab-tested giant of healthy eating—eat the good and avoid the bad to find physical Nirvana. The events that led to this shift and those that have followed created the Western diet.
The work of an English doctor and chemist would change the face of the American diet forever. William Prout was the first person to isolate and identify the three main compounds in food, called...
Food processors, free of marketing restrictions and backed by the government, set out to make processed food the food of choice. They modified any food they could. Labels such as low-fat, fat-free, cholesterol-free, high-fiber, low-fiber, and low-carb became the norm. Foods that once consisted of two or three ingredients, such as mayonnaise and whipped cream, now had tumbling lists of additives.
If a nutrient once thought healthy was found to be otherwise, the food-like product was reengineered to reduce the bad and include the good. A new set of health claims was printed on new boxes. The public followed this advice until new evidence contradicted those claims. And because health claims couldn’t be printed on fruits and vegetables, healthy foods were often left in the dark.
In fact, the fate of real food in supermarkets is still dependent on what science dictates. If carrots are said to be healthy, people will buy them. If not, they won’t. A new subgroup of food science has formed to exploit the benefits of certain whole foods based on a particular nutrient. Walnuts, once deemed fatty, are now hailed for their omega-3 compounds. Pomegranates and blueberries are...
Several factors make the study of nutrition difficult: 1) the focus on single nutrients makes studying the benefits of whole foods difficult, and 2) the tools used to gather nutritional data rest on weak methodology. These factors make it hard to truly understand food and eating habits. Yet this problem doesn’t stop scientists from presenting their findings as fact.
When you take the nutrient out of the context of food, you miss influential relationships within whole foods and the numerous possible benefits beyond the one identified. Other important variables pertaining to health are also ignored, such as diet patterns, lifestyles, and human physiology.
No two humans are alike. There are specific inherent differences that either support or hinder nutritional health.
To create eating habits that support better health, we need to fix the relationship between humans and food as part of the overall food chain. To do this, we need to leave the Western diet behind. But what about all those already experiencing poor health? Has the Western diet destroyed their chances for a healthy life?
The Western diet has spread throughout most of the civilized world and beyond. The decline in health of those who follow it is no secret. But until 1982, what was unclear was whether the damage could be reversed. A study of 10 Aborigines in Australia provided the first grain of hope that it could.
In the summer of 1982, research scientist Kerin O’Dea asked 10 formerly bush-dwelling Aborigines to return to their previous lives as hunters and gatherers to see if their health improved. Since moving to civilization years before, all had developed Type 2 diabetes and were at high risk for heart disease. They’d also developed “metabolic syndrome,” a disorder in which the body is unable to metabolize carbohydrates and fats appropriately through insulin production.
When the Aborigines returned to the bush, they lived by the...
One of the main downfalls of nutritionism is the way it shifts our focus from food as part of a relationship to merely an item. The relationship between humans and food is the food chain and occurs in every aspect of nature. Organisms along this chain tend to adapt to or with one another to survive.
For example, cow’s milk used to make people physically sick. Then, about five thousand years ago, milk herders and their surrounding communities started to evolve: a mutation of the gene that produces the enzyme needed to digest milk spread through the population. The milk gave the herders the ability to have and raise more children, which increased the spread of the gene. Likewise, the larger population and improvements to health allowed for better care of the cows, which improved their lives and allowed more reproduction.
When one aspect of the food chain is disrupted, it disrupts the rest of the chain. If the soil is unhealthy, the grass growing from it will be unhealthy, and the cows that eat the grass will be unhealthy, and so on. Therefore, the health of humans is greatly dependent on the health of the food chain.
In the past, the familiar relationship between humans and...
To change your habits away from the Western diet, you must differentiate the theories from the problem and work to address the latter. The different theories of nutritionism presented in this summary may create more confusion about what you should eat and what not to eat. Many of them contradict each other or overlap. For instance, is it a lack of omega-3s that leads to disease or an influx of refined carbohydrates? If carbohydrates are the problem, what does that mean for the lipid hypothesis?
It’s natural for scientists and the general public to want a one-stop solution that is easily proven and disseminated. Yet, allowing the one-nutrient mentality to serve as the endpoint of the issue is narrow, short-sighted, and ineffective in helping guide you to good choices. Think of who is actually benefiting from these theories. Is it the consumer, who is yo-yoed between this nutrient and that nutrient? Not really. The main beneficiaries of nutritionism, again, are the food and science industries.
Just because something is edible does not mean it is real food. Much of what grocery stores carry is food-like substances. In fact, roughly 17,000 new food-like products are created and marketed each year. In contrast, real food is ordinary food, or food-as-food, and you should only be eating ordinary food. How can you tell the difference? Three rules can help you identify what you should and shouldn’t be eating.
If your great-grandma wouldn’t call it real food, it’s not real food. Using your great-grandmother gives you a good chance of going back to a time before industrialized food took over. If you’re very young, picture an even earlier ancestor, going as far back as the hunter-gatherer days if needed.
When you’re in the grocery store, imagine your great-grandma or ancestor is standing next to you. When you pick up an item, imagine they are picking up the item and inspecting it. The example of Go-Gurt provides an understanding of how this works.
Your great-grandma looks at the Go-Gurt and asks what it is. You tell her it’s yogurt in tube form. She looks at the list of ingredients and gets confused, and rightfully so. In her day, yogurt was cultured milk. What she sees are one...
It can be hard to know what you’re eating simply by looking at the food. Distinguishing between real food and food-like substances requires knowledge and intention.
In your refrigerator, do you have at least two real food items? How do you know they’re real? What are they made of?
The evolutionary relationship between humans and plants goes a long way toward explaining why plants are so good for us and why a diet focused on plant-based foods is the healthier option. Even Thomas Jefferson said that meat should be eaten more as a flavoring to support vegetables, rather than as the main dish.
At the end of the day, a switch to real food of any type is going to be beneficial. In fact, other cultures who eat traditional diets still experience greater health than Western eaters simply because of the priority on whole foods. Still, not all whole foods are created equal, and a few rules can help you identify which ones provide the most benefit.
Much in the same way scientists can’t agree on which aspect of nutritionism is most helpful, they also can’t agree on what it is about plants (not seeds) that provides the benefit. What they do agree on, and maybe the only thing, is that plant-based foods are good for you.
One benefit of eating plants is the amount of antioxidants they provide the body. Vitamin C is one of the most important. Your ancestors were able to produce vitamin C internally. This process helped reduce the amount of free...
It’s evident from this chapter that foods carry both advantages and disadvantages, factors that can greatly influence your life and health.
How often do you eat fruits and vegetables, and which do you mostly eat?
If food is more than nutrients and diet is more than certain foods, it stands to reason that food culture is more than just diet. The sociology of food is a major component of eating and health. It involves habits, customs, manners, and the way we eat. Following traditional dietary patterns is just one step toward health. Following the customs of a culture is the other step.
When scientists ponder the French paradox, they notice only a diet of rich foods and wine alongside thinner, healthier eaters. What they don’t consider is the relationship those eaters have with their food and the culture surrounding how they eat. If they did, they’d see that the French don’t snack and eat most of their meals in social settings. Their portions are small, and they eat at a slower pace. They eat fewer calories and enjoy the experience of eating.
Psychologists refer to something called “unit bias,” which means people tend to believe whatever portion they’re served is the right amount to eat. This is why portion size matters. Plus, when you eat smaller portions more slowly, you allow the food to be savored rather than devoured, leaving you wanting more. Americans could learn a lot from the habits of...
Food culture is about more than just what you eat, and understanding what makes up a food culture might provide a new perspective of your own habits.
Do you mostly eat three conventional meals a day, or do you eat more snacks and convenience meals? Why?
Now that you’ve heard Michael Pollan’s defense of food, do you think differently about the food you eat?
What is one surprising thing you learned about the food you typically eat?