Eat food. Not too much. Mostly plants.
In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. Put another way: Foods are essentially the sum of their nutrient parts.
It follows from the premise that food is foremost about promoting physical health that the nutrients in food should be divided into the healthy ones and the unhealthy ones – good nutrients and bad.
Ever since, the history of modern nutritionism has been a history of macronutrients at war: protein against carbs; carbs against proteins, and then fats; fats against carbs. Beginning with Liebig, in each age nutritionism has organized most of its energies around an imperial nutrient: protein in the nineteenth century, fat in the twentieth, and, it stands to reason, carbohydrates will occupy our attention in the twenty-first. Meanwhile, in the shadow of these titanic struggles, smaller civil wars have raged within the sprawling empires of the big three: refined carbohydrates versus fiber; animal protein versus plant protein; saturated fats versus polyunsaturated fats; and then deep down within the province of the polyunsaturates, omega-3 fatty acids versus omega-6s.
For while it is true that Americans post-1977 did shift the balance in their diets from fats to carbs so that fat as a percentage of total calories in the diet declined (from 42 percent in 1977 to 34 percent in 1995), we never did in fact cut down on our total consumption of fat; we just ate more of other things.
I would submit that the ideology of nutritionism deserves as much of the blame as the carbohydrates themselves do – that and human nature. By framing dietary advice in terms of good and bad nutrients, and by burying the recommendation that we should eat less of any particular actual food, it was easy for the take-home message of the 1977 and 1982 dietary guidelines to be simplified as follows: Eat more low-fat foods.
And that is precisely what we did. We’re always happy to receive a dispensation to eat more of something, and one of the things nutritionism reliably gives us is some such dispensation: low-fat cookies then, low-carb beer now.
It’s hard to imagine the low-fat/high-carb craze taking off as it did or our collective health deteriorating to the extent that it has if McGovern’s original food-based recommendation had stood: Eat less meat and fewer dairy products.
Most nutritional science involves studying one nutrient at a time, a seemingly unavoidable approach that even nutritionists who do it will tell you is deeply flawed. “The problem with nutrient-by-nutrient nutrition science,” points out Marion Nestle, a New York University nutritionist, “is that it takes the nutrient out of the context of the food, the food out of the context of the diet, and the diet out of the context of the lifestyle.”
A few years ago, Rozin presented a group of Americans with the following scenario: “Assume you are alone on a desert island for one year and you can have water and one other food. Pick the food that you think would be best for your health.” The choices were corn, alfalfa sprouts, hot dogs, spinach, peaches, bananas, and milk chocolate. The most popular choice was bananas (42 percent), followed by spinach (27 percent), corn (12 percent), alfalfa sprouts (7 percent), peaches (5 percent), hot dogs (4 percent), and milk chocolate (3 percent). Only 7 percent of the participants chose one of the two foods that would in fact best support survival: hot dogs and milk chocolate.
“Fat seems to have assumed, even at low levels, the role of a toxin” in our dietary imaginations.
Thirty years of nutritional advice have left us fatter, sicker, and more poorly nourished. Which is why we find ourselves in the predicament we do: in need of a whole new way to think about eating.
We have known for a century now that there is a complex of so-called Western diseases – including obesity, diabetes, cardiovascular disease, hypertension, and a specific set of diet-related cancers – that begin almost invariably to appear soon after a people abandons its traditional diet and way of life. What we did not know before O’Dea took her Aborigines back to the bush was that some of the most deleterious effects of the Western diet could be so quickly reversed.
A whole food might be more than the sum of its nutrient parts.
The business model of the food industry is organized around “adding value” to cheap raw materials; its genius has been to figure out how to break these two big seeds down into their chemical building blocks and then reassemble them in myriad packaged food products. With the result that today corn contributes 554 calories a day to America’s per capita food supply and soy another 257. Add wheat (768 calories) and rice (91) and you can see there isn’t a whole lot of room left in the American stomach for any other foods.
Our food system has long devoted its energies to increasing yields and selling food as cheaply as possible. It would be too much to hope those goals could be achieved without sacrificing at least some of the nutritional quality of our food.
To put this in more concrete terms, you now have to eat three apples to get the same amount of iron as you would have gotten from a single 1940 apple, and you’d have to eat several more slices of bread to get your recommended daily allowance of zinc than you would have a century ago.
In addition to those higher levels of minerals, organically grown crops have also been found to contain more phytochemicals – various secondary compounds (including carotenoids and polyphenols) that plants produce in order to defend themselves from pests and diseases, many of which turn out to have important antioxidant, anti-inflammatory, and other beneficial effects in humans. Because plants living on organic farms aren’t sprayed with synthetic pesticides, they’re forced to defend themselves, with the result that they tend to produce between 10 percent and 50 percent more of these valuable secondary compounds than conventionally grown plants.
Clearly the achievements of industrial agriculture have come at a cost: It can produce a great many more calories per acre, but each of those calories may supply less nutrition than it formerly did.
Because the two fatty acids compete with each other for space in cell membranes and for the attention of various enzymes, the ratio between omega-3s and omega-6s, in the diet and in turn in our tissues, may matter more than the absolute quantity of either fat. So, too much omega-6 may be just as much of a problem as too little omega-3.
Most of the official nutritional advice we’ve been getting since the 1970s has, again unwittingly, helped to push omega-3s out of the diet and to elevate levels of omega-6. Besides demonizing fats in general, that advice has encouraged us to move from saturated fats of animal origin (some of which, like butter, actually contain respectable amounts of omega-3s) to seed oils, most of which are much higher in omega-6s (corn oil especially), and even more so after partial hydrogenation. The move from butter (and especially butter from pastured cows) to margarine, besides introducing trans fat to the diet, markedly increased omega-6s at the cost of omega-3s.
For example, the Japanese, who consume large amounts of omega-3s (most of it in fish), have markedly low rates of cardiovascular disease in spite of their high rates of smoking and high blood pressure. Americans consume only a third as much omega-3s as the Japanese and have nearly four times the rate of death from heart disease.
But at the root of all these biochemical changes is a single ecological change. For the shift from leaves to seeds affects much more than the levels of omega-3 and omega-6 in the body. It also helps account for the flood of refined carbohydrates in the modern diet and the drought of so many micronutrients and the surfeit of total calories.
Although an estimated 80 percent of cases of type 2 diabetes could be prevented by a change of diet and exercise, it looks like the smart money is instead on the creation of a vast new diabetes industry. The mainstream media is full of advertisements for gadgets and drugs for diabetics, and the health care industry is gearing up to meet the surging demand for heart bypass operations, dialysis, and kidney transplantation.
People eating a Western diet are prone to a complex of chronic diseases that seldom strike people eating more traditional diets. Scientists can argue all they want about the biological mechanisms behind this phenomenon, but whichever it is, the solution to the problem would appear to remain very much the same: Stop eating a Western diet.
– Don’t eat anything your great-grandmother wouldn’t recognize as food
– Avoid food products containing ingredients that are: a) unfamiliar, b) unpronounceable, c) more than five in number, or that include d) high-fructose corn syrup.
– Avoid food products that make health claims
– Shop the peripheries of the supermarket and stay out of the middle
– Get out of the supermarket whenever possible
Indeed, the surest way to escape the Western diet is simply to depart the realms it rules: the supermarket, the convenience store, and the fast food outlet. It is hard to eat badly from the farmers’ market, from a CSA box (community-supported agriculture, an increasingly popular scheme in which you subscribe to a farm and receive a weekly box of produce), or from your garden.
– Eat mostly plants, especially leaves
– You are what you eat eats too
– If you have the space, buy a freezer
– Eat like an omnivore
– Eat well-grown food from healthy soils
– Eat wild foods when you can
– Be the kind of person who takes supplements
We know that people who take supplements are generally healthier than the rest of us, and we also know that, in controlled studies, most of the supplements they take don’t appear to work.
– Eat more like the French, or the Italians, or the Japanese, or the Indians, or the Greeks
– Regard nontraditional foods with skepticism
– Don’t look for the magic bullet in the traditional diet
– Have a glass of wine with dinner
Not Too Much
– Pay more, eat less
The people of Okinawa, one of the longest-lived and healthiest populations in the world, practice a principle they call hara hachi bu: Eat until you are 80 percent full.
As Rozin and other psychologists have demonstrated, Americans typically eat not until they’re full but rather until they receive some visual cue from their environment that it’s time to stop: the bowl or package is empty, or the plate is clean, or the TV show is over.
In one study Wansink rigged up bowls of soup in a restaurant so they would automatically refill from the bottom; those given the bottomless bowl ate 73 percent more soup than the subjects eating from an ordinary bowl; several ate as much as a quart.
They point out that in 1980 less than 10 percent of Americans owned a microwave; by 1999 that figure had reached 83 percent of households. As technology reduces the time cost of food, we tend to eat more of it.
Is it just a coincidence that as the portion of our income spent on food has declined, spending on health care has soared? In 1960 Americans spent 17.5 percent of their income on food and 5.2 percent of national income on health care. Since then, those numbers have flipped: Spending on food has fallen to 9.9 percent, while spending on health care has climbed to 16 percent of national income.
– Eat Meals
– Do all your eating at a table
– Don’t get your fuel from the same place your car does
– Try not to eat alone
– Consult your gut
Supposedly it takes twenty minutes before the brain gets the word that the belly is full; unfortunately, most of us take considerably less than twenty minutes to finish a meal, with the result that the sensation of feeling full exerts little if any influence on how much we eat.
– Eat slowly
– Cook and, if you can, plant a garden