After watching that, I wasn’t sure if it was a comedy skit or a documentary about the American experience. Should I laugh or should I cry?
The obsession with body fat is an interesting story. It didn’t begin a few generations ago but goes back centuries. But maybe that shouldn’t be surprising.
That was the colonial era, when the diet was transformed by imperial trade in foreign foods. I might note that this included previously rare or never-before-seen varieties of fattening carbohydrates: sugar, potatoes, corn, rice, etc. The old feudal system was ending, and entirely different forms of food production and diets were developing, especially for the then-landless peasants. Hunting, gathering, and grazing for the commoners would have been in decline for a while at that point, as the last of the commons had been privatized. The loss of access to wild game took longer in the colonies, but eventually it happened everywhere.
The last stage of that shift overlapped with the beginnings of industrialization and agricultural improvement. In the 19th century, wheat surpluses grew, and costs and prices fell accordingly. Agriculture boomed even as fewer people were employed in it. There was also a sudden obsession with gender roles and social roles in general, such as the post-revolutionary expectation that the mother would make citizens out of her children. Bread-making, once an uncommon activity for Americans, became increasingly important to the normative identity of family life and the symbolic maintenance of the social order.
Regular consumption of wheat bread was once limited to the wealthy, and that is how refined bread gained its moral association with the refined class. Only the wealthy could afford wheat prior to the 19th century; before then the poor were forced to rely upon cheaper grains and grain substitutes, at a time when bread was regularly adulterated with bark, sawdust, chalk, etc. Poverty breads, in the previous centuries, often were made with no grain at all.* For wheat, and especially heavily refined white bread, to become available to all walks of life meant an upsurge of the civilizing process. The obsession with middle-class life took hold, and so cookbooks were produced in large numbers.
In a growing reactionary impulse, there was a nostalgic tendency toward invented traditions. Bread took on new meanings that were then projected onto the past. It went unacknowledged how radical the industrial agriculture and industrial milling were that made all of this possible. The disconnection is demonstrated by the simultaneous promotion of the grain production of this industrial age and the complaint about how industrialized life was destroying all that was good. Bread, as a symbol, transcended such mere details.
With the aristocracy having been challenged during the Revolutionary Era, the refinement of the refined class that once was admired became suspect. The ideology of whole foods began to emerge and found some strong proponents. But by the end of the 1800s, the ideal of refinement regained prominence and prepared the way for the following century of ever greater industrialization of processed foods. Refinement represented progress. Only after more extensive refinement led to mass malnourishment, near the end of that century and heading into the next, did whole foods once again capture the public imagination.
Then we enter the true era of fat obsession, fat blaming, and dieting, endless dieting. Eat your whole grains, get your fiber, make sure you get enough servings of fruits and veggies, and don’t forget to exercise. Calories in, calories out. Count your calories, count your carbs, count your steps. Count every last one of them. Still, the basic sides of the debate remain the same: fewer carbohydrates vs. less meat, whole foods vs. refined foods, barbaric lifestyle vs. civilizing process, individual moral failure vs. societal changes, etc. One theme that runs through dietary advice from the ancient world to the present is the close link between physical health, mental health, and moral health, the latter erupting as moral panic and moral hygiene. But what stands out about the modern era, beginning in the 1600s, is the observation that psychological problems were mostly seen among the well-to-do.
This was often blamed on luxury and sometimes on meat (a complaint often about animals raised unnaturally in confinement and probably fed grain, an early equivalent of concerns about factory farming, but also about the introduction of foreign spices and the use of fancy sauces to make meat more appetizing). Still, an awareness was emerging that a high-carb diet might be playing a role, as it was often noted that the morbidly obese ate lots of pastries, fruit pies, and such. The poor didn’t have much access to wheat and sugar before the 1800s, but the wealthy had plenty of such foods centuries earlier. Meat consumption didn’t change much during that era of colonial trade. What changed the most was the availability of starchy and sugary foods, and the wealthy consumed them in great proportions. Meat had always been a desirable food, going back to earliest hominid evolution. Modern agriculture and global trade, however, entirely transformed the human diet with the introduction of massive amounts of carbohydrates.
It’s strange that right from the beginning of the modern era there were those pushing for a vegetarian diet; not many, but their voices were being heard for the first time. Or maybe it wasn’t so strange. Prior to the modern era, a vegetarian diet so far north in Europe would have been impossible. It was only the elite promoting vegetarianism, as only they could afford a vegetarian diet year round, buying expensive plant-based foods that were often shipped in from far away. Although plant foods were expensive at the time, they were available to those with plenty of money. But during the Middle Ages and earlier, vegetarianism for the most part was not an option for anyone, since the foods required for such a diet simply weren’t available enough to sustain life, certainly not in places like England or Germany.
There is another side to this that brings us back to the obsession with fat. It was only with the gradual increase of grain production that cattle could be fed grain, not only as supplemental feed in the winter but year round. This is also what made it possible to confine animals, rather than graze them on fields. Grain surpluses weren’t consistent until the 19th century, but even before that grain production had been increasing, through slow improvements in agriculture over the centuries. The rich could afford meat from grain-fed animals much earlier than the rest of the population, and it was highly sought after, because such meat is extremely fatty, creating those beautiful marbled steaks, pork chops, etc. (such fattiness, by the way, is a sign of metabolic syndrome in both animals and humans). Fat couldn’t have been a focus of debate before grain-fattened animals became common.
So, there is a reason that both wheat bread and fatty meat gained immense symbolic potency at the same time. Similarly, it was during this same era that vegetables became more common, and gardens likewise became symbols of wealth, abundance, and the good life. Only the rich could afford to maintain large gardens, because of the difficulty involved and the immense, time-consuming work required (see The Jane Austen Diet by Bryan Kozlowski**; also, on the American diet before the 20th century, see The Big Fat Surprise by Nina Teicholz, which I quote in Malnourished Americans). Gardens represented the changed diet of modern civilization. They were either indicators of progress or of decline, depending on one’s perspective. Prior to modernity, the diet had consisted to a much greater degree of foods that were gathered, hunted, trapped, and fished.
The shift from one source of food to another changed the diet and so changed the debate about diet. There suddenly were more foods available to argue about. Diet as a concept was being more fully formulated. Rather than being something inherited according to the traditional constraints of local food systems and food customs, a diet, assuming one had the wealth, could be picked from a variety of possibilities. Even to this day, the obsession with dieting carries a taint of class privilege. It is, as they say, a first-world problem. But what is fascinating is how this way of thinking took hold in the 1600s and 1700s. There was a modern revolution in dietary thought in the generations before the modern political revolutions. The old order was falling apart and sometimes was actively being dismantled. This created much anxiety and forced the individual into a state of uncertainty. Old wisdom could no longer be relied upon.
* * *
*Rather than bread, the food most associated with the laboring class was fish, a food the wealthy avoided. Think about how lobster and clams used to be poverty foods. In the Galenic theory of humoral physiology, fish is considered cold and wet, hard to digest and weakening. This same humoral category also included fruits and vegetables. This might be why, even to this day, many vegetarians and vegans will make an exception for fish, seeing it as different from ‘meat’. This is an old ideological bias, because ‘meat’ was believed to have the complete opposite effect of being hot and dry, easy to digest and invigorating. This is the reason why meat, but not fish, was often banned during religious fasts and festivals.
As an interesting side note, the supposed cooling effect of fish was a reason for not eating it during the cold times of the year. Fish is one of the highest sources of vitamin A. Another source is the precursor beta-carotene found in vegetables. That these two types of food are considered of the same variety according to Galenic thought is interesting. Cold weather is one of the factors that can disrupt the body’s ability to convert beta-carotene into usable vitamin A. The idea of humors mixes this up slightly, but it may point to an intuition that something important was going on. Eating more meat, rather than vegetables, in winter is a wise practice in a traditional society that can’t supplement such nutrients. Vitamin A is key for maintaining a strong immune system and handling stress (True Vitamin A For Health And Happiness).
By the way, it was during the 19th century that a discussion finally arose about vegetarianism. The question was whether life and health could be sustained on vegetables. Then again, those involved were probably still being influenced by Galenic thought. By vegetarianism, they likely meant a more general plant-based diet that excluded ‘meat’ but not necessarily fish. The context of the debate was the religious abstinence of Lent, during which fish was allowed. So, maybe the fundamental argument was more about the possibility of long-term survival solely on moist, cooling foods. Whatever the exact point of contention, it was the first time in the modern Western world that a plant-based diet (be it vegan, vegetarian, or a pescetarian-style Mediterranean diet) was considered seriously.
These ideas have been inherited by us, even though the philosophical justifications no longer make sense to us. This is seen in the debate that continues over red meat in particular and meat in general, specifically in terms of the originally Galenic assertion of its heat and dryness building up the ‘blood’ (High vs Low Protein). It’s funny that dietary debates remain obsessed with red meat (along with the related issue of cows and their farts), even though actual consumption of red meat has declined over the past century. As with bread, the symbolic value of red meat has maybe even gained greater importance. Similarly, as I mentioned above, the categorization of fish remains hazy. I know a vegan who doesn’t eat ‘meat’ but does eat fish. When I noted how odd that was, a vegetarian I was talking to thought it made perfect sense. This is Galenic thought without the Galenic theory that at least made it a rational position; the ideological bias remains even though those adhering to it are unable to explain why they hold it. It amuses me.
Ideologies are powerful systems. They are mind viruses that can survive and mutate across centuries and sometimes millennia. Most of the time, their origins are lost to history. But sometimes we are able to trace them and it makes for strange material to study.
See: “Fish in Renaissance Dietary Theory” by Ken Albala from Fish: Food from the Waters ed. by Harlan Walker, and Food and Faith in Christian Culture ed. by Ken Albala and Trudy Eden. Also, read the texts below, such as the discussion of vegetarianism.
* * *
(Both texts below are from collections that are freely available on Google Books and possibly elsewhere.)
The Fat of the Land: Proceedings of the Oxford Symposium on Food and Cooking 2002
ed. by Harlan Walker
“The Apparition of Fat in Western Nutritional Theory”
by Ken Albala
Naturally dietary systems of the past had different goals in mind when framing their recommendations. They had different conceptions of the good, and at some point in history that came to include not being fat. Body size then became an official concern for dietary writers. Whether the original impetus for this change was a matter of fashion, spirituality or has its roots in a different approach to science is impossible to say with any degree of precision. But this paper will argue that nutritional science itself as reformulated in the 17th century was largely to blame for the introduction of fat into the discourse about how health should be defined. […] Obesity is a pathological state according to modern nutritional science. But it was not always so.
When and why fat became a medical issue has been a topic of concern among contemporary scholars. Some studies, such as Peter N. Stearns’ Fat History: Bodies and Beauty in the Modern West, place the origin of our modern obsession in the late 19th century when the rise of nutritional science and health movements led by figures like John Harvey Kellogg, hand in hand with modern advertising and Gibson Girls, swept away the Victorian preference for fulsome figures. As a form of social protest, those who could afford to, much as in the 60s, idealized the slim androgynous figure we associate with flappers. Others push the origin further back into the early 19th century, in the age of Muscular Christianity and Sylvester Graham. But clearly the obsession is earlier than this. In the 18th century the 448 pound physician George Cheyne and his miracle dieting had people flocking to try out the latest ‘cures.’ It was at the same time that dissertations on the topic of obesity became popular, and clearly the medical profession had classified this as a treatable condition. And readers had already been trained to monitor and police their own bodies for signs of impending corpulence. The roots of this fear and guilt must lie somewhere in the previous century as nutritional science was still groping its way through a myriad of chemical and mechanical theories attempting to quantify health and nutrition with empirical research.
The 17th century is also the ideal place to look if only because the earlier system of humoral physiology is almost totally devoid of a concept of fat as a sickness. […]
For all authors in the Galenic tradition it appears that fat was seen as a natural consequence of a complexion tending to the cold and moist, something which could be corrected, but not considered an illness that demanded serious attention. And socially there does not seem to have been any specific stigma attached to fat if Rubens’ taste in flesh is any measure.
The issue of fat really only emerges among authors who have abandoned, in part or totally, the system of humoral physiology. This seems to have something to do with both the new attempts to quantify nutrition, first and most famously by Santorio Santorio9 and also among those who began to see digestion and nutrition as chemical reactions which when gone awry cast fatty deposits throughout the body. It was only then that fat came to be considered a kind of sickness to be treated with therapy.10
The earliest indications that fat was beginning to be seen as a medical problem are found in the work of the first dietary writer who systematically weighed himself. Although Santorio does not seem to have been anxious about being overweight himself, he did consistently define health as the maintenance of body weight. Expanding on the rather vague concept of insensible perspiration used by Galenic authors, Santorio sought to precisely measure the amount of food he consumed each day compared to the amount excreted in ‘sensible’ evacuations. […] Still, fat was not a matter of eating too much. ‘He who eats more than he can digest, is nourished less than he ought to be, and [becomes] consequently emaciated.’12 More importantly, fat was a sign of a system in disarray. […]
Food was not in fact the only factor Santorio or his followers took into account though. As before, the amount of exercise one gets, baths, air quality, even emotions could alter the metabolic rate. But now, the effect of all these could be precisely calculated. […]
At the same time that these mechanistic conceptions of nutrition became mainstream, a chemical understanding of how food is broken down by means of acids and alkalis also came to be accepted by the medical profession. These ideas ultimately harked back to Paracelsus writing in the 16th century but were elaborated upon by 17th century writers […] It is clear that by the early 18th century fat could be seen as a physiological defect that could be corrected by heating the body to facilitate digestive fermentation and the passage of insensible perspiration. […] Although the theories themselves are obviously nothing like our own, we are much closer to the idea of fat as a medical condition. […]
Where Cheyne departs from conventional medical opinion, is in his recommendation of a cooked vegetable diet to counter the effects of a disordered system, which he admits is rooted in his own ‘experience and observation on my own crazy carcase and the infirmities of others I have treated’ rather than on any theoretical foundation.
The controversy over whether vegetables could be considered a proper diet, not only for the sick or overgrown but for healthy individuals, was of great concern in the 18th century. Nicholas Andry in his Traité des alimens de caresme offered an extended diatribe against the very notion that vegetables could sustain life, a question of particular importance in Catholic France where Lenten restriction were still in force, at least officially. […] According to current medical theory, vegetables could not be suitable for weight loss, despite the successful results of the empirics. […]
It is clear that authors had a number of potentially conflicting theoretical models to draw from and both mechanical and chemical explanations could be used to explain why fat accumulates in the body. Yet with entirely different conceptual tools, these authors arrived at dietary goals surprisingly like our own, and equally as contentious. The ultimate goals now became avoiding disease and fat, and living a long life. While it would be difficult to prove that these dietary authors had any major impact beyond the wealthy elites and professionals who read their works, it is clear that a concern over fat was firmly in place by the mid 18th century, and appears to have its roots in a new conception of physiology which not only paid close attention to body weight as an index of health, but increasingly saw fat as a medical condition.
Food and Morality: Proceedings of the Oxford Symposium on Food and Cookery 2007
ed. by Susan R. Friedland
“Moral Fiber: Bread in Nineteenth-Century America”
by Mark McWilliams
From Sarah Josepha Hale, who claimed, ‘the more perfect the bread, the more perfect the lady’ to Sylvester Graham, who insisted, ‘the wife, the mother only’ has the ‘moral sensibility’ required to bake good bread for her family, bread often became a gendered moral marker in nineteenth-century American culture.1 Of course, what Hale and Graham considered ‘good’ bread differed dramatically, and exactly what constituted ‘good’ bread was much contested. Amidst technological change that made white flour more widely available and home cooking more predictable, bread, described in increasingly explicit moral terms, became the leading symbol of a housewife’s care for her family.
Americans were hardly the first to ascribe moral meaning to their daily bread. As Bernard Dupaigne writes, ‘since time immemorial [bread] has attended the great events of various human communities: monsoon or grape harvest bread, the blessed bread of Catholics or the unleavened bread of Passover, or the fasting-break bread of Ramadan. There is no bread that does not, somewhere in the world, celebrate an agricultural or religious holiday, enrich a family event, or commemorate the dead.’2 With such varied symbolic resonance, bread seems easily filled with new meanings.
In America (as later in France),3 bread became a revolutionary symbol. To the early English colonists’ dismay, European wheat did not adapt well to the North American climate; the shift to corn as the primary grain was perhaps the most important dietary adaptation made by the colonists. Wheat remained too expensive for common consumption well into the nineteenth century. […]
By the end of the Revolution, then, bread was already charged with moral meaning in the young United States. In the nineteenth century, this meaning shifted in response to agricultural improvements that made wheat more widely available, technological change that made bread easier to make consistently, and, perhaps most important, social change that made good bread the primary symbol of a housewife’s care for her family. In effect, bread suffered a kind of identity crisis that paralleled the national identity crisis of Jacksonian America. As Americans thought seriously about who they were in this new nation, about how they should act and even how they should eat, bread’s symbolic meaning – and bread itself– changed.
American agricultural production exploded, although the proportion of the population working on farms declined. James Trager notes that even before the McCormick reaper first sold in large numbers as farmers struggled to replace workers leaving for the 1849 Gold Rush, the average time required to produce a bushel of wheat declined 22 per cent from 1831 to 1840.7 Dramatic improvements in efficiency led to larger yields; for example, wheat production more than doubled between 1840 and 1860. Such increases in wheat production, combined with better milling procedures, made white flour finally available in quantities sufficient for white bread to become more than a luxury good.8
Even as wheat became easier to find for many Americans, bread remained notoriously difficult to make, or at least to make well. Lydia Maria Child, a baker’s daughter who became one of America’s leading writers, emphasizes what must have been the intensely frustrating difficulty of learning to cook in the era before predictable heat sources, standardized measurements, and consistent ingredients.9 […]
Unlike Hale, who implies that learning to bake better can be a kind of self improvement, this passage works more as dire warning to those not yet making the proper daily bread. Though bread becomes the main distinction between the civilized and the savage, Beecher turns quickly, and reassuringly, to the science of her day: ‘By lightness is meant simply that in order to facilitate digestion the particles are to be separated from each other by little holes or air-cells; and all the different methods of making light bread are neither more nor less than the formation of bread with these air cells’ (170). She then carefully describes how to produce the desired lightness in bread, instructions which must have been welcome to the young housewife now fully convinced of her bread’s moral importance.
The path for Beecher, Hale, and others had been prepared by Sylvester Graham, although he is little mentioned in their work.14 In his campaign to improve bread, Graham’s rhetoric ‘romanticized the life of the traditional household’ in ways that ‘unknowingly helped prepare women to find a new role as guardians of domestic virtue,’ as Stephen Nissenbaum notes.15 Bread was only one aspect of Graham’s program to educate Americans on what he called ‘the Science of Human Life.’ Believing on the one hand, unlike many at the time, that overstimulation caused debility and, on the other, that industrialization and commercialization were debasing modern life, Graham proposed a lifestyle based around strict controls on diet and sexuality.16 While Graham promoted a range of activities from vegetarianism to temperance, his emphasis on good bread was most influential. […]
And yet modern conditions make such bread difficult to produce. Each stage of the process is corrupted, according to Graham. Rather than grow wheat in ‘a pure virgin soil’ required for the best grain, farmers employ fields ‘exhausted by tillage, and debauched by the means which man uses to enrich and stimulate it.’ As Nissenbaum notes, the ‘conscious sexual connotations’ of Graham’s language here is typical of his larger system, but the language also begins to point to the moral dimensions of good bread (6).
Similarly loaded language marks Graham’s condemnation of bakery bread. Graham echoed the common complaints about adulteration by commercial bakers. But he added a unique twist: even the best bakery bread was doubly flawed. The flour itself was inferior because it was over-processed, according to Graham: the ‘superfine flour’ required for white bread ‘is always far less wholesome, in any and every situation of life, than that which is made of wheaten meal which contains all the natural properties of the grain.’ […]
As Nissenbaum argues, pointing to this passage, Graham’s claims invoke ‘the vision of a domestic idyll, of a mother nursing her family with bread and affection’ (8). Such a vision clearly anticipates the emphasis on cookery as measure of a woman’s social worth in the domestic rhetoric that came so to characterize the mid-nineteenth century.
Such language increasingly linking cookery with morality emphasized the virtue not of the food itself but rather of the cooks preparing it. This linkage reached readers not only through the explosion of cookbooks and domestic manuals but also through the growing numbers of sentimental novels. Indeed, this linkage provided a tremendously useful trope for authors seeking a shorthand to define their fictional characters. And that trope, in turn, helped expand the popularity of interpreting cookery in moral terms. […]
After the Civil War, domestic rhetoric evolved away from its roots in the wholesome foods of the nation’s past toward the ever-more refined cuisine of the Gilded Age. Graham’s refusal to evolve in this direction – his system was based entirely in a nostalgic struggle against modernity, against refinement – may well be a large part of why his work was quickly left behind even by those for whom it had paved the way.
* * *
Here is another text I came across. It’s not free, but it seems like a good survey worth buying.
At Optimizing Nutrition, there is a freaking long post with a ton of info: Do we need meat from animals? Let me share some of the charts showing changes over the past century. As calories have increased, the nutrient content of food has been declining. Also, with vegetable oils and margarine shooting up, animal fat and dietary cholesterol intake has dropped.
Carbs are a bit different. They increased some in the early 20th century. That was in response to meat consumption having declined after Upton Sinclair’s muckraking of the meat industry with his book The Jungle, precisely at the time when industrialization had made starchy carbs and added sugar more common. For perspective, read Nina Teicholz’s account of the massive consumption of animal foods, including nutrient-dense animal fat and organ meats, among Americans in the prior centuries:
“About 175 pounds of meat per person per year! Compare that to the roughly 100 pounds of meat per year that an average adult American eats today. And of that 100 pounds of meat, more than half is poultry—chicken and turkey—whereas until the mid-twentieth century, chicken was considered a luxury meat, on the menu only for special occasions (chickens were valued mainly for their eggs). Subtracting out the poultry factor, we are left with the conclusion that per capita consumption of red meat today is about 40 to 70 pounds per person, according to different sources of government data—in any case far less than what it was a couple of centuries ago.” (The Big Fat Surprise, passage quoted in Malnourished Americans).
What we forget, though, is that low-carb was popular for a number of decades. In the world war era, there was a lot of research on the ketogenic diet. Then around mid-century, low-carb diets became common and carb intake fell. Atkins didn’t invent the low-carb diet. Scientific conferences on diet and nutrition, into the 1970s, regularly had speakers on low-carb diets (either Gary Taubes or Nina Teicholz mentions this). It wasn’t until 1980 that the government began seriously promoting the high-carb diet that has afflicted us ever since. Carb intake peaked around 2000 and dropped a bit after that, but has remained relatively high.
The inflammatory omega-6 fatty acids, combined with all the carbs, have caused obesity as part of metabolic syndrome. That goes along with the lack of nutrition that has caused endless hunger, as Americans have been eating empty calories. The more crap you eat, the more your body hungers for nutrition. And all that crap is designed to be highly addictive. So, Americans eat and eat, the body hungering for nutrition and not getting it. Under natural conditions, hunger is a beneficial signal to seek out what the body needs. But such things as sugar have become unlinked from nutrient density.
Unsurprisingly, Americans have been getting sicker and sicker, decade after decade. But on a positive note, there has recently been a slight drop in how many carbs Americans are eating, particularly added sugar. And it does seem to be making a difference. There is evidence that the diabetes epidemic might finally be reversing. Low-carb diets are becoming popular again, after almost a half century of public amnesia. That is good. Still, the food most Americans have access to remains low quality and lacking in nutrition.
Studies have shown that a low-carb, high-fat diet improves health. But it wasn’t clear whether this is caused directly by the diet or instead by the fat loss that is a common result of the diet. In a new 3-year study, researchers controlled for fat loss, and many of the same health benefits were still seen.
The researchers did this by providing prepared meals. They had to make sure that the subjects were getting enough calories to lose no weight. This meant increasing fat intake, sometimes by extraordinary amounts. Despite this including an increase in saturated fat, there was no increase of saturated fat in the bloodstream. This is yet more evidence against the scapegoating of saturated fat. The diets also would have been high in cholesterol and, unsurprisingly, all the cholesterol health markers were positive.
On the other hand, there are confounding factors. Subjects were given prepared meals, which would naturally decrease the consumption of processed foods. To really understand what was going on, we would have to look at the precise ingredients. For example, did these prepared meals contain less of the industrial vegetable oils that are known to cause all kinds of health problems, including contributing to metabolic syndrome?
The fact that there was a greater amount of saturated fat in the diet indicates that the kind of fat one eats does matter. So, simply replacing sources of PUFAs with healthy fats, including saturated fats, can lead to massive improvements, whether the benefit comes from what is being eliminated or from what is being added. Still, from what we know about the harm caused by excess starches and sugar, it's hard to conclude that this study merely showed the positive effects of changes in the amounts and kinds of fats.
Whatever the cause, it is well-established at this point that a low-carb, high-fat diet is healthy. This is true whether or not there is fat loss. Yet considering that fat loss is a definite health benefit typical of this diet, it demonstrates how the advantages are multiple. If you need to lose weight, it's the best diet around. But if you don't need to lose weight, it's still great. There is no way for you not to come out ahead.
* * *
Dietary carbohydrate restriction improves metabolic syndrome independent of weight loss
by Parker N. Hyde et al
Silence on the US Front–News Flash of US Research from the UK!
by Angela A. Stanton
Low-Carb Diet Could Reduce Risk of These Diseases
by Kashmira Gander
Eric “Butterbean” Esch, having weighed 425 lbs at his heaviest, was one of the best boxers of the 1990s. He regularly knocked out his competitors in under a minute. He didn’t look impressive, besides being obese. He wasn’t the best trained nor did he fight with much style. But he was a powerhouse. He could take punches and give them in return. And when he landed a punch, it was devastating.
As with many others, Butterbean’s obesity was not an indicator of a lack of muscle, stamina, and aerobic health. Even in later fights when his power was decreased, he still could hold his own for many rounds. In 2002, he remained on his feet for 10 rounds with one of the greatest fighters of all time, Larry Holmes, before finally knocking him back against the ropes with the fight ending after the referee did a standing 8 count. He expanded his career into professional wrestling and MMA matches, winning many more fights. As late as 2011 in his mid-40s, he was still knocking out opponents and he was still fat.
This is why so few people can lose weight through exercise alone. For most, specifically on a high-carb diet, all that more exercise does is make them hungrier, leading them to eat more (exercise on a ketogenic diet is a bit different, though). And indeed, many athletes end up focusing on carbs in trying to maintain their energy, as glucose gets used up so quickly (as opposed to ketones). Long-distance runners on a high-carb diet have to constantly refuel with sugary drinks provided along the way.
Americans have been advised to eat more of the supposedly healthy carbs (whole grains, vegetables, fruit, etc) while eating less of the supposedly unhealthy animal foods (red meat, saturated fats, etc), and the data shows they are doing exactly that, more than at any point since records have been kept. But telling people that eating lots of carbs, even if from "whole foods", is part of a healthy diet is bad advice. And when they gain weight, blaming them for not exercising enough is bad advice stacked upon bad advice.
Such high-carb diets don’t do any good for long-term health, even for athletes. Morally judging fat people as gluttonous and slothful simply doesn’t make sense and it is the opposite of helpful, a point that Gary Taubes has made. It’s plain bullshit and this scapegoating of the victims of bad advice is cruel.
This is why so many professional athletes get fat when they retire, after a long career of eating endless carbs, not that it ever was good for their metabolic health (people can be skinny fat with adipose around their internal organs and have diabetes or pre-diabetes). But some, like Butterbean, began their athletic careers fat and remained fat. Many football players are similarly overweight. William Perry, AKA The Fridge, was an example of that, although he was a relative lightweight at 335-350 lbs. Even more obvious examples are seen with some gigantic sumo wrestlers who, while grotesquely obese, are immensely strong athletes.
Sumo wrestlers are also a great example of the power of a high-carb diet. They will intentionally consume massive amounts of starches and sugars in order to put on fat. That is old knowledge: people have understood for centuries that the best way to fatten cattle is to feed them grains. And it isn't as if cattle get fat by being lazy while sitting on the couch watching TV and playing on the internet. It's the diet alone that accomplishes that feat of deliciously marbled flesh. Likewise, humans eating a high-carb diet will make their own muscles and organs marbled.
I speak from personal experience, after gaining weight in my late 30s and into my early 40s. I topped out at around 220 lbs — not massive, but way beyond my weight in my early 20s when I was super skinny, maybe down in the 140 lbs range (the result of a poverty diet and I looked gaunt at the time). In recent years, I had developed a somewhat protruding belly and neck flabs. You could definitely tell I was carrying extra fat. Could you tell that I also was physically fit? Probably not.
No matter how much I exercised, I could not lose weight. I was jogging out to my parents' place, often while carrying a backpack that sometimes added another 20-30 lbs (books, water bottle, etc). That jog took about an hour and I did it 3-4 times a week, and I was doing some weightlifting as well, but my weight remained the same. Keep in mind I was eating what, according to official dietary guidelines, was a 'balanced' diet. I had cut back on my added sugars over the years, only allowing them as part of healthy whole foods such as in kefir, kombucha, and fruit. I was emphasizing lots of vegetables and fiber. This often meant starting my day with a large bowl of bran cereal topped with blueberries or dried fruit.
I was doing what Americans have been told is healthy. I could not lose any of that extra fat, in spite of all my effort and self-control. Then in the spring of last year I went on a low-carb diet that transitioned into a very low-carb diet (i.e., keto). In about 3 months, I lost 60 lbs and have kept it off since. I didn’t do portion control and didn’t count calories. I ate as much as I wanted, but simply cut out the starches and sugars. No willpower was required, as on a keto diet my hunger diminished and my cravings disappeared. It was the high-carb diet that had made me fat, not a lack of exercise.
Vitamin D3, a fat-soluble vitamin, is one of the most important micronutrients. I won't describe all of its health benefits. But its effect on the body can be more like that of a hormone in how powerfully it influences numerous physiological processes and systems.
Here is what I’ll emphasize for the moment, as an example of how to think about health in a more complex way. Unless you live near the equator and are near naked outside in the sun for most of the day, you are guaranteed to not be getting enough vitamin D3 through your body’s own production of it. The only other natural source is from animal foods. So, be sure to eat plenty of fatty animal foods from pasture-raised animals, especially organ meats, eggs, and dairy.
Let me throw out the issue of autophagy. Eating protein, as with eating carbs or really anything, shuts down autophagy. And we want some autophagy (i.e., cellular repair and regrowth) as it is essential to health and longevity. Some people blame protein for lack of autophagy, but that is nonsense. It is no more to blame than anything else. Sure, you should fast from protein on occasion. Then again, you should fast from everything on occasion. But fasting won’t give you the benefits of autophagy if you don’t have all that is required to make this possible. Guess which nutrient enhances autophagy? Yep, vitamin D3.
Someone severely restricting their protein consumption is unintentionally also restricting their vitamin D3 intake. They'll have a harder time getting into full autophagy with all of its benefits. This is even more true for those who, in avoiding fatty meats, eat a high-carb/low-fat diet instead. Not only are they not getting healthy amounts of vitamin D3, but they also aren't regularly in ketosis. And one has to first be in ketosis before one can be in autophagy. On a high-fat ketogenic diet, all it will take to get autophagy is a relatively shorter fast because the body is already fully primed for it.
It is true that eating protein shuts down autophagy by up-regulating what causes biological growth by way of mTOR and IGF-1. That isn't a bad thing. We want our bodies to grow, just as we also want our bodies to repair. The optimal condition is to cycle back and forth between these two states. Vitamin D3 from fatty animal foods is key for both, as it promotes bone growth and promotes autophagy, among much else. Don't deny yourself. Enjoy those delicious fats from high quality sources. Feast until satiation and, to balance it out, fast on occasion.
* * *
As a side note, deficiency in vitamin D3 is associated with such things as Alzheimer’s.
It makes me wonder if that is related to the role of vitamin D3 in autophagy. Alzheimer's is accumulated damage involving (among other factors) insulin resistance and inflammation, both of which would relate to low-carb/high-fat diets along with ketosis and autophagy.
But vitamin D3 out of balance can also be a problem, as it works closely with the fat-soluble vitamin A (retinol). Vitamins A and D3 form a fat-soluble trio with vitamin K2. You can learn more about this from Kate Rheaume-Bleue, although credit must be given to Weston A. Price.
* * *
Primal Fat Burner
by Nora Gedgaudas
You’ve likely heard of the “French paradox”—that, despite the French people’s high consumption of saturated fat, their rates of heart disease are lower than ours in the United States. Here in our country we’re stuck in an unfortunate situation that I call the American paradox: the more closely you follow official dietary government guidelines, the worse your health is likely to be! 11 The USDA is busy telling Americans to base their daily diets upon low-fat, starchy carbohydrates and get more exercise; meanwhile, the obesity epidemic and related health challenges continue to grow. (This paradox is global, by the way—countries such as India are seeing skyrocketing rates of diabetes, and the vegetarians of southern India have literally the world’s shortest life span.)
Trying to make sense of all this is a bit like Alice falling down a rabbit hole; everything seems upside down and nonsensical. Let’s take a brief look at the stats. According to the Food Research and Action Center (FRAC), after decades of being subjected to government guidelines promoting a low-fat and high-carbohydrate diet, Americans show the following problems: 12
- 68.5 percent of adults are overweight or obese; 34.9 percent are obese. (Compare this to the 1971 overweight statistic of 42 percent.)
- 31.8 percent of children and adolescents are overweight or obese; 16.9 percent are obese.
- 30.4 percent of low-income preschoolers are overweight or obese.
Yet another study published in May 2015 examining the impact of dietary guidelines on the health of US citizens yielded some shocking but undeniable conclusions: rates of obesity and diabetes have increased dramatically. 13 The official government dietary recommendations were intended to prevent weight problems and obesity, along with diabetes, cancer, and other chronic diseases. The fact that this has not happened—and that the reverse is true—is officially rationalized in a number of ways. 14 But the underlying message is that we are dumb and lazy. That’s right—the party line about why official dietary recommendations (such as from the American Heart Association and the US Departments of Agriculture and Health and Human Services) have failed is that Americans are to blame because we don’t follow the guidelines and we don’t work out enough. 15 In other words, if we’re sick, it’s our own fat, stupid fault.
This is such a persistent, morale-killing, and completely misleading message that I want to address it directly before we move on.
First, we have collectively and diligently followed the guidelines. Here’s what official guidelines recommend for our daily diets versus what we are currently doing in reality (RDA stands for Recommended Daily Allowance):
Total fat consumption. RDA says a maximum of 35 percent of calories; reality says about 34 percent. (Let’s not pat ourselves on the back, though—the number one source of those fat calories is partially hydrogenated oil from genetically modified soybeans, one of the worst things for the body!)
Saturated fats. RDA says a maximum of 10 percent saturated fat; reality says just under 11 percent (not terribly naughty or rebellious relative to established government recommendations).
Carbs. RDA says 55 to 65 percent, with 45 percent the smallest amount necessary to meet the (unfounded) “optimal dietary requirements”; reality says over 50 percent. This is more than enough to create a health-compromising, sugar-burning metabolism.
Protein. RDA says between 10 and 35 percent; reality says 15 percent.
As you can see, Americans are meeting the established dietary requirements, and we have largely eschewed our national interest in protein in favor of far more addictive carbohydrates. Isn’t it strange, then, that the predominant health messages we hear are that we eat too much animal protein and saturated fat for our own good, and that those are the things that make us overweight and cause heart-related and other health problems?
Meanwhile, FRAC looked at historical shifts and found that the consumption of fats dropped from 45 to 34 percent of total caloric intake between 1971 and 2011, while carbohydrate consumption jumped from 39 to 51 percent. In the same time, obesity has surged by over 25 percent. We have diligently increased our consumption of carbohydrates and reduced our intake of animal fat and cholesterol for over five decades, according to the rules—and we have gotten fatter. Processed foods that contain chemicals such as MSG, Frankenfoods that contain genetically modified organisms (GMOs), hydrogenated and interesterified vegetable oils, and other damaging ingredients such as high fructose corn syrup are to thank for a good part of this disaster. But the promotion of higher-carb, low-fat diets has also undeniably served to push everyone in the wrong direction. (FRAC concluded, as many scientists have, that the increased consumption of carbohydrates is what has caused the huge increase in overweight and obesity.)
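To make the percent-of-calories figures in the excerpt above concrete, here is a minimal sketch that converts each macronutrient share into grams per day. The 2,000 kcal/day total is my own illustrative assumption, not a number from the excerpt; the 9 kcal/g and 4 kcal/g conversion factors are the standard Atwater values.

```python
# Convert percent-of-calories figures into grams per day.
# Assumes an illustrative 2,000 kcal/day intake (not from the text) and
# standard Atwater factors: 9 kcal/g for fat, 4 kcal/g for carbs and protein.
KCAL_PER_GRAM = {"fat": 9, "carbs": 4, "protein": 4}

def grams_per_day(percent_of_calories, macro, total_kcal=2000):
    """Grams of a macronutrient supplying the given share of daily calories."""
    return total_kcal * (percent_of_calories / 100) / KCAL_PER_GRAM[macro]

# Reported current American intake from the excerpt: ~34% fat, ~51% carbs, ~15% protein
for macro, pct in [("fat", 34), ("carbs", 51), ("protein", 15)]:
    print(f"{macro}: {grams_per_day(pct, macro):.0f} g/day")
```

On those assumptions, the reported American diet works out to roughly 76 g of fat, 255 g of carbohydrate, and 75 g of protein per day, which shows how the carb share dominates by weight even when fat supplies a third of calories.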
In understanding human health, we have to look at all factors as a package deal. Our gut-brain is a system, as is our entire mind-body. Our relationships, lifestyle, the environment around us — all of it is inseparable. This is true even if we limit ourselves to diet alone. It’s not simply calories in/calories out, macronutrient ratios, or anything else along these lines. It is the specific foods eaten in combination with which other foods and in the context of stress, toxins, epigenetic inheritance, gut health, and so much else that determine what effects manifest in the individual.
There are numerous examples of this. But I’ll stick to a simple one, which involves several factors and the relationship between them. First, red meat is associated with cancer and heart disease. Yet causation is hard to prove, as red meat consumption is associated with many other foods in the standard American diet, such as added sugars and vegetable oils in processed foods. The association might be based on confounding factors that are culture-specific, which can explain why we find societies with heavy meat consumption and little cancer.
So, what else might be involved? We have to consider what red meat is being eaten with, at least in the standard American diet that is used as a control in most research. There are, of course, the added sugars and vegetable oils; they are seriously bad for health and may explain much of the confusion. Saturated fat intake has been dropping since the early 1900s and, in its place, there has been a steady rise in the use of vegetable oils; we now know that highly heated and hydrogenated vegetable oils do severe damage. Also, some of the original research that blamed saturated fat, when re-analyzed, found that sugar had the stronger correlation to heart disease.
Saturated fat, as with cholesterol, had been wrongly accused. This misunderstanding has, over multiple generations at this point, led to the early death of at least hundreds of millions of people worldwide, as dozens of the wealthiest and most powerful countries enforced this in their official dietary recommendations which transformed the world’s food system. Similar to eggs, red meat became the fall guy.
Such things as heart disease are related to obesity, and conventional wisdom tells us that fat makes us fat. Is that true? Not exactly or directly. I was amused to discover that a scientific report commissioned by the British government in 1846 (Experimental Researches on the Food of Animals, and the Fattening of Cattle: With Remarks on the Food of Man. Based Upon Experiments Undertaken by Order of the British Government by Robert Dundas Thomson) concluded that “The present experiments seem to demonstrate that the fat of animals cannot be produced from the oil of the food” — fat doesn’t make people fat, and that low-carb meat-eating populations tend to be slim has been observed for centuries.
So, in most cases, what does cause fat accumulation? It is fat combined with plenty of carbs and sugar that is all but guaranteed to make us fat; that is to say, fat in the presence of glucose, since the two compete as fuel sources.
Think about what an American meal with red meat looks like. A plate might have a steak with some rolls or slices of bread, combined with a potato and maybe some starchy 'vegetables' like corn, peas, or lima beans. Or there will be a hamburger with a bun, a side of fries, and a large sugary drink ('diet' drinks are no better, as we now know artificial sweeteners fool the body and so are just as likely to make you fat and diabetic). What is the common factor? Red meat combined with wheat or some other grain, as part of a diet drenched in carbs and sugar (and all of it cooked or slathered in vegetable oils).
Most Americans have a far greater total intake of carbs, sugar, and vegetable oils than of red meat and saturated fat. The preferred meat of Americans these days is chicken, with fish also being popular. Why do red meat and saturated fat continue to be blamed for the worsening rates of heart disease and metabolic disease? It's simply not rational, based on the established facts in the field of diet and nutrition. That isn't to claim that too much red meat couldn't be problematic. It depends on the total diet. Also, Americans have the habit of grilling their red meat, and grilling increases carcinogens. That could be avoided by not charring one's meat, but the same applies to not burning (or frying) anything one eats, including white meat and plant foods. In terms of this one factor, you'd be better off eating beef roasted with vegetables than going with a plant-based meal that included foods like french fries, fried okra, grilled vegetable shish kabobs, etc.
Considering all of that, what exactly is the cause of cancer that keeps showing up in epidemiological studies? Sarah Ballantyne has some good answers to that (see quoted passage below). It’s not so much about red meat itself as it is about what red meat is eaten with. The crux of the matter is that Americans eat more starchy carbs, mostly refined flour, than they do vegetables. What Ballantyne explains is that two of the potential causes of cancer associated with red meat only occur in a diet deficient in vegetables and abundant in grains. It is the total diet as seen in the American population that is the cause of high rates of cancer.
Just as a heavy meat diet without grains is not problematic, a heavy carb diet without grains is also not necessarily problematic. Some of the healthiest populations eat lots of carbs like sweet potatoes, but you won't find any healthy population that eats as many grains as Americans do. There are many issues with grains considered in isolation (read the work of David Perlmutter or any number of writers on the paleo diet), but it is grains combined with certain other foods in particular that can contribute to health concerns.
Then again, some of this is about proportion. For most of the history of agriculture, humans ate small amounts of grains as an occasional food. Grains tended to be stored for hard times or for trade, or else turned into alcohol to be mixed with water from unclean sources. The shift to large amounts of grains made into refined flour is an evolutionarily unique dilemma our bodies aren't designed to handle. The first accounts of white bread are found in texts from slightly over two millennia ago, and most Westerners couldn't afford white bread until the past few centuries when industrialized milling began. Before that, people tended to eat foods that were available and didn't mix them as much (e.g., eat fruits and vegetables in season). Hamburgers were invented only about a century ago. The constant combining of red meat and grains is not something we are adapted for. That harm to our health results maybe shouldn't surprise us.
Red meat can be a net loss to health or a net gain. It depends not on the red meat, but what is and isn’t eaten with it. Other factors matter as well. Health can’t be limited to a list of dos and don’ts, even if such lists have their place in the context of more detailed knowledge and understanding. The simplest solution is to eat as most humans ate for hundreds of thousands of years, and more than anything else that means avoiding grains. Even without red meat, many people have difficulties with grains.
Let’s return to the context of evolution. Hominids have been eating fatty red meat for millions of years (early humans having prized red meat from blubbery megafauna until their mass extinction), and yet meat-eating hunter-gatherers rarely get cancer, heart disease, or any of the other modern ailments. How long ago was it when the first humans ate grains? About 12 thousand years ago. Most humans on the planet never touched a grain until the past few millennia. And fewer still included grains with almost every snack and meal until the past few generations. So, what is this insanity of government dietary recommendations putting grains as the base of the food pyramid? Those grains are feeding the cancerous microbes, and doing much else that is harmful.
In conclusion, is red meat bad for human health? It depends. Red meat that is charred or heavily processed combined with wheat and other carbs, lots of sugar and vegetable oils, and few nutritious vegetables, well, that would be a shitty diet that will inevitably lead to horrible health consequences. Then again, the exact same diet minus the red meat would still be a recipe for disease and early death. Yet under other conditions, red meat can be part of a healthy diet. Even a ton of pasture-raised red meat (with plenty of nutrient-dense organ meats) combined with an equal amount of organic vegetables (grown on healthy soil, bought locally, and eaten in season), in exclusion of grains especially refined flour and with limited intake of all the other crap, that would be one of the healthiest diets you could eat.
On the other hand, if you are addicted to grains as many are and can’t imagine a world without them, you would be wise to avoid red meat entirely. Assuming you have any concerns about cancer, you should choose one or the other but not both. I would note, though, that there are many other reasons to avoid grains while there are no other known reasons to avoid red meat, at least for serious health concerns, although some people exclude red meat for other reasons such as digestion issues. The point is that whether or not you eat red meat is a personal choice (based on taste, ethics, etc), not so much a health choice, as long as we separate out grains. That is all we can say for certain based on present scientific knowledge.
* * *
We’ve known about this for years now. Isn’t it interesting that no major health organization, scientific institution, corporate news outlet, or government agency has ever warned the public about the risk factors of carcinogenic grains? Instead, we get major propaganda campaigns to eat more grains because that is where the profit is for big ag, big food, and big oil (that makes farm chemicals and transports the products of big ag and big food). How convenient! It’s nice to know that corporate profit is more important than public health.
But keep listening to those who tell you that cows are destroying the world, even though there are fewer cows in North America than there once were buffalo. Yeah, monocultural GMO crops immersed in deadly chemicals that destroy soil and deplete nutrients are going to save us, not traditional grazing land that existed for hundreds of millions of years. So, sure, we could go on producing massive yields of grains in a utopian fantasy beloved by technocrats and plutocrats that further disconnects us from the natural world and our evolutionary origins, an industrial food system dependent on turning the whole world into endless monocrops denatured of all other life, making entire regions into ecological deserts that push us further into mass extinction. Or we could return to traditional ways of farming and living with a more traditional diet largely of animal foods (meat, fish, eggs, dairy, etc) balanced with an equal amount of vegetables, the original hunter-gatherer diet.
Our personal health is important. And it is intimately tied to the health of the earth. Civilization as we know it was built on grains. That wasn’t necessarily a problem when grains were a small part of the diet and populations were small. But is it still a sustainable socioeconomic system as part of a healthy ecological system? No, it isn’t. So why do we continue to do more of the same that caused our problems in the hope that it will solve our problems? As we think about how different parts of our diet work together to create conditions of disease or health, we need to begin thinking this way about our entire world.
* * *
by Sarah Ballantyne
While this often gets framed as an argument for going vegetarian or vegan, it's actually a reflection of the importance of eating plenty of plant foods along with meat. When we take a closer look at these studies, we see something extraordinarily interesting: the link between meat and cancer tends to disappear once the studies adjust for vegetable intake. Even more exciting, when we examine the mechanistic links between meat and cancer, it turns out that many of the harmful (yes, legitimately harmful!) compounds of meat are counteracted by protective compounds in plant foods.
One major mechanism linking meat to cancer involves heme, the iron-containing compound that gives red meat its color (in contrast to the nonheme iron found in plant foods). Where heme becomes a problem is in the gut: the cells lining the digestive tract (enterocytes) metabolize it into cytotoxic compounds (meaning toxic to living cells), which can then damage the gut barrier (specifically the colonic mucosa; see page 67), cause cell proliferation, and increase fecal water toxicity—all of which raise cancer risk. Yikes! In fact, part of the reason red meat is linked with cancer far more often than with white meat could be due to their differences in heme content; white meat (poultry and fish) contains much, much less.
Here’s where vegetables come to the rescue! Chlorophyll, the pigment in plants that makes them green, has a molecular structure that’s very similar to heme. As a result, chlorophyll can block the metabolism of heme in the intestinal tract and prevent those toxic metabolites from forming. Instead of turning into harmful by-products, heme ends up being metabolized into inert compounds that are no longer toxic or damaging to the colon. Animal studies have demonstrated this effect in action: one study on rats showed that supplementing a heme-rich diet with chlorophyll (in the form of spinach) completely suppressed the pro-cancer effects of heme. All the more reason to eat a salad with your steak.
Another mechanism involves L-carnitine, an amino acid that's particularly abundant in red meat (another candidate for why red meat seems to disproportionately increase cancer risk compared to other meats). When we consume L-carnitine, our intestinal bacteria metabolize it into a compound called trimethylamine (TMA). From there, the TMA enters the bloodstream and gets oxidized by the liver into yet another compound, trimethylamine-N-oxide (TMAO). This is the one we need to pay attention to!
TMAO has been strongly linked to cancer and heart disease, possibly due to promoting inflammation and altering cholesterol transport. Having high levels of it in the bloodstream could be a major risk factor for some chronic diseases. So is this the nail in the coffin for meat eaters?
Not so fast! An important study on this topic published in 2013 in Nature Medicine sheds light on what’s really going on. This paper had quite a few components, but one of the most interesting has to do with gut bacteria. Basically, it turns out that the bacteria group Prevotella is a key mediator between L-carnitine consumption and having high TMAO levels in our blood. In this study, the researchers found that participants with gut microbiomes dominated by Prevotella produced the most TMA (and therefore TMAO, after it reached the liver) from the L-carnitine they ate. Those with microbiomes high in Bacteroides rather than Prevotella saw dramatically less conversion to TMA and TMAO.
Guess what Prevotella loves to snack on? Grains! It just so happens that people with high Prevotella levels tend to be those who eat grain-based diets (especially whole grain), since this bacterial group specializes in fermenting the type of polysaccharides abundant in grain products. (For instance, we see extremely high levels of Prevotella in populations in rural Africa that rely on cereals like millet and sorghum.) At the same time, Prevotella doesn't seem to be associated with a high intake of non-grain plant foods, such as fruit and vegetables.
So is it really the red meat that’s a problem . . . or is it the meat in the context of a grain-rich diet? Based on the evidence we have so far, it seems that grains (and the bacteria that love to eat them) are a mandatory part of the L-carnitine-to-TMAO pathway. Ditch the grains, embrace veggies, and our gut will become a more hospitable place for red meat!
* * *
Georgia Ede has a detailed article about the claim of meat causing cancer. In it, she provides several useful summaries of and quotes from the scientific literature.
In November 2013, 23 cancer experts from eight countries gathered in Norway to examine the science related to colon cancer and red/processed meat. They concluded:
“…the interactions between meat, gut and health outcomes such as CRC [colorectal cancer] are very complex and are not clearly pointing in one direction….Epidemiological and mechanistic data on associations between red and processed meat intake and CRC are inconsistent and underlying mechanisms are unclear…Better biomarkers of meat intake and of cancer occurrence and updated food composition databases are required for future studies.” 1) To read the full report: http://www.ncbi.nlm.nih.gov/pubmed/24769880 [open access]
Translation: we don’t know if meat causes colorectal cancer. Now THAT is a responsible, honest, scientific conclusion.
How the WHO?
How could the WHO have come to such a different conclusion than this recent international gathering of cancer scientists? As you will see for yourself in my analysis below, the WHO made the following irresponsible decisions:
- The WHO cherry-picked studies that supported its anti-meat conclusions, ignoring those that showed either no connection between meat and cancer or even a protective effect of meat on colon cancer risk. These neutral and protective studies were specifically mentioned within the studies cited by the WHO (which makes one wonder whether the WHO committee members actually read the studies referenced in its own report).
- The WHO relied heavily on dozens of “epidemiological” studies (which by their very nature are incapable of demonstrating a cause and effect relationship between meat and cancer) to support its claim that meat causes cancer.
- The WHO cited a mere SIX experimental studies suggesting a possible link between meat and colorectal cancer, four of which were conducted by the same research group.
- THREE of the six experimental studies were conducted solely on RATS. Rats are not humans and may not be physiologically adapted to high-meat diets. All rats were injected with powerful carcinogenic chemicals prior to being fed meat. Yes, you read that correctly.
- Only THREE of the six experimental studies were human studies. All were conducted with a very small number of subjects and were seriously flawed in more than one important way. Examples of flaws include using unreliable or outdated biomarkers and/or failing to include proper controls.
- Some of the theories put forth by the WHO about how red/processed meat might cause cancer are controversial or have already been disproved. These theories were discredited within the texts of the very same studies cited to support the WHO’s anti-meat conclusions, again suggesting that the WHO committee members either didn’t read these studies or deliberately omitted information that didn’t support the WHO’s anti-meat position.
Does it matter whether the WHO gets it right or wrong about meat and cancer? YES.
“Strong media coverage and ambiguous research results could stimulate consumers to adapt a ‘safety first’ strategy that could result in abolishment of red meat from the diet completely. However, there are reasons to keep red meat in the diet. Red meat (beef in particular) is a nutrient dense food and typically has a better ratio of N6:N3-polyunsaturated fatty acids and significantly more vitamin A, B6 and B12, zinc and iron than white meat (compared values from the Dutch Food Composition Database 2013, raw meat). Iron deficiencies are still common in parts of the populations in both developing and industrialized countries, particularly pre-school children and women of childbearing age (WHO)… Red meat also contains high levels of carnitine, coenzyme Q10, and creatine, which are bioactive compounds that may have positive effects on health.” 2)
The bottom line is that there is no good evidence that unprocessed red meat increases our risk for cancer. Fresh red meat is a highly nutritious food which has formed the foundation of human diets for nearly two million years. Red meat is a concentrated source of easily digestible, highly bioavailable protein, essential vitamins and minerals. These nutrients are more difficult to obtain from plant sources.
It makes no sense to blame an ancient, natural, whole food for the skyrocketing rates of cancer in modern times. I’m not interested in defending the reputation of processed meat (or processed foods of any kind, for that matter), but even the science behind processed meat and cancer is unconvincing, as I think you’ll agree. […]
Regardless, even if you believe in the (non-existent) power of epidemiological studies to provide meaningful information about nutrition, more than half of the 29 epidemiological studies did NOT support the WHO’s stance on unprocessed red meat and colorectal cancer.
It is irresponsible and misleading to include this random collection of positive and negative epidemiological studies as evidence against meat.
The following quote is taken from one of the experimental studies cited by the WHO. The authors of the study begin their paper with this striking statement:
“In puzzling contrast with epidemiological studies, experimental studies do not support the hypothesis that red meat increases colorectal cancer risk. Among the 12 rodent studies reported in the literature, none demonstrated a specific promotional effect of red meat.” 3)
[Oddly enough, none of these twelve “red meat is fine” studies, which the authors went on to list and describe within the text of the introduction to this article, were included in the WHO report].
I cannot emphasize enough how common it is to see statements like this in scientific papers about red meat. Over and over again, researchers see that epidemiology suggests a theoretical connection between some food and some health problem, so they conduct experiments to test the theory and find no connection. This is why our nutrition headlines are constantly changing. One day eggs are bad for you, the next day they’re fine. Epidemiologists are forever sending well-intentioned scientists on time-consuming, expensive wild goose chases, trying to prove that meat is dangerous, when all other sources—from anthropology to physiology to biochemistry to common sense—tell us that meat is nutritious and safe.
* * *
Below is a good discussion between Dr. Steven Gundry and Dr. Paul Saladino. It’s an uncommon dialogue. Even though Gundry is known for warning against the harmful substances in plant foods, he has shifted toward a plant-based diet, now also warning against too much animal food, or at least too much protein (an IGF-1 issue not relevant to this post). As for Saladino, he is a carnivore and so takes Gundry’s argument against plants to a whole other level. Saladino sees no problem with meat, of course. And his view contradicts what Gundry writes about in his most recent book, The Longevity Paradox.
Anyway, they got onto the topic of TMAO. Saladino points out that fish contains more fully formed TMAO than red meat produces in combination with grain-loving Prevotella. Even vegetables produce TMAO. So, why is beef being scapegoated? It’s pure ignorant idiocy. To further this point, Saladino explained that he has tested the microbiomes of his patients on the carnivore diet, and they come up low in Prevotella bacteria. He doesn’t think TMAO is the danger people claim it is. But even if it were, the single safest diet might be the carnivore diet.
Gundry didn’t even disagree. He pointed out that he did testing on patients of his who are long-term vegans and now in their 70s. They had extremely high levels of TMAO. He sent their lab results to the Cleveland Clinic for an opinion. The experts there refused to believe that it was possible and so dismissed the evidence. That is the power of dietary ideology when it forms a self-enclosed reality tunnel. Red meat is bad and vegetables are good. The story changes over time. It’s the saturated fat. No, it’s the TMAO. Then it will be something else. Always looking for a rationalization to uphold the preferred dogma.
* * *
7/25/19 – Additional thoughts: There is always new research coming out. And as is typical, it is often contradictory. It is hard to know what is being studied exactly. The most basic understanding in mainstream nutrition right now seems to be that red meat is associated with TMAO by way of carnitine and Prevotella (Studies reveal role of red meat in gut bacteria, heart disease development). But there are many assumptions being made. This research tends to be epidemiological/observational, and so most factors aren’t being controlled.
Worse still, these studies aren’t comparing equivalent extremes: instead of veganism vs. carnivory, they compare veganism and vegetarianism vs. omnivory. That leaves out the even greater complicating factor that, as the data shows, a significant number of vegans and vegetarians occasionally eat animal foods. There really aren’t that many long-term vegans and vegetarians to study, because 80% of people who start the diet quit it, and of the remaining 20% few are consistent.
As for omnivores, they are a diverse group that could include hundreds of dietary variations. One variety of omnivory is the paleo diet, a slightly restricted omnivory in that grains are excluded, often along with legumes, white potatoes, dairy, added sugar, etc. One study of the paleo diet found higher levels of TMAO, and the focus there was cardiovascular disease rather than cancer (Heart disease biomarker linked to paleo diet).
So, that must mean the paleo diet is bad, right? When people think of the paleo diet, they think of a caveman lugging a big hunk of meat. But the reality is that the standard paleo diet, although including red meat, emphasizes fish and heaping platefuls of vegetables. Why is red meat getting blamed? In a bizarre twist, the lead researcher of the paleo study, Dr. Angela Genoni, thought the problem was the lack of grains. But it is precisely grains that the TMAO-producing Prevotella gut bacteria love so much. How could reducing grains increase TMAO? No explanation was offered. Before we praise grains, why not look at the sub-populations of vegans, vegetarians, fruitarians, etc. who also avoid grains?
There is a more rational and probable factor. It turns out that fish and vegetables raise TMAO levels higher than red meat does (Eat your vegetables (and fish): Another reason why they may promote heart health). This solves the mystery of why some of Dr. Gundry’s vegan patients had high TMAO levels. Yet, in another bizarre leap of logic, the same TMAO that is used to castigate red meat is suddenly portrayed as healthy, reducing cardiovascular risk, when it comes from sources other than red meat. It is the presence of red meat that somehow magically transforms TMAO into an evil substance that will kill you. Or maybe, just maybe, it has nothing directly to do with TMAO alone.
After a long and detailed analysis of the evidence, Dr. Georgia Ede concluded that, “As far as I can tell, the authors’ theory that red meat provides carnitine for bacteria to transform into TMA which our liver then converts to TMAO, which causes our macrophages to fill up with cholesterol, block our arteries, and cause heart attacks is just that–a theory–full of sound and fury, signifying nothing” (Does Carnitine from Red Meat Cause Heart Disease?).
There is a piece from The Atlantic about weight loss, The Weight I Carry. It’s written from a personal perspective. The author, Tommy Tomlinson, has been overweight his entire life. He describes what this has been like, specifically the struggle and failure in finding anything that worked. One has to give him credit for trying a wide range of diets.
It was sad to read for a number of reasons. But a point of interest was a comment he made about carbs: “I remember the first time carbohydrates were bad for you, back in the 1970s. The lunch counter at Woolworth’s in my hometown of Brunswick, Georgia, sold a diet plate of a hamburger patty on a lettuce leaf with a side of cottage cheese. My mom and I stared at the picture on the menu like it was a platypus at the zoo. We pretended to care about carbs for a while. Mama even bought a little carbohydrate guide she kept in her pocketbook. It said biscuits and cornbread were bad for us. It didn’t stay in her pocketbook long.”
That is what I’ve read about. Into the 1970s, it was still well known that carbs were the main culprit behind many health problems, weight gain in particular. This was part of mainstream medical knowledge going back to the 1800s. It was an insight once considered common sense, back when most people lived on and around farms. Everyone knew that cattle were fattened for slaughter on a high-carb diet, and so the way to lose weight was to decrease carbs. There was nothing controversial about this old piece of wisdom, that is until the government decreed the opposite to be true in its 1980s dietary recommendations.
The sad part is that, even though the author knew of this wisdom, the context for understanding its significance was lost. He lacks an explanatory framework that can sift through all the bullshit. He writes that, “I’ve done low-fat and low-carb and low-calorie, high-protein and high-fruit and high-fiber. I’ve tried the Mediterranean and taken my talents to South Beach. I’ve shunned processed foods and guzzled enough SlimFast to drown a rhino. I’ve eaten SnackWell’s cookies (low-fat, tons of sugar) and chugged Tab (no sugar, tons of chemicals, faint whiff of kerosene). I’ve been told, at different times, that eggs, bacon, toast, cereal, and milk are all bad for you. I’ve also been told that each one of those things is an essential part of a healthy diet. My brain is fogged enough at breakfast. Don’t fuck with me like this.”
His frustration is palpable and reasonable. But I notice all that gets left out of his complaints. A low-carb diet by itself very well might feel impossible. If you aren’t replacing carbs with healthy fats and nutrient-dense whole foods, you will be trying to swim upstream. Carbs are used by the body as fuel. Take them away and you had better give the body a different fuel. And after a lifetime of nutrient deficiency, as is common in modern industrialized societies, you’d be wise to rebuild your nutritional foundations.
That is the failure of the deprivation model of diets. Such diets eliminate foods without offering any good advice about what to add back in. The advantage of traditional foods and paleo is that they are less diets in this sense than approaches grounded in how humans live in traditional communities around the world today and how humans have lived going back to the ancient world and beyond. The point is finding what naturally works for the human body, not forcing restrictions based on ideological demands. If a diet feels like a constant struggle, then you are doing something wrong. For most of human existence, the vast majority of individuals maintained a healthy body weight with no effort whatsoever. The epidemic of obesity is extremely and bizarrely abnormal. Obesity indicates something is seriously out of balance, specifically with insulin sensitivity and the related hormonal hunger signals. Deprivation simply antagonizes this state of disease.
We already know that the ketogenic diet is the most effective diet for weight loss, not only in losing weight but also in maintaining one’s optimal weight. No other diet decreases hunger and eliminates cravings to the same extent. More generally, a recent study showed that a low-carb diet beat a low-fat diet in burning fat, even when protein and calories were exactly the same in both groups. This possibly indicates that, as some have speculated, a diet low enough in carbs may increase metabolism, burning more calories than one is consuming. Then, when you reach your preferred weight, you can add back in some calories to attain an equilibrium. This is apparently the one thing the author didn’t try. He did try the South Beach diet, but it is only moderately low-carb and unfortunately also low-fat, a bad combination. That diet, for example, recommends low-fat milk, which eliminates not only the needed fats but also the fat-soluble vitamins, especially those found in dairy from pastured, grass-fed cows.
The author is trapped in the dominant paradigm. He doesn’t need to “Eat less and exercise.” And he recognizes this is bad advice, even as he can’t see an alternative. But he should look a bit further outside the mainstream. On a ketogenic diet, many people can lose weight while eating high levels of calories and not exercising. It’s more a matter of what you eat than how much, although in cases of serious health problems, as is typical with lifelong obesity, more emphasis might need to be given to exercise and the like. But the point is to find foods that are satisfying without overeating, which generally means healthy fats. Your body gets hungry for a reason and, if you don’t feed it what it needs, it will remain hungry. Calorie counting and portion control won’t likely help anyone with long-term weight issues. It will just make them frustrated and hangry, and for good reason. When the old patterns repeatedly fail, it is best to try something new. Sadly, the author’s conclusion is to commit more fully to the old way of thinking. His chances of success are next to zero, as long as he continues on this path.
It’s an obesity mindset. The individual blames himself, rather than blaming the bad advice. He just needs more self-control and less gluttony. This time, he tells himself, it will work. I doubt it. I hope he doesn’t spend the rest of his life on this endless treadmill of self-defeat and self-blame. Life doesn’t need to be so difficult. Rather than losing weight, he should focus on what it takes to be and feel healthy. But it is hard to convince someone of that when their entire identity has become entangled with obesity itself, with their appearance as judged by the same society that gave the bad advice.
* * *
The Weight I Carry
What it’s like to be too big in America
by Tommy Tomlinson
I remember the first time carbohydrates were bad for you, back in the 1970s. The lunch counter at Woolworth’s in my hometown of Brunswick, Georgia, sold a diet plate of a hamburger patty on a lettuce leaf with a side of cottage cheese. My mom and I stared at the picture on the menu like it was a platypus at the zoo. We pretended to care about carbs for a while. Mama even bought a little carbohydrate guide she kept in her pocketbook. It said biscuits and cornbread were bad for us. It didn’t stay in her pocketbook long.
I’ve done low-fat and low-carb and low-calorie, high-protein and high-fruit and high-fiber. I’ve tried the Mediterranean and taken my talents to South Beach. I’ve shunned processed foods and guzzled enough SlimFast to drown a rhino. I’ve eaten SnackWell’s cookies (low-fat, tons of sugar) and chugged Tab (no sugar, tons of chemicals, faint whiff of kerosene). I’ve been told, at different times, that eggs, bacon, toast, cereal, and milk are all bad for you. I’ve also been told that each one of those things is an essential part of a healthy diet. My brain is fogged enough at breakfast. Don’t fuck with me like this.
Here are the two things I have come to believe about diets:
1. Almost any diet works in the short term.
2. Almost no diets work in the long term.
The most depressing five-word Google search I can think of—and I can think of a lot of depressing five-word Google searches—is gained all the weight back. Losing weight is not the hard part. The hard part is living with your diet for years, maybe the rest of your life.
When we go on a diet—especially a crash diet—our own bodies turn against us. Nutritional studies have shown that hunger-suppressing hormones in our bodies dwindle when we lose weight. Other hormones—the ones that warn us we need to eat—tend to rise. Our bodies beg us to gorge at the first sign of deprivation. This makes sense when you think about the history of humankind. There were no Neanderthal foodies. They ate to survive. They went hungry for long stretches. Their bodies sent up alarms telling them they’d better find something to eat. Our DNA still harbors a fear that we’ll starve. But now most of us have access to food that is more abundant, cheaper, and more addictive than at any other time in human history. Our bodies haven’t caught up to the modern world. Our cells think we’re storing up fat for a hard winter when actually it’s just happy hour at Chili’s.
Even worse, when people succeed at losing a lot of weight, their bodies slam on the brakes of their metabolism. […] Other studies had already shown that the body’s metabolism slows down as people lose weight, which means they have to eat fewer and fewer calories to keep losing. But this study showed that, for the contestants who lost weight quickly, their metabolism kept slowing even when they started gaining weight again. Basically, however fat they had been, that’s what their bodies wanted them to be. […]
“Eat less and exercise.”
That’s what some of you are saying right now. That’s what some of you have said the whole time you’ve been reading. That’s what some of you say—maybe not out loud, but you say it—every time you see a fat person downing fried eggs in a diner, or overstuffing a bathing suit on the beach, or staring out from one of those good-lord-what-happened-to-her? stories in the gossip magazines.
“Eat less and exercise.”
What I want you to understand, more than anything else, is that telling a fat person “Eat less and exercise” is like telling a boxer “Don’t get hit.”
You act as if there’s not an opponent.
Losing weight is a fucking rock fight. The enemies come from all sides: The deluge of marketing telling us to eat worse and eat more. The culture that has turned food into one of the last acceptable vices. Our families and friends, who want us to share in their pleasure. Our own body chemistry, dragging us back to the table out of fear that we’ll starve.
On top of all that, some of us fight holes in our souls that a boxcar of donuts couldn’t fill.
My compulsion to eat comes from all those places. I’m almost never hungry in the physical sense. But I’m always craving an emotional high, the kind that comes from making love, or being in the crowd for great live music, or watching the sun come up over the ocean. And I’m always wanting something to counter the low, when I’m anxious about work or arguing with family or depressed for reasons I can’t understand.
There are radical options for people like me. There are boot camps where I could spend thousands of dollars to have trainers whip me into shape. There are crash diets and medications with dangerous side effects. And, of course, there is weight-loss surgery. Several people I know have done it. Some say it saved them. Others had life-threatening complications. A few are just as miserable as they were before. I don’t judge any people who try to find their own way. I speak only for myself here: For me, surgery feels like giving up. I know that the first step of 12-step programs is admitting that you’re powerless over your addiction. But I don’t feel powerless yet.
My plan is to lose weight in a simple, steady, sustainable way. I’ll count how many calories I eat and how many I burn. If I end up on the right side of the line at the end of the day, that’s a win. I’ll be like an air mattress with a slow leak, fooling my body into thinking I’m not on a diet at all. And one day, a few years down the road, I’ll wake up and look in the mirror and think: I got there.