American Heart Association’s “Fat and Cholesterol Counter” (1991)

  • 1963 – “Every woman knows that carbohydrates are fattening, this is a piece of common knowledge, which few nutritionists would dispute.”
  • 1994 – “… obesity may be regarded as a carbohydrate-deficiency syndrome and that an increase in dietary carbohydrate content at the expense of fat is the appropriate dietary part of a therapeutical strategy.”*

My mother was about to throw out an old booklet from the American Heart Association (AHA), “Fat and Cholesterol Counter”, one of several publications they put out around that time. It was published in 1991, the year I started high school. Unsurprisingly, it blames everything on sodium, calories, cholesterol, and, of course, saturated fat.

Even the harm from hydrogenated fat gets blamed on saturated fat, since the hydrogenation process turns some small portion of the oil saturated. That framing ignores the heavy damage and inflammatory response caused by oxidation (both in the industrial processing and in cooking). Not to mention that these hydrogenated fats, as industrial seed oils, are filled with omega-6 fatty acids, the main reason they are so inflammatory. Saturated fat, on the other hand, is not inflammatory at all. This obsession with saturated fat is so strange; it never made any sense from a scientific perspective. When the obesity epidemic began, along with all that went with it, American consumption of saturated fat had been steadily dropping for decades, ever since the invention of industrial seed oils in the late 1800s and the fear of meat stoked by Upton Sinclair’s muckraking novel about the meatpacking industry, The Jungle.

The amount of saturated fat and red meat in the diet has declined over the past century, replaced by those industrial seed oils and lean white meat, along with fruits and vegetables — all of which have been increasing.** Chicken, in particular, replaced beef, and what stands out about chicken is that, like those industrial seed oils, it is high in the inflammatory omega-6 fatty acids. How could saturated fat be causing the rising rates of heart disease and the rest when people were eating less of it? This scapegoating wasn’t only unscientific but blatantly irrational. All of this information was known way back when Ancel Keys went on his anti-fat crusade (The Creed of Ancel Keys). It wasn’t a secret. And it required cherry-picked data and convoluted rationalizations to explain away.

Worse than removing saturated fat, which poses no health risk, is the fact that it is actually an essential nutrient: “How much total saturated do we need? During the 1970s, researchers from Canada found that animals fed rapeseed oil and canola oil developed heart lesions. This problem was corrected when they added saturated fat to the animals’ diets. On the basis of this and other research, they ultimately determined that the diet should contain at least 25 percent of fat as saturated fat. Among the food fats that they tested, the one found to have the best proportion of saturated fat was lard, the very fat we are told to avoid under all circumstances!” (Millie Barnes, The Importance of Saturated Fats for Biological Functions).

It is specifically lard that has been most removed from the diet, and this is significant as lard was central to the American diet until this past century: “Pre-1936 shortening is comprised mainly of lard while afterward, partially hydrogenated oils came to be the major ingredient” (Nina Teicholz, The Big Fat Surprise, p. 95); “Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard” (p. 126). And what about the Mediterranean people who supposedly are so healthy because of their love of olive oil? “Indeed, in historical accounts going back to antiquity, the fat more commonly used in cooking in the Mediterranean, among peasants and the elite alike, was lard.” (p. 217).

Jason Prall notes that long-lived populations ate “lots of meat” and specifically, “They all ate pig. I think pork was the only common animal that we saw in the places that we went” (Longevity Diet & Lifestyle Caught On Camera w/ Jason Prall). The famously long-lived Okinawans also eat every part of the pig, such that their entire culture and religion centered around pigs (Blue Zones Dietary Myth). Lard, in case you didn’t know, comes from pigs. Pork and lard are found in so many diets for the simple reason that pigs can live in diverse environments, from mountainous forests to tangled swamps to open fields, and they are a food source available year round.

Another thing that has gone hand in hand with the loss of healthy, nutrient-dense saturated fat in the American diet is a loss of nutrition in general. It’s not only that plant foods have fewer minerals and vitamins because of depleted soil and because they are picked unripe in order to ship them long distances. The same is true of animal foods, since the animals are being fed the same crappy plant foods as us humans. But at the very least, even factory-farmed animals have far more bioavailable nutrient density than plant foods from industrial agriculture. If we ate more fatty meat, saturated fat or otherwise, we’d be getting far more fat-soluble vitamins. And when looking at all animal foods, in particular from pasture-raised and wild-caught sources, there is no mineral or vitamin that can’t be obtained at required levels. The same can’t be said for the plant foods of a vegan diet.

Back in 1991, the AHA was recommending the inclusion of lots of bread, rolls, crackers, and pasta (“made with low-fat milk and fats or oils low in saturated fatty acids” and “without eggs”); rice, beans, and peas; sugary fruits and starchy vegetables (including juices) — and desserts were fine as well. At most, eat 3 or 4 eggs a week and, as expected, optimally avoid the egg yolks where all the nutrition is located (not only fat-soluble vitamins, but also choline and cholesterol and much else; by the way, your brain health is dependent on high levels of dietary cholesterol, such that statins, in blocking cholesterol, cause neurocognitive decline). As long as there was little if any saturated fat and fat in general was limited, buckets of starchy carbs and sugar were considered by the AHA to be part of a healthy and balanced diet. That is sad.

This interested me because of the year. This was as I was entering young adulthood and so becoming more aware of the larger world. I remember the heavy-handed propaganda preaching that fiber is good and fat is evil, as if the war on obesity were a holy crusade that demanded black-and-white thinking, all subtleties and complexities denied in adherence to the moralistic dogma against the sins of gluttony and sloth — it was literally an evangelistic medical gospel (see Belinda Fettke’s research on the Seventh Day Adventists: Thou Shalt not discuss Nutrition ‘Science’ without understanding its driving force). In our declining public health, we were a fallen people who required a dietary clergy for our salvation. Millennia of traditional dietary wisdom and knowledge were thrown out the window as if they were worthless or maybe even dangerous.

I do remember my mother buying high-fiber cereals and “whole wheat” commercial breads (not actually whole wheat, as it is simply denatured refined flour with fiber added back in). And along with this, skim or 1% fat dairy foods, especially milk, were included with every major meal and often snacks. I had sugary and starchy cereal with skim milk (and/or milk with sugary Instant Breakfast) every morning and a glass of skim milk with every dinner, maybe sometimes milk for lunch. Cheese was a regular part of the diet as well, such as with pizza eaten multiple times a week or any meal with pasta, and heck, cheese was a great snack all by itself, but also good combined with crackers, and one could pretend to be healthy if one used Triscuits. Those were the days when I might devour a whole block of cheese, probably low-fat, in a single sitting — I was probably craving fat-soluble vitamins. Still, most of my diet was starches and sugar, as that was my addiction. The fiber was an afterthought to market junk food as health food.

It now makes sense. When I was a kid in the 1980s, my mother says the doctor understood that whole fat milk was important for growing bodies. So that is what he recommended. But I guess the anti-fat agenda had fully taken over by the 1990s. The AHA booklet from 1991 was by then recommending “skim or 1% milk and low-fat cheeses” for all ages, including babies and children, pregnant and lactating women. Talk about a recipe for health disaster. No wonder metabolic syndrome exploded and neurocognitive health fell like a train going over a collapsed bridge. It was so predictable, as the failure of this diet was understood by many going back to earlier in the century (e.g., Weston A. Price; see my post Health From Generation To Generation).

The health recommendations did get worse over time, but to be fair, it started much earlier. Experts had been discouraging breastfeeding for a while. Traditionally, babies were breastfed for the first couple of years or so. By the time modern America came around, experts were suggesting a short period of breast milk or even entirely replacing it with scientifically designed formulas. My mother only breastfed me for 5-6 months and then put me on cow’s milk — of course, pasteurized and homogenized milk from grain-fed and factory-farmed cows. When the dairy caused diarrhea, the doctor suggested soy milk. After a while, my mother put me on dairy again, but the diarrhea persisted and so for preschool she put me back on soy milk. I was drinking soy milk off and on for many years during the most important stage of development. Holy fuck! That had to have done serious damage to my developing body, in particular my brain. Then I went from that to skim milk during another important time of development, as I hit puberty and went through growth spurts.

Early on in elementary school, I had delayed reading and a diagnosis of learning disability, seemingly along with something along the lines of either Asperger’s or specific language impairment, although undiagnosed. I definitely had social and behavioral issues, in that I didn’t understand people well when I was younger. Then, entering adulthood, I was diagnosed with depression and something like a “thought disorder” (I forget the exact diagnosis I received while in a psychiatric ward after a suicide attempt). No doubt the latter was already present in my early neurocognitive problems, as I obviously was severely depressed at least as early as 7th grade. A malnourished diet of lots of carbs and little fat was the most probable cause of all of these problems.

Thanks, American Heart Association! Thanks for doing so much harm to my health and making my life miserable for decades, not to mention nearly killing me through depression so severe I attempted suicide, and then the decades of depressive struggle that followed. That isn’t even to mention the sugar and carb addiction that plagued me for so long. Now multiply my experience by that of at least hundreds of millions of other Americans, and an even greater number of people elsewhere as their governments followed the example of the United States, across the past few generations. Great job, AHA. And much appreciation for the helping hand of the USDA and various medical institutions in enforcing this anti-scientific dogma.

Let me be clear about one thing. I don’t blame my mother, as she was doing the best she could with the advice given to her by doctors and corporate media, along with the propaganda literature from respected sources such as the AHA. Nor do I blame any other average Americans as individuals, although I won’t hold back on placing the blame squarely on the shoulders of demagogues like Ancel Keys. As Gary Taubes and Nina Teicholz have made so clear, this was an agenda of power, not science. With the help of government and media, the actual scientific debate was silenced and disappeared from public view (Eliminating Dietary Dissent). The consensus in favor of a high-carb, low-fat diet didn’t emerge through rational discourse and evidence-based medicine — it was artificially constructed and enforced.

Have we learned our lesson? Apparently not. We still see this tactic of technocratic authoritarianism, such as with the corporate-funded push behind EAT-Lancet (Dietary Dictocrats of EAT-Lancet). Why do we tolerate this agenda-driven exploitation of public trust and harm to public health?

* * *

 * First quote: Passmore, R., and Y. E. Swindells. 1963. “Observations on the Respiratory Quotients and Weight Gain of Man After Eating Large Quantities of Carbohydrates.” British Journal of Nutrition 17: 331-39.
Second quote: Astrup, A., B. Buemann, N. J. Christensen, and S. Toubro. 1994. “Failure to Increase Lipid Oxidation in Response to Increasing Dietary Fat Content in Formerly Obese Women.” American Journal of Physiology, April, 266 (4, pt. 1): E592-99.
Both quotes are from a talk given by Peter Ballerstedt, “AHS17 What if It’s ALL Been a Big Fat Lie?,” available on the Ancestry Foundation YouTube page.

(It appears that evidence-based factual reality literally changes over time. I assume this relativity of ideological realism has something to do with quantum physics. It’s the only possible explanation. I’m feeling a bit snarky, in case you didn’t notice.)

** Americans, in the prior centuries, ate few plant foods at all because they were so difficult and time-consuming to grow. There was no way to control for pests and wild animals that often would devour and destroy a garden or a crop. It was too much investment for too little reward, not to mention extremely unreliable as a food source and so risky to survival for those with a subsistence lifestyle. Until modern farming methods, especially with 20th century industrialization of agriculture, most Americans primarily ate animal foods with tons of fat, mostly butter, cream and lard, along with a wide variety of wild-caught animal foods.

This is discussed by Nina Teicholz in The Big Fat Surprise: “Early-American settlers were “indifferent” farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with “the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,” as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.” (see more in my post Malnourished Americans). That puts the conventional dietary debate in an entirely different context. Teicholz adroitly dismantles the claim that fatty animal foods have increased in the American diet.

Teicholz goes on to state that, “So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating.” It was the discovery of seed oils that originally were an industrial byproduct, combined with Upton Sinclair’s muckraking journalism about the meatpacking industry (The Jungle), that caused meat and animal fats to quickly fall out as the foundation of the American diet. Saturated fat, in particular, had been in decline for decades prior to the epidemics of obesity, diabetes, and heart disease. Ancel Keys knew this data, which is why he had to throw out some of his data to make it fit his preconceived conclusions in promoting his preferred dietary ideology.

If we honestly wanted to find the real culprit, we would look to the dramatic rise of vegetable oils, white flour, and sugar in the 20th-century diet. It began much earlier with the grain surpluses and cheap wheat, especially in England during the 1800s, but in the United States it became most noticeable in the first half of the following century. The agenda of Keys and the AHA simply made a bad situation worse, albeit much, much worse.

Autism and the Upper Crust

There are multiple folktales about the tender senses of royalty, aristocrats, and other elites. The best-known example is “The Princess and the Pea”. In the Aarne-Thompson-Uther system of folktale categorization, it is listed as type 704, about the search for a sensitive wife. That isn’t to say that all the narrative variants of elite sensitivity involve potential wives. Anyway, the man who made this particular story famous is Hans Christian Andersen, who published his version in 1835. He longed to be a part of the respectable class, but felt excluded. Some speculate that he projected his own class issues onto his slightly altered version of the folktale, something discussed in the Wikipedia article about the story:

“Wullschlager observes that in “The Princess and the Pea” Andersen blended his childhood memories of a primitive world of violence, death and inexorable fate, with his social climber’s private romance about the serene, secure and cultivated Danish bourgeoisie, which did not quite accept him as one of their own. Researcher Jack Zipes said that Andersen, during his lifetime, “was obliged to act as a dominated subject within the dominant social circles despite his fame and recognition as a writer”; Andersen therefore developed a feared and loved view of the aristocracy. Others have said that Andersen constantly felt as though he did not belong, and longed to be a part of the upper class.[11] The nervousness and humiliations Andersen suffered in the presence of the bourgeoisie were mythologized by the storyteller in the tale of “The Princess and the Pea”, with Andersen himself the morbidly sensitive princess who can feel a pea through 20 mattresses.[12] Maria Tatar notes that, unlike the folk heroine of his source material for the story, Andersen’s princess has no need to resort to deceit to establish her identity; her sensitivity is enough to validate her nobility. For Andersen, she indicates, “true” nobility derived not from an individual’s birth but from their sensitivity. Andersen’s insistence upon sensitivity as the exclusive privilege of nobility challenges modern notions about character and social worth. The princess’s sensitivity, however, may be a metaphor for her depth of feeling and compassion.[1] […] Researcher Jack Zipes notes that the tale is told tongue-in-cheek, with Andersen poking fun at the “curious and ridiculous” measures taken by the nobility to establish the value of bloodlines. He also notes that the author makes a case for sensitivity being the decisive factor in determining royal authenticity and that Andersen “never tired of glorifying the sensitive nature of an elite class of people”.[15]

Even if that is true, there is more going on here than some guy working out his personal issues through fiction. This princess’ sensory sensitivity sounds like autism spectrum disorder, and I have a theory about that. Autism has been associated with certain foods like wheat, specifically refined flour in highly processed foods (The Agricultural Mind). And a high-carb diet in general causes numerous neurocognitive problems (Ketogenic Diet and Neurocognitive Health), along with other health conditions such as metabolic syndrome (Dietary Dogma: Tested and Failed) and insulin resistance (Coping Mechanisms of Health), atherosclerosis (Ancient Atherosclerosis?) and scurvy (Sailors’ Rations, a High-Carb Diet) — by the way, the rates of these diseases have been increasing over the generations, often first appearing among the affluent. Sure, grains have long been part of the diet, but the one grain most associated with the wealthy going back millennia was wheat, as it was harder to grow, which kept it in short supply and made it expensive. Indeed, it is wheat, not the other grains, that gets brought up in relation to autism. This is largely because of gluten, though other components have been pointed to.

It is relevant that the historical period in which these stories were written down was around when the first large grain surpluses were becoming common and so bread, white bread most of all, became a greater part of the diet. But as part of the diet, this was first seen among the upper classes. It’s too bad we don’t have cross-generational data on autism rates in terms of demographic and dietary breakdown, but it is interesting to note that the mental health condition neurasthenia, also involving sensitivity, from the 19th century was seen as a disease of the middle-to-upper class (The Crisis of Identity), and this notion of the elite as sensitive was a romanticized ideal going back to the 1700s with what Jane Austen referred to as ‘sensibility’ (see Bryan Kozlowski’s The Jane Austen Diet, as quoted in the link immediately above). In that same historical period, others noted that schizophrenia was spreading along with civilization (e.g., Samuel Gridley Howe and Henry Maudsley; see The Invisible Plague by Edwin Fuller Torrey & Judy Miller) and I’d add the point that there appear to be some overlapping factors between schizophrenia and autism — besides gluten, some of the implicated factors are glutamate, exorphins, inflammation, etc. “It is unlikely,” writes William Davis, “that wheat exposure was the initial cause of autism or ADHD but, as with schizophrenia, wheat appears to be associated with worsening characteristics of the conditions” (Wheat Belly, p. 48).

For most of human history, crop failures and famine were a regular occurrence. And this most harshly affected the poor masses when grain and bread prices went up, leading to food riots and sometimes revolutions (e.g., French Revolution). Before the 1800s, grains were so expensive that, in order to make them affordable, breads were often adulterated with fillers or entirely replaced with grain substitutes, the latter referred to as “famine breads” and sometimes made with tree bark. Even when available, the average person might be spending most of their money on bread, as it was one of the most costly foods around and other foods weren’t always easily obtained.

Even so, grain being highly sought after certainly doesn’t imply that the average person was eating a high-carb diet, quite the opposite (A Common Diet). Food in general was expensive and scarce and, among grains, wheat was the least common. At times, this would have forced feudal peasants and later landless peasants onto a diet limited in both carbohydrates and calories, which would have meant a typically ketogenic state (Fasting, Calorie Restriction, and Ketosis), albeit far from an optimal way of achieving it. The further back in time one looks, the greater the prevalence of ketosis would have been (e.g., the Spartan and Mongol diets), maybe with the exception of the ancient Egyptians (Ancient Atherosclerosis?). In places like Ireland, Russia, etc., the lower classes remained on this poverty diet, often a starvation diet, well into the mid-to-late 1800s, although in the case of the Irish it was an artificially constructed famine, as the potato crop was essentially being stolen by the English and sold on the international market.

Yet, in America, the poor were fortunate in being able to rely on a meat-based diet because wild game was widely available and easily obtained, even in cities. That may have been true for many European populations as well during earlier feudalism, specifically prior to the peasants being restricted in hunting and trapping on the commons. This is demonstrated by how health improved after the fall of the Roman Empire (Malnourished Americans). During this earlier period, only the wealthy could afford high-quality bread and large amounts of grain-based foods in general. That meant highly refined and fluffy white bread that couldn’t easily be adulterated. Likewise, for the early centuries of colonialism, sugar was only available to the wealthy — in fact, it was a controlled substance typically only found in pharmacies. But for the elite who had access, sugary pastries and other starchy dessert foods became popular. White bread and pastries were status symbols. Sugar was so scarce that wealthy households kept it locked away so the servants couldn’t steal it. Even fruit was disproportionately eaten by the wealthy. A fruit pie, combining all three of those luxuries in a single delicacy, would truly have been extravagant.

Part of the context is that, although grain yields had been increasing during the early colonial era, there weren’t dependable surplus yields of grains before the 1800s. Until then, white bread, pastries, and such simply were not affordable to most people. Consumption of grains, along with other starchy carbs and sugar, rose with 19th century advancements in agriculture. Simultaneously, income was increasing and the middle class was growing. But even as yields increased, most of the created surplus grains went to feeding livestock, not to feeding the poor. Grains were perceived as cattle feed. Protein consumption increased more than did carbohydrate consumption, at least initially. The American population, in particular, didn’t see the development of a high-carb diet until much later, as related to US mass urbanization also happening later.

Coming to the end of the 19th century, there emerged the mass diet of starchy and sugary foods, especially with the spread of wheat farming and white bread. And, in the US, only by the 20th century did grain consumption finally surpass meat consumption. Since then, rates of autism have been growing. Along with sensory sensitivity, autistics are well known for their pickiness about foods and for cravings for particular foods such as those made from highly refined wheat flour, from white bread to crackers. Yet the folktales in question were speaking to a still-living memory of an earlier time, before these changes had happened. Hans Christian Andersen first published “The Princess and the Pea” in 1835, but such stories had been told orally long before that, probably going back at least centuries, although we now know that some of these folktales have their origins millennia earlier, even into the Bronze Age. According to the Wikipedia article on “The Princess and the Pea”,

“The theme of this fairy tale is a repeat of that of the medieval Perso-Arabic legend of al-Nadirah.[6] […] Tales of extreme sensitivity are infrequent in world culture but a few have been recorded. As early as the 1st century, Seneca the Younger had mentioned a legend about a Sybaris native who slept on a bed of roses and suffered due to one petal folding over.[23] The 11th-century Kathasaritsagara by Somadeva tells of a young man who claims to be especially fastidious about beds. After sleeping in a bed on top of seven mattresses and newly made with clean sheets, the young man rises in great pain. A crooked red mark is discovered on his body and upon investigation a hair is found on the bottom-most mattress of the bed.[5] An Italian tale called “The Most Sensitive Woman” tells of a woman whose foot is bandaged after a jasmine petal falls upon it.”

I would take it as telling that, in the case of this particular folktale, it doesn’t appear to be as ancient as other examples. That would support my argument that the sensory sensitivity of autism might be caused by greater consumption of refined wheat, something that only began to appear late in the Axial Age and only became common much later. Even for the few wealthy that did have access in ancient times, they were eating rather limited amounts of white bread. It might have required hitting a certain level of intake, not seen until modernity or closer to it, before the extreme autistic symptoms became noticeable among a larger number of the aristocracy and monarchy.

* * *

Sources

Others have connected such folktales of sensitivity with autism:

The high cost and elite status of grains, especially white bread, prior to 19th century high yields:

The Life of a Whole Grain Junkie
by Seema Chandra

Did you know where the term refined comes from? Around 1826, whole grain bread used by the military was called superior for health versus the white refined bread used by the aristocracy. Before the industrial revolution, it was more labor consuming and more expensive to refine bread, so white bread was the main staple loaf for aristocracy. That’s why it was called “refined”.

The War on White Bread
by Livia Gershon

Bread has always been political. For Romans, it helped define class; white bread was for aristocrats, while the darkest brown loaves were for the poor. Later, Jacobin radicals claimed white bread for the masses, while bread riots have been a perennial theme of populist uprisings. But the political meaning of the staff of life changed dramatically in the early twentieth-century United States, as Aaron Bobrow-Strain, who went on to write the book White Bread, explained in a 2007 paper. […]

Even before this industrialization of baking, white flour had had its critics, like cracker inventor William Sylvester Graham. Now, dietary experts warned that white bread was, in the words of one doctor, “so clean a meal worm can’t live on it for want of nourishment.” Or, as doctor and radio host P.L. Clark told his audience, “the whiter your bread, the sooner you’re dead.”

Nutrition and Economic Development in the Eighteenth-Century Habsburg Monarchy: An Anthropometric History
by John Komlos
p.31

Furthermore, one should not disregard the cultural context of food consumption. Habits may develop that prevent the attainment of a level of nutritional status commensurate with actual real income. For instance, the consumption of white bread or of polished rice, instead of whole-wheat bread or unpolished rice, might increase with income, but might detract from the body’s well-being. Insofar as cultural habits change gradually over time, significant lags could develop between income and nutritional status.

pp. 192-194

As consequence, per capita food consumption could have increased between 1660 and 1740 by as much as 50 percent. The fact that real wages were higher in the 1730s than at any time since 1537 indicates a high standard of living was reached. The increase in grain exports, from 2.8 million quintals in the first decade of the eighteenth century to 6 million by the 1740s, is also indicative of the availability of nutrients.

The remarkably good harvests were brought about by the favorable weather conditions of the 1730s. In England the first four decades of the eighteenth century were much warmer than the last decades of the previous century (Table 5.1). Even small differences in temperature may have important consequences for production. […] As a consequence of high yields the price of consumables declined by 14 percent in the 1730s relative to the 1720s. Wheat cost 30 percent less in the 1730s than it did in the 1660s. […] The increase in wheat consumption was particularly important because wheat was less susceptible to mold than rye. […]

There is direct evidence that the nutritional status of many populations was, indeed, improving in the early part of the eighteenth century, because human stature was generally increasing in Europe as well as in America (see Chapter 2). This is a strong indication that protein and caloric intake rose. In the British colonies of North America, an increase in food consumption—most importantly, of animal protein—in the beginning of the eighteenth century has been directly documented. Institutional menus also indicate that diets improved in terms of caloric content.

Changes in British income distribution conform to the above pattern. Low food prices meant that the bottom 40 percent of the distribution was gaining between 1688 and 1759, but by 1800 had declined again to the level of 1688. This trend is another indication that a substantial portion of the population that was at a nutritional disadvantage was doing better during the first half of the eighteenth century than it did earlier, but that the gains were not maintained throughout the century.

The Roots of Rural Capitalism: Western Massachusetts, 1780-1860
By Christopher Clark
p. 77

Livestock also served another role, as a kind of “regulator,” balancing the economy’s need for sufficiency and the problems of producing too much. In good years, when grain and hay were plentiful, surpluses could be directed to fattening cattle and hogs for slaughter, or for exports to Boston and other markets on the hoof. Butter and cheese production would also rise, for sale as well as for family consumption. In poorer crop years, however, with feedstuffs rarer, cattle and swine could be slaughtered in greater numbers for household and local consumption, or for export as dried meat.

p. 82

Increased crop and livestock production were linked. As grain supplies began to overtake local population increases, more corn in particular became available for animal feed. Together with hay, this provided sufficient feedstuffs for farmers in the older Valley towns to undertake winter cattle fattening on a regular basis, without such concern as they had once had for fluctuations in output near the margins of subsistence. Winter fattening for market became an established practice on more farms.

When Food Changed History: The French Revolution
by Lisa Bramen

But food played an even larger role in the French Revolution just a few years later. According to Cuisine and Culture: A History of Food and People, by Linda Civitello, two of the most essential elements of French cuisine, bread and salt, were at the heart of the conflict; bread, in particular, was tied up with the national identity. “Bread was considered a public service necessary to keep the people from rioting,” Civitello writes. “Bakers, therefore, were public servants, so the police controlled all aspects of bread production.”

If bread seems a trifling reason to riot, consider that it was far more than something to sop up bouillabaisse for nearly everyone but the aristocracy—it was the main component of the working Frenchman’s diet. According to Sylvia Neely’s A Concise History of the French Revolution, the average 18th-century worker spent half his daily wage on bread. But when the grain crops failed two years in a row, in 1788 and 1789, the price of bread shot up to 88 percent of his wages. Many blamed the ruling class for the resulting famine and economic upheaval.
Read more: https://www.smithsonianmag.com/arts-culture/when-food-changed-history-the-french-revolution-93598442/#veXc1rXUTkpXSiMR.99

What Brought on the French Revolution?
by H.A. Scott Trask

Through 1788 and into 1789 the gods seemed to be conspiring to bring on a popular revolution. A spring drought was followed by a devastating hail storm in July. Crops were ruined. There followed one of the coldest winters in French history. Grain prices skyrocketed. Even in the best of times, an artisan or factor might spend 40 percent of his income on bread. By the end of the year, 80 percent was not unusual. “It was the connection of anger with hunger that made the Revolution possible,” observed Schama. It was also envy that drove the Revolution to its violent excesses and destructive reform.

Take the Reveillon riots of April 1789. Reveillon was a successful Parisian wall-paper manufacturer. He was not a noble but a self-made man who had begun as an apprentice paper worker but now owned a factory that employed 400 well-paid operatives. He exported his finished products to England (no mean feat). The key to his success was technical innovation, machinery, the concentration of labor, and the integration of industrial processes, but for all these the artisans of his district saw him as a threat to their jobs. When he spoke out in favor of the deregulation of bread distribution at an electoral meeting, an angry crowd marched on his factory, wrecked it, and ransacked his home.

Why did our ancestors prefer white bread to wholegrains?
by Rachel Laudan

Only in the late nineteenth and twentieth century did large numbers of “our ancestors”–and obviously this depends on which part of the world they lived in–begin eating white bread. […]

Wheat bread was for the few. Wheat did not yield well (only seven or eight grains for one planted compared to corn that yielded dozens) and is fairly tricky to grow.

White puffy wheat bread was for even fewer. Whiteness was achieved by sieving out the skin of the grain (bran) and the germ (the bit that feeds the new plant). In a world of scarcity, this made wheat bread pricey. And puffy, well, that takes fairly skilled baking plus either yeast from beer or the kind of climate that sourdough does well in. […]

Between 1850 and 1950, the price of wheat bread, even white wheat bread, plummeted as a result of the opening up of new farms in the US and Canada, Argentina, Australia and other places, the mechanization of plowing and harvesting, the introduction of huge new flour mills, and the development of continuous flow bakeries.

In 1800 only half the British population could afford wheat bread. In 1900 everybody could.

History of bread – Industrial age
The Industrial Age (1700 – 1887)
from The Federation of Bakers

In Georgian times the introduction of sieves made of Chinese silk helped to produce finer, whiter flour and white bread gradually became more widespread. […]

1757
A report accused bakers of adulterating bread by using alum, lime, chalk and powdered bones to keep it very white. Parliament banned alum and all other additives in bread but some bakers ignored the ban. […]

1815
The Corn Laws were passed to protect British wheat growers. The duty on imported wheat was raised and price controls on bread lifted. Bread prices rose sharply. […]

1826
Wholemeal bread, eaten by the military, was recommended as being healthier than the white bread eaten by the aristocracy.

1834
Rollermills were invented in Switzerland. Whereas stonegrinding crushed the grain, distributing the vitamins and nutrients evenly, the rollermill broke open the wheat berry and allowed easy separation of the wheat germ and bran. This process greatly eased the production of white flour but it was not until the 1870s that it became economic. Steel rollermills gradually replaced the old windmills and watermills.

1846
With large groups of the population near to starvation the Corn Laws were repealed and the duty on imported grain was removed. Importing good quality North American wheat enabled white bread to be made at a reasonable cost. Together with the introduction of the rollermill this led to the increase in the general consumption of white bread – for so long the privilege of the upper classes.

Of all foods bread is the most noble: Carl von Linné (Carl Linnaeus) on bread
by Leena Räsänen

In many contexts Linné explained how people with different standing in society eat different types of bread. He wrote, “Wheat bread, the most excellent of all, is used only by high-class people”, whereas “barley bread is used by our peasants” and “oat bread is common among the poor”. He made a remark that “the upper classes use milk instead of water in the dough, as they wish to have a whiter and better bread, which thereby acquires a more pleasant taste”. He compared his own knowledge on the food habits of Swedish society with those mentioned in classical literature. Thus, according to Linné, Juvenal wrote that “a soft and snow-white bread of the finest wheat is given to the master”, while Galen condemned oat bread as suitable only for cattle, not for humans. Here Linné had to admit that it is, however, consumed in certain provinces in Sweden.

Linné was aware of and discussed the consequences of consuming less tasty and less satisfying bread, but he seems to have accepted as a fact that people belonging to different social classes should use different foods to satisfy their hunger. For example, he commented that “bran is more difficult to digest than flour, except for hard-labouring peasants and the likes, who are scarcely troubled by it”. The necessity of having to eat filling but less palatable bread was inevitable, but could be even positive from the nutritional point of view. “In Östergötland they mix the grain with flour made from peas and in Scania with vetch, so that the bread may be more nutritious for the hard-working peasants, but at the same time it becomes less flavoursome, drier and less pleasing to the palate.” And, “Soft bread is used mainly by the aristocracy and the rich, but it weakens the gums and teeth, which get too little exercise in chewing. However, the peasant folk who eat hard bread cakes generally have stronger teeth and firmer gums”.

It is intriguing that Linné did not find it necessary to discuss the consumption or effect on health of other bakery products, such as the sweet cakes, tarts, pies and biscuits served by the fashion-conscious upper class and the most prosperous bourgeois. Several cookery books with recipes for the fashionable pastry products were published in Sweden in the eighteenth century 14. The most famous of these, Hjelpreda i Hushållningen för Unga Fruentimmer by Kajsa Warg, published in 1755, included many recipes for sweet pastries 15. Linné mentioned only in passing that the addition of egg makes the bread moist and crumbly, and sugar and currants impart a good flavour.

The sweet and decorated pastries were usually consumed with wine or with the new exotic beverages, tea and coffee. It is probable that Linné regarded pastries as unnecessary luxuries, since expensive imported ingredients, sugar and spices, were indispensable in their preparation. […]

Linné emphasized that soft and fresh bread does not draw in as much saliva and thus remains undigested for a long time, “like a stone in the stomach”. He strongly warned against eating warm bread with butter. While it was “considered as a delicacy, there was scarcely another food that was more damaging for the stomach and teeth, for they were loosen’d by it and fell out”. By way of illustration he told an example reported by a doctor who lived in a town near Amsterdam. Most of the inhabitants of this town were bakers, who sold bread daily to the residents of Amsterdam and had the practice of attracting customers with oven-warm bread, sliced and spread with butter. According to Linné, this particular doctor was not surprised when most of the residents of this town “suffered from bad stomach, poor digestion, flatulence, hysterical afflictions and 600 other problems”. […]

Linné was not the first in Sweden to write about famine bread. Among his remaining papers in London there are copies from two official documents from 1696 concerning the crop failure in the northern parts of Sweden and the possibility of preparing flour from different roots, and an anonymous small paper which contained descriptions of 21 plants, the roots or leaves of which could be used for flour 10. These texts had obviously been studied by Linné with interest.

When writing about substitute breads, Linné formulated his aim as the following: “It will teach the poor peasant to bake bread with little or no grain in the circumstance of crop failure without destroying the body and health with unnatural foods, as often happens in the countryside in years of hardship” 10.

Linné’s idea for a publication on bread substitutes probably originated during his early journeys to Lapland and Dalarna, where grain substitutes were a necessity even in good years. Actually, bark bread was eaten in northern Sweden until the late nineteenth century 4. In the poorest regions of eastern and north-eastern Finland it was still consumed in the 1920s 26. […]

Bark bread has been used in the subarctic area since prehistoric times 4. According to Linné, no other bread was such a common famine bread. He described how in springtime the soft inner layer can be removed from debarked pine trees, cleaned of any remaining bark, roasted or soaked to remove the resin, and dried and ground into flour. Linné had obviously eaten bark bread, since he could say that “it tastes rather well, is however more bitter than other bread”. His view of bark bread was most positive but perhaps unrealistic: “People not only sustain themselves on this, but also often become corpulent of it, indeed long for it.” Linné’s high regard for bark bread was shared by many of his contemporaries, but not all. For example, Pehr Adrian Gadd, the first professor of chemistry in Turku (Åbo) Academy and one of the most prominent utilitarians in Finland, condemned bark bread as “useless, if not harmful to use” 28. In Sweden, Anders Johan Retzius, a professor in Lund and an expert on the economic and pharmacological potential of Swedish flora, called bark bread “a paltry food, with which they can hardly survive and of which they always after some time get a swollen body, pale and bluish skin, big and hard stomach, constipation and finally dropsy, which ends the misery” 4. […]

Linné’s investigations of substitutes for grain became of practical service when a failed harvest of the previous summer was followed by famine in 1757 10. Linné sent a memorandum to King Adolf Fredrik in the spring of 1757 and pointed out the risk to the health of the hungry people when they ignorantly chose unsuitable plants as a substitute for grain. He included a short paper on the indigenous plants which in the shortage of grain could be used in bread-making and other cooking. His Majesty immediately permitted this leaflet to be printed at public expense and distributed throughout the country 10. Soon Linné’s recipes using wild flora were read out in churches across Sweden. In Berättelse om The inhemska wäxter, som i brist af Säd kunna anwändas til Bröd- och Matredning, Linné 32 described the habitats and the popular names of about 30 edible wild plants, eight of which were recommended for bread-making.

Blood Sugar Test: Ezekiel Bread vs White Bread

As with all sugars, all starches, including all grain products, will spike your blood sugar level. It doesn’t matter whether the bread is white, whole grain, sprouted, etc. Bread is bread, unless it’s keto bread made out of almond flour, coconut flour, or some other low-carb ingredient.

Ezekiel bread, for example, might be healthier in other ways such as nutrient profile, although the nutrient-density is rather meager compared to many other plant foods and animal foods. For certain, it is not healthy if you’re diabetic, pre-diabetic, or insulin resistant (the majority of Americans fall into one of these categories).

I used to eat Ezekiel bread thinking it was healthier. And this was during the time I was gaining weight and probably developing pre-diabetes or at least worsening insulin resistance. Claims of a lower glycemic index are mostly bunk, as the following video shows, and the same would apply to glycemic load as well. The net carbs, excluding fiber, are identical in Ezekiel bread and white bread.
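The net-carbs arithmetic behind that comparison is simple enough to sketch. On U.S. nutrition labels, fiber is counted inside total carbohydrate, so net carbs are conventionally computed by subtracting fiber out. The per-slice label values below are illustrative placeholders, not figures taken from the video or from any specific product:

```python
# Minimal sketch of the "net carbs" calculation discussed above.
# Convention: net carbs = total carbohydrate - dietary fiber (grams).

def net_carbs(total_carbs_g: float, fiber_g: float) -> float:
    """Grams of carbohydrate per serving after excluding fiber."""
    return total_carbs_g - fiber_g

# Hypothetical per-slice label values (illustrative only):
white_bread = {"total_carbs_g": 14.0, "fiber_g": 1.0}
ezekiel_bread = {"total_carbs_g": 15.0, "fiber_g": 2.0}

for name, label in [("white", white_bread), ("ezekiel", ezekiel_bread)]:
    print(f'{name}: {net_carbs(label["total_carbs_g"], label["fiber_g"])} g net carbs')
# With these example numbers, both come out to 13.0 g net carbs,
# illustrating how two breads can differ on the label yet be
# equivalent in the carbohydrate that actually reaches the bloodstream.
```

The point of the sketch is only that a higher fiber number can mask an equally high digestible-carbohydrate load; it says nothing about glycemic response, which is what the blood sugar test in the video measures directly.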

* * *

* * *

More glycemic index tests comparing foods from Dennis Pollock on his YouTube channel, Beat Diabetes!

Low-Carb Diet Is Healthy Even Without Fat Loss

Studies have shown that a low-carb, high-fat diet improves health. But it wasn’t clear if this is caused directly by the diet or caused instead by the fat loss that is a common result of the diet. In a new 3-year study, researchers controlled for fat loss and many of the same health benefits were still seen.

The researchers did this by providing prepared meals. They had to make sure that the subjects were getting enough calories so as to lose no weight. This meant increasing fat intake, sometimes by extraordinary amounts. Despite this including an increase in saturated fat, there was no increase of saturated fat in the bloodstream. This is yet more evidence against the scapegoating of saturated fat. The diets also would have been high in cholesterol and, unsurprisingly, all the health markers for cholesterol were positive.

On the other hand, there are confounding factors. Subjects were given prepared meals. This would naturally decrease the consumption of processed foods. To really understand what was going on, we would have to look at the precise ingredients. For example, did these prepared meals contain fewer of the industrial vegetable oils that are known to cause all kinds of health problems, including affecting metabolic syndrome?

The fact that there was a greater amount of saturated fat in the diet indicates that the kind of fat one eats does matter. So, simply replacing sources of PUFAs with healthy fats, including saturated fats, will lead to massive improvements, whether the improvement is caused by what is being eliminated or by what is being added. Still, from what we know about the harm caused by excess starches and sugar, it’s hard to conclude that this study merely showed the positive effects of changes in the amounts and kinds of fats.

Whatever the cause, it is well-established at this point that a low-carb, high-fat diet is healthy. This is true whether or not there is fat loss. Yet considering that fat loss is a definite health benefit typical of this diet, it demonstrates how the advantages are multiple. If you need to lose weight, it’s the best diet around. But if you don’t need to lose weight, it’s still great. There is no way for you not to come out ahead.

* * *

Dietary carbohydrate restriction improves metabolic syndrome independent of weight loss
by Parker N. Hyde et al

Low-carb diets could reduce diabetes, heart disease and stroke risk even if people DON’T lose weight by cutting down on bread, potatoes and pasta
by Sam Blanchard

Silence on the US Front–News Flash of US Research from the UK!
by Angela A. Stanton

Low-Carb Diet Could Reduce Risk of These Diseases
by Kashmira Gander

Low-carb diet may reduce diabetes risk independent of weight loss
by Misti Crane

Official Guidelines For Low-Carb Diet

A while back, the Swedish government came around to advising a low-carb (and high-fat or at least not low-fat) diet for treating obesity, diabetes, and heart disease. They were the first Western country to do so. The government committee didn’t come to this official position casually, as they first reviewed 16,000 studies (Brian Shilhavy, Sweden Becomes First Western Nation to Reject Low-fat Diet Dogma in Favor of Low-carb High-fat Nutrition). The committee consisted of ten physicians, several of whom were skeptics of the low-carb diet — far from being a foregone conclusion (Dr. Andreas Eenfeldt, “Fat Trims Your Waistline”).

The committee’s assessment of the low-carb diet was glowing: “…a greater increase in HDL cholesterol (“the good cholesterol”) without having any adverse effects on LDL cholesterol (“the bad cholesterol”). This applies to both the moderate low-carbohydrate intake of less than 40 percent of the total energy intake, as well as to the stricter low-carbohydrate diet, where carbohydrate intake is less than 20 percent of the total energy intake. In addition, the stricter low-carbohydrate diet will lead to improved glucose levels for individuals with obesity and diabetes, and to marginally decreased levels of triglycerides” (as quoted by Dr. Andreas Eenfeldt in Swedish expert committee: A low-carb diet most effective for weight loss).

As you can see, they went so far as to speak well of a stricter version of the low-carb diet. That is the way mainstream experts refer to what cannot be named. The ketogenic diet retains a stigma that isn’t easily shaken, despite a century of powerful medical research behind it. The ketogenic diet sneaks in, nonetheless — just low-carb with a bit more restriction, which sounds less threatening. But the saturated fat issue is still a sore spot, despite the lack of research ever causally linking it to any disease condition. It’s one step at a time. Openly and loudly declaring low-carb diets as an unequivocal good is a massive step forward. It swings the door wide open for the rest to follow.

The Swedish committee came out with their report in 2017. Now the Australian government has done a scientific review (Inquiry into the role of diet in type 2 diabetes prevention and management) and also taken the official position that the low-carb diet should be the default diet for diabetes, although I’m not quite sure when this happened (here is a 2018 Position Statement: Low carbohydrate eating for people with diabetes). “A landmark Australian report has highlighted that remission, not just management, should be the target for type 2 diabetes interventions, and that low carb provides a valuable way to achieve this” (Jack Woodfield, Landmark Australian report promotes low carb approach for treating type 2 diabetes). The committee report even included mention of the benefits from “very low-carbohydrate” dieting, that is to say ketogenic (Ryan Mernin, Australian Lawmakers Propose Low-Carb as Official Diabetes Treatment). The Australian government has gone so far as recommending a campaign to promote diet as a primary approach, as opposed to mere treatment with drugs.

This is an amazing about-face from the position taken only a few years ago. “Fettke, an orthopedic surgeon,” Jennifer Calihan wrote, “was sanctioned in 2016 by regulators (the Australian Health Practitioners Regulation Agency or AHPRA) for recommending a low-carb lifestyle to patients he felt could improve their health by changing their diets. As we wrote in November 2016, Dr. Fettke was officially ‘silenced’ by the AHPRA; this means he was forbidden to give diet-related advice to his patients” (Dr. Gary Fettke exonerated! Receives apology from regulators).

How did that end? Two years later, those attacking him were forced to admit that they had wronged Dr. Fettke. “We are pleased to report that after careful review, the AHPRA has repealed its decision in its entirety, and cleared Dr. Fettke of all charges. He also received a written apology…” As when Tim Noakes won his case in South Africa, this was one more victory for the validity of low-carb diets. Other incidents where doctors have been attacked for advocating for their patients’ health have ended similarly. The tide has turned. It didn’t come out of nowhere, though. In 2017, an Australian government research agency put out a low-carb diet book (Dr. Andreas Eenfeldt, Australian Government Research Agency Releases Low-Carb Diet Book). It’s sad that they were doing this at the same time that regulators were attacking Dr. Fettke.

Some signs of change are seen in the UK as well. The UK National Health Service has officially stated that, “The Low Carb Program can help anyone with type 2 diabetes or pre-diabetes take better control of their condition” (Low Carb Program). That is a good start and might begin to catch the attention of policymakers in the US. The fact that the Pentagon and US military have been researching the ketogenic diet is a massive step forward (Obese Military?), but there seems to be resistance in implementing it, maybe because that would put the Pentagon in opposition to official USDA policy and there might be pressure in the government to not allow internal conflict.

Such shifts don’t happen easily or evenly. Governments lurch back and forth before finally taking a new direction. It’s been building up for a while. This has been true for many governments and health institutions, as they slowly and quietly shift away from the old high-carb dogma without ever admitting they were wrong, often burying the change of position on a back page of an official website without any public announcement for the major news outlets to report. This is how, without most people realizing it, new viewpoints take hold. Only future historians will look back and realize the dramatic paradigm shift that occurred.

Yet sometimes the shift is quite dramatic. Belgium’s Royal Academy of Medicine recently stated in no uncertain terms that children, teens, pregnant women, and nursing mothers should not follow a vegan diet. A precedent was set with a 2017 case of a child’s death from a vegan diet where the parents were given suspended jail time (Mitchell Sunderland, Judge Convicts Parents After Baby Dies from Vegan Diet). The Belgian government has decided that from now on they will legally prosecute other parents in cases such as these (Susan Scutti, Is vegan diet healthy for kids? Belgian doctors say no). In other countries, there have been similar prosecutions against vegan parents when children have died. And before this decision in Belgium, there was a 2016 proposal for prosecution in Italy (BBC, Italy proposal to jail vegans who impose diet on children).

This fits into the larger shift I’m talking about. Veganism is typically high-carb and low-fat, not to mention low-protein (e.g., fruit smoothies loaded with sugar) — the complete opposite of the typical LCHF diet that emphasizes moderate-to-high protein intake, such as fatty animal foods. It’s true a vegan could go on an LCHF diet, and some do, yet few choose to, since without animal fat glucose seems to become the body’s preferred fuel.

The prosecution of vegan-related childhood deaths is a real shocker, considering veganism has been held up for decades as the ultimate healthy plant-based diet. Veganism had become quite trendy among celebrities, but that is likely reversing as well. Over the past year or so, a large number of well-known vegans, many of them vegan advocates with sizable followings, have given up the vegan diet and gone back to eating animal foods. Other than some Hollywood stars, the most famous example is Tim Shieff, a professional athlete who had become a leader in the vegan movement but began eating meat again because of serious health concerns. So, along with an emerging shift in public policy, there has also been a shift in public perception about diets.

This new dietary attitude is not limited to more progressive countries elsewhere. We are seeing these same trends even in the corporatist United States, the epicenter of high-carb advocacy by government authorities, institutional experts, and big food lobbyists. There has been a slow revolution. Some years back, the American Heart Association snuck in some changes to its sugar recommendations and it barely received any media attention — no public announcement, no apologies, as if that had always been their position. That was amazing. All the way back to the 1950s, the AHA had led the charge in blaming fats and exonerating sugar. Almost three quarters of a century of being wrong and now they’re backtracking. The U.S. government followed suit in 2015 (Jen Christensen, 2015 Dietary Guidelines). Neither of these was a defense of low-carb diets, but each was a reversal of course without explanation. Even Walter Willett, who followed in Ancel Keys’ footsteps, admitted that they had been wrong to put all the blame on saturated fat. That was a mind-blowing admission, considering how hard those like him had defended the status quo and attacked all alternative views, with many careers destroyed in the process.

Just this year, the American Diabetes Association also changed its tune. Once again, there was little fanfare. It’s as if a volcano erupted in the middle of New York City and no media outlet thought to send a reporter to the scene to see what happened. Suddenly, a volcano in New York City is the new norm. The ADA went even further than did the AHA, in that they specifically and clearly declared that LCHF diets are not a fad and are not dangerous. This thawing of dietary ideology has been slowly cracking the edifice of the glacier that had enclosed public debate since the mid-20th century. The growing evidence simply can’t be denied, as the research on low-carb including keto has shown positive results, the shift having taken hold in the 1990s with the Charlie Foundation. The new direction was initially imperceptible to anyone not paying attention. I barely noticed this myself until quite recently. Even though I’ve long thought of sugar as an addictive drug, and even though I experimented with the low-carb diet earlier last decade, I didn’t realize how much the science itself was going down a different path.

Dr. Robert Lustig points out how he was taught this information in his nutritionist education, but then had it drilled out of him in medical school. He forgot about what he had learned and followed establishment thought for the next twenty years. It’s maybe not surprising that he re-awoke to his horrible mistake around the time the Charlie Foundation was established. He was angry, presumably for having failed his patients in providing them the best care but no doubt also for allowing himself to be duped. Many other doctors and other health experts have grown angry as well and that anger has driven a sense of passionate advocacy and moral responsibility. It wasn’t merely a personal failure but that of an entire field and public health was the victim, that is to say hundreds of millions of Americans suffered the consequences.

It’s been building up for a while. And the public hasn’t been entirely kept in the dark. The internet opened up public debate like never before. At the same time research was proving that low-carb works, people were experimenting on themselves and discovering the truth of this. This initially led to a backlash by the powers that be, but the public awareness keeps gaining momentum. The ketogenic diet has become the most Googled diet. One hears about low-carb diets all the time these days, even when it is simply another denial of the facts. Suppression of truth through silence is no longer an option. Authorities are forced to respond one way or another, and increasingly that has meant a gradual movement toward low-carb. Maybe unsurprisingly, as more Americans embrace low-carb diets following the peak of sugar intake in 1999, for the first time in decades the diabetes epidemic seems to be subsiding.

There have been widely read journalistic accounts of what has gone so wrong in the field of diet and nutrition, specifically the work of Gary Taubes and Nina Teicholz. Several popular documentaries have also had quite an impact, from Pete Evans’ The Magic Pill to Tom Naughton’s Fat Head. On social media, there has been growing influence of low-carb advocates, including many doctors and scientists. Some low-carb Facebook groups have millions of members. And a video of a biochemistry talk criticizing sugar by Dr. Robert Lustig has received millions of views.

I’ve argued that changes will come from below before we see changes in public policy, but in some countries the government is taking the lead. In the United States, it’s going to take a while for low-carb diets to make their way into the official dietary recommendations. The main problem is that the U.S. was the original force behind the high-carb, low-fat fad diet and the reason other governments adopted it. There are too many American experts who built their careers on it and several highly respected institutions that fully embraced it. They can never admit they were wrong. I’m sure many of the people involved see the writing on the wall, but they are trying to figure out how to switch their position while saving face and without too many people noticing. Only after many other Western governments take up the low-carb approach will the U.S. government follow their example. Then and only then, if we are lucky, the entire food system of transnational corporations might begin to fall in line.

Consensus will eventually shift. Most of the experts that once were against low-carb will suddenly be for it or else they’ll simply become irrelevant and forgotten. A generation will grow up not knowing anything else and the former dietary ideology will quickly fade from public memory, but the consequences on public health will epigenetically linger for many generations more. Fortunately, individuals don’t have to wait for the rest of society to catch up. What you do as an individual can improve your health, along with the health of your children and grandchildren. One thing that is guaranteed is that low-carb is a vast improvement over what most Americans are eating and what the United States government is recommending. That much is clear from the science.

* * *

For more info, see:

Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines

The American Paradox

Primal Fat Burner
by Nora Gedgaudas
pp. 101-103

You’ve likely heard of the “French paradox”—that, despite the French people’s high consumption of saturated fat, their rates of heart disease are lower than ours in the United States. Here in our country we’re stuck in an unfortunate situation that I call the American paradox: the more closely you follow official dietary government guidelines, the worse your health is likely to be! 11 The USDA is busy telling Americans to base their daily diets upon low-fat, starchy carbohydrates and get more exercise; meanwhile, the obesity epidemic and related health challenges continue to grow. (This paradox is global, by the way—countries such as India are seeing skyrocketing rates of diabetes, and the vegetarians of southern India have literally the world’s shortest life span.)

Trying to make sense of all this is a bit like Alice falling down a rabbit hole; everything seems upside down and nonsensical. Let’s take a brief look at the stats. According to the Food Research and Action Center (FRAC), after decades of being subjected to government guidelines promoting a low-fat and high-carbohydrate diet, Americans show the following problems: 12

  • 68.5 percent of adults are overweight or obese; 34.9 percent are obese. (Compare this to the 1971 overweight statistic of 42 percent.)
  • 31.8 percent of children and adolescents are overweight or obese; 16.9 percent are obese.
  • 30.4 percent of low-income preschoolers are overweight or obese.

Yet another study published in May 2015 examining the impact of dietary guidelines on the health of US citizens yielded some shocking but undeniable conclusions: rates of obesity and diabetes have increased dramatically. 13 The official government dietary recommendations were intended to prevent weight problems and obesity, along with diabetes, cancer, and other chronic diseases. The fact that this has not happened—and that the reverse is true—is officially rationalized in a number of ways. 14 But the underlying message is that we are dumb and lazy. That’s right—the party line about why official dietary recommendations (such as from the American Heart Association and the US Departments of Agriculture and Health and Human Services) have failed is that Americans are to blame because we don’t follow the guidelines and we don’t work out enough. 15 In other words, if we’re sick, it’s our own fat, stupid fault.

This is such a persistent, morale-killing, and completely misleading message that I want to address it directly before we move on.

First, we have collectively and diligently followed the guidelines. Here’s what official guidelines recommend for our daily diets versus what we are currently doing in reality (RDA stands for Recommended Daily Allowance):

  • Total fat consumption. RDA says a maximum of 35 percent of calories; reality says about 34 percent. (Let’s not pat ourselves on the back, though—the number one source of those fat calories is partially hydrogenated oil from genetically modified soybeans, one of the worst things for the body!)
  • Saturated fats. RDA says a maximum of 10 percent saturated fat; reality says just under 11 percent (not terribly naughty or rebellious relative to established government recommendations).
  • Carbs. RDA says 55 to 65 percent, with 45 percent the smallest amount necessary to meet the (unfounded) “optimal dietary requirements”; reality says over 50 percent. This is more than enough to create a health-compromising, sugar-burning metabolism.
  • Protein. RDA says between 10 and 35 percent; reality says 15 percent.

As you can see, Americans are meeting the established dietary requirements, and we have largely eschewed our national interest in protein in favor of far more addictive carbohydrates. Isn’t it strange, then, that the predominant health messages we hear are that we eat too much animal protein and saturated fat for our own good, and that those are the things that make us overweight and cause heart-related and other health problems?

Meanwhile, FRAC looked at historical shifts and found that the consumption of fats dropped from 45 to 34 percent of total caloric intake between 1971 and 2011, while carbohydrate consumption jumped from 39 to 51 percent. In the same time, obesity has surged by over 25 percent. We have diligently increased our consumption of carbohydrates and reduced our intake of animal fat and cholesterol for over five decades, according to the rules—and we have gotten fatter. Processed foods that contain chemicals such as MSG, Frankenfoods that contain genetically modified organisms (GMOs), hydrogenated and interesterified vegetable oils, and other damaging ingredients such as high fructose corn syrup are to thank for a good part of this disaster. But the promotion of higher-carb, low-fat diets has also undeniably served to push everyone in the wrong direction. (FRAC concluded, as many scientists have, that the increased consumption of carbohydrates is what has caused the huge increase in overweight and obesity.)

Sailors’ Rations, a High-Carb Diet

In the 18th century British Navy, “Soldiers and sailors typically got one pound of bread a day,” in the form of hard tack, a hard biscuit. That is according to James Townsend. On top of that, some days they were given peas and on other days a porridge called burgoo. Elsewhere, Townsend shares some info from a 1796 memoir of the period, the author having written that “every man and boy born on the books of any of his Majesty’s ships are allowed as following a pound of biscuit bread and a gallon of beer per day” (William Spavens, Memoirs of A Seafaring Life, p. 106). So, grains and more grains, in multiple forms, as both food and beverage.

About burgoo, it is a “ground oatmeal boiled up,” as described by Townsend. “Now you wouldn’t necessarily eat that all by itself. Early on, you were given to go with that salt beef fat. So the slush that came to the top when you’re boiling all your salt beef or salt pork. You get all that fat that goes up on top — they would scrape that off, they keep that and give it to you to go with your burgoo. But later on they said maybe that cause scurvy so they let you have some molasses instead.”

They really didn’t understand scurvy at the time. Animal foods, especially fat, would have some vitamin C in them, whereas the oats and molasses had none. They made up for this deficiency later on by adding cabbage to the sailors’ diet, though that was not a great choice considering vegetables don’t store well on ships. I’d point out that the sailors were, in fact, getting enough vitamin C for a healthy traditional diet, as they got meat four days a week and, even on the other meat-free banyan-days, they had some butter and cheese. That would have given them sufficient vitamin C for a low-carb diet, especially with seafood caught along the way.

A high-carb diet, however, is a whole other matter. The amount of carbs and sugar sailors ate daily was quite large. This came about with colonial trade that made grains cheap and widely available, along with sudden access to sugar from distant sugarcane plantations. Glucose competes with the uptake of vitamin C and so requires a higher intake of the vitamin for basic health, specifically to avoid scurvy. A low-carb diet, on the other hand, can avoid scurvy with very little vitamin C, since sufficient amounts are found in animal foods. Also, a low-carb diet is less inflammatory, which further decreases the need for antioxidants like vitamin C.

This is why the Inuit could eat few plants and immense amounts of meat and fat. They got more vitamin C on a regular basis from seal fat than from the meager plant foods they could gather in the short warm period of the far north. But with almost no carbohydrates in the traditional Inuit diet, the requirement for vitamin C was so low as to not be a problem. This is probably also why Vikings and Polynesians could travel vast distances across the ocean without getting sick, as they were surely eating mostly fresh seafood and little, if any, starchy food.

Unlike protein and fat, carbohydrate is not an essential macronutrient. Yes, carbohydrates provide glucose that the body needs in limited amounts, but through gluconeogenesis protein can be turned into glucose on demand. So, a long sea voyage with zero carbs would never have been a problem.

Sailors in the colonial era ate all of those biscuits, porridge, and peas not because they offered any health value beyond mere survival but because they were cheap food. Those sailors weren’t being fed to have long, healthy lives, as labor was cheap and no one cared about them. As soon as a sailor was no longer useful, he would no longer be employed in that profession and he’d find himself among the impoverished masses. For all the health problems of a sailor’s diet, it was better than the alternative of starvation or near starvation that so many others faced.

Grain consumption had been increasing in late feudalism, but peasants still maintained a wider variety in their diet through foods they could hunt or gather, not to mention some fresh meat, fat, eggs, and dairy from animals they raised. That all began to change with the enclosure movement. The end of feudal village life and the loss of the peasants’ commons was not a pretty picture and did not lead to happy results, as the landless peasants evicted from their homes flooded into the cities, where most of them died. The economic desperation made for much cheap labor. Naval sailors with their guaranteed rations, in spite of nutritional deficiencies, were comparatively lucky.

* * *

This understanding of low-carb, animal-based diets isn’t new either. If you look back to previous centuries, you see that low-carb diets have been advocated since the late 1700s. Advocating such diets prior to that was irrelevant, since low-carb was the dietary norm, assumed without needing to be stated.

Only a couple of centuries ago did new forms of agriculture take hold that created large surplus yields for the first time in human existence. Right when a high-carb diet became possible for a larger part of the population, the health problems of a high-carb diet unsurprisingly began to appear, and the voices for low-carb soon followed.

In prior centuries, one even finds examples in old books describing the health advantages of animal foods. But I’m not sure if anyone connected high-carb diets to scurvy until more recently. Still, this understanding is older than most people realize, going back at least to the late 1800s. L. Amber O’Hearn shares the following passages in her post C is for Carnivore:

Selected notes from The Lancet, volume 123.
You can find this in Google Books.

p 329. From a medical report from Mr. W. H. Neale, M.B. B.S. medical officer of the Eira, about an Arctic expedition:

“For the boat journey we saved 40 lb. of tinned meat (per man), and 35 lb. of tinned soups (per man), 3 cwt. of biscuit, and about 800 lb. of walrus meat, which was cooked and soldered up by our blacksmith in old provision tins. About 80 lb. of tea were saved, enabling us to have tea night and morning till almost the day we were picked up. No lime-juice was saved. A few bottles of wine and brandy were secured, and kept for Mr. Leigh-Smith and invalids. All the rum was saved, and every man was allowed one-fifth of a gill per day until May 1st, 1882, when it was decided to keep the remaining eighteen gallons for the boats. One man was a teetotaler from January to June, and was quite as healthy as anyone else. Personally it made very little difference whether I took the allowance of “grog” or not. One of the sick men was also a teetotaler nearly all the time. During the boat journey the men preferred their grog when doing any hard work, a fact I could never agree to, but when wet and cold a glass of grog before going to sleep seemed to give warmth to the body and helped to send one to sleep. Whilst sailing, also, one glass of grog would give temporary warmth; but everyone acknowledged that a mug of hot tea was far better when it was fit weather to make a fire. I do not think that spirits or lime-juice is much use as antiscorbutics; for if you live on the flesh of the country even, I believe, without vegetables, you will run very little risk of scurvy. There was not a sign of scurvy amongst us, not even an anaemic face.
I have brought home a sample of bear and walrus meat in a tin, which I intend to have analysed if it is still in good preservation; and then it will be a question as to how it will be best to preserve the meat of the country in such a form as to enable a sufficient supply to be taken on long sledge journeys; for as long as you have plenty of ventilation and plenty of meat, anyone can live out an Arctic winter without fear of scurvy, even if they lie for days in their beds, as our men were compelled to do in the winter when the weather was too bad to go outside (there being no room inside for more than six or seven to be up at one time).”

p331, John Lucas: “Sir, —A propos the annotation appearing under the above heading in The Lancet of June 24th, pp. 1048-9, I would beg permission to observe that almost every medical man in India will be able to endorse the views of Dr. Moore, to which you refer. Medical officers of native regiments notice almost daily in their hospital practice that—to use your writer’s words—“insufficient diet will cause scurvy even if fresh vegetable material forms a part of the diet, though more rapidly if it is withheld.” Indeed, so far as my humble experience as a regimental surgeon from observations on the same men goes, I am inclined to think that the meat-eating classes of our Sepoys—to wit, the Mahomedans, especially those from the Punjaub—are comparatively seldom seen with the scorbutic taint; while, on the contrary, the subjects are, in the main, the vegetable feeders, their non-meat-eating comrades, the Hindus (Parboos from the North-West Provinces and Deccan Mahrattas), especially those whose daily food is barely sufficient either in quality or quantity. A sceptic may refuse to accept this view on the ostensible reason that though the food of the meat-eating classes be such, it may, perchance, contain vegetable ingredients as well as meat. To this I would submit the rejoinder that as a matter of fact, quite apart from all theory and hypothesis, the food of these meat-eating classes does not always contain much, or any, vegetables. In the case of the semi-savage hill tribes of Afghanistan and Baluchistan, their food contains large amounts of meat (mutton), and is altogether devoid of vegetables. The singular immunity from scurvy of these races has struck me as a remarkable physiological circumstance, which should make us pause before accepting the vegetable doctrine in relation to scurvy et hoc genus omne.”

p370 Charles Henry Ralphe: “To the Editor of The Lancet. Sir, —I was struck by two independent observations which occurred in your columns last week with regard to the etiology of scurvy, both tending to controvert the generally received opinion that the exclusive cause of the disease is the prolonged and complete withdrawal of succulent vegetables from the dietary of those affected. Thus Mr. Neale, of the Eira Arctic Expedition, says: “I do not think that spirit or lime-juice is of much use as an antiscorbutic; for if you live on the flesh of the country, even, I believe, without vegetables, you will run very little risk of scurvy.” Dr. Lucas writes: “In the case of the semi-savage hill tribes of Afghanistan and Beluchistan their food contains a large amount of meat, and is altogether devoid of vegetables. The singular immunity from scurvy of these races has struck me as a remarkable physiological circumstance, which should make us pause before accepting the vegetable doctrine in relation to scurvy.” These observations do not stand alone. Arctic voyagers have long pointed out the antiscorbutic properties of fresh meat, and Baron Larrey, with regard to hot climates, arrived at the same conclusion in the Egyptian expedition under Bonaparte, at the end of last century.”

p495 “SCURVY. Dr. Buzzard, in a letter which appeared in our columns last week, considers that the fact that the crew of the Eira were supplied with preserved vegetables tells against the supposition advanced by Mr. Neale, that if Arctic voyagers were to feed only on the flesh of the animals supplied by the country they would be able to dispense with lime-juice. The truth is, it is an open question with many as to the relative antiscorbutic properties of preserved vegetables, and whether under the circumstances in which the Eira’s crew were placed they would have been sufficient, in the absence of lime-juice and fresh meat, to have preserved the crew from scurvy. A case in point is the outbreak that occurred on board the Adventure, in the surveying voyages of that vessel and the Beagle. The Adventure had been anchored in Port Famine for several months, and although “pickles, cranberries, large quantities of wild celery, preserved meats and soups, had been abundantly supplied,” still great difficulty had been experienced in obtaining fresh meat, and they were dependent on an intermittent supply from wild-fowl and a few shell-fish. Scurvy appeared early in July, fourteen cases, including the assistant-surgeon, being down with it. At the end of July fresh meat was obtained; at first it seemed to prove ineffectual, but an ample supply being continued, the commander was able to report, by the end of August, “the timely supply of guanaco meat had certainly checked the scurvy.” This is an instance in which articles of diet having recognised antiscorbutic properties proved insufficient, in the absence of lime-juice and fresh meat, and under conditions of exceptional hardship, exposure, and depressing influence, to prevent the occurrence of scurvy.
So with the Eira, we believe that had they not fortunately been able to obtain abundant supplies of fresh meat, scurvy would have appeared, and that the preserved vegetables in the absence of lime-juice would have proved insufficient as antiscorbutics. This antiscorbutic virtue of fresh meat has long been recognised by Arctic explorers, and, strangely, their experience in this respect is quite at variance with ours in Europe. It has been sought to explain the immunity from the disease of the Esquimaux, who live almost exclusively on seal and walrus flesh during the winter months, by maintaining that the protection is derived from the herbage extracted from the stomach of reindeer they may kill. In view, however, of the small proportion of vegetable matter that would be thus obtained for each member of the tribe, and the intermittent nature of the supply, it can hardly be maintained that the antiscorbutic supplied in this way is sufficient unless there are other conditions tending in the same direction. And of these, one, as we have already stated, consists probably in the fact that the flesh is eaten without lactic acid decomposition having taken place, owing either to its being devoured immediately, or from its becoming frozen. The converse is the case in Europe, where meat is hung some time after rigor mortis has passed off, and lactic acid develops to a considerable extent. This seems a rational explanation, and it reconciles the discrepancy of opinion that exists between European and Arctic observers with regard to meat as an antiscorbutic.
In bringing forward the claims of the flesh of recently killed animals as an antiscorbutic, it must be understood that we fully uphold the doctrine that the exclusive cause of scurvy is due to the insufficient supply of fresh vegetable food, and that it can be only completely cured by their administration ; but if the claims advanced with regard to the antiscorbutic qualities of recently slaughtered flesh be proved, then we have ascertained a fact which ought to be of the greatest practical value with regard to the conduct of exploring expeditions, and every effort should be made to obtain it. Everything, moreover, conducive to the improvement of the sailor’s dietary ought to receive serious consideration, and it has therefore seemed to us that the remarks of Mr. Neale and Dr. Lucas are especially worthy of attention, whilst we think the suggestion of the former gentleman with regard to the use of the blood of slaughtered animals likely to prove of special value.”

p913 “Sir, —In a foot-note to page 496 of his “Manual of Practical Hygiene,” fifth edition (London, Churchill, 1878), Parkes says: —“For a good deal of evidence up to 1848, I beg to refer to a review I contributed on scurvy in the British and Foreign Medico-Chirurgical Review in that year. The evidence since this period has added, I believe, little to our knowledge, except to show that the preservation and curative powers of fresh meat in large quantities, and especially raw meat (Kane’s Arctic Expedition), will not only prevent, but will cure scurvy. Kane found the raw meat of the walrus a certain cure. For the most recent evidence and much valuable information, see the Report of the Admiralty Committee on the Scurvy which occurred in the Arctic Expedition of 1875-76 (Blue Book, 1877).” I think that the last sentence in the above is not Parkes’ own, but that it must have been added by the editor in order to bring it up to the date of the issue of the current edition. The experience since then of the Arctic Expedition in the Eira coincides with these. I refer to that portion of the report where the author tells us that “our food consisted chiefly of bear and walrus meat, mixing some of the bear’s blood with the soup when possible.” And again: “I do not think that spirits or lime-juice is much use as an antiscorbutic, for if you live on the flesh of the country, even, I believe, without vegetables, you will run very little risk of scurvy. There was not a sign of scurvy amongst us, not even an anaemic face,” (Lancet, Aug. 26th.) So that, as far as this question of fresh meat and raw meat and their prophylactic and curative properties are concerned, ample evidence will be found in other published literature to corroborate that of the Eira. But when you take up the question of the particular change which takes place in meat from its fresh to its stale condition, you will find a great deal of diversity and little harmony of opinion.
Without taking up other authors on the subject, we stick to Parkes and compare his with Dr. Ralfe’s views on this point. Parkes thought “fresh, and especially raw meat, is also useful, and this is conjectured to be from its amount of lactic acid; but this is uncertain,” while on the other hand Dr. Ralfe repeats, as a probable explanation of the reason of fresh meat being an antiscorbutic, that it is due to the absence of lactic acid. For, from well-known chemical facts he deduces the following: —“In hot climates meat has to be eaten so freshly killed that no time is allowed for the development of the lactic acid: in arctic regions the freezing arrests its formation. The muscle plasma, therefore, remains alkaline. In Europe the meat is invariably hung, lactic acid is developed freely, and the muscle plasma is consequently acid. If, therefore, scurvy is, as I have endeavoured to show (“Inquiry into the General Pathology of Scurvy”), due to diminished alkalinity of the blood, it can be easily understood that meat may be antiscorbutic when fresh killed, or frozen immediately after killing, but scorbutic when these alkaline salts have been converted into acid ones by lactic acid decomposition.” The view of the alkalinity of the blood coincides with Dr. Garrod’s theory, which, however, appears to have as a sine qua non the absence of a particular salt, namely, potash. I am inclined to think that, taking into account the nervous symptoms which are not infrequently associated with a certain proportion of scorbutic cases, resulting probably from the changes taking place in the blood, not unlike those which occur in gout and rheumatism, there must be some material change produced in the sympathetic system. In many of the individuals tainted with scurvy there were slight and severe attacks of passing jaundice in the cases which occurred in Afghanistan. Can we possibly trace this icteric condition to this cause? This is but a conjecture so far.
But there certainly is in Garrod’s observations an important point which, if applicable to all countries, climates, and conditions of life, is sufficiently weighty to indicate the necessity for farther research in that direction, and that point is this : the scorbutic condition disappeared on the patient being given a few grains of potash, though kept strictly on precisely the same diet which produced scurvy. —I am, Sir, yours truly, Ahmedabad, India, 30th Sept., 1882. JOHN C. LUCAS.”

Dr. Eric Berg on Insulin Resistance

Let me do a simple post by sharing a short video. Dr. Eric Berg has a talent for summarizing scientific explanations in a minimal amount of time. Watching it will require less than 10 minutes of your life. And after watching it, you’ll understand why insulin is so important, why insulin resistance is so problematic, and why a low-carb diet is so necessary. It’s simple and to the point.

Carcinogenic Grains

In understanding human health, we have to look at all factors as a package deal. Our gut-brain is a system, as is our entire mind-body. Our relationships, lifestyle, the environment around us — all of it is inseparable. This is true even if we limit ourselves to diet alone. It’s not simply calories in/calories out, macronutrient ratios, or anything else along these lines. It is the specific foods eaten in combination with which other foods and in the context of stress, toxins, epigenetic inheritance, gut health, and so much else that determine what effects manifest in the individual.

There are numerous examples of this. But I’ll stick to a simple one, which involves several factors and the relationship between them. First, red meat is associated with cancer and heart disease. Yet causation is hard to prove, as red meat consumption is associated with many other foods in the standard American diet, such as added sugars and vegetable oils in processed foods. The association might be based on confounding factors that are culture-specific, which can explain why we find societies with heavy meat consumption and little cancer.

So, what else might be involved? We have to consider what red meat is being eaten with, at least in the standard American diet that is used as a control in most research. There is, of course, the added sugars and vegetable oils — they are seriously bad for health and may explain much of the confusion. Saturated fat intake has been dropping since the early 1900s and, in its place, there has been a steady rise in the use of vegetable oils; we now know that highly heated and hydrogenated vegetable oils do severe damage. Also, some of the original research that blamed saturated fat, when re-analyzed, found that sugar was the stronger correlation to heart disease.

Saturated fat, like cholesterol, had been wrongly accused. This misunderstanding has, over multiple generations at this point, led to the early death of at least hundreds of millions of people worldwide, as dozens of the wealthiest and most powerful countries enforced it in their official dietary recommendations, which transformed the world’s food system. Similar to eggs, red meat became the fall guy.

Such things as heart disease are related to obesity, and conventional wisdom tells us that fat makes us fat. Is that true? Not exactly or directly. I was amused to discover that a scientific report commissioned by the British government in 1846 (Experimental Researches on the Food of Animals, and the Fattening of Cattle: With Remarks on the Food of Man. Based Upon Experiments Undertaken by Order of the British Government by Robert Dundas Thomson) concluded that “The present experiments seem to demonstrate that the fat of animals cannot be produced from the oil of the food.” Fat by itself doesn’t make people fat, and it has been observed for centuries that low-carb, meat-eating populations tend to be slim.

So, in most cases, what does cause fat accumulation? It is fat combined with plenty of carbs and sugar that is guaranteed to make us fat; that is to say, fat in the presence of glucose, since the two compete as fuel sources.

Think about what an American meal with red meat looks like. A plate might have a steak with some rolls or slices of bread, combined with a potato and maybe some starchy ‘vegetables’ like corn, peas, or lima beans. Or there will be a hamburger with a bun, a side of fries, and a large sugary drink (‘diet’ drinks are no better, as we now know artificial sweeteners fool the body and so are just as likely to make you fat and diabetic). What is the common factor? Red meat combined with wheat or some other grain, as part of a diet drenched in carbs and sugar (and all of it cooked in or slathered with vegetable oils).

Most Americans have a far greater total intake of carbs, sugar, and vegetable oils than of red meat and saturated fat. The preferred meat of Americans these days is chicken, with fish also being popular. Why, then, do red meat and saturated fat continue to be blamed for the worsening rates of heart disease and metabolic disease? It’s simply not rational, based on the established facts in the field of diet and nutrition. That isn’t to claim that too much red meat couldn’t be problematic. It depends on the total diet. Also, Americans have the habit of grilling their red meat, and grilling increases carcinogens. That could be avoided by not charring one’s meat, but the same applies to not burning (or frying) anything one eats, including white meat and plant foods. In terms of this one factor, you’d be better off eating beef roasted with vegetables than going with a plant-based meal that included foods like french fries, fried okra, or grilled vegetable shish kabobs.

Considering all of that, what exactly is the cause of cancer that keeps showing up in epidemiological studies? Sarah Ballantyne has some good answers to that (see quoted passage below). It’s not so much about red meat itself as it is about what red meat is eaten with. The crux of the matter is that Americans eat more starchy carbs, mostly refined flour, than they do vegetables. What Ballantyne explains is that two of the potential causes of cancer associated with red meat only occur in a diet deficient in vegetables and abundant in grains. It is the total diet as seen in the American population that is the cause of high rates of cancer.

Just as a heavy meat diet without grains is not problematic, a heavy carb diet without grains is not necessarily problematic either. Some of the healthiest populations eat lots of carbs like sweet potatoes, but you won’t find any healthy population that eats as many grains as Americans do. There are many issues with grains considered in isolation (read the work of David Perlmutter or any number of writers on the paleo diet), but it is grains combined with certain other foods that particularly contribute to health concerns.

Then again, some of this is about proportion. For most of the history of agriculture, humans ate small amounts of grains as an occasional food. Grains tended to be stored for hard times or for trade, or else turned into alcohol to be mixed with water from unclean sources. The shift to large amounts of grains made into refined flour is an evolutionarily unique dilemma our bodies aren’t designed to handle. The first accounts of white bread are found in texts from slightly over two millennia ago, and most Westerners couldn’t afford white bread until the past few centuries, when industrialized milling began. Before that, people tended to eat foods that were available and didn’t mix them as much (e.g., eating fruits and vegetables in season). Hamburgers were invented only about a century ago. The constant combining of red meat and grains is not something we are adapted for. It maybe shouldn’t surprise us that harm to our health results.

Red meat can be a net loss to health or a net gain. It depends not on the red meat, but what is and isn’t eaten with it. Other factors matter as well. Health can’t be limited to a list of dos and don’ts, even if such lists have their place in the context of more detailed knowledge and understanding. The simplest solution is to eat as most humans ate for hundreds of thousands of years, and more than anything else that means avoiding grains. Even without red meat, many people have difficulties with grains.

Let’s return to the context of evolution. Hominids have been eating fatty red meat for millions of years (early humans having prized red meat from blubbery megafauna until their mass extinction), and yet meat-eating hunter-gatherers rarely get cancer, heart disease, or any of the other modern ailments. How long ago was it when the first humans ate grains? About 12 thousand years ago. Most humans on the planet never touched a grain until the past few millennia. And fewer still included grains with almost every snack and meal until the past few generations. So, what is this insanity of government dietary recommendations putting grains as the base of the food pyramid? Those grains are feeding the cancerous microbes, and doing much else that is harmful.

In conclusion, is red meat bad for human health? It depends. Red meat that is charred or heavily processed, combined with wheat and other carbs, lots of sugar and vegetable oils, and few nutritious vegetables: that would be a shitty diet that will inevitably lead to horrible health consequences. Then again, the exact same diet minus the red meat would still be a recipe for disease and early death. Yet under other conditions, red meat can be part of a healthy diet. Even a ton of pasture-raised red meat (with plenty of nutrient-dense organ meats) combined with an equal amount of organic vegetables (grown on healthy soil, bought locally, and eaten in season), excluding grains (especially refined flour) and limiting intake of all the other crap, would be one of the healthiest diets you could eat.

On the other hand, if you are addicted to grains, as many are, and can’t imagine a world without them, you would be wise to avoid red meat entirely. Assuming you have any concerns about cancer, you should choose one or the other but not both. I would note, though, that there are many other reasons to avoid grains, while there are no other known serious health reasons to avoid red meat, although some people exclude it for other reasons such as digestion issues. The point is that, as long as we separate out grains, whether or not you eat red meat is a personal choice (based on taste, ethics, etc.), not so much a health choice. That is all we can say for certain based on present scientific knowledge.

* * *

We’ve known about this for years now. Isn’t it interesting that no major health organization, scientific institution, corporate news outlet, or government agency has ever warned the public about the risk factors of carcinogenic grains? Instead, we get major propaganda campaigns to eat more grains because that is where the profit is for big ag, big food, and big oil (that makes farm chemicals and transports the products of big ag and big food). How convenient! It’s nice to know that corporate profit is more important than public health.

But keep listening to those who tell you that cows are destroying the world, even though there are fewer cows in North America than there once were buffalo. Yeah, monocultural GMO crops immersed in deadly chemicals that destroy soil and deplete nutrients are going to save us, not traditional grazing land of a kind that has existed for tens of millions of years. So, sure, we could go on producing massive yields of grains in a utopian fantasy beloved by technocrats and plutocrats that further disconnects us from the natural world and our evolutionary origins, an industrial food system dependent on turning the whole world into endless monocrops denatured of all other life, making entire regions into ecological deserts that push us further into mass extinction. Or we could return to traditional ways of farming and living with a more traditional diet largely of animal foods (meat, fish, eggs, dairy, etc.) balanced with an equal amount of vegetables, the original hunter-gatherer diet.

Our personal health is important. And it is intimately tied to the health of the earth. Civilization as we know it was built on grains. That wasn’t necessarily a problem when grains were a small part of the diet and populations were small. But is it still a sustainable socioeconomic system as part of a healthy ecological system? No, it isn’t. So why do we continue to do more of the same that caused our problems in the hope that it will solve our problems? As we think about how different parts of our diet work together to create conditions of disease or health, we need to begin thinking this way about our entire world.

* * *

Paleo Principles
by Sarah Ballantyne

While this often gets framed as an argument for going vegetarian or vegan, it’s actually a reflection of the importance of eating plenty of plant foods along with meat. When we take a closer look at these studies, we see something extraordinarily interesting: the link between meat and cancer tends to disappear once the studies adjust for vegetable intake. Even more exciting, when we examine the mechanistic links between meat and cancer, it turns out that many of the harmful (yes, legitimately harmful!) compounds of meat are counteracted by protective compounds in plant foods.

One major mechanism linking meat to cancer involves heme, the iron-containing compound that gives red meat its color (in contrast to the nonheme iron found in plant foods). Where heme becomes a problem is in the gut: the cells lining the digestive tract (enterocytes) metabolize it into cytotoxic compounds (meaning toxic to living cells), which can then damage the gut barrier (specifically the colonic mucosa; see page 67), cause cell proliferation, and increase fecal water toxicity—all of which raise cancer risk. Yikes! In fact, part of the reason red meat is linked with cancer far more often than with white meat could be due to their differences in heme content; white meat (poultry and fish) contains much, much less.

Here’s where vegetables come to the rescue! Chlorophyll, the pigment in plants that makes them green, has a molecular structure that’s very similar to heme. As a result, chlorophyll can block the metabolism of heme in the intestinal tract and prevent those toxic metabolites from forming. Instead of turning into harmful by-products, heme ends up being metabolized into inert compounds that are no longer toxic or damaging to the colon. Animal studies have demonstrated this effect in action: one study on rats showed that supplementing a heme-rich diet with chlorophyll (in the form of spinach) completely suppressed the pro-cancer effects of heme. All the more reason to eat a salad with your steak.

Another mechanism involves L-carnitine, an amino acid that’s particularly abundant in red meat (another candidate for why red meat seems to disproportionately increase cancer risk compared to other meats). When we consume L-carnitine, our intestinal bacteria metabolize it into a compound called trimethylamine (TMA). From there, the TMA enters the bloodstream and gets oxidized by the liver into yet another compound, trimethylamine-N-oxide (TMAO). This is the one we need to pay attention to!

TMAO has been strongly linked to cancer and heart disease, possibly due to promoting inflammation and altering cholesterol transport. Having high levels of it in the bloodstream could be a major risk factor for some chronic diseases. So is this the nail in the coffin for meat eaters?

Not so fast! An important study on this topic published in 2013 in Nature Medicine sheds light on what’s really going on. This paper had quite a few components, but one of the most interesting has to do with gut bacteria. Basically, it turns out that the bacteria group Prevotella is a key mediator between L-carnitine consumption and having high TMAO levels in our blood. In this study, the researchers found that participants with gut microbiomes dominated by Prevotella produced the most TMA (and therefore TMAO, after it reached the liver) from the L-carnitine they ate. Those with microbiomes high in Bacteroides rather than Prevotella saw dramatically less conversion to TMA and TMAO.

Guess what Prevotella loves to snack on? Grains! It just so happens that people with high Prevotella levels tend to be those who eat grain-based diets (especially whole grain), since this bacterial group specializes in fermenting the type of polysaccharides abundant in grain products. (For instance, we see extremely high levels of Prevotella in populations in rural Africa that rely on cereals like millet and sorghum.) At the same time, Prevotella doesn’t seem to be associated with a high intake of non-grain plant sources, such as fruit and vegetables.

So is it really the red meat that’s a problem . . . or is it the meat in the context of a grain-rich diet? Based on the evidence we have so far, it seems that grains (and the bacteria that love to eat them) are a mandatory part of the L-carnitine-to-TMAO pathway. Ditch the grains, embrace veggies, and our gut will become a more hospitable place for red meat!

* * *

Georgia Ede has a detailed article about the claim of meat causing cancer. In it, she provides several useful summaries of and quotes from the scientific literature.

WHO Says Meat Causes Cancer?

In November 2013, 23 cancer experts from eight countries gathered in Norway to examine the science related to colon cancer and red/processed meat. They concluded:

“…the interactions between meat, gut and health outcomes such as CRC [colorectal cancer] are very complex and are not clearly pointing in one direction….Epidemiological and mechanistic data on associations between red and processed meat intake and CRC are inconsistent and underlying mechanisms are unclear…Better biomarkers of meat intake and of cancer occurrence and updated food composition databases are required for future studies.” 1) To read the full report: http://www.ncbi.nlm.nih.gov/pubmed/24769880 [open access]

Translation: we don’t know if meat causes colorectal cancer. Now THAT is a responsible, honest, scientific conclusion.

How the WHO?

How could the WHO have come to such a different conclusion than this recent international gathering of cancer scientists? As you will see for yourself in my analysis below, the WHO made the following irresponsible decisions:

  1. The WHO cherry-picked studies that supported its anti-meat conclusions, ignoring those that showed either no connection between meat and cancer or even a protective effect of meat on colon cancer risk. These neutral and protective studies were specifically mentioned within the studies cited by the WHO (which makes one wonder whether the WHO committee members actually read the studies referenced in its own report).
  2. The WHO relied heavily on dozens of “epidemiological” studies (which by their very nature are incapable of demonstrating a cause and effect relationship between meat and cancer) to support its claim that meat causes cancer.
  3. The WHO cited a mere SIX experimental studies suggesting a possible link between meat and colorectal cancer, four of which were conducted by the same research group.
  4. THREE of the six experimental studies were conducted solely on RATS. Rats are not humans and may not be physiologically adapted to high-meat diets. All rats were injected with powerful carcinogenic chemicals prior to being fed meat. Yes, you read that correctly.
  5. Only THREE of the six experimental studies were human studies. All were conducted with a very small number of subjects and were seriously flawed in more than one important way. Examples of flaws include using unreliable or outdated biomarkers and/or failing to include proper controls.
  6. Some of the theories put forth by the WHO about how red/processed meat might cause cancer are controversial or have already been disproved. These theories were discredited within the texts of the very same studies cited to support the WHO’s anti-meat conclusions, again suggesting that the WHO committee members either didn’t read these studies or deliberately omitted information that didn’t support the WHO’s anti-meat position.

Does it matter whether the WHO gets it right or wrong about meat and cancer? YES.

“Strong media coverage and ambiguous research results could stimulate consumers to adapt a ‘safety first’ strategy that could result in abolishment of red meat from the diet completely. However, there are reasons to keep red meat in the diet. Red meat (beef in particular) is a nutrient dense food and typically has a better ratio of N6:N3-polyunsaturated fatty acids and significantly more vitamin A, B6 and B12, zinc and iron than white meat (compared values from the Dutch Food Composition Database 2013, raw meat). Iron deficiencies are still common in parts of the populations in both developing and industrialized countries, particularly pre-school children and women of childbearing age (WHO)… Red meat also contains high levels of carnitine, coenzyme Q10, and creatine, which are bioactive compounds that may have positive effects on health.” 2)

The bottom line is that there is no good evidence that unprocessed red meat increases our risk for cancer. Fresh red meat is a highly nutritious food which has formed the foundation of human diets for nearly two million years. Red meat is a concentrated source of easily digestible, highly bioavailable protein, essential vitamins and minerals. These nutrients are more difficult to obtain from plant sources.

It makes no sense to blame an ancient, natural, whole food for the skyrocketing rates of cancer in modern times. I’m not interested in defending the reputation of processed meat (or processed foods of any kind, for that matter), but even the science behind processed meat and cancer is unconvincing, as I think you’ll agree. […]

Regardless, even if you believe in the (non-existent) power of epidemiological studies to provide meaningful information about nutrition, more than half of the 29 epidemiological studies did NOT support the WHO’s stance on unprocessed red meat and colorectal cancer.

It is irresponsible and misleading to include this random collection of positive and negative epidemiological studies as evidence against meat.

The following quote is taken from one of the experimental studies cited by the WHO. The authors of the study begin their paper with this striking statement:

“In puzzling contrast with epidemiological studies, experimental studies do not support the hypothesis that red meat increases colorectal cancer risk. Among the 12 rodent studies reported in the literature, none demonstrated a specific promotional effect of red meat.” 3)

[Oddly enough, none of these twelve “red meat is fine” studies, which the authors went on to list and describe within the text of the introduction to this article, were included in the WHO report].

I cannot emphasize enough how common it is to see statements like this in scientific papers about red meat. Over and over again, researchers see that epidemiology suggests a theoretical connection between some food and some health problem, so they conduct experiments to test the theory and find no connection. This is why our nutrition headlines are constantly changing. One day eggs are bad for you, the next day they’re fine. Epidemiologists are forever sending well-intentioned scientists on time-consuming, expensive wild goose chases, trying to prove that meat is dangerous, when all other sources–from anthropology to physiology to biochemistry to common sense—tell us that meat is nutritious and safe.

* * *

Below is a good discussion between Dr. Steven Gundry and Dr. Paul Saladino. It’s an uncommon dialogue. Even though Gundry is known for warning against the harmful substances in plant foods, he has shifted toward a plant-based diet, also warning against too much animal food, or at least too much protein (a separate issue involving IGF-1 that isn’t relevant to this post). As for Saladino, he is a carnivore and so takes Gundry’s argument against plants to a whole other level. Saladino sees no problem with meat, of course. And his view contradicts what Gundry writes about in his most recent book, The Longevity Paradox.

Anyway, they got onto the topic of TMAO. Saladino points out that fish contains more fully formed TMAO than red meat produces in combination with grain-loving Prevotella. Even vegetables produce TMAO. So, why is beef being scapegoated? It’s pure ignorant idiocy. To further this point, Saladino explained that he has tested the microbiomes of his patients on the carnivore diet, and they come up low in Prevotella bacteria. He doesn’t think TMAO is the danger people claim it is. But even if it were, the single safest diet might be the carnivore diet.

Gundry didn’t even disagree. He pointed out that he did testing on patients of his who are long-term vegans and now in their 70s. They had extremely high levels of TMAO. He sent their lab results to the Cleveland Clinic for an opinion. The experts there refused to believe that it was possible and so dismissed the evidence. That is the power of dietary ideology when it forms a self-enclosed reality tunnel. Red meat is bad and vegetables are good. The story changes over time. It’s the saturated fat. No, it’s the TMAO. Then it will be something else. Always looking for a rationalization to uphold the preferred dogma.

* * *

7/25/19 – Additional thoughts: There is always new research coming out. And as is typical, it is often contradictory. It is hard to know what is being studied exactly. The most basic understanding in mainstream nutrition right now seems to be that red meat is associated with TMAO by way of carnitine and Prevotella (Studies reveal role of red meat in gut bacteria, heart disease development). But there are many assumptions being made. This research tends to be epidemiological/observational, and so most factors aren’t being controlled.

Worse still, they aren’t comparing the equivalent extremes: not veganism vs. carnivory, but veganism and vegetarianism vs. omnivory. That leaves out the even greater complicating factor that, as the data show, a significant number of vegans and vegetarians occasionally eat animal foods. There really aren’t that many long-term vegans and vegetarians to study, because 80% of people who start the diet quit it, and of the remaining 20% few are consistent.

As for omnivores, they are a diverse group that could include hundreds of dietary variations. One variety of omnivory is the paleo diet, a slightly restricted omnivory in that grains are excluded, often along with legumes, white potatoes, dairy, added sugar, etc. One study of the paleo diet found higher levels of TMAO, and the focus was on cardiovascular disease rather than cancer (Heart disease biomarker linked to paleo diet).

So, that must mean the paleo diet is bad, right? When people think of the paleo diet, they think of a caveman lugging a big hunk of meat. But the reality is that the standard paleo diet, although including red meat, emphasizes fish and heaping platefuls of vegetables. Why is red meat getting blamed? In a bizarre twist, the lead researcher of the paleo study, Dr. Angela Genoni, thought the problem was the lack of grains. But it is precisely grains that the TMAO-producing Prevotella gut bacteria love so much. How could reducing grains increase TMAO? No explanation was offered. Before we praise grains, why not look at the sub-populations of vegans, vegetarians, fruitarians, etc. who also avoid grains?

There is a more rational and probable factor. It turns out that fish and vegetables raise TMAO levels higher than red meat does (Eat your vegetables (and fish): Another reason why they may promote heart health). This solves the mystery of why some of Dr. Gundry’s vegan patients had high TMAO levels. Yet, in another bizarre leap of logic, the same TMAO that is used to castigate red meat is suddenly portrayed as healthy, reducing cardiovascular risk, when it comes from sources other than red meat. It is the presence of red meat that somehow magically transforms TMAO into an evil substance that will kill you. Or maybe, just maybe, it has nothing directly to do with TMAO alone.

After a long and detailed analysis of the evidence, Dr. Georgia Ede concluded that, “As far as I can tell, the authors’ theory that red meat provides carnitine for bacteria to transform into TMA which our liver then converts to TMAO, which causes our macrophages to fill up with cholesterol, block our arteries, and cause heart attacks is just that–a theory–full of sound and fury, signifying nothing” (Does Carnitine from Red Meat Cause Heart Disease?).

 

A Food Revolution Worthy of the Name!

“Our success with carbohydrates, however, has had a serious downside: a worldwide plague of obesity, diabetes and other diet-related diseases.”
~Gerald C. Nelson

The conventional view on diet promoted by establishment figures and institutions is based on the idea that all calories are equal. In dieting and fat loss, this has meant promoting a philosophy of calorie-in/calorie-out which translates as calorie counting and calorie restriction. Recent research has brought serious doubt to this largely untested hypothesis that has for so long guided public health recommendations.

There is also a larger background to this issue. The government has spent immense sums promoting and subsidizing the high-carb diet. For example, it has put decades of funding into research on growing higher-yield staples of wheat, corn, and rice. But it has never done anything comparable for healthy foods that are nutrient-dense and low-carb. This promotion of high-yield crops with industrialized farming has denatured the soil and the food grown on it. That is doubly problematic, since these high-carb staples are low in nutrient density even when grown on healthy soil.

This mentality of obsessing over food as calories is severely dysfunctional. It ignores the human reality of how our bodies function. And it ignores widespread human experience. Calorie-restricted diets are well known to have one of the lowest rates of compliance and success. It doesn’t matter how many or how few calories you try to eat, as long as the food you are eating is of such low quality: your hunger and cravings will drive you as your body seeks nutrition.

As I’ve eaten more nutrient-dense foods as part of a diet that is ketogenic and paleo, my hunger has decreased and my cravings have disappeared. I certainly don’t consume more calories than before, and possibly far fewer, not that I’m counting. I no longer overeat, and I find fasting easy. Maybe so many people eat themselves fat because the food system produces mostly empty calories and processed carbs. It’s what’s available and cheapest, and the food industry is brilliant at making its products as addictive as possible. The average person in our society is endlessly hungry while their body is not getting what it needs. It’s a vicious cycle of decline.

I remember how I was for most of my life until quite recently, with decades as a sugar addict and a junk food junkie. I was always hungry and always snacking. Carbs and sugar kept my blood sugar and serotonin levels on a constant roller coaster ride of highs and lows, and wrecked my physical and mental health in the process. It wasn’t a happy state. And if anyone had told me, in my deepest and darkest depressive funk, to count and restrict my calories, it would not have helped. What I needed was more of the right kinds of calories, those filled with healthy fats and fat-soluble vitamins along with so much else. My body was starving from malnourishment even when I was overeating and, despite regular exercise, eventually gaining weight.

We don’t need to grow more food to feed the world; we need to grow better food to nourish everyone at least to a basic level, considering how many diseases even in rich countries are caused by nutrient deficiencies (e.g., Dr. Terry Wahls reversed multiple sclerosis symptoms in herself, in her patients, and in clinical subjects by increasing nutrient density). The same amount of food produced, if nutrient-dense, could feed many more people. We already have enough food and will continue to have enough food for the foreseeable future. Equal and fair distribution of food is a separate issue. The problem isn’t producing a greater quantity; what we desperately need is greater quality. But that is difficult, because our industrial farming has harmed the health of the soil and denatured our food supply.

The U.S. government pays some farmers to not grow anything because the market is flooded with too much food. At the same time, the U.S. government pays other farmers to grow more of crops like corn, something I know from living in Iowa, the corn capital of the world. Subsidizing the production of processed carbs and high-fructose corn syrup is sickening and killing us, even setting aside the problems with ethanol. Just as important, it also wastes limited resources that could be used in better ways.

We have become disconnected in so many ways. Scientific research and government policies disconnected from human health. An entire civilization disconnected from the earth we depend upon. And the modern mind disconnected from our own bodies, to the point of being alienated from what should be the most natural thing in the world, that of eating. When we are driven by cravings, our bodies are seeking something essential and needed. There is a good reason we’re attracted to things that taste sweet, salty, and fatty/oily. In natural whole foods, these flavors indicate something is nutrient-dense. But we fool the body by eating nutrient-deficient processed foods grown on poor soil. And then we create dietary ideologies that tell us this is normal.

What if we could feed more people with less land? And what if we could do so in a way that brought optimal and sustainable health to individuals, society, and the earth? Now that would be a food revolution worthy of the name!

* * *

The global food problem isn’t what you think
by Gerald C. Nelson 

Here’s what we found:

Under even the worst conditions, there will be enough food, if we define “enough” as meaning sufficient calories, on average, for everyone — with 2,000 calories per day as the standard requirement. . . [T]he post-World War II Green Revolution efforts to boost the productivity of staples such as wheat and rice have been so successful that we are now awash in carbohydrates. And because so much has already been invested in improving the productivity of these crops, solid yield gains will likely continue for the next few decades. The productivity enhancements have also made them more affordable relative to other foods that provide more of the other needed nutrients.

Our success with carbohydrates, however, has had a serious downside: a worldwide plague of obesity, diabetes and other diet-related diseases. The World Health Organization reports that in 2014, there were 462 million underweight adults worldwide but more than 600 million who were obese — nearly two-thirds of them in developing countries. And childhood obesity is rising much faster in poorer countries than in richer ones.

Meanwhile, micronutrient shortages such as Vitamin A deficiency are already causing blindness in somewhere between 250,000 and 500,000 children a year and killing half of them within 12 months of them losing their sight. Dietary shortages of iron, zinc, iodine and folate all have devastating health effects.

These statistics point to the need for more emphasis on nutrients other than carbohydrates in our diets. And in this area, our findings are not reassuring.