A Century of the Obesity Epidemic

Why We Get Fat
by Gary Taubes

In 1934, a young German pediatrician named Hilde Bruch moved to America, settled in New York City, and was “startled,” as she later wrote, by the number of fat children she saw—“really fat ones, not only in clinics, but on the streets and subways, and in schools.” Indeed, fat children in New York were so conspicuous that other European immigrants would ask Bruch about it, assuming that she would have an answer. What is the matter with American children? they would ask. Why are they so bloated and blown up? Many would say they’d never seen so many children in such a state.

Today we hear such questions all the time, or we ask them ourselves, with the continual reminders that we are in the midst of an epidemic of obesity (as is the entire developed world). Similar questions are asked about fat adults. Why are they so bloated and blown up? Or you might ask yourself: Why am I?

But this was New York City in the mid-1930s. This was two decades before the first Kentucky Fried Chicken and McDonald’s franchises, when fast food as we know it today was born. This was half a century before supersizing and high-fructose corn syrup. More to the point, 1934 was the depths of the Great Depression, an era of soup kitchens, bread lines, and unprecedented unemployment. One in every four workers in the United States was unemployed. Six out of every ten Americans were living in poverty. In New York City, where Bruch and her fellow immigrants were astonished by the adiposity of the local children, one in four children was said to be malnourished. How could this be?

Fat in the Fifties
by Nicolas Rasmussen

Obesity burst into the public consciousness in the years immediately following the Second World War. Around 1950, the US Public Health Service (PHS) issued a brochure on “the greatest problem in preventive medicine in the USA”: obesity. The life insurance industry, working in collaboration with the PHS and the American Medical Association (AMA), launched a national drive, proclaiming “Overweight: America’s No. 1 Health Problem.” And no wonder, given that insurance company data and some local health surveys suggested that more than a quarter of the American population was significantly overweight or obese. By the typical measure of the day, anyone 10 percent above the “ideal weight” for a given height fell into the category of overweight—the ideal weight being that which the insurance industry found to predict maximum longevity. Those 20 percent overweight were classified as obese. The danger of excess weight was grave, because it was the leading predictor of heart disease, the nation’s top killer. […]

Stroke, cancer, and, most of all, heart disease leaped to the forefront as causes of death. By 1920 heart disease had taken the lead as the top cause of death; by the end of the decade, based mainly on evidence developed by Dublin and other insurance industry statisticians, health policy analysts came to believe that heart disease was also catching up with tuberculosis in terms of its total financial burden on the nation (despite the fact that heart disease tended to kill its victims later in their wage-earning years). Imposing double the economic burden of cancer, which would soon become the second greatest cause of death, heart disease had unquestionably become Public Health Enemy Number 1 by 1930. […] The [early 20th century] findings indicated a clear association between overweight and excess mortality. […] In 1930, Louis Dublin used this type of information as the basis for a groundbreaking actuarial study that specifically correlated overweight with heart disease.
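
To make the arithmetic of the definition quoted above concrete, here is a minimal sketch in Python. The thresholds are the ones Rasmussen describes; the ideal-weight figure in the example is hypothetical, since the actual insurance tables varied by height, sex, and frame size.

```python
# A sketch of the midcentury insurance-industry classification:
# 10% or more above "ideal weight" = overweight, 20% or more = obese.
# The example ideal weight is made up; real tables varied by height,
# sex, and frame size.

def classify_midcentury(weight_lbs: float, ideal_lbs: float) -> str:
    percent_over = (weight_lbs - ideal_lbs) / ideal_lbs * 100
    if percent_over >= 20:
        return "obese"
    if percent_over >= 10:
        return "overweight"
    return "not overweight"

# Hypothetical ideal weight of 150 lbs for some given height:
print(classify_midcentury(170, 150))  # 13.3% over -> "overweight"
print(classify_midcentury(185, 150))  # 23.3% over -> "obese"
```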

Millennials Are Hitting Old Age In Their Thirties

There is a comedy sketch, This is Your Brain After Thirty, from the group It’s a Southern Thing. It is a parody of a pharmaceutical commercial. And the target audience is Millennials who are now feeling the evidence of growing older. The voiceover begins, “Are you in your 30s? You may not feel old. But you don’t exactly feel young, either.” Then it presents three characters with their symptoms:

  • Person 1: “Sometimes I walk into a room and completely forget what I walked in there for.”
  • Person 2: “I can’t remember my own phone number. And I’ve had the same number for ten years.”
  • Person 3: “I know I had supper last night. I clearly don’t skip meals. But for the life of me, I can’t remember what I ate.”

The voiceover continues with the official diagnosis. “Then you might be suffering from Thirties Brain.” There is nothing quite as comforting as having a label. That explains everything. That’s just what happens when one reaches old age in one’s thirties. Yeah, that’s completely normal. Don’t worry, though. “It’s not your fault,” reassures the voice of authority. More info is then offered about it:

“It’s a common condition that affects millions of people. People who are old enough to take their 401(k) seriously, but not quite old enough to enjoy eating at Golden Corral. It’s not your fault. Your brain is too full of useless knowledge, now. Why remember your own phone number, when you could retain every word of the 2001 hit “Drops of Jupiter” by Train? Thirties Brain can make even the most simple conversations feel exhausting. But as soon as it feels like you can think clearly again, your brain stops working again. If this sounds like you or someone you love, then ask your doctor about our new twice-a-day…”

Of course, this is just comedy, but it’s funny for the very reason so many can relate to the experience. In becoming part of popular culture, it’s being normalized. That is rather sad when one thinks about it. Should we really be normalizing early-onset neurocognitive decline? What they are now jokingly calling “Thirties Brain” would not long ago have been called “Fifties Brain” or “Sixties Brain”. Indeed, many serious health conditions like Alzheimer’s used to be entirely identified with old age and now are increasingly being diagnosed among the young (when we were kids, Alzheimer’s would sometimes be called Old Timer’s disease). The same is true of type II diabetes, which originally was called adult-onset diabetes because adulthood was typically the age of diagnosis. These conditions are part of metabolic syndrome or metabolic dysfunction, which involves insulin resistance as a key component.

Also common in metabolic syndrome is obesity. It instantly stood out that the actors in the parody commercial were all quite overweight, to the point of being obese. Yet obesity also has been normalized, particularly in the South where it’s rampant. Obesity involves inflammation throughout the body; inflammation is likewise seen in the brain with Alzheimer’s (along with depression, etc.); and inflammation is related to autoimmune disorders, from multiple sclerosis to rheumatoid arthritis. Body fat is an organ, like the liver, spleen, or thyroid. And, in particular, body fat is key to the functioning of the hormone system. Hormones like insulin don’t only regulate appetite and glucose but also a number of other interlinked systems in the body. That is why metabolic syndrome can manifest as numerous health conditions and diseases. And that is why metabolic syndrome is the main comorbidity of COVID-19 and other infectious diseases.

If you’re experiencing “Thirties Brain”, you should take that as a serious symptom to be worried about. It’s an early sign of health decline that is only going to get worse, unless you change your diet and lifestyle. People typically have metabolic syndrome years or even decades before finally being diagnosed with a disease that doctors recognize, something like diabetes or cardiovascular disease. But it can often be easily reversed, particularly if caught early. Unfortunately, few Americans realize that this is a public health crisis, and one that is entirely preventable. Many experts have predicted that healthcare costs are going to continue to skyrocket, as healthcare eats up more of the national GDP and causes widespread medical debt.

This could end up an existential crisis for our society. Something similar happened with the World War II draft, when the United States military suddenly realized how many young men were severely malnourished: “40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty” (Stephen Yafa, Grain of Truth, p. 17; quoted in Malnourished Americans). After the war, there was a public campaign of nutritional fortification of food and meal programs in schools, along with official dietary recommendations. It was also the period when obesity was finally seen as a public health crisis (Nicolas Rasmussen, Fat in the Fifties: America’s First Obesity Crisis).

At present, the military is once again acknowledging that this is a serious problem (Obese Military?). By law, the U.S. military is required to serve food that conforms to the U.S. dietary guidelines. Yet, despite military personnel having high levels of exercise, obesity is increasing in the military as well. As research has shown, even when caloric intake and exercise are controlled for, the standard American diet (SAD) is obesogenic (Americans Fatter at Same Level of Food Intake and Exercise). But, on a positive note, the military is beginning to recognize the cause of the problem. It has traced the link to the diet soldiers are being fed. And research on soldiers has shown that a ketogenic diet helps with fat loss.

The U.S. military is forced to be so honest because it’s simply not an option to have obese soldiers, much less soldiers experiencing neurocognitive decline. It’s only a question of when other institutions of authority will catch up. There are signs that changes are already in the air (Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines; & American Diabetes Association Changes Its Tune). After decades of blaming saturated fat, it’s becoming clear that the real culprits are carbohydrates and industrial seed oils. Other factors are involved in the general health crisis as well, such as possibly hormone mimics that are stunting male development (Real Issues Behind Regressive Identity Politics), but that diverges from the immediate topic at hand.

The fact is that consumption of saturated fat has been declining ever since industrial seed oils replaced animal fats as the main source of fatty acids in the American diet, back in the 1930s. Likewise, beef intake has dropped about as low as it was in the first half of the 20th century, after briefly peaking in the 1970s (Diet and Health, from the Johns Hopkins Center for a Livable Future). Meanwhile, what has risen in the American diet, besides industrial seed oils, is mostly plant foods: vegetables, fruits, fruit juices, soda pop, grains, rice, legumes, nuts, and seeds. The only animal foods that have seen a significant increase are fish and chicken, the two supposedly healthy meats. That is the modern SAD diet that has led to the sudden appearance of “Thirties Brain”. Welcome to the new normal!

To make a related point, this health decline can’t be blamed on a factor like behavior, no matter how much lifestyle is implicated as well; you can’t outrun a bad diet, as some say. The young generations have become quite health-conscious, but the health advice they’ve been given is simply wrong. Young adults are eating more supposedly healthy foods than people did in the past, including rising rates of plant-based diets: Mediterranean, vegetarianism, veganism, etc. Also, when younger, Millennials (and Generation Z) had lower rates of teen sexual activity, alcohol consumption, and drug use. As observed elsewhere, one could call them prudes (Rate of Young Sluts), or at least that used to be true. But something has definitely changed that is now affecting their behavior.

After living through a major recession and a global pandemic, we are now seeing a rise of behavioral health issues among younger Americans, with rising rates of self-medication, specifically alcohol and tobacco (Blue Cross Blue Shield Association study finds nearly one-third of millennials are affected by behavioral health conditions, Independence Blue Cross). Still, the rates of alcohol and tobacco consumption are now approximately the same as they were in the early 1900s, which were rather low compared to the later spike in the second half of the 20th century (graph from The Health Consequences of Smoking—50 Years of Progress: A Report of the Surgeon General; & Mona Chalabi, Dear Mona Followup: Where Do People Drink The Most Beer, Wine And Spirits?).

Some countries with more alcohol and tobacco usage than the US are nonetheless healthier (France, Germany, etc). Limiting ourselves to the US, consider the residents of Roseto, Pennsylvania, who were studied from 1954 to 1961. At the time, they were the healthiest population in the country, despite being quite fond of drinking and smoking, not to mention their love of processed meat and saturated fat like lard (Blue Zones Dietary Myth). So, a recent slight shift in drinking and smoking among Millennials also ends up being a non-explanation. It’s more likely a result of declining health than a cause, and hence the reason to describe it as self-medication. More generally, the addictive mindset isn’t limited to addictive substances; and, besides, drug use is nothing new (The Drugged Up Birth of Modernity).

Anyway, keep in mind that these Millennial rates of substance abuse are still lower than what was seen, for example, among Generation X, which had far fewer health problems at the same age, even though GenXers are the most lead-poisoned living generation. Something unique is going on at present, and it’s hard to explain with anything other than an ultra-processed diet high in carbs and industrial seed oils. Back when the first wave of GenXers hit their thirties in the mid-1990s, no one was talking about “Thirties Brain”. Neither did it come up with the prior generations. We complain about U.S. presidents of the Silent Generation (Donald Trump and Joe Biden) showing obvious neurocognitive decline in their seventies, but that is vastly different from decline in one’s thirties.

For further comparison, consider a discussion of health in terms of running. It was part of an argument that humans evolved for running. This is supported by the fact that persistence hunting (i.e., running game down) is one of the oldest and most widespread hunting techniques, as it requires almost no technology other than something to club or stab the animal to death after it collapses from heat exhaustion. The human body seems extremely well adapted to long-distance running, especially in heat; and this also seems closely linked to the predilection for ketosis (Human Adaptability and Health). What is relevant for our discussion here is that hunter-gatherers reach their peak aerobic health in their fifties. The average middle-aged hunter-gatherer can outrun the average eighteen-year-old hunter-gatherer. Up into old age, hunter-gatherers can keep up a fast pace with others who are much younger.

Think about how many middle-aged or older Americans could do the same. Unsurprisingly, hunter-gatherers likewise have very few of the diseases of civilization. Obesity, of course, is almost unheard of among them. They have what is called a long healthspan, where most people live healthily into old age and die suddenly, without lingering sickness or long periods of degeneration. In such a healthy society, they likely wouldn’t even understand the concept of “Thirties Brain”.

* * *

Some might think Millennials are being unfairly criticized. That is not the intention. This health decline hardly began in recent decades. Weston A. Price and others were talking about it in the early 1900s. There was even a growing debate about it in the century before that. Heck, all the way back in the 1700s, people were recommending specific medical diets for obesity and diabetes, as it was already being observed that they were becoming more common. The only difference is that we are finally hitting a point of extreme consequences, as diseases of old age are now prevalent among the young, sometimes in early childhood.

We write posts like this with genuine concern and compassion. We are not disinterested observers, much less see ourselves as standing above these problems with condescension. It’s all rather personal. Though relatively healthy in many ways, we have experienced serious neurocognitive and mental health issues since our own childhood. And we suspect we previously were suffering from metabolic syndrome, if not yet diagnosed with any particular disease. To be specific about the point made in the parody video, we have experienced our own equivalent of “Thirties Brain”, as we had a memory-related learning disability that was diagnosed in third grade. For our entire lives, we’ve struggled with memory recall.

So, personal concern underlies our public worries, magnified by the fact that our nieces and nephew span the generations of Millennials and GenZ, allowing us to observe firsthand the health issues involved. From our own experience, we know what it’s like to be addicted to carbs and to suffer the consequences. We know what it’s like to struggle with serious mental illness, specifically depression with suicidal ideation, since young adulthood. It saddens us immensely to think that large numbers of Millennials will begin having so many harsh problems this early in life. That is a plain shitty situation, and Millennials did nothing to deserve it. Like the rest of us, they were simply born into this society with its food system and dietary recommendations.

The majority of Millennials and other Americans have basically been doing what they were told is healthy. They don’t realize that what has been normalized should not be taken as normal, because very few of them have anything to compare it against. It’s not like most of us have ever lived among hunter-gatherers to realize how far human health has fallen. Even the traditional rural diet and lifestyle has mostly slipped from living memory. Certainly, hunting and fishing have become uncommon. Getting ultra-processed food from a grocery store or restaurant is simply what people do now.

* * *

44% of older millennials already have a chronic health condition. Here’s what that means for their futures
by Megan Leonhardt

Why insecure millennials are set for unhealthy middle age
by Greg Hurst

Gen X, Millennials in Worse Health Than Prior Generations at Same Age
by Amy Norton

Millennials less heart-healthy than Gen Xers at the same age
by Anicka Slachta

BCBSA: Millennials’ mental health is on the decline—and COVID-19 is making it worse
by Paige Minemyer

Millennials on Track to be Most Obese Generation in History
by Cathy Cassata

Diabetes’ Impact Is Rising Fastest Among Millennials
by Laura Entis

Study: Young adults with high cholesterol face greater risk of heart attack or stroke
by Ken Alltucker

The number of millennials with early-onset Alzheimer’s disease is surging, report finds
by Tracy Romero

Millennials may need to worry about autoimmune disease, right away
by Swedish Blogger

For millennials, cancers fueled by obesity are on rise, study says
by Sandee LaMotte

Study: Millennials’ Increased Risk for Some Obesity-Linked Cancers — 5 Takeaways
by Sandy McDowell

The coming of vegetables, fruits and key nutrients to the European diet
by V. J. Knapp

“On the basis of evidence now accumulating, vegetables and fruits were not always an integral part of the European diet. Prior to 1800, vegetables and fruits were not esteemed but rather looked down upon. It has only been over the past two centuries that these two critical foods have come into vogue. First, they had to be accepted by a growing number of medical men and observers. Then, once licensed as edible foods, vegetables and fruits, starting with the potato, actually did make their way into every man’s diet. And by the end of the nineteenth century, these rich sources of carotene and Vitamins A, C and E became so universal that Europeans now forgot that a hundred years earlier these foods had barely been consumed.”

What’s on your table? How America’s diet has changed over the decades
by Drew Desilver

What happens when you take public health advice to heart?
by Lena Zegher

Why are we fatter and sicker than ever? The graphs that explain how sugar, fruit juice and margarine are to blame
by Anna Hodgekiss

What fruits and vegetables looked like before
by Andreas Eenfeldt

Banana – before and after


Carrot – before and after


Watermelon – before and after


Americans Fatter at Same Level of Food Intake and Exercise

Americans, to state the obvious, are unhealthier with each passing generation. And the most obvious sign of this is the rising obesity rate. In one analysis, this was shown to be true even when controlling for levels of food intake and exercise (see article below). This is the kind of data that undermines conventional dietary advice based on Christian moralizing about the deadly sins of gluttony and sloth.

Heart attacks and obesity first became a public health concern in the 1940s and 1950s. That followed decades in which seed oils and margarine had mostly replaced lard in the American diet. We were told that saturated fat is dangerous and that seed oils were great for health. Americans were listening, and they strictly followed this advice. Even restaurants stopped cooking their french fries in tallow.

In particular, olive oil has been sold as the best. Why is olive oil supposed to be so healthy? Because it has monounsaturated fat, the same as is primarily found in lard. Not too long ago, the healthiest population in the United States was in Roseto, Pennsylvania. Guess what their main source of fat was? Lard. They also ate massive loads of meat, as do other long-lived populations in the world, such as in Hong Kong.

Red meat also decreased over that period and has continued to decrease since then. Dairy has followed this pattern of decline. Americans are eating less animal fat now than ever before in American history, probably less than ever in human existence. It’s true that Americans are eating more lean chicken and fish, but we were told those are healthy for us. Meanwhile, Americans are eating more fruits and vegetables, nuts and seeds than ever before.

Calories-in/calories-out has been an utter failure. It’s not how much we are eating but what we are eating. That then determines how our metabolism functions, whether it burns fat or stores it. Exercise is largely irrelevant for fat loss. Fat people can exercise all the time and not lose weight, while some skinny people hardly move at all. Another study “demonstrated that there is no difference in total energy expenditure between traditional hunter-gatherers, subsistence farmers and modern Westerners.”

One explanation is an increase of obesogens. These are chemicals that cause the body to create fat. In general, fat is where the body stores excess toxins it can’t eliminate. And indeed younger Americans are exposed to more toxins. This then makes losing weight hard, because all the stored toxins get released and make one feel like shit. It’s hard for the body to eliminate a lifetime of accumulated toxicity. On top of that, the young are prescribed more medications than ever before. Antidepressants and antipsychotics have been given out like candy to anyone with mild mental issues. What is a common side effect of these drugs? Yep, weight gain.

A third possibility is more complex. We know the gut microbiome has shrunk in number and diversity. It’s also changed in its profile of bacteria. Research is showing how important the microbiome is (see The Secret Life of Your Microbiome by Susan L. Prescott and Alan C. Logan). Toxins and drugs, by the way, also alter the microbiome. So does diet. Even if total calorie intake hasn’t changed much relative to the increased height of the population, what has changed is what we are eating.

In place of animal fats, we are eating not only more seed oils but also more carbs and sugar. Animal fats are highly satiating and so food companies realized they needed to find something equally satiating. It turns out a high-carb diet is not only satiating but addictive. It knocks people out of ketosis and causes them to put on weight. It doesn’t matter if one tries to eat less. In processed foods, when carbs are combined with seed oils, the body is forced to burn the carbs immediately and so it has no choice but to turn the seed oils into fat.

By the way, what alters metabolism also alters the microbiome. This is seen when people go from a high-carb diet to a ketogenic diet. Ketosis is powerful in its impact on how the body functions in so many ways, even changing epigenetic expression of genes. Here is the worst part. Those epigenetic changes have been happening for generations with the loss of regular ketosis. Even epigenetic markers for obesity, following an environmental trigger like famine, have been shown to pass across multiple generations. The microbiome, of course, also is inherited, and each of those bacteria likewise has an epigenome that determines its genetic expression.

Everything we do as individuals, good and bad, doesn’t only affect us as individuals. People are getting fatter now not only because of what they are doing differently but because of everything that was done by their parents, grandparents, and great-grandparents. As I’ve said before, even if we reversed all these changes instantly, as we are unlikely to do, it would still require generations to fully reverse the consequences.

* * *

Why It Was Easier to Be Skinny in the 1980s
by Olga Khazan

A study published recently in the journal Obesity Research & Clinical Practice found that it’s harder for adults today to maintain the same weight as those 20 to 30 years ago did, even at the same levels of food intake and exercise. […]

Just what those other changes might be, though, are still a matter of hypothesis. In an interview, Kuk proffered three different factors that might be making it harder for adults today to stay thin.

First, people are exposed to more chemicals that might be weight-gain inducing. Pesticides, flame retardants, and the substances in food packaging might all be altering our hormonal processes and tweaking the way our bodies put on and maintain weight.

Second, the use of prescription drugs has risen dramatically since the 1970s and ’80s. Prozac, the first blockbuster SSRI, came out in 1988. Antidepressants are now one of the most commonly prescribed drugs in the U.S., and many of them have been linked to weight gain.

Finally, Kuk and the other study authors think that the microbiomes of Americans might have somehow changed between the 1980s and now. It’s well known that some types of gut bacteria make a person more prone to weight gain and obesity. Americans are eating more meat than they were a few decades ago, and many animal products are treated with hormones and antibiotics in order to promote growth. All that meat might be changing gut bacteria in ways that are subtle, at first, but add up over time. Kuk believes that the proliferation of artificial sweeteners could also be playing a role.

Why Do Americans Keep Getting Fatter?
by Chris Bodenner

Notwithstanding the known errors of dietary assessment, it is interesting that we observe consistent trends over time in terms of how dietary intake relates with obesity and how this relationship has changed over time. This lends more confidence to our primary findings and suggests that there are either physiological changes in how diet relates with body weight or differences in how individuals are reporting their dietary intake over time. […]

[W]e observed that the BMI associated with a given leisure time physical activity frequency was still higher over time in men. This may be attributed to changes in non-leisure time physical activity such as reductions in occupational physical activity or increasing screen time. However, a study using doubly labelled water demonstrated that there is no difference in total energy expenditure between traditional hunter-gatherers, subsistence farmers and modern Westerners. Thus, numerous other factors in addition to energy intake and physical activity may be important to consider when trying to explain the rise in obesity, and should be further evaluated in future studies.

To Be Fat And Have Bread

The obsession with body fat is an interesting story. It didn’t begin a few generations ago but goes back centuries. But maybe that shouldn’t be surprising.

Those were the centuries of the colonial era, when the diet was transformed by the imperial trade in foreign foods. I might note that this included previously rare or never-before-seen varieties of fattening carbohydrates: sugar, potatoes, corn, rice, etc. The old feudal system was ending and entirely different forms of food production and diets were developing, especially for the now landless peasants. Hunting, gathering, and grazing for the commoners would have been on the decline for a while at that point, as the last of the commons had been privatized. The loss of access to wild game took longer in the colonies, but eventually it happened everywhere.

The last stage of that shift overlapped with the beginnings of industrialization and agricultural improvement. In the 19th century, wheat surpluses grew, and costs and prices accordingly fell. Agriculture boomed even as fewer people were employed in it. There was also a sudden obsession with gender roles and social roles in general, such as the post-revolutionary expectation that the mother make citizens out of her children. Bread-making, once an uncommon activity for Americans, became increasingly important to the normative identity of family life and the symbolic maintenance of the social order.

Regular consumption of wheat bread was once limited to the wealthy, and that is how refined bread gained its moral association with the refined class. Only the wealthy could afford wheat before the 19th century; the poor were forced to rely upon cheaper grains and grain substitutes, at a time when bread was regularly adulterated with bark, sawdust, chalk, etc. Poverty breads, in the previous centuries, often were made with no grain at all.* For wheat, and especially heavily refined white bread, to become available to all walks of life meant an upsurge of the civilizing process. The obsession with middle-class life took hold, and so cookbooks were produced in large numbers.

In a growing reactionary impulse, there was a nostalgic tendency toward invented traditions. Bread took on new meanings that were then projected onto the past. It wasn’t acknowledged how radical the industrial agriculture and industrial milling that made all of this possible actually were. The disconnection is demonstrated by the simultaneous promotion of the grain production of this industrial age and the complaint about how industrialized life was destroying all that was good. Bread, as a symbol, transcended these mere details.

With the aristocracy having been challenged during the Revolutionary Era, the refinement of the refined class, once admired, became suspect. The ideology of whole foods began to emerge and had some strong proponents. But by the end of the 1800s, the ideal of refinement gained prominence again and prepared the way for the following century of ever greater industrialization of processed foods. Refinement represented progress. Only after more extensive refinement led to mass malnourishment, near the end of that century and heading into the next, did whole foods once again capture the public imagination.

Then we enter the true era of fat obsession, fat blaming, and dieting, endless dieting. Eat your whole grains, get your fiber, make sure you get enough servings of fruits and veggies, and don’t forget to exercise. Calories in, calories out. Count your calories, count your carbs, count your steps. Count every last one of them. Still, the basic sides of the debate remain the same: fewer carbohydrates vs. less meat, whole foods vs. refined foods, barbaric lifestyle vs. civilizing process, individual moral failure vs. societal changes, etc. One theme that runs through dietary advice from the ancient world to the present is that there is a close link between physical health, mental health, and moral health, with the latter erupting as moral panic and moral hygiene. But what stands out about the modern era, beginning in the 1600s, is that psychological problems were observed mostly among the well-to-do.

This was often blamed on luxury and sometimes on meat (a complaint often about animals raised unnaturally in confinement and probably fed grain, an early equivalent of concerns about factory farming, but also a complaint about the introduction of foreign spices and the use of fancy sauces to make meat more appetizing). Still, an awareness was emerging that a high-carb diet might be playing a role, in that it was often noted that the morbidly obese ate lots of pastries, fruit pies, and such. The poor didn’t have much access to wheat and sugar before the 1800s, but the wealthy had plenty of such foods centuries earlier. Meat consumption didn’t change much during that era of colonial trade. What changed the most was the availability of starchy and sugary foods, and the wealthy consumed them in great proportions. Meat had always been a desirable food, going back to earliest hominid evolution. Modern agriculture and global trade, however, entirely transformed the human diet with the introduction of massive amounts of carbohydrates.

It’s strange that right from the beginning of the modern era there were those pushing for a vegetarian diet; not many, but their voices were being heard for the first time. Or maybe it wasn’t so strange. Prior to the modern era, a vegetarian diet so far north in Europe would have been impossible. It was mainly the elite promoting vegetarianism, as only they could afford a vegetarian diet year-round, buying expensive plant-based foods that were often shipped in from far away. Plant foods were expensive at the time, but they were available to those with plenty of money. During the Middle Ages and earlier, by contrast, vegetarianism was for the most part not an option for anyone, since the foods required for such a diet simply weren’t available enough to sustain life, certainly not in places like England or Germany.

There is another side to this that brings us back to the obsession with fat. It was only with the gradual increase of grain production that cattle could be fed grain, not only as additional feed in the winter but year-round. This is also what allowed the possibility of confining animals, rather than grazing them on fields. Grain surpluses weren’t consistent until the 19th century, but even before that grain production had been increasing. There were slow improvements in agriculture over the centuries. The rich could afford meat from grain-fed animals much earlier than the rest of the population, and it was highly sought after. That is because such meat is extremely fatty, creating those beautiful marbled steaks, pork chops, etc. (such fattiness, by the way, is a sign of metabolic syndrome in both animals and humans). Fat couldn’t have been a focus of debate before grain-fattened animals became common.

So, there is a reason that both wheat bread and fatty meat gained immense symbolic potency at the same time. Similarly, it was during this same era that vegetables became more common and gardens likewise became symbols of wealth, abundance, and the good life. Only the rich could afford to maintain large gardens because of the difficulty involved and immense time-consuming work required (see The Jane Austen Diet by Bryan Kozlowski**; also about the American diet before the 20th century, see The Big Fat Surprise by Nina Teicholz that I quote in Malnourished Americans). They represented the changed diet of modern civilization. They were either indicators of progress or decline, depending on one’s perspective. Prior to modernity, a diet had consisted to a much greater degree of foods that were gathered, hunted, trapped, and fished.

The shift from one source of food to another changed the diet and so changed the debate about diet. There suddenly were more food options available to argue about. Diet as a concept was being more fully formulated. Rather than being something inherited according to the traditional constraints of local food systems and customs, a diet, assuming one had the wealth, could be picked from a variety of possibilities. Even to this day, the obsession with dieting carries a taint of class privilege. It is, as they say, a first-world problem. But what is fascinating is how this way of thinking took hold in the 1600s and 1700s. There was a modern revolution in dietary thought in the generations before modern political revolution. The old order was falling apart and sometimes actively being dismantled. This created much anxiety and forced the individual into a state of uncertainty. Old wisdom could no longer be relied upon.

* * *

*Rather than bread, the food most associated with the laboring class was fish, a food the wealthy avoided. Think about how lobster and clams used to be poverty foods. In the Galenic theory of humoral physiology, fish is considered cold and wet, hard to digest and weakening. This same humoral category also included fruits and vegetables. This might be why, even to this day, many vegetarians and vegans will make an exception for fish, seeing it as different from ‘meat’. This is an old ideological bias, because ‘meat’ was believed to have the complete opposite effect, being hot and dry, easy to digest and invigorating. This is the reason why meat but not fish was often banned during religious fasts and festivals.

As an interesting side note, the supposed cooling effect of fish was a reason for not eating it during the cold times of the year. Fish is one of the highest sources of vitamin A. Another source is beta-carotene, the precursor found in vegetables. That these two types of food were considered of the same variety in Galenic thought is interesting. Cold weather is one of the factors that can disrupt the body’s ability to convert beta-carotene into usable vitamin A. The theory of humors gets this slightly mixed up, but it maybe points to something important that was dimly understood. Eating more meat, rather than vegetables, in winter is a wise practice in a traditional society that can’t supplement such nutrients. Vitamin A is key for maintaining a strong immune system and handling stress (True Vitamin A For Health And Happiness).

By the way, it was during the 19th century that a discussion finally arose about vegetarianism. The question was whether life and health could be sustained on vegetables. Then again, those involved were probably still being influenced by Galenic thought. By vegetarianism, they likely meant a more general plant-based diet that excluded ‘meat’ but not necessarily fish. The context of the debate was the religious abstinence of Lent, during which fish was allowed. So, maybe the fundamental argument was more about the possibility of long-term survival solely on moist, cooling foods. Whatever the exact point of contention, it was the first time in the modern Western world that a plant-based diet (be it vegan, vegetarian, or a pescetarian-style Mediterranean diet) was seriously considered.

These ideas have been inherited by us, even though their philosophical justifications no longer make sense to us. This is seen in the debate that continues over red meat in particular and meat in general, specifically in terms of the originally Galenic assertion that its heat and dryness build up the ‘blood’ (High vs Low Protein). It’s funny that dietary debates remain obsessed with red meat (along with the related issue of cows and their farts), even though actual consumption of red meat has declined over the past century. As with bread, the symbolic value of red meat has maybe even gained greater importance. Similarly, as I mentioned above, the categorization of fish remains hazy. I know a vegan who doesn’t eat ‘meat’ but does eat fish. When I noted how odd that was, a vegetarian I was talking to thought it made perfect sense. This is Galenic thought without the Galenic theory that at least made it a rational position; the ideological bias remains even though those adhering to it are unable to explain why they hold it. It amuses me.

Ideologies are powerful systems. They are mind viruses that can survive and mutate across centuries and sometimes millennia. Most of the time, their origins are lost to history. But sometimes we are able to trace them and it makes for strange material to study.

See: “Fish in Renaissance Dietary Theory” by Ken Albala, from Fish: Food from the Waters, ed. by Harlan Walker; and Food and Faith in Christian Culture, ed. by Ken Albala and Trudy Eden. Also, read the texts below, such as the discussion of vegetarianism.

* * *

(Both texts below are from collections that are freely available on Google Books and possibly elsewhere.)

The Fat of the Land: Proceedings of the Oxford Symposium on Food and Cooking 2002
ed. by Harlan Walker
“The Apparition of Fat in Western Nutritional Theory”
by Ken Albala

Naturally dietary systems of the past had different goals in mind when framing their recommendations. They had different conceptions of the good, and at some point in history that came to include not being fat. Body size then became an official concern for dietary writers. Whether the original impetus for this change was a matter of fashion, spirituality or has its roots in a different approach to science is impossible to say with any degree of precision. But this paper will argue that nutritional science itself as reformulated in the 17th century was largely to blame for the introduction of fat into the discourse about how health should be defined. […] Obesity is a pathological state according to modern nutritional science. But it was not always so.

When and why fat became a medical issue has been a topic of concern among contemporary scholars. Some studies, such as Peter N. Stearns’ Fat History: Bodies and Beauty in the Modern West, place the origin of our modern obsession in the late 19th century when the rise of nutritional science and health movements led by figures like John Harvey Kellogg, hand in hand with modern advertising and Gibson Girls, swept away the Victorian preference for fulsome figures. As a form of social protest, those who could afford to, much as in the 60s, idealized the slim androgynous figure we associate with flappers. Others push the origin further back into the early 19th century, in the age of Muscular Christianity and Sylvester Graham. But clearly the obsession is earlier than this. In the 18th century the 448-pound physician George Cheyne and his miracle dieting had people flocking to try out the latest ‘cures.’ It was at the same time that dissertations on the topic of obesity became popular, and clearly the medical profession had classified this as a treatable condition. And readers had already been trained to monitor and police their own bodies for signs of impending corpulence. The roots of this fear and guilt must lie somewhere in the previous century as nutritional science was still groping its way through a myriad of chemical and mechanical theories attempting to quantify health and nutrition with empirical research.

The 17th century is also the ideal place to look if only because the earlier system of humoral physiology is almost totally devoid of a concept of fat as a sickness. […]

For all authors in the Galenic tradition it appears that fat was seen as a natural consequence of a complexion tending to the cold and moist, something which could be corrected, but not considered an illness that demanded serious attention. And socially there does not seem to have been any specific stigma attached to fat if Rubens’ taste in flesh is any measure.

The issue of fat really only emerges among authors who have abandoned, in part or totally, the system of humoral physiology. This seems to have something to do with both the new attempts to quantify nutrition, first and most famously by Santorio Santorio, and also among those who began to see digestion and nutrition as chemical reactions which when gone awry cast fatty deposits throughout the body. It was only then that fat came to be considered a kind of sickness to be treated with therapy.

The earliest indications that fat was beginning to be seen as a medical problem are found in the work of the first dietary writer who systematically weighed himself. Although Santorio does not seem to have been anxious about being overweight himself, he did consistently define health as the maintenance of body weight. Expanding on the rather vague concept of insensible perspiration used by Galenic authors, Santorio sought to precisely measure the amount of food he consumed each day compared to the amount excreted in ‘sensible’ evacuations. […] Still, fat was not a matter of eating too much. ‘He who eats more than he can digest, is nourished less than he ought to be, and [becomes] consequently emaciated.’ More importantly, fat was a sign of a system in disarray. […]

Food was not in fact the only factor Santorio or his followers took into account though. As before, the amount of exercise one gets, baths, air quality, even emotions could alter the metabolic rate. But now, the effect of all these could be precisely calculated. […]

At the same time that these mechanistic conceptions of nutrition became mainstream, a chemical understanding of how food is broken down by means of acids and alkalis also came to be accepted by the medical profession. These ideas ultimately harked back to Paracelsus writing in the 16th century but were elaborated upon by 17th century writers […] It is clear that by the early 18th century fat could be seen as a physiological defect that could be corrected by heating the body to facilitate digestive fermentation and the passage of insensible perspiration. […] Although the theories themselves are obviously nothing like our own, we are much closer to the idea of fat as a medical condition. […]

Where Cheyne departs from conventional medical opinion, is in his recommendation of a cooked vegetable diet to counter the effects of a disordered system, which he admits is rooted in his own ‘experience and observation on my own crazy carcase and the infirmities of others I have treated’ rather than on any theoretical foundation.

The controversy over whether vegetables could be considered a proper diet, not only for the sick or overgrown but for healthy individuals, was of great concern in the 18th century. Nicholas Andry in his Traité des alimens de caresme offered an extended diatribe against the very notion that vegetables could sustain life, a question of particular importance in Catholic France where Lenten restrictions were still in force, at least officially. […] According to current medical theory, vegetables could not be suitable for weight loss, despite the successful results of the empirics. […]

It is clear that authors had a number of potentially conflicting theoretical models to draw from and both mechanical and chemical explanations could be used to explain why fat accumulates in the body. Yet with entirely different conceptual tools, these authors arrived at dietary goals surprisingly like our own, and equally as contentious. The ultimate goals now became avoiding disease and fat, and living a long life. While it would be difficult to prove that these dietary authors had any major impact beyond the wealthy elites and professionals who read their works, it is clear that a concern over fat was firmly in place by the mid 18th century, and appears to have its roots in a new conception of physiology which not only paid close attention to body weight as an index of health, but increasingly saw fat as a medical condition.

Food and Morality: Proceedings of the Oxford Symposium on Food and Cookery 2007
ed. by Susan R. Friedland
“Moral Fiber: Bread in Nineteenth-Century America”

by Mark McWilliams

From Sarah Josepha Hale, who claimed, ‘the more perfect the bread, the more perfect the lady’ to Sylvester Graham, who insisted, ‘the wife, the mother only’ has the ‘moral sensibility’ required to bake good bread for her family, bread often became a gendered moral marker in nineteenth-century American culture. Of course, what Hale and Graham considered ‘good’ bread differed dramatically, and exactly what constituted ‘good’ bread was much contested. Amidst technological change that made white flour more widely available and home cooking more predictable, bread, described in increasingly explicit moral terms, became the leading symbol of a housewife’s care for her family.

Americans were hardly the first to ascribe moral meaning to their daily bread. As Bernard Dupaigne writes, ‘since time immemorial [bread] has attended the great events of various human communities: monsoon or grape harvest bread, the blessed bread of Catholics or the unleavened bread of Passover, or the fasting-break bread of Ramadan. There is no bread that does not, somewhere in the world, celebrate an agricultural or religious holiday, enrich a family event, or commemorate the dead.’ With such varied symbolic resonance, bread seems easily filled with new meanings.

In America (as later in France), bread became a revolutionary symbol. To the early English colonists’ dismay, European wheat did not adapt well to the North American climate; the shift to corn as the primary grain was perhaps the most important dietary adaptation made by the colonists. Wheat remained too expensive for common consumption well into the nineteenth century. […]

By the end of the Revolution, then, bread was already charged with moral meaning in the young United States. In the nineteenth century, this meaning shifted in response to agricultural improvements that made wheat more widely available, technological change that made bread easier to make consistently, and, perhaps most important, social change that made good bread the primary symbol of a housewife’s care for her family. In effect, bread suffered a kind of identity crisis that paralleled the national identity crisis of Jacksonian America. As Americans thought seriously about who they were in this new nation, about how they should act and even how they should eat, bread’s symbolic meaning – and bread itself– changed.

American agricultural production exploded, although the proportion of the population working on farms declined. James Trager notes that even before the McCormick reaper first sold in large numbers as farmers struggled to replace workers leaving for the 1849 Gold Rush, the average time required to produce a bushel of wheat declined 22 per cent from 1831 to 1840. Dramatic improvements in efficiency led to larger yields; for example, wheat production more than doubled between 1840 and 1860. Such increases in wheat production, combined with better milling procedures, made white flour finally available in quantities sufficient for white bread to become more than a luxury good.

Even as wheat became easier to find for many Americans, bread remained notoriously difficult to make, or at least to make well. Lydia Maria Child, a baker’s daughter who became one of America’s leading writers, emphasizes what must have been the intensely frustrating difficulty of learning to cook in the era before predictable heat sources, standardized measurements, and consistent ingredients. […]

Unlike Hale, who implies that learning to bake better can be a kind of self improvement, this passage works more as dire warning to those not yet making the proper daily bread. Though bread becomes the main distinction between the civilized and the savage, Beecher turns quickly, and reassuringly, to the science of her day: ‘By lightness is meant simply that in order to facilitate digestion the particles are to be separated from each other by little holes or air-cells; and all the different methods of making light bread are neither more nor less than the formation of bread with these air cells’ (170). She then carefully describes how to produce the desired lightness in bread, instructions which must have been welcome to the young housewife now fully convinced of her bread’s moral importance.

The path for Beecher, Hale, and others had been prepared by Sylvester Graham, although he is little mentioned in their work. In his campaign to improve bread, Graham’s rhetoric ‘romanticized the life of the traditional household’ in ways that ‘unknowingly helped prepare women to find a new role as guardians of domestic virtue,’ as Stephen Nissenbaum notes. Bread was only one aspect of Graham’s program to educate Americans on what he called ‘the Science of Human Life.’ Believing on the one hand, unlike many at the time, that overstimulation caused debility and, on the other, that industrialization and commercialization were debasing modern life, Graham proposed a lifestyle based around strict controls on diet and sexuality. While Graham promoted a range of activities from vegetarianism to temperance, his emphasis on good bread was most influential. […]

And yet modern conditions make such bread difficult to produce. Each stage of the process is corrupted, according to Graham. Rather than grow wheat in ‘a pure virgin soil’ required for the best grain, farmers employ fields ‘exhausted by tillage, and debauched by the means which man uses to enrich and stimulate it.’ As Nissenbaum notes, the ‘conscious sexual connotations’ of Graham’s language here is typical of his larger system, but the language also begins to point to the moral dimensions of good bread (6).

Similarly loaded language marks Graham’s condemnation of bakery bread. Graham echoed the common complaints about adulteration by commercial bakers. But he added a unique twist: even the best bakery bread was doubly flawed. The flour itself was inferior because it was over-processed, according to Graham: the ‘superfine flour’ required for white bread ‘is always far less wholesome, in any and every situation of life, than that which is made of wheaten meal which contains all the natural properties of the grain.’ […]

As Nissenbaum argues, pointing to this passage, Graham’s claims invoke ‘the vision of a domestic idyll, of a mother nursing her family with bread and affection’ (8). Such a vision clearly anticipates the emphasis on cookery as measure of a woman’s social worth in the domestic rhetoric that came so to characterize the mid-nineteenth century.

Such language increasingly linking cookery with morality emphasized the virtue not of the food itself but rather of the cooks preparing it. This linkage reached readers not only through the explosion of cookbooks and domestic manuals but also through the growing numbers of sentimental novels. Indeed, this linkage provided a tremendously useful trope for authors seeking a shorthand to define their fictional characters. And that trope, in turn, helped expand the popularity of interpreting cookery in moral terms. […]

After the Civil War, domestic rhetoric evolved away from its roots in the wholesome foods of the nation’s past toward the ever-more refined cuisine of the Gilded Age. Graham’s refusal to evolve in this direction – his system was based entirely in a nostalgic struggle against modernity, against refinement – may well be a large part of why his work was quickly left behind even by those for whom it had paved the way.

* * *

Here is another text I came across. It’s not free, but it seems like a good survey worth buying.

What Caused the Rise in Bowel Cancer Rates?

Charlie Spedding
We are told red meat causes bowel cancer. Today @thetimes reports on surge in colon cancer among the young. But young people are eating less meat. How does @WHO explain that?

Louise Stephen
Fake news – there is big money behind the drive to get people off red meat and onto replacement products such as Beyond Meat.

Frédéric Leroy
🤔 Mmm.

Tim Noakes
just possibly, cancer might have nutritional basis. Which seems at least an outside possibility since cancer is modern disease found rarely in peoples eating their traditional diets.

Guðmundur Jóhannsson
“Hyperinsulinemia appears to be a consistent marker of enhanced colon cancer risk.”
The Role of Obesity and Related Metabolic Disturbances in Cancers of the Colon, Prostate, and Pancreas
by Edward Giovannucci & Dominique Michaud

Guðmundur Jóhannsson
Hyperinsulinemia & colon cancer. Prospective cohort study of 14,275 women:
“For colon cancer alone (75 case subjects and 146 control subjects), ORs increased up to 3.96 (95% CI = 1.49-10.50; P(trend) < .001) for the highest versus the lowest quintiles.”
Serum C-Peptide, Insulin-Like Growth Factor (IGF)-I, IGF-Binding Proteins, and Colorectal Cancer Risk in Women
by Rudolf Kaaks et al

Fat is our Friend
“Leading a Western lifestyle, being overweight, and being sedentary are associated with an increased risk of colorectal cancer”… but I thought it was mostly down to red meat.😉

Guðmundur Jóhannsson
Yes, because it rots in your colon… obviously
Does Meat Rot In Your Colon? No. What Does? Beans, Grains, and Vegetables!
by J. Stanton

Guðmundur Jóhannsson
“A high-fiber diet and increased frequency of bowel movements are associated with greater, rather than lower, prevalence of diverticulosis.”
A High-Fiber Diet Does Not Protect Against Asymptomatic Diverticulosis
by Anne F. Peery et al

Tim Noakes
Is diverticulosis related in any way to bowel cancer? Recall that rise in colon cancer has occurred at same time that unproven Burkitt/Trowell hypothesis has been accepted as dogma. BT hypothesis holds that absence of dietary fibre causes colon cancer. So prevention = more fibre.

Guðmundur Jóhannsson
“There is no direct evidence of an effect of dietary fiber on colon cancer incidence… In a trial of ispaghula husk fiber, the intervention group actually had significantly more recurrent adenomas after 3 years”
Does a high-fiber diet prevent colon cancer in at-risk patients?
by Linda French, MD & Susan Kendall, PhD

Harold Quinn
If, as seems likely, colonic carcinoma is significantly pathogenically driven, then more “prebiotic” might be expected to be carcinogenic in the dysbiotic gut but potentially anti-cancer in a situation of eubiosis. Seeking some ubiquitous impact of fibre for all seems unwise.

Dr. Ann
Interesting given bowel cancer may be highest in groups most likely to ingest plant fiber, at least if this study is to be believed
Vegetarians Have Fewer Cancers But Higher Risk Of Colorectal Cancer, Study
by Catharine Paddock PhD

Sydney
Did they study seed oils?

Joseph Emmanuel
“Elementary, my dear Watson” … it’s a paradox ‘of course’ 😉 at least in nutrition epidemiology

What causes health?

What causes health? It’s such a simple question, yet the answer is complex. The causes are many, and the direction of causality is not always clear. But there is one particular challenge to dietary ideology that shifts our whole way of thinking. It has to do with energy and motivation.

The calories-in/calories-out (CICO) theory is obviously false (Caloric Confusion; and Fung, The Evidence for Caloric Restriction). Dr. Jason Fung calls it the CRaP theory (Caloric Reduction as Primary). Studies show there is a metabolic advantage to low-carb diets (Cara B. Ebbeling, Effects of a low carbohydrate diet on energy expenditure during weight loss maintenance: randomized trial), especially ketogenic diets. Such a diet alters your entire metabolism and endocrine system. Remember that insulin is a hormone that has much to do with hunger signaling, and many other hormones are involved as well. This also alters how calories are processed and used in the body. More exercise won’t necessarily do any good as long as nothing else is changed. The standard American diet is fattening, and the standard American lifestyle makes it hard to lose that fat. Even starving yourself won’t help. The body seeks to limit energy use and maintain energy stores, especially when it is under stress (NYU Langone, Researchers Identify Mechanism that May Drive Obesity Epidemic). All that caloric restriction does is slow down metabolism, the opposite of what happens under carbohydrate restriction.
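To make that slowdown concrete, here is a minimal toy simulation in Python. This is my own illustrative sketch, not anyone’s published model: the 3,500 kcal/lb figure is the usual rule of thumb, and the adaptation rate is an assumed parameter chosen only to show the shape of the effect.

```python
# Toy model of adaptive thermogenesis: total daily energy expenditure (TDEE)
# drifts toward intake, so a fixed calorie cut loses its effect over time.
# All parameters are illustrative assumptions, not measured values.

KCAL_PER_LB = 3500   # rule-of-thumb energy content of a pound of body fat
ADAPT_RATE = 0.02    # assumed: fraction of the intake-TDEE gap closed per day

def simulate(intake, tdee, weight, days):
    for _ in range(days):
        weight += (intake - tdee) / KCAL_PER_LB  # deficit burned, surplus stored
        tdee += ADAPT_RATE * (intake - tdee)     # metabolism adjusts toward intake
    return weight, tdee

# Cut from a 2,500 kcal/day maintenance level to 1,800 kcal/day:
weight, tdee = simulate(intake=1800, tdee=2500, weight=220, days=180)
print(f"after 180 days: {weight:.0f} lbs, TDEE {tdee:.0f} kcal/day")
```

Naive CICO arithmetic says a 700 kcal/day cut should strip off about 73 lbs in a year (700 × 365 / 3,500). In the sketch, expenditure drifts down to meet intake, the deficit decays toward zero, and the total loss settles near 10 lbs. The numbers are made up; the plateau shape is the point.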

We associate obesity with disease, and rightly so, but that isn’t to say obesity is the primary cause. It too is a symptom or, in some cases, even a protective measure (Coping Mechanisms of Health). The body isn’t stupid. Everything the body does serves a purpose, even if that purpose is making the best of a bad situation. Consider depression. One theory proposes that when something is wrong we seek seclusion in order to avoid further risks and stressors and to figure out the cause of the distress, hence the isolation and rumination of depression. It’s similar to why we lie in bed when sick: to let the body heal. And it should be noted that depression is a symptom of numerous health conditions and often indicates inflammation in the brain (an immune response). Insulin resistance related to obesity can also involve inflammation. When the cause of the problem is permanent, the symptoms (depression, obesity, etc.) become permanent. The symptoms then become problems in their own right.

This is personal for me. I spent decades in severe depression. And during that time my health was worsening, despite my struggling to do what was right. I went to therapists and took antidepressants. I tried to improve my diet and exercised. But it always felt like I was fighting against myself. I was gaining weight over time, and my food cravings were persistent. Something was missing. All that changed once I got into ketosis. It’s not merely that I lost weight. More amazingly, my depression and food addictions went away, along with my tendencies toward brooding and compulsive thought (The Agricultural Mind). Everything felt easier and more natural. I didn’t have to force myself to exercise, for it now felt good to exercise. Physical activity was then an expression of my greater health, in the way a child runs around simply for the joy of it, for no other reason than having the energy to do so. Something fundamentally changed within my body and mind.

This touches on a central theory argued by some low-carb advocates. It’s not how many calories come in versus how many go out, at least not in a simple sense. The question is what is causing calories to be consumed and burned. One thing about ketosis is that it forces the body to burn its own energy stores (i.e., body fat) while reducing hunger, and it does this without any need of willpower, restraint, or moral superiority. It happens naturally. The body simply starts producing more energy, and even on a high-calorie diet that extra energy creates the conditions in which, unless some other health condition interferes, increased physical activity naturally follows.

It’s not merely that being in ketosis leads to changed activity that burns more energy. Rather, the increased energy comes first. And that is because ketosis allows better access to all the energy your body already has stored up. Most people feel too tired and drained to exercise, or so addicted to food that trying to control it only stresses them further. That is the typical experience on a high-carb diet: mood and energy levels go up and down, with the inevitable crashes becoming worse over time. But in ketosis, mood and energy are more balanced and constant. Simply put, one feels better. And when one feels better, one is more likely to do other things that are healthy. Ketosis creates a leverage point where health improvements can be made with far less effort.

In the public mind, diet is associated with struggle and failure. But in its original meaning, the word ‘diet’ referred to a way of life. Diet shouldn’t be something you do so much as something that changes your way of being in and relating to the world. If you find making health changes hard, it might be because you’re doing it wrong. Obesity and tiredness are not moral failings or character flaws. You aren’t a sinner to be punished and reformed. Your body doesn’t need to be denied and controlled. There is a natural state of health that we can learn to listen to. When your body hungers and craves, it is trying to tell you something. Feed it the nutrition it needs. Eat to satiety those foods that contribute to health. Lose excess weight first and only later worry about exercise. Once you begin to feel better, you might find your habits improving of their own accord.

This is a challenge not only to dietary belief systems but an even more radical challenge to society itself. Take prisons as an example. Instead of using prisons to store away the victims of poverty and inequality, we could eliminate the causes and consequences of poverty and inequality. We used to treat the mentally ill in hospitals, but now we put them into prisons. This shows up in concrete ways: prisoners, for example, have higher rates of lead toxicity. As a society, it would be cheaper, more humane, and less sociopathic to reduce the heavy metal poisoning itself. Similarly, studies have shown the prison population tends to be extremely malnourished. Prisons that improve prisoners’ diets see drastic reductions in aggressive, violent, anti-social, and other problematic behaviors. A similar observation has been made in studies of low-carb diets in children, whose behavior improves. That indicates that, with better public health, many and maybe most of these people wouldn’t have ended up in prison in the first place (Physical Health, Mental Health).

We’ve had a half century of unscientific dietary advice. Most Americans have been doing what they’ve been told. Saturated fat, red meat, and salt consumption went down over the past century. In place of those, fruits and vegetables, fish and lean chicken became a larger part of the diet. What have been the results? An ever-worsening epidemic of obesity, diabetes, heart disease, autoimmune disorders, mood disorders, and on and on. In fact, these kinds of health problems were seen quite early on, in the wake of the fear of meat provoked by Upton Sinclair’s 1906 muckraking journalism on the meatpacking industry in The Jungle. Saturated fat intake had been decreasing and seed oil intake increasing in the early 1900s, in the decades leading up to the health epidemic that became most visible around the 1940s and 1950s. The other things that increased over that period were grains, sugar, and carbs in general. Then the victims who followed this bad advice were blamed by the experts for being gluttonous and slothful, as if diet were a Christian morality play. We collectively took the hard path. And the more we failed, the more the experts doubled down in demanding more of the same.

Do we want better lives for ourselves and others? Or do we simply want to scapegoat individuals for our collective failures? If we think we can’t afford to do the right thing, we certainly won’t be able to afford the consequences of avoiding responsibility. The increasing costs of sickness, far from being limited to healthcare, will eventually bankrupt our society or else cause so much dysfunction that civil society breaks down. Why choose such a dark path when an easier choice is before us? Why are the government and major health institutions still pushing a high-carb diet? We have scientifically proven the health benefits of low-carb diets. The simplest first act would be to change our dietary guidelines, and all else would follow from that, from the food system to medical practice. What are we waiting for? We can make life hard, if we choose. But why not make it easy?

* * *

I’ve long wondered why we humans make life unnecessarily hard. We artificially construct struggle and suffering out of fear of what would happen if people were genuinely free from threat, punishment, and social control. We think humans are inherently bad and must be controlled. This seeps into every aspect of life, far from being limited to demented dietary ideology.

We are even willing to punish others at great cost to ourselves, even to the point of being highly destructive to all of society. We’d rather harm, imprison, or kill millions of innocents in order to ensure one guilty person gets what we think they deserve. And we constantly need an endless parade of scapegoats to sate our vengeful natures. Innocence becomes irrelevant, as it ultimately is about control and not justice.

All of it is driven by fear. The authoritarians, social dominators, and reactionaries — they prey upon our fear. And in fear, people do horrific things or else submit to others doing them. Most importantly, it shuts down our ability to imagine and envision. We go to great effort to make our lives difficult. Struggle leads to ever more struggle. Suffering cascades onto suffering. Worse upon worse, ad infinitum. As such, dietary ideology, or whatever else is pushed by the ruling elite, isn’t about public good. It’s social control, pure and simple.

But let all of that go. Let the fear go. We know from science itself that it doesn’t have to be this hard. There are proven ways to do things that are far simpler and far easier and with far better results. We aren’t bad people who need to be punished into doing the right thing. Our bodies aren’t fallen forms that will lead us into sin. What if, instead, we looked to the better angels of our nature, to what is inherently good within us?

Here is some of what I’ve written before about the easy versus the hard, about freedom versus social control:
Public Health, Public Good
Freedom From Want, Freedom to Imagine
Rationalizing the Rat Race, Imagining the Rat Park
Costs Must Be Paid: Social Darwinism As Public Good
Denying the Agency of the Subordinate Class
Capitalism as Social Control
Substance Control is Social Control
Reckoning With Violence
Morality-Punishment Link
Unspoken Connection: Fundamentalism and Punishment
What If Our Economic System Conflicts With Our Human Nature?
An Invisible Debt Made Visible

About imagining alternatives, I’ve been reading Edward Bellamy’s Looking Backward. It’s a utopian novel, but in many ways it isn’t all that extreme. The future portrayed is basically a Nordic-style social democracy taken to the next level. That basic model of governance has already proven itself one of the best in the world, not only for public good but also for wealth and innovation.

In reading about this fictionalized world, one thing stood out to me. The protagonist, Julian West, was put into a trance to aid his sleep. He was in a sealed room underground, and apparently the house burned down, leaving behind an empty lot. As a leap of imagination for both author and reader, this trance state put him into hibernation for more than a century. His underground bedchamber is discovered by the Leete family who, in the future world, live on his old property, although their house was built in a different location.

The father is Doctor Leete, who takes particular interest in Julian. They have many conversations about the differences between the late 19th and early 21st centuries. Julian struggles to understand the enormous changes that have taken place. The world he fell asleep in bears little resemblance to the world he woke up in. When he questions something that seems remarkable to him, Doctor Leete often responds that it’s simpler than it seems to Julian. The contrast shows how unnecessarily difficult, wasteful, and cruel that earlier society was.

The basic notion is that simple changes in social conditions can result in drastic changes in public good. The costs are minuscule in comparison to the gains. That is to say, this alternative future humanity chose the easy path, instead of continually enforcing costly punishment and social control. It’s quite amazing that the argument I make now was being made all the way back in 1888, when Bellamy published the novel. From the novel, one example of this other way of thinking is the description of the future education system as it relates to health:

I shall not describe in detail what I saw in the schools that day. Having taken but slight interest in educational matters in my former life, I could offer few comparisons of interest. Next to the fact of the universality of the higher as well as the lower education, I was much struck with the prominence given to physical culture, and the fact that proficiency in athletic feats and games as well as in scholarship had a place in the rating of the youth.

“The faculty of education,” Dr. Leete explained, “is held to the same responsibility for the bodies as for the minds of its charges. The highest possible physical, as well as mental, development of everyone is the double object of a curriculum which lasts from the age of six to that of twenty-one.”

The magnificent health of the young people in the schools impressed me strongly. My previous observations, not only of the notable personal endowments of the family of my host, but of the people I had seen in my walks abroad, had already suggested the idea that there must have been something like a general improvement in the physical standard of the race since my day; and now, as I compared these stalwart young men and fresh, vigorous maidens with the young people I had seen in the schools of the nineteenth century, I was moved to impart my thought to Dr. Leete. He listened with great interest to what I said.

“Your testimony on this point,” he declared, “is invaluable. We believe that there has been such an improvement as you speak of, but of course it could only be a matter of theory with us. It is an incident of your unique position that you alone in the world of to-day can speak with authority on this point. Your opinion, when you state it publicly, will, I assure you, make a profound sensation. For the rest it would be strange, certainly, if the race did not show an improvement. In your day, riches debauched one class with idleness of mind and body, while poverty sapped the vitality of the masses by overwork, bad food, and pestilent homes. The labour required of children, and the burdens laid on women, enfeebled the very springs of life. Instead of these maleficent circumstances, all now enjoy the most favourable conditions of physical life; the young are carefully nurtured and studiously cared for; the labour which is required of all is limited to the period of greatest bodily vigour, and is never excessive; care for one’s self and one’s family, anxiety as to livelihood, the strain of a ceaseless battle of life, all these influences, which once did so much to wreck the minds and bodies of men and women, are known no more. Certainly, an improvement of the species ought to follow such a change. In certain specific respects we know, indeed, that the improvement has taken place. Insanity, for instance, which in the nineteenth century was so terribly common a product of your insane mode of life, has almost disappeared, with its alternative, suicide.”

* * *

Bonus Article:
Here’s What Weight-Loss Advice Looked Like Nearly 100 Years Ago
by Morgan Cutolo, Reader’s Digest

I’m throwing this in for a number of reasons. It shows how low-carb views are basically the same as dietary advice from earlier last century. Heck, one can find advice like that going back to the 1800s and even the 1700s. Low-carb diets were well known and mainstream until the changes at the AHA and FDA over the past 50 years or so.

The return of low-carb popularity is what inspires such articles from the corporate media. Reader’s Digest wouldn’t likely have published something like this 10, 20, or 30 years ago. Attitudes are changing, even if institutions are resistant. Profits are also changing as low-carb products become big biz. Corporate media, if nothing else, will follow the profits.

Here is what really stood out to me. In the article, two major dietary experts are quoted: Dr. Jason Fung and Dr. Robert Lustig. Both of them are leading advocates of low-carb diets with Dr. Lustig being the most influential critic of sugar. But neither of them is presented as such. They are simply used as authorities on the topic, which they are. That means that low-carb has become so acceptable as, in some cases, to go without saying. They aren’t labeled as low-carb gurus, much less dismissed as food faddists. No qualifications or warnings are given about low-carb. The article simply quotes these experts about what the science shows.

This is a major advance in news reporting. It’s a positive sign of changes being embraced. Maybe we are finally turning off the hard path and trying out the easier path instead. Some early signs indicate as much. The growing incidence of diabetes might finally be leveling out, and even reversing, for the first time in generations.

Diabetic Confusion
Low-Carb Diets On The Rise
American Diabetes Association Changes Its Tune
Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines
Official Guidelines For Low-Carb Diet
Obese Military?
Weight Watchers’ Paleo Diet

A Century of Dietary and Nutritional Trends

At Optimizing Nutrition, there is a freaking long post with a ton of info: Do we need meat from animals? Let me share some of the charts showing changes over the past century. As calories have increased, the nutrient content of food has been declining. Also, with vegetable oils and margarine shooting up, animal fat and dietary cholesterol intake has dropped.

Carbs are a bit different. They had increased some in the early 20th century. That was partly because meat consumption declined in response to Upton Sinclair’s muckraking of the meat industry with his book The Jungle, and it happened precisely when industrialization had made starchy carbs and added sugar more common. For perspective, read Nina Teicholz’s account of the massive consumption of animal foods, including nutrient-dense animal fat and organ meats, among Americans in the prior centuries:

“About 175 pounds of meat per person per year! Compare that to the roughly 100 pounds of meat per year that an average adult American eats today. And of that 100 pounds of meat, more than half is poultry—chicken and turkey—whereas until the mid-twentieth century, chicken was considered a luxury meat, on the menu only for special occasions (chickens were valued mainly for their eggs). Subtracting out the poultry factor, we are left with the conclusion that per capita consumption of red meat today is about 40 to 70 pounds per person, according to different sources of government data—in any case far less than what it was a couple of centuries ago.” (The Big Fat Surprise, passage quoted in Malnourished Americans).
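As a quick sanity check on those figures, here is the back-of-the-envelope arithmetic in Python, using only the numbers quoted above and rounding the “more than half” poultry share down to one half:

```python
# Back-of-the-envelope check of the per-capita meat figures quoted above.
historical_meat = 175   # lbs/person/year in prior centuries (Teicholz)
modern_meat = 100       # lbs/person/year for the average adult American today
poultry_share = 0.5     # "more than half" of today's meat is poultry

modern_red_meat = modern_meat * (1 - poultry_share)
print(modern_red_meat)                    # 50.0 lbs/year at most, within
                                          # the quoted 40-70 lb range
print(historical_meat / modern_red_meat)  # 3.5 -- several times more red meat then
```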

What we forget, though, is that low-carb eating was popular for a number of decades. In the era of the world wars, there was a lot of research on the ketogenic diet. Then, around mid-century, low-carb diets became common and carb intake fell. Atkins didn’t invent the low-carb diet. Science conferences on diet and nutrition regularly had speakers on low-carb diets into the 1970s (either Gary Taubes or Nina Teicholz mentions this). It wasn’t until 1980 that the government began seriously promoting the high-carb diet that has afflicted us ever since. Carb intake peaked around 2000 and dropped a bit after that, but has remained relatively high.

The inflammatory omega-6 fatty acids combined with all the carbs have caused obesity, as part of metabolic syndrome. That goes along with the lack of nutrition that has caused endless hunger, as Americans have been eating empty calories. The more crap you eat, the more your body hungers for nutrition. And all that crap is designed to be highly addictive. So Americans eat and eat, the body hungering for nutrition and not getting it. Under natural conditions, hunger is a beneficial signal to seek out what the body needs. But such things as sugar have become unlinked from nutrient density.

Unsurprisingly, Americans have been getting sicker and sicker, decade after decade. But on a positive note, there has recently been a slight drop in how many carbs Americans are eating. This is particularly seen with added sugar. And it does seem to be making a difference. There is evidence that the diabetes epidemic might finally be reversing. Low-carb diets are becoming popular again, after almost a half century of public amnesia. That is good. Still, the food most Americans have access to remains low quality and lacking in nutrition.


Gary Taubes On Biological Homeostasis

Gary Taubes wrote, “This is Curt Richter talking about his diabetic rat experiments published in 1941. It raises an obvious question: Could Richter’s rats have been smarter than the expert committees of the American Diabetes Association? I’m just saying….”

We found that when pancreatectomized rats with marked diabetes were offered a carbohydrate, a fat, and a protein in separate containers, in place of the mixed diet, they refused the carbohydrate and ate large amounts of fat and protein (7). As a result they lost their symptoms of diabetes, i.e., their blood sugar fell to its normal level, they gained weight, ate less food, and drank only normal amounts of water.

More from the same paper:

“When placed on a self-selection diet, the rats were no longer forced to take carbohydrate, except for the small amount contained in the yeast. They stopped eating sucrose and ate large amounts of olive oil; and as a consequence, the water was no longer drained from the tissues. The thirst disappeared, energy from the sugar was no longer lost; so the food intake decreased to its normal level again. With the return to the McCollum diet, the symptoms were reversed again.

“The dietary selections made by the diabetic rats closely agree with the diets determined empirically by clinicians from human diabetics in the preinsulin period. It was generally agreed that patients did much better on a high fat than on a low fat diet. Since the insulin increases the ability to utilize carbohydrate, the need for a high fat diet is no longer present. In preliminary experiments we have found that, when treated with insulin, our diabetic rats stop taking olive oil in such large amounts and eat sucrose.”

* * *

Increased Fat and Decreased Carbohydrate Appetite of Pancreatectomized Rats
by Curt P. Richter and Edward C. H. Schmidt, Jr.

BERNARD, who in 1859 first stated his concept of the constancy of the internal environment, described various physiological mechanisms, part responses of the organism, responses of individual organs, which contribute to the maintenance of this constancy. We have recently found that behavior mechanisms, responses of the total organism, may also serve to maintain the constancy of the internal environment. The existence of these behavior mechanisms became established in experiments in which certain physiological mechanisms had been excluded. Thus, after adrenalectomy had removed the chief physiological means of regulating sodium metabolism, it was found that the animal itself made an effort to maintain the sodium balance by seeking and ingesting large amounts of sodium chloride (1). Similarly, after parathyroidectomy had removed the physiological mechanisms for the maintenance of a constant calcium balance, the animals themselves made an effort to correct the calcium loss by ingesting large amounts of calcium solution (2).

Catching Up On Lost Time – The Ancestral Health Symposium, Food Reward, Palatability, Insulin Signaling and Carbohydrates… Part II(C)
by Gary Taubes

It was well known at the time (although it may have been forgotten since then), as I discussed in Good Calories, Bad Calories, that animals can be made to like one food more than another, and so eat more of the one than the other, by interventions that influenced their underlying physiologic/metabolic/hormonal states. Here’s how I illustrated this in GC,BC:

Throughout the first half of the twentieth century, a series of experimental observations, many of them from [Curt] Richter’s laboratory [at Johns Hopkins University], raised questions about what is meant by the concepts of hunger, thirst and palatability, and how they might reflect metabolic and physiological needs. For example, rats in which the adrenal glands are removed cannot retain salt and will die within two weeks on their usual diet from the consequences of salt depletion. If given a supply of salt in their cages, however, or given the choice of drinking salt water or pure water, they will choose to either eat or drink the salt and, by doing so, keep themselves alive indefinitely. These rats will develop a “taste” for salt that did not exist prior to the removal of their adrenal glands. Rats that have had their parathyroid glands removed will die within days of tetany, a disorder of calcium deficiency. If given the opportunity, however, they will drink a solution of calcium lactate rather than water—not the case with healthy rats—and will stay alive because of that choice. They will appear to like the calcium lactate more than water. And rats rendered diabetic voluntarily choose diets devoid of carbohydrates, consuming only protein and fat. “As a result,” Richter said, “they lost their symptoms of diabetes, i.e., their blood sugar fell to its normal level, they gained weight, ate less food and drank only normal amounts of water.”

In short, change underlying physiologic/hormonal conditions and it will affect what an animal chooses to eat and so seems to like or find rewarding. The animal’s behavior and perceptions will change in response to a change in homeostasis – in the hormonal milieu of the cells in the body.

It’s quite possible that all those foods we seem to like, or even the ones we find rewarding but don’t particularly like, as Dr. Guyenet argues, and that subsequently cause obesity (not necessarily the same thing) are those foods that somehow satisfy an underlying metabolic and physiological demand. This in turn might induce our brains to register them as more palatable or rewarding, but the initial cause would be the effect in the periphery. The nutrient composition of the food, in this case, would be the key—what it’s doing in the body, not necessarily the brain.

Good Calories, Bad Calories
by Gary Taubes
pp. 331-332

This idea that energy expenditure increases to match consumption, and that the ability to do this differs among individuals, also serves to reverse the cause-and-effect relationship between weight and physical activity or inactivity. Lean people are more active than obese people, or they have, pound for pound, a higher expenditure of energy, because a greater proportion of the energy they consume is made available to their cells and tissues for energy. By this conception, lean people become marathon runners because they have more energy to burn for physical activity; their cells have access to a greater proportion of the calories they consume to use for energy. Less goes to making fat. That’s why they’re lean. Running marathons, however, will not make fat people lean, even if they can get themselves to do it, because their bodies will adjust to the extra expenditure of energy, just as they would adjust to calorie-restricted diets.

Our propensity to alter our behavior in response to physiological needs is what the Johns Hopkins physiologist Curt Richter called, in a heralded 1942 lecture, “total self-regulatory functions.” Behavioral adaptation is one of the fundamental mechanisms by which animals and humans maintain homeostasis. Our responses to hunger and thirst are manifestations of this, replenishing calories or essential nutrients or fluids. Physical activity, as Richter suggested, is another example of this behavioral regulation, in response to an excess or dearth of calories. “We may regard the great physical activity of many normal individuals, the play activity of children, and perhaps even the excessive activity of many manic patients, as efforts to maintain a constant internal balance by expending excessive amounts of energy,” he explained. “On the other hand, the low level of activity seen in some apparently normal people, the almost total inactivity seen in depressed patients, again may be regarded as an effort to conserve enough energy to maintain a constant internal balance.”

pp. 457-460

This is where physiological psychologists provided a viable alternative hypothesis to explain both hunger and weight regulation. In effect, they rediscovered the science of how fat metabolism is regulated, but did it from an entirely different perspective, and followed the implications through to the sensations of hunger and satiety. Their hypothesis explained the relative stability of body weight, which has always been one of the outstanding paradoxes in the study of weight regulation, and even why body weight would be expected to move upward with age, or even move upward on average in a population, as the obesity epidemic suggests has been the case lately. And this hypothesis has profound implications, both clinical and theoretical, yet few investigators in the field of human obesity are even aware that it exists.

This is yet another example of how the specialization of modern research can work against scientific progress. In this case, endocrinologists studying the role of hormones in obesity, and physiological psychologists studying eating behavior, worked with the same animal models and did similar experiments, yet they published in different journals, attended different conferences, and thus had little awareness of each other’s work and results. Perhaps more important, neither discipline had any influence on the community of physicians, nutritionists, and psychologists concerned with the medical problem of human obesity. When physiological psychologists published articles that were relevant to the clinical treatment of obesity, they would elicit so little attention, said UCLA’s Donald Novin, whose research suggested that the insulin response to carbohydrates was a driving force in both hunger and obesity, that it seemed as though they had simply tossed the articles into a “black hole.”

The discipline of physiological psychology was founded on Claude Bernard’s notion of the stability of the internal environment and Walter Cannon’s homeostasis. Its most famous practitioner was the Russian Ivan Pavlov, whose career began in the late nineteenth century. The underlying assumption of this research is that behavior is a fundamental mechanism through which we maintain homeostasis, and in some cases—energy balance in particular—it is the primary mechanism. From the mid-1920s through the 1940s, the central figure in the field was Curt Richter of Johns Hopkins. “In human beings and animals, the effort to maintain a constant internal environment or homeostasis constitutes one of the most universal and powerful of all behavior urges or drives,” Richter wrote.

Throughout the first half of the twentieth century, a series of experimental observations, many of them from Richter’s laboratory, raised questions about what is meant by the concepts of hunger, thirst, and palatability, and how they might reflect metabolic and physiological needs. For example, rats whose adrenal glands are removed cannot retain salt, and will die within two weeks on their usual diet, from the consequences of salt depletion. If given a supply of salt in their cages, however, or given the choice of drinking salt water or pure water, they will choose either to eat or to drink the salt and, by doing so, keep themselves alive indefinitely. These rats will develop a “taste” for salt that did not exist prior to the removal of their adrenal glands. Rats that have had their parathyroid glands removed will die within days of tetany, a disorder of calcium deficiency. If given the opportunity, however, they will drink a solution of calcium lactate rather than water—not the case with healthy rats—and will stay alive because of that choice. They will appear to like the calcium lactate more than water. And rats rendered diabetic voluntarily choose diets devoid of carbohydrates, consuming only protein and fat. “As a result,” Richter said, “they lost their symptoms of diabetes, i.e., their blood sugar fell to its normal level, they gained weight, ate less food and drank only normal amounts of water.”

The question most relevant to weight regulation concerns the quantity of food consumed. Is it determined by some minimal caloric requirement, by how the food tastes, or by some other physical factor—like stomach capacity, as is still commonly believed? This was the question addressed in the 1940s by Richter and Edward Adolph of the University of Rochester, when they did the experiments we discussed earlier (see Chapter 18), feeding rats chow that had been diluted with water or clay, or infusing nutrients directly into their stomachs. Their conclusion was that eating behavior is fundamentally driven by calories and the energy requirements of the animal. “Rats will make every effort to maintain their daily caloric intake at a fixed level,” Richter wrote. Adolph’s statement of this conclusion still constitutes one of the single most important observations in a century of research on hunger and weight regulation: “Food acceptance and the urge to eat in rats are found to have relatively little to do with ‘a local condition of the gastro-intestinal canal,’ little to do with the ‘organs of taste,’ and very much to do with quantitative deficiencies of currently metabolized materials”—in other words, the relative presence of usable fuel in the bloodstream.

Fat Doesn’t Mean Not Fit

Eric “Butterbean” Esch, having weighed 425 lbs at his heaviest, was one of the best boxers of the 1990s. He regularly knocked out his competitors in under a minute. He didn’t look impressive, apart from being obese. He wasn’t the best-trained fighter, nor did he fight with much style. But he was a powerhouse. He could take punches and give them in return. And when he landed a punch, it was devastating.

As with many others, Butterbean’s obesity was not an indicator of a lack of muscle, stamina, or aerobic health. Even in later fights, when his power had decreased, he could still hold his own for many rounds. In 2002, he remained on his feet for 10 rounds against one of the greatest fighters of all time, Larry Holmes, finally knocking him back against the ropes, with the fight ending after the referee gave a standing eight count. He expanded his career into professional wrestling and MMA matches, winning many more fights. As late as 2011, in his mid-40s, he was still knocking out opponents, and he was still fat.

This is why so few people can lose weight through exercise alone. For most people, especially on a high-carb diet, all that more exercise does is make them hungrier, leading them to eat more (exercise on a ketogenic diet is a bit different, though). And indeed, many athletes end up focusing on carbs in trying to maintain their energy, since glucose gets used up so quickly (as opposed to ketones). Long-distance runners on a high-carb diet have to constantly refuel with sugary drinks provided along the way.

Americans have been advised to eat more of the supposedly healthy carbs (whole grains, vegetables, fruit, etc.) while eating less of the supposedly unhealthy animal foods (red meat, saturated fats, etc.), and the data shows they are doing exactly that, more so than at any time since records have been kept. But telling people that eating lots of carbs, even from “whole foods”, is part of a healthy diet is bad advice. And when they gain weight, blaming them for not exercising enough is bad advice stacked upon bad advice.

Such high-carb diets don’t do any good for long-term health, even for athletes. Morally judging fat people as gluttonous and slothful simply doesn’t make sense and it is the opposite of helpful, a point that Gary Taubes has made. It’s plain bullshit and this scapegoating of the victims of bad advice is cruel.

This is why so many professional athletes get fat when they retire, after a long career of eating endless carbs (not that it was ever good for their metabolic health; people can be “skinny fat,” with adipose tissue around their internal organs, and have diabetes or prediabetes). But some, like Butterbean, began their athletic careers fat and remained fat. Many football players are similarly overweight. William Perry, AKA The Fridge, was an example of that, although he was a relative lightweight at 335-350 lbs. Even more obvious examples are some gigantic sumo wrestlers who, while grotesquely obese, are immensely strong athletes.

Sumo wrestlers are also a great example of the power of a high-carb diet. They intentionally consume massive amounts of starches and sugars in order to put on fat. That is old knowledge: people have understood for centuries that the best way to fatten cattle is to feed them grains. And it isn’t as if cattle get fat by being lazy, sitting on the couch watching TV and playing on the internet. It’s the diet alone that accomplishes that feat of deliciously marbled flesh. Likewise, humans eating a high-carb diet will marble their own muscles and organs.

I speak from personal experience, after gaining weight in my late 30s and into my early 40s. I topped out at around 220 lbs — not massive, but way beyond my weight in my early 20s, when I was super skinny, maybe down in the 140 lbs range (the result of a poverty diet; I looked gaunt at the time). In recent years, I had developed a somewhat protruding belly and neck flab. You could definitely tell I was carrying extra fat. Could you tell that I was also physically fit? Probably not.

No matter how much I exercised, I could not lose weight. I was jogging out to my parents’ place, often while carrying a backpack that sometimes added another 20-30 lbs (books, water bottle, etc.). That jog took about an hour; I did it 3-4 times a week, and I was doing some weightlifting as well, but my weight remained the same. Keep in mind I was eating what, according to official dietary guidelines, was a ‘balanced’ diet. I had cut back on added sugars over the years, only allowing them as part of healthy whole foods such as kefir, kombucha, and fruit. I was emphasizing lots of vegetables and fiber. This often meant starting my day with a large bowl of bran cereal topped with blueberries or dried fruit.

I was doing what Americans have been told is healthy. I could not lose any of that extra fat, in spite of all my effort and self-control. Then in the spring of last year I went on a low-carb diet that transitioned into a very low-carb diet (i.e., keto). In about 3 months, I lost 60 lbs and have kept it off since. I didn’t do portion control and didn’t count calories. I ate as much as I wanted, but simply cut out the starches and sugars. No willpower was required, as on a keto diet my hunger diminished and my cravings disappeared. It was the high-carb diet that had made me fat, not a lack of exercise.