Sugar is an Addictive Drug

Sugar is addictive. That is not a metaphor. It is literally an addictive drug, a gateway drug. Sugar is the first drug that most humans ever experience.

For many Americans, this addiction begins shaping the brain in infancy, as sweeteners are added to formula. And if you didn’t get formula, I bet you didn’t make it past toddlerhood without getting regularly dosed with sugar: sweet baby food, candy, cake, etc.

Addiction is trained into us during the most critical years of physiological development. What we eat in the first few years, as research shows, determines what tastes good to us for the rest of our lives. We are hooked.

(I’ve previously written on food addiction: The Agricultural Mind; & Diets and Systems.)

* * *

WHAT IS FOOD ADDICTION?
By H. Theresa Wright, MS, RD, LDN and Joan Ifland, PhD

The addictive properties of sugar are perhaps the most studied. [6] Rats will choose sugar, high fructose corn syrup, and saccharin over cocaine and heroin. Rats have shown a withdrawal syndrome similar to that of morphine. [7] Sugar activates the dopamine pathway. [8] Food addiction recovery groups often recommend abstinence from sugar and sweeteners. [8]

The case for treating sugar like a dangerous drug

German Lopez: Walk me through the argument for treating sugar like a controlled substance.

Robert Lustig: The definition of addicted is that you know it’s bad for you and you can’t stop anyway, like heroin, cocaine, alcohol, and nicotine. You know it’s bad for you. You know it will kill you. But you can’t stop anyway, because the biochemical drive to consume is greater than any cognitive ability to restrain oneself.

There are two phenomena attached to addiction: one’s called tolerance, the other is withdrawal. It turns out sugar does both of those as well.

If a substance is abused and addictive and it contributes to societal problems, that’s criteria for regulation.

GL: Is that really grounds for considering it a controlled substance, though?

RL: There are four things that have to be met in order to consider a substance worthy of regulation. Number one: ubiquity — you can’t get rid of it, it’s everywhere. Number two: toxicity — it has to hurt you. Number three: abuse. Number four: externalities, which means it has a negative impact on society.

Sugar meets all four criteria, hands down. One, it’s ubiquitous — it’s everywhere, and it’s cheap. Two, as I mentioned, we have a dose threshold, and we are above it. Three, if it’s addictive, it’s abused. Four, how does your sugar consumption hurt me? Well, my employer has to pay $2,750 per employee for obesity management and medicine, whether I’m obese or not.

GL: The thing that led me to look into your paper is that I wrote an article a couple weeks back about how the three most dangerous drugs in the country are legal: tobacco, alcohol, and prescription painkillers. And a few people mentioned that I forgot sugar. That idea really interested me.

RL: Yeah, that’s right. The Wall Street Journal asked Americans which of four substances is the most dangerous in America: tobacco, 49 percent; alcohol, 24 percent; sugar, 15 percent; and then marijuana, 8 percent. Sugar was nearly twice as worrisome to Americans as marijuana was. How about that?

GL: One potential hurdle is that controlled substances are typically seen as drugs. Do you consider sugar a drug?

RL: Of course it’s a drug. It’s very simple: a drug is a substance that has effects on the body, and the effects have to be exclusive of calories.

So in order to qualify it as a drug, the negative effects of sugar have to be exclusive of its calories. Is 100 calories of sugar different from, say, 100 calories in broccoli? The answer is absolutely.

Can you name another substance of abuse for which the effect of the substance is more dangerous than the calories it harbors? Alcohol. Its calories are dangerous not because they’re calories; they’re dangerous because they’re part of alcohol. Sugar is the same.

Sugar is the alcohol of a child. You would never let a child drink a can of Budweiser, but you would never think twice about a can of Coke. Yet what it does to the liver, what it does to the arteries, what it does to the heart is all the same. And that’s why we have adolescents with type 2 diabetes.

There are some studies of rats that are completely addicted to cocaine. So they have this drip, cocaine just comes out, and so they’re consuming it all the time. This is the crazy part. As soon as they taste sugar, they don’t care about the cocaine anymore and all they care about is the sugar. That is how addictive sugar is. It’s so addictive that rats addicted to cocaine, which we all know is an addictive substance, prefer the sugar over the cocaine.

There is another study where rats pull a cord, and every time they pull the cord a little drip of sugar water comes out. So they’re confined in this space and that is all they get. So they learn to pull the cord so that they can get their drip of sugar. And over time the researchers open the door so that they have access to the outside. They even have access to family and they have access to all these other foods.

And guess what these rats do. They don’t care about anything else; they just wait and wait and obsessively pull the cord to try to get sugar. This is how scary and addictive sugar is.

Fat Chance: Fructose 2.0 by Dr. Robert Lustig (Transcript)

So the question is, is fast food addictive? What do you think? Yes? No? Okay, so we actually looked at that question.

So everybody familiar with this book? Michael Moss put this out, “Salt Sugar Fat: How the Food Giants Hooked Us,” right? This is wrong, this is a mistake. Because there is one thing not on the list. What’s missing? Caffeine.

Now we’ve got fast food! Okay, salt, sugar, fat and caffeine, right? So the question is, of these four which are addictive?

Let’s talk about salt. Is salt addictive? No, it’s not addictive. In humans the threshold is physiologically fixed; higher levels are attributable to preference, but you can alter that preference, and lots of people do, especially when they have to go low-salt for some reason. And we know because we take care of a disease in endocrinology called salt-losing congenital adrenal hyperplasia, where the kidneys are losing salt nonstop. But when we give these patients the salt-retaining hormone that works in the kidney, called aldosterone, their salt intake goes way down. And if they were addicted, that wouldn’t happen.

So when we fix their physiology, their preference gets a lot better. So salt? Not addictive.

Now let’s take fat. Is fat addictive? What do you think? Nope. Rodents binge but show no signs of dependence, and humans always binge on high-fat, high-carb or high-sugar items, like pizza and ice cream; you don’t binge on high fat per se. Otherwise the Atkins diet would have everybody addicted, and Atkins dieters will tell you they are losing weight. How could they lose weight if they were all addicted?

Energy density actually has a stronger association with obesity and metabolic syndrome than fat does.

So, fat? Not addictive.

So we are left with these two. Caffeine? Oh man, caffeine is addictive, and if you take my Starbucks away from me I’ll kill you. Model drug of dependence, gateway drug in fact. Dependence is shown in children, adolescents, and adults; 30% of those who consume it meet the DSM criteria for dependence, and physiological addiction is well established with the headache, the test performance, and everything else. Mega addictive.

But do you see anybody going out and regulating Starbucks or pizza or anything like that? Why? Because it’s not toxic. It’s addictive, but not toxic, unless you mix it with alcohol, and then you’ve got something called Four Loko, and that we are banning. Everybody got it?

So when it’s toxic and addictive, we ban it or we regulate it. And so, caffeine and alcohol together, that’s a bad deal. But caffeine alone? Keep your hands off my Starbucks.

So caffeine? Yes, addictive.

Okay, that leaves this one. Sugar, is sugar addictive? What do you think? You know, we’ve known this for a long time, because, anybody know what this is? It’s called sweeties. This is a super-concentrated sucrose solution that you dip the pacifier in and put in the newborn baby boy’s mouth before you do the circumcision, because it releases opioids and deadens the pain. And this has been known forever. Then you mix it with a little wine and you’ve got a really good cocktail, eh?

So is there really such a thing as sugar addiction? We have to look for similarities to other drugs of dependence like nicotine, morphine, amphetamine, cocaine. The one I think is most appropriate is alcohol, because after all alcohol and sugar are basically metabolized the same way. Where do you get alcohol from? Fermentation of sugar. It’s called wine, right? We do it every day, up in Sonoma. The big difference between alcohol and sugar is that for alcohol the yeast does the first step of metabolism, called glycolysis; for sugar we do our own first step, but after that, when the mitochondria see it, it doesn’t matter where it came from. And that’s the point, and that’s why they both cause the same diseases. And they do the same thing to the brain.

So the criteria for addiction in animals are bingeing, withdrawal, craving, and then there is one down here called cross-sensitization with other drugs of abuse. That means that if you expose an animal to one drug of abuse, like cocaine for 3 weeks, and addict them, and then you expose them to a second drug they’ve never seen before, like say amphetamine, they’re addicted to the amphetamine even though they’d never seen it before, because the dopamine receptors are already down-regulated, because they are the same dopamine receptors. Everybody got it?

Okay, and so, does sugar do this? Absolutely. Q.E.D., slam dunk: sugar is addictive in animals.

What about humans? Who saw this movie? Right? Did you like it? More or less?

I have a big problem with this movie, because if you watch it, his doctor, Morgan’s doctor, keeps saying: “You gotta get off this high-fat diet, high-fat diet, high-fat diet, high-fat diet, high-fat diet.” It’s not the high-fat diet, it’s the high-sugar diet, the high-sugar diet; that’s what caused all the problems.

So, can sugar be addictive? Watch.

“I was feeling bad” “In the car, feeling like…I was feeling really, really sick and unhappy…started eating, feel great…feel really good now… I feel so good as crazy… Ain’t that right baby? Yeah you’re right darling”

This was on day 18 of his 30-day McDonald’s-only sojourn. He just described withdrawal, that’s withdrawal, and he needed another hit in order to feel good again. He just described withdrawal. And he had been a vegan before this, right? Because his girlfriend was a vegan chef, and in 18 days he’s a sugar addict.

So, you tell me. So this is what we are dealing with. We are dealing with an industry that wants us to consume its product. Well, gee, every industry wants us to consume its product in some fashion or another. The question is: what if it hurts you? What if it hurts you?

Vegetarianism is an Animal-Based Diet

“Some cultures like the Plains Indians, like the Lakota, they lived mostly on Buffalo. And they were the longest lived people in history. More centenarians per capita than any other population at the turn of the century.”

Dr. Mark Hyman mentioned that as a side comment in a talk with Tom Bilyeu (27 minutes into the video below). This caught my attention for the obvious reason. The Lakota, on a carnivore diet as they were in the 19th century, hold the historical record for the longest-lived population. Let’s give credit where it’s due. And the credit goes to those nutrient-dense buffalo that gave up their lives for the greater good of the Lakota.

But the thing that surprised me is that Hyman brought this up at all. He is not a carnivore advocate. In fact, he was advising against it. He isn’t even particularly paleo in his dietary views. Of the alternative health doctors, he is one of the more well known and one of the more mainstream. I wouldn’t exactly say he is conventional, although he doesn’t tend to stray far into extreme dietary regimens. His comment was a bit out of character.

I looked it up and found where he spoke of it in one of his books: Eat Fat, Get Thin (at the beginning of chapter 7). That passage gives more context for why he highlights that exemplary population. He brings up another long-lived population, the Seventh Day Adventists, who are vegetarians. “What gives?” he asked. “Meat or veggies? Maybe we’re asking the wrong question. The answer seems to be that it is not the meat or the veggies, but the sugar and refined carbs that are part of the typical meat eater’s diet and our highly processed inflammatory diet that we should be concerned with.”

That makes more sense of why he was interested in the Lakota. He presents two diets that most people would take as extreme and then seeks a moderate position. Then he concludes that it isn’t what either diet includes but what they both exclude. That is a fair enough point (reminiscent of Catherine Shanahan’s assessment of industrial seed oils in Deep Nutrition; see Dr. Catherine Shanahan On Dietary Epigenetics and Mutations). The same basic argument comes up in an article on his official website, Is Meat Good or Bad for You?. He states that the “whole carnivore-vegan debate misses the real point”.

Sure, it misses the point. But did you see the sleight of hand he did there? He switched the frame from a carnivore-vegetarian debate to a carnivore-vegan debate. That is problematic, since vegetarianism and veganism are extremely different. Vegetarianism is an animal-based omnivorous diet. It allows animal foods such as eggs and dairy (some versions even include seafood). If a vegetarian so desired, they could eat almost entirely animal foods and remain vegetarian. That is not possible with veganism, which entirely excludes animal foods of all varieties (ignoring vegans who likewise make exceptions).

In comparing long-lived carnivores and long-lived vegetarians, maybe there is more going on than the unhealthy processed foods that their diets lack. The early Lakota obviously were getting high-quality and highly nutritious animal foods, but the same could be true of the vegetarians among Seventh Day Adventists. Both could be getting high levels of fat-soluble vitamins, along with other animal-sourced nutrients such as EPA, DHA, choline, biotin, etc. A vegan lacks all of these without artificial, non-food supplementation. That isn’t a minor detail.

We have no comparable vegan population that has been studied or about which we have historical data. Interestingly, veganism didn’t exist as a diet until a Seventh Day Adventist received it as a message from God. Yet even to this day, there is no significant number of vegans among Seventh Day Adventists or any other population, much less vegans who have been on the diet for their entire lives or for multiple generations. There is no such thing as a long-lived vegan population. In fact, few people who start a vegan diet remain on it for long. We have no clue what would happen to an entire population maintained on veganism for their entire lives. It’s a complete unknown. But what we do know is that populations that allow animal foods, from carnivore to vegetarian, can maintain good health and can produce centenarians.

This gets overlooked in mainstream debate, where vegetarianism and veganism are conflated as plant-based diets. That confusion is purposely promoted by vegans (e.g., the documentary The Game Changers) who don’t want a public discussion about animal foods, especially not nutrient-dense animal foods as part of regenerative farming, and so they dismiss all animal foods as factory-farmed ‘meat’ supposedly destroying the world. For some reason, many experts like Dr. Hyman have fallen into this framing, a framing that corporate interests have likewise promoted (Dietary Dictocrats of EAT-Lancet).

* * *

Old Debates Forgotten

Since earlier last year, I’ve done extensive reading, largely but not entirely focused on health. This has particularly concerned diet and nutrition, although it has crossed over into the territory of mental health with neurocognitive issues, addiction, autism, and much else, with my personal concern being that of depression. The point of this post is to consider some of the historical background. Before I get to that, let me explain how my recent interests have developed.

What got me heading in this direction was the documentary The Magic Pill. It’s about the paleo diet. The practical advice was worth the time spent, though other things drew me into the larger arena of low-carb debate. The appeal of the paleo diet is that it offers a framework of understanding that draws on many scientific fields involving health beyond diet alone, and it also explores historical records, anthropological research, and archaeological evidence. The paleo diet community in particular, along with the low-carb diet community in general, is also influenced by the traditional foods approach of Sally Fallon Morrell. She is the lady who, more than anyone else, popularized the work of Weston A. Price, an early 20th century dentist who traveled the world and studied traditional populations. I was already familiar with this area from having read Morrell’s first book in the late ’90s or early aughts.

New to me were the writings of Gary Taubes and Nina Teicholz, two science journalists who have helped to shift the paradigm in nutritional studies. They accomplished this not only by presenting detailed surveys of the research and other evidence but also by contextualizing the history of powerful figures, institutions, and organizations that shaped the modern industrial diet. I didn’t realize how far back this debate went: writings on fasting for epilepsy are found in ancient texts; recommendations of a low-carb (apparently ketogenic) diet for diabetes appeared in the 1790s; various low-carb and animal-based diets were popularized for weight loss and general health during the 19th century; and the ketogenic diet was studied for epilepsy beginning in the 1920s. Yet few know this history.

Ancel Keys was one of those powerful figures who, in suppressing his critics and silencing debate, effectively advocated for the standard American diet of high carbs, grains, fruits, vegetables, and industrial seed oils. In The Magic Pill, more recent context is given in following the South African trial of Tim Noakes. Other documentaries have covered this kind of material, often with interviews with Gary Taubes and Nina Teicholz. There has been immense drama involved and, in the past, there was also much public disagreement and discussion. Only now is that returning to mainstream awareness in the corporate media, largely because social media has forced it out into the open. But what interests me is how old the debate is and how much livelier it often was in the past.

The post-revolutionary era created a sense of crisis that, by the mid-19th century, was becoming a moral panic. The culture wars were taking shape. The difference back then was that there was much more of a sense of the connection between physical health, mental health, moral health, and societal health. As a broad understanding, health was seen as key, and this was informed by the developing scientific consciousness and the free speech movement. The hunger for knowledge was hard to suppress, although there were many attempts as the century went on. I tried to give a sense of this period in two massive posts, The Crisis of Identity and The Agricultural Mind. It’s hard to imagine what that must’ve been like. That scientific and public debate was largely shut down around the World War era, as the oppressive Cold War era took over. Why?

It is strange. The work of Taubes and Teicholz hints at what changed, although the original debate was much wider than diet and nutrition. The info I’ve found about the past has largely come from scholarship in other fields, such as historical and literary studies. Those older lines of thought are mostly treated as historical curiosities at this point, background info for the analysis of entirely other subjects. As for the majority of scientists, doctors, and nutritionists these days, they are almost entirely ignorant of the ideologies that shaped modern thought about disease and health.

This is seen, as I point out, in how Galen’s ancient Greek theory of humors, as incorporated into Medieval Christianity, appears to be the direct source of the basic arguments for a plant-based diet, specifically in terms of the scapegoating of red meat, saturated fat, and cholesterol. Among what I’ve come across, the one scholarly book that covers this in detail is Food and Faith in Christian Culture, edited by Ken Albala and Trudy Eden. Bringing that into present times, Belinda Fettke dug up how so much of contemporary nutritional studies and dietary advice was built on the foundation of 19th-20th century vegan advocacy by the Seventh Day Adventists. I’ve never met anyone adhering to “plant-based” ideology who knows this history. Yet now it is becoming common knowledge in the low-carb world.

On the literary end of things, there is a fascinating work by Bryan Kozlowski, The Jane Austen Diet. I enjoyed reading it, in spite of never having cracked open a book by Jane Austen. Kozlowski, although no scholar, was able to dredge up much of interest about those post-revolutionary decades in British society. For one, he shows how obesity was becoming noticeable all the way back then and how many were aware of the benefits of low-carb diets. He also makes clear that the ability to maintain a vegetable garden was a sign of immense wealth, not a means for putting much food on the tables of the poor; this is corroborated by Teicholz’s discussion of how gardening in American society, prior to modern technology and chemicals, was difficult and not dependable. More importantly, Kozlowski’s book explains what ‘sensibility’ meant back then, related to ‘nerves’ and ‘vapors’ and later given the more scientific-sounding label of ‘neurasthenia’.

I came across another literary example of historical exegesis about health and diet, Sander L. Gilman’s Franz Kafka, the Jewish Patient. Kafka was an interesting case, as a lifelong hypochondriac who, it turns out, had good reason to be. He felt that he had inherited a weak constitution, which he connected to his psychological troubles, but more likely causes were urbanization, industrialization, and a vegetarian diet that was probably also high-carb and based on nutrient-depleted processed foods, in an era before industrial foods were fortified and nutritional supplements were widely available.

What was most educational about the text, though, were Gilman’s historical details on tuberculosis in European thought, specifically in relationship to Jews. To some extent, Kafka had internalized racial ideology, and that is unsurprising. Eugenics was in the air and racial ideology penetrated everything, especially health in terms of racial hygiene. Even for those who weren’t eugenicists, all debate of that era was marked by the expected biases and limitations. Some theorizing was better than others and certainly not all of it was racist, but perhaps the entire debate was tainted by the events that would follow. With the defeat of the Nazis, eugenics fell out of favor for obvious reasons and an entire era of debate was silenced, even many of the arguments that were opposed to or separate from eugenics. Then historical amnesia set in, as many people wanted to forget the past and instead focus on the future. That was unfortunate. The past doesn’t simply disappear but continues to haunt us.

That earlier debate was a struggle between explanations and narratives. With modernity fully taking hold, people wanted to understand what was happening to humanity and where it was heading. It was a time of contrasts which made the consequences of modernity quite stark. There were plenty of communities that were still pre-industrial, rural, and traditional, but since then most of these communities have died away. The diseases of civilization, at this point, have become increasingly normalized as living memory of anything else has disappeared. It’s not that the desire for ideological explanations has disappeared. What happened was, with the victory of WWII, a particular grand narrative came to dominate the entire Western world and there simply were no other grand narratives to compete with it. Much of the pre-war debate and even scientific knowledge, especially in Europe, was forgotten as the records of it were destroyed, weren’t translated, or lost perceived relevance.

Nonetheless, all of those old ideological conflicts were left unresolved. The concerns then are still concerns now. So many problems worried about back then are getting worse. The connections between various aspects of health have regained their old sense of urgency. The public is once again challenging authorities, questioning received truths, and seeking new meaning. The debate never ended and here we are again, and one could add that fascism is also back rearing its ugly head. It’s worrisome that the political left seems to be slow on the uptake. There are reactionary right-wingers like Jordan Peterson who are offering visions of meaning and who have also become significant figures in the dietary world, by way of the carnivore diet he and his daughter are on. Then there are the conspiratorial paleo-libertarians such as Tristan Haggard, another carnivore advocate.

This is far from being limited to carnivory and the low-carb community includes those across the political spectrum, but it seems to be the right-wingers who are speaking the loudest. The left-wingers who are speaking out on diet come from the confluence of veganism/vegetarianism and environmentalism, as seen with EAT-Lancet (Dietary Dictocrats of EAT-Lancet). The problem with this, besides much of this narrative being false (Carnivore is Vegan), is that it is disconnected from the past. The right-wing is speaking more to the past than is the left-wing, such as Trump’s ability to invoke and combine the Populist and Progressive rhetoric from earlier last century. The political left is struggling to keep up and is being led down ideological dead-ends.

If we want to understand our situation now, we better study carefully what was happening in centuries past. We are having the same old debates without realizing it and we very well might see them lead to the same kinds of unhappy results.

Is California a Canary in the Coal Mine?

About present ecological problems in the Golden State, Patrice Aymes presented her own take on what is going on (Burn California, Burn… The Price of Hypocrisy?). Her perspective is that of a Californian, apparently writing from the Central Valley in Northern California. She argues that the main problem is urban sprawl. Based on that working hypothesis, she speculates the situation could be remedied by simply enforcing denser urbanization and so disincentivizing large houses in areas that are difficult to protect against fire. Besides that, she also thinks better resource management would help. Let’s look at the data to get a sense of the challenge, data that to my mind is shocking. The Californian population is immense and growing, which complicates any attempt at resource management. And climate change makes everything worse.

My take on the situation is, in some ways, simpler than the suggestion of reforming the system and restructuring housing. No matter how you dice it, the population is simply too large for the ecological constraints of California. It’s a variation on, if maybe a less extreme version of, the Dust Bowl. There was a wetter period that attracted people to California. Also, as in earlier times, the Federal government encouraged people to move West. But the wet period inevitably didn’t last and the weather patterns returned to their historical norm. This was exacerbated in California. Franklin Delano Roosevelt implemented federal farm subsidies in California before they were ever used anywhere else in the country. Along with diverting water from other states, this created a big ag that otherwise wouldn’t have been possible. Yet too much profit and too many powerful lobbyist groups are invested in maintaining a status quo that, in the long term, cannot be maintained.

The purpose of artificially constructing this big ag was partly to feed the growing population (further promoted by the Nixon administration, guided by the corporatist vision of Earl Butz). And a large reason for that was that the Federal government needed a massive workforce to be employed in the defense industry so that the United States military could have a presence on the West Coast. This defense industry also funded decades of the tech industry. Much (most?) of the Californian economy is, directly or indirectly, connected to and dependent on the military-industrial complex. This has brought immense wealth into the state and so created a wealthy class demanding luxury. They live beyond their means through taxpayer money and externalized costs. California, as it is presently structured, would not exist if not for the intervening alliance of big gov and big biz.

Even if urban sprawl were eliminated and housing concentrated, the same basic ecological problems would remain without solution. It’s likely to get worse. As with large areas of Australia, there probably will be a mass exodus from California until the declining population reaches a sustainable size. But the motivation for that change will require mass crisis and catastrophe. That is my sense of things, anyway. These are just my thoughts. I can defend parts of my argument. I’ve written about the emergence of big ag in California and its interesting history. The military-industrial complex, in California as elsewhere, is not only interesting but concerning. (See: Fascism, Corporatism, and Big Ag, From Progressivism to Neoconservatism, Vicious Cycle: The Pentagon Creates Tech Giants and Then Buys their Services, & Plutocratic Mirage of Self-Made Billionaires.) All of that, from what I can tell, consists of pretty much straightforward facts that are well established and agreed upon.

As an example of hard-hitting data: “About 60 percent of all precipitation evaporates or is transpired by trees and vegetation” (Water Education Foundation, California Water 101); still, California receives a fair amount of precipitation… but: “There’s a catch. While parts of Northern California receive 100 inches or more of precipitation per year, the state’s southern, drier areas receive less precipitation – and just a few inches of rain annually in the desert regions. That means 75 percent of California’s available water is in the northern third of the state (north of Sacramento), while 80 percent of the urban and agricultural water demands are in the southern two-thirds of the state.” Consider that 80% of California’s surface water is used by the agricultural industry, whereas the average water usage for urban areas is only 10%. Besides draining aquifers, the state has lost “as much as 90 percent of the original wetlands acreage—a greater percentage of loss than any other state in the nation” (Water Education Foundation, Wetlands).

As for water appropriated from the Colorado River, there is competition for it from many other states with their own agricultural needs and growing populations. The part about how much population could be supported through the local environmental resources is more speculative. A strong case against sustainability, though, can be and has been made. Many others have written about it. If you do a web search, you can find numerous scientific papers and news reporting on the relationship of water shortage and overpopulation in California, including comparisons to the Dust Bowl. (See: Water Use in California by Jeffrey Mount & Ellen Hanak, The California Water Crisis: More Than Just Another Drought from Calsense, & California faces ‘Dust Bowl’-like conditions amid drought, says climate tracker by Chris Megerian.)

My comments have been about all of California, not limited to one region. A fairly small proportion of the Californian population lives north of the Bay Area. Maybe that area has a sustainable population. The greatest population concentration in Northern California is the Bay Area. But even if you look at all of Northern California including the Bay Area, that is only 15 million people compared to the 25 million in Southern California. So, Northern California holds far less than half of the population of the state, and the Bay Area alone is half the population of Northern California. Northern California minus the Bay Area is less than 18% of the total population. When I traveled across California, what stood out to me was not only that the Southern half had a larger population but also that it was more densely populated, although I don’t know how it compares in terms of urban concentration (specifically relative to the Bay Area and Central Valley). Northern California seemed relatively empty, as large swaths of it weren’t inhabited. My observations are cursory, though. Besides the Bay Area, the urban areas I saw were smaller.

The Central Valley, which includes multiple cities, has only 6.5 million people, but as a comparison even that is larger than 39 other states and territories in the US (much larger than many farm states, and about 12 times the population of the least populated state). There are only 16 states, excluding California itself, that have more population than the Central Valley, and the Central Valley is one of the least populated areas of California. That is in the context of California being the most populated state in the country. To really emphasize the massive population we’re talking about, the Central Valley is larger than 124 countries in the world, Northern California is larger than 160 countries, and all of California is larger than 197 countries. Only 35 countries in the world have more inhabitants than California. Such an immense number of people crammed together in such a small area, with or without urban sprawl, is hard to imagine and comprehend, specifically in terms of the implications and effects. Data can barely convey the immensity of the ecological challenge.
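To make those back-of-the-envelope shares easy to check, here is a minimal Python sketch using the rough figures cited above. The Bay Area figure (about 8 million) is my own assumption for illustration; the other numbers come straight from the estimates in the preceding paragraphs.

```python
# Back-of-the-envelope check of the population shares discussed above.
# All figures are in millions and approximate; the Bay Area number
# (~8 million) is an assumption added for illustration, not from the text.

northern_ca = 15.0      # Northern California incl. the Bay Area (from the text)
southern_ca = 25.0      # Southern California (from the text)
bay_area = 8.0          # Bay Area (assumed for illustration)
central_valley = 6.5    # Central Valley (from the text)

state_total = northern_ca + southern_ca  # roughly 40 million

print(f"State total: ~{state_total:.0f} million")
print(f"Northern CA share of state: {northern_ca / state_total:.1%}")
print(f"Bay Area share of Northern CA: {bay_area / northern_ca:.1%}")
print(f"Northern CA minus Bay Area, share of state: {(northern_ca - bay_area) / state_total:.1%}")
print(f"Central Valley share of state: {central_valley / state_total:.1%}")
```

Run as-is, this prints a Northern California share of about 37.5 percent, a Bay Area share of Northern California of roughly half, and a Northern-California-minus-Bay-Area share of about 17.5 percent, which lines up with the “far less than half,” “half,” and “less than 18%” claims above.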

That brings us to carrying capacity. California is one of the drier places in the United States (it is among the top 10 states for low precipitation, and 5 of the 9 largest American cities averaging less than 20 inches of rain per year are in California). There are many other states that have far more water than California, even though no state has more residents. This is why California is dependent on taking water from other states, specifically the Colorado River, and even then California is also draining its own aquifers faster than they can be refilled. Sure, using resources more wisely would help, but that can only go so far. It’s unclear what the carrying capacity is for the entire planet, and some argue we’ve already overshot the maximum population load, an argument I’ve found persuasive or at least a point of serious concern. The larger complication involves the repercussions of going beyond the carrying capacity, in that the full externalized costs wouldn’t show up for decades or even generations. As such, if we’ve already traipsed past this breaking point sometime in these past decades, we might not be forced to acknowledge this stark reality until later in the century when the bill finally comes due.

It’s all rather speculative, as I said. But we do know that climate change is irreversible at this point. The melting of ice is a half century ahead of schedule, according to many predictions. It’s happening far more quickly than expected. Large parts of the world are experiencing droughts and are draining their aquifers, which exacerbates desertification. Even the 100th Meridian is moving eastward and drying out what used to be some of the most productive farmland on the planet, the region that has been the world’s breadbasket. My own attitude is that of the precautionary principle. I see no advantage to seeing how close we can get to the carrying capacity of any particular area, or of the whole planet, before going too far. But ignoring that, it’s possible that the carrying capacity could be extended a bit more, if we find more sustainable ways of living. Maybe or maybe not. As always, time will tell.

* * *

As a related issue, maybe one should consider the importance of trees and the dire situation of their loss as related to climate change, in California and elsewhere:

Creeping toward Permanent Drought
by Kate Marvel

An American tragedy: why are millions of trees dying across the country?
by Oliver Milman & Alan Yuhas

California’s Trees Are Dying At A Catastrophic Rate
by Laura Geiser & Mette Lampcov

18 Million Trees Died in California in 2018, Forest Service Study Finds
by Ron Brackett

California’s Drought Killed Almost 150 Million Trees
by Jason Daley

150 million trees died in California’s drought, and worse is to come
by Nathanael Johnson

California has 149 million dead trees ready to ignite like a matchbook
by Umair Irfan

The hard truth about being a 21st century tree in California
by Mark Kaufman

Can the Los Angeles We Know Survive the Death of Its Trees?
by Brandon R. Reynolds

Scientists: Future of oldest tree species on Earth in peril
by Scott Smith

Earth’s Oldest Trees in Climate-Induced Race up the Tree Line
by Kat Kerlin

Moral Panic and Physical Degeneration

From the beginning of the country, there has been an American fear of moral and mental decline that was always rooted in the physical, involving issues of vitality of land and health of the body, and built on an ancient divide between the urban and the rural. Over time, it grew into a fever pitch of moral panic about degeneration and degradation of the WASP culture, the white race, and maybe civilization itself. Some thought the end was near, that perhaps they could hold out for another few generations before finally succumbing to disease and weakness. The need for revitalization and rebirth became a collective project (Jackson Lears, Rebirth of a Nation), which sadly fed into ethno-nationalist bigotry and imperialistic war-mongering: Make America Great Again!

A major point of crisis, of course, was the Civil War. Racial ideology became predominant, not only because of slavery but maybe more so because of mass immigration, the latter being the main reason the North won. Racial tensions merged with the developing scientific mindset of Darwinism, and out of this mix came eugenics. For all that we can now dismiss this kind of simplistic ignorance and with hindsight see the danger it led to, the underlying anxieties were real. Urbanization and industrialization were having an obvious impact on public health that was observed by many, and it wasn’t limited to mere physical ailments. “Cancer, like insanity, seems to increase with the progress of civilization,” noted Stanislas Tanchou, a mid-19th century French physician.

The diseases of civilization, including mental sickness, have been spreading for centuries (millennia, actually, considering the ‘modern’ chronic health conditions were first detected in the mummies of the agricultural Egyptians). Consider how talk of depression suddenly showed up in written accounts with the ending of feudalism (Barbara Ehrenreich, Dancing in the Streets). That era included the enclosure movement that forced millions of newly landless serfs into the desperate conditions of crowded cities and colonies where they faced stress, hunger, malnutrition, and disease. The loss of rural life hit Europe much earlier than America, but it eventually came here as well. The majority of white Americans were urban by the beginning of the 20th century and the majority of black Americans were urban by the 1970s. There has been a consistent pattern of mass problems following urbanization, everywhere it happens. It still is happening. The younger generation, more urbanized than any generation before, is seeing rising rates of psychosis that are specifically concentrated in the most urbanized areas.

In the United States, it was the last decades of the 19th century that were the turning point, the period of the first truly big cities. Into this milieu, Weston A. Price was born (1870) in a small rural village in Canada. As an adult, he became a dentist and sought work in Cleveland, Ohio (1893). Initially, most of his patients probably had, like him, grown up in rural areas. But over the decades, he was increasingly exposed to younger generations who had spent their entire lives in the city. Lierre Keith puts Price’s early observations in context, after pointing out that he started his career in 1893: “This date is important, as he entered the field just prior to the glut of industrial food. Over the course of the next thirty years, he watched children’s dentition — and indeed their overall health deteriorate. There were suddenly children whose teeth didn’t fit in their mouths, children with foreshortened jaws, children with lots of cavities. Not only were their dental arches too small, but he noticed their nasal passages were also too narrow, and they had poor health overall; asthma, allergies, behavioral problems” (The Vegetarian Myth, p. 187). This was at a time when the industrialization of farming and food had reached a new level, far beyond the limited availability of canned foods in the mid-to-late 1800s, when most Americans still relied on a heavy amount of wild-sourced meat, fish, nuts, etc. Even city-dwellers in early America had ready access to wild game because of the abundance of surrounding wilderness areas. In fact, in the 19th century, the average American ate more meat (mostly hunted) than bread.

We are once again coming back to the ever-recurrent moral panic about the civilizational project. The same fears given voice in the late 19th to early 20th century are being repeated again. For example, Dr. Leonard Sax alerts us to how girls are sexually maturing early (1% of female infants showing signs of puberty), whereas boys are maturing later. As a comparison, hunter-gatherers don’t have such a large gender disparity in puberty, nor do girls experience puberty so early; instead both genders typically come to puberty around 18 years old, with sex, pregnancy, and marriage happening more or less simultaneously. Dr. Sax, along with others, speculates about a number of reasons. Common causes held responsible include health factors, from diet to chemicals. Beyond altered puberty, many other examples could be added: heart disease, autoimmune disorders, mood disorders, autism, ADHD, etc., all of them increasing and worsening with each generation (e.g., type 2 diabetes used to be known as adult-onset diabetes but now is regularly diagnosed in young children; the youngest victim recorded recently was three years old when diagnosed).

In the past, Americans responded to moral panic with genocide of Native Americans, Prohibition targeting ethnic (hyphenated) Americans and the poor, and immigration restrictions to keep the bad sort out; the spread of racism and vigilantism such as the KKK, Jim Crow, sundown towns, and redlining; forced assimilation such as English-only laws and public schools; internment camps for not only Japanese-Americans but also German-Americans and Italian-Americans; implementation of citizen-making projects like national park systems, Boy Scouts, the WPA, and the CCC; promotion of eugenics, the war on poverty (i.e., war on the poor), imperial expansionism, neo-colonial exploitation, and world wars; et cetera. The cure sought was often something to be forced onto the population by a paternalistic elite, that is to say rich white males, most specifically WASPs of the capitalist class.

Eugenics was, of course, one of the main focuses as it carried the stamp of science (or rather scientism). Yet at the same time, there were those challenging biological determinism and race realism, as views shifted toward environmental explanations. The anthropologists were at the front lines of this battle, but there were also Social Christians who changed their minds after having seen poverty firsthand. Weston A. Price, however, didn’t come to this from a consciously ideological position or religious motivation. He was simply a dentist who couldn’t ignore the severe health issues of his patients. So, he decided to travel the world in order to find healthy populations to study, in the hope of explaining why the change had occurred (Nutrition and Physical Degeneration).

Although Price was familiar with the eugenics literature, what he observed in ‘primitive’ communities (including isolated villages in Europe) did not conform to eugenicist thought. It didn’t matter which population he looked at. Those who ate traditional diets were healthy and those who ate an industrialized Western diet were not. And it was a broad pattern that he saw everywhere he went, covering not only physical health but also neurocognitive health as indicated by happiness, low anxiety, and moral character. Instead of blaming individuals or races, he saw the common explanation as nutrition, and he made a strong case by scientifically analyzing the nutrition of available foods.

In reading about traditional foods, the paleo diet/lifestyle, and functional medicine, Price’s work comes up quite often. He took many photographs that compared people from healthy and unhealthy populations. The contrast is stark. But what really stands out is how few people in the modern world look anywhere close to as healthy as those from the healthiest societies of the past. I live in a fairly wealthy college and medical town where there is a far above average concern for health, along with access to healthcare. Even so, I now can’t help noticing how many people around me show signs of stunted or perturbed development of the exact kind Price observed in great detail: thin bone structure, sunken chests, sloping shoulders, narrow facial features, asymmetry, etc. That is even with modern healthcare correcting some of the worst conditions: cavities, underbites, pigeon toes, etc. My fellow residents in this town are among the most privileged people in the world and, nonetheless, their state of health is a sad commentary on humanity at present.

It makes me wonder, as it made Price wonder, what consequences this has on neurocognitive health for individuals and the moral health of society. Taken alone, it isn’t enough to get excited about. But put in a larger context of looming catastrophes and it does become concerning. It’s not clear that our health will be up to the task of the problems we need to solve. We are a sickly population, far more sickly than when moral panic took hold in past generations.

Just as important, there is the personal component. I’m at a point where I’m not going to worry too much about the decline and possible collapse of civilization. I’m kind of hoping the American Empire will meet its demise. Still, that leaves us with many who suffer, no matter what happens to society as a whole. I take that personally, as one who has struggled with physical and mental health issues. And I’ve come around to Price’s view of nutrition as being key. I see these problems in other members of my family and it saddens me to watch as health conditions seem to get worse from one generation to the next.

It’s far from being a new problem, which is the central point I’m trying to make here. Talking to my mother, I find she has a clear sense of the differences between the two sides of her family. Her mother’s family came from rural areas and, even after moving to a larger city for work, they continued to hunt on a daily basis, as there were nearby fields and woods that made that possible. They were a healthy, happy, and hard-working lot. They got along well as a family. Her father’s side of the family was far different. They had been living in towns and cities for several generations by the time she was born. They didn’t hunt at all. They were known for being surly, holding grudges, and being mean drunks. They also had underbites (i.e., underdeveloped jaw structure) and seemed to have had learning disabilities, though no one was diagnosing such conditions back then. Related to this difference, my mother’s father raised rabbits whereas my mother’s mother’s family hunted rabbits (and other wild game). This makes a big difference in terms of nutrition, as wild game has higher levels of omega-3 fatty acids and fat-soluble vitamins, all of which are key to optimal health and development.

What my mother observed in her family is basically the same as what Price observed in hundreds of communities in multiple countries on every continent. And I now observe the same pattern repeating. I grew up with an underbite. My brothers and I all required orthodontic work, as do so many now. I was diagnosed with a learning disability when young. My oldest brother had, if maybe not a learning disability, apparent behavioral issues when he was young, likely related to his mildew allergies and probably an underlying autoimmune condition. I know I had food allergies as a child, as I think my other brother did as well. All of us have had neurocognitive and psychological issues of a fair diversity, besides learning disabilities: stuttering, depression, anxiety, and maybe some Asperger’s.

Now another generation is coming along with increasing rates of major physical and mental health issues. My nieces and nephews are sick all the time. They don’t eat well and are probably malnourished. During a medical checkup for my nephew, my mother asked the doctor about his extremely unhealthy diet, consisting mostly of white bread and sugar. The doctor bizarrely dismissed it as ‘normal’ in that, as she claimed, no kid eats healthy. If that is the new normal, maybe we should be in a moral panic.

* * *

Violent Behavior: A Solution in Plain Sight
by Sylvia Onusic

Nutrition and Mental Development
by Sally Fallon Morell

You Are What You Eat: The Research and Legacy of Dr. Weston Andrew Price
by John Larabell

While practicing in his Cleveland office, Dr. Price noticed an increase in dental problems among the younger generations. These issues included the obvious dental caries (cavities) as well as improper jaw development leading to crowded, crooked teeth. In fact, the relatively new orthodontics industry was at that time beginning to gain popularity. Perplexed by these modern problems that seemed to be affecting a greater and greater portion of the population, Dr. Price set about to research the issue by examining people who did not display such problems. He suspected (correctly, as he would later find) that many of the dental problems, as well as other degenerative health problems, that were plaguing modern society were the result of inadequate nutrition owing to the increasing use of refined, processed foods.

Nasty, Brutish and Short?
by Sally Fallon Morell

It seems as if the twentieth century will exit with a crescendo of disease. Things were not so bad back in the 1930’s, but the situation was already serious enough to cause one Cleveland, Ohio dentist to be concerned. Dr. Weston Price was reluctant to accept the conditions exhibited by his patients as normal. Rarely did an examination of an adult patient reveal anything but rampant decay, often accompanied by serious problems elsewhere in the body, such as arthritis, osteoporosis, diabetes, intestinal complaints and chronic fatigue. (They called it neurasthenia in Price’s day.) But it was the dentition of younger patients that alarmed him most. Price observed that crowded, crooked teeth were becoming more and more common, along with what he called “facial deformities”: overbites, narrowed faces, underdevelopment of the nose, lack of well-defined cheekbones and pinched nostrils. Such children invariably suffered from one or more complaints that sound all too familiar to mothers of the 1990’s: frequent infections, allergies, anemia, asthma, poor vision, lack of coordination, fatigue and behavioral problems. Price did not believe that such “physical degeneration” was God’s plan for mankind. He was rather inclined to believe that the Creator intended physical perfection for all human beings, and that children should grow up free of ailments.

Is it Mental or is it Dental?
by Raymond Silkman

The widely held model of orthodontics, which considers developmental problems in the jaws and head to be genetic in origin, never made sense to me. Since they are wedded to the genetic model, orthodontists dealing with crowded teeth end up treating the condition with tooth extraction in a majority of the cases. Even though I did not resort to pulling teeth in my practice, and I was using appliances to widen the jaws and getting the craniums to look as they should, I still could not come up with the answer as to why my patients looked the way they did. I couldn’t believe that the Creator had given them a terrible blueprint; it just did not make sense. In four years of college education, four years of dental school education and almost three years of post-graduate orthodontic training, students never hear a mention of Dr. Price, so they never learn the true reasons for these malformations. I have had the opportunity to work with a lot of very knowledgeable doctors in various fields of allopathic and alternative healthcare who still do not know about Dr. Price and his critical findings.

These knowledgeable doctors have not stared in awe at the beautiful facial development that Price captured in the photographs he took of primitive peoples throughout the globe and in so doing was able to answer this most important question: What do humans look like in health? And how have humans been able to carry on throughout history and populate such varied geographical and physical environments on the earth without our modern machines and tools?

The answer that Dr. Price was able to illuminate came through his photographs of beautiful, healthy human beings with magnificent physical form and mental development, living in harmony with their environments. […]

People who are not well oxygenated and who have poor posture often suffer from fatigue and fibromyalgia symptoms, they snore and have sleep apnea, they have sinusitis and frequent ear infections. Life becomes psychologically and physically challenging for them and they end up with long-term dependence on medications—and all of that just from the seemingly simple condition of crowded teeth.

In other words, people with poor facial development are not going to live very happily. […]

While very few people have heard of the work of Weston Price these days, we haven’t lost our ability to recognize proper facial form. To make it in today’s society, you must have good facial development. You’re not going to see a general or a president with a weak chin, you’re not going to see coaches with weak chins, you’re not going to see a lot of well-to-do personalities in the media with underdeveloped faces and chins. You don’t see athletes and newscasters with narrow palates and crooked teeth.

Weston A. Price: An Unorthodox Dentist
by Nourishing Israel

Price discovered that the native foods eaten by the isolated populations were far more nutrient dense than the modern foods. In the first generation that changed their diet there was noticeable tooth decay; in subsequent generations the dental and facial bone structure changed, as well as other changes that were seen in American and European families and previously considered to be the result of interracial marriage.

By studying the different routes that the same populations had taken – traditional versus modern diet – he saw that the health of the children is directly related to the health of the parents and the germ plasms that they provide, and are as important to the child’s makeup as the health of the mother before and during pregnancy.

Price also found that primitive populations were very conscious of the importance of the mothers’ health and many populations made sure that girls were given a special diet for several months before they were allowed to marry.

Another interesting finding was that although genetic makeup was important, it did not have as great a degree of influence on a person’s development and health as was thought, but that a lot of individual characteristics, including brain development and brain function, were due to environmental influence, what he called “intercepted heredity”.

The origin of personality and character appear in the light of the newer data to be biologic products and to a much less degree than usually considered pure hereditary traits. Since these various factors are biologic, being directly related to both the nutrition of the parents and to the nutritional environment of the individuals in the formative and growth period any common contributing factor such as food deficiencies due to soil depletion will be seen to produce degeneration of the masses of people due to a common cause. Mass behavior therefore, in this new light becomes the result of natural forces, the expression of which may not be modified by propaganda but will require correction at the source. [1] …

It will be easy for the reader to be prejudiced since many of the applications suggested are not orthodox. I suggest that conclusions be deferred until the new approach has been used to survey the physical and mental status of the reader’s own family, of his brothers and sisters, of associated families, and finally, of the mass of people met in business and on the street. Almost everyone who studies the matter will be surprised that such clear-cut evidence of a decline in modern reproductive efficiency could be all about us and not have been previously noted and reviewed.[2]

From Nutrition and Physical Degeneration by Weston Price

Food Freedom – Nourishing Raw Milk
by Lisa Virtue

In 1931 Price visited the people of the Loetschental Valley in the Swiss Alps. Their diet consisted of rye bread, milk, cheese and butter, including meat once a week (Price, 25). The milk was collected from pastured cows, and was consumed raw: unpasteurized, unhomogenized (Schmid, 9).

Price described these people as having “stalwart physical development and high moral character…superior types of manhood, womanhood and childhood that Nature has been able to produce from a suitable diet and…environment” (Price, 29). At this time, Tuberculosis had taken more lives in Switzerland than any other disease. The Swiss government ordered an inspection of the valley, revealing not a single case. No deaths had been recorded from Tuberculosis in the history of the Loetschental people (Schmid, 8). Upon return home, Price had dairy samples from the valley sent to him throughout the year. These samples were higher in minerals and vitamins than samples from commercial (thus pasteurized) dairy products in America and the rest of Europe. The Loetschental milk was particularly high in fat soluble vitamin D (Schmid, 9).

The daily intake of calcium and phosphorus, as well as fat soluble vitamins, would have been higher than that of the average North American child. These children were strong and sturdy, playing barefoot in the glacial waters into the late chilly evenings. Of all the children in the valley eating primitive foods, cavities were detected at an average of 0.3 per child (Price, 25). This without visiting a dentist or physician, for the valley had none, seeing as there was no need (Price, 23). To offer some perspective, the rate of cavities per child between the ages of 6-19 in the United States has been recorded to be 3.25, over 10 times the rate seen in Loetschental (Nagel).

Price offers some perspective on a society subsisting mainly on raw dairy products: “One immediately wonders if there is not something in the life-giving vitamins and minerals of the food that builds not only great physical structures within which their souls reside, but builds minds and hearts capable of a higher type of manhood…” (Price, 26).
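
As a quick check of that comparison, using only the figures already quoted above (0.3 cavities per child in the valley versus 3.25 in the US), the ratio works out to roughly eleven to one:

```python
# Sanity check on the cavity rates quoted above.
us_rate = 3.25      # cavities per child, ages 6-19, United States (as cited)
valley_rate = 0.3   # cavities per child, Loetschental Valley (as cited)
print(round(us_rate / valley_rate, 1))  # -> 10.8, i.e. "over 10 times the rate"
```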

100 Years Before Weston Price
by Nancy Henderson

Like Price, Catlin was struck by the beauty, strength and demeanor of the Native Americans. “The several tribes of Indians inhabiting the regions of the Upper Missouri. . . are undoubtedly the finest looking, best equipped, and most beautifully costumed of any on the Continent.” Writing of the Blackfoot and Crow, tribes who hunted buffalo on the rich glaciated soils of the American plains, “They are the happiest races of Indian I have met—picturesque and handsome, almost beyond description.”

“The very use of the word savage,” wrote Catlin, “as it is applied in its general sense, I am inclined to believe is an abuse of the word, and the people to whom it is applied.” […]

As did Weston A. Price one hundred years later, Catlin noted the fact that moral and physical degeneration came together with the advent of civilized society. In his late 1830s portrait of “Pigeon’s Egg Head (The Light) Going to and Returning from Washington” Catlin painted him corrupted with “gifts of the great white father” upon his return to his native homeland. Those gifts included two bottles of whiskey in his pockets. […]

Like Price, Catlin discusses the issue of heredity versus environment. “No diseases are natural,” he writes, “and deformities, mental and physical, are neither hereditary nor natural, but purely the result of accidents or habits.”

So wrote Dr. Price: “Neither heredity nor environment alone cause our juvenile delinquents and mental defectives. They are cripples, physically, mentally and morally, which could have and should have been prevented by adequate education and by adequate parental nutrition. Their protoplasm was not normally organized.”

The Right Price
by Weston A. Price Foundation

Many commentators have criticized Price for attributing “decline in moral character” to malnutrition. But it is important to realize that the subject of “moral character” was very much on the minds of commentators of his day. As with changes in facial structure, observers in the first half of the 20th century blamed “badness” in people on race mixing, or on genetic defects. Price quotes A.C. Jacobson, author of a 1926 publication entitled Genius (Some Revaluations),35 who stated that “The Jekyll-Hydes of our common life are ethnic hybrids.” Said Jacobson, “Aside from the effects of environment, it may safely be assumed that when two strains of blood will not mix well a kind of ‘molecular insult’ occurs which the biologists may some day be able to detect beforehand, just as blood is now tested and matched for transfusion.” The implied conclusion to this assertion is that “degenerates” can be identified through genetic testing and “weeded out” by sterilizing the unfit–something that was imposed on many women during the period and endorsed by powerful individuals, including Oliver Wendell Holmes.

It is greatly to Price’s credit that he objected to this arrogant point of view: “Most current interpretations are fatalistic and leave practically no escape from our succession of modern physical, mental and moral cripples. . . If our modern degeneration were largely the result of incompatible racial stocks as indicated by these premises, the outlook would be gloomy in the extreme.”36 Price argued that nutritional deficiencies affecting the physical structure of the body can also affect the brain and nervous system; and that while “bad” character may be the result of many influences–poverty, upbringing, displacement, etc.–good nutrition also plays a role in creating a society of cheerful, compassionate individuals.36

Rebirth of a Nation:
The Making of Modern America, 1877-1920
By Jackson Lears
pp. 7-9

By the late nineteenth century, dreams of rebirth were acquiring new meanings. Republican moralists going back to Jefferson’s time had long fretted about “overcivilization,” but the word took on sharper meaning among the middle and upper classes in the later decades of the nineteenth century. During the postwar decades, “overcivilization” became not merely a social but an individual condition, with a psychiatric diagnosis. In American Nervousness (1880), the neurologist George Miller Beard identified “neurasthenia,” or “lack of nerve force,” as the disease of the age. Neurasthenia encompassed a bewildering variety of symptoms (dyspepsia, insomnia, nocturnal emissions, tooth decay, “fear of responsibility, of open places or closed places, fear of society, fear of being alone, fear of fears, fear of contamination, fear of everything, deficient mental control, lack of decision in trifling matters, hopelessness”), but they all pointed to a single overriding effect: a paralysis of the will.

The malady identified by Beard was an extreme version of a broader cultural malaise—a growing sense that the Protestant ethic of disciplined achievement had reached the end of its tether, had become entangled in the structures of an increasingly organized capitalist society. Ralph Waldo Emerson unwittingly predicted the fin de siècle situation. “Every spirit makes its house,” he wrote in “Fate” (1851), “but afterwards the house confines the spirit.” The statement presciently summarized the history of nineteenth-century industrial capitalism, on both sides of the Atlantic.

By 1904, the German sociologist Max Weber could put Emerson’s proposition more precisely. The Protestant ethic of disciplined work for godly ends had created an “iron cage” of organizations dedicated to the mass production and distribution of worldly goods, Weber argued. The individual striver was caught in a trap of his own making. The movement from farm to factory and office, and from physical labor outdoors to sedentary work indoors, meant that more Europeans and North Americans were insulated from primary processes of making and growing. They were also caught up in subtle cultural changes—the softening of Protestantism into platitudes; the growing suspicion that familiar moral prescriptions had become mere desiccated, arbitrary social conventions. With the decline of Christianity, the German philosopher Friedrich Nietzsche wrote, “it will seem for a time as though all things had become weightless.”

Alarmists saw these tendencies as symptoms of moral degeneration. But a more common reaction was a diffuse but powerful feeling among the middle and upper classes—a sense that they had somehow lost contact with the palpitating actuality of “real life.” The phrase acquired unprecedented emotional freight during the years around the turn of the century, when reality became something to be pursued rather than simply experienced. This was another key moment in the history of longing, a swerve toward the secular. Longings for this-worldly regeneration intensified when people with Protestant habits of mind (if not Protestant beliefs) confronted a novel cultural situation: a sense that their way of life was being stifled by its own success.

On both sides of the Atlantic, the drive to recapture “real life” took myriad cultural forms. It animated popular psychotherapy and municipal reform as well as avant-garde art and literature, but its chief institutional expression was regeneration through military force. As J. A. Hobson observed in Imperialism (1902), the vicarious identification with war energized jingoism and militarism. By the early twentieth century, in many minds, war (or the fantasy of it) had become the way to keep men morally and physically fit. The rise of total war between the Civil War and World War I was rooted in longings for release from bourgeois normality into a realm of heroic struggle. This was the desperate anxiety, the yearning for rebirth, that lay behind official ideologies of romantic nationalism, imperial progress, and civilizing mission—and that led to the trenches of the Western Front.

Americans were immersed in this turmoil in peculiarly American ways. As the historian Richard Slotkin has brilliantly shown, since the early colonial era a faith in regeneration through violence underlay the mythos of the American frontier. With the closing of the frontier (announced by the U.S. census in 1890), violence turned outward, toward empire. But there was more going on than the refashioning of frontier mythology. American longings for renewal continued to be shaped by persistent evangelical traditions, and overshadowed by the shattering experience of the Civil War. American seekers merged Protestant dreams of spiritual rebirth with secular projects of purification—cleansing the body politic of secessionist treason during the war and political corruption afterward, reasserting elite power against restive farmers and workers, taming capital in the name of the public good, reviving individual and national vitality by banning the use of alcohol, granting women the right to vote, disenfranchising African-Americans, restricting the flow of immigrants, and acquiring an overseas empire.

Of course not all these goals were compatible. Advocates of various versions of rebirth—bodybuilders and Prohibitionists, Populists and Progressives, Social Christians and Imperialists—all laid claims to legitimacy. Their crusades met various ends, but overall they relieved the disease of the fin de siècle by injecting some visceral vitality into a modern culture that had seemed brittle and about to collapse. Yearning for intense experience, many seekers celebrated Force and Energy as ends in themselves. Such celebrations could reinforce militarist fantasies but could also lead in more interesting directions—toward new pathways in literature and the arts and sciences. Knowledge could be revitalized, too. William James, as well as Houdini and Roosevelt, was a symbol of the age.

The most popular forms of regeneration had a moral dimension.

pp. 27-29

But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:

“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”

This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.

This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.

pp. 63-71

NOT MANY AMERICAN men, even among the comparatively prosperous classes, were as able as Carnegie and Rockefeller to master the tensions at the core of their culture. Success manuals acknowledged the persistent problem of indiscipline, the need to channel passion to productive ends. Often the language of advice literature was sexually charged. In The Imperial Highway (1881), Jerome Bates advised:

[K]eep cool, have your resources well in hand, and reserve your strength until the proper time arrives to exert it. There is hardly any trait of character or faculty of intellect more valuable than the power of self-possession, or presence of mind. The man who is always “going off” unexpectedly, like an old rusty firearm, who is easily fluttered and discomposed at the appearance of some unforeseen emergency; who has no control over himself or his powers, is just the one who is always in trouble and is never successful or happy.

The assumptions behind this language are fascinating and important to an understanding of middle-and upper-class Americans in the Gilded Age. Like many other purveyors of conventional wisdom—ministers, physicians, journalists, health reformers—authors of self-help books assumed a psychic economy of scarcity. For men, this broad consensus of popular psychology had sexual implications: the scarce resource in question was seminal fluid, and one had best not be diddling it away in masturbation or even nocturnal emissions. This was easier said than done, of course, as Bates indicated, since men were constantly addled by insatiable urges, always on the verge of losing self-control—the struggle to keep it was an endless battle with one’s own darker self. Spiritual, psychic, and physical health converged. What Freud called “‘civilized’ sexual morality” fed directly into the “precious bodily fluids” school of health management. The man who was always “‘going off’ unexpectedly, like an old rusty firearm,” would probably be sickly as well as unsuccessful—sallow, sunken-chested, afflicted by languorous indecision (which was how Victorian health literature depicted the typical victim of what was called “self-abuse”).

But as this profile of the chronic masturbator suggests, scarcity psychology had implications beyond familiar admonitions to sexual restraint. Sexual scarcity was part of a broader psychology of scarcity; the need to conserve semen was only the most insistently physical part of a much more capacious need to conserve psychic energy. As Bates advised, the cultivation of “self-possession” allowed you to “keep your resources well in hand, and reserve your strength until the proper time arrives to exert it.” The implication was that there was only so much strength available to meet demanding circumstances and achieve success in life. The rhetoric of “self-possession” had financial as well as sexual connotations. To preserve a cool, unruffled presence of mind (to emulate Rockefeller, in effect) was one way to stay afloat on the storm surges of the business cycle.

The object of this exercise, at least for men, was personal autonomy—the ownership of one’s self. […]

It was one thing to lament excessive wants among the working class, who were supposed to be cultivating contentment with their lot, and quite another to find the same fault among the middle class, who were supposed to be improving themselves. The critique of middle-class desire posed potentially subversive questions about the dynamic of dissatisfaction at the core of market culture, about the very possibility of sustaining a stable sense of self in a society given over to perpetual jostling for personal advantage. The ruinous results of status-striving led advocates of economic thrift to advocate psychic thrift as well.

By the 1880s, the need to conserve scarce psychic resources was a commonly voiced priority among the educated and affluent. Beard’s American Nervousness had identified “the chief and primary cause” of neurasthenia as “modern civilization,” which placed unprecedented demands on limited emotional energy. “Neurasthenia” and “nervous prostration” became catchall terms for a constellation of symptoms that today would be characterized as signs of chronic depression—anxiety, irritability, nameless fears, listlessness, loss of will. In a Protestant culture, where effective exercise of will was the key to individual selfhood, the neurasthenic was a kind of anti-self—at best a walking shadow, at worst a bedridden invalid unable to make the most trivial choices or decisions. Beard and his colleagues—neurologists, psychiatrists, and self-help writers in the popular press—all agreed that nervous prostration was the price of progress, a signal that the psychic circuitry of “brain workers” was overloaded by the demands of “modern civilization.”

While some diagnoses of this disease deployed electrical metaphors, the more common idiom was economic. Popular psychology, like popular economics, was based on assumptions of scarcity: there was only so much emotional energy (and only so much money) to go around. The most prudent strategy was the husbanding of one’s resources as a hedge against bankruptcy and breakdown. […]

Being reborn through a self-allowed regime of lassitude was idiosyncratic, though important as a limiting case. Few Americans had the leisure or the inclination to engage in this kind of Wordsworthian retreat. Most considered neurasthenia at best a temporary respite, at worst an ordeal. They strained, if ambivalently, to be back in harness.

The manic-depressive psychology of the business class mimicked the lurching ups and downs of the business cycle. In both cases, assumptions of scarcity underwrote a pervasive defensiveness, a circle-the-wagons mentality. This was the attitude that lay behind the “rest cure” devised by the psychiatrist Silas Weir Mitchell, who proposed to “fatten” and “redden” the (usually female) patient by isolating her from all mental and social stimulation. (This nearly drove the writer Charlotte Perkins Gilman crazy, and inspired her story “The Yellow Wallpaper.”) It was also the attitude that lay behind the fiscal conservatism of the “sound-money men” on Wall Street and in Washington—the bankers and bondholders who wanted to restrict the money supply by tying it to the gold standard. Among the middle and upper classes, psyche and economy alike were haunted by the common specter of scarcity. But there were many Americans for whom scarcity was a more palpable threat.

AT THE BOTTOM of the heap were the urban poor. To middle-class observers they seemed little more than a squalid mass jammed into tenements that were festering hives of “relapsing fever,” a strange malady that left its survivors depleted of strength and unable to work. The disease was “the most efficient recruiting officer pauperism ever had,” said a journalist investigating tenement life in the 1870s. Studies of “the nether side of New York” had been appearing for decades, but—in the young United States at least—never before the Gilded Age had the story of Dives and Lazarus been so dramatically played out, never before had wealth been so flagrant, or poverty been so widespread and so unavoidably appalling. The army of thin young “sewing-girls” trooping off in the icy dawn to sweatshops all over Manhattan, the legions of skilled mechanics forced by high New York rents to huddle with their families amid a crowd of lowlifes, left without even a pretense of privacy in noisome tenements that made a mockery of the Victorian cult of home—these populations began to weigh on the bourgeois imagination, creating concrete images of the worthy, working poor.

pp. 99-110

Racial animosities flared in an atmosphere of multicultural fluidity, economic scarcity, and sexual rivalry. Attitudes arising from visceral hostility acquired a veneer of scientific objectivity. Race theory was nothing new, but in the late nineteenth century it mutated into multiple forms, many of them characterized by manic urgency, sexual hysteria, and biological determinism. Taxonomists had been trying to arrange various peoples in accordance with skull shape and brain size for decades; popularized notions of natural selection accelerated the taxonomic project, investing it more deeply in anatomical details. The superiority of the Anglo-Saxon—according to John Fiske, the leading pop-evolutionary thinker—arose not only from the huge size of his brain, but also from the depth of its furrows and the plenitude of its creases. The most exalted mental events had humble somatic origins. Mind was embedded in body, and both could be passed on to the next generation.

The year 1877 marked a crucial development in this hereditarian synthesis: in that year, Richard Dugdale published the results of his investigation into the Juke family, a dull-witted crew that had produced more than its share of criminals and mental defectives. While he allowed for the influence of environment, Dugdale emphasized the importance of inherited traits in the Juke family. If mental and emotional traits could be inherited along with physical ones, then why couldn’t superior people be bred like superior dogs or horses? The dream of creating a science of eugenics, dedicated to improving and eventually even perfecting human beings, fired the reform imagination for decades. Eugenics was a kind of secular millennialism, a vision of a society where biological engineering complemented social engineering to create a managerial utopia. The intellectual respectability of eugenics, which lasted until the 1930s, when it became associated with Nazism, underscores the centrality of racialist thinking among Americans who considered themselves enlightened and progressive. Here as elsewhere, racism and modernity were twinned.

Consciousness of race increasingly pervaded American culture in the Gilded Age. Even a worldview as supple as Henry James’s revealed its moorings in conventional racial categories when, in The American (1877), James presented his protagonist, Christopher Newman, as a quintessential Anglo-Saxon but with echoes of the noble Red Man, with the same classical posture and physiognomy. There was an emerging kinship between these two groups of claimants to the title “first Americans.” The iconic American, from this view, was a blend of Anglo-Saxon refinement and native vigor. While James only hints at this, in less than a generation such younger novelists as Frank Norris and Jack London would openly celebrate the rude vitality of the contemporary Anglo-Saxon, proud descendant of the “white savages” who subdued a continent. It should come as no surprise that their heroes were always emphatically male. The rhetoric of race merged with a broader agenda of masculine revitalization.[…]

By the 1880s, muscular Christians were sweeping across the land, seeking to meld spiritual and physical renewal, establishing institutions like the Young Men’s Christian Association. The YMCA provided prayer meetings and Bible study to earnest young men with spiritual seekers’ yearnings, gyms and swimming pools to pasty young men with office workers’ midriffs. Sometimes they were the same young men. More than any other organization, the YMCA aimed to promote the symmetry of character embodied in the phrase “body, mind, spirit”—which a Y executive named Luther Gulick plucked from Deuteronomy and made the motto of the organization. The key to the Y’s appeal, a Harper’s contributor wrote in 1882, was the “overmastering conviction” of its members: “The world always respects manliness, even when it is not convinced [by theological argument]; and if the organizations did not sponsor that quality in young men, they would be entitled to no respect.” In the YMCA, manliness was officially joined to a larger agenda.

For many American Protestants, the pursuit of physical fitness merged with an encompassing vision of moral and cultural revitalization—one based on the reassertion of Protestant self-control against the threats posed to it by immigrant masses and mass-marketed temptation. […]

Science and religion seemed to point in the same direction: Progress and Providence were one.

Yet the synthesis remained precarious. Physical prowess, the basis of national supremacy, could not be taken for granted. Strong acknowledged in passing that Anglo-Saxons could be “devitalized by alcohol and tobacco.” Racial superiority could be undone by degenerate habits. Even the most triumphalist tracts contained an undercurrent of anxiety, rooted in the fear of flab. The new stress on the physical basis of identity began subtly to undermine the Protestant synthesis, to reinforce the suspicion that religion was a refuge for effeminate weaklings. The question inevitably arose, in some men’s minds: What if the YMCA and muscular Christianity were not enough to revitalize tired businessmen and college boys?

Under pressure from proliferating ideas of racial “fitness,” models of manhood became more secular. Despite the efforts of muscular Christians to reunite body and soul, the ideal man emerging among all classes by the 1890s was tougher and less introspective than his mid-Victorian predecessors. He was also less religious. Among advocates of revitalization, words like “Energy” and “Force” began to dominate discussion—often capitalized, often uncoupled from any larger frameworks of moral or spiritual meaning, and often combined with racist assumptions. […]

The emerging worship of force raised disturbing issues. Conventional morality took a backseat to the celebration of savage strength. After 1900, in the work of a pop-Nietzschean like Jack London, even criminality became a sign of racial vitality: as one of his characters says, “We whites have been land-robbers and sea-robbers from remotest time. It is in our blood, I guess, and we can’t get away from it.” This reversal of norms did not directly challenge racial hierarchies, but the assumptions behind it led toward disturbing questions. If physical prowess was the mark of racial superiority, what was one to make of the magnificent specimens of manhood produced by allegedly inferior races? Could it be that desk-bound Anglo-Saxons required an infusion of barbarian blood (or at least the “barbarian virtues” recommended by Theodore Roosevelt)? Behind these questions lay a primitivist model of regeneration, to be accomplished by incorporating the vitality of the vanquished, dark-skinned other. The question was how to do that and maintain racial purity.

pp. 135-138

Yet to emphasize the gap between country and the city was not simply an evasive exercise: dreams of bucolic stillness or urban energy stemmed from motives more complex than mere escapist sentiment. City and country were mother lodes of metaphor, sources for making sense of the urban-industrial revolution that was transforming the American countryside and creating a deep sense of discontinuity in many Americans’ lives during the decades after the Civil War. If the city epitomized the attraction of the future, the country embodied the pull of the past. For all those who had moved to town in search of excitement or opportunity, rural life was ineluctably associated with childhood and memory. The contrast between country and city was about personal experience as well as political economy. […]

REVERENCE FOR THE man of the soil was rooted in the republican tradition. In his Notes on the State of Virginia (1785), Jefferson articulated the antithesis that became central to agrarian politics (and to the producerist worldview in general)—the contrast between rural producers and urban parasites. “Those who labour in the earth are the chosen people of God, if ever he had a chosen people, whose breasts he has made his peculiar deposit for substantial and genuine virtue,” he announced. “Corruption of morals in the mass of cultivators is a phenomenon of which no age nor nation has furnished an example. It is the mark set on those, who not looking up to heaven, to their own soil and industry, as does the husbandman, for their subsistence, depend for it on the casualties and caprice of customers. Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the design of ambition.” Small wonder, from this view, that urban centers of commerce seemed to menace the public good. “The mobs of great cities,” Jefferson concluded, “add just so much to the support of pure government as sores do to the strength of the human body.” Jefferson’s invidious distinctions echoed through the nineteenth century, fueling the moral passion of agrarian rebels. Watson, among many, considered himself a Jeffersonian.

There were fundamental contradictions embedded in Jefferson’s conceptions of an independent yeomanry. Outside certain remote areas in New England, most American farmers were not self-sufficient in the nineteenth century—nor did they want to be. Many were eager participants in the agricultural market economy, animated by a restless, entrepreneurial spirit. Indeed, Jefferson’s own expansionist policies, especially the Louisiana Purchase, encouraged centrifugal movement as much as permanent settlement. “What developed in America,” the historian Richard Hofstadter wrote, “was an agricultural society whose real attachment was not to the land but to land values.” The figure of the independent yeoman, furnishing enough food for himself and his family, participating in the public life of a secure community—this icon embodied longings for stability amid a maelstrom of migration.

Often the longings were tinged with a melancholy sense of loss. […] For those with Jeffersonian sympathies, abandoned farms were disturbing evidence of cultural decline. As a North American Review contributor wrote in 1888: “Once let the human race be cut off from personal contact with the soil, once let the conventionalities and artificial restrictions of so-called civilization interfere with the healthful simplicity of nature, and decay is certain.” Romantic nature-worship had flourished fitfully among intellectuals since Emerson had become a transparent eye-ball on the Concord common and Whitman had loafed among leaves of grass. By the post–Civil War decades, romantic sentiment combined with republican tradition to foster forebodings. Migration from country to city, from this view, was a symptom of disease in the body politic. Yet the migration continued. Indeed, nostalgia for rural roots was itself a product of rootlessness. A restless spirit, born of necessity and desire, spun Americans off in many directions—but mainly westward. The vision of a stable yeomanry was undercut by the prevalence of the westering pioneer.

pp. 246-247

Whether energy came from within or without, it was as limitless as electricity apparently was. The obstacles to access were not material—class barriers or economic deprivation were never mentioned by devotees of abundance psychology—they were mental and emotional. The most debilitating emotion was fear, which cropped up constantly as the core problem in diagnoses of neurasthenia. The preoccupation with freeing oneself from internal constraints undermined the older, static ideal of economic self-control at its psychological base. As one observer noted in 1902: “The root cause of thrift, which we all admire and preach because it is so convenient to the community, is fear, fear of future want; and that fear, we are convinced, when indulged overmuch by pessimist minds is the most frequent cause of miserliness….” Freedom from fear meant freedom to consume.

And consumption began at the dinner table. Woods Hutchinson claimed in 1913 that the new enthusiasm for calories was entirely appropriate to a mobile, democratic society. The old “stagnation” theory of diet merely sought to maintain the level of health and vigor; it was a diet for slaves or serfs, for people who were not supposed to rise above their station. “The new diet theory is based on the idea of progress, of continuous improvement, of never resting satisfied with things as they are,” Hutchinson wrote. “No diet is too liberal or expensive that will…yield good returns on the investment.” Economic metaphors for health began to focus on growth and process rather than stability, on consumption and investment rather than savings.

As abundance psychology spread, a new atmosphere of dynamism enveloped old prescriptions for success. After the turn of the century, money was less often seen as an inert commodity, to be gradually accumulated and tended to steady growth; and more often seen as a fluid and dynamic force. To Americans enraptured by the strenuous life, energy became an end itself—and money was a kind of energy. Success mythology reflected this subtle change. In the magazine hagiographies of business titans—as well as in the fiction of writers like Dreiser and Norris—the key to success frequently became a mastery of Force (as those novelists always capitalized it), of raw power. Norris’s The Pit (1903) was a paean to the furious economic energies concentrated in Chicago. “It was Empire, the restless subjugation of all this central world of the lakes and prairies. Here, mid-most in the land, beat the Heart of the nation, whence inevitably must come its immeasurable power, its infinite, inexhaustible vitality. Here of all her cities, throbbed the true life—the true power and spirit of America: gigantic, crude, with the crudity of youth, disdaining rivalry; sane and healthy and vigorous; brutal in its ambition, arrogant in the new-found knowledge of its giant strength, prodigal of its wealth, infinite in its desires.” This was the vitalist vision at its most breathless and jejune, the literary equivalent of Theodore Roosevelt’s adolescent antics.

The new emphasis on capital as Force translated the psychology of abundance into economic terms. The economist who did the most to popularize this translation was Simon Nelson Patten, whose The New Basis of Civilization (1907) argued that the United States had passed from an “era of scarcity” to an “era of abundance” characterized by the unprecedented availability of mass-produced goods. His argument was based on the confident assumption that human beings had learned to control the weather. “The Secretary of Agriculture recently declared that serious crop failures will occur no more,” Patten wrote. “Stable, progressive farming controls the terror, disorder, and devastation of earlier times. A new agriculture means a new civilization.” Visions of perpetual growth were in the air, promising both stability and dynamism.

The economist Edward Atkinson pointed the way to a new synthesis with a hymn to “mental energy” in the Popular Science Monthly. Like other forms of energy, it was limitless. “If…there is no conceivable limit to the power of mind over matter or to the number of conversions of force that can be developed,” he wrote, “it follows that pauperism is due to want of mental energy, not of material resources.” Redistribution of wealth was not on the agenda; positive thinking was.

pp. 282-283

TR’s policies were primarily designed to protect American corporations’ access to raw materials, investment opportunities, and sometimes markets. The timing was appropriate. In the wake of the merger wave of 1897–1903, Wall Street generated new pools of capital, while Washington provided new places to invest it. Speculative excitement seized many among the middle and upper classes who began buying stocks for the first time. Prosperity spread even among the working classes, leading Simon Nelson Patten to detect a seismic shift from an era of scarcity to an era of abundance. For him, a well-paid working population committed to ever-expanding consumption would create what he called The New Basis of Civilization (1907).

Patten understood that the mountains of newly available goods were in part the spoils of empire, but he dissolved imperial power relations in a rhetoric of technological determinism. The new abundance, he argued, depended not only on the conquest of weather but also on the annihilation of time and space—a fast, efficient distribution system that provided Americans with the most varied diet in the world, transforming what had once been luxuries into staples of even the working man’s diet. “Rapid distribution of food carries civilization with it, and the prosperity that gives us a Panama canal with which to reach untouched tropic riches is a distinctive laborer’s resource, ranking with refrigerated express and quick freight carriage.” The specific moves that led to the seizure of the Canal Zone evaporated in the abstract “prosperity that gives us a Panama Canal,” which in turn became as much a boon to the workingman as innovative transportation. Empire was everywhere, in Patten’s formulation, and yet nowhere in sight.

What Patten implied (rather than stated overtly) was that imperialism underwrote expanding mass consumption, raising standards of living for ordinary folk. “Tropic riches” became cheap foods for the masses. The once-exotic banana was now sold from pushcarts for 6 cents a dozen, “a permanent addition to the laborer’s fund of goods.” The same was true of “sugar, which years ago was too expensive to be lavishly consumed by the well-to-do,” but “now freely gives its heat to the workingman,” as Patten wrote. “The demand that will follow the developing taste for it can be met by the vast quantities latent in Porto Rico and Cuba, and beyond them by the teeming lands of South America, and beyond them by the virgin tropics of another hemisphere.” From this view, the relation between empire and consumption was reciprocal: if imperial policies helped stimulate consumer demand, consumer demand in turn promoted imperial expansion. A society committed to ever-higher levels of mass-produced abundance required empire to be a way of life.

Eat Beef and Bacon!

“The evidence is just not there on red/processed meat. It’s also just not there on 2-3 portions of low-fat dairy, 30g fiber, 5-a-day, 14-21 alcohol units, 8 glasses of water… just numbers plucked out of the air.”
~ Dr. Zoe Harcombe, PhD (professor of public health nutrition)

We’ve always known that many healthy and long-lived societies ate a lot of meat. In fact, the longest-lived society, Hong Kong, eats the most meat in the world. It’s been well established in Eastern research that, for Asians, more meat correlates with better health outcomes.

In the West, the famous Roseto Pennsylvanians were also great consumers of red meat and saturated fat. Like traditional Mediterraneans, they ate more lard than olive oil (olive oil was too expensive for everyday cooking and too much in demand for other uses: fuel, salves, etc.). Among long-lived societies, one of the few commonalities was lard, as pigs are adaptable creatures that can be raised almost anywhere.

The correlations in some past Western research showing the opposite were probably spurious, driven by confounding factors such as the healthy user effect. Tell people that meat is unhealthy and everyone who is health-conscious will avoid meat, but that tells one nothing about causation. Healthy people are, in general, higher in conscientiousness and tend to do whatever they are told is healthy, not only in terms of diet but also regular exercise and medical checkups.
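
To make the healthy-user point concrete, here is a minimal toy simulation (not drawn from any of the sources above; every number in it is invented for illustration). Meat has zero causal effect on health in this model, yet meat eaters come out looking worse simply because the most health-conscious people sort themselves into the meat-avoiding group.

```python
# Toy model of the healthy user effect: diet has no causal effect on health,
# but conscientious people both avoid meat (because they were told to) and are
# healthier for unrelated reasons (exercise, checkups), so a naive comparison
# makes meat look harmful. All parameters are arbitrary illustrations.
import math
import random

random.seed(0)
N = 100_000
meat_eaters, meat_avoiders = [], []

for _ in range(N):
    conscientiousness = random.gauss(0, 1)
    # More conscientious people are likelier to follow the advice to avoid meat.
    p_avoid = 1 / (1 + math.exp(-2 * conscientiousness))
    avoids_meat = random.random() < p_avoid
    # Health depends ONLY on conscientiousness plus noise -- never on meat.
    health = conscientiousness + random.gauss(0, 1)
    (meat_avoiders if avoids_meat else meat_eaters).append(health)

mean = lambda xs: sum(xs) / len(xs)
print(f"mean health score, meat eaters:   {mean(meat_eaters):+.2f}")
print(f"mean health score, meat avoiders: {mean(meat_avoiders):+.2f}")
```

The gap between the two groups is pure confounding; adjusting for conscientiousness (or randomizing the diet) would erase it, which is exactly why observational correlations like these say little about causation.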

We’ve also known for a while that the correlation Ancel Keys reported between saturated fat and heart disease rested on weak evidence. When the data were re-analyzed, sugar showed a stronger correlation than saturated fat, and some other factors showed stronger correlations than either. It never really made sense to blame meat and animal fats, as these were not increasing in the diet while the diseases in question were. The only foods, besides fruits and vegetables, that increased during that period were refined starches, added sugar, and seed oils.

Red meat and saturated fat specifically had been on the decline in the American diet for decades when Ancel Keys began his research. There never was a logical reason for scapegoating these foods. Evidence-based healthcare isn’t always as evidence-based as the experts have led us to believe. And having got it so severely wrong, possibly having harmed and shortened the lives of hundreds of millions of people in the Western world, how are they going to save face and maintain their authority while admitting they gave bad advice for more than half a century? They can’t.

So, organizations like the AHA and ADA have been slowly and quietly backtracking on their recommendations, moving them in the direction of what their critics have been saying for decades (Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines; & American Diabetes Association Changes Its Tune). They’re hoping no one notices or calls them out on this covert admission of guilt. But with ever more research piling up and challenging their credibility, leading experts are feeling defensive and have been lashing out.

Even the media has a hard time reporting fairly on the topic, as they spent so many decades contributing to the confusion. For instance, most mainstream ‘journalism’ keeps blaming cow farts for climate change, which is a ludicrous and unscientific allegation (Carnivore Is Vegan). Why are we so obsessed with animal foods when sugar is possibly the single greatest dietary threat we face? And why do we narrowly focus on diet while ignoring even more massive factors such as lead toxicity? It would be nice to finally have an informed and honest public debate about public health.

* * *

Where is the Beef? The 6 Papers That Turned the World Up-Side-Down
by Angela A. Stanton

The latest flip-flop on red meat uses best science in place of best guesses
by Nina Teicholz

The New Guidelines On Meat Are Exposing the Fault Lines In Nutrition Advice
by Dr. Anthony Pearson

Is eating beef healthy? The new fight raging in nutrition science, explained.
by Julia Belluz

Eat Less Red Meat, Scientists Said. Now Some Believe That Was Bad Advice.
by Gina Kolata

Avoiding red or processed meat doesn’t seem to give health benefits
by Clare Wilson

Lead Toxicity is a Hyperobject

What is everywhere cannot be seen. What harms everyone cannot be acknowledged. So, we obsess over what is trivial and distract ourselves with false narratives. The point isn’t to understand, much less solve, problems. We’d rather let large numbers of people suffer and die, as long as we don’t have to face the overwhelming sense of anxiety about the world we’ve created.

We pretend to care about public health. We obsess over pharmaceuticals and extreme medical interventions while paying lip service to exercise and diet, not to mention going on about saving the planet while taking only symbolic actions. But some of the worst dangers to public health get little mention or media reporting. Lead toxicity is an example of this. It causes numerous diseases and health conditions: lowered IQ, ADHD, aggressive behavior, asthma, and on and on. Now we know it also causes heart disease. Apparently, it even contributes substantially to diabetes. A common explanation might be that heavy metals interfere with important systems in the body, such as the immune system and the hormonal system. In the comments section of Dr. Malcolm Kendrick’s post shared below, I noticed this interesting piece of info:

“I recently listened to a presentation, as a part of a class I’m taking, put on by the lead researcher for the TACT trial. He is a cardiologist himself. I would say that a 48% ABSOLUTE risk reduction in further events in diabetic patients, and a 30-something % risk reduction in patients without diabetes, is extremely significant. I went and read the study afterward to verify the numbers he presented. I would say, based on the fact that he admitted freely he thought he was going to prove exactly the opposite, and that his numbers and his statements show it does work, are pretty convincing. Naturally, no one that works for JAMA will ever tell you that. They would prefer to do acrobatics with statistics to prove otherwise.”

Lead toxicity is one of the leading causes of disease and death in the world. It damages the entire body, especially the brain. Survivors of lead toxicity are often crippled for life. It was also behind the violent crime wave of past decades. The prison population has higher than average rates of lead toxicity, which means we are using prisons to store and hide the victims and scapegoat them, all in one fell swoop. And since it is the poor who are primarily targeted by our systematic indifference (maybe not indifference, since there are profits and privileges incentivizing it), it is they who are disproportionately poisoned by lead and then, as victims, imprisoned or otherwise caught up in the legal system, institutionalized, or left as one of the vast multitudes of the forgotten, of the homeless, of those who die without anyone bothering to find out what killed them.

But if only the poor worked harder, got an education, followed the USDA-recommended diet, and got a good job to pay for all the pills pushed on them by pharmaceutical-funded doctors, then… well, then what good would it actually do them? Tell me that. The irony is that, as we like to pity the poor for their supposed failures and bad luck, we are all being screwed over. It’s just that we feel slightly better, slightly less anxious, as long as others are doing worse than us. Who cares that we live in a society that is slowly killing us? The real victory is knowing that it is killing you slightly slower than your neighbor or those other people elsewhere. For some odd reason, most people find that comforting.

It’s sad. Despite some minor progress in cleaning up the worst of it, decades of lead accumulation still linger in the soil, oceans, infrastructure, and old buildings. Entire communities continue to raise new generations with lead exposure. On top of that, we’ve been adding even more pollutants and toxins to the environment, to our food supply, and to every variety of product we buy. I will say this. Even if diet doesn’t have as big a direct effect on some of these conditions as removing dangerous toxins does, diet has the advantage of being a factor one can personally control. If you eat an optimally healthy diet, especially if you can avoid foods that are poisoned (either unintentionally with environmental toxins or intentionally with farm chemicals), you’ll be doing yourself a world of good. Greater health won’t eliminate all of the dangers we are surrounded by, but it will help you to detoxify and heal from the damage. It may not be much in the big picture, but it’s better than nothing.

On the other hand, even if our diet obsession is overblown, maybe diet is more significant than we realize. Sammy Pepys, in Fat is our Friend, writes about Roseto, Pennsylvania. Scientists studying this uniquely healthy American community called the phenomenon the Roseto Effect. These people ate tons of processed meat and lard, smoked cigars and drank wine, and they did back-breaking labor in quarries where they would have been exposed to toxins (“Rosetan men worked in such toxic environments as the nearby slate quarries … inhaling gases, dusts and other niceties.” p. 117). Yet their health was great. At the time, their diet was dismissed as an explanation because it didn’t conform to USDA standards. While most Americans had already switched to industrial seed oils, the Rosetans were still going strong on animal fats. Maybe their diet was dismissed too easily. As with earlier lard-and-butter-gorging Americans, maybe all the high quality animal fats (probably from pasture-raised animals) were essential to avoiding disease. Maybe it also had something to do with their ability to handle the toxins. Considering Weston A. Price’s research, it’s obvious that all of those additional fat-soluble vitamins sure would have helped.

Still, let’s clean up the toxins. And also, let’s quit polluting like there is no tomorrow.

* * *

What causes heart disease part 65 – Lead again
by Dr. Malcolm Kendrick

There are several things about the paper that I found fascinating. However, the first thing that I noticed was that…. it hadn’t been noticed. It slipped by in a virtual media blackout. It was published in 2018, and I heard nothing.

This is in direct contrast to almost anything published about diet. We are literally bombarded with stories about red meat causing cancer and sausages causing cancer and heart disease, and veganism being protective against heart disease and cancer, and on and on. Dietary articles often end up on the front page on national newspapers. […]

Where was I? Oh yes, lead. The heavy metal. The thing that, unlike diet, makes no headlines whatsoever, the thing that everyone ignores. Here is one top-line fact from that study on lead, that I missed:

‘Our findings suggest that, of 2·3 million deaths every year in the USA, about 400 000 are attributable to lead exposure, an estimate that is about ten times larger than the current one.’ 1

Yes, according to this study, one in six deaths is due to lead exposure. I shall repeat that. One in six. Eighteen per cent to be exact, which is nearer a fifth really. […]

So, on one side, we have papers (that make headlines around the world) shouting about the risk of red meat and cancer. Yet the association is observational, tiny, and would almost certainly disappear in a randomised controlled trial, and thus mean nothing.

On the other we have a substance that could be responsible for one sixth of all deaths, the vast majority of those CVD deaths. The odds ratio, highest vs lowest lead exposure, by the way, depending on age and other factors, was a maximum of 5.30 [unadjusted].

Another study in the US found the following

‘Cumulative lead exposure, as reflected by bone lead, and cardiovascular events have been studied in the Veterans’ Normative Aging Study, a longitudinal study among community-based male veterans in the greater Boston area enrolled in 1963. Patients had a single measurement of tibial and patellar bone lead between 1991 and 1999. The HR for ischemic heart disease mortality comparing patellar lead >35 to <22 μg/g was 8.37 (95% CI: 1.29 to 54.4).’ 3

HR = Hazard Ratio, which is similar, if not the same to OR = Odds Ratio. A Hazard Ratio of 8.37, means (essentially) a 737% increase in risk (Relative Risk).

Anyway, I shall repeat that finding a bit more loudly. A higher level of lead in the body leads to a seven hundred and thirty-seven per cent increase in death from heart disease. This is, in my opinion, correlation proving causation.

Looking at this from another angle, it is true that smoking causes a much greater risk of lung cancer (and a lesser but significant increase in CVD), but not everyone smokes. Therefore, the overall damage to health from smoking is far less than the damage caused by lead toxicity.

Yet no-one seems remotely interested. Which is, in itself, very interesting.

It is true that most Governments have made efforts to reduce lead exposure. Levels of lead in the children dropped five-fold between the mid-sixties and the late nineties. 4 Indeed, once the oil industry stopped blowing six hundred thousand tons of lead into the atmosphere from vehicle exhausts things further improved. Lead has also been removed from water pipes, paint, and suchlike.

However, it takes a long old time for lead to be removed from the human body. It usually lingers for a lifetime. Equally, trying to get rid of lead is not easy, that’s for sure. Having said this, chelation therapy has been tried, and does seem to work.

‘On November 4, 2012, the TACT (Trial to Assess Chelation Therapy) investigators reported publicly the first large, randomized, placebo-controlled trial evidence that edetate disodium (disodium ethylenediaminetetraacetic acid) chelation therapy significantly reduced cardiac events in stable post–myocardial infarction (MI) patients. These results were so unexpected that many in the cardiology community greeted the report initially with either skepticism (it is probably wrong) or outright disbelief (it is definitely wrong).’ 3

Cardiologists, it seems from the above quotes, know almost nothing about the subject in which they claim to be experts. Just try mentioning glycocalyx to them… ‘the what?’

Apart from a few brave souls battling to remove lead from the body, widely derided and dismissed by the mainstream world of cardiology, nothing else is done. Nothing at all. We spend trillions on cholesterol lowering, and trillions on blood pressure lowering, and more trillions on diet. On the other hand, we do nothing active to try and change a risk factor that kicks all the others – in terms of numbers killed – into touch.
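
To make the arithmetic in the excerpt above concrete, here is a restatement of the figures Kendrick quotes, using the standard reading of a hazard ratio as a relative risk (nothing here is new data; it only redoes the quoted division and the HR-to-percentage conversion):

```python
# Restating the numbers quoted in the excerpt above.
attributable = 400_000    # estimated US deaths per year attributed to lead exposure (as quoted)
total = 2_300_000         # total US deaths per year cited in the same study (as quoted)
fraction = attributable / total
print(f"attributable fraction: {fraction:.1%} (about 1 in {round(1 / fraction)})")

hr = 8.37                 # hazard ratio, highest vs lowest patellar bone lead (as quoted)
relative_increase = (hr - 1) * 100
print(f"HR {hr} ~ {relative_increase:.0f}% relative increase in risk")
```

The division gives roughly 17–18 per cent, i.e. about one in six deaths, and a hazard ratio of 8.37 corresponds to the 737 per cent relative increase Kendrick cites.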

True Vitamin A For Health And Happiness

“The discovery of vitamin A and the history of its application in the field of human nutrition is a story of bravery and brilliance, one that represents a marriage of the best of scientific inquiry with worldwide cultural traditions; and the suborning of that knowledge to the dictates of the food industry provides a sad lesson in the use of power and influence to obfuscate the truth”
~Mary Enig, PhD, Lipid Biochemist

Over this past century, there has been a developing insight into the role of nutrition in health. It was originally motivated by observations of the diseases of malnutrition, largely coinciding with the diseases of civilization. This became increasingly obvious with industrialization. By the mid-20th century, there was a growing health movement that brought greater awareness to this field of study.

When my grandmother was diagnosed with cancer in the 1980s, she had already been reading books on health for decades and so, instead of chemotherapy or radiation, she tried to cure herself with a macrobiotic diet. My father recalls her juicing such vast amounts of carrots, presumably to up her beta-carotene levels, that her skin turned the same color as her beverage of choice. The same pigments are the reason egg yolks and butter are yellow and turn an even deeper orange when animals are pasture-raised on lush green forage (beta-carotene is easily oxidized and so, once cut, grass quickly loses much of this nutrient; hence, cattle eating fresh greens in the spring and summer is important for the fat-soluble vitamins they store in their fat over winter). These pigments are the carotenoids, and there are “currently about 600 known forms of naturally occurring carotenoids” (Sarah Pope, Busting the Beta Carotene Vitamin A Myth). “The carotenoids are further broken down into 2 classes, carotenes and xanthophylls. The carotenes consist of alpha-carotene, beta-carotene, gamma-carotene, delta-carotene, epsilon-carotene, and zeta-carotene. In the xanthophyll class we have astaxanthin, beta-crypto-xanthin, canthaxanthin, fucoxanthin, lutein, neoxanthin, violaxanthin, and zeaxanthin” (Casey Thaler, Why You Need Vitamin A). Beta-carotene is the precursor to the fat-soluble vitamin A, which is linked to the body’s immune system: to fighting off cancerous cells and other sicknesses, along with handling stress (yet “Stress conditions, such as extremely hot weather, viral infections, and altered thyroid function, have also been suggested as causes for reduced carotene to vitamin A conversion”; from Lee Russell McDowell’s Vitamins in Animal Nutrition, p. 30). Weston A. Price was an early 20th century researcher who observed the relationship between fatty animal foods, fat-soluble vitamins, and a strong immune system.

Price also discussed fertility, and I’d note that research shows that vegans and vegetarians have higher rates of infertility, as do rodents with vitamin A deficiency. I know a vegetarian couple that spent years trying to get pregnant and then spent thousands of dollars on in vitro fertilization treatments, trying twice and finally getting pregnant (the guy’s sperm were deformed and didn’t have proper motility). Pam Schoenfield describes deficiency as leading to, “in the worst case, spontaneous abortion or death of mother and offspring during labor, as described by Weston A. Price. In humans, even mild deficiencies during pregnancy can lead to compromised kidney development in the child” (The Unfair Stigmatization of Vitamin A). And Nora Gedgaudas offers a grim warning that, “Vitamin A deficiency has even been implicated in Sudden Infant Death Syndrome (SIDS)! Far from being a threat to any unborn fetus, The Nordic Epidemiological SIDS Study found an association between low or no vitamin A intake and an increased risk of sudden infant death syndrome (SIDS) during their first year of life. This finding remained conclusive even when an adjustment was made for potential confounders, including socioeconomic factors. Furthermore, substantial evidence exists to show that healthy vitamin A levels during pregnancy in the mother may results in substantially reduced HIV transmission risk to her unborn child (also, vitamin A-deficient mothers were much more likely to transmit the virus to their newborn infants than were HIV-infected mothers who had adequate amounts of this critical nutrient)” (Vitamin A Under Attack Down Under). By the way, see the works of Ken Albala and Trudy Eden (e.g., Food and Faith in Christian Culture) about how, in the Galenic theory of humors, it is understood that meat, especially the red meat of ruminants, increases ‘blood’, which turns into semen and breast milk; that is to say, fatty animal foods are good for fertility and fecundity, for growth, strength, and vigor (Galenic ‘blood’ might be translated as animal spirits, life force, prana, or, in more modern terminology, libido). It is also true that ruminants appear to have been the primary food source in human and hominid evolution, specifically the blubbery ruminants we call the megafauna, which hominids prized for millions of years before their die-off.

Many animals are able to turn beta-carotene into retinol, something the human body theoretically can do to a limited extent if not as effectively as ruminants, and the final product is concentrated in the animal fat and the liver. As carotenoids are why carrots and sweet potatoes are orange, this relates to why egg yolks and butter are yellow-to-orange while egg whites and skim milk are not. The deep color doesn’t only indicate the presence of vitamin A but of the other fat-soluble vitamins as well: D, E, and K (the light color or even whiteness of the fat in factory-farmed animal foods is a sign of low nutritional quality). Given my grandmother’s compromised health, she likely was not producing enough of her own vitamin A, despite getting a bounty of the precursor (her oddly-colored skin probably indicated carotenaemia, the condition in which unmetabolized carotenoids build up in the body). Worse still, beta-carotene in some cases is associated with increased risk of cancer: “In two studies, where people were given high doses of B-carotene supplements in an attempt to prevent lung cancer and other cancers, the supplements increased risk of lung cancer in cigarette smokers, and a third study found neither benefit nor harm from them. What might cause the unexpected findings? While beneficial at low doses, at higher doses, antioxidants can shut down cell signalling pathways and decrease synthesis of mitochondria in new muscle cells. They can also decrease production of endogenous antioxidants produced by a body” (Fred Provenza, Nourishment, p. 99). By getting unnaturally high levels of beta-carotene from juicing large quantities of carrots, was my grandmother overdosing herself in a similar manner? Rather than improving her immunity, did it contribute to her dying from cancer despite doing her best to follow a healthy diet? More is not necessarily better, especially when dealing with these precursors. It would be better to go straight to the preformed vitamin A in a naturally balanced ratio with other nutrients (e.g., organ meats).

Beyond cancer concerns, the bioavailable forms of retinoids, along with nutrients like B12 that also are only found in animal foods (liver has high levels of both), have a major effect on the health and development of eyes (during the British food shortage of World War II, the government promoted eating the surplus of carrots by telling the public that it would give them the eyesight of heroic fighter pilots). Night blindness, a common symptom of vitamin A deficiency, is one of the most widespread problems. It is associated with developing countries and impoverished communities, but malnourishment creeps into many other populations. A vegetarian I know is experiencing loss of his night vision, and it stands out that the others in his family who are also vegetarian show signs of deficiencies — besides the family being regularly sick, his wife has severe calcium loss (her body is literally absorbing her lower jaw bone) and his kids have issues with neurocognitive development and mental health (autism, depression, etc). The family doesn’t only avoid meat; they also eat little dairy or eggs, preferring plant ‘milks’ over real milk, for example. This is something now recommended against for the young, with guidelines instead advising that children drink either dairy milk or water (E. J. Mundell, New Kids’ Drink Guidelines: Avoid Plant-Based Milks). This makes me wonder about my own early development because, after being weaned early at 6 months, I couldn’t handle cow milk and so was put on soy milk. This likely was a contributing factor to my early-diagnosed learning disability, autistic-like symptoms, depression, and poor eyesight (I got corrective lenses a year or two after reading delays led to my being sent to a special education teacher; I was an otherwise healthy kid who played sports and spent a lot of time outside, rather than watching tv; as an interesting anecdote from someone else, see what Josiah shares in The Story of How Real Vitamin A Changed My Life). To later compound the fat-soluble vitamin deficiency, my mother, following expert health advice, bought skim milk during my adolescent growth period and, now that I think about it, that was around when my depression really started going into full gear. This kind of thing hasn’t been a problem for traditional societies that often breastfed for the first 2-3 years, as that mother’s milk would be full of fat-soluble vitamins, assuming the mother was eating well, such as getting fatty animal foods from wild or pasture-raised sources (a safe assumption as long as they still had access to their traditional foods through maintaining traditional hunting grounds, fishing waters, or grazing lands).

The diseases of vitamin A deficiency have been known for millennia and were typically treated with fatty animal foods such as ruminant liver or cod liver oil, but other parts of the animal were also used. “In his pioneering work, Nutrition and Physical Degeneration, Weston Price tells the story of a prospector who, while crossing a high plateau in the Rocky Mountains, went blind with xerophthalmia, due to a lack of vitamin A. As he wept in despair, he was discovered by an Indian who caught him a trout and fed him “the flesh of the head and the tissues back of the eyes, including the eyes.”1 Within a few hours his sight began to return and within two days his eyes were nearly normal. Several years previous to the travels of Weston Price, scientists had discovered that the richest source of vitamin A in the entire animal body is that of the retina and the tissues in back of the eyes” (Sally Fallon Morell & Mary G. Enig, Vitamin A Saga). I might add that there is more to the eyeball than just vitamin A: “The Latin name for the retina of the eye is macula lutea. (Lutea is Latin for yellow.) This thick, membranous yellow layer of the eyeball is a rich source of the nutrient lutein, a member of the retinoid family of vitamin A precursors. Lutein supplements are now promoted as being good for prostate health and for preventing macular degeneration. The fat behind the eyeball is a rich source of vitamin A and lutein. (If you think you’d rather swallow a supplement than pop an eyeball after breakfast, remember that vitamins are heat-, light-, and oxygen-sensitive and unlikely to survive processing.) And while you’re digesting the idea of eating eyeball fat, consider that the gooey juice in the eye is primarily hyaluronic acid, rich in glycosaminoglycans. You can get hyaluronic acid injected into your lips (to fill them out), your knee (as a treatment for osteoarthritis), and even your own eye (to treat certain ocular diseases) for $200 a dose (twenty one-thousandths of a gram). It’s called Restylane. But you can get this useful nutrient into your body just by eating the eyes you find in fish head soup, and the glycosaminoglycans will find their way to the parts of the body that need them most” (Catherine Shanahan, Deep Nutrition, p. 279). Maybe that is a home remedy my grandmother should have tried.

Despite this old wisdom, vitamin A itself was not identified until the early 1900s. In the decades following, all of the other main vitamins were discovered. Gyorgy Scrinis, in his book Nutritionism, summarizes this early history of nutritional studies that led to the isolation of vitamin A and the other vitamins (p. 75): “It was also in 1912 that Elmer McCollum’s research team at Yale University first identified a fat-soluble substance they called fat-soluble A or Factor A (later renamed vitamin A) found in butter, liver, and leafy greens. Through his experiments on rats, McCollum demonstrated that a deficiency of Factor A led to impaired vision and stunted growth. The team also identified “water-soluble B” or “Factor B,” later renamed vitamin B, the absence of which they linked to the tropical disease beriberi. 80 In the 1920s scientists identified various important relationships between vitamins and human health: they linked deficiency in vitamin C, which they found in citrus fruits, to scurvy, and they linked deficiency in vitamin D, found in various foods and produced by the body in response to sunlight, to rickets. During the 1920s and 1930s, other vitamins and minerals were identified, including riboflavin, folic acid, beta-carotene, vitamin E, and vitamin K.” McCollum was motivated by a realization of some important factor that was missing. “After reviewing the literature between 1873 and 1906,” writes Lee Russell McDowell, “in which small animals had been fed restricted diets of isolated proteins, fats, and carbohydrates, E. V. McCollum of the United States noted that the animals rapidly failed in health and concluded that the most important problem in nutrition was to discover what was lacking in such diets” (Vitamins in Animal Nutrition, p. 9). One of the limitations of the early research was that its categories were too broad. When isolated macronutrients failed to serve optimal health, researchers turned to isolated micronutrients, not considering that part of the problem is the isolating of any particular nutrient in the first place and ignoring the larger process of how nutrients get used as part of whole foods. Precursor provitamin A, that is the carotenoids such as beta-carotene, is not the same as preformed vitamin A, that is the retinoids: retinol and its esterified form, retinyl ester, along with retinal and retinoic acid. To conflate the carotenoids and the retinoids is reductionist and, when accepted without question, causes many problems. Nutrients aren’t simply compounds that one either gets or doesn’t. There is more going on than that.

The main complication is that the body doesn’t easily turn the one form into the other, and this can vary greatly between individuals: “many people are genetically bad converters of carotenoids to retinol” (Dr. Chris Masterjohn: Why You’re Probably Nutrient Deficient – The Genius Life); for brief discussions of related genes, see Debbie Moon’s How Well Do You Convert Beta-Carotene to Vitamin A?, Joe Cohen’s The Importance of Real Vitamin A (Retinol), and David Krantz’s When Carrots Don’t Cut It: Your Genes and Vitamin A. Also consider: “This genetic problem may exist in up to half of the population. Its presence appears to be associated with high blood levels of beta-carotene, alpha-carotene, beta-cryptoxanthin, and low levels of lycopene, lutein and zeaxanthin—three other carotenoids important for health but don’t convert to vitamin A” (Phil Maffetone, Vitamin A and the Beta-Carotene Myth: “A” is for Athletics, Aging and Advanced Health). Keep in mind that other nutrients play a role as well: “Iron and zinc deficiency can affect the conversion to vitamin A” (Pam Schoenfield, The Unfair Stigmatization of Vitamin A). It’s ironic, given all the obsession over eating loads of the precursor from vegetables and supplements, that the very beta-carotene that can potentially be made into vitamin A might also compete with it: “Recent research also suggests that cleavage products of beta-carotene can block vitamin A at its receptor sites—another possible anti-nutrient?” (Schoenfield). So, even the nutrient-density of a vitamin precursor doesn’t necessarily mean the vitamin itself is bioavailable. “Conversion of carotenes to Vitamin A takes place in the upper intestinal tract in the presence of bile salts and fat-splitting enzymes. While early studies on this biological process suggested a 4:1 ratio of beta carotene to Vitamin A conversion, later studies revised this to 6:1 or perhaps even higher. If a meal is lowfat, however, not much bile is going to reach the intestinal tract further worsening the conversion ratio of carotenes to Vitamin A” (Sarah Pope, Busting the Beta Carotene Vitamin A Myth). “Average conversion of beta-carotene to retinol is around 2-28% (28% is on the very generous end), meaning those who consume all of their un-converted vitamin A from plants would have a very hard time meeting their vitamin A needs, and conversion could be even lower if someone is in poor health” (Laura A. Poe, Nutrient Spotlight: Vitamin A). There are numerous health conditions and medications that make conversion difficult or impossible — for example: “Diabetics and those with poor thyroid function, (a group that could well include at least half the adult US population), cannot make the conversion. Children make the conversion very poorly and infants not at all — they must obtain their precious stores of vitamin A from animal fats— yet the low-fat diet is often recommended for children” (Sally Fallon Morell, Vitamin A Vagary). I could list many other factors that get in the way, as this conversion process requires almost perfect conditions within the human body.
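
To put those ratios in rough numbers, here is an illustrative back-of-envelope calculation. It assumes the commonly cited 12:1 retinol activity equivalent (RAE) factor for beta-carotene from whole foods and a ballpark figure of about 6,000 mcg of beta-carotene in a medium carrot; both numbers are assumptions for the sake of the example, not measurements taken from the sources above:

$$\frac{6{,}000\ \mu\text{g}\ \beta\text{-carotene}}{12} \approx 500\ \mu\text{g RAE}$$

Set against an adult RDA of roughly 700-900 mcg RAE per day, that is already modest, and at the poorer conversion ratios quoted above (20:1 or worse) the same carrot would yield perhaps 300 mcg RAE or less, whereas a small serving of liver supplies several thousand micrograms of preformed retinol with no conversion step at all.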

If you’re assuming that you are getting enough vitamin A, you’re making a dangerous gamble with your health. Take heed of a recent case where a teenager lost his eyesight and hearing after a decade of avoiding nutritious animal foods and instead following a junk food diet (Lizzie Roberts, Teenager ‘first in UK’ to go deaf and blind due to junk food diet, report reveals) — “The doting mother believes vitamin A injections could have saved Harvey’s sight if they were given to him at an earlier age” (Kyle O’Sullivan, Mum of teenager who went blind from only eating crisps and chocolate blames NHS); she said, “Back in December when we were told it was down to nutrition, we think if they’d done the blood test then and realised the Vitamin A was so low they could have given him the Vitamin A injections then and he could see a lot more out of that right eye and we could have saved it a lot better.” Of course, he wasn’t eating a ‘balanced’ diet, but like many people on modern nutrient-deficient diets he was left depending on government-enforced fortification policies to save him from malnourishment. But it turns out that, even with fortified foods, there are major nutritional holes in the Western diet. Should we fortify food even further or maybe genetically-modify crops for this purpose? Should we eat even more vegetables and juice them until, like my grandmother, we all turn orange? Or else should we simply go back to eating healthy traditional animal foods?

This is the same basic issue with the precursors of other fat-soluble vitamins and the precursors of other nutrients that aren’t easily processed by humans until after other animals have done the work for us (e.g., the omega-3s in algae can’t be accessed by human digestion, but fish can break them down and so, in eating fish, they become bioavailable). This understanding has been slow to take hold in nutrition studies. Consider how, even though Weston A. Price was writing about Activator X in the 1940s, it wasn’t until this new century that it was identified as vitamin K2, distinct from vitamin K1 — see Christopher Masterjohn, On the Trail of the Elusive X-Factor: A Sixty-Two-Year-Old Mystery Finally Solved. By the way, I’d emphasize the close link between vitamin A and vitamin K, as Masterjohn details: “Because vitamin K1 is directly associated with both chlorophyll and beta-carotene within a single protein complex and plays a direct role in photosynthesis,13 the richness of the green color of grass, its rate of growth, and its brix rating (which measures the density of organic material produced by the plant) all directly indicate its concentration of vitamin K1. Animals grazing on grass will accumulate vitamin K2 in their tissues in direct proportion to the amount of vitamin K1 in their diet. The beta-carotene associated with vitamin K1 will also impart a yellow or orange color to butterfat; the richness of this color therefore indirectly indicates the amount of both vitamins K1 and K2 in the butter. Not only are the K vitamins detected by the Activator X test and distributed in the food supply precisely as Price suggested, but, as shown in Figure 2, the physiological actions that Price attributed to Activator X correspond perfectly to those of vitamin K2. It is therefore clear that the precursor to Activator X found in rapidly growing, green grass is none other than vitamin K1, while Activator X itself is none other than vitamin K2.” Just eat those delicious animal foods! Then everything will be right with the world. Any ideology that tells you to fear these foods is a belief system that is anti-human and anti-life.

The thing is, in their natural form in animal foods, fat-soluble vitamins are part of a complex of synergistic nutrients and their cofactors (it’s particularly important for vitamins A, D3, and K2 to be in balance). Isolated vitamins, especially in higher amounts as supplements and fortification to treat disease, have sometimes proven to be problematic for health in other ways. “Nutrient supplements may even be harmful, particularly when taken in large, concentrated, and isolated doses,” explained Gyorgy Scrinis. “An overdose of vitamins A and D, for example, can have toxic and potentially fatal effects. 85 Some studies have also found an association between beta-carotene supplements and an increased risk of both cardiovascular disease and certain cancers” (p. 76). Furthermore, specific foods are part of a total diet and lifestyle. For hundreds of thousands of years, humans ate a low-carb and high-fat diet combined with regular fasting, which guaranteed regular ketosis and autophagy. There have been numerous health benefits shown from these combined factors. It’s fascinating that, in early research on the ketogenic diet as applied to children, it was sometimes observed not only to be effective in treating physical diseases like epileptic seizures and diabetes but also to improve behavioral issues. This has been demonstrated with more recent research as well, showing the diverse connections to neurocognitive health (Ketogenic Diet and Neurocognitive Health; Fasting, Calorie Restriction, and Ketosis; The Agricultural Mind; “Yes, tea banished the fairies.”; Autism and the Upper Crust; Diets and Systems; Physical Health, Mental Health). Still, there are many confounding factors. Since ketogenic diets tend to increase fat intake, depending on the source, they also can increase the levels of fat-soluble vitamins. Weston A. Price observed that traditional healthy societies getting plenty of these nutrients had both greater physical health (well-developed bone structure, strong immune system, etc) and greater ‘moral’ health (positive mood, pro-social behaviors, etc). The moral panic that more strongly took hold in the 19th century was understood at the time as being rooted in general health concerns (The Crisis of Identity).

Many of the fat-soluble vitamins, especially vitamin A, act more like hormones than mere nutrients. They influence and determine nearly every system in the body, including how they impact the development of the nervous system, brain, and gut-brain axis. If one is seeing outward symptoms like eye deterioration, it can be guaranteed that far worse problems are already forming that are less obvious. Yet even outward symptoms aren’t always recognized, such as in one study where most subjects didn’t even realize they had decreased night vision from vitamin A deficiency. Our health often has to get severely bad before we notice it and admit to it. Part of this is that sickness and disease have become so common as to be normalized. I was socializing with someone who is overweight to an unhealthy degree, but I later realized how the excess body fat didn’t even stand out to me because, by American standards, this person was normal. Anything can become normalized. My mother brought my nephew to the doctor and, in discussing how often he gets sick, told the doctor about his unhealthy diet. The doctor’s response was that all kids eat unhealthy food. The doctor has seen so many sickly patients that she couldn’t imagine health as a normal state of the human body. About the vegetarian I mentioned, I give him credit for at least noticing his loss of night vision, but what nags me about his situation is how he seems to have accepted it as an expected part of aging, not realizing that it is a sign of a severe health concern. He probably has never thought to mention it to his doctor and, even if he did, most doctors aren’t well enough educated in nutrition to realize its significance and understand what it might mean (Most Mainstream Doctors Would Fail Nutrition).

This isn’t a problem limited to a few people. One of the main things that has declined over time is access to fat-soluble vitamins, a pattern that most clearly emerged with early 19th century cheap grains, especially white flour, replacing animal foods, and then ratcheted up further with the early 20th century replacement of animal fats with industrial seed oils. It’s not only whether or not we are getting a particular vitamin. As important is how other aspects of the diet affect nutrition. A high-carb diet itself might be disrupting the bioavailability of vitamin A, and that is even more true if the diet is also low-fat, since the fat-soluble vitamins are useless without fat to absorb them. From the book Malnutrition and the Eye, Donald McLaren writes: “The association of xerophthalmia with an excessive intake of carbohydrate in the diet in infancy was recorded by Czerny and Keller (1906) in their classical monograph on the syndrome they termed Mehlnahrschaden. It is now recognized that this condition is identical in all basic features to what has been called “the most serious and widespread nutritional disorder known to medical and nutritional science” (Brock and Autret, 1952) and due in essence to a deficiency of protein and excess of carbohydrate in the diet. Many local and other names have been applied to this disease but it will be necessary here to use one, and that chosen, “kwashiorkor,” has found wide acceptance. Since Cerny’s day there has been a great number of other accounts in which ocular involvement has been described (McLaren, 1958), providing good evidence for the contention that a deficiency of vitamin A is the most common of all vitamin deficiencies associated with kwashiorkor.” It has been observed by many that the populations with vitamin A deficiency tend to have a diet high in grains and vegetables while low in animal foods, particularly low in seafood (fish oil is the most concentrated source of vitamin A). A high grain diet affects nutrition in other ways as well: “Many nutritionists consider cereal grains to be good sources of most of the B vitamins except for vitamin B12. Inspection of table 4 generally is supportive of this concept, at least in terms of the % RDA which cereal grains contain. However, of more importance is the biological availability of the B vitamins contained within cereal grains and their B vitamin content after milling, processing and cooking. It is somewhat ironic that two of the major B vitamin deficiency diseases which have plagued agricultural man (pellagra and beriberi) are almost exclusively associated with excessive consumption of cereal grains” (Loren Cordain, “Cereal Grains: Humanity’s Double-Edged Sword,” from Evolutionary Aspects of Nutrition and Health, ed. by Artemis P. Simopoulos, p. 27). Besides vitamin A and B12 affecting eye and bone health, they work together in numerous other ways (e.g., Edward G. High & Sherman S. Wilson, Effects of Vitamin B12 on the Utilization of Carotene and Vitamin A by the Rat), as is true with so many other links between nutrients. The balance is easily disturbed — all the more reason to worry about grains since they knock out large swaths of essential nutrients, including the nutrients from animal foods. Eat the hamburger and leave the bun, enjoy the steak but not the roll. There is another factor to keep in mind. A high-carb diet is a major cause of liver disease, related to metabolic syndrome, which is also associated with insulin resistance, obesity, diabetes, heart disease, Alzheimer’s, etc. The liver is necessary for the digestion of fat and the assimilation of fat-soluble vitamins. So, if someone on a high-carb diet has compromised their liver functioning, they can be deficient in vitamin A no matter how much they are getting from their diet (James DiNicolantonio made a similar point about other nutrients: “The liver makes proteins that carry minerals around the body. So if you have liver disease, even if you consume enough minerals, you may have difficulty moving minerals around the body to where they are needed”; this would relate to how the fat-soluble vitamins are absolutely necessary for the absorption, processing, transportation, and use of minerals).

This is exacerbated by the fact that, as people have followed official dietary recommendations in decreasing fat intake, they’ve replaced it with an increase in starchy and sugary carbohydrates. Actually, that isn’t quite correct. Fat intake, in general, hasn’t gone down (nor gone up). Rather, Americans are eating less animal fat and more industrial seed oil. The combination of unhealthy carbs and unhealthy oils is a double whammy. There are all kinds of problems with industrial seed oils, from being oxidative to being inflammatory, not to mention being mutagenic (Dr. Catherine Shanahan On Dietary Epigenetics and Mutations). On the one hand, there is the loss of fat-soluble vitamins in that the industrial seed oils lack them. That is bad enough, but consider another part of the equation. Those same oils actively interfere with what fat-soluble vitamins are otherwise being obtained — as Gyorgy Scrinis tells it: “The nutritional engineering of foods can create nutrient-level contradictions, whereby the enhancement or removal of a particular nutrient by food manufacturers interferes with the quantities or absorption of other desirable nutrients. For instance, studies have shown that the concentrated quantities of plant sterols in cholesterol-lowering margarines block the absorption of beta-carotene and therefore lower vitamin A levels in the body. 27 Consumers of sterol-enriched foods are therefore encouraged to compensate for this nutrient-level contradiction by eating more fruits and vegetables to increase their vitamin A levels” (Nutritionism, p. 211). Combine that with health problems with the gut, metabolism, and liver as seen with so many modern Westerners and other industrialized populations (88% of adult Americans are metabolically unfit: Dietary Health Across Generations). Maybe humans with the right genetics and optimal health could both turn beta-carotene into retinol and make use of all the fat-soluble vitamins, as precursors or preformed. The problem is that doesn’t describe most people today. “Unfortunately, just like with Omega 3, the evidence indicates our ability to convert beta-carotene to retinol is not sufficient (20, 21, 22). Some estimates indicate that beta-carotene intake is only 16-23% as effective as retinol intake for increasing body levels of retinol (20, 21, 22). This is supported by findings that beta-carotene supplementation and high beta-carotene intake (vitamin A from vegetables) increases serum beta-carotene levels but does not significantly impact retinol levels (20, 21, 22)” (Andy AKA Barefoot Golfer, Why Humans are Not Vegetarians – The Omnivorous Truth).

This is why vegans and vegetarians so easily get into trouble with nutrient deficiencies and are forced to rely upon supplements, although it’s questionable how helpful these supplements are as replacements for whole foods in what would otherwise be a diet that is both nutrient-dense and nutrient-bioavailable. The vegetarian I discussed above eats a “balanced diet” of fresh produce and fortified plant foods combined with a multivitamin, and yet he is still losing his night vision. Nutrients are part of a complex web of health. Too much of one thing or too little of another can mess with the levels of a particular nutrient which, in turn, can have a chain effect with numerous other nutrients. Consider calcium and how it is processed (Calcium: Nutrient Combination and Ratios); vitamin A plays a central role in bone development, and so maldevelopment of the skull that constricts the cornea or optic nerve is another way deficiency can negatively impact eyesight. Or consider that, “Dietary antioxidants (i.e., vitamin E) also appear to have an important effect on the utilization and perhaps absorption of carotenoids. It is uncertain whether the antioxidants contribute directly to efficient absorption or whether they protect both carotene and vitamin A from oxidative stress. Protein deficiency reduces absorption of carotene from the intestine” (Lee Russell McDowell, Vitamins in Animal Nutrition, pp. 16-17). We humans aren’t smart enough to outsmart nature (Hubris of Nutritionism). The body as a biological system is too complex and there are too many moving parts. If any single thing shifts, everything else follows. The only guaranteed healthy solution is to adhere to a traditional diet that includes plenty of fatty animal foods. This is easy enough for omnivores and carnivores — get plenty of liver and fish (or get it in the form of fish oil and cod liver oil), although any high quality fatty meats will work. And if you’re vegetarian, emphasize pasture-raised eggs and dairy. But for vegans, I can only suggest that you pray to God that you have perfect genetics, perfect metabolism, perfect balance of supplements, and all other aspects of optimal functioning that allow you to be the rare individual who has a high conversion rate of beta-carotene to vitamin A, along with conversion of other precursors (4 Reasons Why Some People Do Well as Vegans (While Others Fail Miserably)) — good luck with that!

To lighten up the mood, I’ll end with a fun factoid. Talking about genetics, an important element is epigenetics. Catherine Shanahan, quoted once already, has written an interesting discussion in her book, Deep Nutrition, that covers the interaction of nutrition with both genetics and epigenetics. Health problems from deficiencies can be passed on, but they can also be reversed when the nutrient is added back into the diet: “One example of the logic underlying DNA’s behavior can be found by observing the effects of vitamin A deficiency. In the late 1930s, Professor Fred Hale, of the Texas Agricultural Experiment Station at College Station, was able to deprive pigs of vitamin A before conception in such a way that mothers would reliably produce a litter without any eyeballs. 50 When these mothers were fed vitamin A, the next litters developed normal eyeballs, suggesting that eyeball growth was not switched off due to (permanent) mutation, but to a temporary epigenetic modification. Vitamin A is derived from retinoids, which come from plants, which in turn depend on sunlight. So in responding to the absence of vitamin A by turning off the genes to grow eyes, it is as if DNA interpreted the lack of vitamin A as a lack of light, or a lightless environment in which eyes would be of no use. The eyeless pigs had lids, very much like blind cave salamanders. It’s possible that these and other blind cave dwellers have undergone a similar epigenetic modification of the genes controlling eye growth in response to low levels of vitamin A in a lightless, plantless cave environment” (p. 57). The body is amazing in what it can do, when we give it the nourishment it needs. Heck, it’s kind of amazing even when we malnourish ourselves and the body still tries to compensate.

* * *

Vitamin A Under Attack Down Under
by Nora Gedgaudas

Traditional and indigenous diets have always venerated foods rich in fat-soluble nutrients and women either pregnant and/or seeking to become pregnant in traditional and indigenous societies (according to the exhaustive and well-documented research of Dr. Weston A. Price, author of the respected and acclaimed textbook, ‘Nutrition and Physical Degeneration’) ate diets rich in fats and fat-soluble nutrients–including liver–for this very purpose. The notion that these foods have somehow–all of a sudden—become toxic to us at any age is patently absurd. In fact, Price—himself a meticulous researcher in the 1930’s determined that traditional/indigenous societies readily consumed more than 10-times the levels of these nutrients—easily estimable at about 50,000 IU per day of preformed vitamin A, as compared to the levels of vitamin A consumed in (his) “modern times”. And people in Weston Price’s “modern day era” (1930’s) were nowhere near as hysterically phobic about foods such as egg yolks, shellfish and liver as we have since become following the fabrication of irrational concerns about such foods (and animal source foods/fats, in general)! Let’s just say we’re not necessarily healthier as a species for consuming at least ten-times less of these vital and protective, activated fat-soluble nutrients today; much less are we enjoying fewer birth defects or improved overall maternal/infant health. In fact, the closer we as a society attempt to emulate government guidelines, the less healthy (according to the latest confirming research) we demonstrably become.

Primal Body, Primal Mind
by Nora Gedgaudas
p. 139

The role of certain nutrients in relation to others and the need for certain cofactors in order to optimize a nutrient’s function or prevent imbalances aren’t normally discussed at all. This, of course, leads to problems.

For instance—and perhaps critically—for each and every receptor for vitamin D, there are two receptors for vitamin A on every cell. Because of the compartmentalized approach to vitamin D research, this sort of thing does not get recognized or discussed. A relative balance of these two nutrients is vital to their healthy functioning in the body. An excess of one can create a relative deficiency of the other. For instance, if you take large amounts of vitamin D without vitamin A, you are potentially more likely to develop symptoms of vitamin A deficiency and experience an actual immunosuppressive effect. Conversely, taking certain commercial cod-liver oil supplements that are rich in vitamin A but poor in vitamin D can lead to more severe vitamin D deficiencies. (It’s important to read labels. The amount of vitamin D in a serving of high-vitamin cod-liver oil is around 1,000 IU. Most commercial brands don’t exceed between 20 and 400 IU). Recent research from Spain indicates that vitamin A is necessary for both vitamin D binding and vitamin D release to receptor sites. The two vitamins are synergistic and should always be balanced in the diet or in supplementation. Individual needs for both may vary considerably.

Fat Soluble Vitamins: Vitamins A, D, E & K
by Jenny MacGruther

Carotenoids, which include the very prevalent beta carotene, are poorly converted by the body. For example, some studies indicate that the body requires as much as twenty-one times the amount of carotenoids to create the same amount of vitamin A as one part retinol. To add insult to injury, many people, especially those suffering from thyroid disorders and small children, are even poorer converters. A 2001 study found that the conversion rate of carotenoids to true vitamin A is so poor as to render it nutritionally insignificant.

Why You Won’t Get Vitamin A From Carrots
by Lauren Geertsen

The most important fact about vitamin A is the difference between retinoids and carotenoids. The vitamin A from animal sources is retinoids, also called retinol, while plant source vitamin A is carotenoids, such as beta carotene.

The retinol from animal sources is bio-available, which means the body can utilize it. The vitamin A from plant sources, in contrast, must first be converted to retinol to be useful in the body. This poses two big problems.

First, when we are in pristine health, it requires at least six units of carotenes to convert into 1 unit of retinol (source). To put this in perspective, that means one must eat 4 1/2 pounds of carrots to potentially get the amount of useable A as in 3 oz. of beef liver (source). What happens if we have digestive issues, hormone imbalances, or other health problems? It requires even more units of carotene in the ratio.

Second, the carotene-to-retinol conversion is HIGHLY compromised. As a matter of fact, this conversion is negligible for many individuals. This conversion is virtually insignificant:

  • In infants
  • In those with poor thyroid function (hypothyroidism)
  • In those with diabetes
  • In those who are on a low fat diet or have a history of low fat dieting
  • In those who have compromised bile production (think: gallbladder and digestive issues) (source and source)

So, do you still think carrots are a vitamin A food? As with other orange veggies, sweet potatoes provide carotenes. Although beta carotene is an antioxidant, it is not true vitamin A. We must eat true vitamin A foods on a daily basis to meet our requirements for this essential nutrient.

Beef or Carrots
by Doug Garrison

This vitamin’s proper name, “retinol,” refers to its role in supporting vision.  Growing up we were told to eat our carrots for healthy eyes, especially to have night vision like cats!  Hmm, do cats eat carrots?  It is the “carotene” in carrots that our bodies can (with effort) convert into vitamin A.  The drawbacks to relying on carrots for your vitamin A:

  • We must use a biochemical reaction with bile salts and enzymes to convert the beta-carotene in carrots into vitamin A.
  • The conversion rate of beta-carotene to vitamin A in each person depends on many factors and ranges from 4 to 28 beta-carotene units to produce one unit of vitamin A.

For an adult male to meet the daily recommended intake for vitamin A, he would need to consume 2 pounds of baby carrots.  (Skipping the baby carrots, he could do one pound of regular carrots, for some reason baby carrots have half the beta-carotene.  Chlorine bath anyone?)  Don’t want to eat that many carrots?  How about 2.3 pounds of kale?  If you are like me, kind of lazy, I’ll opt for my vitamin A already formed in some beef liver.  Less than 1 ounce of beef liver will do the trick.

Still want to get your Vitamin A from carrots?  Boost your body’s conversion rate by eating carrots with animal fat such as cooking carrots with a pasture grazed beef roast!  In fact, we cannot convert the beta carotene found in plants without fat in our diet as a catalyst.

Vitamin A Deficiency: Health, Survival, and Vision
by Alfred Sommer, Keith P. West, James A. Olson, and A. Catharine Ross
p. 101

The ancient Egyptians and Greeks recognized nyctalopia and treated it with calf’s or goat’s liver (high in vitamin A content). By the nineteenth century, nightblindness was known to occur primarily among the poorer strata of society, particularly during periods of dietary deprivation; was exacerbated by photic stress (which bleached so much rhodopsin that synthesis could not keep up with demand, causing borderline deficiency to become manifest as nightblindness), and could be effectively treated with liver or liver oils. In fact, most other manifestations of xerophthalmia were first recognized by their association with nightblindness. Nightblindness (without evidence of xerophthalmia) was reported to have disabled Confederate soldiers between dawn and dusk, and to have affected whole regiments during the Crimean War.

Evolutionary Aspects of Nutrition and Health
edited by Artemis P. Simopoulos
p. 26
“Cereal Grains: Humanity’s Double-Edged Sword”
by Loren Cordain

Vitamin A deficiency remains one of the major public health nutritional problems in the third world [24]. Twenty to 40 million children worldwide are estimated to have at least mild vitamin A deficiency [25]. Vitamin A deficiency is a leading cause of xerophthalmia and blindness among children and also a major determinant of childhood morbidity and mortality [26]. In virtually all infectious diseases, vitamin A deficiency is known to result in greater frequency, severity, or mortality [27]. A recent meta-analysis [28] from 20 randomized controlled trials of vitamin A supplementation in third world children has shown a 30-38% reduction in all cause mortality in vitamin A-supplemented children. Analysis of cause-specific mortality showed vitamin A supplementation elicited a reduction in deaths from diarrheal disease by 39%, from respiratory disease by 70% and from all other causes of death by 34% [28]. Clearly, the displacement of beta-carotene-containing fruits and vegetables and vitamin A-containing foods (milk fat, egg yolks and organ meats) by excessive consumption of cereal grains plays a major role in the etiology of vitamin A deficiency in third world children.

Malnutrition and the Eye
by Donald McLaren
pp. 165-171

Few effective cures have been known so long to mankind as that of liver for night blindness. No doubt this was due in part to the dramatic nature of both the onset of the condition and of its relief. It is probable that the Ebers papyrus, written about 1600 B.C. in Egypt, referred to night blindness when it recommended liver for treatment of the eyes. A literal translation reads “Another [prescription] for the eyes: liver of ox roasted and pressed, give for it. Very excellent” (Drummond and Wilbraham, 1939). At about the same time the physicians in China were giving liver, dung of the flying fox, and tortoise shell for the cure of night blindness (Read, 1936). Hippocrates prescribed the whole liver of an ox dipped in honey and the therapeutic value of liver was also known to later Roman writers. It is believed that Celsus (25 B.C.-50 A. D.) first used the term xerophthalmia. […]

It would seem that night blindness was widespread in Europe in medieval times, for we find a 14th century poet in Holland, Jacob van Maerland, referring to the disease and its cure in this way (Bicknell and Prescott, 1953):

He who cannot see at night
Must eat the liver of the goat.
Then he can see all right.

[…] The relationship to poor general health and infectious disease was noted frequently in early accounts of xerophthalmia with special attention paid to intestinal disorders (Teuscher, 1867; de Gouvea, 1883). Baas (1894) described both night blindness and xerophthalmia in patients with liver disease and there have been many confirmatory accounts since. There is now reason to believe that impairment of dark adaptation in patients with disease of the liver may not always be due to deficiency of vitamin A […]

Although the cure for night blindness had been known since time immemorial, it was not until the last century that the dietary deficiency nature of the condition was recognized. […] With the turn of the century several further steps forward were taken in the understanding of the nature of the disease. Jensen (1903) was the first to show that xerophthalmia could be cured by an adequate diet and for this purpose used raw cow’s milk. It is interesting to note that he observed a rapid improvement on this regime not only as judged by the condition of the eyes and gain in weight but particularly by the disappearance of what he called the “characteristic psychic indifference.” This recognition of the profound systemic effects of vitamin A deficiency has not always persisted since this time and the high mortality attributable to the disease in its severest form has also been lost sight of at times.

In 1904 the important observation was made by Mori that the disease known as “hikan,” characterized by conjunctival xerosis and keratomalacia and widely prevalent among children aged 2-5 years in Japan, was most common in the children of people living largely on rice, barley and other cereals, beans, and vegetables. It did not occur among fisher folk, and cod liver oil, chicken liver, and eel fat were all effective remedies.

The association of xerophthalmia with an excessive intake of carbohydrate in the diet in infancy was recorded by Czerny and Keller (1906) in their classical monograph on the syndrome they termed Mehlnahrschaden. It is now recognized that this condition is identical in all basic features to what has been called “the most serious and widespread nutritional disorder known to medical and nutritional science” (Brock and Autret, 1952) and due in essence to a deficiency of protein and excess of carbohydrate in the diet. Many local and other names have been applied to this disease but it will be necessary here to use one, and that chosen, “kwashiorkor,” has found wide acceptance. Since Cerny’s day there has been a great number of other accounts in which ocular involvement has been described (McLaren, 1958), providing good evidence for the contention that a deficiency of vitamin A is the most common of all vitamin deficiencies associated with kwashiorkor.

Handbook of Nutrition, Diet, and the Eye
edited by Victor R. Preedy
p. 301
“Vitamin A, Zinc, Dark Adaptation, and Liver Disease”
by Winsome Abbot-Johnson and Paul Kerlin

McCollum and Davis (1912) found that ‘fat-soluble factor A’ was essential for growth in rats. The important connection between vitamin A deficiency and night blindness however was made by Frederica and Holm in 1925, who observed slower generation of visual purple in light-adapted vitamin A-deficient rats than for normal rats when put into the dark.

A relationship between night blindness and cirrhosis was reported by Haig et al. in 1938 and Patek and Haig in 1939. It was thought that these patients may be deficient in vitamin A and the deficiency state was not thought to be attributable to inadequate intake of the vitamin in their food. Impairments of dark adaptation (DA) included delayed rod cone break (time when rods become more sensitive to light than cones), higher intensity of light seen at 20 minutes (postexposure to a bright light), and higher intensity of light seen at final reading (elevated final rod thresholds). Nineteen of 24 patients demonstrated night blindness but none was aware of this on direct questioning.

The Vitamin A Story
by Richard D. Semba
p. 76

The piecemeal clinical picture of night blindness caused by vitamin A deficiency (see previous chapters) finally came together between 1896 and 1904, when Japanese physician Masamichi Mori described more than fifteen hundred children with hikan — that is, xerophthalmia [37]. Mori had studied medicine at the Mie Prefectural Medical School and the Tokyo University and gone on to work in Germany and Switzerland before returning to Mie Prefecture to practice surgery. The children Mori described had night blindness, Bitot’s spots, corneal ulceration, keratomalacia, and diarrhea. The death rate among them was high. Most were between ages one and four and one-half, and many came from poor families living in mountainous regions, where the diet completely lacked milk and fish. Once under medical care, the children were given cod liver oil daily, and this proved to be an effective treatment for both the eye lesions and diarrhea. Contrary to the view of many physicians, Mori concluded that the disease was not infectious but rather was caused by the lack of fat in the diet.

Fat-Soluble Vitamins
edited by Peter J. Quinn and Valerian E. Kagan
p. 150
“Plasma Vitamins A and E in HIV-Positive Patients”
by Joel Constans, Evelyne Peuchant, Claire Sergent, and Claude Conri

During the last 10 years it has been demonstrated that vitamin A deficiency not only results in xerophthalmia and blindness, but also in mortality and susceptibility to infectious diseases (Lammer et al., 1985; Reddy et al., 1986). Treatment with massive intermittent dosages of vitamin A has resulted in a decrease in mortality in developing countries (Sommer et al., 1986). It has been suggested that vitamin A might have a positive effect on the immune system and that marginal deficiencies in vitamin A that are unable to give rise to xerophthalmia and blindness might impair immune defenses (Bates, 1995; Sommer et al., 1984). Deficiency in vitamin A clearly has effects on the immune system, and the number and the function of natural killer cells were depressed in vitamin A-deficient rats (Bates, 1995). Supplementation with vitamin A (60 mg retinol equivalents) resulted in higher CD4/CD8 ratio, higher CD4 naive T cells, and lower proportions of CD8 and CD45 RO T cells in vitamin A-deficient children compared to placebo-treated children (Semba et al., 1993b). Vitamin A deficiency might also result in depression of humoral response to proteins and alterations of mucosal surfaces (Bates, 1995). Watson et al. (1988) reported that high vitamin A given to mice with retroviral infection increased survival and numbers of macrophages and total T lymphocytes.

Vitamin History: The Early Years
by Lee McDowell
pp. 61-66

IV. Xerophthalmia And Night Blindness History, From Antiquity To 14th Century

Vitamin A deficiency is one of the oldest recorded medical conditions, long recognized as eye manifestations. For thousands of years humans and animals have suffered from vitamin A deficiency, typified by night blindness and xerophthalmia. The cause was unknown, but it was recognized that consumption of animal and fish livers had curative powers according to records and folklore from early civilizations. It is interesting to find the knowledge of the cure is almost as old as medicine.

Night blindness and its successful treatment with animal liver was known to the ancient Egyptians (Fig. 3.4). Eber’s Papyrus, an Egyptian medical treatise of between 1520-1600 B.C., recommends eating roast ox liver, or the liver of black cocks, to cure it (Aykroyd, 1958). Wolf (1978) notes that a more careful evaluation of ancient Egyptian writings (Eber’s Papyrus no. 351) reveals that the therapy consisted of topical application of vitamin A rich liver juice to the eyes. Wolf (1978) suggested that with the topical application some of the liver oil must enter the lacrimal duct and thereby reach the throat via the nose. Therefore, the vitamin A could enter the body despite its topical application.

Vitamin deficiency diseases in China such as xerophthalmia and night blindness had been very prevalent from olden times (Lee, 1940). For preventing night blindness the Chinese in 1500 B.C. were giving liver, honey, flying fox dung and tortoise shell, all of which would have cured night blindness (Bicknell and Prescott, 1955).

The term “sparrow eyed” was used for a man who could see in daytime but not at twilight. The sparrow also has vision problems at night. Even though the ancient Chinese did not know the real cause of night blindness, they knew it was caused by a nutritional disturbance. A report from the 7th century notes that pig’s liver could cure night blindness.

In the Bible book “Tobit”, blindness, apparently due to vitamin A deficiency, is described. The setting of the story is the latter part of the 8th century B.C. in the Assyrian capital of Nineveh where the people of Northern Israel had been taken captive. In this book God sends the angel Raphael who tells Tobit’s son to rub the eyes of Tobit with fish bile. After this Tobit was no longer blind (Grabman, 1973). Around 600 B.C. an early reference to vitamin A deficiency in livestock is in the Bible (Jeremiah 14:6, King James version): “and the asses did stand in high places, their eyes did fail, because there was no grass.”

Evaluation of medicine of the Assyrian-Babylonian empires (900-400 B.C.) report eye diseases or conditions (Krause, 1934). The Babylonian word sin-lurma (night blindness) was described as “a man can see everything by day but can see nothing by night.” The prescription for cure was to use a concoction whose major ingredient was “liver of an ass.” The procedure was for a priest to say to the person with night blindness “receive, o dim of the eye”. Next the liver-based potion was applied to the eyes. Also for xerophthalmia a type of prescription that would provide vitamin A was “thou shalt disembowel a yellow frog, mix its gall in curd, apply to eyes”.

The old Greek, Roman, and Arab physicians recommended internal and external therapy with the livers of goats to overcome night blindness. The Greek Hippocrates, who lived c. 460-370 B.C., recognized night blindness and recommended eating raw ox liver dipped in honey as a cure (Littre, 1861). The instruction was to “eat, once or twice, as big an ox-liver as possible, raw, and dipped in honey”. To eat a whole ox liver seems a superhuman feat, even when we reflect that the ox of antiquity was a much smaller creature than that of today (Figure 3.4).

Mani in 1953 reviewed the Greek therapy for night blindness (cited by Wolf, 1978). A precise definition of night blindness is not found until after the time of Hippocrates. Galen (130 to 200 AD) describes patients who are “blind at night,” and Oribasius (325 AD) defines night blindness as “vision is good during the day and declines at sundown, one cannot distinguish anything any longer at night”. Galen recommends as the cure for night blindness “continuous eating of roasted or boiled liver of goats”. He also suggests, as did the Egyptians, a topical treatment: “the juice of the roasted liver should be painted on the eyes”.

Xerophthalmia had been known as hikan in Japan since antiquity (Mori, 1904). Mori stated that hikan was common among people who subsisted in great measure on rice, barley and other cereals, beans and other vegetables, whereas it did not occur among fisher folk. He not only recognized the entire sequence of ocular changes in xerophthalmia, but also the central role of dietary deficiency of fats (particularly fish liver oils) resulting from either a faulty diet or faulty absorption, the role of diarrhea, kwashiorkor (protein deficiency) and other contributory and precipitating events. Mori reports that administration of cod liver oil was followed by speedy relief from the disorder and that chicken livers and eel fat were effective remedies also. He incorrectly concluded that deficiency of fat in the diet was the cause of the disease.

V. Xerophthalmia And Night Blindness History, 1300-1900

Liver was the most widely used cure for night blindness. Jacob van Maerland, a Dutch poet of the 14th century, concluded: “He, who cannot see at night, must eat the liver of the goat. Then he can see all right” (Bicknell and Prescott, 1955).

Guillemeau in France in the 16th century clearly described night blindness and advised liver for its cure (Bicknell and Prescott, 1955), a remedy also advised by other writers during this time.

The first mention of liver for the eyes in England was in Muffett’s “Health’s Improvement” (1655), though Bayly, at one time Queen Elizabeth’s physician, in his book on eyes recommends “rawe herbes”, among which is “eie bright” (Drummond, 1920). The only evidence of night blindness being common at this time is references to mists and films over the eyes. The “rawe herbes” would of course provide provitamin A (Bicknell and Prescott, 1955).

In 1657 Hofer expressed the view that night blindness is caused by malnutrition, a thought reintroduced nearly 100 years later, in 1754, by von Bergen (Rosenberg, 1942). Von Bergen also speculated that night blindness might be due to excessive exposure to sunlight. This would later be confirmed, as bright light increases the need for regeneration of retinal visual pigment. In a review of the literature, Hicks (1867) noted that night blindness was observed by Baron Young during Napoleon’s Egyptian campaign in 1798.

At one time, night blindness was a typical disease of seafarers, due to the lack of fresh food. Aykroyd (1944), in his accounts of Newfoundland and Labrador fishermen, noted that they not only recognized how bright sunlight may bring on night blindness, but also used liver, preferably the raw liver of a gull or puffin, as a cure. In rural communities the inability to see at dusk is a very serious condition; fishermen, for instance, may walk off the rocks into the sea after landing in the evening. Night blindness can be cured, often in 12 hours, by eating food rich in vitamin A, such as liver. In addition to eating liver, patients with night blindness were recommended to hold their heads over the steam rising from the roasting liver. The dramatic quickness of both the onset and the cure explains why liver has been used for centuries for prevention and cure of night blindness (Bicknell and Prescott, 1955).

Another region of the world where liver was used to control night blindness was central Africa. Medicine men in Ruanda-Urundi prescribed chicken liver to cure night blindness (Tseng Lui and Roels, 1980). The origin and original date of implementation of their therapy are unknown.

During the Lewis and Clark expedition (1804-1806) to open up the far west of the United States, a number of men developed severe eye troubles toward the end of the trip. These men had lived for long periods upon buffalo, dog, and horse meat; the muscle of these animals is low in fat-soluble vitamins (McCay, 1973).

Magendie (1816) in France studied the lack of protein in dog diets by feeding the animals sugar and water. Magendie noted that during the third week the dogs were thin, lost liveliness, had decreased appetite, and developed a small ulceration in the center of the transparent cornea. The ulcer appeared first on one eye and then the other; it increased rapidly and at the end of a few days was more than two millimeters in diameter and in depth. Soon the cornea was entirely pierced and the fluid of the eye flowed out. An abundant secretion from the glands of the eyelids accompanied this singular phenomenon. Magendie repeated this experiment twice more with identical results.

Although xerophthalmia had been known in ancient Egypt, Magendie appears to be the first to experimentally produce the condition. Not only did Magendie record the production of xerophthalmia in animals, he recognized the analogous condition in man as a result of a restricted diet. In his report Magendie noted an experiment by an English doctor named Stark, who lived on an exclusive diet of sugar for one month. Livid red spots appeared in his eyes, which seemed to announce the approach of an ulcer (xerophthalmia). C.M. McCay in 1930 suggested that Magendie was the father of the vitamin hypothesis.

In a journey from Calcutta to Bombay, India, in 1824-1825, vitamin A deficiency was observed (Aykroyd, 1944). Individuals would sometimes describe their condition as being “night blind.” By the mid-1800s, xerophthalmia was recognized in many areas of Europe, particularly in Russia during the long Lenten fasts (Blessig, 1866), in the United States (Hicks, 1867), and elsewhere around the world (Sommer and West, 1996). Hubbenet (1860) reported night blindness in the Crimean War (1853-1856).

Inflammation of the cornea and conjunctiva associated with night blindness had been ascribed to “defective nutriment” by Budd in 1842. Wilde (1851) reached similar conclusions during the Irish famine in 1851. For treatment he recommended cod-liver oil.

In 1857 David Livingstone, a medical missionary in Africa, described eye problems in his native carriers when they were forced by circumstances to subsist for a time on coffee, manioc, and meal (Livingston, 1905). He noted that the eyes became affected (e.g., ulceration of the cornea) as they did in animals receiving starch. He was probably referring to the dog study of Magendie in 1816.

Hicks (1867), a doctor in the Confederate army, described night blindness in the U.S. Civil War (1861-1865). The disease was found in the Army of Northern Virginia so extensively as to resemble an epidemic. Soldiers attributed it to the “effect of the moon-light falling upon their eyes while sleeping upon the ground.” A soldier who had marched all day without problems would complain of blindness at the approach of early twilight and make immediate application for transportation in an ambulance. At such times he would be found blundering along just like a blind man, holding on to the arm of his companion. For those with night blindness examined at night by candle-light, the pupil was found to be dilated and unresponsive to the stimulus of the light.

To overcome night blindness, extreme treatments such as cupping, leeching, blistering, iron, mercury and potash were used extensively, but most often did more harm than good (Hicks, 1867). Cases frequently recovered spontaneously after all treatments had been abandoned. Hicks (1867) observed that a furlough from the army was most beneficial to cure night blindness. It was noted that poverty, filth and the absence of vegetables were associated with night blindness. The disease was found to be most prevalent when symptoms of scurvy were also observed. Vegetables were observed to be of benefit for both scurvy and night blindness.

In 1883 De Gouvea described night blindness in poorly nourished slaves in Brazil. He noted that the slaves were unable to see when returning from work after sunset, but could see well when starting for work before sunrise. Their food was beans, pork fat and corn meal. Exposure to sunlight was suspected of inducing night blindness, and resting the eyes at night was believed to result in recovery.

In the 1800s a number of investigators related malnutrition to night blindness and more clearly described the associated eye problems. The researchers, cited by Rosenberg (1942) and Jayle et al. (1959), included Bamfield (1814), Schutte (1824), Foerster (1858), Von Graefe (1859), Hubbenet (1860), Netter (1863), Bitot (1863), Toporow (1885), Kubli (1887) and Parinaud (1881). Bitot’s name is widely known in conjunction with his observation on conjunctival and corneal changes in vitamin A deficiency. Hubbenet (1860) observed children in a French orphanage and described the progression of xerophthalmia from night blindness through conjunctival and corneal involvement, attributing it to a faulty diet. Toporow (1885) called attention to the importance of fats, and Schütte (1824) and Kubli (1887) to that of cod liver oil, in the prevention of night blindness (cited by Jayle et al., 1959; Loosli, 1991). Parinaud (1881) opened a new era in this field by connecting night blindness with a slowing down in the regeneration of retinal pigment.

VI. Relationship Of Cod Liver Oil To Eye Disease

In the early years of recorded history, liver was the principal food used in a number of societies to control night blindness. The discovery of vitamins A and D was closely related to studies with cod liver oil. This oil has been used since very early times by the Greenlanders, Laplanders, and Eskimos (Loosli, 1991). Harris (1955) reviewed the early medical uses of cod liver oil. The earliest medical record was for treatment of rickets in 1766 in Manchester, England. In 1848 a treatise on cod-liver oil, published in Edinburgh, describes how xerophthalmia may be cured with the oil. In 1874 Dusart, in a Paris research bulletin, noted that his common treatments were wine, quinine, cod liver oil, and body massage. The beneficial effect of cod liver oil in the treatment of rickets, osteomalacia, generalized malnourishment, and certain eye conditions was widely recognized by the middle of the 19th century.

Cod liver oil used to be prepared in wooden barrels that had small holes bored in the side, plugged with pegs. As the fishermen cleaned the cod for salting and drying, they threw the livers into these barrels. After the livers rotted, the oil was set free and rose to the top, where it could be drawn off through the holes in the side of the barrel. This cod liver oil had many of the attributes of medicines of olden times, namely a dark brown color, an unpleasant odor, and a nauseating flavor. Historically, cod liver oil was used in the treatment of both eye disease and rickets.

In a particularly thoughtful and well-documented study published in 1881, Snell demonstrated that cod liver oil would cure both night blindness and Bitot’s spots. Within a decade, meat, milk, and cod liver oil were routinely administered for corneal ulceration and dissolution (keratomalacia). In 1904 Mori gave an elaborate account of large numbers of cases of xerophthalmia in Japan and how cod-liver oil cured the condition.

High vs Low Protein

P. D. Mangan tweeted a quote from a research paper, Reversal of epigenetic aging and immunosenescent trends in humans by Gregory M. Fahy et al. He stated that the “Most important sentence in aging reversal study” is the following: “Human longevity seems more consistently linked to insulin sensitivity than to IGF‐1 levels, and the effects of IGF‐1 on human longevity are confounded by its inverse proportionality to insulin sensitivity.” Mangan added that “This line agrees with what I wrote a while back” (How Carbohydrates and Not Protein Promote Aging); and in the comments section of that article, someone pointed to a supporting video by Dr. Benjamin Bikman (‘Insulin vs. Glucagon: The relevance of dietary protein’). Here is the context of the entire paragraph from the discussion section of the research paper:

“In this regard, it must be pointed out that GH and IGF‐1 can also have pro‐aging effects and that most gerontologists therefore favor reducing rather than increasing the levels of these factors (Longo et al., 2015). However, most past studies of aging and GH/IGF‐1 are confounded by the use of mutations that affect the developmental programming of aging, which is not necessarily relevant to nonmutant adults. For example, such mutations in mice alter the normal innervation of the hypothalamus during brain development and prevent the hypothalamic inflammation in the adult (Sadagurski et al., 2015). Hypothalamic inflammation may program adult body‐wide aging in nonmutants (Zhang et al., 2017), but it seems unlikely that lowering IGF‐1 in normal non‐mutant adults can provide the same protection. A second problem with past studies is a general failure to uncouple GH/IGF‐1 signaling from lifelong changes in insulin signaling. Human longevity seems more consistently linked to insulin sensitivity than to IGF‐1 levels, and the effects of IGF‐1 on human longevity are confounded by its inverse proportionality to insulin sensitivity (Vitale, Pellegrino, Vollery, & Hofland, 2019). We therefore believe our approach of increasing GH/IGF‐1 for a limited time in the more natural context of elevated DHEA while maximizing insulin sensitivity is justified, particularly in view of the positive role of GH and IGF‐1 in immune maintenance, the role of immune maintenance in the retardation of aging (Fabris et al., 1988), and our present results.”

In the Twitter thread, Командир Гиперкуба said, “So it is insulin [in]sensitivity that drives ageing rather than IGF‐1/GH. Huge if true.” And GuruAnaerobic added, “I assume this isn’t IR per se, but IR in the presence of carbohydrate/excess food. IOW, the driver is environment.” Mangan then went on to point out that “It explains the dichotomy of growth vs longevity, and why calorie restriction increases lifespan.” Mick Keith asked, “So drop carbs and sugar? Go paleo style?” And Mangan answered, “There are other aspects to insulin sensitivity, but yes.” All of this cuts to the heart of a major issue in the low-carb community, an issue that I only partly and imperfectly understand. What I do get is that this has to do with the conclusions various experts come to about protein, whether higher amounts are fine or intake should be very limited. Some see insulin sensitivity as key while others prioritize IGF-1, and the confounding between the two requires careful interpretation. In the comments section of Mangan’s above-linked article, Rob H. summed it up well:

“Great post, very timely too as I believe this is an issue that seems to be polarising the science-based nutrition space at the moment. Personally I fall down on the same side as you Dennis – as per Ben Bikman’s video which has also been posted here, as well as the views of all the main protein researchers including Stuart Philips, Jose Antonio, Donald Layman, Gabrielle Lyon, Ted Naiman, Chris Masterjohn etc who all believe the science clearly supports a high protein intake eg 1.6 -2.2g/kilo of bodyweight – with no upper limit which has yet been observed. At the same time, I have just been reading the new book by Dr Steven Gundry ‘The Longevity Paradox’. Has anyone read this one yet? Whilst about 90% of the content is fairly solid stuff (although nothing that hasn’t already been written about here) he aggressively supports Longo’s view that we should only consume 0.37g protein/ kilo of bodyweight, eg around 25g of protein/ day for most males. Also that animal protein should be avoided wherever possible. Personally I consume double that amount of protein at each meal! It appears that Longo, Gundry, Dr Ron Rosedale and Dr Mercola are all aligned in a very anti-animal protein stance, but also believe their view is backed by science – although the science quoted in Gundry’s book seems to be largely based on epidemiology. Both sides can’t be right here, so I hope more research is done in this field to shut this debate down – personally I feel that advising ageing males to consume only 25g of protein a day is extremely irresponsible.”

In response, Mangan wrote, “I agree that is irresponsible. Recently Jason Fung and James DiNicolantonio jumped on the anti animal protein bandwagon. My article above is my attempt (successful, I hope) to show why that’s wrong.” Following that, Rob added, “Humans have been consuming animal proteins for most or all of our evolutionary history. And certainly, large quantities of animal protein were consumed at times (as when a kill of a large animal was made). So, I cannot imagine that the “evidence” supporting an anti-animal protein stance can be solid or even science-based. This sounds like a case of certain researchers trying their best to find support for their pre-determined dietary beliefs (vegan proponents do this all the time). I’m not buying it.” It’s very much an ongoing debate.
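To make the disputed numbers concrete, here is a minimal sketch in Python of the arithmetic behind the two recommendations quoted above. It assumes the roughly 68 kg reference bodyweight implied by the quoted figure of 25 g/day at 0.37 g/kg; the function name, labels, and rounding are mine, purely for illustration, and are not taken from either camp's protocols.

```python
# Rough arithmetic behind the protein recommendations quoted above.
# Assumes a ~68 kg reference bodyweight (25 g/day at 0.37 g/kg implies about 68 kg);
# labels and values are illustrative only.

def daily_protein_grams(bodyweight_kg: float, grams_per_kg: float) -> float:
    """Daily protein target in grams for a given bodyweight and dosing rate."""
    return bodyweight_kg * grams_per_kg

bodyweight = 68.0  # kg

recommendations = {
    "Longo/Gundry-style restriction (0.37 g/kg)": 0.37,
    "High-protein camp, low end (1.6 g/kg)": 1.6,
    "High-protein camp, high end (2.2 g/kg)": 2.2,
}

for label, rate in recommendations.items():
    print(f"{label}: ~{daily_protein_grams(bodyweight, rate):.0f} g/day")

# Prints roughly 25, 109, and 150 g/day -- a four- to six-fold spread,
# which is why the two camps' advice cannot both be right.
```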

I have suspicions about the point of confusion that lies at the origin of this disagreement. Fear of promoting too much growth through protein is basically the old Galenic argument based on humoral physiology. The belief was that too much meat, as a stimulating/nurturing substance, built up the ‘blood’ with too much heat and dryness, which would burn up the body and shorten the lifespan. This culturally inherited bias about meat has since been fancied up with scientific language. But ancient philosophy is not the best source for formulating modern scientific theory. Let me bring this back to insulin sensitivity and insulin resistance, which appear to play the determining role. Insulin is a hormone, and so we must understand this from an endocrinological approach, quite different from Galenic-style fears about meat that were filtered through the Christian theology of the Middle Ages.

Hormones are part of a complex hormonal system going far beyond macronutrients in the diet, although it does appear that the macronutrient profile is a major factor. Harry Serpano, in a discussion with Bart Kay, said: “In a low insulin state, when you’re heavy meat and fat and your insulin is at 1.3, as Dr. Paul Mangan has actually shown in one of his videos, it’s quite clear; and in what I’m showing in one of the studies, it’s quite clear. It’s so close to basically fasting which is 0.8 — it’s very low. You’re not going to be pushing up these growth pathways like mTOR or IGF-1 in any significant way.” As with so much else, there is strong evidence that what we need to be worrying about is insulin, specifically a high-carb diet that causes insulin resistance and metabolic syndrome. That is what most reliably and severely decreases longevity.

This question about too much protein recently came up in my own thoughts while reading Dr. Steven Gundry’s new book, The Longevity Paradox. As mentioned above, he makes a case against too much animal protein. But it sounds like there is more information to be considered about the effect on health, growth, and longevity. In a dialogue with Gundry, Dr. Paul Saladino defended meat consumption (Gundry’s Plant Paradox and Saladino’s Carnivory). What Mangan has added to this debate strengthens that position.

* * *

In one of the above quoted comments, Rob H. mentions that Dr. Joseph Mercola is one of those “aligned in a very anti-animal protein stance, but also believe their view is backed by science.” It’s interesting, then, that I’m just now listening to a discussion between Mercola and Siim Land. They met at a conference and got to talking, and Mercola then read Land’s book, Metabolic Autophagy. Land is more in the camp supporting the value of protein. His view is nuanced, and the debate isn’t entirely polarized. The role protein plays in health depends on the health outcomes being sought and the conditions under which protein is being eaten: amounts, regularity of meals, assimilation, etc. It’s about how one’s body is able to use protein and to what end.

Right at the beginning of their talk, Mercola states that he is impressed by Land’s knowledge and persuaded by his view on protein. Land makes the simple point that one doesn’t want to be in autophagy all the time but rather to cycle between periods of growth and periods of autophagy. Too much protein restriction, especially all the time, is not a good thing. Mercola seems to have come around to this view. So, it’s a shifting debate. There is a lot of research, and new studies are coming out all the time. But obviously, context is important in making any statement about protein in the diet. Maybe Saladino will similarly bring Gundry on board with greater protein being a good thing for certain purposes, or maybe they will come to a middle ground. These dialogues are helpful, in particular for an outsider like me who is listening in.

* * *

On a personal note, I’m not sure I take a strong position either way. But I’ve long been persuaded by Siim Land’s view. It feels more moderate and balanced. The opposite side can sound too fear-mongering about protein, not seeming to allow for differences in contexts and conditions. From a low-carb perspective, one has to replace carbs with something, and that means either protein or fat, and one can only consume so much fat. Besides, proteins really are important for anabolism and activating mTOR, for building the body. Maybe if you’re trying to lose weight, or simply maintaining where you’re at with no concern for healing or developing muscle, then protein would play less of a role. I don’t know.

Traditional societies don’t seem to worry about protein amounts. When they have access to it, they eat it, at times even to the point of their bellies distending. And when not, they don’t. Those populations with greater access don’t appear to suffer any harm from greater protein intake. Then again, these traditional societies tend to do a lot of strenuous physical activity. They also usually mix it up with regular fasting, intermittent and extended. I’m not sure how optimal protein levels may differ depending on lifestyle. Still, I’d think that the same basic biological truths would apply to all populations. For most people in most situations, increased protein will be helpful at least some of the time and maybe most of the time. Other than fasting, I’m not sure why one needs to worry about it. And with fasting, protein restriction happens naturally.

So, maybe eat protein to satiation. Then throw in some fasting. You’ll probably be fine. There doesn’t seem to be anything to be overly concerned about, based on what evidence I’ve seen so far.

Hubris of Nutritionism

There is a fundamental disagreement over diets. It is about one’s philosophical position on humanity and the world, about the kind of society one aspires to. Before getting to nutritionism, let me explain my present understanding as it has developed from what I’ve learned. It’s all quite fascinating. There is a deeper reason why, for example, I see vegetarianism as potentially healthy but not veganism (see the debate in the comments section of my recent post A Fun Experiment), and that distinction will be central to my following argument. There have been some, though not many, traditional societies that were vegetarian or rather semi-vegetarian for millennia (e.g., India; see the specific comment in the above linked post), but veganism didn’t exist until the Seventh-day Adventists invented it in the late 19th century. Few people know this history. It’s not exactly something most vegan advocates, other than Adventists themselves, would want to mention.

Veganism is a modernization of the ancient Greek Galenic theory of humors, which had originally been incorporated into mainstream Christian thought during feudalism, especially within the monastic tradition of abstinence and self-denial, but was also applied to the population at large through food laws. A particular Galenic argument is that, by limiting red meat and increasing plant foods, there would be a suppression or weakening of libido/virility, the hot-bloodedness that otherwise threatens to ‘burn’ up the individual. (The outline of this ideology remains within present dietary thought in the warning that too much animal protein will up-regulate mTOR and over-activate IGF-1 which, as it is asserted, will shorten lifespan. Many experts, such as Dr. Steven Gundry in The Longevity Paradox and biological anthropologist Stephen Le in 100 Million Years of Food, have been parroting Galenic thought without any awareness of the origin of the ideas they espouse. See my posts High vs Low Protein and Low-Carb Diets On The Rise.) It was also believed this Galenic strategy would help control problematic behaviors like rowdiness, the reason red meat sometimes was banned prior to Carnival in the Middle Ages (on dietary systems as behavioral manipulation and social control, see Food and Faith in Christian Culture ed. by Ken Albala and Trudy Eden and some commentary about that book at my posts Western Individuality Before the Enlightenment Age and The Crisis of Identity; for similar discussion, also check out The Agricultural Mind, “Yes, tea banished the fairies.”, Autism and the Upper Crust, and Diets and Systems). For the purposes of Christian societies, this has been theologically reinterpreted and reframed. Consider the attempt to protect against the moral sin of masturbation as part of the Adventist moral reform, such that modern cereal was originally formulated specifically for an anti-masturbation campaign — the Breakfast of Champions!

High protein vs low protein is an old conflict, specifically in terms of animal meat and even more specifically red meat. It’s more of a philosophical or theological disagreement than a scientific debate. The anti-meat argument would never hold such a central position in modern dietary thought if not for the influence of heavily Christianized American culture; it’s part of Christian theology in general. Gary Taubes discusses this in terms of how dieting gets portrayed as the sins of gluttony and sloth: “Of all the dangerous ideas that health officials could have embraced while trying to understand why we get fat, they would have been hard-pressed to find one ultimately more damaging than calories-in/calories-out. That it reinforces what appears to be so obvious – obesity as the penalty for gluttony and sloth – is what makes it so alluring. But it’s misleading and misconceived on so many levels that it’s hard to imagine how it survived unscathed and virtually unchallenged for the last fifty years” (Why We Get Fat). Read mainstream dietary advice and you’ll quickly hear this morality-drenched worldview of fallen humanity and Adam’s sinful body. This goes along with the idea of “no pain, no gain” (an ideology I came to question upon seeing how simple and easy low-carb diets are, specifically how ketosis eliminates endless hunger and cravings while making fat melt away with little effort, not to mention how my decades of drug-resistant, suicidally-prone depression also disappeared, something many others have experienced; so it turns out that for many people great gain can be had with no pain at all). The belief has been that we must suffer and struggle to attain goodness (with physical goodness being an outward sign of moral goodness), such that the weak flesh of the mortal frame must be punished with bodily mortification (i.e., dieting and exercise) to rid it of its inborn sinful nature. Eating meat is a pleasurable temptation in nurturing the ‘fallen’ body and so it must be morally wrong. This Christian theology has become so buried in our collective psyche, even in science itself, that we no longer are able to recognize it for what it is. And because of historical amnesia, we are unaware of where these mind viruses come from.

It’s not only that veganism is a modern ideology in a temporal sense, as a product of post-Enlightenment fundamentalist theology and its secularization. More importantly, it is a broader expression of modern ways of thinking and perceiving, of being in and relating to the world, including but far from limited to how it modernizes and repurposes ancient philosophy (Galen wasn’t advocating veganism, religious or secularized, that is for sure). Besides the crappy Standard American Diet (SAD), veganism is the only other diet entirely dependent on industrialization by way of chemical-laden monoculture, high-tech food processing, and global trade networks — and hence enmeshed in the web of big ag, big food, big oil, and big gov (all of this, veganism and the industrialization that made it possible, surely was far beyond Galen’s imagination). To embrace veganism, no matter how well-intentioned, is to be fully complicit in modernity and all that goes with it — not that this makes individual vegans bad people, as to varying degrees all of us are complicit in the world we are born into. Still, veganism stands out because, within that ideological framework, there is no choice outside of modern industrialization.

At the heart of veganism is a techno-utopian vision and technocratic impulse. It’s part of the push for a plant-based diet that began with the Seventh-day Adventists, most infamously Dr. John Harvey Kellogg, who formed the foundation of modern American nutritional research and dietary recommendations (see the research of Belinda Fettke, who made this connection: Ellen G White and Medical Evangelism; Thou Shalt not discuss Nutrition ‘Science’ without understanding its driving force; and Lifestyle Medicine … where did the meat go?). I don’t say this to be mean or dismissive of vegans. If one insists on being a vegan, there are better ways to do it. But it will never be an optimal diet, neither for the individual nor for the environment (and, yes, industrial agriculture does kill large numbers of animals, whether or not the vegan has to see it in the grocery store or on their plate; see my post Carnivore Is Vegan: if veganism is defined by harming and killing the fewest lives, if veganism is dependent on industrialization that harms and kills large numbers of lives, and if carnivore is potentially the least dependent on said industrialization, then we are forced to come to the conclusion that, by definition, “carnivore is vegan”). Still, if vegans insist, they should be informed and honest in embracing industrialization as a strength, rather than hiding it as a weakness, in overtly arguing for techno-utopian and technocratic solutions in the Enlightenment fashion of Whiggish progressivism. Otherwise, this unacknowledged shadow side of veganism remains an Achilles’ heel that eventually will take down veganism as a movement when the truth is finally revealed and becomes public knowledge. I don’t care if veganism continues in its influence, but if vegans care about advocating their moral vision, they had better do some soul-searching about what exactly they are advocating, for what reason, and to what end.

Veganism is not only unique as the only specific diet that is fully industrialized (SAD isn’t comparable because it isn’t a specific diet, and one could argue that veganism, as an industrialized diet, is one variety of SAD). More importantly, what makes veganism unique is its ethical impetus. That is how it originated within the righteously moralizing theology of Adventism (to understand the moral panic of that era, read my post The Crisis of Identity). The Adventist Ellen G. White’s divine visions from God preceded the health arguments. And even those later health arguments within Adventism were predicated upon a moralistic hypothesis of human nature and reality, that is to say theology. Veganism has maintained the essence of that theology of moral health, even though the dietary ideology was quickly sanitized and secularized. Adventists like Dr. Kellogg realized that this new kind of plant-based diet would not spread unless it was made to seem natural and scientific, a common strategy of fundamentalist apologetics such as pseudo-scientific Creationism (I consider this theologically-oriented rhetoric to be a false framing; for damn sure, veganism is not more natural, since it is one of the least natural diets humanity has ever attempted). So, although the theology lost its emphasis, one can still sense the religious-like motivation and righteous zeal that remains at the heart of veganism, more than a mere diet but an entire social movement and political force.

Let’s return to the health angle and finally bring in nutritionism. The only way a vegan diet is possible at all is through the industrial agriculture that eliminated traditional farming practices, including an entire lifestyle as part of farming communities, that were heavily dependent on animal husbandry and pasturage (similar to how fundamentalist religion such as Adventism is also a product of modernity, an argument made by Karen Armstrong; modern fundamentalism is opposed to traditional religion in the way that, as Corey Robin explains, reactionary conservatism is opposed to the ancien regime it attacked and replaced). This is the industrial agriculture that mass-produces plant foods through monoculture and chemicals (and that, by the way, destroys ecosystems and kills the soil). And on top of that, vegans would quickly die of malnutrition if not for the industrial production of supplements and fortified foods to compensate for the immense deficiencies of their diet. This is based on an ideology of nutritionism: that as clever apes we can outsmart nature, that humanity is separate from and above nature — and this is the main point I’m making here, that veganism is unnatural to the human condition formed under millions of years of hominid evolution. This isn’t necessarily a criticism from a Christian perspective, since it is believed that the human soul ultimately isn’t at home in this world, but it is problematic when this theology is secularized and turned into pseudo-scientific dogma. It further disconnects us from the natural world and from our own human nature. Hence, veganism is very much a product of modernity and all of its schisms and dissociations, as seen in American society of the past century or so. Of course, the Adventists want the human soul to be disconnected from the natural world and saved from the fallen nature of Adam’s sin. As for the rest of us who aren’t Adventists, we might have a different view on the matter. This is definitely something atheist or pagan vegans should seriously consider and deeply contemplate. We should all think about how the plant-based and anti-meat argument has come to dominate mainstream thought. Will veganism and industrialization save us? Is that what we want to put our faith in? Is that faith scientifically justified?

It’s not that I’m against plant-based diets in general. I’ve been vegetarian. And when I was doing a paleo diet, I ate more vegetables than I had ever eaten in my life, far more than most vegetarians. I’m not against plants themselves on some strange principle. It’s specifically veganism that I’m concerned about. Unlike vegetarianism, there is no way to do veganism with traditional, sustainable, and restorative farming practices. Vegetarianism, omnivory, and carnivory are all fully compatible with the possibility of eliminating industrial agriculture, including factory farming. That is not the case with veganism, a diet that is unique in its place in the modern world. Not all plant-based diets are the same. Veganism is entirely different from plant-heavy diets such as vegetarianism and paleo that also allow animal foods (also, consider the fact that any diet other than carnivore is “plant-based”, a somewhat meaningless label). That is no small point, since plant foods are limited by seasonality in all parts of the world, whereas most animal foods are not. If a vegetarian wanted, they could live fairly far north and avoid out-of-season plant foods shipped in from other countries simply by eating lots of eggs and dairy (maybe combined with very small amounts of what few locally-grown plant foods were traditionally and pre-industrially stored over winter: nuts, apples, fermented vegetables, etc.; or maybe not even that since, technically, a ‘vegetarian’ diet could be ‘carnivore’ in only eating eggs and dairy). A vegetarian could be fully locavore. A vegan could not, at least not in any Western country, although a vegan near the equator might be able to pull off a locavore diet as long as they could rely upon local industrial agriculture, which at least would eliminate the harm from mass transportation, but it still would be an industrial-based diet with all the problems, including mass suffering and death, that this entails.

Veganism, in entirely excluding animal foods (and insect foods such as honey), does not allow this option of a fully natural way of eating, both local and seasonal without any industrialization. Even in warmer climes amidst lush foliage, a vegan diet was never possible and never practiced prior to industrialization. Traditional communities, surrounded by plant foods or not, have always found it necessary to include animal and insect foods to survive and thrive. Hunter-gatherers living in the middle of dense jungles (e.g., the Piraha) typically get most of their calories from animal foods, as long as they maintain access to their traditional hunting grounds and fishing waters, and as long as poaching, environmental destruction, or hunting laws haven’t disrupted their traditional foodways. The closest to a more fully plant-based diet among traditional people was found among Hindus in India, but even there they unintentionally (prior to chemical insecticides) included insects and insect eggs in their plant foods while intentionally allowing individuals during fertile phases of life to eat meat. So, even traditional (i.e., pre-industrial) Hindus weren’t entirely and strictly vegetarian, much less vegan (see my comment at my post A Fun Experiment). Still, high quality eggs and dairy can go a long way toward nourishment, as many healthy traditional societies included such foods, especially dairy from pasture-raised animals (consider Weston A. Price’s early 20th century research on healthy traditional communities; see my post Health From Generation To Generation).

Anyway, one basic point is that a plant-based diet is not necessarily and always identical to veganism, in that other plant-based diets exist that include various forms of animal foods. This is a distinction many vegan advocates want to confound, muddying the waters of public debate. In discussing the just-released documentary The Game Changers, Paul Kita writes that it “repeatedly pits a vegan diet against a diet that includes meat. The film does this to such an extent that you slowly realize that “plant-based” is just a masquerade for “vegan.” Either you eat animal products and suffer the consequences or avoid animal products and thrive, the movie argues.” (This New Documentary Says Meat Will Kill You. Here’s Why It’s Wrong.). That is a false dichotomy, a forced choice driven by an ideological agenda. Kita makes a simple point that challenges this entire frame: “Except that there’s another choice: Eat more vegetables.” Or simply eat fewer industrial foods that have been industrially grown, industrially processed, and/or industrially transported — basically, don’t eat heavily processed crap, from either meat or plants (specifically refined starches, added sugar, and vegetable oils), but also don’t eat the unhealthy (toxic and nutrient-depleted) produce of industrial agriculture; that is to say, make sure to eat locally and in season. But that advice also translates as: Don’t be vegan. That isn’t the message vegan advocates want you to hear.

Dietary ideologies embody social, political, and economic ideologies, sometimes as all-encompassing cultural worldviews. They can shape our sense of identity and reality, what we perceive as true, what we believe is desirable, and what we imagine is possible. It goes further than that, in fact. Diets can alter our neurocognitive development and so potentially alter the way we think and feel. This is one way mind viruses could quite literally parasitize our brains and come to dominate a society, which I’d argue is what has brought our own society to this point of mass self-harm through the dietary dogma of pseudo-scientific “plant-based” claims of health (with possibly hundreds of millions of people who have been harmed and had their lives cut short). A diet is never merely a diet. And we are all prone to getting trapped in ideological systems. In criticizing veganism as a diet, I’m not saying vegans as individuals are bad people. And I don’t wish them any ill will, much less failure in their dietary health. But I entirely oppose the ideological worldview and social order that, with conscious intention or not, they are promoting. I have a strong suspicion that the world vegans are helping to create is not a world I want to live in. It is not their beautiful liberal dream that I criticize and worry about. I’m just not so sure that the reality will turn out to be all that wonderful. So far, the plant-based agenda doesn’t seem to be working out all that well. Americans eat more whole grains and legumes, vegetables and fruits than ever before since data has been kept, and yet the health epidemic continues to worsen (see my post Malnourished Americans). It was never rational to blame public health concerns on meat and animal fat.

Maybe I’m wrong about veganism and the ultimate outcome of its helping to shape the modern world. Maybe technological innovation and progress will transform and revolutionize industrial agriculture and food processing, the neoliberal trade system and capitalist market, in a way beneficial for all involved, for the health and healing of individuals and the whole world. Maybe… but I’m not feeling confident enough to bet the fate of future generations on what, to me, seems like a flimsy promise of vegan idealism borne out of divine visions and theological faith. More simply, veganism doesn’t seem all that healthy on the most basic of levels. No diet that doesn’t support health for the individual will support health for society, as society is built on the functioning of humans. That is the crux of the matter. To return to nutritionism, that is the foundation of veganism — the argument that, in spite of all of the deficiencies of veganism and other varieties of the modern industrial diet, we can simply supplement and fortify the needed nutrients and all will be well. To my mind, that seems like an immense leap of faith. Adding some nutrients back into a nutrient-depleted diet is better than nothing, but it comes nowhere close to the nutrition of traditional whole foods. If we have to supplement the deficiencies of a diet, that diet remains deficient, and we are merely covering up the worst aspects of it, the ones we are able to most obviously observe and measure. Still, even with those added vitamins, minerals, cofactors, etc., it doesn’t follow that the body is getting all that it needs for optimal health. In traditional whole foods, there are potentially hundreds or thousands of compounds, most of which have barely been researched or not researched at all. There are certain health conditions that require specific supplements. Sure, use them when necessary, as we are not living under optimal conditions of health in general. But when anyone and everyone on a particular diet is forced to supplement to avoid serious health decline, as is the case with veganism, there is a serious problem with that diet.

It’s not exactly that I disagree with the possible solution vegans are offering to this problem, as I remain open to future innovative progress. I’m not a nostalgic reactionary and romantic revisionist seeking to turn back the clock to re-create a past that never existed. I’m not, as William F. Buckley Jr. put it, “someone who stands athwart history, yelling Stop”. Change is great — I have nothing against it. And I’m all for experimenting. That’s not where I diverge from the “plant-based” vision of humanity’s salvation. Generally speaking, vegans simply ignore the problem I’ve detailed or pretend it doesn’t exist. They believe that such limitations don’t apply to them. That is a very modern attitude coming from a radically modern diet, and the end result would be revolutionary in remaking humanity, a complete overturning of what came before. This isn’t about being obsessed with the past or believing we are limited to evolutionary conditions and historical precedent. But ignoring the past is folly. Our collective amnesia about the traditional world keeps getting us into trouble. We’ve nearly lost all traces of what health once meant, the basic level of health that used to be the birthright of all humans.

My purpose here is to create a new narrative. It isn’t vegans and vegetarians against meat-eaters. The fact of the matter is most Americans eat more plant foods than animal foods, in following this part of the dietary advice from the AHA, ADA, and USDA (specifically, eating more vegetables, fruits, whole grains, and legumes than ever before measured since data has been kept). When snacking, it is plant foods (crackers, potato chips, cookies, donuts, etc.) that we gorge on, not animal foods. Following Upton Sinclair’s writing of The Jungle, the average intake of red meat went into decline. And since the 1930s, Americans have consumed more industrial seed oils than animal fat. “American eats only about 2oz of red meat per day,” tweets Dr. Shawn Baker, “and consumes more calories from soybean oil than beef!” Even total fat intake hasn’t increased but has remained steady, with the only change being in the ratio of kinds of fats, that is to say more industrial seed oils. It’s true that most Americans aren’t vegan, but what they share with vegans is an industrialized diet that is “plant-based”. To push the American diet further in this direction would hardly be a good thing. And it would require ever greater dependence on the approach of nutritionism, of further supplementation and fortification, as Americans increasingly become malnourished. That is no real solution to the problem we face.

Instead of scapegoating meat and animal fat, we should return to the traditional American diet or else some other variant of the traditional human diet. The fact of the matter is that historically Americans ate massive amounts of meat and, at the time, were known as the healthiest population around. Meat-eating Americans in past centuries towered over meat-deprived Europeans. And those Americans, even the poor, were far healthier than their demographic counterparts elsewhere in the civilized and increasingly industrialized world. The United States, one of the last Western countries to be fully industrialized and urbanized, was one of the last countries to see the beginning of a health epidemic. The British noticed the first signs of physical decline in the late 1800s, whereas Americans didn’t clearly see this pattern until World War II. With this in mind, it would be more meaningful to speak of animal-based diets, including vegetarianism that allows dairy and eggs, than to group together supposed “plant-based” diets. Veganism is worlds apart from vegetarianism. Nutritionally speaking, vegetarianism has more in common with the paleo diet or even the carnivore diet than with veganism, the latter being depleted of essential nutrients from animal foods (fat-soluble vitamins, EPA, DHA, DPA, choline, cholesterol, etc.; yes, we sicken and die without abundant cholesterol in our diet, the reason dementia and other forms of neurocognitive decline are a common symptom of statins in lowering cholesterol levels). To entirely exclude all animal foods is a category unto itself, a category that didn’t exist and was unimaginable until recent history.

* * *

Nutritionism
by Gyorgy Scrinis

In Defense of Food
by Michael Pollan

Vegan Betrayal
by Mara Kahn

The Vegetarian Myth
by Lierre Keith

Mike Mutzel:

On the opposite side of the spectrum, the vegans argue that now we have technologies like synthetic B12, and we can get DHA from algae. So it’s a beautiful time to be vegan because we don’t need to rely upon animals for these compounds. What would you say to that argument?

Paul Saladino:

I would say that that’s a vast oversimplification of the sum total of human nutrition to think that, if we can get synthetic B12 and synthetic DHA, we’re getting everything in an animal. It’s almost like this reductionist perspective, in my opinion.

I’ve heard some people say that it doesn’t matter what you eat, that it’s all about calories in and calories out, and that you can just take a multivitamin for your minerals and vitamins. And I always bristle at that. I think that is so reductionist. You really think you’ve got it all figured out, that you can just take one multivitamin and your calories and that is the same as real food?

That to me is just a travesty of an intellectual hypothesis or intellectual position to take, because that’s clearly not the case. We know that animal foods are much more than the reductionist vitamins and minerals that are in them. They are the structure, they are the matrix, they are the amino acids… the amino acid availability… the cofactors. And to imagine that you can substitute animal foods with B12 and DHA is just a very scary position for me.

I think this is an intellectual error that we make over and over as humans in our society, and this is a broader context… I think that because we are smart and have had some small victories in medicine and nutrition and health (we’ve made scanning electron microscopes and we’ve understood quarks), we’ve gotten a little too prideful, and we imagine that as humans we can outsmart the natural world, that we can outsmart nature. And that may sound woo-woo, but I think it’s pretty damn difficult to outsmart 3 million years of natural history and evolution. And any time we try to do that, I get worried.

Whether it’s peptides, whether it’s the latest greatest drug, whether it’s the latest greatest hormone or hormone combination, I think you are messing with three million years of the natural world’s wisdom. You really think you’re smarter than that? Just wait, just wait, just wait, you’ll see. And to reduce animal foods to B12 and DHA, that’s a really, really bad idea.

And as we’ve been talking about, all those plant foods that you’re eating on a vegan diet are gonna come with tons of plant toxins. So yes, I think that we are at a time in human history when you can actually eat all plants and not get nutritional deficiencies in the first year or two, because you can supplement the heck out of it, right? You can get… but, but… I mean, the list goes on.

Where’s your zinc? Where’s your carnitine? Where’s your carnosine? Where’s your choline? It’s a huge list of things. How much protein are you getting? Are you actually at a net positive nitrogen balance? Let’s check your labs. Are you getting enough iodine? Where are you getting iodine from on a vegan diet?

It doesn’t make sense. You have to supplement with probably 27 different things. You have to think about the availability of your protein, the net nitrogen utilization of your protein.

And you know, people may not know this about me: I was a vegan, a raw vegan, for about 7 months about 14 years ago. And my problem — and one thing I’ve heard from a lot of other people, in fact from my clients, is the same thing today — is that, even if you’re able to eat the foods and perfectly construct your micronutrients, you’re going to have so much gas that nobody’s going to want to be around you in the first place.

And I don’t believe that, in any way, shape or form, a synthetic diet is the same as a real foods diet. You can eat plants and take 25 supplements. But then, what’s in your supplements? Are they bioavailable in the same way? Do they have the cofactors like they do in the food? We’ve done so much in human nutrition, but to imagine that we really understand fully the way that humans eat and digest their food, I think, is just pride and folly.

Mike Mutzel:

Well, I agree. I mean, I think there’s a lot more to food than we recognize: microRNA, transfer RNA, other molecules that are not quote-unquote macronutrients. And I think that’s what you’re getting from plants and animals, in a good or bad way, that a lot of people don’t think about. For example, there are animal studies showing that stress on animals, like pre-slaughter stress, affects the transcription patterns of various genes in the animal product.

So, I love what you’re bringing to this whole carnivore movement — like the grass-fed movement, eating more organic, free-range, things like that — because one of the qualms I had seeing this thing take off is that a lot of people going to fast food were taking the bun off the burger and saying that there’s really no difference between grass-fed and grain-fed. Like, meat’s meat, just get what you can afford. I understand that some people… I’ve been in that place financially before in my life, where grass-fed was a luxury.

But there are other constituents that could potentially be in lower-quality foods, both plant and animal. And the other thing, just to hit on one more point… The supplements — I’ve been in the supplement space since ’06 — they’re not free of iatrogenesis, right? So there are heavy metals, arsenic, lead, mercury, cadmium, in supplements; even in vegan proteins, for example.

Paul Saladino:

Yeah, highly contaminated. Yeah, people don’t think about the metals in their supplements. And I see a lot of clients with high heavy metals, and we think, where are you getting this from? I saw a guy the other day with really high tin, and I think it’s in his supplements. And so anyway, that’s a whole other story.