Plant-Based Nutritional Deficiencies

The purpose here is to highlight the nutritional deficiencies of plant-based diets but most specifically plant-exclusive diets such as veganism (important nutrients are listed below). Not all of these deficiencies involve essential nutrients, but our knowledge is limited on what is essential. There are deficiencies that will kill you quickly, others slowly, and still others that simply will cause deteriorating health or less than optimal functioning. Also, some of these nutrients or their precursors can be found in plant foods or otherwise produced by the body, but there can be several problems. The plant-based sources may be inadequate or not in the most bioavailable form, antinutrients in the plants may block the absorption of certain nutrients (e.g., phytates block mineral absorption), gut and microbiome problems related to a plant-based diet might interfere with absorption, and most people have severely limited capacity to turn certain precursors into the needed nutrients.

So, when eating a supposedly healthy diet, many vegans and vegetarians still have major deficiencies, even with nutrients that should be in their diet according to standard food intake calculations — in those cases, the nutrients are there in theory but for some reason not being absorbed or utilized. For example, raw spinach has a lot of calcium, but it is almost entirely unavailable to the body. Adding raw spinach to your smoothie or salad might be a net loss to your health, as the antinutrients will block the nutrients in other foods as well. Another factor is that, on a plant-based diet, nutrients can get out of ratio. Nutrients work together with some acting as precursors, others as catalysts, and still others like master hormones — such as vitamin K2 determining where calcium is transported to, preferably the bones as opposed to arteries, joints and the brain; or think about how the body can produce vitamin D3 but only if there is adequate cholesterol. As such, besides deficiencies, sometimes there can be too much of a nutrient, which interferes with another nutrient, as seen with copper in relation to zinc.

That is the advantage of an animal-based diet, which could even include a well-balanced vegetarian diet that emphasized dairy and eggs (Vegetarianism is an Animal-Based Diet), but unfortunately many vegetarians are near-vegan in limiting even those non-meat animal foods. Here is the reason why animal foods are so important. Other animals have similar nutritional needs to humans and so, when we eat animal foods, we are getting not only the nutrients our bodies need but in the required form and ratio for our own optimal functioning. Without animal foods, one has to study nutrition to understand all of this and then try to artificially re-create it through careful calculations in balancing what one eats and supplements, an almost impossible task that requires a scientific mindset. Even then, one is likely to get it wrong. Regular testing of nutritional levels would be absolutely necessary to ensure everything is going according to plan.

As for supplements and fortification, the nutrients aren’t always in the best form and so wouldn’t be as bioavailable, nor would they likely have all the needed cofactors in just the right amounts. Besides, a diet dependent on supplementation and fortification is not healthy by definition, in that the food itself, in its natural form, is lacking those nutrients. The fact that most vegans in particular, and many vegetarians as well, have to be extremely obsessive about nutrition just to maintain a basic level of health is not high praise for the health-giving benefits of such a plant-based diet — and hence the reason even vegetarians should emphasize the allowed animal foods (there are even vegans who will make exceptions for some animal foods, such as fish). This is probably why most people quit these diets after a short period of time and why most people who quit, including those who quit after years or decades, do so for health reasons. Among those who remain on these diets, their responses on surveys show that most of them cheat on occasion and so are getting some minimal level of animal-based nutrition, and that is a good thing for their health even as it calls into question the validity of health claims about plant-based diets (Being “mostly vegan” is like being “a little pregnant.”).

There has long been a bias against meat, especially red meat. It goes back to the ancient Greek thought of Galen and how it was adapted to Medieval society, Christianized for purposes of maintaining social hierarchy and social control. This Galenic bias was carried forward in the Christian tradition and then modernized within nutrition studies through the surprisingly powerful influence of the Seventh Day Adventists, who continue to fund a lot of nutritional studies to this day. This has had practical consequences. It has long been assumed, based on a theology of a sinful world, that eating animals would make us beastly. It’s similar to the ancient idea that eating the muscles or heart of a fallen warrior would make one strong or courageous. A similar logic was applied to plants, that they have inherent qualities that we can imbibe.

So, it has long been believed that plant foods are somehow healthier for both body and soul, somehow more spiritual, and so would bring humans closer to God or else closer to their divine natural state before the Fall of Man. That has been the moral concern of many Christians, from Medieval Catholics to modern Seventh Day Adventists. And in secularized form, it became internalized by mainstream nutrition studies and dietary guidelines. Part of the purpose of eating plants, according to Christianized Galenism, was that a strong libido was considered bad and it was understood that a plant-based diet suppressed libido, which admittedly doesn’t sound like a sign of health, but their idea of ‘health’ was very different. There was also a worry that, along with firing up the libido, meat would heat up the entire body and lead to a shorter lifespan. Pseudo-scientific explanations have been used to rationalize this theological doctrine, such as concerns about mTOR and IGF-1, although this requires contorting the science and dismissing other evidence.

The problem is this simply became built into mainstream nutritional ideology, to such an extent that few questioned it until recently. This has led most researchers, nutritionists, dieticians, and other health experts to obsess over the nutrients in plants while overlooking the nutrients in animal foods. So, you’ll hear something along the lines of, “meat is not an important source of vitamin E and with the exception of liver, is not a particularly good source of fat-soluble vitamins” (Nutrients in Meat, from the Meat We Eat). Keep in mind that assertion comes from a project of the American Meat Science Association — not likely to be biased against meat. It’s sort of true, depending on how one defines meat. From Galenic thought, the notion of meat is still associated with red meat. It is true that muscle meat, particularly lean muscle meat, from beef, pork, and veal doesn’t have much vitamin E compared to plant foods (M. Leonhardt et al., Vitamin E content of different animal products: influence of animal nutrition). This is why some vegetarians and even vegans see no contradiction or conflict, much less hypocrisy, in eating fish and fowl — culturally, these have for millennia been considered a separate category from meat.

Yet adequate amounts of vitamin E are found in many animal foods, whether or not we label them as ‘meat’: chicken, goose meat, fish, seafood, crayfish, butter, and cheese; and some vitamin E is also found in liver and eggs (Atli Anarson, 20 Foods That Are High in Vitamin E). We have to be clear about what we mean by ‘meat’. On a meat-based diet, even to the degree of being carnivore, there are plentiful good sources of every essential nutrient, including vitamin E, and many that aren’t essential but highly conducive to optimal health. Besides animal foods, there is no other source of such immense nutrient-density and nutrient-bioavailability. Plant foods don’t come close in comparison.

Also, as vitamin E is an antioxidant, it’s important to note that animal foods contain many other antioxidants that play a similar role in maintaining health, but animal-sourced antioxidants have been mostly ignored because they don’t fit the dominant plant-based paradigm. Plant foods lack these animal-sourced antioxidants. So why do so few talk about a deficiency in them for vegans and vegetarians? And why have researchers so rarely studied in depth the wide variety of nutrients in animal foods to determine their full health benefits? This is particularly odd when considering, as I already stated, every known essential nutrient can be found in animal foods but not in plant foods. Isn’t that an important detail? Why is there a collective silence among mainstream health experts?

Think about how plant antinutrients can block the absorption of nutrients, both in plant foods and animal foods, and so require even more nutrients to counteract this effect, which might simply further increase the antinutrient intake, unless one is careful in following the food selection and preparation as advised by those like Steven Gundry (The Plant Paradox). Or think about how glucose competes with the antioxidant vitamin C, increasing the risk of scurvy if vitamin C intake is not increased, and yet a low-carb diet with far lower intake of vitamin C is not linked to scurvy — maybe the reason ancient Vikings and Polynesians could remain healthy at sea for months, whereas once a high-carb diet was introduced modern sailors were plagued by scurvy (Sailors’ Rations, a High-Carb Diet). Similarly, a plant-based diet in general might require greater amounts of vitamin E: “Plant-based foods have higher concentrations of vitamin E. And for good reason. A plant-based diet requires additional protection from oxidation of PUFA which Vitamin E helps provide through its antioxidant properties. It’s still found in adequate supply in meat” (Kevin Stock, Vitamins and Minerals – Plants vs Animals).

What is adequate depends on the diet. A diet low in carbs, seed oils, and other plant foods may require fewer plant-based antioxidants, especially if this is countered by an increase of animal-based antioxidants. It is reminiscent of the fiber debate. Yes, fiber adds bulk that supposedly will increase regularity, ignoring the fact that the research is divided on this topic. No doubt bulking up your poop makes you have larger poops and more often, but is that really a good thing? People on a low-residue carnivore diet more easily digest and absorb what they eat, and so they don’t have bulky poops — then again, they don’t usually have constipation either, not if they’re getting enough dietary fat. The main cause of constipation is plant foods. So, why are people advised to eat more plant foods in the hope of resolving this issue caused by plant foods? It’s absurd! We keep looking at problems in isolation, as we look at nutrients in isolation (Hubris of Nutritionism). This has failed us, as demonstrated by our present public health crisis.

Let me throw in a last thought about antioxidants. It’s like the fiber issue. People on plant-based diets have constipation issues and so they eat more plant foods in the form of fiber, trying to solve the problem plant foods cause, not realizing that constipation generally resolves itself by eliminating or limiting plant foods. So, in relation to antioxidants, we have to ask ourselves what it is about our diet in the first place that is causing all the oxidative stress. Plant foods do have antioxidants, but some plant foods also cause oxidative stress (e.g., seed oils). If we eliminate these plant foods, our oxidative stress goes down and so our requirement for antioxidants lessens to that degree. Our body already produces its own antioxidants and, combined with what comes from animal foods, we shouldn’t need such excess amounts of antioxidants. Besides, it’s not clear from studies that plant antioxidants are always beneficial to health. It would be better to eliminate the need for them in the first place. Shawn Baker explained this in terms of vitamin C (interview with Shan Hussain, The Carnivore Diet with Dr. Shawn Baker MD):

“The Carnivore diet is deficient in carbohydrates and essential vitamins like Vitamin C, how do we make up for that? When I wanted to do this I was curious about this as well. You will see a number of potential deficiencies around this diet. There is no role of fibre in this diet. With Vitamin C we know there are some transporters across different cell membranes. In a higher glucose environment, Vitamin C is competitively inhibited and therefore we see less absorption of Vitamin C. We also see that interestingly human red blood cells do have the capacity to actually recycle Vitamin C which is something that not many people are aware of. One of the major function of Vitamin C is that it is an antioxidant. In low carbohydrate states our antioxidants systems particularly things like glutathione are regulated. We may obviate some of the need of antioxidants of the Vitamin C by regulating around systems in a low carb diet. Also, Vitamin C is very important in the function of carnitine which is part of the fat cycle. When we are ingesting carnitine we have actual transporters in the gut which can take up full carnosine. It is a misconception that we can only take amino acids, a number of di and tripeptide transporters that are contained within our gut. The other function of Vitamin C is when we don’t have sufficient Vitamin C relative to our needs, we start to develop symptoms of scurvy, bleeding gum problems, teeth falling out, sores and cuts won’t heal. This is all due to the collagen synthesis. If we look at Vitamin C’s role in collagen synthesis, it helps to take proline and lysine, hydroxyproline and hydroxylysine. In meat-based diet, we are getting that in ample amount. Even a steak has 3% of its content as collagen. There are all kinds of compensatory mechanisms.”

I’ll end on an amusing note. Chris Kresser wrote about the carnivore diet (Everything You Need to Know about the Carnivore Diet and How It Can Affect Your Health). Although an advocate of low-carb diets and nutrient-dense animal foods, he is skeptical that carnivory will be healthy for most humans long-term. One worry is that there might be nutritional deficiencies, but the argument he makes is funny. He is basically saying that if all one eats is muscle meat then key nutrients will get missed. Then he goes on to point out that these nutrients can be found in other animal foods, such as liver and dairy. So, his main concern about a carnivore diet is actually that people might not eat enough animal foods, or rather not enough of certain animal foods. So, make sure you eat lots of a wide variety of animal foods if going full carnivore and apparently even critics like Kresser agree you’ll be fine, at least nutritionally. The problem isn’t too much animal foods but potentially too little. That made me smile.

Now to the whole point of this post. Below is a list of nutrients that are commonly deficient in those on plant-based diets, especially those on plant-exclusive diets (i.e., vegans). I won’t explain anything about these nutrients, as there is plenty of info online. But you can look to the linked articles below that cover the details.

  • Vitamin K2 (MK-4)
  • Vitamin D3 (Cholecalciferol)
  • Vitamin A (Retinol)
  • Vitamin B12 (Cobalamin)
  • Vitamin B6 (Pyridoxine)
  • Vitamin B3 (Niacin)
  • Vitamin B2 (Riboflavin)
  • Calcium
  • Heme Iron
  • Zinc
  • Selenium
  • Iodine
  • Sulfur
  • Creatine
  • Carnosine
  • Beta-Alanine (Precursor to Carnosine)
  • L-Carnitine
  • Taurine
  • Choline
  • Coenzyme Q10 (CoQ10)
  • Phytanic Acid
  • DHA Omega-3 (Docosahexaenoic Acid)
  • EPA Omega-3 (Eicosapentaenoic Acid)
  • DPA Omega-3 (Docosapentaenoic Acid)
  • ARA Omega-6 (Arachidonic Acid)
  • CLA (Conjugated Linoleic Acid)
  • Phosphatidylserine
  • Cholesterol
  • Collagen
  • Complete Protein
  • Glutathione
  • Glycine
  • Essential and Conditionally Essential Amino Acids (Methionine, Tryptophan, Lysine, Leucine, Cysteine, Proline, Tyrosine, Phenylalanine, Serine, Alanine, Threonine, Isoleucine, and Valine)

Just for the sake of balance, I’ll also share a list of plant compounds that are problematic for many people — from Joe Cohen (20 Nutrients that Vegans & Vegetarians are Lacking):

  1. Lectins
  2. Amines
  3. Tannins
  4. Trypsin Inhibitors
  5. FODMAPS
  6. Salicylates
  7. Oxalates
  8. Sulfites, Benzoates, and MSG
  9. Non-protein amino acids
  10. Glycosides
  11. Alkaloids [includes solanine, chaconine]
  12. Triterpenes
  13. Lignins
  14. Saponins
  15. Phytic Acid [Also Called Phytate]
  16. Gluten
  17. Isoflavones

* * *

Are ‘vegetarians’ or ‘carnivores’ healthier?
Gundry’s Plant Paradox and Saladino’s Carnivory
Dr. Saladino on Plant and Animal Foods
True Vitamin A For Health And Happiness
Calcium: Nutrient Combination and Ratios
Vitamin D3 and Autophagy

The Vegetarian Myth: Food, Justice, and Sustainability
by Lierre Keith

Vegan Betrayal: Love, Lies, and Hunger in a Plants-Only World
by Mara J. Kahn

The Meat Fix: How a lifetime of healthy eating nearly killed me!
by John Nicholson

The Fat of the Land/Not By Bread Alone
by Vilhjalmur Stefansson

Sacred Cow: The Case for (Better) Meat: Why Well-Raised Meat Is Good for You and Good for the Planet
by Diana Rodgers and Robb Wolf

The Carnivore Code: Unlocking the Secrets to Optimal Health by Returning to Our Ancestral Diet
by Paul Saladino

Primal Body, Primal Mind: Beyond Paleo for Total Health and a Longer Life
by Nora Gedgaudas

Paleo Principles
by Sarah Ballantyne

The Queen of Fats: Why Omega-3s Were Removed from the Western Diet and What We Can Do to Replace Them
by Susan Allport

The Omega Principle: Seafood and the Quest for a Long Life and a Healthier Planet
by Paul Greenberg

The Omega-3 Effect: Everything You Need to Know About the Super Nutrient for Living Longer, Happier, and Healthier
by William Sears and James Sears

The Missing Wellness Factors: EPA and DHA: The Most Important Nutrients Since Vitamins?
by Jorn Dyerberg and Richard Passwater

Could It Be B12?: An Epidemic of Misdiagnoses
by Sally M. Pacholok and Jeffrey J. Stuart

What You Need to Know About Pernicious Anaemia and Vitamin B12 Deficiency
by Martyn Hooper

Living with Pernicious Anaemia and Vitamin B12 Deficiency
by Martyn Hooper

Pernicious Anaemia: The Forgotten Disease: The causes and consequences of vitamin B12 deficiency
by Martyn Hooper

Healing With Iodine: Your Missing Link To Better Health
by Mark Sircus

Iodine: Thyroid: The Hidden Chemical at the Center of Your Health and Well-being
by Jennifer Co

The Iodine Crisis: What You Don’t Know About Iodine Can Wreck Your Life
by Lynne Farrow

L-Carnitine and the Heart
by Stephen T. Sinatra and Jan Sinatra

Food Politics: How the Food Industry Influences Nutrition and Health
by Marion Nestle

Unsavory Truth: How Food Companies Skew the Science of What We Eat
by Marion Nestle

Formerly Known As Food: How the Industrial Food System Is Changing Our Minds, Bodies, and Culture
by Kristin Lawless

Death by Food Pyramid: How Shoddy Science, Sketchy Politics and Shady Special Interests Have Ruined Our Health
by Denise Minger

Nutrition in Crisis: Flawed Studies, Misleading Advice, and the Real Science of Human Metabolism
by Richard David Feinman

Nutritionism: The Science and Politics of Dietary Advice
by Gyorgy Scrinis

Measured Meals: Nutrition in America
by Jessica J. Mudry

(Although more about macronutrients, also see the work of Gary Taubes and Nina Teicholz. They add useful historical context about nutrition studies, dietary advice, and public health.)

20 Nutrients that Vegans & Vegetarians are Lacking
by Joe Cohen

8 Nutrients You May Be Missing If You’re Vegetarian or Vegan
by Tina Donvito

7 Nutrients That You Can’t Get from Plants
by Atli Anarson

7 Supplements You Need on a Vegan Diet
by Alina Petre

The Top 5 Nutrient Deficiencies on a Plant Based Diet
by Kate Barrington

5 Brain Nutrients That You Can’t Get From Plants
by Kris Gunnars

Vitamin Supplements for Vegetarians
by Jeff Takacs

Health effects of vegan diets
by Winston J Craig

Nutritional Deficiencies and Essential Considerations for Every Vegan (An Evidence-Based Nutritional Perspective)
from Dai Manuel

Why You Should Think Twice About Vegetarian and Vegan Diets
by Chris Kresser

Three Big Reasons Why You Don’t Want to be a Vegetarian
by Alan Sears

How to Avoid Common Nutrient Deficiencies if You’re a Vegan
by Joseph Mercola

What is Glutathione and How Do I Get More of It?
by Mark Hyman

Could THIS Be the Hidden Factor Behind Obesity, Heart Disease, and Chronic Fatigue?
by Joseph Mercola

Vegetarianism produces subclinical malnutrition, hyperhomocysteinemia and atherogenesis
by Y. Ingenbleek and K. S. McCully

Vegan Diet is Sulfur Deficient and Heart Unhealthy
by Larry H. Bern

Heart of the Matter : Sulfur Deficits in Plant-Based Diets
by Kaayla Daniel

Copper-Zinc Imbalance: Unrecognized Consequence of Plant-Based Diets and a Contributor to Chronic Fatigue
by Laurie Warner

Vegan diets ‘risk lowering intake of nutrient critical for unborn babies’ brains’
by Richard Hartley-Parkinson

The Effects of a Mother’s Vegan Diet on Fetal Development
by Marc Choi

Vegan–vegetarian diets in pregnancy: danger or panacea? A systematic narrative review
by G. B. Piccoli

Is vegetarianism healthy for children?
by Nathan Cofnas

Clinical practice: vegetarian infant and child nutrition
by M. Van Winckel, S. Vande Velde, R. De Bruyne, and S. Van Biervliet

Dietary intake and nutritional status of vegetarian and omnivorous preschool children and their parents in Taiwan
by C. E. Yen, C. H. Yen, M. C. Huang, C. H. Cheng, and Y. C. Huang

Persistence of neurological damage induced by dietary vitamin B-12 deficiency in infancy
by Ursula von Schenck, Christine Bender-Götze, and Berthold Koletzko

Severe vitamin B12 deficiency in an exclusively breastfed 5-month-old Italian infant born to a mother receiving multivitamin supplementation during pregnancy
by S. Guez et al

Long-chain n-3 PUFA in vegetarian women: a metabolic perspective
by G. C. Burdge, S. Y. Tan, and C. J. Henry

Signs of impaired cognitive function in adolescents with marginal cobalamin status
by M. W. Louwman et al

Transient neonatal hypothyroidism due to a maternal vegan diet
by M. G. Shaikh, J. M. Anderson, S. K. Hall, M. A. Jackson

Veganism as a cause of iodine deficient hypothyroidism
by O. Yeliosof and L. A. Silverman

Do plant based diets deprive the brain of an essential nutrient?
by Ana Sandoiu

Suggested move to plant-based diets risks worsening brain health nutrient deficiency
from BMJ

Could we be overlooking a potential choline crisis in the United Kingdom?
by Emma Derbyshire

How a vegan diet could affect your intelligence
by Zaria Gorvett

Vitamins and Minerals – Plants vs Animals
by Kevin Stock

Health effects of vegan diets
by Winston J Craig

Comparing Glutathione in the Plasma of Vegetarian and Omnivore Populations
by Rachel Christine Manley

Vegan diets are adding to malnutrition in wealthy countries
by Chris Elliott, Chen Situ, and Claire McEvoy

What beneficial compounds are primarily found in animal products?
by Kamal Patel

The Brain Needs Animal Fat
by Georgia Ede

The Vegan Brain
by Georgia Ede

Meat, Organs, Bones and Skin
by Christopher Masterjohn

Vegetarianism and Nutrient Deficiencies
by Christopher Masterjohn

Adding milk, meat to diet dramatically improves nutrition for poor in Zambia
from Science Daily

Red meat plays vital role in diets, claims expert in fightback against veganism
by James Tapper

Nutritional Composition of Meat
by Rabia Shabir Ahmad, Ali Imran and Muhammad Bilal Hussain

Meat and meat products as functional food
by Maciej Ostaszewski

Meat: It’s More than Protein
from Paleo Leap

Conjugated Linoleic Acid: the Weight Loss Fat?
from Paleo Leap

Nutritional composition of red meat
by P. G. Williams

How Red Meat Can ‘Beef Up’ Your Nutrition
by David Hu

Endogenous antioxidants in fish
by Margrét Bragadóttir

Astaxanthin Benefits Better than Vitamin C?
by Rachael Link

Astaxanthin: The Most Powerful Antioxidant You’ve Never Heard Of
from XWERKS

Antioxidants Are Bullshit for the Same Reason Eggs Are Healthy
by Sam Westreich

We absolutely need fruits and vegetables to obtain optimal antioxidant status, right?
by Paul Saladino

Hen Egg as an Antioxidant Food Commodity: A Review
by Chamila Nimalaratne and Jianping Wu

Eggs’ antioxidant properties may help prevent heart disease and cancer, study suggests
from Science Daily

The Ultimate Superfood? Milk Offers Up a Glass Full of Antioxidants
by Lauren Milligan Newmark

Antioxidant properties of Milk and dairy products: a comprehensive review of the current knowledge
by Imran Taj Khan et al

Antioxidants in cheese may offset blood vessel damage
from Farm and Dairy

Identification of New Peptides from Fermented Milk Showing Antioxidant Properties: Mechanism of Action
by Federica Tonolo

Bioavailability of iron, zinc, and other trace minerals from vegetarian diets
by Janet R Hunt

Dietary iron intake and iron status of German female vegans: results of the German vegan study.
by A. Waldmann, J. W. Koschizke, C. Leitzmann, and A. Hahn

Mechanisms of heme iron absorption: Current questions and controversies
by Adrian R. West and Phillip S. Oates

Association between Haem and Non-Haem Iron Intake and Serum Ferritin in Healthy Young Women
by Isabel Young et al

Pork meat increases iron absorption from a 5-day fully controlled diet when compared to a vegetarian diet with similar vitamin C and phytic acid content.
by M. Bach Kristensen, O. Hels, C. Morberg, J. Marving, S. Bügel, and I. Tetens

Do you need fiber?
by Kevin Stock

To Supplement Or Not?

Some say vitamin, mineral, and electrolyte supplements are unnecessary, useless, or even harmful. I’ve been on the fence about this. Our modern diet is so deficient in nutrients. But it is argued that even in the modern world we should be able to get all the nutrients we need from nutrient-dense foods. I’m coming around to this view.

There are specific conditions where supplementation would be necessary. If you get a lot of caffeine, that will dehydrate you and throw off your electrolyte balance and so maybe supplementation could help, but even then it probably would be better to use a natural sea salt and eat seaweed or, better yet, give up caffeine. Another example is that, for those on statins, additional CoQ10 is required beyond what is likely found in the diet. But this shouldn’t apply to anyone who is healthy. There is the rub. Most Americans aren’t healthy.

I might add that nutrient deficiencies are much more common on vegan and vegetarian diets, especially the former. But that is the problem with these diets. We should be able to get all our nutrients from our diet without supplements, as most humans have done for most of evolution, something I’ve long agreed with in theory. Requiring supplements indicates a failure. If we aren’t getting enough nutrients, there is something wrong with either our diet or our food system. This is why food quality is so important. We need to be getting plenty of wild-caught and pasture-raised animal foods, especially organ meats. But how many people have access to and can afford these foods? And how many will go to the effort to procure and prepare them?

My own carnivore experiment only lasted a couple of months, but I did learn from it. I’m still mostly animal-based in my meals, with a few nutrient-dense plant foods (e.g., fermented vegetables). I’ve known about nutrient-density since the late 1990s, back when I first read Sally Fallon Morrell (just Sally Fallon at the time). I have been trying to improve my diet for many years, but not to the degree I’ve been doing over the past year.

The only issue I’ve had is that most foods today are nutritionally deficient. And so I’ve worried about not getting required nutrition without supplementation. I’ve argued in favor of supplementation in the past, for the simple reason most Americans are malnourished. Telling people to eat nutrient-dense foods is easier said than done, as such foods are less common and familiar while being more expensive. I’ve previously come across those who oppose general supplementation for all or most people. But I wasn’t sure what to make of it. Most people are dealing with major deficiencies while struggling to eat even moderately well. Our society isn’t exactly supportive of a healthy diet. Even the official food recommendations and guidelines are making people sick.

One thing that brought me to thinking about this again is a study reported on by the New York Times, Supplements and Diets for Heart Health Show Limited Proof of Benefit by Anahad O’Connor. The evidence on effectiveness is mixed. Maybe the risk-to-benefit ratio is too high if we take an approach based on the precautionary principle, considering we don’t have enough good research yet. I’m coming around to the conclusion that modern foods, as long as they are high quality, can or should be enough for optimal health — other than medically diagnosed deficiencies because of health problems.

I’ll experiment with this, maybe after I use up my present supply of multivitamins, and see if I observe any differences or rather observe a lack of a difference. I still don’t know what that will tell me, as some deficiencies like that of vitamin K2 are almost impossible to notice since the effects are mostly indirect. I guess eat the best food possible and hope for the best.

I must admit I still have some reservations. When I look at the people advocating that nutrient-density alone can be adequate without supplementation, I notice that these are people putting immense time, effort, and money into their diet and health. They are going to great lengths to ensure high quality food — dairy, eggs, organ meats, brains, caviar, etc. from animals that were pasture-raised, wild-caught, or hunted. This is simply not an option for most Americans, for many reasons. The reality is few Americans will be willing to do this, to dedicate their entire lives to this endeavor, even if they could afford it and had the time to do it.

So, I don’t know. But since I have the money and motivation, I’m going to try to do my best in getting as much food-sourced nutrition as possible.

* * *

For no particular reason, I’ll share some videos only from Frank Tufano. He is one of the carnivore advocates who talks about nutrient-density.

As a side note, Tufano got into a snit because he thought that fellow carnivore Paul Saladino stole information from him and didn’t credit him as the source. What he claimed was unjustly taken had to do with nutrient-density from animal foods. He was trying to convince his viewers that he was the first carnivore advocate to ever talk about nutrient-density. That simply is not true.

J.D. Garland has been a carnivore longer than Tufano. And Garland comes specifically from a nutritional approach, one he took up prior to going on a carnivore diet. He learned of this from Sally Fallon Morrell, who in turn got it from Weston A. Price, the latter having researched this topic long before any of these other people were born. This comes up in an interview with Tristan Haggard. As far as that goes, Haggard has also been going on about this topic for quite a while. It’s pretty much common knowledge at this point.

That was a bit of meaningless drama. But I wanted to set the record straight. Many people have picked up on the knowledge of traditional foods from Price. And it was Morrell who specifically did the most in popularizing his work. Still, Tufano is worth listening to, if not as original as he’d like to believe. Listen to his informative videos and ignore the rest.

Calcium: Nutrient Combination and Ratios

Calcium is centrally important, as most people already know. Not only is it necessary for the health of bones but also for the health of the heart, nerve cells, gut microbiome, hormonal system, skin, etc., and it will affect such things as grip strength and fatigue. As usual, there is a lot of misinformation out there and newer information that has changed our understanding. Let me clear up the issue to the degree I can. The following represents my present understanding, based on the sources I could find.

We can store calcium when we are younger, but lose this ability as we age. On the other hand, it turns out we don’t need as much calcium as previously assumed. And too much calcium can be harmful, even deadly as can happen with hardening of arteries. In fact, the healthiest societies have lower levels of calcium. It’s not so much about the calcium itself for, as always, context matters. Calcium deficiencies typically are caused by a health condition (kidney condition, alcohol abuse, etc), rather than lack of calcium in the diet. Importantly, other nutrients determine how the body absorbs, processes, utilizes, and deposits calcium. Furthermore, nutritional imbalances involving deficiencies and excesses create a cascade of health problems.

Let me explain the interrelationship of micronutrients. There is a whole series of relationships involved in calcium processing. Vitamin B6 is necessary for absorption of magnesium; and magnesium is necessary for absorption of vitamin D3 — zinc, boron, vitamin A, bile salts, and a healthy gut microbiome are all important as well. Of course, cholesterol and sunlight are needed for the body to produce its own vitamin D3, which is why deficiencies in these are also problematic. Statins block cholesterol and sunscreen blocks sun; while stress will block vitamin D3 itself whereas exercise will do the opposite. Then vitamin D3 is necessary for absorption of calcium. But it doesn’t end there. Most important of all, vitamin K2 is necessary for regulating where calcium is deposited in the body, ensuring it ends up in bones and teeth rather than in joints, arteries, brain, kidneys, etc.

About one specific issue, the often-cited 2-to-1 ratio of calcium to magnesium is actually at the high end, indicating the maximum calcium level you don’t want to exceed as part of your total calcium intake from both diet and supplementation. So, if you’re getting a 2-to-1 ratio in your supplements combined with high levels of calcium from food, such as a diet with plenty of dairy and/or greens, your calcium levels could be causing you harm. Speaking of magnesium deficiency involves a relative assessment, as it depends on calcium levels. The body is rarely depleted of magnesium and so, on a superficial level, your body is never deficient in an absolute sense. Yet the higher your calcium levels go, the greater your need for magnesium. Nutrients never act alone, such as how vitamin C requirements increase on a high-carb diet.
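To make the ratio arithmetic concrete, here is a quick hypothetical illustration (the numbers are made up for the sake of the math, not a recommendation): suppose a supplement provides 500 mg of calcium and 250 mg of magnesium, a 2-to-1 ratio on its own, and the day’s food adds another 800 mg of calcium from dairy and greens but little additional magnesium. The total then comes to roughly 1,300 mg of calcium against 250 mg of magnesium, a ratio of about 5-to-1, well past the 2-to-1 ceiling even though the supplement looked balanced by itself.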

Here is another example of nutrient interaction. With more salt in your diet, you’ll need more potassium and magnesium to compensate. And potassium deficiency is associated with magnesium deficiency. But that isn’t to say you want to decrease sodium to increase these others, as research indicates higher salt intake is associated with better health (Dr. James DiNicolantonio, The Salt Fix) — and I’d recommend a good source of salt such as Real Salt (although natural forms of salt lack iodine, so make sure to increase iodine-rich foods like seaweed, a good option since seaweed is extremely nutrient-dense). As an interesting side note, calcium helps your muscles contract and magnesium helps your muscles relax, which is why muscle cramps (also spasms, twitches, and restlessness) can be a sign of magnesium deficiency. Plus, excess calcium and insufficient magnesium will increase cortisol, the stress hormone, and so can interfere with sleep. There is yet another dual relationship between these two in the clotting and thinning of blood.

Macronutrients play a role as well. Higher protein ensures optimal levels of magnesium and is strongly linked to increased bone mass and density. Fat intake may also play a role with these minerals, but I couldn’t find much discussion about this. Certainly, fat is necessary for the absorption of fat-soluble vitamins. If you’re eating pastured (or grass-fed-and-finished) fatty animal foods, you’ll be getting both the protein and the fat-soluble vitamins (A as beta-carotene, D3, E complex, & K2). Even better, with cultured, fermented, and aged foods (whether from animals or plants), you’ll get higher levels of the much needed vitamin K2. Assuming you can stand the taste and texture of it, fermented soy in the form of natto is the highest known source of K2, as the subtype MK-7, which remains in the body longer than other subtypes. By the way, some multivitamins contain MK-7 (e.g., Garden of Life). Vitamin K2 is massively important. Weston A. Price called it Activator X because it controls so much of what the body does, specifically in relationship to other nutrients, including other fat-soluble vitamins. And all of the fat-soluble vitamins are central in relationship to mineral levels.

Another factor to consider is when nutrients are taken and in combination with what. Some minerals will compete with each other for absorption, but this probably is not an issue if you are getting small amounts throughout the day, such as adding a balanced electrolyte mix (with potassium, magnesium, etc) to your water or other drinks. Calcium and magnesium are two that compete and many advise they should be taken separately, but if you take them in smaller amounts competition is not an issue. Some research indicates calcium has a higher absorption rate in the evening, but magnesium can make you sleepy and so might also be taken in the evening — if taking a supplement, maybe take the former with dinner and the latter before bed or you could take the magnesium in the morning and see how it makes you feel. By the way, too much coffee (6 cups or more a day) will cause the body to excrete calcium and salt, and yet coffee is also a good source of potassium and magnesium. Coffee, as with tea, in moderate amounts is good for your health.

As a last thought, here is what you want to avoid for healthy calcium levels: taking calcium with iron supplements, high levels of insoluble fiber, antacids, and excessive caffeine. Also, calcium can alter the effects of medications and, in some cases, should be taken two hours apart from them. Keep in mind that many plant foods can be problematic because of anti-nutrients that bind minerals or interfere with absorption. This is why traditional people spent so much time preparing plant foods (soaking, sprouting, cooking, fermenting, etc.) in order to eliminate these anti-nutrients and hence increase nutrient absorption. The amount of nutrients in a food is irrelevant if your body can’t use them. For example, one of the highest concentrations of calcium is found in spinach, but the bioavailability is extremely low. Other foods, including other leafy greens, are a much better source, and with any leafy greens, always cook them.

This problem is magnified by the decreased nutrient content of most plant foods these days, as the soil itself has become depleted. Supplementation of many micronutrients is maybe necessary for almost everyone at this point, although great caution should be taken with supplementing calcium.

* * *

Sometimes I write posts about diet and health after doing research for my own purposes or simply for the sake of curiosity about a topic. But in many cases, I have family members in mind, as my own health improvements have gone hand in hand with dietary changes my parents also have made, and my brothers are health-conscious as well although with a vegetarian diet quite different than my own. This particular post was written for my mother.

Just the other day she was diagnosed with osteoporosis. She had osteopenia for decades. Now looking back, she realizes that her bone loss began when she started taking fiber and antacids, both of which block calcium. And all the years of calcium supplementation were probably doing her no good because, even to the degree she was absorbing any of the calcium, it wasn’t balanced with other needed nutrients. I gathered this information in order to help her to figure out how to improve her bone health, as her doctor was only moderately informed and her recent appointment was rushed.

This was researched and written on Mother’s Day. I guess it was my gift to my mother. But I hope it is of value to others as well.

* * *

Without Magnesium, Vitamin D Supplementation May Backfire
by Joseph Mercola

Calcium with Magnesium: Do You Need the Calcium?
from Easy Immune System Health

Expert cites risk of calcium—magnesium imbalance
from Nutritional Magnesium Association

Optimum Calcium Magnesium Ratio: The 2-to-1 Calcium-to-Magnesium Ratio
by A. Rosanoff

Nutritional strategies for skeletal and cardiovascular health: hard bones, soft arteries, rather than vice versa
by James H O’Keefe, Nathaniel Bergman, Pedro Carrera-Bastos, Maélan Fontes-Villalba, James J DiNicolantonio, Loren Cordain

Why You Need To Take Vitamin K With Calcium Supplements
by Stacy Facko

For Bone Health, Think Magnesium
from Harvest Market Natural Foods

Calcium Deficiency: Are Supplements the Answer?
by Jillian Levy

Calcium to Magnesium: How the Ratio Affects Your Health
from Juvenon Health Journal

How to Correct Your Calcium-to-Magnesium Ratio
by Sandra Ketcham

Calcium & Magnesium: Finding the Right Ratio for Optimal Health
by Dr. Edward Group

Magnesium, NOT Calcium, Is The Key To Healthy Bones
by Jackie Ritz

Calcium Supplements: Things to Consider before Taking One
by Chris Kresser

How to Get Enough Calcium Without Dairy
by Katie Wells

Is The Paleo Diet Deficient In Calcium?
by Michael Ofer

Paleo & Calcium | Friendly Calcium Rich Foods
by Irena Macri

Mineral Primer – The Weston A. Price Foundation
by Sally Fallon and Mary G. Enig

The science of salt and electrolytes (are we consuming enough?)
by Will Little

13 Signs Of Magnesium Deficiency + How To Finally Get Enough
by Dr. Will Cole

Top 10 Magnesium-Rich Foods
by Rachael Link

Vitamin K2, Vitamin D, and Calcium: A Winning Combo
by Joseph Mercola

Vitamin K2: Everything You Need to Know
by Joe Leech

The Ultimate Vitamin K2 Resource
by Chris Masterjohn

Vitamin K2: Are You Consuming Enough?
by Chris Kresser

Promoting Calcium Balance Health On A Paleo Diet (Easier Than You Think)
by Loren Cordain

Calcium: A Team Sports View of Nutrition
by Loren Cordain

How To Keep Your Bones Healthy On A Paleo Diet
by Chris Kresser

Malnourished Americans

Prefatory Note

It would be easy to mistake this writing as a carnivore’s rhetoric against the evils of grains and agriculture. I’m a lot more agnostic on the issue than it might seem. But I do come across as strongly opinionated, from decades of personal experience with bad eating habits and their consequences, and my dietary habits were no better when I was vegetarian.

I’m not so much pro-meat as I am for healthy fats and oils, not only from animal sources but also from plants, with coconut oil and olive oil being two of my favorites. As long as you are getting adequate protein, from whatever source (including vegetarian foods), there is no absolute rule about protein intake. But hunter-gatherers on average do eat more fats and oils than protein (and more than vegetables as well), whether the protein comes from meat or from seeds and nuts (though the protein and vegetables they get are of extremely high quality and, of course, nutrient-dense; along with much fiber). Too much protein with too little fat/oil causes rabbit sickness. It’s fat and oil that has higher satiety and, combined with low-carb ketosis, is amazing in eliminating food cravings, addictions, and over-eating.

Besides, I have nothing against plant-based foods. I eat more vegetables on the paleo diet than I did in the past, even when I was a vegetarian, and more than any vegetarian I know as well; not just more in quantity but also more in quality. Many paleo and keto dieters have embraced a plant-based diet with varying attitudes about meat and fat. Dr. Terry Wahls, a former vegetarian, reversed her symptoms of multiple sclerosis by formulating a paleo diet that includes massive loads of nutrient-dense vegetables, while adding in nutrient-dense animal foods as well (e.g., liver).

I’ve picked up three books lately that emphasize plants even further. One is The Essential Vegetarian Keto Cookbook, which is pretty much as the title describes it, mostly recipes with some introductory material about ketosis. Another book, Ketotarian by Dr. Will Cole, is likewise about keto vegetarianism, but with leniency toward fish consumption and ghee (the former not strictly vegetarian and the latter not strictly paleo). The most recent I got is The Paleo Vegetarian Diet by Dena Harris, another person with a lenient attitude toward diet. That is what I prefer in my tendency toward ideological impurity. About diet, I’m bi-curious or maybe multi-curious.

My broader perspective is that of traditional foods. This is largely based on the work of Weston A. Price, which I was introduced to long ago by way of the writings of Sally Fallon Morrell (formerly Sally Fallon). It is not a paleo diet in that agricultural foods are allowed, but its advocates share a common attitude with paleolists in the valuing of traditional nutrition and food preparation. Authors from both camps bond over their respect for Price’s work and so often reference those on the other side in their writings. I’m of the opinion, in line with traditional foods, that if you are going to eat agricultural foods then traditional preparation is all the more important (from long-fermented bread and fully soaked legumes to cultured dairy and raw aged cheese). Many paleolists share this opinion and some are fine with such things as ghee. My paleo commitment didn’t stop me from enjoying a white roll for Thanksgiving, adorning it with organic goat butter, and it didn’t kill me.

I’m not so much arguing against all grains in this post as I’m pointing out the problems found at the extreme end of dietary imbalance that we’ve reached this past century: industrialized and processed, denatured and toxic, grain-based/obsessed and high-carb-and-sugar. In the end, I’m a flexitarian who has come to see the immense benefits in the paleo approach, but I’m not attached to it as a belief system. I heavily weigh the best evidence and arguments I can find in coming to my conclusions. That is what this post is about. I’m not trying to tell anyone how to eat. I hope that heads off certain areas of potential confusion and criticism. So, let’s get to the meat of the matter.

Grain of Truth

Let me begin with a quote, share some related info, and then circle back around to putting the quote into context. The quote is from Grain of Truth by Stephen Yafa. It’s a random book I picked up at a secondhand store and my attraction to it was that the author is defending agriculture and grain consumption. I figured it would be a good balance to my other recent readings. Skimming it, one factoid stuck out. In reference to new industrial milling methods that took hold in the late 19th century, he writes:

“Not until World War II, sixty years later, were measures taken to address the vitamin and mineral deficiencies caused by these grain milling methods. They caught the government’s attention only when 40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty.” (p. 17)

That is remarkable. He is talking about the now infamous highly refined flour, something that never existed before. Even commercial whole wheat breads today, with some fiber added back in, have little in common with what was traditionally made for millennia. My grandparents were of that particular generation that was so severely malnourished, and so that was the world into which my parents were born. The modern health decline that has gained mainstream attention began many generations back. Okay, so put that on the backburner.

Against the Grain

In a post by Dr. Malcolm Kendrick, I was having a discussion in the comments section (and, at the same time, I was having a related discussion in my own blog). Göran Sjöberg brought up James C. Scott’s book about the development of agriculture, Against the Grain — writing that, “This book is very much about the health deterioration, not least through epidemics partly due to compromised immune resistance, that occurred in the transition from hunting and gathering to sedentary mono-crop agriculture state level scale, first in Mesopotamia about five thousand years ago.”

Scott’s view has interested me for a while. I find compelling the way he connects grain farming, legibility, record-keeping, and taxation. There is a reason great empires were built on grain fields, not on potato patches or vegetable gardens, much less cattle ranching. Grain farming is easily observed and measured, tracked and recorded, and that meant it could be widely taxed to fund large centralized governments along with their armies and, later on, their police forces and intelligence agencies. The earliest settled societies arose prior to agriculture, but they couldn’t become major civilizations until the cultivation of grains.

Another commenter, Sasha, responded with what she considered important qualifications: “I think there are too many confounders in transition from hunter gatherers to agriculture to suggest that health deterioration is due to one factor (grains). And since it was members of upper classes who were usually mummified, they had vastly different lifestyles from that of hunter gatherers. IMO, you’re comparing apples to oranges… Also, grain consumption existed in hunter gatherers and probably intensified long before Mesopotamia 5 thousands years ago as wheat was domesticated around 9,000 BCE and millet around 6,000 BCE to use just two examples.”

It is true that pre-neolithic hunter-gatherers, in some cases, sporadically ate grains in small amounts, or at least we have evidence they were doing something with grains, though as far as we know they might have been mixing them with medicinal herbs or using them as a thickener for paints — it’s anyone’s guess. Assuming they were eating those traces of grains we’ve discovered, it surely was nowhere near the level of the neolithic agriculturalists. Furthermore, during the following millennia, grains were radically changed through cultivation. As for the Egyptian elite, they were eating more grains than anyone, as farmers were still forced to partly subsist from hunting, fishing, and gathering.

I’d take the argument much further forward into history. We know from records that, through the 19th century, Americans were eating more meat than bread. Vegetable and fruit consumption was also relatively low and mostly seasonal. Part of that is because gardening was difficult with so many pests. Besides, with so many natural areas around, hunting and gathering remained a large part of the American diet. Even in the cities, wild game was easily obtained at cheap prices. Into the 20th century, hunting and gathering was still important and sustained many families through the Great Depression and World War era when many commercial foods were scarce.

It was different in Europe, though. Mass urbanization happened centuries before it did in the United States. And not much European wilderness was left standing in recent history. But with the fall of the Roman Empire and heading into feudalism, many Europeans returned to a fair amount of hunting and gathering, during which time general health improved in the population. Restrictive laws about land use eventually made that difficult and the land enclosure movement made it impossible for most Europeans.

Even so, all of that is fairly recent in the big scheme of things. It took many millennia of agriculture before it more fully replaced hunting, fishing, trapping, and gathering. In places like the United States, that change is well within living memory. When some of my ancestors immigrated here in the 1600s, Britain and Europe still maintained plenty of procuring of wild foods to support their populations. And once here, wild foods were even more plentiful and a lot less work than farming.

Many early American farmers didn’t grow food so much for their own diet as to be sold on the market, sometimes in the form of the popular grain-based alcohols. It was in making alcohol that rural farmers were able to get their product to the market without it spoiling. I’m just speculating, but alcohol might have been the most widespread agricultural food of that era because water was often unsafe to drink.

Another commenter, Martin Back, made the same basic point: “Grain these days is cheap thanks to Big Ag and mechanization. It wasn’t always so. If the fields had to be ploughed by draught animals, and the grain weeded, harvested, and threshed by hand, the final product was expensive. Grain became a store of value and a medium of exchange. Eating grains was literally like eating money, so presumably they kept consumption to a minimum.”

In early agriculture, grain was more of a way to save wealth than a staple of the diet. It was saved for purposes of trade and also saved for hard times when no other food was available. What didn’t happen was to constantly consume grain-based foods every day and all day long — going from a breakfast with toast and cereal to lunch with a sandwich and maybe a salad with croutons, and then a snack of crackers in the afternoon before eating more bread or noodles for dinner.

Historical Examples

So, I am partly just speculating. But it’s informed speculation. I base my view on specific examples. The most obvious example is hunter-gatherers, poor by the standards of modern industrialization while maintaining great health, as long as their traditional way of life can be maintained. Many populations that are materially better off in terms of a capitalist society (access to comfortable housing, sanitation, healthcare, an abundance of food in grocery stores, etc.) are not better off in terms of chronic diseases.

As the main example I already mentioned, poor Americans have often been a quite healthy lot, as compared to other populations around the world. It is true that poor Americans weren’t particularly healthy in the early colonial period, specifically in Virginia because of indentured servitude. And it’s true that poor Americans today are fairly bad off because of the cheap industrialized diet. Yet for the couple of centuries or so in between, they were doing quite well in terms of health, with lots of access to nutrient-dense wild foods. That point is emphasized by looking at other similar populations at the time, such as back in Europe.

Let’s do some other comparisons. The poor in the Roman Empire did not do well, even when they weren’t enslaved. That was for many reasons, such as growing urbanization and its attendant health risks. When the Roman Empire fell, many of the urban centers collapsed. The poor returned to a more rural lifestyle that depended on a fair amount of wild foods. Studies done on their remains show their health improved during that time. Then at the end of feudalism, with the enclosure movement and the return of mass urbanization, health went back on a decline.

Now I’ll consider the early Egyptians. I’m not sure if there is any info about the diet and health of poor Egyptians. But clearly the ruling class had far from optimal health. It’s hard to make comparisons between then and now, though, because it was an entirely different kind of society. The early Bronze Age civilizations were mostly small city-states that lacked much hierarchy. Early Egypt didn’t even have the most basic infrastructure such as maintained roads and bridges. And the most recent evidence indicates that the pyramid workers weren’t slaves but instead worked freely and seem to have eaten fairly well, whatever that may or may not indicate about their socioeconomic status. The fact that the poor weren’t mummified leaves us with scant evidence that would more directly inform us.

On the other hand, no one can doubt that there have been plenty of poor populations who had truly horrific living standards with much sickness, suffering, and short lifespans. That is particularly true over the millennia as agriculture became ever more central, since that meant periods of abundance alternating with periods of deficiency and sometimes starvation, often combined with weakened immune systems and rampant sickness. That was less the case for the earlier small city-states with less population density and surrounded by the near constant abundance of wilderness areas.

As always, it depends on the specifics we are talking about. Also, any comparison and conclusion is relative.

My mother grew up in a family that hunted, and at the time many Americans had a certain amount of access to natural areas, something that helped a large part of the population get through the Great Depression and the World War era. Nonetheless, by the time of my mother’s childhood, overhunting had depleted most of the wild game (bison, bear, deer, etc. were no longer around), and so her family relied on less desirable foods such as squirrel, raccoon, and opossum; even the fish they ate were less than optimal, since they came from waters polluted by the very factories and railroads her family worked in. So, the wild food opportunities weren’t nearly as good as they had been a half century earlier, much less in the prior centuries.

Not All Poverty is the Same

Being poor today means a lot of things that it didn’t mean in the past. The high rates of heavy metal toxicity seen today were rarely seen among previous poor populations. By some estimates, 40% of global deaths are now caused by pollution, primarily affecting the poor, which is also extremely different from the past. Beyond that, inequality has grown larger than ever before, and that has been strongly correlated with high rates of stress, disease, homicide, and suicide. Such inequality is also seen in terms of climate change, droughts, refugee crises, and war and occupation.

Here is what Sasha wrote in response to me: “I agree with a lot of your points, except with your assertion that “the poor ate fairly well in many societies especially when they had access to wild sources of food”. I know how the poor ate in Russia in the beginning of the 20th century and how the poor eat now in the former Soviet republics and in India. Their diet is very poor even though they can have access to wild sources of food. I don’t know what the situation was for the poor in ancient Egypt but I would be very surprised if it was better than in modern day India or former Soviet Union.”

I’d imagine modern Russia has high inequality similar to the US. As for modern India, it is one of the most impoverished, densely populated, and malnourished societies around. And modern industrialization did major harm to Hindu Indians, because studies show that traditional vegetarians got a fair amount of nutrients from the insects that were mixed in with pre-modern agricultural goods. Both Russia and India have other problems related to neoliberalism that weren’t factors in the past. It’s an entirely different kind of poverty these days. Even if some Russians have some access to wild foods, I’m willing to bet they have nowhere near the access that was available in previous generations, centuries, and millennia.

Compare modern poverty to that of feudalism. At least in England, feudal peasants were guaranteed to be taken care of in hard times. The Church, a large part of local governance at the time, was tasked with feeding and caring for the poor and needy, from orphans to widows. These were tight communities that took care of their own, something that no longer exists in most of the world, where the individual is left to suffer and struggle. Present Social Darwinian conditions are not the norm for human societies across history. The present breakdown of families and communities is historically unprecedented.

Socialized Medicine & Externalized Costs
An Invisible Debt Made Visible
On Conflict and Stupidity
Inequality in the Anthropocene
Capitalism as Social Control

The Abnormal Norms of WEIRD Modernity

Everything about present populations is extremely abnormal. This is seen in diet as elsewhere. Let me return to the quote I began this post with. “Not until World War II, sixty years later, were measures taken to address the vitamin and mineral deficiencies caused by these grain milling methods. They caught the government’s attention only when 40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty.” * So, what had happened to the health of the American population?

Well, there were many changes. Overhunting, as I already said, made many wild game species extinct or eliminated them from local areas, such that my mother, born into a rural farm state, never saw a white-tailed deer growing up. Also, much earlier, after the Civil War, a new form of enclosure movement happened as laws were passed to prevent people, specifically the then free blacks, from hunting and foraging wherever they wanted (early American laws often protected the right of anyone to hunt, forage plants, collect timber, etc. from any land that was left open, whether or not it was owned by someone). The carryover from the feudal commons was finally and fully eliminated. It was also the end of the era of free-range cattle ranching, the ending having come with the invention of barbed wire. Access to wild foods was further reduced by the creation and enforcement of protected lands (e.g., the federal park system), which very much targeted the poor who up to that point had relied upon wild foods for health and survival.

All of that was combined with mass urbanization and industrialization, with all of their new forms of pollution, stress, and inequality. Processed foods were becoming more widespread at the time. Around the turn of the century, unhealthy industrialized vegetable oils became heavily marketed and hence popular, replacing butter and lard. Also, muckraking about the meat industry scared Americans off from meat, and consumption precipitously dropped. As such, in the decades prior to World War II, the American diet had already shifted toward what we know today. A new generation had grown up on that industrialized and processed diet, and those young people were the ones showing up as recruits for the military. This new diet, in such a short period, had caused mass malnourishment. It was a mass experiment that showed failure early on, and yet we continue the same basic experiment, not only continuing it but making it far worse.

Government officials and health authorities blamed it on bread production. Refined flour had become widely available because of industrialization, and refining removed the nutrients that gave bread any health value. In response, there was a movement to fortify bread, enforced initially by federal law and later by state laws. That helped some, but obviously the malnourishment was caused by many other factors that weren’t appreciated by most at the time, even though this was the same period when Weston A. Price’s work was published. Nutritional science was young at the time, and most nutrients were still undiscovered or else unappreciated. Throwing a few lab-produced vitamins back into food barely scratches the surface of the nutrient-density that was lost.

Most Americans continue to have severe nutritional deficiencies. We don’t recognize this fact because being underdeveloped and sickly has become normalized, maybe even in the minds of most doctors and health officials. Besides, many of the worst symptoms don’t show up until decades later, often as chronic diseases of old age, although increasingly seen among the young. Far fewer Americans today would meet the health standards of World War recruits. It’s been a steady decline, despite the miracles of modern medicine in treating symptoms and delaying death.

* The data on the British show an even earlier shift toward malnourishment, because imperial trade brought an industrialized diet sooner to the British population. Also, rural life with a greater diet of wild foods had more quickly disappeared, as compared to the US. The fate of the British in the late 1800s showed what would happen more than a half century later on the other side of the ocean.

Lore of Nutrition
by Tim Noakes
pp. 373-375

The mid-Victorian period between 1850 and 1880 is now recognised as the golden era of British health. According to P. Clayton and J. Rowbotham, this was entirely due to the mid-Victorians’ superior diet. Farm-produced real foods were available in such surplus that even the working-class poor were eating highly nutritious foods in abundance. As a result, life expectancy in 1875 was equal to, or even better than, it is in modern Britain, especially for men (by about three years). In addition, the profile of diseases was quite different when compared to Britain today.

The authors conclude:

[This] shows that medical advances allied to the pharmaceutical industry’s output have done little more than change the manner of our dying. The Victorians died rapidly of infection and/or trauma, whereas we die slowly of degenerative disease. It reveals that with the exception of family planning, the vast edifice of twentieth century healthcare has not enabled us to live longer but has in the main merely supplied methods of suppressing the symptoms of degenerative disease which have emerged due to our failure to maintain mid-Victorian nutritional standards.

This mid-Victorians’ healthy diet included freely available and cheap vegetables such as onions, carrots, turnips, cabbage, broccoli, peas and beans; fresh and dried fruit, including apples; legumes and nuts, especially chestnuts, walnuts and hazelnuts; fish, including herring, haddock and John Dory; other seafood, including oysters, mussels and whelks; meat – which was considered ‘a mark of a good diet’ so that ‘its complete absence was rare’ – sourced from free-range animals, especially pork, and including offal such as brain, heart, pancreas (sweet breads), liver, kidneys, lungs and intestine; eggs from hens that were kept by most urban households; and hard cheeses.

Their healthy diet was therefore low in cereals, grains, sugar, trans fats and refined flour, and high in fibre, phytonutrients and omega-3 polyunsaturated fatty acids, entirely compatible with the modern Paleo or LCHF diets.

This period of nutritional paradise changed suddenly after 1875, when cheap imports of white flour, tinned meat, sugar, canned fruits and condensed milk became more readily available. The results were immediately noticeable. By 1883, the British infantry was forced to lower its minimum height for recruits by three inches; and by 1900, 50 per cent of British volunteers for the Boer War had to be rejected because of undernutrition. The changes would have been associated with an alteration in disease patterns in these populations, as described by Yellowlees (Chapter 2).

On Obesity and Malnourishment

There is no contradiction, by the way, between rampant nutritional deficiencies and the epidemic of obesity. Gary Taubes noted that the dramatic rise of obesity in America began early in the last century, which is to say that it is not a problem that came out of nowhere with the present younger generations. Americans have been getting fatter for a while now. Specifically, they were getting fatter while at the same time being malnourished, partly because of refined flour, about as empty a carb as is possible.

Taubes emphasizes that this seeming paradox has often been observed among poor populations around the world: a lack of optimal nutrition that leads to ever more weight gain, sometimes with children being skinny to an unhealthy degree only to grow up to be fat. No doubt many Americans in the early 1900s were dealing with much poverty and the lack of nutritious foods that often goes with it. As for today, the nutritional deficiencies are different because of enrichment, but they persist nonetheless in many other ways. Also, as Keith Payne argues in The Broken Ladder, growing inequality mimics poverty in the conflict and stress it causes. And inequality has everything to do with food quality, as seen with many poor areas being food deserts.

I’ll give you a small taste of Taubes’s discussion. It is from the introduction to one of his books, published a few years ago. If you read the book, look at the section immediately following the passage below. He gives examples of tribes that were poor, didn’t overeat, and did hard manual labor. Yet they were getting obese, even as nearby tribes sometimes remained at a healthy weight. The only apparent difference was what they were eating, not how much they were eating. The populations that saw major weight gain had adopted a grain-based diet, typically because of government rations or government stores.

Why We Get Fat
by Gary Taubes
pp. 17-19

In 1934, a young German pediatrician named Hilde Bruch moved to America, settled in New York City, and was “startled,” as she later wrote, by the number of fat children she saw—“really fat ones, not only in clinics, but on the streets and subways, and in schools.” Indeed, fat children in New York were so conspicuous that other European immigrants would ask Bruch about it, assuming that she would have an answer. What is the matter with American children? they would ask. Why are they so bloated and blown up? Many would say they’d never seen so many children in such a state.

Today we hear such questions all the time, or we ask them ourselves, with the continual reminders that we are in the midst of an epidemic of obesity (as is the entire developed world). Similar questions are asked about fat adults. Why are they so bloated and blown up? Or you might ask yourself: Why am I?

But this was New York City in the mid-1930s. This was two decades before the first Kentucky Fried Chicken and McDonald’s franchises, when fast food as we know it today was born. This was half a century before supersizing and high-fructose corn syrup. More to the point, 1934 was the depths of the Great Depression, an era of soup kitchens, bread lines, and unprecedented unemployment. One in every four workers in the United States was unemployed. Six out of every ten Americans were living in poverty. In New York City, where Bruch and her fellow immigrants were astonished by the adiposity of the local children, one in four children were said to be malnourished. How could this be?

A year after arriving in New York, Bruch established a clinic at Columbia University’s College of Physicians and Surgeons to treat obese children. In 1939, she published the first of a series of reports on her exhaustive studies of the many obese children she had treated, although almost invariably without success. From interviews with her patients and their families, she learned that these obese children did indeed eat excessive amounts of food—no matter how much either they or their parents might initially deny it. Telling them to eat less, though, just didn’t work, and no amount of instruction or compassion, counseling, or exhortations—of either children or parents—seemed to help.

It was hard to avoid, Bruch said, the simple fact that these children had, after all, spent their entire lives trying to eat in moderation and so control their weight, or at least thinking about eating less than they did, and yet they remained obese. Some of these children, Bruch reported, “made strenuous efforts to lose weight, practically giving up on living to achieve it.” But maintaining a lower weight involved “living on a continuous semi-starvation diet,” and they just couldn’t do it, even though obesity made them miserable and social outcasts.

One of Bruch’s patients was a fine-boned girl in her teens, “literally disappearing in mountains of fat.” This young girl had spent her life fighting both her weight and her parents’ attempts to help her slim down. She knew what she had to do, or so she believed, as did her parents—she had to eat less—and the struggle to do this defined her existence. “I always knew that life depended on your figure,” she told Bruch. “I was always unhappy and depressed when gaining [weight]. There was nothing to live for.… I actually hated myself. I just could not stand it. I didn’t want to look at myself. I hated mirrors. They showed how fat I was.… It never made me feel happy to eat and get fat—but I never could see a solution for it and so I kept on getting fatter.”

pp. 33-34

If we look in the literature—which the experts have not in this case—we can find numerous populations that experienced levels of obesity similar to those in the United States, Europe, and elsewhere today but with no prosperity and few, if any, of the ingredients of Brownell’s toxic environment: no cheeseburgers, soft drinks, or cheese curls, no drive-in windows, computers, or televisions (sometimes not even books, other than perhaps the Bible), and no overprotective mothers keeping their children from roaming free.

In these populations, incomes weren’t rising; there were no labor-saving devices, no shifts toward less physically demanding work or more passive leisure pursuits. Rather, some of these populations were poor beyond our ability to imagine today. Dirt poor. These are the populations that the overeating hypothesis tells us should be as lean as can be, and yet they were not.

Remember Hilde Bruch’s wondering about all those really fat children in the midst of the Great Depression? Well, this kind of observation isn’t nearly as unusual as we might think.

How Americans Used to Eat

Below is a relevant passage. It puts into context how extremely unusual the high-carb, low-fat diet of the past few generations has been. This is partly what informed some of my thoughts. We so quickly forget that the present dominance of a grain-based diet wasn’t always the case, likely not even in most agricultural societies until quite recently. In fact, the earlier American diet is still within living memory, although those left to remember it are quickly dying off.

Let me explain why the history of diets matters. One of the arguments for forcing official dietary recommendations onto the entire population was the belief that Americans in a mythical past ate less meat, fat, and butter while eating more bread, legumes, and vegetables. This turns out to have been a trick of limited data.

We now know, from better data, that the complete opposite was the case. And we have further data showing that the rise of the conventional diet has coincided with the rise of obesity and chronic diseases. That isn’t to say eating more vegetables is bad for your health, but we do know that even as the average American intake of vegetables has gone up, so have all the diet-related health conditions. During this time, what went down was the consumption of all the traditional foods of the American diet going back to the colonial era: wild game, red meat, organ meat, lard, and butter — all the foods Americans ate in huge amounts prior to the industrialized diet. Writing in the New York Times, Jane Ziegelman describes the American love of meat going back centuries (America’s Obsession With Cheap Meat):

“In her 1832 travel book, “Domestic Manners of the Americans,” the English writer Frances Trollope describes the breathtaking quantities of food on American dinner tables. Even tea, she reports, is a “massive meal,” a lavish spread of many cakes and breads and “ham, turkey, hung beef, apple sauce and pickled oysters.”

“Equally impressive to this foreign observer were the carnivorous tendencies of her American hosts. “They consume an extraordinary amount of bacon,” she writes, while “ham and beefsteaks appear morning, noon and night.”

“Americans were indiscriminate in their love for animal protein. Beef, pork, lamb and mutton were all consumed with relish. However, as pointed out by the food historian Harvey Levenstein, it was beef, the form of protein preferred by the upper class, “that reigned supreme in status.” With the opening of the Western frontier in the mid-19th century, increased grazing land for cattle lowered beef prices, making it affordable for the working class.

“Dietary surveys conducted at the turn of the 20th century by Wilbur Atwater, father of American nutrition, revealed that even laborers were able to have beefsteak for breakfast. As Atwater was quick to point out, a high-protein diet set American workers apart from their European counterparts. On average, Americans ate a phenomenal 147 pounds of meat a year; Italians, by contrast, consumed 24.

““Doubtless,” Atwater wrote, “we live and work more intensely than people do in Europe.” The “vigor, ambition and hopes for higher things” that distinguished the American worker, he argued, was fed by repeated helpings of T-bone and sirloin steak.”

That calculation might be way off, an underestimation if anything. The majority of Americans remained rural, largely working as farmers, into the early 20th century. All we have data for is the food that was shipped across state lines, as no one kept nationwide records on food that was produced, bought, and consumed locally. Until quite recently, most of the food Americans ate was either grown at home or grown on a nearby farm, not to mention much food that was hunted, trapped, fished, and gathered. So, that 147 pounds of meat probably was only the tip of the iceberg.

What added to the confusion and misinterpretation of the evidence had to do with timing. Diet and nutrition were first seriously studied right at the moment when, for most populations, the diet had already changed. That was the failure of Ancel Keys’s research on what came to be called the Mediterranean diet (see Sally Fallon Morell’s Nourishing Diets). The population was recuperating from World War II, which had devastated their traditional way of life, including their diet. Keys took the post-war deprivation diet as the historical norm, but the reality was far different. Cookbooks and other evidence from before the war showed that this population used to eat higher levels of meat and fat, including saturated fat. So, the very people he focused on had grown up and spent most of their lives on a diet that was, at the moment he studied them, no longer available because of the disruption of the food system. What good health Keys observed came from a lifetime of eating a different diet. Combined with cherry-picking of data and biased analysis, Keys came to a conclusion that was as wrong as wrong could be.

Slightly earlier, Weston A. Price was able to see a different picture. He intentionally traveled to the places where traditional diets remained fully in place. And the devastation of World War II had yet to happen. Price came to the conclusion that what mattered most of all was nutrient-density. Sure, the vegetables eaten then would have been of a higher quality than we get today, largely because they were heirloom cultivars grown on healthy soil. Nutrient-dense foods can only come from nutrient-dense soil, whereas today our food is nutrient-deficient because our soil is highly depleted. The same goes for animal foods. Animals pastured on healthy land will produce healthy dairy, eggs, meat, and fat; these foods will be high in omega-3s and the fat-soluble vitamins.

No matter if it is coming from plant sources or animal sources, nutrient-density might be the most important factor of all. Fat is meaningful in this context because fat is where the fat-soluble vitamins are found and it is through fat that they are metabolized. In turn, the fat-soluble vitamins play a key role in the absorption and processing of numerous other nutrients, not to mention key roles in numerous functions in the body. Nutrient-density and fat-density go hand in hand in terms of general health. That is what early Americans were getting in eating so much wild food, not only wild game but also wild greens, fruit, and mushrooms. And nutrient-density is precisely what we are lacking today, as the nutrients have been intentionally removed to make more palatable commercial foods.

Once again, this has a class dimension, since the wealthier have more access to nutrient-dense foods. Few poor people could afford to shop at a high-end health food store, even if one were located near their home. But it was quite different in the past, when nutrient-dense foods were available to everyone and sometimes more available to the poor concentrated in rural areas. If we want to improve public health, the first thing we should do is return to this historical norm.

The Big Fat Surprise
by Nina Teicholz
pp. 123-131

Yet despite this shaky and often contradictory evidence, the idea that red meat is a principal dietary culprit has thoroughly pervaded our national conversation for decades. We have been led to believe that we’ve strayed from a more perfect, less meat-filled past. Most prominently, when Senator McGovern announced his Senate committee’s report, called Dietary Goals, at a press conference in 1977, he expressed a gloomy outlook about where the American diet was heading. “Our diets have changed radically within the past fifty years,” he explained, “with great and often harmful effects on our health.” Hegsted, standing at his side, criticized the current American diet as being excessively “rich in meat” and other sources of saturated fat and cholesterol, which were “linked to heart disease, certain forms of cancer, diabetes and obesity.” These were the “killer diseases,” said McGovern. The solution, he declared, was for Americans to return to the healthier, plant-based diet they once ate.

The New York Times health columnist Jane Brody perfectly encapsulated this idea when she wrote, “Within this century, the diet of the average American has undergone a radical shift away from plant-based foods such as grains, beans and peas, nuts, potatoes, and other vegetables and fruits and toward foods derived from animals—meat, fish, poultry, eggs and dairy products.” It is a view that has been echoed in literally hundreds of official reports.

The justification for this idea, that our ancestors lived mainly on fruits, vegetables, and grains, comes mainly from the USDA “food disappearance data.” The “disappearance” of food is an approximation of supply; most of it is probably being eaten, but much is wasted, too. Experts therefore acknowledge that the disappearance numbers are merely rough estimates of consumption. The data from the early 1900s, which is what Brody, McGovern, and others used, are known to be especially poor. Among other things, these data accounted only for the meat, dairy, and other fresh foods shipped across state lines in those early years, so anything produced and eaten locally, such as meat from a cow or eggs from chickens, would not have been included. And since farmers made up more than a quarter of all workers during these years, local foods must have amounted to quite a lot. Experts agree that this early availability data are not adequate for serious use, yet they cite the numbers anyway, because no other data are available. And for the years before 1900, there are no “scientific” data at all.

In the absence of scientific data, history can provide a picture of food consumption in the late eighteenth to nineteenth century in America. Although circumstantial, historical evidence can also be rigorous and, in this case, is certainly more far-reaching than the inchoate data from the USDA. Academic nutrition experts rarely consult historical texts, considering them to occupy a separate academic silo with little to offer the study of diet and health. Yet history can teach us a great deal about how humans used to eat in the thousands of years before heart disease, diabetes, and obesity became common. Of course we don’t remember now, but these diseases did not always rage as they do today. And looking at the food patterns of our relatively healthy early-American ancestors, it’s quite clear that they ate far more red meat and far fewer vegetables than we have commonly assumed.

Early-American settlers were “indifferent” farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with “the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,” as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.

The endless bounty of America in its early years is truly astonishing. Settlers recorded the extraordinary abundance of wild turkeys, ducks, grouse, pheasant, and more. Migrating flocks of birds would darken the skies for days. The tasty Eskimo curlew was apparently so fat that it would burst upon falling to the earth, covering the ground with a sort of fatty meat paste. (New Englanders called this now-extinct species the “doughbird.”)

In the woods, there were bears (prized for their fat), raccoons, bobolinks, opossums, hares, and virtual thickets of deer—so much that the colonists didn’t even bother hunting elk, moose, or bison, since hauling and conserving so much meat was considered too great an effort. IX

A European traveler describing his visit to a Southern plantation noted that the food included beef, veal, mutton, venison, turkeys, and geese, but he does not mention a single vegetable. Infants were fed beef even before their teeth had grown in. The English novelist Anthony Trollope reported, during a trip to the United States in 1861, that Americans ate twice as much beef as did Englishmen. Charles Dickens, when he visited, wrote that “no breakfast was breakfast” without a T-bone steak. Apparently, starting a day on puffed wheat and low-fat milk—our “Breakfast of Champions!”—would not have been considered adequate even for a servant.

Indeed, for the first 250 years of American history, even the poor in the United States could afford meat or fish for every meal. The fact that the workers had so much access to meat was precisely why observers regarded the diet of the New World to be superior to that of the Old. “I hold a family to be in a desperate way when the mother can see the bottom of the pork barrel,” says a frontier housewife in James Fenimore Cooper’s novel The Chainbearer.

Like the primitive tribes mentioned in Chapter 1, Americans also relished the viscera of the animal, according to the cookbooks of the time. They ate the heart, kidneys, tripe, calf sweetbreads (glands), pig’s liver, turtle lungs, the heads and feet of lamb and pigs, and lamb tongue. Beef tongue, too, was “highly esteemed.”

And not just meat but saturated fats of every kind were consumed in great quantities. Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard. X

In the book Putting Meat on the American Table, researcher Roger Horowitz scours the literature for data on how much meat Americans actually ate. A survey of eight thousand urban Americans in 1909 showed that the poorest among them ate 136 pounds a year, and the wealthiest more than 200 pounds. A food budget published in the New York Tribune in 1851 allots two pounds of meat per day for a family of five. Even slaves at the turn of the eighteenth century were allocated an average of 150 pounds of meat a year. As Horowitz concludes, “These sources do give us some confidence in suggesting an average annual consumption of 150–200 pounds of meat per person in the nineteenth century.”

About 175 pounds of meat per person per year! Compare that to the roughly 100 pounds of meat per year that an average adult American eats today. And of that 100 pounds of meat, more than half is poultry—chicken and turkey—whereas until the mid-twentieth century, chicken was considered a luxury meat, on the menu only for special occasions (chickens were valued mainly for their eggs). Subtracting out the poultry factor, we are left with the conclusion that per capita consumption of red meat today is about 40 to 70 pounds per person, according to different sources of government data—in any case far less than what it was a couple of centuries ago.

Yet this drop in red meat consumption is the exact opposite of the picture we get from public authorities. A recent USDA report says that our consumption of meat is at a “record high,” and this impression is repeated in the media. It implies that our health problems are associated with this rise in meat consumption, but these analyses are misleading because they lump together red meat and chicken into one category to show the growth of meat eating overall, when it’s just the chicken consumption that has gone up astronomically since the 1970s. The wider-lens picture is clearly that we eat far less red meat today than did our forefathers.

Meanwhile, also contrary to our common impression, early Americans appeared to eat few vegetables. Leafy greens had short growing seasons and were ultimately considered not worth the effort. They “appeared to yield so little nutriment in proportion to labor spent in cultivation,” wrote one eighteenth-century observer, that “farmers preferred more hearty foods.” Indeed, a pioneering 1888 report for the US government written by the country’s top nutrition professor at the time concluded that Americans living wisely and economically would be best to “avoid leafy vegetables,” because they provided so little nutritional content. In New England, few farmers even had many fruit trees, because preserving fruits required equal amounts of sugar to fruit, which was far too costly. Apples were an exception, and even these, stored in barrels, lasted several months at most.

It seems obvious, when one stops to think, that before large supermarket chains started importing kiwis from New Zealand and avocados from Israel, a regular supply of fruits and vegetables could hardly have been possible in America outside the growing season. In New England, that season runs from June through October or maybe, in a lucky year, November. Before refrigerated trucks and ships allowed the transport of fresh produce all over the world, most people could therefore eat fresh fruit and vegetables for less than half the year; farther north, winter lasted even longer. Even in the warmer months, fruit and salad were avoided, for fear of cholera. (Only with the Civil War did the canning industry flourish, and then only for a handful of vegetables, the most common of which were sweet corn, tomatoes, and peas.)

Thus it would be “incorrect to describe Americans as great eaters of either [fruits or vegetables],” wrote the historians Waverly Root and Richard de Rochemont. Although a vegetarian movement did establish itself in the United States by 1870, the general mistrust of these fresh foods, which spoiled so easily and could carry disease, did not dissipate until after World War I, with the advent of the home refrigerator.

So by these accounts, for the first two hundred and fifty years of American history, the entire nation would have earned a failing grade according to our modern mainstream nutritional advice.

During all this time, however, heart disease was almost certainly rare. Reliable data from death certificates is not available, but other sources of information make a persuasive case against the widespread appearance of the disease before the early 1920s. Austin Flint, the most authoritative expert on heart disease in the United States, scoured the country for reports of heart abnormalities in the mid-1800s, yet reported that he had seen very few cases, despite running a busy practice in New York City. Nor did William Osler, one of the founding professors of Johns Hopkins Hospital, report any cases of heart disease during the 1870s and eighties when working at Montreal General Hospital. The first clinical description of coronary thrombosis came in 1912, and an authoritative textbook in 1915, Diseases of the Arteries including Angina Pectoris, makes no mention at all of coronary thrombosis. On the eve of World War I, the young Paul Dudley White, who later became President Eisenhower’s doctor, wrote that of his seven hundred male patients at Massachusetts General Hospital, only four reported chest pain, “even though there were plenty of them over 60 years of age then.” XI About one fifth of the US population was over fifty years old in 1900. This number would seem to refute the familiar argument that people formerly didn’t live long enough for heart disease to emerge as an observable problem. Simply put, there were some ten million Americans of a prime age for having a heart attack at the turn of the twentieth century, but heart attacks appeared not to have been a common problem.

Was it possible that heart disease existed but was somehow overlooked? The medical historian Leon Michaels compared the record on chest pain with that of two other medical conditions, gout and migraine, which are also painful and episodic and therefore should have been observed by doctors to an equal degree. Michaels catalogs the detailed descriptions of migraines dating all the way back to antiquity; gout, too, was the subject of lengthy notes by doctors and patients alike. Yet chest pain is not mentioned. Michaels therefore finds it “particularly unlikely” that angina pectoris, with its severe, terrifying pain continuing episodically for many years, could have gone unnoticed by the medical community, “if indeed it had been anything but exceedingly rare before the mid-eighteenth century.” XII

So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. XIII

Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating. The publication of The Jungle, Upton Sinclair’s fictionalized exposé of the meatpacking industry, caused meat sales in the United States to fall by half in 1906, and they did not revive for another twenty years. In other words, meat eating went down just before coronary disease took off. Fat intake did rise during those years, from 1909 to 1961, when heart attacks surged, but this 12 percent increase in fat consumption was not due to a rise in animal fat. It was instead owing to an increase in the supply of vegetable oils, which had recently been invented.

Nevertheless, the idea that Americans once ate little meat and “mostly plants”—espoused by McGovern and a multitude of experts—continues to endure. And Americans have for decades now been instructed to go back to this earlier, “healthier” diet that seems, upon examination, never to have existed.