Dietary Dictocrats of EAT-Lancet

“Civilisation is in crisis. We can no longer feed our population a healthy diet while balancing planetary resources. For the first time in 200 000 years of human history, we are severely out of synchronisation with the planet and nature. This crisis is accelerating, stretching Earth to its limits, and threatening human and other species’ sustained existence.”

Those words, found on the main page for EAT-Lancet, are from comments by Tamara Lucas and Richard Horton, editors for The Lancet. EAT-Lancet is a campaign to force a high-carb, plant-based diet on all or most of the world’s population. They have global aspirations. I don’t automatically have a problem with this, despite my dislike of technocratic paternalism, for I understand there are global problems that require global solutions (pollution, for example, knows no national boundary). But there is a long history of bad dietary advice being imposed on large populations. I’m not fond of dominator culture, no matter how good the intentions. We might be wise to be cautious before going down that road again.

Besides, there seems to be an inherent contradiction behind this advocacy. The report and the editorial both praise what is basically the same old high-carb diet that governments around the world have been pushing since the late 1970s, a diet correlated with an epidemic of chronic diseases. The journal’s own editors seemingly admit that they see it as a forced choice between “a healthy diet” and “balancing planetary resources” — one or the other but not both. Or rather, since many of them don’t follow their own advice (more on that further down), it’s good health for the rich shoved onto the malnourished shoulders of the poor. This interpretation is indicated by how the report simultaneously acknowledges that certain foods are healthy even as those very foods are to be restricted. The authors then suggest that vitamin supplementation or fortification might be necessary to make up for what is lacking. It’s not always clear, in the report, whether a dietary suggestion is intended to promote human health or planetary health. Are they really trying to save the world or simply hoping to prop up a collapsing global order?

The claims about a healthy diet are suspect. “The Achilles heel of the proposal?” asks Tim Noakes, before continuing: “Most must surely realise that this cannot be healthy in the long term.” For a key area of health, “Our brains NEED animal foods. They’re 2/3 fat and can’t function without DHA. It also needs Vitamins B12, K2, A & Iron. They’re ONLY in animal foods & without them we have major brain issues. The spread of veganism is pouring fuel on the mental health crisis fire” (Carnivore Aurelius). I suspect the EAT-Lancet proponents realize this and don’t care… or rather it’s not their primary concern. This is not a new situation, since malnourishment caused by dietary guidelines has been going on for generations at this point. Nina Teicholz makes this point: “Americans have eaten more plants, fewer animal foods, and 34% less red meat since 1970. While, rates of obesity and diabetes have skyrocketed. How does it make sense that continuing on this path will improve health if it hasn’t so far?” (This is not just an issue for Americans, since the 1977 US Dietary Goals were adopted widely throughout the world, based on extremely weak evidence and bad science.) Elsewhere, she adds: “There is no rational basis for that.” As usual, Dr. Jason Fung shares this take on the situation: “they know they’re going to succeed with the same advice. Insanity, literally.” In another tweet, Tim Noakes concludes with a rhetorical question: “Don’t humans ever learn?”

Official dietary recommendations have been a grand failure, one could easily argue, and we have no reason to expect different results, other than a continued worsening as ill health accumulates from one generation to the next. Then again, maybe the advice hasn’t failed, in that maybe its purpose was never to promote public health in the first place. When something seems to fail and yet keeps getting repeated, consider the possibility that it is serving some other purpose all too well. If so, the real agenda simply isn’t the one being publicly stated. Not to be conspiratorial, but human nature is what it is, and that means people are good at rationalizing to themselves and others. It is largely irrelevant whether or not they sincerely believe they have good intentions.

Perhaps the covert motive is old-school social control and social engineering, quite possibly motivated by genuine paternalistic concern. Promoting a single diet for all the world would mean more big government run by technocrats who work for plutocrats, all to make the world a better place, and it just so happens to directly benefit certain people more than others. The ruling elite and the comfortable classes are starting to worry about the consequences of the capitalism that has slowly destroyed the world and, in their technocratic fantasy, they are hoping to manage the situation. That means getting the masses in line. There are too many useless eaters. And if the excess population (i.e., the takers) can’t be eliminated entirely without a lot of mess and complication (World War III, plague, eugenics, etc.), their consumption habits could be manipulated and redirected so that they don’t use up the resources needed by the rich (i.e., the makers). Since the teeming masses are useless anyhow, it matters not that they’ll be further malnourished than they already are. Sure, an increasing number will die at a younger age, and all the better, as it will keep the population down. (Yes, I’m being cynical, maybe more than is called for. But I don’t feel forgiving at the moment toward those who claim to have all the answers to complex problems. Let them experiment on themselves first and then get back to us later with the results.)

The commissioners of the report recommend that governments use “choice editing” in order to “guide choice” (nudge theory) through incentives, disincentives, and default policy or, failing that, “restrict choice” and “eliminate choice” to enforce compliance. That is to say, “the scale of change to the food system is unlikely to be successful if left to the individual or whim of consumer choice. This change requires reframing at the population and systemic level. By contrast, hard policy interventions include laws, fiscal measures, subsidies and penalties, trade reconfiguration, and other economic and structural measures.” This interventionism, including “banning and pariah status of key products” along with “rationing on a population scale”, would be more authoritarian in its proposed strategy than prior guidelines. It’s misleading to call them ‘guidelines’ at all when the object is to eliminate choice because the masses are seen as too stupid and weak to make the right choice.

No doubt, an austerity diet would never be willingly accepted by entire populations. In the blockade following World War II, the residents of Berlin were forced by circumstances into a severely restricted subsistence diet, based mostly on carbs and low in calories, protein, and fat — not that far off from the official dietary recommendations. Writing in 1952, Dr. H. E. Magee, Senior Medical Officer of the UK Ministry of Health, concluded: “The Berlin diet was austere… and only the compelling force of hunger and the fear of political oppression would, I believe, make any civilized community continue to eat a similar diet for as long as the Berliners did” (Nutrition Lessons of the Berlin Blockade). Yet so many officials continue with the mentality that austerity diets are the way to go: calorie counting, portion control, etc. But Gary Taubes, in Why We Get Fat, makes the point that all this accomplishes is making people endlessly hungry, with the perverse rebound effect of regained weight, even if it is initially lost. Short of a blockade or government enforcement, hunger always wins out. That is why the only successful diets are satiating, which generally means nutrient-dense and high-fat.

As always, the rich want to tell the lower classes how to live and then to have them do as they’re told, by carrot or stick. “The EAT-Lancet Commission spent three years calculating the first scientific targets for a healthy, globally-sustainable diet,” wrote Nick McDermott. “But,” he noted, “the panel of experts admitted none of them were on it.” So, do as they say, not as they do. Also pointing out the hypocrisy were Nina Teicholz and Dr. Jason Fung, the former stating it bluntly about one of the rich advocates: “#EATlancet funders: Private plane jetting around the world, major carbon footprint lifestyle while telling others to save planet from global warming. Doesn’t sound right.” Connecting some dots, Jeroen Sluiter observed that this isn’t exactly new behavior for the paternalistic dietary elite: “This reminds me of how nutrition guidelines’ first villain, Ancel Keys, lectured all of us about the “dangers” of meat while frequently enjoying the most delicious roast beef with his wife Margaret.”

I was reminded of the exact same thing. In reference to Ancel Keys’s “stringent vows of the dietary priesthood”, Sally Fallon Morell offers the following note (p. 157, Nourishing Diets): “Actually, Keys recommended the practice of renunciation for the general population but not for himself or those of his inner circle. The esteemed researcher Fred Kummerow, PhD, defender of eggs and butter in the human diet, once spied Keys and a colleague eating eggs and bacon at a conference for cardiologists. When Kummerow inquired whether Keys had changed his mind about dietary fats and cholesterol, Keys replied that such a restricted diet was “for others,” not for himself.” Keep in mind that Keys was the main figure who forced this dietary religion onto the American population and much of the rest of the world. With persuasive charisma, he righteously advocated that others eat a high-carb, high-fiber diet restricted in animal products: meat, fat, butter, eggs, etc. This became government policy and transformed the entire food sector. The eventual impact has fallen on possibly billions of people over multiple generations. Yet it wasn’t important enough to change his own dietary habits.

There is not enough to go around, but don’t worry, our benevolent overlords will never go without. As yet another put it with some historical context, “The elites will never eat this diet they prescribe to the masses. Meat for me. And wheat for thee. The elites with their superior bodies brains intellects and money will need special nutrition to maintain their hegemony and rightful place as leaders of the planet. Ask yourself why the silicon valley brainiacs are all on keto/carnivore. It’s a reenactment of feudal life w fatty meats for the elites & thin gruel for the peasants” (David Smith). A high-carb diet combined with low protein and low fat has always been a poverty diet, rarely eaten by choice other than by ascetic monks (and maybe it’s an acceptable diet for those who, with little physical activity, spend their time meditating and praying). Worse still, it’s unhealthy and, except when calories are pushed so low as to be a starvation diet, it’s fattening.

This general strategy has been tried before. It’s a way of shifting the blame and the consequences elsewhere. It’s the same as promoting feel-good policies such as household recycling, which helps distract from the fact that the vast majority of waste comes from factories and other businesses. The rich use most of the resources and cause the most problems. Yet it’s the rest of us who are supposed to take responsibility, as consumer-citizens. What the rich pushing this agenda refuse to talk about is that the entire system is to blame, the very system they benefit the most from. The only way to solve the problem is to eliminate the socioeconomic order that creates people so rich and powerful that they dream of controlling the world. If sustainability is the genuine concern, we need to return to a smaller-scale, decentralized way of living where solutions come from communities, not enforced by distant dictocrats.

But that would mean reducing inequality of wealth and power by bringing the resources and the decisions back to local populations. As Peter Kalmus wisely put it, “You cannot have billionaires and a livable Earth. The two cannot go together.” That isn’t what Gunhild Stordalen, the billionaire funding this campaign, wants, though (her organization is the EAT part of EAT-Lancet). She likes the inequality just fine the way it is, if not for the problem of all those poor people.

As Herman Melville put it, “Of all the preposterous assumptions of humanity over humanity, nothing exceeds most of the criticisms made on the habits of the poor by the well-housed, well-warmed, and well-fed.” The rich are worrying about what will happen when the living conditions, including diets, improve for the rest of the global population. And there is reason to worry for, after all, it is a finite planet. But the upper classes should worry about themselves, with the externalized costs of their lifestyle (on a finite planet, externalizations that go around come around). Once the obstructionist elite get out of the way, I have no doubt that the rest of us can come up with innovative responses to these dire times. Locally-sourced food eaten in season, organic and GMO-free agriculture, community gardens and family farms, crop rotation and cattle pasturage, farmers markets and food co-ops, etc — these are the kinds of things that will save the world, assuming we aren’t already too late.

A local diet including animal foods will be a thousand times better for the planetary biosphere than turning all of Earth’s available land over to big-ag industrial farming in order to support a plant-based diet. Even in the EAT-Lancet report, the authors agree that “animal production can also be essential for supporting livelihoods, grassland ecosystem services, poverty alleviation, and benefits of nutritional status.” They even go so far as to add that this is true “particularly in children and vulnerable populations.” Yet they attack all animal foods, which are the best sources of the fat-soluble vitamins that Weston A. Price found were central to the healthiest populations. Somehow animal products are bad for you and the entire planet, not just red meat but also chicken, fish, eggs, and dairy. Instead, we’re supposed to sustain ourselves on loads of carbs, as part of the decades of government-subsidized, chemically-doused, genetically-modified, and nutrient-depleted “Green Revolution”.

What they don’t explain is how the world’s poor are supposed to eat this way. The diet is presented as emphasizing fruits and vegetables. But in many poor countries, fruits and vegetables are more expensive than animal foods. The authors of the report do admit that animal foods might need to be increased slightly for many demographics — as Dr. Georgia Ede put it: “Although their diet plan is intended for all “generally healthy individuals aged two years and older,” the authors admit it falls short of providing proper nutrition for growing children, adolescent girls, pregnant women, aging adults, the malnourished, and the impoverished—and that even those not within these special categories will need to take supplements to meet their basic requirements.” It’s not clear what this means, as the admission goes against their general recommendations. The proposal is vague on details, with neither food lists nor meal plans. And the details that are shown don’t actually indicate greater amounts of fruits and vegetables, as the plant-based foods mostly consist of carbs (according to Optimising Nutrition’s Should you EAT Lancet?, the calorie breakdown comes to 70% plant-based foods including sweeteners, with 46% carbs, only 3% vegetables and 5% fruits, and a remarkable 5% for sweeteners, about equal to the allowance for meat).

In the harsh criticism offered by Optimising Nutrition: “You would be forgiven if you thought from their promotional materials that they were promoting more vegetables. But it’s not actually the case! However, I admit they are promoting primarily a ‘plant based diet’ if you count corn, soy and wheat (grown using large scale agricultural practices, mono-cropping and large doses of fertilisers and chemical pesticides) and the oils that you can extract from them as ‘plant based’.” I eat more actual vegetables on my low-carb, high-fat paleo diet than is being recommended in the EAT-Lancet report. Just because a diet is ‘plant-based’ doesn’t mean it’s healthy, considering most processed foods consist of plant-based ingredients. Even commercial whole wheat breads, with some fiber and vitamins added back into the denatured flour, are basically junk food with good marketing. Heck, partially hydrogenated oils and high fructose corn syrup are both plant-based. The EAT-Lancet diet is basically the Standard American Diet (SAD), as it falls in line with decades of a Food Pyramid with carbs as the base — more from Optimising Nutrition:

“The thing that struck me was the EAT Lancet dietary guidance seems to largely be an extension of the current status quo that is maximising profits for the food industry and driving us to eat more than we need to. Other than the doubling down on the recommendation to reduce red meat and eggs, it largely seems like business-as-usual for the food industry. With Walter Willett at the helm, it probably shouldn’t be surprising that this looks and feels like an extension of the Dietary Guidelines for Americans for the whole world, complete with talk of United Nations level sanctions to prevent excess meat consumption. […] it’s the added fats and oils (mostly from unsaturated fats) as well as flours and cereals (from rice, wheat and corn) that have exploded in our food system and tracked closely with the rise in obesity. The EAT Lancet guidelines will ensure that this runaway trend continues!”

The proposal, though, isn’t entirely worthless. But it most definitely is confusing and internally conflicted. Even if it genuinely were a diet high in healthy produce, it’s not clear why it dismisses all animal foods, including the eggs and dairy that are enjoyed by most vegetarians and non-vegetarians alike. If feeding the world is the issue, it’s hard to beat an egg for cost-effectiveness, and eggs accomplish this without the kind of subsidization we see with high-yield crops. When I was poor, I survived on eggs, the most expensive ingredient being the few frozen vegetables I threw in for balance and variety. Eggs are filling, both satisfying and satiating. They also make for a quick and easy meal, an advantage for the working poor with limited time and energy.

We are being told, though, that eggs are part of what is destroying the world and so must be severely limited, if not entirely eliminated, for the good of humanity. “While eggs are no longer thought to increase risk of heart disease, Willett said the report recommends limiting them because studies indicate a breakfast of whole grains, nuts and fruit would be healthier” (Candice Choi). So, there is nothing unhealthy about eggs, but since they are made of protein and fat, we should eat more carbs and sugar instead — “According to EAT Lancet, you can eat 8 tsp of sugar but only 1/4 egg per day” (Nina Teicholz). After all, everyone knows that American health has improved over the decades as more carbs and sugar were eaten… no, wait, it’s the complete opposite, with worsening health. That is plain fucked up! Explain to me again why eggs, one of the cheapest and healthiest food sources, are being targeted as a danger to human existence, as somehow contributing to overpopulation, environmental destruction, and climate change. What exactly is the connection? Am I missing something?
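
Teicholz’s numbers check out as simple arithmetic, by the way. Here is a minimal sketch, assuming the commonly cited figures from the report’s roughly 2,500-calorie reference diet (about 31 g/day of added sweeteners and 13 g/day of egg), along with a 4 g teaspoon of sugar and a 50 g egg:

\[
\frac{31\ \text{g sweeteners/day}}{4\ \text{g/tsp}} \approx 8\ \text{tsp of sugar per day},
\qquad
\frac{13\ \text{g egg/day}}{50\ \text{g/egg}} \approx \frac{1}{4}\ \text{egg per day}
\]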

Whatever the explanation, in eating less of such things as eggs, we are supposed to eat more of such things as vegetables — at least taking at face value how this diet is being sold. Let’s pretend for a moment that the EAT-Lancet diet is accurately described as vegetable-focused and that, as a sweeping recommendation, it is fully justified. Consider that, as Diana Rodgers explains, “Fresh produce is not grown year round in all locations, not available to everyone, and by calorie, weight, and micronutrients, more expensive than meat. Oh, and lettuce has three times the GHG emissions of bacon and fruit has the largest water and energy footprint per calorie. I didn’t see this mentioned in the EAT Lancet report.” We forget that our cheap vegetables in the West are actually rather uncommon for much of the world, excluding root vegetables, which are more widely available. I’d guess we only have such a broad variety of cheap vegetables here in the West because it’s part of the government subsidization of high-yield farming, which, by the way, has simultaneously depleted our soil and so produced nutrient-deficient food. I’m all in favor of subsidizing vegetables and much else, but I’d rather see the subsidization of sustainable farming in general, promoting nutrient-dense foods not limited to plants. Anyway, how is telling poor people to eat more expensive and, in some cases, problematic foods going to help a world population struggling with poverty and inequality?

“And what are the things individuals can do to reduce their carbon footprint?” asks Rodgers. “According to a recent meta-analysis, having one less child (in industrialized nations), which was shown by far to have the biggest impact, followed by living “car-free”, avoiding one round-trip trans-Atlantic flight, and buying “green” energy have much more of an effect on our carbon footprint than our dietary choices.” Most people in the West are already having fewer children. And most people in the rest of the world already live without cars. We know that the birthrate goes down when life conditions improve, but this dietary regime would worsen life conditions through austerity politics and so make people feel more desperate than they already are. As for transportation, many things could lessen its externalized costs, from funding public transportation to the relevant option of increasing local farming. More mass farming to support this top-down dietary scheme would require more mass transportation and would inevitably create more pollution.

One might note that EAT-Lancet is specifically partnered with big-ag companies, such as Monsanto, which has poisoned the world’s population with Roundup (i.e., glyphosate). Other companies involved are those developing meat alternatives produced from the industrially-farmed crops of big ag. And don’t worry about how this carb-laden diet will harm your health, with the majority of the American population already some combination of insulin resistant, pre-diabetic, and diabetic — they’ve got this covered: “The drug company Novo Nordisk supports Eat-Lancet. Smart. Insulin is 85% of their revenue” (P. D. Mangan). I’m beginning to see a pattern here in the vested interests behind this proposal. Just to throw out a crazy idea: maybe transnational corporations are the problem, not the answer.

The whole health-and-sustainability claim is a red herring and an unhelpful distraction. The EAT-Lancet commissioners and others of their ilk don’t feel they have to justify their position, not really. They throw out some halfhearted rationalizations, but these fall apart under casual scrutiny. Furthermore, the experts are far from consensus. The Associated Press offered some dissenting voices, noting that “John Ioannidis, chair of disease prevention at Stanford University, said he welcomed the growing attention to how diets affect the environment, but that the report’s recommendations do not reflect the level of scientific uncertainties around nutrition and health.” Ioannidis was quoted as saying, “The evidence is not as strong as it seems to be.” That is to put it mildly. We are in the middle of a replication crisis in numerous fields of science and, as Ioannidis has shown, food-related research is among the worst. When he says the evidence is not strong enough, people should pay attention.

For emphasis, consider what kind of scientists are involved in this project. The lead researcher and author behind the EAT-Lancet report is Walter Willett, chair of the Harvard School of Public Health’s nutrition department. He was recently rebuked in the science journal Nature (in an editorial and a feature article) for his unscientific behavior. Willett has many potential conflicts of interest with, according to Nina Teicholz, “many 100Ks in funding by a host of companies selling/promoting plant-based diet.” This is the guy, by the way, who inherited the mantle from Ancel Keys, which some would consider very low praise, as Keys too has regularly been accused of an unscientific approach to diet and nutrition.

Let me return to nutrient-density in my concluding thoughts. Feeding the whole world is the easy part. But if we want humanity, all of humanity, to thrive and not merely survive, this is what it comes down to — as I previously wrote (A Food Revolution Worthy of the Name!): “We don’t need to grow more food to feed the world but to grow better food to nourish everyone at least to a basic level, considering how many diseases even in rich countries are caused by nutrient deficiencies (e.g., Dr. Terry Wahls reversed multiple sclerosis symptoms in herself, in patients, and in clinical subjects through increasing nutrient-density). The same amount of food produced, if nutrient-dense, could feed many more people. We already have enough food and will continue to have enough food for the foreseeable future. The equal and fair distribution of food is a separate issue. The problem isn’t producing a greater quantity; what we desperately need is greater quality. But that is difficult because our industrial farming has harmed the health of the soil and denatured our food supply.”

From that piece, I suggested that nutrient-density, especially if combined with low-carb, might decrease food consumption worldwide. And for damn sure, it would improve health for those already eating so little. “What if we could feed more people with less land? And what if we could do so in a way that brought optimal and sustainable health to individuals, society, and the earth? Now that would be a food revolution worthy of the name!” This is very much an issue of inequality, as at least some of the EAT-Lancet commissioners acknowledge — Dr. Lawrence Haddad says, “Most conflict is driven by inequality, or at least a sense of inequality. Work by UNICEF and others shows that inequality in terms of malnutrition is actually rising faster within countries than it is between countries. So inequality within countries in terms of things like stunting and anaemia is either not improving or is actually worsening – and we know that inequality is a big driver of violent conflict.” The EAT-Lancet report itself mentions this in passing, mostly limited to a single paragraph:

“Wars and disasters cause food insecurity and highlight the issues faced when nutrition is inadequate and food becomes scarce. Wars and natural disasters also provide opportunities from which the food system can be transformed. However, only at the end of World War 2 was a global effort and commitment introduced to redirect the food system. New institutions were created or revised at the global level such as WHO, the Food and Agriculture Organization, and World Bank, which allied with new and renewed national Ministries of agriculture and health to stop pre-war food problems caused by market distortions, environmentally-damaging farming, and social inequalities. However, the negative consequences of the post-war food revolution are now becoming increasingly clear (ie, negative environmental and health consequences, as outlined in this Commission).”

I’ll give them credit for bringing it up at all, however inadequately. They do admit that our food system has failed. That makes it all the more unfortunate that they are demanding more of the same. As others have noted, the diet they have fashioned for the world is severely lacking in nutrition. And they offer no convincing suggestions for how to reverse this problem. It won’t help to eat more plant-based foods if they are grown through chemical-dependent high-yield farming that is depleting the soil of minerals and killing the microbes. The idea of nutrient-dense foods is simply not on their radar. That is because a large portion of nutrient-dense foods don’t come from plants and, furthermore, aren’t compliant with industrial farming and food production. That isn’t to say we should be eating massive amounts of meat, but animal foods have been the key element of every healthy population. Even among vegetarians, the healthiest are those with access to high-quality dairy and eggs, along with those eating food from traditional farming, with its many insects mixed in with grains and such.

None of that, as far as I can tell, is discussed in the EAT-Lancet report. The authors offer no helpful advice, no long-term vision that can move us in a positive direction. Their ideology is getting ahead of the science. A sense of urgency is important. But impatience, especially the EAT Foundation’s self-described “impatient disruption”, won’t serve us well. It was careless hubris that got us here. It’s time we learn to respect the precautionary principle, to think carefully before collectively acting, or rather before the ruling elite goes forward with yet another harebrained scheme. If as a society we want to direct our sense of urgency toward where it counts, that wouldn’t be hard to do: “World: Stop wasting a third of the food produced. Stop wrapping it in needless packaging. Stop transporting food half way round the world. Stop selling food at below-cost prices. Stop undercutting our produce with low standard alternatives. Then I’ll discuss how much meat I eat” (David Hill). It would mean drastically transforming our political and economic system. Capitalism and corporatism, as we know them, must end for the sake of humanity and the planet.

* * *

The Big Fat Surprise
by Nina Teicholz
pp. 131-133

“We Cannot Afford to Wait”

In the late 1970s in America, the idea that a plant-based diet might be the best for health as well as the most historically authentic was just entering the popular consciousness. Active efforts to demonize saturated fat had been underway for more than fifteen years by that time, and we’ve seen how the McGovern committee’s staff were in short order persuaded by these ideas. Even so, the draft report that Mottern wrote for the McGovern committee sparked an uproar—predictably—from the meat, dairy, and egg producers. They sent representatives to McGovern’s office and insisted that he hold additional hearings. Under pressure from these lobbies, McGovern’s staff carved out an exception for lean meats, which Americans could be advised to eat. Thus, Dietary Goals recommended that Americans increase poultry and fish while cutting back on red meat, butterfat, eggs, and whole milk. In the language of macronutrients, this meant advising Americans to reduce total fat, saturated fat, dietary cholesterol, sugar, and salt while increasing carbohydrate consumption to between 55 percent and 60 percent of daily calories.

While Mottern would have liked the final report to advise against meat altogether, some of the senators on the committee were not so unequivocally confident about their ability to weigh in on matters of nutritional science. The ranking minority member, Charles H. Percy from Illinois, wrote in the final Dietary Goals report that he and two other senators had “serious reservations” about the “divergence of scientific opinion on whether dietary change can help the heart.” They described the “polarity” of views among well-known scientists such as Jerry Stamler and Pete Ahrens and noted that leaders in government, including no less than the head of the NHLBI as well as the undersecretary of health, Theodore Cooper, had urged restraint before making recommendations to the general public.

Yet this hesitation turned out to be too little too late to stop the momentum that Mottern’s report had set in motion. Dietary Goals revived the same argument that Keys and Stamler had used before: that now was the time to take action on an urgent public health problem. “We cannot afford to await the ultimate proof before correcting trends we believe to be detrimental,” said the Senate report.

So it was that Dietary Goals, compiled by one interested layperson, Mottern, without any formal review, became arguably the most influential document in the history of diet and disease. Following publication of Dietary Goals by the highest elective body in the land, an entire government and then a nation swiveled into gear behind its dietary advice. “It has stood the test of time, and I feel very proud of it, as does McGovern,” Marshall Matz, general counsel of the McGovern committee, told me thirty years later.

Proof of the report’s substantiality, according to Matz, is that its basic recommendations—to reduce saturated fat and overall fat while increasing carbohydrates—have endured down to today. But such logic is circular. What if the US Congress had said exactly the opposite: to eat meat and eggs and nothing else? Perhaps that advice, supported by the power of the federal government, would have lived on equally well. In the decades since the publication of Dietary Goals, Americans have seen the obesity and diabetes epidemics explode—a hint, perhaps, that something is wrong with our diet. Based on these facts, the government might have deemed it appropriate to reconsider these goals, but it has nevertheless stayed the course because governments are governments, the least nimble of institutions, and unable easily to change direction.

* * *

Dr. Andrew Samis:

One hundred and eleven years ago, a scientist in St. Petersburg, Russia, fed rabbits meat, eggs, and dairy. Not unexpectedly for a herbivorous animal, the cholesterol built up in the blood vessels. It also built up in the ligaments, the tendons, the muscles, and everywhere else in the rabbits’ bodies, there being no evolved mechanism for excretion. This yellow goop in the rabbits’ aortas looked just like human atherosclerosis, which had only been described four years earlier. This started science down a misguided pathway of focusing on fat as the cause of hardening of the arteries, a pathway that future historians will likely call the greatest tragedy, in terms of years of life lost, in the history of humanity.

Initially it was eating cholesterol that was blamed for causing hardening of the arteries. Then in the 1950s an American physiologist, who had such an affinity for hard, compacted, refined carbohydrates that he designed soldiers’ rations featuring them, expanded the blame from cholesterol to all fat, especially animal fat. Carbohydrates should be increased and fat excluded: that was the battle cry! In the 1970s this unproven theory drew the attention of the US Senate, and within a few short years blaming fat for atherosclerosis became a worldwide revolution. This time period, interestingly, also marks the beginning of the obesity epidemic that has gripped the world’s developed countries. Tragically, what everyone seemed to have missed was the fact that there was no conclusive scientific evidence for this theory, and over time much of that thinking has actually been proven wrong. I have little doubt that issuing these guidelines without conclusive scientific evidence will eventually be viewed as the most significant blunder in the history of science.

I am an ICU doctor. I see the carnage that this cavalier and misguided attitude towards food guidelines has caused every single day, up close and personal. The tears of families suffering loss. The premature death of those who should have had long lives. Parents burying their adult sons and daughters. Atherosclerosis, obesity, and type 2 diabetes, when grouped together, represent the top conditions for admission to adult ICUs everywhere on earth where our unhealthy Western Diet is consumed. And approximately one in five don’t survive their ICU stay. But what makes me the most angry is the fact that those people who draft these misguided non-scientific food guidelines, with their biased agendas and misrepresented studies, sit in government offices and ivory towers completely remote from the devastating impact of their work. Is it any wonder that the doctors of the world represent a large portion of those leading the charge against our current misguided food guidelines? Doctors are not remote from the problem or blind to the devastation. It is here every single day at work.

This has to stop. Food guidelines need to be based on rigorous science. How many more thousands of people have to die?

Enough is enough.

* * *

Globe-trotting billionaire behind campaign to save planet accused of blatant hypocrisy
by Martin Bagot

Billionaire Vegan Tells Us all How to Eat
by Tim Rees

Billionaire tycoon who urged Brits to eat less meat tucks into 20,000-calorie burger
from Mirror

Eat Lancet, a template for sustaining irony
by Stefhan Gordon

Does Lancet want to hand control of our diets to the state?
by Kate Andrews

Tax, ban, regulate: the radical ‘planetary health diet’ explained
by Christopher Snowdon

Report: Cut red-meat eating by 80 percent to save the planet?
by Anne Mullens and Bret Scher

Can vegetarians save the planet? Why campaigns to ban meat send the wrong message on climate change
by Erin Biba

Two-pager Scientific Evidence on Red Meat and Health
from The Nutrition Coalition

I think you’ll find it’s a little bit more complicated than that…
by Malcolm Tucker

Why we should resist the vegan putsch
by Joanna Blythman

The EAT Lancet diet is nutritionally deficient
by Zoë Harcombe

EAT-Lancet Diet – inadequate protein for older adults
by Joy Kiddie

EAT-Lancet report’s recommendations are at odds with sustainable food production
by Sustainable Food Trust

Sorry, But Giving Up on Meat Is Not Going to Save The Planet
by Frank M. Mitloehner

20 Ways EAT Lancet’s Global Diet is Wrongfully Vilifying Meat
by Diana Rodgers

Red meat bounds down the carbon neutral path
by Shan Goodwin

Can cows cause more climate change than cars?
by Frédéric Leroy

The EAT-Lancet Commission’s controversial campaign
by Frédéric Leroy and Martin Cohen

Why we shouldn’t all be vegan
by Frédéric Leroy and Martin Cohen

EAT-Lancet: what lies behind the Veggie-Business according to Frédéric Leroy and Martin Cohen
from CARNI Sostenibili

Considerations on the EAT-Lancet Commission Report
from CARNI Sostenibili

The Eat-Lancet Commission: The World’s Biggest Lie
by Angela A. Stanton

We test diet of the future that will save the planet – that calls on Irish people to slash red meat consumption by 89 per cent
by Adam Higgins

Is the EAT-Lancet (Vegan) Rule-Book Hijacking Our Health?
by Belinda Fettke

EAT-Lancet’s Plant-based Planet: 10 Things You Need to Know
by Georgia Ede

Should you EAT Lancet?
from Optimising Nutrition

Damning Dietary Data

Below are some tweets from Nina Teicholz, the journalist who authored The Big Fat Surprise. Her book has pushed further the debate that Gary Taubes earlier helped bring into public view.

Both of their writings are eye-opening critiques of how we got to this place of mass health catastrophe, one that, if it continues, will bankrupt and cripple our society. Healthcare costs are going up not only because of big-biz exploitation but also because the American population has become more sickly. Most healthcare money now goes to chronic conditions that were rare in the past, and those costs are skyrocketing. This is trending toward disaster.

The graphed data she shares does one thing well. It clearly shows that, as she and others have written about, most Americans have been following the dietary guidelines given by mainstream authority figures, scientific institutions, and government agencies. Americans are eating more whole grains, legumes, vegetables, and fruits. This is true in terms of both percentage of calories and number of calories. We’ve been doing what we were told to do. How has that worked out? Not so well.

Furthermore, saturated fat consumption also decreased over this period (not included in the graphs). In fact, it had been decreasing since the early 20th century, prior to the beginning of the epidemic of obesity and heart disease. This is corroborated by the fact that no study has ever found a causal link between saturated fat and heart disease, despite probably trillions of dollars spent on researching diet and nutrition this past century. It’s not for a lack of trying to find such a causal link.

It turns out that the main proven causal link, that of sugar, was apparent in the earliest data. But interestingly, even sugar can’t be solely blamed for the sharp rise of chronic diseases over the past few generations. Teicholz points out that, “Sugar consumption has actually declined since 1999…so have refined grains.”

Then again, that was a small decline following a massive increase over the prior century. Keep in mind that Teicholz is only talking about added sugar. That leaves out the increase in foods naturally full of sugar, such as fruit, especially considering that fruit has been bred to be higher in sugar than what was available in the past. It also leaves out how the simple carbs in our modern diet have shot through the roof; as far as the body is concerned, they’re treated the same as sugar since they convert so easily.

Taken altogether, we are nowhere near the lower levels of sugar and carb intake seen in the early 1900s. And consumption in the 1800s was so low that the pro-carb experts today warning about the dangers of low-carb diets should be surprised that the American population somehow survived and thrived, with a citizenry that by the end of that century was on average the tallest among countries where such data was kept. That our added-sugar addiction has finally hit a plateau in the 21st century should offer no comfort.

About the graphs, this is one of the cases where the data does speak for itself. Not that it proves anything specific. It simply shows what has changed in relation to what else has changed. Quite telling, though, in its potential implications. Obviously, the standard dietary ideology can’t explain this data. The ruling experts don’t even bother to try to explain it. Heck, they do their best to avoid even acknowledging it. This is inconvenient data, to say the least. But given their corporate corruption and hypocrisy, it doesn’t stop the powers that be from continuing to push the same diet with claims that it will eventually have the opposite effect. What they won’t allow into public debate is the question of the real causes behind all of this. That is dangerous territory, because then we’d have to tread upon the high-profit domain of processed foods.

* * *

On a related note, this might be the reason Anthony Warner is an “Angry Chef” in attacking “fad diets” and “bullshit”, by which he means anything other than the dominant paradigm.

I had noticed an earlier book by him, but his most recent book caused me to research him further. I was willing to take him seriously, up to the point where I saw his book refer to Professor Tim Noakes as a “diet author”. Noakes is a top-rated researcher on diet and nutrition and the leading expert on the ketogenic diet in South Africa, where he successfully defended himself in a government trial, funded by millions of dollars of taxpayer money, for the sin of having suggested a traditional-foods diet to a pregnant woman. What are Warner’s credentials as an authority on diet and nutrition, other than being a blogger and corporate shill? None.

A former anonymous blogger, Warner has admitted to being a corporate consultant and development cook for food manufacturers. With corporate money overflowing from his pockets, he unsurprisingly “goes to great lengths to absolve the food industry and its relentless marketing of processed food from playing any role in modern diet problems,” as it was put by Bee Wilson. Warner goes so far as to defend the besmirched name of sugar. From a Guardian article by Tim Lewis, he is quoted as saying,

The rhetoric that sugar is poison, it’s killing us, has become completely accepted… We’re told it’s just empty calories. Well, we kind of need calories to live. But a lot of people will read that and say, ‘He would say that. He works for a big cake manufacturer.’… Sugar has an enormous amount of energy and is one of the most important building blocks for life. But they say, “It has no nutritional value.” That makes absolutely no sense.

That is amusing. I never thought I’d see a defense of sugar. Even the most mainstream scientific institutions and governmental agencies no longer try to defend sugar, although they did so in the past and have been slow to change. It’s scientific consensus at this point, both within and outside the establishment, that sugar is bad for health and is empty of nutrition. Consistency, of course, is irrelevant in his line of work — as explained by Chris C. at The Low Carb Diabetic forum:

I’m just thinking how unintentionally ironic his fevered defence of sugar is. Since he and his dietician pals all believe in calories in calories out, surely a food “full of energy” is the last thing to recommend that fat people eat even in their world?

Warner must be getting paid very well. His corporate advocacy is one of the greatest examples of sophistry I’ve ever seen. There appears to be no big-money food interest or food product he won’t defend — besides sugar: white bread, potato chips, processed meat, fast food, etc.; pretty much anything and everything that comes out of a factory. As is to be expected, he and his books get promoted in the corporate media.

The Angry Chef can do as much damage control as he wants on behalf of corporations. No informed person cares what a corporate shill has to say. And at this point, neither should anyone pay attention to dietary guidelines from governments, which are no more reliable than corporate hackery. Besides, it’s become overwhelmingly clear that governments and corporations regularly collude, specifically when the profits of the food system are involved (see Marion Nestle, among others). We are left to inform ourselves as best we can.

* * *

The USDA Dietary Guidelines Committee Gets The Spanking It Deserves
Tom Naughton

As you’ve probably heard, the National Academies of Science, Engineering and Medicine (NASEM) recently gave the USDA Dietary Guidelines Committee the spanking it deserves. Here are some quotes from an editorial in The Hill written by Rep. Andy Harris, who also happens to be a doctor:

The nation’s senior scientific body recently released a new report raising serious questions about the “scientific rigor” of the Dietary Guidelines for Americans. This report confirms what many in government have suspected for years and is the reason why Congress mandated this report in the first place: our nation’s top nutrition policy is not based on sound science.

In order to “develop a trustworthy DGA [guidelines],” states the report by the National Academies of Science, Engineering and Medicine (NASEM), “the process needs to be redesigned.”

Among other things, the report finds that the guidelines process for reviewing the scientific evidence falls short of meeting the “best practices for conducting systematic reviews,” and advises that “methodological approaches and scientific rigor for evaluating the scientific evidence” need to “be strengthened.”

In other words, the Dietary Guidelines for Americans are far from the “gold standard” of science and dietary advice they need to be. In fact, they may be doing little to improve our health at all.

Heh-heh-heh … remember what happened when Nina Teicholz, author of The Big Fat Surprise, wrote a piece in the British Medical Journal criticizing the dietary guidelines as unscientific? Dr. David Katz (who reviewed his own novel under a false name and compared himself to Milton and Chaucer) dismissed her critique as “the opinion of one journalist.” The USDA’s report, he insisted, “is excellent, and represents both the weight of evidence, and global consensus among experts.”

Then for good measure, he and several other members of The Anointed tried to harass BMJ into retracting the article by Teicholz.

And now along comes the NASEM report, saying Teicholz was right. The “opinion of one journalist” (which of course was shared by countless doctors and researchers) is now the official opinion of the National Academies of Science, Engineering and Medicine. You gotta love it. Perhaps Dr. Katz can write a rebuttal to the NASEM report, then review his rebuttal under a false name and compare himself to Albert Einstein.

Anyway, back to the editorial by Rep. Harris:

It seems clear that the lack of sound science has led to a number of dietary tenets that are not just mistaken, but even harmful – as a number of recent studies suggest.

For instance, the guidelines’ recommendation to eat “healthy whole grains” turns out not to be supported by any strong science, according to a recent study by the Cochrane Collaboration, a group specializing in scientific literature reviews. Looking at all the data from clinical trials, which is the most rigorous data available, the study concluded that there is “insufficient evidence” to show that whole grains reduced blood pressure or had any cardiovascular benefit.

* * *

Unsavory Truth
by Marion Nestle
pp. 108-113

[US senator William] Proxmire was right about the [National Academy of Science’s Food and Nutrition] board’s ties to industry. Those were revealed in 1980 during a dispute over the first edition of the US dietary guidelines, which advised reductions in intake of fat, saturated fat, and cholesterol (meaning, in effect, meat, dairy, and eggs) to reduce the risk of heart disease. The board opposed the guideline so vehemently that it issued a counter-report, Toward Healthful Diets, arguing that fat restrictions were unnecessary for healthy people. This infuriated health advocates, who charged that at least six board members had financial ties to industries most affected by the guidelines. Sheldon Margen, a professor of public health at the University of California, for example, objected that “the board’s range of expertise is too narrow, its ties with industry too close to avoid the suspicions of bias, its mandate is too ill-defined, and its mode of operation too secret.” Others criticized the board’s support by an industry liaison committee whose members represented eighty food companies. The furor over the report so embarrassed the academy that it eliminated the industry panel, removed board members with strong ties to food companies, and appointed new members with fewer industry ties.

That was not the only instance of early concerns about conflicted committees. I asked Ken Fisher, who in the 1970s had directed the nongovernmental Life Sciences Research Office (LSRO), about his experience appointing committees to review the safety of food additives. In 1958, Congress had defined two categories of food additives: new chemicals that needed to be proven safe before they could go into the food supply and substances with a history of common use—sugar, salt, flavorings, and the like—that could be considered generally recognized as safe (GRAS). In the early 1970s, questions about the safety of GRAS additives led President Richard Nixon to direct the FDA to evaluate them, and the FDA commissioned the LSRO to conduct the reviews. The LSRO appointed committees to do this work and was immediately confronted with the problem of what to do about candidates with ties to companies making or using the additive under consideration.

The review committees eventually issued 151 evaluations of more than four hundred GRAS additives. In a report on this work, Fisher said that the LSRO required candidates to report grants, contracts, and consultancies, as well as investments and holdings.  It did not permit members with such ties to participate in discussions or vote on final decisions. Fisher told me that all members “were made aware of these conditions and all agreed—after some back and forth.” He recalled “one conflicted member, who of his own volition, absented himself from the vote on the decision.” He also recalled that committees “rejected several of the monographs on substances because they were incomplete and clearly biased in coverage of published positive or negative studies on certain substances.”

Fisher’s comments suggested that conflicts of interest only rarely caused problems with GRAS reviews. But in The Case Against Sugar (2016) the journalist Gary Taubes presented the GRAS review of sugar (sucrose) as highly conflicted. His book notes that the chair of the overall GRAS review process was George W. Irving Jr., a former head of the scientific advisory board of the International Sugar Research Foundation, and that the GRAS committee relied heavily on materials provided by the Sugar Association. The 1976 GRAS review concluded that “other than the contribution made to dental caries, there is no clear evidence in the available information on sucrose that demonstrates a hazard to the public when used at the levels that are now current and in the manner now practiced.” According to Taubes, the Sugar Association took that to mean that “there is no substantiated scientific evidence indicating that sugar causes diabetes, heart disease, or any other malady.” He has harsh words for critics of the idea that sugars are harmful. “If you get a chance,” he advises, “ask about the GRAS Review Report. Odds are you won’t get an answer. Nothing stings a nutritional liar like scientific facts.”

The FDA’s GRAS reviews still elicit concerns about conflicted interests. A 2013 analysis of the GRAS review process concludes that the industry ties of committee members threaten not only the integrity of GRAS reviews but also the integrity of the FDA’s entire scientific enterprise. In a commentary on that analysis, I pointed out that without independent review of GRAS additives, it is difficult to be confident that the ones in use are safe.

My question to Fisher about GRAS review committees had induced him to search through notes packed away for decades. Among them, he found memos indicating that Mike Jacobson had asked to have consumer representatives appointed to GRAS review committees, but, he said, “We opted not to do so as it would imply the other members of the [committees] were not consumers.” Fisher was referring to Michael Jacobson, director of the Center for Science in the Public Interest (CSPI), whose concerns about conflicted advisory committee members also date back to the 1970s. Jacobson was arguing that if federal agencies insisted on permitting members with industry ties to serve on advisory committees, they should balance viewpoints with an equivalent number of consumer representatives.

Jacobson holds a doctorate in microbiology. He began his career working for Ralph Nader, cofounded CSPI in 1971, and retired as its director in 2017. CSPI’s purpose is to improve the American diet, and it continues to be the largest nonprofit organization engaged in advocacy for a broad range of nutrition issues, among them conflicts of interest caused by food industry sponsorship. I served on the CSPI board for about five years in the early 1990s, remain a member, and subscribe to its monthly Nutrition Action Healthletter.

In 1976, Jacobson asked a member of Congress with a strong record of consumer advocacy, New York Democrat Benjamin Rosenthal, to help him survey the heads of university nutrition departments about their faculty’s ties to food corporations. Jacobson told me why he had done this: “It was so obvious to me that professors were touting their academic affiliations while shilling for food manufacturers and trade associations. I thought it would be interesting and possibly useful to collect information about the matter.” Rosenthal introduced their report of the survey results, titled “Feeding at the Company Trough,” into the Congressional Record, with this blunt statement:

Nutritional and food science professors at Harvard, at the Universities of Wisconsin, Iowa and Massachusetts, and at many other prominent universities work closely and often secretly with food and chemical companies. Professors sit on the boards of directors, act as consultants, testify on behalf of industry at congressional hearings, and receive industry research grants. Many professors with corporate links also serve as “university” representatives on Federal advisory committees. . . . One can only come to the conclusion that industry grants, consulting fees and directorships are muzzling, if not prostituting nutrition and food science professors.

The report named names: it characterized Fred Stare, the head of Harvard’s Department of Nutrition, as a “food-industry apologist,” but it also listed the industry ties of sixteen other eminent scientists, nearly all members of prestigious national committees issuing advice about nutrition and health. It proposed three strategies for countering conflicted interests: balance, disclosure, and new funding mechanisms. All merit comment from today’s perspective.

To achieve balance, they wanted consumer representatives to be appointed to nutrition advisory committees. This seems entirely rational, but in my experience federal agencies view experts who avoid industry ties on principle as too biased to appoint, especially if they state those principles publicly. I was a member of the Dietary Guidelines Advisory Committee in 1995, but only because I had previously worked with the assistant secretary of health, Philip R. Lee, who insisted on my appointment. I served as a consumer representative on two FDA advisory committees in the 1990s, Food Advisory and Science Advisory, but have not been asked to join another federal committee since the publication of Food Politics in 2002. The FDA’s current practice is to appoint one consumer representative to its committees, hardly enough to have much influence on decisions.

With respect to disclosure, the report comments on the failure of the named professors to state the full extent of their industry ties: “As long as collaboration with industry continues to be viewed by the academic community as ethical and respectable, it is important that the public know about potential sources of bias. . . . In such matters, respect for individual privacy must yield to society’s right to know.”

To help accomplish the third strategy, funding, the report raised the idea of a nonprofit, public interest group to “launder” industry contributions before they reach universities. But I doubt that such a group could maintain its objectivity if it depended on ongoing donations. I also doubt that companies would be willing to provide ongoing support for research that might risk producing unfavorable results.

pp. 193-

[Founder of Harvard Department of Nutrition Fred] Stare ran into precisely the same difficulty faced by the Nutrition Foundation: the need to please donors to get ongoing support. For this reason, or perhaps because his personal beliefs coincided with those of his donors, he was widely recognized as a nutrition scientist working on behalf of the food industry. His public statements consistently defended the American diet against suggestions that it might increase the risk of heart or other chronic disease. He, like officials of the Nutrition Foundation, could be counted on to state the industry position on matters of diet and health and to assure reporters and Congress that no scientific justification existed for advice to avoid food additives or eat less sugar.

We now know much more about the depth of Stare’s food-industry ties from documents that came to light in 2016 when Cristin Kearns and colleagues at the University of California, San Francisco published an analysis of internal documents of the Sugar Research Foundation (SRF), the forerunner of today’s Sugar Association. The documents included letters between the SRF and Mark Hegsted, a faculty member in Stare’s Harvard department, about the SRF’s sponsorship of a research review on the effects of dietary carbohydrates and fats on cardiovascular disease. The review, written by Stare, Hegsted, and another colleague, appeared in two parts in the New England Journal of Medicine in 1967. The letters show that the SRF not only commissioned and paid for the review but also pressured the Harvard authors to exonerate sugar as a factor in heart disease, then and now the leading cause of death among Americans. Other documents from the mid-1960s demonstrate that the SRF withheld funding from studies suggesting that sugar might be harmful.

I wrote the editorial that accompanied that analysis.

A Food Revolution Worthy of the Name!

“Our success with carbohydrates, however, has had a serious downside: a worldwide plague of obesity, diabetes and other diet-related diseases.”
~Gerald C. Nelson

The conventional view on diet promoted by establishment figures and institutions is based on the idea that all calories are equal. In dieting and fat loss, this has meant promoting a philosophy of calorie-in/calorie-out, which translates into calorie counting and calorie restriction. Recent research has cast serious doubt on this largely untested hypothesis that has for so long guided public health recommendations.

There is also a larger background to this issue. The government has spent immense sums of money promoting and subsidizing the high-carb diet. For example, it has put decades of funding into research on growing higher-yield staples such as wheat, corn, and rice. But it has never done anything comparable for healthy foods that are nutrient-dense and low-carb. This promotion of high-yield crops grown with industrialized farming has denatured the soil and the food grown on it. That is doubly problematic, since these high-carb staples are low in nutrient-density even when grown on healthy soil.

This mentality of obsessing over food as calories is severely dysfunctional. It ignores the human reality of how our bodies function. And it ignores widespread human experience. Calorie-restricted diets are well known to have among the lowest rates of compliance and success. It doesn’t matter how many or how few calories one tries to eat, as long as the food one is eating is of such low quality. Hunger and cravings will drive you as your body seeks the nutrition it lacks.

As I’ve eaten more nutrient-dense foods as part of a diet that is ketogenic and paleo, my hunger has decreased and my cravings have disappeared. I certainly don’t consume more calories than before and possibly far less, not that I’m counting. I no longer overeat and I find fasting easy. Maybe so many people overeat and grow fat because the food system produces mostly empty calories and processed carbs. It’s what’s available and cheapest, and the food industry is brilliant at making its products as addictive as possible. The average person in our society is endlessly hungry while their body is not getting what it needs. It’s a vicious cycle of decline.

I remember how I was for most of my life until quite recently, with decades as a sugar addict and a junk food junky. I was always hungry and always snacking. Carbs and sugar kept my blood sugar and serotonin levels on a constant roller coaster ride of highs and lows, and wrecked my physical and mental health in the process. It wasn’t a happy state. And it would not have helped for anyone to tell me, in my deepest and darkest depressive funk, that I should count and restrict my calories. What I needed was more of the right kinds of calories, those filled with healthy fats and fat-soluble vitamins along with so much else. My body was starving from malnourishment even as I was overeating and, despite regular exercise, eventually gaining weight.

We don’t need to grow more food to feed the world; we need to grow better food to nourish everyone at least to a basic level, considering how many diseases even in rich countries are caused by nutrient deficiencies (e.g., Dr. Terry Wahls reversed multiple sclerosis symptoms in herself, in patients, and in clinical subjects by increasing nutrient-density). The same amount of food produced, if nutrient-dense, could feed many more people. We already have enough food and will continue to have enough food for the foreseeable future. Equal and fair distribution of food is a separate issue. The problem isn’t producing a greater quantity; what we desperately need is greater quality. But that is difficult because industrial farming has harmed the health of the soil and denatured our food supply.

The U.S. government pays some farmers not to grow anything because the market is flooded with too much food. At the same time, it pays other farmers to grow more of crops like corn, something I know from living in Iowa, the corn capital of the world. Subsidizing the production of processed carbs and high-fructose corn syrup is sickening and killing us, to say nothing of the problems with ethanol. Just as important, it also wastes limited resources that could be used in better ways.

We have become disconnected in so many ways. Scientific research and government policies disconnected from human health. An entire civilization disconnected from the earth we depend upon. And the modern mind disconnected from our own bodies, to the point of being alienated from what should be the most natural thing in the world, that of eating. When we are driven by cravings, our bodies are seeking something essential and needed. There is a good reason we’re attracted to things that taste sweet, salty, and fatty/oily. In natural whole foods, these flavors indicate something is nutrient-dense. But we fool the body by eating nutrient-deficient processed foods grown on poor soil. And then we create dietary ideologies that tell us this is normal.

What if we could feed more people with less land? And what if we could do so in a way that brought optimal and sustainable health to individuals, society, and the earth? Now that would be a food revolution worthy of the name!

* * *

The global food problem isn’t what you think
by Gerald C. Nelson 

Here’s what we found:

Under even the worst conditions, there will be enough food, if we define “enough” as meaning sufficient calories, on average, for everyone — with 2,000 calories per day as the standard requirement. . . [T]he post-World War II Green Revolution efforts to boost the productivity of staples such as wheat and rice have been so successful that we are now awash in carbohydrates. And because so much has already been invested in improving the productivity of these crops, solid yield gains will likely continue for the next few decades. The productivity enhancements have also made them more affordable relative to other foods that provide more of the other needed nutrients.

Our success with carbohydrates, however, has had a serious downside: a worldwide plague of obesity, diabetes and other diet-related diseases. The World Health Organization reports that in 2014, there were 462 million underweight adults worldwide but more than 600 million who were obese — nearly two-thirds of them in developing countries. And childhood obesity is rising much faster in poorer countries than in richer ones.

Meanwhile, micronutrient shortages such as Vitamin A deficiency are already causing blindness in somewhere between 250,000 and 500,000 children a year and killing half of them within 12 months of them losing their sight. Dietary shortages of iron, zinc, iodine and folate all have devastating health effects.

These statistics point to the need for more emphasis on nutrients other than carbohydrates in our diets. And in this area, our findings are not reassuring.

The Secret of Health

I’m going to let you in on a secret. But before I get to that… There is much conflict over diet. Many will claim that their own is the one true way. And some do have more research backing them up than others. But even that research has been extremely limited and generally of low quality. Hence, all the disagreement and debate.

There have been few worthwhile studies where multiple diets are compared on equal footing. And the results are mixed. In some studies, vegetarians live longer. But in others, they die sooner. Well, it depends on what kind of vegetarian diet, in what kind of population, and compared against which other diet or diets. The Mediterranean diet has also shown positive results, and the Paleo diet has as well, although most often the comparison is against a control group that isn’t on any particular diet.

It turns out that almost any diet is better than the Standard American Diet (SAD). Eating dog shit would be an improvement over what the average American shoves into their mouth-hole. I should know. I shudder at the diet of my younger days, consisting of junk food and fast food. Like most Americans, I surely used to be malnourished, along with likely having leaky gut, inflammation, insulin resistance, toxic overload, and who knows what else. Every change I’ve made in my diet over the years has been beneficial.

So, here is the great secret. It matters less which specific diet you follow, in the general sense. That is particularly true in decreasing some of the worst risk factors. Many diets can help you lose weight and such, from low-fat to high-fat, from omnivorous to vegetarian. That isn’t to say all diets are equal in the long term, but there are commonalities to be found in any healthy diet. Let me lay it out. All healthy diets do some combination of the following.

Eliminate or lessen:

  • processed foods
  • vegetable oils
  • carbs, especially simple carbs
  • grains, especially wheat
  • sugar, especially fructose
  • dairy, especially cow milk
  • foods from factory-farmed animals
  • artificial additives

Emphasize and increase:

  • whole foods
  • omega-3s, including but not limited to seafood
  • fiber, especially prebiotics
  • probiotics, such as fermented/cultured foods
  • foods that are organic, local, and in season
  • foods from pasture-raised or grass-fed animals
  • nutrient-density
  • fat-soluble vitamins

There are some foods that are harder to categorize. Even though many people have problems with cow milk, especially the variety with A1 casein, more people are able to deal with ghee, which has the problematic proteins removed. And pasture-raised cows produce nutrient-dense milk, just as they produce nutrient-dense organ meats and meat filled with omega-3s. So, it’s not that a diet has to include everything I listed. But the more closely a diet follows these guidelines, the greater the health benefits.

It does matter to some degree, for example, where you get your nutrient-density. Fat-soluble vitamins are hard to find in non-animal sources, a problem for vegans. But even a vegan can vastly increase their nutrient intake by eating avocados, leafy greens, seaweed, etc. The main point is that any increase in nutrients can bring a dramatic benefit to health. And the greater the amount and variety of nutrients, the greater the improvement.

That is why any diet you can imagine comes in healthy and unhealthy versions. No matter the diet, anyone who decreases unhealthy fats/oils and increases healthy fats/oils will unsurprisingly improve their health. But just as an omnivore could fill their plate with factory-farmed meat and dairy, a vegan could fill theirs with toxic soy-based processed foods and potato chips. The quality of a diet is in the details.

Still, it is easier to include more of what I listed in some diets than others. Certain nutrients are only found in animal sources and so a vegan has to be careful about supplementing what is otherwise lacking. A diet of whole foods that doesn’t require supplementation, however, is preferable.

That is why there are a surprisingly large number of self-identified vegans and vegetarians who will, at least on occasion, eat fish and other seafood. That also might be why the Mediterranean diet and Paleo diet can be so healthy, in their inclusion of these foods. Weston A. Price observed that some of the healthiest populations in the world were those living near the ocean. And this is why cod liver oil, high in omega-3s and fat-soluble vitamins and much else as well, was traditionally one of the most important parts of the Western diet.

Whatever the details one focuses upon, the simple rule is to increase the positives and decrease the negatives. It’s not that difficult, as long as one knows which details matter most. The basic trick to any healthy diet is to not eat like the average American. That is the secret.

* * *

Having gotten that out of the way, here is my bias.

My own dietary preferences are based on functional medicine, traditional foods, the paleo diet, nutritional science, anthropology, and archaeology — basically, any and all relevant evidence and theory. This is what informs the list I provided above, with primary focus on the Paleo diet, which brings all the rest together. That is what differentiates the Paleo diet from all others: it is a systematic approach that scientifically explains why the diet works. It focuses not just on one aspect but on all known aspects, including lifestyle and such.

Something like the Mediterranean diet is far different. It has been widely researched and it is healthy, at least relative to what it has been tested against. But there are multiple limitations to the health claims made about it.

First, the early research was done after World War II and, because of the war’s ravages to the food supply, the diet people were eating then was different from what they had been eating before. The healthy adults observed were healthy because of the diet they grew up on, not because of the deprivation diet they experienced after the war. That earlier diet was filled with meat and saturated fat, but it also had lots of vegetables and olive oil as well. As in the US, the health of the Mediterranean peoples had been declining from one generation to the next. So, arguing that the post-war Mediterranean diet was healthier than the post-war American diet wasn’t necessarily making as strong a claim as it first appeared, as health was declining in both places but with the decline in the latter being far worse.

Working with that problematic research alone, there was no way to get beyond mere associations in order to determine causation. As such, it couldn’t be stated with any certainty which parts of the diet were healthy, which parts unhealthy, and which parts neutral. It was a diet based on associations, not on scientific understanding of mechanisms and the evidence supporting them. It’s the same kind of associative research that originally linked saturated fat to heart disease, only for later reappraisal to discover that sugar was actually the stronger correlation. The confusion came about because, in the American population’s industrialized diet, habits of saturated fat consumption had become associated with those of sugar consumption; no study ever directly linked saturated fat to heart disease. It was a false or meaningless association, a correlation that, it turns out, didn’t imply causation.

That is the kind of mistake that the Paleo diet seeks to avoid. The purpose is not merely to look for random associations and hope that they are causal without ever proving it. Based on other areas of science, paleoists make hypotheses that can be tested, both in clinical studies and in personal experience. The experimental attitude is central.

That is why there is no single Paleo diet, in the way there is a single Mediterranean diet. As with hunter-gatherers in the real world, there is a diversity of Paleo diets that are tailored to different purposes, health conditions, and understandings. Dr. Terry Wahls’s Paleo diet is a plant-based protocol for multiple sclerosis; Dr. Dale Bredesen’s Paleo diet is part of an even more complex protocol, including ketosis, for Alzheimer’s. Other ketogenic Paleo diets target the treatment of obesity, autism, etc. Still other Paleo diets allow more carbs and so don’t prioritize ketosis at all. There are even Paleo diets so plant-based as to be vegetarian, with or without the inclusion of fish and seafood, more similar to that of Dr. Wahls.

Which is the Paleo diet? All of them. But what do they all have in common? What I listed above. They all take a multi-pronged approach. Other diets work to the degree they overlap with the Paleo diet, especially in nutrient-density. Sarah Ballantyne, a professor and medical biophysicist, argues that nutrient-density might be the single most important factor, and she might be right. Certainly, you could do worse than focusing on that alone. That has largely been the focus of traditional foods, as inspired by the work of Weston A. Price. Most diets seem to improve nutrient-density, one way or another, even if they don’t do it as fully as the best diets. The advantage of the Paleo diet(s), as with traditional foods and functional medicine, is that there is scientific understanding about why specific nutrients matter, even as our overall knowledge of nutrients has many gaps. Still, knowledge with gaps is better than anything else at the moment.

The list of dos and don’ts is based on the best science available. The science likely will change, and so dietary recommendations will be modified accordingly. But if a diet is based on ideology instead, new information can have no impact. Fortunately, most people advocating diets are increasingly turning to a scientific approach. This might explain why all diets are converging on the same set of principles. Few people would have been talking about nutrient-density back when the USDA made its initial dietary recommendations, as seen in the Food Pyramid. Yet now the idea of nutrient-density has become so scientifically established that it is almost common knowledge.

More important than the Paleo diet as a list of specific foods to eat and avoid is the scientific and experimental approach that its advocates have expressed more strongly than most. That is the way to treat the list I give, for each person is dealing with individual strengths and weaknesses, a unique history of contributing factors and health concerns. So, even if you dismiss the Paleo diet for whatever reason, don’t dismiss the principles upon which it is based (for vegetarians, see: Ketotarian by Dr. Will Cole and The Paleo Vegetarian Diet by Dena Harris). Anyone following any diet will find something of use, as tailored to their own needs.

That is the purpose of my presenting generalized guidelines that apply to all diets. It’s a way of getting past the ideological rhetoric in order to get at the substance of health itself, to get at the causal level. The secret is that there is no single healthy diet, not in any simplistic sense, even as every healthy diet has much in common.

Clearing Away the Rubbish

“The case against science is straightforward: much of the scientific literature, perhaps half, may simply be untrue”
~Richard Horton, editor in chief of The Lancet

“It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor”
~Dr. Marcia Angell, former editor in chief of NEJM

Back in September, there was a scientific paper published in Clinical Cardiology, a peer-reviewed medical journal that is “an official journal of the American Society for Preventive Cardiology” (Wikipedia). It got a ton of attention from news media, social media, and the blogosphere. The reason for all the attention is that, in the conclusion, the authors claimed that low-carb diets had proven the least healthy over a one-year period:

“One-year lowered-carbohydrate diet significantly increases cardiovascular risks, while a low-to-moderate-fat diet significantly reduces cardiovascular risk factors. Vegan diets were intermediate. Lowered-carbohydrate dieters were least inclined to continue dieting after conclusion of the study. Reductions in coronary blood flow reversed with appropriate dietary intervention. The major dietary effect on atherosclerotic coronary artery disease is inflammation and not weight loss.”

It has recently been retracted, and it has come out that the lead author, Richard M. Fleming, has a long history of fraud going back to 2002, with two federal fraud convictions in 2009 following his guilty plea. He has also since been debarred by the U.S. Food and Drug Administration. (But his closest brush with fame or infamy was his leaking the medical records of Dr. Robert Atkins, a leak that was behind a smear campaign.) As for his co-authors: “Three of the authors work at Fleming’s medical imaging company in California, one is a deceased psychologist from Iowa, another is a pediatric nutritionist from New York and one is a Kellogg’s employee from Illinois. How this group was able to run a 12-month diet trial in 120 subjects is something of a mystery” (George Henderson). Even before the retraction, many wondered how it ever passed peer review, considering the low quality of the study: “This study has so many methodological holes in it that it has no real value” (Low Carb Studies BLOG).

But of course, none of that has been reported as widely as the paper originally was. So, most people who read about it still assume it is valid evidence. This is related to the replication crisis: even researchers are often unaware of retractions, assuming journals allow retractions to be published at all, something they are reluctant to do because it delegitimizes their authority. So, a lot of low-quality or in some cases deceptive research goes unchallenged and unverified, neither confirmed nor disconfirmed. It’s rare for any study to fall under the scrutiny of replication. If not for the lead author’s criminal background in the Fleming case, this probably would have been another paper that slipped past and was forgotten or else, without replication, repeatedly cited in future research. As such, bad research builds on bad research, creating the appearance of mounting evidence, when in reality it is a house of cards (consider the takedown of Ancel Keys and company in the work of numerous authors: Gary Taubes’ Good Calories, Bad Calories; Nina Teicholz’s The Big Fat Surprise; Sally Fallon Morell’s Nourishing Diets; et cetera).

This is why the systemic problem and failure is referred to as a crisis. Fairly or unfairly, the legitimacy of entire fields of science is being questioned. Even scientists are no longer certain which research is valid and which is not. The few attempts at determining the seriousness of the situation by replicating studies have found a surprisingly low replication rate. And this problem is worse in the medical field than in many other fields, partly because of the kind of funding involved and, more importantly, because of how few doctors are educated in statistics or trained in research methodology. It is even worse with nutrition, as the average doctor gets about half the questions wrong when asked about this topic, and keep in mind that so much of the nutritional research is done by doctors. An example of a problematic dietary study is that of Dr. Fleming himself. We’d be better off letting physicists and geologists do nutritional research.

There is more than a half century of research that conventional medical and dietary opinions are based upon. In some major cases, re-analysis of data has shown completely opposite conclusions. For example, the most famous study by Ancel Keys blamed saturated fat for heart disease, while recent reappraisal has shown that the data actually points to a stronger link with sugar. Meanwhile, no study has ever directly linked saturated fat to heart disease. The confusion has come about because, in the Standard American Diet (SAD), saturated fat and sugar have been conflated in the population under study. Yet, even in cases like that of Keys, where we now know what the data shows, Keys’ original misleading conclusions are still referenced as authoritative.

The only time this crisis comes to attention is when the researcher gets attention. If Keys hadn’t been famous and Fleming hadn’t been a criminal, no one would have bothered to scrutinize their research. Lots of research gets continually cited without much thought, as the authority of research accumulates over time by being cited, which encourages further citation. It’s similar to how legal precedents can get set, even when the initial precedent was intentionally misinterpreted for that very purpose.

To dig through the original data, assuming it is available and one knows where to find it, is more work than most are willing to do. There is no glory or praise to be gained in doing it, nor will it promote one’s career or profit one’s bank account. If anything, there are plenty of disincentives in place, as academic careers in science are dependent on original research. Furthermore, private researchers working in corporations, for obvious reasons, tend to be even less open about their data and that makes scrutiny even more difficult. If a company found their own research didn’t replicate, they would be the last in line to announce it to the world and instead would likely bury it where it never would be found.

There is no system put into place to guard against the flaws of the system itself. And the news media is in an almost continual state of failure when it comes to scientific reporting. The crisis has been stewing for decades, occasionally mentioned but mostly suppressed, until now, when it has gotten so bad as to be undeniable. The internet has created alternative flows of information, and so much of the scrutiny, delayed for too long, is now coming from below. If this had happened at an earlier time, Fleming might have gotten away with it. But times have changed. And in crisis, there is opportunity or at the very least there is hope for open debate. So bring on the debate, just as soon as we clear away some of the rubbish.

* * *

Retracted: Long‐term health effects of the three major diets under self‐management with advice, yields high adherence and equal weight loss, but very different long‐term cardiovascular health effects as measured by myocardial perfusion imaging and specific markers of inflammatory coronary artery disease

The above article, published online on 27 September 2018 in Wiley Online Library (wileyonlinelibrary.com), has been withdrawn by agreement between the journal Editor in Chief, A. John Camm and Wiley Periodicals, Inc. The article has been withdrawn due to concerns with data integrity and an undisclosed conflict of interest by the lead author.

A convicted felon writes a paper on hotly debated diets. What could go wrong?
by Ivan Oransky, Retraction Watch

Pro-tip for journals and publishers: When you decide to publish a paper about a subject — say, diets — that you know will draw a great deal of scrutiny from vocal proponents of alternatives, make sure it’s as close to airtight as possible.

And in the event that the paper turns out not to be so airtight, write a retraction notice that’s not vague and useless.

Oh, and make sure the lead author of said study isn’t a convicted felon who pleaded guilty to healthcare fraud.

If only we were describing a hypothetical.

On second thought: A man of many talents — with a spotty scientific record
by Adam Marcus, Boston Globe

Richard M. Fleming may be a man of many talents, but his record as a scientist has been spotty. Fleming, who bills himself on Twitter as “PhD, MD, JD AND NOW Actor-Singer!!!”, was a co-author of a short-lived paper in the journal Clinical Cardiology purporting to find health benefits from a diet with low or modest amounts of fat. The paper came out in late September — just a day before the Food and Drug Administration banned Fleming from participating in any drug studies. Why? Two prior convictions for fraud in 2009.

It didn’t take long for others to begin poking holes in the new article. One researcher found multiple errors in the data and noted that the study evidently had been completed in 2002. The journal ultimately retracted the article, citing “concerns with data integrity and an undisclosed conflict of interest by the lead author.” But Fleming, who objected to the retraction, persevered. On Nov. 5, he republished the study in another journal — proving that grit, determination, and a receptive publisher are more important than a spotless resume.

Malnourished Americans

Prefatory Note

It would be easy to mistake this writing for a carnivore’s rhetoric against the evils of grains and agriculture. I’m a lot more agnostic on the issue than it might seem. But I do come across as strongly opinionated, from decades of personal experience with bad eating habits and their consequences, and my dietary habits were no better when I was vegetarian.

I’m not so much pro-meat as I am for healthy fats and oils, not only from animal sources but also from plants, with coconut oil and olive oil being two of my favorites. As long as you are getting adequate protein, from whatever source (including vegetarian foods), there is no absolute rule about protein intake. But hunter-gatherers on average do eat more fats and oils than protein (and more than vegetables as well), whether the protein comes from meat or from seeds and nuts (though the protein and vegetables they get are of extremely high quality and, of course, nutrient-dense, along with much fiber). Too much protein with too little fat/oil causes rabbit starvation. It’s fat and oil that have higher satiety and, combined with low-carb ketosis, are amazing at eliminating food cravings, addictions, and overeating.

Besides, I have nothing against plant-based foods. I eat more vegetables on the paleo diet than I did in the past, even when I was a vegetarian, more than any vegetarian I know as well; not just more in quantity but also more in quality. Many paleo and keto dieters have embraced a plant-based diet with varying attitudes about meat and fat. Dr. Terry Wahls, a former vegetarian, reversed her symptoms of multiple sclerosis by formulating a paleo diet that includes massive loads of nutrient-dense vegetables, while adding in nutrient-dense animal foods as well (e.g., liver).

I’ve picked up three books lately that emphasize plants even further. One is The Essential Vegetarian Keto Cookbook, which is pretty much as the title describes it, mostly recipes with some introductory material about ketosis. Another book, Ketotarian by Dr. Will Cole, is likewise about keto vegetarianism, but with leniency toward fish consumption and ghee (the former not strictly vegetarian and the latter not strictly paleo). The most recent I got is The Paleo Vegetarian Diet by Dena Harris, another person with a lenient attitude toward diet. That is what I prefer in my tendency toward ideological impurity. About diet, I’m bi-curious or maybe multi-curious.

My broader perspective is that of traditional foods. This is largely based on the work of Weston A. Price, which I was introduced to long ago by way of the writings of Sally Fallon Morell (formerly Sally Fallon). It is not a paleo diet in that agricultural foods are allowed, but its advocates share a common attitude with paleolists in valuing traditional nutrition and food preparation. Authors from both camps bond over their respect for Price’s work and so often reference those on the other side in their writings. I’m of the opinion, in line with traditional foods, that if you are going to eat agricultural foods then traditional preparation is all the more important (from long-fermented bread and fully soaked legumes to cultured dairy and raw aged cheese). Many paleolists share this opinion and some are fine with such things as ghee. My paleo commitment didn’t stop me from enjoying a white roll for Thanksgiving, adorning it with organic goat butter, and it didn’t kill me.

I’m not so much arguing against all grains in this post as I’m pointing out the problems found at the extreme end of dietary imbalance that we’ve reached this past century: industrialized and processed, denatured and toxic, grain-based/obsessed and high-carb-and-sugar. In the end, I’m a flexitarian who has come to see the immense benefits in the paleo approach, but I’m not attached to it as a belief system. I heavily weigh the best evidence and arguments I can find in coming to my conclusions. That is what this post is about. I’m not trying to tell anyone how to eat. I hope that heads off certain areas of potential confusion and criticism. So, let’s get to the meat of the matter.

Grain of Truth

Let me begin with a quote, share some related info, and then circle back around to putting the quote into context. The quote is from Grain of Truth by Stephen Yafa. It’s a random book I picked up at a secondhand store, and what attracted me to it was that the author defends agriculture and grain consumption. I figured it would be a good balance to my other recent readings. Skimming it, one factoid stuck out. In reference to new industrial milling methods that took hold in the late 19th century, he writes:

“Not until World War II, sixty years later, were measures taken to address the vitamin and mineral deficiencies caused by these grain milling methods. They caught the government’s attention only when 40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty.” (p. 17)

That is remarkable. He is talking about the now infamous highly refined flour, something that never existed before. Even commercial whole wheat breads today, with some fiber added back in, have little in common with what was traditionally made for millennia. My grandparents were of that particular generation that was so severely malnourished, and so that was the world into which my parents were born. The modern health decline that has gained mainstream attention began many generations back. Okay, so put that on the backburner.

Against the Grain

In a post by Dr. Malcolm Kendrick, I was having a discussion in the comments section (and, at the same time, I was having a related discussion in my own blog). Göran Sjöberg brought up James C. Scott’s book about the development of agriculture, Against the Grain — writing that, “This book is very much about the health deterioration, not least through epidemics partly due to compromised immune resistance, that occurred in the transition from hunting and gathering to sedentary mono-crop agriculture state level scale, first in Mesopotamia about five thousand years ago.”

Scott’s view has interested me for a while. I find compelling the way he connects grain farming, legibility, record-keeping, and taxation. There is a reason great empires were built on grain fields, not on potato patches or vegetable gardens, much less cattle ranching. Grain farming is easily observed and measured, tracked and recorded, and that meant it could be widely taxed to fund large centralized governments along with their armies and, later on, their police forces and intelligence agencies. The earliest settled societies arose prior to agriculture, but they couldn’t become major civilizations until the cultivation of grains.

Another commenter, Sasha, responded with what she considered important qualifications: “I think there are too many confounders in transition from hunter gatherers to agriculture to suggest that health deterioration is due to one factor (grains). And since it was members of upper classes who were usually mummified, they had vastly different lifestyles from that of hunter gatherers. IMO, you’re comparing apples to oranges… Also, grain consumption existed in hunter gatherers and probably intensified long before Mesopotamia 5 thousands years ago as wheat was domesticated around 9,000 BCE and millet around 6,000 BCE to use just two examples.”

It is true that pre-neolithic hunter-gatherers, in some cases, sporadically ate grains in small amounts, or at least we have evidence they were doing something with grains, though for all we know they might have been mixing them with medicinal herbs or using them as a thickener for paints; it’s anyone’s guess. Assuming they were eating those traces of grains we’ve discovered, it surely was nowhere near the level of the neolithic agriculturalists. Furthermore, during the following millennia, grains were radically changed through cultivation. As for the Egyptian elite, they were eating more grains than anyone, as farmers were still forced to partly subsist on hunting, fishing, and gathering.

I’d take the argument much further forward into history. We know from records that, through the 19th century, Americans were eating more meat than bread. Vegetable and fruit consumption was also relatively low and mostly seasonal. Part of that is because gardening was difficult with so many pests. Besides, with so many natural areas around, hunting and gathering remained a large part of the American diet. Even in the cities, wild game was easily obtained at cheap prices. Into the 20th century, hunting and gathering was still important and sustained many families through the Great Depression and World War era when many commercial foods were scarce.

It was different in Europe, though. Mass urbanization happened centuries before it did in the United States. And not much European wilderness was left standing in recent history. But with the fall of the Roman Empire and heading into feudalism, many Europeans returned to a fair amount of hunting and gathering, during which time general health improved in the population. Restrictive laws about land use eventually made that difficult, and the land enclosure movement made it impossible for most Europeans.

Even so, all of that is fairly recent in the big scheme of things. It took many millennia of agriculture before it more fully replaced hunting, fishing, trapping, and gathering. In places like the United States, that change is well within living memory. When some of my ancestors immigrated here in the 1600s, Britain and Europe still relied on plenty of wild foods to support their populations. And once here, wild foods were even more plentiful and a lot less work than farming.

Many early American farmers didn’t grow food so much for their own tables as to sell it on the market, sometimes in the form of the popular grain-based alcohols. It was in making alcohol that rural farmers were able to get their product to the market without it spoiling. I’m just speculating, but alcohol might have been the most widespread agricultural food of that era because water was often unsafe to drink.

Another commenter, Martin Back, made the same basic point: “Grain these days is cheap thanks to Big Ag and mechanization. It wasn’t always so. If the fields had to be ploughed by draught animals, and the grain weeded, harvested, and threshed by hand, the final product was expensive. Grain became a store of value and a medium of exchange. Eating grains was literally like eating money, so presumably they kept consumption to a minimum.”

In early agriculture, grain was more of a way to save wealth than a staple of the diet. It was saved for purposes of trade and also saved for hard times when no other food was available. What didn’t happen was to constantly consume grain-based foods every day and all day long — going from a breakfast with toast and cereal to lunch with a sandwich and maybe a salad with croutons, and then a snack of crackers in the afternoon before eating more bread or noodles for dinner.

Historical Examples

So, I am partly just speculating. But it’s informed speculation. I base my view on specific examples. The most obvious example is hunter-gatherers, poor by the standards of modern industrialization while maintaining great health, as long as their traditional way of life can be maintained. Many populations that are materially better off in terms of a capitalist society (access to comfortable housing, sanitation, healthcare, an abundance of food in grocery stores, etc.) are not better off in terms of chronic diseases.

As the main example I already mentioned, poor Americans have often been a quite healthy lot, as compared to other populations around the world. It is true that poor Americans weren’t particularly healthy in the early colonial period, specifically in Virginia because of indentured servitude. And it’s true that poor Americans today are fairly bad off because of the cheap industrialized diet. Yet for the couple of centuries or so in between, they were doing quite well in terms of health, with lots of access to nutrient-dense wild foods. That point is emphasized by looking at other similar populations at the time, such as back in Europe.

Let’s do some other comparisons. The poor in the Roman Empire did not do well, even when they weren’t enslaved. That was for many reasons, such as growing urbanization and its attendant health risks. When the Roman Empire fell, many of the urban centers collapsed. The poor returned to a more rural lifestyle that depended on a fair amount of wild foods. Studies done on their remains show their health improved during that time. Then at the end of feudalism, with the enclosure movement and the return of mass urbanization, health went back on a decline.

Now I’ll consider the early Egyptians. I’m not sure if there is any info about the diet and health of poor Egyptians. But clearly the ruling class had far from optimal health. It’s hard to make comparisons between then and now, though, because it was an entirely different kind of society. The early Bronze Age civilizations were mostly small city-states that lacked much hierarchy. Early Egypt didn’t even have the most basic infrastructure, such as maintained roads and bridges. And the most recent evidence indicates that the pyramid workers weren’t slaves but instead worked freely and seem to have been fed fairly well, whatever that may or may not indicate about their socioeconomic status. The fact that the poor weren’t mummified leaves us with scant evidence that would more directly inform us.

On the other hand, no one can doubt that there have been plenty of poor populations who had truly horrific living standards with much sickness, suffering, and short lifespans. That is particularly true over the millennia as agriculture became ever more central, since that meant periods of abundance alternating with periods of deficiency and sometimes starvation, often combined with weakened immune systems and rampant sickness. That was less the case for the earlier small city-states, with their lower population density, surrounded by the near-constant abundance of wilderness areas.

As always, it depends on the specifics we are talking about. Also, any comparison and conclusion is relative.

My mother grew up in a family that hunted, and at the time many Americans still had a certain amount of access to natural areas, something that helped a large part of the population get through the Great Depression and the world war era. Nonetheless, by the time of my mother’s childhood, overhunting had depleted most of the wild game (bison, bear, deer, etc. were no longer around), and so her family relied on less desirable foods such as squirrel, raccoon, and opossum; even the fish they ate were less than optimal, coming from waters polluted by the very factories and railroads her family worked in. So, the wild food opportunities weren’t nearly as good as they had been a half century earlier, much less in the prior centuries.

Not All Poverty is the Same

Being poor today means a lot of things that it didn’t mean in the past. The high rates of heavy metal toxicity seen today were rarely seen among the poor populations of the past. Today, 40% of global deaths are caused by air pollution, primarily affecting the poor, which is also extremely different from the past. Beyond that, inequality has grown larger than ever before, and that has been strongly correlated with high rates of stress, disease, homicide, and suicide. Such inequality is also seen in terms of climate change, droughts, refugee crises, and war/occupation.

Here is what Sasha wrote in response to me: “I agree with a lot of your points, except with your assertion that “the poor ate fairly well in many societies especially when they had access to wild sources of food”. I know how the poor ate in Russia in the beginning of the 20th century and how the poor eat now in the former Soviet republics and in India. Their diet is very poor even though they can have access to wild sources of food. I don’t know what the situation was for the poor in ancient Egypt but I would be very surprised if it was better than in modern day India or former Soviet Union.”

I’d imagine modern Russia has high inequality similar to the US. As for modern India, it is one of the most impoverished, densely populated, and malnourished societies around. And modern industrialization did major harm to Hindu Indians, because studies show that traditional vegetarians got a fair amount of nutrients from the insects that were mixed in with pre-modern agricultural goods. Both Russia and India have other problems related to neoliberalism that weren’t factors in the past. It’s an entirely different kind of poverty these days. Even if some Russians have some access to wild foods, I’m willing to bet they have nowhere near the access that was available in previous generations, centuries, and millennia.

Compare modern poverty to that of feudalism. At least in England, feudal peasants were guaranteed to be taken care of in hard times. The Church, a large part of local governance at the time, was tasked with feeding and taking care of the poor and needy, from orphans to widows. They were tight communities that took care of their own, something that no longer exists in most of the world where the individual is left to suffer and struggle. Present Social Darwinian conditions are not the norm for human societies across history. The present breakdown of families and communities is historically unprecedented.

Socialized Medicine & Externalized Costs
An Invisible Debt Made Visible
On Conflict and Stupidity
Inequality in the Anthropocene
Capitalism as Social Control

The Abnormal Norms of WEIRD Modernity

Everything about present populations is extremely abnormal. This is seen in diet as elsewhere. Let me return to the quote I began this post with. “Not until World War II, sixty years later, were measures taken to address the vitamin and mineral deficiencies caused by these grain milling methods. They caught the government’s attention only when 40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty.” * So, what had happened to the health of the American population?

Well, there were many changes. Overhunting, as I already said, made many wild game species extinct or eliminated them from local areas, such that my mother, born in a rural farm state, never saw a white-tailed deer growing up. Also, much earlier, after the Civil War, a new form of enclosure movement happened as laws were passed to prevent people, specifically the newly freed blacks, from hunting and foraging wherever they wanted (early American laws often protected the rights of anyone to hunt, forage plants, collect timber, etc. from any land that was left open, whether or not it was owned by someone). The carryover from the feudal commons was finally and fully eliminated. It was also the end of the era of free-range cattle ranching, the end having come with the invention of barbed wire. Access to wild foods was further reduced by the creation and enforcement of protected lands (e.g., the federal park system), which very much targeted the poor, who up to that point had relied upon wild foods for health and survival.

All of that was combined with mass urbanization and industrialization, with all of their new forms of pollution, stress, and inequality. Processed foods were becoming more widespread at the time. Around the turn of the century, unhealthy and industrialized vegetable oils became heavily marketed and hence popular, replacing butter and lard. Also, muckraking about the meat industry scared Americans off meat, and consumption dropped precipitously. As such, in the decades prior to World War II, the American diet had already shifted toward what we now know. A new young generation had grown up on that industrialized and processed diet, and those young people were the ones showing up as recruits for the military. This new diet in such a short period had caused mass malnourishment. It was a mass experiment that showed failure early on, and yet we continue the same basic experiment, not only continuing it but making it far worse.

Government officials and health authorities blamed it on bread production. Refined flour had become widely available because of industrialization. Refining removed all the nutrients that gave bread any health value. In response, there was a movement to fortify bread, initially enforced by federal law and later by state laws. That helped some, but obviously the malnourishment was caused by many other factors that weren’t appreciated by most at the time, even though this was the same period when Weston A. Price’s work was published. Nutritional science was young at the time, and most nutrients were still undiscovered or else unappreciated. Throwing a few lab-produced vitamins back into food barely scratches the surface of the nutrient-density that was lost.

Most Americans continue to have severe nutritional deficiencies. We don’t recognize this fact because being underdeveloped and sickly has become normalized, maybe even in the minds of most doctors and health officials. Besides, many of the worst symptoms don’t show up until decades later, often as chronic diseases of old age, although increasingly seen among the young. Far fewer Americans today would meet the health standards of World War recruits. It’s been a steady decline, despite the miracles of modern medicine in treating symptoms and delaying death.

* The data on the British shows an even earlier shift into malnourishment, because imperial trade brought an industrialized diet sooner to the British population. Also, rural life with a greater diet of wild foods had disappeared more quickly there than in the US. The fate of the British in the late 1800s showed what would happen more than a half century later on the other side of the ocean.

Lore of Nutrition
by Tim Noakes
pp. 373-375

The mid-Victorian period between 1850 and 1880 is now recognised as the golden era of British health. According to P. Clayton and J. Rowbotham,47 this was entirely due to the mid-Victorians’ superior diet. Farm-produced real foods were available in such surplus that even the working-class poor were eating highly nutritious foods in abundance. As a result, life expectancy in 1875 was equal to, or even better than, it is in modern Britain, especially for men (by about three years). In addition, the profile of diseases was quite different when compared to Britain today.

The authors conclude:

[This] shows that medical advances allied to the pharmaceutical industry’s output have done little more than change the manner of our dying. The Victorians died rapidly of infection and/or trauma, whereas we die slowly of degenerative disease. It reveals that with the exception of family planning, the vast edifice of twentieth century healthcare has not enabled us to live longer but has in the main merely supplied methods of suppressing the symptoms of degenerative disease which have emerged due to our failure to maintain mid-Victorian nutritional standards.48

The mid-Victorians’ healthy diet included freely available and cheap vegetables such as onions, carrots, turnips, cabbage, broccoli, peas and beans; fresh and dried fruit, including apples; legumes and nuts, especially chestnuts, walnuts and hazelnuts; fish, including herring, haddock and John Dory; other seafood, including oysters, mussels and whelks; meat – which was considered ‘a mark of a good diet’ so that ‘its complete absence was rare’ – sourced from free-range animals, especially pork, and including offal such as brain, heart, pancreas (sweetbreads), liver, kidneys, lungs and intestine; eggs from hens that were kept by most urban households; and hard cheeses.

Their healthy diet was therefore low in cereals, grains, sugar, trans fats and refined flour, and high in fibre, phytonutrients and omega-3 polyunsaturated fatty acids, entirely compatible with the modern Paleo or LCHF diets.

This period of nutritional paradise changed suddenly after 1875, when cheap imports of white flour, tinned meat, sugar, canned fruits and condensed milk became more readily available. The results were immediately noticeable. By 1883, the British infantry was forced to lower its minimum height for recruits by three inches; and by 1900, 50 per cent of British volunteers for the Boer War had to be rejected because of undernutrition. The changes would have been associated with an alteration in disease patterns in these populations, as described by Yellowlees (Chapter 2).

On Obesity and Malnourishment

There is no contradiction, by the way, between rampant nutritional deficiencies and the epidemic of obesity. Gary Taubes noted that the dramatic rise of obesity in America began early in the last century, which is to say that it is not a problem that came out of nowhere with the present younger generations. Americans have been getting fatter for a while now. Specifically, they were getting fatter while at the same time being malnourished, partly because of refined flour, as empty a carb as is possible.

Taubes emphasizes the point that this seeming paradox has often been observed among poor populations around the world: a lack of optimal nutrition that leads to ever more weight gain, sometimes with children being skinny to an unhealthy degree only to grow up to be fat. No doubt many Americans in the early 1900s were dealing with much poverty and the lack of nutritious foods that often goes with it. As for today, nutritional deficiencies take different forms because of enrichment, but malnourishment persists nonetheless in many other ways. Also, as Keith Payne argues in The Broken Ladder, growing inequality mimics poverty in the conflict and stress it causes. And inequality has everything to do with food quality, as seen with many poor areas being food deserts.

I’ll give you a small taste of Taubes’s discussion. It is from the introduction to one of his books, published a few years ago. If you read the book, look at the section immediately following the passage below. He gives examples of tribes that were poor, didn’t overeat, and did hard manual labor. Yet they were getting obese, even as nearby tribes sometimes remained a healthy weight. The only apparent difference was what they were eating, not how much they were eating. The populations that saw major weight gain had adopted a grain-based diet, typically because of government rations or government stores.

Why We Get Fat
by Gary Taubes
pp. 17-19

In 1934, a young German pediatrician named Hilde Bruch moved to America, settled in New York City, and was “startled,” as she later wrote, by the number of fat children she saw—“really fat ones, not only in clinics, but on the streets and subways, and in schools.” Indeed, fat children in New York were so conspicuous that other European immigrants would ask Bruch about it, assuming that she would have an answer. What is the matter with American children? they would ask. Why are they so bloated and blown up? Many would say they’d never seen so many children in such a state.

Today we hear such questions all the time, or we ask them ourselves, with the continual reminders that we are in the midst of an epidemic of obesity (as is the entire developed world). Similar questions are asked about fat adults. Why are they so bloated and blown up? Or you might ask yourself: Why am I?

But this was New York City in the mid-1930s. This was two decades before the first Kentucky Fried Chicken and McDonald’s franchises, when fast food as we know it today was born. This was half a century before supersizing and high-fructose corn syrup. More to the point, 1934 was the depths of the Great Depression, an era of soup kitchens, bread lines, and unprecedented unemployment. One in every four workers in the United States was unemployed. Six out of every ten Americans were living in poverty. In New York City, where Bruch and her fellow immigrants were astonished by the adiposity of the local children, one in four children were said to be malnourished. How could this be?

A year after arriving in New York, Bruch established a clinic at Columbia University’s College of Physicians and Surgeons to treat obese children. In 1939, she published the first of a series of reports on her exhaustive studies of the many obese children she had treated, although almost invariably without success. From interviews with her patients and their families, she learned that these obese children did indeed eat excessive amounts of food—no matter how much either they or their parents might initially deny it. Telling them to eat less, though, just didn’t work, and no amount of instruction or compassion, counseling, or exhortations—of either children or parents—seemed to help.

It was hard to avoid, Bruch said, the simple fact that these children had, after all, spent their entire lives trying to eat in moderation and so control their weight, or at least thinking about eating less than they did, and yet they remained obese. Some of these children, Bruch reported, “made strenuous efforts to lose weight, practically giving up on living to achieve it.” But maintaining a lower weight involved “living on a continuous semi-starvation diet,” and they just couldn’t do it, even though obesity made them miserable and social outcasts.

One of Bruch’s patients was a fine-boned girl in her teens, “literally disappearing in mountains of fat.” This young girl had spent her life fighting both her weight and her parents’ attempts to help her slim down. She knew what she had to do, or so she believed, as did her parents—she had to eat less—and the struggle to do this defined her existence. “I always knew that life depended on your figure,” she told Bruch. “I was always unhappy and depressed when gaining [weight]. There was nothing to live for.… I actually hated myself. I just could not stand it. I didn’t want to look at myself. I hated mirrors. They showed how fat I was.… It never made me feel happy to eat and get fat—but I never could see a solution for it and so I kept on getting fatter.”

pp. 33-34

If we look in the literature—which the experts have not in this case—we can find numerous populations that experienced levels of obesity similar to those in the United States, Europe, and elsewhere today but with no prosperity and few, if any, of the ingredients of Brownell’s toxic environment: no cheeseburgers, soft drinks, or cheese curls, no drive-in windows, computers, or televisions (sometimes not even books, other than perhaps the Bible), and no overprotective mothers keeping their children from roaming free.

In these populations, incomes weren’t rising; there were no labor-saving devices, no shifts toward less physically demanding work or more passive leisure pursuits. Rather, some of these populations were poor beyond our ability to imagine today. Dirt poor. These are the populations that the overeating hypothesis tells us should be as lean as can be, and yet they were not.

Remember Hilde Bruch’s wondering about all those really fat children in the midst of the Great Depression? Well, this kind of observation isn’t nearly as unusual as we might think.

How Americans Used to Eat

Below is a relevant passage. It puts into context how extremely unusual the high-carb, low-fat diet of these past few generations has been. This is partly what informed some of my thoughts. We so quickly forget that the present dominance of a grain-based diet wasn’t always the case, likely not even in most agricultural societies until quite recently. In fact, the earlier American diet is still within living memory, although those left to remember it are quickly dying off.

Let me explain why the history of diets matters. One of the arguments for forcing official dietary recommendations onto the entire population was the belief that Americans in a mythical past ate less meat, fat, and butter while eating more bread, legumes, and vegetables. This turns out to have been a trick of limited data.

We now know, from better data, that the complete opposite was the case. And we have further data showing that the rise of the conventional diet has coincided with the rise of obesity and chronic diseases. That isn’t to say eating more vegetables is bad for your health, but we do know that even as the average American intake of vegetables has gone up, so have all the diet-related health conditions. During this time, what went down was the consumption of all the traditional foods of the American diet going back to the colonial era: wild game, red meat, organ meat, lard, and butter — all the foods Americans ate in huge amounts prior to the industrialized diet.

What added to the confusion and misinterpretation of the evidence had to do with timing. Diet and nutrition were first seriously studied right at the moment when, for most populations, they had already changed. That was the failure of Ancel Keys’ research on what came to be called the Mediterranean diet (see Sally Fallon Morell’s Nourishing Diets). The population he studied was recuperating from World War II, which had devastated their traditional way of life, including their diet. Keys took the post-war deprivation diet as being the historical norm, but the reality was far different. Cookbooks and other evidence from before the war show that this population used to eat higher levels of meat and fat, including saturated fat. So, the very people he focused on had grown up and spent most of their lives on a diet that was at that moment no longer available because of the disruption of the food system. What good health Keys observed came from a lifetime of eating a different diet. Combined with cherry-picking of data and biased analysis, Keys came to a conclusion that was as wrong as wrong could be.

Slightly earlier, Weston A. Price was able to see a different picture. He intentionally traveled to the places where traditional diets remained fully in place. And the devastation of World War II had yet to happen. Price came to the conclusion that what mattered most of all was nutrient density. Sure, the vegetables eaten would have been of a higher quality than we get today, largely because they were heirloom cultivars grown on healthy soil. Nutrient-dense foods can only come from nutrient-dense soil, whereas today our food is nutrient-deficient because our soil is highly depleted. The same goes for animal foods. Animals pastured on healthy land will produce healthy dairy, eggs, meat, and fat; these foods will be high in omega-3s and the fat-soluble vitamins.

No matter if it is coming from plant sources or animal sources, nutrient density might be the most important factor of all. Fat is meaningful in this context because fat is where the fat-soluble vitamins are found and it is through fat that they are metabolized. In turn, the fat-soluble vitamins play a key role in the absorption and processing of numerous other nutrients, not to mention a key role in numerous functions in the body. Nutrient density and fat density go hand in hand in terms of general health. That is what early Americans were getting in eating so much wild food, not only wild game but also wild greens, fruit, and mushrooms. And nutrient density is precisely what we are lacking today, as the nutrients have been intentionally removed to make more palatable commercial foods.

Once again, this has a class dimension, since the wealthier have more access to nutrient-dense foods. Few poor people could afford to shop at a high-end health food store, even if one were located near their home. But it was quite different in the past, when nutrient-dense foods were available to everyone and sometimes more available to the poor concentrated in rural areas. If we want to improve public health, the first thing we should do is return to this historical norm.

The Big Fat Surprise
by Nina Teicholz
pp. 123-131

Yet despite this shaky and often contradictory evidence, the idea that red meat is a principal dietary culprit has thoroughly pervaded our national conversation for decades. We have been led to believe that we’ve strayed from a more perfect, less meat-filled past. Most prominently, when Senator McGovern announced his Senate committee’s report, called Dietary Goals, at a press conference in 1977, he expressed a gloomy outlook about where the American diet was heading. “Our diets have changed radically within the past fifty years,” he explained, “with great and often harmful effects on our health.” Hegsted, standing at his side, criticized the current American diet as being excessively “rich in meat” and other sources of saturated fat and cholesterol, which were “linked to heart disease, certain forms of cancer, diabetes and obesity.” These were the “killer diseases,” said McGovern. The solution, he declared, was for Americans to return to the healthier, plant-based diet they once ate.

The New York Times health columnist Jane Brody perfectly encapsulated this idea when she wrote, “Within this century, the diet of the average American has undergone a radical shift away from plant-based foods such as grains, beans and peas, nuts, potatoes, and other vegetables and fruits and toward foods derived from animals—meat, fish, poultry, eggs and dairy products.” It is a view that has been echoed in literally hundreds of official reports.

The justification for this idea, that our ancestors lived mainly on fruits, vegetables, and grains, comes mainly from the USDA “food disappearance data.” The “disappearance” of food is an approximation of supply; most of it is probably being eaten, but much is wasted, too. Experts therefore acknowledge that the disappearance numbers are merely rough estimates of consumption. The data from the early 1900s, which is what Brody, McGovern, and others used, are known to be especially poor. Among other things, these data accounted only for the meat, dairy, and other fresh foods shipped across state lines in those early years, so anything produced and eaten locally, such as meat from a cow or eggs from chickens, would not have been included. And since farmers made up more than a quarter of all workers during these years, local foods must have amounted to quite a lot. Experts agree that this early availability data are not adequate for serious use, yet they cite the numbers anyway, because no other data are available. And for the years before 1900, there are no “scientific” data at all.

In the absence of scientific data, history can provide a picture of food consumption in the late eighteenth to nineteenth century in America. Although circumstantial, historical evidence can also be rigorous and, in this case, is certainly more far-reaching than the inchoate data from the USDA. Academic nutrition experts rarely consult historical texts, considering them to occupy a separate academic silo with little to offer the study of diet and health. Yet history can teach us a great deal about how humans used to eat in the thousands of years before heart disease, diabetes, and obesity became common. Of course we don’t remember now, but these diseases did not always rage as they do today. And looking at the food patterns of our relatively healthy early-American ancestors, it’s quite clear that they ate far more red meat and far fewer vegetables than we have commonly assumed.

Early-American settlers were “indifferent” farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with “the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,” as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.

The endless bounty of America in its early years is truly astonishing. Settlers recorded the extraordinary abundance of wild turkeys, ducks, grouse, pheasant, and more. Migrating flocks of birds would darken the skies for days. The tasty Eskimo curlew was apparently so fat that it would burst upon falling to the earth, covering the ground with a sort of fatty meat paste. (New Englanders called this now-extinct species the “doughbird.”)

In the woods, there were bears (prized for their fat), raccoons, bobolinks, opossums, hares, and virtual thickets of deer—so much that the colonists didn’t even bother hunting elk, moose, or bison, since hauling and conserving so much meat was considered too great an effort. IX

A European traveler describing his visit to a Southern plantation noted that the food included beef, veal, mutton, venison, turkeys, and geese, but he does not mention a single vegetable. Infants were fed beef even before their teeth had grown in. The English novelist Anthony Trollope reported, during a trip to the United States in 1861, that Americans ate twice as much beef as did Englishmen. Charles Dickens, when he visited, wrote that “no breakfast was breakfast” without a T-bone steak. Apparently, starting a day on puffed wheat and low-fat milk—our “Breakfast of Champions!”—would not have been considered adequate even for a servant.

Indeed, for the first 250 years of American history, even the poor in the United States could afford meat or fish for every meal. The fact that the workers had so much access to meat was precisely why observers regarded the diet of the New World to be superior to that of the Old. “I hold a family to be in a desperate way when the mother can see the bottom of the pork barrel,” says a frontier housewife in James Fenimore Cooper’s novel The Chainbearer.

Like the primitive tribes mentioned in Chapter 1, Americans also relished the viscera of the animal, according to the cookbooks of the time. They ate the heart, kidneys, tripe, calf sweetbreads (glands), pig’s liver, turtle lungs, the heads and feet of lamb and pigs, and lamb tongue. Beef tongue, too, was “highly esteemed.”

And not just meat but saturated fats of every kind were consumed in great quantities. Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard. X

In the book Putting Meat on the American Table, researcher Roger Horowitz scours the literature for data on how much meat Americans actually ate. A survey of eight thousand urban Americans in 1909 showed that the poorest among them ate 136 pounds a year, and the wealthiest more than 200 pounds. A food budget published in the New York Tribune in 1851 allots two pounds of meat per day for a family of five. Even slaves at the turn of the eighteenth century were allocated an average of 150 pounds of meat a year. As Horowitz concludes, “These sources do give us some confidence in suggesting an average annual consumption of 150–200 pounds of meat per person in the nineteenth century.”

About 175 pounds of meat per person per year! Compare that to the roughly 100 pounds of meat per year that an average adult American eats today. And of that 100 pounds of meat, more than half is poultry—chicken and turkey—whereas until the mid-twentieth century, chicken was considered a luxury meat, on the menu only for special occasions (chickens were valued mainly for their eggs). Subtracting out the poultry factor, we are left with the conclusion that per capita consumption of red meat today is about 40 to 70 pounds per person, according to different sources of government data—in any case far less than what it was a couple of centuries ago.

Yet this drop in red meat consumption is the exact opposite of the picture we get from public authorities. A recent USDA report says that our consumption of meat is at a “record high,” and this impression is repeated in the media. It implies that our health problems are associated with this rise in meat consumption, but these analyses are misleading because they lump together red meat and chicken into one category to show the growth of meat eating overall, when it’s just the chicken consumption that has gone up astronomically since the 1970s. The wider-lens picture is clearly that we eat far less red meat today than did our forefathers.

Meanwhile, also contrary to our common impression, early Americans appeared to eat few vegetables. Leafy greens had short growing seasons and were ultimately considered not worth the effort. They “appeared to yield so little nutriment in proportion to labor spent in cultivation,” wrote one eighteenth-century observer, that “farmers preferred more hearty foods.” Indeed, a pioneering 1888 report for the US government written by the country’s top nutrition professor at the time concluded that Americans living wisely and economically would be best to “avoid leafy vegetables,” because they provided so little nutritional content. In New England, few farmers even had many fruit trees, because preserving fruits required equal amounts of sugar to fruit, which was far too costly. Apples were an exception, and even these, stored in barrels, lasted several months at most.

It seems obvious, when one stops to think, that before large supermarket chains started importing kiwis from New Zealand and avocados from Israel, a regular supply of fruits and vegetables could hardly have been possible in America outside the growing season. In New England, that season runs from June through October or maybe, in a lucky year, November. Before refrigerated trucks and ships allowed the transport of fresh produce all over the world, most people could therefore eat fresh fruit and vegetables for less than half the year; farther north, winter lasted even longer. Even in the warmer months, fruit and salad were avoided, for fear of cholera. (Only with the Civil War did the canning industry flourish, and then only for a handful of vegetables, the most common of which were sweet corn, tomatoes, and peas.)

Thus it would be “incorrect to describe Americans as great eaters of either [fruits or vegetables],” wrote the historians Waverly Root and Richard de Rochemont. Although a vegetarian movement did establish itself in the United States by 1870, the general mistrust of these fresh foods, which spoiled so easily and could carry disease, did not dissipate until after World War I, with the advent of the home refrigerator.

So by these accounts, for the first two hundred and fifty years of American history, the entire nation would have earned a failing grade according to our modern mainstream nutritional advice.

During all this time, however, heart disease was almost certainly rare. Reliable data from death certificates is not available, but other sources of information make a persuasive case against the widespread appearance of the disease before the early 1920s. Austin Flint, the most authoritative expert on heart disease in the United States, scoured the country for reports of heart abnormalities in the mid-1800s, yet reported that he had seen very few cases, despite running a busy practice in New York City. Nor did William Osler, one of the founding professors of Johns Hopkins Hospital, report any cases of heart disease during the 1870s and eighties when working at Montreal General Hospital. The first clinical description of coronary thrombosis came in 1912, and an authoritative textbook in 1915, Diseases of the Arteries including Angina Pectoris, makes no mention at all of coronary thrombosis. On the eve of World War I, the young Paul Dudley White, who later became President Eisenhower’s doctor, wrote that of his seven hundred male patients at Massachusetts General Hospital, only four reported chest pain, “even though there were plenty of them over 60 years of age then.” XI About one fifth of the US population was over fifty years old in 1900. This number would seem to refute the familiar argument that people formerly didn’t live long enough for heart disease to emerge as an observable problem. Simply put, there were some ten million Americans of a prime age for having a heart attack at the turn of the twentieth century, but heart attacks appeared not to have been a common problem.

Was it possible that heart disease existed but was somehow overlooked? The medical historian Leon Michaels compared the record on chest pain with that of two other medical conditions, gout and migraine, which are also painful and episodic and therefore should have been observed by doctors to an equal degree. Michaels catalogs the detailed descriptions of migraines dating all the way back to antiquity; gout, too, was the subject of lengthy notes by doctors and patients alike. Yet chest pain is not mentioned. Michaels therefore finds it “particularly unlikely” that angina pectoris, with its severe, terrifying pain continuing episodically for many years, could have gone unnoticed by the medical community, “if indeed it had been anything but exceedingly rare before the mid-eighteenth century.” XII

So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. XIII

Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating. The publication of The Jungle, Upton Sinclair’s fictionalized exposé of the meatpacking industry, caused meat sales in the United States to fall by half in 1906, and they did not revive for another twenty years. In other words, meat eating went down just before coronary disease took off. Fat intake did rise during those years, from 1909 to 1961, when heart attacks surged, but this 12 percent increase in fat consumption was not due to a rise in animal fat. It was instead owing to an increase in the supply of vegetable oils, which had recently been invented.

Nevertheless, the idea that Americans once ate little meat and “mostly plants”—espoused by McGovern and a multitude of experts—continues to endure. And Americans have for decades now been instructed to go back to this earlier, “healthier” diet that seems, upon examination, never to have existed.

Most Mainstream Doctors Would Fail Nutrition

“A study in the International Journal of Adolescent Medicine and Health assessed the basic nutrition and health knowledge of medical school graduates entering a pediatric residency program and found that, on average, they answered only 52 percent of eighteen questions correctly. In short, most mainstream doctors would fail nutrition.”
~Dr. Will Cole

That is amazing. The point is emphasized by the fact that these are doctors fresh out of medical school. If they were never taught this info in the immediately preceding years of intensive education and training, they are unlikely to pick up more knowledge later in their careers. These young doctors are among the most well-educated people in the world, as few fields are as hard to enter and the drop-out rate of medical students is phenomenal. These graduates entering residency programs are among the smartest of Americans, the cream of the crop, having been taught at some of the best schools in the world. They are highly trained experts in their field, but obviously this doesn’t include nutrition.

Think about this. Doctors are where most people turn for serious health advice. They are the ultimate authority figures that the average person directly meets and talks to. If a cardiologist got only 52 percent of the answers on heart health right, would you follow her advice and let her do heart surgery on you? I’d hope not. In that case, why would you listen to the dietary opinion of the typical, ill-informed doctor? Nutrition isn’t a minor part of health, that is for sure. It is the one area where an individual has some control over their life and so isn’t a mere victim of circumstance. Research shows that simple changes in diet and nutrition, not to mention lifestyle, can have dramatic results. Yet few people have that knowledge because most doctors and other officials, to put it bluntly, are ignorant. Anyone who points out this state of affairs in mainstream thought generally isn’t received with welcoming gratitude, much less friendly dialogue and rational debate.

In reading about the paleo diet, a pattern I’ve noticed is that few of its critics know what the diet is and what is advocated by those who adhere to it. It’s not unusual to see, following a criticism of the paleo diet, a description of dietary recommendations that are basically in line with the paleo diet. Their own caricature blinds them to the reality, obfuscating the common ground of agreement or shared concern. I’ve seen the same kind of pattern in the critics of many alternative views: genetic determinists against epigenetic researchers and social scientists, climate change denialists against climatologists, Biblical apologists against Jesus mythicists, Chomskyan linguists against linguistic relativists, etc. In such cases, there is always plenty of fear toward those posing a challenge, and so they are treated as the enemy to be attacked. And it is intended as a battle in which the spoils go to the victor, those in dominance assuming they will be the victor.

After debating some people on a blog post by a mainstream doctor (Paleo-suckered), it became clear to me how attractive genetic determinism and biological essentialism are to many defenders of conventional medicine: the notion that there isn’t much you can do about your health other than do what the doctor tells you and take your meds (these kinds of views may be on the decline, but they are far from down for the count). What bothers them isn’t limited to the paleo diet but extends seemingly to almost any diet as such, excluding official dietary recommendations. They see diet advocates as quacks, faddists, and cultists pushing an ideological agenda, and they feel like they are being blamed for their own ill health; from their perspective, it is unfair to tell someone they are capable of improving their diet, at least beyond the standard advice of eating your veggies and whole grains while gulping down your statins and shooting up your insulin.

As a side note, I’m reminded of how what often gets portrayed as alternative wasn’t always seen that way. Linguistic relativism was a fairly common view prior to the Chomskyan counter-revolution. Likewise, much of what gets promoted by the paleo diet was considered common sense in mainstream medical thought early last century and in the centuries prior (e.g., that carbs are fattening was easily observed back in the day when most people lived on farms, as carbs were and still are how animals get fattened for the slaughter). In many cases, there are old debates that go in cycles. But the cycles are so long, often extending over centuries, that old views appear as if radically new and so are easily dismissed as such.

Early Christian heresiologists admitted to the parallels that Jesus mythicists point to, but their only defense was that the devil had planted them in prior religions. During the Enlightenment Age, many people kept bringing up these religious parallels, and this was part of mainstream debate. Yet it was suppressed with the rise of literal-minded fundamentalism during the modern era. Then there is the battle between the Chomskyites, genetic determinists, etc. and their opponents, which is part of a cultural conflict that goes back at least to the ancient Greeks, to the differing approaches of Plato and Aristotle (Daniel Everett discusses this in Dark Matter of the Mind; see this post).

To return to the topic at hand, the notion of food as medicine, a premise of the paleo diet, also goes back to the ancient Greeks — in fact, it originates with the founder of modern medicine, Hippocrates (he is also credited with saying that “all disease begins in the gut,” a slight exaggeration of a common view about the importance of gut health, a key area of connection between the paleo diet and alternative medicine). What we now call functional medicine, treating people holistically, used to be the standard practice of family doctors for centuries and probably millennia, going back to medicine men and women. But this caring attitude and practice went by the wayside because it took time to spend with patients and insurance companies wouldn’t pay for it. Traditional healthcare that we now think of as alternative is maybe not possible within a for-profit model, but I’d say that is more of a criticism of the for-profit model than a criticism of traditional healthcare.

The dietary denialists love to dismiss the paleo lifestyle as a ‘fad diet’. But as Timothy Noakes argues, it is the least faddish diet around. It is based on research into what humans have been eating since the Paleolithic era and what hominids have been eating for millions of years. Even as a specific diet, it was the subject of the earliest official dietary recommendations given by medical experts. Back when it was popularized, it was called the Banting diet, and the only complaint the medical authorities had was not that it was wrong but that it was right; they disliked it being promoted in the popular literature, as they considered dietary advice to be their turf to defend. As Timothy Noakes wrote:

“Their first error is to label LCHF/Banting ‘the latest fashionable diet’; in other words, a fad. This is wrong. The Banting diet takes its name from an obese 19th-century undertaker, William Banting. First described in 1863, Banting is the oldest diet included in medical texts. Perhaps the most iconic medical text of all time, Sir William Osler’s The Principles and Practice of Medicine, published in 1892, includes the Banting/Ebstein diet as the diet for the treatment of obesity (on page 1020 of that edition). 13 The reality is that the only non-fad diet is the Banting diet; all subsequent diets, and most especially the low-fat diet that the UCT academics promote, are ‘the latest fashionable diets’.”
(Lore of Nutrition, p. 131)

The dominant paradigm maintains its dominance by convincing most people that what is perceived as ‘alternative’ was always that way or was a recent invention of radical thought. The risk the dominant paradigm takes is that, in attacking other views, it unintentionally acknowledges and legitimizes them. That happened in South Africa when the government spent hundreds of thousands of dollars attempting to destroy the career of Dr. Timothy Noakes, but because he was such a knowledgeable expert he was able to defend his medical views with scientific evidence. A similar thing happened when the Chomskyites viciously attacked the linguist Daniel Everett who worked in the field with native tribes, but it turned out he was a better writer with more compelling ideas and also had the evidence on his side. What the dogmatic assailants ended up doing, in both cases, was bringing academic and public attention to these challengers to the status quo.

Even though these attacks don’t always succeed, they are successful in setting examples. Even a pyrrhic victory is highly effective in demonstrating raw power in the short term. Not many doctors would be willing to risk their career as did Timothy Noakes, and even fewer would have the capacity to defend themselves to such an extent. It’s not only the government that might go after a doctor but also private litigators. And if a doctor doesn’t toe the line, that doctor can lose their job in a hospital or clinic, be denied Medicare reimbursement, be blacklisted from speaking at medical conferences, and suffer many other forms of punishment. That is what many challengers found in too loudly disagreeing with Ancel Keys and his gang — they were effectively silenced and no longer able to get funding to do research, even though the strongest evidence was on their side of the argument. Being shut out and becoming a pariah is not a happy place to be.

The establishment can be fearsome when they flex their muscles. And watch out when they come after you. The defenders of the status quo become even more dangerous precisely when they are the weakest, like an injured and cornered animal that growls all the louder, and most people wisely keep their distance. But without fools to risk it all in testing whether the bark really is worse than the bite, nothing would change and the world would grind to a halt, as inertia settled into full authoritarian control. We are in such a time. I remember back in the era of Bush Jr. and as we headed into the following time of rope-a-dope hope-and-change. There was a palpable feeling of change in the air and I could viscerally sense the gears clicking into place. Something had irrevocably changed and it wasn’t fundamentally about anything going on in the halls of power but something within society and the culture. It made me feel gleeful at the time, like scratching the exact right spot where it itches — ah, there it is! Outwardly, the world more or less appeared the same, but the public mood had clearly shifted.

The bluntness of reactionary right-wingers is caused by the very fact that the winds of change are turning against them. That is why they praise the crude ridicule of wannabe emperor Donald Trump. What in the past could have been ignored by those in the mainstream no longer can be ignored. And after being ignored, the next step toward potential victory is being attacked, which can be mistaken for loss even as it offers the hope for reversal of fortune. Attacks come in many forms, with a few examples already mentioned. Along with ridicule, there is defamation, character assassination, scapegoating, and straw man arguments; allegations of fraud, quackery, malpractice, or deviancy. These are attacks as preemptive defense, in the hope of enforcing submission and silence. This only works for so long, though. The tide can’t be held back forever.

The establishment is under siege and they know it. Their only hope is to be able to hold out long enough until the worst happens and they can drop the pretense in going full authoritarian. That is a risky gamble on their part and likely not to pay off, but it is the only hope they have of maintaining power. Desperation of mind breeds desperation of action. But it’s not as if a choice is being made. The inevitable result of a dominant paradigm is that it closes itself not only to all other possibilities but, more importantly, to even the imagination that something else is possible. Ideological realism becomes a reality tunnel. And insularity leads to intellectual laziness, as those who rule and those who support them have come to depend on a presumed authority as gatekeepers of legitimacy. What they don’t notice or don’t understand is the slow erosion of authority and hence the loss of what Julian Jaynes called authorization. Their need to be absolutely right is no longer matched with their capacity to enforce their increasingly rigid worldview, their fragile and fraying ideological dogmatism.

This is why challengers to the status quo are in a different position, which makes the contest rather lopsided. There is a freedom to being outside the constraints of mainstream thought. An imbalance of power, in some ways, works in favor of those excluded from power, since they have all the world to gain and little to lose, meaning less to defend. This shows in how outsiders, more easily than insiders, can often acknowledge where the other side is right and accept where points of commonality are to be found; that is to say, the challengers to power don’t have to be on constant attack in the way that is required of defenders of the status quo (similar to how guerrilla fighters don’t have to defeat an empire, but simply not lose and wait it out). Trying to defeat ideological underdogs that have growing popular support is like the U.S. military trying to win a war in Vietnam or Afghanistan — they are on the wrong side of history. But systems of power don’t give up without a fight, and they are willing to sacrifice loads of money and many lives in fighting losing battles, if only to keep the enemies at bay for yet another day. And the zombie ideas these systems are built on are not easily eliminated. That is because they are highly infectious mind viruses that can continue to spread long after the original disease vector has disappeared.

As such, the behemoth medical-industrial complex won’t be making any quick turns toward internal reform. Changes happen over generations. And for the moment, this generation of doctors and other healthcare workers was primarily educated and trained under the old paradigm. It’s the entire world most of them know. The system is a victim of its own success, and so those working within the system are victimized again and again in their own indoctrination. It’s not some evil sociopathic self-interest that keeps the whole mess slogging along; after all, even doctors are suffering the same failed healthcare system as the rest of us and are dying of the same preventable diseases. All are sacrificed equally, all are food for the system’s hunger. When my mother brought my nephew for an appointment, the doctor was not trying to be a bad person when she made the bizarre and disheartening claim that all kids eat unhealthily and are sickly; i.e., there is nothing to be done about it, that’s just the way kids are. Working within the failed system, that is all she knows. The idea that sickness isn’t or shouldn’t be the norm was beyond her imagination.

It is up to the rest of us to imagine new possibilities and, in some cases, to resurrect old possibilities long forgotten. We can’t wait for a system to change when that system is indifferent to our struggles and suffering. We can’t wait for a future time when most doctors are well-educated on treating the whole patient, when officials are well-prepared for understanding and tackling systemic problems. Change will happen, as so many have come to realize, from the bottom up. There is no other way. Until that change happens, the best we can do is to take care of ourselves and take care of our loved ones. That isn’t about blame. It’s about responsibility, that is to say the ability to respond; and more importantly, the willingness to do so.

* * *

Ketotarian
by Dr. Will Cole
pp. 15-16

With the Hippocratic advice to “let food be thy medicine, and medicine thy food,” how far have we strayed that the words of the founder of modern medicine can actually be threatening to conventional medicine?

Today medical schools in the United States offer, on average, only about nineteen hours of nutrition education over four years of medical school.10 Only 29 percent of U.S. medical schools offer the recommended twenty-five hours of nutrition education.11 A study in the International Journal of Adolescent Medicine and Health assessed the basic nutrition and health knowledge of medical school graduates entering a pediatric residency program and found that, on average, they answered only 52 percent of eighteen questions correctly.12 In short, most mainstream doctors would fail nutrition. So if you were wondering why someone in functional medicine, outside conventional medicine, is writing a book on how to use food for optimal health, this is why.

Expecting health guidance from mainstream medicine is akin to getting gardening advice from a mechanic. You can’t expect someone who wasn’t properly trained in a field to give sound advice. Brilliant physicians in the mainstream model of care are trained to diagnose a disease and match it with a corresponding pharmaceutical drug. This medicinal matching game works sometimes, but it often leaves the patient with nothing but a growing prescription list and growing health problems.

With the strong influence that the pharmaceutical industry has on government and conventional medical policy, it’s no secret that using foods to heal the body is not a priority of mainstream medicine. You only need to eat hospital food once to know this truth. Even more, under current laws it is illegal to say that foods can heal. That’s right. The words treat, cure, and prevent are in effect owned by the Food and Drug Administration (FDA) and the pharmaceutical industry and can be used in the health care setting only when talking about medications. This is the Orwellian world we live in today; health problems are on the rise even though we spend more on health care than ever, and getting healthy is considered radical and often labeled as quackery.

10. K. Adams et al., “Nutrition Education in U.S. Medical Schools: Latest Update of a National Survey,” Academic Medicine 85, no. 9 (September 2010): 1537-1542, https://www.ncbi.nlm.nih.gov/pubmed/9555760.
11. K. Adams et al., “The State of Nutrition Education at US Medical Schools,” Journal of Biomedical Education 2015 (2015), Article ID 357627, 7 pages, http://dx.doi.org/10.1155/2015/357627.
12. M. Castillo et al., “Basic Nutrition Knowledge of Recent Medical Graduates Entering a Pediatric Residency Program,” International Journal of Adolescent Medicine and Health (2015): 357-361, doi: 10.1515/ijamh-2015-0019, https://www.ncbi.nlm.nih.gov/pubmed/26234947.

Ancient Atherosclerosis?

In reading about health, mostly about diet and nutrition, I regularly come across studies that are either poorly designed or poorly interpreted. The conclusions don’t always follow from the data or there are so many confounders that other conclusions can’t be discounted. Then the data gets used by dietary ideologues.

There is a major reason I appreciate the dietary debate among proponents of traditional, ancestral, paleo, low-carb, ketogenic, and other related views (anti-inflammatory diets, autoimmune diets, etc., such as the Wahls Protocol for multiple sclerosis and the Bredesen Protocol for Alzheimer’s). This area of alternative debate leans heavily on questioning conventional certainties by digging deep into the available evidence. These diets seem to attract people capable of changing their minds, or maybe it is simply that many people who eventually come to these unconventional views do so after having already tried numerous other diets.

For example, Dr. Terry Wahls is a clinical professor of Internal Medicine, Epidemiology, and Neurology at the University of Iowa, while also being Associate Chief of Staff at a Veterans Affairs hospital. She was as conventional as doctors come until she developed multiple sclerosis, began researching and experimenting, and eventually became a practitioner of functional medicine. Also, she went from being a hardcore vegetarian following mainstream dietary advice (avoiding saturated fats, eating whole grains and legumes, etc.) to embracing an essentially nutrient-dense paleo diet; her neurologist at the Cleveland Clinic referred her to Dr. Loren Cordain’s paleo research at Colorado State University. Since that time, she has done medical research and, having recently procured funding, she is in the process of conducting a study to further test her diet.

Her experimental attitude, both personal and scientific, is common among those interested in these kinds of diets and functional medicine. This experimental attitude is necessary when one steps outside of conventional wisdom, something Dr. Wahls felt she had to do to save her own life — a health crisis being the kind of motivating factor that leads many people to try a paleo, keto, or similar diet after trying all else (these diets include protocols for dealing with serious illnesses, such as ketosis being used medically to treat epileptic seizures). Contradicting the professional opinion of respected authorities (e.g., the American Heart Association), a diet like this tends to be an option of last resort for most people, something they come to after much failure and worsening health. That breeds a certain mentality.

On the other hand, it should be unsurprising that people raised on mainstream views and who hold onto those views long into adulthood (and long into their careers) tend not to be people willing to entertain alternative views, no matter what the evidence indicates. This includes those working in the medical field. Some ask, why are doctors so stupid? As Dr. Michael Eades explains, it’s not that they’re stupid but that many of them are ignorant; to put it more nicely, they’re ill-informed. They simply don’t know because, like so many others, they are repeating what they’ve been told by other authority figures. And the fact of the matter is most doctors never learned much about certain topics in the first place: “A study in the International Journal of Adolescent Medicine and Health assessed the basic nutrition and health knowledge of medical school graduates entering a pediatric residency program and found that, on average, they answered only 52 percent of the eighteen questions correctly. In short, most mainstream doctors would fail nutrition” (Dr. Will Cole, Ketotarian).

The reason people stick to the known, even when it is wrong, is because it is familiar and so it feels safe (and because of liability, healthcare workers and health insurance companies prefer what is perceived as safe). Doctors, as with everyone else, are dependent on heuristics to deal with a complex world. And doctors, more than most people, are too busy to explore the large amounts of data out there, much less analyze it carefully for themselves.

This may relate to why most doctors tend not to make the best researchers, which is not to dismiss those attempting to do quality research. For that reason, you might think scientific researchers who aren’t doctors would be different from doctors. But that obviously isn’t always the case because, if it were, Ancel Keys’ low-quality research wouldn’t have dominated professional dietary advice for more than a half century. Keys wasn’t a medical professional or even trained in nutrition; rather, he was educated in a wide variety of other fields (economics, political science, zoology, oceanography, biology, and physiology), with his earliest research done on the physiology of fish.

I came across yet another example of this, although less extreme than that of Keys, and different in that at least some of the authors of the paper are medical doctors. The study in question was the work of 19 researchers. The paper, “Atherosclerosis across 4000 years of human history: the Horus study of four ancient populations,” was peer-reviewed and published in 2013 in the highly respected journal The Lancet (Keys’ work, one might note, was also highly respected). This study on atherosclerosis was widely reported in mainstream news outlets and received much attention from those critical of paleo diets, who offered it as a final nail in the coffin, claiming it as absolute proof that ancient people were as unhealthy as we are.

The 19 authors conclude that, “atherosclerosis was common in four preindustrial populations, including a preagricultural hunter-gatherer population, and across a wide span of human history. It remains prevalent in contemporary human beings. The presence of atherosclerosis in premodern human beings suggests that the disease is an inherent component of human ageing and not characteristic of any specific diet or lifestyle.” There you have it. Heart disease is simply in our genetics — so take your statin meds like your doctor tells you to do, just shut up and quit asking questions, quit looking at all the contrary evidence.

But even ignoring all else, does the evidence from this paper support their conclusion? No. It doesn’t require much research or thought to ascertain the weakness of the case presented. In the paper itself, on multiple occasions including in the second table, they admit that three of the four populations were farmers who ate a largely agricultural diet and, of course, lived an agricultural lifestyle. At most, these examples can speak to the conditions of the Neolithic but not the Paleolithic. Of these three, only one was transitioning from an earlier foraging lifestyle, but like the other two it was eating a higher-carb diet from foods they farmed. Also, the most well-known example of the bunch, the Egyptians, particularly points to the problems of an agricultural diet — as described by Michael Eades in Obesity in ancient Egypt:

“[S]everal thousand years ago when the future mummies roamed the earth their diet was a nutritionist’s nirvana. At least a nirvana for all the so-called nutritional experts of today who are recommending a diet filled with whole grains, fresh fruits and vegetables, and little meat, especially red meat. Follow such a diet, we’re told, and we will enjoy abundant health.

“Unfortunately, it didn’t work that way for the Egyptians. They followed such a diet simply because that’s all there was. There was no sugar – it wouldn’t be produced for another thousand or more years. The only sweet was honey, which was consumed in limited amounts. The primary staple was a coarse bread made of stone-ground, whole wheat. Animals were used as beasts of burden and were valued much more for the work they could do than for the meat they could provide. The banks of the Nile provided fertile soil for growing all kinds of fruits and vegetables, all of which were a part of the low-fat, high-carbohydrate Egyptian diet. And there were no artificial sweeteners, artificial coloring, artificial flavors, preservatives, or any of the other substances that are part of all the manufactured foods we eat today.

“Were the nutritionists of today right about their ideas of the ideal diet, the ancient Egyptians should have had abundant health. But they didn’t. In fact, they suffered pretty miserable health. Many had heart disease, high blood pressure, diabetes and obesity – all the same disorders that we experience today in the ‘civilized’ Western world. Diseases that Paleolithic man, our really ancient ancestors, appeared to escape.”

With unintentional humor, the authors of the paper note that, “None of the cultures were known to be vegetarian.” No shit. Maybe that is because until late in the history of agriculture there were no vegetarians, and for good reason. As Weston Price noted, there is a wide variety of possible healthy diets, as seen in traditional communities. Yet for all his searching for a healthy traditional community that was strictly vegan or even vegetarian, he could never find any; the closest examples were those that relied largely on such things as insects and grubs because of a lack of access to larger sources of protein and fat. On the other hand, the most famous vegetarian population, Hindu Indians, have one of the shortest lifespans (to be fair, though, that could be for other reasons, such as poverty-related health issues).

Interestingly, there apparently has never been a study done comparing a herbivore diet and a carnivore diet, although one study touched on it while not quite eliminating all plants from the latter. As for fat, there is no evidence that it is problematic (vegetable oils are another issue), if anything the opposite: “In a study published in the Lancet, they found that people eating high quantities of carbohydrates, which are found in breads and rice, had a nearly 30% higher risk of dying during the study than people eating a low-carb diet. And people eating high-fat diets had a 23% lower chance of dying during the study’s seven years of follow-up compared to people who ate less fat” (Alice Park, The Low-Fat vs. Low-Carb Diet Debate Has a New Answer); and “The Mayo Clinic published a study in the Journal of Alzheimer’s Disease in 2012 demonstrating that in individuals favoring a high-carb diet, risk for mild cognitive impairment was increased by 89%, contrasted to those who ate a high-fat diet, whose risk was decreased by 44%” (WebMD interview of Dr. David Perlmutter). Yet the respectable authorities tell us that fat is bad for our health, making it paradoxical that many fat-gluttonous societies have better health. There are so many paradoxes, according to conventional thought, that one begins to wonder if conventional thought is the real paradox.

Now let me discuss the one group, the Unangan, that at first glance stands out from the rest. The authors write of the “five Unangan people living in the Aleutian Islands of modern day Alaska (ca 1756–1930 CE, one excavation site).” Those mummies are far more recent than those from the other populations, which came much earlier in history. Four of the Unangan died around 1900 and one around 1850. Why does that matter? Well, for the reason that their entire world was being turned on its head at that time. The authors claim that, “The Unangan’s diet was predominately marine, including seals, sea lions, sea otters, whale, fish, sea urchins, and other shellfish and birds and their eggs. They were hunter-gatherers living in barabaras, subterranean houses to protect against the cold and fierce winds.” They base this claim on the assumption that these particular mummified Unangan had been eating the same diet as their ancestors for thousands of years, but the evidence points in the opposite direction.

Questioning this assumption, Jeffery Gerber explains that, “During life (before 1756–1930 CE) not more than a few short hundred years ago, the 5 Unangan/Aleut mummies were hardly part of an isolated group. The Fur Seal industry exploded in the 18th century bringing outside influence, often violent, from countries including Russia and Europe. These mummies during life, were probably exposed to foods (including sugar) different from their traditional diet and thus might not be representative of their hunter-gatherer origins” (Mummies, Clogged Arteries and Ancient Junk Food). One might add that, whatever Western foods may have been introduced, we do know of another factor — the Government of Nunavut’s official website states that, “European whalers regularly travelled to the Arctic in the late 17th and 18th century. When they visited, they introduced tobacco to Inuit.” Why is that significant? Tobacco is a known risk factor for atherosclerosis. Gideon Mailer and Nicola Hale, in their book Decolonizing the Diet, elaborate on the colonial history of the region (pp. 162-171):

“On the eve of Western contact, the indigenous population of present-day Alaska numbered around 80,000. They included the Alutiiq and Unangan communities, more commonly defined as Aleuts, Inupiat and Yupiit, Athabaskans, and the Tinglit and Haida groups. Most groups suffered a stark demographic decline from the mid-eighteenth century to the mid-nineteenth century, during the period of extended European — particularly Russian — contact. Oral traditions among indigenous groups in Alaska described whites as having taken hunting grounds from other related communities, warning of a similar fate to their own. The Unangan community, numbering more than 12,000 at contact, declined by around 80 percent by 1860. By as early as the 1820s, as Jacobs has described, “The rhythm of life had changed completely in the Unangan villages now based on the exigencies of the fur trade rather than the subsistence cycle, meaning that often villages were unable to produce enough food to keep them through the winter.” Here, as elsewhere, societal disruption was most profound in the nutritional sphere, helping account for the failure to recover population numbers following disease epidemics.

“In many parts of Alaska, Native American nutritional strategies and ecological niches were suddenly disrupted by the arrival of Spanish and Russian settlers. “Because,” as Saunt has pointed out “it was extraordinarily difficult to extract food from the challenging environment,” in Alaska and other Pacific coastal communities, “any disturbance was likely to place enormous stress on local residents.” One of indigenous Alaska’s most important ecological niches centered on salmon access points. They became steadily more important between the Paleo-Eskimo era around 4,200 years ago and the precontact period, but were increasingly threatened by Russian and American disruptions from the 1780s through the nineteenth century. Dependent on nutrients and omega fatty acids such as DHA from marine resources such as salmon, Aleut and Alutiiq communities also required other animal products, such as intestines, to prepare tools and waterproof clothing to take advantage of fishing seasons. Through the later part of the eighteenth century, however, Russian fur traders and settlers began to force them away from the coast with ruthless efficiency, even destroying their hunting tools and waterproof apparatus. The Russians were clear in their objectives here, with one of their men observing that the Native American fishing boats were “as indispensable as the plow and the horse for the farmer.”

“Here we are provided with another tragic case study, which allows us to consider the likely association between disrupted access to omega-3 fatty acids such as DHA and compromised immunity. We have already noted the link between DHA, reduced inflammation and enhanced immunity in the millennia following the evolution of the small human gut and the comparatively larger human brain. Wild animals, but particularly wild fish, have been shown to contain far higher proportions of omega-3 fatty acids than the food sources that apparently became more abundant in Native American diets after European contact, including in Alaska. Fat-soluble vitamins and DHA are abundantly found in fish eggs and fish fats, which were prized by Native Americans in the Northwest and Great Lakes regions, in the marine life used by California communities, and perhaps more than anywhere else, in the salmon products consumed by indigenous Alaskan communities. […]

“In Alaska, where DHA and vitamin D-rich salmon consumption was central to precontact subsistence strategies, alongside the consumption of nutrient-dense animal products and the regulation of metabolic hormones through periods of fasting or even through the efficient use of fatty acids or ketones for energy, disruptions to those strategies compromised immunity among those who suffered greater incursions from Russian and other European settlers through the first half of the nineteenth century.

“A collapse in sustainable subsistence practices among the Aleuts of Alaska exacerbated population decline during the period of Russian contact. The Russian colonial regime from the 1740s to 1840s destroyed Aleut communities through open warfare and by attacking and curtailing their nutritional resources, such as sea otters, which Russians plundered to supply the Chinese market for animal skins. Aleuts were often forced into labor, and threatened by the regular occurrence of Aleut women being taken as hostages. Curtailed by armed force, Aleuts were often relocated to the Pribilof Islands or to California to collect seals and sea otters. The same process occurred as Aleuts were co-opted into Russian expansion through the Aleutian Islands, Kodiak Island and into the southern coast of Alaska. Suffering murder and other atrocities, Aleuts provided only one use to Russian settlers: their perceived expertise in hunting local marine animals. They were removed from their communities, disrupting demography further and preventing those who remained from accessing vital nutritional resources due to the discontinuation of hunting frameworks. Colonial disruption, warfare, captivity and disease were accompanied by the degradation of nutritional resources. Aleut population numbers declined from 18,000 to 2,000 during the period of Russian occupation in the first half of the nineteenth century. A lag between the first period of contact and the intensification of colonial disruption demonstrates the role of contingent interventions in framing the deleterious effects of epidemics, including the 1837-38 smallpox epidemic in the region. Compounding these problems, communities used to a relatively high-fat and low-fructose diet were introduced to alcohol by the Russians, to the immediate detriment of their health and well-being.”

The traditional hunter-gatherer diet, as Mailer and Hale describe it, was high in the nutrients that protect against inflammation. The loss of these nutrients and the simultaneous decimation of the population was a one-two punch. Without the nutrients, their immune systems were compromised. And with compromised immune systems, they were prone to all kinds of health conditions, probably including heart disease, which of course is related to inflammation. Weston A. Price, in Nutrition and Physical Degeneration, observed that morbidity and mortality from conditions such as heart disease rise and fall with the seasons, precisely following the growth and dying away of vegetation throughout the year (which varies by region, as do the morbidity and mortality rates; the regions of comparison were in the United States and Canada). He was able to track this down to changes in the fat-soluble vitamins, specifically vitamin D, in dairy. When fresh vegetation was available, cows ate it and so produced more of these nutrients, and presumably more omega-3s at the same time.

Prior to colonization, the Unangan would have had access to even higher levels of these protective nutrients year round. The most nutritious dairy taken from the springtime wouldn’t come close in comparison to the nutrient profile of wild game. I don’t know why anyone would be shocked that, like agricultural populations, hunter-gatherers also experience worsening health after the loss of wild resources. Yet the authors of the mummy studies act like they made a radical discovery that sweeps aside every doubt anyone ever had about simplistic mainstream thought. It turns out, they seem to be declaring, that we are all victims of genetic determinism after all, so toss out your romantic fairy tales about healthy primitives from the ancient world. The problem is all the evidence that undermines their conclusion, including the evidence they present in their own paper, at least when it is interpreted in full context.

As if responding to the researchers, Mailer and Hale write (p. 186): “Conditions such as diabetes are thus often associated with heart disease and other syndromes, given their inflammatory component. They now make up a huge proportion of treatment and spending in health services on both sides of the Atlantic. Yet policy makers and researchers in those same health services often respond to these conditions reactively rather than proactively — as if they were solely genetically determined, rather than arising due to external nutritional factors. A similarly problematic pattern of analysis, as we have noted, has led scholars to ignore the central role of nutritional change in Native American population loss after European contact, focusing instead on purportedly immutable genetic differences.”

There is another angle related to the above but somewhat at a tangent. I’ll bring it up because the research paper mentions it in passing as a factor to be considered: “All four populations lived at a time when infections would have been a common aspect of daily life and the major cause of death. Antibiotics had yet to be developed and the environment was non-hygienic. In 20th century hunter-foragers-horticulturalists, about 75% of mortality was attributed to infections, and only 10% from senescence. The high level of chronic infection and inflammation in premodern conditions might have promoted the inflammatory aspects of atherosclerosis.”

This is familiar territory for me, as I’ve been reading much about inflammation and infections. The authors are presenting the old view of the immune system, as opposed to that of functional medicine, which looks at the entire human. An example of the latter is the hygiene hypothesis, which argues that it is exposure to microbes that strengthens the immune system; there has been much evidence in support of it (such as children raised with animals or on farms being healthier as adults). The researchers above are making an opposing argument, one contradicted by populations that remain healthy without modern medicine so long as they maintain a traditional diet and lifestyle in a healthy ecosystem, including living soil that hasn’t been depleted by intensive farming.

This isn’t only about agriculturalists versus hunter-gatherers. The distinction between populations goes deeper, into culture and environment. Weston A. Price discovered this simple truth in finding healthy populations among both agriculturalists and hunter-gatherers, but it was specific populations under specific conditions. Also, at the time when he traveled in the early 20th century, there were still traditional communities living in isolation in Europe. One example is the Loetschental Valley in Switzerland, which he visited on two separate trips in the consecutive years of 1931 and 1932 — as he writes of it:

“We were told that the physical conditions that would not permit people to obtain modern foods would prevent us from reaching them without hardship. However, owing to the completion of the Loetschberg Tunnel, eleven miles long, and the building of a railroad that crosses the Loetschental Valley, at a little less than a mile above sea level, a group of about 2,000 people had been made easily accessible for study, shortly prior to 1931. Practically all the human requirements of the people in that valley, except a few items like sea salt, have been produced in the valley for centuries.”

He points out that, “Notwithstanding the fact that tuberculosis is the most serious disease of Switzerland, according to a statement given me by a government official, a recent report of inspection of this valley did not reveal a single case.” In Switzerland and other countries, he found an “association of dental caries and tuberculosis.” The commonality was early life development, as underdeveloped and maldeveloped bone structure led to diverse issues: crowded teeth, smaller skull size, misaligned features, and what was called tubercular chest. And that was an outward sign of deeper and more systemic developmental issues, including malnutrition, inflammation, and the immune system:

“Associated with a fine physical condition the isolated primitive groups have a high level of immunity to many of our modern degenerative processes, including tuberculosis, arthritis, heart disease, and affections  of the internal organs. When, however, these individuals have lost this high level of physical excellence a definite lowering in their resistance to the modern degenerative processes has taken place. To illustrate, the narrowing of the facial and dental arch forms of the children of the modernized parents, after they had adopted the white man’s food, was accompanied by an increase in susceptibility to pulmonary tuberculosis.”

Any population that lost its traditional way of life became prone to disease. But this could often just as easily be reversed by having the diseased individual return to healthy conditions. In discussing Dr. Josef Romig, Price said that, “Growing out of his experience, in which he had seen large numbers of the modernized Eskimos and Indians attacked with tuberculosis, which tended to be progressive and ultimately fatal as long as the patients stayed under modernized living conditions, he now sends them back when possible to primitive conditions and to a primitive diet, under which the death rate is very much lower than under modernized conditions. Indeed, he reported that a great majority of the afflicted recover under the primitive type of living and nutrition.”

The point made by Mailer and Hale was earlier made by Price. As seen with pre-contact Native Alaskans, the isolated traditional residents of the Loetschental Valley had nutritious diets. Price explained that he “arranged to have samples of food, particularly dairy products, sent to me about twice a month, summer and winter. These products have been tested for their mineral and vitamin contents, particularly the fat-soluble activators. The samples were found to be high in vitamins and much higher than the average samples of commercial dairy products in America and Europe, and in the lower areas of Switzerland.” Whether fat and organ meats from marine animals or dairy from pastured alpine cows, the key is high levels of fat-soluble vitamins and, of course, omega-3 fatty acids procured from a pristine environment (healthy soil and clean water with no toxins, farm chemicals, hormones, etc). It also helped that both populations ate much of their food raw, which maintains the high nutrient content that is partly destroyed through heat.

Some might find it hard to believe that what you eat can determine whether or not you get a serious disease like tuberculosis. Conventional medicine tells us that the only thing that protects us is either avoiding contact or vaccination. But this view is being seriously challenged, as Mailer and Hale make clear (p. 164): “Several studies have focused on the link between Vitamin D and the health outcomes of individuals infected with tuberculosis, taking care to discount other causal factors and to avoid determining causation merely through association. Given the historical occurrence of the disease among indigenous people after contact, including in Alaska, those studies that have isolated the contingency of immunity on active Vitamin D are particularly pertinent to note. In biochemical experiments, the presence of the active form of vitamin D has been shown to have a crucial role in the destruction of Mycobacterium tuberculosis by macrophages. A recent review has found that tuberculosis patients tend to retain a lower-than-average vitamin D status, and that supplementation of the nutrient improved outcomes in most cases.” As an additional thought, the popular tuberculosis sanatoriums, some in the Swiss Alps, were attractive because “it was believed that the climate and above-average hours of sunshine had something to do with it” (Jo Fahy, A breath of fresh air for an alpine village). What does sunlight help the body to produce? Vitamin D.

As an additional perspective, James C. Scott, in Against the Grain, writes that, “Virtually every infectious disease caused by micro-organisms and specifically adapted to Homo sapiens has arisen in the last ten thousand years, many of them in the last five thousand years as an effect of ‘civilisation’: cholera, smallpox, measles, influenza, chickenpox, and perhaps malaria.” It is not only that agriculture introduces new diseases but also that it makes people susceptible to them. That might be true, as Scott suggests, even of a disease like malaria. The Piraha are more likely to die of malaria than anything else, but that might not have been true in the past. Let me offer a speculation by connecting to the mummy study.

The Ancestral Puebloans, one of the groups in the mummy study, were at the time farming maize (corn) and squash while foraging pine nuts, seeds, amaranth (grain), and grasses. How does this compare to the more recent Piraha? A 1948 Smithsonian publication, the Handbook of South American Indians edited by Julian H. Steward, reported that, “The Pirahã grew maize, sweet manioc (macaxera), a kind of yellow squash (jurumum), watermelon, and cotton” (p. 267). So it turns out that, like the Ancestral Puebloans, the Piraha have been on their way toward a more agricultural lifestyle for a while. I also noted that the same publication added the detail that the Piraha “did not drink rum,” but by the time Daniel Everett met the Piraha in 1978 traders had already introduced them to alcohol and it had become an occasional problem. Not only were they becoming agricultural but also Westernized, two factors that likely contributed to decreased immunity.

Like other modern hunter-gatherers, the Piraha have been affected by the Neolithic Revolution and are in many ways far different from Paleolithic hunter-gatherers. Ancient dietary habits are shown in the analysis of ancient bones — M.P. Richards writes that, “Direct evidence from bone chemistry, such as the measurement of the stable isotopes of carbon and nitrogen, do provide direct evidence of past diet, and limited studies on five Neanderthals from three sites, as well as a number of modern Palaeolithic and Mesolithic humans indicates the importance of animal protein in diets. There is a significant change in the archaeological record associated with the introduction of agriculture worldwide, and an associated general decline in health in some areas. However, there is a rapid increase in population associated with domestication of plants, so although in some regions individual health suffers after the Neolithic revolution, as a species humans have greatly expanded their population worldwide” (A brief review of the archaeological evidence for Palaeolithic and Neolithic subsistence). This is further supported by the analysis of coprolites. “Studies of ancient human coprolites, or fossilized human feces, dating anywhere from three hundred thousand to as recent as fifty thousand years ago, have revealed essentially a complete lack of any plant material in the diets of the subjects studied (Bryant and Williams-Dean 1975),” Nora Gedgaudas tells us in Primal Body, Primal Mind (p. 39).

This diet changed as humans entered our present interglacial period with its warmer temperatures and greater abundance of vegetation, which was lacking during the Paleolithic Period: “There was far more plant material in the diets of our more recent ancestors than our more ancient hominid ancestors, due to different factors” (Gedgaudas, p. 37). Following the earlier megafauna mass extinction, it wasn’t only agriculturalists but also hunter-gatherers who began to eat more plants and in many cases make use of cultivated plants (either ones they cultivated themselves or ones they adopted from nearby agriculturalists). To emphasize how drastic this change was, this loss of abundant meat and fat, consider the fact that humans have yet to regain the average height and skull size of Paleolithic humans.

The authors of the mummy study didn’t even attempt to look at data from Paleolithic humans. The populations compared are entirely from the past few millennia. And the only hunter-gatherer group included was post-contact. So, why are the authors so confident in their conclusion? I presume they were simply trying to get published and get media attention in a highly competitive market of academic scholarship. These people obviously aren’t stupid, but they had little incentive to fully inform themselves either. All the info I shared in this post I was able to gather in about half an hour of web searches, not exactly difficult academic research. It’s amazing what info is easily available these days, for those who want to find it.

Let me make one last point. The mummy study isn’t without its merits. The paper mentions other evidence that remains to be explained: “We also considered the reliability and previous work of the authors. Autopsy studies done as long ago as the mid-19th century showed atherosclerosis in ancient Egyptians. Also, in more recent times, Zimmerman undertook autopsies and described atherosclerosis in the mummies of two Unangan men from the same cave as our Unangan mummies and of an Inuit woman who lived around 400 CE. A previous study using CT scanning showed atherosclerotic calcifications in the aorta of the Iceman, who is believed to have lived about 3200 BCE and was discovered in 1991 in a high snowfield on the Italian-Austrian border.”

Let’s break that down. Further examples of Egyptian mummies are irrelevant, as their diet was so strikingly similar to the idealized Western diet recommended by mainstream doctors, dieticians, and nutritionists. That leaves the rest to account for. The older Unangan mummies are far more interesting, and any meaningful paper would have led with that piece of data, but even then it wouldn’t mean what the authors think it means. Atherosclerosis is one small factor and not necessarily as significant as assumed. From a functional medicine perspective, it’s the whole picture that matters in how the body actually functions and in the health that results. If so, atherosclerosis might not indicate the same thing for all populations. In Nourishing Diets, Morell writes (pp. 124-5),

“Critics have pointed out that Keys omitted from his study many areas of the world where consumption of animal foods is high and deaths from heart attack are low, including France — the so-called French paradox. But there is also a Japanese paradox. In 1989, Japanese scientists returned to the same two districts that Keys had studied. In an article titled “Lessons for Science from the Seven Countries Study,” they noted that per capita consumption of rice had declined, while consumption of fats, oils, meats, poultry, dairy products and fruit had all increased. […]

“During the postwar period of increased animal consumption, the Japanese average height increased three inches and the age-adjusted death rate from all causes declined from 17.6 to 7.4 per 1,000 per year. Although the rates of hypertension increased, stroke mortality declined markedly. Deaths from cancer also went down in spite of the consumption of animal foods.

“The researchers also noted — and here is the paradox — that the rate of myocardial infarction (heart attack) and sudden death did not change during this period, in spite of the fact that the Japanese weighed more, had higher blood pressure and higher cholesterol levels, and ate more fat, beef and dairy foods.”

Right here in the United States, we have our own ‘paradox’ as well. Good Calories, Bad Calories by Gary Taubes makes a compelling argument that, based on the scientific research, there is no strong causal link between atherosclerosis and coronary heart disease. Nina Teicholz has also written extensively about this, such as in her book The Big Fat Surprise; and in an Atlantic piece (How Americans Got Red Meat Wrong) she lays out some of the evidence showing that Americans in the 19th century, as compared to the following century, ate more meat and fat while eating fewer vegetables and fruits. Nonetheless: “During all this time, however, heart disease was almost certainly rare. Reliable data from death certificates is not available, but other sources of information make a persuasive case against the widespread appearance of the disease before the early 1920s.” Whether or not earlier Americans had high rates of atherosclerosis, there is strong evidence indicating they did not have high rates of heart disease, of strokes and heart attacks. The health crisis for these conditions, as Teicholz notes, didn’t take hold until the very moment meat and animal fat consumption took a nosedive. So what gives?

The takeaway is this. We have no reason to assume that atherosclerosis in the present or in the past can tell us much of anything about general health. Even ignoring the fact that none of the mummies studied was from a high protein and high fat Paleo population, we can make no meaningful interpretations of the presence of atherosclerosis among some of the individuals. Going by modern data, there is no reason to jump to the conclusion that they had high mortality rates because of it. Quite likely, they died from completely unrelated health issues. A case in point is that of the Masai, around which there is much debate in interpreting the data. George V. Mann and others wrote a paper, Atherosclerosis in the Masai, that demonstrated the complexity:

“The hearts and aortae of 50 Masai men were collected at autopsy. These pastoral people are exceptionally active and fit and they consume diets of milk and meat. The intake of animal fat exceeds that of American men. Measurements of the aorta showed extensive atherosclerosis with lipid infiltration and fibrous changes but very few complicated lesions. The coronary arteries showed intimal thickening by atherosclerosis which equaled that of old U.S. men. The Masai vessels enlarge with age to more than compensate for this disease. It is speculated that the Masai are protected from their atherosclerosis by physical fitness which causes their coronary vessels to be capacious.”

Put this in the context provided in What Causes Heart Disease? by Sally Fallon Morell and Mary Enig: “The factors that initiate a heart attack (or a stroke) are twofold. One is the pathological buildup of abnormal plaque, or atheromas, in the arteries, plaque that gradually hardens through calcification. Blockage most often occurs in the large arteries feeding the heart or the brain. This abnormal plaque or atherosclerosis should not be confused with the fatty streaks and thickening that is found in the arteries of both primitive and industrialized peoples throughout the world. This thickening is a protective mechanism that occurs in areas where the arteries branch or make a turn and therefore incur the greatest levels of pressure from the blood. Without this natural thickening, our arteries would weaken in these areas as we age, leading to aneurysms and ruptures. With normal thickening, the blood vessel usually widens to accommodate the change. But with atherosclerosis the vessel ultimately becomes more narrow so that even small blood clots may cause an obstruction.”

A distinction is being made here that maybe wasn’t being made in the mummy study. What gets measured as atherosclerosis could correlate with diverse health conditions and consequences in various populations across dietary lifestyles, regional environments, and historical and prehistorical periods. Finding atherosclerosis in an individual, especially a mummy, might not tell us any useful info about overall health.

Just for good measure, let’s tackle the last piece of remaining evidence the authors mention: “A previous study using CT scanning showed atherosclerotic calcifications in the aorta of the Iceman, who is believed to have lived about 3200 BCE and was discovered in 1991 in a high snowfield on the Italian-Austrian border.” Calling him the Iceman, to most ears, sounds similar to calling an ancient person a caveman — implying that he was a hunter, for it is hard to grow plants on ice. In response, Paul Mabry, in Did Meat Eating Make Ancient Hunter Gatherers Get Heart Disease, shows what was left out of the research paper:

“Sometimes the folks trying to discredit hunter-gather diets bring in Ötzi, “The Iceman” a frozen human found in the Tyrolean Mountains on the border between Austria and Italy that also had plaques in his heart arteries. He was judged to be 5300 years old making his era about 3400 BCE. Most experts feel agriculture had reached Europe almost 700 years before that according to this article. And Ötzi himself suggests they are right. Here’s a quote from the Wikipedia article on Ötzi’s last meal (a sandwich): “Analysis of Ötzi’s intestinal contents showed two meals (the last one consumed about eight hours before his death), one of chamois meat, the other of red deer and herb bread. Both were eaten with grain as well as roots and fruits. The grain from both meals was a highly processed einkorn wheat bran,[14] quite possibly eaten in the form of bread. In the proximity of the body, and thus possibly originating from the Iceman’s provisions, chaff and grains of einkorn and barley, and seeds of flax and poppy were discovered, as well as kernels of sloes (small plumlike fruits of the blackthorn tree) and various seeds of berries growing in the wild.[15] Hair analysis was used to examine his diet from several months before. Pollen in the first meal showed that it had been consumed in a mid-altitude conifer forest, and other pollens indicated the presence of wheat and legumes, which may have been domesticated crops. Pollen grains of hop-hornbeam were also discovered. The pollen was very well preserved, with the cells inside remaining intact, indicating that it had been fresh (a few hours old) at the time of Ötzi’s death, which places the event in the spring. Einkorn wheat is harvested in the late summer, and sloes in the autumn; these must have been stored from the previous year.””

Once again, we are looking at the health issues of someone eating an agricultural diet. It’s amazing that the authors, 19 of them, apparently all agreed that diet has nothing to do with a major component of health. That is patently absurd. To the credit of The Lancet, they published a criticism of this conclusion (though these critics repeat their own preferred conventional wisdom, in their view on saturated fat) — Atherosclerosis in ancient populations by Gino Fornaciari and Raffaele Gaeta:

“The development of vascular calcification is related not only to atherosclerosis but also to conditions such as disorders of calcium-phosphorus metabolism, diabetes, chronic microinflammation, and chronic renal insufficiency.

“Furthermore, stating that atherosclerosis is not characteristic of any specific diet or lifestyle, but an inherent component of human ageing is not in agreement with recent studies demonstrating the importance of diet and physical activity.[5] If atherosclerosis only depended on ageing, it would not have been possible to diagnose it in a young individual, as done in the Horus study.[1]

“Finally, classification of probable atherosclerosis on the basis of the presence of a calcification in the expected course of an artery seems incorrect, because the anatomy can be strongly altered by post-mortem events. The walls of the vessels might collapse, dehydrate, and have the appearance of a calcific thickening. For this reason, the x-ray CT pattern alone is insufficient and diagnosis should be supported by histological study.”

As far as I know, this didn’t lead to a retraction of the paper. Nor did the criticism receive the attention that the paper itself was given. None of the people who praised the paper bothered to point out the criticism, at least not that I came across. Anyway, how did this weakly argued paper based on faulty evidence get published in the first place? And how does it then get spread by so many as if it were proven fact?

This is the uphill battle faced by anyone seeking to offer an alternative perspective, especially on diet. This makes meaningful debate next to impossible. That won’t stop those like me from slowly chipping away at the vast edifice of the dominant paradigm. On a positive note, it helps when the evidence used against an alternative view, after reinterpretation, ends up being strong evidence in favor of it.

Health From Generation To Generation

Traveling around the world, Weston A. Price visited numerous traditional communities. Some of them hunter-gatherers and others agricultural, including some rural communities in Europe. This was earlier last century when industrialization had yet to take hold in most places, a very different time in terms of diet, even in the Western world.

What he found was how healthy these people were, whether they consumed more or less meat, dairy or not — although none were vegetarian (the typical pre-agricultural diet was about 1/3 to 2/3 animal products, often a large part of it saturated fat). The commonality is that they ate nutrient-dense foods, much of it raw, fermented, or prepared traditionally (the single most nutrient-dense food is organ meats). As a dentist, the first thing Price looked for was dental health. A common feature of these traditional societies was well-developed jaws and bone structure, straight uncrowded teeth, few cavities, facial symmetry, etc. These people never saw a dentist or orthodontist, didn’t brush or floss, and yet their teeth were in excellent condition into old age.

This obviously was not the case with Price’s own American patients who didn’t follow a traditional diet and lifestyle. And when he visited prisons, he found that bone development and dental health were far worse, as indicators of worse general health and, by implication, worse neurocognitive health (on a related note, testing has shown that prisoners have higher rates of lead toxicity, which harms health in diverse ways). Between malnutrition and toxicity, it is unsurprising that there are so many mentally ill people housed in prisons, especially after psychiatric institutions were closed down.

Another early figure in researching diet and health was Francis M. Pottenger Jr, an American doctor. While working as a full-time assistant at a sanatorium, he did a study on cats. He fed some cats a raw food diet, some a cooked food diet, and another group got some of both. He also observed that the cooked food diet caused developmental problems of bone and dental structure. The results were worse than that, though. For the cats fed cooked food, the health of the next generation declined even further. By the third generation, they didn’t reach adulthood. There was no generation after that.

I was reading about this at work. In my normal excitement about learning something new, I shared this info with a coworker, a guy who has some interest in health but is a conventional thinker. He immediately looked for reasons for why it couldn’t be true, such as claiming that the generations of cats kept as pets disproves Pottenger’s observations. Otherwise, so the argument goes, domestic cats would presumably have gone extinct by now.

That was easy to counter, considering most pets are born strays who ate raw food or are born to parents who were strays. As for purebred cats, I’m sure breeders have already figured out that a certain amount of raw food (or supplementation of enzymes, microbes, etc. that would normally be found in raw food) is necessary for long-term feline health. Like processed human food, processed pet food is heavily fortified with added nutrients, which likely counteracts some of the negative consequences of a cooked food diet. Pottenger’s cats weren’t eating fortified cooked food, but neither were the cats fed raw food getting any extra nutrients.

The thing is that prior to industrialization food was never fortified. All the nutrients humans (and cats) needed to not only survive but thrive were available in a traditional, natural diet. The fact that we have to fortify foods and take multivitamins is evidence of something severely wrong with the modern, industrialized food system. But fortification only lessens the health problems slightly. As with Pottenger’s cats, even the cats on a cooked food diet who had some raw food added didn’t avoid severely decreased health. Considering the emerging health crisis, the same appears to be true of humans.

The danger we face is that the effects are cumulative across the generations, the further we get from a traditional diet. We are only now a few generations into the modern Western diet. Most humans were still consuming raw milk and other traditional foods not that long ago. Earlier last century, the majority of Americans were rural and had access to fresh organic food from gardens and farms, including raw milk from pastured cows and fertile eggs from pastured chickens (pastured meaning high in omega-3s).

Even living in a large city, one of my grandfathers kept rabbits and chickens for much of his life and kept a garden into his old age. That means my mother was raised with quite a bit of healthy food, as was my father living in a small town surrounded by farms. My brothers and I are the first generation in our family to eat a fully modern industrialized diet from childhood. And indeed, we have more mental/neurocognitive health problems than the generations before. I had a debilitating learning disorder diagnosed in elementary school and severe depression clearly showing in 7th grade, one brother had stuttering and anxiety attacks early on, and my oldest brother had severe allergies in childhood that went untreated for years and since then has had a host of ailments (also, at least one of my brothers and I have suspected undiagnosed Asperger’s or something like that, but such conditions weren’t being diagnosed when we were in school). One thing to keep in mind is that my brothers and I are members of the generation that received one of the highest dosages of lead toxicity in childhood, prior to environmental regulations limiting lead pollution; and research has directly and strongly correlated that to higher rates of criminality, suicide, homicide, aggressive behavior, impulse control problems, lowered IQ, and stunted neurocognitive development (also many physical health conditions).

The trend of decline seems to be continuing. My nieces and nephews eat almost nothing but heavily processed foods, way more than my brothers and I had in our own childhoods, and the produce they do eat is mostly from nutrient-depleted soil, along with being filled with farm chemicals and hormones — all of this having continuously worsened these past decades. They are constantly sick (often every few weeks) and, even though still in grade school, all have multiple conditions such as: Asperger’s, learning disorder, obsessive-compulsion, failure to thrive, asthma, joint pain, etc.

If sugar were heroin, my nephew could fairly be called a junky (regularly devouring bags of candy and on more than one occasion eating a plain bowl of sugar; one step short of snorting powdered sugar and mainlining high fructose corn syrup). And in making these observations, I speak from decades of experience as a junkfood junky, most of all a sugar addict, though never quite to the same extreme. My nieces too have a tremendous intake of sugar and simple carbs, as their families’ vegetarianism doesn’t emphasize vegetables (since going on the paleo diet, I’ve been eating more organic nutrient-dense vegetables and other wholesome foods than my brothers and their families combined) — yet their diet fits well into the Standard American Diet (SAD) and, as the USDA suggests, they get plenty of grains. I wouldn’t be surprised if one or all of them already has pre-diabetes and will likely get diabetes before long, as is becoming common in their generation. The body can only take so much harm. I know the damage done to my own body and mind from growing up in this sick society and I hate to see even worse happening to the generations following.

To emphasize this point, the testing of newborn babies in the United States shows that they’ve already accumulated on average more than 200 synthetic chemicals from within the womb; and then imagine all the further chemicals they get from the breast milk of their unhealthy mothers along with all kinds of crap in formulas and in their environments (e.g., carcinogenic fire retardants that they breathe 24/7). Lead toxicity has decreased since my own childhood and that is a good thing, but thousands of new toxins and other chemicals have replaced it. On top of that, the hormones, hormone mimics, and hormone disruptors add to dysbiosis and disease — some suggesting this is a cause of puberty’s greater variance than in past generations, either coming earlier or later depending on gender and other factors (maybe partly explaining the reversal and divergence of educational attainment for girls and boys). Added to this mix, this is the first generation of human guinea pigs to be heavily medicated from childhood, much of it medications that have been shown to permanently alter neurocognitive development.

A major factor in many modern diseases is inflammation. This has many causes, from leaky gut to toxicity, the former related to diet and often contributing to the latter (the leaky gut allows molecules to more easily cross the gut lining and get into the bloodstream, where they can freely travel throughout the body — causing autoimmune disorders, allergies, asthma, rheumatoid arthritis, depression, etc). But obesity is another main cause of inflammation. And one might note that, when the body is overloaded and not functioning optimally, excess toxins are stored in fat cells — which makes losing weight even more difficult, as toxins are released back into the body and, if not flushed out, cause one to feel sick and tired.

It’s not simply bad lifestyle choices. We are living in unnatural and often outright toxic conditions. Many of the symptoms that we categorize as diseases are the body’s attempt to make the best of a bad situation. All of this adds up to dysfunction at a societal level. Our healthcare system is already too expensive for most people to afford. And the largest part of public funding for healthcare is going to diabetes alone. But the saddest part is the severe decrease in quality of life, as the rate of mood and personality disorders skyrockets. It’s not just diet. For whatever reason (toxins? stress?), with greater urbanization has come greater levels of schizophrenia and psychosis. And autism, a rare condition in the past, has become highly prevalent (by the way, one of the proven effective treatments for autism is a paleo/keto diet; also effective for autoimmune conditions among much else).

It’s getting worse and worse, generation after generation. Imagine what this means in terms of epigenetics and transgenerational trauma, as nutritional deficits and microbiotic decimation accumulate, exacerbated by a society driven mad through inequality and instability, stress and anxiety. If not for nutrients added to our nutrient-poor food and supplements added to our unhealthy diet, we’d already be dying out as a society and our civilization would’ve collapsed along with it (maybe similar to how some conjecture the Roman Empire weakened as lead toxicity increased in the population). Under these conditions, that children are our future may not be an affirmation of hope. Nor may these children be filled with gratitude once they’ve reached adulthood and come to realize what we did to them and the world we left them. On the other hand, we aren’t forced to embrace fatalism and cynicism. We already know what to do to turn around all of these problems. And we don’t lack the money or other resources to do what needs to be done. All that we are waiting for is public demand and political will, although that might first require our society reaching a point of existential crisis… we are getting close.

The stumbling block is that there is no profit in the ‘healthcare’ industry for advocating, promoting, incentivizing, and ensuring a healthy diet and healthy conditions for a healthy population. Quite the opposite. If disease profiteering were made illegal, there would be trillions of dollars of lost profit every year. Disease is the reality of capitalist realism, a diseased economic system and social order. This collective state of sickliness has become the norm, and vested interests will go to great lengths to defend the status quo. But most who benefit from the dysfunctional and destructive system never have to give it much thought. When my mother brought my nephew to the doctor, she pointed out how he is constantly sick and constantly eating a poor diet. The doctor’s response was that this was ‘normal’ for kids (these days), which might be true, but the doctor should be shocked and shamed by his own admission. As apathy takes hold and we lose a sense of hope, low standards fall ever lower.

We can’t rely upon the established authority figures in seeking better health for ourselves, our families, and our communities. We know what we need to do. It might not be easy to make such massive changes when everything in society is going against you. And no doubt it is more expensive to eat healthy when the unhealthiest foods (e.g., high fructose corn syrup) are being subsidized by the government. It’s no accident that buying off the dollar menu at a fast food restaurant is cheaper than cooking a healthy meal at home. Still, if you are willing to go to the effort (and it is worth the effort), a far healthier diet is possible for many within a limited budget. That is assuming you don’t live in a food desert. But even in that case, there is a movement to create community gardens in poor neighborhoods, people providing for themselves what neither the government nor the economy will provide.

Revolutions always begin from the bottom up. Or failing that, the foundations of our society will crumble, as the health of our citizenry declines. It’s a decision we must make, individually and collectively. A choice between two divergent paths leading to separate possible futures. As we have so far chosen suicidal self-destruction, we remain free to choose the other option. As Thomas Paine said, “We have it in our power to begin the world over again.”

* * *

Primal Nutrition
by Ron Schmid, ND
pp. 99-100

Parallels Between Pottenger’s and Price’s Work

While the experiments of McCarrison and Pottenger show the value of raw foods in keeping animals remarkably healthy, one might wonder about the relevance to human needs. Cats are carnivores, humans omnivores, and while the animals’ natural diet is raw, humans have cooked some foods for hundreds of thousands of years. But humans, cats, and guinea pigs are all mammals. And while the human diet is omnivorous, foods of animal origin (some customarily eaten raw) have always formed a substantial and essential part of it.

Problems in cats eating cooked foods provided parallels with the human populations Weston Price studied; the cats developed the same diseases as humans eating refined foods. The deficient generation of cats developed the same dental malformations that children of people eating modernized foods developed, including narrowing of dental arches with attendant crowding of teeth, underbites and overbites, and protruding and crooked teeth. The shape of the cat’s skull and even the entire skeleton became abnormal in severe cases, with concomitant marked behavioral changes.

Price observed these same physical and behavioral changes in both native and modern cultures eating refined foods. These changes accompanied the adoption by a culture of refined foods. In native cultures eating entirely according to traditional wisdom resulted in strength of character and relative freedom from the moral problems of modern cultures. In modern cultures, studies of populations of prisons, reformatories, and homes for the mentally delayed revealed that a large majority of individuals residing there (often approaching 100 percent) had marked abnormalities of the dental arch, often with accompanying changes in the shape of the skull.

This was not coincidence; thinking is a biological process, and abnormal changes in the shape of the skull from one generation to the next can contribute to changes in brain functions and thus in behavior. The behavioral changes in deficient cats were due to changes in nutrition. This was the only variable in Pottenger’s carefully controlled experiments. As with physical degenerative changes, parallels with human populations cannot help but suggest themselves, although the specific nature of the relationship is beyond the scope of this discussion.

Human beings do not have the same nutritional requirements as cats, but whatever else each needs, there is strong empirical evidence that both need a significant amount of certain high-quality raw foods to reproduce and function efficiently.

pp. 390-393

Certain groups of these cats were fed quality, fresh, undenatured food and others were fed varying degrees of denatured and processed food, then the effects were observed over several generations. The results from the inferior diets were not so startling for the first-generation animals but markedly and progressively so in subsequent generations. From the second generation on, the cats that were fed processed and denatured diets showed increasing levels of structural deformities, birth defects, stress-driven behaviors, vulnerability to illness, allergies, reduced learning ability, and, finally, major reproductive problems. When Pottenger attempted to reverse the effects in the genetically weakened and vulnerable later-generation animals with greatly improved diet, he found it took fully four generations for the cats to return to normal.

The reflections that Pottenger’s work casts on the health issues and dietary habits of modern-day society are glaring and inescapable. […]

Pottenger’s work has shown us that progressive generations with poor dietary habits result in increasingly more vulnerable progeny and that each subsequent generation with unhealthy dietary habits results in impaired resistance to disease, increasingly poor health and vitality, impaired mental and cognitive health, and impaired capacity to reproduce. It is all part of what we are seeing in our epidemic levels of poor health and the overwhelming rates of autism, violence, attentional disorders, childhood (and adult) behavioral problems, mental illness, fertility issues, and birth defects.

What Causes the Poor Health of the Poor?

What is the cause of unhealthy eating? Richard Florida attempts to answer that, based on new research. He makes some good points, but maybe he is missing some of the context.

Let me begin with one thing that seems correct in the analysis. He points out that food deserts are found in poor areas. Dietary health is an issue of socioeconomic class. Yet even when better food is made available, either by grocery stores opening or by people moving, the foods people buy don’t tend to change. That isn’t surprising, nor particularly insightful. If changes were to happen, they would happen across generations, as they always do. It took generations to create food deserts and the habits that accompany them. So it would take generations to reverse the conditions that created the problem in the first place.

Florida sort of agrees with this, even though he doesn’t seek to explain the original cause and hence the fundamental problem. Instead, he points to the need for better knowledge by way of educating the public. What this overlooks is that in generations past there were much better eating habits, which were altered by a combined effort of government dietary recommendations and corporate advertising, that is to say an alliance of big government and big business, an alliance that did great harm to public health (from the largely unhelpful food pyramid to the continuing subsidization of corn and corn syrup). The bad eating habits that poor Americans have now come from what was taught and promoted over the past century, diet info and advice that in many cases has turned out to be harmfully wrong.

Florida sees this as being more about culture, as related to knowledge. It’s those dumbfucks in rural middle America who need to be taught the wisdom of the coastal elites. It’s a liberal’s way of speaking about ‘poor culture’, a way of blaming the poor while throwing in some paternalistic technocracy. The healthy middle-to-upper classes have to teach the poor how to have healthy middle-to-upper class habits. Then all of society will be well. (Not that the conservative elite are offering anything better with their preference of maintaining oppressive conditions, just let the poor suffer and die because they deserve it.)

Florida’s solution ignores a number of factors, such as costs. When I lived below the poverty level, I bought the cheapest food available, which included some frozen vegetables but lots of cheap carbs (e.g., Ramen noodles) and cheap proteins (e.g., eggs) along with cheap junk food (e.g., Saltine crackers) and cheap fast food (e.g., 2 egg and sausage biscuits for $2). Crappy food is extremely inexpensive, a motivating factor for anyone living hand-to-mouth or paycheck-to-paycheck. It’s about getting the most calories for the buck. The healthiest food tends to be the most expensive and least filling (e.g., kale). I couldn’t afford many fresh fruits and vegetables back when I was barely making ends meet, a time when I was so lacking in excess money that I was forced to skip meals on a regular basis. Even when there isn’t a drastic difference in costs, saving a few bucks when shopping adds up for the poor.
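To make the calories-for-the-buck logic concrete, here is a minimal back-of-the-envelope sketch in Python. All of the prices and calorie counts are hypothetical round numbers chosen for illustration, not figures from any source cited in this post:

```python
# Rough calories-per-dollar comparison.
# All prices and calorie counts are hypothetical, for illustration only.
foods = {
    "ramen (1 pack)":   {"kcal": 380,  "price": 0.30},
    "eggs (1 dozen)":   {"kcal": 860,  "price": 2.00},
    "saltines (1 box)": {"kcal": 1680, "price": 1.50},
    "kale (1 bunch)":   {"kcal": 100,  "price": 3.00},
}

# Sort from most to least calories per dollar.
for name, f in sorted(foods.items(), key=lambda kv: -kv[1]["kcal"] / kv[1]["price"]):
    print(f"{name:18s} {f['kcal'] / f['price']:7.0f} kcal per dollar")
```

Run with numbers anything like these, the cheap carbs deliver calories at ten to forty times the rate of the kale, which is the entire budgetary logic of eating poor in one screenful.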

Someone like Florida is unlikely to understand this. Maybe the middle-to-upper class could use some education themselves so as to comprehend the lived reality of the poor. But no doubt more and better knowledge should be made available to everyone, no matter their class status. We have way too much ignorance in this society, sadly with much of it to be found among those making public policies and working in corporate media. So, yes, we have room for improvement on this level for all involved. I’m just not sure that is the fundamental issue, as the state of knowledge at present is a result of the state of society, which is to say it is more of a systemic than individual problem.

Anyway, some of the data in the research cited seems a bit off or misleading. Florida’s article includes a map of “Average Health Index of Store Purchases by County.” It doesn’t entirely match various other mapped data for health such as infant mortality. The Upper Midwest, for example, looks rather mixed in terms of the store purchases by county while looking great according to other health indicators. Also, the Upper Midwest has low rates of food deserts, which is supposedly what Florida was talking about. Even though there are poor rural areas in the Upper Midwest, there are also higher rates of gardening and farmers’ markets.

Also, it must be kept in mind that most people in rural states don’t actually live in rural areas, as would have been the case earlier last century. Mapping data by county is misleading because a minority of the population visually dominates the map. The indicators of health would be lower for that minority of rural county residents, even as the indicators of health are high for the entire state population mostly concentrated in specific counties. In that case, it is socioeconomic conditions combined with geographic isolation (i.e., rural poverty) that pose the challenge, along with entire communities slowly dying and entire populations aging as the youth and young families escape. That is no doubt problematic, although limited to a small and rapidly shrinking demographic. The majority of the Upper Midwestern lower classes are doing fairly well, living in or around urban centers. But of course, this varies by region. No matter the data, the Deep South almost always looks bad. That is a whole other can of worms.
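The county-map problem can be shown with a toy calculation. In the sketch below, every number is made up for illustration: a few sparsely populated counties dominate the map visually, while the population-weighted average, which reflects where people actually live, tells a different story.

```python
# Toy example: unweighted county averages vs. a population-weighted
# state average. All figures are invented for illustration.
counties = [
    # (name, population, hypothetical health index)
    ("rural A",   5_000, 40),
    ("rural B",   8_000, 45),
    ("rural C",   6_000, 42),
    ("urban D", 900_000, 80),
]

# What a county-shaded map suggests: three of four counties look unhealthy.
unweighted = sum(score for _, _, score in counties) / len(counties)

# What the state population actually experiences.
total_pop = sum(pop for _, pop, _ in counties)
weighted = sum(pop * score for _, pop, score in counties) / total_pop

print(f"unweighted county average:   {unweighted:.1f}")  # ~51.8
print(f"population-weighted average: {weighted:.1f}")    # ~79.2
```

Three of the four counties look bad on the map, yet nearly 98 percent of this toy state’s population lives in the one healthy county, so the county-level picture and the population-level picture diverge sharply.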

That set of issues is entirely ignored by Florida’s article, which severely limits his analysis. Those particular Middle Americans don’t need condescension from the coastal elites. What is missing is that poverty and inequality are also lower in places like the Upper Midwest, while being higher elsewhere, such as the Deep South. Socioeconomic conditions correlate strongly with all aspects of health, physical and mental, just as they correlate with all aspects of societal health (e.g., rates of violent crime).

I applaud any attempt at new understanding, but not all attempts are equal. Part of my complaint is directed toward the conservative-mindedness of much of the liberal class. There is too often a lack of concern for, and lack of willingness to admit to, the worst systemic problems. Let me give a quick example. Florida wrote another article, also for City Lab, in which he discusses the great crime decline and the comeback of cities. Somehow, he manages to entirely avoid the topic of lead toxicity, the single most well-researched and greatest proven factor in the crime decline. This kind of omission is sadly common from this kind of public intellectual. In constantly skirting around the deeper issues, it’s a failure of intellect and morality going hand in hand with a failure of insight and imagination.

To return to the original topic, I suspect that there are even deeper issues at play. Inequality is definitely important. Then again, so is segregation, distrust, and stress. Along with all of this, loss of community and social capital is immensely important. To understand why inequality matters, an analysis has to dig deeper into what inequality means. It isn’t merely an economic issue. Inequality of income and wealth is inequality of education and time, as the author notes, while it is also inequality of political power, public resources, and individual opportunities. A high inequality society causes dysfunction, especially for the poor, in a thousand different ways. Inequality is less of a cause than the outward sign of deeper problems. Superficial attempts at solutions won’t be helpful.

In response to why the poor eat less well, one commenter suggested that, “It’s literally because they have less time,” and offered supporting evidence. The linked article was about data showing the poor don’t eat more fast food than the wealthier do, despite more fast food restaurants being located near the poor, which implies the wealthier are more willing, or more able, to travel farther to eat fast food at the same rate. What the article also shows is that it is busy people, whatever their socioeconomic class, who eat the most fast food.

That is a key point to keep in mind. In that context, another commenter disagreed and pointed to other data. The counter-claim was that the lower classes have more leisure time. I dug into that data, and other data besides, and had a different take on it. I’ll end with my response from the comments section:

A superficial perusal of cherry-picked data isn’t particularly helpful. Show me where you included commute time, childcare, eldercare, housework, house maintenance, yard work, etc. The linked data doesn’t even have a category for commute time and it doesn’t disaggregate specific categories of non-employment work according to income, occupation, or education.

Also, what is called leisure is highly subjective. The wealthier have greater freedom to take relaxing breaks while at work or to eat a healthy meal out at a restaurant for lunch, none of which gets listed as leisure. Wealthier people have lives that are more leisurely in general, even when it doesn’t involve anything they would explicitly perceive and self-report as leisure. They are more likely to be able to choose their own work schedule, such as sleeping in later if they want (e.g., because of sickness) or leaving work early when needed (e.g., to bring their child to an event). They might be puttering around the house, which they consider work, while the nanny takes care of the kids and the maid cleans the house. What a wealthier person considers work a poor person might consider leisure.

None of that is accounted for in the data you linked to. And it doesn’t offer strong, clear support for your conclusions. Obviously, something is getting lost in how people calculate their own leisure in self-reported data. It shows that the poorer someone is, the more likely they are to go to work at a place of employment on an average day (and, for some odd reason, the income breakdown covers “single jobholders only”): 93.9% of those making $0 – $580, 90.6% of those making $581 – $920, 85.4% of those making $921 – $1,440, and 78% of those making $1,441 and higher. Imagine if they included all the lower class people working multiple jobs (the data doesn’t list any categories that combine income bracket and number of jobs).

To put those who work on an average day in the context of occupation: 75.5% of management, business, and financial operations; 76.9% of professional and related; 90.1% of construction and extraction; 93.2% of installation, maintenance, and repair; 91.2% of production; and 88.7% of transportation and material moving. Or break it down by education, which strongly correlates with income bracket: 85.5% of those with less than a high school diploma, 89.8% of high school graduates with no college, 85.4% of those with some college or an associate degree, 77.2% of those with a bachelor’s degree only, and 70.7% of those with an advanced degree. The contrast is even more stark between two other categories: 85.6% of wage and salary workers versus 49.9% of self-employed workers.
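As an aside for anyone who wants to see the gradient at a glance, here is a minimal Python sketch that simply tabulates and sorts the education shares quoted above. The figures are copied from the comment itself; attributing them to a BLS time-use-style survey is my reading of the category names, not something the linked data confirms.

```python
# Shares quoted above: percent doing employment-related work
# at a workplace on an average day, by educational attainment.
# Figures as given in the comment; source attribution is assumed.
shares_by_education = {
    "Less than high school diploma":    85.5,
    "High school graduate, no college": 89.8,
    "Some college or associate degree": 85.4,
    "Bachelor's degree only":           77.2,
    "Advanced degree":                  70.7,
}

# Sorting highest-to-lowest makes the pattern explicit: the share
# falls off sharply once education rises past high school.
for group, pct in sorted(shares_by_education.items(), key=lambda kv: -kv[1]):
    print(f"{pct:5.1f}%  {group}")
```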

No matter how you slice and dice the data, non-professionals with less education and income are precisely those most likely to do employment-related work on an average day. That is to say, they are more likely not to be at home, the typical location of most leisure activities. It’s true the wealthier and better educated like to describe themselves as working a lot even when at home, but it’s not clear what that might or might not mean in terms of actual activities. Self-report data is notoriously unreliable, as it is based on self-perception and self-assessment.

Besides, anyone who knows anything about social science research knows that there are a lot of stresses involved in a life of poverty, far beyond less time, although that is significant. There is, of course, the lack of wealth and resources, which is a major factor. Plus, there are such things as physical stresses, from lack of healthcare to high rates of lead toxicity. Living in a food desert and being busy are among the lesser worries of the working poor.

To return to the work angle, I would also add that the poor are more likely to work multiple shifts in a row, to work irregular or unpredictable schedules (being on call, split or rotating shifts, etc.), to work on the black market (doing yard work for cash, bartering one’s time and services in the non-cash economy, etc.), and probably to have a higher number of family members, such as teens, working in some capacity (paid and/or helping at home, formal and/or informal work). There is also the number of hours spent looking for work, a major factor considering the growing gig economy. And what about the stress and uncertainty for the increasing number of people working minimum-wage jobs (many employees at Walmart and Amazon) who make so little that they have to be on welfare just to make ends meet?

None of this is found in the data you linked. In general, it’s hard to find high-quality, detailed data on this kind of thing. But there is plenty of data indicating the complicating and confounding factors.

Survey: More Than One-Third Of Working Millennials Have A Side Job
by Renee Morad

“[The] majority of workers taking on side gigs (68%) are making less than $50K a year.”

Millennials Significantly Outpacing Other Age Groups for Taking on Side Gigs
by Michael Erwin

“Workers of all income levels are taking on side work. Nearly 1 in 5 workers making more than $75k (18 percent) and 12 percent of those making more than $100k currently have a gig outside of their full time job. This is compared to a third of workers making below $50k (34 percent) and 34 percent earning below $35k.”

Who Counts as Employed? Informal Work, Employment Status, and Labor Market Slack
by Anat Bracha and Mary A. Burke

“Among informal participants who experienced a job loss or other economic loss during or after the Great Recession, 40 percent report engaging in informal work out of economic necessity, and 8.5 percent of all informal workers report that they would like to have a formal job. However, about 70 percent of informal work hours offer wages that are similar to or higher than the same individual’s formal wage.

“[…] informal work participation complicates the official U.S. measurement of employment status. In particular, a significant share of those who report that they are currently engaged in informal work also report separately that they performed no work for pay or profit in the previous week. In light of such potential underreporting of informal work, the BLS’s official labor force participation rate might be too low by an economically meaningful (if modest) margin, and the share of employed workers with full-time hours is also likely to be higher than is indicated by the official employment statistics.”

What Is the Informal Labor Market?
by Paulina Restrepo-Echavarria

“Survey of Informal Work Participation within the Survey of Consumer Expectations revealed that about 20 percent of non-retired adults at least 21 years old in the U.S. generated income informally in 2015. The share jumped to 37 percent when including those who were exclusively involved in informal renting and selling activities.

“When breaking down the results by the Bureau of Labor Statistics (BLS) employment categories, about 16 percent of workers employed full time participated in informal work. Not surprisingly, the highest incidence of informal work was among those who are employed part time for economic reasons, with at least 30 percent participating in informal work. Also, at least 15 percent of those who are considered not in the labor force by the BLS also participated in informal work.

“[…] Enterprising and Informal Work Activities (EIWA) survey, which revealed that 36 percent of adults in the U.S. (18 and older) worked informally in the second half of 2015. Of these informal workers, 56 percent self-identified as also being formally employed, and 20 percent said they worked multiple jobs (including full-time and part-time positions).

“[…] There were slightly more women than men among informal workers, though the share of women was much larger in lower income categories.

“The majority of informal workers were white, non-Hispanic (64 percent), while the share of Hispanic workers tended to be slightly higher than that of African-Americans (16 and 12 percent, respectively). The racial breakdowns were consistent across most income categories, with a higher incidence of informal work among minorities in the lowest income categories.”

Irregular Work Scheduling and Its Consequences
by Lonnie Golden

– “By income level, the lowest income workers face the most irregular work schedules.”
– “Irregular shift work is associated with working longer weekly hours.”
– “Employees who work irregular shift times, in contrast with those with more standard, regular shift times, experience greater work-family conflict, and sometimes experience greater work stress.”
– “The association between work-family conflict and irregular shift work is particularly strong for salaried workers, even when controlling for their relatively longer work hours.”
– “With work hours controlled for, having a greater ability to set one’s work schedule (start and end times and take time off from work) is significantly associated with reduced work-family conflict.”