A Plutocrat Criticizing Plutocrats in Defense of Plutocracy

On C-SPAN’s After Words, Koch lobbyist and Catholic conservative Matt Schlapp interviewed self-avowed elitist Tucker Carlson from Fox News. The occasion for the interview was Carlson’s new book, Ship of Fools. I don’t know much about him, nor have I read his book. The only reason I watched it was that my dad cajoled me into doing so. Even though my dad strongly dislikes Carlson’s new show, he took this interview as important and to the point. I might agree.

Carlson regularly states that he isn’t that smart and he is right. His intellect is rather mundane, he offers no new insights, and he admits that he was wrong about so much of what he has believed and supported. But what makes the interview worthwhile is that, if one ignores the right-wing talking points, he expresses something resembling honesty. He poses as a humble Christian speaking the truth and, as easy as it would be to dismiss him, I’m feeling generous in taking him at face value for the moment.

Much of what he says has been said better by left-wingers for generations. Some of these criticisms are so typical of the far left that, in the Democratic Party, they are beyond the pale. The message is essentially the same as that of Nick Hanauer, another rich white guy, warning about the pitchforks coming for plutocrats (Hanauer once said of his fellow Democrat and former business associate, Jeff Bezos, that he’ll do the right thing when someone points a gun at his head). Not that long ago, if Carlson had heard someone say what he is saying now, he would have called that person radical, un-American, and maybe evil. Instead, as a defender of capitalism, he literally called evil those CEOs who wreck their corporations and then take large bonuses.

This is drawing a line in the sand. It is the conviction that there is a moral order that trumps all else. He didn’t say that these money-mongers are psychopathic, narcissistic, or Machiavellian. Such terms have no moral punch to them. Carlson didn’t merely call something bad or wrong but evil. And he didn’t say he hated the sin but loved the sinner. No, these corrupt and selfish individuals were deemed evil, the ultimate moral judgment. When I pointed out this strong language to my dad, he said it was in line with his own Christian views.

For many conservatives and also for many establishment liberals, this is a rare moment when they might hear this message in the corporatist media, whether or not they listen. If they won’t pay attention to those who have been warning about this sad state of affairs for longer than I’ve been alive, let us hope they will finally take notice of those in positions of wealth, power, and authority when they say the exact same thing.

Tucker Carlson is basically telling the ruling elite that the game is up. The only reason he is warning his fellow plutocrats, as he states in no uncertain terms, is that he fears losing his comfortable lifestyle if the populists gain power. And his fear isn’t idle, considering that a while back protesters gathered outside of his house and chanted, “Tucker Carlson, we will fight! We know where you sleep at night!” The natives are restless. I guess he is hoping for a plutocrat like Theodore Roosevelt to ride into power and then rein in the worst aspects of capitalism in order to prop it up for another generation or two.

Good luck with that…

Misreading the Misreadings of History

“A majority of decent well-meaning people said there was no need to confront Hitler…. When people decided to not confront fascism, they were doing the popular thing, they were doing it for good reasons, and they were good people…but they made the wrong decision.”

Tony Blair spoke those words as UK Prime Minister in 2003. And the supposedly Hitler-like figure he alluded to was Saddam Hussein. I ran across this quote in a piece from The Wall Street Journal, The Trouble With Hitler Analogies by Zachary Karabell. I’d instead point out the trouble with those who feel troubled. The critic here, if he was like most in the mainstream media at the time, beat the drums for war in attacking Iraq.

That war, if you can call it that when from a safe distance the most powerful countries in the world bomb a small country to oblivion, was a war of aggression. It was illegal according to both US law and international law, whatever its legal standing may have been in the UK. Besides, the justification for the military attack on Iraq was based on a lie and everyone knew it was a lie, that is to say we have long been in a post-truth age (numerous military conflicts in US history were based on lies, from the Cold War to the Vietnam War).

Hussein had no weapons of mass destruction. The only reason we could even suggest that he did was that we had sold some to him back in the 1980s. But there was no way those weapons would still have worked at that point, since such chemical and biological agents have a short shelf life. Worse still, whatever horrible things Saddam did in recent years, they pale in comparison with the horrible things he did while he was our ally. He used those weapons against his own people back then, and the US military stood by and did nothing, the US government implicitly supporting his authoritarian actions, as has long been the established pattern of US foreign relations. Our authoritarian allies commit horrific atrocities all the time, for their own purposes and sometimes on our behalf.

What makes Blair’s statement morally demented is that he was speaking as an authoritarian imperialist flexing his muscles. Hussein, as a petty tyrant, was getting uppity and needed to be put back in his place. It had nothing to do with freedom and democracy, any more than World War II was motivated by such noble ideals. The US and UK went up against the Nazis because Germany (along with Japan) was a competing empire that became a direct threat, nothing less and nothing more. Having won that global conflict, the US then became a fully global empire and the greatest authoritarian power in history. Fascism wasn’t defeated, though, as the US became even more fascist over the following generations. This had to do with political elites such as the Bush family, which made its wealth doing business with Nazis and later helped Nazi war criminals evade justice by putting them to work for the US government.

Here is the real problem with Hitler analogies. If Hitler were here today and he had a different name, he would either smear his opponents as being like Hitler or he would dismiss those who dared to make such comparisons. Either way, the purpose would be to muddy the water and make impossible any public discussion and moral accounting. It is interesting to note that the author of the WSJ article indirectly defends the authoritarian forces in our society by blaming those who call names:

“Contesting today’s populist strongmen doesn’t require calling them fascists, a label that often deepens the anger and alienation of their followers. The only thing worse than forgetting history is using it badly, responding to echoes of the past with actions that fuel today’s fires rather than douse them.”

Basically, don’t antagonize the authoritarians or they might get mean. Well, they’re already mean. It’s too late for that. It’s another example of someone demanding moderation in a society that has gone mad. As I often wonder, moderate toward what? If we can’t even call authoritarianism what it is as authoritarians rise to power, then what defense is there against what is taboo to speak of? There is none. That is the point. This is how authoritarianism takes hold in a society.

But to the author, one suspects that is not necessarily a bad thing. Authoritarianism, in this worldview, might be fine as long as it is used wisely and the mob is kept in check. The problem with the second Iraq War, in this view, wasn’t that it was authoritarian but only that it failed in its own stated authoritarian agenda. What can’t be mentioned is the historical analogy of Hitler also failing in his authoritarian agenda when he turned to wars of aggression in a bid to assert imperial rule. The analogy, of course, ends there for the moment. That is because, unlike Nazi Germany, 21st century America doesn’t quite have the equivalent of an opposing power also aspiring to empire. Not yet. But Russia and China, if and when World War III begins, probably will be willing to play the role.

Ideasthesia

Ideasthesia
from Wikipedia

Ideasthesia (alternative spelling ideaesthesia) is defined as a phenomenon in which activations of concepts (inducers) evoke perception-like experiences (concurrents). The name comes from Ancient Greek ἰδέα (idéa) and αἴσθησις (aísthēsis), meaning “sensing concepts” or “sensing ideas”. The main reason for introducing the notion of ideasthesia was the problems with synesthesia. While “synesthesia” means “union of senses”, empirical evidence indicated that this was an incorrect explanation of a set of phenomena traditionally covered by this heading. Syn-aesthesis, denoting also “co-perceiving”, implies the association of two sensory elements with little connection to the cognitive level. However, according to others, most phenomena that have inadvertently been linked to synesthesia in fact are induced by the semantic representations. That is, the meaning of the stimulus is what is important rather than its sensory properties, as would be implied by the term synesthesia. In other words, while synesthesia presumes that both the trigger (inducer) and the resulting experience (concurrent) are of sensory nature, ideasthesia presumes that only the resulting experience is of sensory nature while the trigger is semantic. Meanwhile, the concept of ideasthesia developed into a theory of how we perceive and the research has extended to topics other than synesthesia — as the concept of ideasthesia turned out to be applicable to our everyday perception. Ideasthesia has even been applied to the theory of art. Research on ideasthesia bears important implications for solving the mystery of human conscious experience, which, according to ideasthesia, is grounded in how we activate concepts.

What Is “Ideasthesia” And Why Is It Important To Know?
by Faena Aleph

Many of us speak metaphorically when we describe a color as “screaming” or a sound as “sharp”. These are synesthetic associations we all experience, whether we know it or not, but we pronounce them literally because it makes enough sense to us.

But synesthesia, which is one of the most charming sensory phenomena, has been overly studied and illustrated by many artists. Today, however, a fascinating aspect of this bridge between senses is being discovered: ideasthesia.

Danko Nikolic, a brain researcher from the Max Planck Institute, has proposed this theory, which questions the reality of two philosophical premises: 1) the duality of mind and body, and 2) the divide between the perception of senses and ideas. His research suggests that, for starters, these dualities might not exist.

Broadly speaking, ideasthesia is a type of bridge that metaphorically links rational abstractions, i.e., ideas, with sensory stimuli in a dynamic catalyzed by language. Nevertheless, the best way of understanding “ideasthesia” is through a TED talk that Nikolic himself recently gave. And, be warned, his theory might just change your paradigms from their foundation and reinforce the beliefs that Walt Whitman anticipated over a hundred years ago.

Ideasthesia — Art, Genius, Insanity, and Semiotics
by Totem And Token

…the notion of ideasthesia — that one can feel or physically experience an idea. Instead of a letter or a sound or a single word as being physically felt, an entire idea or construct or abstract is experienced phenomenologically.

But this seems abstract in and of itself, right? Like, what would it mean to ‘feel’ an idea? The classic example, linked to here, would be to imagine two shapes. One is a curvy splatter, kind of like the old 90s Nickelodeon logo, and the other is an angular, jagged, pointy sort of shape. Which would you call Kiki and which would you call Bouba?

An overwhelming majority (95% according to one source) would say that the splatter is Bouba and the pointy thing is Kiki.

But why though?

Bouba and Kiki are random sounds, absolutely meaningless, and the figures are similarly meaningless. Some contend that it is a linguistic effect, since ‘K’ is an angular letter and ‘B’ is more rounded. Yet, there seems to be a consensus on which is which, even cross-culturally to some extent. Because just the idea of the pointy shape feels like a Kiki and the blobbier shape feels like a Bouba.

Another way I think it is felt is when we talk about highly polarizing topics, often political or religious in nature. In the podcast You Are Not So Smart, David McRaney talks about being confronted with a differing viewpoint as having a gut-wrenching, physical effect on him. Researchers pointed out that the feeling is so strong that it actually elicits a fight-or-flight response.

But it’s just words, right? It’s not like someone saying “I don’t believe in universal healthcare” or “You should have the right to pull the plug in a coma” actually makes it so, or will cause it to happen to you. It is simply one person’s thought, so why does it trigger such a deep-seated emotion? The researchers in the episode hypothesize that the core ideas are related to your identity, which is being threatened, but I think the explanation is somewhat simpler and stranger.

It’s because the ideas actually feel dangerous to you.

This is why what feels perfectly rational to you feels irrational to others.

It also makes more sense when talking about geniuses or highly gifted individuals. Although they exist, the Dr. House-type hyper-rational savants aren’t usually what you hear about when you look at the biographies of highly intelligent or gifted people. Da Vinci, Goethe, Tesla, Einstein and others all seem to describe an intensely phenomenological approach to creating their works.

Even what are traditionally considered more rational pursuits, like math, have occasional introspective debates about whether string theory or higher-order mathematics is created or discovered. This seems like a question about whether one feels out a thought or collects and constructs evidence to make a case.

What’s more, while I think most people can feel an idea to some extent (kiki vs bouba), gifted people and geniuses are more sensitive to these ideas and can thus navigate them better. Sensitivity seems to really be the hallmark of gifted individuals, so much so that I remember reading about how some gifted students have to wear special socks because the inner stitching was too distracting.

I remember when I was younger (around elementary school) there was a girl in our school’s gifted program whom no one could stand. She seemed to have a hair trigger and would snap at just about anything. I realize now that she was simply incredibly sensitive to other children and didn’t really know how to handle it maturely.

I can imagine that this sort of sensitivity, applied to ideas and thought processes, might actually be a big reason why geniuses can handle seemingly large and complex thoughts that are a struggle for the rest of us — they aren’t just thinking through it, they are also feeling their way through it.

It may offer insight into the oft-observed correlation between madness and intellect. Maybe that’s what’s really going on in schizophrenia. It’s not just a disconnect of thoughts, but an oversensitivity to the ideas that breed those thoughts that elicits instinctive, reactionary emotions much like our fight-or-flight responses to polarizing thoughts. The hallucinations are another manifestation of the weaker sensory experience of benign symbols and thoughts.

The Secret of Health

I’m going to let you in on a secret. But before I get to that… There is much conflict over diet. Many will claim that their own is the one true way. And some do have more research backing them up than others. But even that research has been extremely limited and generally of low quality. Hence, all the disagreement and debate.

There have been few worthwhile studies where multiple diets are compared on equal footing. And the results are mixed. In some studies, vegetarians live longer; in others, they live shorter lives. Well, it depends on what kind of vegetarian diet in what kind of population and compared against which other diet or diets. The Mediterranean diet has also shown positive results, and the Paleo diet has as well, although most often the comparison is against a control group that isn’t on any particular diet.

It turns out that almost any diet is better than the Standard American Diet (SAD). Eating dog shit would be an improvement over what the average American shoves into their mouth-hole. I should know. I shudder at the diet of my younger days, consisting of junk food and fast food. Like most Americans, I surely used to be malnourished, along with likely having leaky gut, inflammation, insulin resistance, toxic overload, and who knows what else. Every change I’ve made in my diet over the years has been beneficial.

So, here is the great secret. It matters less which specific diet you follow, in the general sense. That is particularly true in decreasing some of the worst risk factors. Many diets can help you lose weight and such, from low fat to high fat, from omnivore to vegetarian. That isn’t to say all diets are equal in the long term, but there are commonalities to be found in any healthy diet. Let me lay it out. All healthy diets do some combination of the following.

Eliminate or lessen:

  • processed foods
  • vegetable oils
  • carbs, especially simple carbs
  • grains, especially wheat
  • sugar, especially fructose
  • dairy, especially cow milk
  • foods from factory-farmed animals
  • artificial additives

Emphasize and increase:

  • whole foods
  • omega-3s, including but not limited to seafood
  • fiber, especially prebiotics
  • probiotics, such as fermented/cultured foods
  • foods that are organic, local, and in season
  • foods from pasture-raised or grass-fed animals
  • nutrient-density
  • fat-soluble vitamins

There are some foods that are harder to categorize. Even though many people have problems with cow milk, especially of the variety with A1 casein, more people are able to deal with ghee, which has the problematic proteins removed. And pasture-raised cows produce nutrient-dense milk, just as they produce nutrient-dense organ meats and meat filled with omega-3s. So, it’s not that a diet has to include everything I listed. But the more closely a diet follows these guidelines, the greater the health benefits.

It does matter to some degree, for example, where you get your nutrient-density. Fat-soluble vitamins are hard to find in non-animal sources, a problem for vegans. But even a vegan can vastly increase their nutrient intake by eating avocados, leafy greens, seaweed, etc. The main point is any increase in nutrients can have a drastic benefit to health. And the greater amount and variety of nutrients the greater the improvement.

That is why any diet you can imagine comes in healthy and unhealthy versions. No matter the diet, anyone who decreases unhealthy fats/oils and increases healthy fats/oils will unsurprisingly improve their health. But just as an omnivore could fill their plate with factory-farmed meat and dairy, a vegan could fill their plate with toxic soy-based processed foods and potato chips. The quality of a diet is in the details.

Still, it is easier to include more of what I listed in some diets than others. Certain nutrients are only found in animal sources and so a vegan has to be careful about supplementing what is otherwise lacking. A diet of whole foods that doesn’t require supplementation, however, is preferable.

That is why there are a surprisingly large number of self-identified vegans and vegetarians who will, at least on occasion, eat fish and other seafood. That might also be why the Mediterranean diet and Paleo diet can be so healthy, in their inclusion of these foods. Weston A. Price observed that some of the healthiest populations in the world were those who lived near the ocean. And this is why cod liver oil was traditionally one of the most important parts of the Western diet, high in both omega-3s and fat-soluble vitamins, along with much else.

Whatever the details one focuses upon, the simple rule is to increase the positives and decrease the negatives. It’s not that difficult, as long as one knows which details matter most. The basic trick to any healthy diet is to not eat like the average American. That is the secret.

* * *

Getting that out of the way, here is my bias.

My own dietary preferences are based on functional medicine, traditional foods, the paleo diet, nutritional science, anthropology, and archaeology — basically, any and all relevant evidence and theory. This is what informs the list I provided above, with primary focus on the Paleo diet, which brings all the rest together. That is what differentiates the Paleo diet from all others: it is a systematic approach that scientifically explains why the diet works. It focuses not just on one aspect but on all known aspects, including lifestyle and such.

Something like the Mediterranean diet is far different. It has been widely researched and it is healthy, at least relative to what it has been tested against. Still, there are multiple limitations to the health claims made about it.

First, the early research was done after World War II and, because of the ravages to the food supply, the diet people were eating then was different from what they had been eating before. The healthy adults observed were healthy because of the diet they grew up on, not because of the deprivation diet they experienced after the war. That earlier diet was filled with meat and saturated fat, but it also had lots of vegetables and olive oil as well. As in the US, the health of the Mediterranean people had declined from one generation to the next. So, arguing that the post-war Mediterranean diet was healthier than the post-war American diet wasn’t necessarily making as strong a claim as it first appeared, as health was declining in both populations, with the decline in the latter being far worse.

Working with that problematic research alone, there was no way to get beyond mere associations in order to determine causation. As such, it couldn’t be stated with any certainty which parts of the diet were healthy, which parts unhealthy, and which parts neutral. It was a diet based on associations, not on scientific understanding of mechanisms and the evidence supporting them. It’s the same kind of associative research that originally linked saturated fat to heart disease, only for it to be discovered later that sugar showed the stronger correlation. The confusion came about because, in the American population on an industrialized diet, saturated fat consumption had become associated with sugar consumption, but no study ever directly linked saturated fat to heart disease. It was a false or meaningless association, a correlation that, it turns out, didn’t imply causation.

That is the kind of mistake that the Paleo diet seeks to avoid. The purpose is not merely to look for random associations and hope that they are causal without ever proving it. Based on other areas of science, paleoists make hypotheses that can be tested, both in clinical studies and in personal experience. The experimental attitude is central.

That is why there is no single Paleo diet, in the way there is a single Mediterranean diet. As with hunter-gatherers in the real world, there is a diversity of Paleo diets that are tailored to different purposes, health conditions, and understandings. Dr. Terry Wahls’ Paleo diet is a plant-based protocol for multiple sclerosis, while Dr. Dale Bredesen’s Paleo diet is part of an even more complex protocol, including ketosis, for Alzheimer’s. Other ketogenic Paleo diets target the treatment of obesity, autism, etc. Still other Paleo diets allow more carbs and so don’t prioritize ketosis at all. There are even Paleo diets so plant-based as to be vegetarian, with or without the inclusion of fish and seafood, more similar to that of Dr. Wahls.

Which is the Paleo diet? All of them. But what do they all have in common? What I listed above. They all take a multi-pronged approach. Other diets work to the degree they overlap with the Paleo diet, especially on nutrient-density. Sarah Ballantyne, a professor and medical biophysicist, argues that nutrient-density might be the single most important factor, and she might be right. Certainly, you could do worse than focusing on that alone. That has largely been the focus of traditional foods, as inspired by the work of Weston A. Price. Most diets seem to improve nutrient-density, one way or another, even if they don’t do it as fully as the best diets. The advantage of the Paleo diet(s), as with traditional foods and functional medicine, is that there is scientific understanding about why specific nutrients matter, even as our overall knowledge of nutrients has many gaps. Still, knowledge with gaps is better than anything else at the moment.

The list of dos and don’ts is based on the best science available. The science likely will change and so dietary recommendations will be modified accordingly. But if a diet is based on ideology instead, new information can have no impact. Fortunately, most people advocating diets are increasingly turning to a scientific approach. This might explain why so many diets are converging on the same set of principles. Few people would have been talking about nutrient-density back when the USDA made its initial dietary recommendations, as seen in the Food Pyramid. Yet now the idea of nutrient-density has become so scientifically established that it is almost common knowledge.

More than a list of specific foods to eat and avoid, the most important takeaway from the Paleo diet is the scientific and experimental approach that its advocates have expressed more strongly than most. That is the way to treat the list I give, for each person is dealing with individual strengths and weaknesses, a unique history of contributing factors and health concerns. So, even if you dismiss the Paleo diet for whatever reason, don’t dismiss the principles upon which it is based (for vegetarians, see: Ketotarian by Dr. Will Cole and The Paleo Vegetarian Diet by Dena Harris). Anyone following any diet will find something of use, as tailored to their own needs.

That is the purpose of my presenting generalized guidelines that apply to all diets. It’s a way of getting past the ideological rhetoric in order to get at the substance of health itself, to get at the causal level. The secret is that there is no single healthy diet, not in any simplistic sense, even as every healthy diet has much in common.

The Haunting of Voices

“If I met a skin-changer who demanded my shoes, I’d give him my shoes.” This is what a Navajo guy once told me. I didn’t inquire about why a skin-changer would want his shoes, but it was a nice detail of mundane realism. This conversation happened when I was living in Arizona and working at the Grand Canyon. Some might see this anecdote as the over-worked imagination of the superstitious. That probably is how I took it at the time. But I wouldn’t now be so dismissive.

While there, my job was to do housekeeping in the El Tovar. It’s an old hotel located directly on the South Rim of the canyon. It has the feeling of a building that has been around a while. Its age was hard for me to ignore, given the lack of an elevator, something I became familiar with in carrying stacks of sheets up multiple flights of stairs. I worked there a few times late at night and there was an eerie atmosphere to the place. You could viscerally sense the history, all the people who had stayed there and passed through.

There were stories of suicides and homicides, of lonely lost souls still looking for their lovers or simply going through their habitual routines in the afterlife. The place was famous for having been one of the locations where the Harvey Girls worked, young women looking for wealthy husbands. There was a tunnel that was once used by the Harvey Girls to go between the hotel and the women’s dorm. This hidden and now enclosed tunnel added to the spookiness.

Many Navajo worked at the Grand Canyon, including at the El Tovar. And sometimes we would chat. I asked about the ghosts that supposedly haunted the place. But they were reluctant to talk about it. I later learned that they thought it disrespectful or unwise to speak of the dead. I also learned that some had done traditional ceremonies in the hotel in order to put the dead to rest and help them pass over to the other side. Speaking of the dead would be like calling them back to the world of the living.

I doubt this worldview is merely metaphorical in the superficial sense. Though it might be metaphorical in the Jaynesian sense. Julian Jaynes hypothesized that ancient people continued to hear the voices of the dead, that the memory would live on as auditory experience. He called this the bicameral mind. And in bicameral societies, voice-hearing supposedly was key to social order. This changed because of various reasons and then voice-hearing became a threat to the next social order that replaced the old one.

The Navajo’s fearful respect for ghosts could be thought of as a bicameral carryover. Maybe they better understand the power voice-hearing can have. Ask any schizophrenic about this and they’d agree. Most of us, however, have developed thick boundaries of the egoic mind. We so effectively repress the many voices under the ego’s sole rulership that we no longer are bothered by their sway, at least not consciously.

Still, we may be more influenced than we realize. We still go through the effort of costly rituals of burying the dead where they are kept separate from the living, not to mention appeasing them with flowers and flags. Research shows that the number of people who have heard disembodied voices in their lifetime is surprisingly high. The difference for us is that we don’t openly talk about it and try our best to quickly forget it again. Even as we don’t have ceremonies in the way seen in Navajo tradition, we have other methods for dispelling the spirits that otherwise would haunt us.

Clearing Away the Rubbish

“The case against science is straightforward: much of the scientific literature, perhaps half, may simply be untrue”
~Richard Horton, editor in chief of The Lancet

“It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor”
~Dr. Marcia Angell, former editor in chief of NEJM

Back in September, there was a scientific paper published in Clinical Cardiology, a peer-reviewed medical journal that is “an official journal of the American Society for Preventive Cardiology” (Wikipedia). It got a ton of attention from news media, social media, and the blogosphere. The reason for all the attention is that, in the conclusion, the authors claimed that low-carb diets had proven the least healthy over a one-year period:

“One-year lowered-carbohydrate diet significantly increases cardiovascular risks, while a low-to-moderate-fat diet significantly reduces cardiovascular risk factors. Vegan diets were intermediate. Lowered-carbohydrate dieters were least inclined to continue dieting after conclusion of the study. Reductions in coronary blood flow reversed with appropriate dietary intervention. The major dietary effect on atherosclerotic coronary artery disease is inflammation and not weight loss.”

It has recently been retracted, and it has come out that the lead author, Richard M. Fleming, has a long history of fraud going back to 2002, with two fraud convictions in 2009 following his guilty plea. He has also since been debarred by the U.S. Food and Drug Administration. (But his closest brush with fame or infamy was his leaking of the medical records of Dr. Robert Atkins, a leak that was behind a smear campaign.) As for his co-authors: “Three of the authors work at Fleming’s medical imaging company in California, one is a deceased psychologist from Iowa, another is a pediatric nutritionist from New York and one is a Kellogg’s employee from Illinois. How this group was able to run a 12-month diet trial in 120 subjects is something of a mystery” (George Henderson). Even before the retraction, many wondered how it ever passed peer review, considering the low quality of the study: “This study has so many methodological holes in it that it has no real value.” (Low Carb Studies BLOG).

But of course, none of that has been reported as widely as the paper originally was. So, most people who read about it still assume it is valid evidence. This is related to the replication crisis: even researchers are often unaware of retractions, and that is when journals allow retractions to be published at all, something they are reluctant to do because it delegitimizes their authority. So, a lot of low-quality or in some cases deceptive research goes unchallenged and unverified, neither confirmed nor disconfirmed. It’s rare for any study to fall under the scrutiny of replication. If not for the lead author’s criminal background in the Fleming case, this probably would have been another paper that slipped past and was forgotten, or else, without replication, went on being repeatedly cited in future research. As such, bad research builds on bad research, creating the appearance of mounting evidence, when in reality it is a house of cards (consider the takedown of Ancel Keys and gang in the work of numerous authors: Gary Taubes’ Good Calories, Bad Calories; Nina Teicholz’s The Big Fat Surprise; Sally Fallon Morrell’s Nourishing Diets; et cetera).

This is why the systemic problem and failure is referred to as a crisis. Fairly or unfairly, the legitimacy of entire fields of science is being questioned. Even scientists no longer are certain which research is valid or not. The few attempts at determining the seriousness of the situation by replicating studies have found a surprisingly low replication rate. And this problem is worse in the medical field than in many other fields, partly because of the kind of funding involved and more importantly because of how few doctors are educated in statistics or trained in research methodology. It is even worse with nutrition, as the average doctor gets about half the questions wrong when asked about this topic, and keep in mind that so much of the nutritional research is done by doctors. An example of a problematic dietary study is that of Dr. Fleming himself. We’d be better off letting physicists and geologists do nutritional research.

There is more than a half century of research that conventional medical and dietary opinions are based upon. In some major cases, re-analysis of the data has shown completely opposite conclusions. For example, the most famous study by Ancel Keys blamed saturated fat for heart disease, while recent reappraisal of the data shows a stronger link to sugar as the culprit. Meanwhile, no study has ever directly linked saturated fat to heart disease. The confusion has come about because, in the Standard American Diet (SAD), saturated fat and sugar have been conflated in the population under study. Yet, even in cases like that of Keys, where we now know what the data shows, the original misleading conclusions are still referenced as authoritative.

The only time this crisis comes to attention is when the researcher gets attention. If Keys hadn’t been famous and Fleming hadn’t been criminal, no one would have bothered with their research. Lots of research gets continually cited without much thought, as the authority of research accumulates over time by being cited, which encourages further citation. It’s similar to how legal precedents can get set, even when the initial precedent was intentionally misinterpreted for that very purpose.

To dig through the original data, assuming it is available and one knows where to find it, is more work than most are willing to do. There is no glory or praise to be gained in doing it, nor will it promote one’s career or profit one’s bank account. If anything, there are plenty of disincentives in place, as academic careers in science are dependent on original research. Furthermore, private researchers working in corporations, for obvious reasons, tend to be even less open about their data and that makes scrutiny even more difficult. If a company found their own research didn’t replicate, they would be the last in line to announce it to the world and instead would likely bury it where it never would be found.

There is no system put into place to guard against the flaws of the system itself. And the news media is in an almost continual state of failure when it comes to scientific reporting. The crisis has been stewing for decades, occasionally being mentioned, but mostly suppressed, until now when it has gotten so bad as to be undeniable. The internet has created alternative flows of information and so much of the scrutiny, delayed for too long, is now coming from below. If this had happened at an earlier time, Fleming might have gotten away with it. But times have changed. And in crisis, there is opportunity or at very least there is hope for open debate. So bring on the debate, just as soon as we clear away some of the rubbish.

* * *

Retracted: Long‐term health effects of the three major diets under self‐management with advice, yields high adherence and equal weight loss, but very different long‐term cardiovascular health effects as measured by myocardial perfusion imaging and specific markers of inflammatory coronary artery disease

The above article, published online on 27 September 2018 in Wiley Online Library (wileyonlinelibrary.com), has been withdrawn by agreement between the journal Editor in Chief, A. John Camm and Wiley Periodicals, Inc. The article has been withdrawn due to concerns with data integrity and an undisclosed conflict of interest by the lead author.

A convicted felon writes a paper on hotly debated diets. What could go wrong?
by Ivan Oransky, Retraction Watch

Pro-tip for journals and publishers: When you decide to publish a paper about a subject — say, diets — that you know will draw a great deal of scrutiny from vocal proponents of alternatives, make sure it’s as close to airtight as possible.

And in the event that the paper turns out not to be so airtight, write a retraction notice that’s not vague and useless.

Oh, and make sure the lead author of said study isn’t a convicted felon who pleaded guilty to healthcare fraud.

If only we were describing a hypothetical.

On second thought: A man of many talents — with a spotty scientific record
by Adam Marcus, Boston Globe

Richard M. Fleming may be a man of many talents, but his record as a scientist has been spotty. Fleming, who bills himself on Twitter as “PhD, MD, JD AND NOW Actor-Singer!!!”, was a co-author of a short-lived paper in the journal Clinical Cardiology purporting to find health benefits from a diet with low or modest amounts of fat. The paper came out in late September — just a day before the Food and Drug Administration banned Fleming from participating in any drug studies. Why? Two prior convictions for fraud in 2009.

It didn’t take long for others to begin poking holes in the new article. One researcher found multiple errors in the data and noted that the study evidently had been completed in 2002. The journal ultimately retracted the article, citing “concerns with data integrity and an undisclosed conflict of interest by the lead author.” But Fleming, who objected to the retraction, persevered. On Nov. 5, he republished the study in another journal — proving that grit, determination, and a receptive publisher are more important than a spotless resume.

Malnourished Americans

Prefatory Note

It would be easy to mistake this writing for a carnivore’s rhetoric against the evils of grains and agriculture. I’m a lot more agnostic on the issue than it might seem. But I do come across as strongly opinionated, based on decades of personal experience with bad eating habits and their consequences; and my dietary habits were no better when I was vegetarian.

I’m not so much pro-meat as I am for healthy fats and oils, not only from animal sources but also from plants, with coconut oil and olive oil being two of my favorites. As long as you are getting adequate protein, from whatever source (including vegetarian foods), there is no absolute rule about protein intake. But hunter-gatherers on average eat more fats and oils than protein (and more than vegetables as well), whether the protein comes from meat or from seeds and nuts (though the protein and vegetables they get are of extremely high quality and, of course, nutrient-dense, along with much fiber). Too much protein with too little fat/oil causes rabbit starvation. It’s fat and oil that have higher satiety and, combined with low-carb ketosis, are amazing at eliminating food cravings, addictions, and over-eating.

Besides, I have nothing against plant-based foods. I eat more vegetables on the paleo diet than I did in the past, even when I was a vegetarian, and more than any vegetarian I know as well; not just more in quantity but also more in quality. Many paleo and keto dieters have embraced a plant-based diet with varying attitudes about meat and fat. Dr. Terry Wahls, a former vegetarian, reversed her symptoms of multiple sclerosis by formulating a paleo diet that includes massive loads of nutrient-dense vegetables, while adding in nutrient-dense animal foods as well (e.g., liver).

I’ve picked up three books lately that emphasize plants even further. One is The Essential Vegetarian Keto Cookbook, and it is pretty much as the title describes, mostly recipes with some introductory material about ketosis. Another book, Ketotarian by Dr. Will Cole, is likewise about keto vegetarianism, but with leniency toward fish consumption and ghee (the former not strictly vegetarian and the latter not strictly paleo). The most recent I got is The Paleo Vegetarian Diet by Dena Harris, another person with a lenient attitude toward diet. That is what I prefer in my tendency toward ideological impurity. About diet, I’m bi-curious or maybe multi-curious.

My broader perspective is that of traditional foods. This is largely based on the work of Weston A. Price, which I was introduced to long ago by way of the writings of Sally Fallon Morrell (formerly Sally Fallon). It is not a paleo diet in that agricultural foods are allowed, but its advocates share a common attitude with paleolists in valuing traditional nutrition and food preparation. Authors from both camps bond over their respect for Price’s work and so often reference those on the other side in their writings. I’m of the opinion, in line with traditional foods, that if you are going to eat agricultural foods then traditional preparation is all the more important (from long-fermented bread and fully soaked legumes to cultured dairy and raw aged cheese). Many paleolists share this opinion and some are fine with such things as ghee. My paleo commitment didn’t stop me from enjoying a white roll for Thanksgiving, adorning it with organic goat butter, and it didn’t kill me.

I’m not so much arguing against all grains in this post as I’m pointing out the problems found at the extreme end of dietary imbalance that we’ve reached this past century: industrialized and processed, denatured and toxic, grain-based/obsessed and high-carb-and-sugar. In the end, I’m a flexitarian who has come to see the immense benefits in the paleo approach, but I’m not attached to it as a belief system. I heavily weigh the best evidence and arguments I can find in coming to my conclusions. That is what this post is about. I’m not trying to tell anyone how to eat. I hope that heads off certain areas of potential confusion and criticism. So, let’s get to the meat of the matter.

Grain of Truth

Let me begin with a quote, share some related info, and then circle back around to putting the quote into context. The quote is from Grain of Truth by Stephen Yafa. It’s a random book I picked up at a secondhand store and my attraction to it was that the author is defending agriculture and grain consumption. I figured it would be a good balance to my other recent readings. Skimming it, one factoid stuck out. In reference to new industrial milling methods that took hold in the late 19th century, he writes:

“Not until World War II, sixty years later, were measures taken to address the vitamin and mineral deficiencies caused by these grain milling methods. They caught the government’s attention only when 40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty.” (p. 17)

That is remarkable. He is talking about the now infamous highly refined flour, something that never existed before. Even commercial whole wheat breads today, with some fiber added back in, have little in common with what was traditionally made for millennia. My grandparents were of that particular generation that was so severely malnourished, and so that was the world into which my parents were born. The modern health decline that has gained mainstream attention began many generations back. Okay, so put that on the backburner.

Against the Grain

In a post by Dr. Malcolm Kendrick, I was having a discussion in the comments section (and, at the same time, I was having a related discussion in my own blog). Göran Sjöberg brought up James C. Scott’s book about the development of agriculture, Against the Grain — writing that, “This book is very much about the health deterioration, not least through epidemics partly due to compromised immune resistance, that occurred in the transition from hunting and gathering to sedentary mono-crop agriculture state level scale, first in Mesopotamia about five thousand years ago.”

Scott’s view has interested me for a while. I find compelling the way he connects grain farming, legibility, record-keeping, and taxation. There is a reason great empires were built on grain fields, not on potato patches or vegetable gardens, much less cattle ranching. Grain farming is easily observed and measured, tracked and recorded, and that meant it could be widely taxed to fund large centralized governments along with their armies and, later on, their police forces and intelligence agencies. The earliest settled societies arose prior to agriculture, but they couldn’t become major civilizations until the cultivation of grains.

Another commenter, Sasha, responded with what she considered important qualifications: “I think there are too many confounders in transition from hunter gatherers to agriculture to suggest that health deterioration is due to one factor (grains). And since it was members of upper classes who were usually mummified, they had vastly different lifestyles from that of hunter gatherers. IMO, you’re comparing apples to oranges… Also, grain consumption existed in hunter gatherers and probably intensified long before Mesopotamia 5 thousands years ago as wheat was domesticated around 9,000 BCE and millet around 6,000 BCE to use just two examples.”

It is true that pre-neolithic hunter-gatherers, in some cases, sporadically ate grains in small amounts, or at least we have evidence they were doing something with grains, though as far as we know they might have been mixing them with medicinal herbs or using them as a thickener for paints — it’s anyone’s guess. Assuming they were eating those traces of grains we’ve discovered, it surely was nowhere near the level of the neolithic agriculturalists. Furthermore, during the following millennia, grains were radically changed through cultivation. As for the Egyptian elite, they were eating more grains than anyone, while farmers were still forced to partly subsist on hunting, fishing, and gathering.

I’d take the argument much further forward into history. We know from records that, through the 19th century, Americans were eating more meat than bread. Vegetable and fruit consumption was also relatively low and mostly seasonal. Part of that is because gardening was difficult with so many pests. Besides, with so many natural areas around, hunting and gathering remained a large part of the American diet. Even in the cities, wild game was easily obtained at cheap prices. Into the 20th century, hunting and gathering was still important and sustained many families through the Great Depression and World War era when many commercial foods were scarce.

It was different in Europe, though. Mass urbanization happened centuries before it did in the United States. And not much European wilderness was left standing in recent history. But with the fall of the Roman Empire and heading into feudalism, many Europeans returned to a fair amount of hunting and gathering, during which time general health improved in the population. Restrictive laws about land use eventually made that difficult, and the land enclosure movement made it impossible for most Europeans.

Even so, all of that is fairly recent in the big scheme of things. It took many millennia of agriculture before it more fully replaced hunting, fishing, trapping, and gathering. In places like the United States, that change is well within living memory. When some of my ancestors immigrated here in the 1600s, Britain and Europe still relied on plenty of wild food procurement to support their populations. And once here, wild foods were even more plentiful and a lot less work than farming.

Many early American farmers didn’t grow food so much for their own diet as for sale on the market, sometimes in the form of the popular grain-based alcohols. It was in making alcohol that rural farmers were able to get their product to market without it spoiling. I’m just speculating, but alcohol might have been the most widespread agricultural food of that era because water was often unsafe to drink.

Another commenter, Martin Back, made the same basic point: “Grain these days is cheap thanks to Big Ag and mechanization. It wasn’t always so. If the fields had to be ploughed by draught animals, and the grain weeded, harvested, and threshed by hand, the final product was expensive. Grain became a store of value and a medium of exchange. Eating grains was literally like eating money, so presumably they kept consumption to a minimum.”

In early agriculture, grain was more of a way to save wealth than a staple of the diet. It was saved for purposes of trade and also saved for hard times when no other food was available. What didn’t happen was to constantly consume grain-based foods every day and all day long — going from a breakfast with toast and cereal to lunch with a sandwich and maybe a salad with croutons, and then a snack of crackers in the afternoon before eating more bread or noodles for dinner.

Historical Examples

So, I am partly just speculating. But it’s informed speculation. I base my view on specific examples. The most obvious example is hunter-gatherers, poor by the standards of modern industrialization while maintaining great health, as long as their traditional way of life can be maintained. Many populations that are materially better off in terms of a capitalist society (access to comfortable housing, sanitation, healthcare, an abundance of food in grocery stores, etc) are not better off in terms of chronic diseases.

As the main example I already mentioned, poor Americans have often been a quite healthy lot, as compared to other populations around the world. It is true that poor Americans weren’t particularly healthy in the early colonial period, specifically in Virginia because of indentured servitude. And it’s true that poor Americans today are fairly bad off because of the cheap industrialized diet. Yet for the couple of centuries or so in between, they were doing quite well in terms of health, with lots of access to nutrient-dense wild foods. That point is emphasized by looking at other similar populations at the time, such as back in Europe.

Let’s do some other comparisons. The poor in the Roman Empire did not do well, even when they weren’t enslaved. That was for many reasons, such as growing urbanization and its attendant health risks. When the Roman Empire fell, many of the urban centers collapsed. The poor returned to a more rural lifestyle that depended on a fair amount of wild foods. Studies done on their remains show their health improved during that time. Then at the end of feudalism, with the enclosure movement and the return of mass urbanization, health went back into decline.

Now I’ll consider the early Egyptians. I’m not sure if there is any info about the diet and health of poor Egyptians. But clearly the ruling class had far from optimal health. It’s hard to make comparisons between then and now, though, because it was an entirely different kind of society. The early Bronze Age civilizations were mostly small city-states that lacked much hierarchy. Early Egypt didn’t even have the most basic infrastructure, such as maintained roads and bridges. And the most recent evidence indicates that the pyramid workers weren’t slaves but instead worked freely and seem to have been fed fairly well, whatever that may or may not indicate about their socioeconomic status. The fact that the poor weren’t mummified leaves us with scant evidence that would more directly inform us.

On the other hand, no one can doubt that there have been plenty of poor populations who had truly horrific living standards with much sickness, suffering, and short lifespans. That is particularly true over the millennia as agriculture became ever more central, since that meant periods of abundance alternating with periods of deficiency and sometimes starvation, often combined with weakened immune systems and rampant sickness. That was less the case for the earlier small city-states with less population density and surrounded by the near constant abundance of wilderness areas.

As always, it depends on the specifics we are talking about. Also, any comparison and conclusion is relative.

My mother grew up in a family that hunted, and at the time there was a certain amount of access to natural areas for many Americans, something that helped a large part of the population get through the Great Depression and world war era. Nonetheless, by the time of my mother’s childhood, overhunting had depleted most of the wild game (bison, bear, deer, etc. were no longer around), and so her family relied on less desirable foods such as squirrel, raccoon, and opossum; even the fish they ate were less than optimal, coming from waters highly polluted by the very factories and railroad her family worked in. So, the wild food opportunities weren’t nearly as good as they had been a half century earlier, much less in the prior centuries.

Not All Poverty is the Same

Being poor today means a lot of things that it didn’t mean in the past. The high rates of heavy metal toxicity seen today were rarely seen among previous poor populations. An estimated 40% of deaths worldwide are now linked to pollution, primarily affecting the poor, also extremely different from the past. Beyond that, inequality has grown larger than ever before, and that has been strongly correlated with high rates of stress, disease, homicide, and suicide. Such inequality is also seen in terms of climate change, droughts, refugee crises, and war/occupation.

Here is what Sasha wrote in response to me: “I agree with a lot of your points, except with your assertion that “the poor ate fairly well in many societies especially when they had access to wild sources of food”. I know how the poor ate in Russia in the beginning of the 20th century and how the poor eat now in the former Soviet republics and in India. Their diet is very poor even though they can have access to wild sources of food. I don’t know what the situation was for the poor in ancient Egypt but I would be very surprised if it was better than in modern day India or former Soviet Union.”

I’d imagine modern Russia has high inequality similar to the US. As for modern India, it is one of the most impoverished, densely populated, and malnourished societies around. And modern industrialization did major harm to Hindu Indians, because studies show that traditional vegetarians got a fair amount of nutrients from the insects that were mixed in with pre-modern agricultural goods. Both Russia and India have other problems related to neoliberalism, which wasn’t a factor in the past. It’s an entirely different kind of poverty these days. Even if some Russians have some access to wild foods, I’m willing to bet they have nowhere near the access that was available in previous generations, centuries, and millennia.

Compare modern poverty to that of feudalism. At least in England, feudal peasants were guaranteed to be taken care of in hard times. The Church, a large part of local governance at the time, was tasked with feeding and caring for the poor and needy, from orphans to widows. These were tight communities that took care of their own, something that no longer exists in most of the world, where the individual is left to suffer and struggle. Present Social Darwinian conditions are not the norm for human societies across history. The present breakdown of families and communities is historically unprecedented.

Socialized Medicine & Externalized Costs
An Invisible Debt Made Visible
On Conflict and Stupidity
Inequality in the Anthropocene
Capitalism as Social Control

The Abnormal Norms of WEIRD Modernity

Everything about present populations is extremely abnormal. This is seen in diet as elsewhere. Let me return to the quote I began this post with. “Not until World War II, sixty years later, were measures taken to address the vitamin and mineral deficiencies caused by these grain milling methods. They caught the government’s attention only when 40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty.” * So, what had happened to the health of the American population?

Well, there were many changes. Overhunting, as I already said, drove many wild game species extinct or eliminated them from local areas, such that my mother, born in a rural farm state, never saw a white-tailed deer growing up. Also, much earlier, after the Civil War, a new form of enclosure movement happened as laws were passed to prevent people, specifically the newly freed blacks, from hunting and foraging wherever they wanted (early American laws often protected the right of anyone to hunt, forage plants, collect timber, etc. from any land that was left open, whether or not it was owned by someone). The carryover from the feudal commons was finally and fully eliminated. It was also the end of the era of free-range cattle ranching, that ending having come with the invention of barbed wire. Access to wild foods was further reduced by the creation and enforcement of protected lands (e.g., the federal park system), which very much targeted the poor, who up to that point had relied upon wild foods for health and survival.

All of that was combined with mass urbanization and industrialization, with all of their new forms of pollution, stress, and inequality. Processed foods were becoming more widespread at the time. Around the turn of the century, unhealthy industrialized vegetable oils became heavily marketed and hence popular, replacing butter and lard. Also, muckraking about the meat industry scared Americans off from meat, and consumption precipitously dropped. As such, in the decades prior to World War II, the American diet had already shifted toward the pattern we now know. A new generation had grown up on that industrialized and processed diet, and those young people were the ones showing up as recruits for the military. In such a short period, this new diet had caused mass malnourishment. It was a mass experiment that showed failure early on, and yet we continue the same basic experiment, not only continuing it but making it far worse.

Government officials and health authorities blamed it on bread production. Refined flour had become widely available because of industrialization, and the refining removed all the nutrients that gave bread any health value. In response, there was a movement to fortify bread, initially enforced by federal law and later by state laws. That helped some, but obviously the malnourishment was caused by many other factors that weren’t appreciated by most at the time, even though this was the same period when Weston A. Price’s work was published. Nutritional science was young then, and most nutrients were still undiscovered or else unappreciated. Throwing a few lab-produced vitamins back into food barely scratches the surface of the nutrient-density that was lost.

Most Americans continue to have severe nutritional deficiencies. We don’t recognize this fact because being underdeveloped and sickly has become normalized, maybe even in the minds of most doctors and health officials. Besides, many of the worst symptoms don’t show up until decades later, often as chronic diseases of old age, although they are increasingly seen among the young. Far fewer Americans today would meet the health standards applied to World War II recruits. It’s been a steady decline, despite the miracles of modern medicine in treating symptoms and delaying death.

* The data on the British show an even earlier shift toward malnourishment, because imperial trade brought an industrialized diet sooner to the British population. Also, rural life with its greater share of wild foods had more quickly disappeared there, as compared to the US. The fate of the British in the late 1800s showed what would happen more than a half century later on the other side of the ocean.

Lore of Nutrition
by Tim Noakes
pp. 373-375

The mid-Victorian period between 1850 and 1880 is now recognised as the golden era of British health. According to P. Clayton and J. Rowbotham, 47 this was entirely due to the mid-Victorians’ superior diet. Farm-produced real foods were available in such surplus that even the working-class poor were eating highly nutritious foods in abundance. As a result, life expectancy in 1875 was equal to, or even better, than it is in modern Britain, especially for men (by about three years). In addition, the profile of diseases was quite different when compared to Britain today.

The authors conclude:

[This] shows that medical advances allied to the pharmaceutical industry’s output have done little more than change the manner of our dying. The Victorians died rapidly of infection and/or trauma, whereas we die slowly of degenerative disease. It reveals that with the exception of family planning, the vast edifice of twentieth century healthcare has not enabled us to live longer but has in the main merely supplied methods of suppressing the symptoms of degenerative disease which have emerged due to our failure to maintain mid-Victorian nutritional standards. 48

This mid-Victorians’ healthy diet included freely available and cheap vegetables such as onions, carrots, turnips, cabbage, broccoli, peas and beans; fresh and dried fruit, including apples; legumes and nuts, especially chestnuts, walnuts and hazelnuts; fish, including herring, haddock and John Dory; other seafood, including oysters, mussels and whelks; meat – which was considered ‘a mark of a good diet’ so that ‘its complete absence was rare’ – sourced from free-range animals, especially pork, and including offal such as brain, heart, pancreas (sweet breads), liver, kidneys, lungs and intestine; eggs from hens that were kept by most urban households; and hard cheeses.

Their healthy diet was therefore low in cereals, grains, sugar, trans fats and refined flour, and high in fibre, phytonutrients and omega-3 polyunsaturated fatty acids, entirely compatible with the modern Paleo or LCHF diets.

This period of nutritional paradise changed suddenly after 1875, when cheap imports of white flour, tinned meat, sugar, canned fruits and condensed milk became more readily available. The results were immediately noticeable. By 1883, the British infantry was forced to lower its minimum height for recruits by three inches; and by 1900, 50 per cent of British volunteers for the Boer War had to be rejected because of undernutrition. The changes would have been associated with an alteration in disease patterns in these populations, as described by Yellowlees (Chapter 2).

On Obesity and Malnourishment

There is no contradiction, by the way, between rampant nutritional deficiencies and the epidemic of obesity. Gary Taubes noted that the dramatic rise of obesity in America began early in the last century, which is to say that it is not a problem that came out of nowhere with the present younger generations. Americans have been getting fatter for a while now. Specifically, they were getting fatter while at the same time being malnourished, partly because of refined flour, about as empty a carb as is possible.

Taubes emphasizes that this seeming paradox has often been observed among poor populations around the world: a lack of optimal nutrition that leads to ever more weight gain, sometimes with children being skinny to an unhealthy degree only to grow up to be fat. No doubt many Americans in the early 1900s were dealing with much poverty and the lack of nutritious foods that often goes with it. As for today, nutritional deficiency looks different because of enrichment, but it persists nonetheless in many other ways. Also, as Keith Payne argues in The Broken Ladder, growing inequality mimics poverty in the conflict and stress it causes. And inequality has everything to do with food quality, as seen with many poor areas being food deserts.

I’ll give you a small taste of Taubes’s discussion. It is from the introduction to one of his books, published a few years ago. If you read the book, look at the section immediately following the passage below. He gives examples of tribes that were poor, didn’t overeat, and did hard manual labor. Yet they were getting obese, even as nearby tribes sometimes remained a healthy weight. The only apparent difference was what they were eating, not how much they were eating. The populations that saw major weight gain had adopted a grain-based diet, typically because of government rations or government stores.

Why We Get Fat
by Gary Taubes
pp. 17-19

In 1934, a young German pediatrician named Hilde Bruch moved to America, settled in New York City, and was “startled,” as she later wrote, by the number of fat children she saw—“really fat ones, not only in clinics, but on the streets and subways, and in schools.” Indeed, fat children in New York were so conspicuous that other European immigrants would ask Bruch about it, assuming that she would have an answer. What is the matter with American children? they would ask. Why are they so bloated and blown up? Many would say they’d never seen so many children in such a state.

Today we hear such questions all the time, or we ask them ourselves, with the continual reminders that we are in the midst of an epidemic of obesity (as is the entire developed world). Similar questions are asked about fat adults. Why are they so bloated and blown up? Or you might ask yourself: Why am I?

But this was New York City in the mid-1930s. This was two decades before the first Kentucky Fried Chicken and McDonald’s franchises, when fast food as we know it today was born. This was half a century before supersizing and high-fructose corn syrup. More to the point, 1934 was the depths of the Great Depression, an era of soup kitchens, bread lines, and unprecedented unemployment. One in every four workers in the United States was unemployed. Six out of every ten Americans were living in poverty. In New York City, where Bruch and her fellow immigrants were astonished by the adiposity of the local children, one in four children were said to be malnourished. How could this be?

A year after arriving in New York, Bruch established a clinic at Columbia University’s College of Physicians and Surgeons to treat obese children. In 1939, she published the first of a series of reports on her exhaustive studies of the many obese children she had treated, although almost invariably without success. From interviews with her patients and their families, she learned that these obese children did indeed eat excessive amounts of food—no matter how much either they or their parents might initially deny it. Telling them to eat less, though, just didn’t work, and no amount of instruction or compassion, counseling, or exhortations—of either children or parents—seemed to help.

It was hard to avoid, Bruch said, the simple fact that these children had, after all, spent their entire lives trying to eat in moderation and so control their weight, or at least thinking about eating less than they did, and yet they remained obese. Some of these children, Bruch reported, “made strenuous efforts to lose weight, practically giving up on living to achieve it.” But maintaining a lower weight involved “living on a continuous semi-starvation diet,” and they just couldn’t do it, even though obesity made them miserable and social outcasts.

One of Bruch’s patients was a fine-boned girl in her teens, “literally disappearing in mountains of fat.” This young girl had spent her life fighting both her weight and her parents’ attempts to help her slim down. She knew what she had to do, or so she believed, as did her parents—she had to eat less—and the struggle to do this defined her existence. “I always knew that life depended on your figure,” she told Bruch. “I was always unhappy and depressed when gaining [weight]. There was nothing to live for.… I actually hated myself. I just could not stand it. I didn’t want to look at myself. I hated mirrors. They showed how fat I was.… It never made me feel happy to eat and get fat—but I never could see a solution for it and so I kept on getting fatter.”

pp. 33-34

If we look in the literature—which the experts have not in this case—we can find numerous populations that experienced levels of obesity similar to those in the United States, Europe, and elsewhere today but with no prosperity and few, if any, of the ingredients of Brownell’s toxic environment: no cheeseburgers, soft drinks, or cheese curls, no drive-in windows, computers, or televisions (sometimes not even books, other than perhaps the Bible), and no overprotective mothers keeping their children from roaming free.

In these populations, incomes weren’t rising; there were no labor-saving devices, no shifts toward less physically demanding work or more passive leisure pursuits. Rather, some of these populations were poor beyond our ability to imagine today. Dirt poor. These are the populations that the overeating hypothesis tells us should be as lean as can be, and yet they were not.

Remember Hilde Bruch’s wondering about all those really fat children in the midst of the Great Depression? Well, this kind of observation isn’t nearly as unusual as we might think.

How Americans Used to Eat

Below is a relevant passage from Nina Teicholz’s book, The Big Fat Surprise. It puts into context how extremely unusual the high-carb, low-fat diet of these past few generations has been. This is partly what informed some of my thoughts. We so quickly forget that the present dominance of a grain-based diet wasn’t always the case, likely not even in most agricultural societies until quite recently. In fact, the earlier American diet is still within living memory, although those left to remember it are quickly dying off.

Let me explain why the history of diets matters. One of the arguments for forcing official dietary recommendations onto the entire population was the belief that Americans in a mythical past ate less meat, fat, and butter while eating more bread, legumes, and vegetables. This turns out to have been a trick of limited data.

We now know, from better data, that the complete opposite was the case. And further data show that the rise of the conventional diet has coincided with a rise in obesity and chronic diseases. That isn’t to say eating more vegetables is bad for your health, but we do know that even as the average American intake of vegetables has gone up, so have all the diet-related health conditions. During this time, what went down was the consumption of all the traditional foods of the American diet going back to the colonial era: wild game, red meat, organ meat, lard, and butter — all the foods Americans ate in huge amounts prior to the industrialized diet. I noticed a couple of comments about this. Someone going by the username kaakitwitaasota shared data:

“Robert J. Gordon’s The Rise and Fall of American Growth devotes a section to changes in the American diet. In 1870, the average American ate 74 pounds of beef and veal, 123 pounds of pork, 15 pounds of chicken and turkey and a pound of lamb, for a grand total of 212 pounds of meat a year, falling to 141 pounds by 1940. 1940 represents a mid-century nadir, because meat consumption climbs again after the war: in 2010 the average American ate just barely less meat than in 1870, but much, much more of it was poultry (about 60 pounds of beef, 47 of pork, a pound of lamb and 96 pounds of chicken and turkey).

“Meat consumption was low in the rest of the world, but the comparative prosperity of the US, though abysmal by the standards of today, was borne out in a high consumption of meat. (Though, admittedly, much of that meat was not bought–the American farmer had much more land than his European counterpart, and pigs will generally look after themselves given the space.)

“I wonder about nut and fish consumption, though. Fish + shellfish consumption was only about 11 pounds a person in 1940, and I doubt it was much higher in 1870, though I don’t know for certain.”

Then another commenter, going by blkadder, added: “It’s even more complex than that though. In addition to the type of feed, you need to look at the fat content of the meat. Due in part to the fat-is-bad madness a lot of the fat has been bred out of pork in the U.S. as an example which further complicates analysis.”

Others have covered this same time period of American dietary history. Writing in the New York Times, Jane Ziegelman describes the American love of meat going back centuries (America’s Obsession With Cheap Meat):

“In her 1832 travel book, “Domestic Manners of the Americans,” the English writer Frances Trollope describes the breathtaking quantities of food on American dinner tables. Even tea, she reports, is a “massive meal,” a lavish spread of many cakes and breads and “ham, turkey, hung beef, apple sauce and pickled oysters.”

“Equally impressive to this foreign observer were the carnivorous tendencies of her American hosts. “They consume an extraordinary amount of bacon,” she writes, while “ham and beefsteaks appear morning, noon and night.”

“Americans were indiscriminate in their love for animal protein. Beef, pork, lamb and mutton were all consumed with relish. However, as pointed out by the food historian Harvey Levenstein, it was beef, the form of protein preferred by the upper class, “that reigned supreme in status.” With the opening of the Western frontier in the mid-19th century, increased grazing land for cattle lowered beef prices, making it affordable for the working class.

“Dietary surveys conducted at the turn of the 20th century by Wilbur Atwater, father of American nutrition, revealed that even laborers were able to have beefsteak for breakfast. As Atwater was quick to point out, a high-protein diet set American workers apart from their European counterparts. On average, Americans ate a phenomenal 147 pounds of meat a year; Italians, by contrast, consumed 24.

““Doubtless,” Atwater wrote, “we live and work more intensely than people do in Europe.” The “vigor, ambition and hopes for higher things” that distinguished the American worker, he argued, was fed by repeated helpings of T-bone and sirloin steak.”

That calculation might be way off, an underestimation if anything. The majority of Americans remained rural, largely working as farmers, into the early 20th century. All we have data for is the food that was shipped across state lines, as no one kept nationwide records on food that was produced, bought, and consumed locally. Until quite recently, most of the food Americans ate was either grown at home or at a nearby farm, not to mention all the food that was hunted, trapped, fished, and gathered. So, that 147 pounds of meat probably was only the tip of the iceberg.
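To make the point concrete, here is a minimal back-of-the-envelope sketch. The recorded shares below are hypothetical, chosen purely for illustration rather than taken from any survey; the only point is how quickly a recorded figure understates true consumption when part of the food supply never enters the records:

```python
# Illustrative only: the recorded shares are hypothetical, not survey data.
# If a recorded per-capita figure captures only part of what was actually eaten,
# the implied true figure scales up as the recorded share shrinks.

def implied_total(recorded_lbs: float, recorded_share: float) -> float:
    """Return the implied true per-capita consumption, assuming recorded_lbs
    reflects only recorded_share of total consumption."""
    return recorded_lbs / recorded_share

recorded = 147.0  # Atwater's reported figure: pounds of meat per person per year
for share in (1.0, 0.75, 0.5):  # hypothetical fractions of consumption that were recorded
    print(f"If records captured {share:.0%} of consumption: ~{implied_total(recorded, share):.0f} lbs/yr")
```

Nothing hangs on the particular shares; the point is simply that home-raised, locally bought, and wild-caught food missing from the records would push the real number well above 147 pounds.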

What added to the confusion and misinterpretation of the evidence had to do with timing. Diet and nutrition were first seriously studied right at the moment when, for most populations, they had already changed. That was the failure of Ancel Keys’s research on what came to be called the Mediterranean diet (see Sally Fallon Morell’s Nourishing Diets). The populations he studied were recuperating from World War II, which had devastated their traditional way of life, including their diet. Keys took the post-war deprivation diet as the historical norm, but the reality was far different. Cookbooks and other evidence from before the war show that these populations used to eat higher levels of meat and fat, including saturated fat. So the very people he focused on had grown up and spent most of their lives on a diet that was, at that moment, no longer available because of the disruption of the food system. What good health Keys observed came from a lifetime of eating a different diet. Combined with cherry-picking of data and biased analysis, Keys came to a conclusion that was as wrong as wrong could be.

Slightly earlier, Weston A. Price was able to see a different picture. He intentionally traveled to the places where traditional diets remained fully in place, and the devastation of World War II had yet to happen. Price came to the conclusion that what mattered most of all was nutrient-density. Sure, the vegetables eaten would have been of a higher quality than we get today, largely because they were heirloom cultivars grown on healthy soil. Nutrient-dense foods can only come from nutrient-dense soil, whereas today our food is nutrient-deficient because our soil is highly depleted. The same goes for animal foods. Animals pastured on healthy land will produce healthy dairy, eggs, meat, and fat; these foods will be high in omega-3s and the fat-soluble vitamins.

No matter whether it comes from plant sources or animal sources, nutrient-density might be the most important factor of all. Fat is meaningful in this context because fat is where the fat-soluble vitamins are found and it is through fat that they are metabolized. In turn, the fat-soluble vitamins play a key role in the absorption and processing of numerous other nutrients, not to mention in numerous functions in the body. Nutrient-density and fat-density go hand in hand in terms of general health. That is what early Americans were getting in eating so much wild food, not only wild game but also wild greens, fruit, and mushrooms. And nutrient-density is precisely what we are lacking today, as the nutrients have been intentionally removed to make more palatable commercial foods.

Once again, this has a class dimension, since the wealthier have more access to nutrient-dense foods. Few poor people could afford to shop at a high-end health food store, even if one were located near their home. But it was quite different in the past, when nutrient-dense foods were available to everyone and sometimes more available to the poor concentrated in rural areas. If we want to improve public health, the first thing we should do is return to this historical norm.

Now, in ending, here is the longer passage mentioned earlier:

The Big Fat Surprise
by Nina Teicholz
pp. 123-131

Yet despite this shaky and often contradictory evidence, the idea that red meat is a principal dietary culprit has thoroughly pervaded our national conversation for decades. We have been led to believe that we’ve strayed from a more perfect, less meat-filled past. Most prominently, when Senator McGovern announced his Senate committee’s report, called Dietary Goals, at a press conference in 1977, he expressed a gloomy outlook about where the American diet was heading. “Our diets have changed radically within the past fifty years,” he explained, “with great and often harmful effects on our health.” Hegsted, standing at his side, criticized the current American diet as being excessively “rich in meat” and other sources of saturated fat and cholesterol, which were “linked to heart disease, certain forms of cancer, diabetes and obesity.” These were the “killer diseases,” said McGovern. The solution, he declared, was for Americans to return to the healthier, plant-based diet they once ate.

The New York Times health columnist Jane Brody perfectly encapsulated this idea when she wrote, “Within this century, the diet of the average American has undergone a radical shift away from plant-based foods such as grains, beans and peas, nuts, potatoes, and other vegetables and fruits and toward foods derived from animals—meat, fish, poultry, eggs and dairy products.” It is a view that has been echoed in literally hundreds of official reports.

The justification for this idea, that our ancestors lived mainly on fruits, vegetables, and grains, comes mainly from the USDA “food disappearance data.” The “disappearance” of food is an approximation of supply; most of it is probably being eaten, but much is wasted, too. Experts therefore acknowledge that the disappearance numbers are merely rough estimates of consumption. The data from the early 1900s, which is what Brody, McGovern, and others used, are known to be especially poor. Among other things, these data accounted only for the meat, dairy, and other fresh foods shipped across state lines in those early years, so anything produced and eaten locally, such as meat from a cow or eggs from chickens, would not have been included. And since farmers made up more than a quarter of all workers during these years, local foods must have amounted to quite a lot. Experts agree that this early availability data are not adequate for serious use, yet they cite the numbers anyway, because no other data are available. And for the years before 1900, there are no “scientific” data at all.

In the absence of scientific data, history can provide a picture of food consumption in the late eighteenth to nineteenth century in America. Although circumstantial, historical evidence can also be rigorous and, in this case, is certainly more far-reaching than the inchoate data from the USDA. Academic nutrition experts rarely consult historical texts, considering them to occupy a separate academic silo with little to offer the study of diet and health. Yet history can teach us a great deal about how humans used to eat in the thousands of years before heart disease, diabetes, and obesity became common. Of course we don’t remember now, but these diseases did not always rage as they do today. And looking at the food patterns of our relatively healthy early-American ancestors, it’s quite clear that they ate far more red meat and far fewer vegetables than we have commonly assumed.

Early-American settlers were “indifferent” farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with “the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,” as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.

The endless bounty of America in its early years is truly astonishing. Settlers recorded the extraordinary abundance of wild turkeys, ducks, grouse, pheasant, and more. Migrating flocks of birds would darken the skies for days. The tasty Eskimo curlew was apparently so fat that it would burst upon falling to the earth, covering the ground with a sort of fatty meat paste. (New Englanders called this now-extinct species the “doughbird.”)

In the woods, there were bears (prized for their fat), raccoons, bobolinks, opossums, hares, and virtual thickets of deer—so much that the colonists didn’t even bother hunting elk, moose, or bison, since hauling and conserving so much meat was considered too great an effort. IX

A European traveler describing his visit to a Southern plantation noted that the food included beef, veal, mutton, venison, turkeys, and geese, but he does not mention a single vegetable. Infants were fed beef even before their teeth had grown in. The English novelist Anthony Trollope reported, during a trip to the United States in 1861, that Americans ate twice as much beef as did Englishmen. Charles Dickens, when he visited, wrote that “no breakfast was breakfast” without a T-bone steak. Apparently, starting a day on puffed wheat and low-fat milk—our “Breakfast of Champions!”—would not have been considered adequate even for a servant.

Indeed, for the first 250 years of American history, even the poor in the United States could afford meat or fish for every meal. The fact that the workers had so much access to meat was precisely why observers regarded the diet of the New World to be superior to that of the Old. “I hold a family to be in a desperate way when the mother can see the bottom of the pork barrel,” says a frontier housewife in James Fenimore Cooper’s novel The Chainbearer.

Like the primitive tribes mentioned in Chapter 1, Americans also relished the viscera of the animal, according to the cookbooks of the time. They ate the heart, kidneys, tripe, calf sweetbreads (glands), pig’s liver, turtle lungs, the heads and feet of lamb and pigs, and lamb tongue. Beef tongue, too, was “highly esteemed.”

And not just meat but saturated fats of every kind were consumed in great quantities. Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard. X

In the book Putting Meat on the American Table, researcher Roger Horowitz scours the literature for data on how much meat Americans actually ate. A survey of eight thousand urban Americans in 1909 showed that the poorest among them ate 136 pounds a year, and the wealthiest more than 200 pounds. A food budget published in the New York Tribune in 1851 allots two pounds of meat per day for a family of five. Even slaves at the turn of the eighteenth century were allocated an average of 150 pounds of meat a year. As Horowitz concludes, “These sources do give us some confidence in suggesting an average annual consumption of 150–200 pounds of meat per person in the nineteenth century.”

About 175 pounds of meat per person per year! Compare that to the roughly 100 pounds of meat per year that an average adult American eats today. And of that 100 pounds of meat, more than half is poultry—chicken and turkey—whereas until the mid-twentieth century, chicken was considered a luxury meat, on the menu only for special occasions (chickens were valued mainly for their eggs). Subtracting out the poultry factor, we are left with the conclusion that per capita consumption of red meat today is about 40 to 70 pounds per person, according to different sources of government data—in any case far less than what it was a couple of centuries ago.

Yet this drop in red meat consumption is the exact opposite of the picture we get from public authorities. A recent USDA report says that our consumption of meat is at a “record high,” and this impression is repeated in the media. It implies that our health problems are associated with this rise in meat consumption, but these analyses are misleading because they lump together red meat and chicken into one category to show the growth of meat eating overall, when it’s just the chicken consumption that has gone up astronomically since the 1970s. The wider-lens picture is clearly that we eat far less red meat today than did our forefathers.

Meanwhile, also contrary to our common impression, early Americans appeared to eat few vegetables. Leafy greens had short growing seasons and were ultimately considered not worth the effort. They “appeared to yield so little nutriment in proportion to labor spent in cultivation,” wrote one eighteenth-century observer, that “farmers preferred more hearty foods.” Indeed, a pioneering 1888 report for the US government written by the country’s top nutrition professor at the time concluded that Americans living wisely and economically would be best to “avoid leafy vegetables,” because they provided so little nutritional content. In New England, few farmers even had many fruit trees, because preserving fruits required equal amounts of sugar to fruit, which was far too costly. Apples were an exception, and even these, stored in barrels, lasted several months at most.

It seems obvious, when one stops to think, that before large supermarket chains started importing kiwis from New Zealand and avocados from Israel, a regular supply of fruits and vegetables could hardly have been possible in America outside the growing season. In New England, that season runs from June through October or maybe, in a lucky year, November. Before refrigerated trucks and ships allowed the transport of fresh produce all over the world, most people could therefore eat fresh fruit and vegetables for less than half the year; farther north, winter lasted even longer. Even in the warmer months, fruit and salad were avoided, for fear of cholera. (Only with the Civil War did the canning industry flourish, and then only for a handful of vegetables, the most common of which were sweet corn, tomatoes, and peas.)

Thus it would be “incorrect to describe Americans as great eaters of either [fruits or vegetables],” wrote the historians Waverly Root and Richard de Rochemont. Although a vegetarian movement did establish itself in the United States by 1870, the general mistrust of these fresh foods, which spoiled so easily and could carry disease, did not dissipate until after World War I, with the advent of the home refrigerator.

So by these accounts, for the first two hundred and fifty years of American history, the entire nation would have earned a failing grade according to our modern mainstream nutritional advice.

During all this time, however, heart disease was almost certainly rare. Reliable data from death certificates is not available, but other sources of information make a persuasive case against the widespread appearance of the disease before the early 1920s. Austin Flint, the most authoritative expert on heart disease in the United States, scoured the country for reports of heart abnormalities in the mid-1800s, yet reported that he had seen very few cases, despite running a busy practice in New York City. Nor did William Osler, one of the founding professors of Johns Hopkins Hospital, report any cases of heart disease during the 1870s and eighties when working at Montreal General Hospital. The first clinical description of coronary thrombosis came in 1912, and an authoritative textbook in 1915, Diseases of the Arteries including Angina Pectoris, makes no mention at all of coronary thrombosis. On the eve of World War I, the young Paul Dudley White, who later became President Eisenhower’s doctor, wrote that of his seven hundred male patients at Massachusetts General Hospital, only four reported chest pain, “even though there were plenty of them over 60 years of age then.” XI About one fifth of the US population was over fifty years old in 1900. This number would seem to refute the familiar argument that people formerly didn’t live long enough for heart disease to emerge as an observable problem. Simply put, there were some ten million Americans of a prime age for having a heart attack at the turn of the twentieth century, but heart attacks appeared not to have been a common problem.

Was it possible that heart disease existed but was somehow overlooked? The medical historian Leon Michaels compared the record on chest pain with that of two other medical conditions, gout and migraine, which are also painful and episodic and therefore should have been observed by doctors to an equal degree. Michaels catalogs the detailed descriptions of migraines dating all the way back to antiquity; gout, too, was the subject of lengthy notes by doctors and patients alike. Yet chest pain is not mentioned. Michaels therefore finds it “particularly unlikely” that angina pectoris, with its severe, terrifying pain continuing episodically for many years, could have gone unnoticed by the medical community, “if indeed it had been anything but exceedingly rare before the mid-eighteenth century.” XII

So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. XIII

Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating. The publication of The Jungle, Upton Sinclair’s fictionalized exposé of the meatpacking industry, caused meat sales in the United States to fall by half in 1906, and they did not revive for another twenty years. In other words, meat eating went down just before coronary disease took off. Fat intake did rise during those years, from 1909 to 1961, when heart attacks surged, but this 12 percent increase in fat consumption was not due to a rise in animal fat. It was instead owing to an increase in the supply of vegetable oils, which had recently been invented.

Nevertheless, the idea that Americans once ate little meat and “mostly plants”—espoused by McGovern and a multitude of experts—continues to endure. And Americans have for decades now been instructed to go back to this earlier, “healthier” diet that seems, upon examination, never to have existed.

Ketogenic Diet and Neurocognitive Health

Below is a passage from Ketotarian by Will Cole. It can be read in Chapter 1, titled “the ketogenic diet (for better and worse)”. The specific passage is found on pp. 34-38 in the printed book (first edition) or pp. 28-31 in the Google ebook. I share it here because it is a great up-to-date summary of the value of the ketogenic diet. It is the low-carb diet pushed to its furthest extent, where you burn fat instead of sugar; that is to say, the body comes to prioritize ketones and use them more efficiently than glucose.

The brain, in particular, prefers ketones. That is why I decided to share a passage specifically on neurological health, as diet and nutrition aren’t the first things most people think of in terms of what often gets framed as mental health, typically treated with psychiatric medications. But considering the severely limited efficacy of entire classes of such drugs (e.g., antidepressants), maybe it’s time for a new paradigm of treatment.

The basic advantage of ketosis is that, until modernity, most humans for most of human evolution (and going back into hominid evolution) were largely dependent on a high-fat diet for normal functioning. This is indicated by how much more efficiently the body uses ketones than glucose. What the body does with carbs and sugar, though, is either use them right away or store them as fat. This is why hunter-gatherers would, when possible, carb-load right before winter in order to fatten themselves up. We have applied the same knowledge in using carbs to fatten up animals before slaughter.

Besides fattening up for winter in northern climes, hunter-gatherers focus most of their diet on fats and oils, in that, when available, they choose to eat far more fats and oils than lean meat or vegetables. They do most of their hunting during the season when animals are fattest and, if they aren’t simply doing a mass slaughter, they specifically target the fattest individual animals. After the kill, they often throw the lean meat to the dogs or mix it with fat for later use (e.g., pemmican).

This is why, prior to agriculture, ketosis was the biological and dietary norm. Even farmers until recent history largely depended on supplementing their diets with hunting and gathering. Up until the 20th century, most Americans ate more meat than bread, while intake of vegetables and fruits was minor and mostly seasonal. The meat most Americans, including city-dwellers, were eating was wild game because of the abundance of nearby wilderness areas; and, going by cookbooks of the time, fats and oils were at the center of the diet.

Anyway, simply by reading the following passage, you will become better informed on this topic than not only the average American but, sadly, also the average American doctor. This isn’t the kind of info that is emphasized in medical schools, despite its being fairly well researched at this point (see the appended section of the author’s notes). “A study in the International Journal of Adolescent Medicine and Health assessed the basic nutrition and health knowledge of medical school graduates entering a pediatric residency program and found that, on average, they answered only 52 percent of eighteen questions correctly,” as referenced by Dr. Cole. He concluded that, “In short, most mainstream doctors would fail nutrition” (see previous post).

Knowledge is a good thing. And so here is some knowledge.

* * *

NEUROLOGICAL IMPROVEMENTS

Around 25 percent of your body’s cholesterol is found in your brain, (19) and remember, your brain is composed of 60 percent fat. (20) Think about that. Over half of your brain is fat! What we have been traditionally taught when it comes to “low-fat is best” ends up depriving your brain of the very thing it is made of. It’s not a coincidence that many of the potential side effects associated with statins—cholesterol-lowering drugs—are brain problems and memory loss. (21)

Your gut and brain actually form from the same fetal tissue in the womb and continue their special bond throughout your entire life through the gut-brain axis and the vagus nerve. Ninety-five percent of your happy neurotransmitter serotonin is produced and stored in your gut, so you can’t argue that your gut doesn’t influence the health of your brain. (22) The gut is known as the “second brain” in the medical literature, and a whole area of research known as the cytokine model of cognitive function is dedicated to examining how chronic inflammation and poor gut health can directly influence brain health. (23)

Chronic inflammation leads to not only increased gut permeability but blood-brain barrier destruction as well. When this protection is compromised, your immune system ends up working in overdrive, leading to brain inflammation. (24) Inflammation can decrease the firing rate of neurons in the frontal lobe of the brain in people with depression. (25) Because of this, antidepressants can be ineffective since they aren’t addressing the problem. And this same inflammatory oxidative stress in the hypothalamic cells of the brain is one potential factor of brain fog. (26)

Exciting emerging science is showing that a ketogenic diet can be more powerful than some of the strongest medications for brain-related problems such as autism, attention deficit/hyperactivity disorder (ADHD), bipolar disorder, schizophrenia, anxiety, and depression. (27) Through a ketogenic diet, we can not only calm brain-gut inflammation but also improve the gut microbiome. (28)

Ketones are also extremely beneficial because they can cross the blood-brain barrier and provide powerful fuel to your brain, providing mental clarity and improved mood. Their ability to cross the blood-brain barrier paired with their natural anti-inflammatory qualities provides incredible healing properties when it comes to improving traumatic brain injury (TBI) as well as neurodegenerative diseases. (29)

Medium-chain triglycerides (MCTs), found in coconuts (a healthy fat option in the Ketotarian diet), increase beta-hydroxybutyrate and are proven to enhance memory function in people with Alzheimer’s disease (30) as well as protect against neurodegeneration in people with Parkinson’s disease. (31) Diets rich in polyunsaturated fats, wild-caught fish specifically, are associated with a 60 percent decrease in Alzheimer’s disease. (32) Another study of people with Parkinson’s disease also found that the severity of their condition improved 43 percent after just one month of eating a ketogenic diet. (33) Studies have also shown that a ketogenic diet improves autism symptoms. (34) Contrast that with high-carb diets, which have been shown to increase the risk of Alzheimer’s disease and other neurodegenerative conditions. (35)

TBI or traumatic brain injury is another neurological area that can be helped through a ketogenic diet. When a person sustains a TBI, it can result in impaired glucose metabolism and inflammation, both of which are stabilized through a healthy high-fat ketogenic diet. (36)

Ketosis also increases the brain-derived-neurotrophic factor (BDNF), which protects existing neurons and encourages the growth of new neurons—another neurological benefit. (37)

In its earliest phases, modern ketogenic diet research was focused on treating epilepsy. (38) Children with epilepsy who ate this way were more alert, were more well behaved, and had more enhanced cognitive function than those who were treated with medication. (39) This is due to increased mitochondrial function, reduced oxidative stress, and increased gamma-aminobutyric acid (GABA) levels, which in turn helps reduce seizures. These mechanisms can also provide benefits for people with brain fog, anxiety, and depression. (40)

METABOLIC HEALTH

Burning ketones rather than glucose helps maintain balanced blood sugar levels, making the ketogenic way of eating particularly beneficial for people with metabolic disorders, diabetes, and weight-loss resistance.

Insulin resistance, the negative hormonal shift in metabolism that we mentioned earlier, is at the core of blood sugar problems and ends up wreaking havoc on the body, eventually leading to heart disease, weight gain, and diabetes. As we have seen, healthy fats are a stronger form of energy than glucose. The ketogenic diet lowers insulin levels and reduces inflammation as well as improving insulin receptor site sensitivity, which helps the body function the way it was designed. Early trial reports have shown that type 2 diabetes symptoms can be reversed in just ten weeks on the ketogenic diet! (41)

Fascinating research has been done correlating blood sugar levels and Alzheimer’s disease. In fact, so much so that the condition is now being referred to by some experts as type 3 diabetes. With higher blood sugar and increased insulin resistance comes more degeneration in the hippocampus, your brain’s memory center. (42) It’s because of this that people with type 1 and 2 diabetes have a higher risk of developing Alzheimer’s disease. This is another reason to get blood sugar levels balanced and have our brain burn ketones instead.

Notes:

* * *

I came across something interesting on the Ketogenic Forum, a discussion of a video. It’s a Dateline report on the ketogenic diet from almost a quarter century ago, back when I was a senior in high school. So, not only has the ketogenic diet been known in the medical literature for about a century, but it has even shown up in mainstream reporting for decades. Yet ketogenic-oriented and related low-carb diets such as the paleo diet get called fad diets, even though the low-carb diet has been well known for even longer, going back to the 19th century.

The Dateline segment was about ketosis used as a treatment for serious medical conditions. But even though it was a well-known treatment for epilepsy, doctors apparently still weren’t commonly recommending it. In fact, the keto diet wasn’t even mentioned as an option by a national expert, who instead focused on endless drugs and even surgery. After doing his own research on his son’s seizures, the father in the story discovered the keto diet in the medical literature. The doctor was asked why he hadn’t recommended it for the child’s seizures when it was known to have the highest efficacy rate. The doctor essentially had no answer other than to say that there were more drugs he could try, even as he admitted that no drug comes close in comparison.

As one commenter put it, “Seems like even back then the Dr’s knew drugs would always trump diet even though the success rate of the keto diet was 50-70%. No drugs at the time could even come close to that. And the one doctor still insisted they should try even more drugs to help Charlie even after Keto. Ugh!” Everyone knows the diet works. It’s been proven beyond all doubt. But there is a simple problem. There is no profit to be made from an easy and effective non-pharmaceutical solution.

This doctor knew there was a better possibility to offer the family and chose not to mention it. The consequence of his medical malfeasance is that the kid may have ended up with permanent brain damage from the seizures and from the side effects of the medications. The father was shocked and angry. You’d think cases like this would have woken up the medical community, right? Well, you’d be wrong if you thought so. A quarter of a century later, most doctors continue to act clueless that these kinds of diets can help numerous health conditions. It’s not for lack of available information, as many of these doctors knew about it even back then. But it simply doesn’t fit into conventional medicine, nor within the big drug and big insurance framework.


Most Mainstream Doctors Would Fail Nutrition

“A study in the International Journal of Adolescent Medicine and Health assessed the basic nutrition and health knowledge of medical school graduates entering a pediatric residency program and found that, on average, they answered only 52 percent of eighteen questions correctly. In short, most mainstream doctors would fail nutrition.”
~Dr. Will Cole

That is amazing. The point is emphasized by the fact that these are doctors fresh out of medical school. If they were never taught this info in the immediately preceding years of intensive education and training, they are unlikely to pick up more knowledge later in their careers. These young doctors are among the most well-educated people in the world, as few fields are as hard to enter and the drop-out rate of medical students is phenomenal. These graduates entering residency programs are among the smartest of Americans, the cream of the crop, having been taught at some of the best schools in the world. They are highly trained experts in their field, but obviously that expertise doesn’t include nutrition.

Think about this. Doctors are where most people turn for serious health advice. They are the ultimate authority figures that the average person directly meets and talks to. If a cardiologist answered only 52 percent of questions about heart health correctly, would you follow her advice and let her do heart surgery on you? I’d hope not. In that case, why would you listen to the dietary opinions of the typical, ill-informed doctor? Nutrition isn’t a minor part of health, that is for sure. It is the one area where an individual has some control over their life and so isn’t a mere victim of circumstance. Research shows that simple changes in diet and nutrition, not to mention lifestyle, can have dramatic results. Yet few people have that knowledge because most doctors and other officials, to put it bluntly, are ignorant. Anyone who points out this state of affairs in mainstream thought generally isn’t received with welcoming gratitude, much less friendly dialogue and rational debate.

In reading about the paleo diet, a pattern I’ve noticed is that few of its critics know what the diet is and what is advocated by those who adhere to it. It’s not unusual to see, following a criticism of the paleo diet, a description of dietary recommendations that are basically in line with the paleo diet. Their own caricature blinds them to the reality, obfuscating the common ground of agreement or shared concern. I’ve seen the same kind of pattern in the critics of many alternative views: genetic determinists against epigenetic researchers and social scientists, climate change denialists against climatologists, Biblical apologists against Jesus mythicists, Chomskyan linguists against linguistic relativists, etc. In such cases, there is always plenty of fear toward those posing a challenge, and so they are treated as the enemy to be attacked. And it is intended as a battle in which the spoils go to the victor, with those in dominance assuming they will be the victor.

After debating some people on a blog post by a mainstream doctor (Paleo-suckered), it became clear to me how attractive genetic determinism and biological essentialism are to many defenders of conventional medicine: the notion that there isn’t much you can do about your health other than do what the doctor tells you and take your meds (these kinds of views may be on the decline, but they are far from down for the count). What bothers them isn’t limited to the paleo diet but seemingly extends to almost any diet as such, excluding official dietary recommendations. They see diet advocates as quacks, faddists, and cultists pushing an ideological agenda, and they feel like they are being blamed for their own ill health; from their perspective, it is unfair to tell people they are capable of improving their diet, at least beyond the standard advice of eating your veggies and whole grains while gulping down your statins and shooting up your insulin.

As a side note, I’m reminded of how what often gets portrayed as alternative wasn’t always seen that way. Linguistic relativism was a fairly common view prior to the Chomskyan counter-revolution. Likewise, much of what gets promoted by the paleo diet was considered common sense in mainstream medical thought earlier last century and in the centuries prior (e.g., that carbs are fattening was easily observed back in the day when most people lived on farms, as carbs were and still are how animals get fattened for the slaughter). In many cases, these are old debates that go in cycles. But the cycles are so long, often extending over centuries, that old views appear as if they were radically new and so are easily dismissed as such.

Early Christian heresiologists admitted to the parallels that Jesus mythicists now point to, but their only defense was that the devil did it, planting those parallels in prior religions. During the Enlightenment Age, many people kept bringing up these religious parallels, and this was part of mainstream debate. Yet it was suppressed with the rise of literal-minded fundamentalism during the modern era. Then there is the battle between the Chomskyites, genetic determinists, etc. and their opponents, which is part of a cultural conflict that goes back at least to the ancient Greeks and the differing approaches of Plato and Aristotle (Daniel Everett discusses this in Dark Matter of the Mind; see this post).

To return to the topic at hand, the notion of food as medicine, a premise of the paleo diet, also goes back to the ancient Greeks — in fact, it originates with the founder of modern medicine, Hippocrates (he is also credited with saying that “All disease begins in the gut,” a slight exaggeration of a common view about the importance of gut health, a key area of connection between the paleo diet and alternative medicine). What we now call functional medicine, treating people holistically, used to be the standard practice of family doctors for centuries and probably millennia, going back to medicine men and women. But this caring attitude and practice went by the wayside because it took time spent with patients, and insurance companies wouldn’t pay for it. Traditional healthcare that we now think of as alternative is maybe not possible within a for-profit model, but I’d say that is more of a criticism of the for-profit model than a criticism of traditional healthcare.

The dietary denialists love to dismiss the paleo lifestyle as a ‘fad diet’. But as Timothy Noakes argues, it is the least faddish diet around. It is based on research into what humans have been eating since the Paleolithic era and what hominids have been eating for millions of years. Even as a specific diet, it is among the earliest official dietary recommendations given by medical experts. Back when it was popularized, it was called the Banting diet, and the only complaint the medical authorities had was not that it was wrong but that it was right and they disliked it being promoted in the popular literature, as they considered dietary advice to be their turf to defend. Timothy Noakes wrote:

“Their first error is to label LCHF/Banting ‘the latest fashionable diet’; in other words, a fad. This is wrong. The Banting diet takes its name from an obese 19th-century undertaker, William Banting. First described in 1863, Banting is the oldest diet included in medical texts. Perhaps the most iconic medical text of all time, Sir William Osler’s The Principles and Practice of Medicine, published in 1892, includes the Banting/Ebstein diet as the diet for the treatment of obesity (on page 1020 of that edition). 13 The reality is that the only non-fad diet is the Banting diet; all subsequent diets, and most especially the low-fat diet that the UCT academics promote, are ‘the latest fashionable diets’.”
(Lore of Nutrition, p. 131)

The dominant paradigm maintains its dominance by convincing most people that what is perceived as ‘alternative’ was always seen that way or was a recent invention of radical thought. The risk the dominant paradigm takes is that, in attacking other views, it unintentionally acknowledges and legitimizes them. That happened in South Africa when the government spent hundreds of thousands of dollars attempting to destroy the career of Dr. Timothy Noakes, only to find that he was a knowledgeable enough expert to defend his medical views with scientific evidence. A similar thing happened when the Chomskyites viciously attacked the linguist Daniel Everett, who had worked in the field with native tribes; it turned out he was a better writer with more compelling ideas, and he also had the evidence on his side. What the dogmatic assailants ended up doing, in both cases, was bringing academic and public attention to these challengers of the status quo.

Even when these attacks don’t succeed, they are successful in setting examples. Even a pyrrhic victory is highly effective in demonstrating raw power in the short term. Not many doctors would be willing to risk their careers as Timothy Noakes did, and even fewer would have the capacity to defend themselves to such an extent. It’s not only the government that might go after a doctor but also private litigators. And a doctor who doesn’t toe the line can lose their job in a hospital or clinic, be denied Medicare reimbursement, be blacklisted from speaking at medical conferences, and suffer many other forms of punishment. That is what many challengers found in too loudly disagreeing with Ancel Keys and his allies — they were effectively silenced and could no longer get funding to do research, even though the strongest evidence was on their side of the argument. Being shut out and becoming a pariah is not a happy place to be.

The establishment can be fearsome when they flex their muscles. And watch out when they come after you. The defenders of the status quo become even more dangerous precisely when they are at their weakest, like an injured and cornered animal that growls all the louder, and most people wisely keep their distance. But without fools willing to risk it all in testing whether the bark really is worse than the bite, nothing would change and the world would grind to a halt, as inertia settled into full authoritarian control. We are in such a time. I remember the era of Bush Jr. and how we headed into the following time of rope-a-dope hope-and-change. There was a palpable feeling of change in the air, and I could viscerally sense the gears clicking into place. Something had irrevocably changed, and it wasn’t fundamentally about anything going on in the halls of power but about something within society and the culture. It made me feel gleeful at the time, like scratching the exact right spot where it itches — ah, there it is! Outwardly, the world more or less appeared the same, but the public mood had clearly shifted.

The bluntness of reactionary right-wingers is caused by the very fact that the winds of change are turning against them. That is why they praise the crude ridicule of wannabe emperor Donald Trump. What in the past could have been ignored by those in the mainstream no longer can be ignored. And after being ignored, the next step toward potential victory is being attacked, which can be mistaken for loss even as it offers the hope for reversal of fortune. Attacks come in many forms, with a few examples already mentioned. Along with ridicule, there is defamation, character assassination, scapegoating, and straw man arguments; allegations of fraud, quackery, malpractice, or deviancy. These are attacks as preemptive defense, in the hope of enforcing submission and silence. This only works for so long, though. The tide can’t be held back forever.

The establishment is under siege and they know it. Their only hope is to hold out long enough until the worst happens and they can drop the pretense and go full authoritarian. That is a risky gamble on their part and unlikely to pay off, but it is the only hope they have of maintaining power. Desperation of mind breeds desperation of action. But it’s not as if a choice is being made. The inevitable result of a dominant paradigm is that it closes itself off not only to all other possibilities but, more importantly, to even the imagination that something else is possible. Ideological realism becomes a reality tunnel. And insularity leads to intellectual laziness, as those who rule and those who support them come to depend on a presumed authority as the gatekeeper of legitimacy. What they don’t notice or don’t understand is the slow erosion of that authority and hence the loss of what Julian Jaynes called authorization. Their need to be absolutely right is no longer matched by their capacity to enforce their increasingly rigid worldview, their fragile and fraying ideological dogmatism.

This is why challengers to the status quo are in a different position, which makes the contest rather lopsided. There is a freedom to being outside the constraints of mainstream thought. An imbalance of power, in some ways, works in favor of those excluded from power, since they have all the world to gain and little to lose, meaning less to defend. This shows in how outsiders, more easily than insiders, can acknowledge where the other side is right and accept where points of commonality are to be found; that is to say, challengers to power don’t have to stay on the constant attack in the way that defenders of the status quo must (similar to how guerrilla fighters don’t have to defeat an empire, but simply not lose and wait it out). Trying to defeat ideological underdogs with growing popular support is like the U.S. military trying to win a war in Vietnam or Afghanistan — they are on the wrong side of history. But systems of power don’t give up without a fight, and they are willing to sacrifice loads of money and many lives fighting losing battles, if only to keep their enemies at bay for yet another day. And the zombie ideas these systems are built on are not easily eliminated. That is because they are highly infectious mind viruses that can continue to spread long after the original vector of disease has disappeared.

As such, the behemoth medical-industrial complex won’t be making any quick turns toward internal reform. Changes happen over generations. And for the moment, this generation of doctors and other healthcare workers was primarily educated and trained under the old paradigm. It’s the entire world most of them know. The system is a victim of its own success, and so those working within the system are victimized again and again through their own indoctrination. It’s not some evil sociopathic self-interest that keeps the whole mess slogging along; after all, even doctors suffer the same failed healthcare system as the rest of us and die of the same preventable diseases. All are sacrificed equally, all are food for the system’s hunger. When my mother brought my nephew to an appointment, the doctor was not trying to be a bad person when she made the bizarre and disheartening claim that all kids eat unhealthily and are sickly; i.e., there is nothing to be done about it, that’s just the way kids are. Working within the failed system, that is all she knows. The idea that sickness isn’t or shouldn’t be the norm was beyond her imagination.

It is up to the rest of us to imagine new possibilities and, in some cases, to resurrect old possibilities long forgotten. We can’t wait for a system to change when that system is indifferent to our struggles and suffering. We can’t wait for a future time when most doctors are well-educated on treating the whole patient, when officials are well-prepared for understanding and tackling systemic problems. Change will happen, as so many have come to realize, from the bottom up. There is no other way. Until that change happens, the best we can do is to take care of ourselves and take care of our loved ones. That isn’t about blame. It’s about responsibility, that is to say the ability to respond; and more importantly, the willingness to do so.

* * *

Ketotarian
by Dr. Will Cole
pp. 15-16

With the Hippocratic advice to “let food be thy medicine, and medicine thy food,” how far have we strayed that the words of the founder of modern medicine can actually be threatening to conventional medicine?

Today medical schools in the United States offer, on average, only about nineteen hours of nutrition education over four years of medical school.10 Only 29 percent of U.S. medical schools offer the recommended twenty-five hours of nutrition education.11 A study in the International Journal of Adolescent Medicine and Health assessed the basic nutrition and health knowledge of medical school graduates entering a pediatric residency program and found that, on average, they answered only 52 percent of eighteen questions correctly.12 In short, most mainstream doctors would fail nutrition. So if you were wondering why someone in functional medicine, outside conventional medicine, is writing a book on how to use food for optimal health, this is why.

Expecting health guidance from mainstream medicine is akin to getting gardening advice from a mechanic. You can’t expect someone who wasn’t properly trained in a field to give sound advice. Brilliant physicians in the mainstream model of care are trained to diagnose a disease and match it with a corresponding pharmaceutical drug. This medicinal matching game works sometimes, but it often leaves the patient with nothing but a growing prescription list and growing health problems.

With the strong influence that the pharmaceutical industry has on government and conventional medical policy, it’s no secret that using foods to heal the body is not a priority of mainstream medicine. You only need to eat hospital food once to know this truth. Even more, under current laws it is illegal to say that foods can heal. That’s right. The words treat, cure, and prevent are in effect owned by the Food and Drug Administration (FDA) and the pharmaceutical industry and can be used in the health care setting only when talking about medications. This is the Orwellian world we live in today; health problems are on the rise even though we spend more on health care than ever, and getting healthy is considered radical and often labeled as quackery.

10. K. Adams et al., “Nutrition Education in U.S. Medical Schools: Latest Update of a National Survey,” Academic Medicine 85, no. 9 (September 2010): 1537-1542, https://www.ncbi.nlm.nih.gov/pubmed/9555760.
11. K. Adams et al., “The State of Nutrition Education at US Medical Schools,” Journal of Biomedical Education 2015 (2015), Article ID 357627, 7 pages, http://dx.doi.org/10.1155/2015/357627.
12. M. Castillo et al., “Basic Nutrition Knowledge of Recent Medical Graduates Entering a Pediatric Residency Program,” International Journal of Adolescent Medicine and Health: 357-361, doi: 10.1515/ijamh-2015-0019, https://www.ncbi.nlm.nih.gov/pubmed/26234947.

Despite Growing Burden of Diet-related Disease, Medical Education Does Not Equip Students to Provide High Quality Nutritional Care to Patients
by Millie Barnes

The reviewed studies consistently found that medical students wanted to receive nutrition education to develop their skills in nutrition care but perceived that their education did not equip them to do so. Students cited both the quantity and quality of their education as reasons for this — poor quality and under-prioritization of nutrition in the curriculum, lack of interest and expertise in nutrition among faculty members, and few examples of nutritional counseling during clinical years to serve as models for emerging doctors.

Furthermore, students uniformly reported having a lack of required nutrition knowledge, which was also found through testing. For instance, one study found that when nutrition knowledge was assessed in a test, half of medical students scored below the pass rate.

Five studies assessing curriculum initiatives found that they had a modest positive effect. However, most nutrition initiatives were employed opportunistically as once-off activities, rather than being integrated in a sustained way into the medical curricula. Innovative initiatives — such as online curricula, hands-on cooking experiences, and learning from other health professionals such as dietitians — showed short-term and long-term benefits for patients and health systems. Therefore, the authors call for more funding for innovative curriculum initiatives to be developed and implemented.