Fasting, Calorie Restriction, and Ketosis

What we eat obviously affects gut health, including the microbiome, and through that, along with other mechanisms, it affects the rest of the body, the brain included (by way of gut permeability, the immune system, the vagus nerve, substances like glutamate and propionate, and much else). As for general health, I might add that the combinations in which foods are eaten (e.g., red meat with grains) are also an issue. The flip side of what you eat impacting neurocognition and mental health is that not eating (i.e., fasting, whether intermittent or extended), or else caloric restriction and carbohydrate reduction, ketogenic or otherwise, alters them in other ways.

Fasting, for example, increases levels of neurotransmitters such as serotonin, dopamine, and norepinephrine while temporarily reducing the brain’s release and use of them; plus, serotonin and its precursor tryptophan are made more available to the brain. So fasting allows your reserves of neurotransmitters to rebuild to higher levels. That is partly why a ketogenic diet, along with the brain’s efficient use of ketones, shows improvements in behavior, learning, memory, acuity, focus, vigilance, and mood (such as sense of well-being and sometimes euphoria), with specific benefits, to take a couple of examples, in cerebral blood flow and prefrontal-cortex-related cognitive functions (mental flexibility and set shifting). It also promotes stress resistance, inflammation reduction, weight loss, and metabolic health, while decreasing free radical damage, blood pressure, heart rate, and glucose levels. Many of these benefits are similar to those seen with strenuous exercise.

We know so much about this because the ketogenic diet is the only diet that has been specifically and primarily studied in terms of neurological diseases. That research goes back to early 20th-century work on epileptic seizures and autism; the diet was shown effective for other conditions later in the century (e.g., V. A. Angelillo et al, Effects of low and high carbohydrate feedings in ambulatory patients with chronic obstructive pulmonary disease and chronic hypercapnia); and more recently positive results have been seen in numerous other conditions (Dr. Terry Wahls’s work on multiple sclerosis, Dr. Dale Bredesen’s work on Alzheimer’s, etc.). By the way, the direction of causality can also run the other way, from brain to gut: “Studies also suggest that overwhelming systemic stress and inflammation—such as that induced via severe burn injury—can also produce characteristic acute changes in the gut microbiota within just one day of the sustained insult [15].” (Rasnik K. Singh et al, Influence of diet on the gut microbiome and implications for human health). And see:

“Various afferent or efferent pathways are involved in the MGB axis. Antibiotics, environmental and infectious agents, intestinal neurotransmitters/neuromodulators, sensory vagal fibers, cytokines, essential metabolites, all convey information about the intestinal state to the CNS. Conversely, the HPA axis, the CNS regulatory areas of satiety and neuropeptides released from sensory nerve fibers affect the gut microbiota composition directly or through nutrient availability. Such interactions appear to influence the pathogenesis of a number of disorders in which inflammation is implicated such as mood disorder, autism-spectrum disorders (ASDs), attention-deficit hypersensitivity disorder (ADHD), multiple sclerosis (MS) and obesity.” (Anastasia I. Petra et al, Gut-Microbiota-Brain Axis and Its Effect on Neuropsychiatric Disorders With Suspected Immune Dysregulation)

There are many other positive effects. Fasting reduces the risk of neurocognitive diseases: Parkinson’s, Alzheimer’s, etc. And it increases the protein BDNF (brain-derived neurotrophic factor), which helps grow neuronal connections. Results include increased growth of nerve cells from stem cells (as stem cells are brought out of their dormant state) and an increased number of mitochondria in cells (mitochondria being the cells’ energy factories), the former related to the ability of neurons to develop and maintain connections with each other. An extended fast will result in autophagy (cellular housekeeping): the clearing out of damaged cells and the complete replacement of your immune cells, which improves the functioning of your entire body (autophagy used to be thought not to occur in the brain, but we now know it does). Indeed, all interventions known to prolong youthful health, lessen and delay diseases of aging (diabetes, cancer, cardiovascular disease, etc.), and extend lifespan in lab animals involve autophagy (James H. Catterson et al, Short-Term, Intermittent Fasting Induces Long-Lasting Gut Health and TOR-Independent Lifespan Extension). Even calorie restriction has no effect when autophagy is blocked (Fight Aging!, Autophagy Required For Calorie Restriction Benefits?). Fasting cleans out the system, gives the body a rest from its normal functioning, and redirects energy toward healing and rebuilding.

As a non-human example, consider hibernation in bears. A study compared bears eating a natural diet (fruits, nuts, insects, and small mammals) with those eating human garbage (i.e., high-carb processed foods). “A research team tracked 30 black bears near Durango, Colo., between 2011 and 2015, paying close attention to their eating and hibernation habits. The researchers found that bears who foraged on human food hibernated less during the winters — sometimes, by as much as 50 days — than bears who ate a natural diet. The researchers aren’t sure why human food is causing bears to spend less time in their dens. But they say shorter hibernation periods are accelerating bears’ rates of cellular aging” (Megan Schmidt, Human Food Might Be Making Bears Age Faster). As with humans who don’t practice fasting or a ketogenic diet, bears who hibernate less don’t live as long. Maybe a high-carb diet messes with hibernation similarly to how it messes with ketosis.

Even intermittent fasting shows many of these benefits. Of course, you can make dramatic changes to the body without fasting at all, whether on a ketogenic diet (though one could call it a carb fast, since it is extremely low carb) or under severe caloric restriction (caloric restriction, by the way, has been an area of much mixed results and hence confusion; see two pieces by Peter Attia: Calorie restriction: Part I – an introduction & Part IIA – monkey studies; does intermittent fasting and ketosis mimic caloric restriction, or the other way around?). I’d add a caveat: on any form of dietary limitation or strict regimen, results vary depending on the specifics of the test subjects and other factors: how restricted and for how long, the micronutrient and macronutrient content of the diet, fat-adaptation and metabolic flexibility, etc. Humans, by the way, are designed for food variety, and so it is hard to know the consequences of a modern diet that often remains unchanged, season to season, year to year (Rachel Feltman, The Gut’s Microbiome Changes Rapidly with Diet). There is a vast difference between someone on a high-carb diet doing an occasional fast and someone on a ketogenic diet doing regular intermittent fasting. Even within a single factor such as a high-carb diet, there is little similarity between the average American eating processed foods and a vegetarian monk eating restricted calories. As another example, autophagy can take several days of fasting to be fully achieved, and how quickly it happens depends on starting conditions such as how many carbs were eaten beforehand and how much glucose is in the blood and glycogen in storage (chiefly in the liver and muscles), both of which need to be used up before ketosis begins.

Metabolic flexibility, closely related to fat-adaptation, requires flexibility of the microbiome. Research has found that certain hunter-gatherers have microbiomes that completely switch from season to season, and so the gut somehow manages to maintain some kind of memory of previous states of microbial balance, which allows them to be re-established as needed. This is seen more dramatically with the Inuit, who eat an extremely low-carb diet but seasonally eat relatively larger amounts of plant matter such as seaweed, experiencing temporary digestive issues until the needed microbes take hold again. Are these microbes dormant in the system or systematically reintroduced? In either case, the process is unknown, as far as I know. What we are clear about is how dramatically diet affects the microbiome, whatever the precise mechanisms.

For example, a ketogenic diet modulates the levels of the microbes Akkermansia muciniphila, Lactobacillus, and Desulfovibrio (Lucille M. Yanckello, Diet Alters Gut Microbiome and Improves Brain Functions). It is the microbes that mediate the diet’s influence on both epileptic seizures and autism, with Akkermansia decreased in the former and increased in the latter; that is to say, the ketogenic diet helps the gut regain balance no matter which direction the imbalance runs. In the case of epileptic seizures, Akkermansia spurs the growth of Parabacteroides, which alters neurotransmission by elevating the GABA/glutamate ratio (there is glutamate again): “the hippocampus of the microbe-protected mice had increased levels of the neurotransmitter GABA, which silences neurons, relative to glutamate, which activates them” (Carolyn Beans, Mouse microbiome findings offer insights into why a high-fat, low-carb diet helps epileptic children). No such effect was found in germ-free mice, that is to say, mice with no microbiome (similar results were found in human studies: Y. Zhang, Altered gut microbiome composition in children with refractory epilepsy after ketogenic diet). Besides reducing seizures, “GABA is a neurotransmitter that calms the body. Higher GABA to glutamate ratios has been shown to alleviate depression, reduce anxiety levels, lessen insomnia, reduce the severity of PMS symptoms, increase growth hormone, improve focus, and reduce systemic inflammation” (MTHFR Support, Can Eating A Ketogenic Diet Change Our Microbiome?). To throw out the other interesting mechanism, consider Desulfovibrio. Ketosis reduces its numbers, and that is a good thing, since it causes leakiness of the gut barrier, and what causes leakiness in one part of the body can cause it elsewhere as well, such as the blood-brain barrier. Autoimmune responses and inflammation can follow. This is why ketosis has been found beneficial for preventing and treating neurodegenerative conditions like Alzheimer’s (plus, ketones are a useful alternative fuel in Alzheimer’s, since patients’ brain cells begin starving to death as they lose the capacity to use glucose as a fuel).

All of this involves the factors that increase and reduce inflammation: “KD also increased the relative abundance of putatively beneficial gut microbiota (Akkermansia muciniphila and Lactobacillus), and reduced that of putatively pro-inflammatory taxa (Desulfovibrio and Turicibacter).” (David Ma et al, Ketogenic diet enhances neurovascular function with altered gut microbiome in young healthy mice). Beyond the microbiome itself, this has an immense impact on gut leakiness and autoimmune conditions, which allows inflammation to show up in numerous areas of the body, including, of course, the brain. Inflammation is found in conditions such as depression and schizophrenia. Even before this mechanism was known, much earlier research had long established that ketosis reduces inflammation.

It’s hard to know what this means, though. Hunter-gatherers tend to have much more diverse microbiomes than industrialized people. Yet the ketogenic diet that helps induce microbial balance simultaneously reduces diversity. So diversity isn’t always a good thing, another example being small intestinal bacterial overgrowth (SIBO). What matters is which microbes one has in abundance and, in relation, which microbes one lacks or has only in limited numbers. And what determines that isn’t limited to diet in the simple sense of what foods we eat or don’t eat, but the whole pattern involved. Also, keep in mind that in a society like ours most of the population is in varying states of gut dysbiosis. Eliminating the harmful microbes may have to come first, before the body can heal and rebalance. That is indicated by a study on multiple sclerosis which found that, after the subjects had an initial reduction in the microbiome, “They started to recover at week 12 and exceeded significantly the baseline values after 23–24 weeks on the ketogenic diet” (Alexander Swidsinski et al, Reduced Mass and Diversity of the Colonic Microbiome in Patients with Multiple Sclerosis and Their Improvement with Ketogenic Diet). As always, it’s complex. But the body knows what to do when you give it the tools it’s evolutionarily adapted to.

In any case, all of the methods described can show a wide range of benefits and improvements in physical and mental health. They are potentially suitable for almost anyone in a healthy state, and in some cases of disease, although, as always, seek medical advice before beginning any major dietary change; this goes especially for anyone with an eating disorder or malnourishment (admittedly, almost all people on a modern industrialized diet, especially Americans, are to some degree malnourished, though most not to the point of immediate danger to life). Proceed with caution. But you are free to take your life into your own hands by taking responsibility for your own health through experimentation and seeing what happens (my preferred methodology). In that case, the best-case scenario is that you gain benefit at no professional medical cost, and the worst-case scenario is that you die (not that I’ve heard of anyone dying from a typical version of a diet involving fasting, ketosis, and the like; you’re far more likely to die from the standard American diet; but individual health outcomes aren’t necessarily predictable based on the experience of others, even the vast majority of others). Still, you’re going to die eventually, no matter what you do. I wish you well, until that time.

* * *

Let me clarify one point of widespread confusion. Talk of ‘diets’, especially of the variety I’ve discussed here, is often framed in terms of restriction, and that word does come up quite a bit. I’m guilty of talking this way even in this post, as it is all but impossible to avoid such language, considering it pervades the scientific and medical literature. So there is an implication of deprivation, of self-control and self-denial, as if we must struggle and suffer to be healthy. That couldn’t be further from the truth.

Once you are fat-adapted and have metabolic flexibility, you are less restricted than you were before, in that you can eat more carbs and sugars for a time and then more easily return to ketosis, as is a common seasonal pattern for hunter-gatherers. And once you are no longer driven by food cravings and addictions, you’ll have a happier and healthier relationship to food: eating when genuinely hungry and going without for periods, without irritation or weakness, as also is common among hunter-gatherers.

This is simply a return to the state in which most humans have existed for most of our historical and evolutionary past. It’s not restriction or deprivation, much less malnourishment. It’s normalcy, or it should be. But we need to remember what normalcy looks and feels like: “People around the world suffer from starvation and malnutrition, and it is not only because they lack food and nutrients. Instead they suffer from immature microbiomes, which can severely impact health” (AMI, The effects of fasting and starvation on the microbiome). Gut health is inseparable from the rest, and these diets heal and rebalance the gut.

We need to redefine what health means, in a society where sickness has become the norm.

* * *

Here is a good discussion that is relevant, even though the author never discusses ketosis anywhere in his book. He points out that calorie intake and energy usage are approximately the same for urbanized humans as for hunter-gatherers, yet the former have high rates of obesity and the latter don’t. As many have noted, not all calories are the same, and so calories-in/calories-out is a myth. This data makes more sense once you understand how profoundly differently the body functions in ketogenic and non-ketogenic states.

100 Million Years of Food
by Stephen Le
pp. 166-168

At this point, a reader might conclude that the root of modern food-related ailments like obesity and diabetes lies in people eating a lot more food, due to the miracle of nitrogen fixation, and doing a lot less physical activity, due to the miracle of combustion engines and private vehicles. However, it turns out that neither of these common beliefs is supported by the evidence.

First, the food intake myth. The daily energy consumed through food in contemporary industrialized nations runs from about 2,300 kcal (kilocalories) among Japanese men and 1,800 kcal among Japanese women to 2,600 kcal among American men and 1,900 kcal among American women. 21 What is surprising is that the average daily caloric intake of these overweight industrialized societies is about the same as among hunter-gatherer groups, with some hunter-gatherer groups below and others above the calories consumed in industrialized nations. 22 Although hunter-gatherers ate about as much as we do today, they faced much greater variability in their food supply. In northern Australia, among the Anbarra, the daily energy intake dropped to 1,600 kcal during the rainy season and peaked at 2,500 kcal during the dry season. The calorie consumption of the Hiwi in the rainforests of Venezuela bounced between 1,400 and 2,800 kilocalories, depending on the season (plant foods were most plentiful at the end of the wet season). Thus, if any major pattern emerges in terms of caloric intake, it is that our hunter-gatherer ancestors lived on a dramatically varying diet, which swung between feast and famine according to the season and other hazards of fortune.

Another surprising finding concerns physical activity. Although it is commonly believed that people in hunter-gatherer societies expended much more energy than people in industrialized societies today, the evidence so far does not support this assumption. One common measure of physical activity level (PAL) expresses the total energy used in one day as a multiple of a person’s metabolic rate. For example, a PAL of 1 means that a person uses only his/her metabolic energy, i.e., the energy expended by breathing, thinking, digesting, etc. A PAL of 2 means that a person uses twice as much energy as his or her base metabolic rate. PAL allows us to adjust for the fact that people have varying levels of metabolism; a person who has a high metabolic rate can burn up a lot of energy by just sitting in one place compared to a person with low metabolism, so a good measure of physical activity needs to compensate for differences in metabolism. To determine the amount of energy used in a day, the best measure involves giving a person a drink of water that has been “tagged” with isotopes of hydrogen and oxygen. Measurement of these two tags in samples of saliva, urine, or blood allows measurement of exhaled carbon dioxide and hence the degree of respiration from metabolic processes.
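To put Le’s definition in symbols (my own restatement and illustrative arithmetic, not from the book):

$$\mathrm{PAL} = \frac{\text{total daily energy expenditure (kcal/day)}}{\text{basal metabolic rate (kcal/day)}}$$

So a forager with an assumed basal metabolic rate of 1,500 kcal/day and a PAL of 1.78 would expend about 1,500 × 1.78 ≈ 2,670 kcal/day; the 1,500 figure is just a round number chosen for illustration.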

Using tagged water, the average PAL among foragers was found to be 1.78 for men and 1.72 for women. Among industrialized contemporary societies with a high human development index (which measures income, literacy, and so on), the PAL was 1.79 for men and 1.71 for women. 23 In other words, the energy expenditure of overweight contemporary industrialized societies is roughly the same as that of lean hunter-gatherer societies once metabolism is taken into account; or to put it another way, the cause of obesity is unlikely to be lack of exercise, because people in industrialized societies today use about the same amount of energy as people in hunter-gatherer societies. 24

This finding has important implications for understanding obesity. All of us living in industrialized societies are aware of the stigma associated with obesity, and perhaps the longer-term health consequences of diabetes, high blood pressure, gout, and cancers associated with being overweight. Since food intake and energy expenditure levels today are roughly the same as during ancestral times (using the lifestyles of modern hunter-gatherers as a reasonable model for our ancestors’ lifestyles), why are obesity and diabetes so prevalent among industrialized societies and virtually nonexistent among our ancestors?

The first argument might be an objection that obesity has in fact been with us since the days of our earliest ancestors, so nothing has changed. It has been suggested that figurines of markedly obese women, found in Europe and dating to thirty thousand years ago, are proof that obesity existed at that time. However, no hunter-gatherer or small-scale horticultural group has ever manifested signs of obesity, despite having caloric intake and energy expenditure (adjusted for metabolism) within the range of contemporary industrialized populations. Thus the prehistoric statuettes may be representative of idealized feminine beauty, just as Barbie dolls and Japanese anime characters with huge eyes and exaggerated busts are fantasies more revealing of their creators than of real women.

* * *

Genius Foods:
Become Smarter, Happier, and More Productive While Protecting Your Brain for Life
by Max Lugavere

Baby Fat Isn’t Just Cute—It’s a Battery

Have you seen a baby lately? I’m talking about a newborn, fresh out of the womb. They’re fat. And cute. But mostly fat. Packed with stored energy prior to birth in the third trimester, the fatness of human babies is unprecedented in the mammal world. While the newborns of most mammal species average 2 to 3 percent of birth weight as body fat, humans are born with a body fat percentage of nearly 15, surpassing the fatness of even newborn seals. Why is this so? Because humans are born half-baked.

When a healthy human baby emerges from the womb, she is born physically helpless and with an underdeveloped brain. Unlike most animals at birth, a newborn human is not equipped with a full catalogue of instincts preinstalled. It is estimated that if a human were to be born at a similar stage of cognitive development to a newborn chimp, gestation would be at least double the length (that doesn’t sound fun—am I right ladies?). By being born “prematurely,” human brains complete their development not in the womb, but in the real world, with open eyes and open ears—this is probably why we’re so social and smart! And it is during this period of rapid brain growth, what some refer to as the “fourth trimester,” that our fat serves as an important ketone reservoir for the brain, which can account for nearly 90 percent of the newborn’s metabolism. Now you know: baby fat isn’t just there for pinching. It’s there for the brain.

* * *

Mitochondria and the Future of Medicine:
The Key to Understanding Disease, Chronic Illness, Aging, and Life Itself
by Lee Know

Ketogenic Diets and Calorie Restriction

Ketone bodies, herein also referred to simply as ketones, are three water-soluble compounds that are produced as by-products when fatty acids are broken down for energy in the liver. These ketones can be used as a source of energy themselves, especially in the heart and brain, where they are a vital source of energy during periods of fasting.

The three endogenous ketones produced by the body are acetone, acetoacetic acid, and beta-hydroxybutyric acid (which is the only one that’s not technically a ketone, chemically speaking). They can be converted to acetyl-CoA, which then enters the TCA cycle to produce energy.

Because fatty acids are so dense in energy, and the heart is one of the most energy-intensive organs, under normal physiologic conditions the heart preferentially uses fatty acids as its fuel source. However, under ketotic conditions, the heart can effectively utilize ketone bodies for energy.

The brain is also extremely energy-intensive, and usually relies on glucose for its energy. However, when glucose is in short supply, it gets a portion of its energy from ketone bodies (e.g., during fasting, strenuous exercise, a low-carbohydrate or ketogenic diet, and in neonates). While most other tissues have alternate fuel sources (besides ketone bodies) when blood glucose is low, the brain does not. For the brain, this is when ketones become essential. After three days of low blood glucose, the brain gets 25 percent of its energy from ketone bodies. After about four days, this jumps to 70 percent!

In normal healthy individuals, there is a constant production of ketone bodies by the liver and utilization by other tissues. Their excretion in urine is normally very low and undetectable by routine urine tests. However, as blood glucose falls, the synthesis of ketones increases, and when it exceeds the rate of utilization, their blood concentration increases, followed by increased excretion in urine. This state is commonly referred to as ketosis, and the sweet, fruity smell of acetone in the breath is a common feature of ketosis.

Historically, this sweet smell was linked to diabetes and ketones were first discovered in the urine of diabetic patients in the mid-nineteenth century. For almost fifty years thereafter, they were thought to be abnormal and undesirable by-products of incomplete fat oxidation.

In the early twentieth century, however, they were recognized as normal circulating metabolites produced by the liver and readily utilized by the body’s tissues. In the 1920s, a drastic “hyperketogenic” diet was found to be remarkably effective for treating drug-resistant epilepsy in children. In 1967, circulating ketones were discovered to replace glucose as the brain’s major fuel during prolonged fasting. Until then, the adult human brain was thought to be entirely dependent upon glucose.

During the 1990s, diet-induced hyperketonemia (commonly called nutritional ketosis) was found to be therapeutically effective for treating several rare genetic disorders involving impaired glucose utilization by nerve cells. Now, growing evidence suggests that mitochondrial dysfunction and reduced bioenergetic efficiency occur in brains of patients with Parkinson’s disease and Alzheimer’s disease. Since ketones are efficiently used by brain mitochondria for ATP generation and might also help protect vulnerable neurons from free-radical damage, ketogenic diets are being evaluated for their ability to benefit patients with Parkinson’s and Alzheimer’s diseases, and various other neurodegenerative disorders (with some cases reporting remarkable success).

There are various ways to induce ketosis, some easier than others. The best way is to use one of the various ketogenic diets (e.g., classic, modified Atkins, MCT or coconut oil, low-glycemic index diet), but calorie restriction is also proving its ability to achieve the same end results when carbohydrates are limited.

Features of Caloric Restriction

There are a number of important pieces to caloric restriction. First, and most obvious, caloric intake is the most critical factor. Typically, calories are restricted to about 40 percent of what a person would consume if food intake were unrestricted. For mice and rats, calorie restriction to this degree results in very different physical characteristics (size and body composition) than those of their control-fed counterparts. Regarding life extension, even smaller levels of caloric restriction (a reduction of only 10–20 percent of unrestricted calorie intake) produce longer-lived animals and disease-prevention effects.

In April of 2014, a twenty-five-year longitudinal study on rhesus monkeys showed positive results. The benefit of this study was that it was a long-term study done in primates—humans’ closest relatives—and confirms positive data we previously saw from yeasts, insects, and rodents. The research team reported that monkeys in the control group (allowed to eat as much as they wanted) had a 2.9-fold increased risk of disease (e.g., diabetes) and a 3-fold increased risk of premature death, compared to calorie-restricted monkeys (that consumed a diet with 30 percent fewer calories).

If other data from studies on yeast, insects, and rodents can be confirmed in primates, it would indicate that calorie restriction could extend life span by up to 60 percent, making a human life span of 130–150 years a real possibility without fancy technology or supplements or medications. The clear inverse relationship between energy intake and longevity links its mechanism to mitochondria—energy metabolism and free-radical production.
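For what it’s worth, the 130–150 range is easy to reconstruct with back-of-envelope arithmetic (mine, not the book’s), applying a 60 percent extension to a typical human lifespan of roughly 80–95 years:

$$80 \times 1.6 = 128 \qquad 95 \times 1.6 = 152$$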

Second, simply restricting the intake of fat, protein, or carbohydrates without overall calorie reduction does not increase the maximum life span of rodents. It’s the calories that count, not necessarily the type of calories (with the exception of those trying to reach ketosis, where type of calorie does count).

Third, calorie restriction has been shown to be effective in disease prevention and longevity in diverse species. Although most caloric restriction studies have been conducted on small mammals like rats or mice, caloric restriction also extends life span in single-celled protozoans, water fleas, fruit flies, spiders, and fish. It’s the only method of life extension that consistently achieves similar results across various species.

Fourth, these calorie-restricted animals stay “biologically younger” longer. Experimental mice and rats extended their youth and delayed (even prevented) most major diseases (e.g., cancers, cardiovascular diseases). About 90 percent of the age-related illnesses studied remained in a “younger” state for a longer period in calorie-restricted animals. Calorie restriction also greatly delayed cancers (including breast, colon, prostate, lymphoma), renal diseases, diabetes, hypertension, hyperlipidemia, lupus, and autoimmune hemolytic anemia, and a number of others.

Fifth, calorie restriction does not need to be started in early age to reap its benefits. Initiating it in middle-aged animals also slowed aging (this is good news for humans, because middle age is when most of us begin to think about our own health and longevity).

Of course, the benefits of calorie restriction relate back to mitochondria. Fewer calories mean less “fuel” (as electrons) entering the electron transport chain (ETC), and a corresponding reduction in free radicals. As you know by now, that’s a good thing.

Health Benefits

As just discussed, new research is showing that judicious calorie restriction and ketogenic diets (while preserving optimal nutritional intake) might slow down the normal aging process and, in turn, boost cardiovascular, brain, and cellular health. But how? We can theorize that the restriction results in fewer free radicals, but one step in confirming a theory is finding its mechanism.

In particular, researchers have identified the beneficial role of beta-hydroxybutyric acid (the one ketone body that’s not actually a ketone). It is produced by a low-calorie diet and might be the key to the reduced risk of age-related diseases seen with calorie restriction. Over the years, studies have found that restricting calories slows aging and increases longevity, but the mechanism behind this remained elusive. New studies are showing that beta-hydroxybutyric acid can block a class of enzymes, called histone deacetylases, which would otherwise promote free-radical damage.

While additional studies need to be conducted, it is known that those following calorie-restricted or ketogenic diets have lower blood pressure, heart rate, and glucose levels than the general population. More recently, there has been a lot of excitement around intermittent fasting as an abbreviated method of achieving the same end results.

However, self-prescribing a calorie-restricted or ketogenic diet is not recommended unless you’ve done a lot of research on the topic and know what to do. If not done properly, these diets can potentially increase mental and physical stress on the body. Health status should be improving, not declining, as a result of these types of diets, and when not done properly, these diets could lead to malnutrition and starvation. Health care practitioners also need to properly differentiate a patient who is in a deficiency state of anorexia or bulimia versus someone in a healthy state of ketosis or caloric restriction.

I’ll add a final word of caution: While ketogenic diets can be indispensable tools in treating certain diseases, their use in the presence of mitochondrial disease—at this point—is controversial and depends on the individual’s specific mitochondrial disease. In some cases, a ketogenic diet can help; in others it can be deleterious. So, of all the therapies listed in this book, the one for which I recommend specific expertise in its application is this diet, and only after a proper diagnosis.

* * *

Grain Brain:
The Surprising Truth about Wheat, Carbs, and Sugar–Your Brain’s Silent Killers

by David Perlmutter

Caloric Restriction

Another epigenetic factor that turns on the gene for BDNF production is calorie restriction. Extensive studies have clearly demonstrated that when animals are on a reduced-calorie diet (typically reduced by around 30 percent), their brain production of BDNF shoots up and they show dramatic improvements in memory and other cognitive functions. But it’s one thing to read experimental research studies involving rats in a controlled environment and quite another to make recommendations to people based upon animal research. Fortunately, we finally have ample human studies demonstrating the powerful effect of reducing caloric intake on brain function, and many of these studies have been published in our most well-respected medical journals. 13

In January 2009, for example, the Proceedings of the National Academy of Sciences published a study in which German researchers compared two groups of elderly individuals—one that reduced their calories by 30 percent and another that was allowed to eat whatever they wanted. The researchers were interested in whether changes could be measured between the two groups’ memory function. At the conclusion of the three-month study, those who were free to eat without restriction experienced a small but clearly defined decline in memory function, while memory function in the group on the reduced-calorie diet actually increased, and significantly so. Knowing that current pharmaceutical approaches to brain health are very limited, the authors concluded, “The present findings may help to develop new prevention and treatment strategies for maintaining cognitive health into old age.” 14

Further evidence supporting the role of calorie restriction in strengthening the brain and providing more resistance to degenerative disease comes from Dr. Mark P. Mattson, chief of the Laboratory of Neurosciences at the National Institute on Aging (NIA). He reported:

Epidemiological data suggest that individuals with a low calorie intake may have a reduced risk of stroke and neurodegenerative disorders. There is a strong correlation between per capita food consumption and risk for Alzheimer’s disease and stroke. Data from population-based case control studies showed that individuals with the lowest daily calorie intakes had the lowest risk of Alzheimer’s disease and Parkinson’s disease. 15

Mattson was referring to a population-based longitudinal prospective study of Nigerian families, in which some members moved to the United States. Many people believe that Alzheimer’s disease is something you “get” from your DNA, but this particular study told a different story. It was shown that the incidence of Alzheimer’s disease among Nigerian immigrants living in the United States was increased compared to their relatives who remained in Nigeria. Genetically, the Nigerians who moved to America were the same as their relatives who remained in Nigeria. 16 All that changed was their environment—specifically, their caloric intake. The research clearly focused on the detrimental effects that a higher caloric consumption has on brain health. In a 2016 study published in Johns Hopkins Health Review, Mattson again emphasized the value of caloric restriction in warding off neurodegenerative diseases while at the same time improving memory and mood. 17 One way to do that is through intermittent fasting, which we’ll fully explore in chapter 7 . Another way, obviously, is to trim back your daily consumption.

If the prospect of reducing your calorie intake by 30 percent seems daunting, consider the following: On average, we consume 23 percent more calories a day than we did in 1970. 18 Based on data from the Food and Agriculture Organization of the United Nations, the average American adult consumes more than 3,600 calories daily. 19 Most would consider “normal” calorie consumption to be around 2,000 calories daily for women and 2,500 for men (with higher requirements depending on level of activity/exercise). A 30 percent cut of calories from an average of 3,600 per day equals 1,080 calories.
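To spell out the arithmetic in that last sentence (my working, not Perlmutter’s):

$$3{,}600 \times 0.30 = 1{,}080 \qquad 3{,}600 - 1{,}080 = 2{,}520 \text{ kcal/day}$$

That is, the cut itself is 1,080 calories, and the resulting intake of about 2,520 kcal/day sits right at the 2,500 he cites as “normal” for men.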

We owe a lot of our increased calorie consumption to sugar. Remember, the average American consumes roughly 163 grams (652 calories) of refined sugars a day—reflecting upward of a 30 percent hike in just the last three decades. 20 And of that amount, about 76 grams (302 calories) are from high-fructose corn syrup. So focusing on just reducing sugar intake may go a long way toward achieving a meaningful reduction in calorie intake, and this would obviously help with weight loss. Indeed, obesity is associated with reduced levels of BDNF, as is elevation of blood sugar. Remember, too, that increasing BDNF provides the added benefit of actually reducing appetite. I call that a double bonus.
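Those calorie figures follow from the standard conversion of roughly 4 kcal per gram of carbohydrate (again my arithmetic, not the book’s):

$$163 \times 4 \approx 652 \text{ kcal} \qquad 76 \times 4 \approx 304 \text{ kcal}$$

the latter agreeing with the quoted 302 to within rounding.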

But if the figures above still aren’t enough to motivate you toward a diet destined to help your brain, in many respects, the same pathway that turns on BDNF production can be activated by intermittent fasting (which, again, I’ll detail in chapter 7 ).

The beneficial effects in treating neurologic conditions using caloric restriction actually aren’t news for modern science, though; they have been recognized since antiquity. Calorie restriction was the first effective treatment in medical history for epileptic seizures. But now we know how and why it’s so effective: It confers neuroprotection, increases the growth of new brain cells, and allows existing neural networks to expand their sphere of influence (i.e., neuroplasticity).

While low caloric intake is well documented in relation to promoting longevity in a variety of species—including roundworms, rodents, and monkeys—research has also demonstrated that lower caloric intake is associated with a decreased incidence of Alzheimer’s and Parkinson’s disease. And the mechanisms by which we think this happens are via improved mitochondrial function and controlling gene expression.

Consuming fewer calories decreases the generation of free radicals while at the same time enhancing energy production from the mitochondria, the tiny organelles in our cells that generate chemical energy in the form of ATP (adenosine triphosphate). Mitochondria have their own DNA, and we know now that they play a key role in degenerative diseases such as Alzheimer’s and cancer. Caloric restriction also has a dramatic effect on reducing apoptosis, the process by which cells undergo self-destruction. Apoptosis happens when genetic mechanisms within cells are turned on that culminate in the death of that cell. While it may seem puzzling at first as to why this should be looked upon as a positive event, apoptosis is a critical cellular function for life as we know it. Pre-programmed cell death is a normal and vital part of all living tissues, but a balance must be struck between effective and destructive apoptosis. In addition, caloric restriction triggers a decrease in inflammatory factors and an increase in neuroprotective factors, specifically BDNF. It also has been demonstrated to increase the body’s natural antioxidant defenses by boosting enzymes and molecules that are important in quenching excessive free radicals.

In 2008, Dr. Veronica Araya of the University of Chile in Santiago reported on a study she performed during which she placed overweight and obese subjects on a three-month calorie-restricted diet, with a total reduction of 25 percent of calories. 21 She and her colleagues measured an exceptional increase in BDNF production, which led to notable reductions in appetite. It’s also been shown that the opposite occurs: BDNF production is decreased in animals on a diet high in sugar. 22 Findings like this have since been replicated.

One of the most well-studied molecules associated with caloric restriction and the growth of new brain cells is sirtuin-1 (SIRT1), an enzyme that regulates gene expression. In monkeys, increased SIRT1 activation enhances an enzyme that degrades amyloid—the starch-like protein whose accumulation is the hallmark of diseases like Alzheimer’s. 23 In addition, SIRT1 activation changes certain receptors on cells, leading to reactions that have the overall effect of reducing inflammation. Perhaps most important, activation of the sirtuin pathway by caloric restriction enhances BDNF. BDNF not only increases the number of brain cells, but also enhances their differentiation into functional neurons (again, because of caloric restriction). For this reason, we say that BDNF enhances learning and memory. 24

The Benefits of a Ketogenic Diet

While caloric restriction is able to activate these diverse pathways, which are not only protective of the brain but enhance the growth of new neuronal networks, the same pathway can be activated by the consumption of special fats called ketones. By far the most important fat for brain energy utilization is beta-hydroxybutyrate (beta-HBA), and we’ll explore this unique fat in more detail in the next chapter. This is why the so-called ketogenic diet has been a treatment for epilepsy since the early 1920s and is now being reevaluated as a therapeutic option in the treatment of Parkinson’s disease, Alzheimer’s disease, ALS, depression, and even cancer and autism. 25 It’s also showing promise for weight loss and ending type 2 diabetes. In mouse models, the diet rescues hippocampal memory deficits and extends healthy lifespan.

Google the term “ketogenic diet” and well over a million results pop up. Between 2015 and 2017, Google searches for the term “keto” increased ninefold. But the studies demonstrating a ketogenic diet’s power date back further. In one 2005 study, for example, Parkinson’s patients actually had a notable improvement in symptoms that rivaled medications and even brain surgery after being on a ketogenic diet for just twenty-eight days. 26 Specifically, consuming ketogenic fats (i.e., medium-chain triglycerides, or MCT oil) has been shown to impart significant improvement in cognitive function in Alzheimer’s patients. 27 Coconut oil, from which we derive MCTs, is a rich source of an important precursor molecule for beta-hydroxybutyrate and is a helpful approach to treating Alzheimer’s disease. 28 A ketogenic diet has also been shown to reduce amyloid in the brain, 29 and it increases glutathione, the body’s natural brain-protective antioxidant, in the hippocampus. 30 What’s more, it stimulates the growth of mitochondria and thus increases metabolic efficiency. 31

Dominic D’Agostino is a researcher in neuroscience, molecular pharmacology, and physiology at the University of South Florida. He has written extensively on the benefits of a ketogenic diet, and in my Empowering Neurologist interview with him he stated: “Research shows that ketones are powerful energy substrates for the brain and protect the brain by enhancing antioxidant defenses while suppressing inflammation. No doubt, this is why nutritional ketosis is something pharmaceutical companies are aggressively trying to replicate.” I have also done a lot of homework in understanding the brain benefits of ketosis—a metabolic state whereby the body burns fat for energy and creates ketones in the process. Put simply, your body is in a state of ketosis when it’s creating ketones for fuel instead of relying on glucose. And the brain loves it.

While science typically has looked at the liver as the main source of ketone production in human physiology, it is now recognized that the brain can also produce ketones in special cells called astrocytes. These ketone bodies are profoundly neuroprotective. They decrease free radical production in the brain, increase mitochondrial biogenesis, and stimulate production of brain-related antioxidants. Furthermore, ketones block the apoptotic pathway that would otherwise lead to self-destruction of brain cells.

Unfortunately, ketones have gotten a bad rap. I remember in my internship being awakened by a nurse to treat a patient in “diabetic ketoacidosis.” Physicians, medical students, and interns become fearful when challenged by a patient in such a state, and with good reason. It happens in insulin-dependent type 1 diabetics when not enough insulin is available to metabolize glucose for fuel. The body turns to fat, which produces these ketones in dangerously high quantities that become toxic as they accumulate in the blood. At the same time, there is a profound loss of bicarbonate, and this leads to significant lowering of the pH (acidosis). Typically, as a result, patients lose a lot of water due to their elevated blood sugars, and a medical emergency develops.

This condition is exceedingly rare, and again, it occurs in type 1 diabetics who fail to regulate their insulin levels. Our normal physiology has evolved to handle some level of ketones in the blood; in fact, we are fairly unique in this ability among our comrades in the animal kingdom, possibly because of our large brain-to-body weight ratio and the high energy requirements of our brain. At rest, 20 percent of our oxygen consumption is used by the brain, which represents only 2 percent of the human body. In evolutionary terms, the ability to use ketones as fuel when blood sugar was exhausted and liver glycogen was no longer available (during starvation) became mandatory if we were to survive and continue hunting and gathering. Ketosis proved to be a critical step in human evolution, allowing us to persevere during times of food scarcity. To quote Gary Taubes, “In fact, we can define this mild ketosis as the normal state of human metabolism when we’re not eating the carbohydrates that didn’t exist in our diets for 99.9 percent of human history. As such, ketosis is arguably not just a natural condition but even a particularly healthful one.” 32

There is a relationship between ketosis and calorie restriction, and the two can pack a powerful punch in terms of enhancing brain health. When you restrict calories (and carbs in particular) while upping fat intake, you trigger ketosis and increase levels of ketones in the blood. In 2012, when researchers at the University of Cincinnati randomly assigned twenty-three older adults with mild cognitive impairment to either a high-carbohydrate or very low-carbohydrate diet for six weeks, they documented remarkable changes in the low-carb group. 33 They observed not only improved verbal memory performance but also reductions in weight, waist circumference, fasting glucose, and fasting insulin. Now here’s the important point: “Ketone levels were positively correlated with memory performance.”

German researchers back in 2009 demonstrated in fifty healthy, normal to overweight elderly individuals that when calories were restricted along with a 20 percent increase in dietary fat, there was a measurable increase in verbal memory scores. 34 Another small study, yes, but their findings were published in the respected Proceedings of the National Academy of Sciences and spurred further research like that of the 2012 experiment. These individuals, compared to those who did not restrict calories, demonstrated improvements in their insulin levels and decline in their C-reactive protein, the infamous marker of inflammation. As expected, the most pronounced improvements were in people who adhered the most to the dietary challenge.

Research and interest in ketosis have exploded in recent years and will continue. The key to achieving ketosis, as we’ll see later in detail, is to severely cut carbs and increase dietary fat. It’s that simple. You have to be carb restricted if you want to reach this brain-blissful state.

* * *

Power Up Your Brain
by David Perlmutter and Alberto Villoldo

Your Brain’s Evolutionary Advantage

One of the most important features distinguishing humans from all other mammals is the size of our brain in proportion to the rest of our body. While it is certainly true that other mammals have larger brains, scientists recognize that larger animals must have larger brains simply to control their larger bodies. An elephant, for example, has a brain that weighs 7,500 grams, far larger than our 1,400-gram brain. So making comparisons about “brain power” or intelligence just based on brain size is obviously futile. Again, it’s the ratio of brain size to total body size that attracts scientists’ interest when considering the brain’s functional capacity. An elephant’s brain represents 1/550 of its body weight, while the human brain weighs 1/40 of the total body weight. So our brain represents 2.5 percent of our total body weight as opposed to the large-brained elephant whose brain is just 0.18 percent of its total body weight.
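Checking those ratios (my arithmetic, not the authors’): the two percentages follow directly from the fractions,

$$\frac{1}{40} = 0.025 = 2.5\% \qquad \frac{1}{550} \approx 0.0018 = 0.18\%$$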

But even more important than the fact that we are blessed with a lot of brain matter is the intriguing fact that, gram for gram, the human brain consumes a disproportionately huge amount of energy. While only representing 2.5 percent of our total body weight, the human brain consumes an incredible 22 percent of our body’s energy expenditure when at rest. This represents about 350 percent more energy consumption in relation to body weight compared with other anthropoids like gorillas, orangutans, and chimpanzees.

So it takes a lot of dietary calories to keep the human brain functioning. Fortunately, the very fact that we’ve developed such a large and powerful brain has provided us with the skills and intelligence to maintain adequate sustenance during times of scarcity and to make provisions for needed food supplies in the future. Indeed, the ability to conceive of and plan for the future is highly dependent upon the evolution not only of brain size but other unique aspects of the human brain.

It is a colorful image to conceptualize early Homo sapiens migrating across an arid plain and competing for survival among animals with smaller brains yet bigger claws and greater speed. But our earliest ancestors had one other powerful advantage compared to even our closest primate relatives. The human brain has developed a unique biochemical pathway that proves hugely advantageous during times of food scarcity. Unlike other mammals, our brain is able to utilize an alternative source of calories during times of starvation. Typically, we supply our brain with glucose from our daily food consumption. We continue to supply our brains with a steady stream of glucose (blood sugar) between meals by breaking down glycogen, a storage form of glucose primarily found in the liver and muscles.

But relying on glycogen stores provides only short-term availability of glucose. As glycogen stores are depleted, our metabolism shifts and we are actually able to create new molecules of glucose, a process aptly termed gluconeogenesis. This process involves the construction of new glucose molecules from amino acids harvested from the breakdown of protein primarily found in muscle. While gluconeogenesis adds needed glucose to the system, it does so at the cost of muscle breakdown, something less than favorable for a starving hunter-gatherer.

But human physiology offers one more pathway to provide vital fuel to the demanding brain during times of scarcity. When food is unavailable, after about three days the liver begins to use body fat to create chemicals called ketones. One ketone in particular, beta hydroxybutyrate (beta-HBA), actually serves as a highly efficient fuel source for the brain, allowing humans to function cognitively for extended periods during food scarcity.

Our unique ability to power our brains using this alternative fuel source helps reduce our dependence on gluconeogenesis and therefore spares amino acids and the muscles they build and maintain. Reducing muscle breakdown provides obvious advantages for the hungry Homo sapiens in search of food. It is this unique ability to utilize beta-HBA as a brain fuel that sets us apart from our nearest animal relatives and has allowed humans to remain cognitively engaged and, therefore, more likely to survive the famines ever-present in our history.

This metabolic pathway, unique to Homo sapiens, may actually serve as an explanation for one of the most hotly debated questions in anthropology: what caused the disappearance of our Neanderthal relatives? Clearly, when it comes to brains, size does matter. Why then, with a brain some 20 percent larger than our own, did Neanderthals suddenly disappear in just a few thousand years between 40,000 and 30,000 years ago? The party line among scientists remains fixated on the notion that the demise of Neanderthals was a consequence of their hebetude, or mental lethargy. The neurobiologist William Calvin described Neanderthals in his book, A Brain for All Seasons: “Their way of life subjected them to more bone fractures; they seldom survived until forty years of age; and while making tools similar to [those of] overlapping species, there was little [of the] inventiveness that characterizes behaviorally modern Homo sapiens.”

While it is convenient and almost dogmatic to accept that Neanderthals were “wiped out” by clever Homo sapiens, many scientists now believe that food scarcity may have played a more prominent role in their disappearance. Perhaps Neanderthals, lacking the biochemical pathway to utilize beta-HBA as a fuel source for brain metabolism, simply lacked the “mental endurance” to persevere. Relying on gluconeogenesis to power their brains would have led to more rapid breakdown of muscle tissue, ultimately compromising their ability to stalk prey or migrate to areas where plant food sources were more readily available. Their extinction may not have played out in direct combat with Homo sapiens but rather manifested as a consequence of a simple biochemical inadequacy.

Our ability to utilize beta-HBA as a brain fuel is far more important than simply a protective legacy of our hunter-gatherer heritage. George F. Cahill of Harvard Medical School stated, “Recent studies have shown that beta-hydroxybutyrate, the principal ‘ketone’ is not just a fuel, but a ‘superfuel’ more efficiently producing ATP energy than glucose. . . . It has also protected neuronal cells in tissue culture against exposure to toxins associated with Alzheimer’s or Parkinson’s.”

Indeed, well beyond serving as a brain superfuel, Dr. Cahill and other researchers have determined that beta-HBA has other profoundly positive effects on brain health and function. Essentially, beta-HBA is thought to mediate many of the positive effects of calorie reduction and fasting on the brain, including improved antioxidant function, increased mitochondrial energy production with an increase in mitochondrial population, increased cellular survival, and increased levels of BDNF leading to enhanced growth of new brain cells (neurogenesis).

Fasting

Earlier, we explored the need to reduce caloric intake in order to increase BDNF as a means to stimulate the growth of new brain cells as well as to enhance the function of existing neurons. The idea of substantially reducing daily calorie intake will not appeal to many people despite the fact that it is a powerful approach to brain enhancement as well as overall health.

Interestingly, however, many people find the idea of intermittent fasting to be more appealing. Fasting is defined here as a complete abstinence from food for a defined period of time at regular intervals—our fasting program permits the drinking of water. Research demonstrates that many of the same health-providing and brain-enhancing genetic pathways activated by calorie reduction are similarly engaged by fasting—even for relatively short periods of time. Fasting actually speaks to your DNA, directing your genes to produce an astounding array of brain-enhancement factors.

Not only does fasting turn on the genetic machinery for the production of BDNF, but it also powers up the Nrf2 pathway, leading to enhanced detoxification, reduced inflammation, and increased production of brain-protective antioxidants. Fasting causes the brain to shift away from using glucose as a fuel to a metabolism that consumes ketones. When the brain metabolizes ketones for fuel, even the process of apoptosis is reduced, while mitochondrial genes turn their attention to mitochondrial replication. In this way, fasting shifts the brain’s basic metabolism and specifically targets the DNA of mitochondria, thus enhancing energy production and paving the way for better brain function and clarity . . .

* * *

Ketone bodies mimic the life span extending properties of caloric restriction
by Richard L. Veech, Patrick C. Bradshaw, Kieran Clarke, William Curtis, Robert Pawlosky, and M. Todd King

Aging in man is accompanied by deterioration of a number of systems. Most notable are a gradual increase in blood sugar and blood lipids, increased narrowing of blood vessels, an increase in the incidence of malignancies, the deterioration and loss of elasticity in skin, loss of muscular strength and physiological exercise performance, deterioration of memory and cognitive performance, and in males decreases in erectile function. Many aging‐induced changes, such as the incidence of malignancies in mice [82], the increases in blood glucose and insulin caused by insulin resistance [39, 78], and the muscular weakness have been shown to be decreased by the metabolism of ketone bodies [18, 83], a normal metabolite produced from fatty acids by liver during periods of prolonged fasting or caloric restriction [12].

The unique ability of ketone bodies to supply energy to brain during periods of impairment of glucose metabolism makes ketosis an effective treatment for a number of neurological conditions which are currently without effective therapies. Impairment of cognitive function has also been shown to be improved by the metabolism of ketone bodies [84]. Additionally, Alzheimer’s disease, the major cause of which is aging [20], can be improved clinically by the induction of mild ketosis in a mouse model of the disease [85] and in humans [86]. Ketosis also improves function in Parkinson’s disease [87], which is thought to be largely caused by mitochondrial free radical damage [19, 88]. Ketone bodies are also useful in ameliorating the symptoms of amyotrophic lateral sclerosis [89]. It is also recognized that ketosis could have important therapeutic applications in a wide variety of other diseases [90], including Glut 1 deficiency, type I diabetes [91], obesity [78, 92], insulin resistance [20, 39, 93], and diseases of diverse etiology [90].

In addition to ameliorating a number of diseases associated with aging, the general deterioration of cellular systems independent of specific disease seems related to ROS toxicity and the inability to combat it. In contrast, increases in life span occur across a number of species with a reduction in function of the IIS pathway and/or an activation of the FOXO transcription factors, inducing expression of the enzymes required for free radical detoxification (Figs. 1 and 2). In C. elegans, these results have been accomplished using RNA interference or mutant animals. Similar changes should be achievable in higher animals, including humans, by the administration of d‐βHB itself or its esters.

In summary, decreased signaling through the insulin/IGF‐1 receptor pathway increases life span. Decreased insulin/IGF‐1 receptor activation leads to a decrease in PIP3, a decrease in the phosphorylation and activity of phosphoinositide‐dependent protein kinase (PDPK1), a decrease in the phosphorylation and activity of AKT, and a subsequent decrease in the phosphorylation of FOXO transcription factors, allowing them to continue to reside in the nucleus and to increase the transcription of the enzymes of the antioxidant pathway.

In mammals, many of these changes can be brought about by the metabolism of ketone bodies. The metabolism of ketones lowers blood glucose and insulin, thus decreasing the activity of the IIS and its attendant changes in the pathway described above. In addition, however, ketone bodies act as a natural inhibitor of class I HDACs, inducing FOXO gene expression and stimulating the synthesis of antioxidant and metabolic enzymes. An added important factor is that the metabolism of ketone bodies in mammals increases the reducing power of the NADP system, providing the thermodynamic drive to destroy oxygen free radicals, which are a major cause of the aging process.

* * *

Insights into human evolution from ancient and contemporary microbiome studies
by Stephanie L Schnorr, Krithivasan Sankaranarayanan, Cecil M Lewis, Jr, and Christina Warinner

Brain growth, development, and behavior

The human brain is our defining species trait, and its developmental underpinnings are key foci of evolutionary genetics research. Recent research on brain development and social interaction in both humans and animal models has revealed that microbes exert a major impact on cognitive function and behavioral patterns. For example, a growing consensus recognizes that cognitive and behavioral pathogenesis are often co-expressed with functional bowel disorders. This hints at a shared communication or effector pathway between the brain and gut, termed the gut-brain axis (GBA). The enteric environment is considered a third arm of the autonomic nervous system, and gut microbes produce more than 90% of the body’s serotonin (5-hydroxytryptamine or 5-HT). Factors critical to learning and plasticity such as serotonin, γ-aminobutyric acid (GABA), short chain fatty acids (SCFAs), and brain derived neurotrophic factor (BDNF), which train amygdala and hippocampal reactivity, can be mediated through gut-brain chemical signals that cross-activate bacterial and host receptors. Probiotic treatment is associated with positive neurological changes in the brain such as increased BDNF, altered expression of GABA receptors, increased circulating glutathione, and a reduction in inflammatory markers. This implicates the gut microbiome in early emotional training as well as in affecting long-term cognitive plasticity.

Critically, gut microbiota can modulate synthesis of metabolites affecting gene expression for myelin production in the prefrontal cortex (PFC), presumably influencing the oligodendrocyte transcriptome. Prosocial and risk associated behavior in probiotic treated mice, a mild analog for novelty-seeking and risk-seeking behaviors in humans, suggests a potential corollary between entrenched behavioral phenotypes and monoamines (serotonin and dopamine) produced by the gut microbiota. Evolutionary acceleration of the human PFC metabolome divergence from chimpanzees, particularly the dopaminergic synapse, reifies the notion that an exaggerated risk-reward complex characterizes human cognitive differentiation, which is facilitated by microbiome derived bioactive compounds. Therefore, quintessentially human behavioral phenotypes in stress, anxiety, and novelty-seeking are additionally reinforced by microbial production of neuroactive compounds. As neurological research expands to include the microbiome, it is increasingly clear that host–microbe interactions have likely played an important role in human brain evolution and development.

Ancient human microbiomes
by Christina Warinner, Camilla Speller, Matthew J. Collins, and Cecil M. Lewis, Jr

Need for paleomicrobiology data

Although considerable effort has been invested in characterizing healthy gut and oral microbiomes, recent investigations of rural, non-Western populations have raised questions about whether the microbiota we currently define as normal have been shaped by recent influences of modern Western diet, hygiene, antibiotic exposure, and lifestyle. The process of industrialization has dramatically reduced our direct interaction with natural environments and fundamentally altered our relationship with food and food production. Situated at the entry point of our food, and the locus of food digestion, the human oral and gut microbiomes have evolved under conditions of regular exposure to a diverse range of environmental and zoonotic microbes that are no longer present in today’s globalized food chain. Additionally, the foods themselves have changed from the wild natural products consumed by our hunter-gatherer ancestors to today’s urban supermarkets stocked with an abundance of highly processed Western foodstuffs containing artificially enriched levels of sugar, oil, and salt, not to mention antimicrobial preservatives, petroleum-based colorants, and numerous other artificial ingredients. This dietary shift has altered selection pressure on our microbiomes. For example, under the ‘ecological plaque hypothesis’, diseases such as dental caries and periodontal disease are described as oral ecological catastrophes of cultural and lifestyle choices.

Although it is now clear that the human microbiome plays a critical role in making us human, in keeping us healthy, and in making us sick, we know remarkably little about the diversity, variation, and evolution of the human microbiome both today and in the past. Instead, we are left with many questions: When and how did our bacterial communities become distinctly human? And what does this mean for our microbiomes today and in the future? How do we acquire and transmit microbiomes and to what degree is this affected by our cultural practices and built environments? How have modern Western diets, hygiene practices, and antibiotic exposure impacted ‘normal’ microbiome function? Are we still in mutualistic symbiosis with our microbiomes, or are the so-called ‘diseases of civilization’ – heart disease, obesity, type II diabetes, asthma, allergies, osteoporosis – evidence that our microbiomes are out of ecological balance and teetering on dysbiosis? At an even more fundamental level, who are the members of the human microbiome, how did they come to inhabit us, and how long have they been there? Who is ‘our microbial self’?

Studies of remote and indigenous communities and crowdsourcing projects such as the American Gut (www.americangut.org), the Earth Microbiome Project (www.earthmicrobiome.org), and uBiome (www.uBiome.com) are attempting to characterize modern microbiomes across a range of contemporary environments. Nevertheless, even the most extensive sampling of modern microbiota will provide limited insight into Pre-Industrial microbiomes. By contrast, the direct investigation of ancient microbiomes from discrete locations and time points in the past would provide a unique view into the coevolution of microbes and hosts, host microbial ecology, and changing human health states through time. […]

Diet also plays a role in shaping the composition of oral microbiomes, most notably by the action of dietary sugar in promoting the growth of cariogenic bacteria such as lactobacilli and S. mutans. Two recent papers have proposed that cariogenic bacteria, such as S. mutans, were absent in pre-Neolithic human populations, possibly indicating low carbohydrate diets, while evolutionary genomic analyses of S. mutans suggest an expansion in this species approximately 10,000 years ago, coinciding with the onset of agriculture. […]

Ancient microbiome research provides an additional pathway to understanding human biology that cannot be achieved by studies of extant individuals and related species alone. Although reconstructing the ancestral microbiome by studying our ancestors directly is not without challenges, this approach provides a more direct picture of human-microbe coevolution. Likewise, ancient microbiome sources may reveal to what extent bacteria commonly considered ‘pathogenic’ in the modern world (for example, H. pylori) were endemic indigenous organisms in pre-Industrial microbiomes.

The three paths to reconstructing the ancestral microbiomes are also complementary. For example, analyses of the gut microbiome from extant, rural peoples in Africa and South America have revealed the presence of a common, potentially commensal, spirochete belonging to the genus Treponema. Such spirochetes have also been detected in extant hunter-gatherers, and in 1,000-year-old human coprolites from Mexico, but they are essentially absent from healthy urban populations, and they have not been reported in the gut microbiome of chimpanzees. These multiple lines of evidence suggest that this poorly understood spirochete is a member of the ancestral human microbiome, yet not necessarily the broader primate microbiome. Future coprolite research may be able to answer the question of how long this microbe has co-associated with humans, and what niche it fills.

The Latest in Darwinian Medical Science: 12 Noteworthy New Papers That Shed Evolutionary Light on Health and Disease
by Eirik Garnas

Exposure of the Host-Associated Microbiome to Nutrient-Rich Conditions May Lead to Dysbiosis and Disease Development—an Evolutionary Perspective

The gist of what the paper is about:

We here propose that overfeeding of the host-associated bacterial community, particularly with easily digestible, energy-dense, low-fiber-content foods, likely causes dysbiosis and the development of disease. Overfeeding uncouples natural host-microbe associations, leading to an increased activity and changed functionality of the associated microbiota.


My comments:
This paper sheds light on an important yet often overlooked facet of the recent assault on the human microbiome: the effects of overfeeding. While it’s widely recognized that fast food doesn’t do the microbiome any good, it’s somewhat underappreciated that changes to our meal pattern and total caloric intake have also undermined human-microbe relations.

Whereas our primal ancestors often did things like chasing game on an empty stomach, many humans of the 21st century could be said to be almost constantly in a fed, as opposed to fasted, state. This paper helps draw attention to the microbiome-related implications of this mismatch and suggests that many of the health benefits of intermittent fasting are mediated by gut bacteria.

* * *

Ketogenic Diet and Neurocognitive Health
Spartan Diet
The Agricultural Mind
Malnourished Americans

Can Ketogenic Diets Work for Bodybuilding or Athletics?
by P. D. Mangan

Here’s how I’d summarize the ‘keto for sports’ evidence so far:

  • The longer the study…or the longer its keto-adaptation phase…or the more keto-adapted the subjects are…the more likely the study is to find favorable performance results
  • Keto is worth trying for anyone in any sport (but start in the off-season!)
  • It’s highly unlikely keto is better for high-intensity
  • It’s unlikely that keto is bad for high-intensity
  • It’s likely that keto is neutral for high-intensity
  • It’s likely that keto diets are better for endurance
  • It’s very likely keto diets are better for body composition
  • It’s very likely keto diets are generally healthier than standard high-carb diets for athletes

Neuroscientist Shows What Fasting Does To Your Brain & Why Big Pharma Won’t Study It
by Arjun Walia

Does Fasting Make You Smarter?
by Derek Beres

Fasting Cleans the Brain
by P. D. Mangan

How Fasting Heals Your Brain
by Adriana Ayales

Effect of Intermittent Fasting on Brain Neurotransmitters, Neutrophils Phagocytic Activity, and Histopathological Finding in Some Organs in Rats
by Sherif M. Shawky, Anis M. Zaid, Sahar H. Orabi, Khaled M. Shoghy, and Wafaa A. Hassan

The Effects of Fasting During Ramadan on the Concentration of Serotonin, Dopamine, Brain-Derived Neurotrophic Factor and Nerve Growth Factor
by Abdolhossein Bastani, Sadegh Rajabi, and Fatemeh Kianimarkani

Gut microbiome, SCFAs, mood disorders, ketogenic diet and seizures
by Jonathan Miller

Study: Ketogenic diet appears to prevent cognitive decline in mice
by University of Kentucky

Low-carb Diet Alleviates Inherited Form of Intellectual Disability in Mice
by Johns Hopkins Medicine

Ketogenic Diet Protects Against Alzheimer’s Disease by Keeping Your Brain Healthy and Youthful
by Joseph Mercola

The Gut Microbiota Mediates the Anti-Seizure Effects of the Ketogenic Diet
by C. A. Olson

Is the Keto Diet Bad for the Microbiome?
by David Jockers

Does a Ketogenic Diet Change Our Microbiome?
by Christie Rice

Can Health Issues Be Solved By Dietary Changes Altering the Microbiome?
by Russ Schierling

Some Benefits of Intermittent Fasting are Mediated by the Gut Microbiome
by Fight Aging!

RHR: Is High Fat Healthy for the Gut Microbiota?
by Chris Kresser

A Comprehensive List of Low Carb Research
by Sarah Hallberg

Randomised Controlled Trials Comparing Low-Carb Diets Of Less Than 130g Carbohydrate Per Day To Low-Fat Diets Of Less Than 35% Fat Of Total Calories
from Public Health Collaboration

Carcinogenic Grains

In understanding human health, we have to look at all factors as a package deal. Our gut-brain is a system, as is our entire mind-body. Our relationships, lifestyle, the environment around us — all of it is inseparable. This is true even if we limit ourselves to diet alone. It’s not simply calories in/calories out, macronutrient ratios, or anything else along these lines. It is the specific foods eaten in combination with which other foods and in the context of stress, toxins, epigenetic inheritance, gut health, and so much else that determine what effects manifest in the individual.

There are numerous examples of this. But I’ll stick to a simple one, which involves several factors and the relationship between them. First, red meat is associated with cancer and heart disease. Yet causation is hard to prove, as red meat consumption is associated with many other foods in the standard American diet, such as added sugars and vegetable oils in processed foods. The association might be based on confounding factors that are culture-specific, which can explain why we find societies with heavy meat consumption and little cancer.

So, what else might be involved? We have to consider what red meat is being eaten with, at least in the standard American diet that is used as a control in most research. There are, of course, the added sugars and vegetable oils — they are seriously bad for health and may explain much of the confusion. Saturated fat intake has been dropping since the early 1900s and, in its place, there has been a steady rise in the use of vegetable oils; we now know that highly heated and hydrogenated vegetable oils do severe damage. Also, some of the original research that blamed saturated fat, when re-analyzed, found that sugar correlated more strongly with heart disease.

Saturated fat, like cholesterol, was wrongly accused. This misunderstanding has, over multiple generations at this point, contributed to the early deaths of hundreds of millions of people worldwide, as dozens of the wealthiest and most powerful countries enforced it in their official dietary recommendations, which transformed the world’s food system. Similar to eggs, red meat became the fall guy.

Such things as heart disease are related to obesity, and conventional wisdom tells us that fat makes us fat. Is that true? Not exactly or directly. I was amused to discover that a scientific report commissioned by the British government in 1846 (Experimental Researches on the Food of Animals, and the Fattening of Cattle: With Remarks on the Food of Man. Based Upon Experiments Undertaken by Order of the British Government by Robert Dundas Thomson) concluded that “The present experiments seem to demonstrate that the fat of animals cannot be produced from the oil of the food” — fat doesn’t make people fat. And that low-carb, meat-eating populations tend to be slim has been observed for centuries.

So, in most cases, what does cause fat accumulation? It is fat combined with plenty of carbs and sugar that reliably makes us fat; that is to say, fat in the presence of glucose, since the two compete as fuel sources.

Think about what an American meal with red meat looks like. A plate might have a steak with some rolls or slices of bread, combined with a potato and maybe some starchy ‘vegetables’ like corn, peas, or lima beans. Or there will be a hamburger with a bun, a side of fries, and a large sugary drink (‘diet’ drinks are no better, as we now know artificial sweeteners fool the body and so are just as likely to make you fat and diabetic). What is the common factor? Red meat combined with wheat or some other grain, as part of a diet drenched in carbs and sugar (and all of it cooked in or slathered with vegetable oils).

Most Americans have a far greater total intake of carbs, sugar, and vegetable oils than of red meat and saturated fat. The preferred meat of Americans these days is chicken, with fish also being popular. Why do red meat and saturated fat continue to be blamed for the worsening rates of heart disease and metabolic disease? It’s simply not rational, based on the established facts in the field of diet and nutrition. That isn’t to claim that too much red meat couldn’t be problematic. It depends on the total diet. Also, Americans have the habit of grilling their red meat, and grilling increases carcinogens. That could be avoided by not charring one’s meat, but the same applies to not burning (or frying) anything one eats, including white meat and plant foods. In terms of this one factor, you’d be better off eating beef roasted with vegetables than going with a plant-based meal that includes foods like french fries, fried okra, grilled vegetable shish kabobs, etc.

Considering all of that, what exactly is the cause of cancer that keeps showing up in epidemiological studies? Sarah Ballantyne has some good answers to that (see quoted passage below). It’s not so much about red meat itself as it is about what red meat is eaten with. The crux of the matter is that Americans eat more starchy carbs, mostly refined flour, than they do vegetables. What Ballantyne explains is that two of the potential causes of cancer associated with red meat only occur in a diet deficient in vegetables and abundant in grains. It is the total diet as seen in the American population that is the cause of high rates of cancer.

Just as a heavy meat diet without grains is not problematic, a heavy carb diet without grains is not necessarily problematic either. Some of the healthiest populations eat lots of carbs like sweet potatoes, but you won’t find any healthy population that eats as many grains as Americans do. There are many issues with grains considered in isolation (read the work of David Perlmutter or any number of writers on the paleo diet), but it is grains combined with certain other foods that particularly contribute to health concerns.

Then again, some of this is about proportion. For most of the history of agriculture, humans ate small amounts of grains as an occasional food. Grains tended to be stored for hard times or for trade, or else turned into alcohol to be mixed with water from unclean sources. The shift to large amounts of grains made into refined flour is an evolutionarily unique dilemma our bodies aren’t designed to handle. The first accounts of white bread are found in texts from slightly over two millennia ago, and most Westerners couldn’t afford white bread until the past few centuries, when industrialized milling began. Before that, people tended to eat foods that were available and didn’t mix them as much (e.g., eating fruits and vegetables in season). Hamburgers were invented only about a century ago. The constant combining of red meat and grains is not something we are adapted for. It perhaps shouldn’t surprise us that harm to our health results.

Red meat can be a net loss to health or a net gain. It depends not on the red meat, but what is and isn’t eaten with it. Other factors matter as well. Health can’t be limited to a list of dos and don’ts, even if such lists have their place in the context of more detailed knowledge and understanding. The simplest solution is to eat as most humans ate for hundreds of thousands of years, and more than anything else that means avoiding grains. Even without red meat, many people have difficulties with grains.

Let’s return to the context of evolution. Hominids have been eating fatty red meat for millions of years (early humans having prized red meat from blubbery megafauna until their mass extinction), and yet meat-eating hunter-gatherers rarely get cancer, heart disease, or any of the other modern ailments. How long ago did the first humans eat grains? About 12 thousand years ago. Most humans on the planet never touched a grain until the past few millennia. And fewer still included grains with almost every snack and meal until the past few generations. So, what is this insanity of government dietary recommendations putting grains at the base of the food pyramid? Those grains are feeding the cancer-promoting microbes, and doing much else that is harmful.

In conclusion, is red meat bad for human health? It depends. Red meat that is charred or heavily processed, combined with wheat and other carbs, lots of sugar and vegetable oils, and few nutritious vegetables? That would be a shitty diet that will inevitably lead to horrible health consequences. Then again, the exact same diet minus the red meat would still be a recipe for disease and early death. Yet under other conditions, red meat can be part of a healthy diet. Even a ton of pasture-raised red meat (with plenty of nutrient-dense organ meats), combined with an equal amount of organic vegetables (grown on healthy soil, bought locally, and eaten in season), to the exclusion of grains, especially refined flour, and with limited intake of all the other crap, would be one of the healthiest diets you could eat.

On the other hand, if you are addicted to grains, as many are, and can’t imagine a world without them, you would be wise to avoid red meat entirely. Assuming you have any concerns about cancer, you should choose one or the other but not both. I would note, though, that there are many other reasons to avoid grains, while there are no other known serious health reasons to avoid red meat, although some people exclude it for other reasons, such as digestion issues. The point is that, as long as we separate out grains, whether or not you eat red meat is a personal choice (based on taste, ethics, etc.), not so much a health choice. That is all we can say for certain based on present scientific knowledge.

* * *

We’ve known about this for years now. Isn’t it interesting that no major health organization, scientific institution, corporate news outlet, or government agency has ever warned the public about the risk factors of carcinogenic grains? Instead, we get major propaganda campaigns to eat more grains, because that is where the profit is for big ag, big food, and big oil (which makes the farm chemicals and transports the products of big ag and big food). How convenient! It’s nice to know that corporate profit is more important than public health.

But keep listening to those who tell you that cows are destroying the world, even though there are fewer cows in North America than there once were buffalo. Yeah, monocultural GMO crops immersed in deadly chemicals that destroy soil and deplete nutrients are going to save us, not the kind of grazing land that has existed for tens of millions of years. So, sure, we could go on producing massive yields of grains in a utopian fantasy beloved by technocrats and plutocrats that further disconnects us from the natural world and our evolutionary origins: an industrial food system dependent on turning the whole world into endless monocrops denatured of all other life, making entire regions into ecological deserts that push us further into mass extinction. Or we could return to traditional ways of farming and living, with a more traditional diet largely of animal foods (meat, fish, eggs, dairy, etc.) balanced with an equal amount of vegetables, the original hunter-gatherer diet.

Our personal health is important. And it is intimately tied to the health of the earth. Civilization as we know it was built on grains. That wasn’t necessarily a problem when grains were a small part of the diet and populations were small. But is it still a sustainable socioeconomic system as part of a healthy ecological system? No, it isn’t. So why do we continue to do more of the same that caused our problems in the hope that it will solve our problems? As we think about how different parts of our diet work together to create conditions of disease or health, we need to begin thinking this way about our entire world.

* * *

Paleo Principles
by Sarah Ballantyne

While this often gets framed as an argument for going vegetarian or vegan, it’s actually a reflection of the importance of eating plenty of plant foods along with meat. When we take a closer look at these studies, we see something extraordinarily interesting: the link between meat and cancer tends to disappear once the studies adjust for vegetable intake. Even more exciting, when we examine the mechanistic links between meat and cancer, it turns out that many of the harmful (yes, legitimately harmful!) compounds of meat are counteracted by protective compounds in plant foods.

One major mechanism linking meat to cancer involves heme, the iron-containing compound that gives red meat its color (in contrast to the nonheme iron found in plant foods). Where heme becomes a problem is in the gut: the cells lining the digestive tract (enterocytes) metabolize it into cytotoxic compounds (meaning toxic to living cells), which can then damage the gut barrier (specifically the colonic mucosa; see page 67), cause cell proliferation, and increase fecal water toxicity—all of which raise cancer risk. Yikes! In fact, part of the reason red meat is linked with cancer far more often than with white meat could be due to their differences in heme content; white meat (poultry and fish) contains much, much less.

Here’s where vegetables come to the rescue! Chlorophyll, the pigment in plants that makes them green, has a molecular structure that’s very similar to heme. As a result, chlorophyll can block the metabolism of heme in the intestinal tract and prevent those toxic metabolites from forming. Instead of turning into harmful by-products, heme ends up being metabolized into inert compounds that are no longer toxic or damaging to the colon. Animal studies have demonstrated this effect in action: one study on rats showed that supplementing a heme-rich diet with chlorophyll (in the form of spinach) completely suppressed the pro-cancer effects of heme. All the more reason to eat a salad with your steak.

Another mechanism involves L-carnitine, an amino acid that’s particularly abundant in red meat (another candidate for why red meat seems to disproportionately increase cancer risk compared to other meats). When we consume L-carnitine, our intestinal bacteria metabolize it into a compound called trimethylamine (TMA). From there, the TMA enters the bloodstream and gets oxidized by the liver into yet another compound, trimethylamine-N-oxide (TMAO). This is the one we need to pay attention to!

TMAO has been strongly linked to cancer and heart disease, possibly due to promoting inflammation and altering cholesterol transport. Having high levels of it in the bloodstream could be a major risk factor for some chronic diseases. So is this the nail in the coffin for meat eaters?

Not so fast! An important study on this topic published in 2013 in Nature Medicine sheds light on what’s really going on. This paper had quite a few components, but one of the most interesting has to do with gut bacteria. Basically, it turns out that the bacteria group Prevotella is a key mediator between L-carnitine consumption and having high TMAO levels in our blood. In this study, the researchers found that participants with gut microbiomes dominated by Prevotella produced the most TMA (and therefore TMAO, after it reached the liver) from the L-carnitine they ate. Those with microbiomes high in Bacteroides rather than Prevotella saw dramatically less conversion to TMA and TMAO.

Guess what Prevotella loves to snack on? Grains! It just so happens that people with high Prevotella levels tend to be those who eat grain-based diets (especially whole grain), since this bacterial group specializes in fermenting the type of polysaccharides abundant in grain products. (For instance, we see extremely high levels of Prevotella in populations in rural Africa that rely on cereals like millet and sorghum.) At the same time, Prevotella doesn’t seem to be associated with a high intake of non-grain plant sources, such as fruit and vegetables.

So is it really the red meat that’s a problem . . . or is it the meat in the context of a grain-rich diet? Based on the evidence we have so far, it seems that grains (and the bacteria that love to eat them) are a mandatory part of the L-carnitine-to-TMAO pathway. Ditch the grains, embrace veggies, and our gut will become a more hospitable place for red meat!

* * *

Georgia Ede has a detailed article about the claim of meat causing cancer. In it, she provides several useful summaries of and quotes from the scientific literature.

WHO Says Meat Causes Cancer?

In November 2013, 23 cancer experts from eight countries gathered in Norway to examine the science related to colon cancer and red/processed meat. They concluded:

“…the interactions between meat, gut and health outcomes such as CRC [colorectal cancer] are very complex and are not clearly pointing in one direction….Epidemiological and mechanistic data on associations between red and processed meat intake and CRC are inconsistent and underlying mechanisms are unclear…Better biomarkers of meat intake and of cancer occurrence and updated food composition databases are required for future studies.” 1) To read the full report: http://www.ncbi.nlm.nih.gov/pubmed/24769880 [open access]

Translation: we don’t know if meat causes colorectal cancer. Now THAT is a responsible, honest, scientific conclusion.

How the WHO?

How could the WHO have come to such a different conclusion than this recent international gathering of cancer scientists? As you will see for yourself in my analysis below, the WHO made the following irresponsible decisions:

  1. The WHO cherry-picked studies that supported its anti-meat conclusions, ignoring those that showed either no connection between meat and cancer or even a protective effect of meat on colon cancer risk. These neutral and protective studies were specifically mentioned within the studies cited by the WHO (which makes one wonder whether the WHO committee members actually read the studies referenced in its own report).
  2. The WHO relied heavily on dozens of “epidemiological” studies (which by their very nature are incapable of demonstrating a cause and effect relationship between meat and cancer) to support its claim that meat causes cancer.
  3. The WHO cited a mere SIX experimental studies suggesting a possible link between meat and colorectal cancer, four of which were conducted by the same research group.
  4. THREE of the six experimental studies were conducted solely on RATS. Rats are not humans and may not be physiologically adapted to high-meat diets. All rats were injected with powerful carcinogenic chemicals prior to being fed meat. Yes, you read that correctly.
  5. Only THREE of the six experimental studies were human studies. All were conducted with a very small number of subjects and were seriously flawed in more than one important way. Examples of flaws include using unreliable or outdated biomarkers and/or failing to include proper controls.
  6. Some of the theories put forth by the WHO about how red/processed meat might cause cancer are controversial or have already been disproved. These theories were discredited within the texts of the very same studies cited to support the WHO’s anti-meat conclusions, again suggesting that the WHO committee members either didn’t read these studies or deliberately omitted information that didn’t support the WHO’s anti-meat position.

Does it matter whether the WHO gets it right or wrong about meat and cancer? YES.

“Strong media coverage and ambiguous research results could stimulate consumers to adapt a ‘safety first’ strategy that could result in abolishment of red meat from the diet completely. However, there are reasons to keep red meat in the diet. Red meat (beef in particular) is a nutrient dense food and typically has a better ratio of N6:N3-polyunsaturated fatty acids and significantly more vitamin A, B6 and B12, zinc and iron than white meat (compared values from the Dutch Food Composition Database 2013, raw meat). Iron deficiencies are still common in parts of the populations in both developing and industrialized countries, particularly pre-school children and women of childbearing age (WHO)… Red meat also contains high levels of carnitine, coenzyme Q10, and creatine, which are bioactive compounds that may have positive effects on health.” 2)

The bottom line is that there is no good evidence that unprocessed red meat increases our risk for cancer. Fresh red meat is a highly nutritious food which has formed the foundation of human diets for nearly two million years. Red meat is a concentrated source of easily digestible, highly bioavailable protein, essential vitamins and minerals. These nutrients are more difficult to obtain from plant sources.

It makes no sense to blame an ancient, natural, whole food for the skyrocketing rates of cancer in modern times. I’m not interested in defending the reputation of processed meat (or processed foods of any kind, for that matter), but even the science behind processed meat and cancer is unconvincing, as I think you’ll agree. […]

Regardless, even if you believe in the (non-existent) power of epidemiological studies to provide meaningful information about nutrition, more than half of the 29 epidemiological studies did NOT support the WHO’s stance on unprocessed red meat and colorectal cancer.

It is irresponsible and misleading to include this random collection of positive and negative epidemiological studies as evidence against meat.

The following quote is taken from one of the experimental studies cited by the WHO. The authors of the study begin their paper with this striking statement:

“In puzzling contrast with epidemiological studies, experimental studies do not support the hypothesis that red meat increases colorectal cancer risk. Among the 12 rodent studies reported in the literature, none demonstrated a specific promotional effect of red meat.” 3)

[Oddly enough, none of these twelve “red meat is fine” studies, which the authors went on to list and describe within the text of the introduction to this article, were included in the WHO report].

I cannot emphasize enough how common it is to see statements like this in scientific papers about red meat. Over and over again, researchers see that epidemiology suggests a theoretical connection between some food and some health problem, so they conduct experiments to test the theory and find no connection. This is why our nutrition headlines are constantly changing. One day eggs are bad for you, the next day they’re fine. Epidemiologists are forever sending well-intentioned scientists on time-consuming, expensive wild goose chases, trying to prove that meat is dangerous, when all other sources — from anthropology to physiology to biochemistry to common sense — tell us that meat is nutritious and safe.

* * *

Below is a good discussion between Dr. Steven Gundry and Dr. Paul Saladino. It’s an uncommon dialogue. Even though Gundry is known for warning against the harmful substances in plant foods, he has shifted toward a plant-based diet, also warning against too many animal foods, or at least too much protein (an issue involving IGF-1 that is not relevant to this post). As for Saladino, he is a carnivore and so takes Gundry’s argument against plants to a whole other level. Saladino sees no problem with meat, of course. And his view contradicts what Gundry writes about in his most recent book, The Longevity Paradox.

Anyway, they got onto the topic of TMAO. Saladino points out that fish has more fully formed TMAO than red meat produces in combination with grain-loving Prevotella. Even vegetables produce TMAO. So, why is beef being scapegoated? It’s pure ignorant idiocy. To further this point, Saladino explained that he has tested the microbiome of patients of his on the carnivore diet and it comes up low on the Prevotella bacteria. He doesn’t think TMAO is the danger people claim it is. But even if it were, the single safest diet might be the carnivore diet.

Gundry didn’t even disagree. He pointed out that he had tested patients of his who are long-term vegans, now in their 70s. They had extremely high levels of TMAO. He sent their lab results to the Cleveland Clinic for an opinion. The experts there refused to believe that it was possible and so dismissed the evidence. That is the power of dietary ideology when it forms a self-enclosed reality tunnel. Red meat is bad and vegetables are good. The story changes over time. It’s the saturated fat. No, it’s the TMAO. Then it will be something else. Always looking for a rationalization to uphold the preferred dogma.

* * *

7/25/19 – Additional thoughts: There is always new research coming out. And as is typical, it is often contradictory. It is hard to know what is being studied exactly. The most basic understanding in mainstream nutrition right now seems to be that red meat is associated with TMAO by way of carnitine and Prevotella (Studies reveal role of red meat in gut bacteria, heart disease development). But there are many assumptions being made. This research tends to be epidemiological/observational, and so most factors aren’t being controlled.

Worse still, they aren’t comparing the equivalent extremes: not veganism vs. carnivory but veganism and vegetarianism vs. omnivory. That leaves out the even greater complicating factor that, as the data shows, a significant number of vegans and vegetarians occasionally eat animal foods. There really aren’t that many long-term vegans and vegetarians to study, because 80% of people who start the diet quit it, and of the remaining 20% few are consistent.

As for omnivores, they are a diverse group that could include hundreds of dietary variations. One variety of omnivory is the paleo diet, a slightly restricted omnivory in that grains are excluded, often along with legumes, white potatoes, dairy, added sugar, etc. One study of the paleo diet found higher levels of TMAO, and the focus, rather than cancer, was on cardiovascular disease (Heart disease biomarker linked to paleo diet).

So, that must mean the paleo diet is bad, right? When people think of the paleo diet, they think of a caveman lugging a big hunk of meat. But the reality is that the standard paleo diet, although including red meat, emphasizes fish and heaping platefuls of vegetables. Why is red meat getting blamed? In a bizarre twist, the lead researcher of the paleo study, Dr. Angela Genoni, thought the problem was the lack of grains. But it is precisely grains that the TMAO-producing Prevotella gut bacteria love so much. How could reducing grains increase TMAO? No explanation was offered. Before we praise grains, why not look at the sub-populations of vegans, vegetarians, fruitarians, etc. who also avoid grains?

There is a more rational and probable factor. It turns out that fish and vegetables raise TMAO levels higher than red meat does (Eat your vegetables (and fish): Another reason why they may promote heart health). This solves the mystery of why some of Dr. Gundry’s vegan patients had high TMAO levels. Yet, in another bizarre leap of logic, the same TMAO that is used to castigate red meat is suddenly portrayed as healthy, reducing cardiovascular risk, when it comes from sources other than red meat. It is the presence of red meat that somehow magically transforms TMAO into an evil substance that will kill you. Or maybe, just maybe, it has nothing directly to do with TMAO alone.

After a long and detailed analysis of the evidence, Dr. Georgia Ede concluded that, “As far as I can tell, the authors’ theory that red meat provides carnitine for bacteria to transform into TMA which our liver then converts to TMAO, which causes our macrophages to fill up with cholesterol, block our arteries, and cause heart attacks is just that–a theory–full of sound and fury, signifying nothing” (Does Carnitine from Red Meat Cause Heart Disease?).

 

Spartan Diet

There are a number of well-known low-carb diets. The most widely cited is that of the Inuit, but the Masai are often mentioned as well. I came across another example in Jack Weatherford’s Genghis Khan and the Making of the Modern World (see here for earlier discussion).

Mongols lived off of meat, blood, and milk paste. This diet, as the Chinese observed, allowed the Mongol warriors to ride and fight for days on end without needing to stop for meals. Part of this is because they could eat while riding, but there is a more fundamental factor. This diet is so low-carb as to be ketogenic. And long-term ketosis leads to fat-adaptation, which allows for high energy and stamina even without meals, as long as one has enough fat reserves (i.e., body fat). The feast-and-fast style of eating is common among non-agriculturalists.

There are other historical examples I haven’t previously researched. Ori Hofmekler, in The Warrior Diet, claims that Spartans and Romans ate in a brief period each day, about a four-hour window, because of the practice of having a communal meal once a day. This basically meant fasting for lengthy periods, although today it is often described as time-restricted eating. As I recall, Sikh monks have a similar practice of eating only one meal a day, during which they are free to eat as much as they want. The trick to this diet is that it decreases overall food intake and keeps the body in ketosis more often — if starchy foods are restricted enough and the body is fat-adapted, this lessens hunger and cravings.

The Mongols may have been doing something similar. The thing about ketosis is that your desire to snack all the time simply goes away. You don’t have to force yourself into food deprivation, and it isn’t starvation, even when going without food for several days. As long as there is plenty of body fat and you are fat-adapted, the body maintains health, energy, and mood just fine until the next big meal. Even non-warrior societies do this. The meat-loving and blubber-gluttonous Inuit don’t tolerate aggression in the slightest, and they certainly aren’t known for amassing large armies and going on military campaigns. Or consider the Piraha, who are largely pacifists, banishing their own members if they kill another person, even someone from another tribe. The Piraha get about 70% of their diet from fish and other meat, which is to say a ketogenic diet. Plus, even though surrounded by lush forests filled with a wide variety of foods, plant and animal, the Piraha regularly choose not to eat — sometimes for no particular reason, but also sometimes when doing communal dances over multiple days.

So, I wouldn’t be surprised if Spartan and Roman warriors had similar practices, especially the Spartans, who didn’t farm much (the grains grown by the Spartans’ slaves were likely most often fed to the slaves, not as much to the ruling Spartans). As for the Romans, their diet probably became more carb-centric as Rome grew into an agricultural empire. But early on, in the days of the Roman Republic, Romans were probably like Spartans in the heavy focus they would have put on raising cattle and hunting game. Still, a diet doesn’t have to be heavy in fatty meat to be ketogenic, as long as it involves some combination of calorie restriction, portion control, narrow periods of meals, intermittent fasting, etc. — all being other ways of lessening the total intake of starchy foods.

One of the most common meals for Spartans was a blood and bone broth made from boiled pork mixed with salt and vinegar, thick in consistency and black in color. That would have included a lot of fat, fat-soluble vitamins, minerals, collagen, electrolytes, and much else. It was a nutrient-dense elixir of health, however horrible it may seem to the modern palate. And it probably was low-carb, depending on what else might’ve been added to it. Even the wine Spartans drank was watered down, as drunkenness was frowned upon. The purpose was probably more to kill unhealthy microbes in the water, as with the watered-down beer of early Americans millennia later, and so it would have added little sugar to the diet. Like the Mongols, they also enjoyed dairy. And they did have some grains, such as bread, but apparently it was never a staple of their diet.

One thing they probably ate little of was olive oil, assuming it was used at all, as it was rarely mentioned in ancient texts and only became popular among Greeks in recent history, specifically the past century (discussed by Nina Teicholz in The Big Fat Surprise). Instead, Spartans, as with most other early Greeks, would have preferred animal fat, mostly lard in the case of the Spartans, whereas many other, less landlocked Greeks preferred fish. Other foods the ancient Greeks, Spartan and otherwise, lacked were tomatoes, later introduced from the New World, and noodles, later introduced from China, both arriving during the colonial era of recent centuries. So, a traditional Greek diet would have looked far different from what we think of as the modern ‘Mediterranean diet’.

On top of that, Spartans were proud of eating very little and proud of their ability to fast. Plutarch (2nd century AD) writes in Parallel Lives: “For the meals allowed them are scanty, in order that they may take into their own hands the fight against hunger, and so be forced into boldness and cunning”. Also, Xenophon, who was alive while Sparta existed, writes in Spartan Society 2 of the requirement to “furnish for the common meal just the right amount for [the boys in their charge] never to become sluggish through being too full, while also giving them a taste of what it is not to have enough.” (from The Ancient Warrior Diet: Spartans) It’s hard to see how this wouldn’t have been ketogenic. Spartans were known for being great warriors, achieving feats of military prowess that would’ve been impossible for lesser men. On their fatty meat diet of pork and game, they were taller and leaner than other Greeks. They didn’t have large meals and fasted for most of the day, but when they did eat, it was food dense in fat, calories, and nutrition.

* * *

Ancient Spartan Food and Diet
from Legend & Chronicles

The Secrets of Spartan Cuisine
by Helena P. Schrader

This is what Mongol MREs looked like
by Cam Rea

Cold War Silencing of Science

In the early Cold War, the United States government at times was amazingly heavy-handed in its use of domestic power.

There was plenty of surveillance, of course. But there was also blatant propaganda, with professors, journalists, and artists on the payroll of intelligence agencies, not to mention funding going to writing programs, American studies, etc. Worse still, there were such things as COINTELPRO, including truly effed up shit like the attempt to blackmail Martin Luther King, Jr. into committing suicide. There is another angle to this. Along with putting out propaganda, the government did the opposite by trying to silence alternative voices and enforce conformity. It did that with the McCarthyist attacks on anyone perceived, or falsely portrayed, as a deviant or a fellow traveler of deviants. This destroyed careers and drove some of those devastated to suicide. But there was another kind of shutting down that I find sad as someone who affirms a free society as, among other things, the free flow of information.

When Nikola Tesla died, the FBI swooped in and seized his research with no justification; Tesla was a US citizen, and such actions are both illegal and unconstitutional. They didn’t release his papers until 73 years later, and no one knows if they released everything, as there is no transparency or accountability. One of the most famous examples is much more heinous. Wilhelm Reich was targeted by the American Medical Association, FDA, and FBI. The government arrested him and sentenced him to prison, where he died. All of his journals and books were incinerated. In the end, the FDA had spent $2 million investigating and prosecuting Reich, simply because they didn’t like his research and, of course, his promoting ‘sexual deviancy’ through free love.

These were not minor figures either. Nikola Tesla was one of the greatest scientists in the world and most definitely the greatest inventor in American history. And Wilhelm Reich was a famous doctor and psychoanalyst, an associate of Sigmund Freud and Carl Jung, and a well-known writer. Their otherwise respectable positions didn’t protect them. Imagine what the government could get away with when it targeted average Americans, with no one to protest and come to their defense. This same abuse of power was seen in related fields. A major focus of Reich’s work was health, an area of concern he shared with the FDA, which saw it as its personal territory to rule as it wished. The FDA went after many people with alternative health views who gained enough public attention, and it could always find a reason to justify persecution.

I’ve come across examples in diet and nutrition, such as last year when I read Nina Planck’s Real Food, where she writes about Adelle Davis, a biochemist and nutritionist who became a popular writer and gained celebrity as a public intellectual. Since Davis advocated a healthy diet of traditional foods, this put her in the crosshairs of powerful interests that sought to defend the standard American diet (SAD):

“My mother’s other nutritional hero was Adelle Davis, the best-selling writer who recommended whole foods and lots of protein. […] Davis had a master’s degree in biochemistry from the University of Southern California Medical School, but she wrote about nutrition in a friendly, common-sense style. In the 1950s and ’60s, titles like Let’s Eat Right to Keep Fit and Let’s Get Well became bestsellers. […] Like Price, Davis was controversial. “She so infuriated the medical profession and the orthodox nutrition community that they would stop at nothing to discredit her,” recalls my friend Joann Grohman, a dairy farmer and nutrition writer who says Adelle Davis restored her own health and that of her five young children. “The FDA raided health food stores and seized her books under a false labeling law because they were displayed next to vitamin bottles.” ”

In the same period, during the 1950s and 1960s, the FDA went after Carlton Fredericks in an extended battle. He had a master’s degree and a doctorate in public health education and was a former associate professor. What was his horrific crime? He suggested that the modern food supply had become deficient in nutrients because of industrial processing, and so supplementation was necessary for health. It didn’t matter that this was factually true. Fredericks’ mistake was stating such obvious truths openly on his radio show and in his written material. The FDA seized copies of Eat, Live and Be Merry (1961) for allegedly recommending the treatment of ailments “with vitamin and mineral supplements, which products are not effective in treating such conditions” (Congress 1965). They declared this “false labeling”, despite its never contradicting any known science at the time or since. Then, a few years later, the Federal Trade Commission brought a similar charge of false advertising over the selling of his tape-recorded programs and writing, but the allegations didn’t stick and the case was dropped.

A brief perusal of web search results brought up a similar case. Gayelord Hauser was a nutritionist with degrees in naturopathy and chiropractic who, like the others, became a popular writer — with multiple books translated into 12 languages and a regular column in Hearst newspapers read nationwide. What brought official ire down upon him was that he became so famous as to be befriended by numerous Hollywood actors, which elevated his popularity even further. Authority figures in the government and experts within the medical field saw him as a ‘quack’ and ‘food faddist’, which is to say as an ideological competitor who needed to be eliminated. His views worthy of being silenced included that Americans should eat more foods rich in B vitamins and should avoid sugar and white flour. As you can see, he was a monster and a public menace. This brought on the righteous wrath of the American Medical Association along with the flour and sugar lobbies, leading to an initial charge of practicing medicine without a license, with products seized and destroyed. Later on, when he recommended blackstrap molasses as a nutrient-dense food (which it is), the FDA made the standard accusation of product endorsement and false claims, and this was followed by the standard action of confiscating his 1950 best-selling book on healthy diet, Look Younger, Live Longer. Now Hauser is remembered by many as a pioneer in his field and as a founder of the natural food movement.

Let me end with one last example of Cold War suppression. In reading Nina Teicholz’s The Big Fat Surprise, I noticed a brief reference to Herman Taller, a New York obstetrician and gynecologist. He too was an advocate of natural health. His book Calories Don’t Count got him into trouble for the same predictable reasons, with claims of “false and misleading” labeling. He also sold supplements, but nothing bizarre, from bran fiber to safflower oil capsules, the latter being brought up in the legal case. His argument was that, since fish oil was healthy, other polyunsaturated fatty acids (PUFAs) would likewise be beneficial. It turns out he was wrong about safflower oil, but his scientific reasoning was sound for what was known at the time. His broader advocacy of a high-fat diet with a focus on healthy fats has since become mainstream. Certain PUFAs, the omega-3 fats, are absolutely necessary for basic physiological functioning, and indeed most people in the modern world do not get enough of them.

Anyway, it was never about fair-minded scientific inquiry and debate. So $30,000 worth of safflower-oil capsules and 1,600 copies of his book were taken from several warehouses. To justify this action, FDA Commissioner George P. Larrick stated that, “The book is full of false ideas, as many competent medical and nutritional writers have pointed out. Contrary to the book’s basic premise, weight reduction requires the reduction of caloric intake. There is no easy, simple substitute. Unfortunately, calories do count.” He decreed this from on high as the ultimate truth; the government would not tolerate anyone challenging the official ideology. Yet scientists continue to debate the issue, and recent research has sided with Taller’s conclusion. According to the best science presently available, it is easy to argue that calories don’t count or, to put it another way, that calorie-counting diets have proven a failure in study after study, a fact so well known that mainstream doctors and medical experts admit to its sad truth, even as they go on advising people to count calories and then blaming them when the diets fail.

If you’ve ever wondered how Ancel Keys’ weak evidence and bad science came to dominate official dietary recommendations pushed by medical institutions, the federal government, and the food industry, the above will give you some sense of the raw force of government authority that was used to achieve this end. It wasn’t only the voices of popular writers and celebrity figures that were silenced, eliminated, and discredited. Gary Taubes and Nina Teicholz discuss how a related persecution happened within academia, where independent researchers lost funding and were no longer invited to speak at conferences. For a half century, it was impossible to seriously challenge this behemoth of the dietary-industrial complex, and during that era scientific research was stunted. This Cold War-era oppression is only now beginning to thaw.

The Literal Metaphor of Sickness

I’ve written about Lenore Skenazy before. She is one of my mom’s favorite writers, and so she likes to share the articles with me. Skenazy has another piece about her usual topic, helicopter parents and their captive children. Today’s column, in the local newspaper (The Gazette), has the title “The irony of overprotection” (you can find it on the Creators website or from the GazetteXtra). She begins with a metaphor. In studying how leukemia is contracted, scientist Mel Greaves found that two conditions were required. The first is a genetic susceptibility, which exists in only some kids, though it is far from uncommon. But that alone isn’t sufficient without the second factor.

There has to be an underdeveloped or compromised immune system. And sadly this too has become far from uncommon. Further evidence for the hygiene hypothesis keeps accumulating (it should be called the hygiene theory at this point). Basically, it is only by being exposed to germs that a child’s immune system experiences the healthy stress that activates normal development. Without this exposure, many are left plagued by ongoing sickness, allergies, and autoimmune conditions for the rest of their lives.

Parents have not only protected their children from the larger dangers and infinite risks of normal childhood: skinned knees from roughhousing, broken limbs from falling from trees, hurt feelings from bullies, trauma from child molesters, murder from the roving bands of psychotic kidnappers who will sell your children on the black market, etc. Beyond such everyday fears, parents have also protected their kids from minor infections, with endless application of anti-bacterial products and by cocooning them in sterile spaces liberally doused with chemicals that kill all known microbial life forms. That is not a good thing, for the consequences are dire.

This is where the metaphor kicks in. Skenazy writes:

The long-term effects? Regarding leukemia, “when such a baby is eventually exposed to common infections, his or her unprimed immune system reacts in a grossly abnormal way,” says Greaves. “It overreacts and triggers chronic inflammation.”

Regarding plain old emotional resilience, what we might call “psychological inflammation” occurs when kids overreact to an unfamiliar or uncomfortable situation because they have been so sheltered from these. They feel unsafe, when actually they are only unprepared, because they haven’t been allowed the chance to develop a tolerance for some fears and frustrations. That means a minor issue can be enough to set a kid off — something we are seeing at college, where young people are at last on their own. There has been a surge in mental health issues on campuses.

It’s no surprise that anxiety would be spiking in an era when kids have had less chance to deal with minor risks from childhood on up.

There is only one minor point of disagreement I’d throw out: there is nothing metaphorical about this. Because of an antiseptic world and other causes (leaky gut, high-carb diet, sugar addiction, food additives, chemical exposure, etc), the immune systems of many modern Americans are so dysfunctional and overreactive that they wreak havoc on the body. Chronic inflammation has been directly linked to, or otherwise associated with, nearly every major health issue you can think of.

This includes, by the way, neurocognitive conditions such as depression and anxiety, but much worse as well: schizophrenia, Alzheimer’s, and other conditions also often involve inflammation. When inflammation gets into the brain, gut-brain axis, and/or nervous system, major problems follow, with a diversity of symptoms that can be severe and life threatening and that can also be damaging on a social and psychological level. This new generation of children is literally being brain damaged, psychologically maimed, and left in a fragile state. For many of them, their bodies and minds are not fully prepared to deal with the real world with normal healthy responses. It is hard to manage the stresses of life when one is in a constant state of low-grade sickness that permanently sets the immune system on high, when even the most minor risks could endanger one’s well-being.

The least of our worries is the fact that a disease like type 2 diabetes, which used to be called adult-onset diabetes because it was unknown among children, is now increasing among children. Sure, adult illnesses will find their way earlier and earlier into young adulthood and childhood, and the diseases of the elderly will hit people in middle age or younger. This will be a health crisis that could bankrupt and cripple our society. But worse than that is the human cost of sickness and pain, struggle and suffering. We are forcing this fate onto the young generations. That is cruel beyond comprehension. We can barely imagine what this will mean across the entire society when it finally erupts as a crisis.

We’ve done this out of the ignorant good intention of protecting our children from anything that could touch them. It makes us feel better to have created a bubble world of innocence where children won’t have to learn from the mistakes and failures, harms and difficulties we experienced in growing up. Instead, we’ve created something far worse for them.

Neolithic Troubles

Born Expecting the Pleistocene
by Mark Seely
p. 31

Not our natural habitat

The mismatch hypothesis

Our bodies including our brains—and thus our behavioral predispositions—have evolved in response to very specific environmental and social conditions. Many of those environmental and social conditions no longer exist for most of us. Our physiology and our psychology, all of our instincts and in-born social tendencies, are based on life in small semi-nomadic tribal groups of rarely more than 50 people. There is a dramatic mismatch between life in a crowded, frenetic, technology-based global civilization and the kind of life our biology and our psychology expects [14].

And we suffer serious negative consequences of this mismatch. A clear example can be seen in the obesity epidemic that has swept through developed nations in recent decades: our bodies evolved to meet energy demands in circumstances where the presence of food was less predictable and periods of abundance more variable. Because of this, we have a preference for calorie-dense food, we have a tendency to eat far more than we need, and our bodies are quick to hoard extra calories in the form of body fat. This approach works quite well during a Pleistocene ice age, but it is maladaptive in our present food-saturated society—and so we have an obesity epidemic because of the mismatch between the current situation and our evolution-derived behavioral propensities with respect to food. Studies on Australian aborigines conducted in the 1980s, evaluating the health effects of the transition from traditional hunter-gatherer lifestyle to urban living, found clear evidence of the health advantages associated with a lifestyle consistent with our biological design [15]. More recent research on the increasingly popular Paleo-diet [16] has since confirmed wide-ranging health benefits associated with selecting food from a pre-agriculture menu, including cancer resistance, reduction in the prevalence of autoimmune disease, and improved mental health.

[14] Ornstein, R. & Ehrlich, P. (1989). New World, New Mind. New York: Simon & Schuster.
[15] O’Dea, K., Spargo, R., & Akerman, K. (1980). The effect of transition from traditional to urban life-style on the insulin secretory response in Australian Aborigines. Diabetes Care, 3(1), 31-37; O’Dea, K., White, N., & Sinclair, A. (1988). An investigation of nutrition-related risk factors in an isolated Aboriginal community in northern Australia: advantages of a traditionally-orientated life-style. The Medical Journal of Australia, 148(4), 177-80.
[16] E.g., Frassetto, L. A., Schloetter, M., Mietus-Snyder, M., Morris, R. C., & Sebastian, A. (2009). Metabolic and physiological improvements from consuming a Paleolithic, hunter-gatherer type diet. European Journal of Clinical Nutrition, 63, 947-955.

pp. 71-73

The mechanisms of cultural evolution can be seen in the changing patterns of foraging behavior in response to changes in food availability and changes in population density. Archaeological analyses suggest that there is a predictable pattern of dietary choice that emerges from the interaction among population density, relative abundance of preferred food sources, and factors that relate to the search and handling of various foods. [56] In general, diets become more varied, or broaden, as population increases and the preferred food becomes more difficult to obtain. When a preferred food source is abundant, the calories in the diet may consist largely of that one particular food. But as the food source becomes more difficult to obtain, less preferable foods will be included and the diet will broaden. Such dietary changes imply changes in patterns of behavior within the community—changes of culture.

Behavior ecologists and anthropologists have partitioned the foraging process into two components with respect to the cost-benefit analysis associated with dietary decisions: search and handling. [57] The search component of the cost-benefit ledger refers to the amount of work per calorie payoff (and other benefits such as the potential for enhanced social standing) associated with a food item’s abundance, distance, terrain, proximity of another group’s territory, water sources, etc. The handling component refers to the work per calorie payoff associated with getting the food into a state (location, form, etc.) in which it can be consumed. Search and handling considerations can be largely independent of each other. The residential permanence involved with the incorporation of agriculture reduces the search consideration greatly, and makes handling the primary consideration. Global industrial food economies change entirely the nature of both search and handling: handling in industrial society—from the perspective of the individual and the individual’s decision processes—is reduced largely to considerations of speed and convenience. The search component has been re-appropriated and refocused by corporate marketing, and reduced to something called shopping.

Domestication, hands down the most dramatic and far-reaching example of cultural evolution, emerges originally as a response to scarcity that is tied to a lack of mobility and an increase in population density. Domestication is a way of further broadening the diet when other local sources of food are already being maximally exploited. Initial experimentation with animal domestication “occurred in situations where forager diets were already quite broad and where the principle goal of domestication was the production of milk, an exercise that made otherwise unusable plants or plant parts available for human consumption. . . .” [58] The transition to life-ways based even partially on domestication has some counter-intuitive technological ramifications as well.

This leads to a further point about efficiency. It is often said that the adoption of more expensive subsistence technology marks an improvement in this aspect of food procurement: better tools make the process more efficient. This is true in the sense that such technology often enables its users to extract more nutrients per unit weight of resource processed or area of land harvested. If, on the other hand, the key criterion is the cost/benefit ratio, the rate of nutrient gained relative to the effort needed to acquire it, then the use of more expensive tools will often be associated with declines in subsistence efficiency. Increased investment in handling associated with the use of high-cost projectile weapons, in plant foods that require extensive tech-related processing, and in more intensive agriculture all illustrate this point. [59]

In modern times, thanks to the advent of—and supportive propaganda associated with—factory industrial agriculture, farming is coupled with ideas of plentitude and caloric abundance. However, in the absence of fossil energy and petroleum-based chemical fortification, farming is expensive in terms of the calories produced as a function of the amount of work involved. For example, “farmers grinding corn with hand-held stone tools can earn no more than about 1800 kcal per hour of total effort devoted to farming, and this from the least expensive cultivation technique.” [60] A successful fishing or bison hunting expedition is orders of magnitude more efficient in terms of the ratio of calories expended to calories obtained.

[56] Bird & O’Connell [Bird, D. W., & O’Connell, J. F. (2006). Behavioral ecology and archaeology. Journal of Archaeological Research, 14, 143-188]
[57] Ibid.
[58] Ibid, p. 152.
[59] Ibid, p. 153.
[60] Ibid, p. 151, italics in original.
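As an aside on the excerpt’s search/handling partition: behavioral ecology formalizes this cost-benefit ledger with the standard prey-choice (rate-maximization) model. The following gloss is mine, not Seely’s. The expected return rate from a diet made up of food types $i$ is

$$R = \frac{\sum_i \lambda_i e_i}{1 + \sum_i \lambda_i h_i}$$

where $\lambda_i$ is the encounter rate with type $i$ while searching, $e_i$ the net energy gained per item, and $h_i$ the handling time per item. A food type is worth adding to the diet only if its profitability $e_i / h_i$ exceeds the overall rate $R$ achievable without it. When preferred, high-profitability foods grow scarce, their encounter rates fall, $R$ drops, and previously ignored foods clear the threshold; that is the formal version of the excerpt’s point that diets broaden as preferred foods become harder to obtain.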

pp. 122-123

The birth of the machine

The domestication frame

The Neolithic marks the beginnings of large scale domestication, what is typically referred to as the agricultural revolution. It was not really a revolution in that it occurred over an extended period of time (several thousand years) and in a mosaic piecemeal fashion, both in terms of the adoption of specific agrarian practices and in terms of specific groups of people who practiced them. Foraging lifestyles continue today, and represented the dominant lifestyle on the planet until relatively recently. The agricultural revolution was a true revolution, however, in terms of its consequences for the humans who adopted domestication-based life-ways, and for the rest of the natural world. The transition from nomadic and seminomadic hunting and gathering to sedentary agriculture is the most significant chapter in the chronicle of the human species. But it is clearly not a story of unmitigated success. Jared Diamond, who acknowledges somewhat the self-negating double-edge of technological “progress,” has called domestication the biggest mistake humans ever made.

That transition from hunting and gathering to agriculture is generally considered a decisive step in our progress, when we at last acquired the stable food supply and leisure time prerequisite to the great accomplishments of modern civilization. In fact, careful examination of that transition suggests another conclusion: for most people the transition brought infectious disease, malnutrition, and a shorter lifespan. For human society in general it worsened the relative lot of women and introduced class-based inequality. More than any other milestone along the path from chimpanzeehood to humanity, agriculture inextricably combines causes of our rise and our fall. [143]

The agricultural revolution had profoundly negative consequences for human physical, psychological, and social well being, as well as a wide-ranging negative impact on the planet.

For humans, malnutrition and the emergence of infectious disease are the most salient physiological results of an agrarian lifestyle. A large variety of foodstuffs and the inclusion of a substantial amount of meat make malnutrition an unlikely problem for hunter gatherers, even during times of relative food scarcity. Once the diet is based on a few select mono-cropped grains supplemented by milk and meat from nutritionally-inferior domesticated animals, the stage is set for nutritional deficit. As a result, humans are not as tall or broad in stature today as they were 25,000 years ago; and the mean age of death is lower today as well. [144] In addition, both the sedentism and population density associated with agriculture create the preconditions for degenerative and infectious disease. “Among the human diseases directly attributable to our sedentary lives in villages and cities are heart and vascular disorders, diabetes, stroke, emphysema, hypertension, and cirrhoses [sic.] of the liver, which together cause 75 percent of the deaths in the industrial nations.” [145] The diet and activity level of a foraging lifestyle serve as a potent prophylactic against all of these common modern-day afflictions. Nomadic hunter-gatherers are by no means immune to parasitic infection and disease. But the spread of disease is greatly limited by low population density and by a regular change of habitation which reduced exposure to accumulated wastes. Both hunter-gatherers and agriculturalists are susceptible to zoonotic diseases carried by animals, but domestication reduces an animal’s natural immunity to disease and infection, creates crowded conditions that support the spread of disease among animal populations, and increases the opportunity for transmission to humans. In addition, permanent dwellings provide a niche for a new kind of disease-carrying animal specialized for symbiotic parasitic cohabitation with humans, the rat being among the most infamous. Plagues and epidemic outbreaks were not a problem in the Pleistocene.

There is a significant psychological dimension to the agricultural revolution as well. A foraging hunter-gatherer lifestyle frames natural systems in terms of symbiosis and interrelationship. Understanding subtle connections among plants, animals, geography, and seasonal climate change is an important requisite of survival. Human agents are intimately bound to these natural systems and contemplate themselves in terms of these systems, drawing easy analogy between themselves and the natural communities around them, using animals, plants, and other natural phenomena as metaphor. The manipulative focus of domestication frames natural systems in antagonistic terms of control and resistance. “Agriculture removed the means by which men [sic.] could contemplate themselves in any other than terms of themselves (or machines). It reflected back upon nature an image of human conflict and competition . . . .” [146] The domestication frame changed our perceived relationship with the natural world, and lies at the heart of our modern-day environmental woes. According to Paul Shepard, with animal domestication we lost contact with an essential component of our human nature, the “otherness within,” that part of ourselves that grounds us to the rest of nature:

The transformation of animals through domestication was the first step in remaking them into subordinate images of ourselves—altering them to fit human modes and purposes. Our perception of not only ourselves but also of the whole of animal life was subverted, for we mistook the purpose of those few domesticates as the purpose of all. Plants never had for us the same heightened symbolic representation of purpose itself. Once we had turned animals into the means of power among ourselves and over the rest of nature, their uses made possible the economy of husbandry that would, with the addition of the agrarian impulse, produce those motives and designs on the earth contrary to respecting it. Animals would become “The Others.” Purposes of their own were not allowable, not even comprehensible. [147]

Domestication had a profound impact on human psychological development. Development—both physiological and psychological—is organized around a series of stages and punctuated by critical periods, windows of time in which the development and functional integration of specific systems are dependent upon external input of a designated type and quality. If the necessary environmental input for a given system is absent or of a sufficiently reduced quality, the system does not mature appropriately. This can have a snowball effect because the future development of other systems is almost always critically dependent on the successful maturation of previously developed systems. The change in focus toward the natural world along with the emergence of a new kind of social order interfered with epigenetic programs that evolved to anticipate the environmental input associated with a foraging lifestyle. The result was arrested development and a culture-wide immaturity:

Politically, agriculture required a society composed of members with the acumen of children. Empirically, it set about amputating and replacing certain signals and experiences central to early epigenesis. Agriculture not only infantilized animals by domestication, but exploited the infantile human traits of normal individual neoteny. The obedience demanded by the organization necessary for anything larger than the earliest village life, associated with the rise of a military caste, is essentially juvenile and submissive . . . . [148]

[143] Diamond (1992), p. 139. [Diamond, J. (1992). The Third Chimpanzee. New York: HarperCollins.]
[144] Shepard (1998) [Shepard, P. (1998). Coming Home to the Pleistocene. Washington, D.C.: Island Press]
[145] Ibid, p. 99.
[146] Shepard (1982), p. 114. [Shepard, P. (1982). Nature and Madness. Athens, Georgia: University of Georgia Press]
[147] Shepard (1998), p. 128.
[148] Shepard (1982), pp. 113-114.

Paleo Diet, Traditional Foods, & General Health

Diet & Lifestyle

Basic Guidelines (LCHF):

  • low carb (LC)
  • high fat (HF)
  • moderate protein
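To make those ratios concrete, here is a minimal sketch in Python; the 2,000 kcal total and the 70/25/5 split are my illustrative assumptions, not a prescription from any of the sources below. It simply converts percent-of-calories targets into grams per day using the standard Atwater factors (9 kcal/g for fat, 4 kcal/g for protein and carbs):

    # Hypothetical LCHF worked example: percent-of-calories -> grams/day.
    CALS_PER_GRAM = {"fat": 9, "protein": 4, "carbs": 4}  # standard Atwater factors

    def macro_grams(daily_kcal, split):
        # split maps macro -> fraction of daily calories (fractions sum to 1.0)
        return {macro: round(daily_kcal * frac / CALS_PER_GRAM[macro])
                for macro, frac in split.items()}

    # Assumed example: 2,000 kcal/day at roughly 70% fat, 25% protein, 5% carbs.
    print(macro_grams(2000, {"fat": 0.70, "protein": 0.25, "carbs": 0.05}))
    # -> {'fat': 156, 'protein': 125, 'carbs': 25}

The arithmetic makes one point clear: “low carb” in calorie terms means very few grams in practice, as 5% of a 2,000 kcal day works out to only about 25 g of carbohydrate.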

Eliminate or Lessen:

  • industrially farmed & heavily processed foods, especially those with many additives, even when labeled as healthy.
  • foods from factory farmed animals.
  • vegetable oils, especially hydrogenated seed oils (e.g., canola) & margarine; but some are good for you (see below).
  • carbs, especially simple carbs with high glycemic index & load: potatoes, rice, bread, etc; sweet potatoes are a better choice but limit consumption; better to eat raw carrots than cooked carrots; but cooking & then cooling starchy carbs creates resistant starch, which turns into sugar more slowly.
  • grains, especially wheat; some people handle ancient grains and sprouted or long-fermented breads (sourdough) better; but better to avoid entirely.
  • added sugar, especially fructose; also avoid artificial sweeteners (they cause insulin problems & can contribute to diabetes); if a sweetener is desired, try raw stevia.
  • fruit, especially high sugar: grapes, pineapple, pears, bananas, watermelon, apples, prunes, pomegranates, etc.
  • dairy, especially cow milk; some people handle non-cow milk, cultured milk, & aged cheese better; but better to avoid entirely.

Emphasize & Increase:

  • organic, whole foods, locally grown, in season.
  • foods from pasture raised or grass fed animals.
  • healthy fats/oils: animal fat, butter/ghee, avocado oil, & coconut oil for cooking; coconut milk/cream & almond milk for drinks (e.g., added to coffee); cold-pressed olive oil for salads or adding to already cooked foods; cold-pressed seed oils used sparingly; cod liver oil, krill oil (Neptune is best), flax oil, borage oil, evening primrose oil, etc for supplementation (don’t need to take all of them); maybe MCT oil for ketosis (seek advice of your physician).
  • fibrous starches & nutritious vegetables/fruits: leafy greens, broccoli, green beans, onions, garlic, mushrooms, celery, beets, black cherries, berries, olives, avocados, etc.
  • nutrient-density & fat-soluble vitamins, besides healthy fats/oils: eggs, wild-caught fish, other seafoods, organ meats, bone broth, aged cheese (raw is best), yogurt, kefir, avocados; nutritional yeast (gluten-free), bee pollen, & royal jelly.
  • protein: eggs, fatty meats, nuts/seeds (handful a day), & avocados.
  • probiotics (preferably from fermented/cultured foods): traditional sauerkraut, kimchi, miso, natto, yogurt, kefir, kombucha, etc; not necessarily recommended for everyone, depending on gut health.
  • supplements (besides those already mentioned above): ox bile for fat digestion, turmeric/curcumin & CBD oil for inflammation, CoQ10 if you are on statins, etc; only take as needed.
  • seasoning: black pepper contains piperine, which helps absorption of nutrients; onions and garlic are also great sources of nutrients and of the specific soluble fiber (inulin) that feeds gut microbes.

Other Suggestions:

  • fasting: occasionally/intermittently, starting with a single day & maybe eventually increasing the length (much of the immune system is replaced/recuperated after 2-3 days); an extended fast can be good to do around once a year, assuming you’re in relatively good health.
  • restricted eating period: limit meal time to a 4-8 hour window of the day (even limiting it to 12 hours will be beneficial as compared to eating non-stop from waking to sleeping) followed by a short-term fast; start by skipping a meal & work up from there (some people find going without breakfast to be the easiest since you are already in fasting mode from the night’s sleep).
  • ketosis: if carbs are restricted enough or fasting continues long enough (glucose & stored glycogen are used up), the body will switch from burning glucose to burning fat, which the liver converts into ketones (MCT oil will aid this process); with carb restriction, the body burns the fat consumed; with fasting, it burns body fat.
  • salt & water: the body can become depleted if the diet is strictly low carb & high fat/protein, especially in ketosis; salt is needed to metabolize protein.
  • exercise: aerobics & strength training (especially beneficial is high intensity for short duration); improves metabolism & general health; helps get into ketosis.
  • stress management: get plenty of sleep, spend time in nature, regularly socialize with friends & family, try relaxation (meditation, yoga, etc), find ways to play (games, sports, be around children), etc.
  • sunshine: get regular time outside in the middle of the day without sunscreen to produce vitamin D & improve mood (especially important for those not near the equator); some studies correlate this with lower skin cancer rates & longer life.

Resources:

Documentaries/Shows:


The Perfect Human Diet
The Magic Pill
The Paleo Way
We Love Paleo
Carb Loaded
My Big Fat Diet
Fed Up
Fat Head
What’s With Wheat?
The Big Fat Lie (coming soon)
The Real Skinny on Fat (coming soon)

Books:

Gary Taubes – Good Calories, Bad Calories; & Why We Get Fat
Nina Teicholz – The Big Fat Surprise (being made into a documentary)
Tim Noakes – Lore of Nutrition
Robert Lustig – Fat Chance
Loren Cordain – The Paleo Diet; & The Paleo Answer
Robb Wolf – The Paleo Solution
Mark Sisson – The Primal Blueprint
Nora T. Gedgaudas – Primal Body, Primal Mind
Sally Fallon Morell – Nourishing Diets
Catherine Shanahan – Food Rules; & Deep Nutrition
Sarah Ballantyne – The Paleo Approach; & Paleo Principles
Mark Hyman – Food: What the Heck Should I Eat?
David Perlmutter – Grain Brain
William Davis – Wheat Belly
John Yudkin – Pure, White and Deadly
Weston A. Price – Nutrition and Physical Degeneration
Francis Marion Pottenger Jr. – Pottenger’s Cats: A Study in Nutrition

Blogs/Websites:


Gary Taubes
Nina Teicholz
Tim Noakes
Robert Lustig
Gary Fettke
Loren Cordain
Robb Wolf
Mark Sisson
Nora Gedgaudas
Jimmy Moore
Pete Evans
Zoe Harcombe
Chris Kresser
Chris Masterjohn
Sarah Ballantyne
Catherine Shanahan
Terry Wahls
Will Cole
Josh Axe
Dave Asprey
Mark Hyman
Joseph Mercola
David Perlmutter
William Davis
Paleohacks
The Weston A. Price Foundation
Price-Pottenger

Other People’s Craziness

In a Facebook group dedicated to Julian Jaynes, I was talking to a lady who is an academic and a poet. She happened to mention that she is also a ‘Manbo’, something like a vodou practitioner. She made the admission that she sees and hears spirits, but she qualified it by saying that her rational mind knew it wasn’t real. I found that qualification odd, as if she were worried about maintaining her respectability. She made clear that these experiences weren’t make-believe, as they felt real to her, as real as anything else, and yet one side of her personality couldn’t quite take them as real. So, two different realities existed inside her and she seemed split between them.

None of this is particularly strange in a group like that. Many voice-hearers, for obvious reasons, are attracted to Jaynes’ view on voice-hearing. Jaynes took such experiences seriously and, to a large degree, took them on their own terms. He offered a rational, or rationalizing, narrative for why it is ‘normal’ to hear voices. The desire to be normal is a powerful social force. Having a theory helps someone like this lady compartmentalize the two aspects of her being and not feel overwhelmed. If she didn’t qualify her experience, she would be considered crazy by many others and maybe in her own mind. Her academic career might even be threatened. So, the demand of conformity is serious, with real consequences.

That isn’t what interested me, though. Our conversation happened in a post about the experience of falling into a trance while driving, such that one ends up where one was going without remembering how one got there. It’s a common experience and a key example Jaynes uses about how the human mind functions. I mentioned that many people have experiences of alien contact and UFO abduction while driving, often alone at night on some dark stretch of road. And I added that, according to Jacques Vallee and John Keel, many of these experiences match the descriptions of fairy abductions in folklore and the accounts of shamanic initiations. Her response surprised me: she was critical.

Vallee also had two sides: on the one hand, an analytical type who worked as an astronomer and a computer scientist; on the other, a disreputable UFO researcher. He came at the UFO field from a scientific approach, but like Jaynes he felt compelled to take people at their word in accepting that their experience was real to them. He even came to believe there was something to these experiences. It started when he was working in an observatory: after he recorded anomalous data of something in the sky that wasn’t supposed to be there, the director of the observatory erased the tapes out of fear that, if the data got out to the press, it would draw negative attention to the institution. That is what originally piqued his curiosity and started him down the road of UFO research. But he also came across many cases where entire groups of people, including military personnel, saw the same UFOs in the sky, and their movements accorded with no known technology or physics.

That forced him to consider the possibility that people were seeing something that was on some level real, whatever it was. He went so far as to speculate that consciousness is much stranger than science can presently explain, that there really is more to the universe, or something at an angle to our universe. In this line of thought, he spoke of the phenomena as “partly associated with a form of non-human consciousness that manipulates space and time.” Sure, to most people, that is crazy talk, though no more crazy than interacting with the spirit world. But the lady I was speaking with immediately dismissed this as going too far. Her anomalous experiences were fine, as long as she pretended that they were pretend or something, thus proving she wasn’t bat-shit loony. Someone else’s anomalous experience, however, was not to be taken seriously. It’s the common perception that only other people’s religion is mythology.

That amused me to no end. And I said that it amused me. She then blocked me. That amused me as well. I’m feeling amused. I was more willing to take her experiences as being valid in a way she was unwilling to do for others. It’s not that I had any skin in the game, as I’ve never talked to spirits nor been abducted by aliens. But I give people the benefit of the doubt that their experiences are real to them. I’m a radical skeptic and extreme agnostic. I take the world as it comes, and sometimes the world is strange. No need to rationalize it. And if that strangeness is proof of insanity and disrepute, there are worse fates.

* * *

As for my own variety of crazy, I’ve always felt a kinship with Philip K. Dick. Some people feel compelled to speak truth, no matter what. If that truth sounds crazy, maybe that is because we live in a society gone mad. Under such unhappy circumstances, there can be great comfort in feeling validated by someone speaking truth. So, maybe be kind toward the craziness and truths of other people. Here is what PKD wrote in justifying himself:

“What I have done may be good, it may be bad. But the reality that I discern is the true reality; thus I am basically analytical, not creative; my writing is simply a creative way of handling analysis. I am a fictionalizing philosopher, not a novelist; my novel and story-writing ability is employed as a means to formulate my perception. The core of my writing is not art, but truth. Thus what I tell is the truth, yet I can do nothing to alleviate it, either by deed or exploration. Yet this seems somehow to help a certain kind of sensitive and troubled person, for whom I speak. I think I understand the common ingredient in those whom my writing helps; they cannot or will not blunt their own intimations about the irrational, mysterious nature of reality, & for them my corpus of writing is one long ratiocination regarding this inexplicable reality, an investigation & presentation, analysis & response & personal history. My audience will always be limited to these people.”
(In Pursuit of Valis, p.161)

Right-Wing Political Correctness on Right-Wing Terrorism

During the administration of George W. Bush, the FBI put out numerous reports on terrorism. Although they conflated non-violent actions against property by left-wing groups with violent actions against people by right-wing groups, the FBI nonetheless made clear that right-wing groups were the greatest and most dangerous emerging threat, going back to the 1990s. And they specifically warned of returning veterans potentially being recruited into terrorist groups or acting as lone-actor terrorists. From a report covering terrorism from 2002 to 2005:

“Right-wing extremism, however, primarily in the form of domestic militias and conservative special interest causes, began to overtake left-wing extremism as the most dangerous, if not the most prolific, domestic terrorist threat to the country during the 1990s. In contrast to the ALF and the ELF, which have pursued a philosophy that avoids physical violence in favor of acts of property damage that cause their victims economic harm, right-wing extremists pursued a qualitatively different method of operation by targeting people.”

Yet this largely went unnoticed. The media, especially the right-wing media, had little interest in focusing on domestic threats while the foreign “War on Terror” was going on. And it would have been hard for right-wing groups to allege bias when right-wingers were in control of the federal government. This attitude changed, of course, when Barack Obama was elected. There was right-wing outrage when a DHS report came out in 2009 highlighting right-wing terrorism, despite the fact that the research for the report began under the Bush administration. The outrage forced a retraction, not because the report wasn’t true but because it was politically incorrect.

Right-Wing Terrorism in the 21st Century
By Daniel Koehler
pp. 27-28

“It is noteworthy that while right-wing terrorism is widely seen as a phenomenon involving lone actors or small cells, this study indicates that a critical mass of group members might be necessary for the escalation into violence.

“Another aspect highly relevant for the present subject is the research on so-called ‘sovereign citizens’ and the political impact of these assessments. The sovereign citizen movement is a very diverse and loose network of individuals and groups with a shared rejection of United States laws, taxation, currency and the government’s legitimacy especially regarding firearms control (e.g., ADL 2010; FBI 2011; Fleishman 2004; Macnab 2016). The concept behind the movement is directly rooted in Christian Identity teachings and the right-wing terrorist Posse Comitatus group in the 1980s. Fluent overlapping with more militant and violent militias or white supremacists (e.g., Aganes 1996; Crothers 2003; Freilich 2003; Levitas 2002) have resulted in a number of violent attacks from individuals and groups as well as clashes with law enforcement agencies. For example, the accomplice of Timothy McVeigh for his Oklahoma bombing in 1995 was a member of the movement; and a number of violent stand-offs between sovereign citizen groups with Federal law enforcement agencies (e.g., the ‘Bundy stand-offs’ in 2014 and 2016), and numerous individual acts of killings of police officers exemplify the movement’s danger.

“One critical effect of government (e.g., intelligence and police) assessments of threats posed by this sovereign citizen movement in the United States is the high risk of political backlash and strong opposition. In April 2009, for example, the Department of Homeland Security’s Extremism and Radicalization branch issued a report looking at the risk of violent radicalization within the right-wing extremist movement including sovereign citizens (DHS 2009). Shortly after the report was published, several quotes were used by mostly conservative politicians and public interest organizations to organize strong nationwide critique (Levin 2011; Thompson 2009). Especially relevant for the subsequent debate were the report’s arguments regarding the increased risk of right-wing radicalization and recruitment through the first African-American presidency, the prospects of firearms restrictions and the potential of returning veterans becoming recruits for terrorist groups or working as lone actors. Although research for the report had already started under the Bush administration in 2008 (Levin 2011) and some of these claims were founded in much earlier assessments by the FBI, the political climate swiftly changed against the DHS, which retracted the report, cut personnel in the domestic terrorism branch, canceled briefings on the issue and held back about a dozen reports (Smith 2011). Eventually the intelligence unit responsible was dismantled in April 2010. Especially noteworthy is the fact that the FBI had already published a number of reports on the same issues and continued afterwards without a similar reaction (e.g., FBI 2004, 2006, 2008, 2011). In 2012, the main author responsible for the problematic DHS report, Daryl Johnson, published his own accounts about the sovereign citizen movement and the risk for potential terrorist incidents becoming rooted in this milieu, arguing that the public debate after the report had effectively created a security risk by furthering the already critical devaluation of domestic terrorism within the DHS’ list of priorities (Johnson 2012). In the eyes of Johnson, the resulting lack of specialized analysis capacity, both in regard to experienced personnel and resources, was majorly responsible for the inadequate threat assessments and counter-measures against terrorism from the Far-Right (Nixon 2016). This capacity seems to have become one field of activity for the FBI since 2011 (Sullivan 2012) and the Department of Justice, which re-established the Domestic Terrorism Executive Committee in 2014. The committee had been created in the aftermath of the Oklahoma bombing in 1995 and disbanded after the 9/11 attacks (DoJ 2013). In addition to the DoJ and US attorney community, the committee comprises the FBI and National Security Division. As a consequence of increased lethal violence directed against the US government by sovereign citizens — for example, the killing of a half dozen police officers and three prevented major terrorist attacks involving movement members since 2010 — the FBI has labeled the network as domestic terrorism. A recent study about the sovereign citizens has also highlighted the role of the movement’s specific subculture with approximately 300,000 followers in the United States, which has increasingly become part of the mainstream political culture (Macnab 2016).”