Ketogenic Diet and Neurocognitive Health

Below is a passage from Ketotarian by Will Cole. It can be read in Chapter 1, titled “the ketogenic diet (for better and worse)”. The specific passage is found on pp. 34-38 of the printed book (first edition) or pp. 28-31 of the Google ebook. I share it here because it is a great up-to-date summary of the value of the ketogenic diet. Keto is the low-carb diet pushed to its furthest extent, where you burn fat instead of sugar; that is to say, the body prioritizes ketones over glucose and uses them more efficiently.

The brain, in particular, prefers ketones. That is why I decided to share a passage specifically on neurological health: diet and nutrition aren’t the first things most people think of for what usually gets framed as mental health and treated with psychiatric medications. But considering the severely limited efficacy of entire classes of such drugs (e.g., antidepressants), maybe it’s time for a new treatment paradigm.

The basic advantage of ketosis is that, until modernity, most humans for most of human evolution (and going back into hominid evolution) were largely dependent on a high-fat diet for normal functioning. This is indicated by how the body uses ketones more efficiently than glucose. What the body does with carbs and sugar, by contrast, is either use them right away or store them as fat. This is why hunter-gatherers would, when possible, carb-load right before winter in order to fatten themselves up. We have applied the same knowledge in using carbs to fatten up animals before slaughter.

Besides fattening up for winter in northern climes, hunter-gatherers center their diet on fats and oils: when available, they choose to eat far more fats and oils than lean meat or vegetables. They do most of their hunting during the season when animals are fattest and, if they aren’t simply doing a mass slaughter, they specifically target the fattest individual animals. After the kill, they often throw the lean meat to the dogs or mix it with fat for later use (e.g., pemmican).

This is why, prior to agriculture, ketosis was the biological and dietary norm. Even farmers, until recent history, largely depended on supplementing their diet with hunting and gathering. Up until the 20th century, most Americans ate more meat than bread, while intake of vegetables and fruits was minor and mostly seasonal. The meat most Americans, including city-dwellers, were eating was wild game because of its abundance in nearby wilderness areas; and, going by cookbooks of the time, fats and oils were at the center of the diet.

Anyway, simply by reading the following passage, you will become better informed on this topic than not only the average American but, sadly, also the average American doctor. This isn’t the kind of information that is emphasized in medical schools, despite it being fairly well researched at this point (see the appended section of the author’s notes). “A study in the International Journal of Adolescent Medicine and Health assessed the basic nutrition and health knowledge of medical school graduates entering a pediatric residency program and found that, on average, they answered only 52 percent of eighteen questions correctly,” as referenced by Dr. Cole. He concluded that, “In short, most mainstream doctors would fail nutrition” (see previous post).

Knowledge is a good thing. And so here is some knowledge.

* * *

NEUROLOGICAL IMPROVEMENTS

Around 25 percent of your body’s cholesterol is found in your brain, (19) and remember, your brain is composed of 60 percent fat. (20) Think about that. Over half of your brain is fat! What we have been traditionally taught when it comes to “low-fat is best” ends up depriving your brain of the very thing it is made of. It’s not a coincidence that many of the potential side effects associated with statins—cholesterol-lowering drugs—are brain problems and memory loss. (21)

Your gut and brain actually form from the same fetal tissue in the womb and continue their special bond throughout your entire life through the gut-brain axis and the vagus nerve. Ninety-five percent of your happy neurotransmitter serotonin is produced and stored in your gut, so you can’t argue that your gut doesn’t influence the health of your brain. (22) The gut is known as the “second brain” in the medical literature, and a whole area of research known as the cytokine model of cognitive function is dedicated to examining how chronic inflammation and poor gut health can directly influence brain health. (23)

Chronic inflammation leads to not only increased gut permeability but blood-brain barrier destruction as well. When this protection is compromised, your immune system ends up working in overdrive, leading to brain inflammation. (24) Inflammation can decrease the firing rate of neurons in the frontal lobe of the brain in people with depression. (25) Because of this, antidepressants can be ineffective since they aren’t addressing the problem. And this same inflammatory oxidative stress in the hypothalamic cells of the brain is one potential factor of brain fog. (26)

Exciting emerging science is showing that a ketogenic diet can be more powerful than some of the strongest medications for brain-related problems such as autism, attention deficit/hyperactivity disorder (ADHD), bipolar disorder, schizophrenia, anxiety, and depression. (27) Through a ketogenic diet, we can not only calm brain-gut inflammation but also improve the gut microbiome. (28)

Ketones are also extremely beneficial because they can cross the blood-brain barrier and provide powerful fuel to your brain, providing mental clarity and improved mood. Their ability to cross the blood-brain barrier paired with their natural anti-inflammatory qualities provides incredible healing properties when it comes to improving traumatic brain injury (TBI) as well as neurodegenerative diseases. (29)

Medium-chain triglycerides (MCTs), found in coconuts (a healthy fat option in the Ketotarian diet), increase beta-hydroxybutyrate and are proven to enhance memory function in people with Alzheimer’s disease (30) as well as protect against neurodegeneration in people with Parkinson’s disease. (31) Diets rich in polyunsaturated fats, wild-caught fish specifically, are associated with a 60 percent decrease in Alzheimer’s disease. (32) Another study of people with Parkinson’s disease also found that the severity of their condition improved 43 percent after just one month of eating a ketogenic diet. (33) Studies have also shown that a ketogenic diet improves autism symptoms. (34) Contrast that with high-carb diets, which have been shown to increase the risk of Alzheimer’s disease and other neurodegenerative conditions. (35)

TBI or traumatic brain injury is another neurological area that can be helped through a ketogenic diet. When a person sustains a TBI, it can result in impaired glucose metabolism and inflammation, both of which are stabilized through a healthy high-fat ketogenic diet. (36)

Ketosis also increases the brain-derived-neurotrophic factor (BDNF), which protects existing neurons and encourages the growth of new neurons—another neurological benefit. (37)

In its earliest phases, modern ketogenic diet research was focused on treating epilepsy. (38) Children with epilepsy who ate this way were more alert, were more well behaved, and had more enhanced cognitive function than those who were treated with medication. (39) This is due to increased mitochondrial function, reduced oxidative stress, and increased gamma-aminobutyric acid (GABA) levels, which in turn helps reduce seizures. These mechanisms can also provide benefits for people with brain fog, anxiety, and depression. (40)

METABOLIC HEALTH

Burning ketones rather than glucose helps maintain balanced blood sugar levels, making the ketogenic way of eating particularly beneficial for people with metabolic disorders, diabetes, and weight-loss resistance.

Insulin resistance, the negative hormonal shift in metabolism that we mentioned earlier, is at the core of blood sugar problems and ends up wreaking havoc on the body, eventually leading to heart disease, weight gain, and diabetes. As we have seen, healthy fats are a stronger form of energy than glucose. The ketogenic diet lowers insulin levels and reduces inflammation as well as improving insulin receptor site sensitivity, which helps the body function the way it was designed. Early trial reports have shown that type 2 diabetes symptoms can be reversed in just ten weeks on the ketogenic diet! (41)

Fascinating research has been done correlating blood sugar levels and Alzheimer’s disease. In fact, so much so that the condition is now being referred to by some experts as type 3 diabetes. With higher blood sugar and increased insulin resistance comes more degeneration in the hippocampus, your brain’s memory center. (42) It’s because of this that people with type 1 and 2 diabetes have a higher risk of developing Alzheimer’s disease. This is another reason to get blood sugar levels balanced and have our brain burn ketones instead.

Notes:

* * *

I came across something interesting on the Ketogenic Forum: a discussion of a video. It’s about reporting on the ketogenic diet from Dateline almost a quarter century ago, back when I was a senior in high school. So, not only has the ketogenic diet been known in the medical literature for about a century, it has even shown up in mainstream reporting for decades. Yet ketogenic and related low-carb approaches such as the paleo diet get called fad diets, even though the low-carb diet has been well known for even longer, going back to the 19th century.

The Dateline show was about ketosis used as a treatment for serious medical conditions. But even though it was a well-known treatment for epilepsy, doctors apparently still weren’t commonly recommending it. In fact, a national expert never even mentioned the keto diet as an option, instead focusing on endless drugs and even surgery. It was only after doing his own research into his son’s seizures that the father in the story discovered the keto diet in the medical literature. The doctor was asked why he didn’t recommend it for the child’s seizures when it was known to have the highest efficacy rate. He essentially had no answer other than to say that there were more drugs he could try, even as he admitted that no drug comes close in comparison.

As one commenter put it, “Seems like even back then the Dr’s knew drugs would always trump diet even though the success rate of the keto diet was 50-70%. No drugs at the time could even come close to that. And the one doctor still insisted they should try even more drugs to help Charlie even after Keto. Ugh!” Everyone knows the diet works. It’s been proven beyond all doubt. But there is a simple problem. There is no profit to be made from an easy and effective non-pharmaceutical solution.

This doctor knew there was a better possibility to offer the family and chose not to mention it. The consequence of his medical malfeasance is that the kid may have ended up with permanent brain damage from the seizures and from the side effects of medications. The father was shocked and angry. You’d think cases like this would have woken up the medical community, right? Well, you’d be wrong if you thought so. A quarter of a century later, most doctors continue to act clueless about how these kinds of diets can help numerous health conditions. It’s not a lack of available information, as many of these doctors knew about it even back then. It simply doesn’t fit into conventional medicine or into the big drug and big insurance framework.

Here is the video:

28 thoughts on “Ketogenic Diet and Neurocognitive Health”

  1. I need to add that there is some research which indicates that, throughout history and across the world, the ‘ruling classes’ have monopolised (or benefited from) high-protein and high-fat diets.

    • I haven’t come across such evidence. But I’d suspect that any ruling elite would monopolize any and all foods that were the most desirable, which often means the most nutritious (i.e., nutrient-dense) and healthiest. That would support the argument for the health benefits of a low-carb and high-fat diet (e.g., organ meats are some of the most nutrient-dense foods that humans eat). So, considering all of the research showing the immense health benefits of a low-carb and high-fat diet, is it surprising that those who could freely choose whatever they wanted would choose what made them the healthiest and also made them feel better? Not really surprising at all.

      Still, the evidence isn’t so clear on how this plays out in actual societies. Authoritarian hierarchies grew stronger and empires larger as agriculture increased within civilization. The rich and powerful were gluttonously consuming sugary foods centuries before the rest of the population did, at a time when sugar otherwise was a controlled substance found only in medicinal dispensaries. On the other hand, the poor in early America subsisted on the cheapest and easiest source of food available, that of fatty wild game. But it is true that the poor in Europe were in a different situation, lacking access to wild game.

      To go back further in history, the Egyptian ruling class shows all the signs of disease from eating a high-carb agricultural diet (from obesity to diabetes), whereas the poor farmers probably had most of their grains confiscated and were forced to subsist on hunting, fishing, trapping, and foraging. If you want to know who ate much more meat and fat, especially fat, you’d have to look to hunter-gatherers. If your claim implied a causal relationship and we were to push the argument to its most absurd conclusion, we would expect the fat-indulgent Inuit to be an elite that rules the world. That obviously isn’t the case.

      In general and on average, hunter-gatherers get most of their calories from fat. There are exceptions to this rule, but that they are exceptions is an important point. The evidence suggests exceptions exist only when hunter-gatherers simply don’t have access to preferred fatty foods. Even grubs offer high levels of fat. And if no grubs are available, tribes near the equator rely upon tropical nuts such as coconuts that are rich in fats, especially saturated fats. A hunter-gatherer has to be pretty desperate, deprived, and deficient not to be able to eat a fatty diet, whatever the source of food.

      I would correct one misconception, though. Hunter-gatherers always made sure to eat more fat than lean meat. A high-fat diet hasn’t always, or even usually, meant a high-protein diet. That is why hunter-gatherers would intentionally hunt fat animals during the season they were fattest. It is also why they’d throw the lean meat to the dogs or add some fat to it when saving it for harder times. The fattest and most nutrient-dense parts of the animal, specifically the organ meats, were often eaten on the spot and sometimes raw to get the full nutrient value out of them, as cooking destroys some nutrients.

      Another aspect has to do with the specific subject of this post. I was quoting a book written by a former vegan who, though he now includes some animal-based foods, has maintained a mostly plant-based diet. The main difference is that he now emphasizes ketosis, the topic of the book. What needs to be noted, as he explains in great detail, is that ketosis doesn’t require large amounts of meat or fish. He goes so far as to make it clear that ketosis is easily attained even on an entirely meat-free and fish-free diet, which makes it a plant-based ketogenic diet. Any fats and oils, including those from plants, combined with low enough carb intake will produce ketones and hence put one into ketosis.

      That said, it would have been near impossible for a hunter-gatherer to maintain health without meat or fish. No vegetarian hunter-gatherers have ever been discovered. And coprolites (fossilized feces) from paleolithic humans show that they too ate a diet with large amounts of animal-based foods. I’m fairly confident in asserting that most hunter-gatherers ate far more fat and protein than any ruling class in any other kind of society. The Piraha, for example, eat a diet consisting of somewhere between 70-90% fish, and their tribes entirely lack any hierarchy whatsoever, having no tribal chief, council of elders, or governing body while also lacking laws and corporal punishment; heck, they don’t even spank or hit their children.

      I had one more thought to add. I mentioned a possible difference between America and Europe, that of access to wild game. There were many reasons for this, from restricted land use to the enclosure movement. Europeans experienced mass urbanization centuries before Americans. The majority of white Americans weren’t urbanized until the early 1900s and the majority of black Americans not until the 1960s or 1970s. Being rural meant having access to lots of wild game. But even in Europe many people remained rural into recent history.

      Weston A. Price studied some of the rural populations in Europe. He was traveling around in the 1930s or thereabouts. What he found is that many rural populations ate a tremendous amount of fatty animal foods, including meat but with a heavy focus on extremely nutrient-dense dairy (raw and often cultured): butter, aged cheese, yogurt, etc. These were not low-fat dairy products. The most prized butter was the yellow, high-fat butter from spring, when the cows were eating the fresh green grass, and this superfood was saved for pregnant women and children.

      The basic rule is that any population that had access to fatty animal-based foods would eat them with abandon. In authoritarian societies with rigid hierarchies, that sometimes meant the ruling elite monopolized the healthiest and tastiest foods. But that wasn’t true for most societies during most of human evolution. Agriculture created a heavily distorted social order that was never possible in the hunter-gatherer lifestyle.

    • I was still thinking about your comment. Without knowing the evidence, your statement seems intuitively obvious. But it is maybe only meaningful in a larger context.

      Throughout history and across the world, the ruling classes probably have monopolized (or benefited from) anything and everything. In terms of diet, they might often have eaten a disproportionate amount of meat and fat, but they also likely gluttonously hoarded more than their fair share of dairy, sugar, grains, legumes, vegetables, and fruits. Simply put, the rich and powerful ate well.

      It goes without saying that this extends to every area of life. The ruling classes monopolized, dominated, and took advantage of the best housing, infrastructure, sanitation, healthcare, education, legal representation, and on and on.

      That’s pretty much what it means, by definition, to be a ruling class. They rule and, in ruling, they take whatever they want and as much as they want. But there doesn’t appear to be anything particularly special about meat and fat, as compared to anything else.

    • There is a great book by James C. Scott, Against the Grain. He is an anarchist and makes an argument for grain being a necessary element in the rise of hierarchical and centralized authoritarian governments.

      Basically, unlike other foods, grains can be easily measured while growing in a field and when harvested. This relates to his idea of legibility, for what can be measured can be recorded and hence can be taxed.

      That is why all the greatest empires were built on grain farming. The ruling class would seek to control anything they could, but most things weren’t as easily controlled as grain agriculture. Even something like the potato was less useful to this system of control, because a potato patch could be hidden from the tax collector in a way a grain field could not.

      It is one of the most fascinating arguments I’ve come across in a while.

    • For the historical context of the American diet:

      The Big Fat Surprise
      by Nina Teicholz
      pp. 123-131

      How Americans Used to Eat

      Yet despite this shaky and often contradictory evidence, the idea that red meat is a principal dietary culprit has thoroughly pervaded our national conversation for decades. We have been led to believe that we’ve strayed from a more perfect, less meat-filled past. Most prominently, when Senator McGovern announced his Senate committee’s report, called Dietary Goals, at a press conference in 1977, he expressed a gloomy outlook about where the American diet was heading. “Our diets have changed radically within the past fifty years,” he explained, “with great and often harmful effects on our health.” Hegsted, standing at his side, criticized the current American diet as being excessively “rich in meat” and other sources of saturated fat and cholesterol, which were “linked to heart disease, certain forms of cancer, diabetes and obesity.” These were the “killer diseases,” said McGovern. The solution, he declared, was for Americans to return to the healthier, plant-based diet they once ate.

      The New York Times health columnist Jane Brody perfectly encapsulated this idea when she wrote, “Within this century, the diet of the average American has undergone a radical shift away from plant-based foods such as grains, beans and peas, nuts, potatoes, and other vegetables and fruits and toward foods derived from animals—meat, fish, poultry, eggs and dairy products.” It is a view that has been echoed in literally hundreds of official reports.

      The justification for this idea, that our ancestors lived mainly on fruits, vegetables, and grains, comes mainly from the USDA “food disappearance data.” The “disappearance” of food is an approximation of supply; most of it is probably being eaten, but much is wasted, too. Experts therefore acknowledge that the disappearance numbers are merely rough estimates of consumption. The data from the early 1900s, which is what Brody, McGovern, and others used, are known to be especially poor. Among other things, these data accounted only for the meat, dairy, and other fresh foods shipped across state lines in those early years, so anything produced and eaten locally, such as meat from a cow or eggs from chickens, would not have been included. And since farmers made up more than a quarter of all workers during these years, local foods must have amounted to quite a lot. Experts agree that this early availability data are not adequate for serious use, yet they cite the numbers anyway, because no other data are available. And for the years before 1900, there are no “scientific” data at all.

      In the absence of scientific data, history can provide a picture of food consumption in the late eighteenth to nineteenth century in America. Although circumstantial, historical evidence can also be rigorous and, in this case, is certainly more far-reaching than the inchoate data from the USDA. Academic nutrition experts rarely consult historical texts, considering them to occupy a separate academic silo with little to offer the study of diet and health. Yet history can teach us a great deal about how humans used to eat in the thousands of years before heart disease, diabetes, and obesity became common. Of course we don’t remember now, but these diseases did not always rage as they do today. And looking at the food patterns of our relatively healthy early-American ancestors, it’s quite clear that they ate far more red meat and far fewer vegetables than we have commonly assumed.

      Early-American settlers were “indifferent” farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with “the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,” as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.

      The endless bounty of America in its early years is truly astonishing. Settlers recorded the extraordinary abundance of wild turkeys, ducks, grouse, pheasant, and more. Migrating flocks of birds would darken the skies for days. The tasty Eskimo curlew was apparently so fat that it would burst upon falling to the earth, covering the ground with a sort of fatty meat paste. (New Englanders called this now-extinct species the “doughbird.”)

      In the woods, there were bears (prized for their fat), raccoons, bobolinks, opossums, hares, and virtual thickets of deer—so much that the colonists didn’t even bother hunting elk, moose, or bison, since hauling and conserving so much meat was considered too great an effort. IX

      A European traveler describing his visit to a Southern plantation noted that the food included beef, veal, mutton, venison, turkeys, and geese, but he does not mention a single vegetable. Infants were fed beef even before their teeth had grown in. The English novelist Anthony Trollope reported, during a trip to the United States in 1861, that Americans ate twice as much beef as did Englishmen. Charles Dickens, when he visited, wrote that “no breakfast was breakfast” without a T-bone steak. Apparently, starting a day on puffed wheat and low-fat milk—our “Breakfast of Champions!”—would not have been considered adequate even for a servant.

      Indeed, for the first 250 years of American history, even the poor in the United States could afford meat or fish for every meal. The fact that the workers had so much access to meat was precisely why observers regarded the diet of the New World to be superior to that of the Old. “I hold a family to be in a desperate way when the mother can see the bottom of the pork barrel,” says a frontier housewife in James Fenimore Cooper’s novel The Chainbearer.

      Like the primitive tribes mentioned in Chapter 1, Americans also relished the viscera of the animal, according to the cookbooks of the time. They ate the heart, kidneys, tripe, calf sweetbreads (glands), pig’s liver, turtle lungs, the heads and feet of lamb and pigs, and lamb tongue. Beef tongue, too, was “highly esteemed.”

      And not just meat but saturated fats of every kind were consumed in great quantities. Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard. X

      In the book Putting Meat on the American Table, researcher Roger Horowitz scours the literature for data on how much meat Americans actually ate. A survey of eight thousand urban Americans in 1909 showed that the poorest among them ate 136 pounds a year, and the wealthiest more than 200 pounds. A food budget published in the New York Tribune in 1851 allots two pounds of meat per day for a family of five. Even slaves at the turn of the eighteenth century were allocated an average of 150 pounds of meat a year. As Horowitz concludes, “These sources do give us some confidence in suggesting an average annual consumption of 150–200 pounds of meat per person in the nineteenth century.”

      About 175 pounds of meat per person per year! Compare that to the roughly 100 pounds of meat per year that an average adult American eats today. And of that 100 pounds of meat, more than half is poultry—chicken and turkey—whereas until the mid-twentieth century, chicken was considered a luxury meat, on the menu only for special occasions (chickens were valued mainly for their eggs). Subtracting out the poultry factor, we are left with the conclusion that per capita consumption of red meat today is about 40 to 70 pounds per person, according to different sources of government data—in any case far less than what it was a couple of centuries ago.

      Yet this drop in red meat consumption is the exact opposite of the picture we get from public authorities. A recent USDA report says that our consumption of meat is at a “record high,” and this impression is repeated in the media. It implies that our health problems are associated with this rise in meat consumption, but these analyses are misleading because they lump together red meat and chicken into one category to show the growth of meat eating overall, when it’s just the chicken consumption that has gone up astronomically since the 1970s. The wider-lens picture is clearly that we eat far less red meat today than did our forefathers.

      Meanwhile, also contrary to our common impression, early Americans appeared to eat few vegetables. Leafy greens had short growing seasons and were ultimately considered not worth the effort. They “appeared to yield so little nutriment in proportion to labor spent in cultivation,” wrote one eighteenth-century observer, that “farmers preferred more hearty foods.” Indeed, a pioneering 1888 report for the US government written by the country’s top nutrition professor at the time concluded that Americans living wisely and economically would be best to “avoid leafy vegetables,” because they provided so little nutritional content. In New England, few farmers even had many fruit trees, because preserving fruits required equal amounts of sugar to fruit, which was far too costly. Apples were an exception, and even these, stored in barrels, lasted several months at most.

      It seems obvious, when one stops to think, that before large supermarket chains started importing kiwis from New Zealand and avocados from Israel, a regular supply of fruits and vegetables could hardly have been possible in America outside the growing season. In New England, that season runs from June through October or maybe, in a lucky year, November. Before refrigerated trucks and ships allowed the transport of fresh produce all over the world, most people could therefore eat fresh fruit and vegetables for less than half the year; farther north, winter lasted even longer. Even in the warmer months, fruit and salad were avoided, for fear of cholera. (Only with the Civil War did the canning industry flourish, and then only for a handful of vegetables, the most common of which were sweet corn, tomatoes, and peas.)

      Thus it would be “incorrect to describe Americans as great eaters of either [fruits or vegetables],” wrote the historians Waverly Root and Richard de Rochemont. Although a vegetarian movement did establish itself in the United States by 1870, the general mistrust of these fresh foods, which spoiled so easily and could carry disease, did not dissipate until after World War I, with the advent of the home refrigerator.

      So by these accounts, for the first two hundred and fifty years of American history, the entire nation would have earned a failing grade according to our modern mainstream nutritional advice.

      During all this time, however, heart disease was almost certainly rare. Reliable data from death certificates is not available, but other sources of information make a persuasive case against the widespread appearance of the disease before the early 1920s. Austin Flint, the most authoritative expert on heart disease in the United States, scoured the country for reports of heart abnormalities in the mid-1800s, yet reported that he had seen very few cases, despite running a busy practice in New York City. Nor did William Osler, one of the founding professors of Johns Hopkins Hospital, report any cases of heart disease during the 1870s and eighties when working at Montreal General Hospital. The first clinical description of coronary thrombosis came in 1912, and an authoritative textbook in 1915, Diseases of the Arteries including Angina Pectoris, makes no mention at all of coronary thrombosis. On the eve of World War I, the young Paul Dudley White, who later became President Eisenhower’s doctor, wrote that of his seven hundred male patients at Massachusetts General Hospital, only four reported chest pain, “even though there were plenty of them over 60 years of age then.” XI About one fifth of the US population was over fifty years old in 1900. This number would seem to refute the familiar argument that people formerly didn’t live long enough for heart disease to emerge as an observable problem. Simply put, there were some ten million Americans of a prime age for having a heart attack at the turn of the twentieth century, but heart attacks appeared not to have been a common problem.

      Was it possible that heart disease existed but was somehow overlooked? The medical historian Leon Michaels compared the record on chest pain with that of two other medical conditions, gout and migraine, which are also painful and episodic and therefore should have been observed by doctors to an equal degree. Michaels catalogs the detailed descriptions of migraines dating all the way back to antiquity; gout, too, was the subject of lengthy notes by doctors and patients alike. Yet chest pain is not mentioned. Michaels therefore finds it “particularly unlikely” that angina pectoris, with its severe, terrifying pain continuing episodically for many years, could have gone unnoticed by the medical community, “if indeed it had been anything but exceedingly rare before the mid-eighteenth century.” XII

      So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. XIII

      Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating. The publication of The Jungle, Upton Sinclair’s fictionalized exposé of the meatpacking industry, caused meat sales in the United States to fall by half in 1906, and they did not revive for another twenty years. In other words, meat eating went down just before coronary disease took off. Fat intake did rise during those years, from 1909 to 1961, when heart attacks surged, but this 12 percent increase in fat consumption was not due to a rise in animal fat. It was instead owing to an increase in the supply of vegetable oils, which had recently been invented.

      Nevertheless, the idea that Americans once ate little meat and “mostly plants”—espoused by McGovern and a multitude of experts—continues to endure. And Americans have for decades now been instructed to go back to this earlier, “healthier” diet that seems, upon examination, never to have existed.

    • I was at a secondhand store today. While there, I perused the books and found one in defense of wheat, Grain of Truth by Stephen Yafa.

      He basically is coming from a traditional foods perspective, but narrowly focused on wheat alone and apparently not even familiar with other traditional food advocates (Weston A. Price, Sally Fallon Morrell, etc). The book has its limits in other ways as well, in that he barely discusses wheat exorphin (gluteomorphin) and, as far as I can tell, doesn’t discuss propionate at all. Still, it seemed like a reasonable argument was being made within the author’s focus, primarily on the advantage of traditional long-fermentation bread making such as sourdough. That is fair enough. I’m already familiar with and sympathetic to traditional foods. I’m always willing to give this kind of argument a fair hearing.

      He readily admits that modern food production has dramatically altered what we eat, and not to a good end. His main disagreement with the paleo view is that he thinks agriculture has been a net gain rather than a net loss. I’m indifferent about agriculture, being neither for nor against it on principle, but it was nice to see the author be somewhat evenhanded in admitting the downsides. In response to the asserted advances of civilization, he discusses Jared Diamond (Guns, Germs, and Steel):

      “I took that to be the logical and laudable progression until I came upon Diamond’s essay, “The Worst Mistake in the History of the Human Race.” He argues the opposite point of view. “Recent discoveries suggest that the adoption of agriculture, supposedly our most decisive step toward a better life, was in many ways a catastrophe from which we have never recovered. With agriculture came the gross social and sexual inequality, the disease and despotism, that curse our existence.” Wheat was foremost among the eight “founder crops” that doomed us.

      “Diamond accepts that while for many hunter-gatherers life might have been nasty, brutish, and short, it was also sustained by eating wild animals and wild plants, foods that “provide more protein and a better balance of nutrients.” By contrast, today just three high-carbohydrate plants—wheat, rice, and corn—provide the bulk of the calories consumed by the human species, yet each one is deficient in certain vitamins or amino acids essential to life.

      “In her New Yorker overview of the current Paleolithic trend, Elizabeth Kolbert points to research that indicates the transition from hunting to plowing was, in its own way, equally brutish. A study of human remains in China and Japan reveals that the average person’s height declined by three inches with the advent of rice cultivation; a similar decline occurred as wheat spread from the Middle East across Europe. Deadly epidemics increased as people began living closer together, creating societies that were ideal breeding grounds for microbes.”

      This does give him pause: “At the very least, these writers made me reconsider the meaning of progress.” Yet he remains a cheerleader for agriculture-based civilization as we know it: “In balance, I’m still persuaded that more good than harm came from our adoption of agriculture.” I’m more wary than he is, considering that modern civilization wouldn’t have been possible without agriculture and that modern civilization is presently causing mass extinction, ecosystem destruction, biosphere disruption, and climate change. If this is a net gain, I fear for future generations.

      He doesn’t go into any of that. But he does remain balanced in his assessment: “Yet I also know that an astonishing one third of our population—over a hundred million Americans—are obese at present” and, I might add, half or more of the population is either pre-diabetic or diabetic. Continuing, he says that he “does wonder at times if self-inflicted illness will go down as one of the unintended by-products of our hunter-to-planter transition.” And he ends that section of text with one last quote from Diamond: “We’re still struggling with the mess into which agriculture has tumbled us and it’s unclear whether we can solve it.”

      It reminds me of Brenna Hassett in her book Built on Bones. She also acknowledges the major problems of agriculture. But she makes an argument, one that I find unconvincing, that the inequality that arose with civilization must be blamed on something else. Scott makes the opposing argument: it is precisely grains that created a tax base that could pay for and support a ruling class that could maintain armies and police to control the citizenry. If grain did create inequality, it also created the rulers and the ruled. If the ruling class did dominate meat and fat consumption, they were only able to do so through the means and power given to them by the grain-based social order. The evidence for this interpretation is how little inequality existed in pre-agricultural societies and still exists among hunter-gatherers.

      Those were just some thoughts I had. Not that I blame everything on grains. Civilization in its often oppressive form is a whole mess of factors. In the present world, the inequality isn’t only within societies but between them. So it’s not only an issue of a ruling class. Even poor Westerners have more access to all foods, from meat to wheat, than many relatively well-off people in less developed countries. Most diet writers don’t deal with these issues, if they mention them at all, but they should. A healthy diet has everything to do with greater equality, fairness, and justice.

      I could always complicate things further, as I love to do. Julian Jaynes makes a good case for very little hierarchy, at least as we understand it, in the earliest civilizations. That would indicate there is something more going on. I argue that is because agriculture took a long time to take hold. The earliest farming city-states of the Bronze Age and even farming societies much later, including early America, were maintaining much of their diet through hunting and gathering. I’ve suspected that the neurocognitive and related societal changes weren’t fully seen until the 20th century because it took that long for agriculture to become so dominant a part of the human diet. The ancient so-called bicameral humans that Jaynes speculates about still had many traits of hunter-gatherer societies, including the ease of collective hallucination (a trait also exhibited by tribes such as the Piraha).

      I’ve been working on a long blog post for a while now. In it, I argue about the addictive nature of grains and sugar. It isn’t only that they are addictive but how addiction entirely alters our psychology and contributes to a mentality of individualism which, when pushed to an extreme, becomes our present hyper-individualism with its breakdown of family and community. As Johann Hari states, the addict is the ultimate individual. And as Jaynes speculates, there is a strong link between individualism and authoritarianism. If that is true, enough people switching to a significantly different diet such as keto could alter the collective mentality, culture, and worldview of a society. It would be revolutionary. I suspect that alterations in the food system have always been behind revolutions, such as the increase of sugar and tea as part of the slave trade that went hand in hand with the new mentality of the Enlightenment which was a precursor to the early modern revolutionary era.

      There are plenty of interesting ways of looking at it all.

    • I found a couple of videos talking about how peasants and rich nobles ate in Medieval times. The basic difference is that the former ate more salmon and the latter more chicken, not what one would expect considering salmon is far healthier. Both ate bread, but a big difference would have been the rich eating more refined flour and more sugary desserts. Back then, only the rich had access to or could afford sugar. Even fruit would have been rare for peasants.

      Feudalism, especially early feudalism, was actually rather healthy for peasants as they relied heavily on food that was hunted, gathered, and trapped. But that did change over time as laws became more restrictive about land use. Still, in the centuries following the collapse of the Roman Empire, health and longevity drastically improved for most of the population. The living conditions for the poor only got worse again as society moved toward modernity.

    • Looking back at the comments, something occurred to me. You assert that the “‘ruling classes’ have monopolised (or benefited from) high-protein and high-fat diets.” That has sometimes been true, but often it has not been the case, as I point out in other comments. In many societies, a high-protein and high-fat diet was as common or more common among the lower classes.

      That is beside the point, though, for this post. Ketosis is possible without either high protein or high fat. All that is required is low carb intake, whether measured as a percentage of the total diet or as an absolute amount. For example, a moderately high percentage of carbs on a calorie-restricted diet could still amount to a low absolute carb intake and, depending on how calorie-restricted, could easily be ketogenic (see the rough arithmetic sketch below). Most peasants were likely in regular ketosis, especially as fasting was a common religious practice.
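      To make the percentage-versus-amount point concrete, here is a minimal arithmetic sketch (my own illustration, not from the post or the book). It assumes roughly 4 kcal per gram of carbohydrate and a commonly cited, though individually variable, ketosis threshold of about 50 grams per day; the calorie and percentage figures are hypothetical examples.

      ```python
      # Rough illustration (assumed numbers): what matters for ketosis is the
      # absolute grams of carbohydrate, not the carb share of total calories.

      KCAL_PER_GRAM_CARB = 4          # approximate energy density of carbohydrate
      ROUGH_KETOSIS_THRESHOLD_G = 50  # commonly cited ballpark; varies by person

      def carb_grams(total_kcal: float, carb_fraction: float) -> float:
          """Convert total calories and carb share into grams of carbohydrate."""
          return total_kcal * carb_fraction / KCAL_PER_GRAM_CARB

      # A typical modern intake: 2,500 kcal at 50% carbs -> 312.5 g, far above the threshold.
      print(carb_grams(2500, 0.50), "g")

      # A severely calorie-restricted day: 600 kcal at 30% carbs -> 45 g, under the
      # rough threshold even though carbs remain a sizable share of the diet.
      print(carb_grams(600, 0.30), "g")
      ```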

      This is quite relevant to this post. I was referencing Dr. Will Cole. He is a former vegan who adopted a paleo-style ketogenic diet, but he maintains a plant-based approach. Although he argues that adding some animal foods can be good, a healthy ketogenic diet is possible with vegetarianism. A high-vegetable diet is quite common among ketogenic dieters, as influenced by vegetarianism and paleo; many ketogenic dieters started off vegetarian or paleo and so carried over the plant-based focus.

      Many ketogenic dieters purposely avoid high protein out of concern over gluconeogenesis (i.e., protein turning into glucose), not that this concern is likely important. Not all ketogenic dieters focus on high fat either, and some ketogenic advocates actually advise against eating lots of fat. A moderate-fat and moderate-protein diet will serve the same ketogenic purposes, as long as it is low enough in carbs. Hindu, Buddhist, and other monks would have spent much of their time in ketosis.

    • Below is a good discussion. The ancient Egyptians are mentioned.

      Apparently, the rich and the poor ate the same basic diet. The dry conditions have allowed a wide variety of corpses to remain intact for study. My guess was that the poor would’ve been more dependent on foods they fished, hunted, and gathered. But apparently I was wrong. Or else the amount of food they got by those means was extremely limited.

      So ancient Egyptians in general were eating more or less what mainstream experts today consider a healthy diet, the high-carb/low-fat pattern recommended in the Food Pyramid. In particular, they grew a lot of wheat and enjoyed their bread, as their teeth indicate. What is interesting is that they had rotten teeth despite having little access to sweeteners. It was the carbs from wheat and tubers that caused the cavities.

      They had high rates of the diseases of civilization, including being overweight. Also, according to their statues, which unlike those of the Greeks offered honest portrayals, the wealthy had big guts and the men had man-breasts, as is typical of a high-carb diet. That isn’t to say they weren’t eating meat too, but meat intake obviously wasn’t as high as in the past, prior to agriculture.

    • Coming across your comment again, I have a different response I could offer. Yes, the ruling classes will monopolize what is perceived as most desirable. Sometimes that means the healthiest foods, but not always.

      In Japan, the upper classes ate white rice before the poor did. So the wealthy developed particular deficiency-related diseases (e.g., beriberi) because white rice has the hull removed, which is where the vitamin B1 is located. It was a similar pattern to how sugar was at first only available and affordable to wealthy Europeans. All of the diseases of civilization also appeared among wealthy Europeans first, as happened later with wealthy Americans, as seen in the obsession over neurasthenia.

      What the ruling classes monopolized depended on many factors. Sometimes it was protein and fat that were monopolized, but at other times it was particular kinds of carbs. I’d also note that, even in the Middle Ages, certain fatty proteins were considered poverty foods, such as fish and shellfish (including lobster) and maybe eggs as well (since eating the chickens themselves was mostly limited to the well off; after all, once you ate a hen, it couldn’t lay any more eggs).

      Also, in many traditional societies, pork and lard, along with butter and dairy, were common foods among the poor. The reason is that these animals are so easy to raise under diverse conditions. There is no land on planet Earth where people can’t keep pigs, cows, goats, sheep, camels, or some other similar animal that provides meat, fat, and/or dairy. Sometimes these animal foods were used more sparingly, depending on access to land for pasturage, not to mention land for hunting and fishing.

      For example, peasants were healthier in the early Middle Ages when there was more open land and hence great intake of animal foods. The same pattern was true for Americans, from the colonial period to the early 1900s, prior to the enclosure of vast wilderness areas and other open lands. Many families during the Great Depression survived on subsistence farming, gardening, hunting, and fishing. Americans in the prior centuries were the tallest people in the Western world because of their heavy dependence on fatty animal foods, much of it wild-caught.

  2. As an interesting note, many people argue that type 1 diabetes is genetically determined. But the existence of a genetic predisposition is far from proof of determinism. The data points elsewhere.

    Diabetes is increasing, and type 1 specifically is increasing faster than type 2, the complete opposite of what would be expected if it were genetically determined. Evidence that it is related to diet and/or gut health is that type 1 diabetes has been strongly linked to celiac disease, the extreme autoimmune response to gluten.

    Autoimmune disorders appear to be central to so many diseases. Another example of this is how many working in the field are beginning to call Alzheimer’s type 3 diabetes. Alzheimer’s, by the way, is yet another autoimmune disorder.

    https://beyondtype1.org/celiac-disease/

    “Among the whole population, celiac disease is relatively rare; only 1% of people are diagnosed. However, if you are diagnosed with Type 1 diabetes, your risk for getting celiac disease increases tenfold.”

    https://robbwolf.com/2008/09/17/paleo-vs-type-1-diabetes/

    “It is well understood that Type 1 Diabetes is a failure of the beta cells of the pancreas to produce insulin. This is generally acknowledged to be the result of an autoimmune response, usually attributed to a viral infection or some kind of trauma. What is less known is the role of grain lectins in this process. Many people benefit not only from reducing the recommended American Diabetes Association 60% carb diet (higher even than the diet that causes most of the type 2 diabetes we see) because of a more fat fueled metabolism but also, occasionally, we see a return of normal pancreatic function with the removal of the neo-lithic foods. The inflammation and immune response that has been beating down the beta cells cease, some repair occurs and things come back to normal. This is not the norm unfortunately, but it does happen. Even without the full return of pancreatic function, the reduced carb, higher fat paleo diet greatly mitigates the accelerated aging and systemic inflammation inherent in both type 1 and type 2 diabetes.”

    http://diabetes.diabetesjournals.org/content/58/7/1578.long

    “Mounting evidence suggests that the gut immune system is involved in the development of autoimmune diabetes. An inflammatory state has been demonstrated to be present in the structurally normal intestine of patients with type 1 diabetes, and the abnormal intestinal permeability that has been found in these patients could represent a contributing factor.”

    https://robbwolf.com/2009/11/09/type-1-diabetes-the-gut-connection/

    “I know it’s kinda thick, technical stuff but what this paper indicates is when a doctor (or anyone else) sits back and dismisses the gut autoimmune connection for Type 1 diabetes (or any other autoimmune disease) they are really missing the boat. At one time it was assumed celiac only affected 1 in 15,000. then it was 1 in 1,500 now the official story is something like 1 in 125. This is all bullocks, this issue affects everyone, it just manifests differently in different people. One of the main antibodies they talk about in the paper, TG2, (trans glutaminase) is an enzyme that alters EVERY PROTEIN IN THE BODY. This is why one pathological vector (leaky gut) can manifest in so many different ways (all the autoimmune diseases known and unknown).”

    https://www.drperlmutter.com/tag/type-1-diabetes/

    “Bacteriophages are a type of virus that can infect bacteria and alter their function. First identified in 1917, bacteriophages have been long overlooked in terms of their potential contribution to human disease.

    “Our interview today is with Dr. George Tetz, one of the world leaders in bacteriophage research. He has identified pathways whereby bacteriophages can alter gut bacteria in such a way so as to modify their function in the human body. His work relates bacteriophage activity with autoimmune conditions, like Type 1 diabetes. He has also found strong connections between bacteriophages and Parkinson’s disease, a subject into which he’ll dive deeper in our discussion. He believes that these bacteria-infecting viruses may also play a prominent role in other neurodegenerative conditions, including amyotrophic lateral sclerosis (ALS) and even Alzheimer’s disease.”

    https://paleoleap.com/paleo-and-type-1-diabetes/

    “Part of it is genetic, but genetics can’t tell the whole story, and especially not the rapidly increasing rate of the disease. This study lays out the case for a strong influence of gut flora on Type 1 Diabetes:

    ” – Gut flora are a very important part of the immune system. Specifically, they control the permeability of the intestinal lining, which is involved in the development of many different autoimmune diseases, including Type 1 Diabetes.
    ” – Many things that affect the gut flora, like method of delivery and use of antibiotics, also affect the risk of developing Type 1 Diabetes. Interestingly, early childhood introduction to grains and cow’s milk may also be a risk factor.
    ” – People with Type 1 Diabetes have measurably different gut flora than healthy controls.

    “That all points to a role for the gut flora in influencing Type 1 Diabetes.”

    https://www.glutenfreeliving.com/gluten-free/beyond-gluten-free/keys-ketogenic-diet/

    “For those with diabetes, research shows that banishing carbohydrates leads to weight loss and decreased need for diabetes medications. “For people with prediabetes, metabolic syndrome and polycystic ovarian syndrome [PCOS], eating a ketogenic diet can help keep their insulin levels under control,” Spritzler advises. “Plus, I notice that there is increased energy when eating low carb—a benefit for everyone!”

    “A high-fat diet may bring to mind heart disease, but the research on low-carb diets and cholesterol levels are, for the most part, favorable. Some studies on children who follow the ketogenic diet for epilepsy do show increased levels of bad cholesterol (LDL) and triglycerides, at least for the first few months after starting the plan. However, research in adults has shown that “good” cholesterol levels (HDL) rise and triglycerides decrease when cutting carbohydrates. It is important to note that the majority of studies involving the ketogenic diet are of short duration, so long-term effects are not necessarily known.

    “In addition to diabetes and weight control, new research may point to a role for the ketogenic diet in the treatment of certain cancers and neurological disorders such as Alzheimer’s disease and Parkinson’s disease.”

  3. Ketone Bodies in Neurological Diseases: Focus on Neuroprotection and Underlying Mechanisms
    Huajun Yang, Wei Shan, Fei Zhu, Jianping Wu, and Qun Wang

    https://www.frontiersin.org/articles/10.3389/fneur.2019.00585/full

    There is growing evidence that ketone bodies, which are derived from fatty acid oxidation and usually produced in fasting state or on high-fat diets have broad neuroprotective effects. Although the mechanisms underlying the neuroprotective effects of ketone bodies have not yet been fully elucidated, studies in recent years provided abundant shreds of evidence that ketone bodies exert neuroprotective effects through possible mechanisms of anti-oxidative stress, maintaining energy supply, modulating the activity of deacetylation and inflammatory responses. Based on the neuroprotective effects, the ketogenic diet has been used in the treatment of several neurological diseases such as refractory epilepsy, Parkinson’s disease, Alzheimer’s disease, and traumatic brain injury. The ketogenic diet has great potential clinically, which should be further explored in future studies. It is necessary to specify the roles of components in ketone bodies and their therapeutic targets and related pathways to optimize the strategy and efficacy of ketogenic diet therapy in the future. […]

    Summary and Perspectives

    KD is used in the treatment of several neurological diseases for many years, and a large number of studies recently have validated the role of KD in neuroprotection. It could play the neuroprotective role by reducing oxidative stress, maintaining energy metabolism, modulating inflammation, modulating the activity of deacetylation, and other possible mechanisms. Although the specific mechanisms of KD in the treatment of neurological diseases are still uncertain, it is inevitable that all neurological diseases could affect human health through oxidative damage, energy metabolism disorders or inflammatory reactions. Neurological diseases often involve multiple mechanisms, and KD may also play a role through these mechanisms. In some diseases, such as epilepsy, AD, PD, KD can play a therapeutic role, while in some others, it plays a supporting role, facilitating the therapy of the disease, improving the symptoms and quality of life in patients. KD has excellent potential in clinical application, which further requires exploration. Future studies are necessary to further specify the roles of components in KBs and their therapeutic targets and related pathways, to optimize the strategy and efficacy of KD therapy.
