Old Debates Forgotten

Since early last year, I’ve done extensive reading, largely but not entirely focused on health. This has particularly concerned diet and nutrition, although it has crossed over into the territory of mental health with neurocognitive issues, addiction, autism, and much else, my personal concern being depression. The point of this post is to consider some of the historical background. Before I get to that, let me explain how my recent interests have developed.

What got me heading in this direction was the documentary The Magic Pill. It’s about the paleo diet. The practical advice was worth the time spent, though other things drew me into the larger arena of low-carb debate. What makes the paleo diet compelling is that it offers a framework of understanding that takes in many scientific fields involving health beyond diet alone, and it also explores historical records, anthropological research, and archaeological evidence. The paleo diet community in particular, along with the low-carb diet community in general, is also influenced by the traditional foods approach of Sally Fallon Morell. She is the lady who, more than anyone else, popularized the work of Weston A. Price, an early 20th century dentist who traveled the world and studied traditional populations. I was already familiar with this area from having read Morell’s first book in the late ’90s or early aughts.

New to me were the writings of Gary Taubes and Nina Teicholz, two science journalists who have helped to shift the paradigm in nutritional studies. They accomplished this not only by presenting detailed surveys of the research and other evidence but also by contextualizing the history of the powerful figures, institutions, and organizations that shaped the modern industrial diet. I didn’t realize how far back this debate went: writings on fasting for epilepsy are found in ancient texts; a low-carb (apparently ketogenic) diet was recommended for diabetes as early as the 1790s; various low-carb and animal-based diets were popularized for weight loss and general health during the 19th century; and the ketogenic diet was studied for epilepsy beginning in the 1920s. Yet few know this history.

Ancel Keys was one of those powerful figures who, in suppressing his critics and silencing debate, effectively advocated for the standard American diet of high carbs, grains, fruits, vegetables, and industrial seed oils. In The Magic Pill, more recent context is given by following the South African trial of Tim Noakes. Other documentaries have covered this kind of material, often with interviews with Gary Taubes and Nina Teicholz. There has been immense drama involved and, in the past, there was also much public disagreement and discussion. Only now is that returning to mainstream awareness in the corporate media, largely because social media has forced it out into the open. But what interests me is how old the debate is and how much livelier it often was in the past.

The post-revolutionary era created a sense of crisis that, by the mid-19th century, was becoming a moral panic. The culture wars were taking shape. The difference back then was that there was much more of a sense of the connection between physical health, mental health, moral health, and societal health. As a broad understanding, health was seen as key, and this was informed by the developing scientific consciousness and the free speech movement. The hunger for knowledge was hard to suppress, although there were many attempts as the century went on. I tried to give a sense of this period in two massive posts, The Crisis of Identity and The Agricultural Mind. It’s hard to imagine what that must’ve been like. That scientific and public debate was largely shut down around the era of the World Wars, as the oppressive Cold War era took over. Why?

It is strange. The work of Taubes and Teicholz hints at what changed, although the original debate was much wider than diet and nutrition. The info I’ve found about the past has largely come from scholarship in other fields, such as historical and literary studies. Those older lines of thought are mostly treated as historical curiosities at this point, background info for the analysis of entirely other subjects. As for the majority of scientists, doctors, and nutritionists these days, they are almost entirely ignorant of the ideologies that shaped modern thought about disease and health.

This is seen, as I point out, in how Galen’s ancient Greek theory of humors, as incorporated into Medieval Christianity, appears to be the direct source of the basic arguments for a plant-based diet, specifically in terms of the scapegoating of red meat, saturated fat, and cholesterol. Among what I’ve come across, the one scholarly book that covers this in detail is Food and Faith in Christian Culture, edited by Ken Albala and Trudy Eden. Bringing that into present times, Belinda Fettke dug up how so much of contemporary nutritional studies and dietary advice was built on the foundation of 19th-20th century vegan advocacy by the Seventh-day Adventists. I’ve never met anyone adhering to “plant-based” ideology who knows this history. Yet now it is becoming common knowledge in the low-carb world.

On the literary end of things, there is a fascinating work by Bryan Kozlowski, The Jane Austen Diet. I enjoyed reading it, in spite of never having cracked open a book by Jane Austen. Kozlowski, although no scholar, was able to dredge up much of interest about those post-revolutionary decades in British society. For one, he shows how obesity was becoming noticeable even back then and how many were already aware of the benefits of low-carb diets. He also makes clear that the ability to maintain a vegetable garden was a sign of immense wealth, not a means for putting much food on the tables of the poor. This is corroborated by Teicholz’s discussion of how gardening in American society, prior to modern technology and chemicals, was difficult and not dependable. More importantly, Kozlowski’s book explains what ‘sensibility’ meant back then, related to ‘nerves’ and ‘vapors’ and later given the more scientific-sounding label of ‘neurasthenia’.

I came across another literary example of historical exegesis about health and diet, Sander L. Gilman’s Franz Kafka, the Jewish Patient. Kafka was an interesting case, a lifelong hypochondriac who, it turns out, had good reason to be. He felt that he had inherited a weak constitution and blamed his psychological troubles on it, but more likely causes were urbanization, industrialization, and a vegetarian diet that was probably also a high-carb diet based on nutrient-depleted processed foods, in a time before industrial foods were fortified and before many nutritional supplements were available.

What was most educational about the text, though, was Gilman’s historical detail on tuberculosis in European thought, specifically in relationship to Jews. To some extent, Kafka had internalized racial ideology, and that is unsurprising. Eugenics was in the air and racial ideology penetrated everything, especially health in terms of racial hygiene. Even for those who weren’t eugenicists, all debate of that era was marked by the expected biases and limitations. Some theorizing was better than others and certainly not all of it was racist, but the entire debate was perhaps tainted by the events that would follow. With the defeat of the Nazis, eugenics fell out of favor for obvious reasons and an entire era of debate was silenced, including many of the arguments that were opposed to or separate from eugenics. Then historical amnesia set in, as many people wanted to forget the past and instead focus on the future. That was unfortunate. The past doesn’t simply disappear but continues to haunt us.

That earlier debate was a struggle between explanations and narratives. With modernity fully taking hold, people wanted to understand what was happening to humanity and where it was heading. It was a time of contrasts, which made the consequences of modernity quite stark. There were plenty of communities that were still pre-industrial, rural, and traditional, but since then most of these communities have died away. The diseases of civilization, at this point, have become increasingly normalized as living memory of anything else has disappeared. It’s not that the desire for ideological explanations has disappeared. What happened was that, with the victory in WWII, a particular grand narrative came to dominate the entire Western world and there simply were no other grand narratives to compete with it. Much of the pre-war debate and even scientific knowledge, especially in Europe, was forgotten as the records of it were destroyed, went untranslated, or lost perceived relevance.

Nonetheless, all of those old ideological conflicts were left unresolved. The concerns then are still concerns now. So many of the problems worried about back then are getting worse. The connections between various aspects of health have regained their old sense of urgency. The public is once again challenging authorities, questioning received truths, and seeking new meaning. The debate never ended and here we are again, and one could add that fascism is also back rearing its ugly head. It’s worrisome that the political left seems to be slow on the uptake. There are reactionary right-wingers like Jordan Peterson who are offering visions of meaning and who have also become significant figures in the dietary world, by way of the carnivore diet he and his daughter are on. Then there are the conspiratorial paleo-libertarians such as Tristan Haggard, another carnivore advocate.

This is far from being limited to carnivory and the low-carb community includes those across the political spectrum, but it seems to be the right-wingers who are speaking the loudest. The left-wingers who are speaking out on diet come from the confluence of veganism/vegetarianism and environmentalism, as seen with EAT-Lancet (Dietary Dictocrats of EAT-Lancet). The problem with this, besides much of this narrative being false (Carnivore is Vegan), is that it is disconnected from the past. The right-wing is speaking more to the past than is the left-wing, such as Trump’s ability to invoke and combine the Populist and Progressive rhetoric from earlier last century. The political left is struggling to keep up and is being led down ideological dead-ends.

If we want to understand our situation now, we better study carefully what was happening in centuries past. We are having the same old debates without realizing it and we very well might see them lead to the same kinds of unhappy results.

Moral Panic and Physical Degeneration

From the beginning of the country, there has been an American fear of moral and mental decline that was always rooted in the physical, involving issues of vitality of the land and health of the body, and built on an ancient divide between the urban and rural. Over time, it grew into a fever pitch of moral panic about degeneration and degradation of WASP culture, the white race, and maybe civilization itself. Some believed the end was near, that they might be able to hold out for another few generations before finally succumbing to disease and weakness. The need for revitalization and rebirth became a collective project (Jackson Lears, Rebirth of a Nation), which sadly fed into ethno-nationalist bigotry and imperialistic war-mongering: Make America Great Again!

A major point of crisis, of course, was the Civil War. Racial ideology became predominant, not only because of slavery but maybe more so because of mass immigration, the latter being the main reason the North won. Racial tensions merged with the developing scientific mindset of Darwinism, and out of this mix came eugenics. For all that we can now dismiss this kind of simplistic ignorance and with hindsight see the danger it led to, the underlying anxieties were real. Urbanization and industrialization were having an obvious impact on public health that was observed by many, and it wasn’t limited to mere physical ailments. “Cancer, like insanity, seems to increase with the progress of civilization,” noted Stanislas Tanchou, a mid-19th century French physician.

The diseases of civilization, including mental sickness, have been spreading for centuries (millennia, actually, considering the ‘modern’ chronic health conditions were first detected in the mummies of the agricultural Egyptians). Consider how talk of depression suddenly showed up in written accounts with the ending of feudalism (Barbara Ehrenreich, Dancing in the Streets). That era included the enclosure movement that forced millions of then landless serfs into the desperate conditions of crowded cities and colonies where they faced stress, hunger, malnutrition, and disease. The loss of rural life hit Europe much earlier than America, but it eventually came here as well. The majority of white Americans were urban by the beginning of the 20th century and the majority of black Americans were urban by the 1970s. There has been a consistent pattern of mass problems following urbanization, everywhere it happens. It still is happening. The younger generation, more urbanized than any generation before, is seeing rising rates of psychosis, concentrated specifically in the most urbanized areas.

In the United States, the last decades of the 19th century were the turning point, the period of the first truly big cities. Into this milieu, Weston A. Price was born (1870) in a small rural village in Canada. As an adult, he became a dentist and sought work in Cleveland, Ohio (1893). Initially, most of his patients probably had, like him, grown up in rural areas. But over the decades, he increasingly was exposed to younger generations that had spent their entire lives in the city. Lierre Keith puts Price’s early observations in context, after pointing out that he started his career in 1893: “This date is important, as he entered the field just prior to the glut of industrial food. Over the course of the next thirty years, he watched children’s dentition — and indeed their overall health deteriorate. There was suddenly children whose teeth didn’t fit in their mouths, children with foreshortened jaws, children with lots of cavities. Not only were their dental arches too small, but he noticed their nasal passages were also too narrow, and they had poor health overall; asthma, allergies, behavioral problems” (The Vegetarian Myth, p. 187). This was at a time when the industrialization of farming and food had reached a new level, far beyond the limited availability of canned foods in the mid-to-late 1800s, when most Americans still relied on a heavy amount of wild-sourced meat, fish, nuts, etc. Even city-dwellers in early America had ready access to wild game because of the abundance of surrounding wilderness areas. In fact, in the 19th century, the average American ate more meat (mostly hunted) than bread.

We are once again coming back to the ever-recurrent moral panic about the civilizational project. The same fears given voice in the late 19th to early 20th century are being repeated again. For example, Dr. Leonard Sax alerts us to how girls are sexually maturing early (1% of female infants showing signs of puberty), whereas boys are maturing later. As a comparison, hunter-gatherers don’t have such a large gender disparity in puberty, nor do girls experience puberty so early; instead, both genders typically reach puberty around 18 years old, with sex, pregnancy, and marriage happening more or less simultaneously. Dr. Sax, along with others, speculates about a number of reasons. Commonly blamed causes include health factors, from diet to chemicals. Beyond altered puberty, many other examples could be added: heart disease, autoimmune disorders, mood disorders, autism, ADHD, etc., all of them increasing and worsening with each generation (e.g., type 2 diabetes used to be known as adult-onset diabetes but now is regularly diagnosed in young children; the youngest victim recorded recently was three years old when diagnosed).

In the past, Americans responded to moral panic with genocide of Native Americans, Prohibition targeting ethnic (hyphenated) Americans and the poor, and immigration restrictions to keep the bad sort out; the spread of racism and vigilantism such as the KKK, Jim Crow, sundown towns, and redlining; forced assimilation such as English-only laws and public schools; internment camps for not only Japanese-Americans but also German-Americans and Italian-Americans; implementation of citizen-making projects like the national park system, Boy Scouts, WPA, and CCC; promotion of eugenics, war on poverty (i.e., war on the poor), imperial expansionism, neo-colonial exploitation, and world wars; et cetera. The cure sought was often something to be forced onto the population by a paternalistic elite, that is to say rich white males, most specifically WASPs of the capitalist class.

Eugenics was, of course, one of the main focuses as it carried the stamp of science (or rather scientism). Yet at the same time, there were those challenging biological determinism and race realism, as views shifted toward environmental explanations. The anthropologists were at the front lines of this battle, but there were also Social Christians who changed their minds after having seen poverty firsthand. Weston A. Price, however, didn’t come to this from a consciously ideological position or religious motivation. He was simply a dentist who couldn’t ignore the severe health issues of his patients. So, he decided to travel the world in order to find healthy populations to study, in the hope of explaining why the change had occurred (Nutrition and Physical Degeneration).

Although familiar with eugenics literature, what Price observed in ‘primitive’ communities (including isolated villages in Europe) did not conform to eugenicist thought. It didn’t matter which population he looked at. Those who ate traditional diets were healthy and those who ate an industrialized Western diet were not. And it was a broad pattern that he saw everywhere he went, not only physical health but also neurocognitive health as indicated by happiness, low anxiety, and moral character. Instead of blaming individuals or races, he saw the common explanation as nutrition and he made a strong case by scientifically analyzing the nutrition of available foods.

In reading about traditional foods, the paleo diet/lifestyle, and functional medicine, Price’s work comes up quite often. He took many photographs that compared people from healthy and unhealthy populations. The contrast is stark. But what really stands out is how few people in the modern world look anywhere near as healthy as those from the healthiest societies of the past. I live in a fairly wealthy college and medical town where there is a far above average concern for health along with access to healthcare. Even so, I now can’t help noticing how many people around me show signs of stunted or perturbed development of the exact kind Price observed in great detail: thin bone structure, sunken chests, sloping shoulders, narrow facial features, asymmetry, etc. That is even with modern healthcare correcting some of the worst conditions: cavities, underbites, pigeon-toes, etc. My fellow residents in this town are among the most privileged people in the world and, nonetheless, their state of health says something sad about humanity at present.

It makes me wonder, as it made Price wonder, what consequences this has on neurocognitive health for individuals and the moral health of society. Taken alone, it isn’t enough to get excited about. But put in the larger context of looming catastrophes, it does become concerning. It’s not clear that our health will be up to the task of the problems we need to solve. We are a sickly population, far more sickly than when moral panic took hold in past generations.

As important, there is the personal component. I’m at a point where I’m not going to worry too much about decline and maybe collapse of civilization. I’m kind of hoping the American Empire will meet its demise. Still, that leaves us with many who suffer, no matter what happens to society as a whole. I take that personally, as one who has struggled with physical and mental health issues. And I’ve come around to Price’s view of nutrition as being key. I see these problems in other members of my family and it saddens me to watch as health conditions seem to get worse from one generation to the next.

It’s far from being a new problem, the central point I’m trying to make here. Talking to my mother, she has a clear sense of the differences on the two sides of her family. Her mother’s family came from rural areas and, even after moving to a larger city for work, they continued to hunt on a daily basis as there were nearby fields and woods that made that possible. They were a healthy, happy, and hard-working lot. They got along well as a family. Her father’s side of the family was far different. They had been living in towns and cities for several generations by the time she was born. They didn’t hunt at all. They were known for being surly, holding grudges, and being mean drunks. They also had underbites (i.e., underdeveloped jaw structure) and seemed to have had learning disabilities, though no one was diagnosing such conditions back then. Related to this difference, my mother’s father raised rabbits whereas my mother’s mother’s family hunted rabbits (and other wild game). This makes a big difference in terms of nutrition, as wild game has higher levels of omega-3 fatty acids and fat-soluble vitamins, all of which are key to optimal health and development.

What my mother observed in her family is basically the same as what Price observed in hundreds of communities in multiple countries on every continent. And I now observe the same pattern repeating. I grew up with an underbite. My brothers and I all required orthodontic work, as do so many now. I was diagnosed with a learning disability when young. My oldest brother maybe didn’t have a learning disability, but behavioral issues were apparent when he was young, likely related to his mildew allergies and probably an underlying autoimmune condition. I know I had food allergies as a child, as I think my other brother did as well. All of us have had a fair diversity of neurocognitive and psychological issues besides learning disabilities: stuttering, depression, anxiety, and maybe some Asperger’s.

Now another generation is coming along with increasing rates of major physical and mental health issues. My nieces and nephews are sick all the time. They don’t eat well and are probably malnourished. During a medical checkup for my nephew, my mother asked the doctor about his extremely unhealthy diet, consisting mostly of white bread and sugar. The doctor bizarrely dismissed it as ‘normal’ since, as she claimed, no kid eats healthy. If that is the new normal, maybe we should be in a moral panic.

* * *

Violent Behavior: A Solution in Plain Sight
by Sylvia Onusic

Nutrition and Mental Development
by Sally Fallon Morell

You Are What You Eat: The Research and Legacy of Dr. Weston Andrew Price
by John Larabell

While practicing in his Cleveland office, Dr. Price noticed an increase in dental problems among the younger generations. These issues included the obvious dental caries (cavities) as well as improper jaw development leading to crowded, crooked teeth. In fact, the relatively new orthodontics industry was at that time beginning to gain popularity. Perplexed by these modern problems that seemed to be affecting a greater and greater portion of the population, Dr. Price set about to research the issue by examining people who did not display such problems. He suspected (correctly, as he would later find) that many of the dental problems, as well as other degenerative health problems, that were plaguing modern society were the result of inadequate nutrition owing to the increasing use of refined, processed foods.

Nasty, Brutish and Short?
by Sally Fallon Morell

It seems as if the twentieth century will exit with a crescendo of disease. Things were not so bad back in the 1930’s, but the situation was already serious enough to cause one Cleveland, Ohio dentist to be concerned. Dr. Weston Price was reluctant to accept the conditions exhibited by his patients as normal. Rarely did an examination of an adult patient reveal anything but rampant decay, often accompanied by serious problems elsewhere in the body, such as arthritis, osteoporosis, diabetes, intestinal complaints and chronic fatigue. (They called it neurasthenia in Price’s day.) But it was the dentition of younger patients that alarmed him most. Price observed that crowded, crooked teeth were becoming more and more common, along with what he called “facial deformities” – overbites, narrowed faces, underdevelopment of the nose, lack of well-defined cheekbones and pinched nostrils. Such children invariably suffered from one or more complaints that sound all too familiar to mothers of the 1990’s: frequent infections, allergies, anemia, asthma, poor vision, lack of coordination, fatigue and behavioral problems. Price did not believe that such “physical degeneration” was God’s plan for mankind. He was rather inclined to believe that the Creator intended physical perfection for all human beings, and that children should grow up free of ailments.

Is it Mental or is it Dental?
by Raymond Silkman

The widely held model of orthodontics, which considers developmental problems in the jaws and head to be genetic in origin, never made sense to me. Since they are wedded to the genetic model, orthodontists dealing with crowded teeth end up treating the condition with tooth extraction in a majority of the cases. Even though I did not resort to pulling teeth in my practice, and I was using appliances to widen the jaws and getting the craniums to look as they should, I still could not come up with the answer as to why my patients looked the way they did. I couldn’t believe that the Creator had given them a terrible blueprint – it just did not make sense. In four years of college education, four years of dental school education and almost three years of post-graduate orthodontic training, students never hear a mention of Dr. Price, so they never learn the true reasons for these malformations. I have had the opportunity to work with a lot of very knowledgeable doctors in various fields of allopathic and alternative healthcare who still do not know about Dr. Price and his critical findings.

These knowledgeable doctors have not stared in awe at the beautiful facial development that Price captured in the photographs he took of primitive peoples throughout the globe and in so doing was able to answer this most important question: What do humans look like in health? And how have humans been able to carry on throughout history and populate such varied geographical and physical environments on the earth without our modern machines and tools?

The answer that Dr. Price was able to illuminate came through his photographs of beautiful, healthy human beings with magnificent physical form and mental development, living in harmony with their environments. […]

People who are not well oxygenated and who have poor posture often suffer from fatigue and fibromyalgia symptoms, they snore and have sleep apnea, they have sinusitis and frequent ear infections. Life becomes psychologically and physically challenging for them and they end up with long-term dependence on medications—and all of that just from the seemingly simple condition of crowded teeth.

In other words, people with poor facial development are not going to live very happily. […]

While very few people have heard of the work of Weston Price these days, we haven’t lost our ability to recognize proper facial form. To make it in today’s society, you must have good facial development. You’re not going to see a general or a president with a weak chin, you’re not going to see coaches with weak chins, you’re not going to see a lot of well-to-do personalities in the media with underdeveloped faces and chins. You don’t see athletes and newscasters with narrow palates and crooked teeth.

Weston A. Price: An Unorthodox Dentist
by Nourishing Israel

Price discovered that the native foods eaten by the isolated populations were far more nutrient dense than the modern foods. In the first generation that changed their diet there was noticeable tooth decay; in subsequent generations the dental and facial bone structure changed, as well as other changes that were seen in American and European families and previously considered to be the result of interracial marriage.

By studying the different routes that the same populations had taken – traditional versus modern diet – he saw that the health of the children is directly related to the health of the parents and the germ plasms that they provide, and are as important to the child’s makeup as the health of the mother before and during pregnancy.

Price also found that primitive populations were very conscious of the importance of the mothers’ health and many populations made sure that girls were given a special diet for several months before they were allowed to marry.

Another interesting finding was that although genetic makeup was important, it did not have as great a degree of influence on a person’s development and health as was thought, but that a lot of individual characteristics, including brain development and brain function, were due to environmental influence, what he called “intercepted heredity”.

The origin of personality and character appear in the light of the newer data to be biologic products and to a much less degree than usually considered pure hereditary traits. Since these various factors are biologic, being directly related to both the nutrition of the parents and to the nutritional environment of the individuals in the formative and growth period any common contributing factor such as food deficiencies due to soil depletion will be seen to produce degeneration of the masses of people due to a common cause. Mass behavior therefore, in this new light becomes the result of natural forces, the expression of which may not be modified by propaganda but will require correction at the source. [1] …

It will be easy for the reader to be prejudiced since many of the applications suggested are not orthodox. I suggest that conclusions be deferred until the new approach has been used to survey the physical and mental status of the reader’s own family, of his brothers and sisters, of associated families, and finally, of the mass of people met in business and on the street. Almost everyone who studies the matter will be surprised that such clear-cut evidence of a decline in modern reproductive efficiency could be all about us and not have been previously noted and reviewed.[2]

From Nutrition and Physical Degeneration by Weston Price

Food Freedom – Nourishing Raw Milk
by Lisa Virtue

In 1931 Price visited the people of the Loetschental Valley in the Swiss Alps. Their diet consisted of rye bread, milk, cheese and butter, including meat once a week (Price, 25). The milk was collected from pastured cows, and was consumed raw: unpasteurized, unhomogenized (Schmid, 9).

Price described these people as having “stalwart physical development and high moral character…superior types of manhood, womanhood and childhood that Nature has been able to produce from a suitable diet and…environment” (Price, 29). At this time, Tuberculosis had taken more lives in Switzerland than any other disease. The Swiss government ordered an inspection of the valley, revealing not a single case. No deaths had been recorded from Tuberculosis in the history of the Loetschental people (Schmid, 8). Upon return home, Price had dairy samples from the valley sent to him throughout the year. These samples were higher in minerals and vitamins than samples from commercial (thus pasteurized) dairy products in America and the rest of Europe. The Loetschental milk was particularly high in fat soluble vitamin D (Schmid, 9).

The daily intake of calcium and phosphorous, as well as fat soluble vitamins would have been higher than average North American children. These children were strong and sturdy, playing barefoot in the glacial waters into the late chilly evenings. Of all the children in the valley eating primitive foods, cavities were detected at an average of 0.3 per child (Price, 25). This without visiting a dentist or physician, for the valley had none, seeing as there was no need (Price, 23). To offer some perspective, the rate of cavities per child between the ages of 6-19 in the United States has been recorded to be 3.25, over 10 times the rate seen in Loetschental (Nagel).

Price offers some perspective on a society subsisting mainly on raw dairy products: “One immediately wonders if there is not something in the life-giving vitamins and minerals of the food that builds not only great physical structures within which their souls reside, but builds minds and hearts capable of a higher type of manhood…” (Price, 26).

100 Years Before Weston Price
by Nancy Henderson

Like Price, Catlin was struck by the beauty, strength and demeanor of the Native Americans. “The several tribes of Indians inhabiting the regions of the Upper Missouri. . . are undoubtedly the finest looking, best equipped, and most beautifully costumed of any on the Continent.” Writing of the Blackfoot and Crow, tribes who hunted buffalo on the rich glaciated soils of the American plains, “They are the happiest races of Indian I have met—picturesque and handsome, almost beyond description.”

“The very use of the word savage,” wrote Catlin, “as it is applied in its general sense, I am inclined to believe is an abuse of the word, and the people to whom it is applied.” […]

As did Weston A. Price one hundred years later, Catlin noted the fact that moral and physical degeneration came together with the advent of civilized society. In his late 1830s portrait of “Pigeon’s Egg Head (The Light) Going to and Returning from Washington” Catlin painted him corrupted with “gifts of the great white father” upon his return to his native homeland. Those gifts included two bottles of whiskey in his pockets. […]

Like Price, Catlin discusses the issue of heredity versus environment. “No diseases are natural,” he writes, “and deformities, mental and physical, are neither hereditary nor natural, but purely the result of accidents or habits.”

So wrote Dr. Price: “Neither heredity nor environment alone cause our juvenile delinquents and mental defectives. They are cripples, physically, mentally and morally, which could have and should have been prevented by adequate education and by adequate parental nutrition. Their protoplasm was not normally organized.”

The Right Price
by Weston A. Price Foundation

Many commentators have criticized Price for attributing “decline in moral character” to malnutrition. But it is important to realize that the subject of “moral character” was very much on the minds of commentators of his day. As with changes in facial structure, observers in the first half of the 20th century blamed “badness” in people on race mixing, or on genetic defects. Price quotes A.C. Jacobson, author of a 1926 publication entitled Genius (Some Revaluations),35 who stated that “The Jekyll-Hydes of our common life are ethnic hybrids.” Said Jacobson, “Aside from the effects of environment, it may safely be assumed that when two strains of blood will not mix well a kind of ‘molecular insult’ occurs which the biologists may some day be able to detect beforehand, just as blood is now tested and matched for transfusion.” The implied conclusion to this assertion is that “degenerates” can be identified through genetic testing and “weeded out” by sterilizing the unfit – something that was imposed on many women during the period and endorsed by powerful individuals, including Oliver Wendell Holmes.

It is greatly to Price’s credit that he objected to this arrogant point of view: “Most current interpretations are fatalistic and leave practically no escape from our succession of modern physical, mental and moral cripples. . . If our modern degeneration were largely the result of incompatible racial stocks as indicated by these premises, the outlook would be gloomy in the extreme.”36 Price argued that nutritional deficiencies affecting the physical structure of the body can also affect the brain and nervous system; and that while “bad” character may be the result of many influences–poverty, upbringing, displacement, etc.–good nutrition also plays a role in creating a society of cheerful, compassionate individuals.36

Rebirth of a Nation:
The Making of Modern America, 1877-1920
By Jackson Lears
pp. 7-9

By the late nineteenth century, dreams of rebirth were acquiring new meanings. Republican moralists going back to Jefferson’s time had long fretted about “overcivilization,” but the word took on sharper meaning among the middle and upper classes in the later decades of the nineteenth century. During the postwar decades, “overcivilization” became not merely a social but an individual condition, with a psychiatric diagnosis. In American Nervousness (1880), the neurologist George Miller Beard identified “neurasthenia,” or “lack of nerve force,” as the disease of the age. Neurasthenia encompassed a bewildering variety of symptoms (dyspepsia, insomnia, nocturnal emissions, tooth decay, “fear of responsibility, of open places or closed places, fear of society, fear of being alone, fear of fears, fear of contamination, fear of everything, deficient mental control, lack of decision in trifling matters, hopelessness”), but they all pointed to a single overriding effect: a paralysis of the will.

The malady identified by Beard was an extreme version of a broader cultural malaise—a growing sense that the Protestant ethic of disciplined achievement had reached the end of its tether, had become entangled in the structures of an increasingly organized capitalist society. Ralph Waldo Emerson unwittingly predicted the fin de siècle situation. “Every spirit makes its house,” he wrote in “Fate” (1851), “but afterwards the house confines the spirit.” The statement presciently summarized the history of nineteenth-century industrial capitalism, on both sides of the Atlantic.

By 1904, the German sociologist Max Weber could put Emerson’s proposition more precisely. The Protestant ethic of disciplined work for godly ends had created an “iron cage” of organizations dedicated to the mass production and distribution of worldly goods, Weber argued. The individual striver was caught in a trap of his own making. The movement from farm to factory and office, and from physical labor outdoors to sedentary work indoors, meant that more Europeans and North Americans were insulated from primary processes of making and growing. They were also caught up in subtle cultural changes—the softening of Protestantism into platitudes; the growing suspicion that familiar moral prescriptions had become mere desiccated, arbitrary social conventions. With the decline of Christianity, the German philosopher Friedrich Nietzsche wrote, “it will seem for a time as though all things had become weightless.”

Alarmists saw these tendencies as symptoms of moral degeneration. But a more common reaction was a diffuse but powerful feeling among the middle and upper classes—a sense that they had somehow lost contact with the palpitating actuality of “real life.” The phrase acquired unprecedented emotional freight during the years around the turn of the century, when reality became something to be pursued rather than simply experienced. This was another key moment in the history of longing, a swerve toward the secular. Longings for this-worldly regeneration intensified when people with Protestant habits of mind (if not Protestant beliefs) confronted a novel cultural situation: a sense that their way of life was being stifled by its own success.

On both sides of the Atlantic, the drive to recapture “real life” took myriad cultural forms. It animated popular psychotherapy and municipal reform as well as avant-garde art and literature, but its chief institutional expression was regeneration through military force. As J. A. Hobson observed in Imperialism (1902), the vicarious identification with war energized jingoism and militarism. By the early twentieth century, in many minds, war (or the fantasy of it) had become the way to keep men morally and physically fit. The rise of total war between the Civil War and World War I was rooted in longings for release from bourgeois normality into a realm of heroic struggle. This was the desperate anxiety, the yearning for rebirth, that lay behind official ideologies of romantic nationalism, imperial progress, and civilizing mission—and that led to the trenches of the Western Front.

Americans were immersed in this turmoil in peculiarly American ways. As the historian Richard Slotkin has brilliantly shown, since the early colonial era a faith in regeneration through violence underlay the mythos of the American frontier. With the closing of the frontier (announced by the U.S. census in 1890), violence turned outward, toward empire. But there was more going on than the refashioning of frontier mythology. American longings for renewal continued to be shaped by persistent evangelical traditions, and overshadowed by the shattering experience of the Civil War. American seekers merged Protestant dreams of spiritual rebirth with secular projects of purification—cleansing the body politic of secessionist treason during the war and political corruption afterward, reasserting elite power against restive farmers and workers, taming capital in the name of the public good, reviving individual and national vitality by banning the use of alcohol, granting women the right to vote, disenfranchising African-Americans, restricting the flow of immigrants, and acquiring an overseas empire.

Of course not all these goals were compatible. Advocates of various versions of rebirth—bodybuilders and Prohibitionists, Populists and Progressives, Social Christians and Imperialists—all laid claims to legitimacy. Their crusades met various ends, but overall they relieved the disease of the fin de siècle by injecting some visceral vitality into a modern culture that had seemed brittle and about to collapse. Yearning for intense experience, many seekers celebrated Force and Energy as ends in themselves. Such celebrations could reinforce militarist fantasies but could also lead in more interesting directions—toward new pathways in literature and the arts and sciences. Knowledge could be revitalized, too. William James, as well as Houdini and Roosevelt, was a symbol of the age.

The most popular forms of regeneration had a moral dimension.

pp. 27-29

But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:

“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”

This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.

This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.

pp. 63-71

NOT MANY AMERICAN men, even among the comparatively prosperous classes, were as able as Carnegie and Rockefeller to master the tensions at the core of their culture. Success manuals acknowledged the persistent problem of indiscipline, the need to channel passion to productive ends. Often the language of advice literature was sexually charged. In The Imperial Highway (1881), Jerome Bates advised:

[K]eep cool, have your resources well in hand, and reserve your strength until the proper time arrives to exert it. There is hardly any trait of character or faculty of intellect more valuable than the power of self-possession, or presence of mind. The man who is always “going off” unexpectedly, like an old rusty firearm, who is easily fluttered and discomposed at the appearance of some unforeseen emergency; who has no control over himself or his powers, is just the one who is always in trouble and is never successful or happy.

The assumptions behind this language are fascinating and important to an understanding of middle-and upper-class Americans in the Gilded Age. Like many other purveyors of conventional wisdom—ministers, physicians, journalists, health reformers—authors of self-help books assumed a psychic economy of scarcity. For men, this broad consensus of popular psychology had sexual implications: the scarce resource in question was seminal fluid, and one had best not be diddling it away in masturbation or even nocturnal emissions. This was easier said than done, of course, as Bates indicated, since men were constantly addled by insatiable urges, always on the verge of losing self-control—the struggle to keep it was an endless battle with one’s own darker self. Spiritual, psychic, and physical health converged. What Freud called “‘civilized’ sexual morality” fed directly into the “precious bodily fluids” school of health management. The man who was always “‘going off’ unexpectedly, like an old rusty firearm,” would probably be sickly as well as unsuccessful—sallow, sunken-chested, afflicted by languorous indecision (which was how Victorian health literature depicted the typical victim of what was called “self-abuse”).

But as this profile of the chronic masturbator suggests, scarcity psychology had implications beyond familiar admonitions to sexual restraint. Sexual scarcity was part of a broader psychology of scarcity; the need to conserve semen was only the most insistently physical part of a much more capacious need to conserve psychic energy. As Bates advised, the cultivation of “self-possession” allowed you to “keep your resources well in hand, and reserve your strength until the proper time arrives to exert it.” The implication was that there was only so much strength available to meet demanding circumstances and achieve success in life. The rhetoric of “self-possession” had financial as well as sexual connotations. To preserve a cool, unruffled presence of mind (to emulate Rockefeller, in effect) was one way to stay afloat on the storm surges of the business cycle.

The object of this exercise, at least for men, was personal autonomy—the ownership of one’s self. […]

It was one thing to lament excessive wants among the working class, who were supposed to be cultivating contentment with their lot, and quite another to find the same fault among the middle class, who were supposed to be improving themselves. The critique of middle-class desire posed potentially subversive questions about the dynamic of dissatisfaction at the core of market culture, about the very possibility of sustaining a stable sense of self in a society given over to perpetual jostling for personal advantage. The ruinous results of status-striving led advocates of economic thrift to advocate psychic thrift as well.

By the 1880s, the need to conserve scarce psychic resources was a commonly voiced priority among the educated and affluent. Beard’s American Nervousness had identified “the chief and primary cause” of neurasthenia as “modern civilization,” which placed unprecedented demands on limited emotional energy. “Neurasthenia” and “nervous prostration” became catchall terms for a constellation of symptoms that today would be characterized as signs of chronic depression—anxiety, irritability, nameless fears, listlessness, loss of will. In a Protestant culture, where effective exercise of will was the key to individual selfhood, the neurasthenic was a kind of anti-self—at best a walking shadow, at worst a bedridden invalid unable to make the most trivial choices or decisions. Beard and his colleagues—neurologists, psychiatrists, and self-help writers in the popular press—all agreed that nervous prostration was the price of progress, a signal that the psychic circuitry of “brain workers” was overloaded by the demands of “modern civilization.”

While some diagnoses of this disease deployed electrical metaphors, the more common idiom was economic. Popular psychology, like popular economics, was based on assumptions of scarcity: there was only so much emotional energy (and only so much money) to go around. The most prudent strategy was the husbanding of one’s resources as a hedge against bankruptcy and breakdown. […]

Being reborn through a self-allowed regime of lassitude was idiosyncratic, though important as a limiting case. Few Americans had the leisure or the inclination to engage in this kind of Wordsworthian retreat. Most considered neurasthenia at best a temporary respite, at worst an ordeal. They strained, if ambivalently, to be back in harness.

The manic-depressive psychology of the business class mimicked the lurching ups and downs of the business cycle. In both cases, assumptions of scarcity underwrote a pervasive defensiveness, a circle-the-wagons mentality. This was the attitude that lay behind the “rest cure” devised by the psychiatrist Silas Weir Mitchell, who proposed to “fatten” and “redden” the (usually female) patient by isolating her from all mental and social stimulation. (This nearly drove the writer Charlotte Perkins Gilman crazy, and inspired her story “The Yellow Wallpaper.”) It was also the attitude that lay behind the fiscal conservatism of the “sound-money men” on Wall Street and in Washington—the bankers and bondholders who wanted to restrict the money supply by tying it to the gold standard. Among the middle and upper classes, psyche and economy alike were haunted by the common specter of scarcity. But there were many Americans for whom scarcity was a more palpable threat.

AT THE BOTTOM of the heap were the urban poor. To middle-class observers they seemed little more than a squalid mass jammed into tenements that were festering hives of “relapsing fever,” a strange malady that left its survivors depleted of strength and unable to work. The disease was “the most efficient recruiting officer pauperism ever had,” said a journalist investigating tenement life in the 1870s. Studies of “the nether side of New York” had been appearing for decades, but—in the young United States at least—never before the Gilded Age had the story of Dives and Lazarus been so dramatically played out, never before had wealth been so flagrant, or poverty been so widespread and so unavoidably appalling. The army of thin young “sewing-girls” trooping off in the icy dawn to sweatshops all over Manhattan, the legions of skilled mechanics forced by high New York rents to huddle with their families amid a crowd of lowlifes, left without even a pretense of privacy in noisome tenements that made a mockery of the Victorian cult of home—these populations began to weigh on the bourgeois imagination, creating concrete images of the worthy, working poor.

pp. 99-110

Racial animosities flared in an atmosphere of multicultural fluidity, economic scarcity, and sexual rivalry. Attitudes arising from visceral hostility acquired a veneer of scientific objectivity. Race theory was nothing new, but in the late nineteenth century it mutated into multiple forms, many of them characterized by manic urgency, sexual hysteria, and biological determinism. Taxonomists had been trying to arrange various peoples in accordance with skull shape and brain size for decades; popularized notions of natural selection accelerated the taxonomic project, investing it more deeply in anatomical details. The superiority of the Anglo-Saxon—according to John Fiske, the leading pop-evolutionary thinker—arose not only from the huge size of his brain, but also from the depth of its furrows and the plenitude of its creases. The most exalted mental events had humble somatic origins. Mind was embedded in body, and both could be passed on to the next generation.

The year 1877 marked a crucial development in this hereditarian synthesis: in that year, Richard Dugdale published the results of his investigation into the Juke family, a dull-witted crew that had produced more than its share of criminals and mental defectives. While he allowed for the influence of environment, Dugdale emphasized the importance of inherited traits in the Juke family. If mental and emotional traits could be inherited along with physical ones, then why couldn’t superior people be bred like superior dogs or horses? The dream of creating a science of eugenics, dedicated to improving and eventually even perfecting human beings, fired the reform imagination for decades. Eugenics was a kind of secular millennialism, a vision of a society where biological engineering complemented social engineering to create a managerial utopia. The intellectual respectability of eugenics, which lasted until the 1930s, when it became associated with Nazism, underscores the centrality of racialist thinking among Americans who considered themselves enlightened and progressive. Here as elsewhere, racism and modernity were twinned.

Consciousness of race increasingly pervaded American culture in the Gilded Age. Even a worldview as supple as Henry James’s revealed its moorings in conventional racial categories when, in The American (1877), James presented his protagonist, Christopher Newman, as a quintessential Anglo-Saxon but with echoes of the noble Red Man, with the same classical posture and physiognomy. There was an emerging kinship between these two groups of claimants to the title “first Americans.” The iconic American, from this view, was a blend of Anglo-Saxon refinement and native vigor. While James only hints at this, in less than a generation such younger novelists as Frank Norris and Jack London would openly celebrate the rude vitality of the contemporary Anglo-Saxon, proud descendant of the “white savages” who subdued a continent. It should come as no surprise that their heroes were always emphatically male. The rhetoric of race merged with a broader agenda of masculine revitalization.[…]

By the 1880s, muscular Christians were sweeping across the land, seeking to meld spiritual and physical renewal, establishing institutions like the Young Men’s Christian Association. The YMCA provided prayer meetings and Bible study to earnest young men with spiritual seekers’ yearnings, gyms and swimming pools to pasty young men with office workers’ midriffs. Sometimes they were the same young men. More than any other organization, the YMCA aimed to promote the symmetry of character embodied in the phrase “body, mind, spirit”—which a Y executive named Luther Gulick plucked from Deuteronomy and made the motto of the organization. The key to the Y’s appeal, a Harper’s contributor wrote in 1882, was the “overmastering conviction” of its members: “The world always respects manliness, even when it is not convinced [by theological argument]; and if the organizations did not sponsor that quality in young men, they would be entitled to no respect.” In the YMCA, manliness was officially joined to a larger agenda.

For many American Protestants, the pursuit of physical fitness merged with an encompassing vision of moral and cultural revitalization—one based on the reassertion of Protestant self-control against the threats posed to it by immigrant masses and mass-marketed temptation. […]

Science and religion seemed to point in the same direction: Progress and Providence were one.

Yet the synthesis remained precarious. Physical prowess, the basis of national supremacy, could not be taken for granted. Strong acknowledged in passing that Anglo-Saxons could be “devitalized by alcohol and tobacco.” Racial superiority could be undone by degenerate habits. Even the most triumphalist tracts contained an undercurrent of anxiety, rooted in the fear of flab. The new stress on the physical basis of identity began subtly to undermine the Protestant synthesis, to reinforce the suspicion that religion was a refuge for effeminate weaklings. The question inevitably arose, in some men’s minds: What if the YMCA and muscular Christianity were not enough to revitalize tired businessmen and college boys?

Under pressure from proliferating ideas of racial “fitness,” models of manhood became more secular. Despite the efforts of muscular Christians to reunite body and soul, the ideal man emerging among all classes by the 1890s was tougher and less introspective than his mid-Victorian predecessors. He was also less religious. Among advocates of revitalization, words like “Energy” and “Force” began to dominate discussion—often capitalized, often uncoupled from any larger frameworks of moral or spiritual meaning, and often combined with racist assumptions. […]

The emerging worship of force raised disturbing issues. Conventional morality took a backseat to the celebration of savage strength. After 1900, in the work of a pop-Nietzschean like Jack London, even criminality became a sign of racial vitality: as one of his characters says, “We whites have been land-robbers and sea-robbers from remotest time. It is in our blood, I guess, and we can’t get away from it.” This reversal of norms did not directly challenge racial hierarchies, but the assumptions behind it led toward disturbing questions. If physical prowess was the mark of racial superiority, what was one to make of the magnificent specimens of manhood produced by allegedly inferior races? Could it be that desk-bound Anglo-Saxons required an infusion of barbarian blood (or at least the “barbarian virtues” recommended by Theodore Roosevelt)? Behind these questions lay a primitivist model of regeneration, to be accomplished by incorporating the vitality of the vanquished, dark-skinned other. The question was how to do that and maintain racial purity.

pp. 135-138

Yet to emphasize the gap between country and the city was not simply an evasive exercise: dreams of bucolic stillness or urban energy stemmed from motives more complex than mere escapist sentiment. City and country were mother lodes of metaphor, sources for making sense of the urban-industrial revolution that was transforming the American countryside and creating a deep sense of discontinuity in many Americans’ lives during the decades after the Civil War. If the city epitomized the attraction of the future, the country embodied the pull of the past. For all those who had moved to town in search of excitement or opportunity, rural life was ineluctably associated with childhood and memory. The contrast between country and city was about personal experience as well as political economy. […]

REVERENCE FOR THE man of the soil was rooted in the republican tradition. In his Notes on the State of Virginia (1785), Jefferson articulated the antithesis that became central to agrarian politics (and to the producerist worldview in general)—the contrast between rural producers and urban parasites. “Those who labour in the earth are the chosen people of God, if ever he had a chosen people, whose breasts he has made his peculiar deposit for substantial and genuine virtue,” he announced. “Corruption of morals in the mass of cultivators is a phenomenon of which no age nor nation has furnished an example. It is the mark set on those, who not looking up to heaven, to their own soil and industry, as does the husbandman, for their subsistence, depend for it on the casualties and caprice of customers. Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the design of ambition.” Small wonder, from this view, that urban centers of commerce seemed to menace the public good. “The mobs of great cities,” Jefferson concluded, “add just so much to the support of pure government as sores do to the strength of the human body.” Jefferson’s invidious distinctions echoed through the nineteenth century, fueling the moral passion of agrarian rebels. Watson, among many, considered himself a Jeffersonian.

There were fundamental contradictions embedded in Jefferson’s conceptions of an independent yeomanry. Outside certain remote areas in New England, most American farmers were not self-sufficient in the nineteenth century—nor did they want to be. Many were eager participants in the agricultural market economy, animated by a restless, entrepreneurial spirit. Indeed, Jefferson’s own expansionist policies, especially the Louisiana Purchase, encouraged centrifugal movement as much as permanent settlement. “What developed in America,” the historian Richard Hofstadter wrote, “was an agricultural society whose real attachment was not to the land but to land values.” The figure of the independent yeoman, furnishing enough food for himself and his family, participating in the public life of a secure community—this icon embodied longings for stability amid a maelstrom of migration.

Often the longings were tinged with a melancholy sense of loss. […] For those with Jeffersonian sympathies, abandoned farms were disturbing evidence of cultural decline. As a North American Review contributor wrote in 1888: “Once let the human race be cut off from personal contact with the soil, once let the conventionalities and artificial restrictions of so-called civilization interfere with the healthful simplicity of nature, and decay is certain.” Romantic nature-worship had flourished fitfully among intellectuals since Emerson had become a transparent eye-ball on the Concord common and Whitman had loafed among leaves of grass. By the post–Civil War decades, romantic sentiment combined with republican tradition to foster forebodings. Migration from country to city, from this view, was a symptom of disease in the body politic. Yet the migration continued. Indeed, nostalgia for rural roots was itself a product of rootlessness. A restless spirit, born of necessity and desire, spun Americans off in many directions—but mainly westward. The vision of a stable yeomanry was undercut by the prevalence of the westering pioneer.

pp. 246-247

Whether energy came from within or without, it was as limitless as electricity apparently was. The obstacles to access were not material—class barriers or economic deprivation were never mentioned by devotees of abundance psychology—they were mental and emotional. The most debilitating emotion was fear, which cropped up constantly as the core problem in diagnoses of neurasthenia. The preoccupation with freeing oneself from internal constraints undermined the older, static ideal of economic self-control at its psychological base. As one observer noted in 1902: “The root cause of thrift, which we all admire and preach because it is so convenient to the community, is fear, fear of future want; and that fear, we are convinced, when indulged overmuch by pessimist minds is the most frequent cause of miserliness….” Freedom from fear meant freedom to consume.

And consumption began at the dinner table. Woods Hutchinson claimed in 1913 that the new enthusiasm for calories was entirely appropriate to a mobile, democratic society. The old “stagnation” theory of diet merely sought to maintain the level of health and vigor; it was a diet for slaves or serfs, for people who were not supposed to rise above their station. “The new diet theory is based on the idea of progress, of continuous improvement, of never resting satisfied with things as they are,” Hutchinson wrote. “No diet is too liberal or expensive that will…yield good returns on the investment.” Economic metaphors for health began to focus on growth and process rather than stability, on consumption and investment rather than savings.

As abundance psychology spread, a new atmosphere of dynamism enveloped old prescriptions for success. After the turn of the century, money was less often seen as an inert commodity, to be gradually accumulated and tended to steady growth; and more often seen as a fluid and dynamic force. To Americans enraptured by the strenuous life, energy became an end itself—and money was a kind of energy. Success mythology reflected this subtle change. In the magazine hagiographies of business titans—as well as in the fiction of writers like Dreiser and Norris—the key to success frequently became a mastery of Force (as those novelists always capitalized it), of raw power. Norris’s The Pit (1903) was a paean to the furious economic energies concentrated in Chicago. “It was Empire, the restless subjugation of all this central world of the lakes and prairies. Here, mid-most in the land, beat the Heart of the nation, whence inevitably must come its immeasurable power, its infinite, inexhaustible vitality. Here of all her cities, throbbed the true life—the true power and spirit of America: gigantic, crude, with the crudity of youth, disdaining rivalry; sane and healthy and vigorous; brutal in its ambition, arrogant in the new-found knowledge of its giant strength, prodigal of its wealth, infinite in its desires.” This was the vitalist vision at its most breathless and jejune, the literary equivalent of Theodore Roosevelt’s adolescent antics.

The new emphasis on capital as Force translated the psychology of abundance into economic terms. The economist who did the most to popularize this translation was Simon Nelson Patten, whose The New Basis of Civilization (1907) argued that the United States had passed from an “era of scarcity” to an “era of abundance” characterized by the unprecedented availability of mass-produced goods. His argument was based on the confident assumption that human beings had learned to control the weather. “The Secretary of Agriculture recently declared that serious crop failures will occur no more,” Patten wrote. “Stable, progressive farming controls the terror, disorder, and devastation of earlier times. A new agriculture means a new civilization.” Visions of perpetual growth were in the air, promising both stability and dynamism.

The economist Edward Atkinson pointed the way to a new synthesis with a hymn to “mental energy” in the Popular Science Monthly. Like other forms of energy, it was limitless. “If…there is no conceivable limit to the power of mind over matter or to the number of conversions of force that can be developed,” he wrote, “it follows that pauperism is due to want of mental energy, not of material resources.” Redistribution of wealth was not on the agenda; positive thinking was.

pp. 282-283

TR’s policies were primarily designed to protect American corporations’ access to raw materials, investment opportunities, and sometimes markets. The timing was appropriate. In the wake of the merger wave of 1897–1903, Wall Street generated new pools of capital, while Washington provided new places to invest it. Speculative excitement seized many among the middle and upper classes who began buying stocks for the first time. Prosperity spread even among the working classes, leading Simon Nelson Patten to detect a seismic shift from an era of scarcity to an era of abundance. For him, a well-paid working population committed to ever-expanding consumption would create what he called The New Basis of Civilization (1907).

Patten understood that the mountains of newly available goods were in part the spoils of empire, but he dissolved imperial power relations in a rhetoric of technological determinism. The new abundance, he argued, depended not only on the conquest of weather but also on the annihilation of time and space—a fast, efficient distribution system that provided Americans with the most varied diet in the world, transforming what had once been luxuries into staples of even the working man’s diet. “Rapid distribution of food carries civilization with it, and the prosperity that gives us a Panama canal with which to reach untouched tropic riches is a distinctive laborer’s resource, ranking with refrigerated express and quick freight carriage.” The specific moves that led to the seizure of the Canal Zone evaporated in the abstract “prosperity that gives us a Panama Canal,” which in turn became as much a boon to the workingman as innovative transportation. Empire was everywhere, in Patten’s formulation, and yet nowhere in sight.

What Patten implied (rather than stated overtly) was that imperialism underwrote expanding mass consumption, raising standards of living for ordinary folk. “Tropic riches” became cheap foods for the masses. The once-exotic banana was now sold from pushcarts for 6 cents a dozen, “a permanent addition to the laborer’s fund of goods.” The same was true of “sugar, which years ago was too expensive to be lavishly consumed by the well-to-do,” but “now freely gives its heat to the workingman,” as Patten wrote. “The demand that will follow the developing taste for it can be met by the vast quantities latent in Porto Rico and Cuba, and beyond them by the teeming lands of South America, and beyond them by the virgin tropics of another hemisphere.” From this view, the relation between empire and consumption was reciprocal: if imperial policies helped stimulate consumer demand, consumer demand in turn promoted imperial expansion. A society committed to ever-higher levels of mass-produced abundance required empire to be a way of life.

High vs Low Protein

P. D. Mangan tweeted a quote from a research paper, Reversal of epigenetic aging and immunosenescent trends in humans by Gregory M. Fahy et al. He stated that the “Most important sentence in aging reversal study” is the following: “Human longevity seems more consistently linked to insulin sensitivity than to IGF‐1 levels, and the effects of IGF‐1 on human longevity are confounded by its inverse proportionality to insulin sensitivity.” Mangan added that “This line agrees with what I wrote a while back” (How Carbohydrates and Not Protein Promote Aging); and in the comments section of that article, someone pointed to a supporting video by Dr. Benjamin Bikman (‘Insulin vs. Glucagon: The relevance of dietary protein’). Here is the context of the entire paragraph from the discussion section of the research paper:

“In this regard, it must be pointed out that GH and IGF‐1 can also have pro‐aging effects and that most gerontologists therefore favor reducing rather than increasing the levels of these factors (Longo et al., 2015). However, most past studies of aging and GH/IGF‐1 are confounded by the use of mutations that affect the developmental programming of aging, which is not necessarily relevant to nonmutant adults. For example, such mutations in mice alter the normal innervation of the hypothalamus during brain development and prevent the hypothalamic inflammation in the adult (Sadagurski et al., 2015). Hypothalamic inflammation may program adult body‐wide aging in nonmutants (Zhang et al., 2017), but it seems unlikely that lowering IGF‐1 in normal non‐mutant adults can provide the same protection. A second problem with past studies is a general failure to uncouple GH/IGF‐1 signaling from lifelong changes in insulin signaling. Human longevity seems more consistently linked to insulin sensitivity than to IGF‐1 levels, and the effects of IGF‐1 on human longevity are confounded by its inverse proportionality to insulin sensitivity (Vitale, Pellegrino, Vollery, & Hofland, 2019). We therefore believe our approach of increasing GH/IGF‐1 for a limited time in the more natural context of elevated DHEA while maximizing insulin sensitivity is justified, particularly in view of the positive role of GH and IGF‐1 in immune maintenance, the role of immune maintenance in the retardation of aging (Fabris et al., 1988), and our present results.”

In the Twitter thread, Командир Гиперкуба said, “So it is insulin [in]sensitivity than drives ageing rather than IGF‐1/GH. Huge if true.” And GuruAnaerobic added that, “I assume this isn’t IR per se, but IR in the presence of carbohydrate/excess food. IOW, the driver is environment.” Mangan then went on to point out that, “It explains the dichotomy of growth vs longevity, and why calorie restriction increases lifespan.” Mick Keith asked, “So drop carbs and sugar?go paleo style?” And Mangan answered, “There are other aspects to insulin sensitivity, but yes.” All of this cuts to the heart of a major issue in the low-carb community, an issue that I only partly and imperfectly understand. What I do get is that this has to do with the conclusions various experts come to about protein, whether higher amounts are fine or intake should be very limited. Some see insulin sensitivity as key while others prioritize IGF-1. The confounding between the two requires careful untangling. In the comments section of Mangan’s above linked article, Rob H. summed it up well:

“Great post, very timely too as I believe this is an issue that seems to be polarising the science-based nutrition space at the moment. Personally I fall down on the same side as you Dennis – as per Ben Bikman’s video which has also been posted here, as well as the views of all the main protein researchers including Stuart Philips, Jose Antonio, Donald Layman, Gabrielle Lyon, Ted Naiman, Chris Masterjohn etc who all believe the science clearly supports a high protein intake eg 1.6 -2.2g/kilo of bodyweight – with no upper limit which has yet been observed. At the same time, I have just been reading the new book by Dr Steven Gundry ‘The Longevity Paradox’. Has anyone read this one yet? Whilst about 90% of the content is fairly solid stuff (although nothing that hasn’t already been written about here) he aggressively supports Longo’s view that we should only consume 0.37g protein/ kilo of bodyweight, eg around 25g of protein/ day for most males. Also that animal protein should be avoided wherever possible. Personally I consume double that amount of protein at each meal! It appears that Longo, Gundry, Dr Ron Rosedale and Dr Mercola are all aligned in a very anti-animal protein stance, but also believe their view is backed by science – although the science quoted in Gundry’s book seems to be largely based on epidemiology. Both sides can’t be right here, so I hope more research is done in this field to shut this debate down – personally I feel that advising ageing males to consume only 25g of protein a day is extremely irresponsible.”

In response, Mangan wrote, “I agree that is irresponsible. Recently Jason Fung and James DiNicolantonio jumped on the anti animal protein bandwagon. My article above is my attempt (successful, I hope) to show why that’s wrong.” Following that, Rob added, “Humans have been consuming animal proteins for most or all of our evolutionary history. And certainly, large quantities of animal protein were consumed at times (as when a kill of a large animal was made). So, I cannot imagine that the “evidence” supporting an anti-animal protein stance can be solid or even science-based. This sounds like a case of certain researchers trying their best to find support for their pre-determined dietary beliefs (vegan proponents do this all the time). I’m not buying it.” It’s very much an ongoing debate.
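To make the size of that disagreement concrete, here is what the two recommendations quoted above work out to for a hypothetical 70 kg (roughly 154 lb) man; the body weight is just an illustrative assumption and the arithmetic is mine, not Rob H.’s:

\[
0.37\ \text{g/kg} \times 70\ \text{kg} \approx 26\ \text{g/day}
\qquad \text{vs.} \qquad
(1.6\ \text{to}\ 2.2)\ \text{g/kg} \times 70\ \text{kg} = 112\ \text{to}\ 154\ \text{g/day}
\]

So the high-protein camp is recommending roughly four to six times as much protein as the Longo/Gundry camp, which is why the two positions are so hard to reconcile.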

I have suspicions about the point of confusion that originated this disagreement. Fear of promoting too much growth through protein is basically the old Galenic argument based on humoral physiology. The belief was that too much meat, as a stimulating/nurturing substance, built up the ‘blood’ with too much heat and dryness, which would burn up the body and shorten the lifespan. This culturally inherited bias about meat has since been dressed up in scientific language. But ancient philosophy is not the best source for formulating modern scientific theory. Let me bring this back to insulin sensitivity and insulin resistance, which appear to play the determining role. Insulin is a hormone, and so we must understand this from an endocrinological approach, quite different from the Galenic-style fears about meat that were filtered through the Christian theology of the Middle Ages.

Hormones are part of a complex hormonal system going far beyond macronutrients in the diet, although it does appear that the macronutrient profile is a major factor. Harry Serpano, in a discussion with Bart Kay, said that: “In a low insulin state, when you’re heavy meat and fat and your insulin is at 1.3, as Dr. Paul Mangan has actually shown in one of his videos, it’s quite clear; and in what I’m showing in one of the studies, it’s quite clear. It’s so close to basically fasting which is 0.8 — it’s very low. You’re not going to be pushing up these growth pathways like mTOR or IGF-1 in any significant way.” Like with so much else, there is strong evidence that what we need to be worrying about is insulin, specifically on a high-carb diet that causes insulin resistance and metabolic syndrome. That is what is guaranteed to severely decrease longevity.

This question about too much protein recently came up in my own thoughts while reading Dr. Steven Gundry’s new book, The Longevity Paradox. As mentioned above, he makes a case against too much animal protein. But it sounds like there is more to be considered regarding the effects on health, growth, and longevity. In a dialogue with Gundry, Dr. Paul Saladino defended meat consumption (Gundry’s Plant Paradox and Saladino’s Carnivory). What Mangan has added to this debate strengthens that position.

* * *

In one of the above quoted comments, Rob H. mentions that Dr. Joseph Mercola is one of those who are “aligned in a very anti-animal protein stance, but also believe their view is backed by science.” It’s interesting, then, that I’m just now listening to a discussion between Mercola and Siim Land. They met at a conference and got to talking. Mercola then read Land’s book, Metabolic Autophagy. Land is more in the camp supporting the value of protein. His view is nuanced, though, and the debate isn’t entirely polarized. The role protein plays in health depends on the health outcomes being sought and the conditions under which protein is being eaten: amounts, regularity of meals, assimilation, etc. It’s about how one’s body is able to use protein and to what end.

Right at the beginning of their talk, Mercola states that he is impressed by Land’s knowledge and persuaded by his view on protein. Land makes the simple point that one doesn’t want to be in autophagy all the time but to cycle between periods of growth and periods of autophagy. Too much protein restriction, especially all the time, is not a good thing. Mercola seems to have come around to this view. So, it’s a shifting debate. There is a lot of research, and new studies are coming out all the time. But obviously, context is important in making any statement about protein in the diet. Maybe Saladino will similarly bring Gundry on board with greater protein intake being a good thing for certain purposes, or maybe they’ll meet at a middle ground. These dialogues are helpful, in particular for an outsider like me who is listening in.

* * *

On a personal note, I’m not sure I take a strong position either way. But I’ve long been persuaded by Siim Land’s view. It feels more moderate and balanced. The opposite side can sound too fear-mongering about protein, not seeming to allow for differences in contexts and conditions. From a low-carb perspective, one has to replace carbs with something, and that means either protein or fat, and one can only consume so much fat. Besides, proteins really are important for anabolism and activating mTOR, for building up the body. Maybe if you’re trying to lose weight or simply maintaining where you’re at, with no concern for healing or developing muscle, then protein would play less of a role. I don’t know.

Traditional societies don’t seem to worry about protein amounts. When they have access to it, they eat it, at times even to the point of their bellies distending. And when not, they don’t. Those populations with greater access don’t appear to suffer any harm from greater protein intake. Then again, these traditional societies tend to do a lot of strenuous physical activity. They also usually mix it up with regular fasting, intermittent and extended. I’m not sure how optimal protein levels may differ depending on lifestyle. Still, I’d think that the same basic biological truths would apply to all populations. For most people in most situations, increased protein will be helpful at least some of the time and maybe most of the time. Other than fasting, I’m not sure why one needs to worry about it. And with fasting, protein restriction happens naturally.

So, maybe eat protein to satiation. Then throw in some fasting. You’ll probably be fine. There doesn’t seem to be anything to be overly concerned about, based on what evidence I’ve seen so far.

Dr. Catherine Shanahan On Dietary Epigenetics and Mutations

Dr. Catherine Shanahan is a board-certified family physician with an undergraduate degree in biology, along with training in biochemistry and genetics. She has also studied ethno-botany, culinary traditions, and ancestral health. Besides regularly appearing in and writing for national media, she has worked as director and nutrition consultant for the Los Angeles Lakers. On High Intensity Health, she was interviewed by nutritionist Mike Mutzel (Fat Adapted Athletes Perform Better). At the 31:55 mark in that video, she discussed diet (in particular, industrial vegetable oils or simply seed oils), epigenetic inheritance, de novo genetic mutations, and autism. This can be found in the show notes (#172) where it is stated that,

“In 1909 we consumed 1/3 of an ounce of soy oil per year. Now we consume about 22 pounds per year. In the amounts that we consume seed oils, it breaks down into some of the worst toxins ever discovered. They are also capable of damaging our DNA. Many diseases are due to mutations that children have that their parents did not have. This means that mothers and fathers with poor diets have eggs/sperm that have mutated DNA. Children with autism have 10 times the number of usual mutations in their genes. Getting off of seed oils is one of the most impactful things prospective parents can do. The sperm has more mutations than the egg.”
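To put those two numbers side by side, here is a quick unit conversion; the arithmetic is my own back-of-the-envelope check, taking the quoted figures at face value:

\[
22\ \text{lb/yr} \times 16\ \text{oz/lb} = 352\ \text{oz/yr},
\qquad
\frac{352\ \text{oz/yr}}{1/3\ \text{oz/yr}} \approx 1{,}056
\]

In other words, per capita soy oil consumption rose roughly a thousandfold over the course of a century, if the show notes’ figures are accurate.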

These seed oils didn’t exist in the human diet until the industrial era. Our bodies are designed to use and incorporate PUFAs from natural sources, but processing them into oils with high pressure and chemicals denatures the structure of the oil and destroys the antioxidants. The oxidative stress that follows from adding them to the diet comes precisely from these altered oils acting as Trojan horses, treated by the body like natural fats. This is magnified by a general increase of PUFAs, specifically omega-6 fatty acids, with a simultaneous decrease of omega-3 fatty acids and saturated fats. It isn’t a matter of overall fat intake, as the roughly 40% of the diet we now get as fat is about the same as at the beginning of the last century. What is different is these oxidized PUFAs combined with massive loads of sugar and starch like never before.

Dr. Shanahan sees these industrial plant oils as the single greatest harm, such that she doesn’t consider them to be a food but a toxin, originally discovered as an industrial byproduct. She is less worried about any given category of food or macronutrient, as long as you first and foremost remove this specific source of toxins.** She goes into greater detail in a talk from Ancestry Foundation (AHS16 – Cate Shanahan – Bad Diet, Bad DNA?). And her book, Deep Nutrition, is a great resource on this topic. I’ll leave that for you to further explore, if you so desire. Let me quickly and simply note an implication of this.

Genetic mutations demonstrate how serious a situation this is. The harm we are causing ourselves might go beyond mere punishment for our personal sins; the sins of the father and mother may be genetically passed on to their children, grandchildren, and further on (one generation of starvation or smoking among grandparents leads to generations of smaller birth weight and underdevelopment among the grandchildren and maybe beyond, even if the intervening generation of parents was healthy).

It might not be limited to a temporary transgenerational harm as seen with epigenetics. This could be permanent harm to our entire civilization, fundamentally altering our collective gene pool. We could recover from epigenetic damage within a few generations, assuming we took the problem seriously and acted immediately (Dietary Health Across Generations), but with genetic mutations we may never be able to undo the damage. These mutations have been accumulating and will continue to accumulate until we return to an ancestral diet of healthy foods as part of an overall healthy lifestyle and environment. Even mutations, though, can be moderated by epigenetics, as the body is designed to deal with them.

This further undermines genetic determinism and biological essentialism. We aren’t mere victims doomed to a fate beyond our control. This dire situation is being created by all of us, individually and collectively. There is no better place to begin than with your own health, but we had better also treat this as a societal crisis verging on catastrophe. It was public policies and an international food system that created the conditions that enacted and enforced this failed mass experiment of dietary dogma and capitalist realist profiteering. Maybe we could try something different, something less psychopathically authoritarian, less psychotically disconnected from reality, less collectively suicidal. Heck, it’s worth a try.

* * *

** I’d slightly disagree with her emphasis. She thinks what matters most are the changes over the past century. There is a good point made in this focus on late modernity. But I’d note that industrialization and modern agriculture began in the prior centuries.

It was in the colonial era that pasta was introduced to Italy, potatoes to Ireland, and sugar throughout the Western world. It wasn’t until the late 1700s and more clearly in the early 1800s that there were regular grain surpluses that made grains available for feeding/fattening both humans and cattle. In particular, it was around this time that agricultural methods improved for wheat crops, allowing it to be affordable to the general public for the first time in human existence and hence causing white bread to become common during the ensuing generations.

I don’t know about diseases like Alzheimer’s, Parkinson’s, and multiple sclerosis. But I do know that the most major diseases of civilization (obesity, diabetes, cancer, and mental illness) were first noticed to be on the rise during the 1700s and 1800s or sometimes earlier, long before industrial oils or the industrial revolution that made these oils possible. The high-carb diet appeared gradually with colonial trade and spread across numerous societies, first hitting the wealthiest before eventually being made possible for the dirty masses. During this time, it was observed by doctors, scientists, missionaries and explorers that obesity, diabetes, cancer, mental illness and moral decline quickly followed on the heels of this modern diet.

Seed oils were simply the final Jenga block pulled from an ever-growing and ever more wobbly tower, replacing the healthy, nutrient-dense animal fats (full of fat-soluble vitamins, choline, omega-3 fatty acids, etc.) that had been counterbalancing some of the worst effects of the high-carb diet. But seed oils, as with farm chemicals such as glyphosate, never would have had as severe and dramatic an impact if not for the previous centuries of worsening diet and health. The problem had been building up over a long time, and the tower was doomed to topple right from the start. We are simply now at the tipping point, the inevitable conclusion of a sad trajectory.

Still, it’s never too late… or let us hope. Dr. Shanahan prefers to end on an optimistic note. And I’d rather not disagree with her about that. I’ll assume she is right or that she is at least in the general ballpark. Let us do as she suggests. We need more and better research, but somehow industrial seed oils have slipped past the notice of autism researchers.

* * *

On Deep Nutrition and Genetic Expression
interview by Kristen Michaelis CNC

Dr. Cate: Genetic Wealth is the idea that if your parents or grandparents ate traditional and nutrient-rich foods, then you came into the world with genes that could express in an optimal way, and this makes you more likely to look like a supermodel and be an extraordinary athlete. Take Angelina Jolie or Michael Jordan, for instance. They’ve got loads of genetic wealth.

Genetic Momentum describes the fact that, once you have that extraordinary genetic wealth, you don’t have to eat so great to be healthier than the average person. It’s like being born into a kind of royalty. You always have that inheritance around and you don’t need to work at your health in the same way other people do.

These days, for most of us, it was our grandparents or great grandparents who were the last in our line to grow up on a farm or get a nutrient-rich diet. In my case, I have to go back 4 generations to the Irish and Russian farmers who immigrated to NYC where my grandparents on both sides could only eat cheap food; sometimes good things like chopped liver and beef tongue, but often preserves and crackers and other junk. So my grandparents were far healthier than my brother and sisters and I.

The Standard American Diet (SAD) has accelerated the processes of genetic wealth being spent down, genetic momentum petering out, and the current generation getting sick earlier than their parents and grandparents. This is a real, extreme tragedy on the order of end-of-the-world level losses of natural resources. Genetic wealth is a kind of natural resource. And loss of genetic wealth is a more urgent problem than peak oil or the bursting of the housing bubble. But of course nobody is talking about it directly, only indirectly, in terms of increased rates of chronic disease.

Take autism, for example. Why is autism so common? I don’t think vaccines are the reason for the vast vast majority of cases, since subtle signs of autism can be seen before vaccination in the majority. I think the reason has to do with loss of genetic wealth. We know that children with autism exhibit DNA mutations that their parents and grandparents did not have. Why? Because in the absence of necessary nutrients, DNA cannot even duplicate itself properly and permanent mutations develop.

(Here’s an article on one kind of genetic mutation (DNA deletions) associated with autism.)

Fortunately, most disease is not due to permanent letter mutations and therefore a good diet can rehabilitate a lot of genetic disease that is only a result of altered genetic expression. To put your high-school biology to work, it’s the idea of genotype versus phenotype. You might have the genes that make you prone to, for example, breast cancer (the BRCA1 mutation), but you might not get the disease if you eat right because the gene expression can revert back to normal.

Deep Nutrition: Why Your Genes Need Traditional Food
by Dr. Catherine Shanahan
pp. 55-57

Guided Evolution?

In 2007, a consortium of geneticists investigating autism boldly announced that the disease was not genetic in the typical sense of the word, meaning that you inherit a gene for autism from one or both of your parents. New gene sequencing technologies had revealed that many children with autism had new gene mutations, never before expressed in their family line.

An article published in the prestigious journal Proceedings of the National Academy of Sciences states, “The majority of autisms are a result of de novo mutations, occurring first in the parental germ line.” 42 The reasons behind this will be discussed in Chapter 9.

In 2012, a group investigating these new, spontaneous mutations discovered evidence that randomness was not the sole driving force behind them. Their study, published in the journal Cell, revealed an unexpected pattern of mutations occurring 100 times more often in specific “hotspots,” regions of the human genome where the DNA strand is tightly coiled around organizing proteins called histones that function much like spools in a sewing kit, which organize different colors and types of threads. 43

The consequences of these mutations seem specifically designed to toggle up or down specific character traits. Jonathan Sebat, lead author on the 2012 article, suggests that the hotspots are engineered to “mutate in ways that will influence human traits” by toggling up or down the development of specific behaviors. For example, when a certain gene located at a hotspot on chromosome 7 is duplicated, children develop autism, a developmental delay characterized by near total lack of interest in social interaction. When the same chromosome is deleted, children develop Williams Syndrome, a developmental delay characterized by an exuberant gregariousness, where children talk a lot, and talk with pretty much anyone. The phenomenon wherein specific traits are toggled up and down by variations in gene expression has recently been recognized as a result of the built-in architecture of DNA and dubbed “active adaptive evolution.” 44

As further evidence of an underlying logic driving the development of these new autism-related mutations, it appears that epigenetic factors activate the hotspot, particularly a kind of epigenetic tagging called methylation. 45 In the absence of adequate B vitamins, specific areas of the gene lose these methylation tags, exposing sections of DNA to the factors that generate new mutations. In other words, factors missing from a parent’s diet trigger the genome to respond in ways that will hopefully enable the offspring to cope with the new nutritional environment. It doesn’t always work out, of course, but that seems to be the intent.

You could almost see it as the attempt to adjust character traits in a way that will engineer different kinds of creative minds, so that hopefully one will give us a new capacity to adapt.

pp. 221-228

What Is Autism?

The very first diagnostic manual for psychiatric disorders published in 1954 described autism simply as “schizophrenic reaction, childhood type.” 391 The next manual, released in 1980, listed more specific criteria, including “pervasive lack of responsiveness to other people” and “if speech is present, peculiar speech patterns such as immediate and delayed echolalia, metaphorical language, pronominal reversal (using you when meaning me, for instance).” 392 Of course, the terse language of a diagnostic manual can never convey the real experience of living with a child on the spectrum, or living on the spectrum yourself.

When I graduated from medical school, autism was so rarely diagnosed that none of my psychiatry exams even covered it and I and my classmates were made aware of autism more from watching the movie Rain Man than from studying course material. The question of whether autism (now commonly referred to as ASD) is more common now than it was then or whether we are simply recognizing it more often is still controversial. Some literature suggests that it is a diagnostic issue, and that language disorders are being diagnosed less often as autism is being diagnosed more. However, according to new CDC statistics, it appears that autism rates have risen 30 percent between 2008 and 2012. Considering that diagnostic criteria had been stable by that point in time for over a decade, increased diagnosis is unlikely to be a major factor in this 30 percent figure. 393

Given these chilling statistics, it’s little wonder that so many research dollars have been dedicated to exploring possible connections between exposure to various environmental factors and development of the disorder. Investigators have received grants to look into a possible link between autism and vaccines, 394 smoking, 395 maternal drug use (prescription and illicit), 396 , 397 , 398 organophosphates, 399 and other pesticides, 400 BPA, 401 lead, 402 mercury, 403 cell phones, 404 IVF and infertility treatments, 405 induced labor, 406 high-powered electric wires, 407 flame retardants, 408 ultrasound, 409 —and just about any other environmental factor you can name. You might be wondering if they’ve also looked into diet. But of course: alcohol, 410 cow’s milk, 411 milk protein, 412 soy formula, 413 gluten, 414 and food colorings 415 have all been investigated. Guess what they’ve never dedicated a single study to investigating? Here’s a hint: it’s known to be pro-oxidative and pro-inflammatory and contains 4-HNE, 4-HHE, and MDA, along with a number of other equally potent mutagens. 416 Still haven’t guessed? Okay, one last hint: it’s so ubiquitous in our food supply that for many Americans it makes up as much as 60 percent of their daily caloric intake, 417 a consumption rate that has increased in parallel with rising rates of autism.

Of course, I’m talking about vegetable oil. In Chapter 2 , I discussed in some detail how and why gene transcription, maintenance, and expression are necessarily imperiled in the context of a pro-inflammatory, pro-oxidative environment, so I won’t go further into that here. But I do want to better acquaint you with the three PUFA-derived mutagens I just named because when they make it to the part of your cell that houses DNA, they can bind to DNA and create new, “de novo,” mutations. DNA mutations affecting a woman’s ovaries, a man’s sperm, or a fertilized embryo can have a devastating impact on subsequent generations.

First, let’s revisit 4-HNE (4-hydroxynonenal), which you may recall meeting in the above section on firebombing the highways. This is perhaps the most notorious of all the toxic fats derived from oxidation of omega-6 fatty acids, whose diversity of toxic effects requires that entire chemistry journals be devoted to 4-HNE alone. When the mutagenicity (ability to mutate DNA) of 4-HNE was first described in 1985, the cytotoxicity (ability to kill cells) had already been established for decades. The authors of a 2009 review article explain that the reason it had taken so long to recognize that HNE was such an effective carcinogen was largely due to the fact that “the cytotoxicity [cell-killing ability] of 4-HNE masked its genotoxicity [DNA-mutating effect].” 419 In other words, it kills cells so readily that they don’t have a chance to divide and mutate. How potently does 4-HNE damage human DNA? After interacting with DNA, 4-HNE forms a compound called an HNE-adduct, and that adduct prevents DNA from copying itself accurately. Every time 4-HNE binds to a guanosine (the G of the four-letter ACGT DNA alphabet), there is somewhere between a 0.5 and 5 percent chance that G will not be copied correctly, and that the enzyme trying to make a perfect copy of DNA will accidentally turn G into T. 420 Without 4-HNE, the chance of error is about a millionth of a percent. 421 In other words, 4-HNE increases the chances of a DNA mutation rate roughly a million times!

Second, 4-HHE (4-hydroxyhexenal), which is very much like 4-HNE, his more notorious bigger brother derived from omega-6, but 4-HHE is derived instead from omega-3. If bad guys had sidekicks, 4-HNE’s would be 4-HHE. Because 4-HHE does many of the same things to DNA as 4-HNE, but has only been discovered recently. 422 You see, when omega-6 reacts with oxygen, it breaks apart into two major end products, whereas omega-3, being more explosive, flies apart into four different molecules. This means each one is present in smaller amounts, and that makes them a little more difficult to study. But it doesn’t make 4-HHE any less dangerous. 4-HHE specializes in burning through your glutathione peroxidase antioxidant defense system. 423 This selenium-based antioxidant enzyme is one of the three major enzymatic antioxidant defense systems, and it may be the most important player defending your DNA against oxidative stress. 424 , 425

Finally, there is malonaldehyde (MDA), proven to be a mutagen in 1984, but presumed to only come from consumption of cooked and cured meats. 426 Only in the past few decades have we had the technology to determine that MDA can be generated in our bodies as well. 427 And unlike the previous two chemicals, MDA is generated by oxidation of both omega-3 and omega-6. It may be the most common endogenously derived oxidation product. Dr. J. L. Marnett, who directs a cancer research lab at Vanderbilt University School of Medicine, Nashville, Tennessee, and who has published over 400 articles on the subject of DNA mutation, summarized his final article on MDA with the definitive statement that MDA “appears to be a major source of endogenous DNA damage [endogenous, here, meaning due to internal, metabolic factors rather than, say, radiation] in humans that may contribute significantly to cancer and other genetic diseases.” 428

There’s one more thing I need to add about vegetable-oil-derived toxic breakdown products, particularly given the long list of toxins now being investigated as potential causes of autism spectrum disorders. Not only do they directly mutate DNA, they also make DNA more susceptible to mutations induced by other environmental pollutants. 429 , 430 This means that if you start reading labels and taking vegetable oil out of your diet, your body will more readily deal with the thousands of contaminating toxins not listed on the labels which are nearly impossible to avoid.

Why all this focus on genes when we’re talking about autism? Nearly every day a new study comes out that further consolidates the consensus among scientists that autism is commonly a genetic disorder. The latest research is focusing on de novo mutations, meaning mutations neither parent had themselves but that arose spontaneously in their egg, sperm, or during fertilization. These mutations may affect single genes, or they may manifest as copy number variations, in which entire stretches of DNA containing multiple genes are deleted or duplicated. Geneticists have already identified a staggering number of genes that appear to be associated with autism. In one report summarizing results of examining 900 children, scientists identified 1,000 potential genes: “exome sequencing of over 900 individuals provided an estimate of nearly 1,000 contributing genes.” 431

All of these 1,000 genes are involved with proper development of the part of the brain most identified with the human intellect: our cortical gray matter. This is the stuff that enables us to master human skills: the spoken language, reading, writing, dancing, playing music, and, most important, the social interaction that drives the desire to do all of the above. One need only have a few of these 1,000 genes involved in building a brain get miscopied, or in some cases just one, in order for altered brain development to lead to one’s inclusion in the ASD spectrum.

So just a few troublemaker genes can obstruct the entire brain development program. But for things to go right, all the genes for brain development need to be fully functional.

Given that humans are thought to have only around 20,000 genes, and already 1,000 are known to be essential for building brain, that means geneticists have already labeled 5 percent of the totality of our genetic database as crucial to the development of a healthy brain—and we’ve just started looking. At what point does it become a foolish enterprise to continue to look for genes that, when mutated, are associated with autism? When we’ve identified 5,000? Or 10,000? The entire human genome? At what point do we stop focusing myopically only on those genes thought to play a role in autism?

I’ll tell you when: when you learn that the average autistic child’s genome carries de novo mutations not just in genes thought to be associated with autism, but across the board, throughout the entirety of the chromosomal landscape. Because once you’ve learned this, you can’t help but consider that autism might be better characterized as a symptom of a larger disease—a disease that results in an overall increase in de novo mutations.

Almost buried by the avalanche of journal articles on genes associated with autism is the finding that autistic children exhibit roughly ten times the number of de novo mutations compared to their typically developing siblings. 432 An international working group on autism pronounced this startling finding in a 2013 article entitled: “Global Increases in Both Common and Rare Copy Number Load Associated With Autism.” 433 ( Copy number load refers to mutations wherein large segments of genes are duplicated too often.) What the article says is that yes, children with autism have a larger number of de novo mutations, but the majority of their new mutations are not statistically associated with autism because other kids have them, too. The typically developing kids just don’t have nearly as many.

These new mutations are not only affecting genes associated with brain development. They are affecting all genes seemingly universally. What is more, there is a dose response relationship between the total number of de novo mutations and the severity of autism such that the more gene mutations a child has (the bigger the dose of mutation), the worse their autism (the larger the response). And it doesn’t matter where the mutations are located—even in genes that have no obvious connection to the brain. 434 This finding suggests that autism does not originate in the brain, as has been assumed. The real problem—at least for many children—may actually be coming from the genes. If this is so, then when we look at a child with autism, what we’re seeing is a child manifesting a global genetic breakdown. Among the many possible outcomes of this genetic breakdown, autism may simply be the most conspicuous, as the cognitive and social hallmarks of autism are easy to recognize.

As the authors of the 2013 article state, “Given the large genetic target of neurodevelopmental disorders, estimated in the hundreds or even thousands of genomic loci, it stands to reason that anything that increases genomic instability could contribute to the genesis of these disorders.” 435 Genomic instability —now they’re on to something. Because framing the problem this way helps us to ask the more fundamental question, What is behind the “genomic instability” that’s causing all these new gene mutations?

In the section titled “What Makes DNA Forget” in Chapter 2 , I touched upon the idea that an optimal nutritional environment is required to ensure the accurate transcription of genetic material and communication of epigenetic bookmarking, and how a pro-oxidative, pro-inflammatory diet can sabotage this delicate operation in ways that can lead to mutation and alter normal growth. There I focused on mistakes made in epigenetic programming, what you could call de novo epigenetic abnormalities. The same prerequisites that support proper epigenetic data communication, I submit, apply equally to the proper transcription of genetic data.

What’s the opposite of a supportive nutritional environment? A steady intake of pro-inflammatory, pro-oxidative vegetable oil that brings with it the known mutagenic compounds of the kind I’ve just described. Furthermore, if exposure to these vegetable oil-derived mutagens causes a breakdown in the systems for accurately duplicating genes, then you might expect to find other detrimental effects from this generalized defect of gene replication. Indeed we do. Researchers in Finland have found that children anywhere on the ASD spectrum have between 1.5 and 2.7 times the risk of being born with a serious birth defect, most commonly a life-threatening heart defect or neural tube (brain and spinal cord) defect that impairs the child’s ability to walk. 436 Another group, in Nova Scotia, identified a similarly increased rate of minor malformations, such as abnormally rotated ears, small feet, or closely spaced eyes. 437

What I’ve laid out here is the argument that the increasing prevalence of autism is best understood as a symptom of De Novo Gene Mutation Syndrome brought on by oxidative damage, and that vegetable oil is the number-one culprit in creating these new mutations. These claims emerge from a point-by-point deduction based on the best available chemical, genetic, and physiologic science. To test the validity of this hypothesis, we need more research.

Does De Novo Gene Mutation Syndrome Affect Just the Brain?

Nothing would redirect the trajectory of autism research in a more productive fashion than reframing autism as a symptom of the larger underlying disease, which we are provisionally calling de novo gene-mutation syndrome, or DiNGS. (Here’s a mnemonic: vegetable oil toxins “ding” your DNA, like hailstones pockmarking your car.)

If you accept my thesis that the expanding epidemic of autism is a symptom of an epidemic of new gene mutations, then you may wonder why the only identified syndrome of DiNGS is autism. Why don’t we see all manner of new diseases associated with gene mutations affecting organs other than the brain? We do. According to the most recent CDC report on birth defect incidence in the United States, twenty-nine of the thirty-eight organ malformations tracked have increased. 438

However, these are rare events, occurring far less frequently than autism. The reason for the difference derives from the fact that the brain of a developing baby can be damaged to a greater degree than other organs can, while still allowing the pregnancy to carry to term. Though the complex nature of the brain makes it the most vulnerable in terms of being affected by mutation, this aberration of development does not make the child more vulnerable in terms of survival in utero. The fact that autism affects the most evolutionarily novel portion of the brain means that as far as viability of an embryo is concerned, it’s almost irrelevant. If the kinds of severely damaging mutations leading to autism were to occur in organs such as the heart, lungs, or kidneys, fetal survival would be imperiled, leading to spontaneous miscarriage. Since these organs begin developing as early as four to six weeks of in-utero life, failure of a pregnancy this early might occur without any symptoms other than bleeding, which might be mistaken for a heavy or late period, and before a mother has even realized she’s conceived.
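As a side note, the “roughly a million times” figure in the 4-HNE passage above is consistent with the error rates quoted there. Taking a millionth of a percent as the baseline miscopy rate and 0.5 to 5 percent as the rate with an HNE-adduct present, the arithmetic (mine, not Shanahan’s) is:

\[
\frac{0.5\%}{10^{-6}\%} = 5 \times 10^{5}
\qquad \text{and} \qquad
\frac{5\%}{10^{-6}\%} = 5 \times 10^{6}
\]

That is an increase somewhere between half a million and five million times, which is indeed on the order of a million.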

* * *

Rhonda Patrick’s view is similar to that of Shanahan:

American Heart Association’s “Fat and Cholesterol Counter” (1991)

  • 1963 – “Every woman knows that carbohydrates are fattening, this is a piece of common knowledge, which few nutritionists would dispute.”
  • 1994 – “… obesity may be regarded as a carbohydrate-deficiency syndrome and that an increase in dietary carbohydrate content at the expense of fat is the appropriate dietary part of a therapeutical strategy.”*

My mother was about to throw out an old booklet from the American Heart Association (AHA), “Fat and Cholesterol Counter”, one of several publications they put out around that time. It was published in 1991, the year I started high school. Unsurprisingly, it blames everything on sodium, calories, cholesterol, and, of course, saturated fat.

Even hydrogenated fat gets blamed on saturated fat, since the hydrogenation process turns some small portion of it saturated, which ignores the heavy damage and inflammatory response caused by the oxidation process (both in the industrial processing and in cooking). Not to mention that these hydrogenated fats, being industrial seed oils, are filled with omega-6 fatty acids, the main reason they are so inflammatory. Saturated fat, on the other hand, is not inflammatory at all. This obsession with saturated fat is so strange. It never made any sense from a scientific perspective. When the obesity epidemic began, along with all that went with it, the consumption of saturated fat by Americans had already been steadily dropping for decades, ever since the invention of industrial seed oils in the late 1800s and the fear about meat caused by Upton Sinclair’s muckraking exposé The Jungle, about the meatpacking industry.

The amount of saturated fat and red meat in the diet has declined over the past century, replaced by those industrial seed oils and lean white meat, along with fruits and vegetables — all of which have been increasing.** Chicken, in particular, replaced beef, and what stands out about chicken is that, like those industrial seed oils, it is high in inflammatory omega-6 fatty acids. How could saturated fat be causing the rising rates of heart disease and the rest when people were eating less of it? This scapegoating wasn’t only unscientific but blatantly irrational. All of this was known way back when Ancel Keys went on his anti-fat crusade (The Creed of Ancel Keys). It wasn’t a secret, and explaining it away required cherry-picked data and convoluted rationalizations.

Worse than removing saturated fat when it’s not a health risk is the fact that it is actually an essential nutrient for health: “How much total saturated do we need? During the 1970s, researchers from Canada found that animals fed rapeseed oil and canola oil developed heart lesions. This problem was corrected when they added saturated fat to the animals diets. On the basis of this and other research, they ultimately determined that the diet should contain at least 25 percent of fat as saturated fat. Among the food fats that they tested, the one found to have the best proportion of saturated fat was lard, the very fat we are told to avoid under all circumstances!” (Millie Barnes, The Importance of Saturated Fats for Biological Functions).

It is specifically lard that has been most thoroughly removed from the diet, and this is significant because lard was central to the American diet until this past century: “Pre-1936 shortening is comprised mainly of lard while afterward, partially hydrogenated oils came to be the major ingredient” (Nina Teicholz, The Big Fat Surprise, p. 95); “Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard” (p. 126). And what about the Mediterranean people who supposedly are so healthy because of their love of olive oil? “Indeed, in historical accounts going back to antiquity, the fat more commonly used in cooking in the Mediterranean, among peasants and the elite alike, was lard.” (p. 217).

Jason Prall notes that long-lived populations ate “lots of meat” and specifically, “They all ate pig. I think pork was the only common animal that we saw in the places that we went” (Longevity Diet & Lifestyle Caught On Camera w/ Jason Prall). The famously long-lived Okinawans also eat nearly every part of the pig, to the point that their entire culture and religion centered around pigs (Blue Zones Dietary Myth). Lard, in case you didn’t know, comes from pigs. Pork and lard are found in so many diets for the simple reason that pigs can live in diverse environments, from mountainous forests to tangled swamps to open fields, and they are a food source available year round.

Another thing that has gone hand in hand with the loss of healthy, nutrient-dense saturated fat in the American diet is a loss of nutrition in general. It’s not only that plant foods have fewer minerals and vitamins because of depleted soil and because they are picked unripe in order to ship them long distances. The same is true of animal foods, since the animals are being fed the same crappy plant foods as us humans. But at the very least, even factory-farmed animals offer far more bioavailable nutrient-density than plant foods from industrial agriculture. If we ate more fatty meat, saturated fat or otherwise, we’d be getting far more fat-soluble vitamins. And when looking at all animal foods, in particular from pasture-raised and wild-caught sources, there is no mineral or vitamin that can’t be obtained at required levels. The same can’t be said for plant foods on a vegan diet.

Back in 1991, the AHA was recommending the inclusion of lots of bread, rolls, crackers, and pasta (“made with low-fat milk and fats or oils low in saturated fatty acids” and “without eggs”); rice, beans, and peas; sugary fruits and starchy vegetables (including juices) — and desserts were fine as well. At most, eat 3 or 4 eggs a week and, as expected, optimally avoid the egg yolks where all the nutrition is located (not only fat-soluble vitamins, but also choline and cholesterol and much else; by the way, your brain health is dependent on high levels of dietary cholesterol, such that statins, in blocking cholesterol, cause neurocognitive decline). As long as there was little if any saturated fat and fat in general was limited, buckets of starchy carbs and sugar were considered by the AHA to be part of a healthy and balanced diet. That is sad.

This interested me because of the year. It was as I was entering young adulthood and becoming more aware of the larger world. I remember the heavy-handed propaganda preaching that fiber is good and fat is evil, as if the war on obesity were a holy crusade that demanded black-and-white thinking, all subtleties and complexities denied in adherence to the moralistic dogma against the sins of gluttony and sloth — it was literally an evangelistic medical gospel (see Belinda Fettke’s research on the Seventh Day Adventists: Thou Shalt not discuss Nutrition ‘Science’ without understanding its driving force). In our declining public health, we were a fallen people who required a dietary clergy for our salvation. Millennia of traditional dietary wisdom and knowledge were thrown out the window as if worthless or maybe even dangerous.

I do remember my mother buying high-fiber cereals and “whole wheat” commercial breads (not actually whole wheat, as it is simply denatured refined flour with fiber added back in). Along with this, skim or 1% fat dairy foods, especially milk, were included with every major meal and often snacks. I had sugary and starchy cereal with skim milk (and/or milk with sugary Instant Breakfast) every morning and a glass of skim milk with every dinner, maybe sometimes milk for lunch. Cheese was a regular part of the diet as well, such as with pizza eaten multiple times a week or any meal with pasta, and heck, cheese was a great snack all by itself, but also good combined with crackers, and one could pretend to be healthy if one used Triscuits. Those were the days when I might devour a whole block of cheese, probably low-fat, in a single sitting — I was probably craving fat-soluble vitamins. Still, most of my diet was starches and sugar, as that was my addiction. The fiber was an afterthought to market junk food as health food.

It now makes sense. When I was a kid in the 1980s, my mother says, the doctor understood that whole-fat milk was important for growing bodies. So that is what he recommended. But I guess the anti-fat agenda had fully taken over by the 1990s. The AHA booklet from 1991 was by then recommending “skim or 1% milk and low-fat cheeses” for all ages, including babies and children, and pregnant and lactating women. Talk about a recipe for health disaster. No wonder metabolic syndrome exploded and neurocognitive health fell like a train going over a collapsed bridge. It was so predictable, as the failure of this diet was understood by many going back to earlier in the century (e.g., Weston A. Price; see my post Health From Generation To Generation).

The health recommendations did get worse over time, but to be fair the trouble started much earlier. Experts had been discouraging breastfeeding for a while. Traditionally, babies were breastfed for the first couple of years or so. By the time modern America came around, experts were suggesting a short period of breast milk or even relying entirely on scientifically-designed formulas. My mother only breastfed me for 5-6 months and then put me on cow’s milk — of course, pasteurized and homogenized milk from grain-fed and factory-farmed cows. When the dairy caused diarrhea, the doctor suggested soy milk. After a while, my mother put me on dairy again, but the diarrhea persisted and so for preschool she put me back on soy milk. I was drinking soy milk off and on for many years during the most important stage of development. Holy fuck! That had to have done serious damage to my developing body, in particular my brain. Then I went from that to skim milk during another important time of development, as I hit puberty and went through growth spurts.

Early on in elementary school, I had delayed reading and a diagnosis of learning disability, seemingly along with something along the lines of either Asperger’s or specific language impairment, although undiagnosed. I definitely had social and behavioral issues, in that I didn’t understand people well when I was younger. Then entering adulthood, I was diagnosed with depression and something like a “thought disorder” (I forget the exact diagnosis I got while in a psychiatric ward after a suicide attempt). No doubt the latter was already present in my early neurocognitive problems, as I obviously was severely depressed at least as early as 7th grade. A malnourished diet of lots of carbs and little fat was the most probable cause of all of these problems.

Thanks, American Heart Association! Thanks for doing so much harm to my health and making my life miserable for decades, not to mention nearly killing me through depression so severe I attempted suicide, and then the decades of depressive struggle that followed. That isn’t even to mention the sugar and carb addiction that plagued me for so long. Now multiply my experience by that of at least hundreds of millions of other Americans, and an even greater number of people elsewhere as their governments followed the example of the United States, across the past few generations. Great job, AHA. And much appreciation for the helping hand of the USDA and various medical institutions in enforcing this anti-scientific dogma.

Let me be clear about one thing. I don’t blame my mother, as she was doing the best she could with the advice given to her by doctors and corporate media, along with the propaganda literature from respected sources such as the AHA. Nor do I blame any other average Americans as individuals, although I won’t hold back from placing the blame squarely on the shoulders of demagogues like Ancel Keys. As Gary Taubes and Nina Teicholz have made so clear, this was an agenda of power, not science. With the help of government and media, the actual scientific debate was silenced and disappeared from public view (Eliminating Dietary Dissent). The consensus in favor of a high-carb, low-fat diet didn’t emerge through rational discourse and evidence-based medicine — it was artificially constructed and enforced.

Have we learned our lesson? Apparently not. We still see this tactic of technocratic authoritarianism, such as with the corporate-funded push behind EAT-Lancet (Dietary Dictocrats of EAT-Lancet). Why do we tolerate this agenda-driven exploitation of public trust and harm to public health?

* * *

 * First quote: Passmore, R., and Y. E. Swindells. 1963. “Observations on the Respiratory Quotients and Weight Gain of Man After Eating Large Quantities of Carbohydrates.” British Journal of Nutrition 17: 331-39.
Second quote: Astrup, A., B. Buemann, N. J. Christensen, and S. Toubro. 1994. “Failure to Increase Lipid Oxidation in Response to Increasing Dietary Fat Content in Formerly Obese Women.” American Journal of Physiology, April, 266 (4, pt. 1): E592-99.
Both quotes are from a talk given by Peter Ballerstedt, “AHS17 What if It’s ALL Been a Big Fat Lie?,” available on the Ancestry Foundation Youtube page.

(It appears that evidence-based factual reality literally changes over time. I assume this relativity of ideological realism has something to do with quantum physics. It’s the only possible explanation. I’m feeling a bit snarky, in case you didn’t notice.)

** In prior centuries, Americans ate few plant foods because they were so difficult and time-consuming to grow. There was no way to control the pests and wild animals that often would devour and destroy a garden or a crop. It was too much investment for too little reward, not to mention extremely unreliable as a food source and so risky to survival for those with a subsistence lifestyle. Until modern farming methods, especially the 20th century industrialization of agriculture, most Americans primarily ate animal foods with tons of fat, mostly butter, cream, and lard, along with a wide variety of wild-caught animal foods.

This is discussed by Nina Teicholz in The Big Fat Surprise: “Early-American settlers were “indifferent” farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with “the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,” as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.” (see more in my post Malnourished Americans). That puts the conventional dietary debate in an entirely different context. Teicholz adroitly dismantles the claim that fatty animal foods have increased in the American diet.

Teicholz goes on to state that, “So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating.” It was the seed oils that originally were an industrial byproduct, combined with Upton Sinclair’s muckraking journalism about the meatpacking industry (The Jungle), that caused meat and animal fats to quickly fall out of favor as the foundation of the American diet. Saturated fat, in particular, had been in decline for decades prior to the epidemics of obesity, diabetes, and heart disease. Ancel Keys knew this, which is why he had to throw out some of his data to make it fit the preconceived conclusions of his preferred dietary ideology.

If we honestly wanted to find the real culprit, we would look to the dramatic rise of vegetable oils, white flour, and sugar in the 20th century diet. It began much earlier with grain surpluses and cheap wheat, especially in England during the 1800s, but in the United States it became most noticeable in the first half of the 20th century. The agenda of Keys and the AHA simply made a bad situation worse — indeed, much, much worse.

Healthy Diet Made Simple

Let me share the Cosmic Secret of Dietary Success™. It cannot fail! Money back guaranteed.

I’ve studied and experimented with various diets. And I’ve observed many others in their own experiences and results. One begins to see patterns across all dietary regimens and strategies. There is a basic consistency to what works for most people.

Here it is — the official DICE Dietary Protocol© (not in order of priority):

  1. Don’t eat fat and carbs together, limiting one or the other or both. That is to say, do a low-carb/moderate-to-high-fat diet or a low-fat/moderate-to-high-carb diet. In either case, it can be done as plant-based, animal-based, or fancy-free omnivory. In practice, this would mean, for example, eating the bread or eating the meat but not eating a sandwich with the two combined. This is the standard strategy for any health issue related to metabolic syndrome, such as obesity, diabetes, heart disease, and fatty liver. That’s because lots of starchy carbs and added sugar combined with lots of fats, especially industrial seed oils (oxidized and high in omega-6 fatty acids), cause all kinds of havoc in the body. Going one way or the other will effectively improve health, at least in the short term of counteracting the accumulated harm of the Standard American Diet (SAD). Debates about the best long-term diet are a separate issue.
  2. If struggling, try an elimination diet in order to determine specific allergies or intolerances: wheat gluten, dairy lactose, egg whites, plant oxalates, and many other potentially problematic foods and categories of foods (such as the nightshade family and anything high in histamines). There are many versions of the elimination diet. The most conventional one is to remove everything from the diet besides rice. However, a downside is that a significant percentage of people have a high glycemic response to rice, which is a problem given that 88% of the American population has one or more symptoms of metabolic syndrome. So, some might find meat, instead of rice, a better starting point (Like water fasts, meat fasts are good for health). Few people have any problems with fresh meat from ruminants. In fact, some find this so beneficial with all that ails them that they remain carnivore or else decide to use plant foods sparingly. Sure, others might instead choose to go the vegetarian or vegan route, but I’ve never heard of anyone trying an elimination diet with rice and deciding to eat nothing other than rice for the rest of their lives.
  3. Change metabolic functioning with fasting, calorie restriction, portion control, protein leveraging, hormonal hunger signaling, etc. This is one of the most powerful and effective tools, especially for fat loss and weight maintenance. Some of these methods have the added benefit of curbing appetite, cravings, and addictions while improving mood, energy, and stamina. This is specifically true of ketosis, which can be achieved by numerous means, not limited to the ketogenic diet. Many diets, intentionally or unintentionally, increase ketone levels and, simply put, that makes one feel good. There are far more ketogenic and lower-carb diets than is generally acknowledged in how they are labeled or marketed (e.g., Weight Watchers’ Paleo Diet). This is a natural tendency in the dieting world because low-carb, especially ketogenic, is a powerhouse strategy. It’s not the only strategy, but it’s hard to go wrong with it. Even on higher-carb diets, many people will turn to other methods that promote ketone production — the above mentioned fasting, calorie restriction, and portion control, or else long periods of aerobic exercise. People intuitively seek out ketosis, whether or not they know anything about it.
  4. Exclude highly processed foods with chemical additives, refined carbs, added sugar, and seed oils. Basically, avoid junk food and fast food: candy, chips, crackers, commercial breads, pop, fruit juice, and other such crap. So, eat whole foods or else those prepared in traditional ways: lightly cooked or steamed vegetables, soaked and rinsed legumes, long-fermented breads, real sauerkraut, yogurt, raw aged cheese, homemade bone broth, naturally-cured meat, etc (ignoring minor disagreements over details, as there are always disagreements). Generally, avoid packaged foods, especially those with long lists of ingredients that you don’t recognize and can’t pronounce. And when possible, cook your own meals with ingredients procured from trustworthy sources.

Some combination and variation of this set of guidelines will solve basic diet-related health concerns for almost anyone. For bonus points, eat foods that are locally produced, in season, organic, pasture-raised, wild-caught, nutrient-dense, and nutrient-bioavailable. You’re welcome!
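
For the programmers in the audience, here is a toy sketch of how the first and fourth rules might be screened for automatically. To be clear, this is only a playful illustration under stated assumptions: the meal fields, gram thresholds, and example meals are all made up, and nothing here is an official DICE implementation or clinical guidance.

# A toy screen for DICE rule 1 (don't combine high fat with high carbs) and
# rule 4 (skip ultra-processed foods). Thresholds and foods are hypothetical.

from dataclasses import dataclass

@dataclass
class Meal:
    name: str
    carb_grams: float
    fat_grams: float
    ultra_processed: bool

# Hypothetical cutoffs chosen only to make the example run.
CARB_LIMIT = 50.0   # grams per meal treated as "high carb"
FAT_LIMIT = 30.0    # grams per meal treated as "high fat"

def dice_check(meal: Meal) -> list[str]:
    """Return a list of DICE-style warnings for a meal (empty list = no flags)."""
    warnings = []
    if meal.carb_grams > CARB_LIMIT and meal.fat_grams > FAT_LIMIT:
        warnings.append("combines high fat with high carbs (rule 1)")
    if meal.ultra_processed:
        warnings.append("ultra-processed ingredients (rule 4)")
    return warnings

if __name__ == "__main__":
    meals = [
        Meal("steak and greens", carb_grams=10, fat_grams=45, ultra_processed=False),
        Meal("burger, fries, and pop", carb_grams=120, fat_grams=55, ultra_processed=True),
    ]
    for meal in meals:
        flags = dice_check(meal)
        status = "; ".join(flags) if flags else "passes the DICE screen"
        print(f"{meal.name}: {status}")

Rules 2 and 3 don’t reduce to a simple check like this, since they depend on individual responses tracked over time.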

“Now we know.”
“And Knowing is half the battle.”
“G.I. Joe!!!”

Dietary Health Across Generations

It’s common to blame individuals for the old Christian sins of sloth and gluttony. But that has never made much sense, at least not scientifically. Gary Taubes has discussed this extensively, and so look to his several books for more info about why applying Christian theology to diet, nutrition, and health is not a wise strategy for evidence-based medicine and public health policy.

Yes, Americans in particular would be wise to do something about their health in a society where 88% of the adult population has one or more symptoms of metabolic syndrome, with about three-quarters being overweight and about half diabetic or prediabetic (Joana Araújo, Jianwen Cai, June Stevens, “Prevalence of Optimal Metabolic Health in American Adults: National Health and Nutrition Examination Survey 2009–2016”; for more info, see The University of North Carolina at Chapel Hill or Science Daily). Consider that these statistics are even worse for the younger generations. But let’s put this in even greater context. It’s not only that each generation is unhealthier than the last; this declining health is being inherited from before birth. There is now an obesity epidemic among 6-month-old babies. I doubt anyone thinks it’s reasonable to blame babies. Should babies eat less and exercise more?

This goes back a while. European immigrants in the early 1900s noticed how American children were much chubbier than their European counterparts. By the 1950s, there was already discussion of an obesity epidemic, as it was becoming noticeable in the younger generations. We are several generations into this modern industrialized diet of highly processed starchy carbs, added sugar, and seed oils. Much of this is caused by worsening environmental conditions, from harmful chemicals to the industrial food system. The effects begin in the womb, but the causality can actually extend across numerous generations.

This is called epigenetics: what determines which genes get expressed and how. The epigenetic effect is magnified by the microbiome we inherit as well, since microbes help shape epigenetic expression, partly through short-chain fatty acids that can be obtained from either plant or animal foods (Fiber or Not: Short-Chain Fatty Acids and the Microbiome). This is important, as it is easier and more straightforward to manipulate our microbiome than our epigenetics, or at least our knowledge of the former is more clear. By changing our diet, we can change our microbiome. And by changing our microbiome, we can change our epigenetics and that of our children and grandchildren.

The dietary aspect is the most basic component, in that some diets seem to act directly on the epigenome itself, whether or not the microbiome is involved — for example, there is “recent evidence that KD [ketogenic diet] influences the epigenome through modulation of adenosine metabolism as a plausible antiepileptogenic mechanism of the diet” (Theresa A. Lusardi & Detlev Boison, Ketogenic Diet, Adenosine, Epigenetics, and Antiepileptogenesis). It’s been proven for about a century now that the ketogenic diet is the most effective treatment for epileptic seizures, but there has been much debate about why. Now we might know the reason. The mechanism appears to be epigenetic.

This is not exactly new knowledge (Health From Generation To Generation). Such cross-generational influences have been known since earlier last century, but sadly such knowledge is not epigenetically inherited by each succeeding generation. Francis M. Pottenger Jr. studied the health of cats on severely malnourished and well-nourished diets — by the third generation the malnourished cats were no longer capable of breeding and so there was no fourth generation. This doesn’t perfectly translate to the present human diet, although it does make one wonder. Many of our diseases of civilization seem to be at least partly caused by malnourishment. This is a public health epidemic, even a national security crisis.

Here is the question that comes to mind: In this modern industrialized diet, what generation of malnourishment are we at now? And if as a society we changed public health policies and medical practice right now, how many generations would it take to reverse the trend and fully undo the damage? To end on a positive note, we could potentially turn it around within this century: “Dr. Pottenger’s research also showed that the health of the cats could be recovered if the diet were returned to a healthy one by the second generation; however, even then it took four generations for some of the cats to show no symptoms of allergies” (Carolyn Biggerstaff, Pottenger’s Cats – an early window on epigenetics).

So, what are we waiting for?

* * *

To give you some idea of how long our society has experienced declining health, check out some of my earlier posts:

Malnourished Americans
Ancient Atherosclerosis?
The Agricultural Mind

* * *

Videos, podcasts, and articles on epigenetics as related to diet, nutrition, microbiome, health, etc with some emphasis on paleo and ketogenic viewpoints:

Nutriepigenomics
from Wikipedia

Changes in the diet affect epigenetics via the microbiota
from EurekAlert!

Diet and the epigenome
by Yi Zhang and Tatiana G. Kutateladze

Dietary Epigenetics: New Frontiers
by Austin Perlmutter

RHR: The Latest Discoveries in Evolutionary Biology, Genetics, and Epigenetics
by Chris Kresser

Epigenetics, Methylation, and Gene Expression
by Kevin Cann

Epigenetics: Will It Change the Way We Treat Disease?
by Kissairis Munoz

Hacking Your Genes Through Epigenetics and Targeted Nutrigenomics
by Daniel Rash

The Promise of Paleo-Epigenetics
by Jennifer Raff

Dawn of Paleoepigenomics
by Zachary Cofran

37: Robb Wolf – Diets, Epigenetics, Longevity, and Going Foodless for 9 Days
by Andy Petranek

Epigenetics and the Paleo Diet
from The Paleo Diet

Paleo, Epigenetics, and Your Weight
from Paleo Leap

EP157: Improving Mental Health with Epigenetics, Diet & Exercise with Alex Swanson
from Paleo Valley

Epigenetics Warning: Are You Wrecking Your Kids’ Health?
by Louise Hendon

EPISODE 64: Epigenetics 101 with Bailey Kirkpatrick
from Phoenix Helix

Episode 90 – Dr. Lucia Aronica studies keto and epigenetics
by Brian Williamson

Can Keto Affect Your Genes?
from KetoNutrition

Energy & Epigenetics 1: The Infant Brain is Unique
by Jack Kruse

Dr. David Perlmutter: Intermittent Fasting, Epigenetics & What Sugar Really Does To Your Brain
by Abel James

Epigenetic Explanations For Why Cutting Sugar May Make You Feel Smarter
by Caitlin Aamodt

Eating Sweet, Fatty Foods During Pregnancy is Linked to ADHD in Children
by Bailey Kirkpatrick

High Fat, Low Carb Diet Might Epigenetically Open Up DNA and Improve Mental Ability
by Bailey Kirkpatrick

A Child’s Mental Fitness Could Be Epigenetically Influenced by Dad’s Diet
by Bailey Kirkpatrick

Dad’s Drinking Could Epigenetically Affect Son’s Sensitivity and Preference for Alcohol
by Bailey Kirkpatrick

B Vitamins Protect Against Harmful Epigenetic Effects of Air Pollution
by Bailey Kirkpatrick

Vitamin D Adjusts Epigenetic Marks That Could Hinder A Baby’s Health
by Bailey Kirkpatrick

Could We Use Epigenetics and Diet to Fix Binge Eating?
by Bailey Kirkpatrick

Early Epigenetic Nutrition ‘Memory’ Could Program You for Obesity Later in Life
by Bailey Kirkpatrick

The Consequences of a Poor Diet Could Epigenetically Persist Despite Improving Eating Habits
by Bailey Kirkpatrick

Epigenetic Transfer of Nutrition ‘Memory’ Ends Before Great-Grandchildren
by Bailey Kirkpatrick

How your grandparents’ life could have changed your genes
by Tim Spector

Nutrition & the Epigenome
from University of Utah

The epigenetics diet: A barrier against environmental pollution
from University of Alabama at Birmingham

How Epigenetics May Help Explain the Complexity of Autism Spectrum Disorder
from Zymo Research

Epigenetics, Health and the Mind
from PBS with John Denu

Eating for two risks harm to the baby
by Laura Donnelly and Leah Farrar

Micronutrients in Psychiatry: Sound Science or Just Hype?
by Seth J. Gillihan

Epigenetics: A New Bridge between Nutrition and Health
by Sang-Woon Choi and Simonetta Friso

Role of diet in epigenetics: a review
by Abhina Mohanan and Raji Kanakkaparambil

The science behind the Dutch Hunger Winter
from Youth Voices

Epigenetic Marks From Parents Could Influence Embryo Development and Future Health
by Tim Barry

Can Your Diet Epigenetically Shape Your Child’s Health?
by Janeth Santiago Rios

Epigenetic Insights on Nutrition, Hormones and Eating Behavior
by Janeth Santiago Rios

Paternal Environmental and Lifestyle Factors Influence Epigenetic Inheritance
by Estephany Ferrufino

How Diet Can Change Your DNA
by Renee Morad

Food that shapes you: how diet can change your epigenome
by Cristina Florean

The Unknown Link: Epigenetics, Metabolism, and Nutrition
by Nafiah Enayet

Obesity, Epigenetics, and Gene Regulation
by Jill U. Adams

Epigenetics and Epigenomics: Implications for Diabetes and Obesity
by Evan D. Rosen et al

Epigenetic switch for obesity
from Science Daily

Epigenetics between the generations: We inherit more than just genes
from Science Daily

Low paternal dietary folate alters the mouse sperm epigenome and is associated with negative pregnancy outcomes
by R. Lambrot et al

Diet-Induced Obesity in Female Mice Leads to Offspring Hyperphagia, Adiposity, Hypertension, and Insulin Resistance
by Anne-Maj Samuelsson et al

Maternal obesity increases the risk of metabolic disease and impacts renal health in offspring
by Sarah J. Glastras

Transgenerational Epigenetic Mechanisms in Adipose Tissue Development
by Simon Lecoutre et al

Your Grandma’s Diet Could Have Made You Obese, Mouse Study Suggests
by Kashmira Gandery

Your Diet Affects Your Grandchildren’s DNA, Scientists Say
by Christopher Wanjek

You Are What Your Grandparents Ate
by Maria Rodale

People who eat too much fast food could cause heart disease in their great grandchildren
by Jasper Hamill

Eating Badly When Pregnant Might Make Your Kid Fat
by Zak Stone

Perinatal Western Diet Consumption Leads to Profound Plasticity and GABAergic Phenotype Changes within Hypothalamus and Reward Pathway from Birth to Sexual Maturity in Rat
by Julie Paradis et al

A Maternal “Junk Food” Diet in Pregnancy and Lactation Promotes Nonalcoholic Fatty Liver Disease in Rat Offspring
by S. A. M. Bayol et al

Exposure to a Highly Caloric Palatable Diet during the Perinatal Period Affects the Expression of the Endogenous Cannabinoid System in the Brain, Liver and Adipose Tissue of Adult Rat Offspring
by María Teresa Ramírez-López et al

A maternal junk food diet alters development of opioid pathway in the offspring
from Science Daily

‘Junk food’ moms have ‘junk food’ babies
from Science Daily

Born to Be Junk Food Junkies
by Linda Wasmer Andrews

Reality check: Do babies inherit junk food addictions from their moms?
by Carmen Chai

Bad Eating Habits Start in the Womb
by Kristin Wartman

Could Over-Snacking While Pregnant Predispose Children to Be Obese?
by Natasha Geiling

Overeating in pregnancy could lead to child obesity
by John von Radowitz

Eating for two puts unborn child at risk of junk addiction
by James Randerson

Craving for junk food ‘inherited’
from BBC

Craving for junk food ‘begins in the womb’
by Fran Yeoman

Hooked on junk food in the womb
by Fiona MacRae

How pregnant mums who ‘eat for 2’ can make their babies fat
by Victoria Fletcher

A Fun Experiment

I’ve written a lot about diet lately, but let me get personal about it. I’ve had lifelong issues with diet, not that I thought about it that way when younger. I ate a crappy diet and it was the only diet I knew, as everyone else around me was eating the same basic crappy diet. Even my childhood sugar addiction didn’t stand out as all that unique. Though I didn’t know it at the time, looking back now, I’m sure an unhealthy diet with nutrient deficiencies and food additives (maybe along with environmental toxins or other external factors) was a likely contributing factor to my learning disability and word finding difficulties (WFD) — see previous posts: Aspergers and Chunking; and Specific Language Impairment. As early as elementary school, there were also signs of what would later be diagnosed as depression. I knew something was wrong with me, but felt at a loss in that there was no way to explain it. I was just broken, inferior and inadequate. I didn’t even understand that I was depressed during my youth, although my high school art teacher once asked me if I was depressed and, in my ignorance, I said I wasn’t. Being depressed was all I knew and so it just felt normal.

I didn’t have the insight to connect my neurocognitive and psychological struggles to physical health. The crappiness of my diet only became apparent to me in adulthood, although I’m not sure when I started thinking about it. I grew up in churches where people were more health-conscious and my mother tried to do what she thought was healthy, even as good info was lacking back then. Still, a basic mentality of healthfulness was instilled in me, not that it initially did me much good. It took a while for it to lead to anything more concrete than doing what was the height of “healthy eating” in those days, which was skim milk poured over bran cereal and an occasional salad with low-fat dressing. That simply would’ve made my depression and learning disabilities worse, as it surely was fucking up my neurocognition precisely as my brain was developing, but mainstream advice asserted that this USDA-approved way of eating would cure all that ails you. Fat was the enemy and fiber was a health tonic. Few at the time realized that fat-soluble vitamins are key to health or that a high-fiber diet can block nutrient absorption.

Everything fell apart after high school. I despised life and wanted to escape the world. I dropped out of college and seriously considered becoming a hermit, but the prospect was too lonely and after moving out to Arizona I felt homesick. Then in going back to college, I attempted suicide. I failed at that as well and earned myself a vacation in a psychiatric ward. I was bad off, but having been raised in New Thought Christianity I was always looking for answers in self-help books and similar things. It would’ve been maybe in my early to mid 20s when I first read books that were explicitly about diet, nutrition, and health. I do recall, for instance, a book I picked up on low-carb diets and it wasn’t about the Atkins diet — it might have been an old copy of Vilhjalmur Stefansson’s Not By Bread Alone or it could have been something else entirely. Around that time, there was a minor incident that comes to mind. I told my friend that fast food was unhealthy and he didn’t believe me. It sounds odd now, but this was back in the 1990s. His mother was a nurse and regularly bought him fast food as a child. So how could it be bad? Many people at the time didn’t have much sense of what made food healthy or not, but obviously something had got me thinking about it. I knew that some foods were not healthy, even as what a healthy diet should look like was a bit hazy in my mind, beyond the nostrum of eating more fruits and veggies.

I lacked knowledge and there weren’t many sources of knowledge prior to my getting internet access. Still, based on what limited info I could glean, I did start experimenting during that period. I began trying supplements to deal with my depression and the related low energy and low motivation, as therapy and medications had failed to put a dent in it. Around 1998, four years after graduating high school and a couple years after the suicide attempt, I tried vegetarianism for a time, maybe a year or so, but it mainly involved eating, as a regular meal, a mix of Ramen noodles, eggs, and frozen vegetables cooked in the microwave — it was a poverty diet, as I was living in poverty. I probably was eating plenty of junk food as well, considering most cheap processed foods are vegetarian. Avoiding meat certainly doesn’t guarantee health — it didn’t fill me with joy and vitality. A bit later on I did finally try a low-carb diet, but it mainly consisted of eating processed meat because I was too depressed to cook. Even then, I might not have been getting many fat-soluble vitamins, as I didn’t understand nutrient-density. I wasn’t procuring pasture-raised meat, much less emphasizing organ meats, bone broth, wild-caught fish, etc.

My experiments weren’t well-informed and so weren’t done under optimal conditions. There was no one around to offer me guidance and so it didn’t work out all that well. I don’t give up easily, though. I went looking for guidance from dozens of psychiatrists, therapists, energy healers, body workers, and even a shaman. In my desperation, I’d try anything. Then I went to massage school where I learned Shiatsu massage and traditional Chinese theory, along with some other modalities. Even that didn’t change anything. My massage teachers were alternative health practitioners, one being a naturopath, but it seemed like no one understood what was wrong with me and so nothing could make a difference. My depression was an incomprehensible mystery. Rather than something being wrong with me, it seemed, in my lingering dark mood, that I was the problem, inherently defective.

The only thing that made much of a difference was exercise. I found that I could keep the worst symptoms of depression at bay through jogging, if only temporarily. At some point, I learned to jog before eating anything in the morning and found that my hunger and cravings were lessened for the rest of the day. I had accidentally discovered ketosis and didn’t know what it was. It didn’t make sense that physical exertion minus food would lead to such results — rather counterintuitive. I was also occasionally fasting around then, which also would’ve contributed to ketosis. That isn’t to say ketosis while in nutrient deficiency is a good thing. I’d have been better off avoiding ketosis entirely and, instead, filling up on nutrient-dense fatty animal foods. I needed healing and only a high dosage of nutrition was going to accomplish that. I had been too malnourished for far too long at that point. Ketosis would’ve been great after a period of deep nourishment, but I understood neither the significance of key nutrients nor how to implement ketosis in a more beneficial way.

At some point, I read Sally Fallon Morrell’s Nourishing Traditions (1995), which introduced me to nutrient-density and fat-soluble vitamins along with traditional food preparation, but I was too depressed and too isolated to fully and successfully implement what I was learning. Depression is a real kick in the ass. Still, I was slowly accruing basic knowledge and making small changes when and where I felt able. I was limiting some of the worst problematic foods. In particular, I began cutting back on junk food, especially candy. And I replaced sugar with such things as stevia. Simultaneously, I increased healthier foods like probiotics and Ezekiel bread, although I’m not sure that the latter really is all that healthy (it has vital wheat gluten added to it and is mostly starchy carbs). I tried to limit my sugar intake to foods that were relatively better, such as yogurt and kefir. I was still experimenting a bit with supplements, but wasn’t getting any clear results. My depression persisted and I see now that, even with these changes, I continued to lack nutrient-density. It just wasn’t clicking together for me. Maybe my depression had moderated ever so slightly, to the degree that I was a functional depressive and not in the total gloom and doom of my late teens to early twenties. I figured that was as good as it was going to get. I had survived that far and figured I’d be depressed for the rest of my life. Let me put this in perspective. This slightly lessened depression was, nonetheless, chronic and severe. For example, suicidal ideation persisted — maybe more as background noise to my thoughts, but there, always there. I had this suspicion that eventually depression would catch up with me and then that would be the end of me. Suicide remained a real possibility in my mind, a persistent thought. It was hard for me to imagine myself surviving into old age.

I carried on like this. I managed my life at a bare minimal level. I held down a job, I sort of kept my apartment clean, I fed my cats and sometimes changed their litter, and I more or less paid my bills on time. But depression had kept me working minimal hours and barely staying above poverty. There wasn’t only the depression, for, over the decades, a crippling sense of shame had accumulated. I felt worthless, a failure. I wasn’t taking care of myself, or at least wasn’t doing it well. Everything felt like a struggle while nothing I did seemed to make a difference. It was shitty, I knew life was just going to get worse as I aged, and thinking about that made me feel more hopeless. To add to that general anxiety and despair, as I drifted through my thirties, I began gaining weight. I had always thought of myself as athletic. I played soccer from 1st grade to 11th grade and was always slim and trim, although I remember at one point after high school having been so inactive for a number of years that I felt winded just walking up a hill — a strange experience for me because I had never been out of shape before that time. That was why I came to focus so much on exercise. Yet with age, mere exercise wouldn’t stop the weight gain, much less help with weight loss… nor any of the other symptoms of declining health. I was jogging multiple times a week for long periods, sometimes while wearing a heavy backpack as I hoofed it out to my parents’ place on the far edge of town. Still, the excess fat remained. That was rather dispiriting. Yet from a conventional viewpoint, my diet was balanced and my lifestyle was generally healthy, at least by American standards. I was doing everything right, as I understood it. Just the expected results of aging, most doctors would likely have told me.

I realize now that insulin resistance probably had set in quite a while back. I was probably prediabetic at that point, maybe even in the early stages of diabetes (I sweated a lot, in the way my grandmother did before her diabetes was managed with insulin shots). I know that I no longer handled sugar well, which helped keep my sugar addiction in check. About a decade ago, my friend and I visited a nearby donut shop and I got several fine specimens. Upon eating them, I felt sick with a slight headache. No more donuts for me. Sugar or not, my diet was still fairly high-carb, but I wasn’t yet fully aware of how starches and sugars sneak into everything. Then last year I randomly came across the paleo documentary The Magic Pill and watched it without any expectation. I suppose it was just basic curiosity, as is my habit. Something about it resonated with me. I showed it to my parents and they too found it inspiring. So, we all set about changing our diets — having mutual support from family was surely an important factor for motivation. The diet portrayed is standard paleo with a combination of low-carb and nutrient-density. What made the documentary compelling was how a wide variety of people were followed as they tried the paleo diet: a woman living alone with various health problems, a family with a young daughter with severe autism, and an Australian Aboriginal community that had lost their traditional way of life. It demonstrated the significant changes that could occur through diet. The transformation of the autistic girl was particularly impressive. The entire documentary was motivational. After that, I looked for some other documentaries to watch with my parents: The Perfect Human Diet, Carb Loaded, etc. Learning more reinforced this new view and brought together all that I had learned over the decades. I finally had a broader framework of understanding.

It was this low-carb paleo diet that was the starting point for me, although my mother never was quite on board with it. After looking online, she was drawn to the FODMAP diet, hoping it could help with her gut issues, specifically GERD and belching, but also osteoporosis (and indeed it did seem to work for her, as her former high-fiber diet apparently was the source of her problems), although her diet had some overlap with paleo. Going into my typical obsessive-compulsive mode, I gathered dozens of books on the subject, voraciously took in all the info I could find online, and began following various people on social media. I quickly figured out the basics and what was most essential while determining the points of disagreement and uncertainty. What I liked about the paleo and low-carb community was the attitude of curiosity, of exploration and experimentation. Try something and see what happens. And if it doesn’t work, try something else. There was no failure — a much more positive attitude about health. Within three months of implementing the paleo diet, I had lost 60 pounds of fat and I did it without starving myself. I simply figured out how to tap into the body’s natural mechanisms for fat-burning and hunger signaling. As I switched from general low-carb to ketogenic, my experience improved even further. It finally dawned on me that my depression had gone away, simply and utterly disappeared, decades of depression along with a lifetime of sugar addiction no longer an issue. I didn’t need to struggle against it. I wasn’t even trying to cure my depression, not realizing that was even a possibility. It was a complete surprise.

It’s been a little over a year now. I’m still coming to terms with this new state of being. It’s strange. Depression had become part of my identity, as had sugar addiction and the roller coaster hangriness of insulin resistance. I now simply wake up in the morning feeling perfectly fine. It’s not that I go around feeling ecstatic, but the extreme low moods and funks no longer happen. I feel more balanced and relaxed. I used to fall into long periods of apathy and despair where all I could do was isolate myself until it passed, sometimes requiring days or weeks before I could rejoin humanity. How I functioned at all in such a state is kind of amazing, but not nearly as amazing as the miracle of its drama-free disappearance. Depression was there and then it wasn’t. I didn’t really notice it going away until after it was gone. This leaves me in a strange position, as the decades of depressive thought and behavioral patterns remain. It’s hard for me to know how not to be a depressed person. I can’t quite wrap my mind around it. I don’t remember the last time I had any suicidal tendencies or fantasies. Yet the decades of damage to my body also remain as a reminder.

That hasn’t stopped me from getting back in shape and beyond. In fact, I’m in better shape now as I move toward middle age than ever before in my life. It’s not simply that I’ve been working out but that I enjoy working out. It feels good to me and I enjoy doing physical activity, pushing myself to the point of exhaustion. Unsurprisingly, I’m looking better. People notice and tell me. This sometimes makes me uncomfortable, as I’m not used to getting compliments. Just today I went to a picnic with a large crowd, some people I knew and some I didn’t. I met a friendly young woman and she was obviously flirting with me as we talked. It was a nice day and, having been out in a kayak, I had my shirt off. She told me that I looked “gorgeous” — the exact word she chose.* I’ll be blunt about this. No one has ever said anything like that to me in my entire life. I had never been a buff guy before and now I actually have muscles. It changes how I carry myself, how I feel.

It makes me realize why some fat people, after losing a bunch of weight, will sometimes gain it back just to feel normal again. The person I am now is not the person I’ve known myself to be for as long as I can remember. And I don’t know what to do with people relating to me differently. I’m sure people treat me differently not only because I look different but probably because I’m putting off a different vibe. I’m less sullen and dissociated than I used to be. An easygoing friendliness comes more naturally to me now. I don’t feel so crappy now that I’m no longer on a crappy diet, but I’m not sure what it might mean to be healthy and happy. That is an odd concept to my mind. What if I could really be different? I take this seriously. In the past, I didn’t feel capable of being different, but all of that has changed. I don’t feel so irritable, frustrated, and angry. In place of that, I find myself wanting to be kinder and more forgiving. I want to be a good person. I realize how, in the past, I could be an asshole, and I was often open in admitting this basic fact of my former state, sometimes apologizing for my antagonistic moods. My life didn’t always feel like a net gain for the world and I’m sure some people might have agreed with that assessment. I could be harshly critical at times and that doesn’t make others feel better — I seriously harmed a number of relationships.

Now here I am. It’s a bit late in my life, but I have a second chance to try to do things differently. It will take some further experimentation beyond diet to find better ways of relating to others and to myself. That said, I’ll go on tinkering with my diet and lifestyle. It’s an ongoing experiment, all of it. Most importantly, it’s a fun experiment. The idea that I can try various things and discover what works is new to me. I’m more used to failure, but now I’m starting to see ‘failure’ as simply part of the experiment. There is no failure. Life doesn’t have to be hard. And I’m realizing that I’m not alone in this, as I’ve come across hundreds of stories just like mine. Sometimes simple changes can have profound effects.


* I must admit that it was a disconcerting experience. A young, beautiful woman telling me in no uncertain terms that I’m attractive. That is not the kind of thing I’ve grown accustomed to. I handled the situation as well as I could. It was kind of an amusing scenario. She was with her family. Along with her parents, she was visiting from Tunisia in order to see her sister, who now works at the local university.

So, this young woman wasn’t going to be around long. Developing a romantic relationship didn’t seem to be in the cards, even if I had wanted it, but I feel ambivalent about romantic relationships these days. I’ve become comfortable in my bachelorhood with its lack of complications. Even so, I played along with the flirtation. As I sat near her and her family at the picnic table, she kept wanting to feed me. And how could I decline food offered by a beautiful woman, even when she offered me carbs? That is my new plan for carb cycling — I’ll eat carbs every time a beautiful woman feeds them directly to me.

Anyway, combined with introversion and shyness, the lifetime of depression has made me reticent. I’m not confident around the opposite sex, but I’ve had long years of training in hiding any anxieties. Still, I didn’t know what purpose there was in flirting with this nice-looking person who would soon be gone. She said she might be back to visit again in a few years, and that seems like a long time when you’ve just met someone. I convinced myself there was no point and didn’t give her my contact info or ask for hers. But now I feel some regret.

I was acting according to my old self, the one who was ruled by his depression. Maybe it was irrelevant that I might not see her again. I should have left the door open for the possibility. These are the kinds of habits I need to learn.

Fiber or Not: Short-Chain Fatty Acids and the Microbiome

A common viewpoint among both conventional and alternative health practitioners is that fiber is good for you. Not only good but necessary. Millie Barnes, as an example, identifies her expertise as that of a chef and nutrition coach. She apparently comes from a functional medicine approach, common among those advocating a traditional foods diet that is plant-based and fiber-heavy (another example is Dr. Terry Wahls).

Barnes wrote a post about fiber and short-chain fatty acids (SCFAs), Why Short-Chain Fatty Acids Are Key To Gut & Overall Health, Plus How To Get More — her position is typical: “SCFAs are produced when bacteria—the good kind—ferment fiber in the gut, thereby providing your body with energy, keeping your metabolism humming, and even thwarting a wide range of digestive disorders.” There is nothing necessarily wrong about this position, although the scientific evidence is severely limited and highly contested. The problem is in treating the science as settled.

I’m not against fiber. I eat some high-fiber vegetables, especially fermented, along with other cultured foods. I used to eat even more fiber and vegetables, back when I was doing a paleo diet. And there were benefits to it, at least in comparison to my prior high-carb diet of processed foods. But I’ve also tried the carnivore diet and felt freaking awesome! I never realized how hard most plant foods are to digest (Like water fasts, meat fasts are good for health.).

I’m much more cautious now about which plants, and hence which plant substances (toxins and anti-nutrients included), I allow into my body. Still, I have nothing against plants on general principle and I’m persuaded by Siim Land’s argument for hormesis and antifragility, that is to say beneficial stress (in case you’re interested, there is an intriguing scientific paper to check out: Hagen, Roulette & Sullivan, Explaining Human Recreational Use of ‘pesticides’). I now think of plants as more medicine than food, but nonetheless quite useful as medicine.

SCFAs are a complex topic, as is the microbiome, of which we know little. As a side note, while some SCFAs (acetate and butyrate) are ketogenic, propionate is glucogenic. They play an important role in health. That much we can agree on. What is less understood, or at least less acknowledged, is that SCFAs can come from sources other than fiber. Butyrate, for example, is found in dairy fat. The cow eats the fiber and makes the butyrate for us.

So butyrate deficiency shouldn’t be a problem for anyone on a reasonably healthy diet, plant-based or animal-based. That is assuming they are getting plenty of high-fat dairy, pasture-raised all the better, and most Westerners tend to consume tons of dairy. As for myself, I get plenty of ghee (clarified butter), which means I’m probably fine on butyrate levels. By the way, my preferred mode of ghee delivery is through coffee and tea, the practice made famous as Dave Asprey’s Bulletproof Coffee, though he got the idea from a Tibetan woman who served him tea with yak butter. Maybe this is not such a foreign practice. My mother recalls her Kentuckiana grandmother regularly pouring coffee over butter, although she also mixed in saltine crackers — that latter part probably less traditional and certainly not low-carb.

To get back to our discussion of SCFAs, I’m not as familiar with acetate, but apparently you can get it from apple cider vinegar (ACV), something I also take on a daily basis. I assume that the microbes in the ACV produced the acetate, which bypasses the need for the microbes in your own gut to do the work. No fiber is required, at least not in the diet. Furthermore, one can get acetate from ketosis as well, and ketosis is my preferred state. Acetate/acetoacetate sometimes is what is measured for ketone levels. Some amino acids, such as leucine and lysine, are ketogenic and can be converted into acetoacetate. Acetoacetate can then be reduced to beta-hydroxybutyrate or spontaneously break down into acetone, with acetate produced along the way as well.

Now on to propionate, which is even more fascinating. It is a food additive that the modern person is being overdosed on, and it appears to be a causal factor behind conditions such as autism (The Agricultural Mind). Those on the autistic spectrum tend to have high levels of the bacteria that produce propionate and tend to crave foods that are high in it. Rodents injected with propionate express autistic-like behaviors. And those on the autistic spectrum show decreased behavioral problems when propionate is removed from their diet or when an antibiotic kills off some of their microbiome. SCFAs are a key part of a healthy diet, but they are powerful substances not to be taken lightly. They can potentially do harm as well.
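
For those who like things spelled out, here is a minimal sketch that encodes the SCFA sources discussed above as a simple lookup. It only restates this post’s claims — it is not a nutrition database — and the example inputs are hypothetical.

# A rough summary of the SCFA sources discussed above, encoded as a lookup.
# The mapping restates this post's claims, and the example inputs are
# illustrations only, not dietary advice.

SCFA_SOURCES = {
    "butyrate":   {"fermented fiber", "dairy fat"},
    "acetate":    {"fermented fiber", "apple cider vinegar", "ketosis"},
    "propionate": {"fermented fiber", "food additives"},
}

def scfa_coverage(inputs: set[str]) -> dict[str, bool]:
    """Which SCFAs have at least one source present in the given diet/lifestyle inputs?"""
    return {scfa: bool(sources & inputs) for scfa, sources in SCFA_SOURCES.items()}

if __name__ == "__main__":
    # A hypothetical low-fiber day: ghee in coffee, a splash of ACV, and ketosis.
    print(scfa_coverage({"dairy fat", "apple cider vinegar", "ketosis"}))
    # -> {'butyrate': True, 'acetate': True, 'propionate': False}

The point of the toy output is simply that, on a low-fiber day built around dairy fat, ACV, and ketosis, butyrate and acetate are still covered while extra propionate is not — which matches the argument above.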

As a last comment, no studies have been done on the microbiome of those on a carnivore diet or a near-carnivore diet such as that of the Inuit. Heck, there has been no research even on a more general healthy omnivore diet that includes meat — the studies on the Standard American Diet (SAD) don’t count. But from what we do know about biology in general, it appears humans have multiple pathways for producing or obtaining SCFAs. The microbiome, in particular, is probably extremely adaptable to the wide variety of diets that were necessary during evolution (e.g., the microbiome of some hunter-gatherers completely alters from season to season). Dr. Paul Saladino has talked a lot about this kind of thing — take what he had to say in an interview with Geoffrey Woo (Nose-to-Tail Carnivore Diet: Organ Meat, TMAO Implications, & Reaching Ketosis ft. Dr. Paul Saladino; & video):

“There are many bacteria which can metabolize fat, protein and animal-based collagen. That’s the thing I think that most people are missing. That our gut microbiome can shift. There’s a study where they put people on what I would consider to be a very poor version of a carnivorous diet and they compare it a plant-based diet. What they see is a divergence in the gut flora within a week. The animal-based eater, again, it’s not an ideal diet. The animal-based eaters had more bile acid tolerant organisms and more organisms to ferment fat and protein. They made isobutyrate and they made acetate and they made propionate as short chain fatty acids.

“The plant-based eaters made butyrate as a short chain fatty acid and had different colonic and small intestinal microflora. The investigators in that study jumped to the conclusion. Look, we know what’s going on with the gut because they have this organism. What’s worthy of biophilia or they don’t have this organism. They clearly have an unhealthy gut microbiome and I think that is an extrapolation. We do not know that. Clinically, nobody is assaying anything clinically in that study. They didn’t do inflammatory markers. They didn’t follow those people moving forward. It was almost like a setup. They were just trying to prove that these bile acid tolerant organisms would show up when they gave people a bunch of foods, which promote the formation of bile.”

Dr. Paul Saladino was on the paleo diet before trying carnivore, but Dr. Will Cole went from vegetarian to a more paleo-style diet. Dr. Cole wrote a book, Ketotarian, about how to do a plant-based keto and so he is right in line with the likes of Millie Barnes. That didn’t stop him, in an interview with Vanessa Spina, from pointing to evidence that a high-fiber diet may not be necessary, even going so far as to mention the carnivore diet:

“Because we have an epidemic of gut problems in the United States and around the world and Europe as well that this is going take time. Sometimes some people can have it right out of the gate. Some people can’t. It’s important to know what’s right for your body and what’s not right for your body. But as you heal, what you used to not be able to have the goal is to be able to reintroduce these things as your body heals.

“So the carnivore diet, for example, it’s the ultimate elimination diet because it’s removing a lot of these fibers. But the goal isn’t to be carnivorous forever and ever, even though maybe some people would prefer that. But the goal is to use something like that to drive down this inflammatory cascade to bring things back in, as long as it’s nutrient-dense. And there are studies to show like the Hadza tribe in Tanzania they have good bacterial diversity during those months where they are eating less vegetables. But they’re eating more raw meat or getting like drinking blood and doing things that most people that are on the carnivore diet in the West are not doing today.

“So there are other there are other ways to get back to our diversity beyond fiber. I would just say it is the most common, most well researched way to get back to our diversity.”

Vanessa Spina, in that interview, then added an important point, that not all prebiotics are fiber or necessarily come from plants at all: “I found this list of prebiotic foods that were non-carbohydrate that included cellulose, cartilage, collagen, fructooligosaccharides, glucosamine, rabbit bone, hair, skin, glucose. There’s a bunch of things that are all — there’s also casein. But these tend to be some of the foods that actually have some of the highest prebiotic content. So it’s interesting, I think, if someone has less tolerance for fiber, they can also explore some of these other prebiotics.” That is something I never hear anyone talk about.

This might explain why so many people do so well on a carnivore diet. They are still getting prebiotics. And we know those on entirely or mostly meat diets retain functioning microbiomes. But there have been so few scientists looking into this.

“For the average American or European, Coca-Cola poses a far deadlier threat than al-Qaeda.”

Homo Deus: A Brief History of Tomorrow
by Yuval Noah Harari

  • “Poverty certainly causes many other health problems, and malnutrition shortens life expectancy even in the richest countries on earth. In France, for example, 6 million people (about 10 percent of the population) suffer from nutritional insecurity. They wake up in the morning not knowing whether they will have anything to eat for lunch: they often go to sleep hungry; and the nutrition they do obtain is unbalanced and unhealthy — lots of starches, sugar and salt, and not enough protein and vitamins. Yet nutritional insecurity isn’t famine, and France of the early twenty-first century isn’t France of 1694. Even in the worst slums around Beauvais or Paris, people don’t die because they have not eaten for weeks on end.”
  • “Indeed, in most countries today overeating has become a far worse problem than famine. In the eighteenth century Marie Antoinette allegedly advised the starving masses that if they ran out of bread, they should just eat cake instead. Today, the poor are following this advice to the letter. Whereas the rich residents of Beverly Hills eat lettuce salad and steamed tofu with quinoa, in the slums and ghettos the poor gorge on Twinkie cakes, Cheetos, hamburgers and pizza. In 2014 more than 2.1 billion people were overweight compared to 850 million who suffered from malnutrition. Half of humankind is expected to be overweight by 2030. In 2010 famine and malnutrition combined killed about 1 million people, whereas obesity killed 3 million.”
  • “During the second half of the twentieth century this Law of the Jungle has finally been broken, if not rescinded. In most areas wars became rarer than ever. Whereas in ancient agricultural societies human violence caused about 15 per cent of all deaths, during the twentieth century violence caused only 5 per cent of deaths, and in the early twenty-first century it is responsible for about 1 per cent of global mortality. In 2012, 620,000 people died in the world due to human violence (war killed 120,000 people, and crime killed another 500,000). In contrast, 800,000 committed suicide, and 1.5 million died of diabetes. Sugar is now more dangerous than gunpowder.”
  • “What about terrorism, then? Even if central governments and powerful states have learned restraint, terrorists might have no such qualms about using new and destructive weapons. That is certainly a worrying possibility. However, terrorism is a strategy of weakness adopted by those who lack access to real power. At least in the past, terrorism worked by spreading fear rather than by causing significant material damage. Terrorists usually don’t have the strength to defeat an army, occupy a country or destroy entire cities. In 2010 obesity and related illnesses killed about 3 million people, terrorists killed a total of 7697 people across the globe, most of them in developing countries. For the average American or European, Coca-Cola poses a far deadlier threat than al-Qaeda.”

Harari’s basic argument is compelling. The kinds of violence and death we experience now are far different. The whole reason I wrote this post is because of a few key points that stood out to me: “Sugar is now more dangerous than gunpowder.” And: “For the average American or European, Coca-Cola poses a far deadlier threat than al-Qaeda.” As those quotes make clear, our first-world problems are of a different magnitude. But I would push back against his argument, at least as it applies to much of the rest of the world, since he makes the same mistake as Steven Pinker in ignoring slow violence (so pervasive and systemic as to go unnoticed and uncounted, unacknowledged and unreported, often intentionally hidden). Parts of the United States are also in third-world conditions. So it isn’t simply a problem of nutritional excess from a wealthy economy. That wealth isn’t spread evenly, much less the nutrient-dense healthy foods or the healthcare. Likewise, the violence and oppression fall harder upon some than others. Those like Harari and Pinker can go through their entire lives seeing very little of it.

Since World War Two, there have been thousands of acts of mass violence: wars and proxy wars, invasions and occupations, bombings and drone strikes; covert operations that toppled governments and promoted paramilitaries and terrorists; civil wars, revolutions, famines, droughts, refugee crises, and genocides; et cetera. Most of these events of mass violence were directly or indirectly caused by the global superpowers. Beyond outright military aggression, they destabilized regions, exploited third-world countries, stole wealth and resources, enforced sanctions on food and medicine, manipulated economies, entrapped nations in debt, artificially created poverty, and became the main contributors to environmental destruction and climate change. One way or another, these institutionalized and globalized forms of injustice and oppression might be the combined largest cause of death, possibly a larger number than in any society seen before. Yet they are rationalized away as ‘natural’ deaths, just people dying.

Over the past three-quarters of a century, probably billions of people in the world have been killed, maimed, imprisoned, tortured, starved, orphaned, or had their lives cut short. Some of this was blatant violent action, and the rest was slow violence. But it was all intentional, part of the wealthy and powerful seeking to maintain their wealth and power and to gain even more. There is little justification for all this violence. Even the War on Terror involved cynical plans for attacking countries like Iraq that preceded the terrorist attacks themselves. The Bush cronies, long before the 2000 presidential election, had it written down on paper that they were looking for an excuse to take Saddam Hussein out of power. The wars in Afghanistan and Iraq killed millions of people, around 5% or so of the population (the equivalent would be if a foreign power killed a bit less than 20 million Americans). The depleted uranium munitions spread across the landscape will add millions more deaths over the coming decades — slow, torturous, and horrific deaths, many of them children. Multiply that by the hundreds of other similar US actions, and then multiply that by the number of other countries that have committed similar crimes against humanity.
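To make the population-proportional comparison above concrete, here is a minimal arithmetic sketch. The population figure and the death share are rough round numbers assumed for illustration only, not figures taken from any of the sources quoted in this post.

```python
# Back-of-the-envelope check of the population-proportional comparison.
# Both numbers below are rough assumptions for illustration only.

us_population = 330_000_000   # approximate present-day US population (assumed)
death_share = 0.05            # "around 5% or so of the population"

equivalent_us_deaths = us_population * death_share
print(f"{death_share:.0%} of the US population is about "
      f"{equivalent_us_deaths / 1_000_000:.1f} million people")
# Prints roughly 16.5 million, i.e. "a bit less than 20 million Americans".
```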

Have we really become less violent? Or has violence simply taken new forms? Maybe we should wait until after the coming World War Three before declaring a new era of peace, love, and understanding. Numerous other historical periods have had a few generations without war; that is not all that impressive. The last two world wars are still in living memory and hence living trauma. Let’s give it some time before we start singing the praises and glory of our wonderful advancement as a civilization guided by our techno-utopian fantasies of Whiggish liberalism. But let’s also not so easily dismiss the tremendous suffering and costs from the diseases of civilization that worsen with each generation: not only obesity, diabetes, and heart disease but also autoimmune conditions, Alzheimer’s, schizophrenia, mood disorders, ADHD, autism, and on and on. Besides diet and nutrition, much of this is caused by chemical exposure from factory pollution, oil spills, ocean dumping, industrial farming, food additives, packaging, and environmental toxins. And we must not forget the role that governments have played in pushing harmful low-fat, high-carb dietary recommendations that, spread worldwide by the wealth, power, and influence of the United States, have surely harmed at least hundreds of millions of people over the past several generations.

The fact that sugar is more dangerous than gunpowder, and Coca-Cola more dangerous than al-Qaeda… this is not a reason to stop worrying about mass violence and direct violence. Even if it has fallen as a percentage, the total number of violent deaths is still going up, just as there are more slaves now than at the height of slavery prior to the American Civil War. Talking about percentages of certain deaths while excluding other deaths is sleight-of-hand rhetoric. And it misses an even bigger point. The corporate plutocracy that now rules our neo-fascist society of inverted totalitarianism poses the greatest threat of our age. That is not an exaggeration. It is simply what the data show to be true, as Harari unintentionally reveals. Privatized profit comes at a public price, a price we can’t afford. Even ignoring the greater externalized costs of environmental harm from corporations (and the general degradation of society from worsening inequality), the increasing costs of healthcare from diseases caused by highly profitable, highly processed foods that are scientifically designed to be palatable and addictive (along with the systematic dismantling of traditional food systems) could bankrupt many countries in the near future and cripple their populations in the process. World War Three might turn out to be the least of our worries. Just because most of the costs have been externalized onto the poor and delayed to future generations doesn’t mean they aren’t real. It will take a while to get the full death count.