Health, Happiness, and Exercise

I’m unsurprised that 10,000 steps was a random number selected for marketing reasons. Like so much else, it was never backed by any scientific evidence. I agree that it doesn’t take that much physical activity to promote health. The basic thing is simply not to sit on your butt all day. Anything that gets you up and moving throughout the day will probably be a vast improvement over a sedentary lifestyle. By the way, I think it goes without saying (or should) that mental health is closely linked to physical health, and that link extends far beyond exercise. It seems common sense that physical health is the causal factor. But even assuming this, what would be the exact line of causation?

Then again, this entire line of explanation rests on an assumption. All we know is that healthier people move more than unhealthy people. We haven’t yet proven that merely getting up and going for a walk or whatever is the direct cause in this equation. It’s possible that it’s simply part of the healthy user effect or maybe the happy user effect (I just made up that last one). People seeking better health, or those already feeling good from better health, are going to exercise more, whether or not movement itself deserves the main credit.

From personal experience, improving health (losing weight, increasing energy, and eliminating severe depression) by way of a low-carb/keto diet was a major contributing factor to feeling more motivated to push my exercise to the next level. I can exercise while in poor physical and mental health, but it’s easier to first eliminate the basic level of problems. I always feel bad when I see overweight people jogging, presumably in the hope of losing weight (exercise didn’t help me lose weight and seems of limited benefit to most people in this regard). I’d suggest starting with dietary and other lifestyle changes. Exercise is great in a healthy state, although in an unhealthy state one might end up doing more harm than good, from spraining an ankle to having a heart attack.

It’s highly context-dependent. For simplicity’s sake, diet will probably have a greater impact on mood than exercise, despite how awesome exercise can be. After feeling better, exercise will be less of a struggle and so require less force of willpower to overcome the apathy and discomfort. I’m all about going the route of what is easiest. Life is hard enough as is. There is no point in trying to punish ourselves into good health, as if we are fallen sinners requiring bodily mortification. If one is just starting out an exercise program, I’d say go easy with it. Less is better. Push yourself over time, but there is no reason to rush it. Exercise should be enjoyable. If it is causing you pain and stress, you’re doing it wrong. A stroll through the woods will do your health far more good than sprinting on a treadmill until you collapse.

Don’t worry about counting steps, in my humble opinion, as you shouldn’t worry about counting calories, carbs, ketones, or Weight Watcher points (yes, I realize Westerners are obsessed with numbers and love the feeling of counting anything and everything; who am I to deny anyone this pleasure?). It easily becomes an unhealthy moralistic mindset of constant self-control and self-denial that can undermine a natural good feeling of health and well-being. That is unless you’re dealing with a specific health protocol for a serious medical condition (e.g., keto diet for epileptic seizures) or maybe, in extreme cases, you need the structure to achieve a particular goal. I’m just saying be careful to not go overboard with the endless counting of one thing or another. If counting is helpful, great! Just maybe think of it as a transitional stage, not a permanent state of struggle.

Sometimes rules initially help people when their health has gotten so bad that they’ve lost an intuitive sense of what it feels like to do what is healthy. I get that. But regaining that intuitive, even visceral, sense of feeling good in one’s body should be the ultimate goal — just being healthy and happy as one’s natural birthright (I know, a crazy radical idea; I spent too much time in the positive-and-abundance-thinking of practical Christianity). Experiment for yourself (N=1) and find out what works for you. If nothing else, start off with a short walk every once in a while or, heck, just stand up from your desk and get the blood flowing. Keep it simple. Maybe it isn’t as hard as it first seems. Don’t overthink it. Relearn that childlike sense of enjoying the world around you, immersed in the experience of your own body. Don’t just exercise. Go play. Run around a field with a child. Have a chat while walking. Simply appreciate the state of being alive.

* * *

by Amanda Mull, The Atlantic

“It turns out the original basis for this 10,000-step guideline was really a marketing strategy,” she explains. “In 1965, a Japanese company was selling pedometers and they gave it a name that, in Japanese, means the 10,000-step meter.”

Based on conversations she’s had with Japanese researchers, Lee believes that name was chosen for the product because the character for “10,000” looks sort of like a man walking. As far as she knows, the actual health merits of that number have never been validated by research. […]

“The basic finding was that at 4,400 steps per day, these women had significantly lower mortality rates compared to the least active women,” Lee explains. If they did more, their mortality rates continued to drop, until they reached about 7,500 steps, at which point the rates leveled out. Ultimately, increasing daily physical activity by as little as 2,000 steps — less than a mile of walking — was associated with positive health outcomes for the elderly women. […]

Because her study was observational, it’s impossible to assert causality: The women could have been healthier because they stepped more, or they could have stepped more because they were already healthier. Either way, Lee says, it’s clear that regular, moderate physical activity is a key element of a healthy life, no matter what that looks like on an individual level.

“I’m not saying don’t get 10,000 steps. If you can get 10,000 steps, more power to you,” explains Lee. “But, if you’re someone who’s sedentary, even a very modest increase brings you significant health benefits.”

But since happiness can be incredibly difficult to define, I’d call these odds very interesting but not necessarily conclusive. Chen and colleagues acknowledge that more research is needed to prove whether exercise causes happiness, or if other factors are involved. As just one example, it could be that exercise makes us healthier (which is well established by science) and being healthier is what makes us happy. […]

Not as much research has been done on whether happiness is a key to motivating people to exercise. But one 2017 study published in the Annals of Behavioral Medicine certainly suggests as much.

Over 11 years, nearly 10,000 people over age 50 were asked about their frequency and intensity of physical activity, at work and otherwise. Those with higher psychological well-being (a proxy for happiness and optimism) at the start of the study had higher levels of physical activity over the next decade. Also, those who started out happy and active were more likely to stay active.

“Results from this study suggest that higher levels of psychological well-being may precede increased physical activity,” said Julia Boehm, a researcher at Chapman University and lead author of the study.

In very preliminary results of my Happiness Survey for The Happiness Quest, regular exercise is emerging as a theme among those who self-report as being the happiest. However, the survey is self-selecting, the numbers are as yet small, and the happiest respondents also associate strongly with other traits and habits. So, at best, the responses are just another possible indicator of an association between exercise and happiness, not a cause-and-effect relationship, with no indication of which direction any effect may flow. […]

I can only conclude, despite the years-on, years-off nature of my exercise routine, that exercise puts me in a good mood. And when I’m in a good mood, I tend to exercise more. In many ways, it matters little which is the cause and which is the effect. And I’ll bet it’s simply a virtuous circle (and, in those off years, a vicious spiral).

Lead Toxicity is a Hyperobject

What is everywhere cannot be seen. What harms everyone cannot be acknowledged. So, we obsess over what is trivial and distract ourselves with false narratives. The point isn’t to understand, much less solve, problems. We’d rather let large numbers of people suffer and die, as long as we don’t have to face the overwhelming sense of anxiety about the world we’ve created.

We pretend to care about public health. We obsess over pharmaceuticals and extreme medical interventions while paying lip service to exercise and diet, not to mention going on about saving the planet while taking only symbolic actions. But some of the worst dangers to public health get little mention or media coverage. Lead toxicity is an example. It causes numerous diseases and health conditions: lowered IQ, ADHD, aggressive behavior, asthma, and on and on. Now we know it also causes heart disease. Apparently, it even contributes substantially to diabetes. A common explanation is that heavy metals interfere with important systems in the body, such as the immune and hormonal systems. In the comments section of Dr. Malcolm Kendrick’s post shared below, I noticed this interesting piece of info:

“I recently listened to a presentation, as a part of a class I’m taking, put on by the lead researcher for the TACT trial. He is a cardiologist himself. I would say that a 48% ABSOLUTE risk reduction in further events in diabetic patients, and a 30-something % risk reduction in patients without diabetes, is extremely significant. I went and read the study afterward to verify the numbers he presented. I would say, based on the fact that he admitted freely he thought he was going to prove exactly the opposite, and that his numbers and his statements show it does work, are pretty convincing. Naturally, no one that works for JAMA will ever tell you that. They would prefer to do acrobatics with statistics to prove otherwise.”

Lead toxicity is one of the leading causes of disease and death in the world. It damages the entire body, especially the brain. Survivors of lead toxicity are crippled for life. It was also behind the violent crime wave of past decades. The prison population has higher-than-average rates of lead toxicity, which means we are using prisons to store and hide the victims and scapegoat them all in one fell swoop. And since it is the poor who are primarily targeted by our systematic indifference (maybe not indifference, since there are profits and privileges incentivizing it), it is they who are disproportionately poisoned by lead and then, as victims, imprisoned or otherwise caught up in the legal system, institutionalized, or left among the vast multitudes of the forgotten, of the homeless, of those who die without anyone bothering to find out what killed them.

But if only the poor worked harder, got an education, followed the USDA-recommended diet, and got a good job to pay for all the pills pushed on them by the pharmaceutical-funded doctors, then… well, then what the fuck good would it do them? Tell me that. The irony is that, as we like to pity the poor for their supposed failures and bad luck, we are all being screwed over. It’s just that we feel slightly better, slightly less anxious, as long as others are doing worse than us. Who cares that we live in a society slowly killing us? The real victory is knowing that it is killing you slightly slower than your neighbor or those other people elsewhere. For some odd reason, most people find that comforting.

It’s sad. Despite some minor progress in cleaning up the worst of it, the lead accumulated over decades still lingers in the soil, oceans, infrastructure, and old buildings. Entire communities continue to raise new generations with lead exposure. On top of that, we’ve been adding even more pollutants and toxins to the environment, to our food supply, and to every variety of product we buy. I will say this. Even if diet doesn’t have as big a direct effect on some of these conditions as removing dangerous toxins does, diet has the advantage of being a factor one can personally control. If you eat an optimally healthy diet, especially if you can avoid foods that are poisoned (either unintentionally with environmental toxins or intentionally with farm chemicals), you’ll be doing yourself a world of good. Greater health won’t eliminate all of the dangers we are surrounded by, but it will help you to detoxify and heal from the damage. It may not be much in the big picture, but it’s better than nothing.

On the other hand, even if our diet obsession is overblown, maybe it’s more significant than we realize. Sammy Pepys, in Fat is our Friend, writes about Roseto, Pennsylvania. Scientists studying this uniquely healthy American community called the phenomenon the Roseto Effect. These people ate tons of processed meat and lard, smoked cigars and drank wine, and did back-breaking labor in quarries where they would have been exposed to toxins (“Rosetan men worked in such toxic environments as the nearby slate quarries … inhaling gases, dusts and other niceties.” p. 117). Yet their health was great. At the time, their diet was dismissed as an explanation because it didn’t conform to USDA standards. While most Americans had already switched to industrial seed oils, the Rosetans were still going strong on animal fats. Maybe their diet was dismissed too easily. As with earlier lard-and-butter-gorging Americans, maybe all the high-quality animal fats (probably from pasture-raised animals) were essential to avoiding disease. Maybe it also had something to do with their ability to handle the toxins. Considering Weston A. Price’s research, all of those additional fat-soluble vitamins surely would have helped.

Still, let’s clean up the toxins. And also, let’s quit polluting like there is no tomorrow.

* * *

What causes heart disease part 65 – Lead again
by Dr. Malcolm Kendrick

There are several things about the paper that I found fascinating. However, the first thing that I noticed was that…. it hadn’t been noticed. It slipped by in a virtual media blackout. It was published in 2018, and I heard nothing.

This is in direct contrast to almost anything published about diet. We are literally bombarded with stories about red meat causing cancer and sausages causing cancer and heart disease, and veganism being protective against heart disease and cancer, and on and on. Dietary articles often end up on the front page on national newspapers. […]

Where was I? Oh yes, lead. The heavy metal. The thing that, unlike diet, makes no headlines whatsoever, the thing that everyone ignores. Here is one top-line fact from that study on lead, that I missed:

‘Our findings suggest that, of 2·3 million deaths every year in the USA, about 400 000 are attributable to lead exposure, an estimate that is about ten times larger than the current one.’ 1

Yes, according to this study, one in six deaths is due to lead exposure. I shall repeat that. One in six. Eighteen per cent to be exact, which is nearer a fifth really. […]
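As a quick arithmetic check on that figure (my annotation, not part of Kendrick’s post), the study’s own numbers work out as follows:

```latex
% Share of US deaths attributable to lead exposure, per the quoted estimate
\[
\frac{400\,000}{2\,300\,000} \approx 0.174 \approx 17\text{--}18\% \approx \frac{1}{6}
\]
```

which is indeed roughly one death in six.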

So, on one side, we have papers (that make headlines around the world) shouting about the risk of red meat and cancer. Yet the association is observational, tiny, and would almost certainly disappear in a randomised controlled trial, and thus mean nothing.

On the other we have a substance that could be responsible for one sixth of all deaths, the vast majority of those CVD deaths. The odds ratio, highest vs lowest lead exposure, by the way, depending on age and other factors, was a maximum of 5.30 [unadjusted].

Another study in the US found the following:

‘Cumulative lead exposure, as reflected by bone lead, and cardiovascular events have been studied in the Veterans’ Normative Aging Study, a longitudinal study among community-based male veterans in the greater Boston area enrolled in 1963. Patients had a single measurement of tibial and patellar bone lead between 1991 and 1999. The HR for ischemic heart disease mortality comparing patellar lead >35 to <22 μg/g was 8.37 (95% CI: 1.29 to 54.4).’ 3

HR = Hazard Ratio, which is similar, if not identical, to OR = Odds Ratio. A Hazard Ratio of 8.37 means (essentially) a 737% increase in risk (Relative Risk).
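For readers unfamiliar with the conversion (my annotation, not Kendrick’s), a hazard ratio maps to a percent increase in relative risk by subtracting one and scaling:

```latex
% Percent increase in risk implied by a hazard ratio (HR)
\[
(\mathrm{HR} - 1) \times 100\% = (8.37 - 1) \times 100\% = 737\%
\]
```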

Anyway, I shall repeat that finding a bit more loudly. A higher level of lead in the body leads to a seven hundred and thirty-seven per cent increase in death from heart disease. This is, in my opinion, correlation proving causation.

Looking at this from another angle, it is true that smoking causes a much greater risk of lung cancer (and a lesser but significant increase in CVD), but not everyone smokes. Therefore, the overall damage to health from smoking is far less than the damage caused by lead toxicity.

Yet no-one seems remotely interested. Which is, in itself, very interesting.

It is true that most Governments have made efforts to reduce lead exposure. Levels of lead in the children dropped five-fold between the mid-sixties and the late nineties. 4 Indeed, once the oil industry stopped blowing six hundred thousand tons of lead into the atmosphere from vehicle exhausts things further improved. Lead has also been removed from water pipes, paint, and suchlike.

However, it takes a long old time for lead to be removed from the human body. It usually lingers for a lifetime. Equally, trying to get rid of lead is not easy, that’s for sure. Having said this, chelation therapy has been tried, and does seem to work.

‘On November 4, 2012, the TACT (Trial to Assess Chelation Therapy) investigators reported publicly the first large, randomized, placebo-controlled trial evidence that edetate disodium (disodium ethylenediaminetetraacetic acid) chelation therapy significantly reduced cardiac events in stable post–myocardial infarction (MI) patients. These results were so unexpected that many in the cardiology community greeted the report initially with either skepticism (it is probably wrong) or outright disbelief (it is definitely wrong).’ 3

Cardiologists, it seems from the above quotes, know almost nothing about the subject in which they claim to be experts. Just try mentioning glycocalyx to them… ‘the what?’

Apart from a few brave souls battling to remove lead from the body, widely derided and dismissed by the mainstream world of cardiology, nothing else is done. Nothing at all. We spend trillions on cholesterol lowering, and trillions on blood pressure lowering, and more trillions on diet. On the other hand, we do nothing active to try and change a risk factor that kicks all the others – in terms of numbers killed – into touch.

To Be Fat And Have Bread

The obsession with body fat is an interesting story. It didn’t begin a few generations ago but goes back centuries. But maybe that shouldn’t be surprising.

That was the colonial era when the diet was transformed by imperial trade of foreign foods. I might note that this included previously rare or never before seen varieties of fattening carbohydrates: sugar, potatoes, corn, rice, etc. The old feudal system was ending and entirely different forms of food production and diets were developing, especially for the then landless peasants. Hunting, gathering and grazing for the commoners definitely would have been on the decline for a while at that point, as the last of the commons had been privatized. The loss of access to wild game would take longer in the colonies, but eventually it happened everywhere.

The last stage of that shift overlapped with the beginnings of industrialization and agricultural improvements. In the 19th century came changes in wheat surpluses and, hence, in costs and prices. Agriculture boomed even as fewer people were employed in it. There was also a sudden obsession with gender roles and social roles in general, such as the post-revolutionary expectation that the mother make citizens out of her children. Bread-making, a once uncommon activity for Americans, became increasingly important to the normative identity of family life and the symbolic maintenance of the social order.

Regular consumption of wheat bread was once limited to the wealthy, and that is how refined bread gained its moral association with the refined class. Only the wealthy could afford wheat before the 19th century; the poor were forced to rely on cheaper grains and grain substitutes, at a time when bread was regularly adulterated with bark, sawdust, chalk, etc. Poverty breads, in the previous centuries, were often made with no grain at all.* For wheat, and especially heavily refined white bread, to become available to all walks of life meant an upsurge of the civilizing process. The obsession with middle-class life took hold, and so cookbooks were produced in large numbers.

In a growing reactionary impulse, there was a nostalgic tendency toward invented traditions. Bread took on new meanings that were then projected onto the past. It went unacknowledged how radical the industrial agriculture and industrial milling that made all of this possible actually were. The disconnect is demonstrated by the simultaneous promotion of this industrial age’s grain production and the complaint about how industrialized life was destroying all that was good. Bread, as a symbol, transcended these mere details.

With the aristocracy having been challenged during the Revolutionary Era, the refinement of the refined class that once was admired became suspect. The ideology of whole foods began to emerge and had some strong proponents. But by the end of the 1800s, the ideal of refinement gained prominence again and prepared the way for the following century of ever greater industrialization of processed foods. Refinement represented progress. Only after more extensive refinement led to mass malnourishment, near the end of that century and heading into the next, did whole foods once again capture the public imagination.

Then we enter the true era of fat obsession, fat blaming, and dieting, endless dieting. Eat your whole grains, get your fiber, make sure you get enough servings of fruits and veggies, and don’t forget to exercise. Calories in, calories out. Count your calories, count your carbs, count your steps. Count every last one of them. Still, the basic sides of the debate remain the same: fewer carbohydrates vs. less meat, whole foods vs. refined foods, barbaric lifestyle vs. civilizing process, individual moral failure vs. societal changes, etc. One theme that runs through dietary advice from the ancient world to the present is that there is a close link between physical health, mental health, and moral health — the latter erupting as moral panic and moral hygiene. But what stands out about the modern era, beginning in the 1600s, is the observation that psychological problems were mostly seen among the well-to-do.

This was often blamed on luxury and sometimes on meat. The complaint was often about animals raised unnaturally in confinement and probably fed grain (the early equivalent of concerns about factory farming), but also about the introduction of foreign spices and the use of fancy sauces to make meat more appetizing. Still, there was a dawning awareness that a high-carb diet might be playing a role, as it was often noted that the morbidly obese ate lots of pastries, fruit pies, and such. The poor didn’t have much access to wheat and sugar before the 1800s, but the wealthy had plenty of such foods centuries earlier. Meat consumption didn’t change much during that era of colonial trade. What changed the most was the availability of starchy and sugary foods, and the wealthy consumed them in great proportions. Meat had always been a desirable food, going back to earliest hominid evolution. Modern agriculture and global trade, however, entirely transformed the human diet with the introduction of massive amounts of carbohydrates.

It’s strange that right from the beginning of the modern era there were those pushing for a vegetarian diet; not many, but their voices were being heard for the first time. Or maybe it wasn’t so strange. Prior to the modern era, a vegetarian diet so far north in Europe would have been impossible. It was the elite who promoted vegetarianism, as only they could afford a vegetarian diet year round, buying expensive plant-based foods that were often shipped in from far away. Plant foods were expensive at the time, but they were available to those with plenty of money. During the Middle Ages and earlier, though, vegetarianism was for the most part not an option for anyone, since the food items required for such a diet simply weren’t available enough to sustain life, certainly not in places like England or Germany.

There is another side to this that brings us back to the obsession with fat. It was only with the gradual increase of grain production that cattle could be fed grain, not only as additional feed in the winter but year round. This is also what allowed the possibility of confining animals, rather than grazing them on fields. Grain surpluses weren’t consistent until the 19th century, but even before that grain production had been increasing. There were slow improvements in agriculture over the centuries. The rich could afford meat from grain-fed animals much earlier than the rest of the population, and it was highly sought after. That is because such meat is extremely fatty, creating those beautiful marbled steaks, pork chops, etc. (such fattiness, by the way, is a sign of metabolic syndrome in both animals and humans). Fat couldn’t have been a focus of debate before grain-fattened animals became common.

So, there is a reason that both wheat bread and fatty meat gained immense symbolic potency at the same time. Similarly, it was during this same era that vegetables became more common and gardens likewise became symbols of wealth, abundance, and the good life. Only the rich could afford to maintain large gardens because of the difficulty involved and immense time-consuming work required (see The Jane Austen Diet by Bryan Kozlowski**; also about the American diet before the 20th century, see The Big Fat Surprise by Nina Teicholz that I quote in Malnourished Americans). They represented the changed diet of modern civilization. They were either indicators of progress or decline, depending on one’s perspective. Prior to modernity, a diet had consisted to a much greater degree of foods that were gathered, hunted, trapped, and fished.

The shift from one source of food to another changed the diet and so changed the debate about diet. There were suddenly more foods available as choices to argue about. Diet as a concept was being more fully formulated. Rather than being something inherited according to the traditional constraints of local food systems and food customs, a diet, assuming one had the wealth, could be picked from a variety of possibilities. Even to this day, the obsession with dieting carries a taint of class privilege. It is, as they say, a first world problem. But what is fascinating is how this way of thinking took hold in the 1600s and 1700s. There was a modern revolution in dietary thought in the generations before the modern political revolutions. The old order was falling apart and sometimes actively being dismantled. This created much anxiety and forced the individual into a state of uncertainty. Old wisdom could no longer be relied upon.

* * *

*Rather than bread, the food most associated with the laboring class was fish, a food the wealthy avoided. Think about how lobster and clams used to be poverty foods. In the Galenic theory of humoral physiology, fish is considered cold and wet, hard to digest and weakening. This same humoral category of food also included fruits and vegetables. This might be why, even to this day, many vegetarians and vegans will make an exception for fish, seeing it as different from ‘meat’. This is an old ideological bias, because ‘meat’ was believed to have the complete opposite effect of being hot and dry, easy to digest and invigorating. This is the reason why meat, but not fish, was often banned during religious fasts and festivals.

As an interesting side note, the supposed cooling effect of fish was a reason for not eating it during the cold times of the year. Fish is one of the highest sources of vitamin A. Another source is the precursor beta-carotene found in vegetables. That these two types of food are considered of the same variety according to Galenic thought is interesting. Cold weather is one of the factors that can disrupt the body’s ability to convert beta-carotene into usable vitamin A. The idea of humors mixes this up slightly, but it maybe points to an underlying sense that something important was going on. Eating more meat, rather than vegetables, in winter is a wise practice in a traditional society that can’t supplement such nutrients. Vitamin A is key for maintaining a strong immune system and handling stress (True Vitamin A For Health And Happiness).

By the way, it was during the 19th century that a discussion finally arose about vegetarianism. The question was about whether life and health could be sustained with vegetables. Then again, those involved were probably still being influenced by Galenic thought. By vegetarianism, they likely meant a more general plant-based diet that excluded ‘meat’ but not necessarily fish. The context of the debate was the religious abstinence of Lent, during which fish was allowed. So, maybe the fundamental argument was more about the possibility of long-term survival solely on moist, cooling foods. Whatever the exact point of contention, it was the first time in the modern Western world where a plant-based diet (be it vegan, vegetarian, or pescetarian-style Mediterranean diet) was considered seriously.

These ideas have been inherited by us, even though the philosophical justifications no longer make sense to us. This is seen in the debate that continues over red meat in particular and meat in general, specifically in terms of the originally Galenic assertion of its heat and dryness building up the ‘blood’ (High vs Low Protein). It’s funny that dietary debates remain obsessed with red meat (along with the related issue of cows and their farts), even though actual consumption of red meat has declined over the past century. As with bread, the symbolic value of red meat has maybe even gained greater importance. Similarly, as I mentioned above, the categorization of fish remains hazy. I know a vegan who doesn’t eat ‘meat’ but does eat fish. When I noted how odd that was, a vegetarian I was talking to thought it made perfect sense. This is Galenic thought without the Galenic theory that at least made it a rational position, but the ideological bias remains even though those adhering to it are unable to explain why they hold it. It amuses me.

Ideologies are powerful systems. They are mind viruses that can survive and mutate across centuries and sometimes millennia. Most of the time, their origins are lost to history. But sometimes we are able to trace them and it makes for strange material to study.

See: “Fish in Renaissance Dietary Theory” by Ken Albala from Fish: Food from the Waters ed. by Harlan Walker, and Food and Faith in Christian Culture ed. by Ken Albala and Trudy Eden. Also, read text below, such as the discussion of vegetarianism.

* * *

(Both texts below are from collections that are freely available on Google Books and possibly elsewhere.)

The Fat of the Land: Proceedings of the Oxford Symposium on Food and Cooking 2002
ed. by Harlan Walker
“The Apparition of Fat in Western Nutritional Theory”
by Ken Albala

Naturally, dietary systems of the past had different goals in mind when framing their recommendations. They had different conceptions of the good, and at some point in history that came to include not being fat. Body size then became an official concern for dietary writers. Whether the original impetus for this change was a matter of fashion, spirituality or has its roots in a different approach to science is impossible to say with any degree of precision. But this paper will argue that nutritional science itself as reformulated in the 17th century was largely to blame for the introduction of fat into the discourse about how health should be defined. […] Obesity is a pathological state according to modern nutritional science. But it was not always so.

When and why fat became a medical issue has been a topic of concern among contemporary scholars. Some studies, such as Peter N. Stearns’ Fat History: Bodies and Beauty in the Modern West, place the origin of our modern obsession in the late 19th century when the rise of nutritional science and health movements led by figures like John Harvey Kellogg, hand in hand with modern advertising and Gibson Girls, swept away the Victorian preference for fulsome figures. As a form of social protest, those who could afford to, much as in the 60s, idealized the slim androgynous figure we associate with flappers. Others push the origin further back into the early 19th century, in the age of Muscular Christianity and Sylvester Graham. But clearly the obsession is earlier than this. In the 18th century the 448 pound physician George Cheyne and his miracle dieting had people flocking to try out the latest ‘cures.’ It was at the same time that dissertations on the topic of obesity became popular, and clearly the medical profession had classified this as a treatable condition. And readers had already been trained to monitor and police their own bodies for signs of impending corpulence. The roots of this fear and guilt must lie somewhere in the previous century as nutritional science was still groping its way through a myriad of chemical and mechanical theories attempting to quantify health and nutrition with empirical research.

The 17th century is also the ideal place to look if only because the earlier system of humoral physiology is almost totally devoid of a concept of fat as a sickness. […]

For all authors in the Galenic tradition it appears that fat was seen as a natural consequence of a complexion tending to the cold and moist, something which could be corrected, but not considered an illness that demanded serious attention. And socially there does not seem to have been any specific stigma attached to fat if Rubens’ taste in flesh is any measure.

The issue of fat really only emerges among authors who have abandoned, in part or totally, the system of humoral physiology. This seems to have something to do with both the new attempts to quantify nutrition, first and most famously by Santorio Santorio9 and also among those who began to see digestion and nutrition as chemical reactions which when gone awry cast fatty deposits throughout the body. It was only then that fat came to be considered a kind of sickness to be treated with therapy.10

The earliest indications that fat was beginning to be seen as a medical problem are found in the work of the first dietary writer who systematically weighed himself. Although Santorio does not seem to have been anxious about being overweight himself, he did consistently define health as the maintenance of body weight. Expanding on the rather vague concept of insensible perspiration used by Galenic authors, Santorio sought to precisely measure the amount of food he consumed each day compared to the amount excreted in ‘sensible’ evacuations. […] Still, fat was not a matter of eating too much. ‘He who eats more than he can digest, is nourished less than he ought to be, and [becomes] consequently emaciated.’12 More importantly, fat was a sign of a system in disarray. […]

Food was not in fact the only factor Santorio or his followers took into account though. As before, the amount of exercise one gets, baths, air quality, even emotions could alter the metabolic rate. But now, the effect of all these could be precisely calculated. […]

At the same time that these mechanistic conceptions of nutrition became mainstream, a chemical understanding of how food is broken down by means of acids and alkalis also came to be accepted by the medical profession. These ideas ultimately harked back to Paracelsus writing in the 16th century but were elaborated upon by 17th century writers […] It is clear that by the early 18th century fat could be seen as a physiological defect that could be corrected by heating the body to facilitate digestive fermentation and the passage of insensible perspiration. […] Although the theories themselves are obviously nothing like our own, we are much closer to the idea of fat as a medical condition. […]

Where Cheyne departs from conventional medical opinion, is in his recommendation of a cooked vegetable diet to counter the effects of a disordered system, which he admits is rooted in his own ‘experience and observation on my own crazy carcase and the infirmities of others I have treated’ rather than on any theoretical foundation.

The controversy over whether vegetables could be considered a proper diet, not only for the sick or overgrown but for healthy individuals, was of great concern in the 18th century. Nicholas Andry in his Traité des alimens de caresme offered an extended diatribe against the very notion that vegetables could sustain life, a question of particular importance in Catholic France where Lenten restrictions were still in force, at least officially. […] According to current medical theory, vegetables could not be suitable for weight loss, despite the successful results of the empirics. […]

It is clear that authors had a number of potentially conflicting theoretical models to draw from and both mechanical and chemical explanations could be used to explain why fat accumulates in the body. Yet with entirely different conceptual tools, these authors arrived at dietary goals surprisingly like our own, and equally as contentious. The ultimate goals now became avoiding disease and fat, and living a long life. While it would be difficult to prove that these dietary authors had any major impact beyond the wealthy elites and professionals who read their works, it is clear that a concern over fat was firmly in place by the mid 18th century, and appears to have its roots in a new conception of physiology which not only paid close attention to body weight as an index of health, but increasingly saw fat as a medical condition.

Food and Morality: Proceedings of the Oxford Symposium on Food and Cookery 2007
ed. by Susan R. Friedland
“Moral Fiber: Bread in Nineteenth-Century America”

by Mark McWilliams

From Sarah Josepha Hale, who claimed, ‘the more perfect the bread, the more perfect the lady’ to Sylvester Graham, who insisted, ‘the wife, the mother only’ has the ‘moral sensibility’ required to bake good bread for her family, bread often became a gendered moral marker in nineteenth-century American culture.1 Of course, what Hale and Graham considered ‘good’ bread differed dramatically, and exactly what constituted ‘good’ bread was much contested. Amidst technological change that made white flour more widely available and home cooking more predictable, bread, described in increasingly explicit moral terms, became the leading symbol of a housewife’s care for her family.

Americans were hardly the first to ascribe moral meaning to their daily bread. As Bernard Dupaigne writes, ‘since time immemorial [bread] has attended the great events of various human communities: monsoon or grape harvest bread, the blessed bread of Catholics or the unleavened bread of Passover, or the fasting-break bread of Ramadan. There is no bread that does not, somewhere in the world, celebrate an agricultural or religious holiday, enrich a family event, or commemorate the dead.’2 With such varied symbolic resonance, bread seems easily filled with new meanings.

In America (as later in France),3 bread became a revolutionary symbol. To the early English colonists’ dismay, European wheat did not adapt well to the North American climate; the shift to corn as the primary grain was perhaps the most important dietary adaptation made by the colonists. Wheat remained too expensive for common consumption well into the nineteenth century. […]

By the end of the Revolution, then, bread was already charged with moral meaning in the young United States. In the nineteenth century, this meaning shifted in response to agricultural improvements that made wheat more widely available, technological change that made bread easier to make consistently, and, perhaps most important, social change that made good bread the primary symbol of a housewife’s care for her family. In effect, bread suffered a kind of identity crisis that paralleled the national identity crisis of Jacksonian America. As Americans thought seriously about who they were in this new nation, about how they should act and even how they should eat, bread’s symbolic meaning – and bread itself– changed.

American agricultural production exploded, although the proportion of the population working on farms declined. James Trager notes that even before the McCormick reaper first sold in large numbers as farmers struggled to replace workers leaving for the 1849 Gold Rush, the average time required to produce a bushel of wheat declined 22 per cent from 1831 to 1840.7 Dramatic improvements in efficiency led to larger yields; for example, wheat production more than doubled between 1840 and 1860. Such increases in wheat production, combined with better milling procedures, made white flour finally available in quantities sufficient for white bread to become more than a luxury good.8

Even as wheat became easier to find for many Americans, bread remained notoriously difficult to make, or at least to make well. Lydia Maria Child, a baker’s daughter who became one of America’s leading writers, emphasizes what must have been the intensely frustrating difficulty of learning to cook in the era before predictable heat sources, standardized measurements, and consistent ingredients.9 […]

Unlike Hale, who implies that learning to bake better can be a kind of self improvement, this passage works more as dire warning to those not yet making the proper daily bread. Though bread becomes the main distinction between the civilized and the savage, Beecher turns quickly, and reassuringly, to the science of her day: ‘By lightness is meant simply that in order to facilitate digestion the particles are to be separated from each other by little holes or air-cells; and all the different methods of making light bread are neither more nor less than the formation of bread with these air cells’ (170). She then carefully describes how to produce the desired lightness in bread, instructions which must have been welcome to the young housewife now fully convinced of her bread’s moral importance.

The path for Beecher, Hale, and others had been prepared by Sylvester Graham, although he is little mentioned in their work.14 In his campaign to improve bread, Graham’s rhetoric ‘romanticized the life of the traditional household’ in ways that ‘unknowingly helped prepare women to find a new role as guardians of domestic virtue,’ as Stephen Nissenbaum notes.15 Bread was only one aspect of Graham’s program to educate Americans on what he called ‘the Science of Human Life.’ Believing on the one hand, unlike many at the time, that overstimulation caused debility and, on the other, that industrialization and commercialization were debasing modern life, Graham proposed a lifestyle based around strict controls on diet and sexuality.16 While Graham promoted a range of activities from vegetarianism to temperance, his emphasis on good bread was most influential. […]

And yet modern conditions make such bread difficult to produce. Each stage of the process is corrupted, according to Graham. Rather than grow wheat in ‘a pure virgin soil’ required for the best grain, farmers employ fields ‘exhausted by tillage, and debauched by the means which man uses to enrich and stimulate it.’ As Nissenbaum notes, the ‘conscious sexual connotations’ of Graham’s language here is typical of his larger system, but the language also begins to point to the moral dimensions of good bread (6).

Similarly loaded language marks Graham’s condemnation of bakery bread. Graham echoed the common complaints about adulteration by commercial bakers. But he added a unique twist: even the best bakery bread was doubly flawed. The flour itself was inferior because it was over-processed, according to Graham: the ‘superfine flour’ required for white bread ‘is always far less wholesome, in any and every situation of life, than that which is made of wheaten meal which contains all the natural properties of the grain.’ […]

As Nissenbaum argues, pointing to this passage, Graham’s claims invoke ‘the vision of a domestic idyll, of a mother nursing her family with bread and affection’ (8). Such a vision clearly anticipates the emphasis on cookery as measure of a woman’s social worth in the domestic rhetoric that came so to characterize the mid-nineteenth century.

Such language increasingly linking cookery with morality emphasized the virtue not of the food itself but rather of the cooks preparing it. This linkage reached readers not only through the explosion of cookbooks and domestic manuals but also through the growing numbers of sentimental novels. Indeed, this linkage provided a tremendously useful trope for authors seeking a shorthand to define their fictional characters. And that trope, in turn, helped expand the popularity of interpreting cookery in moral terms. […]

After the Civil War, domestic rhetoric evolved away from its roots in the wholesome foods of the nation’s past toward the ever-more refined cuisine of the Gilded Age. Graham’s refusal to evolve in this direction – his system was based entirely in a nostalgic struggle against modernity, against refinement – may well be a large part of why his work was quickly left behind even by those for whom it had paved the way.

* * *

Here is another text I came across. It’s not free, but it seems like a good survey worth buying.



Biden’s Corruption and Dementia

“Will the Senate investigate Joe and Hunter Biden’s actions in China and Ukraine? We don’t know, but they should. If a two-year investigation of President Trump, Russia and the Trump family was justified to ensure the president isn’t compromised, an investigation into Joe Biden, China, Ukraine and the Biden family is imperative.”
~Peter Schweizer, Secret Empires *

“It is certainly understandable that people are concerned about the presidential frontrunner having a racist worldview. But what’s really weird and creepy is how few people are discussing the obvious fact that the presidential frontrunner is also clearly suffering from the early stages of some kind of dementia. The brain that spouted the gibberish transcribed above would probably score poorly on a basic test for the early stages of Alzheimer’s disease, yet discussion of his inability to complete a coherent sentence is relegated to the margins of political discourse. This is someone who is campaigning to have access to the nuclear codes, yet we’re only talking about how he’s kind of racist and not about the fact that his brain is turning into Swiss cheese right before our eyes. It’s freaky.”
~Caitlin Johnstone **

* quoted by Patrice Aymes, Biden Family Corruption: So Common A Thing Democrats & Their Pluto Media Didn’t Notice

** from Open Society blog, Biden’s Brain Is Swiss Cheese and It’s Creepy How Much We’re Not Talking About It

True Vitamin A For Health And Happiness

“The discovery of vitamin A and the history of its application in the field of human nutrition is a story of bravery and brilliance, one that represents a marriage of the best of scientific inquiry with worldwide cultural traditions; and the suborning of that knowledge to the dictates of the food industry provides a sad lesson in the use of power and influence to obfuscate the truth”
~Mary Enig, PhD, Lipid Biochemist

Over this past century, there has been a developing insight into the role of nutrition in health. It was originally motivated by the observations of the diseases of malnutrition, largely coinciding with the diseases of civilization. This became increasingly obvious with industrialization. By the mid-20th century, there was a growing health movement that brought greater awareness to this field of study.

When my grandmother was diagnosed with cancer in the 1980s, she had already been reading books on health for decades and so, instead of chemotherapy or radiation, she tried to cure herself with a macrobiotic diet. My father recalls her juicing such vast amounts of carrots, presumably to raise her beta-carotene levels, that her skin turned the same color as her beverage of choice. Carotenoids caused that color change, and they are also the reason egg yolks and butter are yellow, turning an even deeper orange when animals are pasture-raised on lush green forage (beta-carotene is easily oxidized, so grass quickly loses this nutrient once cut; hence, cattle eating fresh greens in the spring and summer is important for the fat-soluble vitamins they store in their fat over winter). There are “currently about 600 known forms of naturally occurring carotenoids” (Sarah Pope, Busting the Beta Carotene Vitamin A Myth). “The carotenoids are further broken down into 2 classes, carotenes and xanthophylls. The carotenes consist of alpha-carotene, beta-carotene, gamma-carotene, delta-carotene, epsilon-carotene, and zeta-carotene. In the xanthophyll class we have astaxanthin, beta-crypto-xanthin, canthaxanthin, fucoxanthin, lutein, neoxanthin, violaxanthin, and zeaxanthin” (Casey Thaler, Why You Need Vitamin A). Beta-carotene is the precursor to the fat-soluble vitamin A, which is linked to the body’s immune system, to fighting off cancerous cells and other sicknesses, and to handling stress (yet “Stress conditions, such as extremely hot weather, viral infections, and altered thyroid function, have also been suggested as causes for reduced carotene to vitamin A conversion”; from Lee Russell McDowell’s Vitamins in Animal Nutrition, p. 30). Weston A. Price was an early 20th century researcher who observed the relationship between fatty animal foods, fat-soluble vitamins, and a strong immune system.

Price also discussed fertility, and I’d note that research shows that vegans and vegetarians have higher rates of infertility, as do rodents with vitamin A deficiency. I know a vegetarian couple who spent years trying to get pregnant and then spent thousands of dollars on in vitro fertilization treatments, trying twice before finally getting pregnant (the guy’s sperm were deformed and lacked proper motility). Pam Schoenfeld warns of, “in the worst case, spontaneous abortion or death of mother and offspring during labor, as described by Weston A. Price. In humans, even mild deficiencies during pregnancy can lead to compromised kidney development in the child” (The Unfair Stigmatization of Vitamin A). And Nora Gedgaudas offers a grim warning that, “Vitamin A deficiency has even been implicated in Sudden Infant Death Syndrome (SIDS)! Far from being a threat to any unborn fetus, The Nordic Epidemiological SIDS Study found an association between low or no vitamin A intake and an increased risk of sudden infant death syndrome (SIDS) during their first year of life. This finding remained conclusive even when an adjustment was made for potential confounders, including socioeconomic factors. Furthermore, substantial evidence exists to show that healthy vitamin A levels during pregnancy in the mother may result in substantially reduced HIV transmission risk to her unborn child (also, vitamin A-deficient mothers were much more likely to transmit the virus to their newborn infants than were HIV-infected mothers who had adequate amounts of this critical nutrient)” (Vitamin A Under Attack Down Under).
By the way, see the works of Ken Albala and Trudy Eden (e.g., Food and Faith in Christian Culture) on how, in the Galenic theory of humors, meat, especially the red meat of ruminants, was understood to increase ‘blood’, which turns into semen and breast milk. That is to say, fatty animal foods were held to be good for fertility and fecundity, for growth, strength, and vigor (Galenic ‘blood’ might be translated as animal spirits, life force, prana, or, in more modern terminology, libido). Ruminants do appear to have been the primary food source in human and hominid evolution, specifically the blubbery ruminants we call megafauna, prized by hominids for millions of years before their die-off.

Many animals are able to turn beta-carotene into retinol, something the human body can in theory do to a limited extent, if not as effectively as ruminants, and the final product is concentrated in the animal fat and the liver. As carotenoids are why carrots and sweet potatoes are orange, this relates to why egg yolks and butter are yellow-to-orange but not egg whites and skim milk. The deep color doesn’t only tell you of the presence of vitamin A but of the other fat-soluble vitamins as well: D, E, and K (the light color or even whiteness of the fat in factory-farmed animal foods is a sign of low nutritional quality). Given her compromised health, my grandmother likely was not producing enough of her own vitamin A, despite getting a bounty of the precursor (her oddly-colored skin probably indicated carotenaemia, a buildup of carotenoids the body has not metabolized). Worse still, beta-carotene in some cases is associated with increased risk of cancer: “In two studies, where people were given high doses of B-carotene supplements in an attempt to prevent lung cancer and other cancers, the supplements increased risk of lung cancer in cigarette smokers, and a third study found neither benefit nor harm from them. What might cause the unexpected findings? While beneficial at low doses, at higher doses, antioxidants can shut down cell signalling pathways and decrease synthesis of mitochondria in new muscle cells. They can also decrease production of endogenous antioxidants produced by a body” (Fred Provenza, Nourishment, p. 99). In getting unnaturally high levels of beta-carotene from juicing large quantities of carrots, was my grandmother overdosing herself in a similar manner? Rather than improving her immunity, did it contribute to her dying from cancer despite doing her best to follow a healthy diet? More is not necessarily better, especially when dealing with these precursors.
It would be better to go straight to the preformed vitamin A in naturally balanced ratio with other nutrients (e.g., organ meats).

Beyond cancer concerns, the bioavailable forms of retinoids, along with nutrients like B12 that are likewise found only in animal foods (liver has high levels of both), have a major effect on the health and development of the eyes (during the British food shortages of World War II, the government promoted eating the carrot surplus by telling the public that carrots would give them the eyesight of heroic fighter pilots). Night blindness, a common symptom of vitamin A deficiency, is one of the most widespread problems. It is associated with developing countries and impoverished communities, but malnourishment creeps into many other populations. A vegetarian I know is losing his night vision, and it stands out that the others in his family, also vegetarian, show signs of deficiencies. Besides the family being regularly sick, his wife has severe calcium loss (her body is literally absorbing her lower jaw bone) and his kids have issues with neurocognitive development and mental health (autism, depression, etc). The family doesn’t only avoid meat; they also eat little dairy or eggs, for example preferring plant ‘milks’ over real milk. Such ‘milks’ are now recommended against for the young, with the advice that children drink either dairy milk or water (E. J. Mundell, New Kids’ Drink Guidelines: Avoid Plant-Based Milks). This makes me wonder about my own early development because, after being weaned early at 6 months, I couldn’t handle cow milk and so was put on soy milk. That was likely a contributing factor to my early-diagnosed learning disability, autistic-like symptoms, depression, and poor eyesight (I got corrective lenses a year or two after reading delays sent me to a special education teacher; I was an otherwise healthy kid who played sports and spent a lot of time outside rather than watching tv; for an interesting anecdote from someone else, see what Josiah shares in The Story of How Real Vitamin A Changed My Life).
To later compound the fat-soluble vitamin deficiency, my mother, following expert health advice, bought skim milk during my adolescent growth period and, now that I think about it, that was around when my depression really went into full gear. This kind of thing hasn’t been a problem for traditional societies, which often breastfed for the first 2-3 years; that mother’s milk would be full of fat-soluble vitamins, assuming the mother was eating well, such as getting fatty animal foods from wild or pasture-raised sources (a safe assumption as long as such societies still have access to their traditional foods by maintaining traditional hunting grounds, fishing waters, or grazing lands).

The diseases of vitamin A deficiency have been known for millennia and were typically treated with fatty animal foods such as ruminant liver or cod liver oil, but other parts of the animal were also used. “In his pioneering work, Nutrition and Physical Degeneration, Weston Price tells the story of a prospector who, while crossing a high plateau in the Rocky Mountains, went blind with xerophthalmia, due to a lack of vitamin A. As he wept in despair, he was discovered by an Indian who caught him a trout and fed him “the flesh of the head and the tissues back of the eyes, including the eyes.”1 Within a few hours his sight began to return and within two days his eyes were nearly normal. Several years previous to the travels of Weston Price, scientists had discovered that the richest source of vitamin A in the entire animal body is that of the retina and the tissues in back of the eyes” (Sally Fallon Morell & Mary G. Enig, Vitamin A Saga). I might add that there is more to the eyeball than just vitamin A: “The Latin name for the retina of the eye is macula lutea. (Lutea is Latin for yellow.) This thick, membranous yellow layer of the eyeball is a rich source of the nutrient lutein, a member of the retinoid family of vitamin A precursors. Lutein supplements are now promoted as being good for prostate health and for preventing macular degeneration. The fat behind the eyeball is a rich source of vitamin A and lutein. (If you think you’d rather swallow a supplement than pop an eyeball after breakfast, remember that vitamins are heat-, light-, and oxygen-sensitive and unlikely to survive processing.) And while you’re digesting the idea of eating eyeball fat, consider that the gooey juice in the eye is primarily hyaluronic acid, rich in glycosaminoglycans.
You can get hyaluronic acid injected into your lips (to fill them out), your knee (as a treatment for osteoarthritis), and even your own eye (to treat certain ocular diseases) for $200 a dose (twenty one-thousandths of a gram). It’s called Restylane. But you can get this useful nutrient into your body just by eating the eyes you find in fish head soup, and the glycosaminoglycans will find their way to the parts of the body that need them most” (Catherine Shanahan, Deep Nutrition, p. 279). Maybe that is a home remedy my grandmother should have tried.

Despite this old wisdom, vitamin A itself was not identified until the early 1900s. In the decades following, all of the other main vitamins were discovered. Gyorgy Scrinis, in his book Nutritionism, summarizes this early history of nutritional studies that led to the isolation of the chemical structure (p. 75): “It was also in 1912 that Elmer McCollum’s research team at Yale University first identified a fat-soluble substance they called fat-soluble A or Factor A (later renamed vitamin A) found in butter, liver, and leafy greens. Through his experiments on rats, McCollum demonstrated that a deficiency of Factor A led to impaired vision and stunted growth. The team also identified “water-soluble B” or “Factor B,” later renamed vitamin B, the absence of which they linked to the tropical disease beriberi. 80 In the 1920s scientists identified various important relationships between vitamins and human health: they linked deficiency in vitamin C, which they found in citrus fruits, to scurvy, and they linked deficiency in vitamin D, found in various foods and produced by the body in response to sunlight, to rickets. During the 1920s and 1930s, other vitamins and minerals were identified, including riboflavin, folic acid, beta-carotene, vitamin E, and vitamin K.” McCollum was motivated by a realization of some important factor that was missing. “After reviewing the literature between 1873 and 1906,” writes Lee Russell McDowell, “in which small animals had been fed restricted diets of isolated proteins, fats, and carbohydrates, E. V. McCollum of the United States noted that the animals rapidly failed in health and concluded that the most important problem in nutrition was to discover what was lacking in such diets” (Vitamins in Animal Nutrition, p. 9). One of the limitations of the early research was that its approach was too broad.
When isolated macronutrients failed to serve optimal health, researchers turned to isolated micronutrients, not considering that part of the problem is the isolating of any particular nutrient while ignoring the larger process of how nutrients get used as part of whole foods. The precursor provitamin A (carotenoids such as beta-carotene) is not the same as preformed vitamin A (retinoids: retinol and its esterified form, retinyl ester, along with retinal and retinoic acid). To conflate the carotenoids and the retinoids is reductionist and, when accepted without question, causes many problems. Nutrients aren’t simply compounds that one either gets or doesn’t. There is more going on than that.

The main complication is that the body doesn’t easily turn the one form into the other and this can vary greatly between individuals: “many people are genetically bad converters of carotenoids to retinol” (Dr. Chris Masterjohn: Why You’re Probably Nutrient Deficient – The Genius Life); for brief discussions of related genes, see Debbie Moon’s How Well Do You Convert Beta-Carotene to Vitamin A?, Joe Cohen’s The Importance of Real Vitamin A (Retinol), and David Krantz’s When Carrots Don’t Cut It: Your Genes and Vitamin A. Also consider: “This genetic problem may exist in up to half of the population. Its presence appears to be associated with high blood levels of beta-carotene, alpha-carotene, beta-cryptoxanthin, and low levels of lycopene, lutein and zeaxanthin—three other carotenoids important for health but don’t convert to vitamin A” (Phil Maffetone, Vitamin A and the Beta-Carotene Myth: “A” is for Athletics, Aging and Advanced Health). Keep in mind that other nutrients matter as well: “Iron and zinc deficiency can affect the conversion to vitamin A” (Pam Schoenfeld, The Unfair Stigmatization of Vitamin A). It’s ironic that, with all the obsession over eating loads of the precursor from vegetables and supplements, the very beta-carotene that can potentially be made into vitamin A might also compete with it: “Recent research also suggests that cleavage products of beta-carotene can block vitamin A at its receptor sites—another possible anti-nutrient?” (Schoenfeld). So, even nutrient-density of a vitamin precursor doesn’t necessarily mean there is bioavailability of the vitamin itself. “Conversion of carotenes to Vitamin A takes place in the upper intestinal tract in the presence of bile salts and fat-splitting enzymes. While early studies on this biological process suggested a 4:1 ratio of beta carotene to Vitamin A conversion, later studies revised this to 6:1 or perhaps even higher.
If a meal is lowfat, however, not much bile is going to reach the intestinal tract further worsening the conversion ratio of carotenes to Vitamin A” (Sarah Pope, Busting the Beta Carotene Vitamin A Myth). “Average conversion of beta-carotene to retinol is around 2-28% (28% is on the very generous end), meaning those who consume all of their un-converted vitamin A from plants would have a very hard time meeting their vitamin A needs, and conversion could be even lower if someone is in poor health” (Laura A. Poe, Nutrient Spotlight: Vitamin A). There are numerous health conditions and medications that make conversion difficult or impossible — for example: “Diabetics and those with poor thyroid function, (a group that could well include at least half the adult US population), cannot make the conversion. Children make the conversion very poorly and infants not at all — they must obtain their precious stores of vitamin A from animal fats— yet the low-fat diet is often recommended for children” (Sally Fallon Morell, Vitamin A Vagary). I could list many other factors that get in the way, as this conversion process requires almost perfect conditions within the human body.
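To make those ratios concrete, here is a rough back-of-the-envelope sketch, not taken from any of the sources quoted above. It uses the standard US retinol activity equivalent (RAE) factor of 12 mcg dietary beta-carotene per 1 mcg retinol and the adult male RDA of 900 mcg RAE; the beta-carotene content assumed for a single carrot is an approximate illustrative figure, and as the quotes above note, an individual’s real conversion rate may be far worse than the standard factor.

```python
# Illustrative arithmetic only: how many carrots would cover the vitamin A RDA
# if conversion worked at the standard textbook rate?

MCG_BETA_CAROTENE_PER_RAE = 12.0   # 12 mcg dietary beta-carotene = 1 mcg RAE (retinol)
RDA_ADULT_MALE_RAE = 900           # mcg RAE per day

# ASSUMED approximate beta-carotene content of one medium carrot (mcg);
# real values vary with variety, size, and preparation.
BETA_CAROTENE_PER_CARROT_MCG = 5000

rae_per_carrot = BETA_CAROTENE_PER_CARROT_MCG / MCG_BETA_CAROTENE_PER_RAE
carrots_needed = RDA_ADULT_MALE_RAE / rae_per_carrot

print(f"One carrot ~ {rae_per_carrot:.0f} mcg RAE at the 12:1 ratio")
print(f"Carrots per day to hit the RDA: {carrots_needed:.1f}")

# If someone converts at, say, a quarter of the standard rate — well within
# the ranges quoted above — the requirement scales up proportionally:
poor_conversion = 0.25
print(f"At 25% of standard conversion: {carrots_needed / poor_conversion:.1f} carrots per day")
```

The point of the sketch is simply that the headline ratios compound: a worse conversion rate multiplies directly into the amount of plant food required, which is why the preformed retinol in animal foods sidesteps the whole problem.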

If you’re assuming that you are getting enough vitamin A, you’re making a dangerous gamble with your health. Take heed of a recent case where a teenager lost his eyesight and hearing after a decade of avoiding nutritious animal foods and instead following a junk food diet (Lizzie Roberts, Teenager ‘first in UK’ to go deaf and blind due to junk food diet, report reveals) — “The doting mother believes vitamin A injections could have saved Harvey’s sight if they were given to him at an earlier age” (Kyle O’Sullivan, Mum of teenager who went blind from only eating crisps and chocolate blames NHS); she said that, “Back in December when we were told it was down to nutrition, we think if they’d done the blood test then and realised the Vitamin A was so low they could have given him the Vitamin A injections then and he could see a lot more out of that right eye and we could have saved it a lot better.” Of course, he wasn’t eating a ‘balanced’ diet, but like many people on modern nutrient-deficient diets he became dependent on government-mandated fortification policies to save him from malnourishment. But it turns out that, even with fortified foods, there are major nutritional holes in the Western diet. Should we fortify food even further or maybe genetically modify crops for this purpose? Should we eat even more vegetables and juice them until, like my grandmother, we all turn orange? Or else should we simply go back to eating healthy traditional animal foods?

This is the same basic issue with the precursors of other fat-soluble vitamins and the precursors of other nutrients that aren’t easily processed by humans until after other animals have done the work for us (e.g., the omega-3s in algae can’t be accessed by human digestion, but fish can break them down, and so, in eating fish, they become bioavailable). This understanding has been slow to take hold in nutrition studies. Consider how, even though Weston A. Price was writing about Activator X in the 1940s, it wasn’t until this new century that it was identified as vitamin K2, distinct from vitamin K1 — see Christopher Masterjohn, On the Trail of the Elusive X-Factor: A Sixty-Two-Year-Old Mystery Finally Solved. By the way, I’d emphasize the close link of vitamin A and vitamin K, as Masterjohn details: “Because vitamin K1 is directly associated with both chlorophyll and beta-carotene within a single protein complex and plays a direct role in photosynthesis, the richness of the green color of grass, its rate of growth, and its brix rating (which measures the density of organic material produced by the plant) all directly indicate its concentration of vitamin K1. Animals grazing on grass will accumulate vitamin K2 in their tissues in direct proportion to the amount of vitamin K1 in their diet. The beta-carotene associated with vitamin K1 will also impart a yellow or orange color to butterfat; the richness of this color therefore indirectly indicates the amount of both vitamins K1 and K2 in the butter. Not only are the K vitamins detected by the Activator X test and distributed in the food supply precisely as Price suggested, but, as shown in Figure 2, the physiological actions that Price attributed to Activator X correspond perfectly to those of vitamin K2.
It is therefore clear that the precursor to Activator X found in rapidly growing, green grass is none other than vitamin K1, while Activator X itself is none other than vitamin K2.” Just eat those delicious animal foods! Then everything will be right with the world. Any ideology that tells you to fear these foods is a belief system that is anti-human and anti-life.

The thing is, in their natural form in animal foods, fat-soluble vitamins are part of a complex of synergistic nutrients and their cofactors (it’s particularly important for vitamins A, D3, and K2 to be in balance). Isolated vitamins, especially in higher amounts as supplements and fortification to treat disease, have sometimes proven to be problematic for health in other ways. “Nutrient supplements may even be harmful, particularly when taken in large, concentrated, and isolated doses,” explained Gyorgy Scrinis. “An overdose of vitamins A and D, for example, can have toxic and potentially fatal effects. Some studies have also found an association between beta-carotene supplements and an increased risk of both cardiovascular disease and certain cancers” (p. 76). Furthermore, specific foods are part of a total diet and lifestyle. For hundreds of thousands of years, humans ate a low-carb and high-fat diet combined with regular fasting, which guaranteed regular ketosis and autophagy. There have been numerous health benefits shown from these combined factors. It’s fascinating that, in early research on the ketogenic diet as applied to children, it was sometimes observed not only to be effective in treating physical diseases like epileptic seizures and diabetes but also to improve behavioral issues. More recent research has demonstrated this as well, showing the diverse connections to neurocognitive health (Ketogenic Diet and Neurocognitive Health; Fasting, Calorie Restriction, and Ketosis; The Agricultural Mind; “Yes, tea banished the fairies.”; Autism and the Upper Crust; Diets and Systems; Physical Health, Mental Health). Still, there are many confounding factors. Since ketogenic diets tend to increase fat intake, depending on the source, they also can increase the levels of fat-soluble vitamins. Weston A. Price observed that traditional healthy societies getting plenty of these nutrients had both greater physical health (well-developed bone structure, strong immune system, etc.) and greater ‘moral’ health (positive mood, pro-social behaviors, etc.). The moral panic that more strongly took hold in the 19th century was understood at the time as being rooted in general health concerns (The Crisis of Identity).

Many of the fat-soluble vitamins, especially vitamin A, act more like hormones than mere nutrients. They influence and determine nearly every system in the body, including how they impact the development of the nervous system, brain, and gut-brain axis. If one is seeing outward symptoms like eye deterioration, it is almost guaranteed that far worse, less obvious problems are already forming. Yet even outward symptoms aren’t always recognized, such as in one study where most subjects didn’t even realize they had decreased night vision from vitamin A deficiency. Our health often has to get severely bad before we notice it and admit to it. Part of this is that sickness and disease have become so common that they have become normalized. I was socializing with someone who is overweight to an unhealthy degree, but I later realized how the excess body fat didn’t even stand out to me because, by American standards, this person was normal. Anything can become normalized. My mother brought my nephew to the doctor and, in discussing how often he gets sick, told the doctor about his unhealthy diet. The doctor’s response was that all kids eat an unhealthy diet. The doctor had seen so many sickly patients that she couldn’t imagine health as a normal state of the human body. About the vegetarian I mentioned, I give him credit for at least noticing his loss of night vision, but what nags me about his situation is how he seems to have accepted it as an expected part of aging, not realizing that it is a sign of a severe health concern. He probably has never thought to mention it to his doctor and, even if he did, most doctors aren’t well enough educated in nutrition to realize its significance and understand what it might mean (Most Mainstream Doctors Would Fail Nutrition).

This isn’t a problem limited to a few people. One of the main things that has declined over time is access to fat-soluble vitamins, a pattern that most clearly emerged in the early 19th century, when cheap grains, especially white flour, replaced animal foods, and then ratcheted up further with the early 20th century replacement of animal fats with industrial seed oils. It’s not only whether or not we are getting a particular vitamin. Just as important is how other aspects of the diet affect nutrition. A high-carb diet itself might be disrupting the bioavailability of vitamin A, and that is even more true if it is also low-fat, since the fat-soluble vitamins are useless without the fat to absorb them. From the book Malnutrition and the Eye, Donald McLaren writes: “The association of xerophthalmia with an excessive intake of carbohydrate in the diet in infancy was recorded by Czerny and Keller (1906) in their classical monograph on the syndrome they termed Mehlnahrschaden. It is now recognized that this condition is identical in all basic features to what has been called “the most serious and widespread nutritional disorder known to medical and nutritional science” (Brock and Autret, 1952) and due in essence to a deficiency of protein and excess of carbohydrate in the diet. Many local and other names have been applied to this disease but it will be necessary here to use one, and that chosen, “kwashiorkor,” has found wide acceptance. Since Czerny’s day there has been a great number of other accounts in which ocular involvement has been described (McLaren, 1958), providing good evidence for the contention that a deficiency of vitamin A is the most common of all vitamin deficiencies associated with kwashiorkor.” It has been observed by many that the populations with vitamin A deficiency tend to have a diet high in grains and vegetables while low in animal foods, particularly low in seafood (fish oil is the most concentrated source of vitamin A).
A high grain diet affects nutrition in other ways as well: “Many nutritionists consider cereal grains to be good sources of most of the B vitamins except for vitamin B12. Inspection of table 4 generally is supportive of this concept, at least in terms of the % RDA which cereal grains contain. However, of more importance is the biological availability of the B vitamins contained within cereal grains and their B vitamin content after milling, processing and cooking. It is somewhat ironic that two of the major B vitamin deficiency diseases which have plagued agricultural man (pellagra and beriberi) are almost exclusively associated with excessive consumption of cereal grains” (Loren Cordain, “Cereal Grains: Humanity’s Double-Edged Sword,” from Evolutionary Aspects of Nutrition and Health ed. by Artemis P. Simopoulos, p. 27). Besides vitamin A and B12 affecting eye and bone health, they work together in numerous other ways (e.g., Edward G. High & Sherman S. Wilson, Effects of Vitamin B12 on the Utilization of Carotene and Vitamin A by the Rat), as is true with so many other links between nutrients. The balance is easily disturbed — all the more reason to worry about grains, since they knock out large swaths of essential nutrients, including the nutrients from animal foods. Eat the hamburger and leave the bun; enjoy the steak but not the roll. There is another factor to keep in mind. A high-carb diet is a major cause of liver disease, related to metabolic syndrome, which is also associated with insulin resistance, obesity, diabetes, heart disease, Alzheimer’s, etc. The liver is necessary for the digestion of fat and the assimilation of fat-soluble vitamins. So, if someone on a high-carb diet has compromised their liver functioning, they can be deficient in vitamin A no matter how much they are getting from their diet (James DiNicolantonio made a similar point about other nutrients: “The liver makes proteins that carry minerals around the body.
So if you have liver disease, even if you consume enough minerals, you may have difficulty moving minerals around the body to where they are needed”; this would relate to how the fat-soluble vitamins are absolutely necessary for the absorption, processing, transportation, and use of minerals).

This is exacerbated by the fact that, as people have followed official dietary recommendations in decreasing fat intake, they’ve replaced it with an increase in starchy and sugary carbohydrates. Actually, that isn’t quite correct. Fat intake, in general, hasn’t gone down (nor gone up). Rather, Americans are eating less animal fats and more industrial seed oils. The combination of unhealthy carbs and unhealthy oils is a double whammy. There are all kinds of problems with industrial seed oils, from being oxidative to being inflammatory, not to mention being mutagenic (Dr. Catherine Shanahan On Dietary Epigenetics and Mutations). On the one hand, there is the loss of fat-soluble vitamins in that the industrial seed oils lack them. That is bad enough, but consider another part of the equation. Those same oils actively interfere with whatever fat-soluble vitamins are otherwise being obtained — as Gyorgy Scrinis tells it: “The nutritional engineering of foods can create nutrient-level contradictions, whereby the enhancement or removal of a particular nutrient by food manufacturers interferes with the quantities or absorption of other desirable nutrients. For instance, studies have shown that the concentrated quantities of plant sterols in cholesterol-lowering margarines block the absorption of beta-carotene and therefore lower vitamin A levels in the body. Consumers of sterol-enriched foods are therefore encouraged to compensate for this nutrient-level contradiction by eating more fruits and vegetables to increase their vitamin A levels” (Nutritionism, p. 211). Combine that with the gut, metabolic, and liver problems seen in so many modern Westerners and other industrialized populations (88% of adult Americans are metabolically unfit: Dietary Health Across Generations). Maybe humans with the right genetics and optimal health could both turn beta-carotene into retinol and make use of all the fat-soluble vitamins, as precursors or preformed.
The problem is that doesn’t describe most people today. “Unfortunately, just like with Omega 3, the evidence indicates our ability to convert beta-carotene to retinol is not sufficient (20, 21, 22). Some estimates indicate that beta-carotene intake is only 16-23% as effective as retinol intake for increasing body levels of retinol (20, 21, 22). This is supported by findings that beta-carotene supplementation and high beta-carotene intake (vitamin A from vegetables) increases serum beta-carotene levels but does not significantly impact retinol levels (20, 21, 22)” (Andy AKA Barefoot Golfer, Why Humans are Not Vegetarians – The Omnivorous Truth).

This is why vegans and vegetarians so easily get into trouble with nutrient deficiencies and are forced to rely upon supplements, although it’s questionable how helpful these supplements are as replacements for whole foods in a diet that would otherwise be both nutrient-dense and nutrient-bioavailable. The vegetarian I discussed above eats a “balanced diet” of fresh produce and fortified plant foods combined with a multivitamin, and yet he still is losing his night vision. Nutrients are part of a complex web of health. Too much of one thing or too little of another can mess with the levels of a particular nutrient which, in turn, can have a chain effect with numerous other nutrients. Consider calcium and how it is processed (Calcium: Nutrient Combination and Ratios); vitamin A plays a central role in bone development, and so maldevelopment of the skull, which can constrict the cornea or optic nerve, is another way deficiency can negatively impact eyesight. Or consider that, “Dietary antioxidants (i.e., vitamin E) also appear to have an important effect on the utilization and perhaps absorption of carotenoids. It is uncertain whether the antioxidants contribute directly to efficient absorption or whether they protect both carotene and vitamin A from oxidative stress. Protein deficiency reduces absorption of carotene from the intestine” (Lee Russell McDowell, Vitamins in Animal Nutrition, pp. 16-17). We humans aren’t smart enough to outsmart nature (Hubris of Nutritionism). The body as a biological system is too complex and there are too many moving parts. Any single thing shifts and everything else follows. The only guaranteed healthy solution is to adhere to a traditional diet that includes plenty of fatty animal foods. This is easy enough for omnivores and carnivores — get plenty of liver and fish (or get it in the form of fish oil and cod liver oil), although any high quality fatty meats will work.
And if you’re vegetarian, emphasize pasture-raised eggs and dairy. But for vegans, I can only suggest that you pray to God that you have perfect genetics, perfect metabolism, a perfect balance of supplements, and all other aspects of optimal functioning that allow you to be the rare individual who has a high conversion rate of beta-carotene to vitamin A, along with conversion of other precursors (4 Reasons Why Some People Do Well as Vegans (While Others Fail Miserably)) — good luck with that!

To lighten up the mood, I’ll end with a fun factoid. Talking about genetics, an important element is epigenetics. Catherine Shanahan, quoted once already, has written an interesting discussion in her book, Deep Nutrition, that covers the interaction of nutrition with both genetics and epigenetics. Health problems from deficiencies can be passed on, but they can also be reversed when the nutrient is added back into the diet: “One example of the logic underlying DNA’s behavior can be found by observing the effects of vitamin A deficiency. In the late 1930s, Professor Fred Hale, of the Texas Agricultural Experiment Station at College Station, was able to deprive pigs of vitamin A before conception in such a way that mothers would reliably produce a litter without any eyeballs. 50 When these mothers were fed vitamin A, the next litters developed normal eyeballs, suggesting that eyeball growth was not switched off due to (permanent) mutation, but to a temporary epigenetic modification. Vitamin A is derived from retinoids, which come from plants, which in turn depend on sunlight. So in responding to the absence of vitamin A by turning off the genes to grow eyes, it is as if DNA interpreted the lack of vitamin A as a lack of light, or a lightless environment in which eyes would be of no use. The eyeless pigs had lids, very much like blind cave salamanders. It’s possible that these and other blind cave dwellers have undergone a similar epigenetic modification of the genes controlling eye growth in response to low levels of vitamin A in a lightless, plantless cave environment” (p. 57). The body is amazing in what it can do, when we give it the nourishment it needs. Heck, it’s kind of amazing even when we malnourish ourselves and the body still tries to compensate.

* * *

Vitamin A Under Attack Down Under
by Nora Gedgaudas

Traditional and indigenous diets have always venerated foods rich in fat-soluble nutrients and women either pregnant and/or seeking to become pregnant in traditional and indigenous societies (according to the exhaustive and well-documented research of Dr. Weston A. Price, author of the respected and acclaimed textbook, ‘Nutrition and Physical Degeneration’) ate diets rich in fats and fat-soluble nutrients–including liver–for this very purpose. The notion that these foods have somehow–all of a sudden—become toxic to us at any age is patently absurd. In fact, Price—himself a meticulous researcher in the 1930’s—determined that traditional/indigenous societies readily consumed more than 10-times the levels of these nutrients—easily estimable at about 50,000 IU per day of preformed vitamin A, as compared to the levels of vitamin A consumed in (his) “modern times”. And people in Weston Price’s “modern day era” (1930’s) were nowhere near as hysterically phobic about foods such as egg yolks, shellfish and liver as we have since become following the fabrication of irrational concerns about such foods (and animal source foods/fats, in general)! Let’s just say we’re not necessarily healthier as a species for consuming at least ten-times less of these vital and protective, activated fat-soluble nutrients today; much less are we enjoying fewer birth defects or improved overall maternal/infant health. In fact, the closer we as a society attempt to emulate government guidelines, the less healthy (according to the latest confirming research) we demonstrably become.

Primal Body, Primal Mind
by Nora Gedgaudas
p. 139

The role of certain nutrients in relation to others and the need for certain cofactors in order to optimize a nutrient’s function or prevent imbalances aren’t normally discussed at all. This, of course, leads to problems.

For instance—and perhaps critically—for each and every receptor for vitamin D, there are two receptors for vitamin A on every cell. Because of the compartmentalized approach to vitamin D research, this sort of thing does not get recognized or discussed. A relative balance of these two nutrients is vital to their healthy functioning in the body. An excess of one can create a relative deficiency of the other. For instance, if you take large amounts of vitamin D without vitamin A, you are potentially more likely to develop symptoms of vitamin A deficiency and experience an actual immunosuppressive effect. Conversely, taking certain commercial cod-liver oil supplements that are rich in vitamin A but poor in vitamin D can lead to more severe vitamin D deficiencies. (It’s important to read labels. The amount of vitamin D in a serving of high-vitamin cod-liver oil is around 1,000 IU. Most commercial brands don’t exceed between 20 and 400 IU). Recent research from Spain indicates that vitamin A is necessary for both vitamin D binding and vitamin D release to receptor sites. The two vitamins are synergistic and should always be balanced in the diet or in supplementation. Individual needs for both may vary considerably.

Fat Soluble Vitamins: Vitamins A, D, E & K
by Jenny MacGruther

Carotenoids, which include the very prevalent beta carotene, are poorly converted by the body. For example, some studies indicate that the body requires as much as twenty-one times the amount of carotenoids to create the same amount of vitamin A as one part retinol. To add insult to injury, many people, especially those suffering from thyroid disorders and small children, are even poorer converters. A 2001 study found that the conversion rate of carotenoids to true vitamin A is so poor as to render it nutritionally insignificant.

Why You Won’t Get Vitamin A From Carrots
by Lauren Geertsen

The most important fact about vitamin A is the difference between retinoids and carotenoids. The vitamin A from animal sources is retinoids, also called retinol, while plant source vitamin A is carotenoids, such as beta carotene.

Retinol from animal sources is bioavailable, which means the body can utilize it. The vitamin A from plant sources, in contrast, must first be converted to retinol to be useful in the body. This poses two big problems.

First, when we are in pristine health, it requires at least six units of carotenes to convert into 1 unit of retinol (source). To put this in perspective, that means one must eat 4 1/2 pounds of carrots to potentially get the amount of useable A as in 3 oz. of beef liver (source). What happens if we have digestive issues, hormone imbalances, or other health problems? It requires even more units of carotene in the ratio.

Second, the carotene-to-retinol conversion is HIGHLY compromised. As a matter of fact, this conversion is negligible for many individuals. This conversion is virtually insignificant:

  • In infants
  • In those with poor thyroid function (hypothyroidism)
  • In those with diabetes
  • In those who are on a low fat diet or have a history of low fat dieting
  • In those who have compromised bile production (think: gallbladder and digestive issues) (source and source)

So, do you still think carrots are a vitamin A food? As with other orange veggies, sweet potatoes provide carotenes. Although beta carotene is an antioxidant, it is not true vitamin A. We must eat true vitamin A foods on a daily basis to meet our requirements for this essential nutrient.

Beef or Carrots
by Doug Garrison

This vitamin’s proper name, “retinol,” refers to its role in supporting vision.  Growing up we were told to eat our carrots for healthy eyes, especially to have night vision like cats!  Hmm, do cats eat carrots?  It is the “carotene” in carrots that our bodies can (with effort) convert into vitamin A.  The drawbacks to relying on carrots for your vitamin A:

  • We must use a biochemical reaction with bile salts and enzymes to convert the beta-carotene in carrots into vitamin A.
  • The conversion rate of beta-carotene to vitamin A in each person depends on many factors and ranges from 4 to 28 beta-carotene units to produce one unit of vitamin A.

For an adult male to meet the daily recommended intake for vitamin A, he would need to consume 2 pounds of baby carrots.  (Skipping the baby carrots, he could do one pound of regular carrots, for some reason baby carrots have half the beta-carotene.  Chlorine bath anyone?)  Don’t want to eat that many carrots?  How about 2.3 pounds of kale?  If you are like me, kind of lazy, I’ll opt for my vitamin A already formed in some beef liver.  Less than 1 ounce of beef liver will do the trick.

Still want to get your Vitamin A from carrots?  Boost your body’s conversion rate by eating carrots with animal fat, such as cooking carrots with a pasture-grazed beef roast!  In fact, we cannot convert the beta carotene found in plants without fat in our diet as a catalyst.

Vitamin A Deficiency: Health, Survival, and Vision
by Alfred Sommer, Keith P. West, James A. Olson, and A. Catharine Ross
p. 101

The ancient Egyptians and Greeks recognized nyctalopia and treated it with calf’s or goat’s liver (high in vitamin A content). By the nineteenth century, nightblindness was known to occur primarily among the poorer strata of society, particularly during periods of dietary deprivation; was exacerbated by photic stress (which bleached so much rhodopsin that synthesis could not keep up with demand, causing borderline deficiency to become manifest as nightblindness); and could be effectively treated with liver or liver oils. In fact, most other manifestations of xerophthalmia were first recognized by their association with nightblindness. Nightblindness (without evidence of xerophthalmia) was reported to have disabled Confederate soldiers between dawn and dusk, and to have affected whole regiments during the Crimean War.

Evolutionary Aspects of Nutrition and Health
edited by Artemis P. Simopoulos
p. 26
“Cereal Grains: Humanity’s Double-Edged Sword”
by Loren Cordain

Vitamin A deficiency remains one of the major public health nutritional problems in the third world [24]. Twenty to 40 million children worldwide are estimated to have at least mild vitamin A deficiency [25]. Vitamin A deficiency is a leading cause of xerophthalmia and blindness among children and also a major determinant of childhood morbidity and mortality [26]. In virtually all infectious diseases, vitamin A deficiency is known to result in greater frequency, severity, or mortality [27]. A recent meta-analysis [28] from 20 randomized controlled trials of vitamin A supplementation in third world children has shown a 30-38% reduction in all cause mortality in vitamin A-supplemented children. Analysis of cause-specific mortality showed vitamin A supplementation elicited a reduction in deaths from diarrheal disease by 39%, from respiratory disease by 70% and from all other causes of death by 34% [28]. Clearly, the displacement of beta-carotene-containing fruits and vegetables and vitamin A-containing foods (milk fat, egg yolks and organ meats) by excessive consumption of cereal grains plays a major role in the etiology of vitamin A deficiency in third world children.

Malnutrition and the Eye
by Donald McLaren
pp. 165-171

Few effective cures have been known so long to mankind as that of liver for night blindness. No doubt this was due in part to the dramatic nature of both the onset of the condition and of its relief. It is probable that the Ebers papyrus, written about 1600 B.C. in Egypt, referred to night blindness when it recommended liver for treatment of the eyes. A literal translation reads “Another [prescription] for the eyes: liver of ox roasted and pressed, give for it. Very excellent” (Drummond and Wilbraham, 1939). At about the same time the physicians in China were giving liver, dung of the flying fox, and tortoise shell for the cure of night blindness (Read, 1936). Hippocrates prescribed the whole liver of an ox dipped in honey and the therapeutic value of liver was also known to later Roman writers. It is believed that Celsus (25 B.C.-50 A. D.) first used the term xerophthalmia. […]

It would seem that night blindness was widespread in Europe in medieval times, for we find a 14th century poet in Holland, Jacob van Maerland, referring to the disease and its cure in this way (Bicknell and Prescott, 1953):

He who cannot see at night
Must eat the liver of the goat.
Then he can see all right.

[…] The relationship to poor general health and infectious disease was noted frequently in early accounts of xerophthalmia with special attention paid to intestinal disorders (Teuscher, 1867; de Gouvea, 1883). Baas (1894) described both night blindness and xerophthalmia in patients with liver disease and there have been many confirmatory accounts since. There is now reason to believe that impairment of dark adaptation in patients with disease of the liver may not always be due to deficiency of vitamin A […]

Although the cure for night blindness had been known since time immemorial, it was not until the last century that the dietary deficiency nature of the condition was recognized. […] With the turn of the century several further steps forward were taken in the understanding of the nature of the disease. Jensen (1903) was the first to show that xerophthalmia could be cured by an adequate diet and for this purpose used raw cow’s milk. It is interesting to note that he observed a rapid improvement on this regime not only as judged by the condition of the eyes and gain in weight but particularly by the disappearance of what he called the “characteristic psychic indifference.” This recognition of the profound systemic effects of vitamin A deficiency has not always persisted since this time and the high mortality attributable to the disease in its severest form has also been lost sight of at times.

In 1904 the important observation was made by Mori that the disease known as “hikan,” characterized by conjunctival xerosis and keratomalacia and widely prevalent among children aged 2-5 years in Japan, was most common in the children of people living largely on rice, barley and other cereals, beans, and vegetables. It did not occur among fisher folk, and cod liver oil, chicken liver, and eel fat were all effective remedies.

The association of xerophthalmia with an excessive intake of carbohydrate in the diet in infancy was recorded by Czerny and Keller (1906) in their classical monograph on the syndrome they termed Mehlnahrschaden. It is now recognized that this condition is identical in all basic features to what has been called “the most serious and widespread nutritional disorder known to medical and nutritional science” (Brock and Autret, 1952) and due in essence to a deficiency of protein and excess of carbohydrate in the diet. Many local and other names have been applied to this disease but it will be necessary here to use one, and that chosen, “kwashiorkor,” has found wide acceptance. Since Czerny’s day there has been a great number of other accounts in which ocular involvement has been described (McLaren, 1958), providing good evidence for the contention that a deficiency of vitamin A is the most common of all vitamin deficiencies associated with kwashiorkor.

Handbook of Nutrition, Diet, and the Eye
edited by Victor R. Preedy
p. 301
“Vitamin A, Zinc, Dark Adaptation, and Liver Disease”
by Winsome Abbot-Johnson and Paul Kerlin

McCollum and Davis (1912) found that ‘fat-soluble factor A’ was essential for growth in rats. The important connection between vitamin A deficiency and night blindness, however, was made by Frederica and Holm in 1925, who observed slower regeneration of visual purple in light-adapted vitamin A-deficient rats than in normal rats when put into the dark.

A relationship between night blindness and cirrhosis was reported by Haig et al. in 1938 and Patek and Haig in 1939. It was thought that these patients may be deficient in vitamin A and the deficiency state was not thought to be attributable to inadequate intake of the vitamin in their food. Impairments of dark adaptation (DA) included delayed rod cone break (time when rods become more sensitive to light than cones), higher intensity of light seen at 20 minutes (postexposure to a bright light), and higher intensity of light seen at final reading (elevated final rod thresholds). Nineteen of 24 patients demonstrated night blindness but none was aware of this on direct questioning.

The Vitamin A Story
by Richard D. Semba
p. 76

The piecemeal clinical picture of night blindness caused by vitamin A deficiency (see previous chapters) finally came together between 1896 and 1904, when Japanese physician Masamichi Mori described more than fifteen hundred children with hikan — that is, xerophthalmia [37]. Mori had studied medicine at the Mie Prefectural Medical School and Tokyo University and gone on to work in Germany and Switzerland before returning to Mie Prefecture to practice surgery. The children Mori described had night blindness, Bitot’s spots, corneal ulceration, keratomalacia, and diarrhea. The death rate among them was high. Most were between ages one and four and one-half, and many came from poor families living in mountainous regions, where the diet completely lacked milk and fish. Once under medical care, the children were given cod liver oil daily, and this proved to be an effective treatment for both the eye lesions and diarrhea. Contrary to the view of many physicians, Mori concluded that the disease was not infectious but rather was caused by the lack of fat in the diet.

Fat-Soluble Vitamins
edited by Peter J. Quinn and Valerian E. Kagan
p. 150
“Plasma Vitamins A and E in HIV-Positive Patients”
by Joel Constans, Evelyne Peuchant, Claire Sergent, and Claude Conri

During the last 10 years it has been demonstrated that vitamin A deficiency not only results in xerophthalmia and blindness, but also in mortality and susceptibility to infectious diseases (Lammer et al., 1985; Reddy et al., 1986). Treatment with massive intermittent dosages of vitamin A has resulted in a decrease in mortality in developing countries (Sommer et al., 1986). It has been suggested that vitamin A might have a positive effect on the immune system and that marginal deficiencies in vitamin A that are unable to give rise to xerophthalmia and blindness might impair immune defenses (Bates, 1995; Sommer et al., 1984). Deficiency in vitamin A clearly has effects on the immune system, and the number and function of natural killer cells were depressed in vitamin A-deficient rats (Bates, 1995). Supplementation with vitamin A (60 mg retinol equivalents) resulted in higher CD4/CD8 ratio, higher CD4 naive T cells, and lower proportions of CD8 and CD45 RO T cells in vitamin A-deficient children compared to placebo-treated children (Semba et al., 1993b). Vitamin A deficiency might also result in depression of humoral response to proteins and alterations of mucosal surfaces (Bates, 1995). Watson et al. (1988) reported that high vitamin A given to mice with retroviral infection increased survival and numbers of macrophages and total T lymphocytes.

Vitamin History: The Early Years
by Lee McDowell
pp. 61-66

IV. Xerophthalmia And Night Blindness History, From Antiquity To 14th Century

Vitamin A deficiency is one of the oldest recorded medical conditions, long recognized by its eye manifestations. For thousands of years humans and animals have suffered from vitamin A deficiency, typified by night blindness and xerophthalmia. The cause was unknown, but it was recognized from records and folklore of early civilizations that consumption of animal and fish livers had curative powers. It is interesting to find that knowledge of the cure is almost as old as medicine.

Night blindness and its successful treatment with animal liver was known to the ancient Egyptians (Fig. 3.4). Eber’s Papyrus, an Egyptian medical treatise dating to between 1520 and 1600 B.C., recommends eating roast ox liver, or the liver of black cocks, to cure it (Aykroyd, 1958). Wolf (1978) notes that a more careful evaluation of ancient Egyptian writings (Eber’s Papyrus no. 351) reveals that the therapy consisted of topical application of vitamin A rich liver juice to the eyes. Wolf (1978) suggested that with the topical application some of the liver oil must enter the lacrimal duct and thereby reach the throat via the nose. Therefore, the vitamin A could enter the body despite its topical application.

Vitamin deficiency diseases in China such as xerophthalmia and night blindness had been very prevalent from olden times (Lee, 1940). For preventing night blindness the Chinese in 1500 B.C. were giving liver, honey, flying fox dung and tortoise shell, all of which would have cured night blindness (Bicknell and Prescott, 1955).

The term “sparrow eyed” was used for a man who could see in daytime but not at twilight. The sparrow also has vision problems at night. Even though the ancient Chinese did not know the real cause of night blindness, they knew it was caused by a nutritional disturbance. A report from the 7th century notes that pig’s liver could cure night blindness.

In the Bible book “Tobit,” blindness apparently due to vitamin A deficiency is described. The setting of the story is the latter part of the 8th century B.C. in the Assyrian capital of Nineveh where the people of Northern Israel had been taken captive. In this book God sends the angel Raphael who tells Tobit’s son to rub the eyes of Tobit with fish bile. After this Tobit was no longer blind (Grabman, 1973). Around 600 B.C. an early reference to vitamin A deficiency in livestock is in the Bible (Jeremiah 14:6, King James version): “and the asses did stand in high places, their eyes did fail, because there was no grass.”

Evaluations of medicine of the Assyrian-Babylonian empires (900-400 B.C.) report eye diseases or conditions (Krause, 1934). The Babylonian word sin-lurma (night blindness) was described as “a man can see everything by day but can see nothing by night.” The prescription for cure was to use a concoction whose major ingredient was “liver of an ass.” The procedure was for a priest to say to the person with night blindness “receive, o dim of the eye”. Next the liver-based potion was applied to the eyes. Also for xerophthalmia a type of prescription that would provide vitamin A was “thou shalt disembowel a yellow frog, mix its gall in curd, apply to eyes”.

The old Greek, Roman and Arab physicians recommended an internal and external therapy with livers of goats to overcome night blindness. The Greek Hippocrates, who lived 460-325 B.C., recognized night blindness and recommended eating raw ox liver dipped in honey as a cure (Littre, 1861). The notation was to “eat, once or twice, as big an ox-liver as possible, raw, and dipped in honey”. To eat a whole ox liver seems a superhuman feat, even when we reflect that the ox of antiquity was a much smaller creature than that of today (Figure 3.4).

Mani in 1953 reviewed the Greek therapy for night blindness (cited by Wolf, 1978). A precise definition of night blindness is not found until after the time of Hippocrates. Galen (130 to 200 AD) describes patients who are “blind at night” and Oribasius (325 AD) defines night blindness as “vision is good during the day and declines at sundown, one cannot distinguish anything any longer at night”. Galen recommends the cure for night blindness as “continuous eating of roasted or boiled liver of goats”. He also suggests, as did the Egyptians, a topical treatment: “the juice of the roasted liver should be painted on the eyes”.

Xerophthalmia had been known as hikan in Japan since antiquity (Mori, 1904). Mori stated that hikan was common among people who subsisted in great measure on rice, barley and other cereals, beans and other vegetables, whereas it did not occur among fisher folk. He not only recognized the entire sequence of ocular changes in xerophthalmia, but also the central role of dietary deficiency of fats (particularly fish liver oils) resulting from either a faulty diet or faulty absorption, the role of diarrhea, kwashiorkor (protein deficiency) and other contributory and precipitating events. Mori reports that administration of cod liver oil was followed by speedy relief from the disorder and that chicken livers and eel fat were effective remedies also. He incorrectly concluded that deficiency of fat in the diet was the cause of the disease.

V. Xerophthalmia And Night Blindness History, 1300-1900

Liver was the most widely used cure for night blindness. Jacob van Maerland, a Dutch poet of the 14th century, concluded: “He, who cannot see at night, must eat the liver of the goat. Then he can see all right” (Bicknell and Prescott, 1955).

Guillemeau in France in the 16th century clearly described night blindness and advised liver for its cure (Bicknell and Prescott, 1955), which was also advised by other writers during this time.

The first mention of liver for the eyes in England was in Muffett’s “Health’s Improvement” (1655), though Bayly, at one time Queen Elizabeth’s physician, in his book on eyes recommends “rawe herbes” among which is “eie bright” (Drummond, 1920). The only evidence of night blindness being common at this time is references to mists and films over the eyes. “Rawe herbes” would of course provide provitamin A (Bicknell and Prescott, 1955).

In 1657 Hofer expressed the view that night blindness is caused by malnutrition, a thought reintroduced nearly 100 years later in 1754 by von Bergen (Rosenberg, 1942). Von Bergen also speculated that night blindness might be due to excessive exposure to sunlight. This would later be confirmed as light increases the need for regeneration of retinal visual pigment. In a review of literature Hicks (1867) noted that night blindness was noticed by Baron Young in Napoleon’s Egyptian campaign in 1798.

At one time, night blindness was a typical disease of seafarers, due to the lack of fresh food. Aykroyd (1944) in his accounts of Newfoundland and Labrador fishermen noted they not only recognized how bright sunlight may bring on night blindness, but also used liver, preferably the raw liver of a gull or puffin for a cure. In rural communities inability to see in the dusk is a very serious condition; fishermen, for instance, may walk off the rocks into the sea after landing in the evening. Night blindness can be cured, often in 12 hours, by eating food rich in vitamin A, such as liver. In addition to eating liver, patients with night blindness were recommended to hold their heads over steam rising from the roasting liver. The dramatic quickness both of the onset and cure explains why liver has been used for centuries for prevention and cure of night blindness (Bicknell and Prescott, 1955).

Liver was also used to control night blindness in other regions of the world, such as central Africa. Medicine men in Ruanda-Urundi prescribed chicken liver to cure night blindness (Tseng Lui and Roels, 1980). The origins and original date of implementation of their therapy are unknown.

During the Lewis and Clark expedition (1804-1806) to open up the far west in the United States, there were a number of men who developed severe eye troubles toward the end of the trip. These men had lived for long periods upon buffalo, dog and horsemeat; the muscle of these animals is low in fat-soluble vitamins (McCay, 1973).

Magendie (1816) in France studied lack of protein in dog diets by feeding the animals sugar and water. Magendie noted that during the third week, dogs were thin, lost liveliness, had decreased appetite and developed small ulceration in the center of the transparent cornea. The ulcer appeared first on one eye and then the other; it increased rapidly and at the end of a few days was more than two millimeters in diameter and in depth. Soon the cornea was entirely pierced and the fluid of the eye was flowing out. An abundant secretion of the glands of the eyelids accompanied this singular phenomenon. Magendie repeated this experiment twice more with identical results.

Although xerophthalmia had been known in ancient Egypt, Magendie appears to be the first to experimentally produce the condition. Not only did Magendie record the production of xerophthalmia in animals, he recognized the analogous conditions in man as a result of a restricted diet. In his report Magendie noted an experiment by an English doctor named Stark. Stark lived on an exclusive diet of sugar for one month. In his eyes appeared livid red spots, which seemed to announce the approach of an ulcer (xerophthalmia). C.M. McCay in 1930 suggested that Magendie was the father of the vitamin hypothesis.

In a journey from Calcutta to Bombay, India in 1824-1825 vitamin A deficiency was observed (Aykroyd, 1944). Individuals would sometimes describe their condition as being “night blind.” By the mid-1800s, xerophthalmia was recognized in many areas of Europe, particularly in Russia during the long Lenten fasts (Blessig, 1866), the United States (Hicks, 1867) and elsewhere around the world (Sommer and West, 1996). Hubbenet (1860) reports night blindness in the Crimean War (1853-1856).

Inflammation of the cornea and conjunctiva associated with night blindness had been ascribed to “defective nutriment” by Budd in 1842. Wilde (1851) made similar conclusions during the Ireland famine in 1851. For treatment he recommended cod-liver oil.

In 1857 David Livingstone, a medical missionary in Africa, described eye effects in his native carriers when they were forced by circumstances to subsist for a time on coffee, manioc and meal (Livingston, 1905). He noted that the eyes became affected (e.g. ulceration of the cornea) as they did in animals receiving starch. He was probably referring to the dog study of Magendie in 1816.

Hicks (1867), a doctor in the Confederate army, described night blindness in the U.S. Civil War (1861-1865). The disease was found in the Army of Northern Virginia so extensively as to resemble an epidemic. Soldiers attributed it to the “effect of the moon-light falling upon their eyes while sleeping upon the ground.” The soldier, who had marched all day without problems, would complain of blindness upon the approach of early twilight, and make immediate application for transportation in an ambulance. At such times he would be found blundering along just like a blind man, holding on to the arm of his companion. For those with night blindness being examined at night by candle-light, the pupil was found dilated and unresponsive to the stimulus of this light.

To overcome night blindness, extreme treatments such as cupping, leeching, blistering, iron, mercury and potash were used extensively, but most often did more harm than good (Hicks, 1867). Cases frequently recovered spontaneously after all treatments had been abandoned. Hicks (1867) observed that a furlough from the army was most beneficial to cure night blindness. It was noted that poverty, filth and the absence of vegetables were associated with night blindness. The disease was found to be most prevalent when symptoms of scurvy were also observed. Vegetables were observed to be of benefit for both scurvy and night blindness.

In 1883 De Gouvea described night blindness in poorly nourished slaves in Brazil. He noted that the slaves were unable to see when returning from work after sunset, but could see well when starting for work before sunrise. Their food was beans, pork fat and corn meal. Exposure to sunlight was suspected of inducing night blindness, and resting the eyes at night was believed to result in recovery.

In the 1800’s a number of investigators related malnutrition to night blindness and more clearly described eye problems as related to malnutrition. The researchers, cited by Rosenberg (1942) and Jayle et al. (1959), included Bamfield (1814), Schutte (1824), Foerster (1858), Von Graefe (1859), Hubbenet (1860), Netter (1863), Bitot (1863), Toporow (1885), Kubli (1887) and Parinaud (1881). Bitot’s name is widely known in conjunction with his observation on conjunctival and corneal changes in vitamin A deficiency. Hubbenet (1860) observed children in a French orphanage and described the progression of xerophthalmia from night blindness through conjunctival and corneal involvement, attributing it to a faulty diet. Toporow (1885) called attention to the importance of fats and Schütte (1824) and Kubli (1887) to that of cod liver oil in prevention of night blindness (cited by Jayle et al., 1959; Loosli, 1991). Parinaud (1881) opened a new era in this field by connecting night blindness with a slowing down in the regeneration of retinal pigment.

VI. Relationship Of Cod Liver Oil To Eye Disease

In the early years of recorded history, liver was the principal food used in a number of societies to control night blindness. The discovery of vitamins A and D was closely related to studies with cod liver oil. This oil has been used since very early times by the Greenlanders, Laplanders and Eskimos (Loosli, 1991). Harris (1955) reviewed the early medical uses of cod liver oil. The earliest medical record was for treatment of rickets in 1766 in Manchester, England. In 1848 a treatise on cod-liver oil, published in Edinburgh, describes how xerophthalmia may be cured with the oil. In 1874 Dusart in a Paris research bulletin noted that his common treatments were wine, quinine, cod liver oil and body massage. The beneficial effect of cod liver oil in the treatment of rickets, osteomalacia, generalized malnourishment, and certain eye conditions was widely recognized by the middle of the 19th century.

Cod liver oil used to be treated in wooden barrels that had some small holes bored in the side. These holes were plugged with pegs. As the fishermen cleaned the cod for salting and drying, they threw the livers into these barrels. After the livers rotted, the oil was set free and rose to the top. This could then be taken off through the holes in the side of the barrel. This cod liver oil had many of the attributes of medicines of olden times, namely a dark brown color, an unpleasant odor and a nauseating flavor. Historically cod liver oil was used in the treatment of both eye disease and rickets.

In a particularly thoughtful and well-documented study published in 1881, Snell demonstrated that cod liver oil would cure both night blindness and Bitot’s spots. Within a decade, meat, milk, and cod liver oil were routinely administered for corneal ulceration and dissolution (keratomalacia). In 1904 Mori gave an elaborate account of large numbers of cases of xerophthalmia in Japan and how cod-liver oil cured the condition.

High vs Low Protein

P. D. Mangan tweeted a quote from a research paper, Reversal of epigenetic aging and immunosenescent trends in humans by Gregory M. Fahy et al. He stated that the “Most important sentence in aging reversal study” is the following: “Human longevity seems more consistently linked to insulin sensitivity than to IGF‐1 levels, and the effects of IGF‐1 on human longevity are confounded by its inverse proportionality to insulin sensitivity.” Mangan added that “This line agrees with what I wrote a while back” (How Carbohydrates and Not Protein Promote Aging); and in the comments section of that article, someone pointed to a supporting video by Dr. Benjamin Bikman (‘Insulin vs. Glucagon: The relevance of dietary protein’). Here is the context of the entire paragraph from the discussion section of the research paper:

“In this regard, it must be pointed out that GH and IGF‐1 can also have pro‐aging effects and that most gerontologists therefore favor reducing rather than increasing the levels of these factors (Longo et al., 2015). However, most past studies of aging and GH/IGF‐1 are confounded by the use of mutations that affect the developmental programming of aging, which is not necessarily relevant to nonmutant adults. For example, such mutations in mice alter the normal innervation of the hypothalamus during brain development and prevent the hypothalamic inflammation in the adult (Sadagurski et al., 2015). Hypothalamic inflammation may program adult body‐wide aging in nonmutants (Zhang et al., 2017), but it seems unlikely that lowering IGF‐1 in normal non‐mutant adults can provide the same protection. A second problem with past studies is a general failure to uncouple GH/IGF‐1 signaling from lifelong changes in insulin signaling. Human longevity seems more consistently linked to insulin sensitivity than to IGF‐1 levels, and the effects of IGF‐1 on human longevity are confounded by its inverse proportionality to insulin sensitivity (Vitale, Pellegrino, Vollery, & Hofland, 2019). We therefore believe our approach of increasing GH/IGF‐1 for a limited time in the more natural context of elevated DHEA while maximizing insulin sensitivity is justified, particularly in view of the positive role of GH and IGF‐1 in immune maintenance, the role of immune maintenance in the retardation of aging (Fabris et al., 1988), and our present results.”

In the Twitter thread, Командир Гиперкуба said, “So it is insulin [in]sensitivity than drives ageing rather than IGF‐1/GH. Huge if true.” And GuruAnaerobic added that, “I assume this isn’t IR per se, but IR in the presence of carbohydrate/excess food. IOW, the driver is environment.” Mangan then went on to point out that, “It explains the dichotomy of growth vs longevity, and why calorie restriction increases lifespan.” Mick Keith asked, “So drop carbs and sugar?go paleo style?” And Mangan answered, “There are other aspects to insulin sensitivity, but yes.” All of this cuts to the heart of a major issue in the low-carb community, an issue that I only partly and imperfectly understand. What I do get is that this has to do with the conclusions various experts come to about protein, whether higher amounts are fine or intake should be very limited. Some see insulin sensitivity as key while others prioritize IGF-1. The confounding requires careful understanding. In the comments section of Mangan’s above linked article, Rob H. summed it up well:

“Great post, very timely too as I believe this is an issue that seems to be polarising the science-based nutrition space at the moment. Personally I fall down on the same side as you Dennis – as per Ben Bikman’s video which has also been posted here, as well as the views of all the main protein researchers including Stuart Philips, Jose Antonio, Donald Layman, Gabrielle Lyon, Ted Naiman, Chris Masterjohn etc who all believe the science clearly supports a high protein intake eg 1.6 -2.2g/kilo of bodyweight – with no upper limit which has yet been observed. At the same time, I have just been reading the new book by Dr Steven Gundry ‘The Longevity Paradox’. Has anyone read this one yet? Whilst about 90% of the content is fairly solid stuff (although nothing that hasn’t already been written about here) he aggressively supports Longo’s view that we should only consume 0.37g protein/ kilo of bodyweight, eg around 25g of protein/ day for most males. Also that animal protein should be avoided wherever possible. Personally I consume double that amount of protein at each meal! It appears that Longo, Gundry, Dr Ron Rosedale and Dr Mercola are all aligned in a very anti-animal protein stance, but also believe their view is backed by science – although the science quoted in Gundry’s book seems to be largely based on epidemiology. Both sides can’t be right here, so I hope more research is done in this field to shut this debate down – personally I feel that advising ageing males to consume only 25g of protein a day is extremely irresponsible.”

In response, Mangan wrote, “I agree that is irresponsible. Recently Jason Fung and James DiNicolantonio jumped on the anti animal protein bandwagon. My article above is my attempt (successful, I hope) to show why that’s wrong.” Following that, Rob added, “Humans have been consuming animal proteins for most or all of our evolutionary history. And certainly, large quantities of animal protein were consumed at times (as when a kill of a large animal was made). So, I cannot imagine that the “evidence” supporting an anti-animal protein stance can be solid or even science-based. This sounds like a case of certain researchers trying their best to find support for their pre-determined dietary beliefs (vegan proponents do this all the time). I’m not buying it.” It’s very much an ongoing debate.
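The gap between the two camps can be made concrete with a little arithmetic on the g/kg figures quoted in Rob H.’s comment. Here is a minimal sketch, assuming a hypothetical 70 kg adult (the body weight is my own example, not a figure from the discussion):

```python
# Rough per-day protein totals implied by the g/kg recommendations quoted above.
# The 70 kg body weight is a hypothetical example, not taken from the discussion.

def daily_protein_grams(body_weight_kg: float, grams_per_kg: float) -> float:
    """Convert a g/kg recommendation into a total daily intake in grams."""
    return body_weight_kg * grams_per_kg

weight = 70.0  # kg, hypothetical adult male

# High-protein camp (Phillips, Layman, Naiman, etc.): 1.6-2.2 g/kg
high_low = daily_protein_grams(weight, 1.6)    # ~112 g/day
high_high = daily_protein_grams(weight, 2.2)   # ~154 g/day

# Restricted figure attributed to Longo/Gundry in the comment: 0.37 g/kg
restricted = daily_protein_grams(weight, 0.37)  # ~26 g/day

print(f"High-protein range: {high_low:.0f}-{high_high:.0f} g/day")
print(f"Restricted figure:  {restricted:.0f} g/day")
```

At that body weight the restricted figure comes out to roughly a quarter of even the lower bound of the high-protein range, which is consistent with the “around 25g of protein/ day for most males” Rob H. cites.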

I have suspicions about the point of confusion that originated this disagreement. Fear of promoting too much growth through protein is basically the old Galenic argument based on humoral physiology. The belief was that too much meat, as a stimulating/nurturing substance, built up the ‘blood’ with too much heat and dryness, which would burn up the body and cause a shortened lifespan. This culturally inherited bias about meat has since been fancied up with scientific language. But ancient philosophy is not the best source for formulating modern scientific theory. Let me bring this back to insulin sensitivity and insulin resistance, which appear to play the determining role. Insulin is a hormone and so we must understand this from an endocrinological approach, quite different from Galenic-style fears about meat as filtered through the Christian theology of the Middle Ages.

Hormones are part of a complex hormonal system going far beyond macronutrients in the diet, although it does appear that the macronutrient profile is a major factor. Harry Serpano, in a discussion with Bart Kay, said that: “In a low insulin state, when you’re heavy meat and fat and your insulin is at 1.3, as Dr. Paul Mangan has actually shown in one of his videos, it’s quite clear; and in what I’m showing in one of the studies, it’s quite clear. It’s so close to basically fasting which is 0.8 — it’s very low. You’re not going to be pushing up these growth pathways like mTOR or IGF-1 in any significant way.” Like with so much else, there is strong evidence that what we need to be worrying about is insulin, specifically on a high-carb diet that causes insulin resistance and metabolic syndrome. That is what is guaranteed to severely decrease longevity.

This question about too much protein recently came up in my own thoughts while reading Dr. Steven Gundry’s new book, The Longevity Paradox. As mentioned above, he makes a case against too much animal protein. But it sounds like there is more information to be considered in the effect on health, growth, and longevity. In a dialogue with Gundry, Dr. Paul Saladino defended meat consumption (Gundry’s Plant Paradox and Saladino’s Carnivory). What Mangan has added to this debate strengthens this position.

* * *

In one of the above quoted comments, Rob H. mentions that Dr. Joseph Mercola is one of those “aligned in a very anti-animal protein stance, but also believe their view is backed by science.” It’s interesting that I’m just now listening to a discussion between Mercola and Siim Land. They met at a conference and got to talking. Mercola then read Land’s book, Metabolic Autophagy. Land is more in the camp supporting the value of protein. His view is nuanced and the debate isn’t entirely polarized. The role protein plays in health depends on the health outcomes being sought and the health conditions under which protein is being eaten: amounts, regularity of meals, assimilation, etc. It’s about how one’s body is able to use protein and to what end.

Right at the beginning of their talk, Mercola states that he is impressed by Land’s knowledge and persuaded by his view on protein. Land makes the simple point that one doesn’t want to be in autophagy all the time but to cycle between periods of growth and periods of autophagy. Too much protein restriction, especially all the time, is not a good thing. Mercola seems to have come around to this view. So, it’s a shifting debate. There is a lot of research and new studies are coming out all the time. But obviously, context is important in making any statement about protein in the diet. Maybe Saladino will similarly bring Gundry on board with greater protein being a good thing for certain purposes or maybe come to a middle ground. These dialogues are helpful, in particular for an outsider like me who is listening in.

* * *

On a personal note, I’m not sure I take a strong position either way. But I’ve long been persuaded by Siim Land’s view. It feels more moderate and balanced. The opposite side can sound too fear-mongering about protein, not seeming to allow for differences in contexts and conditions. From a low-carb perspective, one has to replace carbs with something and that means either protein or fat, and one can only consume so much fat. Besides, proteins really are important for anabolism and activating mTOR, for building of the body. Maybe if you’re trying to lose weight or simply maintaining where you’re at with no concern for healing or developing muscle then protein would play less of a role. I don’t know.

Traditional societies don’t seem to worry about protein amounts. When they have access to it, they eat it, at times even to the point of their bellies distending. And when not, they don’t. Those populations with greater access don’t appear to suffer any harm from greater protein intake. Then again, these traditional societies tend to do a lot of strenuous physical activity. They also usually mix it up with regular fasting, intermittent and extended. I’m not sure how optimal protein levels may differ depending on lifestyle. Still, I’d think that the same basic biological truths would apply to all populations. For most people in most situations, increased protein will be helpful at least some of the time and maybe most of the time. Other than fasting, I’m not sure why one needs to worry about it. And with fasting, protein restriction happens naturally.

So, maybe eat protein to satiation. Then throw in some fasting. You’ll probably be fine. There doesn’t seem to be anything to be overly concerned about, based on what evidence I’ve seen so far.

Hubris of Nutritionism

There is a fundamental disagreement over diets. It is about one’s philosophical position on humanity and the world, about the kind of society one aspires to. Before getting to nutritionism, let me explain my present understanding that has developed from what I’ve learned. It’s all quite fascinating. There is a deeper reason why, for example, I see vegetarianism as potentially healthy but not veganism (see debate in comments section of my recent post A Fun Experiment), and that distinction will be central in my following argument. There have been some, not many, traditional societies that were vegetarian or rather semi-vegetarian for millennia (e.g., India; see specific comment in the above linked post), but veganism didn’t exist until the Seventh Day Adventists invented it in the late 19th century. Few people know this history. It’s not exactly something most vegan advocates, other than Adventists themselves, would want to mention.

Veganism was a modernization of the ancient Greek Galenic theory of humors, which had originally been incorporated into mainstream Christian thought during feudalism, especially within the monastic tradition of abstinence and self-denial but also applied to the population at large through food laws. A particular Galenic argument is that, by limiting red meat and increasing plant foods, there would be a suppression or weakening of libido/virility as hot-bloodedness that otherwise threatens to ‘burn’ up the individual. (The outline of this ideology remains within present dietary thought in the warning that too much animal protein will up-regulate mTOR and over-activate IGF-1 which, as it is asserted, will shorten lifespan. Many experts, such as Dr. Steven Gundry in The Longevity Paradox and biological anthropologist Stephen Le in 100 Million Years of Food, have been parroting Galenic thought without any awareness of the origin of the ideas they espouse. See my posts High vs Low Protein and Low-Carb Diets On The Rise.) Also, it was believed this Galenic strategy would help control problematic behaviors like rowdiness, the reason red meat sometimes was banned in the Middle Ages prior to Carnival (about dietary systems as behavioral manipulation and social control, see Food and Faith in Christian Culture ed. by Ken Albala and Trudy Eden and some commentary about that book at my posts Western Individuality Before the Enlightenment Age and The Crisis of Identity; for similar discussion, also check out The Agricultural Mind, “Yes, tea banished the fairies.”, Autism and the Upper Crust, and Diets and Systems). For the purposes of Christian societies, this has been theologically reinterpreted and reframed. Consider the attempt to protect against the moral sin of masturbation as part of the Adventist moral reform, such that modern cereal was originally formulated specifically for an anti-masturbation campaign — the Breakfast of Champions!

High protein vs low protein is an old conflict, specifically in terms of animal meat and even more specifically as red meat. It’s more of a philosophical or theological disagreement than a scientific debate. The anti-meat argument would never hold such a central position in modern dietary thought if not for the influence of heavily Christianized American culture. It’s part of Christian theology in general. Gary Taubes discusses how dieting gets portrayed as the sins of gluttony and sloth: “Of all the dangerous ideas that health officials could have embraced while trying to understand why we get fat, they would have been hard-pressed to find one ultimately more damaging than calories-in/calories-out. That it reinforces what appears to be so obvious – obesity as the penalty for gluttony and sloth – is what makes it so alluring. But it’s misleading and misconceived on so many levels that it’s hard to imagine how it survived unscathed and virtually unchallenged for the last fifty years” (Why We Get Fat). Read mainstream dietary advice and you’ll quickly hear this morality-drenched worldview of fallen humanity and Adam’s sinful body. This goes along with the idea of “no pain, no gain” (an ideology I came to question in seeing how simple and easy low-carb diets are, specifically with how ketosis eliminates endless hunger and cravings while making fat melt away with little effort, not to mention how my decades of drug-resistant and suicidally-prone depression also disappeared, something many others have experienced; so it turns out that for many people great gain can be had with no pain at all). The belief has been that we must suffer and struggle to attain goodness (with physical goodness being an outward sign of moral goodness), such that the weak flesh of the mortal frame must be punished with bodily mortification (i.e., dieting and exercise) to rid it of its inborn sinful nature.
Eating meat is a pleasurable temptation in nurturing the ‘fallen’ body and so it must be morally wrong. This Christian theology has become so buried in our collective psyche, even in science itself, that we no longer are able to recognize it for what it is. And because of historical amnesia, we are unaware of where these mind viruses come from.

It’s not only that veganism is a modern ideology in a temporal sense, as a product of post-Enlightenment fundamentalist theology and its secularization. More importantly, it is a broader expression of modern ways of thinking and perceiving, of being in and relating to the world, including but far from limited to how it modernizes and repurposes ancient philosophy (Galen wasn’t advocating veganism, religious or secularized, that is for sure). Besides the crappy Standard American Diet (SAD), veganism is the only other diet entirely dependent on industrialization by way of chemical-laden monoculture, high-tech food processing, and global trade networks — and hence enmeshed in the web of big ag, big food, big oil, and big gov (all of this, veganism and the industrialization that made it possible, surely was far beyond Galen’s imagination). To embrace veganism, no matter how well-intentioned, is to be fully complicit in modernity and all that goes with it — not that it makes individual vegans bad people, as to varying degrees all of us are complicit in this world we are born into. Still, veganism stands out for, within that ideological framework, there is no other choice outside of modern industrialization.

At the heart of veganism is a techno-utopian vision and technocratic impulse. It’s part of the push for a plant-based diet that began with the Seventh Day Adventists, most infamously Dr. John Harvey Kellogg, who formed the foundation of modern American nutritional research and dietary recommendations (see the research of Belinda Fettke who made this connection: Ellen G White and Medical Evangelism, Thou Shalt not discuss Nutrition ‘Science’ without understanding its driving force, and Lifestyle Medicine … where did the meat go?). I don’t say this to be mean or dismissive of vegans. If one insists on being a vegan, there are better ways to do it. But it will never be an optimal diet, neither for the individual nor for the environment (and, yes, industrial agriculture does kill large numbers of animals, whether or not the vegan has to see it in the grocery store or on their plate; see my post Carnivore Is Vegan: if veganism is defined by harming and killing the fewest lives, if veganism is dependent on industrialization that harms and kills large numbers of lives, and if potentially carnivore is the least dependent on said industrialization, then we are forced to come to the conclusion that, by definition, “carnivore is vegan”). Still, if vegans insist, they should be informed and honest in embracing industrialization as a strength, rather than hiding it as a weakness, in overtly arguing for techno-utopian and technocratic solutions in the Enlightenment fashion of Whiggish progressivism. Otherwise, this unacknowledged shadow side of veganism remains an Achilles’ heel that eventually will take down veganism as a movement when the truth is finally revealed and becomes public knowledge. I don’t care if veganism continues in its influence, but if vegans care about advocating their moral vision they had better do some soul-searching about what exactly they are advocating, for what reason, and to what end.

Veganism is not limited to being unique as the only specific diet that is fully industrialized (SAD isn’t comparable because it isn’t a specific diet, since one could argue that veganism as an industrialized diet is one variety of SAD). More importantly, what makes veganism unique is its ethical impetus. That is how it originated within the righteously moralizing theology of Adventism (to understand the moral panic of that era, read my post The Crisis of Identity). The Adventist Ellen G. White’s divine visions from God preceded the health arguments. And even those later health arguments within Adventism were predicated upon a moralistic hypothesis of human nature and reality, that is to say theology. Veganism has maintained the essence of that theology of moral health, even though the dietary ideology was quickly sanitized and secularized. Adventists like Dr. Kellogg realized that this new kind of plant-based diet would not spread unless it was made to seem natural and scientific, a common strategy of fundamentalist apologetics such as pseudo-scientific Creationism (I consider this theologically-oriented rhetoric to be a false framing; for damn sure, veganism is not more natural since it is one of the least natural diets humanity has ever attempted). So, although the theology lost its emphasis, one can still sense this religious-like motivation and righteous zeal that remains at the heart of veganism, more than a mere diet but an entire social movement and political force.

Let’s return to the health angle and finally bring in nutritionism. The only way a vegan diet is possible at all is through the industrial agriculture that eliminated the traditional farming practices, including an entire lifestyle as part of farming communities, that were heavily dependent on animal husbandry and pasturage (similar to how fundamentalist religion such as Adventism is also a product of modernity, an argument made by Karen Armstrong; modern fundamentalism is opposed to traditional religion in the way that, as Corey Robin explains, reactionary conservatism is opposed to the ancien regime it attacked and replaced). This is the industrial agriculture that mass produces plant foods through monoculture and chemicals (that, by the way, destroys ecosystems and kills the soil). And on top of that, vegans would quickly die of malnutrition if not for the industrial production of supplements and fortified foods to compensate for the immense deficiencies of their diet. This is based on an ideology of nutritionism, that as clever apes we can outsmart nature, that humanity is separate from and above nature — this is the main point I’m making here, that veganism is unnatural to the human condition formed under millions of years of hominid evolution. This isn’t necessarily a criticism from a Christian perspective, since it is believed that the human soul ultimately isn’t at home in this world, but it is problematic when this theology is secularized and turned into pseudo-scientific dogma. This further disconnects us from the natural world and from our own human nature. Hence, veganism is very much a product of modernity and all of its schisms and dissociations, very much seen in American society of the past century or so. Of course, the Adventists want the human soul to be disconnected from the natural world and saved from the fallen nature of Adam’s sin. As for the rest of us who aren’t Adventists, we might have a different view on the matter.
This is definitely something atheist or pagan vegans should seriously consider and deeply contemplate. We should all think about how the plant-based and anti-meat argument has come to dominate mainstream thought. Will veganism and industrialization save us? Is that what we want to put our faith in? Is that faith scientifically justified?

It’s not that I’m against plant-based diets in general. I’ve been vegetarian. And when I was doing a paleo diet, I ate more vegetables than I had ever done in my life, far more than most vegetarians. I’m not against plants themselves based on some strange principle. It’s specifically veganism that I’m concerned about. Unlike vegetarianism, there is no way to do veganism with traditional, sustainable, and restorative farming practices. Vegetarianism, omnivory, and carnivory are all fully compatible with eliminating industrial agriculture, including factory farming. That is not the case with veganism, a diet that is unique in its place in the modern world. Not all plant-based diets are the same. Veganism is entirely different from plant-heavy diets such as vegetarianism and paleo that also allow animal foods (also, consider the fact that any diet other than carnivore is “plant-based”, a somewhat meaningless label). That is no small point since plant foods are limited in seasonality in all parts of the world, whereas most animal foods are not. If a vegetarian wanted, they could live fairly far north and avoid out-of-season plant foods shipped in from other countries simply by eating lots of eggs and dairy (maybe combined with very small amounts of what few locally-grown plant foods were traditionally and pre-industrially stored over winter: nuts, apples, fermented vegetables, etc; or maybe not even that since, technically, a ‘vegetarian’ diet could be ‘carnivore’ in only eating eggs and dairy). A vegetarian could be fully locavore. A vegan could not, at least not in any Western country, although a vegan near the equator might be able to pull off a locavore diet as long as they could rely upon local industrial agriculture, which at least would eliminate the harm from mass transportation, but it still would be an industrial-based diet with all the problems, including mass suffering and death, that entails.

Veganism in entirely excluding animal foods (and excluding insect foods such as honey) does not allow this option of a fully natural way of eating, both local and seasonal without any industrialization. Even in warmer climes amidst lush foliage, a vegan diet was never possible and never practiced prior to industrialization. Traditional communities, surrounded by plant foods or not, have always found it necessary to include animal and insect foods to survive and thrive. Hunter-gatherers living in the middle of dense jungles (e.g., Piraha) typically get most of their calories from animal foods, as long as they maintain access to their traditional hunting grounds and fishing waters, and as long as poaching and environmental destruction or else hunting laws haven’t disrupted their traditional foodways. The closest to a more fully plant-based diet among traditional people was found among Hindus in India, but even there they unintentionally (prior to chemical insecticides) included insects and insect eggs in their plant foods while intentionally allowing individuals during fertile phases of life to eat meat. So, even traditional (i.e., pre-industrial) Hindus weren’t entirely and strictly vegetarian, much less vegan (see my comment at my post A Fun Experiment), but still high quality eggs and dairy can go a long way toward nourishment, as many healthy traditional societies included such foods, especially dairy from pasture-raised animals (consider Weston A. Price’s early 20th century research of healthy traditional communities; see my post Health From Generation To Generation).

Anyway, one basic point is that a plant-based diet is not necessarily and always identical to veganism, in that other plant-based diets exist with various forms of animal foods. This is a distinction many vegan advocates want to confound in muddying the water of public debate. In discussing the just released documentary The Game Changers, Paul Kita writes that it “repeatedly pits a vegan diet against a diet that includes meat. The film does this to such an extent that you slowly realize that ‘plant-based’ is just a masquerade for ‘vegan.’ Either you eat animal products and suffer the consequences or avoid animal products and thrive, the movie argues.” (This New Documentary Says Meat Will Kill You. Here’s Why It’s Wrong.). That is a false dichotomy, a forced choice pushed by an ideological agenda. Kita makes a simple point that challenges this entire frame: “Except that there’s another choice: Eat more vegetables.” Or simply eat fewer foods that have been industrially grown, industrially processed, and/or industrially transported — basically, don’t eat heavily processed crap, from either meat or plants (specifically refined starches, added sugar, and vegetable oils), but also don’t eat the unhealthy (toxic and nutrient-depleted) produce of industrial agriculture, that is to say make sure to eat locally and in season. But that advice also translates as: Don’t be vegan. That isn’t the message vegan advocates want you to hear.

Dietary ideologies embody social, political, and economic ideologies, sometimes as all-encompassing cultural worldviews. They can shape our sense of identity and reality, what we perceive as true, what we believe is desirable, and what we imagine is possible. It goes further than that, in fact. Diets can alter our neurocognitive development and so potentially alter the way we think and feel. This is one way mind viruses could quite literally parasitize our brains and come to dominate a society, which I’d argue is what has brought our own society to this point of mass self-harm through dietary dogma of pseudo-scientific “plant-based” claims of health (with possibly hundreds of millions of people who have been harmed and had their lives cut short). A diet is never merely a diet. And we are all prone to getting trapped in ideological systems. My criticisms of veganism as a diet don’t make vegans as individuals bad people. And I don’t wish them any ill will, much less failure in their dietary health. But I entirely oppose the ideological worldview and social order that, with conscious intention or not, they are promoting. I have a strong suspicion that the world that vegans are helping to create is not a world I want to live in. It is not their beautiful liberal dream that I criticize and worry about. I’m just not so sure that the reality will turn out to be all that wonderful. So far, the plant-based agenda doesn’t seem to be working out all that well. Americans eat more whole grains and legumes, vegetables and fruits than at any time since data has been kept, and yet the health epidemic continues to worsen (see my post Malnourished Americans). It was never rational to blame public health concerns on meat and animal fat.

Maybe I’m wrong about veganism and the ultimate outcome of their helping to shape the modern world. Maybe technological innovation and progress will transform and revolutionize industrial agriculture and food processing, the neoliberal trade system and capitalist market in a beneficial way for all involved, for the health and healing of individuals and the whole world. Maybe… but I’m not feeling confident enough to bet the fate of future generations on what, to me, seems like a flimsy promise of vegan idealism borne out of divine visions and theological faith. More simply, veganism doesn’t seem all that healthy on the most basic of levels. No diet that doesn’t support health for the individual will support health for society, as society is built on the functioning of humans. That is the crux of the matter. To return to nutritionism, that is the foundation of veganism — the argument that, in spite of all of the deficiencies of veganism and other varieties of the modern industrial diet, we can simply supplement and fortify the needed nutrients and all will be well. To my mind, that seems like an immense leap of faith. Adding some nutrients back into a nutrient-depleted diet is better than nothing, but comes nowhere close to the nutrition of traditional whole foods. If we have to supplement the deficiencies of a diet, that diet remains deficient and we are merely covering up the worst aspects of it, what we are able to most obviously observe and measure. Still, even with those added vitamins, minerals, cofactors, etc, it doesn’t follow that the body is getting all that it needs for optimal health. In traditional whole foods, there are potentially hundreds or thousands of compounds, most of which have barely been researched or not researched at all. There are certain health conditions that require specific supplements. Sure, use them when necessary, as we are not living under optimal conditions of health in general. 
But when anyone and everyone on a particular diet is forced to supplement to avoid serious health decline as is the case with veganism, there is a serious problem with that diet.

It’s not exactly that I disagree with the possible solution vegans are offering to this problem, as I remain open to future innovative progress. I’m not a nostalgic reactionary and romantic revisionist seeking to turn back the clock to re-create a past that never existed. I’m not, as William F. Buckley Jr. put it, “someone who stands athwart history, yelling Stop”. Change is great — I have nothing against it. And I’m all for experimenting. That’s not where I diverge from the “plant-based” vision of humanity’s salvation. Generally speaking, vegans simply ignore the problem I’ve detailed or pretend it doesn’t exist. They believe that such limitations don’t apply to them. That is a very modern attitude coming from a radically modern diet, and the end result would be revolutionary in remaking humanity, a complete overturning of what came before. This isn’t about being obsessed with the past or believing we are limited to evolutionary conditions and historical precedent. But ignoring the past is folly. Our collective amnesia about the traditional world keeps getting us into trouble. We’ve nearly lost all traces of what health once meant, the basic level of health that used to be the birthright of all humans.

My purpose here is to create a new narrative. It isn’t vegans and vegetarians against meat-eaters. The fact of the matter is most Americans eat more plant foods than animal foods, in following this part of dietary advice from the AHA, ADA, and USDA (specifically eating more vegetables, fruits, whole grains, and legumes than at any point since data has been kept). When snacking, it is plant foods (crackers, potato chips, cookies, donuts, etc) that we gorge on, not animal foods. Following Upton Sinclair’s writing of The Jungle, the average intake of red meat went into decline. And since the 1930s, Americans have consumed more industrial seed oils than animal fat. “American eats only about 2oz of red meat per day,” tweets Dr. Shawn Baker, “and consumes more calories from soybean oil than beef!” Even total fat intake hasn’t increased but has remained steady, the only change being in the ratio of kinds of fats consumed, that is to say more industrial seed oils. It’s true that most Americans aren’t vegan, but what they share with vegans is an industrialized diet that is “plant-based”. To push the American diet further in this direction would hardly be a good thing. And it would require ever greater dependence on the approach of nutritionism, of further supplementation and fortification as Americans increasingly become malnourished. That is no real solution to the problem we face.

Instead of scapegoating meat and animal fat, we should return to the traditional American diet or else some other variant of the traditional human diet. The fact of the matter is that historically Americans ate massive amounts of meat and, at the time, they were known as the healthiest population around. Meat-eating Americans in past centuries towered over meat-deprived Europeans. And those Americans, even the poor, were far healthier than their demographic counterparts elsewhere in the civilized and increasingly industrialized world. The United States, one of the last Western countries to be fully industrialized and urbanized, was one of the last countries to see the beginning of a health epidemic. The British noticed the first signs of physical decline in the late 1800s, whereas Americans didn’t clearly see this pattern until World War II. With this in mind, it would be far more meaningful to speak of animal-based diets, including vegetarianism that allows dairy and eggs, than to group together supposed “plant-based” diets. Veganism is worlds apart from vegetarianism. Nutritionally speaking, vegetarianism has more in common with the paleo diet or even the carnivore diet than with veganism, the latter being depleted of essential nutrients from animal foods (fat-soluble vitamins, EPA, DHA, DPA, choline, cholesterol, etc; yes, we sicken and die without abundant cholesterol in our diet, the reason dementia and other forms of neurocognitive decline are a common symptom of statins lowering cholesterol levels). To entirely exclude all animal foods is a category unto itself, a category that didn’t exist and was unimaginable until recent history.

* * *

Nutritionism
by Gyorgy Scrinis

In Defense of Food
by Michael Pollan

Vegan Betrayal
by Mara Kahn

The Vegetarian Myth
by Lierre Keith

Mike Mutzel:

On the opposite side of the spectrum, the vegans argue that now we have the technologies like B12, synthetic B12, we can get DHA from algae. So it’s a beautiful time to be vegan because we don’t need to rely upon animals for these compounds. What would you say to that argument?

Paul Saladino:

I would say that that’s a vast oversimplification of the sum total of human nutrition to think that, if we can get synthetic B12 and synthetic DHA, we’re getting everything in an animal. It’s almost like this reductionist perspective, in my opinion.

I’ve heard some people say that it doesn’t matter what you eat. It’s all about calories in and calories out, and then you can just take a multivitamin for your minerals and vitamins. And I always bristle at that. I think that is so reductionist. You really think you’ve got it all figured out, that you can just take one multivitamin and your calories and that is the same as real food?

That to me is just a travesty of an intellectual hypothesis or intellectual position to take because that’s clearly not the case. We know that animal foods are much more than the reductionist vitamins and minerals that are in them. They are the structure, they are the matrix, they are the amino acids… they are the amino acid availability… they are the cofactors. And to imagine that you can substitute animal foods with B12 and DHA is just a very scary position for me.

I think this is an intellectual error that we make over and over as humans in our society, and this is a broader context… I think that we are smart and we have had some small victories in medicine and nutrition and health. We’ve made scanning electron microscopes and we’ve understood quarks. I think that we’ve gotten a little too prideful and we imagine that as humans we can outsmart the natural world, that we can outsmart nature. And that may sound woo-woo, but I think it’s pretty damn difficult to outsmart 3 million years of natural history and evolution. And any time we try to do that I get worried.

Whether it’s peptides, whether it’s the latest greatest drug, whether it’s the latest greatest hormone or hormone combination, I think you are messing with three million years of the natural world’s wisdom. You really think you’re smarter than that? Just wait just wait, just wait, you’ll see. And to reduce animal foods to B12 and DHA, that’s a really really bad idea.

And as we’ve been talking about, all those plant foods that you’re eating on a vegan diet are gonna come with tons of plant toxins. So yes, I think that we are at a time in human history when you can actually eat all plants and not get nutritional deficiencies in the first year or two because you can supplement the heck out of it, right? You can get… but, but… I mean, the list goes on.

Where’s your zinc? Where’s your carnitine? Where’s your carnosine? Where’s your choline? It’s a huge list of things. How much protein are you getting? Are you actually in net positive nitrogen balance? Let’s check your labs. Are you getting enough iodine? Where are you getting iodine from on a vegan diet?

It doesn’t make sense. You have to supplement with probably 27 different things. You have to think about the availability of your protein, the net nitrogen uses of your protein.

And you know, people may not know this about me. I was a vegan, a raw vegan, for about 7 months about 14 years ago. And my problem — and I’ve heard the same thing from a lot of other people, in fact from my clients today — is that, even if you’re able to eat the foods and perfectly construct micronutrients, you’re going to have so much gas that nobody’s going to want to be around you in the first place.

And I don’t believe that, in any way, shape or form, a synthetic diet is the same as a real foods diet. You can eat plants and take 25 supplements. But then you think what’s in your supplements? And are they bioavailable in the same way? And do they have the cofactors like they do in the food? And to imagine — we’ve done so much in human nutrition — but to imagine that we really understand fully the way that humans eat and digest their food I think is just that’s just pride and that’s just a folly.

Mike Mutzel:

Well, I agree. I mean, I think there’s a lot more to food than we recognize: micro RNA, transfer RNA, other molecules that are not quote-unquote macronutrients. Yeah, I think that’s what you’re getting from plants and animals, in a good or bad way, that a lot of people don’t think about. For example, there are animal studies showing that stress on animals — for example, pre-slaughter stress — affects the transcription patterns of various genes in the animal product.

So, I love what you’re bringing to this whole carnivore movement — like the grass-fed movement, eating more organic free-range, things like that — because one of the qualms that I had seeing this thing take off is a lot of people going to fast food were taking the bun off the burger, saying that there’s really no difference between grass-fed and grain-fed. Like meat’s meat, just get what you can afford. I understand that some people… I’ve been in that place financially before in my life where grass-fed was a luxury.

But the other constituents that could potentially be in lower quality foods, both plant and animal. And the other thing about that — just to hit on one more thing… The supplements — I’ve been in the supplement space since ’06 — they’re not free of iatrogenesis, right? So there are heavy metals — arsenic, lead, mercury, cadmium — in supplements; even vegan proteins, for example.

Paul Saladino:

Yeah, highly contaminated. Yeah, people don’t think about the metals in their supplements. And I see a lot of clients with high heavy metals and we think, where are you getting this from? I saw a guy the other day with really high tin and I think it’s in his supplements. And so anyway, that’s a whole other story.

Warren and Sanders on Environmentalism

I’m not normally impressed by Elizabeth Warren. I don’t have any particular reason to dislike her, but I haven’t felt convinced that she has what it takes. Still, she is able to speak strongly at times in a way that perks up my ears. At CNN’s climate town hall, she responded with exasperation to a question about energy-saving lightbulbs:

“This is exactly what the fossil fuel industry hopes we’re all talking about… They want to be able to stir up a lot of controversy around your lightbulbs, around your straws, and around your cheeseburgers.”

That was refreshing. I’m very much in support of the environment. As an example, I’d like for life on earth to continue. And if possible, it might be nice to maintain human civilization without collapsing in ecological catastrophe and mass suffering. On the other hand, I hate how environmentalism can get used as a political football on both sides that distracts from actually doing anything that makes a difference, which is precisely what big biz wants.

Giving a far different kind of response while in North Carolina, when asked about a meat tax, Bernie Sanders refused to give a straight answer. He talked in vague generalities by not making any statement that would offend anyone or commit him to anything. Unlike Warren, he didn’t challenge the premise of the question. It was quite disappointing to hear this kind of waffling.

To be fair, the right-wing media was being dishonest in reporting that he supported a meat tax. He didn’t say that. He simply said as little as possible. But it is true that he accepted the framing without challenging or questioning it. His was an answer one expects from a professional politician pandering to potential voters, in allowing people to hear what they want to hear while not stating any clear position:

“All that I can say is if we believe, as I do and you do, that climate change is real, we’re going to have to tackle it in every single area, including agriculture. Okay?

“And in fact, one of the things we want to do with our farmers out there is help them become more aggressive and able to help us combat climate change rather than contribute to it.

“So we will certainly… — you’re right, we got to look at agriculture, we got to look at every cause of the crisis that we face.”

I understand. There was no way for him to come out looking good in that situation. He has never shown any evidence of wanting to tax food in order to control the dietary habits of Americans. It’s certainly not part of his political platform. Yet when confronted with a direct question, he was put in a corner he didn’t want to be in. Disagreeing with a supporter can lead to all kinds of problems, especially in how the media would spin it and obsess over it.

Still, it is disheartening that we so rarely can have honest political debate where people speak their minds. If campaign season doesn’t force public awareness into uncomfortable issues, then what good does it serve? Very little. That is why Warren’s short but effective tirade against the fossil fuel industry was a breath of fresh air. She shifted the focus away from artificially created division and toward the problems that are common among us.

Felice Jacka Defends Boundaries of Allowable Dietary Thought

Felice Jacka is an Australian professor of epidemiology. In her official capacity as an expert, she issued a public health warning from her Twitter account: “If your/an MD is advocating an extreme diet of any type, please understand that they may NOT be the best person to listen to.” Her tweets that followed made clear that, in her view, doctors have no right to recommend any diet other than whatever is officially declared healthy by the appropriate government and medical institutions.

She made this statement after watching a video of Dr. Shawn Baker informally discussing the carnivore diet, as if in doing so he were a public threat and an immoral actor who must be publicly called out and shamed. Her professional assessment was that he wasn’t being scientific enough. Fine. If she wanted a more scientific analysis of the evidence, she could have turned to talks given by Georgia Ede, Zoe Harcombe, Amber L. O’Hearn, and Paul Saladino. Her damning indictment of the carnivore diet was rather strong for having watched a single YouTube video of a casual talk. That doesn’t seem like a scientific response.

Or she could have checked out the informal survey that Dr. Baker himself recently did in exploring people’s experience with the carnivore diet. Her complaint was that his experience was merely anecdotal. Sure. But he isn’t alone, which was the purpose of the survey he did. Look at the carnivore groups on social media, some of which have hundreds of thousands of members.

Carnivore is not a minor diet. She calls it “extreme”. It’s no more extreme than veganism and certainly far less extreme than the modern industrial standard American diet (SAD). I’d go so far as to say that, in terms of history and evolution, carnivore is not nearly as extreme as the diet advocated by the AHA and USDA, the diet that the data show Americans have mostly been following and that has led to a disease epidemic.

It’s not only the carnivore diet Jacka targets. In her book Brain Changer, she has a small section on the ketogenic diet in relationship to schizophrenia. She writes that, “Until we have the evidence from such studies, however, we would definitely not recommend such a diet, as it’s extremely strict and demanding and requires close medical supervision.” There she goes again: “extremely” — as if she were talking about potentially violent political activists. Her language is consistent in talking about any diet that dares to cross the line.

Let me set one thing straight. No, the ketogenic diet isn’t extremely strict or particularly demanding. Those who go on it often find it to be the easiest diet they’ve ever tried, as hunger and cravings tend to decrease. It still allows for a wide variety of animal and plant foods. If ketosis is all you care about, you don’t even have to worry about the quality of the food, as long as it is low enough in carbs. Go out for fast food and eat the hamburger without the bun. And if you want snack foods, have a bag of pork rinds instead of a bag of potato chips. Plus, there are all kinds of prepared products now marketed as keto, from protein bars to cauliflower pizzas, and nearly all stores carry them.

So, why all this fear-mongering about alternative dietary approaches? In response to Jacka, Dr. Ara Darakjian tweeted, “This seems overly restrictive on a physician’s freedoms. Why should there be a gag rule? If a physician believes differently they have to stick to the party line? I’ve never recommended carnivore but I don’t think it’s wrong for other MD’s to advocate based on anecdotal evidence.” That is a good point. Why not allow doctors to use their best judgment based on their own professional experience?

A light went off in my head when I saw that mention of a “gag rule”. The specific doctor she criticized, Dr. Shawn Baker, was the target of a witch-hunt that involved a years-long legal battle and resulted in the state board temporarily taking away his license to practice. So it seems no accident that he is still being targeted. He was eventually vindicated and his license was reinstated. Still, he was forced out of work during that time and, along with severe disruption to his life and his family, the legal costs caused him to lose his house.

His sin in that earlier situation, however, wasn’t about the carnivore diet. He was simply recommending lifestyle changes to prevent the need for surgery. By the way, he doesn’t only recommend a carnivore diet but also keto and moderate low-carb, even plant-based in some cases. He treats his patients as individuals and seeks the best treatment according to his knowledge. Sometimes that involves one particular dietary approach or another, but according to Felice Jacka that should not be allowed, a powerful message considering the doctor she chose to use as an example.

When I first saw her tweet, I didn’t know she was Australian. It occurred to me to see where she was from. I wondered this because I knew of some other major cases of witch-hunts. The moment I saw that she is employed at an Australian university, another light bulb went off in my head. One of the worst witch-hunts against a low-carb advocate sought to destroy the career of the Australian doctor Gary Fettke. I don’t know if she was involved in that witch-hunt or supported it in any way, but it seems likely she wouldn’t have been on the side defending Dr. Fettke’s rights.

I also left some tweets in that thread she started. I brought up some criticisms of the field of nutrition studies itself. She defended her field of expertise since, after all, her authority rests upon it. She said to me that, “I don’t agree that there is (largely) not consensus among nutrition professionals and researchers. But it’s not the point I’m making. MDs are charged with practising evidence-based medicine. Whether or not you or they dont agree with the evidence for whatever reason.”

Responding to her, I wrote: “Consensus from evidence-based medicine in a field suffering from one of the worst replication crises in scientific history is precisely part of the problem.” That was a tougher criticism than it might seem, since the main replication failure of nutrition studies has been epidemiology, Jacka’s sole area of expertise. After that simple comment, she blocked me. There was nothing else I said that was mean or trollish. The closest I came to being antagonistic was in saying that I’d rather trust the expertise of those who are world-leading experts in keto and low-carb diets: Benjamin Bikman, Jason Fung, etc.; also, Tim Noakes (another victim of a witch-hunt, as shown in the documentary The Magic Pill, in Daryl Ilbury’s book The Quiet Maverick, and in Noakes’ own book Lore of Nutrition). She obviously is not in favor of open scientific debate and inquiry.

There are powerful interests seeking to maintain the status quo. A simple tweet might not seem like anything to be concerned about. Then again, Tim Noakes’ troubles began with a single innocent tweet that was used as evidence. He fought back, but it took years and immense amounts of money. If he weren’t such a brilliant and determined guy, the powers that be might have been successful. Still, the attack did effectively make Noakes into an example. Few people could have stood up to that kind of organized and highly funded onslaught. When someone like Felice Jacka complains about someone like Dr. Shawn Baker, there is always an implied threat. Most doctors probably remain silent and keep their heads down. Otherwise, the consequences might mean the end of one’s career.


Dr. Catherine Shanahan On Dietary Epigenetics and Mutations

Dr. Catherine Shanahan is a board-certified family physician with an undergraduate degree in biology, along with training in biochemistry and genetics. She has also studied ethno-botany, culinary traditions, and ancestral health. Besides regularly appearing in and writing for national media, she has worked as director and nutrition consultant for the Los Angeles Lakers. On High Intensity Health, she was interviewed by nutritionist Mike Mutzel (Fat Adapted Athletes Perform Better). At the 31:55 mark in that video, she discussed diet (in particular, industrial vegetable oils or simply seed oils), epigenetic inheritance, de novo genetic mutations, and autism. This can be found in the show notes (#172) where it is stated that,

“In 1909 we consumed 1/3 of an ounce of soy oil per year. Now we consume about 22 pounds per year. In the amounts that we consume seed oils, it breaks down into some of the worst toxins ever discovered. They are also capable of damaging our DNA. Many diseases are due to mutations that children have that their parents did not have. This means that mothers and fathers with poor diets have eggs/sperm that have mutated DNA. Children with autism have 10 times the number of usual mutations in their genes. Getting off of seed oils is one of the most impactful things prospective parents can do. The sperm has more mutations than the egg.”
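Taking the quoted figures at face value (1/3 ounce in 1909 versus 22 pounds today, at 16 ounces per pound), the increase works out to roughly a thousandfold. A quick back-of-envelope check:

```python
# Back-of-envelope check of the soy oil figures quoted above:
# 1/3 oz per person per year in 1909 vs. about 22 lb per year now.
OZ_PER_LB = 16

oz_1909 = 1 / 3              # ounces per year, 1909
oz_now = 22 * OZ_PER_LB      # 22 lb converted to ounces per year

fold_increase = oz_now / oz_1909
print(round(fold_increase))  # prints 1056, about a 1,000-fold increase
```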

These seed oils didn’t exist in the human diet until the industrial era. Our bodies are designed to use and incorporate PUFAs from natural sources, but processing them into oils with high pressure and chemicals denatures the oil’s structure and destroys its antioxidants. The oxidative stress that follows from adding them to the diet arises precisely because these altered oils act as Trojan horses, treated by the body like natural fats. This is magnified by a general increase of PUFAs, specifically omega-6 fatty acids, with a simultaneous decrease of omega-3 fatty acids and saturated fats. The difference isn’t overall fat intake: the roughly 40% of calories we now get from fat is about the same as at the beginning of the last century. What is different is these oxidized PUFAs combined with massive loads of sugar and starches like never before.

Dr. Shanahan sees these industrial plant oils as the single greatest harm, such that she doesn’t consider them to be a food but a toxin, originally discovered as an industrial byproduct. She is less worried about any given category of food or macronutrient, as long as you first and foremost remove this specific source of toxins.** She goes into greater detail in a talk from Ancestry Foundation (AHS16 – Cate Shanahan – Bad Diet, Bad DNA?). And her book, Deep Nutrition, is a great resource on this topic. I’ll leave that for you to further explore, if you so desire. Let me quickly and simply note an implication of this.

Genetic mutations demonstrate how serious this situation is. The harm we are causing ourselves might go beyond mere punishment for our personal sins: the sins of the father and mother may pass genetically to their children, grandchildren, and further on (one generation of starvation or smoking among grandparents leads to generations of smaller birth weight and underdevelopment among the grandchildren and maybe beyond, even if the intervening generation of parents was healthy).

It might not be limited to a temporary transgenerational harm as seen with epigenetics. This could be permanent harm to our entire civilization, fundamentally altering our collective gene pool. We could recover from epigenetic harm within a few generations, assuming we took the problem seriously and acted immediately (Dietary Health Across Generations), but with genetic mutations we may never be able to undo the damage. These mutations have been accumulating and will continue to accumulate, until we return to an ancestral diet of healthy foods as part of an overall healthy lifestyle and environment. Even mutations can be moderated by epigenetics, as the body is designed to deal with them.

This further undermines genetic determinism and biological essentialism. We aren’t mere victims doomed to a fate beyond our control. This dire situation is being created by all of us, individually and collectively. There is no better place to begin than with your own health, but we had better also treat this as a societal crisis verging on catastrophe. It was public policies and an international food system that created the conditions that enacted and enforced this failed mass experiment of dietary dogma and capitalist realist profiteering. Maybe we could try something different, something less psychopathically authoritarian, less psychotically disconnected from reality, less collectively suicidal. Heck, it’s worth a try.

* * *

** I’d slightly disagree with her emphasis. She thinks what matters most is the changes over the past century. There is a good point made in this focus on late modernity. But I’d note that industrialization and modern agriculture began in the prior centuries.

It was in the colonial era that pasta was introduced to Italy, potatoes to Ireland, and sugar throughout the Western world. It wasn’t until the late 1700s and more clearly in the early 1800s that there were regular grain surpluses that made grains available for feeding/fattening both humans and cattle. In particular, it was around this time that agricultural methods improved for wheat crops, allowing it to be affordable to the general public for the first time in human existence and hence causing white bread to become common during the ensuing generations.

I don’t know about diseases like Alzheimer’s, Parkinson’s, and multiple sclerosis. But I do know that the most major diseases of civilization (obesity, diabetes, cancer, and mental illness) were first noticed to be on the rise during the 1700s and 1800s or sometimes earlier, long before industrial oils or the industrial revolution that made these oils possible. The high-carb diet appeared gradually with colonial trade and spread across numerous societies, first hitting the wealthiest before eventually being made possible for the dirty masses. During this time, it was observed by doctors, scientists, missionaries and explorers that obesity, diabetes, cancer, mental illness and moral decline quickly followed on the heels of this modern diet.

Seed oils were simply the final Jenga block pulled from the ever-growing and ever more wobbly tower, replacing the healthy, nutrient-dense animal fats (full of fat-soluble vitamins, choline, omega-3 fatty acids, etc.) that were counterbalancing some of the worst effects of the high-carb diet. But seed oils, as with farm chemicals such as glyphosate, would never have had as severe and dramatic an impact if not for the previous centuries of worsening diet and health. The damage had been building up over a long time, and the tower was doomed to topple right from the start. We are simply now at the tipping point, the culmination of a sad trajectory.

Still, it’s never too late… or let us hope. Dr. Shanahan prefers to end on an optimistic note. And I’d rather not disagree with her about that. I’ll assume she is right or that she is at least in the general ballpark. Let us do as she suggests. We need more and better research, but somehow industrial seed oils have slipped past the notice of autism researchers.

* * *

On Deep Nutrition and Genetic Expression
interview by Kristen Michaelis CNC

Dr. Cate: Genetic Wealth is the idea that if your parents or grandparents ate traditional and nutrient-rich foods, then you came into the world with genes that could express in an optimal way, and this makes you more likely to look like a supermodel and be an extraordinary athlete. Take Angelina Jolie or Michael Jordan, for instance. They’ve got loads of genetic wealth.

Genetic Momentum describes the fact that, once you have that extraordinary genetic wealth, you don’t have to eat so great to be healthier than the average person. It’s like being born into a kind of royalty. You always have that inheritance around and you don’t need to work at your health in the same way other people do.

These days, for most of us, it was our grandparents or great grandparents who were the last in our line to grow up on a farm or get a nutrient-rich diet. In my case, I have to go back 4 generations to the Irish and Russian farmers who immigrated to NYC where my grandparents on both sides could only eat cheap food; sometimes good things like chopped liver and beef tongue, but often preserves and crackers and other junk. So my grandparents were far healthier than my brother and sisters and I.

The Standard American Diet (SAD) has accelerated the processes of genetic wealth being spent down, genetic momentum petering out, and the current generation getting sick earlier than their parents and grandparents. This is a real, extreme tragedy on the order of end-of-the-world level losses of natural resources. Genetic wealth is a kind of natural resource. And loss of genetic wealth is a more urgent problem than peak oil or the bursting of the housing bubble. But of course nobody is talking about it directly, only indirectly, in terms of increased rates of chronic disease.

Take autism, for example. Why is autism so common? I don’t think vaccines are the reason for the vast vast majority of cases, since subtle signs of autism can be seen before vaccination in the majority. I think the reason has to do with loss of genetic wealth. We know that children with autism exhibit DNA mutations that their parents and grandparents did not have. Why? Because in the absence of necessary nutrients, DNA cannot even duplicate itself properly and permanent mutations develop.

(Here’s an article on one kind of genetic mutation (DNA deletions) associated with autism.)

Fortunately, most disease is not due to permanent letter mutations and therefore a good diet can rehabilitate a lot of genetic disease that is only a result of altered genetic expression. To put your high-school biology to work, it’s the idea of genotype versus phenotype. You might have the genes that make you prone to, for example, breast cancer (the BRCA1 mutation), but you might not get the disease if you eat right because the gene expression can revert back to normal.

Deep Nutrition: Why Your Genes Need Traditional Food
by Dr. Catherine Shanahan
pp. 55-57

Guided Evolution?

In 2007, a consortium of geneticists investigating autism boldly announced that the disease was not genetic in the typical sense of the word, meaning that you inherit a gene for autism from one or both of your parents. New gene sequencing technologies had revealed that many children with autism had new gene mutations, never before expressed in their family line.

An article published in the prestigious journal Proceedings of the National Academy of Sciences states, “The majority of autisms are a result of de novo mutations, occurring first in the parental germ line.” 42 The reasons behind this will be discussed in Chapter 9.

In 2012, a group investigating these new, spontaneous mutations discovered evidence that randomness was not the sole driving force behind them. Their study, published in the journal Cell, revealed an unexpected pattern of mutations occurring 100 times more often in specific “hotspots,” regions of the human genome where the DNA strand is tightly coiled around organizing proteins called histones that function much like spools in a sewing kit, which organize different colors and types of threads. 43

The consequences of these mutations seem specifically designed to toggle up or down specific character traits. Jonathan Sebat, lead author on the 2012 article, suggests that the hotspots are engineered to “mutate in ways that will influence human traits” by toggling up or down the development of specific behaviors. For example, when a certain gene located at a hotspot on chromosome 7 is duplicated, children develop autism, a developmental delay characterized by near total lack of interest in social interaction. When the same chromosome is deleted, children develop Williams Syndrome, a developmental delay characterized by an exuberant gregariousness, where children talk a lot, and talk with pretty much anyone. The phenomenon wherein specific traits are toggled up and down by variations in gene expression has recently been recognized as a result of the built-in architecture of DNA and dubbed “active adaptive evolution.” 44

As further evidence of an underlying logic driving the development of these new autism-related mutations, it appears that epigenetic factors activate the hotspot, particularly a kind of epigenetic tagging called methylation. 45 In the absence of adequate B vitamins, specific areas of the gene lose these methylation tags, exposing sections of DNA to the factors that generate new mutations. In other words, factors missing from a parent’s diet trigger the genome to respond in ways that will hopefully enable the offspring to cope with the new nutritional environment. It doesn’t always work out, of course, but that seems to be the intent.

You could almost see it as the attempt to adjust character traits in a way that will engineer different kinds of creative minds, so that hopefully one will give us a new capacity to adapt.

pp. 221-228

What Is Autism?

The very first diagnostic manual for psychiatric disorders published in 1954 described autism simply as “schizophrenic reaction, childhood type.” 391 The next manual, released in 1980, listed more specific criteria, including “pervasive lack of responsiveness to other people” and “if speech is present, peculiar speech patterns such as immediate and delayed echolalia, metaphorical language, pronominal reversal (using you when meaning me, for instance).” 392 Of course, the terse language of a diagnostic manual can never convey the real experience of living with a child on the spectrum, or living on the spectrum yourself.

When I graduated from medical school, autism was so rarely diagnosed that none of my psychiatry exams even covered it and I and my classmates were made aware of autism more from watching the movie Rain Man than from studying course material. The question of whether autism (now commonly referred to as ASD) is more common now than it was then or whether we are simply recognizing it more often is still controversial. Some literature suggests that it is a diagnostic issue, and that language disorders are being diagnosed less often as autism is being diagnosed more. However, according to new CDC statistics, it appears that autism rates have risen 30 percent between 2008 and 2012. Considering that diagnostic criteria had been stable by that point in time for over a decade, increased diagnosis is unlikely to be a major factor in this 30 percent figure. 393

Given these chilling statistics, it’s little wonder that so many research dollars have been dedicated to exploring possible connections between exposure to various environmental factors and development of the disorder. Investigators have received grants to look into a possible link between autism and vaccines, 394 smoking, 395 maternal drug use (prescription and illicit), 396 , 397 , 398 organophosphates, 399 and other pesticides, 400 BPA, 401 lead, 402 mercury, 403 cell phones, 404 IVF and infertility treatments, 405 induced labor, 406 high-powered electric wires, 407 flame retardants, 408 ultrasound, 409 —and just about any other environmental factor you can name. You might be wondering if they’ve also looked into diet. But of course: alcohol, 410 cow’s milk, 411 milk protein, 412 soy formula, 413 gluten, 414 and food colorings 415 have all been investigated. Guess what they’ve never dedicated a single study to investigating? Here’s a hint: it’s known to be pro-oxidative and pro-inflammatory and contains 4-HNE, 4-HHE, and MDA, along with a number of other equally potent mutagens. 416 Still haven’t guessed? Okay, one last hint: it’s so ubiquitous in our food supply that for many Americans it makes up as much as 60 percent of their daily caloric intake, 417 a consumption rate that has increased in parallel with rising rates of autism.

Of course, I’m talking about vegetable oil. In Chapter 2 , I discussed in some detail how and why gene transcription, maintenance, and expression are necessarily imperiled in the context of a pro-inflammatory, pro-oxidative environment, so I won’t go further into that here. But I do want to better acquaint you with the three PUFA-derived mutagens I just named because when they make it to the part of your cell that houses DNA, they can bind to DNA and create new, “de novo,” mutations. DNA mutations affecting a woman’s ovaries, a man’s sperm, or a fertilized embryo can have a devastating impact on subsequent generations.

First, let’s revisit 4-HNE (4-hydroxynonenal), which you may recall meeting in the above section on firebombing the highways. This is perhaps the most notorious of all the toxic fats derived from oxidation of omega-6 fatty acids, whose diversity of toxic effects requires that entire chemistry journals be devoted to 4-HNE alone. When the mutagenicity (ability to mutate DNA) of 4-HNE was first described in 1985, the cytotoxicity (ability to kill cells) had already been established for decades. The authors of a 2009 review article explain that the reason it had taken so long to recognize that HNE was such an effective carcinogen was largely due to the fact that “the cytotoxicity [cell-killing ability] of 4-HNE masked its genotoxicity [DNA-mutating effect].” 419 In other words, it kills cells so readily that they don’t have a chance to divide and mutate. How potently does 4-HNE damage human DNA? After interacting with DNA, 4-HNE forms a compound called an HNE-adduct, and that adduct prevents DNA from copying itself accurately. Every time 4-HNE binds to a guanosine (the G of the four-letter ACGT DNA alphabet), there is somewhere between a 0.5 and 5 percent chance that G will not be copied correctly, and that the enzyme trying to make a perfect copy of DNA will accidentally turn G into T. 420 Without 4-HNE, the chance of error is about a millionth of a percent. 421 In other words, 4-HNE increases the chance of a DNA mutation roughly a million times!
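Taking the passage’s numbers at face value (a 0.5 to 5 percent miscopy chance per adduct versus a baseline of about a millionth of a percent), the “million times” figure checks out as simple arithmetic:

```python
# Rough check of the mutation-rate comparison quoted above.
baseline = 1e-6 / 100        # "a millionth of a percent", as a fraction
low, high = 0.005, 0.05      # 0.5% and 5% miscopy chance with 4-HNE

print(round(low / baseline))   # prints 500000: half a million-fold at the low end
print(round(high / baseline))  # prints 5000000: five million-fold at the high end
```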

Second, 4-HHE (4-hydroxyhexenal), which is very much like 4-HNE, his more notorious bigger brother derived from omega-6, but 4-HHE is derived instead from omega-3. If bad guys had sidekicks, 4-HNE’s would be 4-HHE. 4-HHE does many of the same things to DNA as 4-HNE, but has only been discovered recently. 422 You see, when omega-6 reacts with oxygen, it breaks apart into two major end products, whereas omega-3, being more explosive, flies apart into four different molecules. This means each one is present in smaller amounts, and that makes them a little more difficult to study. But it doesn’t make 4-HHE any less dangerous. 4-HHE specializes in burning through your glutathione peroxidase antioxidant defense system. 423 This selenium-based antioxidant enzyme is one of the three major enzymatic antioxidant defense systems, and it may be the most important player defending your DNA against oxidative stress. 424 , 425

Finally, there is malonaldehyde (MDA), proven to be a mutagen in 1984, but presumed to only come from consumption of cooked and cured meats. 426 Only in the past few decades have we had the technology to determine that MDA can be generated in our bodies as well. 427 And unlike the previous two chemicals, MDA is generated by oxidation of both omega-3 and omega-6. It may be the most common endogenously derived oxidation product. Dr. J. L. Marnett, who directs a cancer research lab at Vanderbilt University School of Medicine, Nashville, Tennessee, and who has published over 400 articles on the subject of DNA mutation, summarized his final article on MDA with the definitive statement that MDA “appears to be a major source of endogenous DNA damage [endogenous, here, meaning due to internal, metabolic factors rather than, say, radiation] in humans that may contribute significantly to cancer and other genetic diseases.” 428

There’s one more thing I need to add about vegetable-oil-derived toxic breakdown products, particularly given the long list of toxins now being investigated as potential causes of autism spectrum disorders. Not only do they directly mutate DNA, they also make DNA more susceptible to mutations induced by other environmental pollutants. 429 , 430 This means that if you start reading labels and taking vegetable oil out of your diet, your body will more readily deal with the thousands of contaminating toxins not listed on the labels which are nearly impossible to avoid.

Why all this focus on genes when we’re talking about autism? Nearly every day a new study comes out that further consolidates the consensus among scientists that autism is commonly a genetic disorder. The latest research is focusing on de novo mutations, meaning mutations neither parent had themselves but that arose spontaneously in their egg, sperm, or during fertilization. These mutations may affect single genes, or they may manifest as copy number variations, in which entire stretches of DNA containing multiple genes are deleted or duplicated. Geneticists have already identified a staggering number of genes that appear to be associated with autism. In one report summarizing results of examining 900 children, scientists identified 1,000 potential genes: “exome sequencing of over 900 individuals provided an estimate of nearly 1,000 contributing genes.” 431

All of these 1,000 genes are involved with proper development of the part of the brain most identified with the human intellect: our cortical gray matter. This is the stuff that enables us to master human skills: the spoken language, reading, writing, dancing, playing music, and, most important, the social interaction that drives the desire to do all of the above. One need only have a few of these 1,000 genes involved in building a brain get miscopied, or in some cases just one, in order for altered brain development to lead to one’s inclusion in the ASD spectrum.

So just a few troublemaker genes can obstruct the entire brain development program. But for things to go right, all the genes for brain development need to be fully functional.

Given that humans are thought to have only around 20,000 genes, and already 1,000 are known to be essential for building brain, that means geneticists have already labeled 5 percent of the totality of our genetic database as crucial to the development of a healthy brain—and we’ve just started looking. At what point does it become a foolish enterprise to continue to look for genes that, when mutated, are associated with autism? When we’ve identified 5,000? Or 10,000? The entire human genome? At what point do we stop focusing myopically only on those genes thought to play a role in autism?

I’ll tell you when: when you learn that the average autistic child’s genome carries de novo mutations not just in genes thought to be associated with autism, but across the board, throughout the entirety of the chromosomal landscape. Because once you’ve learned this, you can’t help but consider that autism might be better characterized as a symptom of a larger disease—a disease that results in an overall increase in de novo mutations.

Almost buried by the avalanche of journal articles on genes associated with autism is the finding that autistic children exhibit roughly ten times the number of de novo mutations compared to their typically developing siblings. 432 An international working group on autism announced this startling finding in a 2013 article entitled “Global Increases in Both Common and Rare Copy Number Load Associated With Autism.” 433 (Copy number load refers to mutations wherein large segments of genes are duplicated too often.) What the article says is that yes, children with autism have a larger number of de novo mutations, but the majority of their new mutations are not statistically associated with autism because other kids have them, too. The typically developing kids just don’t have nearly as many.

These new mutations are not affecting only genes associated with brain development. They appear to be affecting all genes universally. What is more, there is a dose-response relationship between the total number of de novo mutations and the severity of autism, such that the more gene mutations a child has (the bigger the dose of mutation), the worse their autism (the larger the response). And it doesn’t matter where the mutations are located—even in genes that have no obvious connection to the brain. 434 This finding suggests that autism does not originate in the brain, as has been assumed. The real problem—at least for many children—may actually be coming from the genes. If this is so, then when we look at a child with autism, what we’re seeing is a child manifesting a global genetic breakdown. Among the many possible outcomes of this genetic breakdown, autism may simply be the most conspicuous, as the cognitive and social hallmarks of autism are easy to recognize.

As the authors of the 2013 article state, “Given the large genetic target of neurodevelopmental disorders, estimated in the hundreds or even thousands of genomic loci, it stands to reason that anything that increases genomic instability could contribute to the genesis of these disorders.” 435 Genomic instability—now they’re on to something. Because framing the problem this way helps us to ask the more fundamental question, What is behind the “genomic instability” that’s causing all these new gene mutations?

In the section titled “What Makes DNA Forget” in Chapter 2, I touched upon the idea that an optimal nutritional environment is required to ensure the accurate transcription of genetic material and communication of epigenetic bookmarking, and how a pro-oxidative, pro-inflammatory diet can sabotage this delicate operation in ways that can lead to mutation and alter normal growth. There I focused on mistakes made in epigenetic programming, what you could call de novo epigenetic abnormalities. The same prerequisites that support proper epigenetic data communication, I submit, apply equally to the proper transcription of genetic data.

What’s the opposite of a supportive nutritional environment? A steady intake of pro-inflammatory, pro-oxidative vegetable oil that brings with it the known mutagenic compounds of the kind I’ve just described. Furthermore, if exposure to these vegetable oil-derived mutagens causes a breakdown in the systems for accurately duplicating genes, then you might expect to find other detrimental effects from this generalized defect of gene replication. Indeed we do. Researchers in Finland have found that children anywhere on the ASD spectrum have between 1.5 and 2.7 times the risk of being born with a serious birth defect, most commonly a life-threatening heart defect or neural tube (brain and spinal cord) defect that impairs the child’s ability to walk. 436 Another group, in Nova Scotia, identified a similarly increased rate of minor malformations, such as abnormally rotated ears, small feet, or closely spaced eyes. 437

What I’ve laid out here is the argument that the increasing prevalence of autism is best understood as a symptom of De Novo Gene Mutation Syndrome brought on by oxidative damage, and that vegetable oil is the number-one culprit in creating these new mutations. These claims emerge from a point-by-point deduction based on the best available chemical, genetic, and physiologic science. To test the validity of this hypothesis, we need more research.

Does De Novo Gene Mutation Syndrome Affect Just the Brain?

Nothing would redirect the trajectory of autism research in a more productive fashion than reframing autism as a symptom of the larger underlying disease, which we are provisionally calling de novo gene-mutation syndrome, or DiNGS. (Here’s a mnemonic: vegetable oil toxins “ding” your DNA, like hailstones pockmarking your car.)

If you accept my thesis that the expanding epidemic of autism is a symptom of an epidemic of new gene mutations, then you may wonder why the only identified syndrome of DiNGS is autism. Why don’t we see all manner of new diseases associated with gene mutations affecting organs other than the brain? We do. According to the most recent CDC report on birth defect incidence in the United States, twenty-nine of the thirty-eight organ malformations tracked have increased. 438

However, these are rare events, occurring far less frequently than autism. The reason for the difference is that the brain of a developing baby can be damaged to a greater degree than other organs can while still allowing the pregnancy to carry to term. The brain’s complexity makes it the organ most vulnerable to mutation, but damage there does not make the child less likely to survive in utero. The fact that autism affects the most evolutionarily novel portion of the brain means that, as far as the viability of an embryo is concerned, it’s almost irrelevant. If the kinds of severely damaging mutations leading to autism were to occur in organs such as the heart, lungs, or kidneys, fetal survival would be imperiled, leading to spontaneous miscarriage. Since these organs begin developing as early as four to six weeks of in-utero life, failure of a pregnancy this early might occur without any symptoms other than bleeding, which might be mistaken for a heavy or late period, before a mother has even realized she’s conceived.

* * *

Rhonda Patrick’s view is similar to that of Shanahan: