The Transparent Self to Come?

Scott Preston’s newest piece, The Seer, is worth reading. He makes an argument for what is needed next for humanity, what one might think of as getting off the Wheel of Karma. But I can’t help thinking about the messy human details in this moment of societal change and crisis. Great thinkers like Jean Gebser talk of integral consciousness in one way while most people experience the situation in entirely different terms. That is why I’m glad Preston brought in sources that are far less respectable (and far more popular), like Carlos Castaneda and the Seth Material.

As anyone should know, we aren’t discussing mere philosophy here, for it touches upon human experience and social reality. I sense much of what is potentially involved, even as it is hard to put one’s finger on it. The challenge we are confronted with is far more disconcerting than we are typically able or willing to acknowledge, assuming we can even begin to comprehend what we are facing and what is emerging. How we get to the integral is the difficult part. Preston explains well the issue of making the ego/emissary transparent — as the Seth Material put it, “true transparency is not the ability to see through, but to move through”. That is a good way of putting it.

I appreciate his explanation of Satan (the egoic-demiurge) as the ape of God, what Iain McGilchrist calls usurpation. This reminds me of the mimicry of the Trickster archetype and its relation to the co-optation of the reactionary mind (see Corey Robin). A different kind of example of this is that of the folkloric Men in Black, as described by John Keel. It makes me wonder about what such things represent in human reality. This was on my mind because of another discussion I was having in a different post, Normal, from rauldukeblog’s The Violent Ink. The topic had to do with present mass hysteria and, as I’m wont to do, I threw out my own idiosyncratic context. Climate change came up and so I was trying to explain what makes this moment of crisis different than the past.

There is a scientific quality to it. Modern science created climate change through technological innovation and industrialization. And now science warns us about it. But it usually isn’t like a war, famine, or plague that hits a population in an undeniable way — not for most of us, not yet. That is the complexifying change in the scientific worldview we now inhabit, and it is why the anxiety is so amorphous, in a way profoundly different than before. To come to terms with climate change, something within human nature itself would have to shift. If we are to survive it while maintaining civilization, we will likely have to be as dramatically transformed as were bicameral humans during the collapse of the Bronze Age civilizations. We won’t come through this unscathed and unchanged.

In speaking of the scientific or pseudo-scientific, there is the phenomenon of UFOs and contact experience. I pointed out that there has been a shift in official military policy toward reporting of UFO sightings, which gets one wondering about motives and about why now. UFOs and aliens express that free-floating sense of vague anxiety about the unknown, specifically in a modern framework. It’s almost irrelevant what UFOs really are or aren’t. And no doubt, as in the past, various governments will attempt to use UFO reports to manipulate populations, to obfuscate what they wish to keep hidden, or whatever else. The relevant point here is what UFOs symbolize in the human psyche and why they gain so much attention during periods of wide-scale uncertainty and stress. The UFO cults that have appeared over the past few generations are maybe akin to the cults, like Jesus worship, that arose in the Axial Age. Besides Jung, it might be helpful to bring in Jacques Vallee’s even more fascinating view. A new mythos is forming.

I’m not sure what it all adds up to. And my crystal ball is no less cloudy than anyone else’s. It just feels different in that we aren’t only facing crisis and catastrophe. It feels like a far more pivotal point, a fork in the path. During what is called the General Crisis, there was much change going on and it did help bring to an end what remained of feudalism. But the General Crisis didn’t fundamentally change society and culture, much less cut deeper into the human psyche. I’d argue that it simply brought us further down the same path we’d been on for two millennia since the Axial Age. I keep wondering if now the Axial Age is coming to its final conclusion, that there isn’t much further we can go down this path.

By the way, I think my introduction to Jacques Vallee came through further reading after having discovered John Keel’s The Mothman Prophecies, the book that came out long before the movie. That is where the basic notion I was working with here comes from. During times of crisis and foreboding, often preceding actual mass death, there is a build-up of strangeness that spills out from our normal sense of reality. We can, of course, talk about this in more rational or rather respectable terms without any of the muck of UFO research.

Keith Payne, in The Broken Ladder, notes that people come to hold bizarre beliefs and generally act irrationally when under conditions of high inequality, that is to say when inflicted with unrelenting stress. But it goes beyond that. There is more going on than mere beliefs. People’s sense of reality becomes distorted and they begin experiencing what they otherwise would not. This was the basis of Julian Jaynes’ hypothesis of the bicameral mind where voice-hearing was supposedly elicited through stress. And this is supported by modern evidence, such as the cases recorded by John Geiger in the Third Man Factor.

An additional layer could be brought to this with Jacques Vallee’s work showing how anecdotes of alien contact follow the same pattern as stories of fairy abductions and anthropological accounts of shamanic initiation. These are religious experiences. In other times, they were more likely interpreted as visitations by spiritual beings or as transportation into higher realms. Similarly, spinning and flying disks in the sky were interpreted as supernatural manifestations in the pre-scientific age. But maybe it’s all the same phenomenon, whether the source is elsewhere or within the human psyche.

The interesting part is that these experiences, sometimes sightings involving crowds of people (including many incidents with military personnel and pilots), often correspond with intensified societal conflict. UFO sightings and contact experiences appear to increase at specific periods of stress. Unsurprisingly, people turn to the strange in strange times. And there is something about this strangeness, the pervasiveness of it and the power it holds. To say we are living in a reactionary time, when nearly everything and everyone has become reactionary, is to understate it to an extreme degree. The Trickster quality of the reactionary mind, one might argue, is its most defining feature.

One might call it the return of the repressed. Or it could be thought of as the eruption (irruption?) of the bicameral mind. Whatever it is, it challenges and threatens the world we think we know. Talk of Russian meddling and US political failure is tiddlywinks in comparison. But the fact that we take such tiddlywinks so seriously does add to the sense of crisis. Everything is real to the degree we believe it to be real, in that the effects of it become manifest in our experience and behavior, in the collective choices that we make and accumulate over time.

We manifest our beliefs. And even the strangest of beliefs can become normalized and, as such, become self-fulfilling prophecies. Social realities aren’t only constructed. They are imagined into being. Such imagination is human reality, for we are incapable of experiencing it as anything other than reality. We laugh at the strange beliefs of others at our own peril. But what is being responded to can remain hidden or outside the mainstream frame of consciousness. Think of the way that non-human animals act in unusual ways before an earthquake hits. If all we see is what the animals are doing and lack any greater knowledge, we won’t appreciate that it means we should prepare for the earthquake to come.

Humans too act strangely before coming catastrophes. It doesn’t require anyone to consciously know of and rationally understand what is coming. Most of how humans respond is instinctual or intuitive. I’d only suggest paying less attention to the somewhat arbitrary focus of anxiety and, instead, treating the anxiety itself as a phenomenon to be taken seriously. Something real is going on. And it portends something on its way.

Here is my point. We see things through a glass darkly. Things are a bit on the opaque side. Transparency of self is more of an aspiration at this point, at least for those of us not yet enlightened beings. All the voices remain loud within us and in the world around us. In many thinkers seeking a new humanity, there is a prioritizing of the visual over the auditory. There is a historical background to this. The bicameral mind was ruled by voices. To seek freedom from this, to get off the grinding and rumbling Wheel of Karma, requires a different relationship to our senses. There is a reason the Enlightenment was so powerfully obsessed with tools that altered and extended our perception, with a major focus on the visual, from lenses to the printed word. Oral society was finally losing its power over us, or that is what some wanted to believe.

The strangeness of it all is that pre-consciousness maintains its pull over modern consciousness even as we idealize the next stage of humanity, integral trans-consciousness. Instead of escaping the authoritative power of the bicameral voice, we find ourselves in a world of mass media and social media where voices have proliferated. We are now drowning in voices and so we fantasize about the cool silence of the visionary, that other side of our human nature — as Preston described it:

One of the things we find in don Juan’s teachings is “the nagual” and “the tonal” relation and this is significant because it is clearly the same as McGilchrist’s “Master” and “Emissary” relationship of the two modes of attention of the divided brain. In don Juan’s teachings, these correspond to what are called “the first” and “the second attentions”. If you have read neuroscientist Jill Bolte-Taylor’s My Stroke of Insight or followed her TED talk about that experience, you will see that she, too, is describing the different modes of attention of the “nagual” and the “tonal” (or the “Master” and the “Emissary”) in her own experience, and that when she, too, shifted into the “nagual” mode, also saw what Castaneda saw — energy as it flows in the universe, and she also called that “the Life Force Power of the Universe”.

About getting off the Wheel, rauldukeblog wrote that, “Karma is a Sanskrit word meaning action so the concept is that any act(tion) creates connective tissue which locks one into reaction and counter and so on in an endless loop.” That brings us back to the notion of not only seeing through the egoic self but more importantly to move through the egoic self. If archaic authorization came from voices according to Jaynes, and if self-authorization of the internalized voice of egoic consciousness hasn’t fundamentally changed this equation, then what would offer us an entirely different way of being and acting in the world?

The last time we had a major transformation of the human mind, back during the ending of the Bronze Age, it required the near total collapse of every civilization. Structures of the mind aren’t easily disentangled from entrenched patterns of social identity as long as the structures of civilization remain in place. All these millennia later, we are still struggling to deal with the aftermath of the Axial Age. What are the chances that the next stage of humanity is going to be easier or happen more quickly?

Caloric Confusion

In human biological terms, there is no such thing as a calorie. It’s an abstraction, measured by machines that break down matter to determine the energy it contains. That isn’t how the body functions. It’s similar to the view of nutritionism, where chemical analysis determines the amounts of specific vitamins and minerals found in any given food. None of this, however, tells us how the body absorbs, processes, and uses them.

Take sugar, for example. It is worse than empty calories; we are talking about actively toxic calories. It also interferes with nutrient intake and so can contribute to malnourishment. In the 1890s in Britain and by 1940 in the United States, a shockingly high number of recruits and draftees were being rejected because of malnourishment and tooth decay. This had been preceded by decades of rising levels of sugar and carbs in the diet, combined with processed vegetable oils that were replacing saturated fat.

A commonly discussed example of this is how more vitamin C is required on a high-carb diet because glucose competes with it, whereas on a low-carb diet very little vitamin C is needed to avoid scurvy. Sugar causes harm on multiple levels simultaneously. That it is fattening is bad enough, especially considering all that is involved: dental caries, diabetes, heart disease, fatty liver, depression, etc. — no minor set of health consequences, and that list could go on much longer. Yet sugar was exonerated while saturated fat was scapegoated, which is rather inconsistent, in that saturated fats were never treated as mere empty calories equal to anything else.

It turns out calories aren’t all equal, and on some level everyone probably always knew that was true, but in its simplicity it was an easy way of describing nutrition to the public. The problem is that it is so simplistic as to be fundamentally wrong. It is meaningless to speak of calories-in/calories-out. That doesn’t explain anything. We are still left with the issue of why the body burns some calories while turning others into fat. Recent research has shown that there is a metabolic advantage to low-carb diets, in that more calories are burned relative to calories consumed. This is particularly true in a ketogenic state, where the body efficiently burns fat. Dietary fat easily becomes body fat when eaten with carbs, but far less so when carbs are limited.

It is understandable how this all came about. We study what we can perceive and we ignore what we can’t. Scientific researchers early on learned how to measure calories with machines and it was assumed that the body was like a machine, burning a fuel in the way an engine burned coal and released heat. It became not only one model among many but a defining paradigm to explain human behavior and even morality, with the sins of gluttony and sloth taking key roles. Calories-in/calories-out created a perfect moral calculus. If you were fat or whatever, it was your fault. It couldn’t possibly have anything to do with the severely health-destroying food system, demented nutritional advice, and sub-par healthcare.

Other models of dietary health developed such as the endocrinological study of hormones and the hormonal systems, but the calorie model was already established. Besides, most of this other early research was done in Europe, much of it in German. The World Wars scattered the European research communities and their scientific literature mostly remained untranslated. When the US became the new center of nutritional research, English-speaking researchers were largely ignorant of all that previous researchers had already figured out.

This state of affairs persisted until this past decade or so, more than a century after that early research was done. Only now has American-dominated nutritional research begun to return to old knowledge long forgotten.

* * *

The Curious History of the Calorie in U.S. Policy:
A Tradition of Unfulfilled Promises
by Deborah I. Levine

The Progressive Era Body Project:
Calorie-Counting and “Disciplining the Stomach” in 1920s America
by Chin Jou

Death of the Calorie
by Peter Wilson

* * *

The Case Against Sugar
by Gary Taubes
pp. 23-25

Meanwhile, the latest surge in this epidemic of diabetes in the United States—an 800 percent increase from 1960 to the present day, according to the Centers for Disease Control—coincides with a significant rise in the consumption of sugar. Or, rather, it coincides with a surge in the consumption of sugars, or what the FDA calls “caloric sweeteners”—sucrose, from sugarcane or beets, and high-fructose corn syrup, HFCS, a relatively new invention.

After ignoring or downplaying the role of sugars and sweets for a quarter-century, many authorities now argue that these are, indeed, a major cause of obesity and diabetes and that they should be taxed heavily or regulated. The authorities still do so, however, not because they believe sugar causes disease but, rather, because they believe sugar represents “empty calories” that we eat in excess because they taste so good. By this logic, since refined sugar and high-fructose corn syrup don’t contain any protein, vitamins, minerals, antioxidants, or fiber, they either displace other, more nutritious elements of our diet, or simply add extra, unneeded calories to make us fatter. The Department of Agriculture, for instance (in its recent “Dietary Guidelines for Americans”), the World Health Organization, and the American Heart Association, among other organizations, advise a reduction in sugar consumption for these reasons primarily.

The empty-calories argument is particularly convenient for the food industry, which would understandably prefer not to see a key constituent of its products—all too often, the key constituent—damned as toxic. The sugar industry played a key role in the general exoneration of sugar that took place in the 1970s, as I’ll explain later. Health organizations, including the American Diabetes Association and the American Heart Association, have also found the argument convenient, having spent the last fifty years blaming dietary fat for our ills while letting sugar off the hook. […]

This book makes a different argument: that sugars like sucrose and high-fructose corn syrup are fundamental causes of diabetes and obesity, using the same simple concept of causality that we employ when we say smoking cigarettes causes lung cancer. It’s not because we eat too much of these sugars—although that is implied merely by the terms “overconsumption” and “overeating”—but because they have unique physiological, metabolic, and endocrinological (i.e., hormonal) effects in the human body that directly trigger these disorders. This argument is championed most prominently by the University of California, San Francisco, pediatric endocrinologist Robert Lustig. These sugars are not short-term toxins that operate over days and weeks, by this logic, but ones that do their damage over years and decades, and perhaps even from generation to generation. In other words, mothers will pass the problem down to their children, not through how and what they feed them (although that plays a role), but through what they eat themselves and how that changes the environment in the womb in which the children develop.

Individuals who get diabetes—the ones in any population who are apparently susceptible, who are genetically predisposed—would never have been stricken if they (and maybe their mothers and their mothers’ mothers) lived in a world without sugar, or at least in a world with a lot less of it than the one in which we have lived for the past 100 to 150 years. These sugars are what an evolutionary biologist might call the environmental or dietary trigger of the disease: the requisite ingredient that triggers the genetic predisposition and turns an otherwise healthy diet into a harmful one. Add such sugars in sufficient quantity to the diet of any population, no matter what proportion of plants to animals they eat—as Kelly West suggested in 1974 about Native American populations—and the result eventually is an epidemic of diabetes, and obesity as well.

pp. 117-121

The second pillar of modern nutritional wisdom is far more fundamental and ultimately has had far more influence on how the science has developed, and it still dominates thinking on the sugar issue. As such, it has also done far more damage. To the sugar industry, it has been the gift that keeps on giving, the ultimate defense against all arguments and evidence that sugar is uniquely toxic. This is the idea that we get obese or overweight because we take in more calories than we expend or excrete. By this thinking, researchers and public-health authorities think of obesity as a disorder of “energy balance,” a concept that has become so ingrained in conventional thinking, so widespread, that arguments to the contrary have typically been treated as quackery, if not a willful disavowal of the laws of physics.

According to this logic of energy balance, of calories-in/calories-out, the only meaningful way in which the foods we consume have an impact on our body weight and body fat is through their energy content—calories. This is the only variable that matters. We grow fatter because we eat too much—we consume more calories than we expend—and this simple truth was, and still is, considered all that’s necessary to explain obesity and its prevalence in populations. This thinking renders effectively irrelevant the radically different impact that different macronutrients—the protein, fat, and carbohydrate content of foods—have on metabolism and on the hormones and enzymes that regulate what our bodies do with these foods: whether they’re burned for fuel, used to rebuild tissues and organs, or stored as fat.

By this energy-balance logic, the close association between obesity, diabetes, and heart disease implies no profound revelations to be gleaned about underlying hormonal or metabolic disturbances, but rather that obesity is driven, and diabetes and heart disease are exacerbated, by some combination of gluttony and sloth. It implies that all these diseases can be prevented, or that our likelihood of contracting them is minimized if individuals—or populations—are willing to eat in moderation and perhaps exercise more, as lean individuals are assumed to do naturally. Despite copious reasons to question this logic and, as we’ll see, an entire European school of clinical research that came to consider it nonsensical, medical and nutrition authorities have tended to treat it as gospel. Obesity is caused by this caloric imbalance, and diabetes, as Joslin said nearly a century ago, is largely the penalty for obesity. Curb the behaviors of gluttony (Shakespeare’s Falstaff was often invoked as a pedagogical example) and sloth (another deadly sin) and all these diseases will once again become exceedingly rare.

This logic also served publicly to exonerate sugar as a suspect in either obesity or diabetes. By specifying energy or caloric content as the instrument through which foods influence body weight, it implies that a calorie of sugar would be no more or less capable of causing obesity, and thus diabetes, than a calorie of broccoli or olive oil or eggs or any other food. By the 1960s, the phrase a calorie is a calorie had become a mantra of the nutrition-and-obesity research community, and it was invoked to make just this argument (as it still is). […]

The energy-balance idea derives ultimately from the simple observation that the obese tend to be hungrier than the lean, and to be less physically active, and that these are two deviations from normal intake and expenditure: gluttony and sloth. It was first proposed as an explanation of obesity in the early years of the twentieth century, when nutrition researchers, as we discussed, were focused on carefully quantifying with their calorimeters the energy content of foods and the energy expended in human activity. At the time, the application of the laws of thermodynamics and particularly the conservation of energy to living creatures—the demonstration that all the calories we consume will either be burned as fuel or be stored or excreted—was considered one of the triumphs of late-nineteenth-century nutrition science. Nutrition and metabolism researchers embraced calories and energy as the currency of their research. When physicians began speculating as to the cause of obesity, they naturally did the same.

The first clinician to take these revelations on thermodynamics and apply them to the very human problem of obesity was the German diabetes specialist Carl von Noorden. In 1907, he proposed that “the ingestion of a quantity of food greater than that required by the body, leads to an accumulation of fat, and to obesity, should the disproportion be continued over a considerable period.”

Noorden’s ideas were disseminated widely in the United States and took root primarily through the work of Louis Newburgh, a University of Michigan physician who did so based on what he believed to be a fundamental truth: “All obese persons are alike in one fundamental respect—they literally overeat.” Newburgh assumed that overeating was the cause of obesity and so proceeded to blame the disorder on some combination of a “perverted appetite” (excessive energy consumption) and a “lessened outflow of energy” (insufficient expenditure). As for obese patients who remained obese in spite of this understanding, Newburgh suggested they did so because of “various human weaknesses such as overindulgence and ignorance.” (Newburgh himself was exceedingly lean.) Newburgh was resolutely set against the idea that other physical faults could be involved in obesity. By 1939, his biography at the University of Michigan was already crediting him with the discovery that “the whole problem of weight lies in regulation of the inflow and outflow of calories” and for having “undermined conclusively the generally held theory that obesity is the result of some fundamental fault.”

The question of a fundamental fault could not be dismissed so lightly, however. To do that required dismissing observations of German and Austrian clinical researchers who had come to conclude that obesity could only be reasonably explained by the existence of such a fault—specifically, a defect in the hormones and enzymes that served to control the flow of fat into and out of cells. Newburgh rejected this hormonal explanation, believing he had identified the cause of obesity as self-indulgence.

Gustav von Bergmann, a contemporary of Noorden’s and the leading German authority on internal medicine, criticized Noorden’s ideas (and implicitly Newburgh’s) as nonsensical. Positive energy balance—more energy in than out—occurred when any system grew, Bergmann pointed out: it accumulated mass. Positive energy balance wasn’t an explanation but, rather, a description, and a tautological one at that: logically equivalent to saying that a room gets crowded because more people enter than leave. It was a statement that described what happens but not why. It seems just as illogical, wrote Bergmann, to say children grow taller because they eat too much or exercise too little, or they remain short because they’re too physically active. “That which the body needs to grow it always finds, and that which it needs to become fat, even if it’s ten times as much, the body will save for itself from the annual balance.”

The question that Bergmann was implicitly asking is why excess calories were trapped in fat tissue, rather than expended as energy or used for other necessary biological purposes. Is there something about how the fat tissue is regulated or how fuel metabolism functions, he wondered, that makes it happen?

The purpose of a hypothesis in science is to offer an explanation for what we observe, and, as such, its value is determined by how much it can explain or predict. The idea that obesity is caused by the overconsumption of calories, Bergmann implied, failed to explain anything.

p. 129

These revelations led both directly and indirectly to the notion that diets restricted in carbohydrates—and restricted in sugar most of all—would be uniquely effective in slimming the obese. By the mid-1960s, these carbohydrate-restricted diets, typically high in fat, were becoming fashionable, promoted by physicians, not academics, and occasionally in the form of hugely successful diet books. Academic nutritionists led by Fred Stare and Jean Mayer of Harvard were alarmed by this and denounced these diets as dangerous fads (because of their high fat content, particularly saturated fat), suggesting that the physician-authors were trying to con the obese with the fraudulent argument that they could become lean without doing the hard work of curbing their perverted appetites. “It is a medical fact that no normal person can lose weight unless he cuts down on excess calories,” The New York Times would explain in 1965.

This battle played out through the mid-1970s, with the academic nutritionists and obesity researchers on one side, and the physicians-turned-diet-book-authors on the other. The obesity researchers began the 1960s believing that obesity was, indeed, an eating disorder—Newburgh’s “perverted appetite”—and the ongoing revolution in endocrinology, spurred by Yalow and Berson’s invention of the radioimmunoassay, did little to convince them otherwise. Many of the most influential obesity researchers were psychologists, and much of their research was dedicated to studying why the obese failed to restrain their appetites sufficiently—to eat in moderation—and how to induce them to do a better job of it. The nutritionists followed along as they focused on the question of whether dietary fat caused heart disease and perhaps obesity as well, because of its dense calories. (A gram of protein or a gram of carbohydrate has four calories; a gram of fat has almost nine.) In the process, they would continue to reject any implication that sugar had fattening powers beyond its caloric content. That it might be the cause of insulin resistance—after all, something was—would not cross their radar screen for decades.

pp. 199-201

In 1986, with the perceived FDA exoneration of sugar, the public-health authorities and the clinicians and researchers studying obesity and diabetes had come to a consensus that type 2 diabetes was caused by obesity, not sugar, and that obesity itself was caused merely by eating too many calories or exercising away too few. By this logic, the only means by which a macronutrient could influence body weight was its caloric content, and so, calorie for calorie, sugar was no more fattening than any other food, and thus no more likely to promote or exacerbate diabetes. This was what the sugar industry had been arguing and embracing since the 1930s. It was what Fred Stare of Harvard had in mind when he said publicly that he would prefer to get his calories from a martini than from a dessert.

A more nuanced perspective, one nourished by scientific progress, would be that if two foods or macronutrients are metabolized differently—if glucose and fructose, for instance, are metabolized in entirely different organs, as they mostly are—then they are likely to have vastly different effects on the hormones and enzymes that control or regulate the storage of fat in fat cells. One hundred calories of glucose will very likely have an entirely different effect on the human body from one hundred calories of fructose, or fifty calories of each consumed together as sucrose, despite having the same caloric content. It would take a leap of faith to assume otherwise.

Nutritionists had come to assume that a hundred calories of fat had a different effect from a hundred calories of carbohydrate on the accumulation of plaque in coronary arteries; even that a hundred calories of saturated fat would have an entirely different effect from a hundred calories of unsaturated fat. So why not expect that macronutrients would have a different effect on the accumulation of fat in fat tissue, or on the phenomena, whatever they might be, that eventually resulted in diabetes? (Insulin resistance and hyperinsulinemia, as Rosalyn Yalow and Solomon Berson, among others, had suggested in the 1960s, seemed to be a very likely bet.) But obesity and diabetes researchers, as we’ve seen, had come to embrace the mantra that “a calorie is a calorie”; they would repeat it publicly when they were presented with the idea that there was something unique about how the human body metabolizes sugar that sets it apart from other carbohydrates. The long-held view was based on the state of the science in the early years of the twentieth century, and to cling to it required a willful rejection of the decades’ worth of relevant revelations in the medical sciences that had come since.

By the 1980s, biochemists, physiologists, and nutritionists who specialized in the study of sugar or in the fructose component of sugar had come to consistent conclusions about the short-term effects of sugar consumption in human subjects, as well as the details of how sugar is metabolized and how this influences the body as a whole. The glucose we consume—in starch or flour, or as half of a sugar molecule—will be used directly for fuel by muscle cells, the brain, and other tissues, and can be stored in muscles or the liver (as a compound called glycogen), but the fructose component of sugar has a much different fate. Most of it never makes it into the circulation; it is metabolized in the liver. The metabolic pathways through which glucose passes when it is being used for fuel—in both liver and muscle cells—involve a feedback mechanism to redirect it toward storage as glycogen when necessary. This is the case with fructose, too. But the metabolism of fructose in the liver is “unfettered by the cellular controls,” as biochemists later put it, that work to prevent its conversion to fat. One result is the increased production of triglycerides, and thus the abnormally elevated triglyceride levels that were observed in many research subjects, though not all, when they ate sugar-rich diets.

While cardiologists and epidemiologists were debating whether elevated triglycerides actually increased the risk of heart disease (in the process, challenging their own beliefs that cholesterol was key), biochemists had come to accept that sucrose was “the most lipogenic” of carbohydrates—as even Walter Glinsmann, author of the FDA report on sugar, would later acknowledge—and that the liver was the site of this fat synthesis. The Israeli biochemist Eleazar Shafrir would describe this in the technical terminology as “the remarkable hepatic lipogenic capacity induced by fructose-rich diets.” It was also clear from the short-term trials in humans that this happened to a greater extent in some individuals than others, just as it did in some species of animals and not others. In human studies, subjects who had the highest triglycerides when the trials began tended to have the greatest response to reducing sugar intake, suggesting (but not proving) that the sugar was the reason they had such high triglycerides in the first place. These same individuals also tended to see the greatest drop in cholesterol levels when they were put on low-sugar diets.

Does a Healthy LCHF Diet Protect Against Sunburns?

As I’ve written about lately, there is something unique about a low-carb, high-fat diet. People feel better and have more energy. Diverse symptoms disappear, including from serious conditions that for some people are reversed, from autoimmune disorders to mood disorders. That is particularly true in the context of exercise, calorie restriction, fasting, OMAD, ketosis, and autophagy, and when combined with traditional foods, paleo, or carnivore approaches. Many have experimented in this area of dietary changes and have observed major improvements, but it isn’t always clear exactly what is causing any given improvement.

We do understand certain things well. I’ve already discussed ketosis and related factors in detail. And there has been more info coming out about autophagy, an even more fascinating topic. There is the signaling in relation to mTOR, IGF-1, and AMPK. And there are the hormones that deal with hunger, satiety, and fullness. Everything is context-dependent. For example, the carnitine in red meat can be turned into carcinogenic TMAO by Prevotella gut bacteria, but that is a non-issue as long as you aren’t eating the grains that feed Prevotella in the first place. Or consider how the vitamin C deficiency that leads to scurvy is rare on carnivore diets, even though vitamin C is found in only small amounts in animal foods, since on a low-carb diet the body needs less vitamin C. Issues with gut health, inflammation, and neurocognition are also easier to explain, as they’ve received much scientific attention.

Other results are more anecdotal, though. This is largely because the research on low-carb, high-fat diets has been limited, and in some cases, such as zero-carb, the scientific evidence is even more sparse. But what thousands of people have observed remains interesting, if not yet entirely explained. Many LCHF dieters have noted that their thoughts are less obsessive and compulsive, something I’ve argued has to do with eliminating addictive foods from the diet, especially added sugar and grains. One example is the decrease in intrusive sexual thoughts reported by some (and less distraction in general), although at the same time some also report a decrease in erectile dysfunction (the latter being unsurprising, as LCHF diets are causally linked to hormonal functioning and cardiovascular health). Sexuality definitely is changed in various ways. This is demonstrated in how early puberty becomes common when populations switch to agriculture with high amounts of carbohydrates, in particular grains. Maybe dairy has something to do with it as well, since dairy triggers growth hormone. That may be why agricultural societies were able to outbreed hunter-gatherers, overwhelming them with a continually growing supply of cheap labor and cheap lives to send off to war.

There are some confounding factors, of course. Along with eating more nutrient-dense foods with an emphasis on fat-soluble vitamins, people going on various kinds of low-carb diets also tend to increase their intake of cholesterol, saturated fat, and omega-3s while decreasing omega-6s. Cholesterol is one of the most important substances for brain health, and it helps your body to process vitamin D from sunlight. Saturated fat is a complicated issue and no one fully knows its significance, beyond our knowing that the fear-mongering about it appears to be no longer valid. As for omega-3s, they are essential to so much. The main problem is that omega-6s are at such a high level in the modern diet that they are inflammatory. In switching to healthier oils and fats, most low-carbers eliminate vegetable oils, both in junk food and in cooking, vegetable oils being the main source of omega-6s.

This could explain why some think sunburns are less common on a low-carb diet (read down through the Twitter comments). It may or may not have anything specifically to do with carbohydrates themselves and, instead, may be more about the general eating pattern common among low-carb dieters. It might have to do with oxidation and free radicals in relation to omega-6s. Or it could have something to do with fat-soluble vitamins or dietary cholesterol, intake of which is typically greater on low-carb, high-fat diets. There are similar patterns in multiple areas of dietary changes and health, and they indicate something that can’t be explained by mainstream health ideology. Consider how Americans have experienced worsening health as they have followed expert opinion in eating more vegetables, fruits, whole grains, and vegetable oils while decreasing red meat and saturated fat. Americans have been following expert advice from mainstream institutions and from their doctors. The same kind of thing has happened with people protecting themselves against sun damage. Americans have increased their use of sunscreen while spending less time in the sun, as they were told to do. What have been the results? The skin cancer rate is going up, and those avoiding the sun are less healthy. Is it a mere coincidence that the intake of omega-6s was also increasing during the same period? Maybe not.

When the actual causes are determined, we can isolate them and re-create the appropriate conditions or mimic them. This is biohacking; Siim Land is great at explaining how to get particular results based on the scientific evidence. If omega-6s, or whatever else, turn out to be the problem behind sunburns, then it’s far from being knowledge of value limited to the low-carb community. Omega-6s haven’t been as clearly on the radar of many other diets, but the health issues with omega-6s are already well known in the scientific literature. So, the advantages in this case might be attained without restricting carbs, although we don’t know that as of yet, assuming the anecdotal observations are proven valid. The interaction between omega-6s and carbohydrates might be a total package, in terms of pushing the body more fully into an inflammatory state where sunlight sensitivity becomes an issue. All we can do at the moment is offer hypotheses to be tested in personal experience and hopefully soon in scientific studies.

The body is a complex system. Change even a single factor and it can have cascading effects. But change multiple factors and the entire functioning can shift into a different state, altering numerous areas of health. Many of the results will be unpredictable based on present science, because most research up to this point has had a narrow focus in the population being studied, almost entirely those on the Standard American diet and variations of it. What is true for most people following the past half century of health advice won’t always apply to those following entirely different diets and lifestyles. It’s not that LCHF is going to heal all that ails you, but we find ourselves at a rather fascinating point in the views on diet, lifestyle, and health. We are coming to realize how profoundly the body and mind are affected by even minor changes. We have more hypotheses at present than knowledge, and that isn’t a new situation. So much of what we thought we knew in the past, the basis of the mainstream ideology of health experts, consisted largely of untested hypotheses when first advocated, and much of it remains unproven.

Now it’s time to get serious about exploring these other glimpses of entirely different possibilities of understanding. That is the point of hypotheses that often begin as observations and anecdotal evidence.

* * *

Effects of high-fat diets rich in either omega-3 or omega-6 fatty acids on UVB-induced skin carcinogenesis in SKH-1 mice
by You-Rong Lou et al

Is Sunscreen the New Margarine?
by Rowan Jacobsen

Don’t Drink (oil) and Fry (in the sun) – the link between polyunsaturated vegetable oil and skin cancer
by George Henderson

N=Many on Omega-6 and Sunburn: Can Sunburn be Reduced?
by Tucker Goodrich

Don’t Blame it on the Sun!
by Dawn Waldron

American Diabetes Association Changes Its Tune

Over the past decade, ever more mainstream health organizations and government agencies have been slowly reversing their official positions on the dietary intake of carbohydrates, sugar, fat, cholesterol, and salt. This was seen in how the American Heart Association, without acknowledgment, backed off its once strong position on fats, a position it had defended since, I believe, 1961, with the federal government adopting the same position as official policy in 1980. Here we are in 2019, more than a half century later.

Now we see the American Diabetes Association finally coming around as well. And it’s been a long time coming. When my grandmother was in an assisted living home, the doctors and nurses at the time were following the official ADA position of what were called “consistent carbs”. Basically, this meant diabetics were given a high-carb diet and that was considered perfectly fine, as long as it was consistent enough for diabetes to be managed with consistently high levels of insulin use. It was freaking insanity, defying common sense.

While my grandmother was still living with my parents, my mother kept her blood sugar under control through diet, until she went to this healthcare facility. After that, her blood sugar was all over the place. The nurses had no comprehension that not all carbohydrates are equal: the glycemic index might be similar between a cookie and a carrot, but that ignores glycemic load, not to mention the possibility that diabetics should simply be cutting out carbs in general. Instead, they argued that old people should be allowed to enjoy carbs, even if it meant that these nurses were slowly killing their patients and profiting the insulin companies at the same time. My mother was not happy about this callous attitude from these medical ‘professionals’.
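For what it’s worth, the cookie-versus-carrot point is simple arithmetic. Glycemic load is the glycemic index multiplied by the grams of available carbohydrate in a serving, divided by 100 (the standard formula). A minimal sketch, where the GI and carb figures are rough illustrative assumptions rather than measured values:

```python
def glycemic_load(gi, carb_grams):
    """Glycemic load = glycemic index x grams of available
    carbohydrate per serving / 100 (the standard formula)."""
    return gi * carb_grams / 100

# Illustrative, approximate numbers only: a carrot can have a
# middling glycemic index but carries only a few grams of carbs
# per serving, while a cookie pairs a high index with far more.
carrot_gl = glycemic_load(gi=47, carb_grams=6)    # low GL (~3)
cookie_gl = glycemic_load(gi=70, carb_grams=25)   # high GL (~18)
```

The point being that the index alone says nothing about how much glucose a realistic serving actually delivers.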

Yet here we are. The ADA now says low-carb, high-fat (LCHF) diets aren’t a fad and aren’t dangerous. They go so far as to say they are beneficial for type 2 diabetes. Those not completely ignorant have been saying this for generations. And the research has been accumulating for just as long. The shift in official recommendations that happened in the decades following the 1960s never made sense even according to the research at the time. Many academics and researchers pointed out the lack of evidence in blaming saturated fat and cholesterol. But they were ignored and dismissed, then later attacked, discredited, and silenced by influential and, in some cases, downright charismatic figures (e.g., Ancel Keys) in powerful organizations that became aligned with leading politicians and bureaucrats in key positions. Many careers were destroyed and debate was shut down.

Now those victims of dietary authoritarianism are vindicated, not that this helps all the average folk harmed. Many decades of bad dietary advice were forced onto the American public. This determined official policies and practices of government healthcare programs, school lunch programs, and healthcare providers. Because of the central position of the United States as a geopolitical power during the Cold War, countries all over the world adopted this unhealthy dietary ideology as part of their own official policies.

This also influenced the food system with the government subsidizing high yields of corn and grains to meet the recommendations of these nutritional guidelines. Big ag and big food changed their business models accordingly and put out products that were high in carbs and sugar while low in saturated fat, replacing the latter with unhealthy hydrogenated oils. At least hundreds of millions, if not billions of people, worldwide over multiple generations have suffered a horrible diet, increased sickness, bad medical care, and premature mortality as a result.

Without admitting they were wrong all this time, without apologizing for all the harm they caused, these leading experts and officials are changing their opinion. Better late than never. Mark this date for it is a historic moment.

* * *

Nutrition Therapy for Adults With Diabetes or Prediabetes: A Consensus Report
by Alison B. Evert et al, American Diabetes Association
(also see here)

EATING PATTERNS: Consensus recommendations

  • A variety of eating patterns (combinations of different foods or food groups) are acceptable for the management of diabetes.
  • Until the evidence surrounding comparative benefits of different eating patterns in specific individuals strengthens, health care providers should focus on the key factors that are common among the patterns:
    ○ Emphasize nonstarchy vegetables.
    ○ Minimize added sugars and refined grains.
    ○ Choose whole foods over highly processed foods to the extent possible.
  • Reducing overall carbohydrate intake for individuals with diabetes has demonstrated the most evidence for improving glycemia and may be applied in a variety of eating patterns that meet individual needs and preferences.
  • For select adults with type 2 diabetes not meeting glycemic targets or where reducing antiglycemic medications is a priority, reducing overall carbohydrate intake with low- or very low-carbohydrate eating plans is a viable approach.

New Consensus Report Recommends Individualized Eating Plan to Meet Each Person’s Goals, Life Circumstances and Health Status
news release from American Diabetes Association

“‘What can I eat?’ is the number one question asked by people with diabetes and prediabetes when diagnosed. This new Consensus Report reflects the ADA’s continued commitment to evidence-based guidelines that are achievable and meet people where they are and recommends an individualized nutrition plan for every person with diabetes or prediabetes,” said the ADA’s Chief Scientific, Medical and Mission Officer William T. Cefalu, MD. “The importance of this consensus also lies in the fact it was authored by a group of experts who are extremely knowledgeable about numerous eating patterns, including vegan, vegetarian and low carb.”

Nina Teicholz:

Just out: @AmDiabetesAssn guidelines–most comprehensive review to date of Dietary Patterns + diabetes prevention/treatment. What’s new: low-carb recommendations are prominent. (Says low-carb “are among the most studied eating patterns for T2 diabetes.”) […]

This is the key advancement of new @AmDiabetesAssn guidelines. Low carb is no longer “dangerous” or “fad” but a “viable” diet supported by “substantial” research and considered best for a number of T2 diabetes outcomes.

Dr. John Owens:

This is an historic day! My case managers and dietitian have been supporting my low-carb recommendations for years, going against ADA guidelines. Now they don’t have to!

Dr. Eric Sodicoff:

Still….They seem a little backward here. Bust out the low carb diet when meds not working?? Really? IMHO-Carb restriction is JOB #1 in diabetes management for use early and always. It is NOT second to medication in my treatment protocol.


If you go back to the beginning, like back in the 1930’s, the doctors were telling diabetics to stop eating carbohydrates. Then somebody fabricated the cholesterol theory of heart disease and invented a drug called statins. Then suddenly carbs were okay for diabetics.

Nutrition Therapy for Adults With Diabetes or Prediabetes: A Consensus Report — American Diabetes Association
from r/ketoscience


“Eating patterns that replace certain carbohydrate foods with those higher in total fat, however, have demonstrated greater improvements in glycemia and certain CVD risk factors (serum HDL cholesterol [HDL-C] and triglycerides) compared with lower fat diets.”

Yay! Ack that higher fat isn’t deadly.

“The body makes enough cholesterol for physiological and structural functions such that people do not need to obtain cholesterol through foods. Although the DGA concluded that available evidence does not support the recommendation to limit dietary cholesterol for the general population, exact recommendations for dietary cholesterol for other populations, such as people with diabetes, are not as clear (8). Whereas cholesterol intake has correlated with serum cholesterol levels, it has not correlated well with CVD events (65,66). More research is needed regarding the relationship among dietary cholesterol, blood cholesterol, and CVD events in people with diabetes.”

Or, in layman’s language: While the data doesn’t support vilifying cholesterol as causing heart attacks, we’re going to keep on searching in hopes we find the answer we want.


Are protein needs different for people with diabetes and kidney disease?

“Historically, low-protein eating plans were advised to reduce albuminuria and progression of chronic kidney disease in people with DKD, typically with improvements in albuminuria but no clear effect on estimated glomerular filtration rate. In addition, there is some indication that a low-protein eating plan may lead to malnutrition in individuals with DKD (317–321). The average daily level of protein intake for people with diabetes without kidney disease is typically 1–1.5 g/kg body weight/day or 15–20% of total calories (45,146). Evidence does not suggest that people with DKD need to restrict protein intake to less than the average protein intake.”
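The report’s typical-intake figures translate into a quick back-of-the-envelope calculation. A minimal sketch (the function names are mine, and the roughly 4 kcal per gram conversion for protein is the standard one, not a figure from the report):

```python
def protein_grams_by_weight(weight_kg):
    """The 1-1.5 g of protein per kg of body weight per day rule;
    returns the (low, high) ends of the range."""
    return (1.0 * weight_kg, 1.5 * weight_kg)

def protein_grams_by_calories(daily_kcal, pct):
    """Grams of protein at a given percent of daily calories,
    assuming ~4 kcal per gram of protein."""
    return daily_kcal * pct / 100 / 4

# e.g., a 70 kg adult lands at 70-105 g/day by the weight rule:
low_w, high_w = protein_grams_by_weight(70)        # (70.0, 105.0)
# ...and at 75-100 g/day on 2000 kcal by the calorie rule:
low_c = protein_grams_by_calories(2000, 15)        # 75.0
high_c = protein_grams_by_calories(2000, 20)       # 100.0
```

The two rules agree reasonably well for a typical adult, which is presumably why the report can state them interchangeably.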


“The amount of carbohydrate intake required for optimal health in humans is unknown. Although the recommended dietary allowance for carbohydrate for adults without diabetes (19 years and older) is 130 g/day and is determined in part by the brain’s requirement for glucose, this energy requirement can be fulfilled by the body’s metabolic processes, which include glycogenolysis, gluconeogenesis (via metabolism of the glycerol component of fat or gluconeogenic amino acids in protein), and/or ketogenesis in the setting of very low dietary carbohydrate intake (49).”


Low-carbohydrate (110–112): Emphasizes vegetables low in carbohydrate (such as salad greens, broccoli, cauliflower, cucumber, cabbage, and others); fat from animal foods, oils, butter, and avocado; and protein in the form of meat, poultry, fish, shellfish, eggs, cheese, nuts, and seeds. Some plans include fruit (e.g., berries) and a greater array of nonstarchy vegetables. Avoids starchy and sugary foods such as pasta, rice, potatoes, bread, and sweets. There is no consistent definition of “low” carbohydrate. In this review, a low-carbohydrate eating pattern is defined as reducing carbohydrates to 26–45% of total calories. Benefits: A1C reduction; weight loss; lowered blood pressure; increased HDL-C and lowered triglycerides.

Very low-carbohydrate (VLC) (110–112): Similar to low-carbohydrate pattern but further limits carbohydrate-containing foods, and meals typically derive more than half of calories from fat. Often has a goal of 20–50 g of nonfiber carbohydrate per day to induce nutritional ketosis. In this review a VLC eating pattern is defined as reducing carbohydrate to <26% of total calories. Benefits: A1C reduction; weight loss; lowered blood pressure; increased HDL-C and lowered triglycerides.”


Low-Carbohydrate or Very Low Carbohydrate Eating Patterns

“Low-carbohydrate eating patterns, especially very low-carbohydrate (VLC) eating patterns, have been shown to reduce A1C and the need for antihyperglycemic medications. These eating patterns are among the most studied eating patterns for type 2 diabetes. One meta-analysis of RCTs that compared low-carbohydrate eating patterns (defined as ≤45% of calories from carbohydrate) to high-carbohydrate eating patterns (defined as >45% of calories from carbohydrate) found that A1C benefits were more pronounced in the VLC interventions (where <26% of calories came from carbohydrate) at 3 and 6 months but not at 12 and 24 months (110).

“Another meta-analysis of RCTs compared a low-carbohydrate eating pattern (defined as <40% of calories from carbohydrate) to a low-fat eating pattern (defined as <30% of calories from fat). In trials up to 6 months long, the low-carbohydrate eating pattern improved A1C more, and in trials of varying lengths, lowered triglycerides, raised HDL-C, lowered blood pressure, and resulted in greater reductions in diabetes medication (111). Finally, in another meta-analysis comparing low-carbohydrate to high-carbohydrate eating patterns, the larger the carbohydrate restriction, the greater the reduction in A1C, though A1C was similar at durations of 1 year and longer for both eating patterns (112). Table 4 provides a quick reference conversion of percentage of calories from carbohydrate to grams of carbohydrate based on number of calories consumed per day.

“Because of theoretical concerns regarding use of VLC eating plans in people with chronic kidney disease, disordered eating patterns, and women who are pregnant, further research is needed before recommendations can be made for these subgroups. Adopting a VLC eating plan can cause diuresis and swiftly reduce blood glucose; therefore, consultation with a knowledgeable practitioner at the onset is necessary to prevent dehydration and reduce insulin and hypoglycemic medications to prevent hypoglycemia.

“No randomized trials were found in people with type 2 diabetes that varied the saturated fat content of the low- or very low-carbohydrate eating patterns to examine effects on glycemia, CVD risk factors, or clinical events. Most of the trials using a carbohydrate-restricted eating pattern did not restrict saturated fat; from the current evidence, this eating pattern does not appear to increase overall cardiovascular risk, but long-term studies with clinical event outcomes are needed (113–117).”
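The Table 4 conversion the report mentions is also simple arithmetic: grams of carbohydrate per day from a percentage of daily calories, at the standard roughly 4 kcal per gram of carbohydrate. A minimal sketch (the function is mine, not from the report):

```python
def carb_grams_per_day(daily_kcal, carb_pct):
    """Convert a percent of daily calories from carbohydrate
    into grams per day, at ~4 kcal per gram of carbohydrate."""
    return daily_kcal * carb_pct / 100 / 4

# The review's cutoffs, worked out for a 2000 kcal/day diet:
vlc_cutoff = carb_grams_per_day(2000, 26)   # 130.0 g/day (<26% = VLC)
lc_top     = carb_grams_per_day(2000, 45)   # 225.0 g/day (top of "low")
```

So on a typical 2000 kcal diet, “very low carbohydrate” means staying under about 130 g of carbohydrate a day.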


What is the evidence to support specific eating patterns in the management of type 1 diabetes?

“For adults with type 1 diabetes, no trials met the inclusion criteria for this Consensus Report related to Mediterranean-style, vegetarian or vegan, low-fat, low-carbohydrate, DASH, paleo, Ornish, or Pritikin eating patterns. We found limited evidence about the safety and/or effects of fasting on type 1 diabetes (129). A few studies have examined the impact of a VLC eating pattern for adults with type 1 diabetes. One randomized crossover trial with 10 participants examined a VLC eating pattern aiming for 47 g carbohydrate per day without a focus on calorie restriction compared with a higher carbohydrate eating pattern aiming for 225 g carbohydrate per day for 1 week each. Participants following the VLC eating pattern had less glycemic variability, spent more time in euglycemia and less time in hypoglycemia, and required less insulin (130). A single-arm 48-person trial of a VLC eating pattern aimed at a goal of 75 g of carbohydrate or less per day found that weight, A1C, and triglycerides were reduced and HDL-C increased after 3 months, and after 4 years A1C was still lower and HDL-C was still higher than at baseline (131). This evidence suggests that a VLC eating pattern may have potential benefits for adults with type 1 diabetes, but clinical trials of sufficient size and duration are needed to confirm prior findings.”

Sweeteners Can Mess You Up!

Sugar, in large doses, is a harmful substance. Many people have warned against sugar going back to the 1800s. The strongest case against it was made by the physiologist, nutritionist, and professor John Yudkin back in 1972 with his book Pure, White and Deadly. For speaking the truth, he had his reputation destroyed by Ancel Keys. But more recently, the science journalist Gary Taubes brought the topic back to public attention in his own book The Case Against Sugar.

Yudkin has been vindicated: Keys’ original research that blamed saturated fat for heart disease has since been re-analyzed, showing that sugar is the stronger correlation. We also now understand the science of why that is true. But it isn’t only about sugar. The sweet taste, whether from sugar or non-nutritive sweeteners, still causes many of the same problems. All sweeteners affect insulin, the gut microbiome, cell functioning, neurocognition, mood, and much else. For example, consumption of both sugar and aspartame is associated with depression, at least in one study that was a randomized controlled trial (G. N. Lindseth, Neurobehavioral Effects of Aspartame Consumption).

They do alter serotonin levels, if not dopamine in all cases — some sweeteners affect dopamine and some don’t. This is observable in how people addicted to sugar so easily shift to non-sugar sweeteners and then act in the same addicted way, finding it hard to imagine giving them up. One way or another, addictive pathways seem to be elicited. The brain isn’t fooled, and so the body will still hunger for the sugar it thinks it is getting from the sweet taste.

Exchanging one addictive substance, sugar, for another, non-sugar sweetener, is not a benefit. The problem is the addiction itself. A diet high in carbs and sugars is addictive. Throwing in some other kinds of sweeteners doesn’t change this. The best option is to break the addictive cycle entirely by going low-carb or, better yet, ketogenic. And there is no good evidence that artificial sweeteners are even compatible with a ketogenic diet, since they might knock the body out of ketosis in the way sugar does. To be certain, just eliminate all sweeteners and so kill the problem at its root.

On the other hand, that can be easier said than done. I know sugar addiction, as in I was a sugar fiend from childhood to my thirties. And I did for years increase my use of other sweeteners, in an attempt to break free from the hold sugar had over my brain and mood. This wasn’t a particularly successful strategy. And my health was not improved, as the non-sugar sweeteners maintained my high-carb cravings.

I simply had to cut them out strictly. This simple truth is reinforced every time I slowly increase sweeteners in my diet and the cravings creep back in. I just don’t feel good with them. The lesson has been fully learned at this point.

It was amazing what a difference it made once my sweet tooth went away. Only then did my physical health improve, and my psychological health soon followed. I can’t emphasize this enough. Carbs, sugars, and other sweeteners will seriously mess you up over time. You might not notice it for decades, but it all catches up with you. The damage is being done, even if you don’t notice it slowly accumulating. And realize the consequences won’t be limited to the sickliness of obesity, diabetes, and heart disease, as neurocognitive and mental health can also decline (e.g., Alzheimer’s is now called type 3 diabetes by some).

An occasional sweet at a birthday party or holiday gathering is one thing. Maybe you can have that and immediately go back on a healthy low-carb diet. Maybe or maybe not. If you were ever a sugar addict, as most Americans are, you are tempting fate. It’s like a recovering alcoholic taking that first sip of whiskey, vodka, or beer; like the recovering druggie getting that first shot of heroin or puff of the crack pipe.

Sugar is a drug, as research shows, that elicits the same reward pathway in the brain as other drugs and all sweeteners can elicit the same or similar pathways. You’ll hunger for more. And even if other sweeteners don’t have all of the problems of sugar, they still have plenty of potential problems that could do serious harm to your health over time.

* * *

Sucralose Promotes Food Intake through NPY and a Neuronal Fasting Response
by Qiao-Ping Wang et al

The truth about artificial sweeteners – Are they good for diabetics?
by Vikas Purohit and Sundeep Mishra

Consuming low-calorie sweeteners may predispose overweight individuals to diabetes
by Jenni Glenn Gingery and Colleen Williams

Not So Sweet: Metabolic Problems Caused by Artificial Sweeteners
by Serena Cho

Artificial Sweeteners Impact Metabolic Health Even on Cellular Level
by Kristen Monaco

Artificial Sweeteners Could be Sabotaging Your Microbiome, Says Study
by Amel Ahmed

Sugar Substitutes or Sugar: What’s Better for Diabetes?
by Kathleen Doheny

Artificial sweeteners linked to diabetes and obesity
by James Brown and Alex Conner

Artificial Sweeteners: Agents of Insulin Resistance, Obesity and Disease
by Loren Cordain

The Unbiased Truth About Artificial Sweeteners
by Chris Kresser

How Artificial Sweeteners Wreak Havoc on Your Gut
by Chris Kresser

Artificial Sweeteners Can Lead to Diabetes In Overweight People
by Gundry MD Team

Artificial Sweeteners Could Be Ruining Your Gut Health
by Gundry MD Team

Are Artificial Sweeteners Safe for the Brain and Gut
by Siim Land

Artificial Sweeteners Don’t Fool Your Brain
by Joseph Mercola

Tricking Taste Buds but Not the Brain: Artificial Sweeteners Change Brain’s Pleasure Response to Sweet
by Caitlin Kirkwood

Artificial Sweeteners: Why You Should Completely Avoid Them to Stay Healthy
by Elizabeth Lyden

Aspartame: 11 Dangers of This All-Too-Common Food Additive
by Rebekah Edwards

Aspartame Side Effects: Recent Research Confirms Reasons for Concern
by University Health News Staff

The Effects of Aspartame on Fibromyalgia and Chronic Fatigue Syndrome
by Adrienne Dellwo

Are Artificial Sweeteners Damaging Your Blood Vessels?
by Michelle Schoffro Cook

Direct and indirect cellular effects of aspartame on the brain
by P. Humphries, E. Pretorius, and H. Naudé

Effects of repeated doses of aspartame on serotonin and its metabolite in various regions of the mouse brain.
by R. P. Sharma and R. A. Coulombe Jr.

Neurophysiological symptoms and aspartame: What is the connection?
by Arbind Kumar Choudhary and Yeong Yeh Lee

The debate over neurotransmitter interaction in aspartame usage
by Arbind Kumar Choudhary and Yeong Yeh Lee

The Connection between Aspartame (Artificial Sweetener) and Panic Attacks, Depression, Bipolar Disorder, Memory Problems, and Other Mental Symptoms
by Betty Martini

Side-Effects of Aspartame on the Brain
by Michael Greger

Migraine Triggers: Artificial Sweeteners
by Jeremy Orozco

Intense Sweetness Surpasses Cocaine Reward
by Magalie Lenoir, Fuschia Serre, Lauriane Cantin, and Serge H. Ahmed

Diet Soft Drinks Linked to Depression
by Naveed Saleh

Why is Diet Soda Addictive?
by Edward Group

Neurobiology of Addiction
by George F. Koob, Michel Le Moal
p. 448

Accumulating evidence also suggests that dopamine is not required for nondrug reward. In a study in which dopamine release in the nucleus accumbens core and shell was measured with precise voltammetric techniques during self-stimulation, it was shown that if dopamine activation is a necessary condition for brain stimulation reward, evoked dopamine release is actually not observed during brain stimulation reward and is even diminished (Garris et al, 1999). Also, mice completely lacking tyrosine hydroxylase, such that they cannot make dopamine, demonstrated the ability to learn to consume sweet solutions and showed a preference for sucrose and saccharin. Dopamine was not required for animals to find sweet tastes of sucrose or saccharin rewarding (Cannon and Palmiter, 2003; Cannon and Bseikri, 2004).

* * *

I wrote this post as a response to a video, Keto & Beverages, by the LCHF advocate Dr. Westman.

Below is an amusing and irritating (and, sadly, all too common) dialogue, or rather miscommunication, I had with Dr. Eric Westman, or someone writing on his behalf, at his YouTube channel, Adapt Your Life. The most frustrating part is that I’m mostly in agreement with Dr. Westman, as I too adhere to LCHF. For that reason, I don’t want to be harshly critical, nor do I want to be polarized into a stronger position than I actually support, but I must admit that my emotional response was a bit negative.

To stand back from the disagreement, I don’t even have a strong opinion on what others do in terms of sweeteners. I don’t recommend sweeteners, sugar or otherwise, based on personal experience. But if artificial sweeteners help some people to transition to a healthier diet (or if they believe this to be true, and I won’t dismiss the placebo effect), then more power to them. Anyway, here is the interaction that rubbed me the wrong way:

Me: “Wasn’t there a recent study that showed even artificial sweeteners can lead to diabetes? The body still responds to the sweet taste as if it were sugar and over time messes with insulin sensitivity.”

Other person: “Hi Ben Steele, I’m not sure we have seen this study.”

Me: “I decided to write a post about it. I found the info I was looking for. At the end of the post, I share links to it and other info about the problems with non-sugar sweeteners.

“As a recovering sugar addict who followed that up with addiction to other sweeteners, I personally would recommend against such substances. But each individual has to decide for themselves.”

[I then linked to this post.]

Other person: “Hi Ben Steele, we opened a couple and didn’t find an actual “study”. Just peoples opinion and thoughts. We would be interested to read a study, preferably a RCT as this is the gold standard of studies. Thanks for this 😊”

Me: “Multiple links were to “studies”. The first two links are papers on studies, the third is the press release of a study, the fourth is a Yale report about a Yale study, five more links further down are papers on studies, and the last is a quote from an academic book from a university press. So, about a third of them linked to studies and academic material. As for all the rest, they directly reference and discuss numerous other studies.

“If you are interested to read a study, then do so. But if not, then don’t. I can’t force you to read anything.”

To further respond, I’m not sure how much of the research consists of randomized controlled trials. But after doing some casual research that could as easily have been done by Dr. Westman or his staff, I found info on RCT research. I noticed it briefly mentioned in a few of the links above, though I didn’t check all of them. I also found RCTs on the topic elsewhere in doing a web search.

I still find it irritating, though. It feels hypocritical. Dr. Westman or his representative was acting with condescension, intentional or not.

Why is this person demanding RCTs of me when they don’t hold themselves to this same standard? Dr. Westman offered no RCTs to back up his recommendations. And these artificial sweeteners were approved by the FDA without any RCTs proving their safety. So, why is it that critics have to prove they are unsafe? Why would we allow invented chemicals to be put on the market and then have doctors recommend them to patients without knowing their long-term safety or side effects?

Just to prove my point, I will share some RCT evidence (see below). And to be fair, I will admit that the results are mixed and, one might argue, inconclusive — consider aspartame, which has been researched more fully (see: Travis Saunders, Aspartame: Cause of or Solution to Obesity?; and Michael Joseph, Is Aspartame Safe Or Is It Bad For You?). But based on a familiarity with the available research, without more and better research, no sane and reasonable person would give artificial sweeteners a clean bill of health and proclaim them safe for general mass consumption, especially on a regular basis as a preferred flavoring. And whether or not artificial sweeteners cause weight gain, that might be the least of our worries, given the potential side effects seen in some of the studies. Some precautionary principle is in order.

Still, yes, it is hard to state a strong opinion about the present evidence, beyond a note of caution. But every individual is free to dismiss such caution and use artificial sweeteners anyway. They might or might not be helpful for losing weight, and even if they are, long-term use might lead to detrimental outcomes in other areas of health. Maybe that risk seems worthwhile, assuming short-term weight loss is all that concerns you, and assuming that short-term use won’t lead to long-term use and won’t sabotage a long-term healthy diet. Individuals should make that decision with eyes wide open, knowing that the potential risks could be quite serious.

I understand. There are also potential benefits. Those addicted to sugar are dealing with a highly destructive substance, and artificial sweeteners may seem like the only choice. And who am I to judge? That is what I did. While transitioning off sugar, I spent a number of years consuming my fair share of laboratory-invented sweeteners. They did get me off sugar, but all that happened was that I became addicted to these other sweeteners instead. They maintained my sweet tooth and so encouraged me to continue eating a diet high in carbs and sugar. There was no obvious benefit, though it did eventually lead me to give up all sweeteners. I just don’t know whether the artificial sweeteners were more of a help or a hindrance in that process.

Whatever your decision, know that these substances aren’t without risks. It is dietary Russian roulette. Maybe there will be no serious harm and maybe there will. We shall all find out decades from now when the children and young adults raised on these chemicals reach older age. Here is my perspective. Why take any risk at all when it is completely unnecessary? We already know how to stop sugar cravings in their tracks. With a ketogenic diet, you won’t need to exchange addiction to sugar with an addiction to artificial sweeteners. It’s the simplest solution and, for such people, the only solution with a guaranteed net positive health outcome.

As a quick note, I’d point out a few things about the research. First, all sweeteners affect the body and so aren’t neutral substances, but it is unknown if the effects are a net benefit or net loss. Also, different sweeteners have different effects and the reason for this is not entirely understood. There are still other concerns.

The worst effects in animal studies were seen with high doses. That makes one wonder about the effect of artificial sweeteners combined with the effects of the numerous other chemicals, additives, toxins, and pollutants, along with the other physiological and environmental stressors, that most people are exposed to; the interaction of multiple factors is an area that mostly remains unexplored and, more importantly, rarely controlled for. And as far as I know, no study has ever examined various sweeteners in relation to low-carb, zero-carb, and ketogenic diets. Most of the studies that have been done used subjects on the severely unhealthy standard American diet, and so maybe for those people an artificial sweetener is better than the alternative of a high-sugar diet.

Basically, we are in a state of far greater ignorance than knowledge. It’s anyone’s guess. As always, you are taking your life into your own hands. Whatever a journalist, doctor, or health expert may say, it is in the end your life that is at stake, not theirs.

* * *

Gain weight by “going diet?” Artificial sweeteners and the neurobiology of sugar cravings
by Qing Yang

[C]onsensus from interventional studies suggests that artificial sweeteners do not help reduce weight when used alone [2,25]. BMI did not decrease after 25 weeks of substituting diet beverages for sugar-sweetened beverages in 103 adolescents in a randomized controlled trial, except among the heaviest participants [26]. A double blind study subjected 55 overweight youth to 13 weeks of a 1,000 Kcal diet accompanied by daily capsules of aspartame or lactose placebo. Both groups lost weight, and the difference was not significant. Weight loss was attributed to caloric restriction [27]. Similar results were reported for a 12-week, 1,500 Kcal program using either regular or diet soda [28]. Interestingly, when sugar was covertly switched to aspartame in a metabolic ward, a 25 percent immediate reduction in energy intake was achieved [29]. Conversely, knowingly ingesting aspartame was associated with increased overall energy intake, suggesting overcompensation for the expected caloric reduction [30]. Vigilant monitoring, caloric restriction, and exercise were likely involved in the weight loss seen in multidisciplinary programs that included artificial sweeteners [31,32].

Nonnutritive sweeteners and cardiometabolic health: a systematic review and meta-analysis of randomized controlled trials and prospective cohort studies
by Meghan B. Azad, Ahmed M. Abou-Setta, Bhupendrasinh F. Chauhan, Rasheda Rabbani, Justin Lys, Leslie Copstein, Amrinder Mann, Maya M. Jeyaraman, Ashleigh E. Reid, Michelle Fiander, Dylan S. MacKay, Jon McGavock, Brandy Wicklow, and Ryan Zarychanski

Evidence from small RCTs with short follow-up (median 6 mo) suggests that consumption of nonnutritive sweeteners is not consistently associated with decreases in body weight, BMI or waist circumference. However, in larger prospective cohort studies with longer follow-up periods (median 10 yr), intake of nonnutritive sweeteners is significantly associated with modest long-term increases in each of these measures. Cohort studies further suggest that consumption of nonnutritive sweeteners is associated with higher risks of obesity, hypertension, metabolic syndrome, type 2 diabetes, stroke and cardiovascular disease events; however, publication bias was indicated for type 2 diabetes, and there are no data available from RCTs to confirm these observations.

Previous reviews12,65 concluded that, although data from RCTs support weight-loss effects from sustained nonnutritive sweetener interventions, observational studies provide inconsistent results. Building on these findings, we included new studies14–24 and found that consumption of nonnutritive sweeteners was not generally associated with weight loss among participants in RCTs, except in long-term (≥ 12 mo) trials with industry sponsorship. In addition, we found that consumption of nonnutritive sweeteners was associated with modest long-term weight gain in observational studies. Our results also extend previous meta-analyses that showed higher risks of type 2 diabetes32,33 and hypertension66 with regular consumption of nonnutritive sweeteners.

Sucralose decreases insulin sensitivity in healthy subjects: a randomized controlled trial
by Alonso Romo-Romo, Carlos A. Aguilar-Salinas, Griselda X. Brito-Córdova, Rita A. Gómez-Díaz, and Paloma Almeda-Valdes

We performed a randomized controlled trial involving healthy subjects without comorbidities and with a low habitual consumption of nonnutritive sweeteners (n = 33/group). […]

Individuals assigned to sucralose consumption showed a significant decrease in insulin sensitivity with a median (IQR) percentage change of −17.7% (−29.3% to −1.0%) in comparison to −2.8% (−30.7% to 40.6%) in the control group (P= 0.04). An increased acute insulin response to glucose from 577 mU · L-1· min (350–1040 mU · L-1· min) to 671 mU · L-1· min (376–1010 mU · L-1· min) (P = 0.04) was observed in the sucralose group for participants with adequate adherence.

Sucralose may have effects on glucose metabolism, and our study complements findings previously reported in other trials. Further studies are needed to confirm the decrease in insulin sensitivity and to explore the mechanisms for these metabolic alterations.

Neurobehavioral Effects of Aspartame Consumption
by Glenda N. Lindseth, Sonya E. Coolahan, Thomas V. Petros, and Paul D. Lindseth

Despite its widespread use, the artificial sweetener aspartame remains one of the most controversial food additives, due to mixed evidence on its neurobehavioral effects. Healthy adults who consumed a study-prepared high-aspartame diet (25 mg/kg body weight/day) for 8 days and a low-aspartame diet (10 mg/kg body weight/day) for 8 days, with a 2-week washout between the diets, were examined for within-subject differences in cognition, depression, mood, and headache. Measures included weight of foods consumed containing aspartame, mood and depression scales, and cognitive tests for working memory and spatial orientation. When consuming high-aspartame diets, participants had more irritable mood, exhibited more depression, and performed worse on spatial orientation tests. Aspartame consumption did not influence working memory. Given that the higher intake level tested here was well below the maximum acceptable daily intake level of 40–50 mg/kg body weight/day, careful consideration is warranted when consuming food products that may affect neurobehavioral health.

Artificial Sweeteners: A Systematic Review and Primer for Gastroenterologists
by Marisa Spencer, Amit Gupta, Lauren Van Dam, Carol Shannon, Stacy Menees, and William D Chey

Artificial sweeteners (AS) are ubiquitous in food and beverage products, yet little is known about their effects on the gastrointestinal (GI) tract, and whether they play a role in the development of GI symptoms, especially in patients with irritable bowel syndrome. Utilizing the PubMed and Embase databases, we conducted a search for articles on individual AS and each of these terms: fermentation, absorption, and GI tract. Standard protocols for a systematic review were followed. At the end of our search, we found a total of 617 eligible papers, 26 of which were included. Overall, there is limited medical literature available on this topic. The 2 main areas on which there is data to suggest that AS affect the GI tract include motility and the gut microbiome, though human data is lacking, and most of the currently available data is derived from in vivo studies. The effect on motility is mainly indirect via increased incretin secretion, though the clinical relevance of this finding is unknown as the downstream effect on motility was not studied. The specific effects of AS on the microbiome have been conflicting and the available studies have been heterogeneous in terms of the population studied and both the AS and doses evaluated. Further research is needed to assess whether AS could be a potential cause of GI symptoms. This is especially pertinent in patients with irritable bowel syndrome, a population in whom dietary interventions are routinely utilized as a management strategy.

Association between intake of non-sugar sweeteners and health outcomes: systematic review and meta-analyses of randomised and non-randomised controlled trials and observational studies
by Ingrid Toews, Szimonetta Lohner, Daniela Küllenberg de Gaudry, Harriet Sommer, Joerg J Meerpohl

In one randomised controlled trial,85 total cholesterol concentration decreased strongly in sucrose groups but increased in the aspartame group (mean difference 0.44 mmol/L, 95% confidence interval 0.33 to 0.56; n=45). […]

In one crossover non-randomised controlled trial,83 researchers found a significantly higher increase in blood glucose in children of preschool age receiving aspartame compared with sucrose (mean difference 0.24 mmol/L, 95% confidence interval 0.09 to 0.39; n=25), a significantly higher increase in blood glucose in children of school age receiving saccharin compared with sucrose (0.65 mmol/L, 0.44 to 0.86; n=23), and a significantly lower increase in blood glucose in children of preschool age receiving aspartame compared with saccharin (−0.75 mmol/L, −0.95 to −0.64; n=23, very low certainty of evidence). In overweight children involved in active weight loss, blood glucose decreased less strongly in those receiving NSSs compared with those not receiving NSSs (0.3 mmol/L, 0.2 to 0.4; n=49, very low certainty of evidence).

Systematic review of the relationship between artificial sweetener consumption and cancer in humans: analysis of 599,741 participants
by A. Mishra, K. Ahmed, S. Froghi, and P. Dasgupta

The statistical value of this review is limited by the heterogeneity and observational designs of the included studies. Although there is limited evidence to suggest that heavy consumption may increase the risk of certain cancers, overall the data presented are inconclusive as to any relationship between artificial sweeteners and cancer.

Evidence suggesting artificial sweeteners may be harmful should give us pause
by Leslie Beck

The study, a randomized controlled trial, investigated the effect of daily sucralose consumption on insulin sensitivity in 66 healthy, normal-weight adults who didn’t regularly use artificial sweeteners. […]

This finding is provocative because it suggests that regular consumption of sucralose can lead to insulin resistance in healthy, normal-weight people.

Sucralose may affect blood sugar control by activating sweet taste receptors in the gut, triggering the release of insulin. Artificial sweeteners are also thought to disrupt the balance of good gut bacteria in a direction that can lead to insulin resistance and weight gain.

Did America Get Fat by Drinking Diet Soda?
by Daniel Engber

Perhaps more to the point, researchers have tested the effects of diet soda on people trying to lose weight, and gotten positive results. A randomized, controlled trial published in May compared the efficacy of artificially sweetened beverages and water in a 12-week weight-loss program. Both treatment groups ended up with smaller waists, and the people taking diet drinks appeared to lose more weight. That study’s lead authors are consultants for Coca-Cola, so perhaps we shouldn’t take this as the final word. But another randomized trial from 2012, this one funded by a bottled-water company, came to a similar conclusion. When overweight and obese adults switched to diet beverages or water for a six-month stretch, both groups shed 1 inch of girth, on average, and 5 pounds.

Health outcomes of non-nutritive sweeteners: analysis of the research landscape
by Szimonetta Lohner, Ingrid Toews, and Joerg J. Meerpohl

Finally, we included 372 studies in our scoping review, comprising 15 systematic reviews, 155 randomized controlled trials (RCTs), 23 non-randomized controlled trials, 57 cohort studies, 52 case-control studies, 28 cross sectional studies and 42 case series/case reports.

In healthy subjects, appetite and short term food intake, risk of cancer, risk of diabetes, risk of dental caries, weight gain and risk of obesity are the most investigated health outcomes. Overall there is no conclusive evidence for beneficial and harmful effects on those outcomes. Numerous health outcomes including headaches, depression, behavioral and cognitive effects, neurological effects, risk of preterm delivery, cardiovascular effects or risk of chronic kidney disease were investigated in fewer studies and further research is needed. In subjects with diabetes and hypertension, the evidence regarding health outcomes of NNS use is also inconsistent.

Early-Life Exposure to Non-Nutritive Sweeteners and the Developmental Origins of Childhood Obesity: Global Evidence from Human and Rodent Studies
by Alyssa J. Archibald, Vernon W. Dolinsky, and Meghan B. Azad

Non-nutritive sweeteners (NNS) are increasingly consumed by children and pregnant women around the world, yet their long-term health impact is unclear. Here, we review an emerging body of evidence suggesting that early-life exposure to NNS may adversely affect body composition and cardio-metabolic health. Some observational studies suggest that children consuming NNS are at increased risk for obesity-related outcomes; however, others find no association or provide evidence of confounding. Fewer studies have examined prenatal NNS exposure, with mixed results from different analytical approaches. There is a paucity of RCTs evaluating NNS in children, yielding inconsistent results that can be difficult to interpret due to study design limitations (e.g., choice of comparator, multifaceted interventions). The majority of this research has been conducted in high-income countries. Some rodent studies demonstrate adverse metabolic effects from NNS, but most have used extreme doses that are not relevant to humans, and few have distinguished prenatal from postnatal exposure. Most studies focus on synthetic NNS in beverages, with few examining plant-derived NNS or NNS in foods. Overall, there is limited and inconsistent evidence regarding the impact of early-life NNS exposure on the developmental programming of obesity and cardio-metabolic health. Further research and mechanistic studies are needed to elucidate these effects and inform dietary recommendations for expectant mothers and children worldwide.

Noncaloric Sweeteners in Children: A Controversial Theme
by Samuel Durán Agüero, Lissé Angarita Dávila, Ma. Cristina Escobar Contreras, Diana Rojas Gómez, and Jorge de Assis Costa

On the other hand, three transversal studies, including 385 and 3311 children, showed positive association between the intakes of NCS and BMI [53]. Similar results were obtained with pregnant woman who ingested NCS, showing more probability of having babies with increased risk for later obesity or overweight. However, the limitation of the studies is that these were of observational type, and the findings do not necessarily imply a significant correlation between the intake of artificial sweeteners and weight gain [54, 55]. In a meta-analysis of intake of NCS that included 11.774 citations, 7 trials, 1003 participants, and 30 cohort studies (adults and adolescents) it was concluded that there is not enough evidence from randomized controlled trials to demonstrate the positive effect of NCS on controlling body weight. Findings of observational studies suggest that the continuous ingestion of NCS could be associated with BMI and cardiometabolic risk increase [56].

Diet Soda May Alter Our Gut Microbes And Raise The Risk Of Diabetes
by Allison Aubrey

While the findings are preliminary, the paper could begin to explain why studies of diet soda point in opposite directions.

“All of us have a microbiome” made up of trillions of organisms. “[It’s] extremely complex. Everybody’s microbiome is a little different,” Blaser says.

And the ways our microbiomes respond to what we eat can vary, too.

In the study, the Israeli researchers find that as mice and people started consuming artificial sweeteners, some types of bacteria got pushed out, and other types of bacteria began to proliferate.

It could be that for some people who responded negatively to the artificial sweetener, the bacteria that got crowded out were helping to keep glucose in check.

How it’s happening isn’t clear, and Blaser says a lot more research is needed.

“So that’s the next step,” Blaser says. “Firstly, for [researchers] to confirm this, to see if it’s really true.” And the next challenge is to understand the mechanism. “How does the change in the microbial composition — how is it causing this?”

Lots of researchers agree they’d like to see a large-scale study.

“It’s much too early, on the basis of this one study, [to conclude that] artificial sweeteners have negative impacts on humans’ [risk for diabetes],” says James Hill, director of the Center for Human Nutrition at the University of Colorado.

He points to a randomized controlled trial published in 2011 that found artificial sweeteners helped to limit the rise in blood sugar in a group of slightly overweight people, compared with sugar.

Hill also points to a study of people on the National Weight Control Registry that found successful long-term dieters tend to consume artificially sweetened foods and beverages at a higher rate compared with the general population.

So expect the debate over diet sodas to continue — and also anticipate hearing more about the role of our microbiomes.

Study links artificial sweeteners and weight gain
by Staff

Azad said what her team was most struck by was the lack of good, rigorous studies on artificial sweeteners.

“Surprisingly, given how common these products are, not many studies have looked at the long-term impact of their consumption,” Azad told CTV News Channel from Lisbon, Portugal.

She noted that only seven of the 37 studies they reviewed were randomized controlled trials (RCTs), and all were relatively short, following participants for a median period of only six months.

The other 30 studies were longer and followed the participants for an average of 10 years, but they were observational studies – a form of research that is not as precise as a controlled trial.

“A lot of the studies we found were observational, meaning they could show a link but they can’t prove a cause-and-effect relationship,” she said.

Among the seven RCT’s, regular consumption of sweeteners had no significant effect on weight loss. From the other studies, the team found that regular use of sweeteners was associated with an increased risk of type 2 diabetes and high blood pressure, and modest increases in weight and waist circumferences.

“What we found was that at the end of the day, from all of this research, there really wasn’t firm evidence of a long-term benefit of artificial sweeteners. And there was some evidence of long-term harm from long-term consumption,” Azad said.

As for why artificial sweeteners seem to be linked to weight gain, not weight loss, Azad says no one knows for sure but there are lots of theories.

One theory is that the sweeteners somehow disrupt healthy gut bacteria. Another theory is that the sweeteners confuse our metabolisms, causing them to overreact to sugary tastes.

It could be that those who regularly use artificial sweeteners over-compensate for the missed calories from sugar, or they could have otherwise unhealthy diets in conjunction with sweetener use.

Azad would like to see a lot more research on the long-term use of sweeteners, in particular studies that could compare the different sweeteners, to see if one is any better than another.

In the meantime, for those trying to cut down on their sugar consumption, Azad says it’s important not to switch from one harmful food item to another.

“I think the takeaway for Canadians at this point is to maybe think twice about whether you really want to be consuming these artificial sweeteners, particularly on an everyday basis,” Azad said, “because really we don’t have evidence to say for sure whether these are truly harmless alternatives to sugar.”

Like water fasts, meat fasts are good for health.

I was on a low-carb paleo diet for about a year with a focus on intermittent fasting and ketosis. Influenced by Dr. Terry Wahls and Dr. Will Cole, both former vegetarians who converted to paleo, I included large helpings of vegetables but without the starchy carbs. It was a game-changer for me, as my health improved on all fronts, from weight to mood. But every time my carb and sugar intake crept up, I could feel the addictive cravings coming back, and so I decided to limit my diet to a greater extent. Zero-carb had already been on my radar, and I then looked more into it. It seemed worth a try.

So, I went carnivore for the past couple of months, mostly as an experiment and not with the idea of it being permanent. It is the best elimination diet ever, and it definitely takes low-carb to another level, but I wanted to be able to compare how I felt with plants in my diet. So, a couple of weeks ago, with spring in the air and wild berries on their way, I ended my zero-carb carnivory with a three-day fast and reintroduced some light soup and fermented vegetables. I felt fine. Even after the extended period of low-carb dieting, this zero-carb experiment made me realize how much better I feel when severely restricting carbs and sugar. Now back on a paleo-keto diet, I’m going to keep my focus on animal foods and be more cautious about which plant foods I include and how often.

Dr. Anthony Gustin offers an approach similar to Siim Land’s, as discussed in the first four videos below. A low-carb diet, especially strict carnivore (no dairy, just meat), is an extremely effective way of healing digestive issues and reducing bodily inflammation. The carnivore diet is a low-residue diet because meat and fat get fully digested much earlier in the digestive tract, whereas lots of fiber can clog you up and cause constipation. A similar kind of benefit is seen with the ketogenic diet: microbiome imbalance and overgrowth are improved by initially starving and decreasing the number of microbes, but after some months the microbiome recovers to its original numbers with a healthier balance.

Still, as Gustin and Land argue, it’s good to maintain some variety in the diet for metabolic flexibility. But we must understand that plants stress the system (Steven Gundry, The Plant Paradox): they are inflammatory, unlike most animal foods (though dairy can be problematic for some), and they contain anti-nutrients that can cause deficiencies. There are other problems as well, such as damage from oxalates, as explained by the oxalate expert Sally K. Norton in the fifth and sixth videos; in the seventh video, she argues that plants traditionally were eaten only seasonally, not daily (also written up as an academic paper: Lost Seasonality and Overconsumption of Plants: Risking Oxalate Toxicity).

Even so, one might argue that small amounts of stress are good for what is called hormesis — in the way that working out stresses the body in order to build muscle, whereas constant exertion would harm the body; or in the way that being exposed to germs as a child helps the development of a stronger immune system — with a quick explanation by Siim Land in the second video below. Otherwise, by too strictly excluding foods for too long, you might develop sensitivities, which the fourth video is about. As Cookie Monster said about cookies on the Colbert show, vegetables are a sometimes food. Think of plant foods more as medicine, in that the dose matters.

Plant foods are beneficial in small portions on occasion, whereas constantly overloading your body with them never gives your system a rest. Fruits and veggies are good, in moderation. It turns out a “balanced diet” doesn’t mean massive piles of greens for every meal and snacks in between. Grains aren’t the only problematic plant food. Sure, on a healthy diet, you can have periods of time when you eat more plant foods and maybe be entirely vegan on certain days, but also make sure to fast from plant foods entirely every now and then or even for extended periods.

That said, I understand that we've been told our entire lives to eat more fruits and veggies. And I'm not interested in trying to prove zero-carb is the best. If you're afraid that you'll be unhealthy without a massive load of plant nutrients, then make sure to take care of potential problems with gut health and inflammation. In the eighth video below, a former vegan explains how she unknowingly had been managing her plant-induced inflammation with CBD oil, something she didn't realize until after stopping its use. She later turned to an animal-based diet and the inflammation was no longer an issue.

But for those who don’t want to go strictly low-carb, much less carnivore, there are many ways to manage one’s health besides anti-inflammatory CBD oil. Be sure to include other anti-inflammatories such as turmeric (curcumin) combined, for absorption, with black pepper (bioperine). Also, intermittent and extended fasting will be all the more important to offset the plant intake, although everyone should fast, as it is what the human body is designed for. A simple method is a limited eating period, even going so far as one meal a day (OMAD), but any restriction is better than none. Remember that even sleeping at night is a fast, and so skipping breakfast or eating later will extend that fast with its benefits; or else skipping dinner will start the fasting period earlier.
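The arithmetic of meal timing is simple but easy to overlook, so here is a toy sketch (my own illustration, with made-up example times, not a recommendation of any particular schedule) showing how shifting the first or last meal of the day stretches the overnight fast:

```python
# A toy illustration of how meal timing stretches the overnight fast
# into a longer time-restricted eating window. Times are hypothetical.
from datetime import datetime, timedelta

def fasting_hours(last_meal: str, first_meal: str) -> float:
    """Hours between yesterday's last meal and today's first meal (24-hour clock)."""
    fmt = "%H:%M"
    end = datetime.strptime(last_meal, fmt)
    start = datetime.strptime(first_meal, fmt) + timedelta(days=1)
    return (start - end).total_seconds() / 3600

# Dinner at 7pm, breakfast at 7am: the baseline overnight fast.
print(fasting_hours("19:00", "07:00"))  # 12.0
# Skip breakfast and eat at noon: the same dinner now yields a longer fast.
print(fasting_hours("19:00", "12:00"))  # 17.0
# OMAD: one meal at 6pm, the next at 6pm the following day.
print(fasting_hours("18:00", "18:00"))  # 24.0
```

The point is only that the fast is extended at either end: eating dinner earlier does the same work as skipping breakfast.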

Even on a vegan or vegetarian diet, one can do a ketogenic diet, which is another way of reducing inflammation and healing the gut. For this approach, I’d suggest reading Dr. Will Cole’s book Ketotarian; also helpful might be some other books such as Dena Harris’ The Paleo Vegetarian Diet and Mark Hyman’s Food: What the Heck Should I Eat?. Anytime carbs are low enough, including during fasts, the body will go into ketosis and eventually autophagy, the latter being how the body heals itself. Carbs, more than anything else, will knock you out of this healthy state, not that you want to be in it permanently.

Still, I wouldn’t recommend extreme plant-based diets, in particular not the typically high-carb veganism. Even done low-carb, with all its advantages, veganism will force you to eat more unhealthy foods like soy and to over-consume omega-6 fatty acids from nuts and seeds, one of the problems discussed in the fourth video. Some vegetarians and vegans will oddly make an exception for seafood; but if you don’t eat seafood at all, be sure to add an algal-source supplement of EPA and DHA, necessary omega-3 fatty acids that are also beneficial for inflammation and general health. If meat, including seafood, is entirely unacceptable, consider at least adding in certain kinds of animal foods, such as pasture-raised eggs and ghee.

If you still have health problems, consider the possibility of going zero-carb. Even a short meat fast might do wonders. As always, self-experimentation is the key. Put your health before dietary ideology. That is to say, don’t take my word for it, nor the word of others. Try it for yourself. If you want to do a comparison, try strict veganism for a period and then follow it with carnivore. And if you really want to emphasize the difference, make the vegan part of the experiment high-carb — and I don’t necessarily mean what are considered ‘unhealthy’ carbs, so eat plenty of whole wheat bread, rice, corn, and beans — that way you’ll also feel the difference that carbohydrates make. But if you don’t want to do carnivore for the other part of the experiment, at least try a ketogenic diet, which can be done with more plant-based foods, though consider reducing the most problematic plants, as Gundry explains.

Of course, you can simply jump right into carnivory and see what happens. Give it a few months or even a year, as it can take a while for your body to heal, not only in eliminating toxins. What do you have to lose?

* * *

I’ll add a personal note. I’ve long had an experimental attitude about life. But over the last year, I’ve been quite intentional in my self-experimenting. Mainly, I try something and then observe the results, not that I’m always systematic about it. Many of the changes I’ve experienced would be hard to miss, even when I’m not paying close attention.

Playing around with dietary parameters is what I’m still doing, and my experiments will likely go on for quite a while longer. After a few days of fermented vegetables, I felt fine and there were no symptoms. Then I tried a salad of raw vegetables (lettuce, green onions, and radishes) along with fermented vegetables. Now I notice that the inflammation in my wrist has flared up. I’ll take that as my body giving me feedback.

One of the best benefits of zero-carb was how inflammation had gone away. My wrists weren’t bothering me at all, and that is a big deal, as there has been irritation for years now with my job as a cashier and all the time I spend on the computer. Inflammation had gone down with low-carb, but it was still noticeable. There was further decrease with zero-carb, and I’d hate to lose those gains.

As I said, I’m being cautious. The benefits I’ve seen are not slight and are far from limited to joint issues, with what is going on in my wrists probably being related to the crackling in my knees I experienced earlier last decade, before reducing sugar. A much bigger deal is the neurocognitive angle, since mental health has been such a struggle for decades. Possible inflammation in my brain is a greater concern than inflammation in my wrists, not that the two can be separated, as an inflammatory state can affect any and all parts of the body. I take depression extremely seriously and am hyper-aware of shifts in mood and related aspects.

I’ll limit myself to fermented vegetables for the time being and see how that goes.

Having written that, I remembered one other possible offending food. The day before the salad I had a slice of oat bread. I had asked someone to make me some almond bread, as I explained to them, because of the paleo diet and they misunderstood. They apparently thought the paleo diet was only about wheat and so they got it in their head that oats would be fine. Because they made it for me, I decided to have a slice as I’m not a dietary Puritan.

So maybe it wasn’t the salad, after all. Still, I think I’ll keep to the fermented veggies for a while. And I’ll keep away from those grains. That was the first time I had any oats in a long time. I’ll have to try oats again sometime in the future to see if I have a similar response. But for now, I’m keeping my diet simple by keeping animal foods at the center of what I eat.

* * *

My own experience with diets makes me understand the attraction of the carnivore diet. It isn’t only the most effective diet for healing from inflammation and gut problems. It is also simple to do, it is highly satisfying with lots of fat and salt, and the results are dramatic and quick. You just eat until you’re no longer hungry.

Few other diets compare. The one exception is the ketogenic diet, which is unsurprising since zero-carb will obviously promote ketosis. Both of these diets have the advantage of simplicity. One quickly learns that all the struggle and suffering is unnecessary and undesirable. You eat until satiety and then stop. Overeating is almost impossible on carnivore, as the body returns to normal balance without all those carbs and sugar fucking up your metabolism and the hormonal signaling for hunger.

We live in a dominator society that is drenched in moralistic religion and this impacts everyone, even atheists and new agers. This shapes the stories we tell, including dieting narratives of gluttony and sin (read Gary Taubes). We are told dieting must be hard, that it is something enforced, not something we do naturally as part of a lifestyle. We are taught to mistrust our bodies and, as if we are disembodied ego-minds, that we must control the body and resist temptation… and when we inevitably fail, one might argue by design, we must punish ourselves and double down on self-denial. If it feels good, it must be bad. What bullshit!

The addictive mentality of diets high in carbs and sugar is part of a particular social order built on oppressive social control. Rather than an internal sense of satisfaction, control must come from outside, such that we become disconnected even from our own bodies. It is a sense of scarcity where one is always hungry, always worried about where the next meal will come from. And in order to control this addictive state, we are told we have to fight against our own bodies, as if we are at war with ourselves. We lose an intuitive sense of what is healthy, as everything around us promotes imbalance and disease.

But what if there could be another way? What if you could feel even better with carnivory or in ketogenic fasting than you ever felt before?

* * *

I’ve written before about low-carb, fasting, ketosis, and related dietary topics such as paleo and nutrient-density:

Ketogenic Diet and Neurocognitive Health; Fasting, Calorie Restriction, and Ketosis; Fasting and Feasting; The Agricultural Mind; Spartan Diet; Sailors’ Rations, a High-Carb Diet; Obese Military?; Low-Carb Diets On The Rise; Obesity Mindset; Malnourished Americans; Ancient Atherosclerosis?; Carcinogenic Grains; The Creed of Ancel Keys; Dietary Dictocrats of EAT-Lancet; Clearing Away the Rubbish; Damning Dietary Data; Paleo Diet, Traditional Foods, & General Health; and The Secret of Health.

This is the first post about the carnivore diet. Some of the other posts come close to it, though. In a couple of them, I discuss diets that were largely centered on animal foods, from the Mongols to the Spartans. It was specifically my reading about and experimenting with fasting and ketosis that opened my mind to considering the carnivore diet.

I bring this up because of another interesting historical example I just came across. Brad Lemley, a science journalist, is a LCHF practitioner and advocate. He writes that, “I’ve always been fascinated by Lewis and Clark’s expedition. What gave the 33 men and one dog the strength to traverse the wild nation? Nine pounds of meat per day per man”.

From the journal of Raymond Darwin Burroughs, there was a tally of the meat consumed on the expedition: “Deer (all species combined) 1,001; Elk 375; Bison 227; Antelope 62; Bighorn sheep 35; Bears, grizzly 43; Bears, black 23; Beaver (shot or trapped) 113; Otter 16; Geese and Brant 104; Grouse (all species) 46; Turkeys 9; Plovers 48; Wolves (only one eaten) 18; Indian dogs (purchased and consumed) 190; Horses 12” (The Natural History of the Lewis and Clark Expedition).

“This list does not include the countless smaller or more exotic animals that were captured and eaten by the Corps, such as hawk, coyote, fox, crow, eagle, gopher, muskrat, seal, whale blubber, turtle, mussels, crab, salmon, and trout” (Hunting on the Lewis and Clark Trail). “Additionally, 193 pounds of “portable soup” were ordered as an emergency ration when stores ran out and game was scarce or unavailable. The soup was produced by boiling a broth down to a gelatinous consistency, then further drying it until it was rendered quite hard and desiccated. Not exactly a favorite with the men of the Corps, it nonetheless saved them from near starvation on a number of occasions.”

That would be a damn healthy diet. Almost entirely hunted and wild-caught meat. They would have been eating head-to-tail with nothing going to waste: brains, intestines, organ meats, etc. They also would’ve been getting the bone marrow and bone broth. This would have provided every nutrient needed for not just surviving but thriving at high levels of health and vitality. Yet they also would have gone through periods of privation and hunger.

“Despite the apparent bounty of the ever-changing landscape and the generosity of local tribes, many were the nights when the crew of the Corps went to sleep hungry. Many were the days when shots went awry and missed their mark, or game remained hidden from sight. Relentless rain ruined drying meat, punishing heat spoiled perishable provisions, and clothing rotted right off the backs of the men.”

That means they also spent good portions of time fasting. So, there was plenty of ketosis and autophagy involved, further factors that promote health and energy. Taken together, this dietary lifestyle follows the traditional hunter-gatherer pattern of feasting and fasting. Some ancient agricultural societies such as the Spartans intentionally mimicked this intermittent fasting through the practice of one-meal-a-day, at least for young boys training for the life of a soldier.

Nina Teicholz has pointed out that a meat-heavy diet was common to early Americans, not only to those on expeditions into the Western wilderness, and because of seasonal changes fasting and its results would also have been common. The modern industrial style of the standard American diet (SAD) doesn’t only diverge from traditional hunter-gatherer diets but also from the traditional American diet.

* * *

Video 1

Video 2

Video 3

Video 4

Video 5

Video 6

Video 7

Video 8

* * *

Bonus Video!

This one particularly fits my own experience with mental health. The guy interviewed offers a compelling conversion story, in going from the standard American diet (SAD) to carnivore after decades of everything getting worse. His example shows how, as long as you’re still alive, it is never too late to regain some of your health and sometimes with a complete reversal.

* * *

Other videos:

Sailors’ Rations, a High-Carb Diet

In the 18th-century British Navy, “Soldiers and sailors typically got one pound of bread a day,” in the form of hard tack, a hard biscuit. That is according to James Townsend. On top of that, some days they were given peas and on other days a porridge called burgoo. Elsewhere, Townsend shares some info from a 1796 memoir of the period — the author having written that “every man and boy born on the books of any of his Majesty’s ships are allowed as following a pound of biscuit bread and a gallon of beer per day” (William Spavens, Memoirs of A Seafaring Life, p. 106). So, grains and more grains, in multiple forms, foods and beverages.

About burgoo, it is a “ground oatmeal boiled up,” as described by Townsend. “Now you wouldn’t necessarily eat that all by itself. Early on, you were given to go with that salt beef fat. So the slush that came to the top when you’re boiling all your salt beef or salt pork. You get all that fat that goes up on top — they would scrape that off, they keep that and give it to you to go with your burgoo. But later on they said maybe that cause scurvy so they let you have some molasses instead.”

They really didn’t understand scurvy at the time. Animal foods, especially fat, would have some vitamin C in them, whereas the oats and molasses had none. They made up for this deficiency later on by adding cabbage to the sailors’ diet, though not a great choice considering vegetables don’t store well on ships. I’d point out that the sailors weren’t necessarily short on vitamin C by the standards of a healthy traditional diet, as they got meat four days a week and even on the meat-free banyan-days they had some butter and cheese. That would have given them sufficient vitamin C for a low-carb diet, especially with seafood caught along the way.

A high-carb diet, however, is a whole other matter. The amount of carbs and sugar sailors ate daily was quite large. This came about with colonial trade that made grains cheap and widely available, along with sudden access to sugar from distant sugarcane plantations. Glucose competes with vitamin C for uptake, so a high-carb diet requires a higher intake of vitamin C for basic health, specifically to avoid scurvy. A low-carb diet, on the other hand, can avoid scurvy with very little vitamin C, since sufficient amounts are in animal foods. Also, a low-carb diet is less inflammatory, which further decreases the need for antioxidants like vitamin C.

This is why Inuit could eat few plants and immense amounts of meat and fat. They got more vitamin C on a regular basis from seal fat than they did from the meager plant foods they could gather in the short warm period of the far north. But with almost no carbohydrates in the traditional Inuit diet, the requirement for vitamin C was so low as to not be a problem. This is probably the same explanation for why Vikings and Polynesians could travel vast distances across the ocean without getting sick, as they were surely eating mostly fresh seafood and very little, if any, starchy foods.

Unlike protein and fat, carbohydrate is not an essential macronutrient. Yes, carbohydrates provide glucose that the body needs in limited amounts, but through gluconeogenesis protein can be turned into glucose on demand. So, a long sea voyage with zero carbs would never have been a problem.

Sailors in the colonial era ate all of those biscuits, porridge, and peas not because it offered any health value beyond mere survival but because it was cheap food. Those sailors weren’t being fed to have long, healthy lives as labor was cheap and no one cared about them. As soon as a sailor was no longer useful, he would no longer be employed in that profession and he’d find himself among the impoverished masses. For all the health problems of a sailor’s diet, it was better than the alternative of starvation or near starvation that so many others faced.

Grain consumption had been increasing in late feudalism, but peasants still maintained a wider variety in their diet through foods they could hunt or gather, not to mention some fresh meat, fat, eggs, and dairy from animals they raised. That all began to change with the enclosure movement. The end of feudal village life and the loss of the peasants’ commons was not a pretty picture and did not lead to happy results, as the landless peasants evicted from their homes flooded into the cities, where most of them died. The economic desperation made for much cheap labor. Naval sailors with their guaranteed rations, in spite of nutritional deficiencies, were comparatively lucky.

On Salt: Sodium, Trace Minerals, and Electrolytes

There has been a lot of debate about salt lately. The mainstream view originated from little actual scientific evidence and was never well-supported. But research since then has been mixed.

The debate isn’t limited to disagreement between mainstream and alternative thinkers. Paleo advocates such as Dr. Loren Cordain (considered the founder of the paleo diet) continue to recommend lower salt intake. Still, an increasing number of scientists and physicians have come out in favor of the benefits of salt: Dr. Barbara Hendel, Dr. F. Batmanghelidj, Dr. Esteban Genoa, Dr. Eric Westman, Dr. Jeff S. Volek, Dr. Stephen D. Phinney, and Dr. James DiNicolantonio. Many of these experts argue that increased salt is particularly important on a low-carb diet, and even more so with high protein. This relates to issues transitioning into ketosis, what is referred to as keto flu. Basically, the electrolytes temporarily get out of balance while one is adapting to ketosis. Yet Sally Fallon Morell states that it is a plant-based diet that requires more salt, to increase HCL in the stomach for digestion.

All of this was brought to my attention because of Dr. DiNicolantonio’s book The Salt Fix, which came out recently. His simplest advice is to salt to taste, since your body (presumably under normal conditions) should know how much salt it needs. He argues that salt isn’t addictive like sugar. So, according to this view, salt cravings can be safely treated as a genuine need for salt. I haven’t read The Salt Fix, but I have skimmed a bit of one of his other books, Superfuel. In that book, he states that salt, besides maintaining healthy blood pressure, helps maintain insulin sensitivity. Also, salt goes back to the fat issue — more from the book:

“Diets very low in sodium (salt) increase adrenaline and aldosterone, and these hormones reduce activity of D6D and D5D. For this reason, low-salt diets increase the need for EPA and DHA due to the reduced desaturase enzyme activities. Another extremely common hormonal issue these days, one that interferes with conversion of the parent omega-6 and omega-3 fats into their derivatives, is hypothyroidism. Thyroid hormone is required for proper activity of D6D and D5D, so individuals with suboptimal thyroid hormone levels may benefit from consuming more EPA and DHA or taking good-quality supplements.”

There are a number of issues with sodium, potassium, and magnesium in relation to insulin, adrenaline, and aldosterone. Shifting the diet can affect any or all of these. The problem is that most research has been limited to people on the standard American diet. We know very little, if anything at all, about salt intake or electrolyte supplementation on other diets. That forces people into experimentation. Anything true of high-carb diets may or may not apply to low-carb diets. Nor do we know that the same will be true across moderately low-carb diets, extremely low-carb diets, zero-carb diets, etc. Then there are other factors such as fasting, ketosis, and autophagy that alter the body’s functioning. It’s possible that, with strict enough carb restriction, the need for electrolytes and trace minerals decreases, as is the case with vitamin C. Sounds like a great hypothesis to be tested.

Then there is the issue of what actually helps vs what might harm you. What are the potential risks and benefits of getting too few electrolytes and trace minerals vs higher levels? I’m not sure self-experimentation can exactly figure this out, although maybe some have strong enough responses to salt or its lack that they can figure out what works for them. My own experimentation hasn’t indicated anything particular, either positive or negative.

Like anyone else, I enjoy the taste of salt. But unlike sugar, I’ve never craved salt in the addictive sense (and I know what addiction feels like). According to some of what I’ve read, the danger seems to be specifically with refined salt, as is the case with so much else that is refined. Refined salt doesn’t give your body what it needs and so throws off the balance, disallowing healthy processing of glucose, and according to this explanation that is why refined salt disposes you to sugar cravings. I remember reading about this sugar-and-salt craving cycle back in the 1990s, but apparently it only applies to refined salt, if I’m understanding the research correctly. It just so happens that processed food manufacturers love to combine refined carbs and sugar with refined salt, where taste has become disconnected from actual nutrient content because almost all nutrients have been stripped away. They also throw in addictive wheat and dairy for good measure.

I noticed that Dr. James DiNicolantonio says to worry less about sodium and instead focus on potassium. But he emphasizes natural sources of potassium. His point is that salt simply makes high-potassium foods more palatable, foods that otherwise would be more bitter. He points out that both animal and plant foods can have greater amounts of potassium: fish, shellfish, greens, beans, potatoes, and tomatoes. The significance of the salt is that, once potassium hits a threshold, the sodium supposedly will balance it out. Seafood is particularly high in these micronutrients, along with much else that is healthy (e.g., EPA and DHA). Many healthy populations have lived near the ocean, as observed by Weston A. Price and others. Some argue that seafood shaped human evolution, the aquatic ape theory.

“Most animals with a sodium deficiency display an active craving for salt which, when satisfied, disappears. In humans, salt intake has little or no relation to the body’s needs. Some Inuit tribes avoid salt almost completely, while people in the Western world consume 15-20 times the amount needed for health. In other words, a single African species (assuming humans have an African origin) possesses a wildly different scheme of salt management. Humans are also the only primates to regulate body temperature by sweat-cooling, a system profligate in the use of sodium. Proponents of the Aquatic Ape Hypothesis believe that sweat-cooling could not have developed anywhere except near the sea where diets contain considerable salt, in fact much more salt than the body requires.” (William R. Corliss, Our aquatic phase!)

It’s an interesting theory to explain the unusual aspects of salt in the human species and why there are so many differences even across traditional societies. Whether or not the aquatic ape theory is correct, it’s certain that the foods in the standard American diet are far different in numerous ways, likely including nutrient content of magnesium and potassium. It would be useful to measure the levels of micronutrients in a healthy hunter-gatherer diet, not only from salt but from food sources as well. Besides seafood and certain plants, especially seaweed (Birgitte Svennevig, Did seaweed make us who we are today?), many have noted that it is a common practice among hunter-gatherers to consume blood along with organ meats and interstitial fluid, all of which are high in salt.

I wonder if this is something we overthink because dietary experts came to obsess over it, as a convenient scapegoat (as they scapegoated saturated fat). The whole debate has become polarized, those arguing for low-salt vs those for high-salt. But other factors might be more important. Besides the problems of a high-carb diet, maybe salt levels aren’t that big of an issue. Assuming there aren’t specific health conditions, most people might be perfectly safe to salt to taste or largely ignore salt if they prefer. Potassium and magnesium seem a bit different, though. Those mostly come from foods, not salt. I don’t know of research that compares people who eat foods high in these micronutrients and those who don’t. It’s another one of those confounders with the standard American diet. And even a zero-carb dieter can eat foods that are either high or low in these micronutrients. For those not using salt, it would be useful info to know which foods they eat and their micronutrient profile.

My conclusion is simply that salt tastes good and, until better science comes along to tell me otherwise, I’ll salt to taste. I’m definitely a fan of the philosophy of listening to one’s body. I self-experiment and find out what works. In my experience, there is a big difference between craving sugar and enjoying salt. One is clearly an addiction and the other not, at least in my case. I was reminded of this just moments ago. I got a glass of water. Since it was on my mind, I sprinkled some sea salt in it and a few drops of electrolytes, along with a capful of apple cider vinegar as I’m wont to do. I quickly downed it and realized how refreshing it was. Earlier this morning I had a glass of water without anything in it and it wasn’t nearly as thirst-quenching. I’m not sure why that is. Something about water with salt and trace minerals in it is so satisfying. I suppose that is why many people love Gatorade and similar drinks. They go down so easily, even though the other ingredients are horrible for your health.

My advice is this. Enjoy salt. It tastes good and makes food more satisfying. Certain trace minerals are necessary for life and health, although only small amounts are naturally found in salt. As for potential downsides, there is yet no clear evidence and no consensus. So, do as many others do, find out what works for you.

* * *

There is a secondary issue or rather some related secondary issues. Angela A. Stanton advises against consuming rock salts that have to be mined, such as Himalayan pink salt and Real Salt (The Truth About Himalayan Salt). She gives two main reasons. First, there might be impurities, including radioactive elements and heavy metal toxins such as lead, although she mentions there are also impurities in sea salt as well. The other problem is that these natural sources of salt lack iodine, an important nutrient. So, for both reasons, she recommends a refined salt that has been purified and iodized.

Her second point is the strongest. Iodine is, without a doubt, an essential nutrient and a deficiency is serious. I’m not sure how likely deficiencies are these days for those eating an otherwise wholesome diet, but it is something to keep in mind. Of course, you could solve this problem by occasionally sprinkling some seaweed on your food, a great natural source of iodine. Her fear about impurities, though, is less substantiated, because the amount of impurities is so small as to be insignificant. If we are to be paranoid, impurities are almost everywhere and in almost everything — the air you breathe and the water you drink, the food you eat and supplements you take. The human body evolved to handle such minuscule exposures.

If you have health concerns with iodine deficiency, then go ahead with iodized salt. But otherwise, it probably doesn’t matter too much which kind of salt you use, as long as it comes from a high-quality source. If you want to learn more about the issue of contaminants, David Asprey has directly responded to Stanton (Is Pink Himalayan Salt Toxic?) and so has Jeremy Orozco (Is Pink Himalayan Salt Toxic? Radioactive?). There are others as well who respond to the topic more generally. There were some responses to a Quora inquiry: Is the amount of plutonium in pink Himalayan salt dangerous? (less than .001 ppm). Also, in the comments section of a piece by Harriet Hall (Pink Himalayan Sea Salt: An Update), there were useful responses:

Jeff Mink • 2 years ago
“In case it wasn’t clear from the article, Himalayan sea salt does not contain “84 trace elements”. If you follow the link to the spectral analysis, it simply lists all non-noble gas elements in the periodic table. If the concentration is “< X ppm”, that means that none of that element was detected. That leaves it with a total of 29 elements (NOT MINERALS!) detected, assuming I counted correctly. In fact, they didn’t even test for technetium and promethium, since there’s no chance (according to our modern scientific theories) that those could be in there. None of the elements that are actually contained in the salt are radioactive (at least not that I saw), but thallium and lead are definitely not good for the human body. Of course, at the concentrations listed, you’d probably succumb to sodium poisoning long before you got a harmful dose of heavy metals.”

Mathew Taylor • 2 years ago
“There are two main parts to this article: 1) Pink Salt does not provide any health benefits, or they are overstated grossly. I concede this point.

“However, the 2nd part, that pink salt is HARMFUL appears to be wrong. You state that it is full of poisons / contaminants. Lets look at a few of them;

“Arsenic – <0.01 ppm – There is more arsenic in some foods than this. In fact, local authorities limit arsenic concentrations in some seafood to 2mg/kg, thats 2ppm, orders of magnitude more than in pink salt and in something you would consume an order of magnitude more of.

“Mercury – .03 ppm in pink salt. Contrast that with Tuna, where levels are at least TEN TIMES higher, and the volume you would consume in a serving is many orders of magnitude higher.

“Lead – .1 ppm – There is lead in a variety of foods, but usually lower concentrations than this. Remember that salt is not used in massive quantities, unlike vegetables. The target for blood lead levels is less than 10 mcg/dl, or approx 500 mcg total. To get that much lead from pink salt, you’d have to consume 5 kilograms of the stuff. Good luck with that.

“Uranium – <0.001 ppm – Lots of food has uranium in it. Mushrooms can have over 100 μg U/kg (Dry mass).

“So don’t use it if you don’t want to, but don’t make out like this stuff is bad for you, it is, after all, 97.3% table salt.”
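The commenter’s lead arithmetic can be sanity-checked in a few lines. This is only a sketch of the back-of-envelope calculation; the 0.1 ppm concentration and the rough 500 µg whole-body figure are taken directly from the comment above, not independently verified:

```python
# Check the quoted claim: at 0.1 ppm lead, how much salt yields ~500 micrograms?
# 1 ppm by mass = 1 mg of contaminant per kg of salt.
lead_ppm = 0.1                        # lead concentration in pink salt, per the comment
lead_ug_per_kg = lead_ppm * 1000.0    # 0.1 mg/kg = 100 micrograms per kg of salt

target_total_ug = 500.0               # the comment's rough total-body figure, in micrograms
salt_needed_kg = target_total_ug / lead_ug_per_kg

print(salt_needed_kg)                 # 5.0 kg of salt, matching the comment's "5 kilograms"
```

The same unit conversion (ppm as mg/kg) applies to the arsenic, mercury, and uranium figures quoted earlier in the comment.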

* * *

If you want further info about salt, here is a somewhat random collection of articles and videos, all of them bringing new perspectives based on the latest research:

The Salt of the Earth

Salt: friend or foe?

Why Salt Is Good For You, But Some Salt is Better Than Others

Dr. James DiNicolantonio On Sodium-Potassium Balance

The Potassium Myth

The Importance of Potassium and Sodium for Fertility Health

On Keto Flu and Electrolyte Imbalances

Leveraging basic physiology to prevent ‘keto-flu,’ ‘Atkins-flu,’ and ‘adrenal fatigue.’

How much sodium, potassium and magnesium should I have on a ketogenic diet?


The Crisis of Identity

“Have we lived too fast?”
~Dr. Silas Weir Mitchell, 1871
Wear and Tear, or Hints for the Overworked

I’ve been following Scott Preston over at his blog, Chrysalis. He has been writing on the same set of issues for a long time now, longer than I’ve been reading his blog. He reads widely and so draws on many sources, most of which I’m not familiar with, part of the reason I appreciate the work he does to pull together such informed pieces. A recent post, A Brief History of Our Disintegration, would give you a good sense of his intellectual project, although the word ‘intellectual’ sounds rather paltry for what he is describing:

“Around the end of the 19th century (called the fin de siecle period), something uncanny began to emerge in the functioning of the modern mind, also called the “perspectival” or “the mental-rational structure of consciousness” (Jean Gebser). As usual, it first became evident in the arts — a portent of things to come, but most especially as a disintegration of the personality and character structure of Modern Man and mental-rational consciousness.”

That time period has been an interest of mine as well. There are two books that come to mind that I’ve mentioned before: Tom Lutz’s American Nervousness, 1903 and Jackson Lears’s Rebirth of a Nation (for a discussion of the latter, see: Juvenile Delinquents and Emasculated Males). Both talk about that turn-of-the-century crisis, the psychological projections and physical manifestations, the social movements and political actions. A major concern was neurasthenia which, according to the dominant economic paradigm, meant a deficit of ‘nervous energy’ or ‘nerve force’; the reserves, if not reinvested wisely but instead wasted, would lead to physical and psychological bankruptcy, and so one became spent (the term ‘neurasthenia’ was first used in 1829 and popularized by George Miller Beard in 1869, the same period when the related medical condition of ‘nostalgia’ began being diagnosed).

This was mixed up with sexuality in what Theodore Dreiser called the ‘spermatic economy’ (by the way, the catalogue for Sears, Roebuck and Company offered an electrical device to replenish nerve force that came with a genital attachment). Obsession with sexuality was used to reinforce gender roles in how neurasthenic patients were treated, following the practice of Dr. Silas Weir Mitchell: men were recommended to become more active (the ‘West cure’) and women more passive (the ‘rest cure’), although some women “used neurasthenia to challenge the status quo, rather than enforce it. They argued that traditional gender roles were causing women’s neurasthenia, and that housework was wasting their nervous energy. If they were allowed to do more useful work, they said, they’d be reinvesting and replenishing their energies, much as men were thought to do out in the wilderness” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). That feminist-style argument, as I recall, came up in advertisements for Bernarr Macfadden’s fitness protocol in the early 1900s, encouraging (presumably middle class) women to give up housework for exercise and so regain their vitality. Macfadden was also an advocate of living a fully sensuous life, going as far as free love.

Besides the gender wars, there was the ever-present bourgeois bigotry. Neurasthenia is the most civilized of the diseases of civilization since, in its original American conception, it was perceived as only afflicting middle-to-upper class whites, especially WASPs — as Lutz says, “if you were lower class, and you weren’t educated and you weren’t Anglo Saxon, you wouldn’t get neurasthenic because you just didn’t have what it took to be damaged by modernity” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast) and so, according to Lutz’s book, people would make “claims to sickness as claims to privilege.” It was considered a sign of progress, but over time it came to be seen by some as the greatest threat to civilization, in either case offering much material for popular fictionalized portrayals. Being sick in this fashion was proof that one was a modern individual, an exemplar of advanced civilization, if coming at immense cost — Julie Beck explains:

“The nature of this sickness was vague and all-encompassing. In his book Neurasthenic Nation, David Schuster, an associate professor of history at Indiana University-Purdue University Fort Wayne, outlines some of the possible symptoms of neurasthenia: headaches, muscle pain, weight loss, irritability, anxiety, impotence, depression, “a lack of ambition,” and both insomnia and lethargy. It was a bit of a grab bag of a diagnosis, a catch-all for nearly any kind of discomfort or unhappiness.

“This vagueness meant that the diagnosis was likely given to people suffering from a variety of mental and physical illnesses, as well as some people with no clinical conditions by modern standards, who were just dissatisfied or full of ennui. “It was really largely a quality-of-life issue,” Schuster says. “If you were feeling good and healthy, you were not neurasthenic, but if for some reason you were feeling run down, then you were neurasthenic.””

I’d point out how neurasthenia was seen as primarily caused by intellectual activity, as it became a descriptor of a common experience among the burgeoning middle class of often well-educated professionals and office workers. This relates to Weston A. Price’s work in the 1930s, as modern dietary changes first hit this demographic since they had the means to afford eating a fully industrialized Standard American Diet (SAD), long before others (within decades, though, SAD-caused malnourishment would wreck the health at all levels of society). What this meant, in particular, was a diet high in processed carbs and sugar that coincided, because of Upton Sinclair’s 1906 muckraking of the meat-packing industry in The Jungle, with the early-1900s decreased consumption of meat and saturated fats. As Price demonstrated, this was a vast change from the traditional diet found all over the world, including in rural Europe (and presumably in rural America, with most Americans not urbanized until the turn of last century), that always included significant amounts of nutritious animal foods loaded up with fat-soluble vitamins, not to mention lots of healthy fats and cholesterol.

Prior to talk of neurasthenia, the exhaustion model of health portrayed as waste and depletion took hold in Europe centuries earlier (e.g., anti-masturbation panics) and had its roots in the humoral theory of bodily fluids. It has long been understood that food, specifically the macronutrients (carbohydrate, protein, and fat), affects mood and behavior — see the early literature on melancholy. During feudalism, food laws were used as a means of social control, such that in one case meat was prohibited prior to Carnival because of its energizing effect, which it was thought could lead to rowdiness or even revolt (Ken Albala & Trudy Eden, Food and Faith in Christian Culture).

There does seem to be a connection between an increase of intellectual activity with an increase of carbohydrates and sugar, this connection first appearing during the early colonial era that set the stage for the Enlightenment. It was the agricultural mind taken to a whole new level. Indeed, a steady flow of glucose is one way to fuel extended periods of brain work, such as reading and writing for hours on end and late into the night — the reason college students to this day will down sugary drinks while studying. Because of trade networks, Enlightenment thinkers were buzzing on the suddenly much more available simple carbs and sugar, with an added boost from caffeine and nicotine. The modern intellectual mind was drugged-up right from the beginning, and over time it took its toll. Such dietary highs inevitably lead to ever greater crashes of mood and health. Interestingly, Dr. Silas Weir Mitchell who advocated the ‘rest cure’ and ‘West cure’ in treating neurasthenia and other ailments additionally used a “meat-rich diet” for his patients (Ann Stiles, Go rest, young man). Other doctors of that era were even more direct in using specifically low-carb diets for various health conditions, often for obesity which was also a focus of Dr. Mitchell.

Still, it goes far beyond diet. There has been a diversity of stressors that have continued to amass over the centuries of tumultuous change. The exhaustion of modern man (and typically the focus has been on men) had been building up for generations upon generations before it came to feel like a world-shaking crisis with the new industrialized world. The lens of neurasthenia was an attempt to grapple with what had changed, but the focus was too narrow. With the plague of neurasthenia, the atomization of commercialized man and woman couldn’t hold together. And so there was a temptation toward nationalistic projects, including wars, to revitalize the ailing soul and to suture the gash of social division and disarray. But this further wrenched out of alignment the traditional order that had once held society together, and what was lost mostly went without recognition. The individual was brought into the foreground of public thought, a lone protagonist in a social Darwinian world. In this melodramatic narrative of struggle and self-assertion, many individuals didn’t fare so well and everything else suffered in the wake.

Tom Lutz writes that, “By 1903, neurasthenic language and representations of neurasthenia were everywhere: in magazine articles, fiction, poetry, medical journals and books, in scholarly journals and newspaper articles, in political rhetoric and religious discourse, and in advertisements for spas, cures, nostrums, and myriad other products in newspapers, magazines and mail-order catalogs” (American Nervousness, 1903, p. 2).

There was a sense of moral decline that was hard to grasp, although some people like Weston A. Price tried to dig down into concrete explanations of what had so gone wrong, the social and psychological changes observable during mass urbanization and industrialization. He was far from alone in his inquiries, having built on the prior observations of doctors, anthropologists, and missionaries. Other doctors and scientists were looking into the influences of diet in the mid-1800s and, by the 1880s, scientists were exploring a variety of biological theories. Their inability to pinpoint the cause maybe had more to do with their lack of a needed framework, as they touched upon numerous facets of biological functioning:

“Not surprisingly, laboratory experiments designed to uncover physiological changes in the nerve cell were inconclusive. European research on neurasthenics reported such findings as loss of elasticity of blood vessels, thickening of the cell wall, changes in the shape of nerve cells, or nerve cells that never advanced beyond an embryonic state. Another theory held that an overtaxed organism cannot keep up with metabolic requirements, leading to inadequate cell nutrition and waste excretion. The weakened cells cannot develop properly, while the resulting build-up of waste products effectively poisons the cells (so-called “autointoxication”). This theory was especially attractive because it seemed to explain the extreme diversity of neurasthenic symptoms: weakened or poisoned cells might affect the functioning of any organ in the body. Furthermore, “autointoxicants” could have a stimulatory effect, helping to account for the increased sensitivity and overexcitability characteristic of neurasthenics.” (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia)

This early scientific research could not lessen the mercurial sense of unease, as neurasthenia was from its inception a broad category that captured some greater shift in public mood, even as it so powerfully shaped the individual’s health. For all the effort, there were as many theories about neurasthenia as there were symptoms. Deeper insight was required. “[I]f a human being is a multiformity of mind, body, soul, and spirit,” writes Preston, “you don’t achieve wholeness or fulfillment by amputating or suppressing one or more of these aspects, but only by an effective integration of the four aspects.” But integration is easier said than done.

The modern human hasn’t been suffering from mere psychic wear and tear for the individual body itself has been showing the signs of sickness, as the diseases of civilization have become harder and harder to ignore. On a societal level of human health, I’ve previously shared passages from Lears (see here) — he discusses the vitalist impulse that was the response to the turmoil, and vitalism often was explored in terms of physical health as the most apparent manifestation, although social and spiritual health were just as often spoken of in the same breath. The whole person was under assault by an accumulation of stressors and the increasingly isolated individual didn’t have the resources to fight them off.

By the way, this was far from being limited to America. Europeans picked up the discussion of neurasthenia and took it in other directions, often with less optimism about progress, but also some thinkers emphasizing social interpretations with specific blame on hyper-individualism (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia). Thoughts on neurasthenia became mixed up with earlier speculations on nostalgia and romanticized notions of rural life. More important, Russian thinkers in particular understood that the problems of modernity weren’t limited to the upper classes, instead extending across entire populations, as a result of how societies had been turned on their heads during that fractious century of revolutions.

In looking around, I came across some other interesting stuff. From 1901 Nervous and Mental Diseases by Archibald Church and Frederick Peterson, the authors in the chapter on “Mental Disease” are keen to further the description, categorization, and labeling of ‘insanity’. And I noted their concern with physiological asymmetry, something shared later with Price, among many others going back to the prior century.

Maybe asymmetry was not only indicative of developmental issues but also symbolic of a deeper imbalance. The attempts at phrenological analysis of psychiatric, criminal, and anti-social behavior were off-base; and, despite the bigotry and proto-genetic determinism among racists using these kinds of ideas, there is a simple truth about health in relationship to physiological development, most easily observed in bone structure. It would take many generations to understand the deeper scientific causes: nutrition (e.g., Price’s discovery of what he called Activator X, later identified as vitamin K2), along with parasites, toxins, and epigenetics. Church and Peterson did acknowledge that this went beyond mere individual or even familial issues: “It is probable that the intemperate use of alcohol and drugs, the spreading of syphilis, and the overstimulation in many directions of modern civilization have determined an increase difficult to estimate, but nevertheless palpable, of insanity in the present century as compared with past centuries.”

Also, there is the 1902 The Journal of Nervous and Mental Disease: Volume 29 edited by William G. Spiller. There is much discussion in there about how anxiety was observed, diagnosed, and treated at the time. Some of the case studies make for a fascinating read — check out: “Report of a Case of Epilepsy Presenting as Symptoms Night Terrors, Impellant Ideas, Complicated Automatisms, with Subsequent Development of Convulsive Motor Seizures and Psychical Aberration” by W. K. Walker. This reminds me of the case that influenced Sigmund Freud and Carl Jung, Daniel Paul Schreber’s 1903 Memoirs of My Nervous Illness.

Talk about “a disintegration of the personality and character structure of Modern Man and mental-rational consciousness,” as Scott Preston put it. He goes on to say that, “The individual is not a natural thing. There is an incoherency in Margaret Thatcher’s view of things when she infamously declared “there is no such thing as society” — that she saw only individuals and families, that is to say, atoms and molecules.” Her saying that really did capture the mood of the society she denied existing. Even the family was shrunk down to the ‘nuclear’. To state there is no society is to declare that there is also no extended family, no kinship, no community, that there is no larger human reality of any kind. Ironically, in this pseudo-libertarian sentiment, there is nothing holding the family together other than government laws imposing strict control of marriage and parenting, where common finances lock two individuals together under the rule of capitalist realism (the only larger realities involved are inhuman systems) — compared to high-trust societies such as the Nordic countries, where the definition and practice of family life is less legalistic (Nordic Theory of Love and Individualism).

The individual consumer-citizen as a legal member of a family unit has to be created and then controlled, as it is a rather unstable atomized identity. “The idea of the “individual”,” Preston says, “has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” That is partly the reason for the heavy focus on the body, an attempt to make concrete the individual in order to hold together the splintered self — great analysis of this can be found in Lewis Hyde’s Trickster Makes This World: “an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap. Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman)” (see one of my posts about it: Lock Without a Key). Along with increasing authoritarianism, there was increasing medicalization and rationalization — to try to make sense of what was senseless.

A specific example of a change can be found in Dr. Frederick Hollick (1818-1900), who was a popular writer and speaker on medicine and health — his “links were to the free-thinking tradition, not to Christianity” (Helen Lefkowitz Horowitz, Rewriting Sex). With the influence of Mesmerism and animal magnetism, he studied and wrote about what was variously called, in more scientific-sounding terms, electrotherapeutics, galvanism, and electro-galvanism. Hollick was an English follower of Robert Dale Owen and his father, the Welsh-born industrialist and socialist Robert Owen, whom he followed to the United States, where the elder Owen had started the utopian community New Harmony, a Southern Indiana village bought from the utopian German Harmonists and then filled with brilliant and innovative minds but lacking in practical know-how about running a self-sustaining community (Abraham Lincoln, who later became a friend to the Owen family, recalled as a boy seeing the boat full of books heading to New Harmony).

“As had Owen before him, Hollick argued for the positive value of sexual feeling. Not only was it neither immoral nor injurious, it was the basis for morality and society. […] In many ways, Hollick was a sexual enthusiast” (Horowitz). These were the social circles of Abraham Lincoln, as he personally knew free-love advocates; that is why early Republicans were often referred to as “Red Republicans”, the ‘Red’ indicating radicalism as it still does to this day. Hollick wasn’t the first to be a sexual advocate nor, of course, would he be the last — preceding him were Sarah Grimke (1837, Equality of the Sexes) and Charles Knowlton (1839, The Private Companion of Young Married People), Hollick having been “a student of Knowlton’s work” (Debran Rowland, The Boundaries of Her Body); and following him were two more well-known figures, the previously mentioned Bernarr Macfadden (1868-1955), who was the first major health and fitness guru, and Wilhelm Reich (1897–1957), who was the less respectable member of the trinity formed with Sigmund Freud and Carl Jung. Sexuality became a symbolic issue of politics and health, partly because of increasing scientific knowledge but also because of the increasing marketization of products such as birth control (with public discussion of contraceptives happening in the late 1700s and advances in contraceptive production in the early 1800s), the latter being quite significant as it meant individuals could control pregnancy, which is particularly relevant to women. It should be noted that Hollick promoted the ideal of female sexual autonomy, that sex should be assented to and enjoyed by both partners.

This growing concern with sexuality began with the growing middle class in the decades following the American Revolution. Among much else, it was related to the post-revolutionary focus on parenting and the perceived need for raising republican citizens — this formed an audience far beyond radical libertinism and free love. Expert advice was needed for the new bourgeois family life, as part of the “civilizing process” that increasingly took hold at that time with not only sexual manuals but also parenting guides, health pamphlets, books of manners, cookbooks, diet books, etc. — cut off from the roots of traditional community and kinship, the modern individual no longer trusted inherited wisdom and so needed to be taught how to live, how to behave and relate. Along with the rise of science, this situation promoted the role of the public intellectual that Hollick effectively took advantage of and, after the failure of Owen’s utopian experiment, he went on the lecture circuit, which brought on legal cases in the unsuccessful attempt to silence him, the kind of persecution that Reich also later endured.

To put it in perspective, this Antebellum era of public debate and public education on sexuality coincided with other changes. Following revolutionary-era feminism (e.g., Mary Wollstonecraft), the “First Wave” of organized feminists emerged generations later with the Seneca Falls convention in 1848 and, in that movement, there was a strong abolitionist impulse. This was part of the rise of ideological -isms in the North that so concerned the Southern aristocrats who wanted to maintain their hierarchical control of the entire country, the control they were quickly losing with the shift of power in the Federal government. A few years before that, in 1844, a more effective condom was developed using vulcanized rubber, although condoms had been on the market since the previous decade; also in the 1840s, the vaginal sponge became available. Interestingly, many feminists were as against the contraceptives as they were against abortions. These were far from being mere practical issues, as politics imbued every aspect, and some feminists worried about how this might lessen the role of women and motherhood in society if sexuality was divorced from pregnancy.

This was at a time when the abortion rate was sky-rocketing, indicating most women held other views. “Yet we also know that thousands of women were attending lectures in these years, lectures dealing, in part, with fertility control. And rates of abortion were escalating rapidly, especially, according to historian James Mohr, the rate for married women. Mohr estimates that in the period 1800-1830, perhaps one out of every twenty-five to thirty pregnancies was aborted. Between 1850 and 1860, he estimates, the ratio may have been one out of every five or six pregnancies. At mid-century, more than two hundred full-time abortionists reportedly worked in New York City” (Rickie Solinger, Pregnancy and Power, p. 61). In the unGodly and unChurched period of early America (“We forgot.”), organized religion was weak and “premarital sex was typical, many marriages following after pregnancy, but some people simply lived in sin. Single parents and ‘bastards’ were common” (A Vast Experiment). Early Americans, by today’s standards, were not good Christians — visiting Europeans often saw them as uncouth heathens and quite dangerous at that, given such common American practices as toting around guns and knives, ever ready for a fight, whereas carrying weapons had been made illegal in England. In The Churching of America, Roger Finke and Rodney Stark write (pp. 25-26):

“Americans are burdened with more nostalgic illusions about the colonial era than about any other period in their history. Our conceptions of the time are dominated by a few powerful illustrations of Pilgrim scenes that most people over forty stared at year after year on classroom walls: the baptism of Pocahontas, the Pilgrims walking through the woods to church, and the first Thanksgiving. Had these classroom walls also been graced with colonial scenes of drunken revelry and barroom brawling, of women in risque ball-gowns, of gamblers and rakes, a better balance might have been struck. For the fact is that there never were all that many Puritans, even in New England, and non-Puritan behavior abounded. From 1761 through 1800 a third (33.7%) of all first births in New England occurred after less than nine months of marriage (D. S. Smith, 1985), despite harsh laws against fornication. Granted, some of these early births were simply premature and do not necessarily show that premarital intercourse had occurred, but offsetting this is the likelihood that not all women who engaged in premarital intercourse would have become pregnant. In any case, single women in New England during the colonial period were more likely to be sexually active than to belong to a church-in 1776 only about one out of five New Englanders had a religious affiliation. The lack of affiliation does not necessarily mean that most were irreligious (although some clearly were), but it does mean that their faith lacked public expression and organized influence.”

Though marriage remained important as an ideal in American culture, what changed was that procreative control became increasingly available — with fewer accidental pregnancies and more abortions, a powerful motivation for marriage disappeared. Unsurprisingly, at the same time, there were increasing worries about the breakdown of community and family, concerns that would turn into moral panic at various points. Antebellum America was in turmoil. This was concretely exemplified by the dropping birth rate that was already noticeable by mid-century (Timothy Crumrin, “Her Daily Concern:” Women’s Health Issues in Early 19th-Century Indiana) and was nearly halved from 1800 to 1900 (Debran Rowland, The Boundaries of Her Body). “The late 19th century and early 20th saw a huge increase in the country’s population (nearly 200 percent between 1860 and 1910) mostly due to immigration, and that population was becoming ever more urban as people moved to cities to seek their fortunes—including women, more of whom were getting college educations and jobs outside the home” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). It was a period of crisis, not all that different from our present crisis, including the fear about the low birth rate of native-born white Americans, especially the endangered species of WASPs, being overtaken by the supposed dirty hordes of blacks, ethnics, and immigrants.

The promotion of birth control was considered a genuine threat to American society, maybe to all of Western Civilization. It was most directly a threat to traditional gender roles. Women could better control when they got pregnant, a decisive factor in the phenomenon of larger numbers of women entering college and the workforce. And with an epidemic of neurasthenia, this dilemma was worsened by the crippling effeminacy that neutered masculine potency. Was modern man, specifically the white ruling elite, up for the task of carrying on Western Civilization?

“Indeed, civilization’s demands on men’s nerve force had left their bodies positively effeminate. According to Beard, neurasthenics had the organization of “women more than men.” They possessed ” a muscular system comparatively small and feeble.” Their dainty frames and feeble musculature lacked the masculine vigor and nervous reserves of even their most recent forefathers. “It is much less than a century ago, that a man who could not [drink] many bottles of wine was thought of as effeminate—but a fraction of a man.” No more. With their dwindling reserves of nerve force, civilized men were becoming increasingly susceptible to the weakest stimulants until now, “like babes, we find no safe retreat, save in chocolate and milk and water.” Sex was as debilitating as alcohol for neurasthenics. For most men, sex in moderation was a tonic. Yet civilized neurasthenics could become ill if they attempted intercourse even once every three months. As Beard put it, “there is not force enough left in them to reproduce the species or go through the process of reproducing the species.” Lacking even the force “to reproduce the species,” their manhood was clearly in jeopardy.” (Gail Bederman, Manliness and Civilization, pp. 87-88)

This led to a backlash that began before the Civil War with the early obscenity laws and abortion laws, but went into high gear with the 1873 Comstock laws that effectively shut down the free market of both ideas and products related to sexuality, including sex toys. This made it near impossible for most women to learn about birth control or obtain contraceptives and abortifacients. There was a felt need to restore order and that meant white male order of the WASP middle-to-upper classes, especially with the end of slavery, mass immigration of ethnics, urbanization and industrialization. The crisis wasn’t only ideological or political. The entire world had been falling apart for centuries with the ending of feudalism and the ancien regime, the last remnants of it in America being maintained through slavery. Motherhood being the backbone of civilization, it was believed that women’s sexuality had to be controlled and, unlike so much else that was out of control, it actually could be controlled through enforcement of laws.

Outlawing abortions is a particularly interesting example of social control. Even with laws in place, abortions remained commonly practiced by local doctors, even in many rural areas (American Christianity: History, Politics, & Social Issues). Corey Robin argues that the strategy hasn’t been to deny women’s agency but to assert their subordination (Denying the Agency of the Subordinate Class). This is why abortion laws were designed to target male doctors, although they rarely did, and not their female patients. Everything comes down to agency or its lack or loss, but our entire sense of agency is out of accord with our own human nature. We seek to control what is outside of us, for our own sense of self is out of control. The legalistic worldview is inherently authoritarian, at the heart of what Julian Jaynes proposes as the post-bicameral project of consciousness, the contained self. But the container is weak and keeps leaking all over the place.

To bring it back to the original inspiration, Scott Preston wrote: “Quite obviously, our picture of the human being as an indivisible unit or monad of existence was quite wrong-headed, and is not adequate for the generation and re-generation of whole human beings. Our self-portrait or self-understanding of “human nature” was deficient and serves now only to produce and reproduce human caricatures. Many of us now understand that the authentic process of individuation hasn’t much in common at all with individualism and the supremacy of the self-interest.” The failure we face is that of identity, of our way of being in the world. As with neurasthenia in the past, we are now in a crisis of anxiety and depression, along with yet another moral panic about the declining white race. So, we get the likes of Steve Bannon, Donald Trump, and Jordan Peterson. We failed to resolve past conflicts and so they keep re-emerging.

“In retrospect, the omens of an impending crisis and disintegration of the individual were rather obvious,” Preston points out. “So, what we face today as “the crisis of identity” and the cognitive dissonance of “the New Normal” is not something really new — it’s an intensification of that disintegrative process that has been underway for over four generations now. It has now become acute. This is the paradox. The idea of the “individual” has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” We never were individuals. It was just a story we told ourselves, but there are others that could be told. Scott Preston offers an alternative narrative, that of individuation.

* * *

I found some potentially interesting books while skimming material on Google Books as I researched Frederick Hollick and related topics. Among the titles below, I’ll share some text from one of them because it offers a good summary of sexuality at the time, specifically women’s sexuality. Obviously, the issue went far beyond sexuality itself, and going by my own theorizing I’d say it is yet another example of symbolic conflation, considering its direct relationship to abortion.

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland
p. 34

WOMEN AND THE WOMB: The Emerging Birth Control Debate

The twentieth century dawned in America on a falling white birth rate. In 1800, an average of seven children were born to each “American-born white wife,” historians report. 29 By 1900, that number had fallen to roughly half. 30 Though there may have been several factors, some historians suggest that this decline—occurring as it did among young white women—may have been due to the use of contraceptives or abstinence, though few talked openly about it. 31

“In spite of all the rhetoric against birth control, the birthrate plummeted in the late nineteenth century in America and Western Europe (as it had in France the century before); family size was halved by the time of World War I,” notes Shari Thurer in The Myth of Motherhood. 32

As issues go, the “plummeting birthrate” among whites was a powder keg, sparking outcry as the “failure” of the privileged class to have children was contrasted with the “failure” of poor immigrants and minorities to control the number of children they were having. Criticism was loud and rampant. “The upper classes started the trend, and by the 1880s the swarms of ragged children produced by the poor were regarded by the bourgeoisie, so Emile Zola’s novels inform us, as evidence of the lower order’s ignorance and brutality,” Thurer notes. 33

But the seeds of this then-still nearly invisible movement had been planted much earlier. In the late 1700s, British political theorists began disseminating information on contraceptives as concerns of overpopulation grew among some classes. 34 Despite the separation of an ocean, by the 1820s, this information was “seeping” into the United States.

“Before the introduction of the Comstock laws, contraceptive devices were openly advertised in newspapers, tabloids, pamphlets, and health magazines,” Yalom notes. “Condoms had become increasingly popular since the 1830s, when vulcanized rubber (the invention of Charles Goodyear) began to replace the earlier sheepskin models.” 35 Vaginal sponges also grew in popularity during the 1840s, as women traded letters and advice on contraceptives. 36 Of course, prosecutions under the Comstock Act went a long way toward chilling public discussion.

Though Margaret Sanger’s is often the first name associated with the dissemination of information on contraceptives in the early United States, in fact, a woman named Sarah Grimke preceded her by several decades. In 1837, Grimke published the Letters on the Equality of the Sexes, a pamphlet containing advice about sex, physiology, and the prevention of pregnancy. 37

Two years later, Charles Knowlton published The Private Companion of Young Married People, becoming the first physician in America to do so. 38 Near this time, Frederick Hollick, a student of Knowlton’s work, “popularized” the rhythm method and douching. And by the 1850s, a variety of material was being published providing men and women with information on the prevention of pregnancy. And the advances weren’t limited to paper.

“In 1846, a diaphragm-like article called The Wife’s Protector was patented in the United States,” according to Marilyn Yalom. 39 “By the 1850s dozens of patents for rubber pessaries ‘inflated to hold them in place’ were listed in the U.S. Patent Office records,” Janet Farrell Brodie reports in Contraception and Abortion in 19th Century America. 40 And, although many of these early devices were often more medical than prophylactic, by 1864 advertisements had begun to appear for “an India-rubber contrivance” similar in function and concept to the diaphragms of today. 41

“[B]y the 1860s and 1870s, a wide assortment of pessaries (vaginal rubber caps) could be purchased at two to six dollars each,” says Yalom. 42 And by 1860, following publication of James Ashton’s Book of Nature, the five most popular ways of avoiding pregnancy—“withdrawal, and the rhythm methods”—had become part of the public discussion. 43 But this early contraceptives movement in America would prove a victim of its own success. The openness and frank talk that characterized it would run afoul of the burgeoning “purity movement.”

“During the second half of the nineteenth century, American and European purity activists, determined to control other people’s sexuality, railed against male vice, prostitution, the spread of venereal disease, and the risks run by a chaste wife in the arms of a dissolute husband,” says Yalom. “They agitated against the availability of contraception under the assumption that such devices, because of their association with prostitution, would sully the home.” 44

Anthony Comstock, a “fanatical figure,” some historians suggest, was a charismatic “purist,” who, along with others in the movement, “acted like medieval Christians engaged in a holy war,” Yalom says. 45 It was a successful crusade. “Comstock’s dogged efforts resulted in the 1873 law passed by Congress that barred use of the postal system for the distribution of any ‘article or thing designed or intended for the prevention of contraception or procuring of abortion’,” Yalom notes.

Comstock’s zeal would also lead to his appointment as a special agent of the United States Post Office with the authority to track and destroy “illegal” mailing, i.e., mail deemed to be “obscene” or in violation of the Comstock Act. Until his death in 1915, Comstock is said to have been energetic in his pursuit of offenders, among them Dr. Edward Bliss Foote, whose articles on contraceptive devices and methods were widely published. 46 Foote was indicted in January of 1876 for dissemination of contraceptive information. He was tried, found guilty, and fined $3,000. Though donations of more than $300 were made to help defray costs, Foote was reportedly more cautious after the trial. 47 That “caution” spread to others, some historians suggest.

Disorderly Conduct: Visions of Gender in Victorian America
By Carroll Smith-Rosenberg

Riotous Flesh: Women, Physiology, and the Solitary Vice in Nineteenth-Century America
by April R. Haynes

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland

Rereading Sex: Battles Over Sexual Knowledge and Suppression in Nineteenth-century America
by Helen Lefkowitz Horowitz

Rewriting Sex: Sexual Knowledge in Antebellum America, A Brief History with Documents
by Helen Lefkowitz Horowitz

Imperiled Innocents: Anthony Comstock and Family Reproduction in Victorian America
by Nicola Kay Beisel

Against Obscenity: Reform and the Politics of Womanhood in America, 1873–1935
by Leigh Ann Wheeler

Purity in Print: Book Censorship in America from the Gilded Age to the Computer Age
by Paul S. Boyer

American Sexual Histories
edited by Elizabeth Reis

Wash and Be Healed: The Water-Cure Movement and Women’s Health
by Susan Cayleff

From Eve to Evolution: Darwin, Science, and Women’s Rights in Gilded Age America
by Kimberly A. Hamlin

Manliness and Civilization: A Cultural History of Gender and Race in the United States, 1880-1917
by Gail Bederman

One Nation Under Stress: The Trouble with Stress as an Idea
by Dana Becker