The Transparent Self to Come?

Scott Preston’s newest piece, The Seer, is worth reading. He makes an argument for what is needed next for humanity, what one might think of as getting off the Wheel of Karma. But I can’t help thinking about the messy human details in this moment of societal change and crisis. Great thinkers like Jean Gebser talk of integral consciousness in one way, while most people experience the situation in entirely different terms. That is why I’m glad Preston brought in what is far less respectable (and far more popular), like Carlos Castaneda and the Seth Material.

As anyone should know, we aren’t discussing mere philosophy here, for it touches upon human experience and social reality. I sense much of what is potentially involved, even as it is hard to put one’s finger on it. The challenge we are confronted with is far more disconcerting than we are typically able and willing to acknowledge, assuming we can even begin to comprehend what we are facing and what is emerging. How we get to the integral is the difficult part. Preston explains well the issue of making the ego/emissary transparent — as the Seth Material put it, “true transparency is not the ability to see through, but to move through.” That is a good way of putting it.

I appreciate his explanation of Satan (the egoic-demiurge) as the ape of God, what Iain McGilchrist calls usurpation. This reminds me of the mimicry of the Trickster archetype and its relation to the co-optation of the reactionary mind (see Corey Robin). A different kind of example is that of the folkloric Men in Black, as described by John Keel. It makes me wonder what such things represent in human reality. This was on my mind because of another discussion I was having in a different post, Normal, from rauldukeblog’s The Violent Ink. The topic had to do with present mass hysteria and, as I’m wont to do, I threw out my own idiosyncratic context. Climate change came up, and so I was trying to explain what makes this moment of crisis different from those of the past.

There is a scientific quality to it. Modern science created climate change through technological innovation and industrialization. And now science warns us about it. But it usually isn’t like a war, famine, or plague that hits a population in an undeniable way — not for most of us, not yet. That is the complexifying change in the scientific worldview we now inhabit, and it is why the anxiety is so amorphous, in a way profoundly different from before. To come to terms with climate change, something within human nature itself would have to shift. If we are to survive it while maintaining civilization, we will likely have to be as dramatically transformed as were bicameral humans during the collapse of the Bronze Age civilizations. We won’t come through this unscathed and unchanged.

Speaking of the scientific or pseudo-scientific, there is the phenomenon of UFOs and contact experience. I pointed out that there has been a shift in official military policy toward reporting of UFO sightings, which gets one wondering about motives, and about why now. UFOs and aliens express that free-floating sense of vague anxiety about the unknown, specifically in a modern framework. It’s almost irrelevant what UFOs really are or aren’t. And no doubt, as in the past, various governments will attempt to use UFO reports to manipulate populations, to obfuscate what they wish to keep hidden, or whatever else. The relevant point here is what UFOs symbolize in the human psyche and why they gain so much attention during periods of wide-scale uncertainty and stress. The UFO cults that have appeared over the past few generations are maybe akin to cults like Jesus worship that arose in the Axial Age. Besides Jung, it might be helpful to bring in Jacques Vallee’s even more fascinating view. A new mythos is forming.

I’m not sure what it all adds up to. And my crystal ball is no less cloudy than anyone else’s. It just feels different in that we aren’t only facing crisis and catastrophe. It feels like a far more pivotal point, a fork in the path. During what is called the General Crisis, there was much change going on and it did help bring to an end what remained of feudalism. But the General Crisis didn’t fundamentally change society and culture, much less cut deeper into the human psyche. I’d argue that it simply brought us further down the same path we’d been on for two millennia since the Axial Age. I keep wondering if now the Axial Age is coming to its final conclusion, that there isn’t much further we can go down this path.

By the way, I think my introduction to Jacques Vallee came through further reading after having discovered John Keel’s The Mothman Prophecies, the book that came out long before the movie. That is where the basic notion I was working with here comes from. During times of crisis and foreboding, often preceding actual mass death, there is a buildup of strangeness that spills out from our normal sense of reality. We can, of course, talk about this in more rational or rather respectable terms without any of the muck of UFO research.

Keith Payne, in The Broken Ladder, notes that people come to hold bizarre beliefs and generally act irrationally under conditions of high inequality, that is to say, when afflicted with unrelenting stress. But it goes beyond that. There is more going on than mere beliefs. People’s sense of reality becomes distorted and they begin experiencing what they otherwise would not. This was the basis of Julian Jaynes’ hypothesis of the bicameral mind, where voice-hearing was supposedly elicited through stress. And this is supported by modern evidence, such as the cases recorded by John Geiger in The Third Man Factor.

An additional layer could be brought to this with Jacques Vallee’s work showing how anecdotes of alien contact follow the same pattern as stories of fairy abductions and anthropological accounts of shamanic initiation. These are religious experiences. In other times, they were more likely interpreted as visitations by spiritual beings or as transportation into higher realms. Similarly, spinning and flying disks in the sky were interpreted as supernatural manifestations in the pre-scientific age. But maybe it’s all the same phenomenon, whether the source is elsewhere or from within the human psyche.

The interesting part is that these experiences, sometimes sightings involving crowds of people (including many incidents with military personnel and pilots), often correspond with intensified societal conflict. UFO sightings and contact experiences appear to increase at specific periods of stress. Unsurprisingly, people turn to the strange in strange times. And there is something about this strangeness, the pervasiveness of it and the power it holds. To say we are living in a reactionary time, when nearly everything and everyone has become reactionary, is to understate it to an extreme degree. The Trickster quality of the reactionary mind, one might argue, is its most defining feature.

One might call it the return of the repressed. Or it could be thought of as the eruption (irruption?) of the bicameral mind. Whatever it is, it challenges and threatens the world we think we know. Talk of Russian meddling and US political failure is tiddlywinks in comparison. But the fact that we take such tiddlywinks so seriously does add to the sense of crisis. Everything is real to the degree we believe it to be real, in that the effects of it become manifest in our experience and behavior, in the collective choices that we make and accumulate over time.

We manifest our beliefs. And even the strangest of beliefs can become normalized and, as such, become self-fulfilling prophecies. Social realities aren’t only constructed. They are imagined into being. Such imagination is human reality for we are incapable of experiencing it as anything other than reality. We laugh at the strange beliefs of others at our own peril. But what is being responded to can remain hidden or outside of the mainstream frame of consciousness. Think of the way that non-human animals act in unusual ways before an earthquake hits. If all we see is what the animals are doing and lack any greater knowledge, we won’t appreciate that it means we should prepare for the earthquake to come.

Humans too act strangely before coming catastrophes. It doesn’t require anyone to consciously know of and rationally understand what is coming. Most of how humans respond is instinctual or intuitive. I’d only suggest paying less attention to the somewhat arbitrary focus of the anxiety and, instead, taking the anxiety itself as a phenomenon to be taken seriously. Something real is going on. And it portends something on its way.

Here is my point. We see things through a glass darkly. Things are a bit on the opaque side. Transparency of self is more of an aspiration at this point, at least for those of us not yet enlightened beings. All the voices remain loud within us and in the world around us. Among many thinkers seeking a new humanity, there is a prioritizing of the visual over the auditory. There is a historical background to this. The bicameral mind was ruled by voices. To seek freedom from this, to get off the grinding and rumbling Wheel of Karma, requires a different relationship to our senses. There is a reason the Enlightenment was so powerfully obsessed with tools that altered and extended our perception, with a major focus on the visual, from lenses to the printed word. Oral society was finally losing its power over us, or that is what some wanted to believe.

The strangeness of it all is that pre-consciousness maintains its pull over modern consciousness even as we idealize the next stage of humanity, integral trans-consciousness. Instead of escaping the authoritative power of the bicameral voice, we find ourselves in a world of mass media and social media where voices have proliferated. We are now drowning in voices and so we fantasize about the cool silence of the visionary, that other side of our human nature — as Preston described it:

One of the things we find in don Juan’s teachings is “the nagual” and “the tonal” relation and this is significant because it is clearly the same as McGilchrist’s “Master” and “Emissary” relationship of the two modes of attention of the divided brain. In don Juan’s teachings, these correspond to what are called the “first” and “second attentions”. If you have read neuroscientist Jill Bolte-Taylor’s My Stroke of Insight or followed her TED talk about that experience, you will see that she, too, is describing the different modes of attention of the “nagual” and the “tonal” (or the “Master” and the “Emissary”) in her own experience, and that when she shifted into the “nagual” mode, she, too, saw what Castaneda saw — energy as it flows in the universe, and she also called that “the Life Force Power of the Universe”.

About getting off the Wheel, rauldukeblog wrote that, “Karma is a Sanskrit word meaning action so the concept is that any act(tion) creates connective tissue which locks one into reaction and counter and so on in an endless loop.” That brings us back to the notion of not only seeing through the egoic self but more importantly to move through the egoic self. If archaic authorization came from voices according to Jaynes, and if self-authorization of the internalized voice of egoic consciousness hasn’t fundamentally changed this equation, then what would offer us an entirely different way of being and acting in the world?

The last time we had a major transformation of the human mind, back during the ending of the Bronze Age, it required the near total collapse of every civilization. Structures of the mind aren’t easily disentangled from entrenched patterns of social identity as long as the structures of civilization remain in place. All these millennia later, we are still struggling to deal with the aftermath of the Axial Age. What are the chances that the next stage of humanity is going to be easier or happen more quickly?

Caloric Confusion

In human biological terms, there is no such thing as a calorie. It’s an abstraction, measured by machines that break down matter to determine the energy it contains. That isn’t how the body functions. It’s similar to the view of nutritionism, where chemical analyses determine the amounts of specific vitamins and minerals found in any given food. None of this, however, tells us how the body absorbs, processes, and uses them.
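To make the abstraction concrete, here is a minimal sketch of the standard bookkeeping, using the physical definition of the kilocalorie and the conventional Atwater factors (the 4/4/9 kcal-per-gram values used on food labels):

```latex
% The calorie is a unit of heat, not a biological quantity:
% one food Calorie (kcal) is the energy needed to heat 1 kg of water by 1 degree C.
\[
1\ \text{kcal} = 4184\ \text{J}
\]
% Label energy is typically computed from the Atwater factors
% (kcal per gram of macronutrient), not measured in any living body:
\[
E_{\text{label}} \approx 4\,g_{\text{protein}} + 4\,g_{\text{carbohydrate}} + 9\,g_{\text{fat}}
\]
```

Nothing in that arithmetic says anything about absorption, hormones, or what the body actually does with the energy.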

Take sugar, for example. It is worse than empty calories. Rather, we are talking about actively toxic calories. It also interferes with nutrient intake and so can contribute to malnourishment. In the 1890s in Britain and by 1940 in the United States, a shockingly high number of recruits and draftees were being rejected because of malnourishment and tooth decay. This had been preceded by decades of rising levels of sugar and carbs in the diet, combined with processed vegetable oils that were replacing saturated fat.

A commonly discussed example of this is how more vitamin C is required on a high-carb diet because glucose competes with it, whereas on a low-carb diet very little vitamin C is needed to avoid scurvy. Sugar is causing harm simultaneously on multiple levels. That it is fattening is bad enough, especially considering all that is involved: dental caries, diabetes, heart disease, fatty liver, depression, etc — no minor set of health consequences and that list could go on much longer. Yet sugar was exonerated while saturated fat was scapegoated, which is rather inconsistent in that saturated fats have never been treated as mere empty calories equal to anything else.

It turns out calories aren’t all equal, and on some level everyone probably always knew that, but in its simplicity the calorie model was an easy way of describing nutrition to the public. The problem is that it is so simplistic as to be fundamentally wrong. It is meaningless to speak of calories-in/calories-out. That doesn’t explain anything. We are still left with the issue of why the body burns some calories while turning others into fat. Recent research has shown that there is a metabolic advantage to low-carb diets in that more calories are burned relative to calories consumed. This is particularly true in a ketogenic state, where the body efficiently burns fat. Dietary fat easily turns into body fat when eaten with carbs, but far less so when carbs are limited.
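It helps to write the energy-balance claim down and see how little it says. As a sketch, it is an accounting identity, true by definition, which is exactly Bergmann's complaint quoted further below:

```latex
% Energy balance as an accounting identity (a description, not a causal model):
\[
\Delta E_{\text{stored}} = E_{\text{in}} - E_{\text{out}}
\]
% The identity holds whichever way causality runs: whether overeating
% drives fat storage, or a hormonal push toward storage drives hunger
% and lethargy. It constrains the bookkeeping, not the explanation.
```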

It is understandable how this all came about. We study what we can perceive and we ignore what we can’t. Scientific researchers early on learned how to measure calories with machines and it was assumed that the body was like a machine, burning a fuel in the way an engine burned coal and released heat. It became not only one model among many but a defining paradigm to explain human behavior and even morality, with the sins of gluttony and sloth taking key roles. Calories-in/calories-out created a perfect moral calculus. If you were fat or whatever, it was your fault. It couldn’t possibly have anything to do with the severely health-destroying food system, demented nutritional advice, and sub-par healthcare.

Other models of dietary health developed such as the endocrinological study of hormones and the hormonal systems, but the calorie model was already established. Besides, most of this other early research was done in Europe, much of it in German. The World Wars scattered the European research communities and their scientific literature mostly remained untranslated. When the US became the new center of nutritional research, English-speaking researchers were largely ignorant of all that previous researchers had already figured out.

This state of affairs persisted until this past decade or so, more than a century after that early research was done. Only now has American-dominated nutritional research begun to return to old knowledge long forgotten.

* * *

The Curious History of the Calorie in U.S. Policy:
A Tradition of Unfulfilled Promises
by Deborah I. Levine

The Progressive Era Body Project:
Calorie-Counting and “Disciplining the Stomach” in 1920s America
by Chin Jou

Forget Calories
by James Hamblin

Death of the Calorie
by Peter Wilson

How Do We Gain Weight? – Calories Part I, II, III, IV, V, VI, VII, VIII, IX, X, & XI
Historic Perspective on Obesity – Hormonal Obesity Part I, II, III, IV, & V
The Myth about Exercise – Exercise Part I, II, III, & IV
by Jason Fung

11 Experts Demolish the “Calories-In-Calories-Out” (CICO) Model of Obesity
9 More Experts Lay Waste to the “Calories-In-Calories-Out” (CICO) Model of Obesity
by Adam Kosloff

* * *

The Case Against Sugar
by Gary Taubes
pp. 23-25

Meanwhile, the latest surge in this epidemic of diabetes in the United States—an 800 percent increase from 1960 to the present day, according to the Centers for Disease Control—coincides with a significant rise in the consumption of sugar. Or, rather, it coincides with a surge in the consumption of sugars, or what the FDA calls “caloric sweeteners”—sucrose, from sugarcane or beets, and high-fructose corn syrup, HFCS, a relatively new invention.

After ignoring or downplaying the role of sugars and sweets for a quarter-century, many authorities now argue that these are, indeed, a major cause of obesity and diabetes and that they should be taxed heavily or regulated. The authorities still do so, however, not because they believe sugar causes disease but, rather, because they believe sugar represents “empty calories” that we eat in excess because they taste so good. By this logic, since refined sugar and high-fructose corn syrup don’t contain any protein, vitamins, minerals, antioxidants, or fiber, they either displace other, more nutritious elements of our diet, or simply add extra, unneeded calories to make us fatter. The Department of Agriculture, for instance (in its recent “Dietary Guidelines for Americans”), the World Health Organization, and the American Heart Association, among other organizations, advise a reduction in sugar consumption for these reasons primarily.

The empty-calories argument is particularly convenient for the food industry, which would understandably prefer not to see a key constituent of its products—all too often, the key constituent—damned as toxic. The sugar industry played a key role in the general exoneration of sugar that took place in the 1970s, as I’ll explain later. Health organizations, including the American Diabetes Association and the American Heart Association, have also found the argument convenient, having spent the last fifty years blaming dietary fat for our ills while letting sugar off the hook. […]

This book makes a different argument: that sugars like sucrose and high-fructose corn syrup are fundamental causes of diabetes and obesity, using the same simple concept of causality that we employ when we say smoking cigarettes causes lung cancer. It’s not because we eat too much of these sugars—although that is implied merely by the terms “overconsumption” and “overeating”—but because they have unique physiological, metabolic, and endocrinological (i.e., hormonal) effects in the human body that directly trigger these disorders. This argument is championed most prominently by the University of California, San Francisco, pediatric endocrinologist Robert Lustig. These sugars are not short-term toxins that operate over days and weeks, by this logic, but ones that do their damage over years and decades, and perhaps even from generation to generation. In other words, mothers will pass the problem down to their children, not through how and what they feed them (although that plays a role), but through what they eat themselves and how that changes the environment in the womb in which the children develop.

Individuals who get diabetes—the ones in any population who are apparently susceptible, who are genetically predisposed—would never have been stricken if they (and maybe their mothers and their mothers’ mothers) lived in a world without sugar, or at least in a world with a lot less of it than the one in which we have lived for the past 100 to 150 years. These sugars are what an evolutionary biologist might call the environmental or dietary trigger of the disease: the requisite ingredient that triggers the genetic predisposition and turns an otherwise healthy diet into a harmful one. Add such sugars in sufficient quantity to the diet of any population, no matter what proportion of plants to animals they eat—as Kelly West suggested in 1974 about Native American populations—and the result eventually is an epidemic of diabetes, and obesity as well.

pp. 117-121

The second pillar of modern nutritional wisdom is far more fundamental and ultimately has had far more influence on how the science has developed, and it still dominates thinking on the sugar issue. As such, it has also done far more damage. To the sugar industry, it has been the gift that keeps on giving, the ultimate defense against all arguments and evidence that sugar is uniquely toxic. This is the idea that we get obese or overweight because we take in more calories than we expend or excrete. By this thinking, researchers and public-health authorities think of obesity as a disorder of “energy balance,” a concept that has become so ingrained in conventional thinking, so widespread, that arguments to the contrary have typically been treated as quackery, if not a willful disavowal of the laws of physics.

According to this logic of energy balance, of calories-in/calories-out, the only meaningful way in which the foods we consume have an impact on our body weight and body fat is through their energy content—calories. This is the only variable that matters. We grow fatter because we eat too much—we consume more calories than we expend—and this simple truth was, and still is, considered all that’s necessary to explain obesity and its prevalence in populations. This thinking renders effectively irrelevant the radically different impact that different macronutrients—the protein, fat, and carbohydrate content of foods—have on metabolism and on the hormones and enzymes that regulate what our bodies do with these foods: whether they’re burned for fuel, used to rebuild tissues and organs, or stored as fat.

By this energy-balance logic, the close association between obesity, diabetes, and heart disease implies no profound revelations to be gleaned about underlying hormonal or metabolic disturbances, but rather that obesity is driven, and diabetes and heart disease are exacerbated, by some combination of gluttony and sloth. It implies that all these diseases can be prevented, or that our likelihood of contracting them is minimized if individuals—or populations—are willing to eat in moderation and perhaps exercise more, as lean individuals are assumed to do naturally. Despite copious reasons to question this logic and, as we’ll see, an entire European school of clinical research that came to consider it nonsensical, medical and nutrition authorities have tended to treat it as gospel. Obesity is caused by this caloric imbalance, and diabetes, as Joslin said nearly a century ago, is largely the penalty for obesity. Curb the behaviors of gluttony (Shakespeare’s Falstaff was often invoked as a pedagogical example) and sloth (another deadly sin) and all these diseases will once again become exceedingly rare.

This logic also served publicly to exonerate sugar as a suspect in either obesity or diabetes. By specifying energy or caloric content as the instrument through which foods influence body weight, it implies that a calorie of sugar would be no more or less capable of causing obesity, and thus diabetes, than a calorie of broccoli or olive oil or eggs or any other food. By the 1960s, the phrase “a calorie is a calorie” had become a mantra of the nutrition-and-obesity research community, and it was invoked to make just this argument (as it still is). […]

The energy-balance idea derives ultimately from the simple observation that the obese tend to be hungrier than the lean, and to be less physically active, and that these are two deviations from normal intake and expenditure: gluttony and sloth. It was first proposed as an explanation of obesity in the early years of the twentieth century, when nutrition researchers, as we discussed, were focused on carefully quantifying with their calorimeters the energy content of foods and the energy expended in human activity. At the time, the application of the laws of thermodynamics and particularly the conservation of energy to living creatures—the demonstration that all the calories we consume will either be burned as fuel or be stored or excreted—was considered one of the triumphs of late-nineteenth-century nutrition science. Nutrition and metabolism researchers embraced calories and energy as the currency of their research. When physicians began speculating as to the cause of obesity, they naturally did the same.

The first clinician to take these revelations on thermodynamics and apply them to the very human problem of obesity was the German diabetes specialist Carl von Noorden. In 1907, he proposed that “the ingestion of a quantity of food greater than that required by the body, leads to an accumulation of fat, and to obesity, should the disproportion be continued over a considerable period.”

Noorden’s ideas were disseminated widely in the United States and took root primarily through the work of Louis Newburgh, a University of Michigan physician who did so based on what he believed to be a fundamental truth: “All obese persons are alike in one fundamental respect—they literally overeat.” Newburgh assumed that overeating was the cause of obesity and so proceeded to blame the disorder on some combination of a “perverted appetite” (excessive energy consumption) and a “lessened outflow of energy” (insufficient expenditure). As for obese patients who remained obese in spite of this understanding, Newburgh suggested they did so because of “various human weaknesses such as overindulgence and ignorance.” (Newburgh himself was exceedingly lean.) Newburgh was resolutely set against the idea that other physical faults could be involved in obesity. By 1939, his biography at the University of Michigan was already crediting him with the discovery that “the whole problem of weight lies in regulation of the inflow and outflow of calories” and for having “undermined conclusively the generally held theory that obesity is the result of some fundamental fault.”

The question of a fundamental fault could not be dismissed so lightly, however. To do that required dismissing observations of German and Austrian clinical researchers who had come to conclude that obesity could only be reasonably explained by the existence of such a fault—specifically, a defect in the hormones and enzymes that served to control the flow of fat into and out of cells. Newburgh rejected this hormonal explanation, believing he had identified the cause of obesity as self-indulgence.

Gustav von Bergmann, a contemporary of Noorden’s and the leading German authority on internal medicine, criticized Noorden’s ideas (and implicitly Newburgh’s) as nonsensical. Positive energy balance—more energy in than out—occurred when any system grew, Bergmann pointed out: it accumulated mass. Positive energy balance wasn’t an explanation but, rather, a description, and a tautological one at that: logically equivalent to saying that a room gets crowded because more people enter than leave. It was a statement that described what happens but not why. It seems just as illogical, wrote Bergmann, to say children grow taller because they eat too much or exercise too little, or they remain short because they’re too physically active. “That which the body needs to grow it always finds, and that which it needs to become fat, even if it’s ten times as much, the body will save for itself from the annual balance.”

The question that Bergmann was implicitly asking is why excess calories were trapped in fat tissue, rather than expended as energy or used for other necessary biological purposes. Is there something about how the fat tissue is regulated or how fuel metabolism functions, he wondered, that makes it happen?

The purpose of a hypothesis in science is to offer an explanation for what we observe, and, as such, its value is determined by how much it can explain or predict. The idea that obesity is caused by the overconsumption of calories, Bergmann implied, failed to explain anything.

p. 129

These revelations led both directly and indirectly to the notion that diets restricted in carbohydrates—and restricted in sugar most of all—would be uniquely effective in slimming the obese. By the mid-1960s, these carbohydrate-restricted diets, typically high in fat, were becoming fashionable, promoted by physicians, not academics, and occasionally in the form of hugely successful diet books. Academic nutritionists led by Fred Stare and Jean Mayer of Harvard were alarmed by this and denounced these diets as dangerous fads (because of their high fat content, particularly saturated fat), suggesting that the physician-authors were trying to con the obese with the fraudulent argument that they could become lean without doing the hard work of curbing their perverted appetites. “It is a medical fact that no normal person can lose weight unless he cuts down on excess calories,” The New York Times would explain in 1965.

This battle played out through the mid-1970s, with the academic nutritionists and obesity researchers on one side, and the physicians-turned-diet-book-authors on the other. The obesity researchers began the 1960s believing that obesity was, indeed, an eating disorder—Newburgh’s “perverted appetite”—and the ongoing revolution in endocrinology, spurred by Yalow and Berson’s invention of the radioimmunoassay, did little to convince them otherwise. Many of the most influential obesity researchers were psychologists, and much of their research was dedicated to studying why the obese failed to restrain their appetites sufficiently—to eat in moderation—and how to induce them to do a better job of it. The nutritionists followed along as they focused on the question of whether dietary fat caused heart disease and perhaps obesity as well, because of its dense calories. (A gram of protein or a gram of carbohydrate has four calories; a gram of fat has almost nine.) In the process, they would continue to reject any implication that sugar had fattening powers beyond its caloric content. That it might be the cause of insulin resistance—after all, something was—would not cross their radar screen for decades.

pp. 199-201

In 1986, with the perceived FDA exoneration of sugar, the public-health authorities and the clinicians and researchers studying obesity and diabetes had come to a consensus that type 2 diabetes was caused by obesity, not sugar, and that obesity itself was caused merely by eating too many calories or exercising away too few. By this logic, the only means by which a macronutrient could influence body weight was its caloric content, and so, calorie for calorie, sugar was no more fattening than any other food, and thus no more likely to promote or exacerbate diabetes. This was what the sugar industry had been arguing and embracing since the 1930s. It was what Fred Stare of Harvard had in mind when he said publicly that he would prefer to get his calories from a martini than from a dessert.

A more nuanced perspective, one nourished by scientific progress, would be that if two foods or macronutrients are metabolized differently—if glucose and fructose, for instance, are metabolized in entirely different organs, as they mostly are—then they are likely to have vastly different effects on the hormones and enzymes that control or regulate the storage of fat in fat cells. One hundred calories of glucose will very likely have an entirely different effect on the human body from one hundred calories of fructose, or fifty calories of each consumed together as sucrose, despite having the same caloric content. It would take a leap of faith to assume otherwise.

Nutritionists had come to assume that a hundred calories of fat had a different effect from a hundred calories of carbohydrate on the accumulation of plaque in coronary arteries; even that a hundred calories of saturated fat would have an entirely different effect from a hundred calories of unsaturated fat. So why not expect that macronutrients would have a different effect on the accumulation of fat in fat tissue, or on the phenomena, whatever they might be, that eventually resulted in diabetes? (Insulin resistance and hyperinsulinemia, as Rosalyn Yalow and Solomon Berson, among others, had suggested in the 1960s, seemed to be a very likely bet.) But obesity and diabetes researchers, as we’ve seen, had come to embrace the mantra that “a calorie is a calorie”; they would repeat it publicly when they were presented with the idea that there was something unique about how the human body metabolizes sugar that sets it apart from other carbohydrates. The long-held view was based on the state of the science in the early years of the twentieth century, and to cling to it required a willful rejection of the decades’ worth of relevant revelations in the medical sciences that had come since.

By the 1980s, biochemists, physiologists, and nutritionists who specialized in the study of sugar or in the fructose component of sugar had come to consistent conclusions about the short-term effects of sugar consumption in human subjects, as well as the details of how sugar is metabolized and how this influences the body as a whole. The glucose we consume—in starch or flour, or as half of a sugar molecule—will be used directly for fuel by muscle cells, the brain, and other tissues, and can be stored in muscles or the liver (as a compound called glycogen), but the fructose component of sugar has a much different fate. Most of it never makes it into the circulation; it is metabolized in the liver. The metabolic pathways through which glucose passes when it is being used for fuel—in both liver and muscle cells—involve a feedback mechanism to redirect it toward storage as glycogen when necessary. This is the case with fructose, too. But the metabolism of fructose in the liver is “unfettered by the cellular controls,” as biochemists later put it, that work to prevent its conversion to fat. One result is the increased production of triglycerides, and thus the abnormally elevated triglyceride levels that were observed in many research subjects, though not all, when they ate sugar-rich diets.

While cardiologists and epidemiologists were debating whether elevated triglycerides actually increased the risk of heart disease (in the process, challenging their own beliefs that cholesterol was key), biochemists had come to accept that sucrose was “the most lipogenic” of carbohydrates—as even Walter Glinsmann, author of the FDA report on sugar, would later acknowledge—and that the liver was the site of this fat synthesis. The Israeli biochemist Eleazar Shafrir would describe this in the technical terminology as “the remarkable hepatic lipogenic capacity induced by fructose-rich diets.” It was also clear from the short-term trials in humans that this happened to a greater extent in some individuals than others, just as it did in some species of animals and not others. In human studies, subjects who had the highest triglycerides when the trials began tended to have the greatest response to reducing sugar intake, suggesting (but not proving) that the sugar was the reason they had such high triglycerides in the first place. These same individuals also tended to see the greatest drop in cholesterol levels when they were put on low-sugar diets.

Does a Healthy LCHF Diet Protect Against Sunburns?

As I’ve written about lately, there is something unique about a low-carb, high-fat diet. People feel better and have more energy. Diverse symptoms disappear, and some serious conditions, from autoimmune disorders to mood disorders, are for some people reversed. That is particularly true in the context of exercise, calorie restriction, fasting, OMAD, ketosis, autophagy, etc., and when combined with traditional foods, paleo, carnivore, etc. Many have experimented in this area of dietary change and observed major improvements, but it isn’t always clear exactly what is causing any given improvement.

We do understand certain things well. I’ve already discussed in detail ketosis and related factors. And there has been more info coming out about autophagy, an even more fascinating topic. There is the signaling in relation to mTOR, IGF1, and AMPK. And there are the hormones that deal with hunger, satiety, and fullness. Everything is context-dependent. For example, the carnitine in red meat can be turned into carcinogenic TMAO by the Prevotella gut bacteria, but that is a non-issue as long as you aren’t eating the grains that feed Prevotella in the first place. Or consider how the vitamin C deficiency that leads to scurvy is rare on carnivore diets, even though vitamin C is found in such small amounts in animal foods, since on a low-carb diet the body needs less vitamin C. Issues with gut health, inflammation, and neurocognition are also more clearly explained, as they’ve received much scientific attention.

Other results are more anecdotal, though. This is largely because the research on low-carb, high-fat diets has been limited and in many cases, such as with zero-carb, scientific evidence is even more sparse. But what thousands of people have observed remains interesting, if not yet entirely explained. Many LCHF dieters have noted that their thoughts are less obsessive and compulsive, something I’ve argued has to do with eliminating addictive foods from the diet, especially added sugar and grains. An example of this is the decrease in intrusive sexual thoughts reported by some (and less distraction in general), even as some also report a decrease in erectile dysfunction (the latter being unsurprising, as LCHF diets are causally linked to hormonal functioning and cardiovascular health). Sexuality definitely is changed in various ways, as demonstrated by how early puberty becomes common when populations switch to agriculture with high amounts of carbohydrates, in particular grains. Maybe dairy has something to do with it as well, since dairy triggers growth hormones — perhaps why agricultural societies were able to outbreed hunter-gatherers, overwhelming them with a continually growing supply of cheap labor and cheap lives to send off to war.

There are some confounding factors, of course. Along with eating more nutrient-dense foods with an emphasis on fat-soluble vitamins, people going on various kinds of low-carb diets also tend to increase their intake of cholesterol, saturated fat, and omega-3s while decreasing omega-6s. Cholesterol is one of the most important substances for brain health, and it helps the body process vitamin D from sunlight. Saturated fat is a complicated issue and no one fully knows its significance, beyond our knowing that the fear-mongering about it no longer appears valid. As for omega-3s, they are essential to so much. The main problem is that omega-6s are at such a high level in the modern diet that they are inflammatory. In switching to healthier oils and fats, most low-carbers eliminate vegetable oils in junk food and in cooking, vegetable oils being the main source of omega-6s.

This could explain why some think sunburns are less common on a low-carb diet (read down through the Twitter comments). It may or may not have anything specifically to do with carbohydrates themselves and, instead, be more about the general eating pattern common among low-carb dieters. This might have to do with oxidation and free radicals in relation to omega-6s. Or it could have something to do with fat-soluble vitamins or dietary cholesterol, intake of which is typically greater in low-carb, high-fat diets. There are similar patterns in multiple areas of dietary change and health, and they indicate something that can’t be explained by mainstream health ideology. Consider how Americans have experienced worsening health as they have followed expert opinion in eating more vegetables, fruits, whole grains, and vegetable oils while decreasing red meat and saturated fat. Americans have been following expert advice from mainstream institutions and from their doctors. The same kind of thing has happened with people protecting themselves against sun damage. Americans have increased their use of sunscreen while spending less time in the sun, as they were told to do. What have been the results? The skin cancer rate is going up and those avoiding the sun are less healthy. Is it a mere coincidence that the intake of omega-6s was also increasing during the same period? Maybe not.

When the actual causes are determined, we can isolate them and re-create the appropriate conditions or mimic them. This is biohacking — Siim Land is great at explaining how to get particular results based on the scientific evidence. If omega-6s or whatever else is the problem behind sunburns, then it’s far from being knowledge of value limited to the low-carb community. Omega-6s haven’t been as clearly on the radar of many other diets, but health issues with omega-6s are already well known in the scientific literature. So the advantages in this case might be attained without restricting carbs, although we don’t know that as of yet, assuming the anecdotal observations are proven valid. The interaction between omega-6s and carbohydrates might be a total package, in terms of pushing the body more fully into an inflammatory state where sunlight sensitivity becomes an issue. All we can do at the moment is offer hypotheses to be tested in personal experience and hopefully soon in scientific studies.

There are other arguments for why a specifically low-carb diet could offer sunburn protection, as explored by Keir Watson in Animal Products That Protect You From UV Damage. This is particularly true when we are talking about a paleo or similar diet with plenty of fatty animal foods. Along with omega-3s, saturated fat might play a role: “A higher saturation index should be protective against free-radical damage, suggesting that more saturated fat in the diet might be good, but the evidence I found was not very strong.” More research will need to be done on that possibility. Even if saturated fats simply replace omega-6s, they will be beneficial in this area of health. Another thing to consider is creatine, plentifully found in meat and fish, which “has marked protective effects against oxidative stress and UV-induced damage in the skin, including protecting mitochondrial DNA.” The last things brought up by Watson are antioxidants, which, although typically associated with plants, are also found in animal products: “Lutein and zeaxanthin (from egg yolk),” “Astaxanthin found in wild salmon, krill, lobster and crab,” “retinol (vitamin A from animal source, e.g. liver),” and vitamin D from “oily fish – salmon, mackerel, herrings and sardines – the very same fish that give you the protective omega-3 fats!”

The body is a complex system. Change even a single factor and it can have cascading effects. But change multiple factors and the entire functioning can shift into a different state, altering numerous areas of health. Many of the results will be unpredictable based on present science because most research up to this point has had a narrow focus in the populations being studied, almost entirely people on the Standard American Diet and variations of it. What is true for most people following the past half century of health advice won’t always apply to those following entirely different diets and lifestyles. It’s not that LCHF is going to heal all that ails you, but we find ourselves at a rather fascinating point in the views on diet, lifestyle, and health. We are coming to realize how profoundly the body and mind are affected by even minor changes. We have more hypotheses at present than knowledge, and that isn’t a new situation. So much of what we thought we knew in the past, the basis of the mainstream ideology of health experts, consisted of largely untested hypotheses when first advocated, and much of it remains unproven.

Now it’s time to get serious about exploring these other glimpses of entirely different possibilities of understanding. That is the point of hypotheses that often begin as observations and anecdotal evidence.

* * *

Effect of Dietary Lipid on UV Light Carcinogenesis in the Hairless Mouse
by Vivienne E. Reeve, Melissa Matheson, Gavin E. Greenoak, Paul J. Canfield, Christa Boehm‐Wilcox, and Clifford H. Gallagher

Isocaloric feeding of diets varying in lipid content to albino hairless mice has shown that their susceptibility to skin tumorigenesis induced by simulated solar UV light was not affected by the level of polyunsaturated fat, 5% or 20%. However a qualitative effect of dietary lipid was demonstrated. Mice fed 20% saturated fat were almost completely protected from UV tumorigenesis when compared with mice fed 20% polyunsaturated fat. Multiple latent tumours were detected in the saturated fat‐fed mice by subsequent dietary replenishment, suggesting that a requirement for dietary unsaturated fat exists for the promotion stage of UV‐induced skin carcinogenesis.

Effects of high-fat diets rich in either omega-3 or omega-6 fatty acids on UVB-induced skin carcinogenesis in SKH-1 mice
by You-Rong Lou et al

Is Sunscreen the New Margarine?
by Rowan Jacobsen

Don’t Drink (oil) and Fry (in the sun) – the link between polyunsaturated vegetable oil and skin cancer
by George Henderson

N=Many on Omega-6 and Sunburn: Can Sunburn be Reduced?
by Tucker Goodrich

Don’t Blame it on the Sun!
by Dawn Waldron

Why I Don’t Use (Or Need) Sunscreen
by Tom Naughton

American Diabetes Association Changes Its Tune

Over the past decade, ever more mainstream health organizations and government agencies have been slowly reversing their official positions on the dietary intake of carbohydrates, sugar, fat, cholesterol, and salt. This was seen in how the American Heart Association, without acknowledgment, backed off the once strong position on fats that it had defended since, I think, 1961, with the federal government adopting the same position as official policy in 1980. Here we are in 2019, more than a half century later.

Now we see the American Diabetes Association finally coming around as well. And it’s been a long time coming. When my grandmother was in an assisted living home, the doctors and nurses at the time were following the official ADA position of what were called “consistent carbs”. Basically, this meant diabetics were given a high-carb diet and that was considered perfectly fine, as long as it was consistent, so as to manage diabetes with consistently high levels of insulin use. It was freaking insanity, defying common sense.

While my grandmother was still living with my parents, my mother kept her blood sugar under control through diet, until she went to this healthcare facility. After that, her blood sugar was all over the place. The nurses had no comprehension that not all carbohydrates are equal: the glycemic index might be equivalent between a cookie and a carrot, but that ignores glycemic load, not to mention the possibility that diabetics should simply be cutting carbs in general. Instead, they argued that old people should be allowed to enjoy carbs, even if it meant that these nurses were slowly killing their patients and profiting the insulin companies at the same time. My mother was not happy about this callous attitude from these medical ‘professionals’.
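For what it’s worth, the cookie-versus-carrot point is easy to make concrete. Glycemic load scales the glycemic index by the carbohydrate actually in a serving; the numbers below are rough illustrative values I am assuming for the example, not measurements:

```latex
% Glycemic load (GL) weights glycemic index (GI) by carbs per serving:
\[
GL = \frac{GI \times \text{available carbohydrate (g)}}{100}
\]
% Illustrative ballpark values (assumed for the example):
% medium carrot: GI ~47, ~6 g carbs  => GL ~2.8 (low)
% one cookie:    GI ~50, ~20 g carbs => GL ~10  (moderate to high)
```

Similar index, wildly different load, which is exactly the distinction the “consistent carbs” protocol papered over.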

Yet here we are. The ADA now says low-carb, high-fat (LCHF) diets aren’t a fad and aren’t dangerous. They go so far as to say they are beneficial for type 2 diabetes. Those not completely ignorant have been saying this for generations. And the research has been accumulating for just as long. The shift in official recommendations that happened in the decades following the 1960s never made sense even according to the research at the time. Many academics and researchers pointed out the lack of evidence in blaming saturated fat and cholesterol. But they were ignored and dismissed, then later attacked, discredited, and silenced by influential and, in some cases, downright charismatic figures (e.g., Ancel Keys) in powerful organizations that became aligned with leading politicians and bureaucrats in key positions. Many careers were destroyed and debate was shut down.

Now those victims of dietary authoritarianism are vindicated, not that this helps all the average folk harmed. Many decades of bad dietary advice were forced onto the American public. This determined official policies and practices of government healthcare programs, school lunch programs, and healthcare providers. Because of the central position of the United States as a geopolitical power during the Cold War, countries all over the world adopted this unhealthy dietary ideology as part of their own official policies.

This also influenced the food system with the government subsidizing high yields of corn and grains to meet the recommendations of these nutritional guidelines. Big ag and big food changed their business models accordingly and put out products that were high in carbs and sugar while low in saturated fat, replacing the latter with unhealthy hydrogenated oils. At least hundreds of millions, if not billions of people, worldwide over multiple generations have suffered a horrible diet, increased sickness, bad medical care, and premature mortality as a result.

Without admitting they were wrong all this time, without apologizing for all the harm they caused, these leading experts and officials are changing their opinion. Better late than never. Mark this date for it is a historic moment.

* * *

3/19/2022: The AHA has once again reversed one of its decades-old recommendations. This time, following the ADA’s example, they’ve backed off their pro-carbohydrate advocacy. They now admit that low-carb diets, including the keto diet, can be healthy. This is a massive change. Up until quite recently, the AHA put its Heart Healthy logo on sugary cereals, simply because they had grains that were supposedly healthy.

These changes, once again, happened without any major public announcement and little corporate media reporting. Quietly, new recommendations replace the old without any admission that they were wrong for generations, much less any apology. As some suspect, a decade or so down the road, they’ll finally come out with fully publicized official recommendations about high-fat, low-carb diets; and they’ll state they’ve been supporting such diets for years. What they’ll never note is that they supported the complete opposite for decades.

* * *

ADA cautiously endorses low-carb nutrition
by Dr. Bret Scher

The central message is one of acceptance and individualization, which they sum up by saying:

“Evidence suggests that there is not an ideal percentage of calories from carbohydrate, protein, and fat for people with diabetes. Therefore, macronutrient distribution should be based on an individualized assessment of current eating patterns, preferences, and metabolic goals.”

While there is definite truth that people have different preferences and metabolic goals, the ADA could risk oversimplification if they stopped there. Fortunately, they get more specific, mentioning the benefits of low-carb:

“For individuals with type 2 diabetes not meeting glycemic targets or for whom reducing glucose-lowering drugs is a priority, reducing overall carbohydrate intake with a low- or very-low-carbohydrate eating pattern is a viable option”

My first question is, who wouldn’t prioritize reducing medications? That should be a given for everyone. Unfortunately, in our pharmaceutically driven medical society, that’s not always the case. But I give kudos to the ADA for mentioning it. I only hope that it will become the new standard, so that next time the ADA can say, “Since reducing or eliminating diabetes medications is a universal goal, we recommend low-carb diets.”

My second question is, what are the glycemic targets? Is it the standard HgbA1c of 7? Or is it time to recognize we can do much better with lifestyle, as opposed to drugs, and set the goal as less than 5.7 for everyone?

After an initial backing of low-carb diets, the guideline then takes a questionable turn.

“As research studies on some low-carbohydrate eating plans generally indicate challenges with long-term sustainability, it is important to reassess and individualize meal plan guidance regularly for those interested in this approach.”

With Virta Health reporting 83% compliance at 1 year and 74% at 2 years, I would take issue with a blanket statement that compliance is challenging. In fact, any behavioral change has long-term sustainability issues, and carbohydrate restriction may be no different, but it does not deserve to be singled out as particularly difficult. Certainly, if we discuss it with a patient saying “this is difficult to maintain long term,” that has less chance of success than if we say, “All behavior change is difficult, but given the potential health benefits, this is worth committing to for the long-term.” As they say in the beginning of the guide, the words we use matter and we should focus on positive and inspiring messages.

Nutrition Therapy for Adults With Diabetes or Prediabetes: A Consensus Report
by Alison B. Evert et al, American Diabetes Association
(also see here)

EATING PATTERNS: Consensus recommendations

  • A variety of eating patterns (combinations of different foods or food groups) are acceptable for the management of diabetes.
  • Until the evidence surrounding comparative benefits of different eating patterns in specific individuals strengthens, health care providers should focus on the key factors that are common among the patterns:
    ○ Emphasize nonstarchy vegetables.
    ○ Minimize added sugars and refined grains.
    ○ Choose whole foods over highly processed foods to the extent possible.
  • Reducing overall carbohydrate intake for individuals with diabetes has demonstrated the most evidence for improving glycemia and may be applied in a variety of eating patterns that meet individual needs and preferences.
  • For select adults with type 2 diabetes not meeting glycemic targets or where reducing antiglycemic medications is a priority, reducing overall carbohydrate intake with low- or very low-carbohydrate eating plans is a viable approach.

New Consensus Report Recommends Individualized Eating Plan to Meet Each Person’s Goals, Life Circumstances and Health Status
news release from American Diabetes Association

“‘What can I eat?’ is the number one question asked by people with diabetes and prediabetes when diagnosed. This new Consensus Report reflects the ADA’s continued commitment to evidence-based guidelines that are achievable and meet people where they are and recommends an individualized nutrition plan for every person with diabetes or prediabetes,” said the ADA’s Chief Scientific, Medical and Mission Officer William T. Cefalu, MD. “The importance of this consensus also lies in the fact it was authored by a group of experts who are extremely knowledgeable about numerous eating patterns, including vegan, vegetarian and low carb.”

Nina Teicholz:

Just out: @AmDiabetesAssn guidelines–most comprehensive review to date of Dietary Patterns + diabetes prevention/treatment. What’s new: low-carb recommendations are prominent. (Says low-carb “are among the most studied eating patterns for T2 diabetes.”) […]

This is the key advancement of new @AmDiabetesAssn guidelines. Low carb is no longer “dangerous” or “fad” but a “viable” diet supported by “substantial” research and considered best for a number of T2 diabetes outcomes.

Dr. John Owens:

This is an historic day! My case managers and dietitian have been supporting my low-carb recommendations for years, going against ADA guidelines. Now they don’t have to!

Dr. Eric Sodicoff:

Still…. They seem a little backward here. Bust out the low carb diet when meds not working?? Really? IMHO, carb restriction is JOB #1 in diabetes management, for use early and always. It is NOT second to medication in my treatment protocol.

Starofthesea:

If you go back to the beginning, like back in the 1930’s, the doctors were telling diabetics to stop eating carbohydrates. Then somebody fabricated the cholesterol theory of heart disease and invented a drug called statins. Then suddenly carbs were okay for diabetics.

Nutrition Therapy for Adults With Diabetes or Prediabetes: A Consensus Report — American Diabetes Association
from r/ketoscience

lutzlover:

“Eating patterns that replace certain carbohydrate foods with those higher in total fat, however, have demonstrated greater improvements in glycemia and certain CVD risk factors (serum HDL cholesterol [HDL-C] and triglycerides) compared with lower fat diets.”

Yay! Ack that higher fat isn’t deadly.

“The body makes enough cholesterol for physiological and structural functions such that people do not need to obtain cholesterol through foods. Although the DGA concluded that available evidence does not support the recommendation to limit dietary cholesterol for the general population, exact recommendations for dietary cholesterol for other populations, such as people with diabetes, are not as clear (8). Whereas cholesterol intake has correlated with serum cholesterol levels, it has not correlated well with CVD events (65,66). More research is needed regarding the relationship among dietary cholesterol, blood cholesterol, and CVD events in people with diabetes.”

Or, in layman’s language: While the data doesn’t support vilifying cholesterol as causing heart attacks, we’re going to keep on searching in hopes we find the answer we want.

dem0n0cracy:

Are protein needs different for people with diabetes and kidney disease?

“Historically, low-protein eating plans were advised to reduce albuminuria and progression of chronic kidney disease in people with DKD, typically with improvements in albuminuria but no clear effect on estimated glomerular filtration rate. In addition, there is some indication that a low-protein eating plan may lead to malnutrition in individuals with DKD (317–321). The average daily level of protein intake for people with diabetes without kidney disease is typically 1–1.5 g/kg body weight/day or 15–20% of total calories (45,146). Evidence does not suggest that people with DKD need to restrict protein intake to less than the average protein intake.”

dem0n0cracy:

“The amount of carbohydrate intake required for optimal health in humans is unknown. Although the recommended dietary allowance for carbohydrate for adults without diabetes (19 years and older) is 130 g/day and is determined in part by the brain’s requirement for glucose, this energy requirement can be fulfilled by the body’s metabolic processes, which include glycogenolysis, gluconeogenesis (via metabolism of the glycerol component of fat or gluconeogenic amino acids in protein), and/or ketogenesis in the setting of very low dietary carbohydrate intake (49).”

dem0n0cracy:

“Low-carbohydrate (110–112): Emphasizes vegetables low in carbohydrate (such as salad greens, broccoli, cauliflower, cucumber, cabbage, and others); fat from animal foods, oils, butter, and avocado; and protein in the form of meat, poultry, fish, shellfish, eggs, cheese, nuts, and seeds. Some plans include fruit (e.g., berries) and a greater array of nonstarchy vegetables. Avoids starchy and sugary foods such as pasta, rice, potatoes, bread, and sweets. There is no consistent definition of “low” carbohydrate. In this review, a low-carbohydrate eating pattern is defined as reducing carbohydrates to 26–45% of total calories. Reported benefits: A1C reduction, weight loss, lowered blood pressure, increased HDL-C and lowered triglycerides.

“Very low-carbohydrate (VLC) (110–112): Similar to low-carbohydrate pattern but further limits carbohydrate-containing foods, and meals typically derive more than half of calories from fat. Often has a goal of 20–50 g of nonfiber carbohydrate per day to induce nutritional ketosis. In this review a VLC eating pattern is defined as reducing carbohydrate to <26% of total calories. Reported benefits: A1C reduction, weight loss, lowered blood pressure, increased HDL-C and lowered triglycerides”
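Just to make those percentage cutoffs concrete, here is the back-of-the-envelope arithmetic (my own illustration, not from the report, assuming the standard 4 kcal per gram of carbohydrate and an illustrative 2,000 kcal/day intake):

$$\text{carb grams/day} = \frac{\text{daily kcal} \times \text{carb fraction}}{4\ \text{kcal/g}}$$

$$\frac{2{,}000 \times 0.45}{4} = 225\ \text{g/day}, \qquad \frac{2{,}000 \times 0.26}{4} = 130\ \text{g/day}$$

So at 2,000 kcal/day, the 26% cutoff for “very low carbohydrate” lands on the same 130 g/day figure quoted earlier as the recommended dietary allowance, while the 20–50 g ketosis target works out to only about 4–10% of calories.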

dem0n0cracy:

Low-Carbohydrate or Very Low Carbohydrate Eating Patterns

“Low-carbohydrate eating patterns, especially very low-carbohydrate (VLC) eating patterns, have been shown to reduce A1C and the need for antihyperglycemic medications. These eating patterns are among the most studied eating patterns for type 2 diabetes. One meta-analysis of RCTs that compared low-carbohydrate eating patterns (defined as ≤45% of calories from carbohydrate) to high-carbohydrate eating patterns (defined as >45% of calories from carbohydrate) found that A1C benefits were more pronounced in the VLC interventions (where <26% of calories came from carbohydrate) at 3 and 6 months but not at 12 and 24 months (110).

“Another meta-analysis of RCTs compared a low-carbohydrate eating pattern (defined as <40% of calories from carbohydrate) to a low-fat eating pattern (defined as <30% of calories from fat). In trials up to 6 months long, the low-carbohydrate eating pattern improved A1C more, and in trials of varying lengths, lowered triglycerides, raised HDL-C, lowered blood pressure, and resulted in greater reductions in diabetes medication (111). Finally, in another meta-analysis comparing low-carbohydrate to high-carbohydrate eating patterns, the larger the carbohydrate restriction, the greater the reduction in A1C, though A1C was similar at durations of 1 year and longer for both eating patterns (112). Table 4 provides a quick reference conversion of percentage of calories from carbohydrate to grams of carbohydrate based on number of calories consumed per day.

“Because of theoretical concerns regarding use of VLC eating plans in people with chronic kidney disease, disordered eating patterns, and women who are pregnant, further research is needed before recommendations can be made for these subgroups. Adopting a VLC eating plan can cause diuresis and swiftly reduce blood glucose; therefore, consultation with a knowledgeable practitioner at the onset is necessary to prevent dehydration and reduce insulin and hypoglycemic medications to prevent hypoglycemia.

“No randomized trials were found in people with type 2 diabetes that varied the saturated fat content of the low- or very low-carbohydrate eating patterns to examine effects on glycemia, CVD risk factors, or clinical events. Most of the trials using a carbohydrate-restricted eating pattern did not restrict saturated fat; from the current evidence, this eating pattern does not appear to increase overall cardiovascular risk, but long-term studies with clinical event outcomes are needed (113–117).”

dem0n0cracy:

What is the evidence to support specific eating patterns in the management of type 1 diabetes?

“For adults with type 1 diabetes, no trials met the inclusion criteria for this Consensus Report related to Mediterranean-style, vegetarian or vegan, low-fat, low-carbohydrate, DASH, paleo, Ornish, or Pritikin eating patterns. We found limited evidence about the safety and/or effects of fasting on type 1 diabetes (129). A few studies have examined the impact of a VLC eating pattern for adults with type 1 diabetes. One randomized crossover trial with 10 participants examined a VLC eating pattern aiming for 47 g carbohydrate per day without a focus on calorie restriction compared with a higher carbohydrate eating pattern aiming for 225 g carbohydrate per day for 1 week each. Participants following the VLC eating pattern had less glycemic variability, spent more time in euglycemia and less time in hypoglycemia, and required less insulin (130). A single-arm 48-person trial of a VLC eating pattern aimed at a goal of 75 g of carbohydrate or less per day found that weight, A1C, and triglycerides were reduced and HDL-C increased after 3 months, and after 4 years A1C was still lower and HDL-C was still higher than at baseline (131). This evidence suggests that a VLC eating pattern may have potential benefits for adults with type 1 diabetes, but clinical trials of sufficient size and duration are needed to confirm prior findings.”

* * *

5/14/20 – The official changes in support of low-carb diets continue to spread among major health institutions around the world. A new consensus is slowly being established about the health benefits of varying degrees of carbohydrate restriction. Canada is the most recent country to follow this trend. Below is a recent article describing this shift in mainstream opinion among experts and officials, specifically regarding our northern neighbor:

Diabetes Canada Deems Low Carb and Very Low Carb Diet Safe and Effective
by Joy Kiddie

Diabetes Canada has just released a new Position Statement acknowledging that a low carb and very low carb (keto) diet is both safe and effective for adults with diabetes.

Reflecting back on their 2018 Clinical Practice Guidelines for the Prevention and Management of Diabetes in Canada, released in April 2018 and covered in this article, Diabetes Canada clarified in today’s Position Statement that it was not their intention to restrict the choice of individuals with diabetes to follow dietary patterns with carbohydrate intake below the consensus recommendation of 45-60% energy as carbohydrate, nor to discourage health-care practitioners from providing low-carb dietary support to individuals who wanted to follow a low-carb meal pattern.

In the new Position Statement, Diabetes Canada acknowledged what I’ve written about previously, that Diabetes Australia, Diabetes UK, and the American Diabetes Association (ADA) in conjunction with the European Association for the Study of Diabetes (EASD) have developed position statements and recommendations regarding the use of low carbohydrate and very low carbohydrate (ketogenic) diets for people with diabetes. They state that from these previous international position statements and recommendations, several consistent themes have emerged — specifically that low carbohydrate diets (defined as <130 g of carbohydrate per day or <45% energy as carbohydrate) and very low carbohydrate diets (defined as <50 g of carbohydrate per day) can be safe and effective both in managing weight and in lowering glycated hemoglobin (HbA1C) in people with type 2 diabetes over the short term (<3 months).

Sweeteners Can Mess You Up!

Sugar, in large doses, is a harmful substance, and many people have warned against it going back to the 1800s. The strongest case was made by the physiologist, nutritionist, and professor John Yudkin in 1972 with his book Pure, White and Deadly. For speaking this truth, Yudkin had his reputation destroyed by Ancel Keys. More recently, the science journalist Gary Taubes brought the topic back to public attention with his own book, The Case Against Sugar.

Yudkin has been vindicated: Keys’ original research blaming saturated fat for heart disease has since been re-analyzed, showing that sugar is the stronger correlate. We also now understand the science of why that is true. But it isn’t only about sugar. The sweet taste, whether from sugar or non-nutritive sweeteners, still causes many of the same problems. All sweeteners affect insulin, the gut microbiome, cell functioning, neurocognition, mood, and much else. For example, consumption of both sugar and aspartame is associated with depression, at least in one randomized controlled trial (G. N. Lindseth, Neurobehavioral Effects of Aspartame Consumption).

Sweeteners do alter serotonin levels, if not dopamine in every case (some sweeteners affect dopamine and some don’t). This is observable in how people addicted to sugar so easily shift to non-sugar sweeteners and then act in the same addicted way, finding it hard to imagine giving them up. One way or another, addictive pathways seem to be elicited. The brain isn’t fooled, and so the body will still hunger for the sugar it thinks it is getting from the sweet taste.

Exchanging one addictive substance, sugar, for another, a non-sugar sweetener, is not a benefit. The problem is the addiction itself. A diet high in carbs and sugars is addictive, and throwing in some other kinds of sweeteners doesn’t change this. The best option is to break the addictive cycle entirely by going low-carb or, better yet, ketogenic. And there is no evidence that artificial sweeteners are compatible with a ketogenic diet; they might knock the body out of ketosis the way sugar does. To be certain, just eliminate all sweeteners and kill the problem at its root.

On the other hand, that can be easier said than done. I know sugar addiction, as in I was a sugar fiend from childhood to my thirties. And I did for years increase my use of other sweeteners, in an attempt to break free from the hold sugar had over my brain and mood. This wasn’t a particularly successful strategy. And my health was not improved, as the non-sugar sweeteners maintained my high-carb cravings.

I simply had to cut them out strictly. This simple truth is reinforced every time I slowly increase sweeteners in my diet and the cravings creep back in. I just don’t feel good with them. The lesson has been fully learned at this point.

It was amazing what a difference it made once my sweet tooth went away. Only then did my physical health improve, and my psychological health soon followed. I can’t emphasize this enough. Carbs, sugars, and other sweeteners will seriously mess you up over time. You might not notice it for decades, but it all catches up with you. The damage is being done, even if you don’t notice it slowly accumulating. And realize the consequences won’t be limited to the sickliness of obesity, diabetes, and heart disease; neurocognitive and mental health can also decline (e.g., Alzheimer’s is now called type 3 diabetes by some).

An occasional sweet at a birthday party or holiday gathering is one thing. Maybe you can have that and immediately go back on a healthy low-carb diet. Maybe or maybe not. If you were ever a sugar addict, as most Americans are, you are tempting fate. It’s like a recovering alcoholic taking that first sip of whiskey, vodka, or beer; like the recovering druggie getting that first shot of heroin or puff of the crack pipe.

Sugar is a drug, as research shows: it elicits the same reward pathway in the brain as other drugs, and all sweeteners can elicit the same or similar pathways. You’ll hunger for more. And even if other sweeteners don’t have all of the problems of sugar, they still have plenty of potential problems that could do serious harm to your health over time.

* * *

Sucralose Promotes Food Intake through NPY and a Neuronal Fasting Response
by Qiao-Ping Wang et al

The truth about artificial sweeteners – Are they good for diabetics?
by Vikas Purohit and Sundeep Mishra

Consuming low-calorie sweeteners may predispose overweight individuals to diabetes
by Jenni Glenn Gingery and Colleen Williams

Not So Sweet: Metabolic Problems Caused by Artificial Sweeteners
by Serena Cho

Artificial Sweeteners Impact Metabolic Health Even on Cellular Level
by Kristen Monaco

Artificial Sweeteners Could be Sabotaging Your Microbiome, Says Study
by Amel Ahmed

Effects of Sweeteners on the Gut Microbiota: A Review of Experimental Studies and Clinical Trials
by Francisco Javier Ruiz-Ojeda et al

Sugar Substitutes or Sugar: What’s Better for Diabetes?
by Kathleen Doheny

Artificial sweeteners linked to diabetes and obesity
by James Brown and Alex Conner

Artificial Sweeteners: Agents of Insulin Resistance, Obesity and Disease
by Loren Cordain

The Unbiased Truth About Artificial Sweeteners
by Chris Kresser

How Artificial Sweeteners Wreak Havoc on Your Gut
by Chris Kresser

Artificial Sweeteners Can Lead to Diabetes In Overweight People
by Gundry MD Team

Artificial Sweeteners Could Be Ruining Your Gut Health
by Gundry MD Team

Are Artificial Sweeteners Safe for the Brain and Gut
by Siim Land

Artificial Sweeteners Don’t Fool Your Brain
by Joseph Mercola

Tricking Taste Buds but Not the Brain: Artificial Sweeteners Change Brain’s Pleasure Response to Sweet
by Caitlin Kirkwood

Artificial Sweeteners: Why You Should Completely Avoid Them to Stay Healthy
by Elizabeth Lyden

Aspartame: 11 Dangers of This All-Too-Common Food Additive
by Rebekah Edwards

Aspartame Side Effects: Recent Research Confirms Reasons for Concern
by University Health News Staff

The Effects of Aspartame on Fibromyalgia and Chronic Fatigue Syndrome
by Adrienne Dellwo

Are Artificial Sweeteners Damaging Your Blood Vessels?
by Michelle Schoffro Cook

Direct and indirect cellular effects of aspartame on the brain
by P. Humphries, E. Pretorius, and H. Naudé

Effects of repeated doses of aspartame on serotonin and its metabolite in various regions of the mouse brain.
by R. P. Sharma and R. A. Coulombe Jr.

Neurophysiological symptoms and aspartame: What is the connection?
by Arbind Kumar Choudhary and Yeong Yeh Lee

The debate over neurotransmitter interaction in aspartame usage
by Arbind Kumar Choudhary and Yeong Yeh Lee

The Connection between Aspartame (Artificial Sweetener) and Panic Attacks, Depression, Bipolar Disorder, Memory Problems, and Other Mental Symptoms
by Betty Martini

Side-Effects of Aspartame on the Brain
by Michael Greger

Migraine Triggers: Artificial Sweeteners
by Jeremy Orozco

Intense Sweetness Surpasses Cocaine Reward
by Magalie Lenoir, Fuschia Serre, Lauriane Cantin, and Serge H. Ahmed

Diet Soft Drinks Linked to Depression
by Naveed Saleh

Why is Diet Soda Addictive?
by Edward Group

Neurobiology of Addiction
by George F. Koob, Michel Le Moal
p. 448

Accumulating evidence also suggests that dopamine is not required for nondrug reward. In a study in which dopamine release in the nucleus accumbens core and shell was measured with precise voltammetric techniques during self-stimulation, it was shown that if dopamine activation is a necessary condition for brain stimulation reward, evoked dopamine release is actually not observed during brain stimulation reward and is even diminished (Garris et al., 1999). Also, mice completely lacking tyrosine hydroxylase, such that they cannot make dopamine, demonstrated the ability to learn to consume sweet solutions and showed a preference for sucrose and saccharin. Dopamine was not required for animals to find sweet tastes of sucrose or saccharin rewarding (Cannon and Palmiter, 2003; Cannon and Bseikri, 2004).

* * *

I wrote this post as a response to a video, Keto & Beverages, by the LCHF advocate Dr. Westman.

Below is an amusing and irritating (and, sadly, all too common) dialogue, or rather miscommunication, I had with Dr. Eric Westman or someone writing on his behalf at his YouTube channel, Adapt Your Life. The most frustrating part is that I’m mostly in agreement with Dr. Westman, as I too adhere to LCHF. For that reason, I don’t want to be harshly critical, nor do I want to be polarized into a stronger position than I actually support, but I must admit that my emotional response was a bit negative.

To stand back from the disagreement, I don’t even have a strong opinion on what others do in terms of sweeteners. I don’t recommend sweeteners, sugar or otherwise, based on personal experience. But if artificial sweeteners help some people to transition to a healthier diet (or if they believe this to be true, and I won’t dismiss the placebo effect), then more power to them. Anyway, here is the interaction that rubbed me the wrong way:

Me: “Wasn’t there a recent study that showed even artificial sweeteners can lead to diabetes? The body still responds to the sweet taste as if it were sugar and over time messes with insulin sensitivity.”

Other person: “Hi Ben Steele, I’m not sure we have seen this study.”

Me: “I decided to write a post about it. I found the info I was looking for. At the end of the post, I share links to it and other info about the problems with non-sugar sweeteners.

“As a recovering sugar addict who followed that up with addiction to other sweeteners, I personally would recommend against such substances. But each individual has to decide for themselves.”

[I then linked to this post.]

Other person: “Hi Ben Steele, we opened a couple and didn’t find an actual “study”. Just peoples opinion and thoughts. We would be interested to read a study, preferably a RCT as this is the gold standard of studies. Thanks for this 😊”

Me: “Multiple links were to “studies”. The first two links are papers on studies, the third is the press release of a study, the fourth is a Yale report about a Yale study, five more links further down are papers on studies, and the last is a quote from an academic book from a university press. So, about a third of them linked to studies and academic material. As for all the rest, they directly reference and discuss numerous other studies.

“If you are interested to read a study, then do so. But if not, then don’t. I can’t force you to read anything.”

To further respond, I’m not sure how much of the research consists of randomized controlled trials. But after doing some casual research that could as easily have been done by Dr. Westman or his staff, I found info on RCT research. I noticed it briefly mentioned in a few links above, though I didn’t check all the links. I did find RCTs on the topic elsewhere in doing a web search.

I still find it irritating, though. It feels hypocritical. Dr. Westman or his representative was acting with condescension, intentional or not.

Why is this person demanding RCTs of me when they don’t hold themselves to this same standard? Dr. Westman offered no RCTs to back up his recommendations. And these artificial sweeteners were approved by the FDA without any RCTs proving their safety. So, why is it that critics have to prove they are unsafe? Why would we allow invented chemicals to be put on the market and then have doctors recommend them to patients without knowing their long-term safety or side effects?

Just to prove my point, I will share some RCT evidence (see below). And to be fair, I will admit that the results are mixed and, one might argue, inconclusive — consider aspartame, which has been researched more fully (see: Travis Saunders, Aspartame: Cause of or Solution to Obesity?; and Michael Joseph, Is Aspartame Safe Or Is It Bad For You?). But based on familiarity with the available research, and absent more and better research, no sane and reasonable person would give artificial sweeteners a clean bill of health and proclaim them safe for general mass consumption, especially on a regular basis as a preferred flavoring. Whether or not artificial sweeteners cause weight gain, that might be the least of our worries, given the potential side effects seen in some of the studies. Some precautionary principle is in order.

Still, yes, it is hard to state a strong opinion on the present evidence, beyond a note of caution. But every individual is free to dismiss such caution and use artificial sweeteners anyway. They might or might not be helpful in losing weight, and even if they are, long-term use might lead to detrimental outcomes in other areas of health. Maybe that risk seems worthwhile, assuming short-term weight loss is all that concerns you, and assuming that short-term use won’t lead to long-term use and won’t sabotage a long-term healthy diet. Individuals should make the decision with eyes wide open, knowing that the potential risks could be quite serious.

I understand. There are also potential benefits. Those addicted to sugar are dealing with a highly destructive substance, and artificial sweeteners may seem like the only choice. And who am I to judge? That is what I did. While transitioning off sugar, I spent a number of years consuming my fair share of laboratory-invented sweeteners. It did get me off sugar, but all that happened was that I became addicted to these other sweeteners instead. They maintained my sweet tooth and so encouraged me to continue eating a diet high in carbs and sugar. There was no obvious benefit. It did eventually lead me to give up all sweeteners. I just don’t know whether the artificial sweeteners were more of a help or a hindrance in that process.

Whatever your decision, know that these substances aren’t without risks. It is dietary Russian roulette. Maybe there will be no serious harm and maybe there will. We shall all find out decades from now when the children and young adults raised on these chemicals reach older age. Here is my perspective. Why take any risk at all when it is completely unnecessary? We already know how to stop sugar cravings in their tracks. With a ketogenic diet, you won’t need to exchange addiction to sugar with an addiction to artificial sweeteners. It’s the simplest solution and, for such people, the only solution with a guaranteed net positive health outcome.

As a quick note, I’d point out a few things about the research. First, all sweeteners affect the body and so aren’t neutral substances, but it is unknown if the effects are a net benefit or net loss. Also, different sweeteners have different effects and the reason for this is not entirely understood. There are still other concerns.

The worst effects in animal studies were seen with high doses, which makes one wonder about the effect of artificial sweeteners combined with the effects of the numerous other chemicals, additives, toxins, and pollutants, along with the other physiological and environmental stressors, that most people are exposed to; the interaction of multiple factors is an area that mostly remains unexplored and, more importantly, is rarely controlled for. And as far as I know, no study has ever examined various sweeteners in relation to low-carb, zero-carb, and ketogenic diets. Most of the studies that have been done used subjects on the severely unhealthy standard American diet, and so maybe for those people an artificial sweetener is better than the alternative of a high-sugar diet.

Basically, we are in a state of far greater ignorance than knowledge. It’s anyone’s guess. As always, you are taking your life into your own hands. Whatever a journalist, doctor, or health expert may say, it is in the end your life that is at stake, not theirs.

* * *

Gain weight by “going diet?” Artificial sweeteners and the neurobiology of sugar cravings
by Qing Yang

[C]onsensus from interventional studies suggests that artificial sweeteners do not help reduce weight when used alone [2,25]. BMI did not decrease after 25 weeks of substituting diet beverages for sugar-sweetened beverages in 103 adolescents in a randomized controlled trial, except among the heaviest participants [26]. A double-blind study subjected 55 overweight youth to 13 weeks of a 1,000 Kcal diet accompanied by daily capsules of aspartame or lactose placebo. Both groups lost weight, and the difference was not significant. Weight loss was attributed to caloric restriction [27]. Similar results were reported for a 12-week, 1,500 Kcal program using either regular or diet soda [28]. Interestingly, when sugar was covertly switched to aspartame in a metabolic ward, a 25 percent immediate reduction in energy intake was achieved [29]. Conversely, knowingly ingesting aspartame was associated with increased overall energy intake, suggesting overcompensation for the expected caloric reduction [30]. Vigilant monitoring, caloric restriction, and exercise were likely involved in the weight loss seen in multidisciplinary programs that included artificial sweeteners [31,32].

Nonnutritive sweeteners and cardiometabolic health: a systematic review and meta-analysis of randomized controlled trials and prospective cohort studies
by Meghan B. Azad, Ahmed M. Abou-Setta, Bhupendrasinh F. Chauhan, Rasheda Rabbani, Justin Lys, Leslie Copstein, Amrinder Mann, Maya M. Jeyaraman, Ashleigh E. Reid, Michelle Fiander, Dylan S. MacKay, Jon McGavock, Brandy Wicklow, and Ryan Zarychanski

Evidence from small RCTs with short follow-up (median 6 mo) suggests that consumption of nonnutritive sweeteners is not consistently associated with decreases in body weight, BMI or waist circumference. However, in larger prospective cohort studies with longer follow-up periods (median 10 yr), intake of nonnutritive sweeteners is significantly associated with modest long-term increases in each of these measures. Cohort studies further suggest that consumption of nonnutritive sweeteners is associated with higher risks of obesity, hypertension, metabolic syndrome, type 2 diabetes, stroke and cardiovascular disease events; however, publication bias was indicated for type 2 diabetes, and there are no data available from RCTs to confirm these observations.

Previous reviews (12,65) concluded that, although data from RCTs support weight-loss effects from sustained nonnutritive sweetener interventions, observational studies provide inconsistent results. Building on these findings, we included new studies (14–24) and found that consumption of nonnutritive sweeteners was not generally associated with weight loss among participants in RCTs, except in long-term (≥ 12 mo) trials with industry sponsorship. In addition, we found that consumption of nonnutritive sweeteners was associated with modest long-term weight gain in observational studies. Our results also extend previous meta-analyses that showed higher risks of type 2 diabetes (32,33) and hypertension (66) with regular consumption of nonnutritive sweeteners.

Sucralose decreases insulin sensitivity in healthy subjects: a randomized controlled trial
by Alonso Romo-Romo, Carlos A. Aguilar-Salinas, Griselda X. Brito-Córdova, Rita A. Gómez-Díaz, and Paloma Almeda-Valdes

Design
We performed a randomized controlled trial involving healthy subjects without comorbidities and with a low habitual consumption of nonnutritive sweeteners (n = 33/group). […]

Results
Individuals assigned to sucralose consumption showed a significant decrease in insulin sensitivity, with a median (IQR) percentage change of −17.7% (−29.3% to −1.0%), in comparison to −2.8% (−30.7% to 40.6%) in the control group (P = 0.04). An increased acute insulin response to glucose, from 577 mU · L⁻¹ · min (350–1040 mU · L⁻¹ · min) to 671 mU · L⁻¹ · min (376–1010 mU · L⁻¹ · min) (P = 0.04), was observed in the sucralose group for participants with adequate adherence.

Conclusions
Sucralose may have effects on glucose metabolism, and our study complements findings previously reported in other trials. Further studies are needed to confirm the decrease in insulin sensitivity and to explore the mechanisms for these metabolic alterations.

Neurobehavioral Effects of Aspartame Consumption
by Glenda N. Lindseth, Sonya E. Coolahan, Thomas V. Petros, and Paul D. Lindseth

Despite its widespread use, the artificial sweetener aspartame remains one of the most controversial food additives, due to mixed evidence on its neurobehavioral effects. Healthy adults who consumed a study-prepared high-aspartame diet (25 mg/kg body weight/day) for 8 days and a low-aspartame diet (10 mg/kg body weight/day) for 8 days, with a 2-week washout between the diets, were examined for within-subject differences in cognition, depression, mood, and headache. Measures included weight of foods consumed containing aspartame, mood and depression scales, and cognitive tests for working memory and spatial orientation. When consuming high-aspartame diets, participants had more irritable mood, exhibited more depression, and performed worse on spatial orientation tests. Aspartame consumption did not influence working memory. Given that the higher intake level tested here was well below the maximum acceptable daily intake level of 40–50 mg/kg body weight/day, careful consideration is warranted when consuming food products that may affect neurobehavioral health.
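To put those doses in perspective, here is the simple arithmetic on the figures quoted above, assuming a hypothetical 70 kg adult (the body weight is my illustrative assumption, not from the study):

$$\text{high-aspartame diet:}\ 25\ \text{mg/kg} \times 70\ \text{kg} = 1{,}750\ \text{mg/day}$$

$$\text{acceptable daily intake:}\ 40\text{–}50\ \text{mg/kg} \times 70\ \text{kg} = 2{,}800\text{–}3{,}500\ \text{mg/day}$$

In other words, the dose at which the study observed mood and cognition effects was only about half to two-thirds of the official acceptable daily intake.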

Artificial Sweeteners: A Systematic Review and Primer for Gastroenterologists
by Marisa Spencer, Amit Gupta, Lauren Van Dam, Carol Shannon, Stacy Menees, and William D Chey

Artificial sweeteners (AS) are ubiquitous in food and beverage products, yet little is known about their effects on the gastrointestinal (GI) tract, and whether they play a role in the development of GI symptoms, especially in patients with irritable bowel syndrome. Utilizing the PubMed and Embase databases, we conducted a search for articles on individual AS and each of these terms: fermentation, absorption, and GI tract. Standard protocols for a systematic review were followed. At the end of our search, we found a total of 617 eligible papers, 26 of which were included. Overall, there is limited medical literature available on this topic. The 2 main areas on which there is data to suggest that AS affect the GI tract include motility and the gut microbiome, though human data is lacking, and most of the currently available data is derived from in vivo studies. The effect on motility is mainly indirect via increased incretin secretion, though the clinical relevance of this finding is unknown as the downstream effect on motility was not studied. The specific effects of AS on the microbiome have been conflicting and the available studies have been heterogeneous in terms of the population studied and both the AS and doses evaluated. Further research is needed to assess whether AS could be a potential cause of GI symptoms. This is especially pertinent in patients with irritable bowel syndrome, a population in whom dietary interventions are routinely utilized as a management strategy.

Association between intake of non-sugar sweeteners and health outcomes: systematic review and meta-analyses of randomised and non-randomised controlled trials and observational studies
by Ingrid Toews, Szimonetta Lohner, Daniela Küllenberg de Gaudry, Harriet Sommer, Joerg J Meerpohl

In one randomised controlled trial,85 total cholesterol concentration decreased strongly in sucrose groups but increased in the aspartame group (mean difference 0.44 mmol/L, 95% confidence interval 0.33 to 0.56; n=45). […]

In one crossover non-randomised controlled trial,83 researchers found a significantly higher increase in blood glucose in children of preschool age receiving aspartame compared with sucrose (mean difference 0.24 mmol/L, 95% confidence interval 0.09 to 0.39; n=25), a significantly higher increase in blood glucose in children of school age receiving saccharin compared with sucrose (0.65 mmol/L, 0.44 to 0.86; n=23), and a significantly lower increase in blood glucose in children of preschool age receiving aspartame compared with saccharin (−0.75 mmol/L, −0.95 to −0.64; n=23, very low certainty of evidence). In overweight children involved in active weight loss, blood glucose decreased less strongly in those receiving NSSs compared with those not receiving NSSs (0.3 mmol/L, 0.2 to 0.4; n=49, very low certainty of evidence).

Systematic review of the relationship between artificial sweetener consumption and cancer in humans: analysis of 599,741 participants
by A. Mishra, K. Ahmed, S. Froghi, and P. Dasgupta

The statistical value of this review is limited by the heterogeneity and observational designs of the included studies. Although there is limited evidence to suggest that heavy consumption may increase the risk of certain cancers, overall the data presented are inconclusive as to any relationship between artificial sweeteners and cancer.

Evidence suggesting artificial sweeteners may be harmful should give us pause
by Leslie Beck

The study, a randomized controlled trial, investigated the effect of daily sucralose consumption on insulin sensitivity in 66 healthy, normal-weight adults who didn’t regularly use artificial sweeteners. […]

This finding is provocative because it suggests that regular consumption of sucralose can lead to insulin resistance in healthy, normal-weight people.

Sucralose may affect blood sugar control by activating sweet taste receptors in the gut, triggering the release of insulin. Artificial sweeteners are also thought to disrupt the balance of good gut bacteria in a direction that can lead to insulin resistance and weight gain.

Did America Get Fat by Drinking Diet Soda?
by Daniel Engber

Perhaps more to the point, researchers have tested the effects of diet soda on people trying to lose weight, and gotten positive results. A randomized, controlled trial published in May compared the efficacy of artificially sweetened beverages and water in a 12-week weight-loss program. Both treatment groups ended up with smaller waists, and the people taking diet drinks appeared to lose more weight. That study’s lead authors are consultants for Coca-Cola, so perhaps we shouldn’t take this as the final word. But another randomized trial from 2012, this one funded by a bottled-water company, came to a similar conclusion. When overweight and obese adults switched to diet beverages or water for a six-month stretch, both groups shed 1 inch of girth, on average, and 5 pounds.

Health outcomes of non-nutritive sweeteners: analysis of the research landscape
by Szimonetta Lohner, Ingrid Toews, and Joerg J. Meerpohl

Finally, we included 372 studies in our scoping review, comprising 15 systematic reviews, 155 randomized controlled trials (RCTs), 23 non-randomized controlled trials, 57 cohort studies, 52 case-control studies, 28 cross sectional studies and 42 case series/case reports.

In healthy subjects, appetite and short term food intake, risk of cancer, risk of diabetes, risk of dental caries, weight gain and risk of obesity are the most investigated health outcomes. Overall there is no conclusive evidence for beneficial and harmful effects on those outcomes. Numerous health outcomes including headaches, depression, behavioral and cognitive effects, neurological effects, risk of preterm delivery, cardiovascular effects or risk of chronic kidney disease were investigated in fewer studies and further research is needed. In subjects with diabetes and hypertension, the evidence regarding health outcomes of NNS use is also inconsistent.

Early-Life Exposure to Non-Nutritive Sweeteners and the Developmental Origins of Childhood Obesity: Global Evidence from Human and Rodent Studies
by Alyssa J. Archibald, Vernon W. Dolinsky, and Meghan B. Azad

Non-nutritive sweeteners (NNS) are increasingly consumed by children and pregnant women around the world, yet their long-term health impact is unclear. Here, we review an emerging body of evidence suggesting that early-life exposure to NNS may adversely affect body composition and cardio-metabolic health. Some observational studies suggest that children consuming NNS are at increased risk for obesity-related outcomes; however, others find no association or provide evidence of confounding. Fewer studies have examined prenatal NNS exposure, with mixed results from different analytical approaches. There is a paucity of RCTs evaluating NNS in children, yielding inconsistent results that can be difficult to interpret due to study design limitations (e.g., choice of comparator, multifaceted interventions). The majority of this research has been conducted in high-income countries. Some rodent studies demonstrate adverse metabolic effects from NNS, but most have used extreme doses that are not relevant to humans, and few have distinguished prenatal from postnatal exposure. Most studies focus on synthetic NNS in beverages, with few examining plant-derived NNS or NNS in foods. Overall, there is limited and inconsistent evidence regarding the impact of early-life NNS exposure on the developmental programming of obesity and cardio-metabolic health. Further research and mechanistic studies are needed to elucidate these effects and inform dietary recommendations for expectant mothers and children worldwide.

Noncaloric Sweeteners in Children: A Controversial Theme
by Samuel Durán Agüero, Lissé Angarita Dávila, Ma. Cristina Escobar Contreras, Diana Rojas Gómez, and Jorge de Assis Costa

On the other hand, three transversal studies, including 385 and 3311 children, showed positive association between the intakes of NCS and BMI [53]. Similar results were obtained with pregnant woman who ingested NCS, showing more probability of having babies with increased risk for later obesity or overweight. However, the limitation of the studies is that these were of observational type, and the findings do not necessarily imply a significant correlation between the intake of artificial sweeteners and weight gain [54, 55]. In a meta-analysis of intake of NCS that included 11.774 citations, 7 trials, 1003 participants, and 30 cohort studies (adults and adolescents) it was concluded that there is not enough evidence from randomized controlled trials to demonstrate the positive effect of NCS on controlling body weight. Findings of observational studies suggest that the continuous ingestion of NCS could be associated with BMI and cardiometabolic risk increase [56].

Diet Soda May Alter Our Gut Microbes And Raise The Risk Of Diabetes
by Allison Aubrey

While the findings are preliminary, the paper could begin to explain why studies of diet soda point in opposite directions.

“All of us have a microbiome” made up of trillions of organisms. “[It’s] extremely complex. Everybody’s microbiome is a little different,” Blaser says.

And the ways our microbiomes respond to what we eat can vary, too.

In the study, the Israeli researchers find that as mice and people started consuming artificial sweeteners, some types of bacteria got pushed out, and other types of bacteria began to proliferate.

It could be that for some people who responded negatively to the artificial sweetener, the bacteria that got crowded out were helping to keep glucose in check.

How it’s happening isn’t clear, and Blaser says a lot more research is needed.

“So that’s the next step,” Blaser says. “Firstly, for [researchers] to confirm this, to see if it’s really true.” And the next challenge is to understand the mechanism. “How does the change in the microbial composition — how is it causing this?”

Lots of researchers agree they’d like to see a large-scale study.

“It’s much too early, on the basis of this one study, [to conclude that] artificial sweeteners have negative impacts on humans’ [risk for diabetes],” says James Hill, director of the Center for Human Nutrition at the University of Colorado.

He points to a randomized controlled trial published in 2011 that found artificial sweeteners helped to limit the rise in blood sugar in a group of slightly overweight people, compared with sugar.

Hill also points to a study of people on the National Weight Control Registry that found successful long-term dieters tend to consume artificially sweetened foods and beverages at a higher rate compared with the general population.

So expect the debate over diet sodas to continue — and also anticipate hearing more about the role of our microbiomes.

Study links artificial sweeteners and weight gain
by CTVNews.ca Staff

Azad said what her team was most struck by was the lack of good, rigorous studies on artificial sweeteners.

“Surprisingly, given how common these products are, not many studies have looked at the long-term impact of their consumption,” Azad told CTV News Channel from Lisbon, Portugal.

She noted that only seven of the 37 studies they reviewed were randomized controlled trials (RCTs), and all were relatively short, following participants for a median period of only six months.

The other 30 studies were longer and followed the participants for an average of 10 years, but they were observational studies – a form of research that is not as precise as a controlled trial.

“A lot of the studies we found were observational, meaning they could show a link but they can’t prove a cause-and-effect relationship,” she said.

Among the seven RCTs, regular consumption of sweeteners had no significant effect on weight loss. From the other studies, the team found that regular use of sweeteners was associated with an increased risk of type 2 diabetes and high blood pressure, and modest increases in weight and waist circumference.

“What we found was that at the end of the day, from all of this research, there really wasn’t firm evidence of a long-term benefit of artificial sweeteners. And there was some evidence of long-term harm from long-term consumption,” Azad said.

As for why artificial sweeteners seem to be linked to weight gain, not weight loss, Azad says no one knows for sure but there are lots of theories.

One theory is that the sweeteners somehow disrupt healthy gut bacteria. Another theory is that the sweeteners confuse our metabolisms, causing them to overreact to sugary tastes.

It could be that those who regularly use artificial sweeteners over-compensate for the missed calories from sugar, or they could have otherwise unhealthy diets in conjunction with sweetener use.

Azad would like to see a lot more research on the long-term use of sweeteners, in particular studies that could compare the different sweeteners, to see if one is any better than another.

In the meantime, for those trying to cut down on their sugar consumption, Azad says it’s important not to switch from one harmful food item to another.

“I think the takeaway for Canadians at this point is to maybe think twice about whether you really want to be consuming these artificial sweeteners, particularly on an everyday basis,” Azad said, “because really we don’t have evidence to say for sure whether these are truly harmless alternatives to sugar.”

Like water fasts, meat fasts are good for health.

I was on a low-carb paleo diet for about a year with a focus on intermittent fasting and ketosis. Influenced by Dr. Terry Wahls and Dr. Will Cole, both former vegetarians converted to paleo, I included large helpings of vegetables but without the starchy carbs. It was a game-changer for me, as my health improved on all fronts, from weight to mood. But every time my carb and sugar intake would creep up, I could feel the addictive cravings coming back, and so I decided to limit my diet to a greater extent. Zero-carb had already been on my radar, and I then looked more into it. It seemed worth a try.

So, I went carnivore for the past couple of months, mostly as an experiment and not with the idea of it being permanent. It is the best elimination diet ever and it definitely takes low-carb to another level, but I wanted to be able to compare how I felt with plants in my diet. So, a couple of weeks ago, with spring in the air and wild berries on their way, I ended my zero-carb carnivory with a three-day fast and reintroduced some light soup and fermented vegetables. I felt fine. Even after the extended period of low-carb dieting, this zero-carb experiment made me realize how much better I feel when severely restricting carbs and sugar. Now back on a paleo-keto diet, I’m going to keep my focus on animal foods and be more cautious about which plant foods I include and how often.

Dr. Anthony Gustin offers an approach similar to Siim Land’s, as discussed in the first four videos below. A low-carb diet, especially strict carnivore (no dairy, just meat), is an extremely effective way of healing digestive issues and reducing bodily inflammation. The carnivore diet is a low-residue diet because meat and fat get fully digested much earlier in the digestive tract, whereas lots of fiber can clog you up, causing constipation. A similar kind of benefit is seen with the ketogenic diet: microbiome imbalance and overgrowth is improved by initially starving and decreasing the number of microbes, but after some months the microbiome recovers to its original numbers with a healthier balance.

Still, as Gustin and Land argue, it’s good to maintain some variety in the diet for metabolic flexibility. But we must understand that plants stress the system (Steven Gundry, The Plant Paradox): they are inflammatory, unlike most animal foods (though dairy can be problematic for some), and they contain anti-nutrients that can cause deficiencies. There are other problems as well, such as damage from oxalates, as explained by the oxalate expert Sally K. Norton in the fifth and sixth videos; she argues that plants traditionally were eaten only seasonally, not daily, as she discusses in the seventh video (also written up as an academic paper: Lost Seasonality and Overconsumption of Plants: Risking Oxalate Toxicity).

Even so, one might argue that small amounts of stress are good, for what is called hormesis — in the way that working out stresses the body in order to build muscle, whereas constant exertion would harm it; or in the way that being exposed to germs as a child helps the development of a stronger immune system — with a quick explanation by Siim Land in the second video below. Otherwise, by too strictly excluding foods for too long, you might develop sensitivities, which the fourth video is about. As Cookie Monster said about cookies on the Colbert show, vegetables are a sometimes food. Think of plant foods more as medicine, in that the dose matters.

Plant foods are beneficial in small portions on occasion, whereas constantly overloading your body with them never gives your system a rest. Fruits and veggies are good, in moderation. It turns out a “balanced diet” doesn’t mean massive piles of greens for every meal and snacks in between. Grains aren’t the only problematic plant food. Sure, on a healthy diet, you can have periods of time when you eat more plant foods and maybe be entirely vegan on certain days, but also make sure to fast from plant foods entirely every now and then or even for extended periods.

That said, I understand that we’ve been told our entire lives to eat more fruits and veggies. And I’m not interested in trying to prove zero-carb is the best. If you’re afraid that you’ll be unhealthy without a massive load of plant nutrients, then make sure to take care of potential problems with gut health and inflammation. In the eighth video below, a former vegan explains how she had unknowingly been managing her plant-induced inflammation with CBD oil, something she didn’t realize until after stopping its use. She later turned to an animal-based diet and the inflammation was no longer an issue.

But for those who don’t want to go strictly low-carb, much less carnivore, there are many ways to manage one’s health besides anti-inflammatory CBD oil. Be sure to include other anti-inflammatories, such as turmeric (curcumin) combined, for absorption, with black pepper (bioperine). Also, intermittent and extended fasting will be all the more important to offset the plant intake, although everyone should do some fasting, as it is what the human body is designed for. A simple method is a limited eating period, even going so far as one meal a day (OMAD), but any restriction is better than none. Remember that even sleeping at night is a fast, so skipping breakfast or eating later will extend that fast and its benefits, while skipping dinner will start the fasting period earlier, as sketched below.
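Here is a minimal sketch of that meal-timing arithmetic in code, just to make the schedules concrete (the example times are hypothetical illustrations, not recommendations from any source):

```python
# Toy illustration: hours fasted per day for a given eating window.
# All schedule times below are hypothetical examples on a 24-hour clock.

def fasting_hours(first_meal: int, last_meal: int) -> int:
    """Daily hours spent fasting, given the day's first and last meal times."""
    eating_window = last_meal - first_meal
    return 24 - eating_window

print(fasting_hours(12, 20))  # skip breakfast: 16 hours fasted
print(fasting_hours(8, 16))   # skip dinner instead: also 16 hours fasted
print(fasting_hours(17, 18))  # one meal a day (OMAD): 23 hours fasted
```

Either way of shrinking the eating window extends the same overnight fast; OMAD simply pushes it near the daily maximum.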

Even on a vegan or vegetarian diet, one can do a ketogenic diet, which is another way of reducing inflammation and healing the gut. For this approach, I’d suggest reading Dr. Will Cole’s book Ketotarian; also helpful might be some other books, such as Dena Harris’ The Paleo Vegetarian Diet and Mark Hyman’s Food: What the Heck Should I Eat?. Keeping carbs low enough, including during fasts, will put the body into ketosis and eventually autophagy, the latter being how the body heals itself. Carbs, more than anything else, will knock you out of this healthy state, not that you want to be in this state permanently.

Still, I wouldn’t recommend extreme plant-based diets, in particular not the typically high-carb veganism. Even done low-carb, I would still avoid veganism, as it will force you to eat more unhealthy foods like soy and to over-consume omega-6 fatty acids from nuts and seeds, one of the problems discussed in the fourth video. Some vegetarians and vegans will oddly make an exception for seafood; but if you don’t eat seafood at all, be sure to add an algal-source supplement of EPA and DHA, necessary omega-3 fatty acids that are also beneficial for inflammation and general health. If meat, including seafood, is entirely unacceptable, consider at least adding in certain kinds of animal foods, such as pasture-raised eggs and ghee.

If you still have health problems, consider the possibility of going zero-carb. Even a short meat fast might do wonders. As always, self-experimentation is the key. Put your health before dietary ideology. That is to say, don’t take my word for it, nor the word of others. Try it for yourself. If you want to do a comparison, try strict veganism for a period and then follow it with carnivore. And if you really want to emphasize the difference, make the vegan part of the experiment high-carb. I don’t necessarily mean what are considered ‘unhealthy’ carbs; eat plenty of whole wheat bread, rice, corn, and beans. That way you’ll also feel the difference that carbohydrates make. But if you don’t want to do carnivore for the other part of the experiment, at least try a ketogenic diet, which can be done with more plant-based foods, though consider reducing the most problematic plant foods, as Gundry explains.

Of course, you can simply jump right into carnivory and see what happens. Give it a few months or even a year, as it can take a while for your body to heal, and not only in terms of eliminating toxins. What do you have to lose?

* * *

I’ll add a personal note. I’ve long had an experimental attitude toward life. But over the last year, I’ve been quite intentional in my self-experimenting. Mainly, I try something and then observe the results, not that I’m always that systematic about it. Many of the changes I’ve experienced would be hard to miss, even when I’m not paying close attention.

That playing around with dietary parameters is what I’m still doing. My dietary experiments will likely go on for quite a while longer. After a few days of fermented vegetables, I felt fine and there were no symptoms. Then I decided to try a salad of raw vegetables (lettuce, green onions, and radishes), along with fermented vegetables. Now I notice that the inflammation in my wrist has flared up. I’ll take that as my body giving me feedback.

One of the best benefits of zero-carb was how the inflammation went away. My wrists weren’t bothering me at all, and that is a big deal, as there has been irritation for years now with my job as a cashier and all the time I spend on the computer. Inflammation had gone down with low-carb, but it was still noticeable. There was a further decrease with zero-carb, and I’d hate to lose those gains.

As I said, I’m being cautious. The benefits I’ve seen are not slight, and they are far from limited to joint issues; what is going on with my wrists is probably related to the crackling in my knees I experienced earlier last decade, before reducing sugar. A much bigger deal is the neurocognitive angle, since mental health has been such a struggle for decades. Possible inflammation in my brain is a greater concern than inflammation in my wrists, not that the two can be separated, as an inflammatory state can affect any and all parts of the body. I take depression extremely seriously and I’m hyper-aware of shifts in mood and related aspects.

I’ll limit myself to fermented vegetables for the time being and see how that goes.

Having written that, I remembered one other possible offending food. The day before the salad, I had a slice of oat bread. I had asked someone to make me almond bread because of the paleo diet, as I explained to them, but they misunderstood. They apparently thought the paleo diet was only about wheat, and so they got it in their head that oats would be fine. Because they made it for me, I decided to have a slice, as I’m not a dietary Puritan.

So maybe it wasn’t the salad, after all. Still, I think I’ll keep to the fermented veggies for a while. And I’ll keep away from those grains. That was the first time I had any oats in a long while. I’ll have to try oats again sometime in the future to see if I have a similar response. But for now, I’m keeping my diet simple by keeping animal foods at the center of what I eat.

* * *

My own experience with diets makes me understand the attraction of the carnivore diet. It isn’t only the most effective diet for healing from inflammation and gut problems. It is also simple to do, it is highly satisfying with lots of fat and salt, and the results are dramatic and quick. You just eat until you’re no longer hungry.

Few other diets compare, the one exception being the ketogenic diet, which is unsurprising since zero-carb will obviously promote ketosis. Both of these diets have the advantage of simplicity. One quickly learns that all the struggle and suffering is unnecessary and undesirable. You eat until satiety and then stop. Overeating is almost impossible on carnivore, as the body returns to normal balance without all those carbs and sugar fucking up your metabolism and hormonal signaling for hunger.

We live in a dominator society that is drenched in moralistic religion and this impacts everyone, even atheists and new agers. This shapes the stories we tell, including dieting narratives of gluttony and sin (read Gary Taubes). We are told dieting must be hard, that it is something enforced, not something we do naturally as part of a lifestyle. We are taught to mistrust our bodies and, as if we are disembodied ego-minds, that we must control the body and resist temptation… and when we inevitably fail, one might argue by design, we must punish ourselves and double down on self-denial. If it feels good, it must be bad. What bullshit!

The addictive mentality of diets high in carbs and sugar is part of a particular social order built on oppressive social control. Rather than an internal sense of satisfaction, control must come from outside, such that we become disconnected even from our own bodies. It is a sense of scarcity where one is always hungry, always worried about where the next meal will come from. And in order to control this addictive state, we are told we have to fight against our own bodies, as if we are at war with ourselves. We lose an intuitive sense of what is healthy, as everything around us promotes imbalance and disease.

But what if there could be another way? What if you could feel even better with carnivory or ketogenic fasting than you ever felt before?

* * *

I’ve written before about low-carb, fasting, ketosis, and related dietary topics such as paleo and nutrient-density:

Ketogenic Diet and Neurocognitive Health; Fasting, Calorie Restriction, and Ketosis; Fasting and Feasting; The Agricultural Mind; Spartan Diet; Sailors' Rations, a High-Carb Diet; Obese Military?; Low-Carb Diets On The Rise; Obesity Mindset; Malnourished Americans; Ancient Atherosclerosis?; Carcinogenic Grains; The Creed of Ancel Keys; Dietary Dictocrats of EAT-Lancet; Clearing Away the Rubbish; Damning Dietary Data; Paleo Diet, Traditional Foods, & General Health; and The Secret of Health.

This is the first post about the carnivore diet. Some of the other posts come close to it, though. In a couple of them, I discuss diets that were largely centered on animal foods, from the Mongols to the Spartans. It was specifically my reading about and experimenting with fasting and ketosis that opened my mind to considering the carnivore diet.

I bring this up because of another interesting historical example I just came across. Brad Lemley, a science journalist, is an LCHF practitioner and advocate. He writes, "I've always been fascinated by Lewis and Clark's expedition. What gave the 33 men and one dog the strength to traverse the wild nation? Nine pounds of meat per day per man".

From the journal of Raymond Darwin Burroughs, there was a tally of the meat consumed on the expedition: "Deer (all species combined) 1,001; Elk 375; Bison 227; Antelope 62; Bighorn sheep 35; Bears, grizzly 43; Bears, black 23; Beaver (shot or trapped) 113; Otter 16; Geese and Brant 104; Grouse (all species) 46; Turkeys 9; Plovers 48; Wolves (only one eaten) 18; Indian dogs (purchased and consumed) 190; Horses 12" (The Natural History of the Lewis and Clark Expedition).

“This list does not include the countless smaller or more exotic animals that were captured and eaten by the Corps, such as hawk, coyote, fox, crow, eagle, gopher, muskrat, seal, whale blubber, turtle, mussels, crab, salmon, and trout” (Hunting on the Lewis and Clark Trail). “Additionally, 193 pounds of “portable soup” were ordered as an emergency ration when stores ran out and game was scarce or unavailable. The soup was produced by boiling a broth down to a gelatinous consistency, then further drying it until it was rendered quite hard and desiccated. Not exactly a favorite with the men of the Corps, it nonetheless saved them from near starvation on a number of occasions.”

That would be a damn healthy diet. Almost entirely hunted and wild-caught meat. They would have been eating head-to-tail with nothing going to waste: brains, intestines, organ meats, etc. They also would’ve been getting the bone marrow and bone broth. This would have provided every nutrient needed for not just surviving but thriving at high levels of health and vitality. Yet they also would have gone through periods of privation and hunger.

“Despite the apparent bounty of the ever-changing landscape and the generosity of local tribes, many were the nights when the crew of the Corps went to sleep hungry. Many were the days when shots went awry and missed their mark, or game remained hidden from sight. Relentless rain ruined drying meat, punishing heat spoiled perishable provisions, and clothing rotted right off the backs of the men.”

That means they also spent good portions of time fasting. So, there was plenty of ketosis and autophagy involved, further factors that promote health and energy. Taken together, this dietary lifestyle follows the traditional hunter-gatherer pattern of feasting and fasting. Some ancient agricultural societies such as the Spartans intentionally mimicked this intermittent fasting through the practice of one-meal-a-day, at least for young boys training for the life of a soldier.

Nina Teicholz has pointed out that a meat-heavy diet was common among early Americans, not only those on expeditions into the Western wilderness; and because of seasonal changes, fasting and its effects would also have been common. The modern industrial style of the standard American diet (SAD) doesn't only diverge from traditional hunter-gatherer diets but also from the traditional American diet.

* * *

[Eight embedded videos went here.]

* * *

Bonus Video!

This one particularly fits my own experience with mental health. The guy interviewed offers a compelling conversion story, going from the standard American diet (SAD) to carnivore after decades of everything getting worse. His example shows that, as long as you're still alive, it is never too late to regain some of your health, sometimes even with a complete reversal.

* * *

General online resources for carnivory:

Eat meat. Not too little. Mostly fat.
by L. Amber O’Hearn

Facultative Carnivore
A hypertext book (in-progress) by L. Amber O’Hearn
with audiobook version

The Ultimate Guide to the Carnivore Diet
co-written by L. Amber O'Hearn and Raphael Sirtoli

An article that includes several videos on carnivory:

HOW CAN A MEAT-ONLY DIET REVERSE CHRONIC DISEASE? FIVE DOCTORS SHARE THEIR INSIGHTS
by Afifah Hamilton

Other videos:

Carnivore Diet Gut Microbiome Case Study … Carneval

* * *

There haven't been many studies on carnivory. But one research paper concluded, "Our study has shown that Austrian adults who consume a vegetarian diet are less healthy (in terms of cancer, allergies, and mental health disorders), have a lower quality of life, and also require more medical treatment." The researchers were comparing a 'vegetarian' diet with a 'carnivore' diet.

It's a bit confusing, though. The carnivore category was divided into sub-categories that I didn't quite understand. At least one of the sub-categories of carnivore might better be described as omnivore. It's not clear that any of the subjects ate animal foods only. Also, the 'vegetarian' group included multiple diets. Some of them ("pure vegetarians") apparently were vegans, while others ate certain animal foods, the latter including not only dairy and eggs but in some cases fish as well. Basically, the comparison was more broadly between plant-based diets and meat-based diets.

More problematic, it is unclear whether the differences in health outcomes are dietary or environmental, as the authors discuss major differences in lifestyle. The 'vegetarians' sought out less preventative healthcare, presumably out of a mistrust of mainstream medicine. Even so, it's interesting in how it demonstrates that health is more complicated than simply eating more plants.

Vegetarians Are Less Healthy Than Carnivores
by Steve Parker, M.D. (text below from link)

From Independent:

Vegetarians are less healthy than meat-eaters, a controversial study has concluded, despite drinking less, smoking less and being more physically active than their carnivorous counterparts.

A study conducted by the Medical University of Graz in Austria found that the vegetarian diet, as characterised by a low consumption of saturated fat and cholesterol, due to a higher intake of fruits, vegetables and whole-grain products, appeared to carry elevated risks of cancer, allergies and mental health problems such as depression and anxiety.

While not mentioned in the Independent article, the full PLOS One report defined “vegetarian”:

While 0.2% of the interviewees were pure vegetarians (57.7% female), 0.8% reported to be vegetarians consuming milk and eggs (77.3% female), and 1.2% to be vegetarians consuming fish and/or eggs and milk (76.7% female).

I haven't read the whole thing, but if you're a vegetarian, you should digest it. Note the study was done in Austria. And if vegetarians are so unhealthy, why do Seventh Day Adventists in Loma Linda, CA, seem to have a longevity benefit? Do ya think maybe there's more involved than diet, like culture or genetics?

Sailors’ Rations, a High-Carb Diet

In the 18th century British Navy, "Soldiers and sailors typically got one pound of bread a day," in the form of hard tack, a hard biscuit. That is according to James Townsend. On top of that, some days they were given peas and on other days a porridge called burgoo. Elsewhere, Townsend shares some info from a 1796 memoir of the period — the author having written that "every man and boy born on the books of any of his Majesty's ships are allowed as following a pound of biscuit bread and a gallon of beer per day" (William Spavens, Memoirs of a Seafaring Life, p. 106). So, grains and more grains, in multiple forms, as foods and beverages.

About burgoo, it is a “ground oatmeal boiled up,” as described by Townsend. “Now you wouldn’t necessarily eat that all by itself. Early on, you were given to go with that salt beef fat. So the slush that came to the top when you’re boiling all your salt beef or salt pork. You get all that fat that goes up on top — they would scrape that off, they keep that and give it to you to go with your burgoo. But later on they said maybe that cause scurvy so they let you have some molasses instead.”

They really didn't understand scurvy at the time. Animal foods, especially fat, would have some vitamin C in them, whereas the oats and molasses had none. They made up for this deficiency later on by adding cabbage to the sailors' diet, though not a great choice considering vegetables don't store well on ships. I'd point out that the sailors weren't necessarily short on vitamin C by the standards of a healthy traditional diet, as they got meat four days a week and even on the other meat-free banyan-days they had some butter and cheese. That would have given them sufficient vitamin C for a low-carb diet, especially with seafood caught along the way.

A high-carb diet, however, is a whole other matter. The amount of carbs and sugar sailors ate daily was quite large. This came about with colonial trade that made grains cheap and widely available, along with sudden access to sugar from distant sugarcane plantations. Glucose competes with vitamin C, as the two molecules are structurally similar and compete for the same cellular uptake pathways, and so a high-carb diet requires a higher intake of vitamin C for basic health, specifically to avoid scurvy. A low-carb diet, on the other hand, can avoid scurvy with very little vitamin C, since sufficient amounts are in animal foods. Also, a low-carb diet is less inflammatory, and this further decreases the need for antioxidants like vitamin C.

This is why the Inuit could eat few plants and immense amounts of meat and fat. They got more vitamin C on a regular basis from seal fat than they did from the meager plant foods they could gather in the short warm period of the far north. But with almost no carbohydrates in the traditional Inuit diet, the requirement for vitamin C was so low as to not be a problem. This is probably the same explanation for why Vikings and Polynesians could travel vast distances across the ocean without getting sick, as they were surely eating mostly fresh seafood and very little, if any, starchy food.

Unlike protein and fat, carbohydrate is not an essential macronutrient. Yes, carbohydrates provide glucose that the body needs in limited amounts, but through gluconeogenesis protein can be turned into glucose on demand. So, a long sea voyage with zero carbs would never have been a problem.

Sailors in the colonial era ate all of those biscuits, porridge, and peas not because it offered any health value beyond mere survival but because it was cheap food. Those sailors weren’t being fed to have long, healthy lives as labor was cheap and no one cared about them. As soon as a sailor was no longer useful, he would no longer be employed in that profession and he’d find himself among the impoverished masses. For all the health problems of a sailor’s diet, it was better than the alternative of starvation or near starvation that so many others faced.

Grain consumption had been increasing in late feudalism, but peasants still maintained a wider variety in their diet through foods they could hunt or gather, not to mention some fresh meat, fat, eggs, and dairy from the animals they raised. That all began to change with the enclosure movement. The end of feudal village life and the loss of the peasants' commons was not a pretty picture and did not lead to happy results, as the landless peasants evicted from their homes flooded into the cities where most of them died. The economic desperation made for much cheap labor. Naval sailors with their guaranteed rations, in spite of nutritional deficiencies, were comparatively lucky.

* * *

This understanding of low-carb, animal-based diets isn't new either. If you look back to previous centuries, you see that low-carb diets have been advocated going back to the late 1700s. Advocating such diets prior to that was irrelevant, since low-carb was the dietary norm, assumed without needing to be stated.

Only beginning a couple of centuries ago did new forms of agriculture take hold that created large surplus yields for the first time in human existence. Unsurprisingly, right when a high-carb diet became possible for a larger part of the population, its health problems began to appear, and the voices for low-carb soon followed.

In prior centuries, one even sees examples in old books describing the health advantages of animal foods. But I'm not sure if anyone made the connection of high-carb diets to scurvy until more recently. Still, this understanding is older than most people realize, going back at least to the late 1800s. L. Amber O'Hearn shares the following passages (from C is for Carnivore):

Selected notes from the Lancet volume 123
You can find this in Google books [1].

p 329. From a medical report from Mr. W. H. Neale, M.B. B.S. medical officer of the Eira, about an Arctic expedition:

“For the boat journey we saved 40 lb. of tinned meat (per man), and 35 lb. of tinned soups (per man), 3 cwt. of biscuit, and about 800 lb. of walrus meat, which was cooked and soldered up by our blacksmith in old provision tins. About 80 lb. of tea were saved, enabling us to have tea night and morning till almost the day we were picked up. No lime-juice was saved. A few bottles of wine and brandy were secured, and kept for Mr. Leigh-Smith and invalids. All the rum was saved, and every man was allowed one-fifth of a gill per day until May 1st, 1882, when it was decided to keep the remaining eighteen gallons for the boats. One man was a teetotaler from January to June, and was quite as healthy as anyone else. Personally it made very little difference whether I took the allowance of “grog” or not. One of the sick men was also a teetotaler nearly all the time. During the boat journey the men preferred their grog when doing any hard work, a fact I could never agree to, but when wet and cold a glass of grog before going to sleep seemed to give warmth to the body and helped to send one to sleep. Whilst sailing, also, one glass of grog would give temporary warmth ; but everyone acknowledged that a mug of hot tea was far better when it was fit weather to make a fire. I do not think that spirits or lime-juice is much use as anti scorbutics ; for if you live on the flesh of the country even, I believe, without vegetables, you will run very little risk of scurvy. There was not a sign of scurvy amongst us, not even an anaemic face. I have brought home a sample of bear and walrus meat in a tin, which I intend to have analysed if it is still in good preservation ; and then it will be a question as to how it will be best to preserve the meat of the country in such a form as to enable a sufficient supply to be taken on long sledge journeys ; for as long as you have plenty of ventilation and plenty of meat, anyone can live out an Arctic winter without fear of scurvy, even if they lie for days in their beds, as our men were compelled to do in the winter when the weather was too bad to go outside (there being no room inside for more than six or seven to be up at one time).”

p331, John Lucas: “Sir, —A propos the annotation appearing under the above heading in The Lancet of June 24th, pp. 1048-9, I would beg permission to observe that almost every medical man in India will be able to endorse the views of Dr. Moore, to which you refer. Medical officers of native regiments notice almost daily in their hospital practice that—to use your writer’s words—”insufficient diet will cause scurvy even if fresh vegetable material forms a part of the diet, though more rapidly if it is withheld.” Indeed, so far as my humble experience as a regimental surgeon from observations on the same men goes, I am inclined to think that the meat-eating classes of our Sepoys—to wit, the Mahomedans, especially those from the Punjaub—are comparatively seldom seen with the scorbutic taint ; while, on the contrary, the subjects are, in the main, the vegetable feeders who are their non-meat-eating comrades, the Hindus (Parboos from the North-West Provinces and Deccan Mahrattas), especially those whose daily food is barely sufficient either in quality or quantity. A sceptic may refuse to accept this view on the ostensible reason that though the food of the meat-eating classes be such, it may, perchance, contain vegetable ingredients as well as meat. To this I would submit the rejoinder that as a matter of fact, quite apart from all theory and hypothesis, the food of these meat-eating classes does not always contain much, or any, vegetables. In the case of the semi-savage hill tribes of Afghanistan and Baluchistan, their food contains large amounts of meat (mutton), and is altogether devoid of vegetables. The singular immunity from scurvy of these races has struck me as a remarkable physiological circumstance, which should make us pause before accepting the vegetable doctrine in relation to scurvy et hoc genus omne.”

p370 Charles Henry Ralphe “To the Editor of The Lancet. Sir, —I was struck by two independent observations which occurred in your columns last week with regard to the etiology of scurvy, both tending to controvert the generally received opinion that the exclusive cause of the disease is the prolonged and complete withdrawal of succulent vegetables from the dietary of those affected. Thus Mr. Neale, of the Eira Arctic Expedition, says : ” I do not think that spirit or limejuice is of much use as an anti scorbutic ; for if you live on the flesh of the country, even, I believe, without vegetables, you will run very little risk of scurvy.” Dr. Lucas writes: “In the case of the semi- savage hill tribes of Afghanistan and Beluchistan their food contains a large amount of meat, and is altogether devoid of vegetables. The singular immunity from scurvy of these races has struck me as a remarkable physiological circumstance, which should make us pause before accepting the vegetable doctrine in relation to scurvy.” These observations do not stand alone. Arctic voyagers have long pointed out the antiscorbutic properties of fresh meat, and Baron Larrey, with regard to hot climates, arrived at the same conclusion in the Egyptian expedition under Bonaparte, at the end of last century.”

p495 “SCURVY. Dr. Buzzard, in a letter which appeared in our columns last week, considers the fact that the crew of the Eira were supplied with preserved vegetables tells against the supposition advanced by Mr. Neale, that if Arctic voyagers were to feed only on the flesh of the animals supplied by the country they would be able to dispense with lime-juice. The truth is, it is an open question with many as to the relative antiscorbutic properties of preserved vegetables, and whether under the circumstances in which the Eira’s crew were placed they would have been sufficient, in the absence of lime-juice and fresh meat, to have preserved the crew from scurvy. A case in point is the outbreak that occurred on board the Adventure, in the surveying voyages of that vessel and the Beagle. The Adventure had been anchored in Port Famine for several months, and although “pickles, cranberries, large quantities of wild celery, preserved meats and soups, had been abundantly supplied,” still great difficulty had been experienced in obtaining fresh meat, and they were dependent on an intermittent supply from wild-fowl and a few shell-fish. Scurvy appeared early in July, fourteen cases, including the assistant-surgeon, being down with it. At the end of July fresh meat was obtained; at first it seemed to prove ineffectual, but an ample supply being continued, the commander was able to report, by the end of August, ” the timely supply of guanaco meat had certainly checked the scurvy.” This is an instance in which articles of diet having recognised antiscorbutic properties proved insufficient, in the absence of lime-juice and fresh meat, and under conditions of exceptional hardship, exposure, and depressing influence, to prevent the occurrence of scurvy. So with the Eira, we believe that had they not fortunately been able to obtain abundant supplies of fresh meat, scurvy would have appeared, and that the preserved vegetables in the absence of lime-juice would have proved insufficient as antiscorbutics. This antiscorbutic virtue of fresh meat has long been recognised by Arctic explorers, and, strangely, their experience in this respect is quite at variance with ours in Europe. It has been sought to explain the immunity from the disease of the Esquimaux, who live almost exclusively on seal and walrus flesh during the winter months, by maintaining that the protection is derived from the herbage extracted from the stomach of reindeer they may kill. In view, however, of the small proportion of vegetable matter that would be thus obtained for each member of the tribe, and the intermittent nature of the supply, it can hardly be maintained that the antiscorbutic supplied in this way is sufficient unless there are other conditions tending in the same direction. And of these, one, as we have already stated, consists probably in the fact that the flesh is eaten without lactic acid decomposition having taken place, owing either to its being devoured immediately, or from its becoming frozen. The converse being the case in Europe, where meat is hung some time after rigor mortis has passed off, and lactic acid develops to a considerable extent. This seems a rational explanation, and it reconciles the discrepancy of opinion that exists between European and Arctic observers with regard to meat as an antiscorbutic.
In bringing forward the claims of the flesh of recently killed animals as an antiscorbutic, it must be understood that we fully uphold the doctrine that the exclusive cause of scurvy is due to the insufficient supply of fresh vegetable food, and that it can be only completely cured by their administration ; but if the claims advanced with regard to the antiscorbutic qualities of recently slaughtered flesh be proved, then we have ascertained a fact which ought to be of the greatest practical value with regard to the conduct of exploring expeditions, and every effort should be made to obtain it. Everything, moreover, conducive to the improvement of the sailor’s dietary ought to receive serious consideration, and it has therefore seemed to us that the remarks of Mr. Neale and Dr. Lucas are especially worthy of attention, whilst we think the suggestion of the former gentleman with regard to the use of the blood of slaughtered animals likely to prove of special value.”

p913 “Sir, —In a foot-note to page 496 of his “Manual of Practical Hygiene,” fifth edition (London, Churchill, 1878), Parkes says : —”For a good deal of evidence up to 1818, I beg to refer to a review I contributed on scurvy in the British and Foreign Medico-Chirurgical Review in that year. The evidence since this period has added, I believe, little to our knowledge, except to show that the preservation and curative powers of fresh meat in large quantities, and especially raw meat (Kane’s Arctic Expedition), will not only prevent, but will cure scurvy. Kane found the raw meat of the walrus a certain cure. For the most recent evidence and much valuable information, see the Report of the Admiralty Committee on the Scurvy which occurred in the Arctic Expedition of 1875-76 (Blue Book, 1877).” I think that the last sentence in the above is not Parkes’ own, but that it must have been added by the editor in order to bring it up to the date of the issue of the current edition. The experience since then of the Arctic Expedition in the Eira coincides with these. I refer to that portion of the report where the author tells us that “our food consisted chiefly of bear and walrus meat, mixing some of the bear’s blood with the soup when possible.” And again: “I do not think that spirits or lime-juice is much use as an antiscorbutic, for if you live on the flesh of the country, even, I believe, without vegetables, you will run very little risk of scurvy. There was not a sign of scurvy amongst us, not even an anaemic face,” (Lancet, Aug. 26th.) So that, as far as this question of fresh meat and raw meat and their prophylactic and curative properties are concerned, ample evidence will be found in other published literature to corroborate that of the Eira. But when you take up the question of the particular change which takes place in meat from its fresh to its stale condition, you will find a great deal of diversity and little harmony of opinion. Without taking up other authors on the subject, we stick to Parkes and compare his with Dr. Ralfe’s views on this point. Parkes thought “fresh, and especially raw meat, is also useful, and this is conjectured to be from its amount of lactic acid ; but this is uncertain,” while on the other hand Dr. Ralfe repeats, as a probable explanation of the reason of fresh meat being an anti scorbutic, that it is due to the absence of lactic acid. For, from well-known chemical facts he deduces the following: — “In hot climates meat has to be eaten so freshly killed that no time is allowed for the development of the lactic acid : in arctic regions the freezing arrests its formation. The muscle plasma, therefore, remains alkaline. In Europe the meat is invariably hung, lactic acid is developed freely, and the muscle plasma is consequently acid. If, therefore, scurvy is, as I have endeavoured to show (“Inquiry into the General Pathology of Scurvy”), due to diminished alkalinity of the blood, it can be easily understood that meat may be antiscorbutic when fresh killed, or frozen immediately after killing, but scorbutic when these alkaline salts have been converted into acid ones by lactic acid decomposition.” The view of the alkalinity of the blood coincides with Dr. Garrod’s theory, which, however, appears to have as a sine qua non the absence of a particular salt, namely, potash.
I am inclined to think that, taking into account the nervous symptoms which are not infrequently associated with a certain proportion of scorbutic cases, resulting probably from the changes taking place in the blood, not unlike those which occur in gout and rheumatism, there must be some material change produced in the sympathetic system. In many of the individuals tainted with scurvy there were slight and severe attacks of passing jaundice in the cases which occurred in Afghanistan. Can we possibly trace this icteric condition to this cause? This is but a conjecture so far. But there certainly is in Garrod’s observations an important point which, if applicable to all countries, climates, and conditions of life, is sufficiently weighty to indicate the necessity for farther research in that direction, and that point is this : the scorbutic condition disappeared on the patient being given a few grains of potash, though kept strictly on precisely the same diet which produced scurvy. —I am, Sir, yours truly, Ahmedabad, India, 30th Sept., 1882. JOHN C. LUCAS.”

On Salt: Sodium, Trace Minerals, and Electrolytes

There has been a lot of debate about salt lately. The mainstream view originated from little actual scientific evidence and was never well supported, and the research since then has been mixed.

The disagreement isn't limited to mainstream versus alternative thinkers. Paleo advocates such as Dr. Loren Cordain (considered to be the founder of the paleo diet) continue to recommend lower salt intake. Still, an increasing number of scientists and physicians have come out in favor of the benefits of salt: Dr. Barbara Hendel, Dr. F. Batmanghelidj, Dr. Esteban Genoa, Dr. Eric Westman, Dr. Jeff S. Volek, Dr. Stephen D. Phinney, and Dr. James DiNicolantonio. Many of these experts argue that increased amounts of salt are particularly important on a low-carb diet, and that is even more true with high-protein. This relates to issues transitioning into ketosis, what is referred to as keto flu. Basically, the electrolytes temporarily get out of balance while one is adapting to ketosis. Yet Sally Fallon Morell states that it is a plant-based diet that requires more salt, to increase HCL in the stomach for digestion.

All of this was brought to my attention because of Dr. DiNicolantonio's book The Salt Fix that came out recently. His simplest advice is to salt to taste, since your body (presumably under normal conditions) should know how much salt it needs. He argues that salt isn't addictive like sugar. So, according to this view, salt cravings can be safely treated as a genuine need for salt. I haven't read The Salt Fix, but I have skimmed a bit of one of his other books, Superfuel. In that book, he states that salt, besides maintaining healthy blood pressure, helps maintain insulin sensitivity. Also, salt goes back to the fat issue — more from the book:

“Diets very low in sodium (salt) increase adrenaline and aldosterone, and these hormones reduce activity of D6D and D5D. For this reason, low-salt diets increase the need for EPA and DHA due to the reduced desaturase enzyme activities. Another extremely common hormonal issue these days, one that interferes with conversion of the parent omega-6 and omega-3 fats into their derivatives, is hypothyroidism. Thyroid hormone is required for proper activity of D6D and D5D, so individuals with suboptimal thyroid hormone levels may benefit from consuming more EPA and DHA or taking good-quality supplements.”

There are a number of issues with sodium, potassium, and magnesium in relation to insulin, adrenaline, and aldosterone. Shifting the diet can affect any or all of these. The problem is that most research has been limited to people on the standard American diet. We know very little, if anything at all, about salt intake or electrolyte supplementation with other diets. That forces people into experimentation. Anything true of high-carb diets may or may not apply to low-carb diets. Nor do we know that the same will be true between moderately low-carb diets, extremely low-carb diets, zero-carb diets, etc. Then there are other factors such as fasting, ketosis, and autophagy that alter the body's functioning. It's possible that, with low enough carb intake, the need for electrolytes and trace minerals decreases, as is the case with vitamin C. Sounds like a great hypothesis to be tested.

Then there is the issue of what actually helps vs what might harm you. What are the potential risks and benefits of getting too few electrolytes and trace minerals vs higher levels? I’m not sure self-experimentation can exactly figure this out, although maybe some have strong enough responses to salt or its lack that they can figure out what works for them. My own experimentation hasn’t indicated anything particular, either positive or negative.

Like anyone else, I enjoy the taste of salt. But unlike sugar, I've never craved salt in the addictive sense (and I know what addiction feels like). According to some of what I read, the danger seems to be specifically with refined salt, as is the case with so much else that is refined. Refined salt doesn't give your body what it needs and so throws off the balance, preventing healthy processing of glucose, which according to this explanation is why refined salt predisposes you to sugar cravings. I remember reading about this sugar and salt craving cycle back in the 1990s, but apparently it only applies to refined salt, if I'm understanding the research correctly. It just so happens that processed food manufacturers love to combine refined carbs and sugar with refined salt, where taste has become disconnected from actual nutrient content because almost all nutrients have been stripped away. They also throw in addictive wheat and dairy for good measure.

I noticed that Dr. James DiNicolantonio says to worry less about sodium and instead focus on potassium. But he emphasizes natural sources of potassium. His point is that salt simply makes high-potassium foods more palatable, foods that otherwise would be more bitter. He points out that there are both animal and plant foods with greater amounts of potassium: fish, shellfish, greens, beans, potatoes, and tomatoes. The significance of the salt is that once potassium hits a threshold the sodium supposedly will balance it out. Seafood is particularly high in these particular micronutrients, along with much else that is healthy (e.g., EPA and DHA). Many healthy populations have lived near the ocean, as observed by Weston A. Price and others. Some argue that seafood shaped human evolution, the aquatic ape theory.

“Most animals with a sodium deficiency display an active craving for salt which, when satisfied, disappears. In humans, salt intake has little or no relation to the body’s needs. Some Inuit tribes avoid salt almost completely, while people in the Western world consume 15-20 times the amount needed for health. In other words, a single African species (assuming humans have an African origin) possesses a wildly different scheme of salt management. Humans are also the only primates to regulate body temperature by sweat-cooling, a system profligate in the use of sodium. Proponents of the Aquatic Ape Hypothesis believe that sweat-cooling could not have developed anywhere except near the sea where diets contain considerable salt, in fact much more salt than the body requires.” (William R. Corliss, Our aquatic phase!)

An interesting theory to explain the unusual aspects of salt in the human species and why there are so many differences even across traditional societies. Whether or not the aquatic ape theory is correct, it's certain that the foods in the standard American diet are far different in numerous ways, likely including nutrient content of magnesium and potassium. It would be useful to measure the levels of micronutrients in a healthy hunter-gatherer diet, not only from salt but from food sources as well. Besides seafood and certain plants, especially seaweed (Birgitte Svennevig, Did seaweed make us who we are today?), many have noted that it is a common practice among hunter-gatherers to consume blood along with organ meats and interstitial fluid, all of which are high in salt.

I wonder if this is something we overthink because dietary experts came to obsess over it as a convenient scapegoat (as they scapegoated saturated fat). The whole debate has become polarized between those arguing for low-salt and those for high-salt. But other factors might be more important. Besides the problems of a high-carb diet, maybe salt levels aren't that big of an issue. Assuming there aren't specific health conditions, most people might be perfectly safe to salt to taste or largely ignore salt if they prefer. Potassium and magnesium seem a bit different, though. Those mostly come from foods, not salt. I don't know of research that compares people who eat foods high in these micronutrients with those who don't. It's another one of those confounders with the standard American diet. And even a zero-carb dieter can eat foods that are either high or low in these micronutrients. For those not using salt, it would be useful to know which foods they eat and their micronutrient profile.

My conclusion is simply that salt tastes good and, until better science comes along to tell me otherwise, I’ll salt to taste. I’m definitely a fan of the philosophy of listening to one’s body. I self-experiment and find out what works. In my experience, there is a big difference between craving sugar and enjoying salt. One is clearly an addiction and the other not, at least in my case. I was reminded of this just moments ago. I got a glass of water. Since it was on my mind, I sprinkled some sea salt in it and a few drops of electrolytes, along with a capful of apple cider vinegar as I’m wont to do. I quickly downed it and realized how refreshing it was. Earlier this morning I had a glass of water without anything in it and it wasn’t nearly as thirst-quenching. I’m not sure why that is. Something about water with salt and trace minerals in it is so satisfying. I suppose that is why many people love Gatorade and similar drinks. They go down so easily, even though the other ingredients are horrible for your health.

My advice is this: Enjoy salt. It tastes good and makes food more satisfying. Certain trace minerals are necessary for life and health, although only small amounts are naturally found in salt. As for potential downsides, there is as yet no clear evidence and no consensus. So, do as many others do and find out what works for you.

* * *

There is a secondary issue, or rather some related secondary issues. Angela A. Stanton advises against consuming rock salts that have to be mined, such as Himalayan pink salt and Real Salt (The Truth About Himalayan Salt). She gives two main reasons. First, there might be impurities, including radioactive elements and heavy metal toxins such as lead, although she mentions there are impurities in sea salt as well. The other problem is that these natural sources of salt lack iodine, an important nutrient. So, for both reasons, she recommends a refined salt that has been purified and iodized.

Her second point is the strongest. Iodine is, without a doubt, an essential nutrient and a deficiency is serious. I'm not sure how likely deficiencies are these days for those eating an otherwise wholesome diet, but it is something to keep in mind. Of course, you could solve this problem by occasionally sprinkling some seaweed on your food, a great natural source of iodine. Her fear about impurities, though, is maybe less substantiated because the amount of impurities is so small as to be insignificant. If we are to be paranoid, impurities are almost everywhere and in almost everything — the air you breathe and the water you drink, the food you eat and the supplements you take. The human body evolved to handle such minuscule exposures.

If you have health concerns about iodine deficiency, then go ahead with iodized salt. But otherwise, it probably doesn't matter too much which kind of salt you use, as long as it comes from a high quality source. If you want to learn more about the issue of contaminants, Dave Asprey has directly responded to Stanton (Is Pink Himalayan Salt Toxic?) and so has Jeremy Orozco (Is Pink Himalayan Salt Toxic? Radioactive?). There are others as well who respond more generally to the topic. There were some responses to a Quora inquiry: Is the amount of plutonium in pink Himalayan salt dangerous? (less than .001 ppm). Also, in the comments section of a piece by Harriet Hall (Pink Himalayan Sea Salt: An Update), there were useful responses:

Jeff Mink • 2 years ago
“In case it wasn’t clear from the article, Himalayan sea salt does not contain “84 trace elements”. If you follow the link to the spectral analysis, it simply lists all non-noble gas elements in the periodic table. If the concentration is “< X ppm”, that means that none of that element was detected. That leaves it with a total of 29 elements (NOT MINERALS!) detected, assuming I counted correctly. In fact, they didn’t even test for technetium and promethium, since there’s no chance (according to our modern scientific theories) that those could be in there. None of the elements that are actually contained in the salt are radioactive (at least not that I saw), but thallium and lead are definitely not good for the human body. Of course, at the concentrations listed, you’d probably succumb to sodium poisoning long before you got a harmful dose of heavy metals.”

Mathew Taylor • 2 years ago
“There are two main parts to this article: 1) Pink Salt does not provide any health benefits, or they are overstated grossly. I concede this point.

“However, the 2nd part, that pink salt is HARMFUL appears to be wrong. You state that it is full of poisons / contaminants. Lets look at a few of them;

“Arsenic – <0.01 ppm – There is more arsenic in some foods than this. In fact, local authorities limit arsenic concentrations in some seafood to 2mg/kg, thats 2ppm, orders of magnitude more than in pink salt and in something you would consume an order of magnitude more of.

“Mercury – .03 ppm in pink salt. Contrast that with Tuna, where levels are at least TEN TIMES higher, and the volume you would consume in a serving is many orders of magnitude higher.

“Lead – .1 ppm – There is lead in a variety of foods, but usually lower concentrations than this. Remember that salt is not used in massive quantities, unlike vegetables. The target for blood lead levels is less than 10 mcg/dl, or approx 500 mcg total. To get that much lead from pink salt, you’d have to consume 5 kilograms of the stuff. Good luck with that.

“Uranium – <0.001 ppm – Lots of food has uranium in it. Mushrooms can have over 100 μg U/kg (Dry mass).

“So don’t use it if you don’t want to, but don’t make out like this stuff is bad for you, it is, after all, 97.3% table salt.”
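As a quick sanity check of Taylor's lead arithmetic, using only the figures quoted above: 0.1 ppm means 0.1 mg of lead per kilogram of salt, which is 100 mcg per kilogram. To accumulate the roughly 500 mcg he cites, you would indeed need 500 ÷ 100 = 5 kilograms of salt, which matches his conclusion.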

* * *

If you want further info about salt, here is a somewhat random collection of articles and videos, all of them bringing new perspectives based on the latest research:

The Salt of the Earth

Salt: friend or foe?

Why Salt Is Good For You, But Some Salt is Better Than Others

Dr. James DiNicolantonio On Sodium-Potassium Balance

The Potassium Myth

The Importance of Potassium and Sodium for Fertility Health

On Keto Flu and Electrolyte Imbalances

Leveraging basic physiology to prevent ‘keto-flu,’ ‘Atkins-flu,’ and ‘adrenal fatigue.’

How much sodium, potassium and magnesium should I have on a ketogenic diet?

Salt

The Crisis of Identity

“Besides real diseases we are subject to many that are only imaginary, for which the physicians have invented imaginary cures; these have then several names, and so have the drugs that are proper to them.”
~Jonathan Swift, 1726
Gulliver’s Travels

“The alarming increase in Insanity, as might naturally be expected, has incited many persons to an investigation of this disease.”
~John Haslam, 1809
On Madness and Melancholy: Including Practical Remarks on those Diseases

“Cancer, like insanity, seems to increase with the progress of civilization.”
~Stanislas Tanchou, 1843
Paper presented to the Paris Medical Society

I’ve been following Scott Preston over at his blog, Chrysalis. He has been writing on the same set of issues for a long time now, longer than I’ve been reading his blog. He reads widely and so draws on many sources, most of which I’m not familiar with, part of the reason I appreciate the work he does to pull together such informed pieces. A recent post, A Brief History of Our Disintegration, would give you a good sense of his intellectual project, although the word ‘intellectual’ sounds rather paltry for what he is exploring: “Around the end of the 19th century (called the fin de siecle period), something uncanny began to emerge in the functioning of the modern mind, also called the “perspectival” or “the mental-rational structure of consciousness” (Jean Gebser). As usual, it first became evident in the arts — a portent of things to come, but most especially as a disintegration of the personality and character structure of Modern Man and mental-rational consciousness.”

That time period has been an interest of mine as well. There are two books that come to mind that I've mentioned before: Tom Lutz's American Nervousness, 1903 and Jackson Lears' Rebirth of a Nation (for a discussion of the latter, see: Juvenile Delinquents and Emasculated Males). Both talk about that turn-of-the-century crisis, the psychological projections and physical manifestations, the social movements and political actions. A major concern was neurasthenia, which, according to the dominant economic paradigm, meant a deficit of 'nervous energy' or 'nerve force', the reserves of which, if wasted rather than reinvested wisely, would lead to physical and psychological bankruptcy, and so one became spent. (The term 'neurasthenia' was first used in 1829 and popularized by George Miller Beard in 1869, the same period when the related medical condition of 'nostalgia' became a more common diagnosis, although 'nostalgia' was first referred to in the 17th century, when the Swiss doctor Johannes Hofer coined the term, using it interchangeably with nosomania and philopatridomania — see: Michael S. Roth, Memory, Trauma, and History; David Lowenthal, The Past Is a Foreign Country; Thomas Dodman, What Nostalgia Was; Susan J. Matt, Homesickness; Linda Marilyn Austin, Nostalgia in Transition, 1780-1917; Svetlana Boym, The Future of Nostalgia; Gary S. Meltzer, Euripides and the Poetics of Nostalgia; and The Disease of Nostalgia.) Today, we might speak of 'neurasthenia' as stress and, even earlier, they had other ways of talking about it — as Bryan Kozlowski explained in The Jane Austen Diet, p. 231: "A multitude of Regency terms like "flutterings," "fidgets," "agitations," "vexations," and, above all, "nerves" are the historical equivalents to what we would now recognize as physiological stress." It was the stress of falling into history, a new sense of time as linear progression that made the past a lost world — from Stranded in the Present, Peter Fritzsche wrote:

“On that August day on the way to Mainz, Boisseree reported one of the startling consequences of the French Revolution. This was that more and more people began to visualize history as a process that affected their lives in knowable, comprehensible ways, connected them to strangers on a market boat, and thus allowed them to offer their own versions and opinions to a wider public. The emerging historical consciousness was not restricted to an elite, or a small literate stratum, but was the shared cultural good of ordinary travelers, soldiers, and artisans. In many ways history had become a mass medium connecting people and their stories all over Europe and beyond. Moreover, the drama of history was construed in such a way as to put emphasis on displacement, whether because customary business routines had been upset by the unexpected demands of headquartered Prussian troops, as the innkeepers protested, or because so many demobilized soldiers were on the move as they returned home or pressed on to seek their fortune, or because restrictive legislation against Jews and other religious minorities had been lifted, which would explain the keen interest of “the black-bearded Jew” in Napoleon and of Boisseree in the Jew. History was not simply unsettlement, though. The exchange of opinion “in the front cabin” and “in the back” hinted at the contested nature of well-defined political visions: the role of the French, of Jacobins, of Napoleon. The travelers were describing a world knocked off the feet of tradition and reworked and rearranged by various ideological protagonists and conspirators (Napoleon, Talleyrand, Blucher) who sought to create new social communities. Journeying together to Mainz, Boisseree and his companions were bound together by their common understanding of moving toward a world that was new and strange, a place more dangerous and more wonderful than the one they left behind.”

That excitement was mixed with the feeling of being spent, the reserves having been fully tapped. This was mixed up with sexuality in what Theodore Dreiser called the 'spermatic economy', the management of libido as psychic energy, a modernization of Galenic thought (by the way, the catalogue for Sears, Roebuck and Company offered an electrical device to replenish nerve force that came with a genital attachment). Obsession with sexuality was used to reinforce gender roles in how neurasthenic patients were treated, following the practice of Dr. Silas Weir Mitchell: men were recommended to become more active (the 'West cure') and women more passive (the 'rest cure'), although some women "used neurasthenia to challenge the status quo, rather than enforce it. They argued that traditional gender roles were causing women's neurasthenia, and that housework was wasting their nervous energy. If they were allowed to do more useful work, they said, they'd be reinvesting and replenishing their energies, much as men were thought to do out in the wilderness" (Julie Beck, 'Americanitis': The Disease of Living Too Fast). That feminist-style argument, as I recall, came up in advertisements for Bernarr Macfadden's fitness protocol in the early 1900s, encouraging (presumably middle class) women to give up housework for exercise and so regain their vitality. Macfadden was also an advocate of living a fully sensuous life, going as far as free love.

Besides the gender wars, there was the ever-present bourgeois bigotry. Neurasthenia is the most civilized of the diseases of civilization since, in its original American conception, it was perceived as only afflicting middle-to-upper class whites, especially WASPs — as Lutz says, "if you were lower class, and you weren't educated and you weren't Anglo Saxon, you wouldn't get neurasthenic because you just didn't have what it took to be damaged by modernity" (Julie Beck, 'Americanitis': The Disease of Living Too Fast); and so, according to Lutz's book, people would make "claims to sickness as claims to privilege." This class bias goes back even earlier to Robert Burton's melancholia, with its element of what later would be understood as the Cartesian anxiety of mind-body dualism, a common ailment of the intellectual elite (mind-body dualism goes back to the Axial Age; see Julian Jaynes, The Origin of Consciousness in the Breakdown of the Bicameral Mind). The class bias was different for nostalgia, as written about by Svetlana Boym in The Future of Nostalgia (p. 5):

“For Robert Burton, melancholia, far from being a mere physical or psychological condition, had a philosophical dimension. The melancholic saw the world as a theater ruled by capricious fate and demonic play. Often mistaken for a mere misanthrope, the melancholic was in fact a utopian dreamer who had higher hopes for humanity. In this respect, melancholia was an affect and an ailment of intellectuals, a Hamletian doubt, a side effect of critical reason; in melancholia, thinking and feeling, spirit and matter, soul and body were perpetually in conflict. Unlike melancholia, which was regarded as an ailment of monks and philosophers, nostalgia was a more “democratic” disease that threatened to affect soldiers and sailors displaced far from home as well as many country people who began to move to the cities. Nostalgia was not merely an individual anxiety but a public threat that revealed the contradictions of modernity and acquired a greater importance.”

Like diabetes, melancholia and neurasthenia were first seen among the elite, and so they were taken as demonstrating one's elite nature. Prior to neurasthenic diagnoses but in the post-revolutionary era, a similar phenomenon went by other names. This is explored by Bryan Kozlowski in one chapter of The Jane Austen Diet (pp. 232-233):

“Yet the idea that this was acceptable—nay, encouraged—behavior was rampant throughout the late 18th century. Ever since Jane was young, stress itself was viewed as the right and prerogative of the rich and well-off. The more stress you felt, the more you demonstrated to the world how truly delicate and sensitive your wealthy, softly pampered body actually was. The common catchword for this was having a heightened sensibility—one of the most fashionable afflictions in England at the time. Mainly affecting the “nerves,” a Regency woman who caught the sensibility but “disdains to be strong minded,” wrote a cultural observer in 1799, “she trembles at every breeze, faints at every peril and yields to every assailant.” Austen knew real-life strutters of this sensibility, writing about one acquaintance who rather enjoys “her spasms and nervousness and the consequence they give her.” It’s the same “sensibility” Marianne wallows in throughout the novel that bears its name, “feeding and encouraging” her anxiety “as a duty.” Readers of the era would have found nothing out of the ordinary in Marianne’s high-strung embrace of stress.”

This condition was considered a sign of progress, but over time it came to be seen by some as the greatest threat to civilization, in either case offering much material for popular fictionalized portrayals. Being sick in this fashion was proof that one was a modern individual, an exemplar of advanced civilization, if at immense cost — Julie Beck explains ('Americanitis': The Disease of Living Too Fast):

“The nature of this sickness was vague and all-encompassing. In his book Neurasthenic Nation, David Schuster, an associate professor of history at Indiana University-Purdue University Fort Wayne, outlines some of the possible symptoms of neurasthenia: headaches, muscle pain, weight loss, irritability, anxiety, impotence, depression, “a lack of ambition,” and both insomnia and lethargy. It was a bit of a grab bag of a diagnosis, a catch-all for nearly any kind of discomfort or unhappiness.

“This vagueness meant that the diagnosis was likely given to people suffering from a variety of mental and physical illnesses, as well as some people with no clinical conditions by modern standards, who were just dissatisfied or full of ennui. “It was really largely a quality-of-life issue,” Schuster says. “If you were feeling good and healthy, you were not neurasthenic, but if for some reason you were feeling run down, then you were neurasthenic.””

I'd point out how neurasthenia was seen as primarily caused by intellectual activity, as it became a descriptor of a common experience among the burgeoning middle class of often well-educated professionals and office workers. This relates to Weston A. Price's work in the 1930s, as modern dietary changes first hit this demographic since they had the means to afford eating a fully industrialized Standard American Diet (SAD), long before others (within decades, though, SAD-caused malnourishment would wreck the health at all levels of society). What this meant, in particular, was a diet high in processed carbs and sugar that coincided with the early-1900s decreased consumption of meat and saturated fats, following Upton Sinclair's 1906 muckraking novel of the meat-packing industry, The Jungle. As Price demonstrated, this was a vast change from the traditional diet found all over the world, including in rural Europe (and presumably in rural America, with most Americans not urbanized until the turn of last century), that always included significant amounts of nutritious animal foods loaded up with fat-soluble vitamins, not to mention lots of healthy fats and cholesterol.

Prior to talk of neurasthenia, the exhaustion model of health, portrayed as waste and depletion, took hold in Europe centuries earlier (e.g., anti-masturbation panics) and had its roots in the humoral theory of bodily fluids. It has long been understood that food, specifically macronutrients (carbohydrate, protein, and fat) and food groups, affects mood and behavior — see the early literature on melancholy. During feudalism, food laws were used as a means of social control, such that in one case meat was prohibited prior to Carnival because of its energizing effect, which it was thought could lead to rowdiness or even revolt, as sometimes did happen (Ken Albala & Trudy Eden, Food and Faith in Christian Culture). Red meat, in particular, was thought to heat up blood (warm, wet) and yellow bile (warm, dry), promoting the sanguine and choleric personalities of masculinity. Like women, peasants were supposed to be submissive and hence not too masculine — they were to be socially controlled, not self-controlled. Anyone who was too strong-willed and strong-minded, other than the (ruling, economic, clerical, and intellectual) elite, was considered problematic; and one of the solutions was an enforced change of diet to create the proper humoral disposition for their appointed social role within the social order (i.e., depriving them of nutrient-dense meat until an individual or group was too malnourished, weak, anemic, sickly, docile, and effeminate to be assertive, aggressive, and confrontational toward their 'betters').

There does seem to be a correlation (causal link?) between an increase of intellectual activity and abstract thought and an increase of carbohydrates and sugar, with this connection first appearing during the early colonial era that set the stage for the Enlightenment. It was the agricultural mind taken to a whole new level. Indeed, a steady flow of glucose is one way to fuel extended periods of brain work, such as reading and writing for hours on end and late into the night — the reason college students to this day will down sugary drinks while studying. Because of trade networks, Enlightenment thinkers were buzzing on the suddenly much more available simple carbs and sugar, with an added boost from caffeine and nicotine. The modern intellectual mind was drugged-up right from the beginning, and over time it took its toll. Such dietary highs inevitably lead to ever greater crashes of mood and health. Interestingly, Dr. Silas Weir Mitchell, who advocated the ‘rest cure’ and ‘West cure’ in treating neurasthenia and other ailments, additionally used a “meat-rich diet” for his patients (Ann Stiles, Go rest, young man). Other doctors of that era were even more direct in using specifically low-carb diets for various health conditions, often for obesity, which was also a focus of Dr. Mitchell’s.

As a side note, the gendering of diet was seen as important for constructing, maintaining, and enforcing gender roles; that carried over into the modern bias that masculine men eat steak and effeminate women eat salad. According to humoralism, men are well contained while women are leaky vessels. One can immediately see the connection to fears of neurasthenia, emasculation, and excessive ejaculation. The ideal man was supposed to hold onto and control his bodily fluids, from urine to semen, by using and investing them carefully. With neurasthenia, though, men were seen as having become effeminate and leaky, dissipating and draining away their reserves of vital fluids and psychic energies. So, a neurasthenic man needed a strengthening of the boundaries that held everything in. The leakiness of women was also a problem, but women couldn’t and shouldn’t be expected to contain themselves. The rest cure designed for women was to isolate them in a bedroom where they’d be contained by the architectural structure of a home that was owned and ruled over by the male master. A husband and, by extension, the husband’s property were to contain the wife, since she too was property of the man’s propertied self. This made a weak man of the upper classes even more dangerous to the social order, because he couldn’t play his needed gender role of husband and patriarch, upon which all of Western civilization was believed to depend.

All of this was based on an economic model of physiological scarcity. With neurasthenia arising in late modernity, the public debate was overtly framed by an economic metaphor. But the perceived need for economic containment of the self, be it self-containment or enforced containment, went back to early modernity. The enclosure movement was part of a larger reform movement, a reform not only of land but also of society and identity.

* * *

“It cannot be denied that civilization, in its progress, is rife with causes which over-excite individuals, and result in the loss of mental equilibrium.”
~Edward Jarvis, 1843
“What shall we do with the Insane?”
The North American Review, Volume 56, Issue 118

“Have we lived too fast?”
~Dr. Silas Weir Mitchell, 1871
Wear and Tear, or Hints for the Overworked

It goes far beyond diet or any other single factor. There has been a diversity of stressors that continued to amass over the centuries of tumultuous change. The exhaustion of modern man (and typically the focus has been on men) had been building up for generations upon generations before it came to feel like a world-shaking crisis in the new industrialized world. The lens of neurasthenia was an attempt to grapple with what had changed, but the focus was too narrow. With the plague of neurasthenia, the atomized, commercialized man and woman couldn’t hold together. And so there was a temptation toward nationalistic projects, including wars, to revitalize the ailing soul and to suture the gash of social division and disarray. But this further wrenched out of alignment the traditional order that had once held society together, and what was lost mostly went unrecognized. The individual was brought into the foreground of public thought, a lone protagonist in a social Darwinian world. In this melodramatic narrative of struggle and self-assertion, many individuals didn’t fare so well, and everything else suffered in their wake.

Tom Lutz writes that, “By 1903, neurasthenic language and representations of neurasthenia were everywhere: in magazine articles, fiction, poetry, medical journals and books, in scholarly journals and newspaper articles, in political rhetoric and religious discourse, and in advertisements for spas, cures, nostrums, and myriad other products in newspapers, magazines and mail-order catalogs” (American Nervousness, 1903, p. 2).

There was a sense of moral decline that was hard to grasp, although some people like Weston A. Price tried to dig down into concrete explanations of what had gone so wrong, namely the social and psychological changes observable during mass urbanization and industrialization. He was far from alone in his inquiries, having built on the prior observations of doctors, anthropologists, and missionaries. Other doctors and scientists were looking into the influences of diet in the mid-1800s and, by the 1880s, scientists were exploring a variety of biological theories. Their inability to pinpoint the cause maybe had more to do with their lack of a needed framework, as they touched upon numerous facets of biological functioning:

“Not surprisingly, laboratory experiments designed to uncover physiological changes in the nerve cell were inconclusive. European research on neurasthenics reported such findings as loss of elasticity of blood vessels, thickening of the cell wall, changes in the shape of nerve cells, or nerve cells that never advanced beyond an embryonic state. Another theory held that an overtaxed organism cannot keep up with metabolic requirements, leading to inadequate cell nutrition and waste excretion. The weakened cells cannot develop properly, while the resulting build-up of waste products effectively poisons the cells (so-called “autointoxication”). This theory was especially attractive because it seemed to explain the extreme diversity of neurasthenic symptoms: weakened or poisoned cells might affect the functioning of any organ in the body. Furthermore, “autointoxicants” could have a stimulatory effect, helping to account for the increased sensitivity and overexcitability characteristic of neurasthenics.” (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia)

This early scientific research could not lessen the mercurial sense of unease, as neurasthenia was from its inception a broad category that captured some greater shift in public mood, even as it so powerfully shaped the individual’s health. For all the effort, there were as many theories about neurasthenia as there were symptoms. Deeper insight was required. “[I]f a human being is a multiformity of mind, body, soul, and spirit,” writes Preston, “you don’t achieve wholeness or fulfillment by amputating or suppressing one or more of these aspects, but only by an effective integration of the four aspects.” But integration is easier said than done.

The modern human hasn’t been suffering from mere psychic wear and tear, for the individual body itself has been showing signs of sickness, as the diseases of civilization have become harder and harder to ignore. On a societal level of human health, I’ve previously shared passages from Lears (see here) — he discusses the vitalist impulse that was the response to the turmoil; vitalism often was explored in terms of physical health as its most apparent manifestation, although social and spiritual health were just as often spoken of in the same breath. The whole person was under assault by an accumulation of stressors, and the increasingly isolated individual didn’t have the resources to fight them off.

By the way, this was far from limited to America. Europeans picked up the discussion of neurasthenia and took it in other directions, often with less optimism about progress and with some thinkers emphasizing social interpretations that laid specific blame on hyper-individualism (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia). Thoughts on neurasthenia became mixed up with earlier speculations on nostalgia and romanticized notions of rural life. More importantly, Russian thinkers in particular understood that the problems of modernity weren’t limited to the upper classes but extended across entire populations, a result of how societies had been turned on their heads during that fractious century of revolutions.

In looking around, I came across some other interesting material. In the 1901 Nervous and Mental Diseases by Archibald Church and Frederick Peterson, the authors, in the chapter on “Mental Disease”, are keen to further the description, categorization, and labeling of ‘insanity’. And I noted their concern with physiological asymmetry, something shared later by Price, among many others going back to the prior century.

Maybe asymmetry was not only indicative of developmental issues but also symbolic of a deeper imbalance. The attempts at phrenological analysis of psychiatric, criminal, and anti-social behavior were off-base; and, despite the bigotry and proto-genetic determinism among racists using these kinds of ideas, there is a simple truth about health in relationship to physiological development, most easily observed in bone structure. But it would take many generations to understand the deeper scientific causes, including nutrition (e.g., Price’s discovery of vitamin K2, what he called Activator X), parasites, toxins, and epigenetics. Church and Peterson did acknowledge that this went beyond mere individual or even familial issues: “The proportion of the insane to normal individuals may be stated to be about 1 to 300 of the population, though this proportion varies somewhat within narrow limits among different races and countries. It is probable that the intemperate use of alcohol and drugs, the spreading of syphilis, and the overstimulation in many directions of modern civilization have determined an increase difficult to estimate, but nevertheless palpable, of insanity in the present century as compared with past centuries.”

Also, there is the 1902 The Journal of Nervous and Mental Disease: Volume 29, edited by William G. Spiller. There is much discussion in there about how anxiety was observed, diagnosed, and treated at the time. Some of the case studies make for a fascinating read; check out: “Report of a Case of Epilepsy Presenting as Symptoms Night Terrors, Impellant Ideas, Complicated Automatisms, with Subsequent Development of Convulsive Motor Seizures and Psychical Aberration” by W. K. Walker. This reminds me of the case that influenced Sigmund Freud and Carl Jung, Daniel Paul Schreber’s 1903 Memoirs of My Nervous Illness.

Talk about “a disintegration of the personality and character structure of Modern Man and mental-rational consciousness,” as Scott Preston put it. He goes on to say that, “The individual is not a natural thing. There is an incoherency in Margaret Thatcher’s view of things when she infamously declared “there is no such thing as society” — that she saw only individuals and families, that is to say, atoms and molecules.” Her saying that really did capture the mood of the society she denied existing. Even the family was shrunk down to the ‘nuclear’. To state there is no society is to declare that there is also no extended family, no kinship, no community, that there is no larger human reality of any kind. Ironically, in this pseudo-libertarian sentiment, there is nothing holding the family together other than government laws imposing strict control of marriage and parenting, where common finances lock two individuals together under the rule of capitalist realism (the only larger realities involved are inhuman systems) — compare high-trust societies such as the Nordic countries, where the definition and practice of family life is less legalistic (Nordic Theory of Love and Individualism).

* * *

“It is easy, as we can see, for a barbarian to be healthy; for a civilized man the task is hard. The desire for a powerful and uninhibited ego may seem to us intelligible, but, as is shown by the times we live in, it is in the profoundest sense antagonistic to civilization.”
~Sigmund Freud, 1938
An Outline of Psychoanalysis

“Consciousness is a very recent acquisition of nature, and it is still in an “experimental” state. It is frail, menaced by specific dangers, and easily injured.”
~Carl Jung, 1961
Man and His Symbols
Part 1: Approaching the Unconscious
The importance of dreams

The individual consumer-citizen as a legal member of a family unit has to be created and then controlled, as it is a rather unstable atomized identity. “The idea of the “individual”,” Preston says, “has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” That is partly the reason for the heavy focus on the body, an attempt to make concrete the individual in order to hold together the splintered self — great analysis of this can be found in Lewis Hyde’s Trickster Makes This World: “an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap. Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman)” (see one of my posts about it: Lock Without a Key). Along with increasing authoritarianism, there was increasing medicalization and rationalization — to try to make sense of what was senseless.

A specific example of this change can be found in Dr. Frederick Hollick (1818-1900), a popular writer and speaker on medicine and health — his “links were to the free-thinking tradition, not to Christianity” (Helen Lefkowitz Horowitz, Rewriting Sex). Under the influence of Mesmerism and animal magnetism, he studied and wrote about what was given the more scientific-sounding names of electrotherapeutics, galvanism, and electro-galvanism. Hollick was an English follower of the Welsh-born industrialist and socialist Robert Owen, whom he literally followed to the United States, where Owen had started the utopian community New Harmony — a Southern Indiana village bought from the utopian German Harmonists and then filled with brilliant and innovative minds but lacking in practical know-how about running a self-sustaining community (Abraham Lincoln, later a friend of the Owen family, recalled seeing as a boy the boat full of books heading to New Harmony).

“As had Owen before him, Hollick argued for the positive value of sexual feeling. Not only was it neither immoral nor injurious, it was the basis for morality and society. […] In many ways, Hollick was a sexual enthusiast” (Horowitz). These were the social circles of Abraham Lincoln, as he personally knew free-love advocates; that is why early Republicans were often referred to as ‘Red Republicans’, the ‘Red’ indicating radicalism as it still does to this day. Hollick wasn’t the first to be a sexual advocate nor, of course, would he be the last — preceding him were Sarah Grimke (1837, Equality of the Sexes) and Charles Knowlton (1839, The Private Companion of Young Married People), Hollick having been “a student of Knowlton’s work” (Debran Rowland, The Boundaries of Her Body); and following him were two better-known figures: the previously mentioned Bernarr Macfadden (1868-1955), the first major health and fitness guru, and Wilhelm Reich (1897-1957), the less respectable member of the trinity formed with Sigmund Freud and Carl Jung. Sexuality became a symbolic issue of politics and health, partly because of increasing scientific knowledge but also because of the increasing marketization of products such as birth control (with public discussion of contraceptives happening in the late 1700s and advances in contraceptive production in the early 1800s), the latter being quite significant as it meant individuals could control pregnancy, which was particularly relevant to women. It should be noted that Hollick promoted the ideal of female sexual autonomy, that sex should be assented to and enjoyed by both partners.

This growing concern with sexuality began with the growing middle class in the decades following the American Revolution. Among much else, it was related to the post-revolutionary focus on parenting and the perceived need for raising republican citizens — this formed an audience far beyond radical libertinism and free-love. Expert advice was needed for the new bourgeois family life, as part of the ‘civilizing process’ that increasingly took hold at that time, with not only sexual manuals but also parenting guides, health pamphlets, books of manners, cookbooks, diet books, etc. — cut off from the roots of traditional community and kinship, the modern individual no longer trusted inherited wisdom and so needed to be taught how to live, how to behave and relate (Norbert Elias, The Civilizing Process, & Society of Individuals; Bruce Mazlish, Civilization and Its Contents; Keith Thomas, In Pursuit of Civility; Stephen Mennell, The American Civilizing Process; Cas Wouters, Informalization; Jonathan Fletcher, Violence and Civilization; François Dépelteau & T. Landini, Norbert Elias and Social Theory; Rob Watts, States of Violence and the Civilising Process; Pieter Spierenburg, Violence and Punishment; Steven Pinker, The Better Angels of Our Nature; Eric Dunning & Chris Rojek, Sport and Leisure in the Civilizing Process; D. E. Thiery, Polluting the Sacred; Helmut Kuzmics & Roland Axtmann, Authority, State and National Character; Mary Fulbrook, Un-Civilizing Processes?; John Zerzan, Against Civilization; Michel Foucault, Madness and Civilization; Dennis Smith, Norbert Elias and Modern Social Theory; Stjepan Mestrovic, The Barbarian Temperament; Thomas Salumets, Norbert Elias and Human Interdependencies). Along with the rise of science, this situation promoted the role of the public intellectual that Hollick effectively took advantage of; after the failure of Owen’s utopian experiment, he went on the lecture circuit, which brought on legal cases in the unsuccessful attempt to silence him, the kind of persecution that Reich also later endured.

To put it in perspective, this Antebellum era of public debate and public education on sexuality coincided with other changes. Following revolutionary-era feminism (e.g., Mary Wollstonecraft), the ‘First Wave’ of organized feminists emerged generations later with the Seneca Falls Convention in 1848 and, in that movement, there was a strong abolitionist impulse. This was part of the rise of ideological -isms in the North that so concerned the Southern aristocrats who wanted to maintain their hierarchical control of the entire country, the control they were quickly losing with the shift of power in the Federal government. A few years before that, in 1844, a more effective condom was developed using vulcanized rubber, although condoms had been on the market since the previous decade; also in the 1840s, the vaginal sponge became available. Interestingly, many feminists were as much against contraceptives as they were against abortions. These were far from mere practical issues, as politics imbued every aspect, and some feminists worried about how this might lessen the role of women and motherhood in society if sexuality were divorced from pregnancy.

This was at a time when the abortion rate was sky-rocketing, indicating that most women held other views, since large farm families were less needed with the increase of both industrialized urbanization and industrialized farming. “Yet we also know that thousands of women were attending lectures in these years, lectures dealing, in part, with fertility control. And rates of abortion were escalating rapidly, especially, according to historian James Mohr, the rate for married women. Mohr estimates that in the period 1800-1830, perhaps one out of every twenty-five to thirty pregnancies was aborted. Between 1850 and 1860, he estimates, the ratio may have been one out of every five or six pregnancies. At mid-century, more than two hundred full-time abortionists reportedly worked in New York City” (Polly F. Radosh, “Abortion: A Sociological Perspective”, from Interdisciplinary Views on Abortion, ed. by Susan A. Martinelli-Fernandez, Lori Baker-Sperry, & Heather McIlvaine-Newsad). Other sources concur and extend this pattern of high abortion rates into the early 20th century: “Some have estimated that between 20-35 percent of 19th century pregnancies were terminated as a means of restoring “menstrual regularity” (Luker, 1984, p. 18-21). About 20 percent of pregnancies were aborted as late as in the 1930s (Tolnai, 1939, p. 425)” (Rickie Solinger, Pregnancy and Power, p. 61).

In the unGodly and unChurched period of early America (“We forgot.”), organized religion was weak and “premarital sex was typical, many marriages following after pregnancy, but some people simply lived in sin. Single parents and ‘bastards’ were common” (A Vast Experiment). Abortions were so common at the time that abortifacients were advertised in major American newspapers, something that is never seen today. “Abortifacients were hawked in store fronts and even door to door. Vendors openly advertised their willingness to end women’s pregnancies” (Erin Blakemore, The Criminalization of Abortion Began as a Business Tactic). By the way, the oldest of the founding fathers, Benjamin Franklin, published in 1748 material about traditional methods of at-home abortions, what he referred to as ‘suppression of the courses’ (Molly Farrell, Ben Franklin Put an Abortion Recipe in His Math Textbook; Emily Feng, Benjamin Franklin gave instructions on at-home abortions in a book in the 1700s). It was a reprinting of material from 1734. “While “suppression of the courses” can apply to any medical condition that results in the suspension of one’s menstrual cycle, the entry specifically refers to “unmarried women.” Described as a “misfortune,” it recommends a number of known abortifacients from that time, like pennyroyal water and bellyache root, also known as angelica” (Nur Ibrahim, Did Ben Franklin Publish a Recipe in a Math Textbook on How to Induce Abortion?).

This is unsurprising, as abortifacients have been known for millennia, recorded in ancient texts from diverse societies, and they probably were common knowledge prior to any written language, considering abortifacients are used by many hunter-gatherer tribes who need birth control to space out pregnancies in order to avoid malnourished babies, among other reasons. This is true within the Judeo-Christian tradition as well, such as where the Old Testament gives an abortion recipe for when a wife gets pregnant from an affair (Numbers 5:11-31). Patriarchal social dominators sought to further control women not necessarily for religious reasons, but more because medical practice was becoming professionalized by men who wanted to eliminate the business competition of female doctors, midwives, and herbalists. “To do so, they challenged common perceptions that a fetus was not a person until the pregnant mother felt it “quicken,” or move, inside their womb. In a time before sonograms, this was often the only way to definitively prove that a pregnancy was underway. Quickening was both a medical and legal concept, and abortions were considered immoral or illegal only after quickening. Churches discouraged the practice, but made a distinction between a woman who terminated her pregnancy pre- or post-quickening” (Erin Blakemore). Yet these conservative authoritarians claimed, and still claim, to speak on behalf of some vague and amorphous concept of Western Civilization and Christendom.

This is a great example of how, through the power of charismatic demagogues and Machiavellian social dominators, modern reactionary ideology obscures the past with deceptive nostalgia and replaces the traditional with historical revisionism. The thing is, until the modern era, abortifacients and other forms of birth control weren’t politicized, much less under the purview of judges. They were practical concerns that were largely determined privately and personally, or else determined informally within communities and families. “Prior to the formation of the AMA, decisions related to pregnancy and abortion were made primarily within the domain and control of women. Midwives and the pregnant women they served decided the best course of action within extant knowledge of pregnancy. Most people did not view what would currently be called first trimester abortion as a significant moral issue. […] A woman’s awareness of quickening indicated a real pregnancy” (Polly F. Radosh). Yet something did change as birth control improved in its efficacy and became ever more common, or else more out in the open, making it a much more public and politicized issue, not to mention one exacerbated by capitalist markets and mass media.

Premarital sex or, heck, even marital sex no longer inevitably meant birth; and with contraceptives, unwanted pregnancies often could be prevented entirely. Maybe this is why fertility had been declining for so long, and it is definitely the reason there was a mid-19th-century moral panic. “Extending the analysis back further, the White fertility rate declined from 7.04 in 1800 to 5.42 in 1850, to 3.56 in 1900, and 2.98 in 1950. Thus, the White fertility declined for nearly all of American history but may have bottomed out in the 1980s. Black fertility has also been declining for well over 150 years, but it may very well continue to do so in the coming decades” (Ideas and Data, Sex, Marriage, and Children: Trends Among Millennial Women). If this is a crisis, it started pretty much at the founding of the country. And if we had reliable data before that, we might see the trend having originated in the colonial era or maybe back in late feudalism during the enclosure movement that destroyed traditional rural communities and kinship groups. Early Americans, by today’s standards of the culture wars, were not good Christians — many visiting Europeans at the time saw them as uncouth heathens, and quite dangerous at that, given the common American practice of toting around guns and knives, ever ready for a fight, whereas carrying weapons had been made illegal in England. In The Churching of America, Roger Finke and Rodney Stark write (pp. 25-26):

“Americans are burdened with more nostalgic illusions about the colonial era than about any other period in their history. Our conceptions of the time are dominated by a few powerful illustrations of Pilgrim scenes that most people over forty stared at year after year on classroom walls: the baptism of Pocahontas, the Pilgrims walking through the woods to church, and the first Thanksgiving. Had these classroom walls also been graced with colonial scenes of drunken revelry and barroom brawling, of women in risque ball-gowns, of gamblers and rakes, a better balance might have been struck. For the fact is that there never were all that many Puritans, even in New England, and non-Puritan behavior abounded. From 1761 through 1800 a third (33.7%) of all first births in New England occurred after less than nine months of marriage (D. S. Smith, 1985), despite harsh laws against fornication. Granted, some of these early births were simply premature and do not necessarily show that premarital intercourse had occurred, but offsetting this is the likelihood that not all women who engaged in premarital intercourse would have become pregnant. In any case, single women in New England during the colonial period were more likely to be sexually active than to belong to a church — in 1776 only about one out of five New Englanders had a religious affiliation. The lack of affiliation does not necessarily mean that most were irreligious (although some clearly were), but it does mean that their faith lacked public expression and organized influence.”

Though marriage remained important as an ideal in American culture, what changed was that procreative control became increasingly available — with fewer accidental pregnancies and more abortions, a powerful motivation for marriage disappeared. Unsurprisingly, at the same time, there were increasing worries about the breakdown of community and family, concerns that would turn into moral panic at various points. Antebellum America was in turmoil. This was concretely exemplified by the dropping birth rate that was already noticeable by the mid-19th century (Timothy Crumrin, “Her Daily Concern:” Women’s Health Issues in Early 19th-Century Indiana) and was nearly halved from 1800 to 1900 (Debran Rowland, The Boundaries of Her Body). “The late 19th century and early 20th saw a huge increase in the country’s population (nearly 200 percent between 1860 and 1910) mostly due to immigration, and that population was becoming ever more urban as people moved to cities to seek their fortunes—including women, more of whom were getting college educations and jobs outside the home” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). It was a period of crisis, not all that different from our present crisis, including the fear that the low birth rate of native-born white Americans, especially the supposedly endangered species of whites/WASPs, meant being overtaken by the allegedly dirty hordes of blacks, ethnics, and immigrants (i.e., replacement theory); at a time when Southern and Eastern Europeans, and even the Irish, were questionable in their whiteness, particularly if Catholic (Aren’t Irish White?).
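As a quick aside, the arithmetic behind the figures quoted in the passages above is easy to check. Here is a minimal sketch in Python (the numbers come straight from the cited sources; the function name and the use of range midpoints for Mohr’s estimates are my own, purely for illustration), confirming that the white fertility rate was indeed “nearly halved” from 1800 to 1900 and that Mohr’s estimates imply roughly a fivefold rise in the abortion ratio:

def percent_decline(start, end):
    # Percentage drop from start to end.
    return (start - end) / start * 100

# White fertility rate: 7.04 (1800) down to 3.56 (1900), per the Ideas and Data figures.
print(round(percent_decline(7.04, 3.56), 1))  # 49.4 -- i.e., nearly halved

# Mohr's estimates: about 1 in 25-30 pregnancies aborted (1800-1830)
# versus 1 in 5-6 (1850-1860); midpoints of each range assumed here.
print(round((1 / 5.5) / (1 / 27.5), 1))  # 5.0 -- roughly a fivefold increase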

The promotion of birth control was considered a genuine threat to American society, maybe to all of Western Civilization. It was most directly a threat to traditional gender roles. Women could better control when they got pregnant, a decisive factor in the phenomenon of larger numbers of women entering college and the workforce. And with an epidemic of neurasthenia, this dilemma was worsened by the crippling effeminacy that neutered masculine potency. Was modern man, specifically the white ruling elite, up for the task of carrying on Western Civilization?

“Indeed, civilization’s demands on men’s nerve force had left their bodies positively effeminate. According to Beard, neurasthenics had the organization of “women more than men.” They possessed “a muscular system comparatively small and feeble.” Their dainty frames and feeble musculature lacked the masculine vigor and nervous reserves of even their most recent forefathers. “It is much less than a century ago, that a man who could not [drink] many bottles of wine was thought of as effeminate—but a fraction of a man.” No more. With their dwindling reserves of nerve force, civilized men were becoming increasingly susceptible to the weakest stimulants until now, “like babes, we find no safe retreat, save in chocolate and milk and water.” Sex was as debilitating as alcohol for neurasthenics. For most men, sex in moderation was a tonic. Yet civilized neurasthenics could become ill if they attempted intercourse even once every three months. As Beard put it, “there is not force enough left in them to reproduce the species or go through the process of reproducing the species.” Lacking even the force “to reproduce the species,” their manhood was clearly in jeopardy.” (Gail Bederman, Manliness and Civilization, pp. 87-88)

This led to a backlash that began before the Civil War with the early obscenity and abortion laws, but it went into high gear with the 1873 Comstock laws that effectively shut down the free market of both ideas and products related to sexuality, including sex toys. This made it near impossible for most women to learn about birth control or obtain contraceptives and abortifacients. There was a felt need to restore order, and that meant the white male order of the WASP middle-to-upper classes, especially with the end of slavery, mass immigration of ethnics, and urbanization and industrialization. The crisis wasn’t only ideological or political. The entire world had been falling apart for centuries with the ending of feudalism and the ancien regime, the last remnants of which in America were maintained through slavery. Motherhood being the backbone of civilization, it was believed that women’s sexuality had to be controlled and, unlike so much else that was out of control, it actually could be controlled through enforcement of laws.

Outlawing abortions is a particularly interesting example of social control. Even with laws in place, abortions remained commonly practiced by local doctors, even in many rural areas (American Christianity: History, Politics, & Social Issues). Corey Robin argues that the strategy hasn’t been to deny women’s agency but to assert their subordination (Denying the Agency of the Subordinate Class). This is why, according to Robin, abortion laws were designed primarily to target male doctors, though in practice they rarely did, and not their female patients (at least once women had been largely removed from medical and healthcare practice, beyond their role as nurses who assisted male doctors). Everything comes down to agency or its lack or loss, but our entire sense of agency is out of accord with our own human nature. We seek to control what is outside of us, including others, for our own sense of self is out of control. The legalistic worldview is inherently authoritarian, at the heart of what Julian Jaynes proposes as the post-bicameral project of consciousness, the metaphorically contained self. But this psychic container is weak and keeps leaking all over the place.

* * *

“It is clear that if it goes on with the same ruthless speed for the next half century . . . the sane people will be in a minority at no very distant day.”
~Henry Maudsley, 1877
“The Alleged Increase of Insanity”
Journal of Mental Science, Volume 23, Issue 101

“If this increase was real, we have argued, then we are now in the midst of an epidemic of insanity so insidious that most people are even unaware of its existence.”
~Edwin Fuller Torrey & Judy Miller, 2001
The Invisible Plague: The Rise of Mental Illness from 1750 to the Present

To bring it back to the original inspiration, Scott Preston wrote: “Quite obviously, our picture of the human being as an indivisible unit or monad of existence was quite wrong-headed, and is not adequate for the generation and re-generation of whole human beings. Our self-portrait or self-understanding of “human nature” was deficient and serves now only to produce and reproduce human caricatures. Many of us now understand that the authentic process of individuation hasn’t much in common at all with individualism and the supremacy of the self-interest.” The failure we face is that of identity, of our way of being in the world. As with neurasthenia in the past, we are now in a crisis of anxiety and depression, along with yet another moral panic about the declining white race. So, we get the likes of Steve Bannon, Donald Trump, and Jordan Peterson. We failed to resolve past conflicts and so they keep re-emerging. Over this past century, we have continued to be in a crisis of identity (Mark Greif, The Age of the Crisis of Man).

“In retrospect, the omens of an impending crisis and disintegration of the individual were rather obvious,” Preston points out. “So, what we face today as “the crisis of identity” and the cognitive dissonance of “the New Normal” is not something really new — it’s an intensification of that disintegrative process that has been underway for over four generations now. It has now become acute. This is the paradox. The idea of the “individual” has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” We never were individuals. It was just a story we told ourselves, but there are others that could be told. Scott Preston offers an alternative narrative, that of individuation.

* * *

I found some potentially interesting books while skimming material on Google Books in my research into Frederick Hollick and related topics. Among the titles below, I’ll share some text from one of them because it offers a good summary of sexuality at the time, specifically women’s sexuality. Obviously, it went far beyond sexuality itself; going by my own theorizing, I’d say it is yet another example of symbolic conflation, considering its direct relationship to abortion.

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland
p. 34

WOMEN AND THE WOMB: The Emerging Birth Control Debate

The twentieth century dawned in America on a falling white birth rate. In 1800, an average of seven children were born to each “American-born white wife,” historians report. 29 By 1900, that number had fallen to roughly half. 30 Though there may have been several factors, some historians suggest that this decline—occurring as it did among young white women—may have been due to the use of contraceptives or abstinence, though few talked openly about it. 31

“In spite of all the rhetoric against birth control, the birthrate plummeted in the late nineteenth century in America and Western Europe (as it had in France the century before); family size was halved by the time of World War I,” notes Shari Thurer in The Myth of Motherhood. 32

As issues go, the “plummeting birthrate” among whites was a powder keg, sparking outcry as the “failure” of the privileged class to have children was contrasted with the “failure” of poor immigrants and minorities to control the number of children they were having. Criticism was loud and rampant. “The upper classes started the trend, and by the 1880s the swarms of ragged children produced by the poor were regarded by the bourgeoisie, so Emile Zola’s novels inform us, as evidence of the lower order’s ignorance and brutality,” Thurer notes. 33

But the seeds of this then-still nearly invisible movement had been planted much earlier. In the late 1700s, British political theorists began disseminating information on contraceptives as concerns of overpopulation grew among some classes. 34 Despite the separation of an ocean, by the 1820s, this information was “seeping” into the United States.

“Before the introduction of the Comstock laws, contraceptive devices were openly advertised in newspapers, tabloids, pamphlets, and health magazines,” Yalom notes. “Condoms had become increasingly popular since the 1830s, when vulcanized rubber (the invention of Charles Goodyear) began to replace the earlier sheepskin models.” 35 Vaginal sponges also grew in popularity during the 1840s, as women traded letters and advice on contraceptives. 36 Of course, prosecutions under the Comstock Act went a long way toward chilling public discussion.

Though Margaret Sanger’s is often the first name associated with the dissemination of information on contraceptives in the early United States, in fact, a woman named Sarah Grimke preceded her by several decades. In 1837, Grimke published the Letters on the Equality of the Sexes, a pamphlet containing advice about sex, physiology, and the prevention of pregnancy. 37

Two years later, Charles Knowlton published The Private Companion of Young Married People, becoming the first physician in America to do so. 38 Near this time, Frederick Hollick, a student of Knowlton’s work, “popularized” the rhythm method and douching. And by the 1850s, a variety of material was being published providing men and women with information on the prevention of pregnancy. And the advances weren’t limited to paper.

“In 1846, a diaphragm-like article called The Wife’s Protector was patented in the United States,” according to Marilyn Yalom. 39 “By the 1850s dozens of patents for rubber pessaries ‘inflated to hold them in place’ were listed in the U.S. Patent Office records,” Janet Farrell Brodie reports in Contraception and Abortion in 19th Century America. 40 And, although many of these early devices were often more medical than prophylactic, by 1864 advertisements had begun to appear for “an India-rubber contrivance” similar in function and concept to the diaphragms of today. 41

“[B]y the 1860s and 1870s, a wide assortment of pessaries (vaginal rubber caps) could be purchased at two to six dollars each,” says Yalom. 42 And by 1860, following publication of James Ashton’s Book of Nature, the five most popular ways of avoiding pregnancy—“withdrawal, and the rhythm methods”—had become part of the public discussion. 43 But this early contraceptives movement in America would prove a victim of its own success. The openness and frank talk that characterized it would run afoul of the burgeoning “purity movement.”

“During the second half of the nineteenth century, American and European purity activists, determined to control other people’s sexuality, railed against male vice, prostitution, the spread of venereal disease, and the risks run by a chaste wife in the arms of a dissolute husband,” says Yalom. “They agitated against the availability of contraception under the assumption that such devices, because of their association with prostitution, would sully the home.” 44

Anthony Comstock, a “fanatical figure,” some historians suggest, was a charismatic “purist,” who, along with others in the movement, “acted like medieval Christians engaged in a holy war,” Yalom says. 45 It was a successful crusade. “Comstock’s dogged efforts resulted in the 1873 law passed by Congress that barred use of the postal system for the distribution of any ‘article or thing designed or intended for the prevention of conception or procuring of abortion’,” Yalom notes.

Comstock’s zeal would also lead to his appointment as a special agent of the United States Post Office with the authority to track and destroy “illegal” mailing, i.e., mail deemed to be “obscene” or in violation of the Comstock Act. Until his death in 1915, Comstock is said to have been energetic in his pursuit of offenders, among them Dr. Edward Bliss Foote, whose articles on contraceptive devices and methods were widely published. 46 Foote was indicted in January of 1876 for dissemination of contraceptive information. He was tried, found guilty, and fined $3,000. Though donations of more than $300 were made to help defray costs, Foote was reportedly more cautious after the trial. 47 That “caution” spread to others, some historians suggest.

Disorderly Conduct: Visions of Gender in Victorian America
By Carroll Smith-Rosenberg

Riotous Flesh: Women, Physiology, and the Solitary Vice in Nineteenth-Century America
by April R. Haynes

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland

Rereading Sex: Battles Over Sexual Knowledge and Suppression in Nineteenth-century America
by Helen Lefkowitz Horowitz

Rewriting Sex: Sexual Knowledge in Antebellum America, A Brief History with Documents
by Helen Lefkowitz Horowitz

Imperiled Innocents: Anthony Comstock and Family Reproduction in Victorian America
by Nicola Kay Beisel

Against Obscenity: Reform and the Politics of Womanhood in America, 1873–1935
by Leigh Ann Wheeler

Purity in Print: Book Censorship in America from the Gilded Age to the Computer Age
by Paul S. Boyer

American Sexual Histories
edited by Elizabeth Reis

Wash and Be Healed: The Water-Cure Movement and Women’s Health
by Susan Cayleff

From Eve to Evolution: Darwin, Science, and Women’s Rights in Gilded Age America
by Kimberly A. Hamlin

Manliness and Civilization: A Cultural History of Gender and Race in the United States, 1880-1917
by Gail Bederman

One Nation Under Stress: The Trouble with Stress as an Idea
by Dana Becker

* * *

8/18/19 – Looking back at this piece, I realize there is so much that could be added to it. And it already is long. It’s a topic that would require writing a book to do it justice. And it is such a fascinating area of study with lines of thought going in numerous directions. But I’ll limit myself by adding only a few thoughts that point toward some of those other directions.

The topic of this post goes back to the Renaissance (Western Individuality Before the Enlightenment Age) and even earlier to the Axial Age (Hunger for Connection), a thread that can be traced back through history following the collapse of what Julian Jaynes called bicameral civilization in the Bronze Age. At the beginning of modernity, the psychic tension erupted in many ways that were increasingly dramatic and sometimes disturbing, from revolution to media panics (Technological Fears and Media Panics). I see all of this as having to do with the isolating and anxiety-inducing effects of hyper-individualism. The rigid egoic boundaries required by our social order are simply tiresome (Music and Dance on the Mind), as Julian Jaynes conjectured:

“Another advantage of schizophrenia, perhaps evolutionary, is tirelessness. While a few schizophrenics complain of generalized fatigue, particularly in the early stages of the illness, most patients do not. In fact, they show less fatigue than normal persons and are capable of tremendous feats of endurance. They are not fatigued by examinations lasting many hours. They may move about day and night, or work endlessly without any sign of being tired. Catatonics may hold an awkward position for days that the reader could not hold for more than a few minutes. This suggests that much fatigue is a product of the subjective conscious mind, and that bicameral man, building the pyramids of Egypt, the ziggurats of Sumer, or the gigantic temples at Teotihuacan with only hand labor, could do so far more easily than could conscious self-reflective men.”

On the Facebook page for Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind, Luciano Imoto made the same basic point in speaking about hyper-individualism. He stated that “In my point of view the constant use of memory (and the hippocampus) to sustain a fictitious identity of “self/I” could be deleterious to the brain’s health at long range (considering that the brain consumes about 20 percent of the body’s energy).” I’m sure others have made similar observations. This strain on the psyche has been building up for a long time, but it became particularly apparent in the 19th century, to such an extent it was deemed necessary to build special institutions to house and care for the broken and deficient humans who couldn’t handle modern life or else couldn’t appropriately conform to the ever more oppressive social norms (Mark Jackson, The Borderland of Imbecility). As radical as some consider Jaynes to be, insights like this were hardly new — in 1867, Henry Maudsley offered insight laced with bigotry, from The Physiology and Pathology of Mind:

“There are general causes, such as the state of civilization in a country, the form of its government and its religion, the occupation, habits, and condition of its inhabitants, which are not without influence in determining the proportion of mental diseases amongst them. Reliable statistical data respecting the prevalence of insanity in different countries are not yet to be had; even the question whether it has increased with the progress of civilization has not been positively settled. Travellers are certainly agreed that it is a rare disease amongst barbarous people, while, in the different civilized nations of the world, there is, so far as can be ascertained, an average of about one insane person in five hundred inhabitants. Theoretical considerations would lead to the expectation of an increased liability to mental disorder with an increase in the complexity of the mental organization: as there is a greater liability to disease, and the possibility of many more diseases, in a complex organism like the human body, where there are many kinds of tissues and an orderly subordination of parts, than in a simple organism with less differentiation of tissue and less complexity of structure; so in the complex mental organization, with its manifold, special, and complex relations with the external, which a state of civilization implies, there is plainly the favourable occasion of many derangements. The feverish activity of life, the eager interests, the numerous passions, and the great strain of mental work incident to the multiplied industries and eager competition of an active civilization, can scarcely fail, one may suppose, to augment the liability to mental disease. On the other hand, it may be presumed that mental sufferings will be as rare in an infant state of society as they are in the infancy of the individual. That degenerate nervous function in young children is displayed, not in mental disorder, but in convulsions; that animals very seldom suffer from insanity; that insanity is of comparatively rare occurrence among savages; all these are circumstances that arise from one and the same fact—a want of development of the mental organization. There seems, therefore, good reason to believe that, with the progress of mental development through the ages, there is, as is the case with other forms of organic development, a correlative degeneration going on, and that an increase of insanity is a penalty which an increase of our present civilization necessarily pays. […]

“If we admit such an increase of insanity with our present civilization, we shall be at no loss to indicate causes for it. Some would no doubt easily find in over-population the prolific parent of this as of numerous other ills to mankind. In the fierce and active struggle for existence which there necessarily is where the claimants are many and the supplies are limited, and where the competition therefore is severe, the weakest must suffer, and some of them, breaking down into madness, fall by the wayside. As it is the distinctly manifested aim of mental development to bring man into more intimate, special, and complex relations with the rest of nature by means of patient investigations of physical laws, and a corresponding internal adaptation to external relations, it is no marvel, it appears indeed inevitable, that those who, either from inherited weakness or some other debilitating causes, have been rendered unequal to the struggle of life, should be ruthlessly crushed out as abortive beings in nature. They are the waste thrown up by the silent but strong current of progress; they are the weak crushed out by the strong in the mortal struggle for development; they are examples of decaying reason thrown off by vigorous mental growth, the energy of which they testify. Everywhere and always “to be weak is to be miserable.”

As civilization became complex, so did the human mind in having to adapt to it, and sometimes that hit a breaking point in individuals; or else what was previously considered normal behavior was now judged unacceptable, the latter explanation favored by Michel Foucault and Thomas Szasz (also see Bruce Levine’s article, Societies With Little Coercion Have Little Mental Illness). Whatever the explanation, something that once was severely abnormal had become normalized and, as it happened with insidious gradualism, few noticed or would accept what had changed: “Living amid an ongoing epidemic that nobody notices is surreal. It is like viewing a mighty river that has risen slowly over two centuries, imperceptibly claiming the surrounding land, millimeter by millimeter. . . . Humans adapt remarkably well to a disaster as long as the disaster occurs over a long period of time” (E. Fuller Torrey & Judy Miller, The Invisible Plague; also see Torrey’s Schizophrenia and Civilization); “At the end of the seventeenth century, insanity was of little significance and was little discussed. At the end of the eighteenth century, it was perceived as probably increasing and was of some concern. At the end of the nineteenth century, it was perceived as an epidemic and was a major concern. And at the end of the twentieth century, insanity was simply accepted as part of the fabric of life. It is a remarkable history.” All of these changes were happening mostly over generations and centuries, which left little if any living memory from when they began. Many thinkers like Torrey and Miller would be useful for fleshing this out, but here is a small sampling of authors and their books: Harold D. Foster’s What Really Causes Schizophrenia, Andrew Scull’s Madness in Civilization, Alain Ehrenberg’s The Weariness of the Self, etc.; and I shouldn’t ignore the growing field of Jaynesian scholarship, such as found in the books put out by the Julian Jaynes Society.

Besides social stress and societal complexity, there was much else that was changing. For example, increasingly concentrated urbanization and close proximity with other species meant ever more spread of infectious diseases and parasites (consider Toxoplasma gondii from domesticated cats; see E. Fuller Torrey’s Beasts of the Earth). Also, the 18th century saw the beginnings of industrialization with the related rise of toxins (Dan Olmsted & Mark Blaxill, The Age of Autism: Mercury, Medicine, and a Man-Made Epidemic). That worsened over the following century. Industrialization also transformed the Western diet. Sugar, having been introduced in the early colonial era, now was affordable and available to the general population. And wheat, once hard to grow and limited to the rich, also was becoming a widespread ingredient, with new milling methods allowing highly refined white flour, which made white bread popular (in the mid-1800s, Stanislas Tanchou did a statistical analysis that correlated the rate of grain consumption with the rate of cancer; and he observed that cancer, like insanity, spread along with civilization). For the first time in history, most Westerners were eating a very high-carb diet. This diet is addictive for a number of reasons, and it was combined with the introduction of addictive stimulants. As I argue, this profoundly altered neurocognitive functioning and behavior (The Agricultural Mind, “Yes, tea banished the fairies.”, Autism and the Upper Crust, & Diets and Systems).

This represents an ongoing project for me. And I’m in good company.