The Literal Metaphor of Sickness

I’ve written about Lenore Skenazy before. She is one of my mom’s favorite writers, and so my mom likes to share her articles with me. Skenazy has another piece on her usual topic, helicopter parents and their captive children. Today’s column, in the local newspaper (The Gazette), is titled “The irony of overprotection” (you can find it on the Creators website or from the GazetteXtra). She begins with a metaphor. In studying how leukemia is contracted, the scientist Mel Greaves found that two conditions are required. The first is a genetic susceptibility, which exists in only a certain number of kids, though it is far from uncommon. But that alone isn’t sufficient without the second factor.

There has to be an underdeveloped or compromised immune system. And sadly this too has become far from uncommon. Further evidence for the hygiene hypothesis keeps accumulating (it should be called the hygiene theory at this point). Basically, it is only by being exposed to germs that a child’s immune system experiences the healthy stress that spurs it into normal development. Without this, many are left plagued by ongoing sickness, allergies, and autoimmune conditions for the rest of their lives.

Parents have protected their children not only from the larger dangers and infinite risks of normal childhood: skinned knees from roughhousing, broken limbs from falling out of trees, hurt feelings from bullies, trauma from child molesters, murder by the roving bands of psychotic kidnappers who will sell your children on the black market, etc. Beyond such everyday fears, parents have also protected their kids from minor infections, endlessly applying anti-bacterial products and cocooning them in sterile spaces liberally doused with chemicals that kill all known microbial life forms. That is not a good thing, for the consequences are dire.

This is where the metaphor kicks in. Skenazy writes:

The long-term effects? Regarding leukemia, “when such a baby is eventually exposed to common infections, his or her unprimed immune system reacts in a grossly abnormal way,” says Greaves. “It overreacts and triggers chronic inflammation.”

Regarding plain old emotional resilience, what we might call “psychological inflammation” occurs when kids overreact to an unfamiliar or uncomfortable situation because they have been so sheltered from these. They feel unsafe, when actually they are only unprepared, because they haven’t been allowed the chance to develop a tolerance for some fears and frustrations. That means a minor issue can be enough to set a kid off — something we are seeing at college, where young people are at last on their own. There has been a surge in mental health issues on campuses.

It’s no surprise that anxiety would be spiking in an era when kids have had less chance to deal with minor risks from childhood on up.

There is only one minor point of disagreement I’d throw out: there is nothing metaphorical about this. Because of an antiseptic world and other causes (leaky gut, high-carb diets, sugar addiction, food additives, chemical exposure, etc.), the immune systems of many modern Americans are so dysfunctional and overreactive that they wreak havoc on the body. Chronic inflammation has been directly linked to, or otherwise associated with, about every major health issue you can think of.

This includes, by the way, neurocognitive conditions such as depression and anxiety, and much worse as well: schizophrenia, Alzheimer’s, and other disorders also often involve inflammation. When inflammation gets into the brain, the gut-brain axis, and/or the nervous system, major problems follow, with a diversity of symptoms that can be severe and life-threatening, and problematic on a social and psychological level as well. This new generation of children is literally being brain-damaged, psychologically maimed, and left in a fragile state. For many of them, their bodies and minds are not fully prepared to deal with the real world with normal, healthy responses. It is hard to manage the stresses of life when one is in a constant state of low-grade sickness that permanently sets the immune system on high, when even the most minor risks can endanger one’s well-being.

The least of our worries is that diseases like type 2 diabetes, which used to be called adult-onset diabetes because it was unknown among children, are now increasing among children. Adult illnesses will find their way earlier and earlier into young adulthood and childhood, and the diseases of the elderly will hit people in middle age or younger. This will be a health crisis that could bankrupt and cripple our society. But worse than that is the human cost of sickness and pain, struggle and suffering. We are forcing this fate onto the young generations. That is cruel beyond comprehension. We can barely imagine what it will mean across the entire society when it finally erupts as a crisis.

We’ve done this out of the ignorant good intentions of wanting to protect our children from anything that could touch them. It makes us feel better to have created a bubble world of innocence where children won’t have to learn from the mistakes and failures, harms and difficulties we experienced growing up. Instead, we’ve created something far worse for them.

Neolithic Troubles

Born Expecting the Pleistocene
by Mark Seely
p. 31

Not our natural habitat

The mismatch hypothesis

Our bodies including our brains—and thus our behavioral predispositions—have evolved in response to very specific environmental and social conditions. Many of those environmental and social conditions no longer exist for most of us. Our physiology and our psychology, all of our instincts and in-born social tendencies, are based on life in small semi-nomadic tribal groups of rarely more than 50 people. There is a dramatic mismatch between life in a crowded, frenetic, technology-based global civilization and the kind of life our biology and our psychology expects [14].

And we suffer serious negative consequences of this mismatch. A clear example can be seen in the obesity epidemic that has swept through developed nations in recent decades: our bodies evolved to meet energy demands in circumstances where the presence of food was less predictable and periods of abundance more variable. Because of this, we have a preference for calorie-dense food, we have a tendency to eat far more than we need, and our bodies are quick to hoard extra calories in the form of body fat.
This approach works quite well during a Pleistocene ice age, but it is maladaptive in our present food-saturated society—and so we have an obesity epidemic because of the mismatch between the current situation and our evolution-derived behavioral propensities with respect to food. Studies on Australian aborigines conducted in the 1980s, evaluating the health effects of the transition from traditional hunter-gatherer lifestyle to urban living, found clear evidence of the health advantages associated with a lifestyle consistent with our biological design [15]. More recent research on the increasingly popular Paleo-diet [16] has since confirmed wide-ranging health benefits associated with selecting food from a pre-agriculture menu, including cancer resistance, reduction in the prevalence of autoimmune disease, and improved mental health.

[14] Ornstein, R. & Ehrlich, P. (1989). New World, New Mind. New York: Simon & Schuster.
[15] O’Dea, K., Spargo, R., & Akerman, K. (1980). The effect of transition from traditional to urban life-style on the insulin secretory response in Australian Aborigines. Diabetes Care, 3(1), 31-37; O’Dea, K., White, N., & Sinclair, A. (1988). An investigation of nutrition-related risk factors in an isolated Aboriginal community in northern Australia: advantages of a traditionally-orientated life-style. The Medical Journal of Australia, 148(4), 177-80.
[16] E.g., Frassetto, L. A., Schloetter, M., Mietus-Snyder, M., Morris, R. C., & Sebastian, A. (2009). Metabolic and physiological improvements from consuming a Paleolithic, hunter-gatherer type diet. European Journal of Clinical Nutrition, 63, 947-955.

pp. 71-73

The mechanisms of cultural evolution can be seen in the changing patterns of foraging behavior in response to changes in food availability and changes in population density. Archaeological analyses suggest that there is a predictable pattern of dietary choice that emerges from the interaction among population density, relative abundance of preferred food sources, and factors that relate to the search and handling of various foods. [56] In general, diets become more varied, or broaden, as population increases and the preferred food becomes more difficult to obtain. When a preferred food source is abundant, the calories in the diet may consist largely of that one particular food. But as the food source becomes more difficult to obtain, less preferable foods will be included and the diet will broaden. Such dietary changes imply changes in patterns of behavior within the community—changes of culture.

Behavior ecologists and anthropologists have partitioned the foraging process into two components with respect to the cost-benefit analysis associated with dietary decisions: search and handling. [57] The search component of the cost-benefit ledger refers to the amount of work per calorie payoff (and other benefits such as the potential for enhanced social standing) associated with a food item’s abundance, distance, terrain, proximity of another group’s territory, water sources, etc. The handling component refers to the work per calorie payoff associated with getting the food into a state (location, form, etc.) in which it can be consumed. Search and handling considerations can be largely independent of each other. The residential permanence involved with the incorporation of agriculture reduces the search consideration greatly, and makes handling the primary consideration. Global industrial food economies change entirely the nature of both search and handling: handling in industrial society—from the perspective of the individual and the individual’s decision processes—is reduced largely to considerations of speed and convenience. The search component has been re-appropriated and refocused by corporate marketing, and reduced to something called shopping.

Domestication, hands down the most dramatic and far-reaching example of cultural evolution, emerges originally as a response to scarcity that is tied to a lack of mobility and an increase in population density. Domestication is a way of further broadening the diet when other local sources of food are already being maximally exploited. Initial experimentation with animal domestication “occurred in situations where forager diets were already quite broad and where the principle goal of domestication was the production of milk, an exercise that made otherwise unusable plants or plant parts available for human consumption. . . .” [58] The transition to life-ways based even partially on domestication has some counter-intuitive technological ramifications as well.

This leads to a further point about efficiency. It is often said that the adoption of more expensive subsistence technology marks an improvement in this aspect of food procurement: better tools make the process more efficient. This is true in the sense that such technology often enables its users to extract more nutrients per unit weight of resource processed or area of land harvested. If, on the other hand, the key criterion is the cost/benefit ratio, the rate of nutrient gained relative to the effort needed to acquire it, then the use of more expensive tools will often be associated with declines in subsistence efficiency. Increased investment in handling associated with the use of high-cost projectile weapons, in plant foods that require extensive tech-related processing, and in more intensive agriculture all illustrate this point. [59]

In modern times, thanks to the advent of—and supportive propaganda associated with—factory industrial agriculture, farming is coupled with ideas of plentitude and caloric abundance. However, in the absence of fossil energy and petroleum-based chemical fortification, farming is expensive in terms of the calories produced as a function of the amount of work involved. For example, “farmers grinding corn with hand-held stone tools can earn no more than about 1800 kcal per hour of total effort devoted to farming, and this from the least expensive cultivation technique.” [60] A successful fishing or bison hunting expedition is orders of magnitude more efficient in terms of the ratio of calories expended to calories obtained.

[56] Bird & O’Connell [Bird, D. W., & O’Connell, J. F. (2006). Behavioral ecology and archaeology. Journal of Archaeological Research, 14, 143-188]
[57] Ibid.
[58] Ibid, p. 152.
[59] Ibid, p. 153.
[60] Ibid, p. 151, italics in original.

pp. 122-123

The birth of the machine

The domestication frame

The Neolithic marks the beginnings of large scale domestication, what is typically referred to as the agricultural revolution. It was not really a revolution in that it occurred over an extended period of time (several thousand years) and in a mosaic piecemeal fashion, both in terms of the adoption of specific agrarian practices and in terms of specific groups of people who practiced them. Foraging lifestyles continue today, and represented the dominant lifestyle on the planet until relatively recently. The agricultural revolution was a true revolution, however, in terms of its consequences for the humans who adopted domestication-based life-ways, and for the rest of the natural world. The transition from nomadic and seminomadic hunting and gathering to sedentary agriculture is the most significant chapter in the chronicle of the human species. But it is clearly not a story of unmitigated success. Jared Diamond, who acknowledges somewhat the self-negating double-edge of technological “progress,” has called domestication the biggest mistake humans ever made.

That transition from hunting and gathering to agriculture is generally considered a decisive step in our progress, when we at last acquired the stable food supply and leisure time prerequisite to the great accomplishments of modern civilization. In fact, careful examination of that transition suggests another conclusion: for most people the transition brought infectious disease, malnutrition, and a shorter lifespan. For human society in general it worsened the relative lot of women and introduced class-based inequality. More than any other milestone along the path from chimpanzeehood to humanity, agriculture inextricably combines causes of our rise and our fall. [143]

The agricultural revolution had profoundly negative consequences for human physical, psychological, and social well being, as well as a wide-ranging negative impact on the planet.

For humans, malnutrition and the emergence of infectious disease are the most salient physiological results of an agrarian lifestyle. A large variety of foodstuffs and the inclusion of a substantial amount of meat make malnutrition an unlikely problem for hunter gatherers, even during times of relative food scarcity. Once the diet is based on a few select mono-cropped grains supplemented by milk and meat from nutritionally-inferior domesticated animals, the stage is set for nutritional deficit. As a result, humans are not as tall or broad in stature today as they were 25,000 years ago; and the mean age of death is lower today as well. [144] In addition, both the sedentism and population density associated with agriculture create the preconditions for degenerative and infectious disease. “Among the human diseases directly attributable to our sedentary lives in villages and cities are heart and vascular disorders, diabetes, stroke, emphysema,
hypertension, and cirrhoses [sic.] of the liver, which together cause 75 percent of the deaths in the industrial nations.” [145] The diet and activity level of a foraging lifestyle serve as a potent prophylactic against all of these common modern-day afflictions. Nomadic hunter-gatherers are by no means immune to parasitic infection and disease. But the spread of disease is greatly limited by low population density and by a regular change of habitation which reduced exposure to accumulated wastes. Both hunter-gatherers and agriculturalists are susceptible to zoonotic diseases carried by animals, but domestication reduces an animal’s natural immunity to disease and infection, creates crowded conditions that support the spread of disease among animal populations, and increases the opportunity for transmission to humans. In addition, permanent dwellings provide a niche for a new kind of disease-carrying animal specialized for symbiotic parasitic cohabitation with humans, the rat being among the most infamous.
Plagues and epidemic outbreaks were not a problem in the Pleistocene.

There is a significant psychological dimension to the agricultural revolution as well.
A foraging hunter-gatherer lifestyle frames natural systems in terms of symbiosis and interrelationship. Understanding subtle connections among plants, animals, geography,
and seasonal climate change is an important requisite of survival. Human agents are intimately bound to these natural systems and contemplate themselves in terms of these systems, drawing easy analogy between themselves and the natural communities around them, using animals, plants, and other natural phenomena as metaphor. The manipulative focus of domestication frames natural systems in antagonistic terms of control and resistance. “Agriculture removed the means by which men [sic.] could contemplate themselves in any other than terms of themselves (or machines). It reflected back upon nature an image of human conflict and competition . . . .” [146] The domestication frame changed our perceived relationship with the natural world,
and lies at the heart of our modern-day environmental woes. According to Paul Shepard,
with animal domestication we lost contact with an essential component of our human nature, the “otherness within,” that part of ourselves that grounds us to the rest of nature:

The transformation of animals through domestication was the first step in remaking them into subordinate images of ourselves—altering them to fit human modes and purposes. Our perception of not only ourselves but also of the whole of animal life was subverted, for we mistook the purpose of those few domesticates as the purpose of all. Plants never had for us the same heightened symbolic representation of purpose itself. Once we had turned animals into the means of power among ourselves and over the rest of nature, their uses made possible the economy of husbandry that would, with the addition of the agrarian impulse, produce those motives and designs on the earth contrary to respecting it. Animals would become “The Others.” Purposes of their own were not allowable, not even comprehensible. [147]

Domestication had a profound impact on human psychological development. Development—both physiological and psychological—is organized around a series of stages and punctuated by critical periods, windows of time in which the development and functional integration of specific systems are dependent upon external input of a designated type and quality. If the necessary environmental input for a given system is absent or of a sufficiently reduced quality, the system does not mature appropriately. This can have a snowball effect because the future development of other systems is almost always critically dependent on the successful maturation of previously developed systems. The change in focus toward the natural world along with the emergence of a new kind of social order interfered with epigenetic programs that evolved to anticipate the environmental input associated with a foraging lifestyle. The result was arrested development and a culture-wide immaturity:

Politically, agriculture required a society composed of members with the acumen of children. Empirically, it set about amputating and replacing certain signals and experiences central to early epigenesis. Agriculture not only infantilized animals by domestication, but exploited the infantile human traits of normal individual neoteny. The obedience demanded by the organization necessary for anything larger than the earliest village life, associated with the rise of a military caste, is essentially juvenile and submissive . . . . [148]

[143] Diamond (1992), p. 139. [Diamond, J. (1992). The Third Chimpanzee. New York: HarperCollins.]
[144] Shepard (1998) [Shepard, P. (1998). Coming Home to the Pleistocene. Washington, D.C.: Island Press]
[145] Ibid, p. 99.
[146] Shepard (1982), p. 114. [Shepard, P. (1982). Nature and Madness. Athens Georgia: University of Georgia Press]
[147] Shepard (1998), p. 128.
[148] Shepard (1982), pp. 113-114.

Other People’s Craziness

In a Facebook group dedicated to Julian Jaynes, I was talking to a lady who is an academic and a poet. She happened to mention that she is also a ‘Manbo’, something like a Vodou practitioner. She admitted that she sees and hears spirits, but she qualified it by saying that her rational mind knew they weren’t real. I found that qualification odd, as if she were worried about maintaining her respectability. She made clear that these experiences weren’t make-believe; they felt real to her, as real as anything else, and yet one side of her personality couldn’t quite take them as real. So, two different realities existed inside her, and she seemed split between them.

None of this is particularly strange in a group like that. Many voice-hearers, for obvious reasons, are attracted to Jaynes’ view on voice-hearing. Jaynes took such experiences seriously and, to a large degree, took them on their own terms. He offered a rational, or rationalizing, narrative for why it is ‘normal’ to hear voices. The desire to be normal is a powerful social force. Having a theory helps someone like this lady compartmentalize the two aspects of her being and not feel overwhelmed. If she didn’t qualify her experience, she would be considered crazy by many others, and maybe in her own mind. Her academic career might even be threatened. So, the demand for conformity is serious, with real consequences.

That isn’t what interested me, though. Our conversation happened in a post about the experience of falling into a trance while driving, such that one ends up where one was going without remembering how one got there. It’s a common experience and a key example Jaynes uses of how the human mind functions. I mentioned that many people have experiences of alien contact and UFO abduction while driving, often alone at night on some dark stretch of road. And I added that, according to Jacques Vallee and John Keel, many of these experiences match the descriptions of fairy abductions in folklore and the accounts of shamanic initiations. Her response surprised me in how critical it was.

Vallee also had two sides: on the one hand, an analytical type who worked as an astronomer and a computer scientist; on the other, a disreputable UFO researcher. He came at the UFO field from a scientific approach, but like Jaynes he felt compelled to take people at their word in accepting that their experiences were real to them. He even came to believe there was something to these experiences. It started when he was working in an observatory and, after he recorded anomalous data of something in the sky that wasn’t supposed to be there, the director of the observatory erased the tapes out of fear that, if it got out to the press, it would draw negative attention to the institution. That is what originally piqued his curiosity and started him down the road of UFO research. But he also came across many cases where entire groups of people, including military personnel, saw the same UFOs in the sky, and their movements accorded with no known technology or physics.

That forced him to consider the possibility that people were seeing something that was on some level real, whatever it was. He went so far as to speculate that consciousness is much stranger than science can presently explain, that there really is more to the universe, or at an angle to our universe. In this line of thought, he spoke of the phenomena as “partly associated with a form of non-human consciousness that manipulates space and time.” Sure, to most people that is crazy talk, though no more crazy than interacting with the spirit world. But the lady I was speaking with immediately dismissed this as going too far. Her anomalous experiences were fine, as long as she pretended they were pretend, thus proving she wasn’t bat-shit loony. Someone else’s anomalous experience, however, was not to be taken seriously. It’s the common perception that only other people’s religion is mythology.

That amused me to no end. And I said that it amused me. She then blocked me. That amused me as well. I’m still feeling amused. I was more willing to take her experiences as valid than she was willing to do for others. It’s not that I had any skin in the game, as I’ve never talked to spirits nor been abducted by aliens. But I give people the benefit of the doubt that their experiences are real to them. I’m a radical skeptic and extreme agnostic. I take the world as it comes, and sometimes the world is strange. No need to rationalize it. And if that strangeness is proof of insanity and disrepute, there are worse fates.

The Agricultural Mind

Let me make an argument about individualism, rigid egoic boundaries, and hence Jaynesian consciousness. But I’ll come at it from a less typical angle. I’ve been reading much about diet, nutrition, and health. There are significant links between what we eat and so much else: gut health, hormonal regulation, the immune system, and neurocognitive functioning. There are multiple pathways connecting the gut and the brain, one of which is direct. The gut is sometimes called the second brain, but in evolutionary terms it was the first brain. To give one example of a connection, many are beginning to refer to Alzheimer’s as type 3 diabetes, and dietary interventions have reversed symptoms in clinical studies. Also, microbes and parasites have been shown to influence our neurocognition and psychology, even altering personality traits and behavior (e.g., Toxoplasma gondii).

One possibility to consider is the role of exorphins, which are addictive and can be blocked in the same way as opioids. Exorphin, in fact, means external morphine-like substance, in the way that endorphin means indwelling morphine-like substance. Exorphins are found in milk and wheat. Milk, in particular, stands out. Even though exorphins are found in other foods, it’s been argued that they are insignificant because they theoretically can’t pass through the gut barrier, much less the blood-brain barrier. Yet exorphins have been measured elsewhere in the human body. One explanation is gut permeability, which can be caused by many factors, stress among them, but also by milk itself. The purpose of milk is to get nutrients into the calf, and this is done by widening the spaces in the gut surface to allow more nutrients through the protective barrier. Exorphins get in as well and create a pleasurable experience that motivates the calf to drink more. Along with exorphins, grains and dairy also contain dopaminergic peptides, and dopamine is the other major addictive substance. It feels good to consume dairy, as with wheat, whether you’re a calf or a human, and so one wants more.

Addiction, to food or drugs or anything else, is a powerful force. And it is complex in what it affects, not only physiologically and psychologically but also on a social level. Johann Hari offers a great analysis in Chasing the Scream. He makes the case that addiction is largely about isolation and that the addict is the ultimate individual. It stands out to me that addiction and addictive substances have increased over the course of civilization. The growing of poppies, sugar, etc. came later in civilization, as did the production of beer and wine (by the way, alcohol releases endorphins, sugar causes a serotonin high, and both activate the hedonic pathway). Also, grain and dairy were slow to catch on as a large part of the diet. Until recent centuries, most populations remained dependent on animal foods, including wild game. Americans, for example, ate large amounts of meat, butter, and lard from the colonial era through the 19th century. In 1900, Americans on average were getting only about 10% of their diet as carbs, and sugar intake was minimal.

Another factor to consider is that low-carb diets can alter how the body and brain function. That is even more true if combined with the intermittent fasting and restricted eating times that would have been more common in the past. Taken together, earlier humans would have spent more time in ketosis (fat-burning mode, as opposed to glucose-burning), which dramatically affects human biology. The further back in history one goes, the greater the amount of time people probably spent in ketosis. One difference with ketosis is that cravings and food addictions disappear. It’s a non-addictive, or maybe even anti-addictive, state of mind. Many hunter-gatherer tribes can go days without eating and it doesn’t appear to bother them, which is typical of ketosis. The same was observed of Mongol warriors, who could ride and fight for days on end without tiring or needing to stop for food. What is also different about hunter-gatherers and similar traditional societies is how communal they are or were, and how much more expansive their identities were in belonging to a group. Anthropological research shows that hunter-gatherers often have a sense of personal space that extends into the environment around them. What if that isn’t merely cultural but has something to do with how their bodies and brains operate? Maybe diet even plays a role. Hold that thought for a moment.

Now go back to the two staples of the modern diet, grains and dairy. Besides exorphins and dopaminergic substances, they also have high levels of glutamate, as part of gluten and casein respectively. Dr. Katherine Reid is a biochemist whose daughter was diagnosed with severe autism. She went into research mode and experimented with supplementation and then diet. Many things seemed to help, but the greatest result came from the restriction of glutamate, a difficult challenge as it is a common food additive. This requires going on a largely whole-foods diet, that is to say eliminating processed foods. But when dealing with a serious issue, it is worth the effort. Dr. Reid’s daughter showed immense improvement, to such a degree that she was kicked out of her special-needs school. After being on this diet for a while, she socialized and communicated normally like any other child, something she was previously incapable of. Keep in mind that glutamate is necessary as a foundational neurotransmitter in modulating communication between the gut and brain. But typically we only get small amounts of it, as opposed to the large doses found in the modern diet.

That reminds me of propionate. It is another substance normally taken in at a low level. Certain foods, including grains and dairy, contain it. The problem is that, as a useful preservative, it has been generously added to the food supply. Research shows that injecting rodents with propionate causes autistic-like behaviors. Other rodent studies show how it stunts learning ability and causes repetitive behavior (both related to the autistic demand for the familiar), as too much propionate entrenches mental patterns through the mechanism that gut microbes use to tell the brain how to return to a needed food source. Autistics, along with craving propionate-containing foods, tend to have larger populations of a particular gut microbe that produces propionate. Since antibiotics kill microbes, this might be why they can help with autism.

As with propionate, exorphins injected into rats will likewise elicit autistic-like behaviors. By two different pathways, the body produces exorphins and propionate from the consumption of grains and dairy: the former from the breakdown of proteins, the latter produced by gut bacteria in the breakdown of some grains and refined carbohydrates (combined with the propionate used as a food additive, added to other foods as well; at least in rodents, artificial sweeteners also increase propionate levels). This is part of the explanation for why many autistics have responded well to low-carb ketosis, specifically paleo diets that restrict both wheat and dairy. Ketones themselves play a role, in that they use the same transporters as propionate and so block its buildup in cells. And, of course, ketones offer cells a different energy source as a replacement for glucose, which alters how cells function, specifically neurocognitive functioning and its attendant psychological effects.

What stands out to me about autism is how isolating it is. The repetitive behavior and focus on objects resonates with extreme addiction. Both conditions block normal human relating and create an obsessive mindset that, in the most extreme forms, blocks out all else. I wonder if all of us moderns are simply expressing milder varieties of this biological and neurological phenomenon. And this might be the underpinning of our hyper-individualistic society, with the earliest precursors showing up in the Axial Age following what Julian Jaynes hypothesized as the breakdown of the much more other-oriented bicameral mind. What if our egoic individuality is the result of our food system, as part of the civilizational project of mass agriculture?

Reactionary Mind Is Not Normal

“To live a modern life anywhere in the world today, subject to perpetual social and technological transformations, is to experience the psychological equivalent of permanent revolution. Anxiety in the face of this process is now a universal experience, which is why reactionary ideas attract adherents around the world who share little except their sense of historical betrayal.

“Every major social transformation leaves behind a fresh Eden that can serve as the object of somebody’s nostalgia. And the reactionaries of our time have discovered that nostalgia can be a powerful political motivator, perhaps even more powerful than hope. Hopes can be disappointed. Nostalgia is irrefutable.”
~ Mark Lilla, Our Reactionary Age

What if we are all more reactionary than we’d like to admit? Related to that, what if we are all more splintered and dissociated as well?

We have this sense of knowing who we are, as though we are singular stable identities. And we defend those identities with ideology or rather with ideological rhetoric, a wall of language with metaphors as the sentries. It is easy to rationalize and narratize, to make coherent the divided self that plagues the modern mind. It makes me wonder whether any of us are as we seem. We look at people with serious personality issues and we might call them dissociated or something similar. But what if we are all disconnected on some basic level? If that is our default state, then it isn’t really dissociation, for there is no other supposedly normal state to be dissociated from, no unified whole that only later becomes fractured.

I’ve observed those who can say something to one person, walk into the next room, and say the complete opposite to someone else. It’s an amazing thing to see for how naturally it comes. There is no sign of intentional deception or self-awareness. Of course, this is not so easy to observe in oneself. If one were doing the exact same thing, how would one know? Yet we are constantly splitting ourselves off in this manner, just to get by in this complex society. Who we are with family is different from who we are with friends. And who we are at church is different from who we are at work. Egoic consciousness (from a Buddhist, Humean, and Jaynesian view) is always a limited construct, not as grand and encompassing as it presents itself.

We are all raised in a society full of lies, half-truths, and just-so stories. When we are young and innocent, we might occasionally challenge authority figures in their dishonesty and deceptions, their self-serving explanations and commands. But the response of authority challenged is almost always negative, if not punishment then gaslighting. This creates each new generation of schizoid adults. Not every kid gets this kind of mind game to the same degree, but I suspect this is what happens to all of us in various ways. It’s the way our society is built.

The trickster quality of the reactionary demonstrates this. And it makes us feel better to accuse those others of being reactionaries. But maybe the reactionary frames everything, existing at the periphery of our vision at the dark, blurred edges of liberal idealism. We live in a society of instability and uncertainty, of stress, anxiety, and fear — the fertile black loam of the reactionary mind. That would explain why it seems so much easier for liberals and left-wingers to become reactionaries than the other way around.

In my gut-level hatred of this reactionary madness, I feel most judgmental of those who turn reactionary, especially those who should know better, such as a well-read left-winger mindlessly repeating racist beliefs or a well-educated liberal obliviously being converted while following their curiosity into the “Dark Enlightenment”. It’s sad and frustrating. I can sympathize with the lost souls on the right who were simply born into a world of reaction. They don’t know better. But when those most aware of the dangers of the reactionary mind are lured into its temptations, it further chips away at what little hope I have remaining for humanity. It makes me worry for my own mind as well, as I sense how overpowering reaction can be when one is immersed in it.

Our intellectual defenses are weak. That is what makes the reactionary mind flourish under these unnatural conditions of capitalist modernity. It must be what it was like in the late Bronze Age when the first authoritarian leaders arose in the growing empires. There wasn’t yet the rhetorical capacity among the population to protect against it, as would later develop in the Axial Age. Millennia later, in this ever more reactionary age, authoritarianism grows worse as its rhetorical skill manages to stay one step ahead. We are inoculated only against the reactionary mind as it was expressed in the past, ever expecting it to repeat the same way, with the Nazi brownshirts goosestepping in the streets.

Corrupt power is what it is. Filled not only with authoritarians but social dominators, psychopaths, narcissists, and Machiavellians. Those people, for all the problems they cause, are a minuscule proportion of the population. They could not rule, could not cause damage if there weren’t those who could be manipulated, riled up, and led to commit horrors upon others… and then the intellectuals and media hacks who come along to rationalize and normalize it all. This is why I fear the reactionary mind, the sway it holds even over those not explicitly reactionary. This is why we desperately need to come to greater self-understanding. Even simpleminded fools like Donald Trump are able to seize power and play us like fools, only because the reactionary mind has seized the entire political establishment and body politic with the grip of a heart attack. The kindly pseudo-liberal reactionaries of the Democratic Party are no better — if anything, far more dangerous for their masked face.

Still, maybe there is a hint of hope. When looking at some other societies, one doesn’t see this kind of full reactionary dominance. I’m particularly thinking of more isolated tribes maintaining their traditional cultures and lifestyles. The example I often turn to is that of the Piraha. Daniel Everett noted how they lacked any evidence of stress, anxiety, depression, and fear of death. And along with this, there was no expression of authoritarianism and social dominance. I’ve pointed out elsewhere that traditional communitarian societies, if they were to survive, could not tolerate psychopathy as we do in such careless fashion. We modern Westerners are far too tolerant of this threat, as if we believe we are above it all. That is a dangerously naive attitude.

Related to this, in understanding human nature, the diversity of identity indicates there is much more we don’t understand. Many hunter-gatherers don’t have rigid ego boundaries, don’t have permanent unitary selves, don’t have a sense of individualistic isolation from others and from the world around them. They aren’t divided, splintered, fractured, or dissociated. An open and loose embrace of a complex psyche is their natural way of being and maybe it is the default position for all humanity. Our carefully constructed egoic structures are rather flimsy, not a great foundation upon which to build a civilization. No wonder we are in a constant state of fear and anxiety, ever worried about the whole thing collapsing down around us. We are without the calm confidence found among many indigenous people who know their place in the world, know they belong without needing the authoritarian control of a clenched fist nor the rhetorical sleight-of-hand of a demagogue to keep everyone in line.

The reactionary mind may be the norm for our society. But it is not normal.

* * *

Let me leave some brief commentary on Mark Lilla and Corey Robin, the two main scholars on the reactionary mind. They are correct to place nostalgia as the taproot of this phenomenon. Lilla is also right to link it back to early thinkers like Plato who reacted, as I see it, to both Athenian democracy and the ancient poetic tradition. But he is wrong to separate the reactionary and the conservative, a mistake Robin avoids.

But I’d argue that Lilla and Robin are further missing how the reactionary is linked to the liberal, two sides of the same paradigm. All of these co-arise. The reactionary isn’t only a counter-revolutionary found after the revolution, for the leaders who co-opt revolutions typically are reactionaries, from the likes of George Washington to Maximilien Robespierre.

Some conservatives seek to distinguish themselves by identifying as ‘classical liberals’, in the hope of separating their identity from both progressives and reactionaries, but they fail in this endeavor. For one, many of the classical liberals were revolutionaries, as some were reactionaries, since liberalism as a paradigm has been a mix right from the beginning, even in terms of looking to its precursors in the Axial Age.

To demonstrate this confusion of ideological rhetoric, Mark Lilla himself, in reacting to the failings of the liberalism he hopes to defend, ends up turning to reactionary nostalgia. So, when even a scholar seeking to defuse the reactionary bomb falls prey to it, you know that it is potent stuff not to be handled lightly, even by those who think they know what they’re doing.

* * *

In conclusion, here is my alternative view as an independent. There is something I’ve never seen anyone acknowledge. This is my own insight.

Some see it all beginning with the French Revolution and Edmund Burke’s response. But it’s a bit more complex, since Burke himself was a progressive reformer for his era and belonged to the ‘liberal’ party. His reactionary stance came late in life, and yet it never led him to abandon his former left-leaning positions. He was a liberal and a conservative and a reactionary, but he was no revolutionary; though he initially supported the American Revolution, he did so only because he hoped for progressive reform. The standard story is that reaction was the counter-revolution following revolution. That doesn’t quite make sense of the facts, though. To clarify this, look to the French Revolution. The Jacobins, I’d argue, were reactionaries and counter-revolutionaries.

They were reacting to the old authoritarian regime of the French monarchy, but not to eliminate rigid hierarchy, for they used the same tactics of oppressive violence to defend their preferred hierarchy, precisely as Corey Robin describes the reactionary agenda. It was a power grab and, unsurprisingly, it led to an even greater anti-democratic authoritarianism. And they were counter-revolutionaries fighting against what the American Revolution had unleashed. The radical and revolutionary social democrat Thomas Paine, we must remember, sat on the right side of the French National Assembly, opposite the Jacobins. The French Revolution didn’t begin with the Jacobins, for they only managed to co-opt it long after it had started. They were counter-revolutionaries within the revolution.

I’ve come to a more complex view. I tend toward the theory of Robin. But I don’t entirely agree with him either. My present assessment is that conservatism is the reactionary and the reactionary is simply the other side of liberalism. It’s all of one cloth. They co-arose together (and continue to do so), going back to the precursors in the Axial Age. This puts the liberal in a less comfortably righteous position. There is no liberalism without the anxiety and hence nostalgia that leads to reaction, and conservatism has nowhere else to stand except right in the middle of the chaos.

This comes to a breaking point now that ‘liberalism’ has become the all-encompassing paradigm that rules as unquestionable ideological realism. Reactionary conservatism can offer no alternative but destruction. And liberalism can offer no response but defense of the status quo. So, liberalism becomes increasingly reactionary as well, until there is nothing left other than reaction in all directions, reactionaries reacting to reactionaries.

That is the final apocalypse of the ideological worldview we take for granted. But apocalypse, in its original meaning, referred to a revealing. Just as revolution once meant a cyclical turning that brings us back. Reaction, as such, is the return of the repressed. This is far from nostalgia, even as we have no choice than to carry the past forward as society is transformed. The reactionary age we find ourselves in is more radical than the revolutions that began it. And what, in our projections, is reflected back to us flatters neither the right nor the left.

It’s an ideological stalemate for it never was about competing political visions. All rhetoric has become empty or rather, to some extent, maybe it always was. It never meant what we thought it did. We find ourselves without any bearings or anchor. So we thrash about on a dark sea with a storm brewing. No sight beyond the next wave looming over us, casting its cold shadow, and ready to come crashing down.

* * *

The Reactionary Mind in a Reactionary Age

Reactionary Revolutionaries, Faceless Men, and God in the Gutter

The Ex-Cons
by Corey Robin

Lilla v. Robin
by Henry

Wrong Reaction
by Alex Gourevitch

Why reactionary nostalgia is stronger than liberal hope
by Carlos Lozada

The Shipwrecked Book: Mark Lilla’s Nostalgic Prison
by Robert L. Kehoe III

The Revolutionary Nostalgia That Gave Rise to Trump – and ISIS
by Shlomo Avineri

Is a Conservative Crack-Up on the Horizon?
by Samuel Goldman

Dietary Dictocrats of EAT-Lancet

“Civilisation is in crisis. We can no longer feed our population a healthy diet while balancing planetary resources. For the first time in 200 000 years of human history, we are severely out of synchronisation with the planet and nature. This crisis is accelerating, stretching Earth to its limits, and threatening human and other species’ sustained existence.”

Those words, found on the main page for EAT-Lancet, are from comments by Tamara Lucas and Richard Horton, editors for The Lancet. EAT-Lancet is a campaign to force a high-carb, plant-based diet on all or most of the world’s population. The report itself, Food in the Anthropocene, is basically an opinion piece with the names of 37 scientists attached to it; it doesn’t represent consensus opinion in the field, nor are the references in the report reliable. The groups behind it have global aspirations. I don’t automatically have a problem with this, despite my dislike of technocratic paternalism, for I understand there are global problems that require global solutions (pollution, for example, knows no national boundary, with 40% of worldwide deaths attributed to air pollution alone). But there is a long history of bad dietary advice being imposed on large populations. I’m not fond of dominator culture, no matter how good the intentions. We might be wise to take caution before going down that road again.

Besides, there seems to be an inherent contradiction behind this advocacy. The report and the editorial are both praising basically the same old high-carb diet that governments around the world have been pushing since the late 1970s, a diet that is correlated with an epidemic of chronic diseases. The journal’s own editors seemingly admit that they see it as a forced choice between “a healthy diet” and “balancing planetary resources” — one or the other but not both. Or rather, since many of them don’t follow their own advice (more on that further down), it’s good health for the rich shoved onto the malnourished shoulders of the poor. This interpretation is indicated by how the report simultaneously acknowledges that certain foods are healthy even as those very foods are supposed to be restricted. Then the authors suggest that vitamin supplementation or fortification might be necessary to make up for what is lacking. This is further supported by the words of Walter Willett, one of EAT-Lancet’s main advocates — he argues that, “If we were just minimising greenhouse gases we’d say everyone be vegan”, a highly questionable claim as the data is off, but Willett has also been reported as acknowledging that “a vegan diet wasn’t necessarily the healthiest option”. The EAT-Lancet report itself actually discusses the health benefits of animal foods. Such a deficient diet can’t honestly be called healthy when it requires nutritional supplementation because the food eaten doesn’t fully nourish the body. Sure, if you want to be a vegan for moral reasons, to save the planet or whatever, more power to you, and be sure to take vitamins. But let’s be clear that this has nothing to do with health.

Other than the ethics of meat-eating, why is this dietary regimen near-vegan in its restriction of animal foods? It’s not always clear, in the report, when a dietary suggestion is intended to promote human health or to promote planetary health (or maybe something else entirely). Are they really trying to save the world or simply hoping to prop up a collapsing global order? And what does this mean in practice? “Here is another question,” tweeted Troy Stapleton. “If one were to provide a patient with advice to eat a ‘plant based diet’ should the patient also be given information that this advice is based on environmental concerns and not their health?” This is a serious set of questions when it comes to sustainability. This EAT-Lancet diet of high carbs and processed foods is guaranteed to worsen the chronic diseases that are plaguing us, as Walter Willett has argued himself (see below), and the rapidly rising costs of healthcare because of this could bankrupt our society. That is the opposite of sustainable, even if one ignores the moral quandary of giving people bad health advice in the hope that it might save the planet, despite the lack of evidence supporting this hope.

The claims about a healthy diet are suspect for other reasons as well. “The Achilles heel of the proposal?,” asks Tim Noakes, who continues, “Most must surely realise that this cannot be healthy in the long term.” For a key area of health, “Our brains NEED animal foods. They’re 2/3 fat and can’t function without DHA. It also needs Vitamins B12, K2, A & Iron. They’re ONLY in animal foods & without them we have major brain issues. The spread of veganism is pouring fuel on the mental health crisis fire” (Carnivore Aurelius). The EAT-Lancet diet is similar to the macrobiotic diet, and that is worrying. Why do mainstream authorities have endless praise for plant-based diets? There is no consistent evidence of greater health among vegetarians and vegans. In some comparisons they fare better while in others they do worse, and on average they are middling in health outcomes, which isn’t overly impressive in our disease-ridden society. The data shows that vegans and vegetarians take twice as many sick days as meat-eaters, have lower sperm counts, etc. This might explain why there are three to five times more ex-vegans and ex-vegetarians than practicing vegans and vegetarians: 84% go back to meat, most of them after only a year on the diet, with the largest percentage citing concerns about declining health as the motivating reason. American ‘vegetarians’, on average, eat one serving of meat a day, a habit shared by most who identify as vegetarian and particularly common while drunk, admitted by over a third (37%) of them. And I’ve been surprised by how many vegans and vegetarians I’ve come across who somehow don’t consider fish to be animals and so eat them freely. In responding to accusations of fad diets, David Gillespie summarizes the nutritional failure of plant-based diets, which he counts among potentially the worst fad diets:

Research indicates that “the health of Western vegetarians is good and similar to that of comparable non-vegetarians”. However, studies also tell us that while vegetarian diets provide higher amounts of carbohydrates, omega-6 polyunsaturated fats, fibre, vitamin C, vitamin E and magnesium (compared to omnivores) they have lower amounts of protein, saturated fat, omega-3 fats, vitamins A, D and B12 and Zinc. Vegans are usually particularly low in B12 and also Calcium, a deficiency they are likely to share with hard-core paleo enthusiasts because both avoid dairy. We use vitamin B12 to create our DNA, red blood cells and the myelin insulation around our nerves. Not having enough of it can result in fatigue, weakness, psychiatric problems and anaemia. B12 deficiency in children and the elderly is even more worrying. Studies have consistently shown that children and older people lacking B12 suffer significant cognitive defects such as memory and reasoning. The lack of long chain omega-3 fats, the abundance of omega-6 fats and deficiencies in the fat soluble vitamins A and D are also serious cause for concern particularly in pregnancy.

I have no doubt the EAT-Lancet proponents know this kind of data. But since among the authors of the report “more than 80% of them (31 out of 37) espoused vegetarian views” and “have, through their work, been promoting vegetarian, anti-meat views since before joining the EAT-Lancet Commission” (Nina Teicholz) and since “Oxford’s Dr Marco Springmann, the scientist behind much of the environmental portion of EAT Lancet[,…] is an activist vegan not considered biased but a cattle rancher is” (Frank Mitloehner), they wouldn’t be inclined to spread contrary evidence that undermines their belief system and ideological agenda. As these same scientists know or should know, this is not a new situation, since malnourishment caused by dietary guidelines has been going on for generations at this point (consumption of nutrient-dense foods and animal-based foods has followed the same downward trend, opposite to the upward trend of simple carbs, seed oils, and processed foods). This point is also made by Teicholz: “Americans have eaten more plants, fewer animal foods, and 34% less red meat since 1970. While, rates of obesity and diabetes have skyrocketed. How does it make sense that continuing on this path will improve health if it hasn’t so far?” Compare that to America in 1955, “when more than half of our calories came from meat, eggs, milk, cream, fats and oils […] and adult diabetes was virtually unheard of” (Adele Hite, Keeping it Simply Stupid), and that was a lower level of animal foods than seen before that, such that: “In 1900 our diet was 10% carbs, in 2010 it is 63%” (Carroll Hoagland). This isn’t limited to Americans, since the 1977 US Department Diet Guidelines were adopted widely throughout the world, based on extremely weak evidence and bad science.

Not long before the EAT-Lancet report was published, The Lancet journal also published a paper on the large and well-controlled PURE study, which showed that a diet low in carbs and high in animal foods, both meat and fat, increased health — including the sources of saturated fat that most often get blamed: “Those eating the highest levels of dairy and red meat saw their chances of early death fall by 25 per cent and a fatal heart attack cut by 22 per cent” (Nick McDermott). Based on The Lancet’s own published data, the EAT-Lancet recommendations make no sense. And as EAT-Lancet was based on weak science, it is sadly amusing that The Lancet just published another paper stating that, “In the absence of randomisation, analyses of most observational data from the real world, regardless of their sophistication, can only be viewed as hypothesis generating.” I’m pretty sure the EAT-Lancet report wasn’t intended to merely generate hypotheses. So, what is the justification for these unscientific dietary recommendations? Stating it simply, Teicholz concludes: “There is no rational basis for that.” And as usual, Dr. Jason Fung shares his take on the situation: “they know they’re going to succeed with the same advice. Insanity, literally.” In another tweet, Tim Noakes concludes with a rhetorical question: “Don’t humans ever learn?”

Official dietary recommendations have been a grand failure, one could easily argue, and we have no reason to expect different results, other than a continued worsening as ill health accumulates from one generation to the next. Then again, maybe it hasn’t failed; maybe its purpose was never to promote public health in the first place. When something seems to fail and continues to get repeated, consider the possibility that it is serving some other purpose all too well. If so, the real agenda simply isn’t the one being publicly stated. Not to be conspiratorial, but human nature is what it is and that means people are good at rationalizing to themselves and others. It is largely irrelevant whether or not they sincerely believe they have good intentions.

Perhaps the covert motive is old-school social control and social engineering, quite possibly motivated by genuine paternalistic concern. Promoting a single diet for all the world would mean more big government run by technocrats who work for plutocrats, all to make the world a better place, and it just so happens to directly benefit certain people more than others. The ruling elite and the comfortable classes are starting to worry about the consequences of the capitalism that has slowly destroyed the world and, in a technocratic fantasy, they are hoping to manage the situation. That means getting the masses in line. There are too many useless eaters. And if the excess population (i.e., the takers) can’t be eliminated entirely without a lot of mess and complication (World War III, plague, eugenics, etc.), their consumption habits could be manipulated and redirected so that they don’t use up the resources needed by the rich (i.e., the makers). Since the teeming masses are useless anyhow, it matters not that they’ll be further malnourished than they already are. Sure, an increasing number will die at a younger age and all the better, as it will keep the population down. (Yes, I’m being cynical, maybe more than is called for. But I don’t feel forgiving at the moment toward those who claim to have all the answers to complex problems. Let them experiment on themselves first and then get back to us later with the results.)

The commissioners of the report recommend that governments use “choice editing” in order to “guide choice” (nudge theory) through incentives, disincentives, and default policy or, failing that, “restrict choice” and “eliminate choice” to enforce compliance. That is to say, “the scale of change to the food system is unlikely to be successful if left to the individual or whim of consumer choice. This change requires reframing at the population and systemic level. By contrast, hard policy interventions include laws, fiscal measures, subsidies and penalties, trade reconfiguration, and other economic and structural measures.” And they are ambitious: “For significant transformation to happen, all levels of society must be engaged, from individual consumers to policymakers and everybody along the food supply chain.” This interventionism, including “banning and pariah status of key products” along with “rationing on a population scale”, would be more authoritarian in its proposed strategy than prior guidelines. I wish that were a joke, but they are deadly serious. With a straight face, the same corporate-funded interests (big food, big ag & big oil) behind EAT-Lancet are telling us that, “We support the implementation of a global treaty to limit the political influence of Big Food” (Kat Lay, Tackling obesity ‘needs treaty like climate change’). “If hypocrisy was a food group we could feed thousands and thousands of people” (Linda Snell). It’s misleading to call these ‘guidelines’ at all when the object is to eliminate choice because the masses are seen as being too stupid and weak to make the right choice.

No doubt, an austerity diet would never be willingly accepted by entire populations. In the blockade following World War II, the residents of Berlin were forced by circumstances into the severe restriction of a subsistence diet based mostly on carbs while low in calories, protein, and fat — not that far off from the present official dietary ideology. Writing in 1952, Dr. H. E. Magee, Senior Medical Officer of the UK Ministry of Health, concluded: “The Berlin diet was austere… and only the compelling force of hunger and the fear of political oppression would, I believe, make any civilized community continue to eat a similar diet for as long as the Berliners did” (Nutrition Lessons of the Berlin Blockade). Yet so many officials continue with the mentality that austerity diets are the way to go: calorie counting, portion control, etc. But Gary Taubes, in Why We Get Fat, shows that all this accomplishes is making people endlessly hungry, with the perverse effect of their gaining weight even if initially losing it. Other than a blockade or government enforcement, hunger almost always wins out. That is why the only successful diets are satiating, which generally means nutrient-dense and high-fat. But to the modern mind built on Christian morality, the real problem is that we are gluttonous sinners. We must be punished with deprivation to cleanse our souls and expiate our sins.

As always, the elite want to tell the lower classes how to live and then to have them do as they’re told, by carrot or stick. “The EAT-Lancet Commission spent three years calculating the first scientific targets for a healthy, globally-sustainable diet,” wrote Nick McDermott. “But,” he noted, “the panel of experts admitted none of them were on it.” Most of them admit their hypocrisy, and the others maybe are unwilling to state it publicly: “The commission said red meat should be seen as ‘a treat’, similar to lobster, but the plan is so strict that two out of three commission members introducing the diet at a briefing in London on Wednesday said they were not currently sticking to it. Dr Richard Horton, Editor-in-Chief at The Lancet, said: ‘I’m close, but I have two eggs for breakfast every morning, so I’m already having too many eggs.’ Author Dr Line Gordon, director of the Stockholm Resilience Centre, also admitted: ‘I am moving towards it, but I have young kids at home, which is driving me in the wrong direction’” (Sarah Knapton, ‘Planetary health diet’: Britons urged to cut meat intake to equivalent of one beefburger a fortnight).

The billionaires behind the EAT Foundation brazenly post pictures of themselves eating meat, from massive hamburgers to squid (and at least one of them identifies as a ‘vegan’). So, do as they say, not as they do. Also pointing out the blatant hypocrisy were Nina Teicholz and Dr. Jason Fung, the former stating it bluntly about one of the rich advocates: “#EATlancet funders: Private plane jetting around the world, major carbon footprint lifestyle while telling others to save planet from global warming. Doesn’t sound right.” Connecting some dots, Jeroen Sluiter observed that this isn’t exactly new behavior for the paternalistic dietary elite: “This reminds me of how nutrition guidelines’ first villain, Ancel Keys, lectured all of us about the “dangers” of meat while frequently enjoying the most delicious roast beef with his wife Margaret.”

I was reminded of the exact same thing. In reference to Ancel Keys’ “stringent vows of the dietary priesthood”, Sally Fallon Morell offers the following note (p. 157, Nourishing Diets): “Actually, Keys recommended the practice of renunciation for the general population but not for himself or those of his inner circle. The esteemed researcher Fred Kummerow, PhD, defender of eggs and butter in the human diet, once spied Keys and a colleague eating eggs and bacon at a conference for cardiologists. When Kummerow inquired whether Keys had changed his mind about dietary fats and cholesterol, Keys replied that such a restricted diet was “for others,” not for himself.” In The Big Fat Surprise, Nina Teicholz also talks about this hypocrisy: “Keys himself, according to the [Times Magazine (January 13, 1961)] article, seemed barely to follow his own advice; his “ritual” of dinner by candlelight and “soft Brahms” at home with Margaret included meat—steak, chops, and roasts—three times a week or less. (He and Stamler were also once spotted by a colleague at a conference tucking into scrambled eggs and “five or so rations” of bacon.) “Nobody wants to live on mush,” Keys explained” (p. 62). Keep in mind that Keys was the main figure who forced this dietary religion onto the American population and much of the rest of the world. With persuasive charisma, he righteously advocated that others should eat a high-carb, high-fiber diet with restricted animal products: meat, fat, butter, eggs, etc. This became government policy and transformed the entire food sector. The eventual impact has fallen on possibly billions of people over multiple generations. Yet it apparently wasn’t important enough to apply to his own dietary habits.

There is not enough to go around, but don’t worry, our benevolent overlords will never go without. As yet another commentator put it with some historical context, “The elites will never eat this diet they prescribe to the masses. Meat for me. And wheat for thee. The elites with their superior bodies brains intellects and money will need special nutrition to maintain their hegemony and rightful place as leaders of the planet. Ask yourself why the silicon valley brainiacs are all on keto/carnivore. It’s a reenactment of feudal life w fatty meats for the elites & thin gruel for the peasants” (David Smith). A high-carb diet combined with low-protein and low-fat has always been a poverty diet, rarely eaten by choice except by ascetic monks: “A vegetarian or fish-based diet was most often associated with self-denial and penitence” (Sydney Watts, “Enlightened Fasting”; from Food and Faith in Christian Culture, p. 119). Worse still, it easily can lead to malnutrition and, except when calories are pushed so low as to be a starvation diet, it’s fattening.

This general strategy has been used before. It’s a way of shifting the blame and the consequences elsewhere. It’s the same as promoting feel-good policies such as encouraging recycling for households, which helps distract from the fact that the vast majority of waste comes from factories and other businesses. The rich use most of the resources and cause the most problems. Yet it’s the rest of us who are supposed to take responsibility, as consumer-citizens. What the rich pushing this agenda refuse to talk about is that the entire system is to blame, the very system they benefit the most from. The only way to solve the problem is to eliminate the socioeconomic order that creates people so rich and powerful that they dream of controlling the world. If sustainability is the genuine concern, we need to return to a smaller-scale, decentralized way of living where solutions come from communities, not enforced by distant bureaucrats and technocrats. But that would mean reducing inequality of wealth and power by bringing resources and decision-making back to local populations. As Peter Kalmus wisely put it, “You cannot have billionaires and a livable Earth. The two cannot go together.” That isn’t what the billionaires Petter and Gunhild Stordalen, who are leading this campaign, want, though (their organization is the EAT part of EAT-Lancet). They like the inequality just fine the way it is, if not for the problem of all those poor people.

As Herman Melville put it, “Of all the preposterous assumptions of humanity over humanity, nothing exceeds most of the criticisms made on the habits of the poor by the well-housed, well-warmed, and well-fed.” The rich are worrying about what will happen when the living conditions, including diets, improve for the rest of the global population. And there is reason to worry for, after all, it is a finite planet. But the upper classes should worry about themselves, with the externalized costs of their lifestyle (on a finite planet, externalizations that go around come around). Once the obstructionist elite get out of the way, I have no doubt that the rest of us can come up with innovative responses to these dire times. Locally-sourced food eaten in season, organic and GMO-free agriculture, community gardens and family farms, crop rotation and cattle pasturage, farmers markets and food co-ops, etc — these are the kinds of things that will save the world, assuming we aren’t already too late.

A local diet including animal foods will be a thousand times better for the planetary biosphere than turning all of earth’s available land into big ag industrial farming in order to support a plant-based diet. Even in the EAT-Lancet report, they agree that “animal production can also be essential for supporting livelihoods, grassland ecosystem services, poverty alleviation, and benefits of nutritional status.” They even go so far as to add that this is true “particularly in children and vulnerable populations.” Wondering about this dilemma, Barry Pearson states it bluntly: “Eliminating all people who EAT-Lancet isn’t suitable for, who is left? So far list of people it isn’t suitable for appears to include: Children. Old people. Pregnant or potentially pregnant women. People with diabetes. Has anyone identified a list of who it IS suitable for?” And Georgia Ede makes the same point, adding some to the list: “Yet the authors themselves admit diets low in animal foods are unhealthy for babies, growing children, teenage girls, pregnant women, aging adults, the malnourished, and the poor, and that high-carbohydrate diets are risky for those w/ insulin resistance.” Yet the EAT-Lancet true believers largely dismiss all animal foods, which are the best sources of the fat-soluble vitamins that Weston A. Price found were central to the healthiest populations. Somehow animal foods are bad for you and the entire planet, not just red meat but also chicken, fish, eggs and dairy (anyway, why pick on red meat, considering that over the past century beef consumption has not risen in countries like the United States or in the world as a whole?). Instead, we’re supposed to sustain ourselves on loads of carbs, as part of the decades of government-subsidized, chemically-doused, genetically-modified, and nutrient-depleted “Green Revolution”. That should please the CEOs and shareholders of big ag, some of the main corporate backers of EAT-Lancet’s global agenda.
“Ultra-processed food manufacturers must scarcely believe their luck. They’ve been handed a massive rebranding opportunity free of charge, courtesy of the vegan desire for plant-based junk posing as dairy, meat, fish, and eggs” (Joanna Blythman).

What they don’t explain is how the world’s poor are supposed to eat this way. That is no minor detail being overlooked. Most of the population in the world, and in many developed countries including the United States, is poor. This idealized diet is presented as emphasizing fruits and vegetables. But in many poor countries, fruits and vegetables are more expensive than some animal foods. That is when they are available at all, which is often not the case in the food deserts that so many of the poor are trapped in. The authors of the report do admit that animal foods might be increased slightly for many demographics — as Dr. Georgia Ede put it: “Although their diet plan is intended for all “generally healthy individuals aged two years and older,” the authors admit it falls short of providing proper nutrition for growing children, adolescent girls, pregnant women, aging adults, the malnourished, and the impoverished—and that even those not within these special categories will need to take supplements to meet their basic requirements.” It’s not clear what this means, as this admission goes against their general recommendations. The proposal is vague on details, with neither food lists nor meal plans. And, oddly, the details shown don’t actually indicate greater amounts of fruits and vegetables, as the plant-based foods mostly consist of carbs (according to Optimising Nutrition’s Should you EAT Lancet?, the calorie breakdown comes out to 70% plant-based including sweeteners, with 46% carbs, only 3% vegetables and 5% fruits, and a remarkable 5% for sweeteners, about equal to the allowance for meat).

In the harsh criticism offered by Optimising Nutrition: “You would be forgiven if you thought from their promotional materials that they were promoting more vegetables. But it’s not actually the case! However, I admit they are promoting primarily a ‘plant based diet’ if you count corn, soy and wheat (grown using large scale agricultural practices, mono-cropping and large doses of fertilisers and chemical pesticides) and the oils that you can extract [from] them as ‘plant based’.” I eat more actual vegetables on my low-carb, high-fat paleo diet than is being recommended in the EAT-Lancet report. Just because a diet is ‘plant-based’ doesn’t mean it’s healthy, considering most processed foods consist of plant-based ingredients. Even commercial whole wheat breads, with some fiber and vitamins added back into the denatured flour, are basically junk food with good marketing. Heck, partially hydrogenated oils and high fructose corn syrup are both plant-based. The EAT-Lancet diet is basically the Standard American Diet (SAD), as it has fallen in line with decades of a Food Pyramid with carbs as the base and an emphasis on unhealthy seed oils — more from Optimising Nutrition:

“The thing that struck me was the EAT Lancet dietary guidance seems to largely be an extension of the current status quo that is maximising profits for the food industry and driving us to eat more than we need to. Other than the doubling down on the recommendation to reduce red meat and eggs, it largely seems like business-as-usual for the food industry. With Walter Willett at the helm, it probably shouldn’t be surprising that this looks and feels like an extension of the Dietary Guidelines for Americans for the whole world, complete with talk of United Nations level sanctions to prevent excess meat consumption. […] it’s the added fats and oils (mostly from unsaturated fats) as well as flours and cereals (from rice, wheat and corn) that have exploded in our food system and tracked closely with the rise in obesity. The EAT Lancet guidelines will ensure that this runaway trend continues!”

The report, though, isn’t entirely worthless, for it does correctly point out some of the problems we face, specifically as part of a global crisis. But it most definitely is confusing and internally conflicted. Even if it genuinely were a diet high in healthy produce, it’s not clear why it dismisses all animal foods, including the eggs and dairy that are enjoyed by most vegetarians and non-vegetarians alike. If feeding the world is the issue, it’s hard to beat an egg for cost-effectiveness, and it accomplishes this without need for the kind of subsidization we see with high-yield crops. When I was poor, I survived on eggs, with the most expensive ingredient being the few frozen vegetables I threw in for balance and variety. Eggs are filling, both satisfying and satiating. Also, they make for a quick and easy meal, an advantage for the working poor with limited time and energy.

We are being told, though, that eggs are part of what is destroying the world and so must be severely limited, if not entirely eliminated, for the good of humanity. “While eggs are no longer thought to increase risk of heart disease, Willett said the report recommends limiting them because studies indicate a breakfast of whole grains, nuts and fruit would be healthier” (Candice Choi). So, there is nothing unhealthy about eggs, but since they are made of protein and fat, we should eat more carbs and sugar instead — “According to EAT Lancet, you can eat 8 tsp of sugar but only 1/4 egg per day” (Nina Teicholz). After all, everyone knows that American health has improved over the decades as more carbs and sugar were eaten… no, wait, it’s the complete opposite, with worsening health. That is plain fucked up! Explain to me again why eggs, one of the cheapest and healthiest food sources, are being targeted as a danger to human existence, as somehow contributing to or linked with overpopulation, environmental destruction, and climate change. What exactly is the connection? Am I missing something?

Whatever the explanation, in eating less of such things as eggs, we are supposed to eat more of such things as vegetables, at least in taking at face value how this diet is being sold. Let’s pretend for a moment that the EAT-Lancet diet is accurately described as largely oriented toward fruits and vegetables and that, as a sweeping recommendation, this is fully justified. Consider that, as Diana Rodgers explains, “Fresh produce is not grown year round in all locations, not available to everyone, and by calorie, weight, and micronutrients, more expensive than meat. Oh, and lettuce has three times the GHG emissions of bacon and fruit has the largest water and energy footprint per calorie. I didn’t see this mentioned in the EAT Lancet report.” We forget that our cheap vegetables in the West are actually rather uncommon for much of the world, excluding root vegetables which are more widely available. I’d guess we only have such a broad variety of cheap vegetables here in the West because it’s part of the government subsidization of high-yield farming, which by the way has simultaneously depleted our soil and so produced nutrient-deficient food (also, there is the American Empire’s neoliberally-rationalized and militarily-protected “free trade” agreements that have ensured cheap produce from around the world, though this simultaneously puts these foods out of reach for the foreign populations that actually grow them). I’m all in favor of subsidizing vegetables and much else, but I’d rather see the subsidization of sustainable farming in general that promotes nutrient-dense foods, not limited to plants. Anyway, how is telling poor people to eat more expensive and, in some cases, problematic foods going to help the world’s population that is struggling with poverty and inequality?

“And what are the things individuals can do to reduce their carbon footprint?” as also brought up by Rodgers. “According to a recent meta-analysis, having one less child (in industrialized nations), which was shown by far to have the biggest impact, followed by living “car-free”, avoiding one round-trip trans-Atlantic flight, and buying “green” energy have much more of an effect on our carbon footprint than our dietary choices.” Most people in the West are already having fewer children. And most people in the rest of the world already live without cars. We know that the birthrate goes down when life conditions improve, and this is already being observed, but this dietary regime would worsen life conditions through austerity politics and so would make people feel more desperate than they already are. As for transportation, many things could lessen its externalized costs, from funding public transportation to the relevant option of increasing local farming: “New research from the University of California also recently concluded that grasslands are an even better and more resilient carbon storage option than trees” (Danielle Smith, If you care about the planet, eat more beef); “These multiple research efforts verify that practical organic agriculture, if practiced on the planet’s 3.5 billion tillable acres, could sequester nearly 40 percent of current CO2 emissions” (Tim J. LaSalle & Paul Hepperly, Regenerative Organic Farming); see also this Ted Talk by Allan Savory and this paper.

Cattle aren’t the problem, considering that the earth for hundreds of millions of years has supported large numbers of ruminants without pollution, erosion, or any other problems. The United States maintains fewer cows than there were buffalo in the past, and furthermore: “Ruminant herds have been a feature of our ecosystem since before the fall of the dinosaurs. Yes, they produce methane (so do we), but the atmosphere is accustomed to that level of methane. I can’t find data on total global animal biomass trends, but as the population of humans and domesticated animals has increased, so populations of wild animals (and particularly megafauna) has decreased. What is concerning is releases of methane that has been sequestered from the atmosphere over thousands or millions of years – melting permafrost, drained peatbogs and swamp forests. Methane is a significant greenhouse gas. But to get back to where we started, methane is a natural component of the atmosphere; the carbon from farts comes from the food that is eaten and is recycled as new food that grows; there is no evidence that I’m aware of that the total volume of farting is increasing” (Simon Brooke). An accurate and amusing assessment. More mass industrial farming to support this top-down dietary scheme from EAT-Lancet would require more mass transportation and inevitably would create more pollution. One would have to be insane to believe, or well-paid by self-serving interests to claim to believe, that this is the solution.

One might note that EAT-Lancet is specifically partnered with big biz, including big ag companies such as Monsanto, which has poisoned the world’s population with Roundup (i.e., glyphosate), and understand that big ag is among the most powerful interests in the US, considering our country’s wealth was built on agriculture (a great example being the wealth of the plutocratic and corporatist Koch brothers, which in part came from manufacturing fertilizer). Other companies involved are those developing meat alternatives produced from the industrially-farmed crops of big ag. And big ag is dependent on big oil for production of farm chemicals. EAT Foundation president and founder Gunhild Stordalen has been noted as a significant figure in the oil industry (Lars Taraldsen, ONS 2014 conference program to feature oil industry heavy hitters). But don’t worry about how this carb-laden diet of processed foods will harm your health, with the majority of the American population already some combination of insulin resistant, pre-diabetic, and diabetic — they’ve got this covered: “The drug company Novo Nordisk supports Eat-Lancet. Smart. Insulin is 85% of their revenue” (P. D. Mangan). I’m beginning to see a pattern here in the vested interests behind this proposal: “Eat lancet sponsors. Chemical companies, pharmaceutical companies (mostly making diabetes meds), the world’s biggest pasta manufacturer, the world biggest seed oil supplier, the world’s biggest breakfast cereal supplier” (David Wyant); “Pesticides, fertilisers, #gm (Bayer/Monsanto, BASF, Syngenta);sugar+fake flavourings/colourings (PepsiCo, Nestle, Givaudin, Symrise);ultraprocessed grains/starches (Cargill, Kellogg’s);#palmoil (Olam); additives and enzymes (DSM)- companies backing #EatLancet diet. I wonder why?” (Joanna Blythman).

Just to throw out a crazy idea, maybe transnational corporations are the problem, not the answer. “Just think about it. EAT Lancet is the processed food industry telling us that eating more processed food is good for our health & planet. That’s like oil industry stating burn more fossil fuel will save planet. Vested interests think we are that gullible?”, in the words of Gary Fettke, an outspoken surgeon who (like John Yudkin and Tim Noakes) was bullied and harassed when challenging the powers that be, for the crime of advising an evidence-based low-carb/sugar diet. “This Poison Cartel of companies,” writes Vandana Shiva in reference to the corporate alliance behind EAT-Lancet, “have together contributed up to 50% Green house gases leading to climate change, and the chronic disease epidemic related to chemicals in food, loss in diversity in the diet, industrially processed junk food, and fake food.” The Lancet journal itself, in a new report, is now warning us of the exact same thing: many corporate sectors (including those backing EAT-Lancet) receive $5 trillion in government subsidies. “Big Food’s obstructive power is further enhanced by governance arrangements that legitimize industry participation in public policy development” (Swinburn et al, The Global Syndemic of Obesity, Undernutrition, and Climate Change).

The whole health and sustainability claim is a red herring. The EAT-Lancet commissioners and others of their ilk don’t feel they have to justify their position, not really. They throw out some halfhearted rationalizations, but these fall apart under casual scrutiny. Furthermore, there is far from a consensus among the experts. The Associated Press offered some dissenting voices, such as “John Ioannidis, chair of disease prevention at Stanford University, said he welcomed the growing attention to how diets affect the environment, but that the report’s recommendations do not reflect the level of scientific uncertainties around nutrition and health.” Ioannidis, a non-partisan researcher in dietary debates, was quoted as saying, “The evidence is not as strong as it seems to be.” That is to put it mildly. We are in the middle of a replication crisis in numerous fields of science and, as Ioannidis has shown, food-related research is among the worst. When he says the evidence is not strong enough, people should pay attention.

For emphasis, consider what kind of scientists are involved in this project. The lead researcher and author behind the EAT-Lancet report is Walter Willett, chair of the Harvard School of Public Health’s nutrition department. He was recently rebuked in the science journal Nature (in an editorial and a feature article) for his unscientific behavior. Willett has many potential conflicts of interest with, according to Nina Teicholz, “many 100Ks in funding by a host of companies selling/promoting plant-based diet.” This is the guy, by the way, who inherited the mantle from Ancel Keys, an ‘honor’ that some would consider very low praise, as Keys too was regularly accused of a sloppy and bullying approach to diet and nutrition. Willett is particularly misinformed about what counts as a healthy fat, blaming saturated fat on the basis of the same flimsy evidence going back to Ancel Keys, but back in a 2004 Frontline interview from PBS he did make the surprising admission that it was carbs and not fat driving the disease epidemic:

“Well, the food guide pyramid that was developed in 1991 really is based on the idea that all fat is bad. Therefore [if] fat is bad, and you have to eat something, carbohydrate must be wonderful. So the base of the pyramid is really emphasizing large amounts of starch in the diet. We’re told we can eat up to 11 servings a day, and if that wasn’t enough starch, the pyramid puts potatoes along with the vegetables, so you can have up to 13 servings a day. That’s a huge amount of starch. […] Fat’s up at the top of the pyramid, and where it says explicitly “fats and oils, use sparingly.” It doesn’t make any distinction about the type of fat, and it tells us to eat basically as little as possible. […] Well, this pyramid is really not compatible with good scientific evidence, and it was really out of date from the day it was printed in 1991, because we knew, and we’ve known for 30 or 40 years that the type of fat is very important. That was totally neglected. […] In some ways, we do have to credit the food industry with being responsive to what nutritionists were saying. They did believe or accepted the evidence that vegetable fats, vegetable oils, would be better than animal fats, and that really led to the development and promotion of the margarine industry and Crisco, baking fats that were made from vegetable oils. But they were made by a process called partial hydrogenation, which converts a liquid oil, say like soybean oil or corn oil, to something like margarine or vegetable shortening. As it turns out that was a very disastrous mistake, because in the process of partial hydrogenation, a totally new type of fat is formed called trans fat. The evidence has now become very clear that trans fat is far worse than saturated fat. 
[…] Unfortunately, as a physician back in the 1980s, I was telling people that they should replace butter with margarine because it was cholesterol free, and professional organizations like the American Heart Association were telling us as physicians that we should be promoting this. In reality, there was never any evidence that these margarines, that were high in trans fat, were any better than butter, and as it turned out, they were actually far worse than butter.”

In 2010, Walter Willett was again quoted in The Los Angeles Times declaring this same message in no uncertain terms: “Fat is not the problem […] If Americans could eliminate sugary beverages, potatoes, white bread, pasta, white rice and sugary snacks, we would wipe out almost all the problems we have with weight and diabetes and other metabolic diseases” (Marni Jameson, A reversal on carbs). He had been defending this consistent message for years. So why this sudden turnabout in defense of carbs by blaming fats once again? Is he just following the money, a scientific mercenary for hire to the highest bidder?

Considering animal fats are among the most nutrient-dense foods available, let me return to nutrient-density in my concluding thoughts. Feeding the whole world is the easy part. But if we want humanity, all of humanity, to thrive and not merely survive, this is what it comes down to — as I previously wrote (A Food Revolution Worthy of the Name!): “We don’t need to grow more food to feed the world but to grow better food to nourish everyone at least to a basic level, considering how many diseases even in rich countries are caused by nutrient deficiencies (e.g., Dr. Terry Wahls reversed multiple sclerosis symptoms in herself, in patients, and in clinical subjects through increasing nutrient-density). The same amount of food produced, if nutrient-dense, could feed many more people. We already have enough food and will continue to have enough food for the foreseeable future. The equal and fair distribution of food is a separate issue. The problem isn’t producing a greater quantity, for what we desperately need is greater quality. But that is difficult because our industrial farming has harmed the health of the soil and denatured our food supply.”

From that piece, I suggested that nutrient-density, especially if combined with low-carb, might decrease food consumption worldwide. In comparing locally-raised meat versus mass-transported produce, Frédéric Leroy made a related argument: “When protein quality is factored in, the data show a completely different picture. Assessments usually overlook nutrient density. Expressing environmental impact per unit of mass (g) has little sense, we should care about *nutrition* not quantity.” And for damn sure, it would improve health for those already eating so little. As I wrote, “What if we could feed more people with less land? And what if we could do so in a way that brought optimal and sustainable health to individuals, society, and the earth? Now that would be a food revolution worthy of the name!” This is very much an issue of inequality, as at least some of the EAT-Lancet commissioners acknowledge — Dr. Lawrence Haddad says, “Most conflict is driven by inequality, or at least a sense of inequality. Work by UNICEF and others shows that inequality in terms of malnutrition is actually rising faster within countries than it is between countries. So inequality within countries in terms of things like stunting and anaemia is either not improving or is actually worsening – and we know that inequality is a big driver of violent conflict.” The EAT-Lancet report itself mentions this in passing, mostly limited to a single paragraph:

“Wars and disasters cause food insecurity and highlight the issues faced when nutrition is inadequate and food becomes scarce. Wars and natural disasters also provide opportunities from which the food system can be transformed. However, only at the end of World War 2 was a global effort and commitment introduced to redirect the food system. New institutions were created or revised at the global level such as WHO, the Food and Agriculture Organization, and World Bank, which allied with new and renewed national Ministries of agriculture and health to stop pre-war food problems caused by market distortions, environmentally-damaging farming, and social inequalities. However, the negative consequences of the post-war food revolution are now becoming increasingly clear (ie, negative environmental and health consequences, as outlined in this Commission).”

I’ll give them credit for bringing it up at all, however inadequately. They do admit that our food system has failed. That makes it all the more unfortunate that, in many ways, they are demanding more of the same. As others have noted, the diet they have fashioned for the world is severely lacking in nutrition. And they offer no convincing suggestions for how to reverse this problem. It won’t help to eat more plant-based foods if they are grown through chemical-dependent high-yield farming that is depleting the soil of minerals and killing earthworms, microbes, etc.: “Veganism is a huge misinterpretation of what a responsible diet might look like. It fully supports and exacerbates industrial farming of grains, pulses, fruits and vegetables through high inputs, maximising yields at all costs and depleting soils” (Cassie Robinson). The idea of nutrient-dense foods as part of traditional farming and healthy soil is simply not on the radar of mainstream thought, especially not within our corporatist system. That is because a large portion of nutrient-dense foods don’t come from plants (especially not high-profit monoculture crops) and, furthermore, aren’t compliant with industrial farming and food production. That isn’t to say we should necessarily be eating massive amounts of meat, but animal foods have been the key element of every healthy population. In fact, compared to the United States, the top two longest-living countries in the world (Hong Kong and Japan) eat more animal foods by some accounting, including lots of red meat, although Americans are probably ahead of those two countries on dairy foods, which taken together is an argument for the paleo diet.
Even among vegetarians, the healthiest are those with access to high-quality dairy and eggs, along with those eating food from traditional farming that includes many insects and soil microbes mixed in with what is grown (one study found that vegetarians in one region of India were healthier than those in another region, the difference being the insects unintentionally included in the diet through traditional farming).

None of that, as far as I can tell, is discussed in the EAT-Lancet report. The authors offer no helpful advice, no long-term vision that can move us in a positive direction. Their ideology is getting ahead of the science. A sense of urgency is important. But impatience, especially the EAT Foundation’s self-described “impatient disruption”, won’t serve us well. It was careless hubris that got us here. It’s time we learn to respect the precautionary principle, to think carefully before collectively acting, or rather before the ruling elite goes forward with yet another harebrained scheme. If as a society we want to direct our sense of urgency toward where it counts, that wouldn’t be hard to do: “World: Stop wasting a third of the food produced. Stop wrapping it in needless packaging. Stop transporting food half way round the world. Stop selling food at below-cost prices. Stop undercutting our produce with low standard alternatives. Then I’ll discuss how much meat I eat” (David Hill). It would mean drastically transforming our political and economic system. Capitalism and corporatism, as we know it, must end for the sake of humanity and the planet.

As a member of the liberal class, Gunhild Stordalen (founder and president of EAT Foundation) knows how to say the right things. Listen to how she sets up this brilliant piece of rhetoric: “What we eat and how we produce it drives some of our greatest health and environmental challenges. On the other hand, getting it right on food is our greatest opportunity to improve the health of people and planet. This will require concerted action across disciplines and sectors – and business will be a key part of the solution.” Much of it sounds nice, too nice. But the only part of that statement that was honest was the last bit. All one should hear is “Blah, blah, blah… and business will be a key part of the solution.” And she isn’t referring to small family farms, mom-and-pop grocery stores, and local co-ops. This is a corporatist vision of concentrated wealth and power. These people are serious about remaking the world in their own image. As one journalist wrote of Anand Giridharadas’s argument in another context: “Elites, he wrote, have found myriad ways to ‘change things on the surface so that in practice nothing changes at all’. The people with the most to lose from genuine social change have placed themselves in charge of social change – often with the passive assent of those most in need of it.” No thanks!

We don’t need a corporate-owned nanny state telling us what to do. If scientific and political institutions weren’t being manipulated, if the corporate media weren’t used to propagandize and manage public perception, and if powerful interests weren’t spreading disinfo and division, it would be a lot easier for we the people to become an informed citizenry able to figure out how to democratically solve our own problems. We could even figure out how to feed ourselves for health and sustainability. Despite it all, that is what we’re working toward.

* * *

What Experts Are Saying
from NAMI

It’s shocking that after years of promoting a groundbreaking report, EAT-Lancet’s own analysis shows the Commission’s recommended diet has almost no environmental benefit over business-as-usual scenarios. While EAT-Lancet claims its reference diet would decrease greenhouse gas emissions, the Commission’s fundamentally flawed data fail to account for methane reduction that occurs naturally, as methane remains in the atmosphere for only 10 years. The carbon emissions from all the flights required for the Commission’s global launch tour will have a much longer impact than that of methane from livestock animals.

Frank Mitloehner, PhD, UC Davis

Meat and dairy are easily the most nutrient-dense foods available to humans. [The recommendations]… are not only unrealistic but potentially dangerous for healthy diets…

Jason Rowntree, Associate Professor, Animal Science Department, Michigan State University

Human beings, especially as we age, cannot do without protein. The EAT-Lancet Commission’s recommendation to cut beef consumption to just a quarter ounce per day (7g) is a drastic departure from evidence showing meat and dairy improve diets.

Stuart Phillips, Professor; Director, Physical Activity Centre of Excellence, McMaster University, Canada

The report’s recommendations do not reflect the level of scientific uncertainties around nutrition and health. The evidence is not as strong as it seems to be.

John Ioannidis, MD, Stanford University

The cornerstone of a healthy diet is still meat and dairy. Take those out and you’ll have under-nutrition and frailty. It’s unavoidable.

Andrew Mente, PhD, Associate Professor & Nutrition Epidemiologist, Population Health Research Institute, McMaster University

The #EatLancet Commission work does not reflect consensus among scientists. We need to invest in research to inform dialogue on what is healthy and sustainable. We should not base recommendations based on assumptions and 40+ year old confounded cohorts. Scientists must stop making premature recommendations based on opinion and weak data like in the past (e.g., eggs and fat). Unintended consequences happen folks. Let’s not make the same mistake twice!

Taylor Wallace, Ph.D., George Mason University

What start as academic and scientific debates become political arguments that are dangerously simplistic and may have several detrimental consequences for both health and the environment. Of course, climate change is real and does require our attention. And, yes, livestock should be optimized but also be used as part of the solution to make our environments and food systems more sustainable and our populations healthier. But instead of undermining the foundations of our diets and the livelihoods of many, we should be tackling rather than ignoring the root causes, in particular hyperconsumerism. What we should avoid is losing ourselves in slogans, nutritional scientism, and distorted worldviews

Fredric Leroy, PhD; Martin Cohen, PhD

To confine all our attention in eliminating animal-source food (#meat#eggs#milk) in our diet as solution to climate change is to limit human ability to solve challenges. Options’re available to abate impacts of livestock through investment

Aimable Uwizeye, Global Livestock expert, Veterinarian Doctor & PhD Fellow

As a cardiologist, I’ve made healthy lifestyle recommendations to thousands of patients, and it is clear that the best lifestyle is one people can actually maintain over the long term. It turns out that animal protein and fat are uniquely satiating — thus keeping hunger at bay — and therefore a friend to any dieter. It is lamentable that the EAT-Lancet authors should want to impose their ideas about healthy diets on all populations worldwide.

Bret Scher, MD

This is what the new EAT Lancet report reminds me of. After years of abject failure with ‘plant based’, low fat, low calorie diets for metabolic health, they know they’re going to succeed with the same advice. Insanity, literally.

Jason Fung, MD

Note that eating 0 grams of meat/seafood/poultry/eggs/dairy is supported, meaning vegan diets are officially sanctioned. Epidemiology choosing ideology over biology once again. No real science here

Georgia Ede, MD

The environmental science is as murky, unevenly applied & ideologically driven as the nutrition science. There isn’t a top-down, one-size-fits-all solution to “healthy diet” or “sustainable food system” because we are dealing with situated, idiosyncratic contexts in each case

Adele Hite, PhD

You’ll be short of calcium, iron, potassium, D3, K2, retinol, B12, sodium if you adopt EAT Lancet diet. It’s nutritionally deficient. Irresponsible!

Zoe Harcombe, Ph.D.

Those who feel that meat eaters are as bad as smokers and should eat their meals outside of the restaurant are obviously not coming from a place of reason and should be removed from decisions involving dietary policy.

Diana Rodgers, RD

What concerns me is that people will give this report the same weight as Dietary Guidelines that go through years of discussion, must be based on scientific evidence, analysis and vetting by a team of experts that have to disclose COI – unlike this report.

Leah McGrath, RD

I work as a renal RD, & so I experience daily the actual impact pseudoscience like the #EATLancet study can have on society. It’s nonsense like this that has caused so many of my patients to fear meat—which improves clinical outcomes—more than highly processed foods.

Mike Shelby, RD

There are no Controlled Trials proving the EAT-Lancet [recommendations] are safe for humans to eat long-term! #yes2meat

Ken Berry, MD

Unfortunately, quantity of evidence does not equate to quality – especially in the diet/health arena.

Sean Mark, PhD

The #EATLancet diet: Nearly eliminates foods with important nutrients (dairy and all other products from animal origin). Will lead to an increased consumption of calories. Will have a similar impact on climate change.

Maria Sanchez Mainar, PhD

* * *

The Big Fat Surprise
by Nina Teicholz
pp. 131-133

“We Cannot Afford to Wait”

In the late 1970s in America, the idea that a plant-based diet might be the best for health as well as the most historically authentic was just entering the popular consciousness. Active efforts to demonize saturated fat had been underway for more than fifteen years by that time, and we’ve seen how the McGovern committee’s staff were in short order persuaded by these ideas. Even so, the draft report that Mottern wrote for the McGovern committee sparked an uproar—predictably—from the meat, dairy, and egg producers. They sent representatives to McGovern’s office and insisted that he hold additional hearings. Under pressure from these lobbies, McGovern’s staff carved out an exception for lean meats, which Americans could be advised to eat. Thus, Dietary Goals recommended that Americans increase poultry and fish while cutting back on red meat, butterfat, eggs, and whole milk. In the language of macronutrients, this meant advising Americans to reduce total fat, saturated fat, dietary cholesterol, sugar, and salt while increasing carbohydrate consumption to between 55 percent and 60 percent of daily calories.

While Mottern would have liked the final report to advise against meat altogether, some of the senators on the committee were not so unequivocally confident about their ability to weigh in on matters of nutritional science. The ranking minority member, Charles H. Percy from Illinois, wrote in the final Dietary Goals report that he and two other senators had “serious reservations” about the “divergence of scientific opinion on whether dietary change can help the heart.” They described the “polarity” of views among well-known scientists such as Jerry Stamler and Pete Ahrens and noted that leaders in government, including no less than the head of the NHLBI as well as the undersecretary of health, Theodore Cooper, had urged restraint before making recommendations to the general public.

Yet this hesitation turned out to be too little too late to stop the momentum that Mottern’s report had set in motion. Dietary Goals revived the same argument that Keys and Stamler had used before: that now was the time to take action on an urgent public health problem. “We cannot afford to await the ultimate proof before correcting trends we believe to be detrimental,” said the Senate report.

So it was that Dietary Goals , compiled by one interested layperson, Mottern, without any formal review, became arguably the most influential document in the history of diet and disease. Following publication of Dietary Goals by the highest elective body in the land, an entire government and then a nation swiveled into gear behind its dietary advice. “It has stood the test of time, and I feel very proud of it, as does McGovern,” Marshall Matz, general counsel of the McGovern committee, told me thirty years later.

Proof of the report’s substantiality, according to Matz, is that its basic recommendations—to reduce saturated fat and overall fat while increasing carbohydrates—have endured down to today. But such logic is circular. What if the US Congress had said exactly the opposite: to eat meat and eggs and nothing else? Perhaps that advice, supported by the power of the federal government, would have lived on equally well. In the decades since the publication of Dietary Goals , Americans have seen the obesity and diabetes epidemics explode—a hint, perhaps, that something is wrong with our diet. Based on these facts, the government might have deemed it appropriate to reconsider these goals, but it has nevertheless stayed the course because governments are governments, the least nimble of institutions, and unable easily to change direction.

* * *

Dr. Andrew Samis:

One hundred and eleven years ago, a scientist in St. Petersburg, Russia, fed rabbits meat, eggs, and dairy. Not unexpectedly for a herbivorous animal, the cholesterol built up in the blood vessels. It also built up in the ligaments, the tendons, the muscles, and everywhere else in the rabbits’ bodies, without any evolved mechanism for excretion. This yellow goop in the rabbits’ aortas looked just like human atherosclerosis, which had only been described four years earlier. This started science down a misguided pathway of focusing on fat as the cause of hardening of the arteries. A pathway that future historians will likely call the greatest tragedy in terms of years of life lost in the history of humanity.

Initially it was eating cholesterol that was blamed for causing hardening of the arteries. Then in the 1950s an American physiologist, who had such an affinity for hard compacted refined carbohydrates that he designed soldiers’ rations featuring them, expanded the blame from cholesterol to all fat, especially animal fat. Carbohydrates should be increased and fat excluded, that was the battle cry! In the 1970s this unproven theory drew the attention of the US Senate, and within a few short years blaming fat for atherosclerosis became a worldwide revolution. This time period, interestingly, also marks the beginning of the obesity epidemic that has gripped the world’s developed countries. Tragically, what everyone seemed to have missed was the fact that there was no conclusive scientific evidence for this theory, and over time much of that thinking has actually been proven wrong. I have little doubt that issuing these guidelines without conclusive scientific evidence will eventually be viewed as the most significant blunder in the history of science.

I am an ICU doctor. I see the carnage that this cavalier and misguided attitude towards food guidelines has caused every single day, up close and personal. The tears of families suffering loss. The premature death of those who should have had long lives. Parents burying their adult sons and daughters. Atherosclerosis, obesity, and type 2 diabetes, when grouped together, represent the top conditions for admission to adult ICUs everywhere on earth where our unhealthy Western diet is consumed. And approximately one in five don’t survive their ICU stay. But what makes me the most angry is the fact that those people who draft these misguided, non-scientific food guidelines, with their biased agendas and misrepresented studies, sit in government offices and ivory towers completely remote from the devastating impact of their work. Is it any wonder that the doctors of the world represent a large portion of those leading the charge against our current misguided food guidelines? Doctors are not remote to the problem or blind to the devastation. It is here every single day at work.

This has to stop. Food guidelines need to be based on rigorous science. How many more thousands of people have to die?

Enough is enough.

* * *

eat like your grandmother

* * *

Monsanto is Safe and Good, Says Monsanto

Lancet Partners With Poison Makers to Give Food Advice
by Joseph Mercola

The EAT/Lancet Backers: Definitely in it for the Good of the Planet.
by Tim Rees

Food Industry Giants Invest $4 Million In Vegan Research
by Jemima Webber

Billionaire Vegan Tells Us all How to Eat
by Tim Rees

Globe-trotting billionaire behind campaign to save planet accused of blatant hypocrisy
by Martin Bagot

Billionaire tycoon who urged Brits to eat less meat tucks into 20,000-calorie burger
from Mirror

Letter to Dr. Gunhild A. Stordalen
by Angela A. Stanton

Majority of EAT-Lancet Authors (>80%) Favored Vegan/Vegetarian Diets
by Nina Teicholz

How vegan evangelists are propping up the ultra-processed food industry
by Joanna Blythman

Thou Shalt not discuss Nutrition ‘Science’ without understanding its driving force
by Belinda Fettke

2019 The Year Vegan Pseudo-Science Goes Mainstream?
by Afifah Hamilton

Meat-free report slammed, wool revival, agri-tech
from BBC

Eat Lancet, a template for sustaining irony
by Stefhan Gordon

Does Lancet want to hand control of our diets to the state?
by Kate Andrews

Tax, ban, regulate: the radical ‘planetary health diet’ explained
by Christopher Snowden

Lies Lying Liars Tell
by Tom Naughton

Eat Me, Lancet … These People Are A Perfect Example Of The Anointed
by Tom Naughton

EAT-Lancet Report is One-sided, Not Backed by Rigorous Science
by The Nutrition Coalition

Scientific Evidence on Red Meat and Health
by The Nutrition Coalition

Farmers have a beef with plant- or lab-grown ‘meat.’ Should you care?
by Laurent Belsie

If you care about the planet, eat more beef
by Danielle Smith

Why Eating Meat Is Good for You
by Chris Kresser

EAT-Lancet recommends slashing red meat consumption by 90%
by Amanda Radke

Report: Cut red-meat eating by 80 percent to save the planet?
by Anne Mullens and Bret Scher

Can vegetarians save the planet? Why campaigns to ban meat send the wrong message on climate change
by Erin Biba

Is the vegan health halo fading?
by Shan Goodwin

Two-pager Scientific Evidence on Red Meat and Health
from The Nutrition Coalition

A view on the meat debate
by Richard Young

I think you’ll find it’s a little bit more complicated than that…
by Malcolm Tucker

Why we should resist the vegan putsch
by Joanna Blythman

Scrutinise the small print of Eat-Lancet
by Joanna Blythman

Sally Fallon Morell Addresses the EAT-Lancet Diet Dietary Recommendations
from Weston A. Price Foundation

The EAT Lancet report recommends a diet that is ostensibly better for the planet & our health. In one simple IG post,…
from Weston A. Price Foundation

The EAT Lancet diet is nutritionally deficient
by Zoë Harcombe

Vegan diet ‘could have severe consequences’, professor warns
by Ali Gordon

EAT-Lancet Diet – inadequate protein for older adults
by Joy Kiddie

Any ‘planetary diet’ must also work for the poorest and most vulnerable
by Andrew Salter

EAT-Lancet report’s recommendations are at odds with sustainable food production
by Sustainable Food Trust

Report urging less meat in global diet ‘lacks agricultural understanding’
from FarmingUK

War on burgers continues with false environmental impact claims
by Amanda Radke

Sorry, But Giving Up on Meat Is Not Going to Save The Planet
by Frank M. Mitloehner

20 Ways EAT Lancet’s Global Diet is Wrongfully Vilifying Meat
by Diana Rodgers

What’s right and what’s wrong about the EAT Lancet Diet
by Defending Beef

With huge variations in meat consumption, we’re ‘all in this existential crisis together’—Vox
by Susan MacMillan

IFPRI’s Shenggen Fan on the ‘differentiated approach’ needed to navigate today’s food systems
by Susan MacMillan

FAO sets the record straight on flawed livestock emission comparisons–and the livestock livelihoods on the line
by Susan MacMillan

FAO sets the record straight–86% of livestock feed is inedible by humans
by Susan MacMillan

Climate change policy must distinguish (long-lived) carbon dioxide from (short-lived) methane–Oxford study
by Susan MacMillan

Red meat bounds down the carbon neutral path
by Shan Goodwin

Can cows cause more climate change than cars?
by Frédéric Leroy

The EAT-Lancet Commission’s controversial campaign
by Frédéric Leroy and Martin Cohen

Why we shouldn’t all be vegan
by Frédéric Leroy and Martin Cohen

EAT-Lancet: what lies behind the Veggie-Business according to Frédéric Leroy and Martin Cohen
from CARNI Sostenibili

Considerations on the EAT-Lancet Commission Report
from CARNI Sostenibili

The Eat-Lancet Commission: The World’s Biggest Lie
by Angela A. Stanton

We test diet of the future that will save the planet – that calls on Irish people to slash red meat consumption by 89 per cent
by Adam Higgins

Irish Mirrorman takes on five day health challenge to diet and help save the planet
by Kevan Furbank

Is the EAT-Lancet (Vegan) Rule-Book Hijacking Our Health?
by Belinda Fettke

EAT-Lancet’s Plant-based Planet: 10 Things You Need to Know
by Georgia Ede

Should you EAT Lancet?
from Optimising Nutrition

EAT-Lancet Report Offers a “Fad Diet” Solution to Complex Global Issues
from NAMI

Media Myth Crusher
from NAMI

Climate, Food, Facts
from Animal Agriculture Alliance

FAQ
from Animal Agriculture Alliance

What the experts are saying…
from Animal Agriculture Alliance

Birth of Snark

The next time you’re irritated by an internet troll, remember that one of the greatest inventions of civilization was snark. For millennia of recorded history, there was no evidence of it. Then suddenly, in the measure of historical time, there it was in all its glory.

Before there were social media and online comment sections, there were letters written in cuneiform. There is something about text-based communication that brings snark out in some people, no matter the medium. But first there had to be a transformation in consciousness.

* * *

The Origin of Consciousness in the Breakdown of the Bicameral Mind
by Julian Jaynes
pp. 249-250

Going from Hammurabi’s letters to the state letters of Assyria of the seventh century B.C. is like leaving a thoughtless tedium of undisobeyable directives and entering a rich sensitive frightened grasping recalcitrant aware world not all that different from our own. The letters are addressed to people, not tablets, and probably were not heard, but had to be read aloud. The subjects discussed have changed in a thousand years to a far more extensive list of human activities. But they are also imbedded in a texture of deceit and divination, speaking of police investigations, complaints of lapsing ritual, paranoid fears, bribery, and pathetic appeals of imprisoned officers, all things unknown, unmentioned, and impossible in the world of Hammurabi. Even sarcasm, as in a letter from an Assyrian king to his restive acculturated deputies in conquered Babylon about 670 B.C.:

Word of the king to the pseudo-Babylonians. I am well . .  . So you, so help you heaven, have turned yourselves into Babylonians! And you keep bringing up against my servants charges— false charges,— which you and your master have concocted . .  . The document (nothing but windy words and importunities!) which you have sent me, I am returning to you, after replacing it into its seals. Of course you will say, “What is he sending back to us?” From the Babylonians, my servants and my friends are writing me: When I open and read, behold, the goodness of the shrines, birds of sin . .  . 28

And then the tablet is broken off.

A further interesting difference is their depiction of an Assyrian king. The Babylonian kings of the early second millennium were confident and fearless, and probably did not have to be too militaristic. The cruel Assyrian kings, whose palaces are virile with muscular depictions of lion hunts and grappling with clawing beasts, are in their letters indecisive frightened creatures appealing to their astrologers and diviners to contact the gods and tell them what to do and when to do it. These kings are told by their diviners that they are beggars or that their sins are making a god angry; they are told what to wear, or what to eat, or not to eat until further notice: 29 “Something is happening in the skies; have you noticed? As far as I am concerned, my eyes are fixed. I say, ‘What phenomenon have I failed to see, or failed to report to the king? Have I failed to observe something that does not pertain to his lot?’.  .  . As to that eclipse of the sun of which the king spoke, the eclipse did not take place. On the 27th I shall look again and send in a report. From whom does the lord my king fear misfortune? I have no information whatsoever.” 30

Does a comparison of these letters, a thousand years apart, demonstrate the alteration of mentality with which we are here concerned? Of course, a great deal of discussion could follow such a question. And research: content analyses, comparisons of syntax, uses of pronouns, questions, and future tenses, as well as specific words which appear to indicate subjectivity in the Assyrian letters and which are absent in the Old Babylonian. But such is our knowledge of cuneiform at present that a thorough analysis is not possible at this time. Even the translations I have used are hedged in favor of smooth English and familiar syntax and so are not to be completely trusted. Only an impressionist comparison is possible, and the result, I think, is clear: that the letters of the seventh century B.C. are far more similar to our own consciousness than those of Hammurabi a thousand years earlier.

Capitalist Realism and Fake Fakes

“This is where ‘we’ are now: not Harawayesque cyborgs affirming our ontological hybridity but replicant-puppets (of Capital) dreaming kitsch dreams of being restored to full humanity but “without any Gepettos or Good Fairies on the horizon”.”

~ Mark (k-punk), 2009
Honeymoon in Disneyland

* * *

“Where does that leave us? I’m not sure the solution is to seek out some pre-Inversion authenticity — to red-pill ourselves back to “reality.” What’s gone from the internet, after all, isn’t “truth,” but trust: the sense that the people and things we encounter are what they represent themselves to be. Years of metrics-driven growth, lucrative manipulative systems, and unregulated platform marketplaces, have created an environment where it makes more sense to be fake online — to be disingenuous and cynical, to lie and cheat, to misrepresent and distort — than it does to be real. Fixing that would require cultural and political reform in Silicon Valley and around the world, but it’s our only choice. Otherwise we’ll all end up on the bot internet of fake people, fake clicks, fake sites, and fake computers, where the only real thing is the ads.”

~ Max Read, 2018
How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually.

* * *

“In my writing I got so interested in fakes that I finally came up with the concept of fake fakes. For example, in Disneyland there are fake birds worked by electric motors which emit caws and shrieks as you pass by them. Suppose some night all of us sneaked into the park with real birds and substituted them for the artificial ones. Imagine the horror the Disneyland officials would feel when they discovered the cruel hoax. Real birds! And perhaps someday even real hippos and lions. Consternation. The park being cunningly transmuted from the unreal to the real, by sinister forces. For instance, suppose the Matterhorn turned into a genuine snow-covered mountain? What if the entire place, by a miracle of God’s power and wisdom, was changed, in a moment, in the blink of an eye, into something incorruptible? They would have to close down.

“In Plato’s Timaeus, God does not create the universe, as does the Christian God; He simply finds it one day. It is in a state of total chaos. God sets to work to transform the chaos into order. That idea appeals to me, and I have adapted it to fit my own intellectual needs: What if our universe started out as not quite real, a sort of illusion, as the Hindu religion teaches, and God, out of love and kindness for us, is slowly transmuting it, slowly and secretly, into something real?”

~ Philip K. Dick, 1978
How to Build a Universe That Doesn’t Fall Apart Two Days Later

“…there resides in every language a characteristic world-view”

“Via the latter, qua character of a speech-sound, a pervasive analogy necessarily prevails in the same language; and since a like subjectivity also affects language in the same notion, there resides in every language a characteristic world-view. As the individual sound stands between man and the object, so the entire language steps in between him and the nature that operates, both inwardly and outwardly, upon him. He surrounds himself with a world of sounds, so as to take up and process within himself the world of objects. These expressions in no way outstrip the measure of the simple truth. Man lives primarily with objects, indeed, since feeling and acting in him depend on his presentations, he actually does so exclusively, as language presents them to him. By the same act whereby he spins language out of himself, he spins himself into it, and every language draws about the people that possesses it a circle whence it is possible to exit only by stepping over at once into the circle of another one. To learn a foreign language should therefore be to acquire a new standpoint in the world-view hitherto possessed, and in fact to a certain extent is so, since every language contains the whole conceptual fabric and mode of presentation of a portion of mankind.”

Wilhelm von Humboldt
On Language (1836)

* * *

Wilhelm von Humboldt
from Wikipedia

Wilhelm von Humboldt
from Stanford Encyclopedia of Philosophy

Wilhelm von Humboldt lectures
from Université de Rouen

Wilhelm von Humboldt and the World of Languages
by Ian F. McNeely

Wilhelm von Humboldt: A Critical Review On His Philosophy of Language, Theory and Practice of Education
by Dr Arlini Alias

The theory of linguistic relativity from the historical perspective
by Iaroslav

Democratic Values in Balance or Opposition

At the New Yorker, Nathan Heller has an interesting piece about equality and freedom, The Philosopher Redefining Equality.

Mainstream American thought sees them as oppositional. But maybe the common ground between them is fairness. There can be neither equality nor freedom in an unfair society, although there can be liberty in an unfair society. That goes off on a tangent, but keep it in mind as background info. A society of freedom is not the same as a society of liberty, and a society of fairness might be a whole other thing as well. Yet it has been argued that English is the only language with exact words for all three concepts (see Liberty, Freedom, and Fairness) — for example, George Fletcher in Basic Concepts of Legal Thought writes,

“Remarkably, our concept of fairness does not readily translate into other languages. It is virtually impossible to find a suitable translation for fairness in European or Semitic languages. As a result, the term is transplanted directly in some languages such as German and Hebrew, and absent in others, such as French, which is resistant to adopting loan words that carry unique meanings.” (quoted by Manny Echevarria in Does Fairness Translate?)

The difference between the two cultural worldviews and ideological systems is what led to both the English Civil War and the American Civil War. This conflict has been internalized within American society, but it has never been resolved. Americans simply have pretended it went away when, in reality, the conflict has grown worse.

Heller writes about the experience and work of Elizabeth Anderson. She has been, “Working at the intersection of moral and political philosophy, social science, and economics, she has become a leading theorist of democracy and social justice.” And, as related to the above, “She has built a case, elaborated across decades, that equality is the basis for a free society.” Freedom isn’t only closely linked to equality but built upon and dependent upon it. That makes sense from an etymological perspective, as freedom originally meant living among equals in sharing freedom as a member of a free people, at least a member in good standing (ignoring the minor detail of the categories of people excluded: women, slaves, and strangers; but it might be noted that these categories weren’t always permanent statuses and unchangeable fates, since sometimes women could become warriors or divorce their husbands, slaves could end their bondage, and strangers could marry into the community). Hence, this is the reason the word ‘friend’ has the same origin — to be free is to be among friends, among those one trusts and relies upon as do they in return.

Fairness, by the way, is an odd word. It has an English meaning of fair, handsome, beautiful, and attractive (with some racist connotations); nice, clean, bright, clear, and pleasant; moderate as in not excessive in any direction (a fair balance or fair weather, neither hot nor cold) but also generous or plentiful as in considerable (a fair amount). And in various times and places, it has meant favorable, helpful, promising good fortune, and auspicious; morally or comparatively good, socially normative, average, suitable, agreeable, with propriety and justice, right conduct, etc.; which overlaps with the modern sense of equitable, impartial, just, and free from bias (from fair and well to fair and square, from fair-dealing to fair play). But its other linguistic variants connect to setting, putting, placing, acting, doing, making, and becoming; make, compose, produce, construct, fashion, frame, build, erect, and appoint. There is an additional sense of sex and childbirth (i.e., fucking and birthing), the ultimate doing and making; and so seemingly akin to a worldly goodness of fecundity, abundance, and creation. The latter is maybe where the English meaning entered the picture. More than being fair as a noun, it is a verb of what one is doing in a real-world sense.

Interestingly, some assert that the closest etymological correlate to fairness in modern Swedish is 'rättvis'. It breaks down to the roots 'rätt' and 'vis', the former signifying what is 'correct' or 'just' and the latter 'wise' (correct-wise or just-wise, in the sense of clockwise or otherwise). This Swedish word is related to the English 'righteous'. That feels right, given the moral component of fairness that can be seen early on in its development as a word. We think of what is righteous as having a more harsh and demanding tone than fairness. But I would note how easy it is to pair fairness with justice, as if they belong together. John Rawls has a theory of justice as fairness. That makes sense, in accord with social science research showing that humans strongly find unjust that which is perceived as unfair. Then again, as freedom is not exactly the same as liberty, righteousness is not exactly the same as justice. There might be a reason that the Pledge of Allegiance states "with liberty and justice for all", not liberty and righteousness, not freedom and justice. Pledging ourselves to liberty and justice might put us at odds with a social order of fairness, as paired with freedom, equality, or righteousness. Trying to translate these two worldviews into each other maybe is what created so much confusion in the first place.

All these notions of and related to fairness, one might argue, indicate how lacking in fairness our society is, whatever one might think of liberty and justice. Humans tend to obsess over articulating and declaring what is found most wanting. A more fair society would likely not bother to have a word for it, as the sense of fairness would be taken for granted and would simply exist in the background as ideological realism and cultural worldview. In Integrity in Depth, John Beebe makes this argument about the word 'integrity' for modern society, whereas the integral/integrated lifestyle of many tribal people living in close relationship to their environment requires no such word. A people must first perceive something as unintegrated, as needing to be integrated, before they can speak of what is or might be of integrity.

Consider the Piraha, who are about as equal a society as can exist, with fairness only becoming an issue in recent history because of trade with outsiders. The Piraha wanted Daniel Everett to teach them math because they couldn't determine whether they were being given a fair deal or were being cheated, a non-issue among the Piraha themselves since they don't have a currency or even terms for numerals. A word like fairness would be far too much of a generalized abstraction for the Piraha, as traditionally most interactions were concrete and personal, and as such more along the lines of Germanic 'freedom'.

It might put some tribal people in an 'unfair' position if they don't have the language to fully articulate unfairness, at least in economic terms. We Americans have greater capacity and talent in fighting for fairness because we get a lot of practice, as we can't expect it as our cultural birthright. Unsurprisingly, we talk a lot about it and in great detail. Maybe to speak of fairness is always to imply both its lack and its desirability. From the view of linguistic relativism, such a word invokes a particular worldview that shapes and influences thought, perception, and behavior.

This is observed in social science research when WEIRD populations are compared to others, as seen in Joe Henrich's study of the ultimatum game: "It had been thought a matter of settled science that human beings insist on fairness in the division, or will punish the offering party by refusing to accept the offer. This was thought an interesting result, because economics would predict that accepting any offer is better than rejecting an offer of some money. But the Machiguenga acted in a more economically rational manner, accepting any offer, no matter how low. 'They just didn't understand why anyone would sacrifice money to punish someone who had the good luck of getting to play the other role in the game,' Henrich said" (John Watkins, The Strangeness of Being WEIRD). There is no impulse to punish an unfairness that, according to the culture, isn't perceived as unfair. It appears that the very concept of fairness was irrelevant or maybe incomprehensible to the Machiguenga, at least under these conditions. But if they are forced to deal more with outsiders who continually take advantage of them or who introduce perverse incentives into their communities, they surely would have to develop the principle of fairness and learn to punish unfairness. Language might be the first sign of such a change.

A similar point is made by James L. Kugel in The Great Shift about ancient texts written by temple priests declaring laws and prohibitions. This probably hints at a significant number of people at the time doing the complete opposite or else the priests wouldn’t have bothered to make it clear, often with punishments for those who didn’t fall in line. As Julian Jaynes explains, the earliest civilizations didn’t need written laws because the social norms were so embedded within not only the social fabric but the psyche. Laws were later written down because social norms were breaking down, specifically as societies grew in size, diversity, and complexity. We are now further down this road of the civilizational project and legalism is inseparable from our everyday experience, and so we need many words such as fairness, justice, righteousness, freedom, liberty, etc. We are obsessed with articulating these values as if by doing so we could re-enforce social norms that refuse to solidify and stabilize. So, we end up turning to centralized institutions such as big government to impose these values on individuals, markets, and corporations. And we need lawyers, judges, and politicians to help us navigate this legalistic world that we are anxious about falling apart at any moment.

This interpretation is supported by the evidence of the very society in which the word fairness was first used. “The tribal uses of fair and fairness were full of historical irony,” pointed out David Hackett Fischer in Fairness and Freedom (Kindle Locations 647-651). “These ideas flourished on the far fringes of northwestern Europe among groups of proud, strong, violent, and predatory people who lived in hard environments, fought to the death for the means of life, and sometimes preyed even on their own kin. Ideas of fairness and fair play developed as a way of keeping some of these habitual troublemakers from slaughtering each other even to the extinction of the tribe. All that might be understood as the first stage in the history of fairness.” This interpretation is based on a reading of the sagas as written down quite late in Scandinavian history. It was a period when great cultural shifts were happening such as radical and revolutionary introductions like that of writing itself. And I might add, this followed upon the millennium of ravage from the collapse of the bicameralism of Bronze Age Civilizations. The society was under great pressure, both from within and without, as the sagas describe those violent times. It was the sense of lack of fairness in societal chaos and conflict that made it necessary to invent fairness as a cultural ideal and social norm.

It’s impossible to argue we live in a fair society. The reason Adam Smith defended equality, for example, is because he thought it would be a nice ideal to aspire to and not that we had already attained it. On the other hand, there is an element of what has been lost. Feudal society had clearly spelled out rights and responsibilities that were agreed upon and followed as social norms, and so in that sense it was a fair society. The rise of capitalism with the enclosure and privatization of the commons was experienced as unfair, to which Thomas Paine was also responding with his defense of a citizen’s dividend to recompense what was taken, specifically as theft not only from living generations but also all generations following. When a sense of fairness was still palpable, as understood within the feudal social order, no argument for fairness as against unfairness was necessary. It likely is no coincidence that the first overt class war happened in the English Civil War when the enclosure movement was in high gear, the tragic results of which Paine would see in the following century, although the enclosure movement didn’t reach full completion until the 19th century with larger scale industrialization and farming.

As for how fairness accrued its modern meaning, I suspect that it is one of the many results of the Protestant Reformation as a precursor to the Enlightenment Age. The theological context became liberal. As Anna Wierzbicka put it: "'Fair play' as a model of human interaction highlights the 'procedural' character of the ethics of fairness. Arguably, the emergence of the concept of 'fairness' reflects a shift away from absolute morality to 'procedural (and contractual) morality,' and the gradual shift from 'just' to 'fair' can be seen as parallel to the shifts from good to right and also from wise (and also true) to reasonable: in all cases, there is a shift from an absolute, substantive approach to a procedural one." (from English: Meaning and Culture, as quoted by Mark Liberman in No word for fair?)

Nathan Heller’s article is about how the marriage of values appears like a new and “unorthodox notion”. But Elizabeth Anderson observes that, “through history, equality and freedom have arrived together as ideals.” This basic insight was a central tenet of Adam Smith’s economic philosophy. Smith said a free society wasn’t possible with high inequality. It simply wasn’t possible. Full stop. And his economic views are proclaimed as the basis of Western capitalism. So, how did this foundational understanding get lost along the way? I suppose because it was inconvenient to the powers that be who were looking for an excuse to further accumulate not only wealth but power.

It wasn’t only one part of the ruling elite that somehow ‘forgot’ this simple truth. From left to right, the establishment agreed in defense of the status quo: “If individuals exercise freedoms, conservatives like to say, some inequalities will naturally result. Those on the left basically agree—and thus allow constraints on personal freedom in order to reduce inequality. The philosopher Isaiah Berlin called the opposition between equality and freedom an ‘intrinsic, irremovable element in human life.’ It is our fate as a society, he believed, to haggle toward a balance between them.” For whatever reason, there was a historical shift, a “Post-Enlightenment move” (Echevarria), both in the modern meaning of fairness and the modern deficiency in fairness.

That still doesn’t explain how the present ideological worldview became the dominant paradigm that went unquestioned by hundreds of millions of ordinary Americans and other Westerners. Direct everyday experience contradicts this neo-feudalist dogma of capitalist realism. There is nothing that Anderson observed in her own work experience that any worker couldn’t notice in almost any workplace. The truth has always been there right in front of us. Yet few had eyes to see. When lost in the darkness of a dominant paradigm, sometimes clear vision requires an imaginative leap into reality. I guess it’s a good thing we have a word to designate the ache we feel for a better world.

* * *

Fairness and Freedom
by David Hackett Fischer
Kindle Locations 596-675

Origins of the Words Fairness and Fair

Where did this language of fairness come from? What is the origin of the word itself? To search for the semantic roots of fair and fairness is to make a surprising discovery. Among widely spoken languages in the modern world, cognates for fairness and fair appear to have been unique to English, Danish, Norwegian, and Frisian until the mid-twentieth century. 40 They remained so until after World War II, when other languages began to import these words as anglicisms. 41

The ancestry of fair and fairness also sets them apart in another way. Unlike most value terms in the Western world, they do not derive from Greek or Latin roots. Their etymology is unlike that of justice and equity, which have cognates in many modern Western languages. Justice derives from the Latin ius, which meant a conformity to law or divine command, “without reference to one’s own inclinations.” Equity is from the Latin aequitas and its adjective aequus, which meant level, even, uniform, and reasonable. 42

Fairness and fair have a different origin. They derive from the Gothic fagrs, which meant “pleasing to behold,” and in turn from an Indo-European root that meant “to be content.” 43 At an early date, these words migrated from Asia to middle Europe. There they disappeared in a maelstrom of many languages, but not before they migrated yet again to remote peninsulas and islands of northern and western Europe, where they persisted to our time. 44 In Saxon English, for example, the old Gothic faeger survived in the prose of the Venerable Bede as late as the year 888. 45 By the tenth century, it had become faire in English speech. 46

In these early examples, fagr, faeger, fair, and fairness had multiple meanings. In one very old sense, fair meant blond or beautiful or both—fair skin, fair hair. As early as 870 a Viking king was called Harald Harfagri in Old Norse, or Harold Fairhair in English. In another usage, it meant favorable, helpful, and good—fair wind, fair weather, fair tide. In yet a third it meant spotless, unblemished, pleasing, and agreeable: fair words, fair speech, fair manner. All of these meanings were common in Old Norse, and Anglo-Saxon in the tenth and eleventh centuries. By 1450, it also meant right conduct in rivalries or competitions. Fair play, fair game, fair race, and fair chance appeared in English texts before 1490. 47

The more abstract noun fairness was also in common use. The great English lexicographer (and father of the Oxford English Dictionary) Sir James Murray turned up many examples, some so early that they were still in the old Gothic form—such as faegernyss in Saxon England circa 1000, before the Norman Conquest. It became fayreness and fairnesse as an ethical abstraction by the mid-fifteenth century, as “it is best that he trete him with farenes” in 1460. 48

As an ethical term, fairness described a process and a solution that could be accepted by most parties—fair price, fair judgment, fair footing, fair and square. Sometimes it also denoted a disposition to act fairly: fair-minded, fair-natured, fair-handed. All of these ethical meanings of fair and fairness were firmly established by the late sixteenth and early seventeenth centuries. Fair play appears in Shakespeare (1595); fair and square in Francis Bacon (1604); fair dealing in Lord Camden (before 1623). 49

To study these early English uses of fairness and fair is to find a consistent core of meaning. Like most vernacular words, they were intended not for study but for practical use. In ethical applications, they described a way of resolving an issue that is contested in its very nature: a bargain or sale, a race or rivalry, a combat or conflict. Fundamentally, fairness meant a way of settling contests and conflicts without bias or favor to any side, and also without deception or dishonesty. In that sense fairness was fundamentally about not taking undue advantage of other people. As early as the fifteenth century it variously described a process, or a result, or both together, but always in forms that fair-minded people would be willing to accept as legitimate.

Fairness functioned as a mediating idea. It was a way of linking individuals to groups, while recognizing their individuality at a surprisingly early date. Always, fairness was an abstract idea of right conduct that could be applied in different ways, depending on the situation. For example, in some specific circumstances, fairness was used to mean that people should be treated in the same way. But in other circumstances, fairness meant that people should be treated in different ways, or special ways that are warranted by particular facts and conditions, such as special merit, special need, special warrant, or special desire. 50

Fairness was a constraint on power and strength, but it did not seek to level those qualities in a Procrustean way. 51 Its object was to regulate ethical relationships between people who possess power and strength in different degrees—a fundamental fact of our condition. A call for fairness was often an appeal of the weak to the conscience of the strong. It was the eternal cry of an English-speaking child to parental authority: “It’s not fair!” As any parent knows, this is not always a cry for equality.

Modern Applications of Fairness: Their Consistent Core of Customary Meaning

Vernacular ideas of fairness and fair have changed through time, and in ways that are as unexpected as their origin. In early ethical usage, these words referred mostly to things that men did to one another—a fair fight, fair blow, fair race, fair deal, fair trade. They also tended to operate within tribes of Britons and Scandinavians, where they applied to freemen in good standing. Women, slaves, and strangers from other tribes were often excluded from fair treatment, and they bitterly resented it.

The tribal uses of fair and fairness were full of historical irony. These ideas flourished on the far fringes of northwestern Europe among groups of proud, strong, violent, and predatory people who lived in hard environments, fought to the death for the means of life, and sometimes preyed even on their own kin. Ideas of fairness and fair play developed as a way of keeping some of these habitual troublemakers from slaughtering each other even to the extinction of the tribe. All that might be understood as the first stage in the history of fairness. 52

Something fundamental changed in a second stage, when the folk cultures of Britain and Scandinavia began to grow into an ethic that embraced others beyond the tribe—and people of every rank and condition. This expansive tendency had its roots in universal values such as the Christian idea of the Golden Rule. 53 That broader conception of fairness expanded again when it met the humanist ideas of the Renaissance, the universal spirit of the Enlightenment, the ecumenical spirit of the Evangelical Movement, and democratic revolutions in America and Europe. When that happened, a tribal idea gradually became more nearly universal in its application. 54 Quantitative evidence suggests an inflection at the end of the eighteenth century. The frequency of fairness in English usage suddenly began to surge circa 1800. The same pattern appears in the use of the expression natural justice. 55

Then came a third stage in the history of fairness, when customary ideas began to operate within complex modern societies. In the twentieth century, fairness acquired many technical meanings with specific applications. One example regulated relations between government and modern media (“the fairness doctrine”). In another, fairness became a professional standard for people who were charged with the management of other people’s assets (“fiduciary fairness”). One of the most interesting modern instances appeared among lawyers as a test of “balance or impartiality” in legal proceedings, or a “subjective standard by which a court is deemed to have followed due process,” which began to be called “fundamental fairness” in law schools. Yet another example was “fair negotiation,” which one professional negotiator defined as a set of rules for “bargaining with the Devil without losing your soul.” One of the most complex applications is emerging today as an ethic of “fairness in electronic commerce.” These and other modern applications of fairness appear in legal treatises, professional codes, and complex bodies of regulatory law. 56

Even as modern uses of fair and fairness have changed in all of those ways, they also preserved a consistent core of vernacular meaning that had appeared in Old English, Norse, and Scandinavian examples and is still evident today. To summarize, fair and fairness have long been substantive and procedural ideas of right conduct, designed to regulate relations among people who are in conflict or rivalry or opposition in particular ways. Fairness means not taking undue advantage of others. It is also about finding ways to settle differences through a mutual acceptance of rules and processes that are thought to be impartial and honest—honesty is fundamental. And it is also about living with results that are obtained in this way. As the ancient Indo-European root of fagrs implied, a quest for fairness is the pursuit of practical solutions with which opposing parties could “be content.” These always were, and still are, the fundamental components of fairness. 57

Notes:

40. For an excellent and very helpful essay on fair and fairness by a distinguished cultural and historical linguist, see Anna Wierzbicka, “Being FAIR: Another Key Anglo Value and Its Cultural Underpinnings,” in English: Meaning and Culture (New York and Oxford, 2006), 141–70. See also Bart Wilson, “Fair’s Fair,” http://www.theatlantic.com/business/print/2009/01/fairs-fair/112; Bart J. Wilson, “Contra Private Fairness,” May 2008, http://www.chapman.edu/images/userimges/jcunning/Page_11731/ContraPrivateFairness05–2008.pdf; James Surowiecki, “Is the Idea of Fairness Universal?” Jan. 26, 2009, http://www.newyorker.com/online/blogs/jamessurowiecki/2009/01/is; and Mark Liberman, “No Word for Fair?” Jan. 28, 2009, http://languagelog.ldc.upenn.edu/nll/?p=1080.

41. Oxford English Dictionary, s.v. “fair” and “fairness.” Cognates for the English fairness include fagr in Icelandic and Old Norse, retferdighet in modern Norwegian, and retfaerighed in modern Danish. See Geír Tòmasson Zoëga, A Concise Dictionary of Old Icelandic (Toronto, 2004), s.v. “fagr.” For Frisian, see Karl von Richthofen, Altfriesisches Wörterbuch (Gottingen, 1840); idem, Friesische Rechtsquellen (Berlin, 1840). On this point I agree and disagree with Anna Wierzbicka. She believes that fair and unfair “have no equivalents in other European languages (let alone non-European ones) and are thoroughly untranslatable” (“Being FAIR,” 141). This is broadly true, but with the exception of Danish, Norwegian, Frisian, and Icelandic. Also I’d suggest that the words can be translated into other languages, but without a single exactly equivalent word. I believe that people of all languages are capable of understanding the meaning of fair and fairness, even if they have no single word for it.

42. OED, s.v. “justice,” “equity.”

43. Webster’s New World Dictionary, 2nd College Edition, ed. David B. Guralnik (New York and Cleveland, 1970), s.v. “fair”; OED, s.v. “fair.”

44. Ancient cognates for fair included fagar in Old English and fagr in Old Norse.

45. W. J. Sedgefield, Selections from the Old English Bede, with Text and Vocabulary, on an Early West Saxon Basis, and a Skeleton Outline of Old English Accidence (Manchester, London, and Bombay, 1917), 77; and in the attached vocabulary list, s.v. the noun “faeger” and the adverbial form “faegere.” Also Joseph Bosworth and T. Northcote Toller, An Anglo-Saxon Dictionary, Based on Manuscript Collections (Oxford, 1882, 1898), s.v. “faeger,” ff.

46. Not to be confused with this word is another noun fair, for a show or market or carnival, from the Latin feria, feriae, feriarum, festival or holiday—an entirely different word, with another derivation and meaning.

47. Liberman, “No Word for Fair?”

48. OED, s.v. “fairness,” 1.a, b, c.

49. For fair and fairness in Shakespeare, see King John V.i.67. For fair and square in Francis Bacon in 1604 and Oliver Cromwell in 1649, see OED, s.v. “fair and square.”

50. Herein lies one of the most difficult issues about fairness. How can we distinguish between ordinary circumstances where fairness means that all people should be treated alike, and extraordinary circumstances where fairness means different treatment? This problem often recurs in cases over affirmative action in the United States. No court has been able to frame a satisfactory general rule, in part because of ideological differences on the bench.

51. Procrustes was a memorable character in Greek mythology, a son of Poseidon called Polypaemon or Damastes, and nicknamed Procrustes, “the Stretcher.” He was a bandit chief in rural Attica who invited unwary travelers to sleep in an iron bed. If they were longer than the bed, Procrustes cut off their heads or feet to make them fit; if too short he racked them instead. Procrustes himself was dealt with by his noble stepbrother Theseus, who racked him on his own bed and removed his head according to some accounts. In classical thought, and modern conservatism, the iron bed of Procrustes became a vivid image of rigid equality. The story was told by Diodorus Siculus, Historical Library 4.59; Pausanias, Guide to Greece 1.38.5; and Plutarch, Lives, Theseus 2.

52. Jesse Byock, Viking Age Iceland (London, 2001), 171–84; the best way to study the origin of fairness in a brutal world is in the Norse sagas themselves, especially Njal’s Saga, trans. and ed. Magnus Magnusson and Hermann Palsson (London, 1960, 1980), 21–22, 40, 108–11, 137–39, 144–45, 153, 163, 241, 248–55; Egil’s Saga, trans. and ed. Hermann Palsson and Paul Edwards (London, 1976, 1980), 136–39; Hrafnkel’s Saga and Other Icelandic Stories, trans. and ed. Hermann Palsson (London, 1971, 1980), 42–60.

53. Matthew 25:40; John 4:19–21; Luke 10:27.

54. The vernacular history of humanity, expanding in the world, is a central theme in David Hackett Fischer, Champlain’s Dream (New York and Toronto, 2008); as the expansion of vernacular ideas of liberty and freedom is central to Albion’s Seed (New York and Oxford, 1989) and Liberty and Freedom (New York and Oxford, 2005); and the present inquiry is about the expansion of vernacular ideas of fairness in the world. One purpose of all these projects is to study the history of ideas in a new key. Another purpose is to move toward a reunion of history and moral philosophy, while history also becomes more empirical and more logical in its epistemic frame.

55. For data on frequency, see Google Labs, Books Ngram Viewer, http://ngrams.googlelabs.com, s.v. “fairness” and “natural justice.” Similar patterns and inflection-points appear for the corpus of “English,” “British English,” and “American English,” in the full span 1500–2000, smoothing of 3. Here again on the history of fairness, I agree and disagree with Wierzbicka (“Being FAIR,” 141–67). The ethical meanings of fairness first appeared earlier than she believes to be the case. But I agree on the very important point that ethical use of fairness greatly expanded circa 1800.

56. Fred W. Friendly, The Good Guys, the Bad Guys, and the First Amendment (New York, 1976) is the classic work on the fairness doctrine. Quotations in this paragraph are from Carrie Menkow-Meadow and Michael Wheeler, eds., What’s Fair: Ethics for Negotiators (Cambridge, 2004), 57; Philip J. Clements and Philip W. Wisler, The Standard and Poor’s Guide to Fairness Opinions: A User’s Guide for Fiduciaries (New York, 2005); Merriam-Webster’s Dictionary of Law (Cleveland, 1996), s.v. “fundamental fairness”; Approaching a Formal Definition of Fairness in Electronic Commerce: Proceedings of the 18th IEEE Symposium on Reliable Distributed Systems (Washington, 1999), 354. Other technical uses of fairness can be found in projects directed by Arien Mack, editor of Social Research, director of the Social Research Conference series, and sponsor of many Fairness Conferences and also of a Web site called Fairness.com.

57. An excellent discussion of fairness, the best I have found in print, is George Klosko, The Principle of Fairness and Political Obligation (Savage, MD, 1992; rev. ed., 2004). It is similar to this formulation on many points, but different on others.