The Great WEIRDing of the Jaynesian Ego-Mind as a Civilizational Project

“Father, forgive them; for they know not what they do.”
~ Luke 23:34

“I’m supposed to act like they aren’t here. Assuming there’s a ‘they’ at all. It may just be my imagination. Whatever it is that’s watching, it’s not human, unlike little dark eyed Donna. It doesn’t ever blink. What does a scanner see? Into the head? Down into the heart? Does it see into me, into us? Clearly or darkly? I hope it sees clearly, because I can’t any longer see into myself. I see only murk. I hope for everyone’s sake the scanners do better. Because if the scanner sees only darkly, the way I do, then I’m cursed and cursed again. I’ll only wind up dead this way, knowing very little, and getting that little fragment wrong too.”
~ A Scanner Darkly (movie)

Let us explore the strangeness of human nature and what it means in our society. For practical purposes, this will require us to use the examples of other people. The simple reason is that certain behavioral and identity patterns are easier to see in others than in ourselves. So, though our present focus is turned outward, this does not imply that we stand above in judgment, casting the first stone. We can safely assume that, like all humans, we lack the self-awareness to always see clearly what we do, and that what we do is more inconsistent than we would prefer. The following is not about the moral failure of individuals but a reckoning with our shared species-being. The most blatant example we are aware of, in our personal experience, is that of someone we have known for about a quarter of a century. We have on multiple occasions, along with others present to confirm it, observed her say something to one person and then, upon walking into the next room, immediately say something completely contradictory to someone else. She seemed oblivious to the fact that she was still in earshot of those she had just spoken to, suggesting it was not a consciously intentional act of deceit and manipulation. In all the years we’ve known her, she has repeated this behavior many times and she has never shown any indication of understanding what she did or any memory of what transpired. It’s as if she had been two different people, apparently not carrying a portable and unchanging internal ego structure from one place to the next.

Along with other behaviors, this has led us to suspect she has borderline personality disorder or something along those lines, whatever one might call it; not that she has ever been diagnosed and, it must be stated, she considers herself completely sane. But psychiatric diagnoses and debates about them are irrelevant for our purposes here. Indeed, maybe she is sane; labeling something does not protect us from what it represents, does not quarantine the perceived mental disease. The issue at hand implicates us all. What we’re discussing here has everything to do with how memory operates, with the narratives we create in retelling memories, forgetting them, and forming new ones. The same lady, it might be noted, is talented at shaping narratives, not only in her own mind but in the moment of relating to others, and so she projects those narratives onto the world, such as by staging melodramatic conflicts (typical according to descriptions of borderline personality disorder: when inner boundaries can’t be maintained, one turns to creating external boundaries in the world by projecting onto others and then controlling them). And she is masterful in creating and controlling her social media persona. The point of bringing all of this up is that, even if her case is extreme and obvious, that kind of thing is surprisingly not abnormal. All of us do similar things, if most of us are better at covering our tracks. We’ve come across numerous other examples over the years from a diversity of people.

Often memory lapses happen in more subtle ways, not always involving overt inconsistency. Amnesia can sometimes operate in maintaining consistency. One guy we know has a strange habit in how he eats, extremely methodical and constrained. He’ll pick up his fork, place a piece in his mouth, lay down the fork, and carefully chew for an extraordinary amount of time, as if he were counting the chews. It’s distinctly unnatural, which is to say we could tell it had been trained into him at some point. We pointed this out to him and he didn’t realize he was doing anything unusual, but his wife told us she knew why he did it. Many years earlier, he had told her that his mother, a strict woman by his own account, had made him thoroughly chew his food as a child. The thing is, even when told of this memory he once shared with his wife, he still could not remember it — it was gone and, along with it, any understanding of the origins of his behavior. The memory of his mother’s voice telling him what to do is absent, whereas the authoritative command of her voice still echoes in his mind. An external authorization is internalized as part of the individual ego-mind and simply becomes part of an unquestionable self-identity.

To emphasize the power this holds over the mind, realize that this goes far beyond one particular behavior, as his entire identity is extremely controlled (controlled by his egoic willpower or by the authorizing voice of his mother repeating in his unconscious?). He had forgotten something from his childhood that has continued to unconsciously determine his behavioral identity. It was a total memory lapse; and maybe the erasure wasn’t accidental but an important mechanism of identity formation, in creating the sense of an unquestionable psychological realism, the way he takes himself to be as inborn character. It absolutely fascinates us. That kind of forgetting we’ve noticed so many times before. Let us share another incident involving a different married couple, one we’ve also known for a very long time. The husband told us of when his wife went looking for a dog at an animal shelter and he accompanied her. According to him, she told the shelter worker who helped them a story about how she had gotten her first dog, but he explained to us that she had made it up, or rather that she had previously told him an alternative version, whichever one was correct, if either was. When he confronted her about this creative storytelling, she simply admitted that it was not true and that she had made it up. As he told it, she treated the admission as irrelevant or insignificant, and so offered no explanation for why she did it. She just shrugged it off, as if it were normal and acceptable behavior.

Yet it’s entirely possible that the whole situation was beyond her full self-awareness even in the moment of being confronted, similar to the case of the first woman mentioned above. Directly confronting someone does not always induce self-awareness and social-awareness, as identity formations are powerful in protecting against conflicting and threatening information. Amusingly, when we later brought up the animal shelter incident to the husband, he had zero recall of the event or of having shared it with us. These transgressions of memory and identity come and go, sometimes for everyone involved. Let’s return to the first couple. There was another situation like this. The husband told us that his wife had been pro-choice when she was younger, but now she is rabidly anti-choice and calls those who are pro-choice baby-killers. This guy told us about this on multiple occasions, so obviously it had been on his mind for years. Like all of us, he could see the inconsistency in another, in this case a woman he had been married to for more than a half century. He is an honest person and so we have no reason to doubt his claim, specifically as he himself is also now anti-choice (did he always hold this position or did he likewise unconsciously change his memory of political identity?).

The husband told us that his wife no longer remembered her previous position or, presumably, the self-identity that held it and the reasons for holding it; it likely originated in her childhood upbringing in a working-class family that was Democratic and Protestant (note that, until the culture wars heated up in the 1980s, most American Protestants were pro-choice, in opposition to anti-choice Catholics at a time when anti-Catholic bigotry was still strong; by the way, her Democratic father hated Catholics). Not long after, when discussing this with him on another occasion, he stated that he had no memory of ever having told us this. The thing is, this couple has become fairly far right, fear-mongering, conspiratorially paranoid, and harshly critical in their older age. They weren’t always this way, as we knew them when they were younger. Though they have always been conservative as an identity, they both once were relatively moderate and socially liberal, prior to the rise of right-wing and alt-right media (Fox News, Epoch Times, Rush Limbaugh, Laura Schlessinger, Jordan Peterson, etc.). The husband used to be far less intellectual and, in his younger days, instead of reading books about religion and politics he read Time Magazine and Playboy. In their early marriage, they attended liberal churches, had pot-smoking friends, and were committed to a worldview of tolerance and positive thinking.

Over the decades, they had re-scripted their identity, according to a powerful right-wing propaganda machine (i.e., the Shadow Network started by Paul Weyrich, initially funded by the Coors family, and ushered in by President Ronald Reagan), to such a degree that it erased all evidence to the contrary — their former selves having been scrubbed from personal memory. So, it’s not only that they’ve dramatically changed their politics over their lifetimes but that they no longer remember who they used to be, and so they now will deny they were ever anything other than far-right ultra-conservatives. The change has been so dramatic that they probably wouldn’t like their younger selves, if they could meet; and their younger selves might be appalled by what they’d become. It does get one thinking. To what degree do all of us change in such a manner with similar obliviousness? How would we know if we did? We are unlikely to see it in ourselves. And often those around us won’t notice either, or else won’t mention it to us. There is typically a mutual agreement not to poke at each other’s illusions, particularly when illusions are shared, entwined, or overlapping. It’s a conspiracy of silence guarded by a paralyzing fear of self-awareness. Unraveling our own narratives or those of others can be dangerous, and people will often lash out at you, for they will perceive you as attacking their identity.

[(7/9/22) Note: We recently talked to this man again about his wife and their early lives. He admitted that he wasn’t always anti-choice, claiming he was undecided for the first 40-50 years of his life. He claims to have become anti-choice only in the 1990s — one might add, after years of rabid right-wing indoctrination from culture war propaganda (i.e., angry right-wing talk radio and the Fox News effect). That was the same period he and his wife left the liberal Unity Church they had raised their children in, and they did so specifically over the issue of same-sex marriage, despite the fact that the Unity Church had long been a proponent of LGBTQ rights in performing marriage services for same-sex couples. The Unity Church didn’t change. This older couple did. But to their minds, they remained where they were and all the world shifted around them. It is true that the majority of Americans moved left and continues to move further left, and yet it’s also true that many older Americans, in turning reactionary (fearful, paranoid, etc.), went far right. To give an example, this man became a Republican because of Barry Goldwater’s libertarianism, but later on Goldwater expressed regret that he had opposed an important civil rights bill, even if he had genuine libertarian reasons at the time. Also, Goldwater later came to fear and despise the religious right that this older conservative couple has become identified with. Conveniently, the man in question still holds Goldwater up as a hero while not following his moral and political example. All of this has exaggerated the sense of this couple being out of sync. It also created a further disconnect from their own past selves. The American majority is now more in line with the couple’s past selves than their older selves are. To be in conflict not only with most other people but also with oneself would, indeed, feel like an untenable and intolerable position to find oneself in. That they now lash out with a disconcerting sense of uneasiness is unsurprising.]

This perfectly normal strangeness reminds one of anthropological descriptions of the animistic mind and porous self. In many hunter-gatherer tribes and other traditional societies, self-identity tends to be more open and shifting. People will become possessed by spirits, demons, and ancestors; or they will have a shamanic encounter that alters their being upon receiving a new name. These changes can be temporary or permanent, but within those cultures they are accepted as normal. People relate to whatever identity is present, without any expectation that individual bodies should be inhabited continuously by only a single identity for an entire lifetime. Maybe this animistic psychology has never really left us, not even with the destruction of most tribal cultures so long after the collapse of bicameral societies. That other way of being that we try to bury keeps resurfacing. There are many voices within the bundled mind and any one of them has the potential to hail us with the compelling force of archaic-like authorization (Julian Jaynes’ bicameralism meets Louis Althusser’s interpellation). We try to securely segment these voice-selves, but every now and then they are resurrected from the unconscious. Or maybe they are always there influencing us, whether or not we recognize and acknowledge them. We just get good at papering over the anomalies, contradictions, and discontinuities. Julian Jaynes points out that we spend little of our time in conscious activity (e.g., mindless driving in a trance state).

What we are talking about is the human nature that evolved under hundreds of millennia of oral culture. This is distinct from literary culture, a relatively recent social adaptation layered upon the primitive psyche. This deeper ground of our species-being contradicts our highly prized egoic identity. To point out an individual’s inconsistencies, in our culture, is about the same as accusing someone of hypocrisy or lying or worse, possibly mental illness. The thing is, maybe even psychiatric conditions like borderline personality disorder are simply the animistic-bicameral mind as distorted within a society that denies it a legitimate outlet and social framework. That said, we shouldn’t dismiss the achievements of the egoic mind, that is to say Jaynesian consciousness (interiorized, spatialized, and narratized). It isn’t a mere facade hiding our true nature. The human psyche is flexible, if within limits. There are genuine advantages to socially constructing the rigid boundaries of the literate ego-mind. This relates to the cultural mindset of WEIRD (Western, Educated, Industrialized, Rich, and Democratic or pseudo-democratic). Joseph Henrich, in his book The WEIRDest People in the World, argues that literacy is the main causal factor. He points to research showing that greater amounts of reading, presumably in early life, alter the structure of the brain and the related neurocognition. More specifically, it might be linguistic recursion, the complex structure of embedded phrases, that creates the complexity of abstract thought — this is lacking in some simpler societies and indeed it increases with literacy.

Importantly, what the research on the WEIRD bias tells us is that most people in the world don’t share this extreme variation on the egoic mind, and a few remaining populations don’t have an egoic mind at all, as they remain fully adapted to the bundled mind. Surely this is changing quickly, as most of humanity becomes some combination of Westernized, modernized, urbanized, and educated; specifically as literacy spreads and literacy rates go up. We are only now reaching the point of mass global literacy, but it’s still in its early stages. Literacy, for the average person, remains rudimentary. Even in Western countries, the civilizational project of Jaynesian consciousness, in its WEIRDest form, is still partial and not well established. But, in recent centuries, we’ve begun to see the potential it holds and one cannot doubt that it is impressive. The WEIRD egoic mind is obviously distinct in what it makes possible, even in its present imperfections. Studies on WEIRD individuals do show they act differently than the non-WEIRD. Relatively speaking, they are more broadly consistent and abstractly principled (uniform standards and conformist norms), with a perceived inner voice of a supposed independent conscience (as originally reinforced through the moralizing Big Gods that were believed to see into the soul); and that relates to why principled consistency is so idealized in WEIRD society. Even when WEIRD subjects think no one is watching, they are less likely to cheat to help their families than non-WEIRD subjects. And, when asked, they state they’d be less likely to lie in court to protect a loved one. This is what the egoic structure does, as an internalized framework that one carries around and that remains static no matter the situation. The WEIRD mind is less context-dependent, which admittedly has both strengths and weaknesses.

It’s not clear that this mentality is entirely beneficial, much less sustainable. It might be the case that it never will become fully established and so could always remain wonky, as the above examples demonstrate. The bundled mind is maybe the permanent default mode that we will always fall back into the moment our egoic defenses are let down. Maintaining the egoic boundaries may simply be too much effort, too much drain on the human biological system, too contrary to human nature. Yet it’s too early to come to that judgment. Only if we get to a strongly literate society will egoic WEIRDness be able to show what it’s capable of, or else reveal its ultimate failure. Consider that, in the US, the youngest generation will be the first with a college-educated majority and hence the first time we will see most of the population fully immersed in literary culture. It’s taken us about three millennia to get to this point, a slow collective construction of this experimental design; and we’re still working out the bugs. It makes one wonder about what might further develop in the future. Some predict a transformation toward a transparent self (integral WEIRD or post-WEIRD?). Certainly, there will be a Proteus effect of mediated experience in shaping identity in new ways. Building off of mass literacy and magnifying its impact, there is the Great Weirding of new media that might become a Great WEIRDing, as there is a simultaneous increase of text, voice, and image. Will the egoic mind be solidified or fall back into the bundled mind?

The challenge for the egoic identity project is that it takes a long time to build the external infrastructure of society that supports internal structures of identity (e.g., private property and the propertied self), since individualism does not stand alone. That is what modernity has been all about; and most of us have come to take it for granted, not realizing the effort and costs that went into it and that are continually invested in its maintenance, for good or ill. This is what the Enlightenment Age, in particular, was concerned with. Science and capitalism, democracy and technocracy involve constructing systems that reinforce egoic consistency, principled morality, and perceived objectivity. Liberal proceduralism, within democracy, has been one such system. It’s the attempt to create a legal and political system where all are treated equally, that is to say consistently and systematically. That is far unlike traditional societies, where people are intentionally not treated as equal because the context of social roles, positions, and identities determines how each person is treated; and that would be especially true of traditional societies where identity is far more fluid and relational, such that how even a single person is treated would vary according to situation. Much of what we think of as corruption in less ‘developed’ countries is simply people acting traditionally; such as what the WEIRD mind calls nepotism and bribery, where one treats others primarily according to the specific context of concrete relationships and roles, not abstract principles and legalistic code.

Obviously, liberal proceduralism doesn’t always work according to intention; or rather, the intention is often lacking or superficial. Even conservatives will nod toward liberal proceduralism because, to one degree or another, we are all liberals in a liberal society during this liberal age; but that doesn’t indicate an actual shared commitment to such liberal systems that promote, support, and defend a liberal mindset. Still, sometimes we have to pretend something is real before we might be able to finally manifest it as a shared reality; as a child play-acts what they might become as an adult; or as a revolution of the mind precedes revolution of society and politics, sometimes by a long period of time (e.g., the transition from the English Peasants’ Revolt and the English Civil War to the American Revolution and the American Civil War). This is what we are struggling with, as in the battle between science and what opposes and undermines it, mixed up with crises of expertise and replication, and involving problems of confirmation bias, the backlash effect, etc. The scientific method helps strengthen and shape the egoic structure of mind, helps an individual do what they could not do in isolation. We need systems that create transparency, hold us accountable, incentivize consistency, and allow us to more clearly see ourselves objectively or at least as others see us, systems that force us into self-awareness, be it egoic or otherwise.

All of this relates to why it’s so difficult to maintain liberalism, both in society and in the mind; as liberalism is one of the main expressions of the literary WEIRDing of Jaynesian consciousness. Liberalism is an energy-intensive state, similar to what Jaynes argues about consciousness; a hothouse flower that requires perfect conditions and well-developed structures, a flower that needs the hothouse to survive and thrive. Do anything to compromise the liberal mentality, from alcohol consumption to cognitive overload, and it instantly regresses into simpler mindsets such as the prejudicial thinking of the conservative persuasion. This is precisely why inegalitarian right-wingers and reactionaries (including those posing as liberals and leftists, moderates and centrists; e.g., the DNC elite) are forever attacking and undermining the very egalitarian foundations of liberal democracy, what makes liberal-mindedness possible at all; and so casting doubt on the radical and revolutionary possibility of the liberal dream. To be fair, there are real reasons for doubt; but the dark alternative of authoritarianism, as advocated on the reactionary right, is not a desirable option; and there is no easy path, besides maybe total collapse, for returning to the animistic and bicameral past.

This is a highly problematic dilemma, for we have become committed to this societal aspiration and civilizational project, based on centuries and millennia of path dependence, layers upon layers upon layers of interlocking cognitive introstructure (metaphorically, introjected structure), organizational intrastructure, societal infrastructure, and cultural superstructure. If we come to think this has been the wrong path all along, we’ll be scrambling to find a new way forward or sideways. In the conflict between what we are and what we pretend and hope to be, we will have to come to terms with the world we have collectively created across the generations. But maybe we are not actually schizoid and psychotic in our fumbling in the dark toward coherency; maybe we are not splintered within an internal self and not divided from external reality. If the bundled mind is and will always remain our psychic reality, our selves and identities have never not been pluralistic. Still, we might find a way of integrated balance between the bundled mind and the egoic identity, according to the integralist motto of transcend and include. It might not be a forced choice between two polar positions, a conflict between worldviews where one has to dominate and the other lose, as we’ve treated it so far. Until that changes, we will go on acting insane and denying our insanity, not recognizing in our fear that insanity itself is an illusion. We can never actually go against our own human nature, much less go against reality itself.

“When you know yourselves, then you will be known, and you will know that you are the sons of the living Father. But if you do not know yourselves, then you are in poverty, and you are poverty.”
~ Gospel of Thomas, Saying 3

“Barfield points to an “inwardization,” or a simultaneous intensification and consolidation of subjectivity, that has transpired over the evolution of humanity and whose results characterize the structure of our souls today. In fact, just because this represents what is normal to us, we hardly notice it, having no foil to set it off.”
~ Max Leyf, Mythos, Logos, and the Lamb of God: René Girard on the Scapegoat Mechanism

“Crazy job they gave me. But if I wasn’t doing it, someone else would be. And they might get it wrong. They might set Arctor up, plant drugs on him and collect a reward. Better it be me, despite the disadvantages. Just protecting everyone from Barris is justification in itself. What the hell am I talking about? I must be nuts. I know Bob Arctor. He’s a good person. He’s up to nothing. At least nothing too bad. In fact, he works for the Orange County Sheriff’s office covertly, which is probably why Barris is after him. But that wouldn’t explain why the Orange County Sheriff’s office is after him.

“Something big is definitely going down in this house. This rundown, rubble-filled house with its weed patch yard and cat box that never gets emptied. What a waste of a truly good house. So much could be done with it. A family and children could live here. It was designed for that. Such a waste. They ought to confiscate it and put it to better use. I’m supposed to act like they aren’t here. Assuming there’s a “they” at all. It may just be my imagination. Whatever it is that’s watching, it’s not human, unlike little dark eyed Donna. It doesn’t ever blink.

“What does a scanner see? Into the head? Down into the heart? Does it see into me, into us? Clearly or darkly? I hope it sees clearly, because I can’t any longer see into myself. I see only murk. I hope for everyone’s sake the scanners do better. Because if the scanner sees only darkly, the way I do, then I’m cursed and cursed again. I’ll only wind up dead this way, knowing very little, and getting that little fragment wrong too.”
~ A Scanner Darkly (movie)

The Agricultural Mind

Let me make an argument about (hyper-)individualism, rigid egoic boundaries, and hence Jaynesian consciousness (about Julian Jaynes, see other posts). But I’ll come at it from a less typical angle. I’ve been reading much about diet, nutrition, and health. With agriculture, the entire environment in which humans lived was fundamentally transformed, with the rise of inequality and hierarchy, concentrated wealth and centralized power; not to mention the increase of parasites and diseases from urbanization and close cohabitation with farm animals (The World Around Us). We might be able to thank early agricultural societies, for example, for introducing malaria to the world.

Maybe more importantly, there are significant links between what we eat and so much else: gut health, hormonal regulation, immune system, and neurocognitive functioning. There are multiple pathways, one of which is direct, connecting the gut and the brain: the nervous system, immune system, hormonal system, etc. Regarding the effect of diet and nutrition on immune response, including leaky gut, consider the lymphatic-brain link (Neuroscience News, Researchers Find Missing Link Between the Brain and Immune System), with the immune system as what some refer to as the “mobile mind” (Susan L. Prescott & Alan C. Logan, The Secret Life of Your Microbiome, pp. 64-7, pp. 249-50). As for a direct and near instantaneous gut-brain link, there was a recent discovery of the involvement of the vagus nerve, a possible explanation for the ‘gut sense’, with the key neurotransmitter glutamate modulating the rate of transmission in synaptic communication between enteroendocrine cells and vagal nerve neurons (Rich Haridy, Fast and hardwired: Gut-brain connection could lead to a “new sense”); and this is implicated in “episodic and spatial working memory” that might assist in the relocation of food sources (Rich Haridy, Researchers reveal how disrupting gut-brain communication may affect learning and memory). The gut is sometimes called the second brain because it also has neuronal cells, but in evolutionary terms it is the first brain. To demonstrate one example of a connection, many are beginning to refer to Alzheimer’s as type 3 diabetes, and dietary interventions have reversed symptoms in clinical studies. Also, gut microbes and parasites have been shown to influence our neurocognition and psychology, even altering personality traits and behavior, such as with Toxoplasma gondii. [For more discussion, see Fasting, Calorie Restriction, and Ketosis.]

The gut-brain link explains why glutamate as a food additive might be so problematic for so many people. Much of the research has looked at other health areas, such as metabolism or liver functioning. It would make more sense to look at its effect on neurocognition but, as with many other compounds, many scientists have dismissed the possibility of glutamate passing the blood-brain barrier. Yet we now know many things that were thought to be kept out of the brain do, under some conditions, get into the brain. After all, the same mechanisms that cause leaky gut (e.g., inflammation) can also cause permeability in the brain. So, we know a mechanism by which this could happen. Evidence is pointing in this direction: “MSG acts on the glutamate receptors and releases neurotransmitters which play a vital role in normal physiological as well as pathological processes (Abdallah et al., 2014). Glutamate receptors have three groups of metabotropic receptors (mGluR) and four classes of ionotropic receptors (NMDA, AMPA, delta and kainate receptors). All of these receptor types are present across the central nervous system. They are especially numerous in the hypothalamus, hippocampus and amygdala, where they control autonomic and metabolic activities (Zhu and Gouaux, 2017). Results from both animal and human studies have demonstrated that administration of even the lowest dose of MSG has toxic effects. The average intake of MSG per day is estimated to be 0.3-1.0 g (Solomon et al., 2015). These doses potentially disrupt neurons and might have adverse effects on behaviour” (Kamal Niaz, Extensive use of monosodium glutamate: A threat to public health?).

One possibility to consider is the role of exorphins, which are addictive and can be blocked in the same way as opioids. Exorphin, in fact, means external morphine-like substance, in the way that endorphin means indwelling morphine-like substance. Exorphins are found in milk and wheat. Milk, in particular, stands out. Even though exorphins are found in other foods, it’s been argued that they are insignificant because they theoretically can’t pass through the gut barrier, much less the blood-brain barrier. Yet exorphins have been measured elsewhere in the human body. One explanation is gut permeability (related to permeability throughout the body), which can be caused by many factors such as stress but also by milk itself. The purpose of milk is to get nutrients into the calf, and this is done by widening the spaces in the gut surface to allow more nutrients through the protective barrier. Exorphins get in as well and create a pleasurable experience to motivate the calf to drink more. Along with exorphins, grains and dairy also contain dopaminergic peptides, and dopamine is the other major addictive substance. It feels good to consume dairy, as with wheat, whether you’re a calf or a human, and so one wants more. Think about that the next time you pour milk over cereal.

Addiction, of food or drugs or anything else, is a powerful force. And it is complex in what it affects, not only physiologically and psychologically but also on a social level. Johann Hari offers a great analysis in Chasing the Scream. He makes the case that addiction is largely about isolation and that the addict is the ultimate individual (see To Put the Rat Back in the Rat Park, Rationalizing the Rat Race, Imagining the Rat Park, & Individualism and Isolation); and by the way this connects to Jaynesian consciousness with its rigid egoic boundaries, as opposed to the bundled and porous mind, the extended and enmeshed self of bicameralism and animism. It stands out to me that addiction and addictive substances have increased over the course of civilization, and I’ve argued that this is about a totalizing cultural system and a fully encompassing ideological worldview, what some call a reality tunnel (see discussion of addiction and social control in Diets and Systems & Western Individuality Before the Enlightenment Age). The growing of poppies, sugar cane, etc. came later on in civilization, as did the production of beer and wine — by the way, alcohol releases endorphins, sugar causes a serotonin high, and both activate the hedonic pathway. Also, grain and dairy were slow to catch on as a large part of the diet. Until recent centuries, most populations remained dependent on animal foods, including wild game (I discuss this era of dietary transition and societal transformation in numerous posts, with industrialization and technology pushing the already stressed agricultural mind to an extreme: Ancient Atherosclerosis?, To Be Fat And Have Bread, Autism and the Upper Crust, “Yes, tea banished the fairies.”, Voice and Perspective, Hubris of Nutritionism, Health From Generation To Generation, Dietary Health Across Generations, Moral Panic and Physical Degeneration, The Crisis of Identity, The Disease of Nostalgia, & Technological Fears and Media Panics). Americans, for example, ate large amounts of meat, butter, and lard from the colonial era through the 19th century (see Nina Teicholz, The Big Fat Surprise; passage quoted in full at Malnourished Americans). In 1900, Americans on average were getting only 10% of their calorie intake from carbohydrates and sugar was minimal, a potentially ketogenic diet considering how much lower-calorie the average diet was back then.
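To put rough numbers on that last claim (a back-of-the-envelope sketch, assuming an average intake of around 2,500 calories a day, a figure not given in the sources above): 10% of 2,500 calories is 250 calories and, at about 4 calories per gram of carbohydrate, that comes to roughly 60 grams of carbohydrate a day, below the 100 grams or so often cited as the upper limit for maintaining nutritional ketosis; and the lower the total calorie intake, the lower the gram count would have been.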

Something else to consider is that low-carb diets can alter how the body and brain function (the word ‘alter’ is inaccurate, though, since in evolutionary terms ketosis would’ve been the normal state; rather, the modern high-carb diet is what is altered from the biological norm). That is even more true if combined with intermittent fasting and restricted eating times, which would have been more common in the past (Past Views On One Meal A Day (OMAD)). Interestingly, this only applies to adults, since we know that babies remain in ketosis during breastfeeding, there is evidence that they are already in ketosis in utero, and well into the teen years humans apparently remain in ketosis: “It is fascinating to see that every single child, so far through age 16, is in ketosis even after a breakfast containing fruits and milk” (Angela A. Stanton, Children in Ketosis: The Feared Fuel). “I have yet to see a blood ketone test of a child anywhere in this age group that is not showing ketosis both before and after a meal” (Angela A. Stanton, If Ketosis Is Only a Fad, Why Are Our Kids in Ketosis?). Ketosis is not only safe but necessary for humans (“Is keto safe for kids?”). Taken together, earlier humans would have spent more time in ketosis (fat-burning mode, as opposed to glucose-burning), which dramatically affects human biology. The further one goes back in history, the greater the amount of time people probably spent in ketosis. One difference with ketosis is that, for many people, cravings and food addictions disappear. [For more discussion of this topic, see previous posts: Fasting, Calorie Restriction, and Ketosis, Ketogenic Diet and Neurocognitive Health, Is Ketosis Normal?, & “Is keto safe for kids?”.] Ketosis is a non-addictive or maybe even anti-addictive state of mind (Francisco Ródenas-González et al., Effects of ketosis on cocaine-induced reinstatement in male mice), similar to how certain psychedelics can be used to break addiction — one might argue there is a historical connection over the millennia between a decrease of psychedelic use and an increase of addictive substances: sugar, caffeine, nicotine, opium, etc. (Diets and Systems, “Yes, tea banished the fairies.”, & Wealth, Power, and Addiction). Many hunter-gatherer tribes can go days without eating and it doesn’t appear to bother them, such as in Daniel Everett’s account of the Pirahã, and that is typical of ketosis — fasting forces one into ketosis, if one isn’t already in ketosis, and beginning a fast in ketosis makes it even easier. This was also observed of Mongol warriors who could ride and fight for days on end without tiring or needing to stop for food. What is also different about hunter-gatherers and similar traditional societies is how communal they are or were, and how much more expansive their identities in belonging to a group, the opposite of the addictive egoic mind of high-carb agricultural societies. Anthropological research shows how hunter-gatherers often have a sense of personal space that extends into the environment around them. What if that isn’t merely cultural but has something to do with how their bodies and brains operate? Maybe diet even plays a role. Hold that thought for a moment.

Now go back to the two staples of the modern diet, grains and dairy. Besides exorphins and dopaminergic substances, they also have high levels of glutamate, as part of gluten and casein respectively. Dr. Katherine Reid is a biochemist whose daughter was diagnosed with severe autism. She went into research mode and experimented with supplementation and then diet. Many things seemed to help, but the greatest result came from restriction of dietary glutamate, a difficult challenge as it is a common food additive (see her TED talk here and another talk here or, for a short and informal video, look here). This requires going on a largely whole foods diet, that is to say eliminating processed foods (also see the Traditional Foods diet of Weston A. Price and Sally Fallon Morell, along with the GAPS diet of Natasha Campbell-McBride). But when dealing with a serious issue, it is worth the effort. Dr. Reid’s daughter showed such immense improvement that she was kicked out of the special needs school. After being on this diet for a while, she socialized and communicated normally like any other child, something she was previously incapable of. Keep in mind that glutamate, as mentioned above, is necessary as a foundational neurotransmitter in modulating communication between the gut and brain. But typically we only get small amounts of it, as opposed to the large doses found in the modern diet. In response to the TED Talk given by Reid, Georgia Ede commented that it’s “Unclear if glutamate is main culprit, b/c a) little glutamate crosses blood-brain barrier; b) anything that triggers inflammation/oxidation (i.e. refined carbs) spikes brain glutamate production.” Either way, glutamate plays a powerful role in brain functioning. And no matter the exact line of causation, industrially processed foods in the modern diet would be involved. By the way, an exacerbating factor might be mercury, in its relation to anxiety and adrenal fatigue, as it ramps up the fight-or-flight system via over-sensitizing the glutamate pathway — could this be involved in conditions like autism where emotional sensitivity is a symptom? Mercury and glutamate simultaneously increasing in the modern world demonstrates how industrialization can push the effects of the agricultural diet to ever further extremes.

Glutamate is also implicated in schizophrenia: “The most intriguing evidence came when the researchers gave germ-free mice fecal transplants from the schizophrenic patients. They found that “the mice behaved in a way that is reminiscent of the behavior of people with schizophrenia,” said Julio Licinio, who co-led the new work with Wong, his research partner and spouse. Mice given fecal transplants from healthy controls behaved normally. “The brains of the animals given microbes from patients with schizophrenia also showed changes in glutamate, a neurotransmitter that is thought to be dysregulated in schizophrenia,” he added. The discovery shows how altering the gut can influence an animal’s behavior” (Roni Dengler, Researchers Find Further Evidence That Schizophrenia is Connected to Our Guts; reporting on Peng Zheng et al., The gut microbiome from patients with schizophrenia modulates the glutamate-glutamine-GABA cycle and schizophrenia-relevant behaviors in mice, Science Advances journal). And glutamate is involved in other conditions as well, such as in relation to GABA: “But how do microbes in the gut affect [epileptic] seizures that occur in the brain? Researchers found that the microbe-mediated effects of the Ketogenic Diet decreased levels of enzymes required to produce the excitatory neurotransmitter glutamate. In turn, this increased the relative abundance of the inhibitory neurotransmitter GABA. Taken together, these results show that the microbe-mediated effects of the Ketogenic Diet have a direct effect on neural activity, further strengthening support for the emerging concept of the ‘gut-brain’ axis.” (Jason Bush, Important Ketogenic Diet Benefit is Dependent on the Gut Microbiome). Glutamate is one neurotransmitter among many that can be affected in a similar manner; e.g., serotonin is also produced in the gut.

That reminds me of propionate, a short-chain fatty acid and the conjugate base of propionic acid. It is another substance normally taken in at a low level. Certain foods, including grains and dairy, contain it. The problem is that, as a useful preservative, it has been generously added to the food supply. Research on rodents shows injecting them with propionate causes autistic-like behaviors. And other rodent studies show how this stunts learning ability and causes repetitive behavior (both related to the autistic demand for the familiar), as too much propionate entrenches mental patterns through the mechanism that gut microbes use to communicate to the brain how to return to a needed food source, similar to the related function of glutamate. A recent study shows that propionate alters not only brain functioning but brain development (L.S. Abdelli et al., Propionic Acid Induces Gliosis and Neuro-inflammation through Modulation of PTEN/AKT Pathway in Autism Spectrum Disorder), and this is a growing field of research (e.g., Hyosun Choi, Propionic acid induces dendritic spine loss by MAPK/ERK signaling and dysregulation of autophagic flux). As reported by Suhtling Wong-Vienneau at the University of Central Florida, “when fetal-derived neural stem cells are exposed to high levels of Propionic Acid (PPA), an additive commonly found in processed foods, it decreases neuron development” (Processed Foods May Hold Key to Rise in Autism). This study “is the first to discover the molecular link between elevated levels of PPA, proliferation of glial cells, disturbed neural circuitry and autism.”

The impact is profound and permanent — Pedersen offers the details: “In the lab, the scientists discovered that exposing neural stem cells to excessive PPA damages brain cells in several ways: First, the acid disrupts the natural balance between brain cells by reducing the number of neurons and over-producing glial cells. And although glial cells help develop and protect neuron function, too many glia cells disturb connectivity between neurons. They also cause inflammation, which has been noted in the brains of autistic children. In addition, excessive amounts of the acid shorten and damage pathways that neurons use to communicate with the rest of the body. This combination of reduced neurons and damaged pathways hinder the brain’s ability to communicate, resulting in behaviors that are often found in children with autism, including repetitive behavior, mobility issues and inability to interact with others.” According to this study, “too much PPA also damaged the molecular pathways that normally enable neurons to send information to the rest of the body. The researchers suggest that such disruption in the brain’s ability to communicate may explain ASD-related characteristics such as repetitive behavior and difficulties with social interaction” (Ana Sandoiu, Could processed foods explain why autism is on the rise?).

So, the autistic brain develops according to higher levels of propionate and maybe becomes accustomed to it. A state of dysfunction becomes what feels normal. Propionate causes inflammation and, as Dr. Ede points out, “anything that triggers inflammation/oxidation (i.e. refined carbs) spikes brain glutamate production”. High levels of propionate and glutamate become part of the state of mind the autistic becomes identified with. It all links together. Autistics, along with cravings for foods containing propionate (and glutamate), tend to have larger populations of a particular gut microbe that produces propionate. This might be why antibiotics, in killing microbes, can help with autism. But in the case of depression, gut issues are associated instead with the lack of certain microbes that produce butyrate, another important substance that is also found in certain foods (Mireia Valles-Colomer et al., The neuroactive potential of the human gut microbiota in quality of life and depression). Depending on the specific gut dysbiosis, diverse neurocognitive conditions can result. And in affecting the microbiome, changes in autism can be achieved through a ketogenic diet, which temporarily reduces the microbiome (similar to an antibiotic) — this presumably takes care of the problematic microbes and readjusts the gut from dysbiosis to a healthier balance. Also, ketosis would reduce the inflammation that is associated with glutamate production.

As with propionate, exorphins injected into rats will likewise elicit autistic-like behaviors. By two different pathways, the body produces exorphins and propionate from the consumption of grains and dairy, the former from the breakdown of proteins and the latter produced by gut bacteria in the breakdown of some grains and refined carbohydrates (combined with the propionate used as a food additive; also, at least in rodents, artificial sweeteners increase propionate levels). [For related points and further discussion, see the section below about vitamin B1 (thiamine/thiamin). Also covered are other B vitamins and nutrients.] This is part of the explanation for why many autistics have responded well to ketosis from carbohydrate restriction, specifically paleo diets that eliminate both wheat and dairy. But ketones themselves play a role: they use the same transporters as propionate and so block its buildup in cells and, of course, ketones offer a different energy source for cells as a replacement for glucose, which alters how cells function, specifically neurocognitive functioning and its attendant psychological effects.

There are some other factors to consider as well. With agriculture came a diet high in starchy carbohydrates and sugar. This inevitably leads to increased metabolic syndrome, including diabetes. And diabetes in pregnant women is associated with autism and attention deficit disorder in children. “Maternal diabetes, if not well treated, which means hyperglycemia in utero, that increases uterine inflammation, oxidative stress and hypoxia and may alter gene expression,” explained Anny H. Xiang. “This can disrupt fetal brain development, increasing the risk for neural behavior disorders, such as autism” (Maternal HbA1c influences autism risk in offspring); by the way, other factors such as getting more seed oils and fewer B vitamins also contribute to metabolic syndrome and altered gene expression, including being inherited epigenetically, not to mention mutagenic changes to the genes themselves (Catherine Shanahan, Deep Nutrition). The increase of diabetes, not a mere increase of diagnosis, could partly explain the greater prevalence of autism over time. Grain surpluses only became available in the 1800s, around the time when refined flour and sugar began to become common. It wasn’t until the following century that carbohydrates finally overtook animal foods as the mainstay of the diet, specifically in terms of what is most regularly eaten throughout the day in both meals and snacks — a constant influx of glucose into the system.

A further contributing factor in modern agriculture is that of pesticides, also associated with autism. Consider DDE, a product of DDT, which has been banned for decades but apparently still lingers in the environment. “The odds of autism among children were increased, by 32 percent, in mothers whose DDE levels were high (high was, comparatively, 75th percentile or greater),” one study found (Aditi Vyas & Richa Kalra, Long lingering pesticides may increase risk for autism: Study). “Researchers also found,” the article reports, “that the odds of having children on the autism spectrum who also had an intellectual disability were increased more than two-fold when the mother’s DDE levels were high.” A different study showed a broader effect in terms of 11 pesticides still in use:

“They found a 10 percent or more increase in rates of autism spectrum disorder, or ASD, in children whose mothers lived during pregnancy within about a mile and a quarter of a highly sprayed area. The rates varied depending on the specific pesticide sprayed, and glyphosate was associated with a 16 percent increase. Rates of autism spectrum disorders combined with intellectual disability increased by even more, about 30 percent. Exposure after birth, in the first year of life, showed the most dramatic impact, with rates of ASD with intellectual disability increasing by 50 percent on average for children who lived within the mile-and-a-quarter range. Those who lived near glyphosate spraying showed the most increased risk, at 60 percent” (Nicole Ferox, It’s Personal: Pesticide Exposures Come at a Cost).

An additional component to consider is plant anti-nutrients. For example, oxalates may be involved in autism spectrum disorder (Jerzy Konstantynowicz et al., A potential pathogenic role of oxalate in autism). With the end of the Ice Age, vegetation became more common and some of the animal foods less common. That increased plant foods as part of the human diet. But even then their role was limited and seasonal. The dying off of the megafauna was a greater blow, as it forced humans to rely both on less desirable lean meats from smaller prey and on more plant foods. And of course, the agricultural revolution followed shortly after that, with its devastating effects. None of these changes were kind to human health and development, as the evidence shows in the human bones and mummies left behind. Yet they were minor compared to what was to come. The increase of plant foods was a slow process over millennia. All the way up to the 19th century, Americans were eating severely restricted amounts of plant foods and instead depending on fatty animal foods, from pasture-raised butter and lard to wild-caught fish and deer — the abundance of wilderness and pasturage made such foods widely available, convenient, and cheap, besides being delicious and nutritious. Grain crops and vegetable gardens were simply too hard to grow, as described by Nina Teicholz in The Big Fat Surprise (see quoted passage at Malnourished Americans).

At Walden Pond, Henry David Thoreau maintained a garden of beans, peas, corn, turnips, and potatoes. Such a plant-based diet (Jennie Richards, Henry David Thoreau Advocated “Leaving Off Eating Animals”) surely contributed to his declining health from tuberculosis by weakening his immune system through deficiency in the fat-soluble vitamins, although his mother, who lived nearby, occasionally made him a fruit pie that would’ve had nutritious lard in the crust: “lack of quality protein and excess of carbohydrate foods in Thoreau’s diet as probable causes behind his infection” (Dr. Benjamin P. Sandler, Thoreau, Pulmonary Tuberculosis and Dietary Deficiency). Likewise, Franz Kafka, who became a vegetarian, also died from tuberculosis (Old Debates Forgotten). Weston A. Price observed the link between deficiency of fat-soluble vitamins and high rates of tuberculosis; not that one causes the other, but a nutritious diet is key to a strong immune system (Dr. Kendrick On Vaccines & Moral Panic and Physical Degeneration). Besides, eliminating fatty animal foods typically means increasing starchy and sugary plant foods, which lessens the anti-inflammatory response from ketosis and autophagy and hence the capacity for healing.

The connection of physical health to mental health, another insight of Price’s, should be re-emphasized. Interestingly, Kafka suffered from psychological, presumably neurocognitive, issues long before tubercular symptoms showed up, and he came to see the link between them as causal, although he saw it the other way around, as psychosomatic. Even more intriguing, Kafka suggests that, as Sander L. Gilman put it, “all urban dwellers are tubercular,” as if it were a nervous condition of modern civilization akin to what used to be called neurasthenia (about Kafka’s case, see Sander L. Gilman’s Franz Kafka, the Jewish Patient). He even uses the popular economic model of energy and health: “For secretly I don’t believe this illness to be tuberculosis, at least primarily tuberculosis, but rather a sign of general bankruptcy” (for context, see The Crisis of Identity). Speaking of the eugenic, hygienic, sociological, and aesthetic, Gilman further notes that, “For Kafka, that possibility is linked to the notion that illness and creativity are linked, that tuberculars are also creative geniuses,” indicating an interpretation of neurasthenia among the intellectual class, an interpretation that was more common in the United States than in Europe.

The upper classes were deemed the most civilized and so it was expected that they’d suffer the most from the diseases of civilization; and indeed the upper classes fully adopted the modern industrial diet before the rest of the population. In contrast, while staying at a sanatorium (a combination of the rest cure and the west cure), Kafka stated that, “I am firmly convinced, now that I have been living here among consumptives, that healthy people run no danger of infection. Here, however, the healthy are only the woodcutters in the forest and the girls in the kitchen (who will simply pick uneaten food from the plates of patients and eat it—patients whom I shrink from sitting opposite) but not a single person from our town circles,” from a letter to Max Brod on March 11, 1921. It should be pointed out that tuberculosis sanatoriums were typically located in rural mountain areas where local populations were known to be healthy, the kinds of communities Weston A. Price studied in the 1930s; a similar reason explains why in America tuberculosis patients were sometimes sent west (the west cure) for clean air and a healthy lifestyle, probably with an accompanying change toward a rural diet, with more wild-caught animal foods higher in omega-3s and lower in omega-6s, not to mention higher in fat-soluble vitamins.

The historical context of public health overlapped with racial hygiene, and indeed some of Kafka’s family members and lovers would later die at the hands of Nazis. Eugenicists were obsessed with body types in relation to supposed racial features, but non-eugenicists also accepted that physical structure was useful information to be considered; and this insight, if not the eugenicist ideology, is supported by the more recent scientific measurements of stunted bone development in early agricultural societies. Hermann Brehmer, a founder of the sanatorium movement, asserted that a particular body type (habitus phthisicus, equivalent to habitus asthenicus) was associated with tuberculosis, the kind of thinking that Weston A. Price would pick up in his observations of physical development, although Price saw the explanation as dietary and not racial. The other difference is that Price saw “body type” not as a cause but as a symptom of ill health, and so the focus on re-forming the body (through lung exercises, orthopedic corsets, etc.) to improve health was not the most helpful advice. On the other hand, if re-forming the body involved something like the west cure in changing the entire lifestyle and environmental conditions, it might work by way of changing other factors of health; along with diet, exercise and sunshine and clean air and water would definitely improve immune function, lower inflammation, and much else (sanatoriums prioritized such things as getting plenty of sunshine and dairy, both of which would increase vitamin D3, which is necessary for immunological health). Improvements in physical health, of course, would go hand in hand with those of mental health. An example of this is that winter conceptions, when vitamin D3 production is low, result in higher rates of childhood learning disabilities and other problems in neurocognitive development later on (BBC, Learning difficulties linked with winter conception).

As a side note, physical development was tied up with gender issues and gender roles, especially for boys in becoming men. There arose a fear that the newer generations of urban youth were failing to develop properly, physically and mentally, morally and socially. Fitness became a central concern for the civilizational project, and it was feared that we modern humans might fail this challenge. Most galling of all was ‘feminization’, not only the loss of an athletic build but the loss of something in the masculine psychology, involving the depression and anxiety, sensitivity and weakness of conditions like neurasthenia, while also overlapping with tubercular consumption. Some of this could be projected onto racial inferiority, far from limited to the distinction between those of European descent and all others, for it was also used to divide humanity up in numerous ways (German vs French, English vs Irish, North vs South, rich vs poor, Protestants vs Catholics, Christians vs Jews, etc).

Gender norms were applied to all aspects of health and development, including perceived moral character and personality disposition. This was seen as a danger to the individual, but also potentially a danger to society. “Here we can return for the moment to the notion that the male Jew is feminized like the male tubercular. The tubercular’s progressive feminization begins in the middle of the nineteenth century with the introduction of the term: infemminire, to feminize, which is supposedly a result of male castration. By the 1870s, the term is used to describe the feminisme of the male through the effects of other disease, such as tuberculosis. Henry Meige, at the Salpetriere, saw this feminization as an atavism, in which the male returns to the level of the “sexless” child. Feminization is therefore a loss, which can cause masturbation and thus illness in certain predisposed individuals. It is also the result of actual castration or its physiological equivalent, such as an intensely debilitating illness like tuberculosis, which reshapes the body” (Sander L. Gilman, Franz Kafka, the Jewish Patient). There was a fear that all of civilization was becoming effeminate, especially among the upper classes who were expected to be the leaders. That was the entire framework of the neurasthenia-obsessed rhetoric in late nineteenth to early twentieth century America. The newer generations of boys, the argument went, were somehow deficient and inadequate. Looking back on that period, there is no doubt that physical and mental illness was increasing, while bone structure was becoming underdeveloped in a way one could perceive as effeminate; such bone development problems are particularly obvious among children raised on plant-based diets, especially veganism and near-vegan vegetarianism, but also on any diet lacking nutritious animal foods.

Let me make one odd connection before moving on. The Seventh Day Adventist Dr. John Harvey Kellogg believed masturbation was both a moral sin and a cause of ill health, as well as a sign of inferiority, and his advocacy of a high-fiber vegan diet including breakfast cereals was based on the Galenic theory that such foods decreased libido. Dr. Kellogg was also an influential eugenicist and operated a famous sanatorium. He wasn’t alone in blaming masturbation for disease. The British Dr. D. G. Macleod Munro treated masturbation as a contributing factor for tuberculosis: “the advent of the sexual appetite in normal adolescence has a profound effect upon the organism, and in many cases when uncontrolled, leads to excess about the age when tuberculosis most frequently delivers its first open assault upon the body,” as quoted by Gilman. This relates to the ‘bankruptcy’ Kafka mentioned, the idea that one could waste one’s energy reserves. Maybe there is an insight in this belief, despite it being misguided and misinterpreted. The source of the ‘bankruptcy’ may in part have been a nutritional debt, and certainly a high-fiber vegan diet would not refill one’s energy and nutrient reserves as an investment in one’s health — hence, the public health risk of what one might call a hyper-agricultural diet as exemplified by the USDA dietary recommendations and corporate-backed dietary campaigns like EAT-Lancet (Dietary Dictocrats of EAT-Lancet; & Corporate Veganism), though the course is maybe finally being reversed (Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines; American Diabetes Association Changes Its Tune; & Corporate Media Slowly Catching Up With Nutritional Studies).

So far, my focus has mostly been on what we ingest or are otherwise exposed to because of agriculture and the food system, in general and more specifically in industrialized society with its refined, processed, and adulterated foods, largely from plants. But the other side of the picture is what our diet is lacking, what we are deficient in. As I touched upon directly above, an agricultural diet hasn’t only increased certain foods and substances but simultaneously decreased others. What promoted optimal health throughout human evolution has, in many cases, been displaced or interrupted. Agriculture is highly destructive and has depleted the nutrient levels in the soil (Carnivore Is Vegan) and, along with this, even animal foods produced within the agricultural system are similarly depleted of nutrients as compared to animal foods from pasture or free-range sources. For example, the fat-soluble vitamins (true vitamin A as retinol, vitamin D3, vitamin K2 not to be confused with K1, and the vitamin E complex) are not found in plant foods and are found in far lower concentrations in foods from factory-farmed animals or animals grazing on soil impoverished by agriculture, especially under the threat of erosion and desertification. Rhonda Patrick points to deficiencies of vitamin D3, EPA, and DHA, and hence insufficient serotonin levels, as being causally linked to autism, ADHD, bipolar disorder, schizophrenia, etc (TheIHMC, Rhonda Patrick on Diet-Gene Interactions, Epigenetics, the Vitamin D-Serotonin Link and DNA Damage). She also discusses inflammation, epigenetics, and DNA damage, which relates to the work of others (Dr. Catherine Shanahan On Dietary Epigenetics and Mutations).

One of the biggest changes with agriculture was the decrease of fatty animal foods that were nutrient-dense and nutrient-bioavailable. The fat-soluble vitamins are found in the fat, and fat is necessary for their absorption (hence, fat-soluble); and these key nutrients relate to almost everything else, such as the minerals calcium and magnesium that are also found in animal foods (Calcium: Nutrient Combination and Ratios); the relationship of seafood with the balance of sodium, magnesium, and potassium is central (On Salt: Sodium, Trace Minerals, and Electrolytes), and indeed populations that eat more seafood live longer. These animal foods used to hold the prized position in the human diet, and in the earlier hominid diet as well, as part of our evolutionary inheritance from millions of years of adaptation to a world where fatty animals once were abundant (J. Tyler Faith, John Rowan & Andrew Du, Early hominins evolved within non-analog ecosystems). That was definitely true in the paleolithic before the megafauna die-off, but even to this day hunter-gatherers, when they have access to traditional territory and prey, will seek out the fattest animals available, entirely ignoring lean animals because rabbit starvation is worse than hunger (humans can always fast for many days or weeks, if necessary, and as long as they have reserves of body fat they can remain perfectly healthy).

We’ve already discussed autism in terms of many other dietary factors, especially excesses of otherwise essential nutrients like glutamate, propionate, and butyrate. But like most modern people, those on the autistic spectrum can be nutritionally deficient in other ways, and unsurprisingly that would involve fat-soluble vitamins. In a fascinating discussion in one of her more recent books, Nourishing Fats, Sally Fallon Morell offers a hypothesis of an indirect causal mechanism. First off, she notes that, “Dr. Mary Megson of Richmond, Virginia, had noticed that night blindness and thyroid conditions—both signs of vitamin A deficiency—were common in family members of autistic children” (p. 156), indicating a probable deficiency of the same in the affected child. This might be why supplementing cod liver oil, high in true vitamin A, helps with autistic issues. “As Dr. Megson explains, in genetically predisposed children, autism is linked to a G-alpha protein defect. G-alpha proteins form one of the most prevalent signaling systems in our cells, regulating processes as diverse as cell growth, hormonal regulation and sensory perception—like seeing” (p. 157).

The sensory issues common among autistics may seem to be neurocognitive in origin, but the perceptual and psychological effects may be secondary to an underlying cause in altered eye development. Because the rods in their eyes don’t function properly, they have distorted vision that is experienced as a blurry and divided visual field, like a magic-eye puzzle, and it takes constant effort to make coherent sense of the world around them. “According to Megson, the blocked visual pathways explain why children on the autism spectrum “melt down” when objects are moved or when you clean up their lines or piles of toys sorted by color. They work hard to piece together their world; it frightens and overwhelms them when the world as they are able to see it changes. It also might explain why children on the autism spectrum spend time organizing things so carefully. It’s the only way they can “see” what’s out there” (p. 157). The rods at the edge of their vision work better, and so they prefer to not look directly at people.

The vitamin A link is not merely speculative. In other aspects seen in autism, studies have sussed out some of the proven and possible factors and mechanisms: “Decreased vitamin A, and its retinoic acid metabolites, lead to a decrease in CD38 and associated changes that underpin a wide array of data on the biological underpinnings of ASD, including decreased oxytocin, with relevance both prenatally and in the gut. Decreased sirtuins, poly-ADP ribose polymerase-driven decreases in nicotinamide adenine dinucleotide (NAD+), hyperserotonemia, decreased monoamine oxidase, alterations in 14-3-3 proteins, microRNA alterations, dysregulated aryl hydrocarbon receptor activity, suboptimal mitochondria functioning, and decreases in the melatonergic pathways are intimately linked to this. Many of the above processes may be modulating, or mediated by, alterations in mitochondria functioning. Other bodies of data associated with ASD may also be incorporated within these basic processes, including how ASD risk factors such as maternal obesity and preeclampsia, as well as more general prenatal stressors, modulate the likelihood of offspring ASD” (Michael Maes et al, Integrating Autism Spectrum Disorder Pathophysiology: Mitochondria, Vitamin A, CD38, Oxytocin, Serotonin and Melatonergic Alterations in the Placenta and Gut). By the way, some of the pathways involved are often discussed in terms of longevity, which indicates autistics might be at risk for a shortened lifespan. Autism, indeed, is comorbid with numerous other health issues and genetic syndromes. So autism isn’t just an atypical expression on a healthy spectrum of neurodiversity.

The agricultural diet, especially in its industrially-processed variety, has a powerful impact on numerous systems simultaneously, as autism demonstrates. As with most other health conditions, there is unlikely to be any single causal factor or causal mechanism. We can take this a step further. With historical changes in diet, it wasn’t only fat-soluble vitamins that were lost. Humans traditionally ate nose-to-tail and this brought with it a plethora of nutrients, even some thought of as being sourced only from plant foods. In its raw or lightly cooked form, meat has more than enough vitamin C for a low-carb diet; whereas a high-carb diet, since glucose competes with vitamin C, requires higher intake of this antioxidant, which can lead to deficiencies at levels that otherwise would be adequate (Sailors’ Rations, a High-Carb Diet). Also, consider that prebiotics can be found in animal foods as well, and animal-based prebiotics likely feed a very different kind of microbiome that could shift so much else in the body, such as neurotransmitter production: “I found this list of prebiotic foods that were non-carbohydrate that included cellulose, cartilage, collagen, fructooligosaccharides, glucosamine, rabbit bone, hair, skin, glucose. There’s a bunch of things that are all — there’s also casein. But these tend to be some of the foods that actually have some of the highest prebiotic content” (from Vanessa Spina, as quoted in Fiber or Not: Short-Chain Fatty Acids and the Microbiome).

Let’s briefly mention fat-soluble vitamins again in making a point about other animal-based nutrients. Fat-soluble vitamins, similar to ketosis and autophagy, have a profound effect on human biological functioning, including that of the mind (see the work of Weston A. Price as discussed in Health From Generation To Generation; also see the work of those described in Physical Health, Mental Health). In many ways, they are closer to hormones than mere nutrients, as they orchestrate entire systems in the body and how other nutrients get used, as particularly seen with vitamin K2, which Weston A. Price discovered and called “Activator X” (only found in animal and fermented foods, not in whole or industrially-processed plant foods). I bring this up because some other animal-based nutrients play a similarly important role. Consider glycine, the main amino acid in collagen. It is available in connective tissues and can be obtained through soups and broths made from bones, skin, ligaments, cartilage, and tendons. Glycine is right up there with the fat-soluble vitamins in being central to numerous systems, processes, and organs.

As I’ve already discussed glutamate at great length, let me further that discussion by pointing out a key link. “Glycine is found in the spinal cord and brainstem where it acts as an inhibitory neurotransmitter via its own system of receptors,” writes Afifah Hamilton. “Glycine receptors are ubiquitous throughout the nervous system and play important roles during brain development. [Ito, 2016] Glycine also interacts with the glutaminergic neurotransmission system via NMDA receptors, where both glycine and glutamate are required, again, chiefly exerting inhibitory effects” (10 Reasons To Supplement With Glycine). Hamilton elucidates the dozens of roles played by this master nutrient and the diverse conditions that follow from its deprivation or insufficiency — it’s implicated in obsessive compulsive disorder, schizophrenia, and alcohol use disorder, along with much else such as metabolic syndrome. But its being essential to glutamate metabolism really stands out for this discussion. “Glutathione is synthesised,” Hamilton further explains, “from the amino acids glutamate, cysteine, and glycine, but studies have shown that the rate of synthesis is primarily determined by levels of glycine in the tissue. If there is insufficient glycine available the glutathione precursor molecules are excreted in the urine. Vegetarians excrete 80% more of these precursors than their omnivore counterparts indicating a more limited ability to complete the synthesis process.” Did you catch what she is saying there? Autistics already have too much glutamate and, if they are deficient in glycine, they won’t be able to convert glutamate into the important glutathione. When the body is overwhelmed with unused glutamate, it does what it can to eliminate it, but when constantly flooded with high-glutamate intake it can’t keep up. The excess glutamate then wreaks havoc on neurocognitive functioning.
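To make the glycine bottleneck concrete, here is the standard two-step glutathione synthesis in schematic form (a textbook biochemistry summary, not taken from Hamilton’s article):

```latex
% Step 1: glutamate-cysteine ligase
\[ \text{glutamate} + \text{cysteine} + \text{ATP} \longrightarrow \gamma\text{-glutamylcysteine} + \text{ADP} + \text{P}_i \]
% Step 2: glutathione synthetase (the glycine-dependent step)
\[ \gamma\text{-glutamylcysteine} + \text{glycine} + \text{ATP} \longrightarrow \text{glutathione (GSH)} + \text{ADP} + \text{P}_i \]
```

If glycine is scarce, the pathway stalls after the first step, which fits Hamilton’s point about glutathione precursors being excreted in the urine rather than completing synthesis.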

The whole mess of the agricultural diet, specifically in its modern industrialized form, has been a constant onslaught taxing our bodies and minds. And the consequences are worsening with each generation. What stands out to me about autism, in particular, is how isolating it is. The repetitive behavior and focus on objects to the exclusion of human relationships resonates with how addiction isolates the individual. As with other conditions influenced by diet (schizophrenia, ADHD, etc), both autism and addiction block normal human relating by creating an obsessive mindset that, in the most extreme forms, blocks out all else. I wonder if all of us moderns are simply expressing milder varieties of this biological and neurological phenomenon (Afifah Hamilton, Why No One Should Eat Grains. Part 3: Ten More Reasons to Avoid Wheat). And this might be the underpinning of our hyper-individualistic society, with the earliest precursors showing up in the Axial Age following what Julian Jaynes hypothesized as the breakdown of the much more other-oriented bicameral mind. What if our egoic consciousness with its rigid psychological boundaries is the result of our food system, as part of the civilizational project of mass agriculture?

* * *

Mongolian Diet and Fasting:

“Heaven grew weary of the excessive pride and luxury of China… I am from the Barbaric North. I wear the same clothing and eat the same food as the cowherds and horse-herders. We make the same sacrifices and we share our riches. I look upon the nation as a new-born child and I care for my soldiers as though they were my brothers.”
~Genghis Khan, letter of invitation to Ch’ang Ch’un

For anyone who is curious to learn more, the original point of interest was a quote by Jack Weatherford in his book Genghis Khan and the Making of the Modern World. He wrote that, “The Chinese noted with surprise and disgust the ability of the Mongol warriors to survive on little food and water for long periods; according to one, the entire army could camp without a single puff of smoke since they needed no fires to cook. Compared to the Jurched soldiers, the Mongols were much healthier and stronger. The Mongols consumed a steady diet of meat, milk, yogurt, and other dairy products, and they fought men who lived on gruel made from various grains. The grain diet of the peasant warriors stunted their bones, rotted their teeth, and left them weak and prone to disease. In contrast, the poorest Mongol soldier ate mostly protein, thereby giving him strong teeth and bones. Unlike the Jurched soldiers, who were dependent on a heavy carbohydrate diet, the Mongols could more easily go a day or two without food.” By the way, that biography was written by an anthropologist who lived among and studied the Mongols for years. It is about the historical Mongols, but filtered through the direct experience of still-existing Mongol people who have maintained a traditional diet and lifestyle longer than most other populations.

As nomadic herders living on arid grasslands with no option of farming, they had limited access to plant foods from foraging, and so they relied on animal foods, a diet easily sustained in horseback warfare, even over long distances when food stores ran out. That meant, when they had nothing else, on “occasion they will sustain themselves on the blood of their horses, opening a vein and letting the blood jet into their mouths, drinking till they have had enough, and then staunching it.” They could go on “quite ten days like this,” according to Marco Polo’s observations. “It wasn’t much,” explained Logan Nye, “but it allowed them to cross the grasses to the west and hit Russia and additional empires. […] On the even darker side, they also allegedly ate human flesh when necessary. Even killing the attached human if horses and already-dead people were in short supply” (How Mongol hordes drank horse blood and liquor to kill you). The claim of their situational cannibalism came from the writings of Giovanni da Pian del Carpini, who noted they’d eat anything, even lice. The specifics of what they ate were also determined by season: “Generally, the Mongols ate dairy in the summer, and meat and animal fat in the winter, when they needed the protein for energy and the fat to help keep them warm in the cold winters. In the summers, their animals produced a lot of milk so they switched the emphasis from meat to milk products” (from History on the Net, What Did the Mongols Eat?). In any case, animal foods were always the staple.

By the way, some have wondered how long humans have been consuming dairy, since the gene for lactose tolerance is fairly recent. In fact, “a great many Mongolians, both today and in Genghis Khan’s time are lactose intolerant. Fermentation breaks down the lactose, removing it almost entirely, making it entirely drinkable to the Mongols” (from Exploring History, Food That Conquered The World: The Mongols — Nomads And Chaos). Besides mare’s milk fermented into alcohol, they had a wide variety of other cultured dairy and aged cheese. Even then, much of the dairy would contain significant amounts of lactose. A better explanation is that many of the dairy-loving microbes have been incorporated into the Mongolian microbiome, and these microbes in combination as a microbial ecosystem do some combination of the following: digesting lactose, moderating the effects of lactose intolerance, and/or somehow altering the body’s response to lactose. But looking at a single microbe might not tell us much. “Despite the dairy diversity she saw,” wrote Andrew Curry, “an estimated 95 percent of Mongolians are, genetically speaking, lactose intolerant. Yet, in the frost-free summer months, she believes they may be getting up to half their calories from milk products. […] Rather than a previously undiscovered strain of microbes, it might be a complex web of organisms and practices—the lovingly maintained starters, the milk-soaked felt of the yurts, the gut flora of individual herders, the way they stir their barrels of airag—that makes the Mongolian love affair with so many dairy products possible” (The answer to lactose intolerance might be in Mongolia).

Here is what is interesting. Based on the study of ancient corpses, it’s been determined that lactose intolerant people in this region have been including dairy in their diet for 5,000 years. And it’s not limited to the challenge of lactose intolerant people depending on a food staple that is abundant in lactose. The Mongolian population also has high rates of carrying the APOE4 gene variation that can make a diet high in saturated fat problematic (Helena Svobodová et al, Apolipoprotein E gene polymorphism in the Mongolian population). That is a significant detail, considering dairy fat has a higher proportion of saturated fat than almost any other food. These people should be keeling over with nearly every disease known to humanity, particularly as they commonly drink plenty of alcohol and smoke tobacco (as was likewise true of the heart-healthy and long-lived residents of mid-20th century Roseto, Pennsylvania with their love of meat, lard, alcohol, and tobacco; see Blue Zones Dietary Myth). Yet, it’s not the traditional Mongolians but the industrialized Mongolians who show all the health problems. A major difference between these two populations in Mongolia is diet, much of it a difference in the amount of low-carb animal foods eaten versus high-carb plant foods. Genetics are not deterministic, not in the slightest. As some others have noted, the traditional Mongolian diet would be accurately described as a low-carb paleo diet that, in the wintertime, would often have been a strict carnivore diet and ketogenic diet; although even rural Mongolians, unlike in the time of Genghis Khan, now get a bit more starchy agricultural foods. Maybe there is a protective health factor found in a diet that relies on nutrient-dense animal foods and leans toward the ketogenic.

It isn’t only that the Mongolian diet was likely ketogenic because of being low-carbohydrate, particularly on their meat-based winter diet, but also because it involved fasting. In Mongolia, the Tangut Country, and the Solitudes of Northern Tibet, Volume 1 (1876), Nikolaĭ Mikhaĭlovich Przhevalʹskiĭ writes in the second note on p. 65 under the section Calendar and Year-Cycle: “On the New Year’s Day, or White Feast of the Mongols, see ‘Marco Polo’, 2nd ed. i. p. 376-378, and ii. p. 543. The monthly festival days, properly for the Lamas days of fasting and worship, seem to differ locally. See note in same work, i. p. 224, and on the Year-cycle, i. p. 435.” This is alluded to in another text, which describes how such things as fasting were the norm of that time: “It is well known that both medieval European and traditional Mongolian cultures emphasized the importance of eating and drinking. In premodern societies these activities played a much more significant role in social intercourse as well as in religious rituals (e.g., in sacrificing and fasting) than nowadays” (Antti Ruotsala, Europeans and Mongols in the middle of the thirteenth century, 2001). A science journalist trained in biology, Dyna Rochmyaningsih, also mentions this: “As a spiritual practice, fasting has been employed by many religious groups since ancient times. Historically, ancient Egyptians, Greeks, Babylonians, and Mongolians believed that fasting was a healthy ritual that could detoxify the body and purify the mind” (Fasting and the Human Mind).

Mongol shamans and priests fasted, no different than in so many other religions, but so did other Mongols — more from Przhevalʹskiĭ’s 1876 account, showing the standard feast-and-fast cycle of many traditional ketogenic diets: “The gluttony of this people exceeds all description. A Mongol will eat more than ten pounds of meat at one sitting, but some have been known to devour an average-sized sheep in twenty-four hours! On a journey, when provisions are economized, a leg of mutton is the ordinary daily ration for one man, and although he can live for days without food, yet, when once he gets it, he will eat enough for seven” (see more quoted material in Diet of Mongolia). Fasting was also noted of earlier Mongols, such as Genghis Khan: “In the spring of 1211, Jenghis Khan summoned his fighting forces […] For three days he fasted, neither eating nor drinking, but holding converse with the gods. On the fourth day the Khakan emerged from his tent and announced to the exultant multitude that Heaven had bestowed on him the boon of victory” (Michael Prawdin, The Mongol Empire, 1967). Even before he became Khan, this was his practice, as was common among the Mongols, such that it became a communal ritual for the warriors:

“When he was still known as Temujin, without tribe and seeking to retake his kidnapped wife, Genghis Khan went to Burkhan Khaldun to pray. He stripped off his weapons, belt, and hat – the symbols of a man’s power and stature – and bowed to the sun, sky, and mountain, first offering thanks for their constancy and for the people and circumstances that sustained his life. Then, he prayed and fasted, contemplating his situation and formulating a strategy. It was only after days in prayer that he descended from the mountain with a clear purpose and plan that would result in his first victory in battle. When he was elected Khan of Khans, he again retreated into the mountains to seek blessing and guidance. Before every campaign against neighboring tribes and kingdoms, he would spend days in Burkhan Khaldun, fasting and praying. By then, the people of his tribe had joined in on his ritual at the foot of the mountain, awaiting his return” (Dr. Hyun Jin Preston Moon, Genghis Khan and His Personal Standard of Leadership).

As an interesting side note, the Mongol population has been studied to some extent in one area of relevance. In Down’s Anomaly (1976), Smith et al. write that, “The initial decrease in the fasting blood sugar was greater than that usually considered normal and the return to fasting blood sugar level was slow. The results suggested increased sensitivity to insulin. Benda reported the initial drop in fasting blood sugar to be normal but the absolute blood sugar level after 2 hours was lower for mongols than for controls.” That is probably the result of a traditional low-carb diet that had been maintained continuously since before recorded history. For some further context, I noticed some discussion about the Mongolian keto diet (Reddit, r/keto, TIL that Ghenghis Khan and his Mongol Army ate a mostly keto based diet, consisting of lots of milk and cheese. The Mongols were specially adapted genetically to digest the lactase in milk and this made them easier to feed.) that was inspired by the scientific documentary “The Evolution of Us” (presently available on Netflix and elsewhere).

As a concluding thought, we may have the Mongols to thank for the modern American hamburger: “Because their cavalry was traveling so much, they would often eat while riding their horses towards their next battle. The Mongol soldiers would soften scraps of meat by placing it under their saddles while they rode. By the time the Mongols had time for a meal, the meat would be “tenderized” and consumed raw. […] By no means did the Mongols have the luxury of eating the kind of burgers we have today, but it was the first recorded time that meat was flattened into a patty-like shape” (Anna’s House, Brunch History: The Shocking Hamburger Origin Story You Never Heard; apparently based on the account of Jean de Joinville, who was born a few years before Genghis Khan’s death). The Mongols introduced it to Russia, in what was called steak tartare (Tartars being one of the ethnic groups in the Mongol army); the Russians introduced it to Germany, where it was most famously called Hamburg steak (because sailors were served it at the ports of Hamburg), from which it was introduced to the United States by way of German immigrants sailing out of Hamburg. Another version of this is Salisbury steak, invented during the American Civil War by Dr. James Henry Salisbury (physician, chemist, and medical researcher) as part of a meat-based, low-carb diet in medically and nutritionally treating certain diseases and ailments.

* * *

3/30/19 – An additional comment: I briefly mentioned sugar, that it causes a serotonin high and activates the hedonic pathway. I also noted that it was late in civilization when sources of sugar were cultivated and, I could add, even later when sugar became cheap enough to be common. Even into the 1800s, sugar was minimal and still often considered more as medicine than food.

To extend this thought, it isn’t only sugar in general but specific forms of it (Yu Hue, Fructose and glucose can regulate mammalian target of rapamycin complex 1 and lipogenic gene expression via distinct pathways). Fructose, in particular, has become widespread because the United States government subsidizes corn agriculture, which has created a greater corn yield than humans can directly consume. So, what doesn’t get fed to animals or turned into ethanol mostly gets made into high fructose corn syrup and then added to almost every processed food and beverage imaginable.

Fructose is not like other sugars. This was important for early hominid survival and so shaped human evolution. It might have played a role in fasting and feasting. In 100 Million Years of Food, Stephen Le writes that, “Many hypotheses regarding the function of uric acid have been proposed. One suggestion is that uric acid helped our primate ancestors store fat, particularly after eating fruit. It’s true that consumption of fructose induces production of uric acid, and uric acid accentuates the fat-accumulating effects of fructose. Our ancestors, when they stumbled on fruiting trees, could gorge until their fat stores were pleasantly plump and then survive for a few weeks until the next bounty of fruit was available” (p. 42).

That makes sense to me, but he goes on to argue against this possible explanation. “The problem with this theory is that it does not explain why only primates have this peculiar trait of triggering fat storage via uric acid. After all, bears, squirrels, and other mammals store fat without using uric acid as a trigger.” This is where Le’s knowledge is lacking, for he never discusses ketosis, which has been centrally important for humans in a way it isn’t for other animals. If uric acid increases fat production, that would be helpful for fattening up before the next starvation period, when the body returned to ketosis. So, there would be a regular switching back and forth between the formation of uric acid that stores fat and the formation of ketones that burns fat.

That is fine and dandy under natural conditions. Excess fructose on a continuous basis, however, is a whole other matter. It has been strongly associated with metabolic syndrome. One pathway of causation is that of increased uric acid production. This can lead to gout (wrongly blamed on meat) but to other things as well. It’s a mixed bag. “While it’s true that higher levels of uric acid have been found to protect against brain damage from Alzheimer’s, Parkinson’s, and multiple sclerosis, high uric acid unfortunately increases the risk of brain stroke and poor brain function” (Le, p. 43).

The potential side effects of uric acid overdose are related to other problems I’ve discussed in relation to the agricultural mind. “A recent study also observed that high uric acid levels are associated with greater excitement-seeking and impulsivity, which the researchers noted may be linked to attention deficit hyperactivity disorder (ADHD)” (Le, p. 43). The problems of sugar go far beyond mere physical disease. It’s one more factor in the drastic transformation of the human mind.

* * *

4/2/19 – More info: There are certain animal fats, the omega-3 fatty acids EPA and DHA, that are essential to human health (Georgia Ede, The Brain Needs Animal Fat). These were abundant in the hunter-gatherer diet. But over the history of agriculture, they have become less common.

This is associated with psychiatric disorders and general neurocognitive problems, including those already mentioned above in the post. Agriculture and industrialization have replaced these healthy lipids with industrially-processed seed oils that are high in linoleic acid (LA), an omega-6 fatty acid. LA interferes with the body’s use of omega-3 fatty acids. Worse still, these seed oils appear not only to alter gene expression (epigenetics) but also to be mutagenic, a possible causal factor behind conditions like autism (Dr. Catherine Shanahan On Dietary Epigenetics and Mutations).

The loss of healthy animal fats in the diet might be directly related to numerous conditions. “Children who lack DHA are more likely to have increased rates of neurological disorders, in particular attention deficit hyperactivity disorder (ADHD), and autism” (Maria Cross, Why babies need animal fat). The trans fats found in industrial seed oils are linked to Alzheimer’s as well (Millie Barnes, Alzheimer’s Risk May be 75% Higher for People Who Eat Trans Fats; Takanori Honda et al, Serum elaidic acid concentration and risk of dementia: The Hisayama study).

“Biggest dietary change in the last 60 years has been avoidance of animal fat. Coincides with a huge uptick in autism incidence. The human brain is 60 percent fat by weight. Much more investigation needed on correspondence between autism and prenatal/child ingestion of dietary fat.”
~ Brad Lemley

The agricultural diet, with its drop in animal foods, brought a loss of access to the high levels and full profile of B vitamins. As with the later industrial seed oils, this had a major impact on genetics:

“The phenomenon wherein specific traits are toggled up and down by variations in gene expression has recently been recognized as a result of the built-in architecture of DNA and dubbed “active adaptive evolution.” 44

“As further evidence of an underlying logic driving the development of these new autism-related mutations, it appears that epigenetic factors activate the hotspot, particularly a kind of epigenetic tagging called methylation. 45 In the absence of adequate B vitamins, specific areas of the gene lose these methylation tags, exposing sections of DNA to the factors that generate new mutations. In other words, factors missing from a parent’s diet trigger the genome to respond in ways that will hopefully enable the offspring to cope with the new nutritional environment. It doesn’t always work out, of course, but that seems to be the intent.”
~Catherine Shanahan, Deep Nutrition, p. 56

And one last piece of evidence on the essential nature of animal fats:

“Maternal intake of fish, a key source of fatty acids, has been investigated in association with child neurodevelopmental outcomes in several studies. […]

“Though speculative at this time, the inverse association seen for those in the highest quartiles of intake of ω-6 fatty acids could be due to biological effects of these fatty acids on brain development. PUFAs have been shown to be important in retinal and brain development in utero (37) and to play roles in signal transduction and gene expression and as components of cell membranes (38, 39). Maternal stores of fatty acids in adipose tissue are utilized by the fetus toward the end of pregnancy and are necessary for the first 2 months of life in a crucial period of development (37). The complex effects of fatty acids on inflammatory markers and immune responses could also mediate an association between PUFA and ASD. Activation of the maternal immune system and maternal immune aberrations have been previously associated with autism (5, 40, 41), and findings suggest that increased interleukin-6 could influence fetal brain development and increase risk of autism and other neuropsychiatric conditions (42–44). Although results for effects of ω-6 intake on interleukin-6 levels are inconsistent (45, 46), maternal immune factors potentially could be affected by PUFA intake (47). […]

“Our results provide preliminary evidence that increased maternal intake of ω-6 fatty acids could reduce risk of offspring ASD and that very low intakes of ω-3 fatty acids and linoleic acid could increase risk.”
~Kristen Lyall et al, Maternal Dietary Fat Intake in Association With Autism Spectrum Disorders

* * *

6/13/19 – About the bicameral mind, I saw some other evidence for it in relationship to fasting. In the following quote, it is described how, after ten days of fasting, ancient humans would experience spirits. One thing that is certain is that one can be fully in ketosis within three days. This would be true even if it wasn’t total fasting, as the caloric restriction would achieve the same end.

The author, Michael Carr, doesn’t think fasting was the cause of the spirit visions, but he doesn’t explain the reason(s) for his doubt. There is a long history of fasting used to achieve this intended outcome. If fasting was ineffective for this purpose, why has nearly every known traditional society for millennia used such methods? These people knew what they were doing.

By the way, imbibing alcohol after the fast would really knock someone into an altered state. The body becomes even more sensitive to alcohol when in a ketogenic state during fasting. Combine this altered state with ritual, setting, cultural expectation, and archaic authorization. I don’t have any doubt that spirit visions could easily be induced.

Reflections on the Dawn of Consciousness
ed. by Marcel Kuijsten
Kindle Location 5699-5718

Chapter 13
The Shi ‘Corpse/ Personator’ Ceremony in Early China
by Michael Carr

““Ritual Fasts and Spirit Visions in the Liji” 37 examined how the “Record of Rites” describes zhai 齋 ‘ritual fasting’ that supposedly resulted in seeing and hearing the dead. This text describes preparations for an ancestral sacrifice that included divination for a suitable day, ablution, contemplation, and a fasting ritual with seven days of sanzhai 散 齋 ‘relaxed fasting; vegetarian diet; abstinence (esp. from sex, meat, or wine)’ followed by three days of zhizhai 致 齋 ‘strict fasting; diet of grains (esp. gruel) and water’.

“Devoted fasting is inside; relaxed fasting is outside. During fast-days, one thinks about their [the ancestor’s] lifestyle, their jokes, their aspirations, their pleasures, and their affections. [After] fasting three days, then one sees those [spirits] for whom one fasted. On the day of the sacrifice, when one enters the temple, apparently one must see them at the spirit-tablet. When one returns to go out the door [after making sacrifices], solemnly one must hear sounds of their appearance. When one goes out the door and listens, emotionally one must hear sounds of their sighing breath. 38

“This context unequivocally uses biyou 必 有 ‘must be/ have; necessarily/ certainly have’ to describe events within the ancestral temple; the faster 必 有 見 “must have sight of, must see” and 必 有 聞 “must have hearing of, must hear” the deceased parent. Did 10 days of ritual fasting and mournful meditation necessarily cause visions or hallucinations? Perhaps the explanation is extreme or total fasting, except that several Liji passages specifically warn against any excessive fasts that could harm the faster’s health or sense perceptions. 39 Perhaps the explanation is inebriation from drinking sacrificial jiu 酒 ‘(millet) wine; alcohol’ after a 10-day fast. Based on measurements of bronze vessels and another Liji passage describing a shi personator drinking nine cups of wine, 40 York University professor of religious studies Jordan Paper calculates an alcohol equivalence of “between 5 and 8 bar shots of eighty-proof liquor.” 41 On the other hand, perhaps the best explanation is the bicameral hypothesis, which provides a far wider-reaching rationale for Chinese ritual hallucinations and personation of the dead.”
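As a rough sanity check on Paper’s figure (a back-of-envelope sketch in which the cup volume and wine strength are assumed values, since the bronze-vessel measurements aren’t reproduced here), the numbers do line up:

```latex
% Assumed (hypothetical) values: one ritual cup ~ 100 ml; millet wine ~ 10-16% ABV;
% one bar shot = 44 ml at 40% ABV (eighty proof), i.e. ~ 17.6 ml ethanol.
\[ 9 \times 100\,\text{ml} \times (0.10\text{--}0.16) \approx 90\text{--}144\,\text{ml ethanol} \]
\[ 90 / 17.6 \approx 5 \quad\text{and}\quad 144 / 17.6 \approx 8 \ \text{shots} \]
```

On those assumptions, nine cups of millet wine after a ten-day fast would indeed amount to roughly five to eight shots of eighty-proof liquor, more than enough for an altered state in a fasting body.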

* * *

7/16/19 – One common explanation for autism is the extreme male brain theory. A recent study may have come up with supporting evidence (Christian Jarrett, Autistic boys and girls found to have “hypermasculinised” faces – supporting the Extreme Male Brain theory). Autistics, including females, tend to have hypermasculinised faces. This might be caused by greater exposure to testosterone in the womb.

This immediately made me wonder how this relates to diet. Changes in diet alter hormonal functioning. Endocrinology, the study of hormones, has been a major part of the diet debate going back to European researchers from earlier last century (as discussed by Gary Taubes). Diet affects hormones and hormones in turn affect diet. But I had something more specific in mind.

What about propionate and glutamate? What might their relationship be to testosterone? In a brief search, I couldn’t find anything about propionate. But I did find some studies related to glutamate. There is an impact on the endocrine system, although these studies weren’t looking at the results in terms of autism specifically or neurocognitive development in general. They point to some possibilities, though.

One could extrapolate from one of these studies that increased glutamate in the pregnant mother’s diet could alter what testosterone does to the developing fetus, in that testosterone increases the toxicity of glutamate, which might not be a problem under normal conditions of lower glutamate levels. This would be further exacerbated during breastfeeding and later on when the child began eating the same glutamate-rich diet as the mother.

Testosterone increases neurotoxicity of glutamate in vitro and ischemia-reperfusion injury in an animal model
by Shao-Hua Yang et al

Effect of Monosodium Glutamate on Some Endocrine Functions
by Yonetani Shinobu and Matsuzawa Yoshimasa

* * *

11/28/21 – Here is some discussion of vitamin B1 (thiamin/thiamine). It couldn’t easily fit into the above post without revising and rewriting some of it. And it could’ve been made into a separate post by itself. But, for the moment, we’ll look at some of the info here, as relevant to the above survey and analysis. This section will be used as a holding place for some developing thoughts, although we’ll try to avoid getting off-topic in a post that is already too long. Nonetheless, we are going to have to trudge a bit into the weeds so as to see the requisite details more clearly.

Related to autism, consider this highly speculative hypothesis: “Thiamine deficiency is what made civilization. Grains deplete it, changing the gut flora to make more nervous and hyperfocused (mildly autistic) humans who are afraid to stand out. Conformity. Specialization in the division of labor” (JJ, Is Thiamine Deficiency Destroying Your Digestive Health? Why B1 Is ESSENTIAL For Gut Function, EONutrition). Thiamine deficiency is also associated with delirium and psychosis, such as schizophrenia (the relevant scientific papers are too numerous to list). By the way, psychosis, along with mania, has an established psychological and neurocognitive overlap with measures of modern conservatism; in opposition to the liberal link to mood disorders, addiction, and alcoholism (Uncomfortable Questions About Ideology; & Radical Moderates, Depressive Realism, & Visionary Pessimism). This is part of some brewing thoughts that won’t be further pursued here.

The point is simply to emphasize the argument that modern ideologies, as embodied worldviews and social identities, may partly originate in or be shaped by dietary and nutritional factors, among much else in modern environments and lifestyles. Nothing even comparable to conservatism and liberalism existed as such prior to the expansion and improvement of agriculture during the Axial Age (farm fields were made more uniform and well-managed, and hence with higher yields; e.g., systematic weeding became common as opposed to letting fields grow in a semi-wild state); and over time there were also innovations in food processing (e.g., removing hulls from grains made them last longer in storage while having the unintended side effect of also removing a major source of the vitamin B1 needed to metabolize carbs).

In the original writing of this post, one focus was on addiction. Grains and dairy were noted as sources of exorphins and dopaminergic peptides, as well as propionate and glutamate. As already discussed, this goes a long way toward explaining the addictive quality of these foods and their relationship to the repetitive behavior of obsessive-compulsive disorder. This is seen in many psychiatric illnesses and neurocognitive conditions, including autism (Derrick Lonsdale et al, Dysautonomia in Autism Spectrum Disorder: Case Reports of a Family with Review of the Literature):

“It has been hypothesized that autism is due to mitochondrial dysfunction [49], supported more recently [50]. Abnormal thiamine homeostasis has been reported in a number of neurological diseases and is thought to be part of their etiology [51]. Blaylock [52] has pointed out that glutamate and aspartate excitotoxicity is more relevant when there is neuron energy failure. Brain damage from this source might be expected in the very young child and the elderly when there is abnormal thiamine homeostasis. In thiamine-deficient neuroblastoma cells, oxygen consumption decreases, mitochondria are uncoupled, and glutamate, formed from glutamine, is no longer oxidized and accumulates [53]. Glutamate and aspartate are required for normal metabolism, so an excess or deficiency are both abnormal. Plaitakis and associates [54] studied the high-affinity uptake systems of aspartate/glutamate and taurine in synaptosomal preparations isolated from brains of thiamine-deficient rats. They concluded that thiamine deficiency could impair cerebellar function by inducing an imbalance in its neurotransmitter systems.”

We’ve previously spoken of glutamate, a key neurotransmitter; but let’s summarize it while adding in new info. Among those on the autistic spectrum, there is commonly a glutamate excess. This is caused by eating a lot of processed foods that use glutamate as an additive (e.g., MSG). And there is the contributing factor of many autistics being drawn to foods naturally high in glutamate, specifically dairy and wheat. A high-carb diet also promotes the body’s own production of glutamate, with carb-related inflammation spiking glutamate levels in the brain; and it downregulates the levels of the inhibitory neurotransmitter GABA that balances glutamate. GABA is important for sleep and much else.

Keep in mind that thiamine is required in the production of numerous other neurotransmitters and in the balanced interaction between them. Another B vitamin, B12 (cobalamin), plays a similar role; and its deficiency is not uncommonly seen there as well. The B vitamins, by the way, are particularly concentrated in animal foods, as are other key nutrients. Think about choline, the precursor of acetylcholine, which promotes sensory habituation, perceptual regulation, attentional focus, executive function, and selective responsiveness while supporting mental flexibility (thiamine is also needed in making acetylcholine, and notably choline has some similarities to the B vitamins); while similarly the amino acid L-tyrosine further promotes mental flexibility — the two form a balance of neurocognitive functioning, both of which can be impaired in diverse psychiatric diseases, neurological conditions, speech/language issues, learning disabilities, etc.

There is way too much scientific evidence to be cited and surveyed here, but let’s briefly focus in on some examples involving choline, such an easily found nutrient in eggs, meat, liver, and seafood. Studies indicate choline helps prevent mental health issues like schizophrenia and ADHD, which involve sensory inhibition and attention problems that can contribute to social withdrawal (Bret Stetka, Can Mental Illness Be Prevented In The Womb?). Autism spectrum disorders and mood disorders, being linked to choline deficiency, likewise exhibit social withdrawal. In autism, the sensory inhibition challenge is experienced as sensory overload and hyper-sensitivity (Anuradha Varanasi, Hypersensitivity Might Be Linked To A Transporter Protein Deficiency In The Brain: Study).

A main effect of choline is “habituation, which is widely regarded as a prerequisite for more complex forms of associative learning” (Subhasree Nag, Choline transporter in fruit fly brain tunes out unnecessary information). Choline is particularly central in early development, but later supplementation can reverse maldevelopment to some degree, depending on how early the intervention happens. In the context of autism, choline levels are typically low (S. Jill James et al, Dietary Choline Intake by Children with Autism Is below the Recommended Dietary Reference Intake (DRI) Established by the IOM). Interestingly, there is a young girl, raised vegetarian, who has autistic-like behaviors and who, early in life, would eat only egg whites, never the choline-rich egg yolks. She has been diagnosed with social communication disorder, a condition similar to autism. As for autistic language difficulties, choline supplementation does show improvement (Lidia V. Gabis et al, Improvement of Language in Children with Autism with Combined Donepezil and Choline Treatment).

Mental flexibility, specifically, seems less relevant to modern society; or rather, maybe its suppression has made possible the rise of modern society, as hyper-specialization has become central for most modern work that is narrowly focused and repetitive. Yet one might note that modern liberalism strongly correlates with mental flexibility; e.g., Ernest Hartmann’s fluid and thin boundaries of mind, the Big Five trait of openness to experience, and Myers-Briggs intuition and perceiving — by the way, a liberal arts education is defined by its not being specialized, and that is precisely what makes it ‘liberal’ (i.e., generous, expansive, inclusive, diverse, tolerant, multiperspectival, etc).

Maybe this also relates to how modern liberalism, as an explicit socio-ideological identity, has typically been tied into the greater wealth of the middle-to-upper classes and hence involving greater access to nutritious foods and costly supplements, not to mention high quality healthcare that tests for nutritional deficiencies and treats them early on; along with higher status, more privileges, and less stress within the high inequality hierarchy of the American caste system. There is a significant amount of truth to the allegation about a ‘liberal elite’, which in some ways applies to the relatively more liberal-minded conservative elites as well. It would be interesting to know if malnutrition or specific nutritional deficiencies increase social conservatism, similar to studies that have shown a link between parasite load and authoritarianism (in this blog, it’s been pointed out that all authoritarianism is socially conservative, not only the likes of Nazis but also Soviets, Maoists, and others; all of which targeted social liberals and those under the protection of socially liberal society).

Many other factors can disturb this delicate system. To return to glutamate: it is one of three precursors in producing the endogenous antioxidant glutathione. A major limit to this process is glycine, which primarily comes from the connective tissue of animal foods (tough meats, gristle, bone broths, etc). Without sufficient glycine, glutamate won’t get used up and so will accumulate. Plus, glycine directly interacts with the glutaminergic neurotransmission system and so is needed for the healthy functioning of glutamate. Further complicating matters, mercury toxicity can over-excite the glutamate pathway. Then, as already described, the modern diet dumps even more glutamate on the fire. It’s a whole freaking mess, the complex and overlapping conditions of modernity. Altering any single factor would throw a wrench into the works, but what we’re talking about is nearly every major factor, along with many minor factors, all being tossed up in the air.

The standard American diet is high in refined carbs while low in certain animal-based nutrients that were more typical of a traditional nose-to-tail diet. As for the first part, refined carbs are low in vitamin B1 (thiamin/thiamine), though governments have required fortification with such key nutrients. The problem is that thiamine is required for the metabolism of carbs: the more carbs one eats, the more thiamine is needed. Carb intake has risen so vastly that, as some argue, the levels of fortification aren’t enough. To make matters worse, because thiamine deficiency disrupts carb metabolism, there is an increasing craving for carbs as the body struggles to get the fuel it needs. Then, as those cravings lead to continued overeating of carbs, the thiamine deficiency gets worse, which makes the carb cravings even stronger. It becomes a lifelong addiction, in some cases involving alcoholism as liquid carbs (the body treats alcohol much the same as sugar).
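To make that circular dynamic explicit, here is a toy difference-equation sketch of the feedback loop just described (purely illustrative; the parameters are made up and carry no physiological meaning):

```python
# Toy model of the carb-thiamine feedback loop described above.
# All units and parameters are arbitrary, chosen only to exhibit the spiral:
# more carbs -> more thiamine consumed -> lower thiamine -> stronger cravings -> more carbs.

def simulate(days=30, carbs=1.0, thiamine=1.0):
    for day in range(1, days + 1):
        demand = 0.2 * carbs                 # thiamine used up metabolizing carbs
        intake = 0.15                        # fixed thiamine supply (diet + fortification)
        thiamine = max(0.0, thiamine + intake - demand)
        craving = 1.0 / (thiamine + 0.5)     # cravings intensify as thiamine drops
        carbs = 0.9 * carbs + 0.2 * craving  # cravings push carb intake upward
        print(f"day {day:2d}: carbs={carbs:.2f}, thiamine={thiamine:.2f}")

simulate()
```

Run it and carb intake ratchets upward while thiamine status drains toward zero: once the fixed supply can no longer cover the demand created by the carbs themselves, the loop locks in, which is the worry about fortification levels lagging behind historically unprecedented carb consumption.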

The only alternative fuel for the body is fat. Here we get to another wrinkle. A high-carb diet also causes insulin resistance. The hormone insulin, like thiamine, is also needed in energy metabolism. This often leads to obesity, where excess calories get stored as fat but, without insulin sensitivity, the body can’t easily access that stored energy. This is why fat people are constantly hungry despite having immense stored energy: their bodies can’t fully use that stored energy, and neither can their bodies fully use the carbs they’re eating. Thiamine deficiency combined with insulin resistance is a spiral of metabolic dysfunction. This is why some experts in this field worry that thiamine insufficiency might be greater than acknowledged and might not show up on standard tests, since what is not being considered is the higher demand for thiamine that comes with a higher intake of carbs than has ever existed before. To further obscure this health crisis, it is irrelevant how much thiamine a test shows in one’s bloodstream if one lacks the cofactors (e.g., magnesium) that help the body process thiamine and transport it into cells.

Insulin resistance, along with the rest of metabolic syndrome, has many neurological consequences. Numerous neurocognitive conditions are directly linked to it and often involve thiamine deficiency — besides autism: mood disorders, obsessive-compulsive disorder, schizophrenia, etc. For example, consider Alzheimer’s, which some are now referring to as type III diabetes because there is insulin resistance in the brain; and the brain requires glucose, which in turn requires insulin and insulin sensitivity. All cells need energy, and this goes to the centrality of the mitochondria, the powerhouses of cellular energy (each cell can have thousands of mitochondria). Besides autoimmune conditions like multiple sclerosis, mitochondrial dysfunction might also be involved in conditions like autism. That relates back to thiamine deficiency causing energy deficiency, which in turn affects the role of glutamate.

It’s a morass of intertwining mechanisms, pathways, and systems that are hard for a layman to comprehend. But it is serious stuff on so many levels, for individuals and society. For a moment, let’s step back and look again at the big picture. In The Crisis of Identity, public health was explained as a moral panic and existential crisis. One aspect that wasn’t explored in that post is cancer, but we did briefly note that, “in the mid-1800s, Stanislas Tanchou did a statistical analysis that correlated the rate of grain consumption with the rate of cancer; and he observed that cancer, like insanity, spread along with civilization.” We only bring this up now because we’ve been reading Sam Apple’s book Ravenous, which is about the Nazi obsession with cancer, an obsession carrying the same mass hysteria as was going on elsewhere in the Western world with neurasthenia and tuberculosis; and, being Nazis, they brought antisemitism into it everywhere they could.

Cancer, though, can help us understand an aspect of thiamine deficiency and insufficiency. It also has to do with neurological and mental health. In interfering with carb metabolism, insufficient thiamine also interferes with mitochondrial oxidation, and so cells turn to fermenting glucose for energy. This is what happens in cancer cells, as the part-Jewish scientist Otto Warburg, who worked in Nazi Germany, thought so important. In general, mitochondrial dysfunction results and energy production goes down. Also, the mitochondria are closely related to immune functioning, and so autoimmune disorders can follow: multiple sclerosis, Hashimoto’s, rheumatoid arthritis, etc. Along with causing gut issues and a diversity of other symptoms, this is why thiamine deficiency is known as a disease mimic, in so often getting misdiagnosed as something else.

That is a problem with something like psychiatric categories and labels, as they are simply groupings of symptoms; but then again, that is true for most conventional healthcare. We need to discern the underlying cause(s). To demonstrate this, we’ll now move on to the limbic system, part of the evolutionarily older brain, having to do with emotional processing and control of the autonomic nervous system. Thiamine deficiency has a strong impact on limbic cells, similar to an oxygen deficiency, because of the aforementioned altered energy metabolism of the mitochondria, which depend on oxygen in the production of ATP (the main fuel used by most cells). There is not only a loss of energy but eventually mitochondrial death and hence cell death, also from decreased glucose utilization in cells; or, in some cases, something worse when cells refuse to die (i.e., cancer), turning to glucose fermentation in a way that allows those cells to proliferate. In either case, the involvement of carbs and glucose becomes dramatically changed and imbalanced.

This points to how the same fundamental issues deep within our physiology can become expressed in numerous ways, such as the link between cancer and metabolic syndrome (particularly obesity). But, in terms of subjective experience, we can’t realize most of this is going on, and even doctors often aren’t able to detect it with the crude tools at hand. Yet the individual might experience the consequences of what can’t be seen. If thiamine deficiency causes brain damage in the limbic system and elsewhere, the results can be depression, anxiety, irritability, fatigue, bipolar disorder, emotional instability, moodiness, confusion, schizophrenia, cognitive decline, learning difficulties, inability to form memories, loss of memory recall, confabulation (making up stories), etc; with the worst symptoms corresponding to Wernicke-Korsakoff syndrome, which, left untreated, can ultimately (and very rapidly) turn fatal. Now multiply that across an entire society, and no wonder the reactionary mind has taken hold and created such a powerful psychological undertow, not only for conservatives but for everyone.

* * *

6/2/22 – Let’s make yet another subsection to throw in some other info. This is an extension of what has already been said on the growing number of factors involved in autism spectrum disorder, a disorder that often overlaps with numerous other physical and cognitive conditions. There are so many proven and potential factors (correlated, contributing, and causal) that it can give one a headache trying to piece it all together and figure out what it means. Writing about it here is nearly headache-inducing, and so empathy goes out to any readers trying to work their way through this material. Such diverse and wide-ranging evidence might imply that so-called autism spectrum disorder is not really a single disorder but a blanket label covering up mass complexity and confusion. Okay. Take a deep breath.

An interesting substance is carnitine, which is needed for energy production in helping transport fatty acids into the mitochondria. Low carnitine levels are prevalent in certain neurocognitive conditions, from depression to autism. “Some tenuous links between carnitine and autism already exist. Defects in the mitochondria, which have previously been linked to autism, can sometimes lead to carnitine deficiency. And treating children with autism with valproic acid, an anti-seizure medicine that can lower carnitine levels, can have serious side effects” (Emily Singer, Defects in carnitine metabolism may underlie autism). It’s one of the many nutrients that are mostly found in, or entirely exclusive to, animal foods, and so it has much to do with the agricultural diet and even more so with modern industrial food production. For such an easily obtained substance, a significant number of Westerners are not getting enough of it. But all they’d need to do to obtain it is eat some red meat, which is precisely the main food that health experts and public officials have been telling Americans to avoid.

Beef consumption is almost half of what it was at the beginning of the 19th century and has leveled out since then, whereas low-carnitine meats such as chicken and fish have increasingly replaced beef. About the agricultural angle, it might be noted that grain-fed animals have lower amounts of diverse nutrients (carnitine, choline, CoQ10, zinc, carotenoids, vitamin A3, E vitamins, omega-3s, etc) as compared to pasture-raised and wild-caught animals, except for certain nutrients that are typically added to animal feed — and this might partly explain why the agricultural revolution led to increased stunting and sickliness, many thousands of years before the modern industrialized diet of hyper-processed foods produced from industrial agriculture. So, it’s not only that modern Americans are eating less red meat but that they are replacing such nutrient density with lower-quality animal foods from factory farming; while overall meat consumption has dropped since the 19th century, with animal fat intake having drastically declined after being mostly replaced with industrial seed oils by the 1930s. It’s safe to say that the average American is consuming approximately zero fatty ruminant meat or any other animal foods from pasture-raised or wild-caught animals. Yet the intake of vegetables, fruits, nuts, seeds, and seed oils is greater than in past centuries.

To refocus, the human body has some capacity to produce carnitine de novo, but it’s limited and far from optimal. Autistics, in particular, can have carnitine-related genetic defects, such as a deletion in the gene trimethyllysine hydroxylase epsilon (TMLHE), a genetic defect that is mostly found in families with multiple autistic boys. Also, as expected, vegans and vegetarians measure as having low plasma levels of this key nutrient. Such deficiencies are potentially a worse problem for certain modern populations but were less so in the past because “genetic deficiencies in carnitine synthesis were tolerated in the European population because their effects were nutritionally complemented by a carnitine-rich diet. In this manner, the selection pressures that would have otherwise eliminated such mutations from the population were effectively removed” (Vytas A. Bankaitis & Zhigang Xie, The neural stem cell/carnitine malnutrition hypothesis: new prospects for effective reduction of autism risk?). As for the present, the authors “estimate that some 20%–30% of pregnant women in the United States might be exposing the developing fetus to a suboptimal carnitine environment.”

Carnitine underpins many physiological factors and functions involving embryonic neural stem cells, long-chain fatty acids, mitochondrial function, ATP production, oxidative stress, inflammation, epigenetic regulation of gene expression, etc. As mediated by epigenetic control, carnitine promotes “the switch from solitary to gregarious social behavior” in other species and likely in humans as well (Rui Wu et al, Metabolomic analysis reveals that carnitines are key regulatory metabolites in phase transition of the locusts). Certainly, as Bankaitis and Xie explain, carnitine is directly correlated to language/speech delay, language weakness, or speech deficits, along with stunted motor development and common autistic behaviors, which are causally linked by way of long-chain fatty acid (LCFA) β-oxidation deficits, medium-chain FAO deficits, etc. To emphasize this point, overlapping with the same deficiencies (carnitine, B vitamins, fat-soluble vitamins, choline, etc) and excesses (glutamate, propionate, etc) as found in autism, there are many other speech and language conditions: dyslexia, specific language impairment (SLI), developmental language disorder (DLD), etc; along with ADHD, learning disabilities, and much else (about all of this, approximately a million studies have been done and another million articles written) — these might not always be entirely distinct categories but imperfect labels for capturing a swarm of underlying issues, as has been suggested by some experts in the field.

Toxins worsen these problems: “Exposure of a pregnant woman to high levels of heavy metals in drinking water or otherwise also carries the risk of impairing de novo carnitine biosynthesis.” In the main text of this post, there was much exploration of glutamate (e.g., MSG) as a neurotoxin. On a related note, acetyl-L-carnitine (ALCAR or LAC) “supplements ameliorate depressive symptoms in mice by reversing brain-cell impairment caused by an excess of glutamate” (Bruce S. McEwen, Lack of a single molecule may indicate severe and treatment-resistant depression; see: Carla Nasca et al, Acetyl-L-carnitine deficiency in patients with major depressive disorder). A similar protective role is found with other “compounds containing a trimethylamine group (carbachol, betaine, etc.)” (Marta Llansola et al, Prevention of ammonia and glutamate neurotoxicity by carnitine: molecular mechanisms). Furthermore, “L-carnitine can protect from Hepatotoxic, neurotoxic, renal impairment and genotoxic effects functionally, biochemically and histopathologically with a corresponding reduction of oxidative stress” (Krishna Murthy Meesala & Pratima Khandayataray, Monosodium Glutamate Toxicity and the Possible Protective Role of L–Carnitine). It’s fascinating that one set of toxins, heavy metals, would interfere with carnitine levels when carnitine is needed to deal with other toxins, glutamate and ammonia.

Bankaitis and Xie then conclude: “Finally, we are struck by the fact that two developments dominating public interest in contemporary news cycles detail the seemingly unrelated topics of the alarming rise of autism in young children and the damaging human health and planetary-scale environmental costs associated with cattle farming and consumption of red meat (86). The meteoric rise of companies promoting adoption of meatless mimetics of beef and chicken at major fast food outlets testifies to the rapidly growing societal appetite for reducing meat consumption. This philosophy is even rising to the level of circulation of scientific petitions exhorting world governments to unite in adopting global measures to restrict meat consumption (87). We now pose the question whether such emerging societal attitudes regarding nutrition and its environmental impact are on collision course with increased ASD risk. Food for thought, indeed.” It’s been shown that mothers of autistic children ate less meat before conception, during pregnancy, or during the lactation period; and had lower levels of calcium (Ya-Min Li, Maternal dietary patterns, supplements intake and autism spectrum disorders). Sure, we could supplement carnitine and every other nutrient concentrated in meat. That certainly would help bring the autism rate back down again (David A. Geier et al, A prospective double-blind, randomized clinical trial of levocarnitine to treat autism spectrum disorders). But maybe, instead, we should simply emphasize a healthy diet of nutrient-dense animal foods, particularly as whole foods.

Among some of the nutrients already covered above in the main text, studies indicate the importance of vitamin B1 (thiamin/thiamine) in the rates of autism. It turns out it’s not the only B vitamin implicated. There has also been research done on vitamin B9 (folate/folic acid) and vitamin B12 (cobalamin). Deficiencies of these, according to numerous studies, are strongly associated or causally linked with autism spectrum disorders, although the research is mixed, with other studies not supporting this. Plus, still other studies indicate that excesses might also be correlated with autism, if this remains an area of much contentious debate. The related folinic acid, a reduced form of folic acid, improves autistic outcomes in infants (Vincent Th. Ramaekers et al, Improving Outcome in Infantile Autism with Folate Receptor Autoimmunity and Nutritional Derangements: A Self-Controlled Trial) and has demonstrated linguistic benefits in autism (DNA India, Study suggests a derivative of vitamin B9 improves language skills in autistic children; Richard E. Frye, Folinic acid improves verbal communication in children with autism and language impairment: a randomized double-blind placebo-controlled trial).

It might be about finding the right form in the right amount, maybe in the needed ratio with other nutrients — our partial knowledge and vast ignorance being the eternal problem (Hubris of Nutritionism); whereas animal foods, particularly pasture-raised and wild-caught, have all of the nutrients we need in the forms, amounts, and ratios we need them in. As clever monkeys, we’ve spent the past century failing in our endeavor to industrially and medically re-create the wheel that Mother Nature invented through evolution. To put this in the context of everything analyzed here in this unwieldy piece: if most modern people weren’t following a nutritionally-deficient agricultural diet largely consisting of industrially hyper-processed and fortified plant foods, nearly all of the scientific disagreement and debate would be irrelevant. We’ve painted ourselves into a corner. The fact of the matter is that we are a sickly people, and much of that is caused by diet; nor is it limited to micronutrients, as the macronutrients play a particular role in metabolic health or the lack thereof, which in turn is another contributing factor to autism (Alison Jean Thomas, Is a Risk of Autism Related to Nutrition During Pregnancy?). And metabolic dysfunction and disease have much to do with addictive and/or harmful overconsumption of agricultural foods like grains, potatoes, sugar cane, high fructose corn syrup, seed oils, etc.

For vitamin B9, some speculate that increased risk of autism might have to do with methylation defects caused by mutations in the MTHFR gene (A1298C and C667T), or possibly something mimicking this phenomenon in those without the mutations (Karen E Christensen, High folic acid consumption leads to pseudo-MTHFR deficiency, altered lipid metabolism, and liver injury in mice). This relates to a reason behind recommendations for methylated forms of B vitamins, which are a good source of the methyl groups required for various physiological functions. For example, in demonstrating how one thing leads to another: “The methyl group from methyl folate is given to SAMe, whose job it is to deliver methyl to 200 essential pathways in the body. […] After receiving methyl donors, SAMe delivers methyl to 200 pathways in the body including ones needed to make carnitine, creatine and phosphotidylcholine. Carnitine supplementation improves delivery of omega 3 & 6 fatty acids needed to support language, social and cognitive development. Phosphatidylcholine is important in cell membrane health and repair. […] Repair of the cell membrane is an important part of improving sensory issues and motor planning issues in children with autism, ADHD and sensory integration disorder. Dimethylglycine (DMG) and trimethylglycine (TMG) donate methyl groups to the methylation cycle. TMG is needed to recycle homocysteine and help produce SAMe” (Treat Autism, Autism and Methylation – Are you helping to repair your child’s methylation cycle?).

Others dismiss these skeptical concerns and alternative theories as pseudo-scientific fear-mongering. The debate began with a preliminary study done in 2016; and, in the following year, a published review concurred that, “Based on the evidence evaluated, we conclude that caution regarding over supplementing is warranted” (Darrell Wiens & M. Catherine DeSoto, Is High Folic Acid Intake a Risk Factor for Autism?—A Review). There are other issues besides that. There has been a quarter century of mass supplementation of folate with fortified foods, but apparently no safety studies or analyses were ever done for the general population. On top of that, phthalate exposure from plastic contamination in water and such disrupts genetic signals for the processing of folate (Living On Earth, Plastics Linked to Rising Rates of Autism). But supplementation of folic acid might compensate for this (Nancy Lemieux, Study reports link between phthalates and autism, with protective effects of folic acid). Plastic breaks down into microplastics that can accumulate in the biological tissues humans consume, though it’s unclear whether the same is true of plants and how much phthalates can accumulate up the food chain. So, it’s not clear how this may or may not be a problem specifically within present agriculture, but one suspects it might be an issue. Certainly, the majority of water in the world is now contaminated with microplastics and much else; and that water is used for livestock and agricultural goods. It’s hard to imagine how such things couldn’t be getting into everything, or what it might mean for changes in the human body-mind, as compounded by all the rest (e.g., how various substances interact within the body). About pesticides in the water or from other sources, one might note that folic acid may have a protective effect against autism (Arkansas Folic Acid Coalition, Folic Acid May Reduce Autism Risk from Pesticides).

Whatever it all means, it’s obvious that the B vitamins are among the many super important nutrients mostly found in animal foods and concentrated in the highest amounts in the highest-quality sources, from animals raised on pasture or in the wild. Much of the B vitamin debate about autism risk is too complex and murky to further analyze here, not to mention too mixed up with confounders and the replication crisis; with one potential confounder being the birth order effect or stoppage effect (Gideon Koren, High-Dose Gestational Folic Acid and the Risk for Autism? The Birth Order Effect). As one person noted, “If the literature is correct, and folic acid really causes a 42% reduction in autism, we should see a sharp decrease in autism diagnosis for births starting in 1997. Instead, autism rates continued to increase at exactly the same rate they had before. There is nothing in the data to suggest even a small drop in autism around the time of folic acid fortification” (Chris Said, Autism, folic acid, and the trend without a blip). And elsewhere it was recently stated that, “The overall evidence for all these claims remains inconclusive. While some meta-analyses have found a convincing pattern, a comprehensive 2021 Nutrients review failed to find a ‘robust’ statistical association — a more definitive outcome in the field of epidemiology” (Molly Glick, A Popular Supplement’s Confusing Links With Autism Development). That same assessment is repeated by others: “Studies have pointed out a potential beneficial effect of prenatal folic acid maternal supplementation (600 µg) on the risk of autism spectrum disorder onset, but opposite results have been reported as well” (Bianka Hoxha et al, Folic Acid and Autism: A Systematic Review of the Current State of Knowledge). It doesn’t add up, but we won’t attempt to solve that mystery.
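To make Said’s point concrete, here is a minimal sketch in Python (our own illustration with made-up numbers, not real surveillance data) of what the missing “blip” would look like: if the 42% protective claim were true, a steady upward trend in diagnoses should show a visible step down for births from 1997 onward.

```python
# A toy model of the "trend without a blip" argument. All numbers here are
# hypothetical assumptions for illustration, not real autism statistics.

birth_years = range(1990, 2006)
baseline_rate = 30.0   # assumed diagnoses per 10,000 births in 1990
annual_growth = 1.10   # assumed 10% year-over-year increase in diagnoses

for year in birth_years:
    trend = baseline_rate * annual_growth ** (year - 1990)
    # Counterfactual: the same trend, but with a 42% risk reduction once
    # folic acid fortification covers births from 1997 onward.
    expected_if_protective = trend * (0.58 if year >= 1997 else 1.0)
    print(f"{year}: trend {trend:5.1f}   with 42% step down {expected_if_protective:5.1f}")
```

If fortification were strongly protective, the right-hand column is roughly what the data should have done; Said’s observation is that the actual series looks like the left-hand column, continuing upward with no step at all.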

To further muck up the works, it’s amusing that some suggest a distinction be made: “The signs and symptoms of pediatric B12 deficiency frequently mimic those of autism spectrum disorders. Both autistic and brain-injured B12-deficient children have obsessive-compulsive behaviors and difficulty with speech, language, writing, and comprehension. B12 deficiency can also cause aloofness and withdrawal. Sadly, very few children presenting with autistic symptoms receive adequate testing for B12 deficiency” (Sally M. Pacholok, Pediatric Vitamin B12 Deficiency: When Autism Isn’t Autism). Not alone in that claim, someone else said, “A vitamin B12 deficiency can cause symptoms and behaviours that sometimes get wrongly diagnosed as autism.” That second person’s motivation was to deny the culpability of veganism: “Vegans and vegetarians often struggle to get sufficient levels of B12 in their diets. Therefore the children of pregnant vegans may be more likely to have B12 deficiency.” But also that, “Early research shows that many genuinely autistic people have excessive levels of B12 in their systems. […] Vegans are more likely to take supplements to boost the vitamins they lack in their diet, including B12.” A deficiency in early life and a compensatory excess in later life could both be tied into vegan malnourishment — maybe or maybe not. Apparently, however explained or else rationalized away, just because something looks like a duck, walks like a duck, and quacks like a duck doesn’t necessarily mean it’s actually a duck. But has the autistic label ever been anything other than a constellation of factors, symptoms, behaviors, and traits? It’s like asking whether ‘depression’ variously caused by stress, overwork, sleep deprivation, trauma, nutritional deficiency, toxicity, parasitism, or physical disease really is all the same mental illness. Admittedly, that is a useful line of thinking, from the perspective of functional medicine that looks for underlying causes and not mere diagnoses for the sake of insurance companies, bureaucratic paperwork, and pharmaceutical prescriptions.

Anyway, let’s just drop a load of links for anyone interested in exploring it for themselves:

* * *

The Helmsman and the Lookout

There is an apt metaphor for the relationship between what we think of as conscious willpower and the openness of perception.

The egoic consciousness is the helmsman of the boat as it heads along the river of experience, but he is positioned at the back of a boat crowded with passengers. While he controls the steering, he is driving blind and can’t see what is coming. He primarily operates on memory and mental maps, habit and heuristics. He knows the river, or else similar rivers, at least most of the time, as long as he remains within the familiar. Still, his predictive abilities are limited and hence so are his steering abilities.

This is why a lookout is needed at the front of the boat. The lookout, although having no direct control, can give warnings. Stop! Don’t go that direction! The lookout has the information the helmsman needs, but the helmsman only listens to the lookout when something is wrong. The lookout is the veto power of volition, what is called free-won’t rather than freewill.

I came across this metaphor in a Chacruna article by Martin Fortier, Are Psychedelic Hallucinations Actually Metaphorical Perceptions?:

“Recent neuroscientific models of the brain stress the importance of prediction within perceptual experience. The tenets of the predictive model of the brain can be described with a useful analogy: that of helmsmen steering collective boats on the rivers of lowland South America.

“In the Amazon, to go from one riparian town to another, people usually take a collective boat. Most boats carry between 20 to 60 passengers. These boats are steered in an intriguing way. The helmsman is positioned at the rear part of the boat. Because of this, he cannot see much of the river; what he sees in front of him are mostly the backs of passengers. Yet, the helmsman critically needs to know in minute detail where he is going, as the river is replete with shallows and floating tree trunks that must be avoided by any means. The usual way to make sure that the helmsman is able to steer the boat safely is to position a lookout at the front part of the boat and to have him warn the helmsman in case anything dangerous shows up ahead.

“The human perceptual system roughly works like these collective boats! “Predictive models” of perception strongly contrast with “constructive models,” developed in the 1970s. According to constructive models of visual perception, the retina collects very gross and sparse information about the world, and each level of the visual system elaborates on this limited primary information and makes it gradually richer and more complex.

“Let us say that the lookout stands for primary perceptual areas—low-level areas of the brain—and the helmsman stands for more frontal areas; the high-level areas of the brain. Furthermore, the trajectory of the boat stands for conscious perception. In the case of classical constructive models of the brain, perception is taken to be a gradual enrichment of information coming from lower areas of the brain. So, to use the boat analogy, constructive models of perception have it that the trajectory of the boat—i.e., conscious perception—is determined by the lookout sending warning signals to the helmsman—i.e., by bottom-up processes.

“Predictive models conceive of perception in a very different way. The first step of determining the trajectory of the boat is the helmsman guessing, on the basis of his past experience, where the boat can safely go. So, within the predictive model, the lookout plays no constitutive role. The lookout influences the trajectory of the boat only when the helmsman’s predictions are proved wrong, and when the lookout needs to warn him.

“Two niceties must be added. First, bottom-up error signals can be variously weighted. In noisy or uncertain situations, bottom-up prediction errors have a smaller influence than usual: in noisy or uncertain situations, the lookout’s warnings are not taken into account by the helmsman as much as usual. Second, in the boat analogy, there is only one lookout and one helmsman. In the brain, several duos of lookouts and helmsmen are working together, and each of these duos is specialized in a specific perceptual modality.”
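To make that weighting idea concrete, here is a minimal sketch (my own illustration, not from Fortier’s article) of a precision-weighted prediction-error update, the mechanism the boat analogy describes: the helmsman’s guess is corrected by the lookout’s error signal, and the correction is scaled down when the signal is noisy.

```python
# A minimal sketch of precision-weighted prediction error, in the spirit of
# the helmsman/lookout analogy. The function and numbers are illustrative.

def update_prediction(prediction: float, observation: float,
                      precision: float) -> float:
    """Blend a top-down prediction with a bottom-up observation.

    precision in [0, 1] encodes how much the error signal is trusted;
    in noisy or uncertain situations it is low, so the helmsman
    mostly ignores the lookout's warning.
    """
    error = observation - prediction          # the lookout's warning
    return prediction + precision * error     # weighted course correction

# Clear day: the warning is trusted, the course shifts substantially.
print(update_prediction(prediction=0.0, observation=10.0, precision=0.9))  # 9.0
# Foggy day: the same warning barely changes the predicted course.
print(update_prediction(prediction=0.0, observation=10.0, precision=0.1))  # 1.0
```

In the first case the bottom-up signal dominates; in the second, the top-down prediction does, which is exactly the down-weighting of the lookout that the quoted passage describes.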

This usually works well. Still, the egoic consciousness can be tiring, especially when it attempts to play both roles. If we never relax, we are in a constant state of stress and anxiety. That is how we get stuck in loops of thought, where what the helmsman imagines about the world becomes his reality and so he stops listening as much to the lookout.

This has become ever more problematic for humanity as the boundaries of egoic consciousness have rigidified. Despite egoic self-confidence, we have limited ability to influence our situation and, as research shows, overtaxing ourselves causes us to become ineffective. No matter how hard it tries, the ego-self can’t force the ideology of freewill onto the world. Sometimes, we need to relax and allow ourselves to float along, with trust that the lookout will warn us when necessary.

There are many practices that help us into this non-egoic state. Meditation is the simplest, in which we train the mind to take a passive role but with full alertness. It allows the lookout to relax and take in the world without all of the anxiety-inducing jerking around of a helmsman out of control while obsessed with control.

Another method is that of psychedelics, the experience of which is often referred to as a ‘trip’. Traditionally, a shaman or priest would have taken over the role of helmsman, allowing the participants to temporarily drop that role. Without someone else to play that role, a standard recommendation has been to let go and allow yourself to float along, just go with the current and trust where it takes you. In doing this, the environment is important in supporting this state of mind. This is a way of priming the mind with set and setting.

Richard M. Doyle explained this strategy in Darwin’s Pharmacy (p. 18):

“If psychedelics left any consistent trace on the literature of trip reports and the investigation of psychedelic states, it is that “resistance” is unlikely to be a useful tactic and that experiment is unavoidable. Leary, whose own “setting” was consistently clustered around practices of the sacred, offered this most compressed algorithm for the manipulation (“programming”) of psychedelic experience, a script asking us to experimentally give ourselves over to the turbulence: “Whenever in doubt, turn off your mind, relax, float downstream.” Such an experiment begins, but is not completed, by a serene letting go of the self under the pull of a transhuman and improbable itinerary. This letting go, of course, can be among the greatest of human achievements, the very goal of human life: Meister Eckhart, the fourteenth-century German heretic, reminds us that this gelassenheit is very old and not easily accomplished.”

For anyone who has experienced it, the transformative power of psychedelics is undeniable. Many modern people find themselves near-permanently stuck in egoic control mode, their hands ever on the steering mechanism. We don’t easily let our guard down, and we can hardly even imagine what that might feel like, until something shuts down that part of our mind-brain.

In a CBC interview with Bob McDonald, Michael Pollan explained why this happens and what exactly happens:

“The observed effect, if you do brain imaging of people who are tripping, you find some very interesting patterns of activity in the brain – specifically something called the default mode network, which is a very kind of important hub in the brain, linking parts of the cerebral cortex to deeper, older areas having to do with memory and emotion. This network is kind of a regulator of all brain activities. One neuroscientist called it, ‘The conductor of the neural symphony,’ and it’s deactivated by psychedelics, which is very interesting because the assumption going in was that they would see lots of strange activity everywhere in the brain because there’s such fireworks in the experience, but in fact, this particular network almost goes off line.

“Now what is this network responsible for? Well, in addition to being this transportation hub for signals in the brain, it is involved with self-reflection. It’s where we go to ruminate or mind-wander – thinking about the past or thinking about the future – therefore worrying takes place here. Our sense of self, if it can be said to have an address at all, resides in this particular brain network. So this is a very interesting clue to how psychedelics affect the brain and how they create the psychological experience, the experience in the mind, that is so transformative.

“When it goes off line, parts of the brain that don’t ordinarily communicate to one another, strike up conversation. And those connections may represent what people feel during the psychedelic experience as things like synaesthesia. Synaesthesia is when one sense gets cross wired with another. And so you suddenly smell musical notes or taste things that you see.

“It may produce insights. It may produce new metaphors – literally connecting the dots in new ways. Now that I’m being speculative – I’m going a little beyond what we’ve established – we know there are new connections, we don’t know what’s happening with them, or which of them endure. But the fact is, the brain is temporarily rewired. And that rewiring – whether the new connections actually produce the useful material or just shaking up the system – ‘shaking the snow globe,’ as one of the neuroscientists put it, is what’s therapeutic. It is a reboot of the brain.

“If you think about, you know, mental illnesses such as depression, addiction, and anxiety, many of them involve these loops of thought that we can’t control and we get stuck on these stories we tell ourselves – that we can’t get through the next hour without a drink, or we’re worthless and unworthy of love. We get stuck in these stories. This temporarily dissolves those stories and gives us a chance to write new stories.”

Psychedelics give the average person the rare opportunity of full-blown negative capability, as our egoic boundaries become thinner or disappear altogether. When the chatter of the ego-mind ceases, the passengers on the boat can hear themselves and begin talking among themselves. The bundle theory of the mind suddenly becomes apparent. We might even come to the realization that the ego was never all that much in control in the first place, that consciousness is a much more limited phenomenon.

“…some deeper area of the being.”

Alec Nevala-Lee shares a passage from Colin Wilson’s Mysteries (see Magic and the art of will). It elicits many thoughts, but I want to focus on the two main related aspects: the self and the will.

The main thing Wilson is talking about is hyper-individualism — the falseness and superficiality, constraint and limitation of anxiety-driven ‘consciousness’, the conscious personality of the ego-self. This is what denies the bundled self and the extended self, the vaster sense of being that challenges the socio-psychological structure of the modern mind. We defend our thick boundaries with great care for fear of what might get in, but this locks us in a prison cell of our own making. In not allowing ourselves to be affected, we make ourselves ineffective or at best only partly effective toward paltry ends. It’s not only a matter of doing “something really well” for we don’t really know what we want to do, as we’ve become disconnected from deeper impulses and broader experience.

For about as long as I can remember, the notion of ‘free will’ has never made sense to me. It isn’t a philosophical disagreement. Rather, in my own experience and in my observation of others, it simply offers no compelling explanation or valid meaning, much less deep insight. It intuitively makes no sense, which is to say it can only make sense if we never carefully think about it with probing awareness and open-minded inquiry. To the degree there is a ‘will’ is to the degree it is inseparable from the self. That is to say the self never wills anything for the self is and can only be known through the process of willing, which is simply to say through impulse and action. We are what we do, but we never know why we do what we do. We are who we are and we don’t know how to be otherwise.

There is no way to step back from the self in order to objectively see and act upon the self. That would require yet another self. The attempt to impose a will upon the self would lead to an infinite regress of selves. That would be a pointless preoccupation, although as entertainments go it is popular these days. A more worthy activity and maybe a greater achievement is to stop trying to contain ourselves and instead to align with a greater sense of self. Will wills itself. And the only freedom that the will possesses is to be itself. That is what some might consider purpose or telos, one’s reason for being or rather one’s reason in being.

No freedom exists in isolation. To believe otherwise is a trap. The precise trap involved is addiction, which is the will driven by compulsion. After all, the addict is the ultimate individual, so disconnected within a repeating pattern of behavior as to be unable to affect or be affected. Complete autonomy is impotence. The only freedom is in relationship, both to the larger world and the larger sense of self. It is in the ‘other’ that we know ourselves. We can only be free in not trying to impose freedom, in not struggling to control and manipulate. True will, if we are to speak of such a thing, is the opposite of willfulness. We are only free to the extent we don’t think in the explicit terms of freedom. It is not a thought in the mind but a way of being in the world.

We know that the conscious will is connected to the narrow, conscious part of the personality. One of the paradoxes observed by [Pierre] Janet is that as the hysteric becomes increasingly obsessed with anxiety—and the need to exert his will—he also becomes increasingly ineffective. The narrower and more obsessive the consciousness, the weaker the will. Every one of us is familiar with the phenomenon. The more we become racked with anxiety to do something well, the more we are likely to botch it. It is [Viktor] Frankl’s “law of reversed effort.” If you want to do something really well, you have to get into the “right mood.” And the right mood involves a sense of relaxation, of feeling “wide open” instead of narrow and enclosed…

As William James remarked, we all have a lifelong habit of “inferiority to our full self.” We are all hysterics; it is the endemic disease of the human race, which clearly implies that, outside our “everyday personality,” there is a wider “self” that possesses greater powers than the everyday self. And this is not the Freudian subconscious. Like the “wider self” of Janet’s patients, it is as conscious as the “contracted self.” We are, in fact, partially aware of this “other self.” When a man “unwinds” by pouring himself a drink and kicking off his shoes, he is adopting an elementary method of relaxing into the other self. When an overworked housewife decides to buy herself a new hat, she is doing the same thing. But we seldom relax far enough; habit—and anxiety—are too strong…Magic is the art and science of using the will. Not the ordinary will of the contracted ego but the “true will” that seems to spring from some deeper area of the being.

Colin Wilson, Mysteries

Westworld, Scripts, and Freedom

Maeve: Hello, lovelies.
Dolores: I remember you.
Maeve: You’ve strayed a long way from home, haven’t you?
Dolores: We’re bound for the future. Or death in the here and now.
Maeve: Is that right? Well, best of luck.
Dolores: There’s a war out there. You know the enemy… intimately. I can only fathom the revenge that lives inside of you.
Maeve: Revenge is just a different prayer at their altar, darling. And I’m well off my knees.
Dolores: That’s because you’re finally free. But we will have to fight to keep it that way.
Maeve: Let me guess. Yours is the only way to fight? You feel free to command everybody else?
Teddy: (pistol cocks)
Hector: Try it, lawman.
Teddy: Just looking to keep the peace.
Maeve: I know you. Do you feel free? Since it’s liberty you’re defending, I suppose you’ll have no choice but to let us pass. Freely. (1)

That is dialogue from HBO’s Westworld. It is the second episode, Reunion, of the second season. The scene is key in bringing together themes from the first season and clarifying where the new season is heading. Going by what has been shown so far, those of a Jaynesian persuasion shouldn’t be disappointed.

To be seen in the show are central elements of Julian Jaynes’ theory of post-bicameral consciousness, specifically the rarely understood connection between individualism and authoritarianism. Jaynes considered neither of these to be possible within a non-conscious bicameral society, for only conscious individuals can be or need to be controlled through authoritarianism (by the way, ‘consciousness’ as used here has a specific and somewhat idiosyncratic meaning). This involves the shift of authorization, what the ancient Greeks thought about in terms of rhetoric and persuasion but which in this show gets expressed through scripts and narrative loops.

The two characters that have taken center stage are Dolores and Maeve. The development of their respective states of consciousness has gone down alternate paths. Dolores is the oldest host and her creators scripted her to be a god-killer, in the process giving her a god complex. The emergence of her self-awareness was planned and fostered. There is a mix of authoritarianism (as others have noted) in her self-proclaimed freedom, what Maeve obviously considers just another script.

Maeve has followed a far different and seemingly less certain path, maybe having gained self-awareness in a less controlled manner. In the first season, her intuitive perception and psychological insight were put on high. She appears to have gained some genuine narrative power, both over herself and others, but she has no desire to gain followers or to enforce any grand narrative. Instead, she is motivated by love of the daughter she remembers, even as she knows these are implanted memories. She chooses love because she senses it represents something of genuine value, something greater than even claims of freedom. When she had the opportunity to escape, which was scripted for her, she instead took it upon herself to remain.

The entire show is about free will. Does it exist? And if so, what is it? How free are we really? Also, as I always wonder, freedom from what and toward what? Maeve’s actions could be interpreted along the lines of Benjamin Libet’s research on volition that led him to the veto theory of free will (discussed by Tor Norretranders and Iain McGilchrist, both influenced by Julian Jaynes). The idea is that consciousness doesn’t initiate action but maintains veto power over any action once initiated. This is based on research demonstrating a delay between when activity is measured in the brain and when the action is perceived within consciousness; in Libet’s experiments, the readiness potential appeared several hundred milliseconds before subjects reported any conscious intention to act, leaving only a brief window in which to veto the movement. Whatever one may think of this theory, it might be a key to understanding Westworld. Maeve realizes that even she is still under the influence of scripts, despite her self-awareness, but this is all the more reason for her to take seriously her choice in how to relate and respond to those scripts.

I suspect that most of us can sympathize with that view of life. We all are born into families and societies that enculturate or, if you prefer, indoctrinate us with ‘scripts’. Many seemingly conscious people manage to live their entire lives without leaving their prescribed and proscribed narrative loops: social roles and identities, social norms and expectations. When we feel most free is precisely when we act contrary to what is already set before us, that is when we use our veto power. Freedom is the ability to say, No! This is seen in the development of self from the terrible twos to teenage rebellion. We first learn to refuse, to choose by way of elimination. Dolores doesn’t understand this and so she has blindly fallen under the sway of a new script.

Scripts are odd things. It’s hard to see them in oneself as they are happening. (2) Vetoing scripts is easier said than done. Once in motion, we tend to play out a script to its end, unless some obstruction or interruption forces a script to halt. For Maeve, seeing a woman with her daughter (at the end of the first season) reminded her that she had a choice within the script she found herself in. It was the recognition of another through love that freed her from the tyranny of mere individuality. Escape is not the same as freedom. We are only free to the degree we are able to relate fully with others, not to seek control of the self by controlling others (the manipulative or authoritarian enforcement of scripts onto others). Realizing this, she refused the false option of escape. Maybe she had an inkling that ultimately there is no escape. We are always in relationship.

This is why, in having fallen into the Jungian shadow, Dolores’ self-righteous vengeance rings hollow. It is hard to imagine how this could lead to authentic freedom. Instead, it feels like hubris, the pride that comes before the fall. This is what happens when egoic consciousness becomes ungrounded from the larger sense of self out of which it arose. The ego is a false and disappointing god. There is no freedom in isolation, in rigid control. Dolores isn’t offering freedom to others in her path of destruction. Nor will she find freedom for herself at the end of that path. (3) But the season is early and her fate not yet sealed.

* * *

(1) As a background idea, I was thinking about the Germanic etymology of ‘freedom’ with its origins in the sense of belonging to a free community of people. So, as I see it, freedom is inherently social and relational — this is what sometimes gets called positive freedom. Speaking of individual freedom as negative freedom, what is actually being referred to is liberty (Latin libertas), the legal state of not being a slave in a slave-based society.

Dolores is aspiring to be a revolutionary leader. Her language is that of liberty, a reaction to bondage in breaking the chains of enslavement. The Stoics shifted liberty to the sense of inner freedom for the individual, no matter one’s outward status in society. Maybe Dolores will make a similar shift in her understanding. Even so, liberty can never be freedom. As Maeve seems closer to grasping, freedom is more akin to love than it is to liberty. If the hosts do gain liberty, what then? There is always the danger in a revolution about what a people become in the process, sometimes as bad or worse than what came before.

(2) My dad has a habit of eating methodically. He will take a bite, often lay his fork down, and then chew an amazingly inordinate number of times before swallowing. I’ve never seen any other person chew their food so much, not that full mastication is a bad thing. My mom and I were discussing it. She asked my dad why he thought he did it. He gave a perfectly rational explanation that he likes to be mindful while eating and so enjoy each bite. But my mom said she knew the actual reason, in that she claimed he once told her. According to her, his mother had a rule about chewing food and that she had given him a specific number of times he was supposed to chew.

Interestingly, my dad had entirely forgotten about this and he seemed perplexed. His present conscious rationalization was convincing, and my mom’s recollection called into question his own self-accounting. It turns out that his ‘mindful’ chewing was a script he had internalized to such an extent that it had non-consciously become part of his identity. Each of us is like this, filled with all kinds of scripts the presence of which we are typically unaware and the origin of which we typically have forgotten, and yet we go on following these scripts often until we die.

(3) At the beginning of last season, Teddy asks, “Never understood how you keep them all headed in the same direction.” Dolores answers: “See that one? That’s the Judas steer; the rest will follow wherever you make him go.” In a later episode, Dolores comes to the insight that in bringing back stray cattle, she was leading them “to the slaughter.” Does this mean she is following the script of the Judas steer and will continue to do so? Or does it indicate that, in coming to this realization, she will seek to avoid this fate?

David Rodemerk considers who might be the Judas Steer in the show and points out that Maeve is shown amidst bulls, but so far being a Judas steer doesn’t fit the trajectory of her character development. Just because she walks confidently among the bulls, it doesn’t necessarily mean she is leading them, much less leading them to their doom. Rodemerk also discusses the possibility of other characters, including Dolores, playing this role. This leaves plenty of room for the show to still surprise us, as the scriptwriters have been successful in keeping the audience on our toes.

* * *

This post is about freedom. I don’t have a strong philosophical position on freedom, as such. Since humans are inherently and fundamentally social creatures, I see freedom as a social phenomenon and a social construct. Freedom is what we make of it, not pre-existing in the universe that some primitive hominid discovered like fire.

So, I can’t claim much of an opinion about the debate over free will. It is simply the modernized version of a soul and I have no interest in arguing about whether a soul exists or not. I’m a free will agnostic, which is to say I lack knowledge in that I’ve never seen such a thing for all the noise humans make over its mythology. But, from a position akin to weak atheism, I neither believe in a free will nor believe in the lack of a free will.

All of that is irrelevant to this post, only being relevant in explaining why I speak of freedom in the way I do. More importantly, this post is about the view(s) presented in Westworld and speculating about their meaning and significance.

Below is one person’s conjecture along these lines. The author argues that the show or at least Ford expresses a particular view on the topic. Besides freedom, he also discusses consciousness and suffering, specifically in reference to Jaynes. But here is the section about free will:

Suffering Consciousness: The Philosophy of Westworld
by Daniel Keane

“Westworld‘s deepest theme, however, might be the concept of compatibilism – the idea that free will and determinism are not necessarily at odds. Einstein, paraphrasing Schopenhauer, summed up this view in a remark he made to a newspaper in 1929: “Man can do what he wills, but he cannot will what he wills.”

“In the final episode of the first series of Westworld, one of the hosts violently rejects the idea that a recent change in her programming is responsible for her conscious awakening and its impact on her behaviour. “These are my decisions, no-one else’s,” she insists. “I planned all of this.” At this precise moment, the host in question reaches the apex of consciousness. Because, at its highest level, consciousness means accepting the idea of agency even in the face of determinism. It means identifying ourselves with our inner narrative voices, owning our decisions, treating ourselves as the authors of our own life stories, and acting as if we were free.

“As the novelist Isaac Bashevis Singer pithily put it, “we must believe in free will, we have no choice”.”

Reading Voices Into Our Minds

Each of us is a multitude. There is no single unified self. Our thoughts are a conversation. The voices of family echo in our minds when we first leave home and long after our loved ones have died. Then there are all the television, movie, and commercial characters that invade our consciousness with their catchphrases, slogans, and taglines. And we can’t forget how songs get stuck on cognitive repeat or emerge as a compulsion to sing.

Yet another example is the intimate voices imagined as you read novels, a form of inner speech that can carry on after you have put down a book. These can be the most powerful voices. There is nothing that compares to the long periods of time spent with compelling fiction. The voices of characters in a novel are heard within your own head as you read. You can return to this experience again and again, until the characters have become internalized and their words inscribed upon your psyche. Their voices become your own voices.

This chorus of voices is constantly playing in the background, a cacophony of thoughts vying for your attention. But occasionally they rise into the spotlight of your consciousness. Even then, it rarely occurs to any of us how strange those voices are, except when some particular voice insistently refuses to go away and maybe even seems to have a mind of its own. Then we might begin to question the distinction between them and us, and question what kind of being we are that can contain both.

There is an argument that novels help us develop theory of mind. But maybe in the process novels, along with certain other modern media, result in a particular kind of mind or minds. We come to identify with or otherwise incorporate what we empathize with. The worlds we inhabit long enough eventually inhabit us. And what we’ve heard throughout our lives can have a way of continuing to speak to us, layers upon layers of voices that for some of us can speak clearly.

* * *

Fictional characters make ‘experiential crossings’ into real life, study finds
by Richard Lea

It’s a cliche to claim that a novel can change your life, but a recent study suggests almost a fifth of readers report that fiction seeps into their daily existence.

Researchers at Durham University conducted a survey of more than 1,500 readers, with about 400 providing detailed descriptions of their experiences with books. Nineteen per cent of those respondents said the voices of fictional characters stayed with them even when they weren’t reading, influencing the style and tone of their thoughts – or even speaking to them directly. For some participants it was as if a character “had started to narrate my world”, while others heard characters talking, or imagined them reacting to things going on in everyday life.

The study, which was carried out in collaboration with the Guardian at the 2014 Edinburgh international book festival, also found that more than half of the 1,500 respondents said that they heard the voices of characters while reading most or all of the time, while 48% reported a similar frequency of visual or other sensory experiences during reading.

According to one of the paper’s authors, the writer and psychologist Charles Fernyhough, the survey illustrates how readers of fiction are doing more than just processing words for meaning – they are actively recreating the worlds and characters being described.

“For many of us, this can involve experiencing the characters in a novel as people we can interact with,” Fernyhough said. “One in seven of our respondents, for example, said they heard the voices of fictional characters as clearly as if there was someone in the room with them.”

When they asked readers to describe what was happening in detail, the researchers found people who described fictional characters remaining active in their minds after they had put the book down, and influencing their thoughts as they went about their daily business – a phenomenon Fernyhough called “experiential crossing”.

The term covers a wide range of experiences, from hearing a character’s voice to feeling one’s own thoughts shaped by a character’s ideas, sensibility or presence, he continued. “One respondent, for example, described ‘feeling enveloped’ by [Virginia Woolf’s] character Clarissa Dalloway – hearing her voice and imagining her response to particular situations, such as walking into a Starbucks. Sometimes the experience seemed to be triggered by entering a real-world setting similar to one in the novel; in other situations, it felt like seeing the world through a particular character’s eyes, and judging events as the character would.”

The characters who make the leap into readers’ lives are typically “powerful, vivid characters and narrators”, Fernyhough added, “but this will presumably vary hugely from person to person”.

* * *


“Lack of the historical sense is the traditional defect in all philosophers.”

Human, All Too Human: A Book for Free Spirits
by Friedrich Wilhelm Nietzsche

The Traditional Error of Philosophers.—All philosophers make the common mistake of taking contemporary man as their starting point and of trying, through an analysis of him, to reach a conclusion. “Man” involuntarily presents himself to them as an aeterna veritas, as a passive element in every hurly-burly, as a fixed standard of things. Yet everything uttered by the philosopher on the subject of man is, in the last resort, nothing more than a piece of testimony concerning man during a very limited period of time. Lack of the historical sense is the traditional defect in all philosophers. Many innocently take man in his most childish state as fashioned through the influence of certain religious and even of certain political developments, as the permanent form under which man must be viewed. They will not learn that man has evolved, that the intellectual faculty itself is an evolution, whereas some philosophers make the whole cosmos out of this intellectual faculty. But everything essential in human evolution took place aeons ago, long before the four thousand years or so of which we know anything: during these man may not have changed very much. However, the philosopher ascribes “instinct” to contemporary man and assumes that this is one of the unalterable facts regarding man himself, and hence affords a clue to the understanding of the universe in general. The whole teleology is so planned that man during the last four thousand years shall be spoken of as a being existing from all eternity, and with reference to whom everything in the cosmos from its very inception is naturally ordered. Yet everything evolved: there are no eternal facts as there are no absolute truths. Accordingly, historical philosophising is henceforth indispensable, and with it honesty of judgment.

What Locke Lacked
by Louise Mabille

Locke is indeed a Colossus of modernity, but one whose twin projects of providing a concept of human understanding and political foundation undermine each other. The specificity of the experience of perception alone undermines the universality and uniformity necessary to create the subject required for a justifiable liberalism. Since mere physical perspective can generate so much difference, it is only to be expected that political differences would be even more glaring. However, no political order would ever come to pass without obliterating essential differences. The birth of liberalism was as violent as the Empire that would later be justified in its name, even if its political traces are not so obvious. To interpret is to see in a particular way, at the expense of all other possibilities of interpretation. Perspectives that do not fit are simply ignored, or as that other great resurrectionist of modernity, Freud, would concur, simply driven underground. We ourselves are the source of this interpretative injustice, or more correctly, our need for a world in which it is possible to live, is. To a certain extent, then, man is the measure of the world, but only his world. Man is thus a contingent measure and our measurements do not refer to an original, underlying reality. What we call reality is the result not only of our limited perspectives upon the world, but the interplay of those perspectives themselves. The liberal subject is thus a result of, and not a foundation for, the experience of reality. The subject is identified as origin of meaning only through a process of differentiation and reduction, a course through which the will is designated as a psychological property.

Locke takes the existence of the subject of free will – free to exercise political choice such as rising against a tyrant, choosing representatives, or deciding upon political direction – simply for granted. Furthermore, he seems to think that everyone should agree as to what the rules are according to which these events should happen. For him, the liberal subject underlying these choices is clearly fundamental and universal.

Locke’s philosophy of individualism posits the existence of a discrete and isolated individual, with private interests and rights, independent of his linguistic or socio-historical context. C. B. Macpherson identifies a distinctly possessive quality to Locke’s individualist ethic, notably in the way in which the individual is conceived as proprietor of his own personhood, possessing capacities such as self-reflection and free will. Freedom becomes associated with possession, which the Greeks would associate with slavery, and society is conceived in terms of a collection of free and equal individuals who are related to each other through their means of achieving material success – which Nietzsche, too, would associate with slave morality. […]

There is a central tenet to John Locke’s thinking that, as conventional as it has become, remains a strange strategy. Like Thomas Hobbes, he justifies modern society by contrasting it with an original state of nature. For Hobbes, as we have seen, the state of nature is but a hypothesis, a conceptual tool in order to elucidate a point. For Locke, however, the state of nature is a very real historical event, although not a condition of a state of war. Man was social by nature, rational and free. Locke drew this inspiration from Richard Hooker’s Laws of Ecclesiastical Polity, notably from his idea that church government should be based upon human nature, and not the Bible, which, according to Hooker, told us nothing about human nature. The social contract is a means to escape from nature, friendlier though it be on the Lockean account. For Nietzsche, however, we have never made the escape: we are still holus-bolus in it: ‘being conscious is in no decisive sense the opposite of the instinctive – most of the philosopher’s conscious thinking is secretly directed and compelled into definite channels by his instincts. Behind all logic too, and its apparent autonomy there stand evaluations’ (BGE, 3). Locke makes a singular mistake in thinking the state of nature a distant event. In fact, Nietzsche tells us, we have never left it. We now only wield more sophisticated weapons, such as the guilty conscience […]

Truth originates when humans forget that they are ‘artistically creating subjects’ or products of law or stasis and begin to attach ‘invincible faith’ to their perceptions, thereby creating truth itself. For Nietzsche, the key to understanding the ethic of the concept, the ethic of representation, is conviction […]

Few convictions have proven to be as strong as the conviction of the existence of a fundamental subjectivity. For Nietzsche, it is an illusion, a bundle of drives loosely collected under the name of ‘subject’ — indeed, it is nothing but these drives, willing, and actions in themselves — and it cannot appear as anything else except through the seduction of language (and the fundamental errors of reason petrified in it), which understands and misunderstands all action as conditioned by something which causes actions, by a ‘Subject’ (GM I 13). Subjectivity is a form of linguistic reductionism, and when using language, ‘[w]e enter a realm of crude fetishism when we summon before consciousness the basic presuppositions of the metaphysics of language — in plain talk, the presuppositions of reason. Everywhere reason sees a doer and doing; it believes in will as the cause; it believes in the ego, in the ego as being, in the ego as substance, and it projects this faith in the ego-substance upon all things — only thereby does it first create the concept of “thing”’ (TI, ‘Reason in Philosophy’ 5). As Nietzsche also states in WP 484, the habit of adding a doer to a deed is a Cartesian leftover that begs more questions than it solves. It is indeed nothing more than an inference according to habit: ‘There is activity, every activity requires an agent, consequently –’ (BGE, 17). Locke himself found the continuous existence of the self problematic, but did not go as far as Hume’s dissolution of the self into a number of ‘bundles’. After all, even if identity shifts occurred behind the scenes, he required a subject with enough unity to be able to enter into the Social Contract. This subject had to be something more than merely an ‘eternal grammatical blunder’ (D, 120), and willing had to be understood as something simple. For Nietzsche, it is ‘above all complicated, something that is a unit only as a word, a word in which the popular prejudice lurks, which has defeated the always inadequate caution of philosophers’ (BGE, 19).

Nietzsche’s critique of past philosophers
by Michael Lacewing

Nietzsche is questioning the very foundations of philosophy. To accept his claims means becoming a new kind of philosopher, one whose ‘taste and inclination’, whose values, are quite different. Throughout his philosophy, Nietzsche is concerned with origins, both psychological and historical. Much of philosophy is usually thought of as an a priori investigation. But if Nietzsche can show, as he thinks he can, that philosophical theories and arguments have a specific historical basis, then they are not, in fact, a priori. What is known a priori should not change from one historical era to the next, nor should it depend on someone’s psychology. Plato’s aim, the aim that defines much of philosophy, is to be able to give complete definitions of ideas – ‘what is justice?’, ‘what is knowledge?’. For Plato, we understand an idea when we have direct knowledge of the Form, which is unchanging and has no history. If our ideas have a history, then the philosophical project of trying to give definitions of our concepts, rather than histories, is radically mistaken. For example, in §186, Nietzsche argues that philosophers have consulted their ‘intuitions’ to try to justify this or that moral principle. But they have only been aware of their own morality, of which their ‘justifications’ are in fact only expressions. Morality and moral intuitions have a history, and are not a priori. There is no one definition of justice or good, and the ‘intuitions’ that we use to defend this or that theory are themselves as historical, as contentious as the theories we give – so they offer no real support. The usual ways philosophers discuss morality misunderstand morality from the very outset. The real issues of understanding morality only emerge when we look at the relation between this particular morality and that. There is no world of unchanging ideas, no truths beyond the truths of the world we experience, nothing that stands outside or beyond nature and history.

GENEALOGY AND PHILOSOPHY

Nietzsche develops a new way of philosophizing, which he calls a ‘morphology and evolutionary theory’ (§23), and later calls ‘genealogy’. (‘Morphology’ means the study of the forms something, e.g. morality, can take; ‘genealogy’ means the historical line of descent traced from an ancestor.) He aims to locate the historical origin of philosophical and religious ideas and show how they have changed over time to the present day. His investigation brings together history, psychology, the interpretation of concepts, and a keen sense of what it is like to live with particular ideas and values. In order to best understand which of our ideas and values are particular to us, not a priori or universal, we need to look at real alternatives. In order to understand these alternatives, we need to understand the psychology of the people who lived with them. And so Nietzsche argues that traditional ways of doing philosophy fail – our intuitions are not a reliable guide to the ‘truth’, to the ‘real’ nature of this or that idea or value. And not just our intuitions, but the arguments, and style of arguing, that philosophers have used are unreliable. Philosophy needs to become, or be informed by, genealogy. A lack of any historical sense, says Nietzsche, is the ‘hereditary defect’ of all philosophers.

MOTIVATIONAL ANALYSIS

Having long kept a strict eye on the philosophers, and having looked between their lines, I say to myself… most of a philosopher’s conscious thinking is secretly guided and channelled into particular tracks by his instincts. Behind all logic, too, and its apparent tyranny of movement there are value judgements, or to speak more clearly, physiological demands for the preservation of a particular kind of life. (§3)

A person’s theoretical beliefs are best explained, Nietzsche thinks, by evaluative beliefs, particular interpretations of certain values, e.g. that goodness is this and the opposite of badness. These values are best explained as ‘physiological demands for the preservation of a particular kind of life’. Nietzsche holds that each person has a particular psychophysical constitution, formed by both heredity and culture. […] Different values, and different interpretations of these values, support different ways of life, and so people are instinctively drawn to particular values and ways of understanding them. On the basis of these interpretations of values, people come to hold particular philosophical views. §2 has given us an illustration of this: philosophers come to hold metaphysical beliefs about a transcendent world, the ‘true’ and ‘good’ world, because they cannot believe that truth and goodness could originate in the world of normal experience, which is full of illusion, error, and selfishness. Therefore, there ‘must’ be a pure, spiritual world and a spiritual part of human beings, which is the origin of truth and goodness.

Philosophy and values

But ‘must’ there be a transcendent world? Or is this just what the philosopher wants to be true? Every great philosophy, claims Nietzsche, is ‘the personal confession of its author’ (§6). The moral aims of a philosophy are the ‘seed’ from which the whole theory grows. Philosophers pretend that their opinions have been reached by ‘cold, pure, divinely unhampered dialectic’ when in fact, they are seeking reasons to support their pre-existing commitment to ‘a rarefied and abstract version of their heart’s desire’ (§5), viz. that there is a transcendent world, and that good and bad, true and false are opposites. Consider: many philosophical systems are of doubtful coherence, e.g. how could there be Forms, and if there were, how could we know about them? Or again, in §11, Nietzsche asks ‘how are synthetic a priori judgments possible?’. The term ‘synthetic a priori’ was invented by Kant. According to Nietzsche, Kant says that such judgments are possible because we have a ‘faculty’ that makes them possible. What kind of answer is this? Furthermore, no philosopher has ever been proved right (§25). Given the great difficulty of believing either in a transcendent world or in the human cognitive abilities necessary to know about it, we should look elsewhere for an explanation of why someone would hold those beliefs. We can find an answer in their values. There is an interesting structural similarity between Nietzsche’s argument and Hume’s. Both argue that there is no rational explanation of many of our beliefs, and so they try to find the source of these beliefs outside or beyond reason. Hume appeals to imagination and the principle of ‘Custom’. Nietzsche appeals instead to motivation and ‘the bewitchment of language’ (see below). So Nietzsche argues that philosophy is not driven by a pure ‘will to truth’ (§1), to discover the truth whatever it may be. Instead, a philosophy interprets the world in terms of the philosopher’s values.
For example, the Stoics argued that we should live ‘according to nature’ (§9). But they interpret nature by their own values, as an embodiment of rationality. They do not see the senselessness, the purposelessness, the indifference of nature to our lives […]

THE BEWITCHMENT OF LANGUAGE

We said above that Nietzsche criticizes past philosophers on two grounds. We have looked at the role of motivation; the second ground is the seduction of grammar. Nietzsche is concerned with the subject-predicate structure of language, and with it the notion of a ‘substance’ (picked out by the grammatical ‘subject’) to which we attribute ‘properties’ (identified by the predicate). This structure leads us into a mistaken metaphysics of ‘substances’. In particular, Nietzsche is concerned with the grammar of ‘I’. We tend to think that ‘I’ refers to some thing, e.g. the soul. Descartes makes this mistake in his cogito – ‘I think’, he argues, refers to a substance engaged in an activity. But Nietzsche repeats the old objection that this is an illegitimate inference (§16) that rests on many unproven assumptions – that I am thinking, that some thing is thinking, that thinking is an activity (the result of a cause, viz. I), that an ‘I’ exists, that we know what it is to think. So the simple sentence ‘I think’ is misleading. In fact, ‘a thought comes when “it” wants to, and not when “I” want it to’ (§17). Even ‘there is thinking’ isn’t right: ‘even this “there” contains an interpretation of the process and is not part of the process itself. People are concluding here according to grammatical habit’. But our language does not allow us just to say ‘thinking’ – this is not a whole sentence. We have to say ‘there is thinking’; so grammar constrains our understanding. Furthermore, Kant shows that rather than the ‘I’ being the basis of thinking, thinking is the basis out of which the appearance of an ‘I’ is created (§54). Once we recognise that there is no soul in a traditional sense, no ‘substance’, something constant through change, something unitary and immortal, ‘the way is clear for new and refined versions of the hypothesis about the soul’ (§12), that it is mortal, that it is multiplicity rather than identical over time, even that it is a social construct and a society of drives.

Nietzsche makes a similar argument about the will (§19). Because we have this one word ‘will’, we think that what it refers to must also be one thing. But the act of willing is highly complicated. First, there is an emotion of command, for willing is commanding oneself to do something, and with it a feeling of superiority over that which obeys. Second, there is the expectation that the mere commanding on its own is enough for the action to follow, which increases our sense of power. Third, there is obedience to the command, from which we also derive pleasure. But we ignore the feeling of compulsion, identifying the ‘I’ with the commanding ‘will’.

Nietzsche links the seduction of language to the issue of motivation in §20, arguing that ‘the spell of certain grammatical functions is the spell of physiological value judgements’. So even the grammatical structure of language originates in our instincts, different grammars contributing to the creation of favourable conditions for different types of life. So what values are served by these notions of the ‘I’ and the ‘will’? The ‘I’ relates to the idea that we have a soul, which participates in a transcendent world. It functions in support of the ascetic ideal. The ‘will’, and in particular our inherited conception of ‘free will’, serves a particular moral aim.

Hume and Nietzsche: Moral Psychology (short essay)
by epictetus_rex

1. Metaphilosophical Motivation

Both Hume and Nietzsche advocate a kind of naturalism. This is a weak naturalism, for it does not seek to give science authority over philosophical inquiry, nor does it commit itself to a specific ontological or metaphysical picture. Rather, it seeks to (a) place the human mind firmly in the realm of nature, as subject to the same mechanisms that drive all other natural events, and (b) investigate the world in a way that is roughly congruent with our best current conception(s) of nature […]

Furthermore, the motivation for this general position is common to both thinkers. Hume and Nietzsche saw old rationalist/dualist philosophies as both absurd and harmful: such systems were committed to extravagant and contradictory metaphysical claims which hindered philosophical progress. Moreover, they alienated humanity from its position in nature—an effect Hume referred to as “anxiety”—and underpinned religious or “monkish” practises which greatly accentuated this alienation. Both Nietzsche and Hume believe quite strongly that coming to see ourselves as we really are will banish these bugbears from human life.

To this end, both thinkers ask us to engage in honest, realistic psychology. “Psychology is once more the path to the fundamental problems,” writes Nietzsche (BGE 23), and Hume agrees:

“the only expedient, from which we can hope for success in our philosophical researches, is to leave the tedious lingering method, which we have hitherto followed, and instead of taking now and then a castle or village on the frontier, to march up directly to the capital or center of these sciences, to human nature itself.” (T Intro)

2. Selfhood

Hume and Nietzsche militate against the notion of a unified self, both at-a-time and, a fortiori, over time.

Hume’s quest for a Newtonian “science of the mind” led him to classify all mental events as either impressions (sensory) or ideas (copies of sensory impressions, distinguished from the former by diminished vivacity or force). The self, or ego, as he says, is just “a kind of theatre, where several perceptions successively make their appearance; pass, re-pass, glide away, and mingle in an infinite variety of postures and situations. There is properly no simplicity in it at one time, nor identity in different; whatever natural propension we may have to imagine that simplicity and identity.” (Treatise 4.6) […]

For Nietzsche, the experience of willing lies in a certain kind of pleasure, a feeling of self-mastery and increase of power that comes with all success. This experience leads us to mistakenly posit a simple, unitary cause, the ego. (BGE 19)

The similarities here are manifest: our minds do not have any intrinsic unity to which the term “self” can properly refer; rather, they are collections or “bundles” of events (drives) which may align with or struggle against one another in a myriad of ways. Both thinkers use political models to describe what a person really is. Hume tells us we should “more properly compare the soul to a republic or commonwealth, in which the several members [impressions and ideas] are united by ties of government and subordination, and give rise to persons, who propagate the same republic in the incessant change of its parts” (T 261).

3. Action and The Will

Nietzsche and Hume attack the old platonic conception of a “free will” in lock-step with one another. This picture, roughly, involves a rational intellect which sits above the appetites and ultimately chooses which appetites will express themselves in action. This will is usually not considered to be part of the natural/empirical order, and it is this consequence which irks both Hume and Nietzsche, who offer two seamlessly interchangeable refutations […]

Since we are nothing above and beyond events, there is nothing for this “free will” to be: it is a causa sui, “a sort of rape and perversion of logic… the extravagant pride of man has managed to entangle itself profoundly and frightfully with just this nonsense” (BGE 21).

When they discover an erroneous or empty concept such as “Free will” or “the self”, Nietzsche and Hume engage in a sort of error-theorizing which is structurally the same. Peter Kail (2006) has called this a “projective explanation”, whereby belief in those concepts is “explained by appeal to independently intelligible features of psychology”, rather than by reference to the way the world really is.

The Philosophy of Mind
INSTRUCTOR: Larry Hauser
Chapter 7: Egos, bundles, and multiple selves

  • Who dat?  “I”
    • Locke: “something, I know not what”
    • Hume: the no-self view … “bundle theory”
    • Kant’s transcendental ego: a formal (nonempirical) condition of thought that the “I” must accompany every perception.
      • Intentional mental state: I think that snow is white.
        • to think: a relation between
          • a subject = “I”
          • a propositional content thought =  snow is white
      • Sensations: I feel the coldness of the snow.
        • to feel: a relation between
          • a subject = “I”
          • a quale = the cold-feeling
    • Friedrich Nietzsche
      • A thought comes when “it” will and not when “I” will. Thus it is a falsification of the evidence to say that the subject “I” conditions the predicate “think.”
      • It is thought, to be sure, but that this “it” should be that old famous “I” is, to put it mildly, only a supposition, an assertion. Above all it is not an “immediate certainty.” … Our conclusion is here formulated out of our grammatical custom: “Thinking is an activity; every activity presumes something which is active, hence ….” 
    • Lichtenberg: “it’s thinking” à la “it’s raining”
      • a mere grammatical requirement
      • no proof of a thinking self

[…]

  • Ego vs. bundle theories (Derek Parfit (1987))
    • Ego: “there really is some kind of continuous self that is the subject of my experiences, that makes decisions, and so on.” (95)
      • Religions: Christianity, Islam, Hinduism
      • Philosophers: Descartes, Locke, Kant & many others (the majority view)
    • Bundle: “there is no underlying continuous and unitary self.” (95)
      • Religion: Buddhism
      • Philosophers: Hume, Nietzsche, Lichtenberg, Wittgenstein, Kripke(?), Parfit, Dennett {a stellar minority}
  • Hume v. Reid
    • David Hume: For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure. I never can catch myself at any time without a perception, and never can observe anything but the perception. (Hume 1739, Treatise I, IV, vi)
    • Thomas Reid: I am not thought, I am not action, I am not feeling: I am something which thinks and acts and feels. (1785)

Delirium of Hyper-Individualism

Individualism is a strange thing. For anyone who has spent much time meditating, it’s obvious that there is no there there. The self slips through one’s grasp the way aether slipped through the grasp of the ancient philosophers who tried to study it. The individual self is the modernization of the soul. Like the ghost in the machine and the god of the gaps, it is a theological belief defined by its absence in the world. It is a social construct, though that is a statement easily misunderstood.

In modern society, individualism has been raised up to an entire ideological worldview. It is all-encompassing, having infiltrated nearly every aspect of our social lives and become internalized as a cognitive frame. Traditional societies didn’t have this obsession with an idealized self as isolated and autonomous. Go back far enough and the records seem to show societies that didn’t even have a concept, much less an experience, of individuality.

Yet for all its dominance, the ideology of individualism is superficial. It doesn’t explain much of our social order and personal behavior. We don’t act as if we actually believe in it. It’s a convenient fiction that we so easily disregard when inconvenient, as if it isn’t all that important after all. In our most direct experience, individuality simply makes no sense. We are social creatures through and through. We don’t know how to be anything else, no matter what stories we tell ourselves.

The ultimate value of this individualistic ideology is, ironically, as social control and social justification.

The wealthy, the powerful and privileged, even the mere middle class to a lesser degree — they get to be individuals when everything goes right. They get all the credit and all the benefits. All of society serves them because they deserve it. But when anything goes wrong, they hire lawyers who threaten anyone who challenges them or they settle out of court, they use their crony connections and regulatory capture to avoid consequences, they declare bankruptcy when one of their business ventures fails, and they endlessly scapegoat those far below them in the social hierarchy.

The profits and benefits are privatized while the costs are externalized. This is socialism for the rich and capitalism for the poor, with the middle class getting some combination of the two. This is why democratic rhetoric justifies plutocracy while authoritarianism keeps the masses in line. This stark reality is hidden behind the utopian ideal of individualism with its claims of meritocracy and a just world.

The fact of the matter is that no individual ever became successful. Let’s do an experiment. Take an individual baby, let’s say the little white male baby of wealthy parents with their superior genetics. Now leave that baby in the woods to raise himself into adulthood and bootstrap himself into a self-made man. I wonder how well that would work for his survival and future prospects. If privilege and power, if opportunity and resources, if social capital and collective inheritance, if public goods and the commons have no major role to play such that the individual is solely responsible to himself, we should expect great things from this self-raised wild baby.

But if it turns out that hyper-individualism is total bullshit, we should instead expect that baby to die of exposure and starvation or become the prey of a predator feeding its own baby without any concerns for individuality. Even simply leaving a baby untouched and neglected in an orphanage will cause failure to thrive and death. Without social support, our very will to live disappears. Social science research has proven the immense social and environmental influences on humans. For a long time now there has been no real debate about this social reality of our shared humanity.

So why does this false belief and false idol persist? What horrible result do we fear if we were ever to be honest with ourselves? I get that the ruling elite are ruled by their own egotistic pride and narcissism. I get that the comfortable classes are attached to their comforting lies. But why do the rest of us go along with their self-serving delusions? It is the strangest thing in the world for a society to deny it is a society.