“Perpetual motion keeps the dream in place.”

Karl Marx: “The reform of consciousness consists in making the world aware of its own consciousness, in awakening it out of its dream about itself”

Chuang-Tzu: Dreams he’s a butterfly. Wakes up, finds he’s a man, & asks: “But who is dreaming who? Is the butterfly now dreaming me?”

Now replace Chuang-Tzu’s butterfly with capital.

We dreamt capital into being, it was a function of our imaginations, our social vision, our direction.

But at some point, the figure and ground flipped. Capital became the dreamer, and we, functions of its vision and (paltry) imagination.

This is one way I read Marx: we dreamt up capital (and thankfully so, it’s a necessary component to the dialectic of progress).

But then we fell inside our own creation, engulfed, and now live perpetually inside capital, confusing the means for the end.

The ‘reform of consciousness’ is a collective remembrance, a collective ‘waking up’ from the nap we’ve taken inside what was only ever meant to be a transitional phase.

But for Marx, this isn’t realized as some big ideological shift.

Rather, reforming the material environment is the precondition for reforming the collective consciousness.

This is why he takes shortening the working week as the prerequisite for “the true realm of freedom”. Perpetual motion keeps the dream in place.

 ~ Oshan Jarow, Twitter thread

Useful Info On Covid-19

Data on covid-19 has been coming out. Two things to keep in mind are the incubation period and the latent period. The incubation period is the time from when an individual is infected to when they show symptoms, although there are asymptomatic cases where no symptoms are experienced at all.

The average incubation period is 3 days, but it can be as short as 2 days or, in rare cases, as long as 21 or even 24 days. A quarantine of 14 days was assumed to be sufficient for nearly all patients, and even a shorter quarantine, if implemented strictly and widely, would presumably eliminate most infections. But is that true? The long tail of incubation periods calls the assumption into question.

More important is the latent period. This is how long it takes from being infected to becoming infectious, which for covid-19 can happen days before symptoms appear. Keep in mind that, as I understand it, covid-19 is not more contagious than the common flu with any single exposure. Rather, there is a longer potential exposure period, which translates into a higher infection rate.
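To make the distinction concrete, here is a minimal toy simulation in Python. This is my own sketch for illustration, not something from the articles linked below; every parameter value is an assumption chosen only to show the timing logic, not an estimate of the real disease.

```python
# Toy branching model: the latent period (infection -> infectious) drives
# spread; the incubation period (infection -> symptoms) only determines when
# cases become visible. All numbers are illustrative assumptions.

LATENT_DAYS = 2            # assumed: infected people become infectious on day 2
INCUBATION_DAYS = 5        # assumed: symptoms appear on day 5
INFECTIOUS_DAYS = 8        # assumed length of the infectious window
INFECTIONS_PER_DAY = 0.3   # assumed infections caused per infectious person per day

def daily_new_infections(days: int = 30, seed: float = 1.0) -> list[float]:
    """No susceptible depletion, so this shows timing only, not a real curve."""
    new = [0.0] * days
    new[0] = seed
    for t in range(1, days):
        # Anyone infected between LATENT_DAYS and LATENT_DAYS + INFECTIOUS_DAYS
        # days ago transmits today -- including the presymptomatic (those
        # infected fewer than INCUBATION_DAYS days ago).
        new[t] = sum(
            INFECTIONS_PER_DAY * new[t - age]
            for age in range(LATENT_DAYS, LATENT_DAYS + INFECTIOUS_DAYS)
            if t - age >= 0
        )
    return new

curve = daily_new_infections()
# Toy reproduction number: infections per day times days spent infectious.
print(f"toy R0 = {INFECTIONS_PER_DAY * INFECTIOUS_DAYS:.1f}")
print(f"new infections on day {len(curve) - 1}: {curve[-1]:.1f}")
```

The toy arithmetic captures the point above: holding per-exposure transmissibility fixed, a longer infectious window directly raises the reproduction number (0.3 × 8 = 2.4 here), and a latent period shorter than the incubation period means spread happens before symptoms ever show up.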

It’s not clear how long one carries the virus and can pass it on to others. Doctors detected the RNA of the virus in the lungs 20 days after infection. And in a study, the pathogen was found in the respiratory tract for much longer, upwards of 37 days. However, carrying the virus doesn’t necessarily mean one is infectious.

For mild cases, which are most cases, the infectious period following symptom onset likely lasts no more than 10 days, even as the virus can still be detected. To demonstrate this, researchers used samples from patients (sputum, blood, urine, stool) to try to grow the virus. By day 8, they could no longer do so with patients who had mild infections.

So, the fact that some can test positive for weeks might be largely irrelevant. Most patients stop what is called viral shedding within the first 5 days, although in a minority of patients with severe sickness it can go some days beyond that. The extreme cases involved pneumonia, and the viral shedding continued for 10-11 days. Still, generally speaking, the most infectious period is those first few days.

Also, keep in mind that not everyone is equally infectious. An elderly couple went on a cruise ship together for a couple of weeks. The wife got infected and presumably was sick for the entire two weeks and yet the husband remained free of infection.

Someone with symptoms may infect no one else while someone without symptoms could easily infect many. One estimate puts the share of covid-19 patients who show no symptoms at about 1.2%, but the real share may be far higher: on the Diamond Princess cruise ship, 322 of the 621 people who tested positive had no symptoms, just over half. That makes containment difficult, especially with limited ability to do testing.

About symptoms, here is a key piece of info. In a small but significant number of cases, there are no symptoms at all. In many other cases, the symptoms are minor or atypical. We mostly hear from the media about the respiratory problems, but the disease can show up in other ways. In one study, nearly half (48.5%) of patients had digestive issues such as diarrhea, vomiting, and abdominal pain. And about 7% showed no respiratory symptoms at all.

There are other symptoms as well. Most recently, it’s been found that loss of taste and smell can be a sign of infection. About half of patients, in one cluster from Germany, experienced a change in smell or taste. Sensory loss usually follows respiratory symptoms, although not always. Another symptom is redness around the eyes, as seen in some of the worst cases.

Those without the typical respiratory problems and fever often don’t see a doctor, or only do so much later. This can actually make these cases more severe, with a longer recovery period. Yet for some reason, few people are talking about the full spectrum of potential symptoms. In many lists of symptoms, gastrointestinal distress is not mentioned at all. And the loss of senses is only now being reported on.

Bonus info: A mouse study showed that ketosis has a protective effect against influenza (flu). Coronavirus is different in many ways, but the body deals with viruses through the same basic mechanisms. Ketosis causes the body to produce a special kind of T cell in the lungs and also a protective layer of mucus in the lungs. Survival was higher for the mice on a keto diet.

This might be related to why ketosis is inefficient, producing excess heat. Ben Bikman speculates this has to do with the time of year when ketosis tends to happen: winter. While fasting or under dietary restriction during winter, it would be useful for the body both to produce extra heat and to shift the immune system into higher functioning.

* * *


Coronavirus may incubate for longer than we thought–which means quarantines may have been too short
by Joseph Guzman

A Person Can Carry And Transmit COVID-19 Without Showing Symptoms, Scientists Confirm
by Aria Bendix

Coronavirus Can Live in Patients for Five Weeks After Contagion
by Claire Che

People ‘shed’ high levels of coronavirus, study finds, but most are likely not infectious after recovery begins
by Helen Branswell

‘Covid-19 most infectious in early days’
by Sumitra Debroy & Malathy Iyer

Lost Sense of Smell May Be Peculiar Clue to Coronavirus Infection
by Roni Caryn Rabin

Study: Nearly half of COVID-19 patients experience digestive issues
by Joseph Guzman

Diarrhea Could Be First Sign Of Coronavirus Infection, Study Says
by Jan Cortes

Doctors say pink eye with other key symptoms may represent COVID-19 cases
from Chron

Islamic Voice-Hearing

What kind of religion is Islam? We earlier described it as the worship of a missing god. Some might consider that unfair and dismissive toward one of the world’s largest religions, but it is true to some extent of all post-bicameral religions. The difference is that Islam is among the most post-bicameral of the world religions, if only in temporal terms.

The bicameral societies, according to Julian Jaynes, ended with the widespread collapse of the late Bronze Age empires and their trade networks. That happened around 1177 BCE, as the result of natural disasters and attacks by the mysterious Sea Peoples, the latter maybe having formed out of the refugees from the former. The Bronze Age continued for many centuries in various places: until around 700 BCE in Great Britain, Central Europe, and China; 600 BCE in Northern Europe; 500 BCE in Korea and Ireland; and centuries beyond that in places like Japan.

But the Bronze Age Empires never returned. In that late lingering Bronze Age, a dark age took hold and put all of civilization onto a new footing. This was the era when, across numerous cultures, there were the endless laments about the gods, spirits, and ancestors having gone silent, having abandoned humanity. Entire cultural worldviews and psychological ways of being were utterly demolished or else irreparably diminished. This created an intense sense of loss, longing, and nostalgia that has never left humanity since.

Out of the ashes, while the Bronze Age was still holding on, the Axial Age arose around 900 BCE and continued until 200 BCE. New cultures were formed and new empires built. The result is what Jaynes described as ‘consciousness’ or what one can think of as introspective mental space, an inner world of egoic identity where the individual is separate from community and world. Consciousness and the formalized religions that accompanied it were a replacement for the loss of a world alive with voices.

By the time Rabbinic Judaism, Gnosticism, and Christianity came around, the Axial Age was already being looked back upon as a Golden Age and, other than through a few surviving myths, the Bronze Age before that was barely remembered at all. It would be nearly another 600 years after that first-century monotheistic revival when Muhammad would have his visions of the angel Gabriel visiting him to speak on behalf of God. Islam is thus both post-bicameral and post-axial, to a far greater degree than the religions that came before it.

Muslims consider Muhammad to be the last prophet, and even he didn’t get to hear God directly, for the message had to come through an angel. The voice of God had long ago grown so faint that people had come to rely on oracles, channelings, and the like. These rather late revelations by way of Gabriel were but a barely audible echo of the archaic bicameral voices. It is maybe understandable that, as with some oracles before him, Muhammad would declare God would never speak again. So Islam, unlike the other monotheistic religions, fully embraces God’s absence from the world.

Actually, that is not quite right. Based on the Koran, God will not speak again until the Final Judgment. Then all will hear God again when he weighs your sins and decides the fate of your immortal soul. Here is the interesting part. The witnesses God shall call upon in each person’s case will be all the bicameral voices brought back out of silence. The animals and plants will witness for or against you, as will the earth and rocks and wind. Even your own resurrected body parts will come alive again with voices to speak of what you did. Body parts speaking is something familiar to those who read Jaynesian scholarship.

Until then, God and all the voices of the world will remain mute witnesses, watching your every move and taking notes. They see all, hear all, notice all — every time you masturbate or pick your nose, every time you have a cruel or impure thought, every time you don’t follow one of the large number of divine commandments, laws, and rules spelled out in the Koran. The entire world is spying upon you and will report back to God, at the end of time. The silent world only appears to be dumb and unconscious. God is biding his time, gathering a file on you like a cosmic FBI.

This could feel paralyzing, but in another way it offers total freedom from self, total freedom through complete submission. Jaynesian consciousness is a heavy load and that was becoming increasingly apparent over time, especially in the centuries following the Axial Age. The zealous idealism of the Axial Age prophets was growing dull and tiresome. By the time that Muhammad showed up, almost two millennia had passed since the bicameral mind descended into darkness. The new consciousness was sold as something amazing, but it hadn’t fully lived up to its promises. Instead, ever more brutal regimes came into power and a sense of anxiety was overtaking society.

Muhammad had an answer and the people of that region were obviously hungry for someone to provide an answer. After forming his large army, his military campaign barely experienced any resistance. And in a short period of time while he was still alive, most of the Arabian peninsula was converted to Islam. The silence of the gods had weakened society, but Muhammad offered an explanation for why the divine could no longer be experienced. He helped normalize what had once felt like a tragedy. He told them that they didn’t need to hear God because God had already revealed all knowledge to the prophets, including himself of course. No one had to worry, just follow orders and comply with commands.

All the tiresome complications of thought were unnecessary. God had already thought out everything for humans. The Koran, as the final and complete holy text, would entirely and permanently replace the bicameral voices, ever receding into the shadows of the psyche. But don’t worry, all those voices are still there, waiting to speak. The only voice the individual needed to listen to was that of the person directly above them in the religious hierarchy, be it one’s father or an imam or whoever else holds official authority in a line of command that goes back through the prophets and the angels to God Himself. Everything is in the Koran, and the learned priestly class would explain it all and translate it into proper theocratic governance.

Muhammad came with a different message than anyone before him. The Jewish prophets and Jesus, as with many Pagans, would speak of God as Father and humanity as His children. Early Christians took this as a challenge to a slave-based society, borrowing from the Stoics the idea that even a slave was free in his soul. Muhammad, instead, was offering another variety of freedom. We humans, rather than children of God, are slaves of God. The entire Islamic religion is predicated upon divine slavery, absolute submission. This is freedom from the harsh taskmaster of egoic individuality, a wannabe demiurge. Unlike Jesus, Muhammad formulated a totalitarian theocracy, a totalizing system. Nothing is left to question or interpretation, at least in theory, or rather in belief.

This goes back to how, with the loss of the bicameral mind and social order, something took its place. It was a different kind of authoritarianism — rigid and hierarchical, centralized and concentrated, despotic and violent. Authoritarianism of this variety didn’t emerge until the late Bronze Age when the bicameral societies were becoming too large and complex, overstrained and unstable. Suddenly, as if to presage the coming collapse, there was the appearance of written laws, harsh punishment, and cruel torture — none of which ever existed before, according to historical records and archaeological finds. As the world shifted into post-bicameralism, this authoritarianism became ever more extreme (e.g., Roman Empire).

This was always the other side of the rise of individuality, of Jaynesian consciousness. The greater the potential freedom the individual possesses, the more oppressive social control is required, as the communal bonds and social norms of the bicameral mind increasingly lost their hold to organically maintain order. Muhammad must have shown up at the precise moment of crisis in this change. After the Roman Empire’s system of slavery, Europe came up with feudalism to re-create some of what had disappeared. But apparently a different kind of solution was required in the Arab world.

Maybe this offsets the draining of psychic energy that comes with consciousness. Jaynes speculated that, like the schizophrenic, bicameral humans had immense energy and stamina, which allowed them to accomplish near-miraculous feats such as building the pyramids with small populations and very little technology or infrastructure. Suppressing the extremes of individualism by emphasizing absolute subordination is maybe a way of keeping in check the energy cost of maintaining egoic consciousness. In the West, we eventually overcame this weakness by using massive doses of stimulants to overpower the otherwise debilitating anxiety and to help shore up the egoic boundaries, but this has come at the cost of destroying our physical and mental health.

Time will tell which strategy is the most effective for long-term survival of specific societies. But I’m not sure I’d bet on the Western system, considering how unsustainable it appears to be and how easily it has become crippled by a minor disease epidemic like covid-19. Muhammad might simply have been trying to cobble together some semblance of a bicameral mind, in the face of divine silence. There is a good reason for trying to do that. Those bicameral societies lasted many millennia longer than has our post-bicameral civilization. It’s not clear that modern civilization or at least Western civilization will last beyond the end of this century. We underestimate the bicameral mind and the importance it played during the single longest period of advancement of civilization.

* * *

Let us leave a small note of a more personal nature. In the previous post (linked above), we mentioned that our line of inquiry began with a conversation we had with a friend of ours who is a Muslim. He also happens to be schizophrenic, i.e., a voice-hearer. The last post was about how voice-hearing is understood within Islam. Since supposedly God no longer speaks to humans, nor do his angelic intermediaries, any voice a Muslim hears is automatically interpreted as not being of divine origin. That doesn’t necessarily make the voice evil, as it could be a jinn, which is a neutral entity in Islamic theology, although jinn can be dangerous. Then again, voice-hearing might also be caused by an evil magician, what I think is called a sihir.

Anyway, we had the opportunity to speak to this friend once again, as we are both in jobs that require us to continue working downtown amidst everything otherwise being locked down because of the covid-19 epidemic. In being isolated from family and other friends, we’ve been meeting with this Muslim friend on a daily basis. Just this morning, we went for a long walk together and chatted about life and religion. He had previously talked about his schizophrenia in passing, apparently unworried by the stigma of it. He is an easy person to talk to, quite direct and open about his thoughts and experiences. I asked him about voice-hearing and he explained that, prior to being medicated, he would continue to hear people speak to him after they were no longer present. And unsurprisingly, the voices were often negative.

Both his imam and his therapist told him to ignore the voices. Maybe that is a standard approach in traditionally monotheistic cultures. As we mentioned in the other post, he is from North Africa, where Arabs are common. But another friend of ours lives in Ghana, in West Africa. The voice-hearing experiences of people in Ghana were compared to those of people in the United States in the research of Tanya M. Luhrmann, an anthropologist inspired by Julian Jaynes. She found that Ghanaians, with a tradition of voice-hearing (closer to bicameralism?), had a much more positive experience of the voices they heard. Americans, like our Muslim friend, did not tend to hear voices that were kind and helpful. This is probably the expectancy effect.

If you are raised to believe that voices are demonic, or their Islamic equivalent of jinn, or are from witches and evil magicians, or if you simply have been told voice-hearing means you’re insane, well, it’s not likely to lead to happy results when you do hear voices. I doubt it decreases the rate of voice-hearing, though. In spite of Islamic theology denying that God and angels speak to humans any longer, that isn’t likely to have any effect on voice-hearing itself. So, the repressed bicameral mind keeps throwing out these odd experiences, but in our post-bicameral age we have fewer resources for dealing constructively with those voices. Simply denying and ignoring them is probably unhelpful.

That is the ultimate snag. The same voices that once were identified as godly or something similar are now taken as false, unreal, or dangerous. In a sense, God never stopped speaking. One could argue that we all are voice-hearers, but some of us now call the voice of God ‘conscience’ or whatever. Others, like Muslims, put great emphasis on this voice-hearing but have tried to gag a God who goes on talking. Imagine how many potential new prophets have been locked away in psychiatric wards or, much worse, killed or imprisoned as heretics. If God can’t be silenced, the prophets who hear him can be. The Old Testament even describes how the authorities forbade voice-hearing and demanded that voice-hearers be killed, even by their own parents.

The bicameral mind didn’t disappear naturally because it was inferior but because, in its potency, it was deemed dangerous to those who wanted to use brute power to enforce their own voices of authorization. The bicameral mind, once central to the social order, had become enemy number one. If people could talk to God directly, religion and its claims of authority would become irrelevant. That is how our Islamic friend, a devout religious practitioner, ended up being drugged up to get the voices to stop speaking.

Islam as Worship of a Missing God

A friend of ours is a Muslim and grew up in an Islamic country. As he talked about his religion, we realized how different it is from Christianity. There is no shared practice among Christians similar to praying five times a day. From early on, Christianity was filled with diverse groups and disagreements, and that has only increased over time (there are over 4,600 denominations of Christianity in the United States alone). Our friend had a hard time appreciating that there is no agreed-upon authority, interpretation, or set of beliefs among all Christians.

Unlike Muhammad, Jesus never wrote anything, nor was anything written down about him until much later. Nor did he intend to start a new religion. He offered no rules, social norms, or instructions for how to organize a church, a religious society, or a government. He didn’t even preach family values; if anything, the opposite, from a command to let the dead bury their dead to the proclamation of having come to turn family members against each other. The Gospels offer no practical advice about anything. Much of Jesus’ teachings, beyond a general message of love and compassion, are vague and enigmatic, often parables with many possible meanings.

Now compare Jesus to the Islamic prophet. Muhammad is considered the last prophet, although he never claimed to have heard the voice of God directly, instead supposedly receiving the message secondhand through an angel. Still, according to Muslims, the Koran is the only complete holy text in existence, the final Word of God. That is also something that differs from Christianity. Jesus never asserted that God would become silent to all of humanity for eternity and that his worshippers would be condemned to a world without the God they longed for, in the way Allah never enters His own Creation.

Many Protestants and Anabaptists and those in similar groups believe that God continues to be revealed to people today, that the divine is known through direct experience, that the Bible as a holy text must be read as a personal relationship to God, not merely taken on the authority of blind faith. Some churches go so far as to teach people how to speak to and hear God (T. M. Luhrmann, When God Talks Back). Even within Catholicism, there have been further revelations of God since Jesus, from various mystics and saints acknowledged by the Vatican, but also from ordinary Catholics claiming God spoke to them without any great fear of heresy charges or excommunication.

It made me think about Julian Jaynes’ theory of modern consciousness. With the collapse of the Bronze Age civilizations, there was this sense of the gods having gone silent. Yet this was never an absolute experience, as some people continued to hear the gods. Even into the modern world, occasionally people still claim to hear various gods and sometimes even found new religions based on revelations. The Baha’i, for example, consider Muhammad to be just one more prophet, with others having followed him. Hindus also have a living tradition of divine revelation that is equivalent to that of prophets. Only Islam, as far as I know, claims all prophecy and revelation to be ended for all time.

I was thinking about the sense of loss and loneliness people felt when bicameral societies came to an end. They were thrown onto an increasingly isolated individualism. Religion as we know it was designed to accommodate this, in order to give a sense of order, meaning, and authority that had gone missing. But Islam takes this to an extreme. After Muhammad, no human supposedly would ever again personally hear, see, or experience the divine in any way (excluding mystical traditions like Sufism). For all intents and purposes, Allah has entirely receded from the world. The only sign of his existence that he left behind was a book of instructions. We must submit and comply or be punished in the afterlife, a world separate from this one.

That seems so utterly depressing and dreary to me. I was raised Christian, at the far other extreme of Protestantism. My family attended the Unity Church, which emphasizes direct experience of God to such a degree that the Bible itself was mostly ignored and almost irrelevant — why turn to mere words on paper when you can go straight to the source? Rather than being denied and condemned, a claim to have heard God speak would have been taken seriously. I’m no longer religious, but the nearly deist idea of a god that is distant and silent seems alien and unappealing to me. Yet maybe that makes Islam well designed for the modern world, as it offers a strong response to atheism.

If you don’t have any experience of God, this is considered normal and expected in Islam, not something to be worried about, not something to challenge one’s faith, as is common in Christianity (NDE: Spirituality vs Religiosity); and it avoids the riskiness and confusion of voice-hearing (Libby Anne, Voices in Your Head: Evangelicals and the Voice of God). One’s ignorance of the divine demonstrates one’s individual inadequacy and, as argued by religious authority, is all the more reason to submit to religious authority. The Islamic relation between God and humanity is one-way, except to some extent by way of inspiration and dreams; Allah himself never directly enters his Creation and so never directly interacts with humans, not even with prophets. Is that why constant prayer is necessary for Muslims, to offset God’s silence and vacancy? Worship of a missing God seems perfectly suited for the modern world.

Muslims are left with looking for traces of God in the Koran like ants crawling around in a footprint while trying to comprehend what made it and what it wants them to do. So, some of the ants claim to be part of a direct lineage of ants that goes back to an original ant that, according to tradition, was stepped upon by what passed by. These well-respected ants then explain to all the other ants what is meant by all the bumps and grooves in the dried mud. In worship, the ants pray toward the footprint and regularly gather to circle around it. This gives their life some sense of meaning and purpose and, besides, it maintains the social order.

That is what is needed in a world where the bicameral voices of archaic authorization no longer speak, no longer are heard. Something has to fill the silence as the loneliness it creates is unbearable. Islam has a nifty trick, embracing the emptiness and further irritating the overwhelming anxiety as it offers the salve for the soul. Muslims take the silence of God as proof of God, as a promise of something more. This otherworldly being, Allah, tells humans who don’t feel at home in this world that their real home is elsewhere, to which they will return if they do what they are told. Other religions do something similar, but Islam takes this to another level — arguably, the highest or most extreme form of monotheism, so far. The loss of the bicameral mind could not be pushed much further, one suspects, without being pushed into an abyss.

Islam is a truly modern religion. Right up there with capitalism and scientism.

* * *

Further discussion about this can be found on the Facebook page “Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind”.


Plant-Based Nutritional Deficiencies

The purpose here is to highlight the nutritional deficiencies of plant-based diets, most specifically plant-exclusive diets such as veganism (important nutrients are listed below). Not all of these deficiencies involve essential nutrients, but our knowledge of what is essential is limited. There are deficiencies that will kill you quickly, others slowly, and still others that simply cause deteriorating health or less than optimal functioning. Also, some of these nutrients or their precursors can be found in plant foods or otherwise produced by the body, but there can be several problems: the plant-based sources may be inadequate or not in the most bioavailable form; antinutrients in the plants may block the absorption of certain nutrients (e.g., phytates block mineral absorption); gut and microbiome problems related to a plant-based diet might interfere with absorption; and most people have a severely limited capacity to turn certain precursors into the needed nutrients.

So, when eating a supposedly healthy diet, many vegans and vegetarians still have major deficiencies, even with nutrients that should be in their diet according to standard food-intake calculations — in those cases, the nutrients are there in theory but for some reason are not being absorbed or utilized. For example, raw spinach has a lot of calcium, but it is almost entirely unavailable to the body (see the sketch after this paragraph). Adding raw spinach to your smoothie or salad might be a net loss to your health, as the antinutrients will block the nutrients in other foods as well. Another factor is that, on a plant-based diet, nutrients can get out of ratio. Nutrients work together, with some acting as precursors, others as catalysts, and still others like master hormones — such as vitamin K2 determining where calcium is transported to, preferably the bones as opposed to the arteries, joints, and brain; or think about how the body can produce vitamin D3 but only if there is adequate cholesterol. As such, besides deficiencies, sometimes there can be too much of a nutrient, which interferes with another nutrient, as seen with copper in relation to zinc.
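As a toy illustration of that bioavailability gap, here is a minimal sketch. The serving sizes and absorption fractions are rough illustrative assumptions rather than clinical constants; the direction is the well-documented part (spinach calcium is poorly absorbed because oxalates bind it, while dairy calcium absorbs far better).

```python
# Calcium on the label vs. calcium the body can actually absorb.
# Serving sizes and absorption fractions are rough illustrative assumptions.
foods = {
    # food: (mg calcium per serving, assumed fraction absorbed)
    "raw spinach, 1 cup": (30, 0.05),   # oxalates block most absorption
    "milk, 1 cup": (300, 0.30),
}

for food, (mg, fraction) in foods.items():
    print(f"{food}: {mg} mg on paper, ~{mg * fraction:.0f} mg absorbed")
```

The point is the order-of-magnitude gap: a nutrient can be “in the diet” according to a food-intake table while contributing almost nothing the body can use.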

That is the advantage of an animal-based diet, which could even include a well-balanced vegetarian diet that emphasizes dairy and eggs (Vegetarianism is an Animal-Based Diet), though unfortunately many vegetarians are near-vegan in limiting even those non-meat animal foods. Here is the reason why animal foods are so important. Other animals have nutritional needs similar to ours and so, when we eat animal foods, we get not only the nutrients our bodies need but in the required forms and ratios for our own optimal functioning. Without animal foods, one has to study nutrition to understand all of this and then try to artificially re-create it through careful calculations, balancing what one eats and supplements, an almost impossible task that requires a scientific mindset. Even then, one is likely to get it wrong. Regular testing of nutritional levels would be absolutely necessary to ensure everything is going according to plan.

As for supplements and fortification, the nutrients aren’t always in the best form and so wouldn’t be as bioavailable, nor would they likely come with all the needed cofactors in just the right amounts. Besides, a diet dependent on supplementation and fortification is not healthy by definition, in that the food itself in natural form lacks those nutrients. The fact that most vegans in particular, and many vegetarians as well, have to be extremely obsessive about nutrition just to maintain a basic level of health is not high praise for the health-giving benefits of such a plant-based diet — and hence the reason even vegetarians should emphasize the allowed animal foods (there are even vegans who will make exceptions for some animal foods, such as fish). This is probably why most people quit these diets after a short period of time and why most people who quit, including those who quit after years or decades, do so for health reasons. Among those who remain on these diets, their responses on surveys show that most of them cheat on occasion and so are getting some minimal level of animal-based nutrition, and that is a good thing for their health even as it calls into question the validity of health claims about plant-based diets (being “mostly vegan” is like being “a little pregnant”).

There has long been a bias against meat, especially red meat. It goes back to the ancient Greek thought of Galen and how it was adapted to Medieval society in being Christianized for purposes of maintaining social hierarchy and social control. This Galenic bias was carried forward in the Christian tradition and then modernized within nutrition studies through the surprisingly powerful influence of the Seventh Day Adventists who continue to fund a lot of nutritional studies to this day. This has had practical consequences. It has long been assumed, based on a theology of a sinful world, that eating animals would make us beastly. It’s similar to the ancient idea that eating the muscles or heart of a fallen warrior would make one strong or courageous. A similar logic was applied to plants, that they have inherent qualities that we can imbibe.

So, it has long been believed that plant foods are somehow healthier for both body and soul, somehow more spiritual, and so would bring humans closer to God or else closer to their divine natural state before the Fall of Man. That has been the moral concern of many Christians, from Medieval Catholics to modern Seventh Day Adventists. And in secularized form, it became internalized by mainstream nutrition studies and dietary guidelines. Part of the purpose of eating plants, according to Christianized Galenism, was that a strong libido was considered bad, and it was understood that a plant-based diet suppressed libido, which admittedly doesn’t sound like a sign of health, but their idea of ‘health’ was very different. There was also worry that, along with firing up the libido, meat would heat up the entire body and lead to a shorter lifespan. Pseudo-scientific explanations have been used to rationalize this theological doctrine, such as concerns about mTOR and IGF-1, although this requires contorting the science and dismissing other evidence.

The problem is that this simply became built into mainstream nutritional ideology, to such an extent that few questioned it until recently. This has led most researchers, nutritionists, dieticians, and other health experts to obsess over the nutrients in plants while overlooking the nutrients in animal foods. So, you’ll hear something along the lines of, “meat is not an important source of vitamin E and with the exception of liver, is not a particularly good source of fat-soluble vitamins” (Nutrients in Meat, from the Meat We Eat). Keep in mind that assertion comes from a project of the American Meat Science Association — not likely to be biased against meat. It’s sort of true, depending on how one defines meat. Following Galenic thought, the notion of meat is still associated with red meat. It is true that muscle meat, particularly lean muscle meat from beef, pork, and veal, doesn’t have much vitamin E compared to plant foods (M. Leonhardt et al, Vitamin E content of different animal products: influence of animal nutrition). This is why some vegetarians and even vegans see no contradiction or conflict, much less hypocrisy, in eating fish and fowl — culturally, these have for millennia been considered a separate category from meat.

Yet adequate amounts of vitamin E are found in many animal foods, whether or not we label them as ‘meat’: chicken, goose meat, fish, seafood, crayfish, butter, and cheese; and some vitamin E is also found in liver and eggs (Atli Arnarson, 20 Foods That Are High in Vitamin E). We have to be clear about what we mean by ‘meat’. On a meat-based diet, even to the degree of being carnivore, there are plentiful good sources of every essential nutrient, including vitamin E, and many nutrients that aren’t essential but are highly conducive to optimal health. Besides animal foods, there is no other source of such immense nutrient density and nutrient bioavailability. Plant foods don’t come close in comparison.

Also, as vitamin E is an antioxidant, it’s important to note that animal foods contain many other antioxidants that play a similar role in maintaining health, but animal-sourced antioxidants have been mostly ignored because they don’t fit the dominant plant-based paradigm. Plant foods lack these animal-sourced antioxidants. So why do so few talk about a deficiency in them for vegans and vegetarians? And why have researchers so rarely studied in depth the wide variety of nutrients in animal foods to determine their full health benefits? This is particularly odd when considering, as I already stated, every known essential nutrient can be found in animal foods but not in plant foods. Isn’t that an important detail? Why is there a collective silence among mainstream health experts?

Think about how plant antinutrients can block the absorption of nutrients, both in plant foods and animal foods, and so require even more nutrients to counteract this effect, which might simply further increase the antinutrient intake, unless one is careful in following the food selection and preparation advised by those like Steven Gundry (The Plant Paradox). Or think about how glucose competes with the antioxidant vitamin C, raising the risk of scurvy if vitamin C intake is not increased, and yet a low-carb diet with far lower intake of vitamin C is not linked to scurvy — maybe the reason ancient Vikings and Polynesians could remain healthy at sea for months, while modern sailors, once a high-carb diet was introduced, were plagued by scurvy (Sailors’ Rations, a High-Carb Diet). Similarly, a plant-based diet in general might require greater amounts of vitamin E: “Plant-based foods have higher concentrations of vitamin E. And for good reason. A plant-based diet requires additional protection from oxidation of PUFA which Vitamin E helps provide through its antioxidant properties. It’s still found in adequate supply in meat” (Kevin Stock, Vitamins and Minerals – Plants vs Animals).

What is adequate depends on the diet. A diet low in carbs, seed oils, and other plant foods may require fewer plant-based antioxidants, especially if this is countered by an increase of animal-based antioxidants. It is reminiscent of the fiber debate. Yes, fiber adds bulk that supposedly will increase regularity, ignoring the fact that the research is divided on this topic. No doubt bulking up your poop makes you have larger poops more often, but is that really a good thing? People on a low-residue carnivore diet more easily digest and absorb what they eat, and so they don’t have bulky poops — then again, they don’t usually have constipation either, not if they’re getting enough dietary fat. The main cause of constipation is plant foods. So, why are people advised to eat more plant foods in the hope of resolving an issue caused by plant foods? It’s absurd! We keep looking at problems in isolation, as we look at nutrients in isolation (Hubris of Nutritionism). This has failed us, as demonstrated by our present public health crisis.

Let me throw in a last thought about antioxidants. It’s like the fiber issue. People on plant-based diets have constipation issues and so eat more plant foods, in the form of fiber, trying to solve the problem plant foods cause, not realizing that constipation generally resolves itself by eliminating or limiting plant foods. So, in relation to antioxidants, we have to ask ourselves what it is about our diet in the first place that is causing all the oxidative stress. Plant foods do have antioxidants, but some plant foods also cause oxidative stress (e.g., seed oils). If we eliminate those plant foods, our oxidative stress goes down and our requirement for antioxidants lessens to that degree. The body already produces its own antioxidants and, combined with what comes from animal foods, we shouldn’t need such excess amounts of antioxidants. Besides, it’s not clear from studies that plant antioxidants are always beneficial to health. It would be better to eliminate the need for them in the first place. Shawn Baker explained this in terms of vitamin C (interview with Shan Hussain, The Carnivore Diet with Dr. Shawn Baker MD):

“The Carnivore diet is deficient in carbohydrates and essential vitamins like Vitamin C, how do we make up for that? When I wanted to do this I was curious about this as well. You will see a number of potential deficiencies around this diet. There is no role of fibre in this diet. With Vitamin C we know there are some transporters across different cell membranes. In a higher glucose environment, Vitamin C is competitively inhibited and therefore we see less absorption of Vitamin C. We also see that interestingly human red blood cells do have the capacity to actually recycle Vitamin C which is something that not many people are aware of. One of the major function[s] of Vitamin C is that it is an antioxidant. In low carbohydrate states our antioxidant systems, particularly things like glutathione, are [up]regulated. We may obviate some of the need of antioxidants of the Vitamin C by regulating around systems in a low carb diet. Also, Vitamin C is very important in the function of carnitine which is part of the fat cycle. When we are ingesting carnitine we have actual transporters in the gut which can take up full carnosine. It is a misconception that we can only take amino acids; [there are] a number of di and tripeptide transporters that are contained within our gut. The other function of Vitamin C is when we don’t have sufficient Vitamin C relative to our needs, we start to develop symptoms of scurvy, bleeding gum problems, teeth falling out, sores and cuts won’t heal. This is all due to the collagen synthesis. If we look at Vitamin C’s role in collagen synthesis, it helps to take proline and lysine [to] hydroxyproline and hydroxylysine. In meat-based diet, we are getting that in ample amount. Even a steak has 3% of its content as collagen. There are all kinds of compensatory mechanisms.”

I’ll end on an amusing note. Chris Kresser wrote about the carnivore diet (Everything You Need to Know about the Carnivore Diet and How It Can Affect Your Health). Although an advocate of low-carb diets and nutrient-dense animal foods, he is skeptical that carnivory will be healthy for most humans long-term. One worry is that there might be nutritional deficiencies, but the argument he makes is funny. He is basically saying that if all one eats is muscle meat, then key nutrients will get missed. Then he goes on to point out that these nutrients can be found in other animal foods, such as liver and dairy. So, his main concern about a carnivore diet is actually that people might not eat enough animal foods, or rather not enough of certain animal foods. So, make sure you eat lots of a wide variety of animal foods if going full carnivore and apparently even critics like Kresser agree you’ll be fine, at least nutritionally. The problem isn’t too much animal food but potentially too little. That made me smile.

Now to the whole point of this post. Below is a list of nutrients that are commonly deficient in those on plant-based diets, especially those on plant-exclusive diets (i.e., vegans). I won’t explain anything about these nutrients, as there is plenty of info online. But you can look to the linked articles below that cover the details.

  • Vitamin K2
  • Vitamin D3 (Cholecalciferol)
  • Vitamin A (Retinol)
  • Vitamin B12 (Cobalamin)
  • Vitamin B6 (Pyridoxine)
  • Vitamin B3 (Niacin)
  • Vitamin B2 (Riboflavin)
  • Calcium
  • Heme Iron
  • Zinc
  • Selenium
  • Iodine
  • Sulfur
  • DHA Omega-3 (Docosahexaenoic Acid)
  • EPA Omega-3 (Eicosapentaenoic Acid)
  • DPA Omega-3 (Docosapentaenoic Acid)
  • ARA Omega-6 (Arachidonic Acid)
  • SA Saturated Fat (Stearic Acid)
  • CLA (Conjugated Linoleic Acid)
  • Phytanic Acid
  • Phosphatidylserine, Phosphatidylcholine, and Other Phospholipids
  • Glutathione
  • SOD (Superoxide Dismutase)
  • CoQ10 (Coenzyme Q10)
  • Choline
  • Biotin
  • Cholesterol
  • Nucleotides (Nucleoproteins and Nucleic Acids)
  • Creatine
  • Taurine
  • Carnitine
  • Carnosine
  • Anserine (Derivative of Carnosine)
  • Beta-Alanine (Precursor to Carnosine)
  • HA (Hyaluronic Acid)
  • Complete Proteins
  • Collagen
  • Other Amino Acids, Essential and Conditionally Essential (Glycine, Methionine, Tryptophan, Lysine, Leucine, Cysteine, Proline, Tyrosine, Phenylalanine, Serine, Alanine, Threonine, Isoleucine, and Valine)

[Please note in the comments any other essential or semi-essential nutrients not on the above list.]

“This list doesn’t even include things like peptides including BPC-157, Thymosin alpha-1, LEAP-2, splenopentin, tuftsin, etc. which are known to occur naturally in animal foods and have beneficial effects in humans” (Paul Saladino). Other nutrients, most readily obtained from animal foods, are not just important for optimal health but truly and entirely essential: the amino acids methionine, threonine, tryptophan, isoleucine, leucine, lysine, valine, and phenylalanine.

Just for the sake of balance, I’ll also share a list of plant compounds that are problematic for many people — from Joe Cohen (20 Nutrients that Vegans & Vegetarians are Lacking):

  1. Lectins
  2. Amines
  3. Tannins
  4. Trypsin Inhibitors
  5. FODMAPS
  6. Salicylates
  7. Oxalates
  8. Sulfites, Benzoates, and MSG
  9. Non-protein amino acids
  10. Glycosides
  11. Alkaloids [includes solanine, chaconine]
  12. Triterpenes
  13. Lignins
  14. Saponins
  15. Phytic Acid [Also Called Phytate]
  16. Gluten
  17. Isoflavones

* * *

Are ‘vegetarians’ or ‘carnivores’ healthier?
Gundry’s Plant Paradox and Saladino’s Carnivory
Dr. Saladino on Plant and Animal Foods
True Vitamin A For Health And Happiness
Calcium: Nutrient Combination and Ratios
Vitamin D3 and Autophagy

The Vegetarian Myth: Food, Justice, and Sustainability
by Lierre Keith

Vegan Betrayal: Love, Lies, and Hunger in a Plants-Only World
by Mara J. Kahn

The Meat Fix: How a lifetime of healthy eating nearly killed me!
by John Nicholson

The Fat of the Land/Not By Bread Alone
by Vilhjalmur Stefansson

Sacred Cow: The Case for (Better) Meat: Why Well-Raised Meat Is Good for You and Good for the Planet
by Diana Rodgers and Robb Wolf

The Carnivore Code: Unlocking the Secrets to Optimal Health by Returning to Our Ancestral Diet
by Paul Saladino

Primal Body, Primal Mind: Beyond Paleo for Total Health and a Longer Life
by Nora Gedgaudas

Paleo Principles
by Sarah Ballantyne

The Queen of Fats: Why Omega-3s Were Removed from the Western Diet and What We Can Do to Replace Them
by Susan Allport

The Omega Principle: Seafood and the Quest for a Long Life and a Healthier Planet
by Paul Greenberg

The Omega-3 Effect: Everything You Need to Know About the Super Nutrient for Living Longer, Happier, and Healthier
by William Sears and James Sears

The Missing Wellness Factors: EPA and DHA: The Most Important Nutrients Since Vitamins?
by Jorn Dyerberg and Richard Passwater

Could It Be B12?: An Epidemic of Misdiagnoses
by Sally M. Pacholok and Jeffrey J. Stuart

What You Need to Know About Pernicious Anaemia and Vitamin B12 Deficiency
by Martyn Hooper

Living with Pernicious Anaemia and Vitamin B12 Deficiency
by Martyn Hooper

Pernicious Anaemia: The Forgotten Disease: The causes and consequences of vitamin B12 deficiency
by Martyn Hooper

Healing With Iodine: Your Missing Link To Better Health
by Mark Sircus

Iodine: Thyroid: The Hidden Chemical at the Center of Your Health and Well-being
by Jennifer Co

The Iodine Crisis: What You Don’t Know About Iodine Can Wreck Your Life
by Lynne Farrow

L-Carnitine and the Heart
by Stephen T. Sinatra and Jan Sinatra

Food Politics: How the Food Industry Influences Nutrition and Health
by Marion Nestle

Unsavory Truth: How Food Companies Skew the Science of What We Eat
by Marion Nestle

Formerly Known As Food: How the Industrial Food System Is Changing Our Minds, Bodies, and Culture
by Kristin Lawless

Death by Food Pyramid: How Shoddy Science, Sketchy Politics and Shady Special Interests Have Ruined Our Health
by Denise Minger

Nutrition in Crisis: Flawed Studies, Misleading Advice, and the Real Science of Human Metabolism
by Richard David Feinman

Nutritionism: The Science and Politics of Dietary Advice
by Gyorgy Scrinis

Measured Meals: Nutrition in America
by Jessica J. Mudry

(Although more about macronutrients, also see the work of Gary Taubes and Nina Teicholz. They add useful historical context about nutrition studies, dietary advice, and public health.)

20 Nutrients that Vegans & Vegetarians are Lacking
by Joe Cohen

8 Nutrients You May Be Missing If You’re Vegetarian or Vegan
by Tina Donvito

7 Nutrients That You Can’t Get from Plants
by Atli Arnarson

7 Supplements You Need on a Vegan Diet
by Alina Petre

The Top 5 Nutrient Deficiencies on a Plant Based Diet
by Kate Barrington

5 Brain Nutrients That You Can’t Get From Plants
by Kris Gunnars

Vitamin Supplements for Vegetarians
by Jeff Takacs

Health effects of vegan diets
by Winston J Craig

Nutritional Deficiencies and Essential Considerations for Every Vegan (An Evidence-Based Nutritional Perspective)
from Dai Manuel

Why You Should Think Twice About Vegetarian and Vegan Diets
by Chris Kresser

Three Big Reasons Why You Don’t Want to be a Vegetarian
by Alan Sears

How to Avoid Common Nutrient Deficiencies if You’re a Vegan
by Joseph Mercola

What is Glutathione and How Do I Get More of It?
by Mark Hyman

Could THIS Be the Hidden Factor Behind Obesity, Heart Disease, and Chronic Fatigue?
by Joseph Mercola

Vegetarianism produces subclinical malnutrition, hyperhomocysteinemia and atherogenesis
by Y. Ingenbleek and K. S. McCully

Vegan Diet is Sulfur Deficient and Heart Unhealthy
by Larry H. Bernstein

Heart of the Matter : Sulfur Deficits in Plant-Based Diets
by Kaayla Daniel

Copper-Zinc Imbalance: Unrecognized Consequence of Plant-Based Diets and a Contributor to Chronic Fatigue
by Laurie Warner

Vegan diets ‘risk lowering intake of nutrient critical for unborn babies’ brains’
by Richard Hartley-Parkinson

The Effects of a Mother’s Vegan Diet on Fetal Development
by Marc Choi

Vegan–vegetarian diets in pregnancy: danger or panacea? A systematic narrative review
by G. B. Piccoli

Is vegetarianism healthy for children?
by Nathan Cofnas

Clinical practice: vegetarian infant and child nutrition
by M. Van Winckel, S. Vande Velde, R. De Bruyne, and S. Van Biervliet

Dietary intake and nutritional status of vegetarian and omnivorous preschool children and their parents in Taiwan
by C. E. Yen, C. H. Yen, M. C. Huang, C. H. Cheng, and Y. C. Huang

Persistence of neurological damage induced by dietary vitamin B-12 deficiency in infancy
by Ursula von Schenck, Christine Bender-Götze, and Berthold Koletzko

Severe vitamin B12 deficiency in an exclusively breastfed 5-month-old Italian infant born to a mother receiving multivitamin supplementation during pregnancy
by S. Guez et al

Long-chain n-3 PUFA in vegetarian women: a metabolic perspective
by G. C. Burdge, S. Y. Tan, and C. J. Henry

Signs of impaired cognitive function in adolescents with marginal cobalamin status
by M. W. Louwman et al

Transient neonatal hypothyroidism due to a maternal vegan diet
by M. G. Shaikh, J. M. Anderson, S. K. Hall, M. A. Jackson

Veganism as a cause of iodine deficient hypothyroidism
by O. Yeliosof and L. A. Silverman

Do plant based diets deprive the brain of an essential nutrient?
by Ana Sandoiu

Suggested move to plant-based diets risks worsening brain health nutrient deficiency
from BMJ

Could we be overlooking a potential choline crisis in the United Kingdom?
by Emma Derbyshire

How a vegan diet could affect your intelligence
by Zaria Gorvett

Vitamins and Minerals – Plants vs Animals
by Kevin Stock

Comparing Glutathione in the Plasma of Vegetarian and Omnivore Populations
by Rachel Christine Manley

Vegan diets are adding to malnutrition in wealthy countries
by Chris Elliott, Chen Situ, and Claire McEvoy

What beneficial compounds are primarily found in animal products?
by Kamal Patel

The Brain Needs Animal Fat
by Georgia Ede

The Vegan Brain
by Georgia Ede

Meat, Organs, Bones and Skin
by Christopher Masterjohn

Vegetarianism and Nutrient Deficiencies
by Christopher Masterjohn

Adding milk, meat to diet dramatically improves nutrition for poor in Zambia
from Science Daily

Red meat plays vital role in diets, claims expert in fightback against veganism
by James Tapper

Nutritional Composition of Meat
by Rabia Shabir Ahmad, Ali Imran and Muhammad Bilal Hussain

Meat and meat products as functional food
by Maciej Ostaszewski

Meat: It’s More than Protein
from Paleo Leap

Conjugated Linoleic Acid: the Weight Loss Fat?
from Paleo Leap

Nutritional composition of red meat
by P. G. Williams

How Red Meat Can ‘Beef Up’ Your Nutrition
by David Hu

Endogenous antioxidants in fish
by Margrét Bragadóttir

Astaxanthin Benefits Better than Vitamin C?
by Rachael Link

Astaxanthin: The Most Powerful Antioxidant You’ve Never Heard Of
from XWERKS

Antioxidants Are Bullshit for the Same Reason Eggs Are Healthy
by Sam Westreich

We absolutely need fruits and vegetables to obtain optimal antioxidant status, right?
by Paul Saladino

Hen Egg as an Antioxidant Food Commodity: A Review
by Chamila Nimalaratne and Jianping Wu

Eggs’ antioxidant properties may help prevent heart disease and cancer, study suggests
from Science Daily

The Ultimate Superfood? Milk Offers Up a Glass Full of Antioxidants
by Lauren Milligan Newmark

Antioxidant properties of Milk and dairy products: a comprehensive review of the current knowledge
by Imran Taj Khan et al

Antioxidants in cheese may offset blood vessel damage
from Farm and Dairy

Identification of New Peptides from Fermented Milk Showing Antioxidant Properties: Mechanism of Action
by Federica Tonolo

Bioavailability of iron, zinc, and other trace minerals from vegetarian diets
by Janet R Hunt

Dietary iron intake and iron status of German female vegans: results of the German vegan study.
by A. Waldmann, J. W. Koschizke, C. Leitzmann, and A. Hahn

Mechanisms of heme iron absorption: Current questions and controversies
by Adrian R. West and Phillip S. Oates

Association between Haem and Non-Haem Iron Intake and Serum Ferritin in Healthy Young Women
by Isabel Young et al

Pork meat increases iron absorption from a 5-day fully controlled diet when compared to a vegetarian diet with similar vitamin C and phytic acid content.
by M. Bach Kristensen, O. Hels, C. Morberg, J. Marving, S. Bügel, and I. Tetens

Do you need fiber?
by Kevin Stock

Ketosis, Epigenetics, and the Brain

It’s long been understood that ketones, specifically beta-hydroxybutyrate (BHB), are a brain superfuel. It’s easy to explain the evolutionary reasons for this. Hunter-gatherers spent much more time in ketosis. This metabolic state tends to occur during periods of low food access. That happens in winter but also throughout the year, as hunter-gatherers tend toward a feast and fast pattern.

After a period of plenty, it might be a while until the next big kill. Some hunting expeditions could take days or weeks. They needed their brains working in top form for a successful hunt. Many hunter-gatherer tribes purposely fast on a regular basis as a demonstration of their toughness, to show that they can go long periods without food, a very important ability for hunters. Even tribal people living amongst food abundance will fast for no particular reason other than it’s part of their culture.

The Piraha, for example, can procure food easily and yet will go without, sometimes simply because they’d rather socialize around the village or are in the middle of communal dancing that can go on for days. They have better things to do than eat all the time. Besides, on a low-carb diet that is typical among hunter-gatherers, it takes little effort to fast. That is one of the benefits of ketosis, one’s appetite is naturally suppressed and so it effortlessly promotes caloric restriction.

Along with improving brain function, ketosis increases general health, probably including extending lifespan and certainly extending healthspan. Some of this could be explained by creating the conditions necessary for autophagy, although there are many other factors. An interesting example of this was shown in a mouse study.

The researchers exposed the rodents to influenza (E. L. Goldberg et al, Ketogenic diet activates protective γδ T cell responses against influenza virus infection). Most of the mice on a high-carb diet died, whereas most on the keto diet lived. In this case, it wasn’t the ketones themselves but other processes involved. Giving exogenous ketones as a supplement did not have the same effect as the body producing its own ketones. We typically think of ketosis only in terms of ketones, but obviously there is much more going on.

Still, in the case of neurocognitive functioning, the ketones themselves are key. It’s not only that they act as a superfuel but simultaneously alter epigenetic expression of specific genes related to memory. On the opposite side, research shows that feeding people sugar literally makes them dumber. Ketosis also decreases inflammation, including inflammation in the brain. Through multiple causal mechanisms, ketosis has been medically used as an effective treatment for numerous neurocognitive conditions: mood disorders, schizophrenia, autism, ADHD, Alzheimer’s, etc.

If ketosis is a biological indicator of food scarcity, why does the body expend extra energy precisely when energy is limited? This seems counterintuitive. Other species deal with food scarcity by shutting the body down and slowing metabolism, some going into full hibernation or semi-hibernation during winter. But humans do the opposite: food scarcity increases physiological activity. In fact, ketosis is actually inefficient, as it burns more energy than is needed, with the excess given off as heat.

Benjamin Bikman, an insulin researcher, speculates this is because ketosis often happens in winter. Hibernating creatures lower their body temperature, but humans don’t have this capacity. Neither do we have thick fur. Humans need large amounts of heat to survive harsh winters. In ketosis, everything goes into overdrive: metabolism, immune system, and brain. This alters the epigenome itself, and those changes can be passed on to following generations.

* * *

You Are What Your Mother and Father (and Grandmothers and Grandfathers) Ate
by Mark Sisson

Epigenetic Explanations For Why Cutting Sugar May Make You Feel Smarter
by Caitlin Aamodt

High Fat, Low Carb Diet Might Epigenetically Open Up DNA and Improve Mental Ability
by Bailey Kirkpatrick

Epigenetics May Provide Relief for Fragile X Syndrome and Intellectual Disorders
by Tim Barry

To Empathize is to Understand

What is empathy as a cognitive ability? And what is empathy as an expansion of identity, as part of awareness of self and other?

There is a basic level of empathy that appears to be common across numerous species. Tortoises, when seeing another on its back, will help flip it over. There are examples of animals helping or cooperating with those from an entirely different species. Such behavior has been repeatedly demonstrated in laboratories as well. These involve fairly advanced expressions of empathy. In some cases, one might interpret it as indicating at least rudimentary theory of mind, the understanding that others have their own experience, perspective, and motivations. But obviously human theory of mind can be much more complex.

One explanation of greater empathy has to do with identity. Empathy, in a way, is simply a matter of what is included within one’s personal experience (Do To Yourself As You Would Do For Others). To extend identity is to extend empathy to another individual or a group (or anything else that can be brought within the sphere of the self). For humans, this can mean learning to include one’s future self, to empathize with experiences one has not yet had, the person one has not yet become. The future self is fundamentally no different than another person.

Without cognitive empathy, affective empathy is limited to immediate experience. It’s the ability to feel what another feels. But lacking cognitive empathy, as happens in the most severe autism, theory of mind cannot be developed and so there is no way to identify, locate, and understand that feeling. One can only emotionally react, unable to differentiate one’s own emotion from that of another. In that case, there would be pure emotion, and yet no recognition of the other. Cognitive empathy is necessary to get beyond affective reactivity, which is not all that different from the biological reactivity of a slug.

It’s interesting that some species (primates, rats, dolphins, etc.) might have more cognitive empathy and theory of mind than some people at the extreme end of severe autism; it is not necessarily an issue of intelligence. On the other hand, those high-functioning on the autism spectrum, if intervention happens early enough, can be taught theory of mind, although it is challenging for them. This kind of empathy is considered a hallmark of humanity, a defining feature, and its impairment is what leads to problems of social behavior for those with autism spectrum disorder.

Someone entirely lacking in theory of mind would be extremely difficult to communicate and interact with beyond the most basic level, as is seen in the severest cases of autism and other extreme developmental conditions. Helen Keller asserts she had no conscious identity, no theory of her own mind or that of others, until she learned language.* Prior to her awakening, she was aggressive and violent in reacting to a world she couldn’t understand, articulate, or think about. That fits in with the speculations of Julian Jaynes. What he calls ‘consciousness’ is the addition of abstract thought by way of metaphorical language, as built upon concrete experience and raw affect. Keller discusses how her experience went from the concreteness of touch to the abstraction of language. In becoming aware of the world, she became aware of herself.

Without normal development of language, the human mind is crippled: “The “black silence” of the deaf, blind and mute is similar in many respects to the situation of acutely autistic children where there are associated difficulties with language and the children seem to lack what has been called “a theory of mind” ” (Robin Allott, Helen Keller: Language and Consciousness). Even so, there is more to empathy than language, and that might be true as well for some aspects or kinds of cognitive empathy. Language is not the only form of communication.

Rats are a great example for comparison with humans. We think of them as pests, as psychologically inferior. But anyone who has kept rats knows how intelligent and social they are. They are friendlier and more interactive than the typical cat. And research has shown how cognitively advanced they are in learning. Rats do have the typical empathy of concern for others. For example, they won’t hurt another rat in exchange for a reward and, given a choice, they would rather go hungry. But it goes beyond that.

It’s also shown that “rats are more likely and quicker to help a drowning rat when they themselves have experienced being drenched, suggesting that they understand how the drowning rat feels” (Kristin Andrews, Rats are us). And “rats who had been shocked themselves were less likely to allow other rats to be shocked, having been through the discomfort themselves.” They can also learn to play hide-and-seek, which necessitates taking on the perspective of others. As Ed Yong asks in The Game That Made Rats Jump for Joy, “In switching roles, for example, are they taking on the perspective of their human partners, showing what researchers call “theory of mind”?”

That is much more than mere affective empathy. This seems to involve active sympathy and genuine emotional understanding, that is to say cognitive empathy and theory of mind. If they are capable of both affective and cognitive empathy, however limited, and if Jaynesian consciousness partly consists of empathy imaginatively extended in space and time, then a case could be made that rats have more going on than simple perceptual awareness and biological reactivity. They are empathically and imaginatively engaging with others in the world around them. Does this mean they are creating and maintaining a mental model of others? Kristin Andrews details the extensive abilities of rats:

“We now know that rats don’t live merely in the present, but are capable of reliving memories of past experiences and mentally planning ahead the navigation route they will later follow. They reciprocally trade different kinds of goods with each other – and understand not only when they owe a favour to another rat, but also that the favour can be paid back in a different currency. When they make a wrong choice, they display something that appears very close to regret. Despite having brains that are much simpler than humans’, there are some learning tasks in which they’ll likely outperform you. Rats can be taught cognitively demanding skills, such as driving a vehicle to reach a desired goal, playing hide-and-seek with a human, and using the appropriate tool to access out-of-reach food.”

To imagine the future for purposes of thinking in advance and planning actions, that is quite advanced cognitive behavior. Julian Jaynes argued that was the purpose of humans developing a new kind of consciousness, as the imagined metaphorical space that is narratized allows for the consideration of alternatives, something he speculates was lacking in humans prior to the Axial Age when behavior supposedly was more formulaic and predetermined according to norms, idioms, etc. Yet rats can navigate a path they’ve never taken before with novel beginning and ending locations, which would require taking into account multiple options. What theoretically makes Jaynesian consciousness unique?

Jaynes argues that it’s the metaphorical inner space that is the special quality that created the conditions for the Axial Age and all that followed from it, the flourishing of complex innovations and inventions, the ever greater extremes of abstraction seen in philosophy, math and science. We have so strongly developed this post-bicameral mind that we barely can imagine anything else. But we know that other societies have very different kinds of mentalities, such as the extended and fluid minds of animistic cultures. What exactly is the difference?

Australian Aborigines give a hint of something between the two kinds of mind. In some ways, their mnemonic systems represent more complex cognitive ability than we are capable of with our Jaynesian consciousness. Instead of an imagined inner space, the Songlines are vast systems of experience and knowledge, culture and identity overlaid upon immense landscapes. These mappings of externalized cognitive space can guide the individual across distant territories they have never seen before and help them identify and use the materials (plants, stones, etc.) at a location no one in their tribe has visited for generations. Does this externalized mind have less potential for advanced abilities? Upon Western contact, Aborigines had farming and ranching, kept crop surpluses in granaries, and practiced water and land management.

It’s not hard to imagine civilization having developed along entirely different lines based on divergent mentalities and worldviews. Our modern egoic consciousness was not an inevitability, and it is likely far from optimal functioning. We might already be hitting a dead end with our present interiorized mind-space. Maybe it’s our lack of empathy in understanding the minds of other humans and other species that is an in-built limitation of the post-bicameral world of Jaynesian consciousness. And so maybe we have much to learn from entirely other perspectives and experiences, even from rats.

* * *

* Helen Keller, from Light in My Darkness:

I had no concepts whatever of nature or mind or death or God. I literally thought with my body. Without a single exception my memories of that time are tactile. . . . But there is not one spark of emotion or rational thought in these distinct yet corporeal memories. I was like an unconscious clod of earth. There was nothing in me except the instinct to eat and drink and sleep. My days were a blank without past, present, or future, without hope or anticipation, without interest or joy. Then suddenly, I knew not how or where or when, my brain felt the impact of another mind, and I awoke to language, to knowledge, to love, to the usual concepts of nature, good, and evil. I was actually lifted from nothingness to human life.

And from The Story of My Life:

As the cool stream gushed over one hand she spelled into the other the word water, first slowly, then rapidly. I stood still, my whole attention fixed upon the motions of her fingers. Suddenly I felt a misty consciousness as of something forgotten—a thrill of returning thought; and somehow the mystery of language was revealed to me. I knew then that ‘w-a-t-e-r’ meant the wonderful cool something that was flowing over my hand. That living word awakened my soul, gave it light, hope, joy, set it free! There were barriers still, it is true, but barriers that could in time be swept away.

And from The World I Live In:

Before my teacher came to me, I did not know that I am. I lived in a world that was a no-world. I cannot hope to describe adequately that unconscious, yet conscious time of nothingness. I did not know that I knew aught, or that I lived or acted or desired. I had neither will nor intellect. I was carried along to objects and acts by a certain blind natural impetus. I had a mind which caused me to feel anger, satisfaction, desire. These two facts led those about me to suppose that I willed and thought. I can remember all this, not because I knew that it was so, but because I have tactual memory. It enables me to remember that I never contracted my forehead in the act of thinking. I never viewed anything beforehand or chose it. I also recall tactually the fact that never in a start of the body or a heart-beat did I feel that I loved or cared for anything. My inner life, then, was a blank without past, present, or future, without hope or anticipation, without wonder or joy or faith. […]

Since I had no power of thought, I did not compare one mental state with another. So I was not conscious of any change or process going on in my brain when my teacher began to instruct me. I merely felt keen delight in obtaining more easily what I wanted by means of the finger motions she taught me. I thought only of objects, and only objects I wanted. It was the turning of the freezer on a larger scale. When I learned the meaning of “I” and “me” and found that I was something, I began to think. Then consciousness first existed for me. Thus it was not the sense of touch that brought me knowledge. It was the awakening of my soul that first rendered my senses their value, their cognizance of objects, names, qualities, and properties. Thought made me conscious of love, joy, and all the emotions. I was eager to know, then to understand, afterward to reflect on what I knew and understood, and the blind impetus, which had before driven me hither and thither at the dictates of my sensations, vanished forever. […]

I cannot represent more clearly than any one else the gradual and subtle changes from first impressions to abstract ideas. But I know that my physical ideas, that is, ideas derived from material objects, appear to me first an idea similar to those of touch. Instantly they pass into intellectual meanings. Afterward the meaning finds expression in what is called “inner speech.”  […]

As my experiences broadened and deepened, the indeterminate, poetic feelings of childhood began to fix themselves in definite thoughts. Nature—the world I could touch—was folded and filled with myself. I am inclined to believe those philosophers who declare that we know nothing but our own feelings and ideas. With a little ingenious reasoning one may see in the material world simply a mirror, an image of permanent mental sensations. In either sphere self-knowledge is the condition and the limit of our consciousness. That is why, perhaps, many people know so little about what is beyond their short range of experience. They look within themselves—and find nothing! Therefore they conclude that there is nothing outside themselves, either.

However that may be, I came later to look for an image of my emotions and sensations in others. I had to learn the outward signs of inward feelings. The start of fear, the suppressed, controlled tensity of pain, the beat of happy muscles in others, had to be perceived and compared with my own experiences before I could trace them back to the intangible soul of another. Groping, uncertain, I at last found my identity, and after seeing my thoughts and feelings repeated in others, I gradually constructed my world of men and of God. As I read and study, I find that this is what the rest of the race has done. Man looks within himself and in time finds the measure and the meaning of the universe.

* * *

As an example of how language relates to emotions:

The ‘untranslatable’ emotions you never knew you had
by David Robson

But studying these terms will not just be of scientific interest; Lomas suspects that familiarising ourselves with the words might actually change the way we feel ourselves, by drawing our attention to fleeting sensations we had long ignored.

“In our stream of consciousness – that wash of different sensations, feelings and emotions – there’s so much to process that a lot passes us by,” Lomas says. “The feelings we have learned to recognise and label are the ones we notice – but there’s a lot more that we may not be aware of. And so I think if we are given these new words, they can help us articulate whole areas of experience we’ve only dimly noticed.”

As evidence, Lomas points to the work of Lisa Feldman Barrett at Northeastern University, who has shown that our abilities to identify and label our emotions can have far-reaching effects.

Her research was inspired by the observation that certain people use different emotion words interchangeably, while others are highly precise in their descriptions. “Some people use words like anxious, afraid, angry, disgusted to refer to a general affective state of feeling bad,” she explains. “For them, they are synonyms, whereas for other people they are distinctive feelings with distinctive actions associated with them.”

This is called “emotion granularity” and she usually measures this by asking the participants to rate their feelings on each day over the period of a few weeks, before she calculates the variation and nuances within their reports: whether the same old terms always coincide, for instance.

Importantly, she has found that this then determines how well we cope with life. If you are better able to pin down whether you are feeling despair or anxiety, for instance, you might be better able to decide how to remedy those feelings: whether to talk to a friend, or watch a funny film. Or being able to identify your hope in the face of disappointment might help you to look for new solutions to your problem.

In this way, emotion vocabulary is a bit like a directory, allowing you to call up a greater number of strategies to cope with life. Sure enough, people who score highly on emotion granularity are better able to recover more quickly from stress and are less likely to drink alcohol as a way of recovering from bad news. It can even improve your academic success. Marc Brackett at Yale University has found that teaching 10 and 11-year-old children a richer emotional vocabulary improved their end-of-year grades, and promoted better behaviour in the classroom. “The more granular our experience of emotion is, the more capable we are to make sense of our inner lives,” he says.

Both Brackett and Barrett agree that Lomas’s “positive lexicography” could be a good prompt to start identifying the subtler contours of our emotional landscape. “I think it is useful – you can think of the words and the concepts they are associated with as tools for living,” says Barrett. They might even inspire us to try new experiences, or appreciate old ones in a new light.

* * *

And related to all of this is hypocognition, overlapping with linguistic relativity — in how language and concepts determine our experience, identity, and sense of reality — constraining and framing and predetermining what we are even capable of perceiving, thinking about, and expressing:

Hypocognition is a censorship tool that mutes what we can feel
by Kaidi Wu

It is a strange feeling, stumbling upon an experience that we wish we had the apt words to describe, a precise language to capture. When we don’t, we are in a state of hypocognition, which means we lack the linguistic or cognitive representation of a concept to describe ideas or interpret experiences. The term was introduced to behavioural science by the American anthropologist Robert Levy, who in 1973 documented a peculiar observation: Tahitians expressed no grief when they suffered the loss of a loved one. They fell sick. They sensed strangeness. Yet, they could not articulate grief, because they had no concept of grief in the first place. Tahitians, in their reckoning of love and loss, and their wrestling with death and darkness, suffered not from grief but a hypocognition of grief. […]

But the darkest form of hypocognition is one born out of motivated, purposeful intentions. A frequently overlooked part of Levy’s treatise on Tahitians is why they suffered from a hypocognition of grief. As it turns out, Tahitians did have a private inkling of grief. However, the community deliberately kept the public knowledge of the emotion hypocognitive to suppress its expression. Hypocognition was used as a form of social control, a wily tactic to expressly dispel unwanted concepts by never elaborating on them. After all, how can you feel something that doesn’t exist in the first place?

Intentional hypocognition can serve as a powerful means of information control. In 2010, the Chinese rebel writer Han Han told CNN that any of his writings containing the words ‘government’ or ‘communist’ would be censored by the Chinese internet police. Ironically, these censorship efforts also muffled an abundance of praise from pro-leadership blogs. An effusive commendation such as ‘Long live the government!’ would be censored too, for the mere mention of ‘government’.

A closer look reveals the furtive workings of hypocognition. Rather than rebuking negative remarks and rewarding praises, the government blocks access to any related discussion altogether, rendering any conceptual understanding of politically sensitive information impoverished in the public consciousness. ‘They don’t want people discussing events. They simply pretend nothing happened… That’s their goal,’ Han Han said. Regulating what is said is more difficult than ensuring nothing is said. The peril of silence is not a suffocation of ideas. It is to engender a state of blithe apathy in which no idea is formed.

Do To Yourself As You Would Do For Others

“…our impulse control is less based on an order from our executive command center, or frontal cortex, and more correlated with the empathic part of our brain. In other words, when we exercise self-control, we take on the perspective of our future self and empathize with that self’s perspectives, feelings, and motivations.”
~ Alexander Soutschek

Self-control is rooted in self-awareness. Julian Jaynes and Brian McVeigh, in one of their talks, brought up the idea that “mind space” has increased over time: “The more things we think about, the more distinctions we make in our consciousness between A and B, and so on, the more mind-space there is” (Discussions with Julian Jaynes, ed. by Brian J. McVeigh, p. 40). The first expansion was the creation of introspective consciousness itself. Narratization allowed that consciousness to also extend across time, to imagine possibilities, play out scenarios, and consider consequences. Empathy, as we experience it, might be a side effect of this, as consciousness includes more and more within it, including empathy with our imagined future self. So, think of self-control as being kind to yourself, to your full temporal self, not only your immediate self.

This would relate to the suggestion that humans learn theory of mind, the basis of cognitive empathy, first by observing others and only later apply it to ourselves. That is to say, the first expansion of mental space, as consciousness takes root, happens within relationship to others. It’s in realizing that there might be inner experience within someone else that we come to claim inner space in our own experience. So, our very ability to understand ourselves is dependent on empathy with others. This was a central purpose of the religions that arose in the Axial Age, the traditions that continue into the modern world* (Tahere Salehi, The Effect of Training Self-Control and Empathy According to Spirituality on Self-Control and Empathy of Preschool Female Students in Shiraz City). The prophets that emerged during that era taught love, compassion, and introspection, not only as an otherworldly moral dictum but also as a way of maintaining group coherence and the common good. The breakdown of what Jaynes called the bicameral mind was traumatic, and a new empathic mind was needed to replace it, if only to maintain social order.

Social order has become a self-conscious obsession ever since, as Jaynesian consciousness in its tendency toward rigidity has inherent weaknesses. Social disconnection is a crippling of the mind because the human psyche is inherently social. Imagining our future selves is a relationship with a more expansive sense of self. It’s the same mechanism as relating to any other person. This goes back to Johann Hari’s idea, based on Bruce K. Alexander’s rat park research, that the addict is the ultimate individual. In this context, this ultimate individual lacking self-control is not only disconnected from other people but also disconnected from themselves. Addiction is isolating and isolation promotes addiction. Based on this understanding, I’ve proposed that egoic consciousness is inherently addictive and that post-axial society is dependent on addiction for social control.

But this psychological pattern is seen far beyond addiction. It fits our personal experience of self. When we were severely depressed, we couldn’t imagine or care about the future. This definitely inhibited self-control and led to more impulsive behavior, a present-oriented psychological survival mode. Then again, the only reason self-control is useful at all is because, during and following the Axial Age, humans increasingly lost the capacity of being part of a communal identity (extended mind/self), and hence a communal empathy, that had created the conditions of communal control, the externally perceived commands of archaic authorization through voice-hearing. Such communal identity sounds strange or unappealing to the modern mind. In denying our social nature, this casts the shadow of authoritarianism, an oppressive and often violent enforcement of top-down control.

By the way, this isn’t merely about psychology. Lead toxicity causes higher rates of impulsivity and aggression. This is not personal moral failure but brain damage from poisoning. Sure, teaching brain-damaged kids and adults to have more empathy might help them overcome their disability. But if we are to develop an empathic society, we should have enough empathy not to wantonly harm the brains of others with lead toxicity and other causes of stunted development (malnutrition, stress, ACEs, etc.), just because the victims are poor or minorities and can’t fight back. Maybe we need to first teach politicians and business leaders basic empathy, overcoming the present dominance of psychopathic traits, so that they could learn self-control in not harming others.

The part of the brain involving cognitive empathy and theory of mind is generally involved with selflessness and pro-social behavior. To stick with brain development and neurocognitive functioning, let’s look at diet. Weston A. Price, in studying traditional populations that maintained healthy diets, observed what he called moral health in that people seemed kinder, more helpful, and happier — they got along well. A strong social fabric and culture of trust are not abstractions but are built into general measures of health; in the case of Price’s work, this had to do with nutrient-dense animal foods containing fat-soluble vitamins. As the standard American diet has worsened, so has mental health. That, oddly, is a reason for hope: if diet contributes to the problem, improving diet could improve mental health. In an early study on the ketogenic diet as applied to childhood diabetes, the researchers made a side observation that not only did the diabetes symptoms improve but so did behavior. I’ve theorized about how a high-carb diet might be one of the factors that sustains the addictive and egoic self.

Narrow rigidity of the mind, as seen in the extremes of egoic consciousness, has come to be accepted as a social norm and even a social ideal. It is the social Darwinian worldview that has contributed to the rise of both competitive capitalism and the Dark Triad (psychopathy, narcissism, and Machiavellianism), and unsurprisingly it has led to a society that lacks awareness and appreciation of the harm caused to future generations (Scott Barry Kaufman, The Dark Triad and Impulsivity). Rather than being normalized, maybe this dysfunction should be seen as a sickness, not only a soul sickness but a literal sickness of the body-mind that can be scientifically observed and measured, not to mention medically and socially treated. We need to thin the boundaries of the mind so as to expand our sense of self. Research shows that those with thinner boundaries not only have more sense of identification with their future selves but also with their past selves, maintaining a connection to what it felt like to be a child. We need to care for ourselves and others in the way we would protect a child.

* * *

* In their article “Alone and aggressive”, A. William Crescioni and Roy F. Baumeister included the loss of meaning, which may be associated with the loss of empathy, specifically in understanding the meaning of others (e.g., the intention ‘behind’ words, gestures, and actions). Meaning traditionally has been the purview of religion. And I’d suggest it is no coincidence that the obsession with meaning arose in the Axial Age, right when words were invented for ‘religion’ as a formal institution separate from the rest of society. As Julian Jaynes argues, this was probably in response to the sense of nostalgia and longing that followed the silence of the gods, spirits, and ancestors.

A different kind of social connection had to be taught, but this post-bicameral culture wasn’t and still isn’t as effective in re-creating the strong social bonds of archaic humanity. Periods of moral crisis in fear of societal breakdown have repeated ever since, like a wound that was never healed. I’ve previously written about social rejection and aggressive behavior in relation to this (12 Rules for Potential School Shooters) — about school shooters, I explained:

Whatever they identify or don’t identify as, many and maybe most school shooters were raised Christian and one wonders if that plays a role in their often expressing a loss of meaning, an existential crisis, etc. Birgit Pfeifer and Ruard R. Ganzevoort focus on the religious-like concerns that obsess so many school shooters and note that many of them had religious backgrounds:

“Traditionally, religion offers answers to existential concerns. Interestingly, school shootings have occurred more frequently in areas with a strong conservative religious population (Arcus 2002). Michael Carneal (Heath High School shooting, 1997, Kentucky) came from a family of devoted members of the Lutheran Church. Mitchell Johnson (Westside Middle School shooting, 1998, Arkansas) sang in the Central Baptist Church youth choir (Newman et al. 2004). Dylan Klebold (Columbine shooting, 1999, Colorado) attended confirmation classes in accordance with Lutheran tradition. However, not all school shooters have a Christian background. Some of them declare themselves atheists…” (The Implicit Religion of School Shootings).

Princeton sociologist Katherine Newman, in studying school shootings, has noted that, “School rampage shootings tend to happen in small, isolated or rural communities. There isn’t a very direct connection between where violence typically happens, especially gun violence in the United States, and where rampage shootings happen” (Common traits of all school shooters in the U.S. since 1970).

It is quite significant that these American mass atrocities are concentrated in “small, isolated or rural communities” that are “frequently in areas with a strong conservative religious population”. That might more precisely indicate who these school shooters are and what they are reacting to. Also, one might note that rural areas in general, and the South specifically, have high rates of gun-related deaths, although many of them are listed as ‘accidental’, which is to say most rural shootings involve people who know each other; this is also true of school shootings.

* * *

Brain stimulation reveals crucial role of overcoming self-centeredness in self-control
by Alexander Soutschek, Christian C. Ruff, Tina Strombach, Tobias Kalenscher and Philippe N. Tobler

Empathic Self-Control
by David Shoemaker

People with a high degree of self-control typically enjoy better interpersonal relationships, greater social adjustment, and more happiness than those with a low degree of self-control. They also tend to have a high degree of empathy. Further, those with low self-control also tend to have low empathy. But what possible connection could there be between self-control and empathy, given that how one regulates oneself seems to have no bearing on how one views others? Nevertheless, this paper aims to argue for a very tight relation between self-control and empathy, namely, that empathy is in fact one type of self-control. The argument proceeds by exploring two familiar types of self-control, self-control over actions and attitudes, the objects for which we are also responsible. Call the former volitional self-control and the latter rational self-control. But we also seem to be responsible for—and have a certain type of control and self-control over—a range of perceptual states, namely, those in which we come to see from another person’s perspective how she views her valuable ends and what her emotional responses are to their thwarting or flourishing. This type of empathic self-control is a previously-unexplored feature of our interpersonal lives. In addition, once we see that the type of empathy exercised is also exercised when casting ourselves into the shoes of our future selves, we will realize how intra-personal empathy better enables both volitional and rational self-control.

Science Says When Self-Control Is Hard, Try Empathizing With Your Future Self
by Lindsay Shaffer

Soutschek’s study also reveals what happens when we fail to exercise the empathic part of our brain. When Soutschek interrupted the empathic center of the brain in 43 study volunteers, they were more likely to take a small amount of cash immediately over a larger amount in the future. They were also less inclined to share the money with a partner. Soutschek’s study showed that the more people are stuck inside their own perspective, even just from having the empathic part of their brain disrupted, the more likely they are to behave selfishly and impulsively.

Self-Control Is Just Empathy With Your Future Self
by Ed Yong

This tells us that impulsivity and selfishness are just two halves of the same coin, as are their opposites restraint and empathy. Perhaps this is why people who show dark traits like psychopathy and sadism score low on empathy but high on impulsivity. Perhaps it’s why impulsivity correlates with slips among recovering addicts, while empathy correlates with longer bouts of abstinence. These qualities represent our successes and failures at escaping our own egocentric bubbles, and understanding the lives of others—even when those others wear our own older faces.

New Studies in Self Control: Treat Yourself Like You’d Treat Others
from Peak

A new study recently shifted the focus to a different mechanism of self control. Alexander Soutschek and colleagues from the University of Zurich believe self-control may be related to our ability to evaluate our future wants and needs.

The scientists suggest that this takes place in an area of the brain called the rTPJ, which has long been linked to selflessness and empathy for others. It’s an important part of our ability to “take perspectives” and helps us step into the shoes of a friend.

The scientists hypothesized that perhaps the rTPJ treats our “future self” the same way it treats any other person. If it helps us step into our friend’s shoes, maybe we can do the same thing for ourselves. For example, if we’re deciding whether to indulge in another pint of beer at a bar, maybe our ability to hold off is related to our ability to imagine tomorrow morning’s hangover. As science writer Ed Yong explains, “Think of self-control as a kind of temporal selflessness. It’s Present You taking a hit to help out Future You.”

Empathy for Your Future Self
by Reed Rawlings

Further Research on the TPJ

The results of Soutschek’s team were similar to past work on empathy, the future self, and the TPJ. It’s believed a better-connected rTPJ increases the likelihood of prosocial behaviors, which relates to skills of executive function. Individuals who exhibit lower empathy score higher for impulsivity – the opposite of self-control.

Keeping our future selves in mind may even keep our savings in check. In this research, Stanford University researchers tested “future self-continuity”. They wanted to explore how individuals related to their future self. Participants were asked to identify how they felt about the overlap between their current and future selves, using Venn diagrams of varying overlap for the exercise.

If they saw themselves as separate, they were more likely to choose immediate rewards. A greater overlap increased the likelihood of selecting delayed rewards. In their final study, they assessed individuals from the San Francisco Bay area. The researchers found a correlation between wealth and an overlap between selves.

While the above research is promising, it doesn’t paint a full picture. Empathy seems useful, but making a sacrifice for our future self requires that we understand the reason behind it. It’s the sacrifice that is especially crucial – positive gains demand negative trade-offs.

That’s where altruism, our willingness to give to others, comes in.

Why Do We Sacrifice?

Research from the University of Zurich examined some of altruism’s driving factors. Their work came up with two correlations. First, the larger your rTPJ, the more likely you are to behave altruistically. Second, concerns about fairness affect how we give.

In this experiment, individuals were more generous if their choice would decrease inequality. When inequality would increase, participants were less likely to give.

This is an understandable human maxim. We have little reason to give to an individual who has more than we do. It feels completely unfair to do so. However, we’re raised to believe that helping those in need is objectively good. Helping ourselves should fall under the same belief.

Empathy and altruism, when focused on our own well-being, are intimately linked. To give selflessly, we need to have a genuine concern for another’s well-being. In this case, the ‘other’ is our future self. Thankfully, with a bit of reflection, each of us can gain a unique insight into our own lives.

Alone and aggressive: Social exclusion impairs self-control and empathy and increases hostile cognition and aggression.
by A. William Crescioni and Roy F. Baumeister
from Bullying, Rejection, and Peer Victimization ed. by Monica J. Harris
pp. 260-271 (full text)

Social Rejection and Emotional Numbing

Initial studies provided solid evidence for a causal relationship between rejection and aggression. The mechanism driving this relationship remained unclear, however. Emotional distress was perhaps the most plausible mediator. Anxiety has been shown to play a role in both social rejection (Baumeister & Tice, 1990) and ostracism (Williams et al., 2000). Emotional distress, however, was not present in these experiments by Twenge et al. (2001). Only one significant mood effect was found, and even this effect deviated from expectations. The sole difference in mood between rejected and accepted participants was a slight decrease in positive affect. Rejected participants did not show any increase in negative affect; rather, they showed a flattening of affect, in particular a decrease in positive affect. This mood difference did not constitute a mediator of the link between rejection and aggression. It did, however, point toward a new line of thinking. It was possible that rejection would lead to emotional numbing rather than causing emotional distress. The flattening of affect seen in the previous set of studies would be consistent with a state of cognitive deconstruction. This state is characterized by an absence of emotion, an altered sense of time, a fixation on the present, a lack of meaningful thought, and a general sense of lethargy (Baumeister, 1990). […]

Rejection and Self-Regulation

Although the emotional numbness and decrease in empathy experienced by rejected individuals play an important role in the link between social rejection and aggression, these effects do not constitute a complete explanation of why rejection leads to aggression. The diminished prosocial motivations experienced by those lacking in empathy can open the door to aggressive behavior, but having less of a desire to do good and having more of a desire to do harm are not necessarily equivalent. A loss of empathy, paired with the numbing effects of rejection, could lead individuals to shy away from those who had rejected them rather than lashing out. Emotional numbness, however, is not the only consequence of social rejection.

In addition to its emotional consequences, social rejection has adverse effects on a variety of cognitive abilities. Social rejection has been shown to decrease intelligent (Baumeister, Twenge, & Nuss, 2002) and meaningful thought (Twenge et al., 2002). But another category of cognitive response is self-regulation. Studies have demonstrated that self-regulation depends upon a finite resource and that acts of self-regulation can impair subsequent attempts to exercise self-control (Baumeister, Bratslavsky, Muraven, & Tice, 1998). Self-regulation has been shown to be an important tool for controlling aggressive impulses. Stucke and Baumeister (2006) found that targets whose ability to self-regulate had been depleted were more likely to respond aggressively to insulting provocation. DeWall, Baumeister, Stillman, and Galliot (2007) found that diminished self-regulatory resources led to an increase in aggression only in response to provocation; unprovoked participants showed no increase in aggressive behavior. Recall that in earlier work (Twenge et al., 2002) rejected individuals became more aggressive only when the target of their aggression was perceived as having insulted or provoked them. This aggression could have been the result of the diminished ability of rejected participants to regulate their aggressive urges. […]

These results clearly demonstrate that social rejection has a detrimental effect on self-regulation, but they do not explain why this is so and, indeed, the decrement in self-regulation would appear to be counterproductive for rejected individuals. Gaining social acceptance often involves regulating impulses in order to create positive impressions on others (Vohs, Baumeister, & Ciarocco, 2005). Rejected individuals should therefore show an increase in self-regulatory effort if they wish to create new connections or prevent further rejection. The observed drop in self-regulation therefore seems maladaptive. The explanation for this finding lies in rejection’s effect on self-awareness.

Self-awareness is an important prerequisite of conscious self-control (Carver & Scheier, 1981). Twenge et al. (2002) found that, when given the option, participants who had experienced rejection earlier in the study were more likely to sit facing away from rather than toward a mirror. Having participants face a mirror is a common technique for inducing self-awareness (Carver & Scheier, 1981), so participants’ unwillingness to do so following rejection provides evidence of a desire to avoid self-awareness. A drop in self-awareness is part of the suite of effects that comprises a state of cognitive deconstruction. Just as emotional numbness protects rejected individuals from the emotional distress of rejection, a drop in self-awareness would shield against awareness of personal flaws and shortcomings that could have led to that rejection. The benefit of this self-ignorance is that further distress over one’s inadequacies is mitigated. Unfortunately, this protection carries the cost of decreased self-regulation. Because self-regulation is important for positive self-presentation (Vohs et al., 2005), this drop in self-awareness could ironically lead to further rejection. […]

These data suggest that social rejection does not decrease the absolute ability of victims to self-regulate but rather decreases their willingness to exert the effort necessary to do so. Increased lethargy, another aspect of cognitive deconstruction, is consistent with this decrease in self-regulatory effort. Twenge et al. (2002) found that social rejection led participants to give shorter and less detailed explanations of proverbs. Because fully explaining the proverbs would require an effortful response, this shortening and simplification of responses is evidence of increased lethargy amongst rejected participants. This lethargy is not binding, however. When given sufficient incentive, rejected participants were able to match the self-regulatory performance of participants in other conditions. Inducing self-awareness also allowed rejected individuals to self-regulate as effectively as other participants. In the absence of such stimulation, however, rejected individuals showed a decrement in self-regulatory ability that constitutes an important contribution to explaining the link between rejection and aggression. […]

Rejection and Meaningfulness

Twenge et al. (2002) found that social rejection led to a decrease in meaningful thought among participants, as well as an increased likelihood to endorse the statement, “Life is meaningless.” Williams (2002) has also suggested that social rejection ought to be associated with a perception of decreased meaning in life. Given the fundamental nature of the need to belong, it makes sense that defining life as meaningful would be at least in part contingent on the fulfillment of social needs. A recent line of work has looked explicitly at the effect of social rejection on the perception of meaning in life. Perceiving meaning in life has been shown to have an inverse relationship with hostility, aggression, and antisocial attitude (Mascaro, Morey, & Rosen, 2004). As such, any decrease in meaning associated with social rejection would constitute an important feature of the explanation of the aggressive behavior of rejected individuals.

The God of the Left Hemisphere:
Blake, Bolte Taylor and the Myth of Creation
by Roderick Tweedy

The left hemisphere is competitive… the will to power… is the agenda of the left hemisphere. It arose not to communicate with the world but to manipulate it. This inability to communicate or co-operate poses great difficulties for any project of reintegration or union. Its tendency would be to feed off the right hemisphere, to simply use and gain power over it too. Left hemisphere superiority is based, not on a leap forward by the left hemisphere, but on a ‘deliberate’ handicapping of the right. There is perhaps as much chance of persuading the head of a multinational to stop pursuing an agenda of self-interest and ruthless manipulation as there is of persuading the Urizenic program of the brain which controls him of “resubmitting” itself to the right hemisphere’s values and awareness.

The story of the Western world being one of increasing left-hemispheric domination, we would not expect insight to be the key note. Instead we would expect a sort of insouciant optimism, the sleepwalker whistling a happy tune as he ambles towards the abyss.

The left, rational, brain, it might be safe to conclude, has no idea how serious the problem is, that is to say, how psychopathic it has become. Of course, it doesn’t care that it doesn’t care. “The idiot Reasoner laughs at the Man of Imagination/And from laughter proceeds to murder by undervaluing calumny”, noted Blake in a comment that is only remarkable for the fact that it has taken two hundred years to understand.

The apparently “conscious” rational self, the driving program and personality of the left brain, turns out to be deeply unconscious, a pathological sleepwalker blithely poisoning its own environment whilst tenaciously clinging onto the delusion of its own rightness. This unfortunate mixture, of arrogance and ignorance, defines contemporary psychology. The left hemisphere not only cannot see that there is a problem, it cannot see that it is itself the problem.

Carl Jung’s Myth of the West

We’ve been reading Catafalque, Peter Kingsley’s most recent take on the Presocratics, this time explored through the life and work of Carl Jung. It is a satisfying read and gives one a sense of the depth that goes missing in many other Jungian views.

However, there was one thing that bothered me. Kingsley kept insisting on the uniqueness of the West, that Westerners must focus on their own culture instead of looking to the East or elsewhere. For a scholar of the ancient world, this seems simplistic and naive. The distinction between East and West, as we now know it, is not one ancient people would have made. The Greeks were more concerned with differentiating themselves from Barbarians, including the tribal people of Europe who were to the west and north of their own lands.

Those Presocratics never thought of themselves as Westerners, except in a relative sense in talking about those to the east of them, but certainly not as a monolithic identity. In fact, they were part of a syncretistic tradition that was heavily influenced by the far and near East, often by way of Egypt. Some early Greek thinkers gave credit to African-ruled Egypt as the original source of great art and philosophy. This would be more fully embraced later on in Hellenism. Greek medicine, for example, may have been shaped by Eastern teachings.

We know that many Greeks traveled East, and many Easterners traveled to the Greek and Greco-Roman world. This included Buddhists and Hindus. This was true into the period of the Roman Empire, when supposedly there was a Buddhist temple on the Sea of Galilee. The North African church father Augustine was originally a Manichaean before he converted to Christianity, and his early faith was an amalgamation of a Judaic baptismal cult, Zoroastrianism, and Buddhism. Besides, the Greeks themselves were a wandering people who originated from somewhere else, and throughout their history they kept wandering about.

In following Jung’s own cultural defensiveness, Kingsley argues that we Westerners have to look to our own sacred origins and that there is a danger in doing otherwise. But Kingsley is an American, from a culture of a thousand influences. And Jung was a northern European. Like most other supposed ‘Westerners’, neither probably had any ancestral roots in the ancient people of Greece or the Greco-Roman Gnostics that Jung and Kingsley see as the heirs of the Presocratics.

The Gnostics were essentially the original Christians, having formed out of Judaism, which in turn came from the Near East. Judeo-Christianity, Gnostic or otherwise, was a foreign introduction to the Greco-Roman world and even more foreign to the far west and north of Europe. If Jung was looking for the sacred origins of his own ancestral inheritance, he would’ve been wiser to look to the tribal paganism that was wiped out by the onslaught of Greco-Roman thought and imperialism. The Christianization of Europe was a genocidal tragedy. Paganism held on in large parts of Europe into the Middle Ages, and some Pagan traditions survived into modernity.

Our criticism isn’t with the respect given to these non-Western influences that took over the West. We are likewise fascinated by the Presocratics and Gnostics. But we feel no need to rationalize that they belong to us nor us to them. They are foreigners, both in space and time. The ancient Greeks were never a single people. As with the Celts and Jews, to be Greek in the ancient world was a very loose and, at times, extensive identity (Ancient Complexity). Many of the famous Greek thinkers technically weren’t ethnically Greek. It’s similar to how the Irish adopted the trade culture of the Celts, even though they are of Basque origins.

So, what is this fear* of the East seen in Jung’s reluctance while in India? And why has Kingsley adopted it? We are typical American mutts with some possible non-European ancestry mixed in, from African to Native American. And we were raised in a hodge-podge of New Age religion with much Eastern thought and practice thrown in. We have no sacred origins, no particular ancestral homeland. Even our European ancestry originated in different parts of Europe, although none from Italy or Greece, much less the Levant. The Presocratics and Gnostics aren’t our people.

So, it doesn’t bother us to seek wisdom wherever we can find it. It doesn’t cause us fear, in the way it did for Jung. He worried about losing himself and, as he had experienced psychotic breaks earlier in his life, it was a genuine concern. He needed a sense of being rooted in a tradition to hold himself together, even if that rootedness was an invented myth. And that doesn’t really bother us. We are still admirers of Jung’s work, as we appreciate Kingsley’s work.

We understand why Jung, having lived through the world war catastrophe that tore apart the Western world, sought a vision of a renewed Western tradition. It may have seemed like a useful and necessary story, but it poses its own dangers. Even if it really was useful then, we question that it is useful now.

* Why didn’t Carl Jung visit Ramana Maharshi after being told by both Zimmer and Brunton?, from Beezone. It has been argued that Carl Jung borrowed his notion of ‘the Self’ from Hinduism, and this notion was key to his own teachings. Maybe this was the fear: that the meeting point between the two cultures would simply overwhelm his own view, and his own psyche.

A Culture of Propaganda

“Contrary to previous readings by historians of the 20th century, which typically described propaganda films as glaringly biased and crude, contemporary historians have argued that filmmakers in propaganda’s coming of age were already educated in the power of subtle suggestion.”
~Christopher Maiytt, A Just Estimate of a Lie

“During the Cold War, it was commonplace to draw the distinction between “totalitarian” and “free” societies by noting that only in the free ones could groups self-organize independently of the state. But many of the groups that made that argument — including the magazines on this list — were often covertly-sponsored instruments of state power, at least in part.”
~Patrick Iber, Literary Magazines for Socialists Funded by the CIA, Ranked

“[Bernd] Scherer said he found fault with the CIA’s cultural programme for the way in which it “functionalised and thus corrupted the term ‘freedom’”, pointing out the paradoxes of an intelligence agency funnelling money to anti-apartheid organisations abroad while helping to sabotage the Black Panther movement at home.”
~Philip Oltermann, Berlin exhibition questions CIA’s influence on global art scene

The subjects of the American Empire are among the most propagandized in the world. And there is a long history of it. Propaganda during World War II was brought back home to be used in the United States, as were counterinsurgency techniques from Southeast Asian wars and covert operations. But few recognize it for what it is, as it filters our entire sense of reality, seeping into every crack and crevice of culture. It’s not merely disinformation. It’s a master narrative that rules our mind as the structures of power rule our lives.

There is a basic truth: maintaining the appearance of democracy in a banana republic requires maintaining basic levels of comfort, so that people don’t question the world around them. This is why a minimal welfare state is necessary; it keeps the population barely treading water and so keeps them from outright revolution. It’s the first half of carrot and stick, bread and circuses.

Propaganda, as a vast circus, is all the more important to smooth over the bumps and divides. In a democratic society, Jacques Ellul argues in Propaganda, “as the government cannot follow opinion, opinion must follow the government. One must convince this present, ponderous, impassioned mass that the government’s decisions are legitimate and good and that its foreign policy is correct.”

A more blatantly authoritarian society is less reliant on propaganda since violent force maintains control and order. For example, the North Korean regime has little use for extensive and sophisticated methods of mind control and public perception management, since anyone who doesn’t conform and follow orders is simply imprisoned, tortured, or killed. But even in a banana republic such as the United States, violence always is a real threat, the stick for when the carrot fails.

There is a reason the American Empire has the largest military and prison system in history, a reason it is the only country to have dropped atomic bombs on a human population, a reason it regularly supports terrorist groups and authoritarian regimes while overthrowing democracies. The authoritarian threat is not theoretical but quite real, carried out in the punishment of vast numbers of people every day, making them into examples — comply or else. Ask the large numbers of Americans who are locked away, or ask the populations targeted by the military-industrial complex.

The trick is to turn public attention away from the brutality of raw power. Propaganda offers a story, a pleasant form of indoctrination. All Americans, on some level, know we are ruled by violent authoritarians and homicidal psychopaths. A good story makes us feel better about why we don’t revolt, why we stand by in complicity as millions suffer and die at the hands of the ruling elite, why we allow the theft of hundreds of trillions of dollars and the poisoning of the earth, leaving a horrific inheritance to our children and grandchildren.

Propaganda comes in many forms such as the daily mindless experience of the propaganda model of news or the invasive nature of corporate astroturf. But it has often been implemented as straightforward political rhetoric, propaganda campaigns, and psyops — see COINTELPRO and Operation Mockingbird. And look at the involvement of the CIA and Pentagon in education, art, literature, movies, video games, music, magazines, journals, and much else; even or especially philosophy and literary criticism — see the CIA obsession with postmodernism (Frances Stonor Saunders, The Cultural Cold War). Not to mention the CIA and FBI infiltration of organized labor, student groups, church organizations, and much else.

One has to wonder about scientific fields as well, the social sciences most of all. Take anthropology (David H. Price, Anthropological Intelligence), such as with the career of Claude Lévi-Strauss. Or think of the less clear example of how the linguist Noam Chomsky criticized the military-industrial complex while essentially being on the payroll of the Pentagon (The Chomsky Problem), as explored in Chris Knight’s book Decoding Chomsky. Be patient for a moment while we go off on a tangent.

* * *

One interesting detail is how consistent Chomsky has been in denying “conspiracy theories”, despite the fact that much of his own writing could only accurately be described as conspiracy theory, in that he analyzes the history of those who have conspired with various agendas and to various ends. Like many academics today, he seeks to be respectable. But how did alternative thinking become disreputable, even among alternative thinkers?

Although the term “conspiracy theorist” has been around since the 1800s, it was rarely used in the past. This changed following a 1967 CIA memo, written in response to criticism of the Warren Commission Report, that itself conspired to control the narrative and manipulate public perception of the John F. Kennedy assassination: “The aim of this dispatch is to provide material for countering and discrediting the claims of the conspiracy theorists” (declassified CIA memo #1035-960, “Countering Criticism of the Warren Report”; for more detailed info, read the book Conspiracy Theory in America by Prof. Lance deHaven-Smith).

In overtly advocating for the government to conspire against the public, the memo’s anonymous author directs CIA operatives to “employ propaganda assets to answer and refute the attacks of the critics. Book reviews and feature articles are particularly appropriate for this purpose.” Who were these propaganda assets? And why was there such confidence in their power to carry out this conspiracy? Let’s put this in context.

That same year, in 1967, a Ramparts article exposed the CIA’s funding of the National Student Association. The following decade brought the revelations, in Congressional investigations and reports, that the CIA was working with journalists in the mainstream media, along with connections to civic groups. Around the same time, the CIA’s Family Jewels report was compiled; upon its declassification in 2007, it showed that the CIA had a propaganda program called Operation Mockingbird that involved the media, with operations going back at least to the 1960s. This was an extensive covert operation (AKA conspiracy), linked to major news outlets and influential journalists and editors in both the foreign and domestic media — from the Wikipedia article on Operation Mockingbird:

In a 1977 Rolling Stone magazine article, “The CIA and the Media,” reporter Carl Bernstein wrote that by 1953, CIA Director Allen Dulles oversaw the media network, which had major influence over 25 newspapers and wire agencies. Its usual modus operandi was to place reports, developed from CIA-provided intelligence, with cooperating or unwitting reporters. Those reports would be repeated or cited by the recipient reporters and would then, in turn, be cited throughout the media wire services. These networks were run by people with well-known liberal but pro-American-big-business and anti-Soviet views, such as William S. Paley (CBS), Henry Luce (Time and Life), Arthur Hays Sulzberger (The New York Times), Alfred Friendly (managing editor of The Washington Post), Jerry O’Leary (The Washington Star), Hal Hendrix (Miami News), Barry Bingham, Sr. (Louisville Courier-Journal), James S. Copley (Copley News Services) and Joseph Harrison (The Christian Science Monitor).

This was admitted the year before, in 1976, by the Church Committee’s final report. About foreign media, it stated that, “The CIA currently maintains a network of several hundred foreign individuals around the world who provide intelligence for the CIA and at times attempt to influence opinion through the use of covert propaganda. These individuals provide the CIA with direct access to a large number of newspapers and periodicals, scores of press services and news agencies, radio and television stations, commercial book publishers, and other foreign media outlets” (Church Committee Final Report, Vol 1: Foreign and Military Intelligence, p. 455).

In our cynicism and passive complicity, we Americans expect the CIA to be tangled up in all kinds of foreign organizations, and many of us support these covert operations, maybe even feeling some pride in the greatness of American imperialism. But the shocking part is that the CIA would do the same in the United States and, sadly, most Americans have been intentionally kept ignorant of this fact (i.e., it’s not typically taught in American history classes, nor often mentioned in the news media and political debates). Read the following and let it sink in.

“Approximately 50 of the [CIA] assets are individual American journalists or employees of U.S. media organizations. Of these, fewer than half are “accredited” by U.S. media organizations … The remaining individuals are non-accredited freelance contributors and media representatives abroad … More than a dozen United States news organizations and commercial publishing houses formerly provided cover for CIA agents abroad. A few of these organizations were unaware that they provided this cover.”

Let’s get back to the CIA pushing the slur of “conspiracy theorists” through these assets. Just because a conspiracy is proven beyond a mere theory doesn’t mean it was effective and successful. So, what were the measurable results that followed? Kevin R. Ryan lays out the facts in showing how pivotal that CIA memo was in shifting the media framing — from Do we need another 9/11 conspiracy theory?:

“In the 45 years before the CIA memo came out, the phrase “conspiracy theory” appeared in the Washington Post and New York Times only 50 times, or about once per year. In the 45 years after the CIA memo, the phrase appeared 2,630 times, or about once per week.

“Before the CIA memo came out, the Washington Post and New York Times had never used the phrase “conspiracy theorist.” After the CIA memo came out, these two newspapers have used that phrase 1,118 times. Of course, in these uses the phrase is always delivered in a context in which “conspiracy theorists” were made to seem less intelligent and less rational than people who uncritically accept official explanations for major events.”

Here is the sad irony. The CIA was always talented at playing two sides against each other. So, as it was using propaganda to weaponize “conspiracy theory” as an attack on critics of authoritarian statism and military imperialism, it was also using propaganda elsewhere to actively push false conspiracy theories to muddy the waters. Kathryn S. Olmsted, a history professor at UC Davis, concluded that (Real Enemies, pp. 239-240, 2011),

“Citizens of a democracy must be wary of official and alternative conspiracists alike, demanding proof for the theories. Yet Americans should be most skeptical of official theorists, because the most dangerous conspiracies and conspiracy theories flow from the center of American government, not from the margins of society.

“Since the First World War, officials of the U.S. government have encouraged conspiracy theories, sometimes inadvertently, sometimes intentionally. They have engaged in conspiracies and used the cloak of national security to hide their actions from the American people. With cool calculation, they have promoted official conspiracy theories, sometimes demonstrably false ones, for their own purposes. They have assaulted civil liberties by spying on their domestic enemies. If antigovernment conspiracy theorists get the details wrong—and they often do—they get the basic issue right: it is the secret actions of the government that are the real enemies of democracy.”

[See my post Skepticism and Conspiracy.]

With respect to Chomsky, we asked earlier how alternative thinking became disreputable. This was not always the case. Chomsky is the most well-known left-winger in the world, but he often plays the role of guarding the boundaries of thought and shepherding loose sheep back into the fold, such as repeatedly telling Americans in recent elections to vote for corporatist Democrats. What in the hell is a supposed anarchist doing promoting corporatism? And why is he repeating a CIA talking point in dismissing conspiracy theories and acting condescending toward those he labels as conspiracy theorists?

One insightful answer is suggested by Chris Knight in Decoding Chomsky, which is highly recommended. The argument isn’t that Chomsky is a CIA asset, but let’s remain focused on the point at hand. Left-wingers, earlier in the last century, were far less concerned about respectability, which is to say they were far more radical. “Around the time of the Second World War,” writes Ron Unz, “an important shift in political theory caused a huge decline in the respectability of any ‘conspiratorial’ explanation of historical events” (American Pravda: How the CIA Invented “Conspiracy Theories”). He goes on to say that,

“For decades prior to that conflict, one of our most prominent scholars and public intellectuals had been historian Charles Beard, whose influential writings had heavily focused on the harmful role of various elite conspiracies in shaping American policy for the benefit of the few at the expense of the many, with his examples ranging from the earliest history of the United States down to the nation’s entry into WWI. Obviously, researchers never claimed that all major historical events had hidden causes, but it was widely accepted that some of them did, and attempting to investigate those possibilities was deemed a perfectly acceptable academic enterprise.”

Following Charles Beard, a new generation of intellectuals and scholars felt the walls closing in. They either quickly learned to submit and conform to the hidden demands of power or else found themselves shut out of polite society, even out of a job. It was the beginning of the era of respectability politics. In controlling the terms of debate, the CIA and other covert interests controlled public debate and hence public perception. The American ruling elite won the Cold War culture war, not only against the Soviet commies but also against the American people.

* * *

“It’s hard to fight an enemy who has outposts in your head.”
~Sally Kempton, Ben Price’s None Dare Call It Propaganda

“Power is the ability to rule the imagination.”
~Jacques Necker, from Guillaume de Sardes’ Against the hegemony of American art

Pseudo-radicals were allowed to go through the motions of freedom, as long as they toed the line, as long as they demonstrated a properly indoctrinated mind. Then they could be successful and, more importantly, respectable. They simply had to make the Devil’s Bargain of never taking radical action. Other than that, they could talk all they wanted while remaining safely within the system of the status quo, such as Chomsky regularly appearing on corporate media — he has admitted that the system maintains control over what he is allowed to communicate.

“The smart way to keep people passive and obedient,” Chomsky fully understood, “is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum — even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.” This is how a system of propaganda becomes internalized, with barriers erected in the mind. “With the help of propaganda,” Jacques Ellul writes,

“one can do almost anything, but certainly not create the behavior of a free man or, to a lesser degree, a democratic man. A man who lives in a democratic society and who is subjected to propaganda is being drained of the democratic content itself – of the style of democratic life, understanding of others, respect for minorities, re-examination of his own opinions, absence of dogmatism. The means employed to spread democratic ideas make the citizen, psychologically, a totalitarian man. The only difference between him and a Nazi is that he is a ‘totalitarian man with democratic convictions,’ but those convictions do not change his behavior in the least. Such contradiction is in no way felt by the individual for whom democracy has become a myth and a set of democratic imperatives, merely stimuli that activate conditioned reflexes. The word democracy, having become a simple incitation, no longer has anything to do with democratic behavior. And the citizen can repeat indefinitely ‘the sacred formulas of democracy’ while acting like a storm trooper.”

So, there was a closing of the American mind and a silencing of radical thought during the early Cold War. That is no surprise; what is surprising is how leading radicals were not eliminated so much as neutered and house-trained. The conspiracy theory is that this was an intentional outcome, what the CIA was hoping to achieve. So, was that 1967 CIA memo part of a propaganda campaign? It would be hard to prove absolutely in terms of what specific actions were taken, even as the memo itself seems to admit to it and even as we know the CIA was using every dirty trick in the book. We will never fully and exactly know what all those CIA assets were doing within the world of media and culture.

Besides, it’s not always clear what is or is not propaganda, as the deep state has its hands in almost every aspect of society, its influence pervasive if often subtle. But what can’t be denied is that, whether intentional or as a side effect, this has a propaganda-like effect in shaping thought in the public mind and among intellectuals, writers, and artists. We are talking about immense amounts of money (and other resources) sloshing about, determining which research gets funding, which articles get into journals, which books get published, which movies get made.

This is subterfuge at the highest level. One has to wonder about other areas entirely. Consider plutocratic and corporatist philanthropy, often combined with greenwashing and control of food systems, overlapping with big ag, big oil, and, of course, big food. Think about why the government and corporations have been so interested in manipulating the American diet since the world war era, coinciding with agricultural subsidies that artificially create cheap agricultural products (refined flour, corn syrup, etc.) to be used as ingredients in mass-produced and industrially-processed foods.

Then look to something recent like the propagandistic EAT-Lancet report, which argues for the need for authoritarian measures to control the global diet for reasons of ‘environment’ and ‘health’; and when one looks to the backers of this agenda, one finds transnational corporations, not only big ag and big food but other industries as well. It is corporate narratizing to co-opt the environmentalist left, but it is being done through a respectable and powerful scientific institution, the journal The Lancet, that informs government policies.

In the American Empire, this has been a shared project of business and government. Ever since the early modern revolutionary era, the reactionaries — not only right-wing authoritarians and conservatives but also right-wing bourgeois liberals — have incessantly co-opted left-wing rhetoric, tactics, and cultural image (The Many Stolen Labels of the Reactionary Mind; & Reactionary Revolutionaries, Faceless Men, and God in the Gutter). They simultaneously co-opt the left as they attack the left, essentially playing both sides and determining the field of play so as to control the game; and hence controlling the outcome, choosing the winners.

This has particularly been true of reactionaries in power. For an obvious example, think of President Donald Trump speaking the progressive language of the New Deal and so co-opting the public outrage of economic populism. Or worse still, look back to Joseph Stalin who, as a right-wing ultra-nationalist, co-opted the communist movement in Russia and used it to rebuild the Russian Empire; and in the process silenced radical leftists (unionists, syndicalists, Trotskyists, Marxists, feminists, etc.) by imprisonment, banishment, and death.

The American Imperialists didn’t necessarily oppose Stalin because of ideology, as they opposed those same radical leftists, but because the Soviet Union was seen as a competing global superpower. As for Stalin, he had no aspirations to attack the West and, instead, hoped to become trading partners with his wartime allies (Cold War Ideology and Self-Fulfilling Prophecies). The problem is, with the Nazis gone, the American Imperialists needed a new boogeyman for purposes of domestic social control, as authoritarian oppression at home always needs an externalized rationalization, a group to be scapegoated or an enemy to be fought — then again, many American oligarchs were pro-Nazi before the war and remained so afterwards. The Cold War right from the start was a propaganda campaign, albeit one that got out of control and nearly turned into a nuclear holocaust.

As one person put it, “It took a lot of mental gymnastics to transform the Soviet Union from an anti-fascist ally into an enemy, and CIA was created in part to do a lot of the heavy lifting” (comment by rararoadrunner). To create and maintain political power and social control requires narrative dominance combined with mass spectacle. The Cold War was better than a real war, in that it could be drawn out for decades. It helped to politically justify the immense money going into the deep state. The first purpose of propaganda is to persuade the public that the propagandists are necessary.

Most propaganda, though, has been so successful because it remains hidden in plain sight, influencing us without our awareness — framing and precluding what we think, and so not overtly appearing to tell us what to think. Sure, there was plenty of silencing going on during the Cold War witch hunts, from McCarthyism to corporate blackballing, but the CIA played the long game of instead making certain voices louder, to drown out all else. Controlling and co-opting the political left has turned out to be a much more effective strategy in castrating opposition and replacing it with a controlled opposition. It was ideological warfare as cannibalism, taking on the power of one’s enemies by consuming them.

The radical became tainted by this masquerade of con men manipulating and posing as what they are not. Combined with outright infiltration and sabotage on American soil (e.g., COINTELPRO), not to mention assassinations (e.g., Fred Hampton), this multi-pronged approach to social control and perception management has had a devastating effect. Reactionary forces and mindsets successfully infiltrated the political left and have maintained their hold, creating conflict and division with the left turned against itself. This took the punch out of leftist critique and organizing — the demoralization has lingered ever since. In The CIA Reads French Theory, Gabriel Rockhill writes:

“Even theoreticians who were not as opposed to Marxism as these intellectual reactionaries have made a significant contribution to an environment of disillusionment with transformative egalitarianism, detachment from social mobilization and “critical inquiry” devoid of radical politics. This is extremely important for understanding the CIA’s overall strategy in its broad and profound attempts to dismantle the cultural left in Europe and elsewhere. In recognizing it was unlikely that it could abolish it entirely, the world’s most powerful spy organization has sought to move leftist culture away from resolute anti-capitalist and transformative politics toward center-left reformist positions that are less overtly critical of US foreign and domestic policies. In fact, as Saunders has demonstrated in detail, the Agency went behind the back of the McCarthy-driven Congress in the postwar era in order to directly support and promote leftist projects that steered cultural producers and consumers away from the resolutely egalitarian left. In severing and discrediting the latter, it also aspired to fragment the left in general, leaving what remained of the center left with only minimal power and public support (as well as being potentially discredited due to its complicity with right-wing power politics, an issue that continues to plague contemporary institutionalized parties on the left).”

Then again, this is a positive sign of potential power. The OSS before, and the CIA later on, would not have spent so many resources on something that was not an ultimate threat. The ideals and principles of leftist radicalism are inherently anti-authoritarian, and the intelligence agencies are inherently authoritarian; those are the terms of the fight. Even as the political left appears weak and has lost confidence, it remains a potent danger to authoritarian regimes like the American Empire. The culture war continues, the war over hearts and minds.

* * *

In this concluding section, let’s look further into the (socio-)cultural aspect of the propagandistic culture wars. We’ll start with a personal, or rather familial, example and an interesting historical note.

Our father grew up in Alexandria, Indiana. It’s a small farm community that once was a bustling factory town. There used to be many towns like it. That is why it was chosen to be designated “Small Town USA”. This was part of a propaganda program set up by the OSS, the predecessor of the CIA. Pamphlets depicted life in Alexandria as the utopian ideal of American-style capitalism. During the Second World War, these pamphlets were distributed throughout Europe. So, the so-called Cultural Cold War had begun before the Cold War itself.

By the way, Alexandria has remained representative of the United States. It has declined into poverty and unemployment, having gone from a labor union town that was a Democratic stronghold to more recently supporting Donald Trump in his 2016 presidential victory. The sense of pride once elicited by that propaganda campaign became a point of shame that Trump was then able to take advantage of with his own rhetoric, Make America Great Again. The myth of the American Dream, even if a fantasy and often a nightmare, remains powerful capitalist propaganda in how it echoes across the generations. The Cold War lives on.

Much of the Cold War propaganda was about branding. And it’s interesting to note that the rhetoric used by the United States and the Soviet Union was often strikingly similar, each presenting an image of freedom. The Soviets loved to point out that the poor and minorities in America experienced very much the opposite of freedom, especially in the early Cold War when there were still lynchings, sundown towns, redlining, and Jim Crow. And much of that prejudice targeted not only blacks but also Jews, Catholics, and ethnic Americans (e.g., along with Japanese-Americans, innocent Italian-Americans and German-Americans were likewise rounded up into internment camps).

Think about what propaganda is in terms of branding. Sure, the American ruling elite were attempting to gain cultural influence, especially in Western Europe. That was important, but more important was creating a new American identity and upholding an ideal of American culture. That was the problem since, prior to the world war era, the United States was not seen as having a distinct culture of its own. This is why American Studies was created in colleges, involving professors who worked for the CIA, sometimes as spymasters (Early Cold War Liberalism), largely to indoctrinate American students, if also to spy on foreign students and to do other work such as textual analysis.

We tend to think of branding, in the corporate world, as targeting customers and prospective customers. But Nick Westergaard, in Brand Now, argues that this represents only the outer layer of targeted influence. First and foremost, branding needs to become an identity that employees internalize, from entry-level workers to upper management. Our father worked in factory management and later became a professor in the same field. He did some consulting work in later years, as did an associate of his. This associate told him that the 1980s Ford advertising campaign, “Quality is Job #1”, was primarily intended to inculcate an image of employee identity. It’s about corporate culture, essentially no different than the patriotism of nationalistic culture promoted by government propaganda. The point is to make people into true believers who will defend and embody the official dogma, whether to be good workers or good citizens.

It’s only after creating a culture as a self-contained and self-reinforcing worldview that those in power can then extend their influence beyond it. But here is the thing. Those in power are the greatest targets of propaganda, as they are the influencers of society (Hillsdale’s Imprimis: Neocon Propaganda). If you can get them to truly believe the ruling ideology or else to mindlessly repeat the talking points for personal gain, those propaganda messages and memes will spread like a contagious disease. And they get others to believe them by acting as if they believe — the con man first has to con himself, as Jack Black (the early 20th century author, not the actor) observed in his memoir You Can’t Win. C. J. Hopkins writes (Why Ridiculous Official Propaganda Still Works):

“Chief among the common misconceptions about the way official propaganda works is the notion that its goal is to deceive the public into believing things that are not “the truth” (that Trump is a Russian agent, for example, or that Saddam had weapons of mass destruction, or that the terrorists hate us for our freedom, et cetera). However, while official propagandists are definitely pleased if anyone actually believes whatever lies they are selling, deception is not their primary aim.

“The primary aim of official propaganda is to generate an “official narrative” that can be mindlessly repeated by the ruling classes and those who support and identify with them. This official narrative does not have to make sense, or to stand up to any sort of serious scrutiny. Its factualness is not the point. The point is to draw a Maginot line, a defensive ideological boundary, between “the truth” as defined by the ruling classes and any other “truth” that contradicts their narrative.”

A similar methodology explains why corporations spend so much money on astroturf and lobbying, especially in influencing doctors, health experts, government officials, academic researchers, etc. (Sharyl Attkisson, Astroturf and manipulation of media messages). A lot of corporate funding goes to scientific journals, scientific conventions, and further education for professionals. Even more money gets thrown around to pay for fake news articles, fake positive reviews, fake social media accounts, etc. All of this to create an image and then to discredit anyone who challenges that image. Between the private and public sectors, this is an all-out propaganda onslaught from hundreds, if not thousands, of government agencies, corporations, lobbyist organizations, special interest groups, think tanks, and on and on.

“I am an intellectual thug who has slowly been accumulating a private arsenal with every intention of using it. In a mindless age, every insight takes on the character of a lethal weapon.”
~Marshall McLuhan to Ezra Pound, letter, June 22, 1951

* * *

Let’s consider an example of private censorship by powerful corporations, as a type of negative propaganda where public perception is shaped not only by what Americans were allowed to see but by what was omitted and eliminated from view. It’s often forgotten that most of the oppressive actions during the Cold War were taken by big biz, not big gov, including but not limited to blackballing. In the documentary Red Hollywood, there is discussion of the 1954 independent film Salt of the Earth. It was written, directed, and produced by three men who were on the Hollywood blacklist as alleged Communists. The narrator of the documentary described its groundbreaking significance:

“But only after the blacklist had forced them outside the studio system could Hollywood Communists make a film in which working-class women stood up and demanded equality. No Hollywood film had ever shown a strike from the workers’ point of view. No Hollywood film had ever portrayed a strike as just and rational. No Hollywood film had ever given Chicanos the leading parts and put Anglos in subordinate roles. No Hollywood film had ever shown women courageously and effectively taking over the work of men. Salt of the Earth broke all these taboos, but it never reached its intended public.”

Then the documentary cuts to an interview with Paul Jarrico, the producer of Salt of the Earth. He explained that,

“After the opening in New York where the picture was well-received, not only by an audience who packed the theater for nine weeks, I think, or 10, but by good reviews in the New York Times, and Time magazine, and other journals. And a number of exhibitors said they wanted to play the picture, and then one by one they were pressured by the majors: ‘You play that picture and you’ll never get another RKO picture.’ ‘You play that picture, you’ll never get another MGM picture.’ And one by one, they backed out. The original intent when we formed the company was to make a number of films using the talents of blacklisted people. But we lost our shirts on Salt of the Earth and that was the end of that noble experiment. In a way, it’s the grandfather of independent filmmaking in the United States. I mean, there’ve been a lot of independent films since, but we didn’t make them.”

This is how alternative voices were silenced, again and again. In their place, films that toed the line were promoted. Through control of the film industry and backing by government, the major film companies were able to have near total control of the indoctrination of American citizens. That is but one example among many.

* * *

Hearts, Minds, and Dollars
by David Kaplan

A Lost Opportunity to Learn Lessons from the Cultural Cold War
by Steve Slick

How the CIA Really Won Hearts and Minds
by J.P. O’Malley

The CIA and the Cultural Cold War Revisited
by James Petras

The CIA and the Media
by Carl Bernstein

A Propaganda Model
by Edward Herman & Noam Chomsky

The CIA and the Press: When the Washington Post Ran the CIA’s Propaganda Network
by Jeffrey St. Clair

Murdoch, Scaife and CIA Propaganda
by Robert Parry

Modern art was CIA ‘weapon’
by Frances Stonor Saunders

The CIA as Art Patron
by Lenni Brenner

Washington DC’s role behind the scenes in Hollywood goes deeper than you think
by Matthew Alford

Hollywood and the Pentagon
by Jacobin Editors

EXCLUSIVE: Documents expose how Hollywood promotes war on behalf of the Pentagon, CIA and NSA
by Tom Secker

ROI: Does the Pentagon Fund Movies?
from Spy Culture

How Many Movies has the Pentagon Prevented from Being Made?
from Spy Culture

CIA helped shape ‘Tom Clancy’s Jack Ryan’ series into bigoted Venezuela regime change fantasy
by Max Blumenthal

How the Pentagon and CIA push Venezuela regime-change propaganda in video games
by Max Blumenthal and Ben Norton

“Invading Your Hearts and Minds”: Call of Duty® and the (Re)Writing of Militarism in U.S. Digital Games and Popular Culture
by Frédérick Gagnon

Arts Armament: How the CIA Secretly Shaped The Arts in America
by Theodore Carter

The CIA-Soviet Culture Wars That Shaped American Art
by Juliana Spahr

Was modern art a weapon of the CIA?
by Alastair Sooke

Modern art is a sham
by Arthur B. Alexi

The Occult War of Art
from Cult Of Frogs

The battle for Picasso’s mind
by Matthew Holman

Picasso and the CIA
by Susan Adler

How Jackson Pollock and the CIA Teamed Up to Win The Cold War
by Michael R. McBride

Postmodern philosopher Judith Butler repeatedly donated to ‘top cop’ Kamala Harris
by Ben Norton

The CIA Assesses the Power of French Post-Modern Philosophers: Read a Newly Declassified CIA Report from 1985
by Josh Jones

Why the CIA Cares About Marxism
by Michael Barker

Why the CIA Loved French New Left Philosophy, and Why They Were Wrong
from Spy Culture

Is Literature ‘the Most Important Weapon of Propaganda’?
by Nick Romeo

Literary Magazines for Socialists Funded by the CIA, Ranked
by Patrick Iber

The CIA Helped Build the Content Farm That Churns Out American Literature
by Brian Merchant

How Iowa Flattened Literature
by Eric Bennett

Hijack: The CIA and Literary Culture
by Antony Loewenstein

How the CIA Infiltrated the World’s Literature
by Mary von Aue

How the CIA Helped Shape the Creative Writing Scene in America
by Josh Jones

‘Workshops of Empire,’ by Eric Bennett
by Timothy Aubry

Silent Coup: How the CIA is Welcoming Itself Back Onto American University Campuses
by David Price

The science of spying: how the CIA secretly recruits academics
by Daniel Golden

Propaganda and Disinformation: How the CIA Manufactures History
by Victor Marchetti

The Mighty Wurlitzer: How the CIA Played America – Part 1 & Part 2
by Nancy Hanover

These are the propaganda ad campaigns that made socialism seem un-American
by Oana Godeanu-Kenworthy

FBI Uses “Cute” Propaganda Campaign to Justify Civil Asset Forfeiture
by Jose Nino