Islamic Voice-Hearing

What kind of religion is Islam? We earlier described it as the worship of a missing god. Some might consider that unfair and dismissive toward one of the world’s largest religions, but it is true to some extent of all post-bicameral religions. The difference is that Islam is among the most post-bicameral of the world religions, in simple temporal terms.

The bicameral societies, according to Julian Jaynes, ended with the widespread collapse of the late Bronze Age empires and their trade networks. That happened around 1177 BCE, as the result of natural disasters and attacks by the mysterious Sea Peoples, the latter maybe having formed out of refugees from the former. The Bronze Age itself continued for many centuries in various places: until around 700 BCE in Great Britain, Central Europe, and China; 600 BCE in Northern Europe; 500 BCE in Korea and Ireland; and centuries beyond that in places like Japan.

But the Bronze Age Empires never returned. In that late lingering Bronze Age, a dark age took hold and put all of civilization onto a new footing. This was the era when, across numerous cultures, there were the endless laments about the gods, spirits, and ancestors having gone silent, having abandoned humanity. Entire cultural worldviews and psychological ways of being were utterly demolished or else irreparably diminished. This created an intense sense of loss, longing, and nostalgia that has never left humanity since.

Out of the ashes, while the Bronze Age was still holding on, the Axial Age arose around 900 BCE and continued until 200 BCE. New cultures were formed and new empires built. The result is what Jaynes described as ‘consciousness’ or what one can think of as introspective mental space, an inner world of egoic identity where the individual is separate from community and world. Consciousness and the formalized religions that accompanied it were a replacement for the loss of a world alive with voices.

By the time Rabbinic Judaism, Gnosticism, and Christianity came around, the Axial Age was already being looked back upon as a Golden Age and, other than through a few surviving myths, the Bronze Age before that was barely remembered at all. It would be nearly another 600 years after that first-century monotheistic revival before Muhammad would have his visions of the angel Gabriel visiting him to speak on behalf of God. Islam is thus both post-bicameral and post-axial, to a far greater degree than the religions that came before it.

Muslims consider Muhammad to be the last prophet, and even he didn’t get to hear God directly, for the message had to come through an angel. The voice of God had long ago grown so faint that people had come to rely on oracles, channelings, and the like. These rather late revelations by way of Gabriel were but a barely audible echo of the archaic bicameral voices. It is perhaps understandable that, as with some oracles before him, Muhammad would declare that God would never speak again. So Islam, unlike the other monotheistic religions, fully embraces God’s absence from the world.

Actually, that is not quite right. According to the Koran, God will not speak again until the Final Judgment. Then all will hear God again when he weighs your sins and decides the fate of your immortal soul. Here is the interesting part. The witnesses God shall call upon in each person’s case will be all the bicameral voices brought back out of silence. The animals and plants will witness for or against you, as will the earth and rocks and wind. Even your own resurrected body parts will come alive again with voices to speak of what you did. Body parts speaking is something familiar to those who read Jaynesian scholarship.

Until then, God and all the voices of the world will remain mute witnesses, watching your every move and taking notes. They see all, hear all, notice all — every time you masturbate or pick your nose, every time you have a cruel or impure thought, every time you don’t follow one of the large number of divine commandments, laws, and rules spelled out in the Koran. The entire world is spying upon you and will report back to God, at the end of time. The silent world only appears to be dumb and unconscious. God is biding his time, gathering a file on you like a cosmic FBI.

This could feel paralyzing, but in another way it offers total freedom from self, total freedom through complete submission. Jaynesian consciousness is a heavy load and that was becoming increasingly apparent over time, especially in the centuries following the Axial Age. The zealous idealism of the Axial Age prophets was growing dull and tiresome. By the time that Muhammad showed up, almost two millennia had passed since the bicameral mind descended into darkness. The new consciousness was sold as something amazing, but it hadn’t fully lived up to its promises. Instead, ever more brutal regimes came into power and a sense of anxiety was overtaking society.

Muhammad had an answer, and the people of that region were obviously hungry for someone to provide one. Once he had formed his large army, his military campaign met barely any resistance. Within his own lifetime, most of the Arabian Peninsula was converted to Islam. The silence of the gods had weakened society, but Muhammad offered an explanation for why the divine could no longer be experienced. He helped normalize what had once felt like a tragedy. He told them that they didn’t need to hear God because God had already revealed all knowledge to the prophets, including himself of course. No one had to worry, just follow orders and comply with commands.

All the tiresome complications of thought were unnecessary. God had already thought out everything for humans. The Koran, as the final and complete holy text, would entirely and permanently replace the bicameral voices, ever receding into the shadows of the psyche. But don’t worry, all those voices are still there, waiting to speak. The only voice the individual needed to listen to was that of the person directly above them in the religious hierarchy, be it one’s father or an imam or whoever else holds greater official authority, with a line of command that goes back to the prophets and through the angels to God Himself. Everything is in the Koran, and the learned clerical class would explain it all and translate it into proper theocratic governance.

Muhammad came with a message different from any before. The Jewish prophets and Jesus, as with many Pagans, would speak of God as Father and humanity as His children. Early Christians took this as a challenge to a slave-based society, borrowing from the Stoics the idea that even a slave was free in his soul. Muhammad, instead, was offering another variety of freedom. We humans, rather than children of God, are slaves of God. The entire Islamic religion is predicated upon divine slavery, absolute submission. This is freedom from the harsh taskmaster of egoic individuality, a wannabe demiurge. Unlike Jesus, Muhammad formulated a totalitarian theocracy, a totalizing system. Nothing is left to question or interpretation, at least in theory or, rather, in belief.

This goes back to how, with the loss of the bicameral mind and social order, something took its place. It was a different kind of authoritarianism — rigid and hierarchical, centralized and concentrated, despotic and violent. Authoritarianism of this variety didn’t emerge until the late Bronze Age when the bicameral societies were becoming too large and complex, overstrained and unstable. Suddenly, as if to presage the coming collapse, there was the appearance of written laws, harsh punishment, and cruel torture — none of which ever existed before, according to historical records and archaeological finds. As the world shifted into post-bicameralism, this authoritarianism became ever more extreme (e.g., Roman Empire).

This was always the other side of the rise of individuality, of Jaynesian consciousness. The greater the potential freedom the individual possesses, the more oppressive social control is required, as the communal bonds and social norms of the bicameral mind increasingly lost their power to organically maintain order. Muhammad must have shown up at the precise moment of crisis in this change. After the Roman Empire’s system of slavery, Europe came up with feudalism to re-create some of what had disappeared. But apparently a different kind of solution was required in the Arab world.

Maybe this offsets the draining of psychic energy that comes with consciousness. Jaynes speculated that, like the schizophrenic, bicameral humans had immense energy and stamina, which allowed them to accomplish near-miraculous feats such as building the pyramids with small populations and very little technology or infrastructure. Suppressing the extremes of individualism by emphasizing absolute subordination is maybe a way of keeping in check the energy cost of maintaining egoic consciousness. In the West, we eventually overcame this weakness by using massive doses of stimulants to overpower the otherwise debilitating anxiety and to help shore up the egoic boundaries, but this has come at the cost of destroying our physical and mental health.

Time will tell which strategy is the most effective for the long-term survival of specific societies. But I’m not sure I’d bet on the Western system, considering how unsustainable it appears to be and how easily it has been crippled by a minor disease epidemic like covid-19. Muhammad might simply have been trying to cobble together some semblance of a bicameral mind in the face of divine silence. There is a good reason for trying to do that. Those bicameral societies lasted many millennia longer than has our post-bicameral civilization. It’s not clear that modern civilization, or at least Western civilization, will last beyond the end of this century. We underestimate the bicameral mind and the role it played during the single longest period of advancement of civilization.

* * *

Let us leave a small note of a more personal nature. In the previous post (linked above), we mentioned that our line of inquiry began with a conversation we had with a friend of ours who is a Muslim. He also happens to be schizophrenic, i.e., a voice-hearer. The last post was about how voice-hearing is understood within Islam. Since supposedly God no longer speaks to humans, nor do his angelic intermediaries, any voice a Muslim hears is automatically interpreted as not being of divine origin. That doesn’t necessarily make the voice evil, as it could be a jinn, a neutral entity in Islamic theology, although jinn can be dangerous. Then again, voice-hearing might also be caused by an evil magician practicing what I think is called sihir.

Anyway, we had the opportunity to speak to this friend once again, as we are both in jobs that require us to continue working downtown amidst everything otherwise being locked down because of the covid-19 epidemic. In being isolated from family and other friends, we’ve been meeting with this Muslim friend on a daily basis. Just this morning, we went for a long walk together and chatted about life and religion. He had previously talked about his schizophrenia in passing, apparently unworried by the stigma of it. He is an easy person to talk to, quite direct and open about his thoughts and experiences. I asked him about voice-hearing, and he explained that, prior to being medicated, he would continue to hear people speak to him after they were no longer present. And unsurprisingly, the voices were often negative.

Both his imam and his therapist told him to ignore the voices. Maybe that is a standard approach in traditionally monotheistic cultures. As we mentioned in the other post, he is from North Africa, where Arabs are common. But another friend of ours lives in Ghana, in West Africa. In the research of Tanya M. Luhrmann, an anthropologist inspired by Julian Jaynes, the voice-hearing experiences of people in Ghana were compared with those of people in the United States. She found that Ghanaians, with a tradition of voice-hearing (closer to bicameralism?), had a much more positive experience of the voices they heard. Americans, like our Muslim friend, did not tend to hear voices that were kind and helpful. This is probably the expectancy effect.

If you are raised to believe that voices come from demons or their Islamic equivalent, the jinn, or from witches and evil magicians, or if you simply have been told that voice-hearing means you’re insane, it’s not likely to lead to happy results when you do hear voices. I doubt it decreases the rate of voice-hearing, though. In spite of Islamic theology denying that God and angels any longer speak to humans, that isn’t likely to have any effect on voice-hearing itself. So the repressed bicameral mind keeps throwing out these odd experiences, but in our post-bicameral age we have fewer resources for dealing constructively with those voices. Simply denying and ignoring them is probably less helpful.

That is the ultimate snag. The same voices that once were identified as godly or something similar are now taken as false, unreal, or dangerous. In a sense, God never stopped speaking. One could argue that we all are voice-hearers, but some of us now call the voice of God ‘conscience’ or whatever. Others, like Muslims, put great emphasis on this voice-hearing but have tried to gag a God who goes on talking. Imagine how many potential new prophets have been locked away in psychiatric wards or, much worse, killed or imprisoned as heretics. If God can’t be silenced, the prophets who hear him can be. The Old Testament even describes how the authorities forbade voice-hearing and demanded that voice-hearers be killed, even by their own parents.

The bicameral mind didn’t disappear naturally because it was inferior but because, in its potency, it was deemed dangerous to those who wanted to use brute power to enforce their own voices of authorization. The bicameral mind, once central to the social order, had become enemy number one. If people could talk to God directly, religion and its claims of authority would become irrelevant. That is how our Muslim friend, a devout religious practitioner, ended up being drugged up to get the voices to stop speaking.

Islam as Worship of a Missing God

A friend of ours is a Muslim and grew up in an Islamic country. As he talked about his religion, we realized how different it is from Christianity. There is no shared practice among Christians similar to praying five times a day. From early on, Christianity was filled with diverse groups and disagreements, and that has only increased over time (there are over 4,600 denominations of Christianity in the United States alone). Our friend had a hard time appreciating that there is no agreed-upon authority, interpretation, or set of beliefs among all Christians.

Unlike Muhammad, Jesus never wrote anything, nor was anything written down about him until much later. Nor did he intend to start a new religion. He offered no rules, social norms, instructions, etc. for how to organize a church, a religious society, or a government. He didn’t even preach family values, if anything the opposite — from a command to let the dead bury themselves to the proclamation of having come to turn family members against each other. The Gospels offer no practical advice about anything. Much of Jesus’ teaching, beyond a general message of love and compassion, is vague and enigmatic, often delivered in parables that have many possible meanings.

Now compare Jesus to the Islamic prophet. Muhammad is considered the last prophet, although he never claimed to have heard the voice of God directly, instead supposedly receiving the message secondhand through an angel. Still, according to Muslims, the Koran is the only complete holy text in existence — the final Word of God. That is also something that differs from Christianity. Jesus never asserted that God would become silent to all of humanity for eternity and that his worshippers would be condemned to a world without the God they longed for, in the way that Allah never enters His own Creation.

Many Protestants and Anabaptists and those in similar groups believe that God continues to be revealed to people today, that the divine is known through direct experience, that the Bible as a holy text must be read through a personal relationship to God, not merely taken on the authority of blind faith. Some churches go so far as to teach people how to speak to and hear God (T.M. Luhrmann, When God Talks Back). Even within Catholicism, there have been further revelations of God since Jesus, from various mystics and saints acknowledged by the Vatican, but also from ordinary Catholics claiming God spoke to them, without any great fear of heresy charges and excommunication.

It made me think about Julian Jaynes’ theory of modern consciousness. With the collapse of the Bronze Age civilizations, there was this sense of the gods having gone silent. Yet this was never an absolute experience, as some people continued to hear the gods. Even into the modern world, people occasionally still claim to hear various gods and sometimes even found new religions based on revelations. The Baha’i, for example, consider Muhammad to be just one more prophet, with others having followed him. Hindus also have a living tradition of divine revelation that is equivalent to that of prophets. Only Islam, as far as I know, claims all prophecy and revelation to be ended for all time.

I was thinking about the sense of loss and loneliness people felt when bicameral societies came to an end. They were thrown into an increasingly isolated individualism. Religion as we know it was designed to accommodate this, in order to give a sense of order, meaning, and authority that had gone missing. But Islam takes this to an extreme. After Muhammad, no human supposedly would ever again personally hear, see, or experience the divine in any way (excluding mystical traditions like Sufism). For all intents and purposes, Allah has entirely receded from the world. The only sign of his existence that he left behind was a book of instructions. We must submit and comply or be punished in the afterlife, a world separate from this one.

That seems so utterly depressing and dreary to me. I was raised Christian, at the far other extreme of Protestantism. My family attended the Unity Church, which emphasizes direct experience of God to such a degree that the Bible itself was mostly ignored and almost irrelevant — why turn to mere words on paper when you can go straight to the source? Rather than being denied and condemned, a claim to have heard God speak would have been taken seriously. I’m no longer religious, but the nearly deist idea of a god that is distant and silent seems so alien and unappealing to me. Yet maybe that makes Islam well designed for the modern world, as it offers a strong response to atheism.

If you don’t have any experience of God, this is considered normal and expected in Islam, not something to be worried about, not something to challenge one’s faith, as is common in Christianity (NDE: Spirituality vs Religiosity); and it avoids the riskiness and confusion of voice-hearing (Libby Anne, Voices in Your Head: Evangelicals and the Voice of God). One’s ignorance of the divine demonstrates one’s individual inadequacy and, as argued by religious authority, is all the more reason to submit to religious authority. The Islamic relation between God and humanity is one-way, except to some extent by way of inspiration and dreams; Allah himself never directly enters his Creation and so never directly interacts with humans, not even with prophets. Is that why constant prayer is necessary for Muslims, to offset God’s silence and vacancy? Worship of a missing God seems perfectly suited for the modern world.

Muslims are left with looking for traces of God in the Koran like ants crawling around in a footprint while trying to comprehend what made it and what it wants them to do. So, some of the ants claim to be part of a direct lineage of ants that goes back to an original ant that, according to tradition, was stepped upon by what passed by. These well-respected ants then explain to all the other ants what is meant by all the bumps and grooves in the dried mud. In worship, the ants pray toward the footprint and regularly gather to circle around it. This gives their life some sense of meaning and purpose and, besides, it maintains the social order.

That is what is needed in a world where the bicameral voices of archaic authorization no longer speak, are no longer heard. Something has to fill the silence, as the loneliness it creates is unbearable. Islam has a nifty trick, embracing the emptiness and further aggravating the overwhelming anxiety even as it offers the salve for the soul. Muslims take the silence of God as proof of God, as a promise of something more. This otherworldly being, Allah, tells humans who don’t feel at home in this world that their real home is elsewhere, to which they will return if they do what they are told. Other religions do something similar, but Islam takes this to another level — arguably, the highest or most extreme form of monotheism, so far. The loss of the bicameral mind could not be pushed much further, one suspects, without being pushed into an abyss.

Islam is a truly modern religion. Right up there with capitalism and scientism.

* * *

Further discussion about this can be found on the Facebook page “Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind”.


To Empathize is to Understand

What is empathy as a cognitive ability? And what is empathy as an expansion of identity, as part of awareness of self and other?

There is a basic level of empathy that appears to be common across numerous species. Tortoises, when seeing another on its back, will help flip it over. There are examples of animals helping or cooperating with those from an entirely different species. Such behavior has been repeatedly demonstrated in laboratories as well. These are fairly advanced expressions of empathy. In some cases, one might interpret them as indicating at least a rudimentary theory of mind, the understanding that others have their own experience, perspective, and motivations. But obviously human theory of mind can be much more complex.

One explanation of greater empathy has to do with identity. Empathy, in a way, is simply a matter of what is included within one’s personal experience (Do To Yourself As You Would Do For Others). To extend identity is to extend empathy to another individual or a group (or anything else that can be brought within the sphere of the self). For humans, this can mean learning to include one’s future self, to empathize with experiences one has not yet had, the person one has not yet become. The future self is fundamentally no different from another person.

Without cognitive empathy, affective empathy is limited to immediate experience. Affective empathy is the ability to feel what another feels. But lacking cognitive empathy, as happens in the most severe autism, theory of mind cannot develop and so there is no way to identify, locate, and understand that feeling. One can only emotionally react, unable to differentiate one’s own emotion from that of another. In that case, there would be pure emotion, and yet no recognition of the other. Cognitive empathy is necessary to get beyond affective reactivity, which is not all that different from the biological reactivity of a slug.

It’s interesting that some species (primates, rats, dolphins, etc.) might have more cognitive empathy and theory of mind than some people at the extreme end of severe autism; it is not necessarily an issue of intelligence. On the other hand, those high-functioning on the autistic spectrum, if intervention happens early enough, can be taught theory of mind, although it is challenging for them. This kind of empathy is considered a hallmark of humanity, a defining feature, and its impairment is what leads to problems of social behavior for those with autism spectrum disorder.

Someone entirely lacking in theory of mind would be extremely difficult to communicate and interact with beyond the most basic level, as is seen in the severest cases of autism and other extreme developmental conditions. Helen Keller asserts she had no conscious identity, no theory of her own mind or that of others, until she learned language.* Prior to her awakening, she was aggressive and violent in reacting to a world she couldn’t understand, articulate, or think about. That fits in with the speculations of Julian Jaynes. What he calls ‘consciousness’ is the addition of abstract thought by way of metaphorical language, as built upon concrete experience and raw affect. Keller discusses how her experience went from the concreteness of touch to the abstraction of language. In becoming aware of the world, she became aware of herself.

Without normal development of language, the human mind is crippled: “The ‘black silence’ of the deaf, blind and mute is similar in many respects to the situation of acutely autistic children where there are associated difficulties with language and the children seem to lack what has been called ‘a theory of mind’” (Robin Allott, Helen Keller: Language and Consciousness). Even so, there is more to empathy than language, and that might be true as well for some aspects or kinds of cognitive empathy. Language is not the only form of communication.

Rats are a great example for comparison with humans. We think of them as pests, as psychologically inferior. But anyone who has kept rats knows how intelligent and social they are. They are friendlier and more interactive than the typical cat. And research has shown how cognitively advanced they are in learning. Rats do show the typical empathy of concern for others. For example, they won’t hurt another rat in exchange for a reward and, given the choice, would rather go hungry. But it goes beyond that.

It’s also been shown that “rats are more likely and quicker to help a drowning rat when they themselves have experienced being drenched, suggesting that they understand how the drowning rat feels” (Kristin Andrews, Rats are us). And “rats who had been shocked themselves were less likely to allow other rats to be shocked, having been through the discomfort themselves.” They can also learn to play hide-and-seek, which necessitates taking on the perspective of others. As Ed Yong asks in The Game That Made Rats Jump for Joy, “In switching roles, for example, are they taking on the perspective of their human partners, showing what researchers call ‘theory of mind’?”

That is much more than mere affective empathy. This seems to involve active sympathy and genuine emotional understanding, that is to say cognitive empathy and theory of mind. If they are capable of both affective and cognitive empathy, however limited, and if Jaynesian consciousness partly consists of empathy imaginatively extended in space and time, then a case could be made that rats have more going on than simple perceptual awareness and biological reactivity. They are empathically and imaginatively engaging with others in the world around them. Does this mean they are creating and maintaining a mental model of others? Kristin Andrews details the extensive abilities of rats:

“We now know that rats don’t live merely in the present, but are capable of reliving memories of past experiences and mentally planning ahead the navigation route they will later follow. They reciprocally trade different kinds of goods with each other – and understand not only when they owe a favour to another rat, but also that the favour can be paid back in a different currency. When they make a wrong choice, they display something that appears very close to regret. Despite having brains that are much simpler than humans’, there are some learning tasks in which they’ll likely outperform you. Rats can be taught cognitively demanding skills, such as driving a vehicle to reach a desired goal, playing hide-and-seek with a human, and using the appropriate tool to access out-of-reach food.”

To imagine the future for purposes of thinking in advance and planning actions, that is quite advanced cognitive behavior. Julian Jaynes argued that was the purpose of humans developing a new kind of consciousness, as the imagined metaphorical space that is narratized allows for the consideration of alternatives, something he speculates was lacking in humans prior to the Axial Age when behavior supposedly was more formulaic and predetermined according to norms, idioms, etc. Yet rats can navigate a path they’ve never taken before with novel beginning and ending locations, which would require taking into account multiple options. What theoretically makes Jaynesian consciousness unique?

Jaynes argues that it’s the metaphorical inner space that is the special quality that created the conditions for the Axial Age and all that followed from it, the flourishing of complex innovations and inventions, the ever greater extremes of abstraction seen in philosophy, math and science. We have so strongly developed this post-bicameral mind that we barely can imagine anything else. But we know that other societies have very different kinds of mentalities, such as the extended and fluid minds of animistic cultures. What exactly is the difference?

Australian Aborigines give a hint of something between the two kinds of mind. In some ways, their mnemonic systems represent a more complex cognitive ability than we are capable of with our Jaynesian consciousness. Instead of an imagined inner space, the Songlines are vast systems of experience and knowledge, culture and identity overlaid upon immense landscapes. These mappings of externalized cognitive space can be used to guide the individual across distant territories the individual has never seen before and help them to identify and use the materials (plants, stones, etc.) at a location no one in their tribe has visited for generations. Does this externalized mind have less potential for advanced abilities? Upon Western contact, Aborigines had farming and ranching, kept crop surpluses in granaries, and practiced water and land management.

It’s not hard to imagine civilization having developed along entirely different lines based on divergent mentalities and worldviews. Our modern egoic consciousness was not an inevitability and it likely is far from offering the most optimal functioning. We might already be hitting a dead end with our present interiorized mind-space. Maybe it’s our lack of empathy in understanding the minds of other humans and other species that is an in-built limitation to the post-bicameral world of Jaynesian consciousness. And so maybe we have much to learn from entirely other perspectives and experiences, even from rats.

* * *

* Helen Keller, from Light in My Darkness:

I had no concepts whatever of nature or mind or death or God. I literally thought with my body. Without a single exception my memories of that time are tactile. . . . But there is not one spark of emotion or rational thought in these distinct yet corporeal memories. I was like an unconscious clod of earth. There was nothing in me except the instinct to eat and drink and sleep. My days were a blank without past, present, or future, without hope or anticipation, without interest or joy. Then suddenly, I knew not how or where or when, my brain felt the impact of another mind, and I awoke to language, to knowledge, to love, to the usual concepts of nature, good, and evil. I was actually lifted from nothingness to human life.

And from The Story of My Life:

As the cool stream gushed over one hand she spelled into the other the word water, first slowly, then rapidly. I stood still, my whole attention fixed upon the motions of her fingers. Suddenly I felt a misty consciousness as of something forgotten — a thrill of returning thought; and somehow the mystery of language was revealed to me. I knew then that ‘w-a-t-e-r’ meant the wonderful cool something that was flowing over my hand. That living word awakened my soul, gave it light, hope, joy, set it free! There were barriers still, it is true, but barriers that could in time be swept away.

And from The World I Live In:

Before my teacher came to me, I did not know that I am. I lived in a world that was a no-world. I cannot hope to describe adequately that unconscious, yet conscious time of nothingness. I did not know that I knew aught, or that I lived or acted or desired. I had neither will nor intellect. I was carried along to objects and acts by a certain blind natural impetus. I had a mind which caused me to feel anger, satisfaction, desire. These two facts led those about me to suppose that I willed and thought. I can remember all this, not because I knew that it was so, but because I have tactual memory. It enables me to remember that I never contracted my forehead in the act of thinking. I never viewed anything beforehand or chose it. I also recall tactually the fact that never in a start of the body or a heart-beat did I feel that I loved or cared for anything. My inner life, then, was a blank without past, present, or future, without hope or anticipation, without wonder or joy or faith. […]

Since I had no power of thought, I did not compare one mental state with another. So I was not conscious of any change or process going on in my brain when my teacher began to instruct me. I merely felt keen delight in obtaining more easily what I wanted by means of the finger motions she taught me. I thought only of objects, and only objects I wanted. It was the turning of the freezer on a larger scale. When I learned the meaning of “I” and “me” and found that I was something, I began to think. Then consciousness first existed for me. Thus it was not the sense of touch that brought me knowledge. It was the awakening of my soul that first rendered my senses their value, their cognizance of objects, names, qualities, and properties. Thought made me conscious of love, joy, and all the emotions. I was eager to know, then to understand, afterward to reflect on what I knew and understood, and the blind impetus, which had before driven me hither and thither at the dictates of my sensations, vanished forever.

I cannot represent more clearly than any one else the gradual and subtle changes from first impressions to abstract ideas. But I know that my physical ideas, that is, ideas derived from material objects, appear to me first an idea similar to those of touch. Instantly they pass into intellectual meanings. Afterward the meaning finds expression in what is called “inner speech.”  […]

As my experiences broadened and deepened, the indeterminate, poetic feelings of childhood began to fix themselves in definite thoughts. Nature—the world I could touch—was folded and filled with myself. I am inclined to believe those philosophers who declare that we know nothing but our own feelings and ideas. With a little ingenious reasoning one may see in the material world simply a mirror, an image of permanent mental sensations. In either sphere self-knowledge is the condition and the limit of our consciousness. That is why, perhaps, many people know so little about what is beyond their short range of experience. They look within themselves—and find nothing! Therefore they conclude that there is nothing outside themselves, either.

However that may be, I came later to look for an image of my emotions and sensations in others. I had to learn the outward signs of inward feelings. The start of fear, the suppressed, controlled tensity of pain, the beat of happy muscles in others, had to be perceived and compared with my own experiences before I could trace them back to the intangible soul of another. Groping, uncertain, I at last found my identity, and after seeing my thoughts and feelings repeated in others, I gradually constructed my world of men and of God. As I read and study, I find that this is what the rest of the race has done. Man looks within himself and in time finds the measure and the meaning of the universe.

* * *

As an example of how language relates to emotions:

The ‘untranslatable’ emotions you never knew you had
by David Robson

But studying these terms will not just be of scientific interest; Lomas suspects that familiarising ourselves with the words might actually change the way we feel ourselves, by drawing our attention to fleeting sensations we had long ignored.

“In our stream of consciousness – that wash of different sensations, feelings and emotions – there’s so much to process that a lot passes us by,” Lomas says. “The feelings we have learned to recognise and label are the ones we notice – but there’s a lot more that we may not be aware of. And so I think if we are given these new words, they can help us articulate whole areas of experience we’ve only dimly noticed.”

As evidence, Lomas points to the work of Lisa Feldman Barrett at Northeastern University, who has shown that our abilities to identify and label our emotions can have far-reaching effects.

Her research was inspired by the observation that certain people use different emotion words interchangeably, while others are highly precise in their descriptions. “Some people use words like anxious, afraid, angry, disgusted to refer to a general affective state of feeling bad,” she explains. “For them, they are synonyms, whereas for other people they are distinctive feelings with distinctive actions associated with them.”

This is called “emotion granularity” and she usually measures this by asking the participants to rate their feelings on each day over the period of a few weeks, before she calculates the variation and nuances within their reports: whether the same old terms always coincide, for instance.

Importantly, she has found that this then determines how well we cope with life. If you are better able to pin down whether you are feeling despair or anxiety, for instance, you might be better able to decide how to remedy those feelings: whether to talk to a friend, or watch a funny film. Or being able to identify your hope in the face of disappointment might help you to look for new solutions to your problem.

In this way, emotion vocabulary is a bit like a directory, allowing you to call up a greater number of strategies to cope with life. Sure enough, people who score highly on emotion granularity are better able to recover more quickly from stress and are less likely to drink alcohol as a way of recovering from bad news. It can even improve your academic success. Marc Brackett at Yale University has found that teaching 10 and 11-year-old children a richer emotional vocabulary improved their end-of-year grades, and promoted better behaviour in the classroom. “The more granular our experience of emotion is, the more capable we are to make sense of our inner lives,” he says.

Both Brackett and Barrett agree that Lomas’s “positive lexicography” could be a good prompt to start identifying the subtler contours of our emotional landscape. “I think it is useful – you can think of the words and the concepts they are associated with as tools for living,” says Barrett. They might even inspire us to try new experiences, or appreciate old ones in a new light.
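As a toy illustration of the “emotion granularity” measure described in the excerpt above (a minimal sketch, not Barrett’s exact procedure; the daily ratings and the scoring function here are invented for illustration), one way to quantify how interchangeably a person uses emotion words is to correlate their day-to-day ratings of those words:

```python
# A minimal sketch of an emotion-granularity score. If a person's daily
# ratings of "anxious", "afraid", "angry", and "disgusted" rise and fall
# together, the words are being used as near-synonyms (low granularity);
# weakly correlated ratings suggest distinct feelings (high granularity).
# The data below are invented for illustration.
import numpy as np

# rows = days, columns = 0-10 ratings of four negative-emotion words
ratings = np.array([
    [7, 6, 7, 6],   # a bad day: every word rated high
    [2, 3, 2, 2],
    [5, 5, 6, 5],
    [8, 7, 8, 7],
    [1, 2, 1, 1],
], dtype=float)

def granularity(daily_ratings: np.ndarray) -> float:
    """Return 1 minus the mean pairwise correlation between emotion words.

    Near 0: the words are used interchangeably (low granularity).
    Near 1: each word tracks a distinct feeling (high granularity).
    """
    corr = np.corrcoef(daily_ratings, rowvar=False)    # word-by-word correlations
    pairs = corr[np.triu_indices(corr.shape[0], k=1)]  # unique word pairs only
    return 1.0 - pairs.mean()

print(f"granularity score: {granularity(ratings):.2f}")  # low for this person
```

Barrett’s own studies reportedly use intraclass correlations over weeks of daily reports, but the intuition is the same: the more a person’s emotion words covary, the less differentiated their emotional vocabulary is in practice.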

* * *

And related to all of this is hypocognition, overlapping with linguistic relativity — in how language and concepts determine our experience, identity, and sense of reality — constraining and framing and predetermining what we are even capable of perceiving, thinking about, and expressing:

Hypocognition is a censorship tool that mutes what we can feel
by Kaidi Wu

It is a strange feeling, stumbling upon an experience that we wish we had the apt words to describe, a precise language to capture. When we don’t, we are in a state of hypocognition, which means we lack the linguistic or cognitive representation of a concept to describe ideas or interpret experiences. The term was introduced to behavioural science by the American anthropologist Robert Levy, who in 1973 documented a peculiar observation: Tahitians expressed no grief when they suffered the loss of a loved one. They fell sick. They sensed strangeness. Yet, they could not articulate grief, because they had no concept of grief in the first place. Tahitians, in their reckoning of love and loss, and their wrestling with death and darkness, suffered not from grief but a hypocognition of grief. […]

But the darkest form of hypocognition is one born out of motivated, purposeful intentions. A frequently overlooked part of Levy’s treatise on Tahitians is why they suffered from a hypocognition of grief. As it turns out, Tahitians did have a private inkling of grief. However, the community deliberately kept the public knowledge of the emotion hypocognitive to suppress its expression. Hypocognition was used as a form of social control, a wily tactic to expressly dispel unwanted concepts by never elaborating on them. After all, how can you feel something that doesn’t exist in the first place?

Intentional hypocognition can serve as a powerful means of information control. In 2010, the Chinese rebel writer Han Han told CNN that any of his writings containing the words ‘government’ or ‘communist’ would be censored by the Chinese internet police. Ironically, these censorship efforts also muffled an abundance of praise from pro-leadership blogs. An effusive commendation such as ‘Long live the government!’ would be censored too, for the mere mention of ‘government’.

A closer look reveals the furtive workings of hypocognition. Rather than rebuking negative remarks and rewarding praises, the government blocks access to any related discussion altogether, rendering any conceptual understanding of politically sensitive information impoverished in the public consciousness. ‘They don’t want people discussing events. They simply pretend nothing happened… That’s their goal,’ Han Han said. Regulating what is said is more difficult than ensuring nothing is said. The peril of silence is not a suffocation of ideas. It is to engender a state of blithe apathy in which no idea is formed.

Do To Yourself As You Would Do For Others

“…our impulse control is less based on an order from our executive command center, or frontal cortex, and more correlated with the empathic part of our brain. In other words, when we exercise self-control, we take on the perspective of our future self and empathize with that self’s perspectives, feelings, and motivations.”
~ Alexander Soutschek

Self-control is rooted in self-awareness. Julian Jaynes and Brian McVeigh, in one of their talks, brought up the idea that “mind space” has increased over time: “The more things we think about, the more distinctions we make in our consciousness between A and B, and so on, the more mind-space there is” (Discussions with Julian Jaynes, ed. by Brian J. McVeigh, p. 40). The first expansion was the creation of introspective consciousness itself. Narratization allowed that consciousness to also extend across time, to imagine possibilities and play out scenarios and consider consequences. Empathy, as we experience it, might be a side effect of this, as consciousness includes more and more within it, including empathy with our imagined future self. So, think of self-control as being kind to yourself, to your full temporal self, not only your immediate self.

This would relate to the suggestion that humans learn theory of mind, the basis of cognitive empathy, first by observing others and only later applying it to themselves. That is to say, the first expansion of mental space, as consciousness, takes root within relationship to others. It’s in realizing that there might be inner experience within someone else that we come to claim inner space in our own experience. So, our very ability to understand ourselves is dependent on empathy with others. This was a central purpose of the religions that arose in the Axial Age, the traditions that continue into the modern world* (Tahere Salehi, The Effect of Training Self-Control and Empathy According to Spirituality on Self-Control and Empathy of Preschool Female Students in Shiraz City). The prophets that emerged during that era taught love, compassion, and introspection, not only as an otherworldly moral dictum but also as a way of maintaining group coherence and the common good. The breakdown of what Jaynes called the bicameral mind was traumatic, and a new empathic mind was needed to replace it, if only to maintain social order.

Social order has become a self-conscious obsession ever since, as Jaynesian consciousness in its tendency toward rigidity has inherent weaknesses. Social disconnection is a crippling of the mind because the human psyche is inherently social. Imagining our future selves is a relationship with a more expansive sense of self. It’s the same mechanism as relating to any other person. This goes back to Johann Hari’s idea, based on Bruce K. Alexander’s rat park research, that the addict is the ultimate individual. In this context, this ultimate individual lacking self-control is not only disconnected from other people but also disconnected from themselves. Addiction is isolating and isolation promotes addiction. Based on this understanding, I’ve proposed that egoic consciousness is inherently addictive and that post-axial society is dependent on addiction for social control.

But this psychological pattern is seen far beyond addiction. It fits our personal experience of self. When we were severely depressed, we couldn’t imagine or care about the future. This definitely inhibited self-control and led to more impulsive behavior, a present-oriented psychological survival mode. Then again, the only reason self-control is useful at all is because, during and following the Axial Age, humans increasingly lost the capacity to be part of a communal identity, the kind of identity that created the conditions of communal control, the externally perceived commands of archaic authorization through voice-hearing. In losing that communal identity (extended mind/self), we also lost communal empathy, something that now sounds strange or unappealing to the modern mind. In denying our social nature, this casts the shadow of authoritarianism, an oppressive and often violent enforcement of top-down control.

By the way, this isn’t merely about psychology. Lead toxicity causes higher rates of impulsivity and aggression. This is not personal moral failure but brain damage from poisoning. Sure, teaching brain-damaged kids and adults to have more empathy might help them overcome their disability. But if we are to develop an empathic society, we should learn to have enough empathy not to wantonly harm the brains of others with lead toxicity and other causes of stunted development (malnutrition, stress, ACEs, etc.), just because they are poor or minorities and can’t fight back. Maybe we need to first teach politicians and business leaders basic empathy, in overcoming the present dominance of psychopathic traits, so that they can learn the self-control not to harm others.

The part of the brain involving cognitive empathy and theory of mind is generally involved with selflessness and pro-social behavior. To stick with brain development and neurocognitive functioning, let’s look at diet. Weston A. Price, in studying traditional populations that maintained healthy diets, observed what he called moral health in that people seemed kinder, more helpful, and happier — they got along well. A strong social fabric and a culture of trust are not abstractions but are built into general measures of health, in the case of Price’s work, having to do with nutrient-dense animal foods containing fat-soluble vitamins. As the standard American diet has worsened, so has mental health. That is a reason for hope: what diet has harmed, diet might help heal. In an early study on the ketogenic diet as applied to childhood diabetes, the researchers made a side observation that not only did the diabetes symptoms improve but so did behavior. I’ve theorized about how a high-carb diet might be one of the factors that sustains the addictive and egoic self.

Narrow rigidity of the mind, as seen in the extremes of egoic consciousness, has come to be accepted as a social norm and even a social ideal. It is the social Darwinian worldview that has contributed to the rise of both competitive capitalism and the Dark Triad (psychopathy, narcissism, and Machiavellianism), and unsurprisingly it has led to a society that lacks awareness and appreciation of the harm caused to future generations (Scott Barry Kaufman, The Dark Triad and Impulsivity). Rather than being normalized, maybe this dysfunction should be seen as a sickness, not only a soul sickness but a literal sickness of the body-mind that can be scientifically observed and measured, not to mention medically and socially treated. We need to thin the boundaries of the mind so as to expand our sense of self. Research shows that those with such thinner boundaries not only identify more with their future selves but also with their past selves, maintaining a connection to what it felt like to be a child. We need to care for ourselves and others in the way we would protect a child.

* * *

* In their article “Alone and aggressive”, A. William Crescioni and Roy F. Baumeister included the loss of meaning among the effects of social rejection. It is maybe associated with the loss of empathy, specifically in understanding the meaning of others (e.g., the intention ‘behind’ words, gestures, and actions). Meaning traditionally has been the purview of religion. And I’d suggest that it is not a coincidence that the obsession with meaning arose in the Axial Age, right when words were invented for ‘religion’ as a formal institution separate from the rest of society. As Julian Jaynes argues, this was probably in response to the sense of nostalgia and longing that followed the silence of the gods, spirits, and ancestors.

A different kind of social connection had to be taught, but this post-bicameral culture wasn’t and still isn’t as effective in re-creating the strong social bonds of archaic humanity. Periods of moral crisis in fear of societal breakdown have repeated ever since, like a wound that was never healed. I’ve previously written about social rejection and aggressive behavior in relation to this (12 Rules for Potential School Shooters) — about school shooters, I explained:

Whatever they identify or don’t identify as, many and maybe most school shooters were raised Christian and one wonders if that plays a role in their often expressing a loss of meaning, an existential crisis, etc. Birgit Pfeifer and Ruard R. Ganzevoort focus on the religious-like concerns that obsess so many school shooters and note that many of them had religious backgrounds:

“Traditionally, religion offers answers to existential concerns. Interestingly, school shootings have occurred more frequently in areas with a strong conservative religious population (Arcus 2002). Michael Carneal (Heath High School shooting, 1997, Kentucky) came from a family of devoted members of the Lutheran Church. Mitchell Johnson (Westside Middle School shooting, 1998, Arkansas) sang in the Central Baptist Church youth choir (Newman et al. 2004). Dylan Klebold (Columbine shooting, 1999, Colorado) attended confirmation classes in accordance with Lutheran tradition. However, not all school shooters have a Christian background. Some of them declare themselves atheists…” (The Implicit Religion of School Shootings).

Princeton sociologist Katherine Newman, in studying school shootings, has noted that, “School rampage shootings tend to happen in small, isolated or rural communities. There isn’t a very direct connection between where violence typically happens, especially gun violence in the United States, and where rampage shootings happen” (Common traits of all school shooters in the U.S. since 1970).

It is quite significant that these American mass atrocities are concentrated in “small, isolated or rural communities” that are “frequently in areas with a strong conservative religious population”. That might more precisely indicate who these school shooters are and what they are reacting to. Also, one might note that rural areas in general, and specifically in the South, do have high rates of gun-related deaths, although many of them are listed as ‘accidental’; which is to say, most rural shootings involve people who know each other, something also true of school shootings.

* * *

Brain stimulation reveals crucial role of overcoming self-centeredness in self-control
by Alexander Soutschek, Christian C. Ruff, Tina Strombach, Tobias Kalenscher and Philippe N. Tobler

Empathic Self-Control
by David Shoemaker

People with a high degree of self-control typically enjoy better interpersonal relationships, greater social adjustment, and more happiness than those with a low degree of self-control. They also tend to have a high degree of empathy. Further, those with low self-control also tend to have low empathy. But what possible connection could there be between self-control and empathy, given that how one regulates oneself seems to have no bearing on how one views others. Nevertheless, this paper aims to argue for a very tight relation between self-control and empathy, namely, that empathy is in fact one type of self-control. The argument proceeds by exploring two familiar types of self-control, self-control over actions and attitudes, the objects for which we are also responsible. Call the former volitional self-control and the latter rational self-control. But we also seem to be responsible for—and have a certain type of control and self-control over—a range of perceptual states, namely, those in which we come to see from another person’s perspective how she views her valuable ends and what her emotional responses are to their thwarting or flourishing. This type of empathic self-control is a previously-unexplored feature of our interpersonal lives. In addition, once we see that the type of empathy exercised is also exercised when casting ourselves into the shoes of our future selves, we will realize how intra-personal empathy better enables both volitional and rational self-control.

Science Says When Self-Control Is Hard, Try Empathizing With Your Future Self
by Lindsay Shaffer

Soutschek’s study also reveals what happens when we fail to exercise the empathic part of our brain. When Soutschek interrupted the empathic center of the brain in 43 study volunteers, they were more likely to take a small amount of cash immediately over a larger amount in the future. They were also less inclined to share the money with a partner. Soutschek’s study showed that the more people are stuck inside their own perspective, even just from having the empathic part of their brain disrupted, the more likely they are to behave selfishly and impulsively.

Self-Control Is Just Empathy With Your Future Self
by Ed Yong

This tells us that impulsivity and selfishness are just two halves of the same coin, as are their opposites restraint and empathy. Perhaps this is why people who show dark traits like psychopathy and sadism score low on empathy but high on impulsivity. Perhaps it’s why impulsivity correlates with slips among recovering addicts, while empathy correlates with longer bouts of abstinence. These qualities represent our successes and failures at escaping our own egocentric bubbles, and understanding the lives of others—even when those others wear our own older faces.

New Studies in Self Control: Treat Yourself Like You’d Treat Others
from Peak

A new study recently shifted the focus to a different mechanism of self control. Alexander Soutschek and colleagues from the University of Zurich believe self-control may be related to our ability to evaluate our future wants and needs.

The scientists suggest that this takes place in an area of the brain called the rTPJ, which has long been linked to selflessness and empathy for others. It’s an important part of our ability to “take perspectives” and help us step into the shoes of a friend.

The scientists hypothesized that perhaps the rTPJ treats our “future self” the same way it treats any other person. If it helps us step into our friend’s shoes, maybe we can do the same thing for ourselves. For example, if we’re deciding whether to indulge in another pint of beer at a bar, maybe our ability to hold off is related to our ability to imagine tomorrow morning’s hangover. As science writer Ed Yong explains, “Think of self-control as a kind of temporal selflessness. It’s Present You taking a hit to help out Future You.”

Empathy for Your Future Self
by Reed Rawlings

Further Research on the TPJ

The results of Soutschek’s team were similar to past work on empathy, the future self, and the TPJ. It’s believed a better-connected rTPJ increases the likelihood of prosocial behaviors, which relates to skills of executive function. Individuals who exhibit lower empathy score higher for impulsivity – the opposite of self-control.

Keeping our future selves in mind may even keep our savings in check. In this research, a Stanford University team tested “future self-continuity”. They wanted to explore how individuals related to their future selves. Participants were asked to identify how they felt about the overlap between their current and future selves, using a series of Venn diagrams for this exercise.

If they saw themselves as separate, they were more likely to choose immediate rewards. A greater overlap increased the likelihood of selecting delayed rewards. In their final study, they assessed individuals from the San Francisco Bay area. The researchers found a correlation between wealth and an overlap between selves.

While the above research is promising, it doesn’t paint a full picture. Empathy seems useful, but making a sacrifice for our future self requires that we understand the reason behind it. It’s the sacrifice that is especially crucial – positive gains demand negative trade-offs.

That’s where altruism, our willingness to give to others, comes in.

Why Do We Sacrifice?

Research from the University of Zurich examined some of altruism’s driving factors. Their work came up with two correlations. First, the larger your rTPJ, the more likely you are to behave altruistically. Second, concerns about fairness affect how we give.

In this experiment, individuals were more generous if their choice would decrease inequality. When inequality would increase, participants were less likely to give.

This is an understandable human maxim. We have little reason to give to an individual who has more than we do. It feels completely unfair to do so. However, we’re raised to believe that helping those in need is objectively good. Helping ourselves should fall under the same belief.

Empathy and altruism, when focused on our own well-being, are intimately linked. To give selflessly, we need to have a genuine concern for another’s well-being. In this case, the ‘other’ is our future self. Thankfully, with a bit of reflection, each of us can gain a unique insight into our own lives.

Alone and aggressive: Social exclusion impairs self-control and empathy and increases hostile cognition and aggression.
by A. William Crescioni and Roy F. Baumeister
from Bullying, Rejection, and Peer Victimization ed. by Monica J. Harris
pp. 260-271 (full text)

Social Rejection and Emotional Numbing

Initial studies provided solid evidence for a causal relationship between rejection and aggression. The mechanism driving this relationship remained unclear, however. Emotional distress was perhaps the most plausible mediator. Anxiety has been shown to play a role in both social rejection (Baumeister & Tice, 1990) and ostracism (Williams et al., 2000). Emotional distress, however, was not present in these experiments by Twenge et al. (2001). Only one significant mood effect was found, and even this effect deviated from expectations. The sole difference in mood between rejected and accepted participants was a slight decrease in positive affect. Rejected participants did not show any increase in negative affect; rather, they showed a flattening of affect, in particular a decrease in positive affect. This mood difference did not constitute a mediator of the link between rejection and aggression. It did, however, point toward a new line of thinking. It was possible that rejection would lead to emotional numbing rather than causing emotional distress. The flattening of affect seen in the previous set of studies would be consistent with a state of cognitive deconstruction. This state is characterized by an absence of emotion, an altered sense of time, a fixation on the present, a lack of meaningful thought, and a general sense of lethargy (Baumeister, 1990). […]

Rejection and Self-Regulation

Although the emotional numbness and decrease in empathy experienced by rejected individuals play an important role in the link between social rejection and aggression, these effects do not constitute a complete explanation of why rejection leads to aggression. The diminished prosocial motivations experienced by those lacking in empathy can open the door to aggressive behavior, but having less of a desire to do good and having more of a desire to do harm are not necessarily equivalent. A loss of empathy, paired with the numbing effects of rejection, could lead individuals to shy away from those who had rejected them rather than lashing out. Emotional numbness, however, is not the only consequence of social rejection.

In addition to its emotional consequences, social rejection has adverse effects on a variety of cognitive abilities. Social rejection has been shown to decrease intelligent (Baumeister, Twenge, & Nuss, 2002) and meaningful thought (Twenge et al., 2002). But another category of cognitive response is self-regulation. Studies have demonstrated that self-regulation depends upon a finite resource and that acts of self-regulation can impair subsequent attempts to exercise self-control (Baumeister, Bratslavsky, Muraven, & Tice, 1998). Self-regulation has been shown to be an important tool for controlling aggressive impulses. Stucke and Baumeister (2006) found that targets whose ability to self-regulate had been depleted were more likely to respond aggressively to insulting provocation. DeWall, Baumeister, Stillman, and Gailliot (2007) found that diminished self-regulatory resources led to an increase in aggression only in response to provocation; unprovoked participants showed no increase in aggressive behavior. Recall that in earlier work (Twenge et al., 2002) rejected individuals became more aggressive only when the target of their aggression was perceived as having insulted or provoked them. This aggression could have been the result of the diminished ability of rejected participants to regulate their aggressive urges. […]

These results clearly demonstrate that social rejection has a detrimental effect on self-regulation, but they do not explain why this is so and, indeed, the decrement in self-regulation would appear to be counterproductive for rejected individuals. Gaining social acceptance often involves regulating impulses in order to create positive impressions on others (Vohs, Baumeister, & Ciarocco, 2005). Rejected individuals should therefore show an increase in self-regulatory effort if they wish to create new connections or prevent further rejection. The observed drop in self-regulation therefore seems maladaptive. The explanation for this finding lies in rejection’s effect on self-awareness.

Self-awareness is an important prerequisite of conscious self-control (Carver & Scheier, 1981). Twenge et al. (2002) found that, when given the option, participants who had experienced rejection earlier in the study were more likely to sit facing away from rather than toward a mirror. Having participants face a mirror is a common technique for inducing self-awareness (Carver & Scheier, 1981), so participants’ unwillingness to do so following rejection provides evidence of a desire to avoid self-awareness. A drop in self-awareness is part of the suite of effects that comprises a state of cognitive deconstruction. Just as emotional numbness protects rejected individuals from the emotional distress of rejection, a drop in self-awareness would shield against awareness of personal flaws and shortcomings that could have led to that rejection. The benefit of this self-ignorance is that further distress over one’s inadequacies is mitigated. Unfortunately, this protection carries the cost of decreased self-regulation. Because self-regulation is important for positive self-presentation (Vohs et al., 2005), this drop in self-awareness could ironically lead to further rejection. […]

These data suggest that social rejection does not decrease the absolute ability of victims to self-regulate but rather decreases their willingness to exert the effort necessary to do so. Increased lethargy, another aspect of cognitive deconstruction, is consistent with this decrease in self-regulatory effort. Twenge et al. (2002) found that social rejection led participants to give shorter and less detailed explanations of proverbs. Because fully explaining the proverbs would require an effortful response, this shortening and simplification of responses is evidence of increased lethargy amongst rejected participants. This lethargy is not binding, however. When given sufficient incentive, rejected participants were able to match the self-regulatory performance of participants in other conditions. Inducing self-awareness also allowed rejected individuals to self-regulate as effectively as other participants. In the absence of such stimulation, however, rejected individuals showed a decrement in self-regulatory ability that constitutes an important contribution to explaining the link between rejection and aggression. […]

Rejection and Meaningfulness

Twenge et al. (2002) found that social rejection led to a decrease in meaningful thought among participants, as well as an increased likelihood to endorse the statement, “Life is meaningless.” Williams (2002) has also suggested that social rejection ought to be associated with a perception of decreased meaning in life. Given the fundamental nature of the need to belong, it makes sense that defining life as meaningful would be at least in part contingent on the fulfillment of social needs. A recent line of work has looked explicitly at the effect of social rejection on the perception of meaning in life. Perceiving meaning in life has been shown to have an inverse relationship with hostility, aggression, and antisocial attitude (Mascaro, Morey, & Rosen, 2004). As such, any decrease in meaning associated with social rejection would constitute an important feature of the explanation of the aggressive behavior of rejected individuals.

The God of the Left Hemisphere:
Blake, Bolte Taylor and the Myth of Creation
by Roderick Tweedy

The left hemisphere is competitive… the will to power…is the agenda of the left hemisphere. It arose not to communicate with the world but to manipulate it. This inability to communicate or co-operate poses great difficulties for any project of reintegration or union. Its tendency would be to feed off the right hemisphere, to simply use and gain power over it too. Left hemisphere superiority is based, not on a leap forward by the left hemisphere, but on a ‘deliberate’ handicapping of the right. There is perhaps as much chance of persuading the head of a multinational to stop pursuing an agenda of self-interest and ruthless manipulation as there is of persuading the Urizenic program of the brain which controls him of “resubmitting” itself to the right hemisphere’s values and awareness.

The story of the Western world being one of increasing left-hemispheric domination, we would not expect insight to be the key note. Instead we would expect a sort of insouciant optimism, the sleepwalker whistling a happy tune as he ambles towards the abyss.

The left, rational, brain, it might be safe to conclude, has no idea how serious the problem is, that is to say, how psychopathic it has become. Of course, it doesn’t care that it doesn’t care. “The idiot Reasoner laughs at the Man of Imagination/And from laughter proceeds to murder by undervaluing calumny”, noted Blake in a comment that is only remarkable for the fact that it has taken two hundred years to understand.

The apparently “conscious” rational self, the driving program and personality of the left brain, turns out to be deeply unconscious, a pathological sleepwalker blithely poisoning its own environment whilst tenaciously clinging onto the delusion of its own rightness. This unfortunate mixture, of arrogance and ignorance, defines contemporary psychology. The left hemisphere not only cannot see that there is a problem, it cannot see that it is itself the problem.

Battle of Voices of Authorization in the World and in Ourselves

New Feelings: Podcast Passivity
by Suzannah Showler

My concern is that on some level, I’m prone to mistake any voice that pours so convincingly into my brain for my own. And maybe it’s not even a mistake, per se, so much as a calculated strategy on the part of my ego to maintain its primacy, targeting and claiming any foreign object that would stray so far into the inner-sanctum of my consciousness. Whether the medium is insidious, my mind a greedy assimilation machine, or both, it seems that at least some of the time, podcasts don’t just drown out my inner-monologue — they actually overwrite it. When I listen to a podcast, I think some part of me believes I’m only hearing myself think.

Twentieth-century critics worried about this, too. Writing sometime around the late 1930s, Theodor Adorno theorized that a solitary listener under the influence of radio is vulnerable to persuasion by an anonymous authority. He writes: “The deeper this [radio] voice is involved within his own privacy, the more it appears to pour out of the cells of his more intimate life; the more he gets the impression that his own cupboard, his own photography, his own bedroom speaks to him in a personal way, devoid of the intermediary stage of the printed words; the more perfectly he is ready to accept wholesale whatever he hears. It is just this privacy which fosters the authority of the radio voice and helps to hide it by making it no longer appear to come from outside.”

I’ll admit that I have occasionally been gripped by false memories as a result of podcasts — been briefly sure that I’d seen a TV show I’d never watched, or convinced that it was a friend, not a professional producer, who told me some great anecdote. But on the whole, my concern is less that I am being brainwashed and more that I’m indulging in something deeply avoidant: filling my head with ideas without actually having to do the messy, repetitive, boring, or anxious work of making meaning for myself. It’s like downloading a prefabbed stream of consciousness and then insisting it’s DIY. The effect is twofold: a podcast distracts me from the tedium of being alone with myself, while also convincingly building a rich, highly-produced version of my inner life. Of course that’s addictive — it’s one of the most effective answers to loneliness and self-importance I can imagine.

Being Your Selves: Identity R&D on alt Twitter
by Aaron Z. Lewis

Digital masks are making the static and immortal soul of the Renaissance seem increasingly out of touch. In an environment of info overload, it’s easy to lose track of where “my” ideas come from. My brain is filled with free-floating thoughts that are totally untethered from the humans who came up with them. I speak and think in memes — a language that’s more like the anonymous manuscript culture of medieval times than the individualist Renaissance era. Everything is a remix, including our identities. We wear our brains outside of our skulls and our nerves outside our skin. We walk around with other people’s voices in our heads. The self is in the network rather than a node.

The ability to play multiple characters online means that the project of crafting your identity now extends far beyond your physical body. In his later years, McLuhan predicted that this newfound ability would lead to a society-wide identity crisis:

The instant nature of electric-information movement is decentralizing — rather than enlarging — the family of man into a new state of multitudinous tribal existences. Particularly in countries where literate values are deeply institutionalized, this is a highly traumatic process, since the clash of old segmented visual culture and the new integral electronic culture creates a crisis of identity, a vacuum of the self, which generates tremendous violence — violence that is simply an identity quest, private or corporate, social or commercial.

As I survey the cultural landscape of 2020, it seems that McLuhan’s predictions have unfortunately come true. More than ever before, people are exposed to a daily onslaught of world views and belief systems that threaten their identities. Social media has become the battlefield for a modern-day Hobbesian war of all-against-all. And this conflict has leaked into the allegedly “offline” world.

“Individuation is not the culmination of the person; it is the end of the person.”

Julian Jaynes and the Jaynesian scholars have made a compelling argument about where egoic consciousness originated and how it formed. But in all the Jaynesian literature, I don’t recall anyone suggesting how to undo egoic consciousness, much less suggesting we should attempt annihilation of the demiurgic ego.

That latter project is what preoccupied Carl Jung, and it is what Peter Kingsley has often written about. They suggest it is not only possible but inevitable. In a sense, the ego is already dead and we are already in the underworld. We are corpses and our only task is to grieve.

The Cry of Merlin: Carl Jung and the Insanity of Reason
Gregory Shaw on Peter Kingsley

Kingsley explains that Jung emulated these magicians, and his journey through the Underworld followed the path of Pythagoras, Parmenides and Empedocles. Jung translated the terminology of the ancients into “scientific” terms, calling the initiation he realized in the abyss “individuation.” For Jungians today, individuation is the culmination of psychic development, as if it were our collective birthright. Yet Kingsley points out that this notion of individuation is a domestication, commodification, and utter distortion of what Jung experienced. Individuation is not the culmination of the person; it is the end of the person. It is the agonizing struggle of becoming a god and a person simultaneously, of living in contradictory worlds, eternity and time.

Kingsley reveals that although individuation is the quintessential myth of Jung’s psychology, it is almost never experienced because no one can bear it. Individuation is the surrendering of the personal to the impersonal, and precisely what Jung experienced it to be, the death of his personality. Jung explains that individuation is a total mystery; the mystery of the Grail that holds the essence of God. According to Henry Corbin, Jung saw “true individuation as becoming God or God’s secret.” Put simply, individuation is deification. To his credit, over twenty years ago Richard Noll argued this point and wrote that Jung experienced deification in the form of the lion-headed Mithras (Leontocephalus), but Kingsley gives the context for deification that Noll does not, and the context is crucial. He shows that Jung’s deification was not an “ego trip” that gave rise to “a religious cult with [Jung] as the totem,” Noll’s assumption; nor was it a “colossal narcissism,” as Ernest Jones suggested, but precisely the opposite. Individuation cuts to the very core of self-consciousness; it is the annihilation of the ego, not its inflation. […]

What is fundamentally important about Catafalque is that Kingsley demonstrates convincingly that Jung recovered the shamanic path exemplified by Pythagoras, Parmenides, and Socrates. Jung tried to save us from the “insanity of reason” by descending to the underworld, serving the archetypes, and disavowing the impiety of “the Greeks” who reduce the sacred to rationalizations. There is much in Catafalque I have not addressed, perhaps the most important being Kingsley’s discussion of the Hebrew prophets who raged against a godless world. Kingsley here appropriately includes Allen Ginsberg’s Howl, which draws from the rhythms of these prophets to wail against the “insanity of America,” its mechanized thinking, suffocating architecture, and the robotic efficiency that is the child of Reason. This almost verbatim mirrors the words of Jung who, after visiting New York, says “suppose an age when the machine gets on top of us …. After a while, when we have invested all our energy in rational forms, they will strangle us… They are the dragons now, they became a sort of nightmare.”

Kingsley ends Catafalque with depressing prophecies about the end of western civilization, both from Jung and from Kingsley himself. The great wave that was our civilization has spent itself. We are in the undertow now, and we don’t even realize it. To read these chapters is to feel as if one is already a corpse. And Kingsley presents this so bluntly, with so much conviction, it is, frankly, disturbing. And even though Kingsley writes that “Quite literally, our western world has come to an end,” I don’t quite believe him. When speaking about Jung giving psychological advice, Kingsley says “make sure you have enough mētis or alertness not to believe him,” and I don’t believe Kingsley’s final message either. Kingsley’s message of doom is both true and false. The entire book has been telling us that we are already dead, that we are already in the underworld, but, of course, we just don’t understand it. So, then he offers us a very physical and literal picture of our end, laced with nuclear fallout and images of contamination. And he forthrightly says the purpose of his work is “to provide a catafalque for the western world.” It is, he says, time to grieve, and I think he is right. We need to grieve for the emptiness of our world, for our dead souls, our empty lives, but this grief is also the only medicine that can revive the collective corpse that we have become. Kingsley is doing his best to show us, without any false hope, the decaying corpse that we are. It is only through our unwavering acceptance, grieving and weeping for this, that we can be healed. In Jung’s terms, only the death of the personal can allow for birth into the impersonal. Into what…? We cannot know. We never will. It is not for our insatiable minds.

The Link Between Individualism and Collectivism

Individualism and collectivism. Autonomy and authoritarianism. These are opposites, right? Maybe not.

Julian Jaynes argued that humans, in the earliest small city-states, lived in a state he called the bicameral mind. It was a shared sense of identity where ‘thoughts’ were more publicly experienced as voices that were culturally inherited across generations. He observed that the rise of egoic consciousness as the isolated and independent self was simultaneous with a shift in culture and social order.

What emerged was a new kind of authoritarianism, much more brutally oppressive, much more centralized, hierarchical, and systematic. As the communal societies of the bicameral mind entered their end phase heading toward the collapse of the Bronze Age, there was the emergence of written laws, court systems, and standing armies. Criminals, enemy soldiers, and captives were treated much more harshly, with mass killings on a scale never before seen. Social order no longer arose organically from the community but required top-down enforcement.

One piece of evidence for this new mentality was the sudden appearance of pornographic imagery. For thousands of years, humans had created art, but never art overtly sexual in nature. Then humans apparently became self-conscious of sexuality and also became obsessed with it. This was also a time when written laws and norms about sexuality became common. With sexual prurience came demands of sexual purity.

Repression was the other side of rigid egoic consciousness: to maintain social control, the new individualized self had to be controlled by society. The organic sense of communal identity could no longer be taken for granted and relied upon. The individual was cut off from the moral force of voice-hearing, and so moral transgression as sin became an issue. This was the ‘Fall of Man’.

What is at stake is not merely an understanding of the past. We are defined by this past, for it lives on within us. We are the heirs of millennia of psycho-cultural transformation. But our historical amnesia and our splintered consciousness leave us adrift amid forces that we don’t understand or recognize. We are confused about why, as we move toward greater individualism, we feel anxious about the looming threat of ever worse authoritarianism. There is a link between the two that is built into Jaynesian consciousness. But this is not fatalism, as if we are doomed to be ripped apart by diametric forces.

If we accept our situation and face the dilemma, we might be able to seek a point of balance. This is seen in Scandinavian countries where it is precisely a strong collective identity, culture of trust, and social democracy, even some democratic socialism, that makes possible a more stable and less fearful sense of genuine individuality (Anu Partanen, The Nordic Theory of Everything; & Nordic Theory of Love and Individualism). What is counter-intuitive to the American sensibility — or rather the American madness — is that this doesn’t require greater legal regulation, as shown by how there is less red tape in starting a business in Scandinavia than in the United States.

A book worth reading is Timothy Carney’s Alienated America. The author comes from the political right, but he is not a radical right-winger. His emphasis is on social conservatism, although the points he is making are dependent on the liberal viewpoint of social science. Look past some of the conservative biases of interpretation and there is much here that liberals, progressives, and even left-wingers could agree with.

He falls into the anti-government rhetoric of pseudo-libertarianism, which blinds him to how Scandinavian countries can have big governments that rely more on a culture of trust, rather than regulations, to enforce social norms. What Scandinavians would likely find odd is this American right-wing belief that government is separate from society, even when society isn’t outright denied, as it was by Margaret Thatcher.

It’s because of this confusion that his other insights are all the more impressive. He is struggling against his own ideological chains. It shows how, even as the rhetoric maintains power over the mind, certain truths are beginning to shine through the weakening points of ideological fracture.

Even so, he ultimately fails to escape the gravity of right-wing ideological realism, coming to the opposite conclusion of Anu Partanen, who understands that it is precisely the individual’s relationship to the state that allows for individual freedom. Carney, instead, wants to throw out both ‘collectivism’ and ‘hyper-individualism’. He expresses the still potent longing for the bicameral mind and its archaic authorization to compel social order.

What he misses is that this longing itself is part of the post-bicameral trap of Jaynesian consciousness, as the more one seeks to escape the dynamic the more tightly wound one becomes within its vice grip. It is only in holding lightly one’s place within the dynamic that one can steer a pathway through the narrow gap between the distorted extremes of false polarization and forced choice. This is exaggerated specifically by high inequality, not only of wealth but more importantly of resources and opportunities, power and privilege.

High inequality is correlated with mental illness, conflict, aggressive behavior, status anxiety, social breakdown, loss of social trust, political corruption, crony capitalism, etc. Collectivism and individualism may only express as authoritarianism and hyper-individualism under high inequality conditions. For some reason, many conservatives and right-wingers not only seem blind to the harm of inequality but, if anything, embrace it as a moral good expressing a social Darwinian vision of capitalist realism that must not be questioned.

Carney points to the greater social and economic outcomes of Scandinavian countries. But he can’t quite comprehend why such a collectivist society doesn’t have the problems he ascribes to collectivism. He comes so close to such an important truth, only to veer again back into the safety of right-wing ideology. Still, just the fact that, as a social conservative concerned for the public good, he feels morally compelled to acknowledge the kinds of things left-wingers have been talking about for generations shows that maybe we are finally coming to a point of reckoning.

Also, it is more than relevant that this is treading into the territory of Jaynesian thought, although the author has no clue how deep and dark are the woods once he leaves the well-beaten path. Even the briefest of forays shows how much has been left unexplored.

* * *

Alienated America:
Why Some Places Thrive While Others Collapse
by Timothy P. Carney

Two Sides of the Same Coin

“Collectivism and atomism are not opposite ends of the political spectrum,” Yuval Levin wrote in The Fractured Republic, “but rather two sides of one coin. They are closely related tendencies, and they often coexist and reinforce one another—each making the other possible.” 32

“The Life of Julia” is clearly a story of atomization, but it is one made possible by the story of centralization: The growth of the central state in this story makes irrelevant—and actually difficult—the existence of any other organizations. Julia doesn’t need to belong to anything because central government, “the one thing we all belong to” (the Democratic Party’s mantra in that election), 33 took care of her needs.

This is the tendency of a large central state: When you strengthen the vertical bonds between the state and the individual, you tend to weaken the horizontal bonds between individuals. What’s left is a whole that by some measures is more cohesive, but individuals who are individually all less connected to one another.

Tocqueville foresaw this, thanks to the egalitarianism built into our democracy: “As in centuries of equality no one is obliged to lend his force to those like him and no one has the right to expect great support from those like him, each is at once independent and weak.

“His independence fills him with confidence and pride among his equals, and his debility makes him feel, from time to time, the need of the outside help that he cannot expect from any of them, since they are all impotent and cold.”

Tocqueville concludes, “In this extremity he naturally turns his regard to the immense being that rises alone in the midst of universal debasement.” 34

The centralizing state is the first step in this. The atomized individual is the end result: There’s a government agency to feed the hungry. Why should I do that? A progressive social philosophy, aimed at liberating individuals by means of a central state that provides their basic needs, can actually lead to a hyper-individualism.

According to some lines of thought, if you tell a man he has an individual duty to his actual neighbor, you are enslaving that man. It’s better, this viewpoint holds, to have the state carry out our collective duty to all men, and so no individual has to call on any other individual for what he needs. You’re freed of both debt to your neighbor (the state is taking care of it) and need (the state is taking care of it).

When Bernie Sanders says he doesn’t believe in charity, and his partymates say “government is the name for the things we do together,” the latter can sound almost like an aspiration —that the common things, and our duties to others, ought to be subsumed into government. The impersonality is part of the appeal, because everyone alike is receiving aid from the nameless bureaucrats and is thus spared the indignity of asking or relying on neighbors or colleagues or coparishioners for help.

And when we see the state crowding out charity and pushing religious organizations back into the corner, it’s easy to see how a more ambitious state leaves little oxygen for the middle institutions, thus suffocating everything between the state and the individual.

In these ways, collectivism begets atomization.

Christopher Lasch, the leftist philosopher, put it in the terms of narcissism. Paternalism, and the transfer of responsibility from the individual to a bureaucracy of experts, fosters a narcissism among individuals, Lasch argued. 35 Children are inherently narcissistic, and a society that deprives adults of responsibility will keep them more childlike, and thus more self-obsessed.

It’s also true that hyper-individualism begets collectivism. Hyper-individualism doesn’t work as a way of life. Man is a political animal and is meant for society. He needs durable bonds to others, such as those formed in institutions like a parish, a sports club, or a school community. Families need these bonds to other families as well, regardless of what Pa in Little House on the Prairie seemed to think at times.

The little platoons of community provide role models, advice, and a safety net, and everyone needs these things. An individual who doesn’t join these organizations soon finds himself deeply in need. The more people in need who aren’t cared for by their community, the more demand there is for a large central state to provide the safety net, the guidance, and the hand-holding.

Social scientists have repeatedly come across a finding along these lines. “[G]overnment regulation is strongly negatively correlated with measures of trust,” four economists wrote in MIT’s Quarterly Journal of Economics. The study relied on an international survey in which people were asked, “Generally speaking, would you say that most people can be trusted or that you need to be very careful in dealing with people?” The authors also looked at answers to the question “Do you have a lot of confidence, quite a lot of confidence, not very much confidence, no confidence at all in the following: Major companies? Civil servants?”

They found, among other examples:

High-trusting countries such as Nordic and Anglo-Saxon countries impose very few controls on opening a business, whereas low-trusting countries, typically Mediterranean, Latin-American, and African countries, impose heavy regulations. 36

The causality here goes both ways. In less trusting societies, people demand more regulation, and in more regulated societies, people trust each other less. This is the analogy of the Industrial Revolution’s vicious circle between Big Business and Big Labor: The less trust in humanity there is, the more rules crop up. And the more rules, the less people treat one another like humans, and so on.

Centralization of the state weakens the ties between individuals, leaving individuals more isolated, and that isolation yields more centralization.

The MIT paper, using economist-speak, concludes there are “two equilibria” here. That is, a society is headed toward a state of either total regulation and low trust, or low regulation and high trust. While both destinations might fit the definition of equilibrium, the one where regulation replaces interpersonal trust is not a fitting environment for human happiness.

On a deeper level, without a community that exists on a human level—somewhere where everyone knows your name, to borrow a phrase—a human can’t be fully human. To bring back the language of Aristotle for a moment, we actualize our potential only inside a human-scaled community.

And if you want to know what happens to individuals left without a community in which to live most fully as human, where men and women are abandoned, left without small communities in which to flourish, we should visit Trump Country.

Jaynesian Linguistic Relativity

  • “All of these concrete metaphors increase enormously our powers of perception of the world about us and our understanding of it, and literally create new objects. Indeed, language is an organ of perception, not simply a means of communication.”
  • “The lexicon of language, then, is a finite set of terms that by metaphor is able to stretch out over an infinite set of circumstances, even to creating new circumstances thereby.”
  • “The bicameral mind with its controlling gods was evolved as a final stage of the evolution of language. And in this development lies the origin of civilization.”
  • “For if consciousness is based on language, then it follows that it is of much more recent origin than has been heretofore supposed. Consciousness comes after language! The implications of such a position are extremely serious.”
  • “But there’s no doubt about it, Whorfian hypothesis is true for some of the more abstract concepts we have. Certainly, in that sense, I would certainly be a Whorfian. But I don’t think Whorf went far enough.”
    ~Julian Jaynes

Julian Jaynes, in The Origin of Consciousness in the Breakdown of the Bicameral Mind, makes statements that clearly express a view of linguistic relativity, also known as the Sapir-Whorf hypothesis or Whorfian hypothesis, whether or not he endorsed the related strong form of linguistic determinism, although the above quotes do point toward the strong form. Edward Sapir and Benjamin Lee Whorf, by the way, weren’t necessarily arguing for the determinism that was later ascribed to them, or at least to Whorf (Straw Men in the Linguistic Imaginary). Yet none of Jaynes’ writings ever directly refer to this other field of study or the main thinkers involved, even though it is one of the fields closest to his own hypothesis on language and metaphor in relation to perception, cognition, and behavior. It’s also rare to see this connection come up in the writings of any Jaynesian scholars. There apparently isn’t even a single mention, even in passing, in the discussion forum at the official site of the Julian Jaynes Society (no search results were found for: Edward Sapir, Benjamin Lee Whorf, Sapir-Whorf, Whorfian, Whorfianism, linguistic relativity, linguistic relativism, or linguistic determinism), although I found a few writings elsewhere that touch upon this area of overlap (see end of post). Besides myself, someone finally linked to an article about linguistic relativity in the Facebook group dedicated to his book (also see below).

Limiting ourselves to published work, the one and only significant exception I’ve found is a passing mention from Brian J. McVeigh in his book The “Other” Psychology of Julian Jaynes: “Also, since no simple causal relation between language and interiorized mentation exists, an examination of how a lexicon shapes psychology is not necessarily a Sapir-Whorfian application of linguistic theory.” But since Sapir and Whorf didn’t claim a simple causal relation, this leads me to suspect that McVeigh isn’t overly familiar with their scholarship or widely read in the more recent research. But if I’m misunderstanding him and he has written more fully elsewhere about this, I’d love to read it (I own some of his books, and I enjoy and highly respect McVeigh’s work, as I might consider him the leading Jaynesian scholar). When I brought this up in a Julian Jaynes Facebook group, Paul Otteson responded that, “my take on linguistic relativism and determinism is that they are obvious.” But obviously, it isn’t obvious to many others, including some Jaynesian scholars who are academic experts on linguistic analysis of texts and culture, as is the case with McVeigh. “For many of us,” Jeremy Lent wrote in The Patterning Instinct, “the idea that the language we speak affects how we think might seem self-evident, hardly requiring a great deal of scientific proof. However, for decades, the orthodoxy of academia has held categorically that the language a person speaks has no effect on the way they think. To suggest otherwise could land a linguist in such trouble that she risked her career. How did mainstream academic thinking get itself in such a straitjacket?” (quoted in Straw Men in the Linguistic Imaginary).

Jaynes focused heavily on how metaphors shape an experience of interiorized and narratized space, i.e., a specific way of perceiving space and time in relation to identity. More than relevant is the fact that, in linguistic relativity research, how language shapes spatial and temporal perception has also been a key area of study. Linguistic relativity has gained compelling evidence in recent decades. And several great books have been written exploring and summarizing the evidence: Vyvyan Evans’s The Language Myth, Guy Deutscher’s Through the Language Glass, Benjamin K. Bergen’s Louder Than Words, Aneta Pavlenko’s The Bilingual Mind, Jeremy Lent’s The Patterning Instinct, Caleb Everett’s Linguistic Relativity and Numbers and the Making of Us (maybe include Daniel L. Everett’s Dark Matter of the Mind, Language: The Cultural Tool, and How Language Began). This would be a fruitful area for Jaynesian thought, not to mention it would help it break out into wider scholarly interest. The near silence is surprising because of the natural affinity between the two groups of thinkers. (Maybe I’m missing something. Does anyone know of a Jaynesian scholar exploring linguistic relativity, a linguistic relativity scholar studying Jaynesianism, or any similar crossover?)

What makes it odd to me is that Jaynes was clearly influenced by linguistic relativity, if not directly then indirectly. Franz Boas’ theories on language and culture shaped the linguistic relativists along with the thinkers read by Jaynes, specifically Ruth Benedict. Jaynes was caught up in a web of influences that brought him into the sphere of linguistic relativity and related anthropological thought, along with philology, much of it going back to Boas: “Julian Jaynes had written about the comparison of shame and guilt cultures. He was influenced in this by E. R. Dodds (and Bruno Snell). Dodds in turn based some of his own thinking about the Greeks on the work of Ruth Benedict, who originated the shame and guilt culture comparison in her writings on Japan and the United States. Benedict, like Margaret Mead, had been taught by Franz Boas. Boas developed some of the early anthropological thinking that saw societies as distinct cultures” (My Preoccupied Mind: Blogging and Research).

Among these thinkers, there is an interesting Jungian influence as well: “Boas founded a school of thought about the primacy of culture, the first major challenge to race realism and eugenics. He gave the anthropology field new direction and inspired a generation of anthropologists. This was the same era during which Jung was formulating his own views. As with Jung before him, Jaynes drew upon the work of anthropologists. Both also influenced anthropologists, but Jung’s influence of course came earlier. Even though some of these early anthropologists were wary of Jungian psychology, such as archetypes and collective unconscious, they saw personality typology as a revolutionary framework (those influenced also included the likes of Edward Sapir and Benjamin Lee Whorf, both intellectual heirs of Boas, who maybe was the source of introducing linguistic relativity into American thought). Through personality types, it was possible to begin understanding what fundamentally made one mind different from another, a necessary factor in distinguishing one culture from another” (The Psychology and Anthropology of Consciousness). The following is from Sonu Shamdasani’s Jung and the Making of Modern Psychology (Kindle Locations 4706-4718):

“The impact of Jung’s typology on Ruth Benedict may be found in her concept of Apollonian and Dionysian culture patterns which she first put forward in 1928 in “Psychological Types in the cultures of the Southwest,” and subsequently elaborated in Patterns of Culture. Mead recalled that their conversations on this topic had in part been shaped by Sapir and Oldenweiser’s discussion of Jung’s typology in Toronto in 1924 as well as by Seligman’s article cited above (1959, 207). In Patterns of Culture, Benedict discussed Wilhelm Worringer’s typification of empathy and abstraction, Oswald Spengler’s of the Apollonian and the Faustian and Friedrich Nietzsche’s of the Apollonian and the Dionysian. Conspicuously, she failed to cite Jung explicitly, though while criticizing Spengler, she noted that “It is quite as convincing to characterize our cultural type as thoroughly extravert … as it is to characterize it as Faustian” (1934, 54-55). One gets the impression that Benedict was attempting to distance herself from Jung, despite drawing some inspiration from his Psychological Types.

“In her autobiography, Mead recalls that in the period that led up to her Sex and Temperament, she had a great deal of discussion with Gregory Bateson concerning the possibility that aside from sex difference, there were other types of innate differences which “cut across sex lines” (1973, 216). She stated that: “In my own thinking I drew on the work of Jung, especially his fourfold scheme for grouping human beings as psychological types, each related to the others in a complementary way” (217). Yet in her published work, Mead omitted to cite Jung’s work. A possible explanation for the absence of citation of Jung by Benedict and Mead, despite the influence of his typological model, was that they were developing diametrically opposed concepts of culture and its relation to the personality to Jung’s. Ironically, it is arguably through such indirect and half-acknowledged conduits that Jung’s work came to have its greatest impact upon modern anthropology and concepts of culture. This short account of some anthropological responses to Jung may serve to indicate that when Jung’s work was engaged with by the academic community, it was taken to quite different destinations, and underwent a sea change.”

As part of the intellectual world that shaped Jaynes’ thought, this Jungian line of influence feeds into the Boasian line of influence. But interestingly, in the Jaynesian sphere, the Jungian side of things is the least obvious component. Certainly, Jaynes didn’t see the connection, despite Jung’s Jaynesian-like comments about consciousness long before Jaynes wrote about it in 1976. Jung, writing in 1960, stated that, “There is in my opinion no tenable argument against the hypothesis that psychic functions which today seem conscious to us were once unconscious and yet worked as if they were conscious” (On the Nature of the Psyche; see post). And four years later he wrote that, “Consciousness is a very recent acquisition of nature” (Man and His Symbols; see post). In distancing himself from Jung, Jaynes was somewhat critical, though not dismissive: “Jung had many insights indeed, but the idea of the collective unconscious and of the archetypes has always seemed to me to be based on the inheritance of acquired characteristics, a notion not accepted by biologists or psychologists today” (quoted by Philip Ardery in “Ramifications of Julian Jaynes’s theory of consciousness for traditional general semantics“). His criticism was inaccurate, though, since Jung’s actual position was that, “It is not, therefore, a question of inherited ideas but of inherited possibilities of ideas” (What is the Blank Slate of the Mind?). So, in actuality, Jaynes’ view on this point appears to be right in line with that of Jung. This further emphasizes the unacknowledged Jungian influence.

I never see this kind of thing come up in Jaynesian scholarship. It makes me wonder how many Jaynesian scholars recognize the intellectual debt they owe to Boas and his students, including Sapir and Whorf. More than a half century before Jaynes published his book, a new way of thinking was paving the way. Jaynes didn’t come out of nowhere. Then again, neither did Boas. There are earlier linguistic philosophers such as Wilhelm von Humboldt — from On Language (1836): “Via the latter, qua character of a speech-sound, a pervasive analogy necessarily prevails in the same language; and since a like subjectivity also affects language in the same nation, there resides in every language a characteristic world-view. As the individual sound stands between man and the object, so the entire language steps in between him and the nature that operates, both inwardly and outwardly, upon him. He surrounds himself with a world of sounds, so as to take up and process within himself the world of objects. These expressions in no way outstrip the measure of the simple truth. Man lives primarily with objects, indeed, since feeling and acting in him depend on his presentations, he actually does so exclusively, as language presents them to him. By the same act whereby he spins language out of himself, he spins himself into it, and every language draws about the people that possesses it a circle whence it is possible to exit only by stepping over at once into the circle of another one. To learn a foreign language should therefore be to acquire a new standpoint in the world-view hitherto possessed, and in fact to a certain extent is so, since every language contains the whole conceptual fabric and mode of presentation of a portion of mankind.” The development of thought over time is always fascinating. But schools of thought too easily become narrow and insular over time, forgetting their own roots and becoming isolated from related areas of study. The Boasian lineage and Jaynesian theory have ever since been developing separately but in parallel. Maybe it’s time for them to merge back together or, at the very least, cross-pollinate.

To be fair, linguistic relativity has come up ever so slightly elsewhere in Jaynesian scholarship. As a suggestion, Marcel Kuijsten pointed to John Limber’s chapter “Language and Consciousness” in Reflections on the Dawn of Consciousness. I looked at that Limber piece. He does discuss this broad area of study involving language, thought, and consciousness. But as far as I can tell (based on doing an ebook search for relevant terms), he nowhere discusses Boas, Sapir, or Whorf. At best, he makes an indirect and brief mention of “pre-Whorfian advocates” without even bothering to mention, much less detail, Whorfian advocates or where they came from and how there is a line of influence from Boas to Jaynes. It’s an even more passing comment than McVeigh’s. It is found in note 82: “For reviews of non-Jaynesian ideas on inner speech and consciousness, see Sokolov (1972), Kucaj (1982), Dennett (1991), Nørretranders (1998), and Morin (2005). Vygotsky, of course, was somewhat of a Marxist and probably took something from Marx’s (1859) often cited “It is not the consciousness of men that determines their being, but, on the contrary, their social being that determines their consciousness.” Vygotsky was also influenced by various pre-Whorfian advocates of linguistic relativity. I say “Vygotsky as inspiration” because I have not as yet found much of substance in any of his writings on consciousness beyond that of the Marx quote above. (Several of his papers are available online at http://www.marxists.org.)” So, apparently in the entire Jaynesian literature and commentary, there are only two minuscule acknowledgements that linguistic relativists exist at all (nor is there much reference to similar thinkers like the Marxist Lev Vygotsky; or consider Marx’s theory of species-being; also note the omission of Alfred Korzybski’s General Semantics). Considering the fact that Jaynes was making an argument for linguistic relativity and possibly going so far as linguistic determinism, whether or not he knew it and thought about it that way, this oversight really gets me thinking.

That was where my thought ended, until serendipity brought forth a third example. It is in a passage from one of McVeigh’s more recent books, Discussions with Julian Jaynes (2016). In the June 5, 1991 session of their talks, almost a couple of decades after the publication of his book, Jaynes spoke to McVeigh about this:
McVeigh: “The first thing I want to ask you about is language. Because in our book, language plays an important role, specifically metaphors. And what would you say to those who would accuse you of being too Whorfian? Or how would you handle the charge that you’re saying it is language that determines thought in your book? Or would you agree with the statement, “As consciousness developed, language changed to reflect this transformation?” So, in other words, how do you handle this [type of] old question in linguistics, “Which comes first, the chicken or the egg?””
Jaynes: “Well, you see Whorf applies to some things and doesn’t apply to others, and it’s being carried to a caricature state when somebody, let’s say, shows [a people perceives colors] and they don’t have words for colors. That’s supposed to disprove Whorf. That’s absolutely ridiculous. Because after all, animals, fish have very good color vision. But there’s no doubt about it, Whorfian hypothesis is true for some of the more abstract concepts we have. Certainly, in that sense, I would certainly be a Whorfian. But I don’t think Whorf went far enough. That’s what I used to say. I’m trying to think of the way I would exactly say it. I don’t know. For example, his discussion of time I think it is very appropriate. Indeed, there wouldn’t be such a thing as time without consciousness. No concept of it.”
Jaynes bluntly stated, “I would certainly be a Whorfian.” He said this in response to a direct question McVeigh asked him about being accused of being a Whorfian. There was no dancing around it. Jaynes apparently thought it was obvious enough to not require further explanation. That makes it all the more odd that McVeigh, a Jaynesian scholar who has spent his career studying language, has never since pointed out this intriguing detail. After all, if Jaynes was a Whorfian by his own admission and McVeigh is a Jaynesian scholar, then doesn’t it automatically follow that McVeigh in studying Jaynesianism is studying Whorfianism?

That still leaves plenty of room for interpretation. It’s not clear what Jaynes’ full position was on the Sapir-Whorf hypothesis. Remarkably, he not only identified as a Whorfian but then suggested that he went beyond Whorf. I don’t know what that means, but it does get one wondering. Whorf wasn’t offering any coherent and overarching explanatory theory in the way that Jaynes did. Rather, the Sapir-Whorf hypothesis is more basic in simply suggesting language can influence and maybe sometimes determine thought, perception, and behavior. That is more of a general framework of research that potentially could apply to a wide variety of theories. I’d argue it not only partly but entirely applies to Jaynes’ theory as well — as neither Sapir nor Whorf, as far as I know, were making any assertions for or against the role of language in the formation of consciousness. Certainly, Jaynesian consciousness or the bicameral mind before it would not be precluded according to the Sapir-Whorf linguistic paradigm. Specifically in identifying as Whorfian, Jaynes agrees that, “Whorfian hypothesis is true for some of the more abstract concepts we have.” What does he mean by ‘abstract’ in this context? I don’t recall any of the scholarly and popular texts on linguistic relativity ever describing the power of language as limited to abstractions. Then again, neither did Jaynes directly state it is limited in this fashion, even as he does not elaborate on any other applications. However, McVeigh interpreted his words as implying such a limitation — in the introduction of the book, McVeigh wrote that, “he argues that the relation between words and concepts is not one of simple causation and that the Whorfian hypothesis only works for certain abstract notions. In other words, the relation between language and conscious interiority is subtle and complex.” Well, I’m no expert on the writings of Whorf, but my sense is that Whorf would not necessarily disagree with that assessment. One of the best sources of evidence for such subtlety and complexity might be found in linguistic relativity, a growing field of research. It is the area of overlap that remains terra incognita. I’m not sure anyone knows the details of how linguistic relativity might apply to Jaynesian consciousness as metaphorical mindspace, nor how it might apply the other way around.

* * *

Though reworked a bit, I wrote much of the above about a year ago in the Facebook group Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind. And I just now shared a variation of my thoughts in another post to the same group. This link between the Jaynesian and the Whorfian (along with the Boasian, Marxian, Jungian, etc.) has been on my mind for a while, but it was hard to write about as few others have written about it. There is a fairly large literature of Jaynesian scholarship and an even more vast literature of linguistic relativity research. Yet even passing references to both together are rare. Below are the few examples I could find on the entire world wide web.

Language and thought: A Jaynesian Perspective
by Rachel Williams, Minds and Brains

The Future of Philosophy of Mind
by Rachel Williams, Minds and Brains

Recursion, Linguistic Evolution, Consciousness, the Sapir-Whorf Hypothesis, and I.Q.
by Gary Williams, New Amsterdam Paleoconservative

Rhapsody on Blue
by Chad Hill, the HipCrime Vocab
(a regular commenter on the Facebook group)

Why ancient civilizations couldn’t see the color blue
posted by J Nickolas FitzGerald, Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind Facebook group

* * *

Out of curiosity, I did some less extensive searches, in relation to Julian Jaynes, for some other thinkers, specifically Lev Vygotsky and Alfred Korzybski. The latter showed up to a significant degree in only a single scholarly article on Jaynes’ work (Philip Ardery, Ramifications of Julian Jaynes’s Theory of Consciousness for Traditional General Semantics), although Charles Eisenstein does mention the two thinkers in the same passage of his book The Ascent of Humanity, without making any direct connection or comparison. Vygotsky has greater relevance and indeed comes up more often, including several times on the official Julian Jaynes Society website and in two of the collections of Jaynesian scholarship.

Two of the mentions of Vygotsky on the website are Books Related to Jaynes’s Bicameral Mind Theory and Supplementary Material (for Reflections on the Dawn of Consciousness), with the third offering some slight commentary — Marcel Kuijsten’s Critique 13, from Critiques and Responses: Part 2, where he writes: “For the vast differences between consciousness as described by Jaynes, Dennett, Carruthers, Vygotsky, and others – which is linguistically based and uniquely human – vs. non-linguistic animal cognition, see Peter Carruthers, Language, Thought and Consciousness, Jose Luis Bermudez, Ch. 9, “The Limits of Thinking Without Words,” in Thinking without Words, Lev Vygotsky, Thought and Language, Daniel Dennett, Kinds of Minds, etc.” In the introduction to The Julian Jaynes Collection, Marcel Kuijsten discusses Jaynes’ first hypothesis that consciousness is based on language. Vygotsky is mentioned in passing while explaining the views of another scholar:

“The debate over the importance of language for consciousness has a long history and has seen renewed interest in recent years. While many theorists continue to assume that infants are born conscious (confusing consciousness with sense perception), the work of child psychologist Philip Zelazo strongly supports Jaynes’s argument that consciousness develops in children over time through the acquisition of language. Building on the work of the early twentieth century Russian psychologists Lev Vygotsky and Alexander Luria and the Swiss psychologist Jean Piaget, Zelazo and his colleagues propose a model for the development of consciousness in children that highlights the importance of the interaction between thought and language. 11 Zelazo describes “four major age-related increases” in consciousness in children and corresponding increases in children’s ability to spatialize time. Zelazo’s fourth stage, reflective consciousness, corresponds roughly to Jaynes’s definition of consciousness, whereas Zelazo’s first stage, minimal consciousness, describes what Jaynes would term reactivity or basic sense perception.”

A slightly fuller, if brief, comment on Vygotsky is found in The “Other” Psychology of Julian Jaynes. The author, Brian J. McVeigh, writes that, “An important intellectual descendant of Volkerpsychologie took root in the Soviet Union with the work of the cultural-historical approach of Lev Vygotsky (1896-1934) (1998), Alexander Luria (1902-77) (1976), and Aleksei Leontiev (1903-79) (1978, 2005 [1940]). Vygotsky and Luria (1993 [1930]) emphasized the inherently social nature of mind, language, and thought. Higher mental processes are complex and self-regulating, social in origin, mediated, and “conscious and voluntary in their mode of functioning” (cited in Meshcheriakov 2000: 43; see Wertsch 1985, 1991).”

Interestingly, Rachel Williams, in the above linked post The Future of Philosophy of Mind, also brings up Vygotsky. “Julian Jaynes has already cleared the underbrush to prepare the way for social-linguistic constructivism,” she explains. “And not your Grandpa’s neutered Sapir-Whorf hypothesis either. I’m talking about the linguistic construction of consciousness and higher-order thought itself. In other words, Vygotsky, not Whorf.” So, she obviously thinks Vygotsky is of utmost importance. I must admit that I’m actually not all that familiar with Vygotsky, but I am familiar with how influential he has been on the thought of others. I have greater interest in Korzybski by way of my appreciation for William S. Burroughs’ views of “word virus” and “Control”.

* * *

It should be mentioned that Jaynesian scholarship, in general, is immense in scope. Look at any of the books put out on the topic and you’ll be impressed. Those like Kuijsten and McVeigh are familiar and conversant with a wide variety of scholars and texts. But for whatever reason, certain thinkers haven’t shown up much on their intellectual radars. About the likes of Vygotsky and Korzybski, I feel less surprised that they don’t appear as often in Jaynesian scholarship. Though influential, knowledge of them is limited and I don’t generally see them come up in consciousness studies more broadly. Sapir and Whorf, on the other hand, have had a much larger impact and, over time, their influence has continuously grown. Linguistic relativity has gained a respectability that Jaynesian scholarship still lacks.

I sometimes suspect that Jaynesian scholars, as black sheep in the academic world, are still too worried about respectability. Few serious intellectuals took Jaynes seriously and that still is the case. That also used to be true of Sapir and Whorf, but it has changed. Linguistic relativity, with improved research, has recovered the higher status it held earlier last century. That is the difference for Jaynesian scholarship, which never was respectable. I think that is why Jaynesian scholars so easily ignored or dismissed linguistic relativity. They might’ve been worried about aligning their own theories with another field of study that was, for a generation of scholars, heavily criticized and considered taboo. The lingering stigma of ‘strong’ Whorfianism as linguistic determinism, the idea that we aren’t entirely isolated, autonomous, self-determined free agents, is still not acceptable in mainstream thought in this hyper-individualistic society. But one would think Jaynesian scholars would be sympathetic, as the same charge of heresy is lodged against them.

Whatever motivated Jaynesian scholars in the past, it is definitely long past time to change tack. Linguistic relativity is an area of real-world research that could falsifiably test and potentially demonstrate the validity of Jaynes’ theory. Simply for practical reasons, those wishing to promote Jaynes’ work might be wise to piggyback on these obvious connections into more mainstream thought, such as by mining the work of the popular Daniel Everett and his son Caleb Everett. That would draw Jaynesian scholarship into one of the main battles in all of linguistics: the debate between Daniel Everett and Noam Chomsky about recursion. There is a great opening here for bringing attention to Jaynes, by discussing why recursion is relevant to consciousness studies in general and Jaynesian consciousness in particular. Or better yet, show the commonalities between Jaynes and Jung, considering Jung is one of the most popular thinkers in the Western world. And as I’ve argued in great detail, such larger context has everything to do with the cultural and cognitive differences demonstrated by linguistic relativity.

In general, Jaynesian studies has been trapped in an intellectual backwater. There has yet to be a writer to popularize Jaynes’ views as they apply to the larger world and present society, from politics to culture, from the economy to environmentalism, from media to entertainment. Even among intellectuals and academics, it remains largely unknown and even less understood. This is beginning to change, though. HBO’s Westworld did more than anything to bring Jaynes’ ideas to a larger audience that otherwise would never come across such strange insights into human nature. Placing this radical theory within a science fiction narrative makes it less daunting and threatening to status quo thought. There is nothing like a story to slip a meme past the psychological defenses. Now that a seed has been planted, may it grow in the public mind.

Let me add that my pointed jabs at the Jaynesian world come from a place of love. Jaynes is one of the main inspirations to my thought. And I enjoy reading Jaynesian scholarship more than about any other field. I just want to see it expand, to become even more impressive. Besides, I’ve never been one for respectability, whether in politics or intellectual pursuits. Still, I couldn’t help but feel kind of bad about writing this post. It could be perceived as if all I was doing was complaining. And I realize that my sense of respect for Jaynesian scholars might be less than obvious to someone casually reading it (I tried to remedy that in clarifying my position in the main text above). I didn’t intend it as an attack on those scholars I have learned so much from. But I felt a need to communicate something, even if all I accomplished for the moment was making an observation.

It’s true that, instead of complaining about the omission of linguistic relativity, I could make a positive contribution by simply writing about how linguistic relativity applies to Jaynesian scholarship. If others haven’t shown the connections, the evidence and the examples, well then maybe I should. And I probably will, eventually. But it might take a while before I get around to that project. When I do, it could be a partial continuation of or tangent from my ongoing theorizing about symbolic conflation and such, a tough nut I’ve been trying to crack for years. Still, the omission of linguistic relativity itself somehow seemed significant in my mind. I’m not sure why. This post is basically a way of setting forth a problem to be solved. The significance is that linguistic relativity would offer real-world examples of how Jaynesian views of consciousness, authorization, narratization, etc. might apply to our everyday experience. It would help explain why such complex analysis, intellectually brilliant as it is, is relevant at all to our actual lives.

The Disease of Nostalgia

“The nostalgic is looking for a spiritual addressee. Encountering silence, he looks for memorable signs, desperately misreading them.”
― Svetlana Boym, The Future of Nostalgia

Nostalgia is one of those strange medical conditions from the past, first observed in 17th century soldiers being sent off to foreign lands during that era of power struggles between colonial empires. It’s lost that medical framing since then, as it is now seen as a mere emotion or mood or quality. And it has become associated with the reactionary mind and invented traditions. We no longer take it seriously, sometimes even dismissing it as a sign of immaturity.

But it used to be considered a physiological disease with measurable symptoms, such as brain inflammation, along with serious repercussions, as the afflicted could literally waste away and die. It was a profound homesickness experienced as an existential crisis of identity, a longing for a particular place and the sense of being uprooted from it. Then it shifted from a focus on place to a focus on time. It became more abstract and, because of that, it lost its medical status. This happened simultaneously as a new disease, neurasthenia, took its place in the popular imagination.

In America, nostalgia never took hold to the same degree as it did in Europe. It finally made its appearance in the American Civil War, only to be dismissed as unmanliness and weak character, a defect and deficiency. It was a disease of civilization, but it strongly affected the least civilized, such as rural farmers. America was sold as a nation of progress and so attachment to old ways was deemed un-American. Neurasthenia better fit the mood that the ruling elite sought to promote and, unlike nostalgia, it was presented as a disease of the most civilized, although over time it too became a common malady, specifically as it was Europeanized.

Over the centuries, there was a shift in the sense of time. Up through the early colonial era, a cyclical worldview remained dominant (John Demos, Circles and Lines). As time became linear, there was no possibility of a return. The revolutionary era permanently broke the psychological link between past and future. There was even a revolution in the understanding of ‘revolution’ itself, a term that originated from astrology and literally meant a cyclical return. In a return, there is replenishment. But without that possibility, one is thrown back on individual reserves that are limited and must be managed. The capitalist self of hyper-individualism is finally fully formed. That is what neurasthenia was concerned with and so nostalgia lost its explanatory power. In The Future of Nostalgia, Svetlana Boym writes:

“From the seventeenth to the nineteenth century, the representation of time itself changed; it moved away from allegorical human figures— an old man, a blind youth holding an hourglass, a woman with bared breasts representing Fate— to the impersonal language of numbers: railroad schedules, the bottom line of industrial progress. Time was no longer shifting sand; time was money. Yet the modern era also allowed for multiple conceptions of time and made the experience of time more individual and creative.”

As society turned toward an ethos of the dynamic, it became ungrounded and unstable. Some of the last healthy ties to the bicameral mind were severed. (Interestingly, in early diagnoses of nostalgia as a disease, Boym states that, “One of the early symptoms of nostalgia was an ability to hear voices or see ghosts.” That sounds like the bicameral mind re-emerging under conditions of stress, not unlike John Geiger’s third man factor. In nostalgia as in the archaic mind, there is a secret connection between language and music, as united through voice — see Development of Language and Music and Spoken Language: Formulaic, Musical, & Bicameral.)

Archaic authorization mutated into totalitarianism, a new refuge for the anxiety-riddled mind. And the emerging forms of authoritarianism heavily draw upon the nostalgic turn (Ben G. Price, Authoritarian Grammar and Fundamentalist Arithmetic Part II), just as did the first theocracies (religion, writes Julian Jaynes, is “the nostalgic anguish for the lost bicamerality of a subjectively conscious people”), even as or especially because the respectable classes dismissed it. This is courting disaster, for the archaic mind still lives within us, still speaks in the world, even if the voices are no longer recognized.

The first laments of loss echoed out from the rubble of the Bronze Age and, precisely as the longing has grown stronger, the dysfunctions associated with it have become normalized. But how disconnected and lost in abstractions can we get before either we become something entirely else or face another collapse?

“Living amid an ongoing epidemic that nobody notices is surreal. It is like viewing a mighty river that has risen slowly over two centuries, imperceptibly claiming the surrounding land, millimeter by millimeter. . . . Humans adapt remarkably well to a disaster as long as the disaster occurs over a long period of time”
~E. Fuller Torrey & Judy Miller, Invisible Plague

* * *

As a side note, I’d point to utopia as being the other side of the coin to nostalgia. And so the radical is the twin of the reactionary. In a different context, I said something about shame that could apply equally well to nostalgia (“Why are you thinking about this?”): “The issue of shame is a sore spot where conservatism and liberalism have, from their close proximity, rubbed each other raw. It is also a site of much symbolic conflation, the linchpin like a stake in the ground to which a couple of old warriors are tied in their ritual dance of combat and wounding, where both are so focused on one another that neither pays much attention to the stake that binds them together. In circling around, they wind themselves ever tighter and their tethers grow shorter.”

In conversing with someone on the political left, an old pattern became apparent. This guy, although with a slight radical bent, is a fairly mainstream liberal coming out of the Whiggish tradition of ‘moderate’ progressivism, an ideological mindset that is often conservative-minded and sometimes reactionary (e.g., lesser evil voting no matter how evil it gets). This kind of person is forever pulling their punches. To continue from the same piece, I wrote that, “The conservative’s task is much easier for the reason that most liberals don’t want to untangle the knot, to remove the linchpin. Still, that is what conservatives fear, for they know liberals have that capacity, no matter how unlikely they are to act on it. This fear is real. The entire social order is dependent on overlapping symbolic conflations, each a link in a chain, and so each a point of vulnerability.”

To pull that linchpin would require confronting the concrete issue at hand, getting one’s hands dirty. But that is what the moderate progressive fears, for the liberal mind feels safe and protected within abstractions. Real-world context will always be sacrificed. Such a person mistrusts the nostalgia of the reactionary while maybe fearing even more the utopianism of the radical, flitting back and forth from one to the other and never getting anywhere. So, they entirely retreat from the battle and lose themselves in comforting fantasies of abstract ideals (making them prone to false equivalencies in their dreams of equality). In doing so, despite being well informed, they miss the trees for the forest, miss the reality on the ground for all the good intentions.

Neither nostalgia nor utopianism can offer a solution, even as both indicate the problem. That isn’t to say there is an escape either for that also reinforces the pattern of anxiety, of fear and hope. The narrative predetermines our roles and the possibilities of action. We need a new narrative. The disease model of the human psyche, framed as nostalgia or neurasthenia or depression or anything else, is maybe not so helpful. Yet we have to take seriously that the stress of modernity is not merely something in people’s minds. Scapegoating the individual simply distracts from the failure of individualism. These conditions of identity are both real and imagined — that is what makes them powerful, whatever name they go by and ideology they serve.

* * *

Let me throw out some loose thoughts. There is something that feels off about our society and it is hard to put one’s finger on. That is why, in our free-floating anxiety, we look for anything to grab hold of. Most of the debates that divide the public are distractions from the real issue that we don’t know how to face, much less how to comprehend. These red herrings of social control are what I call symbolic conflation. To put it simply, there is plenty of projecting going on — and it is mutual from all sides involved and it’s extremely distorted.

I’ll leave it at that. What is important for my purposes here is the anxiety itself, the intolerable sense of dissatisfaction or dukkha. Interestingly, this sense gets shifted onto the individual and so further justifies the very individualism that is at the heart of the problem. It is our individuality that makes us feel so ill at ease with the world because it disconnects and isolates us. The individual inevitably fails because individualism is ultimately impossible. We are social creatures through and through. It requires immense effort to create and maintain individuality, and sweet Jesus! is it tiresome. That is the sense of being drained that is common across these many historical conditions, from the earlier melancholia to the present depression and everything in between.

Since the beginning of modernity, there has been a fear that too many individuals are simply not up to the task. When reading about these earlier ‘diseases’, there is a common thread running across the long history. The message is how the individual will be made to get in line with the modern world, not how to get the modern world in line with human nature. The show must go on. Progress must continue. There is no going back, so we’re told. Onward and upward. This strain of endless change and uncertainty has required special effort in enculturating and indoctrinating each new generation. In the Middle Ages and in tribal cultures, children weren’t special but were basically considered miniature adults. There was no protected childhood with an extended period to raise, train, and educate the child. But in our society, the individual has to be made, as does the citizen and the consumer. None of this comes naturally and so it must be artificially imposed. The child will resist and more than a few will come out the other side with severe damage, but the sacrifice must be made for the greater good of society.

This was seen, in the United States, most clearly after the American Revolution. Citizen-making became a collective project. Children needed to be shaped into a civic-minded public. And as seen in Europe, adults needed to be forced into a national identity, even if it required bullying or even occasionally burying a few people alive to get the point across. No stragglers will be allowed! (Nonetheless, a large part of the European population maintained local identities until the world war era.) Turning boys into men became a particular obsession in the early 20th century, with all the building of parks, advocacy for hunting and fishing, creation of the Boy Scouts, and on and on. Boys used to turn into men spontaneously without any needed intervention, but with nostalgia and neurasthenia there was this growing fear of effeminacy and degeneracy. The civilizing project was deemed important and had to be done, no matter how many people were harmed in the process, even to the point of genocide. Creating the modern nation-state was a brutal and often bloody endeavor. No one willingly becomes a modern individual. It only happens under threat of violence and punishment.

By the way, this post is essentially an elaboration on my thoughts from another post, The Crisis of Identity. In that other post, I briefly mention nostalgia, but the focus was more on neurasthenia and related topics. It’s an extensive historical survey. This is part of a longer-term intellectual project of mine, in trying to make sense of this society and how it came to be this way. Below are some key posts to consider, although I leave out those related to Jaynesian and related scholarship because that is a large area of thought all on its own (if interested, look at the tags for Consciousness, Bicameral Mind, Julian Jaynes, and Lewis Hyde):

The Transparent Self to Come?
Technological Fears and Media Panics
Western Individuality Before the Enlightenment Age
Juvenile Delinquents and Emasculated Males
The Breast To Rule Them All
The Agricultural Mind
“Yes, tea banished the fairies.”
Autism and the Upper Crust
Diets and Systems
Sleepwalking Through Our Dreams
Delirium of Hyper-Individualism
The Group Conformity of Hyper-Individualism
Individualism and Isolation
Hunger for Connection
To Put the Rat Back in the Rat Park
Rationalizing the Rat Race, Imagining the Rat Park

* * *

The Future of Nostalgia
by Svetlana Boym
pp. 25-30

Nostalgia was said to produce “erroneous representations” that caused the afflicted to lose touch with the present. Longing for their native land became their single-minded obsession. The patients acquired “a lifeless and haggard countenance,” and “indifference towards everything,” confusing past and present, real and imaginary events. One of the early symptoms of nostalgia was an ability to hear voices or see ghosts. Dr. Albert von Haller wrote: “One of the earliest symptoms is the sensation of hearing the voice of a person that one loves in the voice of another with whom one is conversing, or to see one’s family again in dreams.” 2 It comes as no surprise that Hofer’s felicitous baptism of the new disease both helped to identify the existing condition and enhanced the epidemic, making it a widespread European phenomenon. The epidemic of nostalgia was accompanied by an even more dangerous epidemic of “feigned nostalgia,” particularly among soldiers tired of serving abroad, revealing the contagious nature of the erroneous representations.

Nostalgia, the disease of an afflicted imagination, incapacitated the body. Hofer thought that the course of the disease was mysterious: the ailment spread “along uncommon routes through the untouched course of the channels of the brain to the body,” arousing “an uncommon and everpresent idea of the recalled native land in the mind.” 3 Longing for home exhausted the “vital spirits,” causing nausea, loss of appetite, pathological changes in the lungs, brain inflammation, cardiac arrests, high fever, as well as marasmus and a propensity for suicide. 4

Nostalgia operated by an “associationist magic,” by means of which all aspects of everyday life related to one single obsession. In this respect nostalgia was akin to paranoia, only instead of a persecution mania, the nostalgic was possessed by a mania of longing. On the other hand, the nostalgic had an amazing capacity for remembering sensations, tastes, sounds, smells, the minutiae and trivia of the lost paradise that those who remained home never noticed. Gastronomic and auditory nostalgia were of particular importance. Swiss scientists found that rustic mothers’ soups, thick village milk and the folk melodies of Alpine valleys were particularly conducive to triggering a nostalgic reaction in Swiss soldiers. Supposedly the sounds of “a certain rustic cantilena” that accompanied shepherds in their driving of the herds to pasture immediately provoked an epidemic of nostalgia among Swiss soldiers serving in France. Similarly, Scots, particularly Highlanders, were known to succumb to incapacitating nostalgia when hearing the sound of the bagpipes—so much so, in fact, that their military superiors had to prohibit them from playing, singing or even whistling native tunes in a suggestive manner. Jean-Jacques Rousseau talks about the effects of cowbells, the rustic sounds that excite in the Swiss the joys of life and youth and a bitter sorrow for having lost them. The music in this case “does not act precisely as music, but as a memorative sign.” 5 The music of home, whether a rustic cantilena or a pop song, is the permanent accompaniment of nostalgia—its ineffable charm that makes the nostalgic teary-eyed and tongue-tied and often clouds critical reflection on the subject.

In the good old days nostalgia was a curable disease, dangerous but not always lethal. Leeches, warm hypnotic emulsions, opium and a return to the Alps usually soothed the symptoms. Purging of the stomach was also recommended, but nothing compared to the return to the motherland believed to be the best remedy for nostalgia. While proposing the treatment for the disease, Hofer seemed proud of some of his patients; for him nostalgia was a demonstration of the patriotism of his compatriots who loved the charm of their native land to the point of sickness.

Nostalgia shared some symptoms with melancholia and hypochondria. Melancholia, according to the Galenic conception, was a disease of the black bile that affected the blood and produced such physical and emotional symptoms as “vertigo, much wit, headache, . . . much waking, rumbling in the guts . . . troublesome dreams, heaviness of the heart . . . continuous fear, sorrow, discontent, superfluous cares and anxiety.” For Robert Burton, melancholia, far from being a mere physical or psychological condition, had a philosophical dimension. The melancholic saw the world as a theater ruled by capricious fate and demonic play. 6 Often mistaken for a mere misanthrope, the melancholic was in fact a utopian dreamer who had higher hopes for humanity. In this respect, melancholia was an affect and an ailment of intellectuals, a Hamletian doubt, a side effect of critical reason; in melancholia, thinking and feeling, spirit and matter, soul and body were perpetually in conflict. Unlike melancholia, which was regarded as an ailment of monks and philosophers, nostalgia was a more “democratic” disease that threatened to affect soldiers and sailors displaced far from home as well as many country people who began to move to the cities. Nostalgia was not merely an individual anxiety but a public threat that revealed the contradictions of modernity and acquired a greater political importance.

The outburst of nostalgia both enforced and challenged the emerging conception of patriotism and national spirit. It was unclear at first what was to be done with the afflicted soldiers who loved their motherland so much that they never wanted to leave it, or for that matter to die for it. When the epidemic of nostalgia spread beyond the Swiss garrison, a more radical treatment was undertaken. The French doctor Jourdan Le Cointe suggested in his book written during the French Revolution of 1789 that nostalgia had to be cured by inciting pain and terror. As scientific evidence he offered an account of drastic treatment of nostalgia successfully undertaken by the Russians. In 1733 the Russian army was stricken by nostalgia just as it ventured into Germany, the situation becoming dire enough that the general was compelled to come up with a radical treatment of the nostalgic virus. He threatened that “the first to fall sick will be buried alive.” This was a kind of literalization of a metaphor, as life in a foreign country seemed like death. This punishment was reported to be carried out on two or three occasions, which happily cured the Russian army of complaints of nostalgia. 7 (No wonder longing became such an important part of the Russian national identity.) Russian soil proved to be a fertile ground for both native and foreign nostalgia. The autopsies performed on the French soldiers who perished in the proverbial Russian snow during the miserable retreat of the Napoleonic Army from Moscow revealed that many of them had brain inflammation characteristic of nostalgia.

While Europeans (with the exception of the British) reported frequent epidemics of nostalgia starting from the seventeenth century, American doctors proudly declared that the young nation remained healthy and didn’t succumb to the nostalgic vice until the American Civil War. 8 If the Swiss doctor Hofer believed that homesickness expressed love for freedom and one’s native land, two centuries later the American military doctor Theodore Calhoun conceived of nostalgia as a shameful disease that revealed a lack of manliness and unprogressive attitudes. He suggested that this was a disease of the mind and of a weak will (the concept of an “afflicted imagination” would be profoundly alien to him). In nineteenth-century America it was believed that the main reasons for homesickness were idleness and a slow and inefficient use of time conducive to daydreaming, erotomania and onanism. “Any influence that will tend to render the patient more manly will exercise a curative power. In boarding schools, as perhaps many of us remember, ridicule is wholly relied upon. . . . [The nostalgic] patient can often be laughed out of it by his comrades, or reasoned out of it by appeals to his manhood; but of all potent agents, an active campaign, with attendant marches and more particularly its battles is the best curative.” 9 Dr. Calhoun proposed as treatment public ridicule and bullying by fellow soldiers, an increased number of manly marches and battles and improvement in personal hygiene that would make soldiers’ living conditions more modern. (He also was in favor of an occasional furlough that would allow soldiers to go home for a brief period of time.)

For Calhoun, nostalgia was not conditioned entirely by individuals’ health, but also by their strength of character and social background. Among the Americans the most susceptible to nostalgia were soldiers from the rural districts, particularly farmers, while merchants, mechanics, boatmen and train conductors from the same area or from the city were more likely to resist the sickness. “The soldier from the city cares not where he is or where he eats, while his country cousin pines for the old homestead and his father’s groaning board,” wrote Calhoun. 10 In such cases, the only hope was that the advent of progress would somehow alleviate nostalgia and the efficient use of time would eliminate idleness, melancholy, procrastination and lovesickness.

As a public epidemic, nostalgia was based on a sense of loss not limited to personal history. Such a sense of loss does not necessarily suggest that what is lost is properly remembered and that one still knows where to look for it. Nostalgia became less and less curable. By the end of the eighteenth century, doctors discovered that a return home did not always treat the symptoms. The object of longing occasionally migrated to faraway lands beyond the confines of the motherland. Just as genetic researchers today hope to identify a gene not only for medical conditions but social behavior and even sexual orientation, so the doctors in the eighteenth and nineteenth centuries looked for a single cause of the erroneous representations, one so-called pathological bone. Yet the physicians failed to find the locus of nostalgia in their patient’s mind or body. One doctor claimed that nostalgia was a “hypochondria of the heart” that thrives on its symptoms. To my knowledge, the medical diagnosis of nostalgia survived in the twentieth century in one country only—Israel. (It is unclear whether this reflects a persistent yearning for the promised land or for the diasporic homelands left behind.) Everywhere else in the world nostalgia turned from a treatable sickness into an incurable disease. How did it happen that a provincial ailment, maladie du pays, became a disease of the modern age, mal du siècle?

In my view, the spread of nostalgia had to do not only with dislocation in space but also with the changing conception of time. Nostalgia was a historical emotion, and we would do well to pursue its historical rather than psychological genesis. There had been plenty of longing before the seventeenth century, not only in the European tradition but also in Chinese and Arabic poetry, where longing is a poetic commonplace. Yet the early modern conception embodied in the specific word came to the fore at a particular historical moment. “Emotion is not a word, but it can only be spread abroad through words,” writes Jean Starobinski, using the metaphor of border crossing and immigration to describe the discourse on nostalgia. 11 Nostalgia was diagnosed at a time when art and science had not yet entirely severed their umbilical ties and when the mind and body—internal and external well-being—were treated together. This was a diagnosis of a poetic science—and we should not smile condescendingly on the diligent Swiss doctors. Our progeny well might poeticize depression and see it as a metaphor for a global atmospheric condition, immune to treatment with Prozac.

What distinguishes modern nostalgia from the ancient myth of the return home is not merely its peculiar medicalization. The Greek nostos, the return home and the song of the return home, was part of a mythical ritual. […] Modern nostalgia is a mourning for the impossibility of mythical return, for the loss of an enchanted world with clear borders and values; it could be a secular expression of a spiritual longing, a nostalgia for an absolute, a home that is both physical and spiritual, the edenic unity of time and space before entry into history. The nostalgic is looking for a spiritual addressee. Encountering silence, he looks for memorable signs, desperately misreading them.

The diagnosis of the disease of nostalgia in the late seventeenth century took place roughly at the historical moment when the conception of time and history were undergoing radical change. The religious wars in Europe came to an end but the much prophesied end of the world and doomsday did not occur. “It was only when Christian eschatology shed its constant expectations of the immanent arrival of doomsday that a temporality could have been revealed that would be open to the new and without limit.” 13 It is customary to perceive “linear” Judeo-Christian time in opposition to the “cyclical” pagan time of eternal return and discuss both with the help of spatial metaphors. 14 What this opposition obscures is the temporal and historical development of the perception of time that since Renaissance on has become more and more secularized, severed from cosmological vision.

Before the invention of mechanical clocks in the thirteenth century the question, What time is it? was not very urgent. Certainly there were plenty of calamities, but the shortage of time wasn’t one of them; therefore people could exist “in an attitude of temporal ease. Neither time nor change appeared to be critical and hence there was no great worry about controlling the future.” 15 In late Renaissance culture, Time was embodied in the images of Divine Providence and capricious Fate, independent of human insight or blindness. The division of time into Past, Present and Future was not so relevant. History was perceived as a “teacher of life” (as in Cicero’s famous dictum, historia magistra vitae) and the repertoire of examples and role models for the future. Alternatively, in Leibniz’s formulation, “The whole of the coming world is present and prefigured in that of the present.” 16

The French Revolution marked another major shift in European mentality. Regicide had happened before, but not the transformation of the entire social order. The biography of Napoleon became exemplary for an entire generation of new individualists, little Napoleons who dreamed of reinventing and revolutionizing their own lives. The “Revolution,” at first derived from natural movement of the stars and thus introduced into the natural rhythm of history as a cyclical metaphor, henceforth attained an irreversible direction: it appeared to unchain a yearned-for future. 17 The idea of progress through revolution or industrial development became central to the nineteenth-century culture. From the seventeenth to the nineteenth century, the representation of time itself changed; it moved away from allegorical human figures—an old man, a blind youth holding an hourglass, a woman with bared breasts representing Fate—to the impersonal language of numbers: railroad schedules, the bottom line of industrial progress. Time was no longer shifting sand; time was money. Yet the modern era also allowed for multiple conceptions of time and made the experience of time more individual and creative.

“The Origin of Consciousness, Gains and Losses: Walker Percy vs. Julian Jaynes”
by Laura Mooneyham White
from Gods, Voices, and the Bicameral Mind
ed. by Marcel Kuijsten

Jaynes is plainly one who understands the human yearning for Eden, the Eden of bicameral innocence. He writes of our longings for a return to that lost organization of human mentality, a return to “lost certainty and splendour.” 44 Jones believes, in fact, that Jaynes speaks for himself when he describes the “yearning for divine volition and service [which] is with us still,” 45 of our “nostalgic anguish” which we feel for lost bicamerality. 46 Even schizophrenia, seen from Jaynes’s perspective as a vestige of bicamerality, is the anguishing state it is only because the relapse to bicamerality

is only partial. The learnings that make up a subjective consciousness are powerful and never totally suppressed. And thus the terror and the fury, the agony and the despair. … The lack of cultural support and definition for the voices [heard by schizophrenics] … provide a social withdrawal from the behavior of the absolutely social individual of bicameral societies. … [W]ithout this source of security, … living with hallucinations that are unacceptable and denied as unreal by those around him, the florid schizophrenic is in an opposite world to that of the god-owned laborers of Marduk. … [He] is a mind bared to his environment, waiting on gods in a godless world. 47

Jones, in fact, asserts that Jaynes’s discussion of schizophrenia is held in terms “reminiscent of R. D. Laing’s thesis that schizophrenics are the only sane people in our insane world.” 48 Jones goes on to say that “Jaynes, it would seem, holds that we would all be better off if ‘everyone’ were once again schizophrenic, if we could somehow return to a bicameral society which had not yet been infected by the disease of thinking.” 49

Jaynes does not, in my opinion, intimate a position nearly as reactionary as this; he has in fact made elsewhere an explicit statement to the effect that he himself feels no such longing to return to bicamerality, that he would in fact “shudder” at such a return. 50 Nonetheless, Jaynes does seem at some points in his book to describe introspection as a sort of pathological development in human history. For instance, instead of describing humanity’s move towards consciousness as liberating, Jaynes calls it “the slow inexorable profaning of our species.” 51 And no less an eminence than Northrop Frye recognized this tendency in Jaynes to disvalue consciousness. After surveying Jaynes’s argument and admitting the fascination of that argument’s revolutionary appeal, Frye points out that Jaynes’s ideas provoke a disturbing reflection: “seeing what a ghastly mess our egocentric consciousness has got us into, perhaps the sooner we get back to … hallucinations the better.” Frye expands his discussion of Jaynes to consider the cultural ramifications of this way of thinking, what he terms “one of the major cultural trends of our time”:

It is widely felt that our present form of consciousness, with its ego center, has become increasingly psychotic, incapable of dealing with the world, and that we must develop a more intensified form of consciousness, recapturing many of … Jaynes’ ‘bicameral’ features, if we are to survive the present century. 52

Frye evidently has little sympathy with such a position which would hold that consciousness is a “late … and on the whole regrettable arrival on the human scene” 53 rather than the wellspring of all our essentially human endeavors and achievements: art, philosophy, religion and science. The ground of this deprecatory perspective on consciousness, that is, a dislike or distrust of consciousness, has been held by many modern and postmodern thinkers and artists besides Jaynes, among them Sartre, Nietzsche, Faulkner, Pynchon, Freud, and Lacan, so much so that we might identify such an ill opinion of consciousness as a peculiarly modern ideology.

“Remembrance of Things (Far) Past”
by Julian Jaynes
from The Julian Jaynes Collection
ed. by Marcel Kuijsten

And nostalgia too. For with time metaphored as space, so like the space of our actual lives, a part of us solemnly keeps loitering behind, trying to visit past times as if they were actual spaces. Oh, what a temptation is there! The warm, sullen longing to return to scenes long vanished, to relive some past security or love, to redress some ancient wrong or redecide a past regret, or alter some ill-considered actions toward someone lost to our present lives, or to fill out past omissions — these are artifacts of our new remembering consciousness. Side effects. And they are waste and filler unless we use them to learn about ourselves.

Memory is a privilege for us who are born into the last three millennia. It is both an advantage and a predicament, liberation and an imprisonment. Memory is not a part of our biological evolution, as is our capacity to learn habits or simple knowings. It is an off-shoot of consciousness acquired by mankind only a hundred generations ago. It is thus the new environment of modern man. It is one in which we sometimes are like legal aliens waiting for naturalization. The feeling of full franchise and citizenship in that new environment is a quest that is the unique hidden adventure of us all.

The Suffering System
by David Loy

In order to understand why that anxiety exists, we must relate dukkha to another crucial Buddhist term, anatta, or “non-self.” Our basic frustration is due most of all to the fact that our sense of being a separate self, set apart from the world we are in, is an illusion. Another way to express this is that the ego-self is ungrounded, and we experience this ungroundedness as an uncomfortable emptiness or hole at the very core of our being. We feel this problem as a sense of lack, of inadequacy, of unreality, and in compensation we usually spend our lives trying to accomplish things that we think will make us more real.

But what does this have to do with social challenges? Doesn’t it imply that social problems are just projections of our own dissatisfaction? Unfortunately, it’s not that simple. Being social beings, we tend to group our sense of lack, even as we strive to compensate by creating collective senses of self.

In fact, many of our social problems can be traced back to this deluded sense of collective self, this “wego,” or group ego. It can be defined as one’s own race, class, gender, nation (the primary secular god of the modern world), religion, or some combination thereof. In each case, a collective identity is created by discriminating one’s own group from another. As in the personal ego, the “inside” is opposed to the other “outside,” and this makes conflict inevitable, not just because of competition with other groups, but because the socially constructed nature of group identity means that one’s own group can never feel secure enough. For example, our GNP is not big enough, our nation is not powerful (“secure”) enough, we are not technologically developed enough. And if these are instances of group-lack or group-dukkha, our GNP can never be big enough, our military can never be powerful enough, and we can never have enough technology. This means that trying to solve our economic, political, and ecological problems with more of the same is a deluded response.

Just Smile.

“Pain in the conscious human is thus very different from that in any other species. Sensory pain never exists alone except in infancy or perhaps under the influence of morphine when a patient says he has pain but does not mind it. Later, in those periods after healing in which the phenomena usually called chronic pain occur, we have perhaps a predominance of conscious pain.”
~Julian Jaynes, Sensory Pain and Conscious Pain

I’ve lost count of the number of times I’ve seen a child react to a cut or stumble only after their parent(s) freaked out. Children are highly responsive to adults. If others think something bad has happened, they internalize this and act accordingly. Kids will do anything to conform to expectations. But most kids seem impervious to pain, assuming they don’t get the message that they are expected to put on an emotional display.

This difference can be seen when comparing how a child acts by themselves and how they act around a parent or other authority figure. You’ll sometimes see a kid looking around to see if there is an audience paying attention before crying or having a tantrum. We humans are social creatures and our behavior is always social. This is naturally understood even by infants who have an instinct for social cues and social response.

Pain is a physical sensation, an experience that passes, whereas suffering is in the mind, a story we tell ourselves. This is why trauma can last for decades after a bad experience. The sensory pain is gone but the conscious pain continues. We keep repeating a story.

It’s interesting that some cultures like the Piraha don’t appear to experience trauma from the exact same events that would traumatize a modern Westerner. Neither are depression and anxiety common among them, nor an obsessive fear of death. Not only are the Piraha physically tougher but psychologically tougher as well. Apparently, they tell different stories that embody other expectations.

So, what kind of society is it that we’ve created with our Jaynesian consciousness of traumatized hyper-sensitivity and psychological melodrama? Why are we so attached to our suffering and victimization? What does this story offer us in return? What power does it hold over us? What would happen if we changed the master narrative of our society in replacing the competing claims of victimhood with an entirely different way of relating? What if outward performances of suffering were no longer expected or rewarded?

For one, we wouldn’t have a man-baby like Donald Trump as our national leader. He is the perfect personification of this conscious pain crying out for attention. And we wouldn’t have had the white victimhood that put him into power. But neither would we have any of the other victimhoods that these particular whites were reacting to. The whole culture of victimization would lose its power.

The social dynamic would be something else entirely. It’s hard to imagine what that might be. We’re addicted to the melodrama and we carefully enculturate and indoctrinate each generation to follow our example. To shake us loose from our socially constructed reality would require a challenge to our social order. The extremes of conscious pain aren’t only about our way of behaving. They are inseparable from how we maintain the world we are so desperately attached to.

We need the equivalent of how the father in the cartoon below relates to his son. But we need it on the collective level. Or at least we need this in the United States. What if the rest of the world simply stopped reacting to American leaders and American society? Just smile.

[Cartoon: a father and his son]

Credit: The basic observation and the cartoon was originally shared by Mateus Barboza on the Facebook group “Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind”.