Harriet Tubman: Voice-Hearing Visionary

Origin of Harriet Tubman in the Persistence of the Bicameral Mind

The movie ‘Harriet’ came out this year, amidst pandemic and protest. The portrayal of Harriet Tubman’s life and her strange abilities calls to mind Julian Jaynes’ theory of the bicameral mind, as laid out in his now-classic work, The Origin of Consciousness in the Breakdown of the Bicameral Mind. Some background will help, so let’s begin with what is known of her biography. This famous Underground Railroad conductor was born Araminta Harriet Ross in the early 1820s and, when younger, she was known as ‘Minty’. Her parents were religious, as she would later become. She might also have been exposed to the various church affiliations of her master’s extended family.

These influences were diverse, write James A. McGowan and William C. Kashatus in their book Harriet Tubman: A Biography (pp. 11-12): “As a child, Minty had been told Bible stories by her mother, and she was occasionally forced to attend the services held by Dr. Anthony Thompson, Jr., who was a licensed Methodist minister. But Minty and her parents might also have been influenced by Episcopal, Baptist, and Catholic teachings since the Pattisons, Thompsons, and Brodesses initially belonged to Anglican and Episcopal churches in Dorchester County before they became Methodists. In addition, some of the white Tubmans and Rosses were originally Catholic. Accordingly, Minty’s religious beliefs might have been a composite of several different Christian traditions that were adapted to the evangelical emphasis on spiritual freedom.”

Tubman’s mixed religious background was also noted by Kate C. Larson: “The “creolization” of this family more accurately reflects the blending of cultures from West Africa, Northern Europe, and local Indian peoples in the Chesapeake. As historian Mechal Sobel put it, this was a “world they made together.” By the time Tubman was born, first generation Africans were visible presences in Dorchester County […] Tubman and her family integrated a number of religious practices and beliefs into their daily lives, including Episcopal, Methodist, Baptist, Catholic, and even Quaker teachings, all religious denominations supported by local white masters and their neighbors who were intimately involved with Tubman’s family. Many slaves were required to attend the churches of their owners and temporary masters. Tubman’s religiosity, however, was a deeply personal spiritual experience, rooted in evangelical Christian teachings and familial traditions” (Harriet Ross Tubman).

Other scholars likewise agree, such as Robert Gudmestad: “Like many enslaved people, her belief system fused Christian and African beliefs” (Faith made Harriet Tubman fearless as she rescued slaves). This syncretism was made simpler by the commonalities traditional African religion had with Christianity or particular sects of Christianity: worship of one supreme God, relating to God as a helpful friend who could be heard and talked with (a commonality with Quakerism), belief in an eternal soul and an afterlife, rites and initiations involving immersion in water, etc. Early generations of slaves were often kept out of the churches, which allowed folk religion to take on a life of its own with a slow merging of traditions, such as how African rhythms of mourning were incorporated into Gospel music.

Furthermore, religious fervor was at a peak in the early 1800s and it was part of the world Tubman’s parents lived in and that Tubman was born into. “Both races attended the massive camp meetings so Rit and Ben experienced these sporadic evangelical upsurges”, wrote Margaret Washington (Let The Circle Be Unbroken: The World Of Araminta (“Minty”) Ross Or The Making Of Harriet Tubman). “She grew up during the Second Great Awakening,” Gudmestad explained, “which was a Protestant religious revival in the United States. Preachers took the gospel of evangelical Christianity from place to place, and church membership flourished. Christians at this time believed that they needed to reform America in order to usher in Christ’s second coming.” Some during that restless period believed it was the End Times, as it was easier to imagine the world coming to an end than to imagine it becoming something else.

This would have been personally felt by Tubman. “A number of black female preachers,” Gudmestad goes on to say, “preached the message of revival and sanctification on Maryland’s Eastern Shore. Jarena Lee was the first authorized female preacher in the African Methodist Episcopal Church. It is not clear if Tubman attended any of Lee’s camp meetings, but she was inspired by the evangelist. She came to understand that women could hold religious authority.” The religious fervor was part of a growing political fervor, as the country moved toward Civil War. For blacks, Moses leading his people to freedom inspired more than faith and hope toward the afterlife.

Around the time of Tubman’s birth, there was the failed 1822 revolt planned by Denmark Vesey in South Carolina. Later, in 1831, Nat Turner led his rebellion in nearby Virginia, and that would have been an exciting event for enslaved blacks, especially a lonely young slave girl who at the time was being kept separate from her family and mercilessly whipped. Then throughout her teens and into her early twenties, there were numerous other uprisings: the 1835–1838 Black Seminole Slave Rebellion, the 1839 Amistad seizure, the 1841 Creole case, and the 1842 Slave Revolt in the Cherokee Nation. The Creole case was the most successful slave revolt in United States history. Such tremendous events, one might imagine, could shape a young impressionable mind.

* * *

Harriet Tubman’s Ethno-Cultural Ancestry and Family Inheritance

Someone like Tubman didn’t come out of nowhere. “I am quite willing to acknowledge that she was almost an anomaly among her people,” wrote her early biographer Sarah Bradford, “and so far I can judge they all seem to be particularly intelligent, upright and religious people, and to have a strong feeling of family affection” (Harriet: The Moses of Her People). She earned her strong spirit honestly, from the black culture around her and as modeled by her parents. The spiritual inclinations, as with her knowledge of nature, came from her father: “As a clairvoyant, Minty believed that she inherited this second sense from her father, Ben. […] Listening to Ben’s stories, predictions and sharing his faith convinced Minty that an omniscient force protected her” (Margaret Washington, Let The Circle Be Unbroken: The World Of Araminta (“Minty”) Ross Or The Making Of Harriet Tubman). But it was her mother, in particular, who showed what it meant to be a fiercely protective woman when it came to family. When Tubman returned to free her family, including her elderly parents, she was acting on the values she was raised with:

“Rit struggled to keep her family together as slavery threatened to tear it apart. Edward Brodess sold three of her daughters (Linah, Mariah Ritty, and Soph), separating them from the family forever.[10] When a trader from Georgia approached Brodess about buying Rit’s youngest son, Moses, she hid him for a month, aided by other enslaved people and freedmen in the community.[11] At one point she confronted her owner about the sale.[12] Finally, Brodess and “the Georgia man” came toward the slave quarters to seize the child, where Rit told them, “You are after my son; but the first man that comes into my house, I will split his head open.”[12] Brodess backed away and abandoned the sale.[13] Tubman’s biographers agree that stories told about this event within the family influenced her belief in the possibilities of resistance.[13][14]” (Harriet Tubman, Wikipedia)

Whatever the cause, a strong moral sense developed in Tubman. Somewhere between the ages of twelve and fifteen, there was an incident where she refused to help an overseer catch and tie up a runaway slave. Instead, she stood in front of the door and blocked his way. He threw an iron weight after the escapee, but it came up short and hit her in the head, knocking her unconscious. She later said that it “broke my skull” and, though her master wanted to send her back to work, it took her a long time to recover. “The teenager remained in a coma for weeks,” writes M.W. Taylor, “lying on a bed of rags in the corner of her family’s windowless wooden cabin. Not until the following spring was she able to get up and walk unaided” (Harriet Tubman: Antislavery Activist, p. 16). Kate C. Larson says that, “It took months for her mother to nurse her back to health” (Harriet Ross Tubman).

Ever after, she had seizures and trance-like states (“spells”, “sleeping fits”, or “a sort of stupor or lethargy at times”), premonitions and prophetic visions (“vivid dreams”), and out-of-body and other shamanic-like experiences — possibly caused by temporal lobe epilepsy, narcolepsy, cataplexy, or hypersomnia. She claimed to have heard the voice of God that guided and protected her, that He “spoke directly to my soul”. She “prayed all the time” and “was always talking to the Lord”: “When I went to the horse trough to wash my face, and took up the water in my hands, I said, ‘Oh Lord, wash me, make me clean.’ When I took up the towel to wipe my face and hands, I cried, ‘Oh Lord, for Jesus’ sake, wipe away all my sins!’ ” (Sarah H. Bradford, Harriet, p. 11).

“During these hallucinatory states,” writes Gordon S. Johnson Jr., “she would also hear voices, screams, music, and rushing water, and feel as though her skin was on fire, while still aware of what was going on around her. The attacks could occur suddenly, without warning, even in the middle of a conversation. She would wake up and pick up the conversation where it left off a half hour later. In addition, Tubman would have terrible headaches, and would become more religious after the injury” (Harriet Tubman Suffered a TBI Early In Life).

While recuperating, she prayed for her master’s soul, that he might be saved and become a Christian. Her master’s behavior didn’t improve. In her stupor, no amount of whipping would arouse her. So he tried to sell her, but no one wanted to buy an injured and incapacitated slave, even though prior to the accident she had been hardworking and able to do the work of a full-grown man. She didn’t want to be sold and separated from her family. One day she prayed that, if her master couldn’t be saved, the Lord should kill him and take him away. Shortly after, he did die and, overwhelmed with guilt, she felt her prayer had been the cause.

Tubman’s experiences may have been shaped by African traditions, as there were many first generation slaves around. She would have been close to her immediate and extended family living in the area, as described by Professor Larson: “Harriet Tubman’s grandmother, Modesty, lived on Pattison’s property for an undetermined number of years after Rit left with Mary and moved to the Thompson plantation. Though the Thompson plantation sat about 6 miles to the west of the Pattison plantation and their neighbors along the Little Blackwater River near the bridge, their interactions were likely frequent and essential to maintaining social, political, and economic wellbeing” (Harriet Tubman Underground Railroad National Monument: Historic Resource Study).

An important familial link, as discussed above, was their shared religious inheritance. “Methodism was one source of strength, blending smoothly with cultural and religious traditions that survived the middle passage from Africa,” wrote Professor Larson. “First generation Africans, like her grandmother Modesty, embodied a living African connection and memory (Bradford, Scenes in the Life of Harriet Tubman). Tubman’s religious fervor and trust in God to protect and guide her evolved from a fusion of these traditions.” Tubman remained close to family living on nearby plantations, such as being hired out to do logging work with her father and quite likely hearing the same sermons, maybe sometimes clandestinely meeting in the “hidden church” of informal religious gatherings.

Her first biographer, Franklin Sanborn, said that she was “one degree removed from the wilds of Africa, her grandfather being an imported African of a chieftain family” and that, as “the grand-daughter of a slave imported from Africa,” she “has not a drop of white blood in her veins” (“The Late Araminta Davis: Better Known as ‘Moses’ or ‘Harriet Tubman’.” Franklin B. Sanborn Papers. Box 1, Folder 5. American Antiquarian Society). The latter claim of pure African ancestry has been disputed and contradicted by other accounts, but at least part of her family was of recent African ancestry, as was common in that era, making her a second generation American in at least one line. With a living memory of the Old World, Tubman’s maternal grandmother Modesty Green would have been treated as what is called a griot, an elder who serves as teacher, healer, and counselor; a keeper of knowledge, wisdom, and customs. She would have remembered the old world and learned much about how to live in the new one, helping to shape the creole culture into which Tubman was born.

Modesty might have come from the Ashanti tribe of West Africa, specifically Ghana. She was sold as a slave sometime before 1785, the year Tubman’s mother Rittia (Rit, Ritty) Green was born. The Ashanti ethnicity was common in the region, writes Ann Malaspina: “During the eighteenth century, more than one million slaves were bought by British, Danish, and Dutch slave traders and shipped to the Americas from the Ashanti Empire on West Africa’s Gold Coast, a rich trading region. Many Ashanti slaves were sold to buyers in Maryland” (Harriet Tubman, p. 10). The Ashanti had a proud reputation and the ethnic culture made its presence known, such as the “Asante proverbs that Harriet picked up as a young girl (“Don’t test the depth of a river with both feet”)” (Catherine Clinton, Harriet Tubman). Along with the Ashanti, blacks of Igbo descent were numerous in the Tidewater region of Maryland and Virginia (Igbo Americans, Wikipedia). These cultures, along with the Kongo people, were known to be proud and loyal. Also, West Africa had a tradition of respect for women — as property owners and leaders, and sometimes as warriors.

It’s the reason the Tidewater plantation owners preferred them as slaves. The preference in the Deep South was different because down there plantations were large commercial operations with typically absentee owners, an aristocracy that spent most of its time in Charleston, England, or elsewhere. Tidewater slaveholders had smaller plantations and were less prosperous. This meant they and their families lived close to slaves and, in some cases, would have worked with them. These Tidewater aristocrats were more likely to use the paternalistic rhetoric that identified slaves as part of the extended family, as often was literally the case after generations of close relations, with many of the plantation owner’s mulatto children, grandchildren, cousins, etc. running around. Cultures like the Ashanti and Igbo, in being strongly devoted to their families and communities, could be manipulated to keep slaves from running away. The downside to this communal solidarity, from the slaveholder’s perspective, was that these ethnic groups were known to be disobedient and cause a lot of trouble, including some of the greatest slave rebellions.

Tubman is an exemplar of this Tidewater black culture. According to her own statements recorded by Frank C. Drake: “the old mammies to whom she told [her] dreams were wont to nod knowingly and say, ‘I reckon youse one o’ dem ‘Shantees’, chile.’ For they knew the tradition of the unconquerable Ashantee blood, which in a slave made him a thorn in the side of the planter or cane grower whose property he became, so that few of that race were in bondage” (“The Moses of Her People. Amazing Life work of Harriet Tubman,” New York Herald, New York, Sept. 22, 1907). The claim about her grandmother was confirmed by a piece from the year before Tubman’s death, written by Ann Fitzhugh Miller (granddaughter of Tubman’s friend Gerrit Smith), in reporting that Tubman believed her maternal grandmother had been “brought in a slave ship from Africa” (“Harriet Tubman,” American Review, August 1912, p. 420).

Professor Kate C. Larson concludes that, “It has been generally assumed at least one if not more of Tubman’s grandparents came directly from Africa” (Harriet Tubman Underground Railroad National Monument: Historic Resource Study). This is the reason for speculating about a more direct African influence or, at the very least, it shows how important an African identity was to Tubman’s sense of faith and spirituality. “Like many enslaved people, her belief system fused Christian and African beliefs,” Robert Gudmestad suggests. “Her belief that there was no separation between the physical and spiritual worlds was a direct result of African religious practices. Tubman literally believed that she moved between a physical existence and a spiritual experience where she sometimes flew over the land.”

* * *

Harriet Tubman’s Special Relationship with God and Archaic Authorization

Whatever the original source and true nature of Harriet Tubman’s abilities, they did serve her well in freeing slaves and in evading her pursuers. She always trusted her voices and visions, and would change her course of action in an instant, such as the time God told her to not continue down a road and so, without hesitation, she led her fellow fugitives across the rushing waters of an icy stream, though the “several stout men” in her care “refused to follow til they saw her safe on the other side”. Sarah Bradford goes on to say that, “The strange part of the story we found to be, that the masters of these men had put up the previous day, at the railroad station near where she left, an advertisement for them, offering a large reward for their apprehension; but they made a safe exit” (p. 45). Commenting on this incident, McGowan and Kashatus note, “Similar instances occurred on her rescue missions whenever Harriet was forced to make an important decision” (Harriet Tubman: A Biography, p. 62).

This divine guidance probably made her behavior erratic and unpredictable, always one step ahead (or one step to the side) of the slave-catchers — maybe not unlike the Trickster stories she likely heard growing up, as part of the folklore tradition in African-American communities or possibly picked up from Native Americans who still lived in the area. Maybe there is a reason both Trickster stories and voice-hearing are often found in oral cultures. The Trickster, as an archetype similar to salvific figures, exists between the divine and human — Jesus often played the role of Trickster. Looking more closely at this mentality might also tell us something about the bicameral mind.

Her visions and voice-hearing were also a comfort and assurance to her; and, as some suggested, this gave her “command over others’ minds” (Edna Cheney, “Moses”, The Freedmen’s Record, p. 35) — that is to say, when around her, people paid attention and did what they were told. She had the power of charisma and persuasion, and failing that she had a gun that she was not afraid to use to good effect. She heard God’s voice in conviction and so she spoke with conviction. One was wise not to doubt her and, when leading slaves to freedom, she did not tolerate anyone challenging her authority. But it was in moments of solitude that she most strongly felt the divine. Based on interviews with Tubman in 1865, Edna Cheney conveyed it in the following way:

“When going on these journeys she often lay alone in the forests all night. Her whole soul was filled with awe of the mysterious Unseen Presence, which thrilled her with such depths of emotion, that all other care and fear vanished. Then she seemed to speak with her Maker “as a man talketh with his friend;” her child-like petitions had direct answers, and beautiful visions lifted her up above all doubt and anxiety into serene trust and faith. No man can be a hero without this faith in some form; the sense that he walks not in his own strength, but leaning on an almighty arm. Call it fate, destiny, what you will, Moses of old, Moses of to-day, believed it to be Almighty God” (p. 36).

Friends and co-conspirators described Tubman as having lacked the gnawing anxiety and doubt that, according to Julian Jaynes, has marked egoic consciousness since the collapse of Bronze Age civilization. “Great fears were entertained for her safety,” according to William Still, an African American abolitionist who personally knew her, “but she seemed wholly devoid of personal fear. The idea of being captured by slave-hunters or slave-holders, seemed never to enter her mind.” That kind of absolute courage and conviction, based on trust of voices and visions, is not common in the modern mind. Her example inspired and impressed many.

Thomas Garrett, a close confidant, said that, “I never met with any person, of any color, who had more confidence in the voice of God, as spoken direct to her soul. She has frequently told me that she talked with God, and he talked with her every day of her life, and she has declared to me that she felt no more fear of being arrested by her former master, or any other person, when in his immediate neighborhood, than she did in the State of New York, or Canada, for she said she never ventured only where God sent her, and her faith in a Supreme Power truly was great” (letter, 1868). As an aside, there is an interesting detail about her relationship with God — it was told by Samuel Hopkins Adams, grandson of Tubman’s friend and benefactor Samuel Miles Hopkins (brother of Tubman’s biographer Sarah Bradford): “Her relations with the Deity were personal, even intimate, though respectful on her part. He always addressed her as Araminta, which was her christened name” (“Slave in the Family”, Grandfather Stories, pp. 277-278; quoted by Jean M. Humez on p. 355 of Harriet Tubman: The Life and the Life Stories).

In summarizing her faith, Milton C. Sernett concluded that, “Tubman did not distinguish between seer and saint. She seems to have believed that her trust in the Lord enabled her to meet all of life’s exigencies with a confident foreknowledge of how things would turn out, a habit others found impressive, or uncanny, as the case may be” (Harriet Tubman: Myth, Memory, and History, p. 145). That sums it up. This supreme confidence did not come from herself. At one moment of uncertainty, she was faced with making a decision. “The Lord told me to do this. I said, ‘Oh Lord, I can’t—don’t ask me—take somebody else.’” God then spoke to her: “It’s you I want, Harriet Tubman” (Catherine Clinton, Harriet Tubman: The Road to Freedom).

Anyone familiar with Julian Jaynes’ theory of the bicameral mind would perk up at this discussion of voice-hearing, specifically of commanding voices with the undeniable and infallible power of archaic authorization. Besides this, he spoke of three other necessary components to the general bicameral paradigm, as much relevant today as it was during the Bronze Age (The Origin of Consciousness in the Breakdown of the Bicameral Mind, p. 324):

  • “The collective cognitive imperative, or belief system, a culturally agreed-on expectancy or prescription which defines the particular form of a phenomenon and the roles to be acted out within that form”
  • “an induction or formally ritualized procedure whose function is the narrowing of consciousness by focusing attention on a small range of preoccupations”
  • “the trance itself, a response to both the preceding, characterized by a lessening of consciousness or its loss, the diminishing of the analog or its loss, resulting in a role that is accepted, tolerated, or encouraged by the group”

The collective cognitive imperative is central to what we are exploring here. Tubman grew up in a culture where such spiritual, paranormal, and shamanic experiences were still part of a living tradition, including traces of traditional African religion. She lacked doubt about this greater reality because almost everyone around her shared this sense of faith. Humans being social creatures, such shared culture has a powerful effect upon the mind. But at that point in early modernity when Tubman grew up, most of American society had lost the practices of induction and hence the ability to enter trances.

The Evangelical church, however, has long promoted trance experiences and trained people how to talk to God and listen for his voice (still does, in some cases: Tanya Luhrmann, When God Talks Back). Because of her brain condition, Tubman didn’t necessarily require induction, although her ritual of constant prayer probably helped. She went into trance apparently without having to try, one might say against her will. There is also another important contributing factor. Voice-hearing has historically been most common among non-literate, especially preliterate, societies — that is because the written word alters the human mind, as argued by many besides Jaynes: Marshall McLuhan, Walter Ong, etc. Such illiteracy would describe the American slave population since it was against the law for them to read and write.

* * *

Harriet Tubman’s Illiteracy and Storytelling Talent

This state of illiteracy included Tubman. During the Civil War, she spoke of a desire to become literate so as to “write her own life” (Cheney, p. 38), but there is no evidence she ever learned to write. “The blow to the head Tubman received at about thirteen may have been the root cause of her illiteracy. According to Cheney’s sketch, ‘The trouble in her head prevents her from applying closely to a book’” (Milton C. Sernett, Harriet Tubman: Myth, Memory, and History, p. 105). She remained her whole life fully immersed in an oral mindset. This was demonstrated by her heavy use of figurative language with concrete imagery, as when describing a Civil War battle — recorded by visiting historian Albert Bushnell Hart:

“And then we saw the lightning, and that was the guns; and then we heard the thunder, and that was the big guns; and then we heard the rain falling, and that was the drops of blood falling; and when we came to get in the crops, it was dead men that we reaped” (Slavery and Abolition, p. 209). Also, consider how she spoke of her personal experiences: “She loves to describe her visions, which are very real to her; but she must tell them word for word as they lie in her untutored mind, with endless repetitions and details; she cannot condense them, whatever be your haste. She has great dramatic power; the scene rises before you as she saw it, and her voice and language change with her different actors” (Cheney, pp. 36-37).

Elaborating on her storytelling talent, Jean M. Humez writes: “One of Earl Conrad’s informants who as a child had known Tubman in her old age reported: “there never was any variation in the stories she told, whether to me or to any other” (Tatlock, 1939a). It is characteristic of the folklore performer trained in an oral culture to tell a story in precisely the right way each time. This is because the story itself is often regarded as a form of knowledge that will educate the young and be passed down through the generations. The storyteller must not weaken the story’s integrity with a poor performance” (Harriet Tubman: The Life and the Life Stories, p. 135).

This was also heard in how Tubman drew upon the down-to-earth style of old school religion: “Instead of the classical Greek “tricks of oratory” to which the college-educated Higginson refers, Tubman drew upon homelier sources of eloquence, such as scriptures she would have heard preached in the South. She frequently employed a teaching technique made familiar in the New Testament Gospels—the “parable’ or narrative metaphor—to make her lessons persuasive and memorable” (Jean M. Humez, Harriet Tubman: The Life and the Life Stories, p. 135). She knew of Jesus’ message through oral tellings by preachers and that was fitting since Jesus too effectively taught in the spoken word.

She was masterful. Even before a crowd of respectable whites, such as at abolitionist meetings, she could captivate an audience and move them to great emotion. Having witnessed a performance of Tubman’s oft-repeated story of former slave Joe’s arrival in Canada, along with a rendition of the song he sang in joyous praise, Charlotte Forten recorded the impact it had on those present: “How exciting it was to hear her tell the story. And to hear the very scraps of jubilant hymns that he sang. She said the ladies crowded around them, and some laughed and some cried. My own eyes were full as I listened to her” (Charlotte Forten, journal entry, Saturday, January 31, 1862).

All of these ways of speaking are typical of those born in oral societies. As such, her illiteracy might have been key. “She is a rare instance,” as told in The Freedmen’s Record, “in the midst of high civilization and intellectual culture, of a being of great native powers, working powerfully, and to beneficent ends, entirely unaided by school or books” (Cheney, p. 34). Maybe the two factors are closely linked. Even in the ancient world, some of the most famous and respected oracles were given by the uneducated and illiterate, often women. Tubman did have the oracular about her, as she occasionally prophesied outcomes and coming events.

We mainly know of Tubman through the stories she told and retold of herself and her achievements, surely having been important in gaining support and raising funds in those early years when she needed provisions to make her trips to the South. She came from a storytelling tradition and, obviously, she knew how to entertain and persuade, to make real the plight of the still enslaved and the dangers it took to gain their freedom. She drew in her audience, as if they were there with bloodhounds tracking them, with their lives hanging in the balance of a single wrong decision or unfortunate turn of events.

One of her greatest talents was weaving song into her stories, but that too was part of oral culture. The slave’s life was filled with song, from morning to night. They sang in church and while at work, at births and burials. These songs were often stories, many of them taken from or inspired by the religion that was so much a part of their daily experience. Song itself was a form of language: “Tubman used spirituals to signal her arrival or as a secret code to tell of her plans. She also used spirituals to reassure those she was leading of their safety and to lift their spirits during the long journey to freedom” (M.W. Taylor, Harriet Tubman: Antislavery Activist, p. 18). She also used the song of birds and owls to communicate, something she may have learned from the African or Native American tradition.

Song defined Tubman, as much as did her spirituality. “Religious songs,” Jean M. Humez explains, “embellished Tubman’s oral storytelling performances and were frequently central plot elements in her most popular Underground Railroad stories. There was the story of teasing the thick-witted “master” the night before her escape by using a familiar Methodist song, “I’m Bound for the Promised Land,” to communicate to her family her intention to run away. Singing was also integral to her much-told story about coded communication with fugitives she had hidden in the woods. “Go Down, Moses” meant “stay hidden,” while a “Methodist air,” “Hail, oh hail, ye happy spirits,” meant “all clear” (Bradford, 1869)” (Harriet Tubman: The Life and the Life Stories, p. 136).

Humez goes on to say that, “Though she was able to capture and reproduce the lyrics for her readers, Bradford was evidently bewildered by Tubman’s musical performances in much the same way Cheney was by her spiritual testimony: “The air sung to these words was so wild, so full of plaintive minor strains, and unexpected quavers, that I would defy any white person to learn it, and often as I heard it, it was to me a constant surprise” (Bradford, 1886, 35-36).” Her performances used a full range of expression, including through her movement. She would wave her arms and clap her hands, sway and stamp her feet, dance and gesture — according to the details of what she spoke and the rhythm of what she sang (Humez, p. 137). Orality is an embodied way of communicating.

* * *

Harriet Tubman’s Voice-Hearing and the Power of Oral Culture

Tubman may have been more talented and charismatic than most, but one suspects that such a commanding presence of speech and rhetorical persuasion was far more common among the enslaved, who were raised in an oral culture where language was one of the few sources of power in defense against those who wielded physical violence and political force. Survival demanded the ability to use language that was coded and veiled, symbolic and metaphorical, whether in conversation or song, in order to communicate without stating something directly for fear of being overheard.

Her display of orality would have impressed many whites simply because literacy and the literary mind by that point had become the norm among the well-off white abolitionists who came to hear her. Generations had passed since orality had been prevalent in mainstream American society, especially among the emerging liberal class. The traditional culture of the ancien régime had been eroding since the colonial era. There is a power in oral cultures that the modern mind has forgotten, and Tubman was among those who carried the last traces of oral culture into the 20th century, before she finally died in her early 90s in 1913.

The bewilderment of whites, slave-catchers and abolitionists alike, at Tubman’s prowess brings to mind another example of the power of oral culture. The Mongol hordes, as they were perceived, acted in a way that was incomprehensible to the literate ruling elite of European feudalism. Genghis Khan established a mnemonic system among his illiterate cavalry that allowed messages to be spread quickly and accurately. As all Mongols rode horses and carried their food with them, they were able to act collectively like a swarm and so could easily shift strategy in the middle of a battle. Oral culture had less rigid hierarchy. It was also highly religious and based in a shamanic tradition not unlike that of Africa. Genghis Khan regularly prayed to God, fasting for days until he received a clear message, before leaving on a military campaign. In similar fashion, Thomas Garrett said of Tubman: “She is a firm believer in spiritual manifestations […] she never goes on her missions of mercy without his (God’s) consent” (letter to Eliza Wigham, Dec. 27, 1856).

One imagines that, as with that Mongol leader, Tubman was so successful because she wielded archaic authorization. That was the underlying force of personality and persuasion that made her way of speaking and acting so compelling, for the voice of God spoke through her. It was a much greater way of being in the world, a porous self that extended much further and that could reach into the hearts and minds of others, apparently not limited to humans. Her “contemporaries noted that Tubman had a strange power over all animals—another indication of psychic ability—and insisted that she never feared the bloodhounds who dogged her trail when she became an Underground Railroad agent” (James A. McGowan & William C. Kashatus, Harriet Tubman: A Biography, pp. 10-11). Psychic ability, or simply a rare example of a well-functioning bicameral mind in the modern era?

Some people did perceive her as being psychic or otherwise having an uncanny perception, an ability to know things it seems she shouldn’t have been able to know. It depends on one’s psychological interpretation and theological persuasion. Her compatriot Thomas Garrett was also strongly religious in his commitment to abolitionism. “In fact,” state McGowan and Kashatus, “Garrett compared Harriet’s psychic ability to hear “the voice of God as spoken direct to her soul” to the Quakers’ concept of an Inner Light, or a divine presence in each human being that allows them to do God’s will on earth. Because of their common emphasis on a mystical experience and a shared religious perspective, Tubman and the Quakers developed a mutual trust” (Harriet Tubman: A Biography, p. 62). A particular incident helps explain Garrett’s appraisal, from the same book (pp. 59-60):

“One late afternoon in mid-October 1856, Harriet arrived in Wilmington, Delaware, in need of funding for a rescue mission to the Eastern Shore. She went immediately to the office of Thomas Garrett, a white Quaker station master who also operated a hardware business in the town. “God sent me to you, Thomas,” said Harriet, dismissing the formality of a simple greeting. “He tells me you have money for me.” Amused by the request, Garrett jokingly asked: “Has God ever deceived thee?” “No,” she snapped. “I have always been liberal with thee, Harriet, and wish to be of assistance,” said the Quaker station master, stringing her along. “But I am not rich and cannot afford to give thee much.” Undeterred by the response, Harriet shot back: “God told me you’ve got money for me, and God never fools me!” Realizing that she was getting upset, Garrett cut to the chase: “Well, then, how much does thee need?” After reflecting a moment, Tubman said, “About 23 dollars.”

“The elderly Quaker shook his head in disbelief. Harriet’s request was almost exactly the amount he had received from an antislavery society in Scotland for her specific use. He went to his cash box, retrieved the donation, and handed it to his visitor. Smiling at her benefactor, Tubman took the cash, turned abruptly and marched out of the office. Astonished by the incident, Garrett later confided to another abolitionist that “there was something remarkable” about Harriet. “Whether it [was] clairvoyance or the divine impression on her mind, I cannot tell,” he admitted. “But I am certain she has a guide within herself other than the written word, for she never had any education.”1 By most accounts, Tubman’s behavior can be described as self-righteous, if not extremely presumptuous. But she viewed herself as being chosen by God for the special duty of a liberator. In fact, she admitted that she “felt like Moses,” the Old Testament prophet, because “the Lord told me to go down South and bring up my brothers and sisters.” When she expressed doubt about her abilities and suggested that the Lord “take somebody else,” He replied: “It’s you I want, Harriet Tubman.”2 With such a divine commission, Tubman was confident that her visions and actions—no matter how rude by 19th–century society’s standards—were condoned by the Almighty. Thomas Garrett understood that.”

There is no doubt she had an instinctive understanding that was built on an impressive awareness, a keen presence of mind — call it psychic or bicameral. With our rigid egoic boundaries and schizoid mentality, we inhabitants of this modern hyper-individualistic world have much to learn about the deeper realms of the bundled mind, of the multiplicity of self. We have made ourselves alien to our own human and animal nature, and we are the lesser for it. The post-bicameral loss of not only God’s voice but of a more expansive way of being is still felt in a nostalgic longing that continues to rule over us, ever leading to backlashes of the reactionary mind. Even with possible brain damage, Tubman was nowhere near as mentally crippled as we are with our prized ego-consciousness that shuts out all other voices and presences.

In the Western world, it would be hard to find such a fine specimen of visionary voice-hearing. Harriet Tubman had a genius about her, both genius in the modern sense of brilliance and genius in the ancient sense of a guiding spirit. If she were around today, she would likely be medicated and institutionalized or maybe imprisoned, as a threat to sane and civil society (Bruce Levine, “Sublime Madness”: Anarchists, Psychiatric Survivors, Emma Goldman & Harriet Tubman). Yet there are still other societies, including developed countries, in the world where this is not the case.

Tanya Luhrmann, inspired by Julian Jaynes, went into anthropology, where she researches voice-hearing (her work on evangelicalism is briefly noted above). One study she did compared the experience of voice-hearers in Ghana, India, and the United States (Differences in voice-hearing experiences of people with psychosis in the U.S.A., India and Ghana: interview-based study). Unlike in this country, voice-hearers in certain non-Western cultures are not treated as mentally ill and, unsurprisingly, they do not tend to experience cruel and persecutory voices — quite the opposite, their voices being kind, affirming, and helpful, as was the case with Tubman.

“In the case of voice hearing, culture may also play a role in helping people cope.  One study conducted by Luhrmann, the anthropologist, found that compared to their American counterparts, voice-hearing people diagnosed with schizophrenia in more collectivist cultures were more likely to perceive their voices as helpful and friendly, sometimes even resembling members of their friends and family. She adds that people who meet criteria for schizophrenia in India have better outcomes than their U.S. counterparts. She suspects this is because of “the negative salience” a diagnosis of schizophrenia holds in the U.S., as well as the greater rates of homelessness among people with schizophrenia in America” (Joseph Frankel, Psychics Who Hear Voices Could Be On to Something).

One suspects that the Ashanti and related African cultures that helped shape black traditions in Tubman’s Maryland are basically continuous with the culture still existing in Ghana to this day. After all, the Ashanti Empire that began in the early colonial era, in 1701, continued its rule well into the twentieth century, until 1957. If it’s true that her grandmother Modesty was Ashanti, that would go a long way in explaining the cultural background to Tubman’s voice-hearing. It’s been speculated that her father was the child of two Africans, and it was directly from him that she claimed to have inherited her peculiar talents. It’s possible that elements of the bicameral mind survived longer in those West African societies and from there were carried across the Middle Passage.

* * *

The Friendship and Freedom of the Living God

It’s important to think about the bicameral mind by looking at real world examples of voice-hearing. It might teach us something about what it means to be in relationship with a living God — a living world, a living experience of the greater mind, the bundled self (no matter one’s beliefs). Many Christians talk about such things, but few take it seriously, much less experience it or seek it out. That was what drew the Quakers to Tubman and others like her influenced by the African tradition of a living God. It wasn’t only a commonality of politics, in fighting for abolitionism and such. Rather, the politics was an expression of that particular kind of spiritual and epistemological experience.

To personally know God — or, if you prefer, to directly know concrete, lived reality — without the intervention of either priest or text or the equivalent can create immense power through authorization. It is an ability to act with confidence, rather than bowing down to external authority of hierarchical institutions, be it church clergy or plantation aristocracy. But it also avoids the other extreme, that of getting lost in the abstractions of the egoic consciousness that drain psychic reserves and make human will impotent. As Harriet Tubman proved, this other way of being can be a source of empowerment and liberation.

What made this possible is not only that she was illiterate but also that she was unchurched. In their own way, Quakers traditionally maintained a practice of being unchurched, avoiding certain formal church institutions, such as a professional ministry. Slaves, on the other hand, were often forced to be unchurched in not being allowed to participate in formal religion. This would have helped maintain traditional African spiritual practice and experience. Interestingly, as J.E. Kennedy reports, one set of data found that “belief in the paranormal was positively related to religious faith but negatively related to religious participation” (The Polarization of Psi Beliefs; as discussed in NDE: Spirituality vs Religiosity). It’s ironic that formal religion (organized, institutionalized) and literacy, specifically in a text-based religion, have the powerful effect of disconnecting people from the experience of God. Yet experience of God can break the spell of that mind virus.

The other thing is that, like African religion, the Quaker emphasis was on the communal. This might not seem obvious, in how Quakers believed in the individual’s relationship to God. That is where Tubman’s example is helpful. She too had an individual relationship to God, but her identity was also tied closely to kinship, community, and ancestry. We need to think more carefully about what is meant when we speak of individuality. One can gain one’s own private liberty by freeing oneself from shackled enslavement, that is to say changing one’s status from owned by another to owned by oneself (i.e., owned by the ego-self, in some ways an even more harsh taskmaster). Freedom, however, is something else entirely. The etymology of ‘freedom’ is the same as ‘friend’. To be free is to be among friends, to be a member of a free society — one is reminded that, to Quakers and West Africans alike, there was an inclination to relate to God as a friend. Considering this simple but profound understanding, it wasn’t enough for Tubman to escape her oppressive bondage, if she left behind everyone she loved.

Often she repeated her moral claim for either liberty or death, as if they were of equivalent value; whereas freedom is about life and the essence of life is shared, as freedom is always about connection and relationship, about solidarity and belonging. She couldn’t be free alone and, under the will of something greater than her, she returned South to free her kith and kin. The year Harriet Tubman first sought freedom, 1849, was the same year of the birth of Emma Lazarus, a poet who would write some of the most well known words on slavery and oppression, including the simple statement that, “Until we are all free, we are none of us free.” About a century later, this was rephrased by Martin Luther King Jr. during the Civil Rights movement when he said, “No one is free until we are all free.” One could trace this insight back to the ancient world, as when Jesus spoke that, “Truly I tell you, whatever you did for one of the least of these brothers and sisters of mine, you did for me.” That is freedom.

A living God lives among a living generation of people, a living community. “For where two or three gather in my name,” as Jesus also taught, “there am I with them.” Quakers had a tradition of living constitutionalism, something now associated with liberalism but originally rooted in a profound sense of the divine (Where Liberty and Freedom Converge). To the Quaker worldview, a constitution is a living agreement and expression of the Divine, a covenant between God and a specific people; this is related to why Quakers denied natural law that would usurp the authorization of this divine presence. A constitution is not a piece of paper nor the words upon it. Nor can a constitution be imposed upon other people outside of that community of souls. So, neither slaves nor following generations are beholden to a constitution enacted by someone else. This was why Thomas Jefferson assumed later Americans would forever seek out new constitutions to express their democratic voice as a people. But those who understood this best were the Quakers; or those, like Thomas Paine, who were early on influenced by the Quaker faith.

Consider John Dickinson, who was raised as a Quaker and, after inheriting slaves, freed them. He is the author of the first draft of America’s first constitution, the Articles of Confederation, which was inspired by Quaker constitutionalism. The Articles of Confederation was a living document, in that its only power was the authority of every state agreeing to it in total consensus, with no change allowed without further consensus. The second constitution, simply known as the United States Constitution and unconstitutionally established according to the first constitution (The Vague and Ambiguous US Constitution), was designed to be a dead letter, and it has become famous for enshrining the institution of slavery. Rather than expressing a message of freedom, it was a new system of centralized power and authority. The deity invoked under this oppression is a dead god, a god of death. No one hears the voice of this false god, this demiurge.

Such a false idol can make no moral claim over a free people. As such, a free people assert their freedom by the simplest act of walking away, as did Harriet Tubman by following the water gourd pointing to the North Star, and as she repeated many times in guiding her people to what to them was the Promised Land. What guided her was the living voice of the living God. They had their own divine covenant that took precedence over any paper scribbled upon by a human hand.

* * *

Harriet Tubman, an Unsung Naturalist, Used Owl Calls as a Signal on the Underground Railroad
by Allison Keys, Audubon Magazine

“It was in those timber fields where she learned the skills necessary to be a successful conductor on the Underground Railroad,” Crenshaw explains, “including how to read the landscape, how to be comfortable in the woods, how to navigate and use the sounds that were natural in Dorchester County at the time.”

Underground Railroad Secret Codes
from Harriet Tubman Historical Society

Supporters of the Underground Railroad used words railroad conductors employed every day to create their own code, a secret language to help slaves escape. Railroad language was chosen because the railroad was an emerging form of transportation and its terminology was not widely known. Code words would be used in letters to “agents” so that, if the letters were intercepted, those involved could not be caught. Underground Railroad code was also used in songs sung by slaves to communicate with each other without their masters being aware.

Myths & Facts About Harriet Tubman
from National Park Service

Tubman sang two songs while operating her rescue missions. Both are listed in Sarah Bradford’s biography Scenes in the Life of Harriet Tubman: “Go Down Moses,” and, “Bound For the Promised Land.” Tubman said she changed the tempo of the songs to indicate whether it was safe to come out or not.

Songs of the Underground Railroad
from Harriet Tubman Historical Society

Songs were used in everyday life by African slaves. Singing was a tradition brought from Africa by the first slaves; sometimes their songs are called spirituals. Singing served many purposes, such as providing a repetitive rhythm for repetitive manual work, as well as inspiration and motivation. Singing was also used to express their values and solidarity with each other, and during celebrations. Songs were used as tools to remember and communicate, since the majority of slaves could not read.

Harriet Tubman and other slaves used songs as a strategy to communicate with fellow slaves in their struggle for freedom. Coded songs contained words giving directions on how to escape (signal songs) or where to meet (map songs).

Songs used Biblical references and analogies of Biblical people, places and stories, comparing them to their own history of slavery. For example, “being bound for the land of Canaan” for a white person could mean ready to die and go to heaven; but to a slave it meant ready to go to Canada.

Scenes in the Life of Harriet Tubman
by Sarah Hopkins Bradford
pp. 25-27

After nightfall, the sound of a hymn sung at a distance comes upon the ears of the concealed and famished fugitives in the woods, and they know that their deliverer is at hand. They listen eagerly for the words she sings, for by them they are to be warned of danger, or informed of safety. Nearer and nearer comes the unseen singer, and the words are wafted to their ears:

Hail, oh hail ye happy spirits,
Death no more shall make you fear,
No grief nor sorrow, pain nor anger (anguish)
Shall no more distress you there.

Around him are ten thousan’ angels,
Always ready to ‘bey comman’.
Dey are always hobring round you,
Till you reach the hebbenly lan’.

Jesus, Jesus will go wid you;
He will lead you to his throne;
He who died has gone before you,
Trod de wine-press all alone.

He whose thunders shake creation;
He who bids the planets roll;
He who rides upon the temple, (tempest)
An’ his scepter sways de whole.

Dark and thorny is de desert,
Through de pilgrim makes his ways,
Yet beyon’ dis vale of sorrow,
Lies de fiel’s of endless days.

I give these words exactly as Harriet sang them to me to a sweet and simple Methodist air. “De first time I go by singing dis hymn, dey don’t come out to me,” she said, “till I listen if de coast is clar; den when I go back and sing it again, dey come out. But if I sing:

Moses go down in Egypt,
Till ole Pharo’ let me go;
Hadn’t been for Adam’s fall,
Shouldn’t hab to died at all,

den dey don’t come out, for dere’s danger in de way.”

Let The Circle Be Unbroken: The World Of Araminta (“Minty”) Ross Or The Making Of Harriet Tubman
by Margaret Washington

I. Building Communities
C. It Takes a Village to Raise a Child.

Enslaved African Americans came from a heritage that embraced concepts of solidarity in a descending order from the larger ethnic group, to the communal village, to the extended family, to the nuclear family. Individualism (as opposed to individuality) was considered selfish and antithetical to the broader interests of a unit. Whether societies were matrilineal or patrilineal, nearly all were patriarchal (power rested with men). Nonetheless, the glue that bound the communal circle was the woman, considered the life giving force, the bearer of culture, essence of aesthetic beauty and key to a community’s longevity. Mothers, grandmothers, aunts, sisters etc. had oversight of children until puberty, when male and female rites of passage prepared them separately for their gendered communal roles. West African women were spiritually strong, morally respected, valued for their economic propensity, important in governance and in some cultures (Ashanti, Kongo, Ibo) powerful warriors. However devalued and exploited in America, Modesty, Rit and Minty exemplified how enslaved women resisted a sense of futility or fatalism and refashioned African attributes of beauty, dignity, self-worth and ethics. Enslaved women combed the waterways, forests and woods to obtain roots, herbs, leaves, sap, barks and other medicinal products for healing, amulets and even conjuration. Rit certainly used such remedies to nurse Minty back to health after extreme exhaustion, illnesses, beatings and her near fatal blow on the head. Rit learned these remedies and poultices from her mother Modesty, and Harriet Tubman used them on the Underground Railroad. Their example reveals the significance of women to the community and that, despite the assaults on the black family, it remained an institution which even separation could not sever. […]

II ANCHORING THE SPIRIT
A. The Hidden Church: An African-Christian Synthesis.

If community was the base of African and African American life and culture, spirituality was the superstructure. Certainly enslaved people ultimately embraced Christianity. But for generations Southern whites feared exposing blacks to Christianity. The Bible’s Old Testament militant nationalism and the New Testament’s spiritual egalitarianism were not lost on African Americans, a few of whom were literate and the majority of whom felt that baptism was one kind of freedom.

Like most enslaved children, young Minty grew up outside of a church. However, since Ben Ross’s owner Anthony Thompson Sr. was a practicing Methodist, Minty’s family heard Christian sermons. But Edward Brodess was not devout, and when he separated the Ross family, little Minty was hired out and did not receive white religious benevolence. But a tradition of black religion and spirituality existed independent of whites. In African culture, sacred worship was embedded in every aspect of life (rites of passage, marriage, funerals, child birth, etc.). Divine reverence was not confined to a building, a single ceremony or a specific day of the week. Spirituality was pervasive, expressive, emotional and evocative. Although the religious culture developed in America had African roots, the ravages of bondage created more social-spiritual convergences. In Minty’s world, spirituality was wrapped in temporal concerns affecting the individual, the family and the community. Worship was praising, praying, lamenting, hoping and drawing strength from each other. Long before Minty’s birth, Africans in America had created a “hidden church” where enslaved people gathered clandestinely (in the woods, in cabins, in boats, in white people’s kitchens and even in the fields). In the hidden church they recounted religious and secular experiences, gave testimonies and created a space where women such as Rit could express the pain of having children sold or of trying to bring Minty back to life after her head was bashed in. In the hidden church, enslaved people created subversive songs and prayed for spiritual salvation, heavenly retribution and freedom.

Africans traveling the Maafa brought an ethos that merged the sacred and secular worlds. Enslaved African Americans embraced Christianity but also selectively adapted it to previous traditions and to their historical circumstances. Above all, they rejected incongruous white teachings meant to relegate blacks to perpetual slavery. Rather than being converted to Christianity as taught by whites, enslaved people converted Christianity to their own needs. Moreover, some significant African and Christian traditions had noteworthy commonalities.

Africans, like Christians, believed in one God (Nzambi among the Bantu, Onyame among the Akan-Ashanti, for example) who was the apex of all existence, just as humanity was the center of earthly life. While gendered concepts of the African Supreme Being varied, like Jehovah, Africans’ God was revered, all-powerful and approachable. However, unlike Jehovah, the African Supreme Being was not feared, jealous nor wrathful. Other spirits exist in the African pantheon, like saints in Catholicism. But there was only one God. Hence, when whites spoke of a Supreme God, Africans understood. Harriet Tubman’s God was an all-powerful friend. According to Thomas Garrett, her close friend and a beloved Quaker Underground Railroad conductor, Harriet spoke to God every day of her life. “I never knew anyone so confident of her faith,” said Garrett. (Letter in Bradford)

Africans, like Christians, believed in a soul, sometimes called the “heart” or “voice.” The soul was responsible for human behavior in life and was one’s spiritual existence after death. Some ethnicities had complicated concepts of the soul; others simply recognized the soul as the “little me in the big me” which lived on. Africans believed in honoring this life after death, especially as part of the kinship spiritual connection (ancestor reverence), which brought protection to the living. The curse of the dead was much dreaded in Africa and in America. Hence the importance of burial and funeral rites throughout the Diaspora, even today. A woman such as Harriet Tubman who embraced Christianity, also blended a spiritual syncretism that constructed a concept of the soul around moral ethics and faith imparted through the word of God, “as spoken to her soul” according to her friend Garrett. “She is a firm believer in spiritual manifestations . . . she never goes on her missions of mercy without his (God’s) consent.” (Garrett to Eliza Wigham, in McGowan, 135)

Water was a life giving force in African culture and the spirit world was under water. Throughout the African Diaspora, water represented divine transformations—birth, death, baptism and rebirth. For many enslaved people, accepting Christianity carried implications reminiscent of older traditions that surpassed what whites intended. In African cultures, an initiate received a “sacred bath” following a special protracted rite of passage symbolizing acceptance and integration into the community. Similarly, with Christianity enslaved people sought salvation through isolation, prayer, meditation, and communication with God through visions and signs from the natural environment. Baptism by total immersion represented final acceptance into the “ark of safety.” Although Methodists baptized by sprinkling, enslaved people insisted on going “down under” the water. They also equated spiritual transformation with secular change. Such thinking was Christian because the New Testament upheld spiritual egalitarianism. It was also African: One traveled briefly into the watery world of the ancestors as an uncivil “little spirit of the bush” full of individualistic anti-communal tendencies. One emerged from the water as a citizen of the community able to partake of all rights and privileges. The change was both divine and temporal; it was fervent, overwhelming and thoroughgoing. Canals, marshes, swamps and rivers surrounded African descended people on the Eastern Shore. Here they labored as slaves. Here they were baptized and hence constantly reminded of water’s spiritual and liberating significance.

Minty’s Christian conversion experience probably happened while working for the Stewarts in Caroline County. Whether because of that experience or her blow on the head, Minty insisted she spoke to God, had trances and saw visions that foretold future events. As a clairvoyant, Minty believed that she inherited this second sight from her father, Ben. Africans and African Americans believed that a clairvoyant person was born with a “caul” or “veil,” a portion of the birth membrane that remained on the head. They were seers and visionaries who communicated with the supernatural world and were under a special spiritual dispensation. Visions sometimes came while Minty worked, were accompanied by music and articulated in a different language. Minty also claimed exceptional power. When Edward Brodess sent slave traders to Ben’s cabin to inspect Minty, she prayed for God to cleanse Brodess’s heart and make him a good man or kill him. Brodess’s death convinced Minty that she had “prayed him to death.”1 Since his death put her in imminent danger of sale, Minty knew it was a sign from God to flee.

Northerners called Ben “a full-blooded Negro.” His parents were probably African born and told him the old Maafa adage that he passed on to Minty: some Africans could fly. Indeed, captured Ibo people committed suicide believing that their spirits flew back to Africa.2 Similarly, as Minty envisioned her escape, “She used to dream of flying over fields and towns, and rivers and mountings, looking down upon them ‘like a bird.’” When it appeared as if her strength would give out and she could not cross the river, “there would be ladies all dressed in white over there, and they would put out their arms and pull me across.” Listening to Ben’s stories and predictions and sharing his faith convinced Minty that an omniscient force protected her. In visions, she became a disembodied spirit observing earthly and heavenly scenes. Harriet Tubman told friends that God “called” her to activism against her wishes. She begged God to “get someone else” but to no avail. Since God called her, she depended on God to guide her away from danger.

Balance of Egalitarianism and Hierarchy

David Graeber, an anthropologist, and David Wengrow, an archaeologist, have a theory that hunter-gatherer societies cycled between egalitarianism and hierarchy. That is to say, hierarchies were temporary and often seasonal. There was no permanent leadership or ruling caste, as seen in the fluid social orders of still-surviving hunter-gatherers. This carried over into the early settlements, which were initially transitory meeting places, likely for feasts and festivals.

There are two questions that need to be answered. First, why did humans permanently settle down? Second, why did civilization get stuck in hierarchy? These questions have to be answered separately. For millennia into civilization, the egalitarian impulse persisted within many permanent settlements. There was no linear development from egalitarianism to hierarchy, no fall from the Garden of Eden.

Julian Jaynes, in his theorizing about the bicameral mind, offered a possible explanation. A contributing factor for permanent settlements may have been that the speaking idols had to be kept in a single location, with agriculture developing as a later result. Then, as societies became more populous, complex and expansive, hierarchies (as with moralizing gods) became more important to compensate for the communal limits of a voice-hearing social order.

That kind of hierarchy, though, was a much later development, especially in its extreme forms not seen until the Axial Age empires. The earlier bicameral societies had a more communal identity. That would’ve been true on the level of experience, as even the voices people heard were shared. There wasn’t an internal self separate from the communal identity and so no conflict between the individual member and larger society. One either fully belonged to and was immersed in that culture or not.

Large, complex hierarchies weren’t needed. Bicameralism began in small settlements that lacked police, court systems, standing armies, and the other traits of the oppressively authoritarian hierarchies that would later be seen, such as the simultaneous appearance of sexual moralizing and pornographic art. It wasn’t the threat of violent force by centralized authority and concentrated power that created and maintained the bicameral order but, as still seen with isolated indigenous tribes, shared identity and experience.

An example is that of the early Egyptians. They were capable of impressive technological feats and yet they didn’t even have basic infrastructure like bridges. It appears they initially were a loose association of farmers organized around the bicameral culture of archaic authorization and, in the off-season, they built pyramids without coercion. Slavery was not required for this, as there is no evidence of forced labor.

In so many ways, this is alien to the conventional understanding of civilization. It is so radically strange that to many it seems impossible, especially when described as ‘egalitarian’, a label that places it in a framework of modern ideas. Mention primitive ‘communism’ or ‘anarchism’ and you’ll really lose most people. Nonetheless, however one wants to describe and label it, this is what the evidence points toward.

Here is another related thought. How societies went from bicameral mind to consciousness is well-trodden territory. But what about how bicameralism emerged from animism? They share enough similarities that I’ve referred to them as the animistic-bicameral complex. The bicameral mind seems like a variant or extension of the voice-hearing in animism.

Among hunter-gatherers, it was often costume and masks through which gods, spirits, and ancestors spoke. Any individual potentially could become the vessel of possession because, in the animistic view, all the world is alive with voices. So, how did this animistic voice-hearing become narrowed down to idol worship of corpses and statues?

I ask this because this is central to the question of why humans created permanent settlements. A god-king’s voice of authorization was so powerful that it persisted beyond his death. The corpse was turned into a mummy, as his voice was a living memory that kept speaking, and so god-houses were built. But how did the fluid practice of voice-hearing in animism become centralized in a god-king?

Did this begin with the rise of shamanism? Some hunter-gatherers don’t have shamans. But once the role of shaman becomes a permanent authority figure mediating with other realms, it’s not a large leap from a shaman-king to a god-king who could be fully deified in death. In that case, how did shamanism act as a transitional proto-bicameralism? In this, we might begin to discern the hitch upon which permanent hierarchy eventually got stuck.

I might point out that there is much disagreement in this area of scholarship, as expected. The position of Graeber and Wengrow is highly contested, even among those offering alternative interpretations of the evidence. See Peter Turchin (An Anarchist View of Human Social Evolution & A Feminist Perspective on Human Social Evolution) and Camilla Power (Gender egalitarianism made us human: patriarchy was too little, too late & Gender egalitarianism made us human: A response to David Graeber & David Wengrow’s ‘How to change the course of human history’).

But I don’t see the disagreements as being significant for the purposes here. Here is a basic point that Turchin explains: “The reason we say that foragers were fiercely egalitarian is because they practiced reverse dominance hierarchy” (from first link directly above). That seems to go straight to the original argument. Many other primates have social hierarchy, although not all. Some of the difference appears to be cultural, in that humans early in evolution appear to have developed cultural methods of enforcing egalitarianism. This cultural pattern has existed long enough to have fundamentally altered human nature.

According to Graeber and Wengrow, these egalitarian habits weren’t lost easily, even as society became larger and more complex. Modern authoritarian hierarchies represent a late development, a fraction of a percentage of human existence. They are far outside the human norm. In social science experiments, we see how the egalitarian impulse persists. Consider two examples. One study found that children will naturally help those in need, until someone pays them money to do so, shifting their motivation from intrinsic to extrinsic. The other showed that most people, both children and adults, will choose to punish wrongdoers even at personal cost.

This in-built egalitarianism is an old habit that doesn’t die easily no matter how it is suppressed or perverted by systems of authoritarian power. It is the psychological basis of a culture of trust that permanent hierarchies take advantage of through manipulation of human nature. The egalitarian impulse gets redirected toward undermining egalitarianism. This is why modern societies are so unstable, as compared to the ancient societies that lasted for millennia.

That said, there is nothing wrong with genuine authority, expertise, and leadership — as seen even in the most radically egalitarian societies like the Piraha. Hierarchies are also part of our natural repertoire and only problematic when they fall out of balance with egalitarianism and so become entrenched. One way or another, human societies cycle between hierarchy and egalitarianism, whether it cycles on a regular basis or necessitates collapse. That is the point Walter Scheidel makes in his book, The Great Leveler. High inequality destabilizes society and always brings its own downfall.

We need to relearn that balance, if we hope to avoid mass disaster. Egalitarianism is not a utopian ideal. It’s simply the other side of human nature that gets forgotten.

* * *

Archaeology, anarchy, hierarchy, and the growth of inequality
by Andre Costopoulos

In some ways, I agree with both Graeber and Wengrow, and with Turchin. Models of the growth of social inequality have indeed emphasized a one-dimensional march, sometimes inevitable, from virtual equality and autonomy to strong inequality and centralization. I agree with Graeber and Wengrow that this is a mistaken view. Except I think humans have moved from strong inequality, to somewhat managed inequality, to strong inequality again.

The rise and fall of equality

Hierarchy, dominance, power, influence, politics, and violence are hallmarks not only of human social organization, but of that of our primate cousins. They are widespread among mammals. Inequality runs deep in our lineage, and our earliest identifiable human ancestors must have inherited it. But an amazing thing happened among Pleistocene humans. They developed strong social leveling mechanisms, which actively reduced inequality. Some of those mechanisms are still at work in our societies today: Ridicule at the expense of self-aggrandizers, carnival inversion as a reminder of the vulnerability of the powerful, ostracism of the controlling, or just walking away from conflict, for example.

Understanding the growth of equality in Pleistocene human communities is the big untackled project of Paleolithic archaeology, mostly because we assume they started from a state of egalitarianism and either degenerated or progressed from there, depending on your lens. Our broader evolutionary context argues they didn’t.

During the Holocene, under increasing sedentism and dependence on spatially bounded resources such as agricultural fields that represent significant energy investments, these mechanisms gradually failed to dampen the pressures for increasing centralization of power. However, even at the height of the Pleistocene egalitarian adaptation, there were elites if, using Turchin’s figure of the top one or two percent, we consider that the one or two most influential members in a network of a hundred are its elite. All the social leveling in the world could not contain influence. Influence, in the end, if wielded effectively, is power.

Ancient ‘megasites’ may reshape the history of the first cities
by Bruce Bower

No signs of a centralized government, a ruling dynasty, or wealth or social class disparities appear in the ancient settlement, the researchers say. Houses were largely alike in size and design. Excavations yielded few prestige goods, such as copper items and shell ornaments. Many examples of painted pottery and clay figurines typical of Trypillia culture turned up, and more than 6,300 animal bones unearthed at the site suggest residents ate a lot of beef and lamb. Those clues suggest daily life was much the same across Nebelivka’s various neighborhoods and quarters. […]

Though some of these sprawling sites had social inequality, egalitarian cities like Nebelivka were probably more widespread several thousand years ago than has typically been assumed, says archaeologist David Wengrow of University College London. Ancient ceremonial centers in China and Peru, for instance, were cities with sophisticated infrastructures that existed before any hints of bureaucratic control, he argues. Wengrow and anthropologist David Graeber of the London School of Economics and Political Science also made that argument in a 2018 essay in Eurozine, an online cultural magazine.

Councils of social equals governed many of the world’s earliest cities, including Trypillia megasites, Wengrow contends. Egalitarian rule may even have characterized Mesopotamian cities for their first few hundred years, a period that lacks archaeological evidence of royal burials, armies or large bureaucracies typical of early states, he suggests.

How to change the course of human history
by David Graeber and David Wengrow

Overwhelming evidence from archaeology, anthropology, and kindred disciplines is beginning to give us a fairly clear idea of what the last 40,000 years of human history really looked like, and in almost no way does it resemble the conventional narrative. Our species did not, in fact, spend most of its history in tiny bands; agriculture did not mark an irreversible threshold in social evolution; the first cities were often robustly egalitarian. Still, even as researchers have gradually come to a consensus on such questions, they remain strangely reluctant to announce their findings to the public – or even scholars in other disciplines – let alone reflect on the larger political implications. As a result, those writers who are reflecting on the ‘big questions’ of human history – Jared Diamond, Francis Fukuyama, Ian Morris, and others – still take Rousseau’s question (‘what is the origin of social inequality?’) as their starting point, and assume the larger story will begin with some kind of fall from primordial innocence.

Simply framing the question this way means making a series of assumptions, that 1. there is a thing called ‘inequality,’ 2. that it is a problem, and 3. that there was a time it did not exist. Since the financial crash of 2008, of course, and the upheavals that followed, the ‘problem of social inequality’ has been at the centre of political debate. There seems to be a consensus, among the intellectual and political classes, that levels of social inequality have spiralled out of control, and that most of the world’s problems result from this, in one way or another. Pointing this out is seen as a challenge to global power structures, but compare this to the way similar issues might have been discussed a generation earlier. Unlike terms such as ‘capital’ or ‘class power’, the word ‘equality’ is practically designed to lead to half-measures and compromise. One can imagine overthrowing capitalism or breaking the power of the state, but it’s very difficult to imagine eliminating ‘inequality’. In fact, it’s not obvious what doing so would even mean, since people are not all the same and nobody would particularly want them to be.

‘Inequality’ is a way of framing social problems appropriate to technocratic reformers, the kind of people who assume from the outset that any real vision of social transformation has long since been taken off the political table. It allows one to tinker with the numbers, argue about Gini coefficients and thresholds of dysfunction, readjust tax regimes or social welfare mechanisms, even shock the public with figures showing just how bad things have become (‘can you imagine? 0.1% of the world’s population controls over 50% of the wealth!’), all without addressing any of the factors that people actually object to about such ‘unequal’ social arrangements: for instance, that some manage to turn their wealth into power over others; or that other people end up being told their needs are not important, and their lives have no intrinsic worth. The latter, we are supposed to believe, is just the inevitable effect of inequality, and inequality, the inevitable result of living in any large, complex, urban, technologically sophisticated society. That is the real political message conveyed by endless invocations of an imaginary age of innocence, before the invention of inequality: that if we want to get rid of such problems entirely, we’d have to somehow get rid of 99.9% of the Earth’s population and go back to being tiny bands of foragers again. Otherwise, the best we can hope for is to adjust the size of the boot that will be stomping on our faces, forever, or perhaps to wrangle a bit more wiggle room in which some of us can at least temporarily duck out of its way.

Mainstream social science now seems mobilized to reinforce this sense of hopelessness.

Rethinking cities, from the ground up
by David Wengrow

Settlements inhabited by tens of thousands of people make their first appearance in human history around 6,000 years ago. In the earliest examples on each continent, we find the seedbed of our modern cities; but as those examples multiply, and our understanding grows, the possibility of fitting them all into some neat evolutionary scheme diminishes. It is not just that some early cities lack the expected features of class divisions, wealth monopolies, and hierarchies of administration. The emerging picture suggests not just variability, but conscious experimentation in urban form, from the very point of inception. Intriguingly, much of this evidence runs counter to the idea that cities marked a ‘great divide’ between rich and poor, shaped by the interests of governing elites.

In fact, surprisingly few early cities show signs of authoritarian rule. There is no evidence for the existence of monarchy in the first urban centres of the Middle East or South Asia, which date back to the fourth and early third millennia BCE; and even after the inception of kingship in Mesopotamia, written sources tell us that power in cities remained in the hands of self-governing councils and popular assemblies. In other parts of Eurasia we find persuasive evidence for collective strategies, which promoted egalitarian relations in key aspects of urban life, right from the beginning. At Mohenjo-daro, a city of perhaps 40,000 residents, founded on the banks of the Indus around 2600 BCE, material wealth was decoupled from religious and political authority, and much of the population lived in high quality housing. In Ukraine, a thousand years earlier, prehistoric settlements already existed on a similar scale, but with no associated evidence of monumental buildings, central administration, or marked differences of wealth. Instead we find circular arrangements of houses, each with its attached garden, forming neighbourhoods around assembly halls; an urban pattern of life, built and maintained from the bottom-up, which lasted in this form for over eight centuries.⁶

A similar picture of experimentation is emerging from the archaeology of the Americas. In the Valley of Mexico, despite decades of active searching, no evidence for monarchy has been found among the remains of Teotihuacan, which had its magnificent heyday around 400 CE. After an early phase of monumental construction, which raised up the Pyramids of the Sun and Moon, most of the city’s resources were channelled into a prodigious programme of public housing, providing multi-family apartments for its residents. Laid out on a uniform grid, these stone-built villas — with their finely plastered floors and walls, integral drainage facilities, and central courtyards — were available to citizens regardless of wealth, status, or ethnicity. Archaeologists at first considered them to be palaces, until they realised virtually the entire population of the city (all 100,000 of them) were living in such ‘palatial’ conditions.⁷

A millennium later, when Europeans first came to Mesoamerica, they found an urban civilisation of striking diversity. Kingship was ubiquitous in cities, but moderated by the power of urban wards known as calpolli, which took turns to fulfil the obligations of municipal government, distributing the highest offices among a broad sector of the altepetl (or city-state). Some cities veered towards absolutism, but others experimented with collective governance. Tlaxcalan, in the Valley of Puebla, went impressively far in the latter direction. On arrival, Cortés described a commercial arcadia, where the ‘order of government so far observed among the people resembles very much the republics of Venice, Genoa, and Pisa for there is no supreme overlord.’ Archaeology confirms the existence here of an indigenous republic, where the most imposing structures were not palaces or pyramid-temples, but the residences of ordinary citizens, constructed around district plazas to uniformly high standards, and raised up on grand earthen terraces.⁸

Contemporary archaeology shows that the ecology of early cities was also far more diverse, and less centralised than once believed. Small-scale gardening and animal keeping were often central to their economies, as were the resources of rivers and seas, and indeed the ongoing hunting and collecting of wild seasonal foods in forests or in marshes, depending on where in the world we happen to be.⁹ What we are gradually learning about history’s first city-dwellers is that they did not always leave a harsh footprint on the environment, or on each other; and there is a contemporary message here too. When today’s urbanites take to the streets, calling for the establishment of citizens’ assemblies to tackle issues of climate change, they are not going against the grain of history or social evolution, but with its flow. They are asking us to reclaim something of the spark of political creativity that first gave life to cities, in the hope of discerning a sustainable future for the planet we all share.

Farewell to the ‘Childhood of Man’
by Gyrus

[Robert] Lowie made similar arguments to [Pierre] Clastres, about conscious knowledge of hierarchies among hunter-gatherers. However, for reasons related to his concentration on Amazonian Indians, Clastres missed a crucial point in Lowie’s work. Lowie highlighted the fact that among many foragers, such as the Eskimos in the Arctic, egalitarianism and hierarchy exist within the same society at once, cycling from one to another through seasonal social gatherings and dispersals. Based on social responses to seasonal variations in the weather, and patterns in the migration of hunted animals, not to mention the very human urge to sometimes hang out with a lot of people and sometimes to get the hell away from them, foraging societies often create and then dismantle hierarchical arrangements on a year-by-year basis.

There seems to have been some confusion about exactly what the pattern was. Does hierarchy arise during gatherings? This would tally with sociologist Émile Durkheim’s famous idea that ‘the gods’ were a kind of primitive hypothesis personifying the emergent forces that social complexity brought about. People sensed the dynamics changing as they lived more closely in greater numbers, and attributed these new ‘transcendent’ dynamics to organised supernatural forces that bound society together. Religion and cosmology thus function as naive mystifications of social forces. Graeber detailed ethnographic examples where some kind of ‘police force’ arises during tribal gatherings, enforcing the etiquette and social expectations of the event, but returning to being everyday people when it’s all over.

But sometimes, the gatherings are occasions for the subversion of social order — as is well known in civilised festivals such as the Roman Saturnalia. Thus, the evidence seemed to be confusing, and the idea of seasonal variations in social order was neglected. After the ’60s, the dominant view became that ‘simple’ egalitarian hunter-gatherers were superseded by ‘complex’ hierarchical hunter-gatherers as a prelude to farming and civilisation.

Graeber and Wengrow argue that the evidence isn’t confusing: it’s simply that hunter-gatherers are far more politically sophisticated and experimental than we’ve realised. Many different variations, and variations on variations, have been tried over the vast spans of time that hunter-gatherers have existed (over 200,000 years, compared to the 12,000 or so years we know agriculture has been around). Clastres was right: people were never naive, and resistance to the formation of hierarchies is a significant part of our heritage. However, seasonal variations in social structures mean that hierarchies may never have been a ghostly object of resistance. They have probably been at least a temporary factor throughout our long history.1 Sometimes they functioned, in this temporary guise, to facilitate socially positive events — though experience of their oppressive possibilities usually encouraged societies to keep them in check, and prevent them from becoming fixed.

How does this analysis change our sense of the human story? In its simplest form, it moves the debate from ‘how and when did hierarchy arise?’ to ‘how and when did we get stuck in the hierarchical mode?’. But this is merely the first stage in what Graeber and Wengrow promise is a larger project, which will include analysis of the persistence of egalitarianism among early civilisations, usually considered to be ‘after the fall’ into hierarchy.

 

Alienation and Soul Blindness

There is the view that consciousness* is a superficial overlay, that the animistic-bicameral mind is our fundamental nature and continues to operate within consciousness. In not recognizing this, we’ve become alienated from ourselves and from the world we are inseparable from. We don’t recognize that the egoic voice is but one of many voices and so we’ve lost appreciation for what it means to hear voices, including the internalized egoic voice that we’ve become identified with in submission to its demiurgic authorization. This could be referred to as soul blindness, maybe related to soul loss — basically, a lack of psychological integration and coherency. Is this an inevitability within consciousness? Maybe not. What if a deeper appreciation of voice-hearing was developed within consciousness? What would emerge from consciousness coming to terms with its animistic-bicameral foundation? Would it still be consciousness or something else entirely?

* This is in reference to Julian Jaynes’ use of ‘consciousness’, which refers to the ego mind with its introspective and internal space built upon metaphor and narratization. Such consciousness, as a social construction of a particular kind of culture, is not mere perceptual awareness or biological reactivity.

* * *

Is It as Impossible to Build Jerusalem as It is to Escape Babylon? (Part Two)
by Peter Harrison

Marx identified the concept of alienation as being a separation, or estrangement, from one’s labour. And for Marx the consistent ability to labour, to work purposefully and consciously, as opposed to instinctively, towards a pre-imagined goal, was the trait that distinguished humans from other animals. This means also that humans are able to be persuaded to work creatively, with vigour and passion, for the goals of others, or for some higher goal than the maintenance of daily survival. As long as they are able to see some tiny benefit for themselves, which might be service to a higher cause, or even just simple survival, since working for the goal of others may be the only means of obtaining food. So, Marx’s definition of alienation was more specific than an ‘existential’ definition because it specified labour as the defining human characteristic. But he was also aware that the general conditions of capitalism made this alienation more acute and that this escalated estrangement of humans from immediately meaningful daily activity led to a sense of being a stranger in one’s own world, and not only for the working class. This estrangement (I want to write étranger-ment, to reference Camus, but this is not a word) afflicted all classes, even those classes that seemed to benefit from class society, since capitalism had, even by his own time, gained an autonomy of its own. Life is as meaningless [or better: as anti-human] for a cleaner as it is for the head of a large corporation. This is why Marx stated that all people under capitalism were proletarian.

When I discovered the idea of soul blindness in Eduardo Kohn’s book, How Forests Think, I was struck by it as another useful way of understanding the idea of alienation. The concept of soul blindness, as used by the Runa people described by Kohn, seems to me to be related to the widespread Indigenous view of the recently deceased as aimless and dangerous beings who must be treated with great care and respect after their passing to prevent them wreaking havoc on the living. In Kohn’s interpretation, to be soul blind is to have reached the ‘terminus of selfhood,’ and this terminus can be reached while still alive, when one loses one’s sense of self through illness or despair, or even when one just drifts off into an unfocussed daze, or, more profoundly, sinks into an indifference similar to — to reference Camus again — that described by the character Meursault, in L’Etranger.

There are some accounts of Indigenous people first encountering white people in which the white people are initially seen as ghosts; one is recorded by Lévi-Strauss for Vanuatu. Another is embedded in the popular Aboriginal history of the area I live in. On first contact the white people are immediately considered to be some kind of ghost because of their white skin. This may have something to do with the practice of preserving the bodies of the dead. This involves scraping off the top layer of skin which, apparently, makes the body white. This practice is described by the anthropologist, Atholl Chase, in his reminiscences of Cape York. But for me there is more to the defining of the white intruders as ghosts because of their white skin. These foreigners also act as if they are soul blind. They are like machines, working for a cause that is external to them. For the Indigenous people these strangers do not seem to have soul: they are unpredictable; dangerous; they don’t know who they are.

But it is the anthropologist Eduardo Viveiros de Castro who, I think, connects most clearly to the work of James Hillman on the notion of the soul. James Hillman uses the term soul but he does not mean a Christian soul and he is not ultimately meaning the mind. For him the soul is a form of mediation between events and the subject and, in this sense, it might be similar to Bourdieu’s conception of ‘disposition.’ For Viveiros de Castro, ‘A perspective is not a representation because representations are a property of the mind or spirit, whereas the point of view is located in the body.’ Thus, Amerindian philosophy, which Viveiros de Castro is here describing, perhaps prefigures Hillman’s notion that ‘soul’ is ‘a perspective rather than a substance, a viewpoint towards things rather than a thing itself.’

Islamic Voice-Hearing

Islam, what kind of religion is it? Islam is the worship of a missing god; that is how we earlier described it. Some might consider that unfair and dismissive to one of the world’s largest religions, but this is true to some extent for all post-bicameral religions. The difference is that Islam is among the most post-bicameral of the world religions. This is true simply in temporal terms.

The bicameral societies, according to Julian Jaynes, ended with the widespread collapse of the late Bronze Age empires and their trade networks. That happened around 1177 BCE, as the result of natural disasters and attacks by the mysterious Sea People, the latter perhaps having formed out of the refugees from the former. The Bronze Age continued for many centuries in various places: 700 BCE in Great Britain, Central Europe and China; 600 BCE in Northern Europe; 500 BCE in Korea and Ireland; and centuries beyond that in places like Japan.

But the Bronze Age Empires never returned. In that late lingering Bronze Age, a dark age took hold and put all of civilization onto a new footing. This was the era when, across numerous cultures, there were the endless laments about the gods, spirits, and ancestors having gone silent, having abandoned humanity. Entire cultural worldviews and psychological ways of being were utterly demolished or else irreparably diminished. This created an intense sense of loss, longing, and nostalgia that has never left humanity since.

Out of the ashes, while the Bronze Age was still holding on, the Axial Age arose around 900 BCE and continued until 200 BCE. New cultures were formed and new empires built. The result is what Jaynes described as ‘consciousness’ or what one can think of as introspective mental space, an inner world of egoic identity where the individual is separate from community and world. Consciousness and the formalized religions that accompanied it were a replacement for the loss of a world alive with voices.

By the time Rabbinic Judaism, Gnosticism, and Christianity came around, the Axial Age was already being looked back upon as a Golden Age and, other than through a few surviving myths, the Bronze Age before that was barely remembered at all. It would be nearly another 600 years after that first century monotheistic revival when Muhammad would have his visions of the angel Gabriel visiting him to speak on behalf of God. Islam is thus both post-bicameral and post-axial to a far greater degree than the other monotheistic religions.

Muslims consider Muhammad to be the last prophet, and even he didn’t get to hear God directly, for it had to come through an angel. The voice of God had long ago grown so faint that people had come to rely on oracles, channelings, and such. These rather late revelations by way of Gabriel were but a barely audible echo of the archaic bicameral voices. It is perhaps understandable that, as with some oracles before him, Muhammad would declare God would never speak again. So Islam, unlike the other monotheistic religions, fully embraces God’s absence from the world.

Actually, that is not quite right. Based on the Koran, God will never speak again until the Final Judgment. Then all will hear God again when he weighs your sins and decides the fate of your immortal soul. Here is the interesting part. The witnesses God shall call upon in each person’s case will be all the bicameral voices brought back out of silence. The animals and plants will witness for or against you, as will the earth and rocks and wind. Even your own resurrected body parts will come alive again with voices to speak of what you did. Body parts speaking is something familiar to those who read Jaynesian scholarship.

Until then, God and all the voices of the world will remain mute witnesses, watching your every move and taking notes. They see all, hear all, notice all — every time you masturbate or pick your nose, every time you have a cruel or impure thought, every time you don’t follow one of the large number of divine commandments, laws, and rules spelled out in the Koran. The entire world is spying upon you and will report back to God, at the end of time. The silent world only appears to be dumb and unconscious. God is biding his time, gathering a file on you like a cosmic FBI.

This could feel paralyzing, but in another way it offers total freedom from self, total freedom through complete submission. Jaynesian consciousness is a heavy load and that was becoming increasingly apparent over time, especially in the centuries following the Axial Age. The zealous idealism of the Axial Age prophets was growing dull and tiresome. By the time that Muhammad showed up, almost two millennia had passed since the bicameral mind descended into darkness. The new consciousness was sold as something amazing, but it hadn’t fully lived up to its promises. Instead, ever more brutal regimes came into power and a sense of anxiety was overtaking society.

Muhammad had an answer, and the people of that region were obviously hungry for someone to provide one. After forming his large army, his military campaign barely experienced any resistance. And in a short period of time, while he was still alive, most of the Arabian peninsula was converted to Islam. The silence of the gods had weakened society, but Muhammad offered an explanation for why the divine could no longer be experienced. He helped normalize what had once felt like a tragedy. He told them that they didn’t need to hear God because God had already revealed all knowledge to the prophets, including himself of course. No one had to worry; just follow orders and comply with commands.

All the tiresome complications of thought were unnecessary. God had already thought out everything for humans. The Koran, as the final and complete holy text, would entirely and permanently replace the bicameral voices, ever receding into the shadows of the psyche. But don’t worry, all those voices are still there, waiting to speak. The only voice that the individual needed to listen to was that of the person directly above them in the religious hierarchy, be it one’s father or an imam or whoever else holds greater official authority, in a line of command that goes back through the prophets and the angels to God Himself. Everything is in the Koran, and the learned priestly class would explain it all and translate it into proper theocratic governance.

Muhammad came with a different message than anyone before him. The Jewish prophets and Jesus, as with many Pagans, would speak of God as Father and humanity as His children. Early Christians took this as a challenge to a slave-based society, borrowing from the Stoics the idea that even a slave was free in his soul. Muhammad, instead, was offering another variety of freedom. We humans, rather than children of God, are slaves of God. The entire Islamic religion is predicated upon divine slavery, absolute submission. This is freedom from the harsh taskmaster of egoic individuality, a wannabe demiurge. Unlike Jesus, Muhammad formulated a totalitarian theocracy, a totalizing system. Nothing is left to question or interpretation, at least in theory, or rather in belief.

This goes back to how, with the loss of the bicameral mind and social order, something took its place. It was a different kind of authoritarianism — rigid and hierarchical, centralized and concentrated, despotic and violent. Authoritarianism of this variety didn’t emerge until the late Bronze Age when the bicameral societies were becoming too large and complex, overstrained and unstable. Suddenly, as if to presage the coming collapse, there was the appearance of written laws, harsh punishment, and cruel torture — none of which ever existed before, according to historical records and archaeological finds. As the world shifted into post-bicameralism, this authoritarianism became ever more extreme (e.g., Roman Empire).

This was always the other side of the rise of individuality, of Jaynesian consciousness. The greater potential freedom the individual possesses, the more oppressive social control is required, as the communal bonds and social norms of the bicameral mind increasingly lost their hold to organically maintain order. Muhammad must have shown up at the precise moment of crisis in this change. After the Roman Empire’s system of slavery, Europe came up with feudalism to re-create some of what had disappeared. But apparently a different kind of solution was required in the Arab world.

Maybe this offsets the draining of psychic energy that comes with consciousness. Jaynes speculated that, like the schizophrenic, bicameral humans had immense energy and stamina, which allowed them to accomplish near-miraculous feats such as building the pyramids with small populations and very little technology or infrastructure. Suppressing the extremes of individualism by emphasizing absolute subordination is maybe a way of keeping in check the energy cost of maintaining egoic consciousness. In the West, we eventually overcame this weakness by using massive doses of stimulants to overpower the otherwise debilitating anxiety and to help shore up the egoic boundaries, but this has come at the cost of destroying our physical and mental health.

Time will tell which strategy is the most effective for the long-term survival of specific societies. But I’m not sure I’d bet on the Western system, considering how unsustainable it appears to be and how easily it has become crippled by a minor disease epidemic like covid-19. Muhammad might simply have been trying to cobble together some semblance of a bicameral mind in the face of divine silence. There is a good reason for trying to do that. Those bicameral societies lasted many millennia longer than has our post-bicameral civilization. It’s not clear that modern civilization, or at least Western civilization, will last beyond the end of this century. We underestimate the bicameral mind and the important role it played during the single longest period of advancement of civilization.

* * *

Let us leave a small note of a more personal nature. In the previous post (linked above), we mentioned that our line of inquiry began with a conversation we had with a friend of ours who is a Muslim. He also happens to be schizophrenic, i.e., a voice-hearer. The last post was about how voice-hearing is understood within Islam. Since supposedly God no longer speaks to humans, nor do his angelic intermediaries, any voice a Muslim hears is automatically interpreted as not being of divine origins. That doesn’t necessarily make the voice evil, as it could be a jinn, which is a neutral entity in Islamic theology, although jinn can be dangerous. Then again, voice-hearing might also be caused by an evil magician, what I think is called a sihir.

Anyway, we had the opportunity to speak to this friend once again, as we are both in jobs that require us to continue working downtown amidst everything otherwise being locked down because of the covid-19 epidemic. In being isolated from family and other friends, we’ve been meeting with this Muslim friend on a daily basis. Just this morning, we went for a long walk together and chatted about life and religion. He had previously talked about his schizophrenia in passing, apparently unworried by the stigma of it. He is an easy person to talk to, quite direct and open about his thoughts and experiences. I asked him about voice-hearing and he explained that, prior to being medicated, he would continue to hear people speak to him after they were no longer present. And unsurprisingly, the voices were often negative.

Both his imam and his therapist told him to ignore the voices. Maybe that is a standard approach in traditionally monotheistic cultures. As we mentioned in the other post, he is from North Africa where Arabs are common. But another friend of ours lives in Ghana, in West Africa. Tanya M. Luhrmann, an anthropologist inspired by Julian Jaynes, compared the voice-hearing experience of people in Ghana to that of people in the United States. She found that Ghanaians, with a tradition of voice-hearing (closer to bicameralism?), had a much more positive experience of the voices they heard. Americans, like our Muslim friend, did not tend to hear voices that were kind and helpful. This is probably the expectancy effect.

If you are raised to believe that voices come from demons or their Islamic equivalent, jinn, or from witches and evil magicians, or if you simply have been told voice-hearing means you’re insane, well, it’s not likely to lead to happy results when you do hear voices. I doubt it decreases the rate of voice-hearing, though. In spite of Islamic theology denying that God and angels speak to humans any longer, that isn’t likely to have any effect on voice-hearing itself. So the repressed bicameral mind keeps throwing out these odd experiences, but in our post-bicameral age we have fewer resources for dealing constructively with those voices. Simply denying and ignoring them is probably less than helpful.

That is the ultimate snag. The same voices that once were identified as godly or something similar are now taken as false, unreal, or dangerous. In a sense, God never stopped speaking. One could argue that we all are voice-hearers, but some of us now call the voice of God ‘conscience’ or whatever. Others, like Muslims, put great emphasis on this voice-hearing but have tried to gag a God who goes on talking. Imagine how many potential new prophets have been locked away in psychiatric wards or, much worse, killed or imprisoned as heretics. If God can’t be silenced, the prophets who hear him can be. The Old Testament even describes how the authorities forbade voice-hearing and demanded that voice-hearers be killed, even by their own parents.

The bicameral mind didn’t disappear naturally because it was inferior but because, in its potency, it was deemed dangerous to those who wanted to use brute power to enforce their own voices of authorization. The bicameral mind, once central to the social order, had become enemy number one. If people could talk to God directly, religion and its claims of authority would become irrelevant. That is how our Muslim friend, a devout religious practitioner, ended up being drugged up to get the voices to stop speaking.

Islam as Worship of a Missing God

A friend of ours is a Muslim and grew up in an Islamic country. As he talked about his religion, we realized how different it is from Christianity. There is no shared practice among Christians similar to praying five times a day. From early on, Christianity was filled with diverse groups and disagreements, and that has only increased over time (there are over 4,600 denominations of Christianity in the United States alone). My friend had a hard time appreciating that there is no agreed-upon authority, interpretation, or set of beliefs among all Christians.

Unlike Muhammad, Jesus never wrote anything, nor was anything written down about him until much later. Nor did he intend to start a new religion. He offered no rules, social norms, instructions, etc., for how to organize a church, a religious society, or a government. He didn’t even preach family values; if anything, the opposite — from a command to let the dead bury themselves to the proclamation of having come to turn family members against each other. The Gospels offer no practical advice about anything. Much of Jesus’ teachings, beyond a general message of love and compassion, are vague and enigmatic, often parables that have many possible meanings.

Now compare Jesus to the Islamic prophet. Muhammad is considered the last prophet, although he never claimed to have heard the voice of God directly, instead supposedly receiving the message secondhand through an angel. Still, according to Muslims, the Koran is the only complete holy text in existence — the final Word of God. That is also something that differs from Christianity. Jesus never asserted that God would become silent to all of humanity for eternity and that his worshippers would be condemned to a world without the God they longed for, in the way Allah never enters His own Creation.

Many Protestants and Anabaptists and those in similar groups believe that God continues to be revealed to people today, that the divine is known through direct experience, that the Bible as a holy text must be read as a personal relationship to God, not merely taken on the authority of blind faith. Some churches go so far as to teach people how to speak to and hear God (T.M. Luhrmann, When God Talks Back). Even within Catholicism, there have been further revelations of God since Jesus, from various mystics and saints acknowledged by the Vatican but also from ordinary Catholics claiming God spoke to them, without any great fear of being charged with heresy or excommunicated.

It made me think about Julian Jaynes’ theory of modern consciousness. With the collapse of the Bronze Age civilizations, there was this sense of the gods having gone silent. Yet this was never an absolute experience, as some people continued to hear the gods. Even into the modern world, occasionally people still claim to hear various gods and sometimes even found new religions based on revelations. The Bahai, for example, consider Muhammad to be just one more prophet, with others having followed him. Hindus also have a living tradition of divine revelation that is equivalent to that of prophets. Only Islam, as far as I know, claims all prophecy and revelation to be ended for all time.

I was thinking about the sense of loss and loneliness people felt when bicameral societies came to an end. They were thrown onto an increasingly isolated individualism. Religion as we know it was designed to accommodate this, in order to give a sense of order, meaning, and authority that had gone missing. But Islam takes this to an extreme. After Muhammad, no human supposedly would ever again personally hear, see, or experience the divine in any way (excluding mystical traditions like Sufism). For all intents and purposes, Allah has entirely receded from the world. The only sign of his existence that he left behind was a book of instructions. We must submit and comply or be punished in the afterlife, a world separate from this one.

That seems so utterly depressing and dreary to me. I was raised Christian and on the far other extreme of Protestantism. My family attended the Unity Church that emphasizes direct experience of God to such a degree that the Bible itself was mostly ignored and almost irrelevant — why turn to mere words on paper when you can go straight to the source? Rather than being denied and condemned, to claim to have heard God speak would have been taken seriously. I’m no longer religious, but the nearly deist idea of a god that is distant and silent seems so alien and unappealing to me. Yet maybe that makes Islam well designed for the modern world, as it offers a strong response to atheism.

If you don’t have any experience of God, this is considered normal and expected in Islam, not something to be worried about, not something to challenge one’s faith, as is common in Christianity (NDE: Spirituality vs Religiosity); and it avoids the riskiness and confusion of voice-hearing (Libby Anne, Voices in Your Head: Evangelicals and the Voice of God). One’s ignorance of the divine demonstrates one’s individual inadequacy and, as argued by religious authority, is all the more reason to submit to religious authority. The Islamic relation between God and humanity is one-way, except to some extent by way of inspiration and dreams, but Allah himself never directly enters his Creation and so never directly interacts with humans, not even with prophets. Is that why constant prayer is necessary for Muslims, to offset God’s silence and vacancy? Worship of a missing God seems perfectly suited for the modern world.

Muslims are left with looking for traces of God in the Koran like ants crawling around in a footprint while trying to comprehend what made it and what it wants them to do. So, some of the ants claim to be part of a direct lineage of ants that goes back to an original ant that, according to tradition, was stepped upon by what passed by. These well-respected ants then explain to all the other ants what is meant by all the bumps and grooves in the dried mud. In worship, the ants pray toward the footprint and regularly gather to circle around it. This gives their life some sense of meaning and purpose and, besides, it maintains the social order.

That is what is needed in a world where the bicameral voices of archaic authorization no longer speak, no longer are heard. Something has to fill the silence, as the loneliness it creates is unbearable. Islam has a nifty trick, embracing the emptiness and aggravating the overwhelming anxiety even as it offers the salve for the soul. Muslims take the silence of God as proof of God, as a promise of something more. This otherworldly being, Allah, tells humans who don’t feel at home in this world that their real home is elsewhere, to which they will return if they do what they are told. Other religions do something similar, but Islam takes this to another level — arguably, the highest or most extreme form of monotheism, so far. The loss of the bicameral mind could not be pushed much further, one suspects, without being pushed into an abyss.

Islam is a truly modern religion. Right up there with capitalism and scientism.

* * *

Further discussion about this can be found on the Facebook page “Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind”.


To Empathize is to Understand

What is empathy as a cognitive ability? And what is empathy as an expansion of identity, as part of awareness of self and other?

There is a basic level of empathy that appears to be common across numerous species. Tortoises, when seeing another on its back, will help flip it over. There are examples of animals helping or cooperating with those from an entirely different species. Such behavior has been repeatedly demonstrated in laboratories as well. These involve fairly advanced expressions of empathy. In some cases, one might interpret it as indicating at least rudimentary theory of mind, the understanding that others have their own experience, perspective, and motivations. But obviously human theory of mind can be much more complex.

One explanation of greater empathy has to do with identity. Empathy, in a way, is simply a matter of what is included within one’s personal experience (Do To Yourself As You Would Do For Others). To extend identity is to extend empathy to another individual or a group (or anything else that can be brought within the sphere of the self). For humans, this can mean learning to include one’s future self, to empathize with an experience one has not yet had, the person one has not yet become. The future self is fundamentally no different than another person.

Without cognitive empathy, affective empathy is limited to immediate experience. Affective empathy is the ability to feel what another feels. But lacking cognitive empathy, as happens in the most severe autism, theory of mind cannot be developed and so there is no way to identify, locate, and understand that feeling. One can only emotionally react, not being able to differentiate one’s own emotion from that of another. In that case, there would be pure emotion, and yet no recognition of the other. Cognitive empathy is necessary to get beyond affective reactivity, not all that different than the biological reactivity of a slug.

It’s interesting that some species (primates, rats, dolphins, etc.) might have more cognitive empathy and theory of mind than some people at the extreme end of severe autism, so this is not necessarily an issue of intelligence. On the other hand, those high functioning on the autistic spectrum, if intervention happens early enough, can be taught theory of mind, although it is challenging for them. This kind of empathy is considered a hallmark of humanity, a defining feature, which is why its absence leads to problems of social behavior for those with autism spectrum disorder.

Someone entirely lacking in theory of mind would be extremely difficult to communicate and interact with beyond the most basic level, as is seen in the severest cases of autism and other extreme developmental conditions. Helen Keller asserts she had no conscious identity, no theory of her own mind or that of others, until she learned language.* Prior to her awakening, she was aggressive and violent in reacting to a world she couldn’t understand, articulate, or think about. That fits in with the speculations of Julian Jaynes. What he calls ‘consciousness’ is the addition of abstract thought by way of metaphorical language, as built upon concrete experience and raw affect. Keller discusses how her experience went from the concreteness of touch to the abstraction of language. In becoming aware of the world, she became aware of herself.

Without normal development of language, the human mind is crippled: “The “black silence” of the deaf, blind and mute is similar in many respects to the situation of acutely autistic children where there are associated difficulties with language and the children seem to lack what has been called “a theory of mind” ” (Robin Allott, Helen Keller: Language and Consciousness). Even so, there is more to empathy than language, and that might be true as well for some aspects or kinds of cognitive empathy. Language is not the only form of communication.

Rats are a great example for comparison with humans. We think of them as pests, as psychologically inferior. But anyone who has kept rats knows how intelligent and social they are. They are friendlier and more interactive than the typical cat. And research has shown how cognitively advanced they are in learning. Rats do have the typical empathy of concern for others. For example, they won’t hurt another rat in exchange for a reward and, given the choice, would rather go hungry. But it goes beyond that.

It’s also been shown that “rats are more likely and quicker to help a drowning rat when they themselves have experienced being drenched, suggesting that they understand how the drowning rat feels” (Kristin Andrews, Rats are us). And “rats who had been shocked themselves were less likely to allow other rats to be shocked, having been through the discomfort themselves.” They can also learn to play hide-and-seek, which necessitates taking on the perspective of others. As Ed Yong asks in The Game That Made Rats Jump for Joy, “In switching roles, for example, are they taking on the perspective of their human partners, showing what researchers call “theory of mind”?”

That is much more than mere affective empathy. This seems to involve active sympathy and genuine emotional understanding, that is to say cognitive empathy and theory of mind. If they are capable of both affective and cognitive empathy, however limited, and if Jaynesian consciousness partly consists of empathy imaginatively extended in space and time, then a case could be made that rats have more going on than simple perceptual awareness and biological reactivity. They are empathically and imaginatively engaging with others in the world around them. Does this mean they are creating and maintaining a mental model of others? Kristin Andrews details the extensive abilities of rats:

“We now know that rats don’t live merely in the present, but are capable of reliving memories of past experiences and mentally planning ahead the navigation route they will later follow. They reciprocally trade different kinds of goods with each other – and understand not only when they owe a favour to another rat, but also that the favour can be paid back in a different currency. When they make a wrong choice, they display something that appears very close to regret. Despite having brains that are much simpler than humans’, there are some learning tasks in which they’ll likely outperform you. Rats can be taught cognitively demanding skills, such as driving a vehicle to reach a desired goal, playing hide-and-seek with a human, and using the appropriate tool to access out-of-reach food.”

To imagine the future for purposes of thinking in advance and planning actions, that is quite advanced cognitive behavior. Julian Jaynes argued that was the purpose of humans developing a new kind of consciousness, as the imagined metaphorical space that is narratized allows for the consideration of alternatives, something he speculates was lacking in humans prior to the Axial Age when behavior supposedly was more formulaic and predetermined according to norms, idioms, etc. Yet rats can navigate a path they’ve never taken before with novel beginning and ending locations, which would require taking into account multiple options. What theoretically makes Jaynesian consciousness unique?

Jaynes argues that it’s the metaphorical inner space that is the special quality that created the conditions for the Axial Age and all that followed from it, the flourishing of complex innovations and inventions, the ever greater extremes of abstraction seen in philosophy, math and science. We have so strongly developed this post-bicameral mind that we barely can imagine anything else. But we know that other societies have very different kinds of mentalities, such as the extended and fluid minds of animistic cultures. What exactly is the difference?

Australian Aborigines give a hint of something between the two kinds of mind. In some ways, their mnemonic systems represent a more complex cognitive ability than we are capable of with our Jaynesian consciousness. Instead of an imagined inner space, the Songlines are vast systems of experience and knowledge, culture and identity overlaid upon immense landscapes. These mappings of externalized cognitive space can be used to guide the individual across distant territories the individual has never seen before and help them to identify and use the materials (plants, stones, etc.) at a location no one in their tribe has visited for generations. Does this externalized mind have less potential for advanced abilities? Upon Western contact, Aborigines had farming and ranching, kept crop surpluses in granaries, and used water and land management.

It’s not hard to imagine civilization having developed along entirely different lines based on divergent mentalities and worldviews. Our modern egoic consciousness was not an inevitability and it likely is far from offering the most optimal functioning. We might already be hitting a dead end with our present interiorized mind-space. Maybe it’s our lack of empathy in understanding the minds of other humans and other species that is an in-built limitation to the post-bicameral world of Jaynesian consciousness. And so maybe we have much to learn from entirely other perspectives and experiences, even from rats.

* * *

* Helen Keller, from Light in My Darkness:

I had no concepts whatever of nature or mind or death or God. I literally thought with my body. Without a single exception my memories of that time are tactile. . . . But there is not one spark of emotion or rational thought in these distinct yet corporeal memories. I was like an unconscious clod of earth. There was nothing in me except the instinct to eat and drink and sleep. My days were a blank without past, present, or future, without hope or anticipation, without interest or joy. Then suddenly, I knew not how or where or when, my brain felt the impact of another mind, and I awoke to language, to knowledge, to love, to the usual concepts of nature, good, and evil. I was actually lifted from nothingness to human life.

And from The Story of My Life:

As the cool stream gushed over one hand she spelled into the other the word water, first slowly, then rapidly. I stood still, my whole attention fixed upon the motions of her fingers. Suddenly I felt a misty consciousness as of something forgotten—a thrill of returning thought; and somehow the mystery of language was revealed to me. I knew then that ‘w-a-t-e-r’ meant the wonderful cool something that was flowing over my hand. That living word awakened my soul, gave it light, hope, joy, set it free! There were barriers still, it is true, but barriers that could in time be swept away.

And from The World I Live In:

Before my teacher came to me, I did not know that I am. I lived in a world that was a no-world. I cannot hope to describe adequately that unconscious, yet conscious time of nothingness. I did not know that I knew aught, or that I lived or acted or desired. I had neither will nor intellect. I was carried along to objects and acts by a certain blind natural impetus. I had a mind which caused me to feel anger, satisfaction, desire. These two facts led those about me to suppose that I willed and thought. I can remember all this, not because I knew that it was so, but because I have tactual memory. It enables me to remember that I never contracted my forehead in the act of thinking. I never viewed anything beforehand or chose it. I also recall tactually the fact that never in a start of the body or a heart-beat did I feel that I loved or cared for anything. My inner life, then, was a blank without past, present, or future, without hope or anticipation, without wonder or joy or faith. […]

Since I had no power of thought, I did not compare one mental state with another. So I was not conscious of any change or process going on in my brain when my teacher began to instruct me. I merely felt keen delight in obtaining more easily what I wanted by means of the finger motions she taught me. I thought only of objects, and only objects I wanted. It was the turning of the freezer on a larger scale. When I learned the meaning of “I” and “me” and found that I was something, I began to think. Then consciousness first existed for me. Thus it was not the sense of touch that brought me knowledge. It was the awakening of my soul that first rendered my senses their value, their cognizance of objects, names, qualities, and properties. Thought made me conscious of love, joy, and all the emotions. I was eager to know, then to understand, afterward to reflect on what I knew and understood, and the blind impetus, which had before driven me hither and thither at the dictates of my sensations, vanished forever.

I cannot represent more clearly than any one else the gradual and subtle changes from first impressions to abstract ideas. But I know that my physical ideas, that is, ideas derived from material objects, appear to me first an idea similar to those of touch. Instantly they pass into intellectual meanings. Afterward the meaning finds expression in what is called “inner speech.”  […]

As my experiences broadened and deepened, the indeterminate, poetic feelings of childhood began to fix themselves in definite thoughts. Nature—the world I could touch—was folded and filled with myself. I am inclined to believe those philosophers who declare that we know nothing but our own feelings and ideas. With a little ingenious reasoning one may see in the material world simply a mirror, an image of permanent mental sensations. In either sphere self-knowledge is the condition and the limit of our consciousness. That is why, perhaps, many people know so little about what is beyond their short range of experience. They look within themselves—and find nothing! Therefore they conclude that there is nothing outside themselves, either.

However that may be, I came later to look for an image of my emotions and sensations in others. I had to learn the outward signs of inward feelings. The start of fear, the suppressed, controlled tensity of pain, the beat of happy muscles in others, had to be perceived and compared with my own experiences before I could trace them back to the intangible soul of another. Groping, uncertain, I at last found my identity, and after seeing my thoughts and feelings repeated in others, I gradually constructed my world of men and of God. As I read and study, I find that this is what the rest of the race has done. Man looks within himself and in time finds the measure and the meaning of the universe.

* * *

As an example of how language relates to emotions:

The ‘untranslatable’ emotions you never knew you had
by David Robson

But studying these terms will not just be of scientific interest; Lomas suspects that familiarising ourselves with the words might actually change the way we feel ourselves, by drawing our attention to fleeting sensations we had long ignored.

“In our stream of consciousness – that wash of different sensations feelings and emotions – there’s so much to process that a lot passes us by,” Lomas says. “The feelings we have learned to recognise and label are the ones we notice – but there’s a lot more that we may not be aware of. And so I think if we are given these new words, they can help us articulate whole areas of experience we’ve only dimly noticed.”

As evidence, Lomas points to the work of Lisa Feldman Barrett at Northeastern University, who has shown that our abilities to identify and label our emotions can have far-reaching effects.

Her research was inspired by the observation that certain people use different emotion words interchangeably, while others are highly precise in their descriptions. “Some people use words like anxious, afraid, angry, disgusted to refer to a general affective state of feeling bad,” she explains. “For them, they are synonyms, whereas for other people they are distinctive feelings with distinctive actions associated with them.”

This is called “emotion granularity” and she usually measures this by asking the participants to rate their feelings on each day over the period of a few weeks, before she calculates the variation and nuances within their reports: whether the same old terms always coincide, for instance.

Importantly, she has found that this then determines how well we cope with life. If you are better able to pin down whether you are feeling despair or anxiety, for instance, you might be better able to decide how to remedy those feelings: whether to talk to a friend, or watch a funny film. Or being able to identify your hope in the face of disappointment might help you to look for new solutions to your problem.

In this way, emotion vocabulary is a bit like a directory, allowing you to call up a greater number of strategies to cope with life. Sure enough, people who score highly on emotion granularity are better able to recover more quickly from stress and are less likely to drink alcohol as a way of recovering from bad news. It can even improve your academic success. Marc Brackett at Yale University has found that teaching 10 and 11-year-old children a richer emotional vocabulary improved their end-of-year grades, and promoted better behaviour in the classroom. “The more granular our experience of emotion is, the more capable we are to make sense of our inner lives,” he says.

Both Brackett and Barrett agree that Lomas’s “positive lexicography” could be a good prompt to start identifying the subtler contours of our emotional landscape. “I think it is useful – you can think of the words and the concepts they are associated with as tools for living,” says Barrett. They might even inspire us to try new experiences, or appreciate old ones in a new light.

* * *

And related to all of this is hypocognition, overlapping with linguistic relativity — in how language and concepts determine our experience, identity, and sense of reality — constraining and framing and predetermining what we are even capable of perceiving, thinking about, and expressing:

Hypocognition is a censorship tool that mutes what we can feel
by Kaidi Wu

It is a strange feeling, stumbling upon an experience that we wish we had the apt words to describe, a precise language to capture. When we don’t, we are in a state of hypocognition, which means we lack the linguistic or cognitive representation of a concept to describe ideas or interpret experiences. The term was introduced to behavioural science by the American anthropologist Robert Levy, who in 1973 documented a peculiar observation: Tahitians expressed no grief when they suffered the loss of a loved one. They fell sick. They sensed strangeness. Yet, they could not articulate grief, because they had no concept of grief in the first place. Tahitians, in their reckoning of love and loss, and their wrestling with death and darkness, suffered not from grief but a hypocognition of grief. […]

But the darkest form of hypocognition is one born out of motivated, purposeful intentions. A frequently overlooked part of Levy’s treatise on Tahitians is why they suffered from a hypocognition of grief. As it turns out, Tahitians did have a private inkling of grief. However, the community deliberately kept the public knowledge of the emotion hypocognitive to suppress its expression. Hypocognition was used as a form of social control, a wily tactic to expressly dispel unwanted concepts by never elaborating on them. After all, how can you feel something that doesn’t exist in the first place?

Intentional hypocognition can serve as a powerful means of information control. In 2010, the Chinese rebel writer Han Han told CNN that any of his writings containing the words ‘government’ or ‘communist’ would be censored by the Chinese internet police. Ironically, these censorship efforts also muffled an abundance of praise from pro-leadership blogs. An effusive commendation such as ‘Long live the government!’ would be censored too, for the mere mention of ‘government’.

A closer look reveals the furtive workings of hypocognition. Rather than rebuking negative remarks and rewarding praises, the government blocks access to any related discussion altogether, rendering any conceptual understanding of politically sensitive information impoverished in the public consciousness. ‘They don’t want people discussing events. They simply pretend nothing happened… That’s their goal,’ Han Han said. Regulating what is said is more difficult than ensuring nothing is said. The peril of silence is not a suffocation of ideas. It is to engender a state of blithe apathy in which no idea is formed.

Do To Yourself As You Would Do For Others

“…our impulse control is less based on an order from our executive command center, or frontal cortex, and more correlated with the empathic part of our brain. In other words, when we exercise self-control, we take on the perspective of our future self and empathize with that self’s perspectives, feelings, and motivations.”
~ Alexander Soutschek

Self-control is rooted in self-awareness. Julian Jaynes and Brian McVeigh, in one of their talks, brought up the idea that “mind space” has increased over time: “The more things we think about, the more distinctions we make in our consciousness between A and B, and so on, the more mind-space there is” (Discussions with Julian Jaynes, ed. by Brian J. McVeigh, p. 40). The first expansion was the creation of introspective consciousness itself. Narratization allowed that consciousness to also extend across time, to imagine possibilities and play out scenarios and consider consequences. Empathy, as we experience it, might be a side effect of this as consciousness includes more and more within it, including empathy with our imagined future self. So, think of self-control as being kind to yourself, to your full temporal self, not only your immediate self.

This would relate to the suggestion that humans learn theory of mind, the basis of cognitive empathy, first by observing others and only later apply it to themselves. That is to say, the first expansion of mental space, as consciousness takes root, happens within relationship to others. It is in realizing that there might be inner experience within someone else that we come to claim inner space in our own experience. So, our very ability to understand ourselves is dependent on empathy with others. This was a central purpose of the religions that arose in the Axial Age, the traditions that continue into the modern world* (Tahere Salehi, The Effect of Training Self-Control and Empathy According to Spirituality on Self-Control and Empathy Preschool Female Students in Shiraz City). The prophets that emerged during that era taught love, compassion, and introspection, not only as an otherworldly moral dictum but also as a way of maintaining group coherence and the common good. The breakdown of what Jaynes called the bicameral mind was traumatic, and a new empathic mind was needed to replace it, if only to maintain social order.

Social order has become a self-conscious obsession ever since, as Jaynesian consciousness in its tendency toward rigidity has inherent weaknesses. Social disconnection is a crippling of the mind because the human psyche is inherently social. Imagining our future selves is a relationship with a more expansive sense of self. It’s the same mechanism as relating to any other person. This goes back to Johann Hari’s idea, based on Bruce K. Alexander’s rat park research, that the addict is the ultimate individual. In this context, this ultimate individual lacking self-control is not only disconnected from other people but also disconnected from themselves. Addiction is isolating and isolation promotes addiction. Based on this understanding, I’ve proposed that egoic consciousness is inherently addictive and that post-axial society is dependent on addiction for social control.

But this psychological pattern is seen far beyond addiction, and it fits our personal experience of self. When we were severely depressed, we couldn’t imagine or care about the future. That definitely inhibited self-control and led to more impulsive behavior, a present-oriented psychological survival mode. Then again, the only reason self-control is useful at all is that, during and following the Axial Age, humans increasingly lost the capacity to be part of a communal identity (extended mind/self), the identity that had created the conditions of communal control: the externally perceived commands of archaic authorization through voice-hearing. With that communal identity went a communal empathy, something that sounds strange or unappealing to the modern mind. In denying our social nature, we cast the shadow of authoritarianism, an oppressive and often violent enforcement of top-down control.

By the way, this isn’t merely about psychology. Lead toxicity causes higher rates of impulsivity and aggression. This is not personal moral failure but brain damage from poisoning. Sure, teaching brain-damaged kids and adults to have more empathy might help them overcome their disability. But if we are to develop an empathic society, we should learn to have enough empathy not to wantonly harm the brains of others with lead toxicity and other causes of stunted development (malnutrition, stress, ACEs, etc.), just because they are poor or minorities and can’t fight back. Maybe we need to first teach politicians and business leaders basic empathy, in overcoming the present dominance of psychopathic traits, so that they could learn self-control in not harming others.

The part of the brain involving cognitive empathy and theory of mind is generally involved with selflessness and pro-social behavior. To stick with brain development and neurocognitive functioning, let’s look at diet. Weston A. Price, in studying traditional populations that maintained healthy diets, observed what he called moral health in that people seemed kinder, more helpful, and happier — they got along well. A strong social fabric and a culture of trust are not abstractions but are built into general measures of health, in the case of Price’s work, having to do with nutrient-dense animal foods containing fat-soluble vitamins. As the standard American diet has worsened, so has mental health. Yet that is also a reason for hope: diet can be changed. In an early study on the ketogenic diet as applied to childhood diabetes, the researchers made a side observation that not only did the diabetes symptoms improve but so did behavior. I’ve theorized about how a high-carb diet might be one of the factors that sustains the addictive and egoic self.

Narrow rigidity of the mind, as seen in the extremes of egoic consciousness, has come to be accepted as a social norm and even a social ideal. It is the social Darwinian worldview that has contributed to the rise of both competitive capitalism and the Dark Triad (psychopathy, narcissism, and Machiavellianism), and unsurprisingly it has led to a society that lacks awareness and appreciation of the harm caused to future generations (Scott Barry Kaufman, The Dark Triad and Impulsivity). Rather than being normalized, maybe this dysfunction should be seen as a sickness, not only a soul sickness but a literal sickness of the body-mind that can be scientifically observed and measured, not to mention medically and socially treated. We need to thin the boundaries of the mind so as to expand our sense of self. Research shows that those with thinner boundaries not only have more sense of identification with their future selves but also with their past selves, in maintaining a connection to what it felt like to be a child. We need to care for ourselves and others in the way we would protect a child.

* * *

* In their article “Alone and aggressive”, A. William Crescioni and Roy F. Baumeister included the loss of meaning among the effects of social rejection. It may be associated with the loss of empathy, specifically in understanding the meaning of others (e.g., the intention ‘behind’ words, gestures, and actions). Meaning traditionally has been the purview of religion. And I’d suggest that it is no coincidence that the obsession with meaning arose in the Axial Age, right when words were invented for ‘religion’ as a formal institution separate from the rest of society. As Julian Jaynes argues, this was probably in response to the sense of nostalgia and longing that followed the silence of the gods, spirits, and ancestors.

A different kind of social connection had to be taught, but this post-bicameral culture wasn’t and still isn’t as effective in re-creating the strong social bonds of archaic humanity. Periods of moral crisis in fear of societal breakdown have repeated ever since, like a wound that was never healed. I’ve previously written about social rejection and aggressive behavior in relation to this (12 Rules for Potential School Shooters) — about school shooters, I explained:

Whatever they identify or don’t identify as, many and maybe most school shooters were raised Christian and one wonders if that plays a role in their often expressing a loss of meaning, an existential crisis, etc. Birgit Pfeifer and Ruard R. Ganzevoort focus on the religious-like concerns that obsess so many school shooters and note that many of them had religious backgrounds:

“Traditionally, religion offers answers to existential concerns. Interestingly, school shootings have occurred more frequently in areas with a strong conservative religious population (Arcus 2002). Michael Carneal (Heath High School shooting, 1997, Kentucky) came from a family of devoted members of the Lutheran Church. Mitchell Johnson (Westside Middle School shooting, 1998, Arkansas) sang in the Central Baptist Church youth choir (Newman et al. 2004). Dylan Klebold (Columbine shooting, 1999, Colorado) attended confirmation classes in accordance with Lutheran tradition. However, not all school shooters have a Christian background. Some of them declare themselves atheists…” (The Implicit Religion of School Shootings).

Princeton sociologist Katherine Newman, in studying school shootings, has noted that, “School rampage shootings tend to happen in small, isolated or rural communities. There isn’t a very direct connection between where violence typically happens, especially gun violence in the United States, and where rampage shootings happen” (Common traits of all school shooters in the U.S. since 1970).

It is quite significant that these American mass atrocities are concentrated in “small, isolated or rural communities” that are “frequently in areas with a strong conservative religious population”. That might more precisely indicate who these school shooters are and what they are reacting to. One might also note that rural areas in general, and the South specifically, have high rates of gun-related deaths, although many of them are listed as ‘accidental’, which is to say most rural shootings involve people who know each other; the same is true of school shootings.

* * *

Brain stimulation reveals crucial role of overcoming self-centeredness in self-control
by Alexander Soutschek, Christian C. Ruff, Tina Strombach, Tobias Kalenscher and Philippe N. Tobler

Empathic Self-Control
by David Shoemaker

People with a high degree of self-control typically enjoy better interpersonal relationships, greater social adjustment, and more happiness than those with a low degree of self-control. They also tend to have a high degree of empathy. Further, those with low self-control also tend to have low empathy. But what possible connection could there be between self-control and empathy, given that how one regulates oneself seems to have no bearing on how one views others. Nevertheless, this paper aims to argue for a very tight relation between self-control and empathy, namely, that empathy is in fact one type of self-control. The argument proceeds by exploring two familiar types of self-control, self-control over actions and attitudes, the objects for which we are also responsible. Call the former volitional self-control and the latter rational self-control. But we also seem to be responsible for—and have a certain type of control and self-control over—a range of perceptual states, namely, those in which we come to see from another person’s perspective how she views her valuable ends and what her emotional responses are to their thwarting or flourishing. This type of empathic self-control is a previously-unexplored feature of our interpersonal lives. In addition, once we see that the type of empathy exercised is also exercised when casting ourselves into the shoes of our future selves, we will realize how intra-personal empathy better enables both volitional and rational self-control.

Science Says When Self-Control Is Hard, Try Empathizing With Your Future Self
by Lindsay Shaffer

Soutschek’s study also reveals what happens when we fail to exercise the empathic part of our brain. When Soutschek interrupted the empathic center of the brain in 43 study volunteers, they were more likely to take a small amount of cash immediately over a larger amount in the future. They were also less inclined to share the money with a partner. Soutschek’s study showed that the more people are stuck inside their own perspective, even just from having the empathic part of their brain disrupted, the more likely they are to behave selfishly and impulsively.

Self-Control Is Just Empathy With Your Future Self
by Ed Yong

This tells us that impulsivity and selfishness are just two halves of the same coin, as are their opposites restraint and empathy. Perhaps this is why people who show dark traits like psychopathy and sadism score low on empathy but high on impulsivity. Perhaps it’s why impulsivity correlates with slips among recovering addicts, while empathy correlates with longer bouts of abstinence. These qualities represent our successes and failures at escaping our own egocentric bubbles, and understanding the lives of others—even when those others wear our own older faces.

New Studies in Self Control: Treat Yourself Like You’d Treat Others
from Peak

A new study recently shifted the focus to a different mechanism of self control. Alexander Soutschek and colleagues from the University of Zurich believe self-control may be related to our ability to evaluate our future wants and needs.

The scientists suggest that this takes place in an area of the brain called the rTPJ, which has long been linked to selflessness and empathy for others. It’s an important part of our ability to “take perspectives” and help us step into the shoes of a friend.

The scientists hypothesized that perhaps the rTPJ treats our “future self” the same way it treats any other person. If it helps us step into our friend’s shoes, maybe we can do the same thing for ourselves. For example, if we’re deciding whether to indulge in another pint of beer at a bar, maybe our ability to hold off is related to our ability to imagine tomorrow morning’s hangover. As science writer Ed Yong explains, “Think of self-control as a kind of temporal selflessness. It’s Present You taking a hit to help out Future You.”

Empathy for Your Future Self
by Reed Rawlings

Further Research on the TPJ

The results of Soutschek’s team were similar to past work on empathy, the future self, and the TPJ. It’s believed a better-connected rTPJ increases the likelihood of prosocial behaviors, which relates to skills of executive function. Individuals who exhibit lower empathy score higher for impulsivity, the opposite of self-control.

Keeping our future selves in mind may even keep our savings in check. In this research, Stanford University researchers tested “future self-continuity”. They wanted to explore how individuals related to their future self. Participants were asked to identify how they felt about the overlap between their current and future selves, using Venn diagrams for this exercise.

If they saw themselves as separate, they were more likely to choose immediate rewards. A greater overlap increased the likelihood of selecting delayed rewards. In their final study, they assessed individuals from the San Francisco Bay area. The researchers found a correlation between wealth and an overlap between selves.

While the above research is promising, it doesn’t paint a full picture. Empathy seems useful, but making a sacrifice for our future-self requires that we understand the reason behind it. It’s the sacrifice that is especially crucial – positive gains demand negative trade-offs.

That’s where altruism, our willingness to give to others, comes in.

Why Do We Sacrifice?

Research from the University of Zurich examined some of altruism’s driving factors. Their work came up with two correlations. First, the larger your rTPJ, the more likely you are to behave altruistically. Second, concerns of fairness affect how we give.

In this experiment, individuals were more generous if their choice would decrease inequality. When inequality would increase, participants were less likely to give.

This is an understandable human maxim. We have little reason to give to an individual who has more than we do. It feels completely unfair to do so. However, we’re raised to believe that helping those in need is objectively good. Helping ourselves should fall under the same belief.

Empathy and altruism, when focused on our own well-being, are intimately linked. To give selflessly, we need to have a genuine concern for another’s well-being. In this case, the ‘other’ is our future self. Thankfully, with a bit of reflection, each of us can gain a unique insight into our own lives.

Alone and aggressive: Social exclusion impairs self-control and empathy and increases hostile cognition and aggression.
by A. William Crescioni and Roy F. Baumeister
from Bullying, Rejection, and Peer Victimization ed. by Monic J. Harris
pp. 260-271 (full text)

Social Rejection and Emotional Numbing

Initial studies provided solid evidence for a causal relationship between rejection and aggression. The mechanism driving this relationship remained unclear, however. Emotional distress was perhaps the most plausible mediator. Anxiety has been shown to play a role in both social rejection (Baumeister & Tice, 1990) and ostracism (Williams et al., 2000). Emotional distress, however, was not present in these experiments by Twenge et al. (2001). Only one significant mood effect was found, and even this effect deviated from expectations. The sole difference in mood between rejected and accepted participants was a slight decrease in positive affect. Rejected participants did not show any increase in negative affect; rather, they showed a flattening of affect, in particular a decrease in positive affect. This mood difference did not constitute a mediator of the link between rejection and aggression. It did, however, point toward a new line of thinking. It was possible that rejection would lead to emotional numbing rather than causing emotional distress. The flattening of affect seen in the previous set of studies would be consistent with a state of cognitive deconstruction. This state is characterized by an absence of emotion, an altered sense of time, a fixation on the present, a lack of meaningful thought, and a general sense of lethargy (Baumeister, 1990). […]

Rejection and Self-Regulation

Although the emotional numbness and decrease in empathy experienced by rejected individuals play an important role in the link between social rejection and aggression, these effects do not constitute a complete explanation of why rejection leads to aggression. The diminished prosocial motivations experienced by those lacking in empathy can open the door to aggressive behavior, but having less of a desire to do good and having more of a desire to do harm are not necessarily equivalent. A loss of empathy, paired with the numbing effects of rejection, could lead individuals to shy away from those who had rejected them rather than lashing out. Emotional numbness, however, is not the only consequence of social rejection.

In addition to its emotional consequences, social rejection has adverse effects on a variety of cognitive abilities. Social rejection has been shown to decrease intelligent thought (Baumeister, Twenge, & Nuss, 2002) and meaningful thought (Twenge et al., 2002). But another category of cognitive response is self-regulation. Studies have demonstrated that self-regulation depends upon a finite resource and that acts of self-regulation can impair subsequent attempts to exercise self-control (Baumeister, Bratslavsky, Muraven, & Tice, 1998). Self-regulation has been shown to be an important tool for controlling aggressive impulses. Stucke and Baumeister (2006) found that targets whose ability to self-regulate had been depleted were more likely to respond aggressively to insulting provocation. DeWall, Baumeister, Stillman, and Gailliot (2007) found that diminished self-regulatory resources led to an increase in aggression only in response to provocation; unprovoked participants showed no increase in aggressive behavior. Recall that in earlier work (Twenge et al., 2002) rejected individuals became more aggressive only when the target of their aggression was perceived as having insulted or provoked them. This aggression could have been the result of the diminished ability of rejected participants to regulate their aggressive urges. […]

These results clearly demonstrate that social rejection has a detrimental effect on self-regulation, but they do not explain why this is so and, indeed, the decrement in self-regulation would appear to be counterproductive for rejected individuals. Gaining social acceptance often involves regulating impulses in order to create positive impressions on others (Vohs, Baumeister, & Ciarocco, 2005). Rejected individuals should therefore show an increase in self-regulatory effort if they wish to create new connections or prevent further rejection. The observed drop in self-regulation therefore seems maladaptive. The explanation for this finding lies in rejection’s effect on self-awareness.

Self-awareness is an important prerequisite of conscious self-control (Carver & Scheier, 1981). Twenge et al. (2002) found that, when given the option, participants who had experienced rejection earlier in the study were more likely to sit facing away from rather than toward a mirror. Having participants face a mirror is a common technique for inducing self-awareness (Carver & Scheier, 1981), so participants’ unwillingness to do so following rejection provides evidence of a desire to avoid self-awareness. A drop in self-awareness is part of the suite of effects that comprises a state of cognitive deconstruction. Just as emotional numbness protects rejected individuals from the emotional distress of rejection, a drop in self-awareness would shield against awareness of personal flaws and shortcomings that could have led to that rejection. The benefit of this self-ignorance is that further distress over one’s inadequacies is mitigated. Unfortunately, this protection carries the cost of decreased self-regulation. Because self-regulation is important for positive self-presentation (Vohs et al., 2005), this drop in self-awareness could ironically lead to further rejection. […]

These data suggest that social rejection does not decrease the absolute ability of victims to self-regulate but rather decreases their willingness to exert the effort necessary to do so. Increased lethargy, another aspect of cognitive deconstruction, is consistent with this decrease in self-regulatory effort. Twenge et al. (2002) found that social rejection led participants to give shorter and less detailed explanations of proverbs. Because fully explaining the proverbs would require an effortful response, this shortening and simplification of responses is evidence of increased lethargy amongst rejected participants. This lethargy is not binding, however. When given sufficient incentive, rejected participants were able to match the self-regulatory performance of participants in other conditions. Inducing self-awareness also allowed rejected individuals to self-regulate as effectively as other participants. In the absence of such stimulation, however, rejected individuals showed a decrement in self-regulatory ability that constitutes an important contribution to explaining the link between rejection and aggression. […]

Rejection and Meaningfulness

Twenge et al. (2002) found that social rejection led to a decrease in meaningful thought among participants, as well as an increased likelihood of endorsing the statement, “Life is meaningless.” Williams (2002) has also suggested that social rejection ought to be associated with a perception of decreased meaning in life. Given the fundamental nature of the need to belong, it makes sense that defining life as meaningful would be at least in part contingent on the fulfillment of social needs. A recent line of work has looked explicitly at the effect of social rejection on the perception of meaning in life. Perceiving meaning in life has been shown to have an inverse relationship with hostility, aggression, and antisocial attitude (Mascaro, Morey, & Rosen, 2004). As such, any decrease in meaning associated with social rejection would constitute an important feature of the explanation of the aggressive behavior of rejected individuals.

The God of the Left Hemisphere:
Blake, Bolte Taylor and the Myth of Creation
by Roderick Tweedy

The left hemisphere is competitive… the will to power… is the agenda of the left hemisphere. It arose not to communicate with the world but to manipulate it. This inability to communicate or co-operate poses great difficulties for any project of reintegration or union. Its tendency would be to feed off the right hemisphere, to simply use and gain power over it too. Left hemisphere superiority is based, not on a leap forward by the left hemisphere, but on a ‘deliberate’ handicapping of the right. There is perhaps as much chance of persuading the head of a multinational to stop pursuing an agenda of self-interest and ruthless manipulation as there is of persuading the Urizenic program of the brain which controls him of “resubmitting” itself to the right hemisphere’s values and awareness.

The story of the Western world being one of increasing left-hemispheric domination, we would not expect insight to be the key note. Instead we would expect a sort of insouciant optimism, the sleepwalker whistling a happy tune as he ambles towards the abyss.

The left, rational, brain, it might be safe to conclude, has no idea how serious the problem is, that is to say, how psychopathic it has become. Of course, it doesn’t care that it doesn’t care. “The idiot Reasoner laughs at the Man of Imagination/And from laughter proceeds to murder by undervaluing calumny”, noted Blake in a comment that is only remarkable for the fact that it has taken two hundred years to understand.

The apparently “conscious” rational self, the driving program and personality of the left brain, turns out to be deeply unconscious, a pathological sleepwalker blithely poisoning its own environment whilst tenaciously clinging onto the delusion of its own rightness. This unfortunate mixture, of arrogance and ignorance, defines contemporary psychology. The left hemisphere not only cannot see that there is a problem, it cannot see that it is itself the problem.

Battle of Voices of Authorization in the World and in Ourselves

New Feelings: Podcast Passivity
by Suzannah Showler

My concern is that on some level, I’m prone to mistake any voice that pours so convincingly into my brain for my own. And maybe it’s not even a mistake, per se, so much as a calculated strategy on the part of my ego to maintain its primacy, targeting and claiming any foreign object that would stray so far into the inner-sanctum of my consciousness. Whether the medium is insidious, my mind a greedy assimilation machine, or both, it seems that at least some of the time, podcasts don’t just drown out my inner-monologue — they actually overwrite it. When I listen to a podcast, I think some part of me believes I’m only hearing myself think.

Twentieth-century critics worried about this, too. Writing sometime around the late 1930s, Theodor Adorno theorized that a solitary listener under the influence of radio is vulnerable to persuasion by an anonymous authority. He writes: “The deeper this [radio] voice is involved within his own privacy, the more it appears to pour out of the cells of his more intimate life; the more he gets the impression that his own cupboard, his own photography, his own bedroom speaks to him in a personal way, devoid of the intermediary stage of the printed words; the more perfectly he is ready to accept wholesale whatever he hears. It is just this privacy which fosters the authority of the radio voice and helps to hide it by making it no longer appear to come from outside.”

I’ll admit that I have occasionally been gripped by false memories as a result of podcasts — been briefly sure that I’d seen a TV show I’d never watched, or convinced that it was a friend, not a professional producer, who told me some great anecdote. But on the whole, my concern is less that I am being brainwashed and more that I’m indulging in something deeply avoidant: filling my head with ideas without actually having to do the messy, repetitive, boring, or anxious work of making meaning for myself. It’s like downloading a prefabbed stream of consciousness and then insisting it’s DIY. The effect is twofold: a podcast distracts me from the tedium of being alone with myself, while also convincingly building a rich, highly-produced version of my inner life. Of course that’s addictive — it’s one of the most effective answers to loneliness and self-importance I can imagine.

Being Your Selves: Identity R&D on alt Twitter
by Aaron Z. Lewis

Digital masks are making the static and immortal soul of the Renaissance seem increasingly out of touch. In an environment of info overload, it’s easy to lose track of where “my” ideas come from. My brain is filled with free-floating thoughts that are totally untethered from the humans who came up with them. I speak and think in memes — a language that’s more like the anonymous manuscript culture of medieval times than the individualist Renaissance era. Everything is a remix, including our identities. We wear our brains outside of our skulls and our nerves outside our skin. We walk around with other people’s voices in our heads. The self is in the network rather than a node.

The ability to play multiple characters online means that the project of crafting your identity now extends far beyond your physical body. In his later years, McLuhan predicted that this newfound ability would lead to a society-wide identity crisis:

The instant nature of electric-information movement is decentralizing — rather than enlarging — the family of man into a new state of multitudinous tribal existences. Particularly in countries where literate values are deeply institutionalized, this is a highly traumatic process, since the clash of old segmented visual culture and the new integral electronic culture creates a crisis of identity, a vacuum of the self, which generates tremendous violence — violence that is simply an identity quest, private or corporate, social or commercial.

As I survey the cultural landscape of 2020, it seems that McLuhan’s predictions have unfortunately come true. More than ever before, people are exposed to a daily onslaught of world views and belief systems that threaten their identities. Social media has become the battlefield for a modern-day Hobbesian war of all-against-all. And this conflict has leaked into the allegedly “offline” world.

“Individuation is not the culmination of the person; it is the end of the person.”

Julian Jaynes and the Jaynesian scholars have made a compelling argument about where egoic consciousness originated and how it formed. But in all the Jaynesian literature, I don’t recall anyone suggesting how to undo egoic consciousness, much less suggesting we should attempt annihilation of the demiurgic ego.

That latter project is what preoccupied Carl Jung, and it is what Peter Kingsley has often written about. They suggest it is not only possible but inevitable. In a sense, the ego is already dead and we are already in the underworld. We are corpses and our only task is to grieve.

The Cry of Merlin: Carl Jung and the Insanity of Reason
Gregory Shaw on Peter Kingsley

Kingsley explains that Jung emulated these magicians, and his journey through the Underworld followed the path of Pythagoras, Parmenides and Empedocles. Jung translated the terminology of the ancients into “scientific” terms, calling the initiation he realized in the abyss “individuation.” For Jungians today, individuation is the culmination of psychic development, as if it were our collective birthright. Yet Kingsley points out that this notion of individuation is a domestication, commodification, and utter distortion of what Jung experienced. Individuation is not the culmination of the person; it is the end of the person. It is the agonizing struggle of becoming a god and a person simultaneously, of living in contradictory worlds, eternity and time.

Kingsley reveals that although individuation is the quintessential myth of Jung’s psychology, it is almost never experienced because no one can bear it. Individuation is the surrendering of the personal to the impersonal, and precisely what Jung experienced it to be, the death of his personality. Jung explains that individuation is a total mystery; the mystery of the Grail that holds the essence of God. According to Henry Corbin, Jung saw “true individuation as becoming God or God’s secret.” Put simply, individuation is deification. To his credit, over twenty years ago Richard Noll argued this point and wrote that Jung experienced deification in the form of the lion-headed Mithras (Leontocephalus), but Kingsley gives the context for deification that Noll does not, and the context is crucial. He shows that Jung’s deification was not an “ego trip” that gave rise to “a religious cult with [Jung] as the totem,” Noll’s assumption; nor was it a “colossal narcissism,” as Ernest Jones suggested, but precisely the opposite. Individuation cuts to the very core of self-consciousness; it is the annihilation of the ego, not its inflation. […]

What is fundamentally important about Catafalque is that Kingsley demonstrates convincingly that Jung recovered the shamanic path exemplified by Pythagoras, Parmenides, and Socrates. Jung tried to save us from the “insanity of reason” by descending to the underworld, serving the archetypes, and disavowing the impiety of “the Greeks” who reduce the sacred to rationalizations. There is much in Catafalque I have not addressed, perhaps the most important of which is Kingsley’s discussion of the Hebrew prophets who raged against a godless world. Kingsley here appropriately includes Allen Ginsberg’s Howl, which draws from the rhythms of these prophets to wail against the “insanity of America,” its mechanized thinking, suffocating architecture, and the robotic efficiency that is the child of Reason. This almost verbatim mirrors the words of Jung who, after visiting New York, says “suppose an age when the machine gets on top of us …. After a while, when we have invested all our energy in rational forms, they will strangle us… They are the dragons now, they became a sort of nightmare.”

Kingsley ends Catafalque with depressing prophecies about the end of western civilization, both from Jung and from Kingsley himself. The great wave that was our civilization has spent itself. We are in the undertow now, and we don’t even realize it. To read these chapters is to feel as if one is already a corpse. And Kingsley presents this so bluntly, with so much conviction, it is, frankly, disturbing. And even though Kingsley writes that “Quite literally, our western world has come to an end,” I don’t quite believe him. When speaking about Jung giving psychological advice, Kingsley says “make sure you have enough mētis or alertness not to believe him,” and I don’t believe Kingsley’s final message either. Kingsley’s message of doom is both true and false. The entire book has been telling us that we are already dead, that we are already in the underworld, but, of course, we just don’t understand it. So, then he offers us a very physical and literal picture of our end, laced with nuclear fallout and images of contamination. And he forthrightly says the purpose of his work is “to provide a catafalque for the western world.” It is, he says, time to grieve, and I think he is right. We need to grieve for the emptiness of our world, for our dead souls, our empty lives, but this grief is also the only medicine that can revive the collective corpse that we have become. Kingsley is doing his best to show us, without any false hope, the decaying corpse that we are. It is only through our unwavering acceptance, grieving and weeping for this, that we can be healed. In Jung’s terms, only the death of the personal can allow for birth into the impersonal. Into what…? We cannot know. We never will. It is not for our insatiable minds.

The Link Between Individualism and Collectivism

Individualism and collectivism. Autonomy and authoritarianism. These are opposites, right? Maybe not.

Julian Jaynes argued that humans, in the earliest small city-states, lived in a state he called the bicameral mind. It was a shared sense of identity where ‘thoughts’ were more publicly experienced as voices that were culturally inherited across generations. He observed that the rise of egoic consciousness as the isolated and independent self was simultaneous with a shift in culture and social order.

What emerged was a new kind of authoritarianism, much more brutally oppressive, much more centralized, hierarchical, and systematic. As the communal societies of the bicameral mind entered their end phase heading toward the collapse of the Bronze Age, there was the emergence of written laws, court systems, and standing armies. Criminals, enemy soldiers, and captives were treated much more harshly, with mass killings on a scale never before seen. Social order was no longer an organic community but required top-down enforcement.

One piece of evidence for this new mentality was the sudden appearance of pornographic imagery. For thousands of years, humans had created art, but never art overtly sexual in nature. Then humans apparently became self-conscious of sexuality and also became obsessed with it. This was also a time when written laws and norms about sexuality became common. With sexual prurience came demands for sexual purity.

Repression was the other side of rigid egoic consciousness: to maintain social control, the new individualized self had to be controlled by society. The organic sense of communal identity could no longer be taken for granted and relied upon. The individual was cut off from the moral force of voice-hearing, and so moral transgression as sin became an issue. This was the ‘Fall of Man’.

What is at stake is not merely an understanding of the past. We are defined by this past, for it lives on within us. We are the heirs of millennia of psycho-cultural transformation. But our historical amnesia and our splintered consciousness leave us adrift amid forces that we don’t understand or recognize. We are confused why, as we move toward greater individualism, we feel anxious about the looming threat of ever worse authoritarianism. There is a link between the two that is built into Jaynesian consciousness. But this is not fatalism, as if we are doomed to be ripped apart by diametric forces.

If we accept our situation and face the dilemma, we might be able to seek a point of balance. This is seen in the Scandinavian countries, where it is precisely a strong collective identity, a culture of trust, and social democracy, even some democratic socialism, that make possible a more stable and less fearful sense of genuine individuality (Anu Partanen, The Nordic Theory of Everything and its ‘Nordic theory of love and individualism’). What is counter-intuitive to the American sensibility — or rather American madness — is that this doesn’t require greater legal regulation; there is, for example, less red tape in starting a business in Scandinavia than in the United States.

A book worth reading is Timothy Carney’s Alienated America. The author comes from the political right, but he is not a radical right-winger. His emphasis is on social conservatism, although the points he makes depend on the liberal viewpoint of social science. Look past some of the conservative biases of interpretation and there is much here that liberals, progressives, and even left-wingers could agree with.

He falls into the anti-government rhetoric of pseudo-libertarianism, which blinds him to how Scandinavian countries can have big governments that rely more on a culture of trust than on regulations to enforce social norms. What Scandinavians would likely find odd is the American right-wing belief that government is separate from society, even when society’s existence isn’t outright denied, as Margaret Thatcher did.

It’s because of this confusion that his other insights are all the more impressive. He is struggling against his own ideological chains. It shows how, even as the rhetoric maintains power over the mind, certain truths are beginning to shine through the weakening points of ideological fracture.

Even so, he ultimately fails to escape the gravity of right-wing ideological realism, coming to the opposite conclusion from Anu Partanen, who understands that it is precisely the individual’s relationship to the state that allows for individual freedom. Carney, instead, wants to throw out both ‘collectivism’ and ‘hyper-individualism’. He expresses the still potent longing for the bicameral mind and its archaic authorization to compel social order.

What he misses is that this longing itself is part of the post-bicameral trap of Jaynesian consciousness: the more one seeks to escape the dynamic, the more tightly wound one becomes within its vise grip. It is only by holding lightly one’s place within the dynamic that one can steer a pathway through the narrow gap between the distorted extremes of false polarization and forced choice. This dynamic is exaggerated specifically by high inequality, not only of wealth but, more importantly, of resources and opportunities, power and privilege.

High inequality is correlated with mental illness, conflict, aggressive behavior, status anxiety, social breakdown, loss of social trust, political corruption, crony capitalism, etc. Collectivism and individualism may manifest as authoritarianism and hyper-individualism only under conditions of high inequality. For some reason, many conservatives and right-wingers not only seem blind to the harm of inequality but, if anything, embrace it as a moral good, expressing a social Darwinian vision of capitalist realism that must not be questioned.

Carney points to the greater social and economic outcomes of Scandinavian countries. But he can’t quite comprehend why such a collectivist society doesn’t have the problems he ascribes to collectivism. He comes so close to such an important truth, only to veer again back into the safety of right-wing ideology. Still, just the fact that, as a social conservative concerned for the public good, he feels morally compelled to acknowledge the kinds of things left-wingers have been talking about for generations shows that maybe we are finally coming to a point of reckoning.

Also, it is more than relevant that this is treading into the territory of Jaynesian thought, although the author has no clue how deep and dark are the woods once he leaves the well-beaten path. Even the briefest of forays shows how much has been left unexplored.

* * *

Alienated America:
Why Some Places Thrive While Others Collapse
by Timothy P. Carney

Two Sides of the Same Coin

“Collectivism and atomism are not opposite ends of the political spectrum,” Yuval Levin wrote in Fractured Republic, “but rather two sides of one coin. They are closely related tendencies, and they often coexist and reinforce one another—each making the other possible.” 32

“The Life of Julia” is clearly a story of atomization, but it is one made possible by the story of centralization: The growth of the central state in this story makes irrelevant—and actually difficult—the existence of any other organizations. Julia doesn’t need to belong to anything because central government, “the one thing we all belong to” (the Democratic Party’s mantra in that election), 33 took care of her needs.

This is the tendency of a large central state: When you strengthen the vertical bonds between the state and the individual, you tend to weaken the horizontal bonds between individuals. What’s left is a whole that by some measures is more cohesive, but individuals who are individually all less connected to one another.

Tocqueville foresaw this, thanks to the egalitarianism built into our democracy: “As in centuries of equality no one is obliged to lend his force to those like him and no one has the right to expect great support from those like him, each is at once independent and weak.

“His independence fills him with confidence and pride among his equals, and his debility makes him feel, from time to time, the need of the outside help that he cannot expect from any of them, since they are all impotent and cold.”

Tocqueville concludes, “In this extremity he naturally turns his regard to the immense being that rises alone in the midst of universal debasement.” 34

The centralizing state is the first step in this. The atomized individual is the end result: There’s a government agency to feed the hungry. Why should I do that? A progressive social philosophy, aimed at liberating individuals by means of a central state that provides their basic needs, can actually lead to a hyper-individualism.

According to some lines of thought, if you tell a man he has an individual duty to his actual neighbor, you are enslaving that man. It’s better, this viewpoint holds, to have the state carry out our collective duty to all men, and so no individual has to call on any other individual for what he needs. You’re freed of both debt to your neighbor (the state is taking care of it) and need (the state is taking care of it).

When Bernie Sanders says he doesn’t believe in charity, and his partymates say “government is the name for the things we do together,” the latter can sound almost like an aspiration — that the common things, and our duties to others, ought to be subsumed into government. The impersonality is part of the appeal, because everyone alike is receiving aid from the nameless bureaucrats and is thus spared the indignity of asking or relying on neighbors or colleagues or coparishioners for help.

And when we see the state crowding out charity and pushing religious organizations back into the corner, it’s easy to see how a more ambitious state leaves little oxygen for the middle institutions, thus suffocating everything between the state and the individual.

In these ways, collectivism begets atomization.

Christopher Lasch, the leftist philosopher, put it in the terms of narcissism. Paternalism, and the transfer of responsibility from the individual to a bureaucracy of experts, fosters a narcissism among individuals, Lasch argued. 35 Children are inherently narcissistic, and a society that deprives adults of responsibility will keep them more childlike, and thus more self-obsessed.

It’s also true that hyper-individualism begets collectivism. Hyper-individualism doesn’t work as a way of life. Man is a political animal and is meant for society. He needs durable bonds to others, such as those formed in institutions like a parish, a sports club, or a school community. Families need these bonds to other families as well, regardless of what Pa in Little House on the Prairie seemed to think at times.

The little platoons of community provide role models, advice, and a safety net, and everyone needs these things. An individual who doesn’t join these organizations soon finds himself deeply in need. The more people in need who aren’t cared for by their community, the more demand there is for a large central state to provide the safety net, the guidance, and the hand-holding.

Social scientists have repeatedly come across a finding along these lines. “[G]overnment regulation is strongly negatively correlated with measures of trust,” four economists wrote in MIT’s Quarterly Journal of Economics. The study relied on an international survey in which people were asked, “Generally speaking, would you say that most people can be trusted or that you need to be very careful in dealing with people?” The authors also looked at answers to the question “Do you have a lot of confidence, quite a lot of confidence, not very much confidence, no confidence at all in the following: Major companies? Civil servants?”

They found, among other examples:

High-trusting countries such as Nordic and Anglo-Saxon countries impose very few controls on opening a business, whereas low-trusting countries, typically Mediterranean, Latin-American, and African countries, impose heavy regulations. 36

The causality here goes both ways. In less trusting societies, people demand more regulation, and in more regulated societies, people trust each other less. This is the analogy of the Industrial Revolution’s vicious circle between Big Business and Big Labor: The less trust in humanity there is, the more rules crop up. And the more rules, the less people treat one another like humans, and so on.

Centralization of the state weakens the ties between individuals, leaving individuals more isolated, and that isolation yields more centralization.

The MIT paper, using economist-speak, concludes there are “two equilibria” here. That is, a society is headed toward a state of either total regulation and low trust, or low regulation and high trust. While both destinations might fit the definition of equilibrium, the one where regulation replaces interpersonal trust is not a fitting environment for human happiness.

On a deeper level, without a community that exists on a human level—somewhere where everyone knows your name, to borrow a phrase—a human can’t be fully human. To bring back the language of Aristotle for a moment, we actualize our potential only inside a human-scaled community.

And if you want to know what happens to individuals left without a community in which to live most fully as human, where men and women are abandoned, left without small communities in which to flourish, we should visit Trump Country.