Islamic Voice-Hearing

What kind of religion is Islam? We earlier described it as the worship of a missing god. Some might consider that unfair and dismissive toward one of the world's largest religions, but the description holds to some extent for all post-bicameral religions. The difference is that Islam is among the most post-bicameral of the world religions, if only in temporal terms.

The bicameral societies, according to Julian Jaynes, ended with the widespread collapse of the late Bronze Age empires and their trade networks. That happened around 1177 BCE, as the result of natural disasters and attacks by the mysterious Sea Peoples, the latter perhaps having formed out of refugees from the former. The Bronze Age itself lingered for many centuries in various places: until around 700 BCE in Great Britain, Central Europe, and China; 600 BCE in Northern Europe; 500 BCE in Korea and Ireland; and centuries beyond that in places like Japan.

But the Bronze Age empires never returned. In that late, lingering Bronze Age, a dark age took hold and put all of civilization onto a new footing. This was the era when, across numerous cultures, there were endless laments about the gods, spirits, and ancestors having gone silent, having abandoned humanity. Entire cultural worldviews and psychological ways of being were utterly demolished or else irreparably diminished. This created an intense sense of loss, longing, and nostalgia that has never left humanity since.

Out of the ashes, while the Bronze Age was still holding on, the Axial Age arose around 900 BCE and continued until 200 BCE. New cultures were formed and new empires built. The result is what Jaynes described as ‘consciousness’ or what one can think of as introspective mental space, an inner world of egoic identity where the individual is separate from community and world. Consciousness and the formalized religions that accompanied it were a replacement for the loss of a world alive with voices.

By the time Rabbinic Judaism, Gnosticism, and Christianity came around, the Axial Age was already being looked back upon as a Golden Age and, other than through a few surviving myths, the Bronze Age before that was barely remembered at all. It would be nearly another 600 years after that first-century monotheistic revival before Muhammad had his visions of the angel Gabriel visiting him to speak on behalf of God. Islam is thus both post-bicameral and post-axial, each to a far greater degree than the religions that preceded it.

Muslims consider Muhammad to be the last prophet, and even he didn't get to hear God directly, for the message had to come through an angel. The voice of God had long ago grown so faint that people had come to rely on oracles, channelings, and the like. These rather late revelations by way of Gabriel were but a barely audible echo of the archaic bicameral voices. It is perhaps understandable that, as with some oracles before him, Muhammad would declare God would never speak again. So Islam, unlike the other monotheistic religions, fully embraces God's absence from the world.

Actually, that is not quite right. Based on the Koran, God will not speak again until the Final Judgment. Then all will hear God once more, when he weighs each person's sins and decides the fate of their immortal soul. Here is the interesting part. The witnesses God shall call upon in each person's case will be all the bicameral voices brought back out of silence. The animals and plants will witness for or against you, as will the earth and rocks and wind. Even your own resurrected body parts will come alive again with voices to speak of what you did. Speaking body parts are familiar to anyone who reads Jaynesian scholarship.

Until then, God and all the voices of the world will remain mute witnesses, watching your every move and taking notes. They see all, hear all, notice all — every time you masturbate or pick your nose, every time you have a cruel or impure thought, every time you don’t follow one of the large number of divine commandments, laws, and rules spelled out in the Koran. The entire world is spying upon you and will report back to God, at the end of time. The silent world only appears to be dumb and unconscious. God is biding his time, gathering a file on you like a cosmic FBI.

This could feel paralyzing, but in another way it offers total freedom from self, total freedom through complete submission. Jaynesian consciousness is a heavy load and that was becoming increasingly apparent over time, especially in the centuries following the Axial Age. The zealous idealism of the Axial Age prophets was growing dull and tiresome. By the time that Muhammad showed up, almost two millennia had passed since the bicameral mind descended into darkness. The new consciousness was sold as something amazing, but it hadn’t fully lived up to its promises. Instead, ever more brutal regimes came into power and a sense of anxiety was overtaking society.

Muhammad had an answer, and the people of that region were obviously hungry for someone to provide one. Once he had formed his large army, his military campaign met little resistance. Within his own lifetime, most of the Arabian Peninsula converted to Islam. The silence of the gods had weakened society, but Muhammad offered an explanation for why the divine could no longer be experienced. He helped normalize what had once felt like a tragedy. He told his followers that they didn't need to hear God because God had already revealed all knowledge to the prophets, including, of course, himself. No one had to worry; one had only to follow orders and comply with commands.

All the tiresome complications of thought were unnecessary. God had already thought out everything for humans. The Koran, as the final and complete holy text, would entirely and permanently replace the bicameral voices, ever receding into the shadows of the psyche. Don't worry, though: all those voices are still there, waiting to speak. But the only voice the individual needed to listen to was that of the person directly above him in the religious hierarchy, be it one's father or an imam or whoever else held greater official authority, in a line of command running back to the prophets and through the angels to God Himself. Everything is in the Koran, and the learned priestly class would explain it all and translate it into proper theocratic governance.

Muhammad came with a message different from any before. The Jewish prophets and Jesus, as with many Pagans, spoke of God as Father and humanity as His children. Early Christians took this as a challenge to a slave-based society, borrowing from the Stoics the idea that even a slave was free in his soul. Muhammad, instead, was offering another variety of freedom. We humans, rather than children of God, are slaves of God. The entire Islamic religion is predicated upon divine slavery, absolute submission. This is freedom from the harsh taskmaster of egoic individuality, a wannabe demiurge. Unlike Jesus, Muhammad formulated a totalitarian theocracy, a totalizing system. Nothing is left to question or interpretation, at least in theory or rather in belief.

This goes back to how, with the loss of the bicameral mind and social order, something took its place. It was a different kind of authoritarianism — rigid and hierarchical, centralized and concentrated, despotic and violent. Authoritarianism of this variety didn’t emerge until the late Bronze Age when the bicameral societies were becoming too large and complex, overstrained and unstable. Suddenly, as if to presage the coming collapse, there was the appearance of written laws, harsh punishment, and cruel torture — none of which ever existed before, according to historical records and archaeological finds. As the world shifted into post-bicameralism, this authoritarianism became ever more extreme (e.g., Roman Empire).

This was always the other side of the rise of individuality, of Jaynesian consciousness. The greater the potential freedom the individual possesses, the more oppressive social control is required, as the communal bonds and social norms of the bicameral mind increasingly lost their power to organically maintain order. Muhammad must have shown up at the precise moment of crisis in this change. After the Roman Empire's system of slavery, Europe came up with feudalism to re-create some of what had disappeared. But apparently a different kind of solution was required in the Arab world.

Maybe this offsets the draining of psychic energy that comes with consciousness. Jaynes speculated that, like the schizophrenic, bicameral humans had immense energy and stamina, which allowed them to accomplish near-miraculous feats such as building the pyramids with small populations and very little technology or infrastructure. Suppressing the extremes of individualism by emphasizing absolute subordination is perhaps a way of keeping in check the energy cost of maintaining egoic consciousness. In the West, we eventually overcame this weakness by using massive doses of stimulants to overpower the otherwise debilitating anxiety and to help shore up the egoic boundaries, but this has come at the cost of destroying our physical and mental health.

Time will tell which strategy is the most effective for the long-term survival of specific societies. But I'm not sure I'd bet on the Western system, considering how unsustainable it appears to be and how easily it has been crippled by a minor disease epidemic like covid-19. Muhammad might simply have been trying to cobble together some semblance of a bicameral mind in the face of divine silence. There is good reason for trying. Those bicameral societies lasted many millennia longer than our post-bicameral civilization has so far. It's not clear that modern civilization, or at least Western civilization, will last beyond the end of this century. We underestimate the bicameral mind and the role it played during the single longest period of advancement in civilization.

* * *

Let us leave a small note of a more personal nature. In the previous post (linked above), we mentioned that our line of inquiry began with a conversation we had with a friend of ours who is a Muslim. He also happens to be schizophrenic, i.e., a voice-hearer. The last post was about how voice-hearing is understood within Islam. Since supposedly God no longer speaks to humans, nor do his angelic intermediaries, any voice a Muslim hears is automatically interpreted as not being of divine origin. That doesn't necessarily make the voice evil, as it could be a jinn, a neutral entity in Islamic theology, although jinn can be dangerous. Then again, voice-hearing might also be caused by an evil magician, whose craft I think is called sihir.

Anyway, we had the opportunity to speak with this friend once again, as we are both in jobs that require us to keep working downtown while everything else is locked down because of the covid-19 epidemic. Being isolated from family and other friends, we've been meeting with this Muslim friend on a daily basis. Just this morning, we went for a long walk together and chatted about life and religion. He had previously talked about his schizophrenia in passing, apparently unworried by the stigma of it. He is an easy person to talk to, quite direct and open about his thoughts and experiences. We asked him about voice-hearing, and he explained that, prior to being medicated, he would continue to hear people speak to him after they were no longer present. And unsurprisingly, the voices were often negative.

Both his imam and his therapist told him to ignore the voices. Maybe that is a standard approach in traditionally monotheistic cultures. As we mentioned in the other post, he is from North Africa, where Arabs are common. But another friend of ours lives in Ghana, in West Africa. The voice-hearing experiences of people in Ghana were compared with those of people in the United States in the research of Tanya M. Luhrmann, an anthropologist inspired by Julian Jaynes. She found that Ghanaians, with a tradition of voice-hearing (closer to bicameralism?), had a much more positive experience of the voices they heard. Americans, like our Muslim friend, did not tend to hear voices that were kind and helpful. This is probably the expectancy effect.

If you are raised to believe that voices are demonic or come from their Islamic equivalent, the jinn, or from witches and evil magicians, or if you simply have been told that voice-hearing means you're insane, well, it's not likely to lead to happy results when you do hear voices. I doubt it decreases the rate of voice-hearing, though. Islamic theology may deny that God and angels still speak to humans, but that isn't likely to have any effect on voice-hearing itself. So the repressed bicameral mind keeps throwing out these odd experiences, but in our post-bicameral age we have fewer resources for dealing constructively with those voices. Simply denying and ignoring them is probably less than helpful.

That is the ultimate snag. The same voices that once were identified as godly or something similar are now taken as false, unreal, or dangerous. In a sense, God never stopped speaking. One could argue that we all are voice-hearers, but some of us now call the voice of God 'conscience' or whatever. Others, like Muslims, put great emphasis on this voice-hearing but have tried to gag a God who goes on talking. Imagine how many potential new prophets have been locked away in psychiatric wards or, much worse, killed or imprisoned as heretics. If God can't be silenced, the prophets who hear him can be. The Old Testament even describes how the authorities forbade voice-hearing and demanded that voice-hearers be killed, even by their own parents.

The bicameral mind didn't disappear naturally because it was inferior but because, in its potency, it was deemed dangerous to those who wanted to use brute power to enforce their own voices of authorization. The bicameral mind, once central to the social order, had become enemy number one. If people could talk to God directly, religion and its claims of authority would become irrelevant. That is how our Muslim friend, a devout religious practitioner, ended up being drugged to get the voices to stop speaking.

Islam as Worship of a Missing God

A friend of ours is a Muslim and grew up in an Islamic country. As he talked about his religion, we realized how different it is from Christianity. There is no shared practice among Christians similar to the Islamic practice of praying five times a day. From early on, Christianity was filled with diverse groups and disagreements, and that has only increased over time (there are over 4,600 denominations of Christianity in the United States alone). Our friend had a hard time appreciating that there is no agreed-upon authority, interpretation, or set of beliefs among all Christians.

Unlike Muhammad, Jesus never wrote anything, nor was anything written down about him until much later. Nor did he intend to start a new religion. He offered no rules, social norms, instructions, etc., for how to organize a church, a religious society, or a government. He didn't even preach family values, if anything the opposite — from a command to let the dead bury their dead to the proclamation of having come to turn family members against each other. The Gospels offer no practical advice about anything. Much of Jesus' teachings, beyond a general message of love and compassion, are vague and enigmatic, often parables that have many possible meanings.

Now compare Jesus to the Islamic prophet. Muhammad is considered the last prophet, although he never claimed to have heard the voice of God directly, instead supposedly receiving the message secondhand through an angel. Still, according to Muslims, the Koran is the only complete holy text in existence — the final Word of God. That also differs from Christianity. Jesus never asserted that God would become silent to all of humanity for eternity, condemning his worshippers to a world without the God they longed for, in the way Allah never enters His own Creation.

Many Protestants, Anabaptists, and those in similar groups believe that God continues to be revealed to people today, that the divine is known through direct experience, that the Bible as a holy text must be read within a personal relationship to God, not merely taken on the authority of blind faith. Some churches go so far as to teach people how to speak to and hear God (T. M. Luhrmann, When God Talks Back). Even within Catholicism, there have been further revelations of God since Jesus, from various mystics and saints acknowledged by the Vatican, but also from ordinary Catholics claiming God spoke to them without any great fear of heresy and excommunication.

It made me think about Julian Jaynes' theory of modern consciousness. With the collapse of the Bronze Age civilizations, there was this sense of the gods having gone silent. Yet this was never an absolute experience, as some people continued to hear the gods. Even into the modern world, people occasionally still claim to hear various gods and sometimes even found new religions based on revelations. The Baha'i, for example, consider Muhammad to be just one more prophet, with others having followed him. Hindus also have a living tradition of divine revelation equivalent to that of prophets. Only Islam, as far as I know, claims all prophecy and revelation to be ended for all time.

I was thinking about the sense of loss and loneliness people felt when bicameral societies came to an end. They were thrown into an increasingly isolated individualism. Religion as we know it was designed to accommodate this, in order to give a sense of order, meaning, and authority that had gone missing. But Islam takes this to an extreme. After Muhammad, no human supposedly would ever again personally hear, see, or experience the divine in any way (excluding mystical traditions like Sufism). For all intents and purposes, Allah has entirely receded from the world. The only sign of his existence that he left behind was a book of instructions. We must submit and comply or be punished in the afterlife, a world separate from this one.

That seems so utterly depressing and dreary to me. I was raised Christian, on the far other extreme of Protestantism. My family attended the Unity Church, which emphasizes direct experience of God to such a degree that the Bible itself was mostly ignored and almost irrelevant — why turn to mere words on paper when you can go straight to the source? A claim to have heard God speak would have been taken seriously rather than denied and condemned. I'm no longer religious, but the nearly deist idea of a god that is distant and silent seems so alien and unappealing to me. Yet maybe that makes Islam well designed for the modern world, as it offers a strong response to atheism.

If you have no experience of God, this is considered normal and expected in Islam, not something to be worried about, not something to challenge one's faith, as is common in Christianity (NDE: Spirituality vs Religiosity); and it avoids the riskiness and confusion of voice-hearing (Libby Anne, Voices in Your Head: Evangelicals and the Voice of God). One's ignorance of the divine demonstrates one's individual inadequacy and, as argued by religious authorities, is all the more reason to submit to religious authority. The Islamic relation between God and humanity is one-way, except to some extent by way of inspiration and dreams; Allah himself never directly enters his Creation and so never directly interacts with humans, not even with prophets. Is that why constant prayer is necessary for Muslims, to offset God's silence and vacancy? Worship of a missing God seems perfectly suited to the modern world.

Muslims are left looking for traces of God in the Koran like ants crawling around in a footprint, trying to comprehend what made it and what it wants them to do. Some of the ants claim to be part of a direct lineage going back to an original ant that, according to tradition, was stepped upon by whatever passed by. These well-respected ants then explain to all the other ants what is meant by all the bumps and grooves in the dried mud. In worship, the ants pray toward the footprint and regularly gather to circle around it. This gives their lives some sense of meaning and purpose and, besides, it maintains the social order.

That is what is needed in a world where the bicameral voices of archaic authorization no longer speak, are no longer heard. Something has to fill the silence, as the loneliness it creates is unbearable. Islam has a nifty trick: embracing the emptiness and further aggravating the overwhelming anxiety even as it offers the salve for the soul. Muslims take the silence of God as proof of God, as a promise of something more. This otherworldly being, Allah, tells humans who don't feel at home in this world that their real home is elsewhere, to which they will return if they do what they are told. Other religions do something similar, but Islam takes this to another level — arguably, the highest or most extreme form of monotheism, so far. The loss of the bicameral mind could not be pushed much further, one suspects, without being pushed into an abyss.

Islam is a truly modern religion. Right up there with capitalism and scientism.

* * *

Further discussion about this can be found on the Facebook page “Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind”.


Do To Yourself As You Would Do For Others

“…our impulse control is less based on an order from our executive command center, or frontal cortex, and more correlated with the empathic part of our brain. In other words, when we exercise self-control, we take on the perspective of our future self and empathize with that self’s perspectives, feelings, and motivations.”
~ Alexander Soutschek

Self-control is rooted in self-awareness. Julian Jaynes and Brian McVeigh, in one of their talks, brought up the idea that "mind space" has increased over time: "The more things we think about, the more distinctions we make in our consciousness between A and B, and so on, the more mind-space there is" (Discussions with Julian Jaynes, ed. by Brian J. McVeigh, p. 40). The first expansion was the creation of introspective consciousness itself. Narratization allowed that consciousness to also extend across time, to imagine possibilities, play out scenarios, and consider consequences. Empathy, as we experience it, might be a side effect of this, as consciousness includes more and more within it, including empathy with our imagined future self. So, think of self-control as being kind to yourself, to your full temporal self, not only your immediate self.

This would relate to the suggestion that humans learn theory of mind, the basis of cognitive empathy, first by observing others and only later apply it to themselves. That is to say, the first expansion of mental space as consciousness takes root within relationship to others. It's in realizing that there might be inner experience within someone else that we come to claim inner space in our own experience. So, our very ability to understand ourselves is dependent on empathy with others. This was a central purpose of the religions that arose in the Axial Age, the traditions that continue into the modern world* (Tahere Salehi, The Effect of Training Self-Control and Empathy According to Spirituality on Self-Control and Empathy Preschool Female Students in Shiraz City). The prophets who emerged during that era taught love, compassion, and introspection, not only as an otherworldly moral dictum but also as a means of maintaining group coherence and the common good. The breakdown of what Jaynes called the bicameral mind was traumatic, and a new empathic mind was needed to replace it, if only to maintain social order.

Social order has become a self-conscious obsession ever since, as Jaynesian consciousness in its tendency toward rigidity has inherent weaknesses. Social disconnection is a crippling of the mind because the human psyche is inherently social. Imagining our future selves is a relationship with a more expansive sense of self. It’s the same mechanism as relating to any other person. This goes back to Johann Hari’s idea, based on Bruce K. Alexander’s rat park research, that the addict is the ultimate individual. In this context, this ultimate individual lacking self-control is not only disconnected from other people but also disconnected from themselves. Addiction is isolating and isolation promotes addiction. Based on this understanding, I’ve proposed that egoic consciousness is inherently addictive and that post-axial society is dependent on addiction for social control.

But this psychological pattern is seen far beyond addiction. It fits our personal experience of self. When we were severely depressed, we couldn't imagine or care about the future. This definitely inhibited self-control and led to more impulsive behavior, as we were stuck in a present-oriented psychological survival mode. Then again, the only reason self-control is useful at all is because, during and following the Axial Age, humans increasingly lost the capacity of being part of a communal identity, the identity that had created the conditions of communal control, the externally perceived commands of archaic authorization through voice-hearing. We've increasingly lost the capacity for a communal identity (extended mind/self) and hence a communal empathy, something that sounds strange or unappealing to the modern mind. In denying our social nature, this casts the shadow of authoritarianism, an oppressive and often violent enforcement of top-down control.

By the way, this isn't merely about psychology. Lead toxicity causes higher rates of impulsivity and aggression. That is not personal moral failure but brain damage from poisoning. Sure, teaching brain-damaged kids and adults to have more empathy might help them overcome their disability. But if we are to develop an empathic society, we should learn to have enough empathy not to wantonly harm the brains of others with lead toxicity and other causes of stunted development (malnutrition, stress, ACEs, etc.), just because they are poor or minorities and can't fight back. Maybe we need to first teach politicians and business leaders basic empathy, in overcoming the present dominance of psychopathic traits, so that they could learn the self-control not to harm others.

The part of the brain involved in cognitive empathy and theory of mind is generally involved with selflessness and pro-social behavior. To stick with brain development and neurocognitive functioning, let's look at diet. Weston A. Price, in studying traditional populations that maintained healthy diets, observed what he called moral health, in that people seemed kinder, more helpful, and happier — they got along well. A strong social fabric and a culture of trust are not abstractions but are built into general measures of health; in the case of Price's work, this had to do with nutrient-dense animal foods containing fat-soluble vitamins. As the standard American diet has worsened, so has mental health. That link, oddly enough, is a reason for hope: what diet has damaged, diet might repair. In an early study on the ketogenic diet as applied to childhood diabetes, the researchers made a side observation that not only did the diabetes symptoms improve but so did behavior. I've theorized about how a high-carb diet might be one of the factors that sustains the addictive and egoic self.

Narrow rigidity of the mind, as seen in the extremes of egoic consciousness, has come to be accepted as a social norm and even a social ideal. It is the social Darwinian worldview that has contributed to the rise of both competitive capitalism and the Dark Triad (psychopathy, narcissism, and Machiavellianism), and unsurprisingly it has led to a society that lacks awareness and appreciation of the harm caused to future generations (Scott Barry Kaufman, The Dark Triad and Impulsivity). Rather than being normalized, maybe this dysfunction should be seen as a sickness, not only a soul sickness but a literal sickness of the body-mind that can be scientifically observed and measured, not to mention medically and socially treated. We need to thin the boundaries of the mind so as to expand our sense of self. Research shows that those with thinner boundaries not only identify more with their future selves but also with their past selves, maintaining a connection to what it felt like to be a child. We need to care for ourselves and others in the way we would protect a child.

* * *

* In their article "Alone and aggressive", A. William Crescioni and Roy F. Baumeister included the loss of meaning among the effects of social rejection. That loss is maybe associated with the loss of empathy, specifically in understanding the meaning of others (e.g., the intention 'behind' words, gestures, and actions). Meaning traditionally has been the purview of religion. And I'd suggest that it is not a coincidence that the obsession with meaning arose in the Axial Age, right when words were invented for 'religion' as a formal institution separate from the rest of society. As Julian Jaynes argues, this was probably in response to the sense of nostalgia and longing that followed the silence of the gods, spirits, and ancestors.

A different kind of social connection had to be taught, but this post-bicameral culture wasn’t and still isn’t as effective in re-creating the strong social bonds of archaic humanity. Periods of moral crisis in fear of societal breakdown have repeated ever since, like a wound that was never healed. I’ve previously written about social rejection and aggressive behavior in relation to this (12 Rules for Potential School Shooters) — about school shooters, I explained:

Whatever they identify or don’t identify as, many and maybe most school shooters were raised Christian and one wonders if that plays a role in their often expressing a loss of meaning, an existential crisis, etc. Birgit Pfeifer and Ruard R. Ganzevoort focus on the religious-like concerns that obsess so many school shooters and note that many of them had religious backgrounds:

“Traditionally, religion offers answers to existential concerns. Interestingly, school shootings have occurred more frequently in areas with a strong conservative religious population (Arcus 2002). Michael Carneal (Heath High School shooting, 1997, Kentucky) came from a family of devoted members of the Lutheran Church. Mitchell Johnson (Westside Middle School shooting, 1998, Arkansas) sang in the Central Baptist Church youth choir (Newman et al. 2004). Dylan Klebold (Columbine shooting, 1999, Colorado) attended confirmation classes in accordance with Lutheran tradition. However, not all school shooters have a Christian background. Some of them declare themselves atheists…” (The Implicit Religion of School Shootings).

Princeton sociologist Katherine Newman, in studying school shootings, has noted that, “School rampage shootings tend to happen in small, isolated or rural communities. There isn’t a very direct connection between where violence typically happens, especially gun violence in the United States, and where rampage shootings happen” (Common traits of all school shooters in the U.S. since 1970).

It is quite significant that these American mass atrocities are concentrated in "small, isolated or rural communities" that are "frequently in areas with a strong conservative religious population". That might more precisely indicate who these school shooters are and what they are reacting to. One might also note that rural areas in general, and the South specifically, have high rates of gun-related deaths, although many of them are listed as 'accidental'; which is to say, most rural shootings involve people who know each other, something also true of school shootings.

* * *

Brain stimulation reveals crucial role of overcoming self-centeredness in self-control
by Alexander Soutschek, Christian C. Ruff, Tina Strombach, Tobias Kalenscher and Philippe N. Tobler

Empathic Self-Control
by David Shoemaker

People with a high degree of self-control typically enjoy better interpersonal relationships, greater social adjustment, and more happiness than those with a low degree of self-control. They also tend to have a high degree of empathy. Further, those with low self-control also tend to have low empathy. But what possible connection could there be between self-control and empathy, given that how one regulates oneself seems to have no bearing on how one views others. Nevertheless, this paper aims to argue for a very tight relation between self-control and empathy, namely, that empathy is in fact one type of self-control. The argument proceeds by exploring two familiar types of self-control, self-control over actions and attitudes, the objects for which we are also responsible. Call the former volitional self-control and the latter rational self-control. But we also seem to be responsible for—and have a certain type of control and self-control over—a range of perceptual states, namely, those in which we come to see from another person’s perspective how she views her valuable ends and what her emotional responses are to their thwarting or flourishing. This type of empathic self-control is a previously-unexplored feature of our interpersonal lives. In addition, once we see that the type of empathy exercised is also exercised when casting ourselves into the shoes of our future selves, we will realize how intra-personal empathy better enables both volitional and rational self-control.

Science Says When Self-Control Is Hard, Try Empathizing With Your Future Self
by Lindsay Shaffer

Soutschek's study also reveals what happens when we fail to exercise the empathic part of our brain. When Soutschek interrupted the empathic center of the brain in 43 study volunteers, they were more likely to take a small amount of cash immediately over a larger amount in the future. They were also less inclined to share the money with a partner. Soutschek's study showed that the more people are stuck inside their own perspective, even just from having the empathic part of their brain disrupted, the more likely they are to behave selfishly and impulsively.

Self-Control Is Just Empathy With Your Future Self
by Ed Yong

This tells us that impulsivity and selfishness are just two halves of the same coin, as are their opposites restraint and empathy. Perhaps this is why people who show dark traits like psychopathy and sadism score low on empathy but high on impulsivity. Perhaps it’s why impulsivity correlates with slips among recovering addicts, while empathy correlates with longer bouts of abstinence. These qualities represent our successes and failures at escaping our own egocentric bubbles, and understanding the lives of others—even when those others wear our own older faces.

New Studies in Self Control: Treat Yourself Like You’d Treat Others
from Peak

A new study recently shifted the focus to a different mechanism of self control. Alexander Soutschek and colleagues from the University of Zurich believe self-control may be related to our ability to evaluate our future wants and needs.

The scientists suggest that this takes place in an area of the brain called the rTPJ, which has long been linked to selflessness and empathy for others. It’s an important part of our ability to “take perspectives” and help us step into the shoes of a friend.

The scientists hypothesized that perhaps the rTPJ treats our “future self” the same way it treats any other person. If it helps us step into our friend’s shoes, maybe we can do the same thing for ourselves. For example, if we’re deciding whether to indulge in another pint of beer at a bar, maybe our ability to hold off is related to our ability to imagine tomorrow morning’s hangover. As science writer Ed Yong explains, “Think of self-control as a kind of temporal selflessness. It’s Present You taking a hit to help out Future You.”

Empathy for Your Future Self
by Reed Rawlings

Further Research on the TPJ

The results of Soutschek's team were similar to past work on empathy, the future self, and the TPJ. It's believed a better-connected rTPJ increases the likelihood of prosocial behaviors, which relates to skills of executive function. Individuals who exhibit lower empathy score higher for impulsivity – the opposite of self-control.

Keeping our future selves in mind may even keep our savings in check. In this research, Stanford University researchers tested "future self-continuity". They wanted to explore how individuals related to their future selves. Participants were asked to identify how they felt about the overlap between their current and future selves, using Venn diagrams of varying overlap for this exercise.

If they saw themselves as separate, they were more likely to choose immediate rewards. A greater overlap increased the likelihood of selecting delayed rewards. In their final study, they assessed individuals from the San Francisco Bay area. The researchers found a correlation between wealth and an overlap between selves.

While the above research is promising, it doesn’t paint a full picture. Empathy seems useful, but making a sacrifice for our future-self requires that we understand the reason behind it. It’s the sacrifice that is especially crucial – positive gains demand negative trade-offs.

That’s where altruism, our willingness to give to others, comes in.

Why Do We Sacrifice?

Research from the University of Zurich examined some of altruism's driving factors. Their work came up with two correlations. First, the larger your rTPJ, the more likely you are to behave altruistically. Second, concerns about fairness affect how we give.

In this experiment, individuals were more generous if their choice would decrease inequality. When inequality would increase, participants were less likely to give.

This is an understandable human maxim. We have little reason to give to an individual who has more than we do. It feels completely unfair to do so. However, we’re raised to believe that helping those in need is objectively good. Helping ourselves should fall under the same belief.

Empathy and altruism, when focused on our own well-being, are intimately linked. To give selflessly, we need to have a genuine concern for another’s well-being. In this case, the ‘other’ is our future self. Thankfully, with a bit of reflection, each of us can gain a unique insight into our own lives.

Alone and aggressive: Social exclusion impairs self-control and empathy and increases hostile cognition and aggression.
by A. William Crescioni and Roy F. Baumeister
from Bullying, Rejection, and Peer Victimization ed. by Monica J. Harris
pp. 260-271 (full text)

Social Rejection and Emotional Numbing

Initial studies provided solid evidence for a causal relationship between rejection and aggression. The mechanism driving this relationship remained unclear, however. Emotional distress was perhaps the most plausible mediator. Anxiety has been shown to play a role in both social rejection (Baumeister & Tice, 1990) and ostracism (Williams et al., 2000). Emotional distress, however, was not present in these experiments by Twenge et al. (2001). Only one significant mood effect was found, and even this effect deviated from expectations. The sole difference in mood between rejected and accepted participants was a slight decrease in positive affect. Rejected participants did not show any increase in negative affect; rather, they showed a flattening of affect, in particular a decrease in positive affect. This mood difference did not constitute a mediator of the link between rejection and aggression. It did, however, point toward a new line of thinking. It was possible that rejection would lead to emotional numbing rather than causing emotional distress. The flattening of affect seen in the previous set of studies would be consistent with a state of cognitive deconstruction. This state is characterized by an absence of emotion, an altered sense of time, a fixation on the present, a lack of meaningful thought, and a general sense of lethargy (Baumeister, 1990). […]

Rejection and Self-Regulation

Although the emotional numbness and decrease in empathy experienced by rejected individuals play an important role in the link between social rejection and aggression, these effects do not constitute a complete explanation of why rejection leads to aggression. The diminished prosocial motivations experienced by those lacking in empathy can open the door to aggressive behavior, but having less of a desire to do good and having more of a desire to do harm are not necessarily equivalent. A loss of empathy, paired with the numbing effects of rejection, could lead individuals to shy away from those who had rejected them rather than lashing out. Emotional numbness, however, is not the only consequence of social rejection.

In addition to its emotional consequences, social rejection has adverse effects on a variety of cognitive abilities. Social rejection has been shown to decrease intelligent thought (Baumeister, Twenge, & Nuss, 2002) and meaningful thought (Twenge et al., 2002). But another category of cognitive response is self-regulation. Studies have demonstrated that self-regulation depends upon a finite resource and that acts of self-regulation can impair subsequent attempts to exercise self-control (Baumeister, Bratslavsky, Muraven, & Tice, 1998). Self-regulation has been shown to be an important tool for controlling aggressive impulses. Stucke and Baumeister (2006) found that targets whose ability to self-regulate had been depleted were more likely to respond aggressively to insulting provocation. DeWall, Baumeister, Stillman, and Galliot (2007) found that diminished self-regulatory resources led to an increase in aggression only in response to provocation; unprovoked participants showed no increase in aggressive behavior. Recall that in earlier work (Twenge et al., 2002) rejected individuals became more aggressive only when the target of their aggression was perceived as having insulted or provoked them. This aggression could have been the result of the diminished ability of rejected participants to regulate their aggressive urges. […]

These results clearly demonstrate that social rejection has a detrimental effect on self-regulation, but they do not explain why this is so and, indeed, the decrement in self-regulation would appear to be counterproductive for rejected individuals. Gaining social acceptance often involves regulating impulses in order to create positive impressions on others (Vohs, Baumeister, & Ciarocco, 2005). Rejected individuals should therefore show an increase in self-regulatory effort if they wish to create new connections or prevent further rejection. The observed drop in self-regulation therefore seems maladaptive. The explanation for this finding lies in rejection’s effect on self-awareness.

Self-awareness is an important prerequisite of conscious self-control (Carver & Scheier, 1981). Twenge et al. (2002) found that, when given the option, participants who had experienced rejection earlier in the study were more likely to sit facing away from rather than toward a mirror. Having participants face a mirror is a common technique for inducing self-awareness (Carver & Scheier, 1981), so participants' unwillingness to do so following rejection provides evidence of a desire to avoid self-awareness. A drop in self-awareness is part of the suite of effects that comprises a state of cognitive deconstruction. Just as emotional numbness protects rejected individuals from the emotional distress of rejection, a drop in self-awareness would shield against awareness of personal flaws and shortcomings that could have led to that rejection. The benefit of this self-ignorance is that further distress over one's inadequacies is mitigated. Unfortunately, this protection carries the cost of decreased self-regulation. Because self-regulation is important for positive self-presentation (Vohs et al., 2005), this drop in self-awareness could ironically lead to further rejection. […]

These data suggest that social rejection does not decrease the absolute ability of victims to self-regulate but rather decreases their willingness to exert the effort necessary to do so. Increased lethargy, another aspect of cognitive deconstruction, is consistent with this decrease in self-regulatory effort. Twenge et al. (2002) found that social rejection led participants to give shorter and less detailed explanations of proverbs. Because fully explaining the proverbs would require an effortful response, this shortening and simplification of responses is evidence of increased lethargy amongst rejected participants. This lethargy is not binding, however. When given sufficient incentive, rejected participants were able to match the self-regulatory performance of participants in other conditions. Inducing self-awareness also allowed rejected individuals to self-regulate as effectively as other participants. In the absence of such stimulation, however, rejected individuals showed a decrement in self-regulatory ability that constitutes an important contribution to explaining the link between rejection and aggression. […]

Rejection and Meaningfulness

Twenge et al. (2002) found that social rejection led to a decrease in meaningful thought among participants, as well as an increased likelihood of endorsing the statement, "Life is meaningless." Williams (2002) has also suggested that social rejection ought to be associated with a perception of decreased meaning in life. Given the fundamental nature of the need to belong, it makes sense that defining life as meaningful would be at least in part contingent on the fulfillment of social needs. A recent line of work has looked explicitly at the effect of social rejection on the perception of meaning in life. Perceiving meaning in life has been shown to have an inverse relationship with hostility, aggression, and antisocial attitude (Mascaro, Morey, & Rosen, 2004). As such, any decrease in meaning associated with social rejection would constitute an important feature of the explanation of the aggressive behavior of rejected individuals.

The God of the Left Hemisphere:
Blake, Bolte Taylor and the Myth of Creation
by Roderick Tweedy

The left hemisphere is competitive… the will to power…is the agenda of the left hemisphere. It arose not to communicate with the world but to manipulate it. This inability to communicate or co-operate poses great difficulties for any project of reintegration or union. Its tendency would be to feed off the right hemisphere, to simply use and gain power over it too. Left hemisphere superiority is based, not on a leap forward by the left hemisphere, but on a ‘deliberate’ handicapping of the right. There is perhaps as much chance of persuading the head of a multinational to stop pursuing an agenda of self-interest and ruthless manipulation as there is of persuading the Urizenic program of the brain which controls him of “resubmitting” itself to the right hemisphere’s values and awareness.

The story of the Western world being one of increasing left-hemispheric domination, we would not expect insight to be the key note. Instead we would expect a sort of insouciant optimism, the sleepwalker whistling a happy tune as he ambles towards the abyss.

The left, rational, brain, it might be safe to conclude, has no idea how serious the problem is, that is to say, how psychopathic it has become. Of course, it doesn’t care that it doesn’t care. “The idiot Reasoner laughs at the Man of Imagination/And from laughter proceeds to murder by undervaluing calumny”, noted Blake in a comment that is only remarkable for the fact that it has taken two hundred years to understand.

The apparently “conscious” rational self, the driving program and personality of the left brain, turns out to be deeply unconscious, a pathological sleepwalker blithely poisoning its own environment whilst tenaciously clinging onto the delusion of its own rightness. This unfortunate mixture, of arrogance and ignorance, defines contemporary psychology. The left hemisphere not only cannot see that there is a problem, it cannot see that it is itself the problem.

The Commons of World, Experience, and Identity

The commons, sadly, have become less common. Mark Vernon writes that, “in the Middle Ages, fifty per cent or more of the land was commons, accessible to everybody” (Spiritual Commons). The Charter of the Forest formally established the commons within English law and it lasted from 1217 to 1971. That isn’t some ancient tradition but survived far into modernity, well within still living memory. The beginning of the end was the enclosure movement that was first seen not long after the Charter was signed into law, but the mass evictions of peasants from their land wouldn’t happen until many centuries later with sheep herding, coal mining, and industrialization.

It's hard for us to imagine what the commons was. It wasn't merely about land and resources, about the customs and laws around rights and responsibilities, about who had access to what and in what ways. The commons was a total social order, a way of being. The physical commons was secondary to the spiritual commons as community, home, and sense of place ("First came the temple, then the city.") — "Landscape is memory, and memory in turn compresses to become the rich black seam that underlies our territory" (Alan Moore, "Coal Country", from Spirits of Place); "…haunted places are the only ones people can live in" (Michel de Certeau, The Practice of Everyday Life). The commons also was a living force, at a time when Christianity permeated every aspect of life and when the felt experience of Paganism continued in local traditions and stories, often incorporated into Church rituals and holy days. Within the commons, there was a shared world where everyone was accountable to everyone else. Even a chicken or a wagon could be brought to court, according to the English common law of deodands (Self, Other, & World).

The parish was an expression of the commons, embodying local community and identity that was reinforced by the annual beating of the bounds, a practice that goes back to ancient Rome, a faint memory of what was once likely akin to the Aboriginal songlines in invoking a spiritual reality. It was within the parish that life revolved and community was maintained, such as in settling disputes and taking care of the sick, crippled, elderly, widowed, and orphaned. We can't genuinely care about what we feel disconnected from. Community is fellowship, kinship, and neighborliness, is intimate relationship and familiarity. This relates to why Germanic 'freedom' meant to be part of a free people and etymologically was related to 'friendship', as opposed to Latin 'liberty' that merely indicated one wasn't enslaved while surrounded by those who were (Liberty, Freedom, and Fairness).

"It is the non-material aspects of life," Vernon suggests, "that, more often than not, are crucial for finding meaning and purpose, particularly when life involves suffering." He states that a crucial element is to re-imagine, and that makes me think of the living imagination, or what some call the imaginal, as described by William Blake, Henry Corbin, James Hillman, Patrick Harpur, and many others. And to re-imagine would mean to re-experience in a new light. He goes on to speak of the ancient Greek view of time. John Demos, in Circles and Lines, explains how cyclical time remained central to American experience late into the colonial era and, as the United States wasn't fully urbanized until the 20th century, surely persisted in rural areas for much longer. Cyclical time was about a sense of recurrence and return, central to the astrological worldview that gave us the word 'revolution', that is, to revolve. The American Revolutionaries were hoping for a return, and the sense of the commons was still strong among them, even as it was disappearing quickly.

Instead of time as abundance, the modern world feels like time is always running out and closing in on us. We have no sense of openness to the world, as we've become insulated within egoic consciousness and hyper-individualism. As with beating the bounds of the parish, cyclical time contains the world within a familiar landscape of weather patterns and seasons, of the sun, moon, and stars — the North Wind is a force and a being, shaping the world around us; the river that floods the valley is the bringer of life. The world was vitally and viscerally alive in a way few moderns have ever experienced. Our urban yards and our rural farms are ecological deserts. City lights and smog hide the heavens from our view. Let us share a longer excerpt from Vernon's insightful piece:

“Spiritual commons are often manifest in and through the loveliness of the material world, so that matters as well. It’s another area, alongside education, where spiritual commons has practical implications. That was spotted early by John Ruskin.

“Consider his 1884 lecture, The Storm-Cloud of the Nineteenth Century, in which he noted that “one of the last pure sunsets I ever saw” was in 1876, almost a decade previously. The colours back then were “prismatic”, he said, the sun going into “gold and vermillion”. “The brightest pigments we have would look dim beside the truth,” he continued. He had attempted to reflect that glorious manifestation of the spiritual commons in paint.

“He also knew that his experience of its beauty was lost because the atmosphere was becoming polluted. As a keen observer of nature, he noted how dust and smoke muddied and thinned the sky’s brilliance. In short, it would be crucial to clean up the environment if the vivid, natural displays were to return. Of course. But the subtler point Ruskin draws our attention to is the one about motivation: he wanted the vivid, natural displays because he had an awareness of, and desire for, spiritual commons.”

That is reminiscent of an event from 1994. There was a major earthquake on the West Coast and Los Angeles had a blackout. The emergency services were swamped with calls, not from people needing help for injuries but out of panic over the strange lights they were seeing in the sky. It scared people, as if the lights were more threatening than the earthquake itself — actual signs from the heavens. Eventually, the authorities were able to figure out what was going on. Thousands of urbanites were seeing the full starry sky for the first time in their entire lives. That situation has worsened since then, as mass urbanization is pushed to further extremes and, even though smog has lessened, light pollution has not (Urban Weirdness). We are literally disconnected from the immensity of the world around us, forever enclosed within our own human constructions. Even our own humanity has lost its wildness (see Paul Shepard's The Others: How Animals Made Us Human).

We can speak of the world as living, but to most of us that is an abstract thought or a scientific statement. Sure, the world is full of other species and ecosystems. That doesn't capture the living reality itself, though, the sense of vibrant and pulsing energy, the sounds and voices of other beings (Radical Human Mind: From Animism to Bicameralism and Beyond) — this is what the neuroanatomist Jill Bolte Taylor, in her "Stroke of Insight", described as the "life-force power of the universe" (see Scott Preston's Immanence of the Transcendent & The Premises of Our Existence), maybe related to what Carl Jung referred to as the "objective psyche". One time while tripping on magic mushrooms, I saw-felt the world glistening; the fields shimmered in the wind and moonlight, and everything breathed a single breath in unison.

That animistic worldview once was common, as was the use of psychedelics, prior to their being outlawed and increasingly replaced by addictive substances, from nicotine to caffeine (The World that Inhabits Our Mind). And so the addictive mind has built up psychic scar tissue, the thick walls of the mind that safely and comfortably contain us ("Yes, tea banished the fairies." & Diets and Systems). Instead of beating the bounds of a parish, we beat the bounds of our private egoic territory, our thoughts going round and round like creatures caught in a tidal pool that is drying up in the harsh sunlight — when will the tide come back in?

* * *

Here is some additional historical info. The feudal laws were to some extent carried over into North America. In early America, legally owning land didn’t necessarily mean much. Land was only effectively owned to the degree you used it and that originally was determined by fencing. So, having a paper that says you own thousands of acres didn’t necessarily mean anything, if it wasn’t being maintained for some purpose.

It was every citizen’s right to use any land (for fishing, hunting, gathering, camping, etc) as long as it wasn’t fenced in — that was at a time when fencing was expensive and required constant repair. This law remained in place until after the Civil War. It turned out to be inconvenient to the whites who wanted to remain masters, as blacks could simply go anywhere and live off of the land. That was unacceptable and so blacks needed to be put back in their place. That was the end of that law.

But there were other similar laws about land usage. Squatting rights go back far into history. Even to this day, if someone shows no evidence of using and maintaining a building, someone who squats there for a period of time can claim legal ownership of it. Some of my ancestors were squatters. My great grandfather was born in a house his family was squatting in. Another law still in place has to do with general land usage. If someone uses your land to graze their horses or as a walking path, some laws allow a legal claim to that continued use of the land, unless the owner explicitly sent legal paperwork in advance asserting his ownership.

There was a dark side to this. Canada also inherited this legal tradition from feudalism. In one case, a family owned land that they enjoyed but didn’t explicitly use. It was simply beautiful woods. A company was able to dredge up an old law that allowed them to assert their right to use the land that the family wasn’t using. Their claim was based on minerals that were on the property. They won the case and tore up the woods for mining, despite having no ownership of the land. Those old feudal laws worked well in feudalism but not always so well in capitalism.

I’ll end on a positive note. There was a law that was particularly common in Southern states. It basically stated that an individual’s right to land was irrevocable. Once you legally owned land, no one could ever forcefully take it away from you. Even if you went into debt or didn’t pay your taxes, the land would be yours. The logic was that land meant survival. You could be utterly impoverished and yet access to land meant access to food, water, firewood, building materials, etc. The right to basic survival, sustenance, and subsistence could not be taken away from anyone (well, other than Native Americans, African-Americans, Mexican-Americans, etc; okay, not an entirely positive note to end on).

Alienated Middle Class Whites

I’ve been reading Timothy Carney’s book Alienated America that came out this past year (and already posted about it). Like so many others, it’s about what brought us to a Trump presidency. But this particular piece of journalism stands out from the crowd, albeit not a difficult achievement. I’m giving the author extra credit points because he is somewhat balanced. For a conservative (paleo-libertarian?) henchman of the American Enterprise Institute living in Washington, D.C., he surprisingly brings up a number of points a left-winger could at least partly agree with.

Looking through the book, I kept expecting to be more critical. The political right bias was there, but Carney also drew upon the views of the political left, if not to the degree and depth I’d have preferred. He discusses the history of racism, gender bias, privilege, etc. Then he puts this in context of the problems of conservative nostalgia and revanchism. He takes some pointed jabs at the right, although he plays to the frame of ‘moderation’ in believing the truth is somewhere to be found in a hypothetical ‘centrism’, such as his critique of both individualism and collectivism or his critique of both big gov and big biz.

In giving the political left credit, he admits the importance of economic factors, such as rising inequality, and he also brings up the problems of segregation and mistrust. But he is completely unaware that diversity only leads to loss of trust when combined with segregation (Eric Uslaner, Segregation and Mistrust). Nor does he appreciate how far-reaching are the effects of inequality (Keith Payne, The Broken Ladder; Richard Wilkinson & Kate Pickett, The Inner Level). His view is not superficial or lacking in nuance, even as he remains trapped in capitalist realism. But he is coming from a more or less conventional worldview, even if he stretches its boundaries a bit, although admittedly he does bring some good points to the table (The Right’s Lena Dunham Fallacy).

Here is the basic limitation. He constantly associates one positive factor with another in the assumption that the link is causal and goes in the direction that fits his beliefs, but he rarely if ever goes beyond correlation and he doesn’t acknowledge all the immense data and examples that contradict his assumptions and conclusions. Consider Scandinavians, who show better results on numerous measures: poverty, unemployment, inequality, small business ownership, patents per capita, education, health, etc. They do this with highly conformist and collectivist societies with centralized welfare states and without the kind of civic participation seen in the US; for example, schools are operated professionally by highly trained and heavily unionized teachers, and parents don’t belong to an equivalent of the PTA nor do they typically volunteer at their children’s schools. Yet it can be argued they somehow have a stronger and healthier form of individualism (Anu Partanen, The Nordic Theory of Everything). Such examples show that Edmund Burke’s “small platoons” can be as large and centralized as a highly advanced modern nation-state. It is true they are smaller nation-states, but large enough to have ambassadors and international policy, maintain militaries, and be allies with global superpowers.

Carney barely discusses anything outside of the United States. As I recall, he mentions Scandinavia once or twice and even then only in passing. Scandinavia undermines every aspect of his conclusions. That is the problem. He covers a lot of material and, for a mainstream writer, it is reasonably comprehensive as non-academic popular writing. But he never quite brings it together and he offers no meaningful solutions. What could have been a more worthy book stopped short of challenging the paradigm itself and pushing to an entirely different perspective and level of insight. Instead, he offers an obsession with social conservatism, if slightly more interesting than the standard approach. He makes a decent argument for what it is, maybe one of the better mainstream conservative arguments I’ve come across. He actually engages with diverse info. If nothing else, it will serve the purpose of introducing conservatives and right-wingers to a wealth of info and ideas they otherwise would never see.

I’m not sure I can hold the limitations against the author. Even if it fails in the end, it doesn’t fail to a greater degree than is expected. The analysis is adequate and, within the chosen framework, it was inevitable that it couldn’t really go anywhere beyond known territory. Even so, I really did appreciate how much space he gave to a topic like inequality. An example of where it falls short is in not even touching on the saddest of inequalities, that of environment and health. It’s not merely that the poor don’t have access to green spaces and nice schools. The poor are literally being poisoned by lead in old pipes and toxic dumps located in poor communities. The oppressed poor aren’t accidental victims; their communities were destroyed by design, in the kind of capitalism we have that makes profit by devouring ‘social capital’. Still, it’s amazing how much he is willing to speak of, considering who employs him and who is his likely audience, but it ends up feeling like a wad of loose threads. The edges of his argument are as frayed as the social fabric he details. There is no larger context to hold it together, which is to be expected as the author is part of the very same problematic mainstream social order he is attempting to understand… and doing so on the same level the problem was created.

Though going far beyond where most on the political right dare to tread, he never fully takes seriously the ideals of basic human rights and moral righteousness nor the democratic values of fairness and justice as being of paramount importance. The entire history of corporatocratic and plutocratic capitalism is that of violence, oppression, and theft. The kind of analysis in Alienated America, no matter how fair-minded and reasonable in intention (if we give the author the benefit of the doubt), doesn’t confront the bloody corpse of the elephant in the room, the reality that capitalism only applies to the poor while the rich get socialism (Trillions Upon Trillions of Dollars). Neither church attendance nor marriage rates could come close to undoing the moral harm. Forget the social fabric. We need to worry about the moral foundation of modern civilization.

As someone harshly put it, “Just a rehash of the same old “Trickle Down Economics” and “Thousand Points of Light” BS. Shrink government down till you can drown it in the bathtub destroying the social safety net while cutting taxes on the wealthy and corporations and miraculously private local organizations will jump in to take care of everything. At least try and come up with a more plausible explanation for the disaster to divert us from the truth that the gangster capitalism the Republican Party has been pushing on America since Reagan” (comment to Andy Smarick’s Where civic life crumbled, Donald Trump arose). I might be slightly more forgiving as I came to it with low expectations, but basically this comment is correct.

Carney’s argument is intellectually reasonable as far as mainstream arguments go, but it lacks a gut-level punch. He remains within the range of respectability, not getting too close to anything that might be mistaken as radical. Envisioning a slightly more friendly capitalism is not exactly a new proposition or overly inspiring. Nonetheless, his refusal to scapegoat individuals and his refusal to think of communities in isolation is refreshing. His focus on alienation is key, even as I personally find Johann Hari (Chasing the Scream & Lost Connections) to be much more probing in getting to the heart of the matter, but that ultimately is just to complain that Carney isn’t a left-winger, not that Hari is extremely radical either.

Where his take offered light to see by was in his dissection of Trump supporters and voters. He does a wonderful takedown of the mainstream narrative that it was the highly religious who were behind Trump’s election. Opposite of this narrative, the facts show that, as church attendance went up in a community, Trump’s voter count went down in that location. His ‘religious’ followers were mostly the unchurched and, interestingly, those lacking in an ethnic identity, as contrasted with traditionally religious and community-minded populations such as Dutch-American Calvinists (Terry Mattingly, Journalists don’t understand religious fault lines in ‘alienated’ America). Yet those unchurched Trump supporters claimed that religion was important to them, apparently as a symbolic issue among those who have otherwise lost meaning in their lives, which seems to be Carney’s takeaway. It reminds me of how school shooters are also concentrated in similar communities and, even when non-religious, the assailants often express religious-like concern for meaning (12 Rules for Potential School Shooters).

He busted another myth in pointing out that core support for Trump, although coming from economically struggling populations, did not specifically come from the poor but rather the wealthier in those communities (yet strangely he kept reinvoking the very myth he disproved and dismantled, in returning his focus to poor whites). This economic class of the relatively comfortable apparently has a troubled relationship with their impoverished ‘neighbors’, either in a fear of them or in a fear of becoming like them, which is to say class anxiety in one way or another. It’s understandable, as the middle class has been shrinking and surely it is shrinking the most in those economically distressed communities. And that hits white males most of all: while many other demographics (women, minorities, and immigrants) have seen improving economic outcomes over the past half century, white males are now making less than they did in the past.

On the other hand, the wealthier in wealthier communities are more protected from these problems and so felt no attraction to Trump’s demagoguery; their local economies are less stressed and divided. It indicates that, though Carney didn’t explain it this way, the real problem is inequality, where it was immediately felt and where it was not. The more well-off communities could either ignore inequality altogether as if it didn’t exist or else treat it as a problem of other people elsewhere. To the economically-segregated elites, inequality is an abstraction that isn’t viscerally real in their immediate experience and so, in the mind of the privileged, it is not personally relevant or morally compelling. But such dissociation can only last for so long as society crumbles all around their walled enclaves — as Keith Payne makes clear, even the rich are stressed, suffer, and become sick under conditions of high inequality. Ultimately, there is no escape from a society gone mad, especially when that society is the leading global superpower.

Where Carney really gets things right is about isolation and alienation. And it doesn’t happen in the way most would expect. Why is this particular middle class white demographic so anxiety-ridden and not other populations? In dealing with everyday needs and problems, Carney writes that, “Trump voters—as compared with Ted Cruz voters, or Bernie or Hillary supporters—answered, “I just rely on myself” the most.” That is quite telling. Sanders won the largest proportion of the poor and working class, far more than Trump. So, similar to how the wealthy in wealthy communities feel greater trust and connection toward their neighbors, so do many of the poor.

Stephen Steinberg writes that, “In her 1973 study All Our Kin, Carol Stack showed how poor single mothers develop a domestic network consisting of that indispensable grandmother, grandfathers, uncles, aunts, cousins, and a patchwork of neighbors and friends who provide mutual assistance with childrearing and the other exigencies of life. By comparison, the prototypical nuclear family, sequestered in a suburban house, surrounded by hedges and cut off from neighbors, removed from the pulsating vitality of poor urban neighborhoods, looks rather bleak. As a black friend once commented, “I didn’t know that blacks had weak families until I got to college.”” (Poor Reason; see Black Families: “Broken” and “Weak”).

So that is what Carney gets wrong. He goes from Trump’s middle-class core supporters being isolated and alienated to shifting the frame back to the mainstream narrative of it somehow being about the declining white working class, in stating that, “In general, poorer people “tend to be socially isolated,” Putnam found, “even from their neighbors.” That probably is true to some extent, but the point is that it isn’t nearly true to the degree found among the anxious middle class. The poorest of the poor, unlike the upwardly aspiring middle class, are those least likely to move to seek a job and so are the most likely to remain living near lifelong connections of family, friends, and neighbors.

Yes, poverty has a way of isolating people, such as being constantly busy with working multiple jobs while unable to afford childcare. Nonetheless, even when they don’t have the time to spend with those important social ties, they know that their social network is always there to fall back on in times of dire need. Sure, the rural poor are increasingly isolated quite literally in a geographic sense, as the rural areas empty out with the young moving to the cities. But in spite of the media loving to obsess over these loneliest of the desperate and aging poor, the reality is the vast majority of the poor, specifically poor whites, have lived in urban areas for over a century now. That isn’t to say it isn’t also shitty to be among the urban poor. But the basic point comes down to something odd going on here. The poorest Americans, contrary to expectation, are not the most anxious and are not those turning most to reactionary politics of nostalgia and strong man leadership. Instead, those on the bottom of society tend to be apolitical and disenfranchised, that is to say they usually don’t vote.

How different that is from Trump’s America. Trump was not speaking to those facing the worst economic hardship but those a few rungs above them. Something happened to the middle class to cause them to feel precarious, as if they had been cheated out of a more comfortable and secure lifestyle that they deserved. Maybe they had sacrificed extended family and community in climbing the economic ladder and pursuing their careers, and it turned out the rewards did not match the costs. So, they were left hanging somewhere in between. “Trump voters were significantly less socially connected,” Carney writes. “There’s plenty more data like this, charting the loneliness and social disconnection in Trump’s early core support.” For certain, something is making middle class whites go crazy and not merely those gripping the lowest edge of it (Fractures of a Society Coming Apart). Look at the breakdown of Trump voters, from my post Right-Wing Politics of the Middle Class, and notice it doesn’t fit the narrative spun in the corporate media:

“Trump voters seemed to include many average Americans, although Trump voters were slightly above the national average on wealth. With incomes below $50,000, 52% for Clinton and 41% for Trump. With incomes more than $50,000, 49% for Trump and 47% for Clinton. A large part of Trump’s votes came from the $50,000 to $100,000 income range, i.e., the middle class. The only income bracket that Trump lost to Clinton was those who make $49,999 and under. Trump’s victory came from the combined force of the middle-to-upper classes. Trump did get strong support from those without a college degree (i.e., some college or less), but then again the vast majority of Americans lack a college degree. It’s easy to forget that even many in the middle class lack college degrees. Factory jobs and construction jobs often pay more than certain professional careers such as teachers and tax accountants. I’m sure a fair number of low-level managers and office workers lack college degrees.

“Among white voters alone, though, Trump won more college-educated than did Clinton. The white middle class went to Trump, including white women with college degrees. Only 1 in 6 Trump voters were non-college-educated whites earning less than $50,000. Ignoring the racial breakdown, Trump overall won 52% of those with some college/associate degree, 45% of college graduates, and 37% with postgraduate study. That is a fairly broad swath. A basic point I’d make is that the majority of Trump voters without a college education work in white collar or middle skill jobs, representing the anxious and precarious lower middle class, but it has been argued that the sense of financial insecurity is more perceived than real. The working class, especially the poor, were far from being Trump’s strongest and most important support, despite their greater financial insecurity. Rather, the Trump voters who played the biggest role were those who fear downward economic mobility, whether or not one deems this fear rational (I tend to see it as being rational, considering a single accident or health condition could easily send many in the lower middle class into debt).”

Of course, Carney is making a more targeted point. He is speaking about Trump’s core support in specifying those who were supporting him from the very beginning of his campaign, prior to the GOP nomination. That core support wasn’t the comfortable upper middle class, but still they were solidly middle class above the common rabble. As he further emphasizes, “recall that Trump’s core supporters weren’t necessarily poorer than other voters. But they lived in places that were worse off, culturally and economically, than other places.” That cuts straight to one of Keith Payne’s main points, the way high inequality can feel like poverty even to those who aren’t poor. Economic stress comes in many forms, not limited to outright economic desperation. Inequality, when pushed to extremes, makes everyone feel shitty. And if the sense of conflict lasts long enough, people begin acting crazy, even crazy enough to vote for demagogues, social dominators, and authoritarians.

If we are to seek the cause of this problem, we should look elsewhere to those concentrations of segregated wealth. “Inequality in the United States is growing,” says Carney in pointing out the obvious. “Economic mobility is low. These facts alone suggest that our elites aren’t sharing the wealth.” That is an interesting conclusion coming from the political right, even to suggest they should share the wealth. Now if the right could only admit that most of that wealth was stolen and so needs to be returned, not merely shared, but such breathtaking honesty is far too much to ask for. We have to take what meager honesty we can get, even if it only gives us a glimpse: “This social inequality, as earlier chapters laid out, was far less in the 1960s (racial and gender inequality were far worse, of course). Between the upper class and the working class, there was a far smaller gap in marriage, in divorce, and in out-of-wedlock births. At the root of it all: In 1960, there was a narrower gap in social connectedness, including church attendance. Today, family life and strong community are increasingly a luxury good. And here we can blame the elites.”

If only social conservatives would take seriously what it means to have made the public good a luxury unaffordable to most of the public. But all we are left with is a diatribe of paternalistic moralizing. We don’t need to get rid of this modern aristocracy, so goes the lament, for the moral failure is that they’ve forgotten their noblesse oblige. They need to return to the founding ideal, as embodied by George Washington, of an enlightened aristocracy. Carney preaches that the economic elite need to once again embrace their role as ruling elite, to return plutocracy to its aristocratic roots of theocratic patriarchy. The “more pernicious problem” is an “ideological commitment to egalitarianism among elites that prevents them from seeing themselves as elites.” Yeah, that is where we went wrong. The elites aren’t elitist enough and so they aren’t taking seriously their moral responsibility to compassionately rule over their local populations of neo-feudal serfs, instead locking themselves away in the modern equivalent of a castle keep. I’m glad we got that cleared up. That should set the world right again.

* * *

Alienated America
by Timothy P. Carney

A quick reminder, though, as we discuss election results and “Trump Country”: By the general election in 2016, a vast majority of Republicans had come around to Donald Trump. Many would choose anyone but Hillary. Others had grown fond of the man. By the end of Trump’s first couple of years in office, after two Supreme Court picks and a tax cut, many other right-leaning Americans embraced him.

This book isn’t about those later adopters, though. This book has mostly studied the results of the early primaries to sort out who was Trump’s early core support. When we have looked at general election results, we have been most interested in the voters or places that shifted from Democrat to Republican—the voters who would have stayed home or voted Democrat had Trump not been the nominee.

So on this question—who was Trump’s early core support?—different studies found wildly differing results. You may recall those who said “economic anxiety” was the cause, and those who said they could prove that there was no economic anxiety, just racism at the heart of Trump’s earliest support.

What distinguished these two classes of studies? The studies that found no or little connection between economic woe and Trump support were polls of individuals. Those finding that economic woe predicted Trump support were studies of places.

As a Washington Post headline aptly put it: PLACES THAT BACKED TRUMP SKEWED POOR; VOTERS WHO BACKED TRUMP SKEWED WEALTHIER.

This is one reason we couldn’t tell the story of Trump without discussing community. The story of how we got Trump is the story of the collapse of community, which is also the story behind our opioid plague, our labor-force dropouts, our retreat from marriage, and our growing inequality.

The core Trump voters weren’t the people dying, obviously. They weren’t even necessarily the unhealthy ones. They weren’t necessarily the people drawing disability payments or dropping out of the workforce. Trump’s core voters were these people’s neighbors.

Trump’s win—specifically his wins in the early primaries and his outperformance of Mitt Romney—is best explained by his support in places where communities are in disarray. Many traits characterized Trump’s early core supporters. This chapter will explore them, and we will see how closely they are all tied to alienation.

How Trump Voters Are Giving the Right Qualms About Capitalism
by Park MacDougald

Yet if Carney offers a convincingly bleak view of social collapse in working-class America, his explanation for this collapse — and his suggestions for what to do about it — are somewhat less satisfying. Carney channels, to a limited degree, some of the new right-wing market skepticism: He offers a soft criticism of big business for stamping out local variation in the name of standardization and efficiency; he laments the rise of “Taylorism” and its dehumanization of work; he attacks the “gig economy” for not providing workers with stability; he disapproves of suburbanization and the isolation that stems from it; he even quotes Deneen to the effect that capitalism breeds an individualistic mind-set that makes relationships contingent and easily broken. But in explaining the troubles of working-class America, Carney tends to fall back on the collapse of church and community, which he largely attributes to traditional Republican bogeymen such as the welfare state, the sexual revolution, the rise of expressive individualism, and secularization. These explanations are not wrong per se, but they are so large and fuzzily cultural that they resist solutions beyond the local and individual. Carney offers a few policy fixes he thinks might help — reforming the mortgage interest deduction, decentralizing control over public schools — but he admits in his closing chapter that the “solution is mostly: You should go to church. Also, You should start a T-ball team.”

Generally speaking, it probably is a good idea to start a T-ball team. And Carney’s willingness to critique aspects of American capitalism, mild as they may be, represents a marked shift from where the mainstream right was during the Obama years and where some of its leading lights still are. But at the same time, by delivering an account of a country facing full-blown social collapse and then retreating into calls for local, voluntary solutions, Carney ends up restating the basic premises of an old conservative consensus — it’s not the government’s job to fix your problems — that, as a political philosophy, has contributed to the alienation Carney so convincingly describes. It may be true, for instance, that the state is ill equipped to re-create devastated communities, but it is also true that state policy has enabled or even accelerated their devastation, and not merely in the sense that overregulation has hurt small businesses or that the welfare state has crowded out private charity.

Rising international economic competition, for instance, was always going to hurt the American working class. But as critics on both the left and the right have pointed out, globalization has been systematically tilted in favor of the mobile and highly educated. The critic Michael Lind, for instance, notes that the international harmonization of economic rules has focused on tariffs, financial liberalization, and intellectual property while avoiding areas that would benefit the Western working classes, such as wages, labor standards, and tax laws. Even some of the more diffuse cultural shifts lamented by conservatives have been midwifed by the state. As Harvard Law professors Jacob Gersen and Jeannie Suk Gersen have argued in their study of the evolution of Title IX, civil-rights laws designed to protect women’s equal access to education have created, through bureaucratic drift and activist institutional capture, a vast federal regulatory apparatus that treats socialization into “traditional” gender roles as a public-health risk and attempts, under the guise of fighting sexual assault, to inculcate among college students a progressive view of gender and sexuality.

The point here is not to chastise Carney for not adopting a more dirigiste political philosophy than the one he presumably holds. It is to say that, even on the right, intellectuals are concluding that the problems Carney identifies are so alarming that localist, laissez-faire solutions simply aren’t going to cut it. In a recent essay in American Affairs, Gladden Pappin issued a broadside against fusionist conservatives who, in his view, waste their energies calling for the resurrection of vanished civil-society traditions “that worked only as culturally embedded practices dependent on the traditions of aristocratic centuries.” Instead, Pappin demands conservatives ask themselves, “What can we do with the reins of power, that is, the state, to ensure the common good of our citizens?”

It remains to be seen whether anyone will take up Pappin’s call and, if they do, whether such a conservatism of the state would be effective or popular. But if Middle America’s condition really is as dire as people like Carney make it out to be, it’s hard to imagine that “go to church” will turn out to be a political winner. Carney ably describes the sort of malaise that led Republicans to flock to Trump, but if there’s one thing we learned from the 2016 election, it’s that desperate people want a leader who promises to try something different, however flawed his solutions might be.

God’s Bailout: A Review of Timothy P. Carney’s “Alienated America”
by Tyler Austin Harper

It is here that Alienated America is very insightful: Carney has a genuine knack for parsing the data, drawing out counterintuitive but rigorously defended observations, and resisting simple narratives about complex states of affairs. His central claim that the 2016 election was a referendum on whether the American dream is alive or dead is not novel, but it is both convincing and better supported than similar efforts. Additionally, although his defense of the salutary nature of cultural practices like religious observance, child-rearing, and marriage are unapologetically conservative in nature, his message remains comparatively broad in scope: unlike other conservative Catholic critics of Trump (most notably, Patrick Deneen), Carney predicates his argument on the form, rather than the content, of these practices. In the pages of Alienated America, you will find no diatribe on the superiority of heterosexual marriage or the Catholic faith — he notes repeatedly, for example, that observant Muslim Americans are among the groups most likely to report optimism about America and faith in the American dream, even after Donald Trump’s election and attempted Muslim ban. Rather, Carney’s message is practical and universalist in nature: people are better off among other people, when they have something, anything whatsoever, that they belong to and that unites them in a network of mutual responsibility.

It is this aspect of Carney’s argument that I find most appealing, and most useful for progressives like myself. Namely, the author eschews the common tendency — on the right and the left — to posit a linear relationship between wealth and well-being. More specifically, his work persuasively suggests that financial security and emotional security go hand in hand not because some kind of mechanical relationship exists between the two, but because, in contrast to the working class, the wealthy tend to have the resources to live in and contribute to places that provide opportunities for meaningful lives lived in common. As he succinctly puts it: “The erosion of community […] is unequally distributed, it is concentrated in the working class, and it is geographically discrete to the point that we can see it on a map.”

While those of us on the left are generally quick (and correct!) to highlight the importance of addressing widening income inequality and an increasingly precarious labor market, for example, it often seems that we are comparatively less likely to talk about questions of community, as though we assume that fixing the former will necessarily achieve the latter. Furthermore, when we do talk about community, we often use the term to refer to people who share common interests and experiences (for example, “communities of color”) but not necessarily geographical proximity or concrete spaces of interaction. If we are willing to take Carney’s assessment seriously, then, two questions seem obvious: What are the barriers to community, understood in the sense of mutual, meaningful networks of local support? And how might these barriers be removed?

Not surprisingly, it is here that Carney’s analysis breaks down, where his professed desire for strong communities is predictably thwarted by his inability to recognize unfettered capitalism, rather than government centralization and regulation, as the primary threat to the robust civic life he vaunts. Although Carney approvingly cites Orwell’s maxim “To see what is in front of one’s nose needs constant struggle,” he consistently fails to see that at the heart of every flyover town, closed plant, and shuttered church whose death he laments, there is a place where unregulated capital — not some big government boogeyman — has reared its ugly head.

Unlike his meticulously researched and tightly argued defense of the prosocial virtues of marriage and religious observance, for example, Carney’s tepid but persistent support of free-market capitalism and his assaults on liberal governance are fast and loose, often relying on anecdotal evidence, sparse data, and obscure cases of bureaucratic malfeasance to make his points. Oftentimes, his arguments are absurd — such as his claim that massive companies like Walmart, Amazon, or Starbucks crowd out small businesses because of too much, rather than too little, regulation. Other times, they’re comical — once in the 1980s, Mayor Bernie Sanders apparently professed not to believe in charities. This decades-old remark is spun by Carney into a sweeping indictment of the contemporary left’s widespread desire to have neighborly goodwill replaced by the Nanny State.

In fairness, Carney isn’t entirely oblivious to the problems caused by our neoliberal economic order — he frequently cites cases of Chinese manufacturing undermining manufacturing-centric US communities, for example. However, like many modern conservatives, he assuages his doubts by acknowledging that free-market capitalism has a few minor kinks, before swiftly pivoting to the supposedly graver dangers posed by governmental overreach, centralization, and regulation. As a direct consequence of this reaffirmation of the legitimacy of unfettered capital, Carney is thus forced to retreat into the untenable position that religion is the best and most readily available means to redress our present crisis of community. We can’t all be affluent, his argument goes, and thus we can’t all have access to the kind of secular communal life enjoyed by the wealthy. Yet, even the dirt poor can enjoy the social bonds provided by religious life.

To reiterate, I have no problem with Carney’s high estimation of organized religion. As with marriage, I know plenty of people for whom religion has been nightmarish, a source of trauma, insecurity, and even violence. I also know plenty of people, like Jim the bookish engineer, for whom religious affiliation has been a bulwark against the loneliness endemic to modern life. The problem is not religion itself, as one means among many for achieving the communal ties that foster well-being. The problem is Carney’s reliance on God to bail out capitalism. Unlike Robert Nisbet, the conservative sociologist whose classic work — The Quest for Community (1953) — he returns to frequently, Carney’s own work persistently downplays the connection between social alienation and the flow of unregulated capital that is the principal engine of that same alienation.

Although he signals kinship with an earlier tradition of postwar conservatives who were also preoccupied with the question of community — people like Nisbet, Russell Kirk, and Peter Viereck, who highlighted the corrosive and antisocial effects of the cult of free enterprise — Carney cannot ultimately bring himself to shed the laissez-faire, libertarian economics that dominate the Republican Party. The result is a book that puts its finger on the right problem, but whose author is too besotted by economic fatalism to imagine a variety of contentment that would be otherwise than religious, positioning secular forms of community as the unique provenance of the elite. While Carney’s insistence that we must reintegrate the classes, combating the geographical isolation of wealth and its resources, is laudable, his calls to privatize the safety net are as predictable as they are puerile.

Rather than buy into a zero-sum game that forces a choice between government as a tentacular monster and government as a minimalist “reinsurance” program (“a safety net for safety nets,” to use Carney’s term), is it not possible to imagine a government that supports community institutions by — and hear me out on this — actually funding and defending them? If you want a thriving book club scene, for example, why not fix the public schools? Try pumping money into education and paying teachers a salary that will make such work a feasible option for the best and the brightest. After all, lifelong learners, the kind who read for pleasure, do not grow on trees. Likewise, if you want heightened church attendance, mightn’t an increased minimum wage — allowing prospective attendees to forsake that second job, spending Sundays in the pews rather than driving for Uber — be a good start? If college graduates are far more likely to build robust communities, as Carney repeatedly claims, shouldn’t we work toward making a college education more affordable for the alienated, working poor whose cause he champions? These are the kind of questions that Carney dismisses out of hand as “centralizing” and “utopian,” preferring instead his own brand of theocratic utopianism in which a minimalist state would be kept afloat by little platoons of the charitable religious.

“What would Mister Rogers do?”

“He had faith in us, and even if his faith turns out to have been misplaced, even if we have abandoned him, he somehow endures, standing between us and our electrified antipathies and recriminations like the Tank Man of Tiananmen Square in a red sweater.”
~Tom Junod, My Friend Mister Rogers

A Beautiful Day in the Neighborhood is an inspiring and, in the end, challenging portrayal of Fred Rogers, AKA ‘Mister Rogers’. It took some suspension of disbelief, though. Tom Hanks does as good a job as is possible, but no one can replace the real thing. Mr. Rogers was distinctive in appearance and behavior. The production team could have used expensive CGI to make Hanks look more like the real man, but that was not necessary. It wasn’t a face that made the children’s tv show host so well respected and widely influential. A few minutes in, I was able to forget I was watching an actor playing a role and became immersed in the personality and moral character cast upon the screen of imagination, the movie presented as if a new episode of Mister Rogers’ Neighborhood had just been released.

The way the movie was done was highly effective. It was based on an Esquire article, Can You Say … Hero? by Tom Junod. It was jarring at first in taking a roundabout approach, but it might have been the only way to go about it for the intended purpose. Fred Rogers appears to have been a person who was genuinely and fully focused on other people, not on himself. So a biopic that captures his essence requires demonstrating this concern for others, which makes him a secondary character in the very movie that is supposedly about him. We explore his world by experiencing the profound impact he had on specific people, in this case not only Junod but also his family, while there are other scenes showing the personable moments of Mr. Rogers meeting with children. The story arc is about Junod’s change of heart, whereas Mr. Rogers remains who he was from the start.

This leaves Mr. Rogers himself as an unknown to viewers not already familiar with the biographical details. We are shown little about his personal life and nothing about his past, but the narrow focus helps to get at something essential. We were already given a good documentary about him last year. This movie serves a different purpose. It offers a window to peer through, to see how he related and what it meant for those who experienced it. Part of the hidden background was his Christianity, as he was an ordained Presbyterian minister. Yet even as Christianity inspired him, he never put his faith out in the public view. As Jesus taught to pray in secret, Fred Rogers took it one step further by keeping his faith almost entirely hidden. He didn’t want to force his beliefs onto others. The purpose of religion is not dogma or outward forms. If religion matters at all, it’s about how it transforms people. That is what Mr. Rogers, as a man and a media personality, was all about.

Some people don’t understand this and so don’t grasp what made him so special. Armond White at National Review wrote that, “Heller and screenwriters Micah Fitzerman-Blue and Noah Harpster don’t show enough faith in Rogers’ remedies—and not enough interest in their religious origins. In short, the movie seems wary of faith (it briefly mentions that Rogers was an ordained minister) and settles for secular sentimentality to account for his sensibility and behavior. This not only weakens the film, but it also hobbles Hanks’s characterization” (Christian Faith Is the Missing Ingredient in A Beautiful Day in the Neighborhood). That misses the entire message being conveyed, not only the message of the movie but, more importantly, the message of Mr. Rogers himself. As Greg Forster subtly puts it, “that is of course the whole goddamned point here” (Pass the Popcorn: Anything Mentionable Is Manageable).

To have put Mr. Rogers’ Christianity front and center would be to do what Mr. Rogers himself intentionally avoided. He met people where they were at, rather than trying to force or coerce others into his belief system, not that he would have thought of his moral concern as a belief system. He was not an evangelical missionary seeking to preach and proselytize, much less attempting to save the lost souls of heathenish children or make Christian America great again. In his way of being present to others, he was being more Christ-like than most Christians, as Jesus never went around trying to convert people. Jesus wasn’t a ‘good Christian’ and, by being vulnerable in his humanity, neither was Fred Rogers. Rather, his sole purpose was just to be kind to others. Religion, in its highest form, is about how one relates to others and to the world. Thomas Paine voiced his own radical faith with the words, “The World is my country, all mankind are my brethren, and to do good is my religion.” I suspect Mr. Rogers would have agreed. It really is that simple or it should be.

That childlike directness of his message, the simplicity of being fully present and relating well, that was the magical quality of the show, Mister Rogers’ Neighborhood. I didn’t appreciate it when I was a kid. It was a fixture of my childhood, a show I watched and that was all. But looking back on it, I can sense what made it unique. Like the man himself, the show was extremely simple, one might call it basic, demonstrated by the same ragged puppets he used his entire career. This was no fancy Jim Henson muppet production. What made it real and compelling to a child was what the people involved put into it, not only Fred Rogers but so many others who were dedicated to the show. Along with the simplicity, there was a heartfelt sincerity to it all. The scenes with the puppets, Daniel Striped Tiger most of all, were often more emotionally raw and real than what is typically done by professional actors in Hollywood movies.

That is what stands out about Tom Hanks’ performance in bringing this to life. He is one of the few actors who could come close to pulling it off and even his attempt was imperfect. But I have to give Hanks credit for getting the essence right. The emotional truth came through. Sincerity is no small thing, in this age of superficiality and cynicism. To call it a breath of fresh air is a criminal understatement. Mr. Rogers was entirely committed to being human and acknowledging the humanity in others. That is such a rare thing. I’m not sure how many people understood that about him, what exactly made him so fascinating to children and what created a cult-like following among the generations who grew up watching his show. As a character says about the drug D in A Scanner Darkly, “You’re either on it or you’ve never tried it.”

Some people claim that “sincerity is bullshit” (Harry Frankfurt), a sentiment I understand in feeling jaded about the world. But I must admit that Fred Rogers’ sincerity most definitely and deeply resonates for me, based on my own experience in the New Thought worldview I was raised in, a touchy-feely form of Christianity where emotional authenticity trumps outward form, basically Protestantism pushed to its most extreme endpoint. Seeing the emotional rawness in Mr. Rogers’ life, although coming from a different religious background than my own, reminded me of the sincerity that I’ve struggled with in myself. I’ve always been an overly sincere person and often overly serious, that is how I think of myself… but can anyone really ever be too sincere? The message of Mr. Rogers is that we all once were emotionally honest when children and only later forgot this birthright. It remains in all of us and that core of our humanity is what he sought to touch upon, and indeed many people responded to this and felt genuinely touched. The many testimonies of ordinary people to Mr. Rogers’ legacy are inspiring.

This worldview of authenticity was made clear in one particular scene in the movie. “Vogel says he believes his dining companion likes “people like me … broken people.” Rogers is having none of it. “I don’t think you are broken,” Rogers begins, speaking slowly and deliberately. “I know you are a man of conviction, a person who knows the difference between what is wrong and what is right. Try to remember that your relationship with your father also helped to shape those parts. He helped you become what you are”” (Cathleen Falsani, Meditating On Love and Connection with Mr. Rogers and C.S. Lewis). That dialogue was not pulled from real life, according to Tom Junod in his latest piece My Friend Mister Rogers, but even Junod found himself emotionally moved when watching the scene. The point is that what mattered to Fred Rogers was conviction and he lived his life through his own conviction, maybe a moral obligation even. The man was exacting in his discipline and extremely intentional in everything he did, maybe even obsessive-compulsive, as seen in how he maintained his weight at exactly 143 lbs throughout his adult life and in how he kept FBI-style files on all of his friends and correspondents. He had so little interest in himself that even his wife of 50 years knew little about his personal experience and memories that he rarely talked about. His entire life, his entire being apparently was focused laser-like on other people.

He was not a normal human. How does someone become like that? One gets the sense that Mr. Rogers in the flesh would have, with humility, downplayed such an inquiry. He let on that he too was merely human, that he worried and struggled like anyone else. The point, as he saw it, was that he was not a saint or a hero. He was just a man who felt deeply and passionately moved to take action. But where did that powerful current of empathy and compassion come from? He probably would have given all credit to God, as his soft-spoken and often unspoken faith appears to have been unwavering. Like the Blues Brothers, he was a man on a mission from God. He was not lacking in earnestness. And for those of us not so fully earnest, it can seem incomprehensible that such a mortal human could exist: “He was a genius,” Junod wrote, “he had superpowers; he might as well have been a friendly alien, thrown upon the rocks of our planet to help us find our way to the impossible possibility that we are loved” (My Friend Mister Rogers). Yet for all the ways it would be easy to idolize him or dismiss him, he continues to speak to the child in all of us. Maybe ‘Mister Rogers’ was not a mystery, but instead maybe we are making it too complicated. We need to step back and, as he so often advised, remember what it was like to be a child.

Fred Rogers was a simple man who spoke simply and that is what made him so radically challenging. “Indeed, what makes measuring Fred’s legacy so difficult is that Fred’s legacy is so clear.” Junod goes on to say, “It isn’t that he is revered but not followed, so much as he is revered because he is not followed—because remembering him as a nice man is easier than thinking of him as a demanding one. He spoke most clearly through his example, but our culture consoles itself with the simple fact that he once existed. There is no use asking further questions of him, only of ourselves. We know what Mister Rogers would do, but even now we don’t know what to do with the lessons of Mister Rogers.” He might as well have been talking about Jesus Christ, the divine made flesh. But if there was spiritual truth in Fred Rogers, he taught that it was a spiritual truth in all of us, that we are children of God. Rather than what would Mister Rogers do, what will we do in remembering him?

Enchantment of Capitalist Religion

We Have Never Been Disenchanted
by Eugene McCarraher, excerpt

“The world does not need to be re-enchanted, because it was never disenchanted in the first place.  Attending primarily to the history of the United States, I hope to demonstrate that capitalism has been, as Benjamin perceived, a religion of modernity, one that addresses the same hopes and anxieties formerly entrusted to traditional religion.  But this does not mean only that capitalism has been and continues to be “beguiling” or “fetishized,” and that rigorous analysis will expose the phantoms as the projections they really are.  These enchantments draw their power, not simply from our capacity for delusion, but from our deepest and truest desires — desires that are consonant and tragically out of touch with the dearest freshness of the universe.  The world can never be disenchanted, not because our emotional or political or cultural needs compel us to find enchantments — though they do — but because the world itself, as Hopkins realized, is charged with the grandeur of God…

“However significant theology is for this book, I have relied on a sizable body of historical literature on the symbolic universe of capitalism.  Much of this work suggests that capitalist cultural authority cannot be fully understood without regard to the psychic, moral, and spiritual longings inscribed in the imagery of business culture.”

Has Capitalism Become Our Religion?

“As a Christian, I reject the two assumptions found in conventional economics: scarcity (to the contrary, God has created a world of abundance) and rational, self-seeking, utility-maximizing humanism (a competitive conception of human nature that I believe traduces our creation in the image and likeness of God). I think that one of the most important intellectual missions of our time is the construction of an economics with very different assumptions about the nature of humanity and the world.”

To Be Fat And Have Bread

The obsession with body fat is an interesting story. It didn’t begin a few generations ago but goes back centuries. But maybe that shouldn’t be surprising.

The obsession began in the colonial era, when the diet was transformed by the imperial trade in foreign foods. I might note that this included previously rare or never before seen varieties of fattening carbohydrates: sugar, potatoes, corn, rice, etc. The old feudal system was ending and entirely different forms of food production and diets were developing, especially for the then landless peasants. Hunting, gathering, and grazing for the commoners definitely would have been on the decline for a while at that point, as the last of the commons had been privatized. The loss of access to wild game would take longer in the colonies, but eventually it happened everywhere.

The last stage of that shift overlapped with the beginnings of industrialization and agricultural improvements. In the 19th century, wheat surpluses grew and hence costs and prices fell. Agricultural output boomed even as fewer people were employed in farming. There was also a sudden obsession with gender roles and social roles in general, such as the post-revolutionary expectation that the mother would make citizens out of her children. Bread-making, a once uncommon activity for Americans, became increasingly important to the normative identity of family life and the symbolic maintenance of the social order.

Regular consumption of wheat bread was once limited to the wealthy and that is how refined bread gained its moral association with the refined class. Only the wealthy could afford wheat prior to the 19th century; the poor were forced to rely upon cheaper grains and grain substitutes, at a time when bread was regularly adulterated with bark, sawdust, chalk, etc. Poverty breads, in the previous centuries, often were made with no grain at all.* For wheat, and especially heavily refined white bread, to become available to all walks of life meant an upsurge of the civilizing process. The obsession with middle class life took hold and so cookbooks were produced in large numbers.

In a growing reactionary impulse, there was a nostalgic tendency toward invented traditions. Bread took on new meanings that then were projected onto the past. It wasn’t acknowledged how radical was the industrial agriculture and industrial milling that made all of this possible. And the disconnection is demonstrated by the simultaneous promotion of the grain production of this industrial age and the complaint about how industrialized life was destroying all that was good. Bread, as a symbol, transcended these mere details.

With the aristocracy having been challenged during the Revolutionary Era, the refinement of the refined class that once was admired became suspect. The ideology of whole foods began to emerge and had some strong proponents. But by the end of the 1800s, the ideal of refinement gained prominence again and prepared the way for the following century of ever greater industrialization of processed foods. Refinement represented progress. Only after more extensive refinement led to mass malnourishment, near the end of that century and heading into the next, did whole foods once again capture the public imagination.

Then we enter the true era of fat obsession, fat blaming, and dieting, endless dieting. Eat your whole grains, get your fiber, make sure you get enough servings of fruits and veggies, and don’t forget to exercise. Calories in, calories out. Count your calories, count your carbs, count your steps. Count every last one of them. Still, the basic sides of the debate remain the same: fewer carbohydrates vs less meat, whole foods vs refined foods, barbaric lifestyle vs civilizing process, individual moral failure vs societal changes, etc. One theme that runs through dietary advice from the ancient world to the present is that there is a close link between physical health, mental health, and moral health — the latter erupting as moral panic and moral hygiene. But what stands out about the modern era, beginning in the 1600s, is the observation that psychological problems were mostly seen among the well-to-do.

This was often blamed on luxury and sometimes on meat (a complaint often about animals raised unnaturally in confinement and probably fed grain, the early equivalent of concerns about factory farming; but also a complaint about the introduction of foreign spices and the use of fancy sauces to make meat more appetizing). Still, an awareness was beginning to form that a high-carb diet might be playing a role, as it was often noted that the morbidly obese ate lots of pastries, fruit pies, and such. The poor didn’t have much access to wheat and sugar before the 1800s, but the wealthy had plenty of such foods centuries earlier. Meat consumption didn’t change much during that era of colonial trade. What changed the most was the availability of starchy and sugary foods, and the wealthy consumed them in great quantities. Meat had always been a desirable food, going back to earliest hominid evolution. Modern agriculture and global trade, however, entirely transformed the human diet with the introduction of massive amounts of carbohydrates.

It’s strange that right from the beginning of the modern era there were those pushing for a vegetarian diet; not many, but their voices were being heard for the first time. Or maybe it wasn’t so strange. Prior to the modern era, a vegetarian diet so far north in Europe would have been impossible. It was only the elite promoting vegetarianism, as only they could afford a vegetarian diet year round, buying expensive plant-based foods that were often shipped in from far away. During the Middle Ages and earlier, vegetarianism for the most part was not an option for anyone, since the food items required for such a diet simply weren’t available enough to sustain life, certainly not in places like England or Germany.

There is another side to this that brings us back to the obsession with fat. It was only with the gradual increase of grain production that cattle could be fed grain, not only as additional feed in the winter but year round. This is also what allowed the possibility of confining animals, rather than grazing them on fields. Grain surpluses weren’t consistent until the 19th century, but even before that grain production had been increasing, with slow improvements in agriculture over the centuries. The rich could afford meat from grain-fed animals much earlier than the rest of the population, and it was highly sought after. That is because such meat is extremely fatty, creating those beautiful marbled steaks, pork chops, etc (such fattiness, by the way, is a sign of metabolic syndrome in both animals and humans). Fat couldn’t have been a focus of debate before grain-fattened animals became common.

So, there is a reason that both wheat bread and fatty meat gained immense symbolic potency at the same time. Similarly, it was during this same era that vegetables became more common, and gardens likewise became symbols of wealth, abundance, and the good life. Only the rich could afford to maintain large gardens because of the difficulty involved and the immense, time-consuming work required (see The Jane Austen Diet by Bryan Kozlowski**; on the American diet before the 20th century, see also The Big Fat Surprise by Nina Teicholz, which I quote in Malnourished Americans). Gardens represented the changed diet of modern civilization. They were either indicators of progress or of decline, depending on one’s perspective. Prior to modernity, the diet had consisted to a much greater degree of foods that were gathered, hunted, trapped, and fished.

The shift from one source of food to another changed the diet and so changed the debate about diet. There were suddenly more food options to argue about. Diet as a concept was being more fully formulated. Rather than being something inherited according to the traditional constraints of local food systems and food customs, a diet could be picked from a variety of possibilities, assuming one had the wealth. Even to this day, the obsession with dieting carries a taint of class privilege. It is, as they say, a first world problem. But what is fascinating is how this way of thinking took hold in the 1600s and 1700s. There was a modern revolution in dietary thought in the generations before the modern political revolutions. The old order was falling apart and sometimes actively being dismantled. This created much anxiety and forced the individual into a state of uncertainty. Old wisdom no longer could be relied upon.

* * *

*Rather than bread, the food most associated with the laboring class was fish, a food the wealthy avoided. Think about how lobster and clams used to be poverty foods. In the Galenic theory of humoral physiology, fish is considered cold and wet, hard to digest and weakening. This same humoral category of food also included fruits and vegetables. This might be why, even to this day, many vegetarians and vegans will make an exception for fish, seeing it as different from ‘meat’. This is an old ideological bias, because ‘meat’ was believed to have the complete opposite effect of being hot and dry, easy to digest and invigorating. This is the reason why meat but not fish was often banned during religious fasts and festivals.

As an interesting side note, the supposed cooling effect of fish was a reason for not eating it during the cold times of the year. Fish is one of the highest sources of vitamin A. Another source is the precursor beta-carotene found in vegetables. That these two types of food are considered of the same variety according to Galenic thought is interesting. Cold weather is one of the factors that can disrupt the body’s ability to convert beta-carotene into usable vitamin A. The idea of humors mixes this up slightly, but it maybe points to an intuition that something important was going on. Eating more meat, rather than vegetables, in winter is a wise practice in a traditional society that can’t supplement such nutrients. Vitamin A is key for maintaining a strong immune system and handling stress (True Vitamin A For Health And Happiness).

By the way, it was during the 18th century that a discussion finally arose about vegetarianism, as discussed in the Albala text below. The question was whether life and health could be sustained on vegetables alone. Then again, those involved were probably still being influenced by Galenic thought. By vegetarianism, they likely meant a more general plant-based diet that excluded ‘meat’ but not necessarily fish. The context of the debate was the religious abstinence of Lent, during which fish was allowed. So, maybe the fundamental argument was more about the possibility of long-term survival solely on moist, cooling foods. Whatever the exact point of contention, it was the first time in the modern Western world that a plant-based diet (be it vegan, vegetarian, or pescetarian-style Mediterranean diet) was considered seriously.

These ideas have been inherited by us, even though the philosophical justifications no longer make sense to us. This is seen in the debate that continues over red meat in particular and meat in general, specifically in terms of the originally Galenic assertion of its heat and dryness building up the ‘blood’ (High vs Low Protein). It’s funny that dietary debates remain obsessed with red meat (along with the related issue of cows and their farts), even though actual consumption of red meat has declined over the past century. As with bread, the symbolic value of red meat has maybe even gained greater importance. Similarly, as I mentioned above, the categorization of fish remains hazy. I know a vegan who doesn’t eat ‘meat’ but does eat fish. When I noted how odd that was, a vegetarian I was talking to thought it made perfect sense. This is Galenic thought without the Galenic theory that at least made it a rational position; the ideological bias remains, even though those adhering to it can’t explain why they hold it. It amuses me.

Ideologies are powerful systems. They are mind viruses that can survive and mutate across centuries and sometimes millennia. Most of the time, their origins are lost to history. But sometimes we are able to trace them and it makes for strange material to study.

See: “Fish in Renaissance Dietary Theory” by Ken Albala, in Fish: Food from the Waters, ed. by Harlan Walker; and Food and Faith in Christian Culture, ed. by Ken Albala and Trudy Eden. Also, read the texts below, such as the discussion of vegetarianism.

* * *

(Both texts below are from collections that are freely available on Google Books and possibly elsewhere.)

The Fat of the Land: Proceedings of the Oxford Symposium on Food and Cooking 2002
ed. by Harlan Walker
“The Apparition of Fat in Western Nutritional Theory”
by Ken Albala

Naturally dietary systems of the past had different goals in mind when framing their recommendations. They had different conceptions of the good, and at some point in history that came to include not being fat. Body size then became an official concern for dietary writers. Whether the original impetus for this change was a matter of fashion, spirituality or has its roots in a different approach to science is impossible to say with any degree of precision. But this paper will argue that nutritional science itself as reformulated in the 17th century was largely to blame for the introduction of fat into the discourse about how health should be defined. […] Obesity is a pathological state according to modern nutritional science. But it was not always so.

When and why fat became a medical issue has been a topic of concern among contemporary scholars. Some studies, such as Peter N. Stearns’ Fat History: Bodies and Beauty in the Modern West, place the origin of our modern obsession in the late 19th century when the rise of nutritional science and health movements led by figures like John Harvey Kellogg, hand in hand with modern advertising and Gibson Girls, swept away the Victorian preference for fulsome figures. As a form of social protest, those who could afford to, much as in the 60s, idealized the slim androgynous figure we associate with flappers. Others push the origin further back into the early 19th century, in the age of Muscular Christianity and Sylvester Graham. But clearly the obsession is earlier than this. In the 18th century the 448 pound physician George Cheyne and his miracle dieting had people flocking to try out the latest ‘cures.’ It was at the same time that dissertations on the topic of obesity became popular, and clearly the medical profession had classified this as a treatable condition. And readers had already been trained to monitor and police their own bodies for signs of impending corpulence. The roots of this fear and guilt must lie somewhere in the previous century as nutritional science was still groping its way through a myriad of chemical and mechanical theories attempting to quantify health and nutrition with empirical research.

The 17th century is also the ideal place to look if only because the earlier system of humoral physiology is almost totally devoid of a concept of fat as a sickness. […]

For all authors in the Galenic tradition it appears that fat was seen as a natural consequence of a complexion tending to the cold and moist, something which could be corrected, but not considered an illness that demanded serious attention. And socially there does not seem to have been any specific stigma attached to fat if Rubens’ taste in flesh is any measure.

The issue of fat really only emerges among authors who have abandoned, in part or totally, the system of humoral physiology. This seems to have something to do with both the new attempts to quantify nutrition, first and most famously by Santorio Santorio9 and also among those who began to see digestion and nutrition as chemical reactions which when gone awry cast fatty deposits throughout the body. It was only then that fat came to be considered a kind of sickness to be treated with therapy.10

The earliest indications that fat was beginning to be seen as a medical problem are found in the work of the first dietary writer who systematically weighed himself. Although Santorio does not seem to have been anxious about being overweight himself, he did consistently define health as the maintenance of body weight. Expanding on the rather vague concept of insensible perspiration used by Galenic authors, Santorio sought to precisely measure the amount of food he consumed each day compared to the amount excreted in ‘sensible’ evacuations. […] Still, fat was not a matter of eating too much. ‘He who eats more than he can digest, is nourished less than he ought to be, and [becomes] consequently emaciated.’12 More importantly, fat was a sign of a system in disarray. […]

Food was not in fact the only factor Santorio or his followers took into account though. As before, the amount of exercise one gets, baths, air quality, even emotions could alter the metabolic rate. But now, the effect of all these could be precisely calculated. […]

At the same time that these mechanistic conceptions of nutrition became mainstream, a chemical understanding of how food is broken down by means of acids and alkalis also came to be accepted by the medical profession. These ideas ultimately harked back to Paracelsus writing in the 16th century but were elaborated upon by 17th century writers […] It is clear that by the early 18th century fat could be seen as a physiological defect that could be corrected by heating the body to facilitate digestive fermentation and the passage of insensible perspiration. […] Although the theories themselves are obviously nothing like our own, we are much closer to the idea of fat as a medical condition. […]

Where Cheyne departs from conventional medical opinion is in his recommendation of a cooked vegetable diet to counter the effects of a disordered system, which he admits is rooted in his own ‘experience and observation on my own crazy carcase and the infirmities of others I have treated’ rather than on any theoretical foundation.

The controversy over whether vegetables could be considered a proper diet, not only for the sick or overgrown but for healthy individuals, was of great concern in the 18th century. Nicholas Andry in his Traité des alimens de caresme offered an extended diatribe against the very notion that vegetables could sustain life, a question of particular importance in Catholic France where Lenten restrictions were still in force, at least officially. […] According to current medical theory, vegetables could not be suitable for weight loss, despite the successful results of the empirics. […]

It is clear that authors had a number of potentially conflicting theoretical models to draw from and both mechanical and chemical explanations could be used to explain why fat accumulates in the body. Yet with entirely different conceptual tools, these authors arrived at dietary goals surprisingly like our own, and equally as contentious. The ultimate goals now became avoiding disease and fat, and living a long life. While it would be difficult to prove that these dietary authors had any major impact beyond the wealthy elites and professionals who read their works, it is clear that a concern over fat was firmly in place by the mid 18th century, and appears to have its roots in a new conception of physiology which not only paid close attention to body weight as an index of health, but increasingly saw fat as a medical condition.

Food and Morality: Proceedings of the Oxford Symposium on Food and Cookery 2007
ed. by Susan R. Friedland
“Moral Fiber: Bread in Nineteenth-Century America”
by Mark McWilliams

From Sarah Josepha Hale, who claimed, ‘the more perfect the bread, the more perfect the lady’ to Sylvester Graham, who insisted, ‘the wife, the mother only’ has the ‘moral sensibility’ required to bake good bread for her family, bread often became a gendered moral marker in nineteenth-century American culture.1 Of course, what Hale and Graham considered ‘good’ bread differed dramatically, and exactly what constituted ‘good’ bread was much contested. Amidst technological change that made white flour more widely available and home cooking more predictable, bread, described in increasingly explicit moral terms, became the leading symbol of a housewife’s care for her family.

Americans were hardly the first to ascribe moral meaning to their daily bread. As Bernard Dupaigne writes, ‘since time immemorial [bread] has attended the great events of various human communities: monsoon or grape harvest bread, the blessed bread of Catholics or the unleavened bread of Passover, or the fasting-break bread of Ramadan. There is no bread that does not, somewhere in the world, celebrate an agricultural or religious holiday, enrich a family event, or commemorate the dead.’2 With such varied symbolic resonance, bread seems easily filled with new meanings.

In America (as later in France),3 bread became a revolutionary symbol. To the early English colonists’ dismay, European wheat did not adapt well to the North American climate; the shift to corn as the primary grain was perhaps the most important dietary adaptation made by the colonists. Wheat remained too expensive for common consumption well into the nineteenth century. […]

By the end of the Revolution, then, bread was already charged with moral meaning in the young United States. In the nineteenth century, this meaning shifted in response to agricultural improvements that made wheat more widely available, technological change that made bread easier to make consistently, and, perhaps most important, social change that made good bread the primary symbol of a housewife’s care for her family. In effect, bread suffered a kind of identity crisis that paralleled the national identity crisis of Jacksonian America. As Americans thought seriously about who they were in this new nation, about how they should act and even how they should eat, bread’s symbolic meaning – and bread itself – changed.

American agricultural production exploded, although the proportion of the population working on farms declined. James Trager notes that even before the McCormick reaper first sold in large numbers as farmers struggled to replace workers leaving for the 1849 Gold Rush, the average time required to produce a bushel of wheat declined 22 per cent from 1831 to 1840.7 Dramatic improvements in efficiency led to larger yields; for example, wheat production more than doubled between 1840 and 1860. Such increases in wheat production, combined with better milling procedures, made white flour finally available in quantities sufficient for white bread to become more than a luxury good.8

Even as wheat became easier to find for many Americans, bread remained notoriously difficult to make, or at least to make well. Lydia Maria Child, a baker’s daughter who became one of America’s leading writers, emphasizes what must have been the intensely frustrating difficulty of learning to cook in the era before predictable heat sources, standardized measurements, and consistent ingredients.9 […]

Unlike Hale, who implies that learning to bake better can be a kind of self improvement, this passage works more as dire warning to those not yet making the proper daily bread. Though bread becomes the main distinction between the civilized and the savage, Beecher turns quickly, and reassuringly, to the science of her day: ‘By lightness is meant simply that in order to facilitate digestion the particles are to be separated from each other by little holes or air-cells; and all the different methods of making light bread are neither more nor less than the formation of bread with these air cells’ (170). She then carefully describes how to produce the desired lightness in bread, instructions which must have been welcome to the young housewife now fully convinced of her bread’s moral importance.

The path for Beecher, Hale, and others had been prepared by Sylvester Graham, although he is little mentioned in their work.14 In his campaign to improve bread, Graham’s rhetoric ‘romanticized the life of the traditional household’ in ways that ‘unknowingly helped prepare women to find a new role as guardians of domestic virtue,’ as Stephen Nissenbaum notes.15 Bread was only one aspect of Graham’s program to educate Americans on what he called ‘the Science of Human Life.’ Believing on the one hand, unlike many at the time, that overstimulation caused debility and, on the other, that industrialization and commercialization were debasing modern life, Graham proposed a lifestyle based around strict controls on diet and sexuality.16 While Graham promoted a range of activities from vegetarianism to temperance, his emphasis on good bread was most influential. […]

And yet modern conditions make such bread difficult to produce. Each stage of the process is corrupted, according to Graham. Rather than grow wheat in ‘a pure virgin soil’ required for the best grain, farmers employ fields ‘exhausted by tillage, and debauched by the means which man uses to enrich and stimulate it.’ As Nissenbaum notes, the ‘conscious sexual connotations’ of Graham’s language here is typical of his larger system, but the language also begins to point to the moral dimensions of good bread (6).

Similarly loaded language marks Graham’s condemnation of bakery bread. Graham echoed the common complaints about adulteration by commercial bakers. But he added a unique twist: even the best bakery bread was doubly flawed. The flour itself was inferior because it was over-processed, according to Graham: the ‘superfine flour’ required for white bread ‘is always far less wholesome, in any and every situation of life, than that which is made of wheaten meal which contains all the natural properties of the grain.’ […]

As Nissenbaum argues, pointing to this passage, Graham’s claims invoke ‘the vision of a domestic idyll, of a mother nursing her family with bread and affection’ (8). Such a vision clearly anticipates the emphasis on cookery as a measure of a woman’s social worth in the domestic rhetoric that came so to characterize the mid-nineteenth century.

Such language increasingly linking cookery with morality emphasized the virtue not of the food itself but rather of the cooks preparing it. This linkage reached readers not only through the explosion of cookbooks and domestic manuals but also through the growing numbers of sentimental novels. Indeed, this linkage provided a tremendously useful trope for authors seeking a shorthand to define their fictional characters. And that trope, in turn, helped expand the popularity of interpreting cookery in moral terms. […]

After the Civil War, domestic rhetoric evolved away from its roots in the wholesome foods of the nation’s past toward the ever-more refined cuisine of the Gilded Age. Graham’s refusal to evolve in this direction – his system was based entirely in a nostalgic struggle against modernity, against refinement – may well be a large part of why his work was quickly left behind even by those for whom it had paved the way.

* * *

Here is another text I came across. It’s not free, but it seems like a good survey worth buying.

“A Bitch For God”

“I challenge the idea that the people who got us in this ditch are the only ones who can get us out of it.”

Marianne Williamson has called herself “a bitch for God.” As a presidential candidate, she is getting plenty of attention right now. She is well known among a certain crowd, as she has written numerous books that sold widely, including best-sellers such as Healing the Soul of America, which topped The New York Times nonfiction list for 39 weeks: “Seven reached the New York Times best-seller list, and four hit No. 1” (Cameron Joseph, Marianne Williamson Knows You Think She’s a Joke. But Her Campaign Isn’t.). I’ve known about her since the 1990s, during my young adulthood. But for most Americans, she hasn’t been a household name. Yet many people are more familiar with her words, such as a quote often misattributed to Nelson Mandela: “Our deepest fear is not that we are inadequate. Our deepest fear is that we are powerful beyond measure.”

Besides being on Oprah’s show in the past, she is well connected and, for those who know her, strongly supported. She has inspired many people, from famous stars to ordinary Americans, including in politics: “She has some surprising adherents in the Granite State, including former Rep. Paul Hodes who served as a co-chairman on President Obama’s 2008 campaign and is still a power broker in the state. He’s been a Williamson fan since her heyday in the ’90s — her quote “Who are we to stay small?” inspired him to run for Congress a decade-plus ago and hangs in his home to this day,” as reported by Cameron Joseph. That has been her career, inspiring people, and she has a talent for it. It is the kind of mixing of religion, politics, and progressive vision we haven’t seen in a while, maybe not since Martin Luther King Jr.

I must admit it feels validating to hear her in the mainstream media, particularly in the early Democratic debates. She comes out of the same background as I do, something I explained in another post (Heretic For President!). She is part of a heretical tradition of thought that goes back to the earliest Christians. Today, we think of it as “New Age” or what in the liberal wing of Christianity is called New Thought. Basically, she believes God is Love — no ifs, ands, or buts. It’s the radical message of Jesus himself, too often diluted or rationalized away and yet still carrying a powerful punch when released from centuries of stale dogma.

Williamson was the minister of the second largest Unity church in the country, the denomination of Christianity I was raised in. She still does guest speaking at that church and other churches. Her primary career has been as a Christian minister, but the mainstream, both left and right, caricatures her as a New Ager, spiritual guru, or whatever; although I’ll give Slate some credit for sort of complimenting her, if backhandedly (Shannon Palus, The Bizarre Charm of Marianne Williamson). “I do not understand why everyone is so dismissive of her,” said Marshall Kirkpatrick. “Are we really so out of touch with emotions, spirituality, etc that she seems insane?” If the corporate media were to be fair, they’d have to admit she is a Christian minister who comes out of the American Evangelical tradition (Unity Church) and who upholds a theology that has its roots in the earliest Christianity by way of Valentinianism (A Course In Miracles). That is maybe too much historical knowledge for a society that suffers from permanent historical amnesia. She may be a heretic, but she is a heretic with credentials. I’ll call it the return of the repressed. It’s amusing.

Despite it all, Unity is slowly creeping into the mainstream. This has been going on for a long time. I remember visiting non-Unity churches in decades past and sometimes coming across the Unity publication The Daily Word even in mainstream churches. So, many people were reading New Thought theology without knowing it. More recently, the Unity Church showed up in a major subplot of the TV show The Path (Meyerism and Unity Church). Then there is the story of Carlton Pearson, as told in a segment on This American Life and in the Netflix movie Come Sunday. He attended Oral Roberts University and was mentored by Oral Roberts himself. As a popular fourth-generation Pentecostal preacher, he came to a point of crisis in his faith. He no longer could believe God was a horrific and monstrous demiurge threatening people with eternal damnation. After much inner struggle, he converted to the view that there is no hell, was officially condemned as a heretic, lost his congregation, and then found his faith again in New Thought theology. He has since become the senior minister of a New Thought church and an affiliate minister of a Unity church. His story has inspired many.

Now here we are. We have a Unity minister as a presidential candidate. To me, it is mind-blowing. Unity Church powerfully shaped who I am. I can’t shake the blinding idealism of New Thought theology, in the way an ex-Catholic never quite gets over original sin or an ex-Baptist never loses that sense of fire-and-brimstone breathing down their neck. It is hard to explain being raised in that kind of light-and-love sincerity. I remember going to what was the Unity equivalent of a Bible camp, called Youth of Unity. I had never experienced so much positivity and goodwill in my life. Then I returned to the ‘normal’ life of high school and it shook me to the core. As wonderful as Unity was, it wasn’t the way life operated, or so I was told. I was supposed to get real and accept the world the way it was. Like most others growing up in this society, cynicism fell upon me like a sledgehammer.

But Marianne Williamson embodies and exemplifies another way of being. She suggests there is another way, and she walks her talk. She doesn’t care who attacks her. She won’t attack back. Instead, when she feels she is wrong, she admits it and apologizes. Holy fuck! Someone aspiring to be president who isn’t afraid to apologize! Trump came to power on the arrogant, egomaniacal, and psychopathic claim that morality, compassion, and common human decency no longer matter. Williamson believes down to her soul that they do matter. How we act determines the kind of country we live in. And she is driven to make the world a better place or go down trying. When arguing her position, she doesn’t fall back on talking points. In response to a question about her strategy, she used air quotes as she spoke of her “strategy” — she said that her only strategy was to speak the truth she knows and to continue campaigning as long as people supported her vision of America (Marianne Williamson says she supports mandatory vaccines – but ‘when they are called for’). Her non-aggressive approach doesn’t come across as weakness for, when a principle is at stake, she doesn’t back down. And she isn’t afraid to call someone out on their bullshit, including the MSNBC interviewer Jo Ling Kent, but even then she does so with perfect politeness.

Her personality comes across as strong and confident, not as pretense and pose. I loved watching her in that interview. Before answering, she would often get this serious look on her face, as if she were scrutinizing the true intentions behind the question and contemplating it as a philosophical issue. Such sincerity is potent, an antidote to cynicism. Trump would have a hard time combating her because she would never give him the kind of response he feeds on. No one is likely to throw Williamson off message because she lives her message. Walk and talk are perfectly aligned. I’m not sure how many people listening to her get where she is coming from. It’s something I’m extremely familiar with from my years in the Unity Church. But most people rarely come across authenticity at this level. It’s not something we’ve come to expect in politics. The last time I heard a candidate this straight-shooting was when I went to a speech given by Ralph Nader during his 2000 presidential run, but even he didn’t come across with the same confidence in vision. Even Bernie Sanders, in his down-to-earth style, doesn’t come across as powerfully as this.

Marianne Williamson, in the Democratic debate, said, “So, Mr. President, if you’re listening, I want you to hear me please — you have harnessed fear for political purposes and only love can cast that out … I’m going to harness love for political purposes. I will meet you on that field and, sir, love will win.” Who says something like that in a national political debate, especially in a political party that has become infamous for its political insincerity under Clinton domination, and even more especially while facing President Donald Trump, who came to power through hate, anger, and outrage? Such audacity, to proclaim love in this era of cynicism. Listen to what she said in that debate (Tim Hains, Marianne Williamson: If You Think We’re Going To Beat Donald Trump By Having A Lot Of Plans, You’ve Got Another Thing Coming). She kicks ass! And it has won her a following, something the corporate media is trying to dismiss — oddly, one hit piece calls her positivity-spouting and humorous followers on Reddit “trolls” (Ben Collins, 2020 candidate Marianne Williamson’s reddit following).

Those in the mainstream are looking for reasons to attack her. For example, some misrepresent her as an anti-vaxxer (Jo Ling Kent, Marianne Williamson says she supports mandatory vaccines – but ‘when they are called for’). In explaining her actual position, she states in no uncertain terms: “I understand that many vaccines are important and save lives. I recognize there are epidemics around the world that are stopped by vaccines. I also understand some of the skepticism that abounds today about drugs which are rushed to market by Big Pharma.” There is no way to fairly call her an anti-vaxxer. What she is mainly questioning is the anti-democratic role big biz plays in public policy, and she wants to ensure the best scientific evidence possible is available to promote the public good. She is a principled anti-corporatist and pro-democracy. As she put it in her own words, “I want you to rail against the chemical companies and their GMO’s — not support them. I want you to decry the military industrial complex — not assure them you’re their girl. I want you to support reinstating Glass-Steagall — not just wink at Wall Street while sipping its champagne” (An Open Letter To Hillary Clinton).

She supports mandatory vaccinations when they meet the highest standards of the scientific method, if and only if the best evidence strongly supports a public health concern that can be remedied only through this drastic course of action. Otherwise, if the evidence is weak or still under debate, or if big pharma is unduly influencing government decisions, then we are morally compelled to defend democratic process and individual liberty, personal conscience, and bodily autonomy. It is the forever difficult but not impossible democratic balance between public good and private good. A mandatory vaccination is justified in many cases and maybe not in others. She is not promoting denialism. After all, she has vaccinated her own daughter. Science isn’t a dogmatic belief system that is forever settled. Instead, science is an ongoing process. To act as if it were otherwise is anti-scientific.

The same problem comes up with attacks on her credibility because she is skeptical about GMOs. Do these people even bother to look into the science? I could write a long post about all the contrary evidence, especially the relationship between GMOs and increased pesticide use (as opposed to organic farming), but this isn’t the place to flesh out that debate. Let’s just honestly acknowledge it exists as a contested issue, a state of affairs that, of course, is reported on in the alternative media but also found in mainstream sources (a few examples: The UK’s Royal Society: a Case Study in How the Health Risks of GMOs Have Been Systematically Misrepresented by Steven Druker from Independent Science News, How GMOs Cut The Use Of Pesticides — And Perhaps Boosted It Again by Dan Charles from NPR, Largest-Ever Study Reveals Environmental Impact of Genetically Modified Crops by Caroline Newman from University of Virginia, Major Pesticides Are More Toxic to Human Cells Than Their Declared Active Principles by Robin Mesnage et al from BioMed Research International, etc).

From the viewpoint of the precautionary principle alone, whether about the GMOs themselves or the pesticides heavily used with GMOs, it’s perfectly rational that the vast majority of Americans (Democrats, Republicans, and Independents) are concerned about GMOs and strongly support having GMO foods labeled — 71-95%, depending on the question and the group asked (Chris Mooney, Stop Pretending That Liberals Are Just As Anti-Science As Conservatives). Not that American politics was ever constrained by nuance. That is precisely the problem. Williamson is arguing that we must understand diverse problems as being systemically related, such as health and the food system, or the inseparable relationship between GMOs and pesticides. Yet nuance is deemed ‘loony’ because it challenges the dominant paradigm shaped by corporate agendas.

As a loony left-winger myself, here is how I put it: “Yeah, monocultural GMO crops immersed in deadly chemicals that destroy soil and deplete nutrients are going to save us, not traditional grazing land that existed for hundreds of millions of years. So, sure, we could go on producing massive yields of grains in a utopian fantasy beloved by technocrats and plutocrats that further disconnects us from the natural world and our evolutionary origins, an industrial food system dependent on turning the whole world into endless monocrops denatured of all other life, making entire regions into ecological deserts that push us further into mass extinction. Or we could return to traditional ways of farming and living with a more traditional diet largely of animal foods (meat, fish, eggs, dairy, etc) balanced with an equal amount of vegetables, the original hunter-gatherer diet” (Carcinogenic Grains). Tell me. Is my skepticism irrational? If so, how has the highly destructive ‘rationality’ of mass industrialization been working out for life on this planet, as we head toward the cliff of mass extinction and climate change?

In many different ways, Marianne Williamson is a potential threat to the Clinton Democrats. Republicans have sensed this and, as a way of fucking with Democrats, some of them have donated to her campaign (Cnaan Liphshiz, Republicans donate to Marianne Williamson’s campaign to keep her in the Democratic debates). It reminds me of how Democrats promoted Trump in the hope that it would ensure a Democratic victory. It’s funny that Republicans are falling into the same trap of naivete. Williamson isn’t a mere unknown outlier. After the debate she participated in, her name was the most Googled and, even while the debate was happening, Google searches for her name spiked every time she spoke (Malachi Barrett, Marianne Williamson searches in Michigan explode after Democratic debate). Also, “Williamson has performed better in national polls than more established candidates like New York Mayor Bill de Blasio; Montana Gov. Steve Bullock; and Tulsi Gabbard, congresswoman from Hawaii,” writes Merle Ginsberg (Presidential Candidate Marianne Williamson Is Running on Empathy); and she concludes that, “If anybody could play Jesus to Trump’s Antichrist, Williamson is, as our wayward president would put it, straight out of central casting.”

Williamson is no lightweight. In the debates, she was the only candidate who brought up the harmful US policy in Latin America — interestingly, the only article I came across mentioning this came from a conservative source (Christian Watson, Democratic debate showed conservatives could learn something from Marianne Williamson). And she is bold in her vision, which comes across as quite left-wing (e.g., since 1997, she has supported reparations for African American slave descendants) while simultaneously invoking the American founding generation of revolutionaries. Here is how she puts it: “Franklin Roosevelt said that the primary role of the presidency is moral leadership. Americans are a decent people, but over the last 50 years, the concept of what it takes to live a good life—an ethical life—has been overtaken by corporatocracy. When I was a child, corporations were expected to have responsibility to the community, not just focus on fiduciary responsibilities to stockholders. Soulless economics has not brought us economic vibrancy. It’s destroyed our middle class and replaced a model of democracy with a model of aristocracy. We repudiated that in 1776—and need to repudiate it again.”

We used to call that a jeremiad, an American tradition if there ever was one (Sacvan Bercovitch, The American Jeremiad). If that is ‘woo’, then give me more of it. This is ‘woo’ that could seriously shake up public and political debate and hopefully a whole lot more. Give me some of that old time religion.

American Spirituality

The United States is a religious society. But I don’t know to what degree it is a spiritual society. I’m not even quite sure what spirituality can mean here. There is an Anglo-American history of spirituality: Transcendentalism, Spiritualism, Mesmerism, Theosophy, etc. The Shakers are an interesting example, specifically as a community. They originated from the Quakers, beginning as the ‘Shaking Quakers’. They were really into communal dancing, with the noise they made being heard miles away. They were also really into Spiritualism, with members going into trance states, channeling spirits, doing spirit paintings, etc. The Shakers, by the way, advocated abstinence. That might explain some of their behavior. They needed some kind of outlet. Avoiding sex meant they had to adopt children to maintain their society, which they did for over a century. That is what happened to my great grandfather. He was among the last generation of Shaker children. I would have loved to have known about his experience, but apparently he never talked about it.

There were a lot of similar things going on during the revival movements of the Great Awakenings. All kinds of odd behaviors were common, from shaking to speaking in tongues. The people believed God or the Holy Spirit came down and essentially possessed them. It’s hard to imagine this happening today in this country. There are still some churches that have such practices, including such things as snake handling, though it doesn’t seem to be at the same level as seen in those once massive revivals. Interestingly, the Piraha also do snake handling when possessed, not that they think of it as possession. A possessed Piraha becomes entirely identified with the spirit, such that not even other Piraha would recognize him as anything else. The Piraha, by the way, have no shamanic tradition as such and so no shamans. Possession isn’t part of any formal tradition or ritual; it just happens. Because of that, the Piraha might be a good framework for understanding some of the spiritual eruptions in American society.

Then there is the whole phenomenon of UFO sightings and abductee experiences, Mothman and Men in Black. That has developed into numerous UFO and alien cults (some good books have been written on that). Carl Jung considered UFOs to be an expression of a religious impulse, something new seeking to emerge within our society (see a letter he wrote to Gilbert A. Harrison and his book Flying Saucers: A Modern Myth of Things Seen in the Skies). Like Jung, others have seen a spiritual/mythological component to this. The biggest name is the astronomer and computer scientist Jacques Vallee, who noted the similarity between alien abduction accounts, fairy abduction stories, and shamanic initiations. John Keel wrote about similar things. In a scientific age, it is in a scientific guise that spirituality often gets expressed. This is the unexpected form that the next major religion is likely to take. In the way that the Axial Age religions took ahistorical myths and rewrote them as history, our society will take non-scientific myths and retell them as science. On a personal level, that will be how spirituality is experienced by many — if not necessarily the rise of UFO cults, then something like it.

I wonder what it would look like in the U.S. if we had a fourth (fifth?) Great Awakening, with large revivals or something else along these lines, although not necessarily in Christian dressing. Admittedly, it’s harder to imagine now. But secularism doesn’t alter the underlying yearning for spirituality, for something transcendent or other, something ecstatic and transformative. The hunger is there, obviously. It just gets subverted in our capitalist society. The closest we come is presidential elections, when people become a bit mentally unbalanced… still, not the same thing, at least not these days. But according to early American records, elections were once more like an ecstatic Carnival, with truly wild behavior going on. Elections with their group-minded partisanship — combined with cults of personality — can make people lose their individual sense of self into something greater (see Winter Season and Holiday Spirit). That may be the main purpose of elections in our society: not so much democracy (as U.S. politics fails on that account) but a state religion. I sometimes wonder if our entire society isn’t possessed in some sense. That might be a better explanation than anything else. That may be the difficulty the respectable classes have in coming to terms with President Donald Trump, as he is less a politician than a religious figure. Heck, maybe he is a lizard person too, part of an advance guard of an alien invasion.

* * *

Inventing the People:
The Rise of Popular Sovereignty in England and America
by Edmund S. Morgan
pp. 202-203

There were other parallels in contemporary English country life, in the fairs, “wakes,” and local festivals that punctuated the seasons, where sexual restraints were loosened and class barriers briefly broken in a “rough and ready social equality.” 82 But these were simply milder versions of what may be the most instructive parallel to an eighteenth-century election, namely the carnival— not the travelling amusement park familiar in America, but the festivities that preceded Lent in Catholic countries. The pre-Lenten carnival still survives in many places and still occupies an important place in community life, but it has assumed quite different functions from the earlier festivals. 83 It is the older carnivals, before the nineteenth century, that will bear comparison with eighteenth-century elections.

The carnival of the medieval or early modern period elicited from a community far more outrageous behavior and detailed ritual than did the elections that concern us. 84 But the carnival’s embellishments emphasize rather than obscure the fact that make-believe was the carnival’s basic characteristic and that carnival make-believe, like election make-believe, involved role reversal by the participants.

pp. 205-207

Where social tensions ran too high the carnival might become the occasion for putting a real scare into the cats and wolves of the community. There was always a cutting edge to the reversal of roles and to the seemingly frivolous competition. And when a society was ripe for revolt, the carnival activated it, as Le Roy Ladurie has shown in his account of the carnival at Romans in 1580. But normally a community went its way with the structure of power reinforced by its survival of the carnival’s make-believe challenge.

To put this idea in another way, one might say that the carnival provided society with a means of renewing consent to government, of annually legitimizing (in a loose sense of the word) the existing structure of power. Those who enacted the reversal of roles, by terminating the act accepted the validity of the order that they had ritually defied. By not carrying the make-believe forward into rebellion, they demonstrated their consent. By defying the social order only ritually they endorsed it. […]

The underlying similitude of an eighteenth-century election to a carnival is by now apparent. The two resembled each other not only in obvious outward manifestations— in the reversal of roles, in the make-believe quality of the contests, in the extravagance of the partisanship of artificial causes, in the outrageous behavior and language, in the drunkenness, the mob violence, even in the loosening of sexual restraints— not only in all these external attributes but also in an identity of social function. An election too was a safety valve, an interlude when the humble could feel a power otherwise denied them, a power that was only half illusory. And it was also a legitimizing ritual, a rite by which the populace renewed their consent to an oligarchical power structure.

Hence the insistence that the candidate himself or someone of the same rank solicit the votes of the humble. The election would not fully serve its purpose unless the truly great became for a time humble. Nor would it serve its purpose if the humble did not for a time put on a show of greatness, not giving their votes automatically to those who would ordinarily command their deference. Hence too the involvement of the whole populace in one way or another, if not in the voting or soliciting of votes, then in the tumults and riots, in the drinking and feasting, in the music and morris dancing.

It would be too much to say that the election was a substitute for a carnival. It will not do to push the analogy too far. The carnival was embedded deeply in folk culture, and its functions were probably more magical and religious than, overtly at least, political. An election, on the other hand, was almost exclusively a political affair and had no magical overtones; it was not connected with any religious calendar. 90 Nor did it always exhibit the wild excesses of a carnival; and when it did, it was surely not because the local oligarchy felt that this would renew their authority. They would generally have preferred to preserve “the peace of the country” by avoiding the contests that engaged them so hotly and cost them so much when they occurred. Moreover, the reversal of roles did not go anywhere near as far as in a carnival. In an election, along with the fraternization and condescension, there could be a great deal of direct pressure brought by the mighty on those who stood below them, with no pretense of reversing roles.

The resemblance to a carnival nevertheless remains striking. Is it wholly coincidence that there were no carnivals in Protestant England and her colonies where these carnival-like elections took place, and that in countries where carnivals did prevail elections were moribund or nonexistent? Is it too much to say that the important part of an eighteenth-century election contest in England and in the southern colonies and states was the contest itself, not the outcome of it? Is it too much to say that the temporary engagement of the population in a ritual, half-serious, half-comic battle was a mode of consent to government that filled a deeper popular need than the selection of one candidate over another by a process that in many ways denied voters the free choice ostensibly offered to them? Is it too much to say that the choice the voters made was not so much a choice of candidates as it was a choice to participate in the charade and act out the fiction of their own power, renewing their submission by accepting the ritual homage of those who sought their votes?