Collective Amnesia About Collective Amnesia

At Harper’s, Corey Robin has a piece on collective amnesia, specifically among liberals. It comes down to the incessant march of lesser evilism that inevitably leads to greater and greater evil, until nothing remains but evil’s total dominance.

Each shock of evil normalizes the evil of the past, such that we frogs are slowly boiled alive. Don’t worry, the liberal suggests, relaxing in his warm bath, we will revolt later when it finally gets bad enough. But this attitude never allows for self-awareness of complicity, and that denial makes further complicity inevitable, for it is never the right moment to admit guilt, no matter how often it was declared in the past: Never again!

“Strong stuff, suggesting the kind of experience you don’t easily recover from. If such feelings of betrayal don’t overwhelm you with a corrosive cynicism, inducing you to withdraw from politics, they provoke an incipient realism or an irrepressible radicalism… You get to lose your innocence only once… But… American liberalism is also a party of the born-again.

“The United States of Amnesia: true to form, we don’t remember who coined the phrase. It’s been attributed to Gore Vidal and to Philip Rahv, though it also appears in a syndicated column from 1948. But more than forgetfulness is at work in our ceremonies of innocence repeatedly drowned. And while it’s tempting to chalk up these rituals to a native simplicity or a preternatural naïveté — a parody of a Henry James novel, in which you get soiled by crossing the Potomac rather than the Atlantic — even our most knowing observers perform them. . . .

“Donald Trump is making America great again — not by his own hand but through the labor of his critics, who posit a more perfect union less as an aspiration for the future than as the accomplished fact of a reimagined past.

“There can be an appalling complexity to innocence,” the political scientist Louis Hartz observed in his classic 1955 study The Liberal Tradition in America, “especially if your point of departure is guilt.” That nexus of guilelessness and guilt, depth and innocence, is usually Roth country, but in this instance we’ll have to take the master’s tools and use them ourselves.

“Ever since the 2016 presidential election, we’ve been warned against normalizing Trump. That fear of normalization misstates the problem, though. It’s never the immediate present, no matter how bad, that gets normalized — it’s the not-so-distant past. Because judgments of the American experiment obey a strict economy, in which every critique demands an outlay of creed and every censure of the present is paid for with a rehabilitation of the past, any rejection of the now requires a normalization of the then.

“We all have a golden age in our pockets, ready as a wallet. Some people invent the memory of more tenderhearted days to dramatize and criticize present evil. Others reinvent the past less purposefully. Convinced the present is a monster, a stranger from nowhere, or an alien from abroad, they look to history for parent-protectors, the dragon slayers of generations past. Still others take strange comfort from the notion that theirs is an unprecedented age, with novel enemies and singular challenges. Whether strategic or sincere, revisionism encourages a refusal of the now.

“Or so we believe.

“The truth is that we’re captives, not captains, of this strategy. We think the contrast of a burnished past allows us to see the burning present, but all it does is keep the fire going, and growing… [T]he rehabilitation of the last monster allows the front line to move rightward, the new monster to get closer to the territory being defended. That may not be a problem for Roth, reader of Beckett: “Ever tried. Ever failed. No matter. Try again. Fail again.” (Though even Beckett concluded with the injunction to “fail better.”) It is a problem for us, followers of Alcoholics Anonymous: “Insanity is doing the same thing over and over again and expecting different results.””

I noticed this scathing critique earlier and was glad to see it. It came to my attention again because he mentioned it on his blog in connection with an interview with Brooke Gladstone. That interview is one of several (with Max Fischer, Deb Amos, and Sinan Antoon) in response to the 15th anniversary of the Iraq War.

Maybe no new insight is offered, but at the very least it is an important reminder. There is not only collective amnesia but amnesia about the amnesia: as a society, we keep forgetting how often our collective amnesia has recurred in the past. We don’t know what we don’t know partly because we keep forgetting what we forgot.

Corey Robin also writes at Crooked Timber. And John Holbo at that site wrote about the Harper’s piece. In the comments section, there is debate about whether Trump is exceptional and, if so, for what reason. Should we be surprised or shocked? And about what exactly? Beyond that, what is the right context for understanding?

The comments section is filled with people who, for all their disagreement, know political and economic history. But even with the above-average commentary and debate, it felt dissatisfying. There was little discussion of the fundamental causes of human behavior, of what makes the human mind tick, of the social sciences and related fields. The focus was almost entirely on externals, except for a few comments.

To some extent, that is even my complaint about Corey Robin. It is common on the political left, which is to say almost as common as on the political right. It’s easier to focus on externals and, with politics, the distraction of externals is endless. But for this reason, we rarely touch upon root causes. Robin’s theory of the reactionary mind goes a long way toward explaining conservatism and the modern mind in general, including amnesia among liberals (not that Robin talks about the reactionary in quite so expansive terms). Yet this analysis can only go so far because it never extends beyond the history of politics and economics.

A few comments by the same person, Lee A. Arnold, come the closest to stepping outside of this intellectual blind spot:

“Polanyi, The Great Transformation chapter 20 characterizes fascism as a spontaneous emotional “move” arising from within individuals, and uses the political conditions only to discard them. It is not a movement that requires a vanguard or imperial aspirations. The ONLY thing characterizing the rise of fascism, in the dozen or more countries in which it arose, was the sudden failure of the market system…”

“Fascism is a socio-emotional disease that is ever-latent but suddenly arises within individuals and overcomes enough of them to make political control possible. Elites do not stop it, because the disease overcomes them too. It is a paradoxical move: it offers an “escape from an institutional deadlock… yet… it would everywhere produce sickness unto death.” (Polanyi). Whether there is an external threat or an internal threat or both, it doesn’t matter, it’s all the same.”

“The very recent spate of commentators (here, and elsewhere) writing that fascism has been avoided in the US (or even go so far as to write that it cannot happen) rely upon a misunderstanding of what fascism was, or take too large a comfort in a narrow diversion from it. That may not be good enough, next time. Far from being silly, I see this as Corey Robin’s basic point, too.”

“I imagine that the intellectual confusion and dismissal was similar in the 1920s-30s. It’s interesting that today so many intellectuals think that fascism originated from outside the individual, as some sort of organized imposition of structure by a vanguard. So today, we will see it coming: “It can’t happen here.” (Or else they think that our current democratic institutions and media culture are strong enough and varietal enough to withstand it.)”

“But the evidence is that fascism arose spontaneously within individuals. It’s almost a pure emotion in the anger & hatred quadrant. It suddenly swept up a lot of people into supporting “doing something”, to cut through the confusion and deadlock. Yet it emotionally disregards facts, logic, science, humane values. It’s not a political-economic form in the same category with liberalism, conservatism, capitalism, socialism.”

That is related to what Robin gets at. Fascism is simply one extreme variety of the reactionary mind. The reactionary is never constrained to the mainstream conventions of ideological rhetoric and political forms. That is because the reactionary, by definition, is the result of the failure (or perceived failure) of the mainstream project of social order.

Still, even this doesn’t dig deep enough. I end up getting more insight about what motivates politics and economics from those studying philology, linguistic relativism, social psychology, anthropology, cultural history, and consciousness studies. The failure of our society is better explained by the likes of Julian Jaynes (The Origin of Consciousness in the Breakdown of the Bicameral Mind), Iain McGilchrist (The Master and His Emissary), Lewis Hyde (Trickster Makes This World), James Gilligan (Preventing Violence), Keith Payne (The Broken Ladder), Johann Hari (Lost Connections), Sebastian Junger (Tribe), etc.

These other viewpoints are more likely to offer insight about collective amnesia. The typical political and economic analysis might, in some cases, be part of the problem. Consider Karl Marx, who, in emphasizing material conditions, gets portrayed as a reductionist by his critics; yet he went beyond most on the political left by presenting what is called ‘species-being’ — as I described it:

“The basic idea is that what we produce creates a particular kind of society and shapes human nature. We produce the kind of person that is needed for the world we bring into existence. And so the kind of person that is produced is incapable of seeing beyond the social and material world that produced him. We build our own prisons, even as there is hope for us to build new worlds to inhabit and new ways of being.”

So what produces a social identity that is prone to such extremes of collective amnesia? I’m specifically referring to what manifests in our society at the seeming peak or breaking point of this post-Enlightenment liberal age (or, if you prefer, this late stage of capitalism). If we could make sense of that, then much else would become easier to disentangle.

“Historical consciousness can be a conservative force, lessening the sting of urgency, deflating the demands of the now, leaving us adrift in a sea of relativism. But it need not be,” concludes Corey Robin in Harper’s.

“Telling a story of how present trespass derives from past crime or even original sin can inspire a more strenuous refusal, a more profound assault on the now. It can fuel a desire to be rid of not just the moment but the moments that made this moment, to ensure that we never have to face this moment again. But only if we acknowledge what we’re seldom prepared to admit: that the monster has been with us all along.”

That is no doubt true. But stories are a tricky business. The challenge is, in order to understand the present, we need to understand the origins of modern civilization and the modern mind. Most commentary, even on the alternative left, isn’t up to the task. The same old debate continues, as does the collective amnesia. We are stuck in a loop, until something forces us out.

This is not only a lack of understanding but a lack of motivation toward understanding. A revolution of the mind must come first before a revolution of society can follow. That isn’t something we collectively are capable of consciously choosing. What will happen at some point is that our old mindset will entirely fail us and every answer and response we used in the past will prove futile and impotent. Then and only then will we, out of desperation, turn toward the unknown in seeking the radically new. That is how change has always happened, as one would know from studying the deep past.

Until that point of total breakdown, we will go on forgetting and we will go on forgetting what we’ve forgotten. Our entire social order is dependent on it. And at the edge of breakdown, the reactionary mind takes hold and comes into power. The liberal too comes under its sway. It is a sign of the times. But is it the Kali Yuga of the liberal world or the dawning of the next Enlightenment leading to a new revolutionary era? Is there a difference?

* * *

4/1/18 – Let me take a different approach. We are all born into this collective amnesia. And for most of us, our upbringing fails us and our education is inadequate. That was true for me.

As I became more serious in dealing with my own state of ignorance, I became ever more acutely sensitive to the pervasive ignorance that is the foundation of America in particular. Even if that is true of all societies to some degree, I suspect there is something unique about what has been referred to as America’s Fantasyland. The US, as an extreme expression, can potentially help us understand what plagues most of humanity at this late hour in the liberal age.

It is my love of liberalism that makes me so harsh in my criticisms. I long for a liberalism worthy of the name. But first that requires us to look at what the reactionary mind reflects back to us. It does no good to simply dismiss those others as ‘deplorables’. What gets repressed and projected doesn’t really go away, much as externalized costs eventually come due.

Corey Robin’s take on liberal amnesia, despite my concerns of certain limitations, resonates with something going on right now. Few seem to be paying attention, specifically not those who have their hands on the wheel or are in a position to take hold of the wheel. Some are asleep, others are texting, and still others are imbibing intoxicants… but I’m suggesting that someone needs to be at the wheel with their eyes on the road. I’m one of those crazy radicals who would rather prevent an accident than deal with the aftermath.

So, first we should all look up and look forward. Where are we heading? And is that where we want to go? If not, why don’t we change direction?

We see a monster in the rearview mirror and, not realizing we are seeing our own reflection, we keep our foot on the gas pedal. Off we speed toward disaster, driven by fear and forgetting that we are the driver. As a society, we never stop going — expressed in the quote used by Robin: “Ever tried. Ever failed. No matter. Try again. Fail again.”

On we go. No one can doubt that we are going somewhere and getting there fast. The wind in our hair and adrenaline in our veins can feel almost like progress, as if we really will outrun our monsters this time. Meanwhile, Trump the egotistic man-child is sitting in the backseat tweeting and asking, Are we there yet?

Writing a post like this is less about in-depth analysis than it is about capturing a mood. I’ve written plenty of in-depth analyses and they have their value. But studying a medical textbook on cardiology is not the same as feeling the beating pulse under your fingertips. In reading Robin’s piece and the responses to it, I could sense the flow of blood at the heart of the issue.

Right there, something can be felt. It’s like the tug of the undertow when you’re standing a few feet out in the ocean water. Viscerally, in that moment, you know the power of the ocean and how easy it is to be swept away. The whole dynamic of the debate is caught in that push and pull. So maybe the debate itself (as framed) can’t avoid that sphere of influence, that narrative attractor. Much of the commentary on the problem is caught up in the problem itself, related to how the reactionary is the other side of the liberal.

This is why the amnesiac can’t escape the condition of amnesia. It’s not something external, like a pair of clothes that can be taken off or a wall that can be knocked down. It is a loss of some part of the self, though not really lost, just hidden and forgotten. But anamnesis, the remembering of what was forgotten, remains possible… if we could only remember that there was something we forgot and why it mattered so much that maybe we shouldn’t have forgotten it in the first place.

Thinking too much about that, though, makes us feel uncomfortable and we’re not quite sure why. We convince ourselves, once again, that maybe things unknown are best left that way. In a moment of crisis, the memory of something touches the edge of our awareness and yet quickly slips away again. We busy ourselves with other things, the preoccupation of the present eclipsing what came before. That is what responsible adults do, deal with the issues at hand, right?

I like the ocean metaphor. The beach, where land meets water, is our society. The progressive and the reactionary are the tide going in and out. This liberal age is the storm wall we built to protect our home. But that storm wall has the unintended consequence of slowly eroding away the sand that forms the beach.

I saw a real-world example of this form of erosion. And it was stark.

There is this wealthy estate my grandfather grew up on, as the son of the head gardener. When he was a child, there was a beautiful beach that was part of the estate. And next to the estate is a public beach. The estate owners had a storm wall built, which caused their beach to disappear entirely, while the public beach is still there. Only a few feet separate the private property from the public space, but the erosion stops right at the boundary line.

This form of privatization is very much of this liberal age. Before this era, feudalism had its commons. Some speak of the tragedy of the commons, but that is a lie. There was no tragedy of the commons because the commons were heavily regulated. Those regulations only disappeared when the commons themselves disappeared. And what followed was the tragedy of privatization, as seen with private developers building storm walls.

It is as pointless to blame the tide coming in, the progressive, as it is to blame the tide going out, the reactionary. It is the storm wall, the liberal paradigm, that frames the action of the tide and determines the consequences. Building the wall higher and stronger to protect us from the worsening effects of storms won’t really save us or our home. Maybe it is time to consider the possibility that the storm wall is the problem and not the natural flow of the tides that get defined by that modern construction.

The erosion of the beach is our collective amnesia. But to make the metaphor more apt, most people would never have a public beach next to them. There would be nothing to compare against, nothing to remind people of what has been lost. And the loss would be gradual, the beach slowly shrinking as the storm wall grew larger. There is simply the loss of the beach, along with the loss even of the memory of the beach.

The very concept of a beach might disappear from public memory and public debate. Or people might assume that ‘beach’ was always a word referring to that threatening space just beyond the storm wall. Instead of discussing how to save or bring back beaches, political conflict would obsess over blaming the other side and arguing over the increasingly advanced techniques of building storm walls.

Eventually those storm walls would entirely block the view of the ocean, that is to say the view of the world outside of the system we’ve collectively built over the generations. The walls that protect us then would imprison us and enclose our minds, shut down our imaginations. But what fine and impressive walls they are, among the greatest advancements of modern civilization.

Here is the point, the moral of the metaphor.

It’s not that we should stop building great things, as expressions of what we value and envision. And it’s not even that we should specifically stop building walls, for they too have their place when built with wisdom and understanding. But that requires us to realize the effect we have on the world around us in what we build and how that affects us in turn. The liberal project needs to be reinterpreted, reimagined, and reinvented. Or, failing that, we need a new societal project that would inspire us as the liberal dream once did.

* * *

4/2/18 – On a related thought, Richard Eskow asks, Is the ‘liberal world order’ worth saving? That is a question I’ve often asked myself, and I’ve done so from the perspective of someone who has spent much of his life identified with the liberal world order. To be plainly honest, I like the liberal dream. It’s a beautiful dream.

So, a ‘liberal’ such as myself is implicated in this line of questioning, and the deepest implication is about what this change would mean on a personal level. Is my liberal identity worth saving? And in the long term, can it be saved? Or must we liberals become something new in seeking something new?

Eskow states that “The ‘liberal world order’ must own up to its mistakes. They were errors of commission, as well as omission. Today’s chaos – from Brexit to Trump – is fallout from a global system that works for the benefit of a privileged few and has failed to offer democratic alternatives to inequality and oligarchy.” That is to say, liberals must own up to their mistakes. And in this liberal age, we are all liberals.

So, “Is the ‘liberal world order’ worth saving?” That is a tough question. “Not in its present form,” suggests Eskow. “Yes, it has provided some semblance of order. But order without justice is both unfair and unstable. The unfairness has been apparent for many decades. Now we’re seeing the instability.”

Up to this point, we as a society have been unable to ask this question, much less take it seriously enough to attempt an answer. That is where ignorance and amnesia have left us. But maybe the coming storm, when it blows in our windows and knocks down our walls, will wake us up to the reality that was always there. What was hidden in plain sight will become impossible to ignore, as the costs finally come due.

From Ernest Hemingway’s The Sun Also Rises, there is some oft-quoted dialogue. One character asks another, “How did you go bankrupt?” “Two ways,” was the response. “Gradually and then suddenly.” That is how we will emerge from our state of unconsciousness and obliviousness. The experience of anamnesis will be a hammer to our skull. The gradual process has been building up for a long time. And soon the sudden result will finally arrive, appearing as if out of nowhere.

Will the liberal order survive? I guess we’ll find out. Let the awakening begin!

A Storyteller’s Experienced Meaning

Storytelling is a way to embody and express meaning. But there is also no better way to hide and obscure meaning than behind a story.

The most powerful stories are those so compelling or so enforced that they take on the aura of reality, even if only for a moment of imagination. When someone doesn’t tell you the meaning of a story, it might be because they can’t, but then again it might be because they won’t. The greatest storytellers don’t want you to consciously know what they are telling you, for any meaning seen out in the open loses its power to control our minds, direct our perception, and shape our identity.

That is the entire history of religion and rhetoric, politics and propaganda, media and advertising. One way or another, a story is always being told. And with a dominant narrative, the audience is held in thrall. This is why stories can be as inescapable as they are dangerous. Also, this is why no one ever likes to have the meaning of their own story brought out into the harsh light. We grow attached to our stories, as individuals and societies. We couldn’t function without them; they are our strength even as they are our Achilles’ heel.

So when a story threatens or terrorizes, imprisons or hobbles, the only way to fight it is by making its message explicit, to defuse the bomb. And that can only be done by telling an even better story, more powerful and compelling, at least in that moment of narrative crisis. But in the process of replacing one story with another, we once again get lost in meanings we can’t discern with consequences we can’t foresee.

All of that is beyond the task of the successful storyteller. As Flannery O’Connor made clear, her purpose is simply to give expression to “experienced meaning,” not to explain it to others or even to herself. Just as a con man has to first con himself, if a storyteller didn’t get lost in her own stories, neither would her audience. The fisherman gets tangled up in his own net and sometimes drowns — that is a cost of the job, and a demonstration of how effective the net is. As such, a storyteller lives and dies by the stories told.

“When you can state the theme of a story, when you can separate it from the story itself, then you can be sure the story is not a very good one. The meaning of a story has to be embodied in it, has to be made concrete in it. A story is a way to say something that can’t be said any other way, and it takes every word in the story to say what the meaning is. You tell a story because a statement would be inadequate. When anybody asks what a story is about, the only proper thing is to tell him to read the story. The meaning of fiction is not abstract meaning but experienced meaning, and the purpose of making statements about the meaning of a story is only to help you experience that meaning more fully.”

Flannery O’Connor, “Writing Short Stories”
from Mystery and Manners (as quoted in Biblioklept)

Noble or savage?

From about a decade ago, in the last year of the Bush era, The Economist published the article “Noble or savage?” It is a typical piece, a rather simplistic and uninformed view, as expected. Rhetoric like this gets spun on a daily basis, but this particular example hit a nerve.

“two-thirds of modern hunter-gatherers are in a state of almost constant tribal warfare, and nearly 90% go to war at least once a year.”

That is plain fricking idiotic. Modern hunter-gatherers are under social, political, military, and environmental pressures that didn’t exist to the same degree until the modern era. Plus, they have been experiencing greater levels of pollution and the introduction of new diseases, which cause further stress and instability.

“Richard Wrangham of Harvard University says that chimpanzees and human beings are the only animals in which males engage in co-operative and systematic homicidal raids. The death rate is similar in the two species.”

Like modern hunter-gatherers, modern chimpanzees are living under unnatural conditions. The territory of chimpanzees has been severely impacted by human encroachment, environmental destruction, poaching, and violent conflict. Why would any halfway intelligent person expect anything other than dysfunctional behavior from populations of two closely related species experiencing the same dysfunctional conditions?

“Constant warfare was necessary to keep population density down to one person per square mile.”

Not necessarily. Hunter-gatherers have a lower birthrate, partly because of later puberty. Plus, they have higher infant mortality. Overpopulation is simply not much of a problem for most hunter-gatherers. They are usually able to maintain a stable population without needing warfare to decrease excess numbers. During boom times, population stress might become a problem, but that isn’t the norm.

“Hunter-gatherers may have been so lithe and healthy because the weak were dead.”

Maybe to some degree. But this would have to be proven, not claimed as empty speculation. Once past early childhood, hunter-gatherers have lifespans as long as those of modern Westerners and health that is superior. Adult hunter-gatherers have low mortality rates, regularly living into old age with few if any of the stress-related illnesses common among WEIRD societies (low or non-existent rates of obesity, diabetes, heart disease, depression, psychosis, trauma, etc.).

“Notice a close parallel with the industrial revolution. When rural peasants swapped their hovels for the textile mills of Lancashire, did it feel like an improvement? The Dickensian view is that factories replaced a rural idyll with urban misery, poverty, pollution and illness. Factories were indeed miserable and the urban poor were overworked and underfed. But they had flocked to take the jobs in factories often to get away from the cold, muddy, starving rural hell of their birth.”

Along with being ignorant of anthropology and social science, the author is ignorant of history.

Most rural peasants didn’t freely choose to leave their villages. They were evicted from their homes and communities. The commons they had rights to and depended on were privatized during the enclosure movement of early capitalism. Oftentimes, their villages were razed to the ground, and the newly landless peasants were forced into a refugee crisis; their only option was to head to the cities, where they found overcrowding, homelessness, unemployment, poverty, and starvation.

In comparison, their rural life had been quite secure and pleasant. Barbara Ehrenreich discusses this in Dancing in the Streets. During much of feudalism, especially earlier on, as much or more time was spent in socializing and public events (celebrations, holy days, carnival, etc.) than in labor. The loss of that relatively easygoing and healthy rural life was devastating and traumatizing. This set the stage for the Peasants’ Revolt, the English Civil War, and the revolutionary era. Thomas Paine saw firsthand the results of this societal and economic shift when he lived in London. It was a time of near continuous food riots, public hangings, and forced labor (either as convicts or indentured servants).

“Homo sapiens wrought havoc on many ecosystems as Homo erectus had not. There is no longer much doubt that people were the cause of the extinction of the megafauna in North America 11,000 years ago and Australia 30,000 years before that. The mammoths and giant kangaroos never stood a chance against co-ordinated ambush with stone-tipped spears and relentless pursuit by endurance runners.”

That probably had as much to do with environmental changes. There is still scholarly debate about how much of those extinctions were caused by over-hunting versus environmental stress on species. It was probably a combination of factors. After all, it was large-scale climate shifts that made the migration of humans into new areas possible in the first place. When such climate shifts happened, ecosystems were dramatically altered and not all species were equally adaptable as their former niches disappeared. For example, Neanderthals probably died out because they couldn’t handle the warmer temperatures, not because they were massacred by invading Homo sapiens.

“Incessant innovation is a characteristic of human beings. Agriculture, the domestication of animals and plants, must be seen in the context of this progressive change.”

That is rhetoric being used to justify an ideology. All societies and species experience change, as they are forced to adapt to changing conditions or, failing that, to die out. That is the natural state of earthly existence for all species. But the narrative of ‘progress’ is a particular view of change. All of evolution has not been heading toward Western Civilization, as an inevitable march of history foreordained by God’s plan for the dominance of the white race over the beasts and savages.

“It was just another step: hunter-gatherers may have been using fire to encourage the growth of root plants in southern Africa 80,000 years ago. At 15,000 years ago people first domesticated another species—the wolf (though it was probably the wolves that took the initiative). After 12,000 years ago came crops. The internet and the mobile phone were in some vague sense almost predestined 50,000 years ago to appear eventually.”

How shockingly simpleminded! Nothing is predestined. That sounds like the old Whiggish history of Manifest Destiny. This entire article is a long argument for Western imperialism in its genocide of the natives and the stealing of their land. No one is arguing that the hunter-gatherer lifestyle is or was perfect, but let’s at least acknowledge the aspects of that lifestyle that were relatively better, such as low levels of social stress and mental illness. Do we really need yet another apologia for neoliberal hegemony and modern globalization? No, we don’t.

“There is a modern moral in this story. We have been creating ecological crises for ourselves and our habitats for tens of thousands of years. We have been solving them, too.”

There might be a modern moral in this story. But it might not be the one promoted by the ruthless dominators who have willfully destroyed all alternatives. The royal ‘we’ that supposedly has been solving these problems just so happens to be the same ‘we’ that has eliminated the natives and their lifestyle, often by way of one form of death or another, from war to disease. These hunter-gatherers, in representing an alternative, may seem problematic to the ruling class of capitalist nation-states. And no doubt that problem has been solved for the endless march of capitalist ‘progress’, such as it is.

“Pessimists will point out that each solution only brings us face to face with the next crisis, optimists that no crisis has proved insoluble yet. Just as we rebounded from the extinction of the megafauna and became even more numerous by eating first rabbits then grass seeds, so in the early 20th century we faced starvation for lack of fertiliser when the population was a billion people, but can now look forward with confidence to feeding 10 billion on less land using synthetic nitrogen, genetically high-yield crops and tractors. When we eventually reverse the build-up in carbon dioxide, there will be another issue waiting for us.”

So, the vision offered by the ruling elite is endless creative destruction that pulverizes everyone and everything that gets in the way. One problem ever leading to another, an ongoing Social Darwinism that implicitly presumes Westerners will continue to be the winners and survivors. They shall consume the entire earth until there is nothing left and then they will isolate themselves in post-apocalyptic biodomes or escape to the stars following their tech-utopian fantasies.

As for the rest of us, are we supposed to apathetically submit to their power-hungry demands and world-weary pessimism, which threaten that resistance is futile, that no other options remain? Are we expected to unquestioningly accept how ideological realism denies moral and radical imagination, denies our freedom to choose anything else?

* * *

Noble or Savage? Both. (Part 1) and (Part 2)
by Jason Godesky

Romantic primitivism
by Evfit

Individualism and Isolation

“When an Indian child has been brought up among us, taught our language and habituated to our customs, yet if he goes to see his relations and makes one Indian ramble with them, there is no persuading him ever to return. [But] when white persons of either sex have been taken prisoners young by the Indians, and lived a while among them, tho’ ransomed by their friends, and treated with all imaginable tenderness to prevail with them to stay among the English, yet in a short time they become disgusted with our manner of life, and the care and pains that are necessary to support it, and take the first good opportunity of escaping again into the woods, from whence there is no reclaiming them.”
~ Benjamin Franklin

“The Indians, their old masters, gave them their choice and, without requiring any consideration, told them that they had been long as free as themselves. They chose to remain, and the reasons they gave me would greatly surprise you: the most perfect freedom, the ease of living, the absence of those cares and corroding solicitudes which so often prevail with us… all these and many more motives which I have forgot made them prefer that life of which we entertain such dreadful opinions. It cannot be, therefore, so bad as we generally conceive it to be; there must be in their social bond something singularly captivating, and far superior to anything to be boasted of among us; for thousands of Europeans are Indians, and we have no examples of even one of those Aborigines having from choice become Europeans! There must be something more congenial to our native dispositions than the fictitious society in which we live, or else why should children, and even grown persons, become in a short time so invincibly attached to it? There must be something very bewitching in their manners, something very indelible and marked by the very hands of nature. For, take a young Indian lad, give him the best education you possibly can, load him with your bounty, with presents, nay with riches, yet he will secretly long for his native woods, which you would imagine he must have long since forgot, and on the first opportunity he can possibly find, you will see him voluntarily leave behind all you have given him and return with inexpressible joy to lie on the mats of his fathers…

“Let us say what will of them, of their inferior organs, of their want of bread, etc., they are as stout and well made as the Europeans. Without temples, without priests, without kings and without laws, they are in many instances superior to us, and the proofs of what I advance are that they live without care, sleep without inquietude, take life as it comes, bearing all its asperities with unparalleled patience, and die without any kind of apprehension for what they have done or for what they expect to meet with hereafter. What system of philosophy can give us so many necessary qualifications for happiness? They most certainly are much more closely connected to nature than we are; they are her immediate children…”
~ J. Hector St. John de Crèvecœur

Western, educated, industrialized, rich, and democratic. Such societies, as the acronym goes, are WEIRD. But what exactly makes them weird?

This occurred to me because of reading Sebastian Junger’s Tribe. Much of what gets attributed to these WEIRD descriptors has been around for more than half a millennium, at least since colonial imperialism began in Europe. From the moment Europeans settled in the Americas, a significant number of colonists were choosing to live among the natives because in many ways that lifestyle was a happier and healthier way of living, with less stress and work, less poverty and inequality, not only lacking in arbitrary political power but also allowing far more personal freedom, especially for women.

Today, when we bother to think much about the problems we face, we mostly blame them on the side effects of modernity. But colonial imperialism began when Europe was still under the sway of monarchies, state churches, and feudalism. There was nothing WEIRD about Western civilization at the time.

Those earlier Europeans hadn’t yet started to think of themselves in terms of a broad collective identity such as ‘Westerners’. They weren’t particularly well educated, not even the upper classes. Industrialization was centuries away. As for being rich, there was some wealth back then but it was limited to a few and even for those few it was rather unimpressive by modern standards. And Europeans back then were extremely anti-democratic.

Since European colonists were generally no more WEIRD than various native populations, we must look for other differences between them. Why did so many Europeans choose to live among the natives? Why did so many captured Europeans who were adopted into tribes refuse or resist being ‘saved’? And why did the colonial governments have to create laws and enforce harsh punishments to try to stop people from ‘going native’?

European society, on both sides of the ocean, was severely oppressive and violent. That was particularly true in Virginia, which was built on the labor of indentured servants at a time when most were worked to death before getting the opportunity for release from bondage. They had plenty of reason to seek the good life among the natives. But life was less than pleasant in the other colonies as well. A similar pattern repeated itself.

Thomas Morton went off with some men into the wilderness to start their own community, where they commingled with the natives. This set an intolerable example that threatened Puritan social control, and so the Puritans destroyed their community of the free. Roger Williams, a Puritan minister, took on the mission to convert the natives but found himself converted instead. He fled Puritan oppression because, as he put it, the natives were more civilized than the colonists. Much later on, Thomas Paine, living near some still-free Indian tribes, observed that their communities demonstrated greater freedom, self-governance, and natural rights than the colonies. He hoped Americans could take that lesson to heart. Other founders, from John Adams to Thomas Jefferson, looked admiringly to the example of their Indian neighbors.

The point is that whatever is problematic about Western society has been that way for a long time. Modernity has worsened this condition of unhappiness and dysfunction. There is no doubt that becoming more WEIRD has made us ever more weird, but Westerners were plenty weird from early on before they became WEIRD. Maybe the turning point for the Western world was the loss of our own traditional cultures and tribal lifestyles, as the Roman model of authoritarianism spread across Europe and became dominant. We have yet to shake off these chains of the past and instead have forced them upon everyone else.

It is what some call the Wetiko, one of the most infectious and deadly of mind viruses. “The New World fell not to a sword but to a meme,” as Daniel Quinn stated it (Beyond Civilization, p. 50). But it is a mind virus that can only take hold after immunity is destroyed. As long as there were societies of the free, the contagion was contained because the sick could be healed. But the power of the contagion is that the rabidly infected feel an uncontrollable compulsion to attack and kill the uninfected, the very people who would offer healing. Then the remaining survivors become infected and spread it further. A plague of victimization until no one is left untouched, until there is nowhere else to escape. Once all alternatives are eliminated, once a demiurgic monoculture comes to power, we are trapped in what Philip K. Dick called the Black Iron Prison. Sickness becomes all we know.

The usefulness of taking note of contemporary WEIRD societies isn’t that WEIRDness is the disease but that it shows the full-blown set of symptoms of the disease. But the onset of the symptoms comes long after the infection, like a slow-growing malignant tumor in the brain. Still, symptoms are important, specifically when there is a comparison to a healthy population. That is what the New World offered the European mind, a comparison. The earliest accounts of native societies in the Americas helped Europeans to diagnose their own disease and helped motivate them to begin looking for a cure, although the initial attempts were fumbling and inept. The first thing that some Europeans did was simply to imagine what a healthy community might look like. That is what Thomas More attempted to do five centuries ago with his book Utopia.

Maybe the key is social concerns. Utopian visions have always focused on the social aspect, typically describing how people would ideally live together in communities. That was also the focus of the founders when they sought out alternative possibilities for organizing society. The change with colonialism, as feudalism was breaking down, was the loss of belonging, of community and kinship. Modern individualism began not as an ideal but as a side effect of social breakdown, a condition of isolation and disconnection forced upon entire populations rather than freely chosen. And it was traumatizing to the Western psyche, and still is, as seen with the high rates of mental illness in WEIRD societies, especially in the hyper-individualistic United States. That trauma began before the defining factors of the WEIRD took hold. It was that trauma that made the WEIRD possible.

The colonists, upon meeting natives, discovered what had been lost. And for many colonists, that loss had happened within living memory. The hunger for what was lost was undeniable. To have seen traditional communities that still functioned would have been like taking a breath of fresh air after having spent months in the stench of a ship’s hull. Not only did these native communities demonstrate what was recently lost but also what had been lost so much earlier. As many Indian tribes had more democratic practices, so did many European tribes prior to feudalism. But colonists had spent their entire lives being told democracy was impossible, that personal freedom was dangerous or even sinful.

The difference today is that none of this is within living memory for most of us, specifically Americans, unless one was raised Amish or in some similar community. The closest a typical American comes to this experience is by joining the military during war time. That is one of the few opportunities for a modern equivalent to a tribe, at least within WEIRD societies. And maybe a large part of the trauma soldiers struggle with isn’t merely the physical violence of war but the psychological violence of returning to the state of alienation, the loss of a bond that was closer than that of their own families.

Sebastian Junger notes that veterans who return to strong social support experience low rates of long-term PTSD, similar to Johann Hari’s argument about addiction in Chasing the Scream and his argument about depression in Lost Connections. Trauma, depression, addiction, etc — these are consequences of or worsened by isolation. These responses are how humans cope under stressful and unnatural conditions.

* * *

Tribe: On Homecoming and Belonging
by Sebastian Junger
pp. 16-25

The question for Western society isn’t so much why tribal life might be so appealing—it seems obvious on the face of it—but why Western society is so unappealing. On a material level it is clearly more comfortable and protected from the hardships of the natural world. But as societies become more affluent they tend to require more, rather than less, time and commitment by the individual, and it’s possible that many people feel that affluence and safety simply aren’t a good trade for freedom. One study in the 1960s found that nomadic !Kung people of the Kalahari Desert needed to work as little as twelve hours a week in order to survive—roughly one-quarter the hours of the average urban executive at the time. “The ‘camp’ is an open aggregate of cooperating persons which changes in size and composition from day to day,” anthropologist Richard Lee noted with clear admiration in 1968. “The members move out each day to hunt and gather, and return in the evening to pool the collected foods in such a way that every person present receives an equitable share… Because of the strong emphasis on sharing, and the frequency of movement, surplus accumulation… is kept to a minimum.”

The Kalahari is one of the harshest environments in the world, and the !Kung were able to continue living a Stone-Age existence well into the 1970s precisely because no one else wanted to live there. The !Kung were so well adapted to their environment that during times of drought, nearby farmers and cattle herders abandoned their livelihoods to join them in the bush because foraging and hunting were a more reliable source of food. The relatively relaxed pace of !Kung life—even during times of adversity—challenged long-standing ideas that modern society created a surplus of leisure time. It created exactly the opposite: a desperate cycle of work, financial obligation, and more work. The !Kung had far fewer belongings than Westerners, but their lives were under much greater personal control. […]

First agriculture, and then industry, changed two fundamental things about the human experience. The accumulation of personal property allowed people to make more and more individualistic choices about their lives, and those choices unavoidably diminished group efforts toward a common good. And as society modernized, people found themselves able to live independently from any communal group. A person living in a modern city or a suburb can, for the first time in history, go through an entire day—or an entire life—mostly encountering complete strangers. They can be surrounded by others and yet feel deeply, dangerously alone.

The evidence that this is hard on us is overwhelming. Although happiness is notoriously subjective and difficult to measure, mental illness is not. Numerous cross-cultural studies have shown that modern society—despite its nearly miraculous advances in medicine, science, and technology—is afflicted with some of the highest rates of depression, schizophrenia, poor health, anxiety, and chronic loneliness in human history. As affluence and urbanization rise in a society, rates of depression and suicide tend to go up rather than down. Rather than buffering people from clinical depression, increased wealth in a society seems to foster it.

Suicide is difficult to study among unacculturated tribal peoples because the early explorers who first encountered them rarely conducted rigorous ethnographic research. That said, there is remarkably little evidence of depression-based suicide in tribal societies. Among the American Indians, for example, suicide was understood to apply in very narrow circumstances: in old age to avoid burdening the tribe, in the ritual paroxysms of grief following the death of a spouse, in a hopeless but heroic battle with an enemy, and in an attempt to avoid the agony of torture. Among tribes that were ravaged by smallpox, it was also understood that a person whose face had been hideously disfigured by lesions might kill themselves. According to The Ethics of Suicide: Historical Sources, early chroniclers of the American Indians couldn’t find any other examples of suicide that were rooted in psychological causes. Early sources report that the Bella Coola, the Ojibwa, the Montagnais, the Arapaho, the Plateau Yuma, the Southern Paiute, and the Zuni, among many others, experienced no suicide at all.

This stands in stark contrast to many modern societies, where the suicide rate is as high as 25 cases per 100,000 people. (In the United States, white middle-aged men currently have the highest rate at nearly 30 suicides per 100,000.) According to a global survey by the World Health Organization, people in wealthy countries suffer depression at as much as eight times the rate they do in poor countries, and people in countries with large income disparities—like the United States—run a much higher lifelong risk of developing severe mood disorders. A 2006 study comparing depression rates in Nigeria to depression rates in North America found that across the board, women in rural areas were less likely to get depressed than their urban counterparts. And urban North American women—the most affluent demographic of the study—were the most likely to experience depression.

The mechanism seems simple: poor people are forced to share their time and resources more than wealthy people are, and as a result they live in closer communities. Inter-reliant poverty comes with its own stresses—and certainly isn’t the American ideal—but it’s much closer to our evolutionary heritage than affluence. A wealthy person who has never had to rely on help and resources from his community is leading a privileged life that falls way outside more than a million years of human experience. Financial independence can lead to isolation, and isolation can put people at a greatly increased risk of depression and suicide. This might be a fair trade for a generally wealthier society—but a trade it is. […]

The alienating effects of wealth and modernity on the human experience start virtually at birth and never let up. Infants in hunter-gatherer societies are carried by their mothers as much as 90 percent of the time, which roughly corresponds to carrying rates among other primates. One can get an idea of how important this kind of touch is to primates from an infamous experiment conducted in the 1950s by a primatologist and psychologist named Harry Harlow. Baby rhesus monkeys were separated from their mothers and presented with the choice of two kinds of surrogates: a cuddly mother made out of terry cloth or an uninviting mother made out of wire mesh. The wire mesh mother, however, had a nipple that dispensed warm milk. The babies took their nourishment as quickly as possible and then rushed back to cling to the terry cloth mother, which had enough softness to provide the illusion of affection. Clearly, touch and closeness are vital to the health of baby primates—including humans.

In America during the 1970s, mothers maintained skin-to-skin contact with babies as little as 16 percent of the time, which is a level that traditional societies would probably consider a form of child abuse. Also unthinkable would be the modern practice of making young children sleep by themselves. In two American studies of middle-class families during the 1980s, 85 percent of young children slept alone in their own room—a figure that rose to 95 percent among families considered “well educated.” Northern European societies, including America, are the only ones in history to make very young children sleep alone in such numbers. The isolation is thought to make many children bond intensely with stuffed animals for reassurance. Only in Northern European societies do children go through the well-known developmental stage of bonding with stuffed animals; elsewhere, children get their sense of safety from the adults sleeping near them.

The point of making children sleep alone, according to Western psychologists, is to make them “self-soothing,” but that clearly runs contrary to our evolution. Humans are primates—we share 98 percent of our DNA with chimpanzees—and primates almost never leave infants unattended, because they would be extremely vulnerable to predators. Infants seem to know this instinctively, so being left alone in a dark room is terrifying to them. Compare the self-soothing approach to that of a traditional Mayan community in Guatemala: “Infants and children simply fall asleep when sleepy, do not wear specific sleep clothes or use traditional transitional objects, room share and cosleep with parents or siblings, and nurse on demand during the night.” Another study notes about Bali: “Babies are encouraged to acquire quickly the capacity to sleep under any circumstances, including situations of high stimulation, musical performances, and other noisy observances which reflect their more complete integration into adult social activities.”

As modern society reduced the role of community, it simultaneously elevated the role of authority. The two are uneasy companions, and as one goes up, the other tends to go down. In 2007, anthropologist Christopher Boehm published an analysis of 154 foraging societies that were deemed to be representative of our ancestral past, and one of their most common traits was the absence of major wealth disparities between individuals. Another was the absence of arbitrary authority. “Social life is politically egalitarian in that there is always a low tolerance by a group’s mature males for one of their number dominating, bossing, or denigrating the others,” Boehm observed. “The human conscience evolved in the Middle to Late Pleistocene as a result of… the hunting of large game. This required… cooperative band-level sharing of meat.”

Because tribal foragers are highly mobile and can easily shift between different communities, authority is almost impossible to impose on the unwilling. And even without that option, males who try to take control of the group—or of the food supply—are often countered by coalitions of other males. This is clearly an ancient and adaptive behavior that tends to keep groups together and equitably cared for. In his survey of ancestral-type societies, Boehm found that—in addition to murder and theft—one of the most commonly punished infractions was “failure to share.” Freeloading on the hard work of others and bullying were also high up on the list. Punishments included public ridicule, shunning, and, finally, “assassination of the culprit by the entire group.” […]

Most tribal and subsistence-level societies would inflict severe punishments on anyone who caused that kind of damage. Cowardice is another form of community betrayal, and most Indian tribes punished it with immediate death. (If that seems harsh, consider that the British military took “cowards” off the battlefield and executed them by firing squad as late as World War I.) It can be assumed that hunter-gatherers would treat their version of a welfare cheat or a dishonest banker as decisively as they would a coward. They may not kill him, but he would certainly be banished from the community. The fact that a group of people can cost American society several trillion dollars in losses—roughly one-quarter of that year’s gross domestic product—and not be tried for high crimes shows how completely de-tribalized the country has become.

Dishonest bankers and welfare or insurance cheats are the modern equivalent of tribe members who quietly steal more than their fair share of meat or other resources. That is very different from alpha males who bully others and openly steal resources. Among hunter-gatherers, bullying males are often faced down by coalitions of other senior males, but that rarely happens in modern society. For years, the United States Securities and Exchange Commission has been trying to force senior corporate executives to disclose the ratio of their pay to that of their median employees. During the 1960s, senior executives in America typically made around twenty dollars for every dollar earned by a rank-and-file worker. Since then, that figure has climbed to 300-to-1 among S&P 500 companies, and in some cases it goes far higher than that. The US Chamber of Commerce managed to block all attempts to force disclosure of corporate pay ratios until 2015, when a weakened version of the rule was finally passed by the SEC in a strict party-line vote of three Democrats in favor and two Republicans opposed.

In hunter-gatherer terms, these senior executives are claiming a disproportionate amount of food simply because they have the power to do so. A tribe like the !Kung would not permit that because it would represent a serious threat to group cohesion and survival, but that is not true for a wealthy country like the United States. There have been occasional demonstrations against economic disparity, like the Occupy Wall Street protest camp of 2011, but they were generally peaceful and ineffective. (The riots and demonstrations against racial discrimination that later took place in Ferguson, Missouri, and Baltimore, Maryland, led to changes in part because they attained a level of violence that threatened the civil order.) A deep and enduring economic crisis like the Great Depression of the 1930s, or a natural disaster that kills tens of thousands of people, might change America’s fundamental calculus about economic justice. Until then, the American public will probably continue to refrain from broadly challenging both male and female corporate leaders who compensate themselves far in excess of their value to society.

That is ironic, because the political origins of the United States lay in confronting precisely this kind of resource seizure by people in power. King George III of England caused the English colonies in America to rebel by trying to tax them without allowing them a voice in government. In this sense, democratic revolutions are just a formalized version of the sort of group action that coalitions of senior males have used throughout the ages to confront greed and abuse. Thomas Paine, one of the principal architects of American democracy, wrote a formal denunciation of civilization in a tract called Agrarian Justice: “Whether… civilization has most promoted or most injured the general happiness of man is a question that may be strongly contested,” he wrote in 1795. “[Both] the most affluent and the most miserable of the human race are to be found in the countries that are called civilized.”

When Paine wrote his tract, Shawnee and Delaware warriors were still attacking settlements just a few hundred miles from downtown Philadelphia. They held scores of white captives, many of whom had been adopted into the tribe and had no desire to return to colonial society. There is no way to know the effect on Paine’s thought process of living next door to a communal Stone-Age society, but it might have been crucial. Paine acknowledged that these tribes lacked the advantages of the arts and science and manufacturing, and yet they lived in a society where personal poverty was unknown and the natural rights of man were actively promoted.

In that sense, Paine claimed, the American Indian should serve as a model for how to eradicate poverty and bring natural rights back into civilized life.

* * *

Dark Triad Domination
Urban Weirdness
Social Disorder, Mental Disorder
The Unimagined: Capitalism and Crappiness
It’s All Your Fault, You Fat Loser!
How Universal Is The Mind?
Bias About Bias
“Beyond that, there is only awe.”

 

Lock Without a Key

In his recent book The “Other” Psychology of Julian Jaynes, Brian J. McVeigh brings the latest evidence to bear on the theory of the bicameral mind. His focus is on textual evidence, the use of what he calls mind words and their frequency. But before getting to the evidence itself, he clarifies a number of issues, specifically the ever confounding topic of consciousness. Many have proclaimed Jaynes to have been wrong while having no clue what he was talking about, as he used consciousness in a particular way that he took great care to define… for anyone who bothers to actually read his book before dismissing it.

To avoid confusion, McVeigh refers to Jaynesian ‘consciousness’ as conscious interiority (others simply call it J-consciousness). McVeigh states that his purpose is to “distinguish it from perception, thinking, cognition, rational thought, and other concepts.” Basically, Jaynes wasn’t talking about basic awareness and physiological reactivity (e.g., a slug recoiling from salt) or complex neurocognitive skills (e.g., numerical accounting of trade goods and taxes), as the bicameral mind possessed these traits. Starting with Jaynes’ analysis, McVeigh, as he does in other books, gives a summary of the features of conscious interiority:

  • Spatialization of Psyche
  • Introception
  • Excerption
  • Self-narratization
  • Self-autonomy
  • Self-authorization
  • Concilience
  • Individuation
  • Self-reflexivity

These describe a historically contingent state of mind and social identity, a way of being in the world and experiencing reality, a social order and cultural lifestyle. These traits are peculiar to the civilizations of the past few millennia. But to be clear, they are no indication of superiority, as archaic civilizations were able to accomplish tasks we find phenomenal — such as envisioning and planning over multiple generations to carve stones weighing hundreds of tons, move them over great distances, and place them into precise position in building large, complex structures, something we don’t know how to accomplish without the advances of modern mathematics, architecture, and technology.

There is nothing inevitable or necessary about consciousness, and it is surely possible that civilization could have developed in alternative ways, some perhaps far more advanced than what we presently know. Consider that one of the largest cities in the world during early European colonialism was in the Americas, where the bicameral mind may have held on for a longer period of time. If not for the decimation by disease (and the earlier Bronze Age decimation by environmental catastrophe), bicameralism might have continued to dominate civilization and led to a different equivalent of the Axial Age, with whatever would have followed from it. History is full of arbitrary junctures that end up tipping humanity one way or another, depending on which way the wind happens to be blowing on a particular day or how the entire weather pattern shifts over several centuries.

That said, in history as it did play out, we now have our specific egoic consciousness, and it is hard for us to imagine anything else, even if, in the grand scheme of things, our mighty neurocognitive monoculture is a momentary blip in the passing of eons, one that may disappear in the near future as quickly as it came. Our civilization lives in a constant state of precarious uncertainty. That is why we study the past: to grapple with the present and maybe, just maybe, glimpse what is yet to come. We might as well spend our time in this game of attempted self-understanding, if only to amuse future generations who happen upon our half-witted ponderings.

Let us consider one aspect of consciousness. Here is McVeigh’s quick summary of concilience (pp. 39-40): A “slightly ambiguous perceived object is made to conform to some previously learned schema.” Consilience (or assimilation) is “doing in mind-space what narratization does in our mind-time or spatialized time. It brings things together [as] conscious objects just as narratization brings episodes together as a story” (Jaynes 1976, 64-65). I hadn’t previously given this much thought, but for some reason it stood out to me in my perusal. Immediately coming to mind was Lewis Hyde’s notion of metonymy, along with the nexus of metaphorical framing, embodied mind, and symbolic conflation. Related to concilience and narratization, McVeigh speaks of coception, which he defines as “how perceptions and introceptions coincide (such overlapping deludes us into assuming that interior experiences are sensory reflections of reality)” (p. 41).

I’m not sure to what degree I comprehend what McVeigh is getting at, but I grasp some hints of what it might mean. It resonates with my readings of Lewis Hyde’s Trickster Makes This World (see “Why are you thinking about this?”). Hyde defines metonymy as the “substituting of one thing for another” (p. 169), in which “an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap. Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman).”

Even more relevant is Hyde’s detailed exploration of the body and shame. I’d propose that this is a key factor in the rise of consciousness. Jaynes noted that shame and the general obsession over nakedness, sexuality, etc. can’t be found in the earliest texts and art (maybe not entirely unrelated to how anthropologists have observed the depression that follows when tribal people first see themselves in a mirror and so see themselves as others see them, which leads to the entertaining thought of Bronze Age civilizations collapsing as mirrors became widely used and the populations descended into mass despondency). Hyde says that, “The construction of the trap of shame begins with this metonymic trick, a kind of bait and switch in which one’s changeable social place is figured in terms of an unchangeable part of the body. Then by various means the trick is made to blend invisibly into the landscape. […] In short, to make the trap of shame we inscribe the body as a sign of wider worlds, then erase the artifice of that signification so that the content of shame becomes simply the way things are, as any fool can see.”

This word magik of hypnotic metonymy and concilience is how we fall into our collective trance. We mass hallucinate our moral imagination into social reality. And then we enforce the narrative of individual selfhood onto ourselves, passing it on as an intergenerational inheritance or curse. What if shame is the cornerstone of modern civilization built over the buried corpse of bicameral civilization? We construct the social order around us like a prison to which we forgot to design a key. Shame is the trap we set for ourselves, the trap we keep on triggering, the door locking behind us each time. And so shame is the mechanism of the lock that we refuse to look at, the mechanism that would release us.

It is only in consciousness that we can remain unconscious.

Powerful Conspiracies & Open Secrets

Why do so many Americans, including the well educated, know so little about how power operates? A naïveté dominates the mainstream mind, shaped by a sociopolitical reality tunnel mediated through corporate media, think tanks, perception management, and the like.

The most powerful conspiracies often operate as open secrets. The conspirators, not that they think of themselves that way, don’t worry about exposure because the public is kept in a permanent state of apathetic ignorance. The alternative and foreign media that investigate and report on these conspiracies rarely reach the American audience. Nor do those in domestic corporate media and among the domestic political elite necessarily get much exposure to other views, as they are as trapped in echo chambers and media bubbles as anyone else. It’s a shared condition of mediated reality.

If mainstream media reporting and the political narrative are controlled, then public perception and opinion can be controlled as well. Truth doesn’t matter when truth has been made invisible and irrelevant to most of the targeted population. As long as the official story is repeated enough, few will question it, and the few who do will be dismissed as cranks and conspiracy theorists, not that most will even take notice. A news report might occasionally surface, barely touching on the story, only to disappear once again. The very casual and fleeting treatment of the incident communicates and reinforces its insignificance to the audience.

Conspiracies are easy, as long as there is an established system of persuasion and influence. It doesn’t require a secret cabal operating in a dark room. Those in positions of power and authority, including media personalities, live in the same world and share the same vested interests. Like the rest of the population, those in the upper classes come to believe the stories they tell. The first victims of propaganda are the propagandists, as any con man has to first con himself. And humans are talented at rationalizing.

Truth is easily sacrificed, just as easily as it is ignored, even by those who know better. We simultaneously know and don’t know all kinds of things. In a cynical age such as ours, a conspiracy doesn’t seem like a conspiracy when it operates out in the open. We take it as business as usual, the way power always operates. The greatest conspiracy is our collective blindness and silence.

* * *

Daniel Ellsberg’s Advice for How to Stop Current and Future Wars
by Norman Solomon

Daniel Ellsberg has a message that managers of the warfare state don’t want people to hear.

“If you have information that bears on deception or illegality in pursuing wrongful policies or an aggressive war,” he said in a statement released last week, “don’t wait to put that out and think about it, consider acting in a timely way at whatever cost to yourself. … Do what Katharine Gun did.”

If you don’t know what Katharine Gun did, chalk that up to the media power of the war system.

Ellsberg’s video statement went public as this month began, just before the 15th anniversary of the revelation by a British newspaper, the Observer, of a secret NSA memo—thanks to Katharine Gun. At the UK’s intelligence agency GCHQ, about 100 people received the same email memo from the National Security Agency on the last day of January 2003, seven weeks before the invasion of Iraq got underway. Only Katharine Gun, at great personal risk, decided to leak the document.

If more people had taken such risks in early 2003, the Iraq War might have been prevented. If more people were willing to take such risks in 2018, the current military slaughter in several nations, mainly funded by U.S. taxpayers, might be curtailed if not stopped. Blockage of information about past whistleblowing deprives the public of inspiring role models.

That’s the kind of reality George Orwell was referring to when he wrote: “Who controls the past controls the future; who controls the present controls the past.” […]

Katharine Gun foiled that plan. While scarcely reported in the U.S. media (despite cutting-edge news releases produced by my colleagues at the Institute for Public Accuracy beginning in early March of 2003), the revelations published by the Observer caused huge media coverage across much of the globe—and sparked outrage in several countries with seats on the Security Council.

“In the rest of the world, there was great interest in the fact that American intelligence agencies were interfering with their policies of their representatives in the Security Council,” Ellsberg noted. A result was that for some governments on the Security Council at the time, the leak “made it impossible for their representatives to support the U.S. wish to legitimize this clear case of aggression against Iraq. So the U.S. had to give up its plan to get a supporting vote in the UN.” The U.S. and British governments “went ahead anyway, but without the legitimating precedent of an aggressive war that would have had, I think, many consequences later.”

The BBC’s disgraceful attempt at a McCarthyist-style shaping of public perceptions and flouting of impartiality rules
by Kitty S. Jones

One source of media bias is a failure to include a perspective, viewpoint or information within a news story that might be objectively regarded as being important. This is important because exclusion of a particular viewpoint or opinion on a subject might be expected to shift the ‘Overton Window’, defining what it is politically acceptable to say. This can happen in such a way that a viewpoint becomes entirely eliminated or marginalised from political discourse. Within academic media theory, there is a line of reasoning that media influence on audiences is not immediate but occurs more through a continual process of repeated arguments – the ‘steady drip’ effect.

A second potential source of bias is ‘bias by selection’. This might entail particular issues or viewpoints being more frequently covered, or certain guests or organisations being more likely to be selected. There are several others, for some of which the BBC has regularly been criticised.

Herman and Chomsky (1988) proposed a propaganda model hypothesising systematic biases of media arising from structural economic causes. Their proposition is that media ownership by corporations (and, in other media, funding from advertising), the use of official sources, efforts to discredit independent media (“flak”), and “anti-communist” ideology act as the filters that bias news in favour of corporate and partisan political interests.

Politically biased messages may be conveyed via visual cues from the sets as a kind of underhanded reference-dependent framing. A frame defines the packaging of an element of rhetoric in such a way as to encourage certain interpretations and to discourage others. It entails the selection of some aspects of a perceived reality and makes them more salient in a communicating text, in such a way as to promote a particular problem definition, causal interpretation, moral evaluation and so on. […]

As Walter Lippmann once noted, the news media are a primary source of those “pictures in our heads” about the larger world of public affairs, a world that for most citizens is “out of reach, out of sight, out of mind.” What people know about the world is largely based on what the media decide to show them. More specifically, the result of this mediated view of the world is that the priorities of the media strongly influence the priorities of the public. Elements prominent on the media agenda become prominent in the public mind.

Given the reduction in sophistication and rationality in government rhetoric and in media news and current affairs presentation (and the reduction in democratic accountability, for that matter), we don’t currently have a climate that particularly encourages citizens to think critically and for themselves.

* * *

On a related topic, Kitty S. Jones has another piece she just put up. It is about how we are influenced. This is also a bit of an open secret. There have been some leaks about it, along with some investigative reporting. Yet the average American remains uninformed and unaware, or else doesn’t comprehend or take the threat seriously.

The fact of the matter is that a conspiracy to influence is as easy to accomplish as anything else. We are surrounded by highly effective conspiracies to influence us, many of which operate out in the open, and still we rarely notice them. They are the air we breathe in a cynical society such as this. They have become normalized, and so we have stopped being shocked by how far the powerful are willing to go.

It would be depressingly naive to dismiss this as the rantings of conspiracy theorists. This is one of the most central concerns for anyone who understands both what democracy is and why it matters. We either want a free society or we don’t. When we find ourselves being kept in the dark, the first thing to do is to turn on a light and look around to see what is going on, no matter how ugly the reality we see and no matter how uncomfortable it makes us feel.

Here is Jones’ most recent post:

Cambridge Analytica, the commodification of voter decision-making and marketisation of democracy
by Kitty S Jones

It’s not such a big inferential leap to conclude that governments are attempting to manage legitimate criticism and opposition while stage-managing our democracy.

I don’t differentiate a great deal between the behavioural insights team at the heart of the Conservative cabinet office, and the dark world of PR and ‘big data’ and ‘strategic communications’ companies like Cambridge Analytica. The political misuse of psychology has been disguised as some kind of technocratic “fix” for a failing neoliberal paradigm, and paraded as neutral “science”.

However, its role as an authoritarian prop for an ideological imposition on the population has always been apparent to some of us, because the bottom line is that it is all about influencing people’s perceptions and decisions, using psychological warfare strategies.

The Conservatives’ behaviour change agenda is designed to align citizens’ perceptions and behaviours with neoliberal ideology and the interests of the state. However, in democratic societies, governments are traditionally elected to reflect and meet public needs. The use of “behaviour change” policy involves the state acting upon individuals, and instructing them how they must be.

Last year, I wrote a detailed article about some of these issues, including discussion of Cambridge Analytica’s involvement in data mining and the political ‘dark’ advertising that is only seen by its intended recipients. This is a much greater cause for concern than “fake news” in the spread of misinformation, because it is invisible to everyone but the person being targeted. This means that the individually tailored messages are not open to public scrutiny, nor are they fact checked.

A further problem is that no-one is monitoring the impact of the tailored messages and the potential to cause harm to individuals. The dark adverts are designed to exploit people’s psychological vulnerabilities, using personality profiling, which is controversial in itself. Intentionally generating and manipulating fear and anxiety to influence political outcomes isn’t a new thing. Despots have been using fear and slightly less subtle types of citizen “behaviour change” programmes for a long time.

* * *

In Kitty S. Jones’ piece linked above, she refers to an article that makes clear the dangerous consequences of covert anti-democratic tactics. This is reminiscent of the bad ol’ days of COINTELPRO, one of the many other proven government conspiracies. If this doesn’t wake you up, you must be deep asleep.

How Covert Agents Infiltrate the Internet to Manipulate, Deceive, and Destroy Reputations
by Glenn Greenwald

One of the many pressing stories that remains to be told from the Snowden archive is how western intelligence agencies are attempting to manipulate and control online discourse with extreme tactics of deception and reputation-destruction. It’s time to tell a chunk of that story, complete with the relevant documents.

Over the last several weeks, I worked with NBC News to publish a series of articles about “dirty trick” tactics used by GCHQ’s previously secret unit, JTRIG (Joint Threat Research Intelligence Group). These were based on four classified GCHQ documents presented to the NSA and the other three partners in the English-speaking “Five Eyes” alliance. Today, we at the Intercept are publishing another new JTRIG document, in full, entitled “The Art of Deception: Training for Online Covert Operations.”

By publishing these stories one by one, our NBC reporting highlighted some of the key, discrete revelations: the monitoring of YouTube and Blogger, the targeting of Anonymous with the very same DDoS attacks they accuse “hacktivists” of using, the use of “honey traps” (luring people into compromising situations using sex) and destructive viruses. But, here, I want to focus and elaborate on the overarching point revealed by all of these documents: namely, that these agencies are attempting to control, infiltrate, manipulate, and warp online discourse, and in doing so, are compromising the integrity of the internet itself.

Among the core self-identified purposes of JTRIG are two tactics: (1) to inject all sorts of false material onto the internet in order to destroy the reputation of its targets; and (2) to use social sciences and other techniques to manipulate online discourse and activism to generate outcomes it considers desirable. To see how extremist these programs are, just consider the tactics they boast of using to achieve those ends: “false flag operations” (posting material to the internet and falsely attributing it to someone else), fake victim blog posts (pretending to be a victim of the individual whose reputation they want to destroy), and posting “negative information” on various forums. […]

These agencies’ refusal to “comment on intelligence matters” – meaning: talk at all about anything and everything they do – is precisely why whistleblowing is so urgent, the journalism that supports it so clearly in the public interest, and the increasingly unhinged attacks by these agencies so easy to understand. Claims that government agencies are infiltrating online communities and engaging in “false flag operations” to discredit targets are often dismissed as conspiracy theories, but these documents leave no doubt they are doing precisely that.

Whatever else is true, no government should be able to engage in these tactics: what justification is there for having government agencies target people – who have been charged with no crime – for reputation-destruction, infiltrate online political communities, and develop techniques for manipulating online discourse? But to allow those actions with no public knowledge or accountability is particularly unjustifiable.

Corporate Imperialism

Corporations have always been forms or aspects of government, agents and manifestations of state power. The earliest corporate charters were granted to colonial ventures that often were simultaneously governments and for-profit businesses, and they were operated accordingly — typically dependent on free stolen land and resources combined with a cheap workforce of impoverished immigrants, convict laborers, indentured servants, and slaves. That is the origin of modern capitalism.

By definition, a corporation is a political entity and institution, a creature of government. A corporate charter is a legal and political construction offering legal rights and privileges that are protected and enforced by official authority and, when necessary, violent force. In some cases, from the East India Company ruling India to the American Robber Barons ruling company towns, corporations have operated their own policing and employed their own goons. And as long as political reform or populist revolution doesn’t take them out of power, they eventually become fully functioning governments.

Essentially, a corporation is no different from a central bank, an alphabet soup agency, a political party, etc. In fact, many regulatory agencies are captured by and act on behalf of corporations, not on behalf of the people or their elected representatives. Even from the beginning, it was never clear whether corporations were entities beholden to governments or a new kind of governing body and political organization. The struggle between colonial corporations and the colonial empires was often about which elite held ultimate power, only later involving local populations attempting to seize power for self-governance. The American Revolution, for example, was as much a revolt against a corporation as it was against an empire.

We are living at a time when the majority (about two-thirds) of the largest economies in the world are transnational corporations. These corporations are not only seizing the power of governments or otherwise pulling the strings behind the scenes through bribery, blackmail, cronyism, and the like. Acting beyond the level of nation-states, they are creating something entirely new — a global network of corporate governance that lacks any and all democratic procedure, transparency, and accountability.

Once colonial imperialism asserted itself, what corporations would become was all but inevitable. The early ideology of corporatism had its origins in the Catholic Church, another vast transnational institution. But now corporations serve no master other than raw power, which is to say authoritarianism — national corporatocracy growing into an even more fearsome predator, a transnational inverted totalitarianism ruled by psychopaths, dominators, and narcissists.

As our new Lord and Savior Donald Trump demonstrates, a successful plutocrat and kleptocrat can declare bankruptcy numerous times over decades and still maintain his position of immense wealth while using that wealth to buy political influence and position (with decades of ties to foreign oligarchs and crime syndicates involving apparent money laundering, only now being investigated but probably with no real consequences). Before Trump, it was Ronald Reagan who went from radio sportscaster to Hollywood actor to corporate spokesperson to politician to the most powerful man in the world. But if not for a cult of media personality like that surrounding Reagan or Trump, we would instead be ruled by an internet tycoon like Jeff Bezos (with his ties to the CIA and Pentagon) or a tech tycoon like Peter Thiel (with his dreams of utopian technocracy) — the results would be similar, an ever-increasing accumulation of wealth and concentration of power.

Even more concerning are the powerful interests and dark money that operate behind the scenes, the Koch brothers and Mercers of the world, the most successful of them remaining hidden from public disclosure and news reporting. The emergent corporate imperialism isn’t limited to individuals but extends to crony networks of establishment power, political dynasties, and vast inherited wealth, along with lobbyist organizations, think tanks, front groups, big biz media, etc.

The money men (they are mostly men and, of course, white) are the celebrities and idols of the present corporatist world, in the way that people of past eras admired, worshipped, and bowed down to popes, monarchs, and aristocrats. This 21st-century ruling elite, including the puppet masters who keep the show going, is as untouchable as that of the ancien régime and in many ways more powerful, if more covert, than the East India Company, at least until a new revolutionary era comes. There isn’t much room for hope. In all these centuries of struggle between various ruling elites, democracy, for all its rhetoric, remains a dream yet to be made real, a promise yet to be fulfilled.

* * *

The East India Company: The original corporate raiders
by William Dalrymple

It seemed impossible that a single London corporation, however ruthless and aggressive, could have conquered an empire that was so magnificently strong, so confident in its own strength and brilliance and effortless sense of beauty.

Historians propose many reasons: the fracturing of Mughal India into tiny, competing states; the military edge that the industrial revolution had given the European powers. But perhaps most crucial was the support that the East India Company enjoyed from the British parliament. The relationship between them grew steadily more symbiotic throughout the 18th century. Returned nabobs like Clive used their wealth to buy both MPs and parliamentary seats – the famous Rotten Boroughs. In turn, parliament backed the company with state power: the ships and soldiers that were needed when the French and British East India Companies trained their guns on each other. […]

In September, the governor of India’s central bank, Raghuram Rajan, made a speech in Mumbai expressing his anxieties about corporate money eroding the integrity of parliament: “Even as our democracy and our economy have become more vibrant,” he said, “an important issue in the recent election was whether we had substituted the crony socialism of the past with crony capitalism, where the rich and the influential are alleged to have received land, natural resources and spectrum in return for payoffs to venal politicians. By killing transparency and competition, crony capitalism is harmful to free enterprise, and economic growth. And by substituting special interests for the public interest, it is harmful to democratic expression.”

His anxieties were remarkably like those expressed in Britain more than 200 years earlier, when the East India Company had become synonymous with ostentatious wealth and political corruption: “What is England now?” fumed the Whig litterateur Horace Walpole, “A sink of Indian wealth.” In 1767 the company bought off parliamentary opposition by donating £400,000 to the Crown in return for its continued right to govern Bengal. But the anger against it finally reached ignition point on 13 February 1788, at the impeachment, for looting and corruption, of Clive’s successor as governor of Bengal, Warren Hastings. It was the nearest the British ever got to putting the EIC on trial, and they did so with one of their greatest orators at the helm – Edmund Burke.

Burke, leading the prosecution, railed against the way the returned company “nabobs” (or “nobs”, both corruptions of the Urdu word “Nawab”) were buying parliamentary influence, not just by bribing MPs to vote for their interests, but by corruptly using their Indian plunder to bribe their way into parliamentary office: “To-day the Commons of Great Britain prosecutes the delinquents of India,” thundered Burke, referring to the returned nabobs. “Tomorrow these delinquents of India may be the Commons of Great Britain.”

Burke thus correctly identified what remains today one of the great anxieties of modern liberal democracies: the ability of a ruthless corporation corruptly to buy a legislature. And just as corporations now recruit retired politicians in order to exploit their establishment contacts and use their influence, so did the East India Company. So it was, for example, that Lord Cornwallis, the man who oversaw the loss of the American colonies to Washington, was recruited by the EIC to oversee its Indian territories. As one observer wrote: “Of all human conditions, perhaps the most brilliant and at the same time the most anomalous, is that of the Governor General of British India. A private English gentleman, and the servant of a joint-stock company, during the brief period of his government he is the deputed sovereign of the greatest empire in the world; the ruler of a hundred million men; while dependant kings and princes bow down to him with a deferential awe and submission. There is nothing in history analogous to this position …”

Hastings survived his impeachment, but parliament did finally remove the EIC from power following the great Indian Uprising of 1857, some 90 years after the granting of the Diwani and 60 years after Hastings’s own trial. On 10 May 1857, the EIC’s own security forces rose up against their employer and on successfully crushing the insurgency, after nine uncertain months, the company distinguished itself for a final time by hanging and murdering tens of thousands of suspected rebels in the bazaar towns that lined the Ganges – probably the most bloody episode in the entire history of British colonialism.

Enough was enough. The same parliament that had done so much to enable the EIC to rise to unprecedented power, finally gobbled up its own baby. The British state, alerted to the dangers posed by corporate greed and incompetence, successfully tamed history’s most voracious corporation. In 1859, it was again within the walls of Allahabad Fort that the governor general, Lord Canning, formally announced that the company’s Indian possessions would be nationalised and pass into the control of the British Crown. Queen Victoria, rather than the directors of the EIC would henceforth be ruler of India. […]

For the corporation – a revolutionary European invention contemporaneous with the beginnings of European colonialism, and which helped give Europe its competitive edge – has continued to thrive long after the collapse of European imperialism. When historians discuss the legacy of British colonialism in India, they usually mention democracy, the rule of law, railways, tea and cricket. Yet the idea of the joint-stock company is arguably one of Britain’s most important exports to India, and the one that has for better or worse changed South Asia as much any other European idea. Its influence certainly outweighs that of communism and Protestant Christianity, and possibly even that of democracy.

Companies and corporations now occupy the time and energy of more Indians than any institution other than the family. This should come as no surprise: as Ira Jackson, the former director of Harvard’s Centre for Business and Government, recently noted, corporations and their leaders have today “displaced politics and politicians as … the new high priests and oligarchs of our system”. Covertly, companies still govern the lives of a significant proportion of the human race.

The 300-year-old question of how to cope with the power and perils of large multinational corporations remains today without a clear answer: it is not clear how a nation state can adequately protect itself and its citizens from corporate excess. As the international subprime bubble and bank collapses of 2007-2009 have so recently demonstrated, just as corporations can shape the destiny of nations, they can also drag down their economies. In all, US and European banks lost more than $1tn on toxic assets from January 2007 to September 2009. What Burke feared the East India Company would do to England in 1772 actually happened to Iceland in 2008-11, when the systemic collapse of all three of the country’s major privately owned commercial banks brought the country to the brink of complete bankruptcy. A powerful corporation can still overwhelm or subvert a state every bit as effectively as the East India Company did in Bengal in 1765.

Corporate influence, with its fatal mix of power, money and unaccountability, is particularly potent and dangerous in frail states where corporations are insufficiently or ineffectually regulated, and where the purchasing power of a large company can outbid or overwhelm an underfunded government. This would seem to have been the case under the Congress government that ruled India until last year. Yet as we have seen in London, media organisations can still bend under the influence of corporations such as HSBC – while Sir Malcolm Rifkind’s boast about opening British embassies for the benefit of Chinese firms shows that the nexus between business and politics is as tight as it has ever been.

The East India Company no longer exists, and it has, thankfully, no exact modern equivalent. Walmart, which is the world’s largest corporation in revenue terms, does not number among its assets a fleet of nuclear submarines; neither Facebook nor Shell possesses regiments of infantry. Yet the East India Company – the first great multinational corporation, and the first to run amok – was the ultimate model for many of today’s joint-stock corporations. The most powerful among them do not need their own armies: they can rely on governments to protect their interests and bail them out. The East India Company remains history’s most terrifying warning about the potential for the abuse of corporate power – and the insidious means by which the interests of shareholders become those of the state. Three hundred and fifteen years after its founding, its story has never been more current.

 

“We forgot.”

When somebody asked Alexander Hamilton why the Framers hadn’t mentioned God in the Constitution, his answer was deadpan hilarious: “We forgot.”
~ Kurt Andersen

The 18th century captures the American imagination, for reasons that are obvious and less so. It was a pivotal point and many were aware of it at the time. Over the preceding centuries, Feudalism slowly declined for numerous reasons. The most obvious force of change was the enclosure movement that evicted peasants from their land, their homes, and their communities.

This created a teeming population of landless peasants who were homeless, unemployed, and often starving, sending waves of refugees toward the cities and later the colonies. It was a direct attack on the rights of commoners (what the American colonists referred to as the rights of Englishmen). With the loss of Feudalism came the loss of the Church’s traditional role and intimate participation in the daily lives of communities (see Dancing in the Streets by Barbara Ehrenreich). There was also the compounding impact of the Renaissance, Peasants’ Revolt, Reformation, English Civil War, Scientific Revolution, Enlightenment, and expanding colonial imperialism.

Yet, even as the early revolutionary era came to a close, much of that older world, or the immediate sense of its loss, was still fresh in living memory, at least for the older generations. Post-Reformation religious war went hand in hand with political and economic radicalism, with early signs of class war, populism, and communism showing up as Feudalism waned, from the Peasants’ Revolt to the English Civil War. Immediately preceding the American Revolution came the First Great Awakening, which kept the earlier radicalism alive while pushing it to further extremes. This was the initial motivation for the separation of church and state, since religious dissenters were being excluded and oppressed by Anglican state power.

Yet most Americans at the time weren’t formally religious. There were few ministers in the colonies, especially in rural areas. Americans had low rates of church attendance, with rates not increasing until the 19th century (see The Churching of America by Roger Finke and Rodney Stark). It was precisely this lack of formal religion that fed into a new rabid free-for-all where anyone’s religiosity was as good as another’s, where anyone could become a preacher and start their own sect or turn to whatever ideology they preferred, religious or anti-religious. This is how the influences of Reformation and Enlightenment melded together, creating a force greater than either alone.

Even so, the First Great Awakening didn’t directly impact many Americans. Those who heard the fiery preachers of the time were a small part of the population, although in certain cities it led to great tumult. The effect was uneven, with some places unaware that a change was happening. It was a slow buildup of unrest as the American colonies moved toward revolution. It wasn’t so much religion itself as broader cultural shifts. The radical religious were getting louder, but so were the radical irreligious. Both heresy and secularism became virulent, sometimes flowing together as a single force, but not always.

Also, none of it fit into clear class lines. The upper classes were filled with unitarians, universalists, deists, and secularists — this was seen in the founding generation but began to take hold earlier, as with Thomas Morton and Roger Williams. But some of the most heretical anti-Christians emerged from the working class, the most famous being Thomas Paine, along with several other influential figures. The growing rift was not so much between Christianity and atheism as between establishment power and the challenge of dissent. On either side of the divide, many voices that would otherwise have been antagonistic found themselves formed into a new alignment.

As with our present moment, the era preceding revolution was a struggle between the contented and the restless, with the former becoming more authoritarian and the latter more radicalized. That schism is a wound that has never healed. The American soul remains fractured. The caricature of culture war spectacle won’t save us. It’s not about religion. The American Founders didn’t forget about God. That wasn’t the issue that mattered then, nor is it the issue that matters now. Religiosity and heresy, even when they take center stage, are always expressions of or proxies for something else.

* * *

Fantasyland: How America Went Haywire:
A 500-Year History

by Kurt Andersen
pp. 56-59

Chapter 8
Meanwhile, in the Eighteenth-Century Reality-Based Community

THE TWENTY-FOUR-YEAR-OLD PHENOM GEORGE WHITEFIELD arrived in America for the first time just before All Saints’ Day, Halloween 1739. The first major stop on his all-colonies tour was Philadelphia. Crowds equal to half the inhabitants of the city gathered to see each performance. Among them was the not-so-religious young printer and publisher Benjamin Franklin.

Franklin was astonished by how Whitefield could “bring men to tears by pronouncing Mesopotamia, ” and “how much they admired and respected him, notwithstanding his common Abuse of them, by assuring them they were naturally half Beasts and half Devils.” The publisher introduced himself on the spot and signed up to print a four-volume set of Whitefield’s journals and sermons, which became an enormous bestseller. But Franklin’s only awakening during the Great Awakening was to the profits available by pandering to American religionists. Over the next three years, he published an evangelical book almost monthly. With Whitefield himself, Franklin wrote, he formed “no religious Connection.”

Franklin and his fellow Founders’ conceptions of God tended toward the vague and impersonal, a Creator who created and then got out of the way. The “enthusiasts” of the era—channelers of the Holy Spirit, elaborate decoders of the divine plan, proselytizers—were not their people. John Adams fretted in a letter to Jefferson that his son John Quincy might “retire…to study prophecies to the end of his life.” Adams wrote to a Dutch friend that the Bible consists of “millions of fables, tales, legends,” and that Christianity had “prostituted” all the arts “to the sordid and detestable purposes of superstition and fraud.” George Washington “is an unbeliever,” Jefferson once reckoned, and only “has divines constantly about him because he thinks it right to keep up appearances.” Jefferson himself kept up appearances by attending church but instructed his seventeen-year-old nephew to “question with boldness even the existence of a god; because, if there be one, he must more approve the homage of reason, than that of blindfolded fear.” He considered religions “all alike, founded upon fables and mythologies,” including “our particular superstition,” Christianity. One winter in the White House, President Jefferson performed an extraordinary act of revisionism: he cut up two copies of the New Testament, removing all references to miracles, including Christ’s resurrection, and called the reassembled result The Life and Morals of Jesus of Nazareth . “As to Jesus of Nazareth,” Franklin wrote just before he died, “I have…some doubts as to his Divinity; though it is a question I do not dogmatize upon…and I think it needless to busy myself with it now, when I expect soon an opportunity of knowing the truth with less trouble.”

When somebody asked Alexander Hamilton why the Framers hadn’t mentioned God in the Constitution, his answer was deadpan hilarious: “We forgot.”

Yet ordinary American people were apparently still much more religious than the English. In 1775 Edmund Burke warned his fellow members of Parliament that the X factor driving the incipient colonial rebellion was exactly that, the uppity Americans’ peculiar ultra-Protestant zeal. For them, Burke said, religion “is in no way worn out or impaired.”

Thus none of the Founders called himself an atheist. Yet by the standards of devout American Christians, then and certainly now, most were blasphemers. In other words, they were men of the Enlightenment, good-humored seculars who mainly chose reason and science to try to understand the nature of existence, the purposes of life, the shape of truth. Jefferson said Bacon, Locke, and Newton were “the three greatest men that have ever lived, without any exception.” Franklin, close friends with the Enlightenment philosophe Voltaire, * was called “the modern Prometheus” by the Enlightenment philosopher Immanuel Kant, and Adams was friends with the Enlightenment philosopher David Hume, whose 1748 essay “Of Miracles” was meant to be “an everlasting check to all kinds of superstitious delusion.” America’s political founders had far more in common with their European peers than with the superstar theologians barnstorming America to encourage superstitious delusion. “The motto of enlightenment,” Kant wrote the year after America won its war of independence, “is… Sapere aude! ” or Dare to know. “Have courage to use your own understanding!”

For three centuries, the Protestant Reformation and the emerging Enlightenment were strange bedfellows, symbiotically driving the radical idea of freedom of thought, each paving the way for the success of the other. Protestants decided they could reject the Vatican and start their own religion, and they continued rejecting the authority and doctrines of each new set of Protestant bosses and started their own new religions again and again. Enlightenment thinkers took freedom of thought a step further, deciding that people were also free to put supernatural belief and religious doctrine on the back burner or reject them altogether.

But the Enlightenment part of this shift in thinking was a double-edged sword. The Enlightenment liberated people to believe anything whatsoever about every aspect of existence—true, false, good, bad, sane, insane, plausible, implausible, brilliant, stupid, impossible. Its optimistic creators and enthusiasts ever since have assumed that in the long run, thanks to an efficient marketplace of ideas, reason would win. The Age of Reason had led to the Enlightenment, smart rationalists and empiricists were behind both, so…right?

No. “The familiar and often unquestioned claim that the Enlightenment was a movement concerned exclusively with enthralling reason over the passions and all other forms of human feeling or attachment, is…simply false,” writes the UCLA historian Anthony Pagden in The Enlightenment: And Why It Still Matters . “The Enlightenment was as much about rejecting the claims of reason and of rational choice as it was about upholding them.” The Enlightenment gave license to the freedom of all thought, in and outside religion, the absurd and untrue as well as the sensible and true. Especially in America. At the end of the 1700s, with the Enlightenment triumphant, science ascendant, and tolerance required, craziness was newly free to show itself. “Alchemy, astrology…occult Freemasonry, magnetic healing, prophetic visions, the conjuring of spirits, usually thought sidelined by natural scientists a hundred years earlier,” all revived, the Oxford historian Keith Thomas explains, their promoters and followers “implicitly following Kant’s injunction to think for themselves. It was only in an atmosphere of enlightened tolerance that such unorthodox cults could have been openly practiced.”

Kant himself saw the conundrum the Enlightenment faced. “Human reason,” he wrote in The Critique of Pure Reason, “has this peculiar fate, that in one species of its knowledge”—the spiritual, the existential, the meaning of life—“it is burdened by questions which…it is not able to ignore, but which…it is also not able to answer.” Americans had the peculiar fate of believing they could and must answer those religious questions the same way mathematicians and historians and natural philosophers answered theirs.

* “As long as there are fools and rascals,” Voltaire wrote in 1767, “there will be religions. [And Christianity] is assuredly the most ridiculous, the most absurd…religion

Political Right Rhetoric

The following is an accurate description of political rhetoric on the right, the labels and language as they are actually used (taken from a Twitter thread). It is by Matthew A. Sears, an Associate Professor of Classics and Ancient History at the University of New Brunswick.

1. “I’m neither a liberal nor a conservative.” = “I’m totally a conservative.”

2. “I’m a radical centrist.” = “I’m totally a conservative.”

3. “I’m a classical liberal.” = “I’m a neoliberal who’s never read any classical liberals.”

4. “I’m not usually a fan of X.” *Retweets and agrees with everything X says.*

5. “I’m a free speech absolutist.” = “I’m glad racists are now free to speak publicly.”

6. “I believe in confronting views one finds offensive.” *Whines about being bullied by lefties.*

7. “My views are in the minority and aren’t given a fair hearing.” *Buys the best-selling book in the world.*

8. “Where else would you rather live?” = “Canada is perfect for me, and it better not frigging change to be better for anyone else.”

9. “Nazis should be able to speak and given platforms so we can debate them.” *Loses mind if someone says ‘fuck’ to a Nazi.*

10. “The left has taken over everything.” *Trump is president and the Republicans control Congress.*

And, finally, the apex of Twitterspeak:

11. “The left are tyrants and have taken over everything and refuse to hear other perspectives and pose a dire threat to the republic and Western Civilization.” *Ben Shapiro has over a million followers.*

I’d say treat this thread as an Enigma Machine for Quillette-speak/viewpoint-diversity-speak/reverse-racism-speak/MRA-speak, but none of these chaps are enigmas.

I can’t believe I have to add this, but some are *outraged* by this thread: I don’t mind if you’re *actually* centrist or conservative. I just mind if you *pretend to be* left/centrist for rhetorical/media cred/flamewar purposes, while *only* taking conservative stances. Sheesh

Like, I’m pretty left-wing on many issues these days. It would be sneaky of me to identify as “conservative” or “classical liberal” or whatever only to dump on all their ideas and always support opposing ideas. A left-winger or centrist is what a left-winger or centrist tweets.

James Taoist added:

12. “I’m a strict Constitutionalist” = “I’m as racist as fuck.”

Hunger for Connection

“Just as there are mental states only possible in crowds, there are mental states only possible in privacy.”

Those are the words of Sarah Perry from Luxuriating in Privacy. I came across the quote from a David Chapman tweet. He then asks, “Loneliness epidemic—or a golden age of privacy?” With that lure, I couldn’t help but bite.

I’m already familiar with Sarah Perry’s writings at Ribbonfarm. There is even an earlier comment by me at the piece the quote comes from, although I had forgotten about it. In the post, she begins with links to some of her previous commentary, the first one (Ritual and the Consciousness Monoculture) having been my introduction to her work. I referenced it in my post Music and Dance on the Mind and it does indeed connect to the above thought on privacy.

In that other post, Perry discusses Keeping Together in Time by William H. McNeill. His central idea is “muscular bonding,” which creates, maintains, and expresses a visceral sense of group-feeling and fellow-feeling. This can happen through marching, dancing, rhythmic movement, drumming, chanting, choral singing, etc. (for example, see: Choral Singing and Self-Identity). McNeill quotes A. R. Radcliffe-Brown on the Andaman Islanders: “As the dancer loses himself in the dance, as he becomes absorbed in the unified community, he reaches a state of elation in which he feels himself filled with energy or force immediately beyond his ordinary state, and so finds himself able to perform prodigies of exertion” (Kindle Locations 125-126).

The individual is lost, at least temporarily, an experience humans are drawn to in many forms. Individuality is tiresome, and we moderns feel compelled to take a vacation from it. Having forgotten earlier ways of being, maybe privacy is the closest most of us get to lowering our stressful defenses of hyper-individualistic pose and performance. The problem is that privacy so easily reinforces the very individualistic isolation that drains us of energy.

This might create the addictive cycle that Johann Hari discussed in Chasing the Scream, and it would relate to the topic of depression in his most recent book, Lost Connections. He makes a strong argument about the importance of relationships of intimacy, bonding, and caring (some communities have begun to take this issue seriously; others deem that even higher levels of change are required, radical and revolutionary). In particular, the rat park research is fascinating. The problem with addiction is that it simultaneously relieves the pain of our isolation while further isolating us. Or at least this is what happens in a punitive society with weak community and a weak culture of trust. For that reason, we should look to other cultures for comparison. In some traditional societies, there is a greater balance and freedom to choose. I specifically had the Piraha in mind, as described by Daniel Everett.

The Piraha are a prime example of how not all cultures have a dualistic conflict between self and community, between privacy and performance. Their communities are loosely structured and the individual is largely autonomous in how and with whom they use their time. They lack much in the way of formal social structure, since there are no permanent positions of hierarchical authority (e.g., no tribal council of elders), although any given individual might temporarily take a leadership position in order to help accomplish an immediate task. Nor do they have much in the way of ritual or religion. It isn’t an oppressive society.

Accordingly, Everett observes how laid back, relaxed, and happy they seem. Depression, anxiety, and suicide appear foreign to them. When he told them about a depressed family member who killed herself, the Piraha laughed because they assumed he was joking. There was no known case of suicide in the tribe. Even more interesting is that, growing up, the Piraha don’t exhibit transitional periods such as the terrible twos or teenage rebelliousness. They simply go from being weaned to joining adult activities, with no one telling them what to do.

The modern perceived conflict between group and individual might not be a universal and intrinsic aspect of human society. But it does seem a major issue for WEIRD societies in particular. Maybe it has to do with how ego-bound our sense of identity is. The other thing the Piraha lack is a permanent, unchanging self-identity: an event such as a meeting with a spirit in the jungle might lead to a change of name and, to the Piraha, the person who went by the previous name is no longer there. They feel no need to defend their individuality because any given individual self can be set aside.

It is hard for Westerners, and Americans most of all, to imagine a society this different. It lies outside the mainstream capacity for imagining what is humanly possible. It’s similar to why so many people reject out of hand such theories as Julian Jaynes’ bicameral mind. Such worldviews simply don’t fit into what we know. But maybe this sense of conflict we cling to is entirely unnecessary. If so, why do we feel such conflict is inevitable? And why do we value privacy so highly? What is it that we seek from being isolated and alone? What is it that we think we have lost that needs to be regained? To help answer these questions, I’ll present a quote by Julian Jaynes that I included in Music and Dance on the Mind, taken from the book of his that Perry is familiar with:

“Another advantage of schizophrenia, perhaps evolutionary, is tirelessness. While a few schizophrenics complain of generalized fatigue, particularly in the early stages of the illness, most patients do not. In fact, they show less fatigue than normal persons and are capable of tremendous feats of endurance. They are not fatigued by examinations lasting many hours. They may move about day and night, or work endlessly without any sign of being tired. Catatonics may hold an awkward position for days that the reader could not hold for more than a few minutes. This suggests that much fatigue is a product of the subjective conscious mind, and that bicameral man, building the pyramids of Egypt, the ziggurats of Sumer, or the gigantic temples at Teotihuacan with only hand labor, could do so far more easily than could conscious self-reflective men.”

Considering that, it could be argued that privacy is part of the same social order, ideological paradigm, and reality tunnel that tires us out so much in the first place. Endlessly without respite, we feel socially compelled to perform our individuality. And even in retreating into privacy, we go on performing our individuality for our own private audience, as played out on the internalized stage of self-consciousness that Jaynes describes. That said, even though the cost is high, it leads to great benefits for society as a whole. Modern civilization wouldn’t be possible without it. The question is whether the costs outweigh the benefits and also whether the costs are sustainable or self-destructive in the long term.

As Eli wrote in the comments section to Luxuriating in Privacy: “Privacy isn’t an unalloyed good. As you mention, we are getting ever-increasing levels of privacy to “luxuriate” in. But who’s to say we’re not just coping with the change modernity constantly imposes on us? Why should we elevate the coping mechanism, when it may well be merely a means to lessen the pain of an unnecessarily “alienating” constructed environment.” And “isn’t the tiresomeness of having to model the social environment itself contingent on the structural precariousness of one’s place in an ambiguous, constantly changing status hierarchy?”

Still, I do understand where Perry is coming from, as I’m very much an introvert who values my alone time and can be quite jealous of my privacy, although I can’t say that close and regular social contact “fills me with horror.” Having lived alone for years in apartments where I barely knew my neighbors, I spend little time at my ‘home’ and instead choose to regularly socialize with my family at my parents’ house. Decades of depression have made me acutely aware of the double-edged sword of privacy.

Let me respond to some specifics of Perry’s argument. “Consider obesity,” she writes. “A stylized explanation for rising levels of overweight and obesity since the 1980s is this: people enjoy eating, and more people can afford to eat as much as they want to. In other words, wealth and plenty cause obesity.” There are some insightful comparisons to be made about eating practices. Not all modern societies with equal access to food have equal levels of obesity. Among many other health problems, obesity can result from stress, because our bodies prepare for challenging times by accumulating fat reserves. And if there is enough stress, studies have found that this is epigenetically passed on to children.

As a contrast, consider the French culture surrounding food. The French don’t eat much fast food and don’t tend to eat or drink on the go. It is more common for them to sit down to enjoy their coffee in the morning rather than putting it in a travel mug to drink on the way to work. They are also more likely to take long lunches in order to eat leisurely, and they typically do so with others. For the French, the expectation is that meals are to be enjoyed as a social experience, and they organize their entire society accordingly. Even though they eat many foods that some consider unhealthy, they don’t have the same high rates of stress-related diseases as Americans do.

An even greater contrast comes from looking once again at the Piraha. They live in an environment of immense abundance, and it requires little work to attain sustenance. In a few hours of work, an individual can get enough food to feed an extended family for multiple meals. They don’t worry about going hungry and yet, for various reasons, will choose not to eat for extended periods of time when they wish to spend their time in other ways, such as relaxing or dancing. They impose a feast-and-fast lifestyle on themselves, a typical pattern for hunter-gatherers. As with the French, when the Piraha have a meal, it is very much a social event. Unsurprisingly, the Piraha are slim and trim, muscular and healthy. They don’t suffer from stress-related physical and mental conditions, certainly not obesity.

Perry argues that, “Analogized to privacy, perhaps the explanation of atomization is simply that people enjoy privacy, and can finally afford to have as much as they want. Privacy is an economic good, and people show a great willingness to trade other goods for more privacy.” Using Johann Hari’s perspective, I might rephrase it: Addiction is economically profitable within the hyper-individualism of capitalist realism, and people show a difficult-to-control craving that causes them to pay high costs to feed their addiction. Sure, temporarily alleviating the symptoms makes people feel better. But what is it a symptom of? That question is key to understanding. I’m persuaded that the issue at hand is disconnection, isolation, and loneliness. So much else follows from that.

Explaining the title of her post, Perry writes: “One thing that people are said to do with privacy is to luxuriate in it. What are the determinants of this positive experience of privacy, of privacy experienced as a thing in itself, rather than through violation?” She goes on to describe the features of privacy, various forms of personal space and enclosure. Of course, Julian Jaynes argued that the ultimate privacy is the construction of individuality itself, the experience of space metaphorically internalized and interiorized. Further development of privacy, however, is a rather modern invention. For example, it wasn’t until recent centuries that private bedrooms became common, having been popularized in Anglo-American culture by Quakers. Before that, full privacy was a rare experience, far from being considered a human necessity or a human right.

But we have come to take privacy for granted; not talking about certain details is a central part of privacy itself. “Everybody knows that everybody poops. Still, you’re not supposed to poop in front of people. The domain of defecation is tacitly edited out of our interactions with other people: for most social purposes, we are expected to pretend that we neither produce nor dispose of bodily wastes, and to keep any evidence of such private. Polite social relations exclude parts of reality by tacit agreement; scatological humor is a reminder of common knowledge that is typically screened off by social agreement. Sex and masturbation are similar.”

Defecation is a great example. The privatization of the act of pooping is far from a universal experience. In earlier European history, relieving oneself in public was common and considered well within social norms. It was a slow ‘civilizing’ process to teach people to be ashamed of bodily functions, even simple things like farting and belching in public (there are a number of interesting books on the topic). I was intrigued by Susan P. Mattern’s The Prince of Medicine. She describes how almost everything in the ancient world was a social experience. Even taking a shit was an opportunity to meet and chat with one’s family, friends, and neighbors. They apparently felt no drain of energy from, or need to perform in, their social way of being in the world. It was relaxed and normal to them, simply how they lived, and they knew nothing else.

Also, sex and masturbation haven’t always been exclusively private acts. We have little knowledge of sex in the archaic world. Jaynes noted that sexuality wasn’t treated as anything particularly concerning or worrisome during the bicameral era. Obsession with sex, positive or negative, developed more fully during the Axial Age. As late as the feudal era, heavily Christianized Europe offered little opportunity for privacy and maintained a relatively open attitude toward sexuality during its many public celebrations, Carnival in particular, and people spent an amazing amount of their time in such celebrations. Barbara Ehrenreich describes this ecstatic communality in Dancing in the Streets. Like the Piraha, these earlier Europeans had a more social and fluid sense of identity.

Let me finish by responding to Perry’s conclusion: “As I wrote in A Bad Carver, social interaction has increasingly become “unbundled” from other things. This may not be a coincidence: it may be that people have specifically desired more privacy, and the great unbundling took place along that axis especially, in response to demand. Modern people have more room, more autonomy, more time alone, and fewer social constraints than their ancestors had a hundred years ago. To scoff at this luxury, to call it “alienation,” is to ignore that it is the choices of those who are allegedly alienated that create this privacy-friendly social order.”

There is no doubt about what people desire. In any given society, most people desire whatever they are acculturated to desire. Example after example of this can be found in social science research, the anthropological literature, and classical studies. It’s not obvious to me that there is any evidence that modern people have fewer social constraints. What is clear is that we have different social constraints, and that difference seems to have led to immense stress, anxiety, depression, and fatigue. Barbara Ehrenreich discusses the rise in depression in particular, as have others, such as Mark Fisher in his work on capitalist realism (I quote him and others here). The studies on WEIRD cultures are also telling (see: Urban Weirdness and Dark Triad Domination).

The issue isn’t simply what choices we make but what choices we are offered and denied, what choices we can and cannot perceive or even imagine. And that relates to how we lose contact with the human realities of other societies that embody possibilities not chosen or even considered within the constraints of our own. We are disconnected not only from others within our society. Our WEIRD monocultural dominance has also isolated us from other expressions of human potential.

We luxuriate in privacy because our society offers us few other choices, like a choice between junk food and starvation, in which case junk food tastes delicious. For most modern Westerners, privacy is nothing more than a temporary escape from an overwhelming world. But what we most deeply hunger for is genuine connection.