Bicameralism and Bilingualism

Eva Dunkel posted a paper on multilingualism in the Facebook group for The Origin of Consciousness in the Breakdown of the Bicameral Mind: “Consequences of multilingualism for neural architecture” by Sayuri Hayakawa and Viorica Marian. It is a great find. The authors look at how multiple languages are processed within the brain and how they can alter brain structure.

This probably also relates to the learning of music, art, and math — one might add that learning music improves one’s later ability to learn math. These are basically other kinds of languages. Music especially, given the existence of musical languages (along with whistle and hum languages), might indicate that language originated in music, not to mention the close relationship music has to dance, movement, and behavior, and its close relationship to group identity. The archaic authorization of command voices in the bicameral mind quite likely came in the form of music, and one could imagine the kinds of synchronized collective activities that could have dominated life and work in bicameral societies. There is something powerful about language that we tend to overlook and take for granted. Also, since language is so embedded in culture, monolinguals never see outside of the cultural reality tunnel they exist within. This could bring us to wonder about the role played in post-bicameral society by syncretic languages like English. We can’t forget the influence psychedelics might have had on language development and learning at different periods of human existence. And with psychedelics, there is the connection to shamanism with caves as aural spaces and locations of art, possibly the earliest origin of proto-writing.

There is no reason to give mathematics a mere secondary place in our considerations. Numeracy might be important as well in thinking about the bicameral mind specifically and certainly about the human mind in general (Caleb Everett, Numbers and the Making of Us), as numeracy was an advancement or complexification beyond the innumerate tribal societies (e.g., the Pirahã). Some of the earliest uses of writing were for calculations: accounting, taxation, astrology, etc. Bicameral societies, specifically the early city-states, can seem simplistic in many ways with their lack of complex hierarchies, large centralized governments, standing armies, police forces, or even basic infrastructure such as maintained roads and bridges. Yet they were capable of immense projects that required impressively high levels of planning, organizing, and coordination — as seen with the massive archaic pyramids and other structures built around the world. It’s strange how later empires in the Axial Age and beyond, though so much larger and more extensive with greater wealth and resources, rarely even attempted the seemingly impossible architectural feats of bicameral humans. Complex mathematical systems probably played a major role in the bicameral mind, as seen in how astrological calculations sometimes extended over millennia.

Hayakawa and Marian’s paper could add to the explanation of the breakdown of the bicameral mind. A central focus of their analysis is the increased executive function and neural integration in managing two linguistic inputs — I could see how that would relate to the development of egoic consciousness. It has been proposed that the first to develop Jaynesian consciousness may have been traders who were required to cross cultural boundaries and, of course, who would have been forced to learn multiple languages. As bicameral societies came into regular contact with more diverse linguistic cultures, their bicameral cognitive and social structures would have been increasingly stressed.

Multilingualism goes hand in hand with literacy. Rates of both have increased over the millennia. That would have been a major force in the post-bicameral Axial Age. The immense multiculturalism of societies like the Roman Empire is almost impossible for us to imagine. Hundreds of ethnicities, each with their own language, would co-exist in the same city and sometimes the same neighborhood. On a single street, there could be hundreds of shrines to diverse gods, with people praying, invoking, and chanting incantations in their separate languages. These individuals were suddenly forced to deal with complete strangers and to learn at least a basic understanding of foreign languages and hence foreign understandings.

This was simultaneous with the rise of literacy and its importance to society, an importance that has only grown over time as the rate of book reading continues to climb (more books are printed in a year these days than were produced in the first several millennia of writing). Still, it was only quite recently that the majority of the population became literate; following from that is the ability to read silently and its correlate of inner speech. Multilingualism is close behind and catching up. The consciousness revolution is still under way. I’m willing to bet American society will be transformed as we return to multilingualism as the norm, considering that in the first centuries of American history there was immense multilingualism (e.g., German was once one of the most widely spoken languages in North America).

All of this reminds me of linguistic relativity. I’ve pointed out that, though not explicitly stated, Jaynes obviously was referring to linguistic relativity in his own theorizing about language. He talked quite directly about the power language — and metaphors within language — had over thought, perception, behavior, and identity (Anke Snoek has some good insights about this in exploring the thought of Giorgio Agamben). This was an idea maybe first expressed by Wilhelm von Humboldt (On Language) in 1836: “Via the latter, qua character of a speech-sound, a pervasive analogy necessarily prevails in the same language; and since a like subjectivity also affects language in the same nation, there resides in every language a characteristic world-view.” And Humboldt even considered the power of learning another language in stating that, “To learn a foreign language should therefore be to acquire a new standpoint in the world-view hitherto possessed, and in fact to a certain extent is so, since every language contains the whole conceptual fabric and mode of presentation of a portion of mankind.”

Multilingualism is multiperspectivism, a core element of the modern mind and modern way of being in the world. Language has the power to transform us. To study language, to learn a new language, is to become something different. Each language is not only a separate worldview but locks into place a different sense of self, a persona. This would be true not only for learning different cultural languages but also for learning different professional languages with their respective sets of terminology; the modern world has diverse domains with their own ways of talking, and we modern humans have to deal with this complexity on a regular basis, whether we are talking about tax codes or dietary lingo.

It’s hard to know what that means for humanity’s trajectory across the millennia. But the more we are caught within linguistic worlds and are forced to navigate our way within them, the greater the need for a strong egoic individuality to self-initiate action, that is to say the self-authorization of Jaynesian consciousness. We step further back into our own internal space of meta-cognitive metaphor. To know more than one language strengthens an identity separate from any given language. The egoic self retreats behind its walls and looks out from its parapets. Language, rather than being the world we are immersed in, becomes the world we are trapped in (a world that is no longer home and from which we seek to escape, Philip K. Dick’s Black Iron Prison and William S. Burroughs’ Control). It closes in on us and forces us to become more adaptive to evade the constraints.

Boredom in the Mind: Liberals and Reactionaries

“Hobsbawm was obsessed with boredom; his experience of it appears at least twenty-seven times in Evans’s biography. Were it not for Marx, Hobsbawm tells us, in a book of essays, he never would “have developed any special interest in history.” The subject was too dull. The British writer Adam Phillips describes boredom as “that state of suspended anticipation in which things are started and nothing begins.” More than a wish for excitement, boredom contains a longing for narrative, for engagement that warrants attention to the world.

“A different biographer might have found in Hobsbawm’s boredom an opening onto an entire plane of the Communist experience. Marxism sought to render political desire as objective form, to make human intention a causal force in the world. Not since Machiavelli had political people thought so hard about the alignment of action and opportunity, about the disjuncture between public performance and private wish. Hobsbawm’s life and work are a case study in such questions.”

That is another great insight from Corey Robin, as written in his New Yorker piece, Eric Hobsbawm, the Communist Who Explained History. Boredom does seem key. It is one of the things that stood out to me in Robin’s writings about the reactionary mind. Reactionaries dislike, even fear, boredom more than almost anything else. The rhetoric of reactionaries is often to create the passionate excitement of melodrama, such as how Burke describes the treatment of the French queen.

The political left too often forgets the power of storytelling, especially simplistic and unoriginal storytelling, as seen with Trump. Instead, too many on the left fear the populist riling up of the masses. I remember Ralph Nader warning about this in a speech he gave in his 2000 presidential campaign. There is a leftist mistrust of passion and maybe there is good reason for this mistrust, considering it forms the heartbeat of the reactionary mind. Still, without passion, there is no power of persuasion and so all attempts are doomed from the start. The left will have to learn to fight on this turf or simply embrace full resignation and so fall into cynicism.

The thing is that those on the political left seem to have a higher tolerance for boredom, maybe related to their higher tolerance for cognitive dissonance shown in social science research. It requires greater uncertainty and stress to shut down the liberal-minded person (liberal in the psychological sense). I’ve noticed this in myself. I’m not prone to the reactionary mindset, maybe because I don’t get bored easily and so don’t need something coming from outside to motivate me.

But it might go beyond mere tolerance in demonstrating an active preference for boredom. There is something about the liberal mind that is prone to complexity, nuance, and ambiguity that can only be grown amidst boredom — that is to say the open-mindedness of curiosity, doubt, and questioning are only possible when one acknowledges ignorance. It’s much more exciting to proclaim truth, instead, and proclaim it with an entertaining story. This is problematic in seeking political victories, if one is afraid of the melodrama of hard fights. Right-wingers might burn themselves out on endless existential crises, whereas left-wingers typically never build up enough fire to lightly toast a marshmallow.

The political left doesn’t require or thrive on a dualistic vision of opposition and battle, in the way the political right does. This is a central strength and weakness for the left. On the side of weakness, this is why it is so hard for the left to offer a genuinely threatening challenge to the right. Most often what happens is that the reactionaries simply co-opt the left and the left too easily falls in line. See how many liberals will repeat reactionary rhetoric. Or notice how many on the political left turned full reactionary during times of conflict (e.g., the world war era).

Boredom being the comfort zone of liberals is all the more reason they should resist settling down within its confines. There is nowhere to hide from the quite real drama that is going on in the world. The liberal elite can’t forever maintain their delusion of being a disinterested aristocracy. As Eric Hobsbawm understood, and Karl Marx before him, only a leftist vision can offer a narrative that can compete against the reactionary mind.

* * *

“Capitalism is boring. Devoting your life to it, as conservatives do, is horrifying if only because it’s so repetitious. It’s like sex.”
~William F. Buckley Jr., in an interview with Corey Robin

Violent Fantasy of Reactionary Intellectuals

The last thing in the world a reactionary wants is to be bored, as happened with the ending of the ideological battles of the Cold War. They need a worthy enemy or else to invent one. Otherwise, there is nothing to react to and so nothing to get excited about, followed by a total loss of meaning and purpose, resulting in dreaded apathy and ennui. This leads reactionaries to become provocative, in the hope of provoking an opponent into a fight. Another strategy is simply to portray the whole world as a battleground, such that everything is interpreted as a potential attack, working oneself or one’s followers into a froth.

The Fantasy of Creative Destruction

To the reactionary mind, sacrifice of self can be as acceptable as sacrifice of others. It’s the fight, the struggle itself that gives meaning — no matter the costs and consequences, no matter how it ends. The greatest sin is boredom, the inevitable result of victory. As Irving Kristol said to Corey Robin, the defeat of the Soviet Union “deprived us of an enemy.” It was the end of history for, without an animating battle of moral imagination, it was the end of the world.

Sailors’ Rations, a High-Carb Diet

In the 18th-century British Navy, “Soldiers and sailors typically got one pound of bread a day,” in the form of hard tack, a hard biscuit. That is according to James Townsend. On top of that, some days they were given peas and on other days a porridge called burgoo. Elsewhere, Townsend shares some info from a 1796 memoir of the period — the author having written that, “every man and boy born on the books of any of his Majesty’s ships are allowed as following a pound of biscuit bread and a gallon of beer per day” (William Spavens, Memoirs of a Seafaring Life, p. 106). So, grains and more grains, in multiple forms, as foods and beverages.

About burgoo, it is a “ground oatmeal boiled up,” as described by Townsend. “Now you wouldn’t necessarily eat that all by itself. Early on, you were given to go with that salt beef fat. So the slush that came to the top when you’re boiling all your salt beef or salt pork. You get all that fat that goes up on top — they would scrape that off, they keep that and give it to you to go with your burgoo. But later on they said maybe that cause scurvy so they let you have some molasses instead.”

They really didn’t understand scurvy at the time. Animal foods, especially fat, would have had some vitamin C in them, whereas the oats and molasses had none. They made up for this deficiency later on by adding cabbage to the sailors’ diet, though not a great choice considering vegetables don’t store well on ships. I’d point out that the problem wasn’t a shortage of vitamin C relative to a healthy traditional diet, as they got meat four days a week and, even on the other meat-free banyan-days, they had some butter and cheese. That would have given them sufficient vitamin C for a low-carb diet, especially with seafood caught along the way.

A high-carb diet, however, is a whole other matter. The amount of carbs and sugar sailors ate daily was quite large. This came about with colonial trade that made grains cheap and widely available, along with the sudden access to sugar from distant sugarcane plantations. Glucose competes with the processing of vitamin C and so requires higher intake of the latter for basic health, specifically to avoid scurvy. A low-carb diet, on the other hand, can avoid scurvy with very little vitamin C since sufficient amounts are in animal foods. Also, a low-carb diet is less inflammatory and so this further decreases the need for antioxidants like vitamin C.

This is why the Inuit could eat few plants and immense amounts of meat and fat. They got more vitamin C on a regular basis from seal fat than they did from the meager plant foods they could gather in the short warm period of the far north. But with almost no carbohydrates in the traditional Inuit diet, the requirement for vitamin C was so low as to not be a problem. This is probably the same explanation for why Vikings and Polynesians could travel vast distances across the ocean without getting sick, as they were surely eating mostly fresh seafood and very little, if any, starchy food.

Unlike protein and fat, carbohydrate is not an essential macronutrient. Yes, carbohydrates provide glucose that the body needs in limited amounts, but through gluconeogenesis proteins can be turned into glucose on demand. So, a long sea voyage with zero carbs would never have been a problem.

Sailors in the colonial era ate all of those biscuits, porridge, and peas not because they offered any health value beyond mere survival but because they were cheap food. Those sailors weren’t being fed to have long, healthy lives, as labor was cheap and no one cared about them. As soon as a sailor was no longer useful, he would no longer be employed in that profession and he’d find himself among the impoverished masses. For all the health problems of a sailor’s diet, it was better than the alternative of starvation or near starvation that so many others faced.

Grain consumption had been increasing in late feudalism, but peasants still maintained a wider variety in their diet through foods they could hunt or gather, not to mention some fresh meat, fat, eggs, and dairy from animals they raised. That all began to change with the enclosure movement. The end of feudal village life and loss of the peasants’ commons was not a pretty picture and did not lead to happy results, as the landless peasants evicted from their homes flooded into the cities where most of them died. The economic desperation made for much cheap labor. Naval sailors with their guaranteed rations, in spite of the nutritional deficiencies, were comparatively lucky.

The Crisis of Identity

“Have we lived too fast?”
~Dr. Silas Weir Mitchell, 1871
Wear and Tear, or Hints for the Overworked

I’ve been following Scott Preston over at his blog, Chrysalis. He has been writing on the same set of issues for a long time now, longer than I’ve been reading his blog. He reads widely and so draws on many sources, most of which I’m not familiar with, part of the reason I appreciate the work he does to pull together such informed pieces. A recent post, A Brief History of Our Disintegration, would give you a good sense of his intellectual project, although the word ‘intellectual’ sounds rather paltry for what he is describing:

“Around the end of the 19th century (called the fin de siecle period), something uncanny began to emerge in the functioning of the modern mind, also called the “perspectival” or “the mental-rational structure of consciousness” (Jean Gebser). As usual, it first became evident in the arts — a portent of things to come, but most especially as a disintegration of the personality and character structure of Modern Man and mental-rational consciousness.”

That time period has been an interest of mine as well. There are two books that come to mind that I’ve mentioned before: Tom Lutz’s American Nervousness, 1903 and Jackson Lears’s Rebirth of a Nation (for a discussion of the latter, see: Juvenile Delinquents and Emasculated Males). Both talk about that turn-of-the-century crisis, the psychological projections and physical manifestations, the social movements and political actions. A major concern was neurasthenia which, according to the dominant economic paradigm, meant a deficit of ‘nervous energy’ or ‘nerve force’, the reserves of which, if wasted rather than reinvested wisely, would lead to physical and psychological bankruptcy, leaving one ‘spent’ (the term ‘neurasthenia’ was first used in 1829 and popularized by George Miller Beard in 1869, the same period when the related medical condition of ‘nostalgia’ began being diagnosed).

This was mixed up with sexuality in what Theodore Dreiser called the ‘spermatic economy’ (by the way, the catalogue for Sears, Roebuck and Company offered an electrical device to replenish nerve force that came with a genital attachment). Obsession with sexuality was used to reinforce gender roles in how neurasthenic patients were treated, following the practice of Dr. Silas Weir Mitchell, in that men were recommended to become more active (the ‘West cure’) and women more passive (the ‘rest cure’), although some women “used neurasthenia to challenge the status quo, rather than enforce it. They argued that traditional gender roles were causing women’s neurasthenia, and that housework was wasting their nervous energy. If they were allowed to do more useful work, they said, they’d be reinvesting and replenishing their energies, much as men were thought to do out in the wilderness” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). That feminist-style argument, as I recall, came up in advertisements for Bernarr Macfadden’s fitness protocol in the early 1900s, encouraging (presumably middle class) women to give up housework for exercise and so regain their vitality. Macfadden was also an advocate of living a fully sensuous life, going as far as free love.

Besides the gender wars, there was the ever-present bourgeois bigotry. Neurasthenia is the most civilized of the diseases of civilization since, in its original American conception, it was perceived as only afflicting middle-to-upper class whites, especially WASPs — as Lutz says, “if you were lower class, and you weren’t educated and you weren’t Anglo Saxon, you wouldn’t get neurasthenic because you just didn’t have what it took to be damaged by modernity” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast) and so, according to Lutz’s book, people would make “claims to sickness as claims to privilege.” It was considered a sign of progress, but over time it came to be seen by some as the greatest threat to civilization, in either case offering much material for popular fictionalized portrayals. Being sick in this fashion was proof that one was a modern individual, an exemplar of advanced civilization, if at immense cost — Julie Beck explains:

“The nature of this sickness was vague and all-encompassing. In his book Neurasthenic Nation, David Schuster, an associate professor of history at Indiana University-Purdue University Fort Wayne, outlines some of the possible symptoms of neurasthenia: headaches, muscle pain, weight loss, irritability, anxiety, impotence, depression, “a lack of ambition,” and both insomnia and lethargy. It was a bit of a grab bag of a diagnosis, a catch-all for nearly any kind of discomfort or unhappiness.

“This vagueness meant that the diagnosis was likely given to people suffering from a variety of mental and physical illnesses, as well as some people with no clinical conditions by modern standards, who were just dissatisfied or full of ennui. “It was really largely a quality-of-life issue,” Schuster says. “If you were feeling good and healthy, you were not neurasthenic, but if for some reason you were feeling run down, then you were neurasthenic.””

I’d point out how neurasthenia was seen as primarily caused by intellectual activity, as it became a descriptor of a common experience among the burgeoning middle class of often well-educated professionals and office workers. This relates to Weston A. Price’s work in the 1930s, as modern dietary changes first hit this demographic since they had the means to afford eating a fully industrialized Standard American Diet (SAD), long before others (within decades, though, SAD-caused malnourishment would wreck health at all levels of society). What this meant, in particular, was a diet high in processed carbs and sugar that coincided, because of Upton Sinclair’s 1906 The Jungle: Muckraking the Meat-Packing Industry, with the early-1900s decrease in consumption of meat and saturated fats. As Price demonstrated, this was a vast change from the traditional diet found all over the world, including in rural Europe (and presumably in rural America, with most Americans not urbanized until the turn of the last century), that always included significant amounts of nutritious animal foods loaded up with fat-soluble vitamins, not to mention lots of healthy fats and cholesterol.

Prior to talk of neurasthenia, the exhaustion model of health portrayed as waste and depletion had taken hold in Europe centuries earlier (e.g., anti-masturbation panics) and had its roots in the humoral theory of bodily fluids. It has long been understood that food, specifically the macronutrients (carbohydrate, protein, and fat), affects mood and behavior — see the early literature on melancholy. During feudalism, food laws were used as a means of social control, such that in one case meat was prohibited prior to Carnival because of its energizing effect, which it was thought could lead to rowdiness or even revolt (Ken Albala & Trudy Eden, Food and Faith in Christian Culture).

There does seem to be a connection between an increase of intellectual activity and an increase of carbohydrates and sugar, a connection first appearing during the early colonial era that set the stage for the Enlightenment. It was the agricultural mind taken to a whole new level. Indeed, a steady flow of glucose is one way to fuel extended periods of brain work, such as reading and writing for hours on end and late into the night — the reason college students to this day will down sugary drinks while studying. Because of trade networks, Enlightenment thinkers were buzzing on the suddenly much more available simple carbs and sugar, with an added boost from caffeine and nicotine. The modern intellectual mind was drugged-up right from the beginning, and over time it took its toll. Such dietary highs inevitably lead to ever greater crashes of mood and health. Interestingly, Dr. Silas Weir Mitchell, who advocated the ‘rest cure’ and ‘West cure’ in treating neurasthenia and other ailments, additionally used a “meat-rich diet” for his patients (Ann Stiles, Go rest, young man). Other doctors of that era were even more direct in using specifically low-carb diets for various health conditions, often for obesity, which was also a focus of Dr. Mitchell.

Still, it goes far beyond diet. There has been a diversity of stressors that have continued to amass over the centuries of tumultuous change. The exhaustion of modern man (and typically the focus has been on men) had been building up for generations upon generations before it came to feel like a world-shaking crisis with the new industrialized world. The lens of neurasthenia was an attempt to grapple with what had changed, but the focus was too narrow. With the plague of neurasthenia, the atomization of commercialized man and woman couldn’t hold together. And so there was a temptation toward nationalistic projects, including wars, to revitalize the ailing soul and to suture the gash of social division and disarray. But this further wrenched out of alignment the traditional order that had once held society together, and what was lost mostly went without recognition. The individual was brought into the foreground of public thought, a lone protagonist in a social Darwinian world. In this melodramatic narrative of struggle and self-assertion, many individuals didn’t fare so well and everything else suffered in the wake.

Tom Lutz writes that, “By 1903, neurasthenic language and representations of neurasthenia were everywhere: in magazine articles, fiction, poetry, medical journals and books, in scholarly journals and newspaper articles, in political rhetoric and religious discourse, and in advertisements for spas, cures, nostrums, and myriad other products in newspapers, magazines and mail-order catalogs” (American Nervousness, 1903, p. 2).

There was a sense of moral decline that was hard to grasp, although some people like Weston A. Price tried to dig down into concrete explanations of what had gone so wrong, the social and psychological changes observable during mass urbanization and industrialization. He was far from alone in his inquiries, having built on the prior observations of doctors, anthropologists, and missionaries. Other doctors and scientists were looking into the influences of diet in the mid-1800s and, by the 1880s, scientists were exploring a variety of biological theories. Their inability to pinpoint the cause maybe had more to do with their lack of a needed framework, as they touched upon numerous facets of biological functioning:

“Not surprisingly, laboratory experiments designed to uncover physiological changes in the nerve cell were inconclusive. European research on neurasthenics reported such findings as loss of elasticity of blood vessels, thickening of the cell wall, changes in the shape of nerve cells, or nerve cells that never advanced beyond an embryonic state. Another theory held that an overtaxed organism cannot keep up with metabolic requirements, leading to inadequate cell nutrition and waste excretion. The weakened cells cannot develop properly, while the resulting build-up of waste products effectively poisons the cells (so-called “autointoxication”). This theory was especially attractive because it seemed to explain the extreme diversity of neurasthenic symptoms: weakened or poisoned cells might affect the functioning of any organ in the body. Furthermore, “autointoxicants” could have a stimulatory effect, helping to account for the increased sensitivity and overexcitability characteristic of neurasthenics.” (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia)

This early scientific research could not lessen the mercurial sense of unease, as neurasthenia was from its inception a broad category that captured some greater shift in public mood, even as it so powerfully shaped the individual’s health. For all the effort, there were as many theories about neurasthenia as there were symptoms. Deeper insight was required. “[I]f a human being is a multiformity of mind, body, soul, and spirit,” writes Preston, “you don’t achieve wholeness or fulfillment by amputating or suppressing one or more of these aspects, but only by an effective integration of the four aspects.” But integration is easier said than done.

The modern human hasn’t been suffering from mere psychic wear and tear, for the individual body itself has been showing the signs of sickness, as the diseases of civilization have become harder and harder to ignore. On a societal level of human health, I’ve previously shared passages from Lears (see here) — he discusses the vitalist impulse that was the response to the turmoil, and vitalism often was explored in terms of physical health as the most apparent manifestation, although social and spiritual health were just as often spoken of in the same breath. The whole person was under assault by an accumulation of stressors and the increasingly isolated individual didn’t have the resources to fight them off.

By the way, this was far from being limited to America. Europeans picked up the discussion of neurasthenia and took it in other directions, often with less optimism about progress, but also some thinkers emphasizing social interpretations with specific blame on hyper-individualism (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia). Thoughts on neurasthenia became mixed up with earlier speculations on nostalgia and romanticized notions of rural life. More important, Russian thinkers in particular understood that the problems of modernity weren’t limited to the upper classes, instead extending across entire populations, as a result of how societies had been turned on their heads during that fractious century of revolutions.

In looking around, I came across some other interesting stuff. In the 1901 Nervous and Mental Diseases by Archibald Church and Frederick Peterson, the authors, in the chapter on “Mental Disease”, are keen to further the description, categorization, and labeling of ‘insanity’. And I noted their concern with physiological asymmetry, something shared later with Price, among many others going back to the prior century.

Maybe asymmetry was not only indicative of developmental issues but also symbolic of a deeper imbalance. The attempts at phrenological analysis of psychiatric, criminal, and anti-social behavior were off-base; and, despite the bigotry and proto-genetic determinism among racists using these kinds of ideas, there is a simple truth about health in relationship to physiological development, most easily observed in bone structure. But it would take many generations to understand the deeper scientific causes: nutrition (e.g., Price’s discovery of vitamin K2, what he called Activator X), along with parasites, toxins, and epigenetics. Church and Peterson did acknowledge that this went beyond mere individual or even familial issues: “It is probable that the intemperate use of alcohol and drugs, the spreading of syphilis, and the overstimulation in many directions of modern civilization have determined an increase difficult to estimate, but nevertheless palpable, of insanity in the present century as compared with past centuries.”

Also, there is the 1902 The Journal of Nervous and Mental Disease: Volume 29 edited by William G. Spiller. There is much discussion in there about how anxiety was observed, diagnosed, and treated at the time. Some of the case studies make for a fascinating read — check out: “Report of a Case of Epilepsy Presenting as Symptoms Night Terrors, Impellant Ideas, Complicated Automatisms, with Subsequent Development of Convulsive Motor Seizures and Psychical Aberration” by W. K. Walker. This reminds me of the case that influenced Sigmund Freud and Carl Jung, Daniel Paul Schreber’s 1903 Memoirs of My Nervous Illness.

Talk about “a disintegration of the personality and character structure of Modern Man and mental-rational consciousness,” as Scott Preston put it. He goes on to say that, “The individual is not a natural thing. There is an incoherency in Margaret Thatcher’s view of things when she infamously declared “there is no such thing as society” — that she saw only individuals and families, that is to say, atoms and molecules.” Her saying that really did capture the mood of the society she denied existing. Even the family was shrunk down to the ‘nuclear’. To state there is no society is to declare that there is also no extended family, no kinship, no community, that there is no larger human reality of any kind. Ironically, in this pseudo-libertarian sentiment, there is nothing holding the family together other than government laws imposing strict control of marriage and parenting, where common finances lock two individuals together under the rule of capitalist realism (the only larger realities involved are inhuman systems) — compare high-trust societies such as the Nordic countries, where the definition and practice of family life is less legalistic (Nordic Theory of Love and Individualism).

The individual consumer-citizen as a legal member of a family unit has to be created and then controlled, as it is a rather unstable atomized identity. “The idea of the “individual”,” Preston says, “has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” That is partly the reason for the heavy focus on the body, an attempt to make concrete the individual in order to hold together the splintered self — great analysis of this can be found in Lewis Hyde’s Trickster Makes This World: “an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap. Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman)” (see one of my posts about it: Lock Without a Key). Along with increasing authoritarianism, there was increasing medicalization and rationalization — to try to make sense of what was senseless.

A specific example of a change can be found in Dr. Frederick Hollick (1818-1900), who was a popular writer and speaker on medicine and health — his “links were to the free-thinking tradition, not to Christianity” (Helen Lefkowitz Horowitz, Rewriting Sex). With the influence of Mesmerism and animal magnetism, he studied and wrote about what was variously called, in more scientific-sounding terms, electrotherapeutics, galvanism, and electro-galvanism. Hollick was an English follower of the Scottish industrialist and socialist Robert Dale Owen, whom he literally followed to the United States, where Owen started the utopian community New Harmony, a Southern Indiana village bought from the utopian German Harmonists and then filled with brilliant and innovative minds but lacking in practical know-how about running a self-sustaining community (Abraham Lincoln, later a friend to the Owen family, recalled seeing as a boy the boat full of books heading to New Harmony).

“As had Owen before him, Hollick argued for the positive value of sexual feeling. Not only was it neither immoral nor injurious, it was the basis for morality and society. […] In many ways, Hollick was a sexual enthusiast” (Horowitz). These were the social circles of Abraham Lincoln, as he personally knew free-love advocates; that is why early Republicans were often referred to as “Red Republicans”, the ‘Red’ indicating radicalism as it still does to this day. Hollick wasn’t the first to be a sexual advocate nor, of course, would he be the last — preceding him were Sarah Grimke (1837, Equality of the Sexes) and Charles Knowlton (1839, The Private Companion of Young Married People), Hollick having been “a student of Knowlton’s work” (Debran Rowland, The Boundaries of Her Body); and following him were two more well known figures, the previously mentioned Bernarr Macfadden (1868-1955), who was the first major health and fitness guru, and Wilhelm Reich (1897–1957), who was the less respectable member of the trinity formed with Sigmund Freud and Carl Jung. Sexuality became a symbolic issue of politics and health, partly because of increasing scientific knowledge but also because of the increasing marketization of products such as birth control (with public discussion of contraceptives happening in the late 1700s and advances in contraceptive production in the early 1800s), the latter being quite significant as it meant individuals could control pregnancy, which was particularly relevant to women. It should be noted that Hollick promoted the ideal of female sexual autonomy, that sex should be assented to and enjoyed by both partners.

This growing concern with sexuality began with the growing middle class in the decades following the American Revolution. Among much else, it was related to the post-revolutionary focus on parenting and the perceived need for raising republican citizens — this formed an audience far beyond radical libertinism and free-love. Expert advice was needed for the new bourgeois family life, as part of the “civilizing process” that increasingly took hold at that time, with not only sexual manuals but also parenting guides, health pamphlets, books of manners, cookbooks, diet books, etc. — cut off from the roots of traditional community and kinship, the modern individual no longer trusted inherited wisdom and so needed to be taught how to live, how to behave and relate. Along with the rise of science, this situation promoted the role of the public intellectual that Hollick effectively took advantage of and, after the failure of Owen’s utopian experiment, he went on the lecture circuit, which brought on legal cases in the unsuccessful attempt to silence him, the kind of persecution that Reich also later endured.

To put it in perspective, this Antebellum era of public debate and public education on sexuality coincided with other changes. Following the revolutionary-era feminism (e.g., Mary Wollstonecraft), the “First Wave” of organized feminists emerged generations later with the Seneca Falls meeting in 1848 and, in that movement, there was a strong abolitionist impulse. This was part of the rise of ideological -isms in the North that so concerned the Southern aristocrats who wanted to maintain their hierarchical control of the entire country, the control they were quickly losing with the shift of power in the Federal government. A few years before that, in 1844, a more effective condom was developed using vulcanized rubber, although condoms had been on the market since the previous decade; also in the 1840s, the vaginal sponge became available. Interestingly, many feminists were as against contraceptives as they were against abortions. These were far from being mere practical issues, as politics imbued every aspect, and some feminists worried about how this might lessen the role of women and motherhood in society, if sexuality were divorced from pregnancy.

This was at a time when the abortion rate was sky-rocketing, indicating most women held other views. “Yet we also know that thousands of women were attending lectures in these years, lectures dealing, in part, with fertility control. And rates of abortion were escalating rapidly, especially, according to historian James Mohr, the rate for married women. Mohr estimates that in the period 1800-1830, perhaps one out of every twenty-five to thirty pregnancies was aborted. Between 1850 and 1860, he estimates, the ratio may have been one out of every five or six pregnancies. At mid-century, more than two hundred full-time abortionists reportedly worked in New York City” (Rickie Solinger, Pregnancy and Power, p. 61). In the unGodly and unChurched period of early America (“We forgot.”), organized religion was weak and “premarital sex was typical, many marriages following after pregnancy, but some people simply lived in sin. Single parents and ‘bastards’ were common” (A Vast Experiment). Early Americans, by today’s standards, were not good Christians — visiting Europeans often saw them as uncouth heathens and quite dangerous at that, as seen in the common American practice of toting around guns and knives, ever ready for a fight, whereas carrying weapons had been made illegal in England. In The Churching of America, Roger Finke and Rodney Stark write (pp. 25-26):

“Americans are burdened with more nostalgic illusions about the colonial era than about any other period in their history. Our conceptions of the time are dominated by a few powerful illustrations of Pilgrim scenes that most people over forty stared at year after year on classroom walls: the baptism of Pocahontas, the Pilgrims walking through the woods to church, and the first Thanksgiving. Had these classroom walls also been graced with colonial scenes of drunken revelry and barroom brawling, of women in risque ball-gowns, of gamblers and rakes, a better balance might have been struck. For the fact is that there never were all that many Puritans, even in New England, and non-Puritan behavior abounded. From 1761 through 1800 a third (33.7%) of all first births in New England occurred after less than nine months of marriage (D. S. Smith, 1985), despite harsh laws against fornication. Granted, some of these early births were simply premature and do not necessarily show that premarital intercourse had occurred, but offsetting this is the likelihood that not all women who engaged in premarital intercourse would have become pregnant. In any case, single women in New England during the colonial period were more likely to be sexually active than to belong to a church — in 1776 only about one out of five New Englanders had a religious affiliation. The lack of affiliation does not necessarily mean that most were irreligious (although some clearly were), but it does mean that their faith lacked public expression and organized influence.”

Though marriage remained important as an ideal in American culture, what changed was that procreative control became increasingly available — with fewer accidental pregnancies and more abortions, a powerful motivation for marriage disappeared. Unsurprisingly, at the same time, there were increasing worries about the breakdown of community and family, concerns that would turn into moral panic at various points. Antebellum America was in turmoil. This was concretely exemplified by the dropping birth rate that was already noticeable by mid-century (Timothy Crumrin, “Her Daily Concern:” Women’s Health Issues in Early 19th-Century Indiana) and was nearly halved from 1800 to 1900 (Debran Rowland, The Boundaries of Her Body). “The late 19th century and early 20th saw a huge increase in the country’s population (nearly 200 percent between 1860 and 1910) mostly due to immigration, and that population was becoming ever more urban as people moved to cities to seek their fortunes—including women, more of whom were getting college educations and jobs outside the home” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). It was a period of crisis, not all that different from our present crisis, including the fear about the low birth rate of native-born white Americans, especially the endangered species of WASPs, being overtaken by the supposed dirty hordes of blacks, ethnics, and immigrants.

The promotion of birth control was considered a genuine threat to American society, maybe to all of Western Civilization. It was most directly a threat to traditional gender roles. Women could better control when they got pregnant, a decisive factor in the phenomenon of larger numbers of women entering college and the workforce. And with an epidemic of neurasthenia, this dilemma was worsened by the crippling effeminacy that neutered masculine potency. Was modern man, specifically the white ruling elite, up for the task of carrying on Western Civilization?

“Indeed, civilization’s demands on men’s nerve force had left their bodies positively effeminate. According to Beard, neurasthenics had the organization of “women more than men.” They possessed ” a muscular system comparatively small and feeble.” Their dainty frames and feeble musculature lacked the masculine vigor and nervous reserves of even their most recent forefathers. “It is much less than a century ago, that a man who could not [drink] many bottles of wine was thought of as effeminate—but a fraction of a man.” No more. With their dwindling reserves of nerve force, civilized men were becoming increasingly susceptible to the weakest stimulants until now, “like babes, we find no safe retreat, save in chocolate and milk and water.” Sex was as debilitating as alcohol for neurasthenics. For most men, sex in moderation was a tonic. Yet civilized neurasthenics could become ill if they attempted intercourse even once every three months. As Beard put it, “there is not force enough left in them to reproduce the species or go through the process of reproducing the species.” Lacking even the force “to reproduce the species,” their manhood was clearly in jeopardy.” (Gail Bederman, Manliness and Civilization, pp. 87-88)

This led to a backlash that began before the Civil War with the early obscenity laws and abortion laws, but went into high gear with the 1873 Comstock laws that effectively shut down the free market of both ideas and products related to sexuality, including sex toys. This made it near impossible for most women to learn about birth control or obtain contraceptives and abortifacients. There was a felt need to restore order, and that meant the white male order of the WASP middle-to-upper classes, especially with the end of slavery, mass immigration of ethnics, urbanization, and industrialization. The crisis wasn’t only ideological or political. The entire world had been falling apart for centuries with the ending of feudalism and the ancien regime, the last remnants of which in America were maintained through slavery. Motherhood being the backbone of civilization, it was believed that women’s sexuality had to be controlled and, unlike so much else that was out of control, it actually could be controlled through enforcement of laws.

Outlawing abortions is a particularly interesting example of social control. Even with laws in place, abortions remained commonly practiced by local doctors, even in many rural areas (American Christianity: History, Politics, & Social Issues). Corey Robin argues that the strategy hasn’t been to deny women’s agency but to assert their subordination (Denying the Agency of the Subordinate Class). This is why abortion laws were designed to target male doctors rather than their female patients, although in practice they rarely did even that. Everything comes down to agency or its lack or loss, but our entire sense of agency is out of accord with our own human nature. We seek to control what is outside of us because our own sense of self is out of control. The legalistic worldview is inherently authoritarian, at the heart of what Julian Jaynes proposes as the post-bicameral project of consciousness, the contained self. But the container is weak and keeps leaking all over the place.

To bring it back to the original inspiration, Scott Preston wrote: “Quite obviously, our picture of the human being as an indivisible unit or monad of existence was quite wrong-headed, and is not adequate for the generation and re-generation of whole human beings. Our self-portrait or self-understanding of “human nature” was deficient and serves now only to produce and reproduce human caricatures. Many of us now understand that the authentic process of individuation hasn’t much in common at all with individualism and the supremacy of the self-interest.” The failure we face is that of identity, of our way of being in the world. As with neurasthenia in the past, we are now in a crisis of anxiety and depression, along with yet another moral panic about the declining white race. So, we get the likes of Steve Bannon, Donald Trump, and Jordan Peterson. We failed to resolve past conflicts and so they keep re-emerging.

“In retrospect, the omens of an impending crisis and disintegration of the individual were rather obvious,” Preston points out. “So, what we face today as “the crisis of identity” and the cognitive dissonance of “the New Normal” is not something really new — it’s an intensification of that disintegrative process that has been underway for over four generations now. It has now become acute. This is the paradox. The idea of the “individual” has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” We never were individuals. It was just a story we told ourselves, but there are others that could be told. Scott Preston offers an alternative narrative, that of individuation.

* * *

I found some potentially interesting books while skimming material on Google Books, in researching Frederick Hollick and related topics. Among the titles below, I’ll share some text from one of them because it offers a good summary about sexuality at the time, specifically women’s sexuality. Obviously, it went far beyond sexuality itself, and going by my own theorizing I’d say it is yet another example of symbolic conflation, considering its direct relationship to abortion.

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland
p. 34

WOMEN AND THE WOMB: The Emerging Birth Control Debate

The twentieth century dawned in America on a falling white birth rate. In 1800, an average of seven children were born to each “American-born white wife,” historians report. 29 By 1900, that number had fallen to roughly half. 30 Though there may have been several factors, some historians suggest that this decline—occurring as it did among young white women—may have been due to the use of contraceptives or abstinence, though few talked openly about it. 31

“In spite of all the rhetoric against birth control, the birthrate plummeted in the late nineteenth century in America and Western Europe (as it had in France the century before); family size was halved by the time of World War I,” notes Shari Thurer in The Myth of Motherhood. 32

As issues go, the “plummeting birthrate” among whites was a powder keg, sparking outcry as the “failure” of the privileged class to have children was contrasted with the “failure” of poor immigrants and minorities to control the number of children they were having. Criticism was loud and rampant. “The upper classes started the trend, and by the 1880s the swarms of ragged children produced by the poor were regarded by the bourgeoisie, so Emile Zola’s novels inform us, as evidence of the lower order’s ignorance and brutality,” Thurer notes. 33

But the seeds of this then-still nearly invisible movement had been planted much earlier. In the late 1700s, British political theorists began disseminating information on contraceptives as concerns of overpopulation grew among some classes. 34 Despite the separation of an ocean, by the 1820s, this information was “seeping” into the United States.

“Before the introduction of the Comstock laws, contraceptive devices were openly advertised in newspapers, tabloids, pamphlets, and health magazines,” Yalom notes. “Condoms had become increasingly popular since the 1830s, when vulcanized rubber (the invention of Charles Goodyear) began to replace the earlier sheepskin models.” 35 Vaginal sponges also grew in popularity during the 1840s, as women traded letters and advice on contraceptives. 36 Of course, prosecutions under the Comstock Act went a long way toward chilling public discussion.

Though Margaret Sanger’s is often the first name associated with the dissemination of information on contraceptives in the early United States, in fact, a woman named Sarah Grimke preceded her by several decades. In 1837, Grimke published the Letters on the Equality of the Sexes, a pamphlet containing advice about sex, physiology, and the prevention of pregnancy. 37

Two years later, Charles Knowlton published The Private Companion of Young Married People, becoming the first physician in America to do so. 38 Near this time, Frederick Hollick, a student of Knowlton’s work, “popularized” the rhythm method and douching. And by the 1850s, a variety of material was being published providing men and women with information on the prevention of pregnancy. And the advances weren’t limited to paper.

“In 1846, a diaphragm-like article called The Wife’s Protector was patented in the United States,” according to Marilyn Yalom. 39 “By the 1850s dozens of patents for rubber pessaries ‘inflated to hold them in place’ were listed in the U.S. Patent Office records,” Janet Farrell Brodie reports in Contraception and Abortion in 19th Century America. 40 And, although many of these early devices were often more medical than prophylactic, by 1864 advertisements had begun to appear for “an India-rubber contrivance” similar in function and concept to the diaphragms of today. 41

“[B]y the 1860s and 1870s, a wide assortment of pessaries (vaginal rubber caps) could be purchased at two to six dollars each,” says Yalom. 42 And by 1860, following publication of James Ashton’s Book of Nature, the five most popular ways of avoiding pregnancy—“withdrawal, and the rhythm methods”—had become part of the public discussion. 43 But this early contraceptives movement in America would prove a victim of its own success. The openness and frank talk that characterized it would run afoul of the burgeoning “purity movement.”

“During the second half of the nineteenth century, American and European purity activists, determined to control other people’s sexuality, railed against male vice, prostitution, the spread of venereal disease, and the risks run by a chaste wife in the arms of a dissolute husband,” says Yalom. “They agitated against the availability of contraception under the assumption that such devices, because of their association with prostitution, would sully the home.” 44

Anthony Comstock, a “fanatical figure,” some historians suggest, was a charismatic “purist,” who, along with others in the movement, “acted like medieval Christians engaged in a holy war,” Yalom says. 45 It was a successful crusade. “Comstock’s dogged efforts resulted in the 1873 law passed by Congress that barred use of the postal system for the distribution of any ‘article or thing designed or intended for the prevention of conception or procuring of abortion’,” Yalom notes.

Comstock’s zeal would also lead to his appointment as a special agent of the United States Post Office with the authority to track and destroy “illegal” mailings, i.e., mail deemed to be “obscene” or in violation of the Comstock Act. Until his death in 1915, Comstock is said to have been energetic in his pursuit of offenders, among them Dr. Edward Bliss Foote, whose articles on contraceptive devices and methods were widely published. 46 Foote was indicted in January of 1876 for dissemination of contraceptive information. He was tried, found guilty, and fined $3,000. Though donations of more than $300 were made to help defray costs, Foote was reportedly more cautious after the trial. 47 That “caution” spread to others, some historians suggest.

Disorderly Conduct: Visions of Gender in Victorian America
By Carroll Smith-Rosenberg

Riotous Flesh: Women, Physiology, and the Solitary Vice in Nineteenth-Century America
by April R. Haynes

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland

Rereading Sex: Battles Over Sexual Knowledge and Suppression in Nineteenth-century America
by Helen Lefkowitz Horowitz

Rewriting Sex: Sexual Knowledge in Antebellum America, A Brief History with Documents
by Helen Lefkowitz Horowitz

Imperiled Innocents: Anthony Comstock and Family Reproduction in Victorian America
by Nicola Kay Beisel

Against Obscenity: Reform and the Politics of Womanhood in America, 1873–1935
by Leigh Ann Wheeler

Purity in Print: Book Censorship in America from the Gilded Age to the Computer Age
by Paul S. Boyer

American Sexual Histories
edited by Elizabeth Reis

Wash and Be Healed: The Water-Cure Movement and Women’s Health
by Susan Cayleff

From Eve to Evolution: Darwin, Science, and Women’s Rights in Gilded Age America
by Kimberly A. Hamlin

Manliness and Civilization: A Cultural History of Gender and Race in the United States, 1880-1917
by Gail Bederman

One Nation Under Stress: The Trouble with Stress as an Idea
by Dana Becker

Moralizing Gods as Effect, Not Cause

There is a new study on moralizing gods and social complexity, specifically as populations grow large. The authors are critical of the Axial Age theory: “Although our results do not support the view that moralizing gods were necessary for the rise of complex societies, they also do not support a leading alternative hypothesis that moralizing gods only emerged as a byproduct of a sudden increase in affluence during a first millennium ‘Axial Age’. Instead, in three of our regions (Egypt, Mesopotamia and Anatolia), moralizing gods appeared before 1500.”

I don’t take this criticism as too significant, since it is mostly an issue of dating. Objectively, there are no such things as distinct historical periods. Sure, you’ll find precursors of the Axial Age in the late Bronze Age. Then again, you’ll find precursors of the Renaissance and Protestant Reformation in the Axial Age. And you’ll find the precursors of the Enlightenment in the Renaissance and Protestant Reformation. It turns out all of history is continuous. No big shocker there. Changes build up slowly, until they hit a breaking point. It’s that breaking point, often when it becomes widespread, that gets designated as the new historical period. But the dividing line from one era to the next is always somewhat arbitrary.

This is important to keep in mind. And it does have more than slight relevance. This reframing of what has been called the Axial Age accords perfectly with Julian Jaynes’ theories on the ending of the bicameral mind and the rise of egoic consciousness, along with the rise of the egoic gods with their jealousies, vengeance, and so forth. A half century ago, Jaynes was noting that aspects of moralizing social orders were appearing in the late Bronze Age and he speculated that it had to do with increasing complexity that set those societies up for collapse.

Religion itself, as a formal distinct institution with standardized practices, didn’t exist until well into the Axial Age. Before that, rituals and spiritual/supernatural experience were apparently inseparable from everyday life, as the archaic self was inseparable from the communal sense of the world. Religion as we now know it is what replaced that prior way of being in relationship to ‘gods’, but it wasn’t only a different sense of the divine, for the texts refer to early people hearing the voices of spirits, godmen, dead kings, and ancestors. Religion was only necessary, according to Jaynes, when the voices went silent (i.e., when they were no longer heard externally because a singular voice had become internalized). The pre-religious mentality is what Jaynes called the bicameral mind, and it represents the earliest and largest portion of civilization, maybe lasting for millennia upon millennia going back to the first city-states.

The pressures on the bicameral mind began to stress the social order beyond what could be managed. Those late Bronze Age civilizations had barely begun to adapt to that complexity and weren’t successful. Only Egypt was left standing and, in its sudden isolation amidst a world of wreckage and refugees, it too was transformed. We speak of the Axial Age in the context of a later date because it took many centuries for empires to be rebuilt around moralizing religions (and other totalizing systems and often totalitarian institutions; e.g., large centralized governments with rigid hierarchies). The archaic civilizations had to be mostly razed to the ground before something else could more fully take their place.

There is something else to understand. To have moralizing big gods to maintain social order, what is required is introspectable subjectivity (i.e., an individual to be controlled by morality). That is to say, you need a narratizing inner space where a conscience can operate in the voicing of morality tales and the imagining of narratized scenarios such as considering alternate possible future actions, paths, and consequences. This is what Jaynes was arguing, and it wasn’t vague speculation, as he was working with the best evidence he could accrue. Building on Jaynes’ work with language, Brian J. McVeigh has analyzed early texts to determine how often mind-words were found. Going by language use during the late Bronze Age, there was an increased focus on psychological ways of speaking. Prior to that, morality as such wasn’t necessary, no more than were written laws, court systems, police forces, and standing armies — all of which appeared rather late in civilization.

What creates the introspectable subjectivity of the egoic self, i.e., Jaynesian ‘consciousness’? Jaynes suggests that writing was a prerequisite and it needed to be advanced beyond the stage of simple record-keeping. A literary canon likely developed first to prime the mind for a particular form of narratizing. The authors of the paper do note that written language generally came first:

“This megasociety threshold does not seem to correspond to the point at which societies develop writing, which might have suggested that moralizing gods were present earlier but were not preserved archaeologically. Although we cannot rule out this possibility, the fact that written records preceded the development of moralizing gods in 9 out of the 12 regions analysed (by an average period of 400 years; Supplementary Table 2)—combined with the fact that evidence for moralizing gods is lacking in the majority of non-literate societies — suggests that such beliefs were not widespread before the invention of writing. The few small-scale societies that did display precolonial evidence of moralizing gods came from regions that had previously been used to support the claim that moralizing gods contributed to the rise of social complexity (Austronesia and Iceland), which suggests that such regions are the exception rather than the rule.”

As for the exceptions, it’s possible they were influenced by the moralizing religions of societies they came in contact with. Scandinavians, long before they developed complex societies with large concentrated populations, were traveling and trading all over Eurasia, the Levant, and into North Africa. This was happening in the Bronze Age, during the period of rising big gods and moralizing religion: “The analysis showed that the blue beads buried with the [Nordic] women turned out to have originated from the same glass workshop in Amarna that adorned King Tutankhamun at his funeral in 1323 BCE. King Tut’s golden deathmask contains stripes of blue glass in the headdress, as well as in the inlay of his false beard.” (Philippe Bohstrom, Beads Found in 3,400-year-old Nordic Graves Were Made by King Tut’s Glassmaker). It would be best to not fall prey to notions of untouched primitives.

We can’t assume that these exceptions were actually exceptional, in supposedly being isolated examples contrary to the larger pattern. Even hunter-gatherers have been heavily shaped by the millennia of civilizations that surrounded them. Occasionally finding moralizing religions among simpler and smaller societies is no more remarkable than finding metal axes and t-shirts among tribal people today. All societies respond to changing conditions and adapt as necessary to survive. The appearance of moralizing religions and the empires that went with them transformed the world far beyond the borders of any given society, not that borders were all that well defined back then anyway. The large-scale consequences spread across the earth these past three millennia, a tidal wave hitting some places sooner than others but in the end leaving none untouched. We are all now under the watchful eye of big gods or else their secularized equivalent, big brother of the surveillance state.

* * *

Moralizing gods appear after, not before, the rise of social complexity, new research suggests
by Redazione Redazione

Professor Whitehouse said: ‘The original function of moralizing gods in world history may have been to hold together large but rather fragile, ethnically diverse societies. It raises the question as to how some of those functions could still be performed in today’s increasingly secular societies – and what the costs might be if they can’t. Even if world history cannot tell us how to live our lives, it could provide a more reliable way of estimating the probabilities of different futures.’

When Ancient Societies Hit a Million People, Vengeful Gods Appeared
by Charles Q. Choi

“For we know Him who said, ‘And I will execute great vengeance upon them with furious rebukes; and they shall know that I am the Lord, when I shall lay my vengeance upon them.'” Ezekiel 25:17.

The God depicted in the Old Testament may sometimes seem wrathful. And in that, he’s not alone; supernatural forces that punish evil play a central role in many modern religions.

But which came first: complex societies or the belief in a punishing god? […]

The researchers found that belief in moralizing gods usually followed increases in social complexity, generally appearing after the emergence of civilizations with populations of more than about 1 million people.

“It was particularly striking how consistent it was [that] this phenomenon emerged at the million-person level,” Savage said. “First, you get big societies, and these beliefs then come.”

All in all, “our research suggests that religion is playing a functional role throughout world history, helping stabilize societies and people cooperate overall,” Savage said. “In really small societies, like very small groups of hunter-gatherers, everyone knows everyone else, and everyone’s keeping an eye on everyone else to make sure they’re behaving well. Bigger societies are more anonymous, so you might not know who to trust.”

At those sizes, you see the rise of beliefs in an all-powerful, supernatural person watching and keeping things under control, Savage added.

Complex societies gave birth to big gods, not the other way around: study
from Complexity Science Hub Vienna

“It has been a debate for centuries why humans, unlike other animals, cooperate in large groups of genetically unrelated individuals,” says Seshat director and co-author Peter Turchin from the University of Connecticut and the Complexity Science Hub Vienna. Factors such as agriculture, warfare, or religion have been proposed as main driving forces.

One prominent theory, the big or moralizing gods hypothesis, assumes that religious beliefs were key. According to this theory, people are more likely to cooperate fairly if they believe in gods who will punish them if they don’t. “To our surprise, our data strongly contradict this hypothesis,” says lead author Harvey Whitehouse. “In almost every world region for which we have data, moralizing gods tended to follow, not precede, increases in social complexity.” Even more so, standardized rituals tended on average to appear hundreds of years before gods who cared about human morality.

Such rituals create a collective identity and feelings of belonging that act as social glue, making people behave more cooperatively. “Our results suggest that collective identities are more important to facilitate cooperation in societies than religious beliefs,” says Harvey Whitehouse.

Society Creates God, God Does Not Create Society
by Razib Khan

What’s striking is how soon moralizing gods show up after the spike in social complexity.

In the ancient world, early Christian writers explicitly asserted that it was not a coincidence that their savior arrived with the rise of the Roman Empire. They contended that a universal religion, Christianity, required a universal empire, Rome. There are two ways you can look at this. First, that the causal arrow is such that social complexity leads to moralizing gods, and that’s that. The former is a necessary condition for the latter. Second, one could suggest that moralizing gods are a cultural adaptation to large complex societies, one of many, that dampen instability and allow for the persistence of those societies. That is, social complexity leads to moralistic gods, who maintain and sustain social complexity. To be frank, I suspect the answer will be closer to the second. But we’ll see.

Another result that was not anticipated I suspect is that ritual religion emerged before moralizing gods. In other words, instead of “Big Gods,” it might be “Big Rules.” With hindsight, I don’t think this is coincidental since cohesive generalizable rules are probably essential for social complexity and winning in inter-group competition. It’s not a surprise that legal codes emerge first in Mesopotamia, where you had the world’s first anonymous urban societies. And rituals lend themselves to mass social movements in public to bind groups. I think it will turn out that moralizing gods were grafted on top of these general rulesets, which allow for coordination, cooperation, and cohesion, so as to increase their import and solidify their necessity due to the connection with supernatural agents, which personalize the sets of rules from on high.

Complex societies precede moralizing gods throughout world history
by Harvey Whitehouse, Pieter François, Patrick E. Savage, Thomas E. Currie, Kevin C. Feeney, Enrico Cioni, Rosalind Purcell, Robert M. Ross, Jennifer Larson, John Baines, Barend ter Haar, Alan Covey, and Peter Turchin

The origins of religion and of complex societies represent evolutionary puzzles1–8. The ‘moralizing gods’ hypothesis offers a solution to both puzzles by proposing that belief in morally concerned supernatural agents culturally evolved to facilitate cooperation among strangers in large-scale societies9–13. Although previous research has suggested an association between the presence of moralizing gods and social complexity3,6,7,9–18, the relationship between the two is disputed9–13,19–24, and attempts to establish causality have been hampered by limitations in the availability of detailed global longitudinal data. To overcome these limitations, here we systematically coded records from 414 societies that span the past 10,000 years from 30 regions around the world, using 51 measures of social complexity and 4 measures of supernatural enforcement of morality. Our analyses not only confirm the association between moralizing gods and social complexity, but also reveal that moralizing gods follow—rather than precede—large increases in social complexity. Contrary to previous predictions9,12,16,18, powerful moralizing ‘big gods’ and prosocial supernatural punishment tend to appear only after the emergence of ‘megasocieties’ with populations of more than around one million people. Moralizing gods are not a prerequisite for the evolution of social complexity, but they may help to sustain and expand complex multi-ethnic empires after they have become established. By contrast, rituals that facilitate the standardization of religious traditions across large populations25,26 generally precede the appearance of moralizing gods. This suggests that ritual practices were more important than the particular content of religious belief to the initial rise of social complexity.

 

 

Spartan Diet

There are a number of well-known low-carb diets. The most widely cited is that of the Inuit, but the Masai are often mentioned as well. I came across another example in Jack Weatherford’s Genghis Khan and the Making of the Modern World (see here for earlier discussion).

Mongols lived off of meat, blood, and milk paste. This diet, as the Chinese observed, allowed the Mongol warriors to ride and fight for days on end without needing to stop for meals. Part of this is because they could eat while riding, but there is a more fundamental factor. This diet is so low-carb as to be ketogenic. And long-term ketosis leads to fat-adaptation, which allows for high energy and stamina, even without meals, as long as one has enough fat reserves (i.e., body fat). The feast-and-fast style of eating is common among non-agriculturalists.

There are other historical examples I haven’t previously researched. Ori Hofmekler, in The Warrior Diet, claims that Spartans and Romans ate in a brief period each day, about a four-hour window — because of the practice of having a communal meal once a day. This basically meant fasting for lengthy periods, although today it is often described as time-restricted eating. As I recall, Sikh monks have a similar practice of only eating one meal a day, during which they are free to eat as much as they want. The trick to this diet is that it decreases overall food intake and keeps the body in ketosis more often — if starchy foods are restricted enough and the body is fat-adapted, this lessens hunger and cravings.

The Mongols may have been doing something similar. The thing about ketosis is that the desire to snack all the time simply goes away. You don’t have to force yourself into food deprivation and it isn’t starvation, even when going without food for several days. As long as there is plenty of body fat and you are fat-adapted, the body maintains health, energy, and mood just fine until the next big meal. Even non-warrior societies do this. The meat-loving and blubber-gluttonous Inuit don’t tolerate aggression in the slightest, and they certainly aren’t known for amassing large armies and going on military campaigns. Or consider the Piraha, who are largely pacifists, banishing their own members if they kill another person, even someone from another tribe. The Piraha get about 70% of their diet from fish and other meat, that is to say a ketogenic diet. Plus, even though surrounded by lush forests filled with a wide variety of food plants and animals, the Piraha regularly choose not to eat — sometimes for no particular reason but also sometimes when doing communal dances over multiple days.

So, I wouldn’t be surprised if Spartan and Roman warriors had similar practices, especially the Spartans who didn’t farm much (the grains that were grown by the Spartans’ slaves likely were most often fed to the slaves, not as much to the ruling Spartans). As for Romans, their diet probably became more carb-centric as Rome grew into an agricultural empire. But early on in the days of the Roman Republic, Romans probably were like Spartans in the heavy focus they would have put on raising cattle and hunting game. Still, a diet doesn’t have to be heavy in fatty meat to be ketogenic, as long as it involves some combination of calorie restriction, portion control, narrow periods of meals, intermittent fasting, etc — all being other ways of lessening the total intake of starchy foods.

One of the most common meals for Spartans was a blood and bone broth made from boiled pork mixed with salt and vinegar, the consistency being thick and the color black. That would have included a lot of fat, fat-soluble vitamins, minerals, collagen, electrolytes, and much else. It was a nutrient-dense elixir of health, however horrible it may seem to the modern palate. And it probably was low-carb, depending on what else might’ve been added to it. Even the wine Spartans drank was watered down, as drunkenness was frowned upon. The purpose was probably more to kill unhealthy microbes in the water, much as watered-down beer did for early Americans millennia later, and so it would have added little sugar to the diet. Like the Mongols, they also enjoyed dairy. And they did have some grain foods such as bread, but apparently these were never a staple of their diet.

One thing they probably ate little of was olive oil, assuming it was used at all, as it was rarely mentioned in ancient texts and only became popular among Greeks in recent history, specifically the past century (discussed by Nina Teicholz in The Big Fat Surprise). Instead, Spartans, as with most other early Greeks, would have preferred animal fat, mostly lard in the case of the Spartans, whereas many other less landlocked Greeks preferred fish. Other foods the ancient Greeks, Spartans and otherwise, lacked were tomatoes, later introduced from the New World, and noodles, later introduced from China, both arriving during the colonial era of recent centuries. So, a traditional Greek diet would have looked far different from what we think of as the modern ‘Mediterranean diet’.

On top of that, Spartans were proud of eating very little and proud of their ability to fast. Plutarch (2nd century AD) writes in Parallel Lives, “For the meals allowed them are scanty, in order that they may take into their own hands the fight against hunger, and so be forced into boldness and cunning”. Also, Xenophon, who was alive whilst Sparta existed, writes in Spartan Society 2, “furnish for the common meal just the right amount for [the boys in their charge] never to become sluggish through being too full, while also giving them a taste of what it is not to have enough.” (from The Ancient Warrior Diet: Spartans) It’s hard to see how this wouldn’t have been ketogenic. Spartans were known for being great warriors, achieving feats of military prowess that would’ve been impossible for lesser men. On their fatty meat diet of pork and game, they were taller and leaner than other Greeks. They didn’t have large meals and fasted for most of the day, but when they did eat it was food dense in fat, calories, and nutrition.

* * *

Ancient Spartan Food and Diet
from Legend & Chronicles

The Secrets of Spartan Cuisine
by Helena P. Schrader

Western Individuality Before the Enlightenment Age

The Culture Wars of the Late Renaissance: Skeptics, Libertines, and Opera
by Edward Muir
Introduction
pp. 5-7

One of the most disturbing sources of late-Renaissance anxiety was the collapse of the traditional hierarchic notion of the human self. Ancient and medieval thought depicted reason as governing the lower faculties of the will, the passions, and the body. Renaissance thought did not so much promote “individualism” as it cut away the intellectual props that presented humanity as the embodiment of a single divine idea, thereby forcing a desperate search for identity in many. John Martin has argued that during the Renaissance, individuals formed their sense of selfhood through a difficult negotiation between inner promptings and outer social roles. Individuals during the Renaissance looked both inward for emotional sustenance and outward for social assurance, and the friction between the inner and outer selves could sharpen anxieties. 2 The fragmentation of the self seems to have been especially acute in Venice, where the collapse of aristocratic marriage structures led to the formation of what Virginia Cox has called the single self, most clearly manifest in the works of several women writers who argued for the moral and intellectual equality of women with men. As a consequence of the fragmented understanding of the self, such thinkers as Montaigne became obsessed with what was then the new concept of human psychology, a term in fact coined in this period. 4 A crucial problem in the new psychology was to define the relation between the body and the soul, in particular to determine whether the soul died with the body or was immortal. With its tradition of Averroist readings of Aristotle, some members of the philosophy faculty at the University of Padua recurrently questioned the Christian doctrine of the immortality of the soul as unsound philosophically. Other hierarchies of the human self came into question. Once reason was dethroned, the passions were given a higher value, so that the heart could be understood as a greater force than the mind in determining human conduct. When the body itself slipped out of its long-despised position, the sexual drives of the lower body were liberated and thinkers were allowed to consider sex, independent of its role in reproduction, a worthy manifestation of nature. The Paduan philosopher Cesare Cremonini’s personal motto, “Intus ut libet, foris ut moris est,” does not quite translate to “If it feels good, do it;” but it comes very close. The collapse of the hierarchies of human psychology even altered the understanding of the human senses. The sense of sight lost its primacy as the superior faculty, the source of “enlightenment”; the Venetian theorists of opera gave that place in the hierarchy to the sense of hearing, the faculty that most directly channeled sensory impressions to the heart and passions.

Historical and Philosophical Issues in the Conservation of Cultural Heritage
edited by Nicholas Price, M. Kirby Talley, and Alessandra Melucco Vaccaro
Reading 5: “The History of Art as a Humanistic Discipline”
by Erwin Panofsky
pp. 83-85

Nine days before his death, Immanuel Kant was visited by his physician. Old, ill and nearly blind, he rose from his chair and stood trembling with weakness and muttering unintelligible words. Finally his faithful companion realized that he would not sit down again until the visitor had taken a seat. This he did, and Kant then permitted himself to be helped to his chair and, after having regained some of his strength, said, ‘Das Gefühl für Humanität hat mich noch nicht verlassen’—’The sense of humanity has not yet left me’. The two men were moved almost to tears. For, though the word Humanität had come, in the eighteenth century, to mean little more than politeness and civility, it had, for Kant, a much deeper significance, which the circumstances of the moment served to emphasize: man’s proud and tragic consciousness of self-approved and self-imposed principles, contrasting with his utter subjection to illness, decay and all that is implied in the word ‘mortality.’

Historically the word humanitas has had two clearly distinguishable meanings, the first arising from a contrast between man and what is less than man; the second between man and what is more. In the first case humanitas means a value, in the second a limitation.

The concept of humanitas as a value was formulated in the circle around the younger Scipio, with Cicero as its belated, yet most explicit spokesman. It meant the quality which distinguishes man, not only from animals, but also, and even more so, from him who belongs to the species homo without deserving the name of homo humanus; from the barbarian or vulgarian who lacks pietas and παιδεία, that is, respect for moral values and that gracious blend of learning and urbanity which we can only circumscribe by the discredited word “culture.”

In the Middle Ages this concept was displaced by the consideration of humanity as being opposed to divinity rather than to animality or barbarism. The qualities commonly associated with it were therefore those of frailty and transience: humanitas fragilis, humanitas caduca.

Thus the Renaissance conception of humanitas had a two-fold aspect from the outset. The new interest in the human being was based both on a revival of the classical antithesis between humanitas and barbaritas, or feritas, and on a survival of the mediaeval antithesis between humanitas and divinitas. When Marsilio Ficino defines man as a “rational soul participating in the intellect of God, but operating in a body,” he defines him as the one being that is both autonomous and finite. And Pico’s famous ‘speech’ ‘On the Dignity of Man’ is anything but a document of paganism. Pico says that God placed man in the center of the universe so that he might be conscious of where he stands, and therefore free to decide ‘where to turn.’ He does not say that man is the center of the universe, not even in the sense commonly attributed to the classical phrase, “man the measure of all things.”

It is from this ambivalent conception of humanitas that humanism was born. It is not so much a movement as an attitude which can be defined as the conviction of the dignity of man, based on both the insistence on human values (rationality and freedom) and the acceptance of human limitations (fallibility and frailty); from this, two postulates result: responsibility and tolerance.

Small wonder that this attitude has been attacked from two opposite camps whose common aversion to the ideas of responsibility and tolerance has recently aligned them in a united front. Entrenched in one of these camps are those who deny human values: the determinists, whether they believe in divine, physical or social predestination, the authoritarians, and those “insectolatrists” who profess the all-importance of the hive, whether the hive be called group, class, nation or race. In the other camp are those who deny human limitations in favor of some sort of intellectual or political libertinism, such as aestheticists, vitalists, intuitionists and hero-worshipers. From the point of view of determinism, the humanist is either a lost soul or an ideologist. From the point of view of authoritarianism, he is either a heretic or a revolutionary (or a counterrevolutionary). From the point of view of “insectolatry,” he is a useless individualist. And from the point of view of libertinism he is a timid bourgeois.

Erasmus of Rotterdam, the humanist par excellence, is a typical case in point. The church suspected and ultimately rejected the writings of this man who had said: “Perhaps the spirit of Christ is more largely diffused than we think, and there are many in the community of saints who are not in our calendar.” The adventurer Ulrich von Hutten despised his ironical skepticism and his unheroic love of tranquillity. And Luther, who insisted that “no man has power to think anything good or evil, but everything occurs in him by absolute necessity,” was incensed by a belief which manifested itself in the famous phrase: “What is the use of man as a totality [that is, of man endowed with both a body and a soul], if God would work in him as a sculptor works in clay, and might just as well work in stone?”

Food and Faith in Christian Culture
edited by Ken Albala and Trudy Eden
Chapter 3: “The Food Police”
Sumptuary Prohibitions On Food In The Reformation
by Johanna B. Moyer
pp. 80-83

Protestants too employed a disease model to explain the dangers of luxury consumption. Luxury damaged the body politic leading to “most incurable sickness of the universal body” (33). Protestant authors also employed Galenic humor theory, arguing that “continuous superfluous expense” unbalanced the humors leading to fever and illness (191). However, Protestants used this model less often than Catholic authors who attacked luxury. Moreover, those Protestants who did employ the Galenic model used it in a different manner than their Catholic counterparts.

Protestants also drew parallels between the damage caused by luxury to the human body and the damage excess inflicted on the French nation. Rather than a disease metaphor, however, many Protestant authors saw luxury more as a “wound” to the body politic. For Protestants the danger of luxury was not only the buildup of humors within the body politic of France but the constant “bleeding out” of humor from the body politic in the form of cash to pay for imported luxuries. The flow of cash mimicked the flow of blood from a wound in the body. Most Protestants did not see luxury foodstuffs as the problem; indeed, most saw food in moderation as healthy for the body. Even luxury apparel could be healthy for the body politic in moderation, if it was domestically produced and consumed. Such luxuries circulated the “blood” of the body politic, creating employment and feeding the lower orders. 72 De La Noue made this distinction clear. He dismissed the need to individually discuss the damage done by each kind of luxury that was rampant in France in his time as being as pointless “as those who have invented auricular confession have divided mortal and venial sins into infinity of roots and branches.” Rather, he argued, the damage done by luxury was in its “entire bulk” to the patrimonies of those who purchased luxuries and to the kingdom of France (116). For the Protestants, luxury did not pose an internal threat to the body and salvation of the individual. Rather, the use of luxury posed an external threat to the group, to the body politic of France.

The Reformation And Sumptuary Legislation

Catholics, as we have seen, called for antiluxury regulations on food and banqueting, hoping to curb overeating and the damage done by gluttony to the body politic. Although some Protestants also wanted to restrict food and banqueting, more often French Protestants called for restrictions on clothing and foreign luxuries. These differing views of luxury during and after the French Wars of Religion not only give insight into the theological differences between these two branches of Christianity but also provide insight into the larger pattern of the sumptuary regulation of food in Europe in this period. Sumptuary restrictions were one means by which Catholics and Protestants enforced their theology in the post-Reformation era.

Although Catholicism is often correctly cast as the branch of Reformation Christianity that gave the individual the least control over their salvation, it was also true that the individual Catholic’s path to salvation depended heavily on ascetic practices. The responsibility for following these practices fell on the individual believer. Sumptuary laws on food in Catholic areas reinforced this responsibility by emphasizing what foods should and should not be eaten and mirrored the central theological practice of fasting for the atonement of sin. Perhaps the historiographical cliché that it was only Protestantism which gave the individual believer control of his or her salvation needs to be qualified. The arithmetical piety of Catholicism ultimately placed the onus on the individual to atone for each sin. Moreover, sumptuary legislation tried to steer the Catholic believer away from the more serious sins that were associated with overeating, including gluttony, lust, anger, and pride.

Catholic theology meshed nicely with the revival of Galenism that swept through Europe in this period. Galenists preached that meat eating, overeating, and the imbalance in humors which accompanied these practices, led to behavioral changes, including an increased sex drive and increased aggression. These physical problems mirrored the spiritual problems that luxury caused, including fornication and violence. This is why so many authors blamed the French nobility for the luxury problem in France. Nobles were seen not only as more likely to bear the expense of overeating but also as more prone to violence. 73

Galenism also meshed nicely with Catholicism because it was a very physical religion in which the control of the physical body figured prominently in the believer’s path to salvation. Not surprisingly, by the seventeenth century, Protestants gravitated away from Galenism toward the chemical view of the body offered by Paracelsus. 74 Catholic sumptuary law embodied a Galenic view of the body where sin and disease were equated and therefore pushed regulations that advocated each person’s control of his or her own body.

Protestant legislators, conversely, were not interested in the individual diner. Sumptuary legislation in Protestant areas ran the gamut from control of communal displays of eating, in places like Switzerland and Germany, to little or no concern with restrictions on luxury foods, as in England. For Protestants, it was the communal role of food and luxury use that was important. Hence the laws in Protestant areas targeted food in the context of weddings, baptisms, and even funerals. The English did not even bother to enact sumptuary restrictions on food after their break with Catholicism. The French Protestants who wrote on luxury glossed over the deleterious effects of meat eating, even proclaiming it to be healthful for the body while producing diatribes against the evils of imported luxury apparel. The use of Galenism in the French Reformed treatises suggests that Protestants too were concerned with a “body,” but it was not the individual body of the believer that worried Protestant legislators. Sumptuary restrictions were designed to safeguard the mystical body of believers, or the “Elect” in the language of Calvinism. French Protestants used the Galenic model of the body to discuss the damage that luxury did to the body of believers in France, but ultimately to safeguard the economic welfare of all French subjects. The Calvinists of Switzerland used sumptuary legislation on food to protect those predestined for salvation from the dangerous eating practices of members of the community whose overeating suggested they might not be saved.

Ultimately, sumptuary regulations in the Reformation spoke to the Christian practice of fasting. Fasting served very different functions in Protestant and Catholic theology. Raymond Mentzer has suggested that Protestants “modified” the Catholic practice of fasting during the Reformation. The major reformers, including Luther, Calvin, and Zwingli, all rejected fasting as a path to salvation. 75 For Protestants, fasting was a “liturgical rite,” part of the cycle of worship and a practice that served to “bind the community.” Fasting was often a response to adversity, as during the French Wars of Religion. For Catholics, fasting was an individual act, just as sumptuary legislation in Catholic areas targeted individual diners. However, for Protestants, fasting was a communal act, “calling attention to the body of believers.” 76 The symbolic nature of fasting, Mentzer argues, reflected Protestant rejection of transubstantiation. Catholics continued to believe that God was physically present in the host, but Protestants believed His was only a spiritual presence. When Catholics took Communion, they fasted to cleanse their own bodies so as to receive the real, physical body of Christ. Protestants, on the other hand, fasted as spiritual preparation because it was their spirits that connected with the spirit of Christ in the Eucharist. 77

“…there resides in every language a characteristic world-view”

“Via the latter, qua character of a speech-sound, a pervasive analogy necessarily prevails in the same language; and since a like subjectivity also affects language in the same nation, there resides in every language a characteristic world-view. As the individual sound stands between man and the object, so the entire language steps in between him and the nature that operates, both inwardly and outwardly, upon him. He surrounds himself with a world of sounds, so as to take up and process within himself the world of objects. These expressions in no way outstrip the measure of the simple truth. Man lives primarily with objects, indeed, since feeling and acting in him depend on his presentations, he actually does so exclusively, as language presents them to him. By the same act whereby he spins language out of himself, he spins himself into it, and every language draws about the people that possesses it a circle whence it is possible to exit only by stepping over at once into the circle of another one. To learn a foreign language should therefore be to acquire a new standpoint in the world-view hitherto possessed, and in fact to a certain extent is so, since every language contains the whole conceptual fabric and mode of presentation of a portion of mankind.”

Wilhelm von Humboldt
On Language (1836)

* * *

Wilhelm von Humboldt
from Wikipedia

Wilhelm von Humboldt
from Stanford Encyclopedia of Philosophy

Wilhelm von Humboldt lectures
from Université de Rouen

Wilhelm von Humboldt and the World of Languages
by Ian F. McNeely

Wilhelm von Humboldt: A Critical Review On His Philosophy of Language, Theory and Practice of Education
by Dr Arlini Alias

The theory of linguistic relativity from the historical perspective
by Iaroslav

Democratic Values in Balance or Opposition

At the New Yorker, Nathan Heller has an interesting piece about equality and freedom, The Philosopher Redefining Equality.

Mainstream American thought sees them as oppositional. But maybe the common ground between them is fairness. There can be neither equality nor freedom in an unfair society, although there can be liberty in an unfair society. That goes off on a tangent, but keep it in mind as background info. A society of freedom is not the same as a society of liberty, and a society of fairness might be a whole other thing as well. Yet it has been argued that English is the only language with exact words for all three concepts (see Liberty, Freedom, and Fairness) — for example, George Fletcher in Basic Concepts of Legal Thought writes,

“Remarkably, our concept of fairness does not readily translate into other languages. It is virtually impossible to find a suitable translation for fairness in European or Semitic languages. As a result, the term is transplanted directly in some languages such as German and Hebrew, and absent in others, such as French, which is resistant to adopting loan words that carry unique meanings.” (quoted by Manny Echevarria in Does Fairness Translate?)

The difference between the two cultural worldviews and ideological systems is what led to both the English Civil War and the American Civil War. This conflict has been internalized within American society, but it has never been resolved. Americans simply have pretended it went away when, in reality, the conflict has grown worse.

Heller writes about the experience and work of Elizabeth Anderson. As Heller puts it, “Working at the intersection of moral and political philosophy, social science, and economics, she has become a leading theorist of democracy and social justice.” And, as related to the above, “She has built a case, elaborated across decades, that equality is the basis for a free society.” Freedom isn’t only closely linked to equality but built upon and dependent upon it. That makes sense from an etymological perspective, as freedom originally meant living among equals, sharing freedom as a member of a free people, at least a member in good standing (ignoring the minor detail of the categories of people excluded: women, slaves, and strangers; but it might be noted that these categories weren’t always permanent statuses and unchangeable fates, since sometimes women could become warriors or divorce their husbands, slaves could end their bondage, and strangers could marry into the community). Hence the word ‘friend’ has the same origin — to be free is to be among friends, among those one trusts and relies upon, as they do in return.

Fairness, by the way, is an odd word. Its English meanings include handsome, beautiful, and attractive (with some racist connotations); nice, clean, bright, clear, and pleasant; moderate as in not excessive in any direction (a fair balance or fair weather, neither hot nor cold) but also generous or plentiful as in considerable (a fair amount). And in various times and places, it has meant favorable, helpful, promising good fortune, and auspicious; morally or comparatively good, socially normative, average, suitable, agreeable, with propriety and justice, right conduct, etc; which overlaps with the modern sense of equitable, impartial, just, and free from bias (from fair and well to fair and square, from fair-dealing to fair play). But its other linguistic variants connect to setting, putting, placing, acting, doing, making, and becoming; make, compose, produce, construct, fashion, frame, build, erect, and appoint. There is an additional sense of sex and childbirth (i.e., fucking and birthing), the ultimate doing and making; and so seemingly akin to worldly goodness of fecundity, abundance, and creation. The latter is maybe where the English meaning entered the picture. More than being a noun of what one is, it is a verb of what one is doing in a real-world sense.

Interestingly, some assert that the closest etymological correlate to fairness in modern Swedish is ‘rättvis’. It breaks down to the roots ‘rätt’ and ‘vis’, the former signifying what is ‘correct’ or ‘just’ and the latter ‘wise’ (correct-wise or just-wise in the sense of clockwise or otherwise). This Swedish word is related to the English ‘righteous’. That feels right, given the moral component of fairness that can be seen early on in its development as a word. We think of what is righteous as having a harsher and more demanding tone than fairness. But I would note how easy it is to pair fairness with justice as if they belong together. John Rawls has a theory of justice as fairness. That makes sense, in accord with social science research that shows humans strongly find unjust that which is perceived as unfair. Then again, as freedom is not exactly the same as liberty, righteousness is not exactly the same as justice. There might be a reason that the Pledge of Allegiance states “with liberty and justice for all”, not liberty and righteousness, not freedom and justice. Pledging ourselves to liberty and justice might put us at odds with a social order of fairness, as paired with freedom, equality, or righteousness. Trying to translate these two worldviews into each other is maybe what created so much confusion in the first place.

All these notions of and related to fairness, one might argue, indicate how lacking in fairness our society is, whatever one might think of liberty and justice. Humans tend to obsess over articulating and declaring what is found most wanting. A more fair society would likely not bother to have a word for it, as the sense of fairness would be taken for granted and would simply exist in the background as ideological realism and cultural worldview. In Integrity in Depth, John Beebe makes this argument about the word ‘integrity’ for modern society, whereas the integral/integrated lifestyle of many tribal people living in close relationship to their environment requires no such word. A people need to experience what is not integrated, what is seen as needing to be integrated, before they can speak of what is or might be of integrity.

Consider the Piraha, who are about as equal a society as can exist, with fairness only becoming an issue in recent history because of trade with outsiders. The Piraha wanted Daniel Everett to teach them math because they couldn’t determine if they were being given a fair deal or were being cheated, a non-issue among the Piraha themselves since they don’t have a currency or even terms for numerals. A word like fairness would be far too much of a generalized abstraction for the Piraha, as traditionally most interactions were concrete and personal, as such more along the lines of Germanic ‘freedom’.

It might put some tribal people in an ‘unfair’ position if they don’t have the language to fully articulate unfairness, at least in economic terms. We Americans have greater capacity and talent in fighting for fairness because we get a lot of practice, as we can’t expect it as our cultural birthright. Unsurprisingly, we talk a lot about it and in great detail. Maybe to speak of fairness is always to imply both its lack and its desirability. From the view of linguistic relativism, such a word invokes a particular worldview that shapes and influences thought, perception, and behavior.

This is observed in social science research when WEIRD populations are compared to others, as seen in Joe Henrich’s study of the ultimatum game: “It had been thought a matter of settled science that human beings insist on fairness in the division, or will punish the offering party by refusing to accept the offer. This was thought an interesting result, because economics would predict that accepting any offer is better than rejecting an offer of some money. But the Machiguenga acted in a more economically rational manner, accepting any offer, no matter how low. ‘They just didn’t understand why anyone would sacrifice money to punish someone who had the good luck of getting to play the other role in the game,’ Henrich said” (John Watkins, The Strangeness of Being WEIRD). There is no impulse to punish an unfairness that, according to the culture, isn’t perceived as unfair. It appears that the very concept of fairness was irrelevant or maybe incomprehensible to the Machiguenga, at least under these conditions. But if they are forced to deal more with outsiders who continually take advantage of them or who introduce perverse incentives into their communities, they surely would have to develop the principle of fairness and learn to punish unfairness. Language might be the first sign of such a change.
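To make the game’s logic concrete, here is a minimal sketch in Python of the ultimatum game described in that passage. It is purely illustrative and not drawn from Henrich’s study; the stake size, the rejection threshold, and the strategy names are assumptions for demonstration.

# A minimal sketch of the ultimatum-game logic discussed above. Nothing here
# comes from Henrich's actual experiment; the stake, threshold, and strategy
# names are illustrative assumptions.

STAKE = 100  # total amount the proposer divides between the two players

def fairness_punisher(offer, threshold=30):
    """Reject any offer below a 'fair' threshold, even at a cost to oneself."""
    return offer >= threshold

def payoff_maximizer(offer):
    """Accept any positive offer; some money always beats none."""
    return offer > 0

def play(offer, responder):
    """Return (proposer payoff, responder payoff) for one round."""
    if responder(offer):
        return STAKE - offer, offer
    return 0, 0  # a rejection destroys the whole stake

for offer in (5, 20, 50):
    print(offer, play(offer, fairness_punisher), play(offer, payoff_maximizer))

Running it shows why rejection is ‘irrational’ in narrow monetary terms: the accept-anything responder never ends a round with less than the fairness-punisher, so refusing a low offer only makes sense as a costly way of enforcing a shared norm of fair division.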

A similar point is made by James L. Kugel in The Great Shift about ancient texts written by temple priests declaring laws and prohibitions. This probably hints at a significant number of people at the time doing the complete opposite or else the priests wouldn’t have bothered to make it clear, often with punishments for those who didn’t fall in line. As Julian Jaynes explains, the earliest civilizations didn’t need written laws because the social norms were so embedded within not only the social fabric but the psyche. Laws were later written down because social norms were breaking down, specifically as societies grew in size, diversity, and complexity. We are now further down this road of the civilizational project and legalism is inseparable from our everyday experience, and so we need many words such as fairness, justice, righteousness, freedom, liberty, etc. We are obsessed with articulating these values as if by doing so we could re-enforce social norms that refuse to solidify and stabilize. So, we end up turning to centralized institutions such as big government to impose these values on individuals, markets, and corporations. And we need lawyers, judges, and politicians to help us navigate this legalistic world that we are anxious about falling apart at any moment.

This interpretation is supported by the evidence of the very society in which the word fairness was first used. “The tribal uses of fair and fairness were full of historical irony,” pointed out David Hackett Fischer in Fairness and Freedom (Kindle Locations 647-651). “These ideas flourished on the far fringes of northwestern Europe among groups of proud, strong, violent, and predatory people who lived in hard environments, fought to the death for the means of life, and sometimes preyed even on their own kin. Ideas of fairness and fair play developed as a way of keeping some of these habitual troublemakers from slaughtering each other even to the extinction of the tribe. All that might be understood as the first stage in the history of fairness.” This interpretation is based on a reading of the sagas, as written down quite late in Scandinavian history. It was a period of great cultural shifts, including radical and revolutionary introductions such as writing itself. And I might add that this followed upon the millennium of ravage after the collapse of the bicameral civilizations of the Bronze Age. The society was under great pressure, both from within and without, as the sagas describe those violent times. It was the sense of a lack of fairness amid societal chaos and conflict that made it necessary to invent fairness as a cultural ideal and social norm.

It’s impossible to argue we live in a fair society. The reason Adam Smith defended equality, for example, is that he thought it a worthy ideal to aspire to, not that we had already attained it. On the other hand, there is an element of what has been lost. Feudal society had clearly spelled out rights and responsibilities that were agreed upon and followed as social norms, and so in that sense it was a fair society. The rise of capitalism with the enclosure and privatization of the commons was experienced as unfair, to which Thomas Paine was also responding with his defense of a citizen’s dividend to recompense what was taken, specifically as theft not only from living generations but also from all generations following. When a sense of fairness was still palpable, as understood within the feudal social order, no argument for fairness as against unfairness was necessary. It likely is no coincidence that the first overt class war happened in the English Civil War, when the enclosure movement was in high gear, the tragic results of which Paine would see in the following century, although the enclosure movement didn’t reach full completion until the 19th century with larger-scale industrialization and farming.

As for how fairness accrued its modern meaning, I suspect that it is one of the many results of the Protestant Reformation as a precursor to the Enlightenment Age. The theological context became liberal. As Anna Wierzbicka put it: “‘Fair play’ as a model of human interaction highlights the ‘procedural’ character of the ethics of fairness. Arguably, the emergence of the concept of ‘fairness’ reflects a shift away from absolute morality to ‘procedural (and contractual) morality,’ and the gradual shift from ‘just’ to ‘fair’ can be seen as parallel to the shifts from good to right and also from wise (and also true) to reasonable: in all cases, there is a shift from an absolute, substantive approach to a procedural one.” (from English: Meaning and Culture, as quoted by Mark Liberman in No word for fair?)

Nathan Heller’s article is about how the marriage of these values appears to be a new and “unorthodox notion”. But Elizabeth Anderson observes that, “through history, equality and freedom have arrived together as ideals.” This basic insight was a central tenet of Adam Smith’s economic philosophy. Smith said a free society wasn’t possible with high inequality. It simply wasn’t possible. Full stop. And his economic views are proclaimed as the basis of Western capitalism. So, how did this foundational understanding get lost along the way? I suppose because it was inconvenient to the powers that be, who were looking for an excuse to further accumulate not only wealth but power.

It wasn’t only one part of the ruling elite that somehow ‘forgot’ this simple truth. From left to right, the establishment agreed in defense of the status quo: “If individuals exercise freedoms, conservatives like to say, some inequalities will naturally result. Those on the left basically agree—and thus allow constraints on personal freedom in order to reduce inequality. The philosopher Isaiah Berlin called the opposition between equality and freedom an “intrinsic, irremovable element in human life.” It is our fate as a society, he believed, to haggle toward a balance between them.” For whatever reason, there was a historical shift, a “Post-Enlightenment move” (Echevarria), both in the modern meaning of fairness and the modern deficiency in fairness.

That still doesn’t explain how the present ideological worldview became the dominant paradigm that went unquestioned by hundreds of millions of ordinary Americans and other Westerners. Direct everyday experience contradicts this neo-feudalist dogma of capitalist realism. There is nothing that Anderson observed in her own work experience that any worker couldn’t notice in almost any workplace. The truth has always been there right in front of us. Yet few had eyes to see. When lost in the darkness of a dominant paradigm, sometimes clear vision requires an imaginative leap into reality. I guess it’s a good thing we have a word to designate the ache we feel for a better world.

* * *

Fairness and Freedom
by David Hackett Fischer
Kindle Locations 596-675

Origins of the Words Fairness and Fair

Where did this language of fairness come from? What is the origin of the word itself? To search for the semantic roots of fair and fairness is to make a surprising discovery. Among widely spoken languages in the modern world, cognates for fairness and fair appear to have been unique to English, Danish, Norwegian, and Frisian until the mid-twentieth century. 40 They remained so until after World War II, when other languages began to import these words as anglicisms. 41

The ancestry of fair and fairness also sets them apart in another way. Unlike most value terms in the Western world, they do not derive from Greek or Latin roots. Their etymology is unlike that of justice and equity, which have cognates in many modern Western languages. Justice derives from the Latin ius, which meant a conformity to law or divine command, “without reference to one’s own inclinations.” Equity is from the Latin aequitas and its adjective aequus, which meant level, even, uniform, and reasonable. 42

Fairness and fair have a different origin. They derive from the Gothic fagrs, which meant “pleasing to behold,” and in turn from an Indo-European root that meant “to be content.” 43 At an early date, these words migrated from Asia to middle Europe. There they disappeared in a maelstrom of many languages, but not before they migrated yet again to remote peninsulas and islands of northern and western Europe, where they persisted to our time. 44 In Saxon English, for example, the old Gothic faeger survived in the prose of the Venerable Bede as late as the year 888. 45 By the tenth century, it had become faire in English speech. 46

In these early examples, fagr, faeger, fair, and fairness had multiple meanings. In one very old sense, fair meant blond or beautiful or both—fair skin, fair hair. As early as 870 a Viking king was called Harald Harfagri in Old Norse, or Harold Fairhair in English. In another usage, it meant favorable, helpful, and good—fair wind, fair weather, fair tide. In yet a third it meant spotless, unblemished, pleasing, and agreeable: fair words, fair speech, fair manner. All of these meanings were common in Old Norse, and Anglo-Saxon in the tenth and eleventh centuries. By 1450, it also meant right conduct in rivalries or competitions. Fair play, fair game, fair race, and fair chance appeared in English texts before 1490. 47

The more abstract noun fairness was also in common use. The great English lexicographer (and father of the Oxford English Dictionary) Sir James Murray turned up many examples, some so early that they were still in the old Gothic form—such as faegernyss in Saxon England circa 1000, before the Norman Conquest. It became fayreness and fairnesse as an ethical abstraction by the mid-fifteenth century, as “it is best that he trete him with farenes” in 1460. 48

As an ethical term, fairness described a process and a solution that could be accepted by most parties—fair price, fair judgment, fair footing, fair and square. Sometimes it also denoted a disposition to act fairly: fair-minded, fair-natured, fair-handed. All of these ethical meanings of fair and fairness were firmly established by the late sixteenth and early seventeenth centuries. Fair play appears in Shakespeare (1595); fair and square in Francis Bacon (1604); fair dealing in Lord Camden (before 1623). 49

To study these early English uses of fairness and fair is to find a consistent core of meaning. Like most vernacular words, they were intended not for study but for practical use. In ethical applications, they described a way of resolving an issue that is contested in its very nature: a bargain or sale, a race or rivalry, a combat or conflict. Fundamentally, fairness meant a way of settling contests and conflicts without bias or favor to any side, and also without deception or dishonesty. In that sense fairness was fundamentally about not taking undue advantage of other people. As early as the fifteenth century it variously described a process, or a result, or both together, but always in forms that fair-minded people would be willing to accept as legitimate.

Fairness functioned as a mediating idea. It was a way of linking individuals to groups, while recognizing their individuality at a surprisingly early date. Always, fairness was an abstract idea of right conduct that could be applied in different ways, depending on the situation. For example, in some specific circumstances, fairness was used to mean that people should be treated in the same way. But in other circumstances, fairness meant that people should be treated in different ways, or special ways that are warranted by particular facts and conditions, such as special merit, special need, special warrant, or special desire. 50

Fairness was a constraint on power and strength, but it did not seek to level those qualities in a Procrustean way. 51 Its object was to regulate ethical relationships between people who possess power and strength in different degrees—a fundamental fact of our condition. A call for fairness was often an appeal of the weak to the conscience of the strong. It was the eternal cry of an English-speaking child to parental authority: “It’s not fair!” As any parent knows, this is not always a cry for equality.

Modern Applications of Fairness: Their Consistent Core of Customary Meaning

Vernacular ideas of fairness and fair have changed through time, and in ways that are as unexpected as their origin. In early ethical usage, these words referred mostly to things that men did to one another—a fair fight, fair blow, fair race, fair deal, fair trade. They also tended to operate within tribes of Britons and Scandinavians, where they applied to freemen in good standing. Women, slaves, and strangers from other tribes were often excluded from fair treatment, and they bitterly resented it.

The tribal uses of fair and fairness were full of historical irony. These ideas flourished on the far fringes of northwestern Europe among groups of proud, strong, violent, and predatory people who lived in hard environments, fought to the death for the means of life, and sometimes preyed even on their own kin. Ideas of fairness and fair play developed as a way of keeping some of these habitual troublemakers from slaughtering each other even to the extinction of the tribe. All that might be understood as the first stage in the history of fairness. 52

Something fundamental changed in a second stage, when the folk cultures of Britain and Scandinavia began to grow into an ethic that embraced others beyond the tribe—and people of every rank and condition. This expansive tendency had its roots in universal values such as the Christian idea of the Golden Rule. 53 That broader conception of fairness expanded again when it met the humanist ideas of the Renaissance, the universal spirit of the Enlightenment, the ecumenical spirit of the Evangelical Movement, and democratic revolutions in America and Europe. When that happened, a tribal idea gradually became more nearly universal in its application. 54 Quantitative evidence suggests an inflection at the end of the eighteenth century. The frequency of fairness in English usage suddenly began to surge circa 1800. The same pattern appears in the use of the expression natural justice. 55

Then came a third stage in the history of fairness, when customary ideas began to operate within complex modern societies. In the twentieth century, fairness acquired many technical meanings with specific applications. One example regulated relations between government and modern media (“the fairness doctrine”). In another, fairness became a professional standard for people who were charged with the management of other people’s assets (“fiduciary fairness”). One of the most interesting modern instances appeared among lawyers as a test of “balance or impartiality” in legal proceedings, or a “subjective standard by which a court is deemed to have followed due process,” which began to be called “fundamental fairness” in law schools. Yet another example was “fair negotiation,” which one professional negotiator defined as a set of rules for “bargaining with the Devil without losing your soul.” One of the most complex applications is emerging today as an ethic of “fairness in electronic commerce.” These and other modern applications of fairness appear in legal treatises, professional codes, and complex bodies of regulatory law. 56

Even as modern uses of fair and fairness have changed in all of those ways, they also preserved a consistent core of vernacular meaning that had appeared in Old English, Norse, and Scandinavian examples and is still evident today. To summarize, fair and fairness have long been substantive and procedural ideas of right conduct, designed to regulate relations among people who are in conflict or rivalry or opposition in particular ways. Fairness means not taking undue advantage of others. It is also about finding ways to settle differences through a mutual acceptance of rules and processes that are thought to be impartial and honest—honesty is fundamental. And it is also about living with results that are obtained in this way. As the ancient Indo-European root of fagrs implied, a quest for fairness is the pursuit of practical solutions with which opposing parties could “be content.” These always were, and still are, the fundamental components of fairness. 57

Notes:

40. For an excellent and very helpful essay on fair and fairness by a distinguished cultural and historical linguist, see Anna Wierzbicka, “Being FAIR: Another Key Anglo Value and Its Cultural Underpinnings,” in English: Meaning and Culture (New York and Oxford, 2006), 141–70. See also Bart Wilson, “Fair’s Fair,” http://www.theatlantic.com/business/print/2009/01/fairs-fair/112; Bart J. Wilson, “Contra Private Fairness,” May 2008, http://www.chapman.edu/images/userimges/jcunning/Page_11731/ContraPrivateFairness05–2008.pdf; James Surowiecki, “Is the Idea of Fairness Universal?” Jan. 26, 2009, http://www.newyorker.com/online/blogs/jamessurowiecki/2009/01/is; and Mark Liberman, “No Word for Fair?” Jan. 28, 2009, http://languagelog.ldc.upenn.edu/nll/?p=1080.

41. Oxford English Dictionary, s.v. “fair” and “fairness.” Cognates for the English fairness include fagr in Icelandic and Old Norse, retferdighet in modern Norwegian, and retfaerighed in modern Danish. See Geir Tómasson Zoëga, A Concise Dictionary of Old Icelandic (Toronto, 2004), s.v. “fagr.” For Frisian, see Karl von Richthofen, Altfriesisches Wörterbuch (Göttingen, 1840); idem, Friesische Rechtsquellen (Berlin, 1840). On this point I agree and disagree with Anna Wierzbicka. She believes that fair and unfair “have no equivalents in other European languages (let alone non-European ones) and are thoroughly untranslatable” (“Being FAIR,” 141). This is broadly true, but with the exception of Danish, Norwegian, Frisian, and Icelandic. Also I’d suggest that the words can be translated into other languages, but without a single exactly equivalent word. I believe that people of all languages are capable of understanding the meaning of fair and fairness, even if they have no single word for it.

42. OED, s.v. “justice,” “equity.”

43. Webster’s New World Dictionary, 2nd College Edition, ed. David B. Guralnik (New York and Cleveland, 1970), s.v. “fair”; OED, s.v. “fair.”

44. Ancient cognates for fair included fagar in Old English and fagr in Old Norse.

45. W. J. Sedgefield, Selections from the Old English Bede, with Text and Vocabulary, on an Early West Saxon Basis, and a Skeleton Outline of Old English Accidence (Manchester, London, and Bombay, 1917), 77; and in the attached vocabulary list, s.v. the noun “faeger” and the adverbial form “faegere.” Also Joseph Bosworth and T. Northcote Toller, An Anglo-Saxon Dictionary, Based on Manuscript Collections (Oxford, 1882, 1898), s.v. “faeger,” ff.

46. Not to be confused with this word is another noun fair, for a show or market or carnival, from the Latin feria, feriae, feriarum, festival or holiday—an entirely different word, with another derivation and meaning.

47. Liberman, “No Word for Fair?”

48. OED, s.v. “fairness,” 1.a, b, c.

49. For fair and fairness in Shakespeare, see King John V.i.67. For fair and square in Francis Bacon in 1604 and Oliver Cromwell in 1649, see OED, s.v. “fair and square.”

50. Herein lies one of the most difficult issues about fairness. How can we distinguish between ordinary circumstances where fairness means that all people should be treated alike, and extraordinary circumstances where fairness means different treatment? This problem often recurs in cases over affirmative action in the United States. No court has been able to frame a satisfactory general rule, in part because of ideological differences on the bench.

51. Procrustes was a memorable character in Greek mythology, a son of Poseidon called Polypaemon or Damastes, and nicknamed Procrustes, “the Stretcher.” He was a bandit chief in rural Attica who invited unwary travelers to sleep in an iron bed. If they were longer than the bed, Procrustes cut off their heads or feet to make them fit; if too short he racked them instead. Procrustes himself was dealt with by his noble stepbrother Theseus, who racked him on his own bed and removed his head according to some accounts. In classical thought, and modern conservatism, the iron bed of Procrustes became a vivid image of rigid equality. The story was told by Diodorus Siculus, Historical Library 4.59; Pausanias, Guide to Greece 1.38.5; and Plutarch, Lives, Theseus 2.

52. Jesse Byock, Viking Age Iceland (London, 2001), 171–84; the best way to study the origin of fairness in a brutal world is in the Norse sagas themselves, especially Njal’s Saga, trans. and ed. Magnus Magnusson and Hermann Palsson (London, 1960, 1980), 21–22, 40, 108–11, 137–39, 144–45, 153, 163, 241, 248–55; Egil’s Saga, trans. and ed. Hermann Palsson and Paul Edwards (London, 1976, 1980), 136–39; Hrafnkel’s Saga and Other Icelandic Stories, trans. and ed. Hermann Palsson (London, 1971, 1980), 42–60.

53. Matthew 25:40; John 4:19–21; Luke 10:27.

54. The vernacular history of humanity, expanding in the world, is a central theme in David Hackett Fischer, Champlain’s Dream (New York and Toronto, 2008); as the expansion of vernacular ideas of liberty and freedom is central to Albion’s Seed (New York and Oxford, 1989) and Liberty and Freedom (New York and Oxford, 2005); and the present inquiry is about the expansion of vernacular ideas of fairness in the world. One purpose of all these projects is to study the history of ideas in a new key. Another purpose is to move toward a reunion of history and moral philosophy, while history also becomes more empirical and more logical in its epistemic frame.

55. For data on frequency, see Google Labs, Books Ngram Viewer, http://ngrams.googlelabs.com, s.v. “fairness” and “natural justice.” Similar patterns and inflection-points appear for the corpus of “English,” “British English,” and “American English,” in the full span 1500–2000, smoothing of 3. Here again on the history of fairness, I agree and disagree with Wierzbicka (“Being FAIR,” 141–67). The ethical meanings of fairness first appeared earlier than she believes to be the case. But I agree on the very important point that ethical use of fairness greatly expanded circa 1800.

56. Fred W. Friendly, The Good Guys, the Bad Guys, and the First Amendment (New York, 1976) is the classic work on the fairness doctrine. Quotations in this paragraph are from Carrie Menkel-Meadow and Michael Wheeler, eds., What’s Fair: Ethics for Negotiators (Cambridge, 2004), 57; Philip J. Clements and Philip W. Wisler, The Standard and Poor’s Guide to Fairness Opinions: A User’s Guide for Fiduciaries (New York, 2005); Merriam-Webster’s Dictionary of Law (Cleveland, 1996), s.v. “fundamental fairness”; Approaching a Formal Definition of Fairness in Electronic Commerce: Proceedings of the 18th IEEE Symposium on Reliable Distributed Systems (Washington, 1999), 354. Other technical uses of fairness can be found in projects directed by Arien Mack, editor of Social Research, director of the Social Research Conference series, and sponsor of many Fairness Conferences and also of a Web site called Fairness.com.

57. An excellent discussion of fairness, the best I have found in print, is George Klosko, The Principle of Fairness and Political Obligation (Savage, MD, 1992; rev. ed., 2004). It is similar to this formulation on many points, but different on others.

Reactionaries, Powell Memo and Judicial Activism

To explain why the Powell Memo is important, I’ll begin with a summary of the games played by reactionaries, which explain the rhetorical power they wield. There are two main aspects of the reactionary mind (* see below). The most interesting is described by Corey Robin, and it is the reason I’ve come to refer to reactionaries as the “Faceless Men”.

Reactionaries steal the thunder and mimic the tactics of the political left, and in doing so co-opt political movements and even revolutions, turning the latter into counterrevolutions. More interesting still is how reactionaries pose as what they are not by claiming labels that originated with their opponents — calling themselves classical liberals and whatever else catches their fancy. They pretend to be defenders of constitutional originalism while they radically transform the Constitution, such as pushing corporate personhood and citizenship, something that would have horrified the American revolutionaries and founders.

The other side of this is what reactionaries project onto others. They are the greatest purveyors of political correctness in attacking free speech, an area in which they show their brilliance in controlling narrative framing. They manage to portray their enemies as doing what they are most guilty of and through this tactic they silence and discredit others.

In this way, the reactionary element of the intellectual elite, Hollywood elite, and banking elite (as seen in the career of Steve Bannon) somehow manages to convince its followers that they are average Americans in a noble fight against the ruling elite. The target often ends up being students at state colleges who, according to the data and contrary to the reactionary portrayal, are mostly those working their way out of the working class — if meritocracy exists at all in the United States, this is the closest we get to it. But anyway, it’s highly doubtful that colleges are serving a genuinely democratic purpose at a time when corporate and other private funding is flooding into them, and so the accusation of their being bastions of the liberal faith is a sad joke. This state of confusion is intentionally created by reactionaries — up is down and those who are down are the enemy to be blamed for everything.

Or consider the accusation of a liberal media bias. It’s odd where I most often hear this bizarre claim. It’s regularly repeated in the corporate media itself. Even the supposedly liberal media gives a platform to people to spout this bullshit. So, what kind of liberal bias is it that criticizes liberal bias by giving equal or greater time to right-wingers? That is no exaggeration, in that even NPR gives more airtime to corporatist and right-wing think tanks than to those on the anti-corporatist left. That is unsurprising since NPR gets most of its funding from private sources such as corporations, not from the government. Public radio?

This brings me to an example that has been on my mind for a while. I’ve been meaning to write about it, and this seems as good a time as any. Let this be my first post of the new year, clarifying where we stand as a society and how we got here.

The infamous Powell Memo (AKA the Powell Manifesto) was written way back in 1971 and only later leaked to the public. Ever since that time, it has been the guiding vision of a cabal of right-wing oligarchs and plutocrats. It set out a strategy for taking over the government, and it was successful. Do you know how those on the political right are always alleging a left-wing conspiracy to pack the courts with activist judges? Well, that is an expression of a guilty conscience. It’s exactly what the right-wing ruling elite has been doing this past half century, culminating in the nomination of Judge Brett Kavanaugh to the Supreme Court. Predictably, this so-called constitutionalist just so happens to be a big supporter of executive power, an agenda that began to be pushed more fully under the administration of George W. Bush. There is absolutely nothing constitutionalist about this, as it undermines the very core pillar of the separation and balance of powers. Instead of being a countervailing force, right-wingers are seeking to create a corporatocratic Supreme Court that serves at the behest of a right-wing presidency and political system.

That isn’t to entirely blame them, as the Democratic Party has shifted so far right on these issues that it is now to the right of Republicans from earlier last century. The reactionary mind has a way of infecting nearly everything and everyone. Our entire government has become a reactionary institution, but it’s important that we keep in mind who planned and led this coup. Then again, Lewis Powell, who wrote the Powell Memo, did so not only as a corporate lobbyist but also as a Democrat. And to show the bipartisan nature of this corporatocracy, it was Richard Nixon, a Republican president, who nominated him to the Supreme Court that same year, and the next year he took his seat. Still, it was the right-wing ALEC (American Legislative Exchange Council) that was most directly inspired by the Powell Memo, the organization that then helped enact this neoliberal and neo-fascist coup.

It’s not only the respectable good liberals of the ‘mainstream’ left that were in the dark about these machinations. Before the Powell Memo was leaked, anyone who pointed to the corporate takeover would have been called a conspiracy theorist, and so was no more welcome in the Democratic Party than in the Republican Party. Americans in general couldn’t see what was happening because the decisions and exchanges of money happened mostly behind closed doors. Besides, the corporate media had no interest in reporting on it, quite the opposite of course. There was no transparency, as planned, and so there was no accountability. Democracy dies in the dark.

Only now that Clown-Fuhrer Trumpf is in power do we suddenly see some pushback showing up in the mainstream. The struggle for power within the ruling elite goes into high gear. And our dear leader has put Judge Kavanaugh onto the Supreme Court. I’ve heard stalwart Republicans who despise and fear Trump nonetheless support him to the extent that he is pushing for a ‘conservative’ judiciary that supposedly opposes all those activist judges on the left. Yet Kavanaugh is as activist as they come. The main reason Trump picked him probably was that, when the time comes, the Supreme Court can be swung in defense of the administration.

After Barack Obama followed the example of George W. Bush in further advancing executive power, Democrats are now thinking that their support for authoritarianism may have been a bad idea after all. They assumed they were going to maintain power, since it was obvious to them that Hillary Clinton couldn’t lose the presidential election, and so the unrestrained executive could then have been used for their more paternalistic variety of friendly fascism. Trump and gang, of course, make a convenient scapegoat for Democratic sins. But that is a useless game at this point.

The joke is on all of them. And the entire political system is the punchline.

* * *

* What is the distinguishing feature of the reactionary mind? Maybe it has to do with the Dark Triad, the strong correlation between narcissism, psychopathy, and Machiavellianism. In particular, the last one might be key. Research has shown that those who are most fearful of Machiavellianism in others, fantasizing about conspiracies behind every door and lurking under every bed, are themselves more prone to acting in Machiavellian ways. That very much sounds like the reactionary mind.

In political terms, the reactionary mind gets expressed by the social dominance orientation (SDO), which essentially is another way of speaking of Machiavellianism. This is where authoritarianism comes in, as SDO types are attracted to and have talent in manipulating authoritarian followers. As such, maybe authoritarians will only be reactionary to the degree that a society becomes reactionary and SDO types gain power, since authoritarians will conform to almost any social norm, good or bad.

It’s only under these conditions that we can speak of a distinct reactionary mind. The rise to dominance of the reactionary mind indicates that something is being reacted to. From other research, what seems to elicit this is rising inequality and segregation that foment mistrust, fear, and anxiety. This is what we see before every period of instability and, in reaction, there are those who seek to enforce order.

What makes reactionaries unique is their innovativeness in how they go about this. They aren’t traditionalists, although they also will co-opt from traditionalists as they co-opt from the political left. One of the purest forms of the reactionary mind is nostalgia which, unsurprisingly, rarely has anything to do with the historical past. It is co-opting the rhetoric and emotion of tradition with little respect or concern about actual traditions.

A key example of this anti-traditional pseudo-traditionalism is constitutional originalism. What the reactionary right is pushing is a complete contradiction and betrayal of what the American Revolution was fought for and what the United States was founded upon, specifically the Declaration of Independence and the first constitution, the Articles of Confederation. These reactionaries will claim that liberalism is an attack on American political tradition, even as any informed person knows that liberalism (including in its progressive and radical forms) was core to American society from the beginning. Consider the view that a constitution is a living document, a pact of a specific community of people, an American tradition that came out of Quaker constitutionalism and (by way of the Quaker-raised John Dickinson) informed the democratic sensibility of the Articles of Confederation.

Such history is inconvenient and so irrelevant to the reactionary mind. But because reactionaries took control so early with the Constitutional Convention, their counterrevolution permanently obscured the true history, and that has left the American population with collective amnesia. As demonstrated by the extremes of Donald Trump, reactionaries love to invent ‘facts’ and then repeat them until they become accepted or else until all sense of truth is lost. This is what makes the reactionary mind Machiavellian and, as such, at home in the Dark Triad.

* * *

Powell Memorandum:

CONFIDENTIAL MEMORANDUM
Attack on American Free Enterprise System

DATE: August 23, 1971
TO: Mr. Eugene B. Sydnor, Jr., Chairman, Education Committee, U.S. Chamber of Commerce
FROM: Lewis F. Powell, Jr.

Neglected Opportunity in the Courts

American business and the enterprise system have been affected as much by the courts as by the executive and legislative branches of government. Under our constitutional system, especially with an activist-minded Supreme Court, the judiciary may be the most important instrument for social, economic and political change.

Other organizations and groups, recognizing this, have been far more astute in exploiting judicial action than American business. Perhaps the most active exploiters of the judicial system have been groups ranging in political orientation from “liberal” to the far left.

The American Civil Liberties Union is one example. It initiates or intervenes in scores of cases each year, and it files briefs amicus curiae in the Supreme Court in a number of cases during each term of that court. Labor unions, civil rights groups and now the public interest law firms are extremely active in the judicial arena. Their success, often at business’ expense, has not been inconsequential.

This is a vast area of opportunity for the Chamber, if it is willing to undertake the role of spokesman for American business and if, in turn, business is willing to provide the funds.

As with respect to scholars and speakers, the Chamber would need a highly competent staff of lawyers. In special situations it should be authorized to engage, to appear as counsel amicus in the Supreme Court, lawyers of national standing and reputation. The greatest care should be exercised in selecting the cases in which to participate, or the suits to institute. But the opportunity merits the necessary effort.

* * *

Founding fathers worried about corporate clout
by Angela Carella

A Very American Coup
by David McLaren

The Shift From Democracy To Corporate Dictatorship And The Tragedy Of The Lack Of Push Back. Citizens United v. Federal
by Michael Monk

Powell Memo
by Jeremy Wilburn

How the Right Packed the Court
by William Yeomans

Extremists on the Bench: Five Years After Citizens United, Our Rogue Supreme Court
by Steve Justin

Citizens United, corporate personhood, and the way forward
by Mal Warwick

Is the Supreme Court Determined to Expand Corporate Power?
by Robert Monks and Peter Murray

The Powell Memo And The Birth Of The Washington Lobbying Empire
by Tina-Desiree Berg

Context of ‘August 23, 1971 and After: ‘Powell Memo’ Leads to Massive Pro-Business Efforts to Influence Political, Social Discourse’
from History Commons

The Powell Memo
by .ren

* * *

If Confirmed, Brett Kavanaugh Fulfills the Powell Manifesto!
by Frank Puig

9 law experts on what Brett Kavanaugh means for the future of America
by Isolde Raftery and Sydney Brownstone

Kavanaugh would cement Supreme Court support for an oppressed minority — corporations
by Steven Strauss

With Kavanaugh, Trump Could Fashion The Most Business-Friendly Supreme Court Since The New Deal
by Michael Bobelian

How the Supreme Court Unleashed Business
by Justin Fox

Brett Kavanaugh’s Role in Schemes to Politicize the Judiciary Should Disqualify Him
by John Nichols

Brett Kavanaugh and the new judicial activism
by Matthew Yglesias

Judge Kavanaugh’s Activist Vision of Administrative Law
by Robert V. Percival

Brett Kavanaugh’s Dangerous Right-Wing Judicial Activism
by Elliot Mincberg

SCOTUS Nominee Brett Kavanaugh’s Record Depicts Dangerous Conservative Judicial Activism
by Devon Schmidt

Kavanaugh record hints at judicial activism in American election law
by Adav Noti and David Kolker

How Brett Kavanaugh Could Change the Supreme Court—and America
by Brian Bennett

“His Entire Career Has Been In Service Of The Republican Agenda”: D.C. Lawyers Dish On Whether Brett Kavanaugh Will Give Trump A Pass On Russia
by Abigail Tracy

Brett Kavanaugh And The Fraud Of Originalism
by Rich Barlow

Brett Kavanaugh’s nomination is a victory for ‘originalists’
by Jill Abramson

Why the Supreme Court is now America’s most dangerous branch
by Mary Kay Linge

Brett Kavanaugh Was Involved in 3 Different Crises of Democracy
by Charles Pierce

The Senate Must Closely Examine These Documents From Kavanaugh’s Bush Years
by Peter M. Shane

How Brett Kavanaugh Worked to Weaponize the War on Terror
by Faiza Patel and Andrew Boyle

Could Brett Kavanaugh Protect Trump From Prosecution?
by Andy Kroll

Brett Kavanaugh and the Imperial Presidency (Supreme Court)
from Best of the Left

Brett Kavanaugh’s Radical View of Executive Power
by Corey Brettschneider

Judge Kavanaugh: An Originalist With a New—and Terrifying—Interpretation of Executive Power
by Patricia J. Williams

Judge Brett Kavanaugh’s radically expansive view of the power of the presidency: Analysis
by Terry Moran

For Brett Kavanaugh, the Separation of Powers Is a One-Way Street
by Dan Froomkin

7 legal experts on how Kavanaugh views executive power — and what it could mean for Mueller
by Jen Kirby

Courting Disaster: The Trouble with Brett Kavanaugh’s Views of Executive Power in the Age of Trump
by Michael Waldman

Where Supreme Court Nominee Brett Kavanaugh Stands On Executive Power
from All Things Considered

Kavanaugh File: Executive Privilege
By Robert Farley

Brett Kavanaugh’s SCOTUS Nomination Is Bad News for Church/State Separation
by Hemant Mehta

Brett Kavanaugh and the Triumph of the Conservative Counterrevolution
by Bill Blum

Brett Kavanaugh Has Power To Strengthen Donald Trump, But Supreme Court Has Boosted Presidents For Decades
by Greg Price

If Brett Kavanaugh Wins, the Supreme Court Loses
by Jay Michaelson

Why Conservatives Could Regret Confirming Brett Kavanaugh
by Eric Levitz