Moralizing Gods as Effect, Not Cause

There is a new study on moralizing gods and social complexity, specifically as populations grow large. The authors are critical of the Axial Age theory: “Although our results do not support the view that moralizing gods were necessary for the rise of complex societies, they also do not support a leading alternative hypothesis that moralizing gods only emerged as a byproduct of a sudden increase in affluence during a first millennium ‘Axial Age’. Instead, in three of our regions (Egypt, Mesopotamia and Anatolia), moralizing gods appeared before 1500.”

I don’t take this criticism as too significant, since it is mostly an issue of dating. Objectively, there are no such things as distinct historical periods. Sure, you’ll find precursors of the Axial Age in the late Bronze Age. Then again, you’ll find precursors of the Renaissance and Protestant Reformation in the Axial Age. And you’ll find precursors of the Enlightenment in the Renaissance and Protestant Reformation. It turns out all of history is continuous. No big shocker there. Changes build up slowly until they hit a breaking point. It’s that breaking point, usually once the change has become widespread, that gets designated as the start of a new historical period. But the dividing line from one era to the next is always somewhat arbitrary.

This is important to keep in mind, and it is more than slightly relevant here. This reframing of what has been called the Axial Age accords perfectly with Julian Jaynes’ theories on the ending of the bicameral mind and the rise of egoic consciousness, along with the rise of the egoic gods with their jealousies, vengeance, and so forth. A half century ago, Jaynes noted that aspects of moralizing social orders were already appearing in the late Bronze Age, and he speculated that this had to do with the increasing complexity that set those societies up for collapse.

Religion itself, as a formal and distinct institution with standardized practices, didn’t exist until well into the Axial Age. Before that, rituals and spiritual/supernatural experience were apparently inseparable from everyday life, as the archaic self was inseparable from the communal sense of the world. Religion as we now know it is what replaced that prior way of being in relationship to ‘gods’, but it wasn’t only a different sense of the divine, for the texts refer to early people hearing the voices of spirits, godmen, dead kings, and ancestors. Religion was only necessary, according to Jaynes, when the voices went silent (i.e., when they were no longer heard externally because a singular voice had become internalized). The pre-religious mentality is what Jaynes called the bicameral mind and it represents the earliest and largest portion of civilization, maybe lasting for millennia upon millennia going back to the first city-states.

The pressures on the bicameral mind began to stress the social order beyond what could be managed. Those late Bronze Age civilizations had barely begun to adapt to that complexity and weren’t successful. Only Egypt was left standing and, in its sudden isolation amidst a world of wreckage and refugees, it too was transformed. We speak of the Axial Age in the context of a later date because it took many centuries for empires to be rebuilt around moralizing religions (and other totalizing systems and often totalitarian institutions; e.g., large centralized governments with rigid hierarchies). The archaic civilizations had to be mostly razed to the ground before something else could more fully take their place.

There is something else to understand. To have moralizing big gods to maintain social order, what is required is introspectable subjectivity (i.e., an individual to be controlled by morality). That is to say you need a narratizing inner space where a conscience can operate in the voicing of morality tales and the imagining of narratized scenarios such as considering alternate possible future actions, paths, and consequences. This is what Jaynes was arguing and it wasn’t vague speculation, as he was working with the best evidence he could accrue. Building on Jaynes’ work with language, Brian J. McVeigh has analyzed early texts to determine how often mind-words were found. Going by language use during the late Bronze Age, there was an increased focus on psychological ways of speaking. Prior to that, morality as such wasn’t necessary, no more than were written laws, court systems, police forces, and standing armies — all of which appeared rather late in civilization.

What creates the introspectable subjectivity of the egoic self, i.e., Jaynesian ‘consciousness’? Jaynes suggests that writing was a prerequisite and it needed to be advanced beyond the stage of simple record-keeping. A literary canon likely developed first to prime the mind for a particular form of narratizing. The authors of the paper do note that written language generally came first:

“This megasociety threshold does not seem to correspond to the point at which societies develop writing, which might have suggested that moralizing gods were present earlier but were not preserved archaeologically. Although we cannot rule out this possibility, the fact that written records preceded the development of moralizing gods in 9 out of the 12 regions analysed (by an average period of 400 years; Supplementary Table 2)—combined with the fact that evidence for moralizing gods is lacking in the majority of non-literate societies — suggests that such beliefs were not widespread before the invention of writing. The few small-scale societies that did display precolonial evidence of moralizing gods came from regions that had previously been used to support the claim that moralizing gods contributed to the rise of social complexity (Austronesia and Iceland), which suggests that such regions are the exception rather than the rule.”

As for the exceptions, it’s possible they were influenced by the moralizing religions of societies they came in contact with. Scandinavians, long before they developed complex societies with large concentrated populations, were traveling and trading all over Eurasia, the Levant, and into North Africa. This was happening in the Bronze Age, during the period of rising big gods and moralizing religion: “The analysis showed that the blue beads buried with the [Nordic] women turned out to have originated from the same glass workshop in Amarna that adorned King Tutankhamun at his funeral in 1323 BCE. King Tut’s golden deathmask contains stripes of blue glass in the headdress, as well as in the inlay of his false beard.” (Philippe Bohstrom, Beads Found in 3,400-year-old Nordic Graves Were Made by King Tut’s Glassmaker). It would be best to not fall prey to notions of untouched primitives.

We can’t assume that these exceptions were actually exceptional, in supposedly being isolated examples contrary to the larger pattern. Even hunter-gatherers have been heavily shaped by the millennia of civilizations that surrounded them. Occasionally finding moralizing religions among simpler and smaller societies is no more remarkable than finding metal axes and t-shirts among tribal people today. All societies respond to changing conditions and adapt as necessary to survive. The appearance of moralizing religions and the empires that went with them transformed the world far beyond the borders of any given society, not that borders were all that defined back then anyway. The large-scale consequences spread across the earth these past three millennia, a tidal wave hitting some places sooner than others but in the end none remain untouched. We are all now under the watchful eye of big gods or else their secularized equivalent, big brother of the surveillance state.

* * *

Moralizing gods appear after, not before, the rise of social complexity, new research suggests
by Redazione Redazione

Professor Whitehouse said: ‘The original function of moralizing gods in world history may have been to hold together large but rather fragile, ethnically diverse societies. It raises the question as to how some of those functions could still be performed in today’s increasingly secular societies – and what the costs might be if they can’t. Even if world history cannot tell us how to live our lives, it could provide a more reliable way of estimating the probabilities of different futures.’

When Ancient Societies Hit a Million People, Vengeful Gods Appeared
by Charles Q. Choi

“For we know Him who said, ‘And I will execute great vengeance upon them with furious rebukes; and they shall know that I am the Lord, when I shall lay my vengeance upon them.'” Ezekiel 25:17.

The God depicted in the Old Testament may sometimes seem wrathful. And in that, he’s not alone; supernatural forces that punish evil play a central role in many modern religions.

But which came first: complex societies or the belief in a punishing god? […]

The researchers found that belief in moralizing gods usually followed increases in social complexity, generally appearing after the emergence of civilizations with populations of more than about 1 million people.

“It was particularly striking how consistent it was [that] this phenomenon emerged at the million-person level,” Savage said. “First, you get big societies, and these beliefs then come.”

All in all, “our research suggests that religion is playing a functional role throughout world history, helping stabilize societies and people cooperate overall,” Savage said. “In really small societies, like very small groups of hunter-gatherers, everyone knows everyone else, and everyone’s keeping an eye on everyone else to make sure they’re behaving well. Bigger societies are more anonymous, so you might not know who to trust.”

At those sizes, you see the rise of beliefs in an all-powerful, supernatural person watching and keeping things under control, Savage added.

Complex societies gave birth to big gods, not the other way around: study
from Complexity Science Hub Vienna

“It has been a debate for centuries why humans, unlike other animals, cooperate in large groups of genetically unrelated individuals,” says Seshat director and co-author Peter Turchin from the University of Connecticut and the Complexity Science Hub Vienna. Factors such as agriculture, warfare, or religion have been proposed as main driving forces.

One prominent theory, the big or moralizing gods hypothesis, assumes that religious beliefs were key. According to this theory, people are more likely to cooperate fairly if they believe in gods who will punish them if they don’t. “To our surprise, our data strongly contradict this hypothesis,” says lead author Harvey Whitehouse. “In almost every world region for which we have data, moralizing gods tended to follow, not precede, increases in social complexity.” Even more so, standardized rituals tended on average to appear hundreds of years before gods who cared about human morality.

Such rituals create a collective identity and feelings of belonging that act as social glue, making people behave more cooperatively. “Our results suggest that collective identities are more important to facilitate cooperation in societies than religious beliefs,” says Harvey Whitehouse.

Society Creates God, God Does Not Create Society
by  Razib Khan

What’s striking is how soon moralizing gods show up after the spike in social complexity.

In the ancient world, early Christian writers explicitly asserted that it was not a coincidence that their savior arrived with the rise of the Roman Empire. They contended that a universal religion, Christianity, required a universal empire, Rome. There are two ways you can look at this. First, that the causal arrow is such that social complexity leads to moralizing gods, and that’s that. The former is a necessary condition for the latter. Second, one could suggest that moralizing gods are a cultural adaptation to large complex societies, one of many, that dampen instability and allow for the persistence of those societies. That is, social complexity leads to moralistic gods, who maintain and sustain social complexity. To be frank, I suspect the answer will be closer to the second. But we’ll see.

Another result that was not anticipated I suspect is that ritual religion emerged before moralizing gods. In other words, instead of “Big Gods,” it might be “Big Rules.” With hindsight, I don’t think this is coincidental since cohesive generalizable rules are probably essential for social complexity and winning in inter-group competition. It’s not a surprise that legal codes emerge first in Mesopotamia, where you had the world’s first anonymous urban societies. And rituals lend themselves to mass social movements in public to bind groups. I think it will turn out that moralizing gods were grafted on top of these general rulesets, which allow for coordination, cooperation, and cohesion, so as to increase their import and solidify their necessity due to the connection with supernatural agents, which personalize the sets of rules from on high.

Complex societies precede moralizing gods throughout world history
by Harvey Whitehouse, Pieter François, Patrick E. Savage, Thomas E. Currie, Kevin C. Feeney, Enrico Cioni, Rosalind Purcell, Robert M. Ross, Jennifer Larson, John Baines, Barend ter Haar, Alan Covey, and Peter Turchin

The origins of religion and of complex societies represent evolutionary puzzles [1–8]. The ‘moralizing gods’ hypothesis offers a solution to both puzzles by proposing that belief in morally concerned supernatural agents culturally evolved to facilitate cooperation among strangers in large-scale societies [9–13]. Although previous research has suggested an association between the presence of moralizing gods and social complexity [3,6,7,9–18], the relationship between the two is disputed [9–13,19–24], and attempts to establish causality have been hampered by limitations in the availability of detailed global longitudinal data. To overcome these limitations, here we systematically coded records from 414 societies that span the past 10,000 years from 30 regions around the world, using 51 measures of social complexity and 4 measures of supernatural enforcement of morality. Our analyses not only confirm the association between moralizing gods and social complexity, but also reveal that moralizing gods follow—rather than precede—large increases in social complexity. Contrary to previous predictions [9,12,16,18], powerful moralizing ‘big gods’ and prosocial supernatural punishment tend to appear only after the emergence of ‘megasocieties’ with populations of more than around one million people. Moralizing gods are not a prerequisite for the evolution of social complexity, but they may help to sustain and expand complex multi-ethnic empires after they have become established. By contrast, rituals that facilitate the standardization of religious traditions across large populations [25,26] generally precede the appearance of moralizing gods. This suggests that ritual practices were more important than the particular content of religious belief to the initial rise of social complexity.

Spartan Diet

There are a number of well-known low-carb diets. The most widely cited is that of the Inuit, but the Masai are often mentioned as well. I came across another example in Jack Weatherford’s Genghis Khan and the Making of the Modern World (see here for earlier discussion).

Mongols lived off of meat, blood, and milk paste. This diet, as the Chinese observed, allowed the Mongol warriors to ride and fight for days on end without needing to stop for meals. Part of this is because they could eat while riding, but there is a more important factor. This diet is so low-carb as to be ketogenic. And long-term ketosis leads to fat-adaptation, which allows for high energy and stamina even without meals, as long as one has enough fat reserves (i.e., body fat). The feast-and-fast style of eating is common among non-agriculturalists.

There are other historical examples I haven’t previously researched. Ori Hofmekler, in The Warrior Diet, claims that Spartans and Romans ate in a brief period each day, about a four-hour window, because of the practice of having a communal meal once a day. This basically meant fasting for lengthy periods, although today it is often described as time-restricted eating. As I recall, Sikh monks have a similar practice of only eating one meal a day during which they are free to eat as much as they want. The trick to this diet is that it decreases overall food intake and keeps the body in ketosis more often — if starchy foods are restricted enough and the body is fat-adapted, this lessens hunger and cravings.

The Mongols may have been doing something similar. The thing about ketosis is your desire to snack all the time simply goes away. You don’t have to force yourself into food deprivation and it isn’t starvation, even if going without food for several days. As long as there is plenty of body fat and you are fat-adapted, the body maintains health, energy and mood just fine until the next big meal. Even non-warrior societies do this. The meat-loving and blubber-gluttonous Inuit don’t tolerate aggression in the slightest, and they certainly aren’t known for amassing large armies and going on military campaigns. Or consider the Piraha who are largely pacifists, banishing their own members if they kill another person, even someone from another tribe. The Piraha get about 70% of their diet from fish and other meat, that is to say a ketogenic diet. Plus, even though surrounded by lush forests filled with a wide variety of food, plants and animals, the Piraha regularly choose not to eat — sometimes for no particular reason but also sometimes when doing communal dances over multiple days.

So, I wouldn’t be surprised if Spartan and Roman warriors had similar practices, especially the Spartans who didn’t farm much (the grains that were grown by the Spartans’ slaves likely were most often fed to the slaves, not as much to the ruling Spartans). As for Romans, their diet probably became more carb-centric as Rome grew into an agricultural empire. But early on in the days of the Roman Republic, Romans probably were like Spartans in the heavy focus they would have put on raising cattle and hunting game. Still, a diet doesn’t have to be heavy in fatty meat to be ketogenic, as long as it involves some combination of calorie restriction, portion control, narrow periods of meals, intermittent fasting, etc — all being other ways of lessening the total intake of starchy foods.

One of the most common meals for Spartans was a blood and bone broth using boiled pork mixed with salt and vinegar, the consistency being thick and the color black. That would have included a lot of fat, fat-soluble vitamins, minerals, collagen, electrolytes, and much else. It was a nutrient-dense elixir of health, however horrible it may seem to the modern palate. And it probably was low-carb, depending on what else might’ve been added to it. Even the wine Spartans drank was watered down, as drunkenness was frowned upon. The purpose was probably more to kill unhealthy microbes in the water, as watered-down beer did for early Americans millennia later, and so it would have added little sugar to the diet. Like the Mongols, they also enjoyed dairy. And they did have some grains such as bread, but apparently it was never a staple of their diet.

One thing they probably ate little of was olive oil, assuming it was used at all, as it was rarely mentioned in ancient texts and only became popular among Greeks in recent history, specifically the past century (discussed by Nina Teicholz in The Big Fat Surprise). Instead, Spartans, as with most other early Greeks, would have preferred animal fat, mostly lard in the case of the Spartans, whereas many other less landlocked Greeks preferred fish. Other foods the ancient Greeks, Spartans and otherwise, lacked were tomatoes, later introduced from the New World, and noodles, later introduced from China, both during the colonial era of recent centuries. So, a traditional Greek diet would have looked far different than what we think of as the modern ‘Mediterranean diet’.

On top of that, Spartans were proud of eating very little and proud of their ability to fast. Plutarch (2nd century AD) writes in Parallel Lives, “For the meals allowed them are scanty, in order that they may take into their own hands the fight against hunger, and so be forced into boldness and cunning”. Also, Xenophon, who was alive whilst Sparta existed, writes in Spartan Society 2, “furnish for the common meal just the right amount for [the boys in their charge] never to become sluggish through being too full, while also giving them a taste of what it is not to have enough.” (from The Ancient Warrior Diet: Spartans) It’s hard to see how this wouldn’t have been ketogenic. Spartans were known for being great warriors achieving feats of military prowess that would’ve been impossible for lesser men. On their fatty meat diet of pork and game, they were taller and leaner than other Greeks. They didn’t have large meals and fasted for most of the day, but when they did eat it was food dense in fat, calories, and nutrition.

* * *

Ancient Spartan Food and Diet
from Legend & Chronicles

The Secrets of Spartan Cuisine
by Helena P. Schrader

Western Individuality Before the Enlightenment Age

The Culture Wars of the Late Renaissance: Skeptics, Libertines, and Opera
by Edward Muir
Introduction
pp. 5-7

One of the most disturbing sources of late-Renaissance anxiety was the collapse of the traditional hierarchic notion of the human self. Ancient and medieval thought depicted reason as governing the lower faculties of the will, the passions, and the body. Renaissance thought did not so much promote “individualism” as it cut away the intellectual props that presented humanity as the embodiment of a single divine idea, thereby forcing a desperate search for identity in many. John Martin has argued that during the Renaissance, individuals formed their sense of selfhood through a difficult negotiation between inner promptings and outer social roles. Individuals during the Renaissance looked both inward for emotional sustenance and outward for social assurance, and the friction between the inner and outer selves could sharpen anxieties.2 The fragmentation of the self seems to have been especially acute in Venice, where the collapse of aristocratic marriage structures led to the formation of what Virginia Cox has called the single self, most clearly manifest in the works of several women writers who argued for the moral and intellectual equality of women with men.3 As a consequence of the fragmented understanding of the self, such thinkers as Montaigne became obsessed with what was then the new concept of human psychology, a term in fact coined in this period.4 A crucial problem in the new psychology was to define the relation between the body and the soul, in particular to determine whether the soul died with the body or was immortal. With its tradition of Averroist readings of Aristotle, some members of the philosophy faculty at the University of Padua recurrently questioned the Christian doctrine of the immortality of the soul as unsound philosophically. Other hierarchies of the human self came into question. Once reason was dethroned, the passions were given a higher value, so that the heart could be understood as a greater force than the mind in determining human conduct. When the body itself slipped out of its long-despised position, the sexual drives of the lower body were liberated and thinkers were allowed to consider sex, independent of its role in reproduction, a worthy manifestation of nature. The Paduan philosopher Cesare Cremonini’s personal motto, “Intus ut libet, foris ut moris est,” does not quite translate to “If it feels good, do it;” but it comes very close. The collapse of the hierarchies of human psychology even altered the understanding of the human senses. The sense of sight lost its primacy as the superior faculty, the source of “enlightenment”; the Venetian theorists of opera gave that place in the hierarchy to the sense of hearing, the faculty that most directly channeled sensory impressions to the heart and passions.

Historical and Philosophical Issues in the Conservation of Cultural Heritage
edited by Nicholas Price, M. Kirby Talley, and Alessandra Melucco Vaccaro
Reading 5: “The History of Art as a Humanistic Discipline”
by Erwin Panofsky
pp. 83-85

Nine days before his death Immanuel Kant was visited by his physician. Old, ill and nearly blind, he rose from his chair and stood trembling with weakness and muttering unintelligible words. Finally his faithful companion realized that he would not sit down again until the visitor had taken a seat. This he did, and Kant then permitted himself to be helped to his chair and, after having regained some of his strength, said, ‘Das Gefühl für Humanität hat mich noch nicht verlassen’—’The sense of humanity has not yet left me’. The two men were moved almost to tears. For, though the word Humanität had come, in the eighteenth century, to mean little more than politeness and civility, it had, for Kant, a much deeper significance, which the circumstances of the moment served to emphasize: man’s proud and tragic consciousness of self-approved and self-imposed principles, contrasting with his utter subjection to illness, decay and all that is implied in the word ‘mortality.’

Historically the word humanitas has had two clearly distinguishable meanings, the first arising from a contrast between man and what is less than man; the second between man and what is more. In the first case humanitas means a value, in the second a limitation.

The concept of humanitas as a value was formulated in the circle around the younger Scipio, with Cicero as its belated, yet most explicit spokesman. It meant the quality which distinguishes man, not only from animals, but also, and even more so, from him who belongs to the species homo without deserving the name of homo humanus; from the barbarian or vulgarian who lacks pietas and παιδεία—that is, respect for moral values and that gracious blend of learning and urbanity which we can only circumscribe by the discredited word “culture.”

In the Middle Ages this concept was displaced by the consideration of humanity as being opposed to divinity rather than to animality or barbarism. The qualities commonly associated with it were therefore those of frailty and transience: humanitas fragilis, humanitas caduca.

Thus the Renaissance conception of humanitas had a two-fold aspect from the outset. The new interest in the human being was based both on a revival of the classical antithesis between humanitas and barbaritas, or feritas, and on a survival of the mediaeval antithesis between humanitas and divinitas. When Marsilio Ficino defines man as a “rational soul participating in the intellect of God, but operating in a body,” he defines him as the one being that is both autonomous and finite. And Pico’s famous ‘speech’ ‘On the Dignity of Man’ is anything but a document of paganism. Pico says that God placed man in the center of the universe so that he might be conscious of where he stands, and therefore free to decide ‘where to turn.’ He does not say that man is the center of the universe, not even in the sense commonly attributed to the classical phrase, “man the measure of all things.”

It is from this ambivalent conception of humanitas that humanism was born. It is not so much a movement as an attitude which can be defined as the conviction of the dignity of man, based on both the insistence on human values (rationality and freedom) and the acceptance of human limitations (fallibility and frailty); from this, two postulates result: responsibility and tolerance.

Small wonder that this attitude has been attacked from two opposite camps whose common aversion to the ideas of responsibility and tolerance has recently aligned them in a united front. Entrenched in one of these camps are those who deny human values: the determinists, whether they believe in divine, physical or social predestination, the authoritarians, and those “insectolatrists” who profess the all-importance of the hive, whether the hive be called group, class, nation or race. In the other camp are those who deny human limitations in favor of some sort of intellectual or political libertinism, such as aestheticists, vitalists, intuitionists and hero-worshipers. From the point of view of determinism, the humanist is either a lost soul or an ideologist. From the point of view of authoritarianism, he is either a heretic or a revolutionary (or a counterrevolutionary). From the point of view of “insectolatry,” he is a useless individualist. And from the point of view of libertinism he is a timid bourgeois.

Erasmus of Rotterdam, the humanist par excellence, is a typical case in point. The church suspected and ultimately rejected the writings of this man who had said: “Perhaps the spirit of Christ is more largely diffused than we think, and there are many in the community of saints who are not in our calendar.” The adventurer Ulrich von Hutten despised his ironical skepticism and his unheroic love of tranquillity. And Luther, who insisted that “no man has power to think anything good or evil, but everything occurs in him by absolute necessity,” was incensed by a belief which manifested itself in the famous phrase: “What is the use of man as a totality [that is, of man endowed with both a body and a soul], if God would work in him as a sculptor works in clay, and might just as well work in stone?”

Food and Faith in Christian Culture
edited by Ken Albala and Trudy Eden
Chapter 3: “The Food Police”
Sumptuary Prohibitions On Food In The Reformation
by Johanna B. Moyer
pp. 80-83

Protestants too employed a disease model to explain the dangers of luxury consumption. Luxury damaged the body politic leading to “most incurable sickness of the universal body” (33). Protestant authors also employed Galenic humor theory, arguing that “continuous superfluous expense” unbalanced the humors leading to fever and illness (191). However, Protestants used this model less often than Catholic authors who attacked luxury. Moreover, those Protestants who did employ the Galenic model used it in a different manner than their Catholic counterparts.

Protestants also drew parallels between the damage caused by luxury to the human body and the damage excess inflicted on the French nation. Rather than a disease metaphor, however, many Protestant authors saw luxury more as a “wound” to the body politic. For Protestants the danger of luxury was not only the buildup of humors within the body politic of France but the constant “bleeding out” of humor from the body politic in the form of cash to pay for imported luxuries. The flow of cash mimicked the flow of blood from a wound in the body. Most Protestants did not see luxury foodstuffs as the problem, indeed most saw food in moderation as healthy for the body. Even luxury apparel could be healthy for the body politic in moderation, if it was domestically produced and consumed. Such luxuries circulated the “blood” of the body politic creating employment and feeding the lower orders. 72 De La Noue made this distinction clear. He dismissed the need to individually discuss the damage done by each kind of luxury that was rampant in France in his time as being as pointless “as those who have invented auricular confession have divided mortal and venal sins into infinity of roots and branches.” Rather, he argued, the damage done by luxury was in its “entire bulk” to the patrimonies of those who purchased luxuries and to the kingdom of France (116). For the Protestants, luxury did not pose an internal threat to the body and salvation of the individual. Rather, the use of luxury posed an external threat to the group, to the body politic of France.

The Reformation And Sumptuary Legislation

Catholics, as we have seen, called for antiluxury regulations on food and banqueting, hoping to curb overeating and the damage done by gluttony to the body politic. Although some Protestants also wanted to restrict food and banqueting, more often French Protestants called for restrictions on clothing and foreign luxuries. These differing views of luxury during and after the French Wars of Religion not only give insight into the theological differences between these two branches of Christianity but also provide insight into the larger pattern of the sumptuary regulation of food in Europe in this period. Sumptuary restrictions were one means by which Catholics and Protestants enforced their theology in the post-Reformation era.

Although Catholicism is often correctly cast as the branch of Reformation Christianity that gave the individual the least control over their salvation, it was also true that the individual Catholic’s path to salvation depended heavily on ascetic practices. The responsibility for following these practices fell on the individual believer. Sumptuary laws on food in Catholic areas reinforced this responsibility by emphasizing what foods should and should not be eaten and mirrored the central theological practice of fasting for the atonement of sin. Perhaps the historiographical cliché that it was only Protestantism which gave the individual believer control of his or her salvation needs to be qualified. The arithmetical piety of Catholicism ultimately placed the onus on the individual to atone for each sin. Moreover, sumptuary legislation tried to steer the Catholic believer away from the more serious sins that were associated with overeating, including gluttony, lust, anger, and pride.

Catholic theology meshed nicely with the revival of Galenism that swept through Europe in this period. Galenists preached that meat eating, overeating, and the imbalance in humors which accompanied these practices, led to behavioral changes, including an increased sex drive and increased aggression. These physical problems mirrored the spiritual problems that luxury caused, including fornication and violence. This is why so many authors blamed the French nobility for the luxury problem in France. Nobles were seen not only as more likely to bear the expense of overeating but also as more prone to violence. 73

Galenism also meshed nicely with Catholicism because it was a very physical religion in which the control of the physical body figured prominently in the believer’s path to salvation. Not surprisingly, by the seventeenth century, Protestants gravitated away from Galenism toward the chemical view of the body offered by Paracelsus. 74 Catholic sumptuary law embodied a Galenic view of the body where sin and disease were equated and therefore pushed regulations that advocated each person’s control of his or her own body.

Protestant legislators, conversely, were not interested in the individual diner. Sumptuary legislation in Protestant areas ran the gamut from control of communal displays of eating, in places like Switzerland and Germany, to little or no concern with restrictions on luxury foods, as in England. For Protestants, it was the communal role of food and luxury use that was important. Hence the laws in Protestant areas targeted food in the context of weddings, baptisms, and even funerals. The English did not even bother to enact sumptuary restrictions on food after their break with Catholicism. The French Protestants who wrote on luxury glossed over the deleterious effects of meat eating, even proclaiming it to be healthful for the body while producing diatribes against the evils of imported luxury apparel. The use of Galenism in the French Reformed treatises suggests that Protestants too were concerned with a “body,” but it was not the individual body of the believer that worried Protestant legislators. Sumptuary restrictions were designed to safeguard the mystical body of believers, or the “Elect” in the language of Calvinism. French Protestants used the Galenic model of the body to discuss the damage that luxury did to the body of believers in France, but ultimately to safeguard the economic welfare of all French subjects. The Calvinists of Switzerland used sumptuary legislation on food to protect those predestined for salvation from the dangerous eating practices of members of the community whose overeating suggested they might not be saved.

Ultimately, sumptuary regulations in the Reformation spoke to the Christian practice of fasting. Fasting served very different functions in Protestant and Catholic theology. Raymond Mentzer has suggested that Protestants “modified” the Catholic practice of fasting during the Reformation. The major reformers, including Luther, Calvin, and Zwingli, all rejected fasting as a path to salvation. 75 For Protestants, fasting was a “liturgical rite,” part of the cycle of worship and a practice that served to “bind the community.” Fasting was often a response to adversity, as during the French Wars of Religion. For Catholics, fasting was an individual act, just as sumptuary legislation in Catholic areas targeted individual diners. However, for Protestants, fasting was a communal act, “calling attention to the body of believers.” 76 The symbolic nature of fasting, Mentzer argues, reflected Protestant rejection of transubstantiation. Catholics continued to believe that God was physically present in the host, but Protestants believed His was only a spiritual presence. When Catholics took Communion, they fasted to cleanse their own bodies so as to receive the real, physical body of Christ. Protestants, on the other hand, fasted as spiritual preparation because it was their spirits that connected with the spirit of Christ in the Eucharist. 77

“…there resides in every language a characteristic world-view”

“Via the latter, qua character of a speech-sound, a pervasive analogy necessarily prevails in the same language; and since a like subjectivity also affects language in the same nation, there resides in every language a characteristic world-view. As the individual sound stands between man and the object, so the entire language steps in between him and the nature that operates, both inwardly and outwardly, upon him. He surrounds himself with a world of sounds, so as to take up and process within himself the world of objects. These expressions in no way outstrip the measure of the simple truth. Man lives primarily with objects, indeed, since feeling and acting in him depend on his presentations, he actually does so exclusively, as language presents them to him. By the same act whereby he spins language out of himself, he spins himself into it, and every language draws about the people that possesses it a circle whence it is possible to exit only by stepping over at once into the circle of another one. To learn a foreign language should therefore be to acquire a new standpoint in the world-view hitherto possessed, and in fact to a certain extent is so, since every language contains the whole conceptual fabric and mode of presentation of a portion of mankind.”

Wilhelm von Humboldt
On Language (1836)

* * *

Wilhelm von Humboldt
from Wikipedia

Wilhelm von Humboldt
from Stanford Encyclopedia of Philosophy

Wilhelm von Humboldt lectures
from Université de Rouen

Wilhelm von Humboldt and the World of Languages
by Ian F. McNeely

Wilhelm von Humboldt: A Critical Review On His Philosophy of Language, Theory and Practice of Education
by Dr Arlini Alias

The theory of linguistic relativity from the historical perspective
by Iaroslav

Democratic Values in Balance or Opposition

At the New Yorker, Nathan Heller has an interesting piece about equality and freedom, The Philosopher Redefining Equality.

Mainstream American thought sees them as oppositional. But maybe the common ground between them is fairness. There can be neither equality nor freedom in an unfair society, although there can be liberty in an unfair society. That goes off on a tangent, but keep it in mind as background info. A society of freedom is not the same as a society of liberty, and a society of fairness might be a whole other thing as well. Yet it has been argued that English is the only language with exact words for all three concepts (see Liberty, Freedom, and Fairness) — for example, George Fletcher in Basic Concepts of Legal Thought writes,

“Remarkably, our concept of fairness does not readily translate into other languages. It is virtually impossible to find a suitable translation for fairness in European or Semitic languages. As a result, the term is transplanted directly in some languages such as German and Hebrew, and absent in others, such as French, which is resistant to adopting loan words that carry unique meanings.” (quoted by Manny Echevarria in Does Fairness Translate?)

The difference between the two cultural worldviews and ideological systems is what led to both the English Civil War and the American Civil War. This conflict has been internalized within American society, but it has never been resolved. Americans simply have pretended it went away when, in reality, the conflict has grown worse.

Heller writes about the experience and work of Elizabeth Anderson. As Heller puts it, “Working at the intersection of moral and political philosophy, social science, and economics, she has become a leading theorist of democracy and social justice.” And, as related to the above, “She has built a case, elaborated across decades, that equality is the basis for a free society.” Freedom isn’t only closely linked to equality but built upon and dependent upon it. That makes sense from an etymological perspective, as freedom originally meant living among equals in sharing freedom as a member of a free people, at least a member in good standing (ignoring the minor detail of the categories of people excluded: women, slaves, and strangers; but it might be noted that these categories weren’t always permanent statuses and unchangeable fates, since sometimes women could become warriors or divorce their husbands, slaves could end their bondage, and strangers could marry into the community). Hence the word ‘friend’ has the same origin — to be free is to be among friends, among those one trusts and relies upon as they do in return.

Fairness, by the way, is an odd word. It has an English meaning of fair, handsome, beautiful, and attractive (with some racist connotations); nice, clean, bright, clear, and pleasant; moderate as in not excessive in any direction (a fair balance or fair weather, neither hot nor cold) but also generous or plentiful as in considerable (a fair amount). And in various times and places, it has meant favorable, helpful, promising good fortune, and auspicious; morally or comparatively good, socially normative, average, suitable, agreeable, with propriety and justice, right conduct, etc; which overlaps with the modern sense of equitable, impartial, just, and free from bias (from fair and well to fair and square, from fair-dealing to fair play). But its other linguistic variants connect to setting, putting, placing, acting, doing, making, and becoming; make, compose, produce, construct, fashion, frame, build, erect, and appoint. There is an additional sense of sex and childbirth (i.e., fucking and birthing), the ultimate doing and making; and so seemingly akin to worldly goodness of fecundity, abundance, and creation. The latter maybe where the English meaning entered the picture. More than being fair as a noun, it is a verb of what one is doing in a real world sense.

Interestingly, some assert the closest etymological correlate to fairness in modern Swedish is ‘rättvis’. It breaks down to the roots ‘rätt’ and ‘vis’, the former signifying what is ‘correct’ or ‘just’ and the latter ‘wise’ (correct-wise or just-wise in the sense of clockwise or otherwise). This Swedish word is related to the English ‘righteous’. That feels right in the moral component of fairness that can be seen early on in its development as a word. We think of what is righteous as having a more harsh and demanding tone than fairness. But I would note how easy it is to pair fairness with justice as if they belong together. John Rawls has a theory of justice as fairness. That makes sense, in accord with social science research that shows humans strongly find unjust that which is perceived as unfair. Then again, as freedom is not exactly the same as liberty, righteousness is not exactly the same as justice. There might be a reason that the Pledge of Allegiance states “with liberty and justice for all”, not liberty and righteousness, not freedom and justice. Pledging ourselves to liberty and justice might put us at odds with a social order of fairness, as paired with freedom, equality, or righteousness. Maybe trying to translate these two worldviews into each other is what created so much confusion in the first place.

All these notions of and related to fairness, one might argue, indicate how lacking in fairness our society is, whatever one might think of liberty and justice. Humans tend to obsess over articulating and declaring what is found most wanting. A more fair society would likely not bother to have a word for it, as the sense of fairness would be taken for granted and would simply exist in the background as ideological realism and cultural worldview. In Integrity in Depth, John Beebe makes this argument about the word ‘integrity’ for modern society, whereas the integral/integrated lifestyle of many tribal people living in close relationship to their environment requires no such word. A people need something that is not integrated, something seen as needing to be integrated, in order to speak of what is or might be of integrity.

Consider the Piraha who are about as equal a society as can exist, with fairness only becoming an issue in recent history because of trade with outsiders. The Piraha wanted Daniel Everett to teach them math because they couldn’t determine if they were being given a fair deal or were being cheated, a non-issue among Piraha themselves since they don’t have a currency or even terms for numerals. A word like fairness would be far too much of a generalized abstraction for the Piraha, as traditionally most interactions were concrete and personal, and as such more along the lines of Germanic ‘freedom’.

It might put some tribal people in an ‘unfair’ position if they don’t have the language to fully articulate unfairness, at least in economic terms. We Americans have greater capacity and talent in fighting for fairness because we get a lot of practice, as we can’t expect it as our cultural birthright. Unsurprisingly, we talk a lot about it and in great detail. Maybe to speak of fairness is always to imply both its lack and desirability. From the view of linguistic relativism, such a word invokes a particular worldview that shapes and influences thought, perception, and behavior.

This is observed in social science research when WEIRD populations are compared to others, as seen in Joe Henrich’s study of the ultimatum game: “It had been thought a matter of settled science that human beings insist on fairness in the division, or will punish the offering party by refusing to accept the offer. This was thought an interesting result, because economics would predict that accepting any offer is better than rejecting an offer of some money. But the Machiguenga acted in a more economically rational manner, accepting any offer, no matter how low. “They just didn’t understand why anyone would sacrifice money to punish someone who had the good luck of getting to play the other role in the game,” Henrich said” (John Watkins, The Strangeness of Being WEIRD). There is no impulse to punish an unfairness that, according to the culture, isn’t perceived as unfair. It appears that the very concept of fairness was irrelevant or maybe incomprehensible to the Machiguenga, at least under these conditions. But if they are forced to deal more with outsiders who continually take advantage of them or who introduce perverse incentives into their communities, they surely would have to develop the principle of fairness and learn to punish unfairness. Language might be the first sign of such a change.

A similar point is made by James L. Kugel in The Great Shift about ancient texts written by temple priests declaring laws and prohibitions. This probably hints at a significant number of people at the time doing the complete opposite or else the priests wouldn’t have bothered to make it clear, often with punishments for those who didn’t fall in line. As Julian Jaynes explains, the earliest civilizations didn’t need written laws because the social norms were so embedded within not only the social fabric but the psyche. Laws were later written down because social norms were breaking down, specifically as societies grew in size, diversity, and complexity. We are now further down this road of the civilizational project and legalism is inseparable from our everyday experience, and so we need many words such as fairness, justice, righteousness, freedom, liberty, etc. We are obsessed with articulating these values as if by doing so we could re-enforce social norms that refuse to solidify and stabilize. So, we end up turning to centralized institutions such as big government to impose these values on individuals, markets, and corporations. And we need lawyers, judges, and politicians to help us navigate this legalistic world that we are anxious about falling apart at any moment.

This interpretation is supported by the evidence of the very society in which the word fairness was first used. “The tribal uses of fair and fairness were full of historical irony,” pointed out David Hackett Fischer in Fairness and Freedom (Kindle Locations 647-651). “These ideas flourished on the far fringes of northwestern Europe among groups of proud, strong, violent, and predatory people who lived in hard environments, fought to the death for the means of life, and sometimes preyed even on their own kin. Ideas of fairness and fair play developed as a way of keeping some of these habitual troublemakers from slaughtering each other even to the extinction of the tribe. All that might be understood as the first stage in the history of fairness.” This interpretation is based on a reading of the sagas, which were written down quite late in Scandinavian history. It was a period of great cultural shifts, including radical and revolutionary introductions such as writing itself. And, I might add, this followed upon the millennium of ravage from the collapse of the bicameralism of Bronze Age civilizations. The society was under great pressure, both from within and without, as the sagas describe those violent times. It was the sense of a lack of fairness amid societal chaos and conflict that made it necessary to invent fairness as a cultural ideal and social norm.

It’s impossible to argue we live in a fair society. The reason Adam Smith defended equality, for example, is that he thought it would be a nice ideal to aspire to, not that we had already attained it. On the other hand, there is an element of what has been lost. Feudal society had clearly spelled out rights and responsibilities that were agreed upon and followed as social norms, and so in that sense it was a fair society. The rise of capitalism with the enclosure and privatization of the commons was experienced as unfair, to which Thomas Paine was also responding with his defense of a citizen’s dividend to recompense what was taken, specifically as theft not only from living generations but also from all generations following. When a sense of fairness was still palpable, as understood within the feudal social order, no argument for fairness as against unfairness was necessary. It likely is no coincidence that the first overt class war happened in the English Civil War, when the enclosure movement was in high gear, the tragic results of which Paine would see in the following century, although the enclosure movement didn’t reach full completion until the 19th century with larger scale industrialization and farming.

As for how fairness accrued its modern meaning, I suspect that it is one of the many results of the Protestant Reformation as a precursor to the Enlightenment Age. The theological context became liberal. As Anna Wierzbicka put it: “‘Fair play’ as a model of human interaction highlights the ‘procedural’ character of the ethics of fairness. Arguably, the emergence of the concept of ‘fairness’ reflects a shift away from absolute morality to ‘procedural (and contractual) morality,’ and the gradual shift from ‘just’ to ‘fair’ can be seen as parallel to the shifts from good to right and also from wise (and also true) to reasonable: in all cases, there is a shift from an absolute, substantive approach to a procedural one.” (from English: Meaning and Culture, as quoted by Mark Liberman in No word for fair?)

Nathan Heller’s article is about how the marriage of values appears like a new and “unorthodox notion”. But Elizabeth Anderson observes that, “through history, equality and freedom have arrived together as ideals.” This basic insight was a central tenet of Adam Smith’s economic philosophy. Smith said a free society wasn’t possible with high inequality. It simply wasn’t possible. Full stop. And his economic views are proclaimed as the basis of Western capitalism. So, how did this foundational understanding get lost along the way? I suppose because it was inconvenient to the powers that be who were looking for an excuse to further accumulate not only wealth but power.

It wasn’t only one part of the ruling elite that somehow ‘forgot’ this simple truth. From left to right, the establishment agreed in defense of the status quo: “If individuals exercise freedoms, conservatives like to say, some inequalities will naturally result. Those on the left basically agree—and thus allow constraints on personal freedom in order to reduce inequality. The philosopher Isaiah Berlin called the opposition between equality and freedom an “intrinsic, irremovable element in human life.” It is our fate as a society, he believed, to haggle toward a balance between them.” For whatever reason, there was a historical shift, a “Post-Enlightenment move” (Echevarria), both in the modern meaning of fairness and the modern deficiency in fairness.

That still doesn’t explain how the present ideological worldview became the dominant paradigm that went unquestioned by hundreds of millions of ordinary Americans and other Westerners. Direct everyday experience contradicts this neo-feudalist dogma of capitalist realism. There is nothing that Anderson observed in her own work experience that any worker couldn’t notice in almost any workplace. The truth has always been there right in front of us. Yet few had eyes to see. When lost in the darkness of a dominant paradigm, sometimes clear vision requires an imaginative leap into reality. I guess it’s a good thing we have a word to designate the ache we feel for a better world.

* * *

Fairness and Freedom
by David Hackett Fischer
Kindle Locations 596-675

Origins of the Words Fairness and Fair

Where did this language of fairness come from? What is the origin of the word itself? To search for the semantic roots of fair and fairness is to make a surprising discovery. Among widely spoken languages in the modern world, cognates for fairness and fair appear to have been unique to English, Danish, Norwegian, and Frisian until the mid-twentieth century. 40 They remained so until after World War II, when other languages began to import these words as anglicisms. 41

The ancestry of fair and fairness also sets them apart in another way. Unlike most value terms in the Western world, they do not derive from Greek or Latin roots. Their etymology is unlike that of justice and equity, which have cognates in many modern Western languages. Justice derives from the Latin ius, which meant a conformity to law or divine command, “without reference to one’s own inclinations.” Equity is from the Latin aequitas and its adjective aequus, which meant level, even, uniform, and reasonable. 42

Fairness and fair have a different origin. They derive from the Gothic fagrs, which meant “pleasing to behold,” and in turn from an Indo-European root that meant “to be content.” 43 At an early date, these words migrated from Asia to middle Europe. There they disappeared in a maelstrom of many languages, but not before they migrated yet again to remote peninsulas and islands of northern and western Europe, where they persisted to our time. 44 In Saxon English, for example, the old Gothic faeger survived in the prose of the Venerable Bede as late as the year 888. 45 By the tenth century, it had become faire in English speech. 46

In these early examples, fagr, faeger, fair, and fairness had multiple meanings. In one very old sense, fair meant blond or beautiful or both—fair skin, fair hair. As early as 870 a Viking king was called Harald Harfagri in Old Norse, or Harold Fairhair in English. In another usage, it meant favorable, helpful, and good—fair wind, fair weather, fair tide. In yet a third it meant spotless, unblemished, pleasing, and agreeable: fair words, fair speech, fair manner. All of these meanings were common in Old Norse, and Anglo-Saxon in the tenth and eleventh centuries. By 1450, it also meant right conduct in rivalries or competitions. Fair play, fair game, fair race, and fair chance appeared in English texts before 1490. 47

The more abstract noun fairness was also in common use. The great English lexicographer (and father of the Oxford English Dictionary) Sir James Murray turned up many examples, some so early that they were still in the old Gothic form—such as faegernyss in Saxon England circa 1000, before the Norman Conquest. It became fayreness and fairnesse as an ethical abstraction by the mid-fifteenth century, as “it is best that he trete him with farenes” in 1460. 48

As an ethical term, fairness described a process and a solution that could be accepted by most parties—fair price, fair judgment, fair footing, fair and square. Sometimes it also denoted a disposition to act fairly: fair-minded, fair-natured, fair-handed. All of these ethical meanings of fair and fairness were firmly established by the late sixteenth and early seventeenth centuries. Fair play appears in Shakespeare (1595); fair and square in Francis Bacon (1604); fair dealing in Lord Camden (before 1623). 49

To study these early English uses of fairness and fair is to find a consistent core of meaning. Like most vernacular words, they were intended not for study but for practical use. In ethical applications, they described a way of resolving an issue that is contested in its very nature: a bargain or sale, a race or rivalry, a combat or conflict. Fundamentally, fairness meant a way of settling contests and conflicts without bias or favor to any side, and also without deception or dishonesty. In that sense fairness was fundamentally about not taking undue advantage of other people. As early as the fifteenth century it variously described a process, or a result, or both together, but always in forms that fair-minded people would be willing to accept as legitimate.

Fairness functioned as a mediating idea. It was a way of linking individuals to groups, while recognizing their individuality at a surprisingly early date. Always, fairness was an abstract idea of right conduct that could be applied in different ways, depending on the situation. For example, in some specific circumstances, fairness was used to mean that people should be treated in the same way. But in other circumstances, fairness meant that people should be treated in different ways, or special ways that are warranted by particular facts and conditions, such as special merit, special need, special warrant, or special desire. 50

Fairness was a constraint on power and strength, but it did not seek to level those qualities in a Procrustean way. 51 Its object was to regulate ethical relationships between people who possess power and strength in different degrees—a fundamental fact of our condition. A call for fairness was often an appeal of the weak to the conscience of the strong. It was the eternal cry of an English-speaking child to parental authority: “It’s not fair!” As any parent knows, this is not always a cry for equality.

Modern Applications of Fairness: Their Consistent Core of Customary Meaning

Vernacular ideas of fairness and fair have changed through time, and in ways that are as unexpected as their origin. In early ethical usage, these words referred mostly to things that men did to one another—a fair fight, fair blow, fair race, fair deal, fair trade. They also tended to operate within tribes of Britons and Scandinavians, where they applied to freemen in good standing. Women, slaves, and strangers from other tribes were often excluded from fair treatment, and they bitterly resented it.

The tribal uses of fair and fairness were full of historical irony. These ideas flourished on the far fringes of northwestern Europe among groups of proud, strong, violent, and predatory people who lived in hard environments, fought to the death for the means of life, and sometimes preyed even on their own kin. Ideas of fairness and fair play developed as a way of keeping some of these habitual troublemakers from slaughtering each other even to the extinction of the tribe. All that might be understood as the first stage in the history of fairness. 52

Something fundamental changed in a second stage, when the folk cultures of Britain and Scandinavia began to grow into an ethic that embraced others beyond the tribe—and people of every rank and condition. This expansive tendency had its roots in universal values such as the Christian idea of the Golden Rule. 53 That broader conception of fairness expanded again when it met the humanist ideas of the Renaissance, the universal spirit of the Enlightenment, the ecumenical spirit of the Evangelical Movement, and democratic revolutions in America and Europe. When that happened, a tribal idea gradually became more nearly universal in its application. 54 Quantitative evidence suggests an inflection at the end of the eighteenth century. The frequency of fairness in English usage suddenly began to surge circa 1800. The same pattern appears in the use of the expression natural justice. 55

Then came a third stage in the history of fairness, when customary ideas began to operate within complex modern societies. In the twentieth century, fairness acquired many technical meanings with specific applications. One example regulated relations between government and modern media (“the fairness doctrine”). In another, fairness became a professional standard for people who were charged with the management of other people’s assets (“fiduciary fairness”). One of the most interesting modern instances appeared among lawyers as a test of “balance or impartiality” in legal proceedings, or a “subjective standard by which a court is deemed to have followed due process,” which began to be called “fundamental fairness” in law schools. Yet another example was “fair negotiation,” which one professional negotiator defined as a set of rules for “bargaining with the Devil without losing your soul.” One of the most complex applications is emerging today as an ethic of “fairness in electronic commerce.” These and other modern applications of fairness appear in legal treatises, professional codes, and complex bodies of regulatory law. 56

Even as modern uses of fair and fairness have changed in all of those ways, they also preserved a consistent core of vernacular meaning that had appeared in Old English, Norse, and Scandinavian examples and is still evident today. To summarize, fair and fairness have long been substantive and procedural ideas of right conduct, designed to regulate relations among people who are in conflict or rivalry or opposition in particular ways. Fairness means not taking undue advantage of others. It is also about finding ways to settle differences through a mutual acceptance of rules and processes that are thought to be impartial and honest—honesty is fundamental. And it is also about living with results that are obtained in this way. As the ancient Indo-European root of fagrs implied, a quest for fairness is the pursuit of practical solutions with which opposing parties could “be content.” These always were, and still are, the fundamental components of fairness. 57

Notes:

40. For an excellent and very helpful essay on fair and fairness by a distinguished cultural and historical linguist, see Anna Wierzbicka, “Being FAIR: Another Key Anglo Value and Its Cultural Underpinnings,” in English: Meaning and Culture (New York and Oxford, 2006), 141–70. See also Bart Wilson, “Fair’s Fair,” http://www.theatlantic.com/business/print/2009/01/fairs-fair/112; Bart J. Wilson, “Contra Private Fairness,” May 2008, http://www.chapman.edu/images/userimges/jcunning/Page_11731/ContraPrivateFairness05–2008.pdf; James Surowiecki, “Is the Idea of Fairness Universal?” Jan. 26, 2009, http://www.newyorker.com/online/blogs/jamessurowiecki/2009/01/is; and Mark Liberman, “No Word for Fair?” Jan. 28, 2009, http://languagelog.ldc.upenn.edu/nll/?p=1080.

41. Oxford English Dictionary, s.v. “fair” and “fairness.” Cognates for the English fairness include fagr in Icelandic and Old Norse, retferdighet in modern Norwegian, and retfaerighed in modern Danish. See Geír Tòmasson Zoëga, A Concise Dictionary of Old Icelandic (Toronto, 2004), s.v. “fagr.” For Frisian, see Karl von Richthofen, Altfriesisches Wörterbuch (Gottingen, 1840); idem, Friesische Rechtsquellen (Berlin, 1840). On this point I agree and disagree with Anna Wierzbicka. She believes that fair and unfair “have no equivalents in other European languages (let alone non-European ones) and are thoroughly untranslatable” (“Being FAIR,” 141). This is broadly true, but with the exception of Danish, Norwegian, Frisian, and Icelandic. Also I’d suggest that the words can be translated into other languages, but without a single exactly equivalent word. I believe that people of all languages are capable of understanding the meaning of fair and fairness, even if they have no single word for it.

42. OED, s.v. “justice,” “equity.”

43. Webster’s New World Dictionary, 2nd College Edition, ed. David B. Guralnik (New York and Cleveland, 1970), s.v. “fair”; OED, s.v. “fair.”

44. Ancient cognates for fair included fagar in Old English and fagr in Old Norse.

45. W. J. Sedgefield, Selections from the Old English Bede, with Text and Vocabulary, on an Early West Saxon Basis, and a Skeleton Outline of Old English Accidence (Manchester, London, and Bombay, 1917), 77; and in the attached vocabulary list, s.v. the noun “faeger” and the adverbial form “faegere.” Also Joseph Bosworth and T. Northcote Tollen, An Anglo-Saxon Dictionary, Based on Manuscript Collections (Oxford, 1882, 1898), s.v. “faeger,” ff.

46. Not to be confused with this word is another noun fair, for a show or market or carnival, from the Latin feria, feriae, feriarum, festival or holiday—an entirely different word, with another derivation and meaning.

47. Liberman, “No Word for Fair?”

48. OED, s.v. “fairness,” 1.a, b, c.

49. For fair and fairness in Shakespeare, see King John V.i.67. For fair and square in Francis Bacon in 1604 and Oliver Cromwell in 1649, see OED, s.v. “fair and square.”

50. Herein lies one of the most difficult issues about fairness. How can we distinguish between ordinary circumstances where fairness means that all people should be treated alike, and extraordinary circumstances where fairness means different treatment? This problem often recurs in cases over affirmative action in the United States. No court has been able to frame a satisfactory general rule, in part because of ideological differences on the bench.

51. Procrustes was a memorable character in Greek mythology, a son of Poseidon called Polypaemon or Damastes, and nicknamed Procrustes, “the Stretcher.” He was a bandit chief in rural Attica who invited unwary travelers to sleep in an iron bed. If they were longer than the bed, Procrustes cut off their heads or feet to make them fit; if too short he racked them instead. Procrustes himself was dealt with by his noble stepbrother Theseus, who racked him on his own bed and removed his head according to some accounts. In classical thought, and modern conservatism, the iron bed of Procrustes became a vivid image of rigid equality. The story was told by Diodorus Siculus, Historical Library 4.59; Pausanias, Guide to Greece 1.38.5; and Plutarch, Lives, Theseus 2.

52. Jesse Byock, Viking Age Iceland (London, 2001), 171–84; the best way to study the origin of fairness in a brutal world is in the Norse sagas themselves, especially Njal’s Saga, trans. and ed. Magnus Magnusson and Hermann Palsson (London, 1960, 1980), 21–22, 40, 108–11, 137–39, 144–45, 153, 163, 241, 248–55; Egil’s Saga, trans. and ed. Hermann Palsson and Paul Edwards (London, 1976, 1980), 136–39; Hrafnkel’s Saga and Other Icelandic Stories, trans. and ed. Hermann Palsson (London, 1971, 1980), 42–60.

53. Matthew 25:40; John 4:19–21; Luke 10:27.

54. The vernacular history of humanity, expanding in the world, is a central theme in David Hackett Fischer, Champlain’s Dream (New York and Toronto, 2008); as the expansion of vernacular ideas of liberty and freedom is central to Albion’s Seed (New York and Oxford, 1989) and Liberty and Freedom (New York and Oxford, 2005); and the present inquiry is about the expansion of vernacular ideas of fairness in the world. One purpose of all these projects is to study the history of ideas in a new key. Another purpose is to move toward a reunion of history and moral philosophy, while history also becomes more empirical and more logical in its epistemic frame.

55. For data on frequency, see Google Labs, Books Ngram Viewer, http://ngrams.googlelabs.com, s.v. “fairness” and “natural justice.” Similar patterns and inflection-points appear for the corpus of “English,” “British English,” and “American English,” in the full span 1500–2000, smoothing of 3. Here again on the history of fairness, I agree and disagree with Wierzbicka (“Being FAIR,” 141–67). The ethical meanings of fairness first appeared earlier than she believes to be the case. But I agree on the very important point that ethical use of fairness greatly expanded circa 1800.

56. Fred W. Friendly, The Good Guys, the Bad Guys, and the First Amendment (New York, 1976) is the classic work on the fairness doctrine. Quotations in this paragraph are from Carrie Menkow-Meadow and Michael Wheeler, eds., What’s Fair: Ethics for Negotiators (Cambridge, 2004), 57; Philip J. Clements and Philip W. Wisler, The Standard and Poor’s Guide to Fairness Opinions: A User’s Guide for Fiduciaries (New York, 2005); Merriam-Webster’s Dictionary of Law (Cleveland, 1996), s.v. “fundamental fairness”; Approaching a Formal Definition of Fairness in Electronic Commerce: Proceedings of the 18th IEEE Symposium on Reliable Distributed Systems (Washington, 1999), 354. Other technical uses of fairness can be found in projects directed by Arien Mack, editor of Social Research, director of the Social Research Conference series, and sponsor of many Fairness Conferences and also of a Web site called Fairness.com.

57. An excellent discussion of fairness, the best I have found in print, is George Klosko, The Principle of Fairness and Political Obligation (Savage, MD, 1992; rev. ed., 2004). It is similar to this formulation on many points, but different on others.

Reactionaries, Powell Memo and Judicial Activism

To explain why the Powell Memo is important, I’ll begin with a summary of the games played by reactionaries, which explains the rhetorical power they wield. There are two main aspects of the reactionary mind (* see below). The most interesting is described by Corey Robin, and it is the reason I’ve come to refer to reactionaries as the “Faceless Men”.

Reactionaries steal the thunder and mimic the tactics of the political left, and in doing so co-opt political movements and even revolutions, turning the latter into counterrevolutions. More interesting still is how reactionaries pose as what they are not by claiming labels that originated with their opponents — calling themselves classical liberals and whatever else catches their fancy. They pretend to be defenders of constitutional originalism while they radically transform the Constitution, such as pushing corporate personhood and citizenship, something that would have horrified the American revolutionaries and founders.

The other side of this is what reactionaries project onto others. They are the greatest purveyors of political correctness in attacking free speech, an area in which they show their brilliance in controlling narrative framing. They manage to portray their enemies as doing what they are most guilty of and through this tactic they silence and discredit others.

In this way, the reactionary element of the intellectual elite, Hollywood elite, and banking elite (as seen in the career of Steve Bannon) somehow manages to convince their followers that they are average Americans in a noble fight against the ruling elite. The target often ends up being students at state colleges who, according to the data and contrary to the reactionary portrayal, are mostly those working their way out of the working class — if meritocracy exists at all in the United States, this is the closest we get to it. But anyway, it’s highly doubtful that colleges are serving a genuinely democratic purpose at a time when corporate and other private funding is flooding into them, and so the accusation that they are bastions of the liberal faith is a sad joke. This state of confusion is intentionally created by reactionaries — up is down and those who are down are the enemy to be blamed for everything.

Or consider the accusation of a liberal media bias. It’s odd where I most often hear this bizarre claim. It’s regularly repeated in the corporate media itself. Even the supposedly liberal media gives a platform to people to spout this bullshit. So, what kind of liberal bias is it that criticizes liberal bias by giving equal or greater time to right-wingers? That is no exaggeration, in that even NPR gives more airtime to corporatist and right-wing think tanks than to those on the anti-corporatist left. That is unsurprising, since NPR gets most of its funding from private sources such as corporations, not from the government. Public radio?

This brings me to an example that has been on my mind for a while. I’ve been meaning to write about it. This seems as good a time as any. Let this be my first post of the new year, clarifying where we stand as a society and how we got here.

The infamous Powell Memo (AKA Powell Manifesto) was only later leaked to the public, but it was written way back in 1971. Ever since that time, it has been the guiding vision of a cabal of right-wing oligarchs and plutocrats. It set out a strategy for taking over the government, and it was successful. Do you know how those on the political right are always alleging a left-wing conspiracy to pack the courts with activist judges? Well, that is an expression of a guilty conscience. It’s exactly what the right-wing ruling elite has been doing this past half century, culminating in the nomination of Judge Brett Kavanaugh to the Supreme Court. Predictably, this so-called constitutionalist just so happens to be a big supporter of executive power, an agenda that began being pushed more fully under the administration of George W. Bush. There is absolutely nothing constitutionalist about this, as it undermines the very core pillar of separation and balance of powers. Instead of being a countervailing force, right-wingers are seeking to create a corporatocratic Supreme Court that serves at the behest of a right-wing presidency and political system.

That isn’t to entirely blame them, as the Democratic Party has shifted so far right on these issues that it is now to the right of Republicans from earlier in the last century. The reactionary mind has a way of infecting nearly everything and everyone. Our entire government has become a reactionary institution, but it’s important that we keep in mind who planned and led this coup. Then again, Lewis Powell, who wrote the Powell Memo, did so not only as a corporate lobbyist but also as a Democrat. And to show the bipartisan nature of this corporatocracy, it was Richard Nixon, a Republican president, who nominated him to the Supreme Court that same year, and he took his seat on the court the following year. Still, it was the right-wing ALEC (American Legislative Exchange Council) that was most directly inspired by the Powell Memo, the organization that then helped enact this neoliberal and neo-fascist coup.

It’s not only the respectable good liberals of the ‘mainstream’ left that were in the dark about these machinations. Before the Powell Memo was leaked, anyone who pointed to the corporate takeover would have been called a conspiracy theorist, and so would have been no more welcome in the Democratic Party than in the Republican Party. Americans in general couldn’t see what was happening because the decisions and exchange of money happened mostly behind closed doors. Besides, the corporate media had no interest in reporting on it, quite the opposite of course. There was no transparency, as planned, and so there was no accountability. Democracy dies in the dark.

Only now that Clown-Fuhrer Trumpf is in power do we suddenly get some pushback showing up in the mainstream. The struggle for power within the ruling elite goes into high gear. And our dear leader has put Judge Kavanaugh onto the Supreme Court. I’ve heard stalwart Republicans who despise and fear Trump nonetheless supporting him to the extent that he is pushing for a ‘conservative’ judiciary that supposedly opposes all those activist judges on the left. Yet Kavanaugh is as activist as they come. The main reason Trump picked him probably was that, when the time comes, the Supreme Court can be swung in defense of the administration.

After Barack Obama followed the example of George W. Bush in further advancing executive power, Democrats are now thinking that their support for authoritarianism may have been a bad idea after all. They assumed they were going to maintain power, since it was obvious to them that Hillary Clinton couldn’t lose the presidential election, and that the unrestrained executive could then be used for their more paternalistic variety of friendly fascism. Trump and gang, of course, make a convenient scapegoat for Democratic sins. But that is a useless game at this point.

The joke is on all of them. And the entire political system is the punchline.

* * *

* What is the distinguishing feature of the reactionary mind? Maybe it has to do with the Dark Triad, the strong correlation between narcissism, psychopathy, and Machiavellianism. In particular, the last one might be key. Research has shown that those who are most fearful of Machiavellianism in others, fantasizing about conspiracies behind every door and lurking under every bed, are themselves more prone to acting in Machiavellian ways. That very much sounds like the reactionary mind.

In political terms, the reactionary mind gets expressed by the social dominance orientation (SDO), which essentially is another way of speaking of Machiavellianism. This is where authoritarianism comes in, as SDO types are attracted to and have talent in manipulating authoritarian followers. As such, maybe authoritarians will only be reactionary to the degree that a society becomes reactionary and SDO types gain power, since authoritarians will conform to almost any social norm, good or bad.

It’s only under these conditions that we can speak of a distinct reactionary mind. The rise to dominance of the reactionary mind indicates that something is being reacted to. From other research, what seems to elicit this is rising inequality and segregation that foment mistrust, fear, and anxiety. This is what we see before every period of instability and, in reaction, there are those who seek to enforce order.

What makes reactionaries unique is their innovativeness in how they go about this. They aren’t traditionalists, although they will also co-opt from traditionalists as they co-opt from the political left. One of the purest forms of the reactionary mind is nostalgia which, unsurprisingly, rarely has anything to do with the historical past. It co-opts the rhetoric and emotion of tradition with little respect or concern for actual traditions.

A key example of this anti-traditional pseudo-traditionalism is constitutional originalism. What the reactionary right is pushing is a complete contradiction and betrayal of what the American Revolution was fought for and what the United States was founded upon, specifically founded on the Declaration of Independence and the first constitution of the Articles of Confederation. These reactionaries will claim that liberalism is an attack on American political tradition, even as any informed person knows that liberalism (including in its progressive and radical forms) was core to American society from the beginning. Consider the view that a constitution is a living document as a pact of a specific community of people, an American tradition that came out of Quaker constitutionalism and (by way of Quaker-raised John Dickinson) informed the democratic sensibility of the Articles of Confederation.

Such history is inconvenient and so irrelevant to the reactionary mind. But because reactionaries took control so early with the Constitutional Convention, their counterrevolution permanently obscured the true history and that has left the American population with collective amnesia. As demonstrated by the extremes of Donald Trump, reactionaries love to invent ‘facts’ and then to repeat them until they become accepted or else until all sense of truth is lost. This is what makes the reactionary mind Machiavellian and, as such, finds itself at home in the Dark Triad.

* * *

Powell Memorandum:

CONFIDENTIAL MEMORANDUM
Attack on American Free Enterprise System

DATE: August 23, 1971
TO: Mr. Eugene B. Sydnor, Jr., Chairman, Education Committee, U.S. Chamber of Commerce
FROM: Lewis F. Powell, Jr.

Neglected Opportunity in the Courts

American business and the enterprise system have been affected as much by the courts as by the executive and legislative branches of government. Under our constitutional system, especially with an activist-minded Supreme Court, the judiciary may be the most important instrument for social, economic and political change.

Other organizations and groups, recognizing this, have been far more astute in exploiting judicial action than American business. Perhaps the most active exploiters of the judicial system have been groups ranging in political orientation from “liberal” to the far left.

The American Civil Liberties Union is one example. It initiates or intervenes in scores of cases each year, and it files briefs amicus curiae in the Supreme Court in a number of cases during each term of that court. Labor unions, civil rights groups and now the public interest law firms are extremely active in the judicial arena. Their success, often at business’ expense, has not been inconsequential.

This is a vast area of opportunity for the Chamber, if it is willing to undertake the role of spokesman for American business and if, in turn, business is willing to provide the funds.

As with respect to scholars and speakers, the Chamber would need a highly competent staff of lawyers. In special situations it should be authorized to engage, to appear as counsel amicus in the Supreme Court, lawyers of national standing and reputation. The greatest care should be exercised in selecting the cases in which to participate, or the suits to institute. But the opportunity merits the necessary effort.

* * *

Founding fathers worried about corporate clout
by Angela Carella

A Very American Coup
by David McLaren

The Shift From Democracy To Corporate Dictatorship And The Tragedy Of The Lack Of Push Back. Citizens United v. Federal
by Michael Monk

Powell Memo
by Jeremy Wilburn

How the Right Packed the Court
by William Yeomans

Extremists on the Bench: Five Years After Citizens United, Our Rogue Supreme Court
by Steve Justin

Citizens United, corporate personhood, and the way forward
by Mal Warwick

Is the Supreme Court Determined to Expand Corporate Power?
by Robert Monks and Peter Murray

The Powell Memo And The Birth Of The Washington Lobbying Empire
by Tina-Desiree Berg

Context of ‘August 23, 1971 and After: ’Powell Memo’ Leads to Massive Pro-Business Efforts to Influence Political, Social Discourse’
from History Commons

The Powell Memo
by .ren

* * *

If Confirmed, Brett Kavanaugh Fulfills the Powell Manifesto!
by Frank Puig

9 law experts on what Brett Kavanaugh means for the future of America
by Isolde Raftery and Sydney Brownstone

Kavanaugh would cement Supreme Court support for an oppressed minority — corporations
by Steven Strauss

With Kavanaugh, Trump Could Fashion The Most Business-Friendly Supreme Court Since The New Deal
by Michael Bobelian

How the Supreme Court Unleashed Business
by Justin Fox

Brett Kavanaugh’s Role in Schemes to Politicize the Judiciary Should Disqualify Him
by John Nichols

Brett Kavanaugh and the new judicial activism
by Matthew Yglesias

Judge Kavanaugh’s Activist Vision of Administrative Law
by Robert V. Percival

Brett Kavanaugh’s Dangerous Right-Wing Judicial Activism
by Elliot Mincberg

SCOTUS Nominee Brett Kavanaugh’s Record Depicts Dangerous Conservative Judicial Activism
by Devon Schmidt

Kavanaugh record hints at judicial activism in American election law
by Adav Noti and David Kolker

How Brett Kavanaugh Could Change the Supreme Court—and America
by Brian Bennett

“His Entire Career Has Been In Service Of The Republican Agenda”: D.C. Lawyers Dish On Whether Brett Kavanaugh Will Give Trump A Pass On Russia
by Abigail Tracy

Brett Kavanaugh And The Fraud Of Originalism
by Rich Barlow

Brett Kavanaugh’s nomination is a victory for ‘originalists’
by Jill Abramson

Why the Supreme Court is now America’s most dangerous branch
by Mary Kay Linge

Brett Kavanaugh Was Involved in 3 Different Crises of Democracy
by Charles Pierce

The Senate Must Closely Examine These Documents From Kavanaugh’s Bush Years
by Peter M. Shane

How Brett Kavanaugh Worked to Weaponize the War on Terror
by Faiza Patel and Andrew Boyle

Could Brett Kavanaugh Protect Trump From Prosecution?
by Andy Kroll

Brett Kavanaugh and the Imperial Presidency (Supreme Court)
from Best of the Left

Brett Kavanaugh’s Radical View of Executive Power
by Corey Brettschneider

Judge Kavanaugh: An Originalist With a New—and Terrifying—Interpretation of Executive Power
by Patricia J. Williams

Judge Brett Kavanaugh’s radically expansive view of the power of the presidency: Analysis
by Terry Moran

For Brett Kavanaugh, the Separation of Powers Is a One-Way Street
by Dan Froomkin

7 legal experts on how Kavanaugh views executive power — and what it could mean for Mueller
by Jen Kirby

Courting Disaster: The Trouble with Brett Kavanaugh’s Views of Executive Power in the Age of Trump
by Michael Waldman

Where Supreme Court Nominee Brett Kavanaugh Stands On Executive Power
from All Things Considered

Kavanaugh File: Executive Privilege
By Robert Farley

Brett Kavanaugh’s SCOTUS Nomination Is Bad News for Church/State Separation
by Hemant Mehta

Brett Kavanaugh and the Triumph of the Conservative Counterrevolution
by Bill Blum

Brett Kavanaugh Has Power To Strengthen Donald Trump, But Supreme Court Has Boosted Presidents For Decades
by Greg Price

If Brett Kavanaugh Wins, the Supreme Court Loses
by Jay Michaelson

Why Conservatives Could Regret Confirming Brett Kavanaugh
by Eric Levitz

Misreading the Misreadings of History

“A majority of decent well-meaning people said there was no need to confront Hitler…. When people decided to not confront fascism, they were doing the popular thing, they were doing it for good reasons, and they were good people…but they made the wrong decision.”

Tony Blair spoke those words as UK Prime Minister in 2003. And the supposedly Hitler-like figure he alluded to was Saddam Hussein. I ran across this quote in a piece from The Wall Street Journal, The Trouble With Hitler Analogies by Zachary Karabell. I’d instead point out the trouble with those who feel troubled. The critic here, if he was like most in the mainstream media at the time, beat the drums for war in attacking Iraq.

That war, if you can call it that when from a safe distance the most powerful countries in the world bomb a small country to oblivion, was a war of aggression. It was illegal according to both US law and international law, whatever its legal standing may have been in the UK. Besides, the justification for the military attack on Iraq was based on a lie and everyone knew it was a lie, that is to say we have long been in a post-truth age (numerous military conflicts in US history were based on lies, from the Cold War to the Vietnam War).

Hussein had no weapons of mass destruction. The only reason we could even suggest that he did was because we had sold some to him back in the 1980s. But there was no way those weapons would still work at this point, since those biological weapons had a short lifespan. Worse still, whatever horrible things Saddam did in recent years, it pales against the horrible things he did while he was our ally. He used those US biological weapons against his own people back then and the US military stood by and did nothing, the US government implicitly supporting his authoritarian actions, as has long been the established pattern of US foreign relations. Our authoritarian allies do horrific atrocities all the time, for their own purposes and sometimes on our behalf.

What makes Blair’s statement morally demented is that he was speaking as an authoritarian imperialist flexing his muscles. Hussein, as a petty tyrant, was getting uppity and needed to be put back in his place. It had nothing to do with freedom and democracy, any more than World War II was motivated by such noble ideals. The US and UK went up against the Nazis because Germany (along with Japan) was a competing empire that became a direct threat, nothing less and nothing more. Having won that global conflict, the US then became a fully global empire and the greatest authoritarian power in history. Fascism wasn’t defeated, though, as the US became even more fascist over the following generations. This had to do with political elites such as the Bush family that made its wealth in doing business with Nazis and that later helped Nazi war criminals to evade justice by working for the US government.

Here is the real problem with Hitler analogies. If Hitler were here today and he had a different name, he would either smear his opponents as being like Hitler or he would dismiss those who dared to make such comparisons. Either way, the purpose would be to muddy the water and make impossible any public discussion and moral accounting. It is interesting to note that the author of the WSJ article indirectly defends the authoritarian forces in our society by blaming those who call names:

“Contesting today’s populist strongmen doesn’t require calling them fascists, a label that often deepens the anger and alienation of their followers. The only thing worse than forgetting history is using it badly, responding to echoes of the past with actions that fuel today’s fires rather than douse them.”

Basically, don’t antagonize the authoritarians or they might get mean. Well, they’re already mean. It’s too late for that. It’s another example of someone demanding moderation in a society that has gone mad. As I often wonder, moderate toward what? If we can’t even call authoritarianism what it is as authoritarians rise to power, then what defense is there against what is taboo to speak of? There is none. That is the point. This is how authoritarianism takes hold in a society.

But to the author, one suspects that is not necessarily a bad thing. Authoritarianism, in this worldview, might be fine as long as it is used wisely and the mob is kept in check. The only problem with the second Iraq War wasn’t that it was authoritarian but that it failed in its own stated authoritarian agenda. What can’t be mentioned is the historical analogy of Hitler also failing in his authoritarian agenda when he turned to wars of aggression in a bid to assert imperial rule. The analogy, of course, ends there for the moment. That is because, unlike Nazi Germany, 21st century America doesn’t quite have the equivalent of an opposing power also aspiring to empire. Not yet. But Russia and China, if and when World War III begins, probably will be willing to play the role.

The End of History is a Beginning

Francis Fukuyama’s ideological change, from neocon to neoliberal, signaled among the intellectual class a similar but dissimilar change that was happening in the broader population. The two are parallel tracks down which history like a train came barreling and rumbling, the end not in sight.

The difference between them is that the larger shift was ignored, until Donald Trump revealed the charade to be a charade, as it always was. It shouldn’t have come as a surprise, this populist moment. A new mood has been in the air that resonates with an old mood that some thought was lost in the past, the supposed end of history. It has been developing for a long while now. And when reform is denied, much worse comes along.

On that unhappy note, there is a reason why Trump used old school rhetoric of progressivism and fascism (with the underlying corporatism common to both ideologies). Just as there is a reason Steve Bannon, while calling himself a Leninist, gave voice to his hope that the present would be as exciting as the 1930s. Back in the early aughts, Fukuyama gave a warning about the dark turn of events, imperialistic ambition turned to hubris. No doubt he hoped to prevent the worst. But not many in the ruling class cared to listen. So here we are.

Whatever you think of him and his views, you have to give Fukuyama credit for the simple capacity of changing his mind and, to some extent, admitting he was wrong. He is a technocratic elitist with anti-populist animosity and paternalistic aspirations. But at the very least his motivations are sincere. One journalist, Andrew O’Hehir, described him this way:

“He even renounced the neoconservative movement after the Iraq war turned into an unmitigated disaster — although he had initially been among its biggest intellectual cheerleaders — and morphed into something like a middle-road Obama-Clinton Democrat. Today we might call him a neoliberal, meaning that not as leftist hate speech but an accurate descriptor.”

Not exactly a compliment. Many neocons and former neocons, when faced with the changes in the Republican Party, found the Clinton Democrats more attractive. For most of them, this conversion only happened with Trump’s campaign. Fukuyama stands out for being one of the early trendsetters on the right in turning against Cold War neoconservatism before it was popular to do so (although did Fukuyama really change, or did he simply look to a softer form of neoconservatism?).

For good or ill, the Clinton Democrats, in the mainstream mind, now stand for the sane center, the moderate middle. To those like Fukuyama fearing a populist uprising, Trump marks the far right and Sanders the far left. That leaves the battleground between them that of a milquetoast DNC establishment, holding onto power by its loosening fingertips. Fukuyama doesn’t necessarily offer us much in the way of grand insight or of practical use (here is a harsher critique). It’s still interesting to hear someone like him make such an about-face, though — if only in political rhetoric and not in fundamental principles. And for whatever it’s worth, he so far has been right about Trump’s weakness as a strongman.

It’s also appreciated that those like Francis Fukuyama and Charles Murray bring attention to the dangers of inequality and the failures of capitalism, no matter that I oppose the ideological bent of their respective conclusions. So, even as they, like Teddy Roosevelt, disagree with populism as a response, they do take seriously the gut-level assessment of what is being responded to. It’s all the more interesting that these views come from respectable figures who once represented the political right, much more stimulating rhetoric than anything coming out of the professional liberal class.

* * *

Donald Trump and the return of class: an interview with Francis Fukuyama

“What is happening in the politics of the US particularly, but also in other countries, is that identity in a form of nationality or ethnicity or race has become a proxy for class.”

Francis Fukuyama interview: “Socialism ought to come back”

Fukuyama, who studied political philosophy under Allan Bloom, the author of The Closing of the American Mind, at Cornell University, initially identified with the neoconservative movement: he was mentored by Paul Wolfowitz while a government official during the Reagan-Bush years. But by late 2003, Fukuyama had recanted his support for the Iraq war, which he now regards as a defining error alongside financial deregulation and the euro’s inept creation. “These are all elite-driven policies that turned out to be pretty disastrous, there’s some reason for ordinary people to be upset.”

The End of History was a rebuke to Marxists who regarded communism as humanity’s final ideological stage. How, I asked Fukuyama, did he view the resurgence of the socialist left in the UK and the US? “It all depends on what you mean by socialism. Ownership of the means of production – except in areas where it’s clearly called for, like public utilities – I don’t think that’s going to work.

“If you mean redistributive programmes that try to redress this big imbalance in both incomes and wealth that has emerged then, yes, I think not only can it come back, it ought to come back. This extended period, which started with Reagan and Thatcher, in which a certain set of ideas about the benefits of unregulated markets took hold, in many ways it’s had a disastrous effect.

“In social equality, it’s led to a weakening of labour unions, of the bargaining power of ordinary workers, the rise of an oligarchic class almost everywhere that then exerts undue political power. In terms of the role of finance, if there’s anything we learned from the financial crisis it’s that you’ve got to regulate the sector like hell because they’ll make everyone else pay. That whole ideology became very deeply embedded within the Eurozone, the austerity that Germany imposed on southern Europe has been disastrous.”

Fukuyama added, to my surprise: “At this juncture, it seems to me that certain things Karl Marx said are turning out to be true. He talked about the crisis of overproduction… that workers would be impoverished and there would be insufficient demand.”

Was Francis Fukuyama the first man to see Trump coming? – Paul Sagar | Aeon Essays

Dickinson’s Purse and Sword

A lesser-known founding father is John Dickinson, but he deserves to be better known considering how important he was at the time. His politics could today be described as moderate conservatism or maybe status quo liberalism. During the conflict with the British Empire, he hoped the colonial leaders would seek reconciliation. Yet even as he refused to sign the Declaration of Independence, not based on principle but prudence, he didn’t stand in the way of those who supported it. And once war was under way, he served in the revolutionary armed forces. After that, he was a key figure in developing the Articles of Confederation and the Constitution.

Although a Federalist, he was highly suspicious of nationalism, the two being distinguished at the time. It might be noted that, if not for losing the war of rhetoric, the Anti-Federalists would be known as Federalists for they actually wanted a functioning federation. Indeed, Dickinson made arguments that are more Anti-Federalist in spirit. An example of this is his warning against a centralized government possessing both purse and sword, that is to say a powerful government that has both a standing army and the means of taxation to fund it without any need of consent of the governed. That is what the Articles protected against and the Constitution failed to do.

That warning remains unheeded to this day. And so the underlying issue remains silenced, the conflict and tension remain unresolved. The lack of political foresight and moral courage was what caused the American Revolution, the problems (e.g., division of power) arising in the English Civil War and Glorious Revolution still being problems generations later. The class war and radical ideologies of the 17th century led to the decades of political strife and public outrage prior to the official start of the American Revolution. But the British leadership hoped to continue to suppress the growing unrest, similar to how present American leadership hopes for the same, and probably with the same eventual result.

What is interesting is how such things never go away and how non-radicals like Dickinson can end up giving voice to radical ideas. The idea of the purse strings being held by a free people, i.e., those taxed having the power of self-governance to determine their own taxation, is not that far off from Karl Marx speaking of workers controlling the means of production — both implying that a society is only free to the degree its people are free. Considering Dickinson freed the slaves he inherited, even a reluctant revolutionary such as himself could envision a freedom more expansive than his moderate politics might suggest.

* * *

On a related thought, one of the most radical documents, of course, was Thomas Jefferson’s strongly worded Declaration of Independence. It certainly was radical when it was written and, as with much else from that revolutionary era, maintains its radicalism to this day.

The Articles of Confederation, originally drafted by Dickinson, closely adhered to the guiding vision of the Declaration. Even though Dickinson was against declaring independence until all alternatives had been exhausted, once independence had been declared he was very much about following a course of moral principle as set down by that initial revolutionary document.

Yet the Constitution, that is the second constitution after the Articles, was directly unconstitutional and downright authoritarian according to the Articles. The men of the Constitutional Convention blatantly disregarded their constitutional mandate by replacing the Articles without constitutional consensus and consent, that is to say it was a coup (many of the revolutionary soldiers didn’t take this coup lightly and continued the revolutionary struggle through such acts as Shays’ Rebellion, which was violently put down by the very Federal military that the Anti-Federalists warned about).

But worse still, the Constitution ended up being a complete betrayal of the Declaration, which set out the principles that justified a revolution in the first place. As Howard Schwartz put it:

“The Declaration itself, by contrast, never envisioned a Federal government at all. Ironically, then, if one wants to see the political philosophy of the United States in the Declaration of Independence, one should theoretically be against any form of federal government and not just for a particular interpretation of its limited powers.”
(Liberty In America’s Founding Moment, Kindle Locations 5375-5378)

It does seem that the contradiction bothered Dickinson. But he wasn’t a contrarian by nature, much less a rabble-rouser. Once it was determined that a new constitution was going to be passed, he sought the best compromise he thought possible, although on principle he still refused to show consent by being a signatory. As for Jefferson, whether or not he ever thought the Constitution was a betrayal of the Declaration, he assumed any constitution was an imperfect document and that no constitution would or should last beyond his own generation.

* * *

Letters from a Farmer
Letter IX

No free people ever existed, or can ever exist, without keeping, to use a common, but strong expression, “the purse strings,” in their own hands. Where this is the case, they have a constitutional check upon the administration, which may thereby be brought into order without violence: But where such a power is not lodged in the people, oppression proceeds uncontrolled in its career, till the governed, transported into rage, seek redress in the midst of blood and confusion.

Letter II

Nevertheless I acknowledge the proceedings of the convention furnish my mind with many new and strong reasons, against a complete consolidation of the states. They tend to convince me, that it cannot be carried with propriety very far—that the convention have gone much farther in one respect than they found it practicable to go in another; that is, they propose to lodge in the general government very extensive powers—powers nearly, if not altogether, complete and unlimited, over the purse and the sword. But, in its organization, they furnish the strongest proof that the proper limbs, or parts of a government, to support and execute those powers on proper principles (or in which they can be safely lodged) cannot be formed. These powers must be lodged somewhere in every society; but then they should be lodged where the strength and guardians of the people are collected. They can be wielded, or safely used, in a free country only by an able executive and judiciary, a respectable senate, and a secure, full, and equal representation of the people. I think the principles I have premised or brought into view, are well founded—I think they will not be denied by any fair reasoner. It is in connection with these, and other solid principles, we are to examine the constitution. It is not a few democratic phrases, or a few well formed features, that will prove its merits; or a few small omissions that will produce its rejection among men of sense; they will inquire what are the essential powers in a community, and what are nominal ones; where and how the essential powers shall be lodged to secure government, and to secure true liberty.

Letter III

When I recollect how lately congress, conventions, legislatures, and people contended in the cause of liberty, and carefully weighed the importance of taxation, I can scarcely believe we are serious in proposing to vest the powers of laying and collecting internal taxes in a government so imperfectly organized for such purposes. Should the United States be taxed by a house of representatives of two hundred members, which would be about fifteen members for Connecticut, twenty-five for Massachusetts, etc., still the middle and lower classes of people could have no great share, in fact, in taxation. I am aware it is said, that the representation proposed by the new constitution is sufficiently numerous; it may be for many purposes; but to suppose that this branch is sufficiently numerous to guard the rights of the people in the administration of the government, in which the purse and sword are placed, seems to argue that we have forgotten what the true meaning of representation is. I am sensible also, that it is said that congress will not attempt to lay and collect internal taxes; that it is necessary for them to have the power, though it cannot probably be exercised. I admit that it is not probable that any prudent congress will attempt to lay and collect internal taxes, especially direct taxes: but this only proves that the power would be improperly lodged in congress, and that it might be abused by imprudent and designing men.

Letter XVII

It is said, that as the federal head must make peace and war, and provide for the common defense, it ought to possess all powers necessary to that end: that powers unlimited, as to the purse and sword, to raise men and monies, and form the militia, are necessary to that end; and, therefore, the federal head ought to possess them. This reasoning is far more specious than solid: it is necessary that these powers so exist in the body politic, as to be called into exercise whenever necessary for the public safety; but it is by no means true, that the man, or congress of men, whose duty it more immediately is to provide for the common defense, ought to possess them without limitation. But clear it is, that if such men, or congress, be not in a situation to hold them without danger to liberty, he or they ought not to possess them. It has long been thought to be a well-founded position, that the purse and sword ought not to be placed in the same hands in a free government. Our wise ancestors have carefully separated them—placed the sword in the hands of their king, even under considerable limitations, and the purse in the hands of the commons alone: yet the king makes peace and war, and it is his duty to provide for the common defense of the nation. This authority at least goes thus far—that a nation, well versed in the science of government, does not conceive it to be necessary or expedient for the man entrusted with the common defense and general tranquility, to possess unlimitedly the powers in question, or even in any considerable degree.

Reactionary Revolutionaries, Faceless Men, and God in the Gutter

First there was revolution. And then there was counter-revolution. Therefore, reaction follows what it is reacting to.

This is a simple analysis and, I’d argue, overly simplistic. It is the narrative reactionaries have been telling about themselves for a couple of centuries. It is also the narrative that Mark Lilla repeats in his recent work, The Shipwrecked Mind, which is a useful survey, summary, and synthesis of modern ideological history but not essentially original in framing.

The problem is the reactionary mind is not a modern invention. Many arguments could be made about when it first emerged. For example, I’d place it firmly in the Axial Age or, better yet, in that earliest of dark ages when the Bronze Age civilizations collapsed and the Jaynesian bicameral mind was lost.

By the time Plato came up with his authoritarian republicanism as a reaction to Athenian democracy, the reactionary mind had already been developing for some time. That was the era when, as Julian Jaynes points out, lament rang out across many populations over the silence, loss, or abandonment of the divine. Nostalgia in one of its most potent forms was born.

As with Corey Robin, Mark Lilla is right to mark out nostalgia as an expression of the reactionary. But focusing too much on that can be a red herring. Robin is better than Lilla in pointing out that reactionaries can co-opt almost anything, even radical utopianism or revolution itself.

That is where my own thoughts come in. The modern reactionary mind initially took shape not after the early modern revolutionary period but during it — maybe before it, depending on when one defines the beginning of that period. The reactionary mind as a modern phenomenon was well on its way at least by the English Civil War, what some consider the first modern revolution, although some consider the Peasants’ Revolt an incipient form of this societal shift through conflict and class war.

The point is that the French Revolution was late to the game. That reactionaries finally found their voice following that is not entirely relevant to understanding the reactionary mind and its historical development. What the French Revolution does help us with is in showing another example of how reaction arose within the revolution itself, co-opting it as happened with the American Revolution (related to the rarely acknowledged fact that the American Revolution was a precedent for what followed, including large-scale destruction and violence).

Thomas Paine demonstrates the connections well, but his example also serves to show the complex relationship of reaction to revolution. He was a radical in the American Revolution and his radicalism was profound in its democratic vision. When he was welcomed into the French National Convention during the French Revolution, he actually sat on the right side with the moderate reformers. It was his very radicalism for democracy that made him moderate, or at least aligned him with more moderate forces.

What Paine specifically advocated was a democratic constitution and leniency toward the king, rather than despotism and violent vengeance. The Jacobins are called radicals, but in reality they were reactionaries, or at least the leadership was. They were using the same means that the monarchy had used in enforcing power and silencing opponents. So, the Jacobins, as is typical with reactionaries, wanted to create a new and improved version of the old order by ensuring a rigid hierarchy remained. They weren’t interested in democracy, that is for sure.

That is what Mark Lilla misses. The French reactionaries, like the American reactionaries, took over the revolution through political coup — and this happened during the revolution itself, not afterwards. In France, it happened by the Jacobins seizing power. But in the United States, the Federalists did it through an ironically unconstitutional Constitutional Convention and then afterward they crushed the ongoing revolution.

The relationship between revolution and reaction is entangled. If this isn’t understood, it is likely that the reactionary mind itself can’t be understood. This creates a trap for the mind: in not understanding history, we dangerously fail to understand ourselves.

Reactionaries aren’t limited to those other people, Hillary Clinton’s “basket of deplorables”. The potential for reaction exists within all of us. A surprising number of Marxists, socialists, communists, and anarchists fell under the sway of early 20th century fascism. The same pattern is seen today with left-wingers who almost unconsciously become fascinated with or drawn toward reactionary thought, often with the rationalization of studying the enemy but it is clear with some that it is more than mere curiosity. The reactionary mind is dangerous for the very reason we see it as something other.

The confusion in all of this is that the reactionary mind is chameleon-like. I’ve come to call them the Faceless Men, based on Game of Thrones. Reactionaries rarely present themselves as reactionaries. That means that anyone, under the right conditions, can get pulled into the mindset without realizing it. Reaction, once it fully takes hold, is simply an expression of fear and anxiety. The human mind gets weird under high levels of stress (Keith Payne examines one angle on this, by way of inequality, in his book The Broken Ladder). It really is that simple.

We need to develop intellectual, ideological, and psychological defenses against the reactionary mind. None of us are born with an immunity. But before we can do that, we have to learn how to identify the pattern of thought and behavior, to discern its incipient forms and the development that follows, to recognize the conditions and causes that make it possible.

This leads me to another thought. Philip K. Dick has the notion of God in the Gutter. Let me decontextualize it from the monotheistic tradition of deus absconditus. Any powerful ‘god’ that rules over us, over our minds and our society, is always hidden. And its ability to remain hidden is what I call symbolic conflation, a method of deception, obfuscation, and most importantly misdirection. That is the source of its power. That is also what makes it hard to analyze. Someone like Mark Lilla takes the reactionary mind at face value, how it presents itself. That is problematic for obvious reasons. Corey Robin is more effective in peeling away the mask to see what is behind it.

That is what we all need to be doing in these reactionary times. Let’s start rummaging around in the gutter, looking below our normal line of vision, looking through the garbage or what appears to be garbage. But let’s do so with great care.