The Agricultural Mind

Let me make an argument about individualism, rigid egoic boundaries, and hence Jaynesian consciousness. But I’ll come at it from a less typical angle. I’ve been reading a great deal about diet, nutrition, and health. There are significant links between what we eat and so much else: gut health, hormonal regulation, immune system, and neurocognitive functioning. There are multiple pathways, one of which is direct, connecting the gut and the brain. The gut is sometimes called the second brain, but in evolutionary terms it is the first brain. As one example of a connection, many are beginning to refer to Alzheimer’s as type 3 diabetes, and dietary interventions have reversed symptoms in clinical studies. Also, microbes and parasites have been shown to influence our neurocognition and psychology, even altering personality traits and behavior (e.g., Toxoplasma gondii).

One possibility to consider is the role of exorphins that are addictive and can be blocked in the same way as opioids. Exorphin, in fact, means external morphine-like substance, in the way that endorphin means indwelling morphine-like substance. Exorphins are found in milk and wheat. Milk, in particular, stands out. Even though exorphins are found in other foods, it’s been argued that they are insignificant because they theoretically can’t pass through the gut barrier, much less the blood-brain barrier. Yet exorphins have been measured elsewhere in the human body. One explanation is gut permeability, which can be caused by many factors such as stress but also by milk itself. The purpose of milk is to get nutrients into the calf, and it does this by loosening the junctions in the gut lining to allow more nutrients through the protective barrier. Exorphins get in as well and create a pleasurable experience to motivate the calf to drink more. Along with exorphins, grains and dairy also contain dopaminergic peptides, and dopamine is the other major addictive substance. It feels good to consume dairy as with wheat, whether you’re a calf or a human, and so one wants more.

Addiction, to food or drugs or anything else, is a powerful force. And it is complex in what it affects, not only physiologically and psychologically but also on a social level. Johann Hari offers a great analysis in Chasing the Scream. He makes the case that addiction is largely about isolation and that the addict is the ultimate individual. It stands out to me that addiction and addictive substances have increased over the course of civilization. The growing of poppies, sugar cane, etc. came later on in civilization, as did the production of beer and wine (by the way, alcohol releases endorphins, sugar causes a serotonin high, and both activate the hedonic pathway). Also, grain and dairy were slow to catch on as a large part of the diet. Until recent centuries, most populations remained dependent on animal foods, including wild game. Americans, for example, ate large amounts of meat, butter, and lard from the colonial era through the 19th century. In 1900, Americans on average were getting only about 10% of their diet from carbohydrates, and sugar intake was minimal.

Another factor to consider is that low-carb diets can alter how the body and brain function. That is even more true if combined with intermittent fasting and the restricted eating times that would have been more common in the past. Taken together, earlier humans would have spent more time in ketosis (fat-burning mode, as opposed to glucose-burning), which dramatically affects human biology. The further one goes back in history, the more time people probably spent in ketosis. One difference with ketosis is that cravings and food addictions disappear. It’s a non-addictive or maybe even anti-addictive state of mind. Many hunter-gatherer tribes can go days without eating and it doesn’t appear to bother them, and that is typical of ketosis. This was also observed of Mongol warriors, who could ride and fight for days on end without tiring or needing to stop for food. What is also different about hunter-gatherers and similar traditional societies is how communal they are or were and how much more expansive their identities are in belonging to a group. Anthropological research shows how hunter-gatherers often have a sense of personal space that extends into the environment around them. What if that isn’t merely cultural but something to do with how their bodies and brains operate? Maybe diet even plays a role. Hold that thought for a moment.

Now go back to the two staples of the modern diet, grains and dairy. Besides exorphins and dopaminergic substances, they also have high levels of glutamate, as part of gluten and casein respectively. Dr. Katherine Reid is a biochemist whose daughter was diagnosed with autism and it was severe. She went into research mode and experimented with supplementation and then diet. Many things seemed to help, but the greatest result came from restriction of glutamate, a difficult challenge as it is a common food additive. This requires going on a largely whole foods diet, that is to say eliminating processed foods. But when dealing with a serious issue, it is worth the effort. Dr. Reid’s daughter showed immense improvement to such a degree that she was kicked out of the special needs school. After being on this diet for a while, she socialized and communicated normally like any other child, something she was previously incapable of. Keep in mind that glutamate is necessary as a foundational neurotransmitter in modulating communication between the gut and brain. But typically we only get small amounts of it, as opposed to the large doses found in the modern diet.

That reminds me of propionate. It is another substance normally taken in at a low level. Certain foods, including grains and dairy, contain it. The problem is that, as a useful preservative, it has been generously added to the food supply. Research on rodents shows that injecting them with propionate causes autistic-like behaviors. And other rodent studies show how this stunts learning ability and causes repetitive behavior (both related to the autistic demand for the familiar), as too much propionate entrenches mental patterns through the mechanism that gut microbes use to communicate to the brain how to return to a needed food source. Autistics, along with cravings for propionate-containing foods, tend to have larger populations of a particular gut microbe that produces propionate. This might be why antibiotics, by killing such microbes, can help with autism.

As with propionate, exorphins injected into rats will likewise elicit autistic-like behaviors. The body produces exorphins and propionate from the consumption of grains and dairy by two different pathways: the former from the breakdown of proteins, the latter produced by gut bacteria in the breakdown of some grains and refined carbohydrates (on top of the propionate used as a food additive, which is added to other foods as well; and, at least in rodents, artificial sweeteners also increase propionate levels). This is part of the explanation for why many autistics have responded well to low-carb ketosis, specifically paleo diets that restrict both wheat and dairy. But ketones themselves play a role: they use the same transporters as propionate and so block its buildup in cells. And, of course, ketones offer a different energy source for cells as a replacement for glucose, which alters how cells function, specifically neurocognitive functioning and its attendant psychological effects.

What stands out to me about autism is how isolating it is. The repetitive behavior and focus on objects resonates with extreme addiction. Both conditions block normal human relating and create an obsessive mindset that, in the most extreme forms, blocks out all else. I wonder if all of us moderns are simply expressing milder varieties of this biological and neurological phenomenon. And this might be the underpinning of our hyper-individualistic society, with the earliest precursors showing up in the Axial Age following what Julian Jaynes hypothesized as the breakdown of the much more other-oriented bicameral mind. What if our egoic individuality is the result of our food system, as part of the civilizational project of mass agriculture?

Western Individuality Before the Enlightenment Age

The Culture Wars of the Late Renaissance: Skeptics, Libertines, and Opera
by Edward Muir
Introduction
pp. 5-7

One of the most disturbing sources of late-Renaissance anxiety was the collapse of the traditional hierarchic notion of the human self. Ancient and medieval thought depicted reason as governing the lower faculties of the will, the passions, and the body. Renaissance thought did not so much promote “individualism” as it cut away the intellectual props that presented humanity as the embodiment of a single divine idea, thereby forcing a desperate search for identity in many. John Martin has argued that during the Renaissance, individuals formed their sense of selfhood through a difficult negotiation between inner promptings and outer social roles. Individuals during the Renaissance looked both inward for emotional sustenance and outward for social assurance, and the friction between the inner and outer selves could sharpen anxieties. The fragmentation of the self seems to have been especially acute in Venice, where the collapse of aristocratic marriage structures led to the formation of what Virginia Cox has called the “single self,” most clearly manifest in the works of several women writers who argued for the moral and intellectual equality of women with men. As a consequence of the fragmented understanding of the self, such thinkers as Montaigne became obsessed with what was then the new concept of human psychology, a term in fact coined in this period. A crucial problem in the new psychology was to define the relation between the body and the soul, in particular to determine whether the soul died with the body or was immortal. With its tradition of Averroist readings of Aristotle, some members of the philosophy faculty at the University of Padua recurrently questioned the Christian doctrine of the immortality of the soul as unsound philosophically. Other hierarchies of the human self came into question.
Once reason was dethroned, the passions were given a higher value, so that the heart could be understood as a greater force than the mind in determining human conduct. When the body itself slipped out of its long-despised position, the sexual drives of the lower body were liberated and thinkers were allowed to consider sex, independent of its role in reproduction, a worthy manifestation of nature. The Paduan philosopher Cesare Cremonini’s personal motto, “Intus ut libet, foris ut moris est,” does not quite translate to “If it feels good, do it,” but it comes very close. The collapse of the hierarchies of human psychology even altered the understanding of the human senses. The sense of sight lost its primacy as the superior faculty, the source of “enlightenment”; the Venetian theorists of opera gave that place in the hierarchy to the sense of hearing, the faculty that most directly channeled sensory impressions to the heart and passions.

Historical and Philosophical Issues in the Conservation of Cultural Heritage
edited by Nicholas Price, M. Kirby Talley, and Alessandra Melucco Vaccaro
Reading 5: “The History of Art as a Humanistic Discipline”
by Erwin Panofsky
pp. 83-85

Nine days before his death Immanuel Kant was visited by his physician. Old, ill and nearly blind, he rose from his chair and stood trembling with weakness and muttering unintelligible words. Finally his faithful companion realized that he would not sit down again until the visitor had taken a seat. This he did, and Kant then permitted himself to be helped to his chair and, after having regained some of his strength, said, ‘Das Gefühl für Humanität hat mich noch nicht verlassen’—’The sense of humanity has not yet left me’. The two men were moved almost to tears. For, though the word Humanität had come, in the eighteenth century, to mean little more than politeness and civility, it had, for Kant, a much deeper significance, which the circumstances of the moment served to emphasize: man’s proud and tragic consciousness of self-approved and self-imposed principles, contrasting with his utter subjection to illness, decay and all that implied in the word ‘mortality.’

Historically the word humanitas has had two clearly distinguishable meanings, the first arising from a contrast between man and what is less than man; the second between man and what is more. In the first case humanitas means a value, in the second a limitation.

The concept of humanitas as a value was formulated in the circle around the younger Scipio, with Cicero as its belated, yet most explicit spokesman. It meant the quality which distinguishes man, not only from animals, but also, and even more so, from him who belongs to the species homo without deserving the name of homo humanus; from the barbarian or vulgarian who lacks pietas and παιδεία, that is, respect for moral values and that gracious blend of learning and urbanity which we can only circumscribe by the discredited word “culture.”

In the Middle Ages this concept was displaced by the consideration of humanity as being opposed to divinity rather than to animality or barbarism. The qualities commonly associated with it were therefore those of frailty and transience: humanitas fragilis, humanitas caduca.

Thus the Renaissance conception of humanitas had a two-fold aspect from the outset. The new interest in the human being was based both on a revival of the classical antithesis between humanitas and barbaritas, or feritas, and on a survival of the mediaeval antithesis between humanitas and divinitas. When Marsilio Ficino defines man as a “rational soul participating in the intellect of God, but operating in a body,” he defines him as the one being that is both autonomous and finite. And Pico’s famous ‘speech’ ‘On the Dignity of Man’ is anything but a document of paganism. Pico says that God placed man in the center of the universe so that he might be conscious of where he stands, and therefore free to decide ‘where to turn.’ He does not say that man is the center of the universe, not even in the sense commonly attributed to the classical phrase, “man the measure of all things.”

It is from this ambivalent conception of humanitas that humanism was born. It is not so much a movement as an attitude which can be defined as the conviction of the dignity of man, based on both the insistence on human values (rationality and freedom) and the acceptance of human limitations (fallibility and frailty); from this, two postulates result: responsibility and tolerance.

Small wonder that this attitude has been attacked from two opposite camps whose common aversion to the ideas of responsibility and tolerance has recently aligned them in a united front. Entrenched in one of these camps are those who deny human values: the determinists, whether they believe in divine, physical or social predestination, the authoritarians, and those “insectolatrists” who profess the all-importance of the hive, whether the hive be called group, class, nation or race. In the other camp are those who deny human limitations in favor of some sort of intellectual or political libertinism, such as aestheticists, vitalists, intuitionists and hero-worshipers. From the point of view of determinism, the humanist is either a lost soul or an ideologist. From the point of view of authoritarianism, he is either a heretic or a revolutionary (or a counterrevolutionary). From the point of view of “insectolatry,” he is a useless individualist. And from the point of view of libertinism he is a timid bourgeois.

Erasmus of Rotterdam, the humanist par excellence, is a typical case in point. The church suspected and ultimately rejected the writings of this man who had said: “Perhaps the spirit of Christ is more largely diffused than we think, and there are many in the community of saints who are not in our calendar.” The adventurer Ulrich von Hutten despised his ironical skepticism and his unheroic love of tranquillity. And Luther, who insisted that “no man has power to think anything good or evil, but everything occurs in him by absolute necessity,” was incensed by a belief which manifested itself in the famous phrase: “What is the use of man as a totality [that is, of man endowed with both a body and a soul], if God would work in him as a sculptor works in clay, and might just as well work in stone?”

Food and Faith in Christian Culture
edited by Ken Albala and Trudy Eden
Chapter 3: “The Food Police”
Sumptuary Prohibitions On Food In The Reformation
by Johanna B. Moyer
pp. 80-83

Protestants too employed a disease model to explain the dangers of luxury consumption. Luxury damaged the body politic leading to “most incurable sickness of the universal body” (33). Protestant authors also employed Galenic humor theory, arguing that “continuous superfluous expense” unbalanced the humors leading to fever and illness (191). However, Protestants used this model less often than Catholic authors who attacked luxury. Moreover, those Protestants who did employ the Galenic model used it in a different manner than their Catholic counterparts.

Protestants also drew parallels between the damage caused by luxury to the human body and the damage excess inflicted on the French nation. Rather than a disease metaphor, however, many Protestant authors saw luxury more as a “wound” to the body politic. For Protestants the danger of luxury was not only the buildup of humors within the body politic of France but the constant “bleeding out” of humor from the body politic in the form of cash to pay for imported luxuries. The flow of cash mimicked the flow of blood from a wound in the body. Most Protestants did not see luxury foodstuffs as the problem; indeed most saw food in moderation as healthy for the body. Even luxury apparel could be healthy for the body politic in moderation, if it was domestically produced and consumed. Such luxuries circulated the “blood” of the body politic, creating employment and feeding the lower orders. 72 De La Noue made this distinction clear. He dismissed the need to individually discuss the damage done by each kind of luxury that was rampant in France in his time as being as pointless “as those who have invented auricular confession have divided mortal and venial sins into infinity of roots and branches.” Rather, he argued, the damage done by luxury was in its “entire bulk” to the patrimonies of those who purchased luxuries and to the kingdom of France (116). For the Protestants, luxury did not pose an internal threat to the body and salvation of the individual. Rather, the use of luxury posed an external threat to the group, to the body politic of France.

The Reformation And Sumptuary Legislation

Catholics, as we have seen, called for antiluxury regulations on food and banqueting, hoping to curb overeating and the damage done by gluttony to the body politic. Although some Protestants also wanted to restrict food and banqueting, more often French Protestants called for restrictions on clothing and foreign luxuries. These differing views of luxury during and after the French Wars of Religion not only give insight into the theological differences between these two branches of Christianity but also provide insight into the larger pattern of the sumptuary regulation of food in Europe in this period. Sumptuary restrictions were one means by which Catholics and Protestants enforced their theology in the post-Reformation era.

Although Catholicism is often correctly cast as the branch of Reformation Christianity that gave the individual the least control over their salvation, it was also true that the individual Catholic’s path to salvation depended heavily on ascetic practices. The responsibility for following these practices fell on the individual believer. Sumptuary laws on food in Catholic areas reinforced this responsibility by emphasizing what foods should and should not be eaten and mirrored the central theological practice of fasting for the atonement of sin. Perhaps the historiographical cliché that it was only Protestantism which gave the individual believer control of his or her salvation needs to be qualified. The arithmetical piety of Catholicism ultimately placed the onus on the individual to atone for each sin. Moreover, sumptuary legislation tried to steer the Catholic believer away from the more serious sins that were associated with overeating, including gluttony, lust, anger, and pride.

Catholic theology meshed nicely with the revival of Galenism that swept through Europe in this period. Galenists preached that meat eating, overeating, and the imbalance in humors which accompanied these practices, led to behavioral changes, including an increased sex drive and increased aggression. These physical problems mirrored the spiritual problems that luxury caused, including fornication and violence. This is why so many authors blamed the French nobility for the luxury problem in France. Nobles were seen not only as more likely to bear the expense of overeating but also as more prone to violence. 73

Galenism also meshed nicely with Catholicism because it was a very physical religion in which the control of the physical body figured prominently in the believer’s path to salvation. Not surprisingly, by the seventeenth century, Protestants gravitated away from Galenism toward the chemical view of the body offered by Paracelsus. 74 Catholic sumptuary law embodied a Galenic view of the body where sin and disease were equated and therefore pushed regulations that advocated each person’s control of his or her own body.

Protestant legislators, conversely, were not interested in the individual diner. Sumptuary legislation in Protestant areas ran the gamut from control of communal displays of eating, in places like Switzerland and Germany, to little or no concern with restrictions on luxury foods, as in England. For Protestants, it was the communal role of food and luxury use that was important. Hence the laws in Protestant areas targeted food in the context of weddings, baptisms, and even funerals. The English did not even bother to enact sumptuary restrictions on food after their break with Catholicism. The French Protestants who wrote on luxury glossed over the deleterious effects of meat eating, even proclaiming it to be healthful for the body while producing diatribes against the evils of imported luxury apparel. The use of Galenism in the French Reformed treatises suggests that Protestants too were concerned with a “body,” but it was not the individual body of the believer that worried Protestant legislators. Sumptuary restrictions were designed to safeguard the mystical body of believers, or the “Elect” in the language of Calvinism. French Protestants used the Galenic model of the body to discuss the damage that luxury did to the body of believers in France, but ultimately to safeguard the economic welfare of all French subjects. The Calvinists of Switzerland used sumptuary legislation on food to protect those predestined for salvation from the dangerous eating practices of members of the community whose overeating suggested they might not be saved.

Ultimately, sumptuary regulations in the Reformation spoke to the Christian practice of fasting. Fasting served very different functions in Protestant and Catholic theology. Raymond Mentzer has suggested that Protestants “modified” the Catholic practice of fasting during the Reformation. The major reformers, including Luther, Calvin, and Zwingli, all rejected fasting as a path to salvation. 75 For Protestants, fasting was a “liturgical rite,” part of the cycle of worship and a practice that served to “bind the community.” Fasting was often a response to adversity, as during the French Wars of Religion. For Catholics, fasting was an individual act, just as sumptuary legislation in Catholic areas targeted individual diners. However, for Protestants, fasting was a communal act, “calling attention to the body of believers.” 76 The symbolic nature of fasting, Mentzer argues, reflected Protestant rejection of transubstantiation. Catholics continued to believe that God was physically present in the host, but Protestants believed His presence was only spiritual. When Catholics took Communion, they fasted to cleanse their own bodies so as to receive the real, physical body of Christ. Protestants, on the other hand, fasted as spiritual preparation because it was their spirits that connected with the spirit of Christ in the Eucharist. 77

Individualism and Isolation

“When an Indian child has been brought up among us, taught our language and habituated to our customs, yet if he goes to see his relations and makes one Indian ramble with them, there is no persuading him ever to return. [But] when white persons of either sex have been taken prisoners young by the Indians, and lived a while among them, tho’ ransomed by their friends, and treated with all imaginable tenderness to prevail with them to stay among the English, yet in a short time they become disgusted with our manner of life, and the care and pains that are necessary to support it, and take the first good opportunity of escaping again into the woods, from whence there is no reclaiming them.”
~ Benjamin Franklin

“The Indians, their old masters, gave them their choice and, without requiring any consideration, told them that they had been long as free as themselves. They chose to remain, and the reasons they gave me would greatly surprise you: the most perfect freedom, the ease of living, the absence of those cares and corroding solicitudes which so often prevail with us… all these and many more motives which I have forgot made them prefer that life of which we entertain such dreadful opinions. It cannot be, therefore, so bad as we generally conceive it to be; there must be in their social bond something singularly captivating, and far superior to anything to be boasted of among us; for thousands of Europeans are Indians, and we have no examples of even one of those Aborigines having from choice become Europeans! There must be something more congenial to our native dispositions than the fictitious society in which we live, or else why should children, and even grown persons, become in a short time so invincibly attached to it? There must be something very bewitching in their manners, something very indelible and marked by the very hands of nature. For, take a young Indian lad, give him the best education you possibly can, load him with your bounty, with presents, nay with riches, yet he will secretly long for his native woods, which you would imagine he must have long since forgot, and on the first opportunity he can possibly find, you will see him voluntarily leave behind all you have given him and return with inexpressible joy to lie on the mats of his fathers…

“Let us say what will of them, of their inferior organs, of their want of bread, etc., they are as stout and well made as the Europeans. Without temples, without priests, without kings and without laws, they are in many instances superior to us, and the proofs of what I advance are that they live without care, sleep without inquietude, take life as it comes, bearing all its asperities with unparalleled patience, and die without any kind of apprehension for what they have done or for what they expect to meet with hereafter. What system of philosophy can give us so many necessary qualifications for happiness? They most certainly are much more closely connected to nature than we are; they are her immediate children…”
~ J. Hector St. John de Crèvecœur

Western, educated, industrialized, rich, and democratic. Such societies, as the acronym goes, are WEIRD. But what exactly makes them weird?

This occurred to me because of reading Sebastian Junger’s Tribe. Much of what gets attributed to these WEIRD descriptors has been around for more than half a millennium, at least since European colonial imperialism began. From the moment Europeans settled in the Americas, a significant number of colonists were choosing to live among the natives because in many ways that lifestyle was a happier and healthier way of living, with less stress and work, less poverty and inequality, not only lacking in arbitrary political power but also allowing far more personal freedom, especially for women.

Today, when we bother to think much about the problems we face, we mostly blame them on the side effects of modernity. But colonial imperialism began when Europe was still under the sway of monarchies, state churches, and feudalism. There was nothing WEIRD about Western civilization at the time.

Those earlier Europeans hadn’t yet started to think of themselves in terms of a broad collective identity such as ‘Westerners’. They weren’t particularly well educated, not even the upper classes. Industrialization was centuries away. As for being rich, there was some wealth back then but it was limited to a few and even for those few it was rather unimpressive by modern standards. And Europeans back then were extremely anti-democratic.

Since European colonists were generally no more WEIRD than various native populations, we must look for other differences between them. Why did so many Europeans choose to live among the natives? Why did so many captured Europeans who were adopted into tribes refuse or resist being ‘saved’? And why did the colonial governments have to create laws and enforce harsh punishments to try to stop people from ‘going native’?

European society, on both sides of the ocean, was severely oppressive and violent. That was particularly true in Virginia, which was built on the labor of indentured servitude at a time when most servants were worked to death before getting the opportunity for release from bondage. They had plenty of reason to seek the good life among the natives. But life was less than pleasant in the other colonies as well. A similar pattern repeated itself.

Thomas Morton went off with some men into the wilderness to start their own community where they commingled with the natives. This set an intolerable example that threatened Puritan social control, and so the Puritans destroyed their community of the free. Roger Williams, a Puritan minister, took on the mission to convert the natives but found himself converted instead. He fled Puritan oppression because, as he put it, the natives were more civilized than the colonists. Much later on, Thomas Paine, living near some still-free Indian tribes, observed that their communities demonstrated greater freedom, self-governance, and natural rights than the colonies. He hoped Americans could take that lesson to heart. Other founders, from John Adams to Thomas Jefferson, looked admiringly to the example of their Indian neighbors.

The point is that whatever is problematic about Western society has been that way for a long time. Modernity has worsened this condition of unhappiness and dysfunction. There is no doubt that becoming more WEIRD has made us ever more weird, but Westerners were plenty weird from early on before they became WEIRD. Maybe the turning point for the Western world was the loss of our own traditional cultures and tribal lifestyles, as the Roman model of authoritarianism spread across Europe and became dominant. We have yet to shake off these chains of the past and instead have forced them upon everyone else.

It is what some call the Wetiko, one of the most infectious and deadly of mind viruses. “The New World fell not to a sword but to a meme,” as Daniel Quinn stated it (Beyond Civilization, p. 50). But it is a mind virus that can only take hold after immunity is destroyed. As long as there were societies of the free, the contagion was contained because the sick could be healed. But the power of the contagion is that the rabidly infected feel an uncontrollable compulsion to attack and kill the uninfected, the very people who would offer healing. Then the remaining survivors become infected and spread it further. A plague of victimization until no one is left untouched, until there is nowhere else to escape. Once all alternatives are eliminated, once a demiurgic monoculture comes to power, we are trapped in what Philip K. Dick called the Black Iron Prison. Sickness becomes all we know.

The usefulness of taking note of contemporary WEIRD societies isn’t that WEIRDness is the disease but that it shows the full-blown set of symptoms of the disease. But the onset of the symptoms comes long after the infection, like a slow-growing malignant tumor in the brain. Still, symptoms are important, specifically when there is a comparison to a healthy population. That is what the New World offered the European mind, a comparison. The earliest accounts of native societies in the Americas helped Europeans to diagnose their own disease and helped motivate them to begin looking for a cure, although the initial attempts were fumbling and inept. The first thing that some Europeans did was simply to imagine what a healthy community might look like. That is what Thomas More attempted to do five centuries ago with his book Utopia.

Maybe the key is social in nature. Utopian visions have always focused on the social aspect, typically describing how people would ideally live together in communities. That was also the focus of the founders when they sought out alternative possibilities of organizing society. The change with colonialism, as feudalism was breaking down, was a loss of belonging, of community and kinship. Modern individualism began not as an ideal but as a side effect of social breakdown, a condition of isolation and disconnection forced upon entire populations rather than freely chosen. And it was traumatizing to the Western psyche, and still is traumatizing, as seen with the high rates of mental illness in WEIRD societies, especially in the hyper-individualistic United States. That began before the defining factors of the WEIRD took hold. It was that trauma that made the WEIRD possible.

The colonists, upon meeting natives, discovered what had been lost. And for many colonists, that loss had happened within living memory. The hunger for what was lost was undeniable. To have seen traditional communities that still functioned would have been like taking a breath of fresh air after having spent months in the stench of a ship’s hull. Not only did these native communities demonstrate what was recently lost but also what had been lost so much earlier. As many Indian tribes had more democratic practices, so did many European tribes prior to feudalism. But colonists had spent their entire lives being told democracy was impossible, that personal freedom was dangerous or even sinful.

The difference today is that none of this is within living memory for most of us, specifically Americans, unless one was raised Amish or in some similar community. The closest a typical American comes to this experience is by joining the military during wartime. That is one of the few opportunities for a modern equivalent to a tribe, at least within WEIRD societies. And maybe a large part of the trauma soldiers struggle with isn’t merely the physical violence of war but the psychological violence of returning to the state of alienation, the loss of a bond that was closer than that of their own families.

Sebastian Junger notes that veterans who return to strong social support experience low rates of long-term PTSD, similar to Johann Hari’s argument about addiction in Chasing the Scream and his argument about depression in Lost Connections. Trauma, depression, addiction, etc. — these are consequences of, or are worsened by, isolation. These responses are how humans cope under stressful and unnatural conditions.

* * *

Tribe: On Homecoming and Belonging
by Sebastian Junger
pp. 16-25

The question for Western society isn’t so much why tribal life might be so appealing—it seems obvious on the face of it—but why Western society is so unappealing. On a material level it is clearly more comfortable and protected from the hardships of the natural world. But as societies become more affluent they tend to require more, rather than less, time and commitment by the individual, and it’s possible that many people feel that affluence and safety simply aren’t a good trade for freedom. One study in the 1960s found that nomadic !Kung people of the Kalahari Desert needed to work as little as twelve hours a week in order to survive—roughly one-quarter the hours of the average urban executive at the time. “The ‘camp’ is an open aggregate of cooperating persons which changes in size and composition from day to day,” anthropologist Richard Lee noted with clear admiration in 1968. “The members move out each day to hunt and gather, and return in the evening to pool the collected foods in such a way that every person present receives an equitable share… Because of the strong emphasis on sharing, and the frequency of movement, surplus accumulation… is kept to a minimum.”

The Kalahari is one of the harshest environments in the world, and the !Kung were able to continue living a Stone-Age existence well into the 1970s precisely because no one else wanted to live there. The !Kung were so well adapted to their environment that during times of drought, nearby farmers and cattle herders abandoned their livelihoods to join them in the bush because foraging and hunting were a more reliable source of food. The relatively relaxed pace of !Kung life—even during times of adversity—challenged long-standing ideas that modern society created a surplus of leisure time. It created exactly the opposite: a desperate cycle of work, financial obligation, and more work. The !Kung had far fewer belongings than Westerners, but their lives were under much greater personal control. […]

First agriculture, and then industry, changed two fundamental things about the human experience. The accumulation of personal property allowed people to make more and more individualistic choices about their lives, and those choices unavoidably diminished group efforts toward a common good. And as society modernized, people found themselves able to live independently from any communal group. A person living in a modern city or a suburb can, for the first time in history, go through an entire day—or an entire life—mostly encountering complete strangers. They can be surrounded by others and yet feel deeply, dangerously alone.

The evidence that this is hard on us is overwhelming. Although happiness is notoriously subjective and difficult to measure, mental illness is not. Numerous cross-cultural studies have shown that modern society—despite its nearly miraculous advances in medicine, science, and technology—is afflicted with some of the highest rates of depression, schizophrenia, poor health, anxiety, and chronic loneliness in human history. As affluence and urbanization rise in a society, rates of depression and suicide tend to go up rather than down. Rather than buffering people from clinical depression, increased wealth in a society seems to foster it.

Suicide is difficult to study among unacculturated tribal peoples because the early explorers who first encountered them rarely conducted rigorous ethnographic research. That said, there is remarkably little evidence of depression-based suicide in tribal societies. Among the American Indians, for example, suicide was understood to apply in very narrow circumstances: in old age to avoid burdening the tribe, in the ritual paroxysms of grief following the death of a spouse, in a hopeless but heroic battle with an enemy, and in an attempt to avoid the agony of torture. Among tribes that were ravaged by smallpox, it was also understood that a person whose face had been hideously disfigured by lesions might kill themselves. According to The Ethics of Suicide: Historical Sources, early chroniclers of the American Indians couldn’t find any other examples of suicide that were rooted in psychological causes. Early sources report that the Bella Coola, the Ojibwa, the Montagnais, the Arapaho, the Plateau Yuma, the Southern Paiute, and the Zuni, among many others, experienced no suicide at all.

This stands in stark contrast to many modern societies, where the suicide rate is as high as 25 cases per 100,000 people. (In the United States, white middle-aged men currently have the highest rate at nearly 30 suicides per 100,000.) According to a global survey by the World Health Organization, people in wealthy countries suffer depression at as much as eight times the rate they do in poor countries, and people in countries with large income disparities—like the United States—run a much higher lifelong risk of developing severe mood disorders. A 2006 study comparing depression rates in Nigeria to depression rates in North America found that across the board, women in rural areas were less likely to get depressed than their urban counterparts. And urban North American women—the most affluent demographic of the study—were the most likely to experience depression.

The mechanism seems simple: poor people are forced to share their time and resources more than wealthy people are, and as a result they live in closer communities. Inter-reliant poverty comes with its own stresses—and certainly isn’t the American ideal—but it’s much closer to our evolutionary heritage than affluence. A wealthy person who has never had to rely on help and resources from his community is leading a privileged life that falls way outside more than a million years of human experience. Financial independence can lead to isolation, and isolation can put people at a greatly increased risk of depression and suicide. This might be a fair trade for a generally wealthier society—but a trade it is. […]

The alienating effects of wealth and modernity on the human experience start virtually at birth and never let up. Infants in hunter-gatherer societies are carried by their mothers as much as 90 percent of the time, which roughly corresponds to carrying rates among other primates. One can get an idea of how important this kind of touch is to primates from an infamous experiment conducted in the 1950s by a primatologist and psychologist named Harry Harlow. Baby rhesus monkeys were separated from their mothers and presented with the choice of two kinds of surrogates: a cuddly mother made out of terry cloth or an uninviting mother made out of wire mesh. The wire mesh mother, however, had a nipple that dispensed warm milk. The babies took their nourishment as quickly as possible and then rushed back to cling to the terry cloth mother, which had enough softness to provide the illusion of affection. Clearly, touch and closeness are vital to the health of baby primates—including humans.

In America during the 1970s, mothers maintained skin-to-skin contact with babies as little as 16 percent of the time, which is a level that traditional societies would probably consider a form of child abuse. Also unthinkable would be the modern practice of making young children sleep by themselves. In two American studies of middle-class families during the 1980s, 85 percent of young children slept alone in their own room—a figure that rose to 95 percent among families considered “well educated.” Northern European societies, including America, are the only ones in history to make very young children sleep alone in such numbers. The isolation is thought to make many children bond intensely with stuffed animals for reassurance. Only in Northern European societies do children go through the well-known developmental stage of bonding with stuffed animals; elsewhere, children get their sense of safety from the adults sleeping near them.

The point of making children sleep alone, according to Western psychologists, is to make them “self-soothing,” but that clearly runs contrary to our evolution. Humans are primates—we share 98 percent of our DNA with chimpanzees—and primates almost never leave infants unattended, because they would be extremely vulnerable to predators. Infants seem to know this instinctively, so being left alone in a dark room is terrifying to them. Compare the self-soothing approach to that of a traditional Mayan community in Guatemala: “Infants and children simply fall asleep when sleepy, do not wear specific sleep clothes or use traditional transitional objects, room share and cosleep with parents or siblings, and nurse on demand during the night.” Another study notes about Bali: “Babies are encouraged to acquire quickly the capacity to sleep under any circumstances, including situations of high stimulation, musical performances, and other noisy observances which reflect their more complete integration into adult social activities.”

As modern society reduced the role of community, it simultaneously elevated the role of authority. The two are uneasy companions, and as one goes up, the other tends to go down. In 2007, anthropologist Christopher Boehm published an analysis of 154 foraging societies that were deemed to be representative of our ancestral past, and one of their most common traits was the absence of major wealth disparities between individuals. Another was the absence of arbitrary authority. “Social life is politically egalitarian in that there is always a low tolerance by a group’s mature males for one of their number dominating, bossing, or denigrating the others,” Boehm observed. “The human conscience evolved in the Middle to Late Pleistocene as a result of… the hunting of large game. This required… cooperative band-level sharing of meat.”

Because tribal foragers are highly mobile and can easily shift between different communities, authority is almost impossible to impose on the unwilling. And even without that option, males who try to take control of the group—or of the food supply—are often countered by coalitions of other males. This is clearly an ancient and adaptive behavior that tends to keep groups together and equitably cared for. In his survey of ancestral-type societies, Boehm found that—in addition to murder and theft—one of the most commonly punished infractions was “failure to share.” Freeloading on the hard work of others and bullying were also high up on the list. Punishments included public ridicule, shunning, and, finally, “assassination of the culprit by the entire group.” […]

Most tribal and subsistence-level societies would inflict severe punishments on anyone who caused that kind of damage. Cowardice is another form of community betrayal, and most Indian tribes punished it with immediate death. (If that seems harsh, consider that the British military took “cowards” off the battlefield and executed them by firing squad as late as World War I.) It can be assumed that hunter-gatherers would treat their version of a welfare cheat or a dishonest banker as decisively as they would a coward. They may not kill him, but he would certainly be banished from the community. The fact that a group of people can cost American society several trillion dollars in losses—roughly one-quarter of that year’s gross domestic product—and not be tried for high crimes shows how completely de-tribalized the country has become.

Dishonest bankers and welfare or insurance cheats are the modern equivalent of tribe members who quietly steal more than their fair share of meat or other resources. That is very different from alpha males who bully others and openly steal resources. Among hunter-gatherers, bullying males are often faced down by coalitions of other senior males, but that rarely happens in modern society. For years, the United States Securities and Exchange Commission has been trying to force senior corporate executives to disclose the ratio of their pay to that of their median employees. During the 1960s, senior executives in America typically made around twenty dollars for every dollar earned by a rank-and-file worker. Since then, that figure has climbed to 300-to-1 among S&P 500 companies, and in some cases it goes far higher than that. The US Chamber of Commerce managed to block all attempts to force disclosure of corporate pay ratios until 2015, when a weakened version of the rule was finally passed by the SEC in a strict party-line vote of three Democrats in favor and two Republicans opposed.

In hunter-gatherer terms, these senior executives are claiming a disproportionate amount of food simply because they have the power to do so. A tribe like the !Kung would not permit that because it would represent a serious threat to group cohesion and survival, but that is not true for a wealthy country like the United States. There have been occasional demonstrations against economic disparity, like the Occupy Wall Street protest camp of 2011, but they were generally peaceful and ineffective. (The riots and demonstrations against racial discrimination that later took place in Ferguson, Missouri, and Baltimore, Maryland, led to changes in part because they attained a level of violence that threatened the civil order.) A deep and enduring economic crisis like the Great Depression of the 1930s, or a natural disaster that kills tens of thousands of people, might change America’s fundamental calculus about economic justice. Until then, the American public will probably continue to refrain from broadly challenging both male and female corporate leaders who compensate themselves far in excess of their value to society.

That is ironic, because the political origins of the United States lay in confronting precisely this kind of resource seizure by people in power. King George III of England caused the English colonies in America to rebel by trying to tax them without allowing them a voice in government. In this sense, democratic revolutions are just a formalized version of the sort of group action that coalitions of senior males have used throughout the ages to confront greed and abuse. Thomas Paine, one of the principal architects of American democracy, wrote a formal denunciation of civilization in a tract called Agrarian Justice: “Whether… civilization has most promoted or most injured the general happiness of man is a question that may be strongly contested,” he wrote in 1795. “[Both] the most affluent and the most miserable of the human race are to be found in the countries that are called civilized.”

When Paine wrote his tract, Shawnee and Delaware warriors were still attacking settlements just a few hundred miles from downtown Philadelphia. They held scores of white captives, many of whom had been adopted into the tribe and had no desire to return to colonial society. There is no way to know the effect on Paine’s thought process of living next door to a communal Stone-Age society, but it might have been crucial. Paine acknowledged that these tribes lacked the advantages of the arts and science and manufacturing, and yet they lived in a society where personal poverty was unknown and the natural rights of man were actively promoted.

In that sense, Paine claimed, the American Indian should serve as a model for how to eradicate poverty and bring natural rights back into civilized life.

* * *

Dark Triad Domination
Urban Weirdness
Social Disorder, Mental Disorder
The Unimagined: Capitalism and Crappiness
It’s All Your Fault, You Fat Loser!
How Universal Is The Mind?
Bias About Bias
“Beyond that, there is only awe.”


Hunger for Connection

“Just as there are mental states only possible in crowds, there are mental states only possible in privacy.”

Those are the words of Sarah Perry from Luxuriating in Privacy. I came across the quote from a David Chapman tweet. He then asks, “Loneliness epidemic—or a golden age of privacy?” With that lure, I couldn’t help but bite.

I’m already familiar with Sarah Perry’s writings at Ribbonfarm. There is even an earlier comment by me at the piece the quote comes from, although I had forgotten about it. In the post, she begins with links to some of her previous commentary, the first one (Ritual and the Consciousness Monoculture) having been my introduction to her work. I referenced it in my post Music and Dance on the Mind and it does indeed connect to the above thought on privacy.

In that other post by Perry, she discusses Keeping Together in Time by William H. McNeill. His central idea is “muscular bonding” that creates, maintains, and expresses a visceral sense of group-feeling and fellow-feeling. This can happen through marching, dancing, rhythmic movements, drumming, chanting, choral singing, etc. (for example, see: Choral Singing and Self-Identity). McNeill quotes A. R. Radcliffe-Brown about the Andaman islanders: “As the dancer loses himself in the dance, as he becomes absorbed in the unified community, he reaches a state of elation in which he feels himself filled with energy or force immediately beyond his ordinary state, and so finds himself able to perform prodigies of exertion” (Kindle Locations 125-126).

The individual is lost, at least temporarily, an experience humans are drawn to in many forms. Individuality is tiresome and we moderns feel compelled to take a vacation from it. Having forgotten earlier ways of being, maybe privacy is the closest most of us get to lowering our stressful defenses of hyper-individualistic pose and performance. The problem is that privacy so easily reinforces the very individualistic isolation that drains us of energy.

This might create the addictive cycle that Johann Hari discussed in Chasing the Scream, and it would relate to the topic of depression in his most recent book, Lost Connections. He makes a strong argument about the importance of relationships of intimacy, bonding, and caring (some communities have begun to take this issue seriously; others argue that far deeper change is required, radical and revolutionary). In particular, the rat park research is fascinating. The problem with addiction is that it simultaneously relieves the pain of our isolation while further isolating us. Or at least this is what happens in a punitive society with a weak community and culture of trust. For that reason, we should look to other cultures for comparison. In some traditional societies, there is a greater balance and freedom to choose. I specifically had the Piraha in mind, as described by Daniel Everett.

The Piraha are a prime example of how not all cultures have a dualistic conflict between self and community, between privacy and performance. Their communities are loosely structured and the individual is largely autonomous in how and with whom they use their time. They lack much in the way of formal social structure, since there are no permanent positions of hierarchical authority (e.g., no tribal council of elders), although any given individual might temporarily take a leadership position in order to help accomplish an immediate task. Nor do they have much in the way of ritual or religion. It isn’t an oppressive society.

Accordingly, Everett observes how laid back, relaxed, and happy they seem. Depression, anxiety, and suicide appear foreign to them. When he told them about a depressed family member who killed herself, the Piraha laughed because they assumed he was joking. There was no known case of suicide in the tribe. Even more interesting is that, growing up, the Piraha don’t exhibit transitional periods such as the terrible twos or teenage rebelliousness. They simply go from being weaned to joining adult activities, with no one telling them what to do.

The modern perceived conflict between group and individual might not be a universal and intrinsic aspect of human society. But it does seem a major issue for WEIRD societies in particular. Maybe it has to do with how ego-bound our sense of identity is. The other thing the Piraha lack is a permanent, unchanging self-identity, because an event such as a meeting with a spirit in the jungle might lead to a change of name and, to the Piraha, the person who went by the previous name is no longer there. They feel no need to defend their individuality because any given individual self can be set aside.

It is hard for Westerners, and Americans most of all, to imagine a society this far different. It is outside the mainstream capacity for imagining what is humanly possible. It’s similar to why so many people reject out of hand such theories as Julian Jaynes’ bicameral mind. Such worldviews simply don’t fit into what we know. But maybe this sense of conflict we cling to is entirely unnecessary. If so, why do we feel such conflict is inevitable? And so why do we value privacy so highly? What is it that we seek from being isolated and alone? What is it that we think we have lost that needs to be regained? To help answer these questions, I’ll present a quote by Julian Jaynes that I included in writing Music and Dance On the Mind — from his book that Perry is familiar with:

“Another advantage of schizophrenia, perhaps evolutionary, is tirelessness. While a few schizophrenics complain of generalized fatigue, particularly in the early stages of the illness, most patients do not. In fact, they show less fatigue than normal persons and are capable of tremendous feats of endurance. They are not fatigued by examinations lasting many hours. They may move about day and night, or work endlessly without any sign of being tired. Catatonics may hold an awkward position for days that the reader could not hold for more than a few minutes. This suggests that much fatigue is a product of the subjective conscious mind, and that bicameral man, building the pyramids of Egypt, the ziggurats of Sumer, or the gigantic temples at Teotihuacan with only hand labor, could do so far more easily than could conscious self-reflective men.”

Considering that, it could be argued that privacy is part of the same social order, ideological paradigm, and reality tunnel that tires us out so much in the first place. Endlessly without respite, we feel socially compelled to perform our individuality. And even in retreating into privacy, we go on performing our individuality for our own private audience, as played out on the internalized stage of self-consciousness that Jaynes describes. That said, even though the cost is high, it leads to great benefits for society as a whole. Modern civilization wouldn’t be possible without it. The question is whether the costs outweigh the benefits and also whether the costs are sustainable or self-destructive in the long term.

As Eli wrote in the comments section to Luxuriating in Privacy: “Privacy isn’t an unalloyed good. As you mention, we are getting ever-increasing levels of privacy to “luxuriate” in. But who’s to say we’re not just coping with the change modernity constantly imposes on us? Why should we elevate the coping mechanism, when it may well be merely a means to lessen the pain of an unnecessarily “alienating” constructed environment.” And “isn’t the tiresomeness of having to model the social environment itself contingent on the structural precariousness of one’s place in an ambiguous, constantly changing status hierarchy?”

Still, I do understand where Perry is coming from, as I’m very much an introvert who values my alone time and can be quite jealous of my privacy, although I can’t say that close and regular social contact “fills me with horror.” Having lived alone for years in apartments and barely knowing my neighbors, I spend little time at my ‘home’ and instead choose to regularly socialize with my family at my parents’ house. Decades of depression have made me acutely aware of the double-edged sword of privacy.

Let me respond to some specifics of Perry’s argument. “Consider obesity,” she writes. “A stylized explanation for rising levels of overweight and obesity since the 1980s is this: people enjoy eating, and more people can afford to eat as much as they want to. In other words, wealth and plenty cause obesity.” There are some insightful comparisons of eating practices. Not all modern societies with equal access to food have equal levels of obesity. Among many other health problems, obesity can result from stress, because our bodies prepare for challenging times by accumulating fat reserves. And if there is enough stress, studies have found, this is epigenetically passed on to children.

As a contrast, consider the French culture surrounding food. The French don’t eat much fast food and don’t tend to eat or drink on the go. It is more common for them to sit down to enjoy their coffee in the morning, rather than putting it in a travel mug to drink on the way to work. Also, they are more likely to take long lunches in order to eat leisurely, and they typically do so with others. For the French, the expectation is that meals are to be enjoyed as a social experience, and so they organize their entire society accordingly. Even though they eat many foods that some consider unhealthy, they don’t have the same high rates of stress-related diseases as do Americans.

An even greater contrast comes from looking once again at the Piraha. They live in an environment of immense abundance, and it requires little work to attain sustenance. In a few hours of work, an individual can get enough food to feed an extended family for multiple meals. They don’t worry about going hungry and yet, for various reasons, will choose not to eat for extended periods of time when they wish to spend their time in other ways, such as relaxing or dancing. They impose a feast-and-fast lifestyle on themselves, a typical pattern for hunter-gatherers. As with the French, when the Piraha have a meal, it is very much a social event. Unsurprisingly, the Piraha are slim and trim, muscular and healthy. They don’t suffer from stress-related physical and mental conditions, certainly not obesity.

Perry argues that, “Analogized to privacy, perhaps the explanation of atomization is simply that people enjoy privacy, and can finally afford to have as much as they want. Privacy is an economic good, and people show a great willingness to trade other goods for more privacy.” Using Johann Hari’s perspective, I might rephrase it: Addiction is economically profitable within the hyper-individualism of capitalist realism, and people show a difficult-to-control craving that causes them to pay high costs to feed their addiction. Sure, temporarily alleviating the symptoms makes people feel better. But what is it a symptom of? That question is key to understanding. I’m persuaded that the issue at hand is disconnection, isolation, and loneliness. So much else follows from that.

Explaining the title of her post, Perry writes: “One thing that people are said to do with privacy is to luxuriate in it. What are the determinants of this positive experience of privacy, of privacy experienced as a thing in itself, rather than through violation?” She goes on to describe the features of privacy, various forms of personal space and enclosure. Of course, Julian Jaynes argued that the ultimate privacy is the construction of individuality itself, the experience of space metaphorically internalized and interiorized. Further development of privacy, however, is a rather modern invention. For example, it wasn’t until recent centuries that private bedrooms became common, having been popularized in Anglo-American culture by Quakers. Before that, full privacy was a rare experience, far from being considered a human necessity or human right.

But we have come to take privacy for granted; not talking about certain details is itself a central part of privacy. “Everybody knows that everybody poops. Still, you’re not supposed to poop in front of people. The domain of defecation is tacitly edited out of our interactions with other people: for most social purposes, we are expected to pretend that we neither produce nor dispose of bodily wastes, and to keep any evidence of such private. Polite social relations exclude parts of reality by tacit agreement; scatological humor is a reminder of common knowledge that is typically screened off by social agreement. Sex and masturbation are similar.”

Defecation is a great example. There is nothing universal about the privatization of the act of pooping. In early Europe, relieving oneself in public was common and considered well within social norms. It was a slow ‘civilizing’ process to teach people to be ashamed of bodily functions, even simple things like farting and belching in public (there are a number of interesting books on the topic). I was intrigued by Susan P. Mattern’s The Prince of Medicine. She describes how almost everything in the ancient world was a social experience. Even taking a shit was an opportunity to meet and chat with one’s family, friends, and neighbors. They apparently felt no drain of energy and no need to perform in their social way of being in the world. It was relaxed and normal to them, simply how they lived, and they knew nothing else.

Also, sex and masturbation haven’t always been exclusively private acts. We have little knowledge of sex in the archaic world. Jaynes noted that sexuality wasn’t treated as anything particularly concerning and worrisome during the bicameral era. Obsession with sex, positive or negative, more fully developed during the Axial Age. As late as Feudalism, heavily Christianized Europe offered little opportunity for privacy and maintained a relatively open attitude about sexuality during many public celebrations, specifically Carnival, and they spent an amazing amount of their time in public celebrations. Barbara Ehrenreich describes this ecstatic communality in Dancing in the Streets. Like the Piraha, these earlier Europeans had a more social and fluid sense of identity.

Let me finish by responding to Perry’s conclusion: “As I wrote in A Bad Carver, social interaction has increasingly become “unbundled” from other things. This may not be a coincidence: it may be that people have specifically desired more privacy, and the great unbundling took place along that axis especially, in response to demand. Modern people have more room, more autonomy, more time alone, and fewer social constraints than their ancestors had a hundred years ago. To scoff at this luxury, to call it “alienation,” is to ignore that it is the choices of those who are allegedly alienated that create this privacy-friendly social order.”

There is no doubt what people desire. In any given society, most people desire whatever they are acculturated to desire. Example after example of this can be found in social science research, the anthropological literature, and classical studies. It’s not obvious to me that there is any evidence that modern people have fewer social constraints. What is clear is that we have different social constraints, and that difference seems to have led to immense stress, anxiety, depression, and fatigue. Barbara Ehrenreich discusses the rise in depression in particular, as has Mark Fisher in his work on capitalist realism (I quote him and others here). The studies on WEIRD cultures are also telling (see: Urban Weirdness and Dark Triad Domination).

The issue isn’t simply what choices we make but what choices we are offered and denied, what choices we can and cannot perceive or even imagine. And that relates to how we lose contact with the human realities of other societies that embody other possibilities not chosen or even considered within the constraints of our own. We are disconnected not only from others within our society. Our WEIRD monocultural dominance has isolated us also from other expressions of human potential.

We luxuriate in privacy because our society offers us few other choices, like a choice between junk food or starvation, in which case junk food tastes delicious. For most modern Westerners, privacy is nothing more than a temporary escape from an overwhelming world. But what we most deeply hunger for is genuine connection.

Hyperobjects and Individuality

We live in a liberal age and the liberal paradigm dominates, not just for liberals but for everyone. Our society consists of nothing other than liberalism and reactions to liberalism. And at the heart of it all is individualism. But through the cracks, other possibilities can be glimpsed.

One challenging perspective is that of hyperobjects, a concept proposed by Timothy Morton — as he writes: “Hyperobjects pose numerous threats to individualism, nationalism, anti-intellectualism, racism, speciesism, anthropocentrism, you name it. Possibly even capitalism itself.”

Evander Price summarizes the origin of the theory and the traits of hyperobjects (Hyperobjects & Dark Ecology). He breaks it down into seven points. The last three refer to individuality — here they are (with some minor editing):

5) Individuality is lost. We are not separate from other things. (This is Object Oriented Ontology) — Morton calls this entangledness. “Knowing more about hyperobjects is knowing more about how we are hopelessly fastened to them.” A little bit like Ahab all tangled up in the lines of Moby-Dick.

6) “Utilitarianism is deeply flawed when it comes to working with hyperobjects. The simple reason why is that hyperobjects are profoundly futural.” (135) <–I’ve been arguing against utilitarianism for a while now within this line of thinking; this is because utilitarianism, the idea that moral goodness is measured by whether an action or idea increases the overall happiness of a given community, is always embedded within a temporal framework, outside of which the collective ‘happiness’ of a given individual or community is not considered. Fulfilling the greatest happiness for the current generation is always dependent on taking resources now [from] future generations. What is needed is chronocritical utilitarianism, but that is anathema to the radical individuality of utilitarianism.

7) Undermining — the opposite of hyperobjecting. From Harman. “Undermining is when things are reduced to smaller things that are held to be more real. The classic form of undermining in contemporary capitalism is individualism: ‘There are only individuals and collective decisions are ipso facto false.’” <– focusing on how things affect me because I am the most important is essentially undermining that I exist as part of a community, and a planet.

And from the book on the topic:

Hyperobjects:
Philosophy and Ecology after the End of the World

by Timothy Morton
Kindle Locations 427-446

The ecological thought that thinks hyperobjects is not one in which individuals are embedded in a nebulous overarching system, or conversely, one in which something vaster than individuals extrudes itself into the temporary shapes of individuals. Hyperobjects provoke irreductionist thinking, that is, they present us with scalar dilemmas in which ontotheological statements about which thing is the most real (ecosystem, world, environment, or conversely, individual) become impossible. 28 Likewise, irony qua absolute distance also becomes inoperative. Rather than a vertiginous antirealist abyss, irony presents us with intimacy with existing nonhumans.

The discovery of hyperobjects and OOO are symptoms of a fundamental shaking of being, a being-quake. The ground of being is shaken. There we were, trolling along in the age of industry, capitalism, and technology, and all of a sudden we received information from aliens, information that even the most hardheaded could not ignore, because the form in which the information was delivered was precisely the instrumental and mathematical formulas of modernity itself. The Titanic of modernity hits the iceberg of hyperobjects. The problem of hyperobjects, I argue, is not a problem that modernity can solve. Unlike Latour then, although I share many of his basic philosophical concerns, I believe that we have been modern, and that we are only just learning how not to be.

Because modernity banks on certain forms of ontology and epistemology to secure its coordinates, the iceberg of hyperobjects thrusts a genuine and profound philosophical problem into view. It is to address these problems head on that this book exists. This book is part of the apparatus of the Titanic, but one that has decided to dash itself against the hyperobject. This rogue machinery— call it speculative realism, or OOO— has decided to crash the machine, in the name of a social and cognitive configuration to come, whose outlines are only faintly visible in the Arctic mist of hyperobjects. In this respect, hyperobjects have done us a favor. Reality itself intervenes on the side of objects that from the prevalent modern point of view— an emulsion of blank nothingness and tiny particles— are decidedly medium-sized. It turns out that these medium-sized objects are fascinating, horrifying, and powerful.

For one thing, we are inside them, like Jonah in the Whale. This means that every decision we make is in some sense related to hyperobjects. These decisions are not limited to sentences in texts about hyperobjects.

Kindle Locations 467-472

Hyperobjects are a good candidate for what Heidegger calls “the last god,” or what the poet Hölderlin calls “the saving power” that grows alongside the dangerous power. 31 We were perhaps expecting an eschatological solution from the sky, or a revolution in consciousness— or, indeed, a people’s army seizing control of the state. What we got instead came too soon for us to anticipate it. Hyperobjects have dispensed with two hundred years of careful correlationist calibration. The panic and denial and right-wing absurdity about global warming are understandable. Hyperobjects pose numerous threats to individualism, nationalism, anti-intellectualism, racism, speciesism, anthropocentrism, you name it. Possibly even capitalism itself.

Kindle Locations 2712-2757

Marxists will argue that huge corporations are responsible for ecological damage and that it is self-destructive to claim that we are all responsible. Marxism sees the “ethical” response to the ecological emergency as hypocrisy. Yet according to many environmentalists and some anarchists, in denying that individuals have anything to do with why Exxon pumps billions of barrels of oil, Marxists are displacing the blame away from humans. This view sees the Marxist “political” response to the ecological emergency as hypocrisy. The ethics– politics binary is a true differend: an opposition so radical that it is in some sense insuperable. Consider this. If I think ethics, I seem to want to reduce the field of action to one-on-one encounters between beings. If I think politics, I hold that one-on-one encounters are never as significant as the world of (economic, class, moral, and so on) relations in which they take place. These two ways of talking form what Adorno would have called two halves of a torn whole, which nonetheless don’t add up together. Some nice compromise “between” the two is impossible. Aren’t we then hobbled when it comes to issues that affect society as a whole— nay the biosphere as a whole— yet affect us all individually (I have mercury in my blood, and ultraviolet rays affect me unusually strongly)?

Yet the deeper problem is that our (admittedly cartoonish) Marxist and anarchist see the problem as hypocrisy. Hypocrisy is denounced from the standpoint of cynicism. Both the Marxist and the anti-Marxist are still wedded to the game of modernity, in which she who grabs the most cynical “meta” position is the winner: Anything You Can Do, I Can Do Meta. Going meta has been the intellectual gesture par excellence for two centuries. I am smarter than you because I can see through you. You are smarter than they are because you ground their statements in conditions of possibility. From a height, I look down on the poor fools who believe what they think. But it is I who believes, more than they. I believe in my distance, I believe in the poor fools, I believe they are deluded. I have a belief about belief: I believe that belief means gripping something as tightly as possible with my mind. Cynicism becomes the default mode of philosophy and of ideology. Unlike the poor fool, I am undeluded— either I truly believe that I have exited from delusion, or I know that no one can, including myself, and I take pride in this disillusionment.

This attitude is directly responsible for the ecological emergency, not the corporation or the individual per se, but the attitude that inheres both in the corporation and in the individual, and in the critique of the corporation and of the individual. Philosophy is directly embodied in the size and shape of a paving stone, the way a Coca Cola bottle feels to the back of my neck, the design of an aircraft, or a system of voting. The overall guiding view, the “top philosophy,” has involved a cynical distance. It is logical to suppose that many things in my world have been affected by it— the way a shopping bag looks, the range of options on the sports channel, the way I think Nature is “over yonder.” By thinking rightness and truth as the highest possible elevation, as cynical transcendence, I think Earth and its biosphere as the stage set on which I prance for the amusement of my audience. Indeed, cynicism has already been named in some forms of ideology critique as the default mode of contemporary ideology. 48 But as we have seen, cynicism is only hypocritical hypocrisy.

Cynicism is all over the map: left, right, green, indifferent. Isn’t Gaian holism a form of cynicism? One common Gaian assertion is that there is something wrong with humans. Nonhumans are more Natural. Humans have deviated from the path and will be wiped out (poor fools!). No one says the same about dolphins, but it’s just as true. If dolphins go extinct, why worry? Dolphins will be replaced. The parts are greater than the whole. A mouse is not a mouse if it is not in the network of Gaia. 49 The parts are replaceable. Gaia will replace humans with a less defective component. We are living in a gigantic machine— a very leafy one with a lot of fractals and emergent properties to give it a suitably cool yet nonthreatening modern aesthetic feel.

It is fairly easy to discern how refusing to see the big picture is a form of what Harman calls undermining. 50 Undermining is when things are reduced to smaller things that are held to be more real. The classic form of undermining in contemporary capitalism is individualism: “There are only individuals and collective decisions are ipso facto false.” But this is a problem that the left, and environmentalism more generally, recognize well.

The blind spot lies in precisely the opposite direction: in how common ideology tends to think that bigger is better or more real. Environmentalism, the right, and the left seem to have one thing in common: they all hold that incremental change is a bad thing. Yet doesn’t the case against incrementalism, when it comes to things like global warming, amount to a version of what Harman calls overmining, in the domain of ethics and politics? Overmining is when one reduces a thing “upward” into an effect of some supervenient system (such as Gaia or consciousness). 51 Since bigger things are more real than smaller things, incremental steps will never accomplish anything. The critique of incrementalism laughs at the poor fools who are trying to recycle as much as possible or drive a Prius. By postponing ethical and political decisions into an idealized future, the critique of incrementalism leaves the world just as it is, while maintaining a smug distance toward it. In the name of the medium-sized objects that coexist on Earth (aspen trees, polar bears, nematode worms, slime molds, coral, mitochondria, Starhawk, and Glenn Beck), we should forge a genuinely new ethical view that doesn’t reduce them or dissolve them.

 

The Group Conformity of Hyper-Individualism

When talking to teens, it’s helpful to understand how their tendency to form groups and cliques is partly a consequence of American culture. In America, we encourage individuality. Children freely and openly develop strong preferences—defining their self-identity by the things they like and dislike. They learn to see differences. Though singular identity is the long-term goal, in high school this identity-quest is satisfied by forming and joining distinctive subgroups. So, in an ironic twist, the more a culture emphasizes individualism, the more the high school years will be marked by subgroupism. Japan, for instance, values social harmony over individualism, and children are discouraged from asserting personal preferences. Thus, less groupism is observed in their high schools.

That is from Bronson and Merryman’s NurtureShock (p. 45). It touches on a number of points. The most obvious point is made clear by the authors. American culture is defined by groupism. The authors discussed this in a chapter about race, explaining why group stereotypes are so powerful in this kind of society. They write that, “The security that comes from belonging to a group, especially for teens, is palpable. Traits that mark this membership are—whether we like it or not—central to this developmental period.” This was emphasized with a University of Michigan study of black high school students in Detroit “that shows just how powerful this need to belong is, and how much it can affect a teen.”

Particularly for the boys, those who rated themselves as dark-skinned blacks had the highest GPAs. They also had the highest ratings for social acceptance and academic confidence. The boys with lighter skin tones were less secure socially and academically.

The researchers subsequently replicated these results with students who “looked Latino.”

The researchers concluded that doing well in school could get a minority teen labeled as “acting white.” Teens who were visibly sure of membership within the minority community were protected from this insult and thus more willing to act outside the group norm. But the light-skinned blacks and the Anglo-appearing Hispanics—their status within the minority felt more precarious. So they acted more in keeping with their image of the minority identity—even if it was a negative stereotype—in order to solidify their status within the group.

A group-minded society reinforces stereotypes at a very basic level of human experience and relationships. Along with a weak culture of trust, American hyper-individualism creates the conditions for strong group identities and all that goes with it. Stereotypes become the defining feature of group identities.

The worst part isn’t the stereotypes projected onto us but the stereotypes we internalize. And those who least fit the stereotypes are those who feel the greatest pressure to conform to them in dressing and speaking, acting and behaving in stereotypical ways. There isn’t a strong national identity to create social belonging and support. So, Americans turn to sub-groups and the population becomes splintered, the citizenry divided against itself.

The odd part about this is how counterintuitive it seems, according to the dominant paradigm. The ironic part about American hyper-individualism is that it is a social norm demanding social conformity through social enforcement. In many ways, American society is one of the most conformist countries in the world, related to how much we are isolated into enclaves of groupthink by media bubbles and echo chambers.

This isn’t inevitable, as the comparison to the Japanese makes clear. Not all societies operate according to hyper-individualistic ideology. In Japan, it’s not just the outward expression of the individual that is suppressed but also separate sub-group identities within the larger society. According to one study, this leads to greater narcissism among the Japanese. Because it is taboo to share personal issues in the public sphere, the Japanese spend more time privately dwelling on their personal issues (i.e., narcissism as self-obsession). This is exacerbated by the lack of sub-groups through which to publicly express the personal and socially strengthen individuality. Inner experience, for the Japanese, has fewer outlets to give it form and so there are fewer ways to escape the isolated self.

Americans, on the other hand, are so group-oriented that even their personal issues are part of the public sphere. They value both the speaking of personal views and the listening to the personal views of others — upheld by liberal democratic ideals of free speech, open dialogue, and public debate. For Americans, the personal is the public in the way that the individualistic is the groupish. If we are to apply narcissism to Americans, it is mostly in terms of what is called collective narcissism. We Americans are narcissistic about the groups we belong to. And our entire self-identities get filtered through group identities, presumably with a less intense self-awareness than the Japanese experience.

This is why American teens show a positive response to being perceived as closely conforming to a stereotypical group such as within a racial community. The same pattern, though, wouldn’t be found in a country like Japan. For a Japanese to be strongly identified with a separate sub-group would be seen as unacceptable to larger social norms. Besides, there is little need for sub-group belonging in Japan, since most Japanese would grow up with a confident sense of simply being Japanese — no effort required. Americans have to work much harder for their social identities and so, in compensation, Americans also have to go to a greater extent in proving their individuality.

It’s not that one culture is superior to the other. The respective problems are built into each society. In fact, the problems are necessary in maintaining the social orders. To eliminate the problems would be to chip away at the foundations, either leading to destruction or requiring a restructuring. That is the reason that, in the United States, racism is so persistent and so difficult to talk about. The very social order is at stake.

Delirium of Hyper-Individualism

Individualism is a strange thing. For anyone who has spent much time meditating, it’s obvious that there is no there there. It slips through one’s grasp, like the aether that eluded the ancient philosophers who tried to study it. The individual self is the modernization of the soul. Like the ghost in the machine and the god of the gaps, it is a theological belief defined by its absence in the world. It’s a social construct — a statement that is easily misunderstood.

In modern society, individualism has been raised up to an entire ideological worldview. It is all-encompassing, having infiltrated nearly every aspect of our social lives and become internalized as a cognitive frame. Traditional societies didn’t have this obsession with an idealized self as isolated and autonomous. Go back far enough and the records seem to show societies that didn’t even have a concept, much less an experience, of individuality.

Yet for all its dominance, the ideology of individualism is superficial. It doesn’t explain much of our social order and personal behavior. We don’t act as if we actually believe in it. It’s a convenient fiction that we so easily disregard when inconvenient, as if it isn’t all that important after all. In our most direct experience, individuality simply makes no sense. We are social creatures through and through. We don’t know how to be anything else, no matter what stories we tell ourselves.

The ultimate value of this individualistic ideology is, ironically, as social control and social justification.

The wealthy, the powerful and privileged, even the mere middle class to a lesser degree — they get to be individuals when everything goes right. They get all the credit and all the benefits. All of society serves them because they deserve it. But when anything goes wrong, they hire lawyers who threaten anyone who challenges them or they settle out of court, they use their crony connections and regulatory capture to avoid consequences, they declare bankruptcy when one of their business ventures fails, and they endlessly scapegoat those far below them in the social hierarchy.

The profits and benefits are privatized while the costs are externalized. This is socialism for the rich and capitalism for the poor, with the middle class getting some combination of the two. This is why democratic rhetoric justifies plutocracy while authoritarianism keeps the masses in line. This stark reality is hidden behind the utopian ideal of individualism with its claims of meritocracy and a just world.

The fact of the matter is that no individual ever became successful. Let’s do an experiment. Take an individual baby, let’s say the little white male baby of wealthy parents with their superior genetics. Now leave that baby in the woods to raise himself into adulthood and bootstrap himself into a self-made man. I wonder how well that would work for his survival and future prospects. If privilege and power, if opportunity and resources, if social capital and collective inheritance, if public goods and the commons have no major role to play such that the individual is solely responsible to himself, we should expect great things from this self-raised wild baby.

But if it turns out that hyper-individualism is total bullshit, we should instead expect that baby to die of exposure and starvation or become the prey of a predator feeding its own baby without any concerns for individuality. Even simply leaving a baby untouched and neglected in an orphanage will cause failure to thrive and death. Without social support, our very will to live disappears. Social science research has demonstrated the immense social and environmental influences on humans. For a long time now there has been no real debate about this social reality of our shared humanity.

So why does this false belief and false idol persist? What horrible result do we fear if we were ever to be honest with ourselves? I get that the ruling elite are ruled by their own egotistic pride and narcissism. I get that the comfortable classes are attached to their comforting lies. But why do the rest of us go along with their self-serving delusions? It is the strangest thing in the world for a society to deny it is a society.

The Head of Capital and the Body Politic

What is capitalism? The term is etymologically related to cattle (and chattel). The basic notion of capitalism is fungible wealth. That is property that can be moved around, like cattle (or else what can be moved by cattle, such as being put in a wagon pulled by cattle or some other beast of burden). It relates to a head of cattle. The term capitalism is derived from capital or rather capitale, a late Latin word based on caput, meaning of the head.

A capital is the head of a society and the capitol is where the head of power resides — Capital vs. Capitol:

Both capital and capitol are derived from the Latin root caput, meaning “head.” Capital evolved from the words capitālis, “of the head,” and capitāle, “wealth.” Capitol comes from Capitōlium, the name of a temple (dedicated to Jupiter, the Roman equivalent of the Greek god Zeus) that once sat on the smallest of Rome’s seven hills, Capitoline Hill.

But there is also the body politic or the body of Christ. The head has become the symbolic representation of the body, but the head is just one part of the body. It is the body that is the organic whole, with the people as demos: national citizenry, community members, church congregants, etc. This is the corporeal existence of the social order. And it is the traditional basis of a corporation, specifically as representing some kind of personhood. At one time, objects and organizations were treated as having actual, not just legal, personhood. The body of Christ was perceived as a living reality, not just a convenient way for the powerful to wield their power.

If you go back far enough, the head of a society was apparently quite literal. In the ancient world, when a leader died, they often lopped off his head because that was the source of the voice of authority. Supposedly, bicameral societies involved an experience where people continued hearing voices of dead kings and godmen, presumably why they kept the skull around. The earliest known permanent structures were temples of death cults with headless imagery, and these temples were built prior to humans settling down — prior to agriculture, pottery, and domesticating cattle. They built houses to their gods before they built houses for themselves. The capital of these societies were temples and that was the convenient location for storing their holy skulls.

Gobekli Tepe, like many other similar sites, was located on a hill. That has long been symbolic of power. After bicameral societies developed, they built artificial hills such as mounding up dirt or stacking large stones as pyramids. The head is at the top of the body and it is from that vantage point that all of the world can be seen. It was natural to associate the panoramic view of a hill or mountain with power and authority, to associate vision with visionary experience. Therefore, it made sense to locate a god’s house in such a high place. Temples and churches, until recent history, were typically the tallest structures in any town or city. In this age of capitalism, it is unsurprising that buildings of business now serve that symbolic role of what is held highest in esteem and so housed in the tallest buildings. The CEO is the head of our society, quite literally at the moment with a businessman as president, a new plutocratic aristocracy forming.

What we’ve forgotten is that the head is part of a body. As a mere part of the body, the head should serve the body in that the part should serve the whole and not the other way around. In tribal societies, there is the big man who represents the tribe. He is the head of the community, but his ability to command submission was severely limited. In Native American tribes, it was common for clans to make their own decisions, whether to follow the tribal leader or not. The real power was in the community, in the social order. The Amazonian Piraha go so far as to have no permanent leadership roles at all.

Even in the more complex Western social order before capitalism took hold, feudal lords were constricted by social responsibilities and obligations to their communities. These feudal lords originated from a tradition where kings were beholden to and originally chosen by the community. Their power wasn’t separate from the community, although feudalism slowly developed in that direction, which made possible the takeover by privatized capitalism. But even in early capitalism, plantation owners were still acting as the big men of their communities, handling trade with the external world and taking care of problems within the local population. Store owners began taking over this role. Joe Bageant described the West Virginian town he grew up in during the early 20th century, which still operated according to a barter economy where all outside trade flowed through the store owner with no monetary system being required within the community.

A major difference in these early societies is how important social order was. It was taken as reality, in the way we today take individuality as reality. For most of human existence, most humans would never have been able to comprehend our modern notion of individuality. Primary value was not placed on the individual, not even the individual leader who represented something greater than himself. Even the Roosevelts as presidents still carried a notion of noblesse oblige, which signified that there was something more important than their own individuality — one of the most ancient ideas, and one that has almost entirely disappeared.

Interestingly, pre-modern people, as with tribal people, in some ways had greater freedom in their identity for the very reason that their identity was social, rather than individual. The Piraha can change their name and become a new person, as far as other Piraha are concerned. Under feudalism, carnival allowed people to regularly lose their sense of identity and become something else. We modern people are so attached to our individuality that losing our self seems like madness. Our modern social order is built on the rhetoric of individuality, and this puts immense weight on individuals, possibly explaining the high rates of mental illness and suicide in modern society. Madness and death are our only escape from the ego.

Capitalism, as globalized neoliberalism, is a high pressure system. Instead of the head of society serving the body politic, we worship the detached head as if a new death cult has taken hold. A corporation is the zombie body without a soul, the preeminent form of our corporatist society with the transnational CEO as the god king standing upon his temple hill. We worship individuality to such a degree that only a few at the top are allowed to be genuine individuals, a cult of death by way of a cult of personality, power detached from the demos like a plant uprooted. The ruling elite are the privileged individuals who tell the dirty masses what to do, the voices we hear on the all-pervasive media. The poor are just bodies to be sacrificed on the altar of desperation, homelessness, prison, and war. As Margaret Thatcher stated in no uncertain terms, there is no society. That is to say there is no body politic, just a mass of bodies as food for the gods.

The head of power, like a cancerous tumor, has grown larger than the body politic. The fungible wealth of capitalism can be moved, but where is it to move? The head can't move without the body. Wealth can't be separated from the world that creates it. Do the plutocrats plan on herding their wealth across the starry heavens in the hope of escaping the gravity of the corporeal earth? If we took the plutocrat's hallowed skull and trapped his divine being in a temple hill, what would the voice tell us?

At the end of the Bronze Age, a major factor of the mass collapse of civilizations was the horse-drawn chariot. Horses were an early domesticated animal, a major form of fungible wealth. Horses and chariots made new forms of warfare possible, involving large standing armies that could be quickly moved across vast distances with supply chains to keep them fed and armed. Along with other factors, this was a game-changer and the once stable bicameral societies fell one after another. Bicameral societies were non-capitalistic, but the following Axial Age would set the foundations for what would eventually become modern capitalism. Bicameral civilization remained stable for millennia. The civilization formed from the Axial Age has maintained itself and we are the inheritors of its traditions. The danger is that, like bicameral societies, we might become the victims of our own success in growing so large. Our situation is precarious. A single unforeseen factor could send it all tumbling down. Maybe globalized neoliberalism is our horse-drawn chariot.

A head detached from its body is the symbol of modernity, grotesquely demonstrated by the guillotine of the French Revolution, the horror of horrors to the defenders of the ancien regime. Abstract ideas have taken on a life of their own with ideological systems far outreaching what supports them. It’s like a tree clinging to a crumbling cliffside, as if it were hoping to spread its limbs like wings to take flight out across the chasm below. In forgetting the ground of our being, what has been lost and what even greater loss threatens? Before revolution had begun but with revolution in the air, Jean-Jacques Rousseau wrote in 1750 (Discourse on the Arts and Sciences), “What will become of virtue if riches are to be acquired at any cost? The politicians of the ancient world spoke constantly of morals and virtue; ours speak of nothing but commerce and money.” That question is now being answered.

* * *

There was one detail I forgot to work into this piece. Feudalism was on my mind. The end of feudalism was the final nail in the coffin for the societal transformation that began during the Axial Age. What finally forced the feudal order, upon which the ancien regime was dependent, to fall apart or rather be dismantled was sheep, another domesticated animal.

Feudalism was dependent on labor-intensive agriculture that required a large local peasant population. With sheep herding, fewer people were required. The feudal commons were privatized, the peasants kicked off the land, entire villages were razed to the ground, and probably millions of people over several centuries were made destitute, homeless, and starving.

Vast wealth was transferred into private hands. This created a new plutocratic class within a new capitalist order. There is an interesting relationship between domesticated animals and social change. Another example of this is how free-ranging pigs in the American colonies wreaked havoc on Native American villages and gardens, making their way of life impossible.

This process of destruction is how civilization as we know it was built. Some call this creative destruction. For others, it has been plain destruction.

From Community to Legalism

The United States has become a legalistic society. It has always been more legalistic than some countries, for various reasons, but it has become even more so over time. Earlier in the last century, most problems weren't dealt with through the legal system.

This is why it’s hard to compare present data to past data. A lot of criminal behavior never led people to the court system, much less prison. And even when people ended up in court, judges used to have more legal freedom to be lenient, unlike our present mandatory sentencing. This meant that there wasn’t much in the way of mass incarceration in the US until this past half century or so.

Take juvenile delinquents as a key example, far from a new problem. As urbanization took hold in the late 1800s and into the early Cold War, there was moral panic about teenagers being out of control, turning into criminals, and joining gangs. But most kids with problems didn’t end up facing a judge.

There were community institutions that figured out ways to deal with problems without recourse to legal punishment. Kids might get sent to family members who lived elsewhere, to a group home for delinquents, to reform school, etc. Or they might simply be made to do community service or pay restitution. But none of it would end up as a criminal record, likely not even getting reported in the local newspaper. It would have been dealt with quietly, informally, and privately.

There were cultural reasons for this at the time. It was assumed that kids weren't fully responsible for their own behavior, as kids were treated as dependents of adults. The problems of kids were seen as failures of parenting or of social conditions. There was little tolerance for bad behavior in many ways at that time, but society was also much more forgiving. A kid would have to commit many major crimes before he would end up in court and in jail.

The downside of this was that individuals had fewer rights, as people were seen more in social terms. It was easier to institutionalize people back then. Or if a girl got pregnant, her family would make sure she was sent somewhere else so as not to bring shame on the family. Juveniles were considered dependents until well into young adulthood. A 21-year-old woman accused of prostitution, even falsely, could find herself sent off to a group home for girls. Early 20th-century childhood was highly protected and extended, although far different from present helicopter parenting.

Parents were considered legally and morally responsible for their kids, in a way that is not seen these days. Individual rights were still rather limited in the early 20th century. But there was also a sense of community responsibility for members of the community. It was accepted that social conditions shaped and influenced individuals. So, to change individual behavior, it was understood that social conditions needed to be changed for the individual.

In present American society, we see the past as socially oppressive, and it was. We now put the individual before the community. We think it's wrong to send juvenile delinquents off to reform schools, to separate the low-IQ kids from other students, and to institutionalize the mentally ill. But this typically means we simply ignore problems.

The kid with severe autism in a normal classroom is not getting a good education or being prepared for adult life in any kind of way, although there is merit to his being socialized with his neurotypical peers. The mentally ill being homeless instead of in institutions is not exactly an improvement, even considering the problems of psychiatric institutions in the past. And the world is not a better place for our warehousing problematic people in prisons.

Our society has been pushed to an extreme. It would be nice to see more balance between rights of individuals and the responsibility of communities. But that isn’t possible if our main options are to either ignore problems or turn to the legal system. This is a difficult challenge, as increasing urbanization and industrialization have led to the breakdown of communities. There was a much stronger social fabric a century ago. It’s harder for us to turn to community solutions now since communities no longer function as they once did. And growing inequality has undermined the culture of trust that is necessary for well-functioning community.

Yet it’s obvious, according to polls, that most Americans realize that social problems require social solutions. But our political system hasn’t caught up with this social reality. Or rather the ruling class would rather not admit to it.

Self-Care of Summer Soldiers and Sunshine Patriots

THESE are the times that try men’s souls. The summer soldier and the sunshine patriot will, in this crisis, shrink from the service of their country; but he that stands it now, deserves the love and thanks of man and woman.

Tyranny, like hell, is not easily conquered; yet we have this consolation with us, that the harder the conflict, the more glorious the triumph.

What we obtain too cheap, we esteem too lightly: it is dearness only that gives every thing its value. Heaven knows how to put a proper price upon its goods; and it would be strange indeed if so celestial an article as FREEDOM should not be highly rated.

There is the issue of self-care. It’s something never far from my mind, as I deal with severe depression. Self-care is the only way I’ve lasted this long in life. But I also know that high rates of such things as depression are only found in high inequality societies. The ultimate self-care would mean undoing the cause of the problem, not just treating the symptoms, case by case. As social creatures, self-care is never about mere individual concern. As Thomas Paine explains above, that cheapens what should most matter.

The importance of self-care came up in an article that argued that hatred is self-harming, even if the other side deserves your hatred. That is fair and I’d agree to an extent. Still, the emphasis seems wrong. What matters is that we come from a place of honesty, integrity, and authenticity. Only from that position of strength can we envision something radically better and move toward it. We should always give voice to the truth, especially when uncomfortable, even to ourselves. Speaking from a place of truth is always a compassionate act, whether or not it makes us feel warm and fuzzy with idealized notions of how we should feel. First we should be honest with ourselves about what we actually feel, not what we are told we are supposed to feel. Political correctness certainly shouldn’t be applied to our feelings, for that would be the opposite of self-care.

Besides, no large-scale positive change ever happened in the world without a whole lot of angry people. Societies (or rather the ruling elite and comfortable classes) tend to resist change and the more they resist the more frustrated so many people become. Sadly, that is how it tends to go, until a breaking point is reached.

Those who are already being harmed by the status quo are usually less concerned about mere self-harm, as the greater harm is what others are doing to them. Those talking about self-care, by contrast, tend to be individuals in a privileged position, not victimized by the even greater harms. The comfortable tend to become truly motivated toward change only when comfort is no longer an option. But for the most victimized, comfort was never an option. Yet many others find themselves somewhere in between these two extremes. That is a tough place to find oneself. Slowly, over time, more and more people will fall further and further out of what little comfort (and security) they might have. As the harm grows, the number of harmed will also grow. Admonitions about self-care will sound ever more empty, pointless, and irrelevant.

Self-care in a narrow individualistic sense becomes increasingly difficult and, at some point, moot. This is even more true when personal problems are inseparable from societal and political problems. What if the only possibility of self-care is to fight those doing harm to you, to your loved ones, and to the majority of others in your society? When the American colonists started a revolution, they did so out of self-care. It was difficult and many paid the ultimate sacrifice. But what other option was there, besides accepting increasing harm from authoritarian oppression, growing corporatism, an impoverishing economic system, and demoralizing loss of power over their own lives?

Consider Thomas Paine. When still in England, he joined other excise officers in bringing a petition for better pay and working conditions to Parliament and consequently lost his job, sending him into poverty and barely avoiding debtors’ prison. Later in America, after the revolution started, he gave all of the proceeds from his publications to the war effort. He also risked his life numerous times, giving everything for making a better world. And after all of that, he died in humble obscurity with few caring enough to attend his funeral. But that was irrelevant, since what he had sought was freedom and justice, not fame and wealth. As he explained it, “I prefer peace. But if trouble must come, let it come in my time, so that my children can live in peace.” These words were spoken by someone who had seen his own child die, along with the child’s mother. So, he was really speaking of other people’s children.

Were Paine and those like him committing self-harm? Or were they seeking a greater good beyond themselves, the greater good for all of society, the greater good for their neighbors and fellow citizens, the greater good for their children and grandchildren? In response to those soldiers who abandoned the revolutionary cause for self-care, Paine called them summer soldiers and sunshine patriots. They were those who only fought when it was easy, when no great sacrifice was required of them. Paine's point is that we can never separate care for ourselves from care for others. The point he was making wasn't philosophical but practical. As Benjamin Franklin put it, "We must all hang together, or most assuredly we shall all hang separately." Or even more powerfully articulated in a different context, there are the oft-quoted words of Martin Niemöller on the danger of narrow self-care until there is no one left to care about you:

“First they came for the Socialists, and I did not speak out—
Because I was not a Socialist.

“Then they came for the Trade Unionists, and I did not speak out—
Because I was not a Trade Unionist.

“Then they came for the Jews, and I did not speak out—
Because I was not a Jew.

“Then they came for me—and there was no one left to speak for me.”