“You’re the only people alive on the earth today.”

“You’re the only people alive on the earth today. All those people who created traditions, created countries and created rules…they are dead. Why don’t you start your own world while you’ve got the chance?”
~ Bill Hicks.

What is rarely, if ever, taught in public education, much less heard in elite institutions of politics and media, is that this anti-authoritarian demand to be free of the past was one of the main views of the American revolutionaries, including major leaders like Thomas Jefferson and Thomas Paine, who openly defended direct democracy and majoritarianism. They often spoke of this problem as the ‘dead hand’, a criticism applied to any established institution, tradition, custom, norm, law, constitution, or holy book. Freedom is always in the present because it is the only moment in which to act freely. To live shackled to the past, beholden to the dead, is not to be truly alive; it is to be infected with the soul sickness of the zombified living dead. One of the greatest of oppressions is to be haunted by a past that controls one’s mind, identity, and ability to act; held in the vice-grip of commanding voices that possess the victim, like J. R. R. Tolkien’s Gríma Wormtongue whispering into the ear of King Théoden of Rohan.

During the American Revolution, the radical advocates for the living generation and living constitutionalism came to be called the Anti-Federalists, but only because they lost the war of rhetoric when the so-called ‘Federalists’ took control, dismantling the Articles of Confederation and enforcing centralized government controlled by elites (this kind of radical critique, such as when Bill Hicks speaks it, is now identified as ‘liberal’ or ‘leftist’). But in reality the ‘Anti-Federalists’ were the strongest defenders of actual federalism as decentralized power and self-governance. Levi Preston, a revolutionary veteran, as an old man stated simply what the American Revolution was about: “Young man, what we meant in going for those redcoats was this: we always had governed ourselves, and we always meant to. They didn’t mean we should.” He clarified exactly what he meant. Right before that, he said, “Oppressions? I didn’t feel them. I never saw one of those stamps, and always understood that Governor Bernard put them all in Castle William. I am certain I never paid a penny for one of them. Tea tax! I never drank a drop of the stuff; the boys threw it all overboard.” It was not a tax revolt, as if early working-class Americans were willing to fight, sacrifice, and die in defense of capitalism. Their sense of freedom denied was much more visceral and communal, with political implications right from the start. They were social justice warriors. They understood that the political is personal and the personal is political.

Such righteous assertion of self-independence, self-autonomy, and self-governance — the Spirit of ’76 living in the Spirit of the People — is not possible if one places the authority of corpses and ghosts over one’s own self-authorization and self-authority, any more than one can be free by submitting to the power of an aristocrat, king, or pope (or dictator, demagogue, etc.; or partisanship and lesser evilism). Every living generation, morally and practically, has no choice but to choose for itself, again and again. Even choosing submission to the dead is a choice of the living, and so responsibility for the consequences of that choice cannot be denied. That sense of freedom-loving, almost anarcho-libertarian, independence is why many of the revolutionaries didn’t see the revolution as having ended with the defeat of the British Empire and so continued to fight against corrupt and oppressive elites, including against the plutocratic and oligarchic Federalists (e.g., Shays’ Rebellion); the Spirit of ’76 never went away. Jefferson’s hope and Paine’s promise spring eternal, as evidenced by the thousands of riots, revolts, uprisings, insurrections, protest movements, and mass strikes that have happened since that time.

The colonial working-class radicals and revolutionaries weren’t the only ones who bucked against new oppressions replacing the old, even ignoring rare aristocrats like Jefferson. Many others understood or suspected that leading Federalists like Alexander Hamilton were consciously modeling the new ‘constitutional’ republic on the British Empire and the British East India Company, even if those like James Madison figured it out too late. These Federalists aspired not to be free but to be the next ruling elite of an even greater global superpower. Such schemes were a real threat, as we can see with what the United States has become, but it’s obviously not as if no one saw it coming. Consider moderate and principled Federalists like John Dickinson, initially resistant to revolution altogether and later the draft author of the Articles of Confederation, who feared such imperialistic centralized and concentrated power, as expressed in his purse-and-sword argument (basically, an Anti-Federalist argument; and the Articles did become a touchstone of Anti-Federalist thought). Even the Anti-Federalist Abraham Clark, supposedly the one who suggested a constitutional convention, was unhappy about the results; to such an extent that he warned, “We may awake in fetters, more grievous, than the yoke we have shaken off.” That worry turned out to be prescient, like so many other Anti-Federalist warnings and predictions.

Decades later, Jefferson would admit in private correspondence that the experiment in constitutional republicanism had been a failure because the founders had not understood the mother principle, that of democratic self-governance. He said that the Spirit of ’76 lived on only in the spirit of our people (and in the “will of the people”, not in the constitution or government), the only hope that the gains of revolution would not be entirely lost. The people, as advocated by the Anti-Federalists, understood the soul of the American Dream better than the elite, as promoted by the faux Federalists. That fundamental conflict is what our country was founded upon, and it remains with us to this day. Not even the American Civil War was able to undo that moral corruption and political foundering, because no one in leadership was wise enough and brave enough to throw the Ring of Power into Mount Doom when they had the chance; and indeed there were numerous opportunities to course correct, to revive the anti-authoritarian and egalitarian vision of the Articles of Confederation.

None of this is merely about the past but about the ever-present choice of each and every new living generation. That is why Bill Hicks’ words resonate with us today, for the same reason the words of the Anti-Federalists inspired revolution back then. The authority of those words is not in who said them, be it a comedian or a ‘Founding Father’. The force of those words is in knowing they speak truth for all time, as we can verify that truth in our own minds, hearts, and souls; can observe it, test it, and prove it in our lived experience; can touch it, feel it, and know it in the world around us. The sense of being a living generation of people is not an abstraction but what cannot be denied. The notion that all living people, as individuals or communal selves, can have direct access via experience and relationship to ultimate truth, natural law, higher reality, or divine being first appeared in the Axial Age.

The message of Hicks and the Anti-Federalists is ancient, fundamentally spiritual and religious in nature — as Levi Preston explained, “We read only the Bible, the Catechism, Watts’s Psalms and Hymns, and the Almanack.” The point is that they read these texts for themselves, as literacy was becoming common, and so the words were brought to life by their own voices. Rising literacy rates and the availability of reading material, including radical pamphlets written by Paine, were the main force behind the revolution of mind that preceded the revolution of government and society. With an emerging independent-mindedness, the once mostly indentured and wretchedly oppressed colonials were gaining confidence in themselves. Unlike in earlier eras, they could read for themselves, interpret what they read for themselves, think for themselves, and so act for themselves.

The change was not only in mentality and identity; it was part of an ongoing shift in an entire worldview, a transformation of experienced reality: what first was planted in the Axial Age, took root in the Middle Ages, and finally came to fruition in early modernity. It’s a sense of being enmeshed within and inseparably part of a living world. This is what Jesus meant by the Kingdom of God being all around us. And it’s what the 14th-century peasants meant, in revolting, when they demanded equality on Earth as it is in Heaven. That is what then inspired those like the Quakers, who came into their own during the radicalism of the English Civil War, to formulate their view of living constitutionalism, the source of John Dickinson’s thinking, as he was raised Quaker. Living constitutionalism, according to the Quakers, treats a constitution as a living document, not a dead piece of paper; for it is considered a compact between a living God and a living community, a specific living generation of people. Ironically, the reactionary right tries to cast shade on living constitutionalism as anti-traditional, when they know nothing of the traditions our society is actually built upon.

No one, not in the past nor in a distant place, can speak to anyone else on behalf of God or speak for anyone else in relationship to God (i.e., the highest truth, reality, and authority). We are all responsible for our own connection to and discernment of the ultimate. This is why natural law, now often co-opted as reactionary rhetoric, could in the past be perceived as radically dangerous in challenging the entire basis and justification for human law, as politically established and government-enforced. That is what Jesus was invoking in challenging Jewish and Roman hierarchical authority and social institutions, casually dismissing them as if irrelevant, with a zealous and charismatic confidence that the truth he knew could not be denied or harmed, no matter what the ruling elite and Roman soldiers might do to his body. In the living moment, he acted on, demonstrated, and proved the truth he spoke; emphasizing he was not special in this manner by telling others that they too were gods, of the Holy Spirit. The living God is not far away in Heaven but here on Earth. The living Revelation is not in the ancient past but right now. The living Word is not in a book but in the world. The living Reality is experienced and known by those with eyes to see, ears to hear.

* * *

Let us make a small note here. We briefly mentioned the reactionary but didn’t go into further detail, as it wasn’t our present focus and we’ve talked about it plenty elsewhere. But as always, it can’t be ignored. What we did mention is how the reactionary has largely co-opted the rhetoric of natural law and so repurposed it to regressive ends. The deeper point, though, is that natural law originated in the radical, not the reactionary. A similar thing can be said of living constitutionalism. Sure, the reactionary can co-opt the social force and political results of living constitutionalism, as it can co-opt almost anything and everything. That is unfortunate, even if it also shows the weakness and limitations involved. What the reactionary can never do is co-opt the moral force and motivating essence of natural law, living constitutionalism, and such. That is the beating heart we are speaking of. The reactionary is always deadening. It is death and brings death to everything it touches, most of all the rot of the human soul. It’s love versus fear, vitality versus anxiety, life versus death; but the two sides are far from being equal. One is light and the other mere shadow.

The living moral force of the living truth and reality is inherently and fundamentally radical and forever retains the radical; it is progressive and never regressive, liberating and never oppressive. All that radical literally means is a return to the root, and hence a return to underlying nature, fundamental truth, first principles. That is the point of showing the long history of this shared inheritance of profound wisdom, making clear that the roots of the radical go deep into human nature and human society. Not mentioned at all here is that the notion of a living experience of a living world is rooted even further down, in the most archaic layers of our shared humanity, back to bicameral and animistic societies. No amount of reactionary co-option can undo this power. That is because it originates and is sourced within us, individually and collectively. As long as humans exist, the radical living challenge will remain potent and threatening. That is the whole point of why the reactionary feels compelled to co-opt the very thing that undermines it, grotesquely wearing it like a superficial mask. This is the reason that a probing intellectual, spiritual, and moral discernment is of the utmost importance.

Yet it’s not only that the reactionary can’t undo the radical; neither can it stop it from spreading. That is precisely what has been happening these past millennia, as a new mentality has been taking hold, beginning as a spark and catching fire again and again. The Holy Spirit is a burning fire, the world aflame in light. The mistake many make is thinking that Enlightenment and post-Enlightenment thought is merely about a dully simple, reductionist, and materialistic individualism. But that misunderstanding arises because the radicalism of the past has been obscured, as has been the radicalism of Western origins and of the American founding. For instance, take the appearance in the ancient world, as if out of nowhere, of the idea that there is a common humanity, a universal human nature, a shared world, a single cosmos. During the Axial and post-Axial ages, that radical understanding came up in the words of numerous prophets, philosophers, wisdom teachers, gurus, and salvific figures. Human identities have grown ever broader over time. The peasantry, in revolting, came to an emergent class consciousness. The colonists, in revolting, upheld the ideal of global citizenship. Such an expanding and inclusive worldview keeps on growing, with each age of tumult bringing us to new understanding and a greater identity.

So, there is what is ancient to human society, even primal in having originated within the human psyche from millions of years of hominid evolution. To experience the living fusion of self and world, human and non-human, is the undifferentiated state that forms the baseline of human existence. That isn’t to say differentiation, therefore, is bad; of course not. But starting millennia ago, a divide began to form, a mere crack at first, that has since fractured and splintered into modern psychosis. The radical impulse has never been to resist or deny the differentiation that has made modern individuality possible, but neither has it sought to dismiss and devalue the communal identities of the past, the very ground of the bundled mind that we stand upon. Instead, what radical thinkers have advocated is transforming and reforming past communal identities, such that collective health and sanity can be maintained. Abstract identities, however, disconnect us from the living sense of belonging to others and to the larger world. For most of human existence, belonging has meant an identity of tribe built on a deep sense of place. That concrete immediacy and sensory immersion remains essential and necessary. Yet in a globalized, interconnected society, our ability to perceive a shared living reality is potentially immense: the imaginative capacity to sense, feel, understand, and know that other people are equally real. It’s the task before us, the ancient ideal and aspiration that guides us.

* * *

Roger Williams and American Democracy
Founding Visions of the Past and Progress
Whose Original Intent?
Anti-Partisan Original Intent
US: Republic & Democracy (parts two and three)
Democracy: Rhetoric & Reality
Pursuit of Happiness and Consent of the Governed
St. George Tucker On Secession
The Radicalism of The Articles of Confederation
From Articles of Confederation to the Constitution
The Vague and Ambiguous US Constitution
Wickedness of Civilization & the Role of Government
A Truly Free People
Nature’s God and American Radicalism
“We forgot.”
What and who is America?
Attributes of Thomas Paine
Predicting an Age of Paine
Thomas Paine and the Promise of America
About The American Crisis No. III
Feeding Strays: Hazlitt on Malthus
Inconsistency of Burkean Conservatism
American Paternalism, Honor and Manhood
Revolutionary Class War: Paine & Washington
Paine, Dickinson and What Was Lost
Betrayal of Democracy by Counterrevolution
Revolutions: American and French (part two)
Failed Revolutions All Around
The Violence of Bourgeois Revolutions and Authoritarian Capitalism
The Haunted Moral Imagination
“Europe, and not England, is the parent country of America.”
“…from every part of Europe.”

The Fight For Freedom Is the Fight To Exist: Independence and Interdependence
A Vast Experiment
The Root and Rot of the Tree of Liberty
America’s Heartland: Middle Colonies, Mid-Atlantic States and the Midwest
When the Ancient World Was Still a Living Memory
Ancient Outrage of the Commoners
The Moral Axis of the Axial Age
Axial Age Revolution of the Mind Continues
A Neverending Revolution of the Mind
Liberalism, Enlightenment & Axial Age
Leftism Points Beyond the Right and Beyond Itself

Jo Walton On Death

At present, we’re reading a fiction book, Jo Walton’s Or What You Will, and enjoying it. We’ve never read this author before, but plan on looking for more of her writings. This book has elements of postmodernism about it, in how the narrator and some of the characters speak about the story, storytelling, and such; the breaking of the fourth wall. But it’s probably better described as metamodern: combining how modernity takes itself seriously with how the postmodern stands back in self-conscious critique, mixed together with experimental playfulness and sincere inquiry; all the while touching on various themes and topics, casual and weighty, but always coming back to what a narratizing voice (or voices) means, dipping into the territory of the bundled mind.

A focus of concern the author returns to throughout the book is mortality and the desire for immortality; how the human relationship to death has changed over time, how we speak about it and narrate it, and how it shapes us and our culture. Besides comparing present attitudes and responses to earlier times, at one point she contrasts the modern detachment, confusion, and discomfort with human death with the modern ‘extravagant grieving’ over pets. Anyone in our society is familiar with what she is talking about. A coworker of ours, when her lizard died, was so distraught that she couldn’t talk about it; she even became upset when we brought it up in trying to console her, as we had often talked about her lizard in the past. She was not consoled. It might be the most inconsolable we’ve ever seen anyone about any death of any species.

For whatever reason, we’ve never been that way about death, be it human or non-human; even with a cat we had for more than 20 years, a cat that had sometimes been our sole companion in dark periods. So far, we’ve tended to take it all in stride, with what acceptance we can muster. It’s sad, but so much of life can be sad, if one thinks about it; the world is full of pain and suffering, where death is the least of it and sometimes a blessing. Maybe a lifetime of depression and depressive realism has inoculated us to such things. Death is death. There’s nothing to be done about it, other than to avoid it for as long as is possible and desirable. But as someone who once attempted suicide and used to contemplate it often, we don’t hold it against people who decide to opt out early. Most modern people, though, don’t have this perspective.

pp. 111-113:

“If you are a modern person, in our world now, it’s not unlikely that you might not have known the true grief and loss caused by the death of someone close to you until well into adulthood. […] The estrangement from grieving is a change in human nature, and one that has happened over the course of Sylvia’s lifetime. Young people today are not the same as they were when she was young. […] Death comes to them as a stranger, not an intimate. She notices it first in what she sees as extravagant grieving for animals, and then starts to notice it more when her friends lose parents at older and older ages, and take it harder and harder. […] Then she observes a growing embarrassment in younger people around the mention of real death, where people don’t know what to say or how to react, until talking about it is almost a taboo. Simultaneous with this came the rise of the vampire as attractive, sexual, appealing, rather than a figure of horror. […] Other undead have also undergone this process in art, even zombies by the first decade of the new millennium. Friends with no religion, who mock Sylvia for her vestigial Catholicism, revert to strained religiosity in the face of death because they have no social patterns of coping.

“Look at it this way: Freud wasn’t necessarily wrong about Thanatos. But he was living in a different world, before antibiotics. His patients were very different people from the people of this century. They would all and every one of them have lost siblings, school friends, parents. […] We read Freud now, and wonder how he could have thought of some of these things, but his patients lived crowded together in houses with one bathroom or none, where they shared rooms with their dying siblings and fornicating parents, and where death was a constant and familiar presence. Nor did they feel grief any less for the familiarity of loss. Read Victorian children’s books; read Charlotte M. Yonge (as Sylvia did as a child in her grandmother’s house) and see what a constant presence death is, almost a character—and not necessarily violent death, but death by illness or accident, inevitable death that simply cannot be cured. We mock their wallowing in woe, the crepe, the widow’s weeds, the jewelry made of jet and hair, the huge mausoleums, the black and purple mourning clothes, until we are faced with our own absences in emptiness, with nothing at all to console us and no signals to send to warn others to tread lightly. […]

pp. 115-116:

“Death, in fantasy, is generally defanged. Ever since Tolkien brought Gandalf back, and Lewis resurrected Aslan, both of them in conscious imitation of Christ, and right at the beginning of the shaping of genre fantasy, death in fantasy novels has been more and more negotiable. It’s more unusual for a beloved character to stay dead than for them to come back to life. Death is for enemies and spear carriers, and the way a spear carrier death is treated is that the main characters will have a single dramatic scene of mourning and then rarely think of them once they turn the page at the end of the chapter. Boromir’s death resonates through the rest of The Lord of the Rings but the imitations of it lesser writers put in do not. Tolkien and Lewis lived through the Great War, and saw as much death as anyone has. Their imitators are modern people, whose understanding of death is much less visceral. Modern fantasy, even, and perhaps especially “grim-dark” fantasy, is often written by people without much close-up experience of death. The horror of the Dead Marshes, in Tolkien, comes direct from Flanders field. They are not there for thrills.

“As for resurrections—she goes to San Marco and sees Fra Angelico’s paintings of the angel in the empty tomb, and Christ harrowing Hell and opening the door that has been closed for so long and letting in light where there was only darkness. The easy way people come back to life in fantasy cheapens resurrection. The ultimate mystery of Christianity becomes commonplace, with the extreme version of the cheapening happening in computer games where there can be an actual fixed price in gold for bringing a party member back to life.”

“What is the most important thing in life?”

This was asked of some men of the Hadza (or Hadzabe) tribe in Tanzania (from a video on Mike Corey’s Fearless & Far YouTube channel). The answer, according to one hunter-gatherer:

  • “Meat.”
  • “Honey.”
  • “Corn porridge.”

That is the order he gave them in. He paused between stating each. But the first answer came without any pause. And the last one would’ve been introduced during colonialism.

To emphasize his point, he later said, “If we have meat, honey, and water, then we are happy. Thank you, friend.” He didn’t bother to add the corn porridge in the second answer. Corn porridge is probably what they eat only when they have nothing else.

Then further on, the interviewer asked, “What is your biggest struggle?” Guess what the hunter-gatherer’s answer was. “Meat.” It really does all go back to meat, although they did explain the importance of water as well. Honey is a nice treat, but they kept coming back to meat.

This hunter-gatherer was really obsessed with the baboons they were going to hunt that night. He was quite excited about it. Those baboons on the rock in the distance meant meat.

Meat makes the world go around, including the fear of becoming meat. Asked about their greatest fear, the first answer was “Lion.” Eat or be eaten. The hunter-gatherer’s whole life revolves around the next kill and avoiding being killed.

Honey is pleasurable and good quick energy. Plant foods can be eaten in a pinch or for variety. But, for humans, lack of meat in the wild means death.

The Crisis of Identity

“Besides real diseases we are subject to many that are only imaginary, for which the physicians have invented imaginary cures; these have then several names, and so have the drugs that are proper to them.”
~Jonathan Swift, 1726
Gulliver’s Travels

“The alarming increase in Insanity, as might naturally be expected, has incited many persons to an investigation of this disease.”
~John Haslam, 1809
On Madness and Melancholy: Including Practical Remarks on those Diseases

“Cancer, like insanity, seems to increase with the progress of civilization.”
~Stanislas Tanchou, 1843
Paper presented to the Paris Medical Society

I’ve been following Scott Preston over at his blog, Chrysalis. He has been writing on the same set of issues for a long time now, longer than I’ve been reading his blog. He reads widely and so draws on many sources, most of which I’m not familiar with, part of the reason I appreciate the work he does to pull together such informed pieces. A recent post, A Brief History of Our Disintegration, would give you a good sense of his intellectual project, although the word ‘intellectual’ sounds rather paltry for what he is exploring: “Around the end of the 19th century (called the fin de siecle period), something uncanny began to emerge in the functioning of the modern mind, also called the “perspectival” or “the mental-rational structure of consciousness” (Jean Gebser). As usual, it first became evident in the arts — a portent of things to come, but most especially as a disintegration of the personality and character structure of Modern Man and mental-rational consciousness.”

That time period has been an interest of mine as well. There are two books that come to mind that I’ve mentioned before: Tom Lutz’s American Nervousness, 1903 and Jackson Lears’ Rebirth of a Nation (for a discussion of the latter, see: Juvenile Delinquents and Emasculated Males). Both talk about that turn-of-the-century crisis, the psychological projections and physical manifestations, the social movements and political actions. A major concern was neurasthenia which, according to the dominant economic paradigm, meant a deficit of ‘nervous energy’ or ‘nerve force’, the reserves of which, if wasted rather than reinvested wisely, would lead to physical and psychological bankruptcy: one became spent. (The term ‘neurasthenia’ was first used in 1829 and popularized by George Miller Beard in 1869, the same period when the related medical condition of ‘nostalgia’ became a more common diagnosis, although ‘nostalgia’ was first referred to in the 17th century, when the Swiss doctor Johannes Hofer coined the term, using it interchangeably with nosomania and philopatridomania; see Michael S. Roth, Memory, Trauma, and History; David Lowenthal, The Past Is a Foreign Country; Thomas Dodman, What Nostalgia Was; Susan J. Matt, Homesickness; Linda Marilyn Austin, Nostalgia in Transition, 1780-1917; Svetlana Boym, The Future of Nostalgia; Gary S. Meltzer, Euripides and the Poetics of Nostalgia; and see The Disease of Nostalgia.) Today, we might speak of ‘neurasthenia’ as stress and, even earlier, they had other ways of talking about it — as Bryan Kozlowski explained in The Jane Austen Diet, p. 231: “A multitude of Regency terms like “flutterings,” “fidgets,” “agitations,” “vexations,” and, above all, “nerves” are the historical equivalents to what we would now recognize as physiological stress.” It was the stress of falling into history, a new sense of time as linear progression that made the past a lost world. In Stranded in the Present, Peter Fritzsche wrote:

“On that August day on the way to Mainz, Boisseree reported one of the startling consequences of the French Revolution. This was that more and more people began to visualize history as a process that affected their lives in knowable, comprehensible ways, connected them to strangers on a market boat, and thus allowed them to offer their own versions and opinions to a wider public. The emerging historical consciousness was not restricted to an elite, or a small literate stratum, but was the shared cultural good of ordinary travelers, soldiers, and artisans. In many ways history had become a mass medium connecting people and their stories all over Europe and beyond. Moreover, the drama of history was construed in such a way as to put emphasis on displacement, whether because customary business routines had been upset by the unexpected demands of headquartered Prussian troops, as the innkeepers protested, or because so many demobilized soldiers were on the move as they returned home or pressed on to seek their fortune, or because restrictive legislation against Jews and other religious minorities had been lifted, which would explain the keen interest of “the black-bearded Jew” in Napoleon and of Boisseree in the Jew. History was not simply unsettlement, though. The exchange of opinion “in the front cabin” and “in the back” hinted at the contested nature of well-defined political visions: the role of the French, of Jacobins, of Napoleon. The travelers were describing a world knocked off the feet of tradition and reworked and rearranged by various ideological protagonists and conspirators (Napoleon, Talleyrand, Blucher) who sought to create new social communities. Journeying together to Mainz, Boisseree and his companions were bound together by their common understanding of moving toward a world that was new and strange, a place more dangerous and more wonderful than the one they left behind.”

That excitement was mixed with the feeling of being spent, the reserves having been fully tapped. It was bound up with sexuality in what Theodore Dreiser called the ‘spermatic economy’, the management of libido as psychic energy, a modernization of Galenic thought (by the way, the Sears, Roebuck and Company catalogue offered an electrical device to replenish nerve force that came with a genital attachment). Obsession with sexuality was used to reinforce gender roles in how neurasthenic patients were treated, following the practice of Dr. Silas Weir Mitchell: men were recommended to become more active (the ‘West cure’) and women more passive (the ‘rest cure’), although some women “used neurasthenia to challenge the status quo, rather than enforce it. They argued that traditional gender roles were causing women’s neurasthenia, and that housework was wasting their nervous energy. If they were allowed to do more useful work, they said, they’d be reinvesting and replenishing their energies, much as men were thought to do out in the wilderness” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). That feminist-style argument, as I recall, came up in advertisements for Bernarr Macfadden’s fitness protocol in the early 1900s, encouraging (presumably middle-class) women to give up housework for exercise and so regain their vitality. Macfadden was also an advocate of living a fully sensuous life, going as far as free love.

Besides the gender wars, there was the ever-present bourgeois bigotry. Neurasthenia is the most civilized of the diseases of civilization since, in its original American conception, it was perceived as afflicting only middle-to-upper class whites, especially WASPs — as Lutz says, “if you were lower class, and you weren’t educated and you weren’t Anglo Saxon, you wouldn’t get neurasthenic because you just didn’t have what it took to be damaged by modernity” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast); and so, according to Lutz’s book, people would make “claims to sickness as claims to privilege.” This class bias goes back even earlier, to Robert Burton’s melancholia with its element of what later would be understood as the Cartesian anxiety of mind-body dualism, a common ailment of the intellectual elite (mind-body dualism goes back to the Axial Age; see Julian Jaynes, The Origin of Consciousness in the Breakdown of the Bicameral Mind). The class bias was different for nostalgia, as written about by Svetlana Boym in The Future of Nostalgia (p. 5):

“For Robert Burton, melancholia, far from being a mere physical or psychological condition, had a philosophical dimension. The melancholic saw the world as a theater ruled by capricious fate and demonic play. Often mistaken for a mere misanthrope, the melancholic was in fact a utopian dreamer who had higher hopes for humanity. In this respect, melancholia was an affect and an ailment of intellectuals, a Hamletian doubt, a side effect of critical reason; in melancholia, thinking and feeling, spirit and matter, soul and body were perpetually in conflict. Unlike melancholia, which was regarded as an ailment of monks and philosophers, nostalgia was a more “democratic” disease that threatened to affect soldiers and sailors displaced far from home as well as many country people who began to move to the cities. Nostalgia was not merely an individual anxiety but a public threat that revealed the contradictions of modernity and acquired a greater importance.”

Like diabetes, melancholia and neurasthenia were first seen among the elite, and so they were taken as demonstrating one’s elite nature. Prior to neurasthenic diagnoses but in the post-revolutionary era, a similar phenomenon went by other names. This is explored by Bryan Kozlowski in one chapter of The Jane Austen Diet (pp. 232-233):

“Yet the idea that this was acceptable—nay, encouraged—behavior was rampant throughout the late 18th century. Ever since Jane was young, stress itself was viewed as the right and prerogative of the rich and well-off. The more stress you felt, the more you demonstrated to the world how truly delicate and sensitive your wealthy, softly pampered body actually was. The common catchword for this was having a heightened sensibility—one of the most fashionable afflictions in England at the time. Mainly affecting the “nerves,” a Regency woman who caught the sensibility but “disdains to be strong minded,” wrote a cultural observer in 1799, “she trembles at every breeze, faints at every peril and yields to every assailant.” Austen knew real-life strutters of this sensibility, writing about one acquaintance who rather enjoys “her spasms and nervousness and the consequence they give her.” It’s the same “sensibility” Marianne wallows in throughout the novel that bears its name, “feeding and encouraging” her anxiety “as a duty.” Readers of the era would have found nothing out of the ordinary in Marianne’s high-strung embrace of stress.”

This condition was considered a sign of progress, though over time some came to see it as the greatest threat to civilization; in either case, it offered much material for popular fictionalized portrayals. Being sick in this fashion was proof that one was a modern individual, an exemplar of advanced civilization, if coming at immense cost — as Julie Beck explains (‘Americanitis’: The Disease of Living Too Fast):

“The nature of this sickness was vague and all-encompassing. In his book Neurasthenic Nation, David Schuster, an associate professor of history at Indiana University-Purdue University Fort Wayne, outlines some of the possible symptoms of neurasthenia: headaches, muscle pain, weight loss, irritability, anxiety, impotence, depression, “a lack of ambition,” and both insomnia and lethargy. It was a bit of a grab bag of a diagnosis, a catch-all for nearly any kind of discomfort or unhappiness.

“This vagueness meant that the diagnosis was likely given to people suffering from a variety of mental and physical illnesses, as well as some people with no clinical conditions by modern standards, who were just dissatisfied or full of ennui. “It was really largely a quality-of-life issue,” Schuster says. “If you were feeling good and healthy, you were not neurasthenic, but if for some reason you were feeling run down, then you were neurasthenic.””

I’d point out how neurasthenia was seen as primarily caused by intellectual activity, as it became a descriptor of a common experience among the burgeoning middle class of often well-educated professionals and office workers. This relates to Weston A. Price’s work in the 1930s, as modern dietary changes first hit this demographic, since they had the means to afford eating a fully industrialized Standard American Diet (SAD) long before others (within decades, though, SAD-caused malnourishment would wreck health at all levels of society). What this meant, in particular, was a diet high in processed carbs and sugar that coincided with the early-1900s decreased consumption of meat and saturated fats, in the wake of Upton Sinclair’s muckraking of the meat-packing industry in The Jungle (1906). As Price demonstrated, this was a vast change from the traditional diet found all over the world, including in rural Europe (and presumably in rural America, with most Americans not urbanized until the turn of the last century), that always included significant amounts of nutritious animal foods loaded up with fat-soluble vitamins, not to mention lots of healthy fats and cholesterol.

Prior to talk of neurasthenia, the exhaustion model of health, portrayed as waste and depletion, took hold in Europe centuries earlier (e.g., anti-masturbation panics) and had its roots in the humoral theory of bodily fluids. It has long been understood that food, specifically macronutrients (carbohydrate, protein, and fat) and food groups, affects mood and behavior — see the early literature on melancholy. During feudalism, food laws were used as a means of social control, such that in one case meat was prohibited prior to Carnival because its energizing effect was thought to lead to rowdiness or even revolt, as sometimes did happen (Ken Albala & Trudy Eden, Food and Faith in Christian Culture). Red meat, in particular, was thought to heat up blood (warm, wet) and yellow bile (warm, dry), promoting the sanguine and choleric personalities of masculinity. Like women, peasants were supposed to be submissive and hence not too masculine — they were to be socially controlled, not self-controlled. Anyone who was too strong-willed and strong-minded, other than the (ruling, economic, clerical, and intellectual) elite, was considered problematic; and one of the solutions was an enforced change of diet to create the proper humoral disposition for their appointed role within the social order (i.e., depriving an individual or group of nutrient-dense meat until they were too malnourished, weak, anemic, sickly, docile, and effeminate to be assertive, aggressive, and confrontational toward their ‘betters’).

There does seem to be a correlation (a causal link?) between an increase of intellectual activity and abstract thought and an increase of carbohydrates and sugar, a connection first appearing during the early colonial era that set the stage for the Enlightenment. It was the agricultural mind taken to a whole new level. Indeed, a steady flow of glucose is one way to fuel extended periods of brain work, such as reading and writing for hours on end and late into the night — the reason college students to this day will down sugary drinks while studying. Because of trade networks, Enlightenment thinkers were buzzing on the suddenly much more available simple carbs and sugar, with an added boost from caffeine and nicotine. The modern intellectual mind was drugged-up right from the beginning, and over time it took its toll. Such dietary highs inevitably lead to ever greater crashes of mood and health. Interestingly, Dr. Silas Weir Mitchell, who advocated the ‘rest cure’ and ‘West cure’ in treating neurasthenia and other ailments, additionally used a “meat-rich diet” for his patients (Ann Stiles, Go Rest, Young Man). Other doctors of that era were even more direct in using specifically low-carb diets for various health conditions, often for obesity, which was also a focus of Dr. Mitchell.

As a side note, the gendering of diet was seen as important for constructing, maintaining, and enforcing gender roles; this carries over into the modern bias that masculine men eat steak and effeminate women eat salad. According to humoralism, men are well contained while women are leaky vessels. One can immediately see the fears of neurasthenia, emasculation, and excessive ejaculation. The ideal man was supposed to hold onto and control his bodily fluids, from urine to semen, by using and investing them carefully. With neurasthenia, though, men were seen as having become effeminate and leaky, dissipating and draining away their reserves of vital fluids and psychic energies. So, a neurasthenic man needed a strengthening of the boundaries that held everything in. The leakiness of women was also a problem, but women couldn’t and shouldn’t be expected to contain themselves. The rest cure designed for women was to isolate them in a bedroom where they’d be contained by the architectural structure of a home that was owned and ruled over by the male master. A husband and, by extension, the husband’s property were to contain the wife, since she too was property of the man’s propertied self. This made a weak man of the upper classes even more dangerous to the social order, because he couldn’t play his needed gender role of husband and patriarch, upon which all of Western civilization was dependent.

All of this was based on an economic model of physiological scarcity. With neurasthenia arising in late modernity, the public debate was overtly framed by an economic metaphor. But the perceived need of economic containment of the self, be it self-containment or enforced containment, went back to early modernity. The enclosure movement was part of a larger reform movement, not only of land but also of society and identity.

* * *

“It cannot be denied that civilization, in its progress, is rife with causes which over-excite individuals, and result in the loss of mental equilibrium.”
~Edward Jarvis, 1843
“What shall we do with the Insane?”
The North American Review, Volume 56, Issue 118

“Have we lived too fast?”
~Dr. Silas Weir Mitchell, 1871
Wear and Tear, or Hints for the Overworked

It goes far beyond diet or any other single factor. There has been a diversity of stressors that continued to amass over the centuries of tumultuous change. The exhaustion of modern man (and typically the focus has been on men) had been building up for generations upon generations before it came to feel like a world-shaking crisis in the new industrialized world. The lens of neurasthenia was an attempt to grapple with what had changed, but the focus was too narrow. With the plague of neurasthenia, the atomization of commercialized man and woman couldn’t hold together. And so there was a temptation toward nationalistic projects, including wars, to revitalize the ailing soul and to suture the gash of social division and disarray. But this further wrenched out of alignment the traditional order that had once held society together, and what was lost mostly went without recognition. The individual was brought into the foreground of public thought, a lone protagonist in a social Darwinian world. In this melodramatic narrative of struggle and self-assertion, many individuals didn’t fare so well, and everything else suffered in the wake.

Tom Lutz writes that, “By 1903, neurasthenic language and representations of neurasthenia were everywhere: in magazine articles, fiction, poetry, medical journals and books, in scholarly journals and newspaper articles, in political rhetoric and religious discourse, and in advertisements for spas, cures, nostrums, and myriad other products in newspapers, magazines and mail-order catalogs” (American Nervousness, 1903, p. 2).

There was a sense of moral decline that was hard to grasp, although some people like Weston A. Price tried to dig down into concrete explanations of what had gone so wrong, the social and psychological changes observable during mass urbanization and industrialization. He was far from alone in his inquiries, having built on the prior observations of doctors, anthropologists, and missionaries. Other doctors and scientists were looking into the influences of diet in the mid-1800s and, by the 1880s, scientists were exploring a variety of biological theories. Their inability to pinpoint the cause perhaps had more to do with the lack of a needed framework, as they touched upon numerous facets of biological functioning:

“Not surprisingly, laboratory experiments designed to uncover physiological changes in the nerve cell were inconclusive. European research on neurasthenics reported such findings as loss of elasticity of blood vessels, thickening of the cell wall, changes in the shape of nerve cells, or nerve cells that never advanced beyond an embryonic state. Another theory held that an overtaxed organism cannot keep up with metabolic requirements, leading to inadequate cell nutrition and waste excretion. The weakened cells cannot develop properly, while the resulting build-up of waste products effectively poisons the cells (so-called “autointoxication”). This theory was especially attractive because it seemed to explain the extreme diversity of neurasthenic symptoms: weakened or poisoned cells might affect the functioning of any organ in the body. Furthermore, “autointoxicants” could have a stimulatory effect, helping to account for the increased sensitivity and overexcitability characteristic of neurasthenics.” (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia)

This early scientific research could not lessen the mercurial sense of unease, as neurasthenia was from its inception a broad category that captured some greater shift in public mood, even as it so powerfully shaped the individual’s health. For all the effort, there were as many theories about neurasthenia as there were symptoms. Deeper insight was required. “[I]f a human being is a multiformity of mind, body, soul, and spirit,” writes Preston, “you don’t achieve wholeness or fulfillment by amputating or suppressing one or more of these aspects, but only by an effective integration of the four aspects.” But integration is easier said than done.

The modern human hasn’t been suffering from mere psychic wear and tear, for the individual body itself has been showing signs of sickness, as the diseases of civilization have become harder and harder to ignore. On the societal level of human health, I’ve previously shared passages from Lears (see here) — he discusses the vitalist impulse that arose in response to the turmoil, and vitalism often was explored in terms of physical health as the most apparent manifestation, although social and spiritual health were just as often spoken of in the same breath. The whole person was under assault by an accumulation of stressors, and the increasingly isolated individual didn’t have the resources to fight them off.

By the way, this was far from being limited to America. Europeans picked up the discussion of neurasthenia and took it in other directions, often with less optimism about progress, with some thinkers emphasizing social interpretations that laid specific blame on hyper-individualism (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia). Thoughts on neurasthenia became mixed up with earlier speculations on nostalgia and romanticized notions of rural life. More importantly, Russian thinkers in particular understood that the problems of modernity weren’t limited to the upper classes but extended across entire populations, a result of how societies had been turned on their heads during that fractious century of revolutions.

In looking around, I came across some other interesting stuff. In the 1901 Nervous and Mental Diseases by Archibald Church and Frederick Peterson, the authors, in the chapter on “Mental Disease”, are keen to further the description, categorization, and labeling of ‘insanity’. And I noted their concern with physiological asymmetry, a concern later shared by Price, among many others going back to the prior century.

Maybe asymmetry was not only indicative of developmental issues but also symbolic of a deeper imbalance. The attempts at phrenological analysis of psychiatric, criminal, and anti-social behavior were off-base; and yet, despite the bigotry and proto-genetic determinism among racists using these kinds of ideas, there is a simple truth about health in relationship to physiological development, most easily observed in bone structure. It would take many generations to understand the deeper scientific causes: nutrition (e.g., Price’s discovery of vitamin K2, which he called Activator X), along with parasites, toxins, and epigenetics. Church and Peterson did acknowledge that this went beyond mere individual or even familial issues: “The proportion of the insane to normal individuals may be stated to be about 1 to 300 of the population, though this proportion varies somewhat within narrow limits among different races and countries. It is probable that the intemperate use of alcohol and drugs, the spreading of syphilis, and the overstimulation in many directions of modern civilization have determined an increase difficult to estimate, but nevertheless palpable, of insanity in the present century as compared with past centuries.”

Also, there is the 1902 The Journal of Nervous and Mental Disease: Volume 29, edited by William G. Spiller. There is much discussion in there about how anxiety was observed, diagnosed, and treated at the time. Some of the case studies make for a fascinating read — check out “Report of a Case of Epilepsy Presenting as Symptoms Night Terrors, Impellant Ideas, Complicated Automatisms, with Subsequent Development of Convulsive Motor Seizures and Psychical Aberration” by W. K. Walker. This reminds me of the case that influenced Sigmund Freud and Carl Jung, Daniel Paul Schreber’s 1903 Memoirs of My Nervous Illness.

Talk about “a disintegration of the personality and character structure of Modern Man and mental-rational consciousness,” as Scott Preston put it. He goes on to say: “The individual is not a natural thing. There is an incoherency in Margaret Thatcher’s view of things when she infamously declared ‘there is no such thing as society’ — that she saw only individuals and families, that is to say, atoms and molecules.” Her saying that really did capture the mood of the society she denied existing. Even the family was shrunk down to the ‘nuclear’. To state there is no society is to declare that there is also no extended family, no kinship, no community, that there is no larger human reality of any kind. Ironically, in this pseudo-libertarian sentiment, there is nothing holding the family together other than government laws imposing strict control of marriage and parenting, where common finances lock two individuals together under the rule of capitalist realism (the only larger realities involved are inhuman systems) — compared to high-trust societies such as the Nordic countries, where the definition and practice of family life is less legalistic (Nordic Theory of Love and Individualism).

* * *

“It is easy, as we can see, for a barbarian to be healthy; for a civilized man the task is hard. The desire for a powerful and uninhibited ego may seem to us intelligible, but, as is shown by the times we live in, it is in the profoundest sense antagonistic to civilization.”
~Sigmund Freud, 1938
An Outline of Psychoanalysis

“Consciousness is a very recent acquisition of nature, and it is still in an “experimental” state. It is frail, menaced by specific dangers, and easily injured.”
~Carl Jung, 1961
Man and His Symbols
Part 1: Approaching the Unconscious
The importance of dreams

The individual consumer-citizen as a legal member of a family unit has to be created and then controlled, as it is a rather unstable atomized identity. “The idea of the ‘individual’,” Preston says, “has become an unsustainable metaphor and moral ideal when the actual reality is ‘21st century schizoid man’ — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in ‘the New Normal’ of double-talk, double-think, double-standard, and double-bind.” That is partly the reason for the heavy focus on the body, an attempt to make the individual concrete in order to hold together the splintered self — great analysis of this can be found in Lewis Hyde’s Trickster Makes This World: “an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap. Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman)” (see one of my posts about it: Lock Without a Key). Along with increasing authoritarianism, there was increasing medicalization and rationalization — to try to make sense of what was senseless.

A specific example of a change can be found in Dr. Frederick Hollick (1818-1900), a popular writer and speaker on medicine and health — his “links were to the free-thinking tradition, not to Christianity” (Helen Lefkowitz Horowitz, Rewriting Sex). Influenced by Mesmerism and animal magnetism, he studied and wrote about what, in more scientific-sounding terms, was variously called electrotherapeutics, galvanism, and electro-galvanism. Hollick was an English follower of Robert Dale Owen, whom he followed to the United States, where Owen’s father, the industrialist and socialist Robert Owen, had started the utopian community New Harmony, a Southern Indiana village bought from the utopian German Harmonists and then filled with brilliant and innovative minds but lacking in practical know-how about running a self-sustaining community (Abraham Lincoln, later a friend of the Owen family, recalled as a boy seeing the boat full of books heading to New Harmony).

“As had Owen before him, Hollick argued for the positive value of sexual feeling. Not only was it neither immoral nor injurious, it was the basis for morality and society. […] In many ways, Hollick was a sexual enthusiast” (Horowitz). These were the social circles of Abraham Lincoln, as he personally knew free-love advocates; that is why early Republicans were often referred to as ‘Red Republicans’, the ‘Red’ indicating radicalism, as it still does to this day. Hollick wasn’t the first to be a sexual advocate nor, of course, would he be the last — preceding him were Sarah Grimke (1837, Equality of the Sexes) and Charles Knowlton (1839, The Private Companion of Young Married People), Hollick having been “a student of Knowlton’s work” (Debran Rowland, The Boundaries of Her Body); and following him were two more well-known figures: the previously mentioned Bernarr Macfadden (1868-1955), the first major health and fitness guru, and Wilhelm Reich (1897-1957), the less respectable member of the trinity he formed with Sigmund Freud and Carl Jung. Sexuality became a symbolic issue of politics and health, partly because of increasing scientific knowledge but also because of the increasing marketization of products such as birth control (with public discussion of contraceptives happening in the late 1700s and advances in contraceptive production in the early 1800s), the latter being quite significant since it meant individuals could control pregnancy, which was particularly relevant to women. It should be noted that Hollick promoted the ideal of female sexual autonomy, that sex should be assented to and enjoyed by both partners.

This growing concern with sexuality began with the growing middle class in the decades following the American Revolution. Among much else, it was related to the post-revolutionary focus on parenting and the perceived need to raise republican citizens — this formed an audience far beyond radical libertinism and free love. Expert advice was needed for the new bourgeois family life, as part of the ‘civilizing process’ that increasingly took hold at that time, with not only sexual manuals but also parenting guides, health pamphlets, books of manners, cookbooks, diet books, etc. — cut off from the roots of traditional community and kinship, the modern individual no longer trusted inherited wisdom and so needed to be taught how to live, how to behave and relate (Norbert Elias, The Civilizing Process & The Society of Individuals; Bruce Mazlish, Civilization and Its Contents; Keith Thomas, In Pursuit of Civility; Stephen Mennell, The American Civilizing Process; Cas Wouters, Informalization; Jonathan Fletcher, Violence and Civilization; François Dépelteau & T. Landini, Norbert Elias and Social Theory; Rob Watts, States of Violence and the Civilising Process; Pieter Spierenburg, Violence and Punishment; Steven Pinker, The Better Angels of Our Nature; Eric Dunning & Chris Rojek, Sport and Leisure in the Civilizing Process; D. E. Thiery, Polluting the Sacred; Helmut Kuzmics & Roland Axtmann, Authority, State and National Character; Mary Fulbrook, Un-Civilizing Processes?; John Zerzan, Against Civilization; Michel Foucault, Madness and Civilization; Dennis Smith, Norbert Elias and Modern Social Theory; Stjepan Mestrovic, The Barbarian Temperament; Thomas Salumets, Norbert Elias and Human Interdependencies). Along with the rise of science, this situation promoted the role of the public intellectual, which Hollick effectively took advantage of: after the failure of Owen’s utopian experiment, he went on the lecture circuit, which brought legal cases in an unsuccessful attempt to silence him, the kind of persecution that Reich also later endured.

To put it in perspective, this Antebellum era of public debate and public education about sexuality coincided with other changes. Following revolutionary-era feminism (e.g., Mary Wollstonecraft), the ‘First Wave’ of organized feminists emerged generations later with the Seneca Falls convention in 1848 and, within that movement, there was a strong abolitionist impulse. This was part of the rise of ideological -isms in the North that so concerned the Southern aristocrats who wanted to maintain their hierarchical control of the entire country, control they were quickly losing with the shift of power in the Federal government. A few years before, in 1844, a more effective condom was developed using vulcanized rubber, although condoms had been on the market since the previous decade; also in the 1840s, the vaginal sponge became available. Interestingly, many feminists were as against contraceptives as they were against abortion. These were far from mere practical issues, as politics imbued every aspect; some feminists worried that, if sexuality were divorced from pregnancy, the role of women and motherhood in society would be lessened.

This was at a time when the abortion rate was sky-rocketing, indicating that most women held other views, since large farm families were less needed amid both industrialized urbanization and industrialized farming. “Yet we also know that thousands of women were attending lectures in these years, lectures dealing, in part, with fertility control. And rates of abortion were escalating rapidly, especially, according to historian James Mohr, the rate for married women. Mohr estimates that in the period 1800-1830, perhaps one out of every twenty-five to thirty pregnancies was aborted. Between 1850 and 1860, he estimates, the ratio may have been one out of every five or six pregnancies. At mid-century, more than two hundred full-time abortionists reportedly worked in New York City” (Rickie Solinger, Pregnancy and Power, p. 61). Other sources concur and extend this pattern of high abortion rates into the early 20th century: “Some have estimated that between 20-35 percent of 19th century pregnancies were terminated as a means of restoring “menstrual regularity” (Luker, 1984, p. 18-21). About 20 percent of pregnancies were aborted as late as in the 1930s (Tolnai, 1939, p. 425)” (Polly F. Radosh, “Abortion: A Sociological Perspective”, in Interdisciplinary Views on Abortion, ed. Susan A. Martinelli-Fernandez, Lori Baker-Sperry, & Heather McIlvaine-Newsad).

In the unGodly and unChurched period of early America (“We forgot.”), organized religion was weak and “premarital sex was typical, many marriages following after pregnancy, but some people simply lived in sin. Single parents and ‘bastards’ were common” (A Vast Experiment). Abortions were so common at the time that abortifacients were advertised in major American newspapers, something never seen today. “Abortifacients were hawked in store fronts and even door to door. Vendors openly advertised their willingness to end women’s pregnancies” (Erin Blakemore, The Criminalization of Abortion Began as a Business Tactic). By the way, the oldest of the founding fathers, Benjamin Franklin, published in 1748 material about traditional methods of at-home abortion, what he referred to as ‘suppression of the courses’ (Molly Farrell, Ben Franklin Put an Abortion Recipe in His Math Textbook; Emily Feng, Benjamin Franklin gave instructions on at-home abortions in a book in the 1700s). It was a reprinting of material from 1734. “While “suppression of the courses” can apply to any medical condition that results in the suspension of one’s menstrual cycle, the entry specifically refers to “unmarried women.” Described as a “misfortune,” it recommends a number of known abortifacients from that time, like pennyroyal water and bellyache root, also known as angelica” (Nur Ibrahim, Did Ben Franklin Publish a Recipe in a Math Textbook on How to Induce Abortion?).

This is unsurprising, as abortifacients have been known for millennia, recorded in ancient texts from diverse societies, and were probably common knowledge prior to any written language, considering that abortifacients are used by many hunter-gatherer tribes, who need birth control to space out pregnancies in order to avoid malnourished babies, among other reasons. This is true within the Judeo-Christian tradition as well, such as where the Old Testament gives an abortion recipe for when a wife gets pregnant from an affair (Numbers 5:11-31). Patriarchal social dominators sought to further control women not necessarily for religious reasons, but more because medical practice was becoming professionalized by men who wanted to eliminate the business competition of female doctors, midwives, and herbalists. “To do so, they challenged common perceptions that a fetus was not a person until the pregnant mother felt it “quicken,” or move, inside their womb. In a time before sonograms, this was often the only way to definitively prove that a pregnancy was underway. Quickening was both a medical and legal concept, and abortions were considered immoral or illegal only after quickening. Churches discouraged the practice, but made a distinction between a woman who terminated her pregnancy pre- or post-quickening” (Erin Blakemore). Yet these conservative authoritarians claimed, and still claim, to speak on behalf of some vague and amorphous concept of Western Civilization and Christendom.

This is a great example of how, through the power of charismatic demagogues and Machiavellian social dominators, modern reactionary ideology obscures the past with deceptive nostalgia and replaces the traditional with historical revisionism. The thing is, until the modern era, abortifacients and other forms of birth control weren’t politicized, much less under the purview of judges. They were practical concerns that were largely determined privately and personally or else informally within communities and families. “Prior to the formation of the AMA, decisions related to pregnancy and abortion were made primarily within the domain and control of women. Midwives and the pregnant women they served decided the best course of action within extant knowledge of pregnancy. Most people did not view what would currently be called first trimester abortion as a significant moral issue. […] A woman’s awareness of quickening indicated a real pregnancy” (Polly F. Radosh). Yet something did change as birth control improved in efficacy and became ever more common, or at least more out in the open, making it a much more public and politicized issue, one exacerbated by capitalist markets and mass media.

Premarital sex or, heck, even marital sex no longer inevitably meant birth; and with contraceptives, unwanted pregnancies often could be prevented entirely. Maybe this is why fertility had been declining for so long, and it is definitely the reason for the mid-19th century moral panic. “Extending the analysis back further, the White fertility rate declined from 7.04 in 1800 to 5.42 in 1850, to 3.56 in 1900, and 2.98 in 1950. Thus, the White fertility declined for nearly all of American history but may have bottomed out in the 1980s. Black fertility has also been declining for well over 150 years, but it may very well continue to do so in the coming decades” (Ideas and Data, Sex, Marriage, and Children: Trends Among Millennial Women). If this is a crisis, it started pretty much at the founding of the country. And if we had reliable data from before that, we might see the trend originating in the colonial era or even back in late feudalism, during the enclosure movement that destroyed traditional rural communities and kinship groups. Early Americans, by today’s standards of the culture wars, were not good Christians — many visiting Europeans at the time saw them as uncouth heathens, and quite dangerous at that, given the common American practice of toting around guns and knives, ever ready for a fight, whereas carrying weapons had been made illegal in England. In The Churching of America, Roger Finke and Rodney Stark write (pp. 25-26):

“Americans are burdened with more nostalgic illusions about the colonial era than about any other period in their history. Our conceptions of the time are dominated by a few powerful illustrations of Pilgrim scenes that most people over forty stared at year after year on classroom walls: the baptism of Pocahontas, the Pilgrims walking through the woods to church, and the first Thanksgiving. Had these classroom walls also been graced with colonial scenes of drunken revelry and barroom brawling, of women in risqué ball-gowns, of gamblers and rakes, a better balance might have been struck. For the fact is that there never were all that many Puritans, even in New England, and non-Puritan behavior abounded. From 1761 through 1800 a third (33.7%) of all first births in New England occurred after less than nine months of marriage (D. S. Smith, 1985), despite harsh laws against fornication. Granted, some of these early births were simply premature and do not necessarily show that premarital intercourse had occurred, but offsetting this is the likelihood that not all women who engaged in premarital intercourse would have become pregnant. In any case, single women in New England during the colonial period were more likely to be sexually active than to belong to a church—in 1776 only about one out of five New Englanders had a religious affiliation. The lack of affiliation does not necessarily mean that most were irreligious (although some clearly were), but it does mean that their faith lacked public expression and organized influence.”

Though marriage remained important as an ideal in American culture, what changed was that procreative control became increasingly available — with fewer accidental pregnancies and more abortions, a powerful motivation for marriage disappeared. Unsurprisingly, at the same time, there were increasing worries about the breakdown of community and family, concerns that would turn into moral panic at various points. Antebellum America was in turmoil. This was concretely exemplified by the dropping birth rate, already noticeable by the mid-19th century (Timothy Crumrin, “Her Daily Concern:” Women’s Health Issues in Early 19th-Century Indiana), which nearly halved from 1800 to 1900 (Debran Rowland, The Boundaries of Her Body). “The late 19th century and early 20th saw a huge increase in the country’s population (nearly 200 percent between 1860 and 1910) mostly due to immigration, and that population was becoming ever more urban as people moved to cities to seek their fortunes—including women, more of whom were getting college educations and jobs outside the home” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). It was a period of crisis, not all that different from our present crisis, including the fear about the low birth rate of native-born white Americans, especially the supposedly endangered species of whites/WASPs, being overtaken by the allegedly dirty hordes of blacks, ethnics, and immigrants (i.e., replacement theory); at a time when Southern and Eastern Europeans, and even the Irish, were questionable in their whiteness, particularly if Catholic (Aren’t Irish White?).

The promotion of birth control was considered a genuine threat to American society, maybe to all of Western Civilization. It was most directly a threat to traditional gender roles. Women could better control when they got pregnant, a decisive factor in the larger numbers of women entering college and the workforce. And with an epidemic of neurasthenia, this dilemma was worsened by the crippling effeminacy that supposedly neutered masculine potency. Was modern man, specifically the white ruling elite, up to the task of carrying on Western Civilization?

“Indeed, civilization’s demands on men’s nerve force had left their bodies positively effeminate. According to Beard, neurasthenics had the organization of “women more than men.” They possessed “a muscular system comparatively small and feeble.” Their dainty frames and feeble musculature lacked the masculine vigor and nervous reserves of even their most recent forefathers. “It is much less than a century ago, that a man who could not [drink] many bottles of wine was thought of as effeminate—but a fraction of a man.” No more. With their dwindling reserves of nerve force, civilized men were becoming increasingly susceptible to the weakest stimulants until now, “like babes, we find no safe retreat, save in chocolate and milk and water.” Sex was as debilitating as alcohol for neurasthenics. For most men, sex in moderation was a tonic. Yet civilized neurasthenics could become ill if they attempted intercourse even once every three months. As Beard put it, “there is not force enough left in them to reproduce the species or go through the process of reproducing the species.” Lacking even the force “to reproduce the species,” their manhood was clearly in jeopardy.” (Gail Bederman, Manliness and Civilization, pp. 87-88)

This led to a backlash that began before the Civil War with the early obscenity and abortion laws, but it went into high gear with the 1873 Comstock laws that effectively shut down the free market of both ideas and products related to sexuality, including sex toys. This made it nearly impossible for most women to learn about birth control or obtain contraceptives and abortifacients. There was a felt need to restore order, and that meant the white male order of the WASP middle-to-upper classes, especially with the end of slavery, mass immigration of ethnics, urbanization, and industrialization. The crisis wasn’t only ideological or political. The entire world had been falling apart for centuries with the ending of feudalism and the ancien régime, the last remnants of which were maintained in America through slavery. Motherhood being the backbone of civilization, it was believed that women’s sexuality had to be controlled and, unlike so much else that was out of control, it actually could be controlled through enforcement of laws.

Outlawing abortion is a particularly interesting example of social control. Even with laws in place, abortions remained commonly practiced by local doctors, even in many rural areas (American Christianity: History, Politics, & Social Issues). Corey Robin argues that the strategy hasn’t been to deny women’s agency but to assert their subordination (Denying the Agency of the Subordinate Class). This is why, according to Robin, abortion laws were designed primarily to target male doctors rather than their female patients, although the doctors were rarely prosecuted (at least once women had been largely removed from medical and healthcare practice, beyond the role of nurses who assisted male doctors). Everything comes down to agency, its lack or its loss, but our entire sense of agency is out of accord with our own human nature. We seek to control what is outside of us, including other people, for our own sense of self is out of control. The legalistic worldview is inherently authoritarian, at the heart of what Julian Jaynes proposes as the post-bicameral project of consciousness, the metaphorically contained self. But this psychic container is weak and keeps leaking all over the place.

* * *

“It is clear that if it goes on with the same ruthless speed for the next half century . . . the sane people will be in a minority at no very distant day.”
~ Henry Maudsley, 1877
“The Alleged Increase of Insanity”
Journal of Mental Science, Volume 23, Issue 101

“If this increase was real, we have argued, then we are now in the midst of an epidemic of insanity so insidious that most people are even unaware of its existence.”
~ Edwin Fuller Torrey & Judy Miller, 2001
The Invisible Plague: The Rise of Mental Illness from 1750 to the Present

To bring it back to the original inspiration, Scott Preston wrote: “Quite obviously, our picture of the human being as an indivisible unit or monad of existence was quite wrong-headed, and is not adequate for the generation and re-generation of whole human beings. Our self-portrait or self-understanding of “human nature” was deficient and serves now only to produce and reproduce human caricatures. Many of us now understand that the authentic process of individuation hasn’t much in common at all with individualism and the supremacy of the self-interest.” The failure we face is one of identity, of our way of being in the world. As with neurasthenia in the past, we are now in a crisis of anxiety and depression, along with yet another moral panic about the declining white race. So, we get the likes of Steve Bannon, Donald Trump, and Jordan Peterson. We failed to resolve past conflicts and so they keep re-emerging. Over this past century, we have continued to be in a crisis of identity (Mark Greif, The Age of the Crisis of Man).

“In retrospect, the omens of an impending crisis and disintegration of the individual were rather obvious,” Preston points out. “So, what we face today as “the crisis of identity” and the cognitive dissonance of “the New Normal” is not something really new — it’s an intensification of that disintegrative process that has been underway for over four generations now. It has now become acute. This is the paradox. The idea of the “individual” has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” We never were individuals. It was just a story we told ourselves, but there are others that could be told. Scott Preston offers an alternative narrative, that of individuation.

* * *

I found some potentially interesting books while skimming material on Google Books in my research on Frederick Hollick and related topics. Among the titles below, I’ll share some text from one of them because it offers a good summary of sexuality at the time, specifically women’s sexuality. Obviously, the issue went far beyond sexuality itself; going by my own theorizing, I’d say it is yet another example of symbolic conflation, considering its direct relationship to abortion.

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland
p. 34

WOMEN AND THE WOMB: The Emerging Birth Control Debate

The twentieth century dawned in America on a falling white birth rate. In 1800, an average of seven children were born to each “American-born white wife,” historians report. 29 By 1900, that number had fallen to roughly half. 30 Though there may have been several factors, some historians suggest that this decline—occurring as it did among young white women—may have been due to the use of contraceptives or abstinence, though few talked openly about it. 31

“In spite of all the rhetoric against birth control, the birthrate plummeted in the late nineteenth century in America and Western Europe (as it had in France the century before); family size was halved by the time of World War I,” notes Shari Thurer in The Myth of Motherhood. 32

As issues go, the “plummeting birthrate” among whites was a powder keg, sparking outcry as the “failure” of the privileged class to have children was contrasted with the “failure” of poor immigrants and minorities to control the number of children they were having. Criticism was loud and rampant. “The upper classes started the trend, and by the 1880s the swarms of ragged children produced by the poor were regarded by the bourgeoisie, so Emile Zola’s novels inform us, as evidence of the lower order’s ignorance and brutality,” Thurer notes. 33

But the seeds of this then-still nearly invisible movement had been planted much earlier. In the late 1700s, British political theorists began disseminating information on contraceptives as concerns of overpopulation grew among some classes. 34 Despite the separation of an ocean, by the 1820s, this information was “seeping” into the United States.

“Before the introduction of the Comstock laws, contraceptive devices were openly advertised in newspapers, tabloids, pamphlets, and health magazines,” Yalom notes. “Condoms had become increasingly popular since the 1830s, when vulcanized rubber (the invention of Charles Goodyear) began to replace the earlier sheepskin models.” 35 Vaginal sponges also grew in popularity during the 1840s, as women traded letters and advice on contraceptives. 36 Of course, prosecutions under the Comstock Act went a long way toward chilling public discussion.

Though Margaret Sanger’s is often the first name associated with the dissemination of information on contraceptives in the early United States, in fact, a woman named Sarah Grimke preceded her by several decades. In 1837, Grimke published the Letters on the Equality of the Sexes, a pamphlet containing advice about sex, physiology, and the prevention of pregnancy. 37

Two years later, Charles Knowlton published The Private Companion of Young Married People, becoming the first physician in America to do so. 38 Near this time, Frederick Hollick, a student of Knowlton’s work, “popularized” the rhythm method and douching. And by the 1850s, a variety of material was being published providing men and women with information on the prevention of pregnancy. And the advances weren’t limited to paper.

“In 1846, a diaphragm-like article called The Wife’s Protector was patented in the United States,” according to Marilyn Yalom. 39 “By the 1850s dozens of patents for rubber pessaries ‘inflated to hold them in place’ were listed in the U.S. Patent Office records,” Janet Farrell Brodie reports in Contraception and Abortion in 19th Century America. 40 And, although many of these early devices were often more medical than prophylactic, by 1864 advertisements had begun to appear for “an India-rubber contrivance” similar in function and concept to the diaphragms of today. 41

“[B]y the 1860s and 1870s, a wide assortment of pessaries (vaginal rubber caps) could be purchased at two to six dollars each,” says Yalom. 42 And by 1860, following publication of James Ashton’s Book of Nature, the five most popular ways of avoiding pregnancy—“withdrawal, and the rhythm methods”—had become part of the public discussion. 43 But this early contraceptives movement in America would prove a victim of its own success. The openness and frank talk that characterized it would run afoul of the burgeoning “purity movement.”

“During the second half of the nineteenth century, American and European purity activists, determined to control other people’s sexuality, railed against male vice, prostitution, the spread of venereal disease, and the risks run by a chaste wife in the arms of a dissolute husband,” says Yalom. “They agitated against the availability of contraception under the assumption that such devices, because of their association with prostitution, would sully the home.” 44

Anthony Comstock, a “fanatical figure,” some historians suggest, was a charismatic “purist,” who, along with others in the movement, “acted like medieval Christians engaged in a holy war,” Yalom says. 45 It was a successful crusade. “Comstock’s dogged efforts resulted in the 1873 law passed by Congress that barred use of the postal system for the distribution of any ‘article or thing designed or intended for the prevention of conception or procuring of abortion’,” Yalom notes.

Comstock’s zeal would also lead to his appointment as a special agent of the United States Post Office with the authority to track and destroy “illegal” mailing, i.e., mail deemed to be “obscene” or in violation of the Comstock Act. Until his death in 1915, Comstock is said to have been energetic in his pursuit of offenders, among them Dr. Edward Bliss Foote, whose articles on contraceptive devices and methods were widely published. 46 Foote was indicted in January of 1876 for dissemination of contraceptive information. He was tried, found guilty, and fined $3,000. Though donations of more than $300 were made to help defray costs, Foote was reportedly more cautious after the trial. 47 That “caution” spread to others, some historians suggest.

Disorderly Conduct: Visions of Gender in Victorian America
By Carroll Smith-Rosenberg

Riotous Flesh: Women, Physiology, and the Solitary Vice in Nineteenth-Century America
by April R. Haynes

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland

Rereading Sex: Battles Over Sexual Knowledge and Suppression in Nineteenth-century America
by Helen Lefkowitz Horowitz

Rewriting Sex: Sexual Knowledge in Antebellum America, A Brief History with Documents
by Helen Lefkowitz Horowitz

Imperiled Innocents: Anthony Comstock and Family Reproduction in Victorian America
by Nicola Kay Beisel

Against Obscenity: Reform and the Politics of Womanhood in America, 1873–1935
by Leigh Ann Wheeler

Purity in Print: Book Censorship in America from the Gilded Age to the Computer Age
by Paul S. Boyer

American Sexual Histories
edited by Elizabeth Reis

Wash and Be Healed: The Water-Cure Movement and Women’s Health
by Susan Cayleff

From Eve to Evolution: Darwin, Science, and Women’s Rights in Gilded Age America
by Kimberly A. Hamlin

Manliness and Civilization: A Cultural History of Gender and Race in the United States, 1880-1917
by Gail Bederman

One Nation Under Stress: The Trouble with Stress as an Idea
by Dana Becker

* * *

8/18/19 – Looking back at this piece, I realize there is so much that could be added to it. And it already is long. It’s a topic that would require writing a book to do it justice. And it is such a fascinating area of study with lines of thought going in numerous directions. But I’ll limit myself by adding only a few thoughts that point toward some of those other directions.

The topic of this post goes back to the Renaissance (Western Individuality Before the Enlightenment Age) and even earlier to the Axial Age (Hunger for Connection), a thread that can be traced back through history following the collapse of what Julian Jaynes called bicameral civilization in the Bronze Age. At the beginning of modernity, the psychic tension erupted in many ways that were increasingly dramatic and sometimes disturbing, from revolution to media panics (Technological Fears and Media Panics). I see all of this as having to do with the isolating and anxiety-inducing effects of hyper-individualism. The rigid egoic boundaries required by our social order are simply tiresome (Music and Dance on the Mind), as Julian Jaynes conjectured:

“Another advantage of schizophrenia, perhaps evolutionary, is tirelessness. While a few schizophrenics complain of generalized fatigue, particularly in the early stages of the illness, most patients do not. In fact, they show less fatigue than normal persons and are capable of tremendous feats of endurance. They are not fatigued by examinations lasting many hours. They may move about day and night, or work endlessly without any sign of being tired. Catatonics may hold an awkward position for days that the reader could not hold for more than a few minutes. This suggests that much fatigue is a product of the subjective conscious mind, and that bicameral man, building the pyramids of Egypt, the ziggurats of Sumer, or the gigantic temples at Teotihuacan with only hand labor, could do so far more easily than could conscious self-reflective men.”

On the Facebook page for Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind, Luciano Imoto made the same basic point in speaking about hyper-individualism. He stated that, “In my point of view the constant use of memory (and the hippocampus) to sustain a fictitious identity of “self/I” could be deleterious to the brain’s health at long range (considering that the brain consumes about 20 percent of the body’s energy).” I’m sure others have made similar observations. This strain on the psyche has been building up for a long time, but it became particularly apparent in the 19th century, to such an extent that it was deemed necessary to build special institutions to house and care for the broken and deficient humans who couldn’t handle modern life or else couldn’t appropriately conform to ever more oppressive social norms (Mark Jackson, The Borderland of Imbecility). As radical as some consider Jaynes to be, insights like this were hardly new — in 1867, Henry Maudsley offered insight laced with bigotry, from The Physiology and Pathology of Mind:

“There are general causes, such as the state of civilization in a country, the form of its government and its religion, the occupation, habits, and condition of its inhabitants, which are not without influence in determining the proportion of mental diseases amongst them. Reliable statistical data respecting the prevalence of insanity in different countries are not yet to be had; even the question whether it has increased with the progress of civilization has not been positively settled. Travellers are certainly agreed that it is a rare disease amongst barbarous people, while, in the different civilized nations of the world, there is, so far as can be ascertained, an average of about one insane person in five hundred inhabitants. Theoretical considerations would lead to the expectation of an increased liability to mental disorder with an increase in the complexity of the mental organization: as there are a greater liability to disease, and the possibility of many more diseases, in a complex organism like the human body, where there are many kinds of tissues and an orderly subordination of parts, than in a simple organism with less differentiation of tissue and less complexity of structure; so in the complex mental organization, with its manifold, special, and complex relations with the external, which a state of civilization implies, there is plainly the favourable occasion of many derangements. The feverish activity of life, the eager interests, the numerous passions, and the great strain of mental work incident to the multiplied industries and eager competition of an active civilization, can scarcely fail, one may suppose, to augment the liability to mental disease. On the other hand, it may be presumed that mental sufferings will be as rare in an infant state of society as they are in the infancy of the individual. That degenerate nervous function in young children is displayed, not in mental disorder, but in convulsions; that animals very seldom suffer from insanity; that insanity is of comparatively rare occurrence among savages; all these are circumstances that arise from one and the same fact—a want of development of the mental organization. There seems, therefore, good reason to believe that, with the progress of mental development through the ages, there is, as is the case with other forms of organic development, a correlative degeneration going on, and that an increase of insanity is a penalty which an increase of our present civilization necessarily pays. […]

“If we admit such an increase of insanity with our present civilization, we shall be at no loss to indicate causes for it. Some would no doubt easily find in over-population the prolific parent of this as of numerous other ills to mankind. In the fierce and active struggle for existence which there necessarily is where the claimants are many and the supplies are limited, and where the competition therefore is severe, the weakest must suffer, and some of them, breaking down into madness, fall by the wayside. As it is the distinctly manifested aim of mental development to bring man into more intimate, special, and complex relations with the rest of nature by means of patient investigations of physical laws, and a corresponding internal adaptation to external relations, it is no marvel, it appears indeed inevitable, that those who, either from inherited weakness or some other debilitating causes, have been rendered unequal to the struggle of life, should be ruthlessly crushed out as abortive beings in nature. They are the waste thrown up by the silent but strong current of progress; they are the weak crushed out by the strong in the mortal struggle for development; they are examples of decaying reason thrown off by vigorous mental growth, the energy of which they testify. Everywhere and always “to be weak is to be miserable.”

As civilization became complex, so did the human mind in having to adapt to it, and sometimes that hit a breaking point in individuals; or else what was previously considered normal behavior was now judged unacceptable, the latter explanation favored by Michel Foucault and Thomas Szasz (also see Bruce Levine’s article, Societies With Little Coercion Have Little Mental Illness). Whatever the explanation, something that once was severely abnormal had become normalized and, as it happened with insidious gradualism, few noticed or would accept what had changed: “Living amid an ongoing epidemic that nobody notices is surreal. It is like viewing a mighty river that has risen slowly over two centuries, imperceptibly claiming the surrounding land, millimeter by millimeter. . . . Humans adapt remarkably well to a disaster as long as the disaster occurs over a long period of time” (E. Fuller Torrey & Judy Miller, The Invisible Plague; also see Torrey’s Schizophrenia and Civilization); “At the end of the seventeenth century, insanity was of little significance and was little discussed. At the end of the eighteenth century, it was perceived as probably increasing and was of some concern. At the end of the nineteenth century, it was perceived as an epidemic and was a major concern. And at the end of the twentieth century, insanity was simply accepted as part of the fabric of life. It is a remarkable history.” All of these changes were mostly happening over generations and centuries, which left little if any living memory of when the changes began. Many thinkers like Torrey and Miller would be useful for fleshing this out, but here is a small sampling of authors and their books: Harold D. Foster’s What Really Causes Schizophrenia, Andrew Scull’s Madness in Civilization, Alain Ehrenberg’s The Weariness of the Self, etc.; and I shouldn’t ignore the growing field of Jaynesian scholarship, such as found in the books put out by the Julian Jaynes Society.

Besides social stress and societal complexity, there was much else that was changing. For example, increasingly concentrated urbanization and close proximity with other species meant ever more spread of infectious diseases and parasites (consider Toxoplasma gondii from domesticated cats; see E. Fuller Torrey’s Beasts of the Earth). Also, the 18th century saw the beginnings of industrialization with the related rise of toxins (Dan Olmsted & Mark Blaxill, The Age of Autism: Mercury, Medicine, and a Man-Made Epidemic). That worsened over the following century. Industrialization also transformed the Western diet. Sugar, having been introduced in the early colonial era, now was affordable and available to the general population. And wheat, once hard to grow and limited to the rich, was also becoming a widespread ingredient, with new milling methods allowing the highly refined white flour that made white bread popular (in the mid-1800s, Stanislas Tanchou did a statistical analysis that correlated the rate of grain consumption with the rate of cancer; and he observed that cancer, like insanity, spread along with civilization). For the first time in history, most Westerners were eating a very high-carb diet. This diet is addictive for a number of reasons and it was combined with the introduction of addictive stimulants. As I argue, this profoundly altered neurocognitive functioning and behavior (The Agricultural Mind, “Yes, tea banished the fairies.”, Autism and the Upper Crust, & Diets and Systems).

This represents an ongoing project for me. And I’m in good company.

Western Individuality Before the Enlightenment Age

The Culture Wars of the Late Renaissance: Skeptics, Libertines, and Opera
by Edward Muir
Introduction
pp. 5-7

One of the most disturbing sources of late-Renaissance anxiety was the collapse of the traditional hierarchic notion of the human self. Ancient and medieval thought depicted reason as governing the lower faculties of the will, the passions, and the body. Renaissance thought did not so much promote “individualism” as it cut away the intellectual props that presented humanity as the embodiment of a single divine idea, thereby forcing a desperate search for identity in many. John Martin has argued that during the Renaissance, individuals formed their sense of selfhood through a difficult negotiation between inner promptings and outer social roles. Individuals during the Renaissance looked both inward for emotional sustenance and outward for social assurance, and the friction between the inner and outer selves could sharpen anxieties. 2 The fragmentation of the self seems to have been especially acute in Venice, where the collapse of aristocratic marriage structures led to the formation of what Virginia Cox has called the single self, most clearly manifest in the works of several women writers who argued for the moral and intellectual equality of women with men. 3 As a consequence of the fragmented understanding of the self, such thinkers as Montaigne became obsessed with what was then the new concept of human psychology, a term in fact coined in this period. 4 A crucial problem in the new psychology was to define the relation between the body and the soul, in particular to determine whether the soul died with the body or was immortal. With its tradition of Averroist readings of Aristotle, some members of the philosophy faculty at the University of Padua recurrently questioned the Christian doctrine of the immortality of the soul as unsound philosophically. Other hierarchies of the human self came into question. Once reason was dethroned, the passions were given a higher value, so that the heart could be understood as a greater force than the mind in determining human conduct. When the body itself slipped out of its long-despised position, the sexual drives of the lower body were liberated and thinkers were allowed to consider sex, independent of its role in reproduction, a worthy manifestation of nature. The Paduan philosopher Cesare Cremonini’s personal motto, “Intus ut libet, foris ut moris est,” does not quite translate to “If it feels good, do it;” but it comes very close. The collapse of the hierarchies of human psychology even altered the understanding of the human senses. The sense of sight lost its primacy as the superior faculty, the source of “enlightenment”; the Venetian theorists of opera gave that place in the hierarchy to the sense of hearing, the faculty that most directly channeled sensory impressions to the heart and passions.

Historical and Philosophical Issues in the Conservation of Cultural Heritage
edited by Nicholas Price, M. Kirby Talley, and Alessandra Melucco Vaccaro
Reading 5: “The History of Art as a Humanistic Discipline”
by Erwin Panofsky
pp. 83-85

Nine days before his death Immanuel Kant was visited by his physician. Old, ill and nearly blind, he rose from his chair and stood trembling with weakness and muttering unintelligible words. Finally his faithful companion realized that he would not sit down again until the visitor had taken a seat. This he did, and Kant then permitted himself to be helped to his chair and, after having regained some of his strength, said, ‘Das Gefühl für Humanität hat mich noch nicht verlassen’—‘The sense of humanity has not yet left me’. The two men were moved almost to tears. For, though the word Humanität had come, in the eighteenth century, to mean little more than politeness and civility, it had, for Kant, a much deeper significance, which the circumstances of the moment served to emphasize: man’s proud and tragic consciousness of self-approved and self-imposed principles, contrasting with his utter subjection to illness, decay and all that is implied in the word ‘mortality.’

Historically the word humanitas has had two clearly distinguishable meanings, the first arising from a contrast between man and what is less than man; the second between man and what is more. In the first case humanitas means a value, in the second a limitation.

The concept of humanitas as a value was formulated in the circle around the younger Scipio, with Cicero as its belated, yet most explicit spokesman. It meant the quality which distinguishes man, not only from animals, but also, and even more so, from him who belongs to the species homo without deserving the name of homo humanus; from the barbarian or vulgarian who lacks pietas and παιδεια — that is, respect for moral values and that gracious blend of learning and urbanity which we can only circumscribe by the discredited word “culture.”

In the Middle Ages this concept was displaced by the consideration of humanity as being opposed to divinity rather than to animality or barbarism. The qualities commonly associated with it were therefore those of frailty and transience: humanitas fragilis, humanitas caduca.

Thus the Renaissance conception of humanitas had a two-fold aspect from the outset. The new interest in the human being was based both on a revival of the classical antithesis between humanitas and barbaritas, or feritas, and on a survival of the mediaeval antithesis between humanitas and divinitas. When Marsilio Ficino defines man as a “rational soul participating in the intellect of God, but operating in a body,” he defines him as the one being that is both autonomous and finite. And Pico’s famous ‘speech’ ‘On the Dignity of Man’ is anything but a document of paganism. Pico says that God placed man in the center of the universe so that he might be conscious of where he stands, and therefore free to decide ‘where to turn.’ He does not say that man is the center of the universe, not even in the sense commonly attributed to the classical phrase, “man the measure of all things.”

It is from this ambivalent conception of humanitas that humanism was born. It is not so much a movement as an attitude which can be defined as the conviction of the dignity of man, based on both the insistence on human values (rationality and freedom) and the acceptance of human limitations (fallibility and frailty); from this two postulates result—responsibility and tolerance.

Small wonder that this attitude has been attacked from two opposite camps whose common aversion to the ideas of responsibility and tolerance has recently aligned them in a united front. Entrenched in one of these camps are those who deny human values: the determinists, whether they believe in divine, physical or social predestination, the authoritarians, and those “insectolatrists” who profess the all-importance of the hive, whether the hive be called group, class, nation or race. In the other camp are those who deny human limitations in favor of some sort of intellectual or political libertinism, such as aestheticists, vitalists, intuitionists and hero-worshipers. From the point of view of determinism, the humanist is either a lost soul or an ideologist. From the point of view of authoritarianism, he is either a heretic or a revolutionary (or a counterrevolutionary). From the point of view of “insectolatry,” he is a useless individualist. And from the point of view of libertinism he is a timid bourgeois.

Erasmus of Rotterdam, the humanist par excellence, is a typical case in point. The church suspected and ultimately rejected the writings of this man who had said: “Perhaps the spirit of Christ is more largely diffused than we think, and there are many in the community of saints who are not in our calendar.” The adventurer Ulrich von Hutten despised his ironical skepticism and his unheroic love of tranquillity. And Luther, who insisted that “no man has power to think anything good or evil, but everything occurs in him by absolute necessity,” was incensed by a belief which manifested itself in the famous phrase: “What is the use of man as a totality [that is, of man endowed with both a body and a soul], if God would work in him as a sculptor works in clay, and might just as well work in stone?”

Food and Faith in Christian Culture
edited by Ken Albala and Trudy Eden
Chapter 3: “The Food Police”
Sumptuary Prohibitions On Food In The Reformation
by Johanna B. Moyer
pp. 80-83

Protestants too employed a disease model to explain the dangers of luxury consumption. Luxury damaged the body politic leading to “most incurable sickness of the universal body” (33). Protestant authors also employed Galenic humor theory, arguing that “continuous superfluous expense” unbalanced the humors leading to fever and illness (191). However, Protestants used this model less often than Catholic authors who attacked luxury. Moreover, those Protestants who did employ the Galenic model used it in a different manner than their Catholic counterparts.

Protestants also drew parallels between the damage caused by luxury to the human body and the damage excess inflicted on the French nation. Rather than a disease metaphor, however, many Protestant authors saw luxury more as a “wound” to the body politic. For Protestants the danger of luxury was not only the buildup of humors within the body politic of France but the constant “bleeding out” of humor from the body politic in the form of cash to pay for imported luxuries. The flow of cash mimicked the flow of blood from a wound in the body. Most Protestants did not see luxury foodstuffs as the problem, indeed most saw food in moderation as healthy for the body. Even luxury apparel could be healthy for the body politic in moderation, if it was domestically produced and consumed. Such luxuries circulated the “blood” of the body politic creating employment and feeding the lower orders. 72 De La Noue made this distinction clear. He dismissed the need to individually discuss the damage done by each kind of luxury that was rampant in France in his time as being as pointless “as those who have invented auricular confession have divided mortal and venal sins into infinity of roots and branches.” Rather, he argued, the damage done by luxury was in its “entire bulk” to the patrimonies of those who purchased luxuries and to the kingdom of France (116). For the Protestants, luxury did not pose an internal threat to the body and salvation of the individual. Rather, the use of luxury posed an external threat to the group, to the body politic of France.

The Reformation And Sumptuary Legislation

Catholics, as we have seen, called for antiluxury regulations on food and banqueting, hoping to curb overeating and the damage done by gluttony to the body politic. Although some Protestants also wanted to restrict food and banqueting, more often French Protestants called for restrictions on clothing and foreign luxuries. These differing views of luxury during and after the French Wars of Religion not only give insight into the theological differences between these two branches of Christianity but also provides insight into the larger pattern of the sumptuary regulation of food in Europe in this period. Sumptuary restrictions were one means by which Catholics and Protestants enforced their theology in the post-Reformation era.

Although Catholicism is often correctly cast as the branch of Reformation Christianity that gave the individual the least control over their salvation, it was also true that the individual Catholic’s path to salvation depended heavily on ascetic practices. The responsibility for following these practices fell on the individual believer. Sumptuary laws on food in Catholic areas reinforced this responsibility by emphasizing what foods should and should not be eaten and mirrored the central theological practice of fasting for the atonement of sin. Perhaps the historiographical cliché that it was only Protestantism which gave the individual believer control of his or her salvation needs to be qualified. The arithmetical piety of Catholicism ultimately placed the onus on the individual to atone for each sin. Moreover, sumptuary legislation tried to steer the Catholic believer away from the more serious sins that were associated with overeating, including gluttony, lust, anger, and pride.

Catholic theology meshed nicely with the revival of Galenism that swept through Europe in this period. Galenists preached that meat eating, overeating, and the imbalance in humors which accompanied these practices, led to behavioral changes, including an increased sex drive and increased aggression. These physical problems mirrored the spiritual problems that luxury caused, including fornication and violence. This is why so many authors blamed the French nobility for the luxury problem in France. Nobles were seen not only as more likely to bear the expense of overeating but also as more prone to violence. 73

Galenism also meshed nicely with Catholicism because it was a very physical religion in which the control of the physical body figured prominently in the believer’s path to salvation. Not surprisingly, by the seventeenth century, Protestants gravitated away from Galenism toward the chemical view of the body offered by Paracelsus. 74 Catholic sumptuary law embodied a Galenic view of the body where sin and disease were equated and therefore pushed regulations that advocated each person’s control of his or her own body.

Protestant legislators, conversely, were not interested in the individual diner. Sumptuary legislation in Protestant areas ran the gamut from control of communal displays of eating, in places like Switzerland and Germany, to little or no concern with restrictions on luxury foods, as in England. For Protestants, it was the communal role of food and luxury use that was important. Hence the laws in Protestant areas targeted food in the context of weddings, baptisms, and even funerals. The English did not even bother to enact sumptuary restrictions on food after their break with Catholicism. The French Protestants who wrote on luxury glossed over the deleterious effects of meat eating, even proclaiming it to be healthful for the body while producing diatribes against the evils of imported luxury apparel. The use of Galenism in the French Reformed treatises suggests that Protestants too were concerned with a “body,” but it was not the individual body of the believer that worried Protestant legislators. Sumptuary restrictions were designed to safeguard the mystical body of believers, or the “Elect” in the language of Calvinism. French Protestants used the Galenic model of the body to discuss the damage that luxury did to the body of believers in France, but ultimately to safeguard the economic welfare of all French subjects. The Calvinists of Switzerland used sumptuary legislation on food to protect those predestined for salvation from the dangerous eating practices of members of the community whose overeating suggested they might not be saved.

Ultimately, sumptuary regulations in the Reformation spoke to the Christian practice of fasting. Fasting served very different functions in Protestant and Catholic theology. Raymond Mentzer has suggested that Protestants “modified” the Catholic practice of fasting during the Reformation. The major reformers, including Luther, Calvin, and Zwingli, all rejected fasting as a path to salvation. 75 For Protestants, fasting was a “liturgical rite,” part of the cycle of worship and a practice that served to “bind the community.” Fasting was often a response to adversity, as during the French Wars of Religion. For Catholics, fasting was an individual act, just as sumptuary legislation in Catholic areas targeted individual diners. However, for Protestants, fasting was a communal act, “calling attention to the body of believers.” 76 The symbolic nature of fasting, Mentzer argues, reflected Protestant rejection of transubstantiation. Catholics continued to believe that God was physically present in the host, but Protestants believed His was only a spiritual presence. When Catholics took Communion, they fasted to cleanse their own bodies so as to receive the real, physical body of Christ. Protestants, on the other hand, fasted as spiritual preparation because it was their spirits that connected with the spirit of Christ in the Eucharist. 77

Capitalist Realism and Fake Fakes

“This is where ‘we’ are now: not Harawayesque cyborgs affirming our ontological hybridity but replicant-puppets (of Capital) dreaming kitsch dreams of being restored to full humanity but “without any Gepettos or Good Fairies on the horizon”.”

~ Mark (k-punk), 2009
Honeymoon in Disneyland

* * *

“Where does that leave us? I’m not sure the solution is to seek out some pre-Inversion authenticity — to red-pill ourselves back to “reality.” What’s gone from the internet, after all, isn’t “truth,” but trust: the sense that the people and things we encounter are what they represent themselves to be. Years of metrics-driven growth, lucrative manipulative systems, and unregulated platform marketplaces, have created an environment where it makes more sense to be fake online — to be disingenuous and cynical, to lie and cheat, to misrepresent and distort — than it does to be real. Fixing that would require cultural and political reform in Silicon Valley and around the world, but it’s our only choice. Otherwise we’ll all end up on the bot internet of fake people, fake clicks, fake sites, and fake computers, where the only real thing is the ads.”

~ Max Read, 2018
How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually.

* * *

“In my writing I got so interested in fakes that I finally came up with the concept of fake fakes. For example, in Disneyland there are fake birds worked by electric motors which emit caws and shrieks as you pass by them. Suppose some night all of us sneaked into the park with real birds and substituted them for the artificial ones. Imagine the horror the Disneyland officials would feel when they discovered the cruel hoax. Real birds! And perhaps someday even real hippos and lions. Consternation. The park being cunningly transmuted from the unreal to the real, by sinister forces. For instance, suppose the Matterhorn turned into a genuine snow-covered mountain? What if the entire place, by a miracle of God’s power and wisdom, was changed, in a moment, in the blink of an eye, into something incorruptible? They would have to close down.

“In Plato’s Timaeus, God does not create the universe, as does the Christian God; He simply finds it one day. It is in a state of total chaos. God sets to work to transform the chaos into order. That idea appeals to me, and I have adapted it to fit my own intellectual needs: What if our universe started out as not quite real, a sort of illusion, as the Hindu religion teaches, and God, out of love and kindness for us, is slowly transmuting it, slowly and secretly, into something real?”

~ Philip K. Dick, 1978
How to Build a Universe That Doesn’t Fall Apart Two Days Later

Essentialism On the Decline

Before getting to the topic of essentialism, let me take an indirect approach. In reading about paleolithic diets and traditional foods, I keep running across a recurring theme: inflammation, specifically as it relates to the health of the gut-brain network and immune system.

The paradigm change this signifies is that seemingly separate diseases with different diagnostic labels often have underlying commonalities. They share overlapping sets of causal and contributing factors, biological processes, and symptoms. This is why simple dietary changes can have a profound effect on numerous health conditions. For some, the diseased state expresses as mood disorders, for others as autoimmune disorders, and for still others as something else entirely, but there are immense commonalities between them all. The differences have more to do with how dysbiosis and dysfunction happens to develop, where it takes hold in the body, and so what symptoms are experienced.

Terry Wahls, who uses a paleo diet perspective in treating both her patients and her own multiple sclerosis, gets at this point in a straightforward manner (The Wahls Protocol, p. 47): “In a very real sense, we all have the same disease because all disease begins with broken, incorrect biochemistry and disordered communication within and between our cells. […] Inside, the distinction between these autoimmune diseases is, frankly, fairly arbitrary”. In How Emotions Are Made, Lisa Feldman Barrett wrote (Kindle Locations 3834-3850):

“Inflammation has been a game-changer for our understanding of mental illness. For many years, scientists and clinicians held a classical view of mental illnesses like chronic stress, chronic pain, anxiety, and depression. Each ailment was believed to have a biological fingerprint that distinguished it from all others. Researchers would ask essentialist questions that assume each disorder is distinct: “How does depression impact your body? How does emotion influence pain? Why do anxiety and depression frequently co-occur?” 9

“More recently, the dividing lines between these illnesses have been evaporating. People who are diagnosed with the same-named disorder may have greatly diverse symptoms—variation is the norm. At the same time, different disorders overlap: they share symptoms, they cause atrophy in the same brain regions, their sufferers exhibit low emotional granularity, and some of the same medications are prescribed as effective.

“As a result of these findings, researchers are moving away from a classical view of different illnesses with distinct essences. They instead focus on a set of common ingredients that leave people vulnerable to these various disorders, such as genetic factors, insomnia, and damage to the interoceptive network or key hubs in the brain (chapter 6). If these areas become damaged, the brain is in big trouble: depression, panic disorder, schizophrenia, autism, dyslexia, chronic pain, dementia, Parkinson’s disease, and attention deficit hyperactivity disorder are all associated with hub damage. 10

“My view is that some major illnesses considered distinct and “mental” are all rooted in a chronically unbalanced body budget and unbridled inflammation. We categorize and name them as different disorders, based on context, much like we categorize and name the same bodily changes as different emotions. If I’m correct, then questions like, “Why do anxiety and depression frequently co-occur?” are no longer mysteries because, like emotions, these illnesses do not have firm boundaries in nature.”

What jumped out at me was the conventional view of disease as essentialist, and hence the related essentialism in biology and psychology. This is exemplified by genetic determinism, such as it informs race realism. It’s easy for most well-informed people to dismiss race realists, but essentialism takes on much more insidious forms that are harder to detect and root out. When scientists claimed to find a gay gene, some gay men quickly took this genetic determinism as a defense against the fundamentalist view that homosexuality is a choice and a sin. It turned out that there was no gay gene (by the way, this incident demonstrated how, in reacting to reactionaries, even leftist activists can be drawn into the reactionary mind). Not only is there no gay gene but also no simple and absolute gender divisions at all — as I previously explained (Is the Tide Starting to Turn on Genetics and Culture?):

“Recent research has taken this even further in showing that neither sex nor gender is binary (1, 2, 3, 4, & 5), as genetics and its relationship to environment, epigenetics, and culture is more complex than was previously realized. It’s far from uncommon for people to carry genetics of both sexes, even multiple DNA. It has to do with diverse interlinking and overlapping causal relationships. We aren’t all that certain at this point what ultimately determines the precise process of conditions, factors, and influences in how and why any given gene expresses or not and how and why it expresses in a particular way.”

The attraction of essentialism is powerful. And as shown in numerous cases, the attraction can be found across the political spectrum, as it offers a seemingly strong defense by diverting attention away from other factors. Similar to the gay gene, many people defend neurodiversity as if some people are simply born a particular way and therefore we can’t and shouldn’t seek to do anything to change or improve their condition, much less cure it or prevent it in future generations.

For example, those on the high-functioning end of the autism spectrum will occasionally defend their condition as a gift, an ability to think and perceive differently. That is fine as far as it goes, but from a scientific perspective we still should find it concerning that conditions like this are on a drastic rise, a rise that can’t be explained by greater rates of diagnosis alone. Whether or not one believes the world would be a better place with more people with autism, this shouldn’t be left as a fatalistic vision of an evolutionary leap, especially considering most on the autism spectrum aren’t high functioning. Instead, we should try to understand why it is happening and what it means.

Researchers have found that there are prospective causes to be studied. Consider propionate, a substance discussed by Alanna Collen (10% Human, p. 83): “although propionate was an important compound in the body, it was also used as a preservative in bread products – the very foods many autistic children crave. To top it all off, clostridia species are known to produce propionate. In itself, propionate is not ‘bad’, but MacFabe began to wonder whether autistic children were getting an overdose.” This might explain why antibiotics helped many with autism, as they would have been knocking down the clostridia population that was boosting propionate. To emphasize this point, when rodents were injected with propionate, they exhibited the precise behaviors of autism and they too showed inflammation in the brain. The fact that autistics often have brain inflammation, an unhealthy condition, is strong evidence that autism shouldn’t be taken as mere neurodiversity (and, among autistics, the commonality of inflammation-related gut issues emphasizes this point).

There is no doubt that genetic determinism, like the belief in an eternal soul, can be comforting. We identify with our genes, as we inherit them and are born with them. But to speak of inflammation or propionate or whatever makes it seem like we are victims of externalities. And it means we aren’t isolated individuals to be blamed or to take credit for who we are. To return to Collen (pp. 88-89):

“In health, we like to think we are the products of our genes and experiences. Most of us credit our virtues to the hurdles we have jumped, the pits we have climbed out of, and the triumphs we have fought for. We see our underlying personalities as fixed entities – ‘I am just not a risk-taker’, or ‘I like things to be organised’ – as if these are a result of something intrinsic to us. Our achievements are down to determination, and our relationships reflect the strength of our characters. Or so we like to think.

“But what does it mean for free will and accomplishment, if we are not our own masters? What does it mean for human nature, and for our sense of self? The idea that Toxoplasma, or any other microbe inhabiting your body, might contribute to your feelings, decisions and actions, is quite bewildering. But if that’s not mind-bending enough for you, consider this: microbes are transmissible. Just as a cold virus or a bacterial throat infection can be passed from one person to another, so can the microbiota. The idea that the make-up of your microbial community might be influenced by the people you meet and the places you go lends new meaning to the idea of cultural mind-expansion. At its simplest, sharing food and toilets with other people could provide opportunity for microbial exchange, for better or worse. Whether it might be possible to pick up microbes that encourage entrepreneurship at a business school, or a thrill-seeking love of motorbiking at a race track, is anyone’s guess for now, but the idea of personality traits being passed from person to person truly is mind-expanding.”

This goes beyond the personal level, which makes the proposal all the more unsettling. Our respective societies, communities, etc might be heavily influenced by environmental factors that we can’t see. A ton of research shows the tremendous impact of parasites, heavy metal toxins, food additives, farm chemicals, hormones, hormone mimics, hormone disruptors, etc. Entire regions might be shaped by even a single species of parasite, such as how higher rates of Toxoplasma gondii infection in New England are directly correlated with higher rates of neuroticism (see What do we inherit? And from whom? & Uncomfortable Questions About Ideology).

Essentialism, though still popular, has taken numerous major hits in recent years. It once was the dominant paradigm and went largely unquestioned. Consider how, early last century, respectable schools of thought such as anthropology, linguistic relativism, and behaviorism suggested that humans were largely products of environmental and cultural factors. This was the original basis of the attack on racism and race realism. In linguistics, Noam Chomsky overturned this view by positing the essentialist belief that, though never observed, much less proven, there must exist within the human brain a language module with a universal grammar. It was able to defeat and replace the non-essentialist theories because it was more satisfying to the WEIRD ideologies that were becoming a greater force in an increasingly WEIRD society.

Ever since Plato, Western civilization has been drawn toward the extremes of essentialism (as part of the larger Axial Age shift toward abstraction and idealism). Yet there has also long been a countervailing force; even among the ancients, non-essentialist interpretations were common, as with ancient understandings of group identity. It wasn’t predetermined that essentialism would be so victorious as to have nearly obliterated the memory of all alternatives. It fit the spirit of the times for this past century, but now the public mood is shifting again. It’s no accident that, as social democracy and socialism regain favor, environmentalist explanations are making a comeback. But this is merely the revival of a particular Western tradition of thought, a tradition that is centuries old.

I was reminded of this in reading Liberty in America’s Founding Moment by Howard Schwartz. It’s an interesting shift of gears, since Schwartz doesn’t write about anything related to biology, health, or science. But he does indirectly get at an environmentalist critique in his analysis of David Hume (1711-1776). I’ve mostly thought of Hume in terms of his bundle theory of self, possibly borrowed from Buddhism, which he might have learned about from Christian missionaries returning from the East. However he came to it, the bundle theory argued that there is no singular coherent self, contrary to a central tenet of traditional Christian theology. Still, heretical views of the self were hardly new; some detect a possible Western precursor of Humean bundle theory in the ideas of Baruch Spinoza (1632-1677).

Whatever its origins in Western thought, environmentalism has been challenging essentialism since the Enlightenment. And in the case of Hume, there is an early social constructionist view of society and politics, a view on which what motivates people isn’t some fixed essence. This puts a different spin on things, as Hume’s writings were widely read during the revolutionary era when the United States was founded. Thomas Jefferson, among others, was familiar with Hume and highly recommended his work. Hume represented the opposite position to John Locke. We are now returning to this old battle of ideas.

“…some deeper area of the being.”

Alec Nevala-Lee shares a passage from Colin Wilson’s Mysteries (see Magic and the art of will). It elicits many thoughts, but I want to focus on the two main related aspects: the self and the will.

The main thing Wilson is talking about is hyper-individualism: the falseness and superficiality, constraint and limitation of anxiety-driven ‘consciousness’, the conscious personality of the ego-self. This is what denies the bundled self and the extended self, the vaster sense of being that challenges the socio-psychological structure of the modern mind. We defend our thick boundaries with great care for fear of what might get in, but this locks us in a prison cell of our own making. In not allowing ourselves to be affected, we make ourselves ineffective or at best only partly effective toward paltry ends. It’s not only a matter of doing “something really well,” for we don’t really know what we want to do, as we’ve become disconnected from deeper impulses and broader experience.

For about as long as I can remember, the notion of ‘free will’ has never made sense to me. It isn’t a philosophical disagreement. Rather, in my own experience and in my observation of others, it simply offers no compelling explanation or valid meaning, much less deep insight. It intuitively makes no sense, which is to say it can only make sense if we never carefully think about it with probing awareness and open-minded inquiry. To the degree there is a ‘will’ is to the degree it is inseparable from the self. That is to say the self never wills anything, for the self is and can only be known through the process of willing, which is simply to say through impulse and action. We are what we do, but we never know why we do what we do. We are who we are and we don’t know how to be otherwise.

There is no way to step back from the self in order to objectively see and act upon the self. That would require yet another self. The attempt to impose a will upon the self would lead to an infinite regress of selves. That would be a pointless preoccupation, although as entertainments go it is popular these days. A more worthy activity and maybe a greater achievement is to stop trying to contain ourselves and instead to align with a greater sense of self. Will wills itself. And the only freedom that the will possesses is to be itself. That is what some might consider purpose or telos, one’s reason for being or rather one’s reason in being.

No freedom exists in isolation. To believe otherwise is a trap. The precise trap involved is addiction, which is the will driven by compulsion. After all, the addict is the ultimate individual, so disconnected within a repeating pattern of behavior as to be unable to affect or be affected. Complete autonomy is impotence. The only freedom is in relationship, both to the larger world and the larger sense of self. It is in the ‘other’ that we know ourselves. We can only be free in not trying to impose freedom, in not struggling to control and manipulate. True will, if we are to speak of such a thing, is the opposite of willfulness. We are only free to the extent we don’t think in the explicit terms of freedom. It is not a thought in the mind but a way of being in the world.

“We know that the conscious will is connected to the narrow, conscious part of the personality. One of the paradoxes observed by [Pierre] Janet is that as the hysteric becomes increasingly obsessed with anxiety—and the need to exert his will—he also becomes increasingly ineffective. The narrower and more obsessive the consciousness, the weaker the will. Every one of us is familiar with the phenomenon. The more we become racked with anxiety to do something well, the more we are likely to botch it. It is [Viktor] Frankl’s “law of reversed effort.” If you want to do something really well, you have to get into the “right mood.” And the right mood involves a sense of relaxation, of feeling “wide open” instead of narrow and enclosed…

“As William James remarked, we all have a lifelong habit of “inferiority to our full self.” We are all hysterics; it is the endemic disease of the human race, which clearly implies that, outside our “everyday personality,” there is a wider “self” that possesses greater powers than the everyday self. And this is not the Freudian subconscious. Like the “wider self” of Janet’s patients, it is as conscious as the “contracted self.” We are, in fact, partially aware of this “other self.” When a man “unwinds” by pouring himself a drink and kicking off his shoes, he is adopting an elementary method of relaxing into the other self. When an overworked housewife decides to buy herself a new hat, she is doing the same thing. But we seldom relax far enough; habit—and anxiety—are too strong…Magic is the art and science of using the will. Not the ordinary will of the contracted ego but the “true will” that seems to spring from some deeper area of the being.”

~ Colin Wilson, Mysteries

The Madness of Reason

Commenting online brings one into contact with odd people. It is often merely irritating, but at times it can be fascinating to see all the strange ways humanity gets expressed.

I met a guy, Naj Ziad, on the Facebook page for a discussion group about Julian Jaynes’ book, The Origin of Consciousness in the Breakdown of the Bicameral Mind. He posted something expressing his obsession with logical coherence and consistency. We dialogued for quite a bit, including in a post on his own Facebook page, and he seemed like a nice enough guy. He came across as being genuine in his intentions and worldview, but there was simply something off about him.

It’s likely he has Asperger’s, of a high-IQ and high-functioning variety. Or it could be that he has some kind of personality disorder. Either way, my sense is that he is severely lacking in cognitive empathy, although I doubt he is deficient in affective empathy. He just doesn’t seem to get that other people can perceive and experience the world differently than he does or that others exist entirely as separate entities apart from his own existence.

When I claimed that my worldview was simply different than his and that neither of our personal realities could be reduced to the other, he called me a dualist. I came to the conclusion that this guy was a solipsist, although he doesn’t identify that way. Solipsism was the only philosophy that made sense of his idiosyncratic ramblings, entirely logical ramblings I might add. He is obviously intelligent, clever, and reasonably well read. His verbal intelligence is particularly high.

In fact, he is so obsessed with his verbal intelligence that he has come to the conclusion that all of reality is language. Of course, he has his own private definition of language which asserts that language is everything. This leaves his argument as a tautology and he freely admitted this was the case, but he kept returning to his defense that his argument was logically consistent and coherent. Sure.

It was endlessly amusing. He really could not grasp that the way his mind operates isn’t how everyone’s mind operates, and so he couldn’t escape the hermetically-sealed reality tunnel of his own clever monkey mind. His worldview is so perfectly constructed and orderly that there isn’t a single crack to let in fresh air or a beam of light.

He was doing a wondrous impression of Spock in being entirely logical within his narrow psychological and ideological framework. He kept falling back on his being logical and also on his use of idiosyncratic jargon. He defined his terms to fit his ideological worldview so entirely that those terms had no meaning to him outside of it. It all made perfect sense within itself.

His life philosophy is a well-rehearsed script that he goes on repeating. It is an amazing thing to observe as an outsider, especially considering he stubbornly refused to acknowledge that anything could be outside of his own mind for he couldn’t imagine the world being different than his own mind. He wouldn’t let go of his beliefs about reality, like the monkey with his hand trapped in a jar because he won’t let go of the banana.

If this guy were just insane or a troll, I would dismiss him out of hand. But that isn’t the case. Obviously, he is neuroatypical and I won’t hold that against anyone. And I freely admit that his ideological worldview is logically consistent and coherent, for whatever that is worth.

What made it so fascinating to my mind is that solipsism has always been a speculative philosophy, to be considered only as a thought experiment. It never occurred to me that there would be a highly intelligent and rational person who would seriously uphold it as an entire self-contained worldview and lifestyle. His arguments for it were equally fascinating and he had interesting thoughts and insights, some of which I even agreed with. He is a brilliant guy who, as sometimes happens, has gone a bit off the deep end.

He built an ideology that perfectly expresses and conforms to his highly unusual neurocognitive profile. And of course, when I pointed this out to him, he dismissed it as ‘psychologizing’. His arguments are so perfectly patched together that he never refers to any factual evidence as support for his ideological commitments, as it is entirely unnecessary in his own mind. External facts in the external world, what he calls ‘scientism’, are as meaningless as others claiming to have existence independent of his own. From his perspective, there is only what he calls the ‘now’ and there can only be one ‘now’ to rule them all, which just so happens to coincide with his own ego-mind.

If you challenge him on any of this, he is highly articulate in defending why he is being entirely reasonable. Ah, the madness of reason!

* * *

On a personal note, I should make clear that I sympathize with this guy. I have my own psychological issues (depression, anxiety, thought disorder, strong introversion, etc) that can make me feel isolated and cause me to retreat further into myself. Along with a tendency to over-intellectualize everything, my psychological issues have at times led me to get lost in my own head.

I can even understand the attraction of solipsism and, as a thought experiment, I’ve entertained it. But somehow I’ve always known that I’m not ‘normal’, which is to say that others are not like me. I have never actually doubted that others not only exist but exist in a wide variety of differences. It hasn’t occurred to me to deny all otherness by reducing all others to my own psychological experience and ideological worldview. I’ve never quite been that lost in myself, although I could imagine how it might happen. There have been moments in my life where my mind could have gone off the deep end.

Yet my sympathy only goes so far. It is hard to sympathize with someone who refuses to acknowledge your independent existence as a unique human being with your own identity and views. There is an element of frustration in dealing with a solipsist, but in this case my fascination drew me in. Before I ascertained he was a solipsist, it was obvious something about him was highly unusual. I kept poking and prodding him until the shape of his worldview became apparent. At that point, my fascination ended. Any further engagement would have meant watching this guy’s mind go around in circles like a dog chasing its own tail.

Of all the contortions the human mind can put itself into, solipsism has to be one of the greatest feats to accomplish. I have to give this guy credit where it’s due. Not many people could keep up such a mindset for long.

Hyperballad and Hyperobjects

Morton’s use of the term ‘hyperobjects’ was inspired by Björk’s 1996 single ‘Hyperballad’
(Wikipedia)

Björk
by Timothy Morton

Björk and I think that there is a major cultural shift going on around the world towards something beyond cynical reason and nihilism, as more and more it becomes impossible not to have consideration for nonhumans in everything we do. Hopefully this piece we made contributes to that somehow.

I was so lucky to be doing this while she was mixing her album with some of the nicest and most incredible musicians/producers I’ve ever met…great examples of this shift beyond cynical reason…

Here is something I think is so so amazing, the Subtle Abuse mix of “Hyperballad.” Car parts, bottles, cutlery–all the objects, right? Not to mention Björk’s body “slamming against those rocks.” It’s a veritable Latour Litany… And the haunting repetition…

Dark Ecological Chocolate
by Timothy Morton

This being-an-object is intimately related with the Kantian beauty experience, wherein I find experiential evidence without metaphysical positing that at least one other being exists. The Sadness is the attunement of coexistence stripped of its conceptual content. Since the rigid anthropocentric standard of taste with its refined distances has collapsed, it becomes at this level impossible to rebuild the distinction we lost in The Ethereal between being interested or concerned with (this painting, this polar bear) and being fascinated by… Being interested means I am in charge. Being fascinated means that something else is. Beauty starts to show the subscenent wiring under the board.

Take Björk. Her song “Hyperballad” is a classic example of what I’m trying to talk about here. She shows you the wiring under the board of an emotion, the way a straightforward feeling like I love you is obviously not straightforward at all, so don’t write a love song like that, write one that says you’re sitting on top of this cliff, and you’re dropping bits and pieces of the edge like car parts, bottles and cutlery, all kinds of not-you nonhuman prosthetic bits that we take to be extensions of our totally integrated up to date shiny religious holistic selves, and then you picture throwing yourself off, and what would you look like—to the you who’s watching you still on the edge of the cliff—as you fell, and when you hit the bottom would you be alive or dead, would you look awake or asleep, would your eyes be closed, or open?

When you experience beauty you experience evidence in your inner space that at least one thing that isn’t you exists. An evanescent footprint in your inner space—you don’t need to prove that things are real by hitting them or eating them. A nonviolent coexisting without coercion. There is an undecidability between two entities—me and not-me, the thing. Beauty is sad because it is ungraspable; there is an elegiac quality to it. When we grasp it withdraws, like putting my hand into water. Yet it appears.

Beauty is virtual: I am unable to tell whether the beauty resides in me or in the thing—it is as if it were in the thing, but impossible to pin down there. The subjunctive, floating “as if” virtual reality of beauty is a little queasy—the thing emits a tractor beam in whose vortex I find myself; I veer towards it. The aesthetic dimension says something true about causality in a modern age: I can’t tell for sure what the causes and effects are without resorting to illegal metaphysical moves.[14] Something slightly sinister is afoot—there is a basic entanglement such that I can’t tell who or what started it.

Beauty is the givenness of data. A thing impinges on me before I can contain it or use it or think it. It is as if I hear the thing breathing right next to me. From the standpoint of agricultural white patriarchy, something slightly “evil” is happening: something already has a grip on us, and this is demonic insofar as it is “from elsewhere.” This “saturated” demonic proximity is the essential ingredient of ecological being and ecological awareness, not some Nature over yonder.[15]

Interdependence, which is ecology, is sad and contingent. Because of interdependence, when I’m nice to a bunny rabbit I’m not being nice to bunny rabbit parasites. Amazing violence would be required to try to fit a form over everything all at once. If you try then you basically undermine the bunnies and everything else into components of a machine, replaceable components whose only important aspect is their existence