Poised on a Knife Edge

“To make any thing very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes.”
~ Edmund Burke

I spent much of the day looking back at old posts. My purpose was to find my various writings on the revolutionary era, specifically in relation to the American Revolution. I was doing so in order to link to them in the post I just wrote, about democratic republicanism in early America.

In my search, I came across a post from several years ago. It is sort of a rambling book review of Yuval Levin’s The Great Debate, the topic being the relationship between Thomas Paine and Edmund Burke. What caught my attention was the comments section. I sometimes put more into the comments section than I do in the post itself. A longtime friend and reader of the blog left a comment, which is partly what led me to go off on some tangents there.

In one of my responses, I quoted at length from Corey Robin’s writings. One quote came from the first book I read by him, The Reactionary Mind:

Earlier than most, Burke understood that if violence were to retain its sublimity, it had to remain a possibility, an object of fantasy—a horror movie, a video game, an essay on war. For the actuality (as opposed to the representation) of violence was at odds with the requirements of sublimity. Real, as opposed to imagined, violence entailed objects getting too close, bodies pressing too near, flesh upon flesh. Violence stripped the body of its veils; violence made its antagonists familiar to each other in a way they had never been before. Violence dispelled illusion and mystery, making things drab and dreary. That is why, in his discussion in the Reflections of the revolutionaries’ abduction of Marie Antoinette, Burke takes such pains to emphasize her “almost naked” body and turns so effortlessly to the language of clothing—“the decent drapery of life,” the “wardrobe of the moral imagination,” “antiquated fashion,” and so on—to describe the event. The disaster of the revolutionaries’ violence, for Burke, was not cruelty; it was the unsought enlightenment.

Robin explains what Burke meant by the moral imagination, why such power exists, and what nullifies it. That is why I began this post with the quote by Burke. Here is the fuller context from the 1759 text (A Philosophical Enquiry into the Origin of Our Ideas of the Sublime and Beautiful, Part Two, Section III, “Obscurity”):

To make any thing very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes. Every one will be sensible of this, who considers how greatly night adds to our dread, in all cases of danger, and how much the notions of ghosts and goblins, of which none can form clear ideas, affect minds, which give credit to the popular tales concerning such sorts of beings. Those despotic governments, which are founded on the passions of men, and principally upon the passion of fear, keep their chief as much as may be from the public eye. The policy has been the same in many cases of religion.

It’s not just the power of the mind. Moral imagination is what extends power over people, the emotional grip of distant or hidden authority, human or otherwise. Sublimity and fear, awe and terror.

But this misses the subtlety of this power. Moral imagination is ever-present, the pervasive force that puts blinders on our vision, hypnotizing us into a reality tunnel and sometimes full epistemic closure. As Burke puts it, this forms the wardrobe of our moral imagination, from which we clothe our experience of the world. This wardrobe holds the social constructs of the mind, the ideologies and narratives of society, the customs and norms of culture. It is just there, all around us, enclosing us, a familiar presence, and yet near impossible to see directly, most often barely glimpsed at the periphery of our awareness. Its power is in its simultaneous obscurity and presence, the unseen depths of unconsciousness with an undertow that can be felt.

Also in the comments section, I pointed to the connection to another writer: “I noticed in these passages that ‘horror’ was mentioned a few times. Corey Robin even made reference to horror movies/films and ‘delightful horror.’ What came to my mind is something that Thomas Ligotti said in an interview. He was discussing monsters. He explained that no story can ever have a monster as the protagonist, for then the sense of monstrosity would be lost. The monster has to remain other and the evil vague. That is what gives a horror story its power to horrify.” That stood out to me most of all. There is a simple reason for this, as I had just recently mentioned Ligotti (in relation to True Detective) to this same friend when he came to visit me. I had forgotten about these comments. Reading them again, I saw them in a new light. And that points to a more important reason these comments interest me. Ligotti was making a deeper point than mere commentary on horror fiction. The most horrifying other is the one that remains unseen; that is its power over us.

This all connects back to the ongoing development of my own theory, that of symbolic conflation. But I had forgotten about an earlier post where I brought Burke into the context of symbolic conflation, though for a different reason.

In that post, I explained Burke’s role as an outsider and how that positioned him as a purveyor of symbolic conflation. The moral imagination is all about this, as symbolic conflation is its beating heart, the meeting point of the imagined and the real. The centrality of outsider status also brings into play the reactionary mind, according to Corey Robin, for the outsider sees most clearly the threat of boundaries being transgressed, and all boundaries are ultimately boundaries of the mind. A symbolic conflation is a wall that both marks and establishes the boundary. It makes the boundary real and, in doing so, defends the authority of claims about what is real.

This is the moral imagination of fear. It is a visceral fear, the embodied imagination. A symbolic conflation requires a grounding within bodily experience, fight and flight, pain and illness, pleasure and guilt, punishment and death. It relates to what I call the morality-punishment link. It also offers possible insight into the origins of the reactionary mind. The conservative, as I argue, is simply a liberal in reactionary mode. The conservative is a liberal who has been mugged by their own moral imagination. Their minds have been wrapped in chains of fear and locked shut by symbolic conflation, the visceral experience of a story that has become their reality.

This is a potential existing within everyone, not just those on the political right. But this potential requires specific conditions to become manifest. Liberalism and the conservative reaction to it are expressions of modernity. This dynamic isn’t found in all societies. It is a cultural product and so there is nothing inevitable about it. Other cultures are possible with other ideological mindsets and other social dynamics. For us moderns, though, it is the only reality we know, this endless conflict within our collective psyche.

Maybe unintentionally, Edmund Burke offers us the key to unlock the modern mind. That this key existed is what he feared most, for with it the human mind and its potential would be laid bare. Yet this fear is what gives the reactionary mind its sense of power and purpose, an existential threat that must be fought. Modernity is continuously poised on a knife edge.

The near cosmic morality tale of ideological conflict is itself a symbolic conflation. There is always a story being told and its narrative force has deep roots. Wherever a symbolic conflation takes hold, a visceral embodiment is to be found nearby. Our obsession with ideology is unsurprisingly matched by our obsession with the human brain. The symbolic conflation, through moral imagination, gets overlaid onto the brain, for there is no greater bodily symbol of the modern self. We fight over the meaning of human nature by wielding the scientific facts of neurocognition and brain scans. It’s the same reason the culture wars obsess over the visceral physicality of sexuality: same-sex marriage, abortion, etc. But the hidden mysteries of the brain make it particularly fertile soil. As Robert Burton explained in A Skeptic’s Guide to the Mind (Kindle Locations 2459-2465):

our logic is influenced by a sense of beauty and symmetry. Even the elegance of brain imaging can greatly shape our sense of what is correct. In a series of experiments by psychologists David McCabe and Alan Castel, it was shown that “presenting brain images with an article summarizing cognitive neuroscience research resulted in higher ratings of scientific reasoning for arguments made in those articles, as compared to other articles that did not contain similar images. These data lend support to the notion that part of the fascination and credibility of brain imaging research lies in the persuasive power of the actual brain images.” The authors’ conclusion: “Brain images are influential because they provide a physical basis for abstract cognitive processes, appealing to people’s affinity for reductionistic explanations of cognitive phenomena.” *

The body is always the symbolic field of battle. Yet the material form occludes what exactly the battle is being fought over. The embodied imagination is the body politic. We are the fear we project outward. And that very fear keeps us from looking inward, instead always drawing us onward. We moderns are driven by anxiety, even as we can never quite pinpoint what is agitating us. We are stuck in a holding pattern of the mind, waiting for something we don’t know and are afraid to know. Even as we are constantly on the move, we aren’t sure we are getting anywhere, like a dog trotting along the fenceline of its yard.

* * *

* D. McCabe and A. Castel, “Seeing Is Believing: The Effect of Brain Images on Judgments of Scientific Reasoning,” Cognition, 107(1), April 2008, 345–52.
(For criticisms, see: The Not So Seductive Allure of Colorful Brain Images, The Neurocritic.)

Democratic Republicanism in Early America

There was much debate and confusion around various terms in early America.

The word ‘democracy’ wasn’t used on a regular basis at the time of the American Revolution, even as the ideal of it was very much in the air. Instead, the word ‘republic’ was used by most people back then to refer to democracy. But some of the founding fathers, such as Thomas Paine, avoided such confusion and made their meaning clear beyond any doubt by speaking directly of ‘democracy’. Thomas Jefferson, the author of the first founding document and the third president, formed a political party with both ‘democratic’ and ‘republican’ in its name, demonstrating that no conflict was seen between the two terms.

The reason ‘democracy’ doesn’t come up in the founding documents is that the word is too specific, although it gets alluded to when speaking of “the People”, since democracy is literally “people power”. Jefferson, in writing the Declaration of Independence, was particularly clever in avoiding most language that evoked meaning that was too ideologically singular and obvious (e.g., he effectively used rhetoric to avoid the divisive debates for and against belief in natural law). That is because the founding documents were meant to unite a diverse group of people with diverse opinions. Such a vague and ambiguous word as ‘republic’ could mean almost anything to anyone and so was an easy way to paper over disagreements and differing visions. If more specific language had been used, making absolutely clear what they were actually talking about, it would have led to endless conflict, dooming the American experiment from the start.

Yet it was obvious from pamphlets and letters that many American founders and revolutionaries wanted democracy, in whole or part, to the degree they had any understanding of it. Some preferred a civic democracy with some basic social democratic elements and civil rights, while others (mostly Anti-Federalists) pushed for more directly democratic forms of self-governance. The first American constitution, the Articles of Confederation, was clearly a democratic document with self-governance greatly emphasized. Even those who were wary of democracy and spoke out against it nonetheless regularly used democratic rhetoric (invoking democratic ideals, principles, and values), because democracy was a major reason why so many fought the revolution in the first place. If not for democracy, there would have been little justification for, or relevance in, starting a new country, beyond a self-serving power grab by a new ruling elite.

Unless we assume that a large number of those early Americans had democracy in mind, their speaking of a republic makes no sense. Then again, genuine confusion is a possibility for at least some of them, as they weren’t always clear in their own minds about what they did and didn’t mean. To be technical (according to even the common understanding from the 1700s), a country is either a democratic republic or a non-democratic republic. The variety of non-democratic republics would include what today we’d call theocracy, fascism, communism, etc. It is a bit uncertain exactly what kind of republic various early Americans envisioned, but one thing is certain: there was immense overlap and conflation between democracy and republicanism in the early American mind. This was the battleground of the fight between Federalists and Anti-Federalists (or, to be more accurate, between pseudo-Federalists and real Federalists).

As a label, stating that something is a republic says nothing at all about what kind of government it is. All it says is what a government isn’t: it isn’t a monarchy. Although, confusing matters further, there were even those who argued for a republican monarchy with an elective king, in which the king theoretically would serve the citizenry that democratically elected him. Even some of the Federalists talked about this possibility of a republic with elements of a monarchy, strange as it seems to modern Americans. This is what the Anti-Federalists worried about.

Projecting our modern ideological biases onto the past is the opposite of helpful. The earliest American democrats were, by definition, republicans. And most of the earliest American republicans were heavily influenced by democratic political philosophy, even when they denounced it while co-opting it. There was no way to avoid the democratic promise of the American Revolution and the founding documents. Without that promise, we Americans would still be British. That promise remains, yet unfulfilled. The seed of an ideal is hard to kill once planted.

Still, bright ideals cast dark shadows. And the reactionary authoritarianism of the counter-revolutionaries was a powerful force. It is an enemy we still fight. The revolution never ended.

* * *

Democracy Denied: The Untold Story
by Arthur D. Robbins
Kindle Locations 2862-2929

Fascism has been defined as “an authoritarian political ideology (generally tied to a mass movement) that considers individual and other societal interests inferior to the needs of the state, and seeks to forge a type of national unity, usually based on ethnic, religious, cultural, or racial attributes.”[130] If there is a significant difference between fascism thus defined and the society enunciated in Plato’s Republic,[131] in which the state is supreme and submission to a warrior class is the highest virtue, I fail to detect it.[132] What is noteworthy is that Plato’s Republic is probably the most widely known and widely read of political texts, certainly in the United States, and that the word “republic” has come to be associated with democracy and a wholesome and free way of life in which individual self-expression is a centerpiece.

To further appreciate the difficulty that exists in trying to attach specific meaning to the word “republic,” one need only consult the online encyclopedia Wikipedia.[133] There one will find a long list of republics divided by period and type. As of this writing, there are five listings by period (Antiquity, Middle Ages and Renaissance, Early Modern, 19th Century, and 20th Century and Later), encompassing 90 separate republics covered in Wikipedia. The list of republic types is broken down into eight categories (Unitary Republics, Federal Republics, Confederal Republics, Arab Republics, Islamic Republics, Democratic Republics, Socialist Republics, and People’s Republics), with a total of 226 entries. There is some overlap between the lists, but one is still left with roughly 300 republics—and roughly 300 ideas of what, exactly, constitutes a republic.

One might reasonably wonder what useful meaning the word “republic” can possibly have when applied in such diverse political contexts. The word—from “res publica,” an expression of Roman (i.e., Latin) origin—might indeed apply to the Roman Republic, but how can it have any meaning when applied to ancient Athens, which had a radically different form of government existing in roughly the same time frame, and where res publica would have no meaning whatsoever?

Let us recall what was going on in Rome in the time of the Republic. Defined as the period from the expulsion of the Etruscan kings (509 B.C.) until Julius Caesar’s elevation to dictator for life (44 B.C.),[134] the Roman Republic covered a span of close to five hundred years in which Rome was free of despotism. The title rex was forbidden. Anyone taking on kingly airs might be killed on sight. The state of affairs that prevailed during this period reflects the essence of the word “republic”: a condition—freedom from the tyranny of one-man rule—and not a form of government. In fact, The American Heritage College Dictionary offers the following as its first definition for republic: “A political order not headed by a monarch.”

[…] John Adams (1735–1826), second President of the United States and one of the prime movers behind the U.S. Constitution, wrote a three-volume study of government entitled Defence of the Constitutions of Government of the United States of America (published in 1787), in which he relies on the writings of Cicero as his guide in applying Roman principles to American government.[136] From Cicero he learned the importance of “mixed governments,”[137] that is, governments formed from a mixture of monarchy, aristocracy, and democracy. According to this line of reasoning, a republic is a non-monarchy in which there are monarchic, aristocratic, and democratic elements. For me, this is confusing. Why, if one had just shed blood in unburdening oneself of monarchy, with a full understanding of just how pernicious such a form of government can be, would one then think it wise or desirable to voluntarily incorporate some form of monarchy into one’s new “republican” government? If the word “republic” has any meaning at all, it means freedom from monarchy.

The problem with establishing a republic in the United States was that the word had no fixed meaning to the very people who were attempting to apply it. In Federalist No. 6, Alexander Hamilton says, “Sparta, Athens, Rome and Carthage were all republics” (F.P., No. 6, 57). Of the four mentioned, Rome is probably the only one that even partially qualifies according to Madison’s definition from Federalist No. 10 (noted earlier): “a government in which the scheme of representation takes place,” in which government is delegated “to a small number of citizens elected by the rest” (ibid., No. 10, 81–82).

Madison himself acknowledges that there is a “confounding of a republic with a democracy” and that people apply “to the former reasons drawn from the nature of the latter” (ibid., No. 14, 100). He later points out that were one trying to define “republic” based on existing examples, one would be at a loss to determine the common elements. He then goes on to contrast the governments of Holland, Venice, Poland, and England, all allegedly republics, concluding, “These examples … are nearly as dissimilar to each other as to a genuine republic” and show “the extreme inaccuracy with which the term has been used in political disquisitions” (ibid., No. 39, 241).

Thomas Paine offers a different viewpoint: “What is now called a republic, is not any particular form of government. It is wholly characteristical [sic] of the purport, matter, or object for which government ought to be instituted, and on which it is to be employed, res-publica, the public affairs or the public good” (Paine, 369) (italics in the original). In other words, as Paine sees it, “res-publica” describes the subject matter of government, not its form.

Given all the confusion about the most basic issues relating to the meaning of “republic,” what is one to do? Perhaps the wisest course would be to abandon the term altogether in discussions of government. Let us grant the word has important historical meaning and some rhetorical appeal. “Vive la Republique!” can certainly mean thank God we are free of the tyranny of one-man, hereditary rule. That surely is the sense the word had in early Rome, in the early days of the United States, and in some if not all of the French and Italian republics. Thus understood, “republic” refers to a condition—freedom from monarchy—not a form of government.

* * *

Roger Williams and American Democracy
US: Republic & Democracy
 (part two and three)
Democracy: Rhetoric & Reality
Pursuit of Happiness and Consent of the Governed
The Radicalism of The Articles of Confederation
The Vague and Ambiguous US Constitution
Wickedness of Civilization & the Role of Government
Spirit of ’76
A Truly Free People
Nature’s God and American Radicalism
What and who is America?
Thomas Paine and the Promise of America
About The American Crisis No. III
Feeding Strays: Hazlitt on Malthus
Inconsistency of Burkean Conservatism
American Paternalism, Honor and Manhood
Revolutionary Class War: Paine & Washington
Paine, Dickinson and What Was Lost
Betrayal of Democracy by Counterrevolution
Revolutions: American and French (part two)
Failed Revolutions All Around
The Haunted Moral Imagination
“Europe, and not England, is the parent country of America.”
“…from every part of Europe.”

The Fight For Freedom Is the Fight To Exist: Independence and Interdependence
A Vast Experiment
America’s Heartland: Middle Colonies, Mid-Atlantic States and the Midwest
When the Ancient World Was Still a Living Memory

Deep Roots in Dark Soil

In doing genealogy research, I’ve made many connections to American history, some of it quite dark and much of it not that far back in time. It is something that has been bothering me for a while. I had a longer series of posts I was writing about it, but I got bogged down with the topic. It’s overwhelming and hard to grapple with. So, let me keep this post simple and to the point.

Possibly the earliest line of my family that came to America was the Peebles. They were Scottish and, maybe because they had sided with the king, they arrived in the Virginia colony (1649 or 1650) during the English Civil War. David Peebles, the patriarch, came with some help (either indentured servants or slaves) and built a plantation. Later generations of the Peebles were definitely slave owners, and they fought for the Confederacy in the Civil War.

The family across the generations drifted further South and West, ending up in Texas. That is where my paternal grandmother was born in 1912, well within living memory of slavery and the Civil War. The last Civil War veterans died in the 1950s, the last known survivor of the Atlantic slave trade between Africa and the United States died in the 1930s, and the last American born into slavery died in the 1970s — the latter happening just a few years before I was born and about a decade before my grandmother died. None of this is ancient history. It’s possible, if my grandmother had bothered to ask, that there were people in the family who still remembered owning slaves.

Also, the early twentieth century saw the last of the Indian Wars. There were major battles in that part of the country when my grandmother was a child. The last significant altercation in the United States happened in 1924, when she was twelve years old, the age when kids begin to gain awareness of the larger world. But there were Indian holdouts who kept fighting in Mexico and weren’t defeated until nine years later, in 1933. My grandmother was twenty-one years old at that point, and so this was part of the world she was entering into.

David Peebles himself had been an Indian fighter, a captain in the Virginia militia. He was a well-respected man. As a reward, he had been given a Native American captive, and I’m sure that person was treated as a slave. It’s assumed that David Peebles received an injury from fighting, and he slowly disappeared from the records. Between those first Peebles in America and my grandmother, I’m sure there were numerous Indian fighters in my ancestry. After all, that part of my family was involved in the push westward, as Native Americans retreated or were forcibly removed. And then they ended up in the region of the last battles with the last free natives.

All of this national history is intimately intertwined with my family history. And much of it was still living memory into my grandmother’s childhood and even into her adulthood (in some cases, even into my parents’ adulthood). More importantly, it was an ongoing history. The struggles of blacks didn’t end with the Civil War any more than the struggles of Native Americans ended with the Indian Wars. I can understand how much of this history was hidden at the time, even as the suffering and oppression continued. Native Americans, after all, were forced onto reservations that made their plight practically invisible to the rest of the country. It was a problem that wasn’t seen and so didn’t need to be thought about. But the problems facing blacks would have been impossible to ignore for those living in the South, and also in the North.

In the South my grandmother grew up in, Jim Crow was in full force and blacks had for decades faced re-enslavement through chain-gang labor. My grandmother was a few years old when the Second Klan was founded. The Klan was a growing force during her childhood and was at its height in her teenage years: “At its peak in the mid-1920s, the organization claimed to include about 15% of the nation’s eligible population, approximately 4–5 million men” (Wikipedia). I have no doubt that many generations and many lines of my family were involved in the various incarnations of the Klan, along with other violently racist organizations and activities; but there are no family stories about any of this, as it’s one of those things people don’t talk about.

When my grandmother was eight years old, the Tulsa race riot occurred a short distance from her childhood home, with white mobs rioting and terrorizing the black population. It was an actual battle: whites and blacks fought in the streets (many of them WWI veterans, including black veterans who had taken their military weapons home with them), snipers positioned in buildings shot at people below, airplanes firebombed the wealthiest black community in America at the time (Black Wall Street), and belatedly troops were sent in to restore order. Hundreds of blacks were killed, hundreds more ended up in the hospital, and 6,000 black residents were arrested and detained, where they were forced to do labor. In the aftermath, most of the black population became refugees who had lost everything, and thousands of white residents in Tulsa joined the Klan.

It was one of the most violent and destructive events in American history. Yet it was erased from public awareness almost instantly, as if it had never happened. “The Tulsa race riot of 1921 was rarely mentioned in history books, classrooms or even in private. Blacks and whites alike grew into middle age unaware of what had taken place” (A.G. Sulzberger, “As Survivors Dwindle, Tulsa Confronts Past”, NYT).

This was just one of many race riots and other acts of mass racial violence that occurred in the decades before and after what happened in Tulsa. Violence like this, including lynchings, was a common occurrence for the first two-thirds of her life. After her family left Oklahoma, they moved to a part of Mississippi that was a major center of the Second Klan. Then as an adult in 1940, she moved her own young family to Indiana, the headquarters and epicenter of the Second Klan, during a time when the last vestiges of the organization were still to be seen. It was in the 1950s and 1960s that a splintered KKK reasserted itself in fighting the Civil Rights Movement.

Indiana is close to the South and not just geographically. It’s been culturally and economically connected to Kentucky from early on. This area is sometimes referred to as Kentuckiana. Much of Indiana’s population originally came from Kentucky and that has made Indiana the most Southern state in the Midwest (my maternal ancestry includes Indian fighters who came to Kentucky shortly after the American Revolution). A generation after my mother’s family left the border region of Kentucky and Indiana, she grew up in a large industrial city in central Indiana and yet she maintained a Southern accent well into her twenties.

Indiana was a destination of many white Southerners looking for work. Yet Southern blacks knew to mostly avoid Indiana, except for Northern parts of the state closer to Chicago. This wasn’t just a vague notion that blacks had about Indiana. The local white population, Klan and otherwise, made it overtly clear they weren’t welcome in most parts of the state.

My father was born in small-town Indiana and then moved to another nearby small town. Both were in an area of much racism, but the second town, where he spent most of his early life, was a sundown town. When my father and his family moved there, a sign warning blacks to stay away was still visible on a major road into town. My father would have been too young to understand, but my Southern grandmother could not have missed something so obvious. They had to have known they had moved into a sundown town. Did my father know about this? No. Did his mother, my grandmother, ever talk about it? No. It wasn’t talked about. As my grandfather was the town minister, he could have challenged this racism from the pulpit. Did he? No. The reason for this is that my grandfather was himself a racist, although like many he softened his prejudiced views later in life. Still, that doesn’t change the moral failure.

My grandmother was always a religious and spiritual person, more so than my grandfather despite his being a minister. She grew up in that old-time religion, the Southern Baptist church. When she moved to the West Coast, she became quite liberal and joined extremely liberal churches, such as Unity Church and Science of Mind. It was because of my grandmother that I was raised in the same kind of liberal churches. This led me to become the liberal I am today. Even so, my grandmother never spoke of our family’s ancestral sin of racial oppression, even though she had spent so much of her life right in the middle of it.

My father went off to college at Purdue. The city, Lafayette, had been a sundown town at one point. The systemic racism was lessening there by the time my parents attended, but the black population remained low. While they were at college, the Civil Rights Movement was growing and violence was erupting. Professors and college students from Purdue even joined in some of the major events of that time. The world was changing all around my parents, but they apparently were oblivious to it all. When I’ve asked them, they have had only slight memories of what was happening at the time, other than some brief news stories that they paid little attention to. It didn’t seem all that important to them, as white conservatives in a white conservative state with a hopeful future before them.

Systemic and institutional racism continued in some parts of the country long after the death of MLK. Blacks were still fighting for basic rights and demanding that laws against racism be enforced, well into my own lifetime (in fact, the struggle for justice continues to this day). For my parents, living in Ohio after college, that was a happy time of their life. As their children were born, protests and riots were going on around the country (including nearby), but it all seemed distant and insignificant, maybe a bit incomprehensible. After that, during the 1980s, our family moved to Deerfield, Illinois — a Chicago suburb with a history of keeping blacks out, something my parents were also unaware of. Then we headed to Iowa, which at the time was a demographic bubble of whiteness.

In my own childhood, I don’t recall my parents or other adults talking about race and racism. I also was oblivious to it all, until we moved to South Carolina when I was thirteen years old. It was a shock to my system. I didn’t grow up with that world and so I saw it with fresh eyes in a way someone wouldn’t have if they had grown up with it. Even then, amidst obvious racism and an overt racial social order, few people talked about it. I saw blacks at school, but no blacks lived in my neighborhood or went to my church. Black kids didn’t come home with me nor did I go home with them.

I was facing generations of denial in my own family. No one gave me any tools to deal with any of it. If not for genealogy research, I might never have realized how close to home all of this comes. Even now, I live in a liberal college town where, at an earlier point in time, a racist mob chased the radical abolitionist John Brown out of town shortly before his execution. And a muted form of that old racism lingers still.

How do we deal with the legacy of centuries of oppression when it’s almost impossible to even publicly acknowledge what has happened within living memory? How do we come to terms with the fact that the legacy continues with systemic and institutional racism? How do we open up dialogue? How do we move forward? If more people simply dug into their own family histories, what might they find? And if they put that into context of the larger national history, what understandings might they come to?

My eternal refrain: Then what?

I’ve gained this knowledge and it was no easy task, as I had to find it for myself through decades of obsessive research and intense study. Generations of my own family have avoided this knowledge, built on centuries of ignorance and denial, supported by a vast social order designed to maintain the status quo. So, here we are. Many others like me are looking at these hidden truths now brought to light. What are we supposed to do with it all? How does a society come to terms with collective guilt?

William Faulkner spent most of his life a few counties away from my great grandmother’s home in Mississippi, the last place my grandmother lived before adulthood and the area she returned to after college to work a teaching job for a couple of years, around 1935. That is where my father would visit as a child and where he saw his first “colored” water fountain. Faulkner’s Requiem for a Nun was set in that part of Mississippi, as were other of his novels. The events in the story were fictionally placed in the years immediately following my grandmother’s departure. The world that Faulkner described was the world that shaped my grandmother, a world she couldn’t leave behind because she carried it with her.

One of Faulkner’s best known lines comes from that novel. He wrote:

“The past is never dead. It’s not even past.”

My grandmother was an educated woman, a teacher in fact. I wonder. Did she ever read those words? And if so, what did she think of them? Did she ever look to the past, her own past and that of her family? Or was she trying to escape the past by getting as far away as possible, ending up in the Northwest? It’s ironic that she spent the last years of her life in Oregon, the only state in the Union that was once fully sundown, excluding blacks entirely.

From what I gather, my grandmother was a kindhearted woman, but that could be said of many people. Few white Americans are overtly mean-spirited. People simply try to live their lives, and yet their lives exist along a moral arc bending from the past into the future. How often do any of us consider our place in the larger scheme of things and wonder about what future generations will think of us?

American Christianity: History, Politics, & Social Issues

I broke my policy and wrote a comment on an Atlantic article, Trump Is Bringing Progressive Protestants Back to Church by Emma Green. I’ve tried to stop commenting outside of a few select places on the internet because it usually ends up feeling pointless. Some of the responses were unworthy in this case, but it turned out not to be that bad of a discussion, relatively speaking.

Despite the frustration often involved, part of me enjoys the challenge of formulating an informative comment that actually adds to public debate. Plus, it got me thinking about one of my ancestors, a country abortion doctor who operated when abortion was technically illegal in Indiana and yet the law apparently wasn’t enforced at the time.

That was a different world, when communities decided which laws they did and did not care about, no matter what distant governments declared. Most people were fine with abortions, just as during Prohibition most people were fine with drinking. Laws are only meaningful when they can be enforced, and the US political system has often left much of the power of enforcement at the local level, which is how so many bootleggers avoided prosecution, since their neighbors formed the jury of their peers.

The following are my comments: my original comment first, then two follow-up comments. I had several others in the discussion, but those below are the most significant.

* * *

Sertorius wrote: “These liberal Christian denominations have experienced a massive drop in membership. Example: the Presbyterian Church (USA) had more than 3 million members 30 years ago. It now has half of that.

“This is unsurprising. Why would people go to a church which doesn’t take the Bible seriously? What is the point? How is it different than the local meeting of the Democratic Party?”

Most young Christians, including most Evangelicals and Catholics, identify as progressive or liberal. Most young Christians also support gay marriage and the pro-choice position. They do so because they read the Bible for themselves, instead of trusting the words of fundamentalist preachers.

Thomas R wrote: “Do you have a source for this odd assertion? I believe a good part of why millennials come out so socially liberals is they are less Christian than other generations.”

I always find it odd when I’m asked a question like this on the internet. If you really wanted to know, you could find such info in a few minutes of doing web searches. Maybe a bit more time, if you were really curious.

I’m sure you believe all kinds of things. But your beliefs, if uninformed, are irrelevant. Many other Christians would also believe that you are less Christian. BTW, if you go back some generations to the early 1900s, many Christians were progressives and the religious left was a powerful force. This kind of thing tends to go in cycles. But there is always a split. Even as the religious right became loud and demanding, a large swath of silenced Evangelicals remained liberal/progressive.

Belief is a funny thing. Surveys have found that atheists, on average, know more about the Bible than Christians do. So, if Christian belief for so many self-proclaimed Christians isn’t based on knowledge of the Bible, what is it based on? Does God speak to Christians personally and tell them what to believe? Or are most Christians simply following false-prophet preachers? Since these preachers are false prophets, should they be killed as the Bible commands?

If you look below at my response to rsabharw, you’ll see how little fundamentalists actually know about the Bible. The irony of their literalism is how non-literal or even anti-literal it is. Literalism simply becomes a codeword for ignorant bigotry and dogmatic politics.

Anyway, most Americans identify as Christian and have done so for generations. Yet most Americans are pro-choice, supporting abortion in most or all situations, even as most Americans also support there being strong and clear regulations for where abortions shouldn’t be allowed. It’s complicated, specifically among Christians. The vast majority (70%) of those seeking abortions considered themselves Christians, including over 50% who attend church regularly and kept their abortions secret from their church community, and 40% who feel that churches are not equipped to help them make decisions about unwanted pregnancies.

It should be noted that, on the issue of abortion, Millennials are in agreement with Americans in general and so it isn’t a generational gap. Young Evangelicals have always had high rates of premarital sex, going back to the largely Scots-Irish Evangelicals of Appalachia and the Upper South. Millennial teen sex rates now are as low as they were more than a half century ago (drug use and violent crime rates among the young also are low right now). Sexuality hasn’t really changed over time, even as rates slightly shift up and down in cycles. Even in early America, most marriages followed pregnancy and hence premarital sex. No matter what a belief states, humans remain human.

It’s similar to other issues, although often with more of a generational gap. Consider guns, a supposedly divisive issue, yet the majority of Americans simultaneously supports strong protection of gun rights and the need for stronger regulation (and enforcement) of guns. Even liberal Americans report high rates of guns in the home. There is no contradiction between someone being for both gun rights and gun regulations, both being liberal positions, one classical liberal and the other progressive liberal.

In general, most Americans are fairly liberal, progressive, and economic populist on most major issues. But this political leftism cuts deep into the part of the population that outwardly identifies as conservatives. So, even conservatism in the US is rather liberal.

Public opinion, across the generations, has been moving left. But it is most clearly seen in the younger generation. Still, even the oldest living generation seems liberal compared to the generations that were alive before them. The Lost Generation (i.e., WWI vets and 1920s libertines) were judged in their youth by older generations just the same as young people today. This would be obvious, if so many Americans weren’t historically ignorant.

The greatest differences in opinion aren’t necessarily between generations. Nor even between Christians and atheists. The growing divides in the US are often seen most clearly within Christianity: between Catholics and Protestants, Mainline Christians and Fundamentalists, white Christians and minority Christians, etc. But that has always been true, going back centuries. The entire Protestant Reformation, Counter-Reformation, and religious wars (including the English Civil War) were about Christians struggling over who would get to define Christianity for others and who would be free to define Christianity for themselves.

Many of these are old issues. Catholics, for example, genocidally wiped out the Christian Cathars for practicing gay sex. Many denominations that exist today were created by congregations splitting over social and political issues. That will continue. Rifts are developing within churches, such as the Catholic Church, which is equally divided between the two major parties. The small-town Midwestern church my grandfather preached in was shut down over conflict between the local congregation, which was fine with a gay music director, and the national church organization, which was against it. In place of churches like that, new churches will form.

Thomas R wrote: “The rules on abortion and homosexuality are part of the faith. Both are found in the writings of the Early Christians and in the Catechism. (See Cyprian, Ambrosiaster, St. John Chrysostom (c. 349 – 407), Severian, the Didache, Clement of Alexandria, St. Basil, Canon 1398) As well as the statements of Popes.

“At the very least abortion after the first trimester is consistently considered wrong by the faith.”

Even most pro-choicers treat third-trimester abortions differently. There is also a reason why pro-choicers like me are more concerned with preventing abortions entirely than are most supposed pro-lifers: it is a question of prioritizing either moral outcomes or ideological dogmatism.

Your knowledge of Christian history is obviously incomplete. That is problematic.

Among early Christians, there were different views about life, ensoulment, abortion, and murder. There was no unanimous Christian belief about such things, something you would know if you knew Christian history. There is no scholarly consensus that most early Christians treated abortion as a crime. It was often a standard sin, like most other sex-related sins. As far as that goes, sex itself was considered a sin.

It’s hard to know what early Christians believed. When they spoke of abortion, they had specific ideas in mind based in a cultural context of meaning. That depended on when one considered the fetus or baby to gain a soul. Not all early Christians thought life, much less ensoulment, began at conception and so early endings of pregnancies weren’t necessarily considered abortions. That is a main point that many pro-choicers make.

None of the New Testament or Old Testament writings clearly and directly discuss abortion, infanticide, and exposure. It apparently wasn’t considered an important enough issue even to be mentioned specifically, much less condemned. It was only in the following centuries that Christians made statements about it. So, if Christianity isn’t directly based on Jesus’ teachings and the Bible, then what is Christianity? What kind of Christian tradition isn’t based on the earliest known Christianity, formed by Jesus’ first followers?

Abortion didn’t become much of a legal and political issue until modern Christianity. Plus, beyond decrees in the centuries following Jesus’ crucifixion, there is no evidence that early Christians were ever any less likely to have abortions than non-Christians; a decree implies that the practice it condemns is common and persistent. So, is Christian tradition based on what church elites decree or on what Christians practice?

If the former, then all of Protestantism is false Christianity, since it was founded on defying the church elite of the time (even the Catholic heresiologists were defying Christians in the church who came before them, such as Valentinus and Marcion). But if Protestants are correct about the individual conscience of the Christian, then what Christians do has more validity than what church elites decree.

This is no minor point; it carries profound theological and moral significance, especially considering most American Catholics seem fine with not absolutely following Vatican declarations. This is further complicated since the various church elites over the centuries have disagreed with one another on fundamental moral issues, including abortion.

Anyway, shouldn’t Scripture supersede the personal opinions of church elites, no matter how authoritative they like to pretend to be? No one speaks for God but God. The fact that church elites disagreed and argued with one another proves they are far from infallible. Even the Vatican didn’t consider church positions on abortion to be infallible teachings.

However individuals wish to interpret all of this, there is the issue of one’s response as a Christian. Since only liberal policies have proven to decrease unwanted pregnancies that lead to abortions, it would be the religious duty of any pro-life Christian to support liberal policies. Yet they don’t and instead promote policies that either increase the number of abortions or don’t decrease them. Those Christians are committing sin, in placing their political ideology above their faith.

When someone acts in such a way that inevitably promotes a sin, what should the Christian response be?

http://www.huffingtonpost.com/jonathan-dudley/how-evangelicals-decided-that-life-begins-at-conception_b_2072716.html
My Take: When evangelicals were pro-choice
https://eewc.com/FemFaith/evangelicals-open-differing-views-abortion/
http://www.patheos.com/blogs/returntorome/2013/01/evangelicals-and-abortion-in-the-20th-century-a-hidden-history/
https://www.onfaith.co/onfaith/2013/01/22/roe-v-wade-anniversary-how-abortion-became-an-evangelical-issue/11238
http://religiondispatches.org/the-not-so-lofty-origins-of-the-evangelical-pro-life-movement/
http://www.politico.com/magazine/story/2014/05/religious-right-real-origins-107133

https://en.wikipedia.org/wiki/History_of_Christian_thought_on_abortion

“There is scholarly disagreement on how early Christians felt about abortion. Some scholars have concluded that early Christians took a nuanced stance on what is now called abortion, and that at different times and in separate places early Christians have taken different stances. Other scholars have concluded that early Christians considered abortion a sin at all stages; though there is disagreement over their thoughts on what type of sin it was and how grave a sin it was held to be. Some early Christians believed that the embryo did not have a soul from conception, and consequently opinion was divided as to whether early abortion was murder or ethically equivalent to murder.”

http://rationalwiki.org/wiki/History_of_abortion#Early_Christianity

“Neither the Old nor New Testament of the Bible make any specific mention of abortion, though Numbers 5:11-31 refers to a ritual known as the “ordeal of the bitter water”, which will test if a woman has been faithful to her husband by giving her a special potion concocted by a priest, possibly an abortifacient. If the woman was unfaithful, this will cause her “thigh” (a biblical euphemism for the woman’s reproductive organs, as well as any embryo contained within) to “swell and fall away” (some texts use the term “rupture” instead of “fall away”), which is a likely reference to miscarriage. Because of the Bible’s authors being so fond of euphemisms, it is a matter of debate whether this text is an endorsement for abortion when the woman is impregnated by someone who is not her husband (euphemistic interpretation) or simply a ritual that would presumably kill the wife for her adultery (literal interpretation).[13] The actual views of Christian society and the Church can definitively be gathered only via other extra-Biblical writings on theology and ethics.

“During the first and second century CE, abortion, intentional or forced miscarriages, and infanticide, were all commonplace, as families faced serious limitations on the number of people they could support. Though legal and ethical texts seem to suggest that this was somehow sinful, it did not take on any serious move to create or enforce a prohibition against abortion or infanticide. Scholars[14] have suggested that in the very early parts of the 1st and 2nd centuries, discussions about abortion and infanticide were effectively the same issue.

“By the mid-2nd century however, Christians separated themselves from the pagan Romans and proclaimed that the theological and legal issues with abortion had nothing to do with the father’s rights, but with God’s view of the sanctity of life itself. It was as bad a sin as any other sexual sin, including contraception and intentional sterilization, which suggested that a central issue was the giving of one’s body to God and being open for procreation as much as it was the inherent value of the unborn’s life. The issue of when the soul enters the body, and if that should affect the ethics of abortion, remained unresolved, though Augustine of Hippo offered his opinion that it did not enter until the third or sixth month, depending on the sex (the latter for girls). However, while he did not view abortion as murder until that point, it was still a sin in his view.”

http://addictinginfo.org/2013/03/21/abortion-church-conception-history/

“Then, in 1869, completely ignoring earlier teachings, Pope Pius IX wrote in Apostolicae Sedis that excommunication is the required penalty for abortion at any stage of pregnancy. He further stated that all abortion was homicide. This was an implicit endorsement – the church’s first – of ensoulment at conception.”

http://sanctuaryforallfaiths.yuku.com/topic/2170/Abortion-and-Catholic-Thought-The-LittleTold-History#.WE__-_krLIV

“Most people believe that the Roman Catholic church’s position on abortion has remained unchanged for two thousand years. Not true. Church teaching on abortion has varied continually over the course of its history. There has been no unanimous opinion on abortion at any time. While there has been constant general agreement that abortion is almost always evil and sinful, the church has had difficulty in defining the nature of that evil. Members of the Catholic hierarchy have opposed abortion consistently as evidence of sexual sin, but they have not always seen early abortion as homicide. Contrary to conventional wisdom, the “right-to-life” argument is a relatively recent development in church teaching. The debate continues today.

“Also contrary to popular belief, no pope has proclaimed the prohibition of abortion an “infallible” teaching. This fact leaves much more room for discussion on abortion than is usually thought, with opinions among theologians and the laity differing widely. In any case, Catholic theology tells individuals to follow their personal conscience in moral matters, even when their conscience is in conflict with hierarchical views.

“The campaign by Pope John Paul II to make his position on abortion the defining one at the United Nations International Conference on Population and Development in 1994 was just one leg of a long journey of shifting views within the Catholic church. In the fifth century A.D., St. Augustine expressed the mainstream view that early abortion required penance only for sexual sin. Eight centuries later, St. Thomas Aquinas agreed, saying abortion was not homicide unless the fetus was “ensouled,” and ensoulment, he was sure, occurred well after conception. The position that abortion is a serious sin akin to murder and is grounds for excommunication only became established 150 years ago.”

‘An Intercultural Perspective on Human Embryonic Cell Research’ by Leroy Walters
Stem Cells, Human Embryos and Ethics: Interdisciplinary Perspectives
edited by Lars Østnor
p. 106

“In the early centuries of Christianity there was diversity of opinion on the question of abortion. In a Roman Empire where abortion was widely practiced, some Christian theologians argued that every abortion was a homicide (Noonan 1970: 7-14). On the other hand, the ‘formed-unformed’ distinction came to prevail in the mainstream, or most authoritative, Christian theological and penitential traditions. Augustine presaged the predominant view when he argued that an unformed fetus had no soul and no sentience (Noonan 1970: 15-16). His view was accepted by Thomas Aquinas and by most theologians through at least the 18th century (Noonan 1970: 34-36). There is a nuance here that I do not want to obscure. Both the abortion of an unformed (that is, unensouled) fetus and of a formed (ensouled) fetus were considered to be sins. However, terminating the life of an unformed fetus was morally equivalent to the sin of contraception. In contrast, terminating the life of a formed fetus was considered to be (unjustified) homicide (Noonan 1970: 15-18).

“The predominant Christian view was increasingly called into question in the 18th and 19th centuries. Finally, in 1869, the authoritative Roman Catholic view came to be that it was morally safer to assume that ensoulment occurs at the time of fertilization.”

Abortion and the Politics of Motherhood
by Kristin Luker
pp. 11-14

“Surprising as it may seem, the view that abortion is murder is a relatively recent belief in American history. To be sure, there has always been a school of thought, extending back at least to the Pythagoreans of ancient Greece, that holds that abortion is wrong because the embryo is the moral equivalent of the child it will become. Equally ancient however is the belief articulated by the Stoics: that although embryos have some of the rights of already-born children (and these rights may increase over the course of the pregnancy), embryos are of a different moral order, and thus to end their existence by an abortion is not tantamount to murder.

“Perhaps the most interesting thing about these two perspectives (which have coexisted over the last two thousand years) is the fact that those who hold the first view have managed to present it as both the more ancient and the more prevalent one. Their success in this effort is the product of an unusual set of events that occurred in the nineteenth century, events I call the first “right-to-life” movement. […]

“Similarly, although early Christians were actively pro-natalist and their rhetoric denounced abortion, contraception, homosexuality, and castration as all being morally equivalent to murder, the legal and moral treatment of these acts—and particularly the treatment of abortion—was never consistent with the rhetoric. 4 For instance, induced abortion is ignored in the most central Judeo-Christian writings: it is not mentioned in the Christian or the Jewish Bible, or in the Jewish Mishnah or Talmud.* Abortion, it is true, was denounced in early Christian writings such as the Didache and by early Christian authors such as Clement of Alexandria, Tertullian, and St. Basil. But church councils, such as those of Elvira and Ancyra, which were called to specify the legal groundwork for these denunciations, did not agree on the penalties for abortion or on whether early abortion is wrong.

(“* Opponents of abortion sometimes argue that the Bible does express disapproval of abortion in Exodus 21:22-23. In fact, what is mentioned there is accidental miscarriage. The text says that when two men are fighting and they strike a pregnant woman, “causing the fruit of her womb to depart,” they may be liable for a capital offense, depending on whether “mischief” has occurred. It is not clear what is meant by “mischief”; the Hebrew word it stands for (“ason”) occurs only one other time in the Bible. Nor is induced abortion covered in the Talmud; for information on abortion in Jewish law, see David Feldman, Birth Control in Jewish Law, p. 255. The only related text in the Mishnah says that during a difficult delivery, an embryo may be dismembered until “the greater part” of it is born; only when the “greater part” has been born does Jewish law hold that the embryo is a person, and “we do not set aside one life for another”; see Immanuel Jakobovits, Jewish Medical Ethics, p. 184.”)

“In the year 1100 A.D., this debate was clarified, but hardly in the direction of making abortion at all times unequivocally murder. Ivo of Chartres, a prominent church scholar, condemned abortion but held that abortion of the “unformed” embryo was not homicide, and his work was the beginning of a new consensus. Fifty years later Gratian, in a work which became the basis of canon law for the next seven hundred years, reiterated this stand. 6

“The “formation” of an embryo (sometimes known as “animation” or “vivification”) was held to happen at forty days for a male embryo and at eighty days for a female embryo; the canonist Roger Huser argues that in questions of ambiguity the embryo was considered female. In this connection it is important to remember that these canon laws—which were, in effect, the moral and legal standard for the Western world until the coming of the Reformation and secular courts—did not treat what we would now call first trimester abortions as murder. 8 (And given the difficulty in ascertaining when pregnancy actually began, in practice this toleration must have included later abortions as well.)

“Nineteenth-century America, therefore, did not inherit an unqualified opposition to abortion, which John Noonan has called an “almost absolute value in history.” 9 On the contrary, American legal and moral practice at the beginning of the nineteenth century was quite consistent with the preceding Catholic canon law: early abortions were legally ignored and only late abortions could be prosecuted. (In fact, there is some disagreement as to whether or not even late abortions were ever prosecuted under the common law tradition.) 10

“Ironically, then, the much-maligned 1973 Supreme Court decision on abortion, Roe v. Wade, which divided the legal regulation of abortion by trimesters, was much more in line with the traditional treatment of abortion than most Americans appreciate. But that in itself is an interesting fact. The brief history recounted here shows how recently abortion came to be treated as the moral equivalent of murder.”

* * *

rsabharw wrote: “Where does it say in the bible that sodomy and child-killing are good things?”

Your question indicates why it is so important to have knowledge.

The Old Testament is one of the most violent holy texts in the world. God commands and sometimes commits all kinds of atrocities. Priests and prophets also made decrees that were, by today’s standards, quite horrific. And, yes, this did include child-killing (along with much worse, such as genocide and what is akin to eugenics).

Let me give an example from the prophet Zechariah. I find it fascinating because of the worldview it represents. It seems to imply that any Christian child who speaks in tongues or performs some similar act should be put to death.

“And it shall come to pass, that when any shall yet prophesy, then his father and his mother that begat him shall say unto him, Thou shalt not live; for thou speakest lies in the name of the LORD: and his father and his mother that begat him shall thrust him through when he prophesieth.”

That kind of thing is far from uncommon in the Old Testament. I could make an extremely long comment just by quoting the Bible. Yet that kind of thing only involves children after they are born. The Bible is clear that a fetus isn’t treated as a full human and that the death of a fetus isn’t considered murder.

For most of history, this was a non-issue for Christians. It was even a non-issue for most Americans until the culture wars. Earlier in the 20th century and before, the average doctor regularly did abortions, as it was considered part of their job. I have an ancestor who was a country doctor in Indiana, from the late 1800s to early 1900s, and he was also the local abortion provider.

As for homosexuality, the Bible has no clear and consistent position. Besides, no Christian follows all the rules and regulations, decrees and commandments described in the Old Testament. Even Jesus didn’t seem to believe that his new message of love superseded the old Jewish legalisms.

If Christians are to literally interpret and follow the Old Testament, that means Christians can’t eat pork, shellfish, or black pudding; can’t get tattoos, cut the hair on the sides of their heads, wear blended fabrics, or charge interest on loans; et cetera. Plus, Christians would have to marry their brother’s widow; adulterers, instead of being forgiven if they repent, would have to be killed; and those with disabilities would be treated as unclean, like pigs. But slavery, genocide, and child murder are fine.

Yet if we are to simply go by Jesus’ words, we are limited to having no opinion on homosexuality and abortion. The best a fundy literalist could do is to cite Paul, but he never met Jesus and the evidence points to his having been a Gnostic (the heretical Valentinus and Marcion were among the earliest followers of the Pauline tradition, prior to Paul being incorporated as part of the Catholic canon).

So, if Christians don’t prioritize the teachings of Jesus over all else, what is the point of their even calling themselves Christians?

rsabharw wrote: “Abortion was illegal in Indiana in the 1800s. Therefore, your ancestor was not a doctor, but, rather, a criminal. The Hippocratic Oath specifically bans abortion. Any doctor who performs one is breaking that most sacred oath, and thus cannot call him or herself a doctor any longer.”

Studies show that banning abortions either doesn’t decrease or actually increases the abortion rate. It’s common sense that laws don’t always have much to do with actual human behavior. Even Christianity has been outlawed at different times and places, but it didn’t stop Christians from practicing.

Anyway, when did rural people ever worry about political elites in faraway big cities telling the lower classes what to do? My ancestors in rural Indiana, besides including a country doctor who was an abortion provider, were also bootleggers. Screw you, paternalistic, authoritarian a**holes! That is what my Kentuckiana ancestors would have told you. And I agree with them, on this issue.

We will make our own decisions and live as free patriots. Despite the laws, it’s obvious that the other rural people living around my country doctor ancestor were fine with what he did, for he was never prosecuted. These were his people, the place where he was born and raised. It was a typical community for the time. Few abortion cases were ever brought to court, despite it being extremely common at the time.

http://socialistworker.org/2005-2/562/562_06_Abortion.shtml

“History shows that women have always tried to terminate unwanted pregnancies. When safe medical procedures are banned by law, they have resorted to dangerous–sometimes deadly–“back-alley” abortions.”

http://www.indystar.com/story/news/crime/2016/07/22/feticide-conviction/87440440/

“The court also said that because many of the state abortion laws dating to the 1800s explicitly protect pregnant women from prosecution, it was a stretch to believe that lawmakers intended for the feticide law to be used against pregnant women who attempt to terminate a pregnancy.”

http://www.connerprairie.org/education-research/indiana-history-1800-1860/women-and-the-law-in-early-19th-century

“In the early nineteenth century abortion simply did not elicit as much comment or controversy as today. Though not openly encouraged – and condemned in some circles – it was not necessarily dismissed out of hand if done early enough into the pregnancy. Abortion before “quickening,” the first signs of fetal movement, usually during the second trimester, was generally considered acceptable. “Most forms of abortion were not illegal and those women who wished to practice it did so.” As there were no laws specifically addressing abortion in the America of 1800, the only source for guidance was, again, English common law, which recognized quickening. […]

“These earliest abortion laws must be viewed contextually to be properly understood. In the main, they were not promulgated out of any fervor over the “morality” of abortion. As mentioned, quickening was generally accepted by both the courts and the public as the pivotal issue in abortion. Abortion was not generally considered immoral or illegal if performed prior to fetal movement. Because this was so widely accepted most American women did not have to “face seriously the moral agonies so characteristic of the twentieth century.” That Indiana’s law did not specifically mention quickening should not be seen as a step away from the doctrine. Instead, it is likely further evidence that quickening was so ingrained that it need not be especially written into the statute. […]

“Whatever the reasons, Indiana had an “anti-abortion” measure on the books after 1835. It seems to have been a law little regarded and little enforced. It also seems unlikely that it prevented many women who wished an abortion from obtaining one. Chemical or natural agents for producing abortions were readily available if a woman knew where to look – and most knew exactly where to fix their gaze. Mid-wives knew all the secrets; druggists advertised appropriate potions; medical texts provided answers.

“To judge the relative importance lawmakers attached to abortion, one need only compare the penalties involved. Assisting in an abortion, or performing a self-abortion, was punishable by a maximum fine of $500.00 and a year in the county jail. Burglary’s penalty was fourteen years in the state prison; murder (analogous in some modern minds with abortion) was a capital offense. Clearly, the state of Indiana did not equate abortion with murder, or even stealing your neighbor’s silver service.”

http://civilwarrx.blogspot.com/2014/11/her-daily-concern-womens-health-issues.html

“As the above indicates, abortion, like birth control information, became more available between 1830 and 1850. That period saw a mail order and retail abortifacient drug trade flourish. A woman could send away for certain pills or discreetly purchase them at a store. Surgical methods were “available, but dangerous.” This openness and commercial availability was mainly a feature of northern urban areas. Like much other technological and cultural change, it was later in its arrival in the midwest, and the average midwestern woman likely had a more difficult time in obtaining an abortion than her eastern, urban counterpart if she desired one.

“It was not, however, impossible. Such information and abortifacients were within reach of a woman if she grasped hard enough. Herbal abortifacients were the most widely utilized in rural, nineteenth century America. Again, networking and word-of-mouth broadcast specious methods. Women who relied on such information sometimes resorted to rubbing gunpowder on their breasts or drinking a “tea” brewed with rusty nail water. Other suggestions included “bleeding from the foot, hot baths, and cathartics.” Midwives were thought reliable informants and were wont to prescribe seneca, snakeroot, or cohosh, the favored method of Native American women. Thomsonians claimed the preferred “remedy” was a mixture of tansy syrup and rum.

“More reliable sources of information were the ever popular home medical books. If a woman knew where to look the information was easily gleaned. One book, Samuel Jennings’ The Married Ladies Companion, was meant especially to be used by rural women. It offered frank advice for women who “took a common cold,” the period colloquialism for missing a period. It urged using cathartics like aloe and calomel, and bleeding to restore menstruation. Abortion information was usually available in two sections of home medical books: how to “release obstructed menses” and “dangers” to avoid during pregnancy.

“The latter section was a sort of how-to in reverse that could be effectively put to use by the reader. The most widely consulted work, Buchan’s Domestic Medicine, advised emetics and a mixture of prepared steel, powdered myrrh, and aloe to “restore menstrual flow.” Under causes of abortion to be avoided, it listed violent exercise, jumping too high, blows to the belly, and lifting great weights. Clearly, any woman wishing badly enough to abort could find a solution to her dilemma, without relying on outside aid. If she wished to rely on herbal remedies, they could be easily obtained. Aloes, one of the most widely urged and effective abortifacients, were regularly advertised in newspapers as being available in local stores.

“Of course, the number of women who availed themselves of the abortion option cannot be properly approximated. It is enough to say that abortion was a feasible, available, and used option; it was a likely contributor to the birth rate falling by mid-century.”

It’s the Working Class, Stupid

This election was mainly interesting for what it forced to the surface. Many people began paying attention. But the election itself wasn’t a fundamental change from trends and developments that have been happening for decades.

Politics after WWII was built on the growing middle class. And it was mostly a white middle class. The New Deal programs, the GI Bill, and such were designed to primarily help whites and to exclude minorities. Still, even many minorities were making economic gains at the time and increasingly joining the middle class. Not all boats were being floated, but more than ever before. And it was built with extremely high taxation on the rich. Creating a middle class doesn’t come cheap.

That subsidized and supported growing middle class made possible a new kind of politics. It took shape in the early Cold War, but only gained full force in the latter part of the 20th century. As much of the population became economically comfortable and complacent, they became ripe for the rhetoric of red-baiting, union-busting, culture wars, civil rights fights, and identity politics. Politicians had long stopped talking about the working class, about those aspiring to do better, and in its place came an emphasis on those who had already made it. The white middle class decided to pull up the ladder behind them and barricade the door.

Wages began to stagnate when I was born, back in 1976. Well, they stagnated for the average worker, which means they were dropping for the working poor. Buying power was decreasing, but people were able to maintain their lifestyles by working longer hours or multiple jobs. The economic problems were mostly felt across generations, as education costs increased and opportunities decreased, as job security disappeared and good benefits became rare. The unions made sure to protect older workers, which meant sacrificing younger workers. And the union leadership defended the political status quo in the hope of maintaining their increasingly precarious position. But the influence of unions was being felt by a decreasing number of Americans, especially among the working class and those falling out of the middle class.

Even going into the 21st century, there was still a large middle class. It was beginning to show signs of serious hurting, but the inertia of the economy kept the reversals from being noticed by the political and media elite. It was only at the bottom of society that it was obvious how bad it was getting, specifically among the young. That was true even in the 1980s and 1990s. GenXers were the only generation last century to experience a recession that hit their generation alone, and black GenXers were hurt the worst. The pattern reached back into GenXers’ early lives, with worsening child poverty rates. The vibrant middle class was poisoned in the cribs of GenX.

The talk of the middle class continued until this election. What had become clear this past decade or so, though, is that politicians and pundits in talking about the middle class were often actually talking about the working class. More people were falling out of the middle class, instead of entering it. In the past, simply aspiring to be middle class made you middle class, no matter if you were born working class and had a working class job. Middle class was primarily defined as an aspiration and the American Dream was about upward mobility. It was the sense that the whole country was moving up, all or most boats were being floated. But that has been changing for a long time.

This is the first election in my lifetime where the political and media elite finally had to admit that the US was defined by its working class, not its middle class. That is because in recent years this has become unavoidable. US economic mobility had been falling behind that of other countries for a while, and fairly recently the US middle class lost its position as the wealthiest in the world. Trump won his nomination by inciting the fears and anxieties of a hurting middle class (his earliest supporters weren’t the poor and working class), but he won the election because of those at the bottom of society, the working poor. Once Sanders was eliminated, Trump was the last candidate left standing who talked about economic populism and economic reform. As many have been reminding the Democratic establishment, “It’s the economy, stupid.”

It turns out that the mid-20th century middle class, along with the post-war economic boom that made it possible, was a historical anomaly. We are once again a working class country. And it isn’t a working class that is feeling all that hopeful at the moment. These Americans aren’t a temporarily down-on-their-luck middle class, much less temporarily embarrassed millionaires, nor are they even aspiring to much beyond not being left behind. Unless the entire economic and political system is reformed, this working class is here to stay. And if we continue on this path, it will become a permanent underclass.

Conservative Coast

It’s always odd to hear conservatives talk about coasts, specifically coastal states and cities. It’s as if they think there is something about the ocean water that warps the brain. Or maybe it’s some kind of foreign influence drifting in on the ocean currents.

I sort of know what conservatives mean, in terms of recent voting patterns for the parties. But it’s a bit historically, demographically, and ideologically clueless.

Sure, many of the biggest cities are found on the coast, although not all. Besides, there are plenty of big cities in conservative states. The fact of the matter is that most conservatives, like most liberals, live in urban areas. The rural conservatives who have outsized voting power in our electoral system are a minuscule minority of the total number of conservatives.

This also ignores the simple fact that the majority of Southern states are on the coast. There are nine southern coastal states. And that includes some highly populated states and big cities.

In fact, Texas is far larger than most countries in the world and is one of the most influential states in the country (e.g., its textbook adoptions shape the textbook your child is likely using in school). Texas has six cities among the twenty largest in the country, three of them in the top ten. Not even California has that many big cities at the top of the population rankings.

Plus, the South has a growing population. For quite a while, it has been the largest regional population in the country, presently consisting of more than a third (37.6%) of US citizens — and, with places like Texas, a disproportionate number of non-citizen residents.

It’s true that the largest city by far is New York City. But why do conservatives dismiss city dwellers? New York City has a long history of being the home of working class whites. Most people who live in cities are working class, and the largest proportion of the working class is white. Do conservatives mistrust these people because working class whites tend to be ethnics? The early 20th century conservative movement targeted ethnic whites as one of the greatest dangers to the country, just as it now attacks Hispanics. Do WASPs still have a bad taste in their mouths, all these generations later, from their earlier political fights with ethnic whites?

Most people these days live in cities, not just liberals and rich people. A massive part of the big city population grew up in rural areas and small towns or else that is where their family came from a generation or two prior. Until the 1970s, the majority of the black population was still rural. Many big city residents still have family outside of the big cities, family they might visit or be visited by on holidays.

More than anything, when conservatives talk of coasts, they think of the West Coast or, as it is sometimes called, the Left Coast. Even more specifically, they have California in mind. But California has always been a divided state. From before the Civil War to after WWII, there were many large waves of immigrants from Southern states, and most of them were working class whites, the most famous example being the partly misnamed Okies. Unsurprisingly, most of those Southerners settled in southern California. This divide almost led to the Civil War erupting on the West Coast.

Southern California is the location of the second largest US city, Los Angeles. Ronald Reagan spent much of his time there and his politics were shaped by his experiences at that time. Next to it is the infamous right-wing Orange County, where Richard Nixon was born and raised. Southern California, along with being a major center of the military industry, is where big agriculture and an early strain of corporatism originated — all of them now among the most powerful influences in the US economy and politics. It was in this atmosphere that Nixon developed the Southern Strategy and Reagan became a leading figure, first as a corporate spokesperson and then as a politician. Reagan began his highly effective red-baiting while still in California.

That was the soil out of which mega-churches grew and the religious right rhetoric was honed. The preachers of those California mega-churches were televised around the country. Southern California was the headquarters for the Moral Majority movement. Some of the largest conservative rallies of the culture wars happened in and around big cities such as Los Angeles.

That isn’t the California that conservatives like to dismiss as a stereotype. None of these kinds of places quite fit stereotypes. Conservatives fear minorities, even though minority populations are far more socially conservative and religious than white Republicans. It confuses white Republicans that such social conservatives would vote Democrat.

But that is where the cluelessness comes in. Democrats have always been a big tent party. What Republicans should ask is: In the past, why did so many conservatives, especially among the religious right, vote for reform candidates like Franklin Delano Roosevelt and give such strong support to the New Deal? Until Trump, why did most of the white working class vote Democrat for most of the past century, even with the rise of identity politics following the Civil Rights Movement? It’s not because of minorities and liberals that Republicans have in the past lost elections in certain big cities and in certain coastal states.

Maybe this election will finally get whites on the political right to rethink their false and unhelpful assumptions about their fellow Americans. Republicans have won a complex coalition with this election. But what will they do with it now that they have it? And will they be able to keep it?

“Beyond that, there is only awe.”

“What is the meaning of life?” This question has no answer except in the history of how it came to be asked. There is no answer because words have meaning, not life or persons or the universe itself. Our search for certainty rests in our attempts at understanding the history of all individual selves and all civilizations. Beyond that, there is only awe.
~ Julian Jaynes, 1988, Life Magazine

That is always a nice quote. Jaynes never seemed like an ideologue about his own speculations. In his controversial book, published more than a decade earlier (1976), he titled his introduction “The Problem of Consciousness”. That is what frames his thought: confronting a problem. The whole issue of consciousness is still problematic to this day and likely will be so for a long time. After a lengthy analysis of complex issues, he concludes his book with some humbling thoughts:

For what is the nature of this blessing of certainty that science so devoutly demands in its very Jacob-like wrestling with nature? Why should we demand that the universe make itself clear to us? Why do we care?

To be sure, a part of the impulse to science is simple curiosity, to hold the unheld and watch the unwatched. We are all children in the unknown.

Following that, he makes a plea for understanding. Not just understanding of the mind but also of experience. It is a desire to grasp what makes us human, the common impulses that bind us, underlying both religion and science. There is a tender concern being given voice, probably shaped and inspired by his younger self having pored over his deceased father’s Unitarian sermons.

As individuals we are at the mercies of our own collective imperatives. We see over our everyday attentions, our gardens and politics, and children, into the forms of our culture darkly. And our culture is our history. In our attempts to communicate or to persuade or simply interest others, we are using and moving about through cultural models among whose differences we may select, but from whose totality we cannot escape. And it is in this sense of the forms of appeal, of begetting hope or interest or appreciation or praise for ourselves or for our ideas, that our communications are shaped into these historical patterns, these grooves of persuasion which are even in the act of communication an inherent part of what is communicated. And this essay is no exception.

That humility feels genuine. His book was far beyond mere scholarship. It was an expression of decades of questioning and self-questioning, about what it means to be human and what it might have meant for others throughout the millennia.

He never got around to writing another book on the topic, despite his stated plans to do so. But during the last decade of his life, he wrote an afterword to his original work. It was placed in the 1990 edition, fourteen years after the original publication. He had faced much criticism and one senses a tired frustration in those last years. Elsewhere, he complained about the expectation to explain himself and make himself understood to people who, for whatever reason, didn’t understand. Still, he realized that was the nature of his job as an academic scholar working at a major university. In the afterword, he wrote:

A favorite practice of some professional intellectuals when at first faced with a theory as large as the one I have presented is to search for that loose thread which, when pulled, will unravel all the rest. And rightly so. It is part of the discipline of scientific thinking. In any work covering so much of the terrain of human nature and history, hustling into territories jealously guarded by myriad aggressive specialists, there are bound to be such errancies, sometimes of fact but I fear more often of tone. But that the knitting of this book is such that a tug on such a bad stitch will unravel all the rest is more of a hope on the part of the orthodox than a fact in the scientific pursuit of truth. The book is not a single hypothesis.

Interestingly, Jaynes doesn’t present the bicameral mind as an overarching context for the hypotheses he lists. In fact, it is just one among the several hypotheses and not even the first to be mentioned. That shouldn’t be surprising since decades of his thought and research, including laboratory studies done on animal behavior, preceded the formulation of the bicameral hypothesis. Here are the four hypotheses:

  1. Consciousness is based on language.
  2. The bicameral mind.
  3. The dating.
  4. The double brain.

He states that, “I wish to emphasize that these four hypotheses are separable. The last, for example, could be mistaken (at least in the simplified version I have presented) and the others true. The two hemispheres of the brain are not the bicameral mind but its present neurological model. The bicameral mind is an ancient mentality demonstrated in the literature and artifacts of antiquity.” Each hypothesis is connected to the others but must be dealt with separately. The key element to his project is consciousness, as that is the key problem. And as problems go, it is a doozy. Calling it a problem is like calling the moon a chunk of rock and the sun a warm fire.

Related to these hypotheses, earlier in his book, Jaynes proposes a useful framework. He calls it the General Bicameral Paradigm. “By this phrase,” he explains, “I mean an hypothesized structure behind a large class of phenomena of diminished consciousness which I am interpreting as partial holdovers from our earlier mentality.” There are four components:

  1. “the collective cognitive imperative, or belief system, a culturally agreed-on expectancy or prescription which defines the particular form of a phenomenon and the roles to be acted out within that form;”
  2. “an induction or formally ritualized procedure whose function is the narrowing of consciousness by focusing attention on a small range of preoccupations;”
  3. “the trance itself, a response to both the preceding, characterized by a lessening of consciousness or its loss, the diminishing of the analog or its loss, resulting in a role that is accepted, tolerated, or encouraged by the group; and”
  4. “the archaic authorization to which the trance is directed or related to, usually a god, but sometimes a person who is accepted by the individual and his culture as an authority over the individual, and who by the collective cognitive imperative is prescribed to be responsible for controlling the trance state.”

The point is made that the reader shouldn’t assume that they are “to be considered as a temporal succession necessarily, although the induction and trance usually do follow each other. But the cognitive imperative and the archaic authorization pervade the whole thing. Moreover, there is a kind of balance or summation among these elements, such that when one of them is weak the others must be strong for the phenomena to occur. Thus, as through time, particularly in the millennium following the beginning of consciousness, the collective cognitive imperative becomes weaker (that is, the general population tends toward skepticism about the archaic authorization), we find a rising emphasis on and complication of the induction procedures, as well as the trance state itself becoming more profound.”

This general bicameral paradigm is partly based on the insights he gained from studying ancient societies. But ultimately it can be considered separately from that. All you have to understand is that these are a basic set of cognitive abilities and tendencies that have been with humanity for a long time. These are the vestiges of human evolution and societal development. They can be combined and expressed in multiple ways. Our present society is just one of many possible manifestations. Human nature is complex and human potential is immense, and so diversity is to be expected among human neurocognition, behavior, and culture.

An important example of the general bicameral paradigm is hypnosis. It isn’t just an amusing trick done for magic shows. Hypnosis shows something profoundly odd, disturbing even, about the human mind. Also, it goes far beyond the individual for it is about how humans relate. It demonstrates the power of authority figures, in whatever form they take, and indicates the significance of what Jaynes calls authorization. By the way, this leads down the dark pathways of authoritarianism, brainwashing, propaganda, and punishment — as for the latter, Jaynes writes that:

If we can regard punishment in childhood as a way of instilling an enhanced relationship to authority, hence training some of those neurological relationships that were once the bicameral mind, we might expect this to increase hypnotic susceptibility. And this is true. Careful studies show that those who have experienced severe punishment in childhood and come from a disciplined home are more easily hypnotized, while those who were rarely punished or not punished at all tend to be less susceptible to hypnosis.

He discusses the history of hypnosis beginning with Mesmer. In this, he shows how metaphor took different form over time. And, accordingly, it altered shared experience and behavior.

Now it is critical here to realize and to understand what we might call the paraphrandic changes which were going on in the people involved, due to these metaphors. A paraphrand, you will remember, is the projection into a metaphrand of the associations or paraphiers of a metaphier. The metaphrand here is the influences between people. The metaphiers, or what these influences are being compared to, are the inexorable forces of gravitation, magnetism, and electricity. And their paraphiers of absolute compulsions between heavenly bodies, of unstoppable currents from masses of Leyden jars, or of irresistible oceanic tides of magnetism, all these projected back into the metaphrand of interpersonal relationships, actually changing them, changing the psychological nature of the persons involved, immersing them in a sea of uncontrollable control that emanated from the ‘magnetic fluids’ in the doctor’s body, or in objects which had ‘absorbed’ such from him.

It is at least conceivable that what Mesmer was discovering was a different kind of mentality that, given a proper locale, a special education in childhood, a surrounding belief system, and isolation from the rest of us, possibly could have sustained itself as a society not based on ordinary consciousness, where metaphors of energy and irresistible control would assume some of the functions of consciousness.

How is this even possible? As I have mentioned already, I think Mesmer was clumsily stumbling into a new way of engaging that neurological patterning I have called the general bicameral paradigm with its four aspects: collective cognitive imperative, induction, trance, and archaic authorization.

Through authority and authorization, immense power and persuasion can be wielded. Jaynes argues that it is central to the human mind, but that in developing consciousness we learned how to partly internalize the process. Even so, Jaynesian self-consciousness is never a permanent, continuous state and the power of individual self-authorization easily morphs back into external forms. This is far from idle speculation, considering authoritarianism still haunts the modern mind. I might add that the ultimate power of authoritarianism, as Jaynes makes clear, isn’t overt force and brute violence. Outward forms of power are only necessary to the degree that external authorization is relatively weak, as is typically the case in modern societies.

This touches upon the issue of rhetoric, although Jaynes never mentioned the topic. It’s disappointing since his original analysis of metaphor has many implications. Fortunately, others have picked up where he left off (see Ted Remington, Brian J. McVeigh, and Frank J. D’Angelo). Authorization in the ancient world came through a poetic voice, but today it is most commonly heard in rhetoric.

Still, that old time religion can be heard in the words and rhythm of any great speaker. Just listen to how a recorded speech of Martin Luther King Jr. can pull you in with its musicality. Or if you prefer a dark example, consider the persuasive power of Adolf Hitler, for even some Jews admitted they got caught up listening to his speeches. This is why Plato feared the poets and banished them from his utopia of enlightened rule. Poetry would inevitably undermine and subsume the high-minded rhetoric of philosophers. “[P]oetry used to be divine knowledge,” as Guerini et al. state in Echoes of Persuasion: “It was the sound and tenor of authorization and it commanded where plain prose could only ask.”

Metaphor grows naturally in poetic soil, but its seeds are planted in every aspect of language and thought, giving fruit to our perceptions and actions. This is a thousandfold true on the collective level of society and politics. Metaphors are most powerful when we don’t see them as metaphors. So, the most persuasive rhetoric is that which hides its metaphorical frame and obfuscates any attempts to bring it to light.

Going far back into the ancient world, metaphors didn’t need to be hidden in this sense. The reason for this is that there was no intellectual capacity or conceptual understanding of metaphors as metaphors. Instead, metaphors were taken literally. The way people spoke about reality was inseparable from their experience of reality and they had no way of stepping back from their cultural biases, as the cultural worldviews they existed within were all-encompassing. It’s only with the later rise of multicultural societies, especially the vast multi-ethnic trade empires, that people began to think in terms of multiple perspectives. Such a society was developing in the trade networking and colonizing nation-states of Greece in the centuries leading up to Hellenism.

That is the well-known part of Jaynes’ speculations, the basis of his proposed bicameral mind. And Jaynes considered it extremely relevant to the present.

Marcel Kuijsten wrote that, “Jaynes maintained that we are still deep in the midst of this transition from bicamerality to consciousness; we are continuing the process of expanding the role of our internal dialogue and introspection in the decision-making process that was started some 3,000 years ago. Vestiges of the bicameral mind — our longing for absolute guidance and external control — make us susceptible to charismatic leaders, cults, trends, and persuasive rhetoric that relies on slogans to bypass logic” (“Consciousness, Hallucinations, and the Bicameral Mind Three Decades of New Research”, Reflections on the Dawn of Consciousness, Kindle Locations 2210-2213). Considering the present, in Authoritarian Grammar and Fundamentalist Arithmetic, Ben G. Price puts it starkly: “Throughout, tyranny asserts its superiority by creating a psychological distance between those who command and those who obey. And they do this with language, which they presume to control.” The point made by the latter is that this knowledge, even as it can be used as intellectual defense, might just lead to even more effective authoritarianism.

We’ve grown less fearful of rhetoric because we see ourselves as savvy, experienced consumers of media. The cynical modern mind is always on guard, our well-developed and rigid state of consciousness offering a continuous psychological buffering against the intrusions of the world. So we like to think. I remember, back in 7th grade, being taught how the rhetoric of advertising is used to manipulate us. But we are over-confident. Consciousness operates at the surface of the psychic depths. We are better at rationalizing than being rational, something we may understand intellectually, though we rarely acknowledge its full psychological and societal significance. That is the usefulness of theories like bicameralism, as they remind us that we are out of our depths. In the ancient world, there was a profound mistrust between the poetic and the rhetorical, and for good reason. We would be wise to learn from that clash of mindsets and worldviews.

We shouldn’t be so quick to assume we understand our own minds, the kind of vessel we find ourselves on. Nor should we allow ourselves to get too comfortable within the worldview we’ve always known, the safe harbor of our familiar patterns of mind. It’s hard to think about these issues because they touch upon our own being, the surface of consciousness along with the depths below it. This is the near impossible task of fathoming the ocean floor using rope and a weight, an easier task the closer we hug the shoreline. But what might we find if we cast ourselves out on open waters? What new lands might be found, lands to be newly discovered and lands already inhabited?

We moderns love certainty. And it’s true we possess more knowledge than any civilization before has accumulated. Yet we’ve partly made the unfamiliar familiar by remaking the world in our own image. There is no place on earth that remains entirely untouched. Only a couple hundred small isolated tribes remain uncontacted, representing foreign worldviews not known or studied, but even they live under unnatural conditions of stress as the larger world closes in on them. Most of the ecological and cultural diversity that once existed has been obliterated from the face of the earth, most of it leaving not a single trace or record, simply gone. Populations beyond count faced extermination by outside influences and forces before they ever got a chance to meet an outsider. Plagues, environmental destruction, and societal collapse wiped them out, often in short periods of time.

Those other cultures might have gifted us with insights about our humanity that now are lost forever, just as extinct species might have held answers to questions not yet asked and medicines for diseases not yet understood. Almost all that now is left is a nearly complete monoculture with the differences ever shrinking into the constraints of capitalist realism. If not for scientific studies done on the last of isolated tribal people, we would never know how much diversity exists within human nature. Many of the conclusions that earlier social scientists had made were based mostly on studies involving white, middle class college kids in Western countries, what some have called the WEIRD: Western, Educated, Industrialized, Rich, and Democratic. But many of those conclusions have since proven wrong, biased, or limited.

When Jaynes first thought about such matters, the social sciences were still getting established as serious fields of study. He entered college around 1940, when behaviorism was the dominant paradigm. It was only in the prior decades that the very idea of ‘culture’ began to take hold among anthropologists. He was influenced by anthropologists, directly and indirectly. One indirect influence came by way of E. R. Dodds, a classical scholar, who in writing his 1951 The Greeks and the Irrational found inspiration from Ruth Benedict’s anthropological work comparing cultures (Benedict taking this perspective through the combination of the ideas of Franz Boas and Carl Jung). Still, anthropology was young and the fascinating cases so well known today were unknown back then (e.g., Daniel Everett’s recent books on the Pirahã). So, in following Dodds’ example, Jaynes turned to ancient societies and their literature.

His ideas were forming at the same time the social sciences were gaining respectability and maturity. It was a time when many scholars and other intellectuals were more fully questioning Western civilization. But it was also the time when Western ascendancy was becoming clear with the WWI ending of the Ottoman Empire and the WWII ending of the Japanese Empire. The whole world was falling under Western cultural influence. And traditional societies were in precipitous decline. That was the dawning of the age of monoculture.

We are the inheritors of the world that was created from that wholesale destruction of all that came before. And even what came before was built on millennia of collapsing civilizations. Jaynes focused on the earliest example of mass destruction and chaos, which led him to see a stark division between what came before and what came after. How do we understand why we came to be the way we are when so much has been lost? We are forced back on our own ignorance. Jaynes apparently understood that and so considered awe to be the proper response. We know the world through our own humanity, but we can only know our own humanity through the cultural worldview we are born into. It is our words that have meaning, was Jaynes’ response, “not life or persons or the universe itself.” That is to say we bring meaning to what we seek to understand. Meaning is created, not discovered. And the kind of meaning we create depends on our cultural worldview.

In Monoculture, F. S. Michaels writes (pp. 1-2):

THE HISTORY OF HOW we think and act, said twentieth-century philosopher Isaiah Berlin, is, for the most part, a history of dominant ideas. Some subject rises to the top of our awareness, grabs hold of our imagination for a generation or two, and shapes our entire lives. If you look at any civilization, Berlin said, you will find a particular pattern of life that shows up again and again, that rules the age. Because of that pattern, certain ideas become popular and others fall out of favor. If you can isolate the governing pattern that a culture obeys, he believed, you can explain and understand the world that shapes how people think, feel and act at a distinct time in history.1

The governing pattern that a culture obeys is a master story — one narrative in society that takes over the others, shrinking diversity and forming a monoculture. When you’re inside a master story at a particular time in history, you tend to accept its definition of reality. You unconsciously believe and act on certain things, and disbelieve and fail to act on other things. That’s the power of the monoculture; it’s able to direct us without us knowing too much about it.

Over time, the monoculture evolves into a nearly invisible foundation that structures and shapes our lives, giving us our sense of how the world works. It shapes our ideas about what’s normal and what we can expect from life. It channels our lives in a certain direction, setting out strict boundaries that we unconsciously learn to live inside. It teaches us to fear and distrust other stories; other stories challenge the monoculture simply by existing, by representing alternate possibilities.

Jaynes argued that ideas are more than mere concepts. Ideas are embedded in language and metaphor. And ideas take form not just as culture but as entire worldviews built on interlinked patterns of attitudes, thought, perception, behavior, and identity. Taken together, this is the reality tunnel we exist within.

It takes a lot to shake us loose from these confines of the mind. Certain practices, from meditation to imbibing psychedelics, can temporarily or permanently alter the matrix of our identity. Jaynes, for reasons of his own, came to question the inevitability of the society around him which allowed him to see that other possibilities may exist. The direction his queries took him landed him in foreign territory, outside of the idolized individualism of Western modernity.

His ideas might have been less challenging in a different society. We modern Westerners identify ourselves with our thoughts, the internalized voice of egoic consciousness. And we see this as the greatest prize of civilization, the hard-won rights and freedoms of the heroic individual. It’s the story we tell. But in other societies, such as in the East, there are traditions that teach the self is distinct from thought. From the Buddhist perspective of dependent (co-)origination, it is a much less radical notion that the self arises out of thought, instead of the other way around, and that thought itself simply arises. A Buddhist would have a much easier time intuitively grasping the theory of bicameralism, that thoughts are greater than and precede the self.

Maybe we modern Westerners need to practice a sense of awe, to inquire more deeply. Jaynes offers a different way of thinking that doesn’t even require us to look to another society. If he is correct, this radical worldview is at the root of Western Civilization. Maybe the traces of the past are still with us.

* * *

The Origin of Rhetoric in the Breakdown of the Bicameral Mind
by Ted Remington

Endogenous Hallucinations and the Bicameral Mind
by Rick Strassman

Consciousness and Dreams
by Marcel Kuijsten, Julian Jaynes Society

Ritual and the Consciousness Monoculture
by Sarah Perry, Ribbonfarm

“I’m Nobody”: Lyric Poetry and the Problem of People
by David Baker, The Virginia Quarterly Review

It is in fact dangerous to assume a too similar relationship between those ancient people and us. A fascinating difference between the Greek lyricists and ourselves derives from the entity we label “the self.” How did the self come to be? Have we always been self-conscious, of two or three or four minds, a stew of self-aware voices? Julian Jaynes thinks otherwise. In The Origin of Consciousness in the Breakdown of the Bicameral Mind—that famous book my poetry friends adore and my psychologist friends shrink from—Jaynes surmises that the early classical mind, still bicameral, shows us the coming-into-consciousness of the modern human, shows our double-minded awareness as, originally, a haunted hearing of voices. To Jaynes, thinking is not the same as consciousness: “one does one’s thinking before one knows what one is to think about.” That is, thinking is not synonymous with consciousness or introspection; it is rather an automatic process, notably more reflexive than reflective. Jaynes proposes that epic poetry, early lyric poetry, ritualized singing, the conscience, even the voices of the gods, all are one part of the brain learning to hear, to listen to, the other.

Auditory Hallucinations: Psychotic Symptom or Dissociative Experience?
by Andrew Moskowitz & Dirk Corstens

Voices heard by persons diagnosed schizophrenic appear to be indistinguishable, on the basis of their experienced characteristics, from voices heard by persons with dissociative disorders or by persons with no mental disorder at all.

Neuroimaging, auditory hallucinations, and the bicameral mind.
by L. Sher, Journal of Psychiatry and Neuroscience

Olin suggested that recent neuroimaging studies “have illuminated and confirmed the importance of Jaynes’ hypothesis.” Olin believes that recent reports by Lennox et al and Dierks et al support the bicameral mind. Lennox et al reported a case of a right-handed subject with schizophrenia who experienced a stable pattern of hallucinations. The authors obtained images of repeated episodes of hallucination and observed its functional anatomy and time course. The patient’s auditory hallucination occurred in his right hemisphere but not in his left.

What Is It Like to Be Nonconscious?: A Defense of Julian Jaynes
by Gary Williams, Phenomenology and the Cognitive Sciences

To explain the origin of consciousness is to explain how the analog “I” began to narratize in a functional mind-space. For Jaynes, to understand the conscious mind requires that we see it as something fleeting rather than something always present. The constant phenomenality of what-it-is-like to be an organism is not equivalent to consciousness and, subsequently, consciousness must be thought in terms of the authentic possibility of consciousness rather than its continual presence.

Defending Damasio and Jaynes against Block and Gopnik
by Emilia Barile, Phenomenology Lab

When Jaynes says that there was “nothing it is like” to be preconscious, he certainly didn’t mean to say that nonconscious animals are somehow not having subjective experience in the sense of “experiencing” or “being aware” of the world. When Jaynes said there is “nothing it is like” to be preconscious, he means that there is no sense of mental interiority and no sense of autobiographical memory. Ask yourself what it is like to be driving a car and then suddenly wake up and realize that you have been zoned out for the past minute. Was there something it is like to drive on autopilot? This depends on how we define “what it is like”.

“The Evolution of the Analytic Topoi: A Speculative Inquiry”
by Frank J. D’Angelo
from Essays on Classical Rhetoric and Modern Discourse
ed. Robert J. Connors, Lisa S. Ede, & Andrea A. Lunsford
pp. 51-5

The first stage in the evolution of the analytic topoi is the global stage. Of this stage we have scanty evidence, since we must assume the ontogeny of invention in terms of spoken language long before the individual is capable of anything like written language. But some hints of how logical invention might have developed can be found in the work of Eric Havelock. In his Preface to Plato, Havelock, in recapitulating the educational experience of the Homeric and post-Homeric Greek, comments that the psychology of the Homeric Greek is characterized by a high degree of automatism.

He is required as a civilised being to become acquainted with the history, the social organisation, the technical competence and the moral imperatives of his group. This in turn is able to function only as a fragment of the total Hellenic world. It shares a consciousness in which he is keenly aware that he, as a Hellene, [. . .] in his memory. Such is poetic tradition, essentially something he accepts uncritically, or else it fails to survive in his living memory. Its acceptance and retention are made psychologically possible by a mechanism of self-surrender to the poetic performance and of self-identification with the situations and the stories related in the performance. . . . His receptivity to the tradition has thus, from the standpoint of inner psychology, a degree of automatism which however is counter-balanced by a direct and unfettered capacity for action in accordance with the paradigms he has absorbed. 6

Preliterate man was apparently unable to think logically. He acted, or as Julian Jaynes, in The Origin of Consciousness in the Breakdown of the Bicameral Mind, puts it, “reacted” to external events. “There is in general,” writes Jaynes, “no consciousness in the Iliad . . . and in general therefore, no words for consciousness or mental acts.” 7 There was, in other words, no subjective consciousness in Iliadic man. His actions were not rooted in conscious plans or in reasoning. We can only speculate, then, based on the evidence given by Havelock and Jaynes that logical invention, at least in any kind of sophisticated form, could not take place until the breakdown of the bicameral mind, with the invention of writing. If ancient peoples were unable to introspect, then we must assume that the analytic topoi were a discovery of literate man. Eric Havelock, however, warns that the picture he gives of Homeric and post-Homeric man is oversimplified and that there are signs of a latent mentality in the Greek mind. But in general, Homeric man was more concerned to go along with the tradition than to make individual judgments.

For Iliadic man to be able to think, he must think about something. To do this, states Havelock, he had to be able to revolt against the habit of self-identification with the epic poem. But identification with the poem at this time in history was necessary psychologically (identification was necessary for memorization), and the topoi, implicit in the epic story as acts or events that are carried out by important people, had to be abstracted from the narrative flux. “Thus the autonomous subject who no longer recalls and feels, but knows, can now be confronted with a thousand abstract laws, principles, topics, and formulas which become the objects of his knowledge.” 8

The analytic topoi, then, were implicit in oral poetic discourse. They were “experienced” in the patterns of epic narrative, but once abstracted they could become objects of thought as well as of experience. As Eric Havelock puts it,

If we view them [these abstractions] in relation to the epic narrative from which, as a matter of historical fact, they all emerged, they can all be regarded as in one way or another classifications of an experience which was previously “felt” in an unclassified medley. This was as true of justice as of motion, of goodness as of body or space, of beauty as of weight or dimension. These categories turn into linguistic counters, and become used as a matter of course to relate one phenomenon to another in a non-epic, non-poetic, non-concrete idiom. 9

The invention of the alphabet made it easier to report experience in a non-epic idiom. But it might be a simplification to suppose that the advent of alphabetic technology was the only influence on the emergence of logical thinking and the analytic topics, although perhaps it was the major influence. Havelock contends that the first “proto-thinkers” of Greece were the poets, who at first used rhythm and oral formulas to attempt to arrange experience in categories rather than in narrative events. He mentions in particular that it was Hesiod who first parted company with the narrative, in the Theogony and Works and Days. In Works and Days, Hesiod uses a cataloging technique, consisting of proverbs, aphorisms, wise sayings, exhortations, and parables, intermingled with stories. But this effect of cataloging that goes “beyond the plot of a story in order to impose a rough logic of topics” . . . presumes that Hesiod is [. . .] 10

The kind of material found in the catalogs of Hesiod was more like the cumulative commonplace material of the Renaissance than the abstract topics that we are familiar with today. Walter Ong notes that “the oral performer, poet or orator needed a stock of material to keep him going. The doctrine of the commonplaces is, from one point of view, the codification of ways of assuring and managing this stock.” 11 We already know what some of the material was like: stock epithets, figures of speech, exempla, proverbs, sententiae, quotations, praises or censures of people and things, and brief treatises on virtues and vices. By the time we get to the invention of printing, there are vast collections of this commonplace material, so vast, relates Ong, that scholars could probably never survey it all. Ong goes on to observe that

print gave the drive to collect and classify such excerpts a potential previously undreamed of. . . . the ranging of items side by side on a page, once achieved, could be multiplied as never before. Moreover, printed collections of such commonplace excerpts could be handily indexed; it was worthwhile spending days or months working up an index because the results of one’s labors showed fully in thousands of copies. 12

To summarize, then, in oral cultures rhetorical invention was bound up with oral performance. At this stage, both the cumulative topics and the analytic topics were implicit in epic narrative. Then the cumulative commonplaces began to appear, separated out by a cataloging technique from poetic narrative, in sources such as the Theogony and Works and Days. Eric Havelock points out that in Hesiod, the catalog “has been isolated or abstracted . . . out of a thousand contexts in the rich reservoir of oral tradition. . . . A general world view is emerging in isolated or ‘abstracted’ form.” 13 Apparently, what we are witnessing is the emergence of logical thinking. Julian Jaynes describes the kind of thought to be found in the Works and Days as “preconscious hypostases.” Certain lines in Hesiod, he maintains, exhibit “some kind of bicameral struggle.” 14

The first stage, then, of rhetorical invention is that in which the analytic topoi are embedded in oral performance in the form of commonplace material as “relationships” in an undifferentiated matrix. Oral cultures preserve this knowledge by constantly repeating the fixed sayings and formulae. Mnemonic patterns, patterns of repetition, are not added to the thought of oral cultures. They are what the thought consists of.

Emerging selves: Representational foundations of subjectivity
by Wolfgang Prinz, Consciousness and Cognition

What, then, may mental selves be good for and why have they emerged during evolution (or, perhaps, human evolution or even early human history)? Answers to these questions used to take the form of stories explaining how the mental self came about and what advantages were associated with it. In other words, these are theories that construct hypothetical scenarios offering plausible explanations for why certain (groups of) living things that initially do not possess a mental self gain fitness advantages when they develop such an entity—with the consequence that they move from what we can call a self-less to a self-based or “self-morphic” state.

Modules for such scenarios have been presented occasionally in recent years by, for example, Dennett (1990, 1992), Donald (2001), Edelman (1989), Jaynes (1976), Metzinger (1993, 2003), or Mithen (1996). Despite all the differences in their approaches, they converge on a few interesting points. First, they believe that the transition between the self-less and self-morphic state occurred at some stage during the course of human history—and not before. Second, they emphasize the cognitive and dynamic advantages accompanying the formation of a mental self. And, third, they also discuss the social and political conditions that promote or hinder the constitution of this self-morphic state. In the scenario below, I want to show how these modules can be keyed together to form a coherent construction. […]

Thus, where do thoughts come from? Who or what generates them, and how are they linked to the current perceptual situation? This brings us to a problem that psychology describes as the problem of source attribution (Heider, 1958).

One obvious suggestion is to transfer the schema for interpreting externally induced messages to internally induced thoughts as well. Accordingly, thoughts are also traced back to human sources and, likewise, to sources that are present in the current situation. Such sources can be construed in completely different ways. One solution is to trace the occurrence of thoughts back to voices—the voices of gods, priests, kings, or ancestors, in other words, personal authorities that are believed to have an invisible presence in the current situation. Another solution is to locate the source of thoughts in an autonomous personal authority bound to the body of the actor: the self.

These two solutions to the attribution problem differ in many ways: historically, politically, and psychologically. In historical terms, the former must be markedly older than the latter. The transition from one solution to the other and the mentalities associated with them are the subject of Julian Jaynes’s speculative theory of consciousness. He even considers that this transfer occurred during historical times: between the Iliad and the Odyssey. In the Iliad, according to Jaynes, the frame of mind of the protagonists is still structured in a way that does not perceive thoughts, feelings, and intentions as products of a personal self, but as the dictates of supernatural voices. Things have changed in the Odyssey: Odysseus possesses a self, and it is this self that thinks and acts. Jaynes maintains that the modern consciousness of Odysseus could emerge only after the self had taken over the position of the gods (Jaynes, 1976; see also Snell, 1975).

Moreover, it is obvious why the political implications of the two solutions differ so greatly: Societies whose members attribute their thoughts to the voices of mortal or immortal authorities produce castes of priests or nobles that claim to be the natural authorities or their authentic interpreters and use this to derive legitimization for their exercise of power. It is only when the self takes the place of the gods that such castes become obsolete, and authoritarian constructions are replaced by other political constructions that base the legitimacy for their actions on the majority will of a large number of subjects who are perceived to be autonomous.

Finally, an important psychological difference is that the development of a self-concept establishes the precondition for individuals to become capable of perceiving themselves as persons with a coherent biography. Once established, the self becomes involved in every re-presentation and representation as an implicit personal source, and just as the same body is always present in every perceptual situation, it is the same mental self that remains identical across time and place. […]

According to the cognitive theories of schizophrenia developed in the last decade (Daprati et al., 1997; Frith, 1992), these symptoms can be explained with the same basic pattern that Julian Jaynes uses in his theory to characterize the mental organization of the protagonists in the Iliad. Patients with delusions suffer from the fact that the standardized attribution schema that localizes the sources of thoughts in the self is not available to them. Therefore, they need to explain the origins of their thoughts, ideas, and desires in another way (see, e.g., Stephens & Graham, 2000). They attribute them to person sources that are present but invisible—such as relatives, physicians, famous persons, or extraterrestrials. Frequently, they also construct effects and mechanisms to explain how the thoughts proceeding from these sources are communicated, by, for example, voices or pictures transmitted over rays or wires, and nowadays frequently also over phones, radios, or computers. […]

As bizarre as these syndromes seem against the background of our standard concept of subjectivity and personhood, they fit perfectly with the theoretical idea that mental selves are not naturally given but rather culturally constructed and, in fact, set up in attribution processes. The unity and consistency of the self are not a natural necessity but a cultural norm, and when individuals are exposed to unusual developmental and life conditions, they may well develop deviant attribution patterns. Whether these deviations are due to disturbances in attribution to persons or to disturbances in dual representation cannot be decided here. Both biological and societal conditions are involved in the formation of the self, and when they take an unusual course, the causes could lie in both domains.


“The Varieties of Dissociative Experience”
by Stanley Krippner
from Broken Images, Broken Selves: Dissociative Narratives in Clinical Practice
pp. 339-341

In his provocative description of the evolution of humanity’s conscious awareness, Jaynes (1976) asserted that ancient people’s “bicameral mind” enabled them to experience auditory hallucinations— the voices of the deities— but they eventually developed an integration of the right and left cortical hemispheres. According to Jaynes, vestiges of this dissociation can still be found, most notably among the mentally ill, the extremely imaginative, and the highly suggestible. Even before the development of the cortical hemispheres, the human brain had slowly evolved from a “reptilian brain” (controlling breathing, fighting, mating, and other fixed behaviors), to the addition of an “old-mammalian brain” (the limbic system, which contributed emotional components such as fear, anger, and affection), to the superimposition of a “new-mammalian brain” (responsible for advanced sensory processing and thought processes). MacLean (1977) describes this “triune brain” as responsible, in part, for distress and inefficiency when the parts do not work well together. Both Jaynes’ and MacLean’s theories are controversial, but I believe that there is enough autonomy in the limbic system and in each of the cortical hemispheres to justify Ornstein’s (1986) conclusion that human beings are much more complex and intricate than they imagine, consisting of “an uncountable number of small minds” (p. 72), sometimes collaborating and sometimes competing. Donald’s (1991) portrayal of mental evolution also makes use of the stylistic differences of the cerebral hemispheres, but with a greater emphasis on neuropsychology than Jaynes employs. Mithen’s (1996) evolutionary model is a sophisticated account of how specialized “cognitive domains” reached the point that integrated “cognitive fluidity” (apparent in art and the use of symbols) was possible.

James (1890) spoke of a “multitude” of selves, and some of these selves seem to go their separate ways in posttraumatic stress disorder (PTSD) (see Greening, Chapter 5), dissociative identity disorder (DID) (see Levin, Chapter 6), alien abduction experiences (see Powers, Chapter 9), sleep disturbances (see Barrett, Chapter 10), psychedelic drug experiences (see Greenberg, Chapter 11), death terrors (see Lapin, Chapter 12), fantasy proneness (see Lynn, Pintar, & Rhue, Chapter 13), near-death experiences (NDEs) (see Greyson, Chapter 7), and mediumship (see Grosso, Chapter 8). Each of these conditions can be placed into a narrative construction, and the value of these frameworks has been described by several authors (e.g., Barclay, Chapter 14; Lynn, Pintar, & Rhue, Chapter 13; White, Chapter 4). Barclay (Chapter 14) and Powers (Chapter 15) have addressed the issue of narrative veracity and validation, crucial issues when stories are used in psychotherapy. The American Psychiatric Association’s Board of Trustees (1993) felt constrained to issue an official statement that “it is not known what proportion of adults who report memories of sexual abuse were actually abused” (p. 2). Some reports may be fabricated, but it is more likely that traumatic memories may be misconstrued and elaborated (Steinberg, 1995, p. 55). Much of the same ambiguity surrounds many other narrative accounts involving dissociation, especially those described by White (Chapter 4) as “exceptional human experiences.”

Nevertheless, the material in this book makes the case that dissociative accounts are not inevitably uncontrolled and dysfunctional. Many narratives considered “exceptional” from a Western perspective suggest that dissociation once served and continues to serve adaptive functions in human evolution. For example, the “sham death” reflex found in animals with slow locomotor abilities effectively offers protection against predators with greater speed and agility. Uncontrolled motor responses often allow an animal to escape from dangerous or frightening situations through frantic, trial-and-error activity (Kretchmer, 1926). Many evolutionary psychologists have directed their attention to the possible value of a “multimodular” human brain that prevents painful, unacceptable, and disturbing thoughts, wishes, impulses, and memories from surfacing into awareness and interfering with one’s ongoing contest for survival (Nesse & Lloyd, 1992, p. 610). Ross (1991) suggests that Western societies suppress this natural and valuable capacity at their peril.

The widespread prevalence of dissociative reactions argues for their survival value, and Ludwig (1983) has identified seven of them: (1) The capacity for automatic control of complex, learned behaviors permits organisms to handle a much greater work load in as smooth a manner as possible; habitual and learned behaviors are permitted to operate with a minimum expenditure of conscious control. (2) The dissociative process allows critical judgment to be suspended so that, at times, gratification can be more immediate. (3) Dissociation seems ideally suited for dealing with basic conflicts when there is no instant means of resolution, freeing an individual to take concerted action in areas lacking discord. (4) Dissociation enables individuals to escape the bounds of reality, providing for inspiration, hope, and even some forms of “magical thinking.” (5) Catastrophic experiences can be isolated and kept in check through dissociative defense mechanisms. (6) Dissociative experiences facilitate the expression of pent-up emotions through a variety of culturally sanctioned activities. (7) Social cohesiveness and group action often are facilitated by dissociative activities that bind people together through heightened suggestibility.

Each of these potentially adaptive functions may be life-depotentiating as well as life-potentiating; each can be controlled as well as uncontrolled. A critical issue for the attribution of dissociation may be the dispositional set of the experiencer-in-context along with the event’s adaptive purpose. Salamon (1996) described her mother’s ability to disconnect herself from unpleasant surroundings or facts, a proclivity that led to her ignoring the oncoming imprisonment of Jews in Nazi Germany but that, paradoxically, enabled her to survive her years in Auschwitz. Gergen (1991) has described the jaundiced eye that modern Western science has cast toward Dionysian revelry, spiritual experiences, mysticism, and a sense of bonded unity with nature, a hostility he predicts may evaporate in the so-called “postmodern” era, which will “open the way to the full expression of all discourses” (pp. 246– 247). For Gergen, this postmodern lifestyle is epitomized by Proteus, the Greek sea god, who could change his shape from wild boar to dragon, from fire to flood, without obvious coherence through time. This is all very well and good, as long as this dissociated existence does not leave— in its wake— a residue of broken selves whose lives have lost any intentionality or meaning, who live in the midst of broken images, and whose multiplicity has resulted in nihilistic affliction and torment rather than in liberation and fulfillment (Glass, 1993, p. 59).

Mid-Atlantic Ancestral Homeland

New Jersey, New York, Connecticut, Massachusetts, Vermont, and Pennsylvania. Those are the states my family and I visited recently, more or less in that order. It was a return to one of my ancestral homelands, the Mid-Atlantic region. The trip actually involved only part of my family: one brother and one parent. My father was finally feeling nostalgic in his old age, so he was our tour guide for much of the trip.

The last stretch involved some brief driving through Vermont and a stop at Gettysburg in Pennsylvania. Vermont was beautiful, exactly as I imagined it would be. And the Gettysburg battlefield was fascinating, similar to a tour years ago of Little Bighorn, where Custer had his infamous last stand. As a side note, Custer had been on the winning side at Gettysburg and played a significant role in stopping the Confederate advance, but almost everyone remembers him solely for his later defeat and death. The guy deserves some credit. Enjoyable as it was to see some beautiful countryside and explore an important part of American history, it was the personal side of our East Coast travels that interested me the most. This was first and foremost a tour of family history.

Our first destination was New Jersey. We headed to Morristown. It was in Morris County, part of the New York metropolitan area, that my father’s paternal grandparents were born. Since the colonial era, Morristown itself has attracted the wealthy (it is still a place of money), but my family wasn’t wealthy and so they lived out in what was then the surrounding countryside. Morristown was also a major center for the American Revolution, a meeting place for important figures and a headquarters for George Washington.

My paternal great grandmother was Matilda Reinthaler. Her father, Charles, escaped the Austrian Army when he was an officer sent to Italy (perhaps during the Second Italian War of Independence). His men were forced to wear heavy uniforms and, though it was hot, they weren’t allowed to unbutton them to cool off. He refused to follow orders and, facing court-martial, was forced to flee, ending up in New York and then New Jersey. Matilda’s mother, Caroline Lindenmeyer, left Bavaria for unknown reasons, though probably ones related to the 19th-century wars and revolutionary fervor.

We found their home and, still standing down the road, the one-room schoolhouse she would have attended. Even with new houses having been built, there was a sense of the rural clinging to the former country road. She had a more stable childhood than that experienced by her husband, my paternal great grandfather, Charles Salvester Steele. His grandfather came from Pennsylvania, and his mother’s family had been in New Jersey continuously since the colonial era (one line of my own mother’s family, the Hawks, also came from colonial New Jersey, but we didn’t visit that southern part of the state). Charles’ mother died when he was young and, since his father couldn’t afford to raise all the children, he was sent to live with the Shakers somewhere near Rochester.

That particular Shaker village has since become part of a prison. The only way to visit the buildings my great grandfather spent time in would be to get arrested for a serious crime. Later on in the trip, we visited a different Shaker village that is still standing in Hancock, Massachusetts. It was part of a complex of villages along the border of New York and Massachusetts. The Shakers were a fascinating group, highly innovative and technologically advanced, business leaders in agriculture and industry. Even the design and quality of their buildings are impressive, such as the round barn we saw, the most practical barn I’ve seen in my life. They knew how to run an operation and, unlike the Amish, they had no desire to cling to the past. Growing up there would have been simple, but deprivation in any form would not have been an issue. The Shakers for much of their history were successful and wealthy.

My great grandfather’s childhood was not an unusual fate for poor kids of the time. The Shakers regularly and legally adopted children given to them, a practice that continued until the federal government made it illegal for groups to adopt children and thus officially doomed the celibate Shaker communities. Upon reaching adulthood, the children they raised were given the choice to stay or leave. My paternal great grandfather was living at the Shaker village at a time when the Shakers were already in decline. He decided to leave, and that Shaker village closed not long after.

It seems he returned to Morristown, probably because it was the only other place he knew. He remained in contact with his family, but one gets the sense that the contact was limited. His wife was also from that area, and so one might presume that is how they met, although there is no family info about this. My paternal great grandfather would have learned a trade, or maybe multiple trades, while with the Shakers, who put a heavy focus on practical knowledge and skills. As an adult, he probably did some farming; certainly, the Shakers were famous for their agriculture. While living in this area, he took a large wagon into nearby New York City to sell produce, likely produce that he had grown himself.

As part of our family pilgrimage, we headed into New York City. It’s hard to imagine what it must have looked like back then. When my paternal grandfather was younger, he would travel there sometimes, since an uncle had a grocery store in Brooklyn. My family and I only had a day in the city, so we didn’t see much besides the standard tourist sights; we didn’t even have enough time to visit the 9/11 memorial. The most exciting part was taking the Staten Island Ferry, from which we could see the Statue of Liberty and Ellis Island, famous landmarks for incoming immigrants, although both from an immigrant era that came after my own immigrant ancestry.

Later on, following his marriage, Charles Salvester Steele did professional gardening and lawn maintenance in Connecticut. He also entered flower shows, which is where the wealthy Benjamin DeWitt Riegel met him and hired him as estate superintendent and head groundskeeper. That is how my grandfather ended up growing up on a Long Island Sound estate where, later on, my father would spend his own childhood summers.

That estate is apparently known as Xanadu, but my father recalls that to his family and the Riegel family it was simply known as “The Place”. It remained in Riegel ownership until recent years. About a decade ago, my father and uncle were able to get in touch with Mr. Riegel’s daughter, Katherine Riegel Emory (she remembered it as “The Place” when my father mentioned it). She was a childhood playmate of my grandfather, until their teenage years, when the fates of the classes diverged. My father and uncle knew her as Mrs. Riegel when they visited in the summers of their youth. In those last years of her life, they were given permission to walk the grounds of the estate one last time, not that they realized it would soon fall under new ownership.

Years of talking about the place was a major reason for this trip. My father has long wanted my brothers and me to see the place of his fond childhood memories. With the new ownership, there was no longer a way to get onto the estate by invitation, but a public road runs along one side of it and two public beaches lie adjacent to it. At low tide, we were able to walk the rocky beach directly between the estate and Long Island Sound, making possible a clear view across the vast lawns my great grandfather once maintained. My dad pointed out all that he remembered from his childhood, along with stories his own father had shared with him, such as the time when my grandfather, as a child, built a contraption attached to a cable that was secured to the top story of the barn and stretched taut to the beach; the two of them rode it down, barely missing a wall in the process. By the way, an article stated that my great grandfather (referred to by his work title, not his name) used to gather the eggs from the chicken coops near the old barn, but according to my father it was in fact my great grandmother who did this — just wanted to set the record straight.

The Xanadu estate is in Fairfield, Connecticut. It is another old area, inhabited long before Europeans settled there in the early colonial era. Before the Riegels bought the property, it was a gentlemen’s horse farm and at some point an onion farm. There was a village nearby that had been almost entirely burned down by the British during the American Revolution, the British having landed right around where the estate is located. As with many places on the Eastern seaboard, there is much history there.

It was nice to finally see this place I’d heard so much about all my life. The Place! I also saw the school my grandfather went to. One time, as he walked to school, the Riegels’ chauffeur drove by on the road, splashing muddy water onto my grandfather, who thought it was done on purpose. My grandfather grew up with the Riegel children and lived a protected life during the Great Depression, but he had an inferiority complex from living on the periphery of great wealth. He spent the rest of his days being extremely class conscious and always wanting to enjoy the good life. It was even passed on to his children, including my father, who likes nice things (e.g., classy cars, large houses, manicured lawns, expensive resorts, and such), not that my family is wealthy enough to afford many nice things.

It would be strange growing up as the son of the help on an estate, or even visiting such a place as a child. My dad recalls, as a child, telling a close friend back in Alexandria, Indiana (“Small Town, USA”) that he had spent the summer at an estate, and his friend called him a liar. Life on an estate is not an experience most of us ever have.

One thing stood out to me. There are, as I said, two public beaches on either side of the estate. They are fairly nice beaches for the area, and when we first arrived, many local people were lounging on the sand and playing in the water, as people do. However, directly on the waterfront of the estate, there is almost nothing other than rocks. This is because the Riegel family had built a seawall that caused erosion of the sand and prevented the beach from naturally rebuilding itself. It is a great example of the opposite of the tragedy of the commons. In trying to protect their private property, the Riegels destroyed the beach along it, while on either side the two public beaches keep their broad stretches of sand.

After my grandfather graduated from college, Mr. Riegel offered him a job as night superintendent at one of his mills, in order to train him. His job was to manage the factory during the night shift. It was the Trion factory in Georgia, and my grandfather was one of the fair-haired boys that Mr. Riegel sent down from New York. It was at Trion that my grandfather met his first wife and my grandmother, Billie Jean Nye, who was working as a schoolteacher employed by the company in the company town. You can see pictures of the mill town in this article, including a picture of the hotel where the unmarried employees like my grandparents lived, the place where they first met, and a picture of the school where I assume my grandmother would have taught.

He had felt socially obligated to accept that job. Mr. Riegel, after all, was not only his father’s boss but also the owner of the house his father lived in. It would have been an offense to decline an offer of such a good job, at a time not long after the Great Depression when the economy was getting back on its feet. Still, my grandfather hated the job, as it was his responsibility to pick the workers for the week out of a crowd of men desperate for a job, deciding who would get work and who wouldn’t. During the Great Depression, my grandfather had lived a protected existence on the estate. Before working at Trion, he probably had never seen much extreme poverty and unemployment. Also, that company town would have still been recovering from recent conflict. In 1934, a year or two before my grandfather arrived, the town had been the site of labor conflict and violence:

“1934 marked the 3rd closing of the plant for any length of time. Throughout the South unions were making a strong push to organize factories and mills. “Flying Squadrons” of union activists were sent into mill communities to gain support. The large group of employees working in Trion was high on their list. Led by a group of people from the Rome Foundry, along with some local people, a mob literally tried to take over the mill. Trion’s Chief of Police, Mr. Hix, was killed attempting to protect the mill. Others that had come on to work that day were beaten or roughed up. Eventually the National Guard was called in. The mill remained closed for approximately six weeks.”

It was the kind of clash of the classes that happened in places like that. In controlling employment, the company had total power over people’s lives. It was the largest employer in the area, and still there were more people looking for work than there were jobs available. As an interesting side note, this was all going on in the last years of Mr. Riegel’s life. He had contracted some disease, maybe polio, and was kept alive with an iron lung. In 1941, back on the estate, a storm hit and the power went out. The iron lung ran on electricity, and apparently they had no backup generator. My great grandfather was sent for, and he tried to hook up the iron lung to the engine of a Model A truck, but it was too late. Mr. Riegel had suffocated to death. Along with the ending of his life, it was the ending of an era.

Anyway, in those remaining years of Mr. Riegel’s life, my grandfather didn’t last long at Trion. He realized there weren’t many respectable ways he could quit without offending Mr. Riegel. He could join the military or he could become a minister. He chose the latter and took his wife with him to Indiana. But Mr. Riegel was still immensely disappointed, having given this son of the help such a rare opportunity to move up in the world.

I could imagine the sense of expectation and conflict. While at Trion, my grandfather managed the mill during the night shift. Some new advanced machinery had been installed and, along with another guy, my grandfather had to learn how to operate it and keep it running non-stop. The problem was that no one had been sent to show them how it all worked, and something went wrong, destroying the equipment. My grandfather was horrified by the incident, but after an investigation no one was blamed. Mr. Riegel had put immense trust and responsibility onto my grandfather’s shoulders, and he obviously saw great promise in him. After all of that, it must have seemed ungrateful for my grandfather to quit.

Even so, the training my grandfather received didn’t entirely go to waste. There was a tomato canning factory in Geneva, Indiana, where the family (including my father as a young child) lived for a time. It operated seasonally after the tomato harvest, and my grandfather, when not doing his ministerial duties, worked there as a temporary factory manager.

My father, without realizing it, followed in his footsteps when he later became a factory manager, a family tradition that began with the Riegels. He also fell into the same pattern when he refused career advancement in order to look for other work, having initially considered the ministerial option as well, until he decided to become a professor in order to preach at students instead. Like his own father, he found the cutthroat world of business stressful, along with the harsh reality of controlling the fate of workers by personally determining who would be hired and fired. My family apparently doesn’t have the right kind of personality traits to be part of the wealthy business elite.

Later on, my great grandmother died on the estate in 1954, when my father was twelve years old. A few years later, the Riegel family asked my great grandfather to leave the property. He was around eighty years old and had spent half of his life working and residing on the estate. It was his home and, from the way my father talks about it, I get the sense that he was heartbroken. Mr. Riegel had promised that he would always be taken care of, but Mr. Riegel died in 1941 and had never written anything down. His word-of-mouth promise apparently meant nothing to his heirs, or maybe it never came up. Whatever the case, my great grandfather wasn’t given any retirement package or even a place to live. He was just told to immediately leave the home he had known and loved for so long.

As a comparison, on the other side of the road was the estate of Harold Gray, the comic strip artist of Little Orphan Annie. My great grandparents were good friends with some of the long-term help at that other estate, only a few minutes’ walk away. When Gray’s long-term help retired, he bought them an expensive house. My great grandfather was probably expecting something similar, as the Riegels were surely even wealthier. Instead, he was forced to move in with his son and died shortly after.

That part of my family has always felt distant to me. This trip was the first time since I was a baby that I had visited that part of the country. There are still some of the extended family living around there, but we’ve had a hard time contacting them. My father hasn’t seen his extended family on that side since he was a kid. Yet that part of the country is so key, both to my family history and to American history.

As I mentioned, one line of my mother’s family (originating with Sampson Hawk) came from colonial New Jersey. Like my father’s family, they were likely of Germanic ancestry. The difference was that they headed for the frontier early on, whereas the New Jersey lines of my father’s family didn’t venture far. The German-American Riegel family was also from New Jersey, along with Pennsylvania, where Mr. Riegel was born. My own surname has an early Pennsylvanian background, although I don’t know the ancestral source of it.

Like the Midwest, the mid-Atlantic region was ethnic American (i.e., non-WASP) territory. Specifically, it was one of the areas where German-Americans were the majority. But none of this comes up much in official histories and collective memory, as cultural amnesia is almost complete. My father has a vague memory of his grandparents having some kind of accent, whatever it was. They weren’t that far from the immigrant experience and they lived in a place where the immigrant experience should have been close to the surface. Even so, my father doesn’t recall anyone ever discussing such things. The oppressive world war era had stigmatized and erased so much of the former ethno-cultural diversity. That makes me sad, as it is a loss of part of the ancestral history that shaped my family.

Visiting New Jersey, in particular, gave me a glimpse of the world that once existed there. I have more of a sense of the place. But family history came to life even more so in our visit to Connecticut. My father doesn’t get too excited about genealogy, maybe having to do with particular disconnections over the generations. Talking about the estate, though, allowed me to see another side of him. The estate was something personally real and important to my father’s life, one of the fondest connections he has to his family history. And for me, the stories I’ve heard for years suddenly had physical locations that I can now see in my mind’s eye.

The Old WASP Dream Falters

Over at Steve Wiggins’s blog, I was commenting on a recent post of his, Majority Report. He brought up the WASP myth and put it in context, although his focus was mostly on the Protestant part. In my comments, I mentioned the pluralist background of American society. WASPs have made up a large chunk of the ruling elite, but they’ve never been the majority of the population, contrary to the belief of many.

His post stood out to me partly because that kind of thing is always of interest to me. But it was already on my mind because of an article I read recently in a local newspaper, The Daily Iowan — the article being Is this heaven? No, it’s beer by Clair Dietz. It appears to be in response to an exhibit being put on by the University of Iowa, German Iowa and the Global Midwest. I live near where the old breweries used to be located, along with the beer caves. My landlord, Doug Alberhasky, was quoted often in the piece, as his family’s business, John’s Grocery, is a well-known local distributor of alcohol.

There once was much clashing, sometimes violent, between WASPs and so-called hyphenated Americans. Many ethnic immigrant groups, especially German-Americans, loved their beer and liquor. The WASPs here in Iowa were seeking prohibition before the rest of the country, as Iowa became a major destination for German immigrants. Entire communities spoke German and carried on their German traditions, including the making of alcohol. There is a great book I’ve written about before, Gentlemen Bootleggers by Bryce Bauer, about one such community during Prohibition and how they became famous for their bootlegged Templeton Rye.

Another article on the topic comes from the other local newspaper, the Press-Citizen: Iowa has deep German Roots by H. Glenn Penny. That article interested me even more. The author points out that there used to be three German-language newspapers here in Iowa City, an impressive number considering there are only two newspapers left in town at present: “In fact, the German language was so widespread that many German-Iowans lived here for decades without ever learning English.” Much of the Midwest was like this, especially this part of it, including the neighboring states of Minnesota and Wisconsin. This was German-American territory, where German culture and language were the norm, not the exception.

This all came to a halt with World War I, most dramatically with the Babel Proclamation, which outlawed the public use of any language besides English. And German-American independence and self-determination were further decimated by World War II. The cultural genocide was so complete that collective memory of this past was lost to the following generations. German-Americans were always the largest immigrant group and the largest ancestry, far beyond the meager numbers of WASPs, but they suffered for not having sufficient political power among the ruling elite. German-American culture was almost entirely lost, as if it had never existed, until the recent revival of interest in ethnic ancestry.

Still, this kind of political reaction seems to go in cycles. Every time there is a movement of populations, fear and bigotry inevitably follow. As with Germans in the past, the same thing has happened with immigrants of Arab, Persian, or similar-looking ethnicities. This is true even within the country, as when Southerners migrated to the North and West. More recently, it has been true of blacks moving almost anywhere, but especially when it involves supposed inner-city blacks. The Press-Citizen article made me think about this, when Penny wrote about how German immigrants were initially welcomed and even sought out:

“Iowa: The Home for Immigrants.” That was the title of the 1870 volume published by the Iowa Board for Immigration in Des Moines. It was translated into multiple languages and distributed across Northern Europe. The goal was to spur Europeans to abandon their homes and move to the state.

And it worked. Germans were the most numerous group to arrive. In fact, German immigrants consistently accounted for the largest number of foreign-born people in Iowa from the 1850s through the 1970s.

That instantly struck me. It sounded like the “workforce recruitment” campaigns the Iowa government has run to attract people from other states. There has been a pattern of young Iowans leaving the state, and so, in order to counter the demographic loss and brain drain, the state has needed to attract young professionals and young families. Starting in the 1980s, the Iowa Department of Economic Development has advertised in Chicago by putting up billboards — here is an example (from About those Chicago billboards by Adam Belz):

This advertisement ran on billboards along interstates in Chicago in 2007.

Belz points out that, “It’s really a far cry from the local myth that Iowa has been running Section 8 ads in south Chicago for years, but as Steve Rackis, the guy who oversees Section 8 in Iowa City, points out, everyone drives on the interstate, and everyone likes the idea of a safe, quiet place with good schools and no traffic. So certainly, some low-income black people have seen these ads and responded by moving to Iowa.”

Most of the people who respond to such billboards aren’t poor, unemployed inner-city blacks, aren’t the stereotyped welfare queens, thugs, and gangbangers. The fact of the matter is that most people coming from Chicago to Iowa are middle class white people. That is what happened with my family back in the 1980s, when we left the Chicago suburbs and moved to Iowa City, where my father returned to school for a PhD program. My parents were young middle class professionals with young kids, the demographic targeted by the billboards. I’m sure my father saw such signs as he headed into Chicago for work, whether or not they were part of the reason for his decision to move his family to Iowa.

Besides, most of those on housing assistance in Iowa City, according to the data kept, are whites and long-term Iowa residents. Among these, the majority are elderly or disabled (many of the elderly and disabled move here because of the multiple hospitals, including a world-class university medical center and a major Veterans Affairs facility). The rest are young families, and most of these are employed, as unemployment rates are low here. There probably aren’t many “welfare queens” in the area, considering all the local opportunities for jobs, education, and training. Plus, the worst-off poor people in Iowa are rural whites living in dying farm towns and trailer parks, not blacks from Chicago.

Considering the proven racial targeting of blacks by the police in Johnson County, it isn’t exactly a welcoming place for blacks, and so it isn’t a place most blacks are going to choose to move to. In interviews, many blacks living here explained that they saw their situation as temporary, simply for the sake of finding work and saving money, and that they planned on leaving as soon as they were able.

Sure, all kinds of people end up in a town like Iowa City. It’s a diverse community with people from all over the world. There is a growing population of non-whites here, although it is mostly Asians and Hispanics, not blacks. Even among blacks, they come from many other places besides Chicago, including a fair number of African immigrants. Of the blacks I’ve worked with in my present job with the city, two were from families that had been in Iowa for generations, two were from Africa, one might have been from Chicago or somewhere like that, and another I never knew long enough to learn of his background; three of them, as far as I know, were married with young kids, and three had degrees from the local university.

Since I was a kid in the 1980s, violent crime has vastly decreased across the country. Iowa has always had low crime rates, violent and otherwise, and that is still the case. For more than a decade, violent crime in Johnson County, where Iowa City is located, has continued to drop. This is the very period during which there has been an increase in the minority population. There is actually less crime in Iowa now, with more minorities, than there was back when there were fewer minorities. Yet there is this public perception, largely based on mainstream news reporting, that everything is getting worse, despite the fact that Iowa has been doing well even during the recession.

The real fear is that German-Americans, Hispanics, blacks, or whatever group is most reviled at the moment is a danger to the American way of life. They are bringing bad things with them. And they are taking our country away from us. States like Iowa have always depended on immigration from other countries or simply from other states, but this dependence has bred resentment. When WWI came around, it didn’t matter that German immigrants had settled Iowa and cleared the land, had helped make America the country it is, and had shaped the entire cultural experience of the Heartland. Suddenly, they were strange, threatening foreigners.

The experience of blacks has been different, of course. They were considered a threat right from the start, even though most early blacks didn’t come to America by choice. Interestingly, before Anglo-Americans settled Iowa, there were already free blacks, likely escaped slaves, living right here in what is now Iowa City. Blacks were the first Iowa Citians, and yet today, after the era of sundown towns driving blacks out of states like Iowa, blacks are considered as foreign as those WWI-era German-Americans once were.

Donald Trump rides white outrage in gaining support as a presidential candidate. A century ago, his German-Scottish ancestry would have made him an untrustworthy outsider. But today he stands as the defender of American whiteness and promises to make America great again. Meanwhile, Hillary Clinton represents the last vestiges of the WASP elite, the supposedly rightful ruling class and disinterested aristocracy of professional politicians who for centuries have defended the status quo from uncouth ethnics like the Drumpf family, with their crude business wealth used to usurp political power (not to mention from meddling Jews such as Bernie Sanders). The uppity WASPs make their last stand to maintain the respectable political order.

WASPs never were the majority of the American population. But they have maintained most of the political power and social influence for centuries. As the non-WASP and non-white population grows, WASPs are slowly losing even their position and privilege. There are challengers on all sides, as the old WASP dream falters.

* * *

Previous blog posts:

America’s Heartland: Middle Colonies, Mid-Atlantic States and the Midwest

Centerville, IA: Meeting Point of Diversity & Conflict

The Cultural Amnesia of German-Americans

Equal Opportunity Oppression in America

The Fight For Freedom Is the Fight To Exist: Independence and Interdependence

Substance Control is Social Control

The Shame of Iowa and the Midwest

Paranoia of a Guilty Conscience

* * *

Online Articles:

The Great Chicago Migration Myth
by Mikel Livingston and Steven Porter, JConline

It was during the early 2000s when Curbelo, then a program coordinator at Iowa State University in Ames, first encountered the belief that an influx of former Chicago residents was wreaking havoc on local crime rates.

“That caused the police to start targeting minorities around town,” Curbelo said. “It led to harassing the minority population in a town that didn’t have a lot of diversity.”

A public forum in 2008 helped the community confront and move past the issue. When Curbelo moved to Lafayette earlier this year, he was surprised to be confronted with the notion yet again.

” ‘All people from Chicago are criminals, they’re black, they’re on welfare,’ ” Curbelo said, reciting the misconceptions. “No. They’re hard-working people looking for better opportunities. That’s part of the American dream and nobody can judge you for moving to a place to better your family by the color you are.”

The black ‘Pleasantville’ migration myth: moving from a city isn’t pleasant
by Robert Gutsche Jr

Ironically, Iowa City’s downtown – on the doorstep of the University of Iowa – continues to be more violent than the Southeast Side. Every weekend, white college students vandalize buildings, vomit on sidewalks, and assault each other, though it’s the Southeast Side – and its presumed Chicago migrants – who bear the brunt of the responsibility for the city’s crime.

How the Media Stokes Racism in Iowa City – and Everywhere
by Eleanor J. Bader, Truthout

Central to this discourse, of course, is the belief that low-income women, aka “welfare queens,” are taking advantage of government programs and feeding at the trough of public generosity. “Chicago has come to mean more than just another city,” Gutsche concludes. “It signals the ghetto, danger, blackness – and most directly, of not being from here.” That two-thirds of the low-income households registered with the Iowa City Housing Authority were elderly and disabled – not poor, black or from Chicago – went unacknowledged by reporters. Similarly, the drunken escapades of mostly white University of Iowa students have been depicted by reporters as essentially benign and developmentally appropriate. “Just as news coverage explained downtown violence as a natural college experience, news coverage normalized southeast side violence as being the effect of urban black culture,” Gutsche writes. “News stories indicated that drunken packs of college students were isolated to the downtown, whereas southeast side violence was described as infiltrating the city’s schools, social services and public safety.”

A community divided: Racial segregation on the rise in Iowa City
by Matthew Byrd, Little Village

Some renters felt the underlying presence of racial bias when discussing public assistance with Iowa City landlords […] There are other plausible explanations as well. A 2013 report issued by the Iowa City Coalition for Racial Justice found a high degree of overlap between race and class within Johnson County, with 40 percent of black residents living below the poverty line compared to 16 percent of whites. The fact that Iowa City is the fourteenth most segregated metropolitan area by income in the country, according to the Martin Prosperity Institute, means that, in a county where you are more likely to be poor if you’re black rather than white, segregation by income can also mean de facto segregation by race.

On a similar note, black residents in Iowa City are significantly more limited in their ability to take out mortgages than whites. The Public Policy Center study found that, while blacks comprise nearly 6 percent of the city’s overall population, they account for only 1 percent of housing loans and are much more likely than their white counterparts to be denied loans (the study’s authors do concede, however, that without access to credit scores they “cannot conclusively assert that the higher denial rates … is due to race”).

Whatever the case may be, the rate of racial segregation Iowa City experiences is disturbingly high.

Does Section 8 housing hurt a neighborhood?
The Gazette

In Iowa City, nine of 10 voucher holders are either elderly, disabled, or working. More than 85 percent of vouchers in the Corridor are issued locally, not to out-of-towners. Voucher holders who get in trouble with the law, who shelter people with criminal backgrounds, or who don’t return letters and phone calls are kicked out of the program.

“We review the police dockets and the newspapers on a daily basis,” said Steve Rackis, who heads up the program in Iowa City.

Within the past two years, 230 vouchers have been terminated in Cedar Rapids. Iowa City terminates about 10 vouchers each month. […]

Myth: Most Section 8 vouchers are held by people from Chicago.

Fact: 93 percent of vouchers in Cedar Rapids were issued locally. The program requires one year of residency and has a three- to five-year waiting list. 4.8 percent of voucher holders come from Illinois, representing about 50 households. In Iowa City, 9 percent of vouchers come from Illinois, representing about 114 households. […]

Myth: The cities of Cedar Rapids and Iowa City have billboards in Chicago encouraging Section 8 voucherholders to move to Eastern Iowa.

Fact: The Iowa Department of Economic Development occasionally runs billboards in Chicago encouraging people to move to Iowa, but they are geared toward professionals, extolling Iowa’s hassle-free commutes, for example. […]

Myth: Section 8 is mostly for people who don’t work but survive on welfare.

Fact: In Iowa City, 1,149 households in the program — 91 percent — are elderly, disabled or working. The same is true of 879 households in Cedar Rapids, or 82 percent of those in the program.

Leaving Chicago for Iowa’s “Fields of Opportunity”: Community Dispossession, Rootlessness, and the Quest for Somewhere to “Be OK”
by Danya E. Keene, Mark B. Padilla, & Arline T. Geronimus, NCBI

Iowa City and the surrounding Johnson County, located 200 miles west of Chicago, have received small but significant numbers of low-income African Americans from Chicago. The Iowa City Housing Authority (ICHA), which serves all of Johnson County, reported in 2007 that 14 percent (184) of the families that it assists through vouchers and public housing were from Illinois, and according to housing authority staff, virtually all of these families are from the Chicago area (Iowa City Housing Authority 2007). Additionally, the ICHA estimates that about one-third of the approximately 1,500 families on its rental-assistance waiting list are Chicago area families. Little is known about why families choose eastern Iowa as a destination, but speculation among ICHA officials is that the moves are motivated by shorter waiting lists for subsidized housing and the fact that Johnson County has a reputation for good schools, safe communities, and ample job opportunities.

From the perspective of a growing emphasis on poverty deconcentration in both academic and policy circles (Imbroscio 2008), leaving Chicago’s high poverty neighborhoods for Iowa’s white middle and working-class communities represents an idealized escape from urban poverty. However, the experiences of participants in this study speak to the challenges as well as the benefits of long distance moves to what are often referred to as “opportunity areas” (Venkatesh et al. 2004).

Little is known about the experience of Chicago families in Iowa, but preliminary evidence suggests that Chicago migrants may face many barriers to acceptance. Despite their relatively small numbers, African Americans from Chicago are visible outsiders in Iowa’s predominantly white communities. In Johnson County, blacks made up only 3.9 percent of the population in 2008, an increase from 2.9 percent in 2000 and higher than the 2008 state average of 2.9 percent (United States Census Bureau). Iowa City, a college town that is home to the University of Iowa, contains considerably more ethnic diversity than many Iowa communities and is home to a small number of African-American professionals, students, and faculty. However, the arrival of low-income African Americans from Chicago is a highly contentious issue and has given rise to a divisive local discourse that is often imbued with racialized and class-based stereotypes of urban areas.

The recent migration of urban African Americans to Iowa has also occurred in a climate of uncertainty about the state’s economic future (Wilson n.d.). Over the past few decades, Iowa has lost numerous sources of well-paying employment. The state has also experienced significant population losses, particularly among the college educated (Carr and Kefalas 2009). While college towns such as Iowa City have been somewhat protected from these demographic and economic shifts, in Johnson County, dramatic increases in free lunch program participation and growing demands for subsidized housing over the last decade indicate increasing local need (Wilson n.d.). According to documentary filmmaker Carla Wilson (n.d.), many Iowans feel that in the last few years, poor blacks from Chicago descended on the state, placing a tremendous burden on social service resources at a time when budgets are already stretched. As stated in one concerned letter from Don Sanders (personal communication, [February 3], 2004) to Iowa City’s City Council, “We’re turning into a mecca for out-of-state, high maintenance, welfare recipients. These often dysfunctional families are causing serious problems for our schools and police.” […]

Iowa is not only a place where the social terrain is unfamiliar, but a place where Chicago migrants experience a vulnerable status as stigmatized outsiders. As Danielle says, “It’s someone else’s city,” a place where, according to Marlene, “we are only here because they are letting us be here.” The stigmatization of Chicago migrants plays a profound role in shaping social relationships, both among fellow migrants and between Chicago migrants and Iowans. Several participants describe how Chicago is often blamed for “everything that goes wrong in Iowa City,” particularly in relation to drugs and crime. According to 53-year-old Diane Field, “It’s just, Chicago, Chicago, Chicago. I mean, everywhere you go they talk about us. There were drugs in Iowa long before anyone came from Chicago.” This association between drugs, crime, and Chicago is also prevalent in the local media. For example, one newspaper article about a fight in southeast Iowa City drew numerous racially charged on-line comments about the problems caused by Chicago migrants, despite the fact that “Chicago” was not even referenced in the article.

While participants describe the “helpfulness” of many Iowans, they also note that some oppose their presence. Carol, for example, says she was told by a fellow bus passenger, “I’m tired of all these black folks coming and messing up our small town. I don’t know why the hell y’all up in here, but y’all need to go back where you came from.” While Carol explains that encounters such as these are rare, Jonathan considers this attitude to be more pervasive. He says, “They don’t want us black people down here. Even though it’s some black people down here like me and my family that want something better for our life. They don’t understand that.”

Several participants describe facing discrimination specifically because of where they are from. In this context, 33-year-old Tanya Neeld says that she has begun telling people that she is from Indiana, Michigan, or “somewhere else, not Chicago.” Participants also describe attempts to differentiate themselves from those individuals who “bring Chicago to Iowa” (by getting involved with drugs, for example), by emphasizing their own desire to find a “better life” and to escape discursively condemned Chicago neighborhoods. Additionally, in order to resist the label of “just another one from Chicago,” many participants also describe keeping to themselves and avoiding relationships with other Chicagoans. For example, Michelle says, “They act like they really don’t want us here. They try to make like we keep up so much trouble. I don’t know what the rest of these people are doing. That’s why I stay to myself.”

Other participants describe avoiding, in particular, people in their immediate neighborhood, who were often fellow Chicagoans. A large portion of Chicago movers live in a few housing complexes on the southeast side of Iowa City, and several participants explain that it is difficult to find landlords elsewhere who will rent to them. Michelle says, “A lot of places here don’t accept Section 8 [rental assistance]. I figure it’s because they don’t want that type of thing in their neighborhood.” These sentiments were echoed by 25-year-old Christine Frazier, who says, “It sort of looks like they section us off.”

In the context of residential segregation and stigmatization, many participants also describe the challenges of forming ties with Iowans. A few explain that they actively avoid interactions with white Iowans as a form of self-protection. For example, Christine describes how when she first started working in Iowa, her coworkers, who were all white, left her out of their conversations and talked about her behind her back. She says that from this early experience, she learned to stay to herself at work. She says, “I still have my guards up. You know, it affected me when I got other jobs because I don’t want to interact.” Michelle describes how she has adapted to frequent encounters with racism in Iowa. She says, “I’m basically a friendly person, but I can be not friendly as well. So, that’s the way I cope with it. I just act like they don’t exist. I just stay in my own little world.”

Separation from social ties in Chicago and barriers to the formation of new ties in Iowa leave many former Chicagoans socially isolated and reliant on highly individualized strategies of survival. The desire to be self-sufficient is a common theme throughout the interviews, and in the context of social isolation, some participants may be left with no alternative to relying on themselves. As Tara says, “I don’t count on these people in this neighborhood. I count on myself because myself would not let my own self down.”

Without social rootedness, for many participants, Iowa is not a place to call home, just somewhere to be for a while in order to “do what you have to do.” Or, as Lakia says, “Living in Iowa is like doing a beat” (a reference, she explains, to a prison sentence). Without social ties, and in the context of stigma and economic vulnerability, the nature of this “beat” is also extremely fragile, and many participants have stories of friends and family who eventually returned to Chicago or moved on in search of somewhere else to “be OK.”

How do we make the strange familiar?

I’ve been simultaneously looking at two books: This Is Your Brain on Parasites by Kathleen McAuliffe and Stranger Than We Can Imagine by John Higgs. The two relate, with the latter offering a larger context for the former. The theme of both might well be summed up with the word ‘strange’. The world is strange and becoming ever stranger. We are becoming aware of how utterly bizarre the world is, both within us and all around us.

The first is not only about parasites, despite the catchy title. It goes far beyond just that. After all, most of the genetic material we carry around with us, including within our brains, is non-human. It’s not merely that we are part of environments, for we are environments. We are mobile ecosystems with boundaries that are fluid and permeable.

For a popular science book, it covers a surprising amount of territory and does so with more depth than one might expect. Much of the research discussed is preliminary and exploratory, as the various scientific fields have been slow to emerge. This might be because of how much they challenge the world as we know it and society as it is presently ordered. The author also details other psychological factors, such as the resistance humans have to dealing with topics of perceived disgust.

To summarize the book, McAuliffe explores the conclusions and implications of research involving parasitism and microbiomes in terms of neurocognitive functioning, behavioral tendencies, personality traits, political ideologies, population patterns, social structures, and culture. She offers some speculations from those involved in these fields, and what makes the speculations interesting is how they demonstrate the potential challenges of these new understandings. Whether or not we wish to take the knowledge and speculations seriously, the real-world consequences will remain to be dealt with somehow.

The most obvious line of thought is the powerful influence of environments. The world around us doesn’t just affect us. It shapes who we are at a deep level and so shapes our entire society. There is no way to separate the social world from the natural world. This isn’t fatalism, since we also shape our environments. The author points to the possibility that Western societies have liberalized at least partly because of the creation of healthier conditions that allow human flourishing. Not that long ago, all of the West was dominated by fairly extreme forms of social conservatism, violent ethnocentrism, authoritarian systems, etc. Yet in the generations following the creation of sewer systems, clean water, environmental regulations, and improved healthcare, there was a revolution in Western social values along with vast improvements in human development.

In terms of intelligence, some call this the Moral Flynn Effect, a convergence of diverse improvements. And there is no reason to assume it will stop or won’t spread further. We know the problems we face. We basically understand what those problems are, what causes them, and what alleviates them, even if nothing entirely eliminates them. So, we know what we should do, assuming we actually wanted to create a better world. Most importantly, we have the monetary wealth, natural resources, and human capacity to implement what needs to be done. It’s not a mystery, not beyond our comprehension and ability. But the general public has so far lacked this knowledge, for it takes a while for new information and understandings to spread — e.g., Enlightenment ideas developed over centuries, and it wasn’t until the movable type printing press became common that revolutions began. The ruling elite, as in the past, will join in solving these problems when fear of the masses forces them to finally act. Or else the present ruling elite will itself be eliminated, as happened in previous societies.

What is compelling about this book are the many causal links and correlations shown. It matches closely with what is seen from other fields, forming a picture that can’t be ignored. It’s probably no accident that ethnocentric populations, socially conservative societies, authoritarian governments, and strict religions all happen to be found where there are high rates of disease, parasites, toxins, malnutrition, stress, poverty, inequality, etc — all the conditions that stunt and/or alter physical, neurocognitive, and psychological development.

For anti-democratic ruling elites, there is probably an intuitive or even conscious understanding that the only way to maintain social control is by keeping the masses to some degree unhealthy and stunted. If you let people develop more of their potential, they will start demanding more. If you let intelligence increase and education improve, individuals will start thinking for themselves and the public will start imagining new possibilities.

Maybe it’s unsurprising that American conservatives have seen the greatest threat not just in public education but, more importantly, in public health. The political right doesn’t fear the failures of the political left, the supposedly wasted use of tax money. No, what they fear is that key leftist policies have been proven to work. The healthier, smarter, and better educated people become, the more they develop attitudes of social liberalism and anti-authoritarianism, which leads toward the possibility of radical imagination and radical action. Until people are free to more fully develop their potentials, freedom is a meaningless and empty abstraction. The last thing the political right wants, and sadly this includes many mainstream ‘liberals’, is a genuinely free population.

This creates a problem. The trajectory of Western civilization for centuries has been the improvement of all these conditions, which seems to near inevitably create a progressive society. That isn’t to say the West is perfect. Far from it. There is nothing special about the West, and even in the West there are still large parts of the population living in severe deprivation and oppression. But imagine what kind of world it would be if universal healthcare and education were provided to every person on the planet. This is within the realm of possibility at this very moment, if we chose to invest our resources in this way. In a single generation, we could transform civilization and solve (or at least shrink to manageable size) the worst social problems. There is absolutely nothing stopping us but ourselves. Instead, Western governments have been using their vast wealth and power to dominate other countries, making the world a worse place in the process and helping to create the very conditions that further undermine any hope for freedom and democracy. Blowing up hospitals, destroying infrastructure, and banning trade won’t lead to healthier and more peaceful populations; if anything, the complete opposite.

A thought occurred to me. If environmental conditions are so important to how individuals and societies form, then maybe political ideologies are less key than we think, or else not as important in the way we normally think about them. Our beliefs about our society might be more result than cause (perhaps limited healthcare availability in the American South, for example, has been a central factor in maintaining its historical conservatism and authoritarianism). We have a hard time thinking outside of the conditions that have shaped our very minds.

That isn’t to say there is no feedback loop whereby ideology can reinforce the conditions that made it possible. The point is that free individuals aren’t fully possible in an unfree society, where individuals aren’t free on a practical level to develop toward optimal health and ability. As such, fights over ideology miss an important point. The actual fight needs to be over the conditions that precede any particular ideological framing and conflict. On a practical level, we would be better off investing money and resources where they are needed most, in ways that practically improve lives, rather than simply imprisoning populations into submission or bombing entire societies into oblivion, either of which worsens the problems both for those people and for everyone else. The best way to fight crime and terrorism would be to improve the lives of all people. Imagine that!

The only reason we can have a public debate now is that conditions in our society have finally improved just enough for these issues to be comprehensible, as we have begun to see their real-world impact in improving society. It would have been fruitless to attempt a public debate about public goods such as healthcare and education in centuries past, when even the notion of a ‘public’ still seemed radical. The conditions for a public with a voice to be heard had to first be created. Once that was in place, it is unsurprising that it took radicals like socialists to push to the next level by suggesting the creation of public sanitation and public bakeries, based on the idea that health was a priority, if not an individual right then a social responsibility. Now, these kinds of socialist policies have become the norm in Western societies, the most basic level of a social safety net.

As I began reading McAuliffe’s book, I came across Higgs’ book. It wasn’t immediately apparent that there was a connection between the two. Reading some reviews and interviews showed the importance Higgs placed on the role (hyper-)individualism has played this past century. And upon perusing the book, I saw that he understood how this goes beyond philosophy and politics, touching upon every aspect of our society, most certainly including science.

It was useful thinking about the issue of micro-organisms in a larger historical context. McAuliffe doesn’t shy away from the greater implications, but her writing was focused on a single area of study. To both of these books, we could also add such things as the research on epigenetics, which might further transform our entire understanding of humanity. Taken together, it is clear that we are teetering on the edge of a paradigm shift, of an extent seen only a few times before. We live in a transitional era, but it isn’t a smooth transition. As Higgs argues, the 20th century was a rupture: what developed within it cannot be fully explained by what came before.

We are barely beginning to scratch the surface of our own ignorance, which is to say our potential new knowledge. We know just enough to realize how wrong mainstream views have been in the past. Our society was built upon and has been operating according to beliefs that have been proven partial, inaccurate, and false. The world is more complex and fascinating than we previously acknowledged.

Realizing we have been so wrong, how do we make it right going forward? What will it take for us to finally confront what we’ve ignored for so long? How do we make the strange familiar?

* * *

Donald Trump: Stranger Than We Can Imagine?
by David McConkey

Why Jeremy Corbyn makes sense in the age of the selfie
By John Higgs

Stranger Than We Can Imagine:
Making Sense of the Twentieth Century
by John Higgs
pp. 308-310

In the words of the American social physicist Alex Pentland, “It is time that we dropped the fiction of individuals as the unit of rationality, and recognised that our rationality is largely determined by the surrounding social fabric. Instead of being actors in markets, we are collaborators in determining the public good.” Pentland and his team distributed smartphones loaded with tracking software to a number of communities in order to study the vast amount of data the daily interactions of large groups generated. They found that the overriding factor in a whole range of issues, from income to weight gain and voting intentions, was not individual free will but the influence of others. The most significant factor deciding whether you would eat a doughnut was not willpower or good intentions, but whether everyone else in the office took one. As Pentland discovered, “The single biggest factor driving adoption of new behaviours was the behaviour of peers. Put another way, the effects of this implicit social learning were roughly the same size as the influence of your genes on your behaviour, or your IQ on your academic performance.”

A similar story is told by the research into child development and neuroscience. An infant is not born with language, logic and an understanding of how to behave in society. They are instead primed to acquire these skills from others. Studies of children who have been isolated from the age of about six months, such as those abandoned in the Romanian orphanages under the dictatorship of Nicolae Ceauşescu, show that they can never recover from the lost social interaction at that crucial age. We need others, it turns out, in order to develop to the point where we’re able to convince ourselves that we don’t need others.

Many aspects of our behaviour only make sense when we understand their social role. Laughter, for example, creates social bonding and strengthens ties within a group. Evolution did not make us make those strange noises for our own benefit. In light of this, it is interesting that there is so much humour on the internet.

Neuroscientists have come to view our sense of “self,” the idea that we are a single entity making rational decisions, as no more than a quirk of the mind. Brain-scanning experiments have shown that the mental processes that lead to an action, such as deciding to press a button, occur a significant period before the conscious brain believes it makes the decision to press the button. This does not indicate a rational individual exercising free will. It portrays the conscious mind as more of a spin doctor than a decision maker, rationalising the actions of the unconscious mind after the fact. As the Canadian-British psychologist Bruce Hood writes, “Our brain creates the experience of our self as a model – a cohesive, integrated character – to make sense of the multitude of experiences that assault our senses throughout our lifetime.”

In biology an “individual” is an increasingly complicated word to define. A human body, for example, contains ten times more non-human bacteria than it does human cells. Understanding the interaction between the two, from the immune system to the digestive organs, is necessary to understand how we work. This means that the only way to study a human is to study something more than that human.

Individualism trains us to think of ourselves as isolated, self-willed units. That description is not sufficient, either biologically, socially, psychologically, emotionally or culturally. This can be difficult to accept if you were raised in the twentieth century, particularly if your politics use the idea of a free individual as your primary touchstone. The promotion of individualism can become a core part of a person’s identity, and something that must be defended. This is ironic, because where did that idea come from? Was it created by the person who defends their individualism? Does it belong to them? In truth, that idea was, like most ideas, just passing through.

* * *

Social Conditions of an Individual’s Condition

Uncomfortable Questions About Ideology

To Put the Rat Back in the Rat Park

Rationalizing the Rat Race, Imagining the Rat Park

Social Disorder, Mental Disorder

The Desperate Acting Desperately

Homelessness and Mental Illness

It’s All Your Fault, You Fat Loser!

Morality-Punishment Link

Denying the Agency of the Subordinate Class

Freedom From Want, Freedom to Imagine

Ideological Realism & Scarcity of Imagination

The Unimagined: Capitalism and Crappiness

Neoliberalism: Dream & Reality

Moral Flynn Effect?

Racists Losing Ground: Moral Flynn Effect?

Immoral/Amoral Flynn Effect?

Of Mice and Men and Environments

What do we inherit? And from whom?

Radical & Moderate Enlightenments: Revolution & Reaction, Science & Religion

No One Knows