Social Construction & Ideological Abstraction

The following passages from two books help to explain what social construction is. As society has developed along a particular path, abstract thought has become increasingly dominant.

But we modern people take abstractions for granted, and we often don’t even recognize abstractions for what they are. Many abstractions simply become reality as we know it. They are ‘looped’ into existence, as with race realism, capitalist realism, etc.

Ideological abstractions become so pervasive and systemic that we lose the capacity to think outside of them. They form our reality tunnel.

This wasn’t always so. Humans used to conceive of, and hence perceive, the world far differently. And this shaped their sense of identity, in ways that are hard for us to imagine.

* * *

Dynamics of Human Biocultural Diversity:
A Unified Approach

by Elisa J. Sobo
(Kindle Locations 94-104)

Until now, many biocultural anthropologists have focused mainly on the ‘bio’ half of the equation, using ‘biocultural’ generically, like biology, to refer to genetic, anatomical, physiological, and related features of the human body that vary across cultural groups. The number of scholars with a more sophisticated approach is on the upswing, but they often write only for super-educated expert audiences. Accordingly, although introductory biocultural anthropology texts make some attempt to acknowledge the role of culture, most still treat culture as an external variable— as an add-on to an essentially biological system. Most fail to present a model of biocultural diversity that gives adequate weight to the cultural side of things.

Note that I said most, not all: happily, things are changing. A movement is afoot to take anthropology’s claim of holism more seriously by doing more to connect— or reconnect— perspectives from both sides of the fence. Ironically, prior to the industrial revolution and the rise of the modern university, most thinkers took a very comprehensive view of the human condition. It was only afterward that fragmented, factorial, compartmental thinking began to undermine our ability to understand ourselves and our place in— and connection with— the world. Today, the leading edge of science recognizes the links and interdependencies that such thinking keeps falsely hidden.

Nature, Human Nature, and Human Difference:
Race in Early Modern Philosophy
by Justin E. H. Smith

pp. 9-10

The connection to the problem of race should be obvious: kinds of people are to no small extent administered into being, brought into existence through record keeping, census taking, and, indeed, bills of sale. A census form asks whether a citizen is “white,” and the possibility of answering this question affirmatively helps to bring into being a subkind of the human species that is by no means simply there and given, ready to be picked out, prior to the emergence of social practices such as the census. Censuses, in part, bring white people into existence, but once they are in existence they easily come to appear as if they had been there all along. This is in part what Hacking means by “looping”: human kinds, in contrast with properly natural kinds such as helium or water, come to be what they are in large part as a result of the human act of identifying them as this or that. Two millennia ago no one thought of themselves as neurotic, or straight, or white, and nothing has changed in human biology in the meantime that could explain how these categories came into being on their own. This is not to say that no one is melancholic, neurotic, straight, white, and so on, but only that how that person got to be that way cannot be accounted for in the same way as, say, how birds evolved the ability to fly, or how iron oxidizes.

In some cases, such as the diagnosis of mental illness, kinds of people are looped into existence out of a desire, successful or not, to help them. Racial categories seem to have been looped into existence, by contrast, for the facilitation of the systematic exploitation of certain groups of people by others. Again, the categories facilitate the exploitation in large part because of the way moral status flows from legal status. Why can the one man be enslaved, and the other not? Because the one belongs to the natural-seeming kind of people that is suitable for enslavement. This reasoning is tautological from the outside, yet self-evident from within. Edward Long, as we have seen, provides a vivid illustration of it in his defense of plantation labor in Jamaica. But again, categories cannot be made to stick on the slightest whim of their would-be coiner. They must build upon habits of thinking that are already somewhat in place. And this is where the history of natural science becomes crucial for understanding the history of modern racial thinking, for the latter built directly upon innovations in the former. Modern racial thinking could not have taken the form it did if it had not been able to piggyback, so to speak, on conceptual innovations in the way science was beginning to approach the diversity of the natural world, and in particular of the living world.

This much ought to be obvious: racial thinking could not have been biologized if there were no emerging science of biology. It may be worthwhile to dwell on this obvious point, however, and to see what more unexpected insights might be drawn out of it. What might not be so obvious, or what seems to be ever in need of renewed pointing out, is a point that ought to be of importance for our understanding of the differing, yet ideally parallel, scope and aims of the natural and social sciences: the emergence of racial categories, of categories of kinds of humans, may in large part be understood as an overextension of the project of biological classification that was proving so successful in the same period. We might go further, and suggest that all of the subsequent kinds of people that would emerge over the course of the nineteenth and twentieth centuries, the kinds of central interest to Foucault and Hacking, amount to a further reaching still, an unprecedented, peculiarly modern ambition to make sense of the slightest variations within the human species as if these were themselves species differentia. Thus for example Foucault’s well-known argument that until the nineteenth century there was no such thing as “the homosexual,” but only people whose desires could impel them to do various things at various times. But the last two centuries have witnessed a proliferation of purportedly natural kinds of humans, a typology of “extroverts,” “depressives,” and so on, whose objects are generally spoken of as if on an ontological par with elephants and slime molds. Things were not always this way. In fact, as we will see, they were not yet this way throughout much of the early part of the period we call “modern.”

Time and Trauma

And I think of that “Groundhog Day” movie with Bill Murray in which he repeats the same day, again and again, with only minor changes. If you’ve seen the movie, Murray finally breaks out of what appears to be an infinite loop only when he changes his ways, his approach to life, his mentality. He becomes a better person and even gets the girl.

When is the USA going to break out of its infinite loop of war? Only when we change our culture, our mentality.

A “war on terror” is a forever war, an infinite loop, in which the same place names and similar actions crop up again and again. Names like Mosul and Helmand province. Actions like reprisals and war crimes and the deaths of innocents, because that is the face of war.

~W.J. Astore, Happy 4th of July! And a Global War on Something

* * *

The impression we form is that it is not that linear time perception or experience has been corrupted by trauma; it is that time “itself” has been traumatized — so that we come to comprehend “history” not as a random sequence of events, but as a series of traumatic clusters. This broken time, this sense of history as a malign repetition, is “experienced” as seizure and breakdown; I have placed “experienced” in inverted commas here because the kind of voiding interruption of subjectivity seems to obliterate the very conditions that allow experience to happen.

It is as if the combination of adolescent erotic energy with an inorganic artefact … produces a trigger for a repeating of the ancient legend. It is not clear that “repeating” is the right word here, though. It might be better to say that the myth has been re-instantiated, with the myth being understood as a kind of structure that can be implemented whenever the conditions are right. But the myth doesn’t repeat so much as it abducts individuals out of linear time and into its “own” time, in which each iteration of the myth is in some sense always the first time.

…the mythic is part of the virtual infrastructure which makes human life as such possible. It is not the case that first of all there are human beings, and the mythic arrives afterwards, as a kind of cultural carapace added to a biological core. Humans are from the start — or from before the start, before the birth of the individual — enmeshed in mythic structures.

~Mark Fisher, Eerie Thanatos, The Weird and the Eerie (pp. 96-97)

A Neverending Revolution of the Mind

In a recent book, Juliet Barker offers a new perspective on an old event (1381: The Year of the Peasants’ Revolt, Kindle Locations 41-48):

“In the summer of 1381 England erupted in a violent popular uprising that was as unexpected as it was unprecedented. Previous rebellions had always been led by ambitious and discontented noblemen seeking to overthrow the government and seize power for themselves. The so-called ‘Peasants’ Revolt’ was led by commoners— most famously Wat Tyler, Jack Straw and John Balle— whose origins were obscure and whose moment at the forefront of events was brief. Even more unusually, they did not seek personal advancement but a radical political agenda which, if it had been implemented, would fundamentally have transformed English society: the abolition of serfdom and the dues and services owed by tenants to their lord of the manor; freedom from tolls and customs on buying and selling goods throughout the country; the recognition of a man’s right to work for whom he chose at the wages he chose; the state’s seizure of the Church’s wealth and property. Their demands anticipated the French Revolution by four hundred years.”

Our understanding of the origins of modernity keeps being pushed back. It used to be thought that the American Revolution was the first modern revolution. But it was preceded by generations of revolts against the colonial elite. And before that was the English Civil War, which increasingly is seen as the first modern revolution. We might have to push it even further back, to the Peasants’ Revolt.

It makes sense when you know some of the historical background. England had become a major center of wool production, which unintentionally undermined the feudal order. The reason is that an entire community of feudal peasants isn’t necessary for herding sheep, in the way it had been for traditional agriculture. So, by the time the Peasants’ Revolt came around, much of the peasant population had already experienced several centuries of increasing irrelevance. This would continue into the Enlightenment Age, when the enclosure movement took hold and masses of landless peasants flooded into the cities.

It’s interesting that the pressure on the social order was already being felt that far back, almost jumpstarting the modern revolutionary era four centuries early. Those commoners were already beginning to think of themselves as more than mere cogs in the machinery of feudalism. They anticipated the possibility of becoming agents of their own fate. It was the origin of modern class identity and class war, at least for Anglo-American society.

There were other changes happening around then. It was the beginning of the Renaissance. This brought ancient Greek philosophy, science, and politics back into Western thought. The new old ideas spread quickly through the invention of the movable-type printing press and the increasing use of vernacular language. And that directly made the Enlightenment possible.

The Italian city-states and colonial empires were becoming greater influences, bringing with them the new economic systems of capitalism and corporatism. The Italian city-states, in the High Middle Ages, had also initiated the advocacy of anti-monarchism and liberty-oriented republicanism. Related to this, humanism became a major concern, as taught by the ancient Sophists, with Protagoras famously stating that “Man is the measure of all things.” And with this came early developments in psychological thought, such as the radical notion that everyone shares the same basic human nature. Diverse societies had growing contact, and so cultural differences became an issue, provoking difficult questions and adding to a sense of uncertainty and doubt.

Individual identity and social relationships were being transformed in a way not seen since the Axial Age. Proto-feudalism had developed in the Roman Empire and, once established, feudalism lasted for more than a millennium. It wasn’t just a social order but an entire worldview, a way of being in and part of a shared world. Every aspect of life was structured by it. The slow unraveling inevitably led to increasing radicalism, as what it meant to be human was redefined and re-envisioned.

My thoughts continuously return to these historical changes. I can’t shake the feeling that we are living through another such period of societal transformation. But as during any major shift in consciousness, the outward results are hard to understand or sometimes even to notice, at least in terms of their ultimate consequences. That is, until they result in an uprising of the masses and sometimes a complete overthrow of established power. Considering that ever-present possibility and looming threat, it might be wise to question how stable our present social order is, along with the human identity it is based upon.

These thoughts are inspired by other books I’ve been reading. The ideas I regularly return to are Julian Jaynes’ bicameralism and the related speculations of those who were inspired by him, such as Iain McGilchrist. Most recently, I found useful insight in two books whose authors were new to me: Consciousness by Susan Blackmore and A Skeptic’s Guide to the Mind by Robert Burton.

Those authors offer overviews that question and criticize many common views, specifically the Enlightenment ideal of individuality, in considering issues of embodiment and affect, the extended self and the bundled self. These aren’t just new theories that academics preoccupy themselves with for reasons of entertainment and job security. They are ideas with much earlier origins; dismissed for so long because they didn’t fit the prevailing paradigm, they are only now being taken seriously. The past century produced an onslaught of research findings that continuously challenged what we thought we knew.

This shift is in some ways a return to a different tradition of radical thought. John Locke was radical enough for his day, although his radicalism was hidden behind pieties. Even more radical was a possible influence on Locke, Baruch Spinoza; Wim Klever goes so far as to see crypto-quotations of Spinoza in Locke’s writings. Spinoza was an Enlightenment thinker who focused not just on what it meant to be human but on what it meant to be a human in the world. What kind of world is this? Unlike Locke’s, his writings weren’t narrowly focused on politics, governments, constitutions, etc. Even so, Matthew Stewart argues that, through Locke’s writings, Spinozism was a hidden impulse that fueled the fires of the American Revolution, taking form and force through the working-class radicalism described in Nature’s God.

Spinozism has been revived in many areas of study, such as the growing body of work about affect. Never fully appreciated in his lifetime, his radicalism continues to inform and inspire innovative thinking. As Renaissance ideas took centuries to finally displace what came before, Spinoza’s ideas are slowly but powerfully helping to remake the modern mind. I’d like to believe that a remaking of the modern world will follow.

I just started an even more interesting book, Immaterial Bodies by Lisa Blackman. She does briefly discuss Spinoza, but her framing concern is the relationship “between the humanities and the sciences (particularly the life, neurological and psychological sciences).” She looks at more recent developments of thought, including that of Jaynes and McGilchrist. Specifically, she unpacks the ideological self-identity we’ve inherited.

To argue for or to simply assume a particular social construct about our humanity is to defend a particular social order and thus to enforce a particular social control. She makes a compelling case for viewing neoliberalism as more than a mere economic and political system. The greatest form of control isn’t only controlling how people are allowed to act and relate but, first and foremost, how they are able to think about themselves and the world around them. In speaking about neoliberalism, she quotes Fernando Vidal (Kindle Locations 3979-3981):

“The individualism characteristic of western and westernized societies, the supreme value given to the individual as autonomous agent of choice and initiative, and the corresponding emphasis on interiority at the expense of social bonds and contexts, are sustained by the brain-hood ideology and reproduced by neurocultural discourses.”

Along with mentioning Spinoza, Blackman does give some historical background, such as in the following. And as a bonus, it is placed in the even larger context of Jaynes’ thought. She writes (Kindle Locations 3712-3724):

“Dennett, along with other scientists interested in the problem of consciousness (see Kuijsten, 2006), has identified Jaynes’s thesis as providing a bridge between matter and inwardness, or what I would prefer to term the material and immaterial. Dennett equates this to the difference between a brick and a bricklayer, where agency and sentience are only accorded to the bricklayer and never to the brick. For Dennett, under certain conditions we might have some sense of what it means to be a bricklayer, but it is doubtful, within the specificities of consciousness as we currently know and understand it, that we could ever know what it might mean to be a brick. This argument might be more usefully extended within the humanities by considering the difference between understanding the body as an entity and as a process. The concept of the body as having a ‘thing-like’ quality, where the body is reconceived as a form of property, is one that has taken on a truth status since at least its incorporation into the Habeas Corpus Act of 1679 (see Cohen, 2009). As Cohen (2009: 81) suggests, ‘determining the body as the legal location of the person radically reimagines both the ontological and political basis of person-hood’. This act conceives the body as an object possessed or owned by individuals, what Cohen (2009) terms a form of ‘biopolitical individualization’. Within this normative conception of corporeality bodies are primarily material objects that can be studied in terms of their physicochemical processes, and are objects owned by individuals who can maintain and work upon them in order to increase the individual’s physical and cultural capital.”

In her epilogue, she presents a question by Catherine Malabou (Kindle Locations 4014-4015): “What should we do so that consciousness of the brain does not purely and simply coincide with the spirit of capitalism?” The context changes as the social order changes, from feudalism to colonialism and now capitalism. But phrased in various ways, it is the same question that has been asked for centuries.

Another interesting question to ask is: by what right? It is more than a question. It is a demand to prove the authority behind an action. And relevant to my thoughts here, it has historical roots in feudalism. It’s like asking someone, who do you think you are to tell me what to do? Inherent in this inquiry is one’s position in the prevailing social order, whether feudal lords challenging the king’s authority or peasants challenging those feudal lords. The issue isn’t only who we are and what we are allowed to do based on that, but who or what gets to define who we are, our human nature and social identity.

Such questions always have a tinge of the revolutionary, even if only in potential. Once people begin questioning, established attitudes and identities have already become unmoored and are drifting. The act of questioning is itself radical, no matter what the eventual answers. The doubting mind is ever poised on a knife edge.

The increasing pressure put on peasants, especially once they became landless, let loose both individuals and identities. This incited radical new thought and action. As yet another underclass forms, that of the imprisoned and permanently unemployed who even now make up a tenth of the population, what will this lead to? Throwing people into desperation with few opportunities and lots of time on their hands tends to lead to disruptive outcomes, sometimes even revolution.

Radicalism means to go to the root, and there is nothing more radical than going to the root of our shared humanity. As such questions are asked, those in power won’t be happy with the answers found. But at this point, it is already too late to stop what will follow. We are on our way.

Imagination: Moral, Dark, and Radical

Absence is presence.
These are the fundamentals of mystery.
~The Young Pope

Below is a gathering of excerpts from writings. The key issue here is imagination, specifically Edmund Burke’s moral imagination with its wardrobe but also the dark imagination and the radical imagination. I bring in some other thinkers for context: Thomas Paine, Corey Robin, Thomas Ligotti, Lewis Hyde, and Julian Jaynes.

Besides imagination, the connecting strands of thought are:

  • Pleasure, beauty, and sublimity; comfort, familiarity, intimacy, the personal, and subjectivity; embodiment, anchoring, shame, and nakedness; pain, violence, suffering, and death;
  • Darkness, awe, fear, terror, horror, and the monstrous; oppression, prejudice, and ignorance; obfuscation, obscurity, disconnection, and dissociation; the hidden, the veiled, the unknown, and the distant; mystery, madness, and deception;
  • Identity, consciousness, and metaphor; creativity, art, story, poetry, and rhetoric; literalism, realism, and dogmatism; reason, knowledge, and science;
  • Enlightenment, abstractions, ideology, revolution, and counter-revolution; nobility, power, chivalry, aristocracy, and monarchy; tradition, nostalgia, and the reactionary mind; liberalism, conservatism, and culture wars;
  • Et cetera.

The touchstone for my own thinking is what I call symbolic conflation, along with the larger context of conceptual slippage, social construction, and reality tunnels. This is closely related to what Lewis Hyde discusses in terms of metonymy, liminality, and the Trickster archetype.

Read the following as a contemplation of ideas and insights. In various ways, they connect, overlap, and resonate. Soften your focus and you might see patterns emerge. If these are all different perspectives of the same thing, what exactly is it that is being perceived? What does each view say about the individual espousing it and if not necessarily about all of humanity at least about our society?

(I must admit that my motivation for this post was mainly personal. I simply wanted to gather these writings together. They include some writings and writers that I have been thinking about for a long time. Quotes and passages from many of them can be found in previous posts on this blog. I brought them together here for the purposes of my own thinking about certain topics. I don’t post stuff like this with much expectation that it will interest anyone else, as I realize my own interests are idiosyncratic. Still, if someone comes along and finds a post like this fascinating, then I’ll know they are my soulmate. This post is only for cool people with curious minds. Ha!)

* * *

On the Sublime and Beautiful
by Edmund Burke

Of the Passion Caused by the Sublime

THE PASSION caused by the great and sublime in nature, when those causes operate most powerfully, is astonishment; and astonishment is that state of the soul, in which all its motions are suspended, with some degree of horror. In this case the mind is so entirely filled with its object, that it cannot entertain any other, nor by consequence reason on that object which employs it. Hence arises the great power of the sublime, that, far from being produced by them, it anticipates our reasonings, and hurries us on by an irresistible force. Astonishment, as I have said, is the effect of the sublime in its highest degree; the inferior effects are admiration, reverence, and respect.

Terror

NO passion so effectually robs the mind of all its powers of acting and reasoning as fear. For fear being an apprehension of pain or death, it operates in a manner that resembles actual pain. Whatever therefore is terrible, with regard to sight, is sublime too, whether this cause of terror be endued with greatness of dimensions or not; for it is impossible to look on anything as trifling, or contemptible, that may be dangerous. There are many animals, who though far from being large, are yet capable of raising ideas of the sublime, because they are considered as objects of terror. As serpents and poisonous animals of almost all kinds. And to things of great dimensions, if we annex an adventitious idea of terror, they become without comparison greater. A level plain of a vast extent on land, is certainly no mean idea; the prospect of such a plain may be as extensive as a prospect of the ocean: but can it ever fill the mind with anything so great as the ocean itself? This is owing to several causes; but it is owing to none more than this, that the ocean is an object of no small terror. Indeed, terror is in all cases whatsoever, either more openly or latently, the ruling principle of the sublime. Several languages bear a strong testimony to the affinity of these ideas. They frequently use the same word, to signify indifferently the modes of astonishment or admiration, and those of terror. [Greek] is in Greek, either fear or wonder; [Greek] is terrible or respectable; [Greek], to reverence or to fear. Vereor in Latin, is what [Greek] is in Greek. The Romans used the verb stupeo, a term which strongly marks the state of an astonished mind, to express the effect of either of simple fear or of astonishment; the word attonitus (thunder-struck) is equally expressive of the alliance of these ideas; and do not the French étonnement, and the English astonishment and amazement, point out as clearly the kindred emotions which attend fear and wonder? They who have a more general knowledge of languages, could produce, I make no doubt, many other and equally striking examples.

Obscurity

TO make anything very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes. Every one will be sensible of this, who considers how greatly night adds to our dread, in all cases of danger, and how much the notions of ghosts and goblins, of which none can form clear ideas, affect minds which give credit to the popular tales concerning such sorts of beings. Those despotic governments, which are founded on the passions of men, and principally upon the passion of fear, keep their chief as much as may be from the public eye. The policy has been the same in many cases of religion. Almost all the heathen temples were dark. Even in the barbarous temples of the Americans at this day, they keep their idol in a dark part of the hut, which is consecrated to his worship. For this purpose too the Druids performed all their ceremonies in the bosom of the darkest woods, and in the shade of the oldest and most spreading oaks. No person seems better to have understood the secret of heightening, or of setting terrible things, if I may use the expression, in their strongest light, by the force of a judicious obscurity, than Milton. His description of Death in the second book is admirably studied; it is astonishing with what a gloomy pomp, with what a significant and expressive uncertainty of strokes and colouring, he has finished the portrait of the king of terrors:

—The other shape,
If shape it might be called that shape had none
Distinguishable, in member, joint, or limb;
Or substance might be called that shadow seemed;
For each seemed either; black he stood as night;
Fierce as ten furies; terrible as hell;
And shook a deadly dart. What seemed his head
The likeness of a kingly crown had on.

In this description all is dark, uncertain, confused, terrible, and sublime to the last degree. […]

The Same Subject Continued

[…] I know several who admire and love painting, and yet who regard the objects of their admiration in that art with coolness enough in comparison of that warmth with which they are animated by affecting pieces of poetry or rhetoric. Among the common sort of people, I never could perceive that painting had much influence on their passions. It is true, that the best sorts of painting, as well as the best sorts of poetry, are not much understood in that sphere. But it is most certain, that their passions are very strongly roused by a fanatic preacher, or by the ballads of Chevy-chase, or the Children in the Wood, and by other little popular poems and tales that are current in that rank of life. I do not know of any paintings, bad or good, that produce the same effect. So that poetry, with all its obscurity, has a more general, as well as a more powerful, dominion over the passions, than the other art. And I think there are reasons in nature, why the obscure idea, when properly conveyed, should be more affecting than the clear. It is our ignorance of things that causes all our admiration, and chiefly excites our passions. Knowledge and acquaintance make the most striking causes affect but little. It is thus with the vulgar; and all men are as the vulgar in what they do not understand. The ideas of eternity and infinity are among the most affecting we have; and yet perhaps there is nothing of which we really understand so little, as of infinity and eternity. […]

Locke’s Opinion Concerning Darkness Considered

IT is Mr. Locke’s opinion, that darkness is not naturally an idea of terror; and that, though an excessive light is painful to the sense, the greatest excess of darkness is no ways troublesome. He observes indeed in another place, that a nurse or an old woman having once associated the idea of ghosts and goblins with that of darkness, night, ever after, becomes painful and horrible to the imagination. The authority of this great man is doubtless as great as that of any man can be, and it seems to stand in the way of our general principle. We have considered darkness as a cause of the sublime; and we have all along considered the sublime as depending on some modification of pain or terror: so that if darkness be no way painful or terrible to any, who have not had their minds early tainted with superstitions, it can be no source of the sublime to them. But, with all deference to such an authority, it seems to me, that an association of a more general nature, an association which takes in all mankind, may make darkness terrible; for in utter darkness it is impossible to know in what degree of safety we stand; we are ignorant of the objects that surround us; we may every moment strike against some dangerous obstruction; we may fall down a precipice the first step we take; and if an enemy approach, we know not in what quarter to defend ourselves; in such a case strength is no sure protection; wisdom can only act by guess; the boldest are staggered, and he, who would pray for nothing else towards his defence, is forced to pray for light.

As to the association of ghosts and goblins; surely it is more natural to think, that darkness, being originally an idea of terror, was chosen as a fit scene for such terrible representations, than that such representations have made darkness terrible. The mind of man very easily slides into an error of the former sort; but it is very hard to imagine, that the effect of an idea so universally terrible in all times, and in all countries, as darkness, could possibly have been owing to a set of idle stories, or to any cause of a nature so trivial, and of an operation so precarious.

Reflections on the French Revolution
by Edmund Burke

History will record, that on the morning of the 6th of October, 1789, the king and queen of France, after a day of confusion, alarm, dismay, and slaughter, lay down, under the pledged security of public faith, to indulge nature in a few hours of respite, and troubled, melancholy repose. From this sleep the queen was first startled by the voice of the sentinel at her door, who cried out to her to save herself by flight—that this was the last proof of fidelity he could give—that they were upon him, and he was dead. Instantly he was cut down. A band of cruel ruffians and assassins, reeking with his blood, rushed into the chamber of the queen, and pierced with a hundred strokes of bayonets and poniards the bed, from whence this persecuted woman had but just time to fly almost naked, and, through ways unknown to the murderers, had escaped to seek refuge at the feet of a king and husband, not secure of his own life for a moment.

This king, to say no more of him, and this queen, and their infant children, (who once would have been the pride and hope of a great and generous people,) were then forced to abandon the sanctuary of the most splendid palace in the world, which they left swimming in blood, polluted by massacre, and strewed with scattered limbs and mutilated carcases. Thence they were conducted into the capital of their kingdom. […]

It is now sixteen or seventeen years since I saw the queen of France, then the dauphiness, at Versailles; and surely never lighted on this orb, which she hardly seemed to touch, a more delightful vision. I saw her just above the horizon, decorating and cheering the elevated sphere she just began to move in,—glittering like the morning-star, full of life, and splendour, and joy. Oh! what a revolution! and what a heart must I have to contemplate without emotion that elevation and that fall! Little did I dream when she added titles of veneration to those of enthusiastic, distant, respectful love, that she should ever be obliged to carry the sharp antidote against disgrace concealed in that bosom; little did I dream that I should have lived to see such disasters fallen upon her in a nation of gallant men, in a nation of men of honour, and of cavaliers. I thought ten thousand swords must have leaped from their scabbards to avenge even a look that threatened her with insult. But the age of chivalry is gone. That of sophisters, economists, and calculators, has succeeded; and the glory of Europe is extinguished for ever. Never, never more shall we behold that generous loyalty to rank and sex, that proud submission, that dignified obedience, that subordination of the heart, which kept alive, even in servitude itself, the spirit of an exalted freedom. The unbought grace of life, the cheap defence of nations, the nurse of manly sentiment and heroic enterprise, is gone! It is gone, that sensibility of principle, that chastity of honour, which felt a stain like a wound, which inspired courage whilst it mitigated ferocity, which ennobled whatever it touched, and under which vice itself lost half its evil, by losing all its grossness.

This mixed system of opinion and sentiment had its origin in the ancient chivalry; and the principle, though varied in its appearance by the varying state of human affairs, subsisted and influenced through a long succession of generations, even to the time we live in. If it should ever be totally extinguished, the loss I fear will be great. It is this which has given its character to modern Europe. It is this which has distinguished it under all its forms of government, and distinguished it to its advantage, from the states of Asia, and possibly from those states which flourished in the most brilliant periods of the antique world. It was this, which, without confounding ranks, had produced a noble equality, and handed it down through all the gradations of social life. It was this opinion which mitigated kings into companions, and raised private men to be fellows with kings. Without force or opposition, it subdued the fierceness of pride and power; it obliged sovereigns to submit to the soft collar of social esteem, compelled stern authority to submit to elegance, and gave a domination, vanquisher of laws, to be subdued by manners.

But now all is to be changed. All the pleasing illusions, which made power gentle and obedience liberal, which harmonized the different shades of life, and which, by a bland assimilation, incorporated into politics the sentiments which beautify and soften private society, are to be dissolved by this new conquering empire of light and reason. All the decent drapery of life is to be rudely torn off. All the superadded ideas, furnished from the wardrobe of a moral imagination, which the heart owns, and the understanding ratifies, as necessary to cover the defects of our naked, shivering nature, and to raise it to dignity in our own estimation, are to be exploded as a ridiculous, absurd, and antiquated fashion.

On this scheme of things, a king is but a man, a queen is but a woman; a woman is but an animal, and an animal not of the highest order. All homage paid to the sex in general as such, and without distinct views, is to be regarded as romance and folly. Regicide, and parricide, and sacrilege, are but fictions of superstition, corrupting jurisprudence by destroying its simplicity. The murder of a king, or a queen, or a bishop, or a father, are only common homicide; and if the people are by any chance, or in any way, gainers by it, a sort of homicide much the most pardonable, and into which we ought not to make too severe a scrutiny.

On the scheme of this barbarous philosophy, which is the offspring of cold hearts and muddy understandings, and which is as void of solid wisdom as it is destitute of all taste and elegance, laws are to be supported only by their own terrors, and by the concern which each individual may find in them from his own private speculations, or can spare to them from his own private interests. In the groves of their academy, at the end of every vista, you see nothing but the gallows. Nothing is left which engages the affections on the part of the commonwealth. On the principles of this mechanic philosophy, our institutions can never be embodied, if I may use the expression, in persons; so as to create in us love, veneration, admiration, or attachment. But that sort of reason which banishes the affections is incapable of filling their place. These public affections, combined with manners, are required sometimes as supplements, sometimes as correctives, always as aids to law. The precept given by a wise man, as well as a great critic, for the construction of poems, is equally true as to states:—Non satis est pulchra esse poemata, dulcia sunto. There ought to be a system of manners in every nation, which a well-formed mind would be disposed to relish. To make us love our country, our country ought to be lovely.

* * *

Rights of Man:
Being an Answer to Mr. Burke’s Attack on the French Revolution
by Thomas Paine

But Mr. Burke appears to have no idea of principles when he is contemplating Governments. “Ten years ago,” says he, “I could have felicitated France on her having a Government, without inquiring what the nature of that Government was, or how it was administered.” Is this the language of a rational man? Is it the language of a heart feeling as it ought to feel for the rights and happiness of the human race? On this ground, Mr. Burke must compliment all the Governments in the world, while the victims who suffer under them, whether sold into slavery, or tortured out of existence, are wholly forgotten. It is power, and not principles, that Mr. Burke venerates; and under this abominable depravity he is disqualified to judge between them. Thus much for his opinion as to the occasions of the French Revolution. I now proceed to other considerations.

I know a place in America called Point-no-Point, because as you proceed along the shore, gay and flowery as Mr. Burke’s language, it continually recedes and presents itself at a distance before you; but when you have got as far as you can go, there is no point at all. Just thus it is with Mr. Burke’s three hundred and sixty-six pages. It is therefore difficult to reply to him. But as the points he wishes to establish may be inferred from what he abuses, it is in his paradoxes that we must look for his arguments.

As to the tragic paintings by which Mr. Burke has outraged his own imagination, and seeks to work upon that of his readers, they are very well calculated for theatrical representation, where facts are manufactured for the sake of show, and accommodated to produce, through the weakness of sympathy, a weeping effect. But Mr. Burke should recollect that he is writing history, and not plays, and that his readers will expect truth, and not the spouting rant of high-toned exclamation.

When we see a man dramatically lamenting in a publication intended to be believed that “The age of chivalry is gone! that The glory of Europe is extinguished for ever! that The unbought grace of life (if anyone knows what it is), the cheap defence of nations, the nurse of manly sentiment and heroic enterprise is gone!” and all this because the Quixot age of chivalry nonsense is gone, what opinion can we form of his judgment, or what regard can we pay to his facts? In the rhapsody of his imagination he has discovered a world of wind mills, and his sorrows are that there are no Quixots to attack them. But if the age of aristocracy, like that of chivalry, should fall (and they had originally some connection) Mr. Burke, the trumpeter of the Order, may continue his parody to the end, and finish with exclaiming: “Othello’s occupation’s gone!”

Notwithstanding Mr. Burke’s horrid paintings, when the French Revolution is compared with the Revolutions of other countries, the astonishment will be that it is marked with so few sacrifices; but this astonishment will cease when we reflect that principles, and not persons, were the meditated objects of destruction. The mind of the nation was acted upon by a higher stimulus than what the consideration of persons could inspire, and sought a higher conquest than could be produced by the downfall of an enemy. Among the few who fell there do not appear to be any that were intentionally singled out. They all of them had their fate in the circumstances of the moment, and were not pursued with that long, cold-blooded unabated revenge which pursued the unfortunate Scotch in the affair of 1745.

Through the whole of Mr. Burke’s book I do not observe that the Bastille is mentioned more than once, and that with a kind of implication as if he were sorry it was pulled down, and wished it were built up again. “We have rebuilt Newgate,” says he, “and tenanted the mansion; and we have prisons almost as strong as the Bastille for those who dare to libel the queens of France.” As to what a madman like the person called Lord George Gordon might say, and to whom Newgate is rather a bedlam than a prison, it is unworthy a rational consideration. It was a madman that libelled, and that is sufficient apology; and it afforded an opportunity for confining him, which was the thing that was wished for. But certain it is that Mr. Burke, who does not call himself a madman (whatever other people may do), has libelled in the most unprovoked manner, and in the grossest style of the most vulgar abuse, the whole representative authority of France, and yet Mr. Burke takes his seat in the British House of Commons! From his violence and his grief, his silence on some points and his excess on others, it is difficult not to believe that Mr. Burke is sorry, extremely sorry, that arbitrary power, the power of the Pope and the Bastille, are pulled down.

Not one glance of compassion, not one commiserating reflection that I can find throughout his book, has he bestowed on those who lingered out the most wretched of lives, a life without hope in the most miserable of prisons. It is painful to behold a man employing his talents to corrupt himself. Nature has been kinder to Mr. Burke than he is to her. He is not affected by the reality of distress touching his heart, but by the showy resemblance of it striking his imagination. He pities the plumage, but forgets the dying bird. Accustomed to kiss the aristocratical hand that hath purloined him from himself, he degenerates into a composition of art, and the genuine soul of nature forsakes him. His hero or his heroine must be a tragedy-victim expiring in show, and not the real prisoner of misery, sliding into death in the silence of a dungeon.

As Mr. Burke has passed over the whole transaction of the Bastille (and his silence is nothing in his favour), and has entertained his readers with reflections on supposed facts distorted into real falsehoods, I will give, since he has not, some account of the circumstances which preceded that transaction. They will serve to show that less mischief could scarcely have accompanied such an event when considered with the treacherous and hostile aggravations of the enemies of the Revolution.

The mind can hardly picture to itself a more tremendous scene than what the city of Paris exhibited at the time of taking the Bastille, and for two days before and after, nor perceive the possibility of its quieting so soon. At a distance this transaction has appeared only as an act of heroism standing on itself, and the close political connection it had with the Revolution is lost in the brilliancy of the achievement. But we are to consider it as the strength of the parties brought man to man, and contending for the issue. The Bastille was to be either the prize or the prison of the assailants. The downfall of it included the idea of the downfall of despotism, and this compounded image was become as figuratively united as Bunyan’s Doubting Castle and Giant Despair.

* * *

The Reactionary Mind
by Corey Robin
pp. 243-245

As Orwell taught, the possibilities for cruelty and violence are as limitless as the imagination that dreams them up. But the armies and agencies of today’s violence are vast bureaucracies, and vast bureaucracies need rules. Eliminating the rules does not Prometheus unbind; it just makes for more billable hours.

“No yielding. No equivocation. No lawyering this thing to death.” That was George W. Bush’s vow after 9/11 and his description of how the war on terror would be conducted. Like so many of Bush’s other declarations, it turned out to be an empty promise. This thing was lawyered to death. But, and this is the critical point, far from minimizing state violence— which was the great fear of the neocons— lawyering has proven to be perfectly compatible with violence. In a war already swollen with disappointment and disillusion, the realization that inevitably follows— the rule of law can, in fact, authorize the greatest adventures of violence and death, thereby draining them of sublimity— must be, for the conservative, the greatest disillusion of all.

Had they been closer readers of Burke, the neoconservatives— like Fukuyama, Roosevelt, Sorel, Schmitt, Tocqueville, Maistre, Treitschke, and so many more on the American and European right— could have seen this disillusion coming. Burke certainly did. Even as he wrote of the sublime effects of pain and danger, he was careful to insist that should those pains and dangers “press too nearly” or “too close”— that is, should they become realities rather than fantasies, should they become “conversant about the present destruction of the person”— their sublimity would disappear. They would cease to be “delightful” and restorative and become simply terrible. Burke’s point was not merely that no one, in the end, really wants to die or that no one enjoys unwelcome, excruciating pain. It was that sublimity of whatever kind and source depends upon obscurity: get too close to anything, whether an object or experience, see and feel its full extent, and it loses its mystery and aura. It becomes familiar. A “great clearness” of the sort that comes from direct experience “is in some sort an enemy to all enthusiasms whatsoever.” “It is our ignorance of things that causes all our admiration, and chiefly excites our passions. Knowledge and acquaintance make the most striking causes affect but little.” “A clear idea,” Burke concludes, “is therefore another name for a little idea.” Get to know anything, including violence, too well, and it loses whatever attribute— rejuvenation, transgression, excitement, awe— you ascribed to it when it was just an idea.

Earlier than most, Burke understood that if violence were to retain its sublimity, it had to remain a possibility, an object of fantasy— a horror movie, a video game, an essay on war. For the actuality (as opposed to the representation) of violence was at odds with the requirements of sublimity. Real, as opposed to imagined, violence entailed objects getting too close, bodies pressing too near, flesh upon flesh. Violence stripped the body of its veils; violence made its antagonists familiar to each other in a way they had never been before. Violence dispelled illusion and mystery, making things drab and dreary. That is why, in his discussion in the Reflections of the revolutionaries’ abduction of Marie Antoinette, Burke takes such pains to emphasize her “almost naked” body and turns so effortlessly to the language of clothing— “the decent drapery of life,” the “wardrobe of the moral imagination,” “antiquated fashion,” and so on— to describe the event. The disaster of the revolutionaries’ violence, for Burke, was not cruelty; it was the unsought enlightenment.

Since 9/11, many have complained, and rightly so, about the failure of conservatives— or their sons and daughters— to fight the war on terror themselves. For those on the left, that failure is symptomatic of the class injustice of contemporary America. But there is an additional element to the story. So long as the war on terror remains an idea— a hot topic on the blogs, a provocative op-ed, an episode of 24— it is sublime. As soon as the war on terror becomes a reality, it can be as cheerless as a discussion of the tax code and as tedious as a trip to the DMV.

Fear: The History of a Political Idea
by Corey Robin
Kindle Locations 402-406

It might seem strange that a book about political fear should assign so much space to our ideas about fear rather than to its practice. But recall what Burke said: It is not so much the actuality of a threat, but the imagined idea of that threat, that renews and restores. “If the pain and terror are so modified as not to be actually noxious; if the pain is not carried to violence, and the terror is not conversant about the present destruction of the person,” then, and only then, do we experience “a delightful horror.” The condition of our being renewed by fear is not that we directly experience the object that threatens us, but that the object be kept at some remove from ourselves.

Kindle Locations 1061-1066

Whether they have read The Spirit of the Laws or not, these writers are its children. With its trawling allusions to the febrile and the fervid, The Spirit of the Laws successfully aroused the conviction that terror was synonymous with barbarism, and that its cures were to be found entirely within liberalism. Thus was a new political and literary aesthetic born, a rhetoric of hyperbole suggesting that terror’s escorts were inevitably remoteness, irrationality, and darkness, and its enemies, familiarity, reason, and light. Perhaps it was this aesthetic that a young Edmund Burke had in mind when he wrote, two years after Montesquieu’s death, “To make any thing very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes.”

Kindle Locations 1608-1618

As she set about establishing a new political morality in the shadow of total terror, however, Arendt became aware of a problem that had plagued Hobbes, Montesquieu, and Tocqueville, and that Burke—not to mention makers of horror films—understood all too well: once terrors become familiar, they cease to arouse dread. The theorist who tries to establish fear as a foundation for a new politics must always find a demon darker than that of her predecessors, discover ever more novel, and more frightening, forms of fear. Thus Montesquieu, seeking to outdo Hobbes, imagined a form of terror that threatened the very basis of that which made us human. In Arendt’s case, it was her closing image of interchangeable victims and victimizers—of terror serving no interest and no party, not even its wielders; of a world ruled by no one and nothing, save the impersonal laws of motion—that yielded the necessary “radical evil” from which a new politics could emerge.

But as her friend and mentor Karl Jaspers was quick to recognize, Arendt had come upon this notion of radical evil at a terrible cost: it made moral judgment of the perpetrators of total terror nearly impossible. According to Origins, total terror rendered everyone—from Hitler down through the Jews, from Stalin to the kulaks—incapable of acting. Indeed, as Arendt admitted in 1963, “There exists a widespread theory, to which I also contributed [in Origins], that these crimes defy the possibility of human judgment and explode the frame of our legal institutions.” Total terror may have done what fear, terror, and anxiety did for her predecessors—found a new politics—but, as Arendt would come to realize in Eichmann in Jerusalem, it was a false foundation, inspiring an operatic sense of catastrophe, that ultimately let the perpetrators off the hook by obscuring the hard political realities of rule by fear.

Liberalism at Bay, Conservatism at Play:
Fear in the Contemporary Imagination

by Corey Robin

For theorists like Locke and Burke, fear is something to be cherished, not because it alerts us to real danger or propels us to take necessary action against it, but because fear is supposed to arouse a heightened state of experience. It quickens our perceptions as no other emotion can, forcing us to see and to act in the world in new and more interesting ways, with greater moral discrimination and a more acute consciousness of our surroundings and ourselves. According to Locke, fear is “an uneasiness of the mind” and “the chief, if not only spur to human industry and action is uneasiness.” Though we might think that men and women act on behalf of desire, Locke insisted that “a little burning felt”—like fear—”pushes us more powerfully than great pleasures in prospect draw or allure.” Burke had equally low regard for pleasure. It induces a grotesque implosion of self, a “soft tranquility” approximating an advanced state of decay if not death itself.

The head reclines something on one side; the eyelids are more closed than usual, and the eyes roll gently with an inclination to the object, the mouth is a little opened, and the breath drawn slowly, with now and then a low sigh; the whole body is composed, and the hands fall idly to the sides. All this is accompanied with an inward sense of melting and languor . . . relaxing the solids of the whole system.

But when we imagine the prospect of “pain and terror,” Burke added, we experience “a delightful horror,” the “strongest of all passions.” Without fear, we are passive; with it, we are roused to “the strongest emotion which the mind is capable of feeling” (Locke, 1959, II.20.6, 10; II.21.34: 304-5, 334; Burke, 1990: 32, 36, 123, 135-36).

At the political level, modern theorists have argued that fear is a spur to civic vitality and moral renewal, perhaps even a source of public freedom. Writing in the wake of the French Revolution, Tocqueville bemoaned the lethargy of modern democracy. With its free-wheeling antinomianism and social mobility, democratic society “inevitably enervates the soul, and relaxing the springs of the will, prepares a people for bondage. Then not only will they let their freedom be taken from them, but often they actually hand it over themselves” (Tocqueville, 1969: 444). Lacking confidence in the traditional truths of God and king, Tocqueville believed that democracies might find a renewed confidence in the experience of fear, which could activate and ground a commitment to public freedom. “Fear,” he wrote in a note to himself, “must be put to work on behalf of liberty,” or, as he put it in Democracy in America, “Let us, then, look forward to the future with that salutary fear which makes men keep watch and ward for freedom, and not with that flabby, idle terror which makes men’s hearts sink and enervates them” (cited in Lamberti, 1989: 229; Tocqueville, 1969: 702). Armed with fear, democracy would be fortified against not only external and domestic enemies but also the inner tendency, the native desire, to dissolve into the soupy indifference of which Burke spoke.

* * *

The Dark Beauty of Unheard-Of Horrors
by Thomas Ligotti

This is how it is when a mysterious force is embodied in a human body, or in any form that is too well fixed. And a mystery explained is one robbed of its power of emotion, dwindling into a parcel of information, a tissue of rules and statistics without meaning in themselves.

Of course, mystery actually requires a measure of the concrete if it is to be perceived at all; otherwise it is only a void, the void. The thinnest mixture of this mortar, I suppose, is contained in that most basic source of mystery—darkness. Very difficult to domesticate this phenomenon, to collar it and give a name to the fear it inspires. As a verse writer once said:

The blackness at the bottom of a well
May hold most any kind of hell.

The dark, indeed, is the phenomenon possessing the maximum of mystery, the one most resistant to the taming of the mind and most resonant with emotions and meanings of a highly complex and subtle type. It is also extremely abstract as a provenance for supernatural horror, an elusive prodigy whose potential for fear may slip through a writer’s fingers and right past even a sensitive reader of terror tales. Obviously it is problematic in a way that a solid pair of gleaming fangs at a victim’s neck is not. Hence, darkness itself is rarely used in a story as the central incarnation of the supernatural, though it often serves in a supporting role as an element of atmosphere, an extension of more concrete phenomena. The shadowy ambiance of a fictional locale almost always resolves itself into an apparition of substance, a threat with a name, if not a full-blown history. Darkness may also perform in a strictly symbolic capacity, representing the abyss at the core of any genuine tale of mystery and horror. But to draw a reader’s attention to this abyss, this unnameable hell of blackness, is usually sacrificed in favor of focusing on some tangible dread pressing against the body of everyday life. From these facts may be derived an ad hoc taxonomy for dividing supernatural stories into types, or rather a spectrum of types: on the one side, those that tend to emphasize the surface manifestations of a supernatural phenomenon; on the other, those that reach toward the dark core of mystery in its purest and most abstract condition. The former stories show us the bodies, big as life, of the demonic tribe of spooks, vampires, and other assorted bogeymen; the latter suggest to us the essence, far bigger than life, of that dark universal terror beyond naming which is the matrix for all other terrors. […]

Like Erich Zann’s “world of beauty,” Lovecraft’s “lay in some far cosmos of the imagination,” and like that of another artist, it is a “beauty that hath horror in it.”

The Conspiracy against the Human Race: A Contrivance of Horror
by Thomas Ligotti
pp. 41-42

As heretofore noted, consciousness may have assisted our species’ survival in the hard times of prehistory, but as it became ever more intense it evolved the potential to ruin everything if not securely muzzled. This is the problem: We must either outsmart consciousness or be thrown into its vortex of doleful factuality and suffer, as Zapffe termed it, a “dread of being”— not only of our own being but of being itself, the idea that the vacancy that might otherwise have obtained is occupied like a stall in a public lavatory of infinite dimensions, that there is a universe in which things like celestial bodies and human beings are roving about, that anything exists in the way it seems to exist, that we are part of all being until we stop being, if there is anything we may understand as being other than semblances or the appearance of semblances.

On the premise that consciousness must be obfuscated so that we might go on as we have all these years, Zapffe inferred that the sensible thing would be not to go on with the paradoxical nonsense of trying to inhibit our cardinal attribute as beings, since we can tolerate existence only if we believe— in accord with a complex of illusions, a legerdemain of duplicity— that we are not what we are: unreality on legs. As conscious beings, we must hold back that divulgement lest it break us with a sense of being things without significance or foundation, anatomies shackled to a landscape of unintelligible horrors. In plain language, we cannot live except as self-deceivers who must lie to ourselves about ourselves, as well as about our unwinnable situation in this world.

Accepting the preceding statements as containing some truth, or at least for the sake of moving on with the present narrative, it seems that we are zealots of Zapffe’s four plans for smothering consciousness: isolation (“Being alive is all right”), anchoring (“One Nation under God with Families, Morality, and Natural Birthrights for all”), distraction (“Better to kill time than kill oneself”), and sublimation (“I am writing a book titled The Conspiracy against the Human Race”). These practices make us organisms with a nimble intellect that can deceive themselves “for their own good.” Isolation, anchoring, distraction, and sublimation are among the wiles we use to keep ourselves from dispelling every illusion that keeps us up and running. Without this cognitive double-dealing, we would be exposed for what we are. It would be like looking into a mirror and for a moment seeing the skull inside our skin looking back at us with its sardonic smile. And beneath the skull— only blackness, nothing. A little piece of our world has been peeled back, and underneath is creaking desolation— a carnival where all the rides are moving but no patrons occupy the seats. We are missing from the world we have made for ourselves. Maybe if we could resolutely gaze wide-eyed at our lives we would come to know what we really are. But that would stop the showy attraction we are inclined to think will run forever.

p. 182

That we all deserve punishment by horror is as mystifying as it is undeniable. To be an accomplice, however involuntarily, in a reasonless non-reality is cause enough for the harshest sentencing. But we have been trained so well to accept the “order” of an unreal world that we do not rebel against it. How could we? Where pain and pleasure form a corrupt alliance against us, paradise and hell are merely different divisions in the same monstrous bureaucracy. And between these two poles exists everything we know or can ever know. It is not even possible to imagine a utopia, earthly or otherwise, that can stand up under the mildest criticism. But one must take into account the shocking fact that we live on a world that spins. After considering this truth, nothing should come as a surprise.

Still, on rare occasions we do overcome hopelessness or velleity and make mutinous demands to live in a real world, one that is at least episodically ordered to our advantage. But perhaps it is only a demon of some kind that moves us to such idle insubordination, the more so to aggravate our condition in the unreal. After all, is it not wondrous that we are allowed to be both witnesses and victims of the sepulchral pomp of wasting tissue? And one thing we know is real: horror. It is so real, in fact, that we cannot be sure it could not exist without us. Yes, it needs our imaginations and our consciousness, but it does not ask or require our consent to use them. Indeed, horror operates with complete autonomy. Generating ontological havoc, it is mephitic foam upon which our lives merely float. And, ultimately, we must face up to it: Horror is more real than we are.

p. 218

Without death— meaning without our consciousness of death— no story of supernatural horror would ever have been written, nor would any other artistic representation of human life have been created for that matter. It is always there, if only between the lines or brushstrokes, or conspicuously by its absence. It is a terrific stimulus to that which is at once one of our greatest weapons and greatest weaknesses— imagination. Our minds are always on the verge of exploding with thoughts and images as we ceaselessly pound the pavement of our world. Both our most exquisite cogitations and our worst cognitive drivel announce our primal torment: We cannot linger in the stillness of nature’s vacuity. And so we have imagination to beguile us. A misbegotten hatchling of consciousness, a birth defect of our species, imagination is often revered as a sign of vigor in our make-up. But it is really just a psychic overcompensation for our impotence as beings. Denied nature’s exemption from creativity, we are indentured servants of the imaginary until the hour of our death, when the final harassments of imagination will beset us.

* * *

The Horror of the Unreal
By Peter Bebergal

The TV show “The Walking Dead” is one long exercise in tension. But the zombies—the supposed centerpiece of the show’s horror—are not particularly frightening. Gross, to be sure, but also knowable, literal. You can see them coming from yards away. They are the product of science gone wrong, or of a virus, or of some other phenomenal cause. They can be destroyed with an arrow through the brain. More aberration than genuine monsters, they lack the essential quality to truly terrify: an aspect of the unreal.

The horror writer Thomas Ligotti believes that even tales of virus-created zombies—and other essentially comprehensible creatures—can elicit what we might call, quoting the theologian Rudolf Otto, “the wholly other,” but it requires a deft hand. The best such stories “approach the realm of the supernatural,” he told me over e-mail, even if their monsters are entirely earthly. As an example, he pointed to “The Texas Chainsaw Massacre,” “wherein the brutality displayed is so deviant and strange it takes off into the uncanny.” Ligotti doesn’t require bloodthirsty villains to convey a sense of impending horror, though. “I tend to stipulate in my work that the world by its nature already exists in a state of doom rather than being in the process of doom.” […]

“Whether or not there is anything called the divine is neither here nor there,” Ligotti told me. “It’s irrelevant to our sense of what is beyond the veil.” Ligotti believes that fiction can put us in touch with that sense of things unseen, that it can create an encounter with—to quote Rudolf Otto again—the mysterium tremendum et fascinans, a state that combines terror and enchantment with the divine. In fact, Ligotti believes that “any so-called serious work of literature that doesn’t to some extent serve this function has failed.” It’s not a matter of genre, he says. He cites Raymond Chandler’s Philip Marlowe as a character who would go wherever the clues took him, no matter how deep into the heart of the “unknown.” “Chandler wanted his detective stories to invoke the sense of the ‘country behind the hill.’”

Because Ligotti has no interest in whether or not that world beyond actually exists, there is a tension, an unanswered question, in his work: Can we locate the source of this horror? His characters are often confronted by people or groups who worship something so alien that their rituals don’t conform to any identifiable modes of religious practice. Usually, they involve some form of sacrifice or other suggestion of violence. The implication seems to be that, even if there is meaning in the universe, that meaning is so foreign, so strange, that we could never understand it, and it could never make a difference in our lives. Any attempt to penetrate it will only lead to madness.

As a practical matter, Ligotti believes that the short story is the most potent means for conveying this idea. “A novel can’t consistently project what Poe called a ‘single effect,’” he explains. “It would be too wearing on the reader—too repetitious and dense, as would, for instance, a lengthy narrative poem written in the style of a lyric poem. A large part of supernatural novels must therefore be concerned with the mundane and not with a sense of what I’ll call ‘the invisible.’”

Trying to get Ligotti to explain what he means by the “invisible” is not easy. “I’m not able to see my stories as establishing or presuming the existence of a veil beyond which the characters in them are incapable of seeing. I simply don’t view them in this way.” But his characters, I insisted, suggest that we are all capable of seeing beyond the veil, though it’s impossible to tell if they are simply mad, or if they have indeed perceived something outside normal perception. I asked Ligotti if he saw a difference between these two states of consciousness. “The only interest I’ve taken in psychological aberrancy in fiction,” he answered, “has been as a vehicle of perceiving the derangement of creation.”

Thomas Ligotti: Dark Phenomenology and Abstract Horror
by S.C. Hickman

Ligotti makes a point that horror must stay ill-defined, that the monstrous must menace us from a distance, from the unknown; a non-knowledge, rather than a knowledge of the natural; it is the unnatural and invisible that affects us, not something we can reduce to some sociological, psychological, or political formation or representation, which only kills the mystery – taming it and pigeonholing it into some cultural gatekeeper’s caged obituary. […] The domesticated beast is no horror at all.

In the attic of the mind a lunatic family resides, a carnival world of aberrant thoughts and feelings – that, if we did not lock them away in a conspiracy of silence, would freeze us in such terror and fright that we would become immobilized, unable to think, feel, or live except as zombies, mindlessly. So we isolate these demented creatures, keep them at bay. Then we anchor ourselves in artifice, accept substitutes, religious mythologies, secular philosophies, and anything else that will help us keep the monsters at bay. As Ligotti will say, we need our illusions – our metaphysical anchors and dreamscapes “that inebriate us with a sense of being official, authentic, and safe in our beds” (CHR, 31). Yet, when even these metaphysical ploys won’t stem the tide of those heinous monsters from within, we seek out distraction, entertainment: TV, sports, bars, dancing, friends, fishing, scuba diving, boating, car racing, horse riding… almost anything that will keep our mind empty of its dark secret, that will allow it to escape the burden of emotion – of fear, if even for a night or an afternoon of sheer mindless bliss. And, last but not least, we seek out culture, sublimation – art, theatre, festivals, carnivals, painting, writing, books… we seek to let it all out, to let it enter into that sphere of the tragic or comic, that realm where we can exorcize it, display it, pin it to the wall for all to see our fears and terrors on display, not as they are but as we lift them up into art, shaping them to our nightmare visions or dreamscapes of desire. As Ligotti tells it, we read literature or look at a painting, go to the theatre, etc. […]

Horror acts like a sigil, a diagram that invokes the powers within the darkness to arise, to unfold their mystery, to explain themselves; and, if not explain then at least to invade our equilibrium, our staid and comfortable world with their rage, their torment, their corruption. The best literary horror or weird tales never describe in detail the mystery, rather they invoke by hyperstitional invention: calling forth the forces out of darkness and the abstract, and allowing them to co-habit for a time the shared space – the vicarious bubble or interzone between the reader and narrative […]

This notion of the tension between the epistemic and ontic in abstract horror returns me to Nick Land’s short work Phyl-Undhu: Abstract Horror, Exterminator in which the narrator tells us that what we fear, what terrorizes us is not the seen – the known and definable, but rather the unseen and unknown, even “shapeless threat, ‘Outside’ only in the abstract sense (encompassing the negative immensity of everything that we cannot grasp). It could be anywhere, from our genes or ecological dynamics, to the hidden laws of technological evolution, or the hostile vastnesses between the stars. We know only that, in strict proportion to the vitality of the cosmos, the probability of its existence advances towards inevitability, and that for us it means supreme ill. Ontological density without identifiable form is abstract horror itself.” […]

Yet, as Lovecraft once suggested in one of his famous stories, “The Call of Cthulhu,” the “sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.” Here is the nub for Ligotti, the dividing line for those who continue to sleep in the illusory safety net of their cultural delusions […] Many will remember that the Anglo-American poet T. S. Eliot once suggested that humankind “cannot bear very much reality”. […]

For Ligotti the subjective reaction to the seemingly objective stimulus of the uncanny is the gaining of “dark knowledge” about the workings of individuals, […] This sense that the corruption works both ways, upon the victim and the perpetrator; that the world is now topsy-turvy and that the uncanny boundaries between victim and perpetrator are reversible and hazy, and not always obvious, is due to that subtle knowledge that each culture is circumscribed within its own black box of conceptuality. By that I mean that, as Eduardo Viveiros de Castro argues in his Cannibal Metaphysics, Amazonian and other Amerindian groups inhabit a radically different conceptual universe than ours—in which nature and culture, human and nonhuman, subject and object are conceived in terms that reverse our own—and he presents the case for anthropology as the study of such “other” metaphysical schemes, and as the corresponding critique of the concepts imposed on them by the human sciences. […]

We’re in that position of moving either way: 1) literalizing our fantasies: building walls and barbed-wire fences against invading hordes of refugees, migrants, etc.; or, 2) of seeing through them, seeing the aesthetic and defensive use of art and social mechanisms to defend ourselves from the onslaught of our own daemonic nihilism and drives: our fears and terrors. […]

In our time we’ve forgotten this fact, and forgotten the art of laughter: to see the world through the lens of art or horror literature and know that this, too, is illusion, the aesthetic call to our emotions, to our fears and our terrors, that allows that purge, that release that only great art can supply. Rather, in our time we’ve all become literalists of the imagination, so that apocalypse, rather than a pleasant channeling of our fears, has become an actual possibility and real manifestation in the world around us in wars, famines, racism, hatred, murder, mayhem… The problem we face is that we’ve targeted the external world of actual people and deemed them disposable, as if they are the ravenous zombies and vampires of our contemporary globalist madness. We’ve turned the inside out, reversed what once existed within into a projected nightmare scenario and living hell in the real world, not as fantasy but as daemonic threat and doom upon ourselves and others. Talking of contemporary horror films, Ligotti remarks that the characters in these films “cannot be sure who is a ‘thing’ and who is not, since those who are transmuted retain their former appearance, memories, and behaviors even after they have become, in their essence, uncanny monstrosities from another world” (CHR, 92). This sense that we’ve allowed the immigrants (US) and refugees (US and EU) to enter into and become a part of the social body of our nations leads to this sense of uncanny uncertainty that one cannot be sure who is the “thing” – is it us or them: a paranoiac nightmare world of ravening lunacy, indeed. Because our categories of normal/abnormal have broken down due to the absolute Other of other conceptual cultures, which have other sets of Symbolic Orders and ideas, concepts, ideologies, religions, and Laws, we are now in the predicament of mutating and transforming into an Other ourselves all across the globe. There is no safe haven, no place to hide or defend oneself against oneself. In this sense we’ve all – everyone on the planet – become, as Ligotti states it, in “essence, uncanny monstrosities from another world” (CHR, 92).

* * *

Trickster Makes This World
by Lewis Hyde
pp. 168-172

During the years I was writing this book, there was an intense national debate over the concern that government funds might be used to subsidize pornographic art. The particulars will undoubtedly change, but the debate is perennial. On the one side, we have those who presume to speak for the collective trying to preserve the coverings and silences that give social space its order. On the other side, we have the agents of change, time travelers who take the order itself to be mutable, who hope— to give it the most positive formulation— to preserve the sacred by finding ways to shift the structure of things as contingency demands. It is not immediately clear why this latter camp must so regularly turn to bodily and sexual display, but the context I am establishing here suggests that such display is necessary.

To explore why this might be the case, let me begin with the classic image from the Old Testament: Adam and Eve leaving the garden, having learned shame and therefore having covered their genitals and, in the old paintings, holding their hands over their faces as well. By these actions they inscribe their own bodies. The body happens to be a uniquely apt location for the inscription of shame, partly because the body itself seems to be the sense organ of shame (the feeling swamps us, we stutter and flush against our will), but also because the content of shame, what we feel ashamed of, typically seems indelible and fixed, with us as a sort of natural fact, the way the body is with us as a natural fact. “Shame is what you are, guilt is what you do,” goes an old saying. Guilt can be undone with acts of penance, but the feeling of shame sticks around like a birthmark or the smell of cigarettes.

I earlier connected the way we learn about shame to rules about speech and silence, and made the additional claim that those rules have an ordering function. Now, let us say that the rules give order to several things at once, not just to society but to the body and the psyche as well. When I say “several things at once” I mean that the rules imply the congruence of these three realms; the orderliness of one is the orderliness of the others. The organized body is a sign that we are organized psychologically and that we understand and accept the organization of the world around us. When Adam and Eve cover their genitals, they simultaneously begin to structure consciousness and to structure their primordial community. To make the temenos, a line is drawn on the earth and one thing cut from another; when Adam and Eve learn shame, they draw a line on their bodies, dividing them into zones like the zones of silence and speech— or, rather, not “like” those zones, but identified with them, for what one covers on the body one also consigns to silence.

[…] an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap.

Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman). This substituting of one thing for another is called metonymy in rhetoric, one of the many figures of thought, a trope or verbal turn. The construction of the trap of shame begins with this metonymic trick, a kind of bait and switch in which one’s changeable social place is figured in terms of an unchangeable part of the body. Then by various means the trick is made to blend invisibly into the landscape. To begin with, there are always larger stories going on— about women or race or a snake in a garden. The enchantment of those regularly repeated fables, along with the rules of silence at their edges, and the assertion that they are intuitively true— all these things secure the borders of the narrative and make it difficult to see the contingency of its figures of thought. Once the verbal tricks are invisible, the artifice of the social order becomes invisible as well, and begins to seem natural. As menstruation and skin color and the genitals are natural facts, so the social and psychological orders become natural facts.

In short, to make the trap of shame we inscribe the body as a sign of wider worlds, then erase the artifice of that signification so that the content of shame becomes simply the way things are, as any fool can see.

If this is how the trap is made, then escaping it must involve reversing at least some of these elements. In what might be called the “heavy-bodied” escape, one senses that there’s something to be changed but ends up trying to change the body itself, mutilating it, or even committing suicide […]

These are the beginnings of conscious struggle, but we have yet to meet the mind of the trickster— or if we have, it belongs to the trickster who tries to eat the reflected berries, who burns his own anus in anger, who has not learned to separate the bait from the hook. As we saw earlier, the pressures of experience produce from that somewhat witless character a more sophisticated trickster who can separate bait from hook, who knows that the sign of something is not the thing itself, and who is therefore a better escape artist with a much more playful relationship to the local stories. The heavy-bodied, literalizing attempt to escape from shame carries much of the trap with it— the link to the body, the silence, and so on. Inarticulately, it takes the sign for the thing itself, imagining racism inheres in the color of the skin. Wise to the tricks of language, the light-bodied escape from shame refuses the whole setup— refuses the metonymic shift, the enchantment of group story, and the rules of silence— and by these refusals it detaches the supposedly overlapping levels of inscription from one another so that the body, especially, need no longer stand as the mute, incarnate seal of social and psychological order. All this, but especially the speaking out where shame demands silence, depends largely on a consciousness that doesn’t feel much inhibition, and knows how traps are made, and knows how to subvert them.

This is the insight that comes to all boundary-crossers— immigrants in fact or immigrants in time— that meaning is contingent and identity fluid, even the meaning and identity of one’s own body.

It should by now be easier to see why there will always be art that uncovers the body, and artists who speak shamelessly, even obscenely. All social structures do well to anchor their rules of conduct in the seemingly simple inscription of the body, so that only after I have covered my privates am I allowed to show my face to the world and have a public life. The rules of bodily decorum usually imply that the cosmos depends on the shame we feel about our bodies. But sometimes the lesson is a lie, and a cunningly self-protecting one at that, for to question it requires self-exposure and loss of face, and who would want that? Well, trickster would, as would all those who find they cannot fashion a place for themselves in the world until they have spoken against collective silence. We certainly see this— not just the speaking out but the self-exposure— in Allen Ginsberg, and we see it a bit more subtly in both Kingston and Rodriguez. Neither of them is a “dirty writer” the way Ginsberg is, but to begin to speak, one of them must talk about menstruation (which talk she links to becoming the mistress of her own sexuality) and the other must talk about his skin (which talk he links to possessing his “maleness”).

To the degree that other orders are linked to the way the body is inscribed, and to the degree that the link is sealed by rules of silence, the first stuttering questioning of those orders must always begin by breaking the seal and speaking about the body. Where obscene speech has such roots it is worth defending, and those who would suppress it court a subtle but serious danger. They are like the gods who would bind Loki, for this suppression hobbles the imagination that copes with the shifting and contingent nature of things, and so invites apocalyptic change where something more playful would have sufficed. Better to let trickster steal the shame covers now and then. Better to let Coyote have a ride in the Sun-god’s lodge. Better to let Monkey come on your journey to the West.

* * *

“Disseminated Volition in the New Testament Gospels”
by Andrew Stehlik
The Jaynesian (Vol. 3, Issue 1)

It is well known that many words for inner spiritual motions and emotions are actually metaphors derived from primitive (outward) physiological observations. A brief reference to any good dictionary that includes etymology can corroborate this conclusion.

Julian Jaynes in The Origin of Consciousness in the Breakdown of the Bicameral Mind dedicated a whole chapter to this theme — looking forward through the Iliad (pp. 257– 272). He concentrates on seven words: thumos, phrenes, noos, psyche, kradie, ker, and etor.

Julian Jaynes recognized that these and other similar body based, physiological or anatomical metaphors (in almost any language) are actually more than simple linguistic metaphors and that they played an important role in the breakdown of bicameralism and the development of consciousness. Different forms of stress and anxiety trigger different physiological responses. Observations of these responses were used in naming and creating hypostases and metaphors useful in the terminology of introspection and the development of consciousness. […]

In the New Testament Gospels (therefore quite late in the historical process — the second half of the first century CE) I recently recognized an interesting phenomenon which could be part of this process, or, even better, a pathological deviation along this process.

Once in the gospel of Mark (9:42–48) and twice in the gospel of Matthew (5:27–30 and 18:6–10) Jesus is supposed to utter an almost identical saying. In this saying, individual parts of the body (eyes, hands, feet) are given the capacity for independent volition. They can inform the acting of the whole person. The saying suggests, further, that when the influence (instructions, independent volition) of these body parts is perceived as dangerous or harmful, they should be silenced by cutting them off to protect the integrity of the rest of the body.

All academic theological literature known to me takes these sayings as high literary metaphors. Frequent references are made to biology and medicine, where amputation is used as a last resort in serious conditions.

Completely unrecognized is the underlying presumption of this saying, according to which individual body parts could possess independent volition and as such could inform (sway/direct) the acting of the whole body. Even more serious is the presumption that self-mutilation can stop or somehow influence higher mental processes. Even a person who is not a trained psychologist or psychiatrist can recognize that we are dealing with a seriously pathological state of mind. […]

Already at the time of their recording in the gospels these sayings were perceived as anomalous. Luke, the most educated and refined of the synoptic authors, preserved the immediate context but edited out most of the peculiar parts concerning disseminated volition and self-mutilation.

Further and broader contexts might be mentioned and discussed: other Greek and Hebrew physiological and anatomical metaphors; the popularity of the metaphor of the body for the structuring and functioning of society in Hellenism; the ancient practice of religious self-mutilation; and the potential for facilitating our understanding of brutish penal codes or modern self-mutilations.

* * *

On Truth and Bullshit

One of the most salient features of our culture is that there is so much bullshit.

This is how Harry Frankfurt begins his essay, “On Bullshit”. He continues:

“Everyone knows this. Each of us contributes his share. But we tend to take the situation for granted. Most people are rather confident of their ability to recognize bullshit and to avoid being taken in by it. So the phenomenon has not aroused much deliberate concern, or attracted much sustained inquiry. In consequence, we have no clear understanding of what bullshit is, why there is so much of it, or what functions it serves. And we lack a conscientiously developed appreciation of what it means to us. In other words, we have no theory.”

So, what is this bullshit? He goes through many definitions of related words. A main point is that bullshit falls “short of lying,” which leads him to insincerity. The bullshitter isn’t a liar, for the bullshitter isn’t concerned with either truth or its contrary. No intention to lie is required.

“Someone who lies and someone who tells the truth are playing on the opposite sides, so to speak, in the same game. Each responds to the facts . . . the response of the one is guided by the authority of the truth, while the response of the other defies that authority and refuses to meet its demands. The bullshitter ignores these demands altogether. He does not reject the authority of truth, as the liar does, and oppose himself to it. He pays no attention to it at all.”

Bullshitting is more of a creative act that dances around such concerns of verity:

“For the essence of bullshit is not that it is false but that it is phony. In order to appreciate this distinction, one must recognize that a fake or a phony need not be in any respect (apart from authenticity itself) inferior to the real thing. What is not genuine need not also be defective in some other way. It may be, after all, an exact copy. What is wrong with a counterfeit is not what it is like, but how it was made. This points to a similar and fundamental aspect of the essential nature of bullshit: although it is produced without concern with the truth, it need not be false. The bullshitter is faking things. But this does not mean that he necessarily gets them wrong.”

Bullshit is, first and foremost, insincere. In Frankfurt’s essay, that is some combination of an observation, premise, and conclusion. It is the core issue. But as with bullshit, what is this insincerity? How are we to judge it, from what perspective and according to what standard?

His answer seems to be that bullshit is to sincerity as a lie to the truth. This implies that the bullshitter knows they are insincere in the way the liar knows they are being untruthful. And as the bullshitter doesn’t care about truth, the liar doesn’t care about sincerity. This assumes that the intention of a speaker can be known, both to the presumed bullshitter and to the one perceiving (or accusing) them as a bullshitter. We know bullshit when we hear it, as we know porn when we see it.

After much analysis, the ultimate conclusion is that, “sincerity itself is bullshit.” Bullshit is insincere and sincerity is bullshit. How clever! But there is a genuine point being made. Frankfurt’s ideal is that of truth, not sincerity. Truth and sincerity aren’t polar opposite ideals. They are separate worldviews and attitudes, so the argument goes.

Coming to the end of the essay, I immediately realized what this conflict was. It is an old conflict, going back at least to Socrates, though it is part of larger transcultural changes that happened in the post-bicameral Axial Age. Socrates is simply the standard originating point for Western thought, the frame we prefer since Greece represents the earliest known example of a democracy (as a highly organized political system within an advanced civilization).

Socrates, as known through the writings of Plato, is often portrayed as the victim of democracy’s dark populism. The reality, though, is that Plato was severely anti-democratic and associated with those behind the authoritarian forces that sought to destroy Athenian democracy. His fellow Athenians didn’t take kindly to this treasonous threat, whether or not it was just and fair to blame Socrates (we shall never know since we lack the details of the accusation and evidence, as no official court proceedings are extant).

What we know, from Plato, is that Socrates had issues with the Sophists. So, who were these Sophists? It’s a far more interesting question than it first appears. It turns out that the word has a complicated history. It originally referred to poets, the original teachers of wisdom in archaic Greek society. And it should be recalled that the poets were specifically excluded from Plato’s utopian society because of the danger that, in Plato’s mind, they posed to rationalistic idealism.

What did the poets and Sophists have in common? They both used language to persuade, through language that was concrete rather than abstract, emotional rather than detached. Plato was interested in big ‘T’ absolute Truth, whereas those employing poetry and rhetoric were interested in small ‘t’ relative truths that were on a human scale. Ancient Greek poets and Sophists weren’t necessarily untruthful but simply indifferent to Platonic ideals of Truth.

This does relate to Frankfurt’s theory of bullshit. Small ‘t’ truths are bullshit, or at least are easily seen in this light. The main example he uses demonstrates this. A friend of Ludwig Wittgenstein’s was sick, and she told him that, “I feel just like a dog that has been run over.” Wittgenstein saw this as careless use of language, not even meaningful enough to be wrong. It was a human truth, instead of a philosophical Truth.

Her statement expressed a physical and emotional experience. One could even argue that Wittgenstein was wrong about a human not being able to know what a hurt dog feels like, as mammals have similar biology and neurology. Besides, for all we know, this friend had a pet dog run over by a car and was speaking from having a closer relationship to that dog than she had to Wittgenstein. Reading this account, Wittgenstein comes off as someone with severe Asperger’s, and indeed plenty of people have speculated elsewhere about this possible diagnosis. Whatever the case, his response was obtuse and callous.

It is hard to know what relevance such an anecdote might have in clarifying the meaning of bullshit. What it does make clear is that there are different kinds of truths.

This is what separated Socrates and Plato on one side and the poets and Sophists on the other. The Sophists had inherited a tradition of teaching from the poets and it was a tradition that became ever more important in the burgeoning democracy. But it was an era when the power of divine voice still clung to the human word. Persuasion was a power not to be underestimated, as the common person back then hadn’t yet developed the thick-boundaried intellectual defensiveness against rhetoric that we moderns take for granted. Plato sought a Truth that was beyond both petty humans and petty gods, a longing to get beyond all the ‘bullshit’.

Yet it might be noted that some even referred to Socrates and Plato as Sophists. They too used rhetoric to persuade. And of course, the Platonic tradition is the foundation of modern religion (e.g., the Neoplatonic Alexandrian Jews who helped shape early Christian theology and Biblical exegesis), the great opponent of the Enlightenment tradition of rationality.

This is why some, instead, prefer to emphasize the divergent strategies of Plato and Aristotle, the latter making its own accusations of bullshit against the former. From the Aristotelian view, Platonism is a belief system proclaiming truth all the while willfully detached from reality. The Platonic concern with Truth, from this perspective, can seem rather meaningless, maybe so meaningless as to not even be false. The Sophists who opposed Socrates and Plato at least were interested in practical knowledge that applied to the real world of human society, dedicated as they were to teaching the skills necessary for a functioning democracy.

As a side note, the closest equivalent to the Sophists today is the liberal arts professor who hopes to instill a broad knowledge in each new generation of students. It’s quite telling that those on the political right are the most likely to make accusations of bullshit against the liberal arts tradition. A traditional university education was founded on philology, the study of languages, and the teaching of rhetoric was standard in education into the early 1900s. Modern Western civilization was built on the values of the Sophists: the ideal of a well-rounded education and the central importance of language, including the ability to speak well and persuasively, to logically defend an argument, and to rhetorically make a case. The Sophists saw that to have a democratic public, what was needed was an educated public.

Socrates and Plato came from more of what we’d call an aristocratic tradition. They were an enlightened elite, born into wealth, luxury, and privilege. This put them in opposition to the emerging democratic market of ideas. The Sophists were seen as mercenary philosophers who would teach or do anything for money. Socrates didn’t accept money from his students, but then again he was independently wealthy (in that he didn’t have to work because slaves did the work for him). He wanted pure philosophy, unadulterated by coarse human realities such as making a living and democratic politics.

It’s not that Socrates and Plato were necessarily wrong. Sophists were a diverse bunch, some using their talents for the public good and others not so much. They were simply the well-educated members of the perceived meritocracy who offered their expertise in exchange for payment. It seems like a rather normal thing to do in a capitalist society such as ours, but back then a market system was a newfangled notion that seemed radically destabilizing to the social order. Socrates and Plato were basically the reactionaries of their day, nostalgically longing for what they imagined was being lost. Yet they were helping to create an entirely new society, wresting it from the control and authority of tradition. Plato offered a radical utopian vision precisely because he was a reactionary, in terms of how the reactionary is explained by Corey Robin.

Socrates and Plato were challenging the world they were born into. Like all reactionaries, they had no genuine interest in a conservative-minded defense of the status quo. It would take centuries for their influence to grow so large as to become a tradition of its own. Even then, they laid the groundwork for future radicalism during the Renaissance, Protestant Reformation, and Enlightenment Age. Platonic idealism is the seed of modern idealism. What was reactionary in classical Greece fed into a progressive impulse about two millennia later, the anti-democratic leading to the eventual return of democratization. The fight against ‘bullshit’ became the engine of change that overthrew the European ancien régime of Catholicism, feudalism, aristocracy, and monarchy. Utopian visions such as that of Plato’s Republic became increasingly common.

Thinking along these lines brought to mind a recent post of mine, Poised on a Knife Edge. I was once again considering the significance of the ‘great debate’ between Edmund Burke and Thomas Paine. It was Paine who was more of the inheritor of Greek idealism, but unlike some of the early Greek idealists he was very much putting idealism in service of democracy, not some utopian vision above and beyond the messiness of public politics. It occurred to me that, to Paine and his allies, Burke’s attack on the French Revolution was ‘bullshit’. The wardrobe of the moral imagination was deemed rhetorical obfuscation, a refusal of the plain speech and plain honest truth favored by Paine (and by Socrates).

Let me explain why this matters. As I began reading Frankfurt’s “On Bullshit”, I was naturally pulled into the view presented. Pretty much everyone hates bullshit. But I considered a different possible explanation for this. Maybe bullshit isn’t more common than before. Maybe it’s even less common in some sense. It’s just that, as a society that idealizes truth, the category of bullshit represents something no longer respected or understood. We’ve lost touch with something within our own human nature. Our hyper-sensitivity in seeing bullshit everywhere, almost a paranoia, is an indication of this.

As much as I love Paine and his vision, I have to give credit where it is due by acknowledging that Burke managed to catch hold of a different kind of truth, a very human truth. He warned us about treading cautiously on the sacred ground of the moral imagination. On this point, I think he was right. We are too careless.

Frankfurt talks about the ‘bullshit artist’. Bullshitters are always artists. And maybe artists are always bullshitters. This is because the imagination, moral or otherwise, is the playground of the bullshitter. The artist, the master of imagination, is different from a craftsman. The artist always has a bit of the trickster about him, as he plays at the boundaries of the mind. Here is how Frankfurt explains it:

“Wittgenstein once said that the following bit of verse by Longfellow could serve him as a motto:

“In the elder days of art
Builders wrought with greatest care
Each minute and unseen part,
For the Gods are everywhere.

“The point of these lines is clear. In the old days, craftsmen did not cut corners. They worked carefully, and they took care with every aspect of their work. Every part of the product was considered, and each was designed and made to be exactly as it should be. These craftsmen did not relax their thoughtful self-discipline even with respect to features of their work which would ordinarily not be visible. Although no one would notice if those features were not quite right, the craftsmen would be bothered by their consciences. So nothing was swept under the rug. Or, one might perhaps also say, there was no bullshit.

“It does seem fitting to construe carelessly made, shoddy goods as in some way analogues of bullshit. But in what way? Is the resemblance that bullshit itself is invariably produced in a careless or self-indulgent manner, that it is never finely crafted, that in the making of it there is never the meticulously attentive concern with detail to which Longfellow alludes? Is the bullshitter by his very nature a mindless slob? Is his product necessarily messy or unrefined? The word shit does, to be sure, suggest this. Excrement is not designed or crafted at all; it is merely emitted, or dumped. It may have a more or less coherent shape, or it may not, but it is in any case certainly not wrought.

“The notion of carefully wrought bullshit involves, then, a certain inner strain. Thoughtful attention to detail requires discipline and objectivity. It entails accepting standards and limitations that forbid the indulgence of impulse or whim. It is this selflessness that, in connection with bullshit, strikes us as inapposite. But in fact it is not out of the question at all.”

This is logos vs mythos. In religious terms, it is the One True God who creates ex nihilo vs the demiurgic god of this world. And in Platonic terms, it is the idealistic forms vs concrete substance, where the latter is a pale imitation of the former. As such, truth is unique whereas bullshit is endless. The philosopher and the poet represent opposing forces. To the philosopher, everything is either philosophically relevant or bullshit. But to the poet (and his kin), this misses the point and overlooks the essence of our humanity. Each side makes sense, according to the perspective of each side. And so each side is correct about what is wrong with the other side.

If all bullshit was eliminated and all further bullshit made impossible, what would be left of our humanity? Maybe our very definition of truth is dependent on bullshit, both as a contrast and an impetus. Without bullshit, we might no longer be able to imagine new truths. But such imagination, if not serving greater understanding, is of uncertain value and potentially dangerous to society. For good or ill, the philosopher, sometimes obtuse and detached, and the artist, sometimes full of bullshit, are the twin representatives of civilization as we know it.

* * *

“I had my tonsils out and was in the Evelyn Nursing Home feeling sorry for myself. Wittgenstein called.”
by Ann Althouse

Short of Lying
by Heinz Brandenburg

Bullshit as the Absence of Truthfulness
by Michael R. Kelly

Democracy is not a truth machine
by Thomas R. Wells

Our ability as individuals to get to true facts merely by considering different arguments is distinctly limited. If we only know of one account of the holocaust – what we were taught in school – we are likely to accept it. But whether it is true or false is a matter of luck rather than our intellectual capacities. Now it is reasonable to suppose that if we were exposed to a diversity of claims about the holocaust then our opinions on the subject would become more clearly our own, and our own responsibility. They would be the product of our own intellectual capacities and character instead of simply reflecting which society we happened to be born into. But so what? Holding sincere opinions about whether the holocaust happened is all very well and Millian, but it has no necessary relation to their truth. As Harry Frankfurt notes in his philosophical essay On Bullshit, sincerity is concerned with being true to oneself, not to the nature of the world: from the perspective of truth seeking, sincerity is bullshit.

Knowing this, we can have no faith that the popularity of certain factual claims among people as ordinary as ourselves is any guide to their truth. Democracy is no more equipped to evaluate facts than rational truths. We can all, of course, hold opinions about the civilisational significance of the holocaust and its status as a justification for the state of Israel, and debate them with others in democratic ways. Yet, when it comes to the facts, neither the sincerity with which individuals believe that ‘the holocaust’ is a myth nor the popularity of such beliefs can make them epistemically respectable. 90% of the population denying the holocaust is irrelevant to its truth status. And vice versa.

Rhetoric and Bullshit
by James Fredal

Frankfurt is also indebted (indirectly) to Plato: Phaedrus is as much about the bullshitter’s (Lysias’s or the non-lover’s) lack of concern for (or “love” for) the truth as is Frankfurt’s brief tome. From the perspective of Plato, Lysias’s speech in praise of the non-lover is just so much bullshit not simply because it is not true, but because Lysias is not concerned with telling the truth so much as he is with gaining the affection and attention of his audience: the beloved boy, the paying student or, more to the point, that lover of speeches, Phaedrus himself.

The non-lover described by Lysias in Phaedrus is best understood as Plato’s allegory for sophists who reject any “natural” truth and who remain committed to contradictory arguments as the practical consequence of their general agnosticism. For Lysias’s non-lover, language is not for telling the truth, because the truth is inaccessible: language is for finding and strengthening positions, for gaining advantage, and for exerting influence over others. Richard Weaver offers a similar reading of Phaedrus that sees the non-lover as representing an attitude toward language use (though for Weaver the non-lover is not a sophist, but a scientist).

Others interested in the bullshitter apply a different, more favorable lens. Daniel Mears, for example, draws on Chandra Mukerji’s study of bullshit among hitchhikers, and more generally on Erving Goffman’s study of self-presentation in the interaction order (for example, “Role Distance” and Interaction Rituals) to highlight bullshit as a form of impression management: what, as Mears notes, Suzanne Eggins and Diana Slade call a “framing device” for the “construction and maintenance of our social identities and social relationships” (qtd. in Mears 279). For Mears, bullshit is the deliberate (albeit playful) creation of possible but ultimately misleading impressions of self or reality, whether for expressive or instrumental reasons (4).

Like Frankfurt, Mears locates the source of bullshit in the speaker herself and her desire to craft a creditable self-image. But whereas Frankfurt sees bullshitting as a species of deception worse than lying (because at least liars have to know the truth if only to lead us away from it, whereas bullshitters have no concern at all for the truth), Mears understands bullshit as a significant social phenomenon that serves several prosocial functions. For Mears, we engage in bullshit for purposes of socialization and play, for self-exploration and self-expression, for the resolution of social tensions and cognitive dissonance, and for gaining an advantage in encounters.

Like Mukerji, Mears emphasizes the playful (though often nontrivial and highly consequential) quality of bullshit, much as the ancient sophists composed speeches as “play”: as exercises and exempla, for enjoyment, for display and impression management, and for study separate from the “real world” of politics and law.

Rhetoric Is Not Bullshit
by David J. Tietge
from Bullshit and Philosophy
Kindle Locations 3917-4003

The Truth about Postmodernism

One issue that helps obscure the universality of rhetoric, and thus promotes the pejorative use of ‘rhetoric’, is the popular tendency to oversimplify the “truth-lie” dichotomy. In The Liar’s Tale: A History of Falsehood, Jeremy Campbell reminds us that the reductionistic binary that separates truth from falsity is not only in error, but also that the thoroughly unclear and inconsistent distinction between the true and the false has a long, rich cultural history. Those doing much of the speaking in our own era, however, assume that the dividing line between truth and untruth is clear and, more significantly, internalized by the average human. Truth, however, is an elusive concept. While we can cite many examples of truths (that the sky is blue today, that the spoon will fall if dropped, and so forth), these depend on definitions of the words used. The sky is blue because ‘blue’ is the word we use to describe the hue that we have collectively agreed is bluish. We may, however, disagree about what shade of blue the sky is. Is it powder blue? Blue-green? Royal blue? Interpretive responses to external realities that rely on definition (and language generally) always complicate the true-false binary, especially when we begin to discuss the nature of abstractions involved in, say, religion or metaphysics. The truth of ‘God is good’ depends very heavily upon the speaker’s understanding of God and the nature of goodness, both of which depend upon the speaker’s conceptualization, which may be unique to him, his group, or his cultural environment, and thus neither clear nor truthful to other parties.

Is this rampant relativism? Some might think so, but it is perhaps more useful to suggest that the Absolute Truths that we usually embrace are unattainable because of these complexities of language. Some cultures have seen the linguistic limitations of specifying the Truth. Hinduism has long recognized that language is incapable of revealing Truth; to utter the Truth, it holds, is simultaneously to make it no longer the Truth.

Note here the distinction between capital ‘T’ truth and lower-case ‘t’ truth. Lower-case truths are situational, even personal. They often reflect more the state of mind of the agent making the utterance than the immutable nature of the truth. They are also temporally situated; what may be true now may not be in the future. Truth in this sense is predicated on both perception and stability, and, pragmatically speaking, such truths are transitional and, often, relative. Capital ‘T’ Truths can be traced back at least as far as Plato, and are immutable, pure, and incorruptible. They do not exist in our worldly realm, at least so far as Plato was concerned. This is why Plato was so scornful of rhetoric: he felt that rhetoricians (in particular, the Sophists) were opportunists who taught people how to disguise the Truth with language and persuasion. Whereas Plato imagined a realm in which the worldly flaws and corruption of a physical existence were supplanted by perfect forms, the corporeal domain of human activity was saturated with language, and therefore could not be trusted to reveal Truth with any certainty.

Contemporary, postmodern interest in truth and meaning turns the tables on Plato and studies meaning and truth in this shifting, less certain domain of human activity. Campbell cites many thinkers from our philosophical past who helped inaugurate this development, but none is more important than Friedrich Nietzsche. For Nietzsche, humans have no “organ” for discerning Truth, but we do have a natural instinct for falsehood. “Truth,” as an abstraction taken from the subjectivity of normal human activities, was a manufactured fiction that we are not equipped to actually find. On the other hand, a natural aptitude for falsehood is an important survival mechanism for many species. Human beings have simply cultivated it in innovative, sophisticated ways. As the rhetorician George A. Kennedy has noted, “in daily life, many human speech acts are not consciously intentional; they are automatic reactions to situations, culturally (rather than genetically) imprinted in the brain or rising from the subconscious.”181 Our propensity for appropriate (if not truthful) responses to situations is something nourished by an instinct to survive, interact, protect, and socialize. Civilization gives us as many new ways to do this as there are situations that require response.

This is why Nietzsche carefully distinguished Truth from a belief system that only professed to contain the Truth. Ken Gemes notes that Nietzsche co-ordinated the question of Truth around the pragmatics of survival,182 an observation echoed by Kennedy, who provides examples of animals that deceive for self-preservation. Camouflage, for example, can be seen in plants and animals. Many birds imitate the calls of rival species to fool them to distraction and away from their nests or food sources. Deception, it seems, is common in nature. But Nietzsche took doctrinal Truth (note the “T”) to be one of the most insidious deceptions to occur in human culture, especially as it is articulated in religions. It is not a basic lie that is being promulgated, but rather a lie masquerading as the Truth and, according to Nietzsche, performing certain functions. Truth, that is, is a ritualized fiction, a condition manufactured for institutions and the individuals who control them to maintain their power.

Rhetoric and Bullshit

Truth, deception, control over others. This survey of rhetoric thus brings us close to the territory that Harry Frankfurt explores in On Bullshit. For Frankfurt, however, bullshit has little to do with these complexities about truth and Truth that rhetoric helps us identify. Indeed bullshit, for Frankfurt, has little to do with truth at all, insofar as it requires an indifference to truth. Does this mean, then, that language that is not bullshit has settled the matter of truth and has access to truth (or Truth)? Does this lead us to a dichotomy between truth and bullshit that is similar to the dichotomy between truth and falsity that postmodernism criticizes? It may seem that postmodernism has little place in Frankfurt’s view, insofar as he rejects “various forms of skepticism which deny that we have any reliable access to objective reality, and which therefore reject the possibility of knowing how things truly are” (p. 64). Indeed, postmodernism is often vilified as the poster child of relativism and skepticism.

Yet postmodernism is far subtler than a mere denial of “objective reality.” Postmodernism claims, rather, that reality is as much a construct of language as it is objective and unchanging. Postmodernism is less about rejecting beliefs about objective reality than about the intersection between material reality and the human interpretations of it that change, mutate, and shift that reality to our own purposes—the kind of small-t truths that Nietzsche addressed. The common complaint about postmodernism, for example, that it denies “natural laws,” forgets that humans noticed and formulated those laws. Postmodernism attempts to supply a vocabulary to describe this kind of process. It is not just “jargon,” as is so often charged; it is an effort to construct a metalinguistic lexicon for dealing with some very difficult and important epistemological questions.

And, not surprisingly, so is rhetoric. Constructing language that deals with the nature of language is a unique human problem. It is meta-cognition at its most complicated because it requires us to use the same apparatus to decode human texts that is contained in the texts themselves—that is, using words to talk about words, what Kenneth Burke referred to in The Rhetoric of Religion as “logology.”183 In no other area of human thinking is this really the case. Most forms of intellectual exploration involve an extraneous phenomenon, event, agent, or object that requires us to bring language to bear upon it in order to observe, describe, classify, and draw conclusions about its nature, its behavior, or its effect. For example, scientific inquiry usually involves an event or a process in the material world that is separate from the instruments we use to describe it. Historical analysis deals with texts as a matter of disciplinary course, yet most historians rarely question the efficacy or the reliability of the language used to convey an event of the remote (or, for that matter, recent) past. Even linguistics, which uses a scientific model to describe language structure, deals little with meaning or textual analysis.

Law is one of the closest cousins of rhetoric. Words are very much a part of the ebb and flow of legal wrangling, and the attention given to meaning and interpretation is central. Yet, even here, there is little theoretical discussion about how words have meaning or how, based on such theory, that meaning can be variously interpreted. Law is more concerned with the fact that words can be interpreted differently and how different agents might interpret language in different ways. This is why legal documents are often so unreadable; in an attempt to control ambiguity, more words (and more words with specific, technical meanings) must be used so that multiple interpretations can be avoided. If theoretical discussions about how language generates meaning were entered into the equation, the law would be impossible to apply in any practical way. Yet, to understand legal intricacies, every law student should be exposed to rhetoric—not so they can better learn how to manipulate a jury or falsify an important document, but so they understand how tenuous and limited language actually is for dealing with ordinary situations. Moreover, nearly every disciplinary area of inquiry uses language, but only rhetoric (and its associated disciplines, especially philosophy of language and literary/cultural criticism, which have influenced the development of modern rhetoric considerably) analyzes language using a hermeneutical instrument designed to penetrate the words to examine their effects—desired or not—on the people who use them.

What, then, qualifies as “bullshit”? Certainly, as I hope I have shown, rhetoric and bullshit are hardly the same thing. They are not even distant cousins. When a student begins a paper with the sentence, “In today’s society, there are many things that people have different and similar opinions about,” it’s a pretty good guess that there is little of rhetorical value there. About the only conclusion a reader can draw is that the student is neither inspired nor able to hide this fact. This is the extent of the subtext, and it could conceivably qualify as bullshit. In this sense, Frankfurt’s characterization of bullshit as “unavoidable whenever circumstances require someone to talk without knowing what he is talking about” (p. 63) is a useful differentiation.

But aside from these rather artificial instances, if bullshit does occur at the rate Frankfurt suggests, we have an arduous task in separating the bullshit from more interesting and worthy rhetorical situations. We have all met people whom we know, almost from the moment of acquaintance, are full of bullshit. It is the salesman syndrome that some people just (naturally, it seems) possess. In one sense, then, poor rhetoric—a rhetoric of transparency or obviousness—can be construed as bullshit. For the person with salesman syndrome is certainly attempting to achieve identification with his audience; he may even be attempting to persuade others that he is upright or trustworthy. But he fails because his bullshit is apparent. He is a bad rhetorician in the sense that he fails to convince others that he should be taken seriously, that his words are worthy of attention and, possibly, action.

Bullshit is something we can all recognize. Rhetoric is not. My remedy for this situation is simple: learn rhetoric.

 

The Sociology of Intellectual Life
by Steve Fuller
pp. 147-8

Harry Frankfurt’s (2005) On Bullshit is the latest contribution to a long, distinguished, yet deeply problematic line of Western thought that has attempted to redeem the idea of intellectual integrity from the cynic’s suspicion that it is nothing but high-minded, self-serving prejudice. I say ‘problematic’ because while Plato’s unflattering portrayal of poets and sophists arguably marked the opening salvo in the philosophical war against bullshit, Plato availed himself of bullshit in promoting the ‘myth of the metals’ as a principle of social stratification in his Republic. This doublethink has not been lost on the neo-conservative followers of the great twentieth century Platonist Leo Strauss. […]

The bullshit detector aims to convert an epistemic attitude into a moral virtue: reality can be known only by the right sort of person. This idea, while meeting with widespread approval by philosophers strongly tied to the classical tradition of Plato and Aristotle, is not lacking in dissenters. The line of dissent is best seen in the history of ‘rhetoric’, a word Plato coined to demonize Socrates’ dialectical opponents, the sophists. The sophists were prepared to teach anyone the art of winning arguments, provided you could pay the going rate. As a series of sophistic interlocutors tried to make clear to Socrates, possession of the skills required to secure the belief of your audience is the only knowledge you really need to have. Socrates famously attacked this claim on several fronts, which the subsequent history of philosophy has often conflated. In particular, Socrates’ doubts about the reliability of the sophists’ techniques have been run together with a more fundamental criticism: even granting the sophists their skills, they are based on a knowledge of human gullibility, not of reality itself.

Bullshit is sophistry under this charitable reading, which acknowledges that the truth may not be strong enough by itself to counteract an artfully presented claim that is not so much outright false as, in the British idiom, ‘economical with the truth’. In stressing the difference between bullshit and lies, Frankfurt clearly has this conception in mind, though he does sophistry a disservice by casting the bullshitter’s attitude toward the truth as ‘indifference’. On the contrary, the accomplished bullshitter must be a keen student of what people tend to regard as true, if only to cater to those tendencies so as to serve her own ends. What likely offends Frankfurt and other philosophers here is the idea that the truth is just one more tool to be manipulated for personal advantage. Conceptual frameworks are simply entertained and then discarded as their utility passes. The nature of the offence, I suspect, is the divine eye-view implicated in such an attitude – the very idea that one could treat in a detached fashion the terms in which people normally negotiate their relationship to reality. A bullshitter revealed becomes a god unmade.

pp. 152-3

The bullshit detector believes not only that there is a truth but also that her own access to it is sufficiently reliable and general to serve as a standard by which others may be held accountable. Protestants appeared prepared to accept the former but not the latter condition, which is why dissenters were encouraged – or perhaps ostracized – to establish their own ministries. The sophists appeared to deny the former and possibly the latter condition as well. Both Protestants and sophists are prime candidates for the spread of bullshit because they concede that we may normally address reality in terms it does not recognize – or at least do not require it to yield straight ‘yes-or-no’, ‘true-or-false’ answers. In that case, we must make up the difference between the obliqueness of our inquiries and the obtuseness of reality’s responses. That ‘difference’ is fairly seen as bullshit. When crystallized as a philosophy of mind or philosophy of language, this attitude is known as antirealism. Its opposite number, the background philosophy of bullshit detectors, is realism.

The difference in the spirit of the two philosophies is captured as follows: do you believe that everything you say and hear is bullshit unless you have some way of showing whether it is true or false; or rather, that everything said and heard is simply true or false, unless it is revealed to be bullshit? The former is the antirealist, the latter the realist response. Seen in those terms, we might say that the antirealist regards reality as inherently risky and always under construction (Caveat credor: ‘Let the believer beware!’) whereas the realist treats reality as, on the whole, stable and orderly – except for the reprobates who try to circumvent the system by producing bullshit. In this respect, On Bullshit may be usefully read as an ad hominem attack on antirealists. Frankfurt himself makes passing reference to this interpretation near the end of the essay (Frankfurt 2005: 64–65). Yet, he appears happy to promote the vulgar image of antirealism as intellectually, and perhaps morally, slipshod, instead of treating it as the philosophically honorable position that it is.

A case in point is Frankfurt’s presentation of Wittgenstein as one of history’s great bullshit detectors (Frankfurt 2005: 24–34). He offers a telling anecdote in which the Viennese philosopher objects to Fania Pascal’s self-description as having been ‘sick as a dog’. Wittgenstein reportedly told Pascal that she misused language by capitalizing on the hearer’s easy conflation of a literal falsehood with a genuine condition, which is made possible by the hearer’s default anthropocentric bias. Wittgenstein’s objection boils down to claiming that, outside clearly marked poetic contexts, our intellectual end never suffices alone to justify our linguistic means. Frankfurt treats this point as a timeless truth about how language structures reality. Yet, it would be quite easy, especially recalling that this ‘truth’ was uttered seventy years ago, to conclude that Wittgenstein’s irritation betrays a spectacular lack of imagination in the guise of scrupulousness.

Wittgenstein’s harsh judgement presupposes that humans lack any real access to canine psychology, which renders any appeal to dogs purely fanciful. For him, this lack of access is an established fact inscribed in a literal use of language, not an open question answers to which a figurative use of language might offer clues for further investigation. Nevertheless, scientists informed by the Neo-Darwinian synthesis – which was being forged just at the time of Wittgenstein’s pronouncement – have quite arguably narrowed the gap between the mental lives of humans and animals in research associated with ‘evolutionary psychology’. As this research makes more headway, what Wittgenstein confidently declared to be bullshit in his day may tomorrow appear as having been a prescient truth. But anyone holding such a fluid view of verifiability would derive scant comfort from either Wittgenstein or Frankfurt, who act as if English linguistic intuitions, circa 1935, should count indefinitely as demonstrable truths.

Some philosophers given to bullshit detection are so used to treating any Wittgensteinian utterance as a profundity that it never occurs to them that Wittgenstein may have been himself a grandmaster of bullshit. The great bullshit detectors whom I originally invoked, Nietzsche and Mencken, made themselves vulnerable to critics by speaking from their own self-authorizing standpoint, which supposedly afforded a clear vista for distinguishing bullshit from its opposite. Wittgenstein adopts the classic bullshitter’s technique of ventriloquism, speaking through the authority of someone or something else in order to be spared the full brunt of criticism.

I use ‘adopts’ advisedly, since the deliberateness of Wittgenstein’s rhetoric remains unclear. What was he trying to do: to speak modestly without ever having quite controlled his spontaneously haughty manner, or to exercise his self-regarding superiority as gently as possible so as not to frighten the benighted? Either way, Wittgenstein became – for a certain kind of philosopher – the standard-bearer of linguistic rectitude, where ‘language’ is treated as a proxy for reality itself. Of course, to the bullshitter, this description also fits someone whose strong personality cowed the impressionable into distrusting their own thought processes. As with most successful bullshit, the trick is revealed only after it has had the desired effect and the frame of reference has changed. Thus, Wittgenstein’s precious concern about Pascal’s account of her state of health should strike, at least some readers today, as akin to a priest’s fretting over a parishioner’s confession of impure thoughts. In each case, the latter is struck by something that lies outside the box in which the former continues to think.

If Wittgenstein was a bullshitter, how did he manage to take in professed enemies of bullshit like Frankfurt? One clue is that most bullshit is forward-looking, and Wittgenstein’s wasn’t. The bullshitter normally refers to things whose prima facie plausibility immunizes the hearer against checking their actual validity. The implication is that the proof is simply ‘out there’ waiting to be found. But is there really such proof? Here the bullshitter is in a race against time. A sufficient delay in checking sources has salvaged the competence and even promoted the prescience of many bullshitters. Such was the spirit of Paul Feyerabend’s (1975) notorious account of Galileo’s ‘discoveries’, which concluded that his Papal Inquisitors were originally justified in their scepticism, even though Galileo’s followers subsequently redeemed his epistemic promissory notes.

In contrast, Wittgenstein’s unique brand of bullshit was backward-looking, always reminding hearers and readers of something they should already know but had perhaps temporarily forgotten. Since Wittgenstein usually confronted his interlocutors with mundane examples, it was relatively easy to convey this impression. The trick lay in immediately shifting the context from the case at hand to what Oxford philosophers in the 1950s called a ‘paradigm case’ that was presented as a self-evident standard of usage against which to judge the case at hand. That Wittgenstein, a non-native speaker of English, impressed one or two generations of Britain’s philosophical elite with just this mode of argumentation remains the envy of the aspiring bullshitter. Ernest Gellner (1959), another émigré from the old Austro-Hungarian Empire, ended up ostracized from the British philosophical establishment for offering a cutting diagnosis of this phenomenon as it was unfolding. He suggested that Wittgenstein’s success testified to his ability to feed off British class anxiety, which was most clearly marked in language use. An academically sublimated form of such language-driven class anxiety remains in the discipline of sociolinguistics (Bernstein 1971–77).

Yet, after nearly a half-century, Gellner’s diagnosis is resisted, despite the palpable weakening of Wittgenstein’s posthumous grip on the philosophical imagination. One reason is that so many living philosophers still ride on Wittgenstein’s authority – if not his mannerisms – that to declare him a bullshitter would amount to career suicide. But a second reason is also operative, one that functions as an insurance policy against future debunkers. Wittgenstein is often portrayed, by himself and others, as mentally unbalanced. You might think that this would render his philosophical deliverances unreliable. On the contrary, Wittgenstein’s erratic disposition is offered as evidence for his spontaneously guileless nature – quite unlike the controlled and calculated character of bullshitters. Bullshit fails to stick to Wittgenstein because he is regarded as an idiot savant.

Democratic Republicanism in Early America

There was much debate and confusion around various terms in early America.

The word ‘democracy’ wasn’t used on a regular basis at the time of the American Revolution, even as the ideal of it was very much in the air. Instead, most people back then used the word ‘republic’ to refer to democracy. But some of the founding fathers, such as Thomas Paine, avoided such confusion by speaking directly of ‘democracy’. Thomas Jefferson, the author of the first founding document and third president, formed a political party with both ‘democratic’ and ‘republican’ in its name, demonstrating that no conflict was seen between the two terms.

The reason ‘democracy’ doesn’t come up in founding documents is that the word is too specific, although it gets alluded to when speaking of “the People” since democracy is literally “people power”. Jefferson, in writing the Declaration of Independence, was particularly clever in avoiding most language that evoked meaning that was too ideologically singular and obvious (e.g., he effectively used rhetoric to avoid the divisive debates for and against belief in natural law). That is because the founding documents were meant to unite a diverse group of people with diverse opinions. Such a vague and ambiguous word as ‘republic’ could mean almost anything to anyone and so was an easy way to paper over disagreements and differing visions. If more specific language had been used that made absolutely clear what they were actually talking about, it would have led to endless conflict, dooming the American experiment from the start.

Yet it was obvious from pamphlets and letters that many American founders and revolutionaries wanted democracy, in whole or part, to the degree they had any understanding of it. Some preferred a civic democracy with some basic social democratic elements and civil rights, while others (mostly Anti-Federalists) pushed for more directly democratic forms of self-governance. The first American constitution, the Articles of Confederation, was clearly a democratic document with self-governance greatly emphasized. Even those who were wary of democracy and spoke out against it nonetheless regularly used democratic rhetoric (invoking democratic ideals, principles, and values), because democracy was a major reason why so many fought the revolution in the first place. If not for democracy, there was little justification for and relevance in starting a new country, beyond a self-serving power grab by a new ruling elite.

Unless we assume that a large number of those early Americans had democracy in mind, their speaking of a republic makes no sense. Admittedly, some of them may not have had it clearly in mind, as they weren’t always clear in their own minds about what they did and didn’t mean. To be technical (according to even the common understanding from the 1700s), a country either is a democratic republic or a non-democratic republic. The variety of non-democratic republics would include what today we’d call theocracy, fascism, communism, etc. It is a bit uncertain exactly what kind of republic various early Americans envisioned, but one thing is certain: There was immense overlap and conflation between democracy and republicanism in the early American mind. This was the battleground of the fight between Federalists and Anti-Federalists (or, to be more accurate, between pseudo-Federalists and real Federalists).

As a label, calling something a republic says nothing at all about what kind of government it is. It says only what a government isn’t: a monarchy. Even that negative definition was muddied, since there were those who argued for a republican monarchy with an elective king, the idea being that the king would theoretically serve the citizenry that democratically elected him. Even some of the Federalists talked about this possibility of a republic with elements of a monarchy, strange as it seems to modern Americans. This is what the Anti-Federalists worried about.

Projecting our modern ideological biases onto the past is the opposite of helpful. The earliest American democrats were, by definition, republicans. And most of the earliest American republicans were heavily influenced by democratic political philosophy, even when they denounced it while co-opting it. There was no way to avoid the democratic promise of the American Revolution and the founding documents. Without that promise, we Americans would still be British. That promise remains, yet unfulfilled. The seed of an ideal is hard to kill once planted.

Still, bright ideals cast dark shadows. And the reactionary authoritarianism of the counter-revolutionaries was a powerful force. It is an enemy we still fight. The revolution never ended.

* * *

Democracy Denied: The Untold Story
by Arthur D. Robbins
Kindle Locations 2862-2929

Fascism has been defined as “an authoritarian political ideology (generally tied to a mass movement) that considers individual and other societal interests inferior to the needs of the state, and seeks to forge a type of national unity, usually based on ethnic, religious, cultural, or racial attributes.”[130] If there is a significant difference between fascism thus defined and the society enunciated in Plato’s Republic,[131] in which the state is supreme and submission to a warrior class is the highest virtue, I fail to detect it.[132] What is noteworthy is that Plato’s Republic is probably the most widely known and widely read of political texts, certainly in the United States, and that the word “republic” has come to be associated with democracy and a wholesome and free way of life in which individual self-expression is a centerpiece.

To further appreciate the difficulty that exists in trying to attach specific meaning to the word “republic,” one need only consult the online encyclopedia Wikipedia.[133] There one will find a long list of republics divided by period and type. As of this writing, there are five listings by period (Antiquity, Middle Ages and Renaissance, Early Modern, 19th Century, and 20th Century and Later), encompassing 90 separate republics covered in Wikipedia. The list of republic types is broken down into eight categories (Unitary Republics, Federal Republics, Confederal Republics, Arab Republics, Islamic Republics, Democratic Republics, Socialist Republics, and People’s Republics), with a total of 226 entries. There is some overlap between the lists, but one is still left with roughly 300 republics— and roughly 300 ideas of what, exactly, constitutes a republic.

One might reasonably wonder what useful meaning the word “republic” can possibly have when applied in such diverse political contexts. The word— from “res publica,” an expression of Roman (i.e., Latin) origin— might indeed apply to the Roman Republic, but how can it have any meaning when applied to ancient Athens, which had a radically different form of government existing in roughly the same time frame, and where res publica would have no meaning whatsoever?

Let us recall what was going on in Rome in the time of the Republic. Defined as the period from the expulsion of the Etruscan kings (509 B.C.) until Julius Caesar’s elevation to dictator for life (44 B.C.),[134] the Roman Republic covered a span of close to five hundred years in which Rome was free of despotism. The title rex was forbidden. Anyone taking on kingly airs might be killed on sight. The state of affairs that prevailed during this period reflects the essence of the word “republic”: a condition— freedom from the tyranny of one-man rule— and not a form of government. In fact, The American Heritage College Dictionary offers the following as its first definition for republic: “A political order not headed by a monarch.”

[…] John Adams (1735–1826), second President of the United States and one of the prime movers behind the U.S. Constitution, wrote a three-volume study of government entitled Defence of the Constitutions of Government of the United States of America (published in 1787), in which he relies on the writings of Cicero as his guide in applying Roman principles to American government.[136] From Cicero he learned the importance of “mixed governments,”[137] that is, governments formed from a mixture of monarchy, aristocracy, and democracy. According to this line of reasoning, a republic is a non-monarchy in which there are monarchic, aristocratic, and democratic elements. For me, this is confusing. Why, if one had just shed blood in unburdening oneself of monarchy, with a full understanding of just how pernicious such a form of government can be, would one then think it wise or desirable to voluntarily incorporate some form of monarchy into one’s new “republican” government? If the word “republic” has any meaning at all, it means freedom from monarchy.

The problem with establishing a republic in the United States was that the word had no fixed meaning to the very people who were attempting to apply it. In Federalist No. 6, Alexander Hamilton says, “Sparta, Athens, Rome and Carthage were all republics” (F.P., No. 6, 57). Of the four mentioned, Rome is probably the only one that even partially qualifies according to Madison’s definition from Federalist No. 10 (noted earlier): “a government in which the scheme of representation takes place,” in which government is delegated “to a small number of citizens elected by the rest” (ibid., No. 10, 81-82).

Madison himself acknowledges that there is a “confounding of a republic with a democracy” and that people apply “to the former reasons drawn from the nature of the latter” (ibid., No. 14, 100). He later points out that were one trying to define “republic” based on existing examples, one would be at a loss to determine the common elements. He then goes on to contrast the governments of Holland, Venice, Poland, and England, all allegedly republics, concluding, “These examples … are nearly as dissimilar to each other as to a genuine republic” and show “the extreme inaccuracy with which the term has been used in political disquisitions” (ibid., No. 39, 241).

Thomas Paine offers a different viewpoint: “What is now called a republic, is not any particular form of government. It is wholly characteristical [sic] of the purport, matter, or object for which government ought to be instituted, and on which it is to be employed, res-publica, the public affairs or the public good” (Paine, 369) (italics in the original). In other words, as Paine sees it, “res-publica” describes the subject matter of government, not its form.

Given all the confusion about the most basic issues relating to the meaning of “republic,” what is one to do? Perhaps the wisest course would be to abandon the term altogether in discussions of government. Let us grant the word has important historical meaning and some rhetorical appeal. “Vive la République!” can certainly mean thank God we are free of the tyranny of one-man, hereditary rule. That surely is the sense the word had in early Rome, in the early days of the United States, and in some if not all of the French and Italian republics. Thus understood, “republic” refers to a condition— freedom from monarchy— not a form of government.

* * *

Roger Williams and American Democracy
US: Republic & Democracy (part two and three)
Democracy: Rhetoric & Reality
Pursuit of Happiness and Consent of the Governed
The Radicalism of The Articles of Confederation
The Vague and Ambiguous US Constitution
Wickedness of Civilization & the Role of Government
Spirit of ’76
A Truly Free People
Nature’s God and American Radicalism
What and who is America?
Thomas Paine and the Promise of America
About The American Crisis No. III
Feeding Strays: Hazlitt on Malthus
Inconsistency of Burkean Conservatism
American Paternalism, Honor and Manhood
Revolutionary Class War: Paine & Washington
Paine, Dickinson and What Was Lost
Betrayal of Democracy by Counterrevolution
Revolutions: American and French (part two)
Failed Revolutions All Around
The Haunted Moral Imagination
“Europe, and not England, is the parent country of America.”
“…from every part of Europe.”

The Fight For Freedom Is the Fight To Exist: Independence and Interdependence
A Vast Experiment
America’s Heartland: Middle Colonies, Mid-Atlantic States and the Midwest
When the Ancient World Was Still a Living Memory

Dark Matter of the Mind

The past half year has been spent in anticipation. Daniel Everett has a new book that finally came out the other day: Dark Matter of the Mind. I was so curious to read it because Everett is the newest and best-known challenger to mainstream linguistics theory. This interests me because it happens to directly touch upon every aspect of our humanity: human nature (vs. nurture), self-identity, consciousness, cognition, perception, behavior, culture, philosophy, etc.

The leading opponent of Everett’s theory is Noam Chomsky, a well-known and well-respected public intellectual. Chomsky is the founder of the so-called cognitive revolution — not that Everett sees it as all that revolutionary: “it was not a revolution in any sense, however popular that narrative has become” (Kindle Location 306). That brings into the conflict issues of personality, academia, politics, and funding. It’s two paradigms clashing, one of them having been dominant for more than half a century.

Now that I’ve been reading the book, I find my response mixed. Everett is running headlong into difficult terrain, and I must admit he does so competently. He is doing the tough scholarly work that needs to be done. As Bill Benzon explained (at 3 Quarks Daily):

“While the intellectual world is rife with specialized argumentation arrayed around culture and associated concepts (nature, nurture, instinct, learning) these concepts themselves do not have well-defined technical meanings. In fact, I often feel they are destined to go the way of phlogiston, except that, alas, we’ve not yet discovered the oxygen that will allow us to replace them [4]. These concepts are foundational, but the foundation is crumbling. Everett is attempting to clear away the rubble and start anew on cleared ground. That’s what dark matter is, the cleared ground that becomes visible once the rubble has been pushed to the side. Just what we’ll build on it, and how, that’s another question.”

This explanation points to a fundamental problem, if we are to consider it a problem. Earlier in the piece, Benzon wrote that, “OK, I get it, I think, you say, but this dark matter stuff is so vague and metaphorical. You’re right. And it remains that way to the end of the book. And that, I suppose, is my major criticism, though it’s a minor one. “Dark matter” does a lot of conceptual work for Everett, but he discusses it indirectly.” Basically, Everett struggles with a limited framework of terminology and concepts. But that isn’t entirely his fault. It’s not exactly new territory that Everett discovered, just not yet fully explored and mapped out. The main thing he did, in his earliest work, was to bring up evidence that simply did not fit into prevailing theories. And now in a book like this he is trying to make sense of what that evidence indicates and what theory better explains it.

It would have been useful if Everett had been able to give a fuller survey of the relevant scholarship. But if he had, it would have been a larger and more academic book. It is already difficult enough for most readers not familiar with the topic. Besides, I suspect that Everett was pushing against the boundaries of his own knowledge and readings. It was easy for me to see everything that was left out, in relation to numerous other fields beyond his focus of linguistics and anthropology — such as neurocognitive research, consciousness studies, classical studies of ancient texts, voice-hearing and mental health, etc.

The book sometimes felt like reinventing the wheel. Everett’s expertise is in linguistics, and apparently that has been an insular field of study defended by a powerful and entrenched academic establishment. My sense is that linguistics is far behind in development, compared to many other fields. The paradigm shift that is only now happening in linguistics has been creating seismic shifts elsewhere in academia for decades. Some argue that this is because linguistics became enmeshed in Pentagon-funded computer research and so has had a hard time disentangling itself in order to become an independent field once again. Chomsky, as leader of the cognitive revolution, has effectively dissuaded a generation of linguists from doing social science, instead promoting the hard sciences, a problematic position to hold about a rather soft field like linguistics. As anthropologist Chris Knight explains it, in Decoding Chomsky (Chapter 1):

“[O]ne bedrock assumption underlies his work. If you want to be a scientist, Chomsky advises, restrict your efforts to natural science. Social science is mostly fraud. In fact, there is no such thing as social science.[49] As Chomsky asks: ‘Is there anything in the social sciences that even merits the term “theory”? That is, some explanatory system involving hidden structures with non-trivial principles that provide understanding of phenomena? If so, I’ve missed it.’[50]

“So how is it that Chomsky himself is able to break the mould? What special factor permits him to develop insights which do merit the term ‘theory’? In his view, ‘the area of human language . . . is one of the very few areas of complex human functioning’ in which theoretical work is possible.[51] The explanation is simple: language as he defines it is neither social nor cultural, but purely individual and natural. Provided you acknowledge this, you can develop theories about hidden structures – proceeding as in any other natural science. Whatever else has changed over the years, this fundamental assumption has not.”

This makes Everett’s job harder than it should be, in breaking new ground in linguistics and in trying to connect it to the work already done elsewhere, most often in the social sciences. As humans are complex social animals living in a complex world, it is bizarre and plain counterproductive to study humans in the way one studies a hard science like geology. Humans aren’t isolated biological computers that can operate outside of the larger context of specific cultures and environments. But Chomsky simply assumes all of that is irrelevant on principle. Field research of actual functioning languages, as Everett has done, can be dismissed because it is mere social science. One can sense how difficult it is for Everett in struggling against this dominant paradigm.

Still, even with these limitations of the linguistics field, the book remains a more than worthy read. His using Plato and Aristotle to frame the issue was helpful to an extent, although it also added another variety of limitation. I got a better sense of the conflict of worldviews and how they relate to the larger history of ideas. But in doing so, I became more aware of the problems of that frame, very closely related to the problems of the nature vs. nurture debate (for, in reality, nature and nurture are inseparable). He describes linguistic theoreticians like Chomsky as being in the Platonic school of thought. Chomsky surely would agree, as he has already made that connection in his own writings, in what he discusses as Plato’s problem and Plato’s answer. Chomsky’s universal grammar is Platonic in nature, for as he has written, such “knowledge is ‘remembered’” (“Linguistics, a personal view” from The Chomskyan Turn). This is Plato’s anamnesis and aletheia, an unforgetting of what is true, based on the belief that humans are born with certain kinds of innate knowledge.

That is interesting to think about. But in the end I felt that something was being oversimplified or entirely left out. Everett is arguing against nativism, the view that there is an inborn, predetermined human nature. It’s not so much that he is arguing for a blank slate as that he is trying to explain the immense diversity and potential that exists across cultures. But the duality of nativism vs. non-nativism lacks the nuance to wrestle down complex realities.

I’m sympathetic to Everett’s view and to his criticisms of the nativist view. But there are cross-cultural patterns that need to be made sense of, even with the exceptions that deviate from those patterns. Dismissing evidence is never satisfying. Along with Chomsky, he throws in the likes of Carl Jung. But the difference between Chomsky and Jung is that the former is an academic devoted to pure theory unsullied by field research while the latter was a practicing psychotherapist who began with the particulars of individual cases. Everett is arguing for a focus on the particulars, upon which to build theory, but that is what Jung did. The criticisms of Chomsky can’t be shifted over to Jung, no matter what one thinks of Jung’s theories.

Part of the problem is that the kind of evidence Jung dealt with remains to be explained. It’s simply a fact that certain repeating patterns are found in human experience, across place and time. That is evidence to be considered, not dismissed, however one wishes to interpret it. Even most respectable nativist thinkers don’t want to confront this kind of evidence, which challenges conventional understandings on all sides. Maybe Jungian theories of archetypes, personality types, etc. are incorrect. But how do we study and test such things, going from direct observation to scientific research? And how is the frame of nativism/non-nativism helpful at all?

Maybe there are patterns, not unlike gravity and other natural laws, that are simply native to the world humans inhabit and so might not be entirely or at all native to the human mind, which is to say not native in the way that Chomsky makes nativist claims about universal grammar. Rather, these patterns would be native to humans in the way and to the extent humans are native to the world. This could be made to fit into Everett’s own theorizing, as he is attempting to situate the human within larger contexts of culture, environment, and such.

Consider an example from psychedelic studies. It has been found that people under the influence of particular psychedelics often have similar experiences. This is why shamanic cultures speak of psychedelic plants as having spirits that reside within or are expressed through them.

Let me be more specific. DMT is the most common psychedelic in the world, found in numerous plants and even produced in small quantities by the human brain. It’s an example of interspecies co-evolution, plants and humans having chemicals in common. Plants are chemistry factories, and they use chemicals for various purposes, including communication with other plants (e.g., chemically telling nearby plants that something is nibbling on their leaves, so they should put up their chemical defenses) and communication with non-plants (e.g., sending out bitter chemicals to inform the nibbler that it might want to eat elsewhere). Animals didn’t just co-evolve with edible plants but also with psychedelic plants. And humans aren’t the only species to imbibe. Maybe chemicals like DMT serve a purpose. And maybe there is a reason so many humans tripping on DMT experience what some describe as self-replicating machine elves or self-transforming fractal elves. Humans have been tripping on DMT for longer than civilization has existed.

DMT is far from the only plant psychedelic like this. It’s just one of the more common. The reason plant psychedelics do what they do to our brains is that our brains were shaped by evolution to interact with chemicals like this. These chemicals almost seem designed for animal brains, especially DMT, which our own brains produce.

That brings up some issues about the whole nativism/non-nativism conflict. Is a common experience many humans have with a psychedelic plant native to humans, native to the plant, or native to the inter-species relationship between human and plant? Where do the machine/fractal elves live, in the plant or in our brain? My tendency is to say that they in some sense ‘exist’ in the relationship between plants and humans, an experiential expression of that relationship, as immaterial and ephemeral as the love felt by two humans. These weird psychedelic beings are a plant-human hybrid, a shared creation of our shared evolution. They are native to our humanity to the extent that we are native to the ecosystems we share with those psychedelic plants.

Other areas of human experience lead down similar strange avenues. Take as another example the observations of Jacques Vallée. When he was a practicing astronomer, he became interested in UFOs because some of his fellow astronomers would destroy rather than investigate anomalous observational data. This led him to look into the UFO field, and that led to his studying those claiming alien abduction experiences. What he noted was that the stories told were quite similar to fairy abduction folktales and shamanic accounts of initiation. There seemed to be a shared pattern of experience that was interpreted differently according to culture, but in a large number of cases the basic pattern held.

Or take yet another example. Judith Weissman has noted patterns among the stated experiences of voice-hearers. Another researcher on voice-hearing, Tanya Luhrmann, has studied how voice-hearing has both commonalities and differences across cultures. John Geiger has shown how common voice-hearing can be, even if for most people it is usually only elicited during times of stress. Based on this and the work of others, it is obvious that voice-hearing is a normal capacity existing within all humans. It is actually quite common among children, and some theorize it was more common for adults in other societies. Is pointing out the surprisingly common experience of voice-hearing an argument for nativism?

These aspects of our humanity are plain weird. It was the kind of thing that always fascinated Jung. But what do we do with such evidence? It doesn’t prove a universal human nature that is inborn and predetermined. Not everyone has these experiences. But it appears everyone is capable of having these experiences.

This is where mainstream thinking in the field of linguistics shows its limitations. Going by Everett’s descriptions of the Pirahã, it seems likely that voice-hearing is common among them, although they wouldn’t interpret it that way. For them, voice-hearing appears to manifest as full possession and what, to Western outsiders, seems like a shared state of dissociation. It’s odd that, as a linguist, it didn’t occur to Everett to study the way of speaking of those who were possessed or to think more deeply about the experiential significance of the use of language indicating dissociation. Maybe it was too far outside of his own cultural biases, the same cultural biases that cause many Western voice-hearers to be medicated and institutionalized.

And if we’re going to talk about voice-hearing, we have to bring up Julian Jaynes. Everett probably doesn’t realize it, but his views seem to be in line with the bicameral theory, or at least not in explicit contradiction with it on conceptual grounds. He seems to be coming out of the cultural school of thought within anthropology, the same influence on Jaynes. It is precisely Everett’s anthropological field research that distinguishes him from a theoretical linguist like Chomsky, who has never formally studied any foreign language nor gone out into the field to test his theories. It was from studying the Pirahã firsthand over many years that the power of culture was impressed upon him. Maybe that is a commonality with Jaynes, who began his career doing scientific research, not theorizing.

As I was reading the book, I kept being reminded of Jaynes, despite Everett never mentioning him or related thinkers. It’s largely how he talks about individuals situated in a world and worldview, along with his mentioning of Bourdieu’s habitus. This fits into his emphasis on the culture and nurture side of influences, arguing that people (and languages) are products of their environments. Also, when Everett wrote that his view was that there is “nothing to an individual but one’s body” (Kindle Location 328), it occurred to me how this fits into the proposed experience of hypothetical ancient bicameral humans. My thought was confirmed when he stated that his own understanding was most in line with the Buddhist anatman, ‘non-self’. Just a week ago, I wrote the following in reference to Jaynes’ bicameral theory:

“We modern Westerners identify ourselves with our thoughts, the internalized voice of egoic consciousness. And we see this as the greatest prize of civilization, the hard-won rights and freedoms of the heroic individual. It’s the story we tell. But in other societies, such as in the East, there are traditions that teach the self is distinct from thought. From the Buddhist perspective of dependent (co-)origination, it is a much less radical notion that the self arises out of thought, instead of the other way around, and that thought itself simply arises. A Buddhist would have a much easier time intuitively grasping the theory of bicameralism, that thoughts are greater than and precede the self.”

Jaynes considered self-consciousness and self-identity to be products of thought, rather than the other way around. Like Everett’s view, this is an argument against the old Western belief in a human soul that is eternal and immortal, that Platonically precedes individual corporeality. But notions like Chomsky’s universal grammar feel like an attempt to revamp the soul for a scientific era: a universal human nature that precedes any individual, a soul as the spark of God and the divine expressed as a language imprinted on the soul. If I must believe in something existing within me that pre-exists me, then I’d rather go with alien-fairy-elves hiding out in the tangled undergrowth of my neurons.

Anyway, how might Everett’s views of nativism/non-nativism have been different if he had been more familiar with the work of these other researchers and thinkers? The problem is that the nativism/non-nativism framework is itself culturally biased. It’s related to the problem of anthropologists who try to test the color perception of other cultures using tests that are based on Western color perception. Everett’s observations of the Pirahã, by the way, have also challenged that field of study — as he has made the claim that the Pirahã have no color terms and no particular use for discriminating colors. That deals with the relationship of language to cognition and perception. Does language limit our minds? If so, how and to what extent? If not, are we to assume that such things as ‘colors’ are native to how the human brain functions? Would an individual born into and raised in a completely dark room still ‘see’ colors in their mind’s eye?

Maybe the fractal elves produce the colors, consuming the DMT and defecating rainbows. Maybe the alien-fairies abduct us in our sleep and use advanced technology to implant the colors into our brains. Maybe without the fractal elves and alien-fairies, we would finally all be colorblind and our society would be free from racism. Just some alternative theories to consider.

Talking about cultural biases, I was fascinated by some of the details he threw out about the Pirahã, the tribe he had spent the most years studying. He wrote (Kindle Locations 147-148), “Looking back, I can identify many of the hidden problems it took me years to recognize, problems based in contrasting sets of tacit assumptions held by the Pirahãs and me.” He then lists some of the tacit assumptions held by these people he came to know.

They don’t appear to have any concepts, language, or interest regarding God or gods, religion, or anything spiritual/supernatural that wasn’t personally experienced by them or by someone they personally know. Their language is very direct and precise about all experience and the source of claims. But they don’t feel like they’re spiritually lost or somehow lacking anything. In fact, Everett describes them as being extremely happy and easygoing, except on the rare occasion when a trader gives them alcohol.

They have no concern about or fear of death, the dead, ancestral spirits, or the afterlife, nor do they seek these out or talk about them. They apparently are entirely focused on present experience. They don’t speculate, worry, or even have curiosity about what is outside their experience. Foreign cultures are irrelevant to them, this being an indifference toward rather than a hatred of foreigners. It’s just that foreign culture is thought of as good for foreigners, as Pirahã culture is good for Pirahã. Generally, they seem to lack the standard anxiety that is typical of our society, despite living in, and walking around barefoot in, one of the most dangerous environments on the planet, surrounded by poisonous and deadly creatures. It’s actually malaria that tends to cut their lives short. But they don’t make such comparisons or think of their lives as being cut short.

Their society is based on personal relationships, and they “do not like for any individual to tell another individual how to live” (Kindle Locations 149-150). They don’t have governments or, as far as I know, governing councils. They don’t practice social coercion, community-mandated punishments, or enforced norms. They are a very small tribe living in isolation, with a way of life that has likely remained basically the same for millennia. Their culture and lifestyle are well-adapted to their environmental niche, and so they don’t tend to encounter many new problems that require them to act differently than in the past. They also don’t practice or comprehend incarceration, torture, capital punishment, mass war, genocide, etc. It’s not that violence never happens in their society, but I get the sense that it’s rare.

In the early years of life, infants and young toddlers live in near-constant proximity to their mothers and other adults. They are given near ownership rights over their mothers’ bodies, freely suckling whenever they want without asking permission or being denied. But once weaned, Pirahã are the opposite of coddled. Their mothers simply cut them off from their bodies, and the toddlers go through a tantrum period that is ignored by adults. They learn from experience and get little supervision in the process. They quickly become extremely knowledgeable and capable about living in and navigating the world around them. The parents have little fear for their children, and this seems to be well-founded, as the children prove themselves able to easily learn self-sufficiency and a willingness to contribute. It reminded me of Jean Liedloff’s continuum concept.

Then, once they become teenagers, they don’t go through a rebellious phase. It seems a smooth transition into adulthood. As he described it in his first book (Don’t Sleep, There Are Snakes, pp. 99-100):

“I did not see Pirahã teenagers moping, sleeping in late, refusing to accept responsibility for their own actions, or trying out what they considered to be radically new approaches to life. They in fact are highly productive and conformist members of their community in the Pirahã sense of productivity (good fishermen, contributing generally to the security, food needs, and other aspects of the physical survival of the community). One gets no sense of teenage angst, depression, or insecurity among the Pirahã youth. They do not seem to be searching for answers. They have them. And new questions rarely arise.

“Of course, this homeostasis can stifle creativity and individuality, two important Western values. If one considers cultural evolution to be a good thing, then this may not be something to emulate, since cultural evolution likely requires conflict, angst, and challenge. But if your life is unthreatened (so far as you know) and everyone in your society is satisfied, why would you desire change? How could things be improved? Especially if the outsiders you came into contact with seemed more irritable and less satisfied with life than you. I asked the Pirahãs once during my early missionary years if they knew why I was there. “You are here because this is a beautiful place. The water is pretty. There are good things to eat here. The Pirahãs are nice people.” That was and is the Pirahãs’ perspective. Life is good. Their upbringing, everyone learning early on to pull their own weight, produces a society of satisfied members. That is hard to argue against.”

The strangest and most shocking aspect of Pirahã life is their sexuality. Kids quickly learn about sex. It’s not that people have sex out in the open. But it’s a lifestyle that provides limited privacy. Sexual activity isn’t considered an exclusively adult activity, and children aren’t protected from it. Quite the opposite (Kindle Locations 2736-2745):

“Sexual behavior is another behavior distinguishing Pirahãs from most middle-class Westerners early on. A young Pirahã girl of about five years came up to me once many years ago as I was working and made crude sexual gestures, holding her genitalia and thrusting them at me repeatedly, laughing hysterically the whole time. The people who saw this behavior gave no sign that they were bothered. Just child behavior, like picking your nose or farting. Not worth commenting about.

“But the lesson is not that a child acted in a way that a Western adult might find vulgar. Rather, the lesson, as I looked into this, is that Pirahã children learn a lot more about sex early on, by observation, than most American children. Moreover, their acquisition of carnal knowledge early on is not limited to observation. A man once introduced me to a nine- or ten-year-old girl and presented her as his wife. “But just to play,” he quickly added. Pirahã young people begin to engage sexually, though apparently not in full intercourse, from early on. Touching and being touched seem to be common for Pirahã boys and girls from about seven years of age on. They are all sexually active by puberty, with older men and women frequently initiating younger girls and boys, respectively. There is no evidence that the children then or as adults find this pedophilia the least bit traumatic.”

This seems plain wrong to most Westerners. Then again, to the Pirahã, much of what Westerners do would seem plain wrong or simply incomprehensible. Which is worse, Pirahã pedophilia or Western mass violence and systematic oppression?

What is most odd is that, like death for adults, sexuality for children isn’t considered a traumatizing experience and they don’t act traumatized. It’s apparently not part of their culture to be traumatized. They aren’t a society based on and enmeshed in a worldview of violence, fear, and anxiety. That isn’t how they think about any aspect of their lifeworld. I would assume that, like most tribal people, they don’t have high rates of depression and other mental illnesses. Everett pointed out that in the thirty years he knew the Pirahã there never was a suicide. And when he told them about his stepmother killing herself, they burst out in laughter because it made absolutely no sense to them that someone would take their own life.

That demonstrates the power of culture, environment, and lifestyle. According to Everett, it also demonstrates the power of language, inseparable from the society that shapes and is shaped by it, and demonstrates how little we understand the dark matter of the mind.

* * *

The Amazon’s Pirahã People’s Secret to Happiness: Never Talk of the Past or Future
by Dominique Godrèche, Indian Country

Being Pirahã Means Never Having to Say You’re Sorry
by Christopher Ryan, Psychology Today

The Myth of Teenage Rebellion
by Suzanne Calulu, Patheos

The Suicide Paradox: Full Transcript
from Freakonomics

“Beyond that, there is only awe.”

“What is the meaning of life?” This question has no answer except in the history of how it came to be asked. There is no answer because words have meaning, not life or persons or the universe itself. Our search for certainty rests in our attempts at understanding the history of all individual selves and all civilizations. Beyond that, there is only awe.
~ Julian Jaynes, 1988, Life Magazine

That is always a nice quote. Jaynes never seemed like an ideologue about his own speculations. In his controversial book, published more than a decade earlier (1976), he titled his introduction “The Problem of Consciousness”. That is what frames his thought: confronting a problem. The whole issue of consciousness remains problematic to this day and likely will be for a long time. After a lengthy analysis of complex issues, he concludes his book with some humbling thoughts:

For what is the nature of this blessing of certainty that science so devoutly demands in its very Jacob-like wrestling with nature? Why should we demand that the universe make itself clear to us? Why do we care?

To be sure, a part of the impulse to science is simple curiosity, to hold the unheld and watch the unwatched. We are all children in the unknown.

Following that, he makes a plea for understanding. Not just understanding of the mind but also of experience. It is a desire to grasp what makes us human, the common impulses that bind us, underlying both religion and science. There is a tender concern being given voice, probably shaped and inspired by his younger self having pored over his deceased father’s Unitarian sermons.

As individuals we are at the mercies of our own collective imperatives. We see over our everyday attentions, our gardens and politics, and children, into the forms of our culture darkly. And our culture is our history. In our attempts to communicate or to persuade or simply interest others, we are using and moving about through cultural models among whose differences we may select, but from whose totality we cannot escape. And it is in this sense of the forms of appeal, of begetting hope or interest or appreciation or praise for ourselves or for our ideas, that our communications are shaped into these historical patterns, these grooves of persuasion which are even in the act of communication an inherent part of what is communicated. And this essay is no exception.

That humility feels genuine. His book was far beyond mere scholarship. It was an expression of decades of questioning and self-questioning, about what it means to be human and what it might have meant for others throughout the millennia.

He never got around to writing another book on the topic, despite his stated plans to do so. But during the last decade of his life, he wrote an afterword to his original work. It was included in the 1990 edition, fourteen years after the original publication. He had faced much criticism, and one senses a tired frustration in those last years. Elsewhere, he complained about the expectation to explain himself and make himself understood to people who, for whatever reason, didn’t understand. Still, he realized that was the nature of his job as an academic scholar working at a major university. In the afterword, he wrote:

A favorite practice of some professional intellectuals when at first faced with a theory as large as the one I have presented is to search for that loose thread which, when pulled, will unravel all the rest. And rightly so. It is part of the discipline of scientific thinking. In any work covering so much of the terrain of human nature and history, hustling into territories jealously guarded by myriad aggressive specialists, there are bound to be such errancies, sometimes of fact but I fear more often of tone. But that the knitting of this book is such that a tug on such a bad stitch will unravel all the rest is more of a hope on the part of the orthodox than a fact in the scientific pursuit of truth. The book is not a single hypothesis.

Interestingly, Jaynes doesn’t present the bicameral mind as an overarching context for the hypotheses he lists. In fact, it is just one among several hypotheses, and not even the first to be mentioned. That shouldn’t be surprising, since decades of his thought and research, including laboratory studies done on animal behavior, preceded the formulation of the bicameral hypothesis. Here are the four hypotheses:

  1. Consciousness is based on language.
  2. The bicameral mind.
  3. The dating.
  4. The double brain.

He states that, “I wish to emphasize that these four hypotheses are separable. The last, for example, could be mistaken (at least in the simplified version I have presented) and the others true. The two hemispheres of the brain are not the bicameral mind but its present neurological model. The bicameral mind is an ancient mentality demonstrated in the literature and artifacts of antiquity.” Each hypothesis is connected to the others but must be dealt with separately. The key element to his project is consciousness, as that is the key problem. And as problems go, it is a doozy. Calling it a problem is like calling the moon a chunk of rock and the sun a warm fire.

Related to these hypotheses, earlier in his book, Jaynes proposes a useful framework. He calls it the General Bicameral Paradigm. “By this phrase,” he explains, “I mean an hypothesized structure behind a large class of phenomena of diminished consciousness which I am interpreting as partial holdovers from our earlier mentality.” There are four components:

  1. “the collective cognitive imperative, or belief system, a culturally agreed-on expectancy or prescription which defines the particular form of a phenomenon and the roles to be acted out within that form;”
  2. “an induction or formally ritualized procedure whose function is the narrowing of consciousness by focusing attention on a small range of preoccupations;”
  3. “the trance itself, a response to both the preceding, characterized by a lessening of consciousness or its loss, the diminishing of the analog or its loss, resulting in a role that is accepted, tolerated, or encouraged by the group; and”
  4. “the archaic authorization to which the trance is directed or related to, usually a god, but sometimes a person who is accepted by the individual and his culture as an authority over the individual, and who by the collective cognitive imperative is prescribed to be responsible for controlling the trance state.”

The point is made that the reader shouldn’t assume that they are “to be considered as a temporal succession necessarily, although the induction and trance usually do follow each other. But the cognitive imperative and the archaic authorization pervade the whole thing. Moreover, there is a kind of balance or summation among these elements, such that when one of them is weak the others must be strong for the phenomena to occur. Thus, as through time, particularly in the millennium following the beginning of consciousness, the collective cognitive imperative becomes weaker (that is, the general population tends toward skepticism about the archaic authorization), we find a rising emphasis on and complication of the induction procedures, as well as the trance state itself becoming more profound.”

This general bicameral paradigm is partly based on the insights he gained from studying ancient societies. But ultimately it can be considered separately from that. All you have to understand is that these are a basic set of cognitive abilities and tendencies that have been with humanity for a long time. These are the vestiges of human evolution and societal development. They can be combined and expressed in multiple ways. Our present society is just one of many possible manifestations. Human nature is complex and human potential is immense, and so diversity is to be expected among human neurocognition, behavior, and culture.

An important example of the general bicameral paradigm is hypnosis. It isn’t just an amusing trick done for magic shows. Hypnosis shows something profoundly odd, disturbing even, about the human mind. Also, it goes far beyond the individual for it is about how humans relate. It demonstrates the power of authority figures, in whatever form they take, and indicates the significance of what Jaynes calls authorization. By the way, this leads down the dark pathways of authoritarianism, brainwashing, propaganda, and punishment — as for the latter, Jaynes writes that:

If we can regard punishment in childhood as a way of instilling an enhanced relationship to authority, hence training some of those neurological relationships that were once the bicameral mind, we might expect this to increase hypnotic susceptibility. And this is true. Careful studies show that those who have experienced severe punishment in childhood and come from a disciplined home are more easily hypnotized, while those who were rarely punished or not punished at all tend to be less susceptible to hypnosis.

He discusses the history of hypnosis beginning with Mesmer. In this, he shows how metaphor took different form over time. And, accordingly, it altered shared experience and behavior.

Now it is critical here to realize and to understand what we might call the paraphrandic changes which were going on in the people involved, due to these metaphors. A paraphrand, you will remember, is the projection into a metaphrand of the associations or paraphiers of a metaphier. The metaphrand here is the influences between people. The metaphiers, or what these influences are being compared to, are the inexorable forces of gravitation, magnetism, and electricity. And their paraphiers of absolute compulsions between heavenly bodies, of unstoppable currents from masses of Leyden jars, or of irresistible oceanic tides of magnetism, all these projected back into the metaphrand of interpersonal relationships, actually changing them, changing the psychological nature of the persons involved, immersing them in a sea of uncontrollable control that emanated from the ‘magnetic fluids’ in the doctor’s body, or in objects which had ‘absorbed’ such from him.

It is at least conceivable that what Mesmer was discovering was a different kind of mentality that, given a proper locale, a special education in childhood, a surrounding belief system, and isolation from the rest of us, possibly could have sustained itself as a society not based on ordinary consciousness, where metaphors of energy and irresistible control would assume some of the functions of consciousness.

How is this even possible? As I have mentioned already, I think Mesmer was clumsily stumbling into a new way of engaging that neurological patterning I have called the general bicameral paradigm with its four aspects: collective cognitive imperative, induction, trance, and archaic authorization.

Through authority and authorization, immense power and persuasion can be wielded. Jaynes argues that it is central to the human mind, but that in developing consciousness we learned how to partly internalize the process. Even so, Jaynesian self-consciousness is never a permanent, continuous state and the power of individual self-authorization easily morphs back into external forms. This is far from idle speculation, considering authoritarianism still haunts the modern mind. I might add that the ultimate power of authoritarianism, as Jaynes makes clear, isn’t overt force and brute violence. Outward forms of power are only necessary to the degree that external authorization is relatively weak, as is typically the case in modern societies.

This touches upon the issue of rhetoric, although Jaynes never mentioned the topic. It’s disappointing since his original analysis of metaphor has many implications. Fortunately, others have picked up where he left off (see Ted Remington, Brian J. McVeigh, and Frank J. D’Angelo). Authorization in the ancient world came through a poetic voice, but today it is most commonly heard in rhetoric.

Still, that old-time religion can be heard in the words and rhythm of any great speaker. Just listen to how a recorded speech of Martin Luther King Jr. can pull you in with its musicality. Or, if you prefer a dark example, consider the persuasive power of Adolf Hitler, as even some Jews admitted to getting caught up listening to his speeches. This is why Plato feared the poets and banished them from his utopia of enlightened rule. Poetry would inevitably undermine and subsume the high-minded rhetoric of philosophers. “[P]oetry used to be divine knowledge,” as Guerini et al. state in Echoes of Persuasion. “It was the sound and tenor of authorization and it commanded where plain prose could only ask.”

Metaphor grows naturally in poetic soil, but its seeds are planted in every aspect of language and thought, giving fruit to our perceptions and actions. This is a thousandfold true on the collective level of society and politics. Metaphors are most powerful when we don’t see them as metaphors. So, the most persuasive rhetoric is that which hides its metaphorical frame and obfuscates any attempts to bring it to light.

Going far back into the ancient world, metaphors didn’t need to be hidden in this sense. The reason is that there was no intellectual capacity or conceptual understanding of metaphors as metaphors. Instead, metaphors were taken literally. The way people spoke about reality was inseparable from their experience of reality, and they had no way of stepping back from their cultural biases, as the cultural worldviews they existed within were all-encompassing. It’s only with the later rise of multicultural societies, especially the vast multi-ethnic trade empires, that people began to think in terms of multiple perspectives. Such a society was developing in the trading and colonizing city-states of Greece in the centuries leading up to Hellenism.

That is the well known part of Jaynes’ speculations, the basis of his proposed bicameral mind. And Jaynes considered it extremely relevant to the present.

Marcel Kuijsten wrote that, “Jaynes maintained that we are still deep in the midst of this transition from bicamerality to consciousness; we are continuing the process of expanding the role of our internal dialogue and introspection in the decision-making process that was started some 3,000 years ago. Vestiges of the bicameral mind — our longing for absolute guidance and external control — make us susceptible to charismatic leaders, cults, trends, and persuasive rhetoric that relies on slogans to bypass logic” (“Consciousness, Hallucinations, and the Bicameral Mind: Three Decades of New Research”, Reflections on the Dawn of Consciousness, Kindle Locations 2210-2213). Considering the present, in Authoritarian Grammar and Fundamentalist Arithmetic, Ben G. Price puts it starkly: “Throughout, tyranny asserts its superiority by creating a psychological distance between those who command and those who obey. And they do this with language, which they presume to control.” The point of the latter is that this knowledge, even as it can be used as intellectual defense, might just lead to even more effective authoritarianism.

We’ve grown less fearful of rhetoric because we see ourselves as savvy, experienced consumers of media. The cynical modern mind is always on guard, our well-developed and rigid state of consciousness offering a continuous psychological buffer against the intrusions of the world. So we like to think. I remember, back in 7th grade, being taught how the rhetoric of advertising is used to manipulate us. But we are overconfident. Consciousness operates at the surface of the psychic depths. We are better at rationalizing than being rational, something we may understand intellectually, though we rarely acknowledge its full psychological and societal significance. That is the usefulness of theories like bicameralism: they remind us that we are out of our depths. In the ancient world, there was a profound mistrust between the poetic and the rhetorical, and for good reason. We would be wise to learn from that clash of mindsets and worldviews.

We shouldn’t be so quick to assume we understand our own minds, the kind of vessel we find ourselves on. Nor should we allow ourselves to get too comfortable within the worldview we’ve always known, the safe harbor of our familiar patterns of mind. It’s hard to think about these issues because they touch upon our own being, the surface of consciousness along with the depths below it. It is the nearly impossible task of fathoming the ocean floor with a rope and a weight, a task made easier the closer we hug the shoreline. But what might we find if we cast ourselves out on open waters? What new lands might be found, lands to be newly discovered and lands already inhabited?

We moderns love certainty. And it’s true we possess more knowledge than any civilization before us has accumulated. Yet we’ve partly made the unfamiliar into the familiar by remaking the world in our own image. There is no place on earth that remains entirely untouched. Only a couple hundred small isolated tribes remain uncontacted, representing foreign worldviews not known or studied, but even they live under unnatural conditions of stress as the larger world closes in on them. Most of the ecological and cultural diversity that once existed has been obliterated from the face of the earth, most of it leaving not a single trace or record, simply gone. Populations beyond count faced extermination by outside influences and forces before they ever got a chance to meet an outsider. Plagues, environmental destruction, and societal collapse wiped them out, often in short periods of time.

Those other cultures might have gifted us with insights about our humanity that are now lost forever, just as extinct species might have held answers to questions not yet asked and medicines for diseases not yet understood. Almost all that is now left is a nearly complete monoculture, with the differences ever shrinking into the constraints of capitalist realism. If not for scientific studies done on the last of the isolated tribal peoples, we would never know how much diversity exists within human nature. Many of the conclusions that earlier social scientists made were based mostly on studies involving white, middle-class college kids in Western countries, what some have called WEIRD subjects: Western, Educated, Industrialized, Rich, and Democratic. But many of those conclusions have since proven wrong, biased, or limited.

When Jaynes first thought about such matters, the social sciences were still getting established as serious fields of study. He entered college around 1940, when behaviorism was a dominant paradigm. It was only in the prior decades that the very idea of ‘culture’ began to take hold among anthropologists. He was influenced by anthropologists, directly and indirectly. One indirect influence came by way of E. R. Dodds, a classical scholar who, in writing his 1951 The Greeks and the Irrational, found inspiration in Ruth Benedict’s anthropological work comparing cultures (Benedict arrived at this perspective by combining the ideas of Franz Boas and Carl Jung). Still, anthropology was young, and the fascinating cases so well known today were unknown back then (e.g., Daniel Everett’s recent books on the Pirahã). So, in following Dodds’ example, Jaynes turned to ancient societies and their literature.

His ideas were forming at the same time the social sciences were gaining respectability and maturity. It was a time when many scholars and other intellectuals were more fully questioning Western civilization. But it was also the time when Western ascendancy was becoming clear, with WWI ending the Ottoman Empire and WWII ending the Japanese Empire. The whole world was falling under Western cultural influence. And traditional societies were in precipitous decline. That was the dawning of the age of monoculture.

We are the inheritors of the world that was created from that wholesale destruction of all that came before. And even what came before was built on millennia of collapsing civilizations. Jaynes focused on the earliest example of mass destruction and chaos, which led him to see a stark division between what came before and after. How do we understand why we came to be the way we are when so much has been lost? We are forced back on our own ignorance. Jaynes apparently understood that and so considered awe to be the proper response. We know the world through our own humanity, but we can only know our own humanity through the cultural worldview we are born into. It is our words that have meaning, was Jaynes’ response, “not life or persons or the universe itself.” That is to say we bring meaning to what we seek to understand. Meaning is created, not discovered. And the kind of meaning we create depends on our cultural worldview.

In Monoculture, F. S. Michaels writes (pp. 1-2):

THE HISTORY OF HOW we think and act, said twentieth-century philosopher Isaiah Berlin, is, for the most part, a history of dominant ideas. Some subject rises to the top of our awareness, grabs hold of our imagination for a generation or two, and shapes our entire lives. If you look at any civilization, Berlin said, you will find a particular pattern of life that shows up again and again, that rules the age. Because of that pattern, certain ideas become popular and others fall out of favor. If you can isolate the governing pattern that a culture obeys, he believed, you can explain and understand the world that shapes how people think, feel and act at a distinct time in history.1

The governing pattern that a culture obeys is a master story — one narrative in society that takes over the others, shrinking diversity and forming a monoculture. When you’re inside a master story at a particular time in history, you tend to accept its definition of reality. You unconsciously believe and act on certain things, and disbelieve and fail to act on other things. That’s the power of the monoculture; it’s able to direct us without us knowing too much about it.

Over time, the monoculture evolves into a nearly invisible foundation that structures and shapes our lives, giving us our sense of how the world works. It shapes our ideas about what’s normal and what we can expect from life. It channels our lives in a certain direction, setting out strict boundaries that we unconsciously learn to live inside. It teaches us to fear and distrust other stories; other stories challenge the monoculture simply by existing, by representing alternate possibilities.

Jaynes argued that ideas are more than mere concepts. Ideas are embedded in language and metaphor. And ideas take form not just as culture but as entire worldviews built on interlinked patterns of attitudes, thought, perception, behavior, and identity. Taken together, this is the reality tunnel we exist within.

It takes a lot to shake us loose from these confines of the mind. Certain practices, from meditation to imbibing psychedelics, can temporarily or permanently alter the matrix of our identity. Jaynes, for reasons of his own, came to question the inevitability of the society around him, which allowed him to see that other possibilities might exist. The direction his queries took him landed him in foreign territory, outside the idolized individualism of Western modernity.

His ideas might have been less challenging in a different society. We modern Westerners identify ourselves with our thoughts, the internalized voice of egoic consciousness. And we see this as the greatest prize of civilization, the hard-won rights and freedoms of the heroic individual. It’s the story we tell. But in other societies, such as in the East, there are traditions that teach the self is distinct from thought. From the Buddhist perspective of dependent (co-)origination, it is a much less radical notion that the self arises out of thought, instead of the other way around, and that thought itself simply arises. A Buddhist would have a much easier time intuitively grasping the theory of bicameralism, that thoughts are greater than and precede the self.

Maybe we modern Westerners need to practice a sense of awe, to inquire more deeply. Jaynes offers a different way of thinking that doesn’t even require us to look to another society. If he is correct, this radical worldview is at the root of Western Civilization. Maybe the traces of the past are still with us.

* * *

The Origin of Rhetoric in the Breakdown of the Bicameral Mind
by Ted Remington

Endogenous Hallucinations and the Bicameral Mind
by Rick Strassman

Consciousness and Dreams
by Marcel Kuijsten, Julian Jaynes Society

Ritual and the Consciousness Monoculture
by Sarah Perry, Ribbonfarm

“I’m Nobody”: Lyric Poetry and the Problem of People
by David Baker, The Virginia Quarterly Review

It is in fact dangerous to assume a too similar relationship between those ancient people and us. A fascinating difference between the Greek lyricists and ourselves derives from the entity we label “the self.” How did the self come to be? Have we always been self-conscious, of two or three or four minds, a stew of self-aware voices? Julian Jaynes thinks otherwise. In The Origin of Consciousness in the Breakdown of the Bicameral Mind—that famous book my poetry friends adore and my psychologist friends shrink from—Jaynes surmises that the early classical mind, still bicameral, shows us the coming-into-consciousness of the modern human, shows our double-minded awareness as, originally, a haunted hearing of voices. To Jaynes, thinking is not the same as consciousness: “one does one’s thinking before one knows what one is to think about.” That is, thinking is not synonymous with consciousness or introspection; it is rather an automatic process, notably more reflexive than reflective. Jaynes proposes that epic poetry, early lyric poetry, ritualized singing, the conscience, even the voices of the gods, all are one part of the brain learning to hear, to listen to, the other.

Auditory Hallucinations: Psychotic Symptom or Dissociative Experience?
by Andrew Moskowitz & Dirk Corstens

Voices heard by persons diagnosed schizophrenic appear to be indistinguishable, on the basis of their experienced characteristics, from voices heard by persons with dissociative disorders or by persons with no mental disorder at all.

Neuroimaging, auditory hallucinations, and the bicameral mind.
by L. Sher, Journal of Psychiatry and Neuroscience

Olin suggested that recent neuroimaging studies “have illuminated and confirmed the importance of Jaynes’ hypothesis.” Olin believes that recent reports by Lennox et al and Dierks et al support the bicameral mind. Lennox et al reported a case of a right-handed subject with schizophrenia who experienced a stable pattern of hallucinations. The authors obtained images of repeated episodes of hallucination and observed its functional anatomy and time course. The patient’s auditory hallucination occurred in his right hemisphere but not in his left.

What Is It Like to Be Nonconscious?: A Defense of Julian Jaynes
by Gary William, Phenomenology and the Cognitive Sciences

To explain the origin of consciousness is to explain how the analog “I” began to narratize in a functional mind-space. For Jaynes, to understand the conscious mind requires that we see it as something fleeting rather than something always present. The constant phenomenality of what-it-is-like to be an organism is not equivalent to consciousness and, subsequently, consciousness must be thought in terms of the authentic possibility of consciousness rather than its continual presence.

Defending Damasio and Jaynes against Block and Gopnik
by Emilia Barile, Phenomenology Lab

When Jaynes says that there was “nothing it is like” to be preconscious, he certainly didn’t mean to say that nonconscious animals are somehow not having subjective experience in the sense of “experiencing” or “being aware” of the world. When Jaynes said there is “nothing it is like” to be preconscious, he means that there is no sense of mental interiority and no sense of autobiographical memory. Ask yourself what it is like to be driving a car and then suddenly wake up and realize that you have been zoned out for the past minute. Was there something it is like to drive on autopilot? This depends on how we define “what it is like”.

“The Evolution of the Analytic Topoi: A Speculative Inquiry”
by Frank J. D’Angelo
from Essays on Classical Rhetoric and Modern Discourse
ed. Robert J. Connors, Lisa S. Ede, & Andrea A. Lunsford
pp. 51-5

The first stage in the evolution of the analytic topoi is the global stage. Of this stage we have scanty evidence, since we must assume the ontogeny of invention in terms of spoken language long before the individual is capable of anything like written language. But some hints of how logical invention might have developed can be found in the work of Eric Havelock. In his Preface to Plato, Havelock, in recapitulating the educational experience of the Homeric and post-Homeric Greek, comments that the psychology of the Homeric Greek is characterized by a high degree of automatism.

He is required as a civilised being to become acquainted with the history, the social organisation, the technical competence and the moral imperatives of his group. This in turn is able to function only as a fragment of the total Hellenic world. It shares a consciousness in which he is keenly aware that he, as a Hellene, in his memory. Such is poetic tradition, essentially something he accepts uncritically, or else it fails to survive in his living memory. Its acceptance and retention are made psychologically possible by a mechanism of self-surrender to the poetic performance and of self-identification with the situations and the stories related in the performance. . . . His receptivity to the tradition has thus, from the standpoint of inner psychology, a degree of automatism which however is counter-balanced by a direct and unfettered capacity for action in accordance with the paradigms he has absorbed. 6

Preliterate man was apparently unable to think logically. He acted, or as Julian Jaynes, in The Origin of Consciousness in the Breakdown of the Bicameral Mind, puts it, “reacted” to external events. “There is in general,” writes Jaynes, “no consciousness in the Iliad . . . and in general therefore, no words for consciousness or mental acts.” 7 There was, in other words, no subjective consciousness in Iliadic man. His actions were not rooted in conscious plans or in reasoning. We can only speculate, then, based on the evidence given by Havelock and Jaynes that logical invention, at least in any kind of sophisticated form, could not take place until the breakdown of the bicameral mind, with the invention of writing. If ancient peoples were unable to introspect, then we must assume that the analytic topoi were a discovery of literate man. Eric Havelock, however, warns that the picture he gives of Homeric and post-Homeric man is oversimplified and that there are signs of a latent mentality in the Greek mind. But in general, Homeric man was more concerned to go along with the tradition than to make individual judgments.

For Iliadic man to be able to think, he must think about something. To do this, states Havelock, he had to be able to revolt against the habit of self-identification with the epic poem. But identification with the poem at this time in history was necessary psychologically (identification was necessary for memorization) and in the epic story implicitly as acts or events that are carried out by important people, must be abstracted from the narrative flux. “Thus the autonomous subject who no longer recalls and feels, but knows, can now be confronted with a thousand abstract laws, principles, topics, and formulas which become the objects of his knowledge.” 8

The analytic topoi, then, were implicit in oral poetic discourse. They were “experienced” in the patterns of epic narrative, but once they are abstracted they can become objects of thought as well as of experience. As Eric Havelock puts it,

If we view them [these abstractions] in relation to the epic narrative from which, as a matter of historical fact, they all emerged they can all be regarded as in one way or another classifications of an experience which was previously “felt” in an unclassified medley. This was as true of justice as of motion, of goodness as of body or space, of beauty as of weight or dimension. These categories turn into linguistic counters, and become used as a matter of course to relate one phenomenon to another in a non-epic, non-poetic, non-concrete idiom. 9

The invention of the alphabet made it easier to report experience in a non-epic idiom. But it might be a simplification to suppose that the advent of alphabetic technology was the only influence on the emergence of logical thinking and the analytic topics, although perhaps it was the major influence. Havelock contends that the first “proto-thinkers” of Greece were the poets who at first used rhythm and oral formulas to attempt to arrange experience in categories, rather than in narrative events. He mentions in particular that it was Hesiod who first parts company with the narrative in the Theogony and Works and Days. In Works and Days, Hesiod uses a cataloging technique, consisting of proverbs, aphorisms, wise sayings, exhortations, and parables, intermingled with stories. But this effect of cataloging that goes “beyond the plot of a story in order to impose a rough logic of topics . . . presumes that Hesiod is 10

The kind of material found in the catalogs of Hesiod was more like the cumulative commonplace material of the Renaissance than the abstract topics that we are familiar with today. Walter Ong notes that “the oral performer, poet or orator needed a stock of material to keep him going. The doctrine of the commonplaces is, from one point of view, the codification of ways of assuring and managing this stock.” 11 We already know what some of the material was like: stock epithets, figures of speech, exempla, proverbs, sententiae, quotations, praises or censures of people and things, and brief treatises on virtues and vices. By the time we get to the invention of printing, there are vast collections of this commonplace material, so vast, relates Ong, that scholars could probably never survey it all. Ong goes on to observe that

print gave the drive to collect and classify such excerpts a potential previously undreamed of. . . . the ranging of items side by side on a page once achieved, could be multiplied as never before. Moreover, printed collections of such commonplace excerpts could be handily indexed; it was worthwhile spending days or months working up an index because the results of one’s labors showed fully in thousands of copies. 12

To summarize, then, in oral cultures rhetorical invention was bound up with oral performance. At this stage, both the cumulative topics and the analytic topics were implicit in epic narrative. Then the cumulative commonplaces begin to appear, separated out by a cataloging technique from poetic narrative, in sources such as the Theogony and Works and Days. Eric Havelock points out that in Hesiod, the catalog “has been isolated or abstracted . . . out of a thousand contexts in the rich reservoir of oral tradition. … A general world view is emerging in isolated or ‘abstracted’ form.” 13 Apparently, what we are witnessing is the emergence of logical thinking. Julian Jaynes describes the kind of thought to be found in the Works and Days as “preconscious hypostases.” Certain lines in Hesiod, he maintains, exhibit “some kind of bicameral struggle.” 14

The first stage, then, of rhetorical invention is that in which the analytic topoi are embedded in oral performance in the form of commonplace material as “relationships” in an undifferentiated matrix. Oral cultures preserve this knowledge by constantly repeating the fixed sayings and formulae. Mnemonic patterns, patterns of repetition, are not added to the thought of oral cultures. They are what the thought consists of.

Emerging selves: Representational foundations of subjectivity
by Wolfgang Prinz, Consciousness and Cognition

What, then, may mental selves be good for and why have they emerged during evolution (or, perhaps, human evolution or even early human history)? Answers to these questions used to take the form of stories explaining how the mental self came about and what advantages were associated with it. In other words, these are theories that construct hypothetical scenarios offering plausible explanations for why certain (groups of) living things that initially do not possess a mental self gain fitness advantages when they develop such an entity—with the consequence that they move from what we can call a self-less to a self-based or “self-morphic” state.

Modules for such scenarios have been presented occasionally in recent years by, for example, Dennett, 1990 and Dennett, 1992, Donald (2001), Edelman (1989), Jaynes (1976), Metzinger, 1993 and Metzinger, 2003, or Mithen (1996). Despite all the differences in their approaches, they converge around a few interesting points. First, they believe that the transition between the self-less and self-morphic state occurred at some stage during the course of human history—and not before. Second, they emphasize the cognitive and dynamic advantages accompanying the formation of a mental self. And, third, they also discuss the social and political conditions that promote or hinder the constitution of this self-morphic state. In the scenario below, I want to show how these modules can be keyed together to form a coherent construction. […]

Thus, where do thoughts come from? Who or what generates them, and how are they linked to the current perceptual situation? This brings us to a problem that psychology describes as the problem of source attribution ( Heider, 1958).

One obvious suggestion is to transfer the schema for interpreting externally induced messages to internally induced thoughts as well. Accordingly, thoughts are also traced back to human sources and, likewise, to sources that are present in the current situation. Such sources can be construed in completely different ways. One solution is to trace the occurrence of thoughts back to voices—the voices of gods, priests, kings, or ancestors, in other words, personal authorities that are believed to have an invisible presence in the current situation. Another solution is to locate the source of thoughts in an autonomous personal authority bound to the body of the actor: the self.

These two solutions to the attribution problem differ in many ways: historically, politically, and psychologically. In historical terms, the former must be markedly older than the latter. The transition from one solution to the other and the mentalities associated with them are the subject of Julian Jaynes’s speculative theory of consciousness. He even considers that this transfer occurred during historical times: between the Iliad and the Odyssey. In the Iliad, according to Jaynes, the frame of mind of the protagonists is still structured in a way that does not perceive thoughts, feelings, and intentions as products of a personal self, but as the dictates of supernatural voices. Things have changed in the Odyssey: Odysseus possesses a self, and it is this self that thinks and acts. Jaynes maintains that the modern consciousness of Odysseus could emerge only after the self had taken over the position of the gods (Jaynes, 1976; see also Snell, 1975).

Moreover, it is obvious why the political implications of the two solutions differ so greatly: Societies whose members attribute their thoughts to the voices of mortal or immortal authorities produce castes of priests or nobles that claim to be the natural authorities or their authentic interpreters and use this to derive legitimization for their exercise of power. It is only when the self takes the place of the gods that such castes become obsolete, and authoritarian constructions are replaced by other political constructions that base the legitimacy for their actions on the majority will of a large number of subjects who are perceived to be autonomous.

Finally, an important psychological difference is that the development of a self-concept establishes the precondition for individuals to become capable of perceiving themselves as persons with a coherent biography. Once established, the self becomes involved in every re-presentation and representation as an implicit personal source, and just as the same body is always present in every perceptual situation, it is the same mental self that remains identical across time and place. […]

According to the cognitive theories of schizophrenia developed in the last decade (Daprati et al., 1997; Frith, 1992), these symptoms can be explained with the same basic pattern that Julian Jaynes uses in his theory to characterize the mental organization of the protagonists in the Iliad. Patients with delusions suffer from the fact that the standardized attribution schema that localizes the sources of thoughts in the self is not available to them. Therefore, they need to explain the origins of their thoughts, ideas, and desires in another way (see, e.g., Stephens & Graham, 2000). They attribute them to person sources that are present but invisible—such as relatives, physicians, famous persons, or extraterrestrials. Frequently, they also construct effects and mechanisms to explain how the thoughts proceeding from these sources are communicated, by, for example, voices or pictures transmitted over rays or wires, and nowadays frequently also over phones, radios, or computers. […]

As bizarre as these syndromes seem against the background of our standard concept of subjectivity and personhood, they fit perfectly with the theoretical idea that mental selves are not naturally given but rather culturally constructed, and in fact set up in, attribution processes. The unity and consistency of the self are not a natural necessity but a cultural norm, and when individuals are exposed to unusual developmental and life conditions, they may well develop deviant attribution patterns. Whether these deviations are due to disturbances in attribution to persons or to disturbances in dual representation cannot be decided here. Both biological and societal conditions are involved in the formation of the self, and when they take an unusual course, the causes could lie in both domains.


“The Varieties of Dissociative Experience”
by Stanley Krippner
from Broken Images, Broken Selves: Dissociative Narratives in Clinical Practice
pp. 339-341

In his provocative description of the evolution of humanity’s conscious awareness, Jaynes (1976) asserted that ancient people’s “bicameral mind” enabled them to experience auditory hallucinations— the voices of the deities— but they eventually developed an integration of the right and left cortical hemispheres. According to Jaynes, vestiges of this dissociation can still be found, most notably among the mentally ill, the extremely imaginative, and the highly suggestible. Even before the development of the cortical hemispheres, the human brain had slowly evolved from a “reptilian brain” (controlling breathing, fighting, mating, and other fixed behaviors), to the addition of an “old-mammalian brain,” (the limbic system, which contributed emotional components such as fear, anger, and affection), to the superimposition of a “new-mammalian brain” (responsible for advanced sensory processing and thought processes). MacLean (1977) describes this “triune brain” as responsible, in part, for distress and inefficiency when the parts do not work well together. Both Jaynes’ and MacLean’s theories are controversial, but I believe that there is enough autonomy in the limbic system and in each of the cortical hemispheres to justify Ornstein’s (1986) conclusion that human beings are much more complex and intricate than they imagine, consisting of “an uncountable number of small minds” (p. 72), sometimes collaborating and sometimes competing. Donald’s (1991) portrayal of mental evolution also makes use of the stylistic differences of the cerebral hemisphere, but with a greater emphasis on neuropsychology than Jaynes employs. Mithen’s (1996) evolutionary model is a sophisticated account of how specialized “cognitive domains” reached the point that integrated “cognitive fluidity” (apparent in art and the use of symbols) was possible.

James (1890) spoke of a “multitude” of selves, and some of these selves seem to go their separate ways in posttraumatic stress disorder (PTSD) (see Greening, Chapter 5), dissociative identity disorder (DID) (see Levin, Chapter 6), alien abduction experiences (see Powers, Chapter 9), sleep disturbances (see Barrett, Chapter 10), psychedelic drug experiences (see Greenberg, Chapter 11), death terrors (see Lapin, Chapter 12), fantasy proneness (see Lynn, Pintar, & Rhue, Chapter 13), near-death experiences (NDEs) (see Greyson, Chapter 7), and mediumship (see Grosso, Chapter 8). Each of these conditions can be placed into a narrative construction, and the value of these frameworks has been described by several authors (e.g., Barclay, Chapter 14; Lynn, Pintar, & Rhue, Chapter 13; White, Chapter 4). Barclay (Chapter 14) and Powers (Chapter 15) have addressed the issue of narrative veracity and validation, crucial issues when stories are used in psychotherapy. The American Psychiatric Association’s Board of Trustees (1993) felt constrained to issue an official statement that “it is not known what proportion of adults who report memories of sexual abuse were actually abused” (p. 2). Some reports may be fabricated, but it is more likely that traumatic memories may be misconstrued and elaborated (Steinberg, 1995, p. 55). Much of the same ambiguity surrounds many other narrative accounts involving dissociation, especially those described by White (Chapter 4) as “exceptional human experiences.”

Nevertheless, the material in this book makes the case that dissociative accounts are not inevitably uncontrolled and dysfunctional. Many narratives considered “exceptional” from a Western perspective suggest that dissociation once served and continues to serve adaptive functions in human evolution. For example, the “sham death” reflex found in animals with slow locomotor abilities effectively offers protection against predators with greater speed and agility. Uncontrolled motor responses often allow an animal to escape from dangerous or frightening situations through frantic, trial-and-error activity (Kretchmer, 1926). Many evolutionary psychologists have directed their attention to the possible value of a “multimodular” human brain that prevents painful, unacceptable, and disturbing thoughts, wishes, impulses, and memories from surfacing into awareness and interfering with one’s ongoing contest for survival (Nesse & Lloyd, 1992, p. 610). Ross (1991) suggests that Western societies suppress this natural and valuable capacity at their peril.

The widespread prevalence of dissociative reactions argues for their survival value, and Ludwig (1983) has identified seven of them: (1) The capacity for automatic control of complex, learned behaviors permits organisms to handle a much greater work load in as smooth a manner as possible; habitual and learned behaviors are permitted to operate with a minimum expenditure of conscious control. (2) The dissociative process allows critical judgment to be suspended so that, at times, gratification can be more immediate. (3) Dissociation seems ideally suited for dealing with basic conflicts when there is no instant means of resolution, freeing an individual to take concerted action in areas lacking discord. (4) Dissociation enables individuals to escape the bounds of reality, providing for inspiration, hope, and even some forms of “magical thinking.” (5) Catastrophic experiences can be isolated and kept in check through dissociative defense mechanisms. (6) Dissociative experiences facilitate the expression of pent-up emotions through a variety of culturally sanctioned activities. (7) Social cohesiveness and group action often are facilitated by dissociative activities that bind people together through heightened suggestibility.

Each of these potentially adaptive functions may be life-depotentiating as well as life-potentiating; each can be controlled as well as uncontrolled. A critical issue for the attribution of dissociation may be the dispositional set of the experiencer-in-context along with the event’s adaptive purpose. Salamon (1996) described her mother’s ability to disconnect herself from unpleasant surroundings or facts, a proclivity that led to her ignoring the oncoming imprisonment of Jews in Nazi Germany but that, paradoxically, enabled her to survive her years in Auschwitz. Gergen (1991) has described the jaundiced eye that modern Western science has cast toward Dionysian revelry, spiritual experiences, mysticism, and a sense of bonded unity with nature, a hostility he predicts may evaporate in the so-called “postmodern” era, which will “open the way to the full expression of all discourses” (pp. 246– 247). For Gergen, this postmodern lifestyle is epitomized by Proteus, the Greek sea god, who could change his shape from wild boar to dragon, from fire to flood, without obvious coherence through time. This is all very well and good, as long as this dissociated existence does not leave— in its wake— a residue of broken selves whose lives have lost any intentionality or meaning, who live in the midst of broken images, and whose multiplicity has resulted in nihilistic affliction and torment rather than in liberation and fulfillment (Glass, 1993, p. 59).

Probability of Reality as We Know It

While jogging this morning, I got a pebble in my shoe. I was on a sidewalk that wasn’t covered in rocks. The shoes I had on have high tops and were tied tightly. The thought of probability occurred to me, considering all the perfect conditions that had to come together to lead to even such a simple result as a pebble in my shoe.

I had to step on one of the few tiny rocks that happened to be in the right spot. Somehow the rock got kicked up about six inches, where it caught the back edge of my shoe. It had to land perfectly in order to lodge in the slight space between my foot and the shoe. Then it had to make its way down into my shoe without first getting kicked back out.

It just got me thinking. For any given person at any given moment, a rock getting in their shoe is highly improbable. I run and/or walk numerous times every single day. And I can go years without getting a rock in my shoe. Even when it does happen, it would usually be because I was walking on a gravel road or alley, not on a standard sidewalk. Yet for all of the billions of people who are out and about every single day, the probability of numerous people getting rocks in their shoes at any given moment is quite high.

A more exciting example is getting struck by lightning. The vast majority of people go through their entire lives without getting hit. Still, there is a minuscule minority of the world’s population that gets hit on any given day. Some rare people even get struck by lightning multiple times in their lifetime. Lightning directly hitting any single person is extremely improbable, while lightning directly hitting some person somewhere is extremely probable.

Most people don’t go around worrying about lightning, but right at this moment multiple people in the world are probably getting struck. Someone somewhere inevitably will get struck. It could be you, right now where you are. And sometimes lightning comes seemingly out of nowhere with no storm in sight, even on occasion hitting people in their houses.
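
To make the arithmetic concrete, here is a minimal sketch in Python. The per-person daily probabilities are invented placeholders rather than measured figures, and the calculation assumes people are independent of one another; the point is only how the complement rule, 1 - (1 - p)^n, turns an individually rare event into a collective near-certainty.

# A back-of-the-envelope sketch: individually rare events become
# collectively common. The per-person daily probabilities below are
# invented placeholders, not measured data.

def prob_someone_somewhere(p, n):
    """Chance that at least one of n independent people experiences
    an event with individual probability p: 1 - (1 - p)**n."""
    return 1.0 - (1.0 - p) ** n

N = 5_000_000_000  # rough guess at how many people are out and about each day

p_pebble = 1e-5      # suppose a 1-in-100,000 chance of a pebble per person per day
p_lightning = 1e-9   # suppose a 1-in-a-billion chance of a strike per person per day

print(prob_someone_somewhere(p_pebble, N))     # ~1.0: a pebble in someone's shoe is a daily certainty
print(prob_someone_somewhere(p_lightning, N))  # ~0.993: someone somewhere likely gets struck today

Under these made-up numbers, any single person could go decades without either event, yet a pebble in someone’s shoe is an everyday certainty, and even the one-in-a-billion lightning strike hits someone somewhere with better than 99 percent probability.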

Probability is dependent on context. So it depends on our perspective, on how we look at the data and how we calculate the probability. Our view of probability tends to be biased by the personal, of course. So it tends to be biased by what we know and have experienced, what is familiar to us. It is hard to think about probability in purely rational terms.

Given the right perspective, almost anything can be seen as improbable.

The entire existence of the universe, if one thinks too much about it, starts to seem improbable. Also improbable is life emerging on a particular planet, then that life leading to consciousness, intelligence, and advanced civilizations. Even so, because of the immense number of planets in the immense number of solar systems in the immense number of galaxies, it is probable to the point of near inevitability that there are vast numbers of planets with conscious, intelligent lifeforms and advanced civilizations.

Heck, we might be surrounded by lifeforms on our planet and in our own solar system while being unable to perceive and recognize them. We think of the probability of life, along with all that goes with it, in terms of the life we know immediately around us. But the actual probability is that other lifeforms would be bizarre to us, if we could even discern them. Other lifeforms might be beings of energy or fluids, might be too small to detect with our senses or too large to comprehend with our minds. If a gut microbe gained intelligence and you were able to ask it what the probability was that its world was a giant ambling creature, the response would probably be amused laughter, or else it would look at you as though you were crazy. Maybe our own imagination toward that which is beyond us is as limited as that of the gut microbe.

Another aspect is cultural bias. People living in a society that wears sandals would have a different view of the probability of rocks in their ‘shoes’ than those in a society that wears tall boots. Societies that don’t wear any footwear at all wouldn’t even comprehend the issue of rocks in shoes. The same goes for beings that can’t be seen: in some societies it is common belief that such beings (ghosts, spirits, demons, elves, and other supernatural creatures) are all around us, and people there may claim to know how to interact with them.

How do we determine the probability that bicameral societies existed in the ancient world? Some say it isn’t even plausible, much less probable. I was reading Hearing Voices by Simon McCarthy-Jones, and the author was in this doubting camp. He basically argued that, when ancient non-Western texts are interpreted through modern Western preconceptions, it is highly improbable that ancient non-Western societies existed that contradicted modern Western preconceptions. Uh, well, yeah, I guess. Within that circular logic, it is indeed a coherent opinion. But obviously others disagree, based on the possibility of other ways of interpreting the same evidence. For example, unlike McCarthy-Jones, some people point to the anthropological record for possible examples of bicameralism or something akin to it, such as the Ugandan Ik and the Amazonian Pirahã.

My point isn’t whether or not bicameral theory is the best possible explanation of the data. But even ignoring the theory, the anthropological record makes absolutely clear that there are societies that seem very strange to our modern Western sensibility. Then again, to those other societies, we would appear strange. Considering how perfect the conditions have had to be, all of modern Western civilization is highly improbable. If it were possible to re-create the entire world in a vast laboratory, you could run the experiment numerous times and probably never repeat these same results. Supposedly strange societies like the Ik and Pirahã are immensely more probable than our own strange society. Some other societies have lasted for thousands of years; we might be lucky to last the coming century.

Although it’s possible that the world perfectly matches our present beliefs and biases, it is ridiculously improbable that such is the case. Future generations surely will look back on us as we look back on the ignorance and barbarity of ancient societies. So, who are we to hold ourselves up as the norm for all of humanity? And who are we to use our cultural biases to judge all of reality?

We have no way to determine the probability of most things or often even their plausibility. All we know is what we know. And we don’t know what we don’t know. Usually, we don’t even know that we don’t know what we don’t know. Our state of ignorance is almost entirely self-enclosed, as what we know or think we know is inseparable from what we don’t know. As it has been said: The world is not only stranger than we imagine, it is stranger than we can imagine.

The world is full of kicked-up pebbles and lightning strikes, strange lifeforms and even stranger cultures. Everything is improbable from some perspective, until it happens to you, and then it’s the most probable thing in the world. Then it simply is the reality you know.

Shaken and Stirred

I Is an Other
by James Geary
Kindle Locations 303-310

Descartes’s “Cogito ergo sum.”

This phrase is routinely translated as:

I think, therefore I am.

But there is a better translation.

The Latin word cogito is derived from the prefix co (with or together) and the verb agitare (to shake). Agitare is the root of the English words “agitate” and “agitation.” Thus, the original meaning of cogito is “to shake together,” and the proper translation of “Cogito ergo sum” is:

I shake things up, therefore I am.

Staying with the Trouble
by Donna J. Haraway
Kindle Locations 293-303

Trouble is an interesting word. It derives from a thirteenth-century French verb meaning “to stir up,” “to make cloudy,” “to disturb.” We— all of us on Terra— live in disturbing times, mixed-up times, troubling and turbid times. The task is to become capable, with each other in all of our bumptious kinds, of response. Mixed-up times are overflowing with both pain and joy— with vastly unjust patterns of pain and joy, with unnecessary killing of ongoingness but also with necessary resurgence. The task is to make kin in lines of inventive connection as a practice of learning to live and die well with each other in a thick present. Our task is to make trouble, to stir up potent response to devastating events, as well as to settle troubled waters and rebuild quiet places. In urgent times, many of us are tempted to address trouble in terms of making an imagined future safe, of stopping something from happening that looms in the future, of clearing away the present and the past in order to make futures for coming generations. Staying with the trouble does not require such a relationship to times called the future. In fact, staying with the trouble requires learning to be truly present, not as a vanishing pivot between awful or edenic pasts and apocalyptic or salvific futures, but as mortal critters entwined in myriad unfinished configurations of places, times, matters, meanings.