The Madness of Reason

Commenting online brings one into contact with odd people. It is often merely irritating, but at times it can be fascinating to see all the strange ways humanity expresses itself.

I met a guy, Naj Ziad, on the Facebook page of a discussion group about Julian Jaynes’ book, The Origin of Consciousness in the Breakdown of the Bicameral Mind. He posted something expressing his obsession with logical coherence and consistency. We talked at length, including in a post on his own Facebook page, and he seemed like a nice enough guy. He came across as genuine in his intentions and worldview, but there was simply something off about him.

It’s likely he has Asperger’s, of a high-IQ and high-functioning variety. Or it could be that he has some kind of personality disorder. Either way, my sense is that he is severely lacking in cognitive empathy, though I doubt he is deficient in affective empathy. He just doesn’t seem to get that other people can perceive and experience the world differently than he does, or that others exist entirely as separate entities apart from his own existence.

When I claimed that my worldview was simply different than his and that neither of our personal realities could be reduced to the other, he called me a dualist. I came to the conclusion that this guy was a solipsist, although he doesn’t identify that way. Solipsism was the only philosophy that made sense of his idiosyncratic ramblings, entirely logical ramblings I might add. He is obviously intelligent, clever, and reasonably well read. His verbal intelligence is particularly high.

In fact, he is so obsessed with his verbal intelligence that he has come to the conclusion that all of reality is language. Of course, he has his own private definition of language which asserts that language is everything. This leaves his argument as a tautology and he freely admitted this was the case, but he kept returning to his defense that his argument was logically consistent and coherent. Sure.

It was endlessly amusing. He really could not grasp that the way his mind operates isn’t how everyone’s mind operates, and so he couldn’t escape the hermetically-sealed reality tunnel of his own clever monkey mind. His worldview is so perfectly constructed and orderly that there isn’t a single crack to let in fresh air or a beam of light.

He was doing a wondrous impression of Spock in being entirely logical within his narrow psychological and ideological framework. He kept falling back on his being logical, along with his use of idiosyncratic jargon. He defined his terms to fit his ideological worldview so entirely that those terms had no meaning to him outside of it. It all made perfect sense within itself.

His life philosophy is a well-rehearsed script that he goes on repeating. It is an amazing thing to observe as an outsider, especially considering he stubbornly refused to acknowledge that anything could be outside of his own mind for he couldn’t imagine the world being different than his own mind. He wouldn’t let go of his beliefs about reality, like the monkey with his hand trapped in a jar because he won’t let go of the banana.

If this guy were just insane or a troll, I would dismiss him out of hand. But that isn’t the case. Obviously, he is neuroatypical, and I won’t hold that against anyone. And I freely admit that his ideological worldview is logically consistent and coherent, for whatever that is worth.

What made it so fascinating to my mind is that solipsism has always been a speculative philosophy, to be considered only as a thought experiment. It never occurred to me that there would be a highly intelligent and rational person who would seriously uphold it as an entire self-contained worldview and lifestyle. His arguments for it were equally fascinating and he had interesting thoughts and insights, some of which I even agreed with. He is a brilliant guy who, as sometimes happens, has gone a bit off the deep end.

He built an ideology that perfectly expresses and conforms to his highly unusual neurocognitive profile. And of course, when I pointed this out to him, he dismissed it as ‘psychologizing’. His arguments are so perfectly patched together that he never refers to any factual evidence in support of his ideological commitments, as it is entirely unnecessary in his own mind. External facts in the external world, reliance on which he calls ‘scientism’, are as meaningless as others claiming to have existence independent of his own. From his perspective, there is only what he calls the ‘now’, and there can only be one ‘now’ to rule them all, which just so happens to coincide with his own ego-mind.

If you challenge him on any of this, he is highly articulate in defending why he is being entirely reasonable. Ah, the madness of reason!

* * *

On a personal note, I should make clear that I sympathize with this guy. I have my own psychological issues (depression, anxiety, thought disorder, strong introversion, etc.) that can make me feel isolated and cause me to retreat further into myself. Along with a tendency to over-intellectualize everything, my psychological issues have at times led me to get lost in my own head.

I can even understand the attraction of solipsism and, as a thought experiment, I’ve entertained it. But somehow I’ve always known that I’m not ‘normal’, which is to say that others are not like me. I have never actually doubted that others not only exist but exist in a wide variety of ways. It hasn’t occurred to me to deny all otherness by reducing all others to my own psychological experience and ideological worldview. I’ve never quite been that lost in myself, although I can imagine how it might happen. There have been moments in my life when my mind could have gone off the deep end.

Yet my sympathy only goes so far. It is hard to sympathize with someone who refuses to acknowledge your independent existence as a unique human being with your own identity and views. There is an element of frustration in dealing with a solipsist, but in this case my fascination drew me in. Before I ascertained he was a solipsist, it was obvious something about him was highly unusual. I kept poking and prodding him until the shape of his worldview became apparent. At that point, my fascination ended. Any further engagement would have continued to go around in circles, which means watching this guy’s mind go around in circles like a dog chasing its own tail.

Of all the contortions the human mind can put itself into, solipsism has to be one of the greatest feats to accomplish. I have to give this guy credit where it’s due. Not many people could keep up such a mindset for long.

Hyperballad and Hyperobjects

Morton’s use of the term ‘hyperobjects’ was inspired by Björk’s 1996 single ‘Hyperballad’
(Wikipedia)

Björk
by Timothy Morton

Björk and I think that there is a major cultural shift going on around the world towards something beyond cynical reason and nihilism, as more and more it becomes impossible not to have consideration for nonhumans in everything we do. Hopefully this piece we made contributes to that somehow.

I was so lucky to be doing this while she was mixing her album with some of the nicest and most incredible musicians/producers I’ve ever met…great examples of this shift beyond cynical reason…

Here is something I think is so so amazing, the Subtle Abuse mix of “Hyperballad.” Car parts, bottles, cutlery–all the objects, right? Not to mention Björk’s body “slamming against those rocks.” It’s a veritable Latour Litany… And the haunting repetition…

Dark Ecological Chocolate
by Timothy Morton

This being-an-object is intimately related with the Kantian beauty experience, wherein I find experiential evidence without metaphysical positing that at least one other being exists. The Sadness is the attunement of coexistence stripped of its conceptual content. Since the rigid anthropocentric standard of taste with its refined distances has collapsed, it becomes at this level impossible to rebuild the distinction we lost in The Ethereal between being interested or concerned with (this painting, this polar bear) and being fascinated by… Being interested means I am in charge. Being fascinated means that something else is. Beauty starts to show the subscendent wiring under the board.

Take Björk. Her song “Hyperballad” is a classic example of what I’m trying to talk about here. She shows you the wiring under the board of an emotion, the way a straightforward feeling like I love you is obviously not straightforward at all, so don’t write a love song like that, write one that says you’re sitting on top of this cliff, and you’re dropping bits and pieces of the edge like car parts, bottles and cutlery, all kinds of not-you nonhuman prosthetic bits that we take to be extensions of our totally integrated up to date shiny religious holistic selves, and then you picture throwing yourself off, and what would you look like—to the you who’s watching you still on the edge of the cliff—as you fell, and when you hit the bottom would you be alive or dead, would you look awake or asleep, would your eyes be closed, or open?

When you experience beauty you experience evidence in your inner space that at least one thing that isn’t you exists. An evanescent footprint in your inner space—you don’t need to prove that things are real by hitting them or eating them. A nonviolent coexisting without coercion. There is an undecidability between two entities—me and not-me, the thing. Beauty is sad because it is ungraspable; there is an elegiac quality to it. When we grasp it withdraws, like putting my hand into water. Yet it appears.

Beauty is virtual: I am unable to tell whether the beauty resides in me or in the thing—it is as if it were in the thing, but impossible to pin down there. The subjunctive, floating “as if” virtual reality of beauty is a little queasy—the thing emits a tractor beam in whose vortex I find myself; I veer towards it. The aesthetic dimension says something true about causality in a modern age: I can’t tell for sure what the causes and effects are without resorting to illegal metaphysical moves.[14] Something slightly sinister is afoot—there is a basic entanglement such that I can’t tell who or what started it.

Beauty is the givenness of data. A thing impinges on me before I can contain it or use it or think it. It is as if I hear the thing breathing right next to me. From the standpoint of agricultural white patriarchy, something slightly “evil” is happening: something already has a grip on us, and this is demonic insofar as it is “from elsewhere.” This “saturated” demonic proximity is the essential ingredient of ecological being and ecological awareness, not some Nature over yonder.[15]

Interdependence, which is ecology, is sad and contingent. Because of interdependence, when I’m nice to a bunny rabbit I’m not being nice to bunny rabbit parasites. Amazing violence would be required to try to fit a form over everything all at once. If you try then you basically undermine the bunnies and everything else into components of a machine, replaceable components whose only important aspect is their existence.

“Lack of the historical sense is the traditional defect in all philosophers.”

Human, All Too Human: A Book for Free Spirits
by Friedrich Wilhelm Nietzsche

The Traditional Error of Philosophers.—All philosophers make the common mistake of taking contemporary man as their starting point and of trying, through an analysis of him, to reach a conclusion. “Man” involuntarily presents himself to them as an aeterna veritas, as a passive element in every hurly-burly, as a fixed standard of things. Yet everything uttered by the philosopher on the subject of man is, in the last resort, nothing more than a piece of testimony concerning man during a very limited period of time. Lack of the historical sense is the traditional defect in all philosophers. Many innocently take man in his most childish state as fashioned through the influence of certain religious and even of certain political developments, as the permanent form under which man must be viewed. They will not learn that man has evolved, that the intellectual faculty itself is an evolution, whereas some philosophers make the whole cosmos out of this intellectual faculty. But everything essential in human evolution took place aeons ago, long before the four thousand years or so of which we know anything: during these man may not have changed very much. However, the philosopher ascribes “instinct” to contemporary man and assumes that this is one of the unalterable facts regarding man himself, and hence affords a clue to the understanding of the universe in general. The whole teleology is so planned that man during the last four thousand years shall be spoken of as a being existing from all eternity, and with reference to whom everything in the cosmos from its very inception is naturally ordered. Yet everything evolved: there are no eternal facts as there are no absolute truths. Accordingly, historical philosophising is henceforth indispensable, and with it honesty of judgment.

What Locke Lacked
by Louise Mabille

Locke is indeed a Colossus of modernity, but one whose twin projects of providing a concept of human understanding and political foundation undermine each other. The specificity of the experience of perception alone undermines the universality and uniformity necessary to create the subject required for a justifiable liberalism. Since mere physical perspective can generate so much difference, it is only to be expected that political differences would be even more glaring. However, no political order would ever come to pass without obliterating essential differences. The birth of liberalism was as violent as the Empire that would later be justified in its name, even if its political traces are not so obvious. To interpret is to see in a particular way, at the expense of all other possibilities of interpretation. Perspectives that do not fit are simply ignored, or as that other great resurrectionist of modernity, Freud, would concur, simply driven underground. We ourselves are the source of this interpretative injustice, or more correctly, our need for a world in which it is possible to live, is. To a certain extent, then, man is the measure of the world, but only his world. Man is thus a contingent measure and our measurements do not refer to an original, underlying reality. What we call reality is the result not only of our limited perspectives upon the world, but the interplay of those perspectives themselves. The liberal subject is thus a result of, and not a foundation for, the experience of reality. The subject is identified as origin of meaning only through a process of differentiation and reduction, a course through which the will is designated as a psychological property.

Locke takes the existence of the subject of free will – free to exercise political choice such as rising against a tyrant, choosing representatives, or deciding upon political direction – simply for granted. Furthermore, he seems to think that everyone should agree as to what the rules are according to which these events should happen. For him, the liberal subject underlying these choices is clearly fundamental and universal.

Locke’s philosophy of individualism posits the existence of a discrete and isolated individual, with private interests and rights, independent of his linguistic or socio-historical context. C. B. Macpherson identifies a distinctly possessive quality to Locke’s individualist ethic, notably in the way in which the individual is conceived as proprietor of his own personhood, possessing capacities such as self-reflection and free will. Freedom becomes associated with possession, which the Greeks would associate with slavery, and society is conceived in terms of a collection of free and equal individuals who are related to each other through their means of achieving material success – which Nietzsche, too, would associate with slave morality. […]

There is a central tenet to John Locke’s thinking that, as conventional as it has become, remains a strange strategy. Like Thomas Hobbes, he justifies modern society by contrasting it with an original state of nature. For Hobbes, as we have seen, the state of nature is but a hypothesis, a conceptual tool in order to elucidate a point. For Locke, however, the state of nature is a very real historical event, although not a condition of a state of war. Man was social by nature, rational and free. Locke drew this inspiration from Richard Hooker’s Laws of Ecclesiastical Polity, notably from his idea that church government should be based upon human nature, and not the Bible, which, according to Hooker, told us nothing about human nature. The social contract is a means to escape from nature, friendlier though it be on the Lockean account. For Nietzsche, however, we have never made the escape: we are still holus-bolus in it: ‘being conscious is in no decisive sense the opposite of the instinctive – most of the philosopher’s conscious thinking is secretly directed and compelled into definite channels by his instincts. Behind all logic too, and its apparent autonomy there stand evaluations’ (BGE, 3). Locke makes a singular mistake in thinking the state of nature a distant event. In fact, Nietzsche tells us, we have never left it. We now only wield more sophisticated weapons, such as the guilty conscience […]

Truth originates when humans forget that they are ‘artistically creating subjects’ or products of law or stasis and begin to attach ‘invincible faith’ to their perceptions, thereby creating truth itself. For Nietzsche, the key to understanding the ethic of the concept, the ethic of representation, is conviction […]

Few convictions have proven to be as strong as the conviction of the existence of a fundamental subjectivity. For Nietzsche, it is an illusion, a bundle of drives loosely collected under the name of ‘subject’—indeed, it is nothing but these drives, willing, and actions in themselves—and it cannot appear as anything else except through the seduction of language (and the fundamental errors of reason petrified in it), which understands and misunderstands all action as conditioned by something which causes actions, by a ‘Subject’ (GM I 13). Subjectivity is a form of linguistic reductionism, and when using language, ‘[w]e enter a realm of crude fetishism when we summon before consciousness the basic presuppositions of the metaphysics of language — in plain talk, the presuppositions of reason. Everywhere reason sees a doer and doing; it believes in will as the cause; it believes in the ego, in the ego as being, in the ego as substance, and it projects this faith in the ego-substance upon all things — only thereby does it first create the concept of ‘thing’ (TI, ‘Reason in Philosophy’ 5). As Nietzsche also states in WP 484, the habit of adding a doer to a deed is a Cartesian leftover that begs more questions than it solves. It is indeed nothing more than an inference according to habit: ‘There is activity, every activity requires an agent, consequently –’ (BGE, 17). Locke himself found the continuous existence of the self problematic, but did not go as far as Hume’s dissolution of the self into a number of ‘bundles’. After all, even if identity shifts occurred behind the scenes, he required a subject with enough unity to be able to enter into the Social Contract. This subject had to be something more than merely an ‘eternal grammatical blunder’ (D, 120), and willing had to be understood as something simple.
For Nietzsche, it is ‘above all complicated, something that is a unit only as a word, a word in which the popular prejudice lurks, which has defeated the always inadequate caution of philosophers’ (BGE, 19).

Nietzsche’s critique of past philosophers
by Michael Lacewing

Nietzsche is questioning the very foundations of philosophy. To accept his claims means being a new kind of philosopher, one whose ‘taste and inclination’, whose values, are quite different. Throughout his philosophy, Nietzsche is concerned with origins, both psychological and historical. Much of philosophy is usually thought of as an a priori investigation. But if Nietzsche can show, as he thinks he can, that philosophical theories and arguments have a specific historical basis, then they are not, in fact, a priori. What is known a priori should not change from one historical era to the next, nor should it depend on someone’s psychology. Plato’s aim, the aim that defines much of philosophy, is to be able to give complete definitions of ideas – ‘what is justice?’, ‘what is knowledge?’. For Plato, we understand an idea when we have direct knowledge of the Form, which is unchanging and has no history. If our ideas have a history, then the philosophical project of trying to give definitions of our concepts, rather than histories, is radically mistaken. For example, in §186, Nietzsche argues that philosophers have consulted their ‘intuitions’ to try to justify this or that moral principle. But they have only been aware of their own morality, of which their ‘justifications’ are in fact only expressions. Morality and moral intuitions have a history, and are not a priori. There is no one definition of justice or good, and the ‘intuitions’ that we use to defend this or that theory are themselves as historical, as contentious as the theories we give – so they offer no real support. The usual ways philosophers discuss morality misunderstand morality from the very outset. The real issues of understanding morality only emerge when we look at the relation between this particular morality and that. There is no world of unchanging ideas, no truths beyond the truths of the world we experience, nothing that stands outside or beyond nature and history.

GENEALOGY AND PHILOSOPHY

Nietzsche develops a new way of philosophizing, which he calls a ‘morphology and evolutionary theory’ (§23), and later calls ‘genealogy’. (‘Morphology’ means the study of the forms something, e.g. morality, can take; ‘genealogy’ means the historical line of descent traced from an ancestor.) He aims to locate the historical origin of philosophical and religious ideas and show how they have changed over time to the present day. His investigation brings together history, psychology, the interpretation of concepts, and a keen sense of what it is like to live with particular ideas and values. In order to best understand which of our ideas and values are particular to us, not a priori or universal, we need to look at real alternatives. In order to understand these alternatives, we need to understand the psychology of the people who lived with them. And so Nietzsche argues that traditional ways of doing philosophy fail – our intuitions are not a reliable guide to the ‘truth’, to the ‘real’ nature of this or that idea or value. And not just our intuitions, but the arguments, and style of arguing, that philosophers have used are unreliable. Philosophy needs to become, or be informed by, genealogy. A lack of any historical sense, says Nietzsche, is the ‘hereditary defect’ of all philosophers.

MOTIVATIONAL ANALYSIS

Having long kept a strict eye on the philosophers, and having looked between their lines, I say to myself… most of a philosopher’s conscious thinking is secretly guided and channelled into particular tracks by his instincts. Behind all logic, too, and its apparent tyranny of movement there are value judgements, or to speak more clearly, physiological demands for the preservation of a particular kind of life. (§3)

A person’s theoretical beliefs are best explained, Nietzsche thinks, by evaluative beliefs, particular interpretations of certain values, e.g. that goodness is this and the opposite of badness. These values are best explained as ‘physiological demands for the preservation of a particular kind of life’. Nietzsche holds that each person has a particular psychophysical constitution, formed by both heredity and culture. […] Different values, and different interpretations of these values, support different ways of life, and so people are instinctively drawn to particular values and ways of understanding them. On the basis of these interpretations of values, people come to hold particular philosophical views. §2 has given us an illustration of this: philosophers come to hold metaphysical beliefs about a transcendent world, the ‘true’ and ‘good’ world, because they cannot believe that truth and goodness could originate in the world of normal experience, which is full of illusion, error, and selfishness. Therefore, there ‘must’ be a pure, spiritual world and a spiritual part of human beings, which is the origin of truth and goodness.

PHILOSOPHY AND VALUES

But ‘must’ there be a transcendent world? Or is this just what the philosopher wants to be true? Every great philosophy, claims Nietzsche, is ‘the personal confession of its author’ (§6). The moral aims of a philosophy are the ‘seed’ from which the whole theory grows.
Philosophers pretend that their opinions have been reached by ‘cold, pure, divinely unhampered dialectic’ when in fact, they are seeking reasons to support their pre-existing commitment to ‘a rarefied and abstract version of their heart’s desire’ (§5), viz. that there is a transcendent world, and that good and bad, true and false are opposites. Consider: Many philosophical systems are of doubtful coherence, e.g. how could there be Forms, and if there were, how could we know about them? Or again, in §11, Nietzsche asks ‘how are synthetic a priori judgments possible?’. The term ‘synthetic a priori’ was invented by Kant. According to Nietzsche, Kant says that such judgments are possible, because we have a ‘faculty’ that makes them possible. What kind of answer is this?? Furthermore, no philosopher has ever been proved right (§25). Given the great difficulty of believing either in a transcendent world or in human cognitive abilities necessary to know about it, we should look elsewhere for an explanation of why someone would hold those beliefs. We can find an answer in their values. There is an interesting structural similarity between Nietzsche’s argument and Hume’s. Both argue that there is no rational explanation of many of our beliefs, and so they try to find the source of these beliefs outside or beyond reason. Hume appeals to imagination and the principle of ‘Custom’. Nietzsche appeals instead to motivation and ‘the bewitchment of language’ (see below). So Nietzsche argues that philosophy is not driven by a pure ‘will to truth’ (§1), to discover the truth whatever it may be. Instead, a philosophy interprets the world in terms of the philosopher’s values. For example, the Stoics argued that we should live ‘according to nature’ (§9). But they interpret nature by their own values, as an embodiment of rationality. They do not see the senselessness, the purposelessness, the indifference of nature to our lives […]

THE BEWITCHMENT OF LANGUAGE

We said above that Nietzsche criticizes past philosophers on two grounds. We have looked at the role of motivation; the second ground is the seduction of grammar. Nietzsche is concerned with the subject-predicate structure of language, and with it the notion of a ‘substance’ (picked out by the grammatical ‘subject’) to which we attribute ‘properties’ (identified by the predicate). This structure leads us into a mistaken metaphysics of ‘substances’. In particular, Nietzsche is concerned with the grammar of ‘I’. We tend to think that ‘I’ refers to some thing, e.g. the soul. Descartes makes this mistake in his cogito – ‘I think’, he argues, refers to a substance engaged in an activity. But Nietzsche repeats the old objection that this is an illegitimate inference (§16) that rests on many unproven assumptions – that I am thinking, that some thing is thinking, that thinking is an activity (the result of a cause, viz. I), that an ‘I’ exists, that we know what it is to think. So the simple sentence ‘I think’ is misleading. In fact, ‘a thought comes when ‘it’ wants to, and not when ‘I’ want it to’ (§17). Even ‘there is thinking’ isn’t right: ‘even this ‘there’ contains an interpretation of the process and is not part of the process itself. People are concluding here according to grammatical habit’. But our language does not allow us just to say ‘thinking’ – this is not a whole sentence. We have to say ‘there is thinking’; so grammar constrains our understanding. Furthermore, Kant shows that rather than the ‘I’ being the basis of thinking, thinking is the basis out of which the appearance of an ‘I’ is created (§54). 
Once we recognise that there is no soul in a traditional sense, no ‘substance’, something constant through change, something unitary and immortal, ‘the way is clear for new and refined versions of the hypothesis about the soul’ (§12), that it is mortal, that it is multiplicity rather than identical over time, even that it is a social construct and a society of drives. Nietzsche makes a similar argument about the will (§19). Because we have this one word ‘will’, we think that what it refers to must also be one thing. But the act of willing is highly complicated. First, there is an emotion of command, for willing is commanding oneself to do something, and with it a feeling of superiority over that which obeys. Second, there is the expectation that the mere commanding on its own is enough for the action to follow, which increases our sense of power. Third, there is obedience to the command, from which we also derive pleasure. But we ignore the feeling of compulsion, identifying the ‘I’ with the commanding ‘will’. Nietzsche links the seduction of language to the issue of motivation in §20, arguing that ‘the spell of certain grammatical functions is the spell of physiological value judgements’. So even the grammatical structure of language originates in our instincts, different grammars contributing to the creation of favourable conditions for different types of life. So what values are served by these notions of the ‘I’ and the ‘will’? The ‘I’ relates to the idea that we have a soul, which participates in a transcendent world. It functions in support of the ascetic ideal. The ‘will’, and in particular our inherited conception of ‘free will’, serves a particular moral aim.

Hume and Nietzsche: Moral Psychology (short essay)
by epictetus_rex

1. Metaphilosophical Motivation

Both Hume and Nietzsche advocate a kind of naturalism. This is a weak naturalism, for it does not seek to give science authority over philosophical inquiry, nor does it commit itself to a specific ontological or metaphysical picture. Rather, it seeks to (a) place the human mind firmly in the realm of nature, as subject to the same mechanisms that drive all other natural events, and (b) investigate the world in a way that is roughly congruent with our best current conception(s) of nature […]

Furthermore, the motivation for this general position is common to both thinkers. Hume and Nietzsche saw old rationalist/dualist philosophies as both absurd and harmful: such systems were committed to extravagant and contradictory metaphysical claims which hindered philosophical progress. Moreover, they alienated humanity from its position in nature—an effect Hume referred to as “anxiety”—and underpinned religious or “monkish” practises which greatly accentuated this alienation. Both Nietzsche and Hume believe quite strongly that coming to see ourselves as we really are will banish these bugbears from human life.

To this end, both thinkers ask us to engage in honest, realistic psychology. “Psychology is once more the path to the fundamental problems,” writes Nietzsche (BGE 23), and Hume agrees:

the only expedient, from which we can hope for success in our philosophical researches, is to leave the tedious lingering method, which we have hitherto followed, and instead of taking now and then a castle or village on the frontier, to march up directly to the capital or center of these sciences, to human nature itself. (T Intro)

2. Selfhood

Hume and Nietzsche militate against the notion of a unified self, both at-a-time and, a fortiori, over time.

Hume’s quest for a Newtonian “science of the mind” led him to classify all mental events as either impressions (sensory) or ideas (copies of sensory impressions, distinguished from the former by diminished vivacity or force). The self, or ego, as he says, is just “a kind of theatre, where several perceptions successively make their appearance; pass, re-pass, glide away, and mingle in an infinite variety of postures and situations. There is properly no simplicity in it at one time, nor identity in different; whatever natural propension we may have to imagine that simplicity and identity.” (Treatise 4.6) […]

For Nietzsche, the experience of willing lies in a certain kind of pleasure, a feeling of self-mastery and increase of power that comes with all success. This experience leads us to mistakenly posit a simple, unitary cause, the ego. (BGE 19)

The similarities here are manifest: our minds do not have any intrinsic unity to which the term “self” can properly refer; rather, they are collections or “bundles” of events (drives) which may align with or struggle against one another in a myriad of ways. Both thinkers use political models to describe what a person really is. Hume tells us we should “more properly compare the soul to a republic or commonwealth, in which the several members [impressions and ideas] are united by ties of government and subordination, and give rise to persons, who propagate the same republic in the incessant change of its parts” (T 261).

3. Action and The Will

Nietzsche and Hume attack the old platonic conception of a “free will” in lock-step with one another. This picture, roughly, involves a rational intellect which sits above the appetites and ultimately chooses which appetites will express themselves in action. This will is usually not considered to be part of the natural/empirical order, and it is this consequence which irks both Hume and Nietzsche, who offer two seamlessly interchangeable refutations […]

Since we are nothing above and beyond events, there is nothing for this “free will” to be: it is a causa sui, “a sort of rape and perversion of logic… the extravagant pride of man has managed to entangle itself profoundly and frightfully with just this nonsense” (BGE 21).

When they discover an erroneous or empty concept such as “free will” or “the self”, Nietzsche and Hume engage in a sort of error-theorizing which is structurally the same. Peter Kail (2006) has called this a “projective explanation”, whereby belief in those concepts is “explained by appeal to independently intelligible features of psychology”, rather than by reference to the way the world really is.

The Philosophy of Mind
INSTRUCTOR: Larry Hauser
Chapter 7: Egos, bundles, and multiple selves

  • Who dat?  “I”
    • Locke: “something, I know not what”
    • Hume: the no-self view … “bundle theory”
    • Kant’s transcendental ego: a formal (nonempirical) condition of thought that the “I” must accompany every perception.
      • Intentional mental state: I think that snow is white.
        • to think: a relation between
          • a subject = “I”
          • a propositional content thought =  snow is white
      • Sensations: I feel the coldness of the snow.
        • to feel: a relation between
          • a subject = “I”
          • a quale = the cold-feeling
    • Friedrich Nietzsche
      • A thought comes when “it” will and not when “I” will. Thus it is a falsification of the evidence to say that the subject “I” conditions the predicate “think.”
      • It is thought, to be sure, but that this “it” should be that old famous “I” is, to put it mildly, only a supposition, an assertion. Above all it is not an “immediate certainty.” … Our conclusion is here formulated out of our grammatical custom: “Thinking is an activity; every activity presumes something which is active, hence ….” 
    • Lichtenberg: “it’s thinking” a la “it’s raining”
      • a mere grammatical requirement
      • no proof of a thinking self

[…]

  • Ego vs. bundle theories (Derek Parfit (1987))
    • Ego: “there really is some kind of continuous self that is the subject of my experiences, that makes decisions, and so on.” (95)
      • Religions: Christianity, Islam, Hinduism
      • Philosophers: Descartes, Locke, Kant & many others (the majority view)
    • Bundle: “there is no underlying continuous and unitary self.” (95)
      • Religion: Buddhism
      • Philosophers: Hume, Nietzsche, Lichtenberg, Wittgenstein, Kripke(?), Parfit, Dennett {a stellar minority}
  • Hume v. Reid
    • David Hume: For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure.  I never can catch myself at any time without a perception, and never can observe anything but the perception.  (Hume 1739, Treatise I, iv, vi)
    • Thomas Reid: I am not thought, I am not action, I am not feeling: I am something which thinks and acts and feels. (1785)

Arete: History and Etymology

Arete (moral virtue)
Wikipedia

Arete (Greek: ἀρετή), in its basic sense, means “excellence of any kind”.[1] The term may also mean “moral virtue”.[1] In its earliest appearance in Greek, this notion of excellence was ultimately bound up with the notion of the fulfillment of purpose or function: the act of living up to one’s full potential.

The term from Homeric times onwards is not gender specific. Homer applies the term to both the Greek and Trojan heroes as well as major female figures, such as Penelope, the wife of the Greek hero Odysseus. In the Homeric poems, Arete is frequently associated with bravery, but more often with effectiveness. The man or woman of Arete is a person of the highest effectiveness; they use all their faculties—strength, bravery and wit—to achieve real results. In the Homeric world, then, Arete involves all of the abilities and potentialities available to humans.

In some contexts, Arete is explicitly linked with human knowledge, where the expressions “virtue is knowledge” and “Arete is knowledge” are used interchangeably. The highest human potential is knowledge and all other human abilities are derived from this central capacity. If Arete is knowledge and study, the highest human knowledge is knowledge about knowledge itself; in this light, the theoretical study of human knowledge, which Aristotle called “contemplation”, is the highest human ability and happiness.[2]

History

The Ancient Greeks applied the term to anything: for example, the excellence of a chimney, the excellence of a bull to be bred and the excellence of a man. The meaning of the word changes depending on what it describes, since everything has its own peculiar excellence; the arete of a man is different from the arete of a horse. This way of thinking comes first from Plato, where it can be seen in the Allegory of the Cave.[3] In particular, the aristocratic class was presumed, essentially by definition, to be exemplary of arete: “The root of the word is the same as aristos, the word which shows superlative ability and superiority, and aristos was constantly used in the plural to denote the nobility.”[4]

By the 5th and 4th centuries BC, arete as applied to men had developed to include quieter virtues, such as dikaiosyne (justice) and sophrosyne (self-restraint). Plato attempted to produce a moral philosophy that incorporated this new usage,[5] but it was in the work of Aristotle that the doctrine of arete found its fullest flowering. Aristotle’s Doctrine of the Mean is a paradigm example of his thinking.

Arete has also been used by Plato when talking about athletic training and also the education of young boys. Stephen G. Miller delves into this usage in his book “Ancient Greek Athletics”. Aristotle is quoted as deliberating between education towards arete “…or those that are theoretical”.[6] Educating towards arete in this sense means that the boy would be educated towards things that are useful in life. However, even Plato himself says that arete is not something that can be agreed upon. He says, “Nor is there even an agreement about what constitutes arete, something that leads logically to a disagreement about the appropriate training for arete.”[7] To say that arete has a common definition of excellence or fulfillment may be an overstatement simply because it was very difficult to pinpoint arete, much less the proper ways to go about obtaining it. […]

Homer

In Homer’s Iliad and Odyssey, “arete” is used mainly to describe heroes and nobles and their mobile dexterity, with special reference to strength and courage, but it is not limited to this. Penelope’s arete, for example, relates to co-operation, for which she is praised by Agamemnon. The excellence of the gods generally included their power, but, in the Odyssey (13.42), the gods can grant excellence to a life, which is contextually understood to mean prosperity. Arete was also the name of King Alcinous’s wife.

According to Bernard Knox’s notes found in the Robert Fagles translation of The Odyssey, “arete” is also associated with the Greek word for “pray”, araomai.[8]

All Things Shining
by Hubert Dreyfus
pp. 61-63

Homer’s epic poems brought into focus a notion of arete, or excellence in life, that was at the center of the Greek understanding of human being.6 Many admirers of Greek culture have attempted to define this notion, but success here requires avoiding two prominent temptations. There is the temptation to patronize that we have already mentioned. But there is also a temptation to read a modern sensibility into Homer’s time. One standard translation of the Greek word arete as “virtue” runs the risk of this kind of retroactive reading: for any attempt to interpret the Homeric Greek notion of human excellence in terms of “virtue”—especially if one hears in this word its typical Christian or even Roman overtones—is bound to go astray. Excellence in the Greek sense involves neither the Christian notion of humility and love nor the Roman ideal of stoic adherence to one’s duty.7 Instead, excellence in the Homeric world depends crucially on one’s sense of gratitude and wonder.

Nietzsche was one of the first to understand that Homeric excellence bears little resemblance to modern moral agency. His view was that the Homeric world understood nobility in terms of the overpowering strength of noble warriors. The effect of the ensuing Judeo-Christian tradition, on this Nietzschean reading, was to enfeeble the Homeric understanding of excellence by substituting the meekness of the lamb for the strength and power of the noble warrior.8

Nietzsche was certainly right that the Homeric tradition valorizes the strong, noble hero; and he was right, too, that in some important sense the Homeric account of excellence is foreign to our basic moralizing assumptions. But there is something that the Nietzschean account leaves out. As Bernard Knox emphasizes, the Greek word arete is etymologically related to the Greek verb “to pray” (araomai).9 It follows that Homer’s basic account of human excellence involves the necessity of being in an appropriate relationship to whatever is understood to be sacred in the culture. Helen’s greatness, on this interpretation, is not properly measured in terms of the degree to which she is morally responsible for her actions.

What makes Helen great in Homer’s world is her ability to live a life that is constantly responsive to golden Aphrodite, the shining example of the sacred erotic dimension of existence. Likewise, Achilles had a special kind of receptivity to Ares and his warlike way of life; Odysseus had Athena, with her wisdom and cultural adaptability, to look out for him. Presumably, the master craftsmen of Homer’s world worked in the light of Hephaestus’s shining. In order to engage with this understanding of human excellence, we will have to think clearly about how the Homeric Greeks understood themselves. Why would it make sense to describe their lives in relation to the presence and absence of the gods?

Several questions focus this kind of approach. What is the phenomenon that Homer is responding to when he says that a god intervened or in some way took part in an action or event? Is this phenomenon recognizable to us, even if only marginally? And if Homer’s reference to the gods is something other than an attempt to pass off moral responsibility for one’s actions, then what exactly is it? Only by facing these questions head on can we understand whether it is possible—or desirable—to lure back Homer’s polytheistic gods.

The gods are essential to the Homeric Greek understanding of what it is to be a human being at all. As Peisistratus—the son of wise old Nestor—says toward the beginning of the Odyssey, “All men need the gods.”10 The Greeks were deeply aware of the ways in which our successes and our failures—indeed, our very actions themselves—are never completely under our control. They were constantly sensitive to, amazed by, and grateful for those actions that one cannot perform on one’s own simply by trying harder: going to sleep, waking up, fitting in, standing out, gathering crowds together, holding their attention with a speech, changing their mood, or indeed being filled with longing, desire, courage, wisdom, and so on. Homer sees each of these achievements as a particular god’s gift. To say that all men need the gods therefore is to say, in part at least, that we are the kinds of beings who are at our best when we find ourselves acting in ways that we cannot—and ought not—entirely take credit for.

The Discovery of the Mind
by Bruno Snell
pp. 158-160

The words for virtue and good, arete and agathos, are at first by no means clearly distinguished from the area of profit. In the early period they are not as palpably moral in content as might be supposed; we may compare the German terms Tugend and gut, which originally stood for the ‘suitable’ (taugende) and the ‘fitting’ (cf. Gatte). When Homer says that a man is good, agathos, he does not mean thereby that he is morally unobjectionable, much less good-hearted, but rather that he is useful, proficient, and capable of vigorous action. We also speak of a good warrior or a good instrument. Similarly arete, virtue, does not denote a moral property but nobility, achievement, success and reputation. And yet these words have an unmistakable tendency toward the moral because, unlike ‘happiness’ or ‘profit’, they designate qualities for which a man may win the respect of his whole community. Arete is ‘ability’ and ‘achievement’, characteristics which are expected of a ‘good’, an ‘able’ man, an aner agathos. From Homer to Plato and beyond these words spell out the worth of a man and his work. Any change in their meaning, therefore, would indicate a reassessment of values. It is possible to show how at various times the formation and consolidation of social groups and even of states was connected with people’s ideas about the ‘good’. But that would be tantamount to writing a history of Greek culture. In Homer, to possess ‘virtue’ or to be ‘good’ means to realize one’s nature, and one’s wishes, to perfection. Frequently happiness and profit form the reward, but it is no such extrinsic prospect which leads men to virtue and goodness. The expressions contain a germ of the notion of entelechy. A Homeric hero, for instance, is capable of ‘reminding himself’, or of ‘experiencing’, that he is noble. ‘Use your experience to become what you are’ advises Pindar, who adheres to this image of arete.
The ‘good’ man fulfils his proper function, prattei ta heautou, as Plato demands it; he achieves his own perfection. And in the early period this also entails that he is good in the eyes of others, for the notions and definitions of goodness are plain and uniform: a man appears to others as he is.

In the Iliad (11.404—410) Odysseus reminds himself that he is an aristocrat, and thereby resolves his doubts how he should conduct himself in a critical situation. He does it by concentrating on the thought that he belongs to a certain social order, and that it is his duty to fulfill the ‘virtue’ of that order. The universal which underlies the predication ‘I am a noble’ is the group; he does not reflect on an abstract ‘good ’but upon the circle of which he claims membership. It is the same as if an officer were to say: ‘As an officer I must do this or that,’ thus gauging his action by the rigid conception of honour peculiar to his caste.

Aretan is ‘to thrive’; arete is the objective which the early nobles attach to achievement and success. By means of arete the aristocrat implements the ideal of his order—and at the same time distinguishes himself above his fellow nobles. With his arete the individual subjects himself to the judgment of his community, but he also surpasses it as an individual. Since the days of Jacob Burckhardt the competitive character of the great Greek achievements has rightly been stressed. Well into the classical period, those who compete for arete are remunerated with glory and honour. The community puts its stamp of approval on the value which the individual sets on himself. Thus honour, time, is even more significant than arete for the growth of the moral consciousness, because it is more evident, more palpable to all. From his earliest boyhood the young nobleman is urged to think of his glory and his honour; he must look out for his good name, and he must see to it that he commands the necessary respect. For honour is a very sensitive plant; wherever it is destroyed the moral existence of the loser collapses. Its importance is greater even than that of life itself; for the sake of glory and honour the knight is prepared to sacrifice his life.

pp. 169-172

The truth of the matter is that it was not the concept of justice but that of arete which gave rise to the call for positive individual achievement, the moral imperative which the early Greek community enjoins upon its members who in turn acknowledge it for themselves. A man may have purely egotistical motives for desiring virtue and achievement, but his group gives him considerably more credit for these ideals than if he were to desire profit or happiness. The community expects, and even demands, arete. Conversely a man who accomplishes a high purpose may convince himself so thoroughly that his deed serves the interests of a supra-personal, a universal cause that the alternative of egotism or altruism becomes irrelevant. What does the community require of the individual? What does the individual regard as universal, as eternal? These, in the archaic age, are the questions about which the speculations on arete revolve.

The problem remains simple as long as the individual cherishes the same values as the rest of his group. Given this condition, even the ordinary things in life are suffused with an air of dignity, because they are part of custom and tradition. The various daily functions, such as rising in the morning and the eating of meals, are sanctified by prayer and sacrifice, and the crucial events in the life of man—birth, marriage, burial—are for ever fixed and rooted in the rigid forms of cult. Life bears the imprint of a permanent authority which is divine, and all activity is, therefore, more than just personal striving. No one doubts the meaning of life; the hallowed tradition is carried on with implicit trust in the holy wisdom of its rules. In such a society, if a man shows unusual capacity he is rewarded as a matter of course. In Homer a signal achievement is, as one would expect, also honoured with a special permanence, through the song of the bard which outlasts the deed celebrated and preserves it for posterity. This simple concept is still to be found in Pindar’s Epinicians. The problem of virtue becomes more complex when the ancient and universally recognized ideal of chivalry breaks down. Already in Homeric times a differentiation sets in. As we have seen in the story of the quarrel over the arms of Achilles, the aretai become a subject for controversy. The word arete itself contains a tendency toward the differentiation of values, since it is possible to speak of the virtues of various men and various things. As more sections of society become aware of their own merit, they are less willing to conform to the ideal of the once-dominant class. It is discovered that the ways of men are diverse, and that arete may be attained in all sorts of professions. Whereas aristocratic society had been held together, not to say made possible by a uniform notion of arete, people now begin to ask what true virtue is. 
The crisis of the social system is at the same time the crisis of an ideal, and thus of morality. Archilochus says (fr. 41) that different men have their hearts quickened in various ways. But he also states, elaborating a thought which first crops up in the Odyssey: the mind of men is as Zeus ushers in each day, and they think whatever they happen to hit upon (fr. 68). One result of this splitting up of the various forms of life is a certain failure of nerve. Man begins to feel that he is changeable and exposed to many variable forces. This insight deepens the moral reflexions of the archaic period; the search for the good becomes a search for the permanent.

The topic of the virtues is especially prominent in the elegy. Several elegiac poets furnish lists of the various aretai which they exemplify by means of well-known myths. Their purpose is to clarify for themselves their own attitudes toward the conflicting standards of life. Theognis (699 ff.) stands at the end of this development; with righteous indignation he complains that the masses no longer have eyes for anything except wealth. For him material gain has, in contrast with earlier views, become an enemy of virtue.

The first to deal with this general issue is Tyrtaeus. His call to arms pronounces the Spartan ideal; perhaps he was the one to formulate that ideal for the first time. Nothing matters but the bravery of the soldier fighting for his country. Emphatically he rejects all other accomplishments and virtues as secondary: the swiftness of the runner in the arena, or the strength of the wrestler, or again physical beauty, wealth, royal power, and eloquence, are as nothing before bravery. In the Iliad also a hero best proves his virtue by standing firm against the enemy, but that is not his only proof; the heroic figures of Homer dazzle us precisely because of their richness in human qualities. Achilles is not only brave but also beautiful, ‘swift of foot’, he knows how to sing, and so forth. Tyrtaeus sharply reduces the scope of the older arete; what is more, he goes far beyond Homer in magnifying the fame of fortitude and the ignominy which awaits the coward. Of the fallen he actually says that they acquire immortality (9.32). This one-sidedness is due to the fact that the community has redoubled its claim on the individual; Sparta in particular taxed the energies of its citizenry to the utmost during the calamitous period of the Messenian wars. The community is a thing of permanence for whose sake the individual mortal has to lay down his life, and in whose memory lies his only chance for any kind of survival. Even in Tyrtaeus, however, these claims of the group do not lead to a termite morality. Far from prescribing a blind and unthinking service to the whole, or a spirit of slavish self-sacrifice, Tyrtaeus esteems the performance of the individual as a deed worthy of fame. This is a basic ingredient of arete which, in spite of countless shifts and variations, is never wholly lost.

Philosophy Before Socrates
by Richard D. McKirahan
pp. 366-369

Aretē and Agathos These two basic concepts of Greek morality are closely related and not straightforwardly translatable into English. As an approximation, aretē can be rendered “excellence” or “goodness” (sometimes “virtue”), and agathos as “excellent” or “good.” The terms are related in that a thing or person is agathos if and only if it has aretē and just because it has aretē. The concepts apply to objects, conditions, and actions as well as to humans. They are connected with the concept of ergon (plural, erga), which may be rendered as “function” or “characteristic activity.” A good (agathos) person is one who performs human erga well, and similarly a good knife is a knife that performs the ergon of a knife well. The ergon of a knife is cutting, and an agathos knife is one that cuts well. Thus, the aretē of a knife is the qualities or characteristics a knife must have in order to cut well. Likewise, if a human ergon can be identified, an agathos human is one who can and on appropriate occasions does perform that ergon well, and human aretē is the qualities or characteristics that enable him or her to do so. The classical discussion of these concepts occurs after our period, in Aristotle,6 but he is only making explicit ideas that go back to Homer and which throw light on much of the pre-philosophical ethical thought of the Greeks.

This connection of concepts makes it automatic, virtually an analytic truth, that the right goal for a person—any person—is to be or become agathos. Even if that goal is unreachable for someone, the aretē–agathos standard still stands as an ideal against which to measure one’s successes and failures. However, there is room for debate over the nature of human erga, both whether there is a set of erga applicable to all humans and relevant to aretē and, supposing that there is such a set of erga, what those erga are. The existence of the aretē–agathos standard makes it vitally important to settle these issues, for otherwise human life is left adrift with no standards of conduct. […]

The moral scene Homer presents is appropriate to the society it represents and quite alien to our own. It is the starting point for subsequent moral speculation which no one in the later Greek tradition could quite forget. The development of Greek moral thought through the Archaic and Classical periods can be seen as the gradual replacement of the competitive by the cooperative virtues as the primary virtues of conduct and as the increasing recognition of the significance of people’s intentions as well as their actions.7

Rapid change in Greek society in the Archaic and Classical periods called for new conceptions of the ideal human and the ideal human life and activities. The Archaic period saw different kinds of rulers from the Homeric kings, and individual combat gave way to the united front of a phalanx of hoplites (heavily armed warriors). Even though the Homeric warrior-king was no longer a possible role in society, the qualities of good birth, beauty, courage, honor, and the abilities to give good counsel and rule well remained. Nevertheless, the various strands of the Homeric heroic ideal began to unravel. In particular, good birth, wealth, and fighting ability no longer automatically went together. This situation forced the issue: what are the best qualities we can possess? What constitutes human aretē? The literary sources contain conflicting claims about the best life for a person, the best kind of person to be, and the relative merits of qualities thought to be ingredients of human happiness. In one way or another these different conceptions of human excellence have Homeric origins, though they diverge from Homer’s conception and from one another.

Lack of space makes it impossible to present the wealth of materials that bear on this subject.8 I will confine discussion to two representatives of the aristocratic tradition who wrote at the end of the Archaic period. Pindar shows how the aristocratic ideal had survived and been transformed from the Homeric conception and how vital it remained as late as the early fifth century, and Theognis reveals how social, political, and economic reality was undermining that ideal.

p. 374

The increase in wealth and the shift in its distribution which had begun by the seventh century led to profound changes in the social and political scenes in the sixth and forced a wedge in among the complex of qualities which traditionally constituted aristocratic aretē. Pindar’s unified picture in which wealth, power, and noble birth tend to go together became ever less true to contemporary reality.

The aristocratic response to this changed situation receives its clearest expression in the poems attributed to Theognis and composed in the sixth and early fifth centuries. Even less than with Pindar can we find a consistent set of views advocated in these poems, but among the most frequently recurring themes are the view that money does not make the man, that many undeserving people are now rich and many deserving people (deserving because of their birth and social background) are now poor. It is noteworthy how Theognis plays on the different connotations of uses of the primary terms of value, agathos and aretē, and their opposites kakos and kakia: morally good vs. evil; well-born, noble vs. low-born; and politically and socially powerful vs. powerless. Since the traditional positive attributes no longer regularly all went together, it was important to decide which are most important, indeed which are the essential ingredients of human aretē.

pp. 379-382

In short, Protagoras taught his students how to succeed in public and private life. What he claimed to teach is, in a word, aretē. That this was his boast follows from the intimate connection between agathos and aretē as well as from the fact that a person with aretē is one who enjoys success, as measured by current standards. Anyone with the abilities Protagoras claimed to teach had the keys to a successful life in fifth-century Athens.

In fact, the key to success was rhetoric, the art of public speaking, which has a precedent in the heroic conception of aretē, which included excellence in counsel. But the Sophists’ emphasis on rhetoric must not be understood as hearkening back to Homeric values. Clear reasons why success in life depended on the ability to speak well in public can be found in fifth-century politics and society. […]

That is not to say that every kind of success depended on rhetoric. It could not make you successful in a craft like carpentry and would not on its own make you a successful military commander. Nor is it plausible that every student of Protagoras could have become another Pericles. Protagoras acknowledged that natural aptitude was required over and above diligence. […] Protagoras recognized that he could not make a silk purse out of a sow’s ear, but he claimed to be able to develop a (sufficiently young) person’s abilities to the greatest extent possible.28

Pericles was an effective counselor in part because he could speak well but also by dint of his personality, experience, and intelligence. To a large extent these last three factors cannot be taught, but rhetoric can be offered as a tekhnē, a technical art or skill which has rules of its own and which can be instilled through training and practice. In these ways rhetoric is like medicine, carpentry, and other technical arts, but it is different in its seemingly universal applicability. Debates can arise on any conceivable subject, including technical ones, and rhetorical skill can be turned to the topic at hand whatever it may be. The story goes that Gorgias used his rhetorical skill to convince medical patients to undergo surgery when physicians failed to persuade them.29 Socrates turned the tables on the Sophists, arguing that if rhetoric has no specific subject matter, then so far from being a universal art, it should not be considered an art at all.30 And even if we grant that rhetoric is an art that can be taught, it remains controversial whether aretē can be taught and in what aretē consists. […]

The main charges against the Sophists are of two different sorts. First the charge of prostituting themselves. Plato emphasizes the money-making aspect of the Sophist’s work, which he uses as one of his chief criteria for determining that Socrates was not a Sophist. This charge contains two elements: the Sophists teach aretē for money, and they teach it to anyone who pays. Both elements have aristocratic origins. Traditionally aretē was learned from one’s family and friends and came as the result of a long process of socialization beginning in infancy. Such training and background can hardly be bought. Further, according to the aristocratic mentality most people are not of the right type, the appropriate social background, to aspire to aretē.

Lila
by Robert Pirsig
pp. 436-442

Digging back into ancient Greek history, to the time when this mythos-to-logos transition was taking place, Phædrus noted that the ancient rhetoricians of Greece, the Sophists, had taught what they called aretê, which was a synonym for Quality. Victorians had translated aretê as “virtue” but Victorian “virtue” connoted sexual abstinence, prissiness and a holier-than-thou snobbery. This was a long way from what the ancient Greeks meant. The early Greek literature, particularly the poetry of Homer, showed that aretê had been a central and vital term.

With Homer Phædrus was certain he’d gone back as far as anyone could go, but one day he came across some information that startled him. It said that by following linguistic analysis you could go even further back into the mythos than Homer. Ancient Greek was not an original language. It was descended from a much earlier one, now called the Proto-Indo-European language. This language has left no fragments but has been derived by scholars from similarities between such languages as Sanskrit, Greek and English which have indicated that these languages were fallouts from a common prehistoric tongue. After thousands of years of separation from Greek and English the Hindi word for “mother” is still “Ma.” Yoga both looks like and is translated as “yoke.” The reason an Indian rajah’s title sounds like “regent” is because both terms are fallouts from Proto-Indo-European. Today a Proto-Indo-European dictionary contains more than a thousand entries with derivations extending into more than one hundred languages.

Just for curiosity’s sake Phædrus decided to see if aretê was in it. He looked under the “a” words and was disappointed to find it was not. Then he noted a statement that said that the Greeks were not the most faithful to the Proto-Indo-European spelling. Among other sins, the Greeks added the prefix “a” to many of the Proto-Indo-European roots. He checked this out by looking for aretê under “r.” This time a door opened.

The Proto-Indo-European root of aretê was the morpheme rt. There, beside aretê, was a treasure room of other derived “rt” words: “arithmetic,” “aristocrat,” “art,” “rhetoric,” “worth,” “rite,” “ritual,” “wright,” “right (handed)” and “right (correct).” All of these words except arithmetic seemed to have a vague thesaurus-like similarity to Quality. Phædrus studied them carefully, letting them soak in, trying to guess what sort of concept, what sort of way of seeing the world, could give rise to such a collection.

When the morpheme appeared in aristocrat and arithmetic the reference was to “firstness.” Rt meant first. When it appeared in art and wright it seemed to mean “created” and “of beauty.” “Ritual” suggested repetitive order. And the word right has two meanings: “right-handed” and “moral and esthetic correctness.” When all these meanings were strung together a fuller picture of the rt morpheme emerged. Rt referred to the “first, created, beautiful repetitive order of moral and esthetic correctness.” […]

There was just one thing wrong with this Proto-Indo-European discovery, something Phædrus had tried to sweep under the carpet at first, but which kept creeping out again. The meanings, grouped together, suggested something different from his interpretation of aretê. They suggested “importance” but it was an importance that was formal and social and procedural and manufactured, almost an antonym to the Quality he was talking about. Rt meant “quality” all right but the quality it meant was static, not Dynamic. He had wanted it to come out the other way, but it looked as though it wasn’t going to do it. Ritual. That was the last thing he wanted aretê to turn out to be. Bad news. It looked as though the Victorian translation of aretê as “virtue” might be better after all since “virtue” implies ritualistic conformity to social protocol. […]

Rta. It was a Sanskrit word, and Phædrus remembered what it meant: Rta was the “cosmic order of things.” Then he remembered he had read that the Sanskrit language was considered the most faithful to the Proto-Indo-European root, probably because the linguistic patterns had been so carefully preserved by the Hindu priests. […]

Rta, from the oldest portion of the Rg Veda, which was the oldest known writing of the Indo-Aryan language. The sun god, Sūrya, began his chariot ride across the heavens from the abode of rta. Varuna, the god for whom the city in which Phædrus was studying was named, was the chief support of rta.

Varuna was omniscient and was described as ever witnessing the truth and falsehood of men—as being “the third whenever two plot in secret.” He was essentially a god of righteousness and a guardian of all that is worthy and good. The texts had said that the distinctive feature of Varuna was his unswerving adherence to high principles. Later he was overshadowed by Indra who was a thunder god and destroyer of the enemies of the Indo-Aryans. But all the gods were conceived as “guardians of rta,” willing the right and making sure it was carried out.

One of Phædrus’s old school texts, written by M. Hiriyanna, contained a good summary: “Rta, which etymologically stands for ‘course’ originally meant ‘cosmic order,’ the maintenance of which was the purpose of all the gods; and later it also came to mean ‘right,’ so that the gods were conceived as preserving the world not merely from physical disorder but also from moral chaos. The one idea is implicit in the other: and there is order in the universe because its control is in righteous hands.…”

The physical order of the universe is also the moral order of the universe. Rta is both. This was exactly what the Metaphysics of Quality was claiming. It was not a new idea. It was the oldest idea known to man.

This identification of rta and aretê was enormously valuable, Phædrus thought, because it provided a huge historical panorama in which the fundamental conflict between static and Dynamic Quality had been worked out. It answered the question of why aretê meant ritual. Rta also meant ritual. But unlike the Greeks, the Hindus in their many thousands of years of cultural evolution had paid enormous attention to the conflict between ritual and freedom. Their resolution of this conflict in the Buddhist and Vedantist philosophies is one of the profound achievements of the human mind.

Pagan Ethics: Paganism as a World Religion
by Michael York
pp. 59-60

Pirsig contends that Plato incorporated the arete of the Sophists into his dichotomy between ideas and appearances — where it was subordinated to Truth. Once Plato identifies the True with the Good, arete’s position is usurped by “dialectically determined truth.” This, in turn, allows Plato to demote the Good to a lower order and minor branch of knowledge. For Pirsig, the Sophists were those Greek philosophers who exalted quality over truth; they were the true champions of arete or excellence. With a pagan quest for the ethical that develops from an idolatrous understanding of the physical, while Aristotle remains an important consideration, it is to the Sophists (particularly Protagoras, Prodicus and Pirsig’s understanding of them) and a reconstruction of their underlying humanist position that perhaps the most important answers are to be framed if not found as well.

A basic pagan position is an acceptance of the appetites — in fact, their celebration rather than their condemnation. We find the most unbridled expression of the appetites in the actions of the young. Youth may engage in binge-drinking, vandalism, theft, promiscuity and profligate experimentation. Pagan perspectives may recognize the inherent dangers in these as there are in life itself. But they also trust the overall process of learning. In paganism, morality has a much greater latitude than it does in the transcendental philosophy of a Pythagoras, Plato, or Plotinus: it may veer toward a form of relativism, but its ultimate check is always the sanctity of the other animate individuals. An it harm none, do what ye will. The pagan ethic must be found within the appetites and not in their denial.

In fact, paganism is part of a protest against Platonic assertion. The wider denial is that of nature herself. Nature denies the Platonic by refusing to conform to the Platonic ideal. It insists on moments of chaos, the epagomenae, the carnival, that overlap between the real and the ideal that is itself a metaphor for reality. The actual year is a refusal to cooperate with the mathematically ideal year of 360 days — close but only tantalizingly.

In addition, pagans have always loved asking what is arete? This is the fundamental question we encounter with the Sophists, Plato and Aristotle. It is the question that is before us still. The classics considered variously both happiness and the good as alternative answers. The Hedonists pick happiness — but a particular kind of happiness. The underlying principle recognized behind all these possibilities is arete ‘excellence, the best’ however it is embodied — whether god, goddess, goods, the good, gods, virtue, happiness, pleasure or all of these together. Arete is that to which both individual and community aspire. Each wants one’s own individual way of putting it together in excellent fashion — but at the same time wanting some commensurable overlap of the individual way with the community way.

What is the truth of the historical claims about Greek philosophy in Zen and the Art of Motorcycle Maintenance?
answer by Ammon Allred

Arete is usually translated as “virtue,” which is certainly connected up with the good “agathon” — but in Plato an impersonal Good is probably more important than aletheia or truth. See, for instance, the central images at the end of Book VI, where the Good is called the “Father of the Sun.” The same holds in the Philebus. And it wouldn’t be right to say that Plato (or Aristotle) thought virtue was part of some small branch called “ethics” (Plato doesn’t divide his philosophy up this way; Aristotle does — although then we get into the fact that we don’t have the dialogues he wrote — but still what he means by ethics is far broader than what we mean).

Certainly the Sophists pushed for a humanistic account of the Good, whereas Plato’s was far more impersonal. And Plato himself had a complex relationship to the Sophists (consider the dialogue of Protagoras, where Socrates and Protagoras both end up about equally triumphant).

That said, Pirsig is almost certainly right about Platonism — that is to say, the approach to philosophy that has been taught as though it were Plato’s philosophy. Certainly, the sophists have gotten a bad rap because of the view that Socrates and Plato were taken to have about the sophists; but even there, many philosophers have tried to rehabilitate them: most famously, Nietzsche.

The Art of the Lost Cause

Many people are understandably disappointed, frustrated, or angry when they lose. It’s just not fun to lose, especially in a competitive society. But there are advantages to losing. And whether something counts as a loss is partly a matter of perspective. Certainly, in more cooperative societies, what may be seen as a loss by outsiders could be taken quite differently by an insider. Western researchers discovered that difference when using games as part of social science studies. Some non-Western people refused win-lose scenarios, at least among members of their own community. The individual didn’t lose, for everyone gained. I point this out to help shift our thinking.

Recently, the political left in the United States has experienced losses. Bernie Sanders lost the nomination to Hillary Clinton, who in turn lost the presidency to Donald Trump. But is this an entirely surprising result and a bad outcome? Losses can lead to soul-searching and motivation for change. The Republicans as we now know them have dominated the political narrative in recent decades, which forced the Democrats to shift far to the right with third way ‘triangulation’. That wasn’t always the case. Republicans went through a period of major losses before being able to reinvent themselves with the southern strategy, the Reagan revolution, trickle-down voodoo economics, the two Santa Claus theory, the culture wars, etc.

The Clinton New Democrats were only able to win at all in recent history by sacrificing the political left and, in the process, becoming the new conservative party. So, even when Democrats have been able to win, it has been a loss. Consider Obama, who turned out to be one of the most neoliberal and neocon presidents in modern history, betraying his every promise: maintaining militarism, refusing to shut down GITMO, passing pro-business insurance reform, etc. Liberals and leftists would have been better off entirely out of power these past decades, allowing a genuine political left movement to form and so allowing democracy to begin to reassert itself from below. Instead, Democrats have managed to win just enough elections to keep the political left suppressed by co-opting its rhetoric. Democrats have won by forcing the American public to lose.

In failing so gloriously, the Democratic leadership has been publicly shamed to the point of no redemption. The party is now less popular than the opposition, an amazing feat considering how unpopular Trump and the GOP are at the moment. Yet amidst all of this, Bernie Sanders is more popular than ever, more popular among women than men and more popular among minorities than whites. I never thought Sanders was likely to win, and so I wasn’t disappointed. What his campaign did accomplish, as I expected, was to reshape the political narrative and shift the Overton window back toward the political left. This period of loss will be remembered as a turning point. It was a necessary loss, a reckoning and re-envisioning.

Think about famous lost causes. One that came to mind is that of Jesus and the early Christians. They were a tiny unknown cult in a vast empire filled with hundreds of thousands of similar cults. They were nothing special, of no significance or consequence, such that no one bothered to even take note of them, not even Jewish writers at the time. Then Jesus was killed as a common criminal among other criminals and even that didn’t draw any attention. There is no evidence that the Romans considered Jesus even mildly interesting. After his death, Christianity remained small and splintered into a few communities. It took generations for this cult to grow much at all and finally attract much outside attention.

Early Christians weren’t even important enough to be feared. The persecution stories seem to have been mostly invented by later Christians to make themselves feel more important, as there are no records of any systematic and pervasive persecution. Romans killing a few cultists here and there happened all the time, and Christians didn’t stand out as being targeted more than any others. In fact, early Christians were so lacking in uniqueness that they were often confused with other groups such as the Stoics. By the way, it was the Stoics who were famous at the time for seeking out persecution and so gaining street-cred respectability, maybe causing envy among Christians. Even Christian theology was largely borrowed from others; natural law, for example, was taken from the Stoics, related to the idea that a slave can be free in their mind and being, their heart and soul, because natural law transcends human law.

Still, this early status of Christians as losers created a powerful narrative that has not only survived but proliferated. Some of that narrative, such as their persecution, was invented. But that is far from unusual, as the mythos that develops around lost causes tends to be more invented than not. Still, at the core, the Christians were genuinely pathetic for a couple of centuries. They weren’t a respectable religion in the Roman Empire until long after Jesus’ death, when an emperor decided to use them to shore up his own power. In the waning era of Roman imperialism, I suppose a lost cause theology felt compelling and comforting. It was also a good way to convert other defeated peoples, as they could be promised victory in heaven. Lost causes tend to lead to the romanticizing of a distant redemption that would one day come. And in the case of Christianity, this would mean that the ultimate sacrificial loser, Jesus himself, would return victorious! Amen! Praise the Lord! Like a Taoist philosopher, Jesus taught that to find oneself was to lose oneself but to lose oneself was to find oneself. This is a loser’s mentality and relates to why some have considered Christianity to be a slave religion. The lowly are uplifted, at least in words and ideals. But I’d argue there is more to it than seeking comfort by rationalizing suffering, oppression, and defeat.

Winning isn’t always a good thing, at least in the short term. I sometimes wonder if America would be a better place if the American Revolution had been lost. When I compare the United States to Canada, I don’t see any great advantage to the American colonists having won. Canada is a much more stable and well-functioning social democracy. And the British Empire ended up enacting sweeping reforms, including abolishing slavery through law long before the US managed to end slavery through bloody conflict. In many ways, Americans were worse off after the revolution than before it. A reactionary backlash took hold as oligarchs co-opted the revolution and turned it into counter-revolution. Through the coup of a Constitutional Convention, the ruling elite seized power over the new government. It was in seeming to win that the average American ended up losing. An overt loss potentially could have been a greater long-term victory. For women and blacks in particular, being on the side of the revolutionaries didn’t turn out to be such a great deal. Women who had gained the vote had it taken away from them again, and blacks hoping for freedom were returned to slavery. The emerging radical movement of democratic reform was strangled in the crib.

Later on, the Confederates learned of the power of a lost cause, to such an extent that they have become the poster boys of The Lost Cause, all of American society having been transformed by it. Victory of the United States government, once again, turned out to be far from a clear victory for the oppressed. If the Confederates had won or otherwise been allowed to secede, the Confederate government would have been forced to come to terms with the majority black population that existed in the South, and it wouldn’t have had the large Northern population to help keep blacks down. It’s possible that some of the worst results could have been avoided: re-enslavement through chain gangs and mass incarceration, Jim Crow laws and Klan terrorism, sundown towns and redlining, etc., all the ways that racism became further entrenched. After the Civil War, blacks became scattered and would in time become a minority. Having lost their position as the Southern majority, they lost most of the leverage they might have had. Instead of weak reforms leading to new forms of oppression, blacks might have been able to force a societal transformation within a Confederate government or else to have had a mass exodus in order to secede and create their own separate nation-state. There were many possibilities that became impossible because of Union victory.

Now consider the civil rights movement. The leaders, Martin Luther King in particular, understood the power of a lost cause. They intentionally staged events of getting attacked by police and white mobs, always making sure there were cameras nearby to make it into a national event. It was in losing these confrontations to the greater power of white oppression that they managed to win public support. As a largely Christian movement, the civil rights activists surely had learned from the story of Jesus as a sacrificial loser and his followers as persecuted losers. The real failure of civil rights only came later, when it gained mainstream victories and a corrupt black leadership aligned with white power, such as in pushing the racist 1994 Crime Bill, which was part of the Democrats becoming the new conservative party. The civil rights movement might have been better able to transform society and change public opinion by remaining a lost cause for a few more generations.

A victory forced can be a victory lost. Gain requires sacrifice, not to be bought cheaply. Success requires risk of failure, putting everything on the line. The greatest losses can come from seeking victory too soon and too easily. Transformative change can only be won by losing what came before. Winning delayed sometimes is progress ensured, slow but steady change. The foundation has to be laid before something can emerge from the ground up. Being brought low is the beginning point, like planting a seed in the soil.

It reminds me of my habit of always looking down as I walk. My father, on the other hand, never looks down and has a habit of stepping on things. It is only by looking down that we can see what is underneath our feet, what we stand on or are stepping toward. Foundation and fundament are always below eye level. Even in my thinking, I’m forever looking down, to what is beneath everyday awareness and oft-repeated words. Just to look down, such a simple and yet radical act.

“Looking down is also a sign of shame or else humility, the distinction maybe being less relevant to those who avoid looking down. To humble means to bring low, to the level of the ground, the soil, humus. To be further down the ladder of respectability, to be low caste or low class, is to have a unique vantage point. One can see more clearly and more widely when one has grown accustomed to looking down, for then one can see the origins of things, the roots of the world, where experience meets the ground of being.”

* * *

Living Differently: On How the Feminist Utopia Is Something You Have to Be Doing Now
by Lynne Segal

Another anthropologist, the anarchist David Graeber, having been involved in protest networks for decades, remains even more certain that participation in moments of direct action and horizontal decision-making bring to life a new and enduring conception of politics, while providing shared hope and meaning in life, even if their critics see in the outcomes of these movements only defeat:

What they don’t understand is that once people’s political horizons have been broadened, the change is permanent. Hundreds of thousands of Americans (and not only Americans, but Greeks, Spaniards and Tunisians) now have direct experience of self-organization, collective action and human solidarity. This makes it almost impossible to go back to one’s previous life and see things the same way. While the world’s financial and political elite skate blindly towards the next 2008-scale crisis, we’re continuing to carry out occupations of buildings, farms, foreclosed homes and workplaces, organizing rent strikes, seminars and debtor’s assemblies, and in doing so laying the groundwork for a genuinely democratic culture … With it has come a revival of the revolutionary imagination that conventional wisdom has long since declared dead.

Discussing what he calls ‘The Democracy Project’, Graeber celebrates forms of political resistance that in his view move well beyond calls for policy reforms, creating instead permanent spaces of opposition to all existing frameworks. For Graeber, one fundamental ground for optimism is that the future is unknowable, and one can live dissident politics in the present, or try to. This is both despite, and also because of, the insistent neo-liberal boast that there can be no alternative to its own historical trajectory: which has become a linear project of endless growth and the amassing of wealth by the few, toil and the struggle for precarious survival for so many.

Furthermore, Graeber points out that historically, although few revolutionaries actually succeeded in taking power themselves, the effects of their actions were often experienced far outside their immediate geographical location. In a similar reflection on unintended consequences, Terry Eagleton suggests that even with the gloomiest of estimates in mind, many aspects of utopic thinking may be not only possible but well-nigh inevitable:

Perhaps it is only when we run out of oil altogether, or when the world system crashes for other reasons, or when ecological catastrophe finally overtakes us, that we will be forced into some kind of co-operative commonwealth of the kind William Morris might have admired.

Even catastrophism, one might say, has its potentials. […]

It should come as no surprise that most of the goals we dream of will usually elude us, at least partially. However, to confront rather than accept the evils of the present, some utopian spirit is always necessary to embrace the complexity of working, against all odds, to create better futures. A wilful optimism is needed, despite and because of our inevitable blind-spots and inadequacies, both personal and collective.

For many of us, it means trying to live differently in the here and now, knowing that the future will never be a complete break with the present or the past, but hopefully something that may develop out of our most supportive engagements with others. To think otherwise inhibits resistance and confirms the dominant conceit that there is no alternative to the present. Thus, I want to close this chapter repeating the words of the late Latin American writer, Eduardo Galeano, which seem to have been translated into almost every language on earth, though I cannot track down their source:

Utopia is on the horizon. I move two steps closer; it moves two steps further away. I walk another ten steps and the horizon runs ten steps further away. As much as I may walk, I’ll never reach it. So what’s the point of utopia? The point is this: to keep moving forward.

Our political dreams can end in disappointment, but are likely, nevertheless, to make us feel more alive, and hence happier, along the way, at least when they help to connect us to and express concern for those around us. Happiness demands nothing less.


Paul Adkin on Decadence & Stagnation

“Decadence: when things are just too good and easy that no one bothers to push forward anymore, bringing about stagnation …

“But there is also another kind of stagnation: one which comes about because there just isn’t enough time to go forward; when all time is taken up with something that is essentially futile when considered from the point of view of the bigger picture. Like making money. Even the seemingly dynamic world of business, if it is dedicated only to business and not to authentically meaningful human progress (things associated with knowledge and discovery), it is essentially stagnating. Any society that is a simulacra society, hell-bent on reproducing copies rather than on developing its creativity, is a decadent, stagnating society. We are stagnant not because of what we are doing, our anthill society is always busy, but because what we are driven by, in all this anthill activity, is not creative. When production is synonymous with reproduction, then we know we have fallen into the stagnant pool of decadence.

“Nietzsche talked about the residual nature of decadence[1]. That decadence is a cumulative thing. Certainly, it is nurtured both by dogma and nihilism. Only a sceptical meaningfulness can push forward in a creative way.

“Sceptical meaningfulness? How can such a thing be? Surely it is a contradiction in terms.

“To understand how this oxymoron combination can work, we need to see meaningfulness as a forward pushing phenomenon. Once it stops pushing forward, meaningfulness slips into dogma. Meaning is fuelled by truth, but it does not swim in truth as if truth were a lake. Truth, in order to be lasting, has to be a river.”

from Decadence & Stagnation by Paul Adkin

Social Construction & Ideological Abstraction

The following passages from two books help to explain what social construction is. As society has headed in a particular direction of development, abstract thought has become increasingly dominant.

But we modern people, who take abstractions for granted, often don’t even recognize abstractions for what they are. Many abstractions simply become reality as we know it. They are ‘looped’ into existence: race realism, capitalist realism, etc.

Ideological abstractions become so pervasive and systemic that we lose the capacity to think outside of them. They form our reality tunnel.

This wasn’t always so. Humans used to conceive of and hence perceive the world far differently. And this shaped their sense of identity, which is hard for us to imagine.

* * *

Dynamics of Human Biocultural Diversity:
A Unified Approach

by Elisa J. Sobo
(Kindle Locations 94-104)

Until now, many biocultural anthropologists have focused mainly on the ‘bio’ half of the equation, using ‘biocultural’ generically, like biology, to refer to genetic, anatomical, physiological, and related features of the human body that vary across cultural groups. The number of scholars with a more sophisticated approach is on the upswing, but they often write only for super-educated expert audiences. Accordingly, although introductory biocultural anthropology texts make some attempt to acknowledge the role of culture, most still treat culture as an external variable— as an add-on to an essentially biological system. Most fail to present a model of biocultural diversity that gives adequate weight to the cultural side of things.

Note that I said most, not all: happily, things are changing. A movement is afoot to take anthropology’s claim of holism more seriously by doing more to connect— or reconnect— perspectives from both sides of the fence. Ironically, prior to the industrial revolution and the rise of the modern university, most thinkers took a very comprehensive view of the human condition. It was only afterward that fragmented, factorial, compartmental thinking began to undermine our ability to understand ourselves and our place in— and connection with— the world. Today, the leading edge of science recognizes the links and interdependencies that such thinking keeps falsely hidden.

Nature, Human Nature, and Human Difference:
Race in Early Modern Philosophy
by Justin E. H. Smith

pp. 9-10

The connection to the problem of race should be obvious: kinds of people are to no small extent administered into being, brought into existence through record keeping, census taking, and, indeed, bills of sale. A census form asks whether a citizen is “white,” and the possibility of answering this question affirmatively helps to bring into being a subkind of the human species that is by no means simply there and given, ready to be picked out, prior to the emergence of social practices such as the census. Censuses, in part, bring white people into existence, but once they are in existence they easily come to appear as if they had been there all along. This is in part what Hacking means by “looping”: human kinds, in contrast with properly natural kinds such as helium or water, come to be what they are in large part as a result of the human act of identifying them as this or that. Two millennia ago no one thought of themselves as neurotic, or straight, or white, and nothing has changed in human biology in the meantime that could explain how these categories came into being on their own. This is not to say that no one is melancholic, neurotic, straight, white, and so on, but only that how that person got to be that way cannot be accounted for in the same way as, say, how birds evolved the ability to fly, or how iron oxidizes.

In some cases, such as the diagnosis of mental illness, kinds of people are looped into existence out of a desire, successful or not, to help them. Racial categories seem to have been looped into existence, by contrast, for the facilitation of the systematic exploitation of certain groups of people by others. Again, the categories facilitate the exploitation in large part because of the way moral status flows from legal status. Why can the one man be enslaved, and the other not? Because the one belongs to the natural-seeming kind of people that is suitable for enslavement. This reasoning is tautological from the outside, yet self-evident from within. Edward Long, as we have seen, provides a vivid illustration of it in his defense of plantation labor in Jamaica. But again, categories cannot be made to stick on the slightest whim of their would-be coiner. They must build upon habits of thinking that are already somewhat in place. And this is where the history of natural science becomes crucial for understanding the history of modern racial thinking, for the latter built directly upon innovations in the former. Modern racial thinking could not have taken the form it did if it had not been able to piggyback, so to speak, on conceptual innovations in the way science was beginning to approach the diversity of the natural world, and in particular of the living world.

This much ought to be obvious: racial thinking could not have been biologized if there were no emerging science of biology. It may be worthwhile to dwell on this obvious point, however, and to see what more unexpected insights might be drawn out of it. What might not be so obvious, or what seems to be ever in need of renewed pointing out, is a point that ought to be of importance for our understanding of the differing, yet ideally parallel, scope and aims of the natural and social sciences: the emergence of racial categories, of categories of kinds of humans, may in large part be understood as an overextension of the project of biological classification that was proving so successful in the same period. We might go further, and suggest that all of the subsequent kinds of people that would emerge over the course of the nineteenth and twentieth centuries, the kinds of central interest to Foucault and Hacking, amount to a further reaching still, an unprecedented, peculiarly modern ambition to make sense of the slightest variations within the human species as if these were themselves species differentia. Thus for example Foucault’s well-known argument that until the nineteenth century there was no such thing as “the homosexual,” but only people whose desires could impel them to do various things at various times. But the last two centuries have witnessed a proliferation of purportedly natural kinds of humans, a typology of “extroverts,” “depressives,” and so on, whose objects are generally spoken of as if on an ontological par with elephants and slime molds. Things were not always this way. In fact, as we will see, they were not yet this way throughout much of the early part of the period we call “modern.”

Time and Trauma

And I think of that “Groundhog Day” movie with Bill Murray in which he repeats the same day, again and again, with only minor changes. If you’ve seen the movie, Murray finally breaks out of what appears to be an infinite loop only when he changes his ways, his approach to life, his mentality. He becomes a better person and even gets the girl.

When is the USA going to break out of its infinite loop of war? Only when we change our culture, our mentality.

A “war on terror” is a forever war, an infinite loop, in which the same place names and similar actions crop up again and again. Names like Mosul and Helmand province. Actions like reprisals and war crimes and the deaths of innocents, because that is the face of war.

~W.J. Astore, Happy 4th of July! And a Global War on Something

* * *

The impression we form is that it is not that linear time perception or experience that has been corrupted by trauma; it is that time “itself” has been traumatized — so that we come to comprehend “history” not as a random sequence of events, but as a series of traumatic clusters. This broken time, this sense of history as a malign repetition, is “experienced” as seizure and breakdown; I have placed “experienced” in inverted commas here because the kind of voiding interruption of subjectivity seems to obliterate the very conditions that allows experience to happen.

It is as if the combination of adolescent erotic energy with an inorganic artefact … produces a trigger for a repeating of the ancient legend. It is not clear that “repeating” is the right word here, though. It might be better to say that the myth has been re-instantiated, with the myth being understood as a kind of structure that can be implemented whenever the conditions are right. But the myth doesn’t repeat so much as it abducts individuals out of linear time and into its “own” time, in which each iteration of the myth is in some sense always the first time.

…the mythic is part of the virtual infrastructure which makes human life as such possible. It is not the case that first of all there are human beings, and the mythic arrives afterwards, as a kind of cultural carapace added to a biological core. Humans are from the start — or from before the start, before the birth of the individual — enmeshed in mythic structures.

~Mark Fisher, "Eerie Thanatos," The Weird and the Eerie (pp. 96-97)

A Neverending Revolution of the Mind

In a recent book, Juliet Barker offers a new perspective on an old event (1381: The Year of the Peasants’ Revolt, Kindle Locations 41-48):

“In the summer of 1381 England erupted in a violent popular uprising that was as unexpected as it was unprecedented. Previous rebellions had always been led by ambitious and discontented noblemen seeking to overthrow the government and seize power for themselves. The so-called ‘Peasants’ Revolt’ was led by commoners— most famously Wat Tyler, Jack Straw and John Balle— whose origins were obscure and whose moment at the forefront of events was brief. Even more unusually, they did not seek personal advancement but a radical political agenda which, if it had been implemented, would fundamentally have transformed English society: the abolition of serfdom and the dues and services owed by tenants to their lord of the manor; freedom from tolls and customs on buying and selling goods throughout the country; the recognition of a man’s right to work for whom he chose at the wages he chose; the state’s seizure of the Church’s wealth and property. Their demands anticipated the French Revolution by four hundred years.”

Our understanding of the origins of modernity keeps being pushed back. It used to be thought that the American Revolution was the first modern revolution. But it was preceded by generations of revolts against the colonial elite. And before that was the English Civil War, which increasingly is seen as the first modern revolution. We might have to push the origins even further back, to the Peasants’ Revolt.

It makes sense when you know some of the historical background. England had become a major center of wool production. This unintentionally undermined the feudal order. The reason is that an entire community of feudal peasants isn’t necessary for herding sheep, in the way it had been for traditional agriculture. So, by the time the Peasants’ Revolt came around, there had already been several centuries of increasing irrelevance for much of the peasant population. This would continue on into the Enlightenment Age, when the enclosure movement took hold and masses of landless peasants flooded into the cities.

It’s interesting that the pressure on the social order was already being felt that far back, almost jumpstarting the modern revolutionary era four centuries earlier. Those commoners were already beginning to think of themselves as more than mere cogs in the machinery of feudalism. They anticipated the possibility of becoming agents of their own fate. It was the origins of modern class identity and class war, at least for Anglo-American society.

There were other changes happening around then. It was the beginning of the Renaissance. This brought ancient Greek philosophy, science, and politics back into Western thought. The new old ideas were quickly spread through the invention of the movable type printing press and increasing use of vernacular language. And that directly made the Enlightenment possible.

The Italian city-states and colonial empires were becoming greater influences, bringing with them new economic systems of capitalism and corporatism. The Italian city-states, in the High Middle Ages, also initiated advocacy of anti-monarchism and liberty-oriented republicanism. Related to this, humanism became a major concern, as taught by the ancient Sophists, with Protagoras famously stating that “Man is the measure of all things.” And with this came early developments in psychological thought, such as the radical notion that everyone had the same basic human nature. Diverse societies had growing contact, and so cultural differences became an issue, provoking difficult questions and adding to a sense of uncertainty and doubt.

Individual identity and social relationships were being transformed, in a way not seen since the Axial Age. Proto-feudalism developed in the Roman empire. Once established, feudalism lasted for more than a millennium. It wasn’t just a social order but an entire worldview, a way of being in and part of a shared world. Every aspect of life was structured by it. The slow unraveling inevitably led to increasing radicalism, as what it meant to be human was redefined and re-envisioned.

My thoughts continuously return to these historical changes. I can’t shake the feeling that we are living through another such period of societal transformation. But as during any major shift in consciousness, the outward results are hard to understand or sometimes hard to even notice, at least in terms of their ultimate consequences. That is, until they result in an uprising of the masses and sometimes a complete overthrow of established power. Considering that ever-present possibility and looming threat, it might be wise to question how stable our present social order is, along with the human identity it is based upon.

These thoughts are inspired by other books I’ve been reading. The ideas I regularly return to are Julian Jaynes’ bicameralism and the related speculations of those who were inspired by him, such as Iain McGilchrist. Most recently, I found useful insight from two books whose authors were new to me: Consciousness by Susan Blackmore and A Skeptic’s Guide to the Mind by Robert Burton.

Those authors offer overviews that question and criticize many common views, specifically the Enlightenment ideal of individuality, in considering issues of embodiment and affect, extended self and bundled self. These aren’t just new theories that academics preoccupy themselves with for reasons of entertainment and job security. They are ideas with much earlier origins; dismissed for so long because they didn’t fit into the prevailing paradigm, they are only now being taken seriously. The past century led to an onslaught of research findings that continuously challenged what we thought we knew.

This shift is in some ways a return to a different tradition of radical thought. John Locke was radical enough for his day, although his radicalism was hidden behind pieties. Even more radical was a possible influence on Locke, with Wim Klever going so far as to see crypto-quotations of Baruch Spinoza in Locke’s writings. Spinoza was an Enlightenment thinker who focused not just on what it meant to be human but to be a human in the world. What kind of world is this? Unlike Locke, his writings weren’t as narrowly focused on politics, governments, constitutions, etc. Even so, Matthew Stewart argues that through Locke’s writings Spinozism was a hidden impulse that fueled the fires of the American Revolution, taking form and force through a working class radicalism, as described in Nature’s God.

Spinozism has been revived in many areas of study, such as the growing body of work about affect. Never fully appreciated in his lifetime, his radicalism continues to inform and inspire innovative thinking. As Renaissance ideas took centuries to finally displace what came before, Spinoza’s ideas are slowly but powerfully helping to remake the modern mind. I’d like to believe that a remaking of the modern world will follow.

I just started an even more interesting book, Immaterial Bodies by Lisa Blackman. She does briefly discuss Spinoza, but her framing concern is the relationship “between the humanities and the sciences (particularly the life, neurological and psychological sciences).” She looks at the more recent developments of thought, including that of Jaynes and McGilchrist. Specifically, she unpacks the ideological self-identity we’ve inherited.

To argue for or to simply assume a particular social construct about our humanity is to defend a particular social order and thus to enforce a particular social control. She makes a compelling case for viewing neoliberalism as more than a mere economic and political system. The greatest form of control isn’t only controlling how people are allowed to act and relate but, first and foremost, how they are able to think about themselves and the world around them. In speaking about neoliberalism, she quotes Fernando Vidal (Kindle Locations 3979-3981):

“The individualism characteristic of western and westernized societies, the supreme value given to the individual as autonomous agent of choice and initiative, and the corresponding emphasis on interiority at the expense of social bonds and contexts, are sustained by the brain-hood ideology and reproduced by neurocultural discourses.”

Along with mentioning Spinoza, Blackman does give some historical background, such as in the following. And as a bonus, it is placed in the even larger context of Jaynes’ thought. She writes (Kindle Locations 3712-3724):

“Dennett, along with other scientists interested in the problem of consciousness (see Kuijsten, 2006), has identified Jaynes’s thesis as providing a bridge between matter and inwardness, or what I would prefer to term the material and immaterial. Dennett equates this to the difference between a brick and a bricklayer, where agency and sentience are only accorded to the bricklayer and never to the brick. For Dennett, under certain conditions we might have some sense of what it means to be a bricklayer, but it is doubtful, within the specificities of consciousness as we currently know and understand it, that we could ever know what it might mean to be a brick. This argument might be more usefully extended within the humanities by considering the difference between understanding the body as an entity and as a process. The concept of the body as having a ‘thing-like’ quality, where the body is reconceived as a form of property, is one that has taken on a truth status since at least its incorporation into the Habeas Corpus Act of 1679 (see Cohen, 2009). As Cohen (2009: 81) suggests, ‘determining the body as the legal location of the person radically reimagines both the ontological and political basis of person-hood’. This act conceives the body as an object possessed or owned by individuals, what Cohen (2009) terms a form of ‘biopolitical individualization’. Within this normative conception of corporeality bodies are primarily material objects that can be studied in terms of their physicochemical processes, and are objects owned by individuals who can maintain and work upon them in order to increase the individual’s physical and cultural capital.”

In her epilogue, she presents a question by Catherine Malabou (Kindle Locations 4014-4015): “What should we do so that consciousness of the brain does not purely and simply coincide with the spirit of capitalism?” The context changes as the social order changes, from feudalism to colonialism and now capitalism. But phrased in various ways, it is the same question that has been asked for centuries.

Another interesting question to ask is, by what right? It is more than a question. It is a demand to prove the authority of an action. And relevant to my thoughts here, it has historical roots in feudalism. It’s like asking someone, who do you think you are to tell me what to do? Inherent in this inquiry is one’s position in the prevailing social order, whether feudal lords challenging the king’s authority or peasants challenging those feudal lords. The issue isn’t only who we are and what we are allowed to do based on that, but who or what gets to define who we are, our human nature and social identity.

Such questions always have a tinge of the revolutionary, even if only in potential. Once people begin questioning, established attitudes and identities have already become unmoored and are drifting. The act of questioning is itself radical, no matter what the eventual answers. The doubting mind is ever poised on a knife edge.

The increasing pressure put on peasants, especially once they became landless, let loose individuals and identities. This incited radical new thought and action. As yet another underclass forms, that of the imprisoned and permanently unemployed that even now makes up a tenth of the population, what will this lead to? Throwing people into desperation with few opportunities and lots of time on their hands tends to lead to disruptive outcomes, sometimes even revolution.

Radicalism means to go to the root, and there is nothing more radical than going to the root of our shared humanity. With such questions being asked, those in power won’t be happy with the answers found. But at this point, it is already too late to stop what will follow. We are on our way.

Imagination: Moral, Dark, and Radical

Absence is presence.
These are the fundamentals of mystery.
~The Young Pope

Below is a gathering of excerpts from writings. The key issue here is imagination, specifically Edmund Burke’s moral imagination with its wardrobe but also the dark imagination and the radical imagination. I bring in some other thinkers for context: Thomas Paine, Corey Robin, Thomas Ligotti, Lewis Hyde, and Julian Jaynes.

Besides imagination, the connecting strands of thought are:

  • Pleasure, beauty, and sublimity; comfort, familiarity, intimacy, the personal, and subjectivity; embodiment, anchoring, shame, and nakedness; pain, violence, suffering, and death;
  • Darkness, awe, fear, terror, horror, and the monstrous; oppression, prejudice, and ignorance; obfuscation, obscurity, disconnection, and dissociation; the hidden, the veiled, the unknown, and the distant; mystery, madness, and deception;
  • Identity, consciousness, and metaphor; creativity, art, story, poetry, and rhetoric; literalism, realism, and dogmatism; reason, knowledge, and science;
  • Enlightenment, abstractions, ideology, revolution, and counter-revolution; nobility, power, chivalry, aristocracy, and monarchy; tradition, nostalgia, and the reactionary mind; liberalism, conservatism, and culture wars;
  • Et cetera.

The touchstone for my own thinking is what I call symbolic conflation, along with the larger context of conceptual slippage, social construction, and reality tunnels. This is closely related to what Lewis Hyde discusses in terms of metonymy, liminality, and the Trickster archetype.

Read the following as a contemplation of ideas and insights. In various ways, they connect, overlap, and resonate. Soften your focus and you might see patterns emerge. If these are all different perspectives of the same thing, what exactly is it that is being perceived? What does each view say about the individual espousing it and if not necessarily about all of humanity at least about our society?

(I must admit that my motivation for this post was mainly personal. I simply wanted to gather these writings together. They include some writings and writers that I have been thinking about for a long time. Quotes and passages from many of them can be found in previous posts on this blog. I brought them together here for the purposes of my own thinking about certain topics. I don’t post stuff like this with much expectation that it will interest anyone else, as I realize my own interests are idiosyncratic. Still, if someone comes along and finds a post like this fascinating, then I’ll know they are my soulmate. This post is only for cool people with curious minds. Ha!)

* * *

On the Sublime and Beautiful
by Edmund Burke

Of the Passion Caused by the Sublime

THE PASSION caused by the great and sublime in nature, when those causes operate most powerfully, is astonishment; and astonishment is that state of the soul, in which all its motions are suspended, with some degree of horror. In this case the mind is so entirely filled with its object, that it cannot entertain any other, nor by consequence reason on that object which employs it. Hence arises the great power of the sublime, that, far from being produced by them, it anticipates our reasonings, and hurries us on by an irresistible force. Astonishment, as I have said, is the effect of the sublime in its highest degree; the inferior effects are admiration, reverence, and respect.

Terror

NO passion so effectually robs the mind of all its powers of acting and reasoning as fear. For fear being an apprehension of pain or death, it operates in a manner that resembles actual pain. Whatever therefore is terrible, with regard to sight, is sublime too, whether this cause of terror be endued with greatness of dimensions or not; for it is impossible to look on anything as trifling, or contemptible, that may be dangerous. There are many animals, who though far from being large, are yet capable of raising ideas of the sublime, because they are considered as objects of terror. As serpents and poisonous animals of almost all kinds. And to things of great dimensions, if we annex an adventitious idea of terror, they become without comparison greater. A level plain of a vast extent on land, is certainly no mean idea; the prospect of such a plain may be as extensive as a prospect of the ocean: but can it ever fill the mind with anything so great as the ocean itself? This is owing to several causes; but it is owing to none more than this, that the ocean is an object of no small terror. Indeed, terror is in all cases whatsoever, either more openly or latently, the ruling principle of the sublime. Several languages bear a strong testimony to the affinity of these ideas. They frequently use the same word, to signify indifferently the modes of astonishment or admiration, and those of terror. [Greek] is in Greek, either fear or wonder; [Greek] is terrible or respectable; [Greek], to reverence or to fear. Vereor in Latin, is what [Greek] is in Greek. The Romans used the verb stupeo, a term which strongly marks the state of an astonished mind, to express the effect of either of simple fear or of astonishment; the word attonitus (thunder-struck) is equally expressive of the alliance of these ideas; and do not the French étonnement, and the English astonishment and amazement, point out as clearly the kindred emotions which attend fear and wonder? They who have a more general knowledge of languages, could produce, I make no doubt, many other and equally striking examples.

Obscurity

TO make anything very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes. Every one will be sensible of this, who considers how greatly night adds to our dread, in all cases of danger, and how much the notions of ghosts and goblins, of which none can form clear ideas, affect minds which give credit to the popular tales concerning such sorts of beings. Those despotic governments, which are founded on the passions of men, and principally upon the passion of fear, keep their chief as much as may be from the public eye. The policy has been the same in many cases of religion. Almost all the heathen temples were dark. Even in the barbarous temples of the Americans at this day, they keep their idol in a dark part of the hut, which is consecrated to his worship. For this purpose too the Druids performed all their ceremonies in the bosom of the darkest woods, and in the shade of the oldest and most spreading oaks. No person seems better to have understood the secret of heightening, or of setting terrible things, if I may use the expression, in their strongest light, by the force of a judicious obscurity, than Milton. His description of Death in the second book is admirably studied; it is astonishing with what a gloomy pomp, with what a significant and expressive uncertainty of strokes and colouring, he has finished the portrait of the king of terrors:

—The other shape,
If shape it might be called that shape had none
Distinguishable, in member, joint, or limb;
Or substance might be called that shadow seemed;
For each seemed either; black he stood as night;
Fierce as ten furies; terrible as hell;
And shook a deadly dart. What seemed his head
The likeness of a kingly crown had on.

In this description all is dark, uncertain, confused, terrible, and sublime to the last degree. […]

The Same Subject Continued

[…] I know several who admire and love painting, and yet who regard the objects of their admiration in that art with coolness enough in comparison of that warmth with which they are animated by affecting pieces of poetry or rhetoric. Among the common sort of people, I never could perceive that painting had much influence on their passions. It is true, that the best sorts of painting, as well as the best sorts of poetry, are not much understood in that sphere. But it is most certain, that their passions are very strongly roused by a fanatic preacher, or by the ballads of Chevy-chase, or the Children in the Wood, and by other little popular poems and tales that are current in that rank of life. I do not know of any paintings, bad or good, that produce the same effect. So that poetry, with all its obscurity, has a more general, as well as a more powerful, dominion over the passions, than the other art. And I think there are reasons in nature, why the obscure idea, when properly conveyed, should be more affecting than the clear. It is our ignorance of things that causes all our admiration, and chiefly excites our passions. Knowledge and acquaintance make the most striking causes affect but little. It is thus with the vulgar; and all men are as the vulgar in what they do not understand. The ideas of eternity and infinity are among the most affecting we have; and yet perhaps there is nothing of which we really understand so little, as of infinity and eternity. […]

Locke’s Opinion Concerning Darkness Considered

IT is Mr. Locke’s opinion, that darkness is not naturally an idea of terror; and that, though an excessive light is painful to the sense, the greatest excess of darkness is no ways troublesome. He observes indeed in another place, that a nurse or an old woman having once associated the idea of ghosts and goblins with that of darkness, night, ever after, becomes painful and horrible to the imagination. The authority of this great man is doubtless as great as that of any man can be, and it seems to stand in the way of our general principle. We have considered darkness as a cause of the sublime; and we have all along considered the sublime as depending on some modification of pain or terror: so that if darkness be no way painful or terrible to any, who have not had their minds early tainted with superstitions, it can be no source of the sublime to them. But, with all deference to such an authority, it seems to me, that an association of a more general nature, an association which takes in all mankind, may make darkness terrible; for in utter darkness it is impossible to know in what degree of safety we stand; we are ignorant of the objects that surround us; we may every moment strike against some dangerous obstruction; we may fall down a precipice the first step we take; and if an enemy approach, we know not in what quarter to defend ourselves; in such a case strength is no sure protection; wisdom can only act by guess; the boldest are staggered, and he, who would pray for nothing else towards his defence, is forced to pray for light.

As to the association of ghosts and goblins; surely it is more natural to think, that darkness, being originally an idea of terror, was chosen as a fit scene for such terrible representations, than that such representations have made darkness terrible. The mind of man very easily slides into an error of the former sort; but it is very hard to imagine, that the effect of an idea so universally terrible in all times, and in all countries, as darkness, could possibly have been owing to a set of idle stories, or to any cause of a nature so trivial, and of an operation so precarious.

Reflections on the French Revolution
by Edmund Burke

History will record, that on the morning of the 6th of October, 1789, the king and queen of France, after a day of confusion, alarm, dismay, and slaughter, lay down, under the pledged security of public faith, to indulge nature in a few hours of respite, and troubled, melancholy repose. From this sleep the queen was first startled by the voice of the sentinel at her door, who cried out to her to save herself by flight—that this was the last proof of fidelity he could give—that they were upon him, and he was dead. Instantly he was cut down. A band of cruel ruffians and assassins, reeking with his blood, rushed into the chamber of the queen, and pierced with a hundred strokes of bayonets and poniards the bed, from whence this persecuted woman had but just time to fly almost naked, and, through ways unknown to the murderers, had escaped to seek refuge at the feet of a king and husband, not secure of his own life for a moment.

This king, to say no more of him, and this queen, and their infant children, (who once would have been the pride and hope of a great and generous people,) were then forced to abandon the sanctuary of the most splendid palace in the world, which they left swimming in blood, polluted by massacre, and strewed with scattered limbs and mutilated carcases. Thence they were conducted into the capital of their kingdom. […]

It is now sixteen or seventeen years since I saw the queen of France, then the dauphiness, at Versailles; and surely never lighted on this orb, which she hardly seemed to touch, a more delightful vision. I saw her just above the horizon, decorating and cheering the elevated sphere she just began to move in,—glittering like the morning-star, full of life, and splendour, and joy. Oh! what a revolution! and what a heart must I have to contemplate without emotion that elevation and that fall! Little did I dream when she added titles of veneration to those of enthusiastic, distant, respectful love, that she should ever be obliged to carry the sharp antidote against disgrace concealed in that bosom; little did I dream that I should have lived to see such disasters fallen upon her in a nation of gallant men, in a nation of men of honour, and of cavaliers. I thought ten thousand swords must have leaped from their scabbards to avenge even a look that threatened her with insult. But the age of chivalry is gone. That of sophisters, economists, and calculators, has succeeded; and the glory of Europe is extinguished for ever. Never, never more shall we behold that generous loyalty to rank and sex, that proud submission, that dignified obedience, that subordination of the heart, which kept alive, even in servitude itself, the spirit of an exalted freedom. The unbought grace of life, the cheap defence of nations, the nurse of manly sentiment and heroic enterprise, is gone! It is gone, that sensibility of principle, that chastity of honour, which felt a stain like a wound, which inspired courage whilst it mitigated ferocity, which ennobled whatever it touched, and under which vice itself lost half its evil, by losing all its grossness.

This mixed system of opinion and sentiment had its origin in the ancient chivalry; and the principle, though varied in its appearance by the varying state of human affairs, subsisted and influenced through a long succession of generations, even to the time we live in. If it should ever be totally extinguished, the loss I fear will be great. It is this which has given its character to modern Europe. It is this which has distinguished it under all its forms of government, and distinguished it to its advantage, from the states of Asia, and possibly from those states which flourished in the most brilliant periods of the antique world. It was this, which, without confounding ranks, had produced a noble equality, and handed it down through all the gradations of social life. It was this opinion which mitigated kings into companions, and raised private men to be fellows with kings. Without force or opposition, it subdued the fierceness of pride and power; it obliged sovereigns to submit to the soft collar of social esteem, compelled stern authority to submit to elegance, and gave a domination, vanquisher of laws, to be subdued by manners.

But now all is to be changed. All the pleasing illusions, which made power gentle and obedience liberal, which harmonized the different shades of life, and which, by a bland assimilation, incorporated into politics the sentiments which beautify and soften private society, are to be dissolved by this new conquering empire of light and reason. All the decent drapery of life is to be rudely torn off. All the superadded ideas, furnished from the wardrobe of a moral imagination, which the heart owns, and the understanding ratifies, as necessary to cover the defects of our naked, shivering nature, and to raise it to dignity in our own estimation, are to be exploded as a ridiculous, absurd, and antiquated fashion.

On this scheme of things, a king is but a man, a queen is but a woman; a woman is but an animal, and an animal not of the highest order. All homage paid to the sex in general as such, and without distinct views, is to be regarded as romance and folly. Regicide, and parricide, and sacrilege, are but fictions of superstition, corrupting jurisprudence by destroying its simplicity. The murder of a king, or a queen, or a bishop, or a father, are only common homicide; and if the people are by any chance, or in any way, gainers by it, a sort of homicide much the most pardonable, and into which we ought not to make too severe a scrutiny.

On the scheme of this barbarous philosophy, which is the offspring of cold hearts and muddy understandings, and which is as void of solid wisdom as it is destitute of all taste and elegance, laws are to be supported only by their own terrors, and by the concern which each individual may find in them from his own private speculations, or can spare to them from his own private interests. In the groves of their academy, at the end of every vista, you see nothing but the gallows. Nothing is left which engages the affections on the part of the commonwealth. On the principles of this mechanic philosophy, our institutions can never be embodied, if I may use the expression, in persons; so as to create in us love, veneration, admiration, or attachment. But that sort of reason which banishes the affections is incapable of filling their place. These public affections, combined with manners, are required sometimes as supplements, sometimes as correctives, always as aids to law. The precept given by a wise man, as well as a great critic, for the construction of poems, is equally true as to states:—Non satis est pulchra esse poemata, dulcia sunto. There ought to be a system of manners in every nation, which a well-formed mind would be disposed to relish. To make us love our country, our country ought to be lovely.

* * *

Rights of Man:
Being an Answer to Mr. Burke’s Attack on the French Revolution
by Thomas Paine

But Mr. Burke appears to have no idea of principles when he is contemplating Governments. “Ten years ago,” says he, “I could have felicitated France on her having a Government, without inquiring what the nature of that Government was, or how it was administered.” Is this the language of a rational man? Is it the language of a heart feeling as it ought to feel for the rights and happiness of the human race? On this ground, Mr. Burke must compliment all the Governments in the world, while the victims who suffer under them, whether sold into slavery, or tortured out of existence, are wholly forgotten. It is power, and not principles, that Mr. Burke venerates; and under this abominable depravity he is disqualified to judge between them. Thus much for his opinion as to the occasions of the French Revolution. I now proceed to other considerations.

I know a place in America called Point-no-Point, because as you proceed along the shore, gay and flowery as Mr. Burke’s language, it continually recedes and presents itself at a distance before you; but when you have got as far as you can go, there is no point at all. Just thus it is with Mr. Burke’s three hundred and sixty-six pages. It is therefore difficult to reply to him. But as the points he wishes to establish may be inferred from what he abuses, it is in his paradoxes that we must look for his arguments.

As to the tragic paintings by which Mr. Burke has outraged his own imagination, and seeks to work upon that of his readers, they are very well calculated for theatrical representation, where facts are manufactured for the sake of show, and accommodated to produce, through the weakness of sympathy, a weeping effect. But Mr. Burke should recollect that he is writing history, and not plays, and that his readers will expect truth, and not the spouting rant of high-toned exclamation.

When we see a man dramatically lamenting in a publication intended to be believed that “The age of chivalry is gone! that The glory of Europe is extinguished for ever! that The unbought grace of life (if anyone knows what it is), the cheap defence of nations, the nurse of manly sentiment and heroic enterprise is gone!” and all this because the Quixot age of chivalry nonsense is gone, what opinion can we form of his judgment, or what regard can we pay to his facts? In the rhapsody of his imagination he has discovered a world of wind mills, and his sorrows are that there are no Quixots to attack them. But if the age of aristocracy, like that of chivalry, should fall (and they had originally some connection) Mr. Burke, the trumpeter of the Order, may continue his parody to the end, and finish with exclaiming: “Othello’s occupation’s gone!”

Notwithstanding Mr. Burke’s horrid paintings, when the French Revolution is compared with the Revolutions of other countries, the astonishment will be that it is marked with so few sacrifices; but this astonishment will cease when we reflect that principles, and not persons, were the meditated objects of destruction. The mind of the nation was acted upon by a higher stimulus than what the consideration of persons could inspire, and sought a higher conquest than could be produced by the downfall of an enemy. Among the few who fell there do not appear to be any that were intentionally singled out. They all of them had their fate in the circumstances of the moment, and were not pursued with that long, cold-blooded unabated revenge which pursued the unfortunate Scotch in the affair of 1745.

Through the whole of Mr. Burke’s book I do not observe that the Bastille is mentioned more than once, and that with a kind of implication as if he were sorry it was pulled down, and wished it were built up again. “We have rebuilt Newgate,” says he, “and tenanted the mansion; and we have prisons almost as strong as the Bastille for those who dare to libel the queens of France.” As to what a madman like the person called Lord George Gordon might say, and to whom Newgate is rather a bedlam than a prison, it is unworthy a rational consideration. It was a madman that libelled, and that is sufficient apology; and it afforded an opportunity for confining him, which was the thing that was wished for. But certain it is that Mr. Burke, who does not call himself a madman (whatever other people may do), has libelled in the most unprovoked manner, and in the grossest style of the most vulgar abuse, the whole representative authority of France, and yet Mr. Burke takes his seat in the British House of Commons! From his violence and his grief, his silence on some points and his excess on others, it is difficult not to believe that Mr. Burke is sorry, extremely sorry, that arbitrary power, the power of the Pope and the Bastille, are pulled down.

Not one glance of compassion, not one commiserating reflection that I can find throughout his book, has he bestowed on those who lingered out the most wretched of lives, a life without hope in the most miserable of prisons. It is painful to behold a man employing his talents to corrupt himself. Nature has been kinder to Mr. Burke than he is to her. He is not affected by the reality of distress touching his heart, but by the showy resemblance of it striking his imagination. He pities the plumage, but forgets the dying bird. Accustomed to kiss the aristocratical hand that hath purloined him from himself, he degenerates into a composition of art, and the genuine soul of nature forsakes him. His hero or his heroine must be a tragedy-victim expiring in show, and not the real prisoner of misery, sliding into death in the silence of a dungeon.

As Mr. Burke has passed over the whole transaction of the Bastille (and his silence is nothing in his favour), and has entertained his readers with reflections on supposed facts distorted into real falsehoods, I will give, since he has not, some account of the circumstances which preceded that transaction. They will serve to show that less mischief could scarcely have accompanied such an event when considered with the treacherous and hostile aggravations of the enemies of the Revolution.

The mind can hardly picture to itself a more tremendous scene than what the city of Paris exhibited at the time of taking the Bastille, and for two days before and after, nor perceive the possibility of its quieting so soon. At a distance this transaction has appeared only as an act of heroism standing on itself, and the close political connection it had with the Revolution is lost in the brilliancy of the achievement. But we are to consider it as the strength of the parties brought man to man, and contending for the issue. The Bastille was to be either the prize or the prison of the assailants. The downfall of it included the idea of the downfall of despotism, and this compounded image was become as figuratively united as Bunyan’s Doubting Castle and Giant Despair.

* * *

The Reactionary Mind
by Corey Robin
pp. 243-245

As Orwell taught, the possibilities for cruelty and violence are as limitless as the imagination that dreams them up. But the armies and agencies of today’s violence are vast bureaucracies, and vast bureaucracies need rules. Eliminating the rules does not Prometheus unbind; it just makes for more billable hours.

“No yielding. No equivocation. No lawyering this thing to death.” That was George W. Bush’s vow after 9/11 and his description of how the war on terror would be conducted. Like so many of Bush’s other declarations, it turned out to be an empty promise. This thing was lawyered to death. But, and this is the critical point, far from minimizing state violence—which was the great fear of the neocons—lawyering has proven to be perfectly compatible with violence. In a war already swollen with disappointment and disillusion, the realization that inevitably follows—the rule of law can, in fact, authorize the greatest adventures of violence and death, thereby draining them of sublimity—must be, for the conservative, the greatest disillusion of all.

Had they been closer readers of Burke, the neoconservatives—like Fukuyama, Roosevelt, Sorel, Schmitt, Tocqueville, Maistre, Treitschke, and so many more on the American and European right—could have seen this disillusion coming. Burke certainly did. Even as he wrote of the sublime effects of pain and danger, he was careful to insist that should those pains and dangers “press too nearly” or “too close”—that is, should they become realities rather than fantasies, should they become “conversant about the present destruction of the person”—their sublimity would disappear. They would cease to be “delightful” and restorative and become simply terrible.64 Burke’s point was not merely that no one, in the end, really wants to die or that no one enjoys unwelcome, excruciating pain. It was that sublimity of whatever kind and source depends upon obscurity: get too close to anything, whether an object or experience, see and feel its full extent, and it loses its mystery and aura. It becomes familiar. A “great clearness” of the sort that comes from direct experience “is in some sort an enemy to all enthusiasms whatsoever.”65 “It is our ignorance of things that causes all our admiration, and chiefly excites our passions. Knowledge and acquaintance make the most striking causes affect but little.”66 “A clear idea,” Burke concludes, “is therefore another name for a little idea.”67 Get to know anything, including violence, too well, and it loses whatever attribute—rejuvenation, transgression, excitement, awe—you ascribed to it when it was just an idea.

Earlier than most, Burke understood that if violence were to retain its sublimity, it had to remain a possibility, an object of fantasy—a horror movie, a video game, an essay on war. For the actuality (as opposed to the representation) of violence was at odds with the requirements of sublimity. Real, as opposed to imagined, violence entailed objects getting too close, bodies pressing too near, flesh upon flesh. Violence stripped the body of its veils; violence made its antagonists familiar to each other in a way they had never been before. Violence dispelled illusion and mystery, making things drab and dreary. That is why, in his discussion in the Reflections of the revolutionaries’ abduction of Marie Antoinette, Burke takes such pains to emphasize her “almost naked” body and turns so effortlessly to the language of clothing—“the decent drapery of life,” the “wardrobe of the moral imagination,” “antiquated fashion,” and so on—to describe the event.68 The disaster of the revolutionaries’ violence, for Burke, was not cruelty; it was the unsought enlightenment.

Since 9/11, many have complained, and rightly so, about the failure of conservatives—or their sons and daughters—to fight the war on terror themselves. For those on the left, that failure is symptomatic of the class injustice of contemporary America. But there is an additional element to the story. So long as the war on terror remains an idea—a hot topic on the blogs, a provocative op-ed, an episode of 24—it is sublime. As soon as the war on terror becomes a reality, it can be as cheerless as a discussion of the tax code and as tedious as a trip to the DMV.

Fear: The History of a Political Idea
by Corey Robin
Kindle Locations 402-406

It might seem strange that a book about political fear should assign so much space to our ideas about fear rather than to its practice. But recall what Burke said: It is not so much the actuality of a threat, but the imagined idea of that threat, that renews and restores. “If the pain and terror are so modified as not to be actually noxious; if the pain is not carried to violence, and the terror is not conversant about the present destruction of the person,” then, and only then, do we experience “a delightful horror.”1 The condition of our being renewed by fear is not that we directly experience the object that threatens us, but that the object be kept at some remove from ourselves.

Kindle Locations 1061-1066

Whether they have read The Spirit of the Laws or not, these writers are its children. With its trawling allusions to the febrile and the fervid, The Spirit of the Laws successfully aroused the conviction that terror was synonymous with barbarism, and that its cures were to be found entirely within liberalism. Thus was a new political and literary aesthetic born, a rhetoric of hyperbole suggesting that terror’s escorts were inevitably remoteness, irrationality, and darkness, and its enemies, familiarity, reason, and light. Perhaps it was this aesthetic that a young Edmund Burke had in mind when he wrote, two years after Montesquieu’s death, “To make any thing very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes.”

Kindle Locations 1608-1618

As she set about establishing a new political morality in the shadow of total terror, however, Arendt became aware of a problem that had plagued Hobbes, Montesquieu, and Tocqueville, and that Burke—not to mention makers of horror films—understood all too well: once terrors become familiar, they cease to arouse dread. The theorist who tries to establish fear as a foundation for a new politics must always find a demon darker than that of her predecessors, discover ever more novel, and more frightening, forms of fear. Thus Montesquieu, seeking to outdo Hobbes, imagined a form of terror that threatened the very basis of that which made us human. In Arendt’s case, it was her closing image of interchangeable victims and victimizers—of terror serving no interest and no party, not even its wielders; of a world ruled by no one and nothing, save the impersonal laws of motion—that yielded the necessary “radical evil” from which a new politics could emerge.

But as her friend and mentor Karl Jaspers was quick to recognize, Arendt had come upon this notion of radical evil at a terrible cost: it made moral judgment of the perpetrators of total terror nearly impossible.59 According to Origins, total terror rendered everyone—from Hitler down through the Jews, from Stalin to the kulaks—incapable of acting. Indeed, as Arendt admitted in 1963, “There exists a widespread theory, to which I also contributed [in Origins], that these crimes defy the possibility of human judgment and explode the frame of our legal institutions.”60 Total terror may have done what fear, terror, and anxiety did for her predecessors—found a new politics—but, as Arendt would come to realize in Eichmann in Jerusalem, it was a false foundation, inspiring an operatic sense of catastrophe, that ultimately let the perpetrators off the hook by obscuring the hard political realities of rule by fear.

Liberalism at Bay, Conservatism at Play:
Fear in the Contemporary Imagination

by Corey Robin

For theorists like Locke and Burke, fear is something to be cherished, not because it alerts us to real danger or propels us to take necessary action against it, but because fear is supposed to arouse a heightened state of experience. It quickens our perceptions as no other emotion can, forcing us to see and to act in the world in new and more interesting ways, with greater moral discrimination and a more acute consciousness of our surroundings and ourselves. According to Locke, fear is “an uneasiness of the mind” and “the chief, if not only spur to human industry and action is uneasiness.” Though we might think that men and women act on behalf of desire, Locke insisted that “a little burning felt”—like fear—“pushes us more powerfully than great pleasures in prospect draw or allure.” Burke had equally low regard for pleasure. It induces a grotesque implosion of self, a “soft tranquility” approximating an advanced state of decay if not death itself.

The head reclines something on one side; the eyelids are more closed than usual, and the eyes roll gently with an inclination to the object, the mouth is a little opened, and the breath drawn slowly, with now and then a low sigh; the whole body is composed, and the hands fall idly to the sides. All this is accompanied with an inward sense of melting and languor . . . relaxing the solids of the whole system.

But when we imagine the prospect of “pain and terror,” Burke added, we experience “a delightful horror,” the “strongest of all passions.” Without fear, we are passive; with it, we are roused to “the strongest emotion which the mind is capable of feeling” (Locke, 1959, II.20.6, 10; II.21.34: 304-5, 334; Burke, 1990: 32, 36, 123, 135-36).

At the political level, modern theorists have argued that fear is a spur to civic vitality and moral renewal, perhaps even a source of public freedom. Writing in the wake of the French Revolution, Tocqueville bemoaned the lethargy of modern democracy. With its free-wheeling antinomianism and social mobility, democratic society “inevitably enervates the soul, and relaxing the springs of the will, prepares a people for bondage. Then not only will they let their freedom be taken from them, but often they actually hand it over themselves” (Tocqueville, 1969: 444). Lacking confidence in the traditional truths of God and king, Tocqueville believed that democracies might find a renewed confidence in the experience of fear, which could activate and ground a commitment to public freedom. “Fear,” he wrote in a note to himself, “must be put to work on behalf of liberty,” or, as he put it in Democracy in America, “Let us, then, look forward to the future with that salutary fear which makes men keep watch and ward for freedom, and not with that flabby, idle terror which makes men’s hearts sink and enervates them” (cited in Lamberti, 1989: 229; Tocqueville, 1969: 702). Armed with fear, democracy would be fortified against not only external and domestic enemies but also the inner tendency, the native desire, to dissolve into the soupy indifference of which Burke spoke.

* * *

The Dark Beauty of Unheard-Of Horrors
by Thomas Ligotti

This is how it is when a mysterious force is embodied in a human body, or in any form that is too well fixed. And a mystery explained is one robbed of its power of emotion, dwindling into a parcel of information, a tissue of rules and statistics without meaning in themselves.

Of course, mystery actually requires a measure of the concrete if it is to be perceived at all; otherwise it is only a void, the void. The thinnest mixture of this mortar, I suppose, is contained in that most basic source of mystery—darkness. Very difficult to domesticate this phenomenon, to collar it and give a name to the fear it inspires. As a verse writer once said:

The blackness at the bottom of a well
May hold most any kind of hell.

The dark is, indeed, the phenomenon possessing the maximum of mystery, the one most resistant to the taming of the mind and most resonant with emotions and meanings of a highly complex and subtle type. It is also extremely abstract as a provenance for supernatural horror, an elusive prodigy whose potential for fear may slip through a writer’s fingers and right past even a sensitive reader of terror tales. Obviously it is problematic in a way that a solid pair of gleaming fangs at a victim’s neck is not. Hence, darkness itself is rarely used in a story as the central incarnation of the supernatural, though it often serves in a supporting role as an element of atmosphere, an extension of more concrete phenomena. The shadowy ambiance of a fictional locale almost always resolves itself into an apparition of substance, a threat with a name, if not a full-blown history. Darkness may also perform in a strictly symbolic capacity, representing the abyss at the core of any genuine tale of mystery and horror. But to draw a reader’s attention to this abyss, this unnameable hell of blackness, is usually sacrificed in favor of focusing on some tangible dread pressing against the body of everyday life. From these facts may be derived an ad hoc taxonomy for dividing supernatural stories into types, or rather a spectrum of types: on the one side, those that tend to emphasize the surface manifestations of a supernatural phenomenon; on the other, those that reach toward the dark core of mystery in its purest and most abstract condition. The former stories show us the bodies, big as life, of the demonic tribe of spooks, vampires, and other assorted bogeymen; the latter suggest to us the essence, far bigger than life, of that dark universal terror beyond naming which is the matrix for all other terrors. […]

Like Erich Zann’s “world of beauty,” Lovecraft’s “lay in some far cosmos of the imagination,” and like that of another artist, it is a “beauty that hath horror in it.”

The Conspiracy against the Human Race: A Contrivance of Horror
by Thomas Ligotti
pp. 41-42

As heretofore noted, consciousness may have assisted our species’ survival in the hard times of prehistory, but as it became ever more intense it evolved the potential to ruin everything if not securely muzzled. This is the problem: We must either outsmart consciousness or be thrown into its vortex of doleful factuality and suffer, as Zapffe termed it, a “dread of being”—not only of our own being but of being itself, the idea that the vacancy that might otherwise have obtained is occupied like a stall in a public lavatory of infinite dimensions, that there is a universe in which things like celestial bodies and human beings are roving about, that anything exists in the way it seems to exist, that we are part of all being until we stop being, if there is anything we may understand as being other than semblances or the appearance of semblances.

On the premise that consciousness must be obfuscated so that we might go on as we have all these years, Zapffe inferred that the sensible thing would be not to go on with the paradoxical nonsense of trying to inhibit our cardinal attribute as beings, since we can tolerate existence only if we believe—in accord with a complex of illusions, a legerdemain of duplicity—that we are not what we are: unreality on legs. As conscious beings, we must hold back that divulgement lest it break us with a sense of being things without significance or foundation, anatomies shackled to a landscape of unintelligible horrors. In plain language, we cannot live except as self-deceivers who must lie to ourselves about ourselves, as well as about our unwinnable situation in this world.

Accepting the preceding statements as containing some truth, or at least for the sake of moving on with the present narrative, it seems that we are zealots of Zapffe’s four plans for smothering consciousness: isolation (“Being alive is all right”), anchoring (“One Nation under God with Families, Morality, and Natural Birthrights for all”), distraction (“Better to kill time than kill oneself”), and sublimation (“I am writing a book titled The Conspiracy against the Human Race”). These practices make us organisms with a nimble intellect that can deceive themselves “for their own good.” Isolation, anchoring, distraction, and sublimation are among the wiles we use to keep ourselves from dispelling every illusion that keeps us up and running. Without this cognitive double-dealing, we would be exposed for what we are. It would be like looking into a mirror and for a moment seeing the skull inside our skin looking back at us with its sardonic smile. And beneath the skull—only blackness, nothing. A little piece of our world has been peeled back, and underneath is creaking desolation—a carnival where all the rides are moving but no patrons occupy the seats. We are missing from the world we have made for ourselves. Maybe if we could resolutely gaze wide-eyed at our lives we would come to know what we really are. But that would stop the showy attraction we are inclined to think will run forever.

p. 182

That we all deserve punishment by horror is as mystifying as it is undeniable. To be an accomplice, however involuntarily, in a reasonless non-reality is cause enough for the harshest sentencing. But we have been trained so well to accept the “order” of an unreal world that we do not rebel against it. How could we? Where pain and pleasure form a corrupt alliance against us, paradise and hell are merely different divisions in the same monstrous bureaucracy. And between these two poles exists everything we know or can ever know. It is not even possible to imagine a utopia, earthly or otherwise, that can stand up under the mildest criticism. But one must take into account the shocking fact that we live on a world that spins. After considering this truth, nothing should come as a surprise.

Still, on rare occasions we do overcome hopelessness or velleity and make mutinous demands to live in a real world, one that is at least episodically ordered to our advantage. But perhaps it is only a demon of some kind that moves us to such idle insubordination, the more so to aggravate our condition in the unreal. After all, is it not wondrous that we are allowed to be both witnesses and victims of the sepulchral pomp of wasting tissue? And one thing we know is real: horror. It is so real, in fact, that we cannot be sure it could not exist without us. Yes, it needs our imaginations and our consciousness, but it does not ask or require our consent to use them. Indeed, horror operates with complete autonomy. Generating ontological havoc, it is mephitic foam upon which our lives merely float. And, ultimately, we must face up to it: Horror is more real than we are.

p. 218

Without death—meaning without our consciousness of death—no story of supernatural horror would ever have been written, nor would any other artistic representation of human life have been created for that matter. It is always there, if only between the lines or brushstrokes, or conspicuously by its absence. It is a terrific stimulus to that which is at once one of our greatest weapons and greatest weaknesses—imagination. Our minds are always on the verge of exploding with thoughts and images as we ceaselessly pound the pavement of our world. Both our most exquisite cogitations and our worst cognitive drivel announce our primal torment: We cannot linger in the stillness of nature’s vacuity. And so we have imagination to beguile us. A misbegotten hatchling of consciousness, a birth defect of our species, imagination is often revered as a sign of vigor in our make-up. But it is really just a psychic overcompensation for our impotence as beings. Denied nature’s exemption from creativity, we are indentured servants of the imaginary until the hour of our death, when the final harassments of imagination will beset us.

* * *

The Horror of the Unreal
By Peter Bebergal

The TV show “The Walking Dead” is one long exercise in tension. But the zombies—the supposed centerpiece of the show’s horror—are not particularly frightening. Gross, to be sure, but also knowable, literal. You can see them coming from yards away. They are the product of science gone wrong, or of a virus, or of some other phenomenal cause. They can be destroyed with an arrow through the brain. More aberration than genuine monsters, they lack the essential quality to truly terrify: an aspect of the unreal.

The horror writer Thomas Ligotti believes that even tales of virus-created zombies—and other essentially comprehensible creatures—can elicit what we might call, quoting the theologian Rudolf Otto, “the wholly other,” but it requires a deft hand. The best such stories “approach the realm of the supernatural,” he told me over e-mail, even if their monsters are entirely earthly. As an example, he pointed to “The Texas Chainsaw Massacre,” “wherein the brutality displayed is so deviant and strange it takes off into the uncanny.” Ligotti doesn’t require bloodthirsty villains to convey a sense of impending horror, though. “I tend to stipulate in my work that the world by its nature already exists in a state of doom rather than being in the process of doom.” […]

“Whether or not there is anything called the divine is neither here nor there,” Ligotti told me. “It’s irrelevant to our sense of what is beyond the veil.” Ligotti believes that fiction can put us in touch with that sense of things unseen, that it can create an encounter with—to quote Rudolf Otto again—the mysterium tremendum et fascinans, a state that combines terror and enchantment with the divine. In fact, Ligotti believes that “any so-called serious work of literature that doesn’t to some extent serve this function has failed.” It’s not a matter of genre, he says. He cites Raymond Chandler’s Philip Marlowe as a character who would go wherever the clues took him, no matter how deep into the heart of the “unknown.” “Chandler wanted his detective stories to invoke the sense of the ‘country behind the hill.’”

Because Ligotti has no interest in whether or not that world beyond actually exists, there is a tension, an unanswered question, in his work: Can we locate the source of this horror? His characters are often confronted by people or groups who worship something so alien that their rituals don’t conform to any identifiable modes of religious practice. Usually, they involve some form of sacrifice or other suggestion of violence. The implication seems to be that, even if there is meaning in the universe, that meaning is so foreign, so strange, that we could never understand it, and it could never make a difference in our lives. Any attempt to penetrate it will only lead to madness.

As a practical matter, Ligotti believes that the short story is the most potent means for conveying this idea. “A novel can’t consistently project what Poe called a ‘single effect,’” he explains. “It would be too wearing on the reader—too repetitious and dense, as would, for instance, a lengthy narrative poem written in the style of a lyric poem. A large part of supernatural novels must therefore be concerned with the mundane and not with a sense of what I’ll call ‘the invisible.’”

Trying to get Ligotti to explain what he means by the “invisible” is not easy. “I’m not able to see my stories as establishing or presuming the existence of a veil beyond which the characters in them are incapable of seeing. I simply don’t view them in this way.” But his characters, I insisted, suggest that we are all capable of seeing beyond the veil, though it’s impossible to tell if they are simply mad, or if they have indeed perceived something outside normal perception. I asked Ligotti if he saw a difference between these two states of consciousness. “The only interest I’ve taken in psychological aberrancy in fiction,” he answered, “has been as a vehicle of perceiving the derangement of creation.”

Thomas Ligotti: Dark Phenomenology and Abstract Horror
by S.C. Hickman

Ligotti makes the point that horror must stay ill-defined, that the monstrous must menace us from a distance, from the unknown; a non-knowledge, rather than a knowledge of the natural; it is the unnatural and invisible that affects us, not something we can reduce to some sociological, psychological, or political formation or representation, which only kills the mystery – taming it and pigeonholing it into some cultural gatekeeper’s caged obituary. […] The domesticated beast is no horror at all.

In the attic of the mind a lunatic family resides, a carnival world of aberrant thoughts and feelings – one that, if we did not lock it away in a conspiracy of silence, would freeze us in such terror and fright that we would become immobilized, unable to think, feel, or live except as zombies, mindlessly. So we isolate these demented creatures, keep them at bay. Then we anchor ourselves in artifice, accept substitutes, religious mythologies, secular philosophies, and anything else that will help us keep the monsters at bay. As Ligotti will say, we need our illusions – our metaphysical anchors and dreamscapes “that inebriate us with a sense of being official, authentic, and safe in our beds” (CHR, 31). Yet, when even these metaphysical ploys won’t stem the tide of those heinous monsters from within, we seek out distraction, entertainment: TV, sports, bars, dancing, friends, fishing, scuba diving, boating, car racing, horse riding… almost anything that will keep our mind empty of its dark secret, that will allow it to escape the burden of emotion – of fear, even if only for a night or an afternoon of sheer mindless bliss. And, last but not least, we seek out culture, sublimation – art, theatre, festivals, carnivals, painting, writing, books… we seek to let it all out, let it enter into that sphere of the tragic or comic, that realm where we can exorcize it, display it, pin it to the wall for all to see our fears and terrors on display, not as they are but as we lift them up into art, shape them to our nightmare visions or dreamscapes of desire. As Ligotti tells it, we read literature, view a painting, go to a theatre, etc. […]

Horror acts like a sigil, a diagram that invokes the powers within the darkness to arise, to unfold their mystery, to explain themselves; and, if not explain then at least to invade our equilibrium, our staid and comfortable world with their rage, their torment, their corruption. The best literary horror or weird tales never describe in detail the mystery, rather they invoke by hyperstitional invention: calling forth the forces out of darkness and the abstract, and allowing them to co-habit for a time the shared space – the vicarious bubble or interzone between the reader and narrative […]

This notion of the tension between the epistemic and ontic in abstract horror returns me to Nick Land’s short work Phyl-Undhu: Abstract Horror, Exterminator in which the narrator tells us that what we fear, what terrorizes us is not the seen – the known and definable, but rather the unseen and unknown, even “shapeless threat, ‘Outside’ only in the abstract sense (encompassing the negative immensity of everything that we cannot grasp). It could be anywhere, from our genes or ecological dynamics, to the hidden laws of technological evolution, or the hostile vastnesses between the stars. We know only that, in strict proportion to the vitality of the cosmos, the probability of its existence advances towards inevitability, and that for us it means supreme ill. Ontological density without identifiable form is abstract horror itself.” […]

Yet, as Lovecraft once suggested in one of his famous stories, “The Call of Cthulhu,” the “sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.” Here is the nub for Ligotti, the dividing line of those who continue to sleep in the illusory safety net of their cultural delusions […] Many will remember that the Anglo-American poet T. S. Eliot once suggested that “human kind cannot bear very much reality.” […]

For Ligotti the subjective reaction to the seemingly objective stimulus of the uncanny is the gaining of “dark knowledge” about the workings of individuals, […] This sense that the corruption works both ways, upon the victim and the perpetrator; that the world is now topsy-turvy and that the uncanny boundaries between victim and perpetrator are reversible and hazy, and not always obvious, is due to that subtle knowledge that each culture is circumscribed within its own black box of conceptuality. By that I mean that, as Eduardo Viveiros de Castro argues in his Cannibal Metaphysics, Amazonian and other Amerindian groups inhabit a radically different conceptual universe than ours—one in which nature and culture, human and nonhuman, subject and object are conceived in terms that reverse our own. He presents the case for anthropology as the study of such “other” metaphysical schemes, and as the corresponding critique of the concepts imposed on them by the human sciences. […]

We’re in that position of moving either way: 1) literalizing our fantasies: building walls and barbed-wire fences against invading hordes of refugees, migrants, etc.; or, 2) of seeing through them, seeing the aesthetic and defensive use of art and social mechanisms to defend ourselves from the onslaught of our own daemonic nihilism and drives: our fears and terrors. […]

In our time we’ve forgotten this fact, and forgotten the art of laughter, the ability to see the world through the lens of art or horror literature and know that this, too, is illusion: the aesthetic call to our emotions, to our fears and our terrors, that allows that purge, that release that only great art can supply. Rather, in our time we’ve all become literalists of the imagination, so that apocalypse, rather than a pleasant channeling of our fears, has become an actual possibility and real manifestation in the world around us in wars, famines, racism, hatred, murder, mayhem… The problem we face is that we’ve targeted the external world of actual people and deemed them disposable, as if they are the ravenous zombies and vampires of our contemporary globalist madness. We’ve turned the inside out, reversed what once existed within into a projected nightmare scenario and living hell in the real world, not as fantasy but as daemonic threat and doom upon ourselves and others. Talking of contemporary horror films, Ligotti remarks that the characters in these films “cannot be sure who is a ‘thing’ and who is not, since those who are transmuted retain their former appearance, memories, and behaviors even after they have become, in their essence, uncanny monstrosities from another world.” (CHR, 92) This sense that we’ve allowed the immigrants (US) and refugees (US and EU) to enter into and become a part of the social body of our nations leads to this sense of uncanny uncertainty that one cannot be sure who is the “thing” – is it us or them: a paranoiac nightmare world of ravening lunacy, indeed. Because our categories of normal/abnormal have broken down due to the absolute Other of other conceptual cultures, who have other sets of Symbolic Orders and ideas, concepts, ideologies, religions, and Laws, etc., we are now in the predicament of mutating and transforming into an Other ourselves all across the globe. There is no safe haven, no place to hide or defend oneself against oneself. In this sense we’ve all – everyone on the planet – become, as Ligotti states it, in “essence, uncanny monstrosities from another world.” (CHR, 92)

* * *

Trickster Makes This World
by Lewis Hyde
pp. 168-172

During the years I was writing this book, there was an intense national debate over the concern that government funds might be used to subsidize pornographic art. The particulars will undoubtedly change, but the debate is perennial. On the one side, we have those who presume to speak for the collective trying to preserve the coverings and silences that give social space its order. On the other side, we have the agents of change, time travelers who take the order itself to be mutable, who hope— to give it the most positive formulation— to preserve the sacred by finding ways to shift the structure of things as contingency demands. It is not immediately clear why this latter camp must so regularly turn to bodily and sexual display, but the context I am establishing here suggests that such display is necessary.

To explore why this might be the case, let me begin with the classic image from the Old Testament: Adam and Eve leaving the garden, having learned shame and therefore having covered their genitals and, in the old paintings, holding their hands over their faces as well. By these actions they inscribe their own bodies. The body happens to be a uniquely apt location for the inscription of shame, partly because the body itself seems to be the sense organ of shame (the feeling swamps us, we stutter and flush against our will), but also because the content of shame, what we feel ashamed of, typically seems indelible and fixed, with us as a sort of natural fact, the way the body is with us as a natural fact. “Shame is what you are, guilt is what you do,” goes an old saying. Guilt can be undone with acts of penance, but the feeling of shame sticks around like a birthmark or the smell of cigarettes.

I earlier connected the way we learn about shame to rules about speech and silence, and made the additional claim that those rules have an ordering function. Now, let us say that the rules give order to several things at once, not just to society but to the body and the psyche as well. When I say “several things at once” I mean that the rules imply the congruence of these three realms; the orderliness of one is the orderliness of the others. The organized body is a sign that we are organized psychologically and that we understand and accept the organization of the world around us. When Adam and Eve cover their genitals, they simultaneously begin to structure consciousness and to structure their primordial community. To make the temenos, a line is drawn on the earth and one thing cut from another; when Adam and Eve learn shame, they draw a line on their bodies, dividing them into zones like the zones of silence and speech— or, rather, not “like” those zones, but identified with them, for what one covers on the body one also consigns to silence.

[…] an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap.

Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman). This substituting of one thing for another is called metonymy in rhetoric, one of the many figures of thought, a trope or verbal turn. The construction of the trap of shame begins with this metonymic trick, a kind of bait and switch in which one’s changeable social place is figured in terms of an unchangeable part of the body. Then by various means the trick is made to blend invisibly into the landscape. To begin with, there are always larger stories going on— about women or race or a snake in a garden. The enchantment of those regularly repeated fables, along with the rules of silence at their edges, and the assertion that they are intuitively true— all these things secure the borders of the narrative and make it difficult to see the contingency of its figures of thought. Once the verbal tricks are invisible, the artifice of the social order becomes invisible as well, and begins to seem natural. As menstruation and skin color and the genitals are natural facts, so the social and psychological orders become natural facts.

In short, to make the trap of shame we inscribe the body as a sign of wider worlds, then erase the artifice of that signification so that the content of shame becomes simply the way things are, as any fool can see.

If this is how the trap is made, then escaping it must involve reversing at least some of these elements. In what might be called the “heavy-bodied” escape, one senses that there’s something to be changed but ends up trying to change the body itself, mutilating it, or even committing suicide […]

These are the beginnings of conscious struggle, but we have yet to meet the mind of the trickster— or if we have, it belongs to the trickster who tries to eat the reflected berries, who burns his own anus in anger, who has not learned to separate the bait from the hook. As we saw earlier, the pressures of experience produce from that somewhat witless character a more sophisticated trickster who can separate bait from hook, who knows that the sign of something is not the thing itself, and who is therefore a better escape artist with a much more playful relationship to the local stories. The heavy-bodied, literalizing attempt to escape from shame carries much of the trap with it— the link to the body, the silence, and so on. Inarticulately, it takes the sign for the thing itself, imagining racism inheres in the color of the skin. Wise to the tricks of language, the light-bodied escape from shame refuses the whole setup— refuses the metonymic shift, the enchantment of group story, and the rules of silence— and by these refusals it detaches the supposedly overlapping levels of inscription from one another so that the body, especially, need no longer stand as the mute, incarnate seal of social and psychological order. All this, but especially the speaking out where shame demands silence, depends largely on a consciousness that doesn’t feel much inhibition, and knows how traps are made, and knows how to subvert them.

This is the insight that comes to all boundary-crossers— immigrants in fact or immigrants in time— that meaning is contingent and identity fluid, even the meaning and identity of one’s own body.

It should by now be easier to see why there will always be art that uncovers the body, and artists who speak shamelessly, even obscenely. All social structures do well to anchor their rules of conduct in the seemingly simple inscription of the body, so that only after I have covered my privates am I allowed to show my face to the world and have a public life. The rules of bodily decorum usually imply that the cosmos depends on the shame we feel about our bodies. But sometimes the lesson is a lie, and a cunningly self-protecting one at that, for to question it requires self-exposure and loss of face, and who would want that? Well, trickster would, as would all those who find they cannot fashion a place for themselves in the world until they have spoken against collective silence. We certainly see this— not just the speaking out but the self-exposure— in Allen Ginsberg, and we see it a bit more subtly in both Kingston and Rodriguez. Neither of them is a “dirty writer” the way Ginsberg is, but to begin to speak, one of them must talk about menstruation (which talk she links to becoming the mistress of her own sexuality) and the other must talk about his skin (which talk he links to possessing his “maleness”).

To the degree that other orders are linked to the way the body is inscribed, and to the degree that the link is sealed by rules of silence, the first stuttering questioning of those orders must always begin by breaking the seal and speaking about the body. Where obscene speech has such roots it is worth defending, and those who would suppress it court a subtle but serious danger. They are like the gods who would bind Loki, for this suppression hobbles the imagination that copes with the shifting and contingent nature of things, and so invites apocalyptic change where something more playful would have sufficed. Better to let trickster steal the shame covers now and then. Better to let Coyote have a ride in the Sun-god’s lodge. Better to let Monkey come on your journey to the West.

* * *

“Disseminated Volition in the New Testament Gospels”
by Andrew Stehlik
The Jaynesian (Vol. 3, Issue 1)

It is well known that many words for inner spiritual motions and emotions are actually metaphors derived from primitive (outward) physiological observations. Brief reference to any good dictionary that includes etymology can corroborate this conclusion.

Julian Jaynes in The Origin of Consciousness in the Breakdown of the Bicameral Mind dedicated a whole chapter to this theme — looking forward through the Iliad (pp. 257– 272). He concentrates on seven words: thumos, phrenes, noos, psyche, kradie, ker, and etor.

Julian Jaynes recognized that these and other similar body based, physiological or anatomical metaphors (in almost any language) are actually more than simple linguistic metaphors and that they played an important role in the breakdown of bicameralism and the development of consciousness. Different forms of stress and anxiety trigger different physiological responses. Observations of these responses were used in naming and creating hypostases and metaphors useful in the terminology of introspection and the development of consciousness. […]

In the New Testament Gospels (therefore quite late in the historical process — the second half of the first century CE) I recently recognized an interesting phenomenon which could be part of this process, or, even better, a pathological deviation along this process.

Once in the gospel of Mark (9:42–48) and twice in the gospel of Matthew (5:27–30 and 18:6–10), Jesus is supposed to utter an almost identical saying. In this saying, individual parts of the body (eyes, hands, feet) are given the ability of independent volition. They can inform the acting of the whole person. The saying suggests, further, that when the influence (instructions, independent volition) of these body parts is perceived as dangerous or harmful, they should be silenced by cutting them off to protect the integrity of the rest of the body.

All academic theological literature known to me takes these sayings as high literary metaphors. Frequent references are made to biology and medicine, where amputation is used as a last resort in serious conditions.

Completely unrecognized is the whole presumption of this saying, according to which individual body parts could possess independent volition and, as such, can inform (sway/direct) the acting of the whole body. Even more serious is the presumption that self-mutilation can stop or somehow influence higher mental processes. Even a person who is not a trained psychologist or psychiatrist can recognize that we are dealing with a seriously pathological state of mind. […]

Already at the time of its recording in the gospels, this saying was perceived as anomalous. Luke, the most educated and refined of the synoptic authors, preserved the immediate context but edited out most of the peculiar parts concerning disseminated volition and self-mutilation.

Further and broader contexts which may be mentioned and discussed: other Greek and Hebrew physiological and anatomical metaphors; the popularity of a metaphor of the body for structuring and functioning of society in Hellenism; the ancient practice of religious self-mutilation; the potential for facilitating our understanding of brutish penal codes or modern self-mutilations.

* * *

The Monstrous, the Impure, & the Imaginal
The Haunted Moral Imagination

Inconsistency of Burkean Conservatism
On Truth and Bullshit
Poised on a Knife Edge
“Why are you thinking about this?”