“Lack of the historical sense is the traditional defect in all philosophers.”

Human, All Too Human: A Book for Free Spirits
by Friedrich Wilhelm Nietzsche

The Traditional Error of Philosophers.—All philosophers make the common mistake of taking contemporary man as their starting point and of trying, through an analysis of him, to reach a conclusion. “Man” involuntarily presents himself to them as an aeterna veritas, as a passive element in every hurly-burly, as a fixed standard of things. Yet everything uttered by the philosopher on the subject of man is, in the last resort, nothing more than a piece of testimony concerning man during a very limited period of time. Lack of the historical sense is the traditional defect in all philosophers. Many innocently take man in his most childish state, as fashioned through the influence of certain religious and even of certain political developments, as the permanent form under which man must be viewed. They will not learn that man has evolved, that the intellectual faculty itself is an evolution, whereas some philosophers make the whole cosmos out of this intellectual faculty. But everything essential in human evolution took place aeons ago, long before the four thousand years or so of which we know anything: during these man may not have changed very much. However, the philosopher ascribes “instinct” to contemporary man and assumes that this is one of the unalterable facts regarding man himself, and hence affords a clue to the understanding of the universe in general. The whole teleology is so planned that man during the last four thousand years shall be spoken of as a being existing from all eternity, and with reference to whom everything in the cosmos from its very inception is naturally ordered. Yet everything evolved: there are no eternal facts, just as there are no absolute truths. Accordingly, historical philosophising is henceforth indispensable, and with it honesty of judgment.

What Locke Lacked
by Louise Mabille

Locke is indeed a Colossus of modernity, but one whose twin projects of providing a concept of human understanding and a political foundation undermine each other. The specificity of the experience of perception alone undermines the universality and uniformity necessary to create the subject required for a justifiable liberalism. Since mere physical perspective can generate so much difference, it is only to be expected that political differences would be even more glaring. However, no political order would ever come to pass without obliterating essential differences. The birth of liberalism was as violent as the Empire that would later be justified in its name, even if its political traces are not so obvious. To interpret is to see in a particular way, at the expense of all other possibilities of interpretation. Perspectives that do not fit are simply ignored, or, as that other great resurrectionist of modernity, Freud, would concur, simply driven underground. We ourselves are the source of this interpretative injustice, or, more correctly, our need for a world in which it is possible to live is. To a certain extent, then, man is the measure of the world, but only of his world. Man is thus a contingent measure, and our measurements do not refer to an original, underlying reality. What we call reality is the result not only of our limited perspectives upon the world, but of the interplay of those perspectives themselves. The liberal subject is thus a result of, and not a foundation for, the experience of reality. The subject is identified as the origin of meaning only through a process of differentiation and reduction, a course through which the will is designated as a psychological property.

Locke takes the existence of the subject of free will – free to exercise political choice such as rising against a tyrant, choosing representatives, or deciding upon political direction – simply for granted. Furthermore, he seems to think that everyone should agree as to what the rules are according to which these events should happen. For him, the liberal subject underlying these choices is clearly fundamental and universal.

Locke’s philosophy of individualism posits the existence of a discrete and isolated individual, with private interests and rights, independent of his linguistic or socio-historical context. C. B. Macpherson identifies a distinctly possessive quality to Locke’s individualist ethic, notably in the way in which the individual is conceived as proprietor of his own personhood, possessing capacities such as self-reflection and free will. Freedom becomes associated with possession, which the Greeks would associate with slavery, and society is conceived in terms of a collection of free and equal individuals who are related to each other through their means of achieving material success – which Nietzsche, too, would associate with slave morality. […]

There is a central tenet to John Locke’s thinking that, as conventional as it has become, remains a strange strategy. Like Thomas Hobbes, he justifies modern society by contrasting it with an original state of nature. For Hobbes, as we have seen, the state of nature is but a hypothesis, a conceptual tool in order to elucidate a point. For Locke, however, the state of nature is a very real historical event, although not a condition of a state of war. Man was social by nature, rational and free. Locke drew his inspiration from Richard Hooker’s Laws of Ecclesiastical Polity, notably from his idea that church government should be based upon human nature, and not the Bible, which, according to Hooker, told us nothing about human nature. The social contract is a means to escape from nature, friendlier though it be on the Lockean account. For Nietzsche, however, we have never made the escape: we are still holus-bolus in it: ‘being conscious is in no decisive sense the opposite of the instinctive – most of the philosopher’s conscious thinking is secretly directed and compelled into definite channels by his instincts. Behind all logic, too, and its apparent autonomy, there stand evaluations’ (BGE, 3). Locke makes a singular mistake in thinking the state of nature a distant event. In fact, Nietzsche tells us, we have never left it. We now only wield more sophisticated weapons, such as the guilty conscience […]

Truth originates when humans forget that they are ‘artistically creating subjects’ or products of law or stasis and begin to attach ‘invincible faith’ to their perceptions, thereby creating truth itself. For Nietzsche, the key to understanding the ethic of the concept, the ethic of representation, is conviction […]

Few convictions have proven to be as strong as the conviction of the existence of a fundamental subjectivity. For Nietzsche, it is an illusion, a bundle of drives loosely collected under the name of ‘subject’ —indeed, it is nothing but these drives, willing, and actions in themselves—and it cannot appear as anything else except through the seduction of language (and the fundamental errors of reason petrified in it), which understands and misunderstands all action as conditioned by something which causes actions, by a ‘Subject’ (GM I 13). Subjectivity is a form of linguistic reductionism, and when using language, ‘[w]e enter a realm of crude fetishism when we summon before consciousness the basic presuppositions of the metaphysics of language — in plain talk, the presuppositions of reason. Everywhere reason sees a doer and doing; it believes in will as the cause; it believes in the ego, in the ego as being, in the ego as substance, and it projects this faith in the ego-substance upon all things — only thereby does it first create the concept of ‘thing’’ (TI, ‘Reason in Philosophy’ 5). As Nietzsche also states in WP 484, the habit of adding a doer to a deed is a Cartesian leftover that raises more questions than it answers. It is indeed nothing more than an inference according to habit: ‘There is activity, every activity requires an agent, consequently –’ (BGE, 17). Locke himself found the continuous existence of the self problematic, but did not go as far as Hume’s dissolution of the self into a number of ‘bundles’. After all, even if identity shifts occurred behind the scenes, he required a subject with enough unity to be able to enter into the Social Contract. This subject had to be something more than merely an ‘eternal grammatical blunder’ (D, 120), and willing had to be understood as something simple. For Nietzsche, it is ‘above all complicated, something that is a unit only as a word, a word in which the popular prejudice lurks, which has defeated the always inadequate caution of philosophers’ (BGE, 19).

Nietzsche’s critique of past philosophers
by Michael Lacewing

Nietzsche is questioning the very foundations of philosophy. To accept his claims means becoming a new kind of philosopher, one whose ‘taste and inclination’, whose values, are quite different. Throughout his philosophy, Nietzsche is concerned with origins, both psychological and historical. Much of philosophy is usually thought of as an a priori investigation. But if Nietzsche can show, as he thinks he can, that philosophical theories and arguments have a specific historical basis, then they are not, in fact, a priori. What is known a priori should not change from one historical era to the next, nor should it depend on someone’s psychology. Plato’s aim, the aim that defines much of philosophy, is to be able to give complete definitions of ideas – ‘what is justice?’, ‘what is knowledge?’. For Plato, we understand an idea when we have direct knowledge of the Form, which is unchanging and has no history. If our ideas have a history, then the philosophical project of trying to give definitions of our concepts, rather than histories, is radically mistaken. For example, in §186, Nietzsche argues that philosophers have consulted their ‘intuitions’ to try to justify this or that moral principle. But they have only been aware of their own morality, of which their ‘justifications’ are in fact only expressions. Morality and moral intuitions have a history, and are not a priori. There is no one definition of justice or good, and the ‘intuitions’ that we use to defend this or that theory are themselves as historical, as contentious as the theories we give – so they offer no real support. The usual ways philosophers discuss morality misunderstand morality from the very outset. The real issues of understanding morality only emerge when we look at the relation between this particular morality and that. There is no world of unchanging ideas, no truths beyond the truths of the world we experience, nothing that stands outside or beyond nature and history.

GENEALOGY AND PHILOSOPHY

Nietzsche develops a new way of philosophizing, which he calls a ‘morphology and evolutionary theory’ (§23), and later calls ‘genealogy’. (‘Morphology’ means the study of the forms something, e.g. morality, can take; ‘genealogy’ means the historical line of descent traced from an ancestor.) He aims to locate the historical origin of philosophical and religious ideas and show how they have changed over time to the present day. His investigation brings together history, psychology, the interpretation of concepts, and a keen sense of what it is like to live with particular ideas and values. In order to best understand which of our ideas and values are particular to us, not a priori or universal, we need to look at real alternatives. In order to understand these alternatives, we need to understand the psychology of the people who lived with them. And so Nietzsche argues that traditional ways of doing philosophy fail – our intuitions are not a reliable guide to the ‘truth’, to the ‘real’ nature of this or that idea or value. And not just our intuitions, but the arguments, and style of arguing, that philosophers have used are unreliable. Philosophy needs to become, or be informed by, genealogy. A lack of any historical sense, says Nietzsche, is the ‘hereditary defect’ of all philosophers.

MOTIVATIONAL ANALYSIS

Having long kept a strict eye on the philosophers, and having looked between their lines, I say to myself… most of a philosopher’s conscious thinking is secretly guided and channelled into particular tracks by his instincts. Behind all logic, too, and its apparent tyranny of movement there are value judgements, or to speak more clearly, physiological demands for the preservation of a particular kind of life. (§3)

A person’s theoretical beliefs are best explained, Nietzsche thinks, by evaluative beliefs, particular interpretations of certain values, e.g. that goodness is this and the opposite of badness. These values are best explained as ‘physiological demands for the preservation of a particular kind of life’. Nietzsche holds that each person has a particular psychophysical constitution, formed by both heredity and culture. […] Different values, and different interpretations of these values, support different ways of life, and so people are instinctively drawn to particular values and ways of understanding them. On the basis of these interpretations of values, people come to hold particular philosophical views. §2 has given us an illustration of this: philosophers come to hold metaphysical beliefs about a transcendent world, the ‘true’ and ‘good’ world, because they cannot believe that truth and goodness could originate in the world of normal experience, which is full of illusion, error, and selfishness. Therefore, there ‘must’ be a pure, spiritual world and a spiritual part of human beings, which is the origin of truth and goodness.

Philosophy and values

But ‘must’ there be a transcendent world? Or is this just what the philosopher wants to be true? Every great philosophy, claims Nietzsche, is ‘the personal confession of its author’ (§6). The moral aims of a philosophy are the ‘seed’ from which the whole theory grows. Philosophers pretend that their opinions have been reached by ‘cold, pure, divinely unhampered dialectic’ when in fact they are seeking reasons to support their pre-existing commitment to ‘a rarefied and abstract version of their heart’s desire’ (§5), viz. that there is a transcendent world, and that good and bad, true and false are opposites. Consider: many philosophical systems are of doubtful coherence, e.g. how could there be Forms, and if there were, how could we know about them? Or again, in §11, Nietzsche asks ‘how are synthetic a priori judgments possible?’. The term ‘synthetic a priori’ was invented by Kant. According to Nietzsche, Kant says that such judgments are possible because we have a ‘faculty’ that makes them possible. What kind of answer is this? Furthermore, no philosopher has ever been proved right (§25). Given the great difficulty of believing either in a transcendent world or in the human cognitive abilities necessary to know about it, we should look elsewhere for an explanation of why someone would hold those beliefs. We can find an answer in their values. There is an interesting structural similarity between Nietzsche’s argument and Hume’s. Both argue that there is no rational explanation of many of our beliefs, and so they try to find the source of these beliefs outside or beyond reason. Hume appeals to imagination and the principle of ‘Custom’; Nietzsche appeals instead to motivation and ‘the bewitchment of language’ (see below). So Nietzsche argues that philosophy is not driven by a pure ‘will to truth’ (§1), to discover the truth whatever it may be. Instead, a philosophy interprets the world in terms of the philosopher’s values.
For example, the Stoics argued that we should live ‘according to nature’ (§9). But they interpret nature by their own values, as an embodiment of rationality. They do not see the senselessness, the purposelessness, the indifference of nature to our lives […]

THE BEWITCHMENT OF LANGUAGE

We said above that Nietzsche criticizes past philosophers on two grounds. We have looked at the role of motivation; the second ground is the seduction of grammar. Nietzsche is concerned with the subject-predicate structure of language, and with it the notion of a ‘substance’ (picked out by the grammatical ‘subject’) to which we attribute ‘properties’ (identified by the predicate). This structure leads us into a mistaken metaphysics of ‘substances’. In particular, Nietzsche is concerned with the grammar of ‘I’. We tend to think that ‘I’ refers to some thing, e.g. the soul. Descartes makes this mistake in his cogito – ‘I think’, he argues, refers to a substance engaged in an activity. But Nietzsche repeats the old objection that this is an illegitimate inference (§16) that rests on many unproven assumptions – that I am thinking, that some thing is thinking, that thinking is an activity (the result of a cause, viz. I), that an ‘I’ exists, that we know what it is to think. So the simple sentence ‘I think’ is misleading. In fact, ‘a thought comes when ‘it’ wants to, and not when ‘I’ want it to’ (§17). Even ‘there is thinking’ isn’t right: ‘even this ‘there’ contains an interpretation of the process and is not part of the process itself. People are concluding here according to grammatical habit’. But our language does not allow us just to say ‘thinking’ – this is not a whole sentence. We have to say ‘there is thinking’; so grammar constrains our understanding. Furthermore, Kant shows that rather than the ‘I’ being the basis of thinking, thinking is the basis out of which the appearance of an ‘I’ is created (§54). Once we recognise that there is no soul in a traditional sense, no ‘substance’, something constant through change, something unitary and immortal, ‘the way is clear for new and refined versions of the hypothesis about the soul’ (§12), that it is mortal, that it is multiplicity rather than identical over time, even that it is a social construct and a society of drives.

Nietzsche makes a similar argument about the will (§19). Because we have this one word ‘will’, we think that what it refers to must also be one thing. But the act of willing is highly complicated. First, there is an emotion of command, for willing is commanding oneself to do something, and with it a feeling of superiority over that which obeys. Second, there is the expectation that the mere commanding on its own is enough for the action to follow, which increases our sense of power. Third, there is obedience to the command, from which we also derive pleasure. But we ignore the feeling of compulsion, identifying the ‘I’ with the commanding ‘will’.

Nietzsche links the seduction of language to the issue of motivation in §20, arguing that ‘the spell of certain grammatical functions is the spell of physiological value judgements’. So even the grammatical structure of language originates in our instincts, different grammars contributing to the creation of favourable conditions for different types of life. So what values are served by these notions of the ‘I’ and the ‘will’? The ‘I’ relates to the idea that we have a soul, which participates in a transcendent world. It functions in support of the ascetic ideal. The ‘will’, and in particular our inherited conception of ‘free will’, serves a particular moral aim.

Hume and Nietzsche: Moral Psychology (short essay)
by epictetus_rex

1. Metaphilosophical Motivation

Both Hume and Nietzsche advocate a kind of naturalism. This is a weak naturalism, for it does not seek to give science authority over philosophical inquiry, nor does it commit itself to a specific ontological or metaphysical picture. Rather, it seeks to (a) place the human mind firmly in the realm of nature, as subject to the same mechanisms that drive all other natural events, and (b) investigate the world in a way that is roughly congruent with our best current conception(s) of nature […]

Furthermore, the motivation for this general position is common to both thinkers. Hume and Nietzsche saw old rationalist/dualist philosophies as both absurd and harmful: such systems were committed to extravagant and contradictory metaphysical claims which hindered philosophical progress. They also alienated humanity from its position in nature—an effect Hume referred to as “anxiety”—and underpinned religious or “monkish” practices which greatly accentuated this alienation. Both Nietzsche and Hume believe quite strongly that coming to see ourselves as we really are will banish these bugbears from human life.

To this end, both thinkers ask us to engage in honest, realistic psychology. “Psychology is once more the path to the fundamental problems,” writes Nietzsche (BGE 23), and Hume agrees:

the only expedient, from which we can hope for success in our philosophical researches, is to leave the tedious lingering method, which we have hitherto followed, and instead of taking now and then a castle or village on the frontier, to march up directly to the capital or center of these sciences, to human nature itself. (T Intro)

2. Selfhood

Hume and Nietzsche militate against the notion of a unified self, both at-a-time and, a fortiori, over time.

Hume’s quest for a Newtonian “science of the mind” led him to classify all mental events as either impressions (sensory) or ideas (copies of sensory impressions, distinguished from the former by diminished vivacity or force). The self, or ego, as he says, is just “a kind of theatre, where several perceptions successively make their appearance; pass, re-pass, glide away, and mingle in an infinite variety of postures and situations. There is properly no simplicity in it at one time, nor identity in different; whatever natural propension we may have to imagine that simplicity and identity.” (Treatise 1.4.6) […]

For Nietzsche, the experience of willing lies in a certain kind of pleasure, a feeling of self-mastery and increase of power that comes with all success. This experience leads us to mistakenly posit a simple, unitary cause, the ego. (BGE 19)

The similarities here are manifest: our minds do not have any intrinsic unity to which the term “self” can properly refer; rather, they are collections or “bundles” of events (drives) which may align with or struggle against one another in a myriad of ways. Both thinkers use political models to describe what a person really is. Hume tells us we should “more properly compare the soul to a republic or commonwealth, in which the several members [impressions and ideas] are united by ties of government and subordination, and give rise to persons, who propagate the same republic in the incessant change of its parts” (T 261).

3. Action and The Will

Nietzsche and Hume attack the old Platonic conception of a “free will” in lock-step with one another. This picture, roughly, involves a rational intellect which sits above the appetites and ultimately chooses which appetites will express themselves in action. This will is usually not considered to be part of the natural/empirical order, and it is this consequence which irks both Hume and Nietzsche, who offer two seamlessly interchangeable refutations […]

Since we are nothing above and beyond events, there is nothing for this “free will” to be: it is a causa sui, “a sort of rape and perversion of logic… the extravagant pride of man has managed to entangle itself profoundly and frightfully with just this nonsense” (BGE 21).

When they discover an erroneous or empty concept such as “Free will” or “the self”, Nietzsche and Hume engage in a sort of error-theorizing which is structurally the same. Peter Kail (2006) has called this a “projective explanation”, whereby belief in those concepts is “explained by appeal to independently intelligible features of psychology”, rather than by reference to the way the world really is.

The Philosophy of Mind
INSTRUCTOR: Larry Hauser
Chapter 7: Egos, bundles, and multiple selves

  • Who dat?  “I”
    • Locke: “something, I know not what”
    • Hume: the no-self view … “bundle theory”
    • Kant’s transcendental ego: a formal (nonempirical) condition of thought: that the “I” must accompany every perception.
      • Intentional mental state: I think that snow is white.
        • to think: a relation between
          • a subject = “I”
          • a propositional content thought = snow is white
      • Sensations: I feel the coldness of the snow.
        • to feel: a relation between
          • a subject = “I”
          • a quale = the cold-feeling
    • Friedrich Nietzsche
      • A thought comes when “it” will and not when “I” will. Thus it is a falsification of the evidence to say that the subject “I” conditions the predicate “think.”
      • It is thought, to be sure, but that this “it” should be that old famous “I” is, to put it mildly, only a supposition, an assertion. Above all it is not an “immediate certainty.” … Our conclusion is here formulated out of our grammatical custom: “Thinking is an activity; every activity presumes something which is active, hence ….” 
    • Lichtenberg: “it’s thinking” a la “it’s raining”
      • a mere grammatical requirement
      • no proof of a thinking self

[…]

  • Ego vs. bundle theories (Derek Parfit (1987))
    • Ego: “there really is some kind of continuous self that is the subject of my experiences, that makes decisions, and so on.” (95)
      • Religions: Christianity, Islam, Hinduism
      • Philosophers: Descartes, Locke, Kant & many others (the majority view)
    • Bundle: “there is no underlying continuous and unitary self.” (95)
      • Religion: Buddhism
      • Philosophers: Hume, Nietzsche, Lichtenberg, Wittgenstein, Kripke(?), Parfit, Dennett {a stellar minority}
  • Hume v. Reid
    • David Hume: For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure. I never can catch myself at any time without a perception, and never can observe anything but the perception. (Hume 1739, Treatise I, IV, vi)
    • Thomas Reid: I am not thought, I am not action, I am not feeling: I am something which thinks and acts and feels. (1785)

Arete: History and Etymology

Arete (moral virtue)
Wikipedia

Arete (Greek: ἀρετή), in its basic sense, means “excellence of any kind”.[1] The term may also mean “moral virtue”.[1] In its earliest appearance in Greek, this notion of excellence was ultimately bound up with the notion of the fulfillment of purpose or function: the act of living up to one’s full potential.

The term from Homeric times onwards is not gender specific. Homer applies the term to both the Greek and Trojan heroes as well as major female figures, such as Penelope, the wife of the Greek hero Odysseus. In the Homeric poems, Arete is frequently associated with bravery, but more often with effectiveness. The man or woman of Arete is a person of the highest effectiveness; they use all their faculties—strength, bravery and wit—to achieve real results. In the Homeric world, then, Arete involves all of the abilities and potentialities available to humans.

In some contexts, Arete is explicitly linked with human knowledge, where the expressions “virtue is knowledge” and “Arete is knowledge” are used interchangeably. The highest human potential is knowledge and all other human abilities are derived from this central capacity. If Arete is knowledge and study, the highest human knowledge is knowledge about knowledge itself; in this light, the theoretical study of human knowledge, which Aristotle called “contemplation”, is the highest human ability and happiness.[2]

History

The Ancient Greeks applied the term to anything: for example, the excellence of a chimney, the excellence of a bull to be bred and the excellence of a man. The meaning of the word changes depending on what it describes, since everything has its own peculiar excellence; the arete of a man is different from the arete of a horse. This way of thinking comes first from Plato, where it can be seen in the Allegory of the Cave.[3] In particular, the aristocratic class was presumed, essentially by definition, to be exemplary of arete: “The root of the word is the same as aristos, the word which shows superlative ability and superiority, and aristos was constantly used in the plural to denote the nobility.”[4]

By the 5th and 4th centuries BC, arete as applied to men had developed to include quieter virtues, such as dikaiosyne (justice) and sophrosyne (self-restraint). Plato attempted to produce a moral philosophy that incorporated this new usage,[5] but it was in the work of Aristotle that the doctrine of arete found its fullest flowering. Aristotle’s Doctrine of the Mean is a paradigm example of his thinking.

Arete has also been used by Plato when talking about athletic training and also the education of young boys. Stephen G. Miller delves into this usage in his book “Ancient Greek Athletics”. Aristotle is quoted as deliberating between education towards arete “…or those that are theoretical”.[6] Educating towards arete in this sense means that the boy would be educated towards things that are useful in life. However, even Plato himself says that arete is not something that can be agreed upon. He says, “Nor is there even an agreement about what constitutes arete, something that leads logically to a disagreement about the appropriate training for arete.”[7] To say that arete has a common definition of excellence or fulfillment may be an overstatement simply because it was very difficult to pinpoint arete, much less the proper ways to go about obtaining it. […]

Homer

In Homer‘s Iliad and Odyssey, “arete” is used mainly to describe heroes and nobles and their mobile dexterity, with special reference to strength and courage, but it is not limited to this. Penelope‘s arete, for example, relates to co-operation, for which she is praised by Agamemnon. The excellence of the gods generally included their power, but, in the Odyssey (13.42), the gods can grant excellence to a life, which is contextually understood to mean prosperity. Arete was also the name of King Alcinous‘s wife.

According to Bernard Knox‘s notes found in the Robert Fagles translation of The Odyssey, “arete” is also associated with the Greek word for “pray”, araomai.[8]

All Things Shining
by Hubert Dreyfus
pp. 61-63

Homer’s epic poems brought into focus a notion of arete, or excellence in life, that was at the center of the Greek understanding of human being. Many admirers of Greek culture have attempted to define this notion, but success here requires avoiding two prominent temptations. There is the temptation to patronize that we have already mentioned. But there is also a temptation to read a modern sensibility into Homer’s time. One standard translation of the Greek word arete as “virtue” runs the risk of this kind of retroactive reading: for any attempt to interpret the Homeric Greek notion of human excellence in terms of “virtue”—especially if one hears in this word its typical Christian or even Roman overtones—is bound to go astray. Excellence in the Greek sense involves neither the Christian notion of humility and love nor the Roman ideal of stoic adherence to one’s duty. Instead, excellence in the Homeric world depends crucially on one’s sense of gratitude and wonder.

Nietzsche was one of the first to understand that Homeric excellence bears little resemblance to modern moral agency. His view was that the Homeric world understood nobility in terms of the overpowering strength of noble warriors. The effect of the ensuing Judeo-Christian tradition, on this Nietzschean reading, was to enfeeble the Homeric understanding of excellence by substituting the meekness of the lamb for the strength and power of the noble warrior.

Nietzsche was certainly right that the Homeric tradition valorizes the strong, noble hero; and he was right, too, that in some important sense the Homeric account of excellence is foreign to our basic moralizing assumptions. But there is something that the Nietzschean account leaves out. As Bernard Knox emphasizes, the Greek word arete is etymologically related to the Greek verb “to pray” (araomai). It follows that Homer’s basic account of human excellence involves the necessity of being in an appropriate relationship to whatever is understood to be sacred in the culture. Helen’s greatness, on this interpretation, is not properly measured in terms of the degree to which she is morally responsible for her actions.

What makes Helen great in Homer’s world is her ability to live a life that is constantly responsive to golden Aphrodite, the shining example of the sacred erotic dimension of existence. Likewise, Achilles had a special kind of receptivity to Ares and his warlike way of life; Odysseus had Athena, with her wisdom and cultural adaptability, to look out for him. Presumably, the master craftsmen of Homer’s world worked in the light of Hephaestus’s shining. In order to engage with this understanding of human excellence, we will have to think clearly about how the Homeric Greeks understood themselves. Why would it make sense to describe their lives in relation to the presence and absence of the gods?

Several questions focus this kind of approach. What is the phenomenon that Homer is responding to when he says that a god intervened or in some way took part in an action or event? Is this phenomenon recognizable to us, even if only marginally? And if Homer’s reference to the gods is something other than an attempt to pass off moral responsibility for one’s actions, then what exactly is it? Only by facing these questions head on can we understand whether it is possible—or desirable—to lure back Homer’s polytheistic gods.

The gods are essential to the Homeric Greek understanding of what it is to be a human being at all. As Peisistratus—the son of wise old Nestor—says toward the beginning of the Odyssey, “All men need the gods.” The Greeks were deeply aware of the ways in which our successes and our failures—indeed, our very actions themselves—are never completely under our control. They were constantly sensitive to, amazed by, and grateful for those actions that one cannot perform on one’s own simply by trying harder: going to sleep, waking up, fitting in, standing out, gathering crowds together, holding their attention with a speech, changing their mood, or indeed being filled with longing, desire, courage, wisdom, and so on. Homer sees each of these achievements as a particular god’s gift. To say that all men need the gods therefore is to say, in part at least, that we are the kinds of beings who are at our best when we find ourselves acting in ways that we cannot—and ought not—entirely take credit for.

The Discovery of the Mind
by Bruno Snell
pp. 158-160

The words for virtue and good, arete and agathos, are at first by no means clearly distinguished from the area of profit. In the early period they are not as palpably moral in content as might be supposed; we may compare the German terms Tugend and gut, which originally stood for the ‘suitable’ (taugende) and the ‘fitting’ (cf. Gatte). When Homer says that a man is good, agathos, he does not mean thereby that he is morally unobjectionable, much less good-hearted, but rather that he is useful, proficient, and capable of vigorous action. We also speak of a good warrior or a good instrument. Similarly arete, virtue, does not denote a moral property but nobility, achievement, success and reputation. And yet these words have an unmistakable tendency toward the moral because, unlike ‘happiness’ or ‘profit’, they designate qualities for which a man may win the respect of his whole community. Arete is ‘ability’ and ‘achievement’, characteristics which are expected of a ‘good’, an ‘able’ man, an aner agathos. From Homer to Plato and beyond these words spell out the worth of a man and his work. Any change in their meaning, therefore, would indicate a reassessment of values. It is possible to show how at various times the formation and consolidation of social groups and even of states was connected with people’s ideas about the ‘good’. But that would be tantamount to writing a history of Greek culture. In Homer, to possess ‘virtue’ or to be ‘good’ means to realize one’s nature, and one’s wishes, to perfection. Frequently happiness and profit form the reward, but it is no such extrinsic prospect which leads men to virtue and goodness. The expressions contain a germ of the notion of entelechy. A Homeric hero, for instance, is capable of ‘reminding himself’, or of ‘experiencing’, that he is noble. ‘Use your experience to become what you are’, advises Pindar, who adheres to this image of arete. The ‘good’ man fulfils his proper function, prattei ta heautou, as Plato demands it; he achieves his own perfection. And in the early period this also entails that he is good in the eyes of others, for the notions and definitions of goodness are plain and uniform: a man appears to others as he is.

In the Iliad (11.404–410) Odysseus reminds himself that he is an aristocrat, and thereby resolves his doubts about how he should conduct himself in a critical situation. He does it by concentrating on the thought that he belongs to a certain social order, and that it is his duty to fulfill the ‘virtue’ of that order. The universal which underlies the predication ‘I am a noble’ is the group; he does not reflect on an abstract ‘good’ but upon the circle of which he claims membership. It is the same as if an officer were to say: ‘As an officer I must do this or that,’ thus gauging his action by the rigid conception of honour peculiar to his caste.

Aretan is ‘to thrive’; arete is the objective which the early nobles attach to achievement and success. By means of arete the aristocrat implements the ideal of his order—and at the same time distinguishes himself above his fellow nobles. With his arete the individual subjects himself to the judgment of his community, but he also surpasses it as an individual. Since the days of Jacob Burckhardt the competitive character of the great Greek achievements has rightly been stressed. Well into the classical period, those who compete for arete are remunerated with glory and honour. The community puts its stamp of approval on the value which the individual sets on himself. Thus honour, timē, is even more significant than arete for the growth of the moral consciousness, because it is more evident, more palpable to all. From his earliest boyhood the young nobleman is urged to think of his glory and his honour; he must look out for his good name, and he must see to it that he commands the necessary respect. For honour is a very sensitive plant; wherever it is destroyed the moral existence of the loser collapses. Its importance is greater even than that of life itself; for the sake of glory and honour the knight is prepared to sacrifice his life.

pp. 169-172

The truth of the matter is that it was not the concept of justice but that of arete which gave rise to the call for positive individual achievement, the moral imperative which the early Greek community enjoins upon its members who in turn acknowledge it for themselves. A man may have purely egotistical motives for desiring virtue and achievement, but his group gives him considerably more credit for these ideals than if he were to desire profit or happiness. The community expects, and even demands, arete. Conversely a man who accomplishes a high purpose may convince himself so thoroughly that his deed serves the interests of a supra-personal, a universal cause that the alternative of egotism or altruism becomes irrelevant. What does the community require of the individual? What does the individual regard as universal, as eternal? These, in the archaic age, are the questions about which the speculations on arete revolve.

The problem remains simple as long as the individual cherishes the same values as the rest of his group. Given this condition, even the ordinary things in life are suffused with an air of dignity, because they are part of custom and tradition. The various daily functions, such as rising in the morning and the eating of meals, are sanctified by prayer and sacrifice, and the crucial events in the life of man—birth, marriage, burial—are for ever fixed and rooted in the rigid forms of cult. Life bears the imprint of a permanent authority which is divine, and all activity is, therefore, more than just personal striving. No one doubts the meaning of life; the hallowed tradition is carried on with implicit trust in the holy wisdom of its rules. In such a society, if a man shows unusual capacity he is rewarded as a matter of course. In Homer a signal achievement is, as one would expect, also honoured with a special permanence, through the song of the bard which outlasts the deed celebrated and preserves it for posterity. This simple concept is still to be found in Pindar’s Epinicians. The problem of virtue becomes more complex when the ancient and universally recognized ideal of chivalry breaks down. Already in Homeric times a differentiation sets in. As we have seen in the story of the quarrel over the arms of Achilles, the aretai become a subject for controversy. The word arete itself contains a tendency toward the differentiation of values, since it is possible to speak of the virtues of various men and various things. As more sections of society become aware of their own merit, they are less willing to conform to the ideal of the once-dominant class. It is discovered that the ways of men are diverse, and that arete may be attained in all sorts of professions. Whereas aristocratic society had been held together, not to say made possible, by a uniform notion of arete, people now begin to ask what true virtue is. The crisis of the social system is at the same time the crisis of an ideal, and thus of morality. Archilochus says (fr. 41) that different men have their hearts quickened in various ways. But he also states, elaborating a thought which first crops up in the Odyssey: the mind of men is as Zeus ushers in each day, and they think whatever they happen to hit upon (fr. 68). One result of this splitting up of the various forms of life is a certain failure of nerve. Man begins to feel that he is changeable and exposed to many variable forces. This insight deepens the moral reflexions of the archaic period; the search for the good becomes a search for the permanent.

The topic of the virtues is especially prominent in the elegy. Several elegiac poets furnish lists of the various aretai which they exemplify by means of well-known myths. Their purpose is to clarify for themselves their own attitudes toward the conflicting standards of life. Theognis (699 ff.) stands at the end of this development; with righteous indignation he complains that the masses no longer have eyes for anything except wealth. For him material gain has, in contrast with earlier views, become an enemy of virtue.

The first to deal with this general issue is Tyrtaeus. His call to arms pronounces the Spartan ideal; perhaps he was the one to formulate that ideal for the first time. Nothing matters but the bravery of the soldier fighting for his country. Emphatically he rejects all other accomplishments and virtues as secondary: the swiftness of the runner in the arena, or the strength of the wrestler, or again physical beauty, wealth, royal power, and eloquence, are as nothing before bravery. In the Iliad also a hero best proves his virtue by standing firm against the enemy, but that is not his only proof; the heroic figures of Homer dazzle us precisely because of their richness in human qualities. Achilles is not only brave but also beautiful, ‘swift of foot’, he knows how to sing, and so forth. Tyrtaeus sharply reduces the scope of the older arete; what is more, he goes far beyond Homer in magnifying the fame of fortitude and the ignominy which awaits the coward. Of the fallen he actually says that they acquire immortality (9.32). This one-sidedness is due to the fact that the community has redoubled its claim on the individual; Sparta in particular taxed the energies of its citizenry to the utmost during the calamitous period of the Messenian wars. The community is a thing of permanence for whose sake the individual mortal has to lay down his life, and in whose memory lies his only chance for any kind of survival. Even in Tyrtaeus, however, these claims of the group do not lead to a termite morality. Far from prescribing a blind and unthinking service to the whole, or a spirit of slavish self-sacrifice, Tyrtaeus esteems the performance of the individual as a deed worthy of fame. This is a basic ingredient of arete which, in spite of countless shifts and variations, is never wholly lost.

Philosophy Before Socrates
by Richard D. McKirahan
pp. 366-369

Aretē and Agathos

These two basic concepts of Greek morality are closely related and not straightforwardly translatable into English. As an approximation, aretē can be rendered “excellence” or “goodness” (sometimes “virtue”), and agathos as “excellent” or “good.” The terms are related in that a thing or person is agathos if and only if it has aretē and just because it has aretē. The concepts apply to objects, conditions, and actions as well as to humans. They are connected with the concept of ergon (plural, erga), which may be rendered as “function” or “characteristic activity.” A good (agathos) person is one who performs human erga well, and similarly a good knife is a knife that performs the ergon of a knife well. The ergon of a knife is cutting, and an agathos knife is one that cuts well. Thus, the aretē of a knife is the qualities or characteristics a knife must have in order to cut well. Likewise, if a human ergon can be identified, an agathos human is one who can and on appropriate occasions does perform that ergon well, and human aretē is the qualities or characteristics that enable him or her to do so. The classical discussion of these concepts occurs after our period, in Aristotle, but he is only making explicit ideas that go back to Homer and which throw light on much of the pre-philosophical ethical thought of the Greeks.

This connection of concepts makes it automatic, virtually an analytic truth, that the right goal for a person—any person—is to be or become agathos. Even if that goal is unreachable for someone, the aretē–agathos standard still stands as an ideal against which to measure one’s successes and failures. However, there is room for debate over the nature of human erga, both whether there is a set of erga applicable to all humans and relevant to aretē and, supposing that there is such a set of erga, what those erga are. The existence of the aretē–agathos standard makes it vitally important to settle these issues, for otherwise human life is left adrift with no standards of conduct. […]

The moral scene Homer presents is appropriate to the society it represents and quite alien to our own. It is the starting point for subsequent moral speculation which no one in the later Greek tradition could quite forget. The development of Greek moral thought through the Archaic and Classical periods can be seen as the gradual replacement of the competitive by the cooperative virtues as the primary virtues of conduct and as the increasing recognition of the significance of people’s intentions as well as their actions.

Rapid change in Greek society in the Archaic and Classical periods called for new conceptions of the ideal human and the ideal human life and activities. The Archaic period saw different kinds of rulers from the Homeric kings, and individual combat gave way to the united front of a phalanx of hoplites (heavily armed warriors). Even though the Homeric warrior-king was no longer a possible role in society, the qualities of good birth, beauty, courage, honor, and the abilities to give good counsel and rule well remained. Nevertheless, the various strands of the Homeric heroic ideal began to unravel. In particular, good birth, wealth, and fighting ability no longer automatically went together. This situation forced the issue: what are the best qualities we can possess? What constitutes human aretē? The literary sources contain conflicting claims about the best life for a person, the best kind of person to be, and the relative merits of qualities thought to be ingredients of human happiness. In one way or another these different conceptions of human excellence have Homeric origins, though they diverge from Homer’s conception and from one another.

Lack of space makes it impossible to present the wealth of materials that bear on this subject. I will confine discussion to two representatives of the aristocratic tradition who wrote at the end of the Archaic period. Pindar shows how the aristocratic ideal had survived and been transformed from the Homeric conception and how vital it remained as late as the early fifth century, and Theognis reveals how social, political, and economic reality was undermining that ideal.

p. 374

The increase in wealth and the shift in its distribution which had begun by the seventh century led to profound changes in the social and political scenes in the sixth and forced a wedge in among the complex of qualities which traditionally constituted aristocratic aretē. Pindar’s unified picture in which wealth, power, and noble birth tend to go together became ever less true to contemporary reality.

The aristocratic response to this changed situation receives its clearest expression in the poems attributed to Theognis and composed in the sixth and early fifth centuries. Even less than with Pindar can we find a consistent set of views advocated in these poems, but among the most frequently recurring themes are the view that money does not make the man, that many undeserving people are now rich and many deserving people (deserving because of their birth and social background) are now poor. It is noteworthy how Theognis plays on the different connotations of uses of the primary terms of value, agathos and aretē, and their opposites kakos and kakia: morally good vs. evil; well-born, noble vs. low-born; and politically and socially powerful vs. powerless. Since the traditional positive attributes no longer regularly all went together, it was important to decide which are most important, indeed which are the essential ingredients of human aretē.

pp. 379-382

In short, Protagoras taught his students how to succeed in public and private life. What he claimed to teach is, in a word, aretē. That this was his boast follows from the intimate connection between agathos and aretē as well as from the fact that a person with aretē is one who enjoys success, as measured by current standards. Anyone with the abilities Protagoras claimed to teach had the keys to a successful life in fifth-century Athens.

In fact, the key to success was rhetoric, the art of public speaking, which has a precedent in the heroic conception of aretē, which included excellence in counsel. But the Sophists’ emphasis on rhetoric must not be understood as hearkening back to Homeric values. Clear reasons why success in life depended on the ability to speak well in public can be found in fifth-century politics and society. […]

That is not to say that every kind of success depended on rhetoric. It could not make you successful in a craft like carpentry and would not on its own make you a successful military commander. Nor is it plausible that every student of Protagoras could have become another Pericles. Protagoras acknowledged that natural aptitude was required over and above diligence. […] Protagoras recognized that he could not make a silk purse out of a sow’s ear, but he claimed to be able to develop a (sufficiently young) person’s abilities to the greatest extent possible.

Pericles was an effective counselor in part because he could speak well but also by dint of his personality, experience, and intelligence. To a large extent these last three factors cannot be taught, but rhetoric can be offered as a tekhnē, a technical art or skill which has rules of its own and which can be instilled through training and practice. In these ways rhetoric is like medicine, carpentry, and other technical arts, but it is different in its seemingly universal applicability. Debates can arise on any conceivable subject, including technical ones, and rhetorical skill can be turned to the topic at hand whatever it may be. The story goes that Gorgias used his rhetorical skill to convince medical patients to undergo surgery when physicians failed to persuade them. Socrates turned the tables on the Sophists, arguing that if rhetoric has no specific subject matter, then so far from being a universal art, it should not be considered an art at all. And even if we grant that rhetoric is an art that can be taught, it remains controversial whether aretē can be taught and in what aretē consists. […]

The main charges against the Sophists are of two different sorts. First, there is the charge of prostituting themselves. Plato emphasizes the money-making aspect of the Sophist’s work, which he uses as one of his chief criteria for determining that Socrates was not a Sophist. This charge contains two elements: the Sophists teach aretē for money, and they teach it to anyone who pays. Both elements have aristocratic origins. Traditionally aretē was learned from one’s family and friends and came as the result of a long process of socialization beginning in infancy. Such training and background can hardly be bought. Further, according to the aristocratic mentality most people are not of the right type, the appropriate social background, to aspire to aretē.

Lila
by Robert Pirsig
pp. 436-442

Digging back into ancient Greek history, to the time when this mythos-to-logos transition was taking place, Phædrus noted that the ancient rhetoricians of Greece, the Sophists, had taught what they called aretê, which was a synonym for Quality. Victorians had translated aretê as “virtue” but Victorian “virtue” connoted sexual abstinence, prissiness and a holier-than-thou snobbery. This was a long way from what the ancient Greeks meant. The early Greek literature, particularly the poetry of Homer, showed that aretê had been a central and vital term.

With Homer Phædrus was certain he’d gone back as far as anyone could go, but one day he came across some information that startled him. It said that by following linguistic analysis you could go even further back into the mythos than Homer. Ancient Greek was not an original language. It was descended from a much earlier one, now called the Proto-Indo-European language. This language has left no fragments but has been derived by scholars from similarities between such languages as Sanskrit, Greek and English which have indicated that these languages were fallouts from a common prehistoric tongue. After thousands of years of separation from Greek and English the Hindi word for “mother” is still “Ma.” Yoga both looks like and is translated as “yoke.” The reason an Indian rajah’s title sounds like “regent” is because both terms are fallouts from Proto-Indo-European. Today a Proto-Indo-European dictionary contains more than a thousand entries with derivations extending into more than one hundred languages.

Just for curiosity’s sake Phædrus decided to see if aretê was in it. He looked under the “a” words and was disappointed to find it was not. Then he noted a statement that said that the Greeks were not the most faithful to the Proto-Indo-European spelling. Among other sins, the Greeks added the prefix “a” to many of the Proto-Indo-European roots. He checked this out by looking for aretê under “r.” This time a door opened.

The Proto-Indo-European root of aretê was the morpheme rt. There, beside aretê, was a treasure room of other derived “rt” words: “arithmetic,” “aristocrat,” “art,” “rhetoric,” “worth,” “rite,” “ritual,” “wright,” “right (handed)” and “right (correct).” All of these words except arithmetic seemed to have a vague thesaurus-like similarity to Quality. Phædrus studied them carefully, letting them soak in, trying to guess what sort of concept, what sort of way of seeing the world, could give rise to such a collection.

When the morpheme appeared in aristocrat and arithmetic the reference was to “firstness.” Rt meant first. When it appeared in art and wright it seemed to mean “created” and “of beauty.” “Ritual” suggested repetitive order. And the word right has two meanings: “right-handed” and “moral and esthetic correctness.” When all these meanings were strung together a fuller picture of the rt morpheme emerged. Rt referred to the “first, created, beautiful repetitive order of moral and esthetic correctness.” […]

There was just one thing wrong with this Proto-Indo-European discovery, something Phædrus had tried to sweep under the carpet at first, but which kept creeping out again. The meanings, grouped together, suggested something different from his interpretation of aretê. They suggested “importance” but it was an importance that was formal and social and procedural and manufactured, almost an antonym to the Quality he was talking about. Rt meant “quality” all right but the quality it meant was static, not Dynamic. He had wanted it to come out the other way, but it looked as though it wasn’t going to do it. Ritual. That was the last thing he wanted aretê to turn out to be. Bad news. It looked as though the Victorian translation of aretê as “virtue” might be better after all since “virtue” implies ritualistic conformity to social protocol. […]

Rta. It was a Sanskrit word, and Phædrus remembered what it meant: Rta was the “cosmic order of things.” Then he remembered he had read that the Sanskrit language was considered the most faithful to the Proto-Indo-European root, probably because the linguistic patterns had been so carefully preserved by the Hindu priests. […]

Rta, from the oldest portion of the Rg Veda, which was the oldest known writing of the Indo-Aryan language. The sun god, Sūrya, began his chariot ride across the heavens from the abode of rta. Varuna, the god for whom the city in which Phædrus was studying was named, was the chief support of rta.

Varuna was omniscient and was described as ever witnessing the truth and falsehood of men—as being “the third whenever two plot in secret.” He was essentially a god of righteousness and a guardian of all that is worthy and good. The texts had said that the distinctive feature of Varuna was his unswerving adherence to high principles. Later he was overshadowed by Indra who was a thunder god and destroyer of the enemies of the Indo-Aryans. But all the gods were conceived as “guardians of rta,” willing the right and making sure it was carried out.

One of Phædrus’s old school texts, written by M. Hiriyanna, contained a good summary: “Rta, which etymologically stands for ‘course,’ originally meant ‘cosmic order,’ the maintenance of which was the purpose of all the gods; and later it also came to mean ‘right,’ so that the gods were conceived as preserving the world not merely from physical disorder but also from moral chaos. The one idea is implicit in the other: and there is order in the universe because its control is in righteous hands.…”

The physical order of the universe is also the moral order of the universe. Rta is both. This was exactly what the Metaphysics of Quality was claiming. It was not a new idea. It was the oldest idea known to man.

This identification of rta and aretê was enormously valuable, Phædrus thought, because it provided a huge historical panorama in which the fundamental conflict between static and Dynamic Quality had been worked out. It answered the question of why aretê meant ritual. Rta also meant ritual. But unlike the Greeks, the Hindus in their many thousands of years of cultural evolution had paid enormous attention to the conflict between ritual and freedom. Their resolution of this conflict in the Buddhist and Vedantist philosophies is one of the profound achievements of the human mind.

Pagan Ethics: Paganism as a World Religion
by Michael York
pp. 59-60

Pirsig contends that Plato incorporated the arete of the Sophists into his dichotomy between ideas and appearances — where it was subordinated to Truth. Once Plato identifies the True with the Good, arete’s position is usurped by “dialectically determined truth.” This, in turn, allows Plato to demote the Good to a lower order and minor branch of knowledge. For Pirsig, the Sophists were those Greek philosophers who exalted quality over truth; they were the true champions of arete or excellence. With a pagan quest for the ethical that develops from an idolatrous understanding of the physical, while Aristotle remains an important consideration, it is to the Sophists (particularly Protagoras, Prodicus and Pirsig’s understanding of them) and a reconstruction of their underlying humanist position that perhaps the most important answers are to be framed if not found as well.

A basic pagan position is an acceptance of the appetites — in fact, their celebration rather than their condemnation. We find the most unbridled expression of the appetites in the actions of the young. Youth may engage in binge-drinking, vandalism, theft, promiscuity and profligate experimentation. Pagan perspectives may recognize the inherent dangers in these as there are in life itself. But they also trust the overall process of learning. In paganism, morality has a much greater latitude than it does in the transcendental philosophy of a Pythagoras, Plato, or Plotinus: it may veer toward a form of relativism, but its ultimate check is always the sanctity of the other animate individuals. An it harm none, do what ye will. The pagan ethic must be found within the appetites and not in their denial.

In fact, paganism is part of a protest against Platonic assertion. The wider denial is that of nature herself. Nature denies the Platonic by refusing to conform to the Platonic ideal. It insists on moments of chaos, the epagomenae, the carnival, that overlap between the real and the ideal that is itself a metaphor for reality. The actual year is a refusal to cooperate with the mathematically ideal year of 360 days — close, but only tantalizingly so.

In addition, pagans have always loved asking: what is arete? This is the fundamental question we encounter with the Sophists, Plato and Aristotle. It is the question that is before us still. The classics considered variously both happiness and the good as alternative answers. The Hedonists pick happiness — but a particular kind of happiness. The underlying principle recognized behind all these possibilities is arete, ‘excellence, the best,’ however it is embodied — whether god, goddess, goods, the good, gods, virtue, happiness, pleasure or all of these together. Arete is that to which both individual and community aspire. Each wants one’s own individual way of putting it together in excellent fashion, while at the same time wanting some commensurable overlap of the individual way with the community way.

What is the truth of the historical claims about Greek philosophy in Zen and the Art of Motorcycle Maintenance?
answer by Ammon Allred

Arete is usually translated as “virtue,” which is certainly connected up with the good “agathon” — but in Plato an impersonal Good is probably more important than aletheia or truth. See, for instance, the central images at the end of Book VI, where the Good is called the “Father of the Sun.” The same holds in the Philebus. And it wouldn’t be right to say that Plato (or Aristotle) thought virtue was part of some small branch called “ethics” (Plato doesn’t divide his philosophy up this way; Aristotle does — although then we get into the fact that we don’t have the dialogues he wrote — but still what he means by ethics is far broader than what we mean).

Certainly the Sophists pushed for a humanistic account of the Good, whereas Plato’s was far more impersonal. And Plato himself had a complex relationship to the Sophists (consider the dialogue of Protagoras, where Socrates and Protagoras both end up about equally triumphant).

That said, Pirsig is almost certainly right about Platonism — that is to say, the approach to philosophy that has been taught as though it were Plato’s philosophy. Certainly, the sophists have gotten a bad rap because of the view Socrates and Plato were taken to have of them; but even there, many philosophers have tried to rehabilitate them: most famously, Nietzsche.

The Art of the Lost Cause

Many people are understandably disappointed, frustrated, or angry when they lose. It’s just not fun to lose, especially in a competitive society. But there are advantages to losing. And losses are as much determined by perspective as by anything else. Certainly, in more cooperative societies, what may be seen as a loss by outsiders could be taken quite differently by an insider. Western researchers discovered that difference when using games as part of social science studies. Some non-Western peoples refused win-lose scenarios, at least among members of the community. The individual didn’t lose, for everyone gained. I point this out to help shift our thinking.

Recently, the political left in the United States has experienced losses. Bernie Sanders lost the nomination to Hillary Clinton, who in turn lost the presidency to Donald Trump. But is this an entirely surprising result and an entirely bad outcome? Losses can lead to soul-searching and motivation for change. The Republicans as we know them now have dominated the political narrative in recent decades, which forced the Democrats to shift far to the right with third-way ‘triangulation.’ That wasn’t always the case. Republicans went through a period of major losses before being able to reinvent themselves with the Southern strategy, the Reagan revolution, trickle-down voodoo economics, the two Santa Claus theory, the culture wars, etc.

The Clinton New Democrats were only able to win at all in recent history by sacrificing the political left and, in the process, becoming the new conservative party. So, even when Democrats have been able to win, it has been a loss. Consider Obama, who turned out to be one of the most neoliberal and neocon presidents in modern history, betraying his every promise: maintaining militarism, refusing to shut down GITMO, passing pro-business insurance reform, etc. Liberals and leftists would have been better off to have been entirely out of power these past decades, allowing a genuine political left movement to form and so allowing democracy to begin to reassert itself from below. Instead, Democrats have managed to win just enough elections to keep the political left suppressed by co-opting their rhetoric. Democrats have won by forcing the American public to lose.

In failing so gloriously, the Democratic leadership has been publicly shamed to the point of no redemption. The party is now less popular than the opposition, an amazing feat considering how unpopular Trump and the GOP are at the moment. Yet amidst all of this, Bernie Sanders is more popular than ever, more popular among women than men and more popular among minorities than whites. I never thought Sanders was likely to win and so I wasn’t disappointed. What his campaign did accomplish, as I expected, was to reshape the political narrative and shift the Overton window back toward the political left again. This period of loss will be remembered as a turning point in the future. It was a necessary loss, a reckoning and re-envisioning.

Think about famous lost causes. One that came to mind is that of Jesus and the early Christians. They were a tiny unknown cult in a vast empire filled with hundreds of thousands of similar cults. They were nothing special, of no significance or consequence, such that no one bothered to even take note of them, not even Jewish writers at the time. Then Jesus was killed as a common criminal among other criminals and even that didn’t draw any attention. There is no evidence that the Romans considered Jesus even mildly interesting. After his death, Christianity remained small and splintered into a few communities. It took generations for this cult to grow much at all and finally attract much outside attention.

Early Christians weren’t even important enough to be feared. The persecution stories seem to have been mostly invented by later Christians to make themselves feel more important, as there are no records of any systematic and pervasive persecution. Romans killing a few cultists here and there happened all the time and Christians didn’t stand out as being targeted more than any others. In fact, early Christians were so lacking in uniqueness that they were often confused with other groups such as the Stoics. By the way, it was the Stoics who were famous at the time for seeking out persecution and so gaining street-cred respectability, maybe causing envy among Christians. Even Christian theology was largely borrowed from others, natural law also having been taken from the Stoics — related to the idea that a slave can be free in their mind and being, their heart and soul, because natural law transcends human law.

Still, this early status of Christians as losers created a powerful narrative that has not only survived but proliferated. Some of that narrative, such as their persecution, was invented. But that is far from unusual — the mythos that develops around lost causes tends to be more invented than not. Still, at the core, the Christians were genuinely pathetic for a couple of centuries. They weren’t a respectable religion in the Roman Empire until long after Jesus’ death, when an emperor decided to use them to shore up his own power. In the waning era of Roman imperialism, I suppose a lost cause theology felt compelling and comforting. It was also a good way to convert other defeated peoples, as they could be promised victory in heaven. Lost causes tend to lead to the romanticizing of a distant redemption that would one day come. And in the case of Christianity, this would mean that the ultimate sacrificial loser, Jesus himself, would return victorious! Amen! Praise the Lord! Like a Taoist philosopher, Jesus taught that to find oneself was to lose oneself, but to lose oneself was to find oneself. This is a loser’s mentality and relates to why some have considered Christianity to be a slave religion. The lowly are uplifted, at least in words and ideals. But I’d argue there is more to it than seeking comfort by rationalizing suffering, oppression, and defeat.

Winning isn’t always a good thing, at least in the short term. I sometimes wonder whether America would be a better place if the American Revolution had been lost. When I compare the United States to Canada, I don’t see any great advantage to American colonists having won. Canada is a much more stable and well-functioning social democracy. And the British Empire ended up enacting sweeping reforms, including abolishing slavery through law long before the US managed to end slavery through bloody conflict. In many ways, Americans were worse off after the revolution than before it. A reactionary backlash took hold as oligarchs co-opted the revolution and turned it into counter-revolution. Through the coup of a Constitutional Convention, the ruling elite seized control of the new government. It was in seeming to win that the average American ended up losing. An overt loss potentially could have been a greater long-term victory. For women and blacks in particular, being on the side of the revolutionaries didn’t turn out to be such a great deal. Women who had gained the vote had it taken away again, and blacks hoping for freedom were returned to slavery. The emerging radical movement of democratic reform was strangled in the crib.

Later on, the Confederates learned the power of a lost cause, to such an extent that they have become the poster boys of The Lost Cause, all of American society having been transformed by it. Victory of the United States government, once again, turned out to be far from a clear victory for the oppressed. If the Confederates had won or otherwise been allowed to secede, the Confederate government would have been forced to come to terms with the majority black population that existed in the South, and it wouldn’t have had the large Northern population to help keep blacks down. It’s possible that some of the worst results could have been avoided: re-enslavement through chain gangs and mass incarceration, Jim Crow laws and Klan terrorism, sundown towns and redlining, etc. — all the ways that racism became further entrenched. After the Civil War, blacks became scattered and would then become a minority. Having lost their position as the Southern majority, they lost most of the leverage they might have had. Instead of weak reforms leading to new forms of oppression, blacks might have been able to force a societal transformation within a Confederate government, or else to stage a mass exodus in order to secede and create their own separate nation-state. There were many possibilities that became impossible because of Union victory.

Now consider the civil rights movement. The leaders, Martin Luther King in particular, understood the power of a lost cause. They intentionally staged events in which they would be attacked by police and white mobs, always making sure there were cameras nearby to turn the confrontation into a national event. It was in losing these confrontations to the greater power of white oppression that they managed to win public support. As a largely Christian movement, the civil rights activists surely had learned from the story of Jesus as a sacrificial loser and his followers as persecuted losers. The real failure of civil rights only came later, when the movement gained mainstream victories and a corrupt black leadership aligned itself with white power, such as by pushing the racist 1994 Crime Bill, which was part of the Democrats becoming the new conservative party. The civil rights movement might have been better able to transform society and change public opinion by remaining a lost cause for a few more generations.

A victory forced can be a victory lost. Gain requires sacrifice; it is not to be bought cheaply. Success requires risk of failure, putting everything on the line. The greatest losses can come from seeking victory too soon and too easily. Transformative change can only be won by losing what came before. Winning delayed is sometimes progress ensured, slow but steady change. The foundation has to be laid before something can emerge from the ground up. Being brought low is the beginning point, like planting a seed in the soil.

It reminds me of my habit of always looking down as I walk. My father, on the other hand, never looks down and has a habit of stepping on things. It is only by looking down that we can see what is underneath our feet, what we stand on or are stepping toward. Foundation and fundament are always below eye level. Even in my thinking, I’m forever looking down, to what is beneath everyday awareness and oft-repeated words. Just to look down, such a simple and yet radical act.

“Looking down is also a sign of shame or else humility, the distinction maybe being less relevant to those who avoid looking down. To humble means to bring low, to the level of the ground, the soil, humus. To be further down the ladder of respectability, to be low caste or low class, is to have a unique vantage point. One can see more clearly and more widely when one has grown accustomed to looking down, for then one can see the origins of things, the roots of the world, where experience meets the ground of being.”

* * *

Living Differently: On How the Feminist Utopia Is Something You Have to Be Doing Now
by Lynne Segal

Another anthropologist, the anarchist David Graeber, having been involved in protest networks for decades, remains even more certain that participation in moments of direct action and horizontal decision-making brings to life a new and enduring conception of politics, while providing shared hope and meaning in life, even if their critics see in the outcomes of these movements only defeat:

What they don’t understand is that once people’s political horizons have been broadened, the change is permanent. Hundreds of thousands of Americans (and not only Americans, but Greeks, Spaniards and Tunisians) now have direct experience of self-organization, collective action and human solidarity. This makes it almost impossible to go back to one’s previous life and see things the same way. While the world’s financial and political elite skate blindly towards the next 2008-scale crisis, we’re continuing to carry out occupations of buildings, farms, foreclosed homes and workplaces, organizing rent strikes, seminars and debtor’s assemblies, and in doing so laying the groundwork for a genuinely democratic culture … With it has come a revival of the revolutionary imagination that conventional wisdom has long since declared dead.

Discussing what he calls ‘The Democracy Project’, Graeber celebrates forms of political resistance that in his view move well beyond calls for policy reforms, creating instead permanent spaces of opposition to all existing frameworks. For Graeber, one fundamental ground for optimism is that the future is unknowable, and one can live dissident politics in the present, or try to. This is both despite, and because of, the insistent neo-liberal boast that there can be no alternative to its own historical trajectory, which has become a linear project of endless growth and the amassing of wealth by the few, toil and the struggle for precarious survival for so many.

Furthermore, Graeber points out that historically, although few revolutionaries actually succeeded in taking power themselves, the effects of their actions were often experienced far outside their immediate geographical location. In a similar reflection on unintended consequences, Terry Eagleton suggests that even with the gloomiest of estimates in mind, many aspects of utopic thinking may be not only possible but well-nigh inevitable:

Perhaps it is only when we run out of oil altogether, or when the world system crashes for other reasons, or when ecological catastrophe finally overtakes us, that we will be forced into some kind of co-operative commonwealth of the kind William Morris might have admired.

Even catastrophism, one might say, has its potentials. […]

It should come as no surprise that most of the goals we dream of will usually elude us, at least partially. However, to confront rather than accept the evils of the present, some utopian spirit is always necessary to embrace the complexity of working, against all odds, to create better futures. A wilful optimism is needed, despite and because of our inevitable blind-spots and inadequacies, both personal and collective.

For many of us, it means trying to live differently in the here and now, knowing that the future will never be a complete break with the present or the past, but hopefully something that may develop out of our most supportive engagements with others. To think otherwise inhibits resistance and confirms the dominant conceit that there is no alternative to the present. Thus, I want to close this chapter repeating the words of the late Latin American writer, Eduardo Galeano, which seem to have been translated into almost every language on earth, though I cannot track down their source:

Utopia is on the horizon. I move two steps closer; it moves two steps further away. I walk another ten steps and the horizon runs ten steps further away. As much as I may walk, I’ll never reach it. So what’s the point of utopia? The point is this: to keep moving forward.

Our political dreams can end in disappointment, but are likely, nevertheless, to make us feel more alive, and hence happier, along the way, at least when they help to connect us to and express concern for those around us. Happiness demands nothing less.


Paul Adkin on Decadence & Stagnation

“Decadence: when things are just so good and easy that no one bothers to push forward anymore, bringing about stagnation …

“But there is also another kind of stagnation: one which comes about because there just isn’t enough time to go forward; when all time is taken up with something that is essentially futile when considered from the point of view of the bigger picture. Like making money. Even the seemingly dynamic world of business, if it is dedicated only to business and not to authentically meaningful human progress (things associated with knowledge and discovery), is essentially stagnating. Any society that is a simulacra society, hell-bent on reproducing copies rather than on developing its creativity, is a decadent, stagnating society. We are stagnant not because of what we are doing (our anthill society is always busy) but because what we are driven by, in all this anthill activity, is not creative. When production is synonymous with reproduction, then we know we have fallen into the stagnant pool of decadence.

“Nietzsche talked about the residual nature of decadence, that decadence is a cumulative thing. Certainly, it is nurtured by both dogma and nihilism. Only a sceptical meaningfulness can push forward in a creative way.

“Sceptical meaningfulness? How can such a thing be? Surely it is a contradiction in terms.

“To understand how this oxymoron combination can work, we need to see meaningfulness as a forward pushing phenomenon. Once it stops pushing forward, meaningfulness slips into dogma. Meaning is fuelled by truth, but it does not swim in truth as if truth were a lake. Truth, in order to be lasting, has to be a river.”

from Decadence & Stagnation by Paul Adkin

Social Construction & Ideological Abstraction

The following passages from two books help to explain what social construction is. As society has headed in a particular direction of development, abstract thought has become increasingly dominant.

But we modern people take abstractions for granted, and we often don’t even recognize abstractions for what they are. Many abstractions simply become reality as we know it. They are ‘looped’ into existence, as race realism, capitalist realism, etc.

Ideological abstractions become so pervasive and systemic that we lose the capacity to think outside of them. They form our reality tunnel.

This wasn’t always so. Humans used to conceive of and hence perceive the world far differently. And this shaped their sense of identity, which is hard for us to imagine.

* * *

Dynamics of Human Biocultural Diversity:
A Unified Approach

by Elisa J. Sobo
(Kindle Locations 94-104)

Until now, many biocultural anthropologists have focused mainly on the ‘bio’ half of the equation, using ‘biocultural’ generically, like biology, to refer to genetic, anatomical, physiological, and related features of the human body that vary across cultural groups. The number of scholars with a more sophisticated approach is on the upswing, but they often write only for super-educated expert audiences. Accordingly, although introductory biocultural anthropology texts make some attempt to acknowledge the role of culture, most still treat culture as an external variable— as an add-on to an essentially biological system. Most fail to present a model of biocultural diversity that gives adequate weight to the cultural side of things.

Note that I said most, not all: happily, things are changing. A movement is afoot to take anthropology’s claim of holism more seriously by doing more to connect— or reconnect— perspectives from both sides of the fence. Ironically, prior to the industrial revolution and the rise of the modern university, most thinkers took a very comprehensive view of the human condition. It was only afterward that fragmented, factorial, compartmental thinking began to undermine our ability to understand ourselves and our place in— and connection with— the world. Today, the leading edge of science recognizes the links and interdependencies that such thinking keeps falsely hidden.

Nature, Human Nature, and Human Difference:
Race in Early Modern Philosophy
by Justin E. H. Smith

pp. 9-10

The connection to the problem of race should be obvious: kinds of people are to no small extent administered into being, brought into existence through record keeping, census taking, and, indeed, bills of sale. A census form asks whether a citizen is “white,” and the possibility of answering this question affirmatively helps to bring into being a subkind of the human species that is by no means simply there and given, ready to be picked out, prior to the emergence of social practices such as the census. Censuses, in part, bring white people into existence, but once they are in existence they easily come to appear as if they had been there all along. This is in part what Hacking means by “looping”: human kinds, in contrast with properly natural kinds such as helium or water, come to be what they are in large part as a result of the human act of identifying them as this or that. Two millennia ago no one thought of themselves as neurotic, or straight, or white, and nothing has changed in human biology in the meantime that could explain how these categories came into being on their own. This is not to say that no one is melancholic, neurotic, straight, white, and so on, but only that how that person got to be that way cannot be accounted for in the same way as, say, how birds evolved the ability to fly, or how iron oxidizes.

In some cases, such as the diagnosis of mental illness, kinds of people are looped into existence out of a desire, successful or not, to help them. Racial categories seem to have been looped into existence, by contrast, for the facilitation of the systematic exploitation of certain groups of people by others. Again, the categories facilitate the exploitation in large part because of the way moral status flows from legal status. Why can the one man be enslaved, and the other not? Because the one belongs to the natural-seeming kind of people that is suitable for enslavement. This reasoning is tautological from the outside, yet self-evident from within. Edward Long, as we have seen, provides a vivid illustration of it in his defense of plantation labor in Jamaica. But again, categories cannot be made to stick on the slightest whim of their would-be coiner. They must build upon habits of thinking that are already somewhat in place. And this is where the history of natural science becomes crucial for understanding the history of modern racial thinking, for the latter built directly upon innovations in the former. Modern racial thinking could not have taken the form it did if it had not been able to piggyback, so to speak, on conceptual innovations in the way science was beginning to approach the diversity of the natural world, and in particular of the living world.

This much ought to be obvious: racial thinking could not have been biologized if there were no emerging science of biology. It may be worthwhile to dwell on this obvious point, however, and to see what more unexpected insights might be drawn out of it. What might not be so obvious, or what seems to be ever in need of renewed pointing out, is a point that ought to be of importance for our understanding of the differing, yet ideally parallel, scope and aims of the natural and social sciences: the emergence of racial categories, of categories of kinds of humans, may in large part be understood as an overextension of the project of biological classification that was proving so successful in the same period. We might go further, and suggest that all of the subsequent kinds of people that would emerge over the course of the nineteenth and twentieth centuries, the kinds of central interest to Foucault and Hacking, amount to a further reaching still, an unprecedented, peculiarly modern ambition to make sense of the slightest variations within the human species as if these were themselves species differentia. Thus for example Foucault’s well-known argument that until the nineteenth century there was no such thing as “the homosexual,” but only people whose desires could impel them to do various things at various times. But the last two centuries have witnessed a proliferation of purportedly natural kinds of humans, a typology of “extroverts,” “depressives,” and so on, whose objects are generally spoken of as if on an ontological par with elephants and slime molds. Things were not always this way. In fact, as we will see, they were not yet this way throughout much of the early part of the period we call “modern.”

Time and Trauma

And I think of that “Groundhog Day” movie with Bill Murray, in which he repeats the same day, again and again, with only minor changes. If you’ve seen the movie, you know that Murray finally breaks out of what appears to be an infinite loop only when he changes his ways, his approach to life, his mentality. He becomes a better person and even gets the girl.

When is the USA going to break out of its infinite loop of war? Only when we change our culture, our mentality.

A “war on terror” is a forever war, an infinite loop, in which the same place names and similar actions crop up again and again. Names like Mosul and Helmand province. Actions like reprisals and war crimes and the deaths of innocents, because that is the face of war.

~W.J. Astore, Happy 4th of July! And a Global War on Something

* * *

The impression we form is that it is not that linear time perception or experience has been corrupted by trauma; it is that time “itself” has been traumatized — so that we come to comprehend “history” not as a random sequence of events, but as a series of traumatic clusters. This broken time, this sense of history as a malign repetition, is “experienced” as seizure and breakdown; I have placed “experienced” in inverted commas here because the kind of voiding interruption of subjectivity seems to obliterate the very conditions that allow experience to happen.

It is as if the combination of adolescent erotic energy with an inorganic artefact … produces a trigger for a repeating of the ancient legend. It is not clear that “repeating” is the right word here, though. It might be better to say that the myth has been re-instantiated, with the myth being understood as a kind of structure that can be implemented whenever the conditions are right. But the myth doesn’t repeat so much as it abducts individuals out of linear time and into its “own” time, in which each iteration of the myth is in some sense always the first time.

…the mythic is part of the virtual infrastructure which makes human life as such possible. It is not the case that first of all there are human beings, and the mythic arrives afterwards, as a kind of cultural carapace added to a biological core. Humans are from the start — or from before the start, before the birth of the individual — enmeshed in mythic structures.

~Mark Fisher, “Eerie Thanatos,” The Weird and the Eerie (pp. 96-97)

A Neverending Revolution of the Mind

In a recent book, Juliet Barker offers a new perspective on an old event (1381: The Year of the Peasants’ Revolt, Kindle Locations 41-48):

“In the summer of 1381 England erupted in a violent popular uprising that was as unexpected as it was unprecedented. Previous rebellions had always been led by ambitious and discontented noblemen seeking to overthrow the government and seize power for themselves. The so-called ‘Peasants’ Revolt’ was led by commoners— most famously Wat Tyler, Jack Straw and John Balle— whose origins were obscure and whose moment at the forefront of events was brief. Even more unusually, they did not seek personal advancement but a radical political agenda which, if it had been implemented, would fundamentally have transformed English society: the abolition of serfdom and the dues and services owed by tenants to their lord of the manor; freedom from tolls and customs on buying and selling goods throughout the country; the recognition of a man’s right to work for whom he chose at the wages he chose; the state’s seizure of the Church’s wealth and property. Their demands anticipated the French Revolution by four hundred years.”

Our understanding of the origins of modernity keeps being pushed back. It used to be thought that the American Revolution was the first modern revolution. But it was preceded by generations of revolts against the colonial elite. And before that was the English Civil War, which increasingly is seen as the first modern revolution. We might have to push it even further back, to the Peasants’ Revolt.

It makes sense when you know some of the historical background. England had become a major center of wool production. This unintentionally undermined the feudal order. The reason is that an entire community of feudal peasants isn’t necessary for herding sheep, in the way it had been for traditional agriculture. So, by the time the Peasants’ Revolt came around, there had already been several centuries of increasing irrelevance for much of the peasant population. This would continue on into the Enlightenment Age, when the enclosure movement took hold and masses of landless peasants flooded into the cities.

It’s interesting that the pressure on the social order was already being felt that far back, almost jumpstarting the modern revolutionary era four centuries earlier. Those commoners were already beginning to think of themselves as more than mere cogs in the machinery of feudalism. They anticipated the possibility of becoming agents of their own fate. It was the origin of modern class identity and class war, at least for Anglo-American society.

There were other changes happening around then. It was the beginning of the Renaissance. This brought ancient Greek philosophy, science, and politics back into Western thought. The new old ideas were quickly spread through the invention of the movable type printing press and increasing use of vernacular language. And that directly made the Enlightenment possible.

The Italian city-states and colonial empires were becoming greater influences, bringing with them new economic systems of capitalism and corporatism. The Italian city-states, in the High Middle Ages, also initiated advocacy of anti-monarchism and liberty-oriented republicanism. Related to this, humanism became a major concern, as taught by the ancient Sophists, with Protagoras famously stating that “Man is the measure of all things.” And with this came early developments in psychological thought, such as the radical notion that everyone had the same basic human nature. Diverse societies had growing contact and so cultural differences became an issue, provoking difficult questions and adding to a sense of uncertainty and doubt.

Individual identity and social relationships were being transformed, in a way not seen since the Axial Age. Proto-feudalism developed in the Roman Empire. Once established, feudalism lasted for more than a millennium. It wasn’t just a social order but an entire worldview, a way of being in and part of a shared world. Every aspect of life was structured by it. The slow unraveling inevitably led to increasing radicalism, as what it meant to be human was redefined and re-envisioned.

My thoughts continuously return to these historical changes. I can’t shake the feeling that we are living through another such period of societal transformation. But as during any major shift in consciousness, the outward results are hard to understand or sometimes hard to even notice, at least in terms of their ultimate consequences. That is, until they result in an uprising of the masses and sometimes a complete overthrow of established power. Considering that everpresent possibility and looming threat, it might be wise to question how stable our present social order, and the human identity it is based upon, really are.

These thoughts are inspired by other books I’ve been reading. The idea I regularly return to is that of Julian Jaynes’ bicameralism and the related speculations of those who were inspired by him, such as Iain McGilchrist. Most recently, I found useful insight from two books whose authors were new to me: Consciousness by Susan Blackmore and A Skeptic’s Guide to the Mind by Robert Burton.

Those authors offer overviews that question and criticize many common views, specifically the Enlightenment ideal of individuality, in considering issues of embodiment and affect, extended self and bundled self. These aren’t just new theories that academics preoccupy themselves with for reasons of entertainment and job security. They are ideas with much earlier origins that, dismissed for so long because they didn’t fit into the prevailing paradigm, are only now being taken seriously. The past century led to an onslaught of research findings that continuously challenged what we thought we knew.

This shift is in some ways a return to a different tradition of radical thought. John Locke was radical enough for his day, although his radicalism was hidden behind pieties. Even more radical was a possible influence on Locke, Baruch Spinoza, with Wim Klever going so far as to see crypto-quotations of Spinoza in Locke’s writings. Spinoza was an Enlightenment thinker who focused not just on what it meant to be human but on what it meant to be a human in the world. What kind of world is this? Unlike Locke’s, his writings weren’t as narrowly focused on politics, governments, constitutions, etc. Even so, Matthew Stewart argues that through Locke’s writings Spinozism was a hidden impulse that fueled the fires of the American Revolution, taking form and force through a working class radicalism, as described in Nature’s God.

Spinozism has been revived in many areas of study, such as the growing body of work about affect. Never fully appreciated in his lifetime, Spinoza continues through his radicalism to inform and inspire innovative thinking. Just as Renaissance ideas took centuries to finally displace what came before, Spinoza’s ideas are slowly but powerfully helping to remake the modern mind. I’d like to believe that a remaking of the modern world will follow.

I just started an even more interesting book, Immaterial Bodies by Lisa Blackman. She does briefly discuss Spinoza, but her framing concern is the relationship “between the humanities and the sciences (particularly the life, neurological and psychological sciences).” She looks at the more recent developments of thought, including that of Jaynes and McGilchrist. Specifically, she unpacks the ideological self-identity we’ve inherited.

To argue for or to simply assume a particular social construct about our humanity is to defend a particular social order and thus to enforce a particular social control. She makes a compelling case for viewing neoliberalism as more than a mere economic and political system. The greatest form of control isn’t only controlling how people are allowed to act and relate but, first and foremost, how they are able to think about themselves and the world around them. In speaking about neoliberalism, she quotes Fernando Vidal (Kindle Locations 3979-3981):

“The individualism characteristic of western and westernized societies, the supreme value given to the individual as autonomous agent of choice and initiative, and the corresponding emphasis on interiority at the expense of social bonds and contexts, are sustained by the brain-hood ideology and reproduced by neurocultural discourses.”

Along with mentioning Spinoza, Blackman does give some historical background, such as in the following. And as a bonus, it is placed in the even larger context of Jaynes’ thought. She writes (Kindle Locations 3712-3724):

“Dennett, along with other scientists interested in the problem of consciousness (see Kuijsten, 2006), has identified Jaynes’s thesis as providing a bridge between matter and inwardness, or what I would prefer to term the material and immaterial. Dennett equates this to the difference between a brick and a bricklayer, where agency and sentience are only accorded to the bricklayer and never to the brick. For Dennett, under certain conditions we might have some sense of what it means to be a bricklayer, but it is doubtful, within the specificities of consciousness as we currently know and understand it, that we could ever know what it might mean to be a brick. This argument might be more usefully extended within the humanities by considering the difference between understanding the body as an entity and as a process. The concept of the body as having a ‘thing-like’ quality, where the body is reconceived as a form of property, is one that has taken on a truth status since at least its incorporation into the Habeas Corpus Act of 1679 (see Cohen, 2009). As Cohen (2009: 81) suggests, ‘determining the body as the legal location of the person radically reimagines both the ontological and political basis of person-hood’. This act conceives the body as an object possessed or owned by individuals, what Cohen (2009) terms a form of ‘biopolitical individualization’. Within this normative conception of corporeality bodies are primarily material objects that can be studied in terms of their physicochemical processes, and are objects owned by individuals who can maintain and work upon them in order to increase the individual’s physical and cultural capital.”

In her epilogue, she presents a question by Catherine Malabou (Kindle Locations 4014-4015): “What should we do so that consciousness of the brain does not purely and simply coincide with the spirit of capitalism?” The context changes as the social order changes, from feudalism to colonialism and now capitalism. But phrased in various ways, it is the same question that has been asked for centuries.

Another interesting question to ask is: by what right? It is more than a question. It is a demand to prove the authority of an action. And relevant to my thoughts here, it has historical roots in feudalism. It’s like asking someone: who do you think you are to tell me what to do? Inherent in this inquiry is one’s position in the prevailing social order, whether feudal lords challenging the king’s authority or peasants challenging those feudal lords. The issue isn’t only who we are and what we are allowed to do based on that, but who or what gets to define who we are, our human nature and social identity.

Such questions always have a tinge of the revolutionary, even if only in potential. Once people begin questioning, established attitudes and identities have already become unmoored and are drifting. The act of questioning is itself radical, no matter what the eventual answers. The doubting mind is ever poised on a knife edge.

The increasing pressure put on peasants, especially once they became landless, let loose individuals and identities. This incited radical new thought and action. As yet another underclass forms, that of the imprisoned and permanently unemployed, which even now makes up a tenth of the population, what will this lead to? Throwing people into desperation with few opportunities and lots of time on their hands tends to lead to disruptive outcomes, sometimes even revolution.

Radicalism means to go to the root, and there is nothing more radical than going to the root of our shared humanity. As these questions are asked, those in power won’t be happy with the answers found. But at this point, it is already too late to stop what will follow. We are on our way.

Imagination: Moral, Dark, and Radical

Absence is presence.
These are the fundamentals of mystery.
The Young Pope

Below is a gathering of excerpts from writings. The key issue here is imagination, specifically Edmund Burke’s moral imagination with its wardrobe but also the dark imagination and the radical imagination. I bring in some other thinkers for context: Thomas Paine, Corey Robin, Thomas Ligotti, Lewis Hyde, and Julian Jaynes.

Besides imagination, the connecting strands of thought are:

  • Pleasure, beauty, and sublimity; comfort, familiarity, intimacy, the personal, and subjectivity; embodiment, anchoring, shame, and nakedness; pain, violence, suffering, and death;
  • Darkness, awe, fear, terror, horror, and the monstrous; oppression, prejudice, and ignorance; obfuscation, obscurity, disconnection, and dissociation; the hidden, the veiled, the unknown, and the distant; mystery, madness, and deception;
  • Identity, consciousness, and metaphor; creativity, art, story, poetry, and rhetoric; literalism, realism, and dogmatism; reason, knowledge, and science;
  • Enlightenment, abstractions, ideology, revolution, and counter-revolution; nobility, power, chivalry, aristocracy, and monarchy; tradition, nostalgia, and the reactionary mind; liberalism, conservatism, and culture wars;
  • Et cetera.

The touchstone for my own thinking is what I call symbolic conflation, along with the larger context of conceptual slippage, social construction, and reality tunnels. This is closely related to what Lewis Hyde discusses in terms of metonymy, liminality, and the Trickster archetype.

Read the following as a contemplation of ideas and insights. In various ways, they connect, overlap, and resonate. Soften your focus and you might see patterns emerge. If these are all different perspectives of the same thing, what exactly is it that is being perceived? What does each view say about the individual espousing it and, if not necessarily about all of humanity, at least about our society?

(I must admit that my motivation for this post was mainly personal. I simply wanted to gather these writings together. They include some writings and writers that I have been thinking about for a long time. Quotes and passages from many of them can be found in previous posts on this blog. I brought them together here for the purposes of my own thinking about certain topics. I don’t post stuff like this with much expectation that it will interest anyone else, as I realize my own interests are idiosyncratic. Still, if someone comes along and finds a post like this fascinating, then I’ll know they are my soulmate. This post is only for cool people with curious minds. Ha!)

* * *

On the Sublime and Beautiful
by Edmund Burke

Of the Passion Caused by the Sublime

THE PASSION caused by the great and sublime in nature, when those causes operate most powerfully, is astonishment; and astonishment is that state of the soul, in which all its motions are suspended, with some degree of horror. In this case the mind is so entirely filled with its object, that it cannot entertain any other, nor by consequence reason on that object which employs it. Hence arises the great power of the sublime, that, far from being produced by them, it anticipates our reasonings, and hurries us on by an irresistible force. Astonishment, as I have said, is the effect of the sublime in its highest degree; the inferior effects are admiration, reverence, and respect.

Terror

NO passion so effectually robs the mind of all its powers of acting and reasoning as fear. For fear being an apprehension of pain or death, it operates in a manner that resembles actual pain. Whatever therefore is terrible, with regard to sight, is sublime too, whether this cause of terror be endued with greatness of dimensions or not; for it is impossible to look on anything as trifling, or contemptible, that may be dangerous. There are many animals, who though far from being large, are yet capable of raising ideas of the sublime, because they are considered as objects of terror. As serpents and poisonous animals of almost all kinds. And to things of great dimensions, if we annex an adventitious idea of terror, they become without comparison greater. A level plain of a vast extent on land, is certainly no mean idea; the prospect of such a plain may be as extensive as a prospect of the ocean: but can it ever fill the mind with anything so great as the ocean itself? This is owing to several causes; but it is owing to none more than this, that the ocean is an object of no small terror. Indeed, terror is in all cases whatsoever, either more openly or latently, the ruling principle of the sublime. Several languages bear a strong testimony to the affinity of these ideas. They frequently use the same word, to signify indifferently the modes of astonishment or admiration, and those of terror. Θάμβος is in Greek, either fear or wonder; δεινός is terrible or respectable; αἰδέω, to reverence or to fear. Vereor in Latin, is what αἰδέω is in Greek. The Romans used the verb stupeo, a term which strongly marks the state of an astonished mind, to express the effect of either of simple fear or of astonishment; the word attonitus (thunder-struck) is equally expressive of the alliance of these ideas; and do not the French étonnement, and the English astonishment and amazement, point out as clearly the kindred emotions which attend fear and wonder? They who have a more general knowledge of languages, could produce, I make no doubt, many other and equally striking examples.

Obscurity

TO make anything very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes. Every one will be sensible of this, who considers how greatly night adds to our dread, in all cases of danger, and how much the notions of ghosts and goblins, of which none can form clear ideas, affect minds which give credit to the popular tales concerning such sorts of beings. Those despotic governments, which are founded on the passions of men, and principally upon the passion of fear, keep their chief as much as may be from the public eye. The policy has been the same in many cases of religion. Almost all the heathen temples were dark. Even in the barbarous temples of the Americans at this day, they keep their idol in a dark part of the hut, which is consecrated to his worship. For this purpose too the Druids performed all their ceremonies in the bosom of the darkest woods, and in the shade of the oldest and most spreading oaks. No person seems better to have understood the secret of heightening, or of setting terrible things, if I may use the expression, in their strongest light, by the force of a judicious obscurity, than Milton. His description of Death in the second book is admirably studied; it is astonishing with what a gloomy pomp, with what a significant and expressive uncertainty of strokes and colouring, he has finished the portrait of the king of terrors:

—The other shape,
If shape it might be called that shape had none
Distinguishable, in member, joint, or limb;
Or substance might be called that shadow seemed;
For each seemed either; black he stood as night;
Fierce as ten furies; terrible as hell;
And shook a deadly dart. What seemed his head
The likeness of a kingly crown had on.

In this description all is dark, uncertain, confused, terrible, and sublime to the last degree. […]

The Same Subject Continued

[…] I know several who admire and love painting, and yet who regard the objects of their admiration in that art with coolness enough in comparison of that warmth with which they are animated by affecting pieces of poetry or rhetoric. Among the common sort of people, I never could perceive that painting had much influence on their passions. It is true, that the best sorts of painting, as well as the best sorts of poetry, are not much understood in that sphere. But it is most certain, that their passions are very strongly roused by a fanatic preacher, or by the ballads of Chevy-chase, or the Children in the Wood, and by other little popular poems and tales that are current in that rank of life. I do not know of any paintings, bad or good, that produce the same effect. So that poetry, with all its obscurity, has a more general, as well as a more powerful, dominion over the passions, than the other art. And I think there are reasons in nature, why the obscure idea, when properly conveyed, should be more affecting than the clear. It is our ignorance of things that causes all our admiration, and chiefly excites our passions. Knowledge and acquaintance make the most striking causes affect but little. It is thus with the vulgar; and all men are as the vulgar in what they do not understand. The ideas of eternity and infinity are among the most affecting we have; and yet perhaps there is nothing of which we really understand so little, as of infinity and eternity. […]

Locke’s Opinion Concerning Darkness Considered

IT is Mr. Locke’s opinion, that darkness is not naturally an idea of terror; and that, though an excessive light is painful to the sense, the greatest excess of darkness is no ways troublesome. He observes indeed in another place, that a nurse or an old woman having once associated the idea of ghosts and goblins with that of darkness, night, ever after, becomes painful and horrible to the imagination. The authority of this great man is doubtless as great as that of any man can be, and it seems to stand in the way of our general principle. We have considered darkness as a cause of the sublime; and we have all along considered the sublime as depending on some modification of pain or terror: so that if darkness be no way painful or terrible to any, who have not had their minds early tainted with superstitions, it can be no source of the sublime to them. But, with all deference to such an authority, it seems to me, that an association of a more general nature, an association which takes in all mankind, may make darkness terrible; for in utter darkness it is impossible to know in what degree of safety we stand; we are ignorant of the objects that surround us; we may every moment strike against some dangerous obstruction; we may fall down a precipice the first step we take; and if an enemy approach, we know not in what quarter to defend ourselves; in such a case strength is no sure protection; wisdom can only act by guess; the boldest are staggered, and he, who would pray for nothing else towards his defence, is forced to pray for light.

As to the association of ghosts and goblins; surely it is more natural to think, that darkness, being originally an idea of terror, was chosen as a fit scene for such terrible representations, than that such representations have made darkness terrible. The mind of man very easily slides into an error of the former sort; but it is very hard to imagine, that the effect of an idea so universally terrible in all times, and in all countries, as darkness, could possibly have been owing to a set of idle stories, or to any cause of a nature so trivial, and of an operation so precarious.

Reflections on the French Revolution
by Edmund Burke

History will record, that on the morning of the 6th of October, 1789, the king and queen of France, after a day of confusion, alarm, dismay, and slaughter, lay down, under the pledged security of public faith, to indulge nature in a few hours of respite, and troubled, melancholy repose. From this sleep the queen was first startled by the voice of the sentinel at her door, who cried out to her to save herself by flight—that this was the last proof of fidelity he could give—that they were upon him, and he was dead. Instantly he was cut down. A band of cruel ruffians and assassins, reeking with his blood, rushed into the chamber of the queen, and pierced with a hundred strokes of bayonets and poniards the bed, from whence this persecuted woman had but just time to fly almost naked, and, through ways unknown to the murderers, had escaped to seek refuge at the feet of a king and husband, not secure of his own life for a moment.

This king, to say no more of him, and this queen, and their infant children, (who once would have been the pride and hope of a great and generous people,) were then forced to abandon the sanctuary of the most splendid palace in the world, which they left swimming in blood, polluted by massacre, and strewed with scattered limbs and mutilated carcases. Thence they were conducted into the capital of their kingdom. […]

It is now sixteen or seventeen years since I saw the queen of France, then the dauphiness, at Versailles; and surely never lighted on this orb, which she hardly seemed to touch, a more delightful vision. I saw her just above the horizon, decorating and cheering the elevated sphere she just began to move in,—glittering like the morning-star, full of life, and splendour, and joy. Oh! what a revolution! and what a heart must I have to contemplate without emotion that elevation and that fall! Little did I dream when she added titles of veneration to those of enthusiastic, distant, respectful love, that she should ever be obliged to carry the sharp antidote against disgrace concealed in that bosom; little did I dream that I should have lived to see such disasters fallen upon her in a nation of gallant men, in a nation of men of honour, and of cavaliers. I thought ten thousand swords must have leaped from their scabbards to avenge even a look that threatened her with insult. But the age of chivalry is gone. That of sophisters, economists, and calculators, has succeeded; and the glory of Europe is extinguished for ever. Never, never more shall we behold that generous loyalty to rank and sex, that proud submission, that dignified obedience, that subordination of the heart, which kept alive, even in servitude itself, the spirit of an exalted freedom. The unbought grace of life, the cheap defence of nations, the nurse of manly sentiment and heroic enterprise, is gone! It is gone, that sensibility of principle, that chastity of honour, which felt a stain like a wound, which inspired courage whilst it mitigated ferocity, which ennobled whatever it touched, and under which vice itself lost half its evil, by losing all its grossness.

This mixed system of opinion and sentiment had its origin in the ancient chivalry; and the principle, though varied in its appearance by the varying state of human affairs, subsisted and influenced through a long succession of generations, even to the time we live in. If it should ever be totally extinguished, the loss I fear will be great. It is this which has given its character to modern Europe. It is this which has distinguished it under all its forms of government, and distinguished it to its advantage, from the states of Asia, and possibly from those states which flourished in the most brilliant periods of the antique world. It was this, which, without confounding ranks, had produced a noble equality, and handed it down through all the gradations of social life. It was this opinion which mitigated kings into companions, and raised private men to be fellows with kings. Without force or opposition, it subdued the fierceness of pride and power; it obliged sovereigns to submit to the soft collar of social esteem, compelled stern authority to submit to elegance, and gave a domination, vanquisher of laws, to be subdued by manners.

But now all is to be changed. All the pleasing illusions, which made power gentle and obedience liberal, which harmonized the different shades of life, and which, by a bland assimilation, incorporated into politics the sentiments which beautify and soften private society, are to be dissolved by this new conquering empire of light and reason. All the decent drapery of life is to be rudely torn off. All the superadded ideas, furnished from the wardrobe of a moral imagination, which the heart owns, and the understanding ratifies, as necessary to cover the defects of our naked, shivering nature, and to raise it to dignity in our own estimation, are to be exploded as a ridiculous, absurd, and antiquated fashion.

On this scheme of things, a king is but a man, a queen is but a woman; a woman is but an animal, and an animal not of the highest order. All homage paid to the sex in general as such, and without distinct views, is to be regarded as romance and folly. Regicide, and parricide, and sacrilege, are but fictions of superstition, corrupting jurisprudence by destroying its simplicity. The murder of a king, or a queen, or a bishop, or a father, are only common homicide; and if the people are by any chance, or in any way, gainers by it, a sort of homicide much the most pardonable, and into which we ought not to make too severe a scrutiny.

On the scheme of this barbarous philosophy, which is the offspring of cold hearts and muddy understandings, and which is as void of solid wisdom as it is destitute of all taste and elegance, laws are to be supported only by their own terrors, and by the concern which each individual may find in them from his own private speculations, or can spare to them from his own private interests. In the groves of their academy, at the end of every vista, you see nothing but the gallows. Nothing is left which engages the affections on the part of the commonwealth. On the principles of this mechanic philosophy, our institutions can never be embodied, if I may use the expression, in persons; so as to create in us love, veneration, admiration, or attachment. But that sort of reason which banishes the affections is incapable of filling their place. These public affections, combined with manners, are required sometimes as supplements, sometimes as correctives, always as aids to law. The precept given by a wise man, as well as a great critic, for the construction of poems, is equally true as to states:—Non satis est pulchra esse poemata, dulcia sunto. There ought to be a system of manners in every nation, which a well-formed mind would be disposed to relish. To make us love our country, our country ought to be lovely.

* * *

Rights of Man:
Being an Answer to Mr. Burke’s Attack on the French Revolution
by Thomas Paine

But Mr. Burke appears to have no idea of principles when he is contemplating Governments. “Ten years ago,” says he, “I could have felicitated France on her having a Government, without inquiring what the nature of that Government was, or how it was administered.” Is this the language of a rational man? Is it the language of a heart feeling as it ought to feel for the rights and happiness of the human race? On this ground, Mr. Burke must compliment all the Governments in the world, while the victims who suffer under them, whether sold into slavery, or tortured out of existence, are wholly forgotten. It is power, and not principles, that Mr. Burke venerates; and under this abominable depravity he is disqualified to judge between them. Thus much for his opinion as to the occasions of the French Revolution. I now proceed to other considerations.

I know a place in America called Point-no-Point, because as you proceed along the shore, gay and flowery as Mr. Burke’s language, it continually recedes and presents itself at a distance before you; but when you have got as far as you can go, there is no point at all. Just thus it is with Mr. Burke’s three hundred and sixty-six pages. It is therefore difficult to reply to him. But as the points he wishes to establish may be inferred from what he abuses, it is in his paradoxes that we must look for his arguments.

As to the tragic paintings by which Mr. Burke has outraged his own imagination, and seeks to work upon that of his readers, they are very well calculated for theatrical representation, where facts are manufactured for the sake of show, and accommodated to produce, through the weakness of sympathy, a weeping effect. But Mr. Burke should recollect that he is writing history, and not plays, and that his readers will expect truth, and not the spouting rant of high-toned exclamation.

When we see a man dramatically lamenting in a publication intended to be believed that “The age of chivalry is gone! that The glory of Europe is extinguished for ever! that The unbought grace of life (if anyone knows what it is), the cheap defence of nations, the nurse of manly sentiment and heroic enterprise is gone!” and all this because the Quixot age of chivalry nonsense is gone, what opinion can we form of his judgment, or what regard can we pay to his facts? In the rhapsody of his imagination he has discovered a world of wind mills, and his sorrows are that there are no Quixots to attack them. But if the age of aristocracy, like that of chivalry, should fall (and they had originally some connection) Mr. Burke, the trumpeter of the Order, may continue his parody to the end, and finish with exclaiming: “Othello’s occupation’s gone!”

Notwithstanding Mr. Burke’s horrid paintings, when the French Revolution is compared with the Revolutions of other countries, the astonishment will be that it is marked with so few sacrifices; but this astonishment will cease when we reflect that principles, and not persons, were the meditated objects of destruction. The mind of the nation was acted upon by a higher stimulus than what the consideration of persons could inspire, and sought a higher conquest than could be produced by the downfall of an enemy. Among the few who fell there do not appear to be any that were intentionally singled out. They all of them had their fate in the circumstances of the moment, and were not pursued with that long, cold-blooded unabated revenge which pursued the unfortunate Scotch in the affair of 1745.

Through the whole of Mr. Burke’s book I do not observe that the Bastille is mentioned more than once, and that with a kind of implication as if he were sorry it was pulled down, and wished it were built up again. “We have rebuilt Newgate,” says he, “and tenanted the mansion; and we have prisons almost as strong as the Bastille for those who dare to libel the queens of France.” As to what a madman like the person called Lord George Gordon might say, and to whom Newgate is rather a bedlam than a prison, it is unworthy a rational consideration. It was a madman that libelled, and that is sufficient apology; and it afforded an opportunity for confining him, which was the thing that was wished for. But certain it is that Mr. Burke, who does not call himself a madman (whatever other people may do), has libelled in the most unprovoked manner, and in the grossest style of the most vulgar abuse, the whole representative authority of France, and yet Mr. Burke takes his seat in the British House of Commons! From his violence and his grief, his silence on some points and his excess on others, it is difficult not to believe that Mr. Burke is sorry, extremely sorry, that arbitrary power, the power of the Pope and the Bastille, are pulled down.

Not one glance of compassion, not one commiserating reflection that I can find throughout his book, has he bestowed on those who lingered out the most wretched of lives, a life without hope in the most miserable of prisons. It is painful to behold a man employing his talents to corrupt himself. Nature has been kinder to Mr. Burke than he is to her. He is not affected by the reality of distress touching his heart, but by the showy resemblance of it striking his imagination. He pities the plumage, but forgets the dying bird. Accustomed to kiss the aristocratical hand that hath purloined him from himself, he degenerates into a composition of art, and the genuine soul of nature forsakes him. His hero or his heroine must be a tragedy-victim expiring in show, and not the real prisoner of misery, sliding into death in the silence of a dungeon.

As Mr. Burke has passed over the whole transaction of the Bastille (and his silence is nothing in his favour), and has entertained his readers with reflections on supposed facts distorted into real falsehoods, I will give, since he has not, some account of the circumstances which preceded that transaction. They will serve to show that less mischief could scarcely have accompanied such an event when considered with the treacherous and hostile aggravations of the enemies of the Revolution.

The mind can hardly picture to itself a more tremendous scene than what the city of Paris exhibited at the time of taking the Bastille, and for two days before and after, nor perceive the possibility of its quieting so soon. At a distance this transaction has appeared only as an act of heroism standing on itself, and the close political connection it had with the Revolution is lost in the brilliancy of the achievement. But we are to consider it as the strength of the parties brought man to man, and contending for the issue. The Bastille was to be either the prize or the prison of the assailants. The downfall of it included the idea of the downfall of despotism, and this compounded image was become as figuratively united as Bunyan’s Doubting Castle and Giant Despair.

* * *

The Reactionary Mind
by Corey Robin
pp. 243-245

As Orwell taught, the possibilities for cruelty and violence are as limitless as the imagination that dreams them up. But the armies and agencies of today’s violence are vast bureaucracies, and vast bureaucracies need rules. Eliminating the rules does not Prometheus unbind; it just makes for more billable hours.

“No yielding. No equivocation. No lawyering this thing to death.” That was George W. Bush’s vow after 9/11 and his description of how the war on terror would be conducted. Like so many of Bush’s other declarations, it turned out to be an empty promise. This thing was lawyered to death. But, and this is the critical point, far from minimizing state violence—which was the great fear of the neocons—lawyering has proven to be perfectly compatible with violence. In a war already swollen with disappointment and disillusion, the realization that inevitably follows—the rule of law can, in fact, authorize the greatest adventures of violence and death, thereby draining them of sublimity—must be, for the conservative, the greatest disillusion of all.

Had they been closer readers of Burke, the neoconservatives—like Fukuyama, Roosevelt, Sorel, Schmitt, Tocqueville, Maistre, Treitschke, and so many more on the American and European right—could have seen this disillusion coming. Burke certainly did. Even as he wrote of the sublime effects of pain and danger, he was careful to insist that should those pains and dangers “press too nearly” or “too close”—that is, should they become realities rather than fantasies, should they become “conversant about the present destruction of the person”—their sublimity would disappear. They would cease to be “delightful” and restorative and become simply terrible.64 Burke’s point was not merely that no one, in the end, really wants to die or that no one enjoys unwelcome, excruciating pain. It was that sublimity of whatever kind and source depends upon obscurity: get too close to anything, whether an object or experience, see and feel its full extent, and it loses its mystery and aura. It becomes familiar. A “great clearness” of the sort that comes from direct experience “is in some sort an enemy to all enthusiasms whatsoever.”65 “It is our ignorance of things that causes all our admiration, and chiefly excites our passions. Knowledge and acquaintance make the most striking causes affect but little.”66 “A clear idea,” Burke concludes, “is therefore another name for a little idea.”67 Get to know anything, including violence, too well, and it loses whatever attribute—rejuvenation, transgression, excitement, awe—you ascribed to it when it was just an idea.

Earlier than most, Burke understood that if violence were to retain its sublimity, it had to remain a possibility, an object of fantasy—a horror movie, a video game, an essay on war. For the actuality (as opposed to the representation) of violence was at odds with the requirements of sublimity. Real, as opposed to imagined, violence entailed objects getting too close, bodies pressing too near, flesh upon flesh. Violence stripped the body of its veils; violence made its antagonists familiar to each other in a way they had never been before. Violence dispelled illusion and mystery, making things drab and dreary. That is why, in his discussion in the Reflections of the revolutionaries’ abduction of Marie Antoinette, Burke takes such pains to emphasize her “almost naked” body and turns so effortlessly to the language of clothing—“the decent drapery of life,” the “wardrobe of the moral imagination,” “antiquated fashion,” and so on—to describe the event.68 The disaster of the revolutionaries’ violence, for Burke, was not cruelty; it was the unsought enlightenment.

Since 9/11, many have complained, and rightly so, about the failure of conservatives—or their sons and daughters—to fight the war on terror themselves. For those on the left, that failure is symptomatic of the class injustice of contemporary America. But there is an additional element to the story. So long as the war on terror remains an idea—a hot topic on the blogs, a provocative op-ed, an episode of 24—it is sublime. As soon as the war on terror becomes a reality, it can be as cheerless as a discussion of the tax code and as tedious as a trip to the DMV.

Fear: The History of a Political Idea
by Corey Robin
Kindle Locations 402-406

It might seem strange that a book about political fear should assign so much space to our ideas about fear rather than to its practice. But recall what Burke said: It is not so much the actuality of a threat, but the imagined idea of that threat, that renews and restores. “If the pain and terror are so modified as not to be actually noxious; if the pain is not carried to violence, and the terror is not conversant about the present destruction of the person,” then, and only then, do we experience “a delightful horror.”1 The condition of our being renewed by fear is not that we directly experience the object that threatens us, but that the object be kept at some remove from ourselves.

Kindle Locations 1061-1066

Whether they have read The Spirit of the Laws or not, these writers are its children. With its trawling allusions to the febrile and the fervid, The Spirit of the Laws successfully aroused the conviction that terror was synonymous with barbarism, and that its cures were to be found entirely within liberalism. Thus was a new political and literary aesthetic born, a rhetoric of hyperbole suggesting that terror’s escorts were inevitably remoteness, irrationality, and darkness, and its enemies, familiarity, reason, and light. Perhaps it was this aesthetic that a young Edmund Burke had in mind when he wrote, two years after Montesquieu’s death, “To make any thing very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes.”

Kindle Locations 1608-1618

As she set about establishing a new political morality in the shadow of total terror, however, Arendt became aware of a problem that had plagued Hobbes, Montesquieu, and Tocqueville, and that Burke—not to mention makers of horror films—understood all too well: once terrors become familiar, they cease to arouse dread. The theorist who tries to establish fear as a foundation for a new politics must always find a demon darker than that of her predecessors, discover ever more novel, and more frightening, forms of fear. Thus Montesquieu, seeking to outdo Hobbes, imagined a form of terror that threatened the very basis of that which made us human. In Arendt’s case, it was her closing image of interchangeable victims and victimizers—of terror serving no interest and no party, not even its wielders; of a world ruled by no one and nothing, save the impersonal laws of motion—that yielded the necessary “radical evil” from which a new politics could emerge.

But as her friend and mentor Karl Jaspers was quick to recognize, Arendt had come upon this notion of radical evil at a terrible cost: it made moral judgment of the perpetrators of total terror nearly impossible.59 According to Origins, total terror rendered everyone—from Hitler down through the Jews, from Stalin to the kulaks—incapable of acting. Indeed, as Arendt admitted in 1963, “There exists a widespread theory, to which I also contributed [in Origins], that these crimes defy the possibility of human judgment and explode the frame of our legal institutions.”60 Total terror may have done what fear, terror, and anxiety did for her predecessors—found a new politics—but, as Arendt would come to realize in Eichmann in Jerusalem, it was a false foundation, inspiring an operatic sense of catastrophe, that ultimately let the perpetrators off the hook by obscuring the hard political realities of rule by fear.

Liberalism at Bay, Conservatism at Play:
Fear in the Contemporary Imagination

by Corey Robin

For theorists like Locke and Burke, fear is something to be cherished, not because it alerts us to real danger or propels us to take necessary action against it, but because fear is supposed to arouse a heightened state of experience. It quickens our perceptions as no other emotion can, forcing us to see and to act in the world in new and more interesting ways, with greater moral discrimination and a more acute consciousness of our surroundings and ourselves. According to Locke, fear is “an uneasiness of the mind” and “the chief, if not only spur to human industry and action is uneasiness.” Though we might think that men and women act on behalf of desire, Locke insisted that “a little burning felt”—like fear—”pushes us more powerfully than great pleasures in prospect draw or allure.” Burke had equally low regard for pleasure. It induces a grotesque implosion of self, a “soft tranquility” approximating an advanced state of decay if not death itself.

The head reclines something on one side; the eyelids are more closed than usual, and the eyes roll gently with an inclination to the object, the mouth is a little opened, and the breath drawn slowly, with now and then a low sigh; the whole body is composed, and the hands fall idly to the sides. All this is accompanied with an inward sense of melting and languor . . . relaxing the solids of the whole system.

But when we imagine the prospect of “pain and terror,” Burke added, we experience a “delightful horror,” the “strongest of all passions.” Without fear, we are passive; with it, we are roused to “the strongest emotion which the mind is capable of feeling” (Locke, 1959, II.20.6, 10; II.21.34: 304-5, 334; Burke, 1990: 32, 36, 123, 135-36).

At the political level, modern theorists have argued that fear is a spur to civic vitality and moral renewal, perhaps even a source of public freedom. Writing in the wake of the French Revolution, Tocqueville bemoaned the lethargy of modern democracy. With its free-wheeling antinomianism and social mobility, democratic society “inevitably enervates the soul, and relaxing the springs of the will, prepares a people for bondage. Then not only will they let their freedom be taken from them, but often they actually hand it over themselves” (Tocqueville, 1969: 444). Lacking confidence in the traditional truths of God and king, Tocqueville believed that democracies might find a renewed confidence in the experience of fear, which could activate and ground a commitment to public freedom. “Fear,” he wrote in a note to himself, “must be put to work on behalf of liberty,” or, as he put it in Democracy in America, “Let us, then, look forward to the future with that salutary fear which makes men keep watch and ward for freedom, and not with that flabby, idle terror which makes men’s hearts sink and enervates them” (cited in Lamberti, 1989: 229; Tocqueville, 1969: 702). Armed with fear, democracy would be fortified against not only external and domestic enemies but also the inner tendency, the native desire, to dissolve into the soupy indifference of which Burke spoke.

* * *

The Dark Beauty of Unheard-Of Horrors
by Thomas Ligotti

This is how it is when a mysterious force is embodied in a human body, or in any form that is too well fixed. And a mystery explained is one robbed of its power of emotion, dwindling into a parcel of information, a tissue of rules and statistics without meaning in themselves.

Of course, mystery actually requires a measure of the concrete if it is to be perceived at all; otherwise it is only a void, the void. The thinnest mixture of this mortar, I suppose, is contained in that most basic source of mystery—darkness. Very difficult to domesticate this phenomenon, to collar it and give a name to the fear it inspires. As a verse writer once said:

The blackness at the bottom of a well
May hold most any kind of hell.

The dark, indeed, is the phenomenon possessing the maximum of mystery, the one most resistant to the taming of the mind and most resonant with emotions and meanings of a highly complex and subtle type. It is also extremely abstract as a provenance for supernatural horror, an elusive prodigy whose potential for fear may slip through a writer’s fingers and right past even a sensitive reader of terror tales. Obviously it is problematic in a way that a solid pair of gleaming fangs at a victim’s neck is not. Hence, darkness itself is rarely used in a story as the central incarnation of the supernatural, though it often serves in a supporting role as an element of atmosphere, an extension of more concrete phenomena. The shadowy ambiance of a fictional locale almost always resolves itself into an apparition of substance, a threat with a name, if not a full-blown history. Darkness may also perform in a strictly symbolic capacity, representing the abyss at the core of any genuine tale of mystery and horror. But to draw a reader’s attention to this abyss, this unnameable hell of blackness, is usually sacrificed in favor of focusing on some tangible dread pressing against the body of everyday life. From these facts may be derived an ad hoc taxonomy for dividing supernatural stories into types, or rather a spectrum of types: on the one side, those that tend to emphasize the surface manifestations of a supernatural phenomenon; on the other, those that reach toward the dark core of mystery in its purest and most abstract condition. The former stories show us the bodies, big as life, of the demonic tribe of spooks, vampires, and other assorted bogeymen; the latter suggest to us the essence, far bigger than life, of that dark universal terror beyond naming which is the matrix for all other terrors. […]

Like Erich Zann’s “world of beauty,” Lovecraft’s “lay in some far cosmos of the imagination,” and like that of another artist, it is a “beauty that hath horror in it.”

The Conspiracy against the Human Race: A Contrivance of Horror
by Thomas Ligotti
pp. 41-42

As heretofore noted, consciousness may have assisted our species’ survival in the hard times of prehistory, but as it became ever more intense it evolved the potential to ruin everything if not securely muzzled. This is the problem: We must either outsmart consciousness or be thrown into its vortex of doleful factuality and suffer, as Zapffe termed it, a “dread of being”—not only of our own being but of being itself, the idea that the vacancy that might otherwise have obtained is occupied like a stall in a public lavatory of infinite dimensions, that there is a universe in which things like celestial bodies and human beings are roving about, that anything exists in the way it seems to exist, that we are part of all being until we stop being, if there is anything we may understand as being other than semblances or the appearance of semblances.

On the premise that consciousness must be obfuscated so that we might go on as we have all these years, Zapffe inferred that the sensible thing would be not to go on with the paradoxical nonsense of trying to inhibit our cardinal attribute as beings, since we can tolerate existence only if we believe—in accord with a complex of illusions, a legerdemain of duplicity—that we are not what we are: unreality on legs. As conscious beings, we must hold back that divulgement lest it break us with a sense of being things without significance or foundation, anatomies shackled to a landscape of unintelligible horrors. In plain language, we cannot live except as self-deceivers who must lie to ourselves about ourselves, as well as about our unwinnable situation in this world.

Accepting the preceding statements as containing some truth, or at least for the sake of moving on with the present narrative, it seems that we are zealots of Zapffe’s four plans for smothering consciousness: isolation (“Being alive is all right”), anchoring (“One Nation under God with Families, Morality, and Natural Birthrights for all”), distraction (“Better to kill time than kill oneself”), and sublimation (“I am writing a book titled The Conspiracy against the Human Race”). These practices make us organisms with a nimble intellect that can deceive themselves “for their own good.” Isolation, anchoring, distraction, and sublimation are among the wiles we use to keep ourselves from dispelling every illusion that keeps us up and running. Without this cognitive double-dealing, we would be exposed for what we are. It would be like looking into a mirror and for a moment seeing the skull inside our skin looking back at us with its sardonic smile. And beneath the skull—only blackness, nothing. A little piece of our world has been peeled back, and underneath is creaking desolation—a carnival where all the rides are moving but no patrons occupy the seats. We are missing from the world we have made for ourselves. Maybe if we could resolutely gaze wide-eyed at our lives we would come to know what we really are. But that would stop the showy attraction we are inclined to think will run forever.

p. 182

That we all deserve punishment by horror is as mystifying as it is undeniable. To be an accomplice, however involuntarily, in a reasonless non-reality is cause enough for the harshest sentencing. But we have been trained so well to accept the “order” of an unreal world that we do not rebel against it. How could we? Where pain and pleasure form a corrupt alliance against us, paradise and hell are merely different divisions in the same monstrous bureaucracy. And between these two poles exists everything we know or can ever know. It is not even possible to imagine a utopia, earthly or otherwise, that can stand up under the mildest criticism. But one must take into account the shocking fact that we live on a world that spins. After considering this truth, nothing should come as a surprise.

Still, on rare occasions we do overcome hopelessness or velleity and make mutinous demands to live in a real world, one that is at least episodically ordered to our advantage. But perhaps it is only a demon of some kind that moves us to such idle insubordination, the more so to aggravate our condition in the unreal. After all, is it not wondrous that we are allowed to be both witnesses and victims of the sepulchral pomp of wasting tissue? And one thing we know is real: horror. It is so real, in fact, that we cannot be sure it could not exist without us. Yes, it needs our imaginations and our consciousness, but it does not ask or require our consent to use them. Indeed, horror operates with complete autonomy. Generating ontological havoc, it is mephitic foam upon which our lives merely float. And, ultimately, we must face up to it: Horror is more real than we are.

p. 218

Without death—meaning without our consciousness of death—no story of supernatural horror would ever have been written, nor would any other artistic representation of human life have been created for that matter. It is always there, if only between the lines or brushstrokes, or conspicuously by its absence. It is a terrific stimulus to that which is at once one of our greatest weapons and greatest weaknesses—imagination. Our minds are always on the verge of exploding with thoughts and images as we ceaselessly pound the pavement of our world. Both our most exquisite cogitations and our worst cognitive drivel announce our primal torment: We cannot linger in the stillness of nature’s vacuity. And so we have imagination to beguile us. A misbegotten hatchling of consciousness, a birth defect of our species, imagination is often revered as a sign of vigor in our make-up. But it is really just a psychic overcompensation for our impotence as beings. Denied nature’s exemption from creativity, we are indentured servants of the imaginary until the hour of our death, when the final harassments of imagination will beset us.

* * *

The Horror of the Unreal
By Peter Bebergal

The TV show “The Walking Dead” is one long exercise in tension. But the zombies—the supposed centerpiece of the show’s horror—are not particularly frightening. Gross, to be sure, but also knowable, literal. You can see them coming from yards away. They are the product of science gone wrong, or of a virus, or of some other phenomenal cause. They can be destroyed with an arrow through the brain. More aberrations than genuine monsters, they lack the essential quality to truly terrify: an aspect of the unreal.

The horror writer Thomas Ligotti believes that even tales of virus-created zombies—and other essentially comprehensible creatures—can elicit what we might call, quoting the theologian Rudolf Otto, “the wholly other,” but it requires a deft hand. The best such stories “approach the realm of the supernatural,” he told me over e-mail, even if their monsters are entirely earthly. As an example, he pointed to “The Texas Chainsaw Massacre,” “wherein the brutality displayed is so deviant and strange it takes off into the uncanny.” Ligotti doesn’t require bloodthirsty villains to convey a sense of impending horror, though. “I tend to stipulate in my work that the world by its nature already exists in a state of doom rather than being in the process of doom.” […]

“Whether or not there is anything called the divine is neither here nor there,” Ligotti told me. “It’s irrelevant to our sense of what is beyond the veil.” Ligotti believes that fiction can put us in touch with that sense of things unseen, that it can create an encounter with—to quote Rudolf Otto again—the mysterium tremendum et fascinans, a state that combines terror and enchantment with the divine. In fact, Ligotti believes that “any so-called serious work of literature that doesn’t to some extent serve this function has failed.” It’s not a matter of genre, he says. He cites Raymond Chandler’s Philip Marlowe as a character who would go wherever the clues took him, no matter how deep into the heart of the “unknown.” “Chandler wanted his detective stories to invoke the sense of the ‘country behind the hill.’”

Because Ligotti has no interest in whether or not that world beyond actually exists, there is a tension, an unanswered question, in his work: Can we locate the source of this horror? His characters are often confronted by people or groups who worship something so alien that their rituals don’t conform to any identifiable modes of religious practice. Usually, they involve some form of sacrifice or other suggestion of violence. The implication seems to be that, even if there is meaning in the universe, that meaning is so foreign, so strange, that we could never understand it, and it could never make a difference in our lives. Any attempt to penetrate it will only lead to madness.

As a practical matter, Ligotti believes that the short story is the most potent means for conveying this idea. “A novel can’t consistently project what Poe called a ‘single effect,’” he explains. “It would be too wearing on the reader—too repetitious and dense, as would, for instance, a lengthy narrative poem written in the style of a lyric poem. A large part of supernatural novels must therefore be concerned with the mundane and not with a sense of what I’ll call ‘the invisible.’”

Trying to get Ligotti to explain what he means by the “invisible” is not easy. “I’m not able to see my stories as establishing or presuming the existence of a veil beyond which the characters in them are incapable of seeing. I simply don’t view them in this way.” But his characters, I insisted, suggest that we are all capable of seeing beyond the veil, though it’s impossible to tell if they are simply mad, or if they have indeed perceived something outside normal perception. I asked Ligotti if he saw a difference between these two states of consciousness. “The only interest I’ve taken in psychological aberrancy in fiction,” he answered, “has been as a vehicle of perceiving the derangement of creation.”

Thomas Ligotti: Dark Phenomenology and Abstract Horror
by S.C. Hickman

Ligotti makes a point that horror must stay ill-defined, that the monstrous must menace us from a distance, from the unknown; a non-knowledge, rather than a knowledge of the natural; it is the unnatural and invisible that affects us, not something we can reduce to some sociological, psychological, or political formation or representation, which only kills the mystery – taming it and pigeonholing it into some cultural gatekeeper’s caged obituary. […] The domesticated beast is no horror at all.

In the attic of the mind a lunatic family resides, a carnival world of aberrant thoughts and feelings – one that, if we did not lock it away in a conspiracy of silence, would freeze us in such terror and fright that we would become immobilized, unable to think, feel, or live except as zombies, mindlessly. So we isolate these demented creatures, keep them at bay. Then we anchor ourselves in artifice, accept substitutes, religious mythologies, secular philosophies, and anything else that will help us keep the monsters at bay. As Ligotti will say, we need our illusions – our metaphysical anchors and dreamscapes “that inebriate us with a sense of being official, authentic, and safe in our beds” (CHR, 31). Yet, when even these metaphysical ploys won’t stem the tide of those heinous monsters from within, we seek out distraction, entertainment: TV, sports, bars, dancing, friends, fishing, scuba diving, boating, car racing, horse riding… almost anything that will keep our mind empty of its dark secret, that will allow it to escape the burden of emotion – of fear, if even for a night or an afternoon of sheer mindless bliss. And, last, but not least, we seek out culture, sublimation – art, theatre, festivals, carnivals, painting, writing, books… we seek to let it all out, let it enter into that sphere of the tragic or comic, that realm where we can exorcize it, display it, pin it to the wall for all to see our fears and terrors on display not as they are but as we lift them up into art, shape them to our nightmare visions or dreamscapes of desire. As Ligotti tells it, we read literature or look at a painting, go to a theatre, etc. […]

Horror acts like a sigil, a diagram that invokes the powers within the darkness to arise, to unfold their mystery, to explain themselves; and, if not explain then at least to invade our equilibrium, our staid and comfortable world with their rage, their torment, their corruption. The best literary horror or weird tales never describe in detail the mystery, rather they invoke by hyperstitional invention: calling forth the forces out of darkness and the abstract, and allowing them to co-habit for a time the shared space – the vicarious bubble or interzone between the reader and narrative […]

This notion of the tension between the epistemic and ontic in abstract horror returns me to Nick Land’s short work Phyl-Undhu: Abstract Horror, Exterminator, in which the narrator tells us that what we fear, what terrorizes us, is not the seen – the known and definable – but rather the unseen and unknown, even “shapeless threat, ‘Outside’ only in the abstract sense (encompassing the negative immensity of everything that we cannot grasp). It could be anywhere, from our genes or ecological dynamics, to the hidden laws of technological evolution, or the hostile vastnesses between the stars. We know only that, in strict proportion to the vitality of the cosmos, the probability of its existence advances towards inevitability, and that for us it means supreme ill. Ontological density without identifiable form is abstract horror itself.” […]

Yet, as Lovecraft once suggested in one of his famous stories, “The Call of Cthulhu,” the “sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.” Here is the nub for Ligotti, the dividing line of those who continue to sleep in the illusory safety net of their cultural delusions […] Many will remember that the Anglo-American poet T. S. Eliot once suggested that “humankind cannot bear very much reality”. […]

For Ligotti the subjective reaction to the seemingly objective stimulus of the uncanny is the gaining of “dark knowledge” about the workings of individuals, […] This sense that the corruption works both ways, upon the victim and the perpetrator; that the world is now topsy-turvy and that the uncanny boundaries between victim and perpetrator are reversible and hazy, and not always obvious, is due to that subtle knowledge that each culture is circumscribed within its own black box of conceptuality. By that I mean that, as Eduardo Viveiros de Castro argues in his Cannibal Metaphysics, Amazonian and other Amerindian groups inhabit a radically different conceptual universe than ours—one in which nature and culture, human and nonhuman, subject and object are conceived in terms that reverse our own—and he presents the case for anthropology as the study of such “other” metaphysical schemes, and as the corresponding critique of the concepts imposed on them by the human sciences. […]

We’re in that position of moving either way: 1) literalizing our fantasies: building walls and barbed-wire fences against invading hordes of refugees, migrants, etc.; or, 2) of seeing through them, seeing the aesthetic and defensive use of art and social mechanisms to defend ourselves from the onslaught of our own daemonic nihilism and drives: our fears and terrors. […]

In our time we’ve forgotten this fact, and forgotten the art of laughter, to see the world through the lens of art or horror literature and know that this, too, is illusion: the aesthetic call to our emotions, to our fears and our terrors that allows that purge, that release that only great art can supply. Rather, in our time we’ve all become literalists of the imagination, so that apocalypse, rather than a pleasant channeling of our fears, has become an actual possibility and real manifestation in the world around us in wars, famines, racism, hatred, murder, mayhem… The problem we face is that we’ve targeted the external world of actual people and deemed them disposable as if they were the ravenous zombies and vampires of our contemporary globalist madness. We’ve turned the inside out, reversed what once existed within into a projected nightmare scenario and living hell in the real world not as fantasy but as daemonic threat and doom upon ourselves and others. Talking of contemporary horror films Ligotti remarks that the characters in these films “cannot be sure who is a “thing” and who is not, since those who are transmuted retain their former appearance, memories, and behaviors even after they have become, in their essence, uncanny monstrosities from another world.” (CHR, 92) This sense that we’ve allowed the immigrants (US) and refugees (US and EU) to enter into and become a part of the social body of our nations leads to this sense of the uncanny uncertainty that one cannot be sure who is the “thing” – is it us or them: a paranoiac nightmare world of ravening lunacy, indeed. Because our categories of normal/abnormal have broken down due to the absolute Other of other conceptual cultures who have other sets of Symbolic Orders and ideas, concepts, ideologies, religions, and Laws, etc., we are now in the predicament of mutating and transforming into an Other ourselves all across the globe. There is no safe haven, no place to hide or defend oneself against oneself. In this sense we’ve all – everyone on the planet – become as Ligotti states it, in “essence, uncanny monstrosities from another world”. (CHR, 92)

* * *

Trickster Makes This World
by Lewis Hyde
pp. 168-172

During the years I was writing this book, there was an intense national debate over the concern that government funds might be used to subsidize pornographic art. The particulars will undoubtedly change, but the debate is perennial. On the one side, we have those who presume to speak for the collective trying to preserve the coverings and silences that give social space its order. On the other side, we have the agents of change, time travelers who take the order itself to be mutable, who hope—to give it the most positive formulation—to preserve the sacred by finding ways to shift the structure of things as contingency demands. It is not immediately clear why this latter camp must so regularly turn to bodily and sexual display, but the context I am establishing here suggests that such display is necessary.

To explore why this might be the case, let me begin with the classic image from the Old Testament: Adam and Eve leaving the garden, having learned shame and therefore having covered their genitals and, in the old paintings, holding their hands over their faces as well. By these actions they inscribe their own bodies. The body happens to be a uniquely apt location for the inscription of shame, partly because the body itself seems to be the sense organ of shame (the feeling swamps us, we stutter and flush against our will), but also because the content of shame, what we feel ashamed of, typically seems indelible and fixed, with us as a sort of natural fact, the way the body is with us as a natural fact. “Shame is what you are, guilt is what you do,” goes an old saying. Guilt can be undone with acts of penance, but the feeling of shame sticks around like a birthmark or the smell of cigarettes.

I earlier connected the way we learn about shame to rules about speech and silence, and made the additional claim that those rules have an ordering function. Now, let us say that the rules give order to several things at once, not just to society but to the body and the psyche as well. When I say “several things at once” I mean that the rules imply the congruence of these three realms; the orderliness of one is the orderliness of the others. The organized body is a sign that we are organized psychologically and that we understand and accept the organization of the world around us. When Adam and Eve cover their genitals, they simultaneously begin to structure consciousness and to structure their primordial community. To make the temenos, a line is drawn on the earth and one thing cut from another; when Adam and Eve learn shame, they draw a line on their bodies, dividing them into zones like the zones of silence and speech—or, rather, not “like” those zones, but identified with them, for what one covers on the body one also consigns to silence.

[…] an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap.

Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman). This substituting of one thing for another is called metonymy in rhetoric, one of the many figures of thought, a trope or verbal turn. The construction of the trap of shame begins with this metonymic trick, a kind of bait and switch in which one’s changeable social place is figured in terms of an unchangeable part of the body. Then by various means the trick is made to blend invisibly into the landscape. To begin with, there are always larger stories going on— about women or race or a snake in a garden. The enchantment of those regularly repeated fables, along with the rules of silence at their edges, and the assertion that they are intuitively true— all these things secure the borders of the narrative and make it difficult to see the contingency of its figures of thought. Once the verbal tricks are invisible, the artifice of the social order becomes invisible as well, and begins to seem natural. As menstruation and skin color and the genitals are natural facts, so the social and psychological orders become natural facts.

In short, to make the trap of shame we inscribe the body as a sign of wider worlds, then erase the artifice of that signification so that the content of shame becomes simply the way things are, as any fool can see.

If this is how the trap is made, then escaping it must involve reversing at least some of these elements. In what might be called the “heavy-bodied” escape, one senses that there’s something to be changed but ends up trying to change the body itself, mutilating it, or even committing suicide […]

These are the beginnings of conscious struggle, but we have yet to meet the mind of the trickster—or if we have, it belongs to the trickster who tries to eat the reflected berries, who burns his own anus in anger, who has not learned to separate the bait from the hook. As we saw earlier, the pressures of experience produce from that somewhat witless character a more sophisticated trickster who can separate bait from hook, who knows that the sign of something is not the thing itself, and who is therefore a better escape artist with a much more playful relationship to the local stories. The heavy-bodied, literalizing attempt to escape from shame carries much of the trap with it—the link to the body, the silence, and so on. Inarticulately, it takes the sign for the thing itself, imagining racism inheres in the color of the skin. Wise to the tricks of language, the light-bodied escape from shame refuses the whole setup—refuses the metonymic shift, the enchantment of group story, and the rules of silence—and by these refusals it detaches the supposedly overlapping levels of inscription from one another so that the body, especially, need no longer stand as the mute, incarnate seal of social and psychological order. All this, but especially the speaking out where shame demands silence, depends largely on a consciousness that doesn’t feel much inhibition, and knows how traps are made, and knows how to subvert them.

This is the insight that comes to all boundary-crossers— immigrants in fact or immigrants in time— that meaning is contingent and identity fluid, even the meaning and identity of one’s own body.

It should by now be easier to see why there will always be art that uncovers the body, and artists who speak shamelessly, even obscenely. All social structures do well to anchor their rules of conduct in the seemingly simple inscription of the body, so that only after I have covered my privates am I allowed to show my face to the world and have a public life. The rules of bodily decorum usually imply that the cosmos depends on the shame we feel about our bodies. But sometimes the lesson is a lie, and a cunningly self-protecting one at that, for to question it requires self-exposure and loss of face, and who would want that? Well, trickster would, as would all those who find they cannot fashion a place for themselves in the world until they have spoken against collective silence. We certainly see this—not just the speaking out but the self-exposure—in Allen Ginsberg, and we see it a bit more subtly in both Kingston and Rodriguez. Neither of them is a “dirty writer” the way Ginsberg is, but to begin to speak, one of them must talk about menstruation (which talk she links to becoming the mistress of her own sexuality) and the other must talk about his skin (which talk he links to possessing his “maleness”).

To the degree that other orders are linked to the way the body is inscribed, and to the degree that the link is sealed by rules of silence, the first stuttering questioning of those orders must always begin by breaking the seal and speaking about the body. Where obscene speech has such roots it is worth defending, and those who would suppress it court a subtle but serious danger. They are like the gods who would bind Loki, for this suppression hobbles the imagination that copes with the shifting and contingent nature of things, and so invites apocalyptic change where something more playful would have sufficed. Better to let trickster steal the shame covers now and then. Better to let Coyote have a ride in the Sun-god’s lodge. Better to let Monkey come on your journey to the West.

* * *

“Disseminated Volition in the New Testament Gospels”
by Andrew Stehlik
The Jaynesian (Vol. 3, Issue 1)

It is well known that many words for inner spiritual motions and emotions are actually metaphors derived from primitive (outward) physiological observations. A brief reference to any good dictionary that includes etymology can corroborate this conclusion.

Julian Jaynes in The Origin of Consciousness in the Breakdown of the Bicameral Mind dedicated a whole chapter to this theme — looking forward through the Iliad (pp. 257–272). He concentrates on seven words: thumos, phrenes, noos, psyche, kradie, ker, and etor.

Julian Jaynes recognized that these and other similar body-based, physiological or anatomical metaphors (in almost any language) are actually more than simple linguistic metaphors and that they played an important role in the breakdown of bicameralism and the development of consciousness. Different forms of stress and anxiety trigger different physiological responses. Observations of these responses were used in naming and creating hypostases and metaphors useful in the terminology of introspection and the development of consciousness. […]

In the New Testament Gospels (therefore quite late in the historical process — the second half of the first century CE) I recently recognized an interesting phenomenon which could be part of this process, or, even better, a pathological deviation along this process.

Once in the gospel of Mark (9:42–48) and twice in the gospel of Matthew (5:27–30 and 18:6–10) Jesus is supposed to utter an almost identical saying. In this saying, individual parts of the body (eyes, hands, feet) are given the ability of independent volition. They can inform the acting of the whole person. The saying suggests, further, that when the influence (instructions, independent volition) of these body parts is perceived as dangerous or harmful, they should be silenced by cutting them off to protect the integrity of the rest of the body.

All academic theological literature known to me takes these sayings as high literary metaphors. Frequent references are made to biology and medicine, and to the use of amputation as a last resort in serious conditions.

Completely unrecognized is the whole presumption of this saying, according to which individual body parts could possess independent volition and as such can inform (sway/direct) the acting of the whole body. Even more seriously — the presumption that self-mutilation can stop or somehow influence higher mental processes. Even a person who is not a trained psychologist or psychiatrist can recognize that we are dealing with a seriously pathological state of mind. […]

Already at the time of its recording in the gospels, this saying was perceived as anomalous. Luke, the most educated and refined of the synoptic authors, preserved the immediate context but edited out most of the peculiar parts concerning disseminated volition and self-mutilation.

Further and broader contexts that may be mentioned and discussed: other Greek and Hebrew physiological and anatomical metaphors; the popularity of the metaphor of the body for the structuring and functioning of society in Hellenism; the ancient practice of religious self-mutilation; and the potential for facilitating our understanding of brutish penal codes or modern self-mutilation.

* * *

The Monstrous, the Impure, & the Imaginal
The Haunted Moral Imagination

Inconsistency of Burkean Conservatism
On Truth and Bullshit
Poised on a Knife Edge
“Why are you thinking about this?”

On Truth and Bullshit

One of the most salient features of our culture is that there is so much bullshit.

This is how Harry Frankfurt begins his essay, “On Bullshit”. He continues:

“Everyone knows this. Each of us contributes his share. But we tend to take the situation for granted. Most people are rather confident of their ability to recognize bullshit and to avoid being taken in by it. So the phenomenon has not aroused much deliberate concern, or attracted much sustained inquiry. In consequence, we have no clear understanding of what bullshit is, why there is so much of it, or what functions it serves. And we lack a conscientiously developed appreciation of what it means to us. In other words, we have no theory.”

So, what is this bullshit? Frankfurt goes through many definitions of related words. A main point is that bullshit falls “short of lying”, which leads him to insincerity. The bullshitter isn’t a liar, for the bullshitter isn’t concerned with either truth or its contrary. No intention to lie is required.

“Someone who lies and someone who tells the truth are playing on the opposite sides, so to speak, in the same game. Each responds to the facts . . . the response of the one is guided by the authority of the truth, while the response of the other defies that authority and refuses to meet its demands. The bullshitter ignores these demands altogether. He does not reject the authority of truth, as the liar does, and oppose himself to it. He pays no attention to it at all.”

Bullshitting is more of a creative act that dances around such concerns of verity:

“For the essence of bullshit is not that it is false but that it is phony. In order to appreciate this distinction, one must recognize that a fake or a phony need not be in any respect (apart from authenticity itself) inferior to the real thing. What is not genuine need not also be defective in some other way. It may be, after all, an exact copy. What is wrong with a counterfeit is not what it is like, but how it was made. This points to a similar and fundamental aspect of the essential nature of bullshit: although it is produced without concern with the truth, it need not be false. The bullshitter is faking things. But this does not mean that he necessarily gets them wrong.”

Bullshit is, first and foremost, insincere. In Frankfurt’s essay, that is some combination of an observation, premise, and conclusion. It is the core issue. But as with bullshit, what is this insincerity? How are we to judge it, from what perspective and according to what standard?

His answer seems to be that bullshit is to sincerity as a lie to the truth. This implies that the bullshitter knows they are insincere in the way the liar knows they are being untruthful. And as the bullshitter doesn’t care about truth, the liar doesn’t care about sincerity. This assumes that the intention of a speaker can be known, both to the presumed bullshitter and to the one perceiving (or accusing) them as a bullshitter. We know bullshit when we hear it, as we know porn when we see it.

After much analysis, the ultimate conclusion is that “sincerity itself is bullshit.” Bullshit is insincere and sincerity is bullshit. How clever! But there is a genuine point being made. Frankfurt’s ideal is that of truth, not sincerity. Truth and sincerity aren’t polar opposite ideals; they are separate worldviews and attitudes, so the argument goes.

Coming to the end of the essay, I immediately realized what this conflict was. It is an old conflict, going back at least to Socrates, though it was part of larger transcultural changes happening in the post-bicameral Axial Age. Socrates is simply the standard originating point for Western thought, the frame we prefer since Greece represents the earliest known example of a democracy (as a highly organized political system within an advanced civilization).

Socrates, as known through the writings of Plato, is often portrayed as the victim of democracy’s dark populism. The reality, though, is that Plato was severely anti-democratic, and Socrates’ circle included men behind the authoritarian forces that sought to destroy Athenian democracy. Socrates’ fellow Athenians didn’t take kindly to this treasonous threat, whether or not it was just and fair to blame him (we shall never know, since we lack the details of the accusation and evidence, as no official court proceedings are extant).

What we know, from Plato, is that Socrates had issues with the Sophists. So, who were these Sophists? It’s a far more interesting question than it first appears. It turns out that the word has a complicated history. It originally referred to poets, the original teachers of wisdom in archaic Greek society. And it should be recalled that the poets were specifically excluded from Plato’s utopian society because of the danger that, in Plato’s mind, they posed to rationalistic idealism.

What did the poets and Sophists have in common? They both used language to persuade, through language that was concrete rather than abstract, emotional rather than detached. Plato was interested in big ‘T’ absolute Truth, whereas those employing poetry and rhetoric were interested in small ‘t’ relative truths that were on a human scale. Ancient Greek poets and Sophists weren’t necessarily untruthful but simply indifferent to Platonic ideals of Truth.

This does relate to Frankfurt’s theory of bullshit. Small ‘t’ truths are bullshit, or at least easily seen in this light. The main example he uses demonstrates this. A friend of Ludwig Wittgenstein’s was sick, and she told him, “I feel just like a dog that has been run over.” Wittgenstein saw this as careless use of language, not even meaningful enough to be wrong. It was a human truth, instead of a philosophical Truth.

Her statement expressed a physical and emotional experience. One could even argue that Wittgenstein was wrong to claim that a human cannot know what a hurt dog feels like, as mammals have similar biology and neurology. Besides, as far as we know, this friend had a pet dog run over by a car and was speaking from a closer relationship to that dog than she had to Wittgenstein. Reading this account, Wittgenstein comes off as someone with severe Asperger’s, and indeed plenty of people have speculated elsewhere about this possible diagnosis. Whatever the case, his response was obtuse and callous.

It is hard to know what relevance such an anecdote might have in clarifying the meaning of bullshit. What it does make clear is that there are different kinds of truths.

This is what separated Socrates and Plato on one side and the poets and Sophists on the other. The Sophists had inherited a tradition of teaching from the poets and it was a tradition that became ever more important in the burgeoning democracy. But it was an era when the power of divine voice still clung to the human word. Persuasion was a power not to be underestimated, as the common person back then hadn’t yet developed the thick-boundaried intellectual defensiveness against rhetoric that we moderns take for granted. Plato sought a Truth that was beyond both petty humans and petty gods, a longing to get beyond all the ‘bullshit’.

Yet it might be noted that some even referred to Socrates and Plato as Sophists. They too used rhetoric to persuade. And of course, the Platonic is the foundation of modern religion (e.g., Neoplatonic Alexandrian Jews who helped shape early Christian theology and Biblical exegesis), the great opponent of the Enlightenment tradition of rationality.

This is why some, instead, prefer to emphasize the divergent strategies of Plato and Aristotle, the latter making its own accusations of bullshit against the former. From the Aristotelian view, Platonism is a belief system proclaiming truth while remaining willfully detached from reality. The Platonic concern with Truth, from this perspective, can seem rather meaningless, maybe so meaningless as not even to be false. The Sophists who opposed Socrates and Plato were at least interested in practical knowledge that applied to the real world of human society, dedicated as they were to teaching the skills necessary for a functioning democracy.

As a side note, the closest equivalent to the Sophists today is the liberal arts professor who hopes to instill a broad knowledge in each new generation of students. It’s quite telling that those on the political right are the most likely to make accusations of bullshit against the liberal arts tradition. A traditional university education was founded on philology, the study of languages, and the teaching of rhetoric was standard in education into the early 1900s. Modern Western civilization was built on the values of the Sophists: the ideal of a well-rounded education and the central importance of language, including the ability to speak well and persuasively, to defend an argument logically, and to make a case rhetorically. The Sophists saw that a democratic public had to be an educated public.

Socrates and Plato came from more of what we’d call an aristocratic tradition. They were an enlightened elite, born into wealth, luxury, and privilege. This put them in opposition to the emerging democratic market of ideas. The Sophists were seen as mercenary philosophers who would teach or do anything for money. Socrates didn’t accept money from his students, but then again he was independently wealthy (in that he didn’t have to work, because slaves did the work for him). He wanted pure philosophy, unadulterated by coarse human realities such as making a living and democratic politics.

It’s not that Socrates and Plato were necessarily wrong. The Sophists were a diverse bunch, some using their talents for the public good and others not so much. They were simply the well-educated members of the perceived meritocracy who offered their expertise in exchange for payment. It seems like a rather normal thing to do in a capitalist society such as ours, but back then a market system was a newfangled notion that seemed radically destabilizing to the social order. Socrates and Plato were basically the reactionaries of their day, nostalgically longing for what they imagined was being lost. Yet they were helping to create an entirely new society, wresting it from the control and authority of tradition. Plato offered a radical utopian vision precisely because he was a reactionary, in terms of how the reactionary is explained by Corey Robin.

Socrates and Plato were challenging the world they were born into. Like all reactionaries, they had no genuine interest in a conservative-minded defense of the status quo. It would take centuries for their influence to grow so large as to become a tradition of its own. Even then, they laid the groundwork for future radicalism during the Renaissance, Protestant Reformation, and Enlightenment Age. Platonic idealism is the seed of modern idealism. What was reactionary in classical Greece fed into a progressive impulse about two millennia later, the anti-democratic leading to the eventual return of democratization. The fight against ‘bullshit’ became the engine of change that overthrew the European ancien régime of Catholicism, feudalism, aristocracy, and monarchy. Utopian visions such as that of Plato’s Republic became increasingly common.

Thinking along these lines brought to mind a recent post of mine, Poised on a Knife Edge. I was once again considering the significance of the ‘great debate’ between Edmund Burke and Thomas Paine. It was Paine who was more the inheritor of Greek idealism, but unlike some of the early Greek idealists he was very much putting idealism in service of democracy, not some utopian vision above and beyond the messiness of public politics. It occurred to me that, to Paine and his allies, Burke’s attack on the French Revolution was ‘bullshit’. The wardrobe of the moral imagination was deemed rhetorical obfuscation, a refusal of the plain speech and plain honest truth favored by Paine (and by Socrates).

Let me explain why this matters. As I began reading Frankfurt’s “On Bullshit”, I was naturally pulled into the view presented. Pretty much everyone hates bullshit. But I considered a different possible explanation for this. Maybe bullshit isn’t more common than before. Maybe it’s even less common in some sense. It’s just that, in a society that idealizes truth, the category of bullshit represents something no longer respected or understood. We’ve lost touch with something within our own human nature. Our hyper-sensitivity in seeing bullshit everywhere, almost a paranoia, is an indication of this.

As much as I love Paine and his vision, I have to give credit where it is due by acknowledging that Burke managed to catch hold of a different kind of truth, a very human truth. He warned us about treading cautiously on the sacred ground of the moral imagination. On this point, I think he was right. We are too careless.

Frankfurt talks about the ‘bullshit artist’. Bullshitters are always artists. And maybe artists are always bullshitters. This is because the imagination, moral or otherwise, is the playground of the bullshitter, and because the artist, the master of imagination, is different from a craftsman. The artist always has a bit of the trickster about him, as he plays at the boundaries of the mind. Here is how Frankfurt explains it:

“Wittgenstein once said that the following bit of verse by Longfellow could serve him as a motto:

“In the elder days of art
Builders wrought with greatest care
Each minute and unseen part,
For the Gods are everywhere.

“The point of these lines is clear. In the old days, craftsmen did not cut corners. They worked carefully, and they took care with every aspect of their work. Every part of the product was considered, and each was designed and made to be exactly as it should be. These craftsmen did not relax their thoughtful self-discipline even with respect to features of their work which would ordinarily not be visible. Although no one would notice if those features were not quite right, the craftsmen would be bothered by their consciences. So nothing was swept under the rug. Or, one might perhaps also say, there was no bullshit.

“It does seem fitting to construe carelessly made, shoddy goods as in some way analogues of bullshit. But in what way? Is the resemblance that bullshit itself is invariably produced in a careless or self-indulgent manner, that it is never finely crafted, that in the making of it there is never the meticulously attentive concern with detail to which Longfellow alludes? Is the bullshitter by his very nature a mindless slob? Is his product necessarily messy or unrefined? The word shit does, to be sure, suggest this. Excrement is not designed or crafted at all; it is merely emitted, or dumped. It may have a more or less coherent shape, or it may not, but it is in any case certainly not wrought.

“The notion of carefully wrought bullshit involves, then, a certain inner strain. Thoughtful attention to detail requires discipline and objectivity. It entails accepting standards and limitations that forbid the indulgence of impulse or whim. It is this selflessness that, in connection with bullshit, strikes us as inapposite. But in fact it is not out of the question at all.”

This is logos vs mythos. In religious terms, it is the One True God who creates ex nihilo vs the demiurgic god of this world. And in Platonic terms, it is the ideal forms vs concrete substance, where the latter is a pale imitation of the former. As such, truth is unique whereas bullshit is endless. The philosopher and the poet represent opposing forces. To the philosopher, everything is either philosophically relevant or bullshit. But to the poet (and his kin), this misses the point and overlooks the essence of our humanity. Each side makes sense, according to the perspective of each side. And so each side is correct about what is wrong with the other side.

If all bullshit were eliminated and all further bullshit made impossible, what would be left of our humanity? Maybe our very definition of truth is dependent on bullshit, both as a contrast and an impetus. Without bullshit, we might no longer be able to imagine new truths. But such imagination, if not serving greater understanding, is of uncertain value and potentially dangerous to society. For good or ill, the philosopher, sometimes obtuse and detached, and the artist, sometimes full of bullshit, are the twin representatives of civilization as we know it.

* * *

“I had my tonsils out and was in the Evelyn Nursing Home feeling sorry for myself. Wittgenstein called.”
by Ann Althouse

Short of Lying
by Heinz Brandenburg

Bullshit as the Absence of Truthfulness
by Michael R. Kelly

Democracy is not a truth machine
by Thomas R. Wells

Our ability as individuals to get to true facts merely by considering different arguments is distinctly limited. If we only know of one account of the holocaust – what we were taught in school – we are likely to accept it. But whether it is true or false is a matter of luck rather than our intellectual capacities. Now it is reasonable to suppose that if we were exposed to a diversity of claims about the holocaust then our opinions on the subject would become more clearly our own, and our own responsibility. They would be the product of our own intellectual capacities and character instead of simply reflecting which society we happened to be born into. But so what? Holding sincere opinions about whether the holocaust happened is all very well and Millian, but it has no necessary relation to their truth. As Harry Frankfurt notes in his philosophical essay On Bullshit, sincerity is concerned with being true to oneself, not to the nature of the world: from the perspective of truth seeking, sincerity is bullshit.

Knowing this, we can have no faith that the popularity of certain factual claims among people as ordinary as ourselves is any guide to their truth. Democracy is no more equipped to evaluate facts than rational truths. We can all, of course, hold opinions about the civilisational significance of the holocaust and its status as a justification for the state of Israel, and debate them with others in democratic ways. Yet, when it comes to the facts, neither the sincerity with which individuals believe that ‘the holocaust’ is a myth nor the popularity of such beliefs can make them epistemically respectable. 90% of the population denying the holocaust is irrelevant to its truth status. And vice versa.

Rhetoric and Bullshit
by James Fredal

Frankfurt is also indebted (indirectly) to Plato: Phaedrus is as much about the bullshitter’s (Lysias’s or the non-lover’s) lack of concern for (or “love” for) the truth as is Frankfurt’s brief tome. From the perspective of Plato, Lysias’s speech in praise of the non-lover is just so much bullshit not simply because it is not true, but because Lysias is not concerned with telling the truth so much as he is with gaining the affection and attention of his audience: the beloved boy, the paying student or, more to the point, that lover of speeches, Phaedrus himself.

The non-lover described by Lysias in Phaedrus is best understood as Plato’s allegory for sophists who reject any “natural” truth and who remain committed to contradictory arguments as the practical consequence of their general agnosticism. For Lysias’s non-lover, language is not for telling the truth, because the truth is inaccessible: language is for finding and strengthening positions, for gaining advantage, and for exerting influence over others. Richard Weaver offers a similar reading of Phaedrus that sees the non-lover as representing an attitude toward language use (though for Weaver the non-lover is not a sophist, but a scientist).

Others interested in the bullshitter apply a different, more favorable lens. Daniel Mears, for example, draws on Chandra Mukerji’s study of bullshit among hitchhikers, and more generally on Erving Goffman’s study of self-presentation in the interaction order (for example, “Role Distance” and Interaction Rituals) to highlight bullshit as a form of impression management: what, as Mears notes, Suzanne Eggins and Diana Slade call a “framing device” for the “construction and maintenance of our social identities and social relationships” (qtd. in Mears 279). For Mears, bullshit is the deliberate (albeit playful) creation of possible but ultimately misleading impressions of self or reality, whether for expressive or instrumental reasons (4).

Like Frankfurt, Mears locates the source of bullshit in the speaker herself and her desire to craft a creditable self-image. But whereas Frankfurt sees bullshitting as a species of deception worse than lying (because at least liars have to know the truth, if only to lead us away from it, whereas bullshitters have no concern at all for the truth), Mears understands bullshit as a significant social phenomenon that serves several prosocial functions.[7] For Mears, we engage in bullshit for purposes of socialization and play, for self-exploration and self-expression, for the resolution of social tensions and cognitive dissonance, and for gaining an advantage in encounters.

Like Mukerji, Mears emphasizes the playful (though often nontrivial and highly consequential) quality of bullshit, much as the ancient sophists composed speeches as “play”: as exercises and exempla, for enjoyment, for display and impression management, and for study separate from the “real world” of politics and law.

Rhetoric Is Not Bullshit
by Davd J. Tietge
from Bullshit and Philosophy
Kindle Locations 3917-4003

The Truth about Postmodernism

One issue that helps obscure the universality of rhetoric, and thus promotes the pejorative use of ‘rhetoric’, is the popular tendency to oversimplify the “truth-lie” dichotomy. In The Liar’s Tale: A History of Falsehood, Jeremy Campbell reminds us not only that the reductionistic binary separating truth from falsity is in error, but also that the thoroughly unclear and inconsistent distinction between the true and the false has a long, rich cultural history.[180] Those doing much of the speaking in our own era, however, assume that the dividing line between truth and untruth is clear and, more significantly, internalized by the average human. Truth, however, is an elusive concept. While we can cite many examples of truths (that the sky is blue today, that the spoon will fall if dropped, and so forth), these depend on definitions of the words used. The sky is blue because ‘blue’ is the word we use to describe the hue that we have collectively agreed is bluish. We may, however, disagree about what shade of blue the sky is. Is it powder blue? Blue-green? Royal blue? Interpretive responses to external realities that rely on definition (and language generally) always complicate the true-false binary, especially when we begin to discuss the nature of abstractions involved in, say, religion or metaphysics. The truth of ‘God is good’ depends very heavily upon the speaker’s understanding of God and the nature of goodness, both of which depend upon the speaker’s conceptualization, which may be unique to him, his group, or his cultural environment, and thus neither clear nor truthful to other parties.

Is this rampant relativism? Some might think so, but it is perhaps more useful to suggest that the Absolute Truths that we usually embrace are unattainable because of these complexities of language. Some cultures have seen the linguistic limitations of specifying the Truth. Hinduism has long recognized that language is incapable of revealing Truth; to utter the Truth, it holds, is simultaneously to make it no longer the Truth.

Note here the distinction between capital ‘T’ truth and lower-case ‘t’ truth. Lower-case truths are situational, even personal. They often reflect more the state of mind of the agent making the utterance than the immutable nature of the truth. They are also temporally situated; what may be true now may not be in the future. Truth in this sense is predicated on both perception and stability, and, pragmatically speaking, such truths are transitional and, often, relative. Capital ‘T’ Truths can be traced back at least as far as Plato, and are immutable, pure, and incorruptible. They do not exist in our worldly realm, at least so far as Plato was concerned. This is why Plato was so scornful of rhetoric: he felt that rhetoricians (in particular, the Sophists) were opportunists who taught people how to disguise the Truth with language and persuasion. Whereas Plato imagined a realm in which the worldly flaws and corruption of a physical existence were supplanted by perfect forms, the corporeal domain of human activity was saturated with language, and therefore, could not be trusted to reveal Truth with any certainty.

Contemporary, postmodern interest in truth and meaning turns the tables on Plato and studies meaning and truth in this shifting, less certain domain of human activity. Campbell cites many thinkers from our philosophical past who helped inaugurate this development, but none is more important than Friedrich Nietzsche. For Nietzsche, humans have no “organ” for discerning Truth, but we do have a natural instinct for falsehood. “Truth,” as an abstraction taken from the subjectivity of normal human activities, was a manufactured fiction that we are not equipped to actually find. On the other hand, a natural aptitude for falsehood is an important survival mechanism for many species. Human beings have simply cultivated it in innovative, sophisticated ways. As the rhetorician George A. Kennedy has noted, “in daily life, many human speech acts are not consciously intentional; they are automatic reactions to situations, culturally (rather than genetically) imprinted in the brain or rising from the subconscious.”[181] Our propensity for appropriate (if not truthful) responses to situations is something nourished by an instinct to survive, interact, protect, and socialize. Civilization gives us as many new ways to do this as there are situations that require response.

This is why Nietzsche carefully distinguished Truth from a belief system that only professed to contain the Truth. Ken Gemes notes that Nietzsche co-ordinated the question of Truth around the pragmatics of survival,[182] an observation echoed by Kennedy, who provides examples of animals that deceive for self-preservation. Camouflage, for example, can be seen in plants and animals. Many birds imitate the calls of rival species to fool them to distraction and away from their nests or food sources. Deception, it seems, is common in nature. But Nietzsche took doctrinal Truth (note the “T”) to be one of the most insidious deceptions to occur in human culture, especially as it is articulated in religions. It is not a basic lie that is being promulgated, but rather a lie masquerading as the Truth and, according to Nietzsche, performing certain functions. Truth, that is, is a ritualized fiction, a condition manufactured for institutions and the individuals who control them to maintain their power.

Rhetoric and Bullshit

Truth, deception, control over others. This survey of rhetoric thus brings us close to the territory that Harry Frankfurt explores in On Bullshit. For Frankfurt, however, bullshit has little to do with these complexities about truth and Truth that rhetoric helps us identify. Indeed bullshit, for Frankfurt, has little to do with truth at all, insofar as it requires an indifference to truth. Does this mean, then, that language that is not bullshit has settled the matter of truth and has access to truth (or Truth)? Does this lead us to a dichotomy between truth and bullshit that is similar to the dichotomy between truth and falsity that postmodernism criticizes? It may seem that postmodernism has little place in Frankfurt’s view, insofar as he rejects “various forms of skepticism which deny that we have any reliable access to objective reality, and which therefore reject the possibility of knowing how things truly are” (p. 64). Indeed, postmodernism is often vilified as the poster child of relativism and skepticism.

Yet postmodernism is far subtler than a mere denial of “objective reality.” Postmodernism claims, rather, that reality is as much a construct of language as it is objective and unchanging. Postmodernism is less about rejecting beliefs about objective reality than about the intersection between material reality and the human interpretations of it that change, mutate, and shift that reality to our own purposes—the kind of small-t truths that Nietzsche addressed. The common complaint about postmodernism, for example, that it denies “natural laws,” forgets that humans noticed and formulated those laws. Postmodernism attempts to supply a vocabulary to describe this kind of process. It is not just “jargon,” as is so often charged; it is an effort to construct a metalinguistic lexicon for dealing with some very difficult and important epistemological questions.

And, not surprisingly, so is rhetoric. Constructing language that deals with the nature of language is a unique human problem. It is meta-cognition at its most complicated because it requires us to use the same apparatus to decode human texts that is contained in the texts themselves—that is, using words to talk about words, what Kenneth Burke referred to in The Rhetoric of Religion as “logology.”[183] In no other area of human thinking is this really the case. Most forms of intellectual exploration involve an extraneous phenomenon, event, agent, or object that requires us to bring language to bear upon it in order to observe, describe, classify, and draw conclusions about its nature, its behavior, or its effect. For example, scientific inquiry usually involves an event or a process in the material world that is separate from the instruments we use to describe it. Historical analysis deals with texts as a matter of disciplinary course, yet most historians rarely question the efficacy or the reliability of the language used to convey an event of the remote (or, for that matter, recent) past. Even linguistics, which uses a scientific model to describe language structure, deals little with meaning or textual analysis.

Law is one of the closest cousins of rhetoric. Words are very much a part of the ebb and flow of legal wrangling, and the attention given to meaning and interpretation is central. Yet, even here, there is little theoretical discussion about how words have meaning or how, based on such theory, that meaning can be variously interpreted. Law is more concerned with the fact that words can be interpreted differently and how different agents might interpret language in different ways. This is why legal documents are often so unreadable; in an attempt to control ambiguity, more words (and more words with specific, technical meanings) must be used so that multiple interpretations can be avoided. If theoretical discussions about how language generates meaning were entered into the equation, the law would be impossible to apply in any practical way. Yet, to understand legal intricacies, every law student should be exposed to rhetoric—not so they can better learn how to manipulate a jury or falsify an important document, but so they understand how tenuous and limited language actually is for dealing with ordinary situations. Moreover, nearly every disciplinary area of inquiry uses language, but only rhetoric (and its associated disciplines, especially philosophy of language and literary/cultural criticism, which have influenced the development of modern rhetoric considerably) analyzes language using a hermeneutical instrument designed to penetrate the words to examine their effects—desired or not—on the people who use them.

What, then, qualifies as “bullshit”? Certainly, as I hope I have shown, rhetoric and bullshit are hardly the same thing. They are not even distant cousins. When a student begins a paper with the sentence, “In today’s society, there are many things that people have different and similar opinions about,” it’s a pretty good guess that there is little of rhetorical value there. About the only conclusion a reader can draw is that the student is neither inspired nor able to hide this fact. This is the extent of the subtext, and it could conceivably qualify as bullshit. In this sense, Frankfurt’s characterization of bullshit as “unavoidable whenever circumstances require someone to talk without knowing what he is talking about” (p. 63) is a useful differentiation.

But aside from these rather artificial instances, if bullshit does occur at the rate Frankfurt suggests, we have an arduous task in separating the bullshit from more interesting and worthy rhetorical situations. We have all met people whom we know, almost from the moment of acquaintance, are full of bullshit. It is the salesman syndrome that some people just (naturally, it seems) possess. In one sense, then, poor rhetoric—a rhetoric of transparency or obviousness—can be construed as bullshit. For the person with salesman syndrome is certainly attempting to achieve identification with his audience; he may even be attempting to persuade others that he is upright or trustworthy. But he fails because his bullshit is apparent. He is a bad rhetorician in the sense that he fails to convince others that he should be taken seriously, that his words are worthy of attention and, possibly, action.

Bullshit is something we can all recognize. Rhetoric is not. My remedy for this situation is simple: learn rhetoric.


The Sociology of Intellectual Life
by Steve Fuller
pp. 147-8

Harry Frankfurt’s (2005) On Bullshit is the latest contribution to a long, distinguished, yet deeply problematic line of Western thought that has attempted to redeem the idea of intellectual integrity from the cynic’s suspicion that it is nothing but high-minded, self-serving prejudice. I say ‘problematic’ because while Plato’s unflattering portrayal of poets and sophists arguably marked the opening salvo in the philosophical war against bullshit, Plato availed himself of bullshit in promoting the ‘myth of the metals’ as a principle of social stratification in his Republic. This doublethink has not been lost on the neo-conservative followers of the great twentieth-century Platonist Leo Strauss. […]

The bullshit detector aims to convert an epistemic attitude into a moral virtue: reality can be known only by the right sort of person. This idea, while meeting with widespread approval by philosophers strongly tied to the classical tradition of Plato and Aristotle, is not lacking in dissenters. The line of dissent is best seen in the history of ‘rhetoric’, a word Plato coined to demonize Socrates’ dialectical opponents, the sophists. The sophists were prepared to teach anyone the art of winning arguments, provided you could pay the going rate. As a series of sophistic interlocutors tried to make clear to Socrates, possession of the skills required to secure the belief of your audience is the only knowledge you really need to have. Socrates famously attacked this claim on several fronts, which the subsequent history of philosophy has often conflated. In particular, Socrates’ doubts about the reliability of the sophists’ techniques have been run together with a more fundamental criticism: even granting the sophists their skills, they are based on a knowledge of human gullibility, not of reality itself.

Bullshit is sophistry under this charitable reading, which acknowledges that the truth may not be strong enough by itself to counteract an artfully presented claim that is not so much outright false as, in the British idiom, ‘economical with the truth’. In stressing the difference between bullshit and lies, Frankfurt clearly has this conception in mind, though he does sophistry a disservice by casting the bullshitter’s attitude toward the truth as ‘indifference’. On the contrary, the accomplished bullshitter must be a keen student of what people tend to regard as true, if only to cater to those tendencies so as to serve her own ends. What likely offends Frankfurt and other philosophers here is the idea that the truth is just one more tool to be manipulated for personal advantage. Conceptual frameworks are simply entertained and then discarded as their utility passes. The nature of the offence, I suspect, is the divine eye-view implicated in such an attitude – the very idea that one could treat in a detached fashion the terms in which people normally negotiate their relationship to reality. A bullshitter revealed becomes a god unmade.

pp. 152-3

The bullshit detector believes not only that there is a truth but also that her own access to it is sufficiently reliable and general to serve as a standard by which others may be held accountable. Protestants appeared prepared to accept the former but not the latter condition, which is why dissenters were encouraged – or perhaps ostracized – to establish their own ministries. The sophists appeared to deny the former and possibly the latter condition as well. Both Protestants and sophists are prime candidates for the spread of bullshit because they concede that we may normally address reality in terms it does not recognize – or at least do not require it to yield straight ‘yes-or-no’, ‘true-or-false’ answers. In that case, we must make up the difference between the obliqueness of our inquiries and the obtuseness of reality’s responses. That ‘difference’ is fairly seen as bullshit. When crystallized as a philosophy of mind or philosophy of language, this attitude is known as antirealism. Its opposite number, the background philosophy of bullshit detectors, is realism.

The difference in the spirit of the two philosophies is captured as follows: do you believe that everything you say and hear is bullshit unless you have some way of showing whether it is true or false; or rather, that everything said and heard is simply true or false, unless it is revealed to be bullshit? The former is the antirealist, the latter the realist response. Seen in those terms, we might say that the antirealist regards reality as inherently risky and always under construction (Caveat credor: ‘Let the believer beware!’) whereas the realist treats reality as, on the whole, stable and orderly – except for the reprobates who try to circumvent the system by producing bullshit. In this respect, On Bullshit may be usefully read as an ad hominem attack on antirealists. Frankfurt himself makes passing reference to this interpretation near the end of the essay (Frankfurt 2005: 64–65). Yet, he appears happy to promote the vulgar image of antirealism as intellectually, and perhaps morally, slipshod, instead of treating it as the philosophically honorable position that it is.

A case in point is Frankfurt’s presentation of Wittgenstein as one of history’s great bullshit detectors (Frankfurt 2005: 24–34). He offers a telling anecdote in which the Viennese philosopher objects to Fania Pascal’s self-description as having been ‘sick as a dog’. Wittgenstein reportedly told Pascal that she misused language by capitalizing on the hearer’s easy conflation of a literal falsehood with a genuine condition, which is made possible by the hearer’s default anthropocentric bias. Wittgenstein’s objection boils down to claiming that, outside clearly marked poetic contexts, our intellectual end never suffices alone to justify our linguistic means. Frankfurt treats this point as a timeless truth about how language structures reality. Yet, it would be quite easy, especially recalling that this ‘truth’ was uttered seventy years ago, to conclude that Wittgenstein’s irritation betrays a spectacular lack of imagination in the guise of scrupulousness.

Wittgenstein’s harsh judgement presupposes that humans lack any real access to canine psychology, which renders any appeal to dogs purely fanciful. For him, this lack of access is an established fact inscribed in a literal use of language, not an open question answers to which a figurative use of language might offer clues for further investigation. Nevertheless, scientists informed by the Neo-Darwinian synthesis – which was being forged just at the time of Wittgenstein’s pronouncement – have quite arguably narrowed the gap between the mental lives of humans and animals in research associated with ‘evolutionary psychology’. As this research makes more headway, what Wittgenstein confidently declared to be bullshit in his day may tomorrow appear as having been a prescient truth. But anyone holding such a fluid view of verifiability would derive scant comfort from either Wittgenstein or Frankfurt, who act as if English linguistic intuitions, circa 1935, should count indefinitely as demonstrable truths.

Some philosophers given to bullshit detection are so used to treating any Wittgensteinian utterance as a profundity that it never occurs to them that Wittgenstein may have been himself a grandmaster of bullshit. The great bullshit detectors whom I originally invoked, Nietzsche and Mencken, made themselves vulnerable to critics by speaking from their own self-authorizing standpoint, which supposedly afforded a clear vista for distinguishing bullshit from its opposite. Wittgenstein adopts the classic bullshitter’s technique of ventriloquism, speaking through the authority of someone or something else in order to be spared the full brunt of criticism.

I use ‘adopts’ advisedly, since the deliberateness of Wittgenstein’s rhetoric remains unclear. What was he trying to do: to speak modestly without ever having quite controlled his spontaneously haughty manner, or to exercise his self-regarding superiority as gently as possible so as not to frighten the benighted? Either way, Wittgenstein became – for a certain kind of philosopher – the standard-bearer of linguistic rectitude, where ‘language’ is treated as a proxy for reality itself. Of course, to the bullshitter, this description also fits someone whose strong personality cowed the impressionable into distrusting their own thought processes. As with most successful bullshit, the trick is revealed only after it has had the desired effect and the frame of reference has changed. Thus, Wittgenstein’s precious concern about Pascal’s account of her state of health should strike, at least some readers today, as akin to a priest’s fretting over a parishioner’s confession of impure thoughts. In each case, the latter is struck by something that lies outside the box in which the former continues to think.

If Wittgenstein was a bullshitter, how did he manage to take in professed enemies of bullshit like Frankfurt? One clue is that most bullshit is forward-looking, and Wittgenstein’s wasn’t. The bullshitter normally refers to things whose prima facie plausibility immunizes the hearer against checking their actual validity. The implication is that the proof is simply ‘out there’ waiting to be found. But is there really such proof? Here the bullshitter is in a race against time. A sufficient delay in checking sources has salvaged the competence and even promoted the prescience of many bullshitters. Such was the spirit of Paul Feyerabend’s (1975) notorious account of Galileo’s ‘discoveries’, which concluded that his Papal Inquisitors were originally justified in their scepticism, even though Galileo’s followers subsequently redeemed his epistemic promissory notes.

In contrast, Wittgenstein’s unique brand of bullshit was backward-looking, always reminding hearers and readers of something they should already know but had perhaps temporarily forgotten. Since Wittgenstein usually confronted his interlocutors with mundane examples, it was relatively easy to convey this impression. The trick lay in immediately shifting the context from the case at hand to what Oxford philosophers in the 1950s called a ‘paradigm case’ that was presented as a self-evident standard of usage against which to judge the case at hand. That Wittgenstein, a non-native speaker of English, impressed one or two generations of Britain’s philosophical elite with just this mode of argumentation remains the envy of the aspiring bullshitter. Ernest Gellner (1959), another émigré from the old Austro-Hungarian Empire, ended up ostracized from the British philosophical establishment for offering a cutting diagnosis of this phenomenon as it was unfolding. He suggested that Wittgenstein’s success testified to his ability to feed off British class anxiety, which was most clearly marked in language use. An academically sublimated form of such language-driven class anxiety remains in the discipline of sociolinguistics (Bernstein 1971–77).

Yet, after nearly a half-century, Gellner’s diagnosis is resisted, despite the palpable weakening of Wittgenstein’s posthumous grip on the philosophical imagination. One reason is that so many living philosophers still ride on Wittgenstein’s authority – if not his mannerisms – that to declare him a bullshitter would amount to career suicide. But a second reason is also operative, one that functions as an insurance policy against future debunkers. Wittgenstein is often portrayed, by himself and others, as mentally unbalanced. You might think that this would render his philosophical deliverances unreliable. On the contrary, Wittgenstein’s erratic disposition is offered as evidence for his spontaneously guileless nature – quite unlike the controlled and calculated character of bullshitters. Bullshit fails to stick to Wittgenstein because he is regarded as an idiot savant.

Democratic Republicanism in Early America

There was much debate and confusion around various terms in early America.

The word ‘democracy’ wasn’t used on a regular basis at the time of the American Revolution, even as the ideal of it was very much in the air. Instead, the word ‘republic’ was used by most people back then to refer to democracy. But some of the founding fathers, such as Thomas Paine, avoided such confusion and made it clear beyond any doubt by speaking directly of ‘democracy’. Thomas Jefferson, the author of the first founding document and third president, formed a political party with both ‘democratic’ and ‘republican’ in the name, demonstrating that no conflict was seen between the two terms.

The reason ‘democracy’ doesn’t come up in founding documents is that the word is too specific, although it gets alluded to when speaking of “the People”, since democracy is literally “people power”. Jefferson, in writing the Declaration of Independence, was particularly clever in avoiding most language that evoked meaning that was too ideologically singular and obvious (e.g., he effectively used rhetoric to avoid the divisive debates for and against belief in natural law). That is because the founding documents were meant to unite a diverse group of people with diverse opinions. Such a vague and ambiguous word as ‘republic’ could mean almost anything to anyone and so was an easy way to paper over disagreements and differing visions. If more specific language had been used, language that made absolutely clear what they were actually talking about, it would have led to endless conflict, dooming the American experiment from the start.

Yet it was obvious from pamphlets and letters that many American founders and revolutionaries wanted democracy, in whole or part, to the degree they had any understanding of it. Some preferred a civic democracy with some basic social democratic elements and civil rights, while others (mostly Anti-Federalists) pushed for more directly democratic forms of self-governance. The first American constitution, the Articles of Confederation, was clearly a democratic document with self-governance greatly emphasized. Even those who were wary of democracy and spoke out against it nonetheless regularly used democratic rhetoric (invoking democratic ideals, principles, and values), because democracy was a major reason why so many fought the revolution in the first place. If not for democracy, there was little justification for and relevance in starting a new country, beyond a self-serving power grab by a new ruling elite.

Unless we assume that a large number of those early Americans had democracy in mind, their speaking of a republic makes no sense. Still, it is a genuine possibility that at least some of them did not, as they weren’t always clear in their own minds about what they did and didn’t mean. To be technical (according to even the common understanding from the 1700s), a country either is a democratic republic or a non-democratic republic. The variety of non-democratic republics would include what today we’d call theocracy, fascism, communism, etc. It is a bit uncertain exactly what kind of republic various early Americans envisioned, but one thing is certain: there was immense overlap and conflation between democracy and republicanism in the early American mind. This was the battleground of the fight between Federalists and Anti-Federalists (or, to be more accurate, between pseudo-Federalists and real Federalists).

As a label, stating that something is a republic says nothing at all about what kind of government it is. All that it says is what a government isn’t: it isn’t a monarchy. Even so, there were those who argued for a republican monarchy with an elective king, a still more confused notion in which the king theoretically would serve the citizenry that democratically elected him. Even some of the Federalists talked about this possibility of a republic with elements of a monarchy, strange as it seems to modern Americans. This is what the Anti-Federalists worried about.

Projecting our modern ideological biases onto the past is the opposite of helpful. The earliest American democrats were, by definition, republicans. And most of the earliest American republicans were heavily influenced by democratic political philosophy, even when they denounced it while co-opting it. There was no way to avoid the democratic promise of the American Revolution and the founding documents. Without that promise, we Americans would still be British. That promise remains, yet unfulfilled. The seed of an ideal is hard to kill once planted.

Still, bright ideals cast dark shadows. And the reactionary authoritarianism of the counter-revolutionaries was a powerful force. It is an enemy we still fight. The revolution never ended.

* * *

Democracy Denied: The Untold Story
by Arthur D. Robbins
Kindle Locations 2862-2929

Fascism has been defined as “an authoritarian political ideology (generally tied to a mass movement) that considers individual and other societal interests inferior to the needs of the state, and seeks to forge a type of national unity, usually based on ethnic, religious, cultural, or racial attributes.”[130] If there is a significant difference between fascism thus defined and the society enunciated in Plato’s Republic,[131] in which the state is supreme and submission to a warrior class is the highest virtue, I fail to detect it.[132] What is noteworthy is that Plato’s Republic is probably the most widely known and widely read of political texts, certainly in the United States, and that the word “republic” has come to be associated with democracy and a wholesome and free way of life in which individual self-expression is a centerpiece.

To further appreciate the difficulty that exists in trying to attach specific meaning to the word “republic,” one need only consult the online encyclopedia Wikipedia.[133] There one will find a long list of republics divided by period and type. As of this writing, there are five listings by period (Antiquity, Middle Ages and Renaissance, Early Modern, 19th Century, and 20th Century and Later), encompassing 90 separate republics covered in Wikipedia. The list of republic types is broken down into eight categories (Unitary Republics, Federal Republics, Confederal Republics, Arab Republics, Islamic Republics, Democratic Republics, Socialist Republics, and People’s Republics), with a total of 226 entries. There is some overlap between the lists, but one is still left with roughly 300 republics—and roughly 300 ideas of what, exactly, constitutes a republic.

One might reasonably wonder what useful meaning the word “republic” can possibly have when applied in such diverse political contexts. The word—from “res publica,” an expression of Roman (i.e., Latin) origin—might indeed apply to the Roman Republic, but how can it have any meaning when applied to ancient Athens, which had a radically different form of government existing in roughly the same time frame, and where res publica would have no meaning whatsoever?

Let us recall what was going on in Rome in the time of the Republic. Defined as the period from the expulsion of the Etruscan kings (509 B.C.) until Julius Caesar’s elevation to dictator for life (44 B.C.),[134] the Roman Republic covered a span of close to five hundred years in which Rome was free of despotism. The title rex was forbidden. Anyone taking on kingly airs might be killed on sight. The state of affairs that prevailed during this period reflects the essence of the word “republic”: a condition—freedom from the tyranny of one-man rule—and not a form of government. In fact, The American Heritage College Dictionary offers the following as its first definition for republic: “A political order not headed by a monarch.”

[…] John Adams (1735–1826), second President of the United States and one of the prime movers behind the U.S. Constitution, wrote a three-volume study of government entitled Defence of the Constitutions of Government of the United States of America (published in 1787), in which he relies on the writings of Cicero as his guide in applying Roman principles to American government.[136] From Cicero he learned the importance of “mixed governments,”[137] that is, governments formed from a mixture of monarchy, aristocracy, and democracy. According to this line of reasoning, a republic is a non-monarchy in which there are monarchic, aristocratic, and democratic elements. For me, this is confusing. Why, if one had just shed blood in unburdening oneself of monarchy, with a full understanding of just how pernicious such a form of government can be, would one then think it wise or desirable to voluntarily incorporate some form of monarchy into one’s new “republican” government? If the word “republic” has any meaning at all, it means freedom from monarchy.

The problem with establishing a republic in the United States was that the word had no fixed meaning to the very people who were attempting to apply it. In Federalist No. 6, Alexander Hamilton says, “Sparta, Athens, Rome and Carthage were all republics” (F.P., No. 6, 57). Of the four mentioned, Rome is probably the only one that even partially qualifies according to Madison’s definition from Federalist No. 10 (noted earlier): “a government in which the scheme of representation takes place,” in which government is delegated “to a small number of citizens elected by the rest” (ibid., No. 10, 81–82).

Madison himself acknowledges that there is a “confounding of a republic with a democracy” and that people apply “to the former reasons drawn from the nature of the latter” (ibid., No. 14, 100). He later points out that were one trying to define “republic” based on existing examples, one would be at a loss to determine the common elements. He then goes on to contrast the governments of Holland, Venice, Poland, and England, all allegedly republics, concluding, “These examples … are nearly as dissimilar to each other as to a genuine republic” and show “the extreme inaccuracy with which the term has been used in political disquisitions” (ibid., No. 39, 241).

Thomas Paine offers a different viewpoint: “What is now called a republic, is not any particular form of government. It is wholly characteristical [sic] of the purport, matter, or object for which government ought to be instituted, and on which it is to be employed, res-publica, the public affairs or the public good” (Paine, 369) (italics in the original). In other words, as Paine sees it, “res-publica” describes the subject matter of government, not its form.

Given all the confusion about the most basic issues relating to the meaning of “republic,” what is one to do? Perhaps the wisest course would be to abandon the term altogether in discussions of government. Let us grant the word has important historical meaning and some rhetorical appeal. “Vive la République!” can certainly mean thank God we are free of the tyranny of one-man, hereditary rule. That surely is the sense the word had in early Rome, in the early days of the United States, and in some if not all of the French and Italian republics. Thus understood, “republic” refers to a condition—freedom from monarchy—not a form of government.

* * *

Roger Williams and American Democracy
US: Republic & Democracy (parts two and three)
Democracy: Rhetoric & Reality
Pursuit of Happiness and Consent of the Governed
The Radicalism of The Articles of Confederation
The Vague and Ambiguous US Constitution
Wickedness of Civilization & the Role of Government
Spirit of ’76
A Truly Free People
Nature’s God and American Radicalism
What and who is America?
Thomas Paine and the Promise of America
About The American Crisis No. III
Feeding Strays: Hazlitt on Malthus
Inconsistency of Burkean Conservatism
American Paternalism, Honor and Manhood
Revolutionary Class War: Paine & Washington
Paine, Dickinson and What Was Lost
Betrayal of Democracy by Counterrevolution
Revolutions: American and French (part two)
Failed Revolutions All Around
The Haunted Moral Imagination
“Europe, and not England, is the parent country of America.”
“…from every part of Europe.”

The Fight For Freedom Is the Fight To Exist: Independence and Interdependence
A Vast Experiment
America’s Heartland: Middle Colonies, Mid-Atlantic States and the Midwest
When the Ancient World Was Still a Living Memory