Essentialism On the Decline

Before getting to the topic of essentialism, let me take an indirect approach. In reading about paleolithic diets and traditional foods, a recurring theme is inflammation, specifically as it relates to the health of the gut-brain network and immune system.

The paradigm change this signifies is that seemingly separate diseases with different diagnostic labels often have underlying commonalities. They share overlapping sets of causal and contributing factors, biological processes, and symptoms. This is why simple dietary changes can have a profound effect on numerous health conditions. For some, the diseased state expresses as mood disorders, for others as autoimmune disorders, and for still others as something else entirely, but there are immense commonalities among them all. The differences have more to do with how dysbiosis and dysfunction happen to develop, where they take hold in the body, and so what symptoms are experienced.

Writing from a paleo diet perspective on treating both her patients and her own multiple sclerosis, Terry Wahls gets at this point in a straightforward manner (p. 47): “In a very real sense, we all have the same disease because all disease begins with broken, incorrect biochemistry and disordered communication within and between our cells. […] Inside, the distinction between these autoimmune diseases is, frankly, fairly arbitrary”. In How Emotions Are Made, Lisa Feldman Barrett wrote (Kindle Locations 3834-3850):

“Inflammation has been a game-changer for our understanding of mental illness. For many years, scientists and clinicians held a classical view of mental illnesses like chronic stress, chronic pain, anxiety, and depression. Each ailment was believed to have a biological fingerprint that distinguished it from all others. Researchers would ask essentialist questions that assume each disorder is distinct: “How does depression impact your body? How does emotion influence pain? Why do anxiety and depression frequently co-occur?” 9

“More recently, the dividing lines between these illnesses have been evaporating. People who are diagnosed with the same-named disorder may have greatly diverse symptoms— variation is the norm. At the same time, different disorders overlap: they share symptoms, they cause atrophy in the same brain regions, their sufferers exhibit low emotional granularity, and some of the same medications are prescribed as effective.

“As a result of these findings, researchers are moving away from a classical view of different illnesses with distinct essences. They instead focus on a set of common ingredients that leave people vulnerable to these various disorders, such as genetic factors, insomnia, and damage to the interoceptive network or key hubs in the brain (chapter 6). If these areas become damaged, the brain is in big trouble: depression, panic disorder, schizophrenia, autism, dyslexia, chronic pain, dementia, Parkinson’s disease, and attention deficit hyperactivity disorder are all associated with hub damage. 10

“My view is that some major illnesses considered distinct and “mental” are all rooted in a chronically unbalanced body budget and unbridled inflammation. We categorize and name them as different disorders, based on context, much like we categorize and name the same bodily changes as different emotions. If I’m correct, then questions like, “Why do anxiety and depression frequently co-occur?” are no longer mysteries because, like emotions, these illnesses do not have firm boundaries in nature.”

What jumped out at me was that the conventional view of disease is essentialist, and hence connected to the broader essentialism in biology and psychology. This is exemplified by genetic determinism, such as when it informs race realism. It’s easy for most well-informed people to dismiss race realists, but essentialism takes on much more insidious forms that are harder to detect and root out. When scientists claimed to find a gay gene, some gay men quickly took this genetic determinism as a defense against the fundamentalist view that homosexuality is a choice and a sin. It turned out that there was no gay gene (by the way, this incident demonstrated how, in reacting to reactionaries, even leftist activists can be drawn into the reactionary mind). Not only is there no gay gene, there are also no simple and absolute gender divisions at all — as I previously explained (Is the Tide Starting to Turn on Genetics and Culture?):

“Recent research has taken this even further in showing that neither sex nor gender is binary (1, 2, 3, 4, & 5), as genetics and its relationship to environment, epigenetics, and culture is more complex than was previously realized. It’s far from uncommon for people to carry genetics of both sexes, even multiple DNA. It has to do with diverse interlinking and overlapping causal relationships. We aren’t all that certain at this point what ultimately determines the precise process of conditions, factors, and influences in how and why any given gene expresses or not and how and why it expresses in a particular way.”

The attraction of essentialism is powerful. And as numerous cases show, the attraction can be found across the political spectrum, since it offers a seemingly strong defense by diverting attention away from other factors. Similar to the gay gene, many people defend neurodiversity as if some people are simply born a particular way, and therefore we can’t and shouldn’t seek to do anything to change or improve their condition, much less cure it or prevent it in future generations.

For example, those on the high-functioning end of the autism spectrum will occasionally defend their condition as a gift, an ability to think and perceive differently. That is fine as far as it goes, but from a scientific perspective we should still find it concerning that conditions like this are rising drastically and that the increase can’t be explained by greater rates of diagnosis alone. Whether or not one believes the world would be a better place with more people with autism, this shouldn’t be left as a fatalistic vision of an evolutionary leap, especially considering most on the autism spectrum aren’t high functioning — instead, we should try to understand why it is happening and what it means.

Researchers have found that there are prospective causes to be studied. Consider propionate, a substance discussed by Alanna Collen (10% Human, p. 83): “although propionate was an important compound in the body, it was also used as a preservative in bread products – the very foods many autistic children crave. To top it all off, clostridia species are known to produce propionate. In itself, propionate is not ‘bad’, but MacFabe began to wonder whether autistic children were getting an overdose.” This might explain why antibiotics helped many with autism, as they would have been knocking down the clostridia population that was boosting propionate. To emphasize this point, when rodents were injected with propionate, they exhibited the precise behaviors seen in autism and they too showed inflammation in the brain. The fact that autistics often have brain inflammation, an unhealthy condition, is strong evidence that autism shouldn’t be taken as mere neurodiversity (and, among autistics, the commonality of inflammation-related gut issues emphasizes this point).

There is no doubt that genetic determinism, like the belief in an eternal soul, can be comforting. We identify with our genes, as we inherit them and are born with them. But to speak of inflammation or propionate or whatever makes it seem like we are victims of externalities. And it means we aren’t isolated individuals to be blamed or to take credit for who we are. To return to Collen (pp. 88-89):

“In health, we like to think we are the products of our genes and experiences. Most of us credit our virtues to the hurdles we have jumped, the pits we have climbed out of, and the triumphs we have fought for. We see our underlying personalities as fixed entities – ‘I am just not a risk-taker’, or ‘I like things to be organised’ – as if these are a result of something intrinsic to us. Our achievements are down to determination, and our relationships reflect the strength of our characters. Or so we like to think.

“But what does it mean for free will and accomplishment, if we are not our own masters? What does it mean for human nature, and for our sense of self? The idea that Toxoplasma, or any other microbe inhabiting your body, might contribute to your feelings, decisions and actions, is quite bewildering. But if that’s not mind-bending enough for you, consider this: microbes are transmissible. Just as a cold virus or a bacterial throat infection can be passed from one person to another, so can the microbiota. The idea that the make-up of your microbial community might be influenced by the people you meet and the places you go lends new meaning to the idea of cultural mind-expansion. At its simplest, sharing food and toilets with other people could provide opportunity for microbial exchange, for better or worse. Whether it might be possible to pick up microbes that encourage entrepreneurship at a business school, or a thrill-seeking love of motorbiking at a race track, is anyone’s guess for now, but the idea of personality traits being passed from person to person truly is mind-expanding.”

This goes beyond the personal level, which makes the proposal all the more unsettling. Our respective societies, communities, etc. might be heavily influenced by environmental factors that we can’t see. A ton of research shows the tremendous impact of parasites, heavy metal toxins, food additives, farm chemicals, hormones, hormone mimics, hormone disruptors, etc. Entire regions might be shaped by even a single species of parasite, such as how higher rates of Toxoplasma gondii infection in New England are directly correlated with higher rates of neuroticism (see What do we inherit? And from whom? & Uncomfortable Questions About Ideology).

Essentialism, though still popular, has taken numerous major hits in recent years. It once was the dominant paradigm and went largely unquestioned. Consider how, early in the last century, respectable fields of study such as anthropology, linguistic relativism, and behaviorism suggested that humans were largely products of environmental and cultural factors. This was the original basis of the attack on racism and race realism. In linguistics, Noam Chomsky overturned this view by positing the essentialist belief that there must exist within the human brain a language module with a universal grammar, though such a module was never observed, much less proven. It was able to defeat and replace the non-essentialist theories because it was more satisfying to the WEIRD ideologies that were becoming a greater force in an increasingly WEIRD society.

Ever since Plato, Western civilization has been drawn toward the extremes of essentialism (as part of the larger Axial Age shift toward abstraction and idealism). Yet there has also long been a countervailing force (even among the ancients, non-essentialist interpretations were common; consider group identity: here, here, here, here, and here). It wasn’t predetermined that essentialism would be so victorious as to have nearly obliterated the memory of all alternatives. It fit the spirit of the times for this past century, but now the public mood is shifting again. It’s no accident that, as social democracy and socialism regains favor, environmentalist explanations are making a comeback. But this is merely the revival of a particular Western tradition of thought, a tradition that is centuries old.

I was reminded of this in reading Liberty in America’s Founding Moment by Howard Schwartz. It’s an interesting shift of gears, since Schwartz doesn’t write about anything related to biology, health, or science. But he does indirectly get at an environmentalist critique in his analysis of David Hume (1711-1776). I’ve mostly thought of Hume in terms of his bundle theory of self, possibly borrowed from Buddhism, which he might have learned about from Christian missionaries returning from the East. However he came to it, the bundle theory argued that there is no singular coherent self, contrary to a central tenet of traditional Christian theology. Still, heretical views of the self were hardly new — some detect a possible Western precursor of Humean bundle theory in the ideas of Baruch Spinoza (1632-1677).

Whatever its origins in Western thought, environmentalism has been challenging essentialism since the Enlightenment. And in the case of Hume, we find an early social constructionist view of society and politics, in which what motivates people isn’t some fixed essence. This puts a different spin on things, as Hume’s writings were widely read during the revolutionary era when the United States was founded. Thomas Jefferson, among others, was familiar with Hume and highly recommended his work. Hume represented the opposite position to John Locke. We are now returning to this old battle of ideas.

“…some deeper area of the being.”

Alec Nevala-Lee shares a passage from Colin Wilson’s Mysteries (see Magic and the art of will). It elicits many thoughts, but I want to focus on the two main related aspects: the self and the will.

The main thing Wilson is talking about is hyper-individualism — the falseness and superficiality, constraint and limitation of anxiety-driven ‘consciousness’, the conscious personality of the ego-self. This is what denies the bundled self and the extended self, the vaster sense of being that challenges the socio-psychological structure of the modern mind. We defend our thick boundaries with great care for fear of what might get in, but this locks us in a prison cell of our own making. In not allowing ourselves to be affected, we make ourselves ineffective or at best only partly effective toward paltry ends. It’s not only a matter of doing “something really well” for we don’t really know what we want to do, as we’ve become disconnected from deeper impulses and broader experience.

For about as long as I can remember, the notion of ‘free will’ has never made sense to me. It isn’t a philosophical disagreement. Rather, in my own experience and in my observation of others, it simply offers no compelling explanation or valid meaning, much less deep insight. It intuitively makes no sense, which is to say it can only make sense if we never carefully think about it with probing awareness and open-minded inquiry. To the degree there is a ‘will’ is to the degree it is inseparable from the self. That is to say the self never wills anything for the self is and can only be known through the process of willing, which is simply to say through impulse and action. We are what we do, but we never know why we do what we do. We are who we are and we don’t know how to be otherwise.

There is no way to step back from the self in order to objectively see and act upon the self. That would require yet another self. The attempt to impose a will upon the self would lead to an infinite regress of selves. That would be a pointless preoccupation, although as entertainments go it is popular these days. A more worthy activity and maybe a greater achievement is to stop trying to contain ourselves and instead to align with a greater sense of self. Will wills itself. And the only freedom that the will possesses is to be itself. That is what some might consider purpose or telos, one’s reason for being or rather one’s reason in being.

No freedom exists in isolation. To believe otherwise is a trap. The precise trap involved is addiction, which is the will driven by compulsion. After all, the addict is the ultimate individual, so disconnected within a repeating pattern of behavior as to be unable to affect or be affected. Complete autonomy is impotence. The only freedom is in relationship, both to the larger world and the larger sense of self. It is in the ‘other’ that we know ourselves. We can only be free in not trying to impose freedom, in not struggling to control and manipulate. True will, if we are to speak of such a thing, is the opposite of willfulness. We are only free to the extent we don’t think in the explicit terms of freedom. It is not a thought in the mind but a way of being in the world.

We know that the conscious will is connected to the narrow, conscious part of the personality. One of the paradoxes observed by [Pierre] Janet is that as the hysteric becomes increasingly obsessed with anxiety—and the need to exert his will—he also becomes increasingly ineffective. The narrower and more obsessive the consciousness, the weaker the will. Every one of us is familiar with the phenomenon. The more we become racked with anxiety to do something well, the more we are likely to botch it. It is [Viktor] Frankl’s “law of reversed effort.” If you want to do something really well, you have to get into the “right mood.” And the right mood involves a sense of relaxation, of feeling “wide open” instead of narrow and enclosed…

As William James remarked, we all have a lifelong habit of “inferiority to our full self.” We are all hysterics; it is the endemic disease of the human race, which clearly implies that, outside our “everyday personality,” there is a wider “self” that possesses greater powers than the everyday self. And this is not the Freudian subconscious. Like the “wider self” of Janet’s patients, it is as conscious as the “contracted self.” We are, in fact, partially aware of this “other self.” When a man “unwinds” by pouring himself a drink and kicking off his shoes, he is adopting an elementary method of relaxing into the other self. When an overworked housewife decides to buy herself a new hat, she is doing the same thing. But we seldom relax far enough; habit—and anxiety—are too strong…Magic is the art and science of using the will. Not the ordinary will of the contracted ego but the “true will” that seems to spring from some deeper area of the being.

Colin Wilson, Mysteries

The Madness of Reason

Commenting online brings one in contact with odd people. It is often merely irritating, but at times it can be fascinating to see all the strange ways humanity gets expressed.

I met a guy, Naj Ziad, on the Facebook page for a discussion group about Julian Jaynes’ book, The Origin of Consciousness in the Breakdown of the Bicameral Mind. He posted something expressing his obsession with logical coherence and consistency. We dialogued for quite a bit, including in a post on his own Facebook page, and he seemed like a nice enough guy. He came across as being genuine in his intentions and worldview, but there was simply something off about him.

It’s likely he has Asperger’s, of a high-IQ, high-functioning variety. Or it could be that he has some kind of personality disorder. Either way, my sense is that he is severely lacking in cognitive empathy, although I doubt he is deficient in affective empathy. He just doesn’t seem to get that other people can perceive and experience the world differently than he does, or that others even exist as separate entities apart from his own existence.

When I claimed that my worldview was simply different than his and that neither of our personal realities could be reduced to the other, he called me a dualist. I came to the conclusion that this guy was a solipsist, although he doesn’t identify that way. Solipsism was the only philosophy that made sense of his idiosyncratic ramblings, entirely logical ramblings I might add. He is obviously intelligent, clever, and reasonably well read. His verbal intelligence is particularly high.

In fact, he is so obsessed with his verbal intelligence that he has come to the conclusion that all of reality is language. Of course, he has his own private definition of language which asserts that language is everything. This leaves his argument as a tautology and he freely admitted this was the case, but he kept returning to his defense that his argument was logically consistent and coherent. Sure.

It was endlessly amusing. He really could not grasp that the way his mind operates isn’t how everyone’s mind operates, and so he couldn’t escape the hermetically-sealed reality tunnel of his own clever monkey mind. His worldview is so perfectly constructed and orderly that there isn’t a single crack to let in fresh air or a beam of light.

He was doing a wondrous impression of Spock in being entirely logical within his narrow psychological and ideological framework. He kept falling back on his being logical and on his use of idiosyncratic jargon. He defined his terms to fit his ideological worldview so completely that those terms had no meaning to him outside of it. It all made perfect sense within itself.

His life philosophy is a well-rehearsed script that he goes on repeating. It is an amazing thing to observe as an outsider, especially considering he stubbornly refused to acknowledge that anything could be outside of his own mind for he couldn’t imagine the world being different than his own mind. He wouldn’t let go of his beliefs about reality, like the monkey with his hand trapped in a jar because he won’t let go of the banana.

If this guy were just insane or a troll, I would dismiss him out of hand. But that isn’t the case. Obviously, he is neuroatypical, and I won’t hold that against anyone. And I freely admit that his ideological worldview is logically consistent and coherent, for whatever that is worth.

What made it so fascinating to my mind is that solipsism has always been a speculative philosophy, to be considered only as a thought experiment. It never occurred to me that there would be a highly intelligent and rational person who would seriously uphold it as an entire self-contained worldview and lifestyle. His arguments for it were equally fascinating and he had interesting thoughts and insights, some of which I even agreed with. He is a brilliant guy who, as sometimes happens, has gone a bit off the deep end.

He built an ideology that perfectly expresses and conforms to his highly unusual neurocognitive profile. And of course, in pointing this out to him, he dismisses it as ‘psychologizing’. His arguments are so perfectly patched together that he never refers to any factual evidence as support for his ideological commitments, as it is entirely unnecessary in his own mind. External facts in the external world, what he calls ‘scientism’, are as meaningless as others claiming to have existence independent of his own. From his perspective, there is only what he calls the ‘now’ and there can only be one ‘now’ to rule them all, which just so happens to coincide with his own ego-mind.

If you challenge him on any of this, he is highly articulate in defending why he is being entirely reasonable. Ah, the madness of reason!

* * *

On a personal note, I should make clear that I sympathize with this guy. I have my own psychological issues (depression, anxiety, thought disorder, strong introversion, etc) that can make me feel isolated and cause me to retreat further into myself. Along with a tendency to over-intellectualize everything, my psychological issues have at times led me to get lost in my own head.

I can even understand the attraction of solipsism and, as a thought experiment, I’ve entertained it. But somehow I’ve always known that I’m not ‘normal’, which is to say that others are not like me. I have never actually doubted that others not only exist but exist in a wide variety of differences. It hasn’t occurred to me to deny all otherness by reducing all others to my own psychological experience and ideological worldview. I’ve never quite been that lost in myself, although I could imagine how it might happen. There have been moments in my life where my mind could have gone off the deep end.

Yet my sympathy only goes so far. It is hard to sympathize with someone who refuses to acknowledge your independent existence as a unique human being with your own identity and views. There is an element of frustration in dealing with a solipsist, but in this case my fascination drew me in. Before I ascertained he was a solipsist, it was obvious something about him was highly unusual. I kept poking and prodding him until the shape of his worldview became apparent. At that point, my fascination ended. Any further engagement would have continued to go around in circles, which means watching this guy’s mind go around in circles like a dog chasing its own tail.

Of all the contortions the human mind can put itself into, solipsism has to be one of the greatest feats to accomplish. I have to give this guy credit where it’s due. Not many people could keep up such a mindset for long.

Hyperballad and Hyperobjects

Morton’s use of the term ‘hyperobjects’ was inspired by Björk’s 1996 single ‘Hyperballad’
(Wikipedia)

Björk
by Timothy Morton

Björk and I think that there is a major cultural shift going on around the world towards something beyond cynical reason and nihilism, as more and more it becomes impossible not to have consideration for nonhumans in everything we do. Hopefully this piece we made contributes to that somehow.

I was so lucky to be doing this while she was mixing her album with some of the nicest and most incredible musicians/producers I’ve ever met…great examples of this shift beyond cynical reason…

Here is something I think is so so amazing, the Subtle Abuse mix of “Hyperballad.” Car parts, bottles, cutlery–all the objects, right? Not to mention Björk’s body “slamming against those rocks.” It’s a veritable Latour Litany… And the haunting repetition…

Dark Ecological Chocolate
by Timothy Morton

This being-an-object is intimately related with the Kantian beauty experience, wherein I find experiential evidence without metaphysical positing that at least one other being exists. The Sadness is the attunement of coexistence stripped of its conceptual content. Since the rigid anthropocentric standard of taste with its refined distances has collapsed, it becomes at this level impossible to rebuild the distinction we lost in The Ethereal between being interested or concerned with (this painting, this polar bear) and being fascinated by… Being interested means I am in charge. Being fascinated means that something else is. Beauty starts to show the subscendent wiring under the board.

Take Björk. Her song “Hyperballad” is a classic example of what I’m trying to talk about here. She shows you the wiring under board of an emotion, the way a straightforward feeling like I love you is obviously not straightforward at all, so don’t write a love song like that, write one that says you’re sitting on top of this cliff, and you’re dropping bits and pieces of the edge like car parts, bottles and cutlery, all kinds of not-you nonhuman prosthetic bits that we take to be extensions of our totally integrated up to date shiny religious holistic selves, and then you picture throwing yourself off, and what would you look like—to the you who’s watching you still on the edge of the cliff—as you fell, and when you hit the bottom would you be alive or dead, would you look awake or asleep, would your eyes be closed, or open?

When you experience beauty you experience evidence in your inner space that at least one thing that isn’t you exists. An evanescent footprint in your inner space—you don’t need to prove that things are real by hitting them or eating them. A nonviolent coexisting without coercion. There is an undecidability between two entities—me and not-me, the thing. Beauty is sad because it is ungraspable; there is an elegiac quality to it. When we grasp it withdraws, like putting my hand into water. Yet it appears.

Beauty is virtual: I am unable to tell whether the beauty resides in me or in the thing—it is as if it were in the thing, but impossible to pin down there. The subjunctive, floating “as if” virtual reality of beauty is a little queasy—the thing emits a tractor beam in whose vortex I find myself; I veer towards it. The aesthetic dimension says something true about causality in a modern age: I can’t tell for sure what the causes and effects are without resorting to illegal metaphysical moves.[14] Something slightly sinister is afoot—there is a basic entanglement such that I can’t tell who or what started it.

Beauty is the givenness of data. A thing impinges on me before I can contain it or use it or think it. It is as if I hear the thing breathing right next to me. From the standpoint of agricultural white patriarchy, something slightly “evil” is happening: something already has a grip on us, and this is demonic insofar as it is “from elsewhere.” This “saturated” demonic proximity is the essential ingredient of ecological being and ecological awareness, not some Nature over yonder.[15]

Interdependence, which is ecology, is sad and contingent. Because of interdependence, when I’m nice to a bunny rabbit I’m not being nice to bunny rabbit parasites. Amazing violence would be required to try to fit a form over everything all at once. If you try then you basically undermine the bunnies and everything else into components of a machine, replaceable components whose only important aspect is their existence

“Lack of the historical sense is the traditional defect in all philosophers.”

Human, All Too Human: A Book for Free Spirits
by Friedrich Wilhelm Nietzsche

The Traditional Error of Philosophers.—All philosophers make the common mistake of taking contemporary man as their starting point and of trying, through an analysis of him, to reach a conclusion. “Man” involuntarily presents himself to them as an aeterna veritas as a passive element in every hurly-burly, as a fixed standard of things. Yet everything uttered by the philosopher on the subject of man is, in the last resort, nothing more than a piece of testimony concerning man during a very limited period of time. Lack of the historical sense is the traditional defect in all philosophers. Many innocently take man in his most childish state as fashioned through the influence of certain religious and even of certain political developments, as the permanent form under which man must be viewed. They will not learn that man has evolved,4 that the intellectual faculty itself is an evolution, whereas some philosophers make the whole cosmos out of this intellectual faculty. But everything essential in human evolution took place aeons ago, long before the four thousand years or so of which we know anything: during these man may not have changed very much. However, the philosopher ascribes “instinct” to contemporary man and assumes that this is one of the unalterable facts regarding man himself, and hence affords a clue to the understanding of the universe in general. The whole teleology is so planned that man during the last four thousand years shall be spoken of as a being existing from all eternity, and with reference to whom everything in the cosmos from its very inception is naturally ordered. Yet everything evolved: there are no eternal facts as there are no absolute truths. Accordingly, historical philosophising is henceforth indispensable, and with it honesty of judgment.

What Locke Lacked
by Louise Mabille

Locke is indeed a Colossus of modernity, but one whose twin projects of providing a concept of human understanding and political foundation undermine each other. The specificity of the experience of perception alone undermines the universality and uniformity necessary to create the subject required for a justifiable liberalism. Since mere physical perspective can generate so much difference, it is only to be expected that political differences would be even more glaring. However, no political order would ever come to pass without obliterating essential differences. The birth of liberalism was as violent as the Empire that would later be justified in its name, even if its political traces are not so obvious. To interpret is to see in a particular way, at the expense of all other possibilities of interpretation. Perspectives that do not fit are simply ignored, or as that other great resurrectionist of modernity, Freud, would concur, simply driven underground. We ourselves are the source of this interpretative injustice, or more correctly, our need for a world in which it is possible to live, is. To a certain extent, then, man is the measure of the world, but only his world. Man is thus a contingent measure and our measurements do not refer to an original, underlying reality. What we call reality is the result not only of our limited perspectives upon the world, but the interplay of those perspectives themselves. The liberal subject is thus a result of, and not a foundation for, the experience of reality. The subject is identified as origin of meaning only through a process of differentiation and reduction, a course through which the will is designated as a psychological property.

Locke takes the existence of the subject of free will – free to exercise political choice such as rising against a tyrant, choosing representatives, or deciding upon political direction – simply for granted. Furthermore, he seems to think that everyone should agree as to what the rules are according to which these events should happen. For him, the liberal subject underlying these choices is clearly fundamental and universal.

Locke’s philosophy of individualism posits the existence of a discreet and isolated individual, with private interests and rights, independent of his linguistic or socio-historical context. C. B. MacPhearson identifies a distinctly possessive quality to Locke’s individualist ethic, notably in the way in which the individual is conceived as proprietor of his own personhood, possessing capacities such as self-reflection and free will. Freedom becomes associated with possession, which the Greeks would associate with slavery, and society conceived in terms of a collection of free and equal individuals who are related to each through their means of achieving material success – which Nietzsche, too, would associate with slave morality.  […]

There is a central tenet to John Locke’s thinking that, as conventional as it has become, remains a strange strategy. Like Thomas Hobbes, he justifies modern society by contrasting it with an original state of nature. For Hobbes, as we have seen, the state of nature is but a hypothesis, a conceptual tool in order to elucidate a point. For Locke, however, the state of nature is a very real historical event, although not a condition of a state of war. Man was social by nature, rational and free. Locke drew this inspiration from Richard Hooker’s Laws of Ecclesiastical Polity, notably from his idea that church government should be based upon human nature, and not the Bible, which, according to Hooker, told us nothing about human nature. The social contract is a means to escape from nature, friendlier though it be on the Lockean account. For Nietzsche, however, we have never made the escape: we are still holus-bolus in it: ‘being conscious is in no decisive sense the opposite of the instinctive – most of the philosopher’s conscious thinking is secretly directed and compelled into definite channels by his instincts. Behind all logic too, and its apparent autonomy there stand evaluations’ (BGE, 3). Locke makes a singular mistake in thinking the state of nature a distant event. In fact, Nietzsche tells us, we have never left it. We now only wield more sophisticated weapons, such as the guilty conscience […]

Truth originates when humans forget that they are ‘artistically creating subjects’ or products of law or stasis and begin to attach ‘invincible faith’ to their perceptions, thereby creating truth itself. For Nietzsche, the key to understanding the ethic of the concept, the ethic of representation, is conviction […]

Few convictions have proven to be as strong as the conviction of the existence of a fundamental subjectivity. For Nietzsche, it is an illusion, a bundle of drives loosely collected under the name of ‘subject’ —indeed, it is nothing but these drives, willing, and actions in themselves—and it cannot appear as anything else except through the seduction of language (and the fundamental errors of reason petrified in it), which understands and misunderstands all action as conditioned by something which causes actions, by a ‘Subject’ (GM I 13). Subjectivity is a form of linguistic reductionism, and when using language, ‘[w]e enter a realm of crude fetishism when we summon before consciousness the basic presuppositions of the metaphysics of language — in plain talk, the presuppositions of reason. Everywhere reason sees a doer and doing; it believes in will as the cause; it believes in the ego, in the ego as being, in the ego as substance, and it projects this faith in the ego-substance upon all things — only thereby does it first create the concept of ‘thing’ (TI, ‘Reason in Philosophy’ 5). As Nietzsche also states in WP 484, the habit of adding a doer to a deed is a Cartesian leftover that begs more questions than it solves. It is indeed nothing more than an inference according to habit: ‘There is activity, every activity requires an agent, consequently – (BGE, 17). Locke himself found the continuous existence of the self problematic, but did not go as far as Hume’s dissolution of the self into a number of ‘bundles’. After all, even if identity shifts occurred behind the scenes, he required a subject with enough unity to be able to enter into the Social Contract. This subject had to be something more than merely an ‘eternal grammatical blunder’ (D, 120), and willing had to be understood as something simple. For Nietzsche, it is ‘above all complicated, something that is a unit only as a word, a word in which the popular prejudice lurks, which has defeated the always inadequate caution of philosophers’ (BGE, 19).

Nietzsche’s critique of past philosophers
by Michael Lacewing

Nietzsche is questioning the very foundations of philosophy. To accept his claims means being a new kind of philosopher, one whose ‘taste and inclination’, whose values, are quite different. Throughout his philosophy, Nietzsche is concerned with origins, both psychological and historical. Much of philosophy is usually thought of as an a priori investigation. But if Nietzsche can show, as he thinks he can, that philosophical theories and arguments have a specific historical basis, then they are not, in fact, a priori. What is known a priori should not change from one historical era to the next, nor should it depend on someone’s psychology. Plato’s aim, the aim that defines much of philosophy, is to be able to give complete definitions of ideas – ‘what is justice?’, ‘what is knowledge?’. For Plato, we understand an idea when we have direct knowledge of the Form, which is unchanging and has no history. If our ideas have a history, then the philosophical project of trying to give definitions of our concepts, rather than histories, is radically mistaken. For example, in §186, Nietzsche argues that philosophers have consulted their ‘intuitions’ to try to justify this or that moral principle. But they have only been aware of their own morality, of which their ‘justifications’ are in fact only expressions. Morality and moral intuitions have a history, and are not a priori. There is no one definition of justice or good, and the ‘intuitions’ that we use to defend this or that theory are themselves as historical, as contentious as the theories we give – so they offer no real support. The usual ways philosophers discuss morality misunderstands morality from the very outset. The real issues of understanding morality only emerge when we look at the relation between this particular morality and that. There is no world of unchanging ideas, no truths beyond the truths of the world we experience, nothing that stands outside or beyond nature and history.

GENEALOGY AND PHILOSOPHY

Nietzsche develops a new way of philosophizing, which he calls a ‘morphology and evolutionary theory’ (§23), and later calls ‘genealogy’. (‘Morphology’ means the study of the forms something, e.g. morality, can take; ‘genealogy’ means the historical line of descent traced from an ancestor.) He aims to locate the historical origin of philosophical and religious ideas and show how they have changed over time to the present day. His investigation brings together history, psychology, the interpretation of concepts, and a keen sense of what it is like to live with particular ideas and values. In order to best understand which of our ideas and values are particular to us, not a priori or universal, we need to look at real alternatives. In order to understand these alternatives, we need to understand the psychology of the people who lived with them. And so Nietzsche argues that traditional ways of doing philosophy fail – our intuitions are not a reliable guide to the ‘truth’, to the ‘real’ nature of this or that idea or value. And not just our intuitions, but the arguments, and style of arguing, that philosophers have used are unreliable. Philosophy needs to become, or be informed by, genealogy. A lack of any historical sense, says Nietzsche, is the ‘hereditary defect’ of all philosophers.

MOTIVATIONAL ANALYSIS

Having long kept a strict eye on the philosophers, and having looked between their lines, I say to myself… most of a philosopher’s conscious thinking is secretly guided and channelled into particular tracks by his instincts. Behind all logic, too, and its apparent tyranny of movement there are value judgements, or to speak more clearly, physiological demands for the preservation of a particular kind of life. (§3) A person’s theoretical beliefs are best explained, Nietzsche thinks, by evaluative beliefs, particular interpretations of certain values, e.g. that goodness is this and the opposite of badness. These values are best explained as ‘physiological demands for the preservation of a particular kind of life’. Nietzsche holds that each person has a particular psychophysical constitution, formed by both heredity and culture. […] Different values, and different interpretations of these values, support different ways of life, and so people are instinctively drawn to particular values and ways of understanding them. On the basis of these interpretations of values, people come to hold particular philosophical views. §2 has given us an illustration of this: philosophers come to hold metaphysical beliefs about a transcendent world, the ‘true’ and ‘good’ world, because they cannot believe that truth and goodness could originate in the world of normal experience, which is full of illusion, error, and selfishness. Therefore, there ‘must’ be a pure, spiritual world and a spiritual part of human beings, which is the origin of truth and goodness. Philosophy and values But ‘must’ there be a transcendent world? Or is this just what the philosopher wants to be true? Every great philosophy, claims Nietzsche, is ‘the personal confession of its author’ (§6). The moral aims of a philosophy are the ‘seed’ from which the whole theory grows. Philosophers pretend that their opinions have been reached by ‘cold, pure, divinely unhampered dialectic’ when in fact, they are seeking reasons to support their pre-existing commitment to ‘a rarefied and abstract version of their heart’s desire’ (§5), viz. that there is a transcendent world, and that good and bad, true and false are opposites. Consider: Many philosophical systems are of doubtful coherence, e.g. how could there be Forms, and if there were, how could we know about them? Or again, in §11, Nietzsche asks ‘how are synthetic a priori judgments possible?’. The term ‘synthetic a priori’ was invented by Kant. According to Nietzsche, Kant says that such judgments are possible, because we have a ‘faculty’ that makes them possible. What kind of answer is this?? Furthermore, no philosopher has ever been proved right (§25). Given the great difficulty of believing either in a transcendent world or in human cognitive abilities necessary to know about it, we should look elsewhere for an explanation of why someone would hold those beliefs. We can find an answer in their values. There is an interesting structural similarity between Nietzsche’s argument and Hume’s. Both argue that there is no rational explanation of many of our beliefs, and so they try to find the source of these beliefs outside or beyond reason. Hume appeals to imagination and the principle of ‘Custom’. Nietzsche appeals instead to motivation and ‘the bewitchment of language’ (see below). So Nietzsche argues that philosophy is not driven by a pure ‘will to truth’ (§1), to discover the truth whatever it may be. Instead, a philosophy interprets the world in terms of the philosopher’s values. 
For example, the Stoics argued that we should live ‘according to nature’ (§9). But they interpret nature by their own values, as an embodiment of rationality. They do not see the senselessness, the purposelessness, the indifference of nature to our lives […]

THE BEWITCHMENT OF LANGUAGE

We said above that Nietzsche criticizes past philosophers on two grounds. We have looked at the role of motivation; the second ground is the seduction of grammar. Nietzsche is concerned with the subject-predicate structure of language, and with it the notion of a ‘substance’ (picked out by the grammatical ‘subject’) to which we attribute ‘properties’ (identified by the predicate). This structure leads us into a mistaken metaphysics of ‘substances’. In particular, Nietzsche is concerned with the grammar of ‘I’. We tend to think that ‘I’ refers to some thing, e.g. the soul. Descartes makes this mistake in his cogito – ‘I think’, he argues, refers to a substance engaged in an activity. But Nietzsche repeats the old objection that this is an illegitimate inference (§16) that rests on many unproven assumptions – that I am thinking, that some thing is thinking, that thinking is an activity (the result of a cause, viz. I), that an ‘I’ exists, that we know what it is to think. So the simple sentence ‘I think’ is misleading. In fact, ‘a thought comes when ‘it’ wants to, and not when ‘I’ want it to’ (§17). Even ‘there is thinking’ isn’t right: ‘even this ‘there’ contains an interpretation of the process and is not part of the process itself. People are concluding here according to grammatical habit’. But our language does not allow us just to say ‘thinking’ – this is not a whole sentence. We have to say ‘there is thinking’; so grammar constrains our understanding. Furthermore, Kant shows that rather than the ‘I’ being the basis of thinking, thinking is the basis out of which the appearance of an ‘I’ is created (§54). Once we recognise that there is no soul in a traditional sense, no ‘substance’, something constant through change, something unitary and immortal, ‘the way is clear for new and refined versions of the hypothesis about the soul’ (§12), that it is mortal, that it is multiplicity rather than identical over time, even that it is a social construct and a society of drives. Nietzsche makes a similar argument about the will (§19). Because we have this one word ‘will’, we think that what it refers to must also be one thing. But the act of willing is highly complicated. First, there is an emotion of command, for willing is commanding oneself to do something, and with it a feeling of superiority over that which obeys. Second, there is the expectation that the mere commanding on its own is enough for the action to follow, which increases our sense of power. Third, there is obedience to the command, from which we also derive pleasure. But we ignore the feeling the compulsion, identifying the ‘I’ with the commanding ‘will’. Nietzsche links the seduction of language to the issue of motivation in §20, arguing that ‘the spell of certain grammatical functions is the spell of physiological value judgements’. So even the grammatical structure of language originates in our instincts, different grammars contributing to the creation of favourable conditions for different types of life. So what values are served by these notions of the ‘I’ and the ‘will’? The ‘I’ relates to the idea that we have a soul, which participates in a transcendent world. It functions in support of the ascetic ideal. The ‘will’, and in particular our inherited conception of ‘free will’, serves a particular moral aim

Hume and Nietzsche: Moral Psychology (short essay)
by epictetus_rex

1. Metaphilosophical Motivation

Both Hume and Nietzsche1 advocate a kind of naturalism. This is a weak naturalism, for it does not seek to give science authority over philosophical inquiry, nor does it commit itself to a specific ontological or metaphysical picture. Rather, it seeks to (a) place the human mind firmly in the realm of nature, as subject to the same mechanisms that drive all other natural events, and (b) investigate the world in a way that is roughly congruent with our best current conception(s) of nature […]

Furthermore, the motivation for this general position is common to both thinkers. Hume and Nietzsche saw old rationalist/dualist philosophies as both absurd and harmful: such systems were committed to extravagant and contradictory metaphysical claims which hinder philosophical progress. Furthermore, they alienated humanity from its position in nature—an effect Hume referred to as “anxiety”—and underpinned religious or “monkish” practises which greatly accentuated this alienation. Both Nietzsche and Hume believe quite strongly that coming to see ourselves as we really are will banish these bugbears from human life.

To this end, both thinkers ask us to engage in honest, realistic psychology. “Psychology is once more the path to the fundamental problems,” writes Nietzsche (BGE 23), and Hume agrees:

“the only expedient, from which we can hope for success in our philosophical researches, is to leave the tedious lingering method, which we have hitherto followed, and instead of taking now and then a castle or village on the frontier, to march up directly to the capital or center of these sciences, to human nature itself.” (T Intro)

2. Selfhood

Hume and Nietzsche militate against the notion of a unified self, both at-a-time and, a fortiori, over time.

Hume’s quest for a Newtonian “science of the mind” lead him to classify all mental events as either impressions (sensory) or ideas (copies of sensory impressions, distinguished from the former by diminished vivacity or force). The self, or ego, as he says, is just “a kind of theatre, where several perceptions successively make their appearance; pass, re-pass, glide away, and mingle in an infinite variety of postures and situations. There is properly no simplicity in it at one time, nor identity in different; whatever natural propension we may have to imagine that simplicity and identity.” (Treatise 4.6) […]

For Nietzsche, the experience of willing lies in a certain kind of pleasure, a feeling of self-mastery and increase of power that comes with all success. This experience leads us to mistakenly posit a simple, unitary cause, the ego. (BGE 19)

The similarities here are manifest: our minds do not have any intrinsic unity to which the term “self” can properly refer, rather, they are collections or “bundles” of events (drives) which may align with or struggle against one another in a myriad of ways. Both thinkers use political models to describe what a person really is. Hume tells us we should “more properly compare the soul to a republic or commonwealth, in which the several members [impressions and ideas] are united by ties of government and subordination, and give rise to persons, who propagate the same republic in the incessant change of its parts” (T 261)

3. Action and The Will

Nietzsche and Hume attack the old platonic conception of a “free will” in lock-step with one another. This picture, roughly, involves a rational intellect which sits above the appetites and ultimately chooses which appetites will express themselves in action. This will is usually not considered to be part of the natural/empirical order, and it is this consequence which irks both Hume and Nietzsche, who offer two seamlessly interchangeable refutations […]

Since we are nothing above and beyond events, there is nothing for this “free will” to be: it is a causa sui, “a sort of rape and perversion of logic… the extravagant pride of man has managed to entangle itself profoundly and frightfully with just this nonsense” (BGE 21).

When they discover an erroneous or empty concept such as “Free will” or “the self”, Nietzsche and Hume engage in a sort of error-theorizing which is structurally the same. Peter Kail (2006) has called this a “projective explanation”, whereby belief in those concepts is “explained by appeal to independently intelligible features of psychology”, rather than by reference to the way the world really is1.

The Philosophy of Mind
INSTRUCTOR: Larry Hauser
Chapter 7: Egos, bundles, and multiple selves

  • Who dat?  “I”
    • Locke: “something, I know not what”
    • Hume: the no-self view … “bundle theory”
    • Kant’s transcendental ego: a formal (nonempirical) condition of thought that the “I” must accompany every perception.
      • Intentional mental state: I think that snow is white.
        • to think: a relation between
          • a subject = “I”
          • a propositional content thought =  snow is white
      • Sensations: I feel the coldness of the snow.
        • to feel: a relation between
          • a subject = “I”
          • a quale = the cold-feeling
    • Friedrich Nietzsche
      • A thought comes when “it” will and not when “I” will. Thus it is a falsification of the evidence to say that the subject “I” conditions the predicate “think.”
      • It is thought, to be sure, but that this “it” should be that old famous “I” is, to put it mildly, only a supposition, an assertion. Above all it is not an “immediate certainty.” … Our conclusion is here formulated out of our grammatical custom: “Thinking is an activity; every activity presumes something which is active, hence ….” 
    • Lichtenberg: “it’s thinking” a la “it’s raining”
      • a mere grammatical requirement
      • no proof of a thinking self

[…]

  • Ego vs. bundle theories (Derek Parfit (1987))
    • Ego: “there really is some kind of continuous self that is the subject of my experiences, that makes decisions, and so on.” (95)
      • Religions: Christianity, Islam, Hinduism
      • Philosophers: Descartes, Locke, Kant & many others (the majority view)
    • Bundle: “there is no underlying continuous and unitary self.” (95)
      • Religion: Buddhism
      • Philosophers: Hume, Nietzsche, Lichtenberg, Wittgenstein, Kripke(?), Parfit, Dennett {a stellar minority}
  • Hume v. Reid
    • David Hume: For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure.  I never can catch myself at any time without a perception, and never can observe anything but the perception.  (Hume 1739, Treatise I, VI, iv)
    • Thomas Reid: I am not thought, I am not action, I am not feeling: I am something which thinks and acts and feels. (1785)

Arete: History and Etymology

Arete (moral virtue)
Wikipedia

Arete (Greek: ἀρετή), in its basic sense, means “excellence of any kind”.[1] The term may also mean “moral virtue”.[1] In its earliest appearance in Greek, this notion of excellence was ultimately bound up with the notion of the fulfillment of purpose or function: the act of living up to one’s full potential.

The term from Homeric times onwards is not gender specific. Homer applies the term to both the Greek and Trojan heroes as well as to major female figures, such as Penelope, the wife of the Greek hero Odysseus. In the Homeric poems, Arete is frequently associated with bravery, but more often with effectiveness. The man or woman of Arete is a person of the highest effectiveness; they use all their faculties—strength, bravery and wit—to achieve real results. In the Homeric world, then, Arete involves all of the abilities and potentialities available to humans.

In some contexts, Arete is explicitly linked with human knowledge, where the expressions “virtue is knowledge” and “Arete is knowledge” are used interchangeably. The highest human potential is knowledge and all other human abilities are derived from this central capacity. If Arete is knowledge and study, the highest human knowledge is knowledge about knowledge itself; in this light, the theoretical study of human knowledge, which Aristotle called “contemplation”, is the highest human ability and happiness.[2]

History

The Ancient Greeks applied the term to anything: for example, the excellence of a chimney, the excellence of a bull to be bred and the excellence of a man. The meaning of the word changes depending on what it describes, since everything has its own peculiar excellence; the arete of a man is different from the arete of a horse. This way of thinking comes first from Plato, where it can be seen in the Allegory of the Cave.[3] In particular, the aristocratic class was presumed, essentially by definition, to be exemplary of arete: “The root of the word is the same as aristos, the word which shows superlative ability and superiority, and aristos was constantly used in the plural to denote the nobility.”[4]

By the 5th and 4th centuries BC, arete as applied to men had developed to include quieter virtues, such as dikaiosyne (justice) and sophrosyne (self-restraint). Plato attempted to produce a moral philosophy that incorporated this new usage,[5] but it was in the work of Aristotle that the doctrine of arete found its fullest flowering. Aristotle’s Doctrine of the Mean is a paradigm example of his thinking.

Arete has also been used by Plato when talking about athletic training and also the education of young boys. Stephen G. Miller delves into this usage in his book “Ancient Greek Athletics”. Aristotle is quoted as deliberating between education towards arete “…or those that are theoretical”.[6] Educating towards arete in this sense means that the boy would be educated towards things that are useful in life. However, even Plato himself says that arete is not something that can be agreed upon. He says, “Nor is there even an agreement about what constitutes arete, something that leads logically to a disagreement about the appropriate training for arete.”[7] To say that arete has a common definition of excellence or fulfillment may be an overstatement simply because it was very difficult to pinpoint arete, much less the proper ways to go about obtaining it. […]

Homer

In Homer’s Iliad and Odyssey, “arete” is used mainly to describe heroes and nobles and their mobile dexterity, with special reference to strength and courage, but it is not limited to this. Penelope’s arete, for example, relates to co-operation, for which she is praised by Agamemnon. The excellence of the gods generally included their power, but, in the Odyssey (13.42), the gods can grant excellence to a life, which is contextually understood to mean prosperity. Arete was also the name of King Alcinous’s wife.

According to Bernard Knox’s notes found in the Robert Fagles translation of The Odyssey, “arete” is also associated with the Greek word for “pray”, araomai.[8]

All Things Shining
by Hubert Dreyfus
pp. 61-63

Homer’s epic poems brought into focus a notion of arete, or excellence in life, that was at the center of the Greek understanding of human being.6 Many admirers of Greek culture have attempted to define this notion, but success here requires avoiding two prominent temptations. There is the temptation to patronize that we have already mentioned. But there is also a temptation to read a modern sensibility into Homer’s time. One standard translation of the Greek word arete as “virtue” runs the risk of this kind of retroactive reading: for any attempt to interpret the Homeric Greek notion of human excellence in terms of “virtue”—especially if one hears in this word its typical Christian or even Roman overtones—is bound to go astray. Excellence in the Greek sense involves neither the Christian notion of humility and love nor the Roman ideal of stoic adherence to one’s duty.7 Instead, excellence in the Homeric world depends crucially on one’s sense of gratitude and wonder.

Nietzsche was one of the first to understand that Homeric excellence bears little resemblance to modern moral agency. His view was that the Homeric world understood nobility in terms of the overpowering strength of noble warriors. The effect of the ensuing Judeo-Christian tradition, on this Nietzschean reading, was to enfeeble the Homeric understanding of excellence by substituting the meekness of the lamb for the strength and power of the noble warrior.8

Nietzsche was certainly right that the Homeric tradition valorizes the strong, noble hero; and he was right, too, that in some important sense the Homeric account of excellence is foreign to our basic moralizing assumptions. But there is something that the Nietzschean account leaves out. As Bernard Knox emphasizes, the Greek word arete is etymologically related to the Greek verb “to pray” (araomai).9 It follows that Homer’s basic account of human excellence involves the necessity of being in an appropriate relationship to whatever is understood to be sacred in the culture. Helen’s greatness, on this interpretation, is not properly measured in terms of the degree to which she is morally responsible for her actions.

What makes Helen great in Homer’s world is her ability to live a life that is constantly responsive to golden Aphrodite, the shining example of the sacred erotic dimension of existence. Likewise, Achilles had a special kind of receptivity to Ares and his warlike way of life; Odysseus had Athena, with her wisdom and cultural adaptability, to look out for him. Presumably, the master craftsmen of Homer’s world worked in the light of Hephaestus’s shining. In order to engage with this understanding of human excellence, we will have to think clearly about how the Homeric Greeks understood themselves. Why would it make sense to describe their lives in relation to the presence and absence of the gods?

Several questions focus this kind of approach. What is the phenomenon that Homer is responding to when he says that a god intervened or in some way took part in an action or event? Is this phenomenon recognizable to us, even if only marginally? And if Homer’s reference to the gods is something other than an attempt to pass off moral responsibility for one’s actions, then what exactly is it? Only by facing these questions head on can we understand whether it is possible—or desirable—to lure back Homer’s polytheistic gods.

The gods are essential to the Homeric Greek understanding of what it is to be a human being at all. As Peisistratus—the son of wise old Nestor—says toward the beginning of the Odyssey, “All men need the gods.”10 The Greeks were deeply aware of the ways in which our successes and our failures—indeed, our very actions themselves—are never completely under our control. They were constantly sensitive to, amazed by, and grateful for those actions that one cannot perform on one’s own simply by trying harder: going to sleep, waking up, fitting in, standing out, gathering crowds together, holding their attention with a speech, changing their mood, or indeed being filled with longing, desire, courage, wisdom, and so on. Homer sees each of these achievements as a particular god’s gift. To say that all men need the gods therefore is to say, in part at least, that we are the kinds of beings who are at our best when we find ourselves acting in ways that we cannot—and ought not—entirely take credit for.

The Discovery of the Mind
by Bruno Snell
pp. 158-160

The words for virtue and good, arete and agathos, are at first by no means clearly distinguished from the area of profit. In the early period they are not as palpably moral in content as might be supposed; we may compare the German terms Tugend and gut which originally stood for the ‘suitable’ (taugende) and the ‘fitting’ (cf. Gatte). When Homer says that a man is good, agathos, he does not mean thereby that he is morally unobjectionable, much less good-hearted, but rather that he is useful, proficient, and capable of vigorous action. We also speak of a good warrior or a good instrument. Similarly arete, virtue, does not denote a moral property but nobility, achievement, success and reputation. And yet these words have an unmistakable tendency toward the moral because, unlike ‘happiness’ or ‘profit’, they designate qualities for which a man may win the respect of his whole community. Arete is ‘ability’ and ‘achievement’, characteristics which are expected of a ‘good’, an ‘able’ man, an aner agathos. From Homer to Plato and beyond these words spell out the worth of a man and his work. Any change in their meaning, therefore, would indicate a reassessment of values. It is possible to show how at various times the formation and consolidation of social groups and even of states was connected with people’s ideas about the ‘good’. But that would be tantamount to writing a history of Greek culture. In Homer, to possess ‘virtue’ or to be ‘good’ means to realize one’s nature, and one’s wishes, to perfection. Frequently happiness and profit form the reward, but it is no such extrinsic prospect which leads men to virtue and goodness. The expressions contain a germ of the notion of entelechy. A Homeric hero, for instance, is capable of ‘reminding himself’, or of ‘experiencing’, that he is noble. ‘Use your experience to become what you are’ advises Pindar who adheres to this image of arete. The ‘good’ man fulfils his proper function, prattei ta heautou, as Plato demands it; he achieves his own perfection. And in the early period this also entails that he is good in the eyes of others, for the notions and definitions of goodness are plain and uniform: a man appears to others as he is.

In the Iliad (11.404–410) Odysseus reminds himself that he is an aristocrat, and thereby resolves his doubts how he should conduct himself in a critical situation. He does it by concentrating on the thought that he belongs to a certain social order, and that it is his duty to fulfill the ‘virtue’ of that order. The universal which underlies the predication ‘I am a noble’ is the group; he does not reflect on an abstract ‘good’ but upon the circle of which he claims membership. It is the same as if an officer were to say: ‘As an officer I must do this or that,’ thus gauging his action by the rigid conception of honour peculiar to his caste.

Aretan is ‘to thrive’; arete is the objective which the early nobles attach to achievement and success. By means of arete the aristocrat implements the ideal of his order—and at the same time distinguishes himself above his fellow nobles. With his arete the individual subjects himself to the judgment of his community, but he also surpasses it as an individual. Since the days of Jacob Burckhardt the competitive character of the great Greek achievements has rightly been stressed. Well into the classical period, those who compete for arete are remunerated with glory and honour. The community puts its stamp of approval on the value which the individual sets on himself. Thus honour, timē, is even more significant than arete for the growth of the moral consciousness, because it is more evident, more palpable to all. From his earliest boyhood the young nobleman is urged to think of his glory and his honour; he must look out for his good name, and he must see to it that he commands the necessary respect. For honour is a very sensitive plant; wherever it is destroyed the moral existence of the loser collapses. Its importance is greater even than that of life itself; for the sake of glory and honour the knight is prepared to sacrifice his life.

pp. 169-172

The truth of the matter is that it was not the concept of justice but that of arete which gave rise to the call for positive individual achievement, the moral imperative which the early Greek community enjoins upon its members who in turn acknowledge it for themselves. A man may have purely egotistical motives for desiring virtue and achievement, but his group gives him considerably more credit for these ideals than if he were to desire profit or happiness. The community expects, and even demands, arete. Conversely a man who accomplishes a high purpose may convince himself so thoroughly that his deed serves the interests of a supra-personal, a universal cause that the alternative of egotism or altruism becomes irrelevant. What does the community require of the individual? What does the individual regard as universal, as eternal? These, in the archaic age, are the questions about which the speculations on arete revolve.

The problem remains simple as long as the individual cherishes the same values as the rest of his group. Given this condition, even the ordinary things in life are suffused with an air of dignity, because they are part of custom and tradition. The various daily functions, such as rising in the morning and the eating of meals, are sanctified by prayer and sacrifice, and the crucial events in the life of man—birth, marriage, burial—are for ever fixed and rooted in the rigid forms of cult. Life bears the imprint of a permanent authority which is divine, and all activity is, therefore, more than just personal striving. No one doubts the meaning of life; the hallowed tradition is carried on with implicit trust in the holy wisdom of its rules. In such a society, if a man shows unusual capacity he is rewarded as a matter of course. In Homer a signal achievement is, as one would expect, also honoured with a special permanence, through the song of the bard which outlasts the deed celebrated and preserves it for posterity. This simple concept is still to be found in Pindar’s Epinicians. The problem of virtue becomes more complex when the ancient and universally recognized ideal of chivalry breaks down. Already in Homeric times a differentiation sets in. As we have seen in the story of the quarrel over the arms of Achilles, the aretai become a subject for controversy. The word arete itself contains a tendency toward the differentiation of values, since it is possible to speak of the virtues of various men and various things. As more sections of society become aware of their own merit, they are less willing to conform to the ideal of the once-dominant class. It is discovered that the ways of men are diverse, and that arete may be attained in all sorts of professions. Whereas aristocratic society had been held together, not to say made possible by a uniform notion of arete, people now begin to ask what true virtue is. The crisis of the social system is at the same time the crisis of an ideal, and thus of morality. Archilochus says (fr. 41) that different men have their hearts quickened in various ways. But he also states, elaborating a thought which first crops up in the Odyssey: the mind of men is as Zeus ushers in each day, and they think whatever they happen to hit upon (fr. 68). One result of this splitting up of the various forms of life is a certain failure of nerve. Man begins to feel that he is changeable and exposed to many variable forces. This insight deepens the moral reflexions of the archaic period; the search for the good becomes a search for the permanent.

The topic of the virtues is especially prominent in the elegy. Several elegiac poets furnish lists of the various aretai which they exemplify by means of well-known myths. Their purpose is to clarify for themselves their own attitudes toward the conflicting standards of life. Theognis (699 ff.) stands at the end of this development; with righteous indignation he complains that the masses no longer have eyes for anything except wealth. For him material gain has, in contrast with earlier views, become an enemy of virtue.

The first to deal with this general issue is Tyrtaeus. His call to arms pronounces the Spartan ideal; perhaps he was the one to formulate that ideal for the first time. Nothing matters but the bravery of the soldier fighting for his country. Emphatically he rejects all other accomplishments and virtues as secondary: the swiftness of the runner in the arena, or the strength of the wrestler, or again physical beauty, wealth, royal power, and eloquence, are as nothing before bravery. In the Iliad also a hero best proves his virtue by standing firm against the enemy, but that is not his only proof; the heroic figures of Homer dazzle us precisely because of their richness in human qualities. Achilles is not only brave but also beautiful, ‘swift of foot’, he knows how to sing, and so forth. Tyrtaeus sharply reduces the scope of the older arete; what is more, he goes far beyond Homer in magnifying the fame of fortitude and the ignominy which awaits the coward. Of the fallen he actually says that they acquire immortality (9.32). This one-sidedness is due to the fact that the community has redoubled its claim on the individual; Sparta in particular taxed the energies of its citizenry to the utmost during the calamitous period of the Messenian wars. The community is a thing of permanence for whose sake the individual mortal has to lay down his life, and in whose memory lies his only chance for any kind of survival. Even in Tyrtaeus, however, these claims of the group do not lead to a termite morality. Far from prescribing a blind and unthinking service to the whole, or a spirit of slavish self-sacrifice, Tyrtaeus esteems the performance of the individual as a deed worthy of fame. This is a basic ingredient of arete which, in spite of countless shifts and variations, is never wholly lost.

Philosophy Before Socrates
by Richard D. McKirahan
pp. 366-369

Aretē and Agathos. These two basic concepts of Greek morality are closely related and not straightforwardly translatable into English. As an approximation, aretē can be rendered “excellence” or “goodness” (sometimes “virtue”), and agathos as “excellent” or “good.” The terms are related in that a thing or person is agathos if and only if it has aretē and just because it has aretē. The concepts apply to objects, conditions, and actions as well as to humans. They are connected with the concept of ergon (plural, erga), which may be rendered as “function” or “characteristic activity.” A good (agathos) person is one who performs human erga well, and similarly a good knife is a knife that performs the ergon of a knife well. The ergon of a knife is cutting, and an agathos knife is one that cuts well. Thus, the aretē of a knife is the qualities or characteristics a knife must have in order to cut well. Likewise, if a human ergon can be identified, an agathos human is one who can and on appropriate occasions does perform that ergon well, and human aretē is the qualities or characteristics that enable him or her to do so. The classical discussion of these concepts occurs after our period, in Aristotle,6 but he is only making explicit ideas that go back to Homer and which throw light on much of the pre-philosophical ethical thought of the Greeks.

This connection of concepts makes it automatic, virtually an analytic truth, that the right goal for a person—any person—is to be or become agathos. Even if that goal is unreachable for someone, the aretē–agathos standard still stands as an ideal against which to measure one’s successes and failures. However, there is room for debate over the nature of human erga, both whether there is a set of erga applicable to all humans and relevant to aretē and, supposing that there is such a set of erga, what those erga are. The existence of the aretē–agathos standard makes it vitally important to settle these issues, for otherwise human life is left adrift with no standards of conduct. […]

The moral scene Homer presents is appropriate to the society it represents and quite alien to our own. It is the starting point for subsequent moral speculation which no one in the later Greek tradition could quite forget. The development of Greek moral thought through the Archaic and Classical periods can be seen as the gradual replacement of the competitive by the cooperative virtues as the primary virtues of conduct and as the increasing recognition of the significance of people’s intentions as well as their actions.7

Rapid change in Greek society in the Archaic and Classical periods called for new conceptions of the ideal human and the ideal human life and activities. The Archaic period saw different kinds of rulers from the Homeric kings, and individual combat gave way to the united front of a phalanx of hoplites (heavily armed warriors). Even though the Homeric warrior-king was no longer a possible role in society, the qualities of good birth, beauty, courage, honor, and the abilities to give good counsel and rule well remained. Nevertheless, the various strands of the Homeric heroic ideal began to unravel. In particular, good birth, wealth, and fighting ability no longer automatically went together. This situation forced the issue: what are the best qualities we can possess? What constitutes human aretē? The literary sources contain conflicting claims about the best life for a person, the best kind of person to be, and the relative merits of qualities thought to be ingredients of human happiness. In one way or another these different conceptions of human excellence have Homeric origins, though they diverge from Homer’s conception and from one another.

Lack of space makes it impossible to present the wealth of materials that bear on this subject.8 I will confine discussion to two representatives of the aristocratic tradition who wrote at the end of the Archaic period. Pindar shows how the aristocratic ideal had survived and been transformed from the Homeric conception and how vital it remained as late as the early fifth century, and Theognis reveals how social, political, and economic reality was undermining that ideal.

p. 374

The increase in wealth and the shift in its distribution which had begun by the seventh century led to profound changes in the social and political scenes in the sixth and forced a wedge in among the complex of qualities which traditionally constituted aristocratic aretē. Pindar’s unified picture in which wealth, power, and noble birth tend to go together became ever less true to contemporary reality.

The aristocratic response to this changed situation receives its clearest expression in the poems attributed to Theognis and composed in the sixth and early fifth centuries. Even less than with Pindar can we find a consistent set of views advocated in these poems, but among the most frequently recurring themes are the view that money does not make the man, that many undeserving people are now rich and many deserving people (deserving because of their birth and social background) are now poor. It is noteworthy how Theognis plays on the different connotations of uses of the primary terms of value, agathos and aretē, and their opposites kakos and kakia: morally good vs. evil; well-born, noble vs. low-born; and politically and socially powerful vs. powerless. Since the traditional positive attributes no longer regularly all went together, it was important to decide which are most important, indeed which are the essential ingredients of human aretē.

pp. 379-382

In short, Protagoras taught his students how to succeed in public and private life. What he claimed to teach is, in a word, aretē. That this was his boast follows from the intimate connection between agathos and aretē as well as from the fact that a person with aretē is one who enjoys success, as measured by current standards. Anyone with the abilities Protagoras claimed to teach had the keys to a successful life in fifth-century Athens.

In fact, the key to success was rhetoric, the art of public speaking, which has a precedent in the heroic conception of aretē, which included excellence in counsel. But the Sophists’ emphasis on rhetoric must not be understood as hearkening back to Homeric values. Clear reasons why success in life depended on the ability to speak well in public can be found in fifth-century politics and society. […]

That is not to say that every kind of success depended on rhetoric. It could not make you successful in a craft like carpentry and would not on its own make you a successful military commander. Nor is it plausible that every student of Protagoras could have become another Pericles. Protagoras acknowledged that natural aptitude was required over and above diligence. […] Protagoras recognized that he could not make a silk purse out of a sow’s ear, but he claimed to be able to develop a (sufficiently young) person’s abilities to the greatest extent possible.28

Pericles was an effective counselor in part because he could speak well but also by dint of his personality, experience, and intelligence. To a large extent these last three factors cannot be taught, but rhetoric can be offered as a tekhnē, a technical art or skill which has rules of its own and which can be instilled through training and practice. In these ways rhetoric is like medicine, carpentry, and other technical arts, but it is different in its seemingly universal applicability. Debates can arise on any conceivable subject, including technical ones, and rhetorical skill can be turned to the topic at hand whatever it may be. The story goes that Gorgias used his rhetorical skill to convince medical patients to undergo surgery when physicians failed to persuade them.29 Socrates turned the tables on the Sophists, arguing that if rhetoric has no specific subject matter, then so far from being a universal art, it should not be considered an art at all.30 And even if we grant that rhetoric is an art that can be taught, it remains controversial whether aretē can be taught and in what aretē consists. […]

The main charges against the Sophists are of two different sorts. First the charge of prostituting themselves. Plato emphasizes the money-making aspect of the Sophist’s work, which he uses as one of his chief criteria for determining that Socrates was not a Sophist. This charge contains two elements: the Sophists teach aretē for money, and they teach it to anyone who pays. Both elements have aristocratic origins. Traditionally aretē was learned from one’s family and friends and came as the result of a long process of socialization beginning in infancy. Such training and background can hardly be bought. Further, according to the aristocratic mentality most people are not of the right type, the appropriate social background, to aspire to aretē.

Lila
by Robert Pirsig
pp. 436-442

Digging back into ancient Greek history, to the time when this mythos-to-logos transition was taking place, Phædrus noted that the ancient rhetoricians of Greece, the Sophists, had taught what they called aretê, which was a synonym for Quality. Victorians had translated aretê as “virtue” but Victorian “virtue” connoted sexual abstinence, prissiness and a holier-than-thou snobbery. This was a long way from what the ancient Greeks meant. The early Greek literature, particularly the poetry of Homer, showed that aretê had been a central and vital term.

With Homer Phædrus was certain he’d gone back as far as anyone could go, but one day he came across some information that startled him. It said that by following linguistic analysis you could go even further back into the mythos than Homer. Ancient Greek was not an original language. It was descended from a much earlier one, now called the Proto-Indo-European language. This language has left no fragments but has been derived by scholars from similarities between such languages as Sanskrit, Greek and English which have indicated that these languages were fallouts from a common prehistoric tongue. After thousands of years of separation from Greek and English the Hindi word for “mother” is still “Ma.” Yoga both looks like and is translated as “yoke.” The reason an Indian rajah’s title sounds like “regent” is because both terms are fallouts from Proto-Indo-European. Today a Proto-Indo-European dictionary contains more than a thousand entries with derivations extending into more than one hundred languages.

Just for curiosity’s sake Phædrus decided to see if aretê was in it. He looked under the “a” words and was disappointed to find it was not. Then he noted a statement that said that the Greeks were not the most faithful to the Proto-Indo-European spelling. Among other sins, the Greeks added the prefix “a” to many of the Proto-Indo-European roots. He checked this out by looking for aretê under “r.” This time a door opened.

The Proto-Indo-European root of aretê was the morpheme rt. There, beside aretê, was a treasure room of other derived “rt” words: “arithmetic,” “aristocrat,” “art,” “rhetoric,” “worth,” “rite,” “ritual,” “wright,” “right (handed)” and “right (correct).” All of these words except arithmetic seemed to have a vague thesaurus-like similarity to Quality. Phædrus studied them carefully, letting them soak in, trying to guess what sort of concept, what sort of way of seeing the world, could give rise to such a collection.

When the morpheme appeared in aristocrat and arithmetic the reference was to “firstness.” Rt meant first. When it appeared in art and wright it seemed to mean “created” and “of beauty.” “Ritual” suggested repetitive order. And the word right has two meanings: “right-handed” and “moral and esthetic correctness.” When all these meanings were strung together a fuller picture of the rt morpheme emerged. Rt referred to the “first, created, beautiful repetitive order of moral and esthetic correctness.” […]

There was just one thing wrong with this Proto-Indo-European discovery, something Phædrus had tried to sweep under the carpet at first, but which kept creeping out again. The meanings, grouped together, suggested something different from his interpretation of aretê. They suggested “importance” but it was an importance that was formal and social and procedural and manufactured, almost an antonym to the Quality he was talking about. Rt meant “quality” all right but the quality it meant was static, not Dynamic. He had wanted it to come out the other way, but it looked as though it wasn’t going to do it. Ritual. That was the last thing he wanted aretê to turn out to be. Bad news. It looked as though the Victorian translation of aretê as “virtue” might be better after all since “virtue” implies ritualistic conformity to social protocol. […]

Rta. It was a Sanskrit word, and Phædrus remembered what it meant: Rta was the “cosmic order of things.” Then he remembered he had read that the Sanskrit language was considered the most faithful to the Proto-Indo-European root, probably because the linguistic patterns had been so carefully preserved by the Hindu priests. […]

Rta, from the oldest portion of the Rg Veda, which was the oldest known writing of the Indo-Aryan language. The sun god, Sūrya, began his chariot ride across the heavens from the abode of rta. Varuna, the god for whom the city in which Phædrus was studying was named, was the chief support of rta.

Varuna was omniscient and was described as ever witnessing the truth and falsehood of men—as being “the third whenever two plot in secret.” He was essentially a god of righteousness and a guardian of all that is worthy and good. The texts had said that the distinctive feature of Varuna was his unswerving adherence to high principles. Later he was overshadowed by Indra who was a thunder god and destroyer of the enemies of the Indo-Aryans. But all the gods were conceived as “guardians of rta,” willing the right and making sure it was carried out.

One of Phædrus’s old school texts, written by M. Hiriyanna, contained a good summary: “Rta, which etymologically stands for ‘course’ originally meant ‘cosmic order,’ the maintenance of which was the purpose of all the gods; and later it also came to mean ‘right,’ so that the gods were conceived as preserving the world not merely from physical disorder but also from moral chaos. The one idea is implicit in the other: and there is order in the universe because its control is in righteous hands.…”

The physical order of the universe is also the moral order of the universe. Rta is both. This was exactly what the Metaphysics of Quality was claiming. It was not a new idea. It was the oldest idea known to man.

This identification of rta and aretê was enormously valuable, Phædrus thought, because it provided a huge historical panorama in which the fundamental conflict between static and Dynamic Quality had been worked out. It answered the question of why aretê meant ritual. Rta also meant ritual. But unlike the Greeks, the Hindus in their many thousands of years of cultural evolution had paid enormous attention to the conflict between ritual and freedom. Their resolution of this conflict in the Buddhist and Vedantist philosophies is one of the profound achievements of the human mind.

Pagan Ethics: Paganism as a World Religion
by Michael York
pp. 59-60

Pirsig contends that Plato incorporated the arete of the Sophists into his dichotomy between ideas and appearances — where it was subordinated to Truth. Once Plato identifies the True with the Good, arete’s position is usurped by “dialectically determined truth.” This, in turn, allows Plato to demote the Good to a lower order and minor branch of knowledge. For Pirsig, the Sophists were those Greek philosophers who exalted quality over truth; they were the true champions of arete or excellence. With a pagan quest for the ethical that develops from an idolatrous understanding of the physical, while Aristotle remains an important consideration, it is to the Sophists (particularly Protagoras, Prodicus and Pirsig’s understanding of them) and a reconstruction of their underlying humanist position that perhaps the most important answers are to be framed if not found as well.

A basic pagan position is an acceptance of the appetites — in fact, their celebration rather than their condemnation. We find the most unbridled expression of the appetites in the actions of the young. Youth may engage in binge-drinking, vandalism, theft, promiscuity and profligate experimentation. Pagan perspectives may recognize the inherent dangers in these as there are in life itself. But they also trust the overall process of learning. In paganism, morality has a much greater latitude than it does in the transcendental philosophy of a Pythagoras, Plato, or Plotinus: it may veer toward a form of relativism, but its ultimate check is always the sanctity of the other animate individuals. An it harm none, do what ye will. The pagan ethic must be found within the appetites and not in their denial.

In fact, paganism is part of a protest against Platonic assertion. The wider denial is that of nature herself. Nature denies the Platonic by refusing to conform to the Platonic ideal. It insists on moments of chaos, the epagomenae, the carnival, that overlap between the real and the ideal that is itself a metaphor for reality. The actual year is a refusal to cooperate with the mathematically ideal year of 360 days — close but only tantalizingly.

In addition, pagans have always loved asking what is arete? This is the fundamental question we encounter with the Sophists, Plato and Aristotle. It is the question that is before us still. The classics considered variously both happiness and the good as alternative answers. The Hedonists pick happiness — but a particular kind of happiness. The underlying principle recognized behind all these possibilities is arete ‘excellence, the best’ however it is embodied — whether god, goddess, goods, the good, gods, virtue, happiness, pleasure or all of these together. Arete is that to which both individual and community aspire. Each wants one’s own individual way of putting it together in excellent fashion — but at the same time wanting some commensurable overlap of the individual way with the community way.

What is the truth of the historical claims about Greek philosophy in Zen and the Art of Motorcycle Maintenance?
answer by Ammon Allred

Arete is usually translated as “virtue,” which is certainly connected up with the good “agathon” — but in Plato an impersonal Good is probably more important than aletheia or truth. See, for instance, the central images at the end of Book VI, where the Good is called the “Father of the Sun.” The same holds in the Philebus. And it wouldn’t be right to say that Plato (or Aristotle) thought virtue was part of some small branch called “ethics” (Plato doesn’t divide his philosophy up this way; Aristotle does — although then we get into the fact that we don’t have the dialogues he wrote — but still what he means by ethics is far broader than what we mean).

Certainly the Sophists pushed for a humanistic account of the Good, whereas Plato’s was far more impersonal. And Plato himself had a complex relationship to the Sophists (consider the dialogue of Protagoras, where Socrates and Protagoras both end up about equally triumphant).

That said, Pirsig is almost certainly right about Platonism — that is to say, the approach to philosophy that has been taught as though it were Plato’s philosophy. Certainly, the sophists have gotten a bad rap because of the view that Socrates and Plato were taken to have about the sophists; but even there, many philosophers have tried to rehabilitate them: most famously, Nietzsche.

The Art of the Lost Cause

Many people are understandably disappointed, frustrated, or angry when they lose. It’s just not fun to lose, especially in a competitive society. But there are advantages to losing. And what counts as a loss is largely determined by perspective. Certainly, in more cooperative societies, what may be seen as a loss by outsiders could be taken quite differently by an insider. Western researchers discovered that difference when using games as part of social science studies. Some non-Western people refused win-lose scenarios, at least among members of their own community: the individual didn’t lose, for everyone gained. I point this out to help shift our thinking.

Recently, the political left in the United States has experienced losses. Bernie Sanders lost the nomination to Hillary Clinton, who in turn lost the presidency to Donald Trump. But is this an entirely surprising result or a wholly bad outcome? Losses can lead to soul-searching and motivation for change. The Republicans we know today have dominated the political narrative in recent decades, which forced the Democrats to shift far to the right with third way ‘triangulation’. That wasn’t always the case. Republicans went through a period of major losses before being able to reinvent themselves with the southern strategy, the Reagan revolution, trickle-down voodoo economics, the two Santa Claus theory, culture wars, etc.

The Clinton New Democrats were only able to win at all in recent history by sacrificing the political left and, in the process, becoming the new conservative party. So, even when Democrats have been able to win, it has been a loss. Consider Obama, who turned out to be one of the most neoliberal and neocon presidents in modern history, betraying his every promise: maintaining militarism, refusing to shut down GITMO, passing pro-biz insurance reform, etc. Liberals and leftists would have been better off to have been entirely out of power these past decades, allowing a genuine political left movement to form and so allowing democracy to begin to reassert itself from below. Instead, Democrats have managed to win just enough elections to keep the political left suppressed by co-opting its rhetoric. Democrats have won by forcing the American public to lose.

In failing so gloriously, the Democratic leadership has been publicly shamed to the point of no redemption. The party is now less popular than the opposition, an amazing feat considering how unpopular Trump and the GOP are at the moment. Yet amidst all of this, Bernie Sanders is more popular than ever, more popular among women than men and more popular among minorities than whites. I never thought Sanders was likely to win, and so I wasn’t disappointed. What his campaign did accomplish, as I expected, was to reshape the political narrative and shift the Overton window back toward the political left. This period of loss will be remembered as a turning point. It was a necessary loss, a reckoning and re-envisioning.

Think about famous lost causes. One that came to mind is that of Jesus and the early Christians. They were a tiny unknown cult in a vast empire filled with hundreds of thousands of similar cults. They were nothing special, of no significance or consequence, such that no one bothered to even take note of them, not even Jewish writers at the time. Then Jesus was killed as a common criminal among other criminals and even that didn’t draw any attention. There is no evidence that the Romans considered Jesus even mildly interesting. After his death, Christianity remained small and splintered into a few communities. It took generations for this cult to grow much at all and finally attract much outside attention.

Early Christians weren’t even important enough to be feared. The persecution stories seem to have been mostly invented by later Christians to make themselves feel more important, as there are no records of any systematic and pervasive persecution. Romans killing a few cultists here and there happened all the time, and Christians didn’t stand out as being targeted more than any others. In fact, early Christians were so lacking in uniqueness that they were often confused with other groups such as the Stoics. By the way, it was the Stoics who were famous at the time for seeking out persecution and so gaining street-cred respectability, maybe causing envy among Christians. Even Christian theology was largely borrowed from others; natural law, for instance, was taken from the Stoics — related to the idea that a slave can be free in their mind and being, their heart and soul, because natural law transcends human law.

Still, this early status of Christians as losers created a powerful narrative that has not only survived but proliferated. Some of that narrative, such as their persecution, was invented. But that is far from unusual — the mythos that develops around lost causes tends to be more invented than not. Still, at the core, the Christians were genuinely pathetic for a couple of centuries. They weren’t a respectable religion in the Roman Empire until long after Jesus’ death, when an emperor decided to use them to shore up his own power. In the waning era of Roman imperialism, I suppose a lost cause theology felt compelling and comforting. It was also a good way to convert other defeated people, as they could be promised victory in heaven. Lost causes tend to lead to a romanticizing of a distant redemption that would one day come. And in the case of Christianity, this would mean that the ultimate sacrificial loser, Jesus himself, would return victorious! Amen! Praise the Lord! Like a Taoist philosopher, Jesus taught that to find oneself was to lose oneself but to lose oneself was to find oneself. This is a loser’s mentality and relates to why some have considered Christianity to be a slaver religion. The lowly are uplifted, at least in words and ideals. But I’d argue there is more to it than seeking comfort by rationalizing suffering, oppression, and defeat.

Winning isn’t always a good thing, at least in the short term. I sometimes wonder if America would be a better place if the American Revolution had been lost. When I compare the United States to Canada, I don’t see any great advantage to American colonists having won. Canada is a much more stable and well-functioning social democracy. And the British Empire ended up enacting sweeping reforms, including abolishing slavery through law long before the US managed to end slavery through bloody conflict. In many ways, Americans were worse off after the revolution than before it. A reactionary backlash took hold as oligarchs co-opted the revolution and turned it into counter-revolution. Through the coup of a Constitutional Convention, the ruling elite seized control of the new government. It was in seeming to win that the average American ended up losing. An overt loss potentially could have been a greater long-term victory. In particular for women and blacks, being on the side of the revolutionaries didn’t turn out to be such a great deal. Women who had gained the vote had it taken away from them again, and blacks hoping for freedom were returned to slavery. The emerging radical movement of democratic reform was strangled in the crib.

Later on, the Confederates learned the power of a lost cause, to such an extent that they have become the poster boys of The Lost Cause, all of American society having been transformed by it. Victory of the United States government, once again, turned out to be far from a clear victory for the oppressed. If the Confederates had won or otherwise been allowed to secede, the Confederate government would have been forced to come to terms with the majority black population that existed in the South, and it wouldn’t have had the large Northern population to help keep blacks down. It’s possible that some of the worst results could have been avoided: re-enslavement through chain gangs and mass incarceration, Jim Crow laws and Klan terrorism, sundown towns and redlining, etc. — all the ways that racism became further entrenched. After the Civil War, blacks became scattered and would then become a minority. Having lost their position as the Southern majority, they lost most of the leverage they might have had. Instead of weak reforms leading to new forms of oppression, blacks might have been able to force a societal transformation within a Confederate government or else to have staged a mass exodus in order to secede and create their own separate nation-state. There were many possibilities that became impossible because of Union victory.

Now consider the civil rights movement. The leaders, Martin Luther King in particular, understood the power of a lost cause. They intentionally staged events where they would be attacked by police and white mobs, always making sure there were cameras nearby to turn it into a national event. It was in losing these confrontations to the greater power of white oppression that they managed to win public support. As a largely Christian movement, the civil rights activists surely had learned from the story of Jesus as a sacrificial loser and his followers as persecuted losers. The real failure of civil rights only came later, when it gained mainstream victories and a corrupt black leadership aligned with white power, such as by pushing the racist 1994 Crime Bill, which was part of the Democrats becoming the new conservative party. The civil rights movement might have been better able to transform society and change public opinion by remaining a lost cause for a few more generations.

A victory forced can be a victory lost. Gain requires sacrifice, not to be bought cheaply. Success requires risk of failure, putting everything on the line. The greatest losses can come from seeking victory too soon and too easily. Transformative change can only be won by losing what came before. Winning delayed sometimes is progress ensured, slow but steady change. The foundation has to be laid before something can emerge from the ground up. Being brought low is the beginning point, like planting a seed in the soil.

It reminds me of my habit of always looking down as I walk. My father, on the other hand, never looks down and has a habit of stepping on things. It is only by looking down that we can see what is underneath our feet, what we stand on or are stepping toward. Foundation and fundament are always below eye level. Even in my thinking, I’m forever looking down, to what is beneath everyday awareness and oft-repeated words. Just to look down, such a simple and yet radical act.

“Looking down is also a sign of shame or else humility, the distinction maybe being less relevant to those who avoid looking down. To humble means to bring low, to the level of the ground, the soil, humus. To be further down the ladder of respectability, to be low caste or low class, is to have a unique vantage point. One can see more clearly and more widely when one has grown accustomed to looking down, for then one can see the origins of things, the roots of the world, where experience meets the ground of being.”

* * *

Living Differently: On How the Feminist Utopia Is Something You Have to Be Doing Now
by Lynne Segal

Another anthropologist, the anarchist David Graeber, having been involved in protest networks for decades, remains even more certain that participation in moments of direct action and horizontal decision-making brings to life a new and enduring conception of politics, while providing shared hope and meaning in life, even if their critics see in the outcomes of these movements only defeat:

What they don’t understand is that once people’s political horizons have been broadened, the change is permanent. Hundreds of thousands of Americans (and not only Americans, but Greeks, Spaniards and Tunisians) now have direct experience of self-organization, collective action and human solidarity. This makes it almost impossible to go back to one’s previous life and see things the same way. While the world’s financial and political elite skate blindly towards the next 2008-scale crisis, we’re continuing to carry out occupations of buildings, farms, foreclosed homes and workplaces, organizing rent strikes, seminars and debtor’s assemblies, and in doing so laying the groundwork for a genuinely democratic culture … With it has come a revival of the revolutionary imagination that conventional wisdom has long since declared dead.

Discussing what he calls ‘The Democracy Project’, Graeber celebrates forms of political resistance that in his view move well beyond calls for policy reforms, creating instead permanent spaces of opposition to all existing frameworks. For Graeber, one fundamental ground for optimism is that the future is unknowable, and one can live dissident politics in the present, or try to. This is both despite, and also because of, the insistent neo-liberal boast that there can be no alternative to its own historical trajectory: which has become a linear project of endless growth and the amassing of wealth by the few, toil and the struggle for precarious survival for so many.

Furthermore, Graeber points out that historically, although few revolutionaries actually succeeded in taking power themselves, the effects of their actions were often experienced far outside their immediate geographical location. In a similar reflection on unintended consequences, Terry Eagleton suggests that even with the gloomiest of estimates in mind, many aspects of utopic thinking may be not only possible but well-nigh inevitable:

Perhaps it is only when we run out of oil altogether, or when the world system crashes for other reasons, or when ecological catastrophe finally overtakes us, that we will be forced into some kind of co-operative commonwealth of the kind William Morris might have admired.

Even catastrophism, one might say, has its potentials. […]

It should come as no surprise that most of the goals we dream of will usually elude us, at least partially. However, to confront rather than accept the evils of the present, some utopian spirit is always necessary to embrace the complexity of working, against all odds, to create better futures. A wilful optimism is needed, despite and because of our inevitable blind-spots and inadequacies, both personal and collective.

For many of us, it means trying to live differently in the here and now, knowing that the future will never be a complete break with the present or the past, but hopefully something that may develop out of our most supportive engagements with others. To think otherwise inhibits resistance and confirms the dominant conceit that there is no alternative to the present. Thus, I want to close this chapter repeating the words of the late Latin American writer, Eduardo Galeano, which seem to have been translated into almost every language on earth, though I cannot track down their source:

Utopia is on the horizon. I move two steps closer; it moves two steps further away. I walk another ten steps and the horizon runs ten steps further away. As much as I may walk, I’ll never reach it. So what’s the point of utopia? The point is this: to keep moving forward.

Our political dreams can end in disappointment, but are likely, nevertheless, to make us feel more alive, and hence happier, along the way, at least when they help to connect us to and express concern for those around us. Happiness demands nothing less.

 

Paul Adkin on Decadence & Stagnation

“Decadence: when things are just too good and easy that no one bothers to push forward anymore, bringing about stagnation …

“But there is also another kind of stagnation: one which comes about because there just isn’t enough time to go forward; when all time is taken up with something that is essentially futile when considered from the point of view of the bigger picture. Like making money. Even the seemingly dynamic world of business, if it is dedicated only to business and not to authentically meaningful human progress (things associated with knowledge and discovery), it is essentially stagnating. Any society that is a simulacra society, hell-bent on reproducing copies rather than on developing its creativity, is a decadent, stagnating society. We are stagnant not because of what we are doing, our anthill society is always busy, but because what we are driven by, in all this anthill activity, is not creative. When production is synonymous with reproduction, then we know we have fallen into the stagnant pool of decadence.

“Nietzsche talked about the residual nature of decadence[1]. That decadence is a cumulative thing. Certainly, it is nurtured both by dogma and nihilism. Only a sceptical meaningfulness can push forward in a creative way.

“Sceptical meaningfulness? How can such a thing be? Surely it is a contradiction in terms.

“To understand how this oxymoron combination can work, we need to see meaningfulness as a forward pushing phenomenon. Once it stops pushing forward, meaningfulness slips into dogma. Meaning is fuelled by truth, but it does not swim in truth as if truth were a lake. Truth, in order to be lasting, has to be a river.”

from Decadence & Stagnation by Paul Adkin

Social Construction & Ideological Abstraction

The following passages from two books help to explain what social construction is. As society has headed in a particular direction of development, abstract thought has become increasingly dominant.

But we modern people take abstractions for granted, and we often don’t even recognize abstractions for what they are. Many abstractions simply become reality as we know it. They are ‘looped’ into existence, as with race realism, capitalist realism, etc.

Ideological abstractions become so pervasive and systemic that we lose the capacity to think outside of them. They form our reality tunnel.

This wasn’t always so. Humans used to conceive of and hence perceive the world far differently. And this shaped their sense of identity, which is hard for us to imagine.

* * *

Dynamics of Human Biocultural Diversity:
A Unified Approach

by Elisa J. Sobo
(Kindle Locations 94-104)

Until now, many biocultural anthropologists have focused mainly on the ‘bio’ half of the equation, using ‘biocultural’ generically, like biology, to refer to genetic, anatomical, physiological, and related features of the human body that vary across cultural groups. The number of scholars with a more sophisticated approach is on the upswing, but they often write only for super-educated expert audiences. Accordingly, although introductory biocultural anthropology texts make some attempt to acknowledge the role of culture, most still treat culture as an external variable— as an add-on to an essentially biological system. Most fail to present a model of biocultural diversity that gives adequate weight to the cultural side of things.

Note that I said most, not all: happily, things are changing. A movement is afoot to take anthropology’s claim of holism more seriously by doing more to connect— or reconnect— perspectives from both sides of the fence. Ironically, prior to the industrial revolution and the rise of the modern university, most thinkers took a very comprehensive view of the human condition. It was only afterward that fragmented, factorial, compartmental thinking began to undermine our ability to understand ourselves and our place in— and connection with— the world. Today, the leading edge of science recognizes the links and interdependencies that such thinking keeps falsely hidden.

Nature, Human Nature, and Human Difference:
Race in Early Modern Philosophy
by Justin E. H. Smith

pp. 9-10

The connection to the problem of race should be obvious: kinds of people are to no small extent administered into being, brought into existence through record keeping, census taking, and, indeed, bills of sale. A census form asks whether a citizen is “white,” and the possibility of answering this question affirmatively helps to bring into being a subkind of the human species that is by no means simply there and given, ready to be picked out, prior to the emergence of social practices such as the census. Censuses, in part, bring white people into existence, but once they are in existence they easily come to appear as if they had been there all along. This is in part what Hacking means by “looping”: human kinds, in contrast with properly natural kinds such as helium or water, come to be what they are in large part as a result of the human act of identifying them as this or that. Two millennia ago no one thought of themselves as neurotic, or straight, or white, and nothing has changed in human biology in the meantime that could explain how these categories came into being on their own. This is not to say that no one is melancholic, neurotic, straight, white, and so on, but only that how that person got to be that way cannot be accounted for in the same way as, say, how birds evolved the ability to fly, or how iron oxidizes.

In some cases, such as the diagnosis of mental illness, kinds of people are looped into existence out of a desire, successful or not, to help them. Racial categories seem to have been looped into existence, by contrast, for the facilitation of the systematic exploitation of certain groups of people by others. Again, the categories facilitate the exploitation in large part because of the way moral status flows from legal status. Why can the one man be enslaved, and the other not? Because the one belongs to the natural-seeming kind of people that is suitable for enslavement. This reasoning is tautological from the outside, yet self-evident from within. Edward Long, as we have seen, provides a vivid illustration of it in his defense of plantation labor in Jamaica. But again, categories cannot be made to stick on the slightest whim of their would-be coiner. They must build upon habits of thinking that are already somewhat in place. And this is where the history of natural science becomes crucial for understanding the history of modern racial thinking, for the latter built directly upon innovations in the former. Modern racial thinking could not have taken the form it did if it had not been able to piggyback, so to speak, on conceptual innovations in the way science was beginning to approach the diversity of the natural world, and in particular of the living world.

This much ought to be obvious: racial thinking could not have been biologized if there were no emerging science of biology. It may be worthwhile to dwell on this obvious point, however, and to see what more unexpected insights might be drawn out of it. What might not be so obvious, or what seems to be ever in need of renewed pointing out, is a point that ought to be of importance for our understanding of the differing, yet ideally parallel, scope and aims of the natural and social sciences: the emergence of racial categories, of categories of kinds of humans, may in large part be understood as an overextension of the project of biological classification that was proving so successful in the same period. We might go further, and suggest that all of the subsequent kinds of people that would emerge over the course of the nineteenth and twentieth centuries, the kinds of central interest to Foucault and Hacking, amount to a further reaching still, an unprecedented, peculiarly modern ambition to make sense of the slightest variations within the human species as if these were themselves species differentia. Thus for example Foucault’s well-known argument that until the nineteenth century there was no such thing as “the homosexual,” but only people whose desires could impel them to do various things at various times. But the last two centuries have witnessed a proliferation of purportedly natural kinds of humans, a typology of “extroverts,” “depressives,” and so on, whose objects are generally spoken of as if on an ontological par with elephants and slime molds. Things were not always this way. In fact, as we will see, they were not yet this way throughout much of the early part of the period we call “modern.”

Time and Trauma

And I think of that “Groundhog Day” movie with Bill Murray in which he repeats the same day, again and again, with only minor changes. If you’ve seen the movie, Murray finally breaks out of what appears to be an infinite loop only when he changes his ways, his approach to life, his mentality. He becomes a better person and even gets the girl.

When is the USA going to break out of its infinite loop of war? Only when we change our culture, our mentality.

A “war on terror” is a forever war, an infinite loop, in which the same place names and similar actions crop up again and again. Names like Mosul and Helmand province. Actions like reprisals and war crimes and the deaths of innocents, because that is the face of war.

~W.J. Astore, Happy 4th of July! And a Global War on Something

* * *

The impression we form is that it is not that linear time perception or experience has been corrupted by trauma; it is that time “itself” has been traumatized — so that we come to comprehend “history” not as a random sequence of events, but as a series of traumatic clusters. This broken time, this sense of history as a malign repetition, is “experienced” as seizure and breakdown; I have placed “experienced” in inverted commas here because the kind of voiding interruption of subjectivity seems to obliterate the very conditions that allows experience to happen.

It is as if the combination of adolescent erotic energy with an inorganic artefact … produces a trigger for a repeating of the ancient legend. It is not clear that “repeating” is the right word here, though. It might be better to say that the myth has been re-instantiated, with the myth being understood as a kind of structure that can be implemented whenever the conditions are right. But the myth doesn’t repeat so much as it abducts individuals out of linear time and into its “own” time, in which each iteration of the myth is in some sense always the first time.

…the mythic is part of the virtual infrastructure which makes human life as such possible. It is not the case that first of all there are human beings, and the mythic arrives afterwards, as a kind of cultural carapace added to a biological core. Humans are from the start — or from before the start, before the birth of the individual — enmeshed in mythic structures.

~Mark Fisher, “Eerie Thanatos,” The Weird and the Eerie (pp. 96-97)