Gender and Personality on the Autism Spectrum

There is ongoing debate about autism: how it is defined, what causes it, and, in turn, how it is and should be diagnosed. Some have speculated that autism in girls and women is underdiagnosed:

However, it’s unclear whether this gender bias is the result of genetics or reflects differences in diagnosis or the way females manifest symptoms of the disorder. Girls with autism tend to actively compensate for their symptoms in ways that boys don’t, which may account for the discrepancy, says Skuse.

As a result, the females enrolled in studies may tend to be severely affected and carry multiple mutations. “There is some suggestion that higher-functioning females are out there in the general population, but they’re not being referred,” he says.

Here is what one could argue. If there is a bias, it most likely is not just in diagnosis, for there would be a directly related bias in the research itself. After all, it is diagnosis that determines who becomes a subject in autism studies. So, if diagnosis is biased, there is no reason to assume that the subjects are representative of the full autistic population. Biased input would inevitably lead to biased results and hence biased conclusions. Basically, these studies at present might not be able to tell us anything about possible gender differences.

A reason given for the alleged failure to detect female autism is that “it may be because girls are better at masking the symptoms – better at copying social norms while not necessarily understanding them.” That might be true of many boys and men as well.

I have some Asperger’s-like traits, although I’ve never been diagnosed. Maybe it’s because I learned to fit in. I was socially clueless when younger and social situations stress me out, a set of factors exacerbated by my inner-focused nature. I don’t connect easily with others. But you wouldn’t notice that from casually interacting with me. I know how to pretend to be normal. It’s maybe why therapy has never worked for me, as I’ve developed a habit of effectively hiding my problems. It’s a survival mechanism that I learned young.

What occurs to me is that I’m a Jungian Feeling type. Myers-Briggs testing has found that most Feeling types are female, although about 30% are male. The same pattern, in the opposite direction, is seen with Thinking types. There is a general pattern that follows gender lines. Still, that approximate third of the population is a significant number. That might mean that a third of male autistics don’t fit into the male pattern, while maybe a third of female autistics do.

So the seeming gender difference found in autism could be more about personality differences. And those personality differences may or may not be genetic in nature. Much of this could instead be culturally learned behavior. If so, it wouldn’t only be cultural bias in the diagnosis of autism; it would also be cultural bias in how autism is expressed. In that case, the question is what the relationship might be between culture, personality, gender, and neurocognitive development. There are obviously many complex factors involved, such as the fact that a significant number of people don’t fall into simple gender categories: “It’s far from uncommon for people to carry genetics of both sexes, even multiple DNA.” Since gender isn’t binary, the expressions of autism presumably also wouldn’t be binary.

It would be easy to test my speculation if it were formulated as a hypothesis. My prediction would be that Thinking type females would be more likely to be diagnosed as autistic. And the opposite prediction would be that Feeling type males would be less likely. That is simply to say that autism would express differently depending on personality traits/functions. Similar research could be done with the FFM/Big Five, and maybe such research already has been done. A related issue that would need to be disentangled is whether autism is more common among certain personalities or simply more diagnosed among certain personalities, an issue that could be studied either in relation to or separate from gender.
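As a rough sketch only, here is one way the first prediction could in principle be checked: compare diagnosis counts for Thinking-type and Feeling-type females in a 2x2 contingency table and run a chi-square test. The counts below are invented placeholders, not data from any study, and the same comparison would simply be mirrored for males.

```python
# Hypothetical sketch: do diagnosis rates differ by Thinking/Feeling
# preference among females? All counts are invented placeholders.
from scipy.stats import chi2_contingency

female_counts = [
    [40, 460],  # Thinking-type females: [diagnosed autistic, not diagnosed]
    [15, 485],  # Feeling-type females:  [diagnosed autistic, not diagnosed]
]

chi2, p_value, dof, expected = chi2_contingency(female_counts)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")

# A small p-value, with the higher diagnosis rate among Thinking-type
# females, would be consistent with the prediction; the opposite test
# would compare Feeling-type males against Thinking-type males.
```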

All of this is particularly complicated for certain Myers-Briggs types. My specific type is INFP. This type is one of the most talented types when it comes to masking behavior, “known as being inscrutable.” As Carl Jung described dominant introverted feeling (what Myers-Briggs divides into two types: INFP and ISFP):

They are mostly silent, inaccessible, hard to understand; often they hide behind a childish or banal mask, and their temperament is inclined to melancholy…Their outward demeanor is harmonious, inconspicuous…with no desire to affect others, to impress, influence or change them in any way. If this is more pronounced, it arouses suspicion of indifference and coldness…Although there is a constant readiness for peaceful and harmonious co-existence, strangers are shown no touch of amiability, no gleam of responsive warmth…It might seem on a superficial view that they have no feelings at all.
(Psych. Types, Para. 640-641)

An INFP aspie would make for a rather confusing specimen. It is the dominant introverted feeling that is so hard to discern. And this introverted feeling is hidden behind the chameleon-like and outward-facing extraverted intuition, which occupies what is called the auxiliary position. Extraverted intuition is the ultimate mask to hide behind, as it is highly fluid and adaptable. And as the auxiliary function, extraverted intuition plays the role of mediation with and defense against the outside world.

Maybe a significant number of autistics have hidden introverted feeling. This would fit the autistic pattern of feeling strongly in response to others (high functioning affective empathy) while not easily connecting to others (low functioning cognitive empathy). By its nature, there is no straightforward way for introverted feeling to be expressed in social behavior. Yet an INFP can be talented at learning normal social behavior, as extraverted intuition helps them to be mimics. Or failing that, they could stonewall anyone trying to figure them out. Usually being conflict avoidant, most dominant introverted feeling types will go along to get along, as long as no one treads on their core sense of value.

Here is a more general point:

I think it’s a bit silly to make a distinction between “male” and “female” interests in the first place and realize that it can also be healthy for women to take interest in more traditionally “male” subjects such as science and technology and that doesn’t always mean that they have a disorder. In making a diagnosis they should always be aware of the underlying pattern rather than the actual interest and keep in mind that interests may differ for each individual, so (e.g.) whether a female is obsessively talking about computers or fashion should not matter, because the pattern is the same. Indeed, it probably is more obvious in the first case, especially when society is more geared toward male/female stereotyping [so “masculine” interests for women stand out]. And besides, narrow interests is but 1 clue, it doesn’t count for every individual with an ASD; they may have a range of interests, just as typical people do.

Also, as some typologists argue, the US has been a society dominated by ESTJ types that is becoming dominated by ENTJ types (John Giannini, Compass of the Soul). The commonality, then, is E_TJ, that is to say dominant extraverted thinking. This typological bias is how American culture defines and enforces the social norms of the male gender. Unsurprisingly, that would also be how autism gets diagnosed: according to extraversion and thinking.

On the other hand, autism that was introverted and/or feeling would express in entirely different ways. In particular, dominant introverted feeling would express as strong affective empathy, rather than the (stereotypically) masculine tendency toward emotional detachment. Also, introversion taken on its own, whether in relation to feeling or thinking, would be more internalized and hence less observable — meaning obsessions that would be unlikely to be seen in outward behavior: more subtle and nuanced or else more hidden and masked.

This personality perspective might be immensely more helpful than using a gender lens alone. It’s also a more psychologically complex frame of interpretation, which appeals to my personality predilections. Considering that autism and Asperger’s were originally observed and defined by men, one might wonder what kind of personalities those men had. Their personalities might have determined which personalities they were drawn to studying and hence drawn to using as the standard for their early models of the autism spectrum.


Fluidity of Perceived Speciation

There is a Princeton article that discusses a study on speciation. Some researchers observed a single finch that became isolated from its own species. The island it ended up on, though, had several other species of finch. So, it crossed the species divide to mate with one of the other populations.

That alone calls into question the very meaning of species. It was neither genetics nor behavior that kept these breeding populations separate. It was simply geographic distance. Eliminate that geographic factor and hybridization quickly follows. The researchers argue that this hybridization represents a new species. But their observations cover only a short period of time. There is no reason to assume that further hybridization won’t occur, causing this population to slowly assimilate back into the original local population, the genetic variance lessening over time (as with populations of Homo sapiens that hybridized with other hominids such as Neanderthals).

All this proves is that our present definition of ‘species’ isn’t always particularly scientific, in the sense of being useful for careful understanding. Of course, it’s not hard to create a separate breeding population. But if separate breeding populations don’t have much genetic difference and can easily interbreed, then how is calling them separate species meaningful in any possible sense of that word? Well, it isn’t meaningful.

This study showed that sub-populations can become isolated for periods of time. What it doesn’t show is that this isolation will be long-lasting, as it isn’t entirely known what caused the separation of the breeding populations in the first place. For example, we don’t know to what extent the altered bird songs are related to genetics versus epigenetics, microbiome, environmental shifts, learned behavior, etc. The original lost and isolated finch carried with it much more than the genetics of its species. It would be unscientific to conclude much from such limited information and observations.

The original cause(s) might change again. In that case, the temporary sub-population would lose the traits, in this case birdsong, that have separated it. That probably happens all the time, temporary changes within populations and occasional hybridized populations appearing only to disappear again. But it’s probably rare that these changes become permanent so as to develop into genuinely separate species, in the meaningful sense of being genetically and behaviorally distinct to a large enough degree.

Also, the researchers didn’t eliminate the possible explanation of what in humans would be called culture. Consider mountain lions. Different mountain lion populations will only hunt certain prey species. This isn’t genetically determined behavior. Rather, specific hunting techniques are taught from mother to cub. But this could create separate breeding populations because, in some cases, they might hunt in different areas where the various prey are concentrated. Even so, this hasn’t separated the mountain lion populations into different species. They remain genetically the same.

Sure, give it enough time combined with environmental changes, and then speciation might follow. But speciation can’t be determined by behavior alone, even when combined with minor genetic differences. Otherwise, that would mean every human culture on the planet is a separate species. The Irish would be a separate species from the English. The Germans would be a separate species from the French. The Chinese would be a separate species from the Japanese. Et cetera. This is ludicrous, even though some right-wingers might love the idea; in fact, an early pre-scientific definition treated races as separate species or sub-species. But as we know, humans have some of the lowest levels of genetic diversity compared with similar species.

Our notion of species is too simplistic. We have this simplistic view because, as our lives are short and science is young, we only have a snapshot of nature. Supposed species are probably a lot more fluid than the present paradigm allows for. The perceived or imposed boundaries of ‘species’ could constantly be changing with various sub-populations constantly emerging and merging, with environmental niches constantly shifting and coalescing. The idea of static species generally seems unhelpful, except maybe in rare cases where a species becomes isolated over long periods of time (e.g., the ice age snails surviving in a few ice caves in Iowa and Illinois) or else in species that are so perfectly adapted that evolutionary conditions have had little apparent impact (e.g., crocodiles).

We easily forget that modern science hasn’t been studying nature for very long. As I often repeat, our ignorance is vast beyond comprehension, much greater than our present knowledge.

As an amusing related case, some species will have sex with entirely different species. Hybridization isn’t even possible in such situations. It’s not clear why this happens. An example of this is a particular population of monkeys sexually mounting deer and, as they sometimes get grooming and food out of the deal, a fair number of the deer tolerate the behavior. There is no reason to assume these deer-mounting monkeys have evolved into a new species, as compared to nearby populations of monkeys who don’t sexually molest hoofed animals. Wild animals don’t seem to care all that much what modern humans think of them. Abstract categories of species don’t stop them from acting however they so desire. And it hasn’t always stopped humans either, whether between the supposed races within the human species or across the supposed divide of species.

From the lascivious monkey article (linked directly above):

“Finally, the researchers say, this might be a kind of cultural practice. Japanese macaques display different behaviors in different locations — some wash their food, or take hot-spring baths, or play with snowballs.

“Adolescent females grinding on the backs of deer might similarly be a cultural phenomenon. But it has only been observed at Minoo within the past few years.

“The monkey-deer sexual interactions reported in our paper may reflect the early stage development of a new behavioural tradition at Minoo,” Gunst-Leca told The Guardian.

“Alternatively, the paper notes, it could be a “short-lived fad.” Time will tell.”

Ian Cheng on Julian Jaynes

Down an Internet Rabbit Hole With an Artist as Your Guide
by Daniel McDermon

The art of Ian Cheng, for example, is commonly described in relation to video games, a clear influence. But the SI: Visions episode about him touches only lightly on that connection and on Mr. Cheng’s career, which includes a solo exhibition earlier this year at MoMA PS1. Instead, viewers go on a short but heady intellectual journey, narrated by Mr. Cheng, who discusses improv theater and the esoteric theories of the psychologist Julian Jaynes.

Jaynes, Mr. Cheng said, posits that ancient people weren’t conscious in the way that modern humans are. “You and I hear an internal voice and we perceive it to be a voice that comes from us,” Mr. Cheng says in the video. But Jaynes argued that those voices might well have been perceived as other people.

In that theory, Mr. Cheng explained in an interview, “The mind is actually composed of many sub-people inside of you, and any one of those people is getting the spotlight at any given time.” It’s a model of consciousness that is echoed in the film “Inside Out,” in which an adolescent girl’s mind comprises five different characters.

This conception of consciousness and motivation helped him build out the triad of digital simulations that were shown at MoMA PS1. In those works, Mr. Cheng created characters and landscapes, but the narrative that unfolds is beyond his control. He has referred to them as “video games that play themselves.”

This is a struggle for power.

“[D]espite their misgivings, conservatives have reconciled themselves to capitalism because the expansion of the market has also meant the growth of the private sphere of domination and control. . . . [C]onservatism is about the freedom and ability of some people to dominate, control, and extract from others, which capitalist inequality and hierarchy make possible.”
Peter Kolozi

The United States is a plutocracy. But ultimately that means oligarchy. The reason the wealthy rule is that wealth is power. I would clarify a point, though. Wealth isn’t limited to direct power over the masses, for it also allows the wealthy to control all aspects of society, even those not directly related to wealth.

The plutocrats aren’t powerful merely because of wealth. It is that they are part of a ruling elite that works together to shut out everyone else, to exclude the majority not just from wealth but more importantly from power. This means maintaining their control of privileges and resources, by controlling the system of politics, economics, media, and education.

This is why the United States is such a brutally oppressive society. Much of what the ruling elite does comes at great costs to themselves, although at even greater costs to everyone else, which is the point in always ensuring others are harmed more. First and foremost, the purpose is social control. Wealth is a means to that end, but there are many other means to that end: military imperialism, police-intelligence state, mass incarceration, media propaganda, and much else.

It’s not so much class war, in the simple sense. “This is a struggle for power,” as Caitlin Johnstone makes clear in the piece below. And at present most Americans are losing the fight. This isn’t a metaphor. Millions of Americans are victims — locked away or otherwise trapped in the legal system, struggling in poverty and homelessness, sick and dying for lack of healthcare. Millions more are barely getting by and, out of fear, kept in their place as slaves to the system. The power and its consequences are concretely and viscerally real.

It is a war with growing numbers of casualties. But if the American public could realize the power that exists in numbers, it could instead become a revolution.

On a related note, thoughts along these lines lead straight to issues of inequality. As with oligarchy, inequality isn’t only about who has most of the wealth. As a divide in wealth indicates a divide in power, what this means is a divide in political membership and representation. It becomes harder for most Americans to participate in politics, partly because they don’t have the time or money to participate.

It requires ever larger amounts of wealth and resources, not to mention crony connections, to run a successful campaign. And those who do get elected do so either by belonging to the oligarchy or by becoming indebted to the oligarchy. This is why, as Johnstone points out, studies have shown that politicians mostly do whatever the rich want them to do.

As the rich gain greater power, they gain greater leverage to take even more power. It’s a cycle that has only one end point, total authoritarianism. That is, unless we the public stop it.

Some ask why it matters that an elite has more money than everyone else. What an unbelievably naive question that is. To anyone who is confused on the issue, I’d suggest that they simply open their eyes.

* * *

The Real Reason The Elites Keep Killing Single-Payer
by Caitlin Johnstone

The word “oligarchy” gets thrown around a lot in progressive discourse, usually to highlight the problem of money in politics, but not many people seem to really settle in and grapple with the hefty implications of what that word actually means. If you say that America is an oligarchy (and it certainly is, which we’ll get to in a second), you’re not merely saying that there is too much money in US politics or that the wealthy have an unfair amount of power in America. Per definition, you are saying that a small class of elites rule over you and your nation, like a king rules over his kingdom.

You’ve studied history, in school if nowhere else. How often have you read about kings voluntarily relinquishing their thrones and handing power to their subjects out of the goodness of their hearts? Once someone makes it to the very top of a society, how often have you known them to eagerly step down from that power position in order to give the people self-rule?

This isn’t about money, this is about power. The wealthiest of the wealthy in America haven’t been doing everything they can to stave off universal healthcare and economic justice in order to save a few million dollars. They haven’t been fighting to keep you poor because they are money hoarders and they can’t bare to part with a single penny from their trove. It’s so much more sinister than that: the goal isn’t to keep you from making the plutocrats a little less wealthy, the goal is to keep you from having any wealth of your own.

Power is intrinsically relative: it only exists in relation to the amount of power that other people have or don’t have. If we all have the same amount of government power, then none of us has any power over the other. If, however, I can figure out a way to manipulate the system into giving me 25 percent more governmental power than anyone else, power has now entered into the equation, and I have an edge over everyone else that I can use to my advantage. But that edge only exists due to the fact that you’re all 25 percent less powerful than I am. If you all become five percent more powerful, my power is instantly diminished by that much, in the same way a schoolyard bully would no longer enjoy the same amount of dominance if everyone at school suddenly grew five percent bigger and stronger.

Here’s where I’m going with all this: the ruling elites have set up a system where wealth equals power. In order for them to rule, in order for them to enjoy the power of kings, they necessarily need to keep the general public from wealth. Not so that they can have a little more money for themselves in case they want to buy a few extra private jets or whatever, but because their power is built upon your lack of power. By keeping you from having a few thousand extra dollars of spending money throughout the year, they guarantee that you and your fellow citizens won’t pool that extra money toward challenging their power in the wealth-equals-power paradigm that they’ve set up for themselves. […]

You can see, then, why the oligarchs must resist socialism and populism tooth and claw. You can see why their media propaganda outlets are so ferociously dedicated to tearing down any sincere attempt to fight the Walmart economy or allow an inch of ground to be gained in bringing any economic power to ordinary Americans. By asking for economic justice, you may think that you are simply asking for a small slice of the enormous pie the billionaire class could never hope to eat in a single lifetime, but what you are actually doing is asking for their crown, their throne and their scepter. You are making yourself a direct existential threat to their dynasty.

This is why they fought so hard to stomp out the Sanders movement. It wasn’t that Sanders himself was a threat to them, it’s that a large group of the unwashed masses was pooling their wealth together and leaping over seemingly insurmountable obstacles using nothing but tiny $27 donations as fuel. Imagine if Americans had more disposable income to invest in a better future for their kids by pointing it at changing America’s political landscape? Imagine a populist movement where Americans pushing for economic justice can suddenly all afford to pool a bunch of $270 donations to support a beloved candidate or agenda? Or $2,700? Under the current money-equals-power paradigm, the will of the people would become unstoppable, and the US power establishment would be forced to reshape itself in a way that benefits the people instead of benefitting a few billionaires.

* * *

Basalat Raja, Jun 24

This is why you will find them behaving in ways that are opposite to their “direct interests” — if you assume that their “direct interests” are making more money. A prosperous middle class will make the rich even richer, because more of us will be able to buy the products from the companies that they own large amounts of shares in, leading to more profits for those companies, and obviously, lifting share prices, making them richer.

But that means less control, since economically contented people are harder to herd. If you have a decent job and a decent house, it’s harder to tell you that Mexicans/Muslims/Russians/gays/etc. have stolen your job, etc.

“Lack of the historical sense is the traditional defect in all philosophers.”

Human, All Too Human: A Book for Free Spirits
by Friedrich Wilhelm Nietzsche

The Traditional Error of Philosophers.—All philosophers make the common mistake of taking contemporary man as their starting point and of trying, through an analysis of him, to reach a conclusion. “Man” involuntarily presents himself to them as an aeterna veritas as a passive element in every hurly-burly, as a fixed standard of things. Yet everything uttered by the philosopher on the subject of man is, in the last resort, nothing more than a piece of testimony concerning man during a very limited period of time. Lack of the historical sense is the traditional defect in all philosophers. Many innocently take man in his most childish state as fashioned through the influence of certain religious and even of certain political developments, as the permanent form under which man must be viewed. They will not learn that man has evolved, that the intellectual faculty itself is an evolution, whereas some philosophers make the whole cosmos out of this intellectual faculty. But everything essential in human evolution took place aeons ago, long before the four thousand years or so of which we know anything: during these man may not have changed very much. However, the philosopher ascribes “instinct” to contemporary man and assumes that this is one of the unalterable facts regarding man himself, and hence affords a clue to the understanding of the universe in general. The whole teleology is so planned that man during the last four thousand years shall be spoken of as a being existing from all eternity, and with reference to whom everything in the cosmos from its very inception is naturally ordered. Yet everything evolved: there are no eternal facts as there are no absolute truths. Accordingly, historical philosophising is henceforth indispensable, and with it honesty of judgment.

What Locke Lacked
by Louise Mabille

Locke is indeed a Colossus of modernity, but one whose twin projects of providing a concept of human understanding and political foundation undermine each other. The specificity of the experience of perception alone undermines the universality and uniformity necessary to create the subject required for a justifiable liberalism. Since mere physical perspective can generate so much difference, it is only to be expected that political differences would be even more glaring. However, no political order would ever come to pass without obliterating essential differences. The birth of liberalism was as violent as the Empire that would later be justified in its name, even if its political traces are not so obvious. To interpret is to see in a particular way, at the expense of all other possibilities of interpretation. Perspectives that do not fit are simply ignored, or as that other great resurrectionist of modernity, Freud, would concur, simply driven underground. We ourselves are the source of this interpretative injustice, or more correctly, our need for a world in which it is possible to live, is. To a certain extent, then, man is the measure of the world, but only his world. Man is thus a contingent measure and our measurements do not refer to an original, underlying reality. What we call reality is the result not only of our limited perspectives upon the world, but the interplay of those perspectives themselves. The liberal subject is thus a result of, and not a foundation for, the experience of reality. The subject is identified as origin of meaning only through a process of differentiation and reduction, a course through which the will is designated as a psychological property.

Locke takes the existence of the subject of free will – free to exercise political choice such as rising against a tyrant, choosing representatives, or deciding upon political direction – simply for granted. Furthermore, he seems to think that everyone should agree as to what the rules are according to which these events should happen. For him, the liberal subject underlying these choices is clearly fundamental and universal.

Locke’s philosophy of individualism posits the existence of a discreet and isolated individual, with private interests and rights, independent of his linguistic or socio-historical context. C. B. MacPhearson identifies a distinctly possessive quality to Locke’s individualist ethic, notably in the way in which the individual is conceived as proprietor of his own personhood, possessing capacities such as self-reflection and free will. Freedom becomes associated with possession, which the Greeks would associate with slavery, and society conceived in terms of a collection of free and equal individuals who are related to each through their means of achieving material success – which Nietzsche, too, would associate with slave morality.  […]

There is a central tenet to John Locke’s thinking that, as conventional as it has become, remains a strange strategy. Like Thomas Hobbes, he justifies modern society by contrasting it with an original state of nature. For Hobbes, as we have seen, the state of nature is but a hypothesis, a conceptual tool in order to elucidate a point. For Locke, however, the state of nature is a very real historical event, although not a condition of a state of war. Man was social by nature, rational and free. Locke drew this inspiration from Richard Hooker’s Laws of Ecclesiastical Polity, notably from his idea that church government should be based upon human nature, and not the Bible, which, according to Hooker, told us nothing about human nature. The social contract is a means to escape from nature, friendlier though it be on the Lockean account. For Nietzsche, however, we have never made the escape: we are still holus-bolus in it: ‘being conscious is in no decisive sense the opposite of the instinctive – most of the philosopher’s conscious thinking is secretly directed and compelled into definite channels by his instincts. Behind all logic too, and its apparent autonomy there stand evaluations’ (BGE, 3). Locke makes a singular mistake in thinking the state of nature a distant event. In fact, Nietzsche tells us, we have never left it. We now only wield more sophisticated weapons, such as the guilty conscience […]

Truth originates when humans forget that they are ‘artistically creating subjects’ or products of law or stasis and begin to attach ‘invincible faith’ to their perceptions, thereby creating truth itself. For Nietzsche, the key to understanding the ethic of the concept, the ethic of representation, is conviction […]

Few convictions have proven to be as strong as the conviction of the existence of a fundamental subjectivity. For Nietzsche, it is an illusion, a bundle of drives loosely collected under the name of ‘subject’ —indeed, it is nothing but these drives, willing, and actions in themselves—and it cannot appear as anything else except through the seduction of language (and the fundamental errors of reason petrified in it), which understands and misunderstands all action as conditioned by something which causes actions, by a ‘Subject’ (GM I 13). Subjectivity is a form of linguistic reductionism, and when using language, ‘[w]e enter a realm of crude fetishism when we summon before consciousness the basic presuppositions of the metaphysics of language — in plain talk, the presuppositions of reason. Everywhere reason sees a doer and doing; it believes in will as the cause; it believes in the ego, in the ego as being, in the ego as substance, and it projects this faith in the ego-substance upon all things — only thereby does it first create the concept of ‘thing’ (TI, ‘Reason in Philosophy’ 5). As Nietzsche also states in WP 484, the habit of adding a doer to a deed is a Cartesian leftover that begs more questions than it solves. It is indeed nothing more than an inference according to habit: ‘There is activity, every activity requires an agent, consequently – (BGE, 17). Locke himself found the continuous existence of the self problematic, but did not go as far as Hume’s dissolution of the self into a number of ‘bundles’. After all, even if identity shifts occurred behind the scenes, he required a subject with enough unity to be able to enter into the Social Contract. This subject had to be something more than merely an ‘eternal grammatical blunder’ (D, 120), and willing had to be understood as something simple. For Nietzsche, it is ‘above all complicated, something that is a unit only as a word, a word in which the popular prejudice lurks, which has defeated the always inadequate caution of philosophers’ (BGE, 19).

Nietzsche’s critique of past philosophers
by Michael Lacewing

Nietzsche is questioning the very foundations of philosophy. To accept his claims means being a new kind of philosopher, one whose ‘taste and inclination’, whose values, are quite different. Throughout his philosophy, Nietzsche is concerned with origins, both psychological and historical. Much of philosophy is usually thought of as an a priori investigation. But if Nietzsche can show, as he thinks he can, that philosophical theories and arguments have a specific historical basis, then they are not, in fact, a priori. What is known a priori should not change from one historical era to the next, nor should it depend on someone’s psychology. Plato’s aim, the aim that defines much of philosophy, is to be able to give complete definitions of ideas – ‘what is justice?’, ‘what is knowledge?’. For Plato, we understand an idea when we have direct knowledge of the Form, which is unchanging and has no history. If our ideas have a history, then the philosophical project of trying to give definitions of our concepts, rather than histories, is radically mistaken. For example, in §186, Nietzsche argues that philosophers have consulted their ‘intuitions’ to try to justify this or that moral principle. But they have only been aware of their own morality, of which their ‘justifications’ are in fact only expressions. Morality and moral intuitions have a history, and are not a priori. There is no one definition of justice or good, and the ‘intuitions’ that we use to defend this or that theory are themselves as historical, as contentious as the theories we give – so they offer no real support. The usual ways philosophers discuss morality misunderstand morality from the very outset. The real issues of understanding morality only emerge when we look at the relation between this particular morality and that. There is no world of unchanging ideas, no truths beyond the truths of the world we experience, nothing that stands outside or beyond nature and history.

GENEALOGY AND PHILOSOPHY

Nietzsche develops a new way of philosophizing, which he calls a ‘morphology and evolutionary theory’ (§23), and later calls ‘genealogy’. (‘Morphology’ means the study of the forms something, e.g. morality, can take; ‘genealogy’ means the historical line of descent traced from an ancestor.) He aims to locate the historical origin of philosophical and religious ideas and show how they have changed over time to the present day. His investigation brings together history, psychology, the interpretation of concepts, and a keen sense of what it is like to live with particular ideas and values. In order to best understand which of our ideas and values are particular to us, not a priori or universal, we need to look at real alternatives. In order to understand these alternatives, we need to understand the psychology of the people who lived with them. And so Nietzsche argues that traditional ways of doing philosophy fail – our intuitions are not a reliable guide to the ‘truth’, to the ‘real’ nature of this or that idea or value. And not just our intuitions, but the arguments, and style of arguing, that philosophers have used are unreliable. Philosophy needs to become, or be informed by, genealogy. A lack of any historical sense, says Nietzsche, is the ‘hereditary defect’ of all philosophers.

MOTIVATIONAL ANALYSIS

Having long kept a strict eye on the philosophers, and having looked between their lines, I say to myself… most of a philosopher’s conscious thinking is secretly guided and channelled into particular tracks by his instincts. Behind all logic, too, and its apparent tyranny of movement there are value judgements, or to speak more clearly, physiological demands for the preservation of a particular kind of life. (§3)

A person’s theoretical beliefs are best explained, Nietzsche thinks, by evaluative beliefs, particular interpretations of certain values, e.g. that goodness is this and the opposite of badness. These values are best explained as ‘physiological demands for the preservation of a particular kind of life’. Nietzsche holds that each person has a particular psychophysical constitution, formed by both heredity and culture. […] Different values, and different interpretations of these values, support different ways of life, and so people are instinctively drawn to particular values and ways of understanding them. On the basis of these interpretations of values, people come to hold particular philosophical views. §2 has given us an illustration of this: philosophers come to hold metaphysical beliefs about a transcendent world, the ‘true’ and ‘good’ world, because they cannot believe that truth and goodness could originate in the world of normal experience, which is full of illusion, error, and selfishness. Therefore, there ‘must’ be a pure, spiritual world and a spiritual part of human beings, which is the origin of truth and goodness.

Philosophy and values

But ‘must’ there be a transcendent world? Or is this just what the philosopher wants to be true? Every great philosophy, claims Nietzsche, is ‘the personal confession of its author’ (§6). The moral aims of a philosophy are the ‘seed’ from which the whole theory grows. Philosophers pretend that their opinions have been reached by ‘cold, pure, divinely unhampered dialectic’ when in fact, they are seeking reasons to support their pre-existing commitment to ‘a rarefied and abstract version of their heart’s desire’ (§5), viz. that there is a transcendent world, and that good and bad, true and false are opposites. Consider: Many philosophical systems are of doubtful coherence, e.g. how could there be Forms, and if there were, how could we know about them? Or again, in §11, Nietzsche asks ‘how are synthetic a priori judgments possible?’. The term ‘synthetic a priori’ was invented by Kant. According to Nietzsche, Kant says that such judgments are possible, because we have a ‘faculty’ that makes them possible. What kind of answer is this?? Furthermore, no philosopher has ever been proved right (§25). Given the great difficulty of believing either in a transcendent world or in human cognitive abilities necessary to know about it, we should look elsewhere for an explanation of why someone would hold those beliefs. We can find an answer in their values. There is an interesting structural similarity between Nietzsche’s argument and Hume’s. Both argue that there is no rational explanation of many of our beliefs, and so they try to find the source of these beliefs outside or beyond reason. Hume appeals to imagination and the principle of ‘Custom’. Nietzsche appeals instead to motivation and ‘the bewitchment of language’ (see below). So Nietzsche argues that philosophy is not driven by a pure ‘will to truth’ (§1), to discover the truth whatever it may be. Instead, a philosophy interprets the world in terms of the philosopher’s values.
For example, the Stoics argued that we should live ‘according to nature’ (§9). But they interpret nature by their own values, as an embodiment of rationality. They do not see the senselessness, the purposelessness, the indifference of nature to our lives […]

THE BEWITCHMENT OF LANGUAGE

We said above that Nietzsche criticizes past philosophers on two grounds. We have looked at the role of motivation; the second ground is the seduction of grammar. Nietzsche is concerned with the subject-predicate structure of language, and with it the notion of a ‘substance’ (picked out by the grammatical ‘subject’) to which we attribute ‘properties’ (identified by the predicate). This structure leads us into a mistaken metaphysics of ‘substances’. In particular, Nietzsche is concerned with the grammar of ‘I’. We tend to think that ‘I’ refers to some thing, e.g. the soul. Descartes makes this mistake in his cogito – ‘I think’, he argues, refers to a substance engaged in an activity. But Nietzsche repeats the old objection that this is an illegitimate inference (§16) that rests on many unproven assumptions – that I am thinking, that some thing is thinking, that thinking is an activity (the result of a cause, viz. I), that an ‘I’ exists, that we know what it is to think. So the simple sentence ‘I think’ is misleading. In fact, ‘a thought comes when ‘it’ wants to, and not when ‘I’ want it to’ (§17). Even ‘there is thinking’ isn’t right: ‘even this ‘there’ contains an interpretation of the process and is not part of the process itself. People are concluding here according to grammatical habit’. But our language does not allow us just to say ‘thinking’ – this is not a whole sentence. We have to say ‘there is thinking’; so grammar constrains our understanding. Furthermore, Kant shows that rather than the ‘I’ being the basis of thinking, thinking is the basis out of which the appearance of an ‘I’ is created (§54). Once we recognise that there is no soul in a traditional sense, no ‘substance’, something constant through change, something unitary and immortal, ‘the way is clear for new and refined versions of the hypothesis about the soul’ (§12), that it is mortal, that it is multiplicity rather than identical over time, even that it is a social construct and a society of drives. Nietzsche makes a similar argument about the will (§19). Because we have this one word ‘will’, we think that what it refers to must also be one thing. But the act of willing is highly complicated. First, there is an emotion of command, for willing is commanding oneself to do something, and with it a feeling of superiority over that which obeys. Second, there is the expectation that the mere commanding on its own is enough for the action to follow, which increases our sense of power. Third, there is obedience to the command, from which we also derive pleasure. But we ignore the feeling the compulsion, identifying the ‘I’ with the commanding ‘will’. Nietzsche links the seduction of language to the issue of motivation in §20, arguing that ‘the spell of certain grammatical functions is the spell of physiological value judgements’. So even the grammatical structure of language originates in our instincts, different grammars contributing to the creation of favourable conditions for different types of life. So what values are served by these notions of the ‘I’ and the ‘will’? The ‘I’ relates to the idea that we have a soul, which participates in a transcendent world. It functions in support of the ascetic ideal. The ‘will’, and in particular our inherited conception of ‘free will’, serves a particular moral aim

Hume and Nietzsche: Moral Psychology (short essay)
by epictetus_rex

1. Metaphilosophical Motivation

Both Hume and Nietzsche advocate a kind of naturalism. This is a weak naturalism, for it does not seek to give science authority over philosophical inquiry, nor does it commit itself to a specific ontological or metaphysical picture. Rather, it seeks to (a) place the human mind firmly in the realm of nature, as subject to the same mechanisms that drive all other natural events, and (b) investigate the world in a way that is roughly congruent with our best current conception(s) of nature […]

Furthermore, the motivation for this general position is common to both thinkers. Hume and Nietzsche saw old rationalist/dualist philosophies as both absurd and harmful: such systems were committed to extravagant and contradictory metaphysical claims which hinder philosophical progress. Furthermore, they alienated humanity from its position in nature—an effect Hume referred to as “anxiety”—and underpinned religious or “monkish” practises which greatly accentuated this alienation. Both Nietzsche and Hume believe quite strongly that coming to see ourselves as we really are will banish these bugbears from human life.

To this end, both thinkers ask us to engage in honest, realistic psychology. “Psychology is once more the path to the fundamental problems,” writes Nietzsche (BGE 23), and Hume agrees:

the only expedient, from which we can hope for success in our philosophical researches, is to leave the tedious lingering method, which we have hitherto followed, and instead of taking now and then a castle or village on the frontier, to march up directly to the capital or center of these sciences, to human nature itself.” (T Intro)

2. Selfhood

Hume and Nietzsche militate against the notion of a unified self, both at-a-time and, a fortiori, over time.

Hume’s quest for a Newtonian “science of the mind” led him to classify all mental events as either impressions (sensory) or ideas (copies of sensory impressions, distinguished from the former by diminished vivacity or force). The self, or ego, as he says, is just “a kind of theatre, where several perceptions successively make their appearance; pass, re-pass, glide away, and mingle in an infinite variety of postures and situations. There is properly no simplicity in it at one time, nor identity in different; whatever natural propension we may have to imagine that simplicity and identity.” (Treatise 4.6) […]

For Nietzsche, the experience of willing lies in a certain kind of pleasure, a feeling of self-mastery and increase of power that comes with all success. This experience leads us to mistakenly posit a simple, unitary cause, the ego. (BGE 19)

The similarities here are manifest: our minds do not have any intrinsic unity to which the term “self” can properly refer, rather, they are collections or “bundles” of events (drives) which may align with or struggle against one another in a myriad of ways. Both thinkers use political models to describe what a person really is. Hume tells us we should “more properly compare the soul to a republic or commonwealth, in which the several members [impressions and ideas] are united by ties of government and subordination, and give rise to persons, who propagate the same republic in the incessant change of its parts” (T 261)

3. Action and The Will

Nietzsche and Hume attack the old platonic conception of a “free will” in lock-step with one another. This picture, roughly, involves a rational intellect which sits above the appetites and ultimately chooses which appetites will express themselves in action. This will is usually not considered to be part of the natural/empirical order, and it is this consequence which irks both Hume and Nietzsche, who offer two seamlessly interchangeable refutations […]

Since we are nothing above and beyond events, there is nothing for this “free will” to be: it is a causa sui, “a sort of rape and perversion of logic… the extravagant pride of man has managed to entangle itself profoundly and frightfully with just this nonsense” (BGE 21).

When they discover an erroneous or empty concept such as “Free will” or “the self”, Nietzsche and Hume engage in a sort of error-theorizing which is structurally the same. Peter Kail (2006) has called this a “projective explanation”, whereby belief in those concepts is “explained by appeal to independently intelligible features of psychology”, rather than by reference to the way the world really is.

The Philosophy of Mind
INSTRUCTOR: Larry Hauser
Chapter 7: Egos, bundles, and multiple selves

  • Who dat?  “I”
    • Locke: “something, I know not what”
    • Hume: the no-self view … “bundle theory”
    • Kant’s transcendental ego: a formal (nonempirical) condition of thought that the “I” must accompany every perception.
      • Intentional mental state: I think that snow is white.
        • to think: a relation between
          • a subject = “I”
          • a propositional content thought =  snow is white
      • Sensations: I feel the coldness of the snow.
        • to feel: a relation between
          • a subject = “I”
          • a quale = the cold-feeling
    • Friedrich Nietzsche
      • A thought comes when “it” will and not when “I” will. Thus it is a falsification of the evidence to say that the subject “I” conditions the predicate “think.”
      • It is thought, to be sure, but that this “it” should be that old famous “I” is, to put it mildly, only a supposition, an assertion. Above all it is not an “immediate certainty.” … Our conclusion is here formulated out of our grammatical custom: “Thinking is an activity; every activity presumes something which is active, hence ….” 
    • Lichtenberg: “it’s thinking” a la “it’s raining”
      • a mere grammatical requirement
      • no proof of a thinking self

[…]

  • Ego vs. bundle theories (Derek Parfit (1987))
    • Ego: “there really is some kind of continuous self that is the subject of my experiences, that makes decisions, and so on.” (95)
      • Religions: Christianity, Islam, Hinduism
      • Philosophers: Descartes, Locke, Kant & many others (the majority view)
    • Bundle: “there is no underlying continuous and unitary self.” (95)
      • Religion: Buddhism
      • Philosophers: Hume, Nietzsche, Lichtenberg, Wittgenstein, Kripke(?), Parfit, Dennett {a stellar minority}
  • Hume v. Reid
    • David Hume: For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure.  I never can catch myself at any time without a perception, and never can observe anything but the perception.  (Hume 1739, Treatise I, VI, iv)
    • Thomas Reid: I am not thought, I am not action, I am not feeling: I am something which thinks and acts and feels. (1785)

An Opportunity… Lost?

It’s hard to ignore the #MeToo movement right now. And that is a good thing. It is forcing much to the surface. This is a necessary, if difficult, process. But I fear that the recent trend of women coming forward will simultaneously go too far and not far enough.

For example, a recent allegation against Al Franken is that he had his arm around a woman during a photograph and squeezed her waist. Even assuming this allegation was honestly intended, such an action is not sexual misconduct. If the woman didn’t like being touched in that manner and thought it inappropriate, she simply should have stated so at the time. I realize it is difficult to speak out when a situation is uncomfortable, but making a public allegation of such minor behavior so long after the event in no way helps anyone involved. Besides, it is an insult to women (and men) who have actually experienced sexual harassment and assault.

That kind of thing will end up causing many to question the legitimacy of even the women who should be coming forward, which some fear is already leading to a backlash. And that would be an unfortunate and counterproductive result. What we need is a better process for dealing with these problems. Trial by media and judgment by public opinion is inherently anti-democratic. This atmosphere further promotes those like Trump, in pushing the reactionaries toward greater reaction. And partisanship makes the whole fiasco even more of a mess.

We live in a society where there is so much abuse, far beyond the recent cases: women molesting young boys, police harming minorities, the homeless being endlessly harassed, etc. People, not only men, in positions of authority misuse their power in victimizing others all the time. And most oppression is largely impersonal, as it is systemic and institutional. We need to be having a larger conversation. What has been happening lately is a good start. But I worry that this is where it will end. And as a society, after scapegoating a few people, we’ll likely go back to pretending that our society isn’t completely fucked up in a thousand ways.

When is there going to be a #MeToo movement for every time a poor person is unfairly evicted from their home, every time a family falls into debt because of medical costs, every time someone turns to prostitution or selling drugs in order to pay the bills, every time another person is sent to prison for a victimless crime, every time a poor black child is damaged by lead toxicity, every time an innocent is killed in America’s endless wars of aggression, and on and on? And when will the corporate media and identity politics activists equally obsess over the abuse, oppression, and prejudice millions of Americans (mostly poor or minority, many of them men) experience every day?

I want a just and fair society. But we refuse to look at all of this vast abuse and put it in context. It’s not limited to individuals or even limited to specific kinds of abuse. It all connects together, as a victim of one kind of abuse might grow up to become a victimizer in another kind of abuse, and few people want to talk about this culture and cycle of victimization where the status of victim and victimizer blurs over time. If we were honest with ourselves, we’d have to question our complicity in not protesting or revolting against such overwhelming moral depravity, which we have come to accept as normal or which, out of discomfort and apathy, we silently pretend isn’t happening.

I’m not feeling hopeful. Much of what is going on with sexual allegations is a useful and necessary first step. But it is a tiny step compared to how far we need to go. What needs to be challenged is the authoritarianism inherent to a hierarchical society built on high inequality — such as a powerful white woman like Hillary Clinton laughing about the death of Gaddafi (because Western imperialists didn’t want a Pan-African currency), an event that led to mass violence against untold numbers of poor brown people; and don’t forget that the pseudo-feminist Hillary Clinton attacked women who came forward with sexual allegations against her husband.

Even limiting ourselves to the patriarchy, consider the following. Some feminists want to emphasize the stereotypical narrative of older male victimizers and younger female victims. And some feminists want to claim that, even when boys and men are victimized, it isn’t systemic as it is with girls and women. But these ways of looking at abuse are part of the patriarchal worldview. The only way to challenge patriarchy is to move the frame of discussion toward intersectional politics, where it is understood that abusive women too can hold power over others and where most men are at risk of being victims of a systemically oppressive and abusive society.

When is that discussion going to happen? Besides the men’s rights activists, you can get some radical leftists to take it seriously. But there isn’t much discussion across the corporate media, in establishment politics, and among liberal class activists. This moment is an opportunity that could quickly become a lost opportunity. It’s far from clear that civil rights for women and all Americans will be advanced, rather than undermined and misdirected. Already it is becoming yet another political game and media spectacle, exactly what the establishment loves to promote in maintaining the status quo.

Unsurprisingly, as women have increased in number moving up the corporate ladder, the rate of workplace harassment of men by women has also increased. Are we hoping to lessen abusive behavior toward women or simply make abusive behavior more gender equal? Are we hoping to end the patriarchy or to help more women join the patriarchy?

Why can’t we get to the point of acknowledging that we live in a severely oppressive society that victimizes large numbers across numerous demographics? Why not focus on and publicly discuss the systemic, institutional, and pervasive reality of population-wide victimization and how it cyclically persists, sadly with many victims becoming a new generation of victimizers? What are we afraid of? And why would we rather allow this horrific moral failure to continue than accept our shared responsibility in dealing with it? If not now, when? Why do we continually delay justice? Why is it never the right time to seek reform and defend the rights of all victims?

Compassion isn’t a limited commodity and justice isn’t a zero sum game. I have a suggestion. Instead of being manipulated by divide and conquer, instead of seeking the benefit of one’s group alone, let’s seek solidarity against a system of victimization that makes it possible for victimizers to take advantage of others. Let’s all agree that so many of us have been harmed by this dysfunctional social order. Let’s work together to create a better society for everyone. Let’s give voice to a shared intention to include, support, and promote each other through a vision of common good. And let’s strive for a functioning social democracy built on the upholding and defense of constitutional civil rights and universal human rights.

That almost sounds inspiring and unifying. Now that is something to which we can all add an emphatic #MeToo!

* * *

How Politics Might Sour the #MeToo Movement
by R. Marie Griffith

Why the #MeToo Movement Should Be Ready for a Backlash
by Emily Yoffe

Could cascade of allegations send #MeToo movement off the rails?
by Pam Louwagie

Is #MeToo Only For Women? Should It Be?
by Abby Franquemont

What Happens When Men Say #MeToo, Too?
by Colin Beavan

To Stop the Abuse of Women, We Also Have to Stop the Abuse of Men
by Mike Kasdan and Lisa Hickey

When #MeToo Means #WeBlameYou
by T. S. Barracks

Yes, Men Can Be Sexually Harassed In The Workplace
by PLBSMH

More Men Report Sexual Harassment at Work
by Robert DiGiacomo

More men file workplace sexual harassment claims
by The Washington Times

Women Harassing Men
by Gretchen Voss

Sexual harassment of men more common than you think
by Emanuel Shirazi

Sexual harassment at work not just men against women
by Science Daily

Workplace sexual harassment at the margins
by Paula McDonald and Sara Charlesworth

When Men Face Sexual Harassment
by Romeo Vitelli

I Was Sexually Harassed By My (Female) Boss
by Eugene S. Robinson

Men who were sexually assaulted by women share their stories – and how their friends reacted
by Nicola Oakley

‘It’s hard for a guy to say, “I need help.”‘ How shelters reach out to male victims of domestic violence
by Jenny Jarvie

‘There is nowhere for us to go’: Domestic violence happens to men too
by Ginger Gorman

The Number of Male Domestic Abuse Victims Is Shockingly High — So Why Don’t We Hear About Them?
by Jenna Birch

Laughing at the Sexual Abuse of Boys?
by Amy Simpson

Why female violence against men is society’s last great taboo
by Martin Daubney

The Understudied Female Sexual Predator
by Conor Friedersdorf

10 Ways Female Sexual Predators Assault Men And Boys
by Jim Goad

Sexual Victimization by Women Is More Common Than Previously Known
by Lara Stemple and Ilan H. Meyer

Sexual victimization perpetrated by women: Federal data reveal surprising prevalence
by Lara Stemple, Andrew Flores, and Ilan H. Meyer

Sexual Abuse of Men by Women is Underreported
by Christen Hovet

Cases of sexual abuse of boys by women raise awareness of an uncommon crime
by Keith L. Alexander

Sexual Abuse of Boys
by Jim Hopper

When Men Are Raped
by Hanna Rosin

The Hidden Epidemic of Men Who Are Raped by Women
by Steven Blum

Myths and Facts
Adapted and expanded from an online piece by Ken Singer

500% Increase in Female Abuse of Men
by BLM

Latest Data From The ABS And AIC
by One in Three

Men Can Be Abused, Too
by domesticshelters.org

Embarrassment barrier for abused men
by Melissa Wishart

Why are so many MEN becoming victims of domestic violence?
by Antonia Hoyle

Woman As Aggressor: The Unspoken Truth Of Domestic Violence
by Edward Rhymes

Domestic Violence Against Men: Women More Likely To Be ‘Intimate Terrorists’ With Controlling Behavior In Relationships
by Lizette Borreli

The Truth About Domestic Violence Against Men
by Abby Jackson

What Domestic Violence Against Men Looks Like
by C. Brian Smith

Men Can Be Victims of Abuse, Too
by The National Domestic Violence Hotline

CDC Study: More Men than Women Victims of Partner Abuse
by Bert H. Hoff, J.D.

More than 40% of domestic violence victims are male, report reveals
by Denis Campbell

‘Men tolerate abuse’: Over 5,000 reports of domestic abuse against men made in 2016
by Hayley Halpin

Domestic violence against men soars to record levels as number of cases treble in past decade
by David Wooding

Domestic violence against men – Prevalence
by Wikipedia

Dark Triad Domination

It has been noted that some indigenous languages have words that can be interpreted as what, in English, is referred to as psychopathic, sociopathic, narcissistic, Machiavellian, etc. This is the territory of the Dark Triad. One Inuit language has the word ‘kunlangeta’, meaning “his mind knows what to do but he does not do it.” That could be thought of as describing a psychopath’s possession of cognitive empathy while lacking affective empathy. Or consider the Yoruba word ‘arankan’ that “is applied to a person who always goes his own way regardless of others, who is uncooperative, full of malice, and bullheaded.”

These are tribal societies. Immense value is placed on kinship loyalty, culture of trust, community survival, collective well-being, and public good. Even though they aren’t oppressive authoritarian states, the modern Western notion of hyper-individualism wouldn’t make much sense within these close-knit groups. Sacrifice of individual freedom and rights is a given under such social conditions, since individuals are intimately related to one another and physically dependent upon one another. Actually, it wouldn’t likely be experienced as sacrifice at all since it would simply be the normal state of affairs, the shared reality within which they exist.

This got me thinking about psychopathy and modern society. Research has found that, at least in some Western countries, the rate of psychopathy is not just high in prison populations but equally high among the economic and political elite. My father left upper management in a major corporation because of how ruthless the backstabbing was, a win-at-all-costs social Darwinism. This is what defines a country like the United States, as these social dominators are the most revered and emulated individuals. Psychopaths and such, instead of being eliminated or banished, are promoted and empowered.

What occurred to me is that the difference for tribal societies is that hyper-individualism is seen not only as abnormal but as dangerous and so intolerable. Maybe the heavy focus on individualism in the modern West inevitably leads to the psychopathological traits of the Dark Triad. As such, that would mean there is something severely abnormal and dysfunctional about Western societies (WEIRD: Western, Educated, Industrialized, Rich, Democratic). Psychopaths and narcissists, in particular, are the ultimate individualists and so they will be the ultimate winners in an individualistic culture — their relentless confidence and ruthless competitiveness, their Machiavellian manipulations and persuasive charm supporting a narcissistic optimism and leading to success.

There are a couple of ways of looking at this. First off, there might be something about urbanization itself or a correlated factor that exacerbates mental illness. Studies have found, for example, an increase in psychosis across the recent generations of city-dwellers — precisely during the period of populations being further urbanized and concentrated. It makes one think of the study done on crowding large numbers of rats in a small contained cage until they turned anti-social, aggressive, and violent. If these rats were humans, we’d describe this behavior in terms of psychopathy or sociopathy.

There is a second thing to consider, as discussed by Barbara Oakley in her book Evil Genes (pp. 265-6). About rural populations, she writes that, “Psychopathy is rare in those settings, notes psychologist David Cooke, who has studied psychopathy across cultures.” And she continues:

“But what about more urban environments? Cooke’s research has shown, surprisingly, that there are more psychopaths from Scotland in the prisons of England and Wales than there are in Scottish prisons. (Clearly, this is not to say that the Scottish are more given to psychopathy than anyone else.) Studies of migration records showed that many Scottish psychopaths had migrated to the more populated metropolitan areas of the south. Cooke hypothesized that, in the more crowded metropolitan areas, the psychopath could attack or steal with little danger that the victim would recognize or catch him. Additionally, the psychopath’s impulsivity and need for stimulation could also play a role in propelling the move to the dazzling delights of the big city — he would have no affection for family and friends to keep him tethered back home. Densely populated areas, apparently, are the equivalent for psychopaths of ponds and puddles for malarial mosquitoes.”

As Oakley’s book is on genetics, she goes in an unsurprising direction in pointing out how some violent individuals have been able to pass on their genetics to large numbers of descendants. The most famous example being Genghis Khan. She writes that (p. 268),

“These recent discoveries reinforce the findings of the anthropologist Laura Betzig. Her 1986 Despotism and Differential Reproduction provides a cornucopia of evidence documenting the increased capacity of those with more power — and frequently, Machiavellian tendencies — to have offspring. […] As Machiavellian researcher Richard Christie and his colleague Florence Geis aptly note: “[H]igh population density and highly competitive environments have been found to increase the use of antisocial and Machiavellian strategies, and may in fact foster the ability of those who possess those strategies to reproduce.” […] Betzig’s ultimate point is not that the corrupt attain power but that those corrupted individuals who achieved power in preindustrial agricultural societies had far more opportunity to reproduce, generally through polygyny, and pass on their genes. In fact, the more Machiavellian, that is, despotic, a man might be, the more polygynous he tended to be — grabbing and keeping for himself as many beautiful women as he could. Some researchers have posited that envy is itself a useful, possibly genetically linked trait, “serving a key role in survival, motivating achievement, serving the conscience of self and other, and alerting us to inequities that, if fueled, can lead to escalated violence.” Thus, genes related to envy — not to mention other more problematic temperaments — might have gradually found increased prevalence in such environments.”

That kind of genetic hypothesis is highly speculative, to say the least. There could be some truth to such claims, if one wanted to give the benefit of the doubt, but we have no direct evidence that this is the case. At present, these speculations are yet more just-so stories, and they will remain so until we can better control confounding factors in order to directly ascertain causal factors. Anyway, genetic determinism in this simplistic sense is largely moot at this point, as the science is moving on into new understandings. Besides being unhelpful, such speculations are unnecessary. We already have plenty of social science research showing that changing environmental conditions alter social behavior — besides what I’ve already mentioned, there are such examples as the fascinating rat park research. There is no debate to be had about the immense influence of external factors, such as socioeconomic class and high inequality: Power Causes Brain Damage by Justin Renteria, How Wealth Reduces Compassion by Daisy Grewal, Got Money? Then You Might Lack Compassion by Jeffrey Kluger, Why the Rich Don’t Give to Charity by Ken Stern, Rich People Literally See the World Differently by Drake Baer, The rich really DO ignore the poor by Cheyenne Macdonald, Propagandopoly: Monopoly as an Ideological Tool by Naomi Russo, A ‘Rigged’ Game Of Monopoly Reveals How Feeling Wealthy Changes Our Behavior [TED VIDEO] by Planetsave, etc.

Knowing the causes is important. But knowing the consequences is just as important. No matter what increases Dark Triad behaviors, they can have widespread and long-lasting repercussions, maybe even permanently altering how entire societies function. Following her speculations, Oakley gets down to the nitty gritty (p. 270):

“Questions we might reasonably ask are — has the percentage of Machiavellians and other more problematic personality types increased in the human population, or in certain human populations, since the advent of agriculture? And if the answer is yes, does the increase in these less savory types change a group’s culture? In other words, is there a tipping point of Machiavellian and emote control behavior that can subtly or not so subtly affect the way the members of a society interact? Certainly a high expectation of meeting a “cheater,” for example, would profoundly impact the trust that appears to form the grease of modern democratic societies and might make the development of democratic processes in certain areas more difficult. Crudely put, an increase in successfully sinister types from 2 percent, say, to 4 percent of a population would double the pool of Machiavellians vying for power. And it is the people in power who set the emotional tone, perhaps through mirroring and emotional contagion, for their followers and those around them. As Judith Rich Harris points out, higher-status members of a group are looked at more, which means they have more influence on how a person becomes socialized.”

The key factor in much of this seems to be concentration. Simply concentrating populations, humans or rats, leads to social problems related to mental health issues. On top of that, there is the troubling concern of what kind of people are being concentrated and where they are being concentrated — psychopaths being concentrated not only in big cities and prisons but worse still in positions of wealth and power, authority and influence. We live in a society that creates the conditions for the Dark Triad to increase and flourish. This is how the success of those born psychopaths encourages others to follow their example in developing into sociopaths, which in turn makes the Dark Triad mindset into a dominant ethos within mainstream culture.

The main thing on my mind is individualism. It’s been on my mind a lot lately, such as in terms of the bundle theory of the mind and the separate individual, related to my long-term interest in community and the social nature of humans. In relation to individualism, there is the millennia-old cultural divide between Germanic ‘freedom’ and Roman ‘liberty’. But because Anglo-American society mixed up the two, this became incorrectly framed by Isaiah Berlin in terms of positive and negative liberty. In Contemporary Political Theory, J. C. Johari writes that (p. 266), “Despite this all, it may be commented that though Berlin advances the argument that the two aspects of liberty cannot be so distinguished in practical terms, one may differ from him and come to hold that his ultimate preference is for the defence of the negative view of liberty. Hence, he obviously belongs to the category of Mill and Hayek.” He states that this “is evident from his emphatic affirmation” in the following assertion by Berlin:

“The fundamental sense of freedom is freedom from chains, from imprisonment, from enslavement by others. The rest is extension of this sense or else metaphor. To strive to be free is to seek to remove obstacles; to struggle for personal freedom is to seek to curb interference, exploitation, enslavement by men whose ends are theirs, not one’s own. Freedom, at least in its political sense, is coterminous with the absence of bullying or domination.”

Berlin makes a common mistake here. Liberty was defined by not being a slave in a slave-based society, which is what existed in the Roman Empire. But that isn’t freedom, an entirely different term with an etymology related to ‘friend’ and with a meaning that indicated membership in an autonomous community — such freedom meant not being under the oppression of a slave-based society (e.g., German tribes remaining independent of the Roman Empire). Liberty, not freedom, was determined by one’s individual status of lacking oppression in an oppressive social order. This is why liberty has a negative connotation for it is what you lack, rather than what you possess. A homeless man starving alone on the street with no friend in the world to help him and no community to support him, such a man has liberty but not freedom. He is ‘free’ to do what he wants under those oppressive conditions and constraints, as no one is physically detaining him.

This notion of liberty has had a purchase on the American mind because of the history of racial and socioeconomic oppression. After the Civil War, blacks had negative liberty in no longer being slaves but they definitely did not have positive freedom through access to resources and opportunities, instead being shackled by systemic and institutional racism that maintained their exploited status as a permanent underclass. Other populations such as Native Americans faced a similar dilemma. Is one actually free when the chains holding one down are invisible but still all too real? If liberty is an abstraction detached from lived experience and real world results, of what value is such liberty?

This point is made by another critic of Berlin’s perspective. “It is hard for me to see that Berlin is consistent on this point,” writes L. H. Crocker (Positive Liberty, p. 69). “Surely not all alterable human failures to open doors are cases of bullying. After all, it is often through neglect that opportunities fail to be created for the disadvantaged. It is initially more plausible that all failures to open doors are the result of domination in some sense or another.” I can’t help but think that Dark Triad individuals would feel right at home in a culture of liberty where individuals have the ‘freedom’ to oppress and be oppressed. Embodying this sick mentality, Margaret Thatcher once gave perfect voice to the sociopathic worldview — speaking of the victims of disadvantage and desperation, she claimed that, “They’re casting their problem on society. And, you know, there is no such thing as society.” That is to say, there is no freedom.

The question, then, is whether or not we want freedom. A society is only free to the degree that, as a society, freedom is demanded. To deny society itself is an attempt to deny the very basis of freedom, but that is just a trick of rhetoric. A free people know their own freedom by acting freely, even if that means fighting the oppressors who seek to deny that freedom. Thatcher intentionally conflated society and government, something never heard in the clear-eyed wisdom of a revolutionary social democrat like Thomas Paine: “Society in every state is a blessing, but government, even in its best stage, is but a necessary evil; in its worst state an intolerable one.” These words expressed the values of negative liberty as they made perfect sense for someone living in an empire built on colonialism, corporatism, and slavery. But the same words gave hint to a cultural memory of Germanic positive freedom. It wasn’t a principled libertarian hatred of governance but rather a principled radical protest against a sociopathic social order. As Paine made clear, this unhappy situation was neither inevitable nor desirable, much less tolerable.

The Inuits would find a way for psychopaths to ‘accidentally’ fall off the ice, never to trouble the community again. As for the American revolutionaries, they preferred more overt methods, from tar and feathering to armed revolt. So, now to regain our freedom as a people, what recourse do we have in abolishing the present Dark Triad domination?

* * *

Here are some blog posts on individualism and community, as contrasted between far different societies. In these writings, I explore issues of mental health (from depression to addiction), and social problems (from authoritarianism to capitalist realism) — as well as other topics, including carnival and revolution.

Self, Other, & World

Retrieving the Lost Worlds of the Past:
The Case for an Ontological Turn
by Greg Anderson

“[…] This ontological individualism would have been scarcely intelligible to, say, the inhabitants of precolonial Bali or Hawai’i, where the divine king or chief, the visible incarnation of the god Lono, was “the condition of possibility of the community,” and thus “encompasse[d] the people in his own person, as a projection of his own being,” such that his subjects were all “particular instances of the chief’s existence.” 12 It would have been barely imaginable, for that matter, in the world of medieval Europe, where conventional wisdom proverbially figured sovereign and subjects as the head and limbs of a single, primordial “body politic” or corpus mysticum. 13 And the idea of a natural, presocial individual would be wholly confounding to, say, traditional Hindus and the Hagen people of Papua New Guinea, who objectify all persons as permeable, partible “dividuals” or “social microcosms,” as provisional embodiments of all the actions, gifts, and accomplishments of others that have made their lives possible.1

“We alone in the modern capitalist west, it seems, regard individuality as the true, primordial estate of the human person. We alone believe that humans are always already unitary, integrated selves, all born with a natural, presocial disposition to pursue a rationally calculated self-interest and act competitively upon our no less natural, no less presocial rights to life, liberty, and private property. We alone are thus inclined to see forms of sociality, like relations of kinship, nationality, ritual, class, and so forth, as somehow contingent, exogenous phenomena, not as essential constituents of our very subjectivity, of who or what we really are as beings. And we alone believe that social being exists to serve individual being, rather than the other way round. Because we alone imagine that individual humans are free-standing units in the first place, “unsocially sociable” beings who ontologically precede whatever “society” our self-interest prompts us to form at any given time.”

What Kinship Is-And Is Not
by Marshall Sahlins, p. 2

“In brief, the idea of kinship in question is “mutuality of being”: people who are intrinsic to one another’s existence— thus “mutual person(s),” “life itself,” “intersubjective belonging,” “transbodily being,” and the like. I argue that “mutuality of being” will cover the variety of ethnographically documented ways that kinship is locally constituted, whether by procreation, social construction, or some combination of these. Moreover, it will apply equally to interpersonal kinship relations, whether “consanguineal” or “affinal,” as well as to group arrangements of descent. Finally, “mutuality of being” will logically motivate certain otherwise enigmatic effects of kinship bonds— of the kind often called “mystical”— whereby what one person does or suffers also happens to others. Like the biblical sins of the father that descend on the sons, where being is mutual, there experience is more than individual.”

Music and Dance on the Mind

We aren’t as different from ancient humanity as it might seem. Our societies have changed drastically, suppressing old urges and potentialities. Yet the same basic human nature still lurks within us, hidden in the underbrush along the well trod paths of the mind. The hive mind is what the human species naturally falls back upon, from millennia of collective habit. The problem we face is we’ve lost the ability to express well our natural predisposition toward group-mindedness, too easily getting locked into groupthink, a tendency easily manipulated.

Considering this, we have good reason to be wary, not knowing what we could tap into. We don’t understand our own minds and so we naively underestimate the power of humanity’s social nature. With the right conditions, hiving is easy to elicit but hard to control or shut down. The danger is that the more we idolize individuality the more prone we become to what is so far beyond the individual. It is the glare of hyper-individualism that casts the shadow of authoritarianism.

Pacifiers, Individualism & Enculturation

I’ve often thought that individualism, in particular hyper-individualism, isn’t the natural state of human nature. By this, I mean that it isn’t how human nature manifested for the hundreds of thousands of years prior to modern Western civilization. Julian Jaynes theorizes that, even in early Western civilization, humans didn’t have a clear sense of separate individuality. He points out that in the earliest literature humans were all the time hearing voices outside of themselves (giving them advice, telling them what to do, making declarations, chastising them, etc.), maybe not unlike the way we hear a voice in our own heads.

We moderns have internalized those external voices of collective culture. This seems normal to us. This is not just about pacifiers. It’s about technology in general. The most profound technology ever invented was written text (along with the binding of books and the printing press). All the time I see my little niece absorbed in a book, even though she can’t yet read. Like pacifiers, books are tools of enculturation that help create the individual self. Instead of mommy’s nipple, the baby soothes themselves. Instead of voices in the world, the child becomes focused on text. In both cases, it is a process of internalizing.

All modern civilization is built on this process of individualization. I don’t know if it is overall good or bad. I’m sure much of our destructive tendencies are caused by the relationship between individualization and objectification. Nature as a living world that could speak to us has become mere matter without mind or soul. So, the cost of this process has been high… but then again, the innovative creativeness has exploded as this individualizing process has increasingly taken hold in recent centuries.

“illusion of a completed, unitary self”

The Voices Within: The History and Science of How We Talk to Ourselves
by Charles Fernyhough, Kindle Locations 3337-3342

“And we are all fragmented. There is no unitary self. We are all in pieces, struggling to create the illusion of a coherent “me” from moment to moment. We are all more or less dissociated. Our selves are constantly constructed and reconstructed in ways that often work well, but often break down. Stuff happens, and the center cannot hold. Some of us have more fragmentation going on, because of those things that have happened; those people face a tougher challenge of pulling it all together. But no one ever slots in the last piece and makes it whole. As human beings, we seem to want that illusion of a completed, unitary self, but getting there is hard work. And anyway, we never get there.”

Delirium of Hyper-Individualism

Individualism is a strange thing. For anyone who has spent much time meditating, it’s obvious that there is no there there. It slips through one’s grasp like an ancient philosopher trying to study aether. The individual self is the modernization of the soul. Like the ghost in the machine and the god in the gaps, it is a theological belief defined by its absence in the world. It’s a social construct, a statement that is easily misunderstood.

In modern society, individualism has been raised up to an entire ideological worldview. It is all-encompassing, having infiltrated nearly every aspect of our social lives and become internalized as a cognitive frame. Traditional societies didn’t have this obsession with an idealized self as isolated and autonomous. Go back far enough and the records seem to show societies that didn’t even have a concept, much less an experience, of individuality.

Yet for all its dominance, the ideology of individualism is superficial. It doesn’t explain much of our social order and personal behavior. We don’t act as if we actually believe in it. It’s a convenient fiction that we so easily disregard when inconvenient, as if it isn’t all that important after all. In our most direct experience, individuality simply makes no sense. We are social creatures through and through. We don’t know how to be anything else, no matter what stories we tell ourselves.

The ultimate value of this individualistic ideology is, ironically, as social control and social justification.

It’s All Your Fault, You Fat Loser!

Capitalist Realism: Is there no alternative?
By Mark Fisher, pp. 18-20

“[…] In what follows, I want to stress two other aporias in capitalist realism, which are not yet politicized to anything like the same degree. The first is mental health. Mental health, in fact, is a paradigm case of how capitalist realism operates. Capitalist realism insists on treating mental health as if it were a natural fact, like weather (but, then again, weather is no longer a natural fact so much as a political-economic effect). In the 1960s and 1970s, radical theory and politics (Laing, Foucault, Deleuze and Guattari, etc.) coalesced around extreme mental conditions such as schizophrenia, arguing, for instance, that madness was not a natural, but a political, category. But what is needed now is a politicization of much more common disorders. Indeed, it is their very commonness which is the issue: in Britain, depression is now the condition that is most treated by the NHS . In his book The Selfish Capitalist, Oliver James has convincingly posited a correlation between rising rates of mental distress and the neoliberal mode of capitalism practiced in countries like Britain, the USA and Australia. In line with James’s claims, I want to argue that it is necessary to reframe the growing problem of stress (and distress) in capitalist societies. Instead of treating it as incumbent on individuals to resolve their own psychological distress, instead, that is, of accepting the vast privatization of stress that has taken place over the last thirty years, we need to ask: how has it become acceptable that so many people, and especially so many young people, are ill? The ‘mental health plague’ in capitalist societies would suggest that, instead of being the only social system that works, capitalism is inherently dysfunctional, and that the cost of it appearing to work is very high.”

There is always an individual to blame. It sucks to be an individual these days, I tell ya. I should know because I’m one of those faulty miserable individuals. I’ve been one my whole life. If it weren’t for all of us pathetic and depraved individuals, capitalism would be utopia. I beat myself up all the time for failing the great dream of capitalism. Maybe I need to buy more stuff.

“The other phenomenon I want to highlight is bureaucracy. In making their case against socialism, neoliberal ideologues often excoriated the top-down bureaucracy which supposedly led to institutional sclerosis and inefficiency in command economies. With the triumph of neoliberalism, bureaucracy was supposed to have been made obsolete; a relic of an unlamented Stalinist past. Yet this is at odds with the experiences of most people working and living in late capitalism, for whom bureaucracy remains very much a part of everyday life. Instead of disappearing, bureaucracy has changed its form; and this new, decentralized, form has allowed it to proliferate. The persistence of bureaucracy in late capitalism does not in itself indicate that capitalism does not work – rather, what it suggests is that the way in which capitalism does actually work is very different from the picture presented by capitalist realism.”

Neoliberalism: Dream & Reality

in the book Capitalist Realism by Mark Fisher (p. 20):

“[…] But incoherence at the level of what Brown calls ‘political rationality’ does nothing to prevent symbiosis at the level of political subjectivity, and, although they proceeded from very different guiding assumptions, Brown argues that neoliberalism and neoconservatism worked together to undermine the public sphere and democracy, producing a governed citizen who looks to find solutions in products, not political processes. As Brown claims,

“the choosing subject and the governed subject are far from opposites … Frankfurt school intellectuals and, before them, Plato theorized the open compatibility between individual choice and political domination, and depicted democratic subjects who are available to political tyranny or authoritarianism precisely because they are absorbed in a province of choice and need-satisfaction that they mistake for freedom.”

“Extrapolating a little from Brown’s arguments, we might hypothesize that what held the bizarre synthesis of neoconservatism and neoliberalism together was their shared objects of abomination: the so called Nanny State and its dependents. Despite evincing an anti-statist rhetoric, neoliberalism is in practice not opposed to the state per se – as the bank bail-outs of 2008 demonstrated – but rather to particular uses of state funds; meanwhile, neoconservatism’s strong state was confined to military and police functions, and defined itself against a welfare state held to undermine individual moral responsibility.”

[…] what Robin describes touches upon my recent post about the morality-punishment link. As I pointed out, the world of Star Trek: Next Generation imagines the possibility of a social order that serves humans, instead of the other way around. I concluded that, “Liberals seek to promote freedom, not just freedom to act but freedom from being punished for acting freely. Without punishment, though, the conservative sees the world lose all meaning and society to lose all order.” The neoliberal vision subordinates the individual to the moral order. The purpose of forcing the individual into a permanent state of anxiety and fear is to preoccupy their minds and their time, to redirect all the resources of the individual back into the system itself. The emphasis on the individual isn’t because individualism is important as a central ideal but because the individual is the weak point that must be carefully managed. Also, focusing on the individual deflects our gaze from the structure and its attendant problems.

This brings me to how this relates to corporations in neoliberalism (Fisher, pp. 69-70):

“For this reason, it is a mistake to rush to impose the individual ethical responsibility that the corporate structure deflects. This is the temptation of the ethical which, as Žižek has argued, the capitalist system is using in order to protect itself in the wake of the credit crisis – the blame will be put on supposedly pathological individuals, those ‘abusing the system’, rather than on the system itself. But the evasion is actually a two step procedure – since structure will often be invoked (either implicitly or openly) precisely at the point when there is the possibility of individuals who belong to the corporate structure being punished. At this point, suddenly, the causes of abuse or atrocity are so systemic, so diffuse, that no individual can be held responsible. This was what happened with the Hillsborough football disaster, the Jean Charles De Menezes farce and so many other cases. But this impasse – it is only individuals that can be held ethically responsible for actions, and yet the cause of these abuses and errors is corporate, systemic – is not only a dissimulation: it precisely indicates what is lacking in capitalism. What agencies are capable of regulating and controlling impersonal structures? How is it possible to chastise a corporate structure? Yes, corporations can legally be treated as individuals – but the problem is that corporations, whilst certainly entities, are not like individual humans, and any analogy between punishing corporations and punishing individuals will therefore necessarily be poor. And it is not as if corporations are the deep-level agents behind everything; they are themselves constrained by/ expressions of the ultimate cause-that-is-not-a-subject: Capital.”

Sleepwalking Through Our Dreams

The modern self is not normal, by historical and evolutionary standards. Extremely unnatural and unhealthy conditions have developed, our minds having correspondingly grown malformed like the binding of feet. Our hyper-individuality is built on disconnection and, in place of human connection, we take on various addictions, not just to drugs and alcohol but also to work, consumerism, entertainment, social media, and on and on. The more we cling to an unchanging sense of bounded self, the more burdened we become trying to hold it all together, hunched over with the load we carry on our shoulders. We are possessed by the identities we possess.

This addiction angle interests me. Our addiction is the result of our isolated selves. Yet even as our addiction attempts to fill emptiness, to reach out beyond ourselves toward something, anything, a compulsive relationship devoid of the human, we isolate ourselves further. As Johann Hari explained in Chasing the Scream (Kindle Locations 3521-3544):

There were three questions I had never understood. Why did the drug war begin when it did, in the early twentieth century? Why were people so receptive to Harry Anslinger’s message? And once it was clear that it was having the opposite effect to the one that was intended— that it was increasing addiction and supercharging crime— why was it intensified, rather than abandoned?

I think Bruce Alexander’s breakthrough may hold the answer.

“Human beings only become addicted when they cannot find anything better to live for and when they desperately need to fill the emptiness that threatens to destroy them,” Bruce explained in a lecture in London31 in 2011. “The need to fill an inner void is not limited to people who become drug addicts, but afflicts the vast majority of people of the late modern era, to a greater or lesser degree.”

A sense of dislocation has been spreading through our societies like a bone cancer throughout the twentieth century. We all feel it: we have become richer, but less connected to one another. Countless studies prove this is more than a hunch, but here’s just one: the average number of close friends a person has has been steadily falling. We are increasingly alone, so we are increasingly addicted. “We’re talking about learning to live with the modern age,” Bruce believes. The modern world has many incredible benefits, but it also brings with it a source of deep stress that is unique: dislocation. “Being atomized and fragmented and all on [your] own— that’s no part of human evolution and it’s no part of the evolution of any society,” he told me.

And then there is another kicker. At the same time that our bonds with one another have been withering, we are told— incessantly, all day, every day, by a vast advertising-shopping machine— to invest our hopes and dreams in a very different direction: buying and consuming objects. Gabor tells me: “The whole economy is based around appealing to and heightening every false need and desire, for the purpose of selling products. So people are always trying to find satisfaction and fulfillment in products.” This is a key reason why, he says, “we live in a highly addicted society.” We have separated from one another and turned instead to things for happiness— but things can only ever offer us the thinnest of satisfactions.

This is where the drug war comes in. These processes began in the early twentieth century— and the drug war followed soon after. The drug war wasn’t just driven, then, by a race panic. It was driven by an addiction panic— and it had a real cause. But the cause wasn’t a growth in drugs. It was a growth in dislocation.

The drug war began when it did because we were afraid of our own addictive impulses, rising all around us because we were so alone. So, like an evangelical preacher who rages against gays because he is afraid of his own desire to have sex with men, are we raging against addicts because we are afraid of our own growing vulnerability to addiction?

In The Secret Life of Puppets, Victoria Nelson makes some useful observations of reading addiction, specifically in terms of formulaic genres. She discusses Sigmund Freud’s repetition compulsion and Lenore Terr’s post-traumatic games. She sees genre reading as a ritual-like enactment that can’t lead to resolution, and so the addictive behavior becomes entrenched. This would apply to many other forms of entertainment and consumption. And it fits into Derrick Jensen’s discussion of abuse, trauma, and the victimization cycle.

I would broaden her argument in another way. People have feared the written text ever since it was invented. In the 18th century, there took hold a moral panic about reading addiction in general and that was before any fiction genres had developed (Frank Furedi, The Media’s First Moral Panic). The written word is unchanging and so creates the conditions for repetition compulsion. Every time a text is read, it is the exact same text.

That is far different from oral societies. And it is quite telling that oral societies have a much more fluid sense of self. The Piraha, for example, don’t cling to their sense of self nor that of others. When a Piraha individual is possessed by a spirit or meets a spirit who gives them a new name, the self that was there is no longer there. When asked where is that person, the Piraha will say that he or she isn’t there, even if the same body of the individual is standing right there in front of them. They also don’t have a storytelling tradition or concern for the past.

Another thing that the Piraha apparently lack is mental illness, specifically depression along with suicidal tendencies. According to Barbara Ehrenreich from Dancing in the Streets, there wasn’t much written about depression even in the Western world until the suppression of religious and public festivities, such as Carnival. One of the most important aspects of Carnival and similar festivities was the masking, shifting, and reversal of social identities. Along with this, there was the losing of individuality within the group. And during the Middle Ages, an amazing number of days in the year were dedicated to communal celebrations. The ending of this era coincided with numerous societal changes, including the increase of literacy with the spread of the movable type printing press.

Another thing happened with the suppression of festivities. Local community began to break down as power became centralized in far-off places and the classes became divided, which Ehrenreich details. The aristocracy used to be inseparable from their feudal roles and this meant participating in local festivities where, as part of the celebration, a king might wrestle with a blacksmith. As the divides between people grew into vast chasms, the social identities held and social roles played became hardened into place. This went along with a growing inequality of wealth and power. And as research has shown, wherever there is inequality there are also high rates of social problems and mental health issues.

It’s maybe unsurprising that what followed from this was colonial imperialism and a racialized social order, class conflict and revolution. A society formed that was simultaneously rigid in certain ways and destabilized in others. Individuals became increasingly atomized and isolated. With the loss of kinship and community, the cheap replacement we got is identity politics. The natural human bonds are lost or constrained. Social relations are narrowed down. Correspondingly, our imaginations are hobbled and we can’t envision society being any other way. Most tragic, we forget that human society used to be far different, a collective amnesia forcing us into a collective trance. Our entire sense of reality is held in the vice grip of the historical moment we find ourselves in.

Social Conditions of an Individual’s Condition

A wide variety of research and data points to a basic conclusion. Environmental conditions (physical, social, political, and economic) are of paramount importance. So, why do we treat as sick individuals those who suffer the consequences of the externalized costs of society?

Here is the sticking point. Systemic and collective problems in some ways are the easiest to deal with. The problems, once understood, are essentially simple and their solutions tend to be straightforward. Even so, the very largeness of these problems makes them hard for us to confront. We want someone to blame. But who do we blame when the entire society is dysfunctional?

If we recognize the problems as symptoms, we are forced to acknowledge our collective agency and shared fate. Those who do understand this are up against countervailing forces that maintain the status quo. Even if a psychiatrist realizes that their patient is experiencing the symptoms of larger social issues, how is that psychiatrist supposed to help the patient? Who is going to diagnose the entire society and demand it seek rehabilitation?

Winter Season and Holiday Spirit

With this revelry and reversal come licentiousness and transgression, drunkenness and bawdiness, fun and games, song and dance, feasting and festival. It is a time for celebration of this year’s harvest and blessing of next year’s harvest. Bounty and community. Death and rebirth. The old year must be brought to a close and the new year welcomed. This is the period when gods, ancestors, spirits, and demons must be solicited, honored, appeased, or driven out. The noise of song, gunfire, and such serves many purposes.

In the heart of winter, some of the most important religious events took place. This includes Christmas, of course, but also the various celebrations around the same time. A particular winter festival season that began on All Hallows Eve (i.e., Halloween) ended with the Twelfth Night. This included carnival-like revelry and a Lord of Misrule. There was also the tradition of going house to house, of singing and pranks, of demanding treats/gifts and threats if they weren’t forthcoming. It was a time of community and sharing, and those who didn’t willingly participate might be punished. Winter, a harsh time of need, was when the group took precedence. […]

I’m also reminded of Santa Claus as St. Nick. This invokes an image of jollity and generosity. And this connects to wintertime as a period of community needs and interdependence, of sharing and gifting, of hospitality and kindness. This includes the enforcement of social norms, which easily could transform into the challenging of social norms.

It’s maybe in this context we should think of the masked vigilantes participating in the Boston Tea Party. Like carnival, there had developed a tradition of politics out-of-doors, often occurring on the town commons. And on those town commons, large trees became identified as liberty trees — under which people gathered, upon which notices were nailed, and sometimes where effigies were hung. This was an old tradition that originated in Northern Europe, where a tree was the center of a community, the place of law-giving and community decision-making. In Europe, the commons had become the place of festivals and celebrations, such as carnival. And so the commons came to be the site of revolutionary fervor as well.

The most famous Liberty Tree was a great elm near the Boston common. It was there that many consider the birth of the American Revolution, as it was the site of early acts of defiance. This is where the Sons of Liberty met, organized, and protested. This would eventually lead to that even greater act of defiance on Saturnalia eve, the Boston Tea Party. One of the participants in the Boston Tea Party and later in the Revolutionary War, Samuel Sprague, is buried in the Boston Common.

There is something many don’t understand about the American Revolution. It wasn’t so much a fight against oppression in general and certainly not about mere taxation in particular. What angered those Bostonians and many other colonists was that they had become accustomed to community-centered self-governance and this was being challenged. The tea tax wasn’t just an imposition of imperial power but also of colonial corporatism. The East India Company was not acting as a moral member of the community, as it took advantage by monopolizing trade. Winter had long been the time of year when bad actors in the community would be punished. Selfishness was not to be tolerated.

Those Boston Tea Partiers were simply teaching a lesson about the Christmas spirit. And in the festival tradition, they chose the guise of Native Americans, which to their minds would have symbolized freedom and an inversion of power. What revolution meant to them was a demand for the return of what was taken from them, making the world right again. It was revelry with a purpose.

Wordplay Schmordplay

What Do You Call Words Like Wishy-Washy or Mumbo Jumbo?

Words like wishy-washy or mumbo-jumbo, or any words that contain two identical or similar parts (a segment, syllable, or morpheme), are called reduplicative words or tautonyms. The process of forming such words is known as reduplication. In many cases, the first word is a real word, while the second part (sometimes nonsensical) is invented to create a rhyme and add emphasis. Most reduplicatives begin as hyphenated words and, through very common usage, eventually lose the hyphen to become single words. Regardless of their hyphenation, they underscore the playfulness of the English language.

Reduplication isn’t just jibber-jabber

There are several kinds of reduplication. One type replaces a vowel while keeping the initial consonant, as in “flip-flop,” “pish-posh,” and “ping-pong.” Another type keeps the vowel but replaces that first sound, as in “namby-pamby,” “hanky-panky,” “razzle-dazzle,” and “timey-wimey,” a word used by Dr. Who fans for time-travel shenanigans. Reduplication doesn’t get any simpler than when the whole word is repeated, like when you pooh-pooh a couple’s attempt to dress matchy-matchy. My favorite type is “schm” reduplication, though some might say “Favorite, schmavorite!” All the types show that redundancy isn’t a problem in word-making. Grant Barrett, host of the public radio show “A Way with Words,” notes via e-mail that even the word “reduplication” has an unnecessary frill: “I’ve always liked the ‘re’ in ‘reduplicate.’ We’re doing it again! It’s right there in the word!”

Reduplication

Reduplication in linguistics is a morphological process in which the root or stem of a word (or part of it) or even the whole word is repeated exactly or with a slight change.

Reduplication is used in inflections to convey a grammatical function, such as plurality, intensification, etc., and in lexical derivation to create new words. It is often used when a speaker adopts a tone more “expressive” or figurative than ordinary speech and is also often, but not exclusively, iconic in meaning. Reduplication is found in a wide range of languages and language groups, though its level of linguistic productivity varies.

Reduplication is the standard term for this phenomenon in the linguistics literature. Other terms that are occasionally used include cloning, doubling, duplication, repetition, and tautonym when it is used in biological taxonomies, such as “Bison bison”.

The origin of this usage of tautonym is uncertain, but it has been suggested that it is of relatively recent derivation.

Reduplication

The coinage of new words and phrases into English has been greatly enhanced by the pleasure we get from playing with words. There are numerous alliterative and rhyming idioms, which are a significant feature of the language. These aren’t restricted to poets and Cockneys; everyone uses them. We start in the nursery with choo-choos, move on in adult life to hanky-panky and end up in the nursing home having a sing-song.

The repeating of parts of words to make new forms is called reduplication. There are various categories of this: rhyming, exact, and ablaut (vowel substitution). Examples are, respectively, okey-dokey, wee-wee, and zig-zag. The impetus for the coining of these seems to be nothing more than the enjoyment of wordplay. The words that make up these reduplicated idioms often have little meaning in themselves and only appear as part of a pair. In other cases, one word will allude to some existing meaning and the other half of the pair is added for effect or emphasis.

New coinages have often appeared at times of national confidence, when an outgoing and playful nature is expressed in language; for example, during the 1920s, following the First World War, when many nonsense word pairs were coined – the bee’s knees, heebie-jeebies etc. That said, the introduction of such terms began with Old English and continues today. Willy-nilly is over a thousand years old. Riff-raff dates from the 1400s and helter-skelter, arsy-versy (a form of vice-versa), and hocus-pocus all date from the 16th century. Coming up to date we have bling-bling, boob-tube and hip-hop. I’ve not yet recorded a 21st century reduplication. Bling-bling comes very close but is 20th century. ‘Bieber Fever’ is certainly 21st century, but isn’t quite a reduplication.

A hotchpotch of reduplication

Argy-bargy and lovey-dovey lie on opposite ends of the interpersonal scale, but they have something obvious in common: both are reduplicatives.

Reduplication is when a word or part of a word is repeated, sometimes modified, and added to make a longer term, such as aye-aye, mishmash, and hotchpotch. This process can mark plurality or intensify meaning, and it can be used for effect or to generate new words. The added part may be invented or it may be an existing word whose form and sense are a suitable fit.

Reduplicatives emerge early in our language-learning lives. As infants in the babbling phase we reduplicate syllables to utter mama, dada, nana and papa, which is where these pet names come from. Later we use moo-moo, choo-choo, wee-wee and bow-wow (or similar) to refer to familiar things. The repetition, as well as being fun, might help children develop and practise the pronunciation of sounds.

As childhood progresses, reduplicatives remain popular, popping up in children’s books, songs and rhymes. Many characters in children’s stories have reduplicated names: Humpty Dumpty, Chicken Licken and Handy Andy, to name a few.

The language rule we know – but don’t know we know

Ding dong King Kong

Well, in fact, the Big Bad Wolf is just obeying another great linguistic law that every native English speaker knows, but doesn’t know that they know. And it’s the same reason that you’ve never listened to hop-hip music.

You are utterly familiar with the rule of ablaut reduplication. You’ve been using it all your life. It’s just that you’ve never heard of it. But if somebody said the words zag-zig, or cross-criss, you would know, deep down in your loins, that they were breaking a sacred rule of language. You just wouldn’t know which one.

All four of a horse’s feet make exactly the same sound. But we always, always say clip-clop, never clop-clip. Every second your watch (or the grandfather clock in the hall) makes the same sound, but we say tick-tock, never tock-tick. You will never eat a Kat Kit bar. The bells in Frère Jacques will forever chime ‘ding dang dong’.

Reduplication in linguistics is when you repeat a word, sometimes with an altered consonant (lovey-dovey, fuddy-duddy, nitty-gritty), and sometimes with an altered vowel: bish-bash-bosh, ding-dang-dong. If there are three words then the order has to go I, A, O. If there are two words then the first is I and the second is either A or O. Mish-mash, chit-chat, dilly-dally, shilly-shally, tip top, hip-hop, flip-flop, tic tac, sing song, ding dong, King Kong, ping pong.

Why this should be is a subject of endless debate among linguists; it might be to do with the movement of your tongue or an ancient language of the Caucasus. It doesn’t matter. It’s the law, and, as with the adjectives, you knew it even if you didn’t know you knew it. And the law is so important that you just can’t have a Bad Big Wolf.

Jibber Jabber: The Unwritten Ablaut Reduplication Rule

In all these ablaut reduplication word pairs, the key vowels appear in a specific order: either i before a, or i before o.

In linguistic terms, you could say that a high vowel comes before a low vowel. The i sound is considered a high vowel because of the location of the tongue relative to the mouth in American speech. The a and o sounds are low vowels.

See-saw doesn’t use the letter i, but the high-vowel-before-low-vowel pattern still applies.
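To make that pattern concrete, here is a minimal sketch of my own in Python (not taken from any of the pieces quoted here) that checks whether a hyphenated reduplicative keeps its key vowels in the high-before-low order just described, i before a before o. The vowel ranking and the helper names are illustrative assumptions, and it only tests the ordering rather than the full two-word rule, so treat it as a toy rather than a piece of linguistics software.

# Rank the key vowels from high to low; a lower rank must come earlier in the pair.
VOWEL_RANK = {"i": 0, "a": 1, "o": 2}

def key_vowel(word):
    """Return the first ranked vowel in the word, or None if there is none."""
    for letter in word.lower():
        if letter in VOWEL_RANK:
            return letter
    return None

def follows_ablaut_order(phrase):
    """True if the parts' key vowels never fall back in rank (i before a before o)."""
    vowels = (key_vowel(part) for part in phrase.split("-"))
    ranks = [VOWEL_RANK[v] for v in vowels if v is not None]
    return all(earlier <= later for earlier, later in zip(ranks, ranks[1:]))

for example in ("zig-zag", "zag-zig", "tick-tock", "bish-bash-bosh", "flip-flop"):
    print(example, follows_ablaut_order(example))

Run as written, this prints True for zig-zag, tick-tock, bish-bash-bosh and flip-flop, and False for zag-zig.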

This Weird Grammar Rule is Why We Say “Flip Flop” Instead of “Flop Flip”

As to why this I-A-O pattern has such a firm hold in our linguistic history, nobody can say. Forsyth calls it a topic of “endless debate” among linguists that may originate in the arcane movements of the human tongue or an ancient language of the Caucasus. Whatever the case, the world’s English speakers are on board, and you will never catch Lucy accusing Charlie Brown of being washy-wishy.

Reduplicative Words

Ricochet Word

wishy-washy, hanky panky – name for this type of word-formation?

argle-bargle

Easy-Peasy

Double Trouble

English Rhyming Compound Words

Rhyming Compounds

Reduplicates

REDUPLICATION

English gitaigo: Flip-Flop Words

Arete: History and Etymology

Arete (moral virtue)
Wikipedia

Arete (Greek: ἀρετή), in its basic sense, means “excellence of any kind”.[1] The term may also mean “moral virtue”.[1] In its earliest appearance in Greek, this notion of excellence was ultimately bound up with the notion of the fulfillment of purpose or function: the act of living up to one’s full potential.

The term from Homeric times onwards is not gender specific. Homer applies the term to both the Greek and Trojan heroes as well as major female figures, such as Penelope, the wife of the Greek hero Odysseus. In the Homeric poems, Arete is frequently associated with bravery, but more often with effectiveness. The man or woman of Arete is a person of the highest effectiveness; they use all their faculties—strength, bravery and wit—to achieve real results. In the Homeric world, then, Arete involves all of the abilities and potentialities available to humans.

In some contexts, Arete is explicitly linked with human knowledge, where the expressions “virtue is knowledge” and “Arete is knowledge” are used interchangeably. The highest human potential is knowledge and all other human abilities are derived from this central capacity. If Arete is knowledge and study, the highest human knowledge is knowledge about knowledge itself; in this light, the theoretical study of human knowledge, which Aristotle called “contemplation”, is the highest human ability and happiness.[2]

History

The Ancient Greeks applied the term to anything: for example, the excellence of a chimney, the excellence of a bull to be bred and the excellence of a man. The meaning of the word changes depending on what it describes, since everything has its own peculiar excellence; the arete of a man is different from the arete of a horse. This way of thinking comes first from Plato, where it can be seen in the Allegory of the Cave.[3] In particular, the aristocratic class was presumed, essentially by definition, to be exemplary of arete: “The root of the word is the same as aristos, the word which shows superlative ability and superiority, and aristos was constantly used in the plural to denote the nobility.”[4]

By the 5th and 4th centuries BC, arete as applied to men had developed to include quieter virtues, such as dikaiosyne (justice) and sophrosyne (self-restraint). Plato attempted to produce a moral philosophy that incorporated this new usage,[5] but it was in the work of Aristotle that the doctrine of arete found its fullest flowering. Aristotle’s Doctrine of the Mean is a paradigm example of his thinking.

Arete has also been used by Plato when talking about athletic training and also the education of young boys. Stephen G. Miller delves into this usage in his book “Ancient Greek Athletics”. Aristotle is quoted as deliberating between education towards arete “…or those that are theoretical”.[6] Educating towards arete in this sense means that the boy would be educated towards things that are useful in life. However, even Plato himself says that arete is not something that can be agreed upon. He says, “Nor is there even an agreement about what constitutes arete, something that leads logically to a disagreement about the appropriate training for arete.”[7] To say that arete has a common definition of excellence or fulfillment may be an overstatement simply because it was very difficult to pinpoint arete, much less the proper ways to go about obtaining it. […]

Homer

In Homer’s Iliad and Odyssey, “arete” is used mainly to describe heroes and nobles and their mobile dexterity, with special reference to strength and courage, but it is not limited to this. Penelope’s arete, for example, relates to co-operation, for which she is praised by Agamemnon. The excellence of the gods generally included their power, but, in the Odyssey (13.42), the gods can grant excellence to a life, which is contextually understood to mean prosperity. Arete was also the name of King Alcinous’s wife.

According to Bernard Knox’s notes found in the Robert Fagles translation of The Odyssey, “arete” is also associated with the Greek word for “pray”, araomai.[8]

All Things Shining
by Hubert Dreyfus
pp. 61-63

Homer’s epic poems brought into focus a notion of arete, or excellence in life, that was at the center of the Greek understanding of human being.6 Many admirers of Greek culture have attempted to define this notion, but success here requires avoiding two prominent temptations. There is the temptation to patronize that we have already mentioned. But there is also a temptation to read a modern sensibility into Homer’s time. One standard translation of the Greek word arete as “virtue” runs the risk of this kind of retroactive reading: for any attempt to interpret the Homeric Greek notion of human excellence in terms of “virtue”—especially if one hears in this word its typical Christian or even Roman overtones—is bound to go astray. Excellence in the Greek sense involves neither the Christian notion of humility and love nor the Roman ideal of stoic adherence to one’s duty.7 Instead, excellence in the Homeric world depends crucially on one’s sense of gratitude and wonder.

Nietzsche was one of the first to understand that Homeric excellence bears little resemblance to modern moral agency. His view was that the Homeric world understood nobility in terms of the overpowering strength of noble warriors. The effect of the ensuing Judeo-Christian tradition, on this Nietzschean reading, was to enfeeble the Homeric understanding of excellence by substituting the meekness of the lamb for the strength and power of the noble warrior.8

Nietzsche was certainly right that the Homeric tradition valorizes the strong, noble hero; and he was right, too, that in some important sense the Homeric account of excellence is foreign to our basic moralizing assumptions. But there is something that the Nietzschean account leaves out. As Bernard Knox emphasizes, the Greek word arete is etymologically related to the Greek verb “to pray” (araomai).9 It follows that Homer’s basic account of human excellence involves the necessity of being in an appropriate relationship to whatever is understood to be sacred in the culture. Helen’s greatness, on this interpretation, is not properly measured in terms of the degree to which she is morally responsible for her actions.

What makes Helen great in Homer’s world is her ability to live a life that is constantly responsive to golden Aphrodite, the shining example of the sacred erotic dimension of existence. Likewise, Achilles had a special kind of receptivity to Ares and his warlike way of life; Odysseus had Athena, with her wisdom and cultural adaptability, to look out for him. Presumably, the master craftsmen of Homer’s world worked in the light of Hephaestus’s shining. In order to engage with this understanding of human excellence, we will have to think clearly about how the Homeric Greeks understood themselves. Why would it make sense to describe their lives in relation to the presence and absence of the gods?

Several questions focus this kind of approach. What is the phenomenon that Homer is responding to when he says that a god intervened or in some way took part in an action or event? Is this phenomenon recognizable to us, even if only marginally? And if Homer’s reference to the gods is something other than an attempt to pass off moral responsibility for one’s actions, then what exactly is it? Only by facing these questions head on can we understand whether it is possible—or desirable—to lure back Homer’s polytheistic gods.

The gods are essential to the Homeric Greek understanding of what it is to be a human being at all. As Peisistratus—the son of wise old Nestor—says toward the beginning of the Odyssey, “All men need the gods.”10 The Greeks were deeply aware of the ways in which our successes and our failures—indeed, our very actions themselves—are never completely under our control. They were constantly sensitive to, amazed by, and grateful for those actions that one cannot perform on one’s own simply by trying harder: going to sleep, waking up, fitting in, standing out, gathering crowds together, holding their attention with a speech, changing their mood, or indeed being filled with longing, desire, courage, wisdom, and so on. Homer sees each of these achievements as a particular god’s gift. To say that all men need the gods therefore is to say, in part at least, that we are the kinds of beings who are at our best when we find ourselves acting in ways that we cannot—and ought not—entirely take credit for.

The Discovery of the Mind
by Bruno Snell
pp. 158-160

The words for virtue and good, arete and agathos, are at first by no means clearly distinguished from the area of profit. In the early period they are not as palpably moral in content as might be supposed; we may compare the German terms Tugend and gut, which originally stood for the ‘suitable’ (taugende) and the ‘fitting’ (cf. Gatte). When Homer says that a man is good, agathos, he does not mean thereby that he is morally unobjectionable, much less good-hearted, but rather that he is useful, proficient, and capable of vigorous action. We also speak of a good warrior or a good instrument. Similarly arete, virtue, does not denote a moral property but nobility, achievement, success and reputation. And yet these words have an unmistakable tendency toward the moral because, unlike ‘happiness’ or ‘profit’, they designate qualities for which a man may win the respect of his whole community. Arete is ‘ability’ and ‘achievement’, characteristics which are expected of a ‘good’, an ‘able’ man, an aner agathos. From Homer to Plato and beyond these words spell out the worth of a man and his work. Any change in their meaning, therefore, would indicate a reassessment of values. It is possible to show how at various times the formation and consolidation of social groups and even of states was connected with people’s ideas about the ‘good’. But that would be tantamount to writing a history of Greek culture. In Homer, to possess ‘virtue’ or to be ‘good’ means to realize one’s nature, and one’s wishes, to perfection. Frequently happiness and profit form the reward, but it is no such extrinsic prospect which leads men to virtue and goodness. The expressions contain a germ of the notion of entelechy. A Homeric hero, for instance, is capable of ‘reminding himself’, or of ‘experiencing’, that he is noble. ‘Use your experience to become what you are’, advises Pindar, who adheres to this image of arete. The ‘good’ man fulfils his proper function, prattei ta heautou, as Plato demands it; he achieves his own perfection. And in the early period this also entails that he is good in the eyes of others, for the notions and definitions of goodness are plain and uniform: a man appears to others as he is.

In the Iliad (11.404–410) Odysseus reminds himself that he is an aristocrat, and thereby resolves his doubts how he should conduct himself in a critical situation. He does it by concentrating on the thought that he belongs to a certain social order, and that it is his duty to fulfill the ‘virtue’ of that order. The universal which underlies the predication ‘I am a noble’ is the group; he does not reflect on an abstract ‘good’ but upon the circle of which he claims membership. It is the same as if an officer were to say: ‘As an officer I must do this or that,’ thus gauging his action by the rigid conception of honour peculiar to his caste.

Aretan is ‘to thrive’; arete is the objective which the early nobles attach to achievement and success. By means of arete the aristocrat implements the ideal of his order—and at the same time distinguishes himself above his fellow nobles. With his arete the individual subjects himself to the judgment of his community, but he also surpasses it as an individual. Since the days of Jacob Burckhardt the competitive character of the great Greek achievements has rightly been stressed. Well into the classical period, those who compete for arete are remunerated with glory and honour. The community puts its stamp of approval on the value which the individual sets on himself. Thus honour, time, is even more significant than arete for the growth of the moral consciousness, because it is more evident, more palpable to all. From his earliest boyhood the young nobleman is urged to think of his glory and his honour; he must look out for his good name, and he must see to it that he commands the necessary respect. For honour is a very sensitive plant; wherever it is destroyed the moral existence of the loser collapses. Its importance is greater even than that of life itself; for the sake of glory and honour the knight is prepared to sacrifice his life.

pp. 169-172

The truth of the matter is that it was not the concept of justice but that of arete which gave rise to the call for positive individual achievement, the moral imperative which the early Greek community enjoins upon its members who in turn acknowledge it for themselves. A man may have purely egotistical motives for desiring virtue and achievement, but his group gives him considerably more credit for these ideals than if he were to desire profit or happiness. The community expects, and even demands, arete. Conversely a man who accomplishes a high purpose may convince himself so thoroughly that his deed serves the interests of a supra-personal, a universal cause that the alternative of egotism or altruism becomes irrelevant. What does the community require of the individual? What does the individual regard as universal, as eternal? These, in the archaic age, are the questions about which the speculations on arete revolve.

The problem remains simple as long as the individual cherishes the same values as the rest of his group. Given this condition, even the ordinary things in life are suffused with an air of dignity, because they are part of custom and tradition. The various daily functions, such as rising in the morning and the eating of meals, are sanctified by prayer and sacrifice, and the crucial events in the life of man—birth, marriage, burial—are for ever fixed and rooted in the rigid forms of cult. Life bears the imprint of a permanent authority which is divine, and all activity is, therefore, more than just personal striving. No one doubts the meaning of life; the hallowed tradition is carried on with implicit trust in the holy wisdom of its rules. In such a society, if a man shows unusual capacity he is rewarded as a matter of course. In Homer a signal achievement is, as one would expect, also honoured with a special permanence, through the song of the bard which outlasts the deed celebrated and preserves it for posterity. This simple concept is still to be found in Pindar’s Epinicians. The problem of virtue becomes more complex when the ancient and universally recognized ideal of chivalry breaks down. Already in Homeric times a differentiation sets in. As we have seen in the story of the quarrel over the arms of Achilles, the aretai become a subject for controversy. The word arete itself contains a tendency toward the differentiation of values, since it is possible to speak of the virtues of various men and various things. As more sections of society become aware of their own merit, they are less willing to conform to the ideal of the once-dominant class. It is discovered that the ways of men are diverse, and that arete may be attained in all sorts of professions. Whereas aristocratic society had been held together, not to say made possible, by a uniform notion of arete, people now begin to ask what true virtue is. The crisis of the social system is at the same time the crisis of an ideal, and thus of morality. Archilochus says (fr. 41) that different men have their hearts quickened in various ways. But he also states, elaborating a thought which first crops up in the Odyssey: the mind of men is as Zeus ushers in each day, and they think whatever they happen to hit upon (fr. 68). One result of this splitting up of the various forms of life is a certain failure of nerve. Man begins to feel that he is changeable and exposed to many variable forces. This insight deepens the moral reflexions of the archaic period; the search for the good becomes a search for the permanent.

The topic of the virtues is especially prominent in the elegy. Several elegiac poets furnish lists of the various aretai which they exemplify by means of well-known myths. Their purpose is to clarify for themselves their own attitudes toward the conflicting standards of life. Theognis (699 ff.) stands at the end of this development; with righteous indignation he complains that the masses no longer have eyes for anything except wealth. For him material gain has, in contrast with earlier views, become an enemy of virtue.

The first to deal with this general issue is Tyrtaeus. His call to arms pronounces the Spartan ideal; perhaps he was the one to formulate that ideal for the first time. Nothing matters but the bravery of the soldier fighting for his country. Emphatically he rejects all other accomplishments and virtues as secondary: the swiftness of the runner in the arena, or the strength of the wrestler, or again physical beauty, wealth, royal power, and eloquence, are as nothing before bravery. In the Iliad also a hero best proves his virtue by standing firm against the enemy, but that is not his only proof; the heroic figures of Homer dazzle us precisely because of their richness in human qualities. Achilles is not only brave but also beautiful, ‘swift of foot’, he knows how to sing, and so forth. Tyrtaeus sharply reduces the scope of the older arete; what is more, he goes far beyond Homer in magnifying the fame of fortitude and the ignominy which awaits the coward. Of the fallen he actually says that they acquire immortality (9.32). This one-sidedness is due to the fact that the community has redoubled its claim on the individual; Sparta in particular taxed the energies of its citizenry to the utmost during the calamitous period of the Messenian wars. The community is a thing of permanence for whose sake the individual mortal has to lay down his life, and in whose memory lies his only chance for any kind of survival. Even in Tyrtaeus, however, these claims of the group do not lead to a termite morality. Far from prescribing a blind and unthinking service to the whole, or a spirit of slavish self-sacrifice, Tyrtaeus esteems the performance of the individual as a deed worthy of fame. This is a basic ingredient of arete which, in spite of countless shifts and variations, is never wholly lost.

Philosophy Before Socrates
by Richard D. McKirahan
pp. 366-369

Aretē and Agathos These two basic concepts of Greek morality are closely related and not straightforwardly translatable into English. As an approximation, aretē can be rendered “excellence” or “goodness” (sometimes “virtue”), and agathos as “excellent” or “good.” The terms are related in that a thing or person is agathos if and only if it has aretē and just because it has aretē. The concepts apply to objects, conditions, and actions as well as to humans. They are connected with the concept of ergon (plural, erga), which may be rendered as “function” or “characteristic activity.” A good (agathos) person is one who performs human erga well, and similarly a good knife is a knife that performs the ergon of a knife well. The ergon of a knife is cutting, and an agathos knife is one that cuts well. Thus, the aretē of a knife is the qualities or characteristics a knife must have in order to cut well. Likewise, if a human ergon can be identified, an agathos human is one who can and on appropriate occasions does perform that ergon well, and human aretē is the qualities or characteristics that enable him or her to do so. The classical discussion of these concepts occurs after our period, in Aristotle,6 but he is only making explicit ideas that go back to Homer and which throw light on much of the pre-philosophical ethical thought of the Greeks.

This connection of concepts makes it automatic, virtually an analytic truth, that the right goal for a person—any person—is to be or become agathos. Even if that goal is unreachable for someone, the aretē–agathos standard still stands as an ideal against which to measure one’s successes and failures. However, there is room for debate over the nature of human erga, both whether there is a set of erga applicable to all humans and relevant to aretē and, supposing that there is such a set of erga, what those erga are. The existence of the aretē–agathos standard makes it vitally important to settle these issues, for otherwise human life is left adrift with no standards of conduct. […]

The moral scene Homer presents is appropriate to the society it represents and quite alien to our own. It is the starting point for subsequent moral speculation which no one in the later Greek tradition could quite forget. The development of Greek moral thought through the Archaic and Classical periods can be seen as the gradual replacement of the competitive by the cooperative virtues as the primary virtues of conduct and as the increasing recognition of the significance of people’s intentions as well as their actions.7

Rapid change in Greek society in the Archaic and Classical periods called for new conceptions of the ideal human and the ideal human life and activities. The Archaic period saw different kinds of rulers from the Homeric kings, and individual combat gave way to the united front of a phalanx of hoplites (heavily armed warriors). Even though the Homeric warrior-king was no longer a possible role in society, the qualities of good birth, beauty, courage, honor, and the abilities to give good counsel and rule well remained. Nevertheless, the various strands of the Homeric heroic ideal began to unravel. In particular, good birth, wealth, and fighting ability no longer automatically went together. This situation forced the issue: what are the best qualities we can possess? What constitutes human aretē? The literary sources contain conflicting claims about the best life for a person, the best kind of person to be, and the relative merits of qualities thought to be ingredients of human happiness. In one way or another these different conceptions of human excellence have Homeric origins, though they diverge from Homer’s conception and from one another.

Lack of space makes it impossible to present the wealth of materials that bear on this subject.8 I will confine discussion to two representatives of the aristocratic tradition who wrote at the end of the Archaic period. Pindar shows how the aristocratic ideal had survived and been transformed from the Homeric conception and how vital it remained as late as the early fifth century, and Theognis reveals how social, political, and economic reality was undermining that ideal.

p. 374

The increase in wealth and the shift in its distribution which had begun by the seventh century led to profound changes in the social and political scenes in the sixth and forced a wedge in among the complex of qualities which traditionally constituted aristocratic aretē. Pindar’s unified picture in which wealth, power, and noble birth tend to go together became ever less true to contemporary reality.

The aristocratic response to this changed situation receives its clearest expression in the poems attributed to Theognis and composed in the sixth and early fifth centuries. Even less than with Pindar can we find a consistent set of views advocated in these poems, but among the most frequently recurring themes are the view that money does not make the man, that many undeserving people are now rich and many deserving people (deserving because of their birth and social background) are now poor. It is noteworthy how Theognis plays on the different connotations of uses of the primary terms of value, agathos and aretē, and their opposites kakos and kakia: morally good vs. evil; well-born, noble vs. low-born; and politically and socially powerful vs. powerless. Since the traditional positive attributes no longer regularly all went together, it was important to decide which are most important, indeed which are the essential ingredients of human aretē.

pp. 379-382

In short, Protagoras taught his students how to succeed in public and private life. What he claimed to teach is, in a word, aretē. That this was his boast follows from the intimate connection between agathos and aretē as well as from the fact that a person with aretē is one who enjoys success, as measured by current standards. Anyone with the abilities Protagoras claimed to teach had the keys to a successful life in fifth-century Athens.

In fact, the key to success was rhetoric, the art of public speaking, which has a precedent in the heroic conception of aretē, which included excellence in counsel. But the Sophists’ emphasis on rhetoric must not be understood as hearkening back to Homeric values. Clear reasons why success in life depended on the ability to speak well in public can be found in fifth-century politics and society. […]

That is not to say that every kind of success depended on rhetoric. It could not make you successful in a craft like carpentry and would not on its own make you a successful military commander. Nor is it plausible that every student of Protagoras could have become another Pericles. Protagoras acknowledged that natural aptitude was required over and above diligence. […] Protagoras recognized that he could not make a silk purse out of a sow’s ear, but he claimed to be able to develop a (sufficiently young) person’s abilities to the greatest extent possible.28

Pericles was an effective counselor in part because he could speak well but also by dint of his personality, experience, and intelligence. To a large extent these last three factors cannot be taught, but rhetoric can be offered as a tekhnē, a technical art or skill which has rules of its own and which can be instilled through training and practice. In these ways rhetoric is like medicine, carpentry, and other technical arts, but it is different in its seemingly universal applicability. Debates can arise on any conceivable subject, including technical ones, and rhetorical skill can be turned to the topic at hand whatever it may be. The story goes that Gorgias used his rhetorical skill to convince medical patients to undergo surgery when physicians failed to persuade them.29 Socrates turned the tables on the Sophists, arguing that if rhetoric has no specific subject matter, then so far from being a universal art, it should not be considered an art at all.30 And even if we grant that rhetoric is an art that can be taught, it remains controversial whether aretē can be taught and in what aretē consists. […]

The main charges against the Sophists are of two different sorts. First the charge of prostituting themselves. Plato emphasizes the money-making aspect of the Sophist’s work, which he uses as one of his chief criteria for determining that Socrates was not a Sophist. This charge contains two elements: the Sophists teach aretē for money, and they teach it to anyone who pays. Both elements have aristocratic origins. Traditionally aretē was learned from one’s family and friends and came as the result of a long process of socialization beginning in infancy. Such training and background can hardly be bought. Further, according to the aristocratic mentality most people are not of the right type, the appropriate social background, to aspire to aretē.

Lila
by Robert Pirsig
pp. 436-442

Digging back into ancient Greek history, to the time when this mythos-to-logos transition was taking place, Phædrus noted that the ancient rhetoricians of Greece, the Sophists, had taught what they called aretê, which was a synonym for Quality. Victorians had translated aretê as “virtue” but Victorian “virtue” connoted sexual abstinence, prissiness and a holier-than-thou snobbery. This was a long way from what the ancient Greeks meant. The early Greek literature, particularly the poetry of Homer, showed that aretê had been a central and vital term.

With Homer Phædrus was certain he’d gone back as far as anyone could go, but one day he came across some information that startled him. It said that by following linguistic analysis you could go even further back into the mythos than Homer. Ancient Greek was not an original language. It was descended from a much earlier one, now called the Proto-Indo-European language. This language has left no fragments but has been derived by scholars from similarities between such languages as Sanskrit, Greek and English which have indicated that these languages were fallouts from a common prehistoric tongue. After thousands of years of separation from Greek and English the Hindi word for “mother” is still “Ma.” Yoga both looks like and is translated as “yoke.” The reason an Indian rajah’s title sounds like “regent” is because both terms are fallouts from Proto-Indo-European. Today a Proto-Indo-European dictionary contains more than a thousand entries with derivations extending into more than one hundred languages.

Just for curiosity’s sake Phædrus decided to see if aretê was in it. He looked under the “a” words and was disappointed to find it was not. Then he noted a statement that said that the Greeks were not the most faithful to the Proto-Indo-European spelling. Among other sins, the Greeks added the prefix “a” to many of the Proto-Indo-European roots. He checked this out by looking for aretê under “r.” This time a door opened.

The Proto-Indo-European root of aretê was the morpheme rt. There, beside aretê, was a treasure room of other derived “rt” words: “arithmetic,” “aristocrat,” “art,” “rhetoric,” “worth,” “rite,” “ritual,” “wright,” “right (handed)” and “right (correct).” All of these words except arithmetic seemed to have a vague thesaurus-like similarity to Quality. Phædrus studied them carefully, letting them soak in, trying to guess what sort of concept, what sort of way of seeing the world, could give rise to such a collection.

When the morpheme appeared in aristocrat and arithmetic the reference was to “firstness.” Rt meant first. When it appeared in art and wright it seemed to mean “created” and “of beauty.” “Ritual” suggested repetitive order. And the word right has two meanings: “right-handed” and “moral and esthetic correctness.” When all these meanings were strung together a fuller picture of the rt morpheme emerged. Rt referred to the “first, created, beautiful repetitive order of moral and esthetic correctness.” […]

There was just one thing wrong with this Proto-Indo-European discovery, something Phædrus had tried to sweep under the carpet at first, but which kept creeping out again. The meanings, grouped together, suggested something different from his interpretation of aretê. They suggested “importance” but it was an importance that was formal and social and procedural and manufactured, almost an antonym to the Quality he was talking about. Rt meant “quality” all right but the quality it meant was static, not Dynamic. He had wanted it to come out the other way, but it looked as though it wasn’t going to do it. Ritual. That was the last thing he wanted aretê to turn out to be. Bad news. It looked as though the Victorian translation of aretê as “virtue” might be better after all since “virtue” implies ritualistic conformity to social protocol. […]

Rta. It was a Sanskrit word, and Phædrus remembered what it meant: Rta was the “cosmic order of things.” Then he remembered he had read that the Sanskrit language was considered the most faithful to the Proto-Indo-European root, probably because the linguistic patterns had been so carefully preserved by the Hindu priests. […]

Rta, from the oldest portion of the Rg Veda, which was the oldest known writing of the Indo-Aryan language. The sun god, Sūrya, began his chariot ride across the heavens from the abode of rta. Varuna, the god for whom the city in which Phædrus was studying was named, was the chief support of rta.

Varuna was omniscient and was described as ever witnessing the truth and falsehood of men—as being “the third whenever two plot in secret.” He was essentially a god of righteousness and a guardian of all that is worthy and good. The texts had said that the distinctive feature of Varuna was his unswerving adherence to high principles. Later he was overshadowed by Indra who was a thunder god and destroyer of the enemies of the Indo-Aryans. But all the gods were conceived as “guardians of rta,” willing the right and making sure it was carried out.

One of Phædrus’s old school texts, written by M. Hiriyanna, contained a good summary: “Rta, which etymologically stands for ‘course,’ originally meant ‘cosmic order,’ the maintenance of which was the purpose of all the gods; and later it also came to mean ‘right,’ so that the gods were conceived as preserving the world not merely from physical disorder but also from moral chaos. The one idea is implicit in the other: and there is order in the universe because its control is in righteous hands.…”

The physical order of the universe is also the moral order of the universe. Rta is both. This was exactly what the Metaphysics of Quality was claiming. It was not a new idea. It was the oldest idea known to man.

This identification of rta and aretê was enormously valuable, Phædrus thought, because it provided a huge historical panorama in which the fundamental conflict between static and Dynamic Quality had been worked out. It answered the question of why aretê meant ritual. Rta also meant ritual. But unlike the Greeks, the Hindus in their many thousands of years of cultural evolution had paid enormous attention to the conflict between ritual and freedom. Their resolution of this conflict in the Buddhist and Vedantist philosophies is one of the profound achievements of the human mind.

Pagan Ethics: Paganism as a World Religion
by Michael York
pp. 59-60

Pirsig contends that Plato incorporated the arete of the Sophists into his dichotomy between ideas and appearances — where it was subordinated to Truth. Once Plato identifies the True with the Good, arete’s position is usurped by “dialectically determined truth.” This, in turn, allows Plato to demote the Good to a lower order and minor branch of knowledge. For Pirsig, the Sophists were those Greek philosophers who exalted quality over truth; they were the true champions of arete or excellence. With a pagan quest for the ethical that develops from an idolatrous understanding of the physical, while Aristotle remains an important consideration, it is to the Sophists (particularly Protagoras, Prodicus and Pirsig’s understanding of them) and a reconstruction of their underlying humanist position that perhaps the most important answers are to be framed if not found as well.

A basic pagan position is an acceptance of the appetites — in fact, their celebration rather than their condemnation. We find the most unbridled expression of the appetites in the actions of the young. Youth may engage in binge-drinking, vandalism, theft, promiscuity and profligate experimentation. Pagan perspectives may recognize the inherent dangers in these as there are in life itself. But they also trust the overall process of learning. In paganism, morality has a much greater latitude than it does in the transcendental philosophy of a Pythagoras, Plato, or Plotinus: it may veer toward a form of relativism, but its ultimate check is always the sanctity of the other animate individuals. An it harm none, do what ye will. The pagan ethic must be found within the appetites and not in their denial.

In fact, paganism is part of a protest against Platonic assertion. The wider denial is that of nature herself. Nature denies the Platonic by refusing to conform to the Platonic ideal. It insists on moments of chaos, the epagomenae, the carnival, that overlap between the real and the ideal that is itself a metaphor for reality. The actual year is a refusal to cooperate with the mathematically ideal year of 360 days — close but only tantalizingly.

In addition, pagans have always loved asking what is arete? This is the fundamental question we encounter with the Sophists, Plato and Aristotle. It is the question that is before us still. The classics considered variously both happiness and the good as alternative answers. The Hedonists pick happiness — but a particular kind of happiness. The underlying principle recognized behind all these possibilities is arete ‘excellence, the best’ however it is embodied — whether god, goddess, goods, the good, gods, virtue, happiness, pleasure or all of these together. Arete is that to which both individual and community aspire. Each wants one’s own individual way of putting it together in excellent fashion — but at the same time wanting some commensurable overlap of the individual way with the community way.

What is the truth of the historical claims about Greek philosophy in Zen and the Art of Motorcycle Maintenance?
answer by Ammon Allred

Arete is usually translated as “virtue,” which is certainly connected up with the good “agathon” — but in Plato an impersonal Good is probably more important than aletheia or truth. See, for instance, the central images at the end of Book VI, where the Good is called the “Father of the Sun.” The same holds in the Philebus. And it wouldn’t be right to say that Plato (or Aristotle) thought virtue was part of some small branch called “ethics” (Plato doesn’t divide his philosophy up this way; Aristotle does — although then we get into the fact that we don’t have the dialogues he wrote — but still what he means by ethics is far broader than what we mean).

Certainly the Sophists pushed for a humanistic account of the Good, whereas Plato’s was far more impersonal. And Plato himself had a complex relationship to the Sophists (consider the dialogue of Protagoras, where Socrates and Protagoras both end up about equally triumphant).

That said, Pirsig is almost certainly right about Platonism — that is to say, the approach to philosophy that has been taught as though it were Plato’s philosophy. Certainly, the sophists have gotten a bad rap because of the view that Socrates and Plato were taken to have about the sophists; but even there, many philosophers have tried to rehabilitate them: most famously, Nietzsche.

Edge of the Depths

“In Science there are no ‘depths’; there is surface everywhere.”
~ Rudolf Carnap

I was reading Richard S. Hallam’s Virtual Selves, Real Persons. I’ve enjoyed it, but I find a point of disagreement or maybe merely doubt and questioning. He emphasizes persons as being real, in that they are somehow pre-existing and separate. He distinguishes the person from selves, although this distinction isn’t necessarily relevant to my thoughts here.

I’m not sure to what degree our views diverge, as I find much of the text to be insightful and a wonderful overview. However, to demonstrate my misgivings, the author only mentions David Hume’s bundle theory a couple of times on a few pages (in a several hundred page book), a rather slight discussion for such a key perspective. He does give a bit more space to Julian Jaynes’ bicameral theory, but even Jaynes is isolated to one fairly small section and not fully integrated into the author’s larger analysis.

The commonality between Hume and Jaynes is that they perceived conscious identity as being more nebulous — no there there. In my own experience, that feels more right to me. As one dives down into the psyche, the waters become quite murky, so dark that one can’t even see one’s hands in front of one’s face, much less know what one might be attempting to grasp. Notions of separateness, at a great enough depth, fade away — one finds oneself floating in darkness with no certain sense of distance or direction. I don’t know how to explain this to anyone who hasn’t experienced altered states of mind, whether through extended meditation or psychedelic trips.

This is far from a new line of thought for me, but it kept jumping out at me as I read Hallam’s book. His writing is scholarly to a high degree and, for me, that is never a criticism. The downside is that a scholarly perspective alone can’t be taken into the depths. Jaynes solved this dilemma by maintaining a dual focus, intellectual argument balanced with a sense of wonder — speaking of our search for certainty, he said that, “Beyond that, there is only awe.”

I keep coming back to that. For all I appreciate of Hallam’s book, I never once experienced awe. Then again, he probably wasn’t attempting to communicate awe. So, it’s not exactly that I judge this as a failing, even if it can feel like an inadequacy from the perspective of human experience or at least my experience. In the throes of awe, we are humbled into an existential state of ignorance. A term like ‘separation’ becomes yet another word. To take consciousness directly and fully is to lose any sense of separateness for, then, there is consciousness alone — not my consciousness and your consciousness, just consciousness.

I could make, and have made, more intellectual arguments about consciousness and how strange it can be. It’s not clear to me, as it is clear to some, that there is any universal experience of consciousness (human or otherwise). There seems to be a wide variety of states of mind found across diverse societies and species. Consider animism, which seems so alien to the modern sensibility. What does ‘separation’ mean in an animate world that doesn’t assume the individual as the starting point of human existence?

I don’t need to rationally analyze any of this. Rationality just as easily turns into rationalization, justifying what we think we already know. All I can say is that, intuitively, Hume’s bundle theory makes more sense of what I know directly within my own mind, whatever that may say about the minds of others. That viewpoint can’t be scientifically proven, for the experience behind it is inscrutable, not an object to be weighed and measured, even as brain scans remain fascinating. Consciousness can’t be found by pulling apart Hume’s bundle any more than a frog’s soul can be found by dissecting its beating heart — consciousness having a similar metaphysical status as the soul. Something like the bundle theory either makes sense or it doesn’t. Consciousness is a mystery, no matter how unsatisfying that may seem. Science can take us to the edge of the depths, but that is where it stops. To step off that edge requires something else entirely.

Actually, stepping off rarely happens since few, if any, ever choose to sink into the depths. One slips and falls and the depths envelop one. Severe depression was my initiation experience, the weight dragging me down. There are many possible entry points to this otherness. When that happens, thoughts on consciousness stop being intellectual speculation and thought experiment. One knows consciousness as well as one will ever know it when one drowns in it. If one thrashes one’s way back to the surface, then and only then can one offer meaningful insight, but more likely one is lost in silence, water still choking in one’s throat.

This is why Julian Jaynes, for all of his brilliance and insight, reached the end of his life filled with frustration at what felt like a failure to communicate. As his historical argument went, individuals don’t change their mindsets so much as the social system that maintains a particular mindset is changed, which in the case of bicameralism meant the collapse of the Bronze Age civilizations. Until our society faces a similar crisis and is collectively thrown into the depths, separation will remain the dominant mode of experience and understanding. As for what might replace it, that is anyone’s guess.

Here we stand, our footing not entirely secure, at the edge of the depths.