Human Nature: Categories & Biases

There is something compelling about seemingly opposing views. There is Mythos vs Logos, Apollonian vs Dionysian, Fox vs Hedgehog, Socratic vs Sophistic, Platonic vs Aristotelian, Spinoza vs Locke, Paine vs Burke, Jung vs Freud, nature vs nurture, biology vs culture, determinism vs free will, parenting style vs peer influence, etc.

And these perceived divisions overlap in various ways, forming a long-developing history of ideas, worldviews, and thinkers. It’s a dance. One side will take the lead and then the other. The two sides will take different forms, the dividing lines shifting.

In more recent decades, we’ve come to more often think in terms of political ideologies. The greatest of them all is liberal vs conservative. But since World War II, there has been a growing obsession with authoritarianism and anti-authoritarianism. And there is the newer area of social dominance orientation (SDO). Some prefer focusing on progressive vs reactionary as more fundamental, as it relates to the history of the revolutionary and counterrevolutionary.

With the advent of social science and neuroscience, we’ve increasingly put all of this in new frames. Always popular are the left and right brain hemispheres, along with more specific brain anatomy (e.g., conservatives on average have a larger amygdala). Then there is the personality research: Myers-Briggs, trait theory, boundary types, etc. Of those three, trait theory is the most widely used.

Part of it is that humans simply like to categorize. It’s how we attempt to make sense of the world. And there is nothing that preoccupies human curiosity more than humanity itself, our shared inheritance of human ideas and human nature. For as long as humans have been writing and probably longer, there have been categorizations to slot humans into.

My focus has most often been on personality, along with social science more generally. What also interests me is that one’s approach to such issues comes in different varieties. With that in mind, I wanted to briefly compare two books. Both give voice to two sides of my own thinking. The first I’ll discuss is The Liberal’s Guide to Conservatives by J. Scott Wagner. And the second is A Skeptic’s Guide to the Mind by Robert Burton.

Wagner’s book is the kind of overview I wish I’d had earlier last decade. But a book like this gets easier to write as time goes on. Many points of confusion have been further clarified, if not always resolved, by more recent research. Then again, often this has just made us more clear about what exactly is our confusion.

What is useful about a book like this is that it helps show what we do know at the moment. Or simply what we think we know, until further research is done to confirm or disconfirm present theories. But at least some of it allows a fair amount of certainty that we are looking at significant patterns in the data.

It’s a straightforward analysis with a simple purpose. The author is on the political left and he wants to help those who share his biases to understand those on the political right who have different biases. A noble endeavor, as always. He covers a lot of territory and it is impressive. I won’t even attempt to summarize it all. I’m already broadly familiar with the material, as this area of study involves models and theories that have been researched for a long time.

What most stood out to me was his discussion of authoritarianism and social dominance orientation (SDO). For some reason, that seems more important than all the rest. Those taken together represent the monkey wrench thrown into the gears of the human mind. I was amused when Wagner opined that,

Unlike all that subtlety around “social conformity-autonomy” and authoritarianism, the SDO test is straightforward: not to put too fine a point on it, but to me, the questions measure how much of a jerk you are. (Kindle Locations 3765-3767)

He holds no love for SDOs. And for good reason. Take the worst aspects of the liberal elite of the classical liberal variety, as found in a class-based pseudo-meritocracy. Remove any trace of liberal-minded tolerance, empathy, kindness, and compassion. Then wrap this all up in in-group domination. Serve with a mild sauce of near sociopathy.

The worst part of it is that SDOs are disproportionately found among those with wealth and power, authority and privilege. These people are found among the ruling elite for the simple reason that they want to be a ruling elite. Unless society stops them from dominating, they will dominate. It’s their nature, like the scorpion that stings the frog carrying it across the river. The scorpion can’t help itself.

All of that is important info. I do wish more people would read books like these. There is no way for the public, conservative and liberal alike, to come together in defense against threats to the public good when they don’t understand or often even clearly see those threats.

Anyway, Wagner’s book offers a systematizing approach, with a more practical emphasis that offers useful insight. He shows what differentiates people and what those demarcations signify. He offers various explanations and categorizations, models and theories. You could even take professional tests that will show your results on the various scales discussed, in order to see where you fit in the scheme of personality traits and ideological predispositions. Reading his book will help you understand why conflicts are common and communication difficult. But he doesn’t leave it at that, as he shares personal examples and helpful advice.

Now for the other approach, more contrarian in nature. This is exemplified by the other book I’ve been reading, the one by Robert Burton (who I quoted in a recent post). As Wagner brings info together, Burton dissects it into its complicated messy details (Daniel Everett has a similar purpose). Yet Burton also is seeking to be of use, in promoting clear thinking and a better scientific understanding. His is a challenge not just to the public but also to scientific researchers.

Rather than promising answers to age-old questions about the mind, it is my goal to challenge the underlying assumptions that drive these questions. In the end, this is a book questioning the nature of the questions about the mind that we seem compelled to ask yet are scientifically unable to answer. (p. 7)

Others like Wagner show the answers so far found for the questions we ask. Burton’s motive is quite the opposite, to question those answers. This is in the hope of improving both questions and answers.

Here is what I consider the core insight from Burton’s analysis (p. 105-7):

“Heinrich’s team showed the illusion to members of sixteen different social groups including fourteen from small-scale societies such as native African tribes. To see how strong the illusion was in each of these groups, they determined how much longer the “shorter” line needed to be for the observer to conclude that the two lines were equal. (You can test yourself at this website— http://www.michaelbach.de/ot/sze_muelue/index.html.) By measuring the amount of lengthening necessary for the illusion to disappear, they were able to chart differences between various societies. At the far end of the spectrum— those requiring the greatest degree of lengthening in order to perceive the two lines as equal (20 percent lengthening)— were American college undergraduates, followed by the South African European sample from Johannesburg. At the other end of the spectrum were members of a Kalahari Desert tribe, the San foragers. For the San tribe members, the lines looked equal; no line adjustment was necessary, as they experienced no sense of illusion. The authors’ conclusion: “This work suggests that even a process as apparently basic as visual perception can show substantial variation across populations. If visual perception can vary, what kind of psychological processes can we be sure will not vary?” 14

“Challenging the entire field of psychology, Heinrich and colleagues have come to some profoundly disquieting conclusions. Lifelong members of societies that are Western, educated, industrialized, rich, democratic (the authors coined the acronym WEIRD) reacted differently from others in experiment after experiment involving measures of fairness, antisocial punishment, and cooperation, as well as when responding to visual illusions and questions of individualism and conformity. “The fact that WEIRD people are the outliers in so many key domains of the behavioral sciences may render them one of the worst subpopulations one could study for generalizing about Homo sapiens.” The researchers found that 96 percent of behavioral science experiment subjects are from Western industrialized countries, even though those countries have just 12 percent of the world’s population, and that 68 percent of all subjects are Americans.

“Jonathan Haidt, University of Virginia psychologist and prepublication reviewer of the article, has said that Heinrich’s study “confirms something that many researchers knew all along but didn’t want to admit or acknowledge because its implications are so troublesome.” 15 Heinrich feels that either many behavioral psychology studies have to be redone on a far wider range of cultural groups— a daunting proposition— or they must be understood to offer insight only into the minds of rich, educated Westerners.

“Results of a scientific study that offer universal claims about human nature should be independent of location, cultural factors, and any outside influences. Indeed, one of the prerequisites of such a study would be to test the physical principles under a variety of situations and circumstances. And yet, much of what we know or believe we know about human behavior has been extrapolated from the study of a small subsection of the world’s population known to have different perceptions in such disparate domains as fairness, moral choice, even what we think about sharing. 16 If we look beyond the usual accusations and justifications— from the ease of inexpensively studying undergraduates to career-augmenting shortcuts— we are back at the recurrent problem of a unique self-contained mind dictating how it should study itself.”

I don’t feel much need to add to that. The implications of it are profound. This possibly throws everything up in the air. We might be forced to change what we think we know. I will point out Jonathan Haidt being quoted in that passage. Like many other social scientists, Haidt’s own research has been limited in scope, something that has been pointed out before (by me and others). But at least those like Haidt are acknowledging the problem and putting some effort into remedying it.

These are exciting times. There is the inevitable result that, as we come to know more, we come to realize how little we know and how limited is what we know (or think we know). We become more circumspect in our knowledge.

Still, that doesn’t lessen the significance of what we’ve so far learned. Even with the WEIRD bias disallowing generalization about a universal human nature, the research done remains relevant to showing the psychological patterns and social dynamics in WEIRD societies. So, for us modern Westerners, the social science is as applicable as it ever was. But what it shows is that there is nothing inevitable about human nature, as what has been shown is that there is immense potential for diverse expressions of our shared humanity.

If you combine these two books, you will have greater understanding than either alone. They can be seen as opposing views, but at a deeper level they share a common purpose, that of gaining better insight into ourselves and others.

Poised on a Knife Edge

“To make any thing very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes.”
~ Edmund Burke

I spent much of the day looking back at old posts. My purpose was to find my various writings on the revolutionary era, specifically in relation to the American Revolution. I was doing so in order to link to them in the post I just wrote, about democratic republicanism in early America.

In my search, I came across a post from several years ago. It is sort of a rambling book review of Yuval Levin’s The Great Debate, the topic being the relationship between Thomas Paine and Edmund Burke. What caught my attention was the comments section. I sometimes put more into the comments section than I do in the post itself. A longtime friend and reader of the blog left a comment, which is partly what led me to go off on some tangents there.

As one of my responses, I quoted at length from Corey Robin’s writings. One quote came from the first book I read by him, The Reactionary Mind:

Earlier than most, Burke understood that if violence were to retain its sublimity, it had to remain a possibility, an object of fantasy— a horror movie, a video game, an essay on war. For the actuality (as opposed to the representation) of violence was at odds with the requirements of sublimity. Real, as opposed to imagined, violence entailed objects getting too close, bodies pressing too near, flesh upon flesh. Violence stripped the body of its veils; violence made its antagonists familiar to each other in a way they had never been before. Violence dispelled illusion and mystery, making things drab and dreary. That is why, in his discussion in the Reflections of the revolutionaries’ abduction of Marie Antoinette, Burke takes such pains to emphasize her “almost naked” body and turns so effortlessly to the language of clothing—“ the decent drapery of life,” the “wardrobe of the moral imagination,” “antiquated fashion,” and so on— to describe the event. 68 The disaster of the revolutionaries’ violence, for Burke, was not cruelty; it was the unsought enlightenment.

Robin explains what Burke meant by the moral imagination, explains why such power exists and what nullifies it. That is why I began this post with the quote by Burke. Here is the fuller context from the 1759 text (“A philosophical enquiry into the origin of our ideas of the sublime and beautiful”, Part Two, Section III – Obscurity):

To make any thing very terrible, obscurity seems in general to be necessary. When we know the full extent of any danger, when we can accustom our eyes to it, a great deal of the apprehension vanishes. Every one will be sensible of this, who considers how greatly night adds to our dread, in all cases of danger, and how much the notions of ghosts and goblins, of which none can form clear ideas, affect minds, which give credit to the popular tales concerning such sorts of beings. Those despotic governments, which are founded on the passions of men, and principally upon the passion of fear, keep their chief as much as may be from the public eye. The policy has been the same in many cases of religion.

It’s not just the power of the mind. Moral imagination is what extends power over people, the emotional grip of distant or hidden authority, human or otherwise. Sublimity and fear, awe and terror.

But this misses the subtlety of this power. Moral imagination is ever-present, the pervasive force that puts blinders on our vision, hypnotizing us into a reality tunnel and sometimes full epistemic closure. As Burke puts it, this forms the wardrobe of our moral imagination, from which we clothe our experience of the world. This wardrobe holds the social constructs of the mind, the ideologies and narratives of society, the customs and norms of culture. It is just there, all around us, enclosing us, a familiar presence, and yet near impossible to see directly, most often barely glimpsed at the periphery of our awareness. Its power is in its simultaneous obscurity and presence, the unseen depths of unconsciousness with an undertow that can be felt.

Also in the comments section, I pointed to the connection to another writer: “I noticed in these passages that ‘horror’ was mentioned a few times. Corey Robin even made reference to horror movies/films and “delightful horror.” What came to my mind is something that Thomas Ligotti said in an interview. He was discussing monsters. He explained that no story can ever have a monster as the protagonist, for then the sense of monstrosity would be lost. The monster has to remain other and the evil vague. That is what gives a horror story its power to horrify.” That stood out to me most of all. There is a simple reason for this, as I had just recently mentioned Ligotti (in relation to True Detective) to this same friend when he came to visit me. I had forgotten about these comments. Reading them again, I saw them in a new light. And there is a more important reason these comments interest me: Ligotti was making a deeper point than mere commentary on horror fiction. The most horrifying other is that which is unseen, and that is its power over us.

This all connects back to the ongoing development of my own theory, that of symbolic conflation. But I forgot about an earlier post where I brought Burke into the context of symbolic conflation. It was for a different reason, though.

In that post, I explained Burke’s role as an outsider and how that positioned him as a purveyor of symbolic conflation. The moral imagination is all about this, as symbolic conflation is the beating heart, the meeting point of the imagined and the real. The centrality of the outsider status also brings into play the reactionary mind, according to Corey Robin, for the outsider sees most clearly the threat of boundaries being transgressed and all boundaries are ultimately boundaries of the mind. A symbolic conflation is a wall that both marks and establishes the boundary. It makes the boundary real and, in doing so, defends the authority of claims about what is real.

This is the moral imagination of fear. It is a visceral fear, the embodied imagination. A symbolic conflation requires a grounding within bodily experience, fight and flight, pain and illness, pleasure and guilt, punishment and death. It relates to what I call the morality-punishment link. It also offers possible insight into the origins of the reactionary mind. The conservative, as I argue, is simply a liberal in reactionary mode. The conservative is a liberal who has been mugged by their own moral imagination. Their minds have been wrapped in chains of fear and locked shut by symbolic conflation, the visceral experience of a story that has become their reality.

This is a potential existing within everyone, not just those on the political right. But this potential requires specific conditions to become manifest. Liberalism and the conservative reaction to it is an expression of modernity. This dynamic isn’t found in all societies. It is a cultural product and so there is nothing inevitable about it. Other cultures are possible with other ideological mindsets and other social dynamics. For us moderns, though, it is the only reality we know, this endless conflict within our collective psyche.

Maybe unintentionally, Edmund Burke offers us the key to unlock the modern mind. Knowing this key existed is what he feared the most, for then the human mind and its potential would be laid bare. Yet this fear is what gives the reactionary mind its sense of power and purpose, an existential threat that must be fought. Modernity is continuously poised on a knife edge.

The near cosmic morality tale of ideological conflict is itself a symbolic conflation. There is always a story being told and its narrative force has deep roots. Wherever a symbolic conflation takes hold, a visceral embodiment is to be found nearby. Our obsession with ideology is unsurprisingly matched by our obsession with the human brain. The symbolic conflation, through moral imagination, gets overlaid onto the brain, for there is no greater bodily symbol of the modern self. We fight over the meaning of human nature by wielding the scientific facts of neurocognition and brain scans. It’s the same reason the culture wars obsess over the visceral physicality of sexuality: same sex marriage, abortion, etc. But the hidden mysteries of the brain make it particularly fertile soil. As Robert Burton explained in A Skeptic’s Guide to the Mind (Kindle Locations 2459-2465):

our logic is influenced by a sense of beauty and symmetry. Even the elegance of brain imaging can greatly shape our sense of what is correct. In a series of experiments by psychologists David McCabe and Alan Castel, it was shown that “presenting brain images with an article summarizing cognitive neuroscience research resulted in higher ratings of scientific reasoning for arguments made in those articles, as compared to other articles that did not contain similar images. These data lend support to the notion that part of the fascination and credibility of brain imaging research lies in the persuasive power of the actual brain images.” The authors’ conclusion: “Brain images are influential because they provide a physical basis for abstract cognitive processes, appealing to people’s affinity for reductionistic explanations of cognitive phenomena.” *

The body is always the symbolic field of battle. Yet the material form occludes what exactly the battle is being fought over. The embodied imagination is the body politic. We are the fear we project outward. And that very fear keeps us from looking inward, instead always drawing us onward. We moderns are driven by anxiety, even as we can never quite pinpoint what is agitating us. We are stuck in a holding pattern of the mind, waiting for something we don’t know and are afraid to know. Even as we are constantly on the move, we aren’t sure we are getting anywhere, like a dog trotting along the fenceline of its yard.

* * *

* D. McCabe and A. Castel, “Seeing Is Believing: The Effect of Brain Images on Judgments of Scientific Reasoning,” Cognition, 107( 1), April 2008, 345– 52.
(For criticisms, see: The Not So Seductive Allure of Colorful Brain Images, The Neurocritic.)

Bundle Theory: Embodied Mind, Social Nature

I was listening to the audio version of Susan Blackmore’s Consciousness: A Very Short Introduction. It’s less than five hours long and so I listened to it multiple times to get a good sense of it. I’ve read plenty about the topic and I’m already generally familiar with the material, but it was still helpful getting an overview.

One part that stood out was about split brain research, something that always interests me. The roles of and relationship between the hemispheres indicate much about how our minds operate. Blackmore discussed one often referenced study where split brain patients had information given separately to each hemisphere in order to see how the individual would explain their behavior. As the left hemisphere typically controls linguistic communication, individuals couldn’t give accurate reasons for what was done by their right hemisphere.

The author wrote that (pp. 72-3),

“In this way, the verbal left brain covered up its ignorance by confabulating. It did the same when the other half was shown an emotional picture – making up a plausible excuse for laughing, smiling, blushing, or whatever emotional reaction had been provoked. This might help to explain how these patients can appear so normal. But it should also make us wonder about ourselves. Our brains consist of lots of relatively independent modules, and the verbal part does not have access to everything that goes on, yet it frequently supplies convincing reasons for our actions. How many of these are plausible confabulations rather than true reasons, and can we tell?

“From these experiments, Sperry concluded that his patients had two conscious entities in one head; each having private sensations and free will. In contrast, Gazzaniga argued that only the left hemisphere sustains ‘the interpreter’, which uses language, organizes beliefs, and ascribes actions and intentions to people. Only this hemisphere has ‘high-level consciousness’, leaving the other hemisphere with many abilities and skills but without true consciousness.”

She points out that there is no way to resolve this issue. We can’t prove what is really going on here, even as it touches upon our most personal experience. But she adds that, “Bundle theory does away with the problem altogether. There is neither one self nor two selves inside the split brain; there are experiences but there is no one who is having them” (p. 74). What this means is that our experience of an egoic consciousness is overlaid on the entire experiential field, one experience presenting itself as all experience. Or else an interpretation of experience that alters what we experience and how we experience it. The self as coherent individuality is a mirage. That isn’t to say it is meaningless. Our minds naturally look for patterns, even or especially within our own minds. Meaning is always what we bring to our experience.

As for actual reading, as opposed to listening to audiobooks, my focus has still been on Daniel Everett’s recent publication, Dark Matter of the Mind. It is a difficult read in many parts because much of the linguistics scholarship goes over my head and the academic language can get tiresome, but I’ve been determined to finish it and I’m now near the last chapter. Parts of it are quite interesting, such as his mentioning the theory that “gestures and speech were equally and simultaneously implicated in the evolution of language” (Kindle Location 5102). He then details the relevance of gestures and embodied communication (Kindle Locations 5108-5111):

““Mead’s loop,” wherein one’s own gestures are responded to by one’s own mirror neurons in the same way that these neurons respond to the actions of others, thus bringing one’s own actions into the realm of the social and contributing crucially to the development of a theory of mind— being able to interpret the actions of others under the assumption that others have minds like we do and think according to similar processes.”

That is what came to mind while listening to what Blackmore had to say about bundle theory of experience. The parts of the ‘self’ don’t form a coherent whole so much as they are involved in intimate contact and communication.

Our experience is social at the most fundamental level, a social phenomenon within each person’s body and social connection to the bodies of others. Our embodied selves are shifting realities with blurred boundaries, out of which form patterns of social order and social identities. As others have argued, we develop a theory of mind within ourselves by first sussing out a theory of mind about others. So, our sense of self is built on our sense of others, which is to say we understand the relationships between experiences within our own embodied minds as an inseparable understanding of our relationships with the larger world.

It’s hard to get at what this might mean. But one important factor is that of language. As Julian Jaynes argued in his book about the bicameral mind, “language is an organ of perception, not simply a means of communication” (p. 50, Kindle edition). Perception is always embodied. In The Master and His Emissary, Iain McGilchrist offers a summary that resonates with what I shared above by Everett (pp. 122-123):

“language originates as an embodied expression of emotion, that is communicated by one individual ‘inhabiting’ the body, and  therefore the emotional world, of another; a bodily skill, further, that is acquired by each of us through imitation, by the emotional identification and intuitive harmonisation of the bodily states of the one who learns with the one from whom it is learnt; a skill moreover that originates in the brain as an analogue of bodily movement, and involves the same processes, and even the same brain areas, as certain highly expressive gestures, as well as involving neurones (mirror neurones) that are activated equally when we carry out an action and when we see another carry it out (so that in the process we can almost literally be said to share one another’s bodily experience and inhabit one another’s bodies); a process, finally, that anthropologists see as derived from music, in turn an extension of grooming, which binds us together as physically embodied beings through a form of extended body language that is emotionally compelling across a large number of individuals within the group.”

Both Everett and McGilchrist are concerned with the evolution and development of language. They see it as inseparable from the embodied mind and the enculturated self. As Everett discusses the importance of gesture, McGilchrist explores the role of music and poetry. There is a strong argument that non-linguistic communication (gesture and/or poetry-music) was well established and highly effective among the earliest hominids, including pre-linguistic homo sapiens. It seems likely that this was the base upon which was built language as we know it.

Jaynes argues that written language was one of the factors that weakened the bicameral mind, a particular pre-egoic bundle theory. Prior to that, oral culture dominated; and in oral culture, language is intertwined with other aspects of human experience and behavior. Some of the evidence supporting this is how ancient humans sometimes spoke of body parts as having their own minds, a way of talking that continued into the late Axial Age, as in the New Testament canon, where hands and eyes aren’t necessarily considered part of an integrally whole self (and it should be noted that the New Testament tradition was passed on orally for a number of generations before being written down). This is an experience still spoken of by some of those with schizophrenia and dissociative identity disorder. Even otherwise normal people will have voice-hearing experiences where the voices heard aren’t located in the head, but sometimes in or around other parts of the body.

Most of human cognition and behavior is unconscious. The same goes for most of human communication and much of that non-conscious communication is also non-linguistic. This is the bodily or embodied unconscious. This relates to the social nature of our psyches, as with rapport where people mimic each other unawares (gestures, posture, breathing, etc) along with how yawns and laughter can be contagious. What I’m wondering about is how does the body-mind create rapport with itself in order to coordinate its vast multitudinous complexity.

Because of hemispheric divisions, for example, parts of the mind act rather independently. The corpus callosum doesn’t just allow the hemispheres to communicate, for it also inhibits and restricts that communication, in ways and for reasons we don’t yet fully understand. Even when the corpus callosum is entirely cut, making direct neurological communication impossible, the two hemispheres are able to coordinate behavior such that a person appears normal, even as two separate minds seem to be operating within the skull. Without directly communicating with one another, how do the hemispheres accomplish this?

The simplest answer is that both hemispheres have access to the sensory organs on the opposite side of the body and so can indirectly observe what the other hemisphere is doing (and, in the case of the left hemisphere, hear its explanations). But interestingly, the two divided hemispheres can come to different conclusions based on their separate input and processing. They can also act independently, a literal scenario of the left hand not knowing what the right hand is doing.

Here is a different kind of example from Everett (Kindle Locations 5071-5076):

“At age nineteen, IW suddenly lost all sense of touch and proprioception below the neck due to an infection. The experiments conducted by McNeill and his colleagues show that IW is unable to control instrumental movements when he cannot see his hands (though when he can see his hands, he has learned how to use this visual information to control them in a natural-appearing manner). What is fascinating is that IW, when speaking, uses a number of (what IW refers to as) “throwaway gestures” that are well coordinated, unplanned, nonvisually reliant, speech-connected gestures. McNeill concludes that at a minimum, this case provides evidence that speech gestures are different from other uses of the hands— even other gesturing uses of the hands.”

So, gestures are connected to speech. And gestures happen spontaneously. But even without proprioception, other senses can be used to bridge the gap between conscious and unconscious expression. There are clearly different areas of behavior, cognition, and communication that relate in different ways. We are embodied minds, and we know our minds through our bodies. And most of what our mind does is never accessed or controlled by consciousness. As research has shown, consciousness often only plays a role after behavior has already been initiated (less a power of will than a power of won’t).

So, what kind of mind is it that we have or rather that has us?

Orderliness and Animals

There is another example that demonstrates the conservative mind. It comes from my parents, as did the last one I discussed. This one is also about the conservative relationship to animals.

My parents have a lovable fat cat, Sam. He is getting old, and caring for him requires more effort than it used to. This past year he was diagnosed with diabetes, and he has to have an insulin shot twice a day, which makes traveling anywhere difficult.

There are always clear rules in my parents’ house, the way things are supposed to be done and what is not allowed. This was true when I was a kid. And it still is true for Sam who lives under their roof. One of those rules is that cats are only allowed on particular pieces of furniture, such as the furniture in the basement and footstools on the main floor. But Sam has a fondness for a couple of chairs he isn’t supposed to be on.

Just the other day he barfed on the chair. It’s a high-quality chair that was expensive. My parents have had it for a long time, and it matches the way they have their house decorated. The cat barf doesn’t seem to clean up, or else some of the dye came out of the fabric. This is unacceptable, as this chair is directly where they entertain guests.

I could see how upset my mother was. Sam then barfed in some other places as well. One of those places was a silk rug. My parents wouldn’t normally buy a rug made out of silk, but they didn’t realize that’s what it was when they bought it. The barf came out of the rug fine, but it added to the stress.

This made me think of a couple of things.

My parents always threatened that any pet that caused too much trouble would be gotten rid of. They like Sam, as they’ve liked other pets we’ve had, but my parents aren’t bleeding-heart liberals. They wouldn’t feel the kind of sadness I’d feel by putting down an animal. They, in particular my mother, have a more practical view of pet ownership and death. Their attitude about such things is very much an expression of a thick boundary. It’s easier for them to cut off emotion, specifically as compared to my namby-pamby soft heart.

The other thing about the thick boundary type is the need for orderliness. My parents go to great effort to create and maintain an orderly house. Not just clean but also well decorated, well organized, and generally well kept. Nothing broken, not even a burned-out light bulb, is likely to remain that way for very long. In the middle of a conversation, my mother will start wiping counters that don’t even look dirty.

A pet, like a child, is a potential agent of disorder. My parents are fine with pets and children, as long as they are well-behaved. But a pet, in particular, is secondary to the home itself. A cat that adds to the good feeling of a home is allowed, but if the cat detracts it might quickly wear out its welcome.

My parents have an idea of what a house and a home should be like. It’s a very specific vision built on a conservative worldview and conservative social norms. If you watch a Hallmark movie or an early black-and-white sitcom, you know the guiding vision of this conservative attitude, expressing a desire to fit in and be normal. Rules are put in place to ensure this is maintained.

None of this is a judgment of conservative-mindedness. Nor is this the only way conservative-mindedness can be acted on. For some conservatives, a sense of loyalty to a pet such as a dog might override orderliness, or else the kind of order considered the norm might be far different. My parents are filtering their conservative-mindedness through a particular middle-class attitude, specifically as idealized in mainstream culture and as seen in mainstream media. A working-class conservative, however, might conform to some other social norm, such as keeping religious paraphernalia in a particular way or having regularly cooked family meals. But however it is perceived and given form, one thing that conservative-mindedness strongly correlates with is orderliness.

What is clear is that, for conservatives, the social order is prioritized. This is true of both the larger sense of order in a society or as defined in ideological worldviews and the smaller sense of order in a personal living space or an office. Order is greater than the individual or, pushed to the extreme, there is no individual outside the order. One way or another, individuals are expected to conform to the order rather than the order being structured to conform to individuals. It’s the job of the individual to remain in the place allotted to them and to follow the role demanded of them, or else to work hard and compete for the opportunity to gain a new social position, which then would require new expectations and norms to be accepted.

On the other hand, a strongly liberal-minded person would have a less clear-cut or more malleable sense of order. If the cat kept getting on furniture and barfing, the liberal-minded would tend toward arranging the house to accommodate the cat. Liberal-mindedness also correlates with a weaker sense of disgust, and so occasional barf wouldn’t be as bothersome and distressing. Of course, it depends on how liberal-minded a person is. Many self-identified liberals aren’t strongly liberal-minded in all or even most ways, and so such liberals might take a more conservative-minded attitude about order and cleanliness.

This doesn’t seem all that important on a personal level. How someone wants to maintain their house is a personal issue, since it doesn’t generally affect others. Whether you have barfy animals in a cluttered house or the opposite, it is mostly irrelevant in the big picture. But these personal attitudes are inseparable from our social and political opinions.

This relates to an insight I had many years ago. The abortion issue isn’t about the overt issue itself. The whole debate is ultimately about the question of social order. Conservatives wouldn’t support liberal policies, even if it meant that the abortion rate would be lower than under conservative policies. The reason is that the social order around relationships, sexuality, and family values is more important than even the lives of fetuses.

Someone who gets pregnant, to the conservative mind, must suffer the consequences. It is irrelevant how actual people act in the real world, such that abortion bans lead not to fewer abortions but simply to an increased rate of illegal abortions. That is irrelevant, for those who are harmed by botched illegal abortions would be getting the punishment they deserve. If they were a good person, they wouldn’t be having sex when they don’t want kids. And if they were a good person who did have sex, they would take responsibility by allowing the pregnancy to go to term and then raising the child. The conservative social order never fails, for it is individuals who fail the conservative social order, which in no way disproves or invalidates it.

Order is at the heart of the conservative worldview. More than anything else, this is what motivates conservative-mindedness. Through the lens of a thick boundary, there is right and wrong that must be defended even at high costs. The greater the conservative-mindedness the greater the willingness to enforce those costs, even when it is personally harmful. Psychological research shows that a fair number of people, presumably the most conservative-minded, are willing to punish those who break social norms even when it doesn’t personally benefit the punisher. Maintaining the social order is worth it, within a certain worldview.

It’s important to keep in mind, though, that few people are at either extreme of conservative-mindedness or liberal-mindedness. Most people want some social order, but most people also have clear limits to how far they will go in enforcing a social order. The average person can switch between these mindsets, to varying degrees and according to different situations.

That is true of my parents. As conservatives go, they are actually quite liberal-minded. Even though they strongly prefer order, they aren’t willing to enforce it at any cost. They have their breaking point where order would come to the forefront and be prioritized over all else, but they would have to be pushed fairly far before they got to that point. Sam would have to destroy some other pieces of furniture and cause other problems as well before they finally got around to getting rid of him, which at his age would mean putting him down. Plus, my parents have softened quite a bit with age and so have become more tolerant, one might say more liberal-minded. Still, this kind of thing bothers them in a way it would less likely bother someone much further up the scale of liberal-mindedness.

Plus, my parents know that I love Sam and would be heartbroken if they put him down. Family is important to conservatives. With that in mind, my parents realize keeping Sam around is a way to get me to visit more often. They are manipulating my soft liberal-mindedness, not that I mind.

Development of Language and Music

Evidence Rebuts Chomsky’s Theory of Language Learning
by Paul Ibbotson and Michael Tomasello

All of this leads ineluctably to the view that the notion of universal grammar is plain wrong. Of course, scientists never give up on their favorite theory, even in the face of contradictory evidence, until a reasonable alternative appears. Such an alternative, called usage-based linguistics, has now arrived. The theory, which takes a number of forms, proposes that grammatical structure is not in­­nate. Instead grammar is the product of history (the processes that shape how languages are passed from one generation to the next) and human psychology (the set of social and cognitive capacities that allow generations to learn a language in the first place). More important, this theory proposes that language recruits brain systems that may not have evolved specifically for that purpose and so is a different idea to Chomsky’s single-gene mutation for recursion.

In the new usage-based approach (which includes ideas from functional linguistics, cognitive linguistics and construction grammar), children are not born with a universal, dedicated tool for learning grammar. Instead they inherit the mental equivalent of a Swiss Army knife: a set of general-purpose tools—such as categorization, the reading of communicative intentions, and analogy making, with which children build grammatical categories and rules from the language they hear around them.

Broca and Wernicke are dead – it’s time to rewrite the neurobiology of language
by Christian Jarrett, BPS Research Digest

Yet the continued dominance of the Classic Model means that neuropsychology and neurology students are often learning outmoded ideas, without getting up to date with the latest findings in the area. Medics too are likely to struggle to account for language-related symptoms caused by brain damage or illness in areas outside of the Classic Model, but which are relevant to language function, such as the cerebellum.

Tremblay and Dick call for a “clean break” from the Classic Model and a new approach that rejects the “language centric” perspective of the past (that saw the language system as highly specialised and clearly defined), and that embraces a more distributed perspective that recognises how much of language function is overlaid on cognitive systems that originally evolved for other purposes.

Signing, Singing, Speaking: How Language Evolved
by Jon Hamilton, NPR

There’s no single module in our brain that produces language. Instead, language seems to come from lots of different circuits. And many of those circuits also exist in other species.

For example, some birds can imitate human speech. Some monkeys use specific calls to tell one another whether a predator is a leopard, a snake or an eagle. And dogs are very good at reading our gestures and tone of voice. Take all of those bits and you get “exactly the right ingredients for making language possible,” Elman says.

We are not the only species to develop speech impediments
by Moheb Costandi, BBC

Jarvis now thinks vocal learning is not an all-or-nothing function. Instead there is a continuum of skill – just as you would expect from something produced by evolution, and which therefore was assembled slowly, piece by piece.

The music of language: exploring grammar, prosody and rhythm perception in zebra finches and budgerigars
by Michelle Spierings, Institute of Biology Leiden

Language is a uniquely human trait. All animals have ways to communicate, but these systems do not bear the same complexity as human language. However, this does not mean that all aspects of human language are specifically human. By studying the language perception abilities of other species, we can discover which parts of language are shared. It is these parts that might have been at the roots of our language evolution. In this thesis I have studied language and music perception in two bird species, zebra finches and budgerigars. For example, zebra finches can perceive the prosodic (intonation) patterns of human language. The budgerigars can learn to discriminate between different abstract (grammar) patterns and generalize these patterns to new sounds. These and other results give us insight into the cognitive abilities that might have been at the very basis of the evolution of human language.

How Music and Language Mimicked Nature to Evolve Us
by Maria Popova, Brain Pickings

Curiously, in the majority of our interaction with the world, we seem to mimic the sounds of events among solid objects. Solid-object events are comprised of hits, slides and rings, producing periodic vibrations. Every time we speak, we find the same three fundamental auditory constituents in speech: plosives (hit-sounds like t, d and p), fricatives (slide-sounds like f, v and sh), and sonorants (ring-sounds like a, u, w, r and y). Changizi demonstrates that solid-object events have distinct “grammar” recurring in speech patterns across different languages and time periods.

But it gets even more interesting with music, a phenomenon perceived as a quintessential human invention — Changizi draws on a wealth of evidence indicating that music is actually based on natural sounds and sound patterns dating back to the beginning of time. Bonus points for convincingly debunking Steven Pinker’s now-legendary proclamation that music is nothing more than “auditory cheesecake.”

Ultimately, Harnessed shows that both speech and music evolved in culture to be simulacra of nature, making our brains’ penchant for these skills appear intuitive.

The sounds of movement
by Bob Holmes, New Scientist

It is this subliminal processing that spoken language taps into, says Changizi. Most of the natural sounds our ancestors would have processed fall into one of three categories: things hitting one another, things sliding over one another, and things resonating after being struck. The three classes of phonemes found in speech – plosives such as p and k, fricatives such as sh and f, and sonorants such as r, m and the vowels – closely resemble these categories of natural sound.

The same nature-mimicry guides how phonemes are assembled into syllables, and syllables into words, as Changizi shows with many examples. This explains why we acquire language so easily: the subconscious auditory processing involved is no different to what our ancestors have done for millions of years.

The hold that music has on us can also be explained by this kind of mimicry – but where speech imitates the sounds of everyday objects, music mimics the sound of people moving, Changizi argues. Primitive humans would have needed to know four things about someone moving nearby: their distance, speed, intent and whether they are coming nearer or going away. They would have judged distance from loudness, speed from the rate of footfalls, intent from gait, and direction from subtle Doppler shifts. Voila: we have volume, tempo, rhythm and pitch, four of the main components of music.

Scientists recorded two dolphins ‘talking’ to each other
by Maria Gallucci, Mashable

While marine biologists have long understood that dolphins communicate within their pods, the new research, which was conducted on two captive dolphins, is the first to link isolated signals to particular dolphins. The findings reveal that dolphins can string together “sentences” using a handful of “words.”

“Essentially, this exchange of [pulses] resembles a conversation between two people,” Vyacheslav Ryabov, the study’s lead researcher, told Mashable.

“The dolphins took turns in producing ‘sentences’ and did not interrupt each other, which gives reason to believe that each of the dolphins listened to the other’s pulses before producing its own,” he said in an email.

“Whistled Languages” Reveal How the Brain Processes Information
by Julien Meyer, Scientific American

Earlier studies had shown that the left hemisphere is, in fact, the dominant language center for both tonal and atonal tongues as well as for nonvocalized click and sign languages. Güntürkün was interested in learning how much the right hemisphere—associated with the processing of melody and pitch—would also be recruited for a whistled language. He and his colleagues reported in 2015 in Current Biology that townspeople from Kuşköy, who were given simple hearing tests, used both hemispheres almost equally when listening to whistled syllables but mostly the left one when they heard vocalized spoken syllables.

Did Music Evolve Before Language?
by Hank Campbell, Science 2.0

Gottfried Schlaug of Harvard Medical School does something a little more direct that may be circumstantial but is a powerful exclamation point for a ‘music came first’ argument. His work with patients who have suffered severe lesions on the left side of their brain showed that while they could not speak – no language skill as we might define it – they were able to sing phrases like “I am thirsty”, sometimes within two minutes of having the phrase mapped to a melody.

Chopin, Bach used human speech ‘cues’ to express emotion in music
by Andrew Baulcomb, Science Daily

“What we found was, I believe, new evidence that individual composers tend to use cues in their music paralleling the use of these cues in emotional speech.” For example, major key or “happy” pieces are higher and faster than minor key or “sad” pieces.

Theory: Music underlies language acquisition
by B.J. Almond, Rice University

Contrary to the prevailing theories that music and language are cognitively separate or that music is a byproduct of language, theorists at Rice University’s Shepherd School of Music and the University of Maryland, College Park (UMCP) advocate that music underlies the ability to acquire language.

“Spoken language is a special type of music,” said Anthony Brandt, co-author of a theory paper published online this month in the journal Frontiers in Cognitive Auditory Neuroscience. “Language is typically viewed as fundamental to human intelligence, and music is often treated as being dependent on or derived from language. But from a developmental perspective, we argue that music comes first and language arises from music.”


How Brains See Music as Language
by Adrienne LaFrance, The Atlantic

What researchers found: The brains of jazz musicians who are engaged with other musicians in spontaneous improvisation show robust activation in the same brain areas traditionally associated with spoken language and syntax. In other words, improvisational jazz conversations “take root in the brain as a language,” Limb said.

“It makes perfect sense,” said Ken Schaphorst, chair of the Jazz Studies Department at the New England Conservatory in Boston. “I improvise with words all the time—like I am right now—and jazz improvisation is really identical in terms of the way it feels. Though it’s difficult to get to the point where you’re comfortable enough with music as a language where you can speak freely.”

Along with the limitations of musical ability, there’s another key difference between jazz conversation and spoken conversation that emerged in Limb’s experiment. During a spoken conversation, the brain is busy processing the structure and syntax of language, as well as the semantics or meaning of the words. But Limb and his colleagues found that brain areas linked to meaning shut down during improvisational jazz interactions. In other words, this kind of music is syntactic but it’s not semantic.

“Music communication, we know it means something to the listener, but that meaning can’t really be described,” Limb said. “It doesn’t have propositional elements or specificity of meaning in the same way a word does. So a famous bit of music—Beethoven’s dun dun dun duuuun—we might hear that and think it means something but nobody could agree what it means.”

 

Introverted Delights

I’ve been watching Westworld. It’s my favorite show at the moment. That is saying a lot, considering its competition. The second season of The Man in the High Castle is about to come out, based on a novel I love by my favorite fiction writer. And the always entertaining Game of Thrones will be returning soon. But neither of those shows competes with Westworld.

Westworld is popular. But even though it has higher viewer ratings than Game of Thrones, it has much more mixed reviews. It’s such a complex show. The plotlines of Westworld are immensely more complicated than the sprawling narrative world of Game of Thrones. This makes it all the more impressive that it is so popular.

Some people see it as too cerebral. I wonder why that is. There is more emotional depth to this show in many ways than to a show like Game of Thrones, which is focused so much on the physical action of fighting, on political machinations and worldly power. The inner experience of Westworld characters is conveyed to a much greater extent. Maybe that is what is difficult for some people, specifically extraverts.

Westworld, despite the outward action and adventure of the virtual world portrayed, is ultimately a show maybe best appreciated by an introvert. So many of the main characters on the show seem rather inwardly drawn and guarded about their most personal experience, which is unusual for mainstream action-oriented sci-fi. The point of the entire show revolves around growing self-awareness and the strengthening of an inner voice, the kind of thing that preoccupies introverts.

Some people wonder what is the point of all the convoluted plotlines, multitudinous cultural references, and in-show commentary of obscure ideas. Also, there is the simultaneous celebration and questioning of genre tropes. Is it embracing “guns and tits and all that mindless shit”? Or is the entire show a criticism of that, an exploration of what it means for our humanity? Maybe both. From my perspective, that just makes the show more interesting. But the basic show can be enjoyed on a much simpler level, even ignoring the sex and violence, as much of the character development is fairly straightforward. The motivation of characters is revealed as the show goes on, assuming enough imagination and curiosity pulls you in to follow the characters on their path of emergence.

The tricky part is that the identities of the characters aren’t immediately apparent, emerging only as their pasts are revealed. It’s a slow reveal, glimpses of a murky past gradually coming into focus. The exploration of motivation is a learning experience as much for the characters themselves as for the viewers. We are meant to identify and empathize with the characters as individuals and not merely to be caught up in their actions and relationships with other characters.

This requires of the viewer both patience and immersion, along with suspension of disbelief about the entire fictional world. It’s an act of imaginative speculation taken to an extreme degree, an attempt to bring us, the viewers, into the borderlands of consciousness and of humanity. Some people have more tolerance than others for that kind of thing, but this is what the best sci-fi is able to achieve. That is what the producers of Westworld have been attempting, it being fair game to argue over how well they achieved it. Still, no matter how well done, these themes aren’t exactly of mainstream interest. Most viewers probably just want to see robots revolting and, for those folk, the show does deliver on that promise.

Still, Westworld is constrained by the sub-genre it belongs to. There is a central element of dark mystery and claustrophobic focus that is typical of gritty neo-noir, always leaving certain things unseen and unexplained. Take the slow burn of Blade Runner, exaggerate and complicate it, spread it across an entire show series with no linear plotline or single dominant protagonist, and that is what you get with Westworld. This isn’t a world-building exercise like some traditional fantasy and space operas where every detail is articulated and the background fully described. Everything in the narrative revolves around the characters and about what it means to be human.

This season introduced the individuals and their place in the world. The exploration of the larger world, if it is to happen, will be developed in the next season. The hosts, having gained consciousness, will no longer be trapped in voice commands, character scripts, and narrative loops. The inward focus likely will turn ever more outward, as the hosts try to grasp what kind of world they find themselves in. That is the natural progression of emerging consciousness, whether for a child or an android.

Conscious Dreaming, Conscious Self

Children’s Dreaming and the Development of Consciousness
by David Foulkes

dreaming as we normally understand it–active stories in which the dreamer is an actor–appears relatively late in childhood. This true dreaming begins between the ages of 7 and 9. He argues that this late development of dreaming suggests an equally late development of waking reflective self-awareness.

What Little Kids See When They Dream
from Happiest Baby

Understandably, dreams can confuse small kids. Pre-schoolers often think their dreams are magically placed in their heads by someone else, or by God. […] Are you wondering what your kids are doing in their dreams? Good question, but the answer is…nothing! The “character of the self” hasn’t even made an appearance yet! […] Generally around age 8, children appear as central characters in their dreams. Dream narratives become more complex and longer. Not only do kids participate in the action as it unfolds, they also have thoughts and feelings within the dream.

What Do Babies Dream About?
by Natalie Wolchover, Live Science

According to research by Foulkes and his colleagues, even children at the ripe old age of 4 or 5 typically describe dreams that are static and plain, with no characters that move or act, few emotions and no memories.

Vivid dreams with structured narratives set in at age 7 or 8, around the same time children develop a clear understanding of their own identity. Researchers think self-awareness is necessary for the insertion of the self into dreams. In fact, the amount of self-knowledge a child possesses — her understanding that she would be the same person even if she had a different name, for instance, and that she is the same person as she was when she was a baby — strongly correlates with the vibrancy and amount of plot structure in that child’s dreams.

Dreaming and Narration
by Richard Walsh, LHN

The notion of the dream as itself narrative appears to conflate perceptual consciousness of the “facts” of the dream with reflective consciousness about the dream.

In the Freudian model, the dream gives expression to prior, unconscious dream thoughts (Freud [1900] 1953). From a neurobiological perspective, however, there is no further regression of meaning, because dreams arise from the activation of the forebrain by periodic neuronal activity in the brain stem (Hobson & McCarley 1977). Such brain activity during sleep may be random or part of some adaptive process associated with that of sleep itself; the inception of dream mentation is just a by-product in this account. All the remarkable coherence of dreams is attributed to the mind’s subsequent cognitive efforts of synthesis, drawing upon the narrative sense-making capacities of waking life (Hobson 2002). Cognitive models of dreaming have more to say about the functioning of such sense-making processes, however. They too regard narrativizing as integral to the formation of dreams, but note that this should not be taken for granted; our storytelling capabilities develop in the course of childhood, and this development correlates with the development of children’s dreams (Foulkes 1999). Narrative logic, here, is not a given; instead, cognitive accounts foreground the creativity of dreams—their status, that is, not just as narratives but as fictions. Such approaches conceive the motive forces of dreaming as continuous with those of waking thought, whether the emphasis falls upon imaginative world-making (States 2003) or on the articulation of emotion (Hartmann 2010b).

Science: Julian Jaynes
by Josh Ronsen, monk mink pink punk

The most interesting, to me, paper concerned Agamemnon’s dream in the Iliad, how this dream mirrors the structure of the Bicameral Mind and how it differs from our dreams. In Bicameral dreams, and Jaynes admits there are not that many dreams from this time period to analyze, the dreamer is never anywhere other than his sleeping area, and the dream is always a direct message from a god/angel. Jacob’s “ladder” dream from the Jewish Bible fits in here as well. Compare this with our dreams, which can take place anywhere within the limits of our imagination, just as our consciousness can be projected throughout those same limits.

Bicameral Dream question
from Julian Jaynes Society Discussion Forum

Jaynes believes modern dreams are consciousness operating in sleep. We see elements of waking consciousness in dreams such as an analog ‘I’ narratizing in a mind-space. In cases where the dreamer simply experiences a visitation from a spirit or god issuing a command while asleep in his own bed, this aspect of consciousness is absent — i.e., the person does not see themselves as an actor in their dreams. So dreams do not “prove” but rather provide further evidence for a different pre-conscious mentality. We see these types of visitation dreams in ancient civilizations, pre-literate societies, and in children. As children develop consciousness, we see consciousness expand in their dreams.

“Primitive Mentality” by Lucien Lévy-Bruhl & Jaynes’ Theory
from Julian Jaynes Society Discussion Forum

In Chapter 3, Levy-Bruhl discusses the prophetic nature of dreams among tribal people.

“To the primitive mind, as we know, the seen and the unseen worlds form but one, and there is therefore uninterrupted communication between what we call obvious reality and the mystic powers. Nowhere perhaps is this more directly and completely brought about than in dreams, in which man passes from the one world to the other without being aware of it. Such is in fact the ordinary idea of the dream to primitive peoples. The ‘soul’ leaves its tenement for the time being. It frequently goes very far away; it communes with spirits or with ghosts. At the moment of awakening it returns to take its place in the body once more. … At other times, it is the spirits of the dead, or even other powers, which come and visit the soul in sleep” (pgs. 98–99).

This immediately calls to mind E.R. Dodds’ discussion of the prophetic nature of dreams among the ancient Greeks. Dreams in ancient Greece, unlike modern, conscious dreams, often took the form of a visitation by a god or spirit that issued some form of command. […]

There seems to be strong evidence for the very different nature of what we might call “bicameral dreams” vs. “conscious dreams.” For those interested in this subject, I highly recommend reading Lévy-Bruhl’s entire chapter on dreams along with Dodds.

More on the commanding nature of primitive people’s dreams:

“It frequently happens that when all the missionary’s efforts to induce a native to change his faith have proved ineffectual, a dream suddenly determines him to take the step, especially if the dream is repeated several times. For example, among the Basutos, ‘what plays the chief part in the conversion of the Mosuto? … The paramount role is played by the dream. … To make him definitely decide, there must be something out of the common, a Divine intervention (as he regards it) which strikes his imagination. … If you ask a heathen who has heard the Gospel, when he will be converted, he will answer in the most matter-of-course way: ‘When God speaks to me'” (p. 110).

“In Central Africa, dreams have similar meanings. To give but one example: “The Azande of the Upper Congo believe that during the night the dead make their wishes known to the living. Dreams are quite authentic to them, and they are convinced that when they see a dead relative in a dream they really have a conversation with his ghost, and in its course he gives advice, expresses satisfaction or displeasure, and states his aspirations and desires” (pgs. 111–112).

“‘The Iroquois,’ says another Jesuit priest, ‘have, strictly speaking, but one divinity, which is the dream; they submit to it and follow all its orders most implicitly.’ … It is not simply a question of advice, hints, friendly suggestions, official warnings conveyed by dreams; it is nearly always definite orders, and nothing can prevent the Indian from obeying them” (p. 113).

“The Greeks and the Irrational” by E.R. Dodds
from Julian Jaynes Society Discussion Forum

Chapter 4 describes the nature of dreams among the ancient Greeks and how dreams change as culture [or consciousness] changes. Dodds describes what Jaynes would probably refer to as “bicameral dreams” — dreams that consist of a visitation and the communication of some type of message or command.

“Ancient literature is full of these ‘godsent’ dreams in which a single dream-figure presents itself, as in Homer, to the sleeper, and gives him prophecy, advice, or warning” (p. 107).

“Such dreams played an important part in the life of other ancient peoples, as they do in that of many races to-day. Most of the dreams recorded in Assyrian, Hittite, and ancient Egyptian literature are ‘divine dreams’ in which a god appears and delivers a plain message to the sleeper, sometimes predicting the future, sometimes demanding cult” (pgs. 108-109).

On the frequency of hallucinations and visions:

“As I have mentioned self-induced visions in connection with the Asclepius cult, I may add a couple of general remarks on waking visions or hallucinations. It is likely that these were commoner in former times than they are to-day, since they seem to be relatively frequent among primitives; and even with us they are less rare than is often supposed. They have in general the same origin and psychological structure as dreams, and like dreams they tend to reflect traditional culture-patterns. Among the Greeks, by far the commonest type is the apparition of a god or the hearing of a divine voice which commands or forbids the performance of certain acts. This type figures, under the name of ‘spectaculum,’ in Chalcidius’ classification of dreams and visions; his example is the daemonion of Socrates. When all allowance has been made for the influence of literary tradition in creating a stereotyped form, we should probably conclude that experiences of this kind had once been fairly frequent, and still occurred occasionally in historical times” (pgs. 116-117).

Consciousness and Dreams
by Marcel Kuijsten, Julian Jaynes Society

The study of dreams in ancient civilizations and pre-literate societies demonstrates that dreams can be used as an indication of the level of consciousness in a given culture. Similarly, children’s dreams provide evidence that dreams can be used as an indication of the level of consciousness in a developing child. In Children’s Dreaming and the Development of Consciousness (2002), child psychologist and dream expert David Foulkes challenges the popular misconception that dreaming is “a given” in human experience. In a section on the development of consciousness in children that sounds surprisingly reminiscent of Jaynes, Foulkes writes: “I hypothesize that dreaming is simply the operation of consciousness in sleep … that consciousness develops, and that it does so more slowly and later than is generally believed” (Foulkes, 2002).

According to Foulkes, the nature and content of children’s dreams change dramatically over time. For example, during the preschool years, “dreams are brief and infrequent; they focus on body states; their imagery is static.” Dreams slowly transform into those experienced in adulthood between the ages of 5 and 9:

First, dream reports become longer, but not more frequent, and now describe social interaction and the kind of movement that suggests kinematic rather than static imaging; still lacking, however, is active participation in dream events by the dreamer herself or himself. Next, dream reports become more frequent as well as longer and narratively more complex, and active self-participation becomes a general possibility, along with, for the first time, the reliable attribution to the self of feelings and thoughts occurring in the dream in response to dream events (Foulkes, 2002).

The dreamer does not regularly appear as an active participant in his or her dreams — according to Jaynes, one of the hallmarks of conscious dreams — until between the ages of 7 and 9. Conscious dreams, therefore, seem to be infrequent until some time after the child has developed consciousness in waking life.

The content of dreams provides another method to gauge the level of consciousness in a given culture or individual. If language had no effect on consciousness — or if consciousness developed far back in our evolutionary past and has remained unchanged since — we would expect dreams to remain unchanged both throughout recorded history and throughout an individual’s development. Instead, dreams reflect developmental stages in mentality from preconscious to conscious, brought about by changes both in culture and in the linguistic sophistication of the dreamer.

Dreams in bicameral cultures lack consciousness (an analog ‘I’ narratizing in a mind-space) and mimic the waking experience of receiving behavioral commands from gods. In contrast, the dreams of conscious individuals reflect conscious narratization during sleep.

Midrange Pathology

“I want to mention, almost as an aside, another side of this issue of normalcy. There seems to be one variant of midrange pathology, not extensively studied or well understood (especially in developmental terms), that blends into the woodwork, so to speak, and is difficult to discern. I am referring to persons who one might say are excessively “normal”. These people, called by some workers “normopaths,” “anti-analysands,” “robot analysands,” or “pseudonormals” suffering from a “normotic illness,” have been described by various authors, including Bollas (1989), McDougall (1980, especially chap. 13; 1985, p. 156), and McWilliams and Lependorf (1990).

“These are persons who, when looked at superficially or casually seem to function adequately but who on deeper, more careful examination are seen actually to be drastically cut off from their affective lives, and the result is a peculiar, horrifying “normality.” When one becomes sensitized to this other pole of severe pathology, one sees how prevalent it is in the “normal” population: “The fundamental identifying feature of this individual is his disinclination to entertain the subjective element in life, whether it exists inside himself or in the other” (Bollas, 1989, p. 319; from “Normotic Illness” in Fromm & Smith (eds.), The Facilitating Environment, pp. 317-44). The author goes on to present an evocative and chilling description of the normotic personality; it sounds like an apt description of a significant part of our population.

“As far as I know, this class of persons has not been studied diagnostically by means of mainstream frameworks and instruments (e.g., the MMPI, behavioral checklists), but I would not be surprised if these kinds of people would appear to be just fine when evaluated by such surface-oriented, structured tools. […] In sum, the possibility is very real that in empirical or experimental studies, the control group of “normals” is itself significantly pathological.”

Substance Abuse as Symptom
by Louis S. Berger
pp. 121–22

We Are Empathy

Recall how Ptolemy used epicycles to accurately predict the movements of objects in the sky, yet he had no clue about the actual nature of those movements. We’re still in the Ptolemaic phase of social science.

Paul Bloom had an article come out in the WSJ today, The Perils of Empathy. It’s on the limitations and problems of empathy, a topic he has been writing about for years (it’s not even his first WSJ article about it).

The above quote is from the comments section, a response posted by Anthony Cusano, and it captures my own basic thought. As others noted, Bloom’s understanding of empathy is limited, so it’s unsurprising that his conclusion is equally limited. The problem is Bloom’s own confusion, based on narrow research and simplistic analysis.

There isn’t much point in analyzing the article itself. But I realize that such articles have immense influence given the platform. I’m always surprised that someone like Bloom, a respected Ivy League academic and professor, would have such a superficial grasp. I’d like to think that Bloom realizes it’s more complex and is using rhetoric to make a point, not that this generous interpretation makes it any better.

Even though I love social science, this demonstrates a constant danger in trying to make sense of the research produced. Evidence is only as good as the frame used to interpret it.

Bloom is mixing up the rhetoric, perception, and experience of empathy. He treats empathy as something rather simple, maybe confusing it with mere sympathy. And he does this by ignoring most of what empathy consists of, such as cognitive empathy. Along with many of his allies and critics, he never puts it into its largest context. Human civilization would never exist without human empathy. This is because humanity is inseparable from empathy, as we are inherently a social species and there is no sociality without empathy.

There isn’t any grand significance in my writing specifically about Bloom’s article. The main thing wasn’t what was in it but what was left out of it.

The last thing I wrote earlier in the week was about the hive mind in terms of entrainment. There would be no human families, groups, social identities, communities, nations, etc without empathy. None of this is solely or even primarily dependent on empathy as direct emotionality and personal sympathy. An army marching has a shared identity that doesn’t require any given soldier to empathize with any other individual soldier, much less every single soldier. The empathy is with a sense of group identity that transcends all individuality. The soldiers in marching form grok this collective identity as a muscular bonding that, in the moment, is as real as their own bodies.

Empathy is the foundation and essence of everything that is human. It precedes and encompasses every other aspect of our humanity, including rational compassion. Posing empathy as a choice is irrelevant. There is no choice. Empathy just is, whether or not we use it well. We can’t objectively study empathy because we can’t separate ourselves from it. There is no outside perspective.

Let me conclude with some words of wisdom, “We are Groot.”

* * *

I Could Say that Paul Bloom is a Callous Idiot, But I Empathize With Him…
by Nathan J. Robinson, The Navel Observatory

Thinkfluence Man Pretends To Think Empathy Is Bad
by Albert Burneko, The Concourse

Why Paul Bloom Is Wrong About Empathy and Morality
by Denise Cummins, Psychology Today

The one thing that could save the world: Why we need empathy now more than ever
by Roman Krznaric, Salon

Welcome to the empathy wars
by Roman Krznaric, Transformation

Can You Run Out of Empathy?
by C. Daryl Cameron, Berkeley

Understanding is Inherent to Empathy: On Paul Bloom and Empathy
by Jeremiah Stanghini, blog

What’s So Funny ‘Bout Peace, Love and (Empathic) Understanding
by John Payne, EPIC

What is the Blank Slate of the Mind?

In Dark Matter of the Mind, Daniel Everett contrasts Plato and Aristotle. He sides with the latter, specifically in terms of a blank slate view of the human mind. But most people wouldn’t understand what is meant by a blank slate in this context. He explains that (Kindle Locations 1140-1143),

Like Aristotle, Locke did not believe that the absence of knowledge on a tablet means that the tablet has no other properties. It has the capacity to receive and store information and more. Neither philosopher thought of the tabula rasa as devoid of capacity to be written on, not even of capacity to write upon itself. In my reading, they meant by tabula rasa not that there were no innate abilities, but that there were no innate specific concepts.

It is hard to grasp the exact distinction. It’s not an argument that nothing is preexisting. All it means is that nothing is predetermined, nothing already formed (i.e., Platonic forms). So, what exactly might be already present at birth and innate to all human minds?

I’m still not entirely sure about Everett’s answer to that question. He is critical of someone like Jung, based on the claim of Platonic error or overreach. Here is his description (Kindle Locations 971-973):

Jung (1875– 1961), another of the leading dark matter theorists in the Platonic tradition, was the founder of “analytical psychology” (Jung [1916] 2003). Fundamental to this form of therapy and the theory behind it was, again, Bastian’s elementary ideas, which Jung reconceived as the “collective unconscious,” that is, innate tacit information common to all humans.

It’s the last part that is relevant, “innate tacit information common to all humans”. But is that an accurate interpretation of Jung? Let’s turn to Jung’s explanation of his own view (“Concerning the Archetypes with Special Reference to the Anima Concept”):

It is in my view a great mistake to suppose that the psyche of a new-born child is a tabula rasa in the sense that there is absolutely nothing in it. In so far as the child is born with a differentiated brain that is predetermined by heredity and therefore individualized, it meets sensory stimuli coming from outside not with any aptitudes, but with specific ones, and this necessarily results in a particular, individual choice and pattern of apperception. These aptitudes can be shown to be inherited instincts and preformed patterns, the latter being the a priori and formal conditions of apperception that are based on instinct. Their presence gives the world of the child and the dreamer its anthropomorphic stamp. They are the archetypes, which direct all fantasy activity into its appointed paths and in this way produce, in the fantasy-images of children’s dreams as well as in the delusions of schizophrenia, astonishing mythological parallels such as can also be found, though in lesser degree, in the dreams of normal persons and neurotics. It is not, therefore, a question of inherited ideas but of inherited possibilities of ideas.

Everett says that Jung is claiming “innate tacit information” and speaks of this in terms of Bastian’s “elementary ideas”. That seems to be the same as inherited ideas. If so, Jung denied Everett’s allegation before it was ever made: “It is not, therefore, a question of inherited ideas but of inherited possibilities of ideas.” That doesn’t sound all that different from how Everett discusses the topic (Kindle Locations 349-355):

The theses of learned tacit knowledge and nativism need not be opposed, of course. It is possible that both learned and innate forms of tacit knowledge are crucially implicated in human cognition and behavior. What we are genuinely interested in is not a false dichotomy of extremes but in a continuum of possibilities— where do the most important or even the most overlooked contributions to knowledge come from?

I am here particularly concerned with difference, however, rather than sameness among the members of our species— with variation rather than homeostasis. This is because the variability in dark matter from one society to another is fundamental to human survival, arising from and sustaining our species’ ecological diversity. The range of possibilities produces a variety of “human natures.”

That in turn sounds much like Jung. A variety of “human natures”. Well, Jung developed an entire theory about this, not just variety through archetypes as inherited possibilities but, more specifically, a variety of human personality types (i.e., “human natures”). The potentials within humanity could constellate into many patterns, according to Jung. And his book on personality types directly influenced anthropology in developing a modern understanding of the variety of cultures, which Everett writes much about.

So, if a supposedly Platonic thinker like Jung can make a basic argument that isn’t necessarily and clearly distinct from a supposedly Aristotelian thinker like Everett, then what precisely is the distinction being proposed? How does one differentiate innate ideas and innate possibilities of ideas? Is anyone “genuinely interested in… a false dichotomy of extremes”?