Voice and Perspective

“No man should [refer to himself in the third person] unless he is the King of England — or has a tapeworm.”
~ Mark Twain

“Love him or hate him, Trump is a man who is certain about what he wants and sets out to get it, no holds barred. Women find his power almost as much of a turn-on as his money.”
~ Donald Trump

The self is a confusing matter. As always, the question is who is speaking and who is listening. Clues can come from the language that is used. And the language we use shapes human experience, as studied in linguistic relativity.

Speaking in the first person may be a relatively recent innovation of human society and psyche. The autobiographical self requires the self-authorization of Jaynesian narrative consciousness. The emergence of the egoic self is the fall into historical time, an issue too complex for discussion here (see Julian Jaynes’ classic work or the diverse Jaynesian scholarship it inspired, or look at some of my previous posts on the topic).

Consider the mirror effect. When hunter-gatherers encounter a mirror for the first time there is what is called “the tribal terror of self-recognition” (Edmund Carpenter as quoted by Philippe Rochat in Others in Mind, p. 31). “After a frightening reaction,” Carpenter wrote about the Biamis of Papua New Guinea, “they become paralyzed, covering their mouths and hiding their heads — they stood transfixed looking at their own images, only their stomach muscles betraying great tension.”

Research has shown that heavy use of first person is associated with depression, anxiety, and other distressing emotions. Oddly, this full immersion into subjectivity can lead into depressive depersonalization and depressive realism — the individual sometimes passes through the self and into some other state. And in that other state, I’ve noticed that silence befalls the mind, that is to say the loss of the ‘I’ where the inner dialogue goes silent. One sees the world as if coldly detached, as if outside of it all.

Third person is stranger and with a much more ancient pedigree. In the modern mind, third person is often taken as an effect of narcissistic inflation of the ego, such as seen with celebrities speaking of themselves in terms of their media identities. But in other countries and at other times, it has been an indication of religious humility or a spiritual shifting of perspective (possibly expressing the belief that only God can speak of Himself as ‘I’).

There is also the Batman effect. Children act more capably and with greater perseverance when speaking of themselves in the third person, specifically as a superhero character. As with religious practice, this serves the purpose of distancing from emotion. Yet a sense of self can simultaneously be strengthened when the individual becomes identified with a character. This is similar to celebrities who turn their social identities into something akin to mythological figures. Or just as a child can be encouraged to invoke a favorite superhero to stand in for an underdeveloped ego-self, a religious true believer can speak of God or the Holy Spirit working through them. There is immense power in this.

This might point to the Jaynesian bicameral mind. When an Australian Aborigine ritually sings a Songline, he is invoking a god-spirit-personality. That third person of the mythological story shifts the Aboriginal experience of self and reality. The Aborigine has as many selves as he has Songlines, each a self-contained worldview and way of being. This could be a more natural expression of human nature… or at least an easier and less taxing mode of being (Hunger for Connection). Jaynes noted that schizophrenics with their weakened and loosened egoic boundaries have seemingly inexhaustible energy.

He suspected this might explain why archaic humans could accomplish seemingly impossible tasks such as building pyramids, something moderns could only manage through use of our largest and most powerful cranes. Yet the early Egyptians did it with a small, impoverished, and malnourished population that lacked even the basic infrastructure of roads and bridges. Similarly, this might explain how many tribal people can dance for days on end with little rest and no food. And maybe it is also how armies can collectively march for days on end in a way no individual could (Music and Dance on the Mind).

Upholding rigid egoic boundaries is tiresome work. This might be why, when individuals reach exhaustion under stress (mourning a death, getting lost in the wilderness, etc), they can experience what John Geiger called the third man factor, the appearance of another self often with its own separate voice. Apparently, when all else fails, this is the state of mind we fall back on and it’s a common experience at that. Furthermore, a negatory experience, as Jaynes describes it, can lead to negatory possession in the re-emergence of a bicameral-like mind with a third person identity becoming a fully expressed personality of its own, a phenomenon that can happen through trauma-induced dissociation and splitting:

“Like schizophrenia, negatory possession usually begins with some kind of an hallucination.[11] It is often a castigating ‘voice’ of a ‘demon’ or other being which is ‘heard’ after a considerable stressful period. But then, unlike schizophrenia, probably because of the strong collective cognitive imperative of a particular group or religion, the voice develops into a secondary system of personality, the subject then losing control and periodically entering into trance states in which consciousness is lost, and the ‘demon’ side of the personality takes over.”

Jaynes noted that those who are abused in childhood are more easily hypnotized. Their egoic boundaries never fully develop, or else large gaps are left in this self-construction, gaps through which other voices can slip in. This relates to what has variously been referred to as the porous self, the thin boundary type, fantasy proneness, etc. Compared to those who have never experienced trauma, I bet such people would find it easier to speak in the third person and, when doing so, would show a greater shift in personality and behavior.

As for first-person subjectivity, it has its own peculiarities. I think of the association of addiction and individuality, as explored by Johann Hari and as elaborated in my own writings (Individualism and Isolation; To Put the Rat Back in the Rat Park; & The Agricultural Mind). As the ego is a tiresome project that depletes one’s reserves, maybe it’s the energy drain that causes the depression, irritability, and such. A person with such a guarded sense of self would be resistant to speaking in the third person, finding it hard to escape the trap of ego they’ve so carefully constructed. So many of us have fallen under its sway and can’t imagine anything else (The Spell of Inner Speech). That is probably why it so often requires trauma to break open our psychological defenses.

Besides trauma, many moderns have sought to escape the egoic prison through religious practices. Ancient methods include fasting, meditation, and prayer — these are common across the world. Fasting, by the way, fundamentally alters the functioning of the body and mind through ketosis (also the result of a very low-carb diet), something I’ve speculated may have been a supporting factor for the bicameral mind and related to the much earlier cultural preference for psychedelics over addictive stimulants, an entirely different discussion (“Yes, tea banished the fairies.”; & Autism and the Upper Crust). The simplest method of all is using third-person language until it becomes a new habit of mind, something that might require a long period of practice to feel natural.

The modern mind has always been under stress. That is because it is the source of that stress. It’s not a stable and sustainable way of being in the world (The Crisis of Identity). Rather, it’s a transitional state, and all of modernity has been a centuries-long stage of transformation into something else. There is an impulse hidden within, if we could only trigger the release of the locking mechanism (Lock Without a Key). The language of perspectives, as Scott Preston explores (The Three Gems and The Cross of Reality), tells us something important about our predicament. Words such as ‘I’ and ‘you’ aren’t merely words. In language, we discover our humanity as we come to know the other.

* * *

Are Very Young Children Stuck in the Perpetual Present?
by Jesse Bering

Interestingly, however, the authors found that the three-year-olds were significantly more likely to refer to themselves in the third person (using their first names and saying that the sticker is on “his” or “her” head) than were the four-year-olds, who used first-person pronouns (“me” and “my head”) almost exclusively. […]

Povinelli has pointed out the relevancy of these findings to the phenomenon of “infantile amnesia,” which tidily sums up the curious case of most people being unable to recall events from their first three years of life. (I spent my first three years in New Jersey, but for all I know I could have spontaneously appeared as a four-year-old in my parent’s bedroom in Virginia, which is where I have my first memory.) Although the precise neurocognitive mechanisms underlying infantile amnesia are still not very well-understood, escaping such a state of the perpetual present would indeed seemingly require a sense of the temporally enduring, autobiographical self.

5 Reasons Shaq and Other Athletes Refer to Themselves in the Third Person
by Amelia Ahlgren

“Illeism,” or the act of referring to oneself in the third person, is an epidemic in the sports world.

Unfortunately for humanity, the cure is still unknown.

But if we’re forced to listen to these guys drone on about an embodiment of themselves, we might as well guess why they do it.

Here are five reasons some athletes are allergic to using the word “I.”

  1. Lag in Linguistic Development (Immaturity)
  2. Reflection of Egomania
  3. Amp-Up Technique
  4. Pure Intimidation
  5. Goofiness

Rene Thinks, Therefore He Is. You?
by Richard Sandomir

Some strange, grammatical, mind-body affliction is making some well-known folks in sports and politics refer to themselves in the third person. It is as if they have stepped outside their bodies. Is this detachment? Modesty? Schizophrenia? If this loopy verbal quirk were simple egomania, then Louis XIV might have said, “L’etat, c’est Lou.” He did not. And if it were merely a sign of one’s overweening power, then Queen Victoria would not have invented the royal we (“we are not amused”) but rather the royal she. She did not.

Lately, though, some third persons have been talking in a kind of royal he:

* Accepting the New York Jets’ $25 million salary and bonus offer, the quarterback Neil O’Donnell said of his former team, “The Pittsburgh Steelers had plenty of opportunities to sign Neil O’Donnell.”

* As he pushed to be traded from the Los Angeles Kings, Wayne Gretzky said he did not want to wait for the Kings to rebuild “because that doesn’t do a whole lot of good for Wayne Gretzky.”

* After his humiliating loss in the New Hampshire primary, Senator Bob Dole proclaimed: “You’re going to see the real Bob Dole out there from now on.”

These people give you the creepy sense that they’re not talking to you but to themselves. To a first, second or third person’s ear, there’s just something missing. What if, instead of “I am what I am,” we had “Popeye is what Popeye is”?

Vocative self-address, from ancient Greece to Donald Trump
by Ben Zimmer

Earlier this week on Twitter, Donald Trump took credit for a surge in the Consumer Confidence Index, and with characteristic humility, concluded the tweet with “Thanks Donald!”

The “Thanks Donald!” capper led many to muse about whether Trump was referring to himself in the second person, the third person, or perhaps both.

Since English only marks grammatical person on pronouns, it’s not surprising that there is confusion over what is happening with the proper name “Donald” in “Thanks, Donald!” We associate proper names with third-person reference (“Donald Trump is the president-elect”), but a name can also be used as a vocative expression associated with second-person address (“Pleased to meet you, Donald Trump”). For more on how proper names and noun phrases in general get used as vocatives in English, see two conference papers from Arnold Zwicky: “Hey, Whatsyourname!” (CLS 10, 1974) and “Isolated NPs” (Semantics Fest 5, 2004).

The use of one’s own name in third-person reference is called illeism. Arnold Zwicky’s 2007 Language Log post, “Illeism and its relatives” rounds up many examples, including from politicians like Bob Dole, a notorious illeist. But what Trump is doing in tweeting “Thanks, Donald!” isn’t exactly illeism, since the vocative construction implies second-person address rather than third-person reference. We can call this a form of vocative self-address, wherein Trump treats himself as an addressee and uses his own name as a vocative to create something of an imagined interior dialogue.

Give me that Prime Time religion
by Mark Schone

Around the time football players realized end zones were for dancing, they also decided that the pronouns “I” and “me,” which they used an awful lot, had worn out. As if to endorse the view that they were commodities, cartoons or royalty — or just immune to introspection — athletes began to refer to themselves in the third person.

It makes sense, therefore, that when the most marketed personality in the NFL gets religion, he announces it in the weirdly detached grammar of football-speak. “Deion Sanders is covered by the blood of Jesus now,” writes Deion Sanders. “He loves the Lord with all his heart.” And in Deion’s new autobiography, the Lord loves Deion right back, though the salvation he offers third-person types seems different from what mere mortals can expect.

Referring to yourself in the third person
by Tetsuo

It does seem to be a stylistic thing in formal Chinese. I’ve come across a couple of articles about artists by the artist in question where they’ve referred to themselves in the third person throughout. And quite a number of politicians do the same, I’ve been told.

Illeism
from Wikipedia

Illeism in everyday speech can have a variety of intentions depending on context. One common usage is to impart humility, a common practice in feudal societies and other societies where honorifics are important to observe (“Your servant awaits your orders”), as well as in master–slave relationships (“This slave needs to be punished”). Recruits in the military, mostly United States Marine Corps recruits, are also often made to refer to themselves in the third person, such as “the recruit,” in order to reduce the sense of individuality and enforce the idea of the group being more important than the self.[citation needed] The use of illeism in this context imparts a sense of lack of self, implying a diminished importance of the speaker in relation to the addressee or to a larger whole.

Conversely, in different contexts, illeism can be used to reinforce self-promotion, as used to sometimes comic effect by Bob Dole throughout his political career.[2] This was made particularly notable during the 1996 United States presidential election and was lampooned broadly in popular media for years afterwards.

Deepanjana Pal of Firstpost noted that speaking in the third person “is a classic technique used by generations of Bollywood scriptwriters to establish a character’s aristocracy, power and gravitas.”[3] Conversely, third person self referral can be associated with self-irony and not taking oneself too seriously (since the excessive use of pronoun “I” is often seen as a sign of narcissism and egocentrism[4]), as well as with eccentricity in general.

In certain Eastern religions, like Hinduism or Buddhism, this is sometimes seen as a sign of enlightenment, since by doing so, an individual detaches his eternal self (atman) from the body-related one (maya). Known illeists of that sort include Swami Ramdas,[5] Ma Yoga Laxmi,[6] Anandamayi Ma,[7] and Mata Amritanandamayi.[8] Jnana yoga actually encourages its practitioners to refer to themselves in the third person.[9]

Young children in Japan commonly refer to themselves by their own name (a habit probably picked up from their elders, who would normally refer to them by name). This is due to the normal Japanese way of speaking, where referring to another in the third person is considered more polite than using the Japanese words for “you”, like omae. More explanation is given in Japanese pronouns, though as children grow older they normally switch over to using first-person references. Japanese idols may also refer to themselves in the third person so as to give off a feeling of childlike cuteness.

Four Paths to the Goal
from Sheber Hinduism

Jnana yoga is a concise practice made for intellectual people. It is the quickest path to the top but it is the steepest. The key to jnana yoga is to contemplate the inner self and find who our self is. Our self is Atman and by finding this we have found Brahman. Thinking in third person helps move us along the path because it helps us consider who we are from an objective point of view. As stated in the Upanishads, “In truth, who knows Brahman becomes Brahman.” (Novak 17).

Non-Reactivity: The Supreme Practice of Everyday Life
by Martin Schmidt

Respond with non-reactive awareness: consider yourself a third-person observer who watches your own emotional responses arise and then dissipate. Don’t judge, don’t try to change yourself; just observe! In time this practice will begin to cultivate a third-person perspective inside yourself that sometimes is called the Inner Witness.[4]

Frequent ‘I-Talk’ may signal proneness to emotional distress
from Science Daily

Researchers at the University of Arizona found in a 2015 study that frequent use of first-person singular pronouns — I, me and my — is not, in fact, an indicator of narcissism.

Instead, this so-called “I-talk” may signal that someone is prone to emotional distress, according to a new, follow-up UA study forthcoming in the Journal of Personality and Social Psychology.

Research at other institutions has suggested that I-talk, though not an indicator of narcissism, may be a marker for depression. While the new study confirms that link, UA researchers found an even greater connection between high levels of I-talk and a psychological disposition of negative emotionality in general.

Negative emotionality refers to a tendency to easily become upset or emotionally distressed, whether that means experiencing depression, anxiety, worry, tension, anger or other negative emotions, said Allison Tackman, a research scientist in the UA Department of Psychology and lead author of the new study.

Tackman and her co-authors found that when people talk a lot about themselves, it could point to depression, but it could just as easily indicate that they are prone to anxiety or any number of other negative emotions. Therefore, I-talk shouldn’t be considered a marker for depression alone.

Talking to yourself in the third person can help you control emotions
from Science Daily

The simple act of silently talking to yourself in the third person during stressful times may help you control emotions without any additional mental effort than what you would use for first-person self-talk — the way people normally talk to themselves.

A first-of-its-kind study led by psychology researchers at Michigan State University and the University of Michigan indicates that such third-person self-talk may constitute a relatively effortless form of self-control. The findings are published online in Scientific Reports, a Nature journal.

Say a man named John is upset about recently being dumped. By simply reflecting on his feelings in the third person (“Why is John upset?”), John is less emotionally reactive than when he addresses himself in the first person (“Why am I upset?”).

“Essentially, we think referring to yourself in the third person leads people to think about themselves more similar to how they think about others, and you can see evidence for this in the brain,” said Jason Moser, MSU associate professor of psychology. “That helps people gain a tiny bit of psychological distance from their experiences, which can often be useful for regulating emotions.”

Pretending to be Batman helps kids stay on task
by Christian Jarrett

Some of the children were assigned to a “self-immersed condition”, akin to a control group, and before and during the task were told to reflect on how they were doing, asking themselves “Am I working hard?”. Other children were asked to reflect from a third-person perspective, asking themselves “Is James [insert child’s actual name] working hard?” Finally, the rest of the kids were in the Batman condition, in which they were asked to imagine they were either Batman, Bob The Builder, Rapunzel or Dora the Explorer and to ask themselves “Is Batman [or whichever character they were] working hard?”. Children in this last condition were given a relevant prop to help, such as Batman’s cape. Once every minute through the task, a recorded voice asked the question appropriate for the condition each child was in [Are you working hard? or Is James working hard? or Is Batman working hard?].

The six-year-olds spent more time on task than the four-year-olds (half the time versus about a quarter of the time). No surprise there. But across age groups, and apparently unrelated to their personal scores on mental control, memory, or empathy, those in the Batman condition spent the most time on task (about 55 per cent for the six-year-olds; about 32 per cent for the four-year-olds). The children in the self-immersed condition spent the least time on task (about 35 per cent of the time for the six-year-olds; just over 20 per cent for the four-year-olds) and those in the third-person condition performed in between.

Dressing up as a superhero might actually give your kid grit
by Jenny Anderson

In other words, the more the child could distance him or herself from the temptation, the better the focus. “Children who were asked to reflect on the task as if they were another person were less likely to indulge in immediate gratification and more likely to work toward a relatively long-term goal,” the authors wrote in the study called “The ‘Batman Effect’: Improving Perseverance in Young Children,” published in Child Development.

Curmudgucation: Don’t Be Batman
by Peter Greene

This underlines the problem we see with more and more of what passes for early childhood education these days — we’re not worried about whether the school is ready to appropriately handle the students, but instead are busy trying to beat three-, four- and five-year-olds into developmentally inappropriate states to get them “ready” for their early years of education. It is precisely and absolutely backwards. I can’t say this hard enough — if early childhood programs are requiring “increased demands” on the self-regulatory skills of kids, it is the programs that are wrong, not the kids. Full stop.

What this study offers is a solution that is more damning than the “problem” that it addresses. If a four-year-old child has to disassociate, to pretend that she is someone else, in order to cope with the demands of your program, your program needs to stop, today.

Because you know where else you hear this kind of behavior described? In accounts of victims of intense, repeated trauma. In victims of torture who talk about dealing by just pretending they aren’t even there, that someone else is occupying their body while they float away from the horror.

That should not be a description of How To Cope With Preschool.

Nor should the primary lesson of early childhood education be, “You can’t really cut it as yourself. You’ll need to be somebody else to get ahead in life.” I cannot even begin to wrap my head around what a destructive message that is for a small child.

Can You Live With the Voices in Your Head?
by Daniel B. Smith

And though psychiatrists acknowledge that almost anyone is capable of hallucinating a voice under certain circumstances, they maintain that the hallucinations that occur with psychoses are qualitatively different. “One shouldn’t place too much emphasis on the content of hallucinations,” says Jeffrey Lieberman, chairman of the psychiatry department at Columbia University. “When establishing a correct diagnosis, it’s important to focus on the signs or symptoms” of a particular disorder. That is, it’s crucial to determine how the voices manifest themselves. Voices that speak in the third person, echo a patient’s thoughts or provide a running commentary on his actions are considered classically indicative of schizophrenia.

Auditory hallucinations: Psychotic symptom or dissociative experience?
by Andrew Moskowitz & Dirk Corstens

While auditory hallucinations are considered a core psychotic symptom, central to the diagnosis of schizophrenia, it has long been recognized that persons who are not psychotic may also hear voices. There is an entrenched clinical belief that distinctions can be made between these groups, typically on the basis of the perceived location or the ‘third-person’ perspective of the voices. While it is generally believed that such characteristics of voices have significant clinical implications, and are important in the differential diagnosis between dissociative and psychotic disorders, there is no research evidence in support of this. Voices heard by persons diagnosed schizophrenic appear to be indistinguishable, on the basis of their experienced characteristics, from voices heard by persons with dissociative disorders or with no mental disorder at all. On this and other bases outlined below, we argue that hearing voices should be considered a dissociative experience, which under some conditions may have pathological consequences. In other words, we believe that, while voices may occur in the context of a psychotic disorder, they should not be considered a psychotic symptom.

Hallucinations and Sensory Overrides
by T. M. Luhrmann

The psychiatric and psychological literature has reached no settled consensus about why hallucinations occur and whether all perceptual “mistakes” arise from the same processes (for a general review, see Aleman & Laroi 2008). For example, many researchers have found that when people hear hallucinated voices, some of these people have actually been subvocalizing: They have been using muscles used in speech, but below the level of their awareness (Gould 1949, 1950). Other researchers have not found this inner speech effect; moreover, this hypothesis does not explain many of the odd features of the hallucinations associated with psychosis, such as hearing voices that speak in the second or third person (Hoffman 1986). But many scientists now seem to agree that hallucinations are the result of judgments associated with what psychologists call “reality monitoring” (Bentall 2003). This is not the process Freud described with the term reality testing, which for the most part he treated as a cognitive higher-level decision: the ability to distinguish between fantasy and the world as it is (e.g., he loves me versus he’s just not that into me). Reality monitoring refers to the much more basic decision about whether the source of an experience is internal to the mind or external in the world.

Originally, psychologists used the term to refer to judgments about memories: Did I really have that conversation with my boyfriend back in college, or did I just think I did? The work that gave the process its name asked what it was about memories that led someone to infer that these memories were records of something that had taken place in the world or in the mind (Johnson & Raye 1981). Johnson & Raye’s elegant experiments suggested that these memories differ in predictable ways and that people use those differences to judge what has actually taken place. Memories of an external event typically have more sensory details and more details in general. By contrast, memories of thoughts are more likely to include the memory of cognitive effort, such as composing sentences in one’s mind.

Self-Monitoring and Auditory Verbal Hallucinations in Schizophrenia
by Wayne Wu

It’s worth pointing out that a significant portion of the non-clinical population experiences auditory hallucinations. Such hallucinations need not be negative in content, though as I understand it, the preponderance of AVH in schizophrenia is or becomes negative. […]

I’ve certainly experienced the “third man”, in a moment of vivid stress when I was younger. At the time, I thought it was God speaking to me in an encouraging and authoritative way! (I was raised in a very strict religious household.) But I wouldn’t be surprised if many of us have had similar experiences. These days, I have more often the cell-phone buzzing in my pocket illusion.

There are, I suspect, many reasons why the auditory system might be activated to give rise to auditory experiences that philosophers would define as hallucinations: recalling things in an auditory way, thinking in inner speech where this might be auditory in structure, etc. These can have positive influences on our ability to adapt to situations.

What continues to puzzle me about AVH in schizophrenia are some of its fairly consistent phenomenal properties: second or third-person voice, typical internal localization (though plenty of external localization) and negative content.

The Digital God, How Technology Will Reshape Spirituality
by William Indick
pp. 74-75

Doubled Consciousness

Who is this third who always walks beside you?
When I count, there are only you and I together.
But when I look ahead up the white road
There is always another one walking beside you
Gliding wrapt in a brown mantle, hooded.
—T.S. Eliot, The Waste Land

The feeling of “doubled consciousness”[81] has been reported by numerous epileptics. It is the feeling of being outside of one’s self. The feeling that you are observing yourself as if you were outside of your own body, like an outsider looking in on yourself. Consciousness is “doubled” because you are aware of the existence of both selves simultaneously—the observer and the observed. It is as if the two halves of the brain temporarily cease to function as a single mechanism; but rather, each half identifies itself separately as its own self.[82] The doubling effect that occurs as a result of some temporal lobe epileptic seizures may lead to drastic personality changes. In particular, epileptics following seizures often become much more spiritual, artistic, poetic, and musical.[83] Art and music, of course, are processed primarily in the right hemisphere, as is poetry and the more lyrical, metaphorical aspects of language. In any artistic endeavor, one must engage in “doubled consciousness,” creating the art with one “I,” while simultaneously observing the art and the artist with a critically objective “other-I.” In The Great Gatsby, Fitzgerald expressed the feeling of “doubled consciousness” in a scene in which Nick Carraway, in the throes of profound drunkenness, looks out of a city window and ponders:

Yet high over the city our line of yellow windows must have contributed their share of human secrecy to the casual watcher in the darkening streets, and I was him too, looking up and wondering. I was within and without, simultaneously enchanted and repelled by the inexhaustible variety of life.

Doubled-consciousness, the sense of being both “within and without” of one’s self, is a moment of disconnection and disassociation between the two hemispheres of the brain, a moment when left looks independently at right and right looks independently at left, each recognizing each other as an uncanny mirror reflection of himself, but at the same time not recognizing the other as “I.”

The sense of doubled consciousness also arises quite frequently in situations of extreme physical and psychological duress. 84 In his book The Third Man Factor, John Geiger delineates the conditions associated with the perception of the “sensed presence”: darkness, monotony, barrenness, isolation, cold, hunger, thirst, injury, fatigue, and fear. 85 Shermer added sleep deprivation to this list, noting that Charles Lindbergh, on his famous cross-Atlantic flight, recorded the perception of “ghostly presences” in the cockpit that “spoke with authority and clearness … giving me messages of importance unattainable in ordinary life.” 86 Sacks noted that doubled consciousness is not necessarily an alien or abnormal sensation; we all feel it, especially when we are alone, in the dark, in a scary place. 87 We can all recall a memory from childhood when we could palpably feel the presence of the monster hiding in the closet, or of that indefinable thing in the dark space beneath our bed. The experience of the “sensed other” is common in schizophrenia, can be induced by certain drugs, is a central aspect of the “near-death experience,” and is also associated with certain neurological disorders. 88

To speak of oneself in the third person, to express the wish to “find myself,” is to presuppose a plurality within one’s own mind. 89 There is consciousness, and then there is something else … an Other … who is nonetheless a part of our own mind, though separate from our moment-to-moment consciousness. When I make a statement such as “I’m disappointed with myself because I let myself gain weight,” it is quite clear that there are at least two wills at work within one mind—one will that dictates weight loss and is disappointed—and another will that defies the former and allows the body to binge or laze. One cannot point at one will and say, “This is the real me and the other is not me.” They’re both me. Within each “I” there exists a distinct Other that is also “I.” In the mind of the believer—this double-I, this other-I, this sentient other, this sensed presence who is me but also, somehow, not me—how could this be anyone other than an angel, a spirit, my own soul, or God? Sacks recalls an incident in which he broke his leg while mountain climbing alone and had to descend the mountain despite his injury and the immense pain it was causing him. Sacks heard “an inner voice” that was “wholly unlike” his normal “inner speech”—a “strong, clear, commanding voice” that told him exactly what he had to do to survive the predicament, and how to do it. “This good voice, this Life voice, braced and resolved me.” Sacks relates the story of Joe Simpson, author of Touching the Void, who had a similar experience during a climbing mishap in the Andes. For days, Simpson trudged along with a distinctly dual sense of self. There was a distracted self that jumped from one random thought to the next, and then a clearly separate focused self that spoke to him in a commanding voice, giving specific instructions and making logical deductions. 90
Sacks also reports the experience of a distraught friend who, at the moment she was about to commit suicide, heard a “voice” tell her: “No, you don’t want to do that…” The male voice, which seemed to come from outside of her, convinced her not to throw her life away. She speaks of it as her “guardian angel.” Sacks suggested that this other voice may always be there, but it is usually inhibited. When it is heard, it is usually as an inner voice rather than an external one. 91 Sacks also reports that the “persistent feeling” of a “presence” or a “companion” that is not actually there is a common hallucination, especially among people suffering from Parkinson’s disease. Sacks is unsure if this is a side effect of L-DOPA, the drug used to treat the disease, or if the hallucinations are symptoms of the neurological disease itself. He also noted that some patients were able to control the hallucinations to varying degrees. One elderly patient hallucinated a handsome and debonair gentleman caller who provided “love, attention, and invisible presents … faithfully each evening.” 92

Part III: Off to the Asylum – Rational Anti-psychiatry
by Veronika Nasamoto

The ancients also understood that the origins of mental instability were spiritual, but they perceived it differently. In The Origin of Consciousness in the Breakdown of the Bicameral Mind, Julian Jaynes presents a startling thesis, based on an analysis of the language of the Iliad: that the ancient Greeks were not conscious in the same way that modern humans are. The ancient Greeks had no sense of “I” with which to locate their mental processes (the Victorians, too, would sometimes speak in the third person rather than say “I,” because the eternal God, YHWH, was known as the great “I AM”). To them, their inner thoughts were perceived as coming from the gods, which is why the characters in the Iliad find themselves in frequent communication with supernatural entities.

The Shadows of Consciousness in the Breakdown of the Bicameral Mirror
by Chris Savia

Jaynes’s description of consciousness, in relation to memory, proposes that what people believe to be rote recollections are in fact concepts: the Platonic ideals of their office, the view out of the window, et al. These contribute to one’s mental sense of place and position in the world. Such memories enable one to see oneself in the third person.

Language, consciousness and the bicameral mind
by Andreas van Cranenburgh

Consciousness is not a copy of experience. Since Locke’s tabula rasa, it has been thought that consciousness records our experiences, saving them for possible later reflection. However, this is clearly false: most details of our experience are immediately lost when not given special notice. Recalling an arbitrary past event requires a reconstruction of memories. Interestingly, memories are often from a third-person perspective, which shows that they could not be a mere copy of experience.

The Origin of Consciousness in the Breakdown of the Bicameral Mind
by Julian Jaynes
pp. 347-350

Negatory Possession

There is another side to this vigorously strange vestige of the bicameral mind. And it is different from other topics in this chapter. For it is not a response to a ritual induction for the purpose of retrieving the bicameral mind. It is an illness in response to stress. In effect, emotional stress takes the place of the induction in the general bicameral paradigm just as in antiquity. And when it does, the authorization is of a different kind.

The difference presents a fascinating problem. In the New Testament, where we first hear of such spontaneous possession, it is called in Greek daemonizomai, or demonization. 10 And from that time to the present, instances of the phenomenon most often have that negatory quality connoted by the term. The why of the negatory quality is at present unclear. In an earlier chapter (II. 4) I have tried to suggest the origin of ‘evil’ in the volitional emptiness of the silent bicameral voices. And that this took place in Mesopotamia and particularly in Babylon, to which the Jews were exiled in the sixth century B.C., might account for the prevalence of this quality in the world of Jesus at the start of this syndrome.

But whatever the reasons, they must in the individual be similar to the reasons behind the predominantly negatory quality of schizophrenic hallucinations. And indeed the relationship of this type of possession to schizophrenia seems obvious.

Like schizophrenia, negatory possession usually begins with some kind of an hallucination. 11 It is often a castigating ‘voice’ of a ‘demon’ or other being which is ‘heard’ after a considerable stressful period. But then, unlike schizophrenia, probably because of the strong collective cognitive imperative of a particular group or religion, the voice develops into a secondary system of personality, the subject then losing control and periodically entering into trance states in which consciousness is lost, and the ‘demon’ side of the personality takes over.

Always the patients are uneducated, usually illiterate, and all believe heartily in spirits or demons or similar beings and live in a society which does. The attacks usually last from several minutes to an hour or two, the patient being relatively normal between attacks and recalling little of them. Contrary to horror fiction stories, negatory possession is chiefly a linguistic phenomenon, not one of actual conduct. In all the cases I have studied, it is rare to find one of criminal behavior against other persons. The stricken individual does not run off and behave like a demon; he just talks like one.

Such episodes are usually accompanied by twistings and writhings as in induced possession. The voice is distorted, often guttural, full of cries, groans, and vulgarity, and usually railing against the institutionalized gods of the period. Almost always, there is a loss of consciousness as the person seems the opposite of his or her usual self. ‘He’ may name himself a god, demon, spirit, ghost, or animal (in the Orient it is often ‘the fox’), may demand a shrine or to be worshiped, throwing the patient into convulsions if these are withheld. ‘He’ commonly describes his natural self in the third person as a despised stranger, even as Yahweh sometimes despised his prophets or the Muses sneered at their poets. 12 And ‘he’ often seems far more intelligent and alert than the patient in his normal state, even as Yahweh and the Muses were more intelligent and alert than prophet or poet.

As in schizophrenia, the patient may act out the suggestions of others, and, even more curiously, may be interested in contracts or treaties with observers, such as a promise that ‘he’ will leave the patient if such and such is done, bargains which are carried out as faithfully by the ‘demon’ as the sometimes similar covenants of Yahweh in the Old Testament. Somehow related to this suggestibility and contract interest is the fact that the cure for spontaneous stress-produced possession, exorcism, has never varied from New Testament days to the present. It is simply by the command of an authoritative person often following an induction ritual, speaking in the name of a more powerful god. The exorcist can be said to fit into the authorization element of the general bicameral paradigm, replacing the ‘demon.’ The cognitive imperatives of the belief system that determined the form of the illness in the first place determine the form of its cure.

The phenomenon does not depend on age, but sex differences, depending on the historical epoch, are pronounced, demonstrating its cultural expectancy basis. Of those possessed by ‘demons’ whom Jesus or his disciples cured in the New Testament, the overwhelming majority were men. In the Middle Ages and thereafter, however, the overwhelming majority were women. Also evidence for its basis in a collective cognitive imperative are its occasional epidemics, as in convents of nuns during the Middle Ages, in Salem, Massachusetts, in the eighteenth century, or those reported in the nineteenth century at Savoy in the Alps. And occasionally today.

The Emergence of Reflexivity in Greek Language and Thought
by Edward T. Jeremiah
p. 3

Modernity’s tendency to understand the human being in terms of abstract grammatical relations, namely the subject and self, and also the ‘I’—and, conversely, the relative indifference of Greece to such categories—creates some of the most important semantic contrasts between our and Greek notions of the self.

p. 52

Reflexivisations such as the last, as well as those like ‘Know yourself’ which reconstitute the nature of the person, are entirely absent in Homer. So too are uses of the reflexive which reference some psychological aspect of the subject. Indeed the reference of reflexives directly governed by verbs in Homer is overwhelmingly bodily: ‘adorning oneself’, ‘covering oneself’, ‘defending oneself’, ‘debasing oneself physically’, ‘arranging themselves in a certain formation’, ‘stirring oneself’, and all the prepositional phrases. The usual reference for indirect arguments is the self interested in its own advantage. We do not find in Homer any of the psychological models of self-relation discussed by Lakoff.

Use of the Third Person for Self-Reference by Jesus and Yahweh
by Rod Elledge
pp. 11-13

Viswanathan addresses illeism in Shakespeare’s works, designating it as “illeism with a difference.” He writes: “It [‘illeism with a difference’] is one by which the dramatist makes a character, speaking in the first person, refer to himself in the third person, not simply as a ‘he’, which would be illeism proper, a traditional grammatical mode, but by name.” He adds that the device is extensively used in Julius Caesar and Troilus and Cressida, and occasionally in Hamlet and Othello. Viswanathan notes that the device, prior to Shakespeare, was used in the medieval theater simply to allow a character to announce himself and clarify his identity. Yet he argues that, in the hands of Shakespeare, the device becomes “a masterstroke of dramatic artistry.” He notes four uses of this “illeism with a difference.” First, it highlights the character using it and his inner self. He notes that it provides a way of “making the character momentarily detach himself from himself, achieve a measure of dramatic (and philosophical) depersonalization, and create a kind of aesthetic distance from which he can contemplate himself.” Second, it reflects the tension between the character’s public and private selves. Third, the device “raises the question of the way in which the character is seen to behave and to order his very modes of feeling and thought in accordance with a rightly or wrongly conceived image or idea of himself.” Lastly, he notes the device tends to point toward the larger philosophical problem of man’s search for identity. Speaking of the use of illeism within Julius Caesar, Spevak writes that “in addition to the psychological and other implications, the overall effect is a certain stateliness, a classical look, a consciousness on the part of the actors that they are acting in a not so everyday context.”

Modern linguistic scholarship

Otto Jespersen notes various examples of third-person self-reference, including those seeking to reflect deference or politeness, adults referring to themselves as “papa” or “Aunt Mary” when talking to children in order to be more easily understood, as well as the case of some writers who write “the author” or “this present writer” in order to avoid the mention of “I.” He notes Caesar as a famous example of “self-effacement [used to] produce the impression of absolute objectivity.” Yet Head writes, in response to Jespersen, that since the use of the third person for self-reference

is typical of important personages, whether in autobiography (e.g. Caesar in De Bello Gallico and Captain John Smith in his memoirs) or in literature (Marlowe’s Faustus, Shakespeare’s Julius Caesar, Cordelia and Richard II, Lessing’s Saladin, etc.), it is actually an indication of special status and hence implies greater social distance than does the more commonly used first person singular.

Land and Kitzinger argue that “very often—but not always . . . the use of a third-person reference form in self-reference is designed to display that the speaker is talking about themselves as if from the perspective of another—either the addressee(s) . . . or a non-present other.” The linguist Laurence Horn, noting the use of illeism by various athletic and political celebrities, observes that “the celeb is viewing himself . . . from the outside.” Addressing what he refers to as “the dissociative third person,” he notes that an athlete or politician “may establish distance between himself (virtually never herself) and his public persona, but only by the use of his name, never a 3rd person pronoun.”

pp. 15-17

Illeism in Classical Antiquity

As referenced in the history of research, Kostenberger writes: “It may strike the modern reader as curious that Jesus should call himself ‘Jesus Christ’; however, self-reference in the third person was common in antiquity.” While Kostenberger’s statement is a brief comment in the context of a commentary and not a monographic study on the issue, his comment raises a critical question. Does a survey of the evidence reveal that Jesus’s use of illeism in this verse (and by implication elsewhere in the Gospels) reflects simply another example of a common mannerism in antiquity? […]

Early Evidence

From the fifth century BC to the time of Jesus, the following historians refer to themselves in the third person in their historical accounts: Hecataeus (though the evidence is fragmentary), Herodotus, Thucydides, Xenophon, Polybius, Caesar, and Josephus. For the scope of this study, this point in history (from the fifth century BC to the first century AD) is the primary focus. Yet this feature was adopted from an earlier tendency in literature in which an author states his name as a seal or sphragis for his work. Herkommer notes the “self-introduction” (Selbstvorstellung) in the Homeric Hymn to Apollo, in choral poetry (Chorlyrik) such as that by the Greek poet Alkman (seventh century BC), and in the poetic maxims (Spruchdichtung) such as those of the Greek poet Phokylides (seventh century BC). Yet from the fifth century onward, this feature appears primarily in the works of Greek historians. In addition to early evidence (prior to the fifth century) of an author’s self-reference in his historiographic work, the survey of evidence also noted an early example of illeism within Homer’s Iliad. Because this ancient Greek epic poem reflects an early use of the third-person self-reference in a narrative context and offers a point of comparison to its use in later Greek historiography, this early example of the use of illeism is briefly addressed.

Marincola notes that the style of historical narrative that first appears in Herodotus is a legacy from Homer (ca. 850 BC). He notes that “as the writer of the most ‘authoritative’ third-person narrative, [Homer] provided a model not only for later poets, epic and otherwise, but also to the prose historians who, by way of Herodotus, saw him as their model and rival.” While Homer provided the authoritative example of third-person narrative, he also, centuries before the development of Greek historiography, used illeism in his epic poem the Iliad. Illeism occurs in the direct speech of Zeus (the king of the gods), Achilles (the “god-like” son of a king and goddess), and Hector (the mighty Trojan prince).

Zeus, addressing the assembled gods on Mt. Olympus, refers to himself as “Zeus, the supreme Master” […] and states how superior he is above all gods and men. Hector’s use of illeism occurs as he addresses the Greeks and challenges the best of them to fight against “good Hector” […]. Muellner notes in these instances of third person for self-reference (Zeus twice and Hector once) that “the personage at the top and center of the social hierarchy is asserting his superiority over the group . . . . In other words, these are self-aggrandizing third-person references, like those in the war memoirs of Xenophon, Julius Caesar, and Napoleon.” He adds that “the primary goal of this kind of third-person self-reference is to assert the status accruing to exceptional excellence.” Achilles refers to himself in the context of an oath (examples of which are reflected in the OT), yet his self-reference serves to emphasize his status in relation to the Greeks, and especially to King Agamemnon. Addressing Agamemnon, the general of the Greek armies, Achilles swears by his scepter and states that the day will come when the Greeks will long for Achilles […].

Homer’s choice to use illeism within the direct speech of these three characters contributes to an understanding of its potential rhetorical implications. In each case the character’s use of illeism serves to set him apart by highlighting his innate authority and superior status. Also, all three characters reflect divine and/or royal aspects (Zeus, king of the gods; Achilles, son of a king and a goddess, and referred to as “god-like”; and Hector, son of a king). The examples of illeism in the Iliad, among the earliest evidence of illeism, reflect a usage that shares similarities with the illeism used by Jesus and Yahweh. The biblical and Homeric examples each reflect illeism in direct speech within narrative discourse, and the self-reference serves to emphasize authority or status, as well as a possible associated royal and/or divine aspect(s). Yet the examples stand in contrast to the use of illeism by later historians. As will be addressed next, these ancient historians used the third-person self-reference as a literary device to give their historical accounts a sense of objectivity.

Women and Gender in Medieval Europe: An Encyclopedia
edited by Margaret C. Schaus
“Mystics’ Writings”

by Patricia Dailey
p. 600

The question of scribal mediation is further complicated in that the mystic’s text is, in essence, a message transmitted through her, which must be transmitted to her surrounding community. Thus, the denuding of voice of the text, of a first-person narrative, goes hand in hand with the status of the mystic as “transcriber” of a divine message that does not bear the mystic’s signature, but rather God’s. In addition, the tendency to write in the third person in visionary narratives may draw from a longstanding tradition that stems from Paul in 2 Cor. of communicating visions in the third person, but at the same time, it presents a means for women to negotiate with conflicts with regard to authority or immediacy of the divine through a veiled distance or humility that conformed to a narrative tradition.

Romantic Confession: Jean-Jacques Rousseau and Thomas de Quincey
by Martina Domines Veliki

It is no accident that the term ‘autobiography’, entailing a special amalgam of ‘autos’, ‘bios’ and ‘graphe’ (oneself, life and writing), was first used in 1797 in the Monthly Review by a well-known essayist and polyglot, translator of German romantic literature, William Taylor of Norwich. However, the term ‘autobiographer’ was first extensively used by an English Romantic poet, one of the Lake Poets, Robert Southey.1 This does not mean that no autobiographies were written before the beginning of the nineteenth century. The classical writers wrote about famous figures of public life, the Middle Ages produced educated writers who wrote about saints’ lives, and from the Renaissance onward people wrote about their own lives. However, autobiography, as an auto-reflexive telling of one’s own life’s story, presupposes a special understanding of one’s ‘self’, and therefore the biographies and legends of Antiquity and the Middle Ages are fundamentally different from ‘modern’ autobiography, which postulates a truly autonomous subject, fully conscious of his/her own uniqueness.2 Life-writing, whether in the form of biography or autobiography, occupied the central place in Romanticism. Autobiography would also often appear in disguise. One would immediately think of S. T. Coleridge’s Biographia Literaria (1817), which combines literary criticism and sketches from the author’s life and opinions, and Mary Wollstonecraft’s Short Residence in Sweden, Norway and Denmark (1796), which combines travel narrative and the author’s own difficulties of travelling as a woman.

When one thinks about the first ‘modern’ secular autobiography, it is impossible to avoid the name of Jean-Jacques Rousseau. He calls his first autobiography The Confessions, thus aligning himself with the long Western tradition of confessional writings inaugurated by St. Augustine (354–430 AD). Though St. Augustine confesses to the almighty God and does not really perceive his own life as significant, there is another dimension of Augustine’s legacy which is important for his Romantic inheritors: the dichotomies inherent in the Christian way of perceiving the world, namely the oppositions of spirit/matter, higher/lower, eternal/temporal, immutable/changing, become ultimately emanations of a single binary opposition, that of inner and outer (Taylor 1989: 128). The substance of St. Augustine’s piety is summed up by a single sentence from his Confessions:

“And how shall I call upon my God – my God and my Lord? For when I call on Him, I ask Him to come into me. And what place is there in me into which my God can come? (…) I could not therefore exist, could not exist at all, O my God, unless Thou wert in me.” (Confessions, book I, chapter 2, p.2, emphasis mine)

The step towards inwardness was for Augustine the step towards Truth, i.e. God, and as Charles Taylor explains, this turn inward was a decisive one in the Western tradition of thought. The ‘I’ or the first-person standpoint becomes unavoidable thereafter. It was a long way from Augustine’s seeing these sources as residing in God to Rousseau’s pivotal turn to inwardness without recourse to God. Of course, one must not lose sight of the developments in continental philosophy pre-dating Rousseau’s work. René Descartes was the first to embrace Augustinian thinking at the beginning of the modern era, and he was responsible for the articulation of the disengaged subject: the subject asserting that the real locus of all experience is in his own mind.3 With the empiricist philosophy of John Locke and David Hume, who claimed that we reach knowledge of the surrounding world through disengagement and procedural reason, there is further development towards an idea of the autonomous subject. Although their teachings seemed to leave no place for subjectivity as we know it today, they were still a vital step in redirecting the human gaze from the heavens to man’s own existence.

2 Furthermore, the Middle Ages would not speak of such concepts as ‘the author’ and one’s ‘individuality’, and it is futile to seek in such texts the appertaining subject. When a Croatian fourteenth-century author, Hanibal Lucić, writes about his life in a short text called De regno Croatiae et Dalmatiae (Paulus de Paulo), the last words indicate that the author perceives his life as insignificant and without value. The nuns of the fourteenth century writing their own confessions had to use the third-person pronoun to refer to themselves, and the ‘I’ was reserved for God only. (See Zlatar 2000)

Return to Childhood by Leila Abouzeid
by Geoff Wisner

In addition, autobiography has the pejorative connotation in Arabic of madihu nafsihi wa muzakkiha (he or she who praises and recommends him- or herself). This phrase denotes all sorts of defects in a person or a writer: selfishness versus altruism, individualism versus the spirit of the group, arrogance versus modesty. That is why Arabs usually refer to themselves in formal speech in the third person plural, to avoid the use of the embarrassing ‘I.’ In autobiography, of course, one uses ‘I’ frequently.

Becoming Abraham Lincoln
by Richard Kigel
Preface, XI

A note about the quotations and sources: most of the statements were collected by William Herndon, Lincoln’s law partner and friend, in the years following Lincoln’s death. The responses came in original handwritten letters and transcribed interviews. Because of the low literacy levels of many of his subjects, these statements are sometimes difficult to understand. Often they used no punctuation and wrote in fragments of thoughts. Misspellings were common, and names and places were often confused. “Lincoln” was sometimes spelled “Linkhorn” or “Linkern.” Lincoln’s grandmother “Lucy” was sometimes “Lucey.” Some respondents referred to themselves in the third person. Lincoln himself did so in his biographical writings.

p. 35

“From this place,” wrote Abe, referring to himself in the third person, “he removed to what is now Spencer County, Indiana, in the autumn of 1816, Abraham then being in his eighth [actually seventh] year. This removal was partly on account of slavery, but chiefly on account of the difficulty in land titles in Kentucky.”

Ritual and the Consciousness Monoculture
by Sarah Perry

Mirrors only became common in the nineteenth century; before, they were luxury items owned only by the rich. Access to mirrors is a novelty, and likely a harmful one.

In Others In Mind: Social Origins of Self-Consciousness, Philippe Rochat describes an essential and tragic feature of our experience as humans: an irreconcilable gap between the beloved, special self as experienced in the first person, and the neutrally-evaluated self as experienced in the third person, imagined through the eyes of others. One’s first-person self image tends to be inflated and idealized, whereas the third-person self image tends to be deflated; reminders of this distance are demoralizing.

When people without access to mirrors (or clear water in which to view their reflections) are first exposed to them, their reaction tends to be very negative. Rochat quotes the anthropologist Edmund Carpenter’s description of showing mirrors to the Biamis of Papua New Guinea for the first time, a phenomenon Carpenter calls “the tribal terror of self-recognition”:

After a first frightening reaction, they became paralyzed, covering their mouths and hiding their heads – they stood transfixed looking at their own images, only their stomach muscles betraying great tension.

Why is their reaction negative, and not positive? It is because the first-person perspective of the self tends to be idealized compared to accurate, objective information; the more of this kind of information that becomes available (or unavoidable), the more each person will feel the shame and embarrassment of awareness of the irreconcilable gap between his first-person specialness and his third-person averageness.

There are many “mirrors”—novel sources of accurate information about the self—in our twenty-first century world. School is one such mirror; grades and test scores measure one’s intelligence and capacity for self-inhibition, but just as importantly, peers determine one’s “erotic ranking” in the social hierarchy, as the sociologist Randall Collins terms it. […]

There are many more “mirrors” available to us today; photography in all its forms is a mirror, and internet social networks are mirrors. Our modern selves are very exposed to third-person, deflating information about the idealized self. At the same time, says Rochat, “Rich contemporary cultures promote individual development, the individual expression and management of self-presentation. They foster self-idealization.”

My Beef With Ken Wilber
by Scott Preston (also posted on Integral World)

We see immediately from this schema why the persons of grammar are minimally four and not three. It’s because we are fourfold beings and our reality is a fourfold structure, too, being constituted of two times and two spaces — past and future, inner and outer. The fourfold human and the fourfold cosmos grew up together. Wilber’s model can’t account for that at all.

So, what’s the problem here? Wilber seems to have omitted time and our experience of time as an irrelevancy. Time isn’t even represented in Wilber’s AQAL model. Only subject and object spaces. Therefore, the human form cannot be properly interpreted, for we have four faces, like some representations of the god Janus, that face backwards, forwards, inwards, and outwards, and we have attendant faculties and consciousness functions organised accordingly for mastery of these dimensions — Jung’s feeling, thinking, sensing, willing functions are attuned to a reality that is fourfold in terms of two times and two spaces. And the four basic persons of grammar — You, I, We, He or She — are the representation in grammar of that reality and that consciousness, that we are fourfold beings just as our reality is a fourfold cosmos.

Comparing Wilber’s model to Rosenstock-Huessy’s, I would have to conclude that Wilber’s model is “deficient integral” owing to its apparent omission of time and subsequently of the “I-thou” relationship in which the time factor is really pronounced. For the “I-It” (or “We-Its”) relation is a relation of spaces — inner and outer, while the “I-Thou” (or “We-thou”) relation is a relation of times.

It is perhaps not so apparent to English speakers especially that the “thou” or “you” form is connected with time future. Other languages, like German, still preserve the formal aspects of this. In old English you had to say “go thou!” or “be thou loving!”, and so on. In other words, the “thou” or “you” is most closely associated with the imperative form and that is the future addressing the past. It is a call to change one’s personal or collective state — what we call the “vocation” or “calling” is time future in dialogue with time past. Time past is represented in the “we” form. We is not plural “I’s”. It is constituted by some historical act, like a marriage or union or congregation of peoples or the sexes in which “the two shall become one flesh”. We is the collective person, historically established by some act. The people in “We the People” is a singularity and a unity, an historically constituted entity called “nation”. A bunch of autonomous “I’s” or egos never yet formed a tribe or a nation — or a commune for that matter. Nor a successful marriage.

Though “I-It” (or “We-Its”) might be permissible in referring to the relation of subject and object spaces, “we-thou” is the relation in which the time element is outstanding.

The Spell of Inner Speech

Inner speech is not a universal trait of humanity, according to Russell T. Hurlburt. That is unsurprising. Others go much further in arguing that inner speech was once non-existent for entire civilizations.

My favorite version of this argument is Julian Jaynes’ theory of the bicameral mind. Jaynes noted that bicameralism can be used as an interpretative frame to understand many of the psychological oddities still found in modern society. His theory goes a long way toward explaining hypnosis, for instance. From that perspective, I’ve long suspected that post-bicameral consciousness isn’t as well established as is generally assumed. David Abram observes (see end of post for full context):

“It is important to realize that the now common experience of “silent” reading is a late development in the story of the alphabet, emerging only during the Middle Ages, when spaces were first inserted between the words in a written manuscript (along with various forms of punctuation), enabling readers to distinguish the words of a written sentence without necessarily sounding them out audibly. Before this innovation, to read was necessarily to read aloud, or at the very least to mumble quietly; after the twelfth century it became increasingly possible to internalize the sounds, to listen inwardly to phantom words (or the inward echo of words once uttered).”

Internal experience took a long time to take hold. During the Enlightenment, there was still contentious debate about whether or not all humans shared a common capacity for inner experience — that is, did peasants, slaves and savages (and women) have minds basically the same as rich white men, presumably as rational actors with independent-mindedness and abstract thought. The rigid boundaries of the hyper-individualistic ego-mind required millennia to be built up within the human psyche, initially considered the sole province of the educated elite that, from Plato onward, was portrayed as a patriarchal and paternalistic enlightened aristocracy.

The greatest of radical ideals was to challenge this self-serving claim of the privileged mind by demanding that all be treated as equals before God and government, in that through the ability to read all could have a personal relationship with God and through natural rights all could self-govern. Maybe it wasn’t merely a change in the perception of common humanity but a change within common humanity itself. As modernity came into dominance, the inner sense of self with the accompanying inner speech became an ever more prevalent experience. Something rare among the elite not too many centuries earlier had suddenly become common among the commoners.

With minds of their own, quite literally, the rabble became rabblerousers who no longer mindlessly bowed down to their betters. The external commands and demands of the ancien régime lost their grip as individuality became the norm. What replaced them was what Jaynes referred to as self-authorization, very much dependent on an inner voice. But it is interesting that such a long incubation period might have been required, considering this new mindset had first taken root back in the Axial Age. It sometimes can be a slow process for new memes to filter across vast geographic populations and seep down into the masses.

So what might the premodern mentality have been like? At Hurlburt’s piece, I noticed some comments about personal experience. One anonymous person mentioned, after brain trauma, “LOSING my inner voice. It is a totally different sensation/experience of reality. […] It is totally unlike anything I had ever known, I felt “simple” my day to day routines where driven only by images related to my goals (example: seeing Toothbrush and knowing my goals is to brush my teeth) and whenever I needed to recite something or create thoughts for communication, it seemed I could only conjure up the first thoughts to come to my mind without any sort of filter. And I would mumble and whisper to myself in Lue of the inner voice. But even when mumbling and whispering there was NO VOICE in my head. Images, occasionally. Other than that I found myself being almost hyper-aware of my surroundings with my incoming visual stimuli as the primary focus throughout my day.”

This person said a close comparison was being in the zone, sometimes referred to as runner’s high. That got me thinking about various factors that can shut down the normal functioning of the egoic mind. Extreme physical activity forces the mind into a mode that isn’t experienced that often and extensively by people in the modern world, a state of mind combining exhaustion, endorphins, and ketosis — a state of mind, on the other hand, that would have been far from uncommon before modernity, with some arguing ketosis was once the normal mode of neurocognitive functioning. Related to this, it has been argued that the abstractions of Enlightenment thought were fueled by the imperial sugar trade, maybe the first time a permanent non-ketogenic mindset was possible in the Western world. What sugar (i.e., glucose), especially when mixed with the other popular trade items of tea and coffee, makes possible is thinking and reading (i.e., inner experience) for long periods of time without mental tiredness. During the Enlightenment, the modern mind was born of a drugged-up buzz. That is one interpretation. Whatever the cause, something changed.

Also, in the comment section of that article, I came across a perfect description of self-authorization. Carla said that, “There are almost always words inside my head. In fact, I’ve asked people I live with to not turn on the radio in the morning. When they asked why, they thought my answer was weird: because it’s louder than the voice in my head and I can’t perform my morning routine without that voice.” We are all like that to some extent. But for most of us, self-authorization has become so natural as to largely go unnoticed. Unlike Carla, the average person learns to hear their own inner voice despite external sounds. I’m willing to bet that, if tested, Carla would show results of having thin mental boundaries and probably an accordingly weaker egoic will to force her self-authorization onto situations. Some turn to sugar and caffeine (or else nicotine and other drugs) to help shore up rigid thick boundaries and maintain focus in this modern world filled with distractions — likely a contributing factor to drug addiction.

In Abram’s book, The Spell of the Sensuous, he emphasizes the connection between sight and sound. By way of reading, seeing words becomes hearing words in one’s own mind. This is made possible by the perceptual tendency to associate sight and sound, the two main indicators of movement, with a living other such as an animal moving through the underbrush. Maybe this is what creates the sense of a living other within, a Jaynesian consciousness as interiorized metaphorical space. The magic of hearing words inside puts a spell on the mind, invoking a sense of inner being separate from the outer world. This is how reading can conjure forth an entire visuospatial experience of narratized world, sometimes as compellingly real as our mundane lives or more so. To hear and see, even if only imagined inwardly, is to make real.

Yet many lose the ability to visualize as they age. I wonder if that has to do with how the modern world until recently was almost exclusively focused on text. It’s only now that a new generation has been so fully raised on the visual potency of 24/7 cable and the online world, and unlike past generations they might remain more visually-oriented into old age. The loss of visual imagination might have been more of a quirk of printed text, the visual not so much disappearing as being subverted into sound as the ego’s own voice became insular. But even when we are unaware of it, maybe the visual remains as the light in the background that makes interior space visible, like a lamp in a sonorous cave, the lamp barely offering enough light to allow us to follow the sound further into the darkness. Bertrand Russell went so far as to argue that “mental imagery is the essence of the meaning of words in most cases” (Bertrand Russell: Unconscious Terrors; Murder, Rage and Mental Imagery). It is the visual that makes the aural come alive with meaning — as Russell put it:

“it is nevertheless the possibility of a memory image in the child and an imagination image in the hearer that makes the essence of the ‘meaning’ of the words. In so far as this is absent, the words are mere counters, capable of meaning, but not at the moment possessing it.”

Jaynes resolves the seeming dilemma by proposing the visuospatial as a metaphorical frame in which the mind operates, rather than itself being the direct focus of thought. To combine this with Russell’s view: as the visual recedes from awareness, abstract thought recedes from the visceral sense of meaning of the outer world. This is how modern humanity, ever more lost in thought, has lost contact with the larger world of nature and universe, with a shrinking number of people still regularly experiencing a wilderness vista or the full starry sky. Our entire world turns inward and loses its vividness, becomes smaller, the dividing boundaries thicker. Our minds become ruled by Russell’s counters of meaning (i.e., symbolic proxies), rather than meaning directly. That may be changing, though, in this new era of visually-saturated media. Even books, as audiobooks, can now be heard outwardly in the voice of another. The rigid walls of the ego, so carefully constructed over centuries, are being cracked open again. If so, we might see a merging back together of the separated senses, which could manifest as a return of synaesthesia as a common experience and with it a resurgence of metaphorical thought that hews close to the sensory world, the fertile ground of meaning. About a talk by Vilayanur S. Ramachandran, Maureen Seaberg writes (The Sea of Similitude):

“The refined son of an Indian diplomat explains that synesthesia was discovered by Sir Francis Galton, cousin of Charles Darwin, and that its name is derived from the Greek words for joined sensations. Next, he says something that really gets me to thinking – that there is greater cross wiring in the brains of synesthetes. This has enormous implications. “Now, if you assume that this greater cross wiring and concepts are also in different parts of the brain [than just where the synesthesia occurs], then it’s going to create a greater propensity towards metaphorical thinking and creativity in people with synesthesia. And, hence, the eight times more common incidence of synesthesia among poets, artists and novelists,” he says.

“In 2005, Dr. Ramachandran and his colleagues at the University of California at San Diego identified where metaphors are likely generated in the brain by studying people who could no longer understand metaphor because of brain damage. Proving once again the maxim that nature speaks through exceptions, they tested four patients who had experienced injuries to the left angular gyrus region. In May 2005, Scientific American reported on this and pointed out that although the subjects were bright and good communicators, when the researchers presented them with common proverbs and metaphors such as “the grass is always greener on the other side” and “reaching for the stars,” the subjects interpreted the sayings literally almost all the time. Their metaphor centers – now identified – had been compromised by the damage and the people just didn’t get the symbolism. Interestingly, synesthesia has also been found to occur mostly in the fusiform and angular gyrus – it’s in the same neighborhood. […]

“Facility with metaphor is a “thing” in synesthesia. Not only do Rama’s brain studies prove it, but I’ve noticed synesthetes seldom choose the expected, clichéd options when forming the figures of speech that describe a thing in a way that is symbolic to explain an idea or make comparisons. It would be more enviable were it not completely involuntary and automatic. In our brains without borders, it just works that way. Our neuronal nets are more interwoven.”

The meeting of synaesthesia and metaphor opens up to our greater, if largely forgotten, humanity. As Jaynes and many others have made clear, those in the distant past and those still living in isolated tribes experience the world far differently than we do. This can be seen in odd uses of language in ancient texts, which we may take as mere turns of phrase, as mere metaphor. But what if these people so foreign to us took their own metaphors quite literally, so to speak? In another post by Maureen Seaberg (The Shamanic Synesthesia of the Kalahari Bushmen), there are clear examples of this:

“The oldest cultures found that ecstatic experience expands our awareness and in its most special form, the world is experienced through more sensory involvement and presence, he says. “The shaman’s transition into ecstasy brought about what we call synesthesia today. But there was more involved than just passively experiencing it. The ecstatic shaman also performed sound, movement, and made reference to vision, smell, and taste in ways that helped evoke extraordinary experiences in others. They were both recipients and performers of multi-sensory theatres. Of course this is nothing like the weekend workshop shamans of the new age who are day dreaming rather than shaking wildly…. Rhythm, especially syncopated African drumming, excites the whole body to feel more intensely. Hence, it is valued as a means of ‘getting there’. A shaman (an ecstatic performer) played all the senses.” If this seems far afield from Western experience, consider that in Exodus 20:18, as Moses ascended Mt. Sinai to retrieve the tablets, the people present were said to have experienced synesthesia. “And all the people saw the voices” of heaven, it says. And we know synesthesia happens even in non-synesthetes during meditation — a heightened state.”

The metaphorical ground of synaesthesia is immersive and participatory. It is a world alive with meaning. Sacrificing this to create our separate, sensory-deprived egoic consciousness was a costly trade, despite all that we gained in wielding power over the world. During the Bronze Age, when written language still had metaphorical mud on its living roots, what Jaynes calls the bicameral mind would have been closer to this animistic mindset. A metaphor in that experiential reality was far more than what we now know of as metaphor. The world was alive with beings and voices. This isn’t only the origins of our humanity, for it remains the very ground of our being, the source of what we have become — language most of all (“First came the temple, then the city.”):

“Looking at an even more basic level, I was reading Mark Changizi’s Harnessed. He argues that (p. 11), “Speech and music culturally evolved over time to be simulacra of nature.” That reminded me of Lynne Kelly’s description of how indigenous people would use vocal techniques and musical instruments to mimic natural sounds, as a way of communicating and passing on complex knowledge of the world. Changizi’s argument is based on the observation that “human speech sounds like solid-object physical events” and that “music sounds like humans moving and behaving (usually expressively)” (p. 19). Certain sounds give information about what is going on in the immediate environment, specifically sounds related to action and movement. This sound-based information processing would make for an optimal basis of language formation. This is given support from evidence that Kelly describes in her own books.

“This also touches upon the intimate relationship language has to music, dance, and gesture. Language is inseparable from our experience of being in the world, involving multiple senses or even synaesthesia. The overlapping of sensory experience may have been more common to earlier societies. Research has shown that synaesthetes have better capacity for memory: “spatial sequence synesthetes have a built-in and automatic mnemonic reference” (Wikipedia). That is relevant considering that memory is central to oral societies, as Kelly demonstrates. And the preliterate memory systems are immensely vast, potentially incorporating the equivalent of thousands of pages of info. Knowledge and memory isn’t just in the mind but within the entire sense of self, sense of community, and sense of place.”

We remain haunted by the past (“Beyond that, there is only awe.”):

“Through authority and authorization, immense power and persuasion can be wielded. Jaynes argues that it is central to the human mind, but that in developing consciousness we learned how to partly internalize the process. Even so, Jaynesian self-consciousness is never a permanent, continuous state and the power of individual self-authorization easily morphs back into external forms. This is far from idle speculation, considering authoritarianism still haunts the modern mind. I might add that the ultimate power of authoritarianism, as Jaynes makes clear, isn’t overt force and brute violence. Outward forms of power are only necessary to the degree that external authorization is relatively weak, as is typically the case in modern societies.

If you are one of those who clearly hears a voice in your head, appreciate all that went into creating and constructing it. This is an achievement of our entire civilization. But also realize how precarious is this modern mind. It’s a strange thing to contemplate. What is that voice that speaks? And who is it that is listening? Now imagine what it would be like if, as with the bicameral gods going silent, your own god-like ego went silent. And imagine this silence spreading across all of society, an entire people suddenly having lost their self-authorization to act, their very sense of identity and social reality. Don’t take for granted that voice within.

* * *

Below is a passage from a book I read long ago, maybe back when it was first published in 1996. The description of cognitive change almost could have been lifted straight out of Julian Jaynes’ book from twenty years earlier (e.g., the observation of the gods becoming silent). Abram doesn’t mention Jaynes, and it’s possible he was unfamiliar with his work, whether or not there was an indirect influence. The kinds of ideas Jaynes was entertaining had been floating around for a long while before him as well. The unique angle that Abram brings in this passage is framing it all within synaesthesia.

The Spell of the Sensuous
by David Abram
p. 69

Although contemporary neuroscientists study “synaesthesia”—the overlap and blending of the senses—as though it were a rare or pathological experience to which only certain persons are prone (those who report “seeing sounds,” “hearing colors,” and the like), our primordial, preconceptual experience, as Merleau-Ponty makes evident, is inherently synaesthetic. The intertwining of sensory modalities seems unusual to us only to the extent that we have become estranged from our direct experience (and hence from our primordial contact with the entities and elements that surround us):

…Synaesthetic perception is the rule, and we are unaware of it only because scientific knowledge shifts the center of gravity of experience, so that we have unlearned how to see, hear, and generally speaking, feel, in order to deduce, from our bodily organization and the world as the physicist conceives it, what we are to see, hear, and feel. 20

pp. 131-144

It is remarkable that none of the major twentieth-century scholars who have directed their attention to the changes wrought by literacy have seriously considered the impact of writing—and, in particular, phonetic writing—upon the human experience of the wider natural world. Their focus has generally centered upon the influence of phonetic writing on the structure and deployment of human language, 53 on patterns of cognition and thought, 54 or upon the internal organization of human societies. 55 Most of the major research, in other words, has focused upon the alphabet’s impact on processes either internal to human society or presumably “internal” to the human mind. Yet the limitation of such research—its restriction within the bounds of human social interaction and personal interiority—itself reflects an anthropocentric bias wholly endemic to alphabetic culture. In the absence of phonetic literacy, neither society, nor language, nor even the experience of “thought” or consciousness, can be pondered in isolation from the multiple nonhuman shapes and powers that lend their influence to all our activities (we need think only of our ceaseless involvement with the ground underfoot, with the air that swirls around us, with the plants and animals that we consume, with the daily warmth of the sun and the cyclic pull of the moon). Indeed, in the absence of formal writing systems, human communities come to know themselves primarily as they are reflected back by the animals and the animate landscapes with which they are directly engaged. This epistemological dependence is readily evidenced, on every continent, by the diverse modes of identification commonly categorized under the single term “totemism.”

It is exceedingly difficult for us literates to experience anything approaching the vividness and intensity with which surrounding nature spontaneously presents itself to the members of an indigenous, oral community. Yet as we saw in the previous chapters, Merleau-Ponty’s careful phenomenology of perceptual experience had begun to disclose, underneath all of our literate abstractions, a deeply participatory relation to things and to the earth, a felt reciprocity curiously analogous to the animistic awareness of indigenous, oral persons. If we wish to better comprehend the remarkable shift in the human experience of nature that was occasioned by the advent and spread of phonetic literacy, we would do well to return to the intimate analysis of sensory perception inaugurated by Merleau-Ponty. For without a clear awareness of what reading and writing amounts to when considered at the level of our most immediate, bodily experience, any “theory” regarding the impact of literacy can only be provisional and speculative.

Although Merleau-Ponty himself never attempted a phenomenology of reading or writing, his recognition of the importance of synaesthesia—the overlap and intertwining of the senses—resulted in a number of experiential analyses directly pertinent to the phenomenon of reading. For reading, as soon as we attend to its sensorial texture, discloses itself as a profoundly synaesthetic encounter. Our eyes converge upon a visible mark, or a series of marks, yet what they find there is a sequence not of images but of sounds, something heard; the visible letters, as we have said, trade our eyes for our ears. Or, rather, the eye and the ear are brought together at the surface of the text—a new linkage has been forged between seeing and hearing which ensures that a phenomenon apprehended by one sense is instantly transposed into the other. Further, we should note that this sensory transposition is mediated by the human mouth and tongue; it is not just any kind of sound that is experienced in the act of reading, but specifically human, vocal sounds—those which issue from the human mouth. It is important to realize that the now common experience of “silent” reading is a late development in the story of the alphabet, emerging only during the Middle Ages, when spaces were first inserted between the words in a written manuscript (along with various forms of punctuation), enabling readers to distinguish the words of a written sentence without necessarily sounding them out audibly. Before this innovation, to read was necessarily to read aloud, or at the very least to mumble quietly; after the twelfth century it became increasingly possible to internalize the sounds, to listen inwardly to phantom words (or the inward echo of words once uttered). 56

Alphabetic reading, then, proceeds by way of a new synaesthetic collaboration between the eye and the ear, between seeing and hearing. To discern the consequences of this new synaesthesia, we need to examine the centrality of synaesthesia in our perception of others and of the earth.

The experiencing body (as we saw in Chapter 2) is not a self-enclosed object, but an open, incomplete entity. This openness is evident in the arrangement of the senses: I have these multiple ways of encountering and exploring the world—listening with my ears, touching with my skin, seeing with my eyes, tasting with my tongue, smelling with my nose—and all of these various powers or pathways continually open outward from the perceiving body, like different paths diverging from a forest. Yet my experience of the world is not fragmented; I do not commonly experience the visible appearance of the world as in any way separable from its audible aspect, or from the myriad textures that offer themselves to my touch. When the local tomcat comes to visit, I do not have distinctive experiences of a visible cat, an audible cat, and an olfactory cat; rather, the tomcat is precisely the place where these separate sensory modalities join and dissolve into one another, blending as well with a certain furry tactility. Thus, my divergent senses meet up with each other in the surrounding world, converging and commingling in the things I perceive. We may think of the sensing body as a kind of open circuit that completes itself only in things, and in the world. The differentiation of my senses, as well as their spontaneous convergence in the world at large, ensures that I am a being destined for relationship: it is primarily through my engagement with what is not me that I effect the integration of my senses, and thereby experience my own unity and coherence. 57 […]

The diversity of my sensory systems, and their spontaneous convergence in the things that I encounter, ensures this interpenetration or interweaving between my body and other bodies—this magical participation that permits me, at times, to feel what others feel. The gestures of another being, the rhythm of its voice, and the stiffness or bounce in its spine all gradually draw my senses into a unique relation with one another, into a coherent, if shifting, organization. And the more I linger with this other entity, the more coherent the relation becomes, and hence the more completely I find myself face-to-face with another intelligence, another center of experience.

In the encounter with the cyclist, as in my experience of the blackbird, the visual focus induced and made possible the participation of the other senses. In different situations, other senses may initiate the synaesthesia: our ears, when we are at an orchestral concert; or our nostrils, when a faint whiff of burning leaves suddenly brings images of childhood autumns; our skin, when we are touching or being touched by a lover. Nonetheless, the dynamic conjunction of the eyes has a particularly ubiquitous magic, opening a quivering depth in whatever we focus upon, ceaselessly inviting the other senses into a concentrated exchange with stones, squirrels, parked cars, persons, snow-capped peaks, clouds, and termite-ridden logs. This power—the synaesthetic magnetism of the visual focus—will prove crucial for our understanding of literacy and its perceptual effects.

The most important chapter of Merleau-Ponty’s last, unfinished work is entitled “The Intertwining—The Chiasm.” The word “chiasm,” derived from an ancient Greek term meaning “crisscross,” is in common use today only in the field of neurobiology: the “optic chiasm” is that anatomical region, between the right and left hemispheres of the brain, where neuronal fibers from the right eye and the left eye cross and interweave. As there is a chiasm between the two eyes, whose different perspectives continually conjoin into a single vision, so—according to Merleau-Ponty—there is a chiasm between the various sense modalities, such that they continually couple and collaborate with one another. Finally, this interplay of the different senses is what enables the chiasm between the body and the earth, the reciprocal participation—between one’s own flesh and the encompassing flesh of the world—that we commonly call perception. 59

Phonetic reading, of course, makes use of a particular sensory conjunction—that between seeing and hearing. And indeed, among the various synaesthesias that are common to the human body, the confluence (or chiasm) between seeing and hearing is particularly acute. For vision and hearing are the two “distance” senses of the human organism. In contrast to touch and proprioception (inner-body sensations), and unlike the chemical senses of taste and smell, seeing and hearing regularly place us in contact with things and events unfolding at a substantial distance from our own visible, audible body.

My visual gaze explores the reflective surfaces of things, their outward color and contour. By following the play of light and shadow, the dance of colors, and the gradients of repetitive patterns, the eyes—themselves gleaming surfaces—keep me in contact with the multiple outward facets, or faces, of the things arrayed about me. The ears, meanwhile, are more inward organs; they emerge from the depths of my skull like blossoms or funnels, and their participation tells me less about the outer surface than the interior substance of things. For the audible resonance of beings varies with their material makeup, as the vocal calls of different animals vary with the size and shape of their interior cavities and hollows. I feel their expressive cries resound in my skull or my chest, echoing their sonorous qualities with my own materiality, and thus learn of their inward difference from myself. Looking and listening bring me into contact, respectively, with the outward surfaces and with the interior voluminosity of things, and hence where these senses come together, I experience, over there, the complex interplay of inside and outside that is characteristic of my own self-experience. It is thus at those junctures in the surrounding landscape where my eyes and my ears are drawn together that I most readily feel myself confronted by another power like myself, another life. […]

Yet our ears and our eyes are drawn together not only by animals, but by numerous other phenomena within the landscape. And, strangely, wherever these two senses converge, we may suddenly feel ourselves in relation with another expressive power, another center of experience. Trees, for instance, can seem to speak to us when they are jostled by the wind. Different forms of foliage lend each tree a distinctive voice, and a person who has lived among them will easily distinguish the various dialects of pine trees from the speech of spruce needles or Douglas fir. Anyone who has walked through cornfields knows the uncanny experience of being scrutinized and spoken to by whispering stalks. Certain rock faces and boulders request from us a kind of auditory attentiveness, and so draw our ears into relation with our eyes as we gaze at them, or with our hands as we touch them—for it is only through a mode of listening that we can begin to sense the interior voluminosity of the boulder, its particular density and depth. There is an expectancy to the ears, a kind of patient receptivity that they lend to the other senses whenever we place ourselves in a mode of listening—whether to a stone, or a river, or an abandoned house. That so many indigenous people allude to the articulate speech of trees or of mountains suggests the ease with which, in an oral culture, one’s auditory attention may be joined with the visual focus in order to enter into a living relation with the expressive character of things.

Far from presenting a distortion of their factual relation to the world, the animistic discourse of indigenous, oral peoples is an inevitable counterpart of their immediate, synaesthetic engagement with the land that they inhabit. The animistic proclivity to perceive the angular shape of a boulder (while shadows shift across its surface) as a kind of meaningful gesture, or to enter into felt conversations with clouds and owls—all of this could be brushed aside as imaginary distortion or hallucinatory fantasy if such active participation were not the very structure of perception, if the creative interplay of the senses in the things they encounter was not our sole way of linking ourselves to those things and letting the things weave themselves into our experience. Direct, prereflective perception is inherently synaesthetic, participatory, and animistic, disclosing the things and elements that surround us not as inert objects but as expressive subjects, entities, powers, potencies.

And yet most of us seem, today, very far from such experience. Trees rarely, if ever, speak to us; animals no longer approach us as emissaries from alien zones of intelligence; the sun and the moon no longer draw prayers from us but seem to arc blindly across the sky. How is it that these phenomena no longer address us, no longer compel our involvement or reciprocate our attention? If participation is the very structure of perception, how could it ever have been brought to a halt? To freeze the ongoing animation, to block the wild exchange between the senses and the things that engage them, would be tantamount to freezing the body itself, stopping it short in its tracks. And yet our bodies still move, still live, still breathe. If we no longer experience the enveloping earth as expressive and alive, this can only mean that the animating interplay of the senses has been transferred to another medium, another locus of participation.

It is the written text that provides this new locus. For to read is to enter into a profound participation, or chiasm, with the inked marks upon the page. In learning to read we must break the spontaneous participation of our eyes and our ears in the surrounding terrain (where they had ceaselessly converged in the synaesthetic encounter with animals, plants, and streams) in order to recouple those senses upon the flat surface of the page. As a Zuñi elder focuses her eyes upon a cactus and hears the cactus begin to speak, so we focus our eyes upon these printed marks and immediately hear voices. We hear spoken words, witness strange scenes or visions, even experience other lives. As nonhuman animals, plants, and even “inanimate” rivers once spoke to our tribal ancestors, so the “inert” letters on the page now speak to us! This is a form of animism that we take for granted, but it is animism nonetheless—as mysterious as a talking stone.

And indeed, it is only when a culture shifts its participation to these printed letters that the stones fall silent. Only as our senses transfer their animating magic to the written word do the trees become mute, the other animals dumb.

But let us be more precise, recalling the distinction between different forms of writing discussed at the start of this chapter. As we saw there, pictographic, ideographic, and even rebuslike writing still makes use of, or depends upon, our sensorial participation with the natural world. As the tracks of moose and bear refer beyond themselves to those entities of whom they are the trace, so the images in early writing systems draw their significance not just from ourselves but from sun, moon, vulture, jaguar, serpent, lightning—from all those sensorial, never strictly human powers, of which the written images were a kind of track or tracing. To be sure, these signs were now inscribed by human hands, not by the hooves of deer or the clawed paws of bear; yet as long as they presented images of paw prints and of clouds, of sun and of serpent, these characters still held us in relation to a more-than-human field of discourse. Only when the written characters lost all explicit reference to visible, natural phenomena did we move into a new order of participation. Only when those images came to be associated, alphabetically, with purely human-made sounds, and even the names of the letters lost all worldly, extrahuman significance, could speech or language come to be experienced as an exclusively human power. For only then did civilization enter into the wholly self-reflexive mode of animism, or magic, that still holds us in its spell:

We know what the animals do, what are the needs of the beaver, the bear, the salmon, and other creatures, because long ago men married them and acquired this knowledge from their animal wives. Today the priests say we lie, but we know better. The white man has been only a short time in this country and knows very little about the animals; we have lived here thousands of years and were taught long ago by the animals themselves. The white man writes everything down in a book so that it will not be forgotten; but our ancestors married animals, learned all their ways, and passed on this knowledge from one generation to another. 60

That alphabetic reading and writing was itself experienced as a form of magic is evident from the reactions of cultures suddenly coming into contact with phonetic writing. Anthropological accounts from entirely different continents report that members of indigenous, oral tribes, after seeing the European reading from a book or from his own notes, came to speak of the written pages as “talking leaves,” for the black marks on the flat, leaflike pages seemed to talk directly to the one who knew their secret.

The Hebrew scribes never lost this sense of the letters as living, animate powers. Much of the Kabbalah, the esoteric body of Jewish mysticism, is centered around the conviction that each of the twenty-two letters of the Hebrew aleph-beth is a magic gateway or guide into an entire sphere of existence. Indeed, according to some kabbalistic accounts, it was by combining the letters that the Holy One, Blessed Be He, created the ongoing universe. The Jewish kabbalists found that the letters, when meditated upon, would continually reveal new secrets; through the process of tzeruf, the magical permutation of the letters, the Jewish scribe could bring himself into successively greater states of ecstatic union with the divine. Here, in other words, was an intensely concentrated form of animism—a participation conducted no longer with the sculpted idols and images worshiped by other tribes but solely with the visible letters of the aleph-beth.

Perhaps the most succinct evidence for the potent magic of written letters is to be found in the ambiguous meaning of our common English word “spell.” As the Roman alphabet spread through oral Europe, the Old English word “spell,” which had meant simply to recite a story or tale, took on a new double meaning: on the one hand, it now meant to arrange, in the proper order, the written letters that constitute the name of a thing or a person; on the other, it signified a magic formula or charm. Yet these two meanings were not nearly as distinct as they have come to seem to us today. For to assemble the letters that make up the name of a thing, in the correct order, was precisely to effect a magic, to establish a new kind of influence over that entity, to summon it forth! To spell, to correctly arrange the letters to form a name or a phrase, seemed thus at the same time to cast a spell, to exert a new and lasting power over the things spelled. Yet we can now realize that to learn to spell was also, and more profoundly, to step under the influence of the written letters ourselves, to cast a spell upon our own senses. It was to exchange the wild and multiplicitous magic of an intelligent natural world for the more concentrated and refined magic of the written word.

The Bulgarian scholar Tzvetan Todorov has written an illuminating study of the Spanish conquest of the Americas, based on extensive study of documents from the first months and years of contact between European culture and the native cultures of the American continent. 61 The lightning-swift conquest of Mexico by Cortéz has remained a puzzle for historians, since Cortéz, leading only a few hundred men, managed to seize the entire kingdom of Montezuma, who commanded several hundred thousand. Todorov concludes that Cortéz’s astonishing and rapid success was largely a result of the discrepancy between the different forms of participation engaged in by the two societies. The Aztecs, whose writing was highly pictorial, necessarily felt themselves in direct communication with an animate, more-than-human environment. “Everything happens as if, for the Aztecs, [written] signs automatically and necessarily proceed from the world they designate…”; the Aztecs are unable to use their spoken words, or their written characters, to hide their true intentions, since these signs belong to the world around them as much as to themselves. 62 To be duplicitous with signs would be, for the Aztecs, to go against the order of nature, against the encompassing speech or logos of an animate world, in which their own tribal discourse was embedded.

The Spaniards, however, suffer no such limitation. Possessed of an alphabetic writing system, they experience themselves not in communication with the sensuous forms of the world, but solely with one another. The Aztecs must answer, in their actions as in their speech, to the whole sensuous, natural world that surrounds them; the Spanish need answer only to themselves.

In contact with this potent new magic, with these men who participate solely with their own self-generated signs, whose speech thus seems to float free of the surrounding landscape, and who could therefore be duplicitous and lie even in the presence of the sun, the moon, and the forest, the Indians felt their own rapport with those sensuous powers, or gods, beginning to falter:

The testimony of the Indian accounts, which is a description rather than an explanation, asserts that everything happened because the Mayas and the Aztecs lost control of communication. The language of the gods has become unintelligible, or else these gods fell silent. “Understanding is lost, wisdom is lost” [from the Mayan account of the Spanish invasion]…. As for the Aztecs, they describe the beginning of their own end as a silence that falls: the gods no longer speak to them. 63

In the face of aggression from this new, entirely self-reflexive form of magic, the native peoples of the Americas—like those of Africa and, later, of Australia—felt their own magics wither and become useless, unable to protect them.