Bicameralism and Bilingualism

A paper on multilingualism was posted by Eva Dunkel in the Facebook group for The Origin of Consciousness in the Breakdown of the Bicameral Mind: Consequences of multilingualism for neural architecture by Sayuri Hayakawa and Viorica Marian. It is a great find. The authors look at how multiple languages are processed within the brain and how they can alter brain structure.

This probably also relates to the learning of music, art, and math — one might add that learning music improves the later ability to learn math. These are basically other kinds of languages, especially music: the existence of musical languages (along with whistle and hum languages) might indicate that language originated in music, not to mention the close relationship music has to dance, movement, and behavior, and the close relationship of music to group identity. The archaic authorization of command voices in the bicameral mind quite likely came in the form of music, and one could imagine the kinds of synchronized collective activities that could have dominated life and work in bicameral societies. There is something powerful about language that we tend to overlook and take for granted. Also, since language is so embedded in culture, monolinguals never see outside of the cultural reality tunnel they exist within. This could bring us to wonder about the role played in post-bicameral society by syncretic languages like English. We can’t forget the influence psychedelics might have had on language development and learning at different periods of human existence. And with psychedelics, there is the connection to shamanism, with caves as aural spaces and locations of art, possibly the earliest origin of proto-writing.

There is no reason to give mathematics a mere secondary place in our considerations. Numeracy might be important as well in thinking about the bicameral mind specifically and certainly about the human mind in general (Caleb Everett, Numbers and the Making of Us), as numeracy was an advancement or complexification beyond the innumerate tribal societies (e.g., the Piraha). Some of the earliest uses of writing were for calculations: accounting, taxation, astrology, etc. Bicameral societies, specifically the early city-states, can seem simplistic in many ways with their lack of complex hierarchies, large centralized governments, standing armies, police forces, or even basic infrastructure such as maintained roads and bridges. Yet they were capable of immense projects that required impressively high levels of planning, organizing, and coordination — as seen with the massive archaic pyramids and other structures built around the world. It’s strange how later empires, in the Axial Age and beyond, though so much larger and more extensive with greater wealth and resources, rarely even attempted the seemingly impossible architectural feats of bicameral humans. Complex mathematical systems probably played a major role in the bicameral mind, as seen in how astrological calculations sometimes extended over millennia.

Hayakawa and Marian’s paper could add to the explanation of the breakdown of the bicameral mind. A central focus of their analysis is the increased executive function and neural integration in managing two linguistic inputs — I could see how that would relate to the development of egoic consciousness. It has been proposed that the first to develop Jaynesian consciousness may have been traders who were required to cross cultural boundaries and, of course, who would have been forced to learn multiple languages. As bicameral societies came into regular contact with more diverse linguistic cultures, their bicameral cognitive and social structures would have been increasingly stressed.

Multilingualism goes hand in hand with literacy. Rates of both have increased over the millennia. That would have been a major force in the post-bicameral Axial Age. The immense multiculturalism of societies like the Roman Empire is almost impossible for us to imagine. Hundreds of ethnicities, each with their own language, would co-exist in the same city and sometimes the same neighborhood. On a single street, there could be hundreds of shrines to diverse gods, with people praying, invoking, and chanting incantations in their separate languages. These individuals were suddenly forced to deal with complete strangers and to learn some basic understanding of foreign languages and hence of foreign understandings.

This was simultaneous with the rise of literacy and its importance to society, which has only grown over time as the rate of book reading continues to climb (more books are printed in a year these days than were produced in the first several millennia of writing). Still, it was only quite recently that the majority of the population became literate; following from that is the capacity for silent reading and its correlate of inner speech. Multilingualism is close behind and catching up. The consciousness revolution is still under way. I’m willing to bet American society will be transformed as we return to multilingualism as the norm, considering that in the first centuries of American history there was immense multilingualism (e.g., German was once one of the most widely spoken languages in North America).

All of this reminds me of linguistic relativity. I’ve pointed out that, though not explicitly stated, Jaynes obviously was referring to linguistic relativity in his own theorizing about language. He talked quite directly about the power language — and the metaphors within language — had over thought, perception, behavior, and identity (Anke Snoek has some good insights about this in exploring the thought of Giorgio Agamben). This was an idea maybe first expressed by Wilhelm von Humboldt (On Language) in 1836: “Via the latter, qua character of a speech-sound, a pervasive analogy necessarily prevails in the same language; and since a like subjectivity also affects language in the same nation, there resides in every language a characteristic world-view.” And Humboldt even considered the power of learning another language, stating that “To learn a foreign language should therefore be to acquire a new standpoint in the world-view hitherto possessed, and in fact to a certain extent is so, since every language contains the whole conceptual fabric and mode of presentation of a portion of mankind.”

Multilingualism is multiperspectivism, a core element of the modern mind and the modern way of being in the world. Language has the power to transform us. To study language, to learn a new language, is to become something different. Each language is not only a separate worldview but locks into place a different sense of self, a persona. This would be true not only for learning different cultural languages but also for learning different professional languages with their respective sets of terminology, as the modern world has diverse areas with their own ways of talking, and we modern humans have to deal with this complexity on a regular basis, whether we are talking about tax codes or dietary lingo.

It’s hard to know what that means for humanity’s trajectory across the millennia. But the more we are caught within linguistic worlds and are forced to navigate our way within them, the greater the need for a strong egoic individuality to self-initiate action, that is to say the self-authorization of Jaynesian consciousness. We step further back into our own internal space of meta-cognitive metaphor. To know more than one language strengthens an identity separate from any given language. The egoic self retreats behind its walls and looks out from its parapets. Language, rather than being the world we are immersed in, becomes the world we are trapped in (a world that is no longer home and from which we seek to escape, Philip K. Dick’s Black Iron Prison and William S. Burroughs’ Control). It closes in on us and forces us to become more adaptive in order to evade its constraints.

The Crisis of Identity

“Have we lived too fast?”
~Dr. Silas Weir Mitchell, 1871
Wear and Tear, or Hints for the Overworked

I’ve been following Scott Preston over at his blog, Chrysalis. He has been writing on the same set of issues for a long time now, longer than I’ve been reading his blog. He reads widely and so draws on many sources, most of which I’m not familiar with, which is part of the reason I appreciate the work he does to pull together such informed pieces. A recent post, A Brief History of Our Disintegration, would give you a good sense of his intellectual project, although the word ‘intellectual’ sounds rather paltry for what he is describing:

“Around the end of the 19th century (called the fin de siecle period), something uncanny began to emerge in the functioning of the modern mind, also called the “perspectival” or “the mental-rational structure of consciousness” (Jean Gebser). As usual, it first became evident in the arts — a portent of things to come, but most especially as a disintegration of the personality and character structure of Modern Man and mental-rational consciousness.”

That time period has been an interest of mine as well. There are two books that come to mind that I’ve mentioned before: Tom Lutz’s American Nervousness, 1903 and Jackson Lears’ Rebirth of a Nation (for a discussion of the latter, see: Juvenile Delinquents and Emasculated Males). Both talk about that turn-of-the-century crisis, the psychological projections and physical manifestations, the social movements and political actions. A major concern was neurasthenia, which, according to the dominant economic paradigm, meant a deficit of ‘nervous energy’ or ‘nerve force’, the reserves of which, if not reinvested wisely but instead wasted, would lead to physical and psychological bankruptcy, leaving one spent (the term ‘neurasthenia’ was first used in 1829 and popularized by George Miller Beard in 1869, the same period when the related medical condition of ‘nostalgia’ began being diagnosed).

This was mixed up with sexuality in what Theodore Dreiser called the ‘spermatic economy’ (by the way, the catalogue for Sears, Roebuck and Company offered an electrical device to replenish nerve force that came with a genital attachment). Obsession with sexuality was used to reinforce gender roles in how neurasthenic patients were treated, following the practice of Dr. Silas Weir Mitchell: men were advised to become more active (the ‘West cure’) and women more passive (the ‘rest cure’), although some women “used neurasthenia to challenge the status quo, rather than enforce it. They argued that traditional gender roles were causing women’s neurasthenia, and that housework was wasting their nervous energy. If they were allowed to do more useful work, they said, they’d be reinvesting and replenishing their energies, much as men were thought to do out in the wilderness” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). That feminist-style argument, as I recall, came up in advertisements for Bernarr Macfadden’s fitness protocol in the early 1900s, encouraging (presumably middle-class) women to give up housework for exercise and so regain their vitality. Macfadden was also an advocate of living a fully sensuous life, going as far as free love.

Besides the gender wars, there was the ever-present bourgeois bigotry. Neurasthenia is the most civilized of the diseases of civilization since, in its original American conception, it was perceived as only afflicting middle-to-upper-class whites, especially WASPs — as Lutz says, “if you were lower class, and you weren’t educated and you weren’t Anglo Saxon, you wouldn’t get neurasthenic because you just didn’t have what it took to be damaged by modernity” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast), and so, according to Lutz’s book, people would make “claims to sickness as claims to privilege.” It was considered a sign of progress, but over time it came to be seen by some as the greatest threat to civilization, in either case offering much material for popular fictionalized portrayals. Being sick in this fashion was proof that one was a modern individual, an exemplar of advanced civilization, even if it came at immense cost — Julie Beck explains:

“The nature of this sickness was vague and all-encompassing. In his book Neurasthenic Nation, David Schuster, an associate professor of history at Indiana University-Purdue University Fort Wayne, outlines some of the possible symptoms of neurasthenia: headaches, muscle pain, weight loss, irritability, anxiety, impotence, depression, “a lack of ambition,” and both insomnia and lethargy. It was a bit of a grab bag of a diagnosis, a catch-all for nearly any kind of discomfort or unhappiness.

“This vagueness meant that the diagnosis was likely given to people suffering from a variety of mental and physical illnesses, as well as some people with no clinical conditions by modern standards, who were just dissatisfied or full of ennui. “It was really largely a quality-of-life issue,” Schuster says. “If you were feeling good and healthy, you were not neurasthenic, but if for some reason you were feeling run down, then you were neurasthenic.””

I’d point out how neurasthenia was seen as primarily caused by intellectual activity, as it became a descriptor of a common experience among the burgeoning middle class of often well-educated professionals and office workers. This relates to Weston A. Price’s work in the 1930s, as modern dietary changes first hit this demographic since they had the means to afford eating a fully industrialized Standard American Diet (SAD), long before others (within decades, though, SAD-caused malnourishment would wreck health at all levels of society). What this meant, in particular, was a diet high in processed carbs and sugar that coincided, because of Upton Sinclair’s 1906 muckraking novel about the meat-packing industry, The Jungle, with the early-1900s decrease in consumption of meat and saturated fats. As Price demonstrated, this was a vast change from the traditional diet found all over the world, including in rural Europe (and presumably in rural America, with most Americans not urbanized until the turn of the last century), that always included significant amounts of nutritious animal foods loaded up with fat-soluble vitamins, not to mention lots of healthy fats and cholesterol.

Prior to talk of neurasthenia, the exhaustion model of health, portrayed as waste and depletion, had taken hold in Europe centuries earlier (e.g., anti-masturbation panics) and had its roots in the humoral theory of bodily fluids. It has long been understood that food, specifically the macronutrients (carbohydrate, protein, and fat), affects mood and behavior — see the early literature on melancholy. During feudalism, food laws were used as a means of social control, such that in one case meat was prohibited prior to Carnival because of its energizing effect, which it was thought could lead to rowdiness or even revolt (Ken Albala & Trudy Eden, Food and Faith in Christian Culture).

There does seem to be a connection between an increase of intellectual activity and an increase of carbohydrates and sugar, a connection first appearing during the early colonial era that set the stage for the Enlightenment. It was the agricultural mind taken to a whole new level. Indeed, a steady flow of glucose is one way to fuel extended periods of brain work, such as reading and writing for hours on end and late into the night — the reason college students to this day will down sugary drinks while studying. Because of trade networks, Enlightenment thinkers were buzzing on the suddenly much more available simple carbs and sugar, with an added boost from caffeine and nicotine. The modern intellectual mind was drugged-up right from the beginning, and over time it took its toll. Such dietary highs inevitably lead to ever greater crashes of mood and health. Interestingly, Dr. Silas Weir Mitchell, who advocated the ‘rest cure’ and ‘West cure’ in treating neurasthenia and other ailments, additionally used a “meat-rich diet” for his patients (Ann Stiles, Go rest, young man). Other doctors of that era were even more direct in using specifically low-carb diets for various health conditions, often for obesity, which was also a focus of Dr. Mitchell.

Still, it goes far beyond diet. There has been a diversity of stressors that have continued to amass over the centuries of tumultuous change. The exhaustion of modern man (and typically the focus has been on men) had been building up for generations upon generations before it came to feel like a world-shaking crisis in the new industrialized world. The lens of neurasthenia was an attempt to grapple with what had changed, but the focus was too narrow. With the plague of neurasthenia, the atomized, commercialized man and woman couldn’t hold together. And so there was a temptation toward nationalistic projects, including wars, to revitalize the ailing soul and to suture the gash of social division and disarray. But this further wrenched out of alignment the traditional order that had once held society together, and what was lost mostly went without recognition. The individual was brought into the foreground of public thought, a lone protagonist in a social Darwinian world. In this melodramatic narrative of struggle and self-assertion, many individuals didn’t fare so well, and everything else suffered in the wake.

Tom Lutz writes that, “By 1903, neurasthenic language and representations of neurasthenia were everywhere: in magazine articles, fiction, poetry, medical journals and books, in scholarly journals and newspaper articles, in political rhetoric and religious discourse, and in advertisements for spas, cures, nostrums, and myriad other products in newspapers, magazines and mail-order catalogs” (American Nervousness, 1903, p. 2).

There was a sense of moral decline that was hard to grasp, although some people like Weston A. Price tried to dig down into concrete explanations of what had gone so wrong, the social and psychological changes observable during mass urbanization and industrialization. He was far from alone in his inquiries, having built on the prior observations of doctors, anthropologists, and missionaries. Other doctors and scientists were looking into the influences of diet in the mid-1800s and, by the 1880s, scientists were exploring a variety of biological theories. Their inability to pinpoint the cause maybe had more to do with their lack of a needed framework, as they touched upon numerous facets of biological functioning:

“Not surprisingly, laboratory experiments designed to uncover physiological changes in the nerve cell were inconclusive. European research on neurasthenics reported such findings as loss of elasticity of blood vessels, thickening of the cell wall, changes in the shape of nerve cells, or nerve cells that never advanced beyond an embryonic state. Another theory held that an overtaxed organism cannot keep up with metabolic requirements, leading to inadequate cell nutrition and waste excretion. The weakened cells cannot develop properly, while the resulting build-up of waste products effectively poisons the cells (so-called “autointoxication”). This theory was especially attractive because it seemed to explain the extreme diversity of neurasthenic symptoms: weakened or poisoned cells might affect the functioning of any organ in the body. Furthermore, “autointoxicants” could have a stimulatory effect, helping to account for the increased sensitivity and overexcitability characteristic of neurasthenics.” (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia)

This early scientific research could not lessen the mercurial sense of unease, as neurasthenia was from its inception a broad category that captured some greater shift in public mood, even as it so powerfully shaped the individual’s health. For all the effort, there were as many theories about neurasthenia as there were symptoms. Deeper insight was required. “[I]f a human being is a multiformity of mind, body, soul, and spirit,” writes Preston, “you don’t achieve wholeness or fulfillment by amputating or suppressing one or more of these aspects, but only by an effective integration of the four aspects.” But integration is easier said than done.

The modern human hasn’t been suffering from mere psychic wear and tear, for the individual body itself has been showing the signs of sickness, as the diseases of civilization have become harder and harder to ignore. On the societal level of human health, I’ve previously shared passages from Lears (see here) — he discusses the vitalist impulse that was the response to the turmoil, and vitalism was often explored in terms of physical health as the most apparent manifestation, although social and spiritual health were just as often spoken of in the same breath. The whole person was under assault by an accumulation of stressors, and the increasingly isolated individual didn’t have the resources to fight them off.

By the way, this was far from being limited to America. Europeans picked up the discussion of neurasthenia and took it in other directions, often with less optimism about progress, and with some thinkers emphasizing social interpretations that placed specific blame on hyper-individualism (Laura Goering, “Russian Nervousness”: Neurasthenia and National Identity in Nineteenth-Century Russia). Thoughts on neurasthenia became mixed up with earlier speculations on nostalgia and romanticized notions of rural life. More importantly, Russian thinkers in particular understood that the problems of modernity weren’t limited to the upper classes but extended across entire populations, as a result of how societies had been turned on their heads during that fractious century of revolutions.

In looking around, I came across some other interesting stuff. In the 1901 Nervous and Mental Diseases by Archibald Church and Frederick Peterson, the authors, in the chapter on “Mental Disease”, are keen to further the description, categorization, and labeling of ‘insanity’. And I noted their concern with physiological asymmetry, a concern later shared by Price, among many others going back to the prior century.

Maybe asymmetry was not only indicative of developmental issues but also symbolic of a deeper imbalance. The attempts at phrenological analysis of psychiatric, criminal, and anti-social behavior were off-base; and, despite the bigotry and proto-genetic determinism among racists using these kinds of ideas, there is a simple truth about health in relationship to physiological development, most easily observed in bone structure. But it would take many generations to understand the deeper scientific causes: nutrition (e.g., Price’s discovery of vitamin K2, what he called Activator X), along with parasites, toxins, and epigenetics. Church and Peterson did acknowledge that this went beyond mere individual or even familial issues: “It is probable that the intemperate use of alcohol and drugs, the spreading of syphilis, and the overstimulation in many directions of modern civilization have determined an increase difficult to estimate, but nevertheless palpable, of insanity in the present century as compared with past centuries.”

Also, there is the 1902 The Journal of Nervous and Mental Disease: Volume 29, edited by William G. Spiller. There is much discussion in there about how anxiety was observed, diagnosed, and treated at the time. Some of the case studies make for a fascinating read — check out: “Report of a Case of Epilepsy Presenting as Symptoms Night Terrors, Impellant Ideas, Complicated Automatisms, with Subsequent Development of Convulsive Motor Seizures and Psychical Aberration” by W. K. Walker. This reminds me of the case that influenced Sigmund Freud and Carl Jung, Daniel Paul Schreber’s 1903 Memoirs of My Nervous Illness.

Talk about “a disintegration of the personality and character structure of Modern Man and mental-rational consciousness,” as Scott Preston put it. He goes on to say, “The individual is not a natural thing. There is an incoherency in Margaret Thatcher’s view of things when she infamously declared “there is no such thing as society” — that she saw only individuals and families, that is to say, atoms and molecules.” Her saying that really did capture the mood of the society she denied existing. Even the family was shrunk down to the ‘nuclear’. To state there is no society is to declare that there is also no extended family, no kinship, no community, that there is no larger human reality of any kind. Ironically, in this pseudo-libertarian sentiment, there is nothing holding the family together other than government laws imposing strict control of marriage and parenting, where common finances lock two individuals together under the rule of capitalist realism (the only larger realities involved are inhuman systems) — compare this to high-trust societies such as the Nordic countries, where the definition and practice of family life is less legalistic (Nordic Theory of Love and Individualism).

The individual consumer-citizen as a legal member of a family unit has to be created and then controlled, as it is a rather unstable atomized identity. “The idea of the “individual”,” Preston says, “has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” That is partly the reason for the heavy focus on the body, an attempt to make concrete the individual in order to hold together the splintered self — great analysis of this can be found in Lewis Hyde’s Trickster Makes This World: “an unalterable fact about the body is linked to a place in the social order, and in both cases, to accept the link is to be caught in a kind of trap. Before anyone can be snared in this trap, an equation must be made between the body and the world (my skin color is my place as a Hispanic; menstruation is my place as a woman)” (see one of my posts about it: Lock Without a Key). Along with increasing authoritarianism, there was increasing medicalization and rationalization — to try to make sense of what was senseless.

A specific example of a change can be found in Dr. Frederick Hollick (1818-1900), who was a popular writer and speaker on medicine and health — his “links were to the free-thinking tradition, not to Christianity” (Helen Lefkowitz Horowitz, Rewriting Sex). Under the influence of Mesmerism and animal magnetism, he studied and wrote about what, in more scientific-sounding terms, was variously called electrotherapeutics, galvanism, and electro-galvanism. Hollick was an English follower of the Scottish industrialist and socialist Robert Dale Owen, whom he literally followed to the United States, where Owen started the utopian community New Harmony, a Southern Indiana village bought from the utopian German Harmonists and then filled with brilliant and innovative minds but lacking in practical know-how about running a self-sustaining community (Abraham Lincoln, who later became a friend to the Owen family, recalled as a boy seeing the boat full of books heading to New Harmony).

“As had Owen before him, Hollick argued for the positive value of sexual feeling. Not only was it neither immoral nor injurious, it was the basis for morality and society. […] In many ways, Hollick was a sexual enthusiast” (Horowitz). These were the social circles of Abraham Lincoln, as he personally knew free-love advocates; that is why early Republicans were often referred to as “Red Republicans”, the ‘Red’ indicating radicalism as it still does to this day. Hollick wasn’t the first to be a sexual advocate nor, of course, would he be the last — preceding him were Sarah Grimke (1837, Equality of the Sexes) and Charles Knowlton (1839, The Private Companion of Young Married People), Hollick having been “a student of Knowlton’s work” (Debran Rowland, The Boundaries of Her Body); and following him were two more well-known figures, the previously mentioned Bernarr Macfadden (1868-1955), who was the first major health and fitness guru, and Wilhelm Reich (1897–1957), who was the less respectable member of the trinity formed with Sigmund Freud and Carl Jung. Sexuality became a symbolic issue of politics and health, partly because of increasing scientific knowledge but also because of the increasing marketization of products such as birth control (with public discussion of contraceptives happening in the late 1700s and advances in contraceptive production in the early 1800s), the latter being quite significant as it meant individuals could control pregnancy, which is particularly relevant to women. It should be noted that Hollick promoted the ideal of female sexual autonomy, that sex should be assented to and enjoyed by both partners.

This growing concern with sexuality began with the growing middle class in the decades following the American Revolution. Among much else, it was related to the post-revolutionary focus on parenting and the perceived need for raising republican citizens — this formed an audience far beyond radical libertinism and free love. Expert advice was needed for the new bourgeois family life, as part of the “civilizing process” that increasingly took hold at that time with not only sexual manuals but also parenting guides, health pamphlets, books of manners, cookbooks, diet books, etc. — cut off from the roots of traditional community and kinship, the modern individual no longer trusted inherited wisdom and so needed to be taught how to live, how to behave and relate. Along with the rise of science, this situation promoted the role of the public intellectual, which Hollick effectively took advantage of: after the failure of Owen’s utopian experiment, he went on the lecture circuit, which brought legal cases in unsuccessful attempts to silence him, the kind of persecution that Reich also later endured.

To put it in perspective, this Antebellum era of public debate and public education on sexuality coincided with other changes. Following the revolutionary-era feminism (e.g., Mary Wollstonecraft), the “First Wave” of organized feminists emerged generations later with the Seneca Falls Convention in 1848 and, in that movement, there was a strong abolitionist impulse. This was part of the rise of ideological -isms in the North that so concerned the Southern aristocrats who wanted to maintain their hierarchical control of the entire country, the control they were quickly losing with the shift of power in the Federal government. A few years before that, in 1844, a more effective condom was developed using vulcanized rubber, although condoms had been on the market since the previous decade; also in the 1840s, the vaginal sponge became available. Interestingly, many feminists were as against contraceptives as they were against abortions. These were far from being mere practical issues, as politics imbued every aspect, and some feminists worried about how the role of women and motherhood in society might be lessened if sexuality was divorced from pregnancy.

This was at a time when the abortion rate was sky-rocketing, indicating most women held other views. “Yet we also know that thousands of women were attending lectures in these years, lectures dealing, in part, with fertility control. And rates of abortion were escalating rapidly, especially, according to historian James Mohr, the rate for married women. Mohr estimates that in the period 1800-1830, perhaps one out of every twenty-five to thirty pregnancies was aborted. Between 1850 and 1860, he estimates, the ratio may have been one out of every five or six pregnancies. At mid-century, more than two hundred full-time abortionists reportedly worked in New York City” (Rickie Solinger, Pregnancy and Power, p. 61). In the unGodly and unChurched period of early America (“We forgot.”), organized religion was weak and “premarital sex was typical, many marriages following after pregnancy, but some people simply lived in sin. Single parents and ‘bastards’ were common” (A Vast Experiment). Early Americans, by today’s standards, were not good Christians — visiting Europeans often saw them as uncouth heathens and quite dangerous at that, as seen in the common American practice of toting around guns and knives, ever ready for a fight, whereas carrying weapons had been made illegal in England. In The Churching of America, Roger Finke and Rodney Stark write (pp. 25-26):

“Americans are burdened with more nostalgic illusions about the colonial era than about any other period in their history. Our conceptions of the time are dominated by a few powerful illustrations of Pilgrim scenes that most people over forty stared at year after year on classroom walls: the baptism of Pocahontas, the Pilgrims walking through the woods to church, and the first Thanksgiving. Had these classroom walls also been graced with colonial scenes of drunken revelry and barroom brawling, of women in risque ball-gowns, of gamblers and rakes, a better balance might have been struck. For the fact is that there never were all that many Puritans, even in New England, and non-Puritan behavior abounded. From 1761 through 1800 a third (33.7%) of all first births in New England occurred after less than nine months of marriage (D. S. Smith, 1985), despite harsh laws against fornication. Granted, some of these early births were simply premature and do not necessarily show that premarital intercourse had occurred, but offsetting this is the likelihood that not all women who engaged in premarital intercourse would have become pregnant. In any case, single women in New England during the colonial period were more likely to be sexually active than to belong to a church — in 1776 only about one out of five New Englanders had a religious affiliation. The lack of affiliation does not necessarily mean that most were irreligious (although some clearly were), but it does mean that their faith lacked public expression and organized influence.”

Though marriage remained important as an ideal in American culture, what changed was that procreative control became increasingly available — with fewer accidental pregnancies and more abortions, a powerful motivation for marriage disappeared. Unsurprisingly, at the same time, there were increasing worries about the breakdown of community and family, concerns that would turn into moral panic at various points. Antebellum America was in turmoil. This was concretely exemplified by the dropping birth rate that was already noticeable by mid-century (Timothy Crumrin, “Her Daily Concern:” Women’s Health Issues in Early 19th-Century Indiana) and had nearly halved from 1800 to 1900 (Debran Rowland, The Boundaries of Her Body). “The late 19th century and early 20th saw a huge increase in the country’s population (nearly 200 percent between 1860 and 1910) mostly due to immigration, and that population was becoming ever more urban as people moved to cities to seek their fortunes—including women, more of whom were getting college educations and jobs outside the home” (Julie Beck, ‘Americanitis’: The Disease of Living Too Fast). It was a period of crisis, not all that different from our present crisis, including the fear that the low birth rate of native-born white Americans, especially the endangered species of WASPs, meant they would be overtaken by the supposed dirty hordes of blacks, ethnics, and immigrants.

The promotion of birth control was considered a genuine threat to American society, maybe to all of Western Civilization. It was most directly a threat to traditional gender roles. Women could better control when they got pregnant, a decisive factor in the phenomenon of larger numbers of women entering college and the workforce. And with an epidemic of neurasthenia, this dilemma was worsened by the crippling effeminacy that neutered masculine potency. Was modern man, specifically the white ruling elite, up for the task of carrying on Western Civilization?

“Indeed, civilization’s demands on men’s nerve force had left their bodies positively effeminate. According to Beard, neurasthenics had the organization of “women more than men.” They possessed “a muscular system comparatively small and feeble.” Their dainty frames and feeble musculature lacked the masculine vigor and nervous reserves of even their most recent forefathers. “It is much less than a century ago, that a man who could not [drink] many bottles of wine was thought of as effeminate—but a fraction of a man.” No more. With their dwindling reserves of nerve force, civilized men were becoming increasingly susceptible to the weakest stimulants until now, “like babes, we find no safe retreat, save in chocolate and milk and water.” Sex was as debilitating as alcohol for neurasthenics. For most men, sex in moderation was a tonic. Yet civilized neurasthenics could become ill if they attempted intercourse even once every three months. As Beard put it, “there is not force enough left in them to reproduce the species or go through the process of reproducing the species.” Lacking even the force “to reproduce the species,” their manhood was clearly in jeopardy.” (Gail Bederman, Manliness and Civilization, pp. 87-88)

This led to a backlash that began before the Civil War with the early obscenity laws and abortion laws, but went into high gear with the 1873 Comstock laws that effectively shut down the free market of both ideas and products related to sexuality, including sex toys. This made it nearly impossible for most women to learn about birth control or obtain contraceptives and abortifacients. There was a felt need to restore order, and that meant the white male order of the WASP middle-to-upper classes, especially with the end of slavery, the mass immigration of ethnics, and urbanization and industrialization. The crisis wasn’t only ideological or political. The entire world had been falling apart for centuries with the ending of feudalism and the ancien regime, the last remnants of it in America being maintained through slavery. Motherhood being the backbone of civilization, it was believed that women’s sexuality had to be controlled and, unlike so much else that was out of control, it actually could be controlled through enforcement of laws.

Outlawing abortions is a particularly interesting example of social control. Even with laws in place, abortions remained commonly practiced by local doctors, even in many rural areas (American Christianity: History, Politics, & Social Issues). Corey Robin argues that the strategy hasn’t been to deny women’s agency but to assert their subordination (Denying the Agency of the Subordinate Class). This is why abortion laws were designed to target male doctors, although they rarely did, and not their female patients. Everything comes down to agency or its lack or loss, but our entire sense of agency is out of accord with our own human nature. We seek to control what is outside of us, for our own sense of self is out of control. The legalistic worldview is inherently authoritarian, at the heart of what Julian Jaynes proposes as the post-bicameral project of consciousness, the contained self. But the container is weak and keeps leaking all over the place.

To bring it back to the original inspiration, Scott Preston wrote: “Quite obviously, our picture of the human being as an indivisible unit or monad of existence was quite wrong-headed, and is not adequate for the generation and re-generation of whole human beings. Our self-portrait or self-understanding of “human nature” was deficient and serves now only to produce and reproduce human caricatures. Many of us now understand that the authentic process of individuation hasn’t much in common at all with individualism and the supremacy of the self-interest.” The failure we face is that of identity, of our way of being in the world. As with neurasthenia in the past, we are now in a crisis of anxiety and depression, along with yet another moral panic about the declining white race. So, we get the likes of Steve Bannon, Donald Trump, and Jordan Peterson. We failed to resolve past conflicts and so they keep re-emerging.

“In retrospect, the omens of an impending crisis and disintegration of the individual were rather obvious,” Preston points out. “So, what we face today as “the crisis of identity” and the cognitive dissonance of “the New Normal” is not something really new — it’s an intensification of that disintegrative process that has been underway for over four generations now. It has now become acute. This is the paradox. The idea of the “individual” has become an unsustainable metaphor and moral ideal when the actual reality is “21st century schizoid man” — a being who, far from being individual, is falling to pieces and riven with self-contradiction, duplicity, and cognitive dissonance, as reflects life in “the New Normal” of double-talk, double-think, double-standard, and double-bind.” We never were individuals. It was just a story we told ourselves, but there are others that could be told. Scott Preston offers an alternative narrative, that of individuation.

* * *

I found some potentially interesting books while skimming material on Google Books in my research on Frederick Hollick and related topics. Among the titles below, I’ll share some text from one of them because it offers a good summary of sexuality at the time, specifically women’s sexuality. Obviously, it went far beyond sexuality itself, and going by my own theorizing, I’d say it is yet another example of symbolic conflation, considering its direct relationship to abortion.

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland
p. 34

WOMEN AND THE WOMB: The Emerging Birth Control Debate

The twentieth century dawned in America on a falling white birth rate. In 1800, an average of seven children were born to each “American-born white wife,” historians report. By 1900, that number had fallen to roughly half. Though there may have been several factors, some historians suggest that this decline—occurring as it did among young white women—may have been due to the use of contraceptives or abstinence, though few talked openly about it.

“In spite of all the rhetoric against birth control, the birthrate plummeted in the late nineteenth century in America and Western Europe (as it had in France the century before); family size was halved by the time of World War I,” notes Shari Thurer in The Myth of Motherhood.

As issues go, the “plummeting birthrate” among whites was a powder keg, sparking outcry as the “failure” of the privileged class to have children was contrasted with the “failure” of poor immigrants and minorities to control the number of children they were having. Criticism was loud and rampant. “The upper classes started the trend, and by the 1880s the swarms of ragged children produced by the poor were regarded by the bourgeoisie, so Emile Zola’s novels inform us, as evidence of the lower order’s ignorance and brutality,” Thurer notes.

But the seeds of this then-still nearly invisible movement had been planted much earlier. In the late 1700s, British political theorists began disseminating information on contraceptives as concerns of overpopulation grew among some classes. Despite the separation of an ocean, by the 1820s, this information was “seeping” into the United States.

“Before the introduction of the Comstock laws, contraceptive devices were openly advertised in newspapers, tabloids, pamphlets, and health magazines,” Yalom notes. “Condoms had become increasingly popular since the 1830s, when vulcanized rubber (the invention of Charles Goodyear) began to replace the earlier sheepskin models.” Vaginal sponges also grew in popularity during the 1840s, as women traded letters and advice on contraceptives. Of course, prosecutions under the Comstock Act went a long way toward chilling public discussion.

Though Margaret Sanger’s is often the first name associated with the dissemination of information on contraceptives in the early United States, in fact, a woman named Sarah Grimke preceded her by several decades. In 1837, Grimke published the Letters on the Equality of the Sexes, a pamphlet containing advice about sex, physiology, and the prevention of pregnancy.

Two years later, Charles Knowlton published The Private Companion of Young Married People, becoming the first physician in America to do so. Near this time, Frederick Hollick, a student of Knowlton’s work, “popularized” the rhythm method and douching. And by the 1850s, a variety of material was being published providing men and women with information on the prevention of pregnancy. And the advances weren’t limited to paper.

“In 1846, a diaphragm-like article called The Wife’s Protector was patented in the United States,” according to Marilyn Yalom. “By the 1850s dozens of patents for rubber pessaries ‘inflated to hold them in place’ were listed in the U.S. Patent Office records,” Janet Farrell Brodie reports in Contraception and Abortion in 19th Century America. And, although many of these early devices were often more medical than prophylactic, by 1864 advertisements had begun to appear for “an India-rubber contrivance” similar in function and concept to the diaphragms of today.

“[B]y the 1860s and 1870s, a wide assortment of pessaries (vaginal rubber caps) could be purchased at two to six dollars each,” says Yalom. And by 1860, following publication of James Ashton’s Book of Nature, the five most popular ways of avoiding pregnancy—“withdrawal, and the rhythm methods”—had become part of the public discussion. But this early contraceptives movement in America would prove a victim of its own success. The openness and frank talk that characterized it would run afoul of the burgeoning “purity movement.”

“During the second half of the nineteenth century, American and European purity activists, determined to control other people’s sexuality, railed against male vice, prostitution, the spread of venereal disease, and the risks run by a chaste wife in the arms of a dissolute husband,” says Yalom. “They agitated against the availability of contraception under the assumption that such devices, because of their association with prostitution, would sully the home.”

Anthony Comstock, a “fanatical figure,” some historians suggest, was a charismatic “purist,” who, along with others in the movement, “acted like medieval Christians engaged in a holy war,” Yalom says. It was a successful crusade. “Comstock’s dogged efforts resulted in the 1873 law passed by Congress that barred use of the postal system for the distribution of any ‘article or thing designed or intended for the prevention of contraception or procuring of abortion’,” Yalom notes.

Comstock’s zeal would also lead to his appointment as a special agent of the United States Post Office with the authority to track and destroy “illegal” mailing, i.e., mail deemed to be “obscene” or in violation of the Comstock Act. Until his death in 1915, Comstock is said to have been energetic in his pursuit of offenders, among them Dr. Edward Bliss Foote, whose articles on contraceptive devices and methods were widely published. Foote was indicted in January of 1876 for dissemination of contraceptive information. He was tried, found guilty, and fined $3,000. Though donations of more than $300 were made to help defray costs, Foote was reportedly more cautious after the trial. That “caution” spread to others, some historians suggest.

Disorderly Conduct: Visions of Gender in Victorian America
By Carroll Smith-Rosenberg

Riotous Flesh: Women, Physiology, and the Solitary Vice in Nineteenth-Century America
by April R. Haynes

The Boundaries of Her Body: The Troubling History of Women’s Rights in America
by Debran Rowland

Rereading Sex: Battles Over Sexual Knowledge and Suppression in Nineteenth-century America
by Helen Lefkowitz Horowitz

Rewriting Sex: Sexual Knowledge in Antebellum America, A Brief History with Documents
by Helen Lefkowitz Horowitz

Imperiled Innocents: Anthony Comstock and Family Reproduction in Victorian America
by Nicola Kay Beisel

Against Obscenity: Reform and the Politics of Womanhood in America, 1873–1935
by Leigh Ann Wheeler

Purity in Print: Book Censorship in America from the Gilded Age to the Computer Age
by Paul S. Boyer

American Sexual Histories
edited by Elizabeth Reis

Wash and Be Healed: The Water-Cure Movement and Women’s Health
by Susan Cayleff

From Eve to Evolution: Darwin, Science, and Women’s Rights in Gilded Age America
by Kimberly A. Hamlin

Manliness and Civilization: A Cultural History of Gender and Race in the United States, 1880-1917
by Gail Bederman

One Nation Under Stress: The Trouble with Stress as an Idea
by Dana Becker

Conceptual Spaces

In a Nautilus piece, New Evidence for the Strange Geometry of Thought, Adithya Rajagopalan reports on the fascinating topic of conceptual or cognitive spaces. He begins with the work of the philosopher and cognitive scientist Peter Gärdenfors, who wrote about this in a 2000 book, Conceptual Spaces. Then last year, a Science paper was published by several neuroscientists: Jacob Bellmund, Christian Doeller, and Edvard Moser. It has to do with the brain’s “inner GPS.”

Anyone who has followed my blog for a while should see the interest this has for me. There is Julian Jaynes’ thought on consciousness, of course. And there are all kinds of other thinkers as well. I could throw out Iain McGilchrist and James L. Kugel who, though critical of Jaynes, make similar points about identity and the divided mind.

The work of Gärdenfors and the above neuroscientists helps explain numerous phenomena, specifically in what way splintering and dissociation operate. How a Nazi doctor could torture Jewish children at work and then go home to play with his own children. How the typical person can be pious at church on Sunday and yet act in complete contradiction to this for the rest of the week. How we can know that the world is being destroyed through climate change and still go on about our lives as if everything remains the same. How we can simultaneously know and not know so many things. Et cetera.

It might begin to give us some more details in explaining the differences between the bicameral mind and Jaynesian consciousness, between Ernest Hartmann’s thin and thick boundaries of the mind, and much else. Also, in light of Lynne Kelly’s work on traditional mnemonic systems, we might be in a better position to understand the phenomenal memory feats humans are capable of, why they are so often spatial in organization (e.g., the Songlines of Australian Aborigines), and why they often involve shifts in mental states. It might also clarify how people can temporarily or permanently change personalities and identities, how people can compartmentalize parts of themselves such as their childhood selves, and maybe help explain why others fail at compartmentalizing.

The potential significance is immense. Our minds are mansions with many rooms. Below is the meat of Rajagopalan’s article.

* * *

“Cognitive spaces are a way of thinking about how our brain might organize our knowledge of the world,” Bellmund said. It’s an approach that concerns not only geographical data, but also relationships between objects and experience. “We were intrigued by evidence from many different groups that suggested that the principles of spatial coding in the hippocampus seem to be relevant beyond the realms of just spatial navigation,” Bellmund said. The hippocampus’ place and grid cells, in other words, map not only physical space but conceptual space. It appears that our representation of objects and concepts is very tightly linked with our representation of space.

Work spanning decades has found that regions in the brain—the hippocampus and entorhinal cortex—act like a GPS. Their cells form a grid-like representation of the brain’s surroundings and keep track of its location on it. Specifically, neurons in the entorhinal cortex activate at evenly distributed locations in space: If you drew lines between each location in the environment where these cells activate, you would end up sketching a triangular grid, or a hexagonal lattice. The activity of these aptly named “grid” cells contains information that another kind of cell uses to locate your body in a particular place. The explanation of how these “place” cells work was stunning enough to award scientists John O’Keefe, May-Britt Moser, and Edvard Moser, the 2014 Nobel Prize in Physiology or Medicine. These cells activate only when you are in one particular location in space, or the grid, represented by your grid cells. Meanwhile, head-direction cells define which direction your head is pointing. Yet other cells indicate when you’re at the border of your environment—a wall or cliff. Rodent models have elucidated the nature of the brain’s spatial grids, but, with functional magnetic resonance imaging, they have also been validated in humans.

Recent fMRI studies show that cognitive spaces reside in the hippocampal network—supporting the idea that these spaces lie at the heart of much subconscious processing. For example, subjects of a 2016 study—headed by neuroscientists at Oxford—were shown a video of a bird’s neck and legs morph in size. Previously they had learned to associate a particular bird shape with a Christmas symbol, such as Santa or a Gingerbread man. The researchers discovered the subjects made the connections with a “mental picture” that could not be described spatially, on a two-dimensional map. Yet grid-cell responses in the fMRI data resembled what one would see if subjects were imagining themselves walking in a physical environment. This kind of mental processing might also apply to how we think about our family and friends. We might picture them “on the basis of their height, humor, or income, coding them as tall or short, humorous or humorless, or more or less wealthy,” Doeller said. And, depending on whichever of these dimensions matters in the moment, the brain would store one friend mentally closer to, or farther from, another friend.

But the usefulness of a cognitive space isn’t just restricted to already familiar object comparisons. “One of the ways these cognitive spaces can benefit our behavior is when we encounter something we have never seen before,” Bellmund said. “Based on the features of the new object we can position it in our cognitive space. We can then use our old knowledge to infer how to behave in this novel situation.” Representing knowledge in this structured way allows us to make sense of how we should behave in new circumstances.

Data also suggests that this region may represent information with different levels of abstraction. If you imagine moving through the hippocampus, from the top of the head toward the chin, you will find many different groups of place cells that completely map the entire environment but with different degrees of magnification. Put another way, moving through the hippocampus is like zooming in and out on your phone’s map app. The area in space represented by a single place cell gets larger. Such size differences could be the basis for how humans are able to move between lower and higher levels of abstraction—from “dog” to “pet” to “sentient being,” for example. In this cognitive space, more zoomed-out place cells would represent a relatively broad category consisting of many types, while zoomed-in place cells would be more narrow.

Yet the mind is not just capable of conceptual abstraction but also flexibility—it can represent a wide range of concepts. To be able to do this, the regions of the brain involved need to be able to switch between concepts without any informational cross-contamination: It wouldn’t be ideal if our concept for bird, for example, were affected by our concept for car. Rodent studies have shown that when animals move from one environment to another—from a blue-walled cage to a black-walled experiment room, for example—place-cell firing is unrelated between the environments. Researchers looked at where cells were active in one environment and compared it to where they were active in the other. If a cell fired in the corner of the blue cage as well as the black room, there might be some cross-contamination between environments. The researchers didn’t see any such correlation in the place-cell activity. It appears that the hippocampus is able to represent two environments without confounding the two. This property of place cells could be useful for constructing cognitive spaces, where avoiding cross-contamination would be essential. “By connecting all these previous discoveries,” Bellmund said, “we came to the assumption that the brain stores a mental map, regardless of whether we are thinking about a real space or the space between dimensions of our thoughts.”
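
To make the idea a bit more concrete, here is a minimal toy sketch in Python of what distance in a cognitive space might look like if concepts are naively treated as points along a few feature dimensions, as in Doeller’s example of friends coded by height, humor, or income. The names, numbers, and weighting scheme below are invented for illustration; they come from neither the paper nor the article.

```python
from math import sqrt

# Hypothetical people coded along three dimensions: height (cm),
# humor (0-10 rating), and income (thousands per year). All values are invented.
friends = {
    "Avery": (185, 8, 40),
    "Blake": (160, 7, 45),
    "Casey": (182, 2, 90),
}

def distance(a, b, weights):
    """Weighted Euclidean distance between two points in the feature space."""
    return sqrt(sum(w * (x - y) ** 2 for x, y, w in zip(a, b, weights)))

def nearest_to(name, weights):
    """Find which other person sits closest to `name` under the given weights."""
    target = friends[name]
    others = {k: v for k, v in friends.items() if k != name}
    return min(others, key=lambda k: distance(target, others[k], weights))

# Which friend is "nearest" depends on which dimension matters in the moment.
print(nearest_to("Avery", weights=(1, 0, 0)))  # height emphasized -> Casey
print(nearest_to("Avery", weights=(0, 0, 1)))  # income emphasized -> Blake
```

Shifting the weights is a crude stand-in for what Doeller describes as whichever dimension matters in the moment: the same people end up nearer to or farther from one another depending on which features are currently emphasized.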

The Agricultural Mind

Let me make an argument about individualism, rigid egoic boundaries, and hence Jaynesian consciousness. But I'll come at it from a less typical angle. I've been reading much about diet, nutrition, and health. There are significant links between what we eat and so much else: gut health, hormonal regulation, immune system, and neurocognitive functioning. There are multiple pathways, one of which is direct, connecting the gut and the brain. The gut is sometimes called the second brain, but in evolutionary terms it is the first brain. To demonstrate one example of a connection, many are beginning to refer to Alzheimer's as type 3 diabetes, and dietary interventions have reversed symptoms in clinical studies. Also, microbes and parasites have been shown to influence our neurocognition and psychology, even altering personality traits and behavior (e.g., Toxoplasma gondii).

One possibility to consider is the role of exorphins that are addictive and can be blocked in the same way as opioids. Exorphin, in fact, means external morphine-like substance, in the way that endorphin means indwelling morphine-like substance. Exorphins are found in milk and wheat. Milk, in particular, stands out. Even though exorphins are found in other foods, it's been argued that they are insignificant because they theoretically can't pass through the gut barrier, much less the blood-brain barrier. Yet exorphins have been measured elsewhere in the human body. One explanation is gut permeability, which can be caused by many factors such as stress but also by milk itself. The purpose of milk is to get nutrients into the calf, and this is done by widening the spaces in the gut lining to allow more nutrients through the protective barrier. Exorphins get in as well and create a pleasurable experience to motivate the calf to drink more. Along with exorphins, grains and dairy also contain dopaminergic peptides, and dopamine is the other major addictive substance. It feels good to consume dairy, as with wheat, whether you're a calf or a human, and so one wants more.

Addiction, of food or drugs or anything else, is a powerful force. And it is complex in what it affects, not only physiologically and psychologically but also on a social level. Johann Hari offers a great analysis in Chasing the Scream. He makes the case that addiction is largely about isolation and that the addict is the ultimate individual. It stands out to me that addiction and addictive substances have increased over the course of civilization. The growing of poppies, sugar crops, and the like came later in civilization, as did the production of beer and wine (by the way, alcohol releases endorphins, sugar causes a serotonin high, and both activate the hedonic pathway). Also, grain and dairy were slow to catch on as a large part of the diet. Until recent centuries, most populations remained dependent on animal foods, including wild game. Americans, for example, ate large amounts of meat, butter, and lard from the colonial era through the 19th century. In 1900, Americans were on average getting only around 10% of their diet as carbohydrates, and sugar intake was minimal.

Another factor to consider is that low-carb diets can alter how the body and brain function. That is even more true if combined with intermittent fasting and the restricted eating times that would have been more common in the past. Taken together, earlier humans would have spent more time in ketosis (fat-burning mode, as opposed to glucose-burning), which dramatically affects human biology. The further one goes back in history, the more time people probably spent in ketosis. One difference with ketosis is that cravings and food addictions disappear. It's a non-addictive or maybe even anti-addictive state of mind. Many hunter-gatherer tribes can go days without eating and it doesn't appear to bother them, and that is typical of ketosis. This was also observed of Mongol warriors who could ride and fight for days on end without tiring or needing to stop for food. What is also different about hunter-gatherers and similar traditional societies is how communal they are or were and how much more expansive their identities are in belonging to a group. Anthropological research shows how hunter-gatherers often have a sense of personal space that extends into the environment around them. What if that isn't merely cultural but something to do with how their bodies and brains operate? Maybe diet even plays a role. Hold that thought for a moment.

Now go back to the two staples of the modern diet, grains and dairy. Besides exorphins and dopaminergic substances, they also have high levels of glutamate, as part of gluten and casein respectively. Dr. Katherine Reid is a biochemist whose daughter was diagnosed with severe autism. She went into research mode and experimented with supplementation and then diet. Many things seemed to help, but the greatest result came from restriction of glutamate, a difficult challenge as it is a common food additive. This requires going on a largely whole-foods diet, that is to say eliminating processed foods. But when dealing with a serious issue, it is worth the effort. Dr. Reid's daughter showed such immense improvement that she was kicked out of the special-needs school. After being on this diet for a while, she socialized and communicated normally like any other child, something she was previously incapable of. Keep in mind that glutamate is necessary as a foundational neurotransmitter in modulating communication between the gut and brain. But typically we only get small amounts of it, as opposed to the large doses found in the modern diet.

Glutamate is also implicated in schizophrenia: "The most intriguing evidence came when the researchers gave germ-free mice fecal transplants from the schizophrenic patients. They found that 'the mice behaved in a way that is reminiscent of the behavior of people with schizophrenia,' said Julio Licinio, who co-led the new work with Wong, his research partner and spouse. Mice given fecal transplants from healthy controls behaved normally. 'The brains of the animals given microbes from patients with schizophrenia also showed changes in glutamate, a neurotransmitter that is thought to be dysregulated in schizophrenia,' he added. The discovery shows how altering the gut can influence an animal's behavior" (Roni Dengler, Researchers Find Further Evidence That Schizophrenia is Connected to Our Guts; reporting on Peng Zheng et al, The gut microbiome from patients with schizophrenia modulates the glutamate-glutamine-GABA cycle and schizophrenia-relevant behaviors in mice, Science Advances). And glutamate is involved in other conditions as well, such as in relation to GABA: "But how do microbes in the gut affect [epileptic] seizures that occur in the brain? Researchers found that the microbe-mediated effects of the Ketogenic Diet decreased levels of enzymes required to produce the excitatory neurotransmitter glutamate. In turn, this increased the relative abundance of the inhibitory neurotransmitter GABA. Taken together, these results show that the microbe-mediated effects of the Ketogenic Diet have a direct effect on neural activity, further strengthening support for the emerging concept of the 'gut-brain' axis." (Jason Bush, Important Ketogenic Diet Benefit is Dependent on the Gut Microbiome). Glutamate is one neurotransmitter among many that can be affected in a similar manner; e.g., serotonin is also produced in the gut.

That reminds me of propionate, a short-chain fatty acid. It is another substance normally taken in at a low level. Certain foods, including grains and dairy, contain it. The problem is that, as a useful preservative, it has been generously added to the food supply. Research on rodents shows injecting them with propionate causes autistic-like behaviors. And other rodent studies show how this stunts learning ability and causes repetitive behavior (both related to the autistic demand for the familiar), as too much propionate entrenches mental patterns through the mechanism that gut microbes use to communicate to the brain how to return to a needed food source. Autistics, along with cravings for propionate-containing foods, tend to have larger populations of a particular gut microbe that produces propionate. This might be why antibiotics, which kill microbes, can help with autism. But in the case of depression, it is associated instead with the lack of certain microbes that produce butyrate, another important substance that also is found in certain foods (Mireia Valles-Colomer et al, The neuroactive potential of the human gut microbiota in quality of life and depression). Depending on the specific gut dysbiosis, diverse neurocognitive conditions can result. And by affecting the microbiome, a ketogenic diet can produce changes in autism: it reduces the microbiome (similar to an antibiotic), which presumably takes care of the problematic microbes and readjusts the gut from dysbiosis to a healthier balance.

As with propionate, exorphins injected into rats will likewise elicit autistic-like behaviors. By two different pathways, the body derives exorphins and propionate from the consumption of grains and dairy: the former from the breakdown of proteins, the latter produced by gut bacteria in the breakdown of some grains and refined carbohydrates (on top of the propionate used as a food additive, which is added to other foods as well; also, at least in rodents, artificial sweeteners increase propionate levels). This is part of the explanation for why many autistics have responded well to low-carb ketosis, specifically paleo diets that restrict both wheat and dairy. But ketones themselves play a role: they use the same transporters as propionate and so block its buildup in cells, and, of course, ketones offer a different energy source for cells as a replacement for glucose, which alters how cells function, specifically neurocognitive functioning and its attendant psychological effects.

What stands out to me about autism is how isolating it is. The repetitive behavior and focus on objects resonates with extreme addiction. Both conditions block normal human relating and create an obsessive mindset that, in the most extreme forms, blocks out all else. I wonder if all of us moderns are simply expressing milder varieties of this biological and neurological phenomenon. And this might be the underpinning of our hyper-individualistic society, with the earliest precursors showing up in the Axial Age following what Julian Jaynes hypothesized as the breakdown of the much more other-oriented bicameral mind. What if our egoic individuality is the result of our food system, as part of the civilizational project of mass agriculture?

* * *

Mongolian Diet and Fasting

For anyone who is curious to learn more, the original point of interest for me was a quote by Jack Weatherford in his book Genghis Khan and the Making of the Modern World: "The Chinese noted with surprise and disgust the ability of the Mongol warriors to survive on little food and water for long periods; according to one, the entire army could camp without a single puff of smoke since they needed no fires to cook. Compared to the Jurched soldiers, the Mongols were much healthier and stronger. The Mongols consumed a steady diet of meat, milk, yogurt, and other dairy products, and they fought men who lived on gruel made from various grains. The grain diet of the peasant warriors stunted their bones, rotted their teeth, and left them weak and prone to disease. In contrast, the poorest Mongol soldier ate mostly protein, thereby giving him strong teeth and bones. Unlike the Jurched soldiers, who were dependent on a heavy carbohydrate diet, the Mongols could more easily go a day or two without food." By the way, that biography was written by an anthropologist who lived among and studied the Mongols for years. It is about the historical Mongols, but filtered through the direct experience of still-existing Mongol people who have maintained a traditional diet and lifestyle longer than most other populations. Their diet was ketogenic not only because it was low-carb but also because it involved fasting.

In Mongolia, the Tangut Country, and the Solitudes of Northern Tibet, Volume 1 (1876), Nikolaĭ Mikhaĭlovich Przhevalʹskiĭ writes in the second note on p. 65 under the section Calendar and Year-Cycle: "On the New Year's Day, or White Feast of the Mongols, see 'Marco Polo', 2nd ed. i. p. 376-378, and ii. p. 543. The monthly festival days, properly for the Lamas days of fasting and worship, seem to differ locally. See note in same work, i. p. 224, and on the Year-cycle, i. p. 435." This is alluded to in another text, in describing that such things as fasting were the norm of that time: "It is well known that both medieval European and traditional Mongolian cultures emphasized the importance of eating and drinking. In premodern societies these activities played a much more significant role in social intercourse as well as in religious rituals (e.g., in sacrificing and fasting) than nowadays" (Antti Ruotsala, Europeans and Mongols in the middle of the thirteenth century, 2001). A science journalist trained in biology, Dyna Rochmyaningsih, also mentions this: "As a spiritual practice, fasting has been employed by many religious groups since ancient times. Historically, ancient Egyptians, Greeks, Babylonians, and Mongolians believed that fasting was a healthy ritual that could detoxify the body and purify the mind" (Fasting and the Human Mind).

Mongol shamans and priests fasted, no different than in so many other religions, but so did other Mongols — more from Przhevalʹskiĭ's 1876 account showing the standard feast and fast cycle of many traditional ketogenic diets: "The gluttony of this people exceeds all description. A Mongol will eat more than ten pounds of meat at one sitting, but some have been known to devour an average-sized sheep in twenty-four hours! On a journey, when provisions are economized, a leg of mutton is the ordinary daily ration for one man, and although he can live for days without food, yet, when once he gets it, he will eat enough for seven" (see more quoted material in Diet of Mongolia). Fasting was also noted of earlier Mongols, such as Genghis Khan: "In the spring of 1211, Jenghis Khan summoned his fighting forces […] For three days he fasted, neither eating nor drinking, but holding converse with the gods. On the fourth day the Khakan emerged from his tent and announced to the exultant multitude that Heaven had bestowed on him the boon of victory" (Michael Prawdin, The Mongol Empire, 1967). Even before he became Khan, this was his practice, as was common among the Mongols, such that it became a communal ritual for the warriors:

"When he was still known as Temujin, without tribe and seeking to retake his kidnapped wife, Genghis Khan went to Burkhan Khaldun to pray. He stripped off his weapons, belt, and hat – the symbols of a man's power and stature – and bowed to the sun, sky, and mountain, first offering thanks for their constancy and for the people and circumstances that sustained his life. Then, he prayed and fasted, contemplating his situation and formulating a strategy. It was only after days in prayer that he descended from the mountain with a clear purpose and plan that would result in his first victory in battle. When he was elected Khan of Khans, he again retreated into the mountains to seek blessing and guidance. Before every campaign against neighboring tribes and kingdoms, he would spend days in Burkhan Khaldun, fasting and praying. By then, the people of his tribe had joined in on his ritual at the foot of the mountain, awaiting his return" (Dr. Hyun Jin Preston Moon, Genghis Khan and His Personal Standard of Leadership).

As an interesting side note, the Mongol population has been studied to some extent in one area of relevance. In Down's Anomaly (1976), Smith et al. write that, "The initial decrease in the fasting blood sugar was greater than that usually considered normal and the return to fasting blood sugar level was slow. The results suggested increased sensitivity to insulin. Benda reported the initial drop in fasting blood sugar to be normal but the absolute blood sugar level after 2 hours was lower for mongols than for controls." That is probably the result of a traditional low-carb diet that had been maintained continuously since before history. For some further context, I noticed some discussion about the Mongolian keto diet (Reddit, r/keto, TIL that Ghenghis Khan and his Mongol Army ate a mostly keto based diet, consisting of lots of milk and cheese. The Mongols were specially adapted genetically to digest the lactase in milk and this made them easier to feed.) that was inspired by the scientific documentary "The Evolution of Us" (presently available on Netflix and elsewhere).

* * *

3/30/19 – An additional comment: I briefly mentioned sugar, that it causes a serotonin high and activates the hedonic pathway. I also noted that it was late in civilization when sources of sugar were cultivated and, I could add, even later when sugar became cheap enough to be common. Even into the 1800s, sugar was minimal and still often considered more as medicine than food.

To extend this thought, it isn't only sugar in general but specific forms of it. Fructose, in particular, has become widespread because the United States government subsidizes corn agriculture, which has created a greater corn yield than humans can directly consume. So what doesn't get fed to animals or turned into ethanol is mostly made into high-fructose corn syrup and then added to almost every processed food and beverage imaginable.

Fructose is not like other sugars. This was important for early hominid survival and so shaped human evolution. It might have played a role in fasting and feasting. In 100 Million Years of Food, Stephen Le writes that, “Many hypotheses regarding the function of uric acid have been proposed. One suggestion is that uric acid helped our primate ancestors store fat, particularly after eating fruit. It’s true that consumption of fructose induces production of uric acid, and uric acid accentuates the fat-accumulating effects of fructose. Our ancestors, when they stumbled on fruiting trees, could gorge until their fat stores were pleasantly plump and then survive for a few weeks until the next bounty of fruit was available” (p. 42).

That makes sense to me, but he goes on to argue against this possible explanation. "The problem with this theory is that it does not explain why only primates have this peculiar trait of triggering fat storage via uric acid. After all, bears, squirrels, and other mammals store fat without using uric acid as a trigger." This is where Le's knowledge is lacking, for he never discusses ketosis, which has been centrally important for humans in a way it hasn't been for other animals. If uric acid increases fat production, that would be helpful for fattening up for the next starvation period when the body returned to ketosis. So, it would be a regular switching back and forth between the formation of uric acid that stores fat and the formation of ketones that burns fat.

That is fine and dandy under natural conditions. Excess fructose, however, is a whole other matter. It has been strongly associated with metabolic syndrome. One pathway of causation is the increased production of uric acid. This can lead to gout, but to other things as well. It's a mixed bag. "While it's true that higher levels of uric acid have been found to protect against brain damage from Alzheimer's, Parkinson's, and multiple sclerosis, high uric acid unfortunately increases the risk of brain stroke and poor brain function" (p. 43).

The potential side effects of uric acid overdose are related to other problems I’ve discussed in relation to the agricultural mind. “A recent study also observed that high uric acid levels are associated with greater excitement-seeking and impulsivity, which the researchers noted may be linked to attention deficit hyperactivity disorder (ADHD)” (p. 43). The problems of sugar go far beyond mere physical disease. It’s one more factor in the drastic transformation of the human mind.

* * *

4/2/19 – More info: There are certain animal fats, the omega-3 fatty acids EPA and DHA, that are essential to human health. These were abundant in the hunter-gatherer diet. But over the history of agriculture, they have become less common.

This deficiency is associated with psychiatric disorders and general neurocognitive problems, including those already mentioned above in the post. Agriculture and industrialization have replaced these healthy fats with overly processed oils that are high in linoleic acid (LA), an omega-6 fatty acid. LA interferes with the body's use of omega-3 fatty acids.

The Brain Needs Animal Fat
by Georgia Ede

Western Individuality Before the Enlightenment Age

The Culture Wars of the Late Renaissance: Skeptics, Libertines, and Opera
by Edward Muir
Introduction
pp. 5-7

One of the most disturbing sources of late-Renaissance anxiety was the collapse of the traditional hierarchic notion of the human self. Ancient and medieval thought depicted reason as governing the lower faculties of the will, the passions, and the body. Renaissance thought did not so much promote "individualism" as it cut away the intellectual props that presented humanity as the embodiment of a single divine idea, thereby forcing a desperate search for identity in many. John Martin has argued that during the Renaissance, individuals formed their sense of selfhood through a difficult negotiation between inner promptings and outer social roles. Individuals during the Renaissance looked both inward for emotional sustenance and outward for social assurance, and the friction between the inner and outer selves could sharpen anxieties. The fragmentation of the self seems to have been especially acute in Venice, where the collapse of aristocratic marriage structures led to the formation of what Virginia Cox has called the single self, most clearly manifest in the works of several women writers who argued for the moral and intellectual equality of women with men. As a consequence of the fragmented understanding of the self, such thinkers as Montaigne became obsessed with what was then the new concept of human psychology, a term in fact coined in this period. A crucial problem in the new psychology was to define the relation between the body and the soul, in particular to determine whether the soul died with the body or was immortal. With its tradition of Averroist readings of Aristotle, some members of the philosophy faculty at the University of Padua recurrently questioned the Christian doctrine of the immortality of the soul as unsound philosophically. Other hierarchies of the human self came into question. Once reason was dethroned, the passions were given a higher value, so that the heart could be understood as a greater force than the mind in determining human conduct. When the body itself slipped out of its long-despised position, the sexual drives of the lower body were liberated and thinkers were allowed to consider sex, independent of its role in reproduction, a worthy manifestation of nature. The Paduan philosopher Cesare Cremonini's personal motto, "Intus ut libet, foris ut moris est," does not quite translate to "If it feels good, do it," but it comes very close. The collapse of the hierarchies of human psychology even altered the understanding of the human senses. The sense of sight lost its primacy as the superior faculty, the source of "enlightenment"; the Venetian theorists of opera gave that place in the hierarchy to the sense of hearing, the faculty that most directly channeled sensory impressions to the heart and passions.

Historical and Philosophical Issues in the Conservation of Cultural Heritage
edited by Nicholas Price, M. Kirby Talley, and Alessandra Melucco Vaccaro
Reading 5: “The History of Art as a Humanistic Discipline”
by Erwin Panofsky
pp. 83-85

Nine days before his death Immanuel Kant was visited by his physician. Old, ill and nearly blind, he rose from his chair and stood trembling with weakness and muttering unintelligible words. Finally his faithful companion realized that he would not sit down again until the visitor had taken a seat. This he did, and Kant then permitted himself to be helped to his chair and, after having regained some of his strength, said, 'Das Gefühl für Humanität hat mich noch nicht verlassen'—'The sense of humanity has not yet left me'. The two men were moved almost to tears. For, though the word Humanität had come, in the eighteenth century, to mean little more than politeness and civility, it had, for Kant, a much deeper significance, which the circumstances of the moment served to emphasize: man's proud and tragic consciousness of self-approved and self-imposed principles, contrasting with his utter subjection to illness, decay and all that is implied in the word 'mortality.'

Historically the word humanitas has had two clearly distinguishable meanings, the first arising from a contrast between man and what is less than man; the second between man and what is more. In the first case humanitas means a value, in the second a limitation.

The concept of humanitas as a value was formulated in the circle around the younger Scipio, with Cicero as its belated, yet most explicit spokesman. It meant the quality which distinguishes man, not only from animals, but also, and even more so, from him who belongs to the species homo without deserving the name of homo humanus; from the barbarian or vulgarian who lacks pietas and παιδεία, that is, respect for moral values and that gracious blend of learning and urbanity which we can only circumscribe by the discredited word "culture."

In the Middle Ages this concept was displaced by the consideration of humanity as being opposed to divinity rather than to animality or barbarism. The qualities commonly associated with it were therefore those of frailty and transience: humanitas fragilis, humanitas caduca.

Thus the Renaissance conception of humanitas had a two-fold aspect from the outset. The new interest in the human being was based both on a revival of the classical antithesis between humanitas and barbaritas, or feritas, and on a survival of the mediaeval antithesis between humanitas and divinitas. When Marsilio Ficino defines man as a "rational soul participating in the intellect of God, but operating in a body," he defines him as the one being that is both autonomous and finite. And Pico's famous 'speech' 'On the Dignity of Man' is anything but a document of paganism. Pico says that God placed man in the center of the universe so that he might be conscious of where he stands, and therefore free to decide 'where to turn.' He does not say that man is the center of the universe, not even in the sense commonly attributed to the classical phrase, "man the measure of all things."

It is from this ambivalent conception of humanitas that humanism was born. It is not so much a movement as an attitude which can be defined as the conviction of the dignity of man, based on both the insistence on human values (rationality and freedom) and the acceptance of human limitations (fallibility and frailty); from this, two postulates result: responsibility and tolerance.

Small wonder that this attitude has been attacked from two opposite camps whose common aversion to the ideas of responsibility and tolerance has recently aligned them in a united front. Entrenched in one of these camps are those who deny human values: the determinists, whether they believe in divine, physical or social predestination, the authoritarians, and those “insectolatrists” who profess the all-importance of the hive, whether the hive be called group, class, nation or race. In the other camp are those who deny human limitations in favor of some sort of intellectual or political libertinism, such as aestheticists, vitalists, intuitionists and hero-worshipers. From the point of view of determinism, the humanist is either a lost soul or an ideologist. From the point of view of authoritarianism, he is either a heretic or a revolutionary (or a counterrevolutionary). From the point of view of “insectolatry,” he is a useless individualist. And from the point of view of libertinism he is a timid bourgeois.

Erasmus of Rotterdam, the humanist par excellence, is a typical case in point. The church suspected and ultimately rejected the writings of this man who had said: "Perhaps the spirit of Christ is more largely diffused than we think, and there are many in the community of saints who are not in our calendar." The adventurer Ulrich von Hutten despised his ironical skepticism and his unheroic love of tranquillity. And Luther, who insisted that "no man has power to think anything good or evil, but everything occurs in him by absolute necessity," was incensed by a belief which manifested itself in the famous phrase: "What is the use of man as a totality [that is, of man endowed with both a body and a soul], if God would work in him as a sculptor works in clay, and might just as well work in stone?"

Food and Faith in Christian Culture
edited by Ken Albala and Trudy Eden
Chapter 3: “The Food Police”
Sumptuary Prohibitions On Food In The Reformation
by Johanna B. Moyer
pp. 80-83

Protestants too employed a disease model to explain the dangers of luxury consumption. Luxury damaged the body politic leading to “most incurable sickness of the universal body” (33). Protestant authors also employed Galenic humor theory, arguing that “continuous superfluous expense” unbalanced the humors leading to fever and illness (191). However, Protestants used this model less often than Catholic authors who attacked luxury. Moreover, those Protestants who did employ the Galenic model used it in a different manner than their Catholic counterparts.

Protestants also drew parallels between the damage caused by luxury to the human body and the damage excess inflicted on the French nation. Rather than a disease metaphor, however, many Protestant authors saw luxury more as a “wound” to the body politic. For Protestants the danger of luxury was not only the buildup of humors within the body politic of France but the constant “bleeding out” of humor from the body politic in the form of cash to pay for imported luxuries. The flow of cash mimicked the flow of blood from a wound in the body. Most Protestants did not see luxury foodstuffs as the problem, indeed most saw food in moderation as healthy for the body. Even luxury apparel could be healthy for the body politic in moderation, if it was domestically produced and consumed. Such luxuries circulated the “blood” of the body politic creating employment and feeding the lower orders. 72 De La Noue made this distinction clear. He dismissed the need to individually discuss the damage done by each kind of luxury that was rampant in France in his time as being as pointless “as those who have invented auricular confession have divided mortal and venal sins into infinity of roots and branches.” Rather, he argued, the damage done by luxury was in its “entire bulk” to the patrimonies of those who purchased luxuries and to the kingdom of France (116). For the Protestants, luxury did not pose an internal threat to the body and salvation of the individual. Rather, the use of luxury posed an external threat to the group, to the body politic of France.

The Reformation And Sumptuary Legislation

Catholics, as we have seen, called for antiluxury regulations on food and banqueting, hoping to curb overeating and the damage done by gluttony to the body politic. Although some Protestants also wanted to restrict food and banqueting, more often French Protestants called for restrictions on clothing and foreign luxuries. These differing views of luxury during and after the French Wars of Religion not only give insight into the theological differences between these two branches of Christianity but also provide insight into the larger pattern of the sumptuary regulation of food in Europe in this period. Sumptuary restrictions were one means by which Catholics and Protestants enforced their theology in the post-Reformation era.

Although Catholicism is often correctly cast as the branch of Reformation Christianity that gave the individual the least control over their salvation, it was also true that the individual Catholic’s path to salvation depended heavily on ascetic practices. The responsibility for following these practices fell on the individual believer. Sumptuary laws on food in Catholic areas reinforced this responsibility by emphasizing what foods should and should not be eaten and mirrored the central theological practice of fasting for the atonement of sin. Perhaps the historiographical cliché that it was only Protestantism which gave the individual believer control of his or her salvation needs to be qualified. The arithmetical piety of Catholicism ultimately placed the onus on the individual to atone for each sin. Moreover, sumptuary legislation tried to steer the Catholic believer away from the more serious sins that were associated with overeating, including gluttony, lust, anger, and pride.

Catholic theology meshed nicely with the revival of Galenism that swept through Europe in this period. Galenists preached that meat eating, overeating, and the imbalance in humors which accompanied these practices, led to behavioral changes, including an increased sex drive and increased aggression. These physical problems mirrored the spiritual problems that luxury caused, including fornication and violence. This is why so many authors blamed the French nobility for the luxury problem in France. Nobles were seen not only as more likely to bear the expense of overeating but also as more prone to violence. 73

Galenism also meshed nicely with Catholicism because it was a very physical religion in which the control of the physical body figured prominently in the believer’s path to salvation. Not surprisingly, by the seventeenth century, Protestants gravitated away from Galenism toward the chemical view of the body offered by Paracelsus. 74 Catholic sumptuary law embodied a Galenic view of the body where sin and disease were equated and therefore pushed regulations that advocated each person’s control of his or her own body.

Protestant legislators, conversely, were not interested in the individual diner. Sumptuary legislation in Protestant areas ran the gamut from control of communal displays of eating, in places like Switzerland and Germany, to little or no concern with restrictions on luxury foods, as in England. For Protestants, it was the communal role of food and luxury use that was important. Hence the laws in Protestant areas targeted food in the context of weddings, baptisms, and even funerals. The English did not even bother to enact sumptuary restrictions on food after their break with Catholicism. The French Protestants who wrote on luxury glossed over the deleterious effects of meat eating, even proclaiming it to be healthful for the body while producing diatribes against the evils of imported luxury apparel. The use of Galenism in the French Reformed treatises suggests that Protestants too were concerned with a “body,” but it was not the individual body of the believer that worried Protestant legislators. Sumptuary restrictions were designed to safeguard the mystical body of believers, or the “Elect” in the language of Calvinism. French Protestants used the Galenic model of the body to discuss the damage that luxury did to the body of believers in France, but ultimately to safeguard the economic welfare of all French subjects. The Calvinists of Switzerland used sumptuary legislation on food to protect those predestined for salvation from the dangerous eating practices of members of the community whose overeating suggested they might not be saved.

Ultimately, sumptuary regulations in the Reformation spoke to the Christian practice of fasting. Fasting served very different functions in Protestant and Catholic theology. Raymond Mentzer has suggested that Protestants "modified" the Catholic practice of fasting during the Reformation. The major reformers, including Luther, Calvin, and Zwingli, all rejected fasting as a path to salvation. 75 For Protestants, fasting was a "liturgical rite," part of the cycle of worship and a practice that served to "bind the community." Fasting was often a response to adversity, as during the French Wars of Religion. For Catholics, fasting was an individual act, just as sumptuary legislation in Catholic areas targeted individual diners. However, for Protestants, fasting was a communal act, "calling attention to the body of believers." 76 The symbolic nature of fasting, Mentzer argues, reflected Protestant rejection of transubstantiation. Catholics continued to believe that God was physically present in the host, but Protestants believed His was only a spiritual presence. When Catholics took Communion, they fasted to cleanse their own bodies so as to receive the real, physical body of Christ. Protestants, on the other hand, fasted as spiritual preparation because it was their spirits that connected with the spirit of Christ in the Eucharist. 77

The Embodied Spider

There is more to embodied cognition than that neurocognition happens within and inseparably from the body. We are bodies. And our bodies are of the world; one might say they are the world, the only world we can comprehend (com- 'together' + prehendere 'grasp'). That is simple enough. But what kind of embodied beings are we, with what kind of embodied experience?

How we exist within our bodies… how we hold our physical form… how we position ourselves in relation to the world… how we inhabit our extended selves… All of this and more determines our way of being, what we perceive, think, and do, what we can imagine. It is through our bodies that we manage our lived reality. And it is through our bodies that we are managed by the forces and patterns of society and environment, the affordances of structured existence forming our habitus and worldview. Maybe epigenetically carried across generations and centuries.

We are spiders in webs of our own making but webs we don’t so much see as through which we perceive, as if strands connecting us to the world to such an extent that it is unclear who is the puppet and who the puppetmaster. Social constructivism points toward a greater truth of webbed realism, of what we sense and know in our entanglement. As we are embodied, so we are embedded. Our identities extend into the world, which means the other extends back into us. One part shifts and the rest follows.

* * *

The World Shifts When a Black Widow Squats
by Ed Yong

“The widow’s abilities are part of a concept called “embodied cognition,” which argues that a creature’s ability to sense and think involves its entire body, not just its brain and sense organs. Octopus arms, for example, can grab and manipulate food without ever calling on the central brain. Female crickets can start turning toward the sound of a male using only the ears and neurons in their legs, well before their central nervous system even has a chance to process the noise. In the case of the black widow, the information provided by the sense organs in the legs depends on the position of the entire animal.

“Earlier, I described this as a postural squint. That’s close, but the analogy isn’t quite right, since squinting helps us focus on particular parts of space. Here, the spider is focusing on different parts of information space. It’s as if a human could focus on red colors by squatting, or single out high-pitched sounds by going into downward dog (or downward spider).

“The ability to sense vibrations that move through solid surfaces, as distinct from sounds that travel through air, is “an often overlooked aspect of animal communication,” says Beth Mortimer from the University of Oxford, who studies it in creatures from elephants to spiders. It’s likely, then, that the widow’s ability to control perception through posture “almost certainly [exists in] other spiders and web types, too, and other arthropods, including insects, that detect vibrations along surfaces through their legs.” Scientists just need to tune in.”

“…there resides in every language a characteristic world-view”

“Via the latter, qua character of a speech-sound, a pervasive analogy necessarily prevails in the same language; and since a like subjectivity also affects language in the same notion, there resides in every language a characteristic world-view. As the individual sound stands between man and the object, so the entire language steps in between him and the nature that operates, both inwardly and outwardly, upon him. He surrounds himself with a world of sounds, so as to take up and process within himself the world of objects. These expressions in no way outstrip the measure of the simple truth. Man lives primarily with objects, indeed, since feeling and acting in him depend on his presentations, he actually does so exclusively, as language presents them to him. By the same act whereby he spins language out of himself, he spins himself into it, and every language draws about the people that possesses it a circle whence it is possible to exit only by stepping over at once into the circle of another one. To learn a foreign language should therefore be to acquire a new standpoint in the world-view hitherto possessed, and in fact to a certain extent is so, since every language contains the whole conceptual fabric and mode of presentation of a portion of mankind.”

Wilhelm von Humboldt
On Language (1836)

* * *

Wilhelm von Humboldt
from Wikipedia

Wilhelm von Humboldt
from Stanford Encyclopedia of Philosophy

Wilhelm von Humboldt lectures
from Université de Rouen

Wilhelm von Humboldt and the World of Languages
by Ian F. McNeely

Wilhelm von Humboldt: A Critical Review On His Philosophy of Language, Theory and Practice of Education
by Dr Arlini Alias

The theory of linguistic relativity from the historical perspective
by Iaroslav

Ideasthesia

Ideasthesia
from Wikipedia

Ideasthesia (alternative spelling ideaesthesia) is defined as a phenomenon in which activations of concepts (inducers) evoke perception-like experiences (concurrents). The name comes from Ancient Greek ἰδέα (idéa) and αἴσθησις (aísthēsis), meaning "sensing concepts" or "sensing ideas". The main reason for introducing the notion of ideasthesia was the problems with synesthesia. While "synesthesia" means "union of senses", empirical evidence indicated that this was an incorrect explanation of a set of phenomena traditionally covered by this heading. Syn-aesthesis, denoting also "co-perceiving", implies the association of two sensory elements with little connection to the cognitive level. However, according to others, most phenomena that have inadvertently been linked to synesthesia in fact are induced by the semantic representations. That is, the meaning of the stimulus is what is important rather than its sensory properties, as would be implied by the term synesthesia. In other words, while synesthesia presumes that both the trigger (inducer) and the resulting experience (concurrent) are of sensory nature, ideasthesia presumes that only the resulting experience is of sensory nature while the trigger is semantic. Meanwhile, the concept of ideasthesia developed into a theory of how we perceive and the research has extended to topics other than synesthesia — as the concept of ideasthesia turned out to be applicable to our everyday perception. Ideasthesia has even been applied to the theory of art. Research on ideasthesia bears important implications for solving the mystery of human conscious experience, which, according to ideasthesia, is grounded in how we activate concepts.

What Is “Ideasthesia” And Why Is It Important To Know?
by Faena Aleph

Many of us speak metaphorically when we describe a color as "screaming" or a sound as "sharp." These are synesthetic associations we all experience, whether we know it or not, but we pronounce them literally because it makes enough sense to us.

But synesthesia, which is one of the most charming sensory phenomena, has been overly studied and illustrated by many artists. Today, however, a fascinating aspect of this bridge between senses is being discovered: ideasthesia.

Danko Nikolic, a brain researcher from the Max Planck Institute, has proposed this theory that questions the reality of two philosophical premises: 1) the mind and body, and 2) the perception of senses and ideas. His research suggests that, for starters, these dualities might not exist.

Broadly speaking, ideasthesia is a type of bridge that metaphorically links rational abstractions, i.e. ideas, with sensory stimuli in a dynamic catalyzed by language. Nevertheless, the best way of understanding "ideasthesia" is through a TED talk that Nikolic himself recently gave. And, be warned, his theory might just change your paradigms from their foundation and reinforce the beliefs that Walt Whitman anticipated over a hundred years ago.

Ideasthesia — Art, Genius, Insanity, and Semiotics
by Totem And Token

…the notion of ideasthesia — that one can feel or physically experience an idea. Instead of a letter or a sound or a single word as being physically felt, an entire idea or construct or abstract is experienced phenomenologically.

But this seems abstract in and of itself, right? Like, what would it mean to ‘feel’ an idea? The classic example, linked to here, would be to imagine two shapes. One is a curvy splatter, kind of like the old 90s Nickelodeon logo, and the other is an angular, jagged, pointy sort of shape. Which would you call Kiki and which would you call Bouba?

An overwhelming majority (95% according to one source) would say that the splatter is Bouba and the pointy thing is Kiki.

But why though?

Bouba and Kiki are random sounds, absolutely meaningless and the figures were similarly meaningless. Some contend that it is a linguistic effect, since ‘K’ is an angular letter and ‘B’ is more rounded. Yet, there seems to be a consensus on which is which, even cross-culturally to some extent. Because just the idea of the pointy shape feels like a Kiki and the blobbier shape feels like a Bouba.

Another way I think it is felt is when we talk about highly polarizing topics, often political or religious in nature. In the podcast You Are Not So Smart, David McRaney talks about being confronted with a differing viewpoint as having a gut-wrenching, physical effect on him. Researchers pointed out that the feeling is so strong that it actually elicits a fight-or-flight response.

But it's just words, right? It's not like someone saying "I don't believe in universal healthcare" or "You should have the right to pull the plug in a coma" actually makes it so, or will cause it to happen to you. It is simply one person's thought, so why does it trigger such a deep-seated emotion? The researchers in the episode hypothesize that the core ideas are related to your identity, which is being threatened, but I think the explanation is somewhat simpler and stranger.

It’s because the ideas actually feel dangerous to you.

This is why what feels perfectly rational to you feels irrational to others.

It also makes more sense when talking about geniuses or highly gifted individuals. Although they exist, the Dr. House-type hyper-rational savants aren't usually what you hear about when you look at the biographies of highly intelligent or gifted people. Da Vinci, Goethe, Tesla, Einstein and others all seem to describe an intensely phenomenological approach to creating their works.

Even what are traditionally considered more rational pursuits, like math, have occasional introspective debates about whether string theory or higher-order mathematics is created or discovered. This seems like a question about whether one feels out a thought or collects and constructs evidence to make a case.

What's more, while I think most people can feel an idea to some extent (Kiki vs. Bouba), gifted people and geniuses are more sensitive to these ideas and can thus navigate them better. Sensitivity seems to really be the hallmark of gifted individuals, so much so that I remember reading about how some gifted students have to wear special socks because the inner stitching was too distracting.

I remember when I was younger (around elementary school) there was a girl in our school's gifted program whom everyone could not stand. She seemed to have a hair trigger and would snap at just about anything. I realize now that she was simply incredibly sensitive to other children and didn't really know how to handle it maturely.

I can imagine that this sort of sensitivity, applied to ideas and thought processes, might actually be a big reason why geniuses can handle seemingly large and complex thoughts that are a struggle for the rest of us — they aren't just thinking through it, they are also feeling their way through it.

It may offer insight into the oft-observed correlation between madness and intellect. Maybe that’s what’s really going on in schizophrenia. It’s not just a disconnect of thoughts, but an oversensitivity to the ideas that breed those thoughts that elicits instinctive, reactionary emotions much like our fight-or-flight responses to polarizing thoughts. The hallucinations are another manifestation of the weaker sensory experience of benign symbols and thoughts.

The Haunting of Voices

“If I met a skin-changer who demanded my shoes, I’d give him my shoes.” This is what a Navajo guy once told me. I didn’t inquire about why a skin-changer would want his shoes, but it was a nice detail of mundane realism. This conversation happened when I was living in Arizona and working at the Grand Canyon. Some might see this anecdote as the over-worked imagination of the superstitious. That probably is how I took it at the time. But I wouldn’t now be so dismissive.

While there, my job was to do housekeeping in the El Tovar. It's an old hotel located directly on the South Rim of the canyon. It has the feeling of a building that has been around a while. Its age was hard for me to ignore given its lack of an elevator, something I became familiar with in carrying stacks of sheets up the stairs of multiple floors. I worked there a few times late at night and there was an eerie atmosphere to the place. You could viscerally sense the history, all the people who had stayed there and passed through.

There were stories of suicides and homicides, of lonely lost souls still looking for their lovers or simply going through their habitual routine in the afterlife. The place was famous for having been one of the locations where the Harvey Girls worked, young women looking for wealthy husbands. There was a tunnel that was once used by the Harvey Girls to go between the hotel and the women's dorm. This hidden and now enclosed tunnel added to the spookiness.

Many Navajo worked at the Grand Canyon, including at the El Tovar. And sometimes we would chat. I asked about the ghosts that supposedly haunted the place. But they were reluctant to talk about it. I later learned that they thought it disrespectful or unwise to speak of the dead. I also learned that some had done traditional ceremonies in the hotel in order to put the dead to rest and help them pass over to the other side. Speaking of the dead would be like calling them back to the world of the living.

I doubt this worldview is merely metaphorical in the superficial sense. Though it might be metaphorical in the Jaynesian sense. Julian Jaynes hypothesized that ancient people continued to hear the voices of the dead, that the memory would live on as auditory experience. He called this the bicameral mind. And in bicameral societies, voice-hearing supposedly was key to social order. This changed because of various reasons and then voice-hearing became a threat to the next social order that replaced the old one.

The Navajo’s fearful respect of ghosts could be thought of as a bicameral carryover. Maybe they better understand the power voice-hearing can have. Ask any schizophrenic about this and they’d agree. Most of us, however, have developed thick boundaries of the egoic mind. We so effectively repress the many voices under the authority of the egoic sole rulership that we no longer are bothered by their sway, at least not consciously.

Still, we may be more influenced than we realize. We still go through the effort of costly rituals of burying the dead where they are kept separate from the living, not to mention appeasing them with flowers and flags. Research shows that the number of people who have heard disembodied voices in their lifetime is surprisingly high. The difference for us is that we don’t openly talk about it and try our best to quickly forget it again. Even as we don’t have ceremonies in the way seen in Navajo tradition, we have other methods for dispelling the spirits that otherwise would haunt us.

Psychedelics and Language

“We cannot evolve any faster than we evolve our language because you cannot go to places that you cannot describe.”
~Terence McKenna

This post is a placeholder, as I work through some thoughts. Maybe the most central link between much of it is Terence McKenna's stoned ape theory. That is about the evolution of consciousness as it relates to psychedelics and language. Related to McKenna's view, there have been many observations of non-human animals imbibing a wide variety of mind-altering plants, often psychedelics. Giorgio Samorini, in Animals and Psychedelics, argues that this behavior is evolutionarily advantageous in that it induces lateral thinking.

Also, as McKenna points out, many psychedelics intensify the senses, a useful effect for hunting. Humans don't only take drugs themselves for this purpose but also give them to their animals: "A classic case is indigenous people giving psychedelics to hunting dogs to enhance their abilities. A study published in the Journal of Ethnobiology, reports that at least 43 species of psychedelic plants have been used across the globe for boosting dog hunting practices. The Shuar, an indigenous people from Ecuador, include 19 different psychedelic plants in their repertoire for this purpose—including ayahuasca and four different types of brugmansia" (Alex K. Gearin, High Kingdom). So, there are many practical reasons for using psychoactive drugs. Language might have been an unintended side effect.

There is another way to get to McKenna's conclusion. David Lewis-Williams asserts that cave paintings are shamanic. He discusses the entoptic imagery that is common in trance, whether from psychedelics or by other means. This interpretation isn't specifically about language, but that is where another theory can help us. Genevieve von Petzinger takes a different tack by speculating that the geometric signs on cave walls were a set of symbols, possibly a system of graphic communication and so maybe the origin of writing.

In exploring the sites for herself, she ascertained there were 32 signs found over a 30,000-year period in Europe. Some of the same signs were found outside of Europe as well. It's the consistency and repetition that caught her attention. They weren't random or idiosyncratic aesthetic flourishes. If we combine that with Lewis-Williams' theory, we might have the development of proto-concepts, still attached to the concrete world but in the process of developing into something else. It would indicate that something fundamental about the human mind itself was changing.

I have my own related theory about the competing influence of psychedelics and addictive substances, the influence being not only on the mind but on society and so related to the emergence of civilization. I’m playing around with the observation that it might tell us much about civilization that, over time, addiction became more prevalent than psychedelics. I see the shift in this preference having become apparent sometime following the neolithic era, although becoming most noticeable in the Axial Age. Of course, language already existed at that point. Though maybe, as Julian Jaynes and others have argued, the use of language changed. I’ll speculate about all of that at a later time.

In the articles and passages and links below, there are numerous overlapping ideas and topics. Here are some of what stood out to me or else some of the thoughts on my mind while reading:

  • Synaesthesia, gesture, ritual, dance, sound, melody, music, poiesis, repetition (mimesis, meter, rhythm, rhyme, and alliteration, etc.) vs repetition-compulsion;
  • formulaic vs grammatical language, poetry vs prose, concrete vs abstract, metaphor, and metonymy;
  • Aural and oral, listening and speaking, preliterate, epic storytelling, eloquence, verbosity, fluency, and graphomania;
  • enthralled, entangled, enactivated, embodied, extended, hypnosis, voices, voice-hearing, bundle theory of self, ego theory of self, authorization, and Logos;
  • Et cetera.

* * *

Animals on Psychedelics: Survival of the Trippiest
by Steven Kotler

According to Italian ethnobotanist Giorgio Samorini, in his 2001 Animals and Psychedelics, the risk is worth it because intoxication promotes what psychologist Edward de Bono once called lateral thinking – problem-solving through indirect and creative approaches. Lateral thinking is thinking outside the box, without which a species would be unable to come up with new solutions to old problems, without which a species would be unable to survive. De Bono thinks intoxication an important “liberating device,” freeing us from “rigidity of established ideas, schemes, divisions, categories and classifications.” Both Siegel and Samorini think animals use intoxicants for this reason, and they do so knowingly.

Don’t Be A Sea Squirt.
by Tom Morgan

It’s a feature of complex adaptive systems that a stable system is a precursor to a dead system. Something that runs the same routine day-after-day is typically a dying system. There’s evidence that people with depression are stuck in neurological loops that they can’t get out of. We all know what it’s like to be trapped in the same negative thought patterns. Life needs perpetual novelty to succeed. This is one of the reasons researchers think that psychedelics have proven effective at alleviating depression; they break our brains out of the same familiar neural pathways.

This isn’t a uniquely human trait, animals also engage in deliberate intoxication. In his book Animals and Psychedelics, Italian ethnobotanist Giorgio Samorini wrote ‘drug-seeking and drug-taking behavior, on the part of both humans and animals, enjoys an intimate connection with…depatterning.’ And thus dolphins get high on blowfish, elephants seek out alcohol and goats eat the beans of the mescal plant. They’re not just having fun, they’re expanding the possible range of their behaviours and breaking stale patterns. You’re not just getting wasted, you’re furthering the prospects of the species!*

Synesthesias, Synesthetic Imagination, and Metaphor in the Context of Individual Cognitive Development and Societal Collective Consciousness
by Harry Hunt

The continuum of synesthesias is considered in the context of evolution, childhood development, adult creativity, and related states of imaginative absorption, as well as the anthropology and sociology of “collective consciousness”. In Part I synesthesias are considered as part of the mid-childhood development of metacognition, based on a Vygotskian model of the internalization of an earlier animism and physiognomic perception, and as the precursor for an adult capacity for imaginative absorption central to creativity, metaphor, and the synesthetically based “higher states of consciousness” in spontaneous mystical experience, meditation, and psychedelic states. Supporting research is presented on childhood precocities of a fundamental synesthetic imagination that expands the current neuroscience of classical synesthetes into a broader, more spontaneous, and open-ended continuum of introspective cross modal processes that constitute the human self referential consciousness of “felt meaning”. In Part II Levi-Strauss’ analysis of the cross modal and synesthetic lattices underlying the mythologies of native peoples and their traditional animation thereby of surrounding nature as a self reflective metaphoric mirror, is illustrated by its partial survival and simplification in the Chinese I-Ching. Jung’s psychological analysis of the I-Ching, as a device for metaphorically based creative insight and as a prototype for the felt “synchronicities” underlying paranormal experience, is further extended into a model for a synesthetically and metaphorically based “collective consciousness”. This metaphorically rooted and coordinated social field is explicit in mythologically centered, shamanic peoples but rendered largely unconscious in modern societies that fail to further educate and train the first spontaneous synesthetic imaginings of mid-childhood.

Psychedelics and the Full-Fluency Phenomenon
by T.H.

Like me, many other people who stutter have experienced the full-fluency phenomenon while using psilocybin and MDMA, and, unlike me, while using LSD as well. […]

There’s also potential for immediate recovery from stuttering following a single high dose experience. One well told account of this comes from Paul Stamets, the renowned mycologist, whose stuttering stopped altogether following his first psilocybin mushroom experience. To sustain such a high increase in fluency after the effects of the drug wear off is rare, but Paul’s story gives testimony to the possibility for it to occur.

Can Psychedelics Help You Learn New Languages?
by The Third Wave Podcast

Idahosa Ness runs “The Mimic Method,” a website that promises to help you learn foreign languages quickly by immersing you in their sounds and pronunciations. We talk to Idahosa about his experiences with cannabis and other psychedelics, and how they have improved his freestyle rapping, increased his motivation to learn new languages, and helped the growth of his business.

Marijuana and Divergent Thinking
by Jonah Lehrer

A new paper published in Psychiatry Research sheds some light on this phenomenon, or why smoking weed seems to unleash a stream of loose associations. The study looked at a phenomenon called semantic priming, in which the activation of one word allows us to react more quickly to related words. For instance, the word “dog” might lead to decreased reaction times for “wolf,” “pet” and “Lassie,” but won’t alter how quickly we react to “chair”.

Interestingly, marijuana seems to induce a state of hyper-priming, in which the reach of semantic priming extends outwards to distantly related concepts. As a result, we hear “dog” and think of nouns that, in more sober circumstances, would seem to have nothing in common. […]

Last speculative point: marijuana also enhances brain activity (at least as measured indirectly by cerebral blood flow) in the right hemisphere. The drug, in other words, doesn’t just suppress our focus or obliterate our ability to pay attention. Instead, it seems to change the very nature of what we pay attention to, flattening out our hierarchy of associations.
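
To make the hyper-priming idea concrete, here is a minimal toy sketch in Python, not anything from the study itself: treat priming as activation spreading out from a cue word and decaying with semantic distance, and let a single decay parameter stand in for the drug’s flattening of the hierarchy of associations. The word list, distances, and decay values are all invented for illustration.

    import numpy as np

    # Toy semantic distances from the cue word "dog" (hops in an association graph).
    # These numbers are invented for illustration, not data from the study.
    distances = {"wolf": 1, "pet": 1, "Lassie": 1, "leash": 2, "moon": 4, "chair": 5}

    def priming_boost(distance, decay):
        """Activation reaching a word: decays exponentially with semantic distance."""
        return np.exp(-decay * distance)

    for label, decay in [("sober (steep decay)", 1.5), ("hyper-primed (flat decay)", 0.3)]:
        boosts = {word: round(priming_boost(d, decay), 2) for word, d in distances.items()}
        print(label, boosts)

    # With a steep decay, only close associates ("wolf", "pet", "Lassie") get a meaningful
    # boost; flattening the decay spreads activation out to remote words ("moon", "chair"),
    # mirroring the reported flattening of the hierarchy of associations.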

How the Brain Processes Language on Acid Is a Trip
by Madison Margolin

“Results showed that while LSD does not affect reaction times, people under LSD made more mistakes that were similar in meaning to the pictures they saw,” said lead author Dr. Neiloufar Family, a post-doc from the University of Kaiserslautern.

For example, participants who were dosed with acid would more often say “bus” or “train” when asked to identify a picture of a car, compared to those who ingested the placebo. These lexical mixups shed some light on how LSD affects semantic networks and the way the brain draws connections between different words or concepts.

“The effects of LSD on language can result in a cascade of associations that allow quicker access to far away concepts stored in the mind,” said Family, discussing the study’s implications for psychedelic-assisted psychotherapy. Moreover, she added, “inducing a hyper-associative state may have implications for the enhancement of creativity.”

New study shows LSD’s effects on language
by Technische Universität Kaiserslautern

This indicates that LSD seems to affect the mind’s semantic networks, or how words and concepts are stored in relation to each other. When LSD makes the network activation stronger, more words from the same family of meanings come to mind.

The results from this experiment can lead to a better understanding of the neurobiological basis of semantic network activation. Neiloufar Family explains a further implication: “These findings are relevant for the renewed exploration of psychedelic psychotherapy, which are being developed for depression and other mental illnesses. The effects of LSD on language can result in a cascade of associations that allow quicker access to far away concepts stored in the mind.”

The many potential uses of this class of substances are under scientific debate. “Inducing a hyper-associative state may have implications for the enhancement of creativity,” Family adds. The increase in activation of semantic networks can lead distant or even subconscious thoughts and concepts to come to the surface.

A new harmonic language decodes the effects of LSD
by Oxford Neuroscience

Dr Selen Atasoy, the lead author of the study says: “The connectome harmonics we used to decode brain activity are universal harmonic waves, such as sound waves emerging within a musical instrument, but adapted to the anatomy of the brain. Translating fMRI data into this harmonic language is actually not different than decomposing a complex musical piece into its musical notes”. “What LSD does to your brain seems to be similar to jazz improvisation” says Atasoy, “your brain combines many more of these harmonic waves (connectome harmonics) spontaneously yet in a structured way, just like improvising jazz musicians play many more musical notes in a spontaneous, non-random fashion”.

“The presented method introduces a new paradigm to study brain function, one that links space and time in brain activity via the universal principle of harmonic waves. It also shows that this spatio-temporal relation in brain dynamics resides at the transition between order and chaos.” says Prof Gustavo Deco.

Dr. Robin Carhart-Harris adds: “Our findings reveal the first experimental evidence that LSD tunes brain dynamics closer to criticality, a state that is maximally diverse and flexible while retaining properties of order. This may explain the unusual richness of consciousness experienced under psychedelic drugs and the notion that they ‘expand consciousness’.”
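
For readers who want the “harmonic language” analogy spelled out: in the published method, the harmonics are eigenvectors of a graph Laplacian built from the brain’s structural connectome, and an activity pattern is decomposed onto them much as a sound is decomposed into notes. The Python sketch below uses a random placeholder graph and a random activity vector; only the decomposition step reflects the idea described above.

    import numpy as np

    rng = np.random.default_rng(0)

    # Placeholder "connectome": a random symmetric adjacency matrix over 50 brain regions.
    n = 50
    A = rng.random((n, n))
    A = (A + A.T) / 2
    np.fill_diagonal(A, 0)

    # Connectome harmonics: eigenvectors of the graph Laplacian L = D - A,
    # ordered from low to high spatial frequency (the "notes" adapted to the anatomy).
    L = np.diag(A.sum(axis=1)) - A
    eigenvalues, harmonics = np.linalg.eigh(L)

    # Placeholder activity snapshot: one value per region (a flattened fMRI volume would go here).
    activity = rng.standard_normal(n)

    # "Translating" the activity into the harmonic language: project it onto the basis.
    coefficients = harmonics.T @ activity
    power = coefficients ** 2

    # The distribution of power across harmonics is the repertoire being "played";
    # the study's claim is that LSD broadens and diversifies this repertoire.
    print(power[:10])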

Did Psilocybin Mushrooms Lead to Human Language?
by Chris Rhine

Numerous archaeological finds include depictions of psilocybin mushrooms from various places and times around the world. One such find shows hallucinogenic mushrooms in works produced 7,000 to 9,000 years ago in the Sahara Desert, as stated in Giorgio Samorini’s article, “The Oldest Representations of Hallucinogenic Mushrooms in the World.” Samorini concluded, “This Saharan testimony would demonstrate that the use of hallucinogens originates in the Paleolithic period and is invariably included within mystico-religious contexts and rituals.”

Some of early man’s first drawings include the ritualization of a plant as a sign—possibly a tribute to the substance that helped in the written sign’s development.

Are Psychedelic Hallucinations Actually Metaphorical Perceptions?
by Michael Fortier

The brain is constantly attempting to predict what is going on in the world. Because it happens in a dark environment with reduced sensory stimulation, the ayahuasca ritual dampens bottom-up signaling (sensory information becomes scarcer). If you are facing a tree in daylight and your brain wrongly guesses that there is an electric pole in front of you, bottom-up prediction errors will quickly correct the wrong prediction—i.e., the lookout will quickly and successfully warn the helmsman. But if the same happens in the dark, bottom-up prediction errors will be sparser and vaguer, and possibly not sufficient to correct errors—as it were, the lookout’s warning will be too faint to reach the helmsman. As ayahuasca introduces noise in the brain processes, and because bottom-up corrections cannot be as effective as usual, hallucinations appear more easily. So, on the one hand, the relative sensory deprivation of the environment in which the ayahuasca ritual takes place, and the absence of bodily motion, both favor the occurrence of hallucinations.

Furthermore, the ayahuasca ritual does include some sensory richness. The songs, the perfume, and the tobacco stimulate the brain in multiple ways. Psychedelic hallucinogens are known to induce synesthesia and to increase communication between areas and networks of the brain that do not usually communicate with each other. It is hence no surprise that the shamans’ songs are able to shape people’s visions. If one sensory modality is noisier or fainter than others, its role in perception will be downplayed. This is what happens with ayahuasca: Given that not much information can be gathered by the visual modality, most of the prediction errors that contribute to the shaping of conscious perception are those coming from the auditory and olfactory modalities. The combination of synesthetic processing with the increased weight attributed to non-visual senses enables shamans to “drive” people’s visions.

The same mechanisms explain the shamans’ recommendation that perfume should be sprayed or tobacco blown when one is faced with a bad spirit. Conscious perception—e.g., vision of a spirit—is the result of a complex tradeoff between top-down predictions and bottom-up prediction errors. If you spray a huge amount of perfume or blow wreaths of smoke around you, your brain will receive new and reliable information from the olfactory modality. Under psychedelics, sensory modalities easily influence one another; as a result, a sudden olfactory change amounts to sending prediction errors to upper regions of the brain. Conscious perception is updated accordingly: as predicted by the shamans’ recommendation, the olfactory change dissolves the vision of bad spirits.

In its classical sense, hallucination refers to sensory content that is not caused by objects of the world. The above description of the ayahuasca ritual demonstrates that psychedelic visions are not, in the classical sense of the term, hallucinations. Indeed, the content of the visions is tightly tied to the environment: A change of melody in a song or an olfactory change can completely transform the content of the visions. Ayahuasca visions are not caused by hypothetical supernatural entities living in a parallel world, nor are they constructed independently of the mundane objects of the world. What are they, then? They are metaphorical perceptions.

In everyday life, melodic and olfactory changes cannot affect vision much. However, because ayahuasca experience is profoundly synesthetic and intermodal, ayahuasca visions are characteristically metaphorical: A change in one sensory modality easily affects another modality. Ayahuasca visions are not hallucinations, since they are caused by real objects and events; for example, a cloud of perfume. It is more accurate to define them as metaphorical perceptions: they are loose intermodal interpretations of things that are really there.
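
Fortier’s predictive-processing account can be restated as a simple precision-weighted update: the percept is a compromise between a top-down prediction and bottom-up evidence, each weighted by how reliable it is. The Python sketch below, with made-up numbers, is only an illustration of that logic: dampened sensory precision (darkness, drug-induced noise) lets the prediction stand, while a strong new signal (perfume, tobacco smoke) sends a large prediction error upward and revises the percept.

    import numpy as np

    def perceive(prediction, evidence, precision_prediction, precision_evidence):
        """Precision-weighted fusion of a top-down prediction with bottom-up evidence.
        Higher precision = more reliable = more weight. Values are arbitrary units."""
        w = precision_evidence / (precision_prediction + precision_evidence)
        return prediction + w * (evidence - prediction)  # prediction corrected by weighted error

    prediction = 1.0   # what the brain expects ("a spirit is present")
    evidence = 0.0     # what the senses actually report ("nothing there")

    # Daylight: sensory evidence is precise, so the wrong prediction is corrected.
    print(perceive(prediction, evidence, precision_prediction=1.0, precision_evidence=10.0))

    # Dark, noisy ritual setting: evidence is imprecise, the prediction barely moves
    # and the "vision" persists.
    print(perceive(prediction, evidence, precision_prediction=1.0, precision_evidence=0.1))

    # A burst of perfume or smoke adds a strong, reliable signal in another modality,
    # sending a large prediction error upward and updating the percept.
    print(perceive(prediction, evidence, precision_prediction=1.0, precision_evidence=20.0))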

Michael Pollan on the science of how psychedelics can ‘shake your snow globe’
interview with Michael Pollan

We know that, for example, the so-called classic psychedelics – psilocybin, LSD, DMT, mescaline – these activate a certain receptor, a serotonin receptor. And so we know that they are the key that fits that lock. But beyond that, there’s a cascade of effects that happens.

The observed effect is that, if you do brain imaging of people who are tripping, you find some very interesting patterns of activity in the brain – specifically in something called the default mode network, which is a very important hub in the brain, linking parts of the cerebral cortex to deeper, older areas having to do with memory and emotion. This network is a kind of regulator of all brain activities. One neuroscientist called it, ‘The conductor of the neural symphony,’ and it’s deactivated by psychedelics, which is very interesting because the assumption going in was that they would see lots of strange activity everywhere in the brain because there’s such fireworks in the experience, but in fact, this particular network almost goes offline.

Now what is this network responsible for? Well, in addition to being this transportation hub for signals in the brain, it is involved with self-reflection. It’s where we go to ruminate or mind-wander – thinking about the past or thinking about the future – therefore worrying takes place here. Our sense of self, if it can be said to have an address at all, resides in this particular brain network. So this is a very interesting clue to how psychedelics affect the brain and how they create the psychological experience, the experience in the mind, that is so transformative.

When it goes offline, parts of the brain that don’t ordinarily communicate with one another strike up a conversation. And those connections may represent what people feel during the psychedelic experience as things like synaesthesia. Synaesthesia is when one sense gets cross-wired with another. And so you suddenly smell musical notes or taste things that you see.

It may produce insights. It may produce new metaphors – literally connecting the dots in new ways. Now here I’m being speculative – I’m going a little beyond what we’ve established – we know there are new connections, we don’t know what’s happening with them, or which of them endure. But the fact is, the brain is temporarily rewired. And that rewiring – whether the new connections actually produce the useful material or it’s just the shaking up of the system, ‘shaking the snow globe,’ as one of the neuroscientists put it – is what’s therapeutic. It is a reboot of the brain.

If you think about, you know, mental illnesses such as depression, addiction, and anxiety, many of them involve these loops of thought that we can’t control and we get stuck on these stories we tell ourselves – that we can’t get through the next hour without a drink, or we’re worthless and unworthy of love. We get stuck in these stories. This temporarily dissolves those stories and gives us a chance to write new stories.

Terence McKenna Collection

The mutation-inducing influence of diet on early humans and the effect of exotic metabolites on the evolution of their neurochemistry and culture is still unstudied territory. The early hominids’ adoption of an omnivorous diet and their discovery of the power of certain plants were decisive factors in moving early humans out of the stream of animal evolution and into the fast-rising tide of language and culture. Our remote ancestors discovered that certain plants, when self-administered, suppress appetite, diminish pain, supply bursts of sudden energy, confer immunity against pathogens, and synergize cognitive activities. These discoveries set us on the long journey to self-reflection. Once we became tool-using omnivores, evolution itself changed from a process of slow modification of our physical form to a rapid definition of cultural forms by the elaboration of rituals, languages, writing, mnemonic skills, and technology.

Food of the Gods
by Terence McKenna
pp. 24-29

Because scientists were unable to explain this tripling of the human brain size in so short a span of evolutionary time, some of the early primate paleontologists and evolutionary theorists predicted and searched for evidence of transitional skeletons. Today the idea of a “missing link” has largely been abandoned. Bipedalism, binocular vision, the opposable thumb, the throwing arm – all have been put forth as the key ingredient in the mix that caused self-reflecting humans to crystallize out of the caldron of competing hominid types and strategies. Yet all we really know is that the shift in brain size was accompanied by remarkable changes in the social organization of the hominids. They became users of tools, fire, and language. They began the process as higher animals and emerged from it 100,000 years ago as conscious, self-aware individuals.

THE REAL MISSING LINK

My contention is that mutation-causing, psychoactive chemical compounds in the early human diet directly influenced the rapid reorganization of the brain’s information-processing capacities. Alkaloids in plants, specifically the hallucinogenic compounds such as psilocybin, dimethyltryptamine (DMT), and harmaline, could be the chemical factors in the protohuman diet that catalyzed the emergence of human self-reflection. The action of hallucinogens present in many common plants enhanced our information processing activity, or environmental sensitivity, and thus contributed to the sudden expansion of the human brain size. At a later stage in this same process, hallucinogens acted as catalysts in the development of imagination, fueling the creation of internal stratagems and hopes that may well have synergized the emergence of language and religion.

In research done in the late 1960s, Roland Fischer gave small amounts of psilocybin to graduate students and then measured their ability to detect the moment when previously parallel lines became skewed. He found that performance ability on this particular task was actually improved after small doses of psilocybin.

When I discussed these findings with Fischer, he smiled after explaining his conclusions, then summed up, “You see what is conclusively proven here is that under certain circumstances one is actually better informed concerning the real world if one has taken a drug than if one has not.” His facetious remark stuck with me, first as an academic anecdote, later as an effort on his part to communicate something profound. What would be the consequences for evolutionary theory of admitting that some chemical habits confer adaptive advantage and thereby become deeply scripted in the behavior and even genome of some individuals?

THREE BIG STEPS FOR THE HUMAN RACE

In trying to answer that question I have constructed a scenario, some may call it fantasy; it is the world as seen from the vantage point of a mind for which the millennia are but seasons, a vision that years of musing on these matters has moved me toward. Let us imagine for a moment that we stand outside the surging gene swarm that is biological history, and that we can see the interwoven consequences of changes in diet and climate, which must certainly have been too slow to be felt by our ancestors. The scenario that unfolds involves the interconnected and mutually reinforcing effects of psilocybin taken at three different levels. Unique in its properties, psilocybin is the only substance, I believe, that could yield this scenario.

At the first, low, level of usage is the effect that Fischer noted: small amounts of psilocybin, consumed with no awareness of its psychoactivity while in the general act of browsing for food, and perhaps later consumed consciously, impart a noticeable increase in visual acuity, especially edge detection. As visual acuity is at a premium among hunter-gatherers, the discovery of the equivalent of “chemical binoculars” could not fail to have an impact on the hunting and gathering success of those individuals who availed themselves of this advantage. Partnership groups containing individuals with improved eyesight will be more successful at feeding their offspring. Because of the increase in available food, the offspring within such groups will have a higher probability of themselves reaching reproductive age. In such a situation, the out breeding (or decline) of non-psilocybin-using groups would be a natural consequence.

Because psilocybin is a stimulant of the central nervous system, when taken in slightly larger doses, it tends to trigger restlessness and sexual arousal. Thus, at this second level of usage, by increasing instances of copulation, the mushrooms directly favored human reproduction. The tendency to regulate and schedule sexual activity within the group, by linking it to a lunar cycle of mushroom availability, may have been important as a first step toward ritual and religion. Certainly at the third and highest level of usage, religious concerns would be at the forefront of the tribe’s consciousness, simply because of the power and strangeness of the experience itself. This third level, then, is the level of the full-blown shamanic ecstasy. The psilocybin intoxication is a rapture whose breadth and depth is the despair of prose. It is wholly Other and no less mysterious to us than it was to our mushroom-munching ancestors. The boundary-dissolving qualities of shamanic ecstasy predispose hallucinogen-using tribal groups to community bonding and to group sexual activities, which promote gene mixing, higher birth rates, and a communal sense of responsibility for the group offspring.

At whatever dose the mushroom was used, it possessed the magical property of conferring adaptive advantages upon its archaic users and their group. Increased visual acuity, sexual arousal, and access to the transcendent Other led to success in obtaining food, sexual prowess and stamina, abundance of offspring, and access to realms of supernatural power. All of these advantages can be easily self-regulated through manipulation of dosage and frequency of ingestion. Chapter 4 will detail psilocybin’s remarkable property of stimulating the language-forming capacity of the brain. Its power is so extraordinary that psilocybin can be considered the catalyst to the human development of language.

STEERING CLEAR OF LAMARCK

An objection to these ideas inevitably arises and should be dealt with. This scenario of human emergence may seem to smack of Lamarckism, which theorizes that characteristics acquired by an organism during its lifetime can be passed on to its progeny. The classic example is the claim that giraffes have long necks because they stretch their necks to reach high branches.

This straightforward and rather common-sense idea is absolutely anathema among neo-Darwinians, who currently hold the high ground in evolutionary theory. Their position is that mutations are entirely random and that only after the mutations are expressed as the traits of organisms does natural selection mindlessly and dispassionately fulfill its function of preserving those individuals upon whom an adaptive advantage had been conferred.

Their objection can be put like this: While the mushrooms may have given us better eyesight, sex, and language when eaten, how did these enhancements get into the human genome and become innately human? Nongenetic enhancements of an organism’s functioning made by outside agents retard the corresponding genetic reservoirs of those facilities by rendering them superfluous. In other words, if a necessary metabolite is common in available food, there will not be pressure to develop a trait for endogenous expression of the metabolite. Mushroom use would thus create individuals with less visual acuity, language facility, and consciousness. Nature would not provide those enhancements through organic evolution because the metabolic investment required to sustain them wouldn’t pay off, relative to the tiny metabolic investment required to eat mushrooms. And yet today we all have these enhancements, without taking mushrooms. So how did the mushroom modifications get into the genome?

The short answer to this objection, one that requires no defense of Lamarck’s ideas, is that the presence of psilocybin in the hominid diet changed the parameters of the process of natural selection by changing the behavioral patterns upon which that selection was operating. Experimentation with many types of foods was causing a general increase in the numbers of random mutations being offered up to the process of natural selection, while the augmentation of visual acuity, language use, and ritual activity through the use of psilocybin represented new behaviors. One of these new behaviors, language use, previously only a marginally important trait, was suddenly very useful in the context of new hunting and gathering lifestyles. Hence psilocybin inclusion in the diet shifted the parameters of human behavior in favor of patterns of activity that promoted increased language; acquisition of language led to more vocabulary and an expanded memory capacity. The psilocybin-using individuals evolved epigenetic rules or cultural forms that enabled them to survive and reproduce better than other individuals. Eventually the more successful epigenetically based styles of behavior spread through the populations along with the genes that reinforce them. In this fashion the population would evolve genetically and culturally.

As for visual acuity, perhaps the widespread need for corrective lenses among modern humans is a legacy of the long period of “artificial” enhancement of vision through psilocybin use. After all, atrophy of the olfactory abilities of human beings is thought by one school to be a result of a need for hungry omnivores to tolerate strong smells and tastes, perhaps even carrion. Trade-offs of this sort are common in evolution. The suppression of keenness of taste and smell would allow inclusion of foods in the diet that might otherwise be passed over as “too strong.” Or it may indicate something more profound about our evolutionary relationship to diet. My brother Dennis has written:

The apparent atrophy of the human olfactory system may actually represent a functional shift in a set of primitive, externally directed chemo-receptors to an interiorized regulatory function. This function may be related to the control of the human pheromonal system, which is largely under the control of the pineal gland, and which mediates, on a subliminal level, a host of psycho-sexual and psycho-social interactions between individuals. The pineal tends to suppress gonadal development and the onset of puberty, among other functions, and this mechanism may play a role in the persistence of neonatal characteristics in the human species. Delayed maturation and prolonged childhood and adolescence play a critical role in the neurological and psychological development of the individual, since they provide the circumstances which permit the post-natal development of the brain in the early, formative years of childhood. The symbolic, cognitive and linguistic stimuli that the brain experiences during this period are essential to its development and are the factors that make us the unique, conscious, symbol-manipulating, language-using beings that we are.

Neuroactive amines and alkaloids in the diet of early primates may have played a role in the biochemical activation of the pineal gland and the resulting adaptations.

pp. 46-60

HUMAN COGNITION

All the unique characteristics and preoccupations of human beings can be summed up under the heading of cognitive activities: dance, philosophy, painting, poetry, sport, meditation, erotic fantasy, politics, and ecstatic self-intoxication. We are truly Homo sapiens, the thinking animal; our acts are all a product of the dimension that is uniquely ours, the dimension of cognitive activity. Of thought and emotion, memory and anticipation. Of Psyche.

From observing the ayahuasca-using people of the Upper Amazon, it became very clear to me that shamanism is often intuitively guided group decision making. The shamans decide when the group should move or hunt or make war. Human cognition is an adaptive response that is profoundly flexible in the way it allows us to manage what in other species are genetically programmed behaviors.

We alone live in an environment that is conditioned not only by the biological and physical constraints to which all species are subject but also by symbols and language. Our human environment is conditioned by meaning. And meaning lies in the collective mind of the group.

Symbols and language allow us to act in a dimension that is “supranatural” – outside the ordinary activities of other forms of organic life. We can actualize our cultural assumptions, alter and shape the natural world in the pursuit of ideological ends and according to the internal model of the world that our symbols have empowered us to create. We do this through the elaboration of ever more effective, and hence ever more destructive, artifacts and technologies, which we feel compelled to use.

Symbols allow us to store information outside of the physical brain. This creates for us a relationship to the past very different from that of our animal companions. Finally, we must add to any analysis of the human picture the notion of self-directed modification of activity. We are able to modify our behavior patterns based on a symbolic analysis of past events, in other words, through history. Through our ability to store and recover information as images and written records, we have created a human environment as much conditioned by symbols and languages as by biological and environmental factors.

TRANSFORMATIONS OF MONKEYS

The evolutionary breakouts that led to the appearance of language and, later, writing are examples of fundamental, almost ontological, transformations of the hominid line. Besides providing us with the ability to code data outside the confines of DNA, cognitive activities allow us to transmit information across space and time. At first this amounted merely to the ability to shout a warning or a command, really little more than a modification of the cry of alarm that is a familiar feature of the behavior of social animals. Over the course of human history this impulse to communicate has motivated the elaboration of ever more effective communication techniques. But by our century, this basic ability has turned into the all-pervasive communications media, which literally engulf the space surrounding our planet. The planet swims through a self-generated ocean of messages. Telephone calls, data exchanges, and electronically transmitted entertainment create an invisible world experienced as global informational simultaneity. We think nothing of this; as a culture we take it for granted.

Our unique and feverish love of word and symbol has given us a collective gnosis, a collective understanding of ourselves and our world that has survived throughout history until very recent times. This collective gnosis lies behind the faith of earlier centuries in “universal truths” and common human values. Ideologies can be thought of as meaning-defined environments. They are invisible, yet they surround us and determine for us, though we may never realize it, what we should think about ourselves and reality. Indeed they define for us what we can think.

The rise of globally simultaneous electronic culture has vastly accelerated the rate at which we each can obtain information necessary to our survival. This and the sheer size of the human population as a whole have brought to a halt our physical evolution as a species. The larger a population is, the less impact mutations will have on the evolution of that species. This fact, coupled with the development of shamanism and, later, scientific medicine, has removed us from the theater of natural selection. Meanwhile libraries and electronic data bases have replaced the individual human mind as the basic hardware providing storage for the cultural data base. Symbols and languages have gradually moved us away from the style of social organization that characterized the mute nomadism of our remote ancestors and have replaced that archaic model with the vastly more complicated social organization characteristic of an electronically unified planetary society. As a result of these changes, we ourselves have become largely epigenetic, meaning that much of what we are as human beings is no longer in our genes but in our culture.

THE PREHISTORIC EMERGENCE OF HUMAN IMAGINATION

Our capacity for cognitive and linguistic activity is related to the size and organization of the human brain. Neural structures concerned with conceptualization, visualization, signification, and association are highly developed in our species. Through the act of speaking vividly, we enter into a flirtation with the domain of the imagination. The ability to associate sounds, or the small mouth noises of language, with meaningful internal images is a synesthetic activity. The most recently evolved areas of the human brain, Broca’s area and the neocortex, are devoted to the control of symbol and language processing.

The conclusion universally drawn from these facts is that the highly organized neurolinguistic areas of our brain have made language and culture possible. Where the search for scenarios of human emergence and social organization is concerned, the problem is this: we know that our linguistic abilities must have evolved in response to enormous evolutionary pressures – but we do not know what these pressures were.

Where psychoactive plant use was present, hominid nervous systems over many millennia would have been flooded by hallucinogenic realms of strange and alien beauty. However, evolutionary necessity channels the organism’s awareness into a narrow cul-de-sac where ordinary reality is perceived through the reducing valve of the senses. Otherwise, we would be rather poorly adapted for the rough-and-tumble of immediate existence. As creatures with animal bodies, we are aware that we are subject to a range of immediate concerns that we can ignore only at great peril. As human beings we are also aware of an interior world, beyond the needs of the animal body, but evolutionary necessity has placed that world far from ordinary consciousness.

PATTERNS AND UNDERSTANDING

Consciousness has been called “awareness of awareness” and is characterized by novel associations and connections among the various data of experience. Consciousness is like a super nonspecific immune response. The key to the working of the immune system is the ability of one chemical to recognize, to have a key-in-lock relationship, with another. Thus both the immune system and consciousness represent systems that learn, recognize, and remember.

As I write this I think of what Alfred North Whitehead said about understanding, that it is apperception of pattern as such. This is also a perfectly acceptable definition of consciousness. Awareness of pattern conveys the feeling that attends understanding. There presumably can be no limit to how much consciousness a species can acquire, since understanding is not a finite project with an imaginable conclusion, but rather a stance toward immediate experience. This appears self-evident from within a world view that sees consciousness as analogous to a source of light. The more powerful the light, the greater the surface area of darkness revealed. Consciousness is the moment-to-moment integration of the individual’s perception of the world. How well, one could almost say how gracefully, an individual accomplishes this integration determines that individual’s unique adaptive response to existence.

We are masters not only of individual cognitive activity, but, when acting together, of group cognitive activity as well. Cognitive activity within a group usually means the elaboration and manipulation of symbols and language. Although this occurs in many species, within the human species it is especially well developed. Our immense power to manipulate symbols and language gives us our unique position in the natural world. The power of our magic and our science arises out of our commitment to group mental activity, symbol sharing, meme replication (the spreading of ideas), and the telling of tall tales.

The idea, expressed above, that ordinary consciousness is the end product of a process of extensive compression and filtration, and that the psychedelic experience is the antithesis of this construction, was put forward by Aldous Huxley. In analyzing his experiences with mescaline, Huxley wrote:

I find myself agreeing with the eminent Cambridge philosopher, Dr. C. D. Broad, “that we should do well to consider the suggestion that the function of the brain and nervous system and sense organs is in the main eliminative and not productive.” The function of the brain and nervous system is to protect us from being overwhelmed and confused by this mass of largely useless and irrelevant knowledge, by shutting out most of what we should otherwise perceive or remember at any moment, and leaving only that very small and special selection which is likely to be practically useful. According to such a theory, each one of us is potentially Mind at Large. But in so far as we are animals, our business is at all costs to survive. To make biological survival possible, Mind at Large has to be funnelled through the reducing valve of the brain and nervous system. What comes out at the other end is a measly trickle of the kind of consciousness which will help us to stay alive on the surface of this particular planet. To formulate and express the contents of this reduced awareness, man has invented and endlessly elaborated those symbol-systems and implicit philosophies which we call languages. Every individual is at once the beneficiary and the victim of the linguistic tradition into which he has been born. That which, in the language of religion, is called “this world” is the universe of reduced awareness, expressed, and, as it were, petrified by language. The various “other worlds” with which human beings erratically make contact are so many elements in the totality of the awareness belonging to Mind at Large …. Temporary by-passes may be acquired either spontaneously, or as the result of deliberate “spiritual exercises,”. . . or by means of drugs.’

What Huxley did not mention was that drugs, specifically the plant hallucinogens, can reliably and repeatedly open the floodgates of the reducing valve of consciousness and expose the individual to the full force of the howling Tao. The way in which we internalize the impact of this experience of the Unspeakable, whether encountered through psychedelics or other means, is to generalize and extrapolate our world view through acts of imagination. These acts of imagination represent our adaptive response to information concerning the outside world that is conveyed to us by our senses. In our species, culture-specific, situation-specific syntactic software in the form of language can compete with and sometimes replace the instinctual world of hard-wired animal behavior. This means that we can learn and communicate experience and thus put maladaptive behaviors behind us. We can collectively recognize the virtues of peace over war, or of cooperation over struggle. We can change.

As we have seen, human language may have arisen when primate organizational potential was synergized by plant hallucinogens. The psychedelic experience inspired us to true self-reflective thought in the first place and then further inspired us to communicate our thoughts about it.

Others have sensed the importance of hallucinations as catalysts of human psychic organization. Julian Jaynes’s theory, presented in his controversial book The Origin of Consciousness in the Breakdown of the Bicameral Mind, makes the point that major shifts in human self-definition may have occurred even in historical times. He proposes that through Homeric times people did not have the kind of interior psychic organization that we take for granted. Thus, what we call ego was for Homeric people a “god.” When danger threatened suddenly, the god’s voice was heard in the individual’s mind; an intrusive and alien psychic function was expressed as a kind of metaprogram for survival called forth under moments of great stress. This psychic function was perceived by those experiencing it as the direct voice of a god, of the king, or of the king in the afterlife. Merchants and traders moving from one society to another brought the unwelcome news that the gods were saying different things in different places, and so cast early seeds of doubt. At some point people integrated this previously autonomous function, and each person became the god and reinterpreted the inner voice as the “self” or, as it was later called, the “ego.”

Jaynes’s theory has been largely dismissed. Regrettably his book on the impact of hallucinations on culture, though 467 pages in length, manages to avoid discussion of hallucinogenic plants or drugs nearly entirely. By this omission Jaynes deprived himself of a mechanism that could reliably drive the kind of transformative changes he saw taking place in the evolution of human consciousness.

CATALYZING CONSCIOUSNESS

The impact of hallucinogens in the diet has been more than psychological; hallucinogenic plants may have been the catalysts for everything about us that distinguishes us from other higher primates, for all the mental functions that we associate with humanness. Our society more than others will find this theory difficult to accept, because we have made pharmacologically obtained ecstasy a taboo. Like sexuality, altered states of consciousness are taboo because they are consciously or unconsciously sensed to be entwined with the mysteries of our origin – with where we came from and how we got to be the way we are. Such experiences dissolve boundaries and threaten the order of the reigning patriarchy and the domination of society by the unreflecting expression of ego. Yet consider how plant hallucinogens may have catalyzed the use of language, the most unique of human activities.

One has, in a hallucinogenic state, the incontrovertible impression that language possesses an objectified and visible dimension, which is ordinarily hidden from our awareness. Language, under such conditions, is seen, is beheld, just as we would ordinarily see our homes and normal surroundings. In fact our ordinary cultural environment is correctly recognized, during the experience of the altered state, as the bass drone in the ongoing linguistic business of objectifying the imagination. In other words, the collectively designed cultural environment in which we all live is the objectification of our collective linguistic intent.

Our language-forming ability may have become active through the mutagenic influence of hallucinogens working directly on organelles that are concerned with the processing and generation of signals. These neural substructures are found in various portions of the brain, such as Broca’s area, that govern speech formation. In other words, opening the valve that limits consciousness forces utterance, almost as if the word is a concretion of meaning previously felt but left unarticulated. This active impulse to speak, the “going forth of the word,” is sensed and described in the cosmogonies of many peoples.

Psilocybin specifically activates the areas of the brain concerned with processing signals. A common occurrence with psilocybin intoxication is spontaneous outbursts of poetry and other vocal activity such as speaking in tongues, though in a manner distinct from ordinary glossolalia. In cultures with a tradition of mushroom use, these phenomena have given rise to the notion of discourse with spirit doctors and supernatural allies. Researchers familiar with the territory agree that psilocybin has a profoundly catalytic effect on the linguistic impulse.

Once activities involving syntactic self-expression were established habits among early human beings, the continued evolution of language in environments where mushrooms were scarce or unavailable permitted a tendency toward the expression and emergence of the ego. If the ego is not regularly and repeatedly dissolved in the unbounded hyperspace of the Transcendent Other, there will always be slow drift away from the sense of self as part of nature’s larger whole. The ultimate consequence of this drift is the fatal ennui that now permeates Western civilization.

The connection between mushrooms and language was brilliantly anticipated by Henry Munn in his essay “The Mushrooms of Language”: “Language is an ecstatic activity of signification. Intoxicated by the mushrooms, the fluency, the ease, the aptness of expression one becomes capable of are such that one is astounded by the words that issue forth from the contact of the intention of articulation with the matter of experience. The spontaneity the mushrooms liberate is not only perceptual, but linguistic. For the shaman, it is as if existence were uttering itself through him.”

THE FLESH MADE WORD

The evolutionary advantages of the use of speech are both obvious and subtle. Many unusual factors converged at the birth of human language. Obviously speech facilitates communication and cognitive activity, but it also may have had unanticipated effects on the whole human enterprise.

Some neurophysiologists have hypothesized that the vocal vibration associated with human use of language caused a kind of cleansing of the cerebrospinal fluid. It has been observed that vibrations can precipitate and concentrate small molecules in the spinal fluid, which bathes and continuously purifies the brain. Our ancestors may have, consciously or unconsciously, discovered that vocal sound cleared the chemical cobwebs out of their heads. This practice may have affected the evolution of our present-day thin skull structure and proclivity for language. A self-regulated process as simple as singing might well have positive adaptive advantages if it also made the removal of chemical waste from the brain more efficient. The following excerpt supports this provocative idea:

Vibrations of human skull, as produced by loud vocalization, exert a massaging effect on the brain and facilitate elution of metabolic products from the brain into the cerebrospinal fluid (CSF) … The Neanderthals had a brain 15% larger than we have, yet they did not survive in competition with modern humans. Their brains were more polluted, because their massive skulls did not vibrate and therefore the brains were not sufficiently cleaned. In the evolution of the modern humans the thinning of cranial bones was important.

As already discussed, hominids and hallucinogenic plants must have been in close association for a long span of time, especially if we want to suggest that actual physical changes in the human genome resulted from the association. The structure of the soft palate in the human infant and timing of its descent is a recent adaptation that facilitates the acquisition of language. No other primate exhibits this characteristic. This change may have been a result of selective pressure on mutations originally caused by the new omnivorous diet.

WOMEN AND LANGUAGE

Women, the gatherers in the Archaic hunter-gatherer equation, were under much greater pressure to develop language than were their male counterparts. Hunting, the prerogative of the larger male, placed a premium on strength, stealth, and stoic waiting. The hunter was able to function quite well on a very limited number of linguistic signals, as is still the case among hunting peoples such as the !Kung or the Maku.

For gatherers, the situation was different. Those women with the largest repertoire of communicable images of foods and their sources and secrets of preparation were unquestionably placed in a position of advantage. Language may well have arisen as a mysterious power possessed largely by women – women who spent much more of their waking time together, and, usually, talking, than did men – women who in all societies are seen as group-minded, in contrast to the lone male image, which is the romanticized version of the alpha male of the primate troop.

The linguistic accomplishments of women were driven by a need to remember and describe to each other a variety of locations and landmarks as well as numerous taxonomic and structural details about plants to be sought or avoided. The complex morphology of the natural world propelled the evolution of language toward modeling of the world beheld. To this day a taxonomic description of a plant is a Joycean thrill to read: “Shrub 2 to 6 feet in height, glabrous throughout. Leaves mostly opposite, some in threes or uppermost alternate, sessile, linear-lanceolate or lanceolate, acute or acuminate. Flowers solitary in axils, yellow, with aroma, pedicellate. Calyx campanulate, petals soon caducous, obovate” and so on for many lines.

The linguistic depth women attained as gatherers eventually led to a momentous discovery: the discovery of agriculture. I call it momentous because of its consequences. Women realized that they could simply grow a restricted number of plants. As a result, they learned the needs of only those few plants, embraced a sedentary lifestyle, and began to forget the rest of nature they had once known so well.

At that point the retreat from the natural world began, and the dualism of humanity versus nature was born. As we will soon see, one of the places where the old goddess culture died, Çatal Hüyük, in present-day Anatolian Turkey, is the very place where agriculture may have first arisen. At places like Çatal Hüyük and Jericho, humans and their domesticated plants and animals became for the first time physically and psychologically separate from the life of untamed nature and the howling unknown. Use of hallucinogens can only be sanctioned in hunting and gathering societies. When agriculturists use these plants, they are unable to get up at dawn the morning after and go hoe the fields. At that point, corn and grain become gods – gods that symbolize domesticity and hard labor. These replace the old goddesses of plant-induced ecstasy.

Agriculture brings with it the potential for overproduction, which leads to excess wealth, hoarding, and trade. Trade leads to cities; cities isolate their inhabitants from the natural world. Paradoxically, more efficient utilization of plant resources through agriculture led to a breaking away from the symbiotic relationship that had bound human beings to nature. I do not mean this metaphorically. The ennui of modernity is the consequence of a disrupted quasi-symbiotic relationship between ourselves and Gaian nature. Only a restoration of this relationship in some form is capable of carrying us into a full appreciation of our birthright and sense of ourselves as complete human beings.

HABIT AS CULTURE AND RELIGION

At regular intervals that were probably lunar, the ordinary activities of the small nomadic group of herders were put aside. Rains usually followed the new moon in the tropics, making mushrooms plentiful. Gatherings took place at night; night is the time of magical projection and hallucinations, and visions are more easily obtained in darkness. The whole clan was present from oldest to youngest. Elders, especially shamans, usually women but often men, doled out each person’s dose. Each clan member stood before the group and reflectively chewed and swallowed the body of the Goddess before returning to his or her place in the circle. Bone flutes and drums wove within the chanting. Line dances with heavy foot stamping channeled the energy of the first wave of visions. Suddenly the elders signal silence.

In the motionless darkness each mind follows its own trail of sparks into the bush while some people keen softly. They feel fear, and they triumph over fear through the strength of the group. They feel relief mingled with wonder at the beauty of the visionary expanse; some spontaneously reach out to those nearby in simple affection and an impulse for closeness or in erotic desire. An individual feels no distance between himself or herself and the rest of the clan or between the clan and the world. Identity is dissolved in the higher wordless truth of ecstasy. In that world, all divisions are overcome. There is only the One Great Life; it sees itself at play, and it is glad.

The impact of plants on the evolution of culture and consciousness has not been widely explored, though a conservative form of this notion appears in R. Gordon Wasson’s The Road to Eleusis. Wasson does not comment on the emergence of self-reflection in hominids, but does suggest hallucinogenic mushrooms as the causal agent in the appearance of spiritually aware human beings and the genesis of religion. Wasson feels that omnivorous foraging humans would have sooner or later encountered hallucinogenic mushrooms or other psychoactive plants in their environment:

As man emerged from his brutish past, thousands of years ago, there was a stage in the evolution of his awareness when the discovery of the mushroom (or was it a higher plant?) with miraculous properties was a revelation to him, a veritable detonator to his soul, arousing in him sentiments of awe and reverence, and gentleness and love, to the highest pitch of which mankind is capable, all those sentiments and virtues that mankind has ever since regarded as the highest attribute of his kind. It made him see what this perishing mortal eye cannot see. How right the Greeks were to hedge about this Mystery, this imbibing of the potion with secrecy and surveillance! . . . Perhaps with all our modern knowledge we do not need the divine mushroom anymore. Or do we need them more than ever? Some are shocked that the key even to religion might be reduced to a mere drug. On the other hand, the drug is as mysterious as it ever was: “like the wind that comes we know not whence nor why.” Out of a mere drug comes the ineffable, comes ecstasy. It is not the only instance in the history of humankind where the lowly has given birth to the divine.

Scattered across the African grasslands, the mushrooms would be especially noticeable to hungry eyes because of their inviting smell and unusual form and color. Once having experienced the state of consciousness induced by the mushrooms, foraging humans would return to them repeatedly, in order to reexperience their bewitching novelty. This process would create what C. H. Waddington called a “creode,” a pathway of developmental activity, what we call a habit.

ECSTASY

We have already mentioned the importance of ecstasy for shamanism. Among early humans a preference for the intoxication experience was ensured simply because the experience was ecstatic. “Ecstatic” is a word central to my argument and preeminently worthy of further attention. It is a notion that is forced on us whenever we wish to indicate an experience or a state of mind that is cosmic in scale. An ecstatic experience transcends duality; it is simultaneously terrifying, hilarious, awe-inspiring, familiar, and bizarre. It is an experience that one wishes to have over and over again.

For a minded and language-using species like ourselves, the experience of ecstasy is not perceived as simple pleasure but, rather, is incredibly intense and complex. It is tied up with the very nature of ourselves and our reality, our languages, and our imagings of ourselves. It is fitting, then, that it is enshrined at the center of shamanic approaches to existence. As Mircea Eliade pointed out, shamanism and ecstasy are at root one concern:

This shamanic complex is very old; it is found, in whole or in part, among the Australians, the archaic peoples of North and South America, in the polar regions, etc. The essential and defining element of shamanism is ecstasy: the shaman is a specialist in the sacred, able to abandon his body and undertake cosmic journeys “in the spirit” (in trance). “Possession” by spirits, although documented in a great many shamanisms, does not seem to have been a primary and essential element. Rather, it suggests a phenomenon of degeneration; for the supreme goal of the shaman is to abandon his body and rise to heaven or descend into hell, not to let himself be “possessed” by his assisting spirits, by demons or the souls of the dead; the shaman’s ideal is to master these spirits, not to let himself be “occupied” by them.

Gordon Wasson added these observations on ecstasy:

In his trance the shaman goes on a far journey, to the place of the departed ancestors, or the nether world, or there where the gods dwell, and this wonderland is, I submit, precisely where the hallucinogens take us. They are a gateway to ecstasy. Ecstasy in itself is neither pleasant nor unpleasant. The bliss or panic into which it plunges you is incidental to ecstasy. When you are in a state of ecstasy, your very soul seems scooped out from your body and away it goes. Who controls its flight? Is it you, or your “subconscious,” or a “higher power”? Perhaps it is pitch dark, yet you see and hear more clearly than you have ever seen or heard before. You are at last face to face with Ultimate Truth: this is the overwhelming impression (or illusion) that grips you. You may visit Hell, or the Elysian fields of Asphodel, or the Gobi desert, or Arctic wastes. You know awe, you know bliss, and fear, even terror. Everyone experiences ecstasy in his own way, and never twice in the same way. Ecstasy is the very essence of shamanism. The neophyte from the great world associates the mushrooms primarily with visions, but for those who know the Indian language of the shaman the mushrooms “speak” through the shaman. The mushroom is the Word: es habla, as Aurelio told me. The mushroom bestows on the curandero what the Greeks called Logos, the Aryan Vac, Vedic Kavya, “poetic potency,” as Louis Renou put it. The divine afflatus of poetry is the gift of the entheogen. The textual exegete skilled only in dissecting the cruces of the verses lying before him is of course indispensable and his shrewd observations should have our full attention, but unless gifted with Kavya, he does well to be cautious in discussing the higher reaches of Poetry. He dissects the verses but knows not ecstasy, which is the soul of the verses.

The Magic Language of the Fourth Way
by Pierre Bonnasse
pp. 228-234

Speech, just like sacred medicine, forms the basis of the shamanic path in that it permits us not only to see but also to do. Ethnobotany, the science that studies man as a function of his relationship to the plants around him, offers us new paths of reflection, explaining our relationship to language from a new angle that reconsiders all human evolution in a single movement. It now appears clear that the greatest power of the shaman, that master of ecstasy, resides in his mastery of the magic word stimulated by the ingestion of modifiers of consciousness.

For the shaman, language produces reality, our world being made of language. Terence McKenna, in his revolutionary endeavor to rethink human evolution, shows how plants have been able to influence the development of humans and animals. 41 He explains why farming and the domestication of animals as livestock were a great step forward in our cultural evolution: It was at this moment, according to him, that we were able to come into contact with the Psilocybe mushroom, which grows on and around dung. He supports the idea that “mutation-causing, psychoactive chemical compounds in the early human diet directly influenced the rapid reorganization of the brain’s information-processing capacities.” 42 Further, because “thinking about human evolution ultimately means thinking about the evolution of human consciousness,” he supports the thesis that psychedelic plants “may well have synergized the emergence of language and religion.” 43

Studies undertaken by Fischer have shown that weak doses of psilocybin can improve certain types of mental performance while making the investigator more aware of the real world. McKenna distinguishes three degrees of effects of psilocybin: improvement of visual acuity, increase of sexual excitation, and, at higher doses, “certainly . . . religious concerns would be at the forefront of the tribe’s consciousness, simply because of the power and strangeness of the experience itself.” 44 Because “the psilocybin intoxication is a rapture whose breadth and depth is the despair of prose,” it is entirely clear to McKenna that shamanic ecstasy, characterized by its “boundary-dissolving qualities,” played a crucial role in the evolution of human consciousness, which, according to him, can be attributed to “psilocybin’s remarkable property of stimulating the language-forming capacity of the brain.” Indeed, “[i]ts power is so extraordinary that psilocybin can be considered the catalyst to the human development of language.” 45 In response to the neo-Darwinist objection, McKenna states that “the presence of psilocybin in the hominid diet changed the parameters of the process of natural selection by changing the behavioral patterns upon which that selection was operating,” and that “the augmentation of visual acuity, language use, and ritual activity through the use of psilocybin represented new behaviors.” 46

Be that as it may, it is undeniable that the unlimiters of consciousness, as Charles Duits calls them, have a real impact upon linguistic activity in that they strongly stimulate the emergence of speech. If, according to McKenna’s theories, “psilocybin inclusion in the diet shifted the parameters of human behavior in favor of patterns of activity that promoted increased language,” resulting in “more vocabulary and an expanded memory capacity,” 47 then it seems obvious that the birth of poetry, literature, and all the arts came about ultimately through the fantastic encounter between humans and the magic mushroom—a primordial plant, the “umbilical cord linking us to the feminine spirit of the planet,” and thence, inevitably, to poetry. Rich in behavioral and evolutionary consequences, the mushroom, in its dynamic relationship to the human being, propelled us toward higher cultural levels developing parallel to self-reflection. 48

This in no way means that this level of consciousness is inherent in all people, but it must be observed that the experience in itself leads to a gaining of consciousness which, in order to be preserved and maintained, requires rigorous and well-directed work on ourselves. This being said, the experience allows us to observe this action in ourselves in order to endeavor to understand its subtle mechanisms. Terence McKenna writes,

Of course, imagining these higher states of self-reflection is not easy. For when we seek to do this we are acting as if we expect language to somehow encompass that which is, at present, beyond language, or translinguistic. Psilocybin, the hallucinogen unique to mushrooms, is an effective tool in this situation. Psilocybin’s main synergistic effect seems ultimately to be in the domain of language. It excites vocalization; it empowers articulation; it transmutes language into something that is visibly beheld. It could have had an impact on the sudden emergence of consciousness and language use in early humans. We literally may have eaten our way to higher consciousness. 49

If we espouse this hypothesis, then speaking means evoking and repeating the primordial act of eating the sacred medicine. Ethnobotanists insist upon the role of the human brain in the accomplishment of this process, pinpointing precisely the relevant area of activity, which, in Gurdjieffian terms, is located in the center of gravity of the intellectual center: “Our capacity for cognitive and linguistic activity is related to the size and organization of the human brain. . . . The most recently evolved areas of the human brain, Broca’s area and the neocortex, are devoted to the control of symbol and language processing.” 50 It thus appears that these are the areas of the brain that have allowed for the emergence of language and culture. Yet McKenna adds, “our linguistic abilities must have evolved in response to enormous evolutionary pressures,” though we do not know the nature of these pressures. According to him, it is this “immense power to manipulate symbols and language” that “gives us our unique position in the natural world.” 51 This is obvious, in that speech and consciousness, inextricably linked, are solely the property of humans. Thus it seems logical that the plants known as psychoactive must have been the catalysts “for everything about us that distinguishes us from other higher primates, for all the mental functions that we associate with humanness,” 52 with the primary position being held by language, “the most unique of human activities,” and the catalyst for poetic and literary activity.

Under the influence of an unlimiter, we have the incontrovertible impression that language possesses an objectified and visible dimension that is ordinarily hidden from our awareness. Under such conditions, language is seen and beheld just as we would ordinarily see our homes and normal surroundings. In fact, during the experience of the altered state, our ordinary cultural environment is recognized correctly as the bass drone in the ongoing linguistic business of objectifying the imagination. In other words, the collectively designed cultural environment in which we all live is the objectification of our collective linguistic intent.

Our language-forming ability may have become active through the mutagenic influence of hallucinogens working directly on organelles that are concerned with the processing and generation of signals. These neural substructures are found in various portions of the brain, such as Broca’s area, that govern speech formation. In other words, opening the valve that limits consciousness forces utterance, almost as if the word is a concretion of meaning previously felt but left unarticulated. This active impulse to speak, the “going forth of the word,” is sensed and described in the cosmogonies of many peoples.

Psilocybin specifically activates the areas of the brain concerned with processing signals. A common occurrence with psilocybin intoxication is spontaneous outbursts of poetry and other vocal activity such as speaking in tongues, though in a manner distinct from ordinary glossolalia. In cultures with a tradition of mushroom use, these phenomena have given rise to the notion of discourse with spirit doctors and supernatural allies. Researchers familiar with the territory agree that psilocybin has a profoundly catalytic effect on the linguistic impulse. 53

Here we are touching upon the higher powers of speech—spontaneous creations, outbursts of poetry and suprahuman communications—which are part of the knowledge of the shamans and “sorcerers” who, through years of rigorous education, have become highly perceptive of these phenomena, which elude the subjective consciousness. In his essay “The Mushrooms of Language,” Henry Munn points to the direct links existing between the states of ecstasy and language: “Language is an ecstatic activity of signification. Intoxicated by the mushrooms, the fluency, the ease, the aptness of expression one becomes capable of are such that one is astounded by the words that issue forth from the contact of the intention of articulation with the matter of experience. . . . The spontaneity they liberate is not only perceptual, but linguistic . . . For the shaman, it is as if existence were uttering itself through him.” 54

In the 1920s, the Polish writer S. I. Witkiewicz, who attributed crucial importance to verbal creation, showed how peyote (he was one of the first people in Europe to experiment with it, or, at least, one of the first to give an account of doing so) acts upon the actual creation of words and also intervenes in the structure of sentences themselves: “. . . [I]t must also be remarked that peyote, perhaps by reason of the desire one has to capture with words that which cannot be captured, creates conceptual neologisms that belong to it alone and twists sentences in order to adapt their constructions to the frightening dimensions of its bizarrification . . .” 55 Peyote also gives those who ingest it a desire to create “new combinations of meanings.” Witkiewicz distinguishes three categories of objects in his visions: dead objects, moving objects, and living creatures. Regarding this last category, he distinguishes the “real” living creatures from the “fantastical” living creatures, which “discourage any attempt at description.” This is the moment when peyote intervenes: when those who wish to describe find themselves facing the limits of language. Peyote does not break through these limits; it simply shows that they do not exist, that they are hallucinations of the ordinary consciousness, that they are illusory, a mirage of tradition and the history of language.

The lucidogen—as it is called by Charles Duits, who created other neologisms for describing his experience with the sacred cactus—shows that life is present in everything, including speech, and he proves it. Sometimes, peyote leads us to the signifiers that escape us, always in order better to embrace the signified. Witkiewicz, pushing the phenomenon to the extreme limits of the senses and the sensible, insists:

I must draw attention to the fact that under the influence of peyote, one wants to make up neologisms. One of my friends, the most normal man in the world where language is concerned, in a state of trance and powerless to come to grips with the strangeness of these visions which defied all combinations of normal words, described them thus: “Pajtrakaly symforove i kondjioul v trykrentnykh pordeliansach.” I devised many formulas of this type on the night when I went to bed besieged by visions. I remember only this one. There is therefore nothing surprising in the fact that I, who have such inclinations even under normal conditions, should sometimes be driven to create some fancy word in order to attempt to disentangle and sort out the infernal vortex of creatures that unfurled upon me all night long from the depths of the ancient world of peyote. 56

Here, we cannot help but remember René Daumal’s experience, reported in “Le souvenir déterminant”: Under the influence of carbon tetrachloride, he pronounced with difficulty: “approximately: temgouf temgouf drr . . .” Henry Munn makes a similar remark after having taken part in shamanic rituals: “The mushroom session of language creates the words for phenomena without name.” 57 Sacred plants (and some other substances) are neologens, meaning they produce or generate neologisms from the attempts made at description by the subjects who consume them. This new word, this neologism created by circumstance, appears to be suited for this linguistic reality. We now have a word to designate this particular phenomenon pushing us against the limits of language, which in fact are revealed to be illusory.

Beyond this specific case, what is it that prevents us from creating new words whenever it appears necessary? Witkiewicz, speaking of language and life, defends the writer’s right to take liberties with the rules and invent new words. “Although certain professors insist on clinging to their own tripe,” he writes, “language is a living thing, even if it has always been considered a mummy, even if it has been thought impermissible to change anything in it. We can only imagine what literature, poetry, and even this accursed and beloved life would look like otherwise.” 58 Peyote not only incites us to this, but also, more forcefully, exercising a mysterious magnetic attraction toward a sort of supreme meaning beyond language and shaking up conventional signifiers and beings alike, peyote acts directly upon the heart of speech within the body of language. In this sense, it takes part actively and favorably in the creation of the being, the new and infinitely renewed human who, after a death that is more than symbolic, is reborn to new life. It is also very clear, in light of this example, that psilocybin alone does not explain everything, and that all lucidogenic substances work toward this same opening, this same outpouring of speech. McKenna writes:

Languages appear invisible to the people who speak them, yet they create the fabric of reality for their users. The problem of mistaking language for reality in the everyday world is only too well known. Plant use is an example of a complex language of chemical and social interactions. Yet most of us are unaware of the effects of plants on ourselves and our reality, partly because we have forgotten that plants have always mediated the human cultural relationship to the world at large. 59

pp. 238-239

It is interesting to note this dimension of speech specific to shamans, this inspired, active, healing speech. “It is not I who speak,” Heraclitus said, “it is the word.” The receptiveness brought about by an increased level of consciousness allows us not only to understand other voices, but also, above all, to express them in their entire magical substance. “Language is an ecstatic activity of signification. Intoxicated by the mushrooms, the fluency, the ease, the aptness of expression one becomes capable of are such that one is astounded by the words that issue forth from the contact of the intention of articulation with the matter of experience. . . . The spontaneity they liberate is not only perceptual, but linguistic, the spontaneity of speech, of fervent, lucid discourse, of the logos in activity.” 72

The shamanic paroxysm is therefore the mastery of the word, the mastery of the sacred songs very often inspired by the powers that live in plants—which instruct us, making us receptive to phenomena that escape the ordinary consciousness. The shaman becomes a channel through which subtle energies can pass. Because of the mystic intoxication, he becomes the instrument for spirits that express themselves through him. Hence the word tzo —“says”—which punctuates the phrases of the Mazatec shaman in her communication with the “little growing things”: “Says, says, says. It is said. I say. Who says! We say, man says, language says, being and existence say.” 73 “The inspired man,” writes the Mexican poet Octavio Paz in an essay on Breton, “the man who speaks the truth, says nothing that is his own: Through his mouth, it is the language that speaks.” 74

The language thus regains its primordial power, its creative force and Orphic value, which determine all true poetry, for, as Duits writes, poetry—which is born in the visionary experience—is nothing other than “the language of the gods.” There is nothing phantasmagoric, hallucinated, or illusory about this speech. “[W]ords are materializations of consciousness; language is a privileged vehicle of our relation to reality,” writes Munn. Because poetry carries the world, it is the language of power, a tool in the service of knowledge and action. The incantatory repetition of names, for example, an idea we have already touched upon in our discussion of prayer, acts upon the heart of the being. “The shaman has a conception of poesis in its original sense as an action: words themselves are medicine.” 75 The words—used in their sacred dimension—work toward the transmutation of being, the healing of the spirit, our development, but in order for it to be effective, the magic word must be born from a direct confrontation with the experience, because experience alone is a safe reserve for truth. Knowledge is not enough; only those who have eaten are in a position to understand, only those who have heard and seen are in a position to say. If speech goes farther than the eye, it is because it has the power of doing. “Though the psychedelic experience produced by the mushrooms is of heightened perceptivity,” Munn writes, “the I say is of privileged importance to the I see.” 76 Psychedelic speech is speech of power, revealing the spirit.

Darwin’s Pharmacy
by Richard M. Doyle
pp. 8-23

Rhetoric is the practice of learning and teaching eloquence, persuasion, and information architecture by revealing the choices of expression or interpretation open to any given rhetor, viewer, listener, or reader. Robert Anton Wilson offers a definition of rhetoric by example when he focuses on the word “reality” in his book Cosmic Trigger:

“Reality” is a word in the English language which happens to be (a) a noun and (b) singular. Thinking in the English language (and in cognate Indo-European languages) therefore subliminally programs us to conceptualize “reality” as one block-like entity, sort of like a huge New York skyscraper, in which every part is just another “room” within the same building. This linguistic program is so pervasive that most people cannot “think” outside it at all, and when one tries to offer a different perspective they imagine one is talking gibberish. (iii) […]

Mitchell’s vision offers perhaps an equally startling irony: it was only by taking on a literally extraterrestrial perspective that the moon walker overcame alienated perception.5 […]

“Thus, perception is not an object but rather the label for a nonlinear process involving an object, a percipient and information.” (Mitchell n.d.; emphasis mine) […]

Like the mind apprehending it, information “wants to be free” if only because it is essentially “not an object,” but rather “the label for a nonlinear process involving an object, a percipient and information.”6 It is worth noting that Mitchell’s experience induces a desire to comprehend, an impulse that is not only the desire to tell the story of his ecodelic imbrication but a veritable symptom of it.7 […]

What are psychedelics such that they seem to persuade humans of their interconnection with an ecosystem?

Terence McKenna’s 1992 book recursively answered this query with a title: Food of the Gods. Psychedelics, McKenna argued, were important vectors in the evolution of consciousness and spiritual practice. In his “shaggy primate story,” McKenna argued that psilocybin mushrooms were a “genome-shaping power” integral to the evolution of human consciousness. On this account, human consciousness—the only instance we know of where one part of the ecosystem is capable of reflecting on itself as a self and acting on the result—was “bootstrapped” by its encounter with the astonishing visions of high-dose psilocybin, an encounter with the Transcendental Other McKenna dubbed “a glimpse of the peacock angel.” Hence for McKenna, psychedelics are both a food fit for the gods and a food that, in scrambling the very distinction between food and drug, man and god, engenders less transcendence than immanence—each is recursively implicated, nested, in the other. […]

Evolutionarily speaking the emergence of widespread animal life on earth is not separable from a “mutualistic” economy of plants, pollinators, and seed dispersers.

The basis for the spectacular radiations of animals on earth today is clearly the resources provided by plants. They are the major primary producers, autotrophically energizing planet Earth…the new ecological relationships of flowering plants resulted in colonizing species with population structures conducive to rapid evolutionary change. (Price, 4)

And if mammalian and primate evolution is enmeshed in a systemic way with angiosperms (flowering plants), so too have humans and other primates been constantly constituted by interaction with plants. […]

Navigating our implication with both plants and their precipitates might begin, then, with the startling recognition of plants as an imbricated power, a nontrivial vector in the evolution of Homo sapiens, a power against which we have waged war. “Life is a rhizome,” wrote Carl Jung, our encrypted ecological “shadow” upon which we manifest as Homo sapiens, whose individuation is an interior folding or “involution” that increases, rather than decreases, our entanglement with any given ecosystem. […]

In other words, psychedelics are (a suppressed) part of evolution. As Italian ethnobotanist Giorgio Samorini put it, “the drug phenomenon is a natural phenomenon, while the drug problem is a cultural problem” (87). […]

Indeed, even DMT, an endogenous and very real product of the human brain, has been “scheduled” by the federal government. DMT would be precisely, by most first person accounts, “the most potent hallucinogen on sale in Haight or Ashbury or Telegraph Avenue” and is a very real attribute of our brains as well as plant ecology. We are all “holding” a Schedule One psychedelic—our own brains, wired for ecodelia, are quite literally against the law. […]

The first principle of harm reduction with psychedelics is therefore this: one must pay attention to set and setting, the organisms for whom and context in which the psychedelic experience unfolds. For even as the (re)discovery of psychedelics by twentieth-century technoscience suggested to many that consciousness was finally understandable via a molecular biology of the brain, this apex of reductionism also fostered the recognition that the effects of psychedelics depend on much more than neurochemistry.23 If ecodelics can undoubtedly provoke the onset of an extra-ordinary state of mind, they do so only on the condition of an excessive response-ability, a responsiveness to rhetorical conditions—the sensory and symbolic framework in which they are assayed. Psychologists Ralph Metzner and Timothy Leary made this point most explicitly in their discussion of session “programming,” the sequencing of text, sound, and sensation that seemed to guide, but not determine, the content of psychedelic experiences:

It is by now a well-known fact that psychedelic drugs may produce religious, aesthetic, therapeutic or other kinds of experiences depending on the set and setting…. Using programming we try to control the content of a psychedelic experience in specific desired directions. (5; reversed order)

Leary, Metzner, and many others have provided much shared code for such programming, but all of these recipes are bundled with an unavoidable but difficult to remember premise: an extraordinary sensitivity to initial rhetorical conditions characterizes psychedelic “drug action.” […]

Note that the nature of the psychedelic experience is contingent upon its rhetorical framing—what Leary, Metzner, and Richard Alpert characterized in The Psychedelic Experience as “the all-determining character of thought” in psychedelic experience. The force of rhetorical conditions here is immense—for Huxley it is the force linking premise to conclusion:

“No, I couldn’t control it. If one began with fear and hate as the major premise, one would have to go on to the conclusion.” (Ibid.)

Rhetorical technologies structure and enable fundamentally different kinds of ecodelic experiences. If the psychonaut “began” with different premises, different experiences would ensue.

pp. 33-37

Has this coevolution of rhetorical practices and humans ceased? This book will argue that psychedelic compounds have already been vectors of technoscientific change, and that they have been effective precisely because they are deeply implicated in the history of human problem solving. Our brains, against the law with their endogenous production of DMT, regularly go ecodelic and perceive dense interconnectivity. The human experience of radical interconnection with an ecosystem becomes a most useful snapshot of the systemic breakdowns between “autonomous” organisms necessary to sexual reproduction, and, not incidentally, they render heuristic information about the ecosystem as an ecosystem, amplifying human perception of the connections in their environment and allowing those connections to be mimed and investigated. This increased interconnection can be spurred simply by providing a different vision of the environment. Psychologist Roland Fischer noted that some aspects of visual acuity were heightened under the influence of psilocybin, and his more general theory of perception suggests that this acuity emerges out of a shift in sensory-motor ratios.

For Fischer the very distinction between “hallucination” and “perception” resides in the ratio between sensory data and motor control. Hallucination, for Fischer, is that which cannot be verified in three-dimensional Euclidean space. Hence Fischer differentiates hallucination from perception based not on truth or falsehood, but on a capacity to interact: if a subject can interact with a sensation, and at least work toward verifying it in their lived experience, navigating the shift in sensory-motor ratios, then the subject has experienced something on the order of perception. Such perception is easily fooled and is often false, but it appears to be sufficiently connective to our ecosystems to allow for human survival and sufficiently excitable for sexually selected fitness. If a human subject cannot interact with a sensation, Fischer applies the label “hallucination” for the purpose of creating a “cartography of ecstatic states.”

Given the testimony of psychonauts about their sense of interconnection, Fischer’s model suggests that ecodelic experience tunes perception through a shift of sensory-motor ratios toward an apprehension of, and facility for, interconnection: the econaut becomes a continuum between inside and outside. […] speech itself might plausibly emerge as nothing other than a symptom and practice of early hominid use of ecodelics.

pp. 51-52

It may seem that the visions—as opposed to the description of set and setting or even affect and body load—described in the psychonautic tradition elude this pragmatic dynamic of the trip report. Heinrich Klüver, writing in the 1940s, and Benny Shanon, writing in the early twenty-first century, both suggest that the forms of psychedelic vision (for mescaline and ayahuasca respectively) are orderly and consistent even while they are indescribable. Visions, then, would seem to be messages without a code (Barthes) whose very consistency suggested content.

Hence this general consensus on the “indescribableness” (Ellis) of psychedelic experience still yields its share of taxonomies as well as the often remarkable textual treatments of the “retinal circus” that has become emblematic of psychedelic experience. The geometric, fractal, and arabesque visuals of trip reports would seem to be little more than pale snapshots of the much sought after “eye candy” of visual psychedelics such as LSD, DMT, 2C-I, and mescaline. Yet as deeply participatory media technologies, psychedelics involve a learning curve capable of “going with” and accepting a diverse array of phantasms that challenge the beholder and her epistemology, ontology, and identity. Viewed with the requisite detachment, such visions can effect transformation in the observing self, as it finds itself nested within an imbricated hierarchy: egoic self observed by ecstatic Atman which apprehends itself as Brahman reverberating and recoiling back onto ego. Many contemporary investigators of DMT, for example, expect and often encounter what Terence McKenna described as the “machine elves,” elfin entities seemingly tinkering with the ontological mechanics of an interdimension, so much so that the absence of such entities is itself now a frequent aspect of trip reportage and skeptics assemble to debunk elfin actuality (Kent 2004).

p. 63

While synesthesia is classically treated as a transfer or confusion of distinct perceptions, as in the tactile and gustatory conjunction of “sharp cheese,” more recent work in neurobiology by V. S. Ramachandran and others suggests that this mixture is fundamental to language itself—the move from the perceptual to the signifying, in this view, is itself essentially synesthetic. Rather than an odd symptom of a sub-population, then, synesthesia becomes fundamental to any act of perception or communication, an attribute of realistic perception rather than a pathological deviation from it.

pp. 100-126

Rhetorical practices are practically unavoidable on the occasion of death, and scholars in the history of rhetoric and linguistics have both opined that it was as a practice of mourning that rhetoric emerged as a recognizable and repeatable practice in the “West.” […] It is perhaps this capacity of some rhetorical practices to induce and manage the breakdown of borders—such as those between male and female, life and death, silence and talk—that deserves the name “eloquence.” Indeed, the Oxford English Dictionary reminds us that it is the very difference between silence and speech that eloquence manages: a. Fr. éloquent, ad. L. ēloquent-em, pr. pple., f. ēloquī to speak out.2 […]

And despite Huxley’s concern that such an opening of the doors of (rhetorical) perception would be biologically “useless,” properly Darwinian treatments of such ordeals of signification would place them squarely within the purview of sexual selection—the competition for mates. If psychedelics such as the West African plant Iboga are revered for “breaking open the head,” it may be because we are rather more like stags butting heads than we are ordinarily comfortable putting into language (Pinchbeck 2004, cover). And our discomfort and fascination ensue, because sexual selection is precisely where sexual difference is at stake rather than determined. A gradient, sexuality is, of course, not a binary form but is instead an enmeshed involutionary zone of recombination: human reproduction takes place in a “bardo” or between space that is neither male nor female nor even, especially, human. Indeed, sex probably emerged as a technique for exploring the space of all possible genotypes, breaking the symmetry of an asexual reproduction and introducing the generative “noise” of sexuality with which Aldous Huxley’s flowers resonated. In this context, psychedelics become a way of altering the context of discursive signaling within which human reproduction likely evolved, a sensory rather than “extra-sensory” sharing of information about fitness.

Doctors of the Word

In an ecstatic treatment of Mazatec mushroom intoxication, Henry Munn casts the curanderas as veritable Sophists whose inebriation is marked by an incessant speaking:

The shamans who eat them, their function is to speak, they are the speakers who chant and sing the truth, they are the oral poets of their people, the doctors of the word, they who tell what is wrong and how to remedy it, the seers and oracles, the ones possessed by the voice. (Munn, 88)

Given the contingency of psychedelic states on the rhetorical conditions under which they are used, it is perhaps not surprising that the Mazatec, who have used the “little children” of psilocybin for millennia, have figured out how to modulate and even program psilocybin experience with rhetorical practices. But the central role enjoyed by rhetoricians here—those doctors of the word—should not obscure the difficulty of the shaman/rhetorician’s task: “possessed by the voice,” such curanderas less control psychedelic experience than consistently give themselves over to it. They do not wield ecstasy, but are taught by it. Munn’s mushroom Sophists are athletes of “negative capability,” nineteenth-century poet John Keats’s term for the capacity to endure uncertainty. Hence the programming of ecodelic experience enables not control but a practiced flexibility within ritual, a “jungle gym” for traversing the transhuman interpellation. […]

Fundamental to shamanic rhetoric is the uncertainty clustering around the possibility of being an “I,” an uncertainty that becomes the very medium in which shamanic medicine emerges. While nothing could appear more straightforward than the relationship between the one who speaks and the subject of the sentence “I speak,” Munn writes, sampling Heraclitus, “It is not I who speak…it is the logos.” This sense of being less in dialogue with a voice than a conduit for language itself leads Munn toward the concept of “ecstatic signification.”

Language is an ecstatic activity of signification…. Intoxicated by the mushrooms, the fluency, the ease, the aptness of expression one becomes capable of are such that one is astounded by the words that issue forth from the contact of the intention of articulation with the matter of experience. At times it is as if one were being told what to say, for the words leap to mind, one after another, of themselves without having to be searched for: a phenomenon similar to the automatic dictation of the surrealists except that here the flow of consciousness, rather than being disconnected, tends to be coherent: a rational enunciation of meanings. Message fields of communication with the world, others, and one’s self are disclosed by the mushrooms. (Ibid., 88-89)

If these practices are “ecstatic,” they are so in the strictest of fashions. While recent usage tends to conjoin the “ecstatic” with enjoyment, its etymology suggests an ontological bifurcation—a “being beside oneself” in which the very location, if not existence, of a self is put into disarray and language takes on an unpredictable and lively agency: “words leap to mind, one after another.”3 This displacement suggests that the shaman hardly governs the speech and song she seemingly produces, but is instead astonished by its fluent arrival. Yet this surprise does not give way to panic, and the intoxication increases rather than retards fluency—if anything, Munn’s description suggests that for the Mazatec (and, perhaps, for Munn) psilocybin is a rhetorical adjunct that gives the speaker, singer, listener, eater access to “message fields of communication.” How might we make sense of this remarkable claim? What mechanisms would allow a speaker to deploy intoxication for eloquence?

Classically speaking, rhetoric has treated human discourse as a tripartite affair, a threefold mixture of ethos, an appeal based on character; logos, an appeal based on the word; and pathos, an appeal to or from the body.4 Numerous philosophers and literary critics since Jacques Derrida have decried the Western fascination with the logos, and many scholars have looked to the rich traditions of rhetoric for modalities associated with other offices of persuasion, deliberation, and transformation. But Munn’s account asks us to recall yet another forgotten rhetorical practice—a pharmacopeia of rhetorical adjuncts drawn from plant, fungus, and geological sources. In the context of the Mazatec, the deliberate and highly practiced ingestion of mushrooms serves to give the rhetor access not to individually created statements or acts of persuasion, but to “fields” of communication where rhetorical practice calls less for a “subject position” than it does a capacity to abide multiplicity—the combination and interaction, at the very least, of human and plant.

Writer, philosopher, and pioneering psychonaut Walter Benjamin noted that his experiments with hashish seemed to induce a “speaking out,” a lengthening of his sentences: “One is very much struck by how long one’s sentences are” (20). Longer sentences, of course, are not necessarily more eloquent in any ordinary sense than short ones, since scholars, readers, and listeners find that eloquence inheres in a response to any given rhetorical context. Indeed, Benjamin’s own telegraphic style in his hashish protocols becomes extraordinary, rare, and paradoxical given his own claim for long sentences in a short note. Yet Benjamin’s account does remind us that ecodelics often work on and with the etymological sense of “eloquence,” a “speaking out,” an outburst of language, a provocation to language. Benjamin reported that it was through language that material forms could be momentarily transformed: “The word ‘ginger’ is uttered and suddenly in place of the desk there is a fruit stand” (ibid., 21).

And yet if language and, indeed, the writing table, is the space where hashish begins to resonate for Benjamin, it does so only by making itself available to continual lacunae, openings and closings where, among other things, laughter occurs. For precisely as they are telegraphic, the hashish protocols of Benjamin create a series of non sequiturs: […]

Hashish, then, is an assassin of referentiality, inducing a butterfly effect in thought. In Benjamin, cannabis induces a parataxis wherein sentences less connect to each other through an explicit semantics than resonate together and summon coherence in the bardos between one statement and another. It is the silent murmur between sentences that is consistent while the sentences continually differentiate until, through repetition, an order appears: “You follow the same paths of thought as before. Only, they appear strewn with roses.”

For a comparable practice in classical rhetoric linking “intoxication” with eloquence, we return to Delphi, where the oracles made predictions persuasive even to the always skeptical Socrates, predictions whose oracular ecodelic speech was rendered through the invisible but inebriating “atmosphere” of ethylene gases—a geological rhetoric. Chemist Albert Hofmann, classicist Carl Ruck, ethnobotanist Jonathan Ott, and others have made a compelling case that at Eleusis, where Socrates, well before Bartleby, “preferred not” to go, the Greek Mysteries were delivered in the context of an ecodelic beverage, perhaps one derived from fermented grain or the ergot-laden sacrament kykeon, chemically analogous to LSD.5 These Mystery rites occasioned a very specific rhetorical practice—silence—since participants were forbidden from describing the kykeon or its effects. But silence, too, is a rhetorical practice, and one can notice that such a prohibition functions rhetorically not only to repress but also to intensify a desire to “speak out” of the silence that must come before and after Eleusis.

And Mazatec curandera Maria Sabina is explicit that indeed it is not language or even its putative absence, silence, that is an adjunct or “set and setting” for the mushrooms. Rather, the mushrooms themselves are a languaging, eloquence itself, a book that presents itself and speaks out:

At other times, God is not like a man: He is the Book. A Book that is born from the earth, a sacred Book whose birth makes the world shake. It is the Book of God that speaks to me in order for me to speak. It counsels me, it teaches me, it tells me what I have to say to men, to the sick, to life. The Book appears and I learn new words.6

Crucial to this “speaking” is the way in which Maria Sabina puts it. Densely interactive and composed of repetition, the rhetorical encounter with the mushroom is more than informative; it is pedagogical and transformative: “The Book appears and I learn new words.” The earth shakes with vitality, manifesting the mushroom orator.7 Like any good teacher, the mushrooms work with rhythms, repetitions that not only reinforce prior knowledge but induce one to take leave of it. “It counsels me, it teaches me.” The repetition of which and through which Maria Sabina speaks communicates more than knowledge, but allows for its gradual arrival, a rhythm of coming into being consonant and perhaps even resonant with the vibrations of the Earth, that scene of continual evolutionary transformation.

More than a supplement or adjunct to the rhetor, the mushroom is a transformer. Mary Barnard maps out a puppetry of flesh that entails becoming a transducer of the mushroom itself: “The mushroom-deity takes possession of the shaman’s body and speaks with the shaman’s lips. The shaman does not say whether the sick child will live or die; the mushroom says” (248).

Nor are reports of psilocybin’s effects as a rhetorical adjunct peculiar to Munn or even the Mazatec tradition. Over a span of ten years, psychologist Roland Fischer and his colleagues at Ohio State University tested the effects of psilocybin on linguistic function. Fischer articulated “the hallucination-perception continuum,” wherein hallucinations would be understood less as failed images of the real than virtual aspects of reality not verifiable in the “Euclidean” space projected by the human sensorium. Fischer, working with the literary critic Colin Martindale, located in the human metabolism of psilocybin (and its consequent rendering into psilocin) linguistic symptoms isomorphic to the epics of world literature. Psilocybin, Fischer and Martindale argued, provoked an increase in the “primary process content” of writing composed under the influence of psilocybin. Repetitious and yet corresponding to the very rhetorical structure of epics, psilocybin can thus be seen to be a prima facie adjunct to an epic eloquence, a “speaking out” that leaves rhetorical patterns consistent with the epic journey (Martindale and Fischer).

And in this journey, it is often language itself that is exhausted—there is a rhythm in the epic structure between the prolix production of primary process content and its interruption. Sage Ramana Maharshi described mouna, a “state which transcends speech and thought,” as the state that emerges only when “silence prevails.” […]

A more recent study conducted of high-dose psilocybin experience among international psychonauts suggested that over 35 percent of subjects heard what they called “the logos” after consuming psilocybin mushrooms.

Based on the responses to the question of the number of times psilocybin was taken, the study examined approximately 3,427 reported psilocybin experiences (n = 118). Of the total questionnaire responses (n = 128), 35.9% (n = 46) of the participants reported having heard a voice(s) with psilocybin use, while 64.0% (n = 82) of the participants stated that they had not. (Beach) […]

Inevitably, this flow fluctuates between silence and discourse. Michaux’s experiments with psychedelics rendered the now recognizable symptoms of graphomania, silence, and rhetorical amplification. In Miserable Miracle, one of the three books Michaux wrote “with mescaline,” Michaux testifies to a strange transformation into a Sophist:

For the first time I understood from within that animal, till now so strange and false, that is called an orator. I seemed to feel how irresistible must be the propensity for eloquence in certain people. Mesc. acted in such a way that it gave me the desire to make proclamations. On what? On anything at all. (81)11

Hence, while their spectrum of effects is wide ranging and extraordinarily sensitive to initial rhetorical conditions, psychedelics are involved in an intense inclination to speak unto silence, to write and sing in a time not limited to the physical duration of the sacramental effect, and this involvement with rhetorical practice—the management of the plume, the voice, and the breath—appears to be essential to the nature of psychedelics; they are compounds whose most persistent symptoms are rhetorical. […]

Crucial to Krippner’s analysis, though, is the efficacy of psychedelics in peeling away these strata of rhetorical practice. As some layers of perception wither, others are amplified:

In one experiment (Jarvik et al. 1955), subjects ingested one hundred micrograms of LSD and demonstrated an increase in their ability to quickly cancel out words on a page of standardized material, but a decreased ability to cancel out individual letters. The drug seemed to facilitate the perceptions of meaningful language units while it interfered with the visual perception of non-meaningful ones. (Krippner, 220)

Krippner notes that the LSD functioned here as a perceptual adjunct, somehow tuning the visual perception toward increased semantic and hence rhetorical efficacy. This intensified visual perception of language no doubt yielded the familiar swelling of font most associated with psychedelic art and pioneered by the psychedelic underground press (such as the San Francisco Oracle). By amplifying the visual aspect of font—whose medium is the psychedelic message—this psychedelic innovation remixes the alphabet itself, as more information (the visual, often highly sensory swelling of font) is embedded in a given sequence of (otherwise syntactic and semantic) symbols. More information is compressed into font precisely by working with the larger-scale context of any given message rather than its content. This apprehension of larger-scale contexts for any given data may be the very signature of ecodelic experience. Krippner reports that this sensory amplification even reached dimensional thresholds, transforming texts:

Earlier, I had tasted an orange and found it the most intense, delightful taste sensation I had ever experienced. I tried reading a magazine as I was “coming down,” and felt the same sensual delight in moving my eye over the printed page as I had experienced when eating the orange. The words stood out in three dimensions. Reading had never been such a sheer delight and such a complete joy. My comprehension was excellent. I quickly grasped the intent of the author and felt that I knew exactly what meaning he had tried to convey. (221)

Rather than a cognitive modulation, then, psychedelics in Krippner’s analysis seem to affect language function through an intensification of sensory attention on and through language, “a complete joy.” One of Krippner’s reports concerned a student attempting to learn German. The student reported becoming fascinated with the language in a most sensory fashion, noting that it was the “delicacy” of the language that allowed him to, well, “make sense” of it and indulge his desire to “string” together language:

The thing that impressed me at first was the delicacy of the language.…Before long, I was catching on even to the umlauts. Things were speeding up like mad, and there were floods of associations.…Memory, of course, is a matter of association and boy was I ever linking up to things! I had no difficulty recalling words he had given me—in fact, I was eager to string them together. In a couple of hours after that, I was even reading some simple German, and it all made sense. (Ibid.)

Krippner reports that by the end of his LSD session, the student “had fallen in love with German” (222). Krippner rightly notes that this “falling” is anything but purely verbal, and hypothesizes that psychedelics are adjuncts to “non-verbal training”: “The psychedelic session as non-verbal training represents a method by which an individual can attain a higher level of linguistic maturity and sophistication” (225).

What could be the mechanism of such a “non-verbal” training? The motor-control theory of language suggests that language is bootstrapped and developed out of the nonlinguistic rhythms of the ventral premotor system, whose orderly patterns provided the substrate of differential repetition necessary to the arbitrary configuration and reconfiguration of linguistic units. Neuroscientist V. S. Ramachandran describes the discovery of “mirror neurons” by Giacomo Rizzolatti. Rizzolatti

recorded from the ventral premotor area of the frontal lobes of monkeys and found that certain cells will fire when a monkey performs a single, highly specific action with its hand: pulling, pushing, tugging, grasping, picking up and putting a peanut in the mouth etc. different neurons fire in response to different actions. One might be tempted to think that these are motor “command” neurons, making muscles do certain things; however, the astonishing truth is that any given mirror neuron will also fire when the monkey in question observes another monkey (or even the experimenter) performing the same action, e.g. tasting a peanut! (Ramachandran)

Here the distinction between observing and performing an action is confused, as watching a primate pick up a peanut becomes indistinguishable from picking up the peanut, at least from the perspective of an EEG. Such neurological patterns are not arbitrary, linked as they are to the isomorphic patterns that are the developmentally articulated motor control system of the body. This may explain how psychedelics can, according to Krippner, allow for the perceptual discernment of meaningful units. By releasing the attention from the cognitive self or ego, human subjects can focus their attention on the orderly structures “below” conscious awareness and distributed across their embodiment and environments. Robin Allott has been arguing for the motor theory of language evolution since the 1980s:

In the evolution of language, shapes or objects seen, sounds heard, and actions perceived or performed, generated neural motor programs which, on transfer to the vocal apparatus, produced words structurally correlated with the perceived shapes, objects, sounds and actions. (1989)

These perceived shapes, objects, sounds, and actions, of course, include the sounds, smells, visions, and actions continually transmitted by ecosystems and the human body itself, and by focusing the attention on them, we browse for patterns not yet articulated by our embodiment. Significantly, as neuroscientist Ramachandran points out, this “mirror neuron” effect seems to occur only when other living systems are involved:

When people move their hands a brain wave called the MU wave gets blocked and disappears completely. Eric Altschuler, Jamie Pineda, and I suggested at the Society for Neurosciences in 1998 that this suppression was caused by Rizzolatti’s mirror neuron system. Consistent with this theory we found that such a suppression also occurs when a person watches someone else moving his hand but not if he watches a similar movement by an inanimate object.

Hence, in this view, language evolves and develops precisely by nonverbal means in interaction with other living systems, as the repetitions proper to language iterate on the basis of a prior repetition—the coordinated movements necessary to survival that are coupled to neurological patterns and linked to an animate environment. By blocking the “throttling embrace of the self,” ecodelics perhaps enable a resonance between the mind and nature not usually available to the attention. This resonance creates a continuum between words and things even as it appears to enable the differentiation between meaningful and nonmeaningful units: […]

This continuum between the abstract character of language and its motor control system is consistent with Krippner’s observation that “at the sensory level, words are encoded and decoded in highly unusual ways” (238). This differential interaction with the sensory attributes of language includes an interaction with rhythms and puns common to psychedelic experience, a capacity to become aware of a previously unobserved difference and connection. Puns are often denounced as, er, punishing a reader’s sense of taste, but in fact they set up a field of resonance and association between previously distinct terms, a nonverbal connection of words. In a highly compressed fashion, puns transmit novel information in the form of a meshed relation between terms that would otherwise remain, often for cultural or taboo reasons, radically distinct.12 This punning involves a tuning of a word toward another meaning, a “troping” or bending of language toward increased information through nonsemantic means such as rhyming. This induction of eloquence and its sensory perception becomes synesthetic as an oral utterance becomes visual: […]

Hence, if it is fair to characterize some psychedelic experiences as episodes of rhetorical augmentation, it is nonetheless necessary to understand rhetoric as an ecological practice, one which truly works with all available means of persuasion (Aristotle), human or otherwise, to increase the overall dissipation of energy in any given ecology. One “goes for broke,” attempting the hopeless task of articulating psychedelics in language until exhausting language of any possible referential meaning and becoming silent. By locating “new” information only implicit in a given segment of language and not semantically available to awareness, a pun increases the informational output of an ecosystem featuring humans. This seems to feedback, […]

Paired with an apprehension of the logos, this tuning in to ecodelia suggests that in “ego death,” many psychonauts experience a perceived awareness of what Vernadsky called the noösphere, the effects of their own consciousness on their ecosystem, about which they incessantly cry out: “Will we listen in time?”

In the introduction, I noted that the ecodelic adoption of this non-local and hence distributed perspective of the biosphere was associated with the apprehension of the cosmos as an interconnected whole, and with the language of “interpellation” I want to suggest that this sense of interconnection often appears in psychonautic testimony as a “calling out” by our evolutionary context. […]

The philosopher Louis Althusser used the language of “interpellation” to describe the function of ideology and its purchase on an individual subject to it, and he treats interpellation as precisely such a “calling out.” Rather than a vague overall system involving the repression of content or the production of illusion, ideology for Althusser functions through its ability to become an “interior” rhetorical force that is the very stuff of identity, at least any identity subject to being “hailed” by any authority it finds itself response-able to. I turn to that code commons Wikipedia for Althusser’s most memorable treatment of this concept:

Memorably, Althusser illustrates this with the concept of “hailing” or “interpellation.” He uses the example of an individual walking in a street: upon hearing a policeman shout “Hey you there!”, the individual responds by turning round, and in this simple movement of his body he is transformed into a subject. The person being hailed recognizes himself as the subject of the hail, and knows to respond.14

This sense of “hailing” and unconscious “turning” is appropriate to the experience of ecodelic interconnection I am calling “the transhuman interpellation.” Shifting back and forth between the nonhuman perspectives of the macro and the micro, one is hailed by the tiniest of details or largest of overarching structures as reminders of the way we are always already linked to the “evolutionary heritage that bonds all living things genetically and behaviorally to the biosphere” (Roszak et al., 14). And when we find, again and again, that such an interpellation by a “teacher” or other plant entity (à la the logos) is associated not only with eloquence but also with healing,15 we perhaps aren’t surprised by a close-up view of the etymology of “healing.” The Oxford English Dictionary traces it from the Teutonic “heilen,” which links it to “helig” or “holy.” And the alluvial flow of etymology connects “hailing” and “healing” in something more than a pun:

A Com. Teut. vb.: OE. hǽlan = OFris. hêla, OS. hêlian (MDu. hêlen, heilen, Du. heelen, LG. helen), OHG. heilan (Ger. heilen), ON. heila (Sw. hela, Da. hele), Goth. hailjan, deriv. of hail-s, OTeut. *hailo-z, OE. hál: see HALE, WHOLE.16

Hailed by the whole, one can become healed through ecodelic practice precisely because the subject turns back on who they thought they were, becoming aware of the existence of a whole, a system in which everything “really is” connected—the noösphere. Such a vision can be discouraging and even frightening to the phantasmically self-birthed ego, who feels not guilt but a horror of exocentricity. It appears impossible to many of us that anything hierarchically distinct from, larger than, and more complex than Homo sapiens—such as Gaia—could exist, and so we often cry out as one in the wilderness, in amazement and repetition.

Synesthesia, and Psychedelics, and Civilization! Oh My!
Were cave paintings an early language?

Choral Singing and Self-Identity
Music and Dance on the Mind
Development of Language and Music
Spoken Language: Formulaic, Musical, & Bicameral
“Beyond that, there is only awe.”
“First came the temple, then the city.”
The Spell of Inner Speech
Language and Knowledge, Parable and Gesture