Winter Season and Holiday Spirit

The Carnival season has ended and Lent is upon us. But Christmas has still been on my mind for some reason. There is something about the winter holiday season in general. I’m not a big fan of Christmas. It hasn’t excited me much since childhood. Even as a kid, all that Christmas meant was lots of presents on a particular day. Christmas follows directly after my birthday and so nothing about Christmas itself stood out to me.

I do somewhat get into the winter holiday mood because, as holidays go, Christmas sure is hard to ignore. My mother has always gone to great lengths to decorate. And we usually get together as a family. It helps having my nieces and nephew around on Christmas morning. It’s not the same without little children getting excited about gifts under the tree. All of that is nice, if only to see family. It’s just that there isn’t much Christmas tradition in my family. The closest we get to that is decorating the Christmas tree, as we all have our own ornaments. And we do eat potato soup as a family meal, typically on Christmas Eve. But we don’t sing Christmas carols together or anything. Christmas simply happens, with family convening and then dispersing soon after.

This came to mind when I heard “God Rest Ye Merry Gentlemen”. It’s the version done by Annie Lennox. The style of the song and the imagery of the video make for an enjoyable combo, capturing a sense of old-time mystery and touching on the pagan origins of the holiday season. It’s one of the oldest carols in the English tradition, although the present lyrics were not fully written down until recent centuries. There are multiple versions of the carol. The origins are obscure and the original version is unknown. The tune itself is much older, apparently going back to France and Germany. It very well might predate the spread of Christianity in Europe or else was a product of the surviving pagan wassailing tradition. Other songs are sung to the same tune, such as the “Sussex Sugar Wassail” and “Chestnut or Jack Doves Figary”.

All of that is fascinating. There is a long cultural and religious history behind winter holiday traditions and celebrations. It seems to have always been an important time of year. Somewhere between fall and spring equinoxes, one year is considered to have ended and another to have begun, the precise month and day differing between calendrical systems, but generally it corresponds to the period between harvest and planting. The central theme is that of transition and a loosening of boundaries between not just years and seasons but between this world and another, along with a loosening of the bounds of the social order. Things are brought closer together. Spirits, ghosts, gods, and Santa Claus are let loose to roam the human world.

This is why the custom of wearing masks was common from Halloween to Mardi Gras, including a masking tradition around Christmastime. Masks served many purposes. They hid your identity from those non-human beings, protecting you from harm. But sometimes the masks were meant to represent those very same beings, even one’s own ancestors. In general, masking and guising give one a new identity. Individuals could temporarily be someone else, of a different class or social role, and so act in ways not otherwise allowed.

With this come revelry and reversal, along with licentiousness and transgression, drunkenness and bawdiness, fun and games, song and dance, feasting and festival. It is a time for celebration of this year’s harvest and blessing of next year’s harvest. Bounty and community. Death and rebirth. The old year must be brought to a close and the new year welcomed. This is the period when gods, ancestors, spirits, and demons must be solicited, honored, appeased, or driven out. The noise of song, gunfire, and such serves many purposes.

In the heart of winter, some of the most important religious events took place. This includes Christmas, of course, but also the various celebrations around the same time. A particular winter festival season began on All Hallows’ Eve (i.e., Halloween) and ended with Twelfth Night. This included carnival-like revelry and a Lord of Misrule. There was also the tradition of going house to house, of singing and pranks, of demanding treats or gifts, with threats if they weren’t forthcoming. It was a time of community and sharing, and those who didn’t willingly participate might be punished. Winter, a harsh time of need, was when the group took precedence.

This is when Jesus was born to a virgin, not to mention the birth of many other salvific gods and resurrection godmen. Jesus’ coming into the world was humble and with him came a message of hope but also of inversion, the powerful brought down low and the meek lifted up. Christianity inherited much from other religions that also placed great importance on the solstice, the greatest darkness before the return of the light, the liminal moment of time stopping and the sun reversing its course.

Two examples of virgin-born godmen are Mithras and Attis. Like Santa Claus, both wore a Phrygian cap, sometimes referred to as the liberty cap because of conflation with the Roman pileus that was worn by emancipated slaves (the pileus was worn during Saturnalia, a solstice celebration). An important detail is that St. Paul came from Tarsus, a place of origin for Mithras worship, which had risen to prominence there in the century before his birth, and so he certainly would have recognized the similarities to Christianity. Mithraism had been among the most widespread religions in the Roman world before Christianity came to dominate under Constantine.

By the way, there is also an intriguing theory about the psychedelic mushroom known as the fly agaric, similar to the liberty cap. It grows under pine trees, is eaten by reindeer that then leap around, and was supposedly used by Siberian shamans, who were said to enter dwellings through the smoke hole. Some consider this to be the origin of much of the Christmas imagery.

Besides this, trees in general play a central role. Along with Christmas trees, there is the tradition of wassailing to the elder tree in an orchard, where a spirit was thought to dwell. Trees, of course, are an ancient symbol of the axis mundi, upon which the world turned, and are closely associated with the death and resurrection of gods and godmen. Also, the liberty pole became a central symbol of revolution, including during the American Revolution, and sometimes would have a Phrygian cap or pileus on top of it. The word ‘revolution’ came from astronomy and referred to cycles, a returning. It’s interesting to note that the Boston Tea Party involved masking and occurred on the eve of Saturnalia.

I’m also reminded of Santa Claus as St. Nick. This invokes an image of jollity and generosity. And this connects to wintertime as a period of community need and interdependence, of sharing and gifting, of hospitality and kindness. This includes the enforcement of social norms, which could easily transform into the challenging of those norms.

It’s maybe in this context we should think of the masked vigilantes participating in the Boston Tea Party. Like carnival, there had developed a tradition of politics out-of-doors, often occurring on the town commons. And on those town commons, large trees became identified as liberty trees — under which people gathered, upon which notices were nailed, and sometimes where effigies were hung. This was an old tradition that originated in Northern Europe, where a tree was the center of a community, the place of law-giving and community decision-making. In Europe, the commons had become the place of festivals and celebrations, such as carnival. And so the commons came to be the site of revolutionary fervor as well.

The most famous Liberty Tree was a great elm near Boston Common. Many consider it the birthplace of the American Revolution, as it was the site of early acts of defiance. This is where the Sons of Liberty met, organized, and protested. This would eventually lead to that even greater act of defiance on Saturnalia eve, the Boston Tea Party. One of the participants in the Boston Tea Party and later in the Revolutionary War, Samuel Sprague, is buried in the Boston Common.

There is something many don’t understand about the American Revolution. It wasn’t so much a fight against oppression in general and certainly not about mere taxation in particular. What angered those Bostonians and many other colonists was that they had become accustomed to community-centered self-governance and this was being challenged. The tea tax wasn’t just an imposition of imperial power but also of colonial corporatism. The East India Company was not acting as a moral member of the community, taking advantage by monopolizing trade. Winter had long been the time of year when bad actors in the community would be punished. Selfishness was not to be tolerated.

Those Boston Tea Partiers were simply teaching a lesson about the Christmas spirit. And in the festival tradition, they chose the guise of Native Americans, which to their minds would have symbolized freedom and an inversion of power. What revolution meant to them was a demand for the return of what had been taken from them, making the world right again. It was revelry with a purpose.

* * *

Trickster Makes This World:
Mischief, Myth and Art
by Lewis Hyde
pp. 188-189

Where we value the old world, carnival’s conservative function is one of its virtues, of course. The dirt ritual protects us against our own exclusions, like a kind of vaccination, and in that manner offers a stability that is lively and not particularly violent. After all, it is not just night-crowing cocks who end up dead when violence is the only way for the dominant order to protect itself. Beware the social system that cannot laugh at itself, that responds to those who do not know their place by building a string of prisons.

Where change is not in order, then, ritual dirt-work offers the virtue of non-violent stability. But where change is in order, dirt-work also has a role to play, for it simply isn’t true that these rituals are always conservative. Dirt rituals may stabilize things for years on end, but when the order is in fundamental crisis these rituals can become the focal points for change, catalytic moments for dirt’s revaluation and true structural shifts. Every so often Fat Tuesday does leak over into Lean Wednesday, and into the rest of the year as well. Regular dirt rituals are like nodes on a shoot of bamboo, repeating year after year to strengthen the growing stalk, but then, when conditions demand it, splitting open to produce new growth.

Historians have recently provided us with a number of specific cases that demonstrate this general model. It now seems clear, for example, that carnival’s ritual debasing of the Pope played a key role in the Reformation in Germany. The ritual container broke, the pollution leaked out, and the Church itself was fundamentally altered. It seems clear also that play with gender roles has sometimes leapt the fences of ritual. The historian Natalie Zemon Davis has argued that the gender reversals of various early modern European festivals served to “undermine as well as reinforce” prevailing social structures. The carnival image of unruly women, normally the object of joking and play, sometimes turned out “to sanction riot and political disobedience for both men and women in a society that allowed the lower orders few formal means of protest.” Davis is well aware that letting carnival’s “woman-on-top” have power during the holidays usually served to keep women on the bottom when the holidays were over, but once such an image exists it is hard to control, and this one sometimes also “promoted resistance,” “kept open an alternate way of conceiving family structure,” and served as “a resource for feminist reflection on women’s capacities.”

I assume that trickster tales serve an analogous double role; usually they bring harmless release, but occasionally they authorize moments of radical change. The tales themselves, at least, declare the latter point: the character who can freely play with dirt, they say, is also the culture hero who brings fundamental change.

Dancing in the Streets:
A History of Collective Joy
by Barbara Ehrenreich
pp. 89-90

The widespread occurrence of mocking rituals would almost suggest some human, or at least plebeian, instinct to playfully overthrow the existing order—whether as a way of harmlessly letting off steam or, at some level of consciousness, rehearsing for the real thing. Many of the mocking rituals associated with European carnival centered on a king of fools, a costumed character who probably first appeared in the Church-sanctioned Feast of Fools. If anything illustrates the ambivalence of the Church toward festive behavior, it was this event, which was initiated by the lower-level clergy—deacons, subdeacons, and priests—who comprised the Church’s internal lower class. This feast, described by Chambers as “largely an ebullition of the natural lout beneath the cassock,” originally took place inside churches between Christmas and New Year’s. The participating clergy dressed absurdly—in women’s clothes or their own clothes worn inside out—and performed a noisy burlesque of the mass, with sausages replacing the priest’s censer, or with “stinking smoke from the soles of old shoes” instead of incense, and “wanton songs” and gibberish substituting for the usual Latin incantations.23 As one disapproving contemporary described the scene: “They run and leap through the church, without a blush at their own shame. Finally they drive about the town … and rouse the laughter of their fellows and the bystanders in infamous performances, with indecent gestures and verses scurrilous and unchaste.”

pp. 101-102

Protestantism, serving as the ideological handmaiden of the new capitalism, “descended like a frost on the life of ‘Merrie Old England,’” as Weber put it, destroying in its icy grip the usual Christmas festivities, the maypole, the games, and all traditional forms of group pleasure.13 But this account downplays the importance of festivities as a point of contention in their own right, quite apart from their perceived economic effects. Without question, industrial capitalism and Protestantism played a central role in motivating the destruction of carnival and other festivities. There was another factor, though, usually neglected in the economic-based accounts: To elites, the problem with festivities lay not only in what people were not doing—that is, working—but in what they were doing, that is, in the nature of the revelry itself. In the sixteenth century, European authorities (secular and ecclesiastical, Catholic as well as Protestant) were coming to fear and disdain the public festivities that they themselves had once played starring roles in—to see them as vulgar and, more important, dangerous.

p. 103

There is probably no general and universal answer, though, to the question of whether carnival functioned as a school for revolution or as a means of social control. We do not know how the people themselves construed their festive mockeries of kings and priests, for example—as good-natured mischief or as a kind of threat. But it is safe to say that carnival increasingly gains a political edge, in the modern sense, after the Middle Ages, from the sixteenth century on, in what is known today as the early modern period. It is then that large numbers of people begin to use the masks and noises of their traditional festivities as a cover for armed rebellion, and to see, perhaps for the first time, the possibility of inverting hierarchy on a permanent basis, and not just for a few festive hours.

p. 165

Let us begin with carnival and other, somewhat secular festivities brought by Europeans to the Americas. These celebrations, which Europeans expected to carry on as vigorously—if not more vigorously—in the “new” world as in the old, posed an immediate problem in the colonial setting: What about the slaves? When Europeans caroused or simply feasted, there were always dark faces watching, waiting for some particle of generosity to come their way, or waiting perhaps for some moment of weakness to present an opportunity for revolt. In Protestant settings, such as Jamaica and the southern United States, where Christmas was the highlight of the social calendar, slaves used it as an opening to establish their own, probably African-derived festivity: Jonkonnu. As early as 1688, Jamaican slaves were celebrating Jonkonnu with costuming and dancing with “Rattles ty’d to their Legs and Wrists.”38 A little over a century later, they had won a measure of white respect for Jonkonnu, with whites agreeing to do their own chores during this brief period of black celebration. A white contemporary reported that during the holidays “the distance between [masters and slaves] appears to be annihilated for the moment, like the familiar footing on which the Roman slaves were with their masters at the feast of the Saturnalia, to which a West Indian Christmas may be compared.” 39 In the Carolinas, where Jonkonnu had spread by the nineteenth century, slaves marched to the big house, where they danced and demanded money and drinks from their masters. Thus a moment of white weakness—Christmas—was transformed into a black opportunity.

p. 168

In another striking parallel to the European festive tradition, Caribbean slaves and freed blacks put carnival to service as an occasion for armed uprisings. The historian Elizabeth Fenn reports that 35 percent of all known slave plots and rebellions in the British Caribbean were planned for the Christmas period, noting that “in this regard the slaves of the Americas differed little from the French peasants and laborers studied by Emmanuel Le Roy Ladurie and Natalie Zemon Davis.”

Inventing the People:
The Rise of Popular Sovereignty in England and America
by Edmund S. Morgan
pp. 202-203

There were other parallels in contemporary English country life, in the fairs, “wakes,” and local festivals that punctuated the seasons, where sexual restraints were loosened and class barriers briefly broken in a “rough and ready social equality.” 82 But these were simply milder versions of what may be the most instructive parallel to an eighteenth-century election, namely the carnival— not the travelling amusement park familiar in America, but the festivities that preceded Lent in Catholic countries. The pre-Lenten carnival still survives in many places and still occupies an important place in community life, but it has assumed quite different functions from the earlier festivals. 83 It is the older carnivals, before the nineteenth century, that will bear comparison with eighteenth-century elections.

The carnival of the medieval or early modern period elicited from a community far more outrageous behavior and detailed ritual than did the elections that concern us. 84 But the carnival’s embellishments emphasize rather than obscure the fact that make-believe was the carnival’s basic characteristic and that carnival make-believe, like election make-believe, involved role reversal by the participants.

pp. 205-207

Where social tensions ran too high the carnival might become the occasion for putting a real scare into the cats and wolves of the community. There was always a cutting edge to the reversal of roles and to the seemingly frivolous competition. And when a society was ripe for revolt, the carnival activated it, as Le Roy Ladurie has shown in his account of the carnival at Romans in 1580. But normally a community went its way with the structure of power reinforced by its survival of the carnival’s make-believe challenge.

To put this idea in another way, one might say that the carnival provided society with a means of renewing consent to government, of annually legitimizing (in a loose sense of the word) the existing structure of power. Those who enacted the reversal of roles, by terminating the act accepted the validity of the order that they had ritually defied. By not carrying the make-believe forward into rebellion, they demonstrated their consent. By defying the social order only ritually they endorsed it. […]

The underlying similitude of an eighteenth-century election to a carnival is by now apparent. The two resembled each other not only in obvious outward manifestations— in the reversal of roles, in the make-believe quality of the contests, in the extravagance of the partisanship of artificial causes, in the outrageous behavior and language, in the drunkenness, the mob violence, even in the loosening of sexual restraints— not only in all these external attributes but also in an identity of social function. An election too was a safety valve, an interlude when the humble could feel a power otherwise denied them, a power that was only half illusory. And it was also a legitimizing ritual, a rite by which the populace renewed their consent to an oligarchical power structure.

Hence the insistence that the candidate himself or someone of the same rank solicit the votes of the humble. The election would not fully serve its purpose unless the truly great became for a time humble. Nor would it serve its purpose if the humble did not for a time put on a show of greatness, not giving their votes automatically to those who would ordinarily command their deference. Hence too the involvement of the whole populace in one way or another, if not in the voting or soliciting of votes, then in the tumults and riots, in the drinking and feasting, in the music and morris dancing.

It would be too much to say that the election was a substitute for a carnival. It will not do to push the analogy too far. The carnival was embedded deeply in folk culture, and its functions were probably more magical and religious than, overtly at least, political. An election, on the other hand, was almost exclusively a political affair and had no magical overtones; it was not connected with any religious calendar. 90 Nor did it always exhibit the wild excesses of a carnival; and when it did, it was surely not because the local oligarchy felt that this would renew their authority. They would generally have preferred to preserve “the peace of the country” by avoiding the contests that engaged them so hotly and cost them so much when they occurred. Moreover, the reversal of roles did not go anywhere near as far as in a carnival. In an election, along with the fraternization and condescension, there could be a great deal of direct pressure brought by the mighty on those who stood below them, with no pretense of reversing roles.

The resemblance to a carnival nevertheless remains striking. Is it wholly coincidence that there were no carnivals in Protestant England and her colonies where these carnival-like elections took place, and that in countries where carnivals did prevail elections were moribund or nonexistent? Is it too much to say that the important part of an eighteenth-century election contest in England and in the southern colonies and states was the contest itself, not the outcome of it? Is it too much to say that the temporary engagement of the population in a ritual, half-serious, half-comic battle was a mode of consent to government that filled a deeper popular need than the selection of one candidate over another by a process that in many ways denied voters the free choice ostensibly offered to them? Is it too much to say that the choice the voters made was not so much a choice of candidates as it was a choice to participate in the charade and act out the fiction of their own power, renewing their submission by accepting the ritual homage of those who sought their votes?

The Romance between Greece and the East
ed. by  Tim Whitmarsh & Stuart Thomson
“The Greek novel Ninus and Semiramis: Its background in Assyrian and Seleucid history and monuments”
by Stephanie Dalley
Kindle Locations 3943-3958

More likely, in my view, is a relationship of some romances to carnivals: a festival of Aphrodite for Chariton’s Chaereas and Callirhoe where the lovers first meet, and a festival of Artemis for the setting of the beginning and end of the story in Habrocomes and Antheia. The Hebrew Book of Esther is integrally linked to the carnival-type feast of Purim. A festival based upon a Babylonian or Assyrian version of the traditional New Year Festival was celebrated at Palmyra, where a fine frieze showed the triumph (in Roman dress) over the sea of chaos, 47 and probably also at Hierapolis-Membidj. 48 But I doubt that one can claim a carnival connection for all the compositions.

The stories with a vaguely Assyrian historical background mainly have no particular love interest of the boy-meets-girl kind. This is not because such a theme was taboo in Assyrian literature: there are very explicitly erotic Love Lyrics, which were recited in rites of Ishtar of Babylon. 49 I would like to make a suggestion as to why the erotic element was introduced into the genre (if we can call it that). The carnival element involves dressing up, pretending to be another person or disguising one’s true nature, often behaving ‘badly’ in a theatrical way. Tomas Hägg suggested that the mosaics found near Antioch and at Alexandretta may have illustrated a theatrical performance, 50 and one might invoke a similar connection for the wall-painting depicting a scene from the story of Esther at Dura Europus, because we know that rude theatrical events were often a part of Purim celebrations.

Jesus Mythicism:
An Introduction
by Minas Papageorgiou
Kindle Locations 3094-3113

It should not surprise us that our people maintained or restored some of these elements throughout the centuries. A good example would be the so-called “Dodecameric,” the twelve days between Christmas and Epiphany. The customs observed during that period remind us of a series of Dionysian celebrations related to fertility that took place at the same time of the year in ancient times. For example, “Aloa” was a festival in honor of Demeter and Persephone, the “Rural Dionysia” was a joyful celebration, and “Lenaia” was a festival with a dramatic competition.

Thus, in the village of Volakas, in Drama, the feast of “Arapides,” masked men with faces painted with soot, takes place every year on January 7. The next day the “Bears” appear in the village. These are men covered in goatskin who make phallic dance movements, swear and strike with their sticks whomever they meet for good luck. These celebrations go back in time. “We are dressing up as ‘Arapides’ for good luck, for the good of our crops. This is how we found it, and so we keep it going,” say the disguised locals. Similarly, on the Epiphany (January 6), in another village in Drama, Kali Vrisi, another celebration takes place that lasts until the eighth day. It is the feast of “Babougera.” People are disguised as animals wearing masks and hang in their waist heavy bells. They dance and chase endlessly and cheerfully the people on the street. When the time for the ritual wedding comes, as part of the celebration, the disguised men grab the “bride,” who is basically a man dressed up as a woman.

All these elements, of course, are reminiscent of the traditional customs of Carnival. Strangely enough, at the same time when the ancient Athenians celebrated Carnival another celebration was taking place in honor of Dionysus, the Anthesteria festival. This included, among other things, the sacred marriage between the god and the “basilinna,” wife of the archon basileus (king), who represented the city. It is highly possible that modern carnival celebrations, such as the Vlach wedding in Thrace, have their roots in these ancient customs.

Dionysian elements can also be found in some phallic customs as part of the carnival celebrations, for example in Tyrnavos in Thessalia and Agia Anna in Evia. Besides, one of the main characteristics of the Rural Dionysia and the City Dionysia was the procession of men carrying phalloi, known as phallophoroi. In the center of the procession was a large wooden phallus, usually from a fig tree. People were also singing several “phallic” songs. Comedy, characterized by strong sexual and obscene language, derives from this tradition, as Aristotle informs us.

Religion in Human Evolution:
From the Paleolithic to the Axial Age
by Robert N. Bellah
Kindle Locations 5260-5286

It is part of the myth of Dionysus that he was an outsider, that he came from abroad, from Thrace or Phrygia, in historic times. Modern scholars as well as ancient Greeks tended to accept this part of the story as historically true, until the name of Dionysus appeared several times among the gods of the Mycenaeans in Linear B texts. So Dionysus is a very ancient Greek god, but he is “always” coming from abroad. He was very important in Athens, where a number of festivals, some of them very early, were dedicated to him. Robert Connor has seen the growth of Dionysiac worship in sixth-century Athens as a kind of religious preparation for the emergence of Greek democracy in the reforms of Cleisthenes beginning in 508-507. Connor discusses the Dionysiac thiasotai (confraternities) as among the many forms of voluntary association that made up something like “civil society” in sixth-century Athens, associations that were to some degree self-governing and that fostered the practice of group discussion and group decision making. It was the combination of the social practice nurtured in such associations with the spirit of Dionysiac religion that Connor sees as an important foundation for the democratic reforms, reforms that Cleisthenes nurtured but could not have created.

The structural reforms undertaken by Cleisthenes, or by the people of Athens under his leadership, are too complex for us to describe in detail. Suffice it to say that these reforms overcame some of the divisiveness that characterized Athens in earlier times and extended the participation of the common people in the government of the polis. What is significant for us is the fact that these political changes were accompanied by, were one aspect of, a general change that was religious as much as political. It is this religious side of the change that Connor characterizes as the increasing importance of Dionysiac religion.

The myth of Dionysus is complex and ambiguous, indeed ambivalent, with a dark side as well as a joyous one, but one of its foci is that of the outsider god who comes into a city and turns it upside down, leading to the destruction of those who oppose him but to a new solidarity among those who accept him. He is transgressive, to use a term common in current discourse, a boundary-crosser to be sure, but also integrative, the symbol of new community.72 Connor believes that Dionysiac worship in the sixth century “is best understood as the first imaginings of a new type of community.” More specifically, he writes:

Dionysiac worship tumbles into carnival and carnival inverts, temporarily, the norms and practices of aristocratic society. While these inversions may provide a temporary venting mechanism and thereby help stabilize repressive regimes, in the longer run they can have quite a different effect. They make it possible to think about an alternative community, one open to all, where status differentiations can be limited or eliminated, and where speech can be truly free. It is a society that can imagine Dionysiac equality and freedom.73

Connor gives the example of features institutionalized in the political realm “that probably originated in religious practice, for example, ‘outspokenness,’ parrhesia, and isegoria, ‘equality of speech.’”74 Given the importance of Dionysiac cult groups and the spirit of Dionysiac religion, Connor finds it “not surprising” that the newly established Athenian democracy would express itself in a new festival, the City Dionysia, or festival of Dionysus Eleuthereus (that is, the Dionysus who came from the border city of Eleutheria, but also with the etymological implication of freedom). He argues that the City Dionysia was founded not under the Pisistratids but under Cleisthenes or shortly thereafter and so was a kind of “freedom festival” celebrating the fall of the tyranny.75 Other specialists on Greek religion believe that the City Dionysia was founded under the Pisistratids, but that it underwent significant reform and enhancement at the time of Cleisthenes.76 In that case, Connor’s argument would still be applicable.

What from our point of view is most interesting is that religious practice not only made possible the idea of a different social reality than the one existing, but helped to actualize it as well. Although the capacity to imagine alternative social realities is part of what we have described as the axial transition, it is interesting that in this case it does not involve anything explicitly theoretical. Indeed, Connor writes: “The festival helps us understand why our texts contain no elaborate statement of Athenian democratic theory … The ancient Greeks did not write theory; they enacted it. They enacted it in particular through the City Dionysia.”77

Circles and Lines:
The Shape of Life in Early America
by John Demos
pp. 11-13

Virtually everywhere, harvest was a peak time, a crisis even, when all hands, including those of women and children, were turned to getting the crops safely in. But there were slack times, too, especially in winter, when things slowed way down for days or weeks at a stretch.

The same agricultural rhythm meant changes also in food availability. People experienced dramatic seasonal differences in everyday diet, moving, say, from the summertime, with lots of fresh vegetables and fruit, to the special bounty of harvest, traditionally celebrated with a feast of freshly slaughtered animals (the antecedent of our own Thanksgiving), and then to winter, when the dietary range would narrow to dried foods like peas and turnips and a dwindling supply of salted meats.

A different (though not unrelated) kind of seasonal variance involved health and illness and marriage and reproduction. The evidence for this lies mostly below the surface and must be pried out through laborious demographic analysis, but its impact was certainly large. For example, marriage-making—weddings—showed a striking seasonal distribution. The headline is that weddings happened in hugely disproportionate numbers during the late fall. And, going a bit further, one finds a distinct up-and-down annual “curve” for weddings (see Figure 1), with much regularity from one year and one community to the next. Moreover, the distance between top and bottom was very wide; there were roughly three times as many weddings in November, for example, as during the midsummer low. This particular curve was not so directly tied to Nature’s rhythms as, for instance, all the activity around farming. It could even be seen as culturally determined, since people might well have chosen differently about when to marry. Still, the link to harvest seems too obvious to ignore. When that was over, there was suddenly more time available, and more energy; there was also a feeling of release, and expansiveness, and good cheer. The impulse to celebrate might then lead not just to a Thanksgiving feast but to a wedding as well.

And there was more. This next had no aspect of cultural preference but was entirely controlled by Nature—in fact, by deep (and not fully understood) elements of human biology. It’s what demographers call the “conception cycle”; and it reflects the way pregnancies were unevenly, but very consistently, distributed throughout the calendar year. The evidence of copious local and family records yields another annual curve, in fact, a pair of curves, one reflecting births, the second, times of conception (see Figure 2). Of course, the dynamic element here was always conception; once that had taken place, birth would (barring mishap) occur about nine months later. In fact, the curve shows two peaks in conception, the tallest coming in late spring, with corresponding valleys (and a difference between them approaching 100 percent).13

What this meant, in terms of actual experience, was many more babies born in late winter than at other times of the year. In fact, demographers have found the same rhythm in premodern communities throughout the northern hemisphere. Moreover, they have also found it in the southern hemisphere—except that there the months, though not the seasons, are directly reversed. The southern conception peak comes in November-December, which, of course, is their spring; so the pattern is actually the same. We might just note, as a final gloss on all this, that the conception cycle flattens out and virtually disappears in the modern period. The reason is obvious: as soon as contraception enters the picture—that is, planned fertility control—the timing of pregnancy is determined by innumerable individual choices; and those choices, when aggregated, spread evenly throughout the year.

pp. 45-47

We can zero in on that link by considering the word revolution and its own history of change. In fact, it’s not too much to say that the word moved from an originally circular to an eventually linear meaning, over the span of several centuries. Other scholarly hands have been into this history—of the word—in some detail. Their conclusions deserve a careful summary. Revolution was, during the late Middle Ages and on into the early modern period, used to refer to things that turned, that rotated, circular and cyclical things. (In this it followed the sense of its Latin root.) Most especially was it used by astronomers to describe the orbital movement of the stars—for example, in the landmark work of Copernicus, De Revolutionibus Orbium Coelestium. Then bit by bit it was brought down from the heavens and applied to more earthly matters, as a metaphor for revolving tendencies of all sorts. Then, in the seventeenth century, it became a specifically political term, but still with the underlying sense of movement around and back to pre-established positions. This was especially true of its widespread application to political events in England from mid-century onward: the Puritan Revolution (which, from the perspective of many, represented a turning back toward older and better ways), and also the Glorious Revolution of 1688 (which was widely understood as a restoration of monarchical power to its appropriate form and context).

And that was where the meaning of the term remained for quite a while longer—indeed, until the last part of the eighteenth century. The American Revolution, as we’ve already remarked, was begun in a spirit of restoration, of reengaging principles and structures supposedly forgotten (or abandoned, or subverted). Thus the word, in its traditional usage, was initially a good fit. But when the political context changed—when the historical actors began to acknowledge, and even to embrace, the novelty of what they were about—the word changed, too. This is the truly remarkable thing: events reversed a meaning that had endured for several hundred years. From now on, revolution would signify not a turning back into old paths but the creation of entirely new ones. (This result was solidified, just a few years further ahead, with the start of the French Revolution. There, too—though perhaps a bit more ambiguously—one sees a movement away from restorative conceptions toward openly innovative ones.)

* * *

Christmas carol
Wassailing
Apple Wassail
Wait (musician)
Mummers Parade
Mummers play
Carols, Wassailers, Waits and Mummers
Why do Christmas carols make the church feel nervous?
Wassailing with Wenceslas – Christmas Carol Origins
Here We Come A-Wassailing; The Roots of a Christmas Tradition
Here We Go a Wassailing
Wassailing through History
Apple Tree Wassails
Oh Apple Tree, we Wassail Thee
Wassailing! Notes On The Songs And Traditions
When Thanksgiving Tradition Included Halloween-Like Masquerading
Celebrating Hallowmas
Allhallowtide
Samhain (Historic customs)
Halloween, a faraway origin feast
Winter solstice
Christmas: The Birthday of Sun Gods

Happy Birthday Mithras!
Paul & Mithraism
St Paul – History, Biblical Epistles, Gnosticism and Mithraism
Mithraism and Early Christianity
Mithra: The Pagan Christ
Attis: Born of a Virgin on December 25th, Crucified and Resurrected after Three Days
Christmas and holiday season
Christmas
The History of Christmas

Christmas controversies (Pre-Christian influence)
A Roman Christmas

Christmas’ Pagan Origins
Boxing Day
Feast of Fools & Lord of Misrule
Twelve Days of Christmas
Twelfth Night (holiday)
Solar origins of the ‘Twelve Days of Christmas’ and Christianity.

Christmas, Yule and the Winter Solstice
Festive ecology (Christmas)
Christmas tree
O, Tannenbaum: the Origin of the Christmas Tree
List of Christmas and winter gift-bringers by country
The History and Origins of Santa Claus
Santa is a Wildman!
Krampus
Sinterklaas
Zwarte Piet
Magic Mushrooms May Explain Santa & His ‘Flying’ Reindeer
Psychedelic Santa And Christmas Mushrooms
Yule
Brumalia
Saturnalia, Sigillaria, & Opiconsivia
The Roman Saturnalia parties and Christmas
Io Saturnalia! The Reason for the Season?
Saturnalia—A Roman Solstice Romp
The Puritan War on Christmas
Slaves Received Gift Of Role Reversal
Carnival
The Carnaval Celebration that became Christmas & New Year’s Eve
Carnival, A People’s Uprising at Romans
Carnival, Processions and Parades – Interview Claire Tancons
Carnival, an upside down world
Revolution as Carnival
In Theory Bakhtin: Carnival against Capital, Carnival against Power
Occupy Wall Street: Carnival Against Capital? Carnivalesque as Protest Sensibility
Carnival to Commons: Pussy Riot, Punk Protest, and the Exercise of Democratic Culture
The Lord of Misrule
Tactical frivolity
Phrygian cap
Pileus (hat)
The History of Marianne’s Cap
Liberty pole
The Maypole’s Revolutionary Heritage
Roots of the Liberty Tree
Merry Mount and May Poles
Démos, The People
Revolution and Apocalypse
Music and Dance on the Mind
Beating the bounds
Terminalia

Immigrants, Their Children, & Contributing Factors

In discussing comparisons between the US and France, someone brought up the issues of immigration, assimilation, and violence. The specific focus was the children of the North African Muslim immigrants. Some have noted that violent crime, terrorism, and radicalization are seen more with the native-born second and third generations than with the immigrants themselves. So, this violence is learned in Europe, rather than having been brought there by refugees.

It’s an interesting point, but it’s hard to disentangle the strands and harder still to put it all into context. For that reason, let me offer some of my commentary from a previous post, in response to Kenan Malik — Good Liberals vs Savage Nihilists:

“He does admit that some terrorists are refugees. His argument, though, is that they aren’t the majority. That’s true. As I recall, something like 20% are refugees, which admittedly still is a large number. More important is the entire atmosphere. Even for non-refugee Muslims in Europe, they likely would be surrounded by and regularly in contact with Muslims who are refugees. In general, they’d be constantly reminded of the refugee crisis in the media, reminded of the public response of hatred and bigotry, and probably mistaken as a refugee themselves. […]

“Many European Muslims still experience the negative effects of xenophobia, racism, ghettoization, and other forms of isolation, exclusion, and prejudice. They aren’t treated as fully integrated by their fellow citizens. Simply being born in a country doesn’t mean most people will see you as an equal. It takes generations for assimilation to take place. Even after centuries, Jews and Romani have continued to struggle for acceptance and tolerance in Europe. […]

“Plus, consider the situation in the United States. American Muslims on average are wealthier and more well-educated. But unlike in Europe they aren’t ghettoized or racialized in the same way (we already have our racialized boogeyman with blacks). Maybe it should be unsurprising that per capita American Muslims commit far less mass violence than do native-born American whites. In the US, you’re more likely to be shot by a white terrorist and treated by a Muslim doctor, in terms of percentage of each population.

“The same identity politics and decline of traditional politics have happened in the United States. In some ways, the loss of community and culture of trust is far worse here in the States. Yet Islamic integration seems more of a reality than in Europe. American Muslims apparently don’t feel disenfranchised and nihilistic, as Malik assumes they should feel. This undermines his entire argument, indicating other factors are more important.

“Obviously, there is nothing inherently violent about either Arab culture or the Islamic religion. The Ottoman Empire was one of the great powers of the world, not particularly different from European empires. If any European empire with large contiguous territory (e.g., Russian Empire) had been defeated and demolished in a similar fashion and then artificially divided up as a colonial prize, we’d probably now have something in Europe akin to the present violence-torn Middle East. There is nothing that makes either region unique, besides the accidents of history. After WWI, the Ottoman Empire could have been left intact or even given assistance in rebuilding. In that case, none of the rest would have followed.”

Europe is having issues with assimilation based on a refugee crisis involving, and related to, more than a century of problematic relations with the Middle East and North Africa. There are: the post-WWI forced dismantling of the Ottoman Empire, neo-colonial exploitation, Cold War conflict, proxy wars, covert operations, coups, assassinations, puppet dictators, destruction of democracy, support of theocracy, millions of innocents regularly killed over several generations, the War on Terror, climate change-caused droughts, etc. All of this has been caused or contributed to by foreign governments, especially Western governments. This is built on centuries of ongoing racial and class conflicts in European history, including the legacies of colonial imperialism.

Assimilation is always a slow process. The Roman Empire spent centuries trying to assimilate the barbarian hordes of Europe, but it ultimately failed before those backward Europeans took down that once great Mediterranean empire. Yet after the collapse of the Roman Empire, various European societies slowly assimilated aspects of the Roman Empire, developing into Western imperialism, colonialism, and feudalism. This process took most of Europe about a millennium or so, until finally a new assimilated culture could begin to be clearly identified as Western. For example, it took the Celts, Scandinavians, Germans, and Normans more than a millennium of bloodshed to assimilate into what eventually would be called the English.

As for our present situation, even in Europe, immigration violence is relatively low. Most of the increase in violence, as far as I know, hasn’t come from immigrants and their children. There has been a right-wing and reactionary radicalization of the native-born ‘white’ populations of European countries. It’s just that few people ever bother to compare this native-population violence with immigrant-population violence. I would like to see good data on this. I hear a lot of people repeating what they think is true, but I never see the evidence for why they think it is true, other than other people repeating the same claims.

Even if it were true, this might be a normal pattern. Europe has seen millennia of violence rates that increase and then settle down following population shifts. And Americans were making similar complaints against European ethnic immigrants in the early 19th century. Yet immigrants almost always assimilate, slowly or quickly depending on the kind of society, but the only time assimilation fails is when there is enforced segregation (e.g., American blacks). I always take such allegations with a grain of salt because, when one researches them, they so often are found to be nothing more than stereotypes. Still, I do take seriously the problems of refugee crises, especially those that could be avoided, from the English-caused Irish potato famine to the US-promoted Latin American destabilization.

Unsurprisingly, desperate people act desperately. So, if the children of refugees are being targeted with prejudice, oppressed by systemic and institutional biases, economically segregated and ghettoized, it would be entirely predictable that bad results would follow. I’ve pointed out the research that shows diversity only correlates with mistrust when there is segregation. What I’d like to see is the data on prejudice and oppression, violent crime and police brutality committed against these immigrants, their children, and their grandchildren. And then I’d like to see that compared to rates of violent crime in immigrant communities, broken down in various ways: older and younger, foreign-born and native-born, etc.

But most importantly I’d like to see research that controls for at least the most obviously significant confounding factors: poverty, inequality, segregation, political disenfranchisement, racial/ethnic targeting, etc. Consider that last one. We know that American blacks get stopped, arrested, prosecuted, and imprisoned more often and more harshly than American whites, even for crimes that have been proven to have higher rates among American whites. So, how do we know that biases against these populations aren’t built into the institutions, such as police departments, that create and keep this data?

Now consider this. All these points I make, all these questions and criticisms, they seem obvious to me. And I can’t help but think that they should be obvious to everyone. Yet most of this is rarely if ever mentioned, much less seriously discussed, by right-wingers and neo-reactionaries, by race realists and genetic determinists, by white supremacists and ethno-nationalists. As far as that goes, you won’t hear much about it from mainstream liberals, Democratic politicians, and corporate media. Why is that?

Look at the essay below, “Crime and the Native Born Sons of European Immigrants.” It is from 1937. The author, Harold Ross, discussed and analyzed these very same kinds of issues, although about European (Christian) immigrants. He even considered the confounding factor of economic segregation, among other issues. So, how is it that such an essay could be written 80 years ago, and yet so many people to this day continue to make ignorant arguments, as if such confounding factors don’t exist? Was Harold Ross a genius or, like me, was he simply willing to state the obvious?

* * *

Crime and the Native Born Sons of European Immigrants
by Harold Ross
Journal of Criminal Law and Criminology
Volume 28, Issue 2 July-August, 1937

The European immigrant, landing on American shores, was forced to find cheap lodgings as he was usually penniless. These cheap lodgings he found in the disorganized slum areas of the industrialized American cities.5 The behavior of the new-comer himself was determined by behavior patterns organized in the culturally more stable European environment but his native born children suffered the stresses and strains of the new individualistic environment.

These children, the native born offspring of foreign parentage, were reared under those barren, poverty stricken socio-economic conditions that produced a higher crime rate than a more sheltered and prosperous environment. The environment of the slum dwellers meant for all the inhabitants there, be they of native or foreign parentage, a life conditioned by irregular, poorly paid employment, by a family disorganized by the necessity of the mother to leave the task of home-making in search of work to supplement the chief wage-earner’s meager income, by the general institutional disorganization, by inadequate educational opportunities and a sordid, barren milieu for the children. These vital forces were far more powerful than the fact that one slum-reared child’s parents spoke Italian and another’s parents spoke native American slang, that the one ate spaghetti, and the other beef stew.

If the crimes of the native born of native stock and those of the native born of foreign stock were stimulated by different causes, the cause in the latter case being a cultural clash between American and European customs which is non-existent in the former case, then there should be little similarity in the growth from childhood to careers of crime between both groups. If, on the other hand, crimes in both cases were stimulated by the same cause, namely dwelling on the same socio-economic level, then there should be definite similarity in the maturation from childhood to crime.

Anti-social behavior first becomes evident in the delinquencies of predatory boy gangs. Boys naturally tend to play with other boys. The environment determines whether this spontaneous grouping is social or anti-social, whether it is a respectable Boy Scout Troop or a predatory gang. The typical city “kids” gang consisted mainly of the native born offspring of foreign born parents, but nativity per se was not responsible for the gang problem.7 All boys of the same socio-economic class, whether of foreign, negro, or native white parentage, enter into gangs with equal facility.8 Boys of the more prosperous classes do not form anti-social gangs, not because they are of native white stock, but because of their prosperous environment.9 It is needless for them to rebel against the mores and law, for life has been comfortable to them. Others, regardless of parental nativity and because of their lower socio-economic position, did not willingly accept the mores and law that doomed them to a barren life, so naturally violated them.

This disregard by delinquency of nativity is illustrated by Chicago districts near the Loop, the stock yards, and the south Chicago steel mills which have had high delinquency rates as far back as the records go, and yet whose population composition has been constantly changing. In many cities it has been noted that the incidence in delinquency varied more accurately with community background than with nationality. High rates coincided with the areas of physical deterioration.

There has been no fixed boundary between the boy’s predatory gang and the adult’s criminal group. Behavior patterns organized in the former were carried over into and accentuated by the latter. Sons, both of native and foreign born stocks, made this promotion from juvenile delinquent to adult offender with equal facility. A follow up of 420 Chicago cases found a negligible difference. Continuance of anti-social conduct was dependent upon other conditions than nationality.

Further, evidence that the crimes of native born white of both European and American parentage were the resultant not of conditions peculiar to either group but of the same general socio-economic pressures affecting both is shown by the fact that the types of crimes the immigrant’s sons were guilty of were similar not to the offenses of their parents, but to the offenses committed by native Americans. This tendency of the second generation to shift away from crimes peculiar to immigrants and towards native crimes is substantiated by records of all commitments to Massachusetts penal institutions during the year ending September 30, 1909, and by the records of convictions in the New York Court of General Sessions from October 1, 1908 to June 30, 1909.

In summary, then, it was noted by an examination of both American and European reports that the differences in socio-economic conditions between urban and rural life resulted in differences in crime rate whatever may be the nativity or cultural heritage of the individuals. Further it is contended that there are just as marked differences between the environment of prosperous and poverty stricken districts within the urban areas which also result in differing crime rates. Thus the crime of the native born sons of foreign born parentage may be a result not of cultural maladjustment as is usually held, but of their position in a poverty class, a class which breeds criminals with equal facility from all its constituents be they of native or foreign parentage. This view is substantiated by evidence that indicates that native born whites of both American and European parents, if on the same socio-economic level, formed predatory groups, that both grew up into careers of crime with equal facility, and that both were guilty of the same types of crime. This coincidence of factors indicates that the criminality of both was not due to conditions peculiar to each group individually, but to general conditions affecting both equally, namely, their residence in a poverty stricken socio-economic class.

This explanation, if accepted, harmonizes the apparent contradiction between statistical studies, on the one hand, which demonstrate a higher crime rate for the native born of European parentage than for the native born of American parentage, and the personal experiences of countless officials and investigators, on the other hand, who claim, after handling hundreds of second generation offenders, that the foreign stock from which the offenders sprang was in no way responsible for the criminality.16 As the native born sons of foreign parentage tend to be segregated on that income level which has a high crime rate and the native stock tends to be dispersed through all income levels, then obviously statistical studies would endow the former with a higher crime ratio. […]

In conclusion concerning the number and causes of crime of native born individuals of foreign stock, in contradiction to accepted opinion, these views are tentatively presented.

1) The second generation is not a group culturally adrift with neither the culture of their parents nor of their new environment to guide them, but is a group with a very definite culture, a culture of a socio-economic level that is determined by irregular, poorly paid employment and results in broken homes, inadequate educational and recreational opportunity, and a general stunted environment. And this culture determines for its inhabitants, whatever their nativity, a high crime rate.

2) Statistics seem to indicate a higher crime rate for the native born of European stock only because they disregard the various income levels. What their actual crime rate is is still a matter of opinion and it is this writer’s hypothesis that all peoples on the same socio-economic level have approximately the same crime rate.

 

Christians Dancing

In Dancing in the Streets, Barbara Ehrenreich writes (p. 58):

From a Roman perspective, Christianity was at first just another “oriental” religion coming out of the east, and, like others of similar provenance, attractive to women and the poor. It offered direct communion with the deity, with the promise of eternal life, but so did many of the other imported religions that so vexed the Roman authorities. In fact, there is reason to think that early Christianity was itself an ecstatic religion, overlapping the cult of Dionysus.

The Roman Empire looked east, to the “Orient”. Almost everything of significance came from that direction, where most of the other great empires, societies, and cities were to be found: Persia, Greece, Alexandria, etc. Jews and early Christians, to the Roman mind, were perceived as Easterners. A practice like circumcision made Jews stand out in Rome, but in the East there were other religions and ethnic groups that did the same thing.

Jews and Christians, along with Stoics and the worshippers of Dionysus, Isis, and many others — they were all hard to tell apart. Before the Romans came to power, the Greeks had developed a worldview in which everyone who wasn’t Greek was a Barbarian. In the ancient world, it was only the ruling authorities of the empires that eventually became concerned about sorting people into the proper box and labeling them accordingly.

So, if you vaguely looked like or did anything approximating the behavior of some known group, that is who you’d get lumped with. Simply refusing to eat pork because you were a vegetarian could get you accused of being a Jew. At times, being Jewish had great advantages and so large numbers converted to Judaism. And at other times, some other identity was preferable. Ancient people were often taken at their word. If you claimed to be a member of a particular religion, ethnicity, or nationality, you’d likely be treated as such.

It wasn’t usually a matter of deception, though. Most ancient people had fluid and overlapping identities. The distinction between one group and another was often rather fuzzy. The various populations were constantly intermingling, borrowing traditions from each other, incorporating foreign elements into their religions, and otherwise shifting their identities and cultures as social and political conditions changed.

Ancient people didn’t think in modern terms. But things were beginning to change. With the rise of colonial and expansionist empires during the Axial Age, the greater contact put greater emphasis on identity. This can be sensed most clearly in the late Axial Age, when religions like Christianity arose. If you were in the growing Roman Empire, how a group defined itself and was perceived became increasingly important. This is why, “The obvious parallel between the Christ story and that of pagan victim gods was a source of great chagrin to second-century Church fathers” (p. 58). Paul, as a Roman citizen, was particularly concerned about making Christianity respectable to the Roman authorities and within Roman society.

This was a challenge. It was obvious to everyone involved that Christianity had borrowed heavily from diverse religious and philosophical traditions. There was nothing unique about early Christianity. There were thousands of small cults like it. The worst part about it, from a Roman perspective, was the stark similarities and connections to Eastern groups. And how could they be denied? The first Christians were themselves Jews who were from the East.

The Jews had spent centuries mixing up various oral traditions with elements of nearby religions before writing any of it down. Then the Jews became heavily enmeshed in Greek culture. In the centuries immediately prior to Christianity, many Jews were worshipping pagan deities, including through the ancient practice of conflating deities. Yahweh, for many Jews and non-Jews alike, had become identified with Zeus and/or Dionysus, the relation between those two Greek gods laying the groundwork for the relationship between Yahweh and Jesus. Ehrenreich briefly quotes from Robert M. Price’s “Christianity, Diaspora Judaism, and Roman Crisis” and here is the passage she quoted from:

What about the challenges of Diaspora assimilationism? There surely was such a thing as Jews taking attractive features of Gentile faiths and mixing them with their own. My caveat is just to say that wildly diverse Judaism already existed back in the Holy Land. And I would say the mythemes later assimilated from Hellenistic Mystery Religions were able to gain entry because they answered to elements already present in Judaism, perhaps all the more attractive once they had become forbidden fruit in the wake of Javneh. In other words, when the family next door celebrated the death and resurrection of Osiris or Adonis this might appeal to a Jew who was dimly aware that his grandfathers had celebrated pretty much the same rites in honor of Baal, Tammuz, or even Isaac, years before.17 2 Maccabees 6:7 tells us that Antiochus converted large numbers of Jews to the worship of Dionysus. One suspects it was no arduous task, given that some Greek writers already considered Jehovah simply another local variant of Dionysus anyway. The Sabazius religion of Phrygia is plainly an example of worshipping Jehovah as Dionysus. The Phrygian Attis was another version of Adam, his mother and lover Cybele a cognate form of Eve. No wonder the Naasene Document identifies the resurrected Jesus with both Attis and Adam. No wonder we have Jewish sarcophagi from this period depicting both the menorah and the symbol of the resurrected Attis.18

The temptations and challenges of the Diaspora only served to increase the diversity of ancient Judaism, a diversity directly reflected in emerging Christianity, which demonstrably partakes of Jewish Gnosticism,19 Zoroastrianism,20 the Mystery Cults, etc. As Rodney Stark has shown, Diaspora Jews remained a major and continuous source of new Christian converts on into the fifth century.21 Christianity would have been, Stark very plausibly surmises, the ideal assimilation vehicle, since the “new” faith allowed one to retain the cherished ethical monotheism of Judaism yet without keeping up the walls of purity rules that separated one (arbitrarily, as it seemed, and as it would seem again to nineteenth- and twentieth-century Reform Jews) from one’s neighbors. It seems to me that adherence to Christianity (the “true Israel”) would also have been the natural way of clinging to traditional elements of popular Judaism upon which Orthodoxy had frowned but which, as Barker shows, had never died out. I suspect that such Christian-leaning Jews eyed emergent Rabbinic Javneh Judaism as a modern product and viewed it as most pious non-Pharisaic Jews had always viewed the stricter party of the Pharisees (and the Essenes). It would have been entirely natural for Christianizing Jews, hanging on to cherished “underground” mythemes, etc., to have viewed themselves as the real Judaism, the old-time religion. We have, again, been too eager to take the Rabbinic claims to pedigree and originality at face value. Perhaps one more piece of evidence that this is a proper way to view matters is the otherwise odd fact that many Christians continued to attend synagogue for centuries, alongside church, often to the great consternation of their bishops. This implies that the synagogue-attenders viewed the defining label for their religiosity as Judaism, not as a new, split-off religion. Their Christianity was Judaism in their eyes, even if Christian bishops (like Chrysostom) and Jewish Rabbis alike bemoaned the fact.

That fascinates me endlessly. It was such a different world. Monotheism had yet to become monolithic because monotheism itself was a rather fuzzy concept. Sure, you could believe there is one true god, but which god, and how many forms could he take? In practical terms, there is no absolute distinction between monotheism and polytheism. Even the Jews often referred to their god using a word that was plural, Elohim. There was a polytheism within Judaism that was very much within living memory of many ancient Jews. And it seems a cultural memory of this continued into the early centuries of Christianity, which maybe explains why there came to be so many violent purges by the heresiologists who gained power. In order to make Christianity into a new religion, they had to annihilate the last remnants of the old time religion. They were never entirely successful, but not for a lack of trying.

An area of struggle was the ecstatic tradition of Christianity. That is part of Ehrenreich’s focus in her book: what have dancing and related practices meant to humans over the millennia? We modern Westerners, especially Americans, don’t associate dancing with the Christian tradition. But there has been a long struggle about this within Christianity itself. And this struggle has taken many forms. One site of struggle has been the dancing so typical of carnivals and festivals. These ecstatic forms of religiosity have sometimes been included within Christianity, at other times they were merely tolerated, and at yet other times they were forbidden. There is evidence that in early Christianity dance was considered by many to be a normal expression of worship and devotion. But it isn’t entirely clear what kind of dance it was. Ehrenreich discusses this in great detail (pp. 65-66):

Most of what Christians of the first and second centuries actually did together—whether they even possessed a standardized form of worship, for example—is unknown to us today, but the general scholarly view is that “church services were noisy, charismatic affairs, quite different from a tasteful evensong today at the parish church.”20 They met in people’s homes, where their central ritual was a shared meal that was no doubt washed down with Jesus’ favorite beverage, wine.21 There is reason to think they sang too, and that the songs were sometimes accompanied by instrumental music. 22 Justin Martyr, a gentile convert who died at the hands of the Romans in 165 CE, once wrote that children should sing together, “just as in the same way one enjoys songs and similar music in church.”23 Very likely, Christians also danced; at least this is how the historian Louis Backman interpreted various statements of the second-century Church fathers. Clement of Alexandria (150-216 CE), for example, instructed the faithful to “dance in a ring, together with the angels, around Him who is without beginning or end,” suggesting that the Christian initiation rite included a ringdance around the altar. At another point Clement wrote that in order to invoke the “zest and delight of the spirit,” Christians “raise our heads and our hands to heaven and move our feet just at the end of the prayer—pedes excitamus,” where, according to Backman, pedes excitamus is “a technical term for dancing.”24

So Christians sang and possibly danced, but did they dance ecstatically, as did members of the old Dionysian cults? The evidence for ecstatic dancing, such as it is, hinges on Paul’s instruction, in his letter to the Corinthian congregation, that women should keep their heads covered in church (1 Cor. 11:5). This may represent nothing more than a concern that Christianity remain within the normal pagan and Jewish bounds of gender decorum. After all, Paul did not want women prophesying or even speaking in church, despite the fact that he worked with women as fellow proselytizers and had at one point proclaimed that “male and female are one in Christ.” An alternative explanation for the head-covering rule, proposed by the theologian E. S. Fiorenza, is that the women of Corinth were becoming a little too exuberant for Paul’s tastes.

It seems that during their ecstatic-pneumatic worship celebrations some of the Corinthian women prophets and liturgists unbound their hair, letting it flow freely rather than keeping it in its fashionable coiffure, which often was quite elaborate and enhanced with jewelry, ribbons and veils. Such a sight of disheveled hair would have been quite common in the ecstatic worship of oriental deities.25

Roman women spent hours on their tight coiffures, leaving the long, unbound look to the worshippers of Dionysus, Cybele, and Isis. If we know one thing about Paul, it is that he was greatly concerned about making Christianity respectable to the Romans, and hence as little like the other “oriental” religions—with their disorderly dancing women—as possible.

This may seem like a rather tenuous inference, but the association between hair-tossing and ecstatic practice is widespread and was well established in the ancient world.

All that we know of early Christianity, as of most other early religions, is a fading memory of what came before. That fading memory was written down, and typically by those who were attempting to eliminate its traces. All that we can be certain of is that modern Christianity probably bears little if any resemblance to early Christianity, in either substance or form.

 

* * *

Dance of the Savior

The Round Dance–text and commentary
by Michael Howard

Singing with the Savior: Reconstructing the Ritual Ring-dance in the Gospel of the Savior
by Erik Yinglin

The Evolution of Sacred Dance in the Judeo-Christian Tradition
by Jade Luerssen

Greek Dance: An Ancient Link — A Living Heritage
by Athan Karras

Jesus as Lord of the Dance
From early Christianity to medieval Nubia
by Paul Dilley

“Beyond that, there is only awe.”

“What is the meaning of life?” This question has no answer except in the history of how it came to be asked. There is no answer because words have meaning, not life or persons or the universe itself. Our search for certainty rests in our attempts at understanding the history of all individual selves and all civilizations. Beyond that, there is only awe.
~ Julian Jaynes, 1988, Life Magazine

That is always a nice quote. Jaynes never seemed like an ideologue about his own speculations. In his controversial book, published more than a decade earlier (1976), he titled the introduction “The Problem of Consciousness”. That is what frames his thought: the confrontation with a problem. The whole issue of consciousness is still problematic to this day and likely will be for a long time. After a lengthy analysis of complex issues, he concludes his book with some humbling thoughts:

For what is the nature of this blessing of certainty that science so devoutly demands in its very Jacob-like wrestling with nature? Why should we demand that the universe make itself clear to us? Why do we care?

To be sure, a part of the impulse to science is simple curiosity, to hold the unheld and watch the unwatched. We are all children in the unknown.

Following that, he makes a plea for understanding. Not just understanding of the mind but also of experience. It is a desire to grasp what makes us human, the common impulses that bind us, underlying both religion and science. There is a tender concern being given voice, probably shaped and inspired by his younger self having pored over his deceased father’s Unitarian sermons.

As individuals we are at the mercies of our own collective imperatives. We see over our everyday attentions, our gardens and politics, and children, into the forms of our culture darkly. And our culture is our history. In our attempts to communicate or to persuade or simply interest others, we are using and moving about through cultural models among whose differences we may select, but from whose totality we cannot escape. And it is in this sense of the forms of appeal, of begetting hope or interest or appreciation or praise for ourselves or for our ideas, that our communications are shaped into these historical patterns, these grooves of persuasion which are even in the act of communication an inherent part of what is communicated. And this essay is no exception.

That humility feels genuine. His book was far beyond mere scholarship. It was an expression of decades of questioning and self-questioning, about what it means to be human and what it might have meant for others throughout the millennia.

He never got around to writing another book on the topic, despite his stated plans to do so. But during the last decade of his life, he wrote an afterword to his original work. It was placed in the 1990 edition, fourteen years after the original publication. He had faced much criticism and one senses a tired frustration in those last years. Elsewhere, he complained about the expectation to explain himself and make himself understood to people who, for whatever reason, didn’t understand. Still, he realized that was the nature of his job as an academic scholar working at a major university. In the afterword, he wrote:

A favorite practice of some professional intellectuals when at first faced with a theory as large as the one I have presented is to search for that loose thread which, when pulled, will unravel all the rest. And rightly so. It is part of the discipline of scientific thinking. In any work covering so much of the terrain of human nature and history, hustling into territories jealously guarded by myriad aggressive specialists, there are bound to be such errancies, sometimes of fact but I fear more often of tone. But that the knitting of this book is such that a tug on such a bad stitch will unravel all the rest is more of a hope on the part of the orthodox than a fact in the scientific pursuit of truth. The book is not a single hypothesis.

Interestingly, Jaynes doesn’t state the bicameral mind as an overarching context for the hypotheses he lists. In fact, it is just one among the several hypotheses and not even the first to be mentioned. That shouldn’t be surprising since decades of his thought and research, including laboratory studies done on animal behavior, preceded the formulation of the bicameral hypothesis. Here are the four hypotheses:

  1. Consciousness is based on language.
  2. The bicameral mind.
  3. The dating.
  4. The double brain.

He states that, “I wish to emphasize that these four hypotheses are separable. The last, for example, could be mistaken (at least in the simplified version I have presented) and the others true. The two hemispheres of the brain are not the bicameral mind but its present neurological model. The bicameral mind is an ancient mentality demonstrated in the literature and artifacts of antiquity.” Each hypothesis is connected to the others but must be dealt with separately. The key element to his project is consciousness, as that is the key problem. And as problems go, it is a doozy. Calling it a problem is like calling the moon a chunk of rock and the sun a warm fire.

Related to these hypotheses, earlier in his book, Jaynes proposes a useful framework. He calls it the General Bicameral Paradigm. “By this phrase,” he explains, “I mean an hypothesized structure behind a large class of phenomena of diminished consciousness which I am interpreting as partial holdovers from our earlier mentality.” There are four components:

  1. “the collective cognitive imperative, or belief system, a culturally agreed-on expectancy or prescription which defines the particular form of a phenomenon and the roles to be acted out within that form;”
  2. “an induction or formally ritualized procedure whose function is the narrowing of consciousness by focusing attention on a small range of preoccupations;”
  3. “the trance itself, a response to both the preceding, characterized by a lessening of consciousness or its loss, the diminishing of the analog or its loss, resulting in a role that is accepted, tolerated, or encouraged by the group; and”
  4. “the archaic authorization to which the trance is directed or related to, usually a god, but sometimes a person who is accepted by the individual and his culture as an authority over the individual, and who by the collective cognitive imperative is prescribed to be responsible for controlling the trance state.”

The point is made that the reader shouldn’t assume that they are “to be considered as a temporal succession necessarily, although the induction and trance usually do follow each other. But the cognitive imperative and the archaic authorization pervade the whole thing. Moreover, there is a kind of balance or summation among these elements, such that when one of them is weak the others must be strong for the phenomena to occur. Thus, as through time, particularly in the millennium following the beginning of consciousness, the collective cognitive imperative becomes weaker (that is, the general population tends toward skepticism about the archaic authorization), we find a rising emphasis on and complication of the induction procedures, as well as the trance state itself becoming more profound.”

This general bicameral paradigm is partly based on the insights he gained from studying ancient societies. But ultimately it can be considered separately from that. All you have to understand is that these are a basic set of cognitive abilities and tendencies that have been with humanity for a long time. These are the vestiges of human evolution and societal development. They can be combined and expressed in multiple ways. Our present society is just one of many possible manifestations. Human nature is complex and human potential is immense, and so diversity is to be expected among human neurocognition, behavior, and culture.

An important example of the general bicameral paradigm is hypnosis. It isn’t just an amusing trick done for magic shows. Hypnosis shows something profoundly odd, disturbing even, about the human mind. Also, it goes far beyond the individual for it is about how humans relate. It demonstrates the power of authority figures, in whatever form they take, and indicates the significance of what Jaynes calls authorization. By the way, this leads down the dark pathways of authoritarianism, brainwashing, propaganda, and punishment — as for the latter, Jaynes writes that:

If we can regard punishment in childhood as a way of instilling an enhanced relationship to authority, hence training some of those neurological relationships that were once the bicameral mind, we might expect this to increase hypnotic susceptibility. And this is true. Careful studies show that those who have experienced severe punishment in childhood and come from a disciplined home are more easily hypnotized, while those who were rarely punished or not punished at all tend to be less susceptible to hypnosis.

He discusses the history of hypnosis beginning with Mesmer. In this, he shows how metaphor took different form over time. And, accordingly, it altered shared experience and behavior.

Now it is critical here to realize and to understand what we might call the paraphrandic changes which were going on in the people involved, due to these metaphors. A paraphrand, you will remember, is the projection into a metaphrand of the associations or paraphiers of a metaphier. The metaphrand here is the influences between people. The metaphiers, or what these influences are being compared to, are the inexorable forces of gravitation, magnetism, and electricity. And their paraphiers of absolute compulsions between heavenly bodies, of unstoppable currents from masses of Leyden jars, or of irresistible oceanic tides of magnetism, all these projected back into the metaphrand of interpersonal relationships, actually changing them, changing the psychological nature of the persons involved, immersing them in a sea of uncontrollable control that emanated from the ‘magnetic fluids’ in the doctor’s body, or in objects which had ‘absorbed’ such from him.

It is at least conceivable that what Mesmer was discovering was a different kind of mentality that, given a proper locale, a special education in childhood, a surrounding belief system, and isolation from the rest of us, possibly could have sustained itself as a society not based on ordinary consciousness, where metaphors of energy and irresistible control would assume some of the functions of consciousness.

How is this even possible? As I have mentioned already, I think Mesmer was clumsily stumbling into a new way of engaging that neurological patterning I have called the general bicameral paradigm with its four aspects: collective cognitive imperative, induction, trance, and archaic authorization.

Through authority and authorization, immense power and persuasion can be wielded. Jaynes argues that it is central to the human mind, but that in developing consciousness we learned how to partly internalize the process. Even so, Jaynesian self-consciousness is never a permanent, continuous state and the power of individual self-authorization easily morphs back into external forms. This is far from idle speculation, considering authoritarianism still haunts the modern mind. I might add that the ultimate power of authoritarianism, as Jaynes makes clear, isn’t overt force and brute violence. Outward forms of power are only necessary to the degree that external authorization is relatively weak, as is typically the case in modern societies.

This touches upon the issue of rhetoric, although Jaynes never mentioned the topic. It’s disappointing since his original analysis of metaphor has many implications. Fortunately, others have picked up where he left off (see Ted Remington, Brian J. McVeigh, and Frank J. D’Angelo). Authorization in the ancient world came through a poetic voice, but today it is most commonly heard in rhetoric.

Still, that old time religion can be heard in the words and rhythm of any great speaker. Just listen to how a recorded speech of Martin Luther King, Jr. can pull you in with its musicality. Or, if you prefer a dark example, consider the persuasive power of Adolf Hitler: even some Jews admitted they got caught up listening to his speeches. This is why Plato feared the poets and banished them from his utopia of enlightened rule. Poetry would inevitably undermine and subsume the high-minded rhetoric of philosophers. “[P]oetry used to be divine knowledge,” as Guerini et al. put it in Echoes of Persuasion. “It was the sound and tenor of authorization and it commanded where plain prose could only ask.”

Metaphor grows naturally in poetic soil, but its seeds are planted in every aspect of language and thought, giving fruit to our perceptions and actions. This is a thousandfold true on the collective level of society and politics. Metaphors are most powerful when we don’t see them as metaphors. So, the most persuasive rhetoric is that which hides its metaphorical frame and obfuscates any attempts to bring it to light.

Going far back into the ancient world, metaphors didn’t need to be hidden in this sense. The reason for this is that there was no intellectual capacity for, or conceptual understanding of, metaphors as metaphors. Instead, metaphors were taken literally. The way people spoke about reality was inseparable from their experience of reality, and they had no way of stepping back from their cultural biases, as the cultural worldviews they existed within were all-encompassing. It’s only with the later rise of multicultural societies, especially the vast multi-ethnic trade empires, that people began to think in terms of multiple perspectives. Such a society was developing in the trading and colonizing city-states of Greece in the centuries leading up to Hellenism.

That is the well known part of Jaynes’ speculations, the basis of his proposed bicameral mind. And Jaynes considered it extremely relevant to the present.

Marcel Kuijsten wrote that, “Jaynes maintained that we are still deep in the midst of this transition from bicamerality to consciousness; we are continuing the process of expanding the role of our internal dialogue and introspection in the decision-making process that was started some 3,000 years ago. Vestiges of the bicameral mind — our longing for absolute guidance and external control — make us susceptible to charismatic leaders, cults, trends, and persuasive rhetoric that relies on slogans to bypass logic” (“Consciousness, Hallucinations, and the Bicameral Mind: Three Decades of New Research”, Reflections on the Dawn of Consciousness, Kindle Locations 2210-2213). Considering the present, in Authoritarian Grammar and Fundamentalist Arithmetic, Ben G. Price puts it starkly: “Throughout, tyranny asserts its superiority by creating a psychological distance between those who command and those who obey. And they do this with language, which they presume to control.” The point made by the latter is that this knowledge, even as it can be used as intellectual defense, might just lead to even more effective authoritarianism.

We’ve grown less fearful of rhetoric because we see ourselves as being savvy, experienced consumers of media. The cynical modern mind is always on guard, our well-developed and rigid state of consciousness offering a continuous psychological buffering against the intrusions of the world. So we like to think. I remember, back in 7th grade, being taught how the rhetoric of advertising is used to manipulate us. But we are over-confident. Consciousness operates at the surface of the psychic depths. We are better at rationalizing than being rational, something we may understand intellectually but rarely do we fully acknowledge the psychological and societal significance of this. That is the usefulness of theories like that of bicameralism, as they remind us that we are out of our depths. In the ancient world, there was a profound mistrust between the poetic and rhetorical, and for good reason. We would be wise to learn from that clash of mindsets and worldviews.

We shouldn’t be so quick to assume we understand our own minds, the kind of vessel we find ourselves on. Nor should we allow ourselves to get too comfortable within the worldview we’ve always known, the safe harbor of our familiar patterns of mind. It’s hard to think about these issues because they touch upon our own being, the surface of consciousness along with the depths below it. It is the difficult task of fathoming the ocean floor using a rope and a weight, an easier task the closer we hug the shoreline. But what might we find if we cast ourselves out on open waters? What new lands might be found, lands to be newly discovered and lands already inhabited?

We moderns love certainty. And it’s true we possess more knowledge than any civilization before us has accumulated. Yet we’ve partly made the unfamiliar into the familiar by remaking the world in our own image. There is no place on earth that remains entirely untouched. Only a couple hundred small isolated tribes are still uncontacted, representing foreign worldviews not known or studied, but even they live under unnatural conditions of stress as the larger world closes in on them. Most of the ecological and cultural diversity that once existed has been obliterated from the face of the earth, most of it having left not a single trace or record, simply gone. Populations beyond count have faced extermination by outside influences and forces before they ever got a chance to meet an outsider. Plagues, environmental destruction, and societal collapse wiped them out, often in short periods of time.

Those other cultures might have gifted us with insights about our humanity that now are lost forever, just as extinct species might have held answers to questions not yet asked and medicines for diseases not yet understood. Almost all that now is left is a nearly complete monoculture with the differences ever shrinking into the constraints of capitalist realism. If not for scientific studies done on the last of isolated tribal people, we would never know how much diversity exists within human nature. Many of the conclusions that earlier social scientists had made were based mostly on studies involving white, middle class college kids in Western countries, what some have called the WEIRD: Western, Educated, Industrialized, Rich, and Democratic. But many of those conclusions have since proven wrong, biased, or limited.

When Jaynes first thought about such matters, the social sciences were still getting established as serious fields of study. He entered college around 1940, when behaviorism was a dominant paradigm. It was only in the prior decades that the very idea of ‘culture’ began to take hold among anthropologists. He was influenced by anthropologists, directly and indirectly. One indirect influence came by way of E. R. Dodds, a classical scholar who, in writing his 1951 The Greeks and the Irrational, found inspiration in Ruth Benedict’s anthropological work comparing cultures (Benedict taking this perspective through the combination of the ideas of Franz Boas and Carl Jung). Still, anthropology was young and the fascinating cases so well known today were unknown back then (e.g., Daniel Everett’s recent books on the Pirahã). So, in following Dodds’ example, Jaynes turned to ancient societies and their literature.

His ideas were forming at the same time the social sciences were gaining respectability and maturity. It was a time when many scholars and other intellectuals were more fully questioning Western civilization. But it was also the time when Western ascendancy was becoming clear, with the end of the Ottoman Empire after WWI and of the Japanese Empire after WWII. The whole world was falling under Western cultural influence. And traditional societies were in precipitous decline. That was the dawning of the age of monoculture.

We are the inheritors of the world that was created from that wholesale destruction of all that came before. And even what came before was built on millennia of collapsing civilizations. Jaynes focused on the earliest example of mass destruction and chaos, which led him to see a stark division between what came before and what came after. How do we understand why we came to be the way we are when so much has been lost? We are forced back on our own ignorance. Jaynes apparently understood that and so considered awe to be the proper response. We know the world through our own humanity, but we can only know our own humanity through the cultural worldview we are born into. It is our words that have meaning, was Jaynes’ response, “not life or persons or the universe itself.” That is to say, we bring meaning to what we seek to understand. Meaning is created, not discovered. And the kind of meaning we create depends on our cultural worldview.

In Monoculture, F. S. Michaels writes (pp. 1-2):

THE HISTORY OF HOW we think and act, said twentieth-century philosopher Isaiah Berlin, is, for the most part, a history of dominant ideas. Some subject rises to the top of our awareness, grabs hold of our imagination for a generation or two, and shapes our entire lives. If you look at any civilization, Berlin said, you will find a particular pattern of life that shows up again and again, that rules the age. Because of that pattern, certain ideas become popular and others fall out of favor. If you can isolate the governing pattern that a culture obeys, he believed, you can explain and understand the world that shapes how people think, feel and act at a distinct time in history.1

The governing pattern that a culture obeys is a master story — one narrative in society that takes over the others, shrinking diversity and forming a monoculture. When you’re inside a master story at a particular time in history, you tend to accept its definition of reality. You unconsciously believe and act on certain things, and disbelieve and fail to act on other things. That’s the power of the monoculture; it’s able to direct us without us knowing too much about it.

Over time, the monoculture evolves into a nearly invisible foundation that structures and shapes our lives, giving us our sense of how the world works. It shapes our ideas about what’s normal and what we can expect from life. It channels our lives in a certain direction, setting out strict boundaries that we unconsciously learn to live inside. It teaches us to fear and distrust other stories; other stories challenge the monoculture simply by existing, by representing alternate possibilities.

Jaynes argued that ideas are more than mere concepts. Ideas are embedded in language and metaphor. And ideas take form not just as culture but as entire worldviews built on interlinked patterns of attitudes, thought, perception, behavior, and identity. Taken together, this is the reality tunnel we exist within.

It takes a lot to shake us loose from these confines of the mind. Certain practices, from meditation to imbibing psychedelics, can temporarily or permanently alter the matrix of our identity. Jaynes, for reasons of his own, came to question the inevitability of the society around him which allowed him to see that other possibilities may exist. The direction his queries took him landed him in foreign territory, outside of the idolized individualism of Western modernity.

His ideas might have been less challenging in a different society. We modern Westerners identify ourselves with our thoughts, the internalized voice of egoic consciousness. And we see this as the greatest prize of civilization, the hard-won rights and freedoms of the heroic individual. It’s the story we tell. But in other societies, such as in the East, there are traditions that teach the self is distinct from thought. From the Buddhist perspective of dependent (co-)origination, it is a much less radical notion that the self arises out of thought, instead of the other way around, and that thought itself simply arises. A Buddhist would have a much easier time intuitively grasping the theory of bicameralism, that thoughts are greater than and precede the self.

Maybe we modern Westerners need to practice a sense of awe, to inquire more deeply. Jaynes offers a different way of thinking that doesn’t even require us to look to another society. If he is correct, this radical worldview is at the root of Western Civilization. Maybe the traces of the past are still with us.

* * *

The Origin of Rhetoric in the Breakdown of the Bicameral Mind
by Ted Remington

Endogenous Hallucinations and the Bicameral Mind
by Rick Strassman

Consciousness and Dreams
by Marcel Kuijsten, Julian Jaynes Society

Ritual and the Consciousness Monoculture
by Sarah Perry, Ribbonfarm

“I’m Nobody”: Lyric Poetry and the Problem of People
by David Baker, The Virginia Quarterly Review

It is in fact dangerous to assume a too similar relationship between those ancient people and us. A fascinating difference between the Greek lyricists and ourselves derives from the entity we label “the self.” How did the self come to be? Have we always been self-conscious, of two or three or four minds, a stew of self-aware voices? Julian Jaynes thinks otherwise. In The Origin of Consciousness in the Breakdown of the Bicameral Mind—that famous book my poetry friends adore and my psychologist friends shrink from—Jaynes surmises that the early classical mind, still bicameral, shows us the coming-into-consciousness of the modern human, shows our double-minded awareness as, originally, a haunted hearing of voices. To Jaynes, thinking is not the same as consciousness: “one does one’s thinking before one knows what one is to think about.” That is, thinking is not synonymous with consciousness or introspection; it is rather an automatic process, notably more reflexive than reflective. Jaynes proposes that epic poetry, early lyric poetry, ritualized singing, the conscience, even the voices of the gods, all are one part of the brain learning to hear, to listen to, the other.

Auditory Hallucinations: Psychotic Symptom or Dissociative Experience?
by Andrew Moskowitz & Dirk Corstens

Voices heard by persons diagnosed schizophrenic appear to be indistinguishable, on the basis of their experienced characteristics, from voices heard by persons with dissociative disorders or by persons with no mental disorder at all.

Neuroimaging, auditory hallucinations, and the bicameral mind.
by L. Sher, Journal of Psychiatry and Neuroscience

Olin suggested that recent neuroimaging studies “have illuminated and confirmed the importance of Jaynes’ hypothesis.” Olin believes that recent reports by Lennox et al and Dierks et al support the bicameral mind. Lennox et al reported a case of a right-handed subject with schizophrenia who experienced a stable pattern of hallucinations. The authors obtained images of repeated episodes of hallucination and observed its functional anatomy and time course. The patient’s auditory hallucination occurred in his right hemisphere but not in his left.

What Is It Like to Be Nonconscious?: A Defense of Julian Jaynes
by Gary William, Phenomenology and the Cognitive Sciences

To explain the origin of consciousness is to explain how the analog “I” began to narratize in a functional mind-space. For Jaynes, to understand the conscious mind requires that we see it as something fleeting rather than something always present. The constant phenomenality of what-it-is-like to be an organism is not equivalent to consciousness and, subsequently, consciousness must be thought in terms of the authentic possibility of consciousness rather than its continual presence.

Defending Damasio and Jaynes against Block and Gopnik
by Emilia Barile, Phenomenology Lab

When Jaynes says that there was “nothing it is like” to be preconscious, he certainly didn’t mean to say that nonconscious animals are somehow not having subjective experience in the sense of “experiencing” or “being aware” of the world. When Jaynes said there is “nothing it is like” to be preconscious, he means that there is no sense of mental interiority and no sense of autobiographical memory. Ask yourself what it is like to be driving a car and then suddenly wake up and realize that you have been zoned out for the past minute. Was there something it is like to drive on autopilot? This depends on how we define “what it is like”.

“The Evolution of the Analytic Topoi: A Speculative Inquiry”
by Frank J. D’Angelo
from Essays on Classical Rhetoric and Modern Discourse
ed. Robert J. Connors, Lisa S. Ede, & Andrea A. Lunsford
pp. 51-5

The first stage in the evolution of the analytic topoi is the global stage. Of this stage we have scanty evidence, since we must assume the ontogeny of invention in terms of spoken language long before the individual is capable of anything like written language. But some hints of how logical invention might have developed can be found in the work of Eric Havelock. In his Preface to Plato, Havelock, in recapitulating the educational experience of the Homeric and post-Homeric Greek, comments that the psychology of the Homeric Greek is characterized by a high degree of automatism.

He is required as a civilised being to become acquainted with the history, the social organisation, the technical competence and the moral imperatives of his group. This in turn is able to function only as a fragment of the total Hellenic world. It shares a consciousness in which he is keenly aware that he, as a Hellene, in his memory. Such is poetic tradition, essentially something he accepts uncritically, or else it fails to survive in his living memory. Its acceptance and retention are made psychologically possible by a mechanism of self-surrender to the poetic performance and of self-identification with the situations and the stories related in the performance. . . . His receptivity to the tradition has thus, from the standpoint of inner psychology, a degree of automatism which however is counter-balanced by a direct and unfettered capacity for action in accordance with the paradigms he has absorbed. 6

Preliterate man was apparently unable to think logically. He acted, or as Julian Jaynes, in The Origin of Consciousness in the Breakdown of the Bicameral Mind, puts it, “reacted” to external events. “There is in general,” writes Jaynes, “no consciousness in the Iliad . . . and in general therefore, no words for consciousness or mental acts.” 7 There was, in other words, no subjective consciousness in Iliadic man. His actions were not rooted in conscious plans or in reasoning. We can only speculate, then, based on the evidence given by Havelock and Jaynes that logical invention, at least in any kind of sophisticated form, could not take place until the breakdown of the bicameral mind, with the invention of writing. If ancient peoples were unable to introspect, then we must assume that the analytic topoi were a discovery of literate man. Eric Havelock, however, warns that the picture he gives of Homeric and post-Homeric man is oversimplified and that there are signs of a latent mentality in the Greek mind. But in general, Homeric man was more concerned to go along with the tradition than to make individual judgments.

For Iliadic man to be able to think, he must think about something. To do this, states Havelock, he had to be able to revolt against the habit of self-identification with the epic poem. But identification with the poem at this time in history was necessary psychologically (identification was necessary for memorization) and in the epic story implicitly as acts or events that are carried out by important people, must be abstracted from the narrative flux. “Thus the autonomous subject who no longer recalls and feels, but knows, can now be confronted with a thousand abstract laws, principles, topics, and formulas which become the objects of his knowledge.” 8

The analytic topoi, then, were implicit in oral poetic discourse. They were “experienced” in the patterns of epic narrative, but once they are abstracted they can become objects of thought as well as of experience. As Eric Havelock puts it,

If we view them [these abstractions] in relation to the epic narrative from which, as a matter of historical fact, they all emerged they can all be regarded as in one way or another classifications of an experience which was previously “felt” in an unclassified medley. This was as true of justice as of motion, of goodness as of body or space, of beauty as of weight or dimension. These categories turn into linguistic counters, and become used as a matter of course to relate one phenomenon to another in a non-epic, non-poetic, non-concrete idiom. 9

The invention of the alphabet made it easier to report experience in a non-epic idiom. But it might be a simplification to suppose that the advent of alphabetic technology was the only influence on the emergence of logical thinking and the analytic topics, although perhaps it was the major influence. Havelock contends that the first “proto-thinkers” of Greece were the poets who at first used rhythm and oral formulas to attempt to arrange experience in categories, rather than in narrative events. He mentions in particular that it was Hesiod who first parts company with the narrative in the Theogony and Works and Days. In Works and Days, Hesiod uses a cataloging technique, consisting of proverbs, aphorisms, wise sayings, exhortations, and parables, intermingled with stories. But this effect of cataloging that goes “beyond the plot of a story in order to impose a rough logic of topics . . . presumes that Hesiod is 10

The kind of material found in the catalogs of Hesiod was more like the cumulative commonplace material of the Renaissance than the abstract topics that we are familiar with today. Walter Ong notes that “the oral performer, poet or orator needed a stock of material to keep him going. The doctrine of the commonplaces is, from one point of view, the codification of ways of assuring and managing this stock.” 11 We already know what some of the material was like: stock epithets, figures of speech, exempla, proverbs, sententiae, quotations, praises or censures of people and things, and brief treatises on virtues and vices. By the time we get to the invention of printing, there are vast collections of this commonplace material, so vast, relates Ong, that scholars could probably never survey it all. Ong goes on to observe that

print gave the drive to collect and classify such excerpts a potential previously undreamed of. . . . the ranging of items side by side on a page once achieved, could be multiplied as never before. Moreover, printed collections of such commonplace excerpts could be handily indexed; it was worthwhile spending days or months working up an index because the results of one’s labors showed fully in thousands of copies. 12

To summarize, then, in oral cultures rhetorical invention was bound up with oral performance. At this stage, both the cumulative topics and the analytic topics were implicit in epic narrative. Then the cumulative commonplaces begin to appear, separated out by a cataloging technique from poetic narrative, in sources such as the Theogony and Works and Days. Eric Havelock points out that in Hesiod, the catalog “has been isolated or abstracted . . . out of a thousand contexts in the rich reservoir of oral tradition. … A general world view is emerging in isolated or ‘abstracted’ form.” 13 Apparently, what we are witnessing is the emergence of logical thinking. Julian Jaynes describes the kind of thought to be found in the Works and Days as “preconscious hypostases.” Certain lines in Hesiod, he maintains, exhibit “some kind of bicameral struggle.” 14

The first stage, then, of rhetorical invention is that in which the analytic topoi are embedded in oral performance in the form of commonplace material as “relationships” in an undifferentiated matrix. Oral cultures preserve this knowledge by constantly repeating the fixed sayings and formulae. Mnemonic patterns, patterns of repetition, are not added to the thought of oral cultures. They are what the thought consists of.

Emerging selves: Representational foundations of subjectivity
by Wolfgang Prinz, Consciousness and Cognition

What, then, may mental selves be good for and why have they emerged during evolution (or, perhaps, human evolution or even early human history)? Answers to these questions used to take the form of stories explaining how the mental self came about and what advantages were associated with it. In other words, these are theories that construct hypothetical scenarios offering plausible explanations for why certain (groups of) living things that initially do not possess a mental self gain fitness advantages when they develop such an entity—with the consequence that they move from what we can call a self-less to a self-based or “self-morphic” state.

Modules for such scenarios have been presented occasionally in recent years by, for example, Dennett, 1990 and Dennett, 1992, Donald (2001), Edelman (1989), Jaynes (1976), Metzinger, 1993 and Metzinger, 2003, or Mithen (1996). Despite all the differences in their approaches, they converge around a few interesting points. First, they believe that the transition between the self-less and self-morphic state occurred at some stage during the course of human history—and not before. Second, they emphasize the cognitive and dynamic advantages accompanying the formation of a mental self. And, third, they also discuss the social and political conditions that promote or hinder the constitution of this self-morphic state. In the scenario below, I want to show how these modules can be keyed together to form a coherent construction. […]

Thus, where do thoughts come from? Who or what generates them, and how are they linked to the current perceptual situation? This brings us to a problem that psychology describes as the problem of source attribution (Heider, 1958).

One obvious suggestion is to transfer the schema for interpreting externally induced messages to internally induced thoughts as well. Accordingly, thoughts are also traced back to human sources and, likewise, to sources that are present in the current situation. Such sources can be construed in completely different ways. One solution is to trace the occurrence of thoughts back to voices—the voices of gods, priests, kings, or ancestors, in other words, personal authorities that are believed to have an invisible presence in the current situation. Another solution is to locate the source of thoughts in an autonomous personal authority bound to the body of the actor: the self.

These two solutions to the attribution problem differ in many ways: historically, politically, and psychologically. In historical terms, the former must be markedly older than the latter. The transition from one solution to the other and the mentalities associated with them are the subject of Julian Jaynes’s speculative theory of consciousness. He even considers that this transfer occurred during historical times: between the Iliad and the Odyssey. In the Iliad, according to Jaynes, the frame of mind of the protagonists is still structured in a way that does not perceive thoughts, feelings, and intentions as products of a personal self, but as the dictates of supernatural voices. Things have changed in the Odyssey: Odysseus possesses a self, and it is this self that thinks and acts. Jaynes maintains that the modern consciousness of Odysseus could emerge only after the self had taken over the position of the gods (Jaynes, 1976; see also Snell, 1975).

Moreover, it is obvious why the political implications of the two solutions differ so greatly: Societies whose members attribute their thoughts to the voices of mortal or immortal authorities produce castes of priests or nobles that claim to be the natural authorities or their authentic interpreters and use this to derive legitimization for their exercise of power. It is only when the self takes the place of the gods that such castes become obsolete, and authoritarian constructions are replaced by other political constructions that base the legitimacy for their actions on the majority will of a large number of subjects who are perceived to be autonomous.

Finally, an important psychological difference is that the development of a self-concept establishes the precondition for individuals to become capable of perceiving themselves as persons with a coherent biography. Once established, the self becomes involved in every re-presentation and representation as an implicit personal source, and just as the same body is always present in every perceptual situation, it is the same mental self that remains identical across time and place. […]

According to the cognitive theories of schizophrenia developed in the last decade (Daprati et al., 1997; Frith, 1992), these symptoms can be explained with the same basic pattern that Julian Jaynes uses in his theory to characterize the mental organization of the protagonists in the Iliad. Patients with delusions suffer from the fact that the standardized attribution schema that localizes the sources of thoughts in the self is not available to them. Therefore, they need to explain the origins of their thoughts, ideas, and desires in another way (see, e.g., Stephens & Graham, 2000). They attribute them to person sources that are present but invisible—such as relatives, physicians, famous persons, or extraterrestrials. Frequently, they also construct effects and mechanisms to explain how the thoughts proceeding from these sources are communicated, by, for example, voices or pictures transmitted over rays or wires, and nowadays frequently also over phones, radios, or computers. […]

As bizarre as these syndromes seem against the background of our standard concept of subjectivity and personhood, they fit perfectly with the theoretical idea that mental selves are not naturally given but rather culturally constructed, and in fact set up in, attribution processes. The unity and consistency of the self are not a natural necessity but a cultural norm, and when individuals are exposed to unusual developmental and life conditions, they may well develop deviant attribution patterns. Whether these deviations are due to disturbances in attribution to persons or to disturbances in dual representation cannot be decided here. Both biological and societal conditions are involved in the formation of the self, and when they take an unusual course, the causes could lie in both domains.


“The Varieties of Dissociative Experience”
by Stanley Krippner
from Broken Images, Broken Selves: Dissociative Narratives in Clinical Practice
pp. 339-341

In his provocative description of the evolution of humanity’s conscious awareness, Jaynes (1976) asserted that ancient people’s “bicameral mind” enabled them to experience auditory hallucinations— the voices of the deities— but they eventually developed an integration of the right and left cortical hemispheres. According to Jaynes, vestiges of this dissociation can still be found, most notably among the mentally ill, the extremely imaginative, and the highly suggestible. Even before the development of the cortical hemispheres, the human brain had slowly evolved from a “reptilian brain” (controlling breathing, fighting, mating, and other fixed behaviors), to the addition of an “old-mammalian brain,” (the limbic system, which contributed emotional components such as fear, anger, and affection), to the superimposition of a “new-mammalian brain” (responsible for advanced sensory processing and thought processes). MacLean (1977) describes this “triune brain” as responsible, in part, for distress and inefficiency when the parts do not work well together. Both Jaynes’ and MacLean’s theories are controversial, but I believe that there is enough autonomy in the limbic system and in each of the cortical hemispheres to justify Ornstein’s (1986) conclusion that human beings are much more complex and intricate than they imagine, consisting of “an uncountable number of small minds” (p. 72), sometimes collaborating and sometimes competing. Donald’s (1991) portrayal of mental evolution also makes use of the stylistic differences of the cerebral hemisphere, but with a greater emphasis on neuropsychology than Jaynes employs. Mithen’s (1996) evolutionary model is a sophisticated account of how specialized “cognitive domains” reached the point that integrated “cognitive fluidity” (apparent in art and the use of symbols) was possible.

James (1890) spoke of a “multitude” of selves, and some of these selves seem to go their separate ways in posttraumatic stress disorder (PTSD) (see Greening, Chapter 5), dissociative identity disorder (DID) (see Levin, Chapter 6), alien abduction experiences (see Powers, Chapter 9), sleep disturbances (see Barrett, Chapter 10), psychedelic drug experiences (see Greenberg, Chapter 11), death terrors (see Lapin, Chapter 12), fantasy proneness (see Lynn, Pintar, & Rhue, Chapter 13), near-death experiences (NDEs) (see Greyson, Chapter 7), and mediumship (see Grosso, Chapter 8). Each of these conditions can be placed into a narrative construction, and the value of these frameworks has been described by several authors (e.g., Barclay, Chapter 14; Lynn, Pintar, & Rhue, Chapter 13; White, Chapter 4). Barclay (Chapter 14) and Powers (Chapter 15) have addressed the issue of narrative veracity and validation, crucial issues when stories are used in psychotherapy. The American Psychiatric Association’s Board of Trustees (1993) felt constrained to issue an official statement that “it is not known what proportion of adults who report memories of sexual abuse were actually abused” (p. 2). Some reports may be fabricated, but it is more likely that traumatic memories may be misconstrued and elaborated (Steinberg, 1995, p. 55). Much of the same ambiguity surrounds many other narrative accounts involving dissociation, especially those described by White (Chapter 4) as “exceptional human experiences.”

Nevertheless, the material in this book makes the case that dissociative accounts are not inevitably uncontrolled and dysfunctional. Many narratives considered “exceptional” from a Western perspective suggest that dissociation once served and continues to serve adaptive functions in human evolution. For example, the “sham death” reflex found in animals with slow locomotor abilities effectively offers protection against predators with greater speed and agility. Uncontrolled motor responses often allow an animal to escape from dangerous or frightening situations through frantic, trial-and-error activity (Kretchmer, 1926). Many evolutionary psychologists have directed their attention to the possible value of a “multimodular” human brain that prevents painful, unacceptable, and disturbing thoughts, wishes, impulses, and memories from surfacing into awareness and interfering with one’s ongoing contest for survival (Nesse & Lloyd, 1992, p. 610). Ross (1991) suggests that Western societies suppress this natural and valuable capacity at their peril.

The widespread prevalence of dissociative reactions argues for their survival value, and Ludwig (1983) has identified seven of them: (1) The capacity for automatic control of complex, learned behaviors permits organisms to handle a much greater work load in as smooth a manner as possible; habitual and learned behaviors are permitted to operate with a minimum expenditure of conscious control. (2) The dissociative process allows critical judgment to be suspended so that, at times, gratification can be more immediate. (3) Dissociation seems ideally suited for dealing with basic conflicts when there is no instant means of resolution, freeing an individual to take concerted action in areas lacking discord. (4) Dissociation enables individuals to escape the bounds of reality, providing for inspiration, hope, and even some forms of “magical thinking.” (5) Catastrophic experiences can be isolated and kept in check through dissociative defense mechanisms. (6) Dissociative experiences facilitate the expression of pent-up emotions through a variety of culturally sanctioned activities. (7) Social cohesiveness and group action often are facilitated by dissociative activities that bind people together through heightened suggestibility.

Each of these potentially adaptive functions may be life-depotentiating as well as life-potentiating; each can be controlled as well as uncontrolled. A critical issue for the attribution of dissociation may be the dispositional set of the experiencer-in-context along with the event’s adaptive purpose. Salamon (1996) described her mother’s ability to disconnect herself from unpleasant surroundings or facts, a proclivity that led to her ignoring the oncoming imprisonment of Jews in Nazi Germany but that, paradoxically, enabled her to survive her years in Auschwitz. Gergen (1991) has described the jaundiced eye that modern Western science has cast toward Dionysian revelry, spiritual experiences, mysticism, and a sense of bonded unity with nature, a hostility he predicts may evaporate in the so-called “postmodern” era, which will “open the way to the full expression of all discourses” (pp. 246– 247). For Gergen, this postmodern lifestyle is epitomized by Proteus, the Greek sea god, who could change his shape from wild boar to dragon, from fire to flood, without obvious coherence through time. This is all very well and good, as long as this dissociated existence does not leave— in its wake— a residue of broken selves whose lives have lost any intentionality or meaning, who live in the midst of broken images, and whose multiplicity has resulted in nihilistic affliction and torment rather than in liberation and fulfillment (Glass, 1993, p. 59).

Why not?

Hearing voices. We all hear voices, both in and outside our heads. But obviously not all voice-hearing is the same.

Some people hear voices that others don’t hear. Children talk to imaginary friends who talk back to them. Schizophrenics hear all kinds of voices, from disembodied beings to the thoughts in other people’s heads. Even ordinary people, during periods of grief and stress, hear voices that can’t be explained (the Third Man Factor popularized by John Geiger). Studies show this is a lot more common than people realize, because few people talk about the voices they hear for fear of being called crazy.

These voices are as real to those hearing them as the voice of a physical person speaking before their eyes. According to Julian Jaynes, entire ancient societies were based on this kind of experience, and he proposed that visual hallucinations often accompanied them. As such, it would have been the only reality the bicameral mind knew. Our sense of reality is nothing more than what we and those around us experience.

Some people dismiss Jaynes’ speculations. He hasn’t always been respectable, quite the opposite, although his intellectual currency has been rising. Over time, more and more people have taken him seriously, even when uncertain what to make of his theory. It’s such an intriguing possibility based on evidence that typically gets ignored and dismissed. But bicameralism or not, the evidence remains to be explained.

Indeed, it is challenging to make sense of it. As Tanya Luhrmann, a Stanford anthropologist trained in psychology, put it simply: “Julian Jaynes blew my mind.” It didn’t just blow her mind; it also set the course of her professional career. The research she has done follows from the possibility that Jaynes first presented. In her work, she has looked at how different cultures relate to voice-hearing. She has compared cultural experiences and religious experiences, both among schizophrenics and the mentally healthy.

Her book on Evangelicals hearing God’s voice is what got my attention. I liked her approach. She treats her subjects with respect and tries to understand them on their own terms. It reminded me of Jaynes’ own approach to ancient people, to take them at their word and consider the possibility that they actually meant what they said.

What if we took all people seriously, not just those who confirm our biases? What if we tried to understand their stated experience, instead of rationalizing it away according to present social norms? Why not?

* * *

When God Talks Back
by T.M. Luhrmann

Our Most Troubling Madness
edited by T.M. Luhrmann & Jocelyn Marrow

Is That God Talking?
T. M. Luhrmann

My Take: If you hear God speak audibly, you (usually) aren’t crazy
by T.M. Luhrmann

Living With Voices
by T. M. Luhrmann

Toward an Anthropological Theory of Mind
by T. M. Luhrmann

Cognitive Science, Learning, and ‘Theory of Mind’
by Ann Taves

Hallucinatory ‘voices’ shaped by local culture, Stanford anthropologist says
by Clifton B. Parker

The voices heard by people with schizophrenia are friendlier in India and Africa, than in the US
by Christian Jarrett

Tanya Luhrmann, hearing voices in Accra and Chenai
by Greg Downey

Hallucinated voices’ attitudes vary with culture
by Bruce Bower

Psychotic Voices In Your Head Depend On Culture You’re From: Friendly In Ghana, Evil In America
by Chris Weller

More Evidence for Vestigial Bicamerality
by Gary Williams

Aren’t Irish White?

I think it cannot be maintained by any candid person that the African race have ever occupied or do promise ever to occupy any very high place in the human family. Their present condition is the strongest proof that they cannot. The Irish cannot; the American Indian cannot; the Chinese cannot. Before the energy of the Caucasian race all the other races have quailed and done obeisance.

Ralph Waldo Emerson wrote those words in the late 1820s or early 1830s (Journals and Miscellaneous Notebooks, Volume 12). Someone asked, in response to that quote, “aren’t Irish white?” Well, to the younger Emerson, obviously the Irish weren’t white or rather weren’t Caucasian.

Another great American luminary was Walt Whitman, a close acquaintance of Emerson. In a personal letter, he called Emerson “dear Friend and Master,” and the admiration was mutual, Emerson even having penned Whitman a letter of recommendation. In the following decade, writing about Catholics after St. Patrick’s Cathedral was attacked by a Protestant mob, Whitman echoed Emerson’s attitude in describing the Irish faith as a “horrible and beastly superstition . . . dregs of foreign filth” (from The New York Aurora). Beastly! That was once a common way of speaking of the Irish, with not just their whiteness but their very humanity under the severest of doubt.

They both were writing at a time when the large waves of Irish immigrants were seen by American WASPs as one of the greatest threats. Think about it. In the decades prior, there had been several Irish rebellions, all of which failed. This had led to many Irish seeking to escape to other countries, most of them ending up in the United States. The English were more than glad to get rid of them. Those of English ancestry in the U.S., however, weren’t so glad to receive them. That Americans had fought a revolution against the British a half century before didn’t make them any more sympathetic to the Irish cause, much less the Irish people.

I know it seems strange compared to the world now. But the US once was a far different place. It’s just a fact that the Irish, Scots-Irish, Italians, etc. weren’t always considered white or Caucasian. There are entire books explaining this history. One such book is The History of White People by Nell Irvin Painter, in which she discusses the above Emerson quote, and a few paragraphs later she writes:

After the failure of the Hungarian revolution in 1848 and Lajos Kossuth’s triumphant tour as a hero in exile, Emerson found a way to view the Hungarian situation through an Irish lens: “The paddy period lasts long. Hungary, it seems, must take the yoke again, & Austria, & Italy, & Prussia, & France. Only the English race can be trusted with freedom.” Emerson pontificated against Central Europeans as well as the Irish: “Races. Our idea, certainly, of Poles & Hungarians is little better than of horses recently humanized.”

Back in the day, whiteness as an idea was mixed up with nationality, ethnicity, and religion. The Irish (and other immigrant groups) weren’t English, weren’t of Anglo/Germanic stock, and generally weren’t Protestant. Although assimilating better than later immigrants, even the Germans early on were treated differently. Benjamin Franklin was prejudiced against Palatine Germans and perceived them as almost racially other—since they were shorter and darker-skinned, along with speaking a different language, having a different culture, and being of different religions (at the time, many were Pietists or Anabaptists, separate from the Protestant tradition).

All those who weren’t WASPs were perceived as foreigners, and they indeed often looked different—different color of skin, different color of hair, different attire, etc. Italians, in the 1800s, were sometimes referred to as ‘niggers’ because of their dark skin and dark, curly hair. The Irish, despite their pale skin and lighter hair, were also compared to Africans and Native Americans, portrayed as ape-like and called gorillas, sometimes referred to as savages, and their children in the cities dismissed as Street Arabs (Catholicism was seen as being as foreign as Islam). Painter, in The History of White People, states:

AMERICAN VISUAL culture testifies to a widespread fondness for likening the Irishman to the Negro. No one supplied better fodder for this parallel than Thomas Nast, the German-born editorial cartoonist for Harper’s Weekly. In 1876, for instance, Nast pictured stereotypical southern freedmen and northern Irishmen as equally unsuited for the vote during Reconstruction after the American Civil War.

As with the Scottish and Scots-Irish, the Irish were seen as a tribal people, not quite civilized. In early America, poor ethnics (i.e., white trash) were associated with Native Americans, sometimes seen as below them—from White Trash by Nancy Isenberg (pp. 109-110):

“Crackers” first appeared in the records of British officials in the 1760s and described a population with nearly identical traits. In a letter to Lord Dartmouth, one colonial British officer explained that the people called “crackers” were “great boasters,” a “lawless set of rascals on the frontiers of Virginia, Maryland, the Carolinas and Georgia, who often change their places of abode.” As backcountry “banditti,” “villains,” and “horse thieves,” they were dismissed as “idle strag[g]lers” and “a set of vagabonds often worse than the Indians.”

The children of Irish-Americans and other non-English ethnics in Eastern cities were regularly gathered up and put on orphan trains to be sent off West to be adopted. But in reality it usually meant a form of indentured servitude, as they were often used as cheap labor. This practice, which began in the 19th century, continued into the early 20th century. It played a role in the Irish becoming white, as I explained previously:

WASPs, in their fear of Catholics, intentionally placed Catholic children into Protestant homes. In response, Catholics began to implement their own programs to deal with Catholic children in need of homes. One such case involved nuns bringing a trainload of Irish orphans to Arizona to be adopted by Catholic families. The problem was that the Catholic families in question were Mexican-American. The nuns didn’t understand the local racism/ethnocentrism involved and were taken by surprise by the response of the local WASPs. The “white” population living there took great offense at this challenge to racial purity. Suddenly, when put into that context, the Irish children were deemed to be innocent whites to be protected against an inferior race. This is ironic because where those Irish children came from in the big cities out East they were considered the inferior race.

It still took a long time for the Irish to become fully white.

Consider another example: white flight and ethnic succession. This was in reality a lot more complex. Different groups were escaping various other groups over time. Which groups were deemed most inferior, undesirable, and threatening was always shifting. Early on, into the 20th century, the Irish were a focus of fear and derision—Prohibitionists often had the Irish in mind when they sought to enforce social control over the perceived drunken masses. Even other minorities, blacks included, sometimes thought it best to escape the Irish. Certainly, the more well-off whites didn’t want them in their neighborhoods, not until the mid-20th century when the Irish had moved a bit further up the ladder of economic class.

It demanded centuries of struggle—from political disenfranchisement and economic oppression by the English in Ireland, not unlike slavery and sometimes worse (as during the mass starvation and deportation of the artificially created Potato Famine), to finally being assimilated into American whiteness. That path toward respectability and relative privilege wasn’t inevitable and wouldn’t have been obvious to earlier generations. It wasn’t obvious to 19th century WASPs such as Emerson and Whitman, two white men who thought Irish advancement implausible and Irish aspirations threatening.

It’s sad, of course, that Irish-Americans shoved down African-Americans and Chinese-Americans as they pushed themselves up. They defied the stereotypes of the Irish Paddy and Bridget, even as they promoted the stereotypes of others. This is the story of America. If Emerson and Whitman had lived longer, the Irish might have finally won some grudging admiration in joining the ranks of whiteness and defending the racial order. Or maybe those early American WASPs wouldn’t have recognized this broader notion of the white race, the American mutt—it’s not the country they had envisioned as their own.

* * *

Why did the English people previously see the Irish and Scottish Celts as racially inferior?
by samj234, Reddit

The Teen Who Exposed a Professor’s Myth
by Ben Collins, The Daily Beast

The Irish were persecuted in the American job market—and precisely in the overt, literally written-down way that was always believed.

Irish-Americans, Racism and the Pursuit of Whiteness
by Jessie Daniels, Racism Review

Like many immigrant groups in the United States, the Irish were characterized as racial Others when they first arrived in the first half of the 19th century. The Irish had suffered profound injustice in the U.K. at the hands of the British and were widely seen as “white negroes.” The potato famine that created starvation conditions that cost the lives of millions of Irish and forced the out-migration of millions of surviving ones was less a natural disaster and more a complex set of social conditions created by British landowners (much like Hurricane Katrina). Forced to flee from their native Ireland and the oppressive British landowners, many Irish came to the U.S.

Once in the U.S., the Irish were subjected to negative stereotyping that was very similar to that of enslaved Africans and African Americans. The comic Irishman – happy, lazy, stupid, with a gift for music and dance – was a stock character in American theater. Drunkenness and criminality were major themes of Irish stereotypes […]

The simian, or ape-like, caricature of the Irish immigrant was also a common one among the mainstream news publications of the day (much like the recent New York Post cartoon). For example, in 1867 American cartoonist Thomas Nast drew “The Day We Celebrate,” a cartoon depicting the Irish on St. Patrick’s Day as violent, drunken apes. And, in 1899, Harper’s Weekly featured a drawing of three men’s heads in profile: Irish, Anglo-Teutonic and Negro, in order to illustrate the similarity between the Irish and the Negro (and the supposed superiority of the Anglo-Teutonic). In northern states, blacks and Irish immigrants were forced into overlapping – often integrated – slum neighborhoods. Although leaders of the Irish liberation struggle (in Ireland) saw slavery as an evil, their Irish-American cousins largely aligned with the slaveholders.

And, following the end of slavery, the Irish and African Americans were forced to compete for the same low-wage, low-status jobs. So, the “white negroes” of the U.K. came to the United States and, though not enslaved, faced a status almost as low as that of recently-freed blacks. While there were moments of solidarity between Irish and African Americans, this was short-lived.

IRISH AS SUB-HUMAN
by Michele Walfred, Thomas Nast Cartoons

The Irish-as-ape stereotype frequently surfaces, as a popular trope, with the English in the mid-nineteenth century. But in Nothing But the Same Old Story, researcher Liz Curtis provides plentiful examples that establish anti-Irish sentiment as a centuries-long tradition.

Dehumanizing the Irish by drawing them as beasts or primates served as a convenient technique for any conqueror, and it made perfect sense for an English empire intent on placing Ireland and its people under its jurisdiction and control. The English needed to prove the backwardness of the Irish to justify their colonization (16). When the Irish fought back against English oppression, their violence only perpetuated the “violent beast” prejudice held against them.

English artist James Gillray had drawn the Irish as an ogre – a type of humanoid beast – in reaction to the short-lived Irish rebellion against England in 1798. Even before English scientific circles had begun to distort Darwin’s On the Origin of Species later in the century, the English had favored the monkey and ape as a symbol for Hibernians.

After the Irish had made great social and political gains in the latter part of the nineteenth century, the view that they were of a different race than white people continued to persist…

Nativism
by Michele Walfred, Thomas Nast Cartoons

In America, Higham distills this down to three themes that ran through nativist sentiment in the early nineteenth century: Reformation-era hatred of Roman Catholicism, fear of foreign radicals and political revolutionaries, and racial nativism, which led to the belief that America belonged to people of the Anglo-Saxon race. The United States was their domain. The Irish were viewed as a different race, and this belief continued to permeate long after the initial Protestant-driven nativist sentiment had considerably weakened. […]

“American writers, cartoonists, and so-called scientific experts hammered away at Irish violence, emotional instability, and contentment in squalor” (Meagher 217). In the eyes of Protestants with ancestral ties to England, the Irish were no better than animals. The Irish presented a triple threat. Their growing numbers, allegiance to strong, organized religion ruled by a foreign monarch, and political gains within Tweed’s Democratic Party, all posed a serious concern to the Protestant elite.

Protestant nativists fought for their survival and painted the Irish as “others.” They eagerly adopted and repeated the British trope of the Irish as unsophisticated, violence-prone animals, lower beings on the evolutionary scale. The Irish’s faith, and in particular their blind allegiance to a foreign pontiff, unsettled nativists. Protestant Americans remembered the hard-fought revolutionary history of their young nation. During the peak years of the potato famine migration (1845-1855), nativists portrayed the Irish in invasion terminology. Nativists predicted the American way of life would end.

By 1880, by and large, the Irish successfully pulled themselves out of their “lowlife” status in a number of ways. They gained respect through their service in the Civil War on behalf of the Union, and in New York City, through political positions awarded by William M. “Boss” Tweed in return for their loyalty and vote. With these gains in respectability and power, the Irish emerged as a sought-after voting bloc. But politics alone was not enough to counter nativist prejudice. Most significantly, the Irish fought hard to define themselves as white. To do so meant practicing their own brand of nativism and aligning with other xenophobes. The Chinese were a convenient target.

In assessing the work of several “whiteness” studies, historian Timothy Meagher asserts that self-identification as “white” went beyond skin color. “It was not clear that the Irish were white” (217).

America’s dark and not-very-distant history of hating Catholics
by Rory Carroll, The Guardian

Demagogues in the nativist movement incited fury and fear about the huge numbers of impoverished German and Irish Catholic immigrants, many barely speaking English, who spilled off ships.

Newspapers and Protestant clergymen, including Lyman Beecher, co-founder of the American Temperance Society, swelled the outcry, warning the influx would take jobs, spread disease and crime and plot a coup to install the Pope in power.

In 1844 mobs burnt Catholic churches and hunted down victims, notably in Philadelphia where, coincidentally or not, Francis will wrap up his week-long visit.

Abuse from Protestant officers partly drove hundreds of Irish soldiers to defect from the US army to the Mexican side before and during the 1846-48 war with Mexico. The deserters obtained revenge, for a while, by forming the San Patricio battalion and targeting their former superiors in battle, only to wind up jailed, branded and hanged after Mexico surrendered.

The growth of the Ku Klux Klan in the early 20th century gave a new impetus to attacks – mostly verbal – on Catholics.

Roman Catholics and Immigration in Nineteenth-Century America
by Julie Byrne, NHC

Many members of other faiths—Jews, Protestants, and even some Muslims, Hindus and Buddhists—arrived in the successive waves of massive immigration to the United States between the 1840s and 1920s. But Catholics from various countries were the most numerous—and the most noticed. In 1850 Catholics made up only five percent of the total U.S. population. By 1906, they made up seventeen percent of the total population (14 million out of 82 million people)—and constituted the single largest religious denomination in the country.

Immigration in the 1920s
Shmoop

The New Immigrants were distinctive from earlier migrants in that most didn’t want to stay. These immigrants, mostly male and mostly young, hoped to earn enough money during a temporary stay in America to be able to afford an increased standard of living upon returning to their homeland. Something between 50% and 80% of the New Immigrants are believed to have eventually returned to their countries of origin. The exceptions were Jews (who mostly came from Russia, and only 4% of whom repatriated) and Irish (9%), two groups that tended to stay in America permanently because they faced religious persecution, political oppression, and economic privation back home.

Free Speech, World War One, and the Problem of Dissent
by Michael O’Malley, RRCHNM

World War One pitted England, France and Russia against Germany and the Austro-Hungarian Empire. It was difficult, at the beginning of the war, to determine who was the worst of the warring parties, and Americans faced the conflict with divided loyalties. For many Americans of English descent, England seemed like our natural ally. Many American political leaders, most prominently Woodrow Wilson, felt a strong sense of “anglophilia,” or love of England. But Germans and Irish were the two largest immigrant groups to the United States in 1917. Irish immigrants carried bitter memories of English oppression, while German Americans, not surprisingly, tended to favor their homeland, or at least not to regard it as an enemy.

Wilson worried about this division and regarded it as dangerous. Regarding Italian-Americans, German-Americans, and Irish-Americans as suspect, he once declared, “Any man who carries a hyphen around with him carries a dagger that he is ready to plunge into the vitals of the republic.”

The Visibility of Whiteness and Immigration Restriction in the United States, 1880-1930
by Robert Júlio Decker, Critical Race and Whiteness Studies

In the second half of the nineteenth century, the Western definition of whiteness underwent several significant changes. Scientific racism, understood here as the “language, concepts, methods and authority of science [which] were used to support the belief that certain human groups were intrinsically inferior to others, as measured by some socially defined criterion” (Stepan 1987: IX), provided the methods to not only construct a black/white racial binary, but also to distinguish between several European races. Scientific racism was often augmented by discourses centred on the supposed cultural traits inherent to racial composition. In Britain and the United States, Irish immigrants were racialised as putatively inferior up to the 1880s (Ignatiev 1995: 34-59; Jacobson 1998: 48-52; Knobel 1996). From the 1860s, however, the definition of Englishness slowly began to include all inhabitants of the British Isles and the term Anglo-Saxon was established as generic racial referent for this group (Young 2008: 140-187).

A “Perverse and Ill-Fated People”:
English Perceptions of the Irish, 1845-52

by Ed Lengel, University of Virginia

…the emerging racialist conception of Irish difference, which became dominant in the second half of the nineteenth century. In a sense, the products of Liberal and racialist interpretations of the Irish problem were the same. Idealistic Liberal dreams of an “intimate” marriage between Hibernia and John Bull did not challenge the essentially paternalistic and colonial Anglo-Irish relationship. Indeed, Liberal faith in the improvability of men contributed to a restrictive famine policy intended to teach the Irish to adopt middle-class standards of thrift and morality. It is worth emphasizing in any case that Liberals and racialists agreed on the basic qualities of Saxon and Celt; but while Liberals explained this difference in a gendered discourse of moral inequality, racialists insisted that the ineradicable boundaries of biology would forever separate the two peoples. In both instances, Britain would forever be the master and Ireland the subject.

Racism and Anti-Irish Prejudice in Victorian England
by Anthony S. Wohl, The Victorian Web

In much of the pseudo-scientific literature of the day the Irish were held to be inferior, an example of a lower evolutionary form, closer to the apes than their “superiors”, the Anglo-Saxons. Cartoons in Punch portrayed the Irish as having bestial, ape-like or demonic features and the Irishman (especially the political radical) was invariably given a long or prognathous jaw, the stigmata to the phrenologists of a lower evolutionary order, degeneracy, or criminality. Thus John Beddoe, who later became the President of the Anthropological Institute (1889-1891), wrote in his Races of Britain (1862) that all men of genius were orthognathous (less prominent jaw bones) while the Irish and the Welsh were prognathous and that the Celt was closely related to Cromagnon man, who, in turn, was linked, according to Beddoe, to the “Africanoid”. The position of the Celt in Beddoe’s “Index of Nigrescence” was very different from that of the Anglo-Saxon. These ideas were not confined to a lunatic fringe of the scientific community, for although they never won over the mainstream of British scientists they were disseminated broadly and it was even hinted that the Irish might be the elusive missing link! Certainly the “ape-like” Celt became something of a malevolent cliche of Victorian racism. Thus Charles Kingsley could write

“I am haunted by the human chimpanzees I saw [in Ireland] . . . I don’t believe they are our fault. . . . But to see white chimpanzees is dreadful; if they were black, one would not feel it so much. . . .” (Charles Kingsley in a letter to his wife, quoted in L.P. Curtis, Anglo-Saxons and Celts, p. 84).

Even seemingly complimentary generalizations about the Irish national character could, in the Victorian context, be damaging to the Celt. Thus, following the work of Ernest Renan’s La Poésie des Races Celtiques (1854), it was broadly argued that the Celt was poetic, light-hearted and imaginative, highly emotional, playful, passionate, and sentimental. But these were characteristics the Victorians also associated with children. Thus the Irish were “immature” and in need of guidance by others, more highly developed than themselves. Irish “emotion” was contrasted, unfavorably, with English “reason”, Irish “femininity” with English “masculine” virtues, Irish “poetic” attributes with English “pragmatism”. These were all arguments which conveniently supported British rule in Ireland.

A British Ireland, Or the Limits of Race and Hybridity in Maria Edgeworth’s Novels
by Kimberly Philomen Clarke, Georgetown University

In the seventeenth and eighteenth centuries, as Roxanne Wheeler discusses in The Complexion of Race (2000), race was seen as mutable and had a complex relationship to religion. Racial difference was not only dependent on a fixed categorization of skin color, but also on clothing, religion, and culture.19 Throughout the seventeenth and eighteenth centuries, Britons defined themselves according to their Protestantism, clothing, and climate, among other characteristics, and as the nineteenth century arrived, whiteness finally became a marker of Britishness as “skin color emerg[ed] as the most important component of racial identity in Britain during the third quarter of the eighteenth century” (Wheeler 9).

Race became the determinant of culture and history, a common “principle of academic knowledge” in the nineteenth century (Young 93). The correlation of whiteness with Englishness developed in the 1720s and 1730s with the assumption that racial blackness signified one’s intellectual and spiritual inferiority (Wheeler 98). Historian Winthrop Jordan has argued that in the mid-seventeenth century, colonists in confrontation with the Other went from calling themselves Christian to calling themselves English, free, and “white,” a term that came to symbolize a moral and intellectual superiority against blackness and non-Britishness (Wheeler 74). Against this darker, inferior other among the nonwhite British colonies in Africa, the West Indies, and India, Britishness became emblematic of a white empire that would not be culturally or racially muddied by foreign influences (Colley 312).

[…] for the Irish to be British. Primarily, they have to sacrifice their symbolic blackness, that which symbolizes their peasantry class, cultural otherness, and religious differences, and particularly that which marks their contentious history and centuries long colonization by England. To forfeit this darkness symbolizing the history of suppression and difference, however, is also to surrender a part of a collective Irish identity in Britain. […]

Throughout the nineteenth century, the Irish were seen as a symbolic manifestation of a biracial, Caucasian/African hybridity. There are stereotypes that confirm the outsider status of the Irish both before and after the 1801 Act of Union, some of which continue to paint the British as white and the Irish as nonwhite, or at least not white enough to be British. Richard Lebow’s White Ireland and Black Ireland (1976) mentions the “racist attitudes toward the Irish in Victorian Britain” (14). He argues that “racist expressions were merely the age old anti-Irish prejudice couched in the jargon of the day” (15). In The Times in 1836, Benjamin Disraeli claims the Irish “hate our free and fertile isle. They hate our order, our civilization, our enterprising industry, our sustained courage, our decorous liberty, our pure religion. This wild, reckless, indolent, uncertain, and superstitious race has no sympathy with the English character” (quoted in Lebow 61). Andrew Murphy quotes Charles Kingsley, who visited Ireland in the mid-nineteenth century, writing to his wife that, “I am daunted by the human chimpanzees I saw along that hundred miles of horrible country…to see white chimpanzees is dreadful: if they were black, one would not feel it so much, but their skins, except where tanned by exposure, are as white as ours” (Murphy 12). Furthermore, disgusted at Irish poverty and how it contradicts his British image of whiteness, Kingsley writes, “Can you picture in your mind a race of white men reduced to this condition? White men! Yes the highest and purest blood and breed of men” (Murphy 44). These quotations demonstrate both the racial whiteness and “otherness” or non-whiteness that Irish identity connotes in Edgeworth’s literature. Irish otherness was fueled by stereotypes of racial, cultural, and intellectual differences that “the Irish” as a generalized group endured before and throughout the nineteenth century and onward. […]

Edgeworth associates Irish peasantry with physical blackness in a letter to her Aunt Ruxton in which she expresses her fears of the sort of Irish rebellion that was frequent in the late eighteenth century and which her family twice before had endured.27 Edgeworth confesses, “All I crave for my own part is, that if I am to have my throat cut, it might not be by a man with his face blackened with charcoal” (Egenolf 849-50). She later says that she “shall look at every person that comes here very closely, to see if there be any marks of charcoal upon their visages” (850). This blackness results from working with charcoal and other materials associated with manual labor. However, in these lines, Edgeworth is not commenting on Irish working class life but rather the threatening gaze of those faces blackened with charcoal and the fear that blackness represents for Edgeworth and her family as the Irish rebel, reclaiming his own agency, destabilizes the power of the upper class families in Ireland. Therefore, keeping in mind the Africanist image of the danger associated with Irish blackened faces, one may read Christy’s physical blackness as not a result of work but some inherent racial trait the Irish were thought to have and that reflected anxieties about the power of the native Irish against middle- and upper-class whiteness (859-60).

Irish Nationalists and the Making of the Irish Race
by Bruce Nelson
pp. 34-35

A month later the Bristol Mirror charged that “the Indians with their tomahawks and scalping knives are a much nobler set of savages than the murderers of Limerick and Tipperary.” 16

The comparison of the Irish with the “savages of America” was familiar enough; it dated from the seventeenth century. But there was a dramatically new development in the second half of the nineteenth century, a time when Darwinian science posited an evolutionary chain of being in which humans were descended directly from African apes. In this context, British commentators created a “simianized,” or apelike, Paddy whose likeness to the “backward” races of Africa was inescapable. Perry Curtis has traced this development in Apes and Angels. He notes that the Rising of 1798 led British cartoonists to develop images of a preternaturally ugly Paddy whose appearance was far more ominous and repellent than that of the bumptious but relatively harmless stage Irishman who had predominated for much of the eighteenth century. Some of these cartoon characters were given porcine features, but until the 1860s the cartoon Irishman remained largely human. It was with the coming of Darwinian evolution, and the reemergence of violent Irish republicanism in the guise of Fenianism, that the transformation of the stereotypical Paddy really took off with the publication of cartoon caricatures such as “The Irish Devil-Fish” (a massive octopus with simian facial features) and the even more notorious “Irish Frankenstein,” with his dagger dripping blood. According to Curtis, “In a biological sense, Paddy had devolved, not evolved, from a primitive peasant to an unruly Caliban, thence to a ‘white Negro,’ and finally he arrived at the lowest conceivable level of the gorilla and the orangutan.”

pp. 38-45

Even in regard to France, the citadel of Celtic achievement, he observed that the country’s “vast Moorish population” was “superior in all respects to the lazy, worthless Celt.” 24

Knox’s elevation of the dark-skinned Moor above the Celt is a vivid example of the slippage that often occurred in racial discourse about the Irish. Even relatively sympathetic observers resorted to characterizations of Irish Celts that linked them to darker races and, sometimes, to apes. […]

But Jackson also ruminated on the “Iberian character” of the Irish peasantry, raising the familiar specter of southern origins, Moorish blood, and intimations of darkness and savagery. Referring specifically to the peasants of the west and south of Ireland, he reported that “an absolutely negroid type has been occasionally detected by keen observers,” which meant that “inferior and non-Aryan racial elements are clearly perceptible in the population of the sister isle.” 26 Jackson’s fellow anthropologist Hector MacLean concurred and identified a racial type, also with Iberian characteristics, that was “very prevalent in the west of Ireland. . . . The stature is generally low,” he claimed, “with dark skin and complexion; the head is long, low, and broad; the hair black, coarse, and shaggy; the eyes black or dark brown, or grey, with fiery lustre; forehead receding, with lower part of face prominent.” 27 To those who were predisposed to believe them, reports of this kind served to reinforce elite and popular perceptions of the Irish as akin to “the negro,” “the savage,” and even “the ape.” […]

To locate the “real” Irish, then, one had to go to the west and southwest of the country, where there had been less immigration and therefore less mixing of blood. To be sure, in fishing villages on Galway Bay and in the Aran Islands, Beddoe found significant examples of intermarriage, and thus of racial hybridity. But for the most part, the west was the home of “swarthy” and “dark-complexioned aborigines,” many of whom had dark eyes and even darker, sometimes “coal-black,” hair. By themselves, hair and eye color did not indicate skin color, and for the most part Beddoe acknowledged that he was dealing with whites, although he did record that in the mountains between Sligo and Roscommon he had encountered “the swarthiest people I have ever seen.” He also created an “Index of Nigrescence” to measure the range of hair and eye color from one racial type to another, and like virtually all of the anthropologists of his generation, he could not help but speculate on the relationship between racial classification and intelligence and temperament. “There is an Irish type . . . which I am disposed to derive from the race of Cro-Magnon,“ he reported. “In the West of Ireland I have frequently seen it. Though the head is large, the intelligence is low, and there is a great deal of cunning and suspicion.” He also discovered a tendency toward “prognathism” among people in England, Wales, and Ireland, with Ireland as its “present centre.” Venturing onto very slippery terrain indeed, he speculated that “most of its lineaments are such as to lead us to think of Africa as its possible birthplace, and it may be well, provisionally, to call it Africanoid.” 30

Beddoe did not always follow the apparent logic of his own conclusions. He argued in The Races of Britain that “the points of likeness to the anthropoid apes are distributed variously among the different races of mankind, . . . [and] none of them can be taken in themselves to imply intellectual or moral inferiority.” But by creating an index of nigrescence, and constructing a prognathous physical type in Ireland that he identified as “Africanoid,” he provided openings for others who were far more determined to assert the racial inferiority of the Irish and to see them as a race that had not achieved the salient characteristics commonly associated with “whiteness.” In the early twentieth century, especially in response to the polarization and violence of the Irish War of Independence, a new generation of scholars and pseudoscholars was determined to portray the Irish as a people whose many negative attributes were rooted in a suspect racial past. In 1919 two Harvard geneticists claimed that the Irish were “principally the product of the mingling of two savage Mongolian tribes,” and in 1922 two equally zealous Hibernophobes found a “strain of negro blood” in the Firbolgs, or Attacotti, the ancient race that had invaded Ireland and allegedly waged a war of extermination against its “fair-haired and clean-skinned” rivals on the island. 31

These developments in the realm of science were reflected in a wider, more random discourse through which elite and popular commentators linked the Irish with black Africans and African Americans in a shared stereotype that alleged laziness, irrationality, and an incapacity for self-government as essential characteristics of both races. By the mid-nineteenth century or soon thereafter, the tendency to portray the Irish as apelike creatures who were laughably crude and lamentably violent was becoming a commonplace in the United States as well as Britain. In a meditation on the “Celtic physiognomy,” the American magazine Harper’s Weekly commented on the “small and somewhat upturned nose [and] the black tint of the skin,” while Punch characterized the “Irish Yahoo” who populated “the lowest districts of London and Liverpool” as “a creature manifestly between the Gorilla and the Negro,” a “climbing animal [who] may sometimes be seen ascending a ladder with a hod of bricks.” 32 […]

What comes through in so many of these observations is the racial “in-betweenness” of the Irish in the eye of the beholder. 34 Although Harper’s Weekly did comment on the “black tint of the [Irish] skin,” few observers were willing to argue that the Irish were “black” or “coloured,” no matter how high they registered on Beddoe’s index of nigrescence. Instead, in the age of Darwin, Irishmen and -women were portrayed as “white chimpanzees,” as “creature[s] manifestly between the Gorilla and the Negro,” and as “more like a tribe of squalid apes than human beings.” Charles Kingsley, an Anglican clergyman and regius professor of modern history at Cambridge, was “haunted by the human chimpanzees” he encountered during a holiday in Ireland in 1860. “To see white chimpanzees is dreadful,” he confided to his wife; “if they were black, one would not feel it so much, but their skins, except where tanned by exposure, are as white as ours.” Thomas Carlyle, the Scottish writer and polemicist, did not doubt that the Irish had “a white skin” and even “European features,” but they were “savages” nonetheless. “The Celt[s] of Connemara,” he wrote in the 1840s, “are white and not black; but it is not the colour of the skin that determines the savagery of a man” or of a race. “He is a savage who in his sullen stupidity, in his chronic rage and misery, cannot know the facts of this world when he sees them; [who] . . . brandishes his tomahawk against the laws of Nature.” Carlyle exempted the “Teutonic Irish” of Ulster from his censure, but he charged that the chronic laziness of the Celtic Irish, and their refusal to accept that for the foreseeable future their role must be to labor for others, made them akin to the black ex-slaves of Jamaica, for whom he recommended a return to the “beneficence” of involuntary servitude. As for Kingsley, he informed a friend that the “harsh school of facts” had cured him of any illusions about equality between the races. “I have seen,” he wrote, “that the differences of race are so great, that certain races, e.g., the Irish Celts, seem quite unfit for self-government.” 35

Other observers also believed that the racial characteristics of the Irish made them seem more like blacks and less like bona fide “white men.” When James Bryce wrote of the Negro that “his intelligence is rather quick than solid, and . . . shows the childishness as well as lack of self-control which belongs to primitive peoples,” he could just as easily have been describing the Irish as far as many readers were concerned. 36 During the Great War, it was not uncommon for those who witnessed or worked with Irish recruits in the British army to characterize them as “hardy and brave,” but also as prone to “displays of unnecessary bravado” that resulted in excessive casualties on the battlefield. Even a British officer who had “great sympathy” for the Irish troops he led confided to his wife that “his men came from ‘an extraordinary and inexplicable race’ and that Ireland must be an ‘island of children with the bodies of men.’ ” These are nearly the same terms that French observers applied to the black soldiers who were recruited from France’s West African colonies. They too displayed a “wild impulsiveness” and “fierce ardour for hand-to-hand combat” that made them ideal “shock troops.” But there were also frequent allegations that they lacked discipline and cohesion, that, like the Irish, they were a race of “children,” albeit “wonderful children, with generous hearts.” 37

For the Irish, racial in-betweenness was a condition they could ill afford at a time when European and American conceptions of race were narrowing, from the belief in a “multiplicity of nations, races, and religions” to the fulsome embrace of a simple binary division between “white” and “nonwhite.” […]

Dilke was a graduate of Cambridge, where he studied with Charles Kingsley. He was also a Liberal politician, a widely published author, and a racial imperialist whose main concern was not the supremacy of British capital but the triumph, on a global scale, of English institutions and values. The great impediment to this accomplishment, he believed, was the migration of the “cheaper races” to English-speaking countries such as the United States and Australia. “In America,” he wrote in Greater Britain: A Record of Travel in English-Speaking Countries during 1866 and 1867, “we have seen the struggle of the dear races against the cheap— the endeavors of the English to hold their own against the Irish and the Chinese.” But the threat these races posed was not only to the standard of living of the Saxons and their descendants but to civilization itself. He warned of “the danger to our race and to the world from Irish ascendency.” For if the Celt, his religion, and his “fierce” temperament prevailed, then the Englishman and his way of life would be eclipsed and the “freedom of mankind” would be jeopardized. 40

In tracing the evolution of anti-Irish stereotypes and polemics, then, from the sixteenth century through the nineteenth and into the twentieth, one comes face to face with a process of racialization rooted in conquest, colonization, and Anglicization. It was a process that sometimes engendered violence on a horrific scale and one that by means of the stage Irishman, the cartoon caricature, and the condescension and ridicule inherent in the “Paddy joke” did enormous damage to Irish self-esteem. 41 We have seen how the native Irish were portrayed as heathens, savages, and even wild animals; we have seen, too, how Paddy was constructed as feckless, lazy, riotous, and, sometimes, dangerous to the peace and tranquillity of England as well as Ireland. Perhaps by way of summary it is appropriate to turn to the Kentish Gazette, which in February 1847 sought to identify the essential ingredients of the Irish character and to offer up a solution to the Irish Question. During one of the most devastating months of the Great Famine, the Gazette commented editorially that “the system of agitation, of midnight plunder, of open-day assassination, of Hottentot ignorance, superstition, idolatry, and indolence must be eradicated, removed, abolished by the strong arm of the law.” 42 “Idolatry” and “superstition” were, of course, code words for Catholicism; indolence was, allegedly, the preferred pastime of the Irish people; assassination and midnight plunder were the staples of Irish politics; and Hottentot ignorance linked the Irish to African people who were widely regarded as primitive and backward, thus completing the process of racialization.

Who Built America, Volume II
pp. 146-149

American nativism often took the form of anti-Catholicism. In 1887 the American Protective Association (APA) organized to drive Irish Catholics out of American politics and soon claimed a half-million members, all of whom took an oath never to vote for a Catholic. The APA explicitly blamed the depression on Catholics, asserting that immigrants had taken the jobs of native-born Americans. It endorsed political candidates in 1894, but it broke apart when its members could not agree on establishing a third party or supporting the Republican ticket in 1896.

THE HISTORY OF RELIGIOUS CONFLICT IN THE UNITED STATES:
REVOLUTION TO SEPTEMBER 11TH

by Erik Wong, Stanford University

The early part of the 19th Century was relatively quiet in terms of religious conflict in America. The religious conflict that stands out in this period involves tensions between Catholics and Protestants, culminating in violence directed at Irish Catholic immigrants. The surge in immigration from Europe during the 19th Century coincided with an influx of Catholics and the rise of activist Protestantism in the U.S. As strong Protestant values permeated the country, immigrants who were Catholic also became viewed as outsiders and undemocratic. These views are separate from, but on top of, the harsh anti-Irish sentiment that also spread during the period.

In the 1830s and 1840s, anti-Catholic violence broke out in the Northeast and elsewhere. In 1835, one incident was ignited by a speaking tour by Lyman Beecher, who published Plea for the West, a book about a Catholic plot to take over the U.S. and impose Catholic rule. After Beecher’s speaking tour passed through Charlestown, Massachusetts, a mob set fire to the Ursuline convent and school.[3] In Philadelphia in 1844, pitched gun battles broke out between “native” Americans and mostly Irish Catholics. Martial law had to be declared in order to end the violence.[4]

The Divide Between Blacks and the Irish
by Noel Ignatiev, The Root

The Irish who immigrated to America in the 18th and 19th centuries were fleeing caste oppression and a system of landlordism that made the material conditions of the Irish peasant comparable to those of an American slave. The Penal Laws regulated every aspect of Irish life and established Irish Catholics as an oppressed race. Anticipating Judge Roger B. Taney’s famous dictum in the Dred Scott decision, on two occasions officials with judiciary authority in Ireland declared that “the law does not suppose any such person to exist as an Irish Roman Catholic.”

When they first began arriving here in large numbers, the Irish were, in the words of Mr. Dooley (a character created by journalist Finley Peter Dunne), given a shovel and told to start digging up the place as if they owned it. On the rail beds and canals, they labored for low wages under dangerous conditions; in the South they were occasionally employed where it did not make sense to risk the life of a slave. As they came to the cities, they were crowded into districts that became centers of crime, vice and disease.

They commonly found themselves thrown together with free Negroes. Blacks and the Irish fought each other and the police, socialized and occasionally intermarried, and developed a common culture of the lowly. They also both suffered the scorn of those better situated. Along with Jim Crow and Jim Dandy, the drunken, belligerent and foolish Patrick and Bridget were stock characters on the early stage. In antebellum America, it was speculated that if racial amalgamation was ever to take place, it would begin between those two groups. As we know, things turned out otherwise.

How the Irish Became White
by Art McDonald, University of Pittsburgh

Ironically, Irish Catholics came to this country as an oppressed race yet quickly learned that to succeed they had to in turn oppress their closest social class competitors, free Northern blacks. Back home these “native Irish or papists” suffered something very similar to American slavery under English Penal Laws. Yet, despite their revolutionary roots as an oppressed group fighting for freedom and rights, and despite consistent pleas from the great Catholic emancipator, Daniel O’Connell, to support the abolitionists, the newly arrived Irish-Americans judged that the best way of gaining acceptance as good citizens and to counter the Nativist movement was to cooperate in the continued oppression of African Americans. Ironically, at the same time they were collaborating with the dominant culture to block abolition, they were garnering support from among Southern, slaveholding democrats for Repeal of the oppressive English Act of the Union back home. Some even convinced themselves that abolition was an English plot to weaken this country.

Upon hearing of this position on the part of so many of his fellow countrymen now residing in the United States, in 1843 O’Connell wrote: “Over the broad Atlantic I pour forth my voice, saying, come out of such a land, you Irishmen; or, if you remain, and dare countenance the system of slavery that is supported there, we will recognize you as Irishmen no longer.” It’s a tragic story. In a letter published in the Liberator in 1854, it was stated that “passage to the United States seems to produce the same effect upon the exile of Erin as the eating of the forbidden fruit did upon Adam and Eve. In the morning, they were pure, loving, and innocent; in the evening, guilty.”

Irish and African Americans had lots in common and lots of contact during this period; they lived side by side and shared work spaces. In the early years of immigration the poor Irish and blacks were thrown together, very much part of the same class competing for the same jobs. In the census of 1850, the term mulatto appears for the first time due primarily to inter-marriage between Irish and African Americans. The Irish were often referred to as “Negroes turned inside out,” and Negroes as “smoked Irish.” A famous quip of the time attributed to a black man went something like this: “My master is a great tyrant, he treats me like a common Irishman.” Free blacks and Irish were viewed by the Nativists as related, somehow similar, performing the same tasks in society. It was felt that if amalgamation between the races was to happen, it would happen between Irish and blacks. But, ultimately, the Irish made the decision to embrace whiteness, thus becoming part of the system which dominated and oppressed blacks. Although it contradicted their experience back home, it meant freedom here since blackness meant slavery.

How Housing Discrimination Created the Idea of Whiteness
by Whet Moser, Chicago Magazine

Note that Irish and Germans are at the top of the list. Had Hoyt’s book been written fifty, or even twenty years before, they likely would have been lower. As Lewinnek described to me, German and Irish immigrants were relegated to the periphery of the city after the Great Fire by the “fire limits,” prohibitions on the construction of inexpensive wooden houses that effectively pushed working-class homeowners out of the city center; Chicago Germans were at the forefront of briefly successful protests against the fire limits.

Not In My Neighborhood: How Bigotry Shaped a Great American City
by Antero Pietila
Excerpt

Harlem exemplifies racial succession, which is the sociologists’ term for ethnic, racial and economic neighborhood transition. In the space of four decades between the 1870s and 1910s, that section of New York City went from a white upper-class community of American-born residents to one populated by recent Irish, Jewish, German, Italian and Scandinavian immigrants.

American Pharaoh
by Adam Cohen and Elizabeth Taylor
Chapter 1

If white Chicago as a whole turned a cold shoulder to the new black arrivals, Daley’s Irish kinsmen were particularly unwelcoming.

The Irish and blacks had much in common. Ireland’s many years of domination at the hands of the British resembled, if not slavery, then certainly southern sharecropping — with Irish farmers working the land and sending rent to absentee landlords in England. The Irish were dominated, like southern blacks, through violence, and lost many of the same civil rights: to vote, to serve on juries, and to marry outside their group. Indeed, after Cromwell’s bloody invasion in the mid-1600s, not only were Irish-Catholics massacred in large numbers, but several thousand were sent in chains to the West Indies, where they were sold into slavery. But these similar histories of oppression did not bring Chicago’s Irish and blacks together. Much of the early difficulty stemmed from rivalry between two groups relegated to the lowest levels of the social order.

Ethnic America: A History
by Thomas Sowell
pp. 277-279

Today’s neighborhood changes have been dramatized by such expressions as “white flight,” but these patterns existed long before black-white neighborhood changes were the issue. When the nineteenth-century Irish immigrants flooded into New York and Boston, the native Americans fled. With the first appearance of an Irish family in a neighborhood, “the exodus of non-Irish residents began.” 2 According to a contemporary, property values “tremble” as “fear spreads,” and panicky flight ensues. 3 As “the old occupants fled to the outskirts of town,” 4 in the mid-nineteenth century when immigration increased, New York City grew northward about one mile per decade. The built-up area extended only as far north as Fourteenth Street in 1840, but it grew to Thirty-fourth Street in a decade, and to Forty-second Street by 1860. 5

“White flight” is a misleading term, not only because of its historical narrowness, but also because blacks too have fled when the circumstances were reversed. Blacks fled a whole series of neighborhoods in nineteenth-century New York, “pursued” by new Italian immigrants who moved in. 6 In nineteenth-century Detroit, blacks moved out of neighborhoods as Polish immigrants moved in. 7 The first blacks in Harlem were fleeing from the tough Irish neighborhoods in mid-Manhattan, 8 and avoided going north of 145th Street, for fear of encountering more Irish there. 9

As the relative socioeconomic positions of ethnic groups changed with the passage of time, so did the neighborhood flight. In nineteenth-century neighborhoods where Anglo-Saxons had once fled as the Irish moved in, the middle-class Irish later fled as the Jews and Italians moved in. […]

Ethnic succession did not end with neighborhoods. Early Irish immigrants were often used as strikebreakers and were hated and kept out of unions as a result. Later, the Irish were unionized and Italians, Negroes, and many others were used as strikebreakers, encountering in turn the same hostility and resistance to their admission to unions. Still later, the Irish were union leaders, while Jews or Italians were rank-and-file union members. Today, there are unions where Jews are union leaders and blacks and Puerto Ricans are members. Similarly, in the schools, the Irish immigrant children in the mid-nineteenth century were taught by Protestant Anglo-Saxon teachers. Half a century later, Jewish immigrant children were far more likely to be taught by Irish Catholics than by Jewish teachers. A generation later, Negro children in Harlem were far more likely to be taught by Jewish teachers than by black teachers. Few children of rising ethnic groups have had “role models” of their own ethnicity. Some of the most successful— notably the Chinese and the Japanese— almost never did.

While various ethnic groups succeeded each other in neighborhoods, schools, jobs, etc., the country as a whole was also changing. The installation of underground sewage lines and indoor plumbing in the late nineteenth century meant that no other urban ethnic group had to endure as primitive and dangerous a set of living conditions as the Irish had in the mid-nineteenth century. Subways, trolleys, and eventually bus lines made it feasible for working people to spread out and still get to work in a reasonable time. The incredible overcrowding on New York’s lower east side in the nineteenth century was never to be approached again in modern slums. Blacks, Puerto Ricans, and Mexican Americans today live in crowded housing conditions, compared to their contemporaries, but in no way so crowded as the conditions among Jews, Italians, or the Irish in the nineteenth century. “Overcrowded” schools today may have perhaps half as many students per class as in nineteenth century schools on New York’s lower east side. The problems of today are very real, and sometimes severe, but they are by no means historically unprecedented.

Many of the problems of the poor and powerless remain the same, whatever group fills that role at a given time. The Jewish Daily Forward commented in 1907: “police in the Jewish quarter of New York are the most savage in America.” 19 An Italian immigrant writer complained in the early twentieth century about his experiences with the “rudeness” and “inconsiderateness” of government officials, which he found “disgusting.” 20 Many of the complaints against poor ethnic groups were also similar to those today— that “children are born with reckless regularity” among the Jews and Italians, 21 that murders are a result of “the wanton brutality of the moment,” 22 and that raising the immigrants to a decent level “implies a problem of such magnitude and such distant realization” that it can only be imagined. 23

Piraha and Bicameralism

For the past few months, I’ve been reading about color perception, cognition, and terminology. I finally got around to finishing a post on it. The topic is a lot more complex and confusing than what one might expect. The specific inspiration was the color blue, a word that apparently doesn’t signify a universal human experience. There is no condition of blueness objectively existing in the external world. It’s easy to forget that a distinction always exists between perception and reality or rather between one perception of reality and another.

How do you prove something is real when it feels real in your experience? For example, how would you attempt to prove your consciousness, interior experience, and individuality? What does it mean for your sense of self to be real? You can’t even verify your experience of blue matches that of anyone else, much less show that blueness is a salient hue for all people. All you have is the experience itself. Your experience can motivate, influence, and shape what and how you communicate or try to communicate, but you can’t communicate the experience itself. This inability is a stumbling block of all human interactions. The gap between cultures can be even more vast.

This is why language is so important to us. Language doesn’t only serve the purpose of communication but, more importantly, the purpose of creating a shared worldview. This is the deeply ingrained human impulse to bond with others, no matter how imperfectly it is achieved in practice. When we have a shared language, we can forget about the philosophical dilemmas of experience and to what degree it is shared. We’d rather not have to constantly worry about such perplexing and disturbing issues.

These contemplations were stirred up by one book in particular, Daniel L. Everett’s Don’t Sleep, There Are Snakes. In my post on color, I brought up some of his observations about the Piraha (read pp. 136-141 from that book and have your mind blown). Their experience is far removed from what most people experience in the modern West. They rely on immediacy of experience. If they haven’t experienced something, or someone they know hasn’t, it has little relevance to their lives and no truth value in their minds. Yet what they consider to be immediate experience can seem bizarre to us outsiders.

Piraha spirituality isn’t otherworldly. Spirits exist, just as humans exist. In fact, there is no certain distinction. When someone is possessed by a spirit, they are that spirit and the Piraha treat them as such. The person who is possessed is simply not there. The spirit is real because they experience the spirit with their physical senses. Sometimes, in coming into contact with a spirit, a Piraha individual will lose their old identity and gain a new one; the change is permanent, and a new name goes along with it. The previous person is no longer there and, I suppose, never comes back. They aren’t pretending to change personalities. That is their direct experience of reality. Talk about the power of language. A spirit gives someone a new name and they become a different person. The name has power; it represents an entire way of being, a personality unto itself. The person becomes what they are named. This is why the Piraha don’t automatically assume someone is the same person the next time they meet them, for they live in a fluid world where change is to be expected.

A modern Westerner sees the Piraha individual. To their mind, it’s the same person. They can see he or she is physically the same person. But another Piraha tribal member doesn’t see the same person. For example, when possessed, the person is apparently not conscious of the experience and won’t remember it later. During possession, they will be in an entirely dissociated state of mind, literally being someone else with different behaviors and a different voice. The Piraha audience watching the possession also won’t remember anything other than a spirit having visited. It isn’t a possession to them. The spirit literally was there. That is their perceived reality, what they know in their direct experience.

What the Piraha consider crazy and absurd is the Western faith in a monotheistic tradition not based on direct experience. If you never met Jesus, they can’t comprehend why you’d believe in him. The very notion of ‘faith’ makes absolutely no sense to them, as it seems like an act of believing what you know not to be real in your own experience. They are sincere Doubting Thomases. Jesus isn’t real, until he physically walks into their village to be seen with their own eyes, touched with their own hands, and heard with their own ears. To them, spirituality is as real as the physical world around them and is proven by the same means, through direct experience or else the direct experience of someone who is personally trusted to speak honestly.

Calling the Piraha experience of spirits a mass hallucination is to miss the point. To the degree that is true, we are all mass hallucinating all the time; it’s just that one culture’s mass hallucinations differ from those of another. We modern Westerners, however, desperately want to believe there can be only one objective reality to rule them all. The problem is that we humans aren’t objective beings. Our perceived reality is unavoidably subjective. We can’t see our own cultural biases because they are the only reality we know.

In reading Everett’s description of the Piraha, I couldn’t help thinking about Julian Jaynes’ theory of the bicameral mind. Jaynes wasn’t primarily focused on hunter-gatherers such as the Piraha. Even so, one could see the Piraha culture as having elements of bicameralism, whether or not they ever were fully bicameral. They don’t hallucinate hearing voices from spirits. They literally hear them. How such voices are spoken is apparently not the issue. What matters is that they are spoken and heard. And those spirit voices will sometimes tell the Piraha important information that will influence, if not determine, their behaviors and actions. These spirit visitations are obviously treated seriously and play a central role in the functioning of their society.

What is strangest of all is that the Piraha are not fundamentally different from you or me. They point to one of the near-infinite possibilities that exist within our shared human nature. If a baby from Western society were raised by the Piraha, we have no reason to assume that he or she wouldn’t grow up to be like any other Piraha. It was only a few centuries ago that it was also common for Europeans to have regular contact with spirits. The distance between the modern mind and what came before is shorter than it first appears, for what came before still exists within us, as what we will become is a seed already planted.*

I don’t want this point to be missed. What is being discussed here isn’t ultimately about colors or spirits. This is a way of holding up a mirror to ourselves. What we see reflected back isn’t what we expected, isn’t how we appeared in our own imaginings. What if we aren’t what we thought we were? What if we turn out to be a much more amazing kind of creature, one that holds a multitude within?

(*Actually, that isn’t stated quite correctly. It isn’t what came before. The Piraha are still here, as are many other societies far different from the modern West. It’s not just that we carry the past within us. That is as true for the Piraha, considering they too carry a past within them, most of it being a past of human evolution shared with the rest of humanity. Modern individuality has existed for only a blip of time, a few hundred years out of the hundreds of thousands of years of hominid existence. The supposed bicameral mind lasted for thousands of years longer than the entire post-bicameral age. What are the chances that our present experience of individuality will last as long? Highly unlikely.)

* * *

Don’t Sleep, There Are Snakes:
Life and Language in the Amazonian Jungle
by Daniel L Everett
pp. 138-139

Pirahãs occasionally talked about me, when I emerged from the river in the evenings after my bath. I heard them ask one another, “Is this the same one who entered the river or is it kapioxiai?”

When I heard them discuss what was the same and what was different about me after I emerged from the river, I was reminded of Heraclitus, who was concerned about the nature of identities through time. Heraclitus posed the question of whether one could step twice into the same river. The water that we stepped into the first time is no longer there. The banks have been altered by the flow so that they are not exactly the same. So apparently we step into a different river. But that is not a satisfying conclusion. Surely it is the same river. So what does it mean to say that something or someone is the same this instant as they were a minute ago? What does it mean to say that I am the same person I was when I was a toddler? None of my cells are the same. Few if any of my thoughts are. To the Pirahãs, people are not the same in each phase of their lives. When you get a new name from a spirit, something anyone can do anytime they see a spirit, you are not exactly the same person as you were before.

Once when I arrived in Posto Novo, I went up to Kóhoibiíihíai and asked him to work with me, as he always did. No answer. So I asked again, “Ko Kóhoi, kapiigakagakaísogoxoihí?” (Hey Kóhoi, do you want to mark paper with me?) Still no answer. So I asked him why he wasn’t talking to me. He responded, “Were you talking to me? My name is Tiáapahai. There is no Kóhoi here. Once I was called Kóhoi, but he is gone now and Tiáapahai is here.”

So, unsurprisingly, they wondered if I had become a different person. But in my case their concern was greater. Because if, in spite of evidence to the contrary, I turned out not to be a xíbiisi, I might really be a different entity altogether and, therefore, a threat to them. I assured them that I was still Dan. I was not kapioxiai.

On many rainless nights, a high falsetto voice can be heard from the jungle near a Pirahã village. This falsetto sounds spiritlike to me. Indeed, it is taken by all the Pirahãs in the village to be a kaoáíbógí, or fast mouth. The voice gives the villagers suggestions and advice, as on how to spend the next day, or on possible night dangers (jaguars, other spirits, attacks by other Indians). This kaoáíbógí also likes sex, and he frequently talks about his desire to copulate with village women, with considerable detail provided.

One night I wanted to see the kaoáíbógí myself. I walked through the brush about a hundred feet to the source of that night’s voice. The man talking in the falsetto was Xagábi, a Pirahã from the village of Pequial and someone known to be very interested in spirits. “Mind if I record you?” I asked, not knowing how he might react, but having a good idea that he would not mind.

“Sure, go ahead,” he answered immediately in his normal voice. I recorded about ten minutes of his kaoáíbógí speech and then returned to my house.

The next day, I went to Xagábi’s place and asked, “Say, Xagábi, why were you talking like a kaoáíbógí last night?”

He acted surprised. “Was there a kaoáíbógí last night? I didn’t hear one. But, then, I wasn’t here.”

pp. 140-141

After some delay, which I could not help but ascribe to the spirits’ sense of theatrical timing, Peter and I simultaneously heard a falsetto voice and saw a man dressed as a woman emerge from the jungle. It was Xisaóoxoi dressed as a recently deceased Pirahã woman. He was using a falsetto to indicate that it was the woman talking. He had a cloth on his head to represent the long hair of a woman, hanging back like a Pirahã woman’s long tresses. “She” was wearing a dress.

Xisaóoxoi’s character talked about how cold and dark it was under the ground where she was buried. She talked about what it felt like to die and about how there were other spirits under the ground. The spirit Xisaóoxoi was “channeling” spoke in a rhythm different from normal Pirahã speech, dividing syllables into groups of two (binary feet) instead of the groups of three (ternary feet) used in everyday talking. I was just thinking how interesting this would be in my eventual analysis of rhythm in Pirahã, when the “woman” rose and left.

Within a few minutes Peter and I heard Xisaóoxoi again, but this time speaking in a low, gruff voice. Those in the “audience” started laughing. A well-known comical spirit was about to appear. Suddenly, out of the jungle, Xisaóoxoi emerged, naked, and pounding the ground with a heavy section of the trunk of a small tree. As he pounded, he talked about how he would hurt people who got in his way, how he was not afraid, and other testosterone-inspired bits of braggadocio.

I had discovered, with Peter, a form of Pirahã theater! But this was of course only my classification of what I was seeing. This was not how the Pirahãs would have described it at all, regardless of the fact that it might have had exactly this function for them. To them they were seeing spirits. They never once addressed Xisaóoxoi by his name, but only by the names of the spirits.

What we had seen was not the same as shamanism, because there was no one man among the Pirahãs who could speak for or to the spirits. Some men did this more frequently than others, but any Pirahã man could, and over the years I was with them most did, speak as a spirit in this way.

The next morning when Peter and I tried to tell Xisaóoxoi how much we enjoyed seeing the spirits, he, like Xagábi, refused to acknowledge knowing anything about it, saying he wasn’t there.

This led me to investigate Pirahã beliefs more aggressively. Did the Pirahãs, including Xisaóoxoi, interpret what we had just seen as fiction or as fact, as real spirits or as theater? Everyone, including Pirahãs who listened to the tape later, Pirahãs from other villages, stated categorically that this was a spirit. And as Peter and I were watching the “spirit show,” I was given a running commentary by a young man sitting next to me, who assured me that this was a spirit, not Xisaóoxoi. Moreover, based on previous episodes in which the Pirahãs doubted that I was the same person and their expressed belief that other white people were spirits, changing forms at will, the only conclusion I could come to was that for the Pirahãs these were encounters with spirits— similar to Western culture’s seances and mediums.

Pirahãs see spirits in their mind, literally. They talk to spirits, literally. Whatever anyone else might think of these claims, all Pirahãs will say that they experience spirits. For this reason, Pirahã spirits exemplify the immediacy of experience principle. And the myths of any other culture must also obey this constraint or there is no appropriate way to talk about them in the Pirahã language.

One might legitimately ask whether something that is not true to Western minds can be experienced. There is reason to believe that it can. When the Pirahãs claim to experience a spirit they have experienced something, and they label this something a spirit. They attribute properties to this experience, as well as the label spirit. Are all the properties, such as existence and lack of blood, correct? I am sure that they are not. But I am equally sure that we attribute properties to many experiences in our daily lives that are incorrect.

* * *

Radical Human Mind: From Animism to Bicameralism and Beyond

On Being Strange

Self, Other, & World

Humanity in All of its Blindness

The World that Inhabits Our Mind

Spiritualism and Bicameralism

In Spirit of Equality, Steve A. Wiggins discusses the recent Ghostbusters movie. His focus is on spiritualism and gender. He writes that,

“A thoughtful piece by Colin Dickey in New Republic points out some of the unusual dynamics at play here. Looking at the history of Spiritualism as the basis for the modern interest in ghosts, Dickey suggests that women have been involved in the long-term fascination with the dead from the beginning. Their motive, however, was generally communication. Women wanted to relate with ghosts to make a connection. The original Ghostbusters movie represented a male, rationalistic approach to ghosts. As Dickey points out, instead of communicating, the men hunt and trap rather than trance and rap.”

I’m familiar with the spiritualist tradition. It’s part of the milieu that formed the kind of religion I was raised in, Science of Mind and New Thought Christianity.

The main church I grew up in, Unity Church, was heavily influenced by women from when it began in the late 1800s. Its founding was inspired by Myrtle Fillmore’s spiritual healing, women were leaders early on in the church, and ministers officiated same sex marriage ceremonies at least as far back as when I was a kid. It’s not patriarchal religion and emphasizes the idea of having a direct and personal connection to the divine, such that you can manifest it in your life.

The gender differences mentioned by Wiggins are the type of thing that always interests me. There are clear differences, whatever the causes. Psychological research has found variations in cognition and behavior, on average, between the genders. This is seen in personality research. And brain research shows at least some of these differences are based in biology, e.g., women having, on average, a larger corpus callosum.

I’m not sure how these kinds of differences relate to something like spiritualism and the fictional portrayal of human interaction with ghosts/spirits. The two Ghostbusters movies do offer a fun way to think about it.

Reading Wiggins’s piece, I thought about an essay I read this past week. It offers a different perspective on a related topic: hearing command voices and the traditions that go along with them. The essay is “Evolution and Inspiration” by Judith Weissman (from Gods, Voices and the Bicameral Mind, ed. Marcel Kuijsten).

She notes, “that all over the world, throughout history, most of the poets who hear voices have been male, and their poems are usually about the laws of the fathers.” She considers this likely relevant, although she doesn’t offer any certain conclusions about what it might mean.

In the context of what Wiggins brings up, it makes one wonder what separates the tradition of voice-hearing poets and spiritualists. I can think of one thing, from that same essay.

Weissman mentioned that command voices often tell people what to do. A famous example was Daniel Paul Schreber who, on hearing a voice telling him to defend his manhood, punched an attendant at the psychiatric institute in the face. Interestingly, Schreber was a well-educated, rationalistic, and non-religious man before he began hearing voices.

Command voices tell people, often men, what to do. It leads to action, sometimes rather masculine action. Few people hear such voices these days and, when they do, they are considered schizophrenic—crazier than even a spiritualist.

From the New Republic article, The Spiritualist Origins of Ghostbusters, Colin Dickey offers an explanation about spiritualism in a gender-biased society.

“Spiritualism also celebrated precisely those aspects of femininity that the rest of culture was busy pathologizing. Nervousness, erratic behavior, uncontrolled outbursts, flagrant sexuality—doctors and psychiatrists saw these all as symptoms of hysteria, that ever-elusive disease that mostly boiled down to women acting out. But these same unruly behaviors were qualities prized in an excellent medium, and women who exhibited these traits were routinely praised for their psychic sensitivity. Women who might have otherwise been institutionalized found celebrity through Spiritualism instead.”

That makes me wonder. Which is cause and which is effect? How do spiritualism and other forms of spirituality get expressed in other kinds of societies?

I’m reminded of two other things. First, there was an interesting passage on hysteria from a book on Galen, The Prince of Medicine by Susan P. Mattern. In bicameral fashion, the woman’s uterus (Greek hystera) literally had a mind of its own and was presumed to move around causing problems. The second thing is another part from the Weissman essay:

“The last priests died shortly after the Ik were forcibly moved, and only one person was left who could still hear commanding voices, Nagoli, the daughter of a priest. Because she was not allowed to become a priest herself, she was called mad.”

Spirituality, when it was part of the social order, was respectable. But when that male-oriented society broke down, the spiritual ability of that woman was then seen as madness. The men (and the other women) couldn’t hear the voices she heard. The voices that once were guidance had become a threat. If that voice-hearing daughter of a priest had lived in 19th century United States, she might have been diagnosed with hysteria or else have become a popular spiritualist. Or if her culture hadn’t fallen into disarray, she would have been no one special at all, perfectly normal.

Ancient Social Identity: The Case of Jews

How, then, did you know a Jew in antiquity when you saw one? The answer is that you did not.

I started reading two fascinating books. Both are about Judaism. The first one I was looking at is The Beginning of Jewishness by Shaye J. D. Cohen (the source of the quote above, Kindle Location 796). And the other is The Invention of God by Thomas Römer.

Having read a little bit of each, I realized that they offered a useful angle in thinking about claims of ancient proto-racism. In my recent post on the topic, I did briefly use it as an example:

“the early Jews probably were darker-skinned before outbreeding with Europeans and Arabs (Palestinians are descendants of the original Jews that never left). Or consider how those early Jews perceived the Samaritans as a separate people, even though they shared the same holy texts.”

That post was more wide-ranging. My thoughts were fairly general, as the point I was making was general. Sometimes, though, such issues become more interesting as you focus in on the details of a specific example.

In perusing the two books mentioned above, I was reminded once again of how little I know and hence how much there is to learn. Certain books are able to change how you see something. The second book, The Invention of God, is more familiar territory, although still fascinating. Relevant to my thoughts here, I noticed the following (p. 13):

“Its origins do not lie, as the book of Joshua claims, in the military conquest of a territory by a population invading from somewhere else; rather “Israel” resulted from a slow process that took place gradually within the framework of the global upheavals of the Late Bronze Age— that is, it had its origin in indigenous populations. The opposition we find in the Bible between “Israelites” and “Canaanites” was in no way based on an existing ethnic difference, but is a much later theoretical construction in the service of a segregationist ideology.”

We modern people read ancient texts or, more likely, historical interpretations of ancient texts. In doing so, we come across labels like Israelites, Canaanites, etc. Our frame of reference includes modern politics and conflicts, along with media portrayals in movies and on television.

Also, there is the issue of how words changed over time. Looking at ancient texts, most people read a translation. But even reading the original language requires care, as there is a vast scholarship analyzing the context of texts and how, intentionally or unintentionally, they were altered over time. (See: David M. Goldberg, Reading Rabbinic Literature; and Michael L. Satlow, Jew or Judaean?)

I just found it fascinating. It turns out, like most people, I had no idea how social identities were formed and perceived in the ancient world. Cohen’s book makes this particularly clear.

There was no certain way to know someone was a Jew, as most ancient people living in the same area tended to look, dress, act, and speak more or less alike. Even circumcision in the Eastern Roman Empire was practiced by groups other than Jews, and in any case no one used circumcision to prove their social identity. Many people who might have been perceived as Jewish because they followed certain customs didn’t always perceive themselves as Jews, and among those who did identify as Jews there were diverse lifestyles. The rants of the priestly class about what defined a real Jew were more prescriptive than descriptive, which is to say driven by ideology and politics rather than how people actually lived their lives.

It’s not as if there was an official record kept of all Jews. It was originally a rather informal social identity, aside from a few basic rules that were more or less agreed upon.

Anyone could become a Jew, as conversion was simple. All you needed to do was be circumcised by a Jew and you were a Jew. No rabbi or ritual was necessary. Conversion was quite common at different points, as there were many incentives. Rulers were known to give special privileges to various groups, depending on the needs of rulership, and that sometimes included Jews having dispensation from certain laws and taxes. There was so much conversion going on that anyone who merely claimed to be a Jew was treated as such.

Even the simple act of denying idolatry or abstaining from eating pork because of vegetarianism often got ancient people labeled as Jews, no matter what the individual claimed. If someone did anything like a Jew, however vague, for all intents and purposes they might as well have been a Jew.

There was much permeability of social identities, not just in perception but also in practice—as Cohen notes (Kindle Locations 739-740): “There is abundant evidence that in the first centuries of our era some—perhaps many—gentiles, whether polytheist or Christian, attended Jewish synagogues, abstained from work on the Sabbath, and perhaps observed other Jewish rituals as well.” It went the other way around as well. Some—perhaps many—Jews attended gentile religious services (e.g., mystery schools), participated in gentile holy days, and observed other gentile rituals as well.

“In sum: people associating with Jews were not necessarily Jews themselves. Even people assembled in a synagogue or present in a Jewish neighborhood were not necessarily Jews themselves. In the Roman diaspora social mingling between Jews and gentiles was such that, without inquiring or checking, you could not be sure who was a Jew and who was not” (Kindle Locations 697-699).

What distinguished and identified people wasn’t religion, ethnicity, or race. It was mostly about location and politics. A Judean wasn’t necessarily a Jew. Rather, a Judean was someone who lived in Judah and fell under Judean law and governance. It was a particular population and nothing more. The idea of a religious identity disconnected from all else would take many more centuries to fully form, under the influence of grand totalizing and imperialistic religions like Roman Catholicism. It was upon that basis that later notions of race would develop.

Even with the early diaspora, an absolutely distinct ethno-religious identity hadn’t yet formed. “In the Roman diaspora, certainly after 70 C.E.,” as Cohen explains (Kindle Locations 609-610), “there is no evidence for obsession with genealogical purity and hardly any evidence for public archives and archival records.” Our modern obsessions were irrelevant to ancient people. They didn’t so easily and quickly turn to broad abstract categories. And the categories that did exist, context-dependent as they were, had a mercurial quality to them.

Origins of Ritual Behavior

Here is something from Scientific American, an article by Laura Kehoe: Mysterious Chimpanzee Behavior May Be Evidence of “Sacred” Rituals:

“Even more intriguing than this, maybe we found the first evidence of chimpanzees creating a kind of shrine that could indicate sacred trees. Indigenous West African people have stone collections at “sacred” trees and such man-made stone collections are commonly observed across the world and look eerily similar to what we have discovered here.”

Apparently, this has never before been observed and documented. It is an amazing discovery. Along with tool use, it points toward a central building block of primate society.

I immediately thought of the first evidence of settled civilization. Before humans built homes for themselves in settlements, they built homes for their gods. These first temples likely began quite simply, maybe even as simple as a pile of rocks.

Human society, as we know it, developed around ritual sites. This may have begun much earlier with the common ancestor of both humans and chimpanzees.