Rate of Moral Panic

I’m always looking for historical background that puts our present situation in a new light. We often don’t realize, for example, how different the world was before and after the Second World War. The 1940s and 1950s were a strange time.

There was a brief moment around the mid-century when the number of marriages shot up and people married younger. So, when we compare marriage rates now to those in the post-war period, we get a skewed perspective because that post-war period was extremely abnormal by historical standards (Ana Swanson, 144 years of marriage and divorce in the United States, in one chart). It’s true that marriage rates never returned to the level of that brief marriage (and birth) boom following the war, but then again marriage rates weren’t ever that high earlier either.

In the 1990s, during the height of the culture wars when family values were supposedly under attack, the marriage rate was about the same as it had been from before the Civil War into the early 1900s, the period I’ve referred to as the crisis of identity. In the decades immediately before that, starting around 1970, the marriage rate had been even higher than what was seen in the late 19th century (there is no dependable earlier data). Nor is it that premarital sex has become normalized over time, as young people have always had sex: “leaving out the even lower teen sex rate of GenZ, there isn’t a massive difference between the teen sex rates of Millennials and that of Boomers and Silents” (Rates of Young Sluts).

As another example from this past century, “In 1920, 43 percent of Americans were members of a church; by 1960, that figure had jumped to 63 percent” (Alex Morris, False Idol — Why the Christian Right Worships Donald Trump). Think about that. Most Americans, in the early 1900s, were some combination of unchurched and non-religious or otherwise religiously uninvolved and disinterested. A similar pattern was seen in the colonial era when many people lived in communities that lacked a church. Church membership didn’t begin to rise until the 1800s and apparently declined again with mass urbanization and early industrialization.

By the way, that is closely associated with the issue of marriage. Consider early America, when premarital sex was so common that a large percentage of women got married after pregnancy and many of those marriages were common law, meaning that couples were simply living together. Moral norms were an informal affair that, if and when enforced, came from neighbors and not religious authority figures. Those moral norms were generous enough to allow the commonality of bastards and single parents, although some of that was explained by other issues such as rape and spousal death.

Many early Americans rarely saw a minister, outside of itinerant preachers who occasionally passed by. This is partly why formal marriages were less common. “Historians of American religion have long noted that the colonies did not exude universal piety. There was a general agreement that in the colonial period no more than 10-20 percent of the population actually belonged to a church” (Roger Finke & Rodney Stark, The Churching of America). This was at a time when many governments had state religions, and so churches were associated with oppressiveness, as seen in the rise of non-Christian views (agnosticism, atheism, deism, universalism, unitarianism, etc.) during the revolutionary period.

And don’t get me started on abortion, when perhaps as many as one in five or six pregnancies were aborted in the years right before the American Civil War. That might be related to why fertility rates have been steadily dropping for centuries: “Extending the analysis back further, the White fertility rate declined from 7.04 in 1800 to 5.42 in 1850, to 3.56 in 1900, and 2.98 in 1950. Thus, the White fertility declined for nearly all of American history but may have bottomed out in the 1980s. Black fertility has also been declining for well over 150 years, but it may very well continue to do so in the coming decades” (Ideas and Data, Sex, Marriage, and Children: Trends Among Millennial Women).
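To make the scale of that quoted decline concrete, here is a quick back-of-the-envelope calculation, a minimal sketch in Python using only the fertility figures quoted above:

```python
# White fertility rates quoted above (Ideas and Data), by year.
rates = {1800: 7.04, 1850: 5.42, 1900: 3.56, 1950: 2.98}

# Percent decline over each half-century interval.
years = sorted(rates)
for start, end in zip(years, years[1:]):
    drop = (rates[start] - rates[end]) / rates[start] * 100
    print(f"{start}-{end}: {drop:.1f}% decline")

# Cumulative decline across the whole 150-year span.
total = (rates[1800] - rates[1950]) / rates[1800] * 100
print(f"1800-1950 overall: {total:.1f}% decline")
```

The drop is roughly 23%, 34%, and 16% per half-century, about 58% overall, which is the point: the decline long predates any 20th-century culture war.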

Are we to blame commie liberal hippies traveling back in time to cause the decline of America practically before the country was even founded? Nostalgia is a fantasy and, interestingly, it was once considered a disease. The world is getting worse in some ways, but the main problems we face are real-world crises such as climate change, not namby-pamby cultural paranoia and fear-mongering. The fate of humanity does not rest on promoting the birth rate of native-born American WASPs, nor on the hope that theocracy will save us. If we want to worry about doom, we should be looking at whether the rate of moral panic is experiencing an uptick, something that often precedes the rise of authoritarian mass violence.

* * *

Central to the moral panic of the culture wars has been “family values,” held up as an idealized standard of middle-to-upper-class respectability and responsibility embodied in WASP nuclear families. What relationship does the normative ideal have to the lived reality across American history? See below for a historical analysis.

“Family Values”: The Uses and Abuses of American Family History
by Elaine Tyler May

“Family Values” and Historical Scholarship

How did this national preoccupation emerge, and what does it mean for American political life? First, let us examine the phrase that is most frequently invoked in political debates: “family values.” In the political landscape that has emerged in recent decades, “family values” is a phrase that connotes specific positions on particular issues, and it has highly charged policy implications. It involves a constellation of issues. Under the banner of “family values” we find opposition to legal abortion; support for prayer in schools; opposition to civil rights for gays and lesbians; support for censorship of the arts, movies and popular culture; welfare reform; opposition to gun control; the “war on drugs.” These measures are usually found on the conservative agenda, although liberals have increasingly championed some of them in their efforts to jump on the “family values” bandwagon. Many of these issues have nothing to do with families—but they all have to do with values. And they all inspire fierce passions and heated debates.

It is also clear that “family values” is a term often used as a code and marker of race and class. For example, poor black single mothers, and educated white professional women, are both likely to be blamed for society’s ills as a result of their alleged defiance of “family values.” Presumably, a mother on welfare who goes out and gets a job demonstrates good family values; one who stays home with her kids does not. Yet an educated middle-class woman who goes out and gets a job demonstrates bad family values; one who stays home with her kids does not. The rules change according to racial and class position, as well as marital status. The gender, class, and sexual expectations also change over time. In the 1930s, for example, welfare payments were made to poor mothers to enable them to stay home with their children. Now mothers on welfare are required to hold jobs (Gordon 1994).

Scholars have only recently begun to examine these issues through the lens of history. What used to be called the “new” social history gave rise to flourishing scholarship on the working class, women, gender and sexuality, racial minorities and race relations, immigrants and ethnic groups, family history, and gay and lesbian history. That rich body of scholarship has altered the way history is studied and taught today. The new history expands our understanding of the American past, making it far richer and more complete than it ever was before. But it has also led to criticism that American history has now become so fragmented and particularized that there is no longer any unified understanding of the past that offers to Americans a cohesive view of their national history. This criticism often stems from a desire to replace the new complex multicultural and diverse history with a dominant narrative grounded in the stories and deeds of powerful leaders, returning to a traditional unified narrative that was partial, biased, and left out most Americans (Bender 2002). Nevertheless, these criticisms challenge scholars to bring together aspects of history that are usually studied in isolation from each other. Much of social history has left politics in the background, or left it out altogether. In recent years, several historians have done important work that takes a new look at American politics with the contributions of social history providing the foundation for their scholarship (Coontz 1992; Kerber 1998). Feminists were the first to proclaim that the “personal is political,” and scholars studying women, sexuality, gender, and the family have kept that insight at the center of their scholarship. [5]

We know from the work of these scholars that there was never a “traditional” American family. There has been as much diversity and change in American families as in any other aspect of national life. But the power of the myth continues. In fact, misperceptions of the American family may be more relevant to current political debates than the reality of American families (Coontz; May 1999). For example, many Americans are surprised to learn that contrary to common assumptions, the Puritans did not condemn premarital sex or out-of-wedlock pregnancy (provided the young couple intended to marry); or that abortion was legal during much of the XIXth century; or that the alleged “golden age” of the 1950s’ white middle-class family was marred by rampant alcohol and drug abuse among suburban housewives, and high rates of sexual activity among teenagers (many of whom were married); or that rates of voluntary childlessness were higher a century ago than they are today. Many people believe that American nuclear families were strong, stable and self-reliant until the 1960s, when they began to unravel (Coontz; Stacey).

Scholars eager to set the record straight argue that the family has always been a changing institution, and that claims of its demise are highly exaggerated. They have put great effort into demonstrating that there never was a “traditional” self-sufficient nuclear family to match the mythical ideal. [6] This scholarship is powerful and important. But it does not ask or answer a fundamental question: if change is a constant in the history of the American family, why during certain times—but not all times—do politicians and leaders warn that family decline portends the nation’s doom? I would like to suggest that anxieties about the family emerge at times when national identity, as defined and understood by the American middle class, appears to be threatened—by immigrants, radicals, “communists,” racial or sexual minorities, or feminists. […]

From the founding of the nation, then, the American family had a well-defined political role. Attached to that role were certain assumptions about the structure of the family, its functions, and the specific responsibilities of its members. In the first century of the Republic, gender roles within middle-class families carried civic meanings. As towns and cities grew, most urban households lost their function as centers of production. Instead of working at home, men left to work in the public arena while women remained in the domestic sphere. Men became breadwinners, while women took on the elevated stature of moral guardians and nurturers. Women’s responsibilities included instilling virtue in their families and raising children to be responsible and productive future citizens. The democratic family would be nuclear in structure, freed from undue influence from the older generation, and grounded in these distinct gender roles that were believed to be “natural” —at least for white European-Americans (Ryan 1981).

In the political culture that developed from these expectations, the family had a major responsibility for the well-being of society. The responsibility of the society for the well-being of the family was less well articulated, and defined mostly in the negative. The government was to leave the family alone, not intrude into it, and not provide for it. The family was, presumably, self-sufficient. Politics was the arena where white men, acting as democratic citizens, shaped public policies. The family was the place where white women, spared the corrupting influences of public life, would instill self-sufficiency and virtue into the citizenry.

From the beginning, however, the reality of family life defied those definitions and strained against the normative ideal. The vast majority of Americans lived on farms, or in households that required the productive labor of all adult members of the family. The prevailing middle-class norm in the XIXth century that defined “separate spheres” for men and women never pertained to these families, nor did it reflect the experiences of African-Americans, either during or after slavery. Only the most privileged white Protestant women in the towns and cities had the resources that allowed them to devote themselves full-time to nurturing their families and rearing future citizens. Their leisure time for moral uplift depended upon the labors of other women—African-American slaves, immigrant household servants, and working-class women who toiled in factories—to provide the goods and services that would enable privileged white women to pursue their role as society’s moral guardians. And it was those very women, affluent and educated, who first rebelled against their constrained domestic roles, arguing that the system of coverture denied them their rights as citizens. [7]

At the same time, when social problems developed that appeared to threaten social order, often the family was blamed—particularly those families, or individuals, whose behavior did not conform to the normative family ideal. The family came to be seen as the source or cause of social problems as well as the potential solution or cure. In other words, bad families eroded American society, and good families would restore it. Good families were the key to social order and national progress. Good families were those that conformed to the ideal of the so-called “traditional” American family, a family form that seemed to flourish among the white Protestant middle class in the XIXth century, and allegedly reached its twentieth-century apex, or “golden age,” in the 1950s. Here we find the source of the mythic nuclear family ideal.

A Historical Perspective

The founders of the nation assumed that the white middle-class family, nurtured by women in the private arena protected from the corruptions of commerce and public life, would produce virtuous citizens and provide the foundation for public order. The responsibility of the government was essentially to leave the family alone—not to intervene with either material support or regulation. Marriage laws established heterosexual monogamy as the foundation for families, prohibited unions across racial lines, and determined marital possibilities for immigrants (Cott 2000). Once established, the family was expected to serve its members and the society without government interference. However, by the late XIXth century, observers began to realize that not all families could be counted upon to promote the interests of the white Protestant status quo. Several dramatic developments—the end of slavery and the migration North of thousands of African-Americans, the influx of immigrants, the political activism of middle-class women, the declining birthrate of native-born Protestant Americans, the political power of Irish Catholics in northern cities—made it clear that the government could no longer remain aloof and expect families to take care of the nation. At the turn of the XXth century, the Anglo-Saxon middle class faced major challenges to its hegemonic definition of national identity.

In response, political leaders during the Progressive Era boldly altered the relationship between the family and the state. Progressive reformers no longer assumed that the family would, without support or intervention from the government, maintain civic virtue and social order. President Theodore Roosevelt was the first national leader to articulate a new dimension to the public/private bargain. In his first campaign for the presidency, he brought the family into the center of national political debates. It has remained there ever since.

The Disease of Nostalgia

“The nostalgic is looking for a spiritual addressee. Encountering silence, he looks for memorable signs, desperately misreading them.”
― Svetlana Boym, The Future of Nostalgia

Nostalgia is one of those strange medical conditions from the past, first observed in 17th-century soldiers sent off to foreign lands during that era of power struggles between colonial empires. It has since lost that medical framing, as it is now seen as a mere emotion, mood, or quality. And it has become associated with the reactionary mind and invented traditions. We no longer take it seriously, sometimes even dismissing it as a sign of immaturity.

But it used to be considered a physiological disease with measurable symptoms, such as brain inflammation, and serious repercussions, as the afflicted could literally waste away and die. It was a profound homesickness experienced as an existential crisis of identity, a longing for a particular place and the sense of being uprooted from it. Then the focus shifted from place to time. It became more abstract and, because of that, it lost its medical status. This happened just as a new disease, neurasthenia, took its place in the popular imagination.

In America, nostalgia never took hold to the same degree as it did in Europe. It finally made its appearance in the American Civil War, only to be dismissed as a sign of unmanliness and weak character, a defect and deficiency. It was a disease of civilization, but it strongly affected the least civilized, such as rural farmers. America was sold as a nation of progress, and so attachment to old ways was deemed un-American. Neurasthenia better fit the mood that the ruling elite sought to promote and, unlike nostalgia, it was presented as a disease of the most civilized, although over time it too became a common malady, specifically as it was Europeanized.

Over the centuries, there was a shift in the sense of time. Up through the early colonial era, a cyclical worldview remained dominant (John Demos, Circles and Lines). As time became linear, there was no possibility of a return. The revolutionary era permanently broke the psychological link between past and future. There was even a revolution in the understanding of ‘revolution’ itself, a term that originated from astrology and literally meant a cyclical return. In a return, there is replenishment. But without that possibility, one is thrown back on individual reserves that are limited and must be managed. The capitalist self of hyper-individualism is finally fully formed. That is what neurasthenia was concerned with and so nostalgia lost its explanatory power. In The Future of Nostalgia, Svetlana Boym writes:

“From the seventeenth to the nineteenth century, the representation of time itself changed; it moved away from allegorical human figures— an old man, a blind youth holding an hourglass, a woman with bared breasts representing Fate— to the impersonal language of numbers: railroad schedules, the bottom line of industrial progress. Time was no longer shifting sand; time was money. Yet the modern era also allowed for multiple conceptions of time and made the experience of time more individual and creative.”

As society turned toward an ethos of the dynamic, it became ungrounded and unstable. Some of the last healthy ties to the bicameral mind were severed. (Interestingly, in early diagnoses of nostalgia as a disease, Boym states that, “One of the early symptoms of nostalgia was an ability to hear voices or see ghosts.” That sounds like the bicameral mind re-emerging under conditions of stress, not unlike John Geiger’s third man factor. In nostalgia as in the archaic mind, there is a secret connection between language and music, as united through voice — see Development of Language and Music and Spoken Language: Formulaic, Musical, & Bicameral.)

Archaic authorization mutated into totalitarianism, a new refuge for the anxiety-riddled mind. And the emerging forms of authoritarianism heavily draw upon the nostalgic turn (Ben G. Price, Authoritarian Grammar and Fundamentalist Arithmetic Part II), just as did the first theocracies (religion, writes Julian Jaynes, is “the nostalgic anguish for the lost bicamerality of a subjectively conscious people”), even as or especially because the respectable classes dismissed it. This is courting disaster, for the archaic mind still lives within us, still speaks in the world, even if the voices are no longer recognized.

The first laments of loss echoed out from the rubble of the Bronze Age and, precisely as the longing has grown stronger, the dysfunctions associated with it have become normalized. But how disconnected and lost in abstractions can we get before either we become something entirely else or face another collapse?

“Living amid an ongoing epidemic that nobody notices is surreal. It is like viewing a mighty river that has risen slowly over two centuries, imperceptibly claiming the surrounding land, millimeter by millimeter. . . . Humans adapt remarkably well to a disaster as long as the disaster occurs over a long period of time”
~E. Fuller Torrey & Judy Miller, Invisible Plague

* * *

As a side note, I’d point to utopia as being the other side of the coin to nostalgia. And so the radical is the twin of the reactionary. In a different context, I said something about shame that could apply equally well to nostalgia (“Why are you thinking about this?”): “The issue of shame is a sore spot where conservatism and liberalism have, from their close proximity, rubbed each other raw. It is also a site of much symbolic conflation, the linchpin like a stake in the ground to which a couple of old warriors are tied in their ritual dance of combat and wounding, where both are so focused on one another that neither pays much attention to the stake that binds them together. In circling around, they wind themselves ever tighter and their tethers grow shorter.”

While conversing with someone on the political left, I noticed an old pattern. This guy, although with a slight radical bent, is a fairly mainstream liberal coming out of the Whiggish tradition of ‘moderate’ progressivism, an ideological mindset that is often conservative-minded and sometimes reactionary (e.g., lesser-evil voting no matter how evil it gets). This kind of person is forever pulling their punches. To continue from the same piece, I wrote that, “The conservative’s task is much easier for the reason that most liberals don’t want to untangle the knot, to remove the linchpin. Still, that is what conservatives fear, for they know liberals have that capacity, no matter how unlikely they are to act on it. This fear is real. The entire social order is dependent on overlapping symbolic conflations, each a link in a chain, and so each a point of vulnerability.”

To pull that linchpin would require confronting the concrete issue at hand, getting one’s hands dirty. But that is what the moderate progressive fears, for the liberal mind feels safe and protected within abstractions. Real-world context will always be sacrificed. Such a person mistrusts the nostalgia of the reactionary while maybe fearing even more the utopianism of the radical, flitting back and forth from one to the other and never getting anywhere. So, they entirely retreat from the battle and lose themselves in comforting fantasies of abstract ideals (making them prone to false equivalencies in their dreams of equality). In doing so, despite being well informed, they miss the trees for the forest, miss the reality on the ground for all the good intentions.

Neither nostalgia nor utopianism can offer a solution, even as both indicate the problem. That isn’t to say there is an escape either for that also reinforces the pattern of anxiety, of fear and hope. The narrative predetermines our roles and the possibilities of action. We need a new narrative. The disease model of the human psyche, framed as nostalgia or neurasthenia or depression or anything else, is maybe not so helpful. Yet we have to take seriously that the stress of modernity is not merely something in people’s minds. Scapegoating the individual simply distracts from the failure of individualism. These conditions of identity are both real and imagined — that is what makes them powerful, whatever name they go by and ideology they serve.

* * *

Let me throw out some loose thoughts. There is something that feels off about our society, and it is hard to put one’s finger on. That is why, in our free-floating anxiety, we look for anything to grab hold of. Most of the debates that divide the public are distractions from the real issue, which we don’t know how to face, much less comprehend. These red herrings of social control are what I call symbolic conflation. To put it simply, there is plenty of projecting going on — and it is mutual from all sides involved and it’s extremely distorted.

I’ll leave it at that. What is important for my purposes here is the anxiety itself, the intolerable sense of dissatisfaction or dukkha. Interestingly, this sense gets shifted onto the individual and so further justifies the very individualism that is at the heart of the problem. It is our individuality that makes us feel so ill at ease with the world because it disconnects and isolates us. The individual inevitably fails because individualism is ultimately impossible. We are social creatures through and through. It requires immense effort to create and maintain individuality, and sweet Jesus! is it tiresome. That is the sense of being drained that is common across these many historical conditions, from the earlier melancholia to the present depression and everything in between.

Since the beginning of modernity, there has been a fear that too many individuals are simply not up to the task. When reading about these earlier ‘diseases’, a common thread runs across the long history. The message is always how to make the individual get in line with the modern world, never how to bring the modern world in line with human nature. The show must go on. Progress must continue. There is no going back, so we’re told. Onward and upward. This strain of endless change and uncertainty has required special effort in enculturating and indoctrinating each new generation. In the Middle Ages and in tribal cultures, children weren’t treated as special but were basically considered miniature adults. There was no protected childhood with an extended period to raise, train, and educate the child. But in our society, the individual has to be made, as does the citizen and the consumer. None of this comes naturally and so must be artificially imposed. The child will resist, and more than a few will come out the other side with severe damage, but the sacrifice must be made for the greater good of society.

This was seen, in the United States, most clearly after the American Revolution. Citizen-making became a collective project. Children needed to be shaped into a civic-minded public. And as seen in Europe, adults needed to be forced into a national identity, even if it required bullying or even occasionally burying a few people alive to get the point across: no stragglers allowed! (Nonetheless, a large part of the European population maintained local identities until the world war era.) Turning boys into men became a particular obsession in the early 20th century, with all of the building of parks, advocacy for hunting and fishing, creation of the Boy Scouts, and on and on. Boys used to turn into men spontaneously, without any needed intervention, but with nostalgia and neurasthenia there was a growing fear of effeminacy and degeneracy. The civilizing project was important and had to be done, no matter how many people were harmed in the process, even by genocides. Creating the modern nation-state was a brutal and often bloody endeavor. No one willingly becomes a modern individual. It only happens under threat of violence and punishment.

By the way, this post is essentially an elaboration on my thoughts from another post, The Crisis of Identity. In that other post, I briefly mention nostalgia, but the focus was more on neurasthenia and related topics. It’s an extensive historical survey. This is part of a longer-term intellectual project of mine, in trying to make sense of this society and how it came to be this way. Below are some key posts to consider, although I leave out those related to Jaynesian and related scholarship because that is a large area of thought all on its own (if interested, look at the tags for Consciousness, Bicameral Mind, Julian Jaynes, and Lewis Hyde):

The Transparent Self to Come?
Technological Fears and Media Panics
Western Individuality Before the Enlightenment Age
Juvenile Delinquents and Emasculated Males
The Breast To Rule Them All
The Agricultural Mind
“Yes, tea banished the fairies.”
Autism and the Upper Crust
Diets and Systems
Sleepwalking Through Our Dreams
Delirium of Hyper-Individualism
The Group Conformity of Hyper-Individualism
Individualism and Isolation
Hunger for Connection
To Put the Rat Back in the Rat Park
Rationalizing the Rat Race, Imagining the Rat Park

* * *

The Future of Nostalgia
by Svetlana Boym
pp. 25-30

Nostalgia was said to produce “erroneous representations” that caused the afflicted to lose touch with the present. Longing for their native land became their single-minded obsession. The patients acquired “a lifeless and haggard countenance,” and “indifference towards everything,” confusing past and present, real and imaginary events. One of the early symptoms of nostalgia was an ability to hear voices or see ghosts. Dr. Albert von Haller wrote: “One of the earliest symptoms is the sensation of hearing the voice of a person that one loves in the voice of another with whom one is conversing, or to see one’s family again in dreams.” 2 It comes as no surprise that Hofer’s felicitous baptism of the new disease both helped to identify the existing condition and enhanced the epidemic, making it a widespread European phenomenon. The epidemic of nostalgia was accompanied by an even more dangerous epidemic of “feigned nostalgia,” particularly among soldiers tired of serving abroad, revealing the contagious nature of the erroneous representations.

Nostalgia, the disease of an afflicted imagination, incapacitated the body. Hofer thought that the course of the disease was mysterious: the ailment spread “along uncommon routes through the untouched course of the channels of the brain to the body,” arousing “an uncommon and everpresent idea of the recalled native land in the mind.” 3 Longing for home exhausted the “vital spirits,” causing nausea, loss of appetite, pathological changes in the lungs, brain inflammation, cardiac arrests, high fever, as well as marasmus and a propensity for suicide. 4

Nostalgia operated by an “associationist magic,” by means of which all aspects of everyday life related to one single obsession. In this respect nostalgia was akin to paranoia, only instead of a persecution mania, the nostalgic was possessed by a mania of longing. On the other hand, the nostalgic had an amazing capacity for remembering sensations, tastes, sounds, smells, the minutiae and trivia of the lost paradise that those who remained home never noticed. Gastronomic and auditory nostalgia were of particular importance. Swiss scientists found that rustic mothers’ soups, thick village milk and the folk melodies of Alpine valleys were particularly conducive to triggering a nostalgic reaction in Swiss soldiers. Supposedly the sounds of “a certain rustic cantilena” that accompanied shepherds in their driving of the herds to pasture immediately provoked an epidemic of nostalgia among Swiss soldiers serving in France. Similarly, Scots, particularly Highlanders, were known to succumb to incapacitating nostalgia when hearing the sound of the bagpipes—so much so, in fact, that their military superiors had to prohibit them from playing, singing or even whistling native tunes in a suggestive manner. Jean-Jacques Rousseau talks about the effects of cowbells, the rustic sounds that excite in the Swiss the joys of life and youth and a bitter sorrow for having lost them. The music in this case “does not act precisely as music, but as a memorative sign.” 5 The music of home, whether a rustic cantilena or a pop song, is the permanent accompaniment of nostalgia—its ineffable charm that makes the nostalgic teary-eyed and tongue-tied and often clouds critical reflection on the subject.

In the good old days nostalgia was a curable disease, dangerous but not always lethal. Leeches, warm hypnotic emulsions, opium and a return to the Alps usually soothed the symptoms. Purging of the stomach was also recommended, but nothing compared to the return to the motherland believed to be the best remedy for nostalgia. While proposing the treatment for the disease, Hofer seemed proud of some of his patients; for him nostalgia was a demonstration of the patriotism of his compatriots who loved the charm of their native land to the point of sickness.

Nostalgia shared some symptoms with melancholia and hypochondria. Melancholia, according to the Galenic conception, was a disease of the black bile that affected the blood and produced such physical and emotional symptoms as “vertigo, much wit, headache, . . . much waking, rumbling in the guts . . . troublesome dreams, heaviness of the heart . . . continuous fear, sorrow, discontent, superfluous cares and anxiety.” For Robert Burton, melancholia, far from being a mere physical or psychological condition, had a philosophical dimension. The melancholic saw the world as a theater ruled by capricious fate and demonic play. 6 Often mistaken for a mere misanthrope, the melancholic was in fact a utopian dreamer who had higher hopes for humanity. In this respect, melancholia was an affect and an ailment of intellectuals, a Hamletian doubt, a side effect of critical reason; in melancholia, thinking and feeling, spirit and matter, soul and body were perpetually in conflict. Unlike melancholia, which was regarded as an ailment of monks and philosophers, nostalgia was a more “democratic” disease that threatened to affect soldiers and sailors displaced far from home as well as many country people who began to move to the cities. Nostalgia was not merely an individual anxiety but a public threat that revealed the contradictions of modernity and acquired a greater political importance.

The outburst of nostalgia both enforced and challenged the emerging conception of patriotism and national spirit. It was unclear at first what was to be done with the afflicted soldiers who loved their motherland so much that they never wanted to leave it, or for that matter to die for it. When the epidemic of nostalgia spread beyond the Swiss garrison, a more radical treatment was undertaken. The French doctor Jourdan Le Cointe suggested in his book written during the French Revolution of 1789 that nostalgia had to be cured by inciting pain and terror. As scientific evidence he offered an account of drastic treatment of nostalgia successfully undertaken by the Russians. In 1733 the Russian army was stricken by nostalgia just as it ventured into Germany, the situation becoming dire enough that the general was compelled to come up with a radical treatment of the nostalgic virus. He threatened that “the first to fall sick will be buried alive.” This was a kind of literalization of a metaphor, as life in a foreign country seemed like death. This punishment was reported to be carried out on two or three occasions, which happily cured the Russian army of complaints of nostalgia. 7 (No wonder longing became such an important part of the Russian national identity.) Russian soil proved to be a fertile ground for both native and foreign nostalgia. The autopsies performed on the French soldiers who perished in the proverbial Russian snow during the miserable retreat of the Napoleonic Army from Moscow revealed that many of them had brain inflammation characteristic of nostalgia.

While Europeans (with the exception of the British) reported frequent epidemics of nostalgia starting from the seventeenth century, American doctors proudly declared that the young nation remained healthy and didn’t succumb to the nostalgic vice until the American Civil War. 8 If the Swiss doctor Hofer believed that homesickness expressed love for freedom and one’s native land, two centuries later the American military doctor Theodore Calhoun conceived of nostalgia as a shameful disease that revealed a lack of manliness and unprogressive attitudes. He suggested that this was a disease of the mind and of a weak will (the concept of an “afflicted imagination” would be profoundly alien to him). In nineteenth-century America it was believed that the main reasons for homesickness were idleness and a slow and inefficient use of time conducive to daydreaming, erotomania and onanism. “Any influence that will tend to render the patient more manly will exercise a curative power. In boarding schools, as perhaps many of us remember, ridicule is wholly relied upon. . . . [The nostalgic] patient can often be laughed out of it by his comrades, or reasoned out of it by appeals to his manhood; but of all potent agents, an active campaign, with attendant marches and more particularly its battles is the best curative.” 9 Dr. Calhoun proposed as treatment public ridicule and bullying by fellow soldiers, an increased number of manly marches and battles and improvement in personal hygiene that would make soldiers’ living conditions more modern. (He also was in favor of an occasional furlough that would allow soldiers to go home for a brief period of time.)

For Calhoun, nostalgia was not conditioned entirely by individuals’ health, but also by their strength of character and social background. Among the Americans the most susceptible to nostalgia were soldiers from the rural districts, particularly farmers, while merchants, mechanics, boatmen and train conductors from the same area or from the city were more likely to resist the sickness. “The soldier from the city cares not where he is or where he eats, while his country cousin pines for the old homestead and his father’s groaning board,” wrote Calhoun. 10 In such cases, the only hope was that the advent of progress would somehow alleviate nostalgia and the efficient use of time would eliminate idleness, melancholy, procrastination and lovesickness.

As a public epidemic, nostalgia was based on a sense of loss not limited to personal history. Such a sense of loss does not necessarily suggest that what is lost is properly remembered and that one still knows where to look for it. Nostalgia became less and less curable. By the end of the eighteenth century, doctors discovered that a return home did not always treat the symptoms. The object of longing occasionally migrated to faraway lands beyond the confines of the motherland. Just as genetic researchers today hope to identify a gene not only for medical conditions but social behavior and even sexual orientation, so the doctors in the eighteenth and nineteenth centuries looked for a single cause of the erroneous representations, one so-called pathological bone. Yet the physicians failed to find the locus of nostalgia in their patient’s mind or body. One doctor claimed that nostalgia was a “hypochondria of the heart” that thrives on its symptoms. To my knowledge, the medical diagnosis of nostalgia survived in the twentieth century in one country only—Israel. (It is unclear whether this reflects a persistent yearning for the promised land or for the diasporic homelands left behind.) Everywhere else in the world nostalgia turned from a treatable sickness into an incurable disease. How did it happen that a provincial ailment, maladie du pays , became a disease of the modern age, mal du siècle?

In my view, the spread of nostalgia had to do not only with dislocation in space but also with the changing conception of time. Nostalgia was a historical emotion, and we would do well to pursue its historical rather than psychological genesis. There had been plenty of longing before the seventeenth century, not only in the European tradition but also in Chinese and Arabic poetry, where longing is a poetic commonplace. Yet the early modern conception embodied in the specific word came to the fore at a particular historical moment. “Emotion is not a word, but it can only be spread abroad through words,” writes Jean Starobinski, using the metaphor of border crossing and immigration to describe the discourse on nostalgia. 11 Nostalgia was diagnosed at a time when art and science had not yet entirely severed their umbilical ties and when the mind and body—internal and external well-being—were treated together. This was a diagnosis of a poetic science—and we should not smile condescendingly on the diligent Swiss doctors. Our progeny well might poeticize depression and see it as a metaphor for a global atmospheric condition, immune to treatment with Prozac.

What distinguishes modern nostalgia from the ancient myth of the return home is not merely its peculiar medicalization. The Greek nostos , the return home and the song of the return home, was part of a mythical ritual. […] Modern nostalgia is a mourning for the impossibility of mythical return, for the loss of an enchanted world with clear borders and values; it could be a secular expression of a spiritual longing, a nostalgia for an absolute, a home that is both physical and spiritual, the edenic unity of time and space before entry into history. The nostalgic is looking for a spiritual addressee. Encountering silence, he looks for memorable signs, desperately misreading them.

The diagnosis of the disease of nostalgia in the late seventeenth century took place roughly at the historical moment when the conception of time and history were undergoing radical change. The religious wars in Europe came to an end but the much prophesied end of the world and doomsday did not occur. “It was only when Christian eschatology shed its constant expectations of the immanent arrival of doomsday that a temporality could have been revealed that would be open to the new and without limit.” 13 It is customary to perceive “linear” Judeo-Christian time in opposition to the “cyclical” pagan time of eternal return and discuss both with the help of spatial metaphors. 14 What this opposition obscures is the temporal and historical development of the perception of time that since the Renaissance has become more and more secularized, severed from cosmological vision.

Before the invention of mechanical clocks in the thirteenth century the question, What time is it? was not very urgent. Certainly there were plenty of calamities, but the shortage of time wasn’t one of them; therefore people could exist “in an attitude of temporal ease. Neither time nor change appeared to be critical and hence there was no great worry about controlling the future.” 15 In late Renaissance culture, Time was embodied in the images of Divine Providence and capricious Fate, independent of human insight or blindness. The division of time into Past, Present and Future was not so relevant. History was perceived as a “teacher of life” (as in Cicero’s famous dictum, historia magistra vitae ) and the repertoire of examples and role models for the future. Alternatively, in Leibniz’s formulation, “The whole of the coming world is present and prefigured in that of the present.” 16

The French Revolution marked another major shift in European mentality. Regicide had happened before, but not the transformation of the entire social order. The biography of Napoleon became exemplary for an entire generation of new individualists, little Napoleons who dreamed of reinventing and revolutionizing their own lives. The “Revolution,” at first derived from natural movement of the stars and thus introduced into the natural rhythm of history as a cyclical metaphor, henceforth attained an irreversible direction: it appeared to unchain a yearned-for future. 17 The idea of progress through revolution or industrial development became central to the nineteenth-century culture. From the seventeenth to the nineteenth century, the representation of time itself changed; it moved away from allegorical human figures—an old man, a blind youth holding an hourglass, a woman with bared breasts representing Fate—to the impersonal language of numbers: railroad schedules, the bottom line of industrial progress. Time was no longer shifting sand; time was money. Yet the modern era also allowed for multiple conceptions of time and made the experience of time more individual and creative.

“The Origin of Consciousness, Gains and Losses: Walker Percy vs. Julian Jaynes”
by Laura Mooneyham White
from Gods, Voices, and the Bicameral Mind
ed. by Marcel Kuijsten

Jaynes is plainly one who understands the human yearning for Eden, the Eden of bicameral innocence. He writes of our longings for a return to that lost organization of human mentality, a return to “lost certainty and splendour.” 44 Jones believes, in fact, that Jaynes speaks for himself when he describes the “yearning for divine volition and service [which] is with us still,” 45 of our “nostalgic anguish” which we feel for lost bicamerality. 46 Even schizophrenia, seen from Jaynes’s perspective as a vestige of bicamerality, is the anguishing state it is only because the relapse to bicamerality

is only partial. The learnings that make up a subjective consciousness are powerful and never totally suppressed. And thus the terror and the fury, the agony and the despair. … The lack of cultural support and definition for the voices [heard by schizophrenics] … provide a social withdrawal from the behavior of the absolutely social individual of bicameral societies. … [W]ithout this source of security, … living with hallucinations that are unacceptable and denied as unreal by those around him, the florid schizophrenic is in an opposite world to that of the god-owned laborers of Marduk. … [He] is a mind bared to his environment, waiting on gods in a godless world. 47

Jones, in fact, asserts that Jaynes’s discussion of schizophrenia is held in terms “reminiscent of R. D. Laing’s thesis that schizophrenics are the only sane people in our insane world.” 48 Jones goes on to say that “Jaynes, it would seem, holds that we would all be better off if ‘everyone’ were once again schizophrenic, if we could somehow return to a bicameral society which had not yet been infected by the disease of thinking.” 49

Jaynes does not, in my opinion, intimate a position nearly as reactionary as this; he has in fact made elsewhere an explicit statement to the effect that he himself feels no such longing to return to bicamerality, that he would in fact “shudder” at such a return. 50 Nonetheless, Jaynes does seem at some points in his book to describe introspection as a sort of pathological development in human history. For instance, instead of describing humanity’s move towards consciousness as liberating, Jaynes calls it “the slow inexorable profaning of our species.” 51 And no less an eminence than Northrop Frye recognized this tendency in Jaynes to disvalue consciousness. After surveying Jaynes’s argument and admitting the fascination of that argument’s revolutionary appeal, Frye points out that Jaynes’s ideas provoke a disturbing reflection: “seeing what a ghastly mess our egocentric consciousness has got us into, perhaps the sooner we get back to … hallucinations the better.” Frye expands his discussion of Jaynes to consider the cultural ramifications of this way of thinking, what he terms “one of the major cultural trends of our time”:

It is widely felt that our present form of consciousness, with its ego center, has become increasingly psychotic, incapable of dealing with the world, and that we must develop a more intensified form of consciousness, recapturing many of … Jaynes’ ‘bicameral’ features, if we are to survive the present century. 52

Frye evidently has little sympathy with such a position which would hold that consciousness is a “late … and on the whole regrettable arrival on the human scene” 53 rather than the wellspring of all our essentially human endeavors and achievements: art, philosophy, religion and science. The ground of this deprecatory perspective on consciousness, that is, a dislike or distrust of consciousness, has been held by many modern and postmodern thinkers and artists besides Jaynes, among them Sartre, Nietzsche, Faulkner, Pynchon, Freud, and Lacan, so much so that we might identify such an ill opinion of consciousness as a peculiarly modern ideology.

“Remembrance of Things (Far) Past”
by Julian Jaynes
from The Julian Jaynes Collection
ed. by Marcel Kuijsten

And nostalgia too. For with time metaphored as space, so like the space of our actual lives, a part of us solemnly keeps loitering behind, trying to visit past times as if they were actual spaces. Oh, what a temptation is there! The warm, sullen longing to return to scenes long vanished, to relive some past security or love, to redress some ancient wrong or redecide a past regret, or alter some ill-considered actions toward someone lost to our present lives, or to fill out past omissions — these are artifacts of our new remembering consciousness. Side effects. And they are waste and filler unless we use them to learn about ourselves.

Memory is a privilege for us who are born into the last three millennia. It is both an advantage and a predicament, liberation and an imprisonment. Memory is not a part of our biological evolution, as is our capacity to learn habits or simple knowings. It is an off-shoot of consciousness acquired by mankind only a hundred generations ago. It is thus the new environment of modern man. It is one in which we sometimes are like legal aliens waiting for naturalization. The feeling of full franchise and citizenship in that new environment is a quest that is the unique hidden adventure of us all.

The Suffering System
by David Loy

In order to understand why that anxiety exists, we must relate dukkha to another crucial Buddhist term, anatta, or “non-self.” Our basic frustration is due most of all to the fact that our sense of being a separate self, set apart from the world we are in, is an illusion. Another way to express this is that the ego-self is ungrounded, and we experience this ungroundedness as an uncomfortable emptiness or hole at the very core of our being. We feel this problem as a sense of lack, of inadequacy, of unreality, and in compensation we usually spend our lives trying to accomplish things that we think will make us more real.

But what does this have to do with social challenges? Doesn’t it imply that social problems are just projections of our own dissatisfaction? Unfortunately, it’s not that simple. Being social beings, we tend to group our sense of lack, even as we strive to compensate by creating collective senses of self.

In fact, many of our social problems can be traced back to this deluded sense of collective self, this “wego,” or group ego. It can be defined as one’s own race, class, gender, nation (the primary secular god of the modern world), religion, or some combination thereof. In each case, a collective identity is created by discriminating one’s own group from another. As in the personal ego, the “inside” is opposed to the other “outside,” and this makes conflict inevitable, not just because of competition with other groups, but because the socially constructed nature of group identity means that one’s own group can never feel secure enough. For example, our GNP is not big enough, our nation is not powerful (“secure”) enough, we are not technologically developed enough. And if these are instances of group-lack or group-dukkha, our GNP can never be big enough, our military can never be powerful enough, and we can never have enough technology. This means that trying to solve our economic, political, and ecological problems with more of the same is a deluded response.

Juvenile Delinquents and Emasculated Males

I was reminded of an old post of mine where I discussed an unintentionally humorous bumper sticker: “Kids who hunt, fish, and trap don’t mug little old ladies.” The logic being used is rather odd, the former having little to do with the latter. It just makes me smile.

The fact of the matter is that few kids do any of those things. It’s true that most kids who hunt, fish, and trap don’t mug little old ladies. But then again, it’s likewise true that most kids who don’t hunt, fish, and trap also don’t mug little old ladies. Despite the paranoia of right-wing media, there isn’t a pandemic of juvenile delinquents taking advantage of the elderly.

The culture wars never die. In one form or another, they’ve been going on for a long time. The same kind of rhetoric can be found even centuries ago. It’s a powerful worldview, eliciting generational conflict. It seems that adults have always complained about kids being worse than they were before, as if the entirety of civilization has been a slow decline from a Golden Age when perfect children once were obedient little angels.

Seeing that post again, I remembered a book I read about a decade ago: Jackson Lears’ Rebirth of a Nation. The author explained why manliness and character building suddenly became an obsession around the turn of the twentieth century. It led to stocking rivers with game fish, the creation of the Boy Scouts, and greater emphasis put on team sports.

It was far from a new concern. It was built on the Jeffersonian view of agrarian democracy. Immediately following the revolution, a fear arose that the next generation of children needed to be carefully shaped into good citizens. The wholesome farm life was a major focus, especially among the ruling elite who worried about the unruly underclass. This worry grew over time. What exacerbated the fears over the following generations is that the mid-to-late 1800s saw the beginnings of mass industrialization and urbanization, along with the commercialization of every aspect of life, such as the emergence of a consumer economy and consumer culture. The consumer-citizen didn’t fit the heroic mold of old democratic-republican ideals of masculinity.

It relates to why Southerners worried about the end of slavery. It wasn’t just about blacks being free. It was a sign of the times, the end of the independent farmer and the rise of paid labor. Many worried that this would simply be a new form of slavery. How could a man be a man when he was as dependent as a child on another for his living?

This was a collective concern. And so society turned to collective answers. This contributed to the push for Prohibition and public schooling. It was a sense that boys and young men, in particular, had lost some essential element of character that once came naturally to their agrarian ancestors. This new generation would have to be taught how to be real men by teaching them hunting, fishing, trapping, sports, etc.

* * *

Rebirth of a Nation:
The Making of Modern America, 1877-1920
By Jackson Lears
pp. 27-29

But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:

“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”

This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.

This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.

Fearful Perceptions

They all look the same.

That is a stereotypical racist statement, an excuse for generalizing, but it isn’t just rhetoric. It is directly related to perception and so is the basis of racism itself. You first have to perceive people as the same in order to perceive them as a race in the first place.

I’ve even heard otherwise well-meaning people make comments like this, with no self-awareness of the racist implications of it. Most racism operates unconsciously and implicitly.

Then this informs specifically how an individual is seen. For example, all people perceived as ‘black’ also are perceived as older and guiltier—see the MNT article:

“The evidence shows that perceptions of the essential nature of children can be affected by race, and for black children, this can mean they lose the protection afforded by assumed childhood innocence well before they become adults,” said co-author Matthew Jackson, PhD, also of UCLA. “With the average age overestimation for black boys exceeding four-and-a-half years, in some cases, black children may be viewed as adults when they are just 13 years old.”

Consider another aspect of perception, that of generations over time. Most people, especially as they age, look to the past with nostalgia. The world used to be a better place and the people were better too.

I’ve explored this before with the rates of teen sexuality and all that goes with it. Many older people assume that a generation of sluts has emerged. It is true that kids now talk more openly about sex and no doubt sexual imagery is more easily accessible in movies and on the web.

Even so, it turns out the kids these days are prudes compared to past generations. Abortion rates are going down not just because of improved sex education and increased use of birth control. It’s simply less of an issue because the young’uns apparently are having less sex and it sure is hard to get pregnant without sex. To emphasize this point, they also have lower rates of STDs, another hard thing to get without sex.

On top of that, they are “partaking in less alcohol, tobacco, and drugs.” Not just prudes, but “boring prudes.”

None of that fits public perception, though. Everyone seems to know the world is getting worse. I’m not necessarily one to argue against the claim that the world is going to shit. There is no doubt plenty going wrong. Still, I do try to not generalize too much.

The other article I noticed, by Mike Males at CJCJ, is also about changes in crime rates.

Imagine that a time-liberated version of vigilante George Zimmerman sees two youths walking through his neighborhood: black, hoodied Trayvon Martin of 2012, and a white teen from 1959 (say Bud Anderson from Father Knows Best). Based purely on statistics of race and era, which one should Zimmerman most fear of harboring criminal intent? Answer: He should fear (actually, not fear) them equally; each has about the same low odds of committing a crime.

So, why are young blacks such an obsession of our collective fear?

In the town I live in, white kids commit crimes all the time and it rarely gets covered by the local media, but if any black kids step out of line it is major news. Over about a decade (1997-2009), there were two incidents where police shot an innocent man, one white and the other black. Guess which caused the most outrage? Guess which one now has a memorial in the local city park? Let me give you a hint: It wasn’t the black guy, despite his having been fairly well known in town and well liked by those who knew him.

Further on in the CJCJ article, the author points out that:

We don’t associate Jim and Margaret Anderson’s 1950s cherubs with juvenile crime—but that’s based on nostalgia and cultural biases, not fact. Back then, nearly 1 in 10 youth were arrested every year; today, around 3 in 100. Limited statistics of the 1950s show juvenile crime wasn’t just pranks and joyriding; “younger and younger children” are committing “the most wanton and senseless of murders… and mass rape,” the chair of the Senate Subcommittee on Juvenile Delinquency warned in 1956.

We certainly don’t associate 1950s white kids as having been dangerous criminals. Even so, if you look back at the period, you quickly realize that adults during that era were scared shitless of the new generation, between the new media of television and the emergence of full-blown Cold War paranoia. To get a sense of how kids were perceived back then, watch the movie “Village of the Damned.”

And, with immigration barely a trickle, that was when whites came to hold the largest majority at any time in American history. Following decades of racial laws and practices, it was the perfect white utopia, or as perfect as it was going to get.

It is true that there was a decline, one that began with Boomers and was perfected with my own GenX peers. I’ve written about that issue a lot. The economy was beginning its slow decline and lead toxicity rates shot up like never before. So, the parents were losing their good jobs while the kids’ brains were being poisoned, a great combination. The whole world was shifting beneath the American population, and that rarely leads to good results. Communities and families were under extreme stress, often to the breaking point.

Since the sainted Fifties, America has seen rapid teenage population growth and dramatic shifts toward more single parenting, more lethal drugs and weapons, increased middle-aged (that is, parent-age) drug abuse and imprisonment, decreased incarceration of youth, decreased youthful religious affiliation, and more violent and explicit media available to younger ages. Horrifying, as the culture critics far Right to far Left—including Obama, who spends many pages and speeches berating popular culture as some major driver of bad youth behavior—repeatedly insist.

It used to be that blacks were blamed for almost everything. They still are blamed for plenty and disproportionately so. Yet the political right has started to viciously turn on its own favored group, the white working class. Charles Murray did that in his recent book, Coming Apart, where he almost entirely ignored blacks in order to focus on the divide emerging between whites, sorted into the lower-class losers and the upper-class meritocracy.

In a post from last year, I pointed to some articles discussing Murray’s book. One article (by Paul Krugman over at Truthout) makes a relevant point:

Reading Mr. Murray’s book and all the commentary about the sources of moral collapse among working-class whites, I’ve had a nagging question: Is it really all that bad?

I mean, yes, marriage rates are way down, and labor force participation is down among working-age men (although not as much as some of the rhetoric might imply), but it’s generally left as an implication that these trends must be causing huge social ills. Are they?

Well, one thing oddly missing in Mr. Murray’s work is any discussion of that traditional indicator of social breakdown, teenage pregnancy. Why? Because it has actually been falling like a stone, according to National Vital Statistics data.

And what about crime? It’s soaring, right? Wrong, according to Justice Department data.

So here’s a thought: maybe traditional social values are eroding in the white working class — but maybe those traditional social values aren’t as essential to a good society as conservatives like to imagine.

Nell Irvin Painter at NYT offers this thought:

Involuntary sterilization is no longer legal, and intelligence is recognized as a complex interplay between biology and environment. Indeed, the 1960s, the era that Mr. Murray blames for the moral failings that have driven poor and middle-class white America apart, was the very same era that stemmed the human rights abuse of involuntary sterilization. (Not coincidentally, it was the same era that began addressing the discrimination that entrenched black poverty as well.)

The stigmatization of poor white families more than a century ago should provide a warning: behaviors that seem to have begun in the 1960s belong to a much longer and more complex history than ideologically driven writers like Mr. Murray would have us believe.

Considering it all, who should we fear? That is, who should we fear besides Muslims, immigrants, and foreigners? Should we fear blacks? The young? Or the poor? Fortunately, we don’t have to choose between our fears. Any combination of black, young, and poor will do—all three together, of course, being the worst.

When fear drives perception, we perceive a fearful world. To release the tension of anxiety and paranoia, someone has to be the scapegoat: whatever group is easiest to generalize about without any confusing emotions of empathy, which in practice means those with the least power to speak out and be heard. The generalizations don’t need to correspond to reality, just as long as a good narrative can be spun in the mainstream media.