Diet and Industrialization, Gender and Class

Below are a couple of articles about the shift in diet since the 19th century. Earlier Americans ate a lot of meat, lard, and butter. It’s how everyone ate — women and men, adults and children — as that was what was available and everyone ate meals together. Then there was a decline in consumption of both red meat and lard in the early 20th century (dairy has also seen a decline). The changes created a divergence in who was eating what.

It’s interesting that, amid moral panic and identity crisis, diets became gendered as a way of reinforcing social roles and the social order. It’s strange that industrialization and this gendering happened simultaneously, although maybe it’s not so strange. It was largely industrialization, in altering society so dramatically, that caused the sense of panic and crisis. So diet also became heavily politicized and used for social engineering, a self-conscious campaign to create a new kind of society built on individualism and the nuclear family.

This period also saw the rise of the middle class as an ideal, along with increasing class anxiety and class war. This led to the popularity of cookbooks within bourgeois culture, as the foods one ate came to define not only gender identity but also class identity. As grains and sugar were only becoming widely available in the 19th century with improved agriculture and international trade, the first popular cookbooks were focused on dessert recipes (Liz Susman Karp, Eliza Leslie: The Most Influential Cookbook Writer of the 19th Century). Before that, desserts had been limited to the rich.

Capitalism was transforming everything. The emerging industrial diet was self-consciously created to not only sell products but to sell an identity and lifestyle. It was an entire vision of what defined the good life. Diet became an indicator of one’s place in society, what one aspired toward or was expected to conform to.

* * *

How Steak Became Manly and Salads Became Feminine
Food didn’t become gendered until the late 19th century.
by Paul Freedman

Before the Civil War, the whole family ate the same things together. The era’s best-selling household manuals and cookbooks never indicated that husbands had special tastes that women should indulge.

Even though “women’s restaurants” – spaces set apart for ladies to dine unaccompanied by men – were commonplace, they nonetheless served the same dishes as the men’s dining room: offal, calf’s heads, turtles and roast meat.

Beginning in the 1870s, shifting social norms – like the entry of women into the workplace – gave women more opportunities to dine without men and in the company of female friends or co-workers.

As more women spent time outside of the home, however, they were still expected to congregate in gender-specific places.

Chain restaurants geared toward women, such as Schrafft’s, proliferated. They created alcohol-free safe spaces for women to lunch without experiencing the rowdiness of workingmen’s cafés or free-lunch bars, where patrons could get a free midday meal as long as they bought a beer (or two or three).

It was during this period that the notion that some foods were more appropriate for women started to emerge. Magazines and newspaper advice columns identified fish and white meat with minimal sauce, as well as new products like packaged cottage cheese, as “female foods.” And of course, there were desserts and sweets, which women, supposedly, couldn’t resist.

How Crisco toppled lard – and made Americans believers in industrial food
by Helen Zoe Veit

For decades, Crisco had only one ingredient, cottonseed oil. But most consumers never knew that. That ignorance was no accident.

A century ago, Crisco’s marketers pioneered revolutionary advertising techniques that encouraged consumers not to worry about ingredients and instead to put their trust in reliable brands. It was a successful strategy that other companies would eventually copy. […]

It was only after a chemist named David Wesson pioneered industrial bleaching and deodorizing techniques in the late 19th century that cottonseed oil became clear, tasteless and neutral-smelling enough to appeal to consumers. Soon, companies were selling cottonseed oil by itself as a liquid or mixing it with animal fats to make cheap, solid shortenings, sold in pails to resemble lard.

Shortening’s main rival was lard. Earlier generations of Americans had produced lard at home after autumn pig slaughters, but by the late 19th century meat processing companies were making lard on an industrial scale. Lard had a noticeable pork taste, but there’s not much evidence that 19th-century Americans objected to it, even in cakes and pies. Instead, its issue was cost. While lard prices stayed relatively high through the early 20th century, cottonseed oil was abundant and cheap. […]

In just five years, Americans were annually buying more than 60 million cans of Crisco, the equivalent of three cans for every family in the country. Within a generation, lard went from being a major part of American diets to an old-fashioned ingredient. […]

In the decades that followed Crisco’s launch, other companies followed its lead, introducing products like Spam, Cheetos and Froot Loops with little or no reference to their ingredients.

Once ingredient labeling was mandated in the U.S. in the late 1960s, the multisyllabic ingredients in many highly processed foods may have mystified consumers. But for the most part, they kept on eating.

So if you don’t find it strange to eat foods whose ingredients you don’t know or understand, you have Crisco partly to thank.

 

Rate of Moral Panic

I’m always looking for historical background that puts our present situation in new light. We often don’t realize, for example, how different the world was before and after the Second World War. The 1940s and 1950s were a strange time.

There was a brief moment around the mid-century when the number of marriages shot up and people married younger. So, when we compare marriage rates now to those in the post-war period, we get a skewed perspective because that post-war period was extremely abnormal by historical standards (Ana Swanson, 144 years of marriage and divorce in the United States, in one chart). It’s true that marriage rates never returned to the level of that brief marriage (and birth) boom following the war, but then again marriage rates weren’t ever that high earlier either.

In the 1990s, during the height of the culture wars when family values were supposedly under attack, the marriage rate was about the same as it had been from before the Civil War into the early 1900s, the period I’ve referred to as the crisis of identity. In the decades immediately before that, starting around 1970, the marriage rate had been even higher than what was seen in the late 19th century (there isn’t dependable earlier data). Nor is it that premarital sex has become normalized over time, as young people have always had sex: “leaving out the even lower teen sex rate of GenZ, there isn’t a massive difference between the teen sex rates of Millennials and that of Boomers and Silents” (Rates of Young Sluts).

As another example from this past century, “In 1920, 43 percent of Americans were members of a church; by 1960, that figure had jumped to 63 percent” (Alex Morris, False Idol — Why the Christian Right Worships Donald Trump). Think about that. Most Americans, in the early 1900s, were some combination of unchurched and non-religious or otherwise religiously uninvolved and disinterested. A similar pattern was seen in the colonial era when many people lived in communities that lacked a church. Church membership didn’t begin to rise until the 1800s and apparently declined again with mass urbanization and early industrialization.

By the way, that is closely associated with the issue of marriage. Consider early America, when premarital sex was so common that a large percentage of women got married after pregnancy and many of those marriages were common law, meaning that couples were simply living together. Moral norms were an informal affair that, if and when enforced, came from neighbors and not religious authority figures. Those moral norms were generous enough to allow the commonality of bastards and single parents, although some of that was explained by other issues such as rape and spousal death.

Many early Americans rarely saw a minister, outside of itinerant preachers who occasionally passed by. This is partly why formal marriages were less common. “Historians of American religion have long noted that the colonies did not exude universal piety. There was a general agreement that in the colonial period no more than 10-20 percent of the population actually belonged to a church” (Roger Finke & Rodney Stark, The Churching of America). This was at a time when many governments had state religions and so churches were associated with oppressiveness, as seen with the rise of non-Christian views (agnosticism, atheism, deism, universalism, unitarianism, etc) during the revolutionary period.

And don’t get me started on abortion, considering that perhaps as many as one in five or six pregnancies were aborted in the years right before the American Civil War. That might be related to why fertility rates have been steadily dropping for centuries: “Extending the analysis back further, the White fertility rate declined from 7.04 in 1800 to 5.42 in 1850, to 3.56 in 1900, and 2.98 in 1950. Thus, the White fertility declined for nearly all of American history but may have bottomed out in the 1980s. Black fertility has also been declining for well over 150 years, but it may very well continue to do so in the coming decades” (Ideas and Data, Sex, Marriage, and Children: Trends Among Millennial Women).

Are we to blame commie liberal hippies traveling back in time to cause the decline of America practically before the country was even founded? Nostalgia is a fantasy and, interestingly, it is also a disease. The world is getting worse in some ways, but the main problems we face are real-world crises such as climate change, not namby-pamby cultural paranoia and fear-mongering. The fate of humanity does not rest on promoting the birth rate of native-born American WASPs nor on the hope that theocracy will save us. If we want to worry about doom, we should be looking at whether the rate of moral panic is experiencing an uptick, something that often precedes the rise of authoritarian mass violence.

Moral Panic and Physical Degeneration

From the beginning of the country, there has been an American fear of moral and mental decline that was always rooted in the physical, involving issues of vitality of land and health of the body, and built on an ancient divide between the urban and rural. Over time, it grew into a fever pitch of moral panic about degeneration and degradation of the WASP culture, the white race, and maybe civilization itself. Some saw the end was near, maybe being able to hold out for another few generations before finally succumbing to disease and weakness. The need for revitalization and rebirth became a collective project (Jackson Lears, Rebirth of a Nation), which sadly fed into ethno-nationalist bigotry and imperialistic war-mongering — Make America Great Again!

A major point of crisis, of course, was the Civil War. Racial ideology became predominant, not only because of slavery but maybe more so because of mass immigration, the latter being the main reason the North won. Racial tensions merged with the developing scientific mindset of Darwinism, and out of this mix came eugenics. For all we can now dismiss this kind of simplistic ignorance and with hindsight see the danger it led to, the underlying anxieties were real. Urbanization and industrialization were having an obvious impact on public health that was observed by many, and it wasn’t limited to mere physical ailments. “Cancer, like insanity, seems to increase with the progress of civilization,” noted Stanislas Tanchou, a mid-19th century French physician.

The diseases of civilization, including mental sickness, have been spreading for centuries (millennia, actually, considering that ‘modern’ chronic health conditions were first detected in the mummies of the agricultural Egyptians). Consider how talk of depression suddenly showed up in written accounts with the ending of feudalism (Barbara Ehrenreich, Dancing in the Street). That era included the enclosure movement that forced millions of newly landless serfs into the desperate conditions of crowded cities and colonies, where they faced stress, hunger, malnutrition, and disease. The loss of rural life hit Europe much earlier than America, but it eventually came here as well. The majority of white Americans were urban by the beginning of the 20th century, and the majority of black Americans were urban by the 1970s. There has been a consistent pattern of mass problems following urbanization, everywhere it happens. It still is happening. The younger generation, more urbanized than any generation before, is seeing rising rates of psychosis that are specifically concentrated in the most urbanized areas.

In the United States, it was the last decades of the 19th century that were the turning point, the period of the first truly big cities. Into this milieu, Weston A. Price was born (1870) in a small rural village in Canada. As an adult, he became a dentist and sought work in Cleveland, Ohio (1893). Initially, most of his patients probably had, like him, grown up in rural areas. But over the decades, he was increasingly exposed to younger generations who had spent their entire lives in the city. Lierre Keith puts Price’s early observations in context, after pointing out that he started his career in 1893: “This date is important, as he entered the field just prior to the glut of industrial food. Over the course of the next thirty years, he watched children’s dentition — and indeed their overall health — deteriorate. There were suddenly children whose teeth didn’t fit in their mouths, children with foreshortened jaws, children with lots of cavities. Not only were their dental arches too small, but he noticed their nasal passages were also too narrow, and they had poor health overall; asthma, allergies, behavioral problems” (The Vegetarian Myth, p. 187). This was at the time when the industrialization of farming and food had reached a new level, far beyond the limited availability of canned foods in the mid-to-late 1800s, when most Americans still relied on a heavy amount of wild-sourced meat, fish, nuts, etc. Even city-dwellers in early America had ready access to wild game because of the abundance of surrounding wilderness areas. In fact, in the 19th century, the average American ate more meat (mostly hunted) than bread.

We are once again coming back to the ever-recurrent moral panic about the civilizational project. The same fears given voice in the late 19th to early 20th century are being repeated again. For example, Dr. Leonard Sax alerts us to how girls are sexually maturing early (1% of female infants showing signs of puberty), whereas boys are maturing later. As a comparison, hunter-gatherers don’t have such a large gender disparity of puberty, nor do they experience puberty so early for girls; instead, both genders typically reach puberty around 18 years old, with sex, pregnancy, and marriage happening more or less simultaneously. Dr. Sax, along with others, speculates about a number of reasons. Common causes held responsible include health factors, from diet to chemicals. Beyond altered puberty, many other examples could be added: heart disease, autoimmune disorders, mood disorders, autism, ADHD, etc., all of them increasing and worsening with each generation (e.g., type 2 diabetes used to be known as adult-onset diabetes but now is regularly diagnosed in young children; the youngest victim recorded recently was three years old when diagnosed).

In the past, Americans responded to moral panic with genocide of Native Americans, Prohibition targeting ethnic (hyphenated) Americans and the poor, and immigration restrictions to keep the bad sort out; the spread of racism and vigilantism such as the KKK, Jim Crow, sundown towns, and redlining, forced assimilation such as English-only laws and public schools, and internment camps for not only Japanese-Americans but also German-Americans and Italian-Americans; implementation of citizen-making projects like the national park system, Boy Scouts, WPA, and CCC; promotion of eugenics, the war on poverty (i.e., war on the poor), imperial expansionism, neo-colonial exploitation, and world wars; et cetera. The cure sought was often something to be forced onto the population by a paternalistic elite, that is to say rich white males, most specifically WASPs of the capitalist class.

Eugenics was, of course, one of the main focuses as it carried the stamp of science (or rather scientism). Yet at the same time, there were those challenging biological determinism and race realism, as views shifted toward environmental explanations. The anthropologists were at the front lines of this battle, but there were also Social Christians who changed their minds after having seen poverty firsthand. Weston A. Price, however, didn’t come to this from a consciously ideological position or religious motivation. He was simply a dentist who couldn’t ignore the severe health issues of his patients. So, he decided to travel the world in order to find healthy populations to study, in the hope of explaining why the change had occurred (Nutrition and Physical Degeneration).

Although familiar with eugenics literature, what Price observed in ‘primitive’ communities (including isolated villages in Europe) did not conform to eugenicist thought. It didn’t matter which population he looked at. Those who ate traditional diets were healthy and those who ate an industrialized Western diet were not. And it was a broad pattern that he saw everywhere he went, not only physical health but also neurocognitive health as indicated by happiness, low anxiety, and moral character. Instead of blaming individuals or races, he saw the common explanation as nutrition and he made a strong case by scientifically analyzing the nutrition of available foods.

In reading about traditional foods, paleo diet/lifestyle, and functional medicine, Price’s work comes up quite often. He took many photographs comparing people from healthy and unhealthy populations. The contrast is stark. But what really stands out is how few people in the modern world look anywhere near as healthy as those from the healthiest societies of the past. I live in a fairly wealthy college and medical town where there is a far above average concern for health along with access to healthcare. Even so, I now can’t help noticing how many people around me show signs of stunted or perturbed development of the exact kind Price observed in great detail: thin bone structure, sunken chests, sloping shoulders, narrow facial features, asymmetry, etc. That is even with modern healthcare correcting some of the worst conditions: cavities, underbites, pigeon-toes, etc. My fellow residents in this town are among the most privileged people in the world and, nonetheless, their health is a sad commentary on the state of humanity at present.

It makes me wonder, as it made Price wonder, what consequences this has on neurocognitive health for individuals and the moral health of society. Taken alone, it isn’t enough to get excited about. But put in a larger context of looming catastrophes and it does become concerning. It’s not clear that our health will be up to the task of the problems we need to solve. We are a sickly population, far more sickly than when moral panic took hold in past generations.

As important, there is the personal component. I’m at a point where I’m not going to worry too much about decline and maybe collapse of civilization. I’m kind of hoping the American Empire will meet its demise. Still, that leaves us with many who suffer, no matter what happens to society as a whole. I take that personally, as one who has struggled with physical and mental health issues. And I’ve come around to Price’s view of nutrition as being key. I see these problems in other members of my family and it saddens me to watch as health conditions seem to get worse from one generation to the next.

It’s far from being a new problem, the central point I’m trying to make here. Talking to my mother, she has a clear sense of the differences on the two sides of her family. Her mother’s family came from rural areas and, even after moving to a larger city for work, they continued to hunt on a daily basis as there were nearby fields and woods that made that possible. They were a healthy, happy, and hard-working lot. They got along well as a family. Her father’s side of the family was far different. They had been living in towns and cities for several generations by the time she was born. They didn’t hunt at all. They were known for being surly, holding grudges, and being mean drunks. They also had underbites (i.e., underdeveloped jaw structure) and seemed to have had learning disabilities, though no one was diagnosing such conditions back then. Related to this difference, my mother’s father raised rabbits whereas my mother’s mother’s family hunted rabbits (and other wild game). This makes a big difference in terms of nutrition, as wild game has higher levels of omega-3 fatty acids and fat-soluble vitamins, all of which are key to optimal health and development.

What my mother observed in her family is basically the same as what Price observed in hundreds of communities in multiple countries on every continent. And I now observe the same pattern repeating. I grew up with an underbite. My brothers and I all required orthodontic work, as do so many now. I was diagnosed with a learning disability when young. Maybe not a learning disability, but behavioral issues were apparent when my oldest brother was young, likely related to his mildew allergies and probably an underlying autoimmune condition. I know I had food allergies as a child, as I think my other brother did as well. All of us have had neurocognitive and psychological issues of a fair diversity, besides learning disabilities: stuttering, depression, anxiety, and maybe some Asperger’s.

Now another generation is coming along with increasing rates of major physical and mental health issues. My nieces and nephews are sick all the time. They don’t eat well and are probably malnourished. During a medical checkup for my nephew, my mother asked the doctor about his extremely unhealthy diet, consisting mostly of white bread and sugar. The doctor bizarrely dismissed it as ‘normal’ in that, as she claimed, no kid eats healthy. If that is the new normal, maybe we should be in a moral panic.

* * *

Violent Behavior: A Solution in Plain Sight
by Sylvia Onusic

Nutrition and Mental Development
by Sally Fallon Morell

You Are What You Eat: The Research and Legacy of Dr. Weston Andrew Price
by John Larabell

While practicing in his Cleveland office, Dr. Price noticed an increase in dental problems among the younger generations. These issues included the obvious dental caries (cavities) as well as improper jaw development leading to crowded, crooked teeth. In fact, the relatively new orthodontics industry was at that time beginning to gain popularity. Perplexed by these modern problems that seemed to be affecting a greater and greater portion of the population, Dr. Price set about to research the issue by examining people who did not display such problems. He suspected (correctly, as he would later find) that many of the dental problems, as well as other degenerative health problems, that were plaguing modern society were the result of inadequate nutrition owing to the increasing use of refined, processed foods.

Nasty, Brutish and Short?
by Sally Fallon Morell

It seems as if the twentieth century will exit with a crescendo of disease. Things were not so bad back in the 1930’s, but the situation was already serious enough to cause one Cleveland, Ohio dentist to be concerned. Dr. Weston Price was reluctant to accept the conditions exhibited by his patients as normal. Rarely did an examination of an adult patient reveal anything but rampant decay, often accompanied by serious problems elsewhere in the body, such as arthritis, osteoporosis, diabetes, intestinal complaints and chronic fatigue. (They called it neurasthenia in Price’s day.) But it was the dentition of younger patients that alarmed him most. Price observed that crowded, crooked teeth were becoming more and more common, along with what he called “facial deformities”: overbites, narrowed faces, underdevelopment of the nose, lack of well-defined cheekbones and pinched nostrils. Such children invariably suffered from one or more complaints that sound all too familiar to mothers of the 1990’s: frequent infections, allergies, anemia, asthma, poor vision, lack of coordination, fatigue and behavioral problems. Price did not believe that such “physical degeneration” was God’s plan for mankind. He was rather inclined to believe that the Creator intended physical perfection for all human beings, and that children should grow up free of ailments.

Is it Mental or is it Dental?
by Raymond Silkman

The widely held model of orthodontics, which considers developmental problems in the jaws and head to be genetic in origin, never made sense to me. Since they are wedded to the genetic model, orthodontists dealing with crowded teeth end up treating the condition with tooth extraction in a majority of the cases. Even though I did not resort to pulling teeth in my practice, and I was using appliances to widen the jaws and getting the craniums to look as they should, I still could not come up with the answer as to why my patients looked the way they did. I couldn’t believe that the Creator had given them a terrible blueprint – it just did not make sense. In four years of college education, four years of dental school education and almost three years of post-graduate orthodontic training, students never hear a mention of Dr. Price, so they never learn the true reasons for these malformations. I have had the opportunity to work with a lot of very knowledgeable doctors in various fields of allopathic and alternative healthcare who still do not know about Dr. Price and his critical findings.

These knowledgeable doctors have not stared in awe at the beautiful facial development that Price captured in the photographs he took of primitive peoples throughout the globe and in so doing was able to answer this most important question: What do humans look like in health? And how have humans been able to carry on throughout history and populate such varied geographical and physical environments on the earth without our modern machines and tools?

The answer that Dr. Price was able to illuminate came through his photographs of beautiful, healthy human beings with magnificent physical form and mental development, living in harmony with their environments. […]

People who are not well oxygenated and who have poor posture often suffer from fatigue and fibromyalgia symptoms, they snore and have sleep apnea, they have sinusitis and frequent ear infections. Life becomes psychologically and physically challenging for them and they end up with long-term dependence on medications—and all of that just from the seemingly simple condition of crowded teeth.

In other words, people with poor facial development are not going to live very happily. […]

While very few people have heard of the work of Weston Price these days, we haven’t lost our ability to recognize proper facial form. To make it in today’s society, you must have good facial development. You’re not going to see a general or a president with a weak chin, you’re not going to see coaches with weak chins, you’re not going to see a lot of well-to-do personalities in the media with underdeveloped faces and chins. You don’t see athletes and newscasters with narrow palates and crooked teeth.

Weston A. Price: An Unorthodox Dentist
by Nourishing Israel

Price discovered that the native foods eaten by the isolated populations were far more nutrient dense than the modern foods. In the first generation that changed their diet there was noticeable tooth decay; in subsequent generations the dental and facial bone structure changed, as well as other changes that were seen in American and European families and previously considered to be the result of interracial marriage.

By studying the different routes that the same populations had taken – traditional versus modern diet – he saw that the health of the children is directly related to the health of the parents and the germ plasms that they provide, which are as important to the child’s makeup as the health of the mother before and during pregnancy.

Price also found that primitive populations were very conscious of the importance of the mothers’ health and many populations made sure that girls were given a special diet for several months before they were allowed to marry.

Another interesting finding was that although genetic makeup was important, it did not have as great a degree of influence on a person’s development and health as was thought, but that a lot of individual characteristics, including brain development and brain function, were due to environmental influence, what he called “intercepted heredity”.

The origin of personality and character appear in the light of the newer data to be biologic products and to a much less degree than usually considered pure hereditary traits. Since these various factors are biologic, being directly related to both the nutrition of the parents and to the nutritional environment of the individuals in the formative and growth period any common contributing factor such as food deficiencies due to soil depletion will be seen to produce degeneration of the masses of people due to a common cause. Mass behavior therefore, in this new light becomes the result of natural forces, the expression of which may not be modified by propaganda but will require correction at the source. [1] …

It will be easy for the reader to be prejudiced since many of the applications suggested are not orthodox. I suggest that conclusions be deferred until the new approach has been used to survey the physical and mental status of the reader’s own family, of his brothers and sisters, of associated families, and finally, of the mass of people met in business and on the street. Almost everyone who studies the matter will be surprised that such clear-cut evidence of a decline in modern reproductive efficiency could be all about us and not have been previously noted and reviewed.[2]

From Nutrition and Physical Degeneration by Weston Price

Food Freedom – Nourishing Raw Milk
by Lisa Virtue

In 1931 Price visited the people of the Loetschental Valley in the Swiss Alps. Their diet consisted of rye bread, milk, cheese and butter, including meat once a week (Price, 25). The milk was collected from pastured cows, and was consumed raw: unpasteurized, unhomogenized (Schmid, 9).

Price described these people as having “stalwart physical development and high moral character…superior types of manhood, womanhood and childhood that Nature has been able to produce from a suitable diet and…environment” (Price, 29). At this time, tuberculosis had taken more lives in Switzerland than any other disease. The Swiss government ordered an inspection of the valley, revealing not a single case. No deaths had been recorded from tuberculosis in the history of the Loetschental people (Schmid, 8). Upon returning home, Price had dairy samples from the valley sent to him throughout the year. These samples were higher in minerals and vitamins than samples from commercial (thus pasteurized) dairy products in America and the rest of Europe. The Loetschental milk was particularly high in fat-soluble vitamin D (Schmid, 9).

Their daily intake of calcium, phosphorus, and fat-soluble vitamins would have been higher than that of the average North American child. These children were strong and sturdy, playing barefoot in the glacial waters into the late chilly evenings. Of all the children in the valley eating primitive foods, cavities were detected at an average of 0.3 per child (Price, 25). This without visiting a dentist or physician, for the valley had none, seeing as there was no need (Price, 23). To offer some perspective, the rate of cavities per child between the ages of 6-19 in the United States has been recorded to be 3.25, over 10 times the rate seen in Loetschental (Nagel).

Price offers some perspective on a society subsisting mainly on raw dairy products: “One immediately wonders if there is not something in the life-giving vitamins and minerals of the food that builds not only great physical structures within which their souls reside, but builds minds and hearts capable of a higher type of manhood…” (Price, 26).

100 Years Before Weston Price
by Nancy Henderson

Like Price, Catlin was struck by the beauty, strength and demeanor of the Native Americans. “The several tribes of Indians inhabiting the regions of the Upper Missouri. . . are undoubtedly the finest looking, best equipped, and most beautifully costumed of any on the Continent.” Writing of the Blackfoot and Crow, tribes who hunted buffalo on the rich glaciated soils of the American plains, “They are the happiest races of Indian I have met—picturesque and handsome, almost beyond description.”

“The very use of the word savage,” wrote Catlin, “as it is applied in its general sense, I am inclined to believe is an abuse of the word, and the people to whom it is applied.” […]

As did Weston A. Price one hundred years later, Catlin noted the fact that moral and physical degeneration came together with the advent of civilized society. In his late 1830s portrait of “Pigeon’s Egg Head (The Light) Going to and Returning from Washington,” Catlin painted him corrupted with “gifts of the great white father” upon his return to his native homeland. Those gifts included two bottles of whiskey in his pockets. […]

Like Price, Catlin discusses the issue of heredity versus environment. “No diseases are natural,” he writes, “and deformities, mental and physical, are neither hereditary nor natural, but purely the result of accidents or habits.”

So wrote Dr. Price: “Neither heredity nor environment alone cause our juvenile delinquents and mental defectives. They are cripples, physically, mentally and morally, which could have and should have been prevented by adequate education and by adequate parental nutrition. Their protoplasm was not normally organized.”

The Right Price
by Weston A. Price Foundation

Many commentators have criticized Price for attributing “decline in moral character” to malnutrition. But it is important to realize that the subject of “moral character” was very much on the minds of commentators of his day. As with changes in facial structure, observers in the first half of the 20th century blamed “badness” in people on race mixing, or on genetic defects. Price quotes A.C. Jacobson, author of a 1926 publication entitled Genius (Some Revaluations),[35] who stated that “The Jekyll-Hydes of our common life are ethnic hybrids.” Said Jacobson, “Aside from the effects of environment, it may safely be assumed that when two strains of blood will not mix well a kind of ‘molecular insult’ occurs which the biologists may some day be able to detect beforehand, just as blood is now tested and matched for transfusion.” The implied conclusion to this assertion is that “degenerates” can be identified through genetic testing and “weeded out” by sterilizing the unfit–something that was imposed on many women during the period and endorsed by powerful individuals, including Oliver Wendell Holmes.

It is greatly to Price’s credit that he objected to this arrogant point of view: “Most current interpretations are fatalistic and leave practically no escape from our succession of modern physical, mental and moral cripples. . . If our modern degeneration were largely the result of incompatible racial stocks as indicated by these premises, the outlook would be gloomy in the extreme.”[36] Price argued that nutritional deficiencies affecting the physical structure of the body can also affect the brain and nervous system; and that while “bad” character may be the result of many influences–poverty, upbringing, displacement, etc.–good nutrition also plays a role in creating a society of cheerful, compassionate individuals.[36]

Rebirth of a Nation:
The Making of Modern America, 1877-1920
By Jackson Lears
pp. 7-9

By the late nineteenth century, dreams of rebirth were acquiring new meanings. Republican moralists going back to Jefferson’s time had long fretted about “overcivilization,” but the word took on sharper meaning among the middle and upper classes in the later decades of the nineteenth century. During the postwar decades, “overcivilization” became not merely a social but an individual condition, with a psychiatric diagnosis. In American Nervousness (1880), the neurologist George Miller Beard identified “neurasthenia,” or “lack of nerve force,” as the disease of the age. Neurasthenia encompassed a bewildering variety of symptoms (dyspepsia, insomnia, nocturnal emissions, tooth decay, “fear of responsibility, of open places or closed places, fear of society, fear of being alone, fear of fears, fear of contamination, fear of everything, deficient mental control, lack of decision in trifling matters, hopelessness”), but they all pointed to a single overriding effect: a paralysis of the will.

The malady identified by Beard was an extreme version of a broader cultural malaise—a growing sense that the Protestant ethic of disciplined achievement had reached the end of its tether, had become entangled in the structures of an increasingly organized capitalist society. Ralph Waldo Emerson unwittingly predicted the fin de siècle situation. “Every spirit makes its house,” he wrote in “Fate” (1851), “but afterwards the house confines the spirit.” The statement presciently summarized the history of nineteenth-century industrial capitalism, on both sides of the Atlantic.

By 1904, the German sociologist Max Weber could put Emerson’s proposition more precisely. The Protestant ethic of disciplined work for godly ends had created an “iron cage” of organizations dedicated to the mass production and distribution of worldly goods, Weber argued. The individual striver was caught in a trap of his own making. The movement from farm to factory and office, and from physical labor outdoors to sedentary work indoors, meant that more Europeans and North Americans were insulated from primary processes of making and growing. They were also caught up in subtle cultural changes—the softening of Protestantism into platitudes; the growing suspicion that familiar moral prescriptions had become mere desiccated, arbitrary social conventions. With the decline of Christianity, the German philosopher Friedrich Nietzsche wrote, “it will seem for a time as though all things had become weightless.”

Alarmists saw these tendencies as symptoms of moral degeneration. But a more common reaction was a diffuse but powerful feeling among the middle and upper classes—a sense that they had somehow lost contact with the palpitating actuality of “real life.” The phrase acquired unprecedented emotional freight during the years around the turn of the century, when reality became something to be pursued rather than simply experienced. This was another key moment in the history of longing, a swerve toward the secular. Longings for this-worldly regeneration intensified when people with Protestant habits of mind (if not Protestant beliefs) confronted a novel cultural situation: a sense that their way of life was being stifled by its own success.

On both sides of the Atlantic, the drive to recapture “real life” took myriad cultural forms. It animated popular psychotherapy and municipal reform as well as avant-garde art and literature, but its chief institutional expression was regeneration through military force. As J. A. Hobson observed in Imperialism (1902), the vicarious identification with war energized jingoism and militarism. By the early twentieth century, in many minds, war (or the fantasy of it) had become the way to keep men morally and physically fit. The rise of total war between the Civil War and World War I was rooted in longings for release from bourgeois normality into a realm of heroic struggle. This was the desperate anxiety, the yearning for rebirth, that lay behind official ideologies of romantic nationalism, imperial progress, and civilizing mission—and that led to the trenches of the Western Front.

Americans were immersed in this turmoil in peculiarly American ways. As the historian Richard Slotkin has brilliantly shown, since the early colonial era a faith in regeneration through violence underlay the mythos of the American frontier. With the closing of the frontier (announced by the U.S. census in 1890), violence turned outward, toward empire. But there was more going on than the refashioning of frontier mythology. American longings for renewal continued to be shaped by persistent evangelical traditions, and overshadowed by the shattering experience of the Civil War. American seekers merged Protestant dreams of spiritual rebirth with secular projects of purification—cleansing the body politic of secessionist treason during the war and political corruption afterward, reasserting elite power against restive farmers and workers, taming capital in the name of the public good, reviving individual and national vitality by banning the use of alcohol, granting women the right to vote, disenfranchising African-Americans, restricting the flow of immigrants, and acquiring an overseas empire.

Of course not all these goals were compatible. Advocates of various versions of rebirth—bodybuilders and Prohibitionists, Populists and Progressives, Social Christians and Imperialists—all laid claims to legitimacy. Their crusades met various ends, but overall they relieved the disease of the fin de siècle by injecting some visceral vitality into a modern culture that had seemed brittle and about to collapse. Yearning for intense experience, many seekers celebrated Force and Energy as ends in themselves. Such celebrations could reinforce militarist fantasies but could also lead in more interesting directions—toward new pathways in literature and the arts and sciences. Knowledge could be revitalized, too. William James, as well as Houdini and Roosevelt, was a symbol of the age.

The most popular forms of regeneration had a moral dimension.

pp. 27-29

But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:

“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”

This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.

This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.

pp. 63-71

NOT MANY AMERICAN men, even among the comparatively prosperous classes, were as able as Carnegie and Rockefeller to master the tensions at the core of their culture. Success manuals acknowledged the persistent problem of indiscipline, the need to channel passion to productive ends. Often the language of advice literature was sexually charged. In The Imperial Highway (1881), Jerome Bates advised:

[K]eep cool, have your resources well in hand, and reserve your strength until the proper time arrives to exert it. There is hardly any trait of character or faculty of intellect more valuable than the power of self-possession, or presence of mind. The man who is always “going off” unexpectedly, like an old rusty firearm, who is easily fluttered and discomposed at the appearance of some unforeseen emergency; who has no control over himself or his powers, is just the one who is always in trouble and is never successful or happy.

The assumptions behind this language are fascinating and important to an understanding of middle- and upper-class Americans in the Gilded Age. Like many other purveyors of conventional wisdom—ministers, physicians, journalists, health reformers—authors of self-help books assumed a psychic economy of scarcity. For men, this broad consensus of popular psychology had sexual implications: the scarce resource in question was seminal fluid, and one had best not be diddling it away in masturbation or even nocturnal emissions. This was easier said than done, of course, as Bates indicated, since men were constantly addled by insatiable urges, always on the verge of losing self-control—the struggle to keep it was an endless battle with one’s own darker self. Spiritual, psychic, and physical health converged. What Freud called “‘civilized’ sexual morality” fed directly into the “precious bodily fluids” school of health management. The man who was always “‘going off’ unexpectedly, like an old rusty firearm,” would probably be sickly as well as unsuccessful—sallow, sunken-chested, afflicted by languorous indecision (which was how Victorian health literature depicted the typical victim of what was called “self-abuse”).

But as this profile of the chronic masturbator suggests, scarcity psychology had implications beyond familiar admonitions to sexual restraint. Sexual scarcity was part of a broader psychology of scarcity; the need to conserve semen was only the most insistently physical part of a much more capacious need to conserve psychic energy. As Bates advised, the cultivation of “self-possession” allowed you to “keep your resources well in hand, and reserve your strength until the proper time arrives to exert it.” The implication was that there was only so much strength available to meet demanding circumstances and achieve success in life. The rhetoric of “self-possession” had financial as well as sexual connotations. To preserve a cool, unruffled presence of mind (to emulate Rockefeller, in effect) was one way to stay afloat on the storm surges of the business cycle.

The object of this exercise, at least for men, was personal autonomy—the ownership of one’s self. […]

It was one thing to lament excessive wants among the working class, who were supposed to be cultivating contentment with their lot, and quite another to find the same fault among the middle class, who were supposed to be improving themselves. The critique of middle-class desire posed potentially subversive questions about the dynamic of dissatisfaction at the core of market culture, about the very possibility of sustaining a stable sense of self in a society given over to perpetual jostling for personal advantage. The ruinous results of status-striving led advocates of economic thrift to advocate psychic thrift as well.

By the 1880s, the need to conserve scarce psychic resources was a commonly voiced priority among the educated and affluent. Beard’s American Nervousness had identified “the chief and primary cause” of neurasthenia as “modern civilization,” which placed unprecedented demands on limited emotional energy. “Neurasthenia” and “nervous prostration” became catchall terms for a constellation of symptoms that today would be characterized as signs of chronic depression—anxiety, irritability, nameless fears, listlessness, loss of will. In a Protestant culture, where effective exercise of will was the key to individual selfhood, the neurasthenic was a kind of anti-self—at best a walking shadow, at worst a bedridden invalid unable to make the most trivial choices or decisions. Beard and his colleagues—neurologists, psychiatrists, and self-help writers in the popular press—all agreed that nervous prostration was the price of progress, a signal that the psychic circuitry of “brain workers” was overloaded by the demands of “modern civilization.”

While some diagnoses of this disease deployed electrical metaphors, the more common idiom was economic. Popular psychology, like popular economics, was based on assumptions of scarcity: there was only so much emotional energy (and only so much money) to go around. The most prudent strategy was the husbanding of one’s resources as a hedge against bankruptcy and breakdown. […]

Being reborn through a self-allowed regime of lassitude was idiosyncratic, though important as a limiting case. Few Americans had the leisure or the inclination to engage in this kind of Wordsworthian retreat. Most considered neurasthenia at best a temporary respite, at worst an ordeal. They strained, if ambivalently, to be back in harness.

The manic-depressive psychology of the business class mimicked the lurching ups and downs of the business cycle. In both cases, assumptions of scarcity underwrote a pervasive defensiveness, a circle-the-wagons mentality. This was the attitude that lay behind the “rest cure” devised by the psychiatrist Silas Weir Mitchell, who proposed to “fatten” and “redden” the (usually female) patient by isolating her from all mental and social stimulation. (This nearly drove the writer Charlotte Perkins Gilman crazy, and inspired her story “The Yellow Wallpaper.”) It was also the attitude that lay behind the fiscal conservatism of the “sound-money men” on Wall Street and in Washington—the bankers and bondholders who wanted to restrict the money supply by tying it to the gold standard. Among the middle and upper classes, psyche and economy alike were haunted by the common specter of scarcity. But there were many Americans for whom scarcity was a more palpable threat.

AT THE BOTTOM of the heap were the urban poor. To middle-class observers they seemed little more than a squalid mass jammed into tenements that were festering hives of “relapsing fever,” a strange malady that left its survivors depleted of strength and unable to work. The disease was “the most efficient recruiting officer pauperism ever had,” said a journalist investigating tenement life in the 1870s. Studies of “the nether side of New York” had been appearing for decades, but—in the young United States at least—never before the Gilded Age had the story of Dives and Lazarus been so dramatically played out, never before had wealth been so flagrant, or poverty been so widespread and so unavoidably appalling. The army of thin young “sewing-girls” trooping off in the icy dawn to sweatshops all over Manhattan, the legions of skilled mechanics forced by high New York rents to huddle with their families amid a crowd of lowlifes, left without even a pretense of privacy in noisome tenements that made a mockery of the Victorian cult of home—these populations began to weigh on the bourgeois imagination, creating concrete images of the worthy, working poor.

pp. 99-110

Racial animosities flared in an atmosphere of multicultural fluidity, economic scarcity, and sexual rivalry. Attitudes arising from visceral hostility acquired a veneer of scientific objectivity. Race theory was nothing new, but in the late nineteenth century it mutated into multiple forms, many of them characterized by manic urgency, sexual hysteria, and biological determinism. Taxonomists had been trying to arrange various peoples in accordance with skull shape and brain size for decades; popularized notions of natural selection accelerated the taxonomic project, investing it more deeply in anatomical details. The superiority of the Anglo-Saxon—according to John Fiske, the leading pop-evolutionary thinker—arose not only from the huge size of his brain, but also from the depth of its furrows and the plenitude of its creases. The most exalted mental events had humble somatic origins. Mind was embedded in body, and both could be passed on to the next generation.

The year 1877 marked a crucial development in this hereditarian synthesis: in that year, Richard Dugdale published the results of his investigation into the Juke family, a dull-witted crew that had produced more than its share of criminals and mental defectives. While he allowed for the influence of environment, Dugdale emphasized the importance of inherited traits in the Juke family. If mental and emotional traits could be inherited along with physical ones, then why couldn’t superior people be bred like superior dogs or horses? The dream of creating a science of eugenics, dedicated to improving and eventually even perfecting human beings, fired the reform imagination for decades. Eugenics was a kind of secular millennialism, a vision of a society where biological engineering complemented social engineering to create a managerial utopia. The intellectual respectability of eugenics, which lasted until the 1930s, when it became associated with Nazism, underscores the centrality of racialist thinking among Americans who considered themselves enlightened and progressive. Here as elsewhere, racism and modernity were twinned.

Consciousness of race increasingly pervaded American culture in the Gilded Age. Even a worldview as supple as Henry James’s revealed its moorings in conventional racial categories when, in The American (1877), James presented his protagonist, Christopher Newman, as a quintessential Anglo-Saxon but with echoes of the noble Red Man, with the same classical posture and physiognomy. There was an emerging kinship between these two groups of claimants to the title “first Americans.” The iconic American, from this view, was a blend of Anglo-Saxon refinement and native vigor. While James only hints at this, in less than a generation such younger novelists as Frank Norris and Jack London would openly celebrate the rude vitality of the contemporary Anglo-Saxon, proud descendant of the “white savages” who subdued a continent. It should come as no surprise that their heroes were always emphatically male. The rhetoric of race merged with a broader agenda of masculine revitalization.[…]

By the 1880s, muscular Christians were sweeping across the land, seeking to meld spiritual and physical renewal, establishing institutions like the Young Men’s Christian Association. The YMCA provided prayer meetings and Bible study to earnest young men with spiritual seekers’ yearnings, gyms and swimming pools to pasty young men with office workers’ midriffs. Sometimes they were the same young men. More than any other organization, the YMCA aimed to promote the symmetry of character embodied in the phrase “body, mind, spirit”—which a Y executive named Luther Gulick plucked from Deuteronomy and made the motto of the organization. The key to the Y’s appeal, a Harper’s contributor wrote in 1882, was the “overmastering conviction” of its members: “The world always respects manliness, even when it is not convinced [by theological argument]; and if the organizations did not sponsor that quality in young men, they would be entitled to no respect.” In the YMCA, manliness was officially joined to a larger agenda.

For many American Protestants, the pursuit of physical fitness merged with an encompassing vision of moral and cultural revitalization—one based on the reassertion of Protestant self-control against the threats posed to it by immigrant masses and mass-marketed temptation. […]

Science and religion seemed to point in the same direction: Progress and Providence were one.

Yet the synthesis remained precarious. Physical prowess, the basis of national supremacy, could not be taken for granted. Strong acknowledged in passing that Anglo-Saxons could be “devitalized by alcohol and tobacco.” Racial superiority could be undone by degenerate habits. Even the most triumphalist tracts contained an undercurrent of anxiety, rooted in the fear of flab. The new stress on the physical basis of identity began subtly to undermine the Protestant synthesis, to reinforce the suspicion that religion was a refuge for effeminate weaklings. The question inevitably arose, in some men’s minds: What if the YMCA and muscular Christianity were not enough to revitalize tired businessmen and college boys?

Under pressure from proliferating ideas of racial “fitness,” models of manhood became more secular. Despite the efforts of muscular Christians to reunite body and soul, the ideal man emerging among all classes by the 1890s was tougher and less introspective than his mid-Victorian predecessors. He was also less religious. Among advocates of revitalization, words like “Energy” and “Force” began to dominate discussion—often capitalized, often uncoupled from any larger frameworks of moral or spiritual meaning, and often combined with racist assumptions. […]

The emerging worship of force raised disturbing issues. Conventional morality took a backseat to the celebration of savage strength. After 1900, in the work of a pop-Nietzschean like Jack London, even criminality became a sign of racial vitality: as one of his characters says, “We whites have been land-robbers and sea-robbers from remotest time. It is in our blood, I guess, and we can’t get away from it.” This reversal of norms did not directly challenge racial hierarchies, but the assumptions behind it led toward disturbing questions. If physical prowess was the mark of racial superiority, what was one to make of the magnificent specimens of manhood produced by allegedly inferior races? Could it be that desk-bound Anglo-Saxons required an infusion of barbarian blood (or at least the “barbarian virtues” recommended by Theodore Roosevelt)? Behind these questions lay a primitivist model of regeneration, to be accomplished by incorporating the vitality of the vanquished, dark-skinned other. The question was how to do that and maintain racial purity.

pp. 135-138

Yet to emphasize the gap between country and the city was not simply an evasive exercise: dreams of bucolic stillness or urban energy stemmed from motives more complex than mere escapist sentiment. City and country were mother lodes of metaphor, sources for making sense of the urban-industrial revolution that was transforming the American countryside and creating a deep sense of discontinuity in many Americans’ lives during the decades after the Civil War. If the city epitomized the attraction of the future, the country embodied the pull of the past. For all those who had moved to town in search of excitement or opportunity, rural life was ineluctably associated with childhood and memory. The contrast between country and city was about personal experience as well as political economy. […]

REVERENCE FOR THE man of the soil was rooted in the republican tradition. In his Notes on the State of Virginia (1785), Jefferson articulated the antithesis that became central to agrarian politics (and to the producerist worldview in general)—the contrast between rural producers and urban parasites. “Those who labour in the earth are the chosen people of God, if ever he had a chosen people, whose breasts he has made his peculiar deposit for substantial and genuine virtue,” he announced. “Corruption of morals in the mass of cultivators is a phenomenon of which no age nor nation has furnished an example. It is the mark set on those, who not looking up to heaven, to their own soil and industry, as does the husbandman, for their subsistence, depend for it on the casualties and caprice of customers. Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the design of ambition.” Small wonder, from this view, that urban centers of commerce seemed to menace the public good. “The mobs of great cities,” Jefferson concluded, “add just so much to the support of pure government as sores do to the strength of the human body.” Jefferson’s invidious distinctions echoed through the nineteenth century, fueling the moral passion of agrarian rebels. Watson, among many, considered himself a Jeffersonian.

There were fundamental contradictions embedded in Jefferson’s conceptions of an independent yeomanry. Outside certain remote areas in New England, most American farmers were not self-sufficient in the nineteenth century—nor did they want to be. Many were eager participants in the agricultural market economy, animated by a restless, entrepreneurial spirit. Indeed, Jefferson’s own expansionist policies, especially the Louisiana Purchase, encouraged centrifugal movement as much as permanent settlement. “What developed in America,” the historian Richard Hofstadter wrote, “was an agricultural society whose real attachment was not to the land but to land values.” The figure of the independent yeoman, furnishing enough food for himself and his family, participating in the public life of a secure community—this icon embodied longings for stability amid a maelstrom of migration.

Often the longings were tinged with a melancholy sense of loss. […] For those with Jeffersonian sympathies, abandoned farms were disturbing evidence of cultural decline. As a North American Review contributor wrote in 1888: “Once let the human race be cut off from personal contact with the soil, once let the conventionalities and artificial restrictions of so-called civilization interfere with the healthful simplicity of nature, and decay is certain.” Romantic nature-worship had flourished fitfully among intellectuals since Emerson had become a transparent eye-ball on the Concord common and Whitman had loafed among leaves of grass. By the post–Civil War decades, romantic sentiment combined with republican tradition to foster forebodings. Migration from country to city, from this view, was a symptom of disease in the body politic. Yet the migration continued. Indeed, nostalgia for rural roots was itself a product of rootlessness. A restless spirit, born of necessity and desire, spun Americans off in many directions—but mainly westward. The vision of a stable yeomanry was undercut by the prevalence of the westering pioneer.

pp. 246-247

Whether energy came from within or without, it was as limitless as electricity apparently was. The obstacles to access were not material—class barriers or economic deprivation were never mentioned by devotees of abundance psychology—they were mental and emotional. The most debilitating emotion was fear, which cropped up constantly as the core problem in diagnoses of neurasthenia. The preoccupation with freeing oneself from internal constraints undermined the older, static ideal of economic self-control at its psychological base. As one observer noted in 1902: “The root cause of thrift, which we all admire and preach because it is so convenient to the community, is fear, fear of future want; and that fear, we are convinced, when indulged overmuch by pessimist minds is the most frequent cause of miserliness….” Freedom from fear meant freedom to consume.

And consumption began at the dinner table. Woods Hutchinson claimed in 1913 that the new enthusiasm for calories was entirely appropriate to a mobile, democratic society. The old “stagnation” theory of diet merely sought to maintain the level of health and vigor; it was a diet for slaves or serfs, for people who were not supposed to rise above their station. “The new diet theory is based on the idea of progress, of continuous improvement, of never resting satisfied with things as they are,” Hutchinson wrote. “No diet is too liberal or expensive that will…yield good returns on the investment.” Economic metaphors for health began to focus on growth and process rather than stability, on consumption and investment rather than savings.

As abundance psychology spread, a new atmosphere of dynamism enveloped old prescriptions for success. After the turn of the century, money was less often seen as an inert commodity, to be gradually accumulated and tended to steady growth; and more often seen as a fluid and dynamic force. To Americans enraptured by the strenuous life, energy became an end in itself—and money was a kind of energy. Success mythology reflected this subtle change. In the magazine hagiographies of business titans—as well as in the fiction of writers like Dreiser and Norris—the key to success frequently became a mastery of Force (as those novelists always capitalized it), of raw power. Norris’s The Pit (1903) was a paean to the furious economic energies concentrated in Chicago. “It was Empire, the restless subjugation of all this central world of the lakes and prairies. Here, mid-most in the land, beat the Heart of the nation, whence inevitably must come its immeasurable power, its infinite, inexhaustible vitality. Here of all her cities, throbbed the true life—the true power and spirit of America: gigantic, crude, with the crudity of youth, disdaining rivalry; sane and healthy and vigorous; brutal in its ambition, arrogant in the new-found knowledge of its giant strength, prodigal of its wealth, infinite in its desires.” This was the vitalist vision at its most breathless and jejune, the literary equivalent of Theodore Roosevelt’s adolescent antics.

The new emphasis on capital as Force translated the psychology of abundance into economic terms. The economist who did the most to popularize this translation was Simon Nelson Patten, whose The New Basis of Civilization (1907) argued that the United States had passed from an “era of scarcity” to an “era of abundance” characterized by the unprecedented availability of mass-produced goods. His argument was based on the confident assumption that human beings had learned to control the weather. “The Secretary of Agriculture recently declared that serious crop failures will occur no more,” Patten wrote. “Stable, progressive farming controls the terror, disorder, and devastation of earlier times. A new agriculture means a new civilization.” Visions of perpetual growth were in the air, promising both stability and dynamism.

The economist Edward Atkinson pointed the way to a new synthesis with a hymn to “mental energy” in the Popular Science Monthly. Like other forms of energy, it was limitless. “If…there is no conceivable limit to the power of mind over matter or to the number of conversions of force that can be developed,” he wrote, “it follows that pauperism is due to want of mental energy, not of material resources.” Redistribution of wealth was not on the agenda; positive thinking was.

pp. 282-283

TR’s policies were primarily designed to protect American corporations’ access to raw materials, investment opportunities, and sometimes markets. The timing was appropriate. In the wake of the merger wave of 1897–1903, Wall Street generated new pools of capital, while Washington provided new places to invest it. Speculative excitement seized many among the middle and upper classes who began buying stocks for the first time. Prosperity spread even among the working classes, leading Simon Nelson Patten to detect a seismic shift from an era of scarcity to an era of abundance. For him, a well-paid working population committed to ever-expanding consumption would create what he called The New Basis of Civilization (1907).

Patten understood that the mountains of newly available goods were in part the spoils of empire, but he dissolved imperial power relations in a rhetoric of technological determinism. The new abundance, he argued, depended not only on the conquest of weather but also on the annihilation of time and space—a fast, efficient distribution system that provided Americans with the most varied diet in the world, transforming what had once been luxuries into staples of even the working man’s diet. “Rapid distribution of food carries civilization with it, and the prosperity that gives us a Panama canal with which to reach untouched tropic riches is a distinctive laborer’s resource, ranking with refrigerated express and quick freight carriage.” The specific moves that led to the seizure of the Canal Zone evaporated in the abstract “prosperity that gives us a Panama Canal,” which in turn became as much a boon to the workingman as innovative transportation. Empire was everywhere, in Patten’s formulation, and yet nowhere in sight.

What Patten implied (rather than stated overtly) was that imperialism underwrote expanding mass consumption, raising standards of living for ordinary folk. “Tropic riches” became cheap foods for the masses. The once-exotic banana was now sold from pushcarts for 6 cents a dozen, “a permanent addition to the laborer’s fund of goods.” The same was true of “sugar, which years ago was too expensive to be lavishly consumed by the well-to-do,” but “now freely gives its heat to the workingman,” as Patten wrote. “The demand that will follow the developing taste for it can be met by the vast quantities latent in Porto Rico and Cuba, and beyond them by the teeming lands of South America, and beyond them by the virgin tropics of another hemisphere.” From this view, the relation between empire and consumption was reciprocal: if imperial policies helped stimulate consumer demand, consumer demand in turn promoted imperial expansion. A society committed to ever-higher levels of mass-produced abundance required empire to be a way of life.

The Boy Crisis

There is a recent C-SPAN talk with Warren Farrell about his book, The Boy Crisis. Although mostly focused on the US, I imagine it would apply to some other countries, as he does briefly mention ISIS recruits. American boys and girls have the same suicide rate at age 9, but in the years following that it goes up for boys only. Overall, the mortality of boys has been worsening in recent years, even as the mortality of girls remains the same. I don’t know if the book is insightful or not, as I haven’t read it, but the issues the author brings up are important. I’ve made similar observations about gender divides. Let me make my case, although my thoughts here are tentative and so I’m not entirely attached to them. No doubt my own biases will slip in, but let me try my best to be clear in my position, even if I’m not perfectly right.

The difficulty is that gender inevitably gets mixed up with culture. That isn’t to say gender is merely a social construct, but gender identity and perception do have a powerful influence. I’d argue that, in certain ways, girls get more of a certain kind of attention. My nieces have received immense help for problems they’ve had, such as social training and therapy, and have been given many opportunities, such as being signed up for social events and activities. But what I sense is that my nephew, who has serious problems, has mostly been ignored: no one, apparently, is helping him with his learning difficulties, despite his obviously needing more help than his sister and female cousin, who are natural learners. The attitude seems to be that boys will be boys, that boys should suck it up and take care of their own problems, that boys aren’t sensitive like girls and so don’t need the same help, that boys are naturally aggressive and disruptive and so troubled behavior should simply be expected or else punished. Boy problems are to be ignored or eliminated, a Social Darwinian approach less often applied to girls, or so it seems to me.

There is also something physiological going on, something I feel more confident in asserting. Boys and girls do seem to deal with health issues differently. Girls, according to some research, are better at dealing with stress (or maybe they simply act out their stress less in ways that distress others, similar to how female-profile aspies might be better at socially compensating than male-profile aspies). Some of the aggressive and impulsive behaviors caused by such things as lead toxicity can be rationalized away as the extremes of otherwise normal boy behavior. The same goes for autism, ADHD, etc.; they are simply not taken as seriously when seen in boys (e.g., autism explained as an extreme male profile). This is complicated by the question of whether girls are being underdiagnosed, a complication I’ve written about before but won’t explore further in this post because it goes into difficult issues of the psychology and behavior of personality as filtered through culture.

Dr. Leonard Sax also speculates that something in the environment or diet is causing developmental issues (and this is where much of my own recent thinking comes in). Over time, girls have been reaching puberty earlier and boys later, which causes an inequality in neurocognitive development and educational achievement, resulting in boys dropping out at higher rates and girls attending college at higher rates. He suspects it might have to do with estrogen-like chemicals in plastics (then again, it could have to do with food additives, increased soy consumption, hormones in dairy, a high-carb diet, etc., or else any of a slew of environmental toxins and other chemicals, some of which are hormone mimics; others have observed that boys today seem to have more effeminate features, such as a less square jaw structure than what is seen in photographs of boys from the past and from hunter-gatherer tribes). He also makes a slightly different kind of argument: that typical boy behavior is less tolerated in schools, with stereotypical girl behavior being the ideal of a good student, that of sitting quietly and calmly rather than running around like, well, little boys. This is an issue as free playtime and gym classes have been among the first things to be cut in the new push for teaching to the test (of course, this would also impact girls who don’t follow stereotypical female behavior). Not all of these arguments necessarily fit together.

Most likely, there are dozens of major overlapping factors (and one senses the terrain is covered with landmines of confounders). Throw in some reactionary right-wing backlash to mix it up, along with partisan politics to polarize the population. The paranoia about boys being emasculated turns into a moral panic, and on the other side there is the fear about the return of theocratic patriarchy or whatever. There is no doubt something to worry about for all involved, but the water gets muddied by ideologically-driven fearful fantasies and identity politics of every variety. Similar moral panics were seen before WWI and, earlier, before the Civil War. Societies have a tendency of getting militaristic and violent toward other societies in the hope of toughening up their boys, and often the rhetoric and propaganda becomes rather blatant about this. It is madness that leads to madness. Meanwhile, the real problems facing boys mostly get ignored by the political left and right, until a few generations later when the unresolved problems erupt again as moral panic returns.

Society goes through cycles of ignoring boys and obsessing over them. Girls typically never get the same kind of extreme attention, positive or negative (which one could argue leads to other problems for girls). There is a lot of social pressure in being a boy and a lot more judgment for perceived failure and inadequacy, which surely relates to the higher rate of suicide and self-destructive behavior, including suicide by cop. That isn’t to say life is easy for girls either, but many of the measurements seem to be improving or at least remaining stable for girls in a way not seen for boys, for whom worsening is apparent in important areas. There is a growing disparity that needs to be explained. Why would mortality be worsening for boys but not for girls? Why would more girls and fewer boys be attending college? Why are there more homeless men on the streets? In a society that is historically patriarchal, with certain male privileges, this is the complete opposite of what one would expect. And this resonates with life expectancy and well-being (e.g., drug addiction rates) getting worse for rural white men and middle-class white men, even as most other demographics aren’t seeing such declines, indicating that even among males it is particular populations being hit the hardest.

The awareness of this problem, a sense of something severely wrong, is the kind of thing driving too many Americans to support someone like President Donald Trump. The populist outrage is real, if misdirected in a way that will make everything worse. Authoritarian nationalism promoted through xenophobic scapegoating, chest-pounding, and war-mongering is not going to save our boys. Yet one can feel that so many people in power are itching for mass violence to enforce social order again, and that means enforcing nostalgic notions of ultra-masculinity. Nurturing children, all children, and ensuring public health and the public good for all… well, that is less exciting than lamenting the decline of Western civilization or whatever. It’s not about gender wars, about boys and men losing their position in society, or it shouldn’t be about that. We need to find ways to help children where they are, to create equality of opportunity not only in theory but in reality. We are a society out of balance, with gender being one expression of that imbalance among many others.

Improving the lives of girls should be a priority, as is true for other historically disadvantaged demographics and populations. But it is severely problematic if improvement in one area of society seems to be coming at the cost of worsening conditions elsewhere, such as for boys. Even if that isn’t exactly true, in that one can’t be directly or fully blamed for the other, we shouldn’t be so naive as not to realize that is how it will get portrayed. We can’t afford to dismiss the real harm and suffering caused to part of the population, especially at a time when the entire society is under stress. Identity politics turned into dysfunctional demographic tribalism can’t lead to a happy result. This situation isn’t feminism in a fight against the men’s rights movement. These boys have sisters, mothers, and aunts. And these boys will grow up to be husbands and fathers. We don’t live in demographic abstractions, for we are part of personal relationships that connect us. Our problems are shared, as is the good we seek.

The Transparent Self to Come?

Scott Preston’s newest piece, The Seer, is worth reading. He makes an argument for what is needed next for humanity, what one might think of as getting off the Wheel of Karma. But I can’t help thinking about the messy human details in this moment of societal change and crisis. The great thinkers like Jean Gebser talk of integral consciousness in one way, while most people experience the situation in entirely different terms. That is why I’m glad Preston brought in what is far less respectable (and far more popular), like Carlos Castaneda and the Seth Material.

As anyone should know, we aren’t discussing mere philosophy here, for it touches upon human experience and social reality. I sense much of what is potentially involved, even as it is hard to put one’s finger on it. The challenge we are confronted with is far more disconcerting than we are typically able and willing to acknowledge, assuming we can even begin to comprehend what we are facing and what is emerging. How we get to the integral is the difficult part. Preston explains well the issue of making the ego/emissary transparent. As the Seth Material put it, “true transparency is not the ability to see through, but to move through.” That is a good way of putting it.

I appreciate his explanation of Satan (the egoic-demiurge) as the ape of God, what Iain McGilchrist calls usurpation. This reminds me of the mimicry of the Trickster archetype and its relation to the co-optation of the reactionary mind (see Corey Robin). A different kind of example of this is that of the folkloric Men in Black, as described by John Keel. It makes me wonder about what such things represent in human reality. This was on my mind because of another discussion I was having in a different post, Normal, from rauldukeblog’s The Violent Ink. The topic had to do with present mass hysteria and, as I’m wont to do, I threw out my own idiosyncratic context. Climate change came up and so I was trying to explain what makes this moment of crisis different than the past.

There is a scientific quality to it. Modern science created climate change through technological innovation and industrialization. And now science warns us about it. But it usually isn’t like a war, famine, or plague that hits a population in an undeniable way, not for most of us, not yet. That is the complexifying change in the scientific worldview we now inhabit, and it is why the anxiety is so amorphous, in a way profoundly different than before. To come to terms with climate change, something within human nature itself would have to shift. If we are to survive it while maintaining civilization, we will likely have to be as dramatically transformed as were the bicameral humans during the collapse of the Bronze Age civilizations. We won’t come through this unscathed and unchanged.

In speaking of the scientific or pseudo-scientific, there is the phenomenon of UFOs and contact experience. I pointed out that there has been a shift in official military policy toward the reporting of UFO sightings, which gets one wondering about motives and also gets one thinking about why now. UFOs and aliens express that free-floating sense of vague anxiety about the unknown, specifically in a modern framework. It’s almost irrelevant what UFOs really are or aren’t. And no doubt, as in the past, various governments will attempt to use UFO reports to manipulate populations, to obfuscate what they wish to keep hidden, or whatever else. The relevant point here is what UFOs symbolize in the human psyche and why they gain so much attention during periods of wide-scale uncertainty and stress. The UFO cults that have appeared over the past few generations are maybe akin to the cults like Jesus worship that arose in the Axial Age. Besides Jung, it might be helpful to bring in Jacques Vallee’s even more fascinating view. A new mythos is forming.

I’m not sure what it all adds up to. And my crystal ball is no less cloudy than anyone else’s. It just feels different in that we aren’t only facing crisis and catastrophe. It feels like a far more pivotal point, a fork in the path. During what is called the General Crisis, there was much change going on and it did help bring to an end what remained of feudalism. But the General Crisis didn’t fundamentally change society and culture, much less cut deeper into the human psyche. I’d argue that it simply brought us further down the same path we’d been on for two millennia since the Axial Age. I keep wondering if now the Axial Age is coming to its final conclusion, that there isn’t much further we can go down this path.

By the way, I think my introduction to Jacques Vallee came through my further reading after having discovered John Keel’s The Mothman Prophecies, the book that came out long before the movie. That is where the basic notion comes from that I was working with here. During times of crisis and foreboding, often preceding actual mass death, there is a build up of strangeness that spills out from our normal sense of reality. We can, of course, talk about this in more rational or rather respectable terms without any of the muck of UFO research.

Keith Payne, in The Broken Ladder, notes that people come to hold bizarre beliefs and generally act irrationally when under conditions of high inequality, that is to say when inflicted with unrelenting stress. But it goes beyond that. There is more going on than mere beliefs. People’s sense of reality becomes distorted and they begin experiencing what they otherwise would not. This was the basis of Julian Jaynes’ hypothesis of the bicameral mind where voice-hearing was supposedly elicited through stress. And this is supported by modern evidence, such as the cases recorded by John Geiger in the Third Man Factor.

An additional layer could be brought to this with Jacques Vallee’s work in showing how anecdotes of alien contact follow the same pattern as the stories of fairy abductions and the anthropological accounts of shamanic initiation. These are religious experiences. At other times, they were more likely interpreted as visitations by spiritual beings or as transportation into higher realms. Similarly, spinning and flying disks in the sky were interpreted as supernatural manifestations in the pre-scientific age. But maybe it’s all the same phenomenon, whether the source is elsewhere or from within the human psyche.

The interesting part is that these experiences, sometimes sightings involving crowds of people (including many incidents with military personnel and pilots), often correspond with intensified societal conflict. UFO sightings and contact experiences appear to increase at specific periods of stress. Unsurprisingly, people turn to the strange in strange times. And there is something about this strangeness, the pervasiveness of it and the power it holds. To say we are living in a reactionary time when nearly everything and everyone has become reactionary, that is to understate it to an extreme degree. The Trickster quality of the reactionary mind, one might argue, is its most defining feature.

One might call it the return of the repressed. Or it could be thought of as the eruption (irruption?) of the bicameral mind. Whatever it is, it challenges and threatens the world we think we know. Talk of Russian meddling and US political failure is tiddlywinks in comparison. But the fact that we take such tiddlywinks so seriously does add to the sense of crisis. Everything is real to the degree we believe it to be real, in that the effects of it become manifest in our experience and behavior, in the collective choices that we make and accumulate over time.

We manifest our beliefs. And even the strangest of beliefs can become normalized and, as such, become self-fulfilling prophecies. Social realities aren’t only constructed. They are imagined into being. Such imagination is human reality for we are incapable of experiencing it as anything other than reality. We laugh at the strange beliefs of others at our own peril. But what is being responded to can remain hidden or outside of the mainstream frame of consciousness. Think of the way that non-human animals act in unusual ways before an earthquake hits. If all we see is what the animals are doing and lack any greater knowledge, we won’t appreciate that it means we should prepare for the earthquake to come.

Humans, too, act strangely before coming catastrophes. It doesn’t require anyone to consciously know of and rationally understand what is coming. Most of how humans respond is instinctual or intuitive. I’d only suggest paying less attention to the somewhat arbitrary focus of anxiety and, instead, taking the anxiety itself as a phenomenon to be taken seriously. Something real is going on. And it portends something on its way.

Here is my point. We see things through a glass darkly. Things are a bit on the opaque side. Transparency of self is more of an aspiration at this point, at least for those of us not yet enlightened beings. All the voices remain loud within us and in the world around us. In many thinkers seeking a new humanity, there is a prioritizing of the visual over the auditory. There is a historical background to this. The bicameral mind was ruled by voices. To seek freedom from this, to get off the grinding and rumbling Wheel of Karma, requires a different relationship to our senses. There is a reason the Enlightenment was so powerfully obsessed with tools that altered and extended our perception, with a major focus on the visual, from lenses to the printed word. Oral society was finally losing its power over us, or that is what some wanted to believe.

The strangeness of it all is that pre-consciousness maintains its pull over modern consciousness even as we idealize the next stage of humanity, integral trans-consciousness. Instead of escaping the authoritative power of the bicameral voice, we find ourselves in a world of mass media and social media where voices have proliferated. We are now drowning in voices, and so we fantasize about the cool silence of the visionary, that other side of our human nature, as Preston described it:

One of the things we find in don Juan’s teachings is “the nagual” and “the tonal” relation and this is significant because it is clearly the same as McGilchrist’s “Master” and “Emissary” relationship of the two modes of attention of the divided brain. In don Juan’s teachings, these correspond to what is called the “first” and “the second attentions”. If you have read neuroscientist Jill Bolte-Taylor’s My Stroke of Insight or followed her TED talk about that experience, you will see that she, too, is describing the different modes of attention of the “nagual” and the “tonal” (or the “Master” and the “Emissary”) in her own experience, and that when she, too, shifted into the “nagual” mode, also saw what Castaneda saw — energy as it flows in the universe, and she also called that “the Life Force Power of the Universe”.

About getting off the Wheel, rauldukeblog wrote that, “Karma is a Sanskrit word meaning action so the concept is that any act(tion) creates connective tissue which locks one into reaction and counter and so on in an endless loop.” That brings us back to the notion of not only seeing through the egoic self but more importantly to move through the egoic self. If archaic authorization came from voices according to Jaynes, and if self-authorization of the internalized voice of egoic consciousness hasn’t fundamentally changed this equation, then what would offer us an entirely different way of being and acting in the world?

The last time we had a major transformation of the human mind, back during the ending of the Bronze Age, it required the near total collapse of every civilization. Structures of the mind aren’t easily disentangled from entrenched patterns of social identity as long as the structures of civilization remain in place. All these millennia later, we are still struggling to deal with the aftermath of the Axial Age. What are the chances that the next stage of humanity is going to be easier or happen more quickly?

Violent Fantasy of Reactionary Intellectuals

“Capitalism is boring. Devoting your life to it, as conservatives do, is horrifying if only because it’s so repetitious. It’s like sex.”
~William F. Buckley Jr., in an interview with Corey Robin

The last thing in the world a reactionary wants is to be bored, as happened with the ending of the ideological battles of the Cold War. They need a worthy enemy or else to invent one. Otherwise, there is nothing to react to and so nothing to get excited about, followed by a total loss of meaning and purpose, resulting in dreaded apathy and ennui. This leads reactionaries to become provocative, in the hope of provoking an opponent into a fight. Another strategy is simply to portray the whole world as a battleground, such that everything is interpreted as a potential attack, working oneself or one’s followers into a froth.

There are demagogues like Bill O’Reilly and Donald Trump. The former has made numerous stated or implied threats of violence over the years, and others including his ex-wife have accused him of actual violence. As for the latter, his invoking violence is well known, going so far as to brag he could shoot someone in the street and get away with it. Of course, both also speak of violence in broader terms of culture war and dog whistles, racism and xenophobia, paranoia and conspiracy. But whatever form it takes, it tends to be rather blatant and blunt in going for maximum effect.

There is another kind of reactionary as well. They often present themselves as respectable intellectuals, and liberals will often treat them as such. Once dead and gone, through rose-colored nostalgia, they are remembered as representing some high point of worthy conservatism. A great example of this is William F. Buckley Jr., who had a combative attitude, occasionally erupting into threats. Yet, upon his passing, liberals praised him as the leader of a golden age of conservatism. That isn’t how liberals saw him at the time, of course. He was no soft-spoken, fair-minded public intellectual. There was a reactionary edge back then that is essentially no different than today.

More recently, there is Jordan B. Peterson who has taken on the defense of masculinity and has done so with an increasingly confrontational attitude, aggressively so at times. Some might argue that he has followed a predictable path of reactionary decline. Or rather that his reactionary mind is showing its true nature. One suspects there is often a threat behind the rhetoric of reactionary ideology, even if not always explicit, but give it enough time and it can become explicit. Is that true of Peterson?

He began as an academic talking about a Jungian archetypal masculinity (i.e., patriarchy as mythology and mysticism) enforcing order on feminine chaos (one wonders if he read Carl Jung’s Answer to Job, where the patriarchal Yahweh is portrayed as a chaotic force of unconscious nature). By implication, this is a Manichaean fight against the effeminizing forces on the political left that are psychologically and socially neutering boys. But for all the semi-religiosity of his language, his ideas were always presented in rather boring academic terms and with a meandering pedantic style. Now some perceive the academic veneer to be wearing thin, as he has slipped further into the archetypal role of paternalistic father figure, in becoming yet another right-wing pundit and self-help guru.

The difference for the reactionary intellectual, as Corey Robin explained, is that they approach the Burkean moral imagination of the horrific and sublime (with its sociopolitical framing of purity) by way of abstraction, while usually keeping a safe distance from the concrete. They are inspired, excited, and enthralled by the fear-ridden imaginary with its fantasized violence — that is, until it gets too close, too real. In an actual fight, Buckley or Peterson would likely get the shit beat out of them. The pose of intellectual brawler and alpha male is just that, a pose not to be taken too literally, and yet there is always an underlying hint of authoritarianism. They do see themselves in an existential crisis, a near cosmic fight that must be won or else all of Western civilization will be lost, and they don’t think of this as mere hyperbole.

This is why, when cornered, they will lash out with the language of violence, sometimes with stated threats of hitting their opponents. Peterson did this recently in using a tweet to threaten someone with mild-mannered violence, a rather unmanly ‘slap’ (maybe his opponent was deemed unworthy of the full manly force of fisticuffs). Of course, this ‘threat’ is silly taken at face value. We Americans aren’t exactly worried about the importation of the Canadian “slap culture”. The point of concern is that he would say such a thing at all, considering how common this aggressive machismo is on the reactionary right. This kind of verbal threat could be dismissed if it never led to action, but sadly there is a long history of it doing just that. Take, for example, Bill O’Reilly repeatedly calling Dr. George Tiller a “baby killer” until one of O’Reilly’s viewers took the implicit threat and made it explicit by assassinating Dr. Tiller. Or consider the Pizzagate fake news pushed by right-wing media that also led to a real-world shooting. Violence is a desired result, not an unintended consequence: the enacting and enforcement of the moral imagination.

It’s not that there is any reason to worry about one of Peterson’s fanboys going out on a slapping rampage. What is worrisome is the pattern of talk that becomes increasingly extreme over time, not just from any single person but across an entire society, specifically here in the United States, which is already so obsessed with violence and authoritarianism. This might be taken less seriously were we not in the middle of this era of rule by Donald Trump, a man who came to power through violent rhetoric and who, as president, has shown fascist tendencies toward authoritarian display, from a declared desire for a military march with tanks to sending the military to the border.

I don’t see Jordan Peterson as a fascist, much less a Nazi. And I would be wary of too broadly painting the canvas of fascist mysticism, such as how Carl Jung is often dismissed out of hand. But I do take seriously the dark moral imagination that forms a swift and powerful undercurrent. And as such I do have valid fear about how Peterson’s words, no matter his intentions, could so easily be misused and so quickly lead to harmful ends.

Though I don’t agree with all criticisms of Peterson, I do wonder if some are on target in pointing to a fascist tendency in Western modernity (a reactionary defense of hierarchical authority given persuasive force through neo-romantic mythologizing, often as folk religiosity and volk nationalism). There is a powerful current of thought that gets tapped, even by those who don’t realize what they are tapping into — to put it in a Jungian frame, there are unconscious archetypal forces that can possess us. I’m not sure it matters whether or not someone means well. If anything, my greatest concern is often about those who hide behind personas built on claims of good intentions.

Peterson is invoking moral imagination. It is a powerful tool. And potentially it is a dangerous weapon. I’m not entirely convinced he realizes the fire he is playing with. There is a short distance from nostalgic fantasies to reactionary radicalization. And that distance can be covered in no time at all when a resonance develops between public mood and political power. It has happened before and could happen again. Peterson should heed his own warnings about totalitarian thought and authoritarian politics.

Criticism of left-wingers, feminists, etc. hasn’t tended to end well in the Western world — interestingly, considering Jordan Peterson’s fear-mongering, the ruling elites of both the Nazis and the Soviets attacked, imprisoned, and killed left-wingers: feminists, social liberals, social democrats, Marxists, anarcho-syndicalists, labor organizers, radical intellectuals, experimental artists, etc. This puts Peterson, a self-proclaimed anti-authoritarian, in strange company when he too attacks these same left-wingers. I’d rather we, including Peterson and his followers, learned from history than have to repeat it again and again.

I’ll let Canadians worry about Canada. But as an American, I’ll worry about the United States. Let us not forget what kind of country this is. The U.S. isn’t only a country founded on genocide and slavery. You remember that little thing about Nazi eugenics. Guess where the Nazis got eugenics from? Right here in the good ol’ U.S. of A.

Let me explain how close this hits to home. There were many Americans who originated eugenicist thought and practice, helping to set an example that inspired Nazis. One of those Americans was an Iowan school teacher, Harry H. Laughlin, who lived near my home — Adolf Hitler personally praised this Iowan eugenicist: “The Reichstag of Nazi Germany passed the Law for the Prevention of Hereditarily Diseased Offspring in 1933, closely based on Laughlin’s model. Between 35,000 and 80,000 persons were sterilized in the first full year alone. (It is now known that over 350,000 persons were sterilized). Laughlin was awarded an honorary degree by the University of Heidelberg in 1936 for his work [on] behalf of the “science of racial cleansing.” (Five other Americans received honorary degrees the same year).” Eugenics never became as powerful in American society as it did in Germany, but the impulse behind it fed into Social Darwinism, the Second Klan, Jim Crow, sundown towns, ethnic internment camps, violently enforced assimilation, etc.

Around the same time in Western history, mass urbanization was underway. As women gained more freedom in cities, feminism and other women’s movements gained new force and influence. So, with the destruction of rural communities and loss of the agrarian lifestyle, a moral panic arose about boys being turned effeminate and weak, not just by womanly culture but also by supposedly soft city living, along with the temptations of alcohol and such. This fear-mongering about a lost generation of boys was a major impulse behind fascism, and it took hold in the United States. There were large fascist marches in the U.S. at the time. But we are fortunate, I guess, that anti-German and anti-Italian xenophobic bigotry took much of the force out of American fascism. Instead, all we got was a patriarchal movement that created the Boy Scouts and a National Park system. We might not be so lucky next time.

Someone like Peterson may be less problematic for Canada, as Canadians don’t have the same cultural history of reactionary extremism. What is problematic for Americans is that Peterson doesn’t seem to understand what kind of influence he might have south of the Canadian border. His words and ideas might speak to American reactionaries in an entirely different way than he intends. And that could have real-world consequences. He isn’t helping matters by suggesting that the way to deal with ideological opponents is through physical force, not that interpreting his words as idle threats is any better. Furthermore, his projection of violent fantasies about a postmodern Marxist death cult (the equivalent of cultural Marxism or cultural Bolshevism) and feminist totalitarianism onto his opponents is just as troubling, if not more so.

Rather than defusing conflict, Jordan Peterson is fueling the fire. He is itching for a fight, playing out some script of antagonism that he is fantasizing about. What brought him to fame was a political issue involving gender pronouns that turned out to have been fake news he helped gin up by way of misinterpreting a proposed law. But having been proven so severely wrong didn’t chasten him, for he has only grown more aggressive as time goes on. His rhetoric plays directly into reactionary paranoia and alt-right fear. We are far from the end of history, for we are smack dab in the middle of it. The stage was set long ago, and the third act of a tragic play might begin soon. If so, it will be the denouement of yet one more cycle of conflict, first imagined and then acted upon. I fear it won’t be boring.

* * *

“Now listen, you queer, stop calling me a crypto-Nazi or I’ll sock you in your goddam face, and you’ll stay plastered…”
~William F. Buckley Jr. to Gore Vidal

“Maybe not tonight, because as you would, I’d smash you in the goddamn face.”
~William F. Buckley Jr. to Noam Chomsky

“Here’s the problem, I know how to stand up to a man who’s unfairly trespassed against me and the reason I know that is because the parameters for my resistance are quite well-defined, which is: we talk, we argue, we push, and then it becomes physical. If we move beyond the boundaries of civil discourse, we know what the next step is. That’s forbidden in discourse with women and so I don’t think that men can control crazy women. I really don’t believe it.”
~Jordan B. Peterson to Camille Paglia

“And you call me a fascist? You sanctimonious prick. If you were in my room at the moment, I’d slap you happily.”
~Jordan B. Peterson to Pankaj Mishra

Jordan Peterson joins the club of macho writers who have thrown a fit over a bad review.
by Jeet Heer

Since Peterson loves to categorize the world into Jungian archetypes (the devouring mother, the dragon-slaying hero), it’s worth noting that this tweet fits an age-old pattern: the hyper-masculine writer who is unhinged by critical words.

In 1933, Max Eastman wrote a scathing review in The New Republic of Ernest Hemingway’s Death in the Afternoon, accusing the bullfight-loving author of “wearing false hair on his chest.” Four years later, the two met in the New York offices of their shared publisher, Scribner. “What do you mean accusing me of impotence?” Hemingway asked, before trying to beat up Eastman. The two men had to be separated by editorial staff. The same year, Hemingway assaulted the poet Wallace Stevens, twenty years his senior, for saying that Hemingway was “not a man.”

In 1971, Gore Vidal wrote a scathing essay on Norman Mailer for The New York Review of Books. “The Patriarchalists have been conditioned to think of women as, at best, breeders of sons, at worst, objects to be poked, humiliated and killed,” Vidal wrote. “There has been from Henry Miller to Norman Mailer to Charles Manson a logical progression.” Enraged, Mailer slammed his head into Vidal’s face in the dressing room of The Dick Cavett Show. Five years later, Mailer was still looking for revenge. At a dinner party, he threw a drink at Vidal before tackling him to the ground. “Once again, words fail Norman Mailer,” Vidal quipped, while still on the floor.

In 2000, the critic Dale Peck went after Stanley Crouch in The New Republic, writing that Crouch’s novel Don’t the Moon Look Lonesome “is a terrible novel, badly conceived, badly executed, and put forward in bad faith; reviewing it is like shooting fish in a barrel.” In 2004, still stinging from the review, Crouch confronted Peck at Tartine, a Manhattan restaurant, and slapped him.

* * *

Jordan Peterson & Fascist Mysticism
by Pankaj Mishra

Reactionary white men will surely be thrilled by Peterson’s loathing for “social justice warriors” and his claim that divorce laws should not have been liberalized in the 1960s. Those embattled against political correctness on university campuses will heartily endorse Peterson’s claim that “there are whole disciplines in universities forthrightly hostile towards men.” Islamophobes will take heart from his speculation that “feminists avoid criticizing Islam because they unconsciously long for masculine dominance.” Libertarians will cheer Peterson’s glorification of the individual striver, and his stern message to the left-behinds (“Maybe it’s not the world that’s at fault. Maybe it’s you. You’ve failed to make the mark.”). The demagogues of our age don’t read much; but, as they ruthlessly crack down on refugees and immigrants, they can derive much philosophical backup from Peterson’s sub-chapter headings: “Compassion as a vice” and “Toughen up, you weasel.”

In all respects, Peterson’s ancient wisdom is unmistakably modern. The “tradition” he promotes stretches no further back than the late nineteenth century, when there first emerged a sinister correlation between intellectual exhortations to toughen up and strongmen politics. This was a period during which intellectual quacks flourished by hawking creeds of redemption and purification while political and economic crises deepened and faith in democracy and capitalism faltered. Many artists and thinkers—ranging from the German philosopher Ludwig Klages, member of the hugely influential Munich Cosmic Circle, to the Russian painter Nicholas Roerich and Indian activist Aurobindo Ghosh—assembled Peterson-style collages of part-occultist, part-psychological, and part-biological notions. These neo-romantics were responding, in the same way as Peterson, to an urgent need, springing from a traumatic experience of social and economic modernity, to believe—in whatever reassures and comforts. […]

Nowhere in his published writings does Peterson reckon with the moral fiascos of his gurus and their political ramifications; he seems unbothered by the fact that thinking of human relations in such terms as dominance and hierarchy connects too easily with such nascent viciousness as misogyny, anti-Semitism and Islamophobia. He might argue that his maps of meaning aim at helping lost individuals rather than racists, ultra-nationalists, or imperialists. But he can’t plausibly claim, given his oft-expressed hostility to the “murderous equity doctrine” of feminists, and other progressive ideas, that he is above the fray of our ideological and culture wars. […]

Peterson rails today against “softness,” arguing that men have been “pushed too hard to feminize.” In his bestselling book Degeneration (1892), the Zionist critic Max Nordau amplified, more than a century before Peterson, the fear that the empires and nations of the West are populated by the weak-willed, the effeminate, and the degenerate. The French philosopher Georges Sorel identified myth as the necessary antidote to decadence and spur to rejuvenation. An intellectual inspiration to fascists across Europe, Sorel was particularly nostalgic about the patriarchal systems of ancient Israel and Greece.

Like Peterson, many of these hyper-masculinist thinkers saw compassion as a vice and urged insecure men to harden their hearts against the weak (women and minorities) on the grounds that the latter were biologically and culturally inferior. Hailing myth and dreams as the repository of fundamental human truths, they became popular because they addressed a widely felt spiritual hunger: of men looking desperately for maps of meaning in a world they found opaque and uncontrollable.

It was against this (eerily familiar) background—a “revolt against the modern world,” as the title of Evola’s 1934 book put it—that demagogues emerged so quickly in twentieth-century Europe and managed to exalt national and racial myths as the true source of individual and collective health. The drastic individual makeover demanded by the visionaries turned out to require a mass, coerced retreat from failed liberal modernity into an idealized traditional realm of myth and ritual.

In the end, deskbound pedants and fantasists helped bring about, in Thomas Mann’s words in 1936, an extensive “moral devastation” with their “worship of the unconscious”—that “knows no values, no good or evil, no morality.” Nothing less than the foundations for knowledge and ethics, politics and science, collapsed, ultimately triggering the cataclysms of the twentieth century: two world wars, totalitarian regimes, and the Holocaust. It is no exaggeration to say that we are in the midst of a similar intellectual and moral breakdown, one that seems to presage a great calamity. Peterson calls it, correctly, “psychological and social dissolution.” But he is a disturbing symptom of the malaise to which he promises a cure.

 

The Resolution of Jordan Peterson
by Brent Cooper

This of course obscures the broader context of longer interviews, and distorts Peterson’s message at the expense of his critics, so nobody wins. Peterson is not a crypto-fascist, but a great portion of his audience is. (What does one do when they finally discover a dark truth behind their popularity?)

“So is Jordan Peterson preparing his base for the coming race war? I do not think so. My read of him is that he is actually terrified of what he started. Nobody is more surprised than he is by his fame… he’s on sabbatical after basically declaring war on his own institution. You can’t go home after that. He needs his Patreon now… He has cast his lot with his mob.” — The CANADALAND Guide to Jordan B. Peterson

[…] An aside: In my article on systemic-conspiracy, I argued that the concept provides a useful explanation of how totalitarianism occurs, and how to avoid it. What I am theorizing complements Peterson’s message, but his denial of systemic (sociological) approaches prevents any of those ideas even getting on his radar.

“This is relevant and convergent with Jordan Peterson’s oft-repeated warning that we all have the potential for totalitarian fascism in us; to participate in systems of violence. Systemic-conspiracy is sociologically latent, which is arguably the major lesson of the 20th century.” — Systemic Conspiracy and Social Pathology

Peterson is so hellbent on avoiding totalitarianism that he ironically has a totalizing worldview about “the left,” to the point of scapegoating them just as Jews once were. Cultural-marxism is the new cultural bolshevism and it’s stupidly obvious, and glaringly wrong, but conservatives love it because it’s their last resort: blame the people trying to fix the problem conservatives started. Peterson’s stock is artificially inflated because of support for these beliefs. Come for the supreme mythological wisdom, stay for the crypto-fascism. Or is it the other way around? Peterson is ironic — he’s not post-ironic, because he’s not metamodern. He doesn’t get it, and if his fans and critics don’t get it either, then this will remain a stalemate.

These sentiments are perhaps better articulated by Noah Berlatsky than myself (below). Again, no one is attacking Peterson here, but rather just logically pointing out the hypocrisy. Peterson gets hijacked by the right, so this information should help him reform rather than retaliate. The term “useful idiot” doesn’t really fit, since Peterson is incredibly smart, but he is nonetheless being used for that very intelligence to spread bullshit.

“But how does Peterson suggest an alternate path to fascism when his philosophy is suffused with barely hidden fascist talking points and conspiracy theories?… And, moreover, why is a supposed anti-totalitarian literally calling for educators who disagree with him to be subject to McCarthyite purges and tried for treason?”

“People who put Leninist posters on their walls to remind themselves to hate communists all day, every day, are leaving a door open to other kinds of hate too. Peterson does not want to be a member of the alt-right. But he shares their hatred of the left, and, as a result, he makes their arguments for them.”

— How Anti-Leftism Has Made Jordan Peterson a Mark for Fascist Propaganda, Berlatsky

Is Jordan Peterson the stupid man’s smart person?
by Tabatha Southey

“Postmodern neo-Marxism” is Peterson’s nemesis, and the best way to explain what postmodern neo-Marxism is, is to explain what it is not—that is, it is entirely distinct from the concept of “cultural Marxism.”

“Cultural Marxism” is a conspiracy theory holding that an international cabal of Marxist academics, realizing that traditional Marxism is unlikely to triumph any time soon, is out to destroy Western civilization by undermining its cultural values. “Postmodern neo-Marxism,” on the other hand, is a conspiracy theory holding that an international cabal of Marxist academics, realizing that traditional Marxism is unlikely to triumph any time soon, is out to destroy Western civilization by undermining its cultural values with “cultural” taken out of the name so it doesn’t sound quite so similar to the literal Nazi conspiracy theory of “cultural Bolshevism.”

To be clear, Jordan Peterson is not a neo-Nazi, but there’s a reason he’s as popular as he is on the alt-right. You’ll never hear him use the phrase “We must secure a future for our white children”; what you will hear him say is that, while there does appear to be a causal relationship between empowering women and economic growth, we have to consider whether this is good for society, “‘’cause the birth rate is plummeting.” He doesn’t call for a “white ethnostate,” but he does retweet Daily Caller articles with opening lines like: “Yet again an American city is being torn apart by black rioters.” He has dedicated two-and-a-half-hour-long YouTube videos to “identity politics and the Marxist lie of white privilege.” […]

What he’s telling you is that certain people—most of them women and minorities—are trying to destroy not only our freedom to spite nonbinary university students for kicks, but all of Western civilization and the idea of objective truth itself. He’s telling you that when someone tells you racism is still a problem and that something should be done about it, they are, at best, a dupe and, at worst, part of a Marxist conspiracy to destroy your way of life.

Peterson says he only thinks of it as a “non-violent war.” But when you insist the stakes are that high, the opposition that pernicious, who’s to say where the chips will fall?

Some of My Beef With Jordan Peterson
by son1dow

In terms of postmodernism, it has been well covered that he has no idea what is going on; he is yet another online dealer of bullshit about postmodernism. Just read wokeupabug’s comments in that thread M1zzu recently linked, as well as so many others – it explains how his main source is not at all one you should trust. The forum there is askphilosophy; the user linked has a PhD in philosophy. I wish I could link famous philosophers for this kind of stuff, but they don’t like giving these youtube intellectuals and renegade scholars too much recognition. The more I hear of Peterson, the more I wonder if he has read anything of postmodernist philosophy, since the only views he seems to espouse perfectly match bullshit dealers like Hicks, and he NEVER EVER seems to properly engage Derrida, Lyotard etc. For all I know, he could be reading neofeudalist conspiracy nuts like Dugin as well. For all of his love of debate and challenge, I would be interested to see him discuss postmodernism with someone who has read the actual books, yet I cannot find that. The worst thing about these people is that there is no way anyone with even the most cursory understanding of postmodernism would mistake Hicks or Peterson as knowledgeable about it; yet it spreads like wildfire. Some of the dumbest misunderstandings of it are perfectly encapsulated in this comic – note the explanation below the comic. The comic itself satirizes the fact that postmodernism is literally the opposite of feminism or marxism; it is as sceptical of metanarratives like them as it is of scientism or judaism. So blaming it for marxism is the dumbest thing you can do. I’ve personally had this conversation with Peterson’s disciples like 50 times; none of them know the first thing about postmodernism and are stumped by these basic questions.
This is concerning a school of thought that many of them are sure is trying to bring the downfall of western civilization, mind you – and few if any of them know the most basic things about it.

Cultural marxism is more of the same; it’s a repeat of an old nazi conspiracy theory called cultural bolshevism that has to do with a real term… Only the term is about an obscure school of thought that is not even related to any of the claims people make about cultural marxism. It’s just another nonsense term to throw around and talk about as much as you want, with no basis. Once again you have to wonder how many of these youtube intellectuals boil down to reading conspiracy theorists to get this stuff. However, by now it is a real industry of people repeating the same shit and explaining it as the cause of feminism or transgenderism or whatever they like, with their viewers gobbling it up without any regard for going to the sources, which couldn’t possibly show anything like it. Makes you wonder how they can doublethink their way into doing that while still considering themselves intellectuals. Very few people repeating this nonsense even know what critical theory is, yet they’re sure it is bringing the downfall of western civilization. Talk about drinking the kool-aid.

* * *

Why Conservatives Love War
by Corey Robin

While the contrast between the true conservative and the pseudo-conservative has been drawn in different ways—the first reads Burke, the second doesn’t read; the first defends ancient liberties, the second derides them; the first seeks to limit government, the second to strengthen it—the distinction often comes down to the question of violence. Where the pseudo-conservative is captivated by war, Sullivan claims that the true conservative “wants peace and is content only with peace.” The true conservative’s endorsements of war, such as they are, are the weariest of concessions to reality. He knows that we live and love in the midst of great evil. That evil must be resisted, sometimes by violent means. All things being equal, he would like to see a world without violence. But all things are not equal, and he is not in the business of seeing the world as he’d like it to be.

The historical record suggests otherwise. Far from being saddened, burdened, or vexed by violence, conservatives have been enlivened by it. Not necessarily in a personal sense, though it’s true that many a conservative has expressed an unanticipated enthusiasm for violence. “I enjoy wars,” said Harold Macmillan, wounded three times in World War I. “Any adventure’s better than sitting in an office.” The conservative’s commitment to violence is more than psychological, however: It’s philosophical. Violence, the conservative maintains, is one of the experiences in life that makes us most feel alive, and violence, particularly warfare, is an activity that makes life, well, lively. Such arguments can be made nimbly, as in the case of Santayana, who wrote, “Only the dead have seen the end of war,” or laboriously, as in the case of Heinrich von Treitschke:

To the historian who lives in the world of will it is immediately clear that the demand for a perpetual peace is thoroughly reactionary; he sees that with war all movement, all growth, must be struck out of history. It has always been the tired, unintelligent, and enervated periods that have played with the dream of perpetual peace.

Pithy or prolix, the case boils down to this: War is life, peace is death. […]

Far from challenging the conservative tradition’s infatuation with violence, however, this indifference to the realities of war is merely the flip side of the Burkean coin. Even as he wrote of the sublime effects of pain and danger, Burke was careful to insist that should those pains and dangers “press too nearly” or “too close”—should they become real threats, “conversant about the present destruction of the person”—their sublimity would disappear. Burke’s point was not that nobody, in the end, really wants to die, or that nobody enjoys excruciating pain. It was that sublimity depends upon obscurity: Get too close to anything, see and feel its full extent, and it loses its mystery and aura. A “great clearness” of the sort that comes from direct experience is “an enemy to all enthusiasms whatsoever.” Get to know anything, including violence, too well, and it loses the thrill you got when it was just an idea.

Since 9/11, many have complained, and rightly so, about the failure of conservatives—or their sons and daughters—to fight the war on terror themselves. For many, that failure is symptomatic of the inequality of contemporary America, and it is. But there is an additional element to the story. So long as the war on terror remains an idea—a hot topic on the blogs, a provocative op-ed, an episode of 24—it is sublime. As soon as it becomes a reality, it can be as tedious as a discussion of the tax code or as cheerless as a trip to the DMV.

Redefining the Right Wing
Corey Robin interviewed by Daniel Larison

Last, the question of sublimity and violence. I think this is one of the most interesting elements of the right because it shows just how extraordinarily rich and sophisticated its vision of human nature is. I don’t think the right has by any means a monopoly on the discourse of violence; the left has its own long tradition of reflection on violence. But where the left’s discourse is primarily influenced by Machiavelli — that is, an awareness of what Sheldon Wolin calls “the economy of violence,” or the necessity of instrumentalizing violence, of making a very little go a long, long way — the right’s attitude is reflected in Burke’s moral psychology, particularly his theory of the sublime.

You had asked previously how representative the account in the book is. You suggested that my strongest cases are Teddy Roosevelt and Georges Sorel, neither of whom is an unproblematic representative of the right. But I mention a great many other cases throughout history of voices that virtually every anthology of the right would include: not just Burke but also Maistre, Tocqueville, Churchill, and of course many of the neocons. Now I know, Daniel, that you’ve spent the better part of your career fighting the good fight against neocon imperialism and that part of your argument against the neocons is that they are not conservative. But their position has deep roots on the right. My sense is that it’s too easy to dismiss the neocons as innovators from afar.

I think what’s distinctive about the discourse of violence on the right is that whereas the audience for violence on the left is the victim of violence — the leftist (whether a revolutionary, guerrilla fighter, terrorist, what have you) seeks to impress upon enemies the power of what threatens them if they do not accede to the left’s demands — I think that the primary audience for violence on the right is the perpetrator and/or his/her allies. In other words, the right sees violence as primarily a source of rejuvenation among a ruling class that has gone soft. That’s what is so interesting to me, in part because it completely inverts the standard stereotype we have of the conservative being more hard-headed and realistic than the progressive. If anything — and I really assign no normative weight to this; it’s more interesting to me as an intellectual problem — it is the left, as I’ve suggested, that has been more influenced by realist modes of thinking when it comes to violence. Lenin read Clausewitz, Gramsci read Machiavelli, and so on. And that’s not because the left is more humanitarian or anything like that; it’s mostly because of necessity. Revolutionaries, by definition, don’t have a monopoly on the means of violence; they operate at a major deficit, so economy is essentially forced upon them. The right by contrast suffers from a surfeit of power, so it looks to violence to address a quite different set of concerns.

Politics and Vision
by Sheldon S. Wolin
(as quoted by Don MacDonald)

In evaluating Machiavelli’s economy of violence it is easy to criticize it as being the product of a technician’s admiration for efficient means. A century like ours, which has witnessed the unparalleled efficiency displayed by totalitarian regimes in the use of terror and coercion, experiences difficulty in being tolerant on the subject. Yet to see Machiavelli as the philosopher of Himmlerism would be quite misleading; and the basic reason is not alone that Machiavelli regarded the science of violence as the means for reducing the amount of suffering in the political condition, but that he was clearly aware of the dangers of entrusting its use to the morally obtuse. What he hoped to further by his economy of violence was the “pure” use of power, undefiled by pride, ambition, or motives of petty revenge.

A more meaningful contrast to Machiavelli would be the great modern theoretician of violence, Georges Sorel. Here is a true example of the irresponsible political individual, fired by romantic notions of heroism, preaching the use of violence for ends which are deliberately and proudly clothed in the vague outline of the irrational “myth,” contemptuous of the cost, blinded by a vision of virile proletarian barbarians who would revitalize the decadent West. In contrast, there was no hint of child-like delight when Machiavelli contemplated the barbarous and savage destructiveness of the new prince, sweeping away the settled arrangements of society and “leaving nothing intact.” There was, however, the laconic remark that it was better to be a private citizen than to embark on a career which involved the ruin of men. This suggests that the theorist like Machiavelli, who was aware of the limited efficacy of force and who devoted himself to showing how its technique could be used more efficiently, was far more sensitive to the moral dilemmas of politics and far more committed to the preservation of man than those theorists who, saturated with moral indignation and eager for heroic regeneration, preach purification by the holy flame of violence.

The Poverty of Conservatism
The ideology of power, privilege and plutocracy

by Johnny Reb

A Little History

“Hatred of the left in all its guises, from the most tepid to the most outré, is thus not incidental to fascism; it is at its core. The fascist route to power has always passed through cooperation with conservative elites; without the acquiescence or even active assent of the traditional elites, fascism could never have attained power.” – Robert O. Paxton, The Anatomy of Fascism

Historian and political scientist Robert O. Paxton informs us that hatred and fear of the left is not just a key characteristic of fascism, but of conservatism as well. For conservatives it’s the trepidation that the majority underclass will rise up and demand real democracy and social justice as they did in France in 1789 and Russia in 1917. This hatred and fear is the locus of the conservatives’ reactionary response to democratic movements that challenge their traditional entitlements and privileges. Violence is, and always has been, an open option for conservatives, but one of their less dramatic and vicious responses to left-wing movements is propaganda, co-option, or minor concessions to the working classes that don’t meaningfully change their supremacy within the socio-political order.

It’s generally agreed by political philosophers that the monarchist Edmund Burke (1729-1797), in his ponderous, uncompromising diatribe on the French Revolution*, was the first to express and define conservatism as a discrete political ideology of moderation and prudence. But the history of the past 200 years has been anything but moderate or prudent when one considers the fanatical anti-democratic invectives against the French and Bolshevik Revolutions; the defense of racism, slavery, and Jim Crow; the genocide of indigenous peoples throughout the world; the vicious attacks on trade unionism; the red-baiting and persecution of ordinary working people, social democracy, and the welfare state; the ongoing hostility to the New Deal of FDR, the Great Society of LBJ, civil rights, humanism, feminism, and gay rights; and endless imperialistic wars**. Whereas the predecessors of today’s conservatives (and their transmogrified new beta version, the neo-conservative) in the old regime thought of inequality as a naturally occurring phenomenon ordained by God, an inheritance passed on from generation to generation, their encounter with people’s revolutions such as the Russian and Cuban revolutions and the Spanish Civil War clearly demonstrates that the revolutionaries were right after all: inequality is a distinctly human creation. No book on conservatism since Burke’s magnum opus comes close to improving on his contempt and condescension toward the working classes, which he described as the “swinish multitude”, or his pompous celebration of a “natural aristocracy.”

* Edmund Burke, Reflections on the Revolution in France, 1790. Every major political tradition without exception lays claim to liberty and the tradition of freedom. None have so far delivered for the masses the freedom from constraint or coercion that these claims entail. Anarchism is, in my view, really the only genuine political philosophy of freedom and egalitarianism. But it has rarely been provided with an opportunity, the exceptions being many indigenous cultures in North America, the short period of the Spanish Civil War, and the Kronstadt Mutiny during the Bolshevik Revolution. Burke, whose opinions are not so uplifting as some of his grandiose prose, advised William Pitt that his government ought not to concern itself with feeding starving citizens by any means other than sale for profit, nor with actions that would alleviate suffering and death by famine. This expresses the essence of Conservatism (blame the victim), along with Burke’s resolute opposition to democracy and his obsession with private property rights, both of which have been carried on by his successors. In fact, it was conservatives who consistently blocked the vote for those who did not own property, on the view that only those who are well-heeled, entrepreneurial, or efficiently acquisitive are of any value to society and have the right to lay any claim to liberty. These were the values of the slave-owning, landowning white aristocratic conservatives who were the framers of the US Constitution.

**Conservatives, the evidence shows, love war. The historical record confirms that, far from being saddened, burdened, or vexed by violence, conservatives have been energized by it. Not necessarily in a personal sense, though it’s true that many conservatives have expressed an unanticipated enthusiasm for violence. “I enjoy wars,” said Harold Macmillan, wounded three times in World War I. “Any adventure’s better than sitting in an office.” The conservative’s commitment to violence is more than psychological, however; it’s philosophical: a “war is life and peace is death” commitment. Power and its partner violence, the conservative maintains, are the experiences in life that make us feel most alive, and violence, particularly warfare, is an activity that makes life exhilarating, full of risk and worth living.

One possible explanation for the conservative’s love of war is conservatism’s embrace of authoritarianism and hierarchy, with their twin requirements of submission and domination; the other is violence, particularly warfare, with its rigid injunction to kill or be killed. Perhaps not coincidentally, both are of great significance to conservatism as a theoretical tradition and historical practice. Consistent with Edmund Burke’s argument, however, the conservative often favours the latter over the former. Once we are assured of our power over another being, says Burke, it loses its capacity to harm or threaten us. Make a creature useful and obedient, and “you spoil it of everything sublime.” It becomes an object of contempt, contempt being “the attendant on a strength that is subservient and innoxious.” At least one-half, then, of the experience of hierarchy—the experience of ruling another—is incompatible with, and indeed weakens, the sublime. Confirmed of our power, we are lulled into the same ease and comfort, undergo the same inward melting, that we experience while in the throes of pleasure.

* * *

Rebirth of a Nation
by Jackson Lears
pp. 18-19

The organic imagery embodied in “the national tree” reflected a new strain of romantic nationalism, which melded the individual with the collective by likening the nation to a natural organism. According to Edward Everett Hale’s popular didactic tale, The Man Without a Country (1863), one’s personal identity—indeed one’s very life—was dependent on immersion in a larger national identity. While Lincoln used the language of “the people” to elevate democracy as well as nationhood, more typical orators deployed the same idiom in the service of organic nationalism, wrapping the government and the citizenry in the sacred garment of the nation.

The sanctity of the nation justified its demands for blood. Redefining unspeakable losses as religious sacrifice, Northerners forged a powerful link between war and regeneration. In some formulations, personal rebirth seemed to arise simply from the decision to risk combat—to plunge into action as an end in itself, heedless of the consequences. (This would be the version that Oliver Wendell Holmes Jr. would eventually celebrate, as he recalled his own war experience, and that Theodore Roosevelt would unwittingly parody.) More commonly, the revitalization was explicitly moral. For generations, republican moralists had been haunted by visions of a citizenry grown soft through indulgence in luxury and other vices of commerce. The many forms of sacrifice demanded by the war provided a perfect opportunity for Americans to redeem themselves from commercial corruption, to transcend private gain in pursuit of a larger public good. So moralists said.

Sacrifice was most appealing when imagined from a distance. As usual in such cases, the loudest yelps for blood often came from those farthest from the battlefield. Charles Eliot Norton, a well-connected young Brahmin intellectual, waxed eloquent over “the Advantages of Defeat” after the Union Army was routed at the first battle of Manassas. The humiliation might have the salutary effect of sobering us, soldiers and civilians—of reminding us that this “religious war” would require a mass blood sacrifice. “But there must be no shrinking from the prospect of the death of our soldiers,” the young man warned. “Better than that we should fail that a million men should die on the battlefield.” Victory would eventually come; and meanwhile Northern character—so long sunk in selfishness and softness—would be purified by protracted struggle. Years later, Norton would repudiate these youthful fatuities and become an outspoken anti-imperialist. But during the Civil War, his breathtaking arrogance was commonplace. Men routinely praised the cleansing power of war from a comfortable distance.

Some turned in therapeutic directions. The Albany Argus predicted that “A vigorous war would tone up the public mind, and impart to it qualities that would last after the calamities of war had passed.” And the historian Benson Lossing wrote to Sue Wallace (the wife of General Lew Wallace) in 1862: “I have felt profoundly impressed with the conviction that out of all this tribulation would come health, and strength, and purification for the nation.” From the perspective of the people who actually fought it, or were swept up in it, one could attribute few more bizarre effects to the war than “health, strength, and purification.” Here as elsewhere, one can glimpse the connections between millennial dreams of collective rebirth and the sort of organic nationalism that could eventually mutate into fascism.

pp. 27-29

But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:

“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”

This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.

This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.

pp. 203-204

American politicians were capable of this sort of sentimentality, too. In public, at least, they could insist that their apparently imperial aims were uniquely leavened with moral concerns—in particular a commitment to the spread of freedom and democracy. But in private, their sentiments were less exalted. Writing to Rudyard Kipling, Theodore Roosevelt reviled “the jack-fools who seriously think that any group of pirates and head-hunters needs nothing but independence in order that it may be turned forthwith into a dark-hued New England town meeting.” Most “dark-hued” peoples lacked the crucial character trait, he noted elsewhere: “There must be control. There must be mastery, somewhere, and if there is no self-control and self-mastery, the control and the mastery will ultimately be imposed from without.”

Roosevelt’s obsession with “mastery” revealed the trigger of empire. Behind all the economic calculations and all the lofty rhetoric about civilization and progress was a primal emotion—a yearning to reassert control, a masculine will to power amid the drifting slack waters of the fin de siècle. Admiral Alfred Thayer Mahan invoked the cautionary example of ancient Rome, after it had abandoned its “strong masculine impulse” and “degenerated into that worship of comfort, wealth, and general softness, which is the ideal of the peace prophets of to-day.” Mahan was the leading big-navy imperialist, and imperialism was the most important political form of late-nineteenth-century longings for regeneration. Those desires flourished on both sides of the Atlantic, taking shapes peculiar to their surroundings. In the United States, the quest for regeneration through empire reworked ancient Protestant dreams of rebirth into a secular militarist agenda. Yearnings to recapture the heights of Civil War heroism combined with Anglo-Saxon racism, fears of overcivilized decadence, and a providentialist faith in American mission.

The result was an ideological witches’ brew. In Europe similar mixtures fostered fascism; in the United States imperial ideology had more benign consequences—for U.S. citizens themselves, if not for their subject populations. The reasons for this divergence are many and complex, but perhaps the most important was the genius of the Constitution’s framers in creating the checks and balances that prevented executive tyranny. Still, American imperialist rhetoric, including Roosevelt’s, often sounded remarkably proto-fascist. Like the ministerial ranting of the Civil War, fin de siècle militarism celebrated blood sacrifice in combat, but with new and more secular emphases on sheer physical courage and the inherently revitalizing effects of conflict.

Popular misunderstandings of Darwinism equated evolution with inevitable progress, and assumed that progress could be achieved only through death-dealing struggle. “Antagonism,” the Popular Science Monthly announced in 1888, is “a necessity of existence, and of the organism of the universe so far as we can understand it; [it is apparent] that motion and life cannot go on without it; that it is not a mere casual adjunct of nature, but that without it there would be no nature.” A struggle for existence was at the heart of all life, among men as well as wolves, in commerce as in war, “as necessary to good as to evil.” Without it life would be boring to the point of ennui, or nonbeing.

* * *

The Fantasy of Creative Destruction
The Haunted Moral Imagination
Imagination: Moral, Dark, and Radical
Reconstruction Era Race Relations
Juvenile Delinquents and Emasculated Males
The Right-Wing New Age

Technological Fears and Media Panics

“One of the first characteristics of the first era of any new form of communication is that those who live through it usually have no idea what they’re in.”
~Mitchell Stephens

“Almost every new medium of communication or expression that has appeared since the dawn of history has been accompanied by doomsayers and critics who have confidently predicted that it would bring about The End of the World as We Know It by weakening the brain or polluting our precious bodily fluids.”
~New Media Are Evil, from TV Tropes

“The internet may appear new and fun…but it’s really a porn highway to hell. If your children want to get on the internet, don’t let them. It’s only a matter of time before they get sucked into a vortex of shame, drugs, and pornography from which they’ll never recover. The internet…it’s just not worth it.”
~Grand Theft Auto: Liberty City Stories

“It’s the same old devil with a new face.”
~Rev. George Bender, Harry Potter book burner

Media technology is hard to ignore. This goes beyond it being pervasive. Our complaints and fears, our fascination and optimism are mired in far greater things. It is always about something else. Media technology is not only the face of some vague cultural change but the embodiment of new forms of power that seem uncontrollable. Our lives are no longer fully our own, a constant worry in an individualistic society. With globalization, it’s as if the entire planet has become a giant company town.

I’m not one for giving in to doom and gloom about technology. That response is as old as civilization and doesn’t offer anything useful. But I’m one of the first to admit to the dire situation we are facing. It’s just that in some sense the situation has always been dire, the world has always been ending. We never know if this finally will be the apocalypse that has been predicted for millennia, an ending to end it all with no new beginning. One way or another, the world as we know it is ending. There probably isn’t much reason to worry about it. Whatever the future holds, it is beyond our imagining, as our present world was beyond the imagining of past generations.

One thing is clear. There is no point in getting in a moral panic over it. The young who embrace what is new always get blamed for it, even though they are simply inheriting what others have created. The youth today aren’t any worse off than any other prior generation at the same age. Still, it’s possible that these younger generations might take us into a future that us old fogies won’t be able to understand. History shows how shocking innovations can be. Talking about panics, think about Orson Welles’s radio show, War of the Worlds. The voice of radio back then had a power that we no longer can appreciate. Yet here we are with radio being so much background noise added to the rest.

Part of what got me thinking about this were two posts by Matt Cardin, at The Teeming Brain blog. In one post, he shares some of Nathaniel Rich’s review, Roth Agonistes, of Philip Roth’s Why Write?: Collected Nonfiction 1960–2013. There is a quote from Roth in 1960:

“The American writer in the middle of the twentieth century has his hands full in trying to understand, describe, and then make credible much of American reality. It stupefies, it sickens, it infuriates, and finally it is even a kind of embarrassment to one’s own meager imagination. The actuality is continually outdoing our talents, and the culture tosses up figures almost daily that are the envy of any novelist.”

Rich comments that, “Roth, despite writing before the tumult of the Sixties, went farther, suggesting that a radically destabilized society had made it difficult to discriminate between reality and fiction. What was the point of writing or reading novels when reality was as fantastic as any fiction? Such apprehensions may seem quaint when viewed from the comic-book hellscape of 2018, though it is perversely reassuring that life in 1960 felt as berserk as it does now.”

We are no more post-truth now than back then. It’s always been this way. But it is easy to lose context. Rich notes that, “Toward the end of his career, in his novels and public statements, Roth began to prophesy the extinction of a literary culture — an age-old pastime for aging writers.” There is the ever-present fear that the strangeness and stresses of the unknown will replace the comfort of the familiar. We all grow attached to the world we experienced in childhood, as it forms the foundation of our identity. But every now and then something comes along to threaten it all. And the post-World War era was definitely a time of dramatic and, for some, traumatic change — despite all of the nostalgia that has accrued to its memories like flowers on a gravestone.

The technological world we presently live in took its first form during that earlier era. Since then, the book as an art form is far from being near extinction. More books have been printed in recent decades than ever before in history. New technology has oddly led us to read even more books, both in their old and new technological forms. My young niece, of the so-called Internet Generation, prefers physical books… not that she is likely to read Philip Roth. Literacy, along with education and IQ, is on the rise. There is more of everything right now, which is what makes it overwhelming. Technologies of the past for the most part aren’t being replaced but are being incorporated into a different world. This Borg-like process of assimilation might be more disturbing to the older generations than simply becoming obsolete.

The other post by Matt Cardin shares an excerpt from an NPR piece by Laura Sydell, The Father Of The Internet Sees His Invention Reflected Back Through A ‘Black Mirror’. It is about the optimism of inventors and the consequences of inventions, unforeseen except by a few. One of those who did see the long-term implications was William Gibson: “The first people to embrace a technology are the first to lose the ability to see it objectively.” Maybe so, but that is true for just about everyone, including most of those who don’t embrace it or go so far as to fear it. It’s not in human nature to see much of anything objectively.

Gibson did see the immediate realities of what he coined as ‘Cyberspace’. We do seem to be moving in that general direction of cyberpunk dystopia, at least here in this country. I’m less certain about the even longer term developments, as Gibson’s larger vision is as fantastical as many others. But it is the immediate realities that always concern people because they can be seen and felt, if not always acknowledged for what they are, often not even by the fear-mongers.

I share his being more “interested in how people behave around new technologies.” In reference to “how TV changed New York City neighborhoods in the 1940s,” Gibson states that, “Fewer people sat out on the stoops at night and talked to their neighbors, and it was because everyone was inside watching television. No one really noticed it at the time as a kind of epochal event, which I think it was.”

I would make two points about this.

First, there is what I already said. It is always an epochal event when a major technology is invented, going back to the many inventions that came before, both media technologies (radio, film, the telegraph, the printing press, the bound book, etc.) and other technologies (the assembly line, the cotton gin, the compass, etc.). Did the Chinese individual who assembled the first firework imagine the carnage of bombs that made castles easy targets and led to two world wars that transformed all of human existence? Of course not. Even the simplest of technologies can turn civilization on its head, which has happened multiple times over the past few millennia and often with destructive results.

The second point is to look at something specific like television. It happened along with the building of the interstate highway system, the rise of car culture, and the spread of suburbia. Television became a portal for the outside world to invade the fantasyland of home life that took hold after the war. Similar fears about radio and the telephone were transferred to the television set and those fears were directed at the young. The first half of the 20th century was constant technological wonder and uncertainty. The social order was thrown askew.

We like to imagine the 1940s and 1950s as a happy time of social conformity and social order, a time of prosperity and a well behaved population, but that fantasy didn’t match the reality. It was an era of growing concerns about adolescent delinquency, violent crime, youth gangs, sexual deviancy, teen pregnancy, loose morals, and rock ‘n roll — and the data bears out that a large number in that generation were caught up in the criminal system, whether because they were genuinely a bad generation or because the criminal system had become more punitive, although others have argued that it was merely a side effect of the baby boom, with youth making up a greater proportion of society. Whatever was involved, the sense of social angst got mixed up with lingering wartime trauma and emerging Cold War paranoia. The policing, arrests, and detention of wayward youth became a priority to the point of oppressive obsession. Besides youth problems, veterans from World War II did not come home content and happy (listen to Audible’s “The Home Front”). It was a tumultuous time, quite the opposite of the perfect world portrayed in those family sitcoms of the 1940s and 1950s.

The youth during that era had a lot in common with their grandparents, the wild and unruly Lost Generation corrupted by family and community breakdown from early mass immigration, urbanization, industrialization, consumerism, etc. Starting in the late 1800s, youth gangs and hooliganism became rampant, as moral panic became widespread. As romance novels earlier had been blamed and later comic books would be blamed, around the turn of the century the popular media most feared were the violent penny dreadfuls and dime novels that targeted tender young minds with portrayals of lawlessness and debauchery, so it seemed to the moral reformers and authority figures.

It was the same old fear rearing its ugly head. This pattern has repeated on a regular basis. What new technology does is give an extra push to the swings of generational cycles. So, as change occurs, much remains the same. For all that cyberpunk got right, no one can argue that the world has been balkanized into the anarcho-corporatist city-states of Neal Stephenson’s Snow Crash, although it sure is a plausible near future. The general point is true, though. We are a changed society. Yet the same old patterns of fear-mongering and moral panic continue. What is cyclical and what is trend is hard to differentiate as it happens, being easier to see clearly in hindsight.

I might add that vast technological and social transformations have occurred every century for the past half millennium. The ending of feudalism was far more devastating. Much earlier, the technological advancement of written text and the end of oral culture had greater consequences than even Socrates could have predicted. And it can’t be forgotten that movable type printing presses ushered in centuries of mass civil unrest, populist movements, religious wars, and revolution across numerous countries.

Our own time so far doesn’t compare, one could argue. The present relative peace and stability will continue until maybe World War III and climate change catastrophe force a technological realignment and restructuring of civilization. Anyway, the internet corrupting the youth and smartphones rotting away people’s brains should be the least of our worries.

Even the social media meddling that Russia is accused of in manipulating the American population is simply a continuation of techniques that go back to before the internet existed. The game has changed a bit, but nations and corporations are pretty much acting in the devious ways they always have, except they are collecting a lot more info. Admittedly, technology does increase the effectiveness of their deviousness. But it also increases the potential methods for resisting and revolting against oppression.

I do see major changes coming. My doubts are more about how that change will happen. Modern civilization is massively dysfunctional. That we use new technologies less than optimally might have more to do with pre-existing conditions of general crappiness. For example, television along with air conditioning likely did contribute to people not sitting outside and talking to their neighbors, but an equal or greater contribution probably came from diverse social and economic forces driving shifts in urbanization and suburbanization, with the dying of small towns and the exodus from ethnic enclaves. Though technology was mixed into these changes, we maybe give technology too much credit and blame for changes that were already in motion.

It is similar to the shift away from a biological explanation of addiction. It’s less that certain substances create uncontrollable cravings. Such destructive behavior is only possible and probable when particular conditions are set in place. There already has to be breakdown of relationships of trust and support. But rebuild those relationships and the addictive tendencies will lessen.

Similarly, there is nothing inevitable about William Gibson’s vision of the future or rather his predictions might be more based on patterns in our society than anything inherent to the technology itself. We retain the choice and responsibility to create the world we want or, failing that, to fall into self-fulfilling prophecies.

The question is what is the likelihood of our acting with conscious intention and wise forethought. All in all, self-fulfilling prophecy appears to be the most probable outcome. It is easy to be cynical, considering the track record of the present superpower that dominates the world and the present big biz corporatism that dominates the economy. Still, I hold out for the chance that conditions could shift for various reasons, altering what otherwise could be taken as near inevitable.

* * *

Fear of the new - a techno panic timeline

11 Examples of Fear and Suspicion of New Technology
by Len Wilson

New communications technologies don’t come with user’s manuals. They are primitive, while old tech is refined. So critics attack. The critic’s job is easier than the practitioner’s: they score with the fearful by comparing the infancy of the new medium with the perfected medium it threatens. But of course, the practitioner wins. In the end, we always assimilate to the new technology.

“Writing is a step backward for truth.”
~Plato, c. 370 BC

“Printed books will never be the equivalent of handwritten codices.”
~Trithemius of Sponheim, 1492

“The horrible mass of books that keeps growing might lead to a fall back into barbarism.”
~Gottfried Wilhelm Leibniz, 1680

“Few students will study Homer or Virgil when they can read Tom Jones or a thousand inferior or more dangerous novels.”
~Rev. Vicesimus Knox, 1778

“The most powerful of ignorance’s weapons is the dissemination of printed matter.”
~Count Leo Tolstoy, 1869

“We will soon be nothing but transparent heaps of jelly to each other.”
~New York Times 1877 Editorial, on the advent of the telephone

“[The telegraph is] a constant diffusion of statements in snippets.”
~Spectator Magazine, 1889

“Have I done the world good, or have I added a menace?”
~Guglielmo Marconi, inventor of radio, 1920

“The cinema is little more than a fad. It’s canned drama. What audiences really want to see is flesh and blood on the stage.”
~Charlie Chaplin, 1916

“There is a world market for about five computers.”
~Thomas J. Watson, IBM Chairman and CEO, 1943

“Television won’t be able to hold on to any market it captures after the first six months. People will soon get tired of staring at a plywood box every night.”
~Darryl Zanuck, 20th Century Fox CEO, 1946

Moral Panics Over Youth Culture and Video Games
by Kenneth A. Gagne

Several decades of the past century have been marked by forms of entertainment that were not available to the previous generation. The comic books of the Forties and Fifties, rock ‘n roll music of the Fifties, Dungeons & Dragons in the Seventies and Eighties, and video games of the Eighties and Nineties were each part of the popular culture of that era’s young people. Each of these entertainment forms, each a medium unto itself, has also fallen under public scrutiny, as witnessed in journalistic media such as newspapers and journals – thus creating a “moral panic.”

The Smartphone’s Impact is Nothing New
by Rabbi Jack Abramowitz

Any invention that we see as a benefit to society was once an upstart disruption to the status quo. Television was terrible because when we listened to the radio, we used our imaginations instead of being spoon-fed. Radio was terrible because families used to sit around telling stories. Moveable type was terrible because if books became available to the masses, the lower classes would become educated beyond their level. Here’s a newsflash: Socrates objected to writing! In The Phaedrus (by his disciple Plato), Socrates argues that “this discovery…will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. … (Y)ou give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

When the Internet and the smartphone evolved, society did what we always do: we adapted. Every new technology has this effect. Do you know why songs on the radio are about 3½ minutes long? Because that’s what a 45-rpm record would hold. Despite the threat some perceived in this radical format, we adapted. (As it turns out, 45s are now a thing of the past but the pop song endures. Turns out we like 3½-minute songs!)

What parallels do you see between the invention of the internet – the ‘semantic web’ and the invention of the printing press?
answer by Howard Doughty

Technology, and especially the technology of communication, has tremendous consequences for human relations – social, economic and political.

Socrates raged against the written word, insisting that it was the end of philosophy which, in his view, required two or more people in direct conversation. Anything else, such as a text, was at least one step removed from the real thing and, like music and poetry which he also despised, represented a pale imitation (or bastardization) of authentic life. (Thank goodness Plato wrote it all down.)

From an oral to a written society was one thing, but as Marshall McLuhan so eruditely explained in his book, The Gutenberg Galaxy, the printing press altered fundamental cultural patterns again – making reading matter more easily available and, in the process, enabling the Protestant Reformation and its emphasis on isolated individual interpretations of whatever people imagined their god to be.

In time, the telegraph and the telephone began the destruction of space, time and letter writing, making it possible to have disembodied conversations over thousands of miles.

Don’t Touch That Dial!
by Vaughan Bell

A respected Swiss scientist, Conrad Gessner, might have been the first to raise the alarm about the effects of information overload. In a landmark book, he described how the modern world overwhelmed people with data and that this overabundance was both “confusing and harmful” to the mind. The media now echo his concerns with reports on the unprecedented risks of living in an “always on” digital environment. It’s worth noting that Gessner, for his part, never once used e-mail and was completely ignorant about computers. That’s not because he was a technophobe but because he died in 1565. His warnings referred to the seemingly unmanageable flood of information unleashed by the printing press.

Worries about information overload are as old as information itself, with each generation reimagining the dangerous impacts of technology on mind and brain. From a historical perspective, what strikes home is not the evolution of these social concerns, but their similarity from one century to the next, to the point where they arrive anew with little having changed except the label.

 These concerns stretch back to the birth of literacy itself. In parallel with modern concerns about children’s overuse of technology, Socrates famously warned against writing because it would “create forgetfulness in the learners’ souls, because they will not use their memories.” He also advised that children can’t distinguish fantasy from reality, so parents should only allow them to hear wholesome allegories and not “improper” tales, lest their development go astray. The Socratic warning has been repeated many times since: The older generation warns against a new technology and bemoans that society is abandoning the “wholesome” media it grew up with, seemingly unaware that this same technology was considered to be harmful when first introduced.

Gessner’s anxieties over psychological strain arose when he set about the task of compiling an index of every available book in the 16th century, eventually published as the Bibliotheca universalis. Similar concerns arose in the 18th century, when newspapers became more common. The French statesman Malesherbes railed against the fashion for getting news from the printed page, arguing that it socially isolated readers and detracted from the spiritually uplifting group practice of getting news from the pulpit. A hundred years later, as literacy became essential and schools were widely introduced, the curmudgeons turned against education for being unnatural and a risk to mental health. An 1883 article in the weekly medical journal the Sanitarian argued that schools “exhaust the children’s brains and nervous systems with complex and multiple studies, and ruin their bodies by protracted imprisonment.” Meanwhile, excessive study was considered a leading cause of madness by the medical community.

When radio arrived, we discovered yet another scourge of the young: The wireless was accused of distracting children from reading and diminishing performance in school, both of which were now considered to be appropriate and wholesome. In 1936, the music magazine the Gramophone reported that children had “developed the habit of dividing attention between the humdrum preparation of their school assignments and the compelling excitement of the loudspeaker” and described how the radio programs were disturbing the balance of their excitable minds. The television caused widespread concern as well: Media historian Ellen Wartella has noted how “opponents voiced concerns about how television might hurt radio, conversation, reading, and the patterns of family living and result in the further vulgarization of American culture.”

Demonized Smartphones Are Just Our Latest Technological Scapegoat
by Zachary Karabell

AS IF THERE wasn’t enough angst in the world, what with the Washington soap opera, #MeToo, false nuclear alerts, and a general sense of apprehension, now we also have a growing sense of alarm about how smartphones and their applications are impacting children.

In the past days alone, The Wall Street Journal ran a long story about the “parents’ dilemma” of when to give kids a smartphone, citing tales of addiction, attention deficit disorder, social isolation, and general malaise. Said one parent, “It feels a little like trying to teach your kid how to use cocaine, but in a balanced way.” The New York Times ran a lead article in its business section titled “It’s Time for Apple to Build a Less Addictive iPhone,” echoing a rising chorus in Silicon Valley about designing products and programs that are purposely less addictive.

All of which begs the question: Are these new technologies, which are still in their infancy, harming a rising generation and eroding some basic human fabric? Is today’s concern about smartphones any different than other generations’ anxieties about new technology? Do we know enough to make any conclusions?

Alarm at the corrosive effects of new technologies is not new. Rather, it is deeply rooted in our history. In ancient Greece, Socrates cautioned that writing would undermine the ability of children and then adults to commit things to memory. The advent of the printing press in the 15th century led Church authorities to caution that the written word might undermine the Church’s ability to lead (which it did) and that rigor and knowledge would vanish once manuscripts no longer needed to be copied manually.

Now, consider this question: “Does the telephone make men more active or more lazy? Does [it] break up home life and the old practice of visiting friends?” Topical, right? In fact, it’s from a 1926 survey by the Knights of Columbus about old-fashioned landlines.

 The pattern of technophobia recurred with the gramophone, the telegraph, the radio, and television. The trope that the printing press would lead to loss of memory is very much the same as the belief that the internet is destroying our ability to remember. The 1950s saw reports about children glued to screens, becoming more “aggressive and irritable as a result of over-stimulating experiences, which leads to sleepless nights and tired days.” Those screens, of course, were televisions.

Then came fears that rock-n-roll in the 1950s and 1960s would fray the bonds of family and undermine the ability of young boys and girls to become productive members of society. And warnings in the 2000s that videogames such as Grand Theft Auto would, in the words of then-Senator Hillary Rodham Clinton, “steal the innocence of our children, … making the difficult job of being a parent even harder.”

Just because these themes have played out benignly time and again does not, of course, mean that all will turn out fine this time. Information technologies from the printed book onward have transformed societies and upended pre-existing mores and social order.

Protruding Breasts! Acidic Pulp! #*@&!$% Senators! McCarthyism! Commies! Crime! And Punishment!
by R.C. Baker

In his medical practice, Wertham saw some hard cases—juvenile muggers, murderers, rapists. In Seduction, he begins with a gardening metaphor for the relationship between children and society: “If a plant fails to grow properly because attacked by a pest, only a poor gardener would look for the cause in that plant alone.” He then observes, “To send a child to a reformatory is a serious step. But many children’s-court judges do it with a light heart and a heavy calendar.” Wertham advocated a holistic approach to juvenile delinquency, but then attacked comic books as its major cause. “All comics with their words and expletives in balloons are bad for reading.” “What is the social meaning of these supermen, super women … super-ducks, super-mice, super-magicians, super-safecrackers? How did Nietzsche get into the nursery?” And although the superhero, Western, and romance comics were easily distinguishable from the crime and horror genres that emerged in the late 1940s, Wertham viewed all comics as police blotters. “[Children] know a crime comic when they see one, whatever the disguise”; Wonder Woman is a “crime comic which we have found to be one of the most harmful”; “Western comics are mostly just crime comic books in a Western setting”; “children have received a false concept of ‘love’ … they lump together ‘love, murder, and robbery.’” Some crimes are said to directly imitate scenes from comics. Many are guilty by association—millions of children read comics, ergo, criminal children are likely to have read comics. When listing brutalities, Wertham throws in such asides as, “Incidentally, I have seen children vomit over comic books.” Such anecdotes illuminate a pattern of observation without sourcing that becomes increasingly irritating.

“There are quite a number of obscure stores where children congregate, often in back rooms, to read and buy secondhand comic books … in some parts of cities, men hang around these stores which sometimes are foci of childhood prostitution. Evidently comic books prepare the little girls well.” Are these stores located in New York? Chicago? Sheboygan? Wertham leaves us in the dark. He also claimed that powerful forces were arrayed against him because the sheer number of comic books was essential to the health of the pulp-paper manufacturers, forcing him on a “Don Quixotic enterprise … fighting not windmills, but paper mills.”

When Pac-Man Started a National “Media Panic”
by Michael Z. Newman

This moment in the history of pop culture and technology might have seemed unprecedented, as computerized gadgets were just becoming part of the fabric of everyday life in the early ‘80s. But we can recognize it as one in a predictable series of overheated reactions to new media that go back all the way to the invention of writing (which ancients thought would spell the end of memory). There is a particularly American tradition of becoming enthralled with new technologies of communication, identifying their promise of future prosperity and renewed community. It is matched by a related American tradition of freaking out about the same objects, which are also figured as threats to life as we know it.

The emergence of the railroad and the telegraph in the 19th century and of novel 20th century technologies like the telephone, radio, cinema, television, and the internet were all similarly greeted by a familiar mix of high hopes and dark fears. In Walden, published in 1854, Henry David Thoreau warned that, “we do not ride on the railroad; it rides upon us.” Technologies of both centuries were imagined to unite a vast and dispersed nation and edify citizens, but they also were suspected of trivializing daily affairs, weakening local bonds, and worse yet, exposing vulnerable children to threats and hindering their development into responsible adults.

These expressions are often a species of moral outrage known as media panic, a reaction of adults to the perceived dangers of an emerging culture popular with children, which the parental generation finds unfamiliar and threatening. Media panics recur in a dubious cycle of lathering outrage, with grownups seeming not to realize that the same excessive alarmism has arisen in every generation. Eighteenth- and 19th-century novels might have caused confusion among young women about the difference between fantasy and reality, and excited their passions too much. In the 1950s, rock and roll was “the devil’s music,” feared for inspiring lust and youthful rebellion, and encouraging racial mixing. Dime novels, comic books, and camera phones have all been objects of frenzied worry about “the kids these days.”

The popularity of video games in the ‘80s prompted educators, psychotherapists, local government officeholders, and media commentators to warn that young players were likely to suffer serious negative effects. The games would influence their aficionados in all the wrong ways. They would harm children’s eyes and might cause “Space Invaders Wrist” and other physical ailments. Like television, they would be addictive, like a drug. Games would inculcate violence and aggression in impressionable youngsters. Their players would do badly in school and become isolated and desensitized. A reader wrote to The New York Times to complain that video games were “cultivating a generation of mindless, ill-tempered adolescents.”

The arcades where many teenagers preferred to play video games were imagined as dens of vice, of illicit trade in drugs and sex. Kids who went to play Tempest or Donkey Kong might end up seduced by the lowlifes assumed to hang out in arcades, spiraling into lives of substance abuse, sexual depravity, and crime. Children hooked on video games might steal to feed their habit. Reports at the time claimed that video kids had vandalized cigarette machines, pocketing the quarters and leaving behind the nickels and dimes. […]

Somehow, a generation of teenagers from the 1980s managed to grow up despite the dangers, real or imagined, from video games. The new technology could not have been as powerful as its detractors or its champions imagined. It’s easy to be captivated by novelty, but it can force us to miss the cyclical nature of youth media obsessions. Every generation fastens onto something that its parents find strange, whether Elvis or Atari. In every moment in media history, intergenerational tension accompanies the emergence of new forms of culture and communication. Now we have sexting, cyberbullying, and smartphone addiction to panic about.

But while the gadgets keep changing, our ideas about youth and technology, and our concerns about young people’s development in an uncertain and ever-changing modern world, endure.

Why calling screen time ‘digital heroin’ is digital garbage
by Rachel Becker

The supposed danger of digital media made headlines over the weekend when psychotherapist Nicholas Kardaras published a story in the New York Post called “It’s ‘digital heroin’: How screens turn kids into psychotic junkies.” In the op-ed, Kardaras claims that “iPads, smartphones and XBoxes are a form of digital drug.” He stokes fears about the potential for addiction and the ubiquity of technology by referencing “hundreds of clinical studies” that show “screens increase depression, anxiety and aggression.”

We’ve seen this form of scaremongering before. People are frequently uneasy with new technology, after all. The problem is, screens and computers aren’t actually all that new. There’s already a whole generation — millennials — who grew up with computers. They appear, mostly, to be fine, selfies aside. If computers were “digital drugs,” wouldn’t we have already seen warning signs?

No matter. Kardaras opens with a little boy who was so hooked on Minecraft that his mom found him in his room in the middle of the night, in a “catatonic stupor” — his iPad lying next to him. This is an astonishing use of “catatonic,” and is almost certainly not medically correct. It’s meant to scare parents.

by Alison Gopnik

My own childhood was dominated by a powerful device that used an optical interface to transport the user to an alternate reality. I spent most of my waking hours in its grip, oblivious of the world around me. The device was, of course, the book. Over time, reading hijacked my brain, as large areas once dedicated to processing the “real” world adapted to processing the printed word. As far as I can tell, this early immersion didn’t hamper my development, but it did leave me with some illusions—my idea of romantic love surely came from novels.

English children’s books, in particular, are full of tantalizing food descriptions. At some point in my childhood, I must have read about a honeycomb tea. Augie, enchanted, agreed to accompany me to the grocery store. We returned with a jar of honeycomb, only to find that it was an inedible, waxy mess.

Many parents worry that “screen time” will impair children’s development, but recent research suggests that most of the common fears about children and screens are unfounded. (There is one exception: looking at screens that emit blue light before bed really does disrupt sleep, in people of all ages.) The American Academy of Pediatrics used to recommend strict restrictions on screen exposure. Last year, the organization examined the relevant science more thoroughly, and, as a result, changed its recommendations. The new guidelines emphasize that what matters is content and context, what children watch and with whom. Each child, after all, will have some hundred thousand hours of conscious experience before turning sixteen. Those hours can be like the marvellous ones that Augie and I spent together bee-watching, or they can be violent or mindless—and that’s true whether those hours are occupied by apps or TV or books or just by talk.

New tools have always led to panicky speculation. Socrates thought that reading and writing would have disastrous effects on memory; the novel, the telegraph, the telephone, and the television were all declared to be the End of Civilization as We Know It, particularly in the hands of the young. Part of the reason may be that adult brains require a lot of focus and effort to learn something new, while children’s brains are designed to master new environments spontaneously. Innovative technologies always seem distracting and disturbing to the adults attempting to master them, and transparent and obvious—not really technology at all—to those, like Augie, who encounter them as children.

The misguided moral panic over Slender Man
by Adam Possamai

Sociologists argue that rather than simply being invented stories, urban legends represent the fears and anxieties of the current time, and in this instance, internet culture is offering a global and more participatory platform in the story creation process.

New technology is also allowing urban legends to be transmitted at a faster pace than before the invention of the printing press, and giving more people the opportunity to shape folk stories that blur the line between fiction and reality. Commonly, these stories take on a life of their own and become completely independent from what the original creator wanted to achieve.

Yet if we were to listen to social commentary, this change in the story creation process is opening the door to deviant acts.

Last century, people were already anxious about children accessing VHS and Betamax tapes and being exposed to violence and immorality. We are now likely to face a similar moral panic with regards to the internet.

Sleepwalking Through Our Dreams

In The Secret Life of Puppets, Victoria Nelson makes some useful observations about reading addiction, specifically in terms of formulaic genres. She discusses Sigmund Freud’s repetition compulsion and Lenore Terr’s post-traumatic games. She sees genre reading as a ritual-like enactment that can’t lead to resolution, and so the addictive behavior becomes entrenched. This would apply to many other forms of entertainment and consumption. And it fits into Derrick Jensen’s discussion of abuse, trauma, and the victimization cycle.

I would broaden her argument in another way. People have feared the written text ever since it was invented. In the 18th century, a moral panic took hold over reading addiction in general, and that was before any fiction genres had developed (Frank Furedi, The Media’s First Moral Panic). The written word is unchanging and so creates the conditions for repetition compulsion. Every time a text is read, it is the exact same text.

That is far different from oral societies. And it is quite telling that oral societies have a much more fluid sense of self. The Piraha, for example, don’t cling to their sense of self nor that of others. When a Piraha individual is possessed by a spirit or meets a spirit who gives them a new name, the self that was there is no longer there. When asked where that person is, the Piraha will say that he or she isn’t there, even if the same body of the individual is standing right there in front of them. They also don’t have a storytelling tradition or concern for the past.

Another thing that the Piraha apparently lack is mental illness, specifically depression along with suicidal tendencies. According to Barbara Ehrenreich from Dancing in the Streets, there wasn’t much written about depression even in the Western world until the suppression of religious and public festivities, such as Carnival. One of the most important aspects of Carnival and similar festivities was the masking, shifting, and reversal of social identities. Along with this, there was the losing of individuality within the group. And during the Middle Ages, an amazing number of days in the year were dedicated to communal celebrations. The ending of this era coincided with numerous societal changes, including the increase of literacy with the spread of the movable type printing press.

The Media’s First Moral Panic
by Frank Furedi

When cultural commentators lament the decline of the habit of reading books, it is difficult to imagine that back in the 18th century many prominent voices were concerned about the threat posed by people reading too much. A dangerous disease appeared to afflict the young, which some diagnosed as reading addiction and others as reading rage, reading fever, reading mania or reading lust. Throughout Europe reports circulated about the outbreak of what was described as an epidemic of reading. The behaviours associated with this supposedly insidious contagion were sensation-seeking and morally dissolute and promiscuous behaviour. Even acts of self-destruction were associated with this new craze for the reading of novels.

What some described as a craze was actually a rise in the 18th century of an ideal: the ‘love of reading’. The emergence of this new phenomenon was largely due to the growing popularity of a new literary genre: the novel. The emergence of commercial publishing in the 18th century and the growth of an ever-widening constituency of readers was not welcomed by everyone. Many cultural commentators were apprehensive about the impact of this new medium on individual behaviour and on society’s moral order.

With the growing popularity of novel reading, the age of the mass media had arrived. Novels such as Samuel Richardson’s Pamela, or Virtue Rewarded (1740) and Rousseau’s Julie, or the New Heloise (1761) became literary sensations that gripped the imagination of their European readers. What was described as ‘Pamela-fever’ indicated the powerful influence novels could exercise on the imagination of the reading public. Public deliberation on these ‘fevers’ focused on what was a potentially dangerous development, which was the forging of an intense and intimate interaction between the reader and literary characters. The consensus that emerged was that unrestrained exposure to fiction led readers to lose touch with reality and identify with the novel’s romantic characters to the point of adopting their behaviour. The passionate enthusiasm with which European youth responded to the publication of Johann Wolfgang von Goethe’s novel The Sorrows of Young Werther (1774) appeared to confirm this consensus. […]

What our exploration of the narrative of Werther fever suggests is that it acquired a life of its own to the point that it mutated into a taken-for-granted rhetorical idiom, which accounted for the moral problems facing society. Warnings about an epidemic of suicide said more about the anxieties of their authors than the behaviour of the readers of the novels. An inspection of the literature circulating these warnings indicates a striking absence of empirical evidence. The constant allusion to Miss G., to nameless victims and to similarly framed death scenes suggests that these reports had little factual content to draw on. Stories about an epidemic of suicide were as fictional as the demise of Werther in Goethe’s novel.

It is, however, likely that readers of Werther were influenced by the controversy surrounding the novel. Goethe himself was affected by it and in his autobiography lamented that so many of his readers felt called upon to ‘re-enact the novel, and possibly shoot themselves’. Yet, despite the sanctimonious scaremongering, it continued to attract a large readership. While there is no evidence that Werther was responsible for the promotion of a wave of copycat suicides, it evidently succeeded in inspiring a generation of young readers. The emergence of what today would be described as a cult of fans with some of the trappings of a youth subculture is testimony to the novel’s powerful appeal.

The association of the novel with the disorganisation of the moral order represented an early example of a media panic. The formidable, sensational and often improbable effects attributed to the consequences of reading in the 18th century provided the cultural resources on which subsequent reactions to the cinema, television or the Internet would draw. In that sense Werther fever anticipated the media panics of the future.

Curiously, the passage of time has not entirely undermined the association of Werther fever with an epidemic of suicide. In 1974 the American sociologist David Phillips coined the term the ‘Werther Effect’ to describe media-stimulated imitation of suicidal behaviour. But the durability of the Werther myth notwithstanding, contemporary media panics are rarely focused on novels. In the 21st century the simplistic cause-and-effect model of the ‘Werther Effect’ is more likely to be expressed through moral anxieties about the danger of cybersuicide, copycat online suicide.

The Better Angels of Our Nature
by Steven Pinker
Kindle Locations 13125-13143
(see To Imagine and Understand)

It would be surprising if fictional experiences didn’t have similar effects to real ones, because people often blur the two in their memories. And a few experiments do suggest that fiction can expand sympathy. One of Batson’s radio-show experiments included an interview with a heroin addict who the students had been told was either a real person or an actor. The listeners who were asked to take his point of view became more sympathetic to heroin addicts in general, even when the speaker was fictitious (though the increase was greater when they thought he was real). And in the hands of a skilled narrator, a fictitious victim can elicit even more sympathy than a real one. In his book The Moral Laboratory, the literary scholar Jèmeljan Hakemulder reports experiments in which participants read similar facts about the plight of Algerian women through the eyes of the protagonist in Malika Mokeddem’s novel The Displaced or from Jan Goodwin’s nonfiction exposé Price of Honor. The participants who read the novel became more sympathetic to Algerian women than those who read the true-life account; they were less likely, for example, to blow off the women’s predicament as a part of their cultural and religious heritage. These experiments give us some reason to believe that the chronology of the Humanitarian Revolution, in which popular novels preceded historical reform, may not have been entirely coincidental: exercises in perspective-taking do help to expand people’s circle of sympathy.

The science of empathy has shown that sympathy can promote genuine altruism, and that it can be extended to new classes of people when a beholder takes the perspective of a member of that class, even a fictitious one. The research gives teeth to the speculation that humanitarian reforms are driven in part by an enhanced sensitivity to the experiences of living things and a genuine desire to relieve their suffering. And as such, the cognitive process of perspective-taking and the emotion of sympathy must figure in the explanation for many historical reductions in violence. They include institutionalized violence such as cruel punishments, slavery, and frivolous executions; the everyday abuse of vulnerable populations such as women, children, homosexuals, racial minorities, and animals; and the waging of wars, conquests, and ethnic cleansings with a callousness to their human costs.

Innocent Weapons:
The Soviet and American Politics of Childhood in the Cold War

by Margaret E. Peacock
pp. 88-89

As a part of their concern over American materialism, politicians and members of the American public turned their attention to the rising influence of media and popular culture upon the next generation.69 Concerns over uncontrolled media were not new in the United States in the 1950s. They had a way of erupting whenever popular culture underwent changes that seemed to differentiate the generations. This was the case during the silent film craze of the 1920s and when the popularity of dime novels took off in the 1930s.70 Yet, for many in the postwar era, the press, the radio, and the television presented threats to children that the country had never seen before. As members of Congress from across the political spectrum would argue throughout the 1950s, the media had the potential to present a negative image of the United States abroad, and it ran the risk of corrupting the minds of the young at a time when shoring up national patriotism and maintaining domestic order were more important than ever. The impact of media on children was the subject of Fredric Wertham’s 1953 best-selling book Seduction of the Innocent, in which he chronicled his efforts over the course of three years to “trace some of the roots of the modern mass delinquency.”71 Wertham’s sensationalist book documented case after case of child delinquents who seemed to be mimicking actions that they had seen on the television or, in particular, in comic strips. Horror comics, which were popular from 1948 until 1954, showed images of children killing their parents and peers, sometimes in gruesome ways—framing them for murder—being cunning and devious, even cannibalistic. 
A commonly cited story was that of “Bloody Mary,” published by Farrell Comics, which told the story of a seven-year-old girl who strangles her mother, sends her father to the electric chair for the murder, and then kills a psychiatrist who has learned that the girl committed these murders and that she is actually a dwarf in disguise.72 Wertham’s crusade against horror comics was quickly joined by two Senate subcommittees in 1954, at the heads of which sat Estes Kefauver and Robert Hendrickson. They argued to their colleagues that the violence and destruction of the family in these comic books symbolized “a terrible twilight zone between sanity and madness.”73 They contended that children who would otherwise be law abiding found in these comic books violent models of behavior. J. Edgar Hoover chimed in to comment that “a comic which makes lawlessness attractive . . . may influence the susceptible boy or girl.”74

Such depictions carried two layers of threat. First, as Wertham, Hoover, and Kefauver argued, they reflected the seeming potential of modern media to transform “average” children into delinquents.75 Alex Drier, popular NBC newscaster, argued in May 1954 that “this continuous flow of filth [is] so corruptive in its effects that it has actually obliterated decent instincts in many of our children.”76 Yet perhaps more telling, the comics, as well as the heated response that they elicited, also reflected larger anxieties about what identities children should assume in contemporary America. As in the case of Bloody Mary, these comics presented an image of apparently sweet youths who were in fact driven by violent impulses and were not children at all. “How can we expose our children to this and then expect them to run the country when we are gone?” an agitated Hendrickson asked his colleagues in 1954.77 Bloody Mary, like the uneducated dolts of the Litchfield report and the spoiled boys of Wylie’s conjuring, presented an alternative identity for American youth that seemed to embody a new and dangerous future.

In the early months of 1954, Robert Hendrickson argued to his colleagues that “the strained international and domestic situation makes it impossible for young people of today to look forward with certainty to higher education, to entering a trade or business, to plans for marriage, a home, and family. . . . Neither the media, nor modern consumerism, nor the threat from outside our borders creates a problem child. But they do add to insecurity, to loneliness, to fear.”78 For Hendrickson these domestic trends, along with what he called “deficient adults,” seemed to have created a new population of troubled and victimized children who were “beyond the pale of our society.”79

The End of Victory Culture:
Cold War America and the Disillusioning of a Generation

by Tom Engelhardt
Kindle Locations 2872-2910

WORRY, BORDERING ON HYSTERIA, about the endangering behaviors of “youth” has had a long history in America, as has the desire of reformers and censors to save “innocent” children from the polluting effects of commercial culture. At the turn of the century, when middle-class white adolescents first began to take their place as leisure-time trendsetters, fears arose that the syncopated beat of popular “coon songs” and ragtime music would demonically possess young listeners, who might succumb to the “evils of the Negro soul.” Similarly, on-screen images of crime, sensuality, and violence in the earliest movies, showing in “nickel houses” run by a “horde of foreigners,” were decried by reformers. They were not just “unfit for children’s eyes,” but a “disease” especially virulent to young (and poor) Americans, who were assumed to lack all immunity to such spectacles. 1 […]

To many adults, a teen culture beyond parental oversight had a remarkably alien look to it. In venues ranging from the press to Senate committees, from the American Psychiatric Association to American Legion meetings, sensational and cartoonlike horror stories about the young or the cultural products they were absorbing were told. Tabloid newspaper headlines reflected this: “Two Teen Thrill Killings Climax City Park Orgies. Teen Age Killers Pose a Mystery— Why Did They Do It?… 22 Juveniles Held in Gang War. Teen Age Mob Rips up BMT Train. Congressmen Stoned, Cops Hunt Teen Gang.” After a visit to the movies in 1957 to watch two “teenpics,” Rock All Night and Dragstrip Girl, Ruth Thomas of Newport, Rhode Island’s Citizen’s Committee on Literature expressed her shock in words at least as lurid as those of any tabloid: “Isn’t it a form of brain-washing? Brain-washing the minds of the people and especially the youth of our nation in filth and sadistic violence. What enemy technique could better lower patriotism and national morale than the constant presentation of crime and horror both as news and recreation.” 3

You did not have to be a censor, a right-wing anti-Communist, or a member of the Catholic Church’s Legion of Decency, however, to hold such views. Dr. Frederick Wertham, a liberal psychiatrist, who testified in the landmark Brown v. Board of Education desegregation case and set up one of the first psychiatric clinics in Harlem, publicized the idea that children viewing commercially produced acts of violence and depravity, particularly in comic books, could be transformed into little monsters. The lurid title of his best-selling book, Seduction of the Innocent, an assault on comic books as “primers for crime,” told it all. In it, Dr. Wertham offered copious “horror stories” that read like material from Tales from the Crypt: “Three boys, six to eight years old, took a boy of seven, hanged him nude from a tree, his hands tied behind him, then burned him with matches. Probation officers investigating found that they were re-enacting a comic-book plot.… A boy of thirteen committed a lust murder of a girl of six. After his arrest, in jail, he asked for comic books.” 4

Kindle Locations 2927-2937

The two— hood and performer, lower-class white and taboo black— merged in the “pelvis” of a Southern “greaser” who dressed like a delinquent, used “one of black America’s favorite products, Royal Crown Pomade hair grease” (meant to give hair a “whiter” look), and proceeded to move and sing “like a negro.” Whether it was because they saw a white youth in blackface or a black youth in whiteface, much of the media grew apoplectic and many white parents alarmed. In the meantime, swiveling his hips and playing suggestively with the microphone, Elvis Presley broke into the lives of millions of teens in 1956, bringing with him an element of disorder and sexuality associated with darkness. 6†

The second set of postwar fears involved the “freedom” of the commercial media— record and comic book companies, radio stations, the movies, and television— to concretize both the fantasies of the young and the nightmarish fears of grown-ups into potent products. For many adults, this was abundance as betrayal, the good life not as a vision of Eden but as an unexpected horror story.

Kindle Locations 2952-2979

Take comic books. Even before the end of World War II, a new kind of content was creeping into them as they became the reading matter of choice for the soldier-adolescent. […] Within a few years, “crime” comics like Crime Does Not Pay emerged from the shadows, displaying a wide variety of criminal acts for the delectation of young readers. These were followed by horror and science fiction comics, purchased in enormous numbers. By 1953, more than 150 horror comics were being produced monthly, featuring acts of torture often of an implicitly sexual nature, murders and decapitations of various bloody sorts, visions of rotting flesh, and so on. 9

Miniature catalogs of atrocities, their feel was distinctly assaultive. In their particular version of the spectacle of slaughter, they targeted the American family, the good life, and revered institutions. Framed by sardonic detective narrators or mocking Grand Guignol gatekeepers, their impact was deconstructive. Driven by a commercial “hysteria” as they competed to attract buyers with increasingly atrocity-ridden covers and stories, they both partook of and mocked the hysteria about them. Unlike radio or television producers, the small publishers of the comic book business were neither advertiser driven nor corporately controlled.

Unlike the movies, comics were subject to no code. Unlike the television networks, comics companies had no Standards and Practices departments. No censoring presence stood between them and whoever would hand over a dime at a local newsstand. Their penny-ante ads and pathetic pay scale ensured that writing and illustrating them would be a job for young men in their twenties (or even teens). Other than early rock and roll, comics were the only cultural form of the period largely created by the young for those only slightly younger. In them, uncensored, can be detected the dismantling voice of a generation that had seen in the world war horrors beyond measure.

The hysterical tone of the response to these comics was remarkable. Comics publishers were denounced for conspiring to create a delinquent nation. Across the country, there were publicized comic book burnings like one in Binghamton, New York, where 500 students were dismissed from school early in order to torch 2,000 comics and magazines. Municipalities passed ordinances prohibiting the sale of comics, and thirteen states passed legislation to control their publication, distribution, or sale. Newspapers and magazines attacked the comics industry. The Hartford Courant decried “the filthy stream that flows from the gold-plated sewers of New York.” In April 1954, the Senate Subcommittee to Investigate Juvenile Delinquency convened in New York to look into links between comics and teen crime. 10

Kindle Locations 3209-3238

If sponsors and programmers recognized the child as an independent taste center, the sight of children glued to the TV, reveling in their own private communion with the promise of America, proved unsettling to some adults. The struggle to control the set, the seemingly trancelike quality of TV time, the soaring number of hours spent watching, could leave a parent feeling challenged by some hard-to-define force released into the home under the aegis of abundance, and the watching child could gain the look of possession, emptiness, or zombification.

Fears of TV’s deleterious effects on the child were soon widespread. The medical community even discovered appropriate new childhood illnesses. There was “TV squint” or eyestrain, “TV bottom,” “bad feet” (from TV-induced inactivity), “frogitis” (from a viewing position that put too much strain on inner-leg ligaments), “TV tummy” (from TV-induced overexcitement), “TV jaw” or “television malocclusion” (from watching while resting on one’s knuckles, said to force the eyeteeth inward), and “tired child syndrome” (chronic fatigue, loss of appetite, headaches, and vomiting induced by excessive viewing).

However, television’s threat to the child was more commonly imagined to lie in the “violence” of its programming. Access to this “violence” and the sheer number of hours spent in front of the set made the idea that this new invention was acting in loco parentis seem chilling to some; and it was true that via westerns, crime shows, war and spy dramas, and Cold War-inspired cartoons TV was indiscriminately mixing a tamed version of the war story with invasive Cold War fears. Now, children could endlessly experience the thrill of being behind the barrel of a gun. Whether through the Atom Squad’s three government agents, Captain Midnight and his Secret Squadron, various FBI men, cowboys, or detectives, they could also encounter “an array of H-bomb scares, mad Red scientists, [and] plots to rule the world,” as well as an increasing level of murder and mayhem that extended from the six-gun frontier of the “adult” western to the blazing machine guns of the crime show. 30

Critics, educators, and worried parents soon began compiling TV body counts as if the statistics of victory were being turned on young Americans. “Frank Orme, an independent TV watchdog, made a study of Los Angeles television in 1952 and noted, in one week, 167 murders, 112 justifiable homicides, and 356 attempted murders. Two-thirds of all the violence he found occurred in children’s shows. In 1954, Orme said violence on kids’ shows had increased 400 percent since he made his first report.” PTAs organized against TV violence, and Senate hearings searched for links between TV programming and juvenile delinquency.

Such “violence,” though, was popular. In addition, competition for audiences among the three networks had the effect of ratcheting up the pressures for violence, just as it had among the producers of horror comics. At The Untouchables, a 1960 hit series in which Treasury agent Eliot Ness took on Chicago’s gangland (and weekly reached 5-8 million young viewers), ABC executives would push hard for more “action.” Producer Quinn Martin would then demand the same of his subordinates, “or we are all going to get clobbered.” In a memo to one of the show’s writers, he asked: “I wish you would come up with a different device than running the man down with a car, as we have done this now in three different shows. I like the idea of sadism, but I hope we can come up with another approach to it.” 31