Moral Panic and Physical Degeneration

From the beginning of the country, there has been an American fear of moral and mental decline that was always rooted in the physical, involving the vitality of the land and the health of the body, and built on an ancient divide between the urban and the rural. Over time, it rose to a fever pitch of moral panic about the degeneration and degradation of WASP culture, the white race, and maybe civilization itself. Some believed the end was near, that the nation might hold out for another few generations before finally succumbing to disease and weakness. The need for revitalization and rebirth became a collective project (Jackson Lears, Rebirth of a Nation), which sadly fed into ethno-nationalist bigotry and imperialistic war-mongering — Make America Great Again!

A major point of crisis, of course, was the Civil War. Racial ideology became predominant, not only because of slavery but perhaps more so because of mass immigration, the latter being the main reason the North won. Racial tensions merged with the developing scientific mindset of Darwinism, and out of this mix came eugenics. For all that we can now dismiss this kind of simplistic ignorance, and with hindsight see the danger it led to, the underlying anxieties were real. Urbanization and industrialization were having an obvious impact on public health, one observed by many, and it wasn’t limited to mere physical ailments. “Cancer, like insanity, seems to increase with the progress of civilization,” noted Stanislas Tanchou, a mid-19th-century French physician.

The diseases of civilization, including mental sickness, have been spreading for centuries (millennia, actually, considering that ‘modern’ chronic health conditions were first detected in the mummies of the agricultural Egyptians). Consider how talk of depression suddenly showed up in written accounts with the ending of feudalism (Barbara Ehrenreich, Dancing in the Streets). That era included the enclosure movement, which forced millions of newly landless serfs into the desperate conditions of crowded cities and colonies where they faced stress, hunger, malnutrition, and disease. The loss of rural life hit Europe much earlier than America, but it eventually came here as well. The majority of white Americans were urban by the beginning of the 20th century, and the majority of black Americans were urban by the 1970s. There has been a consistent pattern of mass problems following urbanization, everywhere it happens. It is still happening. The younger generation, more urbanized than any generation before, is seeing rising rates of psychosis concentrated specifically in the most urbanized areas.

In the United States, it was the last decades of the 19th century that were the turning point, the period of the first truly big cities. Into this milieu, Weston A. Price was born (1870) in a small rural village in Canada. As an adult, he became a dentist and sought work in Cleveland, Ohio (1893). Initially, most of his patients probably had, like him, grown up in rural areas. But over the decades, he was increasingly exposed to younger generations who had spent their entire lives in the city. Lierre Keith puts Price’s early observations in context, after pointing out that he started his career in 1893: “This date is important, as he entered the field just prior to the glut of industrial food. Over the course of the next thirty years, he watched children’s dentition — and indeed their overall health — deteriorate. There were suddenly children whose teeth didn’t fit in their mouths, children with foreshortened jaws, children with lots of cavities. Not only were their dental arches too small, but he noticed their nasal passages were also too narrow, and they had poor health overall: asthma, allergies, behavioral problems” (The Vegetarian Myth, p. 187). This was at the time when the industrialization of farming and food had reached a new level, far beyond the limited availability of canned foods in the mid-to-late 1800s, when most Americans still relied on a heavy amount of wild-sourced meat, fish, nuts, etc. Even city-dwellers in early America had ready access to wild game because of the abundance of surrounding wilderness areas. In fact, in the 19th century, the average American ate more meat (mostly hunted) than bread.

We are once again coming back to the ever-recurrent moral panic about the civilizational project. The same fears given voice in the late 19th to early 20th century are being repeated. For example, Dr. Leonard Sax alerts us to how girls are sexually maturing earlier (1% of female infants showing signs of puberty), whereas boys are maturing later. By comparison, hunter-gatherers show no such large gender disparity in the timing of puberty, nor do their girls reach puberty so early; both sexes typically come to puberty around 18 years old, with sex, pregnancy, and marriage happening more or less simultaneously. Dr. Sax, along with others, speculates about a number of reasons. Common causes held responsible include health factors, from diet to chemicals. Beyond altered puberty, many other examples could be added: heart disease, autoimmune disorders, mood disorders, autism, ADHD, etc., all of them increasing and worsening with each generation (e.g., type 2 diabetes used to be known as adult-onset diabetes but is now regularly diagnosed in young children; the youngest victim recently recorded was three years old at diagnosis).

In the past, Americans responded to moral panic with the genocide of Native Americans; Prohibition, targeting ethnic (hyphenated) Americans and the poor; immigration restrictions to keep the bad sort out; the spread of racism and vigilantism, such as the KKK, Jim Crow, sundown towns, and redlining; forced assimilation, such as English-only laws and public schools; internment camps for not only Japanese-Americans but also German-Americans and Italian-Americans; citizen-making projects like the national park system, Boy Scouts, WPA, and CCC; the promotion of eugenics, the war on poverty (i.e., the war on the poor), imperial expansionism, neo-colonial exploitation, and world wars; et cetera. The cure sought was often something to be forced onto the population by a paternalistic elite, that is to say rich white males, most specifically WASPs of the capitalist class.

Eugenics was, of course, one of the main focuses as it carried the stamp of science (or rather scientism). Yet at the same time, there were those challenging biological determinism and race realism, as views shifted toward environmental explanations. The anthropologists were at the front lines of this battle, but there were also Social Christians who changed their minds after having seen poverty firsthand. Weston A. Price, however, didn’t come to this from a consciously ideological position or religious motivation. He was simply a dentist who couldn’t ignore the severe health issues of his patients. So, he decided to travel the world in order to find healthy populations to study, in the hope of explaining why the change had occurred (Nutrition and Physical Degeneration).

Although familiar with eugenics literature, what Price observed in ‘primitive’ communities (including isolated villages in Europe) did not conform to eugenicist thought. It didn’t matter which population he looked at. Those who ate traditional diets were healthy and those who ate an industrialized Western diet were not. And it was a broad pattern that he saw everywhere he went, not only physical health but also neurocognitive health as indicated by happiness, low anxiety, and moral character. Instead of blaming individuals or races, he saw the common explanation as nutrition and he made a strong case by scientifically analyzing the nutrition of available foods.

In reading about traditional foods, the paleo diet/lifestyle, and functional medicine, Price’s work comes up quite often. He took many photographs comparing people from healthy and unhealthy populations. The contrast is stark. But what really stands out is how few people in the modern world look anywhere near as healthy as those from the healthiest societies of the past. I live in a fairly wealthy college and medical town where there is a far-above-average concern for health along with access to healthcare. Even so, I now can’t help noticing how many people around me show signs of stunted or perturbed development of the exact kind Price observed in great detail: thin bone structure, sunken chests, sloping shoulders, narrow facial features, asymmetry, etc. That is even with modern healthcare correcting some of the worst conditions: cavities, underbites, pigeon-toes, etc. My fellow residents in this town are among the most privileged people in the world and, nonetheless, their health is a sad state of affairs in what it says about humanity at present.

It makes me wonder, as it made Price wonder, what consequences this has for the neurocognitive health of individuals and the moral health of society. Taken alone, it isn’t enough to get excited about. But put in the larger context of looming catastrophes, it does become concerning. It’s not clear that our health will be up to the task of the problems we need to solve. We are a sickly population, far more sickly than when moral panic took hold in past generations.

As important, there is the personal component. I’m at a point where I’m not going to worry too much about decline and maybe collapse of civilization. I’m kind of hoping the American Empire will meet its demise. Still, that leaves us with many who suffer, no matter what happens to society as a whole. I take that personally, as one who has struggled with physical and mental health issues. And I’ve come around to Price’s view of nutrition as being key. I see these problems in other members of my family and it saddens me to watch as health conditions seem to get worse from one generation to the next.

The central point I’m trying to make here is that this is far from being a new problem. In talking to my mother, I find she has a clear sense of the differences between the two sides of her family. Her mother’s family came from rural areas and, even after moving to a larger city for work, they continued to hunt on a daily basis, as there were nearby fields and woods that made that possible. They were a healthy, happy, and hard-working lot. They got along well as a family. Her father’s side of the family was far different. They had been living in towns and cities for several generations by the time she was born. They didn’t hunt at all. They were known for being surly, holding grudges, and being mean drunks. They also had underbites (i.e., underdeveloped jaw structure) and seemed to have had learning disabilities, though no one was diagnosing such conditions back then. Related to this difference, my mother’s father raised rabbits whereas my mother’s mother’s family hunted rabbits (and other wild game). This makes a big difference in terms of nutrition, as wild game has higher levels of omega-3 fatty acids and fat-soluble vitamins, all of which are key to optimal health and development.

What my mother observed in her family is basically the same as what Price observed in hundreds of communities in multiple countries on every continent. And I now observe the same pattern repeating. I grew up with an underbite. My brothers and I all required orthodontic work, as do so many now. I was diagnosed with a learning disability when young. My oldest brother may not have had a learning disability, but behavioral issues were apparent when he was young, likely related to his mildew allergies and probably an underlying autoimmune condition. I know I had food allergies as a child, as I think my other brother did as well. Besides learning disabilities, all of us have had a fair diversity of neurocognitive and psychological issues: stuttering, depression, anxiety, and maybe some Asperger’s.

Now another generation is coming along with increasing rates of major physical and mental health issues. My nieces and nephews are sick all the time. They don’t eat well and are probably malnourished. During a medical checkup for my nephew, my mother asked the doctor about his extremely unhealthy diet, consisting mostly of white bread and sugar. The doctor bizarrely dismissed it as ‘normal,’ claiming that no kid eats healthy. If that is the new normal, maybe we should be in a moral panic.

* * *

Violent Behavior: A Solution in Plain Sight
by Sylvia Onusic

Nutrition and Mental Development
by Sally Fallon Morell

You Are What You Eat: The Research and Legacy of Dr. Weston Andrew Price
by John Larabell

While practicing in his Cleveland office, Dr. Price noticed an increase in dental problems among the younger generations. These issues included the obvious dental caries (cavities) as well as improper jaw development leading to crowded, crooked teeth. In fact, the relatively new orthodontics industry was at that time beginning to gain popularity. Perplexed by these modern problems that seemed to be affecting a greater and greater portion of the population, Dr. Price set about to research the issue by examining people who did not display such problems. He suspected (correctly, as he would later find) that many of the dental problems, as well as other degenerative health problems, that were plaguing modern society were the result of inadequate nutrition owing to the increasing use of refined, processed foods.

Nasty, Brutish and Short?
by Sally Fallon Morell

It seems as if the twentieth century will exit with a crescendo of disease. Things were not so bad back in the 1930’s, but the situation was already serious enough to cause one Cleveland, Ohio dentist to be concerned. Dr. Weston Price was reluctant to accept the conditions exhibited by his patients as normal. Rarely did an examination of an adult patient reveal anything but rampant decay, often accompanied by serious problems elsewhere in the body, such as arthritis, osteoporosis, diabetes, intestinal complaints and chronic fatigue. (They called it neurasthenia in Price’s day.) But it was the dentition of younger patients that alarmed him most. Price observed that crowded, crooked teeth were becoming more and more common, along with what he called “facial deformities” — overbites, narrowed faces, underdevelopment of the nose, lack of well-defined cheekbones and pinched nostrils. Such children invariably suffered from one or more complaints that sound all too familiar to mothers of the 1990’s: frequent infections, allergies, anemia, asthma, poor vision, lack of coordination, fatigue and behavioral problems. Price did not believe that such “physical degeneration” was God’s plan for mankind. He was rather inclined to believe that the Creator intended physical perfection for all human beings, and that children should grow up free of ailments.

Is it Mental or is it Dental?
by Raymond Silkman

The widely held model of orthodontics, which considers developmental problems in the jaws and head to be genetic in origin, never made sense to me. Since they are wedded to the genetic model, orthodontists dealing with crowded teeth end up treating the condition with tooth extraction in a majority of the cases. Even though I did not resort to pulling teeth in my practice, and I was using appliances to widen the jaws and getting the craniums to look as they should, I still could not come up with the answer as to why my patients looked the way they did. I couldn’t believe that the Creator had given them a terrible blueprint – it just did not make sense. In four years of college education, four years of dental school education and almost three years of post-graduate orthodontic training, students never hear a mention of Dr. Price, so they never learn the true reasons for these malformations. I have had the opportunity to work with a lot of very knowledgeable doctors in various fields of allopathic and alternative healthcare who still do not know about Dr. Price and his critical findings.

These knowledgeable doctors have not stared in awe at the beautiful facial development that Price captured in the photographs he took of primitive peoples throughout the globe and in so doing was able to answer this most important question: What do humans look like in health? And how have humans been able to carry on throughout history and populate such varied geographical and physical environments on the earth without our modern machines and tools?

The answer that Dr. Price was able to illuminate came through his photographs of beautiful, healthy human beings with magnificent physical form and mental development, living in harmony with their environments. […]

People who are not well oxygenated and who have poor posture often suffer from fatigue and fibromyalgia symptoms, they snore and have sleep apnea, they have sinusitis and frequent ear infections. Life becomes psychologically and physically challenging for them and they end up with long-term dependence on medications—and all of that just from the seemingly simple condition of crowded teeth.

In other words, people with poor facial development are not going to live very happily. […]

While very few people have heard of the work of Weston Price these days, we haven’t lost our ability to recognize proper facial form. To make it in today’s society, you must have good facial development. You’re not going to see a general or a president with a weak chin, you’re not going to see coaches with weak chins, you’re not going to see a lot of well-to-do personalities in the media with underdeveloped faces and chins. You don’t see athletes and newscasters with narrow palates and crooked teeth.

Weston A. Price: An Unorthodox Dentist
by Nourishing Israel

Price discovered that the native foods eaten by the isolated populations were far more nutrient dense than the modern foods. In the first generation that changed their diet, there was noticeable tooth decay; in subsequent generations, the dental and facial bone structure changed, along with other traits that had been seen in American and European families and previously considered to be the result of interracial marriage.

By studying the different routes that the same populations had taken – traditional versus modern diet – he saw that the health of the children is directly related to the health of the parents and the germ plasms they provide, which are as important to the child’s makeup as the health of the mother before and during pregnancy.

Price also found that primitive populations were very conscious of the importance of the mothers’ health and many populations made sure that girls were given a special diet for several months before they were allowed to marry.

Another interesting finding was that although genetic makeup was important, it did not have as great an influence on a person’s development and health as was thought; a lot of individual characteristics, including brain development and brain function, were due to environmental influence, what he called “intercepted heredity”.

The origin of personality and character appear in the light of the newer data to be biologic products and to a much less degree than usually considered pure hereditary traits. Since these various factors are biologic, being directly related to both the nutrition of the parents and to the nutritional environment of the individuals in the formative and growth period any common contributing factor such as food deficiencies due to soil depletion will be seen to produce degeneration of the masses of people due to a common cause. Mass behavior therefore, in this new light becomes the result of natural forces, the expression of which may not be modified by propaganda but will require correction at the source. [1] …

It will be easy for the reader to be prejudiced since many of the applications suggested are not orthodox. I suggest that conclusions be deferred until the new approach has been used to survey the physical and mental status of the reader’s own family, of his brothers and sisters, of associated families, and finally, of the mass of people met in business and on the street. Almost everyone who studies the matter will be surprised that such clear-cut evidence of a decline in modern reproductive efficiency could be all about us and not have been previously noted and reviewed.[2]

From Nutrition and Physical Degeneration by Weston Price

Food Freedom – Nourishing Raw Milk
by Lisa Virtue

In 1931 Price visited the people of the Loetschental Valley in the Swiss Alps. Their diet consisted of rye bread, milk, cheese, and butter, with meat once a week (Price, 25). The milk was collected from pastured cows and was consumed raw: unpasteurized, unhomogenized (Schmid, 9).

Price described these people as having “stalwart physical development and high moral character…superior types of manhood, womanhood and childhood that Nature has been able to produce from a suitable diet and…environment” (Price, 29). At this time, tuberculosis had taken more lives in Switzerland than any other disease. The Swiss government ordered an inspection of the valley, revealing not a single case. No deaths had ever been recorded from tuberculosis in the history of the Loetschental people (Schmid, 8). Upon returning home, Price had dairy samples from the valley sent to him throughout the year. These samples were higher in minerals and vitamins than samples from commercial (thus pasteurized) dairy products in America and the rest of Europe. The Loetschental milk was particularly high in fat-soluble vitamin D (Schmid, 9).

The daily intake of calcium and phosphorus, as well as of fat-soluble vitamins, would have been higher than that of the average North American child. These children were strong and sturdy, playing barefoot in the glacial waters into the late chilly evenings. Of all the children in the valley eating primitive foods, cavities were detected at an average of 0.3 per child (Price, 25). This without visiting a dentist or physician, for the valley had none, seeing as there was no need (Price, 23). To offer some perspective, the rate of cavities per child between the ages of 6 and 19 in the United States has been recorded at 3.25, over 10 times the rate seen in Loetschental (Nagel).

Price offers some perspective on a society subsisting mainly on raw dairy products: “One immediately wonders if there is not something in the life-giving vitamins and minerals of the food that builds not only great physical structures within which their souls reside, but builds minds and hearts capable of a higher type of manhood…” (Price, 26).

100 Years Before Weston Price
by Nancy Henderson

Like Price, Catlin was struck by the beauty, strength and demeanor of the Native Americans. “The several tribes of Indians inhabiting the regions of the Upper Missouri. . . are undoubtedly the finest looking, best equipped, and most beautifully costumed of any on the Continent.” Writing of the Blackfoot and Crow, tribes who hunted buffalo on the rich glaciated soils of the American plains, he observed: “They are the happiest races of Indian I have met—picturesque and handsome, almost beyond description.”

“The very use of the word savage,” wrote Catlin, “as it is applied in its general sense, I am inclined to believe is an abuse of the word, and the people to whom it is applied.” […]

As did Weston A. Price one hundred years later, Catlin noted that moral and physical degeneration came together with the advent of civilized society. In his late-1830s portrait of “Pigeon’s Egg Head (The Light) Going to and Returning from Washington,” Catlin painted him corrupted with “gifts of the great white father” upon his return to his native homeland. Those gifts included two bottles of whiskey in his pockets. […]

Like Price, Catlin discusses the issue of heredity versus environment. “No diseases are natural,” he writes, “and deformities, mental and physical, are neither hereditary nor natural, but purely the result of accidents or habits.”

So wrote Dr. Price: “Neither heredity nor environment alone cause our juvenile delinquents and mental defectives. They are cripples, physically, mentally and morally, which could have and should have been prevented by adequate education and by adequate parental nutrition. Their protoplasm was not normally organized.”

The Right Price
by Weston A. Price Foundation

Many commentators have criticized Price for attributing “decline in moral character” to malnutrition. But it is important to realize that the subject of “moral character” was very much on the minds of commentators of his day. As with changes in facial structure, observers in the first half of the 20th century attributed “badness” in people to race mixing, or to genetic defects. Price quotes A.C. Jacobson, author of a 1926 publication entitled Genius (Some Revaluations),35 who stated that “The Jekyll-Hydes of our common life are ethnic hybrids.” Said Jacobson, “Aside from the effects of environment, it may safely be assumed that when two strains of blood will not mix well a kind of ‘molecular insult’ occurs which the biologists may some day be able to detect beforehand, just as blood is now tested and matched for transfusion.” The implied conclusion of this assertion is that “degenerates” can be identified through genetic testing and “weeded out” by sterilizing the unfit – something that was imposed on many women during the period and endorsed by powerful individuals, including Oliver Wendell Holmes.

It is greatly to Price’s credit that he objected to this arrogant point of view: “Most current interpretations are fatalistic and leave practically no escape from our succession of modern physical, mental and moral cripples. . . If our modern degeneration were largely the result of incompatible racial stocks as indicated by these premises, the outlook would be gloomy in the extreme.”36 Price argued that nutritional deficiencies affecting the physical structure of the body can also affect the brain and nervous system; and that while “bad” character may be the result of many influences–poverty, upbringing, displacement, etc.–good nutrition also plays a role in creating a society of cheerful, compassionate individuals.36

Rebirth of a Nation:
The Making of Modern America, 1877-1920
By Jackson Lears
pp. 7-9

By the late nineteenth century, dreams of rebirth were acquiring new meanings. Republican moralists going back to Jefferson’s time had long fretted about “overcivilization,” but the word took on sharper meaning among the middle and upper classes in the later decades of the nineteenth century. During the postwar decades, “overcivilization” became not merely a social but an individual condition, with a psychiatric diagnosis. In American Nervousness (1880), the neurologist George Miller Beard identified “neurasthenia,” or “lack of nerve force,” as the disease of the age. Neurasthenia encompassed a bewildering variety of symptoms (dyspepsia, insomnia, nocturnal emissions, tooth decay, “fear of responsibility, of open places or closed places, fear of society, fear of being alone, fear of fears, fear of contamination, fear of everything, deficient mental control, lack of decision in trifling matters, hopelessness”), but they all pointed to a single overriding effect: a paralysis of the will.

The malady identified by Beard was an extreme version of a broader cultural malaise—a growing sense that the Protestant ethic of disciplined achievement had reached the end of its tether, had become entangled in the structures of an increasingly organized capitalist society. Ralph Waldo Emerson unwittingly predicted the fin de siècle situation. “Every spirit makes its house,” he wrote in “Fate” (1851), “but afterwards the house confines the spirit.” The statement presciently summarized the history of nineteenth-century industrial capitalism, on both sides of the Atlantic.

By 1904, the German sociologist Max Weber could put Emerson’s proposition more precisely. The Protestant ethic of disciplined work for godly ends had created an “iron cage” of organizations dedicated to the mass production and distribution of worldly goods, Weber argued. The individual striver was caught in a trap of his own making. The movement from farm to factory and office, and from physical labor outdoors to sedentary work indoors, meant that more Europeans and North Americans were insulated from primary processes of making and growing. They were also caught up in subtle cultural changes—the softening of Protestantism into platitudes; the growing suspicion that familiar moral prescriptions had become mere desiccated, arbitrary social conventions. With the decline of Christianity, the German philosopher Friedrich Nietzsche wrote, “it will seem for a time as though all things had become weightless.”

Alarmists saw these tendencies as symptoms of moral degeneration. But a more common reaction was a diffuse but powerful feeling among the middle and upper classes—a sense that they had somehow lost contact with the palpitating actuality of “real life.” The phrase acquired unprecedented emotional freight during the years around the turn of the century, when reality became something to be pursued rather than simply experienced. This was another key moment in the history of longing, a swerve toward the secular. Longings for this-worldly regeneration intensified when people with Protestant habits of mind (if not Protestant beliefs) confronted a novel cultural situation: a sense that their way of life was being stifled by its own success.

On both sides of the Atlantic, the drive to recapture “real life” took myriad cultural forms. It animated popular psychotherapy and municipal reform as well as avant-garde art and literature, but its chief institutional expression was regeneration through military force. As J. A. Hobson observed in Imperialism (1902), the vicarious identification with war energized jingoism and militarism. By the early twentieth century, in many minds, war (or the fantasy of it) had become the way to keep men morally and physically fit. The rise of total war between the Civil War and World War I was rooted in longings for release from bourgeois normality into a realm of heroic struggle. This was the desperate anxiety, the yearning for rebirth, that lay behind official ideologies of romantic nationalism, imperial progress, and civilizing mission—and that led to the trenches of the Western Front.

Americans were immersed in this turmoil in peculiarly American ways. As the historian Richard Slotkin has brilliantly shown, since the early colonial era a faith in regeneration through violence underlay the mythos of the American frontier. With the closing of the frontier (announced by the U.S. census in 1890), violence turned outward, toward empire. But there was more going on than the refashioning of frontier mythology. American longings for renewal continued to be shaped by persistent evangelical traditions, and overshadowed by the shattering experience of the Civil War. American seekers merged Protestant dreams of spiritual rebirth with secular projects of purification—cleansing the body politic of secessionist treason during the war and political corruption afterward, reasserting elite power against restive farmers and workers, taming capital in the name of the public good, reviving individual and national vitality by banning the use of alcohol, granting women the right to vote, disenfranchising African-Americans, restricting the flow of immigrants, and acquiring an overseas empire.

Of course not all these goals were compatible. Advocates of various versions of rebirth—bodybuilders and Prohibitionists, Populists and Progressives, Social Christians and Imperialists—all laid claims to legitimacy. Their crusades met various ends, but overall they relieved the disease of the fin de siècle by injecting some visceral vitality into a modern culture that had seemed brittle and about to collapse. Yearning for intense experience, many seekers celebrated Force and Energy as ends in themselves. Such celebrations could reinforce militarist fantasies but could also lead in more interesting directions—toward new pathways in literature and the arts and sciences. Knowledge could be revitalized, too. William James, as well as Houdini and Roosevelt, was a symbol of the age.

The most popular forms of regeneration had a moral dimension.

pp. 27-29

But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:

“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”

This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.

This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.

pp. 63-71

NOT MANY AMERICAN men, even among the comparatively prosperous classes, were as able as Carnegie and Rockefeller to master the tensions at the core of their culture. Success manuals acknowledged the persistent problem of indiscipline, the need to channel passion to productive ends. Often the language of advice literature was sexually charged. In The Imperial Highway (1881), Jerome Bates advised:

[K]eep cool, have your resources well in hand, and reserve your strength until the proper time arrives to exert it. There is hardly any trait of character or faculty of intellect more valuable than the power of self-possession, or presence of mind. The man who is always “going off” unexpectedly, like an old rusty firearm, who is easily fluttered and discomposed at the appearance of some unforeseen emergency; who has no control over himself or his powers, is just the one who is always in trouble and is never successful or happy.

The assumptions behind this language are fascinating and important to an understanding of middle- and upper-class Americans in the Gilded Age. Like many other purveyors of conventional wisdom—ministers, physicians, journalists, health reformers—authors of self-help books assumed a psychic economy of scarcity. For men, this broad consensus of popular psychology had sexual implications: the scarce resource in question was seminal fluid, and one had best not be diddling it away in masturbation or even nocturnal emissions. This was easier said than done, of course, as Bates indicated, since men were constantly addled by insatiable urges, always on the verge of losing self-control—the struggle to keep it was an endless battle with one’s own darker self. Spiritual, psychic, and physical health converged. What Freud called “‘civilized’ sexual morality” fed directly into the “precious bodily fluids” school of health management. The man who was always “‘going off’ unexpectedly, like an old rusty firearm,” would probably be sickly as well as unsuccessful—sallow, sunken-chested, afflicted by languorous indecision (which was how Victorian health literature depicted the typical victim of what was called “self-abuse”).

But as this profile of the chronic masturbator suggests, scarcity psychology had implications beyond familiar admonitions to sexual restraint. Sexual scarcity was part of a broader psychology of scarcity; the need to conserve semen was only the most insistently physical part of a much more capacious need to conserve psychic energy. As Bates advised, the cultivation of “self-possession” allowed you to “keep your resources well in hand, and reserve your strength until the proper time arrives to exert it.” The implication was that there was only so much strength available to meet demanding circumstances and achieve success in life. The rhetoric of “self-possession” had financial as well as sexual connotations. To preserve a cool, unruffled presence of mind (to emulate Rockefeller, in effect) was one way to stay afloat on the storm surges of the business cycle.

The object of this exercise, at least for men, was personal autonomy—the ownership of one’s self. […]

It was one thing to lament excessive wants among the working class, who were supposed to be cultivating contentment with their lot, and quite another to find the same fault among the middle class, who were supposed to be improving themselves. The critique of middle-class desire posed potentially subversive questions about the dynamic of dissatisfaction at the core of market culture, about the very possibility of sustaining a stable sense of self in a society given over to perpetual jostling for personal advantage. The ruinous results of status-striving led advocates of economic thrift to advocate psychic thrift as well.

By the 1880s, the need to conserve scarce psychic resources was a commonly voiced priority among the educated and affluent. Beard’s American Nervousness had identified “the chief and primary cause” of neurasthenia as “modern civilization,” which placed unprecedented demands on limited emotional energy. “Neurasthenia” and “nervous prostration” became catchall terms for a constellation of symptoms that today would be characterized as signs of chronic depression—anxiety, irritability, nameless fears, listlessness, loss of will. In a Protestant culture, where effective exercise of will was the key to individual selfhood, the neurasthenic was a kind of anti-self—at best a walking shadow, at worst a bedridden invalid unable to make the most trivial choices or decisions. Beard and his colleagues—neurologists, psychiatrists, and self-help writers in the popular press—all agreed that nervous prostration was the price of progress, a signal that the psychic circuitry of “brain workers” was overloaded by the demands of “modern civilization.”

While some diagnoses of this disease deployed electrical metaphors, the more common idiom was economic. Popular psychology, like popular economics, was based on assumptions of scarcity: there was only so much emotional energy (and only so much money) to go around. The most prudent strategy was the husbanding of one’s resources as a hedge against bankruptcy and breakdown. […]

Being reborn through a self-allowed regime of lassitude was idiosyncratic, though important as a limiting case. Few Americans had the leisure or the inclination to engage in this kind of Wordsworthian retreat. Most considered neurasthenia at best a temporary respite, at worst an ordeal. They strained, if ambivalently, to be back in harness.

The manic-depressive psychology of the business class mimicked the lurching ups and downs of the business cycle. In both cases, assumptions of scarcity underwrote a pervasive defensiveness, a circle-the-wagons mentality. This was the attitude that lay behind the “rest cure” devised by the psychiatrist Silas Weir Mitchell, who proposed to “fatten” and “redden” the (usually female) patient by isolating her from all mental and social stimulation. (This nearly drove the writer Charlotte Perkins Gilman crazy, and inspired her story “The Yellow Wallpaper.”) It was also the attitude that lay behind the fiscal conservatism of the “sound-money men” on Wall Street and in Washington—the bankers and bondholders who wanted to restrict the money supply by tying it to the gold standard. Among the middle and upper classes, psyche and economy alike were haunted by the common specter of scarcity. But there were many Americans for whom scarcity was a more palpable threat.

AT THE BOTTOM of the heap were the urban poor. To middle-class observers they seemed little more than a squalid mass jammed into tenements that were festering hives of “relapsing fever,” a strange malady that left its survivors depleted of strength and unable to work. The disease was “the most efficient recruiting officer pauperism ever had,” said a journalist investigating tenement life in the 1870s. Studies of “the nether side of New York” had been appearing for decades, but—in the young United States at least—never before the Gilded Age had the story of Dives and Lazarus been so dramatically played out, never before had wealth been so flagrant, or poverty been so widespread and so unavoidably appalling. The army of thin young “sewing-girls” trooping off in the icy dawn to sweatshops all over Manhattan, the legions of skilled mechanics forced by high New York rents to huddle with their families amid a crowd of lowlifes, left without even a pretense of privacy in noisome tenements that made a mockery of the Victorian cult of home—these populations began to weigh on the bourgeois imagination, creating concrete images of the worthy, working poor.

pp. 99-110

Racial animosities flared in an atmosphere of multicultural fluidity, economic scarcity, and sexual rivalry. Attitudes arising from visceral hostility acquired a veneer of scientific objectivity. Race theory was nothing new, but in the late nineteenth century it mutated into multiple forms, many of them characterized by manic urgency, sexual hysteria, and biological determinism. Taxonomists had been trying to arrange various peoples in accordance with skull shape and brain size for decades; popularized notions of natural selection accelerated the taxonomic project, investing it more deeply in anatomical details. The superiority of the Anglo-Saxon—according to John Fiske, the leading pop-evolutionary thinker—arose not only from the huge size of his brain, but also from the depth of its furrows and the plenitude of its creases. The most exalted mental events had humble somatic origins. Mind was embedded in body, and both could be passed on to the next generation.

The year 1877 marked a crucial development in this hereditarian synthesis: in that year, Richard Dugdale published the results of his investigation into the Juke family, a dull-witted crew that had produced more than its share of criminals and mental defectives. While he allowed for the influence of environment, Dugdale emphasized the importance of inherited traits in the Juke family. If mental and emotional traits could be inherited along with physical ones, then why couldn’t superior people be bred like superior dogs or horses? The dream of creating a science of eugenics, dedicated to improving and eventually even perfecting human beings, fired the reform imagination for decades. Eugenics was a kind of secular millennialism, a vision of a society where biological engineering complemented social engineering to create a managerial utopia. The intellectual respectability of eugenics, which lasted until the 1930s, when it became associated with Nazism, underscores the centrality of racialist thinking among Americans who considered themselves enlightened and progressive. Here as elsewhere, racism and modernity were twinned.

Consciousness of race increasingly pervaded American culture in the Gilded Age. Even a worldview as supple as Henry James’s revealed its moorings in conventional racial categories when, in The American (1877), James presented his protagonist, Christopher Newman, as a quintessential Anglo-Saxon but with echoes of the noble Red Man, with the same classical posture and physiognomy. There was an emerging kinship between these two groups of claimants to the title “first Americans.” The iconic American, from this view, was a blend of Anglo-Saxon refinement and native vigor. While James only hints at this, in less than a generation such younger novelists as Frank Norris and Jack London would openly celebrate the rude vitality of the contemporary Anglo-Saxon, proud descendant of the “white savages” who subdued a continent. It should come as no surprise that their heroes were always emphatically male. The rhetoric of race merged with a broader agenda of masculine revitalization.[…]

By the 1880s, muscular Christians were sweeping across the land, seeking to meld spiritual and physical renewal, establishing institutions like the Young Men’s Christian Association. The YMCA provided prayer meetings and Bible study to earnest young men with spiritual seekers’ yearnings, gyms and swimming pools to pasty young men with office workers’ midriffs. Sometimes they were the same young men. More than any other organization, the YMCA aimed to promote the symmetry of character embodied in the phrase “body, mind, spirit”—which a Y executive named Luther Gulick plucked from Deuteronomy and made the motto of the organization. The key to the Y’s appeal, a Harper’s contributor wrote in 1882, was the “overmastering conviction” of its members: “The world always respects manliness, even when it is not convinced [by theological argument]; and if the organizations did not sponsor that quality in young men, they would be entitled to no respect.” In the YMCA, manliness was officially joined to a larger agenda.

For many American Protestants, the pursuit of physical fitness merged with an encompassing vision of moral and cultural revitalization—one based on the reassertion of Protestant self-control against the threats posed to it by immigrant masses and mass-marketed temptation. […]

Science and religion seemed to point in the same direction: Progress and Providence were one.

Yet the synthesis remained precarious. Physical prowess, the basis of national supremacy, could not be taken for granted. Strong acknowledged in passing that Anglo-Saxons could be “devitalized by alcohol and tobacco.” Racial superiority could be undone by degenerate habits. Even the most triumphalist tracts contained an undercurrent of anxiety, rooted in the fear of flab. The new stress on the physical basis of identity began subtly to undermine the Protestant synthesis, to reinforce the suspicion that religion was a refuge for effeminate weaklings. The question inevitably arose, in some men’s minds: What if the YMCA and muscular Christianity were not enough to revitalize tired businessmen and college boys?

Under pressure from proliferating ideas of racial “fitness,” models of manhood became more secular. Despite the efforts of muscular Christians to reunite body and soul, the ideal man emerging among all classes by the 1890s was tougher and less introspective than his mid-Victorian predecessors. He was also less religious. Among advocates of revitalization, words like “Energy” and “Force” began to dominate discussion—often capitalized, often uncoupled from any larger frameworks of moral or spiritual meaning, and often combined with racist assumptions. […]

The emerging worship of force raised disturbing issues. Conventional morality took a backseat to the celebration of savage strength. After 1900, in the work of a pop-Nietzschean like Jack London, even criminality became a sign of racial vitality: as one of his characters says, “We whites have been land-robbers and sea-robbers from remotest time. It is in our blood, I guess, and we can’t get away from it.” This reversal of norms did not directly challenge racial hierarchies, but the assumptions behind it led toward disturbing questions. If physical prowess was the mark of racial superiority, what was one to make of the magnificent specimens of manhood produced by allegedly inferior races? Could it be that desk-bound Anglo-Saxons required an infusion of barbarian blood (or at least the “barbarian virtues” recommended by Theodore Roosevelt)? Behind these questions lay a primitivist model of regeneration, to be accomplished by incorporating the vitality of the vanquished, dark-skinned other. The question was how to do that and maintain racial purity.

pp. 135-138

Yet to emphasize the gap between country and the city was not simply an evasive exercise: dreams of bucolic stillness or urban energy stemmed from motives more complex than mere escapist sentiment. City and country were mother lodes of metaphor, sources for making sense of the urban-industrial revolution that was transforming the American countryside and creating a deep sense of discontinuity in many Americans’ lives during the decades after the Civil War. If the city epitomized the attraction of the future, the country embodied the pull of the past. For all those who had moved to town in search of excitement or opportunity, rural life was ineluctably associated with childhood and memory. The contrast between country and city was about personal experience as well as political economy. […]

REVERENCE FOR THE man of the soil was rooted in the republican tradition. In his Notes on the State of Virginia (1785), Jefferson articulated the antithesis that became central to agrarian politics (and to the producerist worldview in general)—the contrast between rural producers and urban parasites. “Those who labour in the earth are the chosen people of God, if ever he had a chosen people, whose breasts he has made his peculiar deposit for substantial and genuine virtue,” he announced. “Corruption of morals in the mass of cultivators is a phenomenon of which no age nor nation has furnished an example. It is the mark set on those, who not looking up to heaven, to their own soil and industry, as does the husbandman, for their subsistence, depend for it on the casualties and caprice of customers. Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the design of ambition.” Small wonder, from this view, that urban centers of commerce seemed to menace the public good. “The mobs of great cities,” Jefferson concluded, “add just so much to the support of pure government as sores do to the strength of the human body.” Jefferson’s invidious distinctions echoed through the nineteenth century, fueling the moral passion of agrarian rebels. Watson, among many, considered himself a Jeffersonian.

There were fundamental contradictions embedded in Jefferson’s conceptions of an independent yeomanry. Outside certain remote areas in New England, most American farmers were not self-sufficient in the nineteenth century—nor did they want to be. Many were eager participants in the agricultural market economy, animated by a restless, entrepreneurial spirit. Indeed, Jefferson’s own expansionist policies, especially the Louisiana Purchase, encouraged centrifugal movement as much as permanent settlement. “What developed in America,” the historian Richard Hofstadter wrote, “was an agricultural society whose real attachment was not to the land but to land values.” The figure of the independent yeoman, furnishing enough food for himself and his family, participating in the public life of a secure community—this icon embodied longings for stability amid a maelstrom of migration.

Often the longings were tinged with a melancholy sense of loss. […] For those with Jeffersonian sympathies, abandoned farms were disturbing evidence of cultural decline. As a North American Review contributor wrote in 1888: “Once let the human race be cut off from personal contact with the soil, once let the conventionalities and artificial restrictions of so-called civilization interfere with the healthful simplicity of nature, and decay is certain.” Romantic nature-worship had flourished fitfully among intellectuals since Emerson had become a transparent eye-ball on the Concord common and Whitman had loafed among leaves of grass. By the post–Civil War decades, romantic sentiment combined with republican tradition to foster forebodings. Migration from country to city, from this view, was a symptom of disease in the body politic. Yet the migration continued. Indeed, nostalgia for rural roots was itself a product of rootlessness. A restless spirit, born of necessity and desire, spun Americans off in many directions—but mainly westward. The vision of a stable yeomanry was undercut by the prevalence of the westering pioneer.

pp. 246-247

Whether energy came from within or without, it was as limitless as electricity apparently was. The obstacles to access were not material—class barriers or economic deprivation were never mentioned by devotees of abundance psychology—they were mental and emotional. The most debilitating emotion was fear, which cropped up constantly as the core problem in diagnoses of neurasthenia. The preoccupation with freeing oneself from internal constraints undermined the older, static ideal of economic self-control at its psychological base. As one observer noted in 1902: “The root cause of thrift, which we all admire and preach because it is so convenient to the community, is fear, fear of future want; and that fear, we are convinced, when indulged overmuch by pessimist minds is the most frequent cause of miserliness….” Freedom from fear meant freedom to consume.

And consumption began at the dinner table. Woods Hutchinson claimed in 1913 that the new enthusiasm for calories was entirely appropriate to a mobile, democratic society. The old “stagnation” theory of diet merely sought to maintain the level of health and vigor; it was a diet for slaves or serfs, for people who were not supposed to rise above their station. “The new diet theory is based on the idea of progress, of continuous improvement, of never resting satisfied with things as they are,” Hutchinson wrote. “No diet is too liberal or expensive that will…yield good returns on the investment.” Economic metaphors for health began to focus on growth and process rather than stability, on consumption and investment rather than savings.

As abundance psychology spread, a new atmosphere of dynamism enveloped old prescriptions for success. After the turn of the century, money was less often seen as an inert commodity, to be gradually accumulated and tended to steady growth; and more often seen as a fluid and dynamic force. To Americans enraptured by the strenuous life, energy became an end in itself—and money was a kind of energy. Success mythology reflected this subtle change. In the magazine hagiographies of business titans—as well as in the fiction of writers like Dreiser and Norris—the key to success frequently became a mastery of Force (as those novelists always capitalized it), of raw power. Norris’s The Pit (1903) was a paean to the furious economic energies concentrated in Chicago. “It was Empire, the restless subjugation of all this central world of the lakes and prairies. Here, mid-most in the land, beat the Heart of the nation, whence inevitably must come its immeasurable power, its infinite, inexhaustible vitality. Here of all her cities, throbbed the true life—the true power and spirit of America: gigantic, crude, with the crudity of youth, disdaining rivalry; sane and healthy and vigorous; brutal in its ambition, arrogant in the new-found knowledge of its giant strength, prodigal of its wealth, infinite in its desires.” This was the vitalist vision at its most breathless and jejune, the literary equivalent of Theodore Roosevelt’s adolescent antics.

The new emphasis on capital as Force translated the psychology of abundance into economic terms. The economist who did the most to popularize this translation was Simon Nelson Patten, whose The New Basis of Civilization (1907) argued that the United States had passed from an “era of scarcity” to an “era of abundance” characterized by the unprecedented availability of mass-produced goods. His argument was based on the confident assumption that human beings had learned to control the weather. “The Secretary of Agriculture recently declared that serious crop failures will occur no more,” Patten wrote. “Stable, progressive farming controls the terror, disorder, and devastation of earlier times. A new agriculture means a new civilization.” Visions of perpetual growth were in the air, promising both stability and dynamism.

The economist Edward Atkinson pointed the way to a new synthesis with a hymn to “mental energy” in the Popular Science Monthly. Like other forms of energy, it was limitless. “If…there is no conceivable limit to the power of mind over matter or to the number of conversions of force that can be developed,” he wrote, “it follows that pauperism is due to want of mental energy, not of material resources.” Redistribution of wealth was not on the agenda; positive thinking was.

pp. 282-283

TR’s policies were primarily designed to protect American corporations’ access to raw materials, investment opportunities, and sometimes markets. The timing was appropriate. In the wake of the merger wave of 1897–1903, Wall Street generated new pools of capital, while Washington provided new places to invest it. Speculative excitement seized many among the middle and upper classes who began buying stocks for the first time. Prosperity spread even among the working classes, leading Simon Nelson Patten to detect a seismic shift from an era of scarcity to an era of abundance. For him, a well-paid working population committed to ever-expanding consumption would create what he called The New Basis of Civilization (1907).

Patten understood that the mountains of newly available goods were in part the spoils of empire, but he dissolved imperial power relations in a rhetoric of technological determinism. The new abundance, he argued, depended not only on the conquest of weather but also on the annihilation of time and space—a fast, efficient distribution system that provided Americans with the most varied diet in the world, transforming what had once been luxuries into staples of even the working man’s diet. “Rapid distribution of food carries civilization with it, and the prosperity that gives us a Panama canal with which to reach untouched tropic riches is a distinctive laborer’s resource, ranking with refrigerated express and quick freight carriage.” The specific moves that led to the seizure of the Canal Zone evaporated in the abstract “prosperity that gives us a Panama Canal,” which in turn became as much a boon to the workingman as innovative transportation. Empire was everywhere, in Patten’s formulation, and yet nowhere in sight.

What Patten implied (rather than stated overtly) was that imperialism underwrote expanding mass consumption, raising standards of living for ordinary folk. “Tropic riches” became cheap foods for the masses. The once-exotic banana was now sold from pushcarts for 6 cents a dozen, “a permanent addition to the laborer’s fund of goods.” The same was true of “sugar, which years ago was too expensive to be lavishly consumed by the well-to-do,” but “now freely gives its heat to the workingman,” as Patten wrote. “The demand that will follow the developing taste for it can be met by the vast quantities latent in Porto Rico and Cuba, and beyond them by the teeming lands of South America, and beyond them by the virgin tropics of another hemisphere.” From this view, the relation between empire and consumption was reciprocal: if imperial policies helped stimulate consumer demand, consumer demand in turn promoted imperial expansion. A society committed to ever-higher levels of mass-produced abundance required empire to be a way of life.

Coping Mechanisms of Health

Carl Jung argued that sometimes what seems like mental illness is in actuality an effective coping mechanism. He advised against treating the coping mechanism as the problem without first understanding what it is a response to; otherwise, the underlying problem could be made worse. Some people have found a careful balance that allows them to function in the world, however dysfunctional it may seem to others, from addiction to dissociation. We need to have respect and compassion for how humans cope with difficulties.

There is something similar in physical health. Consider obesity. Is it always the cause of health problems? Or might it sometimes be the body’s way of protecting against other health problems? That question was explored in a recent study mentioned by Gabor Erdosi: “Friendly Fat Theory – Explaining the Paradox of Diabetes and Obesity” by Rajiv Singla et al. The authors write:

“Obesity has been called the mother of all diseases and, historically, has been strongly linked to diabetes. However, there are still some paradoxes that exist in diabetes epidemiology and obesity and no unifying hypothesis has been proposed to explain these paradoxical phenomena. Despite the ever-increasing prevalence of both obesity and diabetes, differential relationships exist between diabetes and the extent of obesity in various different ethnic groups. In addition, people with a higher body mass index have been shown to have an improved survival advantage in terms of chronic diabetes complications, especially cardiovascular complications. This narrative review attempts to explain these paradoxical and complex relationships with a single unifying theory. We propose that adipocytes are actually friends of the human body to prevent the occurrence of diabetes and also help in mitigating the complications of diabetes. Adipose tissue actually acts as a reservoir of free fatty acids, responsible for insulin resistance, and prevents their overflow into insulin-sensitive tissues and, therefore, friendly fat theory.”

L. Amber O’Hearn responded, “Wait, are you saying the body is actually trying to be healthy and that many symptoms we see in connection with disease are functionally protective coping mechanisms?” Yes, indeed. Following that, someone else noted that Dr. Jason Fung made a similar argument in an interview with Peter Attia (podcast #59). I’m sure many others have said similar things. It’s not difficult to understand for anyone familiar with the science.

For example, inflammation causes or is a causal link in many health problems (arthritis, depression, schizophrenia, etc.), or is otherwise seen as an indicator of deteriorating health; but inflammation itself isn’t the fundamental cause, since it is in turn a protective response to something else (allergens, leaky gut, etc.). As yet another example, there is the theory that cholesterol plaque in arteries doesn’t cause the problem but is a response to it: the cholesterol is essentially forming a scab in seeking to heal an injury. Blaming cholesterol would be like accusing firefighters of arson because they are present at fires. To bring it back to diabetes, consider the amyloid plaque commonly found in the brains of Alzheimer’s patients—Alzheimer’s sometimes being called type 3 diabetes. Amy Berger, based on the work of others (Dale Bredesen, Sónia Correia, Mortimer Mamelak, etc.), speculates that the amyloid beta (Aβ) peptide plays a neuroprotective role against continuously high levels of glucose, but that when the brain’s clearance machinery is occupied with insulin, the resulting Aβ plaques cannot be cleared out. Those plaques would, therefore, be a symptom and not a cause.

As Berger says, “Aβ plaques can be likened to a fever: Fever is a natural protective mechanism the body employs to kill and neutralize invading pathogens. (The goal is to raise the body’s core temperature to a level that is fatal to the bugs and viruses that cause illness.) However, even though it is a protective mechanism, if a fever goes too high, it can have disastrous effects for other parts of human physiology. The same can be said of Aβ. It might start as a defensive step in the brain, but it progresses to a point where, rather than being helpful, it becomes harmful.” Drug companies have targeted these plaques and have successfully decreased them. The result wasn’t as expected. Instead of improving, the condition of Alzheimer’s patients worsened. Blocking the amyloid beta peptide, it turns out, was not beneficial. That is consistent with the theory of a neuroprotective role.

One could look to numerous other things, as the basic principle is widely applicable. The body is always seeking the healthiest balance available under any given conditions, even if less than optimal. So, in seeking greater health, we must realize that the body-mind of an individual is a system that is part of larger systems. To get different results, the totality of the situation needs to be shifted into a new balance. That is why something like ketosis can dramatically improve so many health issues, as it completely alters the functioning of gut health, metabolism, immune response, the hormonal system, neurocognition, and on and on. That a diet could have this kind of impact should not be hard to understand. Think about the multiple links, direct and indirect, between the gut and the brain — then multiply that by hundreds of other major connections within our biology.

The failing of conventional medicine is that it has usually taken a symptoms-based approach. Diagnosis is determined by patterns of symptoms. Too often, that is then used to choose a medication or surgical intervention to treat those symptoms. Underlying causes are rarely understood or even considered. Partly, that is because of a lack of knowledge and the related low quality of many medical studies. But more problematic is that the dominant paradigm constrains thought and shuts down the ability to imagine other ways of doing medicine. The above study, however, suggests that we should understand what purpose something is serving. Obesity isn’t merely too much fat. Instead of being the problem itself, obesity might be the body’s best possible solution under the given conditions.

What if so many of our supposed problems operate in a similar manner? What if instead of constantly fighting against what we deem as bad we sought understanding first about what purpose is being served and then sought some other means of accomplishing that end? Think about the short-term thinking that has been observed under conditions of poverty and high inequality. Instead of judging people as inferior, we could realize that short-term thinking makes perfect sense in evolutionary terms, as extreme stress indicates that immediate problems must be dealt with first. Rather than blaming the symptom or scapegoating the victim, we should look at the entire context of what is going on. If we don’t like the results we are getting as individuals and as a society, we better change the factors that lead to those results. It’s a simple and typically overlooked insight.

We aren’t isolated individuals. We are an inseparable aspect of a larger world. Every system within our bodies and minds, every system in society and the environment is integral to our holistic functioning as human beings. Everything is connected in various ways. Change one thing and it will ripple outward.

* * *

It’s The Insulin Resistance, Stupid: Part 1 & Part 2
by Timothy Noakes

Most Mainstream Doctors Would Fail Nutrition

To return to the topic at hand, the notion of food as medicine, a premise of the paleo diet, also goes back to the ancient Greeks — in fact, it originates with the founder of modern medicine, Hippocrates (he is also credited with saying that “All disease begins in the gut,” a slight exaggeration of a common view about the importance of gut health, a key area of connection between the paleo diet and alternative medicine). What we now call functional medicine, treating people holistically, was the standard practice of family doctors for centuries and probably millennia, going back to medicine men and women. But this caring attitude and practice went by the wayside because it took time to spend with patients, and insurance companies wouldn’t pay for it. Traditional healthcare that we now think of as alternative is maybe not possible under a for-profit model, but I’d say that is more a criticism of the for-profit model than of traditional healthcare.

Diets and Systems

Related to diet, Pezeshki does bring up the issue of inflammation. As I originally came around to my present diet from a paleo viewpoint, I became familiar with the approach of functional medicine that treats inflammation as a central factor (Essentialism On the Decline). Inflammation is a bridge between the physiological and the psychological, the individual and the social. Where and how inflammation erupts within the individual determines how a disease condition, or rather a confluence of symptoms, gets labeled and treated, even if the fundamental cause originated elsewhere, maybe in the ‘external’ world (socioeconomic stress, transgenerational trauma, environmental toxins, parasites because of lack of public sanitation, etc.). Inflammation is linked to leaky gut, leaky brain, arthritis, autoimmune disorders, mood disorders, ADHD, autism, schizophrenia, impulsivity, short-term thinking, addiction, aggression, etc. — and such problems increase under high inequality.

There are specific examples to point to. Diabetes and mood disorders co-occur. There is the connection of depression and anhedonia, involving the reward circuit and pleasure, which in turn can be affected by inflammation. Also, inflammation can lead to changes in glutamate in depression, similar to the glutamate alterations in autism from diet and microbes, and that is significant considering that glutamate is not only a major neurotransmitter but also a common food additive. Dr. Roger McIntyre writes, “MRI scans have shown that if you make someone immune activated, the hypervigilance center is activated, activity in the motoric region is reduced, and the person becomes withdrawn and hypervigilant. And that’s what depression is. What’s the classic presentation of depression? People are anxious, agitated, and experience a lack of spontaneous activity and increased emotional withdrawal” (Inflammation, Mood Disorders, and Disease Model Convergence). Inflammation is a serious condition and, in the modern world, quite pervasive. The implications of this are not to be dismissed.

Essentialism On the Decline

In reading about paleolithic diets and traditional foods, a recurring theme is inflammation, specifically as it relates to the health of the gut-brain network and immune system.

The paradigm change this signifies is that seemingly separate diseases with different diagnostic labels often have underlying commonalities. They share overlapping sets of causal and contributing factors, biological processes and symptoms. This is why simple dietary changes can have a profound effect on numerous health conditions. For some, the diseased state expresses as mood disorders and for others as autoimmune disorders and for still others something entirely else, but there are immense commonalities between them all. The differences have more to do with how dysbiosis and dysfunction happens to develop, where it takes hold in the body, and so what symptoms are experienced.

From a paleo diet perspective in treating both patients and her own multiple sclerosis, Terry Wahls gets at this point in a straightforward manner (p. 47): “In a very real sense, we all have the same disease because all disease begins with broken, incorrect biochemistry and disordered communication within and between our cells. […] Inside, the distinction between these autoimmune diseases is, frankly, fairly arbitrary”. In How Emotions Are Made, Lisa Feldman Barrett wrote (Kindle Locations 3834-3850):

“Inflammation has been a game-changer for our understanding of mental illness. For many years, scientists and clinicians held a classical view of mental illnesses like chronic stress, chronic pain, anxiety, and depression. Each ailment was believed to have a biological fingerprint that distinguished it from all others. Researchers would ask essentialist questions that assume each disorder is distinct: “How does depression impact your body? How does emotion influence pain? Why do anxiety and depression frequently co-occur?” 9

“More recently, the dividing lines between these illnesses have been evaporating. People who are diagnosed with the same-named disorder may have greatly diverse symptoms— variation is the norm. At the same time, different disorders overlap: they share symptoms, they cause atrophy in the same brain regions, their sufferers exhibit low emotional granularity, and some of the same medications are prescribed as effective.

“As a result of these findings, researchers are moving away from a classical view of different illnesses with distinct essences. They instead focus on a set of common ingredients that leave people vulnerable to these various disorders, such as genetic factors, insomnia, and damage to the interoceptive network or key hubs in the brain (chapter 6). If these areas become damaged, the brain is in big trouble: depression, panic disorder, schizophrenia, autism, dyslexia, chronic pain, dementia, Parkinson’s disease, and attention deficit hyperactivity disorder are all associated with hub damage. 10

“My view is that some major illnesses considered distinct and “mental” are all rooted in a chronically unbalanced body budget and unbridled inflammation. We categorize and name them as different disorders, based on context, much like we categorize and name the same bodily changes as different emotions. If I’m correct, then questions like, “Why do anxiety and depression frequently co-occur?” are no longer mysteries because, like emotions, these illnesses do not have firm boundaries in nature.”

What jumped out at me was the conventional view of disease as essentialist, and hence the related essentialism in biology and psychology.

On Health or Lack Thereof

Millennials’ health plummets after the age of 27: Study finds the generation has unprecedented rates of diabetes, depression, and digestive disorders
by Natalie Rahhal

  • After age 27, all major measures of health start to decline sharply for millennials, according to a new Blue Cross Blue Shield Report
  • Millennials have higher rates of eight of the top 10 most common health conditions by their mid-30s than generation X-ers did at the same age
  • As their health continues to decline, millennials stand to cost the American health care industry and economy steep sums

It’s all downhill from here: A depressing graph shows steep health decline that begins after age 27 and continues until death for millennials

Effect of Dietary Lipid on UV Light Carcinogenesis in the Hairless Mouse
by Vivienne E. Reeve, Melissa Matheson, Gavin E. Greenoak, Paul J. Canfield, Christa Boehm‐Wilcox, and Clifford H. Gallagher

Isocaloric feeding of diets varying in lipid content to albino hairless mice has shown that their susceptibility to skin tumorigenesis induced by simulated solar UV light was not affected by the level of polyunsaturated fat, 5% or 20%. However a qualitative effect of dietary lipid was demonstrated. Mice fed 20% saturated fat were almost completely protected from UV tumorigenesis when compared with mice fed 20% polyunsaturated fat. Multiple latent tumours were detected in the saturated fat‐fed mice by subsequent dietary replenishment, suggesting that a requirement for dietary unsaturated fat exists for the promotion stage of UV‐induced skin carcinogenesis.

Therapeutic benefit of combining calorie-restricted ketogenic diet and glutamine targeting in late-stage experimental glioblastoma
by Purna Mukherjee, Zachary M. Augur, Mingyi Li, Collin Hill, Bennett Greenwood, Marek A. Domin, Gramoz Kondakci, Niven R. Narain, Michael A. Kiebish, Roderick T. Bronson, Gabriel Arismendi-Morillo, Christos Chinopoulos, and Thomas N. Seyfried

Glioblastoma (GBM) is an aggressive primary human brain tumour that has resisted effective therapy for decades. Although glucose and glutamine are the major fuels that drive GBM growth and invasion, few studies have targeted these fuels for therapeutic management. The glutamine antagonist, 6-diazo-5-oxo-L-norleucine (DON), was administered together with a calorically restricted ketogenic diet (KD-R) to treat late-stage orthotopic growth in two syngeneic GBM mouse models: VM-M3 and CT-2A. DON targets glutaminolysis, while the KD-R reduces glucose and, simultaneously, elevates neuroprotective and non-fermentable ketone bodies. The diet/drug therapeutic strategy killed tumour cells while reversing disease symptoms, and improving overall mouse survival. The therapeutic strategy also reduces edema, hemorrhage, and inflammation. Moreover, the KD-R diet facilitated DON delivery to the brain and allowed a lower dosage to achieve therapeutic effect. The findings support the importance of glucose and glutamine in driving GBM growth and provide a therapeutic strategy for non-toxic metabolic management.

Writer’s block
by Dr. Malcolm Kendrick

Anyway, to return to the main issue here, which is that medical science may now be incapable of self-correction. Erroneous ideas will be compounded, built on, and can never be overturned. Because of a thing called non-reproducibility.

In most areas of science, there is nothing to stop a researcher going back over old research and trying to replicate it. The correct term is reproducibility. In every branch of science there is currently an acknowledged crisis with reproducibility.

‘Reproducibility is a hot topic in science at the moment, but is there a crisis? Nature asked 1,576 scientists this question as part of an online survey. Most agree that there is a crisis and over 70% said they’d tried and failed to reproduce another group’s experiments.’ 2

This is not good, but in medical research this issue is magnified many times. Because there is another in-built problem. You cannot reproduce research that has been positive. Take clinical trials into statins. You start with middle aged men, split them into two groups, give one a statin and one a placebo. At the end of your five-year trial, you claim that statins had a benefit – stopped heart attacks and strokes and suchlike.

Once this claim has been made, in this group, it becomes unethical/impossible to replicate this study, in this group – ever again. The ethics committee would tell you that statins have been proven to have a benefit, you cannot withhold a drug with a ‘proven’ benefit from patients. Therefore, you cannot have a placebo arm in your trial. Therefore, you cannot attempt to replicate the findings. Ever.

Thus, if a trial was flawed/biased/corrupt or simply done badly. That’s it. You are going to have to believe the results, and you can never, ever, have another go. Ergo, medicine cannot self-correct through non-reproducibility. Stupidity can now last for ever. In fact, it is built in.

When Evidence Says No, but Doctors Say Yes
by David Epstein

Even if a drug you take was studied in thousands of people and shown truly to save lives, chances are it won’t do that for you. The good news is, it probably won’t harm you, either. Some of the most widely prescribed medications do little of anything meaningful, good or bad, for most people who take them.

In a 2013 study, a dozen doctors from around the country examined all 363 articles published in The New England Journal of Medicine over a decade—2001 through 2010—that tested a current clinical practice, from the use of antibiotics to treat people with persistent Lyme disease symptoms (didn’t help) to the use of specialized sponges for preventing infections in patients having colorectal surgery (caused more infections). Their results, published in the Mayo Clinic Proceedings, found 146 studies that proved or strongly suggested that a current standard practice either had no benefit at all or was inferior to the practice it replaced; 138 articles supported the efficacy of an existing practice, and the remaining 79 were deemed inconclusive. (There was, naturally, plenty of disagreement with the authors’ conclusions.) Some of the contradicted practices possibly affect millions of people daily: Intensive medication to keep blood pressure very low in diabetic patients caused more side effects and was no better at preventing heart attacks or death than more mild treatments that allowed for a somewhat higher blood pressure. Other practices challenged by the study are less common—like the use of a genetic test to determine if a popular blood thinner is right for a particular patient—but gaining in popularity despite mounting contrary evidence. Some examples defy intuition: CPR is no more effective with rescue breathing than if chest compressions are used alone; and breast-cancer survivors who are told not to lift weights with swollen limbs actually should lift weights, because it improves their symptoms.

A separate but similarly themed study in 2012 funded by the Australian Department of Health and Ageing, which sought to reduce spending on needless procedures, looked across the same decade and identified 156 active medical practices that are probably unsafe or ineffective. The list goes on: A brand new review of 48 separate studies—comprising more than 13,000 clinicians—looked at how doctors perceive disease-screening tests and found that they tend to underestimate the potential harms of screening and overestimate the potential benefits; an editorial in American Family Physician, co-written by one of the journal’s editors, noted that a “striking feature” of recent research is how much of it contradicts traditional medical opinion.

That isn’t likely to change any time soon. The 21st Century Cures Act—a rare bipartisan bill, pushed by more than 1,400 lobbyists and signed into law in December—lowers evidentiary standards for new uses of drugs and for marketing and approval of some medical devices. Furthermore, last month President Donald Trump scolded the FDA for what he characterized as withholding drugs from dying patients. He promised to slash regulations “big league. … It could even be up to 80 percent” of current FDA regulations, he said. To that end, one of the president’s top candidates to head the FDA, tech investor Jim O’Neill, has openly advocated for drugs to be approved before they’re shown to work. “Let people start using them at their own risk,” O’Neill has argued.

So, while Americans can expect to see more drugs and devices sped to those who need them, they should also expect the problem of therapies based on flimsy evidence to accelerate. In a recent Stat op-ed, two Johns Hopkins University physician-researchers wrote that the new 21st Century Cures Act will turn the label “FDA approved” into “a shadow of its former self.” In 1962, Congress famously raised the evidentiary bar for drug approvals after thousands of babies were born with malformed limbs to mothers who had taken the sleep aid thalidomide. Steven Galson, a retired rear admiral and former acting surgeon general under both President George W. Bush and President Barack Obama, has called the strengthened approval process created in 1962 the FDA’s “biggest contribution to health.” Before that, he said, “many marketed drugs were ineffective for their labeled uses.”

Striking the right balance between innovation and regulation is incredibly difficult, but once remedies are in use—even in the face of contrary evidence—they tend to persist. A 2007 Journal of the American Medical Association paper coauthored by John Ioannidis—a Stanford University medical researcher and statistician who rose to prominence exposing poor-quality medical science—found that it took 10 years for large swaths of the medical community to stop referencing popular practices after their efficacy was unequivocally vanquished by science.
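As a quick aside, the tallies from the Mayo Clinic Proceedings review quoted above are easy to sanity-check with a few lines of arithmetic. This is only an illustrative sketch; the category labels are paraphrased from the text, not the paper’s own terms.

```python
# Tallies from the Mayo Clinic Proceedings review described above:
# 363 NEJM articles (2001-2010) that each tested a current clinical practice.
# Category labels are paraphrased, not the paper's own terms.
counts = {"contradicted": 146, "supported": 138, "inconclusive": 79}

total = sum(counts.values())
assert total == 363  # the three categories account for every article reviewed

# Percentage share of each category, rounded to one decimal place
shares = {k: round(100 * v / total, 1) for k, v in counts.items()}
print(shares)  # contradicted practices are roughly 40% of the tested set
```

In other words, about four in ten of the tested practices were found to be useless or inferior to what they replaced, which is the striking figure driving the rest of the article.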

Science institute that advised EU and UN ‘actually industry lobby group’
by Arthur Nelson

An institute whose experts have occupied key positions on EU and UN regulatory panels is, in reality, an industry lobby group that masquerades as a scientific health charity, according to a peer-reviewed study.

The Washington-based International Life Sciences Institute (ILSI) describes its mission as “pursuing objectivity, clarity and reproducibility” to “benefit the public good”.

But researchers from the University of Cambridge, Bocconi University in Milan, and the US Right to Know campaign assessed over 17,000 pages of documents under US freedom of information laws to present evidence of influence-peddling.

The paper’s lead author, Dr Sarah Steele, a Cambridge university senior research associate, said: “Our findings add to the evidence that this nonprofit organisation has been used by its corporate backers for years to counter public health policies. ILSI should be regarded as an industry group – a private body – and regulated as such, not as a body acting for the greater good.”

The New Faces of Coke
by Kyle Pfister

Of the 115 individuals Coca-Cola admitted to funding, here’s a breakdown:

By sector, 57% (65) are dietitians, 20% (23) are academics, 7% (8) are medical professionals (mostly Doctors), 6% (7) are fitness experts, 5% (6) are authors, 3% (3) are chefs, and 1% (1) are food representatives. I was not able to identify sectors for two of the funded experts.
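The sector breakdown above can be checked the same way; a small sketch, with the counts taken directly from the text (the two unclassified experts tracked separately):

```python
# Sector counts for the 115 experts Coca-Cola admitted to funding,
# as given in the text; two experts could not be assigned a sector.
sectors = {
    "dietitians": 65,
    "academics": 23,
    "medical professionals": 8,
    "fitness experts": 7,
    "authors": 6,
    "chefs": 3,
    "food representatives": 1,
}
unidentified = 2

# The classified and unclassified experts together account for all 115
assert sum(sectors.values()) + unidentified == 115

# Rounded percentages reproduce the figures quoted in the article
pct = {k: round(100 * v / 115) for k, v in sectors.items()}
print(pct)
```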

Kellogg Paid ‘Independent Experts’ to Promote Its Cereal
by Michael Addady

Kellogg paid council experts an average of $13,000 per year, according to emails and contracts obtained by the Associated Press. The payments were for experts to engage in “nutrition influencer outreach” and to refrain from offering their services to products that were “competitive or negative to cereal.”

Outreach usually meant one of two things: Experts would claim Kellogg was their favorite brand on social media, or they would tout the cereal during public appearances. Kellogg’s spokesperson Kris Charles told Fortune in a statement that the experts’ association with the company was disclosed at public appearances.

Additionally, the experts’ connection to the company may have affected some of their published work. For example, an independent expert was involved in publishing an academic paper in the Journal of the Academy of Nutrition and Dietetics that defined a “quality breakfast.” Kellogg had the opportunity to edit the paper and even asked that the author remove a suggestion about limiting added sugar (something the sugar industry has also been accused of doing with heart disease research).

FDA: Sampling finds toxic nonstick compounds in some food
by Ellen Knickmeyer, John Flesher, and Michael Casey

A federal toxicology report last year cited links between high levels of the compounds in people’s blood and health problems, but said it was not certain the nonstick compounds were the cause.

The levels in nearly half of the meat and fish tested were two or more times over the only currently existing federal advisory level for any kind of the widely used manmade compounds, which are called per- and polyfluoroalkyl substances, or PFAS.

The level in the chocolate cake was higher: more than 250 times the only federal guidelines, which are for some PFAS in drinking water.

Food and Drug Administration spokeswoman Tara Rabin said Monday that the agency thought the contamination was “not likely to be a human health concern,” even though the tests exceeded the sole existing federal PFAS recommendations for drinking water.

Why smelling good could come with a cost to health
by Lauren Zanolli

About 4,000 chemicals are currently used to scent products, but you won’t find any of them listed on a label. Fragrance formulations are considered a “trade secret” and therefore protected from disclosure – even to regulators or manufacturers. Instead, one word, fragrance, appears on ingredients lists for countless cosmetics, personal care and cleaning products. A single scent may contain anywhere from 50 to 300 distinct chemicals.

“No state, federal or global authority is regulating the safety of fragrance chemicals,” says Janet Nudelman, policy director for Breast Cancer Prevention Partners (BCPP) and co-founder of the Campaign for Safe Cosmetics. “No state, federal or global authority even knows which fragrance chemicals appear in which products.”

Three-quarters of the toxic chemicals detected in a test of 140 products came from fragrance, reported a 2018 BCPP study of personal care and cleaning brands. The chemicals identified were linked to chronic health issues, including cancer.

Stress and Shittiness

What causes heart disease – Part 63
by Malcolm Kendrick

To keep this simple, and stripping things down to basics, the concept I am trying to capture, and the word that I am going to use here to describe the factor that can affect entire populations, is ‘psychosocial stress’. By which I mean an environment where there is a breakdown of community and support structures, often poverty, with physical threats and suchlike. A place where you would not really want to walk down the road unaccompanied.

This can be a zip code in the US, known as postcode in the UK. It can be a bigger physical area than that, such as a county, a town, or whole community – which could be split across different parts of a country. Such as native Americans living in areas that are called reservations.

On the largest scale it is fully possible for many countries to suffer from major psychosocial stress at the same time. […] Wherever you look, you can see that populations that have been exposed to significant social dislocation, and major psychosocial stressors, have extremely high rates of coronary heart disease/cardiovascular disease.

The bad news is we’re dying early in Britain – and it’s all down to ‘shit-life syndrome’
by Will Hutton

Britain and America are in the midst of a barely reported public health crisis. They are experiencing not merely a slowdown in life expectancy, which in many other rich countries is continuing to lengthen, but the start of an alarming increase in death rates across all our populations, men and women alike. We are needlessly allowing our people to die early.

In Britain, life expectancy, which increased steadily for a century, slowed dramatically between 2010 and 2016. The rate of increase dropped by 90% for women and 76% for men, to 82.8 years and 79.1 years respectively. Now, death rates among older people have increased so much over the last two years – with expectations that this will continue – that two major insurance companies, Aviva and Legal and General, are releasing hundreds of millions of pounds they had been holding as reserves to pay annuities, to pay to shareholders instead. Society, once again, affecting the citadels of high finance.

Trends in the US are more serious and foretell what is likely to happen in Britain without an urgent change in course. Death rates of people in midlife (between 25 and 64) are increasing across the racial and ethnic divide. It has long been known that the mortality rates of midlife American black and Hispanic people have been worse than the non-Hispanic white population, but last week the British Medical Journal published an important study re-examining the trends for all racial groups between 1999 and 2016.

The malaises that have plagued the black population are extending to the non-Hispanic, midlife white population. As the report states: “All cause mortality increased… among non-Hispanic whites.” Why? “Drug overdoses were the leading cause of increased mortality in midlife, but mortality also increased for alcohol-related conditions, suicides and organ diseases involving multiple body systems” (notably liver, heart diseases and cancers).

US doctors coined a phrase for this condition: “shit-life syndrome”. Poor working-age Americans of all races are locked in a cycle of poverty and neglect, amid wider affluence. They are ill educated and ill trained. The jobs available are drudge work paying the minimum wage, with minimal or no job security. They are trapped in poor neighbourhoods where the prospect of owning a home is a distant dream. There is little social housing, scant income support and contingent access to healthcare. Finding meaning in life is close to impossible; the struggle to survive commands all intellectual and emotional resources. Yet turn on the TV or visit a middle-class shopping mall and a very different and unattainable world presents itself. Knowing that you are valueless, you resort to drugs, antidepressants and booze. You eat junk food and watch your ill-treated body balloon. It is not just poverty, but growing relative poverty in an era of rising inequality, with all its psychological side-effects, that is the killer.

The UK is not just suffering shit-life syndrome. We’re also suffering shit-politician syndrome.
by Richard Murphy

Will Hutton has an article in the Guardian in which he argues that the recent decline in the growth of life expectancy in the UK (and its decline in some parts) is down to what he describes as ‘shit-life syndrome’. This is the state where life is reduced to an exercise in mere survival as a result of the economic and social oppression lined up against those suffering the condition. And, as he points out, those suffering are not just those on the economic and social margins of society. In the UK, as in the US, the syndrome is spreading.

The reasons for this can be debated. I engaged in such argument in my book The Courageous State. In that book I argued that we live in a world where those with power do now, when they identify a problem, run as far as they might from it and say the market will find a solution. The market won’t do that. It is designed not to do so. Those suffering shit-life syndrome have, by default, little impact on the market. That’s one of the reasons why they are suffering the syndrome in the first place. That is why so much of current politics has turned a blind eye to this issue.

And they get away with it. That’s because the world of make-believe advertising which drives the myths that underpin the media, and in turn our politics, simply pretends such a syndrome does not exist whilst at the same time perpetually reinforcing the sense of dissatisfaction that is at its core.

With Brexit, It’s the Geography, Stupid
by Dawn Foster

One of the major irritations of public discourse after the United Kingdom’s Brexit vote has been the complete poverty of analysis on the reasons behind different demographics’ voting preferences. Endless time, energy, and media attention has been afforded to squabbling over the spending of each campaign for and against continued European Union membership — and now more on the role social media played in influencing the vote — mirroring the arguments in the United States that those who voted to Leave were, like Trump voters, unduly influenced by shady political actors, with little transparency behind political ads and social media tactics.

It’s a handy distraction from the root causes in the UK: widening inequality, but also an increasingly entrenched economic system that is geographically specific, meaning your place of birth and rearing has far more influence over how limited your life is than anything within your control: work, education and life choices.

Across Britain, territorial injustice is growing: for decades, London has boomed in comparison to the rest of the country, with more and more wealth being sucked towards the southeast and other regions being starved of resources, jobs and infrastructure as a result. A lack of secure and well-remunerated work doesn’t just determine whether you can get by each month without relying on social security to make ends meet, but also all aspects of your health, and the health of your children. A recent report by researchers at Cambridge University examined the disproportionate effect of central government cuts on local authorities and services: inner city areas with high rates of poverty, and former industrial areas, were hardest hit. Mia Gray, one of the authors of the Cambridge report, said: “Ever since vast sums of public money were used to bail out the banks a decade ago, the British people have been told that there is no other choice but austerity imposed at a fierce and relentless rate. We are now seeing austerity policies turn into a downward spiral of disinvestment in certain people and places. This could affect the life chances of entire generations born in the wrong part of the country.”

Life expectancy is perhaps the starkest example. In many other rich countries, life expectancy continues to grow. In the United Kingdom it is not only stalling, but in certain regions falling. The gap between the north and south of England reveals the starkest gap in deaths among young people: in 2015, 29.3 percent more 25-34-year-olds died in the north of England than the south. For those aged 35-44, the number of deaths in the north was 50 percent higher than the south.

In areas left behind economically, such as the ex-mining towns in the Welsh valleys, the post-industrial north of England, and former seaside holiday destinations that have been abandoned as people plump for cheap European breaks, doctors informally describe the myriad tangle of health, social and economic problems besieging people as “Shit Life Syndrome”. The term, brought to public attention by the Financial Times, sounds flippant, but it attempts to tease out the cumulative impact of strict and diminished life chances, poor health worsened by economic circumstances, and the effects of low paid work and unemployment on mental health, and lifestyle issues such as smoking, heavy drinking, and lack of exercise, factors worsened by a lack of agency in the lives of people in the most deprived areas. Similar to “deaths of despair” in the United States, Shit Life Syndrome leads to stark upticks in avoidable deaths due to suicide, accidents, and overdoses: several former classmates who remained in the depressed Welsh city I grew up in have taken their own lives, overdosed, or died as a result of accidents caused by alcohol or drugs. Their lives prior to death were predictably unhappy, but the opportunity to turn things around simply didn’t exist. To move away, you need money and therefore a job. The only vacancies that appear pay minimum wage, and usually you’re turned away without interview.

Simply put, it’s a waste of lives on an industrial scale, but few people notice or care. One of the effects of austerity is the death of public spaces where people can gather without being forced to spend money. Youth clubs no longer exist, and public health officials blame their demise for the rise in teenagers becoming involved in gangs and drug dealing in inner cities. Libraries are closing at a rate of knots, despite the government requiring all benefits claims to be submitted via computers. More and more public spaces and playgrounds are being sold off to land-hungry developers, forcing more and more people to shoulder their misery alone, depriving them of spaces and opportunities to meet people and socialise. Shame is key in perpetuating the sense that poverty is deserved, but isolation and loneliness help exacerbate the self-hatred that stops you fighting back against your circumstances.

“Shit-Life Syndrome” (Oxycontin Blues)
by Curtis Price

In narrowing drug use to a legal or public health problem, as many genuinely concerned about the legal and social consequences of addiction will argue, I believe a larger politics and political critique gets lost (This myopia is not confined to drug issues. From what I’ve seen, much of the “social justice” perspective in the professional care industry is deeply conservative; what gets argued for amounts to little more than increased funding for their own services and endless expansion of non-profits). Drug use, broadly speaking, doesn’t take place in a vacuum. It is a thermometer for social misery and the more social misery, the greater the use. In other words, it’s not just a matter of the properties of the drug or the psychological states of the individual user, but also of the social context in which such actions play out.

If we accept this as a yardstick, then it’s no accident that the loss of the 1984-1985 U.K. Miners’ Strike, with the follow-on closure of the pits and destruction of pit communities’ tight-knit ways of life, triggered widespread heroin use (2). What followed the defeat of the Miners’ Strike only telescoped into a few years the same social processes that in much of the U.S. were drawn out, more prolonged, insidious, and harder to detect. Until, that is, the mortality rates – that canary in the epidemiological coalmine – sharply rose to everyone’s shock.

US doctors have coined a phrase for the underlying condition of which drug use and alcoholism is just part: “shit-life syndrome.” As Will Hutton in the Guardian describes it,

“Poor working-age Americans of all races are locked in a cycle of poverty and neglect, amid wider affluence. They are ill educated and ill trained. The jobs available are drudge work paying the minimum wage, with minimal or no job security. They are trapped in poor neighborhoods where the prospect of owning a home is a distant dream. There is little social housing, scant income support and contingent access to healthcare. Finding meaning in life is close to impossible; the struggle to survive commands all intellectual and emotional resources. Yet turn on the TV or visit a middle-class shopping mall and a very different and unattainable world presents itself. Knowing that you are valueless, you resort to drugs, antidepressants and booze. You eat junk food and watch your ill-treated body balloon. It is not just poverty, but growing relative poverty in an era of rising inequality, with all its psychological side-effects, that is the killer”(3).

This accurately sums up “shit-life syndrome.” So, by all means, stop locking up non-violent drug offenders and increase drug treatment options. But as worthwhile as these steps may be, they will do nothing to alter “shit-life syndrome.” “Shit-life syndrome” is just one more expression of the never-ending cruelty of capitalism, an underlying cruelty inherent in the way the system operates, that can’t be reformed out, and won’t disappear until new ways of living and social organization come into place.

The Human Kind, A Doctor’s Stories From The Heart Of Medicine
Peter Dorward
p. 155-157

It’s not like this for all kinds of illness, of course. Illness, by and large, is as solid and real as the chair I’m sitting on: and nothing I say or believe about it will change its nature. That’s what people mean when they describe an illness as ‘real’. You can see it and touch it, and if you can’t do that, then at least you can measure it. You can weigh a tumour; you can see on the screen the ragged outline of the plaque of atheroma in your coronary artery which is occluded and crushing the life out of you, and you would be mad to question the legitimacy of this condition that prompts the wiry cardiologist to feed the catheter down the long forks and bends of your clogged arterial tree in order to feed an expanding metal stent into the blocked artery and save you.

No one questions the reality and medical legitimacy of those things in the world that can be seen, felt, weighed, touched. That creates a deep bias in the patient; it creates a profound preference among us, the healers.

But a person is interactive. Minds can’t exist independently of other minds: that’s the nature of our kind. The names we have for things in the world and the way that we choose to talk about them affect how we experience them. Our minds are made of language, and grammar, intentions, emotions, perceptions and memory. We can only experience the world through the agency of our minds, and how our minds interact with others. Science is a great tool for talking about the external world: the world that is indifferent to what we think. Science doesn’t begin to touch the other, inner, social stuff. And that’s a challenge in medicine. You need other tools for that.

‘Shit-life syndrome,’ offers Becky, whose skin is so pale it looks translucent, who wears white blouses with little ruffs buttoned to the top and her blonde hair in plaits, whose voice is vicarage English and in whose mouth shit life sounds anomalous. Medicine can have this coarsening effect. ‘Shit-life syndrome provides the raw material. We doctors do all the rest.’

‘Go on…’

‘That’s all I ever seem to see in GP. People whose lives are non-specifically crap. Women single parenting too many children, doing three jobs which they hate, with kids on Ritalin, heads wrecked by smartphone and tablet parenting. Women who hate their bodies and have a new diagnosis of diabetes because they’re too fat. No wonder they want a better diagnosis! What am I meant to do?’

I like to keep this tutorial upbeat. I don’t like it to become a moan-fest, which is pointless and damaging. Yet, I don’t want to censor.

‘… Sometimes I feel like a big stone, dropped into a river of pain. I create a few eddies around me, the odd wave or ripple, but the torrent just goes on…’

‘… I see it differently. It’s worse! I think half the time we actually cause the problems. Or at least we create our own little side channels in the torrent. Build dams. Deep pools of misery of our own creation!’

That’s Nadja. She’s my trainee. And I recognise something familiar in what she is saying – the echo of something that I have said to her. It’s flattering, and depressing.

‘For example, take the issuing of sick notes. They’re the worst. We have all of these people who say they’re depressed, or addicted, or stressed, who stay awake all night because they can’t sleep for worry, and sleep all day so they can’t work, and they say they’re depressed or anxious, or have backache or work-related stress, and we drug them up and sign them off, but what they’re really suffering from are the symptoms of chronic unemployment and the misery of poverty, which are the worst illnesses that there are! And every time I sign one of these sick notes, I feel another little flake chipped off my integrity. You’re asking about vectors for social illness? Sick notes! It’s like we’re … shitting in the river, and worrying about the cholera!’

Strong words. I need to speak to Nadja about her intemperate opinions…

‘At least, that’s what he keeps saying,’ says Nadja, nodding at me.

Nadja’s father was a Croatian doctor, who fled the war there. Brought up as she was, at her father’s knee, on his stories of war and torture, of driving his motorbike between Kiseljac and Sarajevo and all the villages in between with his medical bag perched on the back to do his house calls, she can never quite believe the sorts of things that pass for ‘suffering’ here. It doesn’t make Nadja a more compassionate doctor. She sips her coffee, with a smile.

Aly, the one training to be an anaesthetist-traumatologist, says, ‘We shouldn’t do it. Simple as that. It’s just not medicine. We should confine ourselves to the physical, and send the rest to a social worker, or a counsellor or a priest. No more sick notes, no more doing the dirty work of governments. If society has a problem with unemployment, that’s society’s problem, not mine. No more convincing people that they’re sick. No more prescriptions for crap drugs that don’t work. If you can’t see it or measure it, it isn’t real. We’re encouraging all this pseudo-­illness with our sick notes and our crap drugs. What’s our first duty? Do no harm! End of.’

She’ll be a great trauma doctor, no doubt about it.

* * *

From Bad to Worse: Trends Across Generations
Rate And Duration of Despair
Trauma, Embodied and Extended
Facing Shared Trauma and Seeking Hope
Society: Precarious or Persistent?
Union Membership, Free Labor, and the Legacy of Slavery
The Desperate Acting Desperately
Social Disorder, Mental Disorder
Social Conditions of an Individual’s Condition
Society and Dysfunction
It’s All Your Fault, You Fat Loser!
To Grow Up Fast
Individualism and Isolation
To Put the Rat Back in the Rat Park
Rationalizing the Rat Race, Imagining the Rat Park
The Unimagined: Capitalism and Crappiness
Stress Is Real, As Are The Symptoms
On Conflict and Stupidity
Connecting the Dots of Violence
Inequality in the Anthropocene
Morality-Punishment Link