From the beginning of the country, there has been an American fear of moral and mental decline that was always rooted in the physical, involving the vitality of the land and the health of the body, and built on an ancient divide between the urban and the rural. Over time, it rose to a fever pitch of moral panic about the degeneration and degradation of WASP culture, the white race, and maybe civilization itself. Some believed the end was near, that the nation might hold out for another few generations before finally succumbing to disease and weakness. The need for revitalization and rebirth became a collective project (Jackson Lears, Rebirth of a Nation), one that sadly fed into ethno-nationalist bigotry and imperialistic war-mongering: Make America Great Again!
A major point of crisis, of course, was the Civil War. Racial ideology became predominant, not only because of slavery but maybe more so because of mass immigration, the latter arguably being the main reason the North won. Racial tensions merged with the developing scientific mindset of Darwinism, and out of this mix came eugenics. However easily we can now dismiss this kind of simplistic ignorance, and with hindsight see the danger it led to, the underlying anxieties were real. Urbanization and industrialization were having an obvious impact on public health, observed by many, and it wasn’t limited to mere physical ailments. “Cancer, like insanity, seems to increase with the progress of civilization,” noted Stanislas Tanchou, a mid-19th-century French physician.
The diseases of civilization, including mental sickness, have been spreading for centuries (millennia, actually, considering that the ‘modern’ chronic health conditions were first detected in the mummies of the agricultural Egyptians). Consider how talk of depression suddenly showed up in written accounts with the ending of feudalism (Barbara Ehrenreich, Dancing in the Streets). That era included the enclosure movement, which forced millions of newly landless peasants into the desperate conditions of crowded cities and colonies, where they faced stress, hunger, malnutrition, and disease. The loss of rural life hit Europe much earlier than America, but it eventually came here as well. The majority of white Americans were urban by the beginning of the 20th century, and the majority of black Americans were urban by the 1970s. There has been a consistent pattern of mass problems following urbanization, everywhere it happens. It is still happening. The younger generation, more urbanized than any before it, is seeing rising rates of psychosis that are concentrated in the most urbanized areas.
In the United States, the last decades of the 19th century were the turning point, the period of the first truly big cities. Into this milieu, Weston A. Price was born (1870) in a small rural village in Canada. As an adult, he became a dentist and sought work in Cleveland, Ohio (1893). Initially, most of his patients probably had, like him, grown up in rural areas. But over the decades, he was increasingly exposed to younger generations who had spent their entire lives in the city. Lierre Keith puts Price’s early observations in context, after pointing out that he started his career in 1893: “This date is important, as he entered the field just prior to the glut of industrial food. Over the course of the next thirty years, he watched children’s dentition — and indeed their overall health — deteriorate. There were suddenly children whose teeth didn’t fit in their mouths, children with foreshortened jaws, children with lots of cavities. Not only were their dental arches too small, but he noticed their nasal passages were also too narrow, and they had poor health overall; asthma, allergies, behavioral problems” (The Vegetarian Myth, p. 187). This was the time when the industrialization of farming and food had reached a new level, far beyond the limited availability of canned foods in the mid-to-late 1800s, when most Americans still relied heavily on wild-sourced meat, fish, nuts, etc. Even city-dwellers in early America had ready access to wild game because of the abundance of surrounding wilderness areas. In fact, in the 19th century, the average American ate more meat (mostly hunted) than bread.
We are once again caught up in the recurrent moral panic about the civilizational project. The same fears given voice in the late 19th to early 20th century are being repeated. For example, Dr. Leonard Sax alerts us to how girls are sexually maturing earlier (1% of female infants show signs of puberty), whereas boys are maturing later. As a comparison, hunter-gatherers don’t have such a large gender disparity in puberty, nor do their girls reach puberty so early; both sexes typically come to puberty around 18 years old, with sex, pregnancy, and marriage happening more or less simultaneously. Dr. Sax, along with others, speculates about a number of reasons. Common causes held responsible include health factors, from diet to chemicals. Beyond altered puberty, many other examples could be added: heart disease, autoimmune disorders, mood disorders, autism, ADHD, etc., all of them increasing and worsening with each generation (e.g., type 2 diabetes used to be known as adult-onset diabetes but now is regularly diagnosed in young children; the youngest victim recorded recently was three years old at diagnosis).
In the past, Americans responded to moral panic with: genocide of Native Americans; Prohibition, which targeted ethnic (hyphenated) Americans and the poor; immigration restrictions to keep the bad sort out; the spread of racism and vigilantism, such as the KKK, Jim Crow, sundown towns, and redlining; forced assimilation, such as English-only laws and public schools; internment camps, not only for Japanese-Americans but also for German-Americans and Italian-Americans; citizen-making projects like the national park system, the Boy Scouts, the WPA, and the CCC; the promotion of eugenics; the war on poverty (i.e., the war on the poor); imperial expansionism, neo-colonial exploitation, and world wars; et cetera. The cure sought was often something to be forced onto the population by a paternalistic elite, that is to say rich white males, most specifically WASPs of the capitalist class.
Eugenics was, of course, one of the main focuses as it carried the stamp of science (or rather scientism). Yet at the same time, there were those challenging biological determinism and race realism, as views shifted toward environmental explanations. The anthropologists were at the front lines of this battle, but there were also Social Christians who changed their minds after having seen poverty firsthand. Weston A. Price, however, didn’t come to this from a consciously ideological position or religious motivation. He was simply a dentist who couldn’t ignore the severe health issues of his patients. So, he decided to travel the world in order to find healthy populations to study, in the hope of explaining why the change had occurred (Nutrition and Physical Degeneration).
Although Price was familiar with the eugenics literature, what he observed in ‘primitive’ communities (including isolated villages in Europe) did not conform to eugenicist thought. It didn’t matter which population he looked at: those who ate traditional diets were healthy, and those who ate an industrialized Western diet were not. It was a broad pattern that he saw everywhere he went, covering not only physical health but also neurocognitive health, as indicated by happiness, low anxiety, and moral character. Instead of blaming individuals or races, he saw the common explanation as nutrition, and he made a strong case by scientifically analyzing the nutrition of the available foods.
In reading about traditional foods, the paleo diet/lifestyle, and functional medicine, Price’s work comes up quite often. He took many photographs comparing people from healthy and unhealthy populations. The contrast is stark. But what really stands out is how few people in the modern world look anywhere near as healthy as those from the healthiest societies of the past. I live in a fairly wealthy college and medical town with a far-above-average concern for health and access to healthcare. Even so, I now can’t help noticing how many people around me show signs of stunted or perturbed development of the exact kind Price observed in great detail: thin bone structure, sunken chests, sloping shoulders, narrow facial features, asymmetry, etc. That is even with modern healthcare correcting some of the worst conditions: cavities, underbites, pigeon-toes, etc. My fellow residents in this town are among the most privileged people in the world, and nonetheless their health is a sad commentary on the state of humanity at present.
It makes me wonder, as it made Price wonder, what consequences this has for the neurocognitive health of individuals and the moral health of society. Taken alone, it isn’t enough to get excited about. But put in the larger context of looming catastrophes, it does become concerning. It’s not clear that our health will be up to the task of solving the problems we face. We are a sickly population, far more sickly than when moral panic took hold in past generations.
Just as important is the personal component. I’m at a point where I’m not going to worry too much about the decline and maybe collapse of civilization. I’m kind of hoping the American Empire will meet its demise. Still, that leaves us with the many who suffer, no matter what happens to society as a whole. I take that personally, as one who has struggled with physical and mental health issues. And I’ve come around to Price’s view of nutrition as key. I see these problems in other members of my family, and it saddens me to watch as health conditions seem to get worse from one generation to the next.
The central point I’m trying to make here is far from a new problem. Talking to my mother, I find she has a clear sense of the differences between the two sides of her family. Her mother’s family came from rural areas and, even after moving to a larger city for work, they continued to hunt on a daily basis, as nearby fields and woods made that possible. They were a healthy, happy, and hard-working lot. They got along well as a family. Her father’s side of the family was far different. They had been living in towns and cities for several generations by the time she was born. They didn’t hunt at all. They were known for being surly, holding grudges, and being mean drunks. They also had underbites (i.e., underdeveloped jaw structure) and seemed to have had learning disabilities, though no one was diagnosing such conditions back then. Related to this difference, my mother’s father raised rabbits whereas my mother’s mother’s family hunted rabbits (and other wild game). This makes a big difference in terms of nutrition, as wild game has higher levels of omega-3 fatty acids and fat-soluble vitamins, all of which are key to optimal health and development.
What my mother observed in her family is basically the same as what Price observed in hundreds of communities in multiple countries on every continent. And I now observe the same pattern repeating. I grew up with an underbite. My brothers and I all required orthodontic work, as do so many now. I was diagnosed with a learning disability when young. My oldest brother, though maybe not learning disabled, showed clear behavioral issues when he was young, likely related to his mildew allergies and probably an underlying autoimmune condition. I know I had food allergies as a child, as I think my other brother did as well. All of us have had a fair diversity of neurocognitive and psychological issues besides learning disabilities: stuttering, depression, anxiety, and maybe some Asperger’s.
Now another generation is coming along with increasing rates of major physical and mental health issues. My nieces and nephews are sick all the time. They don’t eat well and are probably malnourished. During a medical checkup for my nephew, my mother asked the doctor about his extremely unhealthy diet, consisting mostly of white bread and sugar. The doctor bizarrely dismissed it as ‘normal’ on the grounds that, as she claimed, no kid eats healthy. If that is the new normal, maybe we should be in a moral panic.
* * *
Violent Behavior: A Solution in Plain Sight
by Sylvia Onusic
Nutrition and Mental Development
by Sally Fallon Morell
You Are What You Eat: The Research and Legacy of Dr. Weston Andrew Price
by John Larabell
While practicing in his Cleveland office, Dr. Price noticed an increase in dental problems among the younger generations. These issues included the obvious dental caries (cavities) as well as improper jaw development leading to crowded, crooked teeth. In fact, the relatively new orthodontics industry was at that time beginning to gain popularity. Perplexed by these modern problems that seemed to be affecting a greater and greater portion of the population, Dr. Price set about to research the issue by examining people who did not display such problems. He suspected (correctly, as he would later find) that many of the dental problems, as well as other degenerative health problems, that were plaguing modern society were the result of inadequate nutrition owing to the increasing use of refined, processed foods.
Nasty, Brutish and Short?
by Sally Fallon Morell
It seems as if the twentieth century will exit with a crescendo of disease. Things were not so bad back in the 1930’s, but the situation was already serious enough to cause one Cleveland, Ohio dentist to be concerned. Dr. Weston Price was reluctant to accept the conditions exhibited by his patients as normal. Rarely did an examination of an adult patient reveal anything but rampant decay, often accompanied by serious problems elsewhere in the body, such as arthritis, osteoporosis, diabetes, intestinal complaints and chronic fatigue. (They called it neurasthenia in Price’s day.) But it was the dentition of younger patients that alarmed him most. Price observed that crowded, crooked teeth were becoming more and more common, along with what he called “facial deformities”—overbites, narrowed faces, underdevelopment of the nose, lack of well-defined cheekbones and pinched nostrils. Such children invariably suffered from one or more complaints that sound all too familiar to mothers of the 1990’s: frequent infections, allergies, anemia, asthma, poor vision, lack of coordination, fatigue and behavioral problems. Price did not believe that such “physical degeneration” was God’s plan for mankind. He was rather inclined to believe that the Creator intended physical perfection for all human beings, and that children should grow up free of ailments.
Is it Mental or is it Dental?
by Raymond Silkman
The widely held model of orthodontics, which considers developmental problems in the jaws and head to be genetic in origin, never made sense to me. Since they are wedded to the genetic model, orthodontists dealing with crowded teeth end up treating the condition with tooth extraction in a majority of the cases. Even though I did not resort to pulling teeth in my practice, and I was using appliances to widen the jaws and getting the craniums to look as they should, I still could not come up with the answer as to why my patients looked the way they did. I couldn’t believe that the Creator had given them a terrible blueprint—it just did not make sense. In four years of college education, four years of dental school education and almost three years of post-graduate orthodontic training, students never hear a mention of Dr. Price, so they never learn the true reasons for these malformations. I have had the opportunity to work with a lot of very knowledgeable doctors in various fields of allopathic and alternative healthcare who still do not know about Dr. Price and his critical findings.
These knowledgeable doctors have not stared in awe at the beautiful facial development that Price captured in the photographs he took of primitive peoples throughout the globe and in so doing was able to answer this most important question: What do humans look like in health? And how have humans been able to carry on throughout history and populate such varied geographical and physical environments on the earth without our modern machines and tools?
The answer that Dr. Price was able to illuminate came through his photographs of beautiful, healthy human beings with magnificent physical form and mental development, living in harmony with their environments. […]
People who are not well oxygenated and who have poor posture often suffer from fatigue and fibromyalgia symptoms, they snore and have sleep apnea, they have sinusitis and frequent ear infections. Life becomes psychologically and physically challenging for them and they end up with long-term dependence on medications—and all of that just from the seemingly simple condition of crowded teeth.
In other words, people with poor facial development are not going to live very happily. […]
While very few people have heard of the work of Weston Price these days, we haven’t lost our ability to recognize proper facial form. To make it in today’s society, you must have good facial development. You’re not going to see a general or a president with a weak chin, you’re not going to see coaches with weak chins, you’re not going to see a lot of well-to-do personalities in the media with underdeveloped faces and chins. You don’t see athletes and newscasters with narrow palates and crooked teeth.
Weston A. Price: An Unorthodox Dentist
by Nourishing Israel
Price discovered that the native foods eaten by the isolated populations were far more nutrient dense than the modern foods. In the first generation that changed their diet there was noticeable tooth decay; in subsequent generations the dental and facial bone structure changed, as well as other changes that were seen in American and European families and previously considered to be the result of interracial marriage.
By studying the different routes that the same populations had taken – traditional versus modern diet – he saw that the health of the children is directly related to the health of the parents and the germ plasms that they provide, which are as important to the child’s makeup as the health of the mother before and during pregnancy.
Price also found that primitive populations were very conscious of the importance of the mothers’ health and many populations made sure that girls were given a special diet for several months before they were allowed to marry.
Another interesting finding was that although genetic makeup was important, it did not have as great a degree of influence on a person’s development and health as was thought, but that a lot of individual characteristics, including brain development and brain function, were due to environmental influence, what he called “intercepted heredity”.
The origin of personality and character appear in the light of the newer data to be biologic products and to a much less degree than usually considered pure hereditary traits. Since these various factors are biologic, being directly related to both the nutrition of the parents and to the nutritional environment of the individuals in the formative and growth period any common contributing factor such as food deficiencies due to soil depletion will be seen to produce degeneration of the masses of people due to a common cause. Mass behavior therefore, in this new light becomes the result of natural forces, the expression of which may not be modified by propaganda but will require correction at the source. [1] …
It will be easy for the reader to be prejudiced since many of the applications suggested are not orthodox. I suggest that conclusions be deferred until the new approach has been used to survey the physical and mental status of the reader’s own family, of his brothers and sisters, of associated families, and finally, of the mass of people met in business and on the street. Almost everyone who studies the matter will be surprised that such clear-cut evidence of a decline in modern reproductive efficiency could be all about us and not have been previously noted and reviewed.[2]
From Nutrition and Physical Degeneration by Weston Price
Food Freedom – Nourishing Raw Milk
by Lisa Virtue
In 1931 Price visited the people of the Loetschental Valley in the Swiss Alps. Their diet consisted of rye bread, milk, cheese and butter, including meat once a week (Price, 25). The milk was collected from pastured cows, and was consumed raw: unpasteurized, unhomogenized (Schmid, 9).
Price described these people as having “stalwart physical development and high moral character…superior types of manhood, womanhood and childhood that Nature has been able to produce from a suitable diet and…environment” (Price, 29). At this time, tuberculosis had taken more lives in Switzerland than any other disease. The Swiss government ordered an inspection of the valley, revealing not a single case. No deaths had been recorded from tuberculosis in the history of the Loetschental people (Schmid, 8). Upon return home, Price had dairy samples from the valley sent to him throughout the year. These samples were higher in minerals and vitamins than samples from commercial (thus pasteurized) dairy products in America and the rest of Europe. The Loetschental milk was particularly high in fat soluble vitamin D (Schmid, 9).
The daily intake of calcium and phosphorus, as well as fat soluble vitamins, would have been higher than that of average North American children. These children were strong and sturdy, playing barefoot in the glacial waters into the late chilly evenings. Of all the children in the valley eating primitive foods, cavities were detected at an average of 0.3 per child (Price, 25). This without visiting a dentist or physician, for the valley had none, seeing as there was no need (Price, 23). To offer some perspective, the rate of cavities per child between the ages of 6-19 in the United States has been recorded to be 3.25, over 10 times the rate seen in Loetschental (Nagel).
Price offers some perspective on a society subsisting mainly on raw dairy products: “One immediately wonders if there is not something in the life-giving vitamins and minerals of the food that builds not only great physical structures within which their souls reside, but builds minds and hearts capable of a higher type of manhood…” (Price, 26).
100 Years Before Weston Price
by Nancy Henderson
Like Price, Catlin was struck by the beauty, strength and demeanor of the Native Americans. “The several tribes of Indians inhabiting the regions of the Upper Missouri. . . are undoubtedly the finest looking, best equipped, and most beautifully costumed of any on the Continent.” Writing of the Blackfoot and Crow, tribes who hunted buffalo on the rich glaciated soils of the American plains, he wrote: “They are the happiest races of Indian I have met—picturesque and handsome, almost beyond description.”
“The very use of the word savage,” wrote Catlin, “as it is applied in its general sense, I am inclined to believe is an abuse of the word, and the people to whom it is applied.” […]
As did Weston A. Price one hundred years later, Catlin noted the fact that moral and physical degeneration came together with the advent of civilized society. In his late 1830s portrait of “Pigeon’s Egg Head (The Light) Going to and Returning from Washington,” Catlin painted him corrupted with “gifts of the great white father” upon his return to his native homeland. Those gifts included two bottles of whiskey in his pockets. […]
Like Price, Catlin discusses the issue of heredity versus environment. “No diseases are natural,” he writes, “and deformities, mental and physical, are neither hereditary nor natural, but purely the result of accidents or habits.”
So wrote Dr. Price: “Neither heredity nor environment alone cause our juvenile delinquents and mental defectives. They are cripples, physically, mentally and morally, which could have and should have been prevented by adequate education and by adequate parental nutrition. Their protoplasm was not normally organized.”
The Right Price
by Weston A. Price Foundation
Many commentators have criticized Price for attributing “decline in moral character” to malnutrition. But it is important to realize that the subject of “moral character” was very much on the minds of commentators of his day. As with changes in facial structure, observers in the first half of the 20th century attributed “badness” in people to race mixing, or to genetic defects. Price quotes A.C. Jacobson, author of a 1926 publication entitled Genius (Some Revaluations),35 who stated that “The Jekyll-Hydes of our common life are ethnic hybrids.” Said Jacobson, “Aside from the effects of environment, it may safely be assumed that when two strains of blood will not mix well a kind of ‘molecular insult’ occurs which the biologists may some day be able to detect beforehand, just as blood is now tested and matched for transfusion.” The implied conclusion to this assertion is that “degenerates” can be identified through genetic testing and “weeded out” by sterilizing the unfit, something that was imposed on many women during the period and endorsed by powerful individuals, including Oliver Wendell Holmes.
It is greatly to Price’s credit that he objected to this arrogant point of view: “Most current interpretations are fatalistic and leave practically no escape from our succession of modern physical, mental and moral cripples. . . If our modern degeneration were largely the result of incompatible racial stocks as indicated by these premises, the outlook would be gloomy in the extreme.”36 Price argued that nutritional deficiencies affecting the physical structure of the body can also affect the brain and nervous system; and that while “bad” character may be the result of many influences—poverty, upbringing, displacement, etc.—good nutrition also plays a role in creating a society of cheerful, compassionate individuals.36
Rebirth of a Nation:
The Making of Modern America, 1877-1920
By Jackson Lears
pp. 7-9
By the late nineteenth century, dreams of rebirth were acquiring new meanings. Republican moralists going back to Jefferson’s time had long fretted about “overcivilization,” but the word took on sharper meaning among the middle and upper classes in the later decades of the nineteenth century. During the postwar decades, “overcivilization” became not merely a social but an individual condition, with a psychiatric diagnosis. In American Nervousness (1880), the neurologist George Miller Beard identified “neurasthenia,” or “lack of nerve force,” as the disease of the age. Neurasthenia encompassed a bewildering variety of symptoms (dyspepsia, insomnia, nocturnal emissions, tooth decay, “fear of responsibility, of open places or closed places, fear of society, fear of being alone, fear of fears, fear of contamination, fear of everything, deficient mental control, lack of decision in trifling matters, hopelessness”), but they all pointed to a single overriding effect: a paralysis of the will.
The malady identified by Beard was an extreme version of a broader cultural malaise—a growing sense that the Protestant ethic of disciplined achievement had reached the end of its tether, had become entangled in the structures of an increasingly organized capitalist society. Ralph Waldo Emerson unwittingly predicted the fin de siècle situation. “Every spirit makes its house,” he wrote in “Fate” (1851), “but afterwards the house confines the spirit.” The statement presciently summarized the history of nineteenth-century industrial capitalism, on both sides of the Atlantic.
By 1904, the German sociologist Max Weber could put Emerson’s proposition more precisely. The Protestant ethic of disciplined work for godly ends had created an “iron cage” of organizations dedicated to the mass production and distribution of worldly goods, Weber argued. The individual striver was caught in a trap of his own making. The movement from farm to factory and office, and from physical labor outdoors to sedentary work indoors, meant that more Europeans and North Americans were insulated from primary processes of making and growing. They were also caught up in subtle cultural changes—the softening of Protestantism into platitudes; the growing suspicion that familiar moral prescriptions had become mere desiccated, arbitrary social conventions. With the decline of Christianity, the German philosopher Friedrich Nietzsche wrote, “it will seem for a time as though all things had become weightless.”
Alarmists saw these tendencies as symptoms of moral degeneration. But a more common reaction was a diffuse but powerful feeling among the middle and upper classes—a sense that they had somehow lost contact with the palpitating actuality of “real life.” The phrase acquired unprecedented emotional freight during the years around the turn of the century, when reality became something to be pursued rather than simply experienced. This was another key moment in the history of longing, a swerve toward the secular. Longings for this-worldly regeneration intensified when people with Protestant habits of mind (if not Protestant beliefs) confronted a novel cultural situation: a sense that their way of life was being stifled by its own success.
On both sides of the Atlantic, the drive to recapture “real life” took myriad cultural forms. It animated popular psychotherapy and municipal reform as well as avant-garde art and literature, but its chief institutional expression was regeneration through military force. As J. A. Hobson observed in Imperialism (1902), the vicarious identification with war energized jingoism and militarism. By the early twentieth century, in many minds, war (or the fantasy of it) had become the way to keep men morally and physically fit. The rise of total war between the Civil War and World War I was rooted in longings for release from bourgeois normality into a realm of heroic struggle. This was the desperate anxiety, the yearning for rebirth, that lay behind official ideologies of romantic nationalism, imperial progress, and civilizing mission—and that led to the trenches of the Western Front.
Americans were immersed in this turmoil in peculiarly American ways. As the historian Richard Slotkin has brilliantly shown, since the early colonial era a faith in regeneration through violence underlay the mythos of the American frontier. With the closing of the frontier (announced by the U.S. census in 1890), violence turned outward, toward empire. But there was more going on than the refashioning of frontier mythology. American longings for renewal continued to be shaped by persistent evangelical traditions, and overshadowed by the shattering experience of the Civil War. American seekers merged Protestant dreams of spiritual rebirth with secular projects of purification—cleansing the body politic of secessionist treason during the war and political corruption afterward, reasserting elite power against restive farmers and workers, taming capital in the name of the public good, reviving individual and national vitality by banning the use of alcohol, granting women the right to vote, disenfranchising African-Americans, restricting the flow of immigrants, and acquiring an overseas empire.
Of course not all these goals were compatible. Advocates of various versions of rebirth—bodybuilders and Prohibitionists, Populists and Progressives, Social Christians and Imperialists—all laid claims to legitimacy. Their crusades met various ends, but overall they relieved the disease of the fin de siècle by injecting some visceral vitality into a modern culture that had seemed brittle and about to collapse. Yearning for intense experience, many seekers celebrated Force and Energy as ends in themselves. Such celebrations could reinforce militarist fantasies but could also lead in more interesting directions—toward new pathways in literature and the arts and sciences. Knowledge could be revitalized, too. William James, as well as Houdini and Roosevelt, was a symbol of the age.
The most popular forms of regeneration had a moral dimension.
pp. 27-29
But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:
“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”
This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.
This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.
pp. 63-71
NOT MANY AMERICAN men, even among the comparatively prosperous classes, were as able as Carnegie and Rockefeller to master the tensions at the core of their culture. Success manuals acknowledged the persistent problem of indiscipline, the need to channel passion to productive ends. Often the language of advice literature was sexually charged. In The Imperial Highway (1881), Jerome Bates advised:
[K]eep cool, have your resources well in hand, and reserve your strength until the proper time arrives to exert it. There is hardly any trait of character or faculty of intellect more valuable than the power of self-possession, or presence of mind. The man who is always “going off” unexpectedly, like an old rusty firearm, who is easily fluttered and discomposed at the appearance of some unforeseen emergency; who has no control over himself or his powers, is just the one who is always in trouble and is never successful or happy.
The assumptions behind this language are fascinating and important to an understanding of middle- and upper-class Americans in the Gilded Age. Like many other purveyors of conventional wisdom—ministers, physicians, journalists, health reformers—authors of self-help books assumed a psychic economy of scarcity. For men, this broad consensus of popular psychology had sexual implications: the scarce resource in question was seminal fluid, and one had best not be diddling it away in masturbation or even nocturnal emissions. This was easier said than done, of course, as Bates indicated, since men were constantly addled by insatiable urges, always on the verge of losing self-control—the struggle to keep it was an endless battle with one’s own darker self. Spiritual, psychic, and physical health converged. What Freud called “‘civilized’ sexual morality” fed directly into the “precious bodily fluids” school of health management. The man who was always “‘going off’ unexpectedly, like an old rusty firearm,” would probably be sickly as well as unsuccessful—sallow, sunken-chested, afflicted by languorous indecision (which was how Victorian health literature depicted the typical victim of what was called “self-abuse”).
But as this profile of the chronic masturbator suggests, scarcity psychology had implications beyond familiar admonitions to sexual restraint. Sexual scarcity was part of a broader psychology of scarcity; the need to conserve semen was only the most insistently physical part of a much more capacious need to conserve psychic energy. As Bates advised, the cultivation of “self-possession” allowed you to “keep your resources well in hand, and reserve your strength until the proper time arrives to exert it.” The implication was that there was only so much strength available to meet demanding circumstances and achieve success in life. The rhetoric of “self-possession” had financial as well as sexual connotations. To preserve a cool, unruffled presence of mind (to emulate Rockefeller, in effect) was one way to stay afloat on the storm surges of the business cycle.
The object of this exercise, at least for men, was personal autonomy—the ownership of one’s self. […]
It was one thing to lament excessive wants among the working class, who were supposed to be cultivating contentment with their lot, and quite another to find the same fault among the middle class, who were supposed to be improving themselves. The critique of middle-class desire posed potentially subversive questions about the dynamic of dissatisfaction at the core of market culture, about the very possibility of sustaining a stable sense of self in a society given over to perpetual jostling for personal advantage. The ruinous results of status-striving led advocates of economic thrift to advocate psychic thrift as well.
By the 1880s, the need to conserve scarce psychic resources was a commonly voiced priority among the educated and affluent. Beard’s American Nervousness had identified “the chief and primary cause” of neurasthenia as “modern civilization,” which placed unprecedented demands on limited emotional energy. “Neurasthenia” and “nervous prostration” became catchall terms for a constellation of symptoms that today would be characterized as signs of chronic depression—anxiety, irritability, nameless fears, listlessness, loss of will. In a Protestant culture, where effective exercise of will was the key to individual selfhood, the neurasthenic was a kind of anti-self—at best a walking shadow, at worst a bedridden invalid unable to make the most trivial choices or decisions. Beard and his colleagues—neurologists, psychiatrists, and self-help writers in the popular press—all agreed that nervous prostration was the price of progress, a signal that the psychic circuitry of “brain workers” was overloaded by the demands of “modern civilization.”
While some diagnoses of this disease deployed electrical metaphors, the more common idiom was economic. Popular psychology, like popular economics, was based on assumptions of scarcity: there was only so much emotional energy (and only so much money) to go around. The most prudent strategy was the husbanding of one’s resources as a hedge against bankruptcy and breakdown. […]
Being reborn through a self-allowed regime of lassitude was idiosyncratic, though important as a limiting case. Few Americans had the leisure or the inclination to engage in this kind of Wordsworthian retreat. Most considered neurasthenia at best a temporary respite, at worst an ordeal. They strained, if ambivalently, to be back in harness.
The manic-depressive psychology of the business class mimicked the lurching ups and downs of the business cycle. In both cases, assumptions of scarcity underwrote a pervasive defensiveness, a circle-the-wagons mentality. This was the attitude that lay behind the “rest cure” devised by the psychiatrist Silas Weir Mitchell, who proposed to “fatten” and “redden” the (usually female) patient by isolating her from all mental and social stimulation. (This nearly drove the writer Charlotte Perkins Gilman crazy, and inspired her story “The Yellow Wallpaper.”) It was also the attitude that lay behind the fiscal conservatism of the “sound-money men” on Wall Street and in Washington—the bankers and bondholders who wanted to restrict the money supply by tying it to the gold standard. Among the middle and upper classes, psyche and economy alike were haunted by the common specter of scarcity. But there were many Americans for whom scarcity was a more palpable threat.
AT THE BOTTOM of the heap were the urban poor. To middle-class observers they seemed little more than a squalid mass jammed into tenements that were festering hives of “relapsing fever,” a strange malady that left its survivors depleted of strength and unable to work. The disease was “the most efficient recruiting officer pauperism ever had,” said a journalist investigating tenement life in the 1870s. Studies of “the nether side of New York” had been appearing for decades, but—in the young United States at least—never before the Gilded Age had the story of Dives and Lazarus been so dramatically played out, never before had wealth been so flagrant, or poverty been so widespread and so unavoidably appalling. The army of thin young “sewing-girls” trooping off in the icy dawn to sweatshops all over Manhattan, the legions of skilled mechanics forced by high New York rents to huddle with their families amid a crowd of lowlifes, left without even a pretense of privacy in noisome tenements that made a mockery of the Victorian cult of home—these populations began to weigh on the bourgeois imagination, creating concrete images of the worthy, working poor.
pp. 99-110
Racial animosities flared in an atmosphere of multicultural fluidity, economic scarcity, and sexual rivalry. Attitudes arising from visceral hostility acquired a veneer of scientific objectivity. Race theory was nothing new, but in the late nineteenth century it mutated into multiple forms, many of them characterized by manic urgency, sexual hysteria, and biological determinism. Taxonomists had been trying to arrange various peoples in accordance with skull shape and brain size for decades; popularized notions of natural selection accelerated the taxonomic project, investing it more deeply in anatomical details. The superiority of the Anglo-Saxon—according to John Fiske, the leading pop-evolutionary thinker—arose not only from the huge size of his brain, but also from the depth of its furrows and the plenitude of its creases. The most exalted mental events had humble somatic origins. Mind was embedded in body, and both could be passed on to the next generation.
The year 1877 marked a crucial development in this hereditarian synthesis: in that year, Richard Dugdale published the results of his investigation into the Juke family, a dull-witted crew that had produced more than its share of criminals and mental defectives. While he allowed for the influence of environment, Dugdale emphasized the importance of inherited traits in the Juke family. If mental and emotional traits could be inherited along with physical ones, then why couldn’t superior people be bred like superior dogs or horses? The dream of creating a science of eugenics, dedicated to improving and eventually even perfecting human beings, fired the reform imagination for decades. Eugenics was a kind of secular millennialism, a vision of a society where biological engineering complemented social engineering to create a managerial utopia. The intellectual respectability of eugenics, which lasted until the 1930s, when it became associated with Nazism, underscores the centrality of racialist thinking among Americans who considered themselves enlightened and progressive. Here as elsewhere, racism and modernity were twinned.
Consciousness of race increasingly pervaded American culture in the Gilded Age. Even a worldview as supple as Henry James’s revealed its moorings in conventional racial categories when, in The American (1877), James presented his protagonist, Christopher Newman, as a quintessential Anglo-Saxon but with echoes of the noble Red Man, with the same classical posture and physiognomy. There was an emerging kinship between these two groups of claimants to the title “first Americans.” The iconic American, from this view, was a blend of Anglo-Saxon refinement and native vigor. While James only hints at this, in less than a generation such younger novelists as Frank Norris and Jack London would openly celebrate the rude vitality of the contemporary Anglo-Saxon, proud descendant of the “white savages” who subdued a continent. It should come as no surprise that their heroes were always emphatically male. The rhetoric of race merged with a broader agenda of masculine revitalization.[…]
By the 1880s, muscular Christians were sweeping across the land, seeking to meld spiritual and physical renewal, establishing institutions like the Young Men’s Christian Association. The YMCA provided prayer meetings and Bible study to earnest young men with spiritual seekers’ yearnings, gyms and swimming pools to pasty young men with office workers’ midriffs. Sometimes they were the same young men. More than any other organization, the YMCA aimed to promote the symmetry of character embodied in the phrase “body, mind, spirit”—which a Y executive named Luther Gulick plucked from Deuteronomy and made the motto of the organization. The key to the Y’s appeal, a Harper’s contributor wrote in 1882, was the “overmastering conviction” of its members: “The world always respects manliness, even when it is not convinced [by theological argument]; and if the organizations did not sponsor that quality in young men, they would be entitled to no respect.” In the YMCA, manliness was officially joined to a larger agenda.
For many American Protestants, the pursuit of physical fitness merged with an encompassing vision of moral and cultural revitalization—one based on the reassertion of Protestant self-control against the threats posed to it by immigrant masses and mass-marketed temptation. […]
Science and religion seemed to point in the same direction: Progress and Providence were one.
Yet the synthesis remained precarious. Physical prowess, the basis of national supremacy, could not be taken for granted. Strong acknowledged in passing that Anglo-Saxons could be “devitalized by alcohol and tobacco.” Racial superiority could be undone by degenerate habits. Even the most triumphalist tracts contained an undercurrent of anxiety, rooted in the fear of flab. The new stress on the physical basis of identity began subtly to undermine the Protestant synthesis, to reinforce the suspicion that religion was a refuge for effeminate weaklings. The question inevitably arose, in some men’s minds: What if the YMCA and muscular Christianity were not enough to revitalize tired businessmen and college boys?
Under pressure from proliferating ideas of racial “fitness,” models of manhood became more secular. Despite the efforts of muscular Christians to reunite body and soul, the ideal man emerging among all classes by the 1890s was tougher and less introspective than his mid-Victorian predecessors. He was also less religious. Among advocates of revitalization, words like “Energy” and “Force” began to dominate discussion—often capitalized, often uncoupled from any larger frameworks of moral or spiritual meaning, and often combined with racist assumptions. […]
The emerging worship of force raised disturbing issues. Conventional morality took a backseat to the celebration of savage strength. After 1900, in the work of a pop-Nietzschean like Jack London, even criminality became a sign of racial vitality: as one of his characters says, “We whites have been land-robbers and sea-robbers from remotest time. It is in our blood, I guess, and we can’t get away from it.” This reversal of norms did not directly challenge racial hierarchies, but the assumptions behind it led toward disturbing questions. If physical prowess was the mark of racial superiority, what was one to make of the magnificent specimens of manhood produced by allegedly inferior races? Could it be that desk-bound Anglo-Saxons required an infusion of barbarian blood (or at least the “barbarian virtues” recommended by Theodore Roosevelt)? Behind these questions lay a primitivist model of regeneration, to be accomplished by incorporating the vitality of the vanquished, dark-skinned other. The question was how to do that and maintain racial purity.
pp. 135-138
Yet to emphasize the gap between country and the city was not simply an evasive exercise: dreams of bucolic stillness or urban energy stemmed from motives more complex than mere escapist sentiment. City and country were mother lodes of metaphor, sources for making sense of the urban-industrial revolution that was transforming the American countryside and creating a deep sense of discontinuity in many Americans’ lives during the decades after the Civil War. If the city epitomized the attraction of the future, the country embodied the pull of the past. For all those who had moved to town in search of excitement or opportunity, rural life was ineluctably associated with childhood and memory. The contrast between country and city was about personal experience as well as political economy. […]
REVERENCE FOR THE man of the soil was rooted in the republican tradition. In his Notes on the State of Virginia (1785), Jefferson articulated the antithesis that became central to agrarian politics (and to the producerist worldview in general)—the contrast between rural producers and urban parasites. “Those who labour in the earth are the chosen people of God, if ever he had a chosen people, whose breasts he has made his peculiar deposit for substantial and genuine virtue,” he announced. “Corruption of morals in the mass of cultivators is a phenomenon of which no age nor nation has furnished an example. It is the mark set on those, who not looking up to heaven, to their own soil and industry, as does the husbandman, for their subsistence, depend for it on the casualties and caprice of customers. Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the design of ambition.” Small wonder, from this view, that urban centers of commerce seemed to menace the public good. “The mobs of great cities,” Jefferson concluded, “add just so much to the support of pure government as sores do to the strength of the human body.” Jefferson’s invidious distinctions echoed through the nineteenth century, fueling the moral passion of agrarian rebels. Watson, among many, considered himself a Jeffersonian.
There were fundamental contradictions embedded in Jefferson’s conceptions of an independent yeomanry. Outside certain remote areas in New England, most American farmers were not self-sufficient in the nineteenth century—nor did they want to be. Many were eager participants in the agricultural market economy, animated by a restless, entrepreneurial spirit. Indeed, Jefferson’s own expansionist policies, especially the Louisiana Purchase, encouraged centrifugal movement as much as permanent settlement. “What developed in America,” the historian Richard Hofstadter wrote, “was an agricultural society whose real attachment was not to the land but to land values.” The figure of the independent yeoman, furnishing enough food for himself and his family, participating in the public life of a secure community—this icon embodied longings for stability amid a maelstrom of migration.
Often the longings were tinged with a melancholy sense of loss. […] For those with Jeffersonian sympathies, abandoned farms were disturbing evidence of cultural decline. As a North American Review contributor wrote in 1888: “Once let the human race be cut off from personal contact with the soil, once let the conventionalities and artificial restrictions of so-called civilization interfere with the healthful simplicity of nature, and decay is certain.” Romantic nature-worship had flourished fitfully among intellectuals since Emerson had become a transparent eye-ball on the Concord common and Whitman had loafed among leaves of grass. By the post–Civil War decades, romantic sentiment combined with republican tradition to foster forebodings. Migration from country to city, from this view, was a symptom of disease in the body politic. Yet the migration continued. Indeed, nostalgia for rural roots was itself a product of rootlessness. A restless spirit, born of necessity and desire, spun Americans off in many directions—but mainly westward. The vision of a stable yeomanry was undercut by the prevalence of the westering pioneer.
pp. 246-247
Whether energy came from within or without, it was as limitless as electricity apparently was. The obstacles to access were not material—class barriers or economic deprivation were never mentioned by devotees of abundance psychology—they were mental and emotional. The most debilitating emotion was fear, which cropped up constantly as the core problem in diagnoses of neurasthenia. The preoccupation with freeing oneself from internal constraints undermined the older, static ideal of economic self-control at its psychological base. As one observer noted in 1902: “The root cause of thrift, which we all admire and preach because it is so convenient to the community, is fear, fear of future want; and that fear, we are convinced, when indulged overmuch by pessimist minds is the most frequent cause of miserliness….” Freedom from fear meant freedom to consume.
And consumption began at the dinner table. Woods Hutchinson claimed in 1913 that the new enthusiasm for calories was entirely appropriate to a mobile, democratic society. The old “stagnation” theory of diet merely sought to maintain the level of health and vigor; it was a diet for slaves or serfs, for people who were not supposed to rise above their station. “The new diet theory is based on the idea of progress, of continuous improvement, of never resting satisfied with things as they are,” Hutchinson wrote. “No diet is too liberal or expensive that will…yield good returns on the investment.” Economic metaphors for health began to focus on growth and process rather than stability, on consumption and investment rather than savings.
As abundance psychology spread, a new atmosphere of dynamism enveloped old prescriptions for success. After the turn of the century, money was less often seen as an inert commodity, to be gradually accumulated and tended to steady growth; and more often seen as a fluid and dynamic force. To Americans enraptured by the strenuous life, energy became an end itself—and money was a kind of energy. Success mythology reflected this subtle change. In the magazine hagiographies of business titans—as well as in the fiction of writers like Dreiser and Norris—the key to success frequently became a mastery of Force (as those novelists always capitalized it), of raw power. Norris’s The Pit (1903) was a paean to the furious economic energies concentrated in Chicago. “It was Empire, the restless subjugation of all this central world of the lakes and prairies. Here, mid-most in the land, beat the Heart of the nation, whence inevitably must come its immeasurable power, its infinite, inexhaustible vitality. Here of all her cities, throbbed the true life—the true power and spirit of America: gigantic, crude, with the crudity of youth, disdaining rivalry; sane and healthy and vigorous; brutal in its ambition, arrogant in the new-found knowledge of its giant strength, prodigal of its wealth, infinite in its desires.” This was the vitalist vision at its most breathless and jejune, the literary equivalent of Theodore Roosevelt’s adolescent antics.
The new emphasis on capital as Force translated the psychology of abundance into economic terms. The economist who did the most to popularize this translation was Simon Nelson Patten, whose The New Basis of Civilization (1907) argued that the United States had passed from an “era of scarcity” to an “era of abundance” characterized by the unprecedented availability of mass-produced goods. His argument was based on the confident assumption that human beings had learned to control the weather. “The Secretary of Agriculture recently declared that serious crop failures will occur no more,” Patten wrote. “Stable, progressive farming controls the terror, disorder, and devastation of earlier times. A new agriculture means a new civilization.” Visions of perpetual growth were in the air, promising both stability and dynamism.
The economist Edward Atkinson pointed the way to a new synthesis with a hymn to “mental energy” in the Popular Science Monthly. Like other forms of energy, it was limitless. “If…there is no conceivable limit to the power of mind over matter or to the number of conversions of force that can be developed,” he wrote, “it follows that pauperism is due to want of mental energy, not of material resources.” Redistribution of wealth was not on the agenda; positive thinking was.
pp. 282-283
TR’s policies were primarily designed to protect American corporations’ access to raw materials, investment opportunities, and sometimes markets. The timing was appropriate. In the wake of the merger wave of 1897–1903, Wall Street generated new pools of capital, while Washington provided new places to invest it. Speculative excitement seized many among the middle and upper classes who began buying stocks for the first time. Prosperity spread even among the working classes, leading Simon Nelson Patten to detect a seismic shift from an era of scarcity to an era of abundance. For him, a well-paid working population committed to ever-expanding consumption would create what he called The New Basis of Civilization (1907).
Patten understood that the mountains of newly available goods were in part the spoils of empire, but he dissolved imperial power relations in a rhetoric of technological determinism. The new abundance, he argued, depended not only on the conquest of weather but also on the annihilation of time and space—a fast, efficient distribution system that provided Americans with the most varied diet in the world, transforming what had once been luxuries into staples of even the working man’s diet. “Rapid distribution of food carries civilization with it, and the prosperity that gives us a Panama canal with which to reach untouched tropic riches is a distinctive laborer’s resource, ranking with refrigerated express and quick freight carriage.” The specific moves that led to the seizure of the Canal Zone evaporated in the abstract “prosperity that gives us a Panama Canal,” which in turn became as much a boon to the workingman as innovative transportation. Empire was everywhere, in Patten’s formulation, and yet nowhere in sight.
What Patten implied (rather than stated overtly) was that imperialism underwrote expanding mass consumption, raising standards of living for ordinary folk. “Tropic riches” became cheap foods for the masses. The once-exotic banana was now sold from pushcarts for 6 cents a dozen, “a permanent addition to the laborer’s fund of goods.” The same was true of “sugar, which years ago was too expensive to be lavishly consumed by the well-to-do,” but “now freely gives its heat to the workingman,” as Patten wrote. “The demand that will follow the developing taste for it can be met by the vast quantities latent in Porto Rico and Cuba, and beyond them by the teeming lands of South America, and beyond them by the virgin tropics of another hemisphere.” From this view, the relation between empire and consumption was reciprocal: if imperial policies helped stimulate consumer demand, consumer demand in turn promoted imperial expansion. A society committed to ever-higher levels of mass-produced abundance required empire to be a way of life.
There is another aspect of the “double-movement” to note here as well.
Many of us obviously are in a “moral panic,” as you’ve described it elsewhere in the post, with the underlying reasoning, means and effects being decidedly unhealthy — not just for society at large, but also for those of us who actually are in a “moral panic” about “the end of Western civilization” as we’ve known it.
The links to physical nutrition are very interesting and definitely not receiving their due attention. At the same time, though, “Man (still) does not live by bread alone” and, in large part, is not paying particular heed to his mental, emotional and spiritual diet any more than his physical one.
The trend or movement I wish to point out, of course, is the intense general interest (even all across the so-called political spectrum) we are witnessing in the ancient practice of “mindfulness” or “contemplative” meditation. While it’s true that a good number of us are interested in mindfulness for all the wrong reasons, the seedling of true mindfulness practice nonetheless has been planted and is growing in direct proportion to the “moral panic.” I would even say it has overtaken the trend of “moral panic” at this point, but I’d need some hefty research in this area to be able to say that with absolute certainty.
We have quite the obstacle to overcome in regard to this in the West as the Western — and, specifically, Christian — understanding of mindfulness has become confused, to say the least. It’s fairly obvious that the egoic “I” has displaced the “I Am that I Am” authority in the West when it comes to mindfulness. Anyone who grew up in a Christian household might be able to relate to this as their parents were no doubt quick to invoke the “Mind me!” imperative when one or the other’s commandments were not being followed to the letter. : ) Needless to say, though, the “ego-center” is not intended to be the focus of mindfulness training and practice, though the center it is in the “Culture of Narcissism…” for the moment.
Fascinating, isn’t it, that it’s primarily the monastics who are on the frontlines of preserving ancient wisdom through our own dark ages just as they have in the past? Of course, just as in the past, there is also a great deal of opposition to the nurturance of mindfulness or “contemplative prayer” in society for one egocentric and egotistical reason or another, but I sincerely doubt those who are balking at the trend can or will succeed in stopping it.
— Diet: It’s Not What You Eat, It’s How You Eat It (Actually, it’s both, but there we are.)
— Mindful Munching
— How to Eat
— Center for Contemplative Mind in Society
— Contemplative Outreach
— Centering Prayer
I could go on all day about this one, complete with quotes, books and links galore, but — not to worry — I’ll spare you and your readers that. : )
Yeah. “Man (still) does not live by bread alone.” I’d add that bread does not exist alone. The point I’ve made in a number of posts is that dietary ideology and food systems are built into the social order. Bread is never merely bread.
Food is enmeshed in neurocognitive functioning, social identity, culture, etc. We are what we eat. Agriculture creates the agricultural mind, specifically hyper-individualism, and industrial agriculture pushes this to the furthest extreme. We can see and measure how the brain is altered by a high-carb diet, as seen in the diseases of civilization such as autism, schizophrenia, etc. And we are beginning to understand the mechanisms of how this happens, the specific dietary substances and biological systems involved.
Food, quite literally, shapes us and shapes all of society because it is that out of which we are built. We would have no brains to think, no bodies to act without food. We simply would not exist. And the kind of existence we have is determined by the kind of food we eat (e.g., feudalism would’ve been impossible with serfs as strong, vital, and healthy as Mongolian warriors were on an animal-based keto diet). It’s my theory. Take it or leave it.
I’m familiar with the tendency to make arguments such as “the seedling of true mindfulness practice nonetheless has been planted and is growing in direct proportion to the “moral panic.” I would even say it has overtaken the trend of “moral panic” at this point…” It seems like a spiritual/integral version of Steven Pinker’s Enlightenment Now, and Pinker did bring forth as much evidence as he could find that modern civilization is progressing. Some found his argument compelling while others didn’t.
Is the world getting better or worse? I don’t know. I’m sitting on the fence and don’t feel any strong urge to jump to either side. I’d like to be convinced about Whiggish visions of history. Maybe I’m just being cynical or contrarian for no good reason. Eras of moral panic sometimes precede transformative changes, but then again at other times they precede chaos, crash, and collapse. Seems like a toss-up at this point. How would we have any clue which way the wind is blowing? I also don’t know. But I’m watching closely for any further signs and shifts. That is my own practice, just to be alert and aware.
All of the above is simply my perspective. It is what makes sense to me at this moment in my minuscule existence, but likely to change over time. So feel free to disagree with it or even to argue against it. But I will note that I’m not really disagreeing with you, as I’m sure you understand. My nature is to be skeptical, until I see strong evidence of something. At the moment, my ignorance is too great to have an opinion on certain issues, and it seems to me that for the most part humanity shares in my ignorance. I jealously protect this state of ignorance. I’d like to think it’s a fruitful ignorance, an openness to possibilities with a dash of curiosity thrown in. That is, let’s see what becomes of it all.
As a side note, feel free to dump as many links as you want in my blog. As long as they’re relevant and interesting, go for it. I love links for purposes of clarification or as suggestions. I don’t consider lots of links as spam, even if they’re links to your own writing (if you throw in a bunch of quotes and book titles, all the better). Nor will I treat long comments as intrusive and unwelcome. If anything, I get irritated by enigmatically short comments. And drive-by commenting can be the worst. So have at it. Don’t be shy.
Given what I just had to say about having no respect for argumentation, I should think my comments regarding societal trends would be taken as cultural observations. But I should know by now that it’s impossible to converse in this climate.
I didn’t think we were arguing. Did you take that as argumentation? Do you really believe it’s impossible to converse? As far as I was concerned, that is exactly what we were doing. We were conversing, discussing, talking… and I was enjoying the exchange of ideas. I didn’t realize you perceived it otherwise. I don’t consider mild differences of viewpoint as argumentation, especially considering we agree on so much else and basically share the same worldview. The differences really were more of emphasis than substance. But maybe I should have made that more explicitly clear. I sometimes assume something is mutually understood when that isn’t necessarily the case.
I went to immense lengths to be non-confrontational and non-argumentative. It’s true that I did, in a friendly manner, invite you to “feel free to disagree with it or even to argue against it.” But implied in that was an invitation to agree or not argue as well, as I didn’t personally feel like I was disagreeing or arguing. Rather, I thought we were having a friendly and casual conversation. Our respective comments seemed to be building on one another, not tearing each other down. My intention was to invite you to share your views without any worries of my being dismissive or debative. Like you, I’m not overly fond of argument for the sake of argument, especially debate.
I chose my language carefully to indicate I was not (absolutely not!) taking a strong and certain position… just offering thoughts and views with only speculative conclusions. But I guess there was a failure of communication. That isn’t surprising. It happens all the time online. With that in mind, understand that when I use the following language I’m speaking honestly, forthrightly, and sincerely. I really do hold my views lightly or that is what I try to do, not that I’m always successful. I was trying my best to be humble. I apologize for acting in a way that you took as being unfriendly.
“It’s my theory. Take it or leave it.”
“Some found his argument compelling while others didn’t.”
“I don’t know. I’m sitting on the fence and don’t feel any strong urge to jump to either side.”
“I also don’t know. But I’m watching closely for any further signs and shifts. That is my own practice, just to be alert and aware.”
“All of the above is simply my perspective. It is what makes sense to me at this moment in my minuscule existence, but likely to change over time.”
“But I will note that I’m not really disagreeing with you, as I’m sure you understand.”
“I’d like to think it’s a fruitful ignorance, an openness to possibilities with a dash of curiosity thrown in. That is, let’s see what becomes of it all.”
To emphasize our commonality, I agree with you that mindfulness is useful, although ‘useful’ doesn’t exactly seem like the right word to describe what I mean. I do mindful meditation simply because I’m drawn to it, as it is part of the experience of the world that makes sense to me.
Mindfulness feels like a natural result and extension of a different way of being in the world, not so much a choice I’m making as a ‘practice’ per se. I’m not practicing as if learning a sport or musical instrument for I’m simply living, existing. But I don’t mean to quibble over words. It’s hard to communicate about such things.
The heart of my comment was the following: “I also don’t know. But I’m watching closely for any further signs and shifts. That is my own practice, just to be alert and aware.” Maybe that part didn’t stand out to you, in the way it stands out to me in having written it. Just to be alert and aware. Does what I say there make sense to you?
“I can think. I can wait. I can fast.”
~Hermann Hesse, Siddhartha
If I have a life philosophy, that is it.
I am curious. Did you actually perceive me as arguing with you, as seeking out or trying to incite argument? I realize I can sometimes come across as confrontational or judgmental. I’ve been known to even admit to occasionally being an asshole. I’m usually aware of that when I’m in that kind of mood. I can feel the edge of irritableness and such that used to be much more common back when I was chronically depressed. So, I’m very familiar with it and tend to be self-aware about this impulse when it’s present — that said, self-awareness is an imperfect thing.
But in this case, I felt nothing along these lines. I thought we were getting along just fine. If anything, I was trying to soften my tone as much as possible. For example, I was worried that you might take my comparison of your views to Pinker the wrong way. So I framed my comment there in as neutral of wording as possible, in stating that some were convinced and others not. But I carefully took neither side in that debate about progress, as I wasn’t interested in debating it. I only wanted to point out the context of that debate, as to clarify I was taking a position adjacent to it.
Was there something about the language I was using that caused you to interpret my intentions as argumentative? Or did you intuit or think you intuited something behind my words? I’d honestly like to know. If I was coming off so far differently than my intentions, I’d like to work on that. I have some old habits from decades of depression and they might leak into my way of communicating without my realizing it.
In case you didn’t notice, I’m really into meta-discussion. I love talking about how we talk, and that is even more true on the personal level. I find it fascinating, even when it sometimes involves irritation or less than happy interactions. It probably relates to my deep interest in linguistic relativity and philology, along with the likes of Marshall McLuhan, Walter Ong, E. R. Dodds, Julian Jaynes, etc. (thinkers focusing on language). But the origin of this tendency in me probably comes from my upbringing, as my childhood church was new agey with that liberal bent of focusing on how we talk and with a particular emphasis on the power of words (New Thought theology).
I have an immense ability to talk anything to death, as I’m doing right now in these comments. I hope you can forgive me. This is the whole reason I blog. I want dialogue with people, and I’m as fascinated with how we dialogue as with what we dialogue about. That was the underlying message of my post on moderation trolls, about how rhetoric not only frames debate but, as I explored in the comments section there, also influences and shapes society. Anyway, I hope you can offer some sympathy and understanding in my direction. I’m an imperfect human being, like anyone else. My first knee-jerk response to your above comment was to feel defensive, to declare my innocence in doing anything wrong. But to be honest with myself, I can’t pretend to know that what you perceived was necessarily false. I very well might have been putting off signals without any awareness whatsoever. That is a talent humans have.
That is why I like meta-discussion. How else are we supposed to become aware of the shortcomings of our unawareness without occasionally bumping into hidden parts of one another? I want you to know that I don’t have any bad feelings here. Generally speaking, as long as you’re not a troll or racist or authoritarian, I can be quite tolerant of minor incidents of conflict. This isn’t the kind of thing I would hold as a grudge. There is some kind of misunderstanding that happened here and I’m not exactly sure what it was. Maybe I was misunderstanding myself. I’m willing to entertain all possibilities. It’s nothing to get excited about, to my mind. This kind of thing happens so often that I’m trying to learn to be less reactive. This has happened a number of times recently. And in those cases, I think it was the other way around where I was perceiving something in another and they claimed that I had misunderstood. I’m not sure it particularly matters who is right, at least that is the attitude I strive to maintain.
I came across a quote recently that said something along the lines that, if one understood how much baggage we all carry and tend to project, one would not be so quick to judge or to take anything personally. Most conflicts are not really about what they seem. All of us have our own private struggles that we tend to hide away from others, sometimes even from our own awareness. That insight I too often forget. And so I say it here as much to remind myself as to share it with you. If personal issues were somehow popping up in my comments or yours, maybe it doesn’t need to mean anything more than that. And that is fine. It’s not to dismiss and demean what we feel but, quite the opposite, to respect those feelings on their own without needing to rationalize or even explain them. It’s not a matter of maintaining a division between the personal and the interpersonal, as no such absolute demarcation exists. All issues, in one way or another, are interpersonal. But that doesn’t mean we have the awareness to know what is going on most of the time, even in our own minds.
Whatever may have come up in the offending comments above, whether consciously intended or not, it is irrelevant to how we choose to continue engaging. Or if you feel plain annoyed by the whole situation and my blathering on, you’re free to not respond at all and I’ll try my best to not take it personally. I just don’t like to leave things unsettled with bad feelings still in the air (the equivalent of the advice of resolving an argument before going to bed). I enjoy talking with you and have no desire to harm that friendship. It may only be an online connection of words on blogs, but someone recently reminded me that even online friendships are real friendships, as people online are real people, as it is all part of the real world — there is no fake world, just this one world we live in. I was maybe too careless about this kind of thing in the past, by not taking online interactions all that seriously. That was an unhelpful and sometimes unkind attitude. It does matter.
No worries, mate. We’re good.
You’re speaking to a Celt and, contrary to popular belief, we’re not all “quick tempered.” In fact, it usually takes an overabundance of ill-will directed our way, built up over time, to make the blood boil but — if and when it does — there’s no mistaking that it is. ; )
Good to hear! I’m from two families that are highly sensitive and prone to depression. We’re not Celts, as far as I know from genealogical research… mostly centuries of mutts marrying other mutts, although there is a lot of German thrown in there. I’m fully able to be irritable and defensive. I also inherited an ability to hold grudges, but I’m trying to be different and not take things so personally.
Erm, no. But I don’t think arguments need be made (or evidence proffered) for spiritual/integral revelations, in any case. They’re self-evident. No doubt why true spiritual leaders may tell us where to look, but never what to see. (It seems they’d prefer we see for ourselves. No doubt also why the Siddhartha Gautamas of the world are reportedly hesitant to teach in the first place. 1) It’s second-hand and 2) they know they’ll most likely be misunderstood.)
‘Nough said.
In our scientific evidence-based society, though, every word out of our mouths must be backed up with reams of research, experimentation, charts and graphs, ad infinitum to be taken seriously…or not.
As for Pinker…. Well, I’ll leave that to those who find it necessary to “meet him on his own turf.”
Let me attempt the near impossible, to offer a differing view that doesn’t come across as argumentative, disagreeable, and contentious. I hope I can at least make you understand where I’m coming from with agreement or disagreement, as I see it, being irrelevant. Mutual understanding is a greater aspiration, even if it ends in agreeing to disagree. So, instead of debate, rather than claims and counter-claims, I’m going to invoke mystery as an expression of open-minded ignorance and as the prerequisite to seeking truth or, failing that, insight. Basically, I claim my right to humility in having questions without answers — no excuses or apologies, just the way it is or rather the way I am.
I have no desire to challenge the ‘evidence’, self-evident or otherwise, on its own terms but to question and wonder (motivated by curiosity and wonderment) about the framing that leads to specific conclusions based on any given evidence. The same evidence could be interpreted many ways and so potentially could lead to divergent or even contrary conclusions, without it being an issue of right or wrong, true or false. The question is: “self-evident” to which particular selves and what kind of self? We collectively create a particular sense of self that determines what is self-evident, and possible selves are numerous depending on the collective in question — much of my writing has been about how different systems form different selves (e.g., modern agriculture and hyper-individualism).
For instance, what was self-evident to Daniel Everett as an evangelist was not self-evident to the Piraha, since their senses of self caused their senses of reality to be incommensurate and have too few points of contact. At times, they couldn’t even see the same thing when looking exactly at the same place such as when the Piraha all gathered to look at the ‘spirit’ on the other side of the river but to Everett and his family there was nothing there. And vice versa with how Everett’s absolute faith in Jesus Christ was not and could not be real or meaningful to the Piraha within their cultural worldview that disallowed impersonal truth claims, abstractions, and generalizations by default of how their language was structured (related to why their language is one of the few surviving that lacks Chomskyan recursion).
I get that what you experience and understand is self-evident to you, in your sense of self and reality. Yet it doesn’t seem to be self-evident to most people in this society which, of course, proves nothing other than people disagree about a lot of things… on the other hand, is there any society where this has been self-evident to the majority? I don’t know. I’m not even sure that what you speak of is self-evident to me, but let me be careful about what I mean as I’m feeling epistemologically cautious, even if you think it is unneeded, unjustified, and unfair. By personality, I have a certain kind of wariness about the world and it isn’t something to be rationally debated for or against. And maybe there is some key difference in our psychological predispositions, the kinds of selves we possess or rather that possess us and hence the particular collectives we each are drawn to and identify with.
Am I making any sense at all to you? I’ll try to clarify. It’s not that I’m exactly disagreeing about the potential for such self-evidence, as I probably know what you’re getting at. I’ve had some profound experiences from meditation and psychedelics. And if you caught me at another time of my life, I might have been fully on board with this claim of self-evidence that (to you and those who share your view) seems like an obvious and incontrovertible revelation. But my sensibility has shifted over time, increasingly toward an attitude of radical skepticism and Fortean curiosity. Rather than disagreement, it’s about a preference of emphasis and framing… or something like that… I’m not sure I can explain it well. I feel doubtful or even pessimistic about supposed universal truths. The way the world is (or seems to be) is not inevitably how it had to be nor is there a single path forward to what it will become, much less a single revelation to light the way. The world is an infinite multiplicity that opens up to possibility, rather than narrowing down to a single self-evident certainty, so it seems to me. Could I be wrong about that? Sure. But for now, it’s my working hypothesis.
That isn’t to say there aren’t intriguing and, at times, amazing congruencies and resonances between the millennia of spiritual teachings. Along these lines, you seem to be referring to the perennial wisdom tradition, yes? The attraction to this philosophical position has often made sense to me and so I do hold it close as a sometimes useful or inspiring way of looking at the world. I’m always scanning for the patterns that seemingly point to something more. And so I would never dismiss that greater sense of things that one can gain from studying the great teachers of the past. On the other hand, the patterns I discern appear more like ripples in the ocean than they do the eternal orbs gyrating in the sky above. Nonetheless, ripples in the ocean can still tell one quite a bit about the world, including deeper undercurrents of reality, no matter how murky the depths.
Beyond that, I don’t feel strong confidence in knowing which spiritual teachers are true and which are not. My suspicion is there are many truths with many paths that very well may lead to many destinations. It’s a cosmic penny arcade of games to play, even if they are games we play with the seriousness of a child. Robert Anton Wilson, self-described as “mildly puzzled all the time,” ventured an approach of guerrilla ontology. He proposed that we can jump from one reality tunnel to another but we never escape all reality tunnels — so basically, choose your illusion with care and try to hold it lightly. An attitude of playfulness is key. “Truly I tell you, unless you change and become like little children, you will never enter the kingdom of heaven.”
I don’t know if this view is ‘true’. It just makes me feel better about the strange and amazing world I find myself in: “But in this dark world where he now dwelt, ugly things and surprising things and once in a long while a tiny wondrous thing spilled out at him constantly; he could count on nothing,” as Philip K. Dick put it in A Scanner Darkly. I have no need to make certain claims of anything, not even self-evidence, as I fully realize that my preferred sense of reality is far outside the norm. And as I so often repeat, that is fine.
I’m a seeker and, as has been said, seekers never find… for then the seeking would end and, as I’d add, the fun is in the seeking. If “a fox knows many things, but a hedgehog one important thing,” then I’m a fox (or, to be fancy, a bricoleur). Maybe that is my fate or my self-fulfilling prophecy — either way, I accept it. Does all of this still seem wrong or plain irritating? Do I continue to completely miss the point to the extent of being spiritually obtuse? Well, I offer no defense on my own behalf. This is simply what makes sense to me at the moment, from where I find myself. I’ve patched together this semblance of meaning as best I can from the bits and pieces I found along the way. I won’t pretend to know if my own understandings, meager as they are, contradict your own sense of things. I suppose it’s no concern of mine. I have no desire to attack what is of such immense value to you and so many others. I mean no harm.
I’m just wandering down this road less traveled and, if others would like the company, they are welcome to join me in my meandering path, but I must emphasize that I do like to meander. This is no straight and narrow. Destination is, as of yet, undetermined. I do take pleasure, though, in drawing maps of the territory I’ve covered, but those maps are maybe a bit idiosyncratic. What is beyond the edge of the mapped territory? Heck if I know. I’ll check it out if and when I get there. There is an immense world to be explored and curiosity is my guide.
Can one be open-minded and ignor(e)-ant at the same time? Hmm.
I suppose that question doesn’t really matter to the conversation, though, does it? Point is: you’re curious, aware and humbled by the fact that there are questions to which we have no answers and may never have any answers. That likely won’t stop us from asking them, though, curious critters that we are.
Self-evident to what Buddhists call our “true selves” and not self-evident to what Buddhists call our “false-selves,” which is most often referred to these days in the West as “ego-consciousness.” A “take it or leave it” proposition if ever there was one.
The example of Everett and the Piraha seems to reflect a difference between expressions of what Gebser calls “consciousness structures,” Everett’s being exemplary of the mythical and the Piraha’s being exemplary of the magical, if I’m understanding Gebser right. If those structures are, in fact, inherent in human beings — both latent and manifest — then I should think any issue there was one of communication and translation as opposed to “consciousness” itself. Physical sight has little to nothing to do with it, imho.
What I had in mind was more along the lines of subjective and objective insight and experience, and I’ve struggled to come up with a really good example of our self-imposed difficulty to integrate the “two.” I did very much appreciate an example of it in a conversation between the characters Ellie Arroway and Palmer Joss in the motion picture adaptation of Carl Sagan’s Contact. (Despite the fact that Sagan thought WB’s script wasn’t a true representation of his novel of the same title, the gist of that “argument” seems more or less intact to me.)
More or less, though I don’t think of any wisdom tradition as a “philosophical position,” just perennial, as I further think it was intended to be. Religion I tend to think of a bit differently — as “tuition” rather than intuition, as it were.
We have that in common, then. Perhaps even with quantum physicists. : )
I do, based on the criteria outlined above. A good teacher doesn’t tell us how to “look,” what to “see” (with our “ajna,” to borrow a term from our Hindu brethren) or what to think. A good teacher just says something like, “Knock and the door shall be opened to you.” (And I like to think humanity is in the process of bringing its “Ajna” into balance.)
One of my favorites. Regardless of the plethora of exegeses of that passage, it’s always invoked in my mind a sense of child-like wonder (as opposed to childessness, of course). Believe it or not, we were all children once. So I’m not sure why so many of us think exegeses of it necessary, but there we are. (Perhaps they like of themselves as authorities in the matter? Who knows?)
Good conversing with you. I find your rambling style appealing, by the way. So many sparks! Too many “streams of consciousness” are more like constipated or diarrheal “streams,” if you get my meaning. (Trump’s comes to mind as a perfect example of the diarrheal variety. He could really do with even a modicum of self-reflection, imo. Could it be more polluted?)
“Ignorant” is one of my favorite words. I like the simplicity of it and the shock value. We are born in ignorance and mostly remain ignorant throughout our lives. At the very least, we can be honest and humble about this state of affairs. I wear my ignorance proudly. It’s the starting point for any of us — hopefully, not the ending point. I’m working on it.
We need to enter into ignorance as one must enter an unlit room to carry a candle into it so as to dispel the darkness. Or I suppose one could simply toss the candle into the darkness and light the room on fire, but that seems less useful and wise, depending on one’s opinion about rooms.
In etymological terms, it is related to agnostic. Most basically, it only means to not know, to lack knowledge (and what is related to knowledge in its broadest sense). The sense of intentionally or belligerently ignoring something seems to be a later connotation, although it is less clear when that usage originated. The first known use of ‘ignore’ was in 1801, whereas ‘ignorance’ goes back to the 13th century and ‘ignorant’ to the 14th century.
My radical skepticism coincides with my radical ignorance. To put it differently, we can refer to such ignorance as unknowing, a passive state that implies no moral failure and no social judgment. It is also why I have at times called myself an “agnostic gnostic,” in that I’m a seeker of knowledge, especially in the experiential sense, while acknowledging the knowledge I lack.
It’s my way of talking. Feel free to ‘ignore’ my somewhat idiosyncratic use of ‘ignorance’.
https://www.merriam-webster.com/dictionary/ignorance
“the state or fact of being ignorant : lack of knowledge, education, or awareness”
https://www.etymonline.com/word/ignorant
“late 14c., “lacking wisdom or knowledge; unaware,” from Old French ignorant (14c.), from Latin ignorantem (nominative ignorans) “not knowing, ignorant,” present participle of ignorare “not to know, to be unacquainted; mistake, misunderstand; take no notice of, pay no attention to,” from assimilated form of in- “not, opposite of” (see in- (1)) + Old Latin gnarus “aware, acquainted with” (source also of Classical Latin noscere “to know,” notus “known”), from Proto-Latin suffixed form *gno-ro-, suffixed form of PIE root *gno- “to know.” Also see uncouth. Form influenced by related Latin ignotus “unknown, strange, unrecognized, unfamiliar.””
https://simanaitissays.com/2018/03/16/ignorance-its-etymology-and-display/
“The adjective “ignorant” and noun “ignorance” both are related to the English word “ignore: “to refuse to take notice of.” All three trace back to Latin words ignarus, from in negating gnoscere. This last one, with a negating “a” is related to the English word “agnostic,” unknowable, tracing back to the Greek, gignōskein, “to know.”
“Ain’t etymology fun?
“There’s a curiosity in M-W’s dating the first English appearances of ignorance (13th century), ignorant (14th century), and ignore (1801). Are we to infer there were folks displaying a lack of knowledge 600 years before they refused to take notice of it?
“Which reminds me of the Dunning-Kruger Effect, wherein highly intelligent people realize how little they know, and stupid people are too stupid to realize how lacking they are.”
I’ve read enough about Buddhism and Eastern philosophy to grasp basic notions along the lines of “true selves” and “false selves.” Once again, it’s not so much about agreement or disagreement with that view of things on its own terms. But I’m one who must test everything in my own experience.
I won’t take someone else’s word for it. That is the challenge. And it involves continuous experimentation through various ‘practices’, including how I use language. I’ve had this odd experiment lately where, in my own mind, I’ll talk about myself in third person plural, as ‘they’. It creates an interesting vantage point of being in the world, a shift into narrating mode.
I’m still exploring possibilities. Maybe I always will be. On my deathbed, my last words might be yet another question. Or maybe my final word will be ‘but…’ and then the final silence. Some say that nothing good comes after the word ‘but’, but what if what comes after is one’s death? Then what? Put on my tombstone, “but…”
Another way of framing this is through Jean Gebser’s work. We live in a society dominated by the deficient mental-rational. It’s what we are born into, the air we breathe. It’s like dust that settles on everything, gets in our lungs, blurs our vision.
That is where my wariness comes in. I don’t feel self-assured that I’m capable of standing above it all. So, I embrace being down in the muck of things. We have to work with what we have available.
That is maybe why I find comfort in Philip K. Dick’s notion of God in the gutter. I’m in the gutter and so that is where I look for God, as the man looked for his keys under the street light because the light was better there. This may not be the most rational approach. But it’s the best I can do under the circumstances.
https://benjamindavidsteele.wordpress.com/2009/12/06/god-in-the-gutter-jesus-in-disguise/
The keywords there for me are “dominated by.”
The “mental-rational” in its “efficient” form is effective and serves us well, but should no more be “dominant” than any other. I rather like the way Ralston-Saul “framed” the issue in his contemplations of human qualities.
What I’ve grown equally wary of is an alarming tendency among some in the “integral” community to vilify “mental-rational consciousness” and argue for “right-hemispheric” (in the McGilchristian sense) dominance when, it seems to me, the hemispheres of the brain are perfectly designed to operate in concert. Ergo, I should think we’d be seeking a balance between them as opposed to dominance of one over the other.
But what do I know?
When I say I like to think that humanity is in the process of bringing its “Ajna” into balance, I mean a process something like this. This is what I think Gebser means by “aperspectival awareness” and I hope whoever may be listening is not averse to Hindu metaphors for what I believe to be the same concept, especially considering that Gebser insists awareness, if not consciousness, is already “integral.”
PS As others have asked, kindly pardon the typos. (“Childessness” for childishness; “like of themselves” for like to think of themselves, etc.)
I do so wish all Internet commenting systems included ‘edit’ features. I do so hate it when ^that^ happens and there’s nothing you can do about it. : ) Hopefully, though, typos aren’t enough to obstruct mutuality.
As Winnie the Pooh said it so wisely, “when you are a Bear of Very Little Brain, and you Think of Things, you find sometimes that a Thing which seemed very Thingish inside you is quite different when it gets out into the open and has other people looking at it.” Ah, yes, indeed… I’m not one to be dismissive of the “mental-rational,” assuming I know what it is in the Gebserian sense and what it means to be efficient vs deficient. Clearly, my blog is a paean to intellectual inquiry. I “Think of Things” that “gets out into the open” and other people look at it. So, I don’t know where that leaves me.
About the mental-rational dominating, I came across a recently released book. It argues that modern society has never been secularized and the world was never disenchanted. Capitalism didn’t destroy or replace religion but became the new religion. Yet in our deficient mental-rational, we didn’t have the depth of understanding of what had been created. It not only took over our society; it also took over our minds. The most powerful religion dominates without the public realizing it’s a religion. This is the territory I cover in my own theory of symbolic conflation, but others would call it something else and explain it a bit differently. Anyway, capitalist humanity is homo economicus, the supposedly rational individual agent. That is the ideology that dominates, even as it is patently absurd.
I get what you mean about balance and all that. But I honestly don’t have a clue what balance could mean or if it really could mean anything that is efficient and effective, at least under present conditions. As Marshall McLuhan argues, the eye is too dominating of a sensory organ when disconnected from immersive synaesthetic experience. The shattered sensory world, like Humpty Dumpty, can’t be put back together again. One can balance the pieces and yet the parts do not make the whole, the center does not hold. From my view, it is not balance that is the essence of the integral, of integrity, of what holds it all together. Even if it can’t be the integrating function and force, that isn’t to say that balance can’t be a useful tool to keep in one’s toolbox. Anyway, that is my personal take on it, which we’ve already discussed elsewhere.
As to your last point, I’m fine with talk of the Ajna and all that. That is the kind of thing I grew up with. I’m one of the few people in the world who learned of chakras through my Christian upbringing. I’m fully familiar with and quite forgiving/accommodating toward the world of woo, and so it doesn’t generally bother me. I’ve been meditating on chakras, usually the Anahata, for decades now. It is irrelevant to me if one takes them metaphorically or literally, as they are meaningful in some sense. It’s all good. My issue is more at the experiential level, in that I go by what I personally know in what I’ve experienced. I don’t easily accept what others tell me until it is real to me in my direct sense of the world. I have a deep-seated mistrust of words (specifically in relation to mind viruses) when it comes to this kind of thing, maybe the reason for holding concepts like the ‘integral’ at a slight distance because there is so much ideological baggage that has accrued to it.
https://osociety.org/2019/10/13/when-did-capitalism-become-our-religion/
The “third eye chakra” is easily confused with “single vision.”
I personally play it by “ear” and I’ve no idea why. As far as I know, I was just designed that way. In fact, I can’t remember a time when listening and hearing (and I’m not speaking merely of audible sound) wasn’t the predominant “sense” of my constitution. Needless to say, I take the example of Echo as much a warning as the example of Narcissus. (Though I thought until this very moment that I was alone in this, the psychological community apparently has finally taken notice of Echo. I’m not sure if that’s a good thing or not. Here’s hoping Echo’s true role in that story is not lost in adulterated psychism.) I also probably understand “those who have ‘ears,’ let them hear” a bit differently.
Mine either.
You can say that again. A rule of thumb that works exceptionally well for me is: when in doubt, go straight to the source.
I once wrote an essay entitled, “Politics is the New Religion.” So, I’ll definitely be listening to McCarraher’s thoughts on capitalism, though I’ve long-since moved on to incubators of new thought on subjects such as economics, “Doughnut Economics” being the only new contribution I’m aware of at the moment.
I just now came back to this discussion. I’m glad we were able to come to an understanding. Let me add a thought. About the issue of “bread alone,” I’ve been continuing my exploration and contemplation of what food systems mean. I’ve described how diets shape minds and cultures. But it goes the other way as well. Food systems are extensions of entire social orders. As such, they are used as one means, among many, of enforcing the social order they serve.
When social orders change, the food system will change as well. This might be an unintentional side effect with a cascade of effects following from it. When feudalism ended, the feudal food system ended with it. No one at the time could have predicted how this would transform neurocognitive development, public health, and social well-being. In the West, that was when a locally-sourced diet, primarily from subsistence farming combined with wild-caught food, was slowly replaced by a capitalist food economy built on emerging colonial trade.
Even the notion of ‘bread’ was altered. As I’ve pointed out on numerous occasions, grains in general were used more sparingly in the past — instead, grain substitutes were more common. This is particularly true of wheat, which was difficult to grow and hence a limited commodity only affordable to the wealthy, that is until 19th-century advances in agriculture. It is specifically wheat, as discussed in my post about the agricultural mind, that seems to have a powerful effect on human physiology.
You point out that it “is the intense general interest (even all across the so-called political spectrum) we are witnessing in the ancient practice of “mindfulness” or “contemplative” meditation.” And that, even if sometimes for the wrong reasons, “the seedling of true mindfulness practice nonetheless has been planted and is growing in direct proportion to the “moral panic.” You go on to say that, “the “ego-center” is not intended to be the focus of mindfulness training and practice, though the center it is in the “Culture of Narcissism…” for the moment.”
I see that as directly related to the dietary transformation, specifically as it began in the earliest origins of what we think of as Western civilization. The Axial Age was when modern agriculture first emerged. Prior to that, grain fields were grown in a semi-wild state with little control over what was mixed in, sometimes including ergot. Mass agriculture, as we know it, took many millennia to develop. A consistent and dependable high-carb diet, specifically with grains as a mainstay, was simply not possible for most of history.
The first signs of change in agriculture began to be seen in the Axial Age. The change went even further in the Roman Empire, which technically came after the Axial Age. That was when ‘bread’ took on new symbolism as it took on new priority in society. It’s unsurprising that it held such a special role in the new religions such as Christianity. The Axial Age and the centuries immediately following were the source point of modern religion and spirituality, the era during which the contemplative traditions arose. My suspicion is that a contributing factor was the ongoing shift in the food system and hence in what became common in the diet.
There is a reason that monks often turned to a higher-carb diet that limited animal foods. This wasn’t a diet that those who physically labored could live on and remain healthy, although expendable slaves were often forced onto such a diet simply because, as agriculture came to dominate, certain grains (e.g., barley) and legumes became cheap. It also became an issue of social control, as I’ve argued elsewhere. Based on the Galenic theory of humors, revived in the Middle Ages, it was believed that specific animal foods such as red meat were invigorating and so would make the serfs unmanageable, a major concern since riots easily broke out during festivals and carnival and, of course, riots sometimes became bloody revolts. The feudal order was precarious and required careful management, and diet was key to this.
It wasn’t limited to social control, though. Just as important were developing thoughts about self-control. Decreasing invigorating foods would decrease libido and calm the body. This was useful during periods of prayer, contemplation, or meditation. Regular fasting, for monks and the common people alike, was part of that earlier religious tradition. Weakening the body was considered necessary to strengthening the soul, or something like that. But making this contemplative diet possible required a food system with agricultural improvements. This higher-carb diet, initially used by monks, would later promote the extended periods of mental activity of intellectuals. Caffeine also helped, and it too was initially used by monks for lengthy prayer sessions that extended into the night.
This goes into my own thoughts about the increase of addictive substances from the Axial Age onwards. Psychedelics that once were the predominant drug of choice were replaced by those that were addictive. This also went along with the development of farming practices that eliminated ergot from the food supply. So, we should expect that mindfulness practices would become more attractive in a society with a high-carb diet. Keeping the body still while the mind remains active is easier to accomplish with the quick fuel from easily digested carbs such as refined grains. It also creates a specific mindset. There is a reason the carnivorous and ketogenic Mongol tribes didn’t have monastic traditions, whereas the Chinese (eating rice) and Europeans (eating grains) they attacked did have monastic traditions.
An animal-based ketogenic diet is great for general health. It’s simply not conducive to contemplative practice. As Galen understood, such a diet invigorates the body by creating excess energy that allows immense physical activity and feats of strength. That isn’t what monks are seeking to accomplish. But maybe there is a way to find balance. On my own animal-based ketogenic diet, I’ve found that I can meditate while on long-distance runs. I repeat a mantra with each inhalation and exhalation, which creates a rhythm and a centered mindfulness that allows me to feel more open to the world around me. It should be noted that the Mongols, although lacking a monastic tradition, would fast and pray for many days before major events and decisions (e.g., Genghis Khan seeking guidance before going off to war).
We are no longer in a feudal society. Nor are we in a tribal society. We have to discover a new diet and food system that promotes and supports a new way of being in the kind of world we hope to make possible. Mindfulness will mean something different, incorporating what was learned from the past while extending into new insights and understandings. This is an age of experimentation. That is partly what interests me so much about the field of diet and nutrition. Many people are exploring, in their personal experience, how foods affect them. This experimental approach is happening in many other areas of life as well, such as seen in the integral community. It is an exciting time to be alive.
It occurred to me that there is a way to explore the relationship of diet and food systems to religion and social orders. Within any given religion, there have been societies and traditions that diverged on their dietary practices. I wonder what differences one might find in their practices, mindsets, and worldviews, in how they relate and organize, in what kind of economic and political systems they have.
For example, one could compare the vegan Seventh Day Adventists, the traditional omnivorous Amish (or the rural communities in Europe that Weston A. Price studied), the meat-eating Mormons, and mainstream Christians eating the standard American diet (SAD); maybe also throw in Catholics and Eastern Orthodox, especially the latter, with their traditions of regular fasting. As another example, one could look at the vegetarian Buddhists of Southeast Asia and the meat-eating Buddhists of Tibet.
That would be a fascinating study. What kinds and expressions of religion are correlated with which foods, not only food production and sourcing but food preparation and other foodways (raw, cooked, fermented, etc.; feasting, fasting, calorie restriction, etc.)? I’m willing to bet patterns would begin to emerge. The closest I’ve seen to such a dietary exploration of religion is “Food and Faith in Christian Culture” ed. by Ken Albala and Trudy Eden, but the focus is more limited than what I’m thinking of, as none of the essays consider the scientific research on diet and nutrition.
I read again all of the comments in our exchange. It made me think about what fundamentally motivates me these days. I can be skeptical and even cynical, but that isn’t really what inspires me.
The word that describes me the best might be ‘radical’ in its etymological sense, as going to the root. So, what makes my gears whir is a sense of radical ignorance, radical doubt, radical questioning, radical curiosity, radical wonder, and radical imagination. And the label that maybe captures this sensibility is that of Fortean, in that more than anything else my take on existence is that the world is strange, sometimes bizarrely and incomprehensibly strange.
I don’t assume the world serves human purposes, much less conforms to the limits of the human mind. This attitude toward non-human realities is more of a pagan/shamanistic worldview of confounding mystery, in contrast to a monotheistic worldview of universalistic mysticism. I tend toward seeing multiplicity, without any clear faith in some overriding broad truth to contain and order it all.
Some might call this “ontological anarchism” (Hakim Bey), “guerrilla ontology” (Robert Anton Wilson), or something along these lines (William S. Burroughs, Philip K. Dick, etc). I’m not exactly sure how this applies to my writings on diet. It’s more of the background to my entire thought process, the source of the probing curiosity that drives me. I ruthlessly question and doubt everything. The desire is to push beyond the surface to something deeper.
I guess the dietary angle is useful for a simple reason. It shifts the frame of thought and so forces us to rethink the very basis of so many assumptions and conclusions. It also helps knock established ideas off their pedestal, pulls abstractions down into the muck and mud of concrete reality.
“You are what you eat” says it all. Excellent article on one of the most important topics of our time.
Yeah. It seems important to me. It can easily be argued as one of the most important topics of our time, maybe the most important. Without our health as individuals and as a society, we are simply and absolutely fucked! But most people act oblivious or unconcerned.
By the way, are you related to the Lewis family of Virginia, Kentucky, and Indiana? That’s my mother’s family. This line descends from a Henry Lewis.
https://benjamindavidsteele.wordpress.com/2014/05/09/the-many-lewis-families-of-many-places/
Physically, emotionally, mentally and spiritually.
There’s that “fourfold” again. : )
It’s all inseparable. Health is health.